My name in title, 2019–2020

see this page on the web

my name in title through 2018

titles with my name since 2021

my name in title; what is my name; totals by year

my name in title from Math Reviews

My publications

My name in titles of videos

last updated on Jun 30, 2023
 

 

Start of 2019: Wasserstein / Vaserstein in title


The Gromov–Wasserstein distance between networks and stable network invariants

S Chowdhury, F Mémoli - Information and Inference: A Journal of …, 2019 - academic.oup.com

We define a metric—the network Gromov–Wasserstein distance—on weighted, directed networks that is sensitive to the presence of outliers. In addition to proving its theoretical properties, we supply network invariants based on optimal transport that approximate this distance by means of lower bounds. We test these methods on a range of simulated network datasets and on a dataset of real-world global bilateral migration. For our simulations, we define a network generative model based on the stochastic block model. This may be of …

Cited by 13 Related articles All 6 versions 

MR4045478 Prelim Chowdhury, Samir; Mémoli, Facundo; The Gromov–Wasserstein distance between networks and stable network invariants. Inf. Inference 8 (2019), no. 4, 757–787.
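For orientation, a standard way to write the Gromov–Wasserstein distance between metric measure spaces (X, d_X, μ_X) and (Y, d_Y, μ_Y); the notation below follows the general optimal-transport literature and is not quoted from the paper, whose network version replaces the metrics by network weight functions:

$$\mathrm{GW}_p(X,Y) \;=\; \tfrac{1}{2}\,\inf_{\mu \in \mathcal{C}(\mu_X,\mu_Y)} \Big( \iint \big| d_X(x,x') - d_Y(y,y') \big|^p \, d\mu(x,y)\, d\mu(x',y') \Big)^{1/p},$$

where $\mathcal{C}(\mu_X,\mu_Y)$ denotes the set of couplings of μ_X and μ_Y.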

 


Data-driven regularization of Wasserstein barycenters with an application to multivariate density registration

J Bigot, E Cazelles, N Papadakis - Information and Inference: A …, 2019 - academic.oup.com

We present a framework to simultaneously align and smooth data in the form of multiple point clouds sampled from unknown densities with support in a d-dimensional Euclidean space. This work is motivated by applications in bioinformatics where researchers aim to automatically homogenize large datasets to compare and analyze characteristics within the same cell population. Inconveniently, the information acquired is most certainly noisy due to misalignment caused by technical variations of the environment. To overcome this problem …

Cited by 7 Related articles All 8 versions 

MR4045477 Prelim Bigot, Jérémie; Cazelles, Elsa; Papadakis, Nicolas; Data-driven regularization of Wasserstein barycenters with an application to multivariate density registration. Inf. Inference 8 (2019), no. 4, 719–755.


MR4041703 Prelim Verdinelli, Isabella; Wasserman, Larry; Hybrid Wasserstein distance and fast distribution clustering. Electron. J. Stat. 13 (2019), no. 2, 5088–5119. 62G99 (62H30)

Hybrid Wasserstein distance and fast distribution clustering

by I Verdinelli - ‎2019 - ‎Related articles

Dec 12, 2019 - We define a modified Wasserstein distance for distribution clustering ... estimated using nonparametric regression and leads to fast and easy ...

 Cited by 9 Related articles All 2 versions



[HTML] Correction to: Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula

X Fang, QM Shao, L Xu - Probability Theory and Related Fields, 2019 - Springer

Under the above-strengthened Assumption 2.1, all the conclusions and examples in [1] still hold true, except that all the constants \(C_\theta \) therein will depend on the constants in the new assumption … Combining the previous three inequalities, we conclude that [1, (7.1)] still holds true … Combining the estimates of \(J_1\) and \(J_2\), we immediately get that [1, (7.2)] still holds true … Gorham et. al. (see [18] in [1]) recently put forward a method to measure sample quality with diffusions by a Stein discrepancy, in which the same Stein equation as (3.1) has to be …

Cited by 1 Related articles All 3 versions 

MR4026616 Prelim Fang, Xiao; Shao, Qi-Man; Xu, Lihu Correction to: Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula. Probab. Theory Related Fields 175 (2019), no. 3-4, 1177–1181. 60F05 (60H07)


 

MR4019758 Pending Bonnet, Benoît A Pontryagin maximum principle in Wasserstein spaces for constrained optimal control problems. ESAIM Control Optim. Calc. Var. 25 (2019), Art. 52, 38 pp. 49K20 (49K27 49Q20 58E25)

 A Pontryagin Maximum Principle in Wasserstein Spaces for Constrained Optimal Control Problems

B Bonnet - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

In this paper, we prove a Pontryagin Maximum Principle for constrained optimal control problems in the Wasserstein space of probability measures. The dynamics is described by a transport equation with non-local velocities which are affine in the control, and is subject to end-point and running state constraints. Building on our previous work, we combine the classical method of needle-variations from geometric control theory and the metric differential structure of the Wasserstein spaces to obtain a maximum principle formulated in …

Cited by 4 Related articles 


2019


Second-Order Models for Optimal Transport and Cubic Splines on the Wasserstein Space

JD Benamou, TO Gallouët, FX Vialard - Foundations of Computational …, 2019 - Springer

On the space of probability densities, we extend the Wasserstein geodesics to the case of higher-order interpolation such as cubic spline interpolation. After presenting the natural extension of cubic splines to the Wasserstein space, we propose a simpler approach based on the relaxation of the variational problem on the path space. We explore two different numerical approaches, one based on multimarginal optimal transport and entropic regularization and the other based on semi-discrete optimal transport.

Cited by 16 Related articles All 7 versions

MR4017682 Pending Benamou, Jean-David; Gallouët, Thomas O.; Vialard, François-Xavier Second-order models for optimal transport and cubic splines on the Wasserstein space. Found. Comput. Math. 19 (2019), no. 5, 1113–1143. 49Q20 (49M25 65D07)

 

MR4016722  Motamed, Mohammad; Appelö, Daniel Wasserstein metric-driven Bayesian inversion with applications to signal processing. Int. J. Uncertain. Quantif. 9 (2019), no. 4, 394–414. 62F15 (60B10 86A15 94A12)

Wasserstein metric-driven Bayesian inversion with applications to signal processing

M Motamed, D Appelo - International Journal for Uncertainty …, 2019 - dl.begellhouse.com

We present a Bayesian framework based on a new exponential likelihood function driven by the quadratic Wasserstein metric. Compared to conventional Bayesian models based on Gaussian likelihood functions driven by the least-squares norm (L 2 norm), the new …

Cited by 9 Related articles All 3 versions


Robust Wasserstein profile inference and applications to machine learning

J Blanchet, Y Kang, K Murthy - Journal of Applied Probability, 2019 - cambridge.org

We show that several machine learning estimators, including square-root least absolute shrinkage and selection and regularized logistic regression, can be represented as solutions to distributionally robust optimization problems. The associated uncertainty regions are based on suitably defined Wasserstein distances. Hence, our representations allow us to view regularization as a result of introducing an artificial adversary that perturbs the empirical distribution to account for out-of-sample effects in loss estimation. In addition, we …

 Cited by 124      Related articles     All 3 versions

MR4015639 Pending Blanchet, Jose; Kang, Yang; Murthy, Karthyek Robust Wasserstein profile inference and applications to machine learning. J. Appl. Probab. 56 (2019), no. 3, 830–857. 60B10 (62J05 62J07)

MR4013144   Yong, Peng; Liao, Wenyuan; Huang, Jianping; Li, Zhenchun; Lin, Yaoting Misfit function for full waveform inversion based on the Wasserstein metric with dynamic formulation. J. Comput. Phys. 399 (2019), 108911, 19 pp. 65R32 (49Q20 86A15)

Misfit function for full waveform inversion based on the Wasserstein metric with dynamic formulation 

By: Yong, Peng; Liao, Wenyuan; Huang, Jianping; et al.

JOURNAL OF COMPUTATIONAL PHYSICS  Volume: 399     Article Number: UNSP 108911   Published: DEC 15 2019 


 

MR4009553 Pending Carlier, Guillaume; Poon, Clarice On the total variation Wasserstein gradient flow and the TV-JKO scheme. ESAIM Control Optim. Calc. Var. 25 (2019), Art. 42, 21 pp. 49Q20 (35K35 35K59 49N15)

On the total variation Wasserstein gradient flow and the TV-JKO scheme

G Carlier, C Poon - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

We study the JKO scheme for the total variation, characterize the optimizers, prove some of their qualitative properties (in particular a form of maximum principle and in some cases, a minimum principle as well). Finally, we establish a convergence result as the time step goes to zero to a solution of a fourth-order nonlinear evolution equation, under the additional assumption that the density remains bounded away from zero, this lower bound is shown in dimension one and in the radially symmetric case.

Cited by 6 Related articles 

<——2019  ———2019———10 


2019 see 2020
Generative adversarial networks based on Wasserstein distance for knowledge graph embeddings

Y Dai, S Wang, X Chen, C Xu, W Guo - Knowledge-Based Systems, 2019 - Elsevier

Abstract Knowledge graph embedding aims to project entities and relations into low-dimensional and continuous semantic feature spaces, which has captured more attention in recent years. Most of the existing models roughly construct negative samples via a uniformly …

 

On the rate of convergence of empirical measures in∞-transportation distance

NG Trillos, D Slepčev - Canadian Journal of Mathematics, 2015 - cambridge.org

We consider random iid samples of absolutely continuous measures on bounded connected domains. We prove an upper bound on the $\infty $-transportation distance between the measure and the empirical measure of the sample. The bound is optimal in terms of scaling with the number of sample points.

Cited by 57 Related articles All 11 versions 

MR4009333 Pending Liu, Anning; Liu, Jian-Guo; Lu, Yulong On the rate of convergence of empirical measure in ∞-Wasserstein distance for unbounded density function. Quart. Appl. Math. 77 (2019), no. 4, 811–829. 60B10 (62G30)


 2019

MR4003560 Pending Weed, Jonathan; Bach, Francis Sharp asymptotic and finite-sample rates of convergence of empirical measures in Wasserstein distance. Bernoulli 25 (2019), no. 4A, 2620–2648. 60B10 (62G30)

[PDF] arxiv.org

Sharp asymptotic and finite-sample rates of convergence of empirical measures in Wasserstein distance

J Weed, F Bach - Bernoulli, 2019 - projecteuclid.org

The Wasserstein distance between two probability measures on a metric space is a measure of closeness with applications in statistics, probability, and machine learning. In this work, we consider the fundamental question of how quickly the empirical measure …

  Cited by 157 Related articles All 6 versions
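As a small numerical illustration of the object studied above (my own sketch, not code from the paper): on the real line, the p-Wasserstein distance between two equal-size empirical measures reduces to matching order statistics, which makes empirical convergence easy to probe by simulation.

import numpy as np
from scipy.stats import wasserstein_distance  # SciPy's W_1 on the real line

rng = np.random.default_rng(0)
x = rng.normal(size=1000)            # sample from mu
y = rng.normal(loc=0.5, size=1000)   # sample from nu

def empirical_wp(x, y, p=1):
    """W_p between two equal-size empirical measures on R via sorted matching."""
    xs, ys = np.sort(x), np.sort(y)
    return float(np.mean(np.abs(xs - ys) ** p) ** (1.0 / p))

print(empirical_wp(x, y, p=1))       # order-statistics formula
print(wasserstein_distance(x, y))    # should agree with the line above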

MR4000104 Pending Gehér, György Pál; Titkos, Tamás; Virosztek, Dániel On isometric embeddings of Wasserstein spaces—the discrete case. J. Math. Anal. Appl. 480 (2019), no. 2, 123435, 11 pp. 28A33 (60B05)

On isometric embeddings of Wasserstein spaces–the discrete case

GP Gehér, T Titkos, D Virosztek - Journal of Mathematical Analysis and …, 2019 - Elsevier

The aim of this short paper is to offer a complete characterization of all (not necessarily surjective) isometric embeddings of the Wasserstein space W p (X), where X is a countable discrete metric space and 0< p<∞ is any parameter value. Roughly speaking, we will prove that any isometric embedding can be described by a special kind of X×(0, 1]-indexed family of nonnegative finite measures. Our result implies that a typical non-surjective isometric embedding of W p (X) splits mass and does not preserve the shape of measures. In order to …

Cited by 1 Related articles 

 

MR3996793  Chow, Yat Tin; Gangbo, Wilfrid A partial Laplacian as an infinitesimal generator on the Wasserstein space. J. Differential Equations 267 (2019), no. 10, 6065–6117. 60H30 (28A33 35C05 35K05 35K08 60J25)

 A partial Laplacian as an infinitesimal generator on the Wasserstein space

YT Chow, W Gangbo - Journal of Differential Equations, 2019 - Elsevier

In this manuscript, we consider special linear operators which we term partial Laplacians on the Wasserstein space, and which we show to be partial traces of the Wasserstein Hessian. We verify a distinctive smoothing effect of the “heat flows” they generated for a particular class of initial conditions. To this end, we will develop a theory of Fourier analysis and conic surfaces in metric spaces. We then identify a measure which allows for an integration by parts for a class of Sobolev functions. To achieve this goal, we solve a recovery problem on …

Cited by 11 Related articles All 9 versions


2019


A Wasserstein norm for signed measures, with application to nonlocal transport equation with source term

B Piccoli, F Rossi, M Tournus - arXiv preprint arXiv:1910.05105, 2019 - arxiv.org

We introduce the optimal transportation interpretation of the Kantorovich norm on the space of signed Radon measures with finite mass, based on a generalized Wasserstein distance for measures with different masses. With the formulation and the new topological …

Cited by 4 Related articles All 7 versions


MR3996643 Massart, Estelle; Hendrickx, Julien M.; Absil, P.-A. Curvature of the manifold of fixed-rank positive-semidefinite matrices endowed with the Bures-Wasserstein metric. Geometric science of information, 739–748, Lecture Notes in Comput. Sci., 11712, Springer, Cham, 2019. 53B21 (15B48)

[PDF] uclouvain.be

Curvature of the manifold of fixed-rank positive-semidefinite matrices endowed with the Bures–Wasserstein metric

E Massart, JM Hendrickx, PA Absil - International Conference on …, 2019 - Springer

We consider the manifold of rank-p positive-semidefinite matrices of size n, seen as a quotient of the set of full-rank n-by-p matrices by the orthogonal group in dimension p. The …
Cited by 11 Related articles All 6 versions


A unified formulation for the Bures-Wasserstein and Log-Euclidean/Log-Hilbert-Schmidt distances between positive definite operators

HQ Minh - International Conference on Geometric Science of …, 2019 - Springer

This work presents a parametrized family of distances, namely the Alpha Procrustes distances, on the set of symmetric, positive definite (SPD) matrices. The Alpha Procrustes distances provide a unified formulation encompassing both the Bures-Wasserstein and Log-Euclidean distances between SPD matrices. This formulation is then generalized to the set of positive definite Hilbert-Schmidt operators on a Hilbert space, unifying the infinite-dimensional Bures-Wasserstein and Log-Hilbert-Schmidt distances. The presented …

Cited by 4 Related articles 

MR3996615  Minh, Hà Quang A unified formulation for the Bures-Wasserstein and log-Euclidean/log-Hilbert-Schmidt distances between positive definite operators. Geometric science of information, 475–483, Lecture Notes in Comput. Sci., 11712, Springer, Cham, 2019. 62H99 (15B48 60B05)


[PDF] thecvf.com

Unimodal-uniform constrained wasserstein training for medical diagnosis

X Liu, X Han, Y Qiao, Y Ge, S Li… - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

The labels in medical diagnosis task are usually discrete and successively distributed. For example, the Diabetic Retinopathy Diagnosis (DR) involves five health risk levels: no DR (0), mild DR (1), moderate DR (2), severe DR (3) and proliferative DR (4). This labeling system is …

Cited by 20 Related articles All 8 versions


On the Bures–Wasserstein distance between positive definite matrices

R Bhatia, T Jain, Y Lim - Expositiones Mathematicae, 2019 - Elsevier

The metric d(A,B) = [tr A + tr B − 2 tr(A^{1/2} B A^{1/2})^{1/2}]^{1/2} on the manifold of n×n positive definite matrices arises in various optimisation problems, in quantum information and in the theory of optimal transport. It is also related to Riemannian geometry. In the first …

Cited by 76 Related articles All 5 versions 

MR3992484 Pending Bhatia, Rajendra; Jain, Tanvi; Lim, Yongdo On the Bures-Wasserstein distance between positive definite matrices. Expo. Math. 37 (2019), no. 2, 165–191. 47A63 (47A64 53C20 53C22 60B05)

Cited by 53 Related articles All 4 versions
<——2019    ——————— 2019  ———20 —


Dynamic models of Wasserstein-1-type unbalanced transport

B Schmitzer, B Wirth - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

We consider a class of convex optimization problems modelling temporal mass transport and mass change between two given mass distributions (the so-called dynamic formulation of unbalanced transport), where we focus on those models for which transport costs are proportional to transport distance. For those models we derive an equivalent, computationally more efficient static formulation, we perform a detailed analysis of the model optimizers and the associated optimal mass change and transport, and we examine which …

Cited by 5 Related articles All 4 versions 

MR3986362 Pending Schmitzer, Bernhard; Wirth, Benedikt Dynamic models of Wasserstein-1-type unbalanced transport. ESAIM Control Optim. Calc. Var. 25 (2019), Art. 23, 54 pp. 49Q20 (37N40 90C05)

 

MR3985566 Pending Privault, N.; Yam, S. C. P.; Zhang, Z. Poisson discretizations of Wiener functionals and Malliavin operators with Wasserstein estimates. Stochastic Process. Appl. 129 (2019), no. 9, 3376–3405. 60H07 (60F05)

Poisson discretizations of Wiener functionals and Malliavin operators with Wasserstein estimates

N Privault, SCP Yam, Z Zhang - Stochastic Processes and their …, 2019 - Elsevier

This article proposes a global, chaos-based procedure for the discretization of functionals of Brownian motion into functionals of a Poisson process with intensity λ > 0. Under this discretization we study the weak convergence, as the intensity of the underlying Poisson process goes to infinity, of Poisson functionals and their corresponding Malliavin-type derivatives to their Wiener counterparts. In addition, we derive a convergence rate of O(λ^{−1/4}) for the Poisson discretization of Wiener functionals by combining the multivariate …

Related articles All 7 versions

MR3985558 Pending Luo, Dejun; Wang, Jian Refined basic couplings and Wasserstein-type distances for SDEs with Lévy noises. Stochastic Process. Appl. 129 (2019), no. 9, 3129–3173. 60J25 (60J75)

Refined basic couplings and Wasserstein-type distances for SDEs with Lévy noises

D Luo, J Wang - Stochastic Processes and their Applications, 2019 - Elsevier

We establish the exponential convergence with respect to the L1-Wasserstein distance and the total variation for the semigroup corresponding to the stochastic differential equation dX_t = dZ_t + b(X_t) dt, where (Z_t)_{t≥0} is a pure jump Lévy process whose Lévy measure ν fulfills inf_{x∈R^d, |x|≤κ_0} [ν ∧ (δ_x ∗ ν)](R^d) > 0 for some constant κ_0 > 0, and the drift term b satisfies, for any x, y ∈ R^d, ⟨b(x) − b(y), x − y⟩ ≤ Φ_1(|x − y|) |x − y| if |x − y| < l_0 and ⟨b(x) − b(y), x − y⟩ ≤ −K_2 |x − y|^2 if |x − y| ≥ l_0, with some positive constants K_2, l_0 and positive measurable function Φ_1. The …

Cited by 13 Related articles All 7 versions

MR3980309 Pending Fang, Xiao; Shao, Qi-Man; Xu, Lihu Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula. Probab. Theory Related Fields 174 (2019), no. 3-4, 945–979. 60F05 (60H07)

 Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula

X Fang, QM Shao, L Xu - Probability Theory and Related Fields, 2019 - Springer

Stein's method has been widely used for probability approximations. However, in the multi-dimensional setting, most of the results are for multivariate normal approximation or for test functions with bounded second-or higher-order derivatives. For a class of multivariate limiting distributions, we use Bismut's formula in Malliavin calculus to control the derivatives of the Stein equation solutions by the first derivative of the test function. Combined with Stein's exchangeable pair approach, we obtain a general theorem for multivariate …

Cited by 34 Related articles All 4 versions

[CITATION] Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula (vol 174, pg 945, 2019)

X Fang, QM Shao, L Xu - Probability Theory and Related Fields, 2019 - Springer


MR3978211 Pending Olvera-Cravioto, Mariana Convergence of the population dynamics algorithm in the Wasserstein metric. Electron. J. Probab. 24 (2019), Paper No. 61, 27 pp. 65C05 (60J80)

Convergence of the Population Dynamics algorithm in the Wasserstein metric

M Olvera-Cravioto - Electronic Journal of Probability, 2019 - projecteuclid.org

We study the convergence of the population dynamics algorithm, which produces sample pools of random variables having a distribution that closely approximates that of the special endogenous solution to a variety of branching stochastic fixed-point equations, including the smoothing transform, the high-order Lindley equation, the discounted tree-sum and the free-entropy equation. Specifically, we show its convergence in the Wasserstein metric of order $ p $ ($ p\geq 1$) and prove the consistency of estimators based on the sample pool produced …

Cited by 2 All 6 versions 


2019


 

Wasserstein barycenters in the manifold of all positive definite matrices

E Nobari, B Ahmadi Kakavandi - Quarterly of Applied Mathematics, 2019 - ams.org

In this paper, we study the Wasserstein barycenter of finitely many Borel probability measures on $\mathbb {P} _ {n} $, the Riemannian manifold of all $ n\times n $ real positive definite matrices as well as its associated dual problem, namely the optimal transport problem. Our results generalize some results of Agueh and Carlier on $\mathbb {R}^{n} $ to $\mathbb {P} _ {n} $. We show the existence of the optimal solutions and the Wasserstein barycenter measure. Furthermore, via a discretization approach and using the BFGS …

Related articles All 2 versions 


MR3959197 Pending Birghila, Corina; Pflug, Georg Ch. Optimal XL-insurance under Wasserstein-type ambiguity. Insurance Math. Econom. 88 (2019), 30–43. 91B30

Optimal XL-insurance under Wasserstein-type ambiguity

C Birghila, GC Pflug - Insurance: Mathematics and Economics, 2019 - Elsevier

We study the problem of optimal insurance contract design for risk management under a budget constraint. The contract holder takes into consideration that the loss distribution is not entirely known and therefore faces an ambiguity problem. For a given set of models, we …

Related articles All 5 versions 


MR3958435 Pending Arras, Benjamin; Azmoodeh, Ehsan; Poly, Guillaume; Swan, Yvik A bound on the Wasserstein-2 distance between linear combinations of independent random variables. Stochastic Process. Appl. 129 (2019), no. 7, 2341–2375. 60F05 (60G15 60G50 60H07)

A bound on the 2-Wasserstein distance between linear combinations of independent random variables

by B Arras - 2019 - orbi.uliege.be [PDF]

We use this bound to estimate the 2-Wasserstein distance between random variables represented by linear combinations of independent random variables.

Cited by 20 Related articles


MR3958140 Pending Bhatia, Rajendra; Jain, Tanvi; Lim, Yongdo Inequalities for the Wasserstein mean of positive definite matrices. Linear Algebra Appl. 576 (2019), 108–123. 15A42 (15A18 47A30 47A64)

Inequalities for the Wasserstein mean of positive definite matrices

Sep 1, 2019 — We prove majorization inequalities for different means of positive definite matrices. These include the Cartan mean (the Karcher mean), the log Euclidean mean, the Wasserstein mean and the power mean.

by R Bhatia - 2019 - Cited by 12 - Related articles - sciencedirect.com

[PDF] arxiv.org

Inequalities for the Wasserstein mean of positive definite matrices

R Bhatia, T Jain, Y Lim - Linear Algebra and its Applications, 2019 - Elsevier

We prove majorization inequalities for different means of positive definite matrices. These include the Cartan mean (the Karcher mean), the log Euclidean mean, the Wasserstein mean and the power mean … (1) d(A,B) = [tr(A+B) − 2 tr(A^{1/2} B A^{1/2})^{1/2}]^{1/2} … (2) Ω(w; …

Cited by 9 Related articles All 5 versions 
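Written out, the two displays garbled in the snippet above are, in the standard form used in this literature (reconstructed here, so the exact normalisation should be checked against the paper):

$$d(A,B) \;=\; \Big[\operatorname{tr}(A) + \operatorname{tr}(B) - 2\operatorname{tr}\big(A^{1/2} B A^{1/2}\big)^{1/2}\Big]^{1/2},$$

$$\Omega(w; A_1,\dots,A_n) \;=\; \operatorname*{arg\,min}_{X > 0} \; \sum_{j=1}^{n} w_j \, d^2(X, A_j),$$

the Bures-Wasserstein distance and the corresponding Wasserstein (barycentric) mean of positive definite matrices A_1, …, A_n with weights w = (w_1, …, w_n).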


 

MR3955234  Bigot, Jérémie; Cazelles, Elsa; Papadakis, Nicolas Penalization of barycenters in the Wasserstein space. SIAM J. Math. Anal. 51 (2019), no. 3, 2261–2285. (Reviewer: Juan A. Cuesta-Albertos) 62G07 (49Q20 62G20)

 [PDF] arxiv.org

Penalization of barycenters in the Wasserstein space

J Bigot, E Cazelles, N Papadakis - SIAM Journal on Mathematical Analysis, 2019 - SIAM

In this paper, a regularization of Wasserstein barycenters for random measures supported on R^d is introduced via convex penalization. The existence and uniqueness of such barycenters is first proved for a large class of penalization functions. The Bregman …

Cited by 25 Related articles All 10 versions
 <——2019    ——————— 2019  ——30 —


MR3954993 Pending Lavenant, Hugo Harmonic mappings valued in the Wasserstein space. J. Funct. Anal. 277 (2019), no. 3, 688–785. 31C45 (31C12 35J05 35J25)

[PDF] arxiv.org

Harmonic mappings valued in the Wasserstein space

H Lavenant - Journal of Functional Analysis, 2019 - Elsevier

We propose a definition of the Dirichlet energy (which is roughly speaking the integral of the square of the gradient) for mappings μ: Ω → (P(D), W_2) defined over a subset Ω of R^p and valued in the space P(D) of probability measures on a compact convex subset D of R^q …

Cited by 9 Related articles All 8 versions 


MR3951716 Bouchitté, Guy; Fragalà, Ilaria; Lucardesi, Ilaria Sensitivity of the compliance and of the Wasserstein distance with respect to a varying source. Appl. Math. Optim. 79 (2019), no. 3, 743–768. (Reviewer: Haijun Wang) 49Q12 (49K40)

Related articles All 8 versions


MR3951694  Cancès, Clément; Matthes, Daniel; Nabet, Flore A two-phase two-fluxes degenerate Cahn-Hilliard model as constrained Wasserstein gradient flow. Arch. Ration. Mech. Anal. 233 (2019), no. 2, 837–866. (Reviewer: Mohammed Guedda) 76D99 (35Q35)

 Investigators from University of Lille Have Reported New Data on Mechanical Engineering (A Two-phase Two-fluxes Degenerate Cahn-hilliard Model As Constrained Wasserstein...

Journal of Engineering, 08/2019

Newsletter: Citation Online

Publication Title: Journal of Engineering
Language: English
Publisher: NewsRX LLC
Date: 08/2019
Copyright: COPYRIGHT 2019 NewsRX LLC
Start Page: 1257
Subjects: Periodical publishing, Analysis, Models, Research, Mechanical engineering
ISSN: 1945-8711
EISSN: 1945-872X


2019 see 2017

MR3950950   Schmitzer, Bernhard; Wirth, Benedikt A framework for Wasserstein-1-type metrics. J. Convex Anal. 26 (2019), no. 2, 353–396. (Reviewer: Luca Lussardi) 49M29 (49M20 65K10)

 

MR3950564  Luo, Fengqiao; Mehrotra, Sanjay Decomposition algorithm for distributionally robust optimization using Wasserstein metric with an application to a class of regression models. European J. Oper. Res. 278 (2019), no. 1, 20–35. 90C47 (60B10 62J02 90C15 90C34)

A framework for Wasserstein-1-type metrics
Authors: Schmitzer B., Wirth B.
Article, 2019
Publication:Journal of Convex Analysis, 26, 2019
Publisher:2019

MR3949307   Petersen, Alexander; Müller, Hans-Georg Wasserstein covariance for multiple random densities. Biometrika 106 (2019), no. 2, 339–351. 62G07 (60B10 82C70)

 [PDF] arxiv.org

Wasserstein covariance for multiple random densities

A Petersen, HG Müller - Biometrika, 2019 - academic.oup.com

A common feature of methods for analysing samples of probability density functions is that they respect the geometry inherent to the space of densities. Once a metric is specified for this space, the Fréchet mean is typically used to quantify and visualize the average density …

Cited by 19 Related articles All 11 versions

Life Science Research - Biometrics; Studies from University of California - Santa Barbara Reveal New Findings on Biometrics (Wasserstein... 

Journal of Technology, Nov 19, 2019, 3027

Newspaper Article: Full Text Online


2019


MR3944398 Pending Hwang, Jinmi; Kim, Sejong Bounds for the Wasserstein mean with applications to the Lie-Trotter mean. J. Math. Anal. Appl. 475 (2019), no. 2, 1744–1753. 58D17

[PDF] arxiv.org

Bounds for the Wasserstein mean with applications to the Lie-Trotter mean

J Hwang, S Kim - Journal of Mathematical Analysis and Applications, 2019 - Elsevier

Since barycenters in the Wasserstein space of probability distributions have been introduced, the Wasserstein metric and the Wasserstein mean of positive definite Hermitian matrices have been recently developed. In this paper, we explore some properties of …

Cited by 3 Related articles All 3 versions 


 

MR3944201 Pending Gangbo, Wilfrid; Tudorascu, Adrian On differentiability in the Wasserstein space and well-posedness for Hamilton-Jacobi equations. J. Math. Pures Appl. (9) 125 (2019), 119–174. 58D25 (35B30 35D40 35F21 46G05)

 [PDF] ucla.edu

On differentiability in the Wasserstein space and well-posedness for Hamilton–Jacobi equations

W Gangbo, A Tudorascu - Journal de Mathématiques Pures et Appliquées, 2019 - Elsevier

In this paper we elucidate the connection between various notions of differentiability in the Wasserstein space: some have been introduced intrinsically (in the Wasserstein space, by using typical objects from the theory of Optimal Transport) and used by various authors to …

Cited by 19 Related articles All 3 versions


MR3940765   Fang, Xiao Wasserstein-2 bounds in normal approximation under local dependence. Electron. J. Probab. 24 (2019), Paper No. 35, 14 pp. (Reviewer: Benjamin Arras) 60F05

 [PDF] projecteuclid.org

Wasserstein-2 bounds in normal approximation under local dependence

X Fang - Electronic Journal of Probability, 2019 - projecteuclid.org

We obtain a general bound for the Wasserstein-2 distance in normal approximation for sums of locally dependent random variables. The proof is based on an asymptotic expansion for expectations of second-order differentiable functions of the sum. We apply the main result to …

Cited by 1 Related articles All 3 versions 


MR3939527  Panaretos, Victor M.; Zemel, Yoav Statistical aspects of Wasserstein distances. Annu. Rev. Stat. Appl. 6 (2019), 405–431. 62-02 (28A33 60B10)

 Statistical Aspects of Wasserstein Distances

by Panaretos, Victor M; Zemel, Yoav

Annual Review of Statistics and Its Application, 03/2019, Volume 6, Issue 1

Journal Article: Full Text Online

Wasserstein distances are metrics on probability distributions inspired by the problem of optimal mass transportation. Roughly speaking, they measure the minimal effort required to reconfigure the probability mass of one distribution in order to recover the other distribution. They are ubiquitous in mathematics, with a long history that has seen them catalyze core developments in analysis, optimization, and probability. Beyond their intrinsic mathematical richness, they possess attractive features that make them a versatile tool for the statistician: They can be used to derive weak convergence and convergence of moments, and can be easily bounded; they are well-adapted to quantify a natural notion of perturbation of a probability distribution; and they seamlessly incorporate the geometry of the domain of the distributions in question, thus being useful for contrasting complex objects. Consequently, they frequently appear in the development of statistical theory and inferential methodology, and they have recently become an object of inference in themselves. In this review, we provide a snapshot of the main concepts involved in Wasserstein distances and optimal transportation, and a succinct overview of some of their many statistical aspects.

[PDF] arxiv.org

Statistical aspects of Wasserstein distances

VM Panaretos, Y Zemel - Annual review of statistics and its …, 2019 - annualreviews.org

Wasserstein distances are metrics on probability distributions inspired by the problem of optimal mass transportation. Roughly speaking, they measure the minimal effort required to reconfigure the probability mass of one distribution in order to recover the other distribution …

Cited by 254 Related articles All 7 versions
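For reference, the central object of this review is the order-p Wasserstein distance between probability measures μ and ν on a metric space (X, d); in the standard notation of the optimal-transport literature (not quoted from the article):

$$W_p(\mu,\nu) \;=\; \Big( \inf_{\pi \in \Pi(\mu,\nu)} \int_{X \times X} d(x,y)^p \, d\pi(x,y) \Big)^{1/p}, \qquad p \ge 1,$$

where Π(μ, ν) is the set of couplings of μ and ν, i.e. joint distributions with marginals μ and ν.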


2019 see 2017  

MR3939389 Zhao, Yong; Liu, Yongchao; Yang, Xinming Distributionally robust reward-risk ratio programming with Wasserstein metric. Pac. J. Optim. 15 (2019), no. 1, 69–90. (Reviewer: I. M. Stancu-Minasian) 90C15 (90C32 90C47)

Distributionally robust reward-risk ratio programming with Wasserstein metric
<——2019——————— 2019  ———————40 —


MR3928142  Bernton, Espen; Jacob, Pierre E.; Gerber, Mathieu; Robert, Christian P. Approximate Bayesian computation with the Wasserstein distance. J. R. Stat. Soc. Ser. B. Stat. Methodol. 81 (2019), no. 2, 235–269. 62F15 (28A33 60B10 62G30)

 Approximate Bayesian computation with the Wasserstein distance

E Bernton, PE Jacob, M Gerber, CP Robert - arXiv preprint arXiv …, 2019 - arxiv.org

A growing number of generative statistical models do not permit the numerical evaluation of their likelihood functions. Approximate Bayesian computation (ABC) has become a popular approach to overcome this issue, in which one simulates synthetic data sets given parameters and compares summaries of these data sets with the corresponding observed values. We propose to avoid the use of summaries and the ensuing loss of information by instead using the Wasserstein distance between the empirical distributions of the observed …

Cited by 32 Related articles All 11 versions View as HTML 
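A minimal sketch in the spirit of the approach summarised above, but with an invented toy model (normal location family, uniform prior, fixed tolerance; none of these choices are taken from the paper): ABC rejection in which the discrepancy between observed and simulated data is the Wasserstein distance between their empirical distributions rather than a distance between summary statistics.

import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(1)
y_obs = rng.normal(loc=2.0, scale=1.0, size=200)        # "observed" data

accepted = []
for _ in range(5000):
    theta = rng.uniform(-5.0, 5.0)                       # draw parameter from the prior
    y_sim = rng.normal(loc=theta, scale=1.0, size=200)   # simulate a synthetic data set
    if wasserstein_distance(y_obs, y_sim) < 0.2:         # compare empirical laws, no summaries
        accepted.append(theta)

print(len(accepted), np.mean(accepted))                  # crude ABC posterior sample and mean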


MR3926165  Lichtenegger, Emily; Niedzialomski, Robert Approximation and Wasserstein distance for self-similar measures on the unit interval. J. Math. Anal. Appl. 474 (2019), no. 2, 1250–1266. 28A33 (28A80)

Approximation and Wasserstein distance for self-similar measures on the unit interval

E Lichtenegger, R Niedzialomski - Journal of Mathematical Analysis and …, 2019 - Elsevier

We study the Wasserstein distance between self-similar measures associated to two non-overlapping linear contractions of the unit interval. The main theorem gives an explicit formula for the Wasserstein distance between iterations of certain discrete approximations of …

Related articles

2019 see 2018

MR3924870  Perchet, Vianney; Quincampoix, Marc A differential game on Wasserstein space. Application to weak approachability with partial monitoring. J. Dyn. Games 6 (2019), no. 1, 65–85. (Reviewer: Guiomar Martín-Herrán) 49N70 (91A20 91A23)

A differential game on Wasserstein space. Application to weak approachability with partial monitoring


MR3920362  Zemel, Yoav; Panaretos, Victor M. Fréchet means and Procrustes analysis in Wasserstein space. Bernoulli 25 (2019), no. 2, 932–976. 62G05 (60B05 60B10 60D05 60G57 62-07)

[PDF] arxiv.org

Fréchet means and Procrustes analysis in Wasserstein space

Y Zemel, VM Panaretos - Bernoulli, 2019 - projecteuclid.org

We consider two statistical problems at the intersection of functional and non-Euclidean data analysis: the determination of a Fréchet mean in the Wasserstein space of multivariate distributions; and the optimal registration of deformed random measures and point …

Cited by 28 Related articles All 8 versions 


MR3919780  Carlsson, John Gunnar; Wang, Ye Distributions with maximum spread subject to Wasserstein distance constraints. J. Oper. Res. Soc. China 7 (2019), no. 1, 69–105. 90C15 (90C34)

 Distributions with Maximum Spread Subject to Wasserstein Distance Constraints

JG Carlsson, Y Wang - Journal of the Operations Research Society of …, 2019 - Springer

Recent research on formulating and solving distributionally robust optimization problems has seen many different approaches for describing one's ambiguity set, such as constraints on first and second moments or quantiles. In this paper, we use the Wasserstein distance to …

Related articles All 3 versions 

MR3916326 Dedecker, Jérôme; Merlevède, Florence Behavior of the empirical Wasserstein distance in R^d under moment conditions. Electron. J. Probab. 24 (2019), Paper No. 6, 32 pp. (Reviewer: Oliver Johnson) 60B10 (60E15 60F10 60F15)

Related articles All 3 versions

2019


MR3915466  Dufour, François; Prieto-Rumeau, Tomás Approximation of discounted minimax Markov control problems and zero-sum Markov games using Hausdorff and Wasserstein distances. Dyn. Games Appl. 9 (2019), no. 1, 68–102. (Reviewer: Oscar Vega-Amaya) 90C40 (90C47 91A15)

Approximation of Discounted Minimax Markov Control Problems and Zero-Sum Markov Games Using Hausdorff and Wasserstein Distances

F Dufour, T Prieto-Rumeau - Dynamic Games and Applications, 2019 - Springer

This paper is concerned with a minimax control problem (also known as a robust Markov decision process (MDP) or a game against nature) with general state and action spaces under the discounted cost optimality criterion. We are interested in approximating …

Related articles All 5 versions 

 

MR3914882  Konarovskyi, Vitalii; von Renesse, Max-K. Modified massive Arratia flow and Wasserstein diffusion. Comm. Pure Appl. Math. 72 (2019), no. 4, 764–800. 60K35 (58J65 60G57 60J60)

[PDF] uni-leipzig.de

Modified massive Arratia flow and Wasserstein diffusion

V Konarovskyi, MK von Renesse - Communications on Pure …, 2019 - Wiley Online Library

Extending previous work by the first author we present a variant of the Arratia flow, which consists of a collection of coalescing Brownian motions starting from every point of the unit interval. The important new feature of the model is that individual particles carry mass that …

Cited by 14 Related articles All 3 versions 



Non-Local Texture Optimization with Wasserstein Regularization under Convolutional Neural Network

Authors: Li J., Xu D., Xiang Y., Hou J.
Article, 2019
Publication:IEEE Transactions on Multimedia, 21, 2019 06 01, 1437
Publisher:2019

MR3910009  Xu, Lihu Approximation of stable law in Wasserstein-1 distance by Stein's method. Ann. Appl. Probab. 29 (2019), no. 1, 458–504. (Reviewer: Peter Kern) 60F05 (60E07 60G50 60G52)

 [PDF] arxiv.org

 Approximation of stable law in Wasserstein-1 distance by Stein's method

L Xu - The Annals of Applied Probability, 2019 - projecteuclid.org

Abstract Let $n\in\mathbb{N}$, let $\zeta_{n,1},\ldots,\zeta_{n,n}$ be a sequence of independent random variables with $\mathbb{E}\zeta_{n,i}= 0$ and $\mathbb{E}|\zeta_{n,i}|<\infty$ for each $i$, and let $\mu$ be an $\alpha$-stable distribution having …

Cited by 26 Related articles All 9 versions

 

[PDF] arxiv.org

From GAN to WGAN

L Weng - arXiv preprint arXiv:1904.08994, 2019 - arxiv.org

Generative adversarial network (GAN) [1] has shown great results in many generative tasks to replicate the real-world rich content such as images, human language, and music. It is inspired by game theory: two models, a generator and a critic, are competing with each other while making …

  Cited by 8  Related articles  All 4 versions 

[CITATION] From GAN to WGAN. arXiv e-prints

L Weng - arXiv preprint arXiv:1904.08994, 2019

  Cited by 1
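Since several of the entries that follow apply WGANs, a compact sketch of the critic step they rely on may be useful. This is my own illustrative code (toy data, arbitrary layer sizes), not taken from the post above: the critic maximizes E[D(real)] - E[D(fake)], a sample estimate of the Kantorovich-Rubinstein dual of the Wasserstein-1 distance, with weight clipping as a crude Lipschitz constraint.

import torch
import torch.nn as nn

critic = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.RMSprop(critic.parameters(), lr=5e-5)

real = torch.randn(128, 2) + 2.0          # stand-in batch of "real" samples
fake = torch.randn(128, 2)                # stand-in batch of generator samples

for _ in range(5):                        # a few critic updates
    loss = -(critic(real).mean() - critic(fake).mean())
    opt.zero_grad()
    loss.backward()
    opt.step()
    for p in critic.parameters():         # weight clipping, as in the original WGAN
        p.data.clamp_(-0.01, 0.01)

print(float(-loss))                       # rough estimate of W_1 between the two batches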

<——2019———— 2019  —————50—


MR3907014  Bandini, Elena; Cosso, Andrea; Fuhrman, Marco; Pham, Huyên Randomized filtering and Bellman equation in Wasserstein space for partial observation control problem. Stochastic Process. Appl. 129 (2019), no. 2, 674–711. (Reviewer: Oleg N. Granichin) 93E20 (49L25 60G35 60H30 93E11)

Randomized filtering and Bellman equation in Wasserstein space for partial observation control problem

E Bandini, A Cosso, M Fuhrman, H Pham - Stochastic Processes and their …, 2019 - Elsevier

We study a stochastic optimal control problem for a partially observed diffusion. By using the control randomization method in Bandini et al. (2018), we prove a corresponding randomized dynamic programming principle (DPP) for the value function, which is obtained …

Cited by 11 Related articles All 11 versions 

 

MR3906994  Shao, Jinghai The existence of geodesics in Wasserstein spaces over path groups and loop groups. Stochastic Process. Appl. 129 (2019), no. 1, 153–173. 60B05 (22E30 28A33 49Q20)

 [PDF] arxiv.org

The existence of geodesics in Wasserstein spaces over path groups and loop groups

J Shao - Stochastic Processes and their Applications, 2019 - Elsevier

In this work we prove the existence and uniqueness of the optimal transport map for L^p-Wasserstein distance with p > 1, and particularly present an explicit expression of the optimal transport map for the case p = 2. As an application, we show the existence of geodesics …

Related articles All 8 versions 


MR3900833  Carrillo, José A.; Choi, Young-Pil; Tse, Oliver Convergence to equilibrium in Wasserstein distance for damped Euler equations with interaction forces. Comm. Math. Phys. 365 (2019), no. 1, 329–361. (Reviewer: Xinyu He) 35Q31

 [PDF] springer.com

Convergence to equilibrium in Wasserstein distance for damped Euler equations with interaction forces

JA Carrillo, YP Choi, O Tse - Communications in Mathematical Physics, 2019 - Springer

We develop tools to construct Lyapunov functionals on the space of probability measures in order to investigate the convergence to global equilibrium of a damped Euler system under the influence of external and interaction potential forces with respect to the 2-Wasserstein …

Cited by 2 Related articles All 7 versions 


MR3900011  Dieci, Luca; Walsh, J. D., III The boundary method for semi-discrete optimal transport partitions and Wasserstein distance computation. J. Comput. Appl. Math. 353 (2019), 318–344. 65K10 (35J96 49M25)

 [PDF] arxiv.org

The boundary method for semi-discrete optimal transport partitions and Wasserstein distance computation

L Dieci, JD Walsh III - Journal of Computational and Applied Mathematics, 2019 - Elsevier

We introduce a new technique, which we call the boundary method, for solving semi-discrete optimal transport problems with a wide range of cost functions. The boundary method reduces the effective dimension of the problem, thus improving complexity. For cost …

Cited by 3 Related articles All 5 versions


MR3884603  Li, Long; Vidard, Arthur; Le Dimet, François-Xavier; Ma, Jianwei Topological data assimilation using Wasserstein distance. Inverse Problems 35 (2019), no. 1, 015006, 23 pp. 62-07 (60B05 62M30 86A22)

 Topological data assimilation using Wasserstein distance

by Li, Long; Vidard, Arthur; Le Dimet, François-Xavier; More...

Inverse Problems, 01/2019, Volume 35, Issue 1

Journal Article: Full Text Online

Topological data assimilation using Wasserstein distance

L Li, A Vidard, FX Le Dimet, J Ma - Inverse Problems, 2018 - iopscience.iop.org

This work combines a level-set approach and the optimal transport-based Wasserstein distance in a data assimilation framework. The primary motivation of this work is to reduce assimilation artifacts resulting from the position and observation error in the tracking and …

All 2 versions



MR3881882 Bonnet, Benoît; Rossi, Francesco The Pontryagin maximum principle in the Wasserstein space. Calc. Var. Partial Differential Equations 58 (2019), no. 1, Art. 11, 36 pp. (Reviewer: Andrey V. Sarychev) 49K20 (49K27 58E25)

[PDF] arxiv.org

The Pontryagin Maximum Principle in the Wasserstein Space

B Bonnet, F Rossi - Calculus of Variations and Partial Differential …, 2019 - Springer

Abstract We prove a Pontryagin Maximum Principle for optimal control problems in the space of probability measures, where the dynamics is given by a transport equation with non-local velocity. We formulate this first-order optimality condition using the formalism of …

Cited by 3 Related articles All 34 versions 


MR3875604  del Barrio, Eustasio; Gordaliza, Paula; Lescornel, Hélène; Loubes, Jean-Michel Central limit theorem and bootstrap procedure for Wasserstein's variations with an application to structural relationships between distributions. J. Multivariate Anal. 169 (2019), 341–362. (Reviewer: S. Valère Bitseki Penda) 60F05 (62G05 62G20)

Central limit theorem and bootstrap procedure for Wasserstein's variations with an application to structural relationships between distributions

Cited by 5 Related articles 


MR4045481 Prelim Rigollet, Philippe; Weed, Jonathan; Uncoupled isotonic regression via minimum Wasserstein deconvolution. Inf. Inference 8 (2019), no. 4, 691–71

Uncoupled isotonic regression via minimum Wasserstein ...

academic.oup.com › imaiai › article-abstract

Apr 2, 2019 — Uncoupled isotonic regression via minimum Wasserstein deconvolution. Philippe Rigollet,.

by P Rigollet · ‎2019 ·  

Cited by 35 Related articles All 8 versions



Domain Adaptation Techniques for EEG-based Emotion Recognition: A Comparative Study on Two Public Datasets

Z Lan, O Sourina, L Wang, R Scherer… - … on Cognitive and …, 2019 - graz.pure.elsevier.com

Affective brain-computer interface (aBCI) introduces personal affective factors to human-computer interaction. The state-of-the-art aBCI tailors its classifier to each individual user to achieve accurate emotion classification. A subject-independent classifier that is trained on …

<——2019————— 2019———————60—

 

 Document Title: "Data from National Center for Scientific Research (CNRS) Advance Knowledge in Probability Research 

(A new approach for the construction of a Wasserstein diffusion)". Start Page: 106. ISSN: 1944-1894.

Journal of Technology & Science, 01/2019 Newsletter: Full Text Online 

Probability Research; Data from National Center for Scientific Research (CNRS) 

Advance Knowledge in Probability Research 

(A new approach for the construction of a Wasserstein diffusion)

Journal of Technology & Science, Jan 20, 2019, 106

Start Page (106) and ISSN (1944-1894)

Newspaper Article: Full Text Online

arXiv 2017. published 2018.


[PDF] mlr.press

Wasserstein adversarial examples via projected sinkhorn iterations

E Wong, F Schmidt, Z Kolter - International Conference on …, 2019 - proceedings.mlr.press

… makes the procedure tractable for generating adversarial images. In contrast to l∞ and l2 perturbations, we find that the Wasserstein me…

Wasserstein Hamiltonian flows

SN Chow, W Li, H Zhou - arXiv preprint arXiv:1903.01088, 2019 - arxiv.org

We establish kinetic Hamiltonian flows in density space embedded with the $L^2$-Wasserstein metric tensor. We derive the Euler-Lagrange equation in density space, which introduces the associated Hamiltonian flows. We demonstrate that many classical equations …

Related articles All 4 versions


[PDF] arxiv.org

Robust Wasserstein profile inference and applications to machine learning

J Blanchet, Y Kang, K Murthy - Journal of Applied Probability, 2019 - cambridge.org

We show that several machine learning estimators, including square-root least absolute shrinkage and selection and regularized logistic regression, can be represented as solutions to distributionally robust optimization problems. The associated uncertainty regions …

Cited by 221 Related articles All 5 versions

[PDF] koreascience.or.kr

Combining multi-task autoencoder with Wasserstein generative adversarial networks for improving speech recognition performance

CY Kao, H Ko - The Journal of the Acoustical Society of Korea, 2019 - koreascience.or.kr

As the presence of background noise in acoustic signal degrades the performance of speech or acoustic event recognition, it is still challenging to extract noise-robust acoustic features from noisy signal. In this paper, we propose a combined structure of Wasserstein  …

  Related articles All 3 versions 


2019   


[PDF] unibocconi.it

[PDF] Bayesian model comparison based on Wasserstein distances

M Catalano, A Lijoi, I Pruenster - SIS 2019 Smart Statistics for …, 2019 - iris.unibocconi.it


Cited by 2 Related articles

[PDF] arxiv.org

Uncoupled isotonic regression via minimum Wasserstein deconvolution

P Rigollet, J Weed - Information and Inference: A Journal of the …, 2019 - academic.oup.com

Isotonic regression is a standard problem in shape-constrained estimation where the goal is to estimate an unknown non-decreasing regression function from independent pairs where. While this problem is well understood both statistically and computationally, much less is …

Cited by 16 Related articles All 3 versions 


[PDF] thecvf.com

Sliced wasserstein generative models

J Wu, Z Huang, D Acharya, W Li… - Proceedings of the …, 2019 - openaccess.thecvf.com

In generative modeling, the Wasserstein distance (WD) has emerged as a useful metric to measure the discrepancy between generated and real data distributions. Unfortunately, it is challenging to approximate the WD of high-dimensional distributions. In contrast, the sliced …

Cited by 84 Related articles All 15 versions

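A short NumPy sketch of the sliced construction referred to above (an illustration under my own conventions, not the authors' code): project both point clouds onto random unit directions and average the resulting one-dimensional Wasserstein distances, each of which is a simple sorted matching.

import numpy as np

def sliced_wasserstein(X, Y, n_proj=100, p=2, seed=0):
    """Monte Carlo sliced W_p between equal-size point clouds X, Y in R^d."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=X.shape[1])
        theta /= np.linalg.norm(theta)            # random unit direction
        x_proj = np.sort(X @ theta)               # 1-D projections, sorted
        y_proj = np.sort(Y @ theta)
        total += np.mean(np.abs(x_proj - y_proj) ** p)
    return (total / n_proj) ** (1.0 / p)

X = np.random.default_rng(1).normal(size=(500, 3))
Y = np.random.default_rng(2).normal(loc=1.0, size=(500, 3))
print(sliced_wasserstein(X, Y))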



Peer-reviewed
Harmonic mappings valued in the Wasserstein space
Author:Hugo Lavenant
Summary: We propose a definition of the Dirichlet energy (which is roughly speaking the integral of the square of the gradient) for mappings μ : Ω → (P(D), W_2) defined over a subset Ω of R^p and valued in the space P(D) of probability measures on a compact convex subset D of R^q endowed with the quadratic Wasserstein distance. Our definition relies on a straightforward generalization of the Benamou-Brenier formula (already introduced by Brenier) but is also equivalent to the definition of Korevaar, Schoen and Jost as limit of approximate Dirichlet energies, and to the definition of Reshetnyak of Sobolev spaces valued in metric spaces.

We study harmonic mappings, i.e. minimizers of the Dirichlet energy provided that the values on the boundary ∂Ω are fixed. The notion of constant-speed geodesics in the Wasserstein space is recovered by taking for Ω a segment of R. As the Wasserstein space (P(D), W_2) is positively curved in the sense of Alexandrov we cannot apply the theory of Korevaar, Schoen and Jost, and we use instead arguments based on optimal transport. We manage to get existence of harmonic mappings provided that the boundary values are Lipschitz on ∂Ω; uniqueness is an open question.

If Ω is a segment of R, it is known that a curve valued in the Wasserstein space P(D) can be seen as a superposition of curves valued in D. We show that it is no longer the case in higher dimensions: a generic mapping Ω → P(D) cannot be represented as the superposition of mappings Ω → D.

We are able to show the validity of a maximum principle: the composition F ∘ μ of a function F : P(D) → R convex along generalized geodesics and a harmonic mapping μ : Ω → P(D) is a subharmonic real-valued function.

We also study the special case where we restrict ourselves to a given family of elliptically contoured distributions (a finite-dimensional and geodesically convex submanifold of (P(D), W_2) which generalizes the case of Gaussian measures) and show that it boils down to harmonic mappings valued in the Riemannian manifold of symmetric matrices endowed with the distance coming from optimal transport.
Article, 2019
Publication:Journal of Functional Analysis, 277, 20190801, 688
Publisher:2019


Peer-reviewed
Misfit function for full waveform inversion based on the Wasserstein metric with dynamic formulation
Authors: Peng Yong, Wenyuan Liao, Jianping Huang, Zhenchun Li, Yaoting Lin
Summary:Conventional full waveform inversion (FWI) using least square distance ( L 2 norm) between the observed and predicted seismograms suffers from local minima. Recently, the Wasserstein metric ( W 1 metric) has been introduced to FWI to compute the misfit between two seismograms. Instead of comparisons bin by bin, the W 1 metric allows to compare signal intensities across different coordinates. This measure has great potential to account for time and space shifts of events within seismograms. However, there are two main challenges in application of the W 1 metric to FWI. The first one is that the compared signals need to satisfy nonnegativity and mass conservation assumptions. The second one is that the computation of W 1 metric between two seismograms is a computationally expensive problem. In this paper, a strategy is used to satisfy the two assumptions via decomposition and recombination of original seismic data. In addition, the computation of the W 1 metric based on dynamic formulation is formulated as a convex optimization problem. A primal-dual hybrid gradient method with linesearch has been developed to solve this large-scale optimization problem on GPU device. The advantages of the new method are that it is easy to implement and has high computational efficiency. Compared to the L 2 norm based FWI, the computation time of the proposed method will approximately increase by 11% in our case studies. A 1D time-shift signals case study has indicated that the W 1 metric is more effective in capturing time shift and makes the misfit function more convex. Two applications to synthetic data using transmissive and reflective recording geometries have demonstrated the effectiveness of the W 1 metric in mitigating cycle-skipping issues. We have also applied the proposed method to SEG 2014 benchmark data, which has further demonstrated that the W 1 metric can mitigate local minima and provide reliable velocity estimations without using low frequency information in the recorded data.

• Apply Wasserstein metric to full waveform inversion to mitigate local minimum issue.
• PDHG method is utilized to calculate the Wasserstein metric with dynamic formulation.
• Adjoint-state method is applied to compute the gradient of seismic inverse problem.
• The effectiveness and robustness of our method are verified by the benchmark data.
Article, 2019
Publication:Journal of Computational Physics, 399, 20191215
Publisher:2019

<——2019—————— 2019 ——-70 —



[PDF] Single Image Super-Resolution Based on Improved WGAN

L Yu, X Long, C Tong - download.atlantis-press.com

… Adversarial Network to the single image super-resolution reconstruction, which has achieved good results. But the loss function based on feature space in SRGAN objectively sacrifices the pursuit of high peak signal-to-noise-ratio (PSNR), which is the result of a …


Deep learning framework DNN with conditional WGAN for protein solubility prediction

X Han, L Zhang, K Zhou, X Wang - arXiv preprint arXiv:1811.07140, 2018 - arxiv.org

Protein solubility plays a critical role in improving production yield of recombinant proteins in biocatalyst and pharmaceutical field. To some extent, protein solubility can represent the function and activity of biocatalysts which are mainly composed of recombinant proteins …

Cited by 1 Related articles All 3 versions

Data Augmentation Technique Using WGAN [translated from the Korean title: WGAN 이용한 Data Augmentation 기법]

임세호, 신용구, 유철환, 이한규, 고성제 - 대한전자공학회 학술대회 (IEIE Conference), 2017 - dbpia.co.kr

Recently, as research on training networks with image datasets has become more active, the importance of building image datasets has grown. When the number of samples needed for training is insufficient, data augmentation artificially increases the dataset through methods such as cropping or rotating the existing data …

Related articles


WGAN Domain Adaptation for EEG-Based Emotion Recognition

Y Luo, SY Zhang, WL Zheng, BL Lu - International Conference on Neural …, 2018 - Springer

In this paper, we propose a novel Wasserstein generative adversarial network domain 

adaptation (WGANDA) framework for building cross-subject electroencephalography (EEG)-

based emotion recognition models. The proposed framework consists of GANs-like …

Cited by 8 Related articles All 4 versions


[PDF] arxiv.org

Robust Wasserstein profile inference and applications to machine learning

J Blanchet, Y Kang, K Murthy - Journal of Applied Probability, 2019 - cambridge.org

We show that several machine learning estimators, including square-root least absolute

shrinkage and selection and regularized logistic regression, can be represented as

solutions to distributionally robust optimization problems. The associated uncertainty regions …

  Cited by 136 Related articles All 5 versions


2019

[PDF] arxiv.org

A bound on the Wasserstein-2 distance between linear combinations of independent random variables

B Arras, E Azmoodeh, G Poly, Y Swan - Stochastic processes and their …, 2019 - Elsevier

We provide a bound on a distance between finitely supported elements and general 

elements of the unit sphere of ℓ²(ℕ). We use this bound to estimate the Wasserstein-2 

distance between random variables represented by linear combinations of independent …

Cited by 12 Related articles All 9 versions 



[PDF] arxiv.org

The Gromov–Wasserstein distance between networks and stable network invariants

S Chowdhury, F Mémoli - Information and Inference: A Journal of …, 2019 - academic.oup.com

We define a metric—the network Gromov–Wasserstein distance—on weighted, directed 

networks that is sensitive to the presence of outliers. In addition to proving its theoretical 

properties, we supply network invariants based on optimal transport that approximate this …


Cited by 46 Related articles All 7 versions

[PDF] arxiv.org

A partial Laplacian as an infinitesimal generator on the Wasserstein space

YT Chow, W Gangbo - Journal of Differential Equations, 2019 - Elsevier

In this manuscript, we consider special linear operators which we term partial Laplacians on 

the Wasserstein space, and which we show to be partial traces of the Wasserstein Hessian. 

We verify a distinctive smoothing effect of the “heat flows” they generated for a particular …

Cited by 11 Related articles All 7 versions 

Study Results from Department of Mathematics Broaden Understanding of Differential Equations 

(A Partial Laplacian As an Infinitesimal Generator On the Wasserstein... 

Mathematics Week, 11/2019

Newsletter, Full Text Online 

Mathematics - Differential Equations; Study Results from Department of Mathematics 

Broaden Understanding of Differential Equations 

(A Partial Laplacian As an Infinitesimal Generator On the Wasserstein... 

Journal of Mathematics, Nov 12, 2019, 604

Newspaper Article, Full Text Online

2019
A Wasserstein distance approach for concentration of empirical risk estimates
Authors: Prashanth L. A. (Creator), Bhat, Sanjay P. (Creator)
Summary: This paper presents a unified approach based on Wasserstein distance to derive concentration bounds for empirical estimates for two broad classes of risk measures defined in the paper. The classes of risk measures introduced include as special cases well known risk measures from the finance literature such as conditional value at risk (CVaR), optimized certainty equivalent risk, spectral risk measures, utility-based shortfall risk, cumulative prospect theory (CPT) value, rank dependent expected utility and distorted risk measures. Two estimation schemes are considered, one for each class of risk measures. One estimation scheme involves applying the risk measure to the empirical distribution function formed from a collection of i.i.d. samples of the random variable (r.v.), while the second scheme involves applying the same procedure to a truncated sample. The bounds provided apply to three popular classes of distributions, namely sub-Gaussian, sub-exponential and heavy-tailed distributions. The bounds are derived by first relating the estimation error to the Wasserstein distance between the true and empirical distributions, and then using recent concentration bounds for the latter. Previous concentration bounds are available only for specific risk measures such as CVaR and CPT-value. The bounds derived in this paper are shown to either match or improve upon previous bounds in cases where they are available. The usefulness of the bounds is illustrated through an algorithm and the corresponding regret bound for a stochastic bandit problem involving a general risk measure from each of the two classes introduced in the paper.
Downloadable Archival Material, 2019-02-27
Publisher:2019-02-27

[PDF] arxiv.org

Approximate Bayesian computation with the Wasserstein distance

E Bernton, PE Jacob, M Gerber… - Journal of the Royal …, 2019 - Wiley Online Library

A growing number of generative statistical models do not permit the numerical evaluation of 

their likelihood functions. Approximate Bayesian computation has become a popular 

approach to overcome this issue, in which one simulates synthetic data sets given …

Cited by 13 Related articles All 9 versions

Approximate Bayesian computation with the Wasserstein distance

E Bernton, PE Jacob, M Gerber… - Journal of the Royal …, 2019 - Wiley Online Library

… We plot the resulting distances against the number of model simulations in Fig … In contrast, the

ABC approach with the Euclidean distance struggles to approximate the … The estimated

1-Wasserstein distance between the 2048 accepted samples and the posterior was 0.63 … 

 Cited by 102 Related articles All 15 versions

<——2019    ———— 2019  —————————80—


 

Wasserstein Distance Based Hierarchical Attention Model for Cross-Domain Sentiment Classification
Authors: Du Y., He M., Zhao X.
Article, 2019
Publication:Moshi Shibie yu Rengong Zhineng/Pattern Recognition and Artificial Intelligence, 32, 2019 05 01, 446
Publisher:2019

[PDF] arxiv.org

Statistical inference for Bures-Wasserstein barycenters

A Kroshnin, V Spokoiny, A Suvorikova - arXiv preprint arXiv:1901.00226, 2019 - arxiv.org

In this work we introduce the concept of Bures-Wasserstein barycenter \(Q_*\), that is 

essentially a Fréchet mean of some distribution \(\mathbb{P}\) supported on a subspace of positive 

semi-definite Hermitian operators \(\mathbb{H}_{+}(d)\). We allow a barycenter to be restricted to …

Cited by 9 Related articles All 3 versions

[PDF] researchgate.net

2019 arxiv.org › math

[CITATION] Statistical inference for bures-Wasserstein barycenters (2019)

A Kroshnin, V Spokoiny, A Suvorikova - arXiv preprint arXiv:1901.00226, 1901

  Cited by 2 Related articles


[PDF] arxiv.org

Wasserstein fair classification

R Jiang, A Pacchiano, T Stepleton, H Jiang… - arXiv preprint arXiv …, 2019 - arxiv.org

We propose an approach to fair classification that enforces independence between the 

classifier outputs and sensitive information by minimizing Wasserstein-1 distances. The 

approach has desirable theoretical properties and is robust to specific choices of the …

Cited by 3 All 5 versions


[PDF] arxiv.org

Wasserstein distributionally robust optimization: Theory and applications in machine learning

D Kuhn, PM Esfahani, VA Nguyen… - … Science in the Age …, 2019 - pubsonline.informs.org

Many decision problems in science, engineering, and economics are affected by uncertain 

parameters whose distribution is only indirectly observable through samples. The goal of 

data-driven decision making is to learn a decision from finitely many training samples that …

Cited by 224  Related articles All 9 versions
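
For reference, the generic data-driven problem studied in this line of work can be written as follows (notation mine, not the survey's; $W_p$ is the order-$p$ Wasserstein distance, $\varepsilon$ the ambiguity radius, $\ell$ the loss, and $\hat{P}_n$ the empirical distribution of the $n$ training samples):

$$ \min_{\theta \in \Theta} \; \sup_{Q \,:\, W_p(Q, \hat{P}_n) \le \varepsilon} \; \mathbb{E}_{\xi \sim Q}\big[\ell(\theta, \xi)\big], \qquad \hat{P}_n = \tfrac{1}{n}\sum_{i=1}^{n} \delta_{\xi_i}. $$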

[PDF] arxiv.org

Estimation of smooth densities in Wasserstein distance

J Weed, Q Berthet - arXiv preprint arXiv:1902.01778, 2019 - arxiv.org

The Wasserstein distances are a set of metrics on probability distributions supported on 

$\mathbb {R}^ d $ with applications throughout statistics and machine learning. Often, such 

distances are used in the context of variational problems, in which the statistician employs in …

Cited by 53 Related articles All 6 versions

Estimation of smooth densities in Wasserstein distance (289 views)

Sep 17, 2019


[PDF] arxiv.org

Data-driven chance constrained optimization under Wasserstein ambiguity sets

AR Hota, A Cherukuri, J Lygeros - 2019 American Control …, 2019 - ieeexplore.ieee.org

We present a data-driven approach for distri-butionally robust chance constrained 

optimization problems (DRCCPs). We consider the case where the decision maker has 

access to a finite number of samples or realizations of the uncertainty. The chance constraint …

Cited by 29 Related articles All 6 versions

Data-driven chance constrained optimization under wasserstein ambiguity sets  book

 2019

Algorithms for optimal transport and wasserstein distances 

by Schrieber, Jörn 

Dissertation/Thesis:   Citation Online 

Algorithms for Optimal Transport and Wasserstein Distances .

Feb 28, 2019 - Algorithms for Optimal Transport and Wasserstein Distances. Schrieber, Jörn. Doctoral thesis. Date of Examination: 2019-02-14. Date of issue: ...

Schrieber, Jörn. Algorithms for Optimal Transport and Wasserstein Distances. 

Degree: PhD, Mathematik und Informatik, 2019, Georg-August-Universität Göttingen

URL: http://hdl.handle.net/11858/00-1735-0000-002E-E5B2-B 

► Optimal Transport and Wasserstein Distance are closely related terms that do not only have a long history in the mathematical literature, but also have seen… (more)

Record Details Similar Records Cite Share »
Boston University


2019 see 2020  [PDF] arxiv.org

Progressive Wasserstein Barycenters of Persistence Diagrams

J Vidal, J Budin, J Tierny - IEEE transactions on visualization …, 2019 - ieeexplore.ieee.org

This paper presents an efficient algorithm for the progressive approximation of Wasserstein 

barycenters of persistence diagrams, with applications to the visual analysis of ensemble 

data. Given a set of scalar fields, our approach enables the computation of a persistence …

Cited by 23 Related articles All 17 versions

Progressive Wasserstein Barycenters of Persistence Diagrams
www.youtube.com › watch

Link to the paper: https://julien-tierny.github.io/stuff/papers/vidal_vis19.

YouTube · Julien Tierny · 

Jul 18, 2019

[VIS19 Preview] Progressive Wasserstein Barycenters of ...

Abstract: This paper presents an efficient algorithm for the progressive approximation of Wasserstein ...

Sep 14, 2019 · Uploaded by VGTCommunity


[PDF] arxiv.org

On the complexity of approximating wasserstein barycenter

A Kroshnin, D Dvinskikh, P Dvurechensky… - arXiv preprint arXiv …, 2019 - arxiv.org

We study the complexity of approximating Wasserstein barycenter of $ m $ discrete 

measures, or histograms of size $ n $ by contrasting two alternative approaches, both using 

entropic regularization. The first approach is based on the Iterative Bregman Projections …

Cited by 7 Related articles All 5 versions 

On the Complexity of Approximating Wasserstein Barycenters

P Dvurechensky - icml.cc

Page 1. On the Complexity of Approximating Wasserstein Barycenters Alexey Kroshnin, Darina Dvinskikh, Pavel Dvurechensky, Alexander Gasnikov, Nazarii Tupitsa, César A. Uribe International Conference on Machine Learning 2019 Page 2. Wasserstein barycenter ˆν …

 Cited by 39 Related articles All 11 versions 
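
One of the two approaches contrasted in these works is Iterative Bregman Projections. Below is my own minimal sketch of that fixed-point iteration for histograms on a single shared grid; the regularization, grid and iteration count are illustrative choices with no tuning attempted.

```python
# My sketch of the Iterative Bregman Projections barycenter iteration for m
# histograms on one shared grid; eps and n_iter are illustrative, not tuned.
import numpy as np

def ibp_barycenter(hists, C, weights, eps=0.05, n_iter=200):
    """hists: (m, n) histograms summing to 1; C: (n, n) ground cost; weights: (m,)."""
    K = np.exp(-C / eps)                 # Gibbs kernel
    v = np.ones_like(hists)              # one scaling vector per input measure
    for _ in range(n_iter):
        u = hists / (v @ K.T)            # u_k = a_k / (K v_k)
        log_Ktu = np.log(u @ K)          # row k holds log(K^T u_k)
        b = np.exp(weights @ log_Ktu)    # geometric mean across the inputs
        v = b[None, :] / (u @ K)         # v_k = b / (K^T u_k)
    return b

x = np.linspace(0.0, 1.0, 200)
C = (x[:, None] - x[None, :]) ** 2       # squared-distance ground cost
h1 = np.exp(-((x - 0.3) / 0.1) ** 2); h1 /= h1.sum()
h2 = np.exp(-((x - 0.7) / 0.1) ** 2); h2 /= h2.sum()

bary = ibp_barycenter(np.stack([h1, h2]), C, np.array([0.5, 0.5]))
print(bary.sum(), x[bary.argmax()])      # mass ~1, mode near the midpoint 0.5
```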


Conditional WGAN for grasp generation

F Patzelt, R Haschke, H Ritter - European Symposium on …, 2019 - pub.uni-bielefeld.de

This work proposes a new approach to robotic grasping exploiting conditional Wasserstein 

generative adversarial networks (WGANs), which output promising grasp candidates from 

depth image inputs. In contrast to discriminative models, the WGAN approach enables …


Related articles All 5 versions

<——2019    ——————— 2019 — -90—


 

[PDF] arxiv.org

On the minimax optimality of estimating the wasserstein metric

T Liang - arXiv preprint arXiv:1908.10324, 2019 - arxiv.org

We study the minimax optimal rate for estimating the Wasserstein-$1 $ metric between two 

unknown probability measures based on $ n $ iid empirical samples from them. We show 

that estimating the Wasserstein metric itself between probability measures, is not …

Cited by 3 All 3 versions
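
The object analyzed in such minimax results is typically the plug-in estimator, i.e. the Wasserstein distance between the two empirical measures. In one dimension this is simple to compute; a small sketch using SciPy (the distributions and sample sizes below are arbitrary illustrative choices):

```python
# Plug-in estimation of W1 from i.i.d. samples (1-D case); distributions and
# sample sizes are arbitrary illustrative choices.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=1000)
y = rng.normal(0.5, 1.0, size=1000)

print(wasserstein_distance(x, y))                 # empirical W1, close to the true 0.5
print(np.mean(np.abs(np.sort(x) - np.sort(y))))   # same value for equal sample sizes
```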


[PDF] arxiv.org

Adapted wasserstein distances and stability in mathematical finance

J Backhoff-Veraguas, D Bartl, M Beiglböck… - arXiv preprint arXiv …, 2019 - arxiv.org

Assume that an agent models a financial asset through a measure Q with the goal to 

price/hedge some derivative or optimize some expected utility. Even if the model Q is 

chosen in the most skilful and sophisticated way, she is left with the possibility that Q does …

Cited by 6 Related articles All 11 versions 


[PDF] arxiv.org

Dynamic models of Wasserstein-1-type unbalanced transport

B Schmitzer, B Wirth - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

We consider a class of convex optimization problems modelling temporal mass transport 

and mass change between two given mass distributions (the so-called dynamic formulation 

of unbalanced transport), where we focus on those models for which transport costs are …

Cited by 10 Related articles All 4 versions

[PDF] arxiv.org

Wasserstein stability estimates for covariance-preconditioned Fokker--Planck equations

JA Carrillo, U Vaes - arXiv preprint arXiv:1910.07555, 2019 - arxiv.org

We study the convergence to equilibrium of the mean field PDE associated with the 

derivative-free methodologies for solving inverse problems. We show stability estimates in 

the euclidean Wasserstein distance for the mean field PDE by using optimal transport …

Cited by 3 All 2 versions


[PDF] arxiv.org

Stacked wasserstein autoencoder

W Xu, S Keshmiri, G Wang - Neurocomputing, 2019 - Elsevier

Approximating distributions over complicated manifolds, such as natural images, are 

conceptually attractive. The deep latent variable model, trained using variational 

autoencoders and generative adversarial networks, is now a key technique for …

Cited by 2 All 4 versions 

 Stacked Wasserstein Autoencoder 

By: Xu, Wenju; Keshmiri, Shawn; Wang, Guanghui 

NEUROCOMPUTING   Volume: ‏ 363   Pages: ‏ 195-204   Published: ‏ OCT 21 2019 

 Free Full Text from Publisher 

Times Cited: 2  

Cited by 12 Related articles All 5 versions

2019

[PDF] arxiv.org

Gromov-wasserstein learning for graph matching and node embedding

H Xu, D Luo, H Zha, L Carin - arXiv preprint arXiv:1901.06003, 2019 - arxiv.org

A novel Gromov-Wasserstein learning framework is proposed to jointly match (align) graphs 

and learn embedding vectors for the associated graph nodes. Using Gromov-Wasserstein 

discrepancy, we measure the dissimilarity between two graphs and find their …

Cited by 12 Related articles All 8 versions


[PDF] arxiv.org

Wasserstein information matrix

W Li, J Zhao - arXiv preprint arXiv:1910.11248, 2019 - arxiv.org

We study the information matrix for statistical models by $ L^ 2$-Wasserstein metric. We call 

it Wasserstein information matrix (WIM), which is an analog of classical Fisher information 

matrix. Based on this matrix, we introduce Wasserstein score functions and study covariance …

Cited by 12 Related articles All 4 versions


[PDF] arxiv.org

Poincaré Wasserstein Autoencoder

I Ovinnikov - arXiv preprint arXiv:1901.01427, 2019 - arxiv.org

This work presents a reformulation of the recently proposed Wasserstein autoencoder 

framework on a non-Euclidean manifold, the Poincaré ball model of the hyperbolic space. 

By assuming the latent space to be hyperbolic, we can use its intrinsic hierarchy to impose …

Cited by 27 Related articles All 5 versions

 arXiv:1901.01427  [pdf, other]2019

Poincaré Wasserstein Autoencoder

Authors: Ivan Ovinnikov

Abstract: This work presents a reformulation of the recently proposed Wasserstein autoencoder framework on a non-Euclidean manifold, the Poincaré ball model of the hyperbolic space. 

By assuming the latent space to be hyperbolic, we can use its intrinsic hierarchy to impose structure on the learned latent space representations. 

We demonstrate the model in the visual domain to analyze some of its properties a…

Submitted 5 January, 2019; originally announced January 2019.

Journal ref: Bayesian Deep Learning Workshop (NeurIPS 2018)


[PDF] arxiv.org

Subspace robust wasserstein distances

FP Paty, M Cuturi - arXiv preprint arXiv:1901.08949, 2019 - arxiv.org

Making sense of Wasserstein distances between discrete measures in high-dimensional 

settings remains a challenge. Recent work has advocated a two-step approach to improve 

robustness and facilitate the computation of optimal transport, using for instance projections …

Cited by 90 Related articles All 6 versions

[PDF] mlr.press

A gradual, semi-discrete approach to generative network training via explicit wasserstein minimization

Y Chen, M Telgarsky, C Zhang… - International …, 2019 - proceedings.mlr.press

This paper provides a simple procedure to fit generative networks to target distributions, with

the goal of a small Wasserstein distance (or other optimal transport costs). The approach is

based on two principles:(a) if the source randomness of the network is a continuous …

  Cited by 5 Related articles All 10 versions 
<——2019 ———— 2019  ———————100—


[PDF] arxiv.org

Wasserstein robust reinforcement learning

MA Abdullah, H Ren, HB Ammar, V Milenkovic… - arXiv preprint arXiv …, 2019 - arxiv.org

Reinforcement learning algorithms, though successful, tend to over-fit to training 

environments hampering their application to the real-world. This paper proposes WR $^{2} $ 

L; a robust reinforcement learning algorithm with significant robust performance on low and …

 Cited by 35 Related articles All 7 versions

[PDF] arxiv.org

Precise simulation of electromagnetic calorimeter showers using a Wasserstein Generative Adversarial Network

M Erdmann, J Glombitza, T Quast - Computing and Software for Big …, 2019 - Springer

Simulations of particle showers in calorimeters are computationally time-consuming, as they 

have to reproduce both energy depositions and their considerable fluctuations. A new 

approach to ultra-fast simulations is generative models where all calorimeter energy …

Cited by 13 Related articles All 6 versions 

 [CITATION] Precise simulation of electromagnetic calorimeter showers using a Wasserstein generative adversarial network. Comput Softw Big Sci 3 (1): 4

M Erdmann, J Glombitza, T Quast - arXiv preprint arXiv:1807.01954, 2019

 Cited by 85 Related articles All 7 versions

[PDF] arxiv.org

Orthogonal estimation of wasserstein distances

M Rowland, J Hron, Y Tang, K Choromanski… - arXiv preprint arXiv …, 2019 - arxiv.org

Wasserstein distances are increasingly used in a wide variety of applications in machine 

learning. Sliced Wasserstein distances form an important subclass which may be estimated 

efficiently through one-dimensional sorting operations. In this paper, we propose a new …

Cited by 59 Related articles All 11 versions
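
For orientation, the plain Monte-Carlo sliced-Wasserstein estimator that such papers refine looks as follows (my own sketch with i.i.d. random directions; the paper above studies orthogonally coupled directions instead). Equal sample sizes are assumed so each one-dimensional problem reduces to sorting.

```python
# Plain Monte-Carlo sliced Wasserstein with i.i.d. directions (my sketch); the
# paper above replaces these directions with orthogonally coupled ones.
import numpy as np

def sliced_wasserstein(X, Y, n_proj=500, p=2, seed=0):
    """X, Y: (n, d) samples with equal n; returns the sliced W_p estimate."""
    rng = np.random.default_rng(seed)
    theta = rng.normal(size=(n_proj, X.shape[1]))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)  # directions on the sphere
    px = np.sort(X @ theta.T, axis=0)                      # sorted 1-D projections
    py = np.sort(Y @ theta.T, axis=0)
    return np.mean(np.abs(px - py) ** p) ** (1.0 / p)      # average over slices

rng = np.random.default_rng(1)
X = rng.normal(0.0, 1.0, size=(2000, 5))
Y = rng.normal(0.3, 1.0, size=(2000, 5))
print(sliced_wasserstein(X, Y))
```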

[PDF] arxiv.org

A Rademacher-type theorem on L2-Wasserstein spaces over closed Riemannian manifolds

LD Schiavo - Journal of Functional Analysis, 2019 - Elsevier

Let P be any Borel probability measure on the L^2-Wasserstein space (P_2(M), W_2) over a 

closed Riemannian manifold M. We consider the Dirichlet form E induced by P and by the 

Wasserstein gradient on P_2(M). Under natural assumptions on P, we show that W_2 …

Cited by 3 Related articles  All 4 versions

 

[PDF] esaim-cocv.org

A Pontryagin Maximum Principle in Wasserstein Spaces for Constrained Optimal Control Problems

B Bonnet - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

In this paper, we prove a Pontryagin Maximum Principle for constrained optimal control 

problems in the Wasserstein space of probability measures. The dynamics is described by a 

transport equation with non-local velocities which are affine in the control, and is subject to …

Cited by 3 Related articles All 73 versions 


2019

[PDF] arxiv.org

Wasserstein dependency measure for representation learning

S Ozair, C Lynch, Y Bengio, A Oord, S Levine… - arXiv preprint arXiv …, 2019 - arxiv.org

Mutual information maximization has emerged as a powerful learning objective for 

unsupervised representation learning obtaining state-of-the-art performance in applications 

such as object recognition, speech recognition, and reinforcement learning. However, such …

Cited by 6 Related articles All 2 versions

Mar 28, 2019

[PDF] nips.cc

[PDF] Supplementary Material for: Tree-Sliced Variants of Wasserstein Distances

T Le, M Yamada, K Fukumizu, M Cuturi - papers.nips.cc

In this section, we give detailed proofs for the inequality in the connection with OT with Euclidean ground metric (ie W2 metric) for TW distance, and investigate an empirical relation between TSW and W2 metric, especially when one increases the number of tree …

Cited by 2 Related articles

Related articles All 2 versions
[CITATION] Supplementary Material for: Tree-Sliced Variants of Wasserstein Distances

T Le, M Yamada, K Fukumizu, M Cuturi

  Related articles

[PDF] nips.cc

Tree-sliced variants of wasserstein distances

T Le, M Yamada, K Fukumizu, M Cuturi - Advances in neural …, 2019 - papers.nips.cc

Optimal transport (\OT) theory defines a powerful set of tools to compare probability 

distributions.\OT~ suffers however from a few drawbacks, computational and statistical, 

which have encouraged the proposal of several regularized variants of OT in the recent …

Cited by 2


[PDF] arxiv.org

Wasserstein Adversarial Examples via Projected Sinkhorn Iterations

E Wong, FR Schmidt, JZ Kolter - arXiv preprint arXiv:1902.07906, 2019 - arxiv.org

A rapidly growing area of work has studied the existence of adversarial examples, 

datapoints which have been perturbed to fool a classifier, but the vast majority of these 

works have focused primarily on threat models defined by $\ell_p $ norm-bounded …

Cited by 142 Related articles All 8 versions

[PDF] arxiv.org

Sparsemax and relaxed Wasserstein for topic sparsity

T Lin, Z Hu, X Guo - Proceedings of the Twelfth ACM International …, 2019 - dl.acm.org

Topic sparsity refers to the observation that individual documents usually focus on several 

salient topics instead of covering a wide variety of topics, and a real topic adopts a narrow 

range of terms instead of a wide coverage of the vocabulary. Understanding this topic …

Cited by 21 Related articles All 6 versions

Denoising of 3D Magnetic Resonance Images Using a Residual Encoder-Decoder Wasserstein Generative Adversarial Network

M Ran, J Hu, Y Chen, H Chen, H Sun, J Zhou… - Medical image …, 2019 - Elsevier

Abstract Structure-preserved denoising of 3D magnetic resonance imaging (MRI) images is 

a critical step in medical image analysis. Over the past few years, many algorithms with 

impressive performances have been proposed. In this paper, inspired by the idea of deep …

Related articles All 2 versions 

<——2019 —————— 2019 ——————110—

 

[PDF] arxiv.org

Primal dual methods for Wasserstein gradient flows

JA Carrillo, K Craig, L Wang, C Wei - arXiv preprint arXiv:1901.08081, 2019 - arxiv.org

Combining the classical theory of optimal transport with modern operator splitting 

techniques, we develop a new numerical method for nonlinear, nonlocal partial differential 

equations, arising in models of porous media, materials science, and biological swarming …

Cited by 6 Related articles All 3 versions


[PDF] thecvf.com

Unimodal-uniform constrained wasserstein training for medical diagnosis

X Liu, X Han, Y Qiao, Y Ge, S Li… - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

The labels in medical diagnosis task are usually discrete and successively distributed. For 

example, the Diabetic Retinopathy Diagnosis (DR) involves five health risk levels: no DR (0), 

mild DR (1), moderate DR (2), severe DR (3) and proliferative DR (4). This labeling system is …

Cited by 26 Related articles All 9 versions

[PDF] arxiv.org

Fisher information regularization schemes for Wasserstein gradient flows

W Li, J Lu, L Wang - arXiv preprint arXiv:1907.02152, 2019 - arxiv.org

We propose a variational scheme for computing Wasserstein gradient flows. The scheme 

builds upon the Jordan--Kinderlehrer--Otto framework with the Benamou-Brenier's dynamic 

formulation of the quadratic Wasserstein metric and adds a regularization by the Fisher …

Cited by 2 All 6 versions


[PDF] aaai.org

Wasserstein Soft Label Propagation on Hypergraphs: Algorithm and Generalization Error Bounds

T Gao, S Asoodeh, Y Huang, J Evans - Proceedings of the AAAI …, 2019 - www.aaai.org

Inspired by recent interests of developing machine learning and data mining algorithms on 

hypergraphs, we investigate in this paper the semi-supervised learning algorithm of 

propagating “soft labels” (e.g. probability distributions, class membership scores) over …

Cited by 2 Related articles All 4 versions


[PDF] arxiv.org

Fast convergence of empirical barycenters in Alexandrov spaces and the Wasserstein space

TL Gouic, Q Paris, P Rigollet, AJ Stromme - arXiv preprint arXiv …, 2019 - arxiv.org

This work establishes fast rates of convergence for empirical barycenters over a large class 

of geodesic spaces with curvature bounds in the sense of Alexandrov. More specifically, we 

show that parametric rates of convergence are achievable under natural conditions that …

Cited by 2 All 2 versions


2019

Wasserstein generative learning with kinematic constraints for probabilistic interactive driving behavior prediction

H Ma, J Li, W Zhan, M Tomizuka - 2019 IEEE Intelligent …, 2019 - ieeexplore.ieee.org

Since prediction plays a significant role in enhancing the performance of decision making 

and planning procedures, the requirement of advanced methods of prediction becomes 

urgent. Although many literatures propose methods to make prediction on a single agent …

Cited by 23 Related articles


[PDF] arxiv.org

 [PDF] projecteuclid.org

Behavior of the empirical Wasserstein distance in $\mathbb{R}^d$ under moment conditions

J Dedecker, F Merlevède - Electronic Journal of Probability, 2019 - projecteuclid.org

We establish some deviation inequalities, moment bounds and almost sure results for the

Wasserstein distance of order $ p\in [1,\infty) $ between the empirical measure of

independent and identically distributed ${\mathbb R}^ d $-valued random variables and the …

  Cited by 7 Related articles All 12 versions


[PDF] arxiv.org

Confidence regions in wasserstein distributionally robust estimation

J Blanchet, K Murthy, N Si - arXiv preprint arXiv:1906.01614, 2019 - arxiv.org

Wasserstein distributionally robust optimization estimators are obtained as solutions of min-

max problems in which the statistician selects a parameter minimizing the worst-case loss

among all probability models within a certain distance (in a Wasserstein sense) from the …

 Cited by 18 Related articles All 7 versions 


[PDF] arxiv.org

A Two-Phase Two-Fluxes Degenerate Cahn–Hilliard Model as Constrained Wasserstein Gradient Flow

C Cancès, D Matthes, F Nabet - Archive for Rational Mechanics and …, 2019 - Springer

We study a non-local version of the Cahn–Hilliard dynamics for phase separation in a two-

component incompressible and immiscible mixture with linear mobilities. Differently to the 

celebrated local model with nonlinear mobility, it is only assumed that the divergences of the …

Cited by 5 Related articles All 38 versions


Sliced-Wasserstein Flows: Nonparametric Generative Modeling via Optimal Transport and Diffusions

A Liutkus, U Simsekli, S Majewski… - International …, 2019 - proceedings.mlr.press

By building upon the recent theory that established the connection between implicit 

generative modeling (IGM) and optimal transport, in this study, we propose a novel 

parameter-free algorithm for learning the underlying distributions of complicated datasets …

 Cited by 32 Related articles All 7 versions 

[CITATION] … Majewski, Alain Durmus, and Fabian-Robert Stoter. Sliced-Wasserstein flows: Nonparametric generative modeling via optimal transport and diffusions

A Liutkus - International Conference on Machine Learning, 2019

  Cited by 2 Related articles

<——2019———— 2019———120 —


On the Bures–Wasserstein distance between positive definite matrices

R Bhatia, T Jain, Y Lim - Expositiones Mathematicae, 2019 - Elsevier

The metric $d(A,B) = \left[\operatorname{tr} A + \operatorname{tr} B - 2\operatorname{tr}\,(A^{1/2} B A^{1/2})^{1/2}\right]^{1/2}$ on the manifold of $n \times n$

positive definite matrices arises in various optimisation problems, in quantum information

and in the theory of optimal transport. It is also related to Riemannian geometry. In the first …

  Cited by 99 Related articles All 6 versions
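
A small numerical illustration of the Bures-Wasserstein formula quoted above (my own sketch; any two symmetric positive definite matrices will do):

```python
# Numerical check of the formula above for two small SPD matrices (my sketch).
import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein(A, B):
    rA = sqrtm(A)
    cross = sqrtm(rA @ B @ rA)                       # (A^{1/2} B A^{1/2})^{1/2}
    d2 = np.trace(A) + np.trace(B) - 2.0 * np.real(np.trace(cross))
    return np.sqrt(max(d2, 0.0))                     # clip tiny negative round-off

A = np.array([[2.0, 0.5], [0.5, 1.0]])
B = np.array([[1.0, -0.2], [-0.2, 3.0]])
print(bures_wasserstein(A, B))
print(bures_wasserstein(A, A))                       # ~0 for identical matrices
```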


[PDF] arxiv.org

Generalized Sliced Wasserstein Distances

S Kolouri, K Nadjahi, U Simsekli, R Badeau… - arXiv preprint arXiv …, 2019 - arxiv.org

The Wasserstein distance and its variations, eg, the sliced-Wasserstein (SW) distance, have 

recently drawn attention from the machine learning community. The SW distance, 

specifically, was shown to have similar properties to the Wasserstein distance, while being …

Cited by 108 Related articles All 8 versions

[PDF] arxiv.org

Wasserstein smoothing: Certified robustness against wasserstein adversarial attacks

A Levine, S Feizi - arXiv preprint arXiv:1910.10783, 2019 - arxiv.org

In the last couple of years, several adversarial attack methods based on different threat 

models have been proposed for the image classification problem. Most existing defenses 

consider additive threat models in which sample perturbations have bounded L_p norms …

Cited by 1 All 2 versions  All 3 versions

Cited by 6 Related articles 


[PDF] arxiv.org

From the backward Kolmogorov PDE on the Wasserstein space to propagation of chaos for McKean-Vlasov SDEs

N Frikha, PEC de Raynal - arXiv preprint arXiv:1907.01410, 2019 - arxiv.org

In this article, we provide some new quantitative estimates for propagation of chaos of non-

linear stochastic differential equations (SDEs) in the sense of McKean-Vlasov. We obtain 

explicit error estimates, at the level of the trajectories, at the level of the semi-group and at …

Cited by 2 All 17 versions


[PDF] mlr.press

The wasserstein transform

F Memoli, Z Smith, Z Wan - … Conference on Machine …, 2019 - proceedings.mlr.press

… 2003) is the so called Wasserstein distance on the set of all probability measures … 

Wasserstein transform, it is possible to formulate a similar transform using the notion of lp-Wasserstein

Cited by 5 Related articles All 7 versions


[PDF] arxiv.org

Kernelized Wasserstein Natural Gradient

M Arbel, A Gretton, W Li, G Montúfar - arXiv preprint arXiv:1910.09652, 2019 - arxiv.org

Many machine learning problems can be expressed as the optimization of some cost 

functional over a parametric family of probability distributions. It is often beneficial to solve 

such optimization problems using natural gradient methods. These methods are invariant to …

Cited by 12 Related articles All 9 versions

[PDF] researchgate.net

[PDF] Fairness with Wasserstein Adversarial Networks

M Serrurier, JM Loubes, E Pauwels - 2019 - researchgate.net

Quantifying, enforcing and implementing fairness emerged as a major topic in machine 

learning. We investigate these questions in the context of deep learning. Our main 

algorithmic and theoretical tool is the computational estimation of similarities between …

Cited by 4 Related articles

[PDF] arxiv.org

Graph Signal Representation with Wasserstein Barycenters

E Simou, P Frossard - ICASSP 2019-2019 IEEE International …, 2019 - ieeexplore.ieee.org

In many applications signals reside on the vertices of weighted graphs. Thus, there is the 

need to learn low dimensional representations for graph signals that will allow for data 

analysis and interpretation. Existing unsupervised dimensionality reduction methods for …

Cited by 6  Related articles  All 6 versions


On potentials of regularized Wasserstein generative adversarial networks for realistic hallucination of tiny faces

WZ Shao, JJ Xu, L Chen, Q Ge, LQ Wang, BK Bao… - Neurocomputing, 2019 - Elsevier

Super-resolution of facial images, aka face hallucination, has been intensively studied in the

past decades due to the increasingly emerging analysis demands in video surveillance, eg …

 Cite Cited by 5 Related articles All 3 versions


Conservative wasserstein training for pose estimation

X Liu, Y Zou, T Che, P Ding, P Jia… - Proceedings of the …, 2019 - openaccess.thecvf.com

This paper targets the task with discrete and periodic class labels (eg, pose/orientation 

estimation) in the context of deep learning. The commonly used cross-entropy or regression 

loss is not well matched to this problem as they ignore the periodic nature of the labels and …

[CITATION] Conservative wasserstein training for pose estimation

X Liu, Y Zou, T Che, J You - ICCV, 2019

Cited by 27 Related articles All 10 versions


<——2019 ——— 2019  ——————130—


[PDF] researchgate.net

On the computation of Wasserstein barycenters

G Puccetti, L Rüschendorf, S Vanduffel - Journal of Multivariate Analysis, 2019 - Elsevier

The Wasserstein barycenter is an important notion in the analysis of high dimensional data 

with a broad range of applications in applied probability, economics, statistics, and in 

particular to clustering and image processing. In this paper, we state a general version of the …

Cited by 1 Related articles All 2 versions 


[PDF] arxiv.org

Diffusions and PDEs on Wasserstein space

FY Wang - arXiv preprint arXiv:1903.02148, 2019 - arxiv.org

We propose a new type SDE, whose coefficients depend on the image of solutions, to 

investigate the diffusion process on the Wasserstein space $\mathcal P_2 $ over $\mathbb 

R^ d $, generated by the following time-dependent differential operator for $ f\in C_b^ 2 …

Cited by 2 Related articles All 4 versions


[PDF] ijcai.org

[PDF] Three-player wasserstein gan via amortised duality

QH Nhan Dam, T Le, TD Nguyen, H Bui… - Proc. of the 28th Int. Joint …, 2019 - ijcai.org

We propose a new formulation for learning generative adversarial networks (GANs) using 

optimal transport cost (the general form of Wasserstein distance) as the objective criterion to 

measure the dissimilarity between target distribution and learned distribution. Our …

Cited by 1


Peer-reviewed
A Pontryagin Maximum Principle in Wasserstein spaces for constrained optimal control problems*

Author:Benoît Bonnet
Summary: In this paper, we prove a Pontryagin Maximum Principle for constrained optimal control problems in the Wasserstein space of probability measures. The dynamics is described by a transport equation with non-local velocities which are affine in the control, and is subject to end-point and running state constraints. Building on our previous work, we combine the classical method of needle-variations from geometric control theory and the metric differential structure of the Wasserstein spaces to obtain a maximum principle formulated in the so-called Gamkrelidze form.
Article, 2019
Publication:ESAIM: Control, Optimisation and Calculus of Variations, 25, 2019
Publisher:2019

[PDF] arxiv.org

Straight-through estimator as projected Wasserstein gradient flow

P Cheng, C Liu, C Li, D Shen, R Henao… - arXiv preprint arXiv …, 2019 - arxiv.org

The Straight-Through (ST) estimator is a widely used technique for back-propagating 

gradients through discrete random variables. However, this effective method lacks 

theoretical justification. In this paper, we show that ST can be interpreted as the simulation of …

Cited by 9 Related articles All 7 versions


2019

[PDF] nips.cc

Concentration of risk measures: A Wasserstein distance approach

SP Bhat, LA Prashanth - Advances in Neural Information Processing …, 2019 - papers.nips.cc

Known finite-sample concentration bounds for the Wasserstein distance between the 

empirical and true distribution of a random variable are used to derive a two-sided 

concentration bound for the error between the true conditional value-at-risk (CVaR) of a …

referred to Chapter 6 of [Villani, 2008] for a detailed introduction …

 Cited by 27 Related articles All 6 versions

[PDF] github.io

[PDF] On parameter estimation with the Wasserstein distance

E Bernton, PE Jacob, M Gerber… - … and Inference: A …, 2019 - espenbernton.github.io

Statistical inference can be performed by minimizing, over the parameter space, the 

Wasserstein distance between model distributions and the empirical distribution of the data. 

We study asymptotic properties of such minimum Wasserstein distance estimators …

Cited by 2 Related articles


[PDF] arxiv.org

Max-Sliced Wasserstein Distance and its use for GANs

I Deshpande, YT Hu, R Sun, A Pyrros… - arXiv preprint arXiv …, 2019 - arxiv.org

Generative adversarial nets (GANs) and variational auto-encoders have significantly 

improved our distribution modeling capabilities, showing promise for dataset augmentation, 

image-to-image translation and feature learning. However, to model high-dimensional …

Cited by 13 Related articles All 7 versions View as HTML 


[PDF] arxiv.org

Necessary condition for rectifiability involving Wasserstein distance $W_2$

D Dąbrowski - arXiv preprint arXiv:1904.11000, 2019 - arxiv.org

A Radon measure $\mu $ is $ n $-rectifiable if $\mu\ll\mathcal {H}^ n $ and $\mu $-almost all 

of $\text {supp}\,\mu $ can be covered by Lipschitz images of $\mathbb {R}^ n $. In this paper 

we give a necessary condition for rectifiability in terms of the so-called $\alpha_2 $ numbers …

Cited by 1 Related articles All 4 versions 

online Cover Image PEER-REVIEW OPEN ACCESS

Necessary condition for rectifiability involving Wasserstein distance $W_2$

by Dąbrowski, Damian

04/2019

A Radon measure $\mu$ is $n$-rectifiable if $\mu\ll\mathcal{H}^n$ and $\mu$-almost all of $\text{supp}\,\mu$ can be covered by Lipschitz images of...


Journal Article, Full Text Online


 Predictive density estimation under the Wasserstein loss

T Matsuda, WE Strawderman - arXiv preprint arXiv:1904.02880, 2019 - arxiv.org

We investigate predictive density estimation under the $ L^ 2$ Wasserstein loss for location 

families and location-scale families. We show that plug-in densities form a complete class 

and that the Bayesian predictive density is given by the plug-in density with the posterior …

All 2 versions

book?

<——2019    ——————— 2019  —————140 —


 

[PDF] arxiv.org

Sufficient condition for rectifiability involving Wasserstein distance $W_2$

D Dąbrowski - arXiv preprint arXiv:1904.11004, 2019 - arxiv.org

A Radon measure $\mu $ is $ n $-rectifiable if it is absolutely continuous with respect to 

$\mathcal {H}^ n $ and $\mu $-almost all of $\text {supp}\,\mu $ can be covered by Lipschitz 

images of $\mathbb {R}^ n $. In this paper we give two sufficient conditions for rectifiability …

Cited by 1 Related articles All 2 versions


2019 see 2020 [PDF] arxiv.org

Parameter estimation for biochemical reaction networks using Wasserstein distances

K Öcal, R Grima, G Sanguinetti - Journal of Physics A …, 2019 - iopscience.iop.org

We present a method for estimating parameters in stochastic models of biochemical reaction 

networks by fitting steady-state distributions using Wasserstein distances. We simulate a 

reaction network at different parameter settings and train a Gaussian process to learn the …

 Cited by 17 Related articles All 10 versions


[PDF] arxiv.org

Estimation of wasserstein distances in the spiked transport model

J Niles-Weed, P Rigollet - arXiv preprint arXiv:1909.07513, 2019 - arxiv.org

We propose a new statistical model, the spiked transport model, which formalizes the 

assumption that two probability distributions differ only on a low-dimensional subspace. We 

study the minimax rate of estimation for the Wasserstein distance under this model and show …

Cited by 32 Related articles All 2 versions


[PDF] arxiv.org

A variational finite volume scheme for Wasserstein gradient flows

C Cancès, TO Gallouët, G Todeschi - arXiv preprint arXiv:1907.08305, 2019 - arxiv.org

We propose a variational finite volume scheme to approximate the solutions to Wasserstein 

gradient flows. The time discretization is based on an implicit linearization of the 

Wasserstein distance expressed thanks to Benamou-Brenier formula, whereas space …

Cited by 1 All 8 versions


[PDF] arxiv.org

Wasserstein GANs for MR Imaging: from Paired to Unpaired Training

K Lei, M Mardani, JM Pauly, SS Vasawanala - arXiv preprint arXiv …, 2019 - arxiv.org

Lack of ground-truth MR images (labels) impedes the common supervised training of deep 

networks for image reconstruction. To cope with this challenge, this paper leverages WGANs 

for unpaired training of reconstruction networks, where the inputs are the undersampled …

Cited by 1 All 2 versions


2019

[PDF] uma.pt

[PDF] Sampling of probability measures in the convex order by Wasserstein projection

A Alfonsi, J Corbetta, B Jourdain - ArXiv e-prints, 2019 - pinguim.uma.pt

… Sampling of probability measures in the convex order by Wasserstein projections Aurélien

Alfonsi CERMICS, Ecole des Ponts, Paris-Est University Joint work with Jacopo Corbetta

and Benjamin Jourdain International Conference on Control, Games and Stochastic Analysis …

Cited by 1 All 4 versions


Sampling of probability measures in the convex order by Wasserstein projection

J Corbetta, B Jourdain - 2019 - ideas.repec.org

In this paper, for $\mu$ and $\nu$ two probability measures on $\mathbb{R}^d$ with finite moments of order $\rho\ge 1$, we define the respective projections for the $W_\rho$-Wasserstein distance of $\mu$ and $\nu$ on the sets of probability measures dominated by …


[PDF] arxiv.org

Wasserstein Distance Based Domain Adaptation for Object Detection

P Xu, P Gurram, G Whipps, R Chellappa - arXiv preprint arXiv:1909.08675, 2019 - arxiv.org

In this paper, we present an adversarial unsupervised domain adaptation framework for object detection. Prior approaches utilize adversarial training based on cross entropy between the source and target domain distributions to learn a shared feature mapping that …

Cited by 13 Related articles All 3 versions


ON THE RATE OF CONVERGENCE OF EMPIRICAL MEASURE IN infinity-WASSERSTEIN DISTANCE FOR UNBOUNDED DENSITY FUNCTION 

By: Liu, Anning; Liu, Jian-Guo; Lu, Yulong 

QUARTERLY OF APPLIED MATHEMATICS   Volume: ‏ 77   Issue: ‏ 4   Pages: ‏ 811-829   Published: ‏ DEC 2019 


[PDF] arxiv.org

Accelerated Linear Convergence of Stochastic Momentum Methods in Wasserstein Distances

B Can, M Gurbuzbalaban, L Zhu - arXiv preprint arXiv:1901.07445, 2019 - arxiv.org

Momentum methods such as Polyak's heavy ball (HB) method, Nesterov's accelerated 

gradient (AG) as well as accelerated projected gradient (APG) method have been commonly 

used in machine learning practice, but their performance is quite sensitive to noise in the …

Cited by 26 Related articles All 10 versions


[PDF] arxiv.org

A Wasserstein-type distance in the space of Gaussian Mixture Models

J Delon, A Desolneux - arXiv preprint arXiv:1907.05254, 2019 - arxiv.org

In this paper we introduce a Wasserstein-type distance on the set of Gaussian mixture 

models. This distance is defined by restricting the set of possible coupling measures in the 

optimal transport problem to Gaussian mixture models. We derive a very simple discrete …

Cited by 1 All 4 versions 
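
In constructions of this kind, the pairwise cost between mixture components is the closed-form 2-Wasserstein distance between Gaussians, and the restricted distance then reduces to a small discrete transport problem over the component weights. A sketch of that closed form (mine, with toy inputs):

```python
# Closed-form W2 between Gaussians, the natural pairwise cost between mixture
# components in a GMM-restricted transport distance (my sketch, toy inputs).
import numpy as np
from scipy.linalg import sqrtm

def w2_gaussian(m1, S1, m2, S2):
    rS2 = sqrtm(S2)
    cross = np.real(sqrtm(rS2 @ S1 @ rS2))          # (S2^{1/2} S1 S2^{1/2})^{1/2}
    d2 = np.sum((m1 - m2) ** 2) + np.trace(S1 + S2 - 2.0 * cross)
    return np.sqrt(max(d2, 0.0))

m1, S1 = np.zeros(2), np.eye(2)
m2, S2 = np.array([1.0, 0.0]), np.diag([2.0, 0.5])
print(w2_gaussian(m1, S1, m2, S2))
```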

 Tree-Sliced Approximation of Wasserstein Distances

T Le, M Yamada, K Fukumizu, M Cuturi - arXiv preprint arXiv:1902.00342, 2019 - arxiv.org

Optimal transport ($\OT $) theory provides a useful set of tools to compare probability 

distributions. As a consequence, the field of $\OT $ is gaining traction and interest within the 

machine learning community. A few deficiencies usually associated with $\OT $ include its …

Cited by 2 Related articles All 3 versions

<——2019 ————— 2019———150 —


 
 

[PDF] arxiv.org

Wasserstein Gradient Flow Formulation of the Time-Fractional Fokker-Planck Equation

MH Duong, B Jin - arXiv preprint arXiv:1908.09055, 2019 - arxiv.org

In this work, we investigate a variational formulation for a time-fractional Fokker-Planck 

equation which arises in the study of complex physical systems involving anomalously slow 

diffusion. The model involves a fractional-order Caputo derivative in time, and thus …


  Cited by 1 Related articles All 7 versions 


[PDF] arxiv.org

Existence of probability measure valued jump-diffusions in generalized Wasserstein spaces

M Larsson, S Svaluto-Ferro - arXiv preprint arXiv:1908.08080, 2019 - arxiv.org

We study existence of probability measure valued jump-diffusions described by martingale 

problems. We develop a simple device that allows us to embed Wasserstein spaces and 

other similar spaces of probability measures into locally compact spaces where classical …

Cited by 3 Related articles All 2 versions
 Zbl 1470.60135



Minimax estimation of smooth densities in Wasserstein distance

J Niles-Weed, Q Berthet - arXiv e-prints, 2019 - ui.adsabs.harvard.edu

We study nonparametric density estimation problems where error is measured in the

Wasserstein distance, a metric on probability distributions popular in many areas of statistics

and machine learning. We give the first minimax-optimal rates for this problem for general …

 

[PDF] mdpi.com

Wasserstein Distance Learns Domain Invariant Feature Representations for Drift Compensation of E-Nose

Y Tao, C Li, Z Liang, H Yang, J Xu - Sensors, 2019 - mdpi.com

Abstract Electronic nose (E-nose), a kind of instrument which combines with the gas sensor 

and the corresponding pattern recognition algorithm, is used to detect the type and 

concentration of gases. However, the sensor drift will occur in realistic application scenario …

Cited by 1 All 8 versions

Sensor Research; Investigators at Chongqing University of Posts and Telecommunications Zero in on Sensor Research (Wasserstein... 

Journal of Technology, Sep 10, 2019, 837

Newspaper Article, Full Text Online
Cited by 7
Related articles All 8 versions 


Artifact correction in low‐dose dental CT imaging using Wasserstein generative adversarial networks

Z Hu, C Jiang, F Sun, Q Zhang, Y Ge, Y Yang… - Medical …, 2019 - Wiley Online Library

Purpose In recent years, health risks concerning high‐dose x‐ray radiation have become a 

major concern in dental computed tomography (CT) examinations. Therefore, adopting low‐

dose computed tomography (LDCT) technology has become a major focus in the CT …

Cited by 53 Related articles All 4 versions


2019

[PDF] arxiv.org

Dynamic Facial Expression Generation on Hilbert Hypersphere with Conditional Wasserstein Generative Adversarial Nets

N Otberdout, M Daoudi, A Kacem, L Ballihi… - arXiv preprint arXiv …, 2019 - arxiv.org

In this work, we propose a novel approach for generating videos of the six basic facial 

expressions given a neutral face image. We propose to exploit the face geometry by 

modeling the facial landmarks motion as curves encoded as points on a hypersphere. By …

Cited by 1 All 3 versions


Generating EEG signals of an RSVP Experiment by a Class Conditioned Wasserstein Generative Adversarial Network

S Panwar, P Rad, J Quarles… - 2019 IEEE International …, 2019 - ieeexplore.ieee.org

Electroencephalography (EEG) data is difficult to obtain due to complex experimental setups 

and reduced comfort due to prolonged wearing. This poses challenges to train powerful 

deep learning model due to the limited EEG data. Hence, being able to generate EEG data …

Cited by 5 Related articles All 2 versions


Generative adversarial framework for depth filling via wasserstein metric, cosine transform and domain transfer

A Atapour-Abarghouei, S Akcay… - Pattern Recognition, 2019 - Elsevier

In this work, the issue of depth filling is addressed using a self-supervised feature learning 

model that predicts missing depth pixel values based on the context and structure of the 

scene. A fully-convolutional generative model is conditioned on the available depth …

Cited by 19 Related articles All 7 versions

Peer-reviewed
A bound on the Wasserstein-2 distance between linear combinations of independent random variables
Authors: Benjamin Arras, Ehsan Azmoodeh, Guillaume Poly, Yvik Swan
Summary: We provide a bound on a distance between finitely supported elements and general elements of the unit sphere of $\ell^2(\mathbb{N})$. We use this bound to estimate the Wasserstein-2 distance between random variables represented by linear combinations of independent random variables. Our results are expressed in terms of a discrepancy measure related to Nourdin-Peccati's Malliavin-Stein method. The main application is towards the computation of quantitative rates of convergence to elements of the second Wiener chaos. In particular, we make these rates explicit for the non-central asymptotics of sequences of quadratic forms and for the behavior of the generalized Rosenblatt process at extreme critical exponent.
Article, 2019
Publication:Stochastic Processes and their Applications, 129, 201907, 2341
Publisher:2019


Hyperbolic Wasserstein Distance for Shape Indexing

J Shi, Y Wang - IEEE transactions on pattern analysis and …, 2019 - ieeexplore.ieee.org

Shape space is an active research topic in computer vision and medical imaging fields. The 

distance defined in a shape space may provide a simple and refined index to represent a 

unique shape. This work studies the Wasserstein space and proposes a novel framework to …

Cited by 7 Related articles All 8 versions
<——2019    ——————— 2019  ——————160— 

 

 

[PDF] arxiv.org

Nonlinear model reduction on metric spaces. Application to one-dimensional conservative PDEs in Wasserstein spaces

V Ehrlacher, D Lombardi, O Mula, FX Vialard - arXiv preprint arXiv …, 2019 - arxiv.org

We consider the problem of model reduction of parametrized PDEs where the goal is to 

approximate any function belonging to the set of solutions at a reduced computational cost. 

For this, the bottom line of most strategies has so far been based on the approximation of the …

Cited by 1 All 42 versions

[CITATION] Nonlinear model reduction on metric spaces. application to one-dimensional conservative PDEs in Wasserstein spaces, Preprint,(2019)

V Ehrlacher, D Lombardi, O Mula, FX Vialard - arXiv preprint arXiv:1909.06626

[PDF] arxiv.org

Nonlinear model reduction on metric spaces. Application to one-dimensional conservative PDEs in Wasserstein spaces

V Ehrlacher, D Lombardi, O Mula… - … Mathematical Modelling …, 2020 - esaim-m2an.org

… They can however be formulated in the form of Wasserstein gradient flows … In the context of

Hamiltonian systems, [2, 33] introduce a reduced-order framework that preserves the symplectic …

The main contribution of this work is to develop the idea of reduced modeling in metric …

  Cited by 4 Related articles All 42 versions

 

[PDF] arxiv.org

Geometric convergence bounds for Markov chains in Wasserstein distance based on generalized drift and contraction conditions

Q Qin, JP Hobert - arXiv preprint arXiv:1902.02964, 2019 - arxiv.org

Let $\{X_n\} _ {n= 0}^\infty $ denote an ergodic Markov chain on a general state space that 

has stationary distribution $\pi $. This article concerns upper bounds on the $ L_1 $-

Wasserstein distance between the distribution of $ X_n $ and $\pi $ in the case where the …

 Cited by 12 Related articles All 3 versions

A unified formulation for the Bures-Wasserstein and Log-Euclidean/Log-Hilbert-Schmidt distances between positive definite operators

HQ Minh - International Conference on Geometric Science of …, 2019 - Springer

This work presents a parametrized family of distances, namely the Alpha Procrustes 

distances, on the set of symmetric, positive definite (SPD) matrices. The Alpha Procrustes 

distances provide a unified formulation encompassing both the Bures-Wasserstein and Log …

Cited by 11 Related articles All 2 versions

[PDF] arxiv.org

Convergence rate in Wasserstein distance and semiclassical limit for the defocusing logarithmic Schrödinger equation

G Ferriere - arXiv preprint arXiv:1903.04309, 2019 - arxiv.org

We consider the dispersive logarithmic Schrödinger equation in a semi-classical scaling. 

We extend the results about the large time behaviour of the solution (dispersion faster than 

usual with an additional logarithmic factor, convergence of the rescaled modulus of the …

Cited by 2 Related articles All 3 versions


Data-driven Wasserstein distributionally robust optimization for biomass with agricultural waste-to-energy network design under uncertainty

C Ning, F You - Applied Energy, 2019 - Elsevier

This paper addresses the problem of biomass with agricultural waste-to-energy network 

design under uncertainty. We propose a novel data-driven Wasserstein distributionally 

robust optimization model for hedging against uncertainty in the optimal network design …

Cited by 30 Related articles All 7 versions

Information Technology; Investigators from Cornell University Target Information Technology 

(Data-driven Wasserstein Distributionally Robust Optimization for Biomass With Agricultural Waste-to-energy Network Design Under Uncertainty). Computer Technology Journal. December 26, 2019; p 210.

Computer Technology Journal, Dec 26, 2019, 210

Newspaper Article, Full Text Online 


2019

A Wasserstein distance-based analogous method to predict distribution of non-uniform corrosion on reinforcements in concrete

Q Liu, RKL Su - Construction and Building Materials, 2019 - Elsevier

This paper presents an analogous method to predict the distribution of non-uniform 

corrosion on reinforcements in concrete by minimizing the Wasserstein distance. A 

comparison between the predicted and experimental results shows that the proposed …

Cited by 8 Related articles All 2 versions

[HTML] sciencedirect.com

[HTML] Wasserstein GAN-based small-sample augmentation for new-generation artificial intelligence: A case study of cancer-staging data in biology

Y Liu, Y Zhou, X Liu, F Dong, C Wang, Z Wang - Engineering, 2019 - Elsevier

It is essential to utilize deep-learning algorithms based on big data for the implementation of 

the new generation of artificial intelligence. Effective utilization of deep learning relies 

considerably on the number of labeled samples, which restricts the application of deep …

Cited by 7 Related articles
Artificial Intelligence; New Artificial Intelligence Findings from Tsinghua University Reported (Wasserstein Gan-based Small-sample Augmentation for New-generation... 

Cancerweekly Plus, Apr 16, 2019, 2801

Newspaper Article, Citation Online 

Cited by 81 Related articles All 3 versions

[PDF] arxiv.org

Alpha Procrustes metrics between positive definite operators: a unifying formulation for the Bures-Wasserstein and Log-Euclidean/Log-Hilbert-Schmidt metrics

MH Quang - arXiv preprint arXiv:1908.09275, 2019 - arxiv.org

This work presents a parametrized family of distances, namely the Alpha Procrustes 

distances, on the set of symmetric, positive definite (SPD) matrices. The Alpha Procrustes 

distances provide a unified formulation encompassing both the Bures-Wasserstein and Log …

Cited by 1 All 2 versions


[PDF] umd.edu  2019

[PDF] Quantum Wasserstein GANs

S Chakrabarti, Y Huang, T Li, S Feizi, X Wu - cs.umd.edu

Inspired by previous studies on the adversarial training of classical and quantum generative models, we propose the first design of quantum Wasserstein Generative Adversarial Networks (WGANs), which has been shown to improve the robustness and the scalability of …


Wasserstein Generative Adversarial Networks

Martin Arjovsky, Soumith Chintala, Léon Bottou;

Proceedings of the 34th International Conference on Machine Learning, PMLR 70:214-223, 2017.

Abstract: We introduce a new algorithm named WGAN, an alternative to traditional GAN training. In this new model, we show that we can improve the stability of learning, get rid of problems like mode collapse, and provide meaningful learning curves useful for debugging and hyperparameter searches. Furthermore, we show that the corresponding optimization problem is sound, and provide extensive theoretical work highlighting the deep connections to different distances between distributions.


Arjovsky, M., Chintala, S., Bottou, L.: Wasserstein GAN. arXiv preprint arXiv:1701.07875 (2017)
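As a quick reference, the WGAN objective reduces to a simple critic/generator loss pair. The PyTorch-style sketch below is only an illustrative reading of the Arjovsky et al. formulation; the critic, generator and z_dim names are placeholders, and weight clipping is the crude Lipschitz device of the original paper, not a recommendation over later variants.

import torch

def wgan_critic_loss(critic, generator, real, z_dim=64):
    # Critic maximizes E[f(real)] - E[f(fake)]; we return the negation for a minimizer.
    z = torch.randn(real.size(0), z_dim)
    fake = generator(z).detach()
    return -(critic(real).mean() - critic(fake).mean())

def clip_critic_weights(critic, clip=0.01):
    # Weight clipping crudely enforces the 1-Lipschitz constraint on the critic.
    with torch.no_grad():
        for p in critic.parameters():
            p.clamp_(-clip, clip)

def wgan_generator_loss(critic, generator, batch_size, z_dim=64):
    # Generator minimizes -E[f(G(z))], i.e. raises the critic score of its samples.
    z = torch.randn(batch_size, z_dim)
    return -critic(generator(z)).mean()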


<—–2019—––2019 ————170— 


[PDF] arxiv.org

Wasserstein-Wasserstein Auto-Encoders 

S Zhang, Y Gao, Y Jiao, J Liu, Y Wang… - arXiv preprint arXiv …, 2019 - arxiv.org

To address the challenges in learning deep generative models (eg, the blurriness of variational auto-encoders and the instability of training generative adversarial networks), we propose a novel deep generative model, named Wasserstein-Wasserstein auto-encoders …

Cited by 8 Related articles All 4 versions


2019 arXiv see 2020

[v2] Thu, 2 May 2019 02:36:49 UTC (534 KB)

[v3] Sun, 28 Jul 2019 07:24:30 UTC (548 KB)
[CITATION] A Fast Globally Linearly Convergent Algorithm for the Computation of Wasserstein Barycenters. eprint

L Yang, J Li, D Sun, KC Toh - arXiv preprint arXiv:1809.04249, 2019

  Cited by 4 Related articles

[C] A Fast Globally Linearly Convergent Algorithm for the Computation of Wasserstein Barycenters. eprint

L Yang, J Li, D Sun, KC Toh - arXiv preprint arXiv:1809.04249, 2019

Cited by 3


[PDF] arxiv.org

Wasserstein Neural Processes

A Carr, J Nielson, D Wingate - arXiv preprint arXiv:1910.00668, 2019 - arxiv.org

Neural Processes (NPs) are a class of models that learn a mapping from a context set of input-output pairs to a distribution over functions. They are traditionally trained using maximum likelihood with a KL divergence regularization term. We show that there are …

Wasserstein neural processes
Cited by 2
Related articles All 3 versions


[PDF] arxiv.org

Kernel Wasserstein Distance

JH Oh, M Pouryahya, A Iyer, AP Apte… - arXiv preprint arXiv …, 2019 - arxiv.org

The Wasserstein distance is a powerful metric based on the theory of optimal transport. It 

gives a natural measure of the distance between two distributions with a wide range of 

applications. In contrast to a number of the common divergences on distributions such as …

Cited by 8 Related articles All 3 versions

[PDF] arxiv.org

Wasserstein Style Transfer

Y Mroueh - arXiv preprint arXiv:1905.12828, 2019 - arxiv.org

We propose Gaussian optimal transport for Image style transfer in an Encoder/Decoder framework. Optimal transport for Gaussian measures has closed forms Monge mappings from source to target distributions. Moreover interpolates between a content and a style …

Cited by 21 Related articles All 6 versions
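The closed-form Gaussian Monge map mentioned in the abstract is easy to state: T(x) = mu_t + A(x - mu_s) with A = Sigma_s^{-1/2}(Sigma_s^{1/2} Sigma_t Sigma_s^{1/2})^{1/2} Sigma_s^{-1/2}. The following NumPy sketch is an illustrative implementation of that map on generic feature vectors, not the paper's encoder/decoder pipeline.

import numpy as np
from scipy.linalg import sqrtm, inv

def gaussian_monge_map(mu_s, cov_s, mu_t, cov_t):
    # A = cov_s^{-1/2} (cov_s^{1/2} cov_t cov_s^{1/2})^{1/2} cov_s^{-1/2}
    cs_half = sqrtm(cov_s)
    cs_half_inv = inv(cs_half)
    A = np.real(cs_half_inv @ sqrtm(cs_half @ cov_t @ cs_half) @ cs_half_inv)
    return lambda x: mu_t + (x - mu_s) @ A  # A is symmetric, so no transpose is needed

rng = np.random.default_rng(0)
src = rng.normal(size=(500, 3)) @ np.diag([1.0, 2.0, 0.5]) + 1.0   # "content" features
tgt = rng.normal(size=(500, 3)) @ np.diag([0.5, 1.0, 2.0]) - 1.0   # "style" features
T = gaussian_monge_map(src.mean(0), np.cov(src.T), tgt.mean(0), np.cov(tgt.T))
mapped = T(src)  # mapped features now roughly match the target mean and covariance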


2019

Deconvolution for the Wasserstein distance

J Dedecker - smai.emath.fr

We consider the problem of estimating a probability measure on Rd from data observed with an additive noise. We are interested in rates of convergence for the Wasserstein metric of order p≥ 1. The distribution of the errors is assumed to be known and to belong to a class of …

Related articles

Sliced Gromov-Wasserstein

T Vayer, R Flamary, R Tavenard, L Chapel… - arXiv preprint arXiv …, 2019 - arxiv.org

Recently used in various machine learning contexts, the Gromov-Wasserstein distance (GW) allows for comparing distributions that do not necessarily lie in the same metric space. However, this Optimal Transport (OT) distance requires solving a complex non convex …

Cited by 5 Related articles All 14 versions

[PDF] nips.cc

[PDF] Sliced gromov-wasserstein

V Titouan, R Flamary, N Courty, R Tavenard… - Advances in Neural …, 2019 - papers.nips.cc

… PhD thesis. 2013. [14] N. Bonneel, J. Rabin, G. Peyré, and H. Pfister … [16] S. Kolouri, PE Pope, CE Martin, and GK Rohde. "Sliced Wasserstein Auto-Encoders" … 2019. [17] I. Deshpande, Z. Zhang, and AG Schwing. "Generative Modeling Using the Sliced Wasserstein Distance" …

Cited by 42 Related articles All 13 versions 

[PDF] researchgate.net

[PDF] Wasserstein distance: a flexible tool for statistical analysis

GVVLV Lucarini - 2019 - researchgate.net

The figure shows the Wasserstein distance calculated in the phase space composed by globally averaged temperature and precipitation. To provide some sort of benchmark, at the bottom of the figure is shown the value related to the NCEP reanalysis, which yields one of …

Related articles All 5 versions


Wasserstein-2 Generative Networks

A Korotin, V Egiazarian, A Asadulaev… - arXiv preprint arXiv …, 2019 - arxiv.org

Modern generative learning is mainly associated with Generative Adversarial Networks (GANs). Training such networks is always hard due to the minimax nature of the optimization objective. In this paper we propose a novel algorithm for training generative models, which …

arXiv:1909.13082  [pdf, other]  cs.LG cs.CV stat.ML
Wasserstein-2 Generative Networks
Authors: Alexander Korotin, Vage Egiazarian, Arip Asadulaev, Alexander Safin, Evgeny Burnaev
Abstract: Generative Adversarial Networks training is not easy due to the minimax nature of the optimization objective. In this paper, we propose a novel end-to-end algorithm for training generative models which uses a non-minimax objective simplifying model training. The proposed algorithm uses the approximation of Wasserstein-2 distance by Input Convex Neural Networks. From the theoretical side, we estima…
Submitted 22 June, 2020; v1 submitted 28 September, 2019; originally announced September 2019.
Comments: 29 pages, 21 figures, 3 tables

Cited by 25
 Related articles All 9 versions 


www.benasque.org › talks_contr › 193_Buttazzo (PDF)

Aug 30, 2019 - where W denotes the p-Wasserstein distance and A is a suitable class of Borel probabilities ν that are singular with respect to µ, that is µ is …

 <——2019———— 2019 ——180 —



[PDF] nips.cc

Generalized sliced Wasserstein distances

S Kolouri, K Nadjahi, U Simsekli, R Badeau… - Advances in Neural …, 2019 - papers.nips.cc

Generalized Sliced Wasserstein Distances … Abstract: The Wasserstein distance and its variations, eg, the sliced-Wasserstein (SW) distance, have recently drawn attention from the machine learning community. The …

Cited by 140 Related articles All 9 versions 
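For orientation, the ordinary sliced Wasserstein distance that these generalizations build on can be approximated by Monte Carlo over random one-dimensional projections, each solved by sorting. The sketch below is a generic illustration (equal sample sizes assumed, parameter names are ours), not the paper's generalized slicing.

import numpy as np

def sliced_wasserstein(X, Y, n_projections=100, p=2, seed=0):
    # Monte Carlo estimate: average the 1D Wasserstein-p over random directions.
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_projections):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)
        # 1D W_p between projected samples via sorted values (equal sample sizes).
        x_proj = np.sort(X @ theta)
        y_proj = np.sort(Y @ theta)
        total += np.mean(np.abs(x_proj - y_proj) ** p)
    return (total / n_projections) ** (1.0 / p)

X = np.random.default_rng(1).normal(size=(256, 5))
Y = np.random.default_rng(2).normal(loc=0.5, size=(256, 5))
print(sliced_wasserstein(X, Y))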

[PDF] ieee.org

Generating Adversarial Samples With Constrained Wasserstein Distance

K Wang, P Yi, F Zou, Y Wu - IEEE Access, 2019 - ieeexplore.ieee.org

In recent years, deep neural network (DNN) approaches prove to be useful in many machine learning tasks, including classification. However, small perturbations that are carefully crafted by attackers can lead to the misclassification of the images. Previous studies have …
Engineering; Researchers at Shanghai Jiao Tong University Report New Data on Engineering (Generating Adversarial Samples With Constrained Wasserstein... 

Network Weekly News, Dec 16, 2019, 3205

Newspaper Article, Full Text Online

Cited by 1 Related articles

[PDF] arxiv.org

Wasserstein Reinforcement Learning

A Pacchiano, J Parker-Holder, Y Tang… - arXiv preprint arXiv …, 2019 - arxiv.org

We propose behavior-driven optimization via Wasserstein distances (WDs) to improve several classes of state-of-the-art reinforcement learning (RL) algorithms. We show that WD regularizers acting on appropriate policy embeddings efficiently incorporate behavioral …

Related articles


[PDF] sfb1294.de

[PDF] Kalman-Wasserstein Gradient Flows

F Hoffmann - sfb1294.de

Parameter calibration and uncertainty in complex computer models. Optimization approach and least squares. Bayesian approach and sampling. Ensemble Kalman Inversion (for optimization). Ensemble Kalman Sampling (for sampling). Gaussian Process Regression …


[PDF] arxiv.org

Wasserstein Diffusion Tikhonov Regularization

AT Lin, Y Dukler, W Li, G Montúfar - arXiv preprint arXiv:1909.06860, 2019 - arxiv.org

We propose regularization strategies for learning discriminative models that are robust to in-class variations of the input data. We use the Wasserstein-2 geometry to capture semantically meaningful neighborhoods in the space of images, and define a corresponding …

Cited by 3 Related articles All 7 versions

Deep Multi-Wasserstein Unsupervised Domain Adaptation

TN Le, A Habrard, M Sebban - Pattern Recognition Letters, 2019 - Elsevier

In unsupervised domain adaptation (DA), one aims at learning from labeled source data and fully unlabeled target examples a model with a low error on the target domain. In this setting, standard generalization bounds prompt us to minimize the sum of three terms:(a) the source …

Cited by 3 Related articles All 4 versions

[PDF] mpg.de

Wasserstein of Wasserstein loss for learning generative models

Y Dukler, W Li, A Tong Lin, G Montúfar - 2019 - mis.mpg.de

The Wasserstein distance serves as a loss function for unsupervised learning which depends on the choice of a ground metric on sample space. We propose  Cited by 26 Related articles All 12 versions
Wasserstein of Wasserstein Loss for Learning Generative Models


[PDF] uchile.cl

[PDF] WASSERSTEIN-BASED DISTANCE FOR TIME SERIES ANALYSIS

E CAZELLES, A ROBERT, F TOBAR - cmm.uchile.cl

WASSERSTEIN-BASED DISTANCE FOR TIME SERIES ANALYSIS. Elsa Cazelles, Arnaud Robert and Felipe Tobar, Universidad de Chile. Background: for a stationary continuous-time time series x(t), the power spectral density is given by S(ξ) = lim_{T→∞} …


[PDF] semanticscholar.org

[PDF] Optimal Transport and Wasserstein Distance

S Kolouri - pdfs.semanticscholar.org

The Wasserstein distance — which arises from the idea of optimal transport — is being used more and more in Statistics and Machine Learning. In these notes we review some of the basics about this topic. Two good references for this topic are … Kolouri, Soheil, et al. Optimal Mass …


Wasserstein Adversarial Imitation Learning

H Xiao, M Herman, J Wagner, S Ziesche… - arXiv preprint arXiv …, 2019 - arxiv.org

Imitation Learning describes the problem of recovering an expert policy from demonstrations. While inverse reinforcement learning approaches are known to be very sample-efficient in terms of expert demonstrations, they usually require problem-dependent …

Cited by 1 Related articles

<——2019———— 2019  —————190— 

 

[PDF] EE-559–Deep learning 10.2. Wasserstein GAN

F Fleuret - fleuret.org

EE-559 – Deep learning, 10.2. Wasserstein GAN. François Fleuret, https://fleuret.org/ee559/, Thu Mar 28 12:43:55 UTC 2019. Arjovsky et al. (2017) point out that DJS does not account [much] for the metric structure of the space. …

Related articles All 2 versions


[PDF] arxiv.org

Wasserstein Contraction for Stochastic Nonlinear Systems

J Bouvrie, JJ Slotine - arXiv preprint arXiv:1902.08567, 2019 - arxiv.org

We suggest that the tools of contraction analysis for deterministic systems can be applied to the study of coupled stochastic dynamical systems, with Wasserstein distance and the theory of optimal transport serving as the key intermediary. If the drift term in an Ito diffusion is …

Cited by 1 Related articles All 2 versions


[PDF] arxiv.org

(q, p)-Wasserstein GANs: Comparing Ground Metrics for Wasserstein GANs

A Mallasto, J Frellsen, W Boomsma… - arXiv preprint arXiv …, 2019 - arxiv.org

Generative Adversarial Networks (GANs) have made a major impact in computer vision and machine learning as generative models. Wasserstein GANs (WGANs) brought Optimal Transport (OT) theory into GANs, by minimizing the $1$-Wasserstein distance between …

Cited by 10 Related articles All 2 versions

[PDF] arxiv.org

Thermodynamic interpretation of Wasserstein distance

A Dechant, Y Sakurai - arXiv preprint arXiv:1912.08405, 2019 - arxiv.org

We derive a relation between the dissipation in a stochastic dynamics and the Wasserstein distance. We show that the minimal amount of dissipation required to transform an initial state to a final state during a diffusion process is given by the Wasserstein distance between …

Cited by 23 Related articles All 2 versions

[PDF] arxiv.org

Wasserstein GAN Can Perform PCA

J Cho, C Suh - arXiv preprint arXiv:1902.09073, 2019 - arxiv.org

Generative Adversarial Networks (GANs) have become a powerful framework to learn generative models that arise across a wide variety of domains. While there has been a recent surge in the development of numerous GAN architectures with distinct optimization …

Related articles All 2 versions 


2019


Processus de diffusion sur l'espace de Wasserstein : modèles coalescents, propriétés de régularisation et équations de McKean-Vlasov
[French: Diffusion processes on the Wasserstein space: coalescing models, regularization properties and McKean-Vlasov equations]
Authors: Victor Marx, François Delarue, Benjamin Jourdain, Arnaud Guillin, Max-K von Renesse, Mireille Bossy, Djalil Chafaï; Université Côte d'Azur (2015-2019); École doctorale Sciences fondamentales et appliquées (Nice); Université de Nice (1965-2019)
Summary [translated from French]: The thesis studies a class of stochastic processes with values in the space of probability measures on the real line, called the Wasserstein space when it is equipped with the Wasserstein metric W2. The work mainly addresses the following questions: how can one effectively construct stochastic processes with diffusive properties taking values in an infinite-dimensional space? Is there a form of uniqueness, strong or weak, satisfied by some of the processes constructed in this way? Can one establish regularizing properties of these diffusions, in particular the stochastic forcing of McKean-Vlasov equations or Bismut-Elworthy integration-by-parts formulas? Chapter I proposes an alternative construction, by smooth approximations, of the particle system defined by Konarovskyi and von Renesse, called the coalescing model below. The coalescing model is a random process with values in the Wasserstein space, satisfying an Itô-type formula on that space and whose small-time deviations are governed by the Wasserstein metric, by analogy with the short-time deviations of standard Brownian motion governed by the Euclidean metric. The smooth approximation constructed in this thesis shares these diffusive properties and is obtained by smoothing the coefficients of the stochastic differential equation satisfied by the coalescing model. Its main advantage is that it satisfies uniqueness results which remain open for the coalescing model. Moreover, up to small modifications of its structure, this smoothed diffusion enjoys regularizing properties; this is precisely the subject of Chapters II to IV. In Chapter II, an ill-posed McKean-Vlasov equation is perturbed by one of these smoothed versions of the coalescing model in order to restore uniqueness. A link is made with recent results (Jourdain, Mishura-Veretennikov, Chaudru de Raynal-Frikha, Lacker, Röckner-Zhang) where uniqueness of a solution is proved when the noise is finite-dimensional and the drift coefficient is Lipschitz in total-variation distance with respect to the measure variable. In our case, the diffusion on the Wasserstein space makes it possible to regularize the velocity field in its measure argument and thus to handle drift functions of low regularity both in the space and in the measure variables. Finally, Chapters III and IV study, for a diffusion defined on the Wasserstein space of the circle, the regularization properties of the associated semigroup. Using in Chapter III the differential calculus on the Wasserstein space introduced by Lions, a Bismut-Elworthy inequality is established, controlling the gradient of the semigroup at points of the space of probability measures that have a sufficiently regular density. In Chapter IV, the blow-up rate as the time variable tends to zero is improved under additional regularity conditions. From these results, a priori estimates are deduced for a PDE posed on the Wasserstein space and driven by the diffusion on the torus mentioned above, in the homogeneous case (Chapter III) and with a non-trivial source term (Chapter IV)
Computer Program, 2019
English
Publisher:2019

[PDF] arxiv.org

Wasserstein-Fisher-Rao Document Distance

Z Wang, D Zhou, Y Zhang, H Wu, C Bao - arXiv preprint arXiv:1904.10294, 2019 - arxiv.org

As a fundamental problem of natural language processing, it is important to measure the distance between different documents. Among the existing methods, the Word Mover's Distance (WMD) has shown remarkable success in document semantic matching for its clear …

 Cited by 1 Related articles All 3 versions 

   Wasserstein-Fisher-Rao Document Distance

arxiv.org › cs

Apr 23, 2019 - In this paper, we apply the newly developed Wasserstein-Fisher-Rao (WFR) metric from unbalanced optimal transport theory to measure the ...

by Z Wang - ‎2019 

Cited by 3 Related articles All 4 versions 


2019
Implementation of batched Sinkhorn iterations for entropy-regularized Wasserstein loss

Author:Viehmann, Thomas (Creator)
Summary: In this report, we review the calculation of entropy-regularised Wasserstein loss introduced by Cuturi and document a practical implementation in PyTorch. Code is available at https://github.com/t-vi/pytorch-tvmisc/blob/master/wasserstein-distance/Pytorch_Wasserstein.ipynb
Downloadable Archival Material, 2019-07-01
Undefined
Publisher:2019-07-01
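As context for this entry, the entropy-regularized loss it implements is computed by Sinkhorn's alternating scaling iterations. The short NumPy sketch below shows the basic (unbatched, numerically naive) fixed-point loop for two histograms; it is not Viehmann's batched PyTorch implementation, which lives in the linked notebook.

import numpy as np

def sinkhorn(a, b, C, eps=0.05, n_iters=500):
    # Entropy-regularized OT between histograms a, b with cost matrix C (Cuturi-style).
    K = np.exp(-C / eps)              # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)             # alternate scaling updates
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]   # approximate optimal transport plan
    return np.sum(P * C), P           # regularized transport cost and plan

# Toy example: two histograms on a 1D grid with squared-distance cost.
x = np.linspace(0, 1, 50)
a = np.exp(-((x - 0.3) ** 2) / 0.01); a /= a.sum()
b = np.exp(-((x - 0.7) ** 2) / 0.01); b /= b.sum()
C = (x[:, None] - x[None, :]) ** 2
cost, plan = sinkhorn(a, b, C)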

Three-player Wasserstein GAN via amortised duality

N Dam, Q Hoang, T Le, TD Nguyen, H Bui… - Proceedings of the 28th …, 2019 - dl.acm.org

We propose a new formulation for learning generative adversarial networks (GANs) using optimal transport cost (the general form of Wasserstein distance) as the objective criterion to measure the dissimilarity between target distribution and learned distribution. Our …


Computing Wasserstein Barycenters via Linear Programming

G Auricchio, F Bassetti, S Gualandi… - … Conference on Integration …, 2019 - Springer

This paper presents a family of generative Linear Programming models that permit to compute the exact Wasserstein Barycenter of a large set of two-dimensional images …

Cited by 4 Related articles All 2 versions
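As background, the elementary building block of such LP formulations is the exact discrete optimal transport problem itself, a linear program over transport plans with prescribed marginals. The sketch below sets up that generic two-marginal LP with scipy's solver; it is illustrative only, and the paper's barycenter models couple many such problems and exploit image structure.

import numpy as np
from scipy.optimize import linprog

def discrete_ot_lp(a, b, C):
    # Exact OT between histograms a (length m) and b (length n) with cost matrix C.
    m, n = C.shape
    # Equality constraints: row sums of the plan equal a, column sums equal b.
    A_eq = np.zeros((m + n, m * n))
    for i in range(m):
        A_eq[i, i * n:(i + 1) * n] = 1.0
    for j in range(n):
        A_eq[m + j, j::n] = 1.0
    b_eq = np.concatenate([a, b])
    res = linprog(C.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
    return res.fun, res.x.reshape(m, n)

a = np.array([0.5, 0.5]); b = np.array([0.25, 0.75])
C = np.array([[0.0, 1.0], [1.0, 0.0]])
cost, plan = discrete_ot_lp(a, b, C)   # cost is the exact transport cost, plan the coupling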

 <——2019 ————2019–——200-


Computing Wasserstein Barycenters via Linear Programming

M Veneroni - … of Constraint Programming, Artificial Intelligence, and … - Springer

This paper presents a family of generative Linear Programming models that permit to compute the exact Wasserstein Barycenter of a large set of two-dimensional images. Wasserstein Barycenters were recently introduced to mathematically generalize the concept …

Related articles


Adaptive quadratic Wasserstein full-waveform inversion

D Wang, P Wang - SEG Technical Program Expanded Abstracts …, 2019 - library.seg.org

Full-waveform inversion (FWI) has increasingly become standard practice in the industry to resolve complex velocities. However, the current FWI research still exhibits a diverging scene, with various flavors of FWI targeting different aspects of the problem. Outstanding …

Cited by 4 Related articles All 2 versions


Peer-reviewed
Scene Classification Using Hierarchical Wasserstein CNN

Authors: Liu Y., Ding L., Suen C.Y.
Article, 2019
Publication:IEEE Transactions on Geoscience and Remote Sensing, 57, 2019 05 01, 2494
Publisher:2019


[PDF] arxiv.org

Wasserstein Weisfeiler-Lehman Graph Kernels

M Togninalli, E Ghisu, F Llinares-López… - arXiv preprint arXiv …, 2019 - arxiv.org

Graph kernels are an instance of the class of $\mathcal {R} $-Convolution kernels, which measure the similarity of objects by comparing their substructures. Despite their empirical success, most graph kernels use a naive aggregation of the final set of substructures …

Wasserstein Weisfeiler-Lehman Graph Kernels 

By: Togninalli, Matteo; Ghisu, Elisabetta; Llinares-Lopez, Felipe; et al.

Conference: 33rd Conference on Neural Information Processing Systems (NeurIPS) Location: ‏ Vancouver, CANADA Date: ‏ DEC 08-14, 2019 

ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019)  Book Series: ‏ Advances in Neural Information Processing Systems   Volume: ‏

Cited by 80 Related articles All 14 versions

[PDF] arxiv.org

Fused Gromov-Wasserstein Alignment for Hawkes Processes

D Luo, H Xu, L Carin - arXiv preprint arXiv:1910.02096, 2019 - arxiv.org

We propose a novel fused Gromov-Wasserstein alignment method to jointly learn the Hawkes processes in different event spaces, and align their event types. Given two Hawkes processes, we use fused Gromov-Wasserstein discrepancy to measure their dissimilarity …

Cited by 2 Related articles All 3 versions



2019

[PDF] arxiv.org

Parameterized Wasserstein mean with its properties

S Kim - arXiv preprint arXiv:1904.09385, 2019 - arxiv.org

A new least squares mean of positive definite matrices for the divergence associated with the sandwiched quasi-relative entropy has been introduced. It generalizes the well-known Wasserstein mean for covariance matrices of Gaussian distributions with mean zero, so we …

Related articles  All 2 versions


[PDF] thecvf.com

Wasserstein GAN with Quadratic Transport Cost

H Liu, X Gu, D Samaras - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

Wasserstein GANs are increasingly used in Computer Vision applications as they are easier to train. Previous WGAN variants mainly use the l_1 transport cost to compute the Wasserstein distance between the real and synthetic data distributions. The l_1 transport …

[PDF] thecvf.com

[PDF] Wasserstein GAN with Quadratic Transport Cost Supplementary Material

H Liu, X Gu, D Samaras - openaccess.thecvf.com

(1) where I and J are disjoint sets, then for each x_j there exists a t ∈ I such that H_t − H_j = c(x_j, y_t). We prove this by contradiction, ie, there exists one x_s, s ∈ J, such that we cannot find a y_i with H_i − H_s = c(x_s, y_i), i ∈ I. This means that H_s > sup_{i∈I} {H_i − c(x_s, y_i)} …

Wasserstein GAN with quadratic transport cost

H Liu, X Gu, D Samaras - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

Wasserstein GANs are increasingly used in Computer Vision applications as they are easier to train. Previous WGAN variants mainly use the l_1 transport cost to compute the Wasserstein distance between the real and synthetic data distributions. The l_1 transport cost restricts the discriminator to be 1-Lipschitz. However, WGANs with l_1 transport cost were recently shown to not always converge. In this paper, we propose WGAN-QC, a WGAN with quadratic transport cost. Based on the quadratic transport cost, we propose an Optimal …

Cited by 43 Related articles All 5 versions

[PDF] archives-ouvertes.fr

Optimal Control in Wasserstein Spaces

B Bonnet - 2019 - hal.archives-ouvertes.fr

A wealth of mathematical tools allowing to model and analyse multi-agent systems has been brought forth as a consequence of recent developments in optimal transport theory. In this thesis, we extend for the first time several of these concepts to the framework of control …


[PDF] arxiv.org

Approximation of Wasserstein distance with Transshipment

N Papadakis - arXiv preprint arXiv:1901.09400, 2019 - arxiv.org

An algorithm for approximating the p-Wasserstein distance between histograms defined on unstructured discrete grids is presented. It is based on the computation of a barycenter constrained to be supported on a low dimensional subspace, which corresponds to a …

Cited by 2 Related articles All 4 versions

Peer-reviewed
Sharp asymptotic and finite-sample rates of convergence of empirical measures in Wasserstein distance
Authors:Jonathan WeedFrancis Bach
Summary: The Wasserstein distance between two probability measures on a metric space is a measure of closeness with applications in statistics, probability, and machine learning. In this work, we consider the fundamental question of how quickly the empirical measure obtained from $n$ independent samples from $\mu$ approaches $\mu$ in the Wasserstein distance of any order. We prove sharp asymptotic and finite-sample results for this rate of convergence for general measures on general compact metric spaces. Our finite-sample results show the existence of multi-scale behavior, where measures can exhibit radically different rates of convergence as $n$ grows
Downloadable Article
Publication: Bernoulli, 25, 2019-11-01, 2620 (https://projecteuclid.org/euclid.bj/1568362038)


<——2019 —————2019—————210— 


[PDF] utwente.nl

Wasserstein Generative Adversarial Privacy Networks

KE Mulder - 2019 - essay.utwente.nl

A method to filter private data from public data using generative adversarial networks has been introduced in an article" Generative Adversarial Privacy" by Chong Huang et al. in 2018. We attempt to reproduce their results, and build further upon their work by introducing …


[PDF] arxiv.org

A partial Laplacian as an infinitesimal generator on the Wasserstein space

YT Chow, W Gangbo - Journal of Differential Equations, 2019 - Elsevier

In this manuscript, we consider special linear operators which we term partial Laplacians on the Wasserstein space, and which we show to be partial traces of the Wasserstein Hessian. We verify a distinctive smoothing effect of the "heat flows" they generated for a particular …

  Cited by 13 Related articles All 9 versions


[PDF] bayesiandeeplearning.org

[PDF] Nested-Wasserstein Distance for Sequence Generation

R Zhang, C Chen, Z Gan, Z Wen, W Wang, L Carin - bayesiandeeplearning.org

Reinforcement learning (RL) has been widely studied for improving sequencegeneration models. However, the conventional rewards used for RL training typically cannot capture sufficient semantic information and therefore render model bias. Further, the sparse and …


[PDF] arxiv.org

Irregularity of distribution in Wasserstein distance

C Graham - arXiv preprint arXiv:1910.14181, 2019 - arxiv.org

We study the non-uniformity of probability measures on the interval and the circle. On the interval, we identify the Wasserstein-$ p $ distance with the classical $ L^ p $-discrepancy. We thereby derive sharp estimates in Wasserstein distances for the irregularity of distribution …


[PDF] arxiv.org

Busemann functions on the Wasserstein space

G Zhu, WL Li, X Cui - arXiv preprint arXiv:1905.05544, 2019 - arxiv.org

We study rays and co-rays in the Wasserstein space $ P_p (\mathcal {X}) $($ p> 1$) whose ambient space $\mathcal {X} $ is a complete, separable, non-compact, locally compact length space. We show that rays in the Wasserstein space can be represented as probability …

Related articles


2019

[PDF] arxiv.org

Topic Modeling with Wasserstein Autoencoders

F Nan, R Ding, R Nallapati, B Xiang - arXiv preprint arXiv:1907.12374, 2019 - arxiv.org

We propose a novel neural topic model in the Wasserstein autoencoders (WAE) framework. Unlike existing variational autoencoder based models, we directly enforce Dirichlet prior on the latent document-topic vectors. We exploit the structure of the latent space and apply a …


Time Delay Estimation Via Wasserstein Distance Minimization

JM Nichols, MN Hutchinson, N Menkart… - IEEE Signal …, 2019 - ieeexplore.ieee.org

Time delay estimation between signals propagating through nonlinear media is an important problem with application to radar, underwater acoustics, damage detection, and communications (to name a few). Here, we describe a simple approach for determining the …

Cited by 3 Related articles All 2 versions
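To illustrate the general idea of distance-minimizing delay estimation (not the authors' estimator), one can scan candidate delays and pick the one whose windowed amplitude distribution is closest to a reference in a one-dimensional Wasserstein sense, which scipy computes directly from empirical samples:

import numpy as np
from scipy.stats import wasserstein_distance  # 1D Wasserstein-1 between empirical samples

def estimate_delay(reference, received, max_delay):
    # Brute-force scan: choose the delay whose window best matches the reference distribution.
    n = len(reference)
    best_tau, best_dist = 0, np.inf
    for tau in range(max_delay + 1):
        window = received[tau:tau + n]
        d = wasserstein_distance(reference, window)
        if d < best_dist:
            best_tau, best_dist = tau, d
    return best_tau

rng = np.random.default_rng(0)
pulse = np.exp(-np.linspace(-3, 3, 200) ** 2)                        # reference waveform
received = np.concatenate([np.zeros(37), pulse]) + 0.01 * rng.normal(size=237)
print(estimate_delay(pulse, received, max_delay=60))   # estimate near the true delay of 37
                                                       # (flat pulse tails make it approximate)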


Wasserstein barycenters in the manifold of all positive definite matrices

E Nobari, B Ahmadi Kakavandi - Quarterly of Applied Mathematics, 2019 - ams.org

In this paper, we study the Wasserstein barycenter of finitely many Borel probability measures on $\mathbb {P} _ {n} $, the Riemannian manifold of all $ n\times n $ real positive definite matrices as well as its associated dual problem, namely the optimal transport …

Related articles All 2 versions 
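A common way to compute this barycenter numerically is the fixed-point iteration of Álvarez-Esteban et al., S ← S^{-1/2} (Σ_j w_j (S^{1/2} A_j S^{1/2})^{1/2})^2 S^{-1/2}. The sketch below is that generic iteration under our own naming; it is not the paper's algorithm or its dual formulation.

import numpy as np
from scipy.linalg import sqrtm, inv

def bw_barycenter(mats, weights=None, n_iters=50):
    # Fixed-point iteration for the Bures-Wasserstein barycenter of SPD matrices.
    k = len(mats)
    w = np.full(k, 1.0 / k) if weights is None else np.asarray(weights)
    S = np.mean(mats, axis=0)                     # any SPD initializer works
    for _ in range(n_iters):
        S_half = sqrtm(S)
        S_half_inv = inv(S_half)
        M = sum(wi * sqrtm(S_half @ A @ S_half) for wi, A in zip(w, mats))
        S = np.real(S_half_inv @ M @ M @ S_half_inv)
    return S

mats = [np.array([[2.0, 0.2], [0.2, 1.0]]), np.array([[1.0, -0.1], [-0.1, 3.0]])]
print(bw_barycenter(mats))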


Wasserstein Distributionally Robust Shortest Path Problem

Z Wang, K You, S Song, Y Zhang - arXiv preprint arXiv:1902.09128, 2019 - arxiv.org

This paper proposes a data-driven distributionally robust shortest path (DRSP) model where the distribution of travel time of each arc in the transportation network can only be observed through a finite training dataset. To resolve the ambiguity of the probability distribution, the …

  Cited by 11 Related articles All 4 versions 

MR3962587  Nobari, Elham; Ahmadi Kakavandi, Bijan Wasserstein barycenter in the manifold of all positive definite matrices. Quart. Appl. Math. 77 (2019), no. 3, 655–669. (Reviewer: Luca Lussardi) 49Q20 (49M25 58D20 90C25)


High Performance WGAN-GP based Multiple-category Network Anomaly Classification System

JT Wang, CH Wang - 2019 International Conference on Cyber …, 2019 - ieeexplore.ieee.org

Due to the increasing of smart devices, the detection of anomalous traffic on Internet is getting more essential. Many previous intrusion detection studies which focused on the classification between normal or anomaly events can be used to enhance the system 


<——2019    ——————— 2019  ——-220 —


 

[PDF] arxiv.org

A measure approximation theorem for Wasserstein-robust expected values

G van Zyl - arXiv preprint arXiv:1912.12119, 2019 - arxiv.org

We consider the problem of finding the infimum, over probability measures being in a ball defined by Wasserstein distance, of the expected value of a bounded Lipschitz random variable on $\mathbf{R}^d$. We show that if the $\sigma$-algebra is approximated in by a …


Frame-level speech enhancement based on Wasserstein GAN

P Chuan, T Lan, M Li, S Li, Q Liu - … International Conference on …, 2019 - spiedigitallibrary.org

Speech enhancement is a challenging and critical task in the speech processing research area. In this paper, we propose a novel speech enhancement model based on Wasserstein generative adversarial networks, called WSEM. The proposed model operates on frame …

 Related articles All 3 versions


[PDF] arxiv.org

On Scalable Variant of Wasserstein Barycenter

T Le, V Huynh, N Ho, D Phung, M Yamada - arXiv preprint arXiv …, 2019 - arxiv.org

We study a variant of Wasserstein barycenter problem, which we refer to as\emph {tree-sliced Wasserstein barycenter}, by leveraging the structure of tree metrics for the ground metrics in the formulation of Wasserstein distance. Drawing on the tree structure, we …

On scalable variant of wasserstein barycenter

[PDF] arxiv.org

Strong equivalence between metrics of Wasserstein type

E Bayraktar, G Guo - arXiv preprint arXiv:1912.08247, 2019 - arxiv.org

The sliced Wasserstein and more recently max-sliced Wasserstein metrics $\mW_p $ have attracted abundant attention in data sciences and machine learning due to its advantages to tackle the curse of dimensionality. A question of particular importance is the strong …


[PDF] arxiv.org

Learning with Wasserstein barycenters and applications

G Domazakis, D Drivaliaris, S Koukoulas… - arXiv preprint arXiv …, 2019 - arxiv.org

In this work, learning schemes for measure-valued data are proposed, ie data that their structure can be more efficiently represented as probability measures instead of points on $\R^ d $, employing the concept of probability barycenters as defined with respect to the …

Related articles All 2 versions

2019

[PDF] ucla.edu

On differentiability in the Wasserstein space and well-posedness for Hamilton–Jacobi equations

W Gangbo, A Tudorascu - Journal de Mathématiques Pures et Appliquées, 2019 - Elsevier

In this paper we elucidate the connection between various notions of differentiability in the Wasserstein space: some have been introduced intrinsically (in the Wasserstein space, by using typical objects from the theory of Optimal Transport) and used by various authors to …

  Cited by 38 Related articles All 4 versions


[PDF] arxiv.org

Wasserstein Distance Guided Cross-Domain Learning

J Su - arXiv preprint arXiv:1910.07676, 2019 - arxiv.org

Domain adaptation aims to generalise a high-performance learner on target domain (non-labelled data) by leveraging the knowledge from source domain (rich labelled data) which comes from a different but related distribution. Assuming the source and target domains data …

Related articles All 2 versions

VAE/WGAN-Based Image Representation Learning For Pose-Preserving Seamless Identity Replacement In Facial Images

H Kawai, J Chen, P Ishwar… - 2019 IEEE 29th …, 2019 - ieeexplore.ieee.org

We present a novel variational generative adversarial network (VGAN) based on Wasserstein loss to learn a latent representation from a face image that is invariant to identity but preserves head-pose information. This facilitates synthesis of a realistic face …

Cited by 4 Related articles All 3 versions


Slot based Image Captioning with WGAN

Z Xue, L Wang, P Guo - … IEEE/ACIS 18th International Conference on …, 2019 - computer.org

Existing image captioning methods are always limited to the rules of words or syntax with single sentence and poor words. In this paper, this paper introduces a novel framework for image captioning tasks which reconciles slot filling approaches with neural network approaches. Our approach first generates a sentence template with many slot locations using Wasserstein Generative Adversarial Network (WGAN). Then the slots which are in visual regions will be filled by object detectors. Our model consists of a structured sentence …

Related articles All 2 versions

[C] Slot based Image Captioning with WGAN

Z Xue, L Wang, P Guo - 2019 IEEE/ACIS 18th International …, 2019 - ieeexplore.ieee.org

Slot based Image Captioning with WGAN

Z Xue, L Wang, P Guo - … IEEE/ACIS 18th International Conference on …, 2019 - computer.org

Related articles

Slot based Image Captioning with WGAN 

by Xue, Ziyu; Wang, Lei; Guo, Peiyu 

2019 IEEE/ACIS 18th International Conference on Computer and Information Science (ICIS), 06/2019


Conference Proceeding: Full Text Online 


Peer-reviewed
Correction to: Multivariate approximations in Wasserstein distance by Stein’s method and Bismut’s formula

Authors: Xiao Fang, Qi-Man Shao, Lihu Xu
Summary:We write this note to correct [1, (6.9), (6.13), (7.1), (7.2)] because there was one term missed in [1, (6.9)]
Article, 2019
Publication:Probability Theory and Related Fields, 175, 20190711, 1177
Publisher:2019

<——2019————— 2019 ——————230— 


[PDF] rieck.ru

[PDF] A Wasserstein Subsequence Kernel for Time Series

C Bock, M Togninalli, E Ghisu, T Gumbsch, B Rieck… - bastian.rieck.ru

Kernel methods are a powerful approach for learning on structured data. However, as we show in this paper, simple but common instances of the popular R-convolution kernel framework can be meaningless when assessing the similarity of two time series through …

Cited by 6 Related articles All 9 versions
 Wasserstein Subsequence Kernel for Time Series


[PDF] nips.cc

[PDF] Supplementary Materials: Multi-marginal Wasserstein GAN

J Cao, L Mo, Y Zhang, K Jia, C Shen, M Tan - papers.nips.cc

Theory part. In Section A, we provide preliminaries of multi-marginal optimal transport. In Section B, we prove an equivalence theorem that solving Problem II is equivalent to solving Problem III under a mild assumption. In Section C, we build the relationship between …


[PDF] arxiv.org

Wasserstein distances for evaluating cross-lingual embeddings

G Balikas, I Partalas - arXiv preprint arXiv:1910.11005, 2019 - arxiv.org

Word embeddings are high dimensional vector representations of words that capture their semantic similarity in the vector space. There exist several algorithms for learning such embeddings both for a single language as well as for several languages jointly. In this work …


[PDF] arxiv.org

1-Wasserstein Distance on the Standard Simplex

A Frohmader, H Volkmer - arXiv preprint arXiv:1912.04945, 2019 - arxiv.org

Wasserstein distances provide a metric on a space of probability measures. We consider the space $\Omega $ of all probability measures on the finite set $\chi=\{1,\dots, n\} $ where $ n $ is a positive integer. 1-Wasserstein distance, $ W_1 (\mu,\nu) $ is a function from …

arXiv:1912.04945 [pdf, other]
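For the special case of the chain ground metric |i - j| on {1, …, n}, the 1-Wasserstein distance between two probability vectors reduces to the L1 distance between their cumulative sums, which makes a handy sanity check. This is a minimal sketch, not the paper's general treatment of the simplex:

import numpy as np

def w1_on_chain(mu, nu):
    # mu, nu: nonnegative vectors summing to 1 over the ordered points 1..n, cost |i - j|.
    # W1 equals the L1 distance between the cumulative distribution functions.
    return np.abs(np.cumsum(mu - nu)[:-1]).sum()

mu = np.array([0.2, 0.3, 0.5])
nu = np.array([0.5, 0.25, 0.25])
print(w1_on_chain(mu, nu))  # 0.55: move 0.05 from point 2 and 0.25 from point 3 to point 1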

[PDF] arxiv.org

Donsker's theorem in Wasserstein-1 distance

L Coutin, L Decreusefond - arXiv preprint arXiv:1904.07045, 2019 - arxiv.org

We compute the Wasserstein-1 (or Kolmogorov-Rubinstein) distance between a random walk in $R^d$ and the Brownian motion. The proof is based on a new estimate of the Lipschitz modulus of the solution of the Stein's equation. As an application, we can evaluate the rate …

Related articles All 16 versions
 

2019


[PDF] jst.go.jp

Wasserstein Autoencoder を用いた画像スタイル変換

中田秀基, 麻生英樹 - 人工知能学会全国大会論文集 一般社団法人 …, 2019 - jstage.jst.go.jp

Abstract [translated from Japanese]: In this paper we propose image style transfer using a Wasserstein Autoencoder. Image style transfer is a technique that renders arbitrary content in an arbitrary style by applying, to a content image, a style extracted from a style image. Style transfer has been widely studied …

[Japanese: Image style transfer using Wasserstein Autoencoder]

All 2 versions

 

[PDF] arxiv.org

Disentangled Representation Learning with Wasserstein Total Correlation

Y Xiao, WY Wang - arXiv preprint arXiv:1912.12818, 2019 - arxiv.org

Unsupervised learning of disentangled representations involves uncovering of different factors of variations that contribute to the data generation process. Total correlation penalization has been a key component in recent methods towards disentanglement. However, Kullback-Leibler (KL) divergence-based total correlation is metric-agnostic and sensitive to data samples. In this paper, we introduce Wasserstein total correlation in both variational autoencoder and Wasserstein autoencoder settings to learn disentangled latent …

Cited by 6 Related articles All 2 versions  View as HTML

Wasserstein Distances for Estimating Parameters in Stochastic Reaction Networks

K Öcal, R Grima, G Sanguinetti - International Conference on …, 2019 - Springer

Modern experimental methods such as flow cytometry and fluorescence in-situ hybridization (FISH) allow the measurement of cell-by-cell molecule numbers for RNA, proteins and other substances for large numbers of cells at a time, opening up new possibilities for the …

Wasserstein Distances for Estimating Parameters in Stochastic Reaction Networks

Gromov-Wasserstein Factorization Models for Graph Clustering

H Xu - arXiv preprint arXiv:1911.08530, 2019 - arxiv.org

We propose a new nonlinear factorization model for graphs that are with topological structures, and optionally, node attributes. This model is based on a pseudometric called Gromov-Wasserstein (GW) discrepancy, which compares graphs in a relational way. It …

Gromov-Wasserstein Factorization Models for Graph Clustering

[PDF] mlr.press

Gromov-wasserstein learning for graph matching and node embedding

H Xu, D Luo, H Zha, LC Duke - … on machine learning, 2019 - proceedings.mlr.press

… This paper considers the joint goal of graph matching and learning … -Wasserstein learning framework. The dissimilarity between two graphs is measured by the Gromov-Wasserstein …

Cited by 107 Related articles All 12 versions 

<——2019——– 2019  ———240 ——


[PDF] arxiv.org

Statistical data analysis in the Wasserstein space

J Bigot - arXiv preprint arXiv:1907.08417, 2019 - arxiv.org

This paper is concerned by statistical inference problems from a data set whose elements may be modeled as random probability measures such as multiple histograms or point clouds. We propose to review recent contributions in statistics on the use of Wasserstein  …

Cited by 10 Related articles All 4 versions


2019 see 2020
Gromov-Wasserstein averaging in a Riemannian framework
Authors: Chowdhury S., Needham T.
Article, 2019
Publication:arXiv, 2019 10 09
Publisher:2019

 

S Chowdhury, T Needham - arXiv preprint arXiv:1910.04308, 2019 - arxiv.org

We introduce a theoretical framework for performing statistical tasks---including, but not limited to, averaging and principal component analysis---on the space of (possibly asymmetric) matrices with arbitrary entries and sizes. This is carried out under the lens of the …


[PDF] projecteuclid.org

Hybrid Wasserstein distance and fast distribution clustering

I Verdinelli, L Wasserman - Electronic Journal of Statistics, 2019 - projecteuclid.org

We define a modified Wasserstein distance for distribution clustering which inherits many of the properties of the Wasserstein distance but which can be estimated easily and computed quickly. The modified distance is the sum of two terms. The first term—which has a closed …

Related articles


 Learning with a Wasserstein Loss - NIPS Proceedings

[C] Learning with a Wasserstein Loss Advances in Neural Information Processing Systems (NIPS)

C Frogner, C Zhang, H Mobahi, M Araya-Polo… - 2019 see 2015

Cited by 2



2019
Accelerated Linear Convergence of Stochastic Momentum Methods in Wasserstein Distances
          Authors:Can, Bugra (Creator), Gurbuzbalaban, Mert (Creator), Zhu, Lingjiong (Creator)
Summary: Momentum methods such as Polyak's heavy ball (HB) method, Nesterov's accelerated gradient (AG) as well as accelerated projected gradient (APG) method have been commonly used in machine learning practice, but their performance is quite sensitive to noise in the gradients. We study these methods under a first-order stochastic oracle model where noisy estimates of the gradients are available. For strongly convex problems, we show that the distribution of the iterates of AG converges with the accelerated $O(\sqrt{\kappa}\log(1/\varepsilon))$ linear rate to a ball of radius $\varepsilon$ centered at a unique invariant distribution in the 1-Wasserstein metric where $\kappa$ is the condition number as long as the noise variance is smaller than an explicit upper bound we can provide. Our analysis also certifies linear convergence rates as a function of the stepsize, momentum parameter and the noise variance; recovering the accelerated rates in the noiseless case and quantifying the level of noise that can be tolerated to achieve a given performance. In the special case of strongly convex quadratic objectives, we can show accelerated linear rates in the $p$-Wasserstein metric for any $p\geq 1$ with improved sensitivity to noise for both AG and HB through a non-asymptotic analysis under some additional assumptions on the noise structure. Our analysis for HB and AG also leads to improved non-asymptotic convergence bounds in suboptimality for both deterministic and stochastic settings which is of independent interest. To the best of our knowledge, these are the first linear convergence results for stochastic momentum methods under the stochastic oracle model. We also extend our results to the APG method and weakly convex functions showing accelerated rates when the noise magnitude is sufficiently small
Downloadable Archival Material, 2019-01-22
Undefined
Publisher:2019-01-22


2019 PDF

On the Complexity of Approximating Wasserstein ...

proceedings.mlr.press › ...

by A Kroshnin · 2019 · Cited by 43 — we focus on the computational aspects of optimal transport, namely on the complexity ... authors addressed the question of the Wasserstein distance approximation ...

[C] On the complexity of computing Wasserstein distances

B Taskesen, S Shafieezadeh-Abadeh, D Kuhn - 2019 - Working paper

Cited by 2



[PDF] arxiv.org

Quantile Propagation for Wasserstein-Approximate Gaussian Processes

R Zhang, CJ Walder, EV Bonilla, MA Rizoiu… - arXiv preprint arXiv …, 2019 - arxiv.org

In this work, we develop a new approximation method to solve the analytically intractable Bayesian inference for Gaussian process models with factorizable Gaussian likelihoods and single-output latent functions. Our method--dubbed QP--is similar to the expectation …

 Related articles All 8 versions

[PDF] arxiv.org

Nonembeddability of Persistence Diagrams with p>2 Wasserstein Metric

A Wagner - arXiv preprint arXiv:1910.13935, 2019 - arxiv.org

Persistence diagrams do not admit an inner product structure compatible with any Wasserstein metric. Hence, when applying kernel methods to persistence diagrams, the underlying feature map necessarily causes distortion. We prove persistence diagrams with …

arXiv:1910.13935 [pdf, ps, other] math.FA cs.LG math.AT math.MG
Nonembeddability of Persistence Diagrams with p>2 Wasserstein Metric

[C] Wasserstein 생산적 적대 신경망과 구조적 유사지수를 이용한 저선량 컴퓨터 단층촬영 영상 잡음 제거 기법

이지나, 홍영택, 장영걸, 김주호, 백혜진… - 한국정보과학회 학술 …, 2019 - dbpia.co.kr

Abstract [translated from Korean]: Computed tomography (CT) is one of the imaging modalities used for diagnosis; a higher dose yields higher-quality images but can cause disease or tumors. In recent years, generative adversarial networks have produced strong results in unsupervised image denoising research …

[Korean: Low-dose computed tomography image denoising using a Wasserstein generative adversarial network and the structural similarity index]

<——2019———— 2019——————250 —



[PDF] Parallel wasserstein generative adversarial nets with multiple discriminators

Y Su, S Zhao, X Chen, I King… - Proceedings of the 28th …, Macao  2019 - pdfs.semanticscholar.org

Abstract Wasserstein Generative Adversarial Nets (GANs) are newly proposed GAN algorithms and widely used in computer vision, web mining, information retrieval, etc. However, the existing algorithms with approximated Wasserstein loss converge slowly due …

Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI-19) 

Cited by 3 Related articles All 3 versions


2019 thesis   [PDF] illinois.edu

Deep generative models via explicit Wasserstein minimization

Y Chen - 2019 - ideals.illinois.edu

This thesis provides a procedure to fit generative networks to target distributions, with the goal of a small Wasserstein distance (or other optimal transport costs). The approach is based on two principles:(a) if the source randomness of the network is a continuous …

Deep generative models via explicit Wasserstein minimization

www.ideals.illinois.edu › handle

by Y Chen - ‎2019 - ‎Related articles

Aug 23, 2019 - This thesis provides a procedure to fit generative networks to target distributions, with the goal of a small Wasserstein distance (or other optimal ...

University of Illinois at Urbana-Champaign   MS
Related articles
All 2 versions


[PDF] arxiv.org

Approximate Bayesian Computation with the Sliced-Wasserstein Distance

K Nadjahi, V De Bortoli, A Durmus, R Badeau… - arXiv preprint arXiv …, 2019 - arxiv.org

Approximate Bayesian Computation (ABC) is a popular method for approximate inference in generative models with intractable but easy-to-sample likelihood. It constructs an approximate posterior distribution by finding parameters for which the simulated data are …

Cited by 1 Related articles All 3 versions
Approximate Bayesian computation with the sliced-wasserstein distance

A new ordinary kriging predictor for histogram data in L2-Wasserstein space

A Balzanella, R Verde, A Irpino - Smart statistics for smart …, 2019 - iris.unicampania.it

This paper introduces an ordinary kriging predictor for histogram data. We assume that the input data is a set of histograms which summarize data observed in a geographic area. Our aim is to predict the histogram of data in a spatial location where it is not possible to get …



[PDF] arxiv.org

On Efficient Multilevel Clustering via Wasserstein Distances

V Huynh, N Ho, N Dam, XL Nguyen… - arXiv preprint arXiv …, 2019 - arxiv.org

We propose a novel approach to the problem of multilevel clustering, which aims to simultaneously partition data in each group and discover grouping patterns among groups in a potentially large hierarchically structured corpus of data. Our method involves a joint optimization formulation over several spaces of discrete probability measures, which are endowed with Wasserstein distance metrics. We propose several variants of this problem, which admit fast optimization algorithms, by exploiting the connection to the problem of …

Related articles All 2 versions


[PDF] arxiv.org

Wasserstein Collaborative Filtering for Item Cold-start Recommendation

Y Meng, G Chen, B Liao, J Guo, W Liu - arXiv preprint arXiv:1909.04266, 2019 - arxiv.org

The item cold-start problem seriously limits the recommendation performance of Collaborative Filtering (CF) methods when new items have either none or very little interactions. To solve this issue, many modern Internet applications propose to predict a new …
Wasserstein collaborative filtering for item cold-start recommendation


[PDF] arxiv.org

Confidence Regions in Wasserstein Distributionally Robust Estimation

J Blanchet, K Murthy, N Si - arXiv preprint arXiv:1906.01614, 2019 - arxiv.org

Wasserstein distributionally robust optimization (DRO) estimators are obtained as solutions of min-max problems in which the statistician selects a parameter minimizing the worst-case loss among all probability models within a certain distance (in a Wasserstein sense) from the …

Cited by 23 Related articles All 7 versions


[PDF] aclweb.org

Modeling personalization in continuous space for response generation via augmented wasserstein autoencoders

Z Chan, J Li, X Yang, X Chen, W Hu, D Zhao… - Proceedings of the 2019 …, 2019 - aclweb.org

Abstract: Variational autoencoders (VAEs) and Wasserstein autoencoders (WAEs) have achieved noticeable progress in open-domain response generation. Through introducing latent variables in continuous space, these models are capable of capturing utterance-level …

  Cited by 16 Related articles All 3 versions 

<——2019    ——————— 2019  ———260 —



PHom-WAE: Persitent Homology for Wasserstein Auto-Encoders

J Charlier, F Petit, G Ormazabal, R State… - arXiv preprint arXiv …, 2019 - arxiv.org

Auto-encoders are among the most popular neural network architecture for dimension reduction. They are composed of two parts: the encoder which maps the model distribution to a latent manifold and the decoder which maps the latent manifold to a reconstructed …

Related articles
[CITATION] PHom-WAE: Persitent Homology for Wasserstein Auto-Encoders.

J Charlier, F Petit, G Ormazabal, Radu State, J Hilger - CoRR, 2019

PHom-WAE: Persitent homology for Wasserstein auto-encoders. ARTICLE

Unsupervised adversarial domain adaptation based on the wasserstein distance for acoustic scene classification

Authors: Drossos K., Magron P., Virtanen T. - 2019 IEEE Workshop on Applications of Signal Processing to Audio and Acoustics, WASPAA 2019
Article, 2019
Publication:IEEE Workshop on Applications of Signal Processing to Audio and Acoustics, 2019-October, 2019 10 01, 259
Publisher:2019


[PDF] arxiv.org

Computationally Efficient Tree Variants of Gromov-Wasserstein

T Le, N Ho, M Yamada - arXiv preprint arXiv:1910.04462, 2019 - arxiv.org

We propose two novel variants of Gromov-Wasserstein (GW) between probability measures in different probability spaces based on projecting these measures into the tree metric spaces. Our first proposed discrepancy, named\emph {flow-based tree Gromov …

Cited by 1 Related articles All 5 versions

2019 see 2020
Cross-domain Text Sentiment Classification Based on Wasserstein Distance
Authors: Guoyong G., Lin Q., Chen N.
Article, 2019
Publication:Journal of Computers (Taiwan), 30, 2019 12 01, 276
Publisher:2019

Distributionally Robust Learning under the Wasserstein Metric

R Chen - 2019 - search.proquest.com

This dissertation develops a comprehensive statistical learning framework that is robust to (distributional) perturbations in the data using Distributionally Robust Optimization (DRO) under the Wasserstein metric. The learning problems that are studied include:(i) …


Elements of Statistical Inference in 2-Wasserstein Space

J Ebert, V Spokoiny, A Suvorikova - Topics in Applied Analysis and …, 2019 - Springer

This work addresses an issue of statistical inference for the datasets lacking underlying linear structure, which makes impossible the direct application of standard inference techniques and requires a development of a new tool-box taking into account properties of …


  Related articles All 3 versions

[PDF] arxiv.org

Weak convergence of empirical Wasserstein type distances

P Berthet, JC Fort - arXiv preprint arXiv:1911.02389, 2019 - arxiv.org

We estimate contrasts $\int_0^1 \rho(F^{-1}(u)-G^{-1}(u))\,du$ between two continuous distributions $F$ and $G$ on $\mathbb{R}$ such that the set $\{F=G\}$ is a finite union of intervals, possibly empty or $\mathbb{R}$. The non-negative convex cost function $\rho$ is …

Cited by 7 Related articles All 8 versions

[HTML] nih.gov

Construction of 4D Neonatal Cortical Surface Atlases Using Wasserstein Distance

Z Chen, Z Wu, L Sun, F Wang, L Wang… - 2019 IEEE 16th …, 2019 - ieeexplore.ieee.org

Spatiotemporal (4D) neonatal cortical surface atlases with densely sampled ages are important tools for understanding the dynamic early brain development. Conventionally, after non-linear co-registration, surface atlases are constructed by simple Euclidean average …

  Cited by 1 Related articles All 5 versions

[PDF] arxiv.org

2D Wasserstein Loss for Robust Facial Landmark Detection

Y Yan, S Duffner, P Phutane, A Berthelier… - arXiv preprint arXiv …, 2019 - arxiv.org

Facial landmark detection is an important preprocessing task for most applications related to face analysis. In recent years, the performance of facial landmark detection has been significantly improved by using deep Convolutional Neural Networks (CNNs), especially the …
Related articles
All 2 versions 


[PDF] arxiv.org

Tensor product and Hadamard product for the Wasserstein means

J Hwang, S Kim - arXiv preprint arXiv:1908.09261, 2019 - arxiv.org

As one of the least squares mean, we consider the Wasserstein mean of positive definite Hermitian matrices. We verify in this paper the inequalities of the Wasserstein mean related with a strictly positive and unital linear map, the identity of the Wasserstein mean for tensor …
<——2019———2019  ——270 —



2019 thesis
Algorithms for Optimal Transport and Wasserstein Distances
Authors:Jörn SchrieberDominic SchuhmacherAnita Schöbel

Summary:Optimal Transport and Wasserstein Distance are closely related terms that do not only have a long history in the mathematical literature, but also have seen a resurgence in recent years, particularly in the context of the many applications they are used in, which span a variety of scientific fields including - but not limited to - imaging, statistics and machine learning. Due to drastic increases in data volume and a high demand for Wasserstein distance computation, the development of more efficient algorithms in the domain of optimal transport increased in priority and the advancement pick..
Thesis, Dissertation, 2019
English
Publisher:Niedersächsische Staats- und Universitätsbibliothek Göttingen, Göttingen, 2019

[PDF] Algorithms for Optimal Transport and Wasserstein Distances

J Schrieber - 2019 - d-nb.info

Optimal Transport and Wasserstein Distance are closely related terms that do not only have a long history in the mathematical literature, but also have seen a resurgence in recent years, particularly in the context of the many applications they are used in, which span a …

Related articles

Dissertation/Thesis:

 Zbl 1437.90004

[PDF] arxiv.org

Minimax Confidence Intervals for the Sliced Wasserstein Distance

T Manole, S Balakrishnan, L Wasserman - arXiv preprint arXiv:1909.07862, 2019 - arxiv.org

The Wasserstein distance has risen in popularity in the statistics and machine learning communities as a useful metric for comparing probability distributions. We study the problem of uncertainty quantification for the Sliced Wasserstein distance--an easily computable …

Related articles All 3 versions
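
The Sliced Wasserstein distance studied here is itself straightforward to approximate by Monte Carlo over random projection directions, which is part of what makes its uncertainty quantification attractive. A minimal NumPy sketch (illustrative only; the number of projections and the equal-sample-size assumption are mine):

```python
import numpy as np

def sliced_wasserstein(X, Y, n_proj=200, p=2, rng=None):
    """Monte Carlo estimate of the Sliced Wasserstein distance between two
    point clouds X, Y (n_samples x dim, equal sizes): average the 1-D
    p-Wasserstein distance over random projection directions."""
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)           # uniform direction on the sphere
        x_proj = np.sort(X @ theta)              # sorted projections = 1-D quantiles
        y_proj = np.sort(Y @ theta)
        total += np.mean(np.abs(x_proj - y_proj) ** p)
    return (total / n_proj) ** (1.0 / p)

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))
Y = rng.normal(loc=1.0, size=(500, 3))
print(sliced_wasserstein(X, Y))
```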

[PDF] arxiv.org

Optimal Transport Relaxations with Application to Wasserstein GANs

S Mahdian, J Blanchet, P Glynn - arXiv preprint arXiv:1906.03317, 2019 - arxiv.org

We propose a family of relaxations of the optimal transport problem which regularize the problem by introducing an additional minimization step over a small region around one of the underlying transporting measures. The type of regularization that we obtain is related to …

Related articles


[PDF] nips.cc

Propagating Uncertainty in Reinforcement Learning via Wasserstein Barycenters

AM Metelli, A Likmeta, M Restelli - Advances in Neural Information …, 2019 - papers.nips.cc

How does the uncertainty of the value function propagate when performing temporal difference learning? In this paper, we address this question by proposing a Bayesian framework in which we employ approximate posterior distributions to model the uncertainty …

Cited by 9 Related articles All 8 versions


[PDF] harvard.edu

An Information-Theoretic View of Generalization via Wasserstein Distance

H Wang, M Diaz, JCS Santos Filho… - … on Information Theory …, 2019 - ieeexplore.ieee.org

We capitalize on the Wasserstein distance to obtain two information-theoretic bounds on the generalization error of learning algorithms. First, we specialize the Wasserstein distance into total variation, by using the discrete metric. In this case we derive a generalization bound …

 Cited by 21 Related articles All 6 versions

Inequalities of the Wasserstein mean with other matrix means

S Kim, H Lee - Annals of Functional Analysis, 2019 - Springer

Recently, a new Riemannian metric and a least squares mean of positive definite matrices have been introduced. They are called the Bures–Wasserstein metric and Wasserstein mean, which are different from the Riemannian trace metric and Karcher mean. In this paper 
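
For concreteness, the Bures–Wasserstein distance referred to here has the closed form $d(A,B)^2=\operatorname{tr}(A)+\operatorname{tr}(B)-2\,\operatorname{tr}\big((A^{1/2}BA^{1/2})^{1/2}\big)$ for positive definite matrices. A small NumPy/SciPy sketch (an illustration, not code from the paper):

```python
import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein(A, B):
    """Bures-Wasserstein distance between symmetric positive definite matrices:
    d(A, B)^2 = tr(A) + tr(B) - 2 tr((A^{1/2} B A^{1/2})^{1/2})."""
    A_half = sqrtm(A)
    cross = sqrtm(A_half @ B @ A_half)
    d2 = np.trace(A) + np.trace(B) - 2.0 * np.trace(cross)
    return np.sqrt(max(np.real(d2), 0.0))   # clip tiny negatives from round-off

A = np.array([[2.0, 0.5], [0.5, 1.0]])
B = np.array([[1.0, 0.0], [0.0, 3.0]])
print(bures_wasserstein(A, B))
```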


[PDF] arxiv.org

The Wasserstein-Fourier Distance for Stationary Time Series

E Cazelles, A Robert, F Tobar - arXiv preprint arXiv:1912.05509, 2019 - arxiv.org

We introduce a novel framework for analysing stationary time series based on optimal transport distances and spectral embeddings. First, we represent time series by their power spectral density (PSD), which summarises the signal energy spread across the Fourier …
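
A rough illustration of the Wasserstein–Fourier idea (my own simplification, not the authors' implementation): estimate each signal's power spectral density, normalise it to a probability distribution over frequency, and compare the two with a one-dimensional p-Wasserstein distance.

```python
import numpy as np
from scipy.signal import periodogram

def wasserstein_fourier(x, y, fs=1.0, p=2, n_grid=4096):
    """Sketch: compare the normalised PSDs of two stationary signals, viewed as
    probability distributions over frequency, via an approximate p-Wasserstein
    distance computed on a uniform grid of quantile levels."""
    f, Px = periodogram(x, fs=fs)
    _, Py = periodogram(y, fs=fs)
    px, py = Px / Px.sum(), Py / Py.sum()
    u = (np.arange(n_grid) + 0.5) / n_grid                        # quantile levels
    qx = f[np.minimum(np.searchsorted(np.cumsum(px), u), f.size - 1)]
    qy = f[np.minimum(np.searchsorted(np.cumsum(py), u), f.size - 1)]
    return np.mean(np.abs(qx - qy) ** p) ** (1.0 / p)

t = np.arange(2048)
s1 = np.sin(2 * np.pi * 0.05 * t)
s2 = np.sin(2 * np.pi * 0.10 * t)
print(wasserstein_fourier(s1, s2))   # roughly 0.05, the gap between peak frequencies
```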


[PDF] semanticscholar.org

[PDF] A NONLOCAL FREE BOUNDARY PROBLEM WITH WASSERSTEIN DISTANCE

AL KARAKHANYAN - arXiv preprint arXiv:1904.06270, 2019 - pdfs.semanticscholar.org

(1.1) $J[\rho]=\iint \log\frac{1}{|x-y|}\,d\rho(x)\,d\rho(y)+d_2(\rho,\rho_0)$ among all probability measures $\rho$ with finite second momentum. Here $d_2(\rho,\rho_0)=\inf_\gamma \frac{1}{2}\int |x-y|^2\,d\gamma(x,y)$ is the square of the Wasserstein distance between $\rho$ and a given probability measure $\rho_0$, and $\gamma$ is a joint probability measure …

Related articles All 3 versions


PWGAN: wasserstein GANs with perceptual loss for mode collapse

X Wu, C Shi, X Li, J He, X Wu, J Lv, J Zhou - Proceedings of the ACM …, 2019 - dl.acm.org

Generative adversarial network (GAN) plays an important part in image generation. It has great achievements trained on large scene data sets. However, for small scene data sets, we find that most of methods may lead to a mode collapse, which may repeatedly generate …


A degenerate Cahn‐Hilliard model as constrained Wasserstein gradient flow

D Matthes, C Cances, F Nabet - PAMM, 2019 - Wiley Online Library

Existence of solutions to a non‐local Cahn‐Hilliard model with degenerate mobility is considered. The PDE is written as a gradient flow with respect to the L 2‐Wasserstein metric for two components that are coupled by an incompressibility constraint. Approximating …

<——2019——————— 2019  ———-280 —


The quadratic Wasserstein metric for inverse data matching

K Ren, Y Yang - arXiv preprint arXiv:1911.06911, 2019 - arxiv.org

This work characterizes, analytically and numerically, two major effects of the quadratic Wasserstein ($ W_2 $) distance as the measure of data discrepancy in computational solutions of inverse problems. First, we show, in the infinite-dimensional setup, that the …


[PDF] arxiv.org

Learning with minibatch Wasserstein: asymptotic and gradient properties

K Fatras, Y Zine, R Flamary, R Gribonval… - arXiv preprint arXiv …, 2019 - arxiv.org

Optimal transport distances are powerful tools to compare probability distributions and have found many applications in machine learning. Yet their algorithmic complexity prevents their direct use on large scale datasets. To overcome this challenge, practitioners compute these …
Cited by 30 Related articles All 24 versions
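
The minibatch strategy analysed here is easy to sketch: draw random minibatches from each sample, solve each small optimal transport problem exactly, and average. For equal-size uniform minibatches the exact plan is a permutation, so a linear assignment solver suffices. The code below is an illustrative sketch under those assumptions, not the authors' implementation; note that the minibatch average is a biased estimate of the full OT cost, one of the effects the paper analyses.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def minibatch_wasserstein(X, Y, batch_size=64, n_batches=50, rng=None):
    """Average of exact squared 2-Wasserstein distances computed on random
    minibatches; each sub-problem reduces to a linear assignment because the
    minibatch measures are uniform and of equal size."""
    rng = np.random.default_rng(rng)
    total = 0.0
    for _ in range(n_batches):
        xb = X[rng.choice(len(X), batch_size, replace=False)]
        yb = Y[rng.choice(len(Y), batch_size, replace=False)]
        C = cdist(xb, yb, metric="sqeuclidean")
        r, c = linear_sum_assignment(C)
        total += C[r, c].mean()                  # exact W2^2 on this minibatch
    return total / n_batches

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2))
Y = rng.normal(loc=1.0, size=(1000, 2))
print(minibatch_wasserstein(X, Y))
```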

[PDF] dpi-proceedings.com

Isomorphic Wasserstein Generative Adversarial Network for Numeric Data Augmentation

W Wei, W Chuang, LI Yue - DEStech Transactions on …, 2019 - dpi-proceedings.com

GAN-based schemes are one of the most popular methods designed for image generation. Some recent studies have suggested using GAN for numeric data augmentation that is to generate data for completing the imbalanced numeric data. Compared to the conventional …


Density estimation of multivariate samples using Wasserstein distance

E Luini, P Arbenz - Journal of Statistical Computation and …, 2019 - Taylor & Francis

Density estimation is a central topic in statistics and a fundamental task of machine learning. In this paper, we present an algorithm for approximating multivariate empirical densities with a piecewise constant distribution defined on a hyperrectangular-shaped partition of the …

Density estimation of multivariate samples using Wasserstein distance 

By: Luini, E.; Arbenz, P. 

JOURNAL OF STATISTICAL COMPUTATION AND SIMULATION   Volume: ‏ 90   Issue: ‏ 2   Pages: ‏ 181-210   Published: ‏ JAN 22 2020 

Early Access: OCT 2019 


[PDF] arxiv.org

Investigating Under and Overfitting in Wasserstein Generative Adversarial Networks

B Adlam, C Weill, A Kapoor - arXiv preprint arXiv:1910.14137, 2019 - arxiv.org

We investigate under and overfitting in Generative Adversarial Networks (GANs), using discriminators unseen by the generator to measure generalization. We find that the model capacity of the discriminator has a significant effect on the generator's model quality, and …


2019

[PDF] 150.162.46.34

Image Reflection Removal Using the Wasserstein Generative Adversarial Network

T Li, DPK Lun - … 2019-2019 IEEE International Conference on …, 2019 - ieeexplore.ieee.org

Imaging through a semi-transparent material such as glass often suffers from the reflection problem, which degrades the image quality. Reflection removal is a challenging task since it is severely ill-posed. Traditional methods, while all require long computation time on …

Related articles


Adversarial Learning for Cross-Modal Retrieval with Wasserstein Distance

Q Cheng, Y Zhang, X Gu - International Conference on Neural Information …, 2019 - Springer

This paper presents a novel approach for cross-modal retrieval in an Adversarial Learning with Wasserstein Distance (ALWD) manner, which aims at learning aligned representation for various modalities in a GAN framework. The generator projects the image and the text …


[PDF] arxiv.org

Duality and quotient spaces of generalized Wasserstein spaces

NP Chung, TS Trinh - arXiv preprint arXiv:1904.12461, 2019 - arxiv.org

In this article, using ideas of Liero, Mielke and Savaré in [21], we establish a Kantorovich duality for generalized Wasserstein distances $ W_1^{a, b} $ on a generalized Polish metric space, introduced by Picolli and Rossi. As a consequence, we give another proof that …

 Cited by 2 Related articles All 3 versions 


[PDF] arxiv.org

On isometric embeddings of Wasserstein spaces–the discrete case

GP Gehér, T Titkos, D Virosztek - Journal of Mathematical Analysis and …, 2019 - Elsevier

The aim of this short paper is to offer a complete characterization of all (not necessarily surjective) isometric embeddings of the Wasserstein space W p (X), where X is a countable discrete metric space and 0<p<∞ is any parameter value. Roughly speaking, we will prove …

Cited by 5 Related articles All 9 versions


[PDF] aaai.org

Manifold-Valued Image Generation with Wasserstein Generative Adversarial Nets

Z Huang, J Wu, L Van Gool - Proceedings of the AAAI Conference on …, 2019 - aaai.org

Generative modeling over natural images is one of the most fundamental machine learning problems. However, few modern generative models, including Wasserstein Generative Adversarial Nets (WGANs), are studied on manifold-valued images that are frequently …

2019 National Conference on Artificial Intelligence


<——2019————2019  ——————290— 


Investigators from University of Vienna Report New Data on Insurance Economics (Optimal XL-insurance Under Wasserstein-type Ambiguity)

Insurance Business Weekly, 10/2019

Newsletter, Full Text Online

Optimal XL-insurance under Wasserstein-type ambiguity ... Finding an optimal insurance or reinsurance contract is an important topic in actuarial science, ...

by C Birghila - 2019 - Cited by 1 - Related articles

Cited by 4 Related articles All 8 versions

[PDF] arxiv.org

Towards Diverse Paraphrase Generation Using Multi-Class Wasserstein GAN

Z An, S Liu - arXiv preprint arXiv:1909.13827, 2019 - arxiv.org

Paraphrase generation is an important and challenging natural language processing (NLP) task. In this work, we propose a deep generative model to generate paraphrase with diversity. Our model is based on an encoder-decoder architecture. An additional transcoder …
Cited by 9 Related articles All 4 versions


[PDF] ieee.org

Prostate MR Image Segmentation With Self-Attention Adversarial Training Based on Wasserstein Distance

C Su, R Huang, C Liu, T Yin, B Du - IEEE Access, 2019 - ieeexplore.ieee.org

Prostate diseases are very common in men. Accurate segmentation of the prostate plays a significant role in further clinical treatment and diagnosis. There have been some methods that combine the segmentation network and generative adversarial network, using the …

Cited by 5 Related articles

[PDF] arxiv.org

Convergence of some classes of random flights in Wasserstein distance

A Falaleev, V Konakov - arXiv preprint arXiv:1910.03862, 2019 - arxiv.org

In this paper we consider a random walk of a particle in $\mathbb{R}^d$. Convergence of different transformations of trajectories of random flights with Poisson switching moments has been obtained by Davydov and Konakov, as well as diffusion approximation of the process has been built. The goal of this paper is to prove stronger convergence in terms of the Wasserstein distance. Three types of transformations are considered: cases of exponential and super-exponential growth of a switching moment transformation function …

Related articles All 2 versions

2019

[PDF] arxiv.org

Bridging the Gap Between f-GANs and Wasserstein GANs

J Song, S Ermon - arXiv preprint arXiv:1910.09779, 2019 - arxiv.org

Generative adversarial networks (GANs) have enjoyed much success in learning high-dimensional distributions. Learning objectives approximately minimize an $ f $-divergence ($ f $-GANs) or an integral probability metric (Wasserstein GANs) between the model and …

[PDF] openreview.net

A Wasserstein Minimum Velocity Approach to Learning Unnormalized Models

Z Wang, S Cheng, Y Li, J Zhu, B Zhang - 2019 - openreview.net

Score matching provides an effective approach to learning flexible unnormalized models, but its scalability is limited by the need to evaluate a second-order derivative. In this paper, we connect a general family of learning objectives including score matching to …


[PDF] arxiv.org

Scalable Gromov-Wasserstein Learning for Graph Partitioning and Matching

H Xu, D Luo, L Carin - arXiv preprint arXiv:1905.07645, 2019 - arxiv.org

We propose a scalable Gromov-Wasserstein learning (S-GWL) method and establish a novel and theoretically-supported paradigm for large-scale graph analysis. The proposed method is based on the fact that Gromov-Wasserstein discrepancy is a pseudometric on …

Cited by 109 Related articles All 11 versions


[PDF] arxiv.org

Single Image Haze Removal Using Conditional Wasserstein Generative Adversarial Networks

JP Ebenezer, B Das, S Mukhopadhyay - arXiv preprint arXiv:1903.00395, 2019 - arxiv.org

We present a method to restore a clear image from a haze-affected image using a Wasserstein generative adversarial network. As the problem is ill-conditioned, previous methods have required a prior on natural images or multiple images of the same scene. We …

Related articles  All 3 versions

Deep Distributional Sequence Embeddings Based on a Wasserstein Loss

A Abdelwahab, N Landwehr - arXiv preprint arXiv:1912.01933, 2019 - arxiv.org

Deep metric learning employs deep neural networks to embed instances into a metric space such that distances between instances of the same class are small and distances between instances from different classes are large. In most existing deep metric learning techniques, the embedding of an instance is given by a feature vector produced by a deep neural network and Euclidean distance or cosine similarity defines distances between these vectors. In this paper, we study deep distributional embeddings of sequences, where the …

Related articles All 8 versions 

<——2019——— 2019———300—

Understanding MCMC Dynamics as Flows on the Wasserstein Space

C Liu, J Zhuo, J Zhu - arXiv preprint arXiv:1902.00282, 2019 - arxiv.org

It is known that the Langevin dynamics used in MCMC is the gradient flow of the KL divergence on the Wasserstein space, which helps convergence analysis and inspires recent particle-based variational inference methods (ParVIs). But no more MCMC dynamics …

Cited by 21 Related articles All 14 versions

[PDF] arxiv.org

Barycenters of Natural Images--Constrained Wasserstein Barycenters for Image Morphing

D Simon, A Aberdam - arXiv preprint arXiv:1912.11545, 2019 - arxiv.org

Image interpolation, or image morphing, refers to a visual transition between two (or more) input images. For such a transition to look visually appealing, its desirable properties are (i) to be smooth;(ii) to apply the minimal required change in the image; and (iii) to seem" real" …


[PDF] cesar-conference.org

[PDF] Generating Natural Adversarial Hyperspectral examples with a modified Wasserstein GAN

JC Burnel, K Fatras, N Courty - cesar-conference.org

Adversarial examples are a hot topic due to their abilities to fool a classifier's prediction. There are two strategies to create such examples, one uses the attacked classifier's gradients, while the other only requires access to the classifier's prediction. This is …

arXiv:2001.09993  [pdf, other]  cs.LG cs.AI stat.ML

Speech Enhancement for Noise-Robust Speech Synthesis Using Wasserstein GAN

N Adiga, Y Pantazis, V Tsiaras… - Proc. Interspeech …, 2019 - isca-speech.org

The quality of speech synthesis systems can be significantly deteriorated by the presence of background noise in the recordings. Despite the existence of speech enhancement techniques for effectively suppressing additive noise under low signal-tonoise (SNR) …


[PDF] arxiv.org

On the estimation of the Wasserstein distance in generative models

T Pinetz, D Soukup, T Pock - German Conference on Pattern Recognition, 2019 - Springer

Abstract Generative Adversarial Networks (GANs) have been used to model the underlying probability distribution of sample based datasets. GANs are notoriuos for training difficulties and their dependence on arbitrary hyperparameters. One recent improvement in GAN …

Cited by 5 Related articles All 5 versions


[PDF] arxiv.org

Multivariate stable approximation in Wasserstein distance by Stein's method

P Chen, I Nourdin, L Xu, X Yang - arXiv preprint arXiv:1911.12917, 2019 - arxiv.org

We investigate regularity properties of the solution to Stein's equation associated with multivariate integrable $\alpha $-stable distribution for a general class of spectral measures and Lipschitz test functions. The obtained estimates induce an upper bound in Wasserstein  …

Cited by 9 Related articles All 4 versions

[PDF] arxiv.org

Local Bures-Wasserstein Transport: A Practical and Fast Mapping Approximation

A Hoyos-Idrobo - arXiv preprint arXiv:1906.08227, 2019 - arxiv.org

Optimal transport (OT)-based methods have a wide range of applications and have attracted a tremendous amount of attention in recent years. However, most of the computational approaches of OT do not learn the underlying transport map. Although some algorithms …

Related articles All 2 versions

https://arxiv.org › stat

by A Hoyos-Idrobo · 2019 — We build an approximated transport mapping by leveraging the closed-form of Gaussian (Bures-Wasserstein) transport; we compute local ...

[CITATION] Local Bures-Wasserstein Transport: A Practical and Fast Mapping Approximation.

AH Idrobo - CoRR, 2019

[PDF] arxiv.org

Data-Driven Distributionally Robust Appointment Scheduling over Wasserstein Balls

R Jiang, M Ryu, G Xu - arXiv preprint arXiv:1907.03219, 2019 - arxiv.org

We study a single-server appointment scheduling problem with a fixed sequence of appointments, for which we must determine the arrival time for each appointment. We specifically examine two stochastic models. In the first model, we assume that all appointees …

Cited by 18 Related articles All 4 versions

基于 Wasserstein GAN 的新一代人工智能小样本数据增强方法——以生物领域癌症分期数据为例

刘宇飞, 周源, 刘欣, 董放, 王畅, 王子鸿 - Engineering, 2019 - cnki.com.cn

Deep learning algorithms built on big data are of great significance in driving the rapid development of new-generation artificial intelligence. However, the effective use of deep learning depends heavily on the number of labeled samples, which constrains its application in small-sample data settings. This study proposes a method based on generative adversarial networks (GAN) …

[Chinese  New generation of artificial intelligence small sample data augmentation method based on Wasserstein GAN: Taking cancer staging data in the biological field as an example]



[C] 基于 Wasserstein 度量的目标数据关联算法

刘洋, 郭春生 - 软件导刊, 2019 - rjdk.org

To address the shortcomings of current data-association metrics in multi-target tracking, and exploiting the fact that the Wasserstein metric quantifies the discrepancy between probability measures, a target data association algorithm based on the Wasserstein metric is proposed: the Wasserstein distance is used to measure the similarity between target appearance feature vectors, each appearance feature vector is regarded as a distribution, and the distance between the distributions is computed …

[Chinese: A target data association algorithm based on the Wasserstein metric]

<——2019—————— 2019 —————— -310 —


DE NOVO PROTEIN DESIGN OF NOVEL FOLDS USING GUIDED CONDITIONAL WASSERSTEIN GENERATIVE ADVERSARIAL NETWORKS (GCWGAN)

S Zhu - 2019 - oaktrust.library.tamu.edu

In the research areas about proteins, it is always a significant topic to detect the sequencestructure-function relationship. Fundamental questions remain for this topic: How much could current data alone reveal deep insights about such relationship? And how much …


[PDF] thecvf.com

Face Synthesis and Recognition Using Disentangled Representation-Learning Wasserstein GAN

GS Jison Hsu, CH Tang… - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

Abstract We propose the Disentangled Representation-learning Wasserstein GAN (DR-WGAN) trained on augmented data for face recognition and face synthesis across pose. We improve the state-of-the-art DR-GAN with the Wasserstein loss considered in the …

Related articles


[PDF] arxiv.org

Riemannian Normalizing Flow on Variational Wasserstein Autoencoder for Text Modeling

PZ Wang, WY Wang - arXiv preprint arXiv:1904.02399, 2019 - arxiv.org

Recurrent Variational Autoencoder has been widely used for language modeling and text generation tasks. These models often face a difficult optimization problem, also known as the Kullback-Leibler (KL) term vanishing issue, where the posterior easily collapses to the …

Cited by 27 Related articles All 5 versions


[PDF] arxiv.org

A Wasserstein Inequality and Minimal Green Energy on Compact Manifolds

S Steinerberger - arXiv preprint arXiv:1907.09023, 2019 - arxiv.org

Let $ M $ be a smooth, compact $ d-$ dimensional manifold, $ d\geq 3, $ without boundary and let $ G: M\times M\rightarrow\mathbb {R}\cup\left\{\infty\right\} $ denote the Green's function of the Laplacian $-\Delta $(normalized to have mean value 0). We prove a bound …


[PDF] arxiv.org

Transport and Interface: an Uncertainty Principle for the Wasserstein distance

A Sagiv, S Steinerberger - arXiv preprint arXiv:1905.07450, 2019 - arxiv.org

Let $ f:[0, 1]^ d\rightarrow\mathbb {R} $ be a continuous function with zero mean and interpret $ f_ {+}=\max (f, 0) $ and $ f_ {-}=-\min (f, 0) $ as the densities of two measures. We prove that if the cost of transport from $ f_ {+} $ to $ f_ {-} $ is small (in terms of the …

Related articles


[PDF] arxiv.org

Attainability property for a probabilistic target in Wasserstein spaces

G Cavagnari, A Marigonda - arXiv preprint arXiv:1904.10933, 2019 - arxiv.org

In this paper we establish an attainability result for the minimum time function of a control problem in the space of probability measures endowed with Wasserstein distance. The dynamics is provided by a suitable controlled continuity equation, where we impose a …

Cited by 1 Related articles All 2 versions

2019 see 2020
Learning with minibatch Wasserstein : asymptotic and gradient properties

Authors:Fatras, Kilian (Creator), Zine, Younes (Creator), Flamary, Rémi (Creator), Gribonval, Rémi (Creator), Courty, Nicolas (Creator)
Summary:Optimal transport distances are powerful tools to compare probability distributions and have found many applications in machine learning. Yet their algorithmic complexity prevents their direct use on large scale datasets. To overcome this challenge, practitioners compute these distances on minibatches, i.e., they average the outcome of several smaller optimal transport problems. We propose in this paper an analysis of this practice, whose effects are not well understood so far. We notably argue that it is equivalent to an implicit regularization of the original problem, with appealing properties such as unbiased estimators, gradients and a concentration bound around the expectation, but also with defects such as loss of distance property. Along with this theoretical analysis, we also conduct empirical experiments on gradient flows, GANs or color transfer that highlight the practical interest of this strategy.
Downloadable Archival Material, 2019-10-09
Undefined
Publisher:2019-10-09


[HTML] mdpi.com

[HTML] Wasserstein Generative Adversarial Network Based De-Blurring Using Perceptual Similarity

M Hong, Y Choe - Applied Sciences, 2019 - mdpi.com

The de-blurring of blurred images is one of the most important image processing methods and it can be used for the preprocessing step in many multimedia and computer vision applications. Recently, de-blurring methods have been performed by neural network …

Related articles


Barycenters of Natural Images -- Constrained Wasserstein Barycenters for Image Morphing
Authors:Simon, Dror (Creator), Aberdam, Aviad (Creator)
Summary:Image interpolation, or image morphing, refers to a visual transition between two (or more) input images. For such a transition to look visually appealing, its desirable properties are (i) to be smooth; (ii) to apply the minimal required change in the image; and (iii) to seem "real", avoiding unnatural artifacts in each image in the transition. To obtain a smooth and straightforward transition, one may adopt the well-known Wasserstein Barycenter Problem (WBP). While this approach guarantees minimal changes under the Wasserstein metric, the resulting images might seem unnatural. In this work, we propose a novel approach for image morphing that possesses all three desired properties. To this end, we define a constrained variant of the WBP that enforces the intermediate images to satisfy an image prior. We describe an algorithm that solves this problem and demonstrate it using the sparse prior and generative adversarial networks.
Downloadable Archival Material, 2019-12-24
Undefined
Publisher:2019-12-24

一种基于 Wasserstein 距离及有效性指标的最优场景约简方法

董骁翀, 孙英云, 蒲天骄, 陈乃仕, 孙珂 - 中国电机工程学报, 2019 - cnki.com.cn

With the development of renewable energy, the number of uncertain factors in power systems is increasing. Accurately simulating renewable energy scenarios is of great significance for the dispatch and planning of large-scale grid-connected renewable generation. To address this problem, a 0-1 programming model for scenario reduction based on the Wasserstein probability distance is proposed. Clustering validity indices are then analysed, and a criterion including internal validity and external …

[Chinese: An optimal scenario reduction method based on the Wasserstein distance and validity indices]

Related articles

<—–2019 ————— 2019  —————320 —

 

 

[PDF] nips.cc

Interior-point methods strike back: solving the Wasserstein barycenter problem

D Ge, H Wang, Z Xiong, Y Ye - Advances in Neural Information …, 2019 - papers.nips.cc

Computing the Wasserstein barycenter of a set of probability measures under the optimal transport metric can quickly become prohibitive for traditional second-order algorithms, such as interior-point methods, as the support size of the measures increases. In this paper, we overcome the difficulty by developing a new adapted interior-point method that fully exploits the problem's special matrix structure to reduce the iteration complexity and speed up the Newton procedure. Different from regularization approaches, our method achieves a well …

Cited by 7 Related articles All 5 versions
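
The paper targets the general discrete barycenter problem, which requires an LP/interior-point solver. As a point of reference, in one dimension the 2-Wasserstein barycenter has a closed form obtained by averaging quantile functions; the sketch below illustrates only that special case (equal-size samples, uniform weights by default, my own function name).

```python
import numpy as np

def w2_barycenter_1d(samples, weights=None, n_grid=1000):
    """One-dimensional special case of the Wasserstein-2 barycenter: the
    barycenter's quantile function is the weighted average of the input
    quantile functions.  `samples` is a list of 1-D arrays; returns barycenter
    values at uniform quantile levels."""
    k = len(samples)
    weights = np.full(k, 1.0 / k) if weights is None else np.asarray(weights)
    u = (np.arange(n_grid) + 0.5) / n_grid
    quantiles = np.stack([np.quantile(s, u) for s in samples])   # shape (k, n_grid)
    return weights @ quantiles

rng = np.random.default_rng(0)
bary = w2_barycenter_1d([rng.normal(-2, 1, 500), rng.normal(3, 1, 500)])
print(bary.mean())   # close to 0.5, the average of the two input means
```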

 

Wasserstein space as state space of quantum mechanics and optimal transport

MF Rosyid, K Wahyuningsih - Journal of Physics: Conference …, 2019 - iopscience.iop.org

In this work, we are in the position to view a measurement of a physical observable as an experiment in the sense of probability theory. To every physical observable, a sample space called the spectrum of the observable is therefore available. We have investigated the …

Related articles All 3 versions

[PDF] arxiv.org

Modeling the Biological Pathology Continuum with HSIC-regularized Wasserstein Auto-encoders

D Wu, H Kobayashi, C Ding, L Cheng… - arXiv preprint arXiv …, 2019 - arxiv.org

A crucial challenge in image-based modeling of biomedical data is to identify trends and features that separate normality and pathology. In many cases, the morphology of the imaged object exhibits continuous change as it deviates from normality, and thus a generative model can be trained to model this morphological continuum. Moreover, given side information that correlates to certain trend in morphological change, a latent variable model can be regularized such that its latent representation reflects this side information. In …

Cited by 4 Related articles All 2 versions

[PDF] arxiv.org

A fast proximal point method for computing exact wasserstein distance

Y Xie, X Wang, R Wang, H Zha - arXiv preprint arXiv:1802.04307, 2018 - arxiv.org

Wasserstein distance plays increasingly important roles in machine learning, stochastic programming and image processing. Major efforts have been under way to address its high computational complexity, some leading to approximate or regularized variations such as Sinkhorn distance. However, as we will demonstrate, regularized variations with large regularization parameter will degrade the performance in several important machine learning applications, and small regularization parameter will fail due to numerical stability …

Cited by 5 Related articles All 3 versions 


[PDF] arxiv.org

Bounding quantiles of Wasserstein distance between true and empirical measure

SN Cohen, MNA Tegnér, J Wiesel - arXiv preprint arXiv:1907.02006, 2019 - arxiv.org

Consider the empirical measure, $\hat {\mathbb {P}} _N $, associated to $ N $ iid samples of a given probability distribution $\mathbb {P} $ on the unit interval. For fixed $\mathbb {P} $ the Wasserstein distance between $\hat {\mathbb {P}} _N $ and $\mathbb {P} $ is a random …



[PDF] arxiv.org

Tropical Optimal Transport and Wasserstein Distances in Phylogenetic Tree Space

W Lee, W Li, B Lin, A Monod - arXiv preprint arXiv:1911.05401, 2019 - arxiv.org

We study the problem of optimal transport on phylogenetic tree space from the perspective of tropical geometry, and thus define the Wasserstein-$ p $ distances for probability measures in this continuous metric measure space setting. With respect to the tropical metric …


 

[PDF] arxiv.org

Universality of persistence diagrams and the bottleneck and Wasserstein distances

P Bubenik, A Elchesen - arXiv preprint arXiv:1912.02563, 2019 - arxiv.org

We undertake a formal study of persistence diagrams and their metrics. We show that barcodes and persistence diagrams together with the bottleneck distance and the Wasserstein distances are obtained via universal constructions and thus have …

 Cited by 3 Related articles All 4 versions 
 

[PDF] projecteuclid.org

Convergence of the Population Dynamics algorithm in the Wasserstein metric

M Olvera-Cravioto - Electronic Journal of Probability, 2019 - projecteuclid.org

We study the convergence of the population dynamics algorithm, which produces sample pools of random variables having a distribution that closely approximates that of the special endogenous solution to a variety of branching stochastic fixed-point equations, including the …

Cited by 4 Related articles All 7 versions


[PDF] arxiv.org

Parisi's formula is a Hamilton-Jacobi equation in Wasserstein space

JC Mourrat - arXiv preprint arXiv:1906.08471, 2019 - arxiv.org

Parisi's formula is a self-contained description of the infinite-volume limit of the free energy of mean-field spin glass models. We show that this quantity can be recast as the solution of a Hamilton-Jacobi equation in the Wasserstein space of probability measures on the positive …

Cited by 1 Related articles


Wasserstein generative adversarial networks for motion artifact removal in dental CT imaging

C Jiang, Q Zhang, Y Ge, D Liang… - … 2019: Physics of …, 2019 - spiedigitallibrary.org

In dental computed tomography (CT) scanning, high-quality images are crucial for oral disease diagnosis and treatment. However, many artifacts, such as metal artifacts, downsampling artifacts and motion artifacts, can degrade the image quality in practice. The …

Cited by 1 Related articles All 2 versions 


<-—2019—————— 2019 ——-330 ———


Cross-domain Attention Network with Wasserstein Regularizers for E-commerce Search

M Qiu, B Wang, C Chen, X Zeng, J Huang… - Proceedings of the 28th …, 2019 - dl.acm.org

Product search and recommendation is a task that every e-commerce platform wants to outperform their peers on. However, training a good search or recommendation model often requires more data than what many platforms have. Fortunately, the search tasks on different …

Cited by 3 Related articles

On the Computational Complexity of Finding a Sparse Wasserstein Barycenter

S Borgwardt, S Patterson - arXiv preprint arXiv:1910.07568, 2019 - arxiv.org

The discrete Wasserstein barycenter problem is a minimum-cost mass transport problem for a set of probability measures with finite support. In this paper, we show that finding a barycenter of sparse support is hard, even in dimension 2 and for only 3 measures. We …

 Cited by 9 Related articles All 2 versions 

Aggregated Wasserstein Distance and State Registration for Hidden Markov Models

Y Chen, J Ye, J Li - IEEE Transactions on Pattern Analysis and …, 2019 - ieeexplore.ieee.org

We propose a framework, named Aggregated Wasserstein, for computing a distance between two Hidden Markov Models with state conditional distributions Cited by 16 Related articles All 7 versions

Finsler structure for variable exponent Wasserstein space and gradient flows

A Marcos, A Soglo - arXiv preprint arXiv:1912.12450, 2019 - arxiv.org



[PDF] arxiv.org

Hausdorff and Wasserstein metrics on graphs and other structured data

E Patterson - arXiv preprint arXiv:1907.00257, 2019 - arxiv.org

Optimal transport is widely used in pure and applied mathematics to find probabilistic solutions to hard combinatorial matching problems. We extend the Wasserstein metric and other elements of optimal transport from the matching of sets to the matching of graphs and …


[PDF] arxiv.org

Hypothesis Test and Confidence Analysis with Wasserstein Distance on General Dimension

M Imaizumi, H Ota, T Hamaguchi - arXiv preprint arXiv:1910.07773, 2019 - arxiv.org

We develop a general framework for statistical inference with the Wasserstein distance. Recently, the Wasserstein distance has attracted much attention and been applied to various machine learning tasks due to its celebrated properties. Despite the importance …

[PDF] arxiv.org

Pushing the right boundaries matters! Wasserstein Adversarial Training for Label Noise

BB Damodaran, K Fatras, S Lobry, R Flamary… - arXiv preprint arXiv …, 2019 - arxiv.org

Noisy labels often occur in vision datasets, especially when they are issued from crowdsourcing or Web scraping. In this paper, we propose a new regularization method which enables one to learn robust classifiers in presence of noisy data. To achieve this goal …

Cited by 3 Related articles All 4 versions


[PDF] arxiv.org

Wasserstein Distance based Deep Adversarial Transfer Learning for Intelligent Fault Diagnosis

C Cheng, B Zhou, G Ma, D Wu, Y Yuan - arXiv preprint arXiv:1903.06753, 2019 - arxiv.org

The demand of artificial intelligent adoption for condition-based maintenance strategy is astonishingly increased over the past few years. Intelligent fault diagnosis is one critical topic of maintenance solution for mechanical systems. Deep learning models, such as …

Cited by 9  Related articles  All 3 versions 


[PDF] ieee.org

Accelerating CS-MRI Reconstruction With Fine-Tuning Wasserstein Generative Adversarial Network

M Jiang, Z Yuan, X Yang, J Zhang, Y Gong, L Xia… - IEEE …, 2019 - ieeexplore.ieee.org

Compressed sensing magnetic resonance imaging (CS-MRI) is a time-efficient method to acquire MR images by taking advantage of the highly under-sampled k-space data to accelerate the time consuming acquisition process. In this paper, we proposed a de-aliasing …

Engineering; Reports from Zhejiang Science Technical University Advance Knowledge in Engineering (Accelerating Cs-mri Reconstruction With Fine-tuning Wasserstein... 

Journal of Engineering, Dec 16, 2019, 2937

Newspaper ArticleFull Text Online 

 Cited by 11 Related articles

A partial Laplacian as an infinitesimal generator on the Wasserstein space

YT Chow, W Gangbo - Journal of Differential Equations, 2019 - Elsevier

In this manuscript, we consider special linear operators which we term partial Laplacians on

the Wasserstein space, and which we show to be partial traces of the Wasserstein Hessian.

We verify a distinctive smoothing effect of the “heat flows” they generated for a particular …

  Cited by 13 Related articles All 9 versions

<-—2019 ——————— 2019  ——340 ——


Data-Driven Distributionally Robust Shortest Path Problem Using the Wasserstein Ambiguity Set

Z Wang, K You, S Song, C Shang - 2019 IEEE 15th …, 2019 - ieeexplore.ieee.org

This paper proposes a data-driven distributionally robust shortest path (DRSP) model where the distribution of the travel time is only observable through a finite training dataset. Our DRSP model adopts the Wasserstein metric to construct the ambiguity set of probability …

Cited by 1 Related articles


Optimal Fusion of Elliptic Extended Target Estimates based on the Wasserstein Distance

K Thormann, M Baum - arXiv preprint arXiv:1904.00708, 2019 - arxiv.org

This paper considers the fusion of multiple estimates of a spatially extended object, where the object extent is modeled as an ellipse that is parameterized by the orientation and semi-axes lengths. For this purpose, we propose a novel systematic approach that employs a …

  Cited by 1 Related articles All 5 versions


[PDF] arxiv.org

WGANSing: A Multi-Voice Singing Voice Synthesizer Based on the Wasserstein-GAN

P Chandna, M Blaauw, J Bonada, E Gomez - arXiv preprint arXiv …, 2019 - arxiv.org

We present a deep neural network based singing voice synthesizer, inspired by the Deep Convolutions Generative Adversarial Networks (DCGAN) architecture and optimized using the Wasserstein-GAN algorithm. We use vocoder parameters for acoustic modelling, to …
Cited by 44
Related articles All 6 versions


[PDF] ntu.edu.sg

Poisson discretizations of Wiener functionals and Malliavin operators with Wasserstein estimates

N Privault, SCP Yam, Z Zhang - Stochastic Processes and their …, 2019 - Elsevier

This article proposes a global, chaos-based procedure for the discretization of functionals of Brownian motion into functionals of a Poisson process with intensity λ> 0. Under this discretization we study the weak convergence, as the intensity of the underlying Poisson …

Related articles

Poisson discretizations of Wiener functionals and Malliavin ...

Poisson discretizations of Wiener functionals and Malliavin operators with ... for the discretization of functionals of Brownian motion into functionals of a Poisson ...

by N Privault - ‎2019 - ‎Related articles

Data on Stochastics and Dynamics Reported by Researchers at Chinese University of Hong Kong 

(Poisson Discretizations of Wiener Functionals and Malliavin Operators With Wasserstein... 

Journal of Engineering, 09/2019

NewsletterFull Text Online 

Stochastics and Dynamics; Data on Stochastics and Dynamics Reported by Researchers at Chinese University of Hong Kong (Poisson Discretizations of Wiener Functionals and Malliavin Operators With Wasserstein... 

Journal of Engineering, Sep 16, 2019, 289

Newspaper ArticleFull Text Online 


[PDF] ieee.org

Multi-source Medical Image Fusion Based on Wasserstein Generative Adversarial Networks

Z Yang, Y Chen, Z Le, F Fan, E Pan - IEEE Access, 2019 - ieeexplore.ieee.org

In this paper, we propose the medical Wasserstein generative adversarial networks (MWGAN), an end-to-end model, for fusing magnetic resonance imaging (MRI) and positron emission tomography (PET) medical images. Our method establishes two adversarial …


2019

[PDF] arxiv.org

Connections between Support Vector Machines, Wasserstein distance and gradient-penalty GANs

A Jolicoeur-Martineau, I Mitliagkas - arXiv preprint arXiv:1910.06922, 2019 - arxiv.org

We generalize the concept of maximum-margin classifiers (MMCs) to arbitrary norms and non-linear functions. Support Vector Machines (SVMs) are a special case of MMC. We find that MMCs can be formulated as Integral Probability Metrics (IPMs) or classifiers with some …

Cited by 12 Related articles All 2 versions


[PDF] arxiv.org

Quantitative spectral gap estimate and Wasserstein contraction of simple slice sampling

V Natarovskii, D Rudolf, B Sprungk - arXiv preprint arXiv:1903.03824, 2019 - arxiv.org

We prove Wasserstein contraction of simple slice sampling for approximate sampling wrt distributions with log-concave and rotational invariant Lebesgue densities. This yields, in particular, an explicit quantitative lower bound of the spectral gap of simple slice sampling …

Related articles


[PDF] arxiv.org

On the Wasserstein Distance between Classical Sequences and the Lebesgue Measure

L Brown, S Steinerberger - arXiv preprint arXiv:1909.09046, 2019 - arxiv.org

We discuss the classical problem of measuring the regularity of distribution of sets of $ N $ points in $\mathbb {T}^ d $. A recent line of investigation is to study the cost ($= $ mass $\times $ distance) necessary to move Dirac measures placed in these points to the uniform …


[PDF] biorxiv.org

Reproducibility test of radiomics using network analysis and Wasserstein K-means algorithm

JH Oh, A Apte, E Katsoulakis, N Riaz, V Hatzoglou… - bioRxiv, 2019 - biorxiv.org

Purpose: To construct robust and validated radiomic predictive models, the development of a reliable method that can identify reproducible radiomic features robust to varying image acquisition methods and other scanner parameters should be preceded with rigorous …

 Related articles All 3 versions 

[PDF] nips.cc

[PDF] A First-Order Algorithmic Framework for Wasserstein Distributionally Robust Logistic Regression

J Li, S Huang, AMC So - arXiv preprint arXiv:1910.12778, 2019 - papers.nips.cc

Wasserstein distance-based distributionally robust optimization (DRO) has received much attention lately due to its ability to provide a robustness interpretation of various learning models. Moreover, many of the DRO problems that arise in the learning context admits exact …

<-—2019—————— 2019 — -350 —

——

[CITATION] Comparison of object functions for the inversion of seismic data and study on the potentialities of the Wasserstein Metric

L Stracca, E Stucchi, A Mazzotti - GNGTS, 2019 - arpi.unipi.it



Adaptive Wasserstein Hourglass for Weakly Supervised Hand Pose Estimation from Monocular RGB

Y Zhang, L Chen, Y Liu, J Yong, W Zheng - arXiv preprint arXiv …, 2019 - arxiv.org

Insufficient labeled training datasets is one of the bottlenecks of 3D hand pose estimation from monocular RGB images. Synthetic datasets have a large number of images with precise annotations, but the obvious difference with real-world datasets impacts the …


[PDF] wustl.edu

Grid-Less DOA Estimation Using Sparse Linear Arrays Based on Wasserstein Distance

M Wang, Z Zhang, A Nehorai - IEEE Signal Processing Letters, 2019 - ieeexplore.ieee.org

Sparse linear arrays, such as nested and co-prime arrays, are capable of resolving O(M^2) sources using only O(M) sensors by exploiting their so-called difference coarray model. One popular approach to exploit the difference coarray model is to construct an augmented …

Cited by 10 Related articles All 3 versions


[PDF] arxiv.org

Painting halos from 3D dark matter fields using Wasserstein mapping networks

DK Ramanah, T Charnock, G Lavaux - arXiv preprint arXiv:1903.10524, 2019 - arxiv.org

We present a novel halo painting network that learns to map approximate 3D dark matter fields to realistic halo distributions. This map is provided via a physically motivated network with which we can learn the non-trivial local relation between dark matter density field and …

All 4 versions


[PDF] arxiv.org

Asymptotic Guarantees for Learning Generative Models with the Sliced-Wasserstein Distance

K Nadjahi, A Durmus, U Şimşekli, R Badeau - arXiv preprint arXiv …, 2019 - arxiv.org

Minimum expected distance estimation (MEDE) algorithms have been widely used for probabilistic models with intractable likelihood functions and they have become increasingly popular due to their use in implicit generative modeling (eg Wasserstein generative …

Cited by 6 Related articles All 5 versions



[PDF] arxiv.org

Stochastic equation and exponential ergodicity in Wasserstein distances for affine processes

M Friesen, P Jin, B Rüdiger - arXiv preprint arXiv:1901.05815, 2019 - arxiv.org

This work is devoted to the study of conservative affine processes on the canonical state space $ D=\mathbb {R} _+^ m\times\mathbb {R}^ n $, where $ m+ n> 0$. We show that each affine process can be obtained as the pathwise unique strong solution to a stochastic …

Related articles  All 3 versions


[PDF] esaim-cocv.org

A Pontryagin Maximum Principle in Wasserstein spaces for constrained optimal control problems

B Bonnet - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

In this paper, we prove a Pontryagin Maximum Principle for constrained optimal control problems in the Wasserstein space of probability measures. The dynamics is described by a transport equation with non-local velocities which are affine in the control, and is subject to …

  Cited by 9 Related articles All 45 versions


[PDF] arxiv.org

Wasserstein convergence rates for coin tossing approximations of continuous Markov processes

S Ankirchner, T Kruse, M Urusov - arXiv preprint arXiv:1903.07880, 2019 - arxiv.org

We determine the convergence speed of a numerical scheme for approximating one-dimensional continuous strong Markov processes. The scheme is based on the construction of coin tossing Markov chains whose laws can be embedded into the process with a …

Cited by 1 Related articles


[PDF] arxiv.org

Stylized Text Generation Using Wasserstein Autoencoders with a Mixture of Gaussian Prior

A Ghabussi, L Mou, O Vechtomova - arXiv preprint arXiv:1911.03828, 2019 - arxiv.org

Wasserstein autoencoders are effective for text generation. They do not however provide any control over the style and topic of the generated sentences if the dataset has multiple classes and includes different topics. In this work, we present a semi-supervised approach …


Confronto di funzioni oggetto per l'inversione di dati sismici e studio delle potenzialità della Metrica di Wasserstein

L STRACCA - 2019 - etd.adm.unipi.it

An inverse problem aims at determining or estimating the unknown parameters of a model, given the data generated by that model and the forward-modelling operator that describes the relation between a generic model and the corresponding predicted data. In any …

[Italian: Comparison of objective functions for the inversion of seismic data and a study of the potential of the Wasserstein metric]

<-—2019    ——————— 2019  ————360 —


 

[PDF] aclweb.org

Modeling Personalization in Continuous Space for Response Generation via Augmented Wasserstein Autoencoders

Z Chan, J Li, X Yang, X Chen, W Hu, D Zhao… - Proceedings of the 2019 …, 2019 - aclweb.org

Abstract Variational autoencoders (VAEs) and Wasserstein autoencoders (WAEs) have achieved noticeable progress in open-domain response generation. Through introducing latent variables in continuous space, these models are capable of capturing utterance-level …

Cited by 22 Related articles All 2 versions 

 

Using Wasserstein-2 regularization to ensure fair decisions with Neural-Network classifiers

L Risser, Q Vincenot, N Couellan… - arXiv preprint arXiv …, 2019 - arxiv.org

In this paper, we propose a new method to build fair Neural-Network classifiers by using a constraint based on the Wasserstein distance. More specifically, we detail how to efficiently compute the gradients of Wasserstein-2 regularizers for Neural-Networks. The proposed …

Cited by 10 Related articles All 2 versions

[PDF] arxiv.org

Random Matrix-Improved Estimation of the Wasserstein Distance between two Centered Gaussian Distributions

M Tiomoko, R Couillet - arXiv preprint arXiv:1903.03447, 2019 - arxiv.org

This article proposes a method to consistently estimate functionals $\frac1p\sum_ {i= 1}^ pf (\lambda_i (C_1C_2)) $ of the eigenvalues of the product of two covariance matrices $ C_1, C_2\in\mathbb {R}^{p\times p} $ based on the empirical estimates $\lambda_i (\hat C_1\hat …

Cited by 1 Related articles
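
For orientation, the population quantity underlying such estimates has a closed form: for centered Gaussians the 2-Wasserstein distance depends only on the covariance matrices,

```latex
% Squared 2-Wasserstein distance between centered Gaussians N(0, C_1) and N(0, C_2)
W_2^2\big(\mathcal{N}(0,C_1),\,\mathcal{N}(0,C_2)\big)
  = \operatorname{tr}(C_1) + \operatorname{tr}(C_2)
    - 2\,\operatorname{tr}\!\Big(\big(C_1^{1/2} C_2\, C_1^{1/2}\big)^{1/2}\Big),
```

and since $\operatorname{tr}\big((C_1^{1/2}C_2C_1^{1/2})^{1/2}\big)=\sum_i \lambda_i(C_1C_2)^{1/2}$, consistent estimation of spectral functionals of $C_1C_2$ is exactly what is needed to estimate this distance from samples.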


[PDF] sns.it    NNT : 2019SACLS112T

THÈSE DE DOCTORAT

Courbes et applications optimales à valeurs dans l'espace de Wasserstein

[French: Doctoral thesis: Curves and optimal maps with values in the Wasserstein space]

Zastosowanie metryki Wassersteina w problemie uczenia ...

https://pages.mini.pw.edu.pl › ~mandziukj

PDF, Mar 27, 2019 — Application of the Wasserstein metric to the problem of training restricted Boltzmann machines ... Training Boltzmann machines using the distance. 26 pages

[Polish: Application of the Wasserstein metric to the problem of training restricted Boltzmann machines]


2019

Unsupervised feature extraction based on improved Wasserstein generative adversarial network for hyperspectral classification

Q Sun, S Bourennane - Multimodal Sensing: Technologies …, 2019 - spiedigitallibrary.org

Accurate classification is one of the most important prerequisites for hyperspectral applications and feature extraction is the key step of classification. Recently, deep learning models have been successfully used to extract the spectral-spatial features in hyperspectral …

 Related articles All 4 versions


[PDF] arxiv.org

The Wasserstein Distances Between Pushed-Forward Measures with Applications to Uncertainty Quantification

A Sagiv - arXiv preprint arXiv:1902.05451, 2019 - arxiv.org

In the study of dynamical and physical systems, the input parameters are often uncertain or randomly distributed according to a measure $\varrho $. The system's response $ f $ pushes forward $\varrho $ to a new measure $ f\circ\varrho $ which we would like to study. However …

Related articles All 2 versions  Zbl 07327467


2019 see 2020
Multi-view Wasserstein discriminant analysis with entropic regularized Wasserstein distance

Author:笠井 裕之
Article
Publication:映像情報メディア学会技術報告 = ITE technical report., 43, 2019-12, 117
Article, 2019
Publication:ESAIM - Control, Optimisation and Calculus of Variations, 25, 2019
Publisher:2019


[PDF] sciencedirect.com

Distributionally Robust Chance-Constraint Optimal Power Flow Considering Uncertain Renewables with Wasserstein-Moment Metric

J Liu, Y Chen, C Duan, J Lyu - Energy Procedia, 2019 - Elsevier

Chance-constraint optimal power flow has been proven as an efficient method to manage the risk of volatile renewable energy sources. To address the uncertainties of renewable energy sources, a novel distributionally robust chance-constraint OPF model is proposed in …

Related articles


[PDF] arxiv.org

Using Wasserstein Generative Adversarial Networks for the Design of Monte Carlo Simulations

S Athey, GW Imbens, J Metzger, EM Munro - 2019 - nber.org

When researchers develop new econometric methods it is common practice to compare the performance of the new methods to those of existing methods in Monte Carlo studies. The credibility of such Monte Carlo studies is often limited because of the freedom the researcher …

book

<-—2019—————2019————370 —



Orthogonal Wasserstein GANs

Authors:Müller, Jan (Creator), Klein, Reinhard (Creator), Weinmann, Michael (Creator)
Summary:Wasserstein-GANs have been introduced to address the deficiencies of generative adversarial networks (GANs) regarding the problems of vanishing gradients and mode collapse during the training, leading to improved convergence behaviour and improved image quality. However, Wasserstein-GANs require the discriminator to be Lipschitz continuous. In current state-of-the-art Wasserstein-GANs this constraint is enforced via gradient norm regularization. In this paper, we demonstrate that this regularization does not encourage a broad distribution of spectral-values in the discriminator weights, hence resulting in less fidelity in the learned distribution. We therefore investigate the possibility of substituting this Lipschitz constraint with an orthogonality constraint on the weight matrices. We compare three different weight orthogonalization techniques with regards to their convergence properties, their ability to ensure the Lipschitz condition and the achieved quality of the learned distribution. In addition, we provide a comparison to Wasserstein-GANs trained with current state-of-the-art methods, where we demonstrate the potential of solely using orthogonality-based regularization. In this context, we propose an improved training procedure for Wasserstein-GANs which utilizes orthogonalization to further increase its generalization capability. Finally, we provide a novel metric to evaluate the generalization capabilities of the discriminators of different Wasserstein-GANs.
Downloadable Archival Material, 2019-11-29
Undefined
Publisher:2019-11-29
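
As a concrete (hypothetical) illustration of the kind of constraint discussed in this record, one common soft variant of weight orthogonalization penalises the deviation of each weight matrix from having orthonormal columns; the exact variants compared in the paper may differ.

```python
import numpy as np

def orthogonality_penalty(W):
    """One common soft orthogonality regularizer (an assumption here, not
    necessarily the paper's variant): squared Frobenius deviation of W^T W
    from the identity, added to the discriminator loss with a small weight."""
    WtW = W.T @ W
    return np.linalg.norm(WtW - np.eye(WtW.shape[0]), ord="fro") ** 2

W = np.random.default_rng(0).normal(size=(128, 64)) / np.sqrt(128)
print(orthogonality_penalty(W))
```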


Structure preserving discretization and approximation of gradient flows in Wasserstein-like space

S Plazotta - 2019 - mediatum.ub.tum.de

This thesis investigates structure-preserving, temporal semi-discretizations and approximations for PDEs with gradient flow structure with the application to evolution problems in the L²-Wasserstein space. We investigate the variational formulation of the time …

Dissertation/Thesis


[PDF] arxiv.org

A convergent Lagrangian discretization for p-Wasserstein and flux-limited diffusion equations

O Junge, B Söllner - arXiv preprint arXiv:1906.01321, 2019 - arxiv.org

We study a Lagrangian numerical scheme for solution of a nonlinear drift diffusion equation of the form $\partial_t u=\partial_x (u\cdot c [\partial_x (h^\prime (u)+ v)]) $ on an interval. This scheme will consist of a spatio-temporal discretization founded in the formulation of the …

Cited by 2 Related articles 

[CITATION] A convergent Lagrangian discretization for p-Wasserstein and flux-limited diffusion equations

O Junge, B Söllner - arXiv preprint arXiv:1906.01321, 2019


[PDF] semanticscholar.org

[PDF] An LP-based, Strongly Polynomial 2-Approximation Algorithm for Sparse Wasserstein Barycenters

S Borgwardt - pdfs.semanticscholar.org

Wasserstein barycenters correspond to optimal solutions of transportation problems for several marginals, which arise in a wide range of fields. In many applications, data is given as a set of probability measures with finite support. The discrete barycenters in this setting …

Related articles

[PDF] arxiv.org

Temporal Wasserstein non-negative matrix factorization for non-rigid motion segmentation and spatiotemporal deconvolution

E Varol, A Nejatbakhsh, C McGrory - arXiv preprint arXiv:1912.03463, 2019 - arxiv.org

Motion segmentation for natural images commonly relies on dense optic flow to yield point trajectories which can be grouped into clusters through various means including spectral clustering or minimum cost multicuts. However, in biological imaging scenarios, such as …

Cited by 3 Related articles All 5 versions

[PDF] arxiv.org

Quantitative stability of optimal transport maps and linearization of the 2-Wasserstein space

Q Mérigot, A Delalande, F Chazal - arXiv preprint arXiv:1910.05954, 2019 - arxiv.org

This work studies an explicit embedding of the set of probability measures into a Hilbert space, defined using optimal transport maps from a reference probability density. This embedding linearizes to some extent the 2-Wasserstein space, and enables the direct use of …


[PDF] arxiv.org

A general solver to the elliptical mixture model through an approximate Wasserstein manifold

S Li, Z Yu, M Xiang, D Mandic - arXiv preprint arXiv:1906.03700, 2019 - arxiv.org

This paper studies the problem of estimation for general finite mixture models, with a particular focus on the elliptical mixture models (EMMs). Instead of using the widely adopted Kullback-Leibler divergence, we provide a stable solution to the EMMs that is robust to …

Cited by 1 Related articles


[PDF] projecteuclid.org

Behavior of the empirical Wasserstein distance in $\mathbb{R}^d$ under moment conditions

J Dedecker, F Merlevède - Electronic Journal of Probability, 2019 - projecteuclid.org

We establish some deviation inequalities, moment bounds and almost sure results for the Wasserstein distance of order $p \in [1,\infty)$ between the empirical measure of independent and identically distributed $\mathbb{R}^d$-valued random variables and the …

Cited by 2 Related articles All 9 versions 
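The Dedecker and Merlevède entry concerns deviation inequalities and moment bounds for the empirical Wasserstein distance. A toy numerical illustration of the quantity being bounded (my own sketch; the large reference sample is only a surrogate for the true law):

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
reference = rng.standard_normal(200_000)   # stands in for the true distribution
for n in [100, 1_000, 10_000]:
    sample = rng.standard_normal(n)        # empirical measure of n i.i.d. draws
    print(n, wasserstein_distance(sample, reference))
```

The printed distances shrink as n grows, which is the kind of decay the paper quantifies under moment conditions.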


Convergence rates of the blocked Gibbs sampler with random scan in the Wasserstein metric

NY Wang, G Yin - Stochastics, 2019 - Taylor & Francis

To approximate μ, various scan Gibbs samplers with updating blocks are often used [J. Besag, P. Green, D. Higdon, and K. Mengersen, Bayesian computation and stochastic systems, Statist. Sci. 10(1) (1995), pp. 3–41] …

Related articles

MR4067882  Prelim Wang, Neng-Yi; Yin, Guosheng; Convergence rates of the blocked Gibbs sampler with random scan in the Wasserstein metric. Stochastics 92 (2020), no. 2, 265–274.

 Zbl 07553630   Zbl 1490.60008


[PDF] arxiv.org

Tractable Reformulations of Distributionally Robust Two-stage Stochastic Programs with ∞-Wasserstein Distance

W Xie - arXiv preprint arXiv:1908.08454, 2019 - arxiv.org

In the optimization under uncertainty, decision-makers first select a wait-and-see policy before any realization of uncertainty and then place a here-and-now decision after the uncertainty has been observed. Two-stage stochastic programming is a popular modeling …
[PDF] researchgate.net

[PDF] Tractable reformulations of distributionally robust two-stage stochastic programs with ∞-Wasserstein distance

W Xie - arXiv preprint arXiv:1908.08454, 2019 - researchgate.net

… distance as τ ∞. Different types of Wasserstein ambiguity set might provide different tractable results … (2018a), it still exhibits attractive convergent properties. The discussions on advantages of Wasserstein ambiguity sets can be found in Mohajerin Esfa …

  Cited by 8 Related articles All 2 versions 

<——2019—————— 2019—————— 380 —

     

A Semi-Supervised Wasserstein Generative Adversarial Network for Classifying Driving Fatigue from EEG signals

S Panwar, P Rad, J Quarles, E Golob… - … on Systems, Man and …, 2019 - ieeexplore.ieee.org

Predicting driver's cognitive states using deep learning from electroencephalography (EEG) signals is considered in this paper. To address the challenge posed by limited labeled training samples, a semi-supervised Wasserstein Generative Adversarial Network with gradient …

Cited by 3 Related articles


[PDF] mdpi.com

Multi-Turn Chatbot Based on Query-Context Attentions and Dual Wasserstein Generative Adversarial Networks

J Kim, S Oh, OW Kwon, H Kim - Applied Sciences, 2019 - mdpi.com

To generate proper responses to user queries, multi-turn chatbot models should selectively consider dialogue histories. However, previous chatbot models have simply concatenated or averaged vector representations of all previous utterances without considering contextual …

Cited by 8 Related articles All 4 versions

[PDF] arxiv.org

Distributionally Robust XVA via Wasserstein Distance Part 1: Wrong Way Counterparty Credit Risk

D Singh, S Zhang - arXiv preprint arXiv:1910.01781, 2019 - arxiv.org

This paper investigates calculations of robust CVA for OTC derivatives under distributional uncertainty using Wasserstein distance as the ambiguity measure. Wrong way counterparty credit risk can be characterized (and indeed quantified) via the robust CVA formulation. The …

[CITATION] Distributionally robust xva via wasserstein distance part 1

D Singh, S Zhang - arXiv preprint arXiv:1910.01781, 2019

Cited by 2 Related articles


A Hyperspectral Image Classification Method Based on Multi-Discriminator Generative Adversarial Networks

https://www.ncbi.nlm.nih.gov › articles › PMC6696272

by H Gao · 2019 · Cited by 11 — At present, deep learning has become an important method for studying image processing


[PDF] arxiv.org

Wasserstein Index Generation Model: Automatic generation of time-series index with application to Economic Policy Uncertainty

F Xie - Economics Letters, 2019 - Elsevier

… Economics Letters. Wasserstein Index Generation Model: Automatic generation of time-series index with application to Economic Policy Uncertainty … Recently, Shiller (2017) has called for more attention in collecting and analyzing text data of economic interest. The WIG model …


[PDF] arxiv.org

Rate of convergence in Wasserstein distance of piecewise-linear Lévy-driven SDEs

A Arapostathis, G Pang, N Sandrić - arXiv preprint arXiv:1907.05250, 2019 - arxiv.org

In this paper, we study the rate of convergence under the Wasserstein metric of a broad class of multidimensional piecewise Ornstein-Uhlenbeck processes with jumps. These are governed by stochastic differential equations having a piecewise linear drift, and a fairly …


2019 

Aero-Engine Faults Diagnosis Based on K-Means Improved Wasserstein GAN and Relevant Vector Machine

Z Zhao, R Zhou, Z Dong - 2019 Chinese Control Conference …, 2019 - ieeexplore.ieee.org

The aero-engine faults diagnosis is essential to the safety of the long-endurance aircraft. The problem of fault diagnosis for aero-engines is essentially a sort of model classification problem. Due to the difficulty of the engine faults modeling, a data-driven approach is used …

Cited by 6 Related articles

[PDF] arxiv.org

Normalized Wasserstein Distance for Mixture Distributions with Applications in Adversarial Learning and Domain Adaptation

Y Balaji, R Chellappa, S Feizi - arXiv preprint arXiv:1902.00415, 2019 - arxiv.org

Understanding proper distance measures between distributions is at the core of several learning tasks such as generative models, domain adaptation, clustering, etc. In this work, we focus on {\it mixture distributions} that arise naturally in several application domains …

Cited by 9 Related articles All 2 versions

Wasserstein Proximal Algorithms for the Schrödinger Bridge Problem: Density Control with Nonlinear Drift

Authors:Caluya, Kenneth F. (Creator), Halder, Abhishek (Creator)
Summary:We study the Schrödinger bridge problem (SBP) with nonlinear prior dynamics. In control-theoretic language, this is a problem of minimum effort steering of a given joint state probability density function (PDF) to another over a finite time horizon, subject to a controlled stochastic differential evolution of the state vector. For generic nonlinear drift, we reduce the SBP to solving a system of forward and backward Kolmogorov partial differential equations (PDEs) that are coupled through the boundary conditions, with unknowns being the "Schrödinger factors" -- so named since their product at any time yields the optimal controlled joint state PDF at that time. We show that if the drift is a gradient vector field, or is of mixed conservative-dissipative nature, then it is possible to transform these PDEs into a pair of initial value problems (IVPs) involving the same forward Kolmogorov operator. Combined with a recently proposed fixed point recursion that is contractive in the Hilbert metric, this opens up the possibility to numerically solve the SBPs in these cases by computing the Schrödinger factors via a single IVP solver for the corresponding (uncontrolled) forward Kolmogorov PDE. The flows generated by such forward Kolmogorov PDEs, for the two aforementioned types of drift, in turn, enjoy gradient descent structures on the manifold of joint PDFs with respect to suitable distance functionals. We employ a proximal algorithm developed in our prior work, that exploits this geometric viewpoint, to solve these IVPs and compute the Schrödinger factors via weighted scattered point cloud evolution in the state space. We provide the algorithmic details and illustrate the proposed framework of solving the SBPs with nonlinear prior dynamics by numerical examples.
Downloadable Archival Material, 2019-12-03
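The summary mentions a fixed-point recursion, contractive in the Hilbert metric, for computing the Schrödinger factors. A minimal discrete-state analogue of that recursion (my own sketch, not the authors' PDE-based proximal algorithm) is the classical Fortet/IPF scaling: given a prior Markov kernel and the two prescribed marginals, alternately rescale until both marginals match.

```python
import numpy as np

def discrete_schroedinger_bridge(K, rho0, rho1, n_iter=500):
    """Fortet/IPF iteration for a discrete Schroedinger bridge: find the coupling
    closest in KL divergence to the prior joint diag(rho0) @ K whose marginals
    are rho0 and rho1 (hypothetical helper, assumes a strictly positive kernel K)."""
    prior = rho0[:, None] * K                 # prior joint distribution
    u = np.ones_like(rho0)
    v = np.ones_like(rho1)
    for _ in range(n_iter):
        u = rho0 / (prior @ v)                # enforce the initial marginal
        v = rho1 / (prior.T @ u)              # enforce the terminal marginal
    return u[:, None] * prior * v[None, :]

rng = np.random.default_rng(0)
K = rng.random((4, 4)); K /= K.sum(axis=1, keepdims=True)   # toy prior kernel
rho0 = np.full(4, 0.25)
rho1 = np.array([0.1, 0.2, 0.3, 0.4])
pi = discrete_schroedinger_bridge(K, rho0, rho1)
print(pi.sum(axis=1), pi.sum(axis=0))        # approximately rho0 and rho1
```

The paper works in continuous state space, where the analogous factors are propagated by a single forward Kolmogorov IVP solver rather than by matrix-vector products.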


Experimental Analysis of Equilibrization in Binary Classification for Non-Image Imbalanced Data Using Wasserstein GAN

ZY Wang, DK Kang - International Journal of Internet …, 2019 - koreascience.or.kr

In this paper, we explore the details of three classic data augmentation methods and two generative model based oversampling methods. The three classic data augmentation methods are random sampling (RANDOM), Synthetic Minority Over-sampling Technique …

Related articles All 3 versions

<-—2019    ——————— 2019  ———390 —

[PDF] u-bordeaux.fr

Data-driven regularization of Wasserstein barycenters with an application to multivariate density registration

J Bigot, E Cazelles, N Papadakis - Information and Inference: A …, 2019 - academic.oup.com

We present a framework to simultaneously align and smoothen data in the form of multiple point clouds sampled from unknown densities with support in a-dimensional Euclidean space. This work is motivated by applications in bioinformatics where researchers aim to …


Modeling EEG data distribution with a Wasserstein Generative Adversarial Network to predict RSVP Events

S Panwar, P Rad, TP Jung, Y Huang - arXiv preprint arXiv:1911.04379, 2019 - arxiv.org

Electroencephalography (EEG) data are difficult to obtain due to complex experimental setups and reduced comfort with prolonged wearing. This poses challenges to train powerful deep learning model with the limited EEG data. Being able to generate EEG data …


[PDF] arxiv.org

Closed-form Expressions for Maximum Mean Discrepancy with Applications to Wasserstein Auto-Encoders

RM Rustamov - arXiv preprint arXiv:1901.03227, 2019 - arxiv.org

The Maximum Mean Discrepancy (MMD) has found numerous applications in statistics and machine learning, most recently as a penalty in the Wasserstein Auto-Encoder (WAE). In this paper we compute closed-form expressions for estimating the Gaussian kernel based MMD …

Related articles All 2 versions

Cited by 7 Related articles All 2 versions
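For context on the Rustamov entry, the quantity whose closed forms are derived is the Gaussian-kernel MMD used as the penalty in Wasserstein Auto-Encoders. The generic sample-based estimator that those closed forms partly replace looks like the sketch below (my own code; the function name and bandwidth are arbitrary choices):

```python
import numpy as np

def gaussian_mmd2(X, Y, sigma=1.0):
    """Unbiased estimate of squared MMD between samples X (n,d) and Y (m,d)
    with kernel k(x,y) = exp(-||x-y||^2 / (2 sigma^2))."""
    def kmat(A, B):
        sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
        return np.exp(-sq / (2.0 * sigma**2))
    n, m = len(X), len(Y)
    Kxx, Kyy, Kxy = kmat(X, X), kmat(Y, Y), kmat(X, Y)
    term_x = (Kxx.sum() - np.trace(Kxx)) / (n * (n - 1))
    term_y = (Kyy.sum() - np.trace(Kyy)) / (m * (m - 1))
    return term_x + term_y - 2.0 * Kxy.mean()

rng = np.random.default_rng(0)
print(gaussian_mmd2(rng.normal(0, 1, (500, 2)), rng.normal(0.5, 1, (500, 2))))
```

The paper's contribution is to avoid part of this Monte Carlo averaging in closed form when the reference distribution, such as the WAE prior, has a convenient parametric form.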


Optimal Estimation of Wasserstein Distance on A Tree with An Application to Microbiome Studies

S Wang, TT Cai, H Li - Journal of the American Statistical …, 2019 - Taylor & Francis

The weighted UniFrac distance, a plug-in estimator of the Wasserstein distance of read counts on a tree, has been widely used to measure the microbial community difference in microbiome studies. Our investigation however shows that such a plug-in estimator …


[PDF] arxiv.org

Improved Concentration Bounds for Conditional Value-at-Risk and Cumulative Prospect Theory using Wasserstein distance

SP Bhat - arXiv preprint arXiv:1902.10709, 2019 - arxiv.org

Known finite-sample concentration bounds for the Wasserstein distance between the empirical and true distribution of a random variable are used to derive a two-sided concentration bound for the error between the true conditional value-at-risk (CVaR) of a …

Related articles   All 2 versions


2019

Misfit function for full waveform inversion based on the Wasserstein metric with dynamic formulation

P Yong, W Liao, J Huang, Z Li, Y Lin - Journal of Computational Physics, 2019 - Elsevier

Conventional full waveform inversion (FWI) using least square distance (L 2 norm) between the observed and predicted seismograms suffers from local minima. Recently, the Wasserstein metric (W 1 metric) has been introduced to FWI to compute the misfit between …

Misfit function for full waveform inversion based on the Wasserstein metric with dynamic formulation

Dec 15, 2019 - Conventional full waveform inversion (FWI) using the least-squares distance (L2 norm) between the observed and predicted seismograms suffers from local minima. Recently, the Wasserstein metric (W1 metric) has been introduced to FWI to compute the misfit between two seismograms.

by P Yong - ‎2019 - ‎Cited by 1 - ‎Related articles

Studies from China University of Petroleum (East China) Reveal New Findings on Computational Physics

(Misfit Function for Full Waveform Inversion Based On the Wasserstein... 

Journal of Physics Research, 12/2019

NewsletterFull Text Online 

Physics - Computational Physics; Studies from China University of Petroleum (East China) Reveal New Findings on Computational Physics (Misfit Function for Full Waveform Inversion Based On the Wasserstein... 

Physics Week, Dec 17, 2019, 989

Newspaper ArticleFull Text Online 

Cited by 7 Related articles All 2 versions
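The Yong et al. entry uses the W1 metric as a full-waveform-inversion misfit. In one dimension W1 reduces to the L1 distance between cumulative distribution functions, which makes the misfit very cheap once the traces are turned into probability densities. A rough sketch (my own; the shift-and-normalize positivity trick is one common choice, not necessarily the authors' exact transform):

```python
import numpy as np

def w1_misfit(obs, syn, dt):
    """1-D Wasserstein (W1) misfit between two traces: integral of |CDF difference|."""
    def to_pmf(trace):
        p = trace - trace.min() + 1e-12      # make the trace nonnegative
        return p / p.sum()                   # normalize to unit mass
    P, Q = np.cumsum(to_pmf(obs)), np.cumsum(to_pmf(syn))
    return dt * np.sum(np.abs(P - Q))

t = np.linspace(0.0, 1.0, 500)
obs = np.exp(-((t - 0.40) / 0.02) ** 2)      # toy wavelets shifted in time
syn = np.exp(-((t - 0.45) / 0.02) ** 2)
print(w1_misfit(obs, syn, t[1] - t[0]))
```

Unlike the L2 misfit, this distance keeps growing smoothly as the time shift between the wavelets increases, which is the property usually invoked against cycle-skipping local minima.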


BIRCH Algorithm and Wasserstein Distance Metric Based Method for Generating Typical Scenarios of Wind Power Outputs

Q Li, X Tang, C Chen, X Liu, S Liu, X Shi… - … -Asia (ISGT Asia), 2019 - ieeexplore.ieee.org

With the ever-increasing penetration of renewable energy generation such as wind power and solar photovoltaics, the power system concerned is suffering more extensive and significant uncertainties. Scenario analysis has been utilized to solve this problem for power …

Cited by 1 Related articles All 2 versions

Unsupervised feature extraction based on improved Wasserstein generative adversarial network for hyperspectral classification

Q Sun, S Bourennane - Multimodal Sensing: Technologies …, 2019 - spiedigitallibrary.org

Accurate classification is one of the most important prerequisites for hyperspectral

applications and feature extraction is the key step of classification. Recently, deep learning

models have been successfully used to extract the spectral-spatial features in hyperspectral …

Related articles All 4 versions

[PDF] dpi-proceedings.com

Data augmentation in fault diagnosis based on the Wasserstein generative adversarial network with gradient penalty

X Gao, F Deng, X Yue - Neurocomputing, 2019 - Elsevier

Fault detection and diagnosis in industrial process is an extremely essential part to keep away from undesired events and ensure the safety of operators and facilities. In the last few decades various data based machine learning algorithms have been widely studied to …

Cited by 7 Related articles

Computation - Neural Computation; Studies from Beijing Institute of Technology Provide New Data on Neural Computation (Data Augmentation In Fault Diagnosis Based On the Wasserstein... 

Journal of robotics & machine learning, Jul 6, 2020, 500

Newspaper ArticleFull Text Online
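Several entries in this block rely on the gradient-penalty variant of the WGAN (WGAN-GP). As a generic illustration of that penalty term (my own sketch of the standard recipe, not the networks used in the paper; it assumes flat feature vectors rather than images):

```python
import torch

def gradient_penalty(critic, real, fake, lam=10.0):
    """Standard WGAN-GP term: push the critic's gradient norm toward 1 on random
    interpolates between a real batch and a fake batch of shape (batch, features)."""
    eps = torch.rand(real.size(0), 1, device=real.device)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads = torch.autograd.grad(outputs=scores, inputs=interp,
                                grad_outputs=torch.ones_like(scores),
                                create_graph=True)[0]
    grad_norm = grads.view(grads.size(0), -1).norm(2, dim=1)
    return lam * ((grad_norm - 1.0) ** 2).mean()

# usage sketch:
#   loss_c = critic(fake).mean() - critic(real).mean() + gradient_penalty(critic, real, fake)
```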

[PDF] arxiv.org

Wasserstein stability of porous medium-type equations on manifolds with Ricci curvature bounded below

N De Ponti, M Muratori, C Orrieri - arXiv preprint arXiv:1908.03147, 2019 - arxiv.org

Given a complete, connected Riemannian manifold $\mathbb{M}^n$ with Ricci curvature bounded from below, we discuss the stability of the solutions of a porous medium-type equation with respect to the 2-Wasserstein distance. We produce (sharp) stability estimates …
Cited by 1
 Related articles All 3 versions 

<-—2019    —— 2019  ————400 —



[PDF] rit.edu

A Comparative Assessment of the Impact of Various Norms on Wasserstein Generative Adversarial Networks

C Ramesh - 2019 - scholarworks.rit.edu

Abstract Generative Adversarial Networks (GANs) provide a fascinating new paradigm in machine learning and artificial intelligence, especially in the context of unsupervised learning. GANs are quickly becoming a state of the art tool, used in various applications …

Related articles 

A Comparative Assessment of the Impact of Various Norms on Wasserstein Generative Adversarial Networks

books.google.com › books

Chandini Ramesh · 2019 · ‎No preview

"Generative Adversarial Networks (GANs) provide a fascinating new paradigm in machine learning and artificial intelligence, especially in the context of unsupervised learning.
A comparative assessment of the impact of various norms on Wasserstein generative adversarial networks
  thesis

A comparative assessment of the impact of various norms on Wasserstein generative adversarial networks
 thesis

[PDF] On the rate of convergence of empirical measure in ∞-Wasserstein distance for unbounded density function

A Liu, Y Lu - Quarterly of Applied Mathematics, 2019 - services.math.duke.edu

We consider a sequence of identically and independently distributed random samples from an absolutely continuous probability measure in one dimension with unbounded density. We establish a new rate of convergence of the ∞-Wasserstein distance between the empirical …

Related articles All 3 versions


[PDF] arxiv.org

2-Wasserstein Approximation via Restricted Convex Potentials with Application to Improved Training for GANs

A Taghvaei, A Jalali - arXiv preprint arXiv:1902.07197, 2019 - arxiv.org

We provide a framework to approximate the 2-Wasserstein distance and the optimal transport map, amenable to efficient training as well as statistical and geometric analysis. With the quadratic cost and considering the Kantorovich dual form of the optimal …

Cited by 1 Related articles All 2 versions


[PDF] archives-ouvertes.fr

Diffusive processes on the Wasserstein space: Coalescing models, Regularization properties and McKean-Vlasov equations

V Marx - 2019 - tel.archives-ouvertes.fr

The aim of this thesis is to study a class of diffusive stochastic processes with values in the space of probability measures on the real line, called Wasserstein space if it is endowed with the Wasserstein metric W_2. The following issues are mainly addressed in this work …

  Cited by 2 Related articles All 19 versions 
Diffusive processes on the Wasserstein space

[PDF] arxiv.org

A Conditional Wasserstein Generative Adversarial Network for Pixel-level Crack Detection using Video Extracted Images

Q Mei, M Gül - arXiv preprint arXiv:1907.06014, 2019 - arxiv.org

Automatic crack detection on pavement surfaces is an important research field in the scope of developing an intelligent transportation infrastructure system. In this paper, a novel method on the basis of conditional Wasserstein generative adversarial network (cWGAN) is …


[PDF] arxiv.org

Denoising of 3D magnetic resonance images using a residual encoder–decoder Wasserstein generative adversarial network

M Ran, J Hu, Y Chen, H Chen, H Sun, J Zhou… - Medical image …, 2019 - Elsevier

Abstract Structure-preserved denoising of 3D magnetic resonance imaging (MRI) images is a critical step in medical image analysis. Over the past few years, many algorithms with impressive performances have been proposed. In this paper, inspired by the idea of deep …

Cited by 62 Related articles All 7 versions


Deep joint two-stream Wasserstein auto-encoder and selective attention alignment for unsupervised domain adaptation

Z Chen, C Chen, X Jin, Y Liu, Z Cheng - Neural Computing and Applications - Springer

Abstract Domain adaptation refers to the process of utilizing the labeled source domain data to learn a model that can perform well in the target domain with limited or missing labels. Several domain adaptation methods combining image translation and feature alignment …

Cited by 13 Related articles


[PDF] arxiv.org

Wasserstein F-tests and Confidence Bands for the Fréchet Regression of Density Response Curves

A Petersen, X Liu, AA Divani - arXiv preprint arXiv:1910.13418, 2019 - arxiv.org

Data consisting of samples of probability density functions are increasingly prevalent, necessitating the development of methodologies for their analysis that respect the inherent nonlinearities associated with densities. In many applications, density curves appear as …

[v1] Tue, 29 Oct 2019 17:30:57 UTC (393 KB)

[v2] Wed, 22 Jul 2020 16:37:19 UTC (393 KB)

2019  [PDF] arxiv.org

Illegible Text to Readable Text: An Image-to-Image Transformation using Conditional Sliced Wasserstein Adversarial Networks

M Karimi, G Veni, YY Yu - arXiv preprint arXiv:1910.05425, 2019 - arxiv.org

Automatic text recognition from ancient handwritten record images is an important problem in the genealogy domain. However, critical challenges such as varying noise conditions, vanishing texts, and variations in handwriting make the recognition task difficult. We tackle …

Related articles All 4 versions

[PDF] arxiv.org

Wasserstein Proximal Algorithms for the Schrödinger Bridge Problem: Density Control with Nonlinear Drift

KF Caluya, A Halder - arXiv preprint arXiv:1912.01244, 2019 - arxiv.org

We study the Schrödinger bridge problem (SBP) with nonlinear prior dynamics. In control-theoretic language, this is a problem of minimum effort steering of a given joint state probability density function (PDF) to another over a finite time horizon, subject to a controlled …

12/2019 Journal Article:  Full Text Online 

Cited by 3 Related articles All 4 versions 

<——2019—————— 2019———————410 —


A Virtual Monochromatic Imaging Method for Spectral CT Based on Wasserstein Generative Adversarial Network With a Hybrid Loss

Z Shi, J Li, H Li, Q Hu, Q Cao - IEEE Access, 2019 - ieeexplore.ieee.org

Spectral computed tomography (CT) has become a popular clinical diagnostic technique because of its unique advantage in material distinction. Specifically, it can perform virtual monochromatic imaging to obtain accurate tissue composition with less beam hardening …

 Cited by 9 Related articles All 2 versions

Findings from Tianjin University Update Understanding of Engineering (A Virtual Monochromatic Imaging Method for Spectral Ct Based On Wasserstein... 

Journal of Engineering, 09/2019

NewsletterFull Text Online

Sensitivity of the Compliance and of the Wasserstein Distance with Respect to a Varying Source

G Bouchitté, I Fragalà, I Lucardesi - Applied Mathematics & Optimization, 2019 - Springer

We show that the compliance functional in elasticity is differentiable with respect to horizontal variations of the load term, when the latter is given by a possibly concentrated measure; moreover, we provide an integral representation formula for the derivative as a …

Related articles All 10 versions


[PDF] arxiv.org

Local well-posedness in the Wasserstein space for a chemotaxis model coupled to Navier-Stokes equations

K Kang, HK Kim - arXiv preprint arXiv:1907.01895, 2019 - arxiv.org

We consider a coupled system of Keller-Segel type equations and the incompressible Navier-Stokes equations in spatial dimension two and three. In the previous work [19], we established the existence of a weak solution of a Fokker-Planck equation in the Wasserstein space using the optimal transportation technique. Exploiting this result, we constructed solutions of Keller-Segel-Navier-Stokes equations such that the density of biological organism belongs to the absolutely continuous curves in the Wasserstein space.


Related articles

[PDF] archives-ouvertes.fr

A Wasserstein norm for signed measures, with application to non local transport equation with source term

B Piccoli, F Rossi, M Tournus - 2019 - hal.archives-ouvertes.fr

We introduce the optimal transportation interpretation of the Kantorovich norm on the space of signed Radon measures with finite mass, based on a generalized Wasserstein distance for measures with different masses. With the formulation and the new topological properties …

Cited by 12 Related articles All 31 versions


[PDF] arxiv.org

Stein's method for normal approximation in Wasserstein distances with application to the multivariate Central Limit Theorem

T Bonis - arXiv preprint arXiv:1905.13615, 2019 - arxiv.org

We use Stein's method to bound the Wasserstein distance of order $2$ between a measure $\nu$ and the Gaussian measure using a stochastic process $(X_t)_{t\geq 0}$ such that $X_t$ is drawn from $\nu$ for any $t > 0$. If the stochastic process $(X_t)_{t\geq 0}$ …

Cited by 1 Related articles


2019

Prioritized Experience Replay based on the Wasserstein Metric in Deep Reinforcement Learning: The regularizing effect of modelling return distributions

T Greevink - 2019 - repository.tudelft.nl

This thesis tests the hypothesis that distributional deep reinforcement learning (RL) algorithms get an increased performance over expectation based deep RL because of the regularizing effect of fitting a more complex model. This hypothesis was tested by comparing two variations of the distributional QR-DQN algorithm combined with prioritized experience replay. The first variation, called QR-W, prioritizes learning the return distributions. The second one, QR-TD, prioritizes learning the Q-Values. These algorithms were tested with …


A wasserstein gradient-penalty generative adversarial network with deep auto-encoder for bearing intelligent fault diagnosis

X Xiong, H Jiang, X Li, M Niu - Measurement Science and …, 2019 - iopscience.iop.org

It is a great challenge to manipulate unbalanced fault data, in the field of rolling bearings intelligent fault diagnosis. In this paper, a novel intelligent fault diagnosis method called wasserstein gradient-penalty generative adversarial network (WGGAN) with deep auto …


[PDF] biorxiv.org

[PDF] De Novo Protein Design for Novel Folds using Guided Conditional Wasserstein Generative Adversarial Networks (gcWGAN)(Supporting Information)

M Karimi, S Zhu, Y Cao, Y Shen - Small - biorxiv.org

2.1 Methods Using a representative protein structure chosen by SCOPe for each of the 1,232 folds, we construct a pairwise similarity matrix of symmetrized TM scores (Zhang and Skolnick, 2004) and added a properly-scaled identity matrix to it to make a positive-definite …

 [PDF] biorxiv.org

De Novo Protein Design for Novel Folds using Guided Conditional Wasserstein Generative Adversarial Networks (gcWGAN)

M Karimi, S Zhu, Y Cao, Y Shen - bioRxiv, 2019 - biorxiv.org

Motivation Facing data quickly accumulating on protein sequence and structure, this study is addressing the following question: to what extent could current data alone reveal deep insights into the sequence-structure relationship, such that new sequences can be designed …

  Cited by 6 Related articles All 4 versions View as HTML 


[PDF] mdpi.com

Data-Driven Distributionally Robust Stochastic Control of Energy Storage for Wind Power Ramp Management Using the Wasserstein Metric

I Yang - Energies, 2019 - mdpi.com

The integration of wind energy into the power grid is challenging because of its variability, which causes high ramp events that may threaten the reliability and efficiency of power systems. In this paper, we propose a novel distributionally robust solution to wind power …

 Cited by 4 Related articles All 6 versions 

Person-independent facial expression recognition method based on improved Wasserstein generative adversarial networks in combination with identity aware

C Xu, Y Cui, Y Zhang, P Gao, J Xu - Multimedia Systems, 2019 - Springer

Since the distinction between two expressions is fairly vague, usually a subtle change in one part of the human face is enough to change a facial expression. Most of the existing facial expression recognition algorithms are not robust enough because they rely on general facial …

<——2019    ——————— 2019  ————420—


 Gait recognition based on Wasserstein generating adversarial image inpainting network

L Xia, H Wang, W Guo - Journal of Central South University, 2019 - Springer

Aiming at the problem of small area human occlusion in gait recognition, a method based on generating adversarial image inpainting network was proposed which can generate a context consistent image for gait occlusion area. In order to reduce the effect of noise on …

Gait recognition based on Wasserstein generating adversarial image inpainting network 

By: Xia Li-min; Wang Hao; Guo Wei-ting 

JOURNAL OF CENTRAL SOUTH UNIVERSITY   Volume: ‏ 26   Issue: ‏ 10   Pages: ‏ 2759-2770   Published: ‏ OCT 2019

[PDF] ieee.org

A Deep Transfer Model With Wasserstein Distance Guided Multi-Adversarial Networks for Bearing Fault Diagnosis Under Different Working Conditions

M Zhang, D Wang, W Lu, J Yang, Z Li, B Liang - IEEE Access, 2019 - ieeexplore.ieee.org

In recent years, intelligent fault diagnosis technology with the deep learning algorithm has been widely used in the manufacturing industry for substituting time-consuming human analysis method to enhance the efficiency of fault diagnosis. The rolling bearing as the …

Cited by 59 Related articles All 6 versions

Received April 17, 2019, accepted May 5, 2019, date of publication May 14, 2019, date of current version June 3, 2019.


New Colonic Polyps Study Findings Reported from University of Coimbra (Unsupervised Segmentation of Colonic Polyps In Narrow-band Imaging Data Based On Manifold Representation of Images and Wasserstein Distance).

Health & Medicine Week, 10/2019  Newsletter:  Full Text Online 

Digestive System Diseases and Conditions - Colonic Polyps; New Colonic Polyps Study Findings Reported from University of Coimbra 

(Unsupervised Segmentation of Colonic Polyps In Narrow-band Imaging Data Based On Manifold Representation of Images and Wasserstein... 

Information Technology Newsweekly, Oct 8, 2019, 397

Newspaper ArticleFull Text Online 

Unsupervised segmentation of colonic polyps in narrow-band imaging data based on manifold representation of images and Wasserstein distance

IN Figueiredo, L Pinto, PN Figueiredo, R Tsai - … Signal Processing and …, 2019 - Elsevier

Colorectal cancer (CRC) is one of the most common cancers worldwide and after a certain age (≥ 50) regular colonoscopy examination for CRC screening is highly recommended. One of the most prominent precursors of CRC are abnormal growths known as polyps. If a …

 Related articles All 4 versions



Precise Simulation of Electromagnetic Calorimeter Showers Using a Wasserstein Generative Adversarial Network
Authors:Martin Erdmann, Jonas Glombitza, Thorben Quast
Summary:Simulations of particle showers in calorimeters are computationally time-consuming, as they have to reproduce both energy depositions and their considerable fluctuations. A new approach to ultra-fast simulations is generative models where all calorimeter energy depositions are generated simultaneously. We use GEANT4 simulations of an electron beam impinging on a multi-layer electromagnetic calorimeter for adversarial training of a generator network and a critic network guided by the Wasserstein distance. The generator is constrained during the training such that the generated showers show the expected dependency on the initial energy and the impact position. It produces realistic calorimeter energy depositions, fluctuations and correlations which we demonstrate in distributions of typical calorimeter observables. In most aspects, we observe that generated calorimeter showers reach the level of showers as simulated with the GEANT4 program.
Article, 2019
Publication:Computing and Software for Big Science, 3, 201912, 1
Publisher:2019


Tackling Algorithmic Bias in Neural-Network Classifiers using Wasserstein-2 Regularization

L Risser, Q Vincenot, JM Loubes - arXiv e-prints, 2019 - ui.adsabs.harvard.edu

The increasingly common use of neural network classifiers in industrial and social applications of image analysis has allowed impressive progress these last years. Such methods are however sensitive to algorithmic bias, i.e., to an under- or an over-representation …


2019  

Music Classification using Multiclass Support Vector Machine and Multilevel Wasserstein Means

J Wei, C Jin, Z Cheng, X Lv… - 2019 IEEE/ACIS 18th …, 2019 - ieeexplore.ieee.org

Music classification is a challenging task in music information retrieval. In this article, we compare the performance of the two types of models. The first category is classified by Support Vector Machine (SVM). We use the feature extraction from audio as the basis of classification. Firstly, a total of 500 pieces of music by five famous classical music composers were selected, 400 of which were regarded as the training set of music genre classification, and the remaining pieces were regarded as the testing set. The second method is Multilevel …

Related articles All 2 versions



[PDF] opticsjournal.net

[PDF] 基于改进型 WGAN 的低剂量 CT 图像去噪方法

徐曾春, 叶超, 杜振龙, 李晓丽 - 光学与光电技术, 2019 - opticsjournal.net

Abstract: To improve the quality of low-dose computed tomography (CT) images, a low-dose CT image denoising method based on an improved Wasserstein generative adversarial network (WGAN-gp) is proposed. WGAN-gp adds a gradient penalty term on top of the WGAN network, which resolves WGAN's problems of difficult training and slow convergence and further improves network performance. A new perceptual loss metric function is also added …

Related articles All 3 versions

[Chinese  Denoising of low-dose CT images based on improved WGAN]
Related articles All 3 versions



基于条件梯度 Wasserstein 生成对抗网络的图像识别

何子庆, 聂红玉, 刘月, 尹洋 - 计算机测量与控制, 2019 - cnki.com.cn

Generative adversarial networks (GAN) are powerful but suffer from slow convergence, unstable training, and insufficient diversity of generated samples. Combining the advantages of the conditional deep convolutional GAN (CDCGAN) and the Wasserstein GAN with gradient penalty (WGAN-GP), this paper proposes a hybrid model, the conditional gradient Wasserstein generative adversarial network CDCWGAN-GP …

[Chinese  Image recognition based on conditional gradient Wasserstein generative adversarial network]


Combining multi-task autoencoder with Wasserstein generative adversarial networks for improving speech recognition performance

CY Kao, H Ko - The Journal of the Acoustical Society of Korea, 2019 - koreascience.or.kr

As the presence of background noise in acoustic signal degrades the performance of speech or acoustic event recognition, it is still challenging to extract noise-robust acoustic features from noisy signal. In this paper, we propose a combined structure of Wasserstein  …

Related articles All 6 versions

<——2019—————— 2019————————430—


基于 Wasserstein GAN 的文档表示模型

马永军, 李亚军, 汪睿, 陈海山 - 计算机工程与科学, 2019 - airitilibrary.com

A document representation model converts unstructured text data into structured data and underlies many natural language processing tasks, but current word-based models cannot directly represent documents. To address this problem, a generative adversarial network (GAN) can use two neural networks for adversarial learning, thereby learning the original data distribution well …

[Chinese  Document representation model based on Wasserstein GAN]


2019

July 23rd 13 Sinkhorn AutoEncoders - YouTube

www.youtube.com › watch

Introduction to the Wasserstein distance ...

YouTube · uai2

Jul 23, 2019


Use of the Wasserstein Metric to Solve the Inverse Dynamic Seismic Problem

AA Vasilenko - Marine Technologies 2019, 2019 - earthdoc.org

The inverse dynamic seismic problem consists in recovering the velocity model of elastic medium based on the observed seismic data. In this work full waveform inversion method is used to solve this problem. It consists in minimizing an objective functional measuring the …


[PDF] researchgate.net

[PDF] Computation of Wasserstein barycenters via the Iterated Swapping Algorithm

G Puccetti, L Rüschendorf, S Vanduffel - 2019 - researchgate.net

In recent years, the Wasserstein barycenter has become an important notion in the analysis of high dimensional data with a broad range of applications in applied probability, economics, statistics and in particular to clustering and image processing. In our paper we …

  Related articles 


基于 Wasserstein 距离分层注意力模型的跨域情感分

杜永萍, 贺萌, 赵晓铮 - 模式识别与人工智能, 2019 - airitilibrary.com

The cross-domain sentiment classification task aims to use source-domain data with known sentiment labels to analyze sentiment polarity in a target domain lacking labeled data. This paper proposes a hierarchical attention model based on the Wasserstein distance: combined with an attention mechanism, a hierarchical model is used for feature extraction, the Wasserstein distance serves as the domain-discrepancy measure, and adversarial training automatically …

[Chinese  Cross-domain sentiment analysis based on Wasserstein distance hierarchical attention model]


Wasserstein 거리를 활용한 분포 강건 신문가판원 모형

이상윤, 김현우, 문일경 - 대한산업공학회 춘계공동학술대회 …, 2019 - portal.dbpia.co.kr

… A distributionally robust newsvendor model using the Wasserstein distance … Proceedings of the 2019 Korean Institute of Industrial Engineers Spring Joint Conference [co-hosted by 3 societies], 2019.4, pp. 172-193 (22 pages)

[Korean  A distributionally robust newsvendor model using the Wasserstein distance]

All 3 versions  

[CITATION] Wasserstein 거리를 활용한 분포 강건 신문가판원 모형

이상윤, 김현우, 문일경 - 한국경영과학회 학술대회논문집, 2019 - dbpia.co.kr


  All 2 versions


[PDF] archives-ouvertes.fr

Courbes et applications optimales à valeurs dans l'espace de Wasserstein

H Lavenant - 2019 - tel.archives-ouvertes.fr

The Wasserstein space is the set of probability measures defined on a fixed domain and endowed with the quadratic Wasserstein distance. In this work, we study variational problems in which the unknowns are maps valued in …

Related articles

Courbes et applications optimales à valeurs dans l'espace de Wasserstein 

by Lavenant, Hugo 

The Wasserstein space is the set of probability measures defined on a fixed domain and endowed with the quadratic Wasserstein distance. In this...

Dissertation/Thesis:  Full Text Online 

[French  Optimal curves and maps valued in the Wasserstein space]

[PDF] Courbes et applications optimales à valeurs dans l'espace de ...

by H Lavenant - 2019 - Related articles

Jun 3, 2019 - 1.1 Optimal transport and Wasserstein distances: a brief historical survey … A.2 Courbes optim…

Related articles All 14 versions


Findings on Landscape Ecology Detailed by Investigators at Guangzhou University (Calculating Spatial Configurational Entropy of a Landscape Mosaic Based On the Wasserstein Metric).

Ecology, Environment & Conservation, 10/2019

Newsletter: Full Text Online 

Document Title: "Ecology Research - Landscape Ecology; Findings on Landscape Ecology Detailed by Investigators at Guangzhou University (Calculating Spatial Configurational Entropy of a Landscape Mosaic Based On the Wasserstein Metric)". Start Page: 564. ISSN: 1945-6492.

Ecology, Environment & Conservation, Oct 11, 2019, 564

Newspaper Article:  Full Text Online 

Findings on Landscape Ecology Detailed by Investigators at Guangzhou University (Calculating Spatial Configurational Entropy of a Landscape Mosaic Based On the Wasserstein... 

Ecology, Environment & Conservation, 10/2019

NewsletterFull Text Online

Studies from Institute of Science and Technology Austria Have Provided New Data on Mathematics (On Isometric Embeddings of Wasserstein... 

Mathematics Week, 12/2019

NewsletterFull Text Online 

Mathematics; Studies from Institute of Science and Technology Austria Have Provided New Data on Mathematics (On Isometric Embeddings of Wasserstein... 

Journal of Mathematics, Dec 17, 2019, 939

Newspaper ArticleFull Text Online 

<——2019——————— 2019———440 — 


Data from University of Bordeaux Provide New Insights into Mathematical Analysis (Penalization of Barycenters In the Wasserstein... 

Mathematics Week, 07/2019

NewsletterFull Text Online 

Mathematics - Mathematical Analysis; Data from University of Bordeaux Provide New Insights into Mathematical Analysis  

(Penalization of Barycenters In the Wasserstein Space)

Journal of Mathematics, Jul 23, 2019, 18

Newspaper ArticleFull Text Online

www.researchgate.net › publication › 334761464_Genera...

Aug 2, 2019 - Generalised Wasserstein Dice Score for Imbalanced Multi-class Segmentation Using Holistic Convolutional Networks. Conference Paper (PDF ...


[PDF] researchgate.net

Wasserstein Metric Based Distributionally Robust Approximate Framework for Unit Commitment

R Zhu, H Wei, X Bai - IEEE Transactions on Power Systems, 2019 - ieeexplore.ieee.org

This paper proposes a Wasserstein-metric-based distributionally robust approximate (WDRA) framework for the unit commitment (UC) problem to manage the risk from uncertain wind power forecast errors. The ambiguity set employed in the distributionally robust …

Cited by 3 Related articles

Cited by 44 Related articles All 3 versions

Researchers from Guangxi University Provide Details of New Studies and Findings in the Area of Power Systems (Wasserstein Metric Based Distributionally Robust Approximate Framework for Unit Commitment)

Energy Weekly News, 07/2019  Newsletter:  Full Text Online 

Engineering - Power Systems; Researchers from Guangxi University Provide Details of New Studies and Findings in the Area of Power Systems (Wasserstein... 

Energy Weekly News, Jul 26, 2019, 544  Newspaper Article:  Full Text Online 

Cited by 56 Related articles All 2 versions

  

[PDF] mdpi.com

Multi-Turn Chatbot Based on Query-Context Attentions and Dual Wasserstein Generative Adversarial Networks

J Kim, S Oh, OW Kwon, H Kim - Applied Sciences, 2019 - mdpi.com

To generate proper responses to user queries, multi-turn chatbot models should selectively consider dialogue histories. However, previous chatbot models have simply concatenated or averaged vector representations of all previous utterances without considering contextual importance. To mitigate this problem, we propose a multi-turn chatbot model in which previous utterances participate in response generation using different weights. The proposed model calculates the contextual importance of previous utterances by using an …

Cited by 17 Related articles All 5 versions

Science - Applied Sciences; Researchers from Kangwon National University Report on Findings in Applied Sciences (Multi-Turn Chatbot Based on Query-Context Attentions and Dual Wasserstein... 

Science Letter, Oct 11, 2019, 2293

Newspaper ArticleFull Text Online 


2019 see 2018

Reports from Xiamen University Provide New Insights into Applied Sciences (Application of Auxiliary Classifier Wasserstein Generative Adversarial Networks in Wireless Signal Clasification of Illegal Unmanned Aerial Vehicles)

 Telecommunications Weekly, 02/2019  Newsletter:  Full Text Online 


Study Data from Xiamen University Update Understanding of Unmanned Aerial Vehicle (Application of Auxiliary Classifier Wasserstein Generative Adversarial Networks In Wireless Signal Classification of Illegal Unmanned Aerial Vehicles)

Telecommunications Weekly, 05/2019 Newsletter: Full Text Online 



Unsupervised segmentation of colonic polyps in narrow-band ...

An automatic and unsupervised method for the segmentation of colonic polyps for in vivo Narrow-Band-Imaging (NBI) data is proposed. The proposed segmentation method is a histogram based two-phase segmentation model, involving the Wasserstein distance.

by IN Figueiredo - ‎2019 - ‎Related articles



Peer-reviewed
Calculating spatial configurational entropy of a landscape mosaic based on the Wasserstein metric
Authors:Yuan Zhao, Xinchang Zhang
Summary:Entropy is an important concept traditionally associated with thermodynamics and is widely used to describe the degree of disorder in a substance, system, or process. Configurational entropy has received more attention because it better reflects the thermodynamic properties of physical and biological processes. However, as the number of configuration combinations increases, configurational entropy becomes too complex to calculate, and its value is too large to be accurately represented in practical applications. To calculate the spatial configurational entropy of a landscape mosaic based on a statistical metric, we proposed a relative entropy using histograms to compare two ecosystems with the Wasserstein metric, and used six digital elevation models and five simulated data sets to calculate the entropy of the complex ecosystems. The calculation and simulation showed that the proposed metric captured disorder in the spatial landscape, and the result was consistent with the general configurational entropy. By calculating several spatial scale landscapes, we found that relative entropy can be a trade-off between the rationality of results and the cost of calculation. Our results show that the Wasserstein metric is suitable to capture the discrepancy using complex landscape mosaic data sets, which provides a numerically efficient approximation for the similarity in the histograms, reducing excessive expansion of the calculated result.
Article, 2019
Publication:Landscape Ecology, 34, 201908, 1849
Publisher:2019
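The summary describes comparing elevation histograms of landscape mosaics with the Wasserstein metric. A small sketch of that comparison step (my own; the DEM tiles, bin range and bin count are made-up stand-ins for the paper's data):

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
dem_a = rng.normal(500.0, 50.0, size=(100, 100))   # hypothetical elevation tiles
dem_b = rng.normal(520.0, 60.0, size=(100, 100))

bins = np.linspace(300.0, 700.0, 41)
hist_a, _ = np.histogram(dem_a, bins=bins, density=True)
hist_b, _ = np.histogram(dem_b, bins=bins, density=True)
centers = 0.5 * (bins[:-1] + bins[1:])

# 1-Wasserstein distance between the two normalized elevation histograms
print(wasserstein_distance(centers, centers, u_weights=hist_a, v_weights=hist_b))
```

The paper goes on to turn such histogram discrepancies into a relative configurational entropy; the sketch only shows the distance computation itself.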


Primal dual methods for wasserstein gradient flows

JA Carrillo, K Craig, L Wang, C Wei - arXiv preprint arXiv:1901.08081, 2019 - arxiv.org

Combining the classical theory of optimal transport with modern operator splitting techniques, we develop a new numerical method for nonlinear, nonlocal partial differential equations, arising in models of porous media, materials science, and biological swarming …

  Cited by 16 Related articles All 3 versions 

 

基于 Wasserstein 生成对抗网络的遥感图像去模糊研究

刘晨旭 - 2019 - cdmd.cnki.com.cn

Remote sensing is an important means of Earth observation, and much of the key information extracted from remote sensing images has been widely used in reconnaissance, monitoring, prevention and early warning. During remote sensing imaging, image blur caused by long shooting distances, fast scanning speeds, external light interference, atmospheric turbulence and wide-swath imaging greatly reduces …

 [Chinese  Research on Remote Sensing Image Deblurring Based on Wasserstein Generative Adversarial Network]

 
Researchers from Kangwon National University Report on Findings in Applied Sciences (Multi-Turn Chatbot Based on Query-Context Attentions and Dual Wasserstein Generative Adversarial Networks).

From:Science Letter

Science Letter, 10/2019  Newsletter:  Full Text Online 

Potential Analysis; Studies from Beijing Normal University in the Area of Potential Analysis Described (Exponential Contraction In Wasserstein... 

Journal of Mathematics, Mar 24, 2020, 889

Newspaper ArticleFull Text Online 

<—— 2019 ———————2019———— -450 —

 

Применение метрики Вассерштейна для решения обратной динамической задачи сейсмики

АА Василенко - Интерэкспо Гео-Сибирь, 2019 - cyberleninka.ru

The inverse dynamic problem of seismics consists in determining the velocity model of an elastic medium from the recorded data. In this work it is proposed to use the Wasserstein metric to construct a functional characterizing …

[Russian  Application of the Wasserstein metric to solve the inverse dynamic seismic problem]

Related articles All 4 versions

Cited by 17 Related articles All 5 versions


2019  [PDF] ieee.org

WGAN-based robust occluded facial expression recognition

Y Lu, S Wang, W Zhao, Y Zhao - IEEE Access, 2019 - ieeexplore.ieee.org

Research on facial expression recognition (FER) technology can promote the development of theoretical and practical applications for our daily life. Currently, most of the related works on this technology are focused on un-occluded FER. However, in real life, facial expression …

Cited by 1 


[PDF] ieee.org

A packet-length-adjustable attention model based on bytes embedding using flow-wgan for smart cybersecurity

L Han, Y Sheng, X Zeng - IEEE Access, 2019 - ieeexplore.ieee.org

In the studies of cybersecurity, malicious traffic detection is attracting more and more attention for its capability of detecting attacks. Almost all of the intrusion detection methods based on deep learning have poor data processing capacity with the increase in the data …

Cited by 3 Related articles

Times Cited: 3 

(from Web of Science Core Collection) 

Researchers from Institute of Acoustics Report on Findings in Engineering (A Packet-length-adjustable Attention Model Based On Bytes Embedding Using Flow-wgan... 

Information Technology Newsweekly, 08/2019

Newsletter:  Full Text Online 

Engineering; Researchers from Institute of Acoustics Report on Findings in Engineering (A Packet-length-adjustable Attention Model Based On Bytes Embedding Using Flow-wgan... 

Information technology newsweekly, Aug 13, 2019, 464

Newspaper ArticleCitation Online 

A Packet-Length-Adjustable Attention Model Based on Bytes Embedding Using Flow-WGAN for Smart Cybersecurity
Article, 2019
Publication:IEEE access, 7, 2019, 82913
Publisher:2019

[PDF] tcd.ie

[PDF] Using WGAN for Improving Imbalanced Classification Performance

S Bhatia, R Dahyot - scss.tcd.ie   AICS 2019

This paper investigates data synthesis with a Generative Adversarial Network (GAN) for augmenting the amount of data used for training classifiers (in supervised learning) to compensate for class imbalance (when the classes are not represented equally by the same …

Related articles All 4 versions

 

  

Arterial Spin Labeling Images Synthesis via Locally-Constrained WGAN-GP Ensemble

W Huang, M Luo, X Liu, P Zhang, H Ding… - … Conference on Medical …, 2019 - Springer

Arterial spin labeling (ASL) images begin to receive much popularity in dementia diseases diagnosis recently, yet it is still not commonly seen in well-established image datasets for investigating dementia diseases. Hence, synthesizing ASL images from available data is …



 Network Security Situation Prediction Based on Improved WGAN

J Zhu, T Wang - International Conference on Simulation Tools and …, 2019 - Springer

The current network attacks on the network have become very complex. As the highest level of network security situational awareness, situation prediction provides effective information for network administrators to develop security protection strategies. The generative …

王婷婷, 朱江 - 计算机科学 - jsjkx.com

This paper proposes a network security situation prediction mechanism based on a differential WGAN (Wasserstein-GAN). The mechanism uses a generative adversarial network (GAN) to simulate the evolution of the security situation and realizes prediction along the time dimension. To address GAN's difficulties with training and mode collapse …

All 2 versions 

[Chinese  Network security situation prediction based on differential WGAN]


[PDF] techscience.com

[PDF] Low-Dose CT Image Denoising Based on Improved WGAN-gp

X Li, C Ye, Y Yan, Z Du - Journal of New Media JNM, 2019 - test.techscience.com

In order to improve the quality of low-dose computational tomography (CT) images, the paper proposes an improved image denoising approach based on WGAN-gp with Wasserstein distance. For improving the training and the convergence efficiency, the given …

All 5 versions

基于 WGAN 网络的自然视频预测

李敏, 仝明磊, 范绿源, 南昊 - 仪表技术, 2019 - cnki.com.cn

Computer vision has achieved great results in both academia and industry, and in recent years video prediction has become an important research area. Existing video prediction models based on generative adversarial networks require careful balancing of generator and discriminator training, and the generated samples lack diversity. To address these problems, a Wasserstein generative adversarial network (WGAN) is proposed …

Related articles

[Chinese  Natural video prediction based on a WGAN network]


Generation of Network Traffic Using WGAN-GP and a DFT Filter for Resolving Data Imbalance

WH Lee, BN Noh, YS Kim, KM Jeong - International Conference on …, 2019 - Springer

The intrinsic features of Internet networks lead to imbalanced class distributions when datasets are conformed, phenomena called Class Imbalance and that is attaching an increasing attention in many research fields. In spite of performance losses due to Class …
Book Chapter 


E-WACGAN: Enhanced Generative Model of Signaling Data Based on WGAN-GP and ACGAN

Q Jin, R Lin, F Yang - IEEE Systems Journal, 2019 - ieeexplore.ieee.org

In recent years, the generative adversarial network (GAN) has achieved outstanding performance in the image field and the derivatives of GAN, namely auxiliary classifier GAN (ACGAN) and Wasserstein GAN with gradient penalty (WGAN-GP) have also been widely …

Cited by 3 Related articles

How to Develop a Wasserstein Generative Adversarial Network

Jul 17, 2019 - The development of the WGAN has a dense mathematical motivation, though in practice it requires only a few minor modifications to the established ...

<——2019—— 2019 —— -460 —  

 

Feature augmentation for imbalanced classification with conditional mixture WGANs 

by Zhang, Yinghui; Sun, Bo; Xiao, Yongkang; More... 

Signal Processing: Image Communication, 07/2019, Volume 7

Journal Article: Full Text Online 

Study Results from Beijing Normal University Provide New Insights into Signal Processing (Feature Augmentation for Imbalanced Classification With Conditional Mixture Wgans... 

Electronics Newsweekly, 07/2019

Newsletter: Full Text Online 


Optimal Transport and Wasserstein Distance 1 Introduction

by S Kolouri

The Wasserstein distance — which arises from the idea of optimal transport — is being used more and more in Statistics and Machine Learning. In these notes ...
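Since the notes above introduce the Wasserstein distance via optimal transport, a self-contained way to see the definition in action is to solve the discrete Kantorovich linear program directly (my own sketch; for anything beyond toy sizes a dedicated OT solver would be used instead of a generic LP):

```python
import numpy as np
from scipy.optimize import linprog

def wasserstein_lp(a, b, C):
    """Solve min <C, P> over couplings P >= 0 with row sums a and column sums b."""
    n, m = C.shape
    A_eq = np.zeros((n + m, n * m))
    for i in range(n):
        A_eq[i, i * m:(i + 1) * m] = 1.0          # row-sum constraints
    for j in range(m):
        A_eq[n + j, j::m] = 1.0                   # column-sum constraints
    res = linprog(C.ravel(), A_eq=A_eq, b_eq=np.concatenate([a, b]),
                  bounds=(0, None), method="highs")
    return res.fun, res.x.reshape(n, m)

a = np.array([0.5, 0.5]); x = np.array([0.0, 1.0])
b = np.array([0.25, 0.75]); y = np.array([0.0, 1.0])
cost, plan = wasserstein_lp(a, b, np.abs(x[:, None] - y[None, :]))
print(cost)   # 0.25: a quarter of the mass moves from 0 to 1
```

Using |x_i - y_j|^p as the cost and taking the p-th root of the optimal value gives the p-Wasserstein distance.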


On isometric embeddings of Wasserstein spaces - the discrete case 

By: Geher, Gyorgy Pal; Titkos, Tamas; Virosztek, Daniel 

JOURNAL OF MATHEMATICAL ANALYSIS AND APPLICATIONS  Volume: 480   Issue: 2     Article Number: 123435   Published: DEC 15 2019 

 Cited by 3 Related articles All 8 versions

Deep learning and Wasserstein distance metric based finger vein identification method, involves performing encoding operation on images, and utilizing similarity of Wasserstein distance metric as result of search identification    2019

Patent Number: CN110555382-A 

Patent Assignee: UNIV ZHEJIANG SCI-TECH 

Inventor(s): ZHANG N; TU X; BAO X; et al.


Generated confrontation network based mushroom phenotype image generation method, involves obtaining Wasserstein distance similarity measurement index for establishing high-quality mushroom surface type image generating model  2019

Patent Number: CN110197514-A 

Patent Assignee: UNIV NANJING AGRIC 

Inventor(s): YUAN P; WU M; XU H; et al.

New Programming Study Findings Have Been Reported from Virginia Polytechnic Institute and State University (On distributionally robust chance constrained programs with Wasserstein... 

Mathematics Week, 12/2019

NewsletterFull Text Online

Document Title: "Programming; New Programming Study Findings Have Been Reported from Virginia Polytechnic Institute and State University (On distributionally robust chance constrained programs with Wasserstein distance)"AndStart Page: 381AndISSN: 19442556

News of Science, Dec 15, 2019, 381   Newspaper Article:  Full Text Online 

On distributionally robust chance constrained programs with Wasserstein distance

W Xie - Mathematical Programming, 2019 - Springer

This paper studies a distributionally robust chance constrained program (DRCCP) with Wasserstein ambiguity set, where the uncertain constraints should be satisfied with a probability at least a given threshold for all the probability distributions of the uncertain parameters within a chosen Wasserstein distance from an empirical distribution. In this work, we investigate equivalent reformulations and approximations of such problems. We first show that a DRCCP can be reformulated as a conditional value-at-risk constrained …

  Cited by 62 Related articles All 9 versions


 

How to Implement Wasserstein Loss for Generative Adversarial Networks

By Jason Brownlee on July 15, 2019 in Generative Adversarial Networks 
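For context, here is a minimal sketch of the Wasserstein (critic and generator) losses; it assumes a PyTorch-style critic module named critic and tensors real and fake, and it illustrates the standard WGAN objectives rather than the tutorial's own code.

```python
# Minimal sketch of the WGAN objectives (illustration, not the tutorial's code).
# `critic` is assumed to be a torch.nn.Module mapping a batch of samples to one
# unconstrained real-valued score per sample; `real` and `fake` are tensors.
import torch

def critic_loss(critic, real, fake):
    # The critic tries to maximize E[critic(real)] - E[critic(fake)],
    # so we minimize the negated difference.
    return -(critic(real).mean() - critic(fake).mean())

def generator_loss(critic, fake):
    # The generator tries to raise the critic's score on generated samples.
    return -critic(fake).mean()
```

In a complete WGAN the critic additionally has to be kept approximately 1-Lipschitz, for example by weight clipping or a gradient penalty, otherwise these losses do not estimate a Wasserstein distance.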


Семинар: Расстояние Вассерштейна для модулей устойчивости

Sep 27, 2019 - Семинар научно-учебной лаборатории прикладной геометрии и топологии будет посвящён расстоянию Вассерштейна для модулей устойчивости. Докладчик - Владимир Смурыгин, стажёр-исследователь лаборатории.

[Russian  Seminar: The Wasserstein distance for persistence modules. The seminar of the applied geometry and topology laboratory will be devoted to the Wasserstein distance for persistence modules. Speaker: Vladimir Smurygin, research intern at the laboratory.]

"New Operations Science Findings from University of Southern California Outlined (Wasserstein Distance and the Distributionally Robust TSP)"AndStart Page: 921AndISSN: 15389111

Science Letter, 01/2019   Newsletter:  Full Text Online 

<——2019    ———— 2019  ———————470—


On potentials of regularized Wasserstein generative adversarial networks for realistic hallucination of tiny faces

WZ Shao, JJ Xu, L Chen, Q Ge, LQ Wang, BK Bao… - Neurocomputing, 2019 - Elsevier

Super-resolution of facial images, aka face hallucination, has been intensively studied in the past decades due to the increasingly emerging analysis demands in video surveillance, e.g., face detection, verification, identification. However, the actual performance of most previous …

Cited by 1 All 3 versions

Researchers' Work from Nanjing University of Posts and Telecommunications Focuses on Hallucinations (On Potentials of Regularized Wasserstein Generative Adversarial Networks for Realistic Hallucination of Tiny Faces).

Health & Medicine Week, 11/2019

Newsletter: Full Text Online 

Perceptual Diseases and Conditions - Hallucinations; Researchers' Work from Nanjing University of Posts and Telecommunications Focuses on Hallucinations (On Potentials of Regularized Wasserstein... 

Pain & Central Nervous System Week, Oct 28, 2019, 6477

Newspaper ArticleFull Text Online 

 Cited by 2 Related articles All 3 versions


 Masters Thesis - TU Delft Repositories

repository.tudelft.nl › islandora › object › datastream › OBJ › download

PDF Delf 2019

the Wasserstein Metric in Deep. Reinforcement Learning. The regularizing effect of modelling return distributions. Master of Science Thesis. For the degree of ...


Document Title: "Stochastics and Dynamics; Findings from FuJian Normal University Provides New Data on Stochastics and Dynamics (Refined Basic Couplings and Wasserstein-type Distances for Sdes With Levy Noises)"

AndStart Page: 87AndISSN: 19441894

Journal of Technology & Science, Sep 22, 2019, 87

Newspaper Article:  Full Text Online 

Findings from FuJian Normal University Provides New Data on Stochastics and Dynamics 

(Refined Basic Couplings and Wasserstein... 

Mathematics Week, 09/2019

NewsletterFull Text Online
Cited by 22
Related articles All 6 versions


Identifying Imaging Markers for Predicting Cognitive  Assessments Using Wasserstein Distances Based Matrix Regression

Jul 10, 2019 - Regression models are widely used to predict the relationship between imaging biomarkers and cognitive assessment, and identify discriminative ...

by J Yan - ‎2019 - ‎Cited by 1 - ‎Related articles

Abstract · ‎Introduction · ‎Study of Cognitive Score ... · ‎Experimental Results

Reports Outline Biomarkers Study Findings from Xidian University 

(Identifying Imaging Markers for Predicting Cognitive Assessments Using Wasserstein... 

Health & Medicine Week, 08/2019

NewsletterFull Text Online 

Diagnostics and Screening - Biomarkers; Reports Outline Biomarkers Study Findings from Xidian University (Identifying Imaging Markers for Predicting Cognitive Assessments Using Wasserstein Distances Based Matrix Regression)

Medical Imaging Week, Aug 10, 2019, 4442  Newspaper Article:  Full Text Online 

Document Title: "Engineering - Power Systems; Reports from North China Electric Power University Provide New Insights into Power Systems (Risk-based Distributionally Robust Optimal Gas-power Flow With Wasserstein Distance)" Start Page: 603, ISSN: 19456921

Energy & Ecology, Jun 28, 2019, 603   Newspaper Article:  Full Text Online 

Reports from North China Electric Power University Provide New Insights into Power Systems (Risk-based Distributionally Robust Optimal Gas-power Flow With Wasserstein Distance)

Energy Weekly News, 06/2019  Newsletter: Full Text Online

2019


On the Bures–Wasserstein distance between positive definite matrices

R Bhatia, T Jain, Y Lim - Expositiones Mathematicae, 2019 - Elsevier

The metric $d(A,B) = \big(\operatorname{tr} A + \operatorname{tr} B - 2\operatorname{tr}(A^{1/2} B A^{1/2})^{1/2}\big)^{1/2}$ on the manifold of $n \times n$ positive definite matrices arises in various optimisation problems, in quantum information and in the theory of optimal transport. It is also related to Riemannian geometry. In the first …

  Cited by 104 Related articles All 6 versions
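A minimal numerical sketch of this metric (my illustration, not the paper's code), computing d(A, B) for two small positive definite matrices with SciPy's matrix square root:

```python
# Minimal sketch (not from the cited paper): the Bures-Wasserstein distance
# d(A, B) = [tr A + tr B - 2 tr (A^{1/2} B A^{1/2})^{1/2}]^{1/2}
# between two symmetric positive definite matrices A and B.
import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein(A, B):
    sqrt_A = sqrtm(A)                      # principal matrix square root
    cross = sqrtm(sqrt_A @ B @ sqrt_A)     # (A^{1/2} B A^{1/2})^{1/2}
    value = np.trace(A) + np.trace(B) - 2.0 * np.trace(cross)
    return np.sqrt(max(value.real, 0.0))   # clip tiny negative round-off

A = np.array([[2.0, 0.5], [0.5, 1.0]])
B = np.array([[1.0, 0.2], [0.2, 3.0]])
print(bures_wasserstein(A, B))
```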


Courbes et applications optimales à valeurs dans l'espace de Wasserstein

H Lavenant - 2019 - tel.archives-ouvertes.fr

L'espace de Wasserstein est l'ensemble des mesures de probabilité définies sur un domaine fixé et muni de la distance de Wasserstein quadratique. Dans ce travail, nous étudions des problèmes variationnels dans lesquels les inconnues sont des applications à …

[French  Curves and optimal maps valued in the Wasserstein space: the Wasserstein space is the set of probability measures on a fixed domain, endowed with the quadratic Wasserstein distance; this work studies variational problems whose unknowns are maps valued in this space.]

  Cited by 1 Related articles All 11 versions 


Document Title: "Science - Operations Science; Investigators at Northwestern University Describe Findings in Operations Science (Decomposition Algorithm for Distributionally Robust Optimization Using Wasserstein Metric With an Application To a Class of Regression Models)"AndStart Page: 810AndISSN: 15389111

Science Letter, Oct 11, 2019, 810   Newspaper Article:   Full Text Online 

"Investigators at Northwestern University Describe Findings in Operations Science (Decomposition Algorithm for Distributionally Robust Optimization Using Wasserstein Metric With an Application To a Class of Regression Models)"AndStart Page: 810AndISSN: 15389111

Science Letter, 10/2019  Newsletter: Full Text Online
Decomposition algorithm for distributionally robust optimization using Wasserstein metric with an application to a class of regression models

F Luo, S Mehrotra - European Journal of Operational Research, 2019 - Elsevier

We study distributionally robust optimization (DRO) problems where the ambiguity set is defined using the Wasserstein metric and can account for a bounded support. We show that this class of DRO problems can be reformulated as decomposable semi-infinite programs …

 Cited by 19 Related articles All 6 versions

 

2019 see 2018

Data from University of Toronto Provide New Insights into Signal Processing (Wasserstein-distance-based... 

Electronics Newsweekly, 08/2019

NewsletterFull Text Online 

Signal Processing; Data from University of Toronto Provide New Insights into Signal Processing  

(Wasserstein-distance-based Gaussian Mixture Reduction)

Electronics Newsweekly, Aug 6, 2019, 47

Newspaper ArticleFull Text Online 

 


Document Title: "Mathematics - Mathematical Statistics and Probability; Findings from University of Valladolid Reveals New Findings on Mathematical Statistics and Probability (Wide Consensus Aggregation In the Wasserstein Space. Application To Location-scatter Families)"AndStart Page: 212AndISSN: 19441894
Journal of Technology & Science, Mar 17, 2019, 212

Newspaper Article: Full Text Online
[CITATION] Convergence Rate to Equilibrium in Wasserstein Distance for Reflected Jump-Diffusions (2020)

A Sarantsev - Statistics and Probability Letters, 2019

  Cited by 1

<––2019  —-2019——————480——  


Document Title: "Stochastics and Dynamics; New Stochastics and Dynamics Findings Has Been Reported by Investigators at University of Paris (Randomized filtering and Bellman equation in Wasserstein space for partial observation control problem)"AndStart Page: 232AndISSN: 19442556

News of Science, Feb 17, 2019, 232   Newspaper Article:  Full Text Online 


2019 see 2018

Science - Applied Sciences; Reports from Xiamen University Provide New Insights into Applied Sciences 

(Application of Auxiliary Classifier Wasserstein Generative Adversarial

Defense & Aerospace Week, Feb 6, 2019, 126  Newspaper Article: Full Text Online 


[PDF] The generalized Vaserstein symbol

T Syed - 2019 - edoc.ub.uni-muenchen.de

Let R be a commutative ring. An important question in the study of projective modules is under which circumstances a projective R-module P is cancellative, i.e. under which circumstances any isomorphism P ⊕ R^k ≅ Q ⊕ R^k for some projective R-module Q and k ≥ 0 …

Using Wasserstein Generative Adversarial Networks for the Design of Monte Carlo Simulations

By Susan Athey, Guido W. Imbens, Jonas Metzger, Evan Munro

September 2019, Working Paper No. 3824

  Cited by 10 Related articles All 8 versions 

[PDF] A NOTE ON RELATIVE VASERSTEIN SYMBOL

K CHAKRABORTY, RA RAO - math.tifr.res.in

Definition 1.1. The Relative Elementary group E_n(R, I): Let R be a ring and I ⊆ R be an ideal. The relative elementary group is the subgroup of SL_n(R, I) generated by the matrices of the form α e_{i,j}(a) α^{-1}, where α ∈ E_n(R), i ≠ j and a ∈ I … We identify GL_n(R) with a subgroup …

Related articles


2019

Geometric mean flows and the Cartan barycenter on the ...

Aug 10, 2019 - Download Citation | Geometric mean flows and the Cartan barycenter on the Wasserstein space over positive definite matrices | We introduce a ...


On parameter estimation with the Wasserstein distance 

By: Bernton, Espen; Jacob, Pierre E.; Gerber, Mathieu; et al.

INFORMATION AND INFERENCE-A JOURNAL OF THE IMA  Volume: 8   Issue: 4   Pages: 657-676   Published: DEC 2019 

Times Cited: 1 


Wasserstein distance based convolution neural network anti-transfer learning method, involves completing countermeasure migration learning of convolution neural network according to convergence criterion 

Patent Number: CN110414383-A 

Patent Assignee: UNIV HUAZHONG SCI & TECHNOLOGY 

Inventor(s): YUAN Y; ZHOU B; CHENG C; et al.

Patent Number

Publ. Date

Main IPC

Week

Page Count

Language

CN110414383-A 

05 Nov 2019

G06K-009/00

201989 

Pages: 17 

Chinese 


 2019 see 2020
Fast Algorithms for Computational Optimal Transport and Wasserstein Barycenter
Authors:Guo, Wenshuo (Creator), Ho, Nhat (Creator), Jordan, Michael I. (Creator
Summary:We provide theoretical complexity analysis for new algorithms to compute the optimal transport (OT) distance between two discrete probability distributions, and demonstrate their favorable practical performance over state-of-art primal-dual algorithms and their capability in solving other problems in large-scale, such as the Wasserstein barycenter problem for multiple probability distributions. First, we introduce the accelerated primal-dual randomized coordinate descent (APDRCD) algorithm for computing the OT distance. We provide its complexity upper bound $\widetilde{O}(n^{5/2}/\varepsilon)$, where $n$ stands for the number of atoms of these probability measures and $\varepsilon > 0$ is the desired accuracy. This complexity bound matches the best known complexities of primal-dual algorithms for the OT problems, including the adaptive primal-dual accelerated gradient descent (APDAGD) and the adaptive primal-dual accelerated mirror descent (APDAMD) algorithms. Then, we demonstrate the better performance of the APDRCD algorithm over the APDAGD and APDAMD algorithms through extensive experimental studies, and further improve its practical performance by proposing a greedy version of it, which we refer to as accelerated primal-dual greedy coordinate descent (APDGCD). Finally, we generalize the APDRCD and APDGCD algorithms to distributed algorithms for computing the Wasserstein barycenter for multiple probability distributions.
Downloadable Archival Material, 2019-05-23
Undefined
Publisher:2019-05-23
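For a hedged point of comparison only (this is not the APDRCD/APDGCD method of the entry), the exact OT cost between two small discrete distributions can be obtained by solving the underlying transportation linear program directly; the function name and toy data below are illustrative assumptions.

```python
# Minimal baseline sketch (not the APDRCD/APDGCD algorithms from this entry):
# exact optimal-transport cost between two discrete distributions by solving
# the transportation linear program with SciPy's LP solver.
import numpy as np
from scipy.optimize import linprog

def ot_cost(p, q, C):
    n, m = C.shape
    # Equality constraints: row sums of the plan equal p, column sums equal q.
    A_eq = np.zeros((n + m, n * m))
    for i in range(n):
        A_eq[i, i * m:(i + 1) * m] = 1.0          # sum_j P[i, j] = p[i]
    for j in range(m):
        A_eq[n + j, j::m] = 1.0                   # sum_i P[i, j] = q[j]
    b_eq = np.concatenate([p, q])
    res = linprog(C.ravel(), A_eq=A_eq, b_eq=b_eq,
                  bounds=(0, None), method="highs")
    return res.fun

p = np.array([0.5, 0.5])
q = np.array([0.25, 0.75])
C = np.array([[0.0, 1.0], [1.0, 0.0]])            # ground cost between atoms
print(ot_cost(p, q, C))                           # expected 0.25
```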

[PDF] arxiv.org

From GAN to WGAN

L Weng - arXiv preprint arXiv:1904.08994, 2019 - arxiv.org

Generative adversarial network (GAN) [1] has shown great results in many generative tasks to 

replicate the real-world rich content such as images, human language, and music. It is inspired 

by game theory: two models, a generator and a critic, are competing with each other while making …

Cited by 2 Related articles All 4 versions

<——2019————— 2019  ———————490—

    

Novel Bi-directional Images Synthesis Based on WGAN-GP with GMM-Based Noise Generation

W Huang, M Luo, X Liu, P Zhang, H Ding… - International Workshop on …, 2019 - Springer

Abstract A novel WGAN-GP-based model is proposed in this study to fulfill bi-directional 

synthesis of medical images for the first time. GMM-based noise generated from the Glow 

model is newly incorporated into the WGAN-GP-based model to better reflect the …

Related articles All 2 versions 

2019 in book

Novel Bi-directional Images Synthesis Based on WGAN-GP with GMM-Based Noise Generation 

By: Huang, Wei; Luo, Mingyuan; Liu, Xi; et al.

Conference: 10th International Workshop on Machine Learning in Medical Imaging (MLMI) / 22nd International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI) Location: ‏ Shenzhen, PEOPLES R CHINA Date: ‏ OCT 13-17, 2019 

MACHINE LEARNING IN MEDICAL IMAGING (MLMI 2019)   Book Series: ‏ Lecture Notes in Computer Science   Volume: ‏ 11861   Pages: ‏ 160-168   Published: ‏ 2019 

Novel Bi-directional Images Synthesis Based on WGAN-GP with GMM-Based Noise Generation 

by Huang, Wei; Luo, Mingyuan; Liu, Xi; More... 

2019   eBookCitation Online 

Book Chapter  

 

[PDF] Multiple-Operation Image Anti-Forensics with WGAN-GP Framework

J Wu, Z Wang, H Zeng, X Kang - apsipa.org

A challenging task in the field of multimedia security involves concealing or eliminating the 

traces left by a chain of multiple manipulating operations, ie, multipleoperation anti-forensics 

in short. However, the existing antiforensic works concentrate on one specific manipulation …


[PDF] ucl.ac.be

[PDF] Conditional WGAN for grasp generation

F Patzelt, R Haschke, H Ritter - European Symposium on Artificial …, 2019 - elen.ucl.ac.be

This work proposes a new approach to robotic grasping exploiting conditional Wasserstein 

generative adversarial networks (WGANs), which output promising grasp candidates from 

depth image inputs. In contrast to discriminative models, the WGAN approach enables …

Related articles All 2 versions


[HTML] cqupt.edu.cn

[HTML] 基于 WGAN 的语音增强算法研究

王怡斐, 韩俊刚, 樊良辉 - 重庆邮电大学学报 (自然科学版), 2019 - journal2.cqupt.edu.cn

带噪语音可看成由独立的噪声信号和语音信号经某种方式混合而成, 传统语音增强方法需要对

噪声信号和干净语音信号的独立性和特征分布做出假设, 不合理的假设会造成噪声残留, 

语音失真等问题, 导致语音增强效果不佳. 此外, 噪声本身的随机性和突变性也会影响传统语音 …

Cited by 1 Related articles All 3 versions

[Chinese  Research on Speech Enhancement Algorithm Based on WGAN]


基于 Wasserstein 距离分层注意力模型的跨域情感分类

杜永萍, 贺萌, 赵晓铮 - 模式识别与人工智能, 2019 - airitilibrary.com

跨领域情感分类任务旨在利用已知情感标签的源域数据对缺乏标记数据的目标域进行情感倾向

性分析. 文中提出基于Wasserstein 距离的分层注意力模型, 结合Attention 机制, 

采用分层模型进行特征提取, Wasserstein 距离作为域差异度量方式, 通过对抗式训练自动 …

Related articles All 2 versions 

[Chinese  Cross-domain sentiment classification based on Wasserstein distance hierarchical attention model]

   

2019

[CITATION] エントロピー正則化 Wasserstein 距離に基づくマルチビュー Wasserstein 判別法 (放送技術)

笠井裕之 - 映像情報メディア学会技術報告= ITE technical report, 2019 - ci.nii.ac.jp


[Japanese  Multi-view Wasserstein discriminant method based on entropy-regularized Wasserstein distance (Broadcast technology)]


Relation between the Kantorovich-Wasserstein metric and the Kullback-Leibler divergence

RV Belavkin - arXiv preprint arXiv:1908.09211, 2019 - adsabs.harvard.edu

We discuss a relation between the Kantorovich-Wasserstein (KW) metric and the Kullback-Leibler (KL) divergence. The former is defined using the optimal transport problem (OTP) in the Kantorovich formulation. The latter …
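A small, hedged numerical illustration of how the two quantities differ (my example, not the paper's): the KL divergence is infinite as soon as one distribution puts mass outside the other's support, while the Wasserstein distance stays finite and scales with how far mass has to move.

```python
# Hedged illustration (not from the cited paper): KL divergence vs. the
# 1-Wasserstein distance on two discrete distributions over {0, 1, 2}.
import numpy as np
from scipy.stats import entropy, wasserstein_distance

support = np.array([0.0, 1.0, 2.0])
p = np.array([0.5, 0.5, 0.0])
q = np.array([0.0, 0.5, 0.5])

# KL(p || q) is infinite because p puts mass where q has none.
print(entropy(p, q))                                     # inf
# The Wasserstein distance stays finite: all mass shifts one step to the right.
print(wasserstein_distance(support, support, p, q))      # 1.0
```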


 

2019  [PDF] tum.de

Structure preserving discretization and approximation of gradient flows in Wasserstein-like space

S Plazotta - 2019 - mediatum.ub.tum.de

This thesis investigates structure-preserving, temporal semi-discretizations and

approximations for PDEs with gradient flow structure with the application to evolution

problems in the L²-Wasserstein space. We investigate the variational formulation of the time …

Related articles


The optimal convergence rate of monotone schemes for conservation laws in the Wasserstein distance

AM Ruf, E Sande, S Solem - Journal of Scientific Computing, 2019 - Springer

Abstract In 1994, Nessyahu, Tadmor and Tassa studied convergence rates of monotone finite volume approximations of conservation laws. For compactly supported, Lip^+-bounded initial data they showed a first-order convergence rate in the Wasserstein distance. Our main result is to prove that this rate is optimal. We further provide numerical evidence indicating that the rate in the case of Lip^+-unbounded initial data is worse than first-order.

Cited by 4 Related articles All 4 versions


The Optimal Convergence Rate of Monotone Schemes for  Conservation Laws in the Wasserstein Distance

Jun 28, 2019 - The Optimal Convergence Rate of Monotone Schemes for Conservation Laws in the Wasserstein Distance. Adrian M. Ruf ,; Espen Sande & ...

by AM Ruf - ‎2019 - ‎Cited by 5 - ‎Related articles

The Optimal Convergence Rate of Monotone Schemes for ...

Nov 7, 2019 - Request PDF | The Optimal Convergence Rate of Monotone Schemes for Conservation Laws in the Wasserstein Distance | In 1994, Nessyahu, ...

Studies from Norwegian University of Science and Technology Describe New Findings in Conservation Law 

(The Optimal Convergence Rate of Monotone Schemes for Conservation Laws In the Wasserstein... 

Ecology, Environment & Conservation, 09/2019

Newsletter Full Text Online 

Conservation - Conservation Law; Studies from Norwegian University of Science and Technology (NTNU) Describe New Findings in Conservation Law (The Optimal Convergence Rate of Monotone Schemes for Conservation Laws In the Wasserstein... 

Ecology, Environment & Conservation, Sep 27, 2019, 1447

Newspaper ArticleFull Text Online 

Mathematics; New Mathematics Study Findings Recently Were Reported by Researchers at Technical University Berlin (A Fenchel-Moreau-Rockafellar type theorem on the Kantorovich-Wasserstein...

<——2019—————— 2019————————500——

Harmonic mappings valued in the Wasserstein space - cvgmt

cvgmt.sns.it › media › doc › paper › harmonic_mappings_Wasserstein_...

PDF 2019

by H LAVENANT - ‎2019 - ‎Cited by 9 - ‎Related articles

Wasserstein space; harmonic maps; Dirichlet problem. 1 ... they focus on numerical computation and visualization of theses soft maps, see also [34] for ...

Cited by 16 Related articles All 13 versions

Wasserstein Generative Adversarial Privacy Networks

essay.utwente.nl › Mulder_MA_EEMCS

PDF  University of Twente

by K Mulder - ‎2019 - ‎Related articles

Jul 19, 2019 - In this thesis, we consider whether we can modify the approach taken by [1] to use a Wasserstein GAN as basis instead of a traditional GAN, in ...


 Greedy Approach to Max-Sliced Wasserstein GANs

A Horváth - 2019 - openreview.net

Generative Adversarial Networks have made data generation possible in various use cases, but in case of complex, high-dimensional distributions it can be difficult to train them, because of convergence problems and the appearance of mode collapse. Sliced …

Related articles All 2 versions
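For background on the sliced family this entry builds on, here is a hedged sketch of the plain (non-max) sliced Wasserstein distance between two point clouds, estimated with random projection directions; the function name and toy data are illustrative assumptions, not the paper's method.

```python
# Hedged background sketch (not the paper's max-sliced method): the plain
# sliced Wasserstein distance between two point clouds, estimated by
# averaging 1-D Wasserstein distances over random projection directions.
import numpy as np
from scipy.stats import wasserstein_distance

def sliced_wasserstein(X, Y, n_projections=100, seed=0):
    rng = np.random.default_rng(seed)
    dim = X.shape[1]
    total = 0.0
    for _ in range(n_projections):
        theta = rng.normal(size=dim)
        theta /= np.linalg.norm(theta)          # random unit direction
        total += wasserstein_distance(X @ theta, Y @ theta)
    return total / n_projections

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))                               # cloud from N(0, I)
Y = rng.normal(size=(500, 3)) + np.array([1.0, 0.0, 0.0])   # shifted cloud
print(sliced_wasserstein(X, Y))
```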

Distributionally Robust Learning under the Wasserstein Metric

by R Chen - ‎2019 - ‎Related articles

This dissertation develops a comprehensive statistical learning framework that is robust to (distributional) perturbations in the data using Distributionally Robust ...


 Optimal Control in Wasserstein Spaces

Nov 19, 2019 - Download Citation | Optimal Control in Wasserstein Spaces | A ... In this thesis, we extend for the first time several of these concepts to the ...


2019

Input limited Wasserstein GAN

C FD - 2019 - ir.sia.cn

… 5. CONCLUSION AND FUTURE WORK Here we have proposed the Input Limited WGAN. We design a Sparse Autoencoder to restrict the input domain …

Related articles


WGAN-Based Robust Occluded Facial Expression Recognition 

By: Lu, Yang; Wang, Shigang; Zhao, Wenting; et al.

IEEE ACCESS   Volume: 7   Pages: 93594-93610   Published: 2019 

 Free Full Text from Publisher 

WGAN-based robust occluded facial expression recognition

Y Lu, S Wang, W Zhao, Y Zhao - IEEE Access, 2019 - ieeexplore.ieee.org

Research on facial expression recognition (FER) technology can promote the development

of theoretical and practical applications for our daily life. Currently, most of the related works

on this technology are focused on un-occluded FER. However, in real life, facial expression

images often have partial occlusion; therefore, the accurate recognition of occluded facial

expression images is a topic that should be explored. In this paper, we proposed a novel

Wasserstein generative adversarial network-based method to perform occluded FER. After …

  Cited by 8 Related articles


Open Access 

Convolutional neural network adversarial transfer learning method based on Waserstein distance and application thereof 

by LI XINGYI; MA GUIJUN; CHENG CHENG; More... 

11/2019

The invention relates to a convolutional neural network adversarial transfer learning method based on Waserstein distance and application thereof and the...

Patent: Citation Online 


Open Access 

种基于Wasserstein GAN的光伏阵列故障诊断方法 

04/2019

本发明涉及种基于Wasserstein GAN的光伏阵列故障诊断方法,首先对光伏阵列电流、电压时序数据进行采集;接着将获取的光伏阵列时序电流与时序电压数据绘制为曲线图形并保存为样本;然后设计Wasserstein GAN网络中的鉴别器D与生成器G;然后训练Wasserstein...

Patent:  Citation Online 

Photovoltaic array fault diagnosis method based on Wasserstein GAN

[Chinese  Fault Diagnosis Method for Photovoltaic Array Based on Wasserstein GAN]

 

Open Access 

基于Wasserstein生成对抗网络的三维MRI图像去噪模型的构建方法及应用 

08/2019

Patent:  Citation Online  

Construction method and application of three-dimensional MRI image denoising model based on Wasserstein generative adversarial network   

[Chinese  Construction method and application of 3D MRI image denoising model based on Wasserstein generative adversarial network]
<——2019 ———— 2019  ————510—

From GAN to WGAN

L Weng - arXiv preprint arXiv:1904.08994, 2019 - arxiv.org

Generative adversarial network (GAN) [1] has shown great results in many generative tasks to replicate the real-world rich content such as images, human language, and music. It is inspired by game theory: two models, a generator and a critic, are competing with each other while making …

Cited by 2 Related articles 

From GAN to WGAN 

by Weng, Lilian 

04/2019

This paper explains the math behind a generative adversarial network (GAN) model and why it is hard to be trained. Wasserstein GAN is intended to improve GANs'...

Journal Article: Full Text Online 

Cited by 8 Related articles All 4 versions


Computing Wasserstein Barycenters via Linear Programming

Authors: Auricchio G.; Gualandi S.; Veneroni M.; Bassetti F.
16th International Conference on the Integration of Constraint Programming, Artificial Intelligence, and Operations Research, CPAIOR 2019
Article, 2019
Publication:Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 11494 LNCS, 2019, 355
Publisher:2019


Prostate MR Image Segmentation With Self-Attention Adversarial Training Based on Wasserstein Distance

Author: Chengwei Su
Article, 2019
Publication:IEEE access, 7, 2019, 184276
Publisher:2019

SGD Learns One-Layer Networks in WGANs

Q Lei, JD Lee, AG Dimakis, C Daskalakis - arXiv preprint arXiv …, 2019 - arxiv.org

Generative adversarial networks (GANs) are a widely used framework for learning generative models. Wasserstein GANs (WGANs), one of the most successful variants of GANs, require solving a minmax optimization problem to global optimality, but are in practice successfully trained using stochastic gradient descent-ascent. In this paper, we show that, when the generator is a one-layer network, stochastic gradient descent-ascent converges to a global solution with polynomial time and sample complexity.

Cited by 3 Related articles 

SGD Learns One-Layer Networks in WGANs 

by Lei, Qi; Lee, Jason D; Dimakis, Alexandros G; More... 

10/2019

Generative adversarial networks (GANs) are a widely used framework for learning generative models. Wasserstein GANs (WGANs), one of the most successful...

Journal Article: Full Text Online 


E-WACGAN: Enhanced Generative Model of Signaling Data Based on WGAN-GP and ACGAN

Q Jin, R Lin, F Yang - IEEE Systems Journal, 2019 - ieeexplore.ieee.org

In recent years, the generative adversarial network (GAN) has achieved outstanding performance in the image field and the derivatives of GAN, namely auxiliary classifier GAN (ACGAN) and Wasserstein GAN with gradient penalty (WGAN-GP) have also been widely …

  Cited by 1 Related articles All 2 versions
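For reference on the WGAN-GP term this entry mentions, below is a minimal PyTorch-style sketch of the gradient penalty (an assumed illustration, not the cited system's code): the critic's gradient norm at random interpolates of real and generated samples is pushed toward 1.

```python
# Minimal sketch of the WGAN-GP gradient penalty (illustration only, not the
# cited system's code). `critic` maps a batch of samples to one score each;
# `real` and `fake` are assumed to be tensors with matching shapes.
import torch

def gradient_penalty(critic, real, fake):
    # One random mixing weight per sample, broadcast over the remaining dims.
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads = torch.autograd.grad(outputs=scores.sum(), inputs=interp,
                                create_graph=True)[0]
    # Penalize deviation of the per-sample gradient norm from 1 (Lipschitz target).
    grad_norm = grads.reshape(grads.size(0), -1).norm(2, dim=1)
    return ((grad_norm - 1.0) ** 2).mean()
```

This penalty is typically added to the critic loss with a weight (often denoted lambda) of around 10, per the WGAN-GP literature.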


2019

Conditional WGANs with Adaptive Gradient Balancing for Sparse MRI Reconstruction

I Malkiel, S Ahn, V Taviani, A Menini, L Wolf… - arXiv preprint arXiv …, 2019 - arxiv.org

Recent sparse MRI reconstruction models have used Deep Neural Networks (DNNs) to reconstruct relatively high-quality images from highly undersampled k-space data, enabling much faster MRI scanning. However, these techniques sometimes struggle to reconstruct sharp images that preserve fine detail while maintaining a natural appearance. In this work, we enhance the image quality by using a Conditional Wasserstein Generative Adversarial Network combined with a novel Adaptive Gradient Balancing technique that stabilizes the …

Cited by 4 Related articles All 2 versions

Conditional WGANs with Adaptive Gradient Balancing for Sparse MRI Reconstruction 

by Malkiel, Itzik; Ahn, Sangtae; Taviani, Valentina; More... 

05/2019

Recent sparse MRI reconstruction models have used Deep Neural Networks (DNNs) to reconstruct relatively high-quality images from highly undersampled k-space...

Journal Article: Full Text Online 


Study of Constrained Network Structures for WGANs on Numeric Data Generation

W Wang, C Wang, T Cui, Y Li - arXiv preprint arXiv:1911.01649, 2019 - arxiv.org

Some recent studies have suggested using GANs for numeric data generation such as to generate data for completing the imbalanced numeric data. Considering the significant difference between the dimensions of the numeric data and images, as well as the strong correlations between features of numeric data, the conventional GANs normally face an overfitting problem, consequently leads to an ill-conditioning problem in generating numeric and structured data. This paper studies the constrained network structures between …

Related articles 

Study of Constrained Network Structures for WGANs on Numeric Data Generation 

by Wang, Wei; Wang, Chuang; Cui, Tao; More... 

11/2019

Some recent studies have suggested using GANs for numeric data generation such as to generate data for completing the imbalanced numeric data. Considering the...

Journal Article: Full Text Online 


  

Improved image enhancement method and device based on WGA-GP and U-net, and storage medium 

by WANG HONGLING; TANG JIE; LI QINGYU 

11/2019

The invention discloses an improved image enhancement method and an improved image enhancement device based on WGAN-GP and U-net, and a storage medium. The...

Patent:   Citation Online 


一种基于WGAN-GP和过采样的不平衡学习方法 - 中南大学教师 ...

faculty.csu.edu.cn › dengxiaoheng › zlcg › content

Translate this page

An imbalanced learning method based on WGAN-GP and oversampling. Application no ... Patent Inventor: 邓晓衡、黄戎、沈海澜. Open date: 2019-05-28.

一种基于WGAN-GP和过采样的不平衡学习方法 

05/2019

本发明公开了一种基于WGAN-GP和过采样的不平衡学习方法,包括:生成器网络,由三层全连接网络组成并且每一层的输出都应用了Batch...

Patent: Citation Online 


Univ Chongqing Posts & Telecom Files Chinese Patent Application for Differential Wgan Based Network Security... 

Global IP News. Security & Protection Patent News, Oct 14, 2019

Newspaper Article: Full Text Online 

 <————2019    ——————— 2019  ————————520—


  2019 521

 WGAN-based robust occluded facial expression recognition

Y Lu, S Wang, W Zhao, Y Zhao - IEEE Access, 2019 - ieeexplore.ieee.org

Research on facial expression recognition (FER) technology can promote the development of theoretical and practical applications for our daily life. Currently, most of the related works on this technology are focused on un-occluded FER. However, in real life, facial expression …

Cited by 1 Related articles 

Engineering; Investigators from Jilin University Report New Data on Engineering (Wgan-based Robust Occluded Facial... 

Journal of Engineering, Aug 19, 2019, 995

Newspaper Article: Full Text Online 

WGAN-Based Robust Occluded Facial Expression Recognition
Author: Yang Lu
Article, 2019
Publication:IEEE access, 7, 2019, 93594
Publisher:2019


2019

Distributionally Robust Learning Under the Wasserstein Metric 

by Chen, Ruidi 

This dissertation develops a comprehensive statistical learning framework that is robust to (distributional) perturbations in the data using Distributionally...

Dissertation/ThesisFull Text Online

Cited by 1 Related articles All 3 versions

Wasserstein Generative Adversarial Network Based De-Blurring Using Perceptual Similarity

www.mdpi.com › ...

Wasserstein Generative Adversarial Network Based De-Blurring Using ... the Wasserstein distance, and it captures well the perceptual similarity using the style ...

by M Hong - ‎2019 - ‎Related articles

Abstract · ‎Share and Cite · ‎Article Metrics

Recent Findings in Applied Sciences Described by Researchers from Yonsei University (Wasserstein... 

Journal of Engineering, 07/2019

NewsletterFull Text Online 

Science - Applied Sciences; Recent Findings in Applied Sciences Described by Researchers from Yonsei University (Wasserstein... 

Network Weekly News, Jul 22, 2019, 1243

Newspaper ArticleFull Text Online 


[PDF] researchgate.net

Least-squares reverse time migration via linearized waveform inversion using a Wasserstein metric

P Yong, J Huang, Z Li, W Liao, L Qu - Geophysics, 2019 - library.seg.org

Least-squares reverse time migration (LSRTM), an effective tool for imaging the structures of the Earth from seismograms, can be characterized as a linearized waveform inversion problem. We have investigated the performance of three minimization functionals as the L 2 …

Related articles

Least-squares reverse time migration via linearized waveform ...

Aug 14, 2019 - Least-squares reverse time migration via linearized waveform inversion using a Wasserstein metric · Check for updates on crossmark. Authors:.

by P Yong - ‎2019 - ‎Cited by 1 - ‎Related articles

Least-squares reverse time migration via linearized waveform ...

Aug 14, 2019 - Least-squares reverse time migration via linearized waveform inversion using a Wasserstein metric. Check for updates on crossmark. Authors:.

by P Yong - ‎2019 - ‎Cited by 1 - ‎Related articles

New Geophysics Study Findings Have Been Reported by Investigators at China University of Petroleum (East China) 

(Least-squares Reverse Time Migration Via Linearized Waveform Inversion Using a Wasserstein... 

Journal of Physics Research, 11/2019

NewsletterFull Text Online 


[PDF] arxiv.org

Wasserstein Barycenter Model Ensembling

P Dognin, I Melnyk, Y Mroueh, J Ross… - arXiv preprint arXiv …, 2019 - arxiv.org

In this paper we propose to perform model ensembling in a multiclass or a multilabel 

learning setting using Wasserstein (W.) barycenters. Optimal transport metrics, such as the 

 Cited by 20 Related articles All 5 versions

2019

Lifted and Geometric Differentiability of the Squared Quadratic Wasserstein Distance

www.tse-fr.eu › seminars › 2019-lifted-and-geometric-differentiabilit...

Apr 4, 2019 - Aurélien Alfonsi (CERMICS - Ecole Nationale des Ponts et Chaussées), "Lifted and Geometric Differentiability of the Squared Quadratic Wasserstein Distance"

 

      [PDF] Group Lasso Wasserstein sans grille

P CATALA, V DUVAL, G PEYR - paulcat.github.io

We consider in this paper the problem of simultaneously recovering pointwise sources 

across several similar tasks, given some low-pass measurements. The group Lasso 

regularize this problem by enforcing a common sparse support to the solutions in each task …

Commande Optimale dans les Espaces de Wasserstein

B Bonnet - 2019 - theses.fr

… Commande Optimale dans les Espaces de Wasserstein. par Benoit Bonnet. Thèse de doctorat en Automatique. La soutenance a eu lieu le 28-10-2019 … Titre traduit : Optimal Control in Wasserstein Spaces. Résumé …

[CITATION] Optimal Control in Wasserstein Spaces.(Commande Optimal dans les Espaces de Wasserstein).

B Bonnet - 2019 - Aix-Marseille University, France

www.researchgate.net › publication › 337311998_Optima...

Commande Optimal dans les Espaces de Wasserstein. Abstract. Une vaste quantité d'outils mathématiques permettant la modélisation et l'analyse des .


基于堆栈 Wasserstein 自编码器与混合生成对抗网络的高光谱图像分类研究

叶少晖 - 2019 - cdmd.cnki.com.cn

高光谱遥感是一种典型的对地观测技术, 在提升光谱分辨率的同时包含了更多的空间信息, 

分类识别技术作为高光谱图像处理中的核心技术之一, 可用于地质矿产, 水资源管理, 

军事等多个领域. 如何提取高光谱图像的高级特征, 建立小样本下鲁棒的分类模型 …

[Chinese  Research on Hyperspectral Image Classification Based on Stacked Wasserstein Autoencoder and Hybrid Generative Adversarial Network]


[PDF] Conditional WGAN for grasp generation.

F Patzelt, R Haschke, HJ Ritter - ESANN, 2019 - elen.ucl.ac.be

This work proposes a new approach to robotic grasping exploiting conditional Wasserstein 

generative adversarial networks (WGANs), which output promising grasp candidates from 

depth image inputs. In contrast to discriminative models, the WGAN approach enables …

Related articles All 2 versions
<—— 2019  ———2019 ——— 530— 


   [PDF] ucl.ac.be

  Super-Resolution Algorithm of Satellite Cloud Image Based on WGAN-GP

YY Luo, HG Lu, N Jia - 2019 International Conference on …, 2019 - ieeexplore.ieee.org

The resolution of an image is an important indicator for measuring image quality. The higher 

the resolution, the more detailed information is contained in the image, which is more 

conducive to subsequent image analysis and other tasks. Improving the resolution of images …

Related articles


[PDF] openreview.net

iWGAN: an Autoencoder WGAN for Inference

Y Chen, Q Gao, X Wang - 2019 - openreview.net

Generative Adversarial Networks (GANs) have been impactful on many problems and 

applications but suffer from unstable training. Wasserstein GAN (WGAN) leverages the 

Wasserstein distance to avoid the caveats in the minmax two-player training of GANs but …

Related articles



[PDF] semanticscholar.org

[PDF] Using WGAN for Improving Imbalanced Classification Performance.

S Bhatia, R Dahyot - AICS, 2019 - pdfs.semanticscholar.org

This paper investigates data synthesis with a Generative Adversarial Network (GAN) for 

augmenting the amount of data used for training classifiers (in supervised learning) to 

compensate for class imbalance (when the classes are not represented equally by the same …

Cited by 1 Related articles All 4 versions 


PDF] 结合 FC-DenseNet WGAN 的图像去雾算法

孙斌, 雎青青, 桑庆兵 - 计算机科学与探索, 2019 - fcst.ceaj.org

针对现有图像去雾算法严重依赖中间量准确估计的问题, 提出了一种基于Wasserstein 

生成对抗网络(WGAN) 的端到端图像去雾模型. 首先, 使用全卷积密集块网络(FC-DenseNet) 

充分学习图像中雾的特征; 其次, 采用残差学习思想直接从退化图像中学习到清晰图像的特征 …

Related articles All 2 versions

 [Chinese  mage defogging algorithm combined with FC-DenseNet and WGAN]



[HTML]

基于差分 WGAN 的网络安全态势预测

王婷婷, 朱江 - 计算机科学, 2019 - cnki.com.cn

文中提出了一种基于差分WGAN (Wasserstein-GAN) 的网络安全态势预测机制, 

该机制利用生成对抗网络(Generative Adversarial Network, GAN) 来模拟态势的发展过程, 

从时间维度实现态势预测. 为了解决GAN 具有的网络难以训练, collapse mode …

Related articles All 3 versions

[CITATION] 基于差分 WGAN 的网络安全态势预测 (Network Security Situation Forecast Based on Differential WGAN).

T Wang, J Zhu - 计算机科学, 2019

基于差分 WGAN 的网络安全态势预测

王婷婷, 朱江 - 计算机科学, 2019 - cnki.com.cn

文中提出了一种基于差分WGAN (Wasserstein-GAN) 的网络安全态势预测机制, 该机制利用生成对抗网络(Generative Adversarial Network, GAN) 来模拟态势的发展过程, 从时间维度实现态势预测. 为了解决GAN 具有的网络难以训练, collapse mode 及梯度不稳定的问题, 提出了利用Wasserstein 距离作为GAN 的损失函数, 并采用在损失函数中添加差分项的方法来提高态势值的分类精度, 同时还证明了差分WGAN 网络的稳定度. 实验结果与分析表明, 该机制相比其他机制而言, 在收敛性, 预测精度和复杂度方面具有优势.

Related articles 

Network Security Situation Forecast Based on Differential WGAN

0 citations* 

2019 Jisuanji Kexue 

Wang Tingting , Zhu Jiang



[CITATION] Conditional WGAN-GP 이용한 Few-Shot 이미지 생성

나상혁, 김준태 - 한국정보과학회 학술발표논문집, 2019 - dbpia.co.kr

약최근에 생성적 적대 신경망 (generate adversarial nets) 활용한 다양한 연구 개발이 

이루어지고 있다. 생성적 적대 신경망은 생성자, 판별자 신경망이 각각 적대적 학습하여 실제 

데이터와 유사한 데이터를생성하는 방법이다. 그러나 다른 딥러닝 분야와 마찬가지로 학습을 …

Related articles

[Korean   Few-Shot image generation using Conditional WGAN-GP] 


2019

Arterial Spin Labeling Images Synthesis via Locally-Constrained WGAN-GP Ensemble 

By: Huang, Wei; Luo, Mingyuan; Liu, Xi; et al.

Conference: 10th International Workshop on Machine Learning in Medical Imaging (MLMI) / 22nd International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI) Location: ‏ Shenzhen, PEOPLES R CHINA Date: ‏ OCT 13-17, 2019 

MEDICAL IMAGE COMPUTING AND COMPUTER ASSISTED INTERVENTION - MICCAI 2019, PT IV   Book Series: ‏ Lecture Notes in Computer Science   Volume: ‏ 11767   Pages: ‏ 768-776   Published: ‏ 2019

2019

An Outlier Detection Approach Based on WGAN-Empowered Deep Autoencoder 

By: Huang, Yunxin; Xu, Hongzuo; Wang, Xiaodong; et al.

Conference: IEEE 9th International Conference on Electronics Information and Emergency Communication (ICEIEC) Location: ‏ Beijing, PEOPLES R CHINA Date: ‏ JUL 12-14, 2019 

Sponsor(s): ‏Inst Elect & Elect Engineers; IEEE Beijing Sect 

PROCEEDINGS OF 2019 IEEE 9TH INTERNATIONAL CONFERENCE ON ELECTRONICS INFORMATION AND EMERGENCY COMMUNICATION (ICEIEC 2019)   Book Series: ‏ IEEE International Conference on Electronics Information and Emergency Communication   Pages: ‏ 534-537   Published: ‏ 2019 

An Outlier Detection Approach Based on WGAN-Empowered Deep Autoencoder

Y Huang, H Xu, X Wang, Z Wu - 2019 IEEE 9th International …, 2019 - ieeexplore.ieee.org

Modelling normal data is one of the major challenges in outlier detection. Deep learning has been proven to be effective in modelling underlying distributions of input training data. However, the existing deep learning-based methods normally focus on how to alleviate the …


2019 [PDF] cyberleninka.ru

Применение метрики Вассерштейна для решения обратной динамической задачи сейсмики

АА Василенко - Интерэкспо Гео-Сибирь, 2019 - cyberleninka.ru

… В данной работе предлагается использовать метрику Вассерштейна для построения

… с использованием метрики Вассерштейна и L2-нормы скоростных моделей. …

Related articles All 5 versions 

Semi-supervised Multimodal Emotion Recognition with Improved Wasserstein GANs

J Liang, S Chen, Q Jin - 2019 Asia-Pacific Signal and …, 2019 - ieeexplore.ieee.org

Automatic emotion recognition has faced the challenge of lacking large-scale human 

labeled dataset for model learning due to the expensive data annotation cost and inevitable 

label ambiguity. To tackle such challenge, previous works have explored to transfer emotion …

  Cited by 3 Related articles All 2 versions

[CITATION] … Multimodal Emotion Recognition with Improved Wasserstein GANs. In 2019 Asia-Pacific Signal and Information Processing Association Annual Summit …

J Liang, S Chen, Q Jin - 2019 - IEEE

Cited by 3 Related articles

 <—— 2019  ——2019 ————— 540—  

 

 
Problems and Advances of Wasserstein GAN - ICERM Generative ...

stanniszhou.github.io/discussion-group/post/wgan 

Problems and Advances of Wasserstein GAN - ICERM Generative Models Discussion Group. Introduction Since Generative Adversarial Nets(GAN)([1]) was proposed in 2014, there have been a lot of researches on and applications of GAN([2,3]). However the generative and discriminative models were studied before the GAN was proposed([4]). 

[HTML] Problems and Advances of Wasserstein GAN

GAN Wasserstein - stanniszhou.github.io

Since Generative Adversarial Nets (GAN)([1]) was proposed in 2014, there have been a lot of researches on and applications of GAN ([2, 3]). However the generative and discriminative models were studied before the GAN was proposed ([4]). Some problems of …


 2019

Multiple-Operation Image Anti-Forensics with WGAN-GP Framework 

By: Wu, Jianyuan; Wang, Zheng; Zeng, Hui; et al.

Conference: Annual Summit and Conference of the Asia-Pacific-Signal-and-Information-Processing-Association (APSIPA ASC) Location: ‏ Lanzhou, PEOPLES R CHINA Date: ‏ NOV 18-21, 2019 

Sponsor(s): ‏Asia Pacific Signal & Informat Proc Assoc 

2019 ASIA-PACIFIC SIGNAL AND INFORMATION PROCESSING ASSOCIATION ANNUAL SUMMIT AND CONFERENCE (APSIPA ASC)   Book Series: ‏ Asia-Pacific Signal and Information Processing Association Annual Summit and Conference   Pages: ‏ 1303-1307   Published: ‏ 2019 


2019

Uncoupled isotonic regression via minimum Wasserstein deconvolution 

By: Rigollet, Philippe; Weed, Jonathan 

INFORMATION AND INFERENCE-A JOURNAL OF THE IMA   Volume: 8   Issue: 4   Pages: 691-717   Published: DEC 2019


2019 


Data-driven regularization of Wasserstein barycenters with an application to multivariate density registration 

By: Bigot, Jeremie; Cazelles, Elsa; Papadakis, Nicolas

INFORMATION AND INFERENCE-A JOURNAL OF THE IMA   Volume: 8   Issue: 4   Pages: 719-755   Published: DEC 2019

Cited by 19 Related articles All 8 versions

The Gromov-Wasserstein distance between networks and stable network invariants 

By: Chowdhury, Samir; Memoli, Facundo 

INFORMATION AND INFERENCE-A JOURNAL OF THE IMA   Volume: 8   Issue: 4   Pages: 757-787   Published: DEC 2019



Modeling the Biological Pathology Continuum with HSIC-regularized Wasserstein Auto-encoders
Authors:Wu, Denny (Creator), Kobayashi, Hirofumi (Creator), Ding, Charles (Creator), Cheng, Lei (Creator), Goda, Keisuke (Creator), Ghassemi, Marzyeh (Creator)
Summary:A crucial challenge in image-based modeling of biomedical data is to identify trends and features that separate normality and pathology. In many cases, the morphology of the imaged object exhibits continuous change as it deviates from normality, and thus a generative model can be trained to model this morphological continuum. Moreover, given side information that correlates to certain trend in morphological change, a latent variable model can be regularized such that its latent representation reflects this side information. In this work, we use the Wasserstein Auto-encoder to model this pathology continuum, and apply the Hilbert-Schmidt Independence Criterion (HSIC) to enforce dependency between certain latent features and the provided side information. We experimentally show that the model can provide disentangled and interpretable latent representations and also generate a continuum of morphological changes that corresponds to change in the side information.
Downloadable Archival Material, 2019-01-19
Undefined
Publisher:2019-01-19


Music Classification using Multiclass Support Vector Machine and Multilevel Wasserstein Means

J Wei, C Jin, Z Cheng, X Lv… - 2019 IEEE/ACIS 18th …, 2019 - ieeexplore.ieee.org

Music classification is a challenging task in music information retrieval. In this article, we

compare the performance of the two types of models. The first category is classified by

Support Vector Machine (SVM). We use the feature extraction from audio as the basis of …

  Related articles All 2 versions

<––2019  —- 2019————————550—— 


[PDF] arxiv.org

On the total variation Wasserstein gradient flow and the TV-JKO scheme

G Carlier, C Poon - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

We study the JKO scheme for the total variation, characterize the optimizers, prove some of their qualitative properties (in particular a form of maximum principle and in some cases, a minimum principle as well). Finally, we establish a convergence result as the time step goes …

Cited by 6 Related articles All 4 versions    

On the total variation Wasserstein gradient flow and the TV-JKO scheme 

By: Carlier, Guillaume; Poon, Clarice 

ESAIM-CONTROL OPTIMISATION AND CALCULUS OF VARIATIONS   Volume: ‏ 25     Published: ‏ SEP 20 2019 

 Free Full Text from Publisher 

Times Cited: 1  
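For orientation, a standard statement of the JKO (Jordan-Kinderlehrer-Otto) minimizing-movement step that these entries refer to, written here as background rather than quoted from the paper: given a time step $\tau > 0$ and an energy $E$ (the total variation functional in this paper), each step solves

```latex
\rho_{k+1} \in \arg\min_{\rho} \; \frac{1}{2\tau}\, W_2^2(\rho, \rho_k) + E(\rho)
```

so the scheme is an implicit Euler discretization of the Wasserstein gradient flow of $E$.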


[PDF] Wasserstein convergence rates for random bit approximations of continuous Markov processes

S Ankirchner, T Kruse… - arXiv preprint arXiv …, 2019 - pdfs.semanticscholar.org

We determine the convergence speed of a numerical scheme for approximating one-dimensional continuous strong Markov processes. The scheme is based on the construction of certain Markov chains whose laws can be embedded into the process with a sequence of …

Wasserstein convergence rates for random bit approximations of continuous Markov processes

S Ankirchner, T Kruse, M Urusov - arXiv, 2019 - ui.adsabs.harvard.edu

We determine the convergence speed of a numerical scheme for approximating one-dimensional continuous strong Markov processes. The scheme is based on the construction of coin tossing Markov chains whose laws can be embedded into the process with a …

MR4144292 Prelim Ankirchner, Stefan; Kruse, Thomas; Urusov, Mikhail; Wasserstein convergence rates for random bit approximations of continuous Markov processes. J. Math. Anal. Appl. 493 (2021), no. 2, 124543. 60 (65)

Review PDF Clipboard Journal Article 

  Cited by 3 Related articles All 4 versions

Modeling EEG data distribution with a Wasserstein Generative Network to predict RSVP 

Nov 11, 2019 - We propose a novel Wasserstein Generative Adversarial Network with gradient penalty (WGAN-GP) to synthesize EEG data. This network ...

by S Panwar - ‎2019 - ‎Cited by 1 - ‎Related articles


Scholarly articles for 2019 IEEE International wasserstein

Multi-marginal wasserstein gan - ‎Cao - Cited by 20

Wasserstein smoothing: Certified robustness against … - ‎Levine - Cited by 9

… RSVP Experiment by a Class Conditioned Wasserstein - ‎Panwar - Cited by 5

list of papers with Wasserstein in title that appeared in 2019, 24 pages

Cited by 44 Related articles All 11 versions

Multi-marginal Wasserstein GAN - Papers With Code

paperswithcode.com › paper › review

Multi-marginal Wasserstein GAN. Multiple marginal matching problem aims at learning mappings to match a source domain to multiple target domains and it has ...

[PDF] nips.cc

Multi-marginal wasserstein gan

J Cao, L Mo, Y Zhang, K Jia, C Shen… - Advances in Neural …, 2019 - papers.nips.cc

Multiple marginal matching problem aims at learning mappings to match a source domain to 

multiple target domains and it has attracted great attention in many applications, such as 

multi-domain image translation. However, addressing this problem has two critical …

Cited by 20 Related articles All 5 versions 


Peer-reviewed
Unsupervised Feature Extraction in Hyperspectral Images Based on Wasserstein Generative Adversarial Network

Authors: Zhang M.; Gong M.; Mao Y.; Li J.; Wu Y.
Article, 2019
Publication:IEEE Transactions on Geoscience and Remote Sensing, 57, 2019 05 01, 2669
Publisher:2019


  

Predictive Density Estimation Under the Wasserstein Loss

2019 · ‎No preview

Publisher: Department of Mathematical Informatics, Graduate School of Information Science and Technology, the University of Tokyo

"Low rank approximation of Wasserstein distance kernel for ...

Mar 18, 2019 - Interdisciplinary Center for Cyber Security and Cyber Defense of Critical ... Tel Aviv University School of Computer Science, Tel Aviv University.

 

[PDF] ntu.edu.sg

Poisson discretizations of Wiener functionals and Malliavin operators with Wasserstein estimates

N Privault, SCP Yam, Z Zhang - Stochastic Processes and their …, 2019 - Elsevier

This article proposes a global, chaos-based procedure for the discretization of functionals of

Brownian motion into functionals of a Poisson process with intensity λ> 0. Under this

discretization we study the weak convergence, as the intensity of the underlying Poisson  …

  Related articles All 7 versions

 

Curvature of the Manifold of Fixed-Rank Positive-Semidefinite Matrices Endowed with the Bures-Wasserstein Metric
Authors: Massart E.; Hendrickx J.M.; Absil P.-A.
4th International Conference on Geometric Science of Information, GSI 2019
Article, 2019
Publication:Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 11712 LNCS, 2019, 739
Publisher:2019


High Performance WGAN-GP based Multiple-category Network Anomaly Classification System

JT Wang, CH Wang - 2019 International Conference on Cyber …, 2019 - ieeexplore.ieee.org

Due to the increasing of smart devices, the detection of anomalous traffic on Internet is getting more essential. Many previous intrusion detection studies which focused on the classification between normal or anomaly events can be used to enhance the system security by launching alarms as the intrusions being detected. Although many intrusion detection systems which has been developed can achieve high detection rates, they are still difficult to perform well on some attacks that have never been seen before. In this paper, the …

Related articles 

High Performance WGAN-GP based Multiple-category Network Anomaly Classification System 

by Wang, Jing-Tong; Wang, Chih-Hung 

2019 International Conference on Cyber Security for Emerging Technologies (CSET), 10/2019

Due to the increasing of smart devices, the detection of anomalous traffic on Internet is getting more essential. Many previous intrusion detection studies...

Conference Proceeding: Full Text Online  

High Performance WGAN-GP based Multiple-category Network Anomaly Classification System

0 citations* 

2019 2019 International Conference on Cyber Security for Emerging Technologies (CSET) 

Jing-Tong Wang , Chih-Hung Wang 

National Chiayi University

<——2019  ———— 2019———————560——


Sketch-photo conversion method based on WGAN-GP and U-NET

0 citations* 

2019 Wang Shigang , Min Jiayuan , Wei Jian , 

Zhao Yan


Hyperspectral image classification method based on semi-supervised WGAN-GP

0 citations* 

2019 Bai Jing, Zhang Jingsen, Zhang Fan, Li Xiaohan, Yang Weijie

see all 6 authors 


An unbalanced learning method based on WGAN-GP and oversampling

0 citations* 

2019 Deng Xiaoheng , Huang Rong , Shen Hailan


An unbalanced learning method based on WGAN-GP and oversampling

0 citations* 

2019 Deng Xiaoheng , Huang Rong , Shen Hailan 

The invention discloses an unbalanced learning method based on WGAN-GP and oversampling. The method includes a generator network which is composed of three layers of fully-connected networks, wherein Batch Normalization (BN) is applied to the output of each layer to prevent...


A feature recalibration convolution method based on WGAN model

0 citations* 

2019 Zhou Zhiheng , Li Lijun 


The invention discloses a feature recalibration convolution method based on a WGAN model, and belongs to the field of deep learning neural networks. The method comprises the following steps: S1, constructing an original generative adversarial network model; S2, constructing the Wasserstein d...


2019

Differential WGAN based network security situation prediction method

0 citations* 

2019 Wang Yong , Wang Tingting , Zhu Jiang 


The invention provides a differential WGAN based network security situation prediction method. The GAN (generative adversarial network) is used to simulate the development process of the situation, and the situation is predicted in the time dimension. The Wasserstein distance is used as the GAN's loss function to address the problems that the GAN is hard to train and unstable with respect to mode collapse and gradients, and a differential term is added to the loss function to improve the classification precision of the situation value. The stability of the differential WGAN network is proved. Experimental results and analysis show that the mechanism has advantages in convergence, prediction precision and complexity compared with other mechanisms.

Wasserstein CNN: Learning Invariant Features for NIR-VIS Face Recognition

85 citations* for all 

77 citations* 

2019 IEEE Transactions on Pattern Analysis and Machine Intelligence

Ran He , Xiang Wu , Zhenan Sun , Tieniu Tan 

Chinese Academy of Sciences

Wasserstein CNN: Learning Invariant Features for NIR-VIS Face Recognition

R He, X Wu, Z Sun, T Tan - IEEE Transactions on Pattern Analysis & …, 2019 - computer.org

Heterogeneous face recognition (HFR) aims at matching facial images acquired from

different sensing modalities with mission-critical applications in forensics, security and

commercial sectors. However, HFR presents more challenging issues than traditional face …


  [PDF] opticsjournal.net

[PDF] 基于改进 WGAN-GP 的多波段图像同步超分与融合方法

田嵩旺, 蔺素珍, 雷海卫, 李大威, 王丽芳 - 光学学报, 2020 - opticsjournal.net

摘要针对低分辨率源图像的融合结果质量低下不利于后续目标提取的问题, 

提出一种基于梯度惩罚Wasserstein 生成对抗网络(WGAN-GP) 的多波段图像同步超分与融合

方法. 首先, 基于双三次插值法将多波段低分辨率源图像分别放大至目标尺寸; 其次 …

All 2 versions 

[Chinese  Multi-band image simultaneous super-resolution and fusion method based on improved WGAN-GP]



基于WGAN的语音增强算法研究 

by 王怡斐 (WANG Yifei); 韩俊刚 (HAN Jungang); 樊良辉 (FAN Lianghui) 

Chongqing you dian xue yuan xue bao. Zi ran ke xue ban, 2019, Volume 31, Issue 1

TP391.4; 带噪语音可看成由独立的噪声信号和语音信号经某种方式混合而成, 传统语音增强方法需要对噪声信号和干净语音信号的独立性和特征分布做出假设, 不合理的假设会造成噪声残留、语音失真等问题, 导致语音增强效果不佳.此外, 噪声本身的随机性和突变性也会影响传统语音增强方法的鲁棒性.针对这些问题,...

Journal Article: Citation Online. Algorithm research of speech enhancement based on WGAN

[Chinese  Research on speech enhancement algorithm based on WGAN]

Conditional WGAN-GP 이용한 Few-Shot 이미지 생성 

by 나상혁(Sanghyuck Na); 김준태(Juntae Kim) 

한국정보과학회 학술발표논문집, 2019, Volume 2019, Issue 12

Journal ArticleFull Text Online 

[Korean  Few-Shot image generation using Conditional WGAN-GP[

Preview 

<––2019  —- —2019———————570——  


基于差分WGAN的网络安全态势预测 

by 王婷婷 (WANG Ting-ting); 朱江 (ZHU Jiang)

Ji suan ji ke xue, 2019, Volume 46, Issue z2

TN918.1; 文中提出了一种基于差分WGAN(Wasserstein-GAN)的网络安全态势预测机制,该机制利用生成对抗网络(Generative Adversarial Network,GAN)来模拟态势的发展过程,从时间维度实现态势预测.为了解决GAN具有的网络难以训练、collapse...

Journal ArticleCitation Online 

[Chinese  Network security situation prediction based on differential WGAN]


 

Open Access 

A feature recalibration convolution method based on WGAN model 

by LI LIJUN; ZHOU ZHIHENG 

02/2019

The invention discloses a feature recalibration convolution method based on a WGAN model, and belongsto the field of depth learning neural network. The method...

PatentCitation Online 

Univ South China Tech Submits Patent Application for a Feature Recalibration Convolution Method Based on WGAN Model 

Global IP News: Broadband and Wireless Network Patent News, Aug 31, 2020

Newspaper ArticleCitation Online 

Univ South China Tech Submits Patent Application for a Feature Recalibration Convolution Method Based on WGAN Model 

Global IP News. Broadband and Wireless Network News, Aug 31, 2020

Newspaper ArticleFull Text Online

Open Access 

Differential WGAN based network security situation prediction method 

by ZHU JIANG; WANG TINGTING; WANG YONG 

01/2019

The invention provides a differential WGAN based network security situation prediction method. The GAN (Generative adversarial network) is used to simulate the...

PatentCitation Online 

Univ Chongqing Posts & Telecom Files Chinese Patent Application for Differential Wgan Based Network Security Situation Prediction Method 

Global IP News. Security & Protection Patent News, Oct 14, 2019

Newspaper ArticleFull Text Online 

Univ Chongqing Posts & Telecom Files Chinese Patent Application for Differential Wgan Based Network Security Situation Prediction Method 

Global IP News: Security & Protection Patent News, Oct 14, 2019

Newspaper ArticleCitation Online 


Open Access 

基于半监督WGAN-GP的高光谱图像分类方法 

02/2019

PatentCitation Online 

[Chinese  Hyperspectral image classification method based on semi-supervised WGAN-GP]


 2019

Engineering - Knowledge Engineering; New Findings in Knowledge Engineering Described from Xi'an Jiaotong University (Lp-WGAN: Using Lp-norm... 

Journal of robotics & machine learning, Jan 7, 2019, 148

Newspaper ArticleCitation Online 


2019


  2019

Signal Processing; Study Results from Beijing Normal University Provide New Insights into Signal Processing (Feature Augmentation for Imbalanced Classification With Conditional Mixture Wgans... 

Electronics newsweekly, Jul 9, 2019, 962

Newspaper ArticleCitation Online 


Optimal Transport and Wasserstein Distance 1 Introduction

2019  Larry Wasserman   CMU 

The Wasserstein distance — which arises from the idea of optimal transport — is being used more and more in Statistics and Machine Learning. In these notes we review some of the basics about this topic. Two good references for this topic are: …
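As a concrete companion to such introductory notes (my own illustration, not taken from them), the following sketch checks the standard one-dimensional fact that, for two samples of equal size, the W_1 distance between their empirical distributions is the mean absolute difference of the sorted samples, and compares it with SciPy's built-in routine.

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=1000)
y = rng.normal(0.5, 1.2, size=1000)

w1_sorted = np.mean(np.abs(np.sort(x) - np.sort(y)))  # quantile formula, equal sample sizes
w1_scipy = wasserstein_distance(x, y)                  # general 1-D W_1
print(w1_sorted, w1_scipy)                             # the two agree up to round-off
```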

Semantic Image Inpainting through Improved Wasserstein Generative Adversarial Networks 

By: Vitoria, Patricia; Sintes, Joan; Ballester, Coloma 

Conference: 14th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISAPP) Location: ‏ Prague, CZECH REPUBLIC Date: ‏ FEB 25-27, 2019 

VISAPP: PROCEEDINGS OF THE 14TH INTERNATIONAL JOINT CONFERENCE ON COMPUTER VISION, IMAGING AND COMPUTER GRAPHICS THEORY AND APPLICATIONS, VOL 4  Pages: ‏ 249-260   Published: ‏ 2019 


arXiv:1910.03993  [pdf, other]  q-fin.MF
Distributionally Robust XVA via Wasserstein Distance Part 2: Wrong Way Funding Risk
Authors: Derek Singh, Shuzhong Zhang
Abstract: This paper investigates calculations of robust funding valuation adjustment (FVA) for over the counter (OTC) derivatives under distributional uncertainty using Wasserstein distance as the ambiguity measure. Wrong way funding risk can be characterized via the robust FVA formulation. The simpler dual formulation of the robust FVA optimization is derived. Next, some computational experiments are cond…
More
Submitted 9 October, 2019; originally announced October 2019. 

[PDF] openreview.net

iWGAN: an Autoencoder WGAN for Inference

Y Chen, Q Gao, X Wang - 2019 - openreview.net

Generative Adversarial Networks (GANs) have been impactful on many problems and

applications but suffer from unstable training. Wasserstein GAN (WGAN) leverages the

Wasserstein distance to avoid the caveats in the minmax two-player training of GANs but …

  Related articles 
[CITATION] Distributionally robust xva via wasserstein distance part 1

D Singh, S Zhang - arXiv preprint arXiv:1910.01781, 2019

Cited by 3 Related articles



[PDF] semanticscholar.org

[PDF] Using WGAN for Improving Imbalanced Classification Performance.

S Bhatia, R Dahyot - AICS, 2019 - pdfs.semanticscholar.org

This paper investigates data synthesis with a Generative Adversarial Network (GAN) for

augmenting the amount of data used for training classifiers (in supervised learning) to

compensate for class imbalance (when the classes are not represented equally by the same …

  Cited by 7 Related articles All 4 versions 



<––2019 –—–2019—————580——  






 Slot based Image Captioning with WGAN

Z Xue, L Wang, P Guo - … IEEE/ACIS 18th International Conference on …, 2019 - computer.org

Existing image captioning methods are always limited to the rules of words or syntax with

single sentence and poor words. This paper introduces a novel framework for

image captioning tasks which reconciles slot filling approaches with neural network …

  Related articles All 2 versions


High Performance WGAN-GP based Multiple-category Network Anomaly Classification System

JT Wang, CH Wang - 2019 International Conference on Cyber …, 2019 - ieeexplore.ieee.org

Due to the increasing of smart devices, the detection of anomalous traffic on Internet is

getting more essential. Many previous intrusion detection studies which focused on the

classification between normal or anomaly events can be used to enhance the system …

 Cited by 5 Related articles

An Outlier Detection Approach Based on WGAN-Empowered Deep Autoencoder

Y Huang, H Xu, X Wang, Z Wu - 2019 IEEE 9th International …, 2019 - ieeexplore.ieee.org

Modelling normal data is one of the major challenges in outlier detection. Deep learning has

been proven to be effective in modelling underlying distributions of input training data.

However, the existing deep learning-based methods normally focus on how to alleviate the …

Cited by 1 Related articles

  

[PDF] ucl.ac.be

[PDF] Conditional WGAN for grasp generation.

F Patzelt, R Haschke, HJ Ritter - ESANN, 2019 - elen.ucl.ac.be

This work proposes a new approach to robotic grasping exploiting conditional Wasserstein

generative adversarial networks (WGANs), which output promising grasp candidates from

depth image inputs. In contrast to discriminative models, the WGAN approach enables …

  Cited by 3 Related articles 


Network Security Situation Prediction Based on Improved WGAN

J Zhu, T Wang - International Conference on Simulation Tools and …, 2019 - Springer

The current network attacks on the network have become very complex. As the highest level

of network security situational awareness, situation prediction provides effective information

for network administrators to develop security protection strategies. The generative …

  Related articles

Cited by 1 Related articles


2019

Low-Dose CT Image Denoising Based on Improved WGAN-gp

X Li, C Ye, Y Yan, Z Du - Journal of New Media, 2019 - search.proquest.com

In order to improve the quality of low-dose computational tomography (CT) images, the

paper proposes an improved image denoising approach based on WGAN-gp with

Wasserstein distance. For improving the training and the convergence efficiency, the given …

Cited by 3 Related articles

[PDF] apsipa.org

Multiple-Operation Image Anti-Forensics with WGAN-GP Framework

J Wu, Z Wang, H Zeng, X Kang - 2019 Asia-Pacific Signal and …, 2019 - ieeexplore.ieee.org

A challenging task in the field of multimedia security involves concealing or eliminating the

traces left by a chain of multiple manipulating operations, ie, multiple-operation anti-

forensics in short. However, the existing anti-forensic works concentrate on one specific …

Cited by 3 Related articles All 2 versions

Novel Bi-directional Images Synthesis Based on WGAN-GP with GMM-Based Noise Generation

W Huang, M Luo, X Liu, P Zhang, H Ding… - International Workshop on …, 2019 - Springer

Abstract A novel WGAN-GP-based model is proposed in this study to fulfill bi-directional

synthesis of medical images for the first time. GMM-based noise generated from the Glow

model is newly incorporated into the WGAN-GP-based model to better reflect the …

 Cited by 2 Related articles All 3 versions

 

Generation of Network Traffic Using WGAN-GP and a DFT Filter for Resolving Data Imbalance

WH Lee, BN Noh, YS Kim, KM Jeong - International Conference on …, 2019 - Springer

The intrinsic features of Internet networks lead to imbalanced class distributions when datasets are formed, a phenomenon called Class Imbalance that is attracting increasing attention in many research fields. In spite of performance losses due to Class …

Cited by 2 Related articles All 2 versions

Using Wasserstein Generative Adversarial Networks for the Design of Monte Carlo Simulations


[PDF] jst.go.jp

Wasserstein Autoencoder を用いた画像スタイル変換

中田秀基, 麻生英樹 - 人工知能学会全国大会論文集 一般社団法人 …, 2019 - jstage.jst.go.jp

Abstract: This paper proposes image style transfer using a Wasserstein Autoencoder. Image style transfer is a technique for rendering arbitrary content in an arbitrary style by applying a style extracted from a style image to a content image. Style transfer has been widely studied …

  All 2 versions

[Japanese  Image style transfer using a Wasserstein Autoencoder]

<––2019  —- 2019———————590——


基于 Wasserstein 距离分层注意力模型的跨域情感分类

杜永萍, 贺萌, 赵晓铮 - 模式识别与人工智能, 2019 - cqvip.com

The cross-domain sentiment classification task aims to use source-domain data with known sentiment labels to perform sentiment analysis on a target domain that lacks labeled data. This paper proposes a hierarchical attention model based on the Wasserstein distance: combined with an attention mechanism, a hierarchical model is used for feature extraction, the Wasserstein distance serves as the measure of domain discrepancy, and adversarial training automatically …

  Related articles All 3 versions

[Chinese  Cross-domain sentiment classification based on a Wasserstein distance hierarchical attention model]


基于 Wasserstein 生成对抗网络的遥感图像去模糊研究

刘晨旭 - 2019 - cdmd.cnki.com.cn

Remote sensing is an important means of Earth observation, and much key information extracted from remote sensing images has been widely applied in reconnaissance, monitoring, prevention and early warning. During remote sensing imaging, image blur caused by long shooting distances, fast scanning speeds, external light interference, atmospheric turbulence and wide-swath imaging greatly reduces …

[Chinese  Research on remote sensing image deblurring based on Wasserstein Generative Adversarial Network]

  

基于 Wasserstein GAN 的文档表示模型

马永军, 李亚军, 汪睿, 陈海山 - 计算机工程与科学, 2019 - airitilibrary.com

A document representation model converts unstructured text data into structured data and is the basis of many natural language processing tasks, yet current word-based models have the drawback that they cannot directly represent documents. To address this problem, and since a generative adversarial network (GAN) can use two neural networks for adversarial learning and thus learn the original data distribution well, …

  All 2 versions

[Chinese  Document representation model based on Wasserstein GAN]


基于 Wasserstein GAN 的新一代人工智能小样本数据增强方法——以生物领域癌症分期数据为例

刘宇飞, 周源, 刘欣, 董放, 王畅, 王子鸿 - Engineering, 2019 - cnki.com.cn

Deep learning algorithms built on big data are of great significance in driving the rapid development of new-generation artificial intelligence. However, the effective use of deep learning depends heavily on the number of labeled samples, which restricts its application in small-sample settings. This study proposes a generative adversarial network (GAN)-based …

  Related articles 

[Chinese  A new-generation artificial intelligence small-sample data augmentation method based on Wasserstein GAN, with cancer staging data in biology as an example]

[CITATION] A new generation of artificial intelligence small sample data augmentation method based on Wasserstein Gan: a case study of cancer staging data in …

YF Liu, Y Zhou, X Liu - Engineering, 2019

Cited by 2

令人拍案叫绝的Wasserstein GAN - 知乎

zhuanlan.zhihu.com › ...


Apr 20, 2017 — Follow-up to this post: the latest progress on the Wasserstein GAN, from weight clipping to gradient penalty, a more advanced way of enforcing the Lipschitz constraint; related GAN research is in full swing and can even ..


2019 

 令人拍案叫绝的wasserstein gan - Ein的博客| Ein Blog

ein027.github.io › 2019/12/06 › ...


Posted by Ein Blog on December 6, 2019 ... Today's protagonist, the Wasserstein GAN (hereafter WGAN), successfully accomplishes the following striking points: ... and this is the astonishing part: the author actually spent two whole papers on it, the first being 'Towards Principled Methods for …'

[CITATION] 令人拍案叫绝的 Wasserstein GAN

郑华滨 - 2017-04-02 [2018-01-20]. https://zhuanlan. zhihu. com …, 2019 - 计算机工程与应用

  Cited by 2 Related articles

[Chinese  The amazing Wasserstein GAN]


2019

Wasserstein 생산적 적대 신경망과 구조적 유사지수를 이용한 저선량 ...

https://www.eiric.or.kr › ser_view



Korean Title: Low-dose computed tomography image denoising technique using a Wasserstein generative adversarial network and the structural similarity index. English Title: ...

[CITATION] Wasserstein 생산적 적대 신경망과 구조적 유사지수를 이용한 저선량 컴퓨터 단층촬영 영상 잡음 제거 기법

이지나, 홍영택, 장영걸, 김주호, 백혜진… - 한국정보과학회 학술 …, 2019 - dbpia.co.kr

Computed tomography (CT) images are one kind of image data used for diagnosis; higher doses yield higher-quality images but can induce disease or tumors. In recent years, generative adversarial networks have achieved strong results in unsupervised image denoising research …

  Related articles

[Korean  Low-dose computed tomography image denoising technique using a Wasserstein generative adversarial network and the structural similarity index]

エントロピー正則化Wasserstein距離に基づくマルチビュー ...

ci.nii.ac.jp › naid


by 笠井裕之 · 2019 — Multi-view Wasserstein discriminant analysis based on the entropy-regularized Wasserstein distance (broadcast technology). English title: Multi-view Wasserstein discriminant analysis with entropic regularized Wasserstein distance. 笠井裕之 ...

[CITATION] エントロピー正則化 Wasserstein 距離に基づくマルチビュー Wasserstein 判別法 (放送技術)

笠井裕之 - 映像情報メディア学会技術報告= ITE technical report, 2019 - ci.nii.ac.jp


[Japanese  Multi-view Wasserstein discriminant analysis based on the entropy-regularized Wasserstein distance (broadcast technology)]


Optimal Control in Wasserstein Spaces - HAL

hal.archives-ouvertes.fr › tel-02361353 › document

PDF

Nov 13, 2019 — Subsequently, we investigate sufficient conditions for the Lipschitz-in-space regularity of mean-field optimal control. These results are generally ...

[CITATION] Optimal Control in Wasserstein Spaces.(Commande Optimal dans les Espaces de Wasserstein).

B Bonnet - 2019 - Aix-Marseille University, France

 


基于 Wasserstein 生成对抗网络的语音增强算法研究

叶帅帅 - 2019 - cdmd.cnki.com.cn

As a speech front-end processing technique, speech enhancement plays an increasingly important role in the field of artificial intelligence. Most traditional speech enhancement methods first model the noise distribution and then denoise the noisy speech according to the modeling result. However, these traditional methods have many shortcomings; for example, at low signal-to-noise ratios they often fail to achieve good denoising …

[Chinese  Research on Speech Enhancement Algorithm Based on Wasserstein Generative Adversarial Network]

<——2019–—–2019———600——     



Improved Concentration Bounds for Conditional Value-at-Risk ...

deepai.org › publication › improved-concentration-bou...

Improved Concentration Bounds for Conditional Value-at-Risk and Cumulative Prospect Theory using Wasserstein distance. 02/27/2019 ∙ by Sanjay P. Bhat, ...


Concentration of risk measures: A Wasserstein distance ...

papers.nips.cc › paper › 9347-concentration-of-risk-me...

by SP Bhat · 2019 · Cited by 10 — Known finite-sample concentration bounds for the Wasserstein distance between ... bound for the error between the true conditional value-at-risk (CVaR) of a ... and improves upon previous bounds which were either one sided, or applied only ...

[CITATION] Improved Concentration Bounds for Conditional Value-at-Risk and Cumulative Prospect Theory using Wasserstein distance.

SP Bhat, LA Prashanth - CoRR, 2019

  Cited by 1+

Cited by 1 Related articles All 4 versions


[PDF] arxiv.org

Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula

X Fang, QM Shao, L Xu - Probability Theory and Related Fields, 2019 - Springer

Stein's method has been widely used for probability approximations. However, in the multi-

dimensional setting, most of the results are for multivariate normal approximation or for test

functions with bounded second-or higher-order derivatives. For a class of multivariate …

  Cited by 21 Related articles All 7 versions

[CITATION] Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula (vol 174, pg 945, 2019)

X Fang, QM Shao, L Xu - PROBABILITY …, 2019 - … TIERGARTENSTRASSE 17, D …

Cited by 30 Related articles All 7 versions

[PDF] arxiv.org

Tree-sliced variants of wasserstein distances

T Le, M Yamada, K Fukumizu, M Cuturi - arXiv preprint arXiv:1902.00342, 2019 - arxiv.org

Optimal transport (OT) theory defines a powerful set of tools to compare probability distributions. OT suffers however from a few drawbacks, computational and statistical, which have encouraged the proposal of several regularized variants of OT in the recent …

  Cited by 19 Related articles All 5 versions 

Calendar | Dean of Students - Boston University

Jun 12, 2020 — The learning problems that are studied in this dissertation include: (i) Distributionally Robust Linear ... over a probabilistic ambiguity set characterized by the Wasserstein metric; (ii) Groupwise Wasserstein ... June 25, 2020.


 2019

[PDF] Problemas de clasificación: una perspectiva robusta con la métrica de Wasserstein

JA Acosta Melo - repositorio.uniandes.edu.co

The central objective of this work is to give context to classification problems in the cases of support vector machines and logistic regression. The central idea is to approach these problems with a robust formulation with the help of the Wasserstein metric, which …

  

  

基于堆栈 Wasserstein 自编码器与混合生成对抗网络的高光谱图像分类研究

叶少晖 - 2019 - cdmd.cnki.com.cn

Hyperspectral remote sensing is a typical Earth-observation technique that, while improving spectral resolution, also contains richer spatial information. Classification and recognition, one of the core techniques of hyperspectral image processing, can be used in geology and mineral exploration, water resource management, military applications and many other fields. How to extract high-level features from hyperspectral images and build classification models that are robust under small samples …

[Chinese  Research on hyperspectral image classification based on a stacked Wasserstein autoencoder and a hybrid generative adversarial network]

 

[PDF] imag.fr

[PDF] Méthode de couplage en distance de Wasserstein pour la théorie des valeurs extrêmes

B Bobbia, C Dombry, D Varron - toltex.imag.fr

We propose a re-reading of classical results from extreme value theory, which we study with the tools provided by optimal transport theory. In this framework, we can view the normality of the estimators as a convergence of …

  Related articles All 2 versions 


Wasserstein 거리를 활용한 분포 강건 신문가판원 모형 - DBpia

https://www.dbpia.co.kr › articleDetail



[CITATION] Wasserstein 거리를 활용한 분포 강건 신문가판원 모형

이상윤, 김현우, 문일경 - 대한산업공학회 춘계공동학술대회 논문집, 2019

  All 3 versions

[Korean  Distributionally robust newsvendor model using the Wasserstein distance]
<——2019 ———2019———— 610—


Deep generative models via explicit Wasserstein minimization

Y Chen - 2019 - ideals.illinois.edu

This thesis provides a procedure to fit generative networks to target distributions, with the

goal of a small Wasserstein distance (or other optimal transport costs). The approach is

based on two principles:(a) if the source randomness of the network is a continuous …

  Related articles All 3 versions 

  

A generalized Vaserstein symbol

T Syed - A JOURNAL OF THE K-THEORY FOUNDATION, 2019 - msp.org

In this paper, we provide a generalized construction of the Vaserstein symbol, which was

originally introduced by Andrei Suslin and Leonid Vaserstein in [Vaserstein and Suslin

1976]. We let R be a commutative ring and we let Umn (R) denote the set of unimodular …


[PDF] ntu.edu.sg

Poisson discretizations of Wiener functionals and Malliavin operators with Wasserstein estimates

N Privault, SCP Yam, Z Zhang - Stochastic Processes and their …, 2019 - Elsevier

This article proposes a global, chaos-based procedure for the discretization of functionals of

Brownian motion into functionals of a Poisson process with intensity λ> 0. Under this

discretization we study the weak convergence, as the intensity of the underlying Poisson …

  Related articles All 7 versions

[PDF] arxiv.org


Geometric convergence bounds for Markov chains in Wasserstein distance based on generalized drift and contraction conditions

Q Qin, JP Hobert - arXiv preprint arXiv:1902.02964, 2019 - arxiv.org

Let $\{X_n\}_{n=0}^\infty$ denote an ergodic Markov chain on a general state space that has stationary distribution $\pi$. This article concerns upper bounds on the $L_1$-Wasserstein distance between the distribution of $X_n$ and $\pi$. In particular, an explicit …

  Cited by 7 Related articles All 2 versions 


[PDF] rieck.me

wasserstein subsequence kernel for time series

C Bock, M Togninalli, E Ghisu… - … Conference on Data …, 2019 - ieeexplore.ieee.org

Kernel methods are a powerful approach for learning on structured data. However, as we

show in this paper, simple but common instances of the popular R-convolution kernel

framework can be meaningless when assessing the similarity of two time series through …

  Cited by 4 Related articles All 10 versions


 

[PDF] arxiv.org

Wasserstein convergence rates for random bit approximations of continuous markov processes

S Ankirchner, T Kruse, M Urusov - arXiv preprint arXiv:1903.07880, 2019 - arxiv.org

We determine the convergence speed of a numerical scheme for approximating one-

dimensional continuous strong Markov processes. The scheme is based on the construction

of coin tossing Markov chains whose laws can be embedded into the process with a …

  Cited by 3 Related articles All 3 versions 

Approximation of Discounted Minimax Markov Control Problems and Zero-Sum Markov Games Using Hausdorff and Wasserstein Distances

F Dufour, T Prieto-Rumeau - Dynamic Games and Applications, 2019 - Springer

This paper is concerned with a minimax control problem (also known as a robust Markov

decision process (MDP) or a game against nature) with general state and action spaces

under the discounted cost optimality criterion. We are interested in approximating …

  Related articles All 6 versions


[PDF] arxiv.org

A Fenchel-Moreau-Rockafellar type theorem on the Kantorovich-Wasserstein space with applications in partially observable Markov decision processes

V Laschos, K Obermayer, Y Shen, W Stannat - Journal of Mathematical …, 2019 - Elsevier

By using the fact that the space of all probability measures with finite support can be

completed in two different fashions, one generating the Arens-Eells space and another

generating the Kantorovich-Wasserstein (Wasserstein-1) space, and by exploiting the …

 Cited by 3 Related articles All 5 versions


 

Wasserstein convergence rates for random bit approximations ...

arxiv.org › math

Mar 19, 2019 — ... rates for random bit approximations of continuous Markov processes ... based on the construction of coin tossing Markov chains whose laws ...

by S Ankirchner · ‎2019 · ‎Cited by 3 · ‎Related articles

[CITATION] Wasserstein convergence rates for coin tossing approximations of continuous Markov processes

S Ankirchner, T Kruse, M Urusov - 2019

  Cited by 1


[PDF] psu.edu

Aggregated Wasserstein distance for hidden Markov models and automated morphological characterization of placenta from photos

Y Chen - 2019 - etda.libraries.psu.edu

In the past decade, fueled by the rapid advances of big data technology and machine

learning algorithms, data science has become a new paradigm of science and has more

and more emerged into its own field. At the intersection of computational methods, data …

<––2019  ———2019———— 620——

    

[PDF] psu.edu

[PDF] Subexponential upper and lower bounds in Wasserstein distance for Markov processes

A Arapostathis, G Pang, N Sandric - personal.psu.edu

In this article, relying on Foster-Lyapunov drift conditions, we establish subexponential

upper and lower bounds on the rate of convergence in the Lp-Wasserstein distance for a

class of irreducible and aperiodic Markov processes. We further discuss these results in the …

 Cited by 2 Related articles All 2 versions 

Subexponential upper and lower bounds in Wasserstein ...

arxiv.org › pdf

Subexponential upper and lower bounds in. Wasserstein distance for Markov processes. Ari Arapostathis1, Guodong Pang2 and Nikola Sandric3. 1Department ...

by A Arapostathis · ‎2019 · ‎Related articles


[PDF] arxiv.org

Random Matrix-Improved Estimation of the Wasserstein Distance between two Centered Gaussian Distributions

M Tiomoko, R Couillet - 2019 27th European Signal Processing …, 2019 - ieeexplore.ieee.org

This article proposes a method to consistently estimate functionals $\frac{1}{p}\sum_{i=1}^{p} f(\lambda_i(C_1 C_2))$ of the eigenvalues of the product of two covariance matrices $C_1, C_2 \in \mathbb{R}^{p\times p}$ based on the empirical estimates …

  Cited by 1 Related articles All 4 versions
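For context, the distance estimated in such works has a well-known closed form (a standard fact, not a claim of the cited article): for centered Gaussians with covariances $C_1$ and $C_2$,

$W_2^2\big(\mathcal{N}(0,C_1),\mathcal{N}(0,C_2)\big) = \operatorname{tr} C_1 + \operatorname{tr} C_2 - 2\operatorname{tr}\big(C_1^{1/2} C_2 C_1^{1/2}\big)^{1/2},$

which is the squared Bures–Wasserstein distance between the covariance matrices; a small numerical sketch of this formula is given after the Bhatia–Jain–Lim entry below.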


2019

[PDF] harchaoui.org

[PDF] Wasserstein Adversarial Mixture for Deep Generative Modeling and Clustering

W Harchaoui, PA Mattei, A Almansa, C Bouveyron - 2019 - harchaoui.org

Unsupervised learning, and in particular clustering, is probably the most central problem in

learning theory nowadays. This work focuses on the clustering of complex data by

introducing a deep generative approach for both modeling and clustering. The proposed …

  Cited by 1 Related articles All 3 versions 

 




[PDF] lancs.ac.uk

Reproducing-Kernel Hilbert space regression with notes on the Wasserstein Distance

S Page - 2019 - eprints.lancs.ac.uk

We study kernel least-squares estimators for the regression problem subject to a norm

constraint. We bound the squared L2 error of our estimators with respect to the covariate

distribution. We also bound the worst-case squared L2 error of our estimators with respect to …

  Related articles All 4 versions 



[PDF] arxiv.org

Alpha Procrustes metrics between positive definite operators: a unifying formulation for the Bures-Wasserstein and Log-Euclidean/Log-Hilbert-Schmidt metrics

MH Quang - arXiv preprint arXiv:1908.09275, 2019 - arxiv.org

This work presents a parametrized family of distances, namely the Alpha Procrustes

distances, on the set of symmetric, positive definite (SPD) matrices. The Alpha Procrustes

distances provide a unified formulation encompassing both the Bures-Wasserstein and Log …

  Cited by 4 Related articles All 2 versions 


2019

Approximate Bayesian computation with the Wasserstein distance

E Bernton, PE Jacob, M Gerber, CP Robert - arXiv preprint arXiv …, 2019 - arxiv.org

A growing number of generative statistical models do not permit the numerical evaluation of

their likelihood functions. Approximate Bayesian computation (ABC) has become a popular

approach to overcome this issue, in which one simulates synthetic data sets given …

  Cited by 33 Related articles All 11 versions 

[CITATION] Supplementary materials “Approximate Bayesian computation with the Wasserstein distance”

E Bernton, PE Jacob, M Gerber, CP Robert

  Related articles
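To make the idea concrete, here is a toy rejection-ABC loop that uses the 1-D Wasserstein distance as the data discrepancy, in the spirit of the cited approach; the Gaussian model, uniform prior, sample sizes and acceptance threshold are invented for the illustration and are not taken from the paper.

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(1)
y_obs = rng.normal(2.0, 1.0, size=200)        # "observed" data with unknown mean

accepted, threshold = [], 0.2
for _ in range(5000):
    theta = rng.uniform(-5.0, 5.0)            # draw a parameter from the prior
    y_sim = rng.normal(theta, 1.0, size=200)  # simulate a synthetic data set
    if wasserstein_distance(y_obs, y_sim) < threshold:
        accepted.append(theta)                # keep parameters whose data look close

print(len(accepted), np.mean(accepted))       # crude approximate posterior summary
```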


[PDF] arxiv.org

Wasserstein metric-driven Bayesian inversion with applications to signal processing

M Motamed, D Appelo - International Journal for Uncertainty …, 2019 - dl.begellhouse.com

We present a Bayesian framework based on a new exponential likelihood function driven by

the quadratic Wasserstein metric. Compared to conventional Bayesian models based on

Gaussian likelihood functions driven by the least-squares norm (L 2 norm), the new …

  Cited by 7 Related articles All 4 versions


[PDF] arxiv.org

Bridging Bayesian and minimax mean square error estimation via Wasserstein distributionally robust optimization

VA Nguyen, S Shafieezadeh-Abadeh, D Kuhn… - arXiv preprint arXiv …, 2019 - arxiv.org

We introduce a distributionally robust minimium mean square error estimation model with a

Wasserstein ambiguity set to recover an unknown signal from a noisy observation. The

proposed model can be viewed as a zero-sum game between a statistician choosing an …

  Cited by 6 Related articles All 4 versions 


<——2019——— 2019 ———- 630—

[PDF] unibocconi.it

[PDF] Bayesian model comparison based on Wasserstein distances

M Catalano, A Lijoi, I Pruenster - SIS 2019 Smart Statistics for …, 2019 - iris.unibocconi.it


 

On the Bures–Wasserstein distance between positive definite matrices

R Bhatia, T Jain, Y Lim - Expositiones Mathematicae, 2019 - Elsevier

The metric $d(A,B) = \big[\operatorname{tr} A + \operatorname{tr} B - 2\operatorname{tr}(A^{1/2} B A^{1/2})^{1/2}\big]^{1/2}$ on the manifold of $n \times n$

positive definite matrices arises in various optimisation problems, in quantum information

and in the theory of optimal transport. It is also related to Riemannian geometry. In the first …

  Cited by 88 Related articles All 5
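A minimal numerical sketch of this metric, based directly on the formula quoted above (an illustration, not code from the cited paper):

```python
import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein(A, B):
    # d(A, B)^2 = tr A + tr B - 2 tr (A^{1/2} B A^{1/2})^{1/2}
    Ah = sqrtm(A)
    cross = sqrtm(Ah @ B @ Ah)
    d2 = np.trace(A) + np.trace(B) - 2.0 * np.trace(cross).real
    return np.sqrt(max(d2, 0.0))  # clip tiny negative values caused by round-off

A = np.array([[2.0, 0.3], [0.3, 1.0]])
B = np.array([[1.0, 0.0], [0.0, 4.0]])
print(bures_wasserstein(A, B))
```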


[PDF] arxiv.org

Wasserstein information matrix

W Li, J Zhao - arXiv preprint arXiv:1910.11248, 2019 - arxiv.org

We study information matrices for statistical models by the $ L^ 2$-Wasserstein metric. We

call them Wasserstein information matrices (WIMs), which are analogs of classical Fisher

information matrices. We introduce Wasserstein score functions and study covariance …

  Cited by 8 Related articles All 5 versions 

 


A unified formulation for the Bures-Wasserstein and Log-Euclidean/Log-Hilbert-Schmidt distances between positive definite operators

HQ Minh - International Conference on Geometric Science of …, 2019 - Springer

This work presents a parametrized family of distances, namely the Alpha Procrustes

distances, on the set of symmetric, positive definite (SPD) matrices. The Alpha Procrustes

distances provide a unified formulation encompassing both the Bures-Wasserstein and Log …

  Cited by 5 Related articles




 

[PDF] arxiv.org

Local Bures-Wasserstein Transport: A Practical and Fast Mapping Approximation

A Hoyos-Idrobo - arXiv preprint arXiv:1906.08227, 2019 - arxiv.org

Optimal transport (OT)-based methods have a wide range of applications and have attracted

a tremendous amount of attention in recent years. However, most of the computational

approaches of OT do not learn the underlying transport map. Although some algorithms …

  Related articles All 2 versions 

[CITATION] Local Bures-Wasserstein Transport: A Practical and Fast Mapping Approximation.

AH Idrobo - CoRR, 2019


[PDF] arxiv.org

Wasserstein stability estimates for covariance-preconditioned Fokker--Planck equations

JA Carrillo, U Vaes - arXiv preprint arXiv:1910.07555, 2019 - arxiv.org

We study the convergence to equilibrium of the mean field PDE associated with the

derivative-free methodologies for solving inverse problems. We show stability estimates in

the euclidean Wasserstein distance for the mean field PDE by using optimal transport …

  Cited by 5 Related articles All 3 versions 


2019

[PDF] arxiv.org

Wasserstein Gradient Flow Formulation of the Time-Fractional Fokker-Planck Equation

MH Duong, B Jin - arXiv preprint arXiv:1908.09055, 2019 - arxiv.org

In this work, we investigate a variational formulation for a time-fractional Fokker-Planck

equation which arises in the study of complex physical systems involving anomalously slow

diffusion. The model involves a fractional-order Caputo derivative in time, and thus …

    Cited by 1 Related articles All 7 versions 

 

2009

[PDF] wvu.edu

Euler–Poisson systems as action-minimizing paths in the Wasserstein space

W Gangbo, T Nguyen, A Tudorascu - Archive for rational mechanics and …, 2009 - Springer

This paper uses a variational approach to establish existence of solutions (σ_t, v_t) for the 1-d Euler–Poisson system by minimizing an action. We assume that the initial and terminal points σ_0, σ_T are prescribed in P_2(R), the set of Borel probability measures on the real line of finite second-order moments. We show existence of a unique minimizer of the action when the time interval [0, T] satisfies T < π. These solutions conserve the Hamiltonian and they yield a path t ↦ σ_t in P_2(R). When σ_t = δ_{y(t)} is a Dirac mass, the Euler–Poisson …

  Cited by 48 Related articles All 20 versions

[CITATION] Euler-Poisson systems as action-minimizing paths in the Wasserstein space. to appear in Arch

W Gangbo, T Nguyen, A Tudorascu - Rational Mech. Anal

  Cited by 2 Related articles


Fasel, Jean

The Vaserstein symbol on real smooth affine threefolds. (English) Zbl 07272543

Srinivas, V. (ed.) et al., K-theory. Proceedings of the international colloquium, Mumbai, 2016. New Delhi: Hindustan Book Agency; Mumbai: Tata Institute of Fundamental Research (ISBN 978-93-86279-74-3/hbk). Studies in Mathematics. Tata Institute of Fundamental Research 23, 211-222 (2019).

MSC:  19G99 14F42

<——2019———— 2019  ———- 640——— 

[PDF] arxiv.org

Wasserstein-Fisher-Rao Document Distance

Z Wang, D Zhou, Y Zhang, H Wu, C Bao - arXiv preprint arXiv:1904.10294, 2019 - arxiv.org

As a fundamental problem of natural language processing, it is important to measure the

distance between different documents. Among the existing methods, the Word Mover's

Distance (WMD) has shown remarkable success in document semantic matching for its clear …

  Cited by 1 Related articles All 3 versions 


Curvature of the manifold of fixed-rank positive-semidefinite matrices endowed with the Bures–Wasserstein metric

E Massart, JM Hendrickx, PA Absil - International Conference on …, 2019 - Springer

We consider the manifold of rank-p positive-semidefinite matrices of size n, seen as a

quotient of the set of full-rank n-by-p matrices by the orthogonal group in dimension p. The

resulting distance coincides with the Wasserstein distance between centered degenerate …

  Cited by 6 Related articles All 5 versions


[PDF] arxiv.org

Local well-posedness in the Wasserstein space for a chemotaxis model coupled to Navier-Stokes equations

K Kang, HK Kim - arXiv preprint arXiv:1907.01895, 2019 - arxiv.org

We consider a coupled system of Keller-Segel type equations and the incompressible

Navier-Stokes equations in spatial dimension two and three. In the previous work [19], we

established the existence of a weak solution of a Fokker-Planck equation in the Wasserstein …

  Related articles All 2 versions 


2019

[PDF] arxiv.org

Refined basic couplings and Wasserstein-type distances for SDEs with Lévy noises

D Luo, J Wang - Stochastic Processes and their Applications, 2019 - Elsevier

We establish the exponential convergence with respect to the $L_1$-Wasserstein distance and the total variation for the semigroup corresponding to the stochastic differential equation $dX_t = dZ_t + b(X_t)\,dt$, where $(Z_t)_{t\ge 0}$ is a pure jump Lévy process whose Lévy measure $\nu$ fulfills …

  Cited by 15 Related articles All 7 versions

[CITATION] Refined basic couplings and Wasserstein-type distances for SDEs with Lévy noises, to appear in Stoch

D Luo, J Wang - Proc. Appl

 Cited by 17 Related articles All 7 versions


2019

[PDF] arxiv.org

Fréchet means and Procrustes analysis in Wasserstein space

Y Zemel, VM Panaretos - Bernoulli, 2019 - projecteuclid.org

We consider two statistical problems at the intersection of functional and non-Euclidean data

analysis: the determination of a Fréchet mean in the Wasserstein space of multivariate

distributions; and the optimal registration of deformed random measures and point …

Cited by 45 Related articles All 8 versions


2019

[PDF] arxiv.org

Parisi's formula is a Hamilton-Jacobi equation in Wasserstein space

JC Mourrat - arXiv preprint arXiv:1906.08471, 2019 - arxiv.org

Parisi's formula is a self-contained description of the infinite-volume limit of the free energy of

mean-field spin glass models. We show that this quantity can be recast as the solution of a

Hamilton-Jacobi equation in the Wasserstein space of probability measures on the positive …

  Cited by 6 Related articles All 3 versions 


2019

[PDF] ucla.edu

On differentiability in the Wasserstein space and well-posedness for Hamilton–Jacobi equations

W Gangbo, A Tudorascu - Journal de Mathématiques Pures et Appliquées, 2019 - Elsevier

In this paper we elucidate the connection between various notions of differentiability in the

Wasserstein space: some have been introduced intrinsically (in the Wasserstein space, by

using typical objects from the theory of Optimal Transport) and used by various authors to …

  Cited by 29 Related articles All 5 versions

 
A partial Laplacian as an infinitesimal generator on the Wasserstein space

YT Chow, W Gangbo - Journal of Differential Equations, 2019 - Elsevier

In this manuscript, we consider special linear operators which we term partial Laplacians on

the Wasserstein space, and which we show to be partial traces of the Wasserstein Hessian.

We verify a distinctive smoothing effect of the “heat flows” they generated for a particular …

  Cited by 11 Related articles All 9 versions


2019

[PDF] arxiv.org

A convergent Lagrangian discretization for p-Wasserstein and flux-limited diffusion equations

B Söllner, O Junge - arXiv preprint arXiv:1906.01321, 2019 - arxiv.org

We study a Lagrangian numerical scheme for solution of a nonlinear drift diffusion equation

of the form $\partial_t u=\partial_x (u\cdot c [\partial_x (h^\prime (u)+ v)]) $ on an interval.

This scheme will consist of a spatio-temporal discretization founded in the formulation of the …

  Cited by 2 Related articles All 2 versions 

[CITATION] A convergent Lagrangian discretization for p-Wasserstein and flux-limited diffusion equations

O Junge, B Söllner - arXiv preprint arXiv:1906.01321, 2019


2019
 The Gromov–Wasserstein distance between networks and stable network invariants

S Chowdhury, F Mémoli - Information and Inference: A Journal of …, 2019 - academic.oup.com

… The Gromov–Wasserstein distance between networks and stable network invariants … We define a metric—the network Gromov–Wasserstein distance—on weighted, directed networks that is sensitive to the presence of outliers …

  Cited by 26 Related articles All 5 versions

  <——2019———— 2019  ———- 650——— 


2019  [PDF] thecvf.com

Conservative wasserstein training for pose estimation

X Liu, Y Zou, T Che, P Ding, P Jia… - Proceedings of the …, 2019 - openaccess.thecvf.com

… The Wasserstein distance is defined as the cost of optimal transport for moving the mass in one distribution to match the target distribution [51, 52]. Specifically, we measure the Wasserstein distance between a softmax prediction and its target label, both of which are normalized as …

Cited by 27 Related articles All 10 versions 
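For intuition, the simplest instance of such a loss is the 1-D Wasserstein distance between a softmax prediction and a target histogram over ordered class bins, computable from the difference of their cumulative distributions. The sketch below is an illustration for ordinal (non-periodic) labels only and is not the cited paper's exact loss.

```python
import numpy as np

def wasserstein_1d_hist(p, q, bin_positions=None):
    # W_1 between histograms on the same ordered bins = L1 distance between their CDFs,
    # weighted by the bin spacing when explicit bin positions are supplied.
    p, q = np.asarray(p, float), np.asarray(q, float)
    cdf_diff = np.cumsum(p - q)
    if bin_positions is None:
        return np.abs(cdf_diff[:-1]).sum()                 # unit-spaced bins
    widths = np.diff(np.asarray(bin_positions, float))
    return np.sum(np.abs(cdf_diff[:-1]) * widths)

pred = np.array([0.10, 0.20, 0.50, 0.15, 0.05])  # softmax output over 5 ordered classes
target = np.array([0.0, 0.0, 1.0, 0.0, 0.0])     # one-hot ground-truth label
print(wasserstein_1d_hist(pred, target))
```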

[PDF] arxiv.org

A two-phase two-fluxes degenerate Cahn–Hilliard model as constrained Wasserstein gradient flow

C Cancès, D Matthes, F Nabet - Archive for Rational Mechanics and …, 2019 - Springer

We study a non-local version of the Cahn–Hilliard dynamics for phase separation in a two-

component incompressible and immiscible mixture with linear mobilities. Differently to the

celebrated local model with nonlinear mobility, it is only assumed that the divergences of the …

  Cited by 7 Related articles All 41 versions




2019

[PDF] wiley.com

A degenerate Cahn‐Hilliard model as constrained Wasserstein gradient flow

D Matthes, C Cances, F Nabet - PAMM, 2019 - Wiley Online Library

Existence of solutions to a non‐local Cahn‐Hilliard model with degenerate mobility is

considered. The PDE is written as a gradient flow with respect to the L2‐Wasserstein metric

for two components that are coupled by an incompressibility constraint. Approximating

solutions are constructed by means of an implicit discretization in time and variational

methods.

  Related articles All 5 versions

[CITATION] A degenerate Cahn-Hilliard model as constrained Wasserstein gradient flow

C CANCES


Attainability property for a probabilistic target in Wasserstein ...

arxiv.org › math

Apr 24, 2019 — Title:Attainability property for a probabilistic target in Wasserstein spaces ... space of probability measures endowed with Wasserstein distance.

by G Cavagnari · ‎2019 · ‎Cited by 1 · ‎Related articles

[CITATION] Probabilistic target for control problems in Wasserstein spaces

G Cavagnari, A Marigonda, SP Fabio - preprint

  Cited by 2 Related articles


Gromov-Wasserstein Factorization Models for Graph Clustering

arxiv.org › cs

Nov 19, 2019 — We propose a new nonlinear factorization model for graphs that are with topological structures, and optionally, node attributes. This model is based on a pseudometric called Gromov-Wasserstein (GW) discrepancy, which compares graphs in a relational way.

by H Xu · ‎2019 · ‎Cited by 2 · ‎Related articles


[PDF] arxiv.org

Dynamic facial expression generation on hilbert hypersphere with conditional wasserstein generative adversarial nets

N Otberdout, M Daoudi, A Kacem, L Ballihi… - arXiv preprint arXiv …, 2019 - arxiv.org

In this work, we propose a novel approach for generating videos of the six basic facial

expressions given a neutral face image. We propose to exploit the face geometry by

modeling the facial landmarks motion as curves encoded as points on a hypersphere. By …

  Cited by 5 Related articles All 2 versions 


 2019

[PDF] arxiv.org

2D Wasserstein Loss for Robust Facial Landmark Detection

Y Yan, S Duffner, P Phutane, A Berthelier… - arXiv preprint arXiv …, 2019 - arxiv.org

Facial landmark detection is an important preprocessing task for most applications related to

face analysis. In recent years, the performance of facial landmark detection has been

significantly improved by using deep Convolutional Neural Networks (CNNs), especially the …

  Related articles All 2 versions 


2019

[PDF] arxiv.org

Deep Distributional Sequence Embeddings Based on a Wasserstein Loss

A Abdelwahab, N Landwehr - arXiv preprint arXiv:1912.01933, 2019 - arxiv.org

Deep metric learning employs deep neural networks to embed instances into a metric space

such that distances between instances of the same class are small and distances between

instances from different classes are large. In most existing deep metric learning techniques …

  Related articles All 2 versions 


2019

[PDF] arxiv.org

[PDF] Implementation of batched Sinkhorn iterations for entropy-regularized Wasserstein loss

T Viehmann - arXiv preprint arXiv:1907.01729, 2019 - arxiv.org

In this report, we review the calculation of entropy-regularised Wasserstein loss introduced

by Cuturi and document a practical implementation in PyTorch. Subjects: Machine Learning

(stat. ML); Machine Learning (cs. LG) Cite as: arXiv: 1907.01729 [stat. ML](or arXiv …

Cited by 2 Related articles All 2 versions
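A minimal NumPy sketch of the Sinkhorn iteration behind such an entropy-regularized (Sinkhorn) loss in the sense of Cuturi; the cited report documents a batched PyTorch implementation, which this toy single-pair example does not reproduce.

```python
import numpy as np

def sinkhorn_loss(a, b, cost, eps=0.1, n_iters=200):
    # Entropy-regularized OT cost <P, C> between histograms a and b with cost matrix C.
    K = np.exp(-cost / eps)                 # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):                # alternating Sinkhorn scaling updates
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]         # approximate transport plan
    return np.sum(P * cost)

x = np.linspace(0.0, 1.0, 50)
a = np.full(50, 1.0 / 50)                              # uniform source histogram
b = np.exp(-((x - 0.7) ** 2) / 0.01); b /= b.sum()     # peaked target histogram
cost = (x[:, None] - x[None, :]) ** 2                  # squared-distance ground cost
print(sinkhorn_loss(a, b, cost))
```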
<——2019———— 2019  ———- 660———  


Wasserstein of Wasserstein loss for learning generative models

Y Dukler, W Li, A Tong Lin, G Montúfar - 2019 - mis.mpg.de

… In this section, we present the Kantorovich duality formulation of Wasserstein of Wasserstein

loss function with p = 1 and q = 2. As is done for Wasserstein GANs (Arjovsky et al., 2017), we

consider an equivalent Lipschitz-1 condition, which can be practically applied in the …

  Cited by 10 Related articles All 8 versions 


2019  [PDF] mpg.de

Wasserstein of Wasserstein loss for learning generative models

Y Dukler, W Li, A Tong Lin, G Montúfar - 2019 - mis.mpg.de

The Wasserstein distance serves as a loss function for unsupervised learning which

depends on the choice of a ground metric on sample space. We propose to use a

Wasserstein distance as the ground metric on the sample space of images. This ground …

  Cited by 10 Related articles All 8 versions 

2019  [PDF] thecvf.com

Unimodal-uniform constrained wasserstein training for medical diagnosis

X Liu, X Han, Y Qiao, Y Ge, S Li… - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

The labels in medical diagnosis task are usually discrete and successively distributed. For

example, the Diabetic Retinopathy Diagnosis (DR) involves five health risk levels: no DR (0),

mild DR (1), moderate DR (2), severe DR (3) and proliferative DR (4). This labeling system is …

 Cited by 23 Related articles All 9 versions 

2019  [PDF] thecvf.com

Conservative wasserstein training for pose estimation

X Liu, Y Zou, T Che, P Ding, P Jia… - Proceedings of the …, 2019 - openaccess.thecvf.com

This paper targets the task with discrete and periodic class labels (eg, pose/orientation

estimation) in the context of deep learning. The commonly used cross-entropy or regression

loss is not well matched to this problem as they ignore the periodic nature of the labels and …

[CITATION] Wasserstein Training of Deep Boltzmann Machines

C Wang, T Tong, Y Zou

Cited by 27 Related articles All 10 versions 

[CITATION] Conservative Wasserstein Training for Pose Estimation

X Liu, Y Zou, T Che, J You, BVKV Kumar

2019  [PDF] nips.cc

Scalable Gromov-Wasserstein learning for graph partitioning and matching

H Xu, D Luo, L Carin - Advances in neural information processing …, 2019 - papers.nips.cc

We propose a scalable Gromov-Wasserstein learning (S-GWL) method and establish a

novel and theoretically-supported paradigm for large-scale graph analysis. The proposed

method is based on the fact that Gromov-Wasserstein discrepancy is a pseudometric on …

Cited by 72 Related articles All 12 versions 

2019  PDF] arxiv.org

 

[PDF] projecteuclid.org

Convergence of the population dynamics algorithm in the Wasserstein metric

M Olvera-Cravioto - Electronic Journal of Probability, 2019 - projecteuclid.org

We study the convergence of the population dynamics algorithm, which produces sample

pools of random variables having a distribution that closely approximates that of the special

endogenous solution to a variety of branching stochastic fixed-point equations, including the …

  Cited by 4 Related articles All 7 versions

2019  [PDF] arxiv.org

On scalable variant of wasserstein barycenter

T Le, V Huynh, N Ho, D Phung, M Yamada - arXiv preprint arXiv …, 2019 - arxiv.org

We study a variant of Wasserstein barycenter problem, which we refer to as\emph {tree-

sliced Wasserstein barycenter}, by leveraging the structure of tree metrics for the ground

metrics in the formulation of Wasserstein distance. Drawing on the tree structure, we …

  Cited by 4 Related articles All 4 versions 


2019  [PDF] nips.cc

Interior-point methods strike back: Solving the wasserstein barycenter problem

D Ge, H Wang, Z Xiong, Y Ye - Advances in Neural Information …, 2019 - papers.nips.cc

Computing the Wasserstein barycenter of a set of probability measures under the optimal

transport metric can quickly become prohibitive for traditional second-order algorithms, such

as interior-point methods, as the support size of the measures increases. In this paper, we …

  Cited by 8 Related articles All 5 versions 

2019  [PDF] arxiv.org

Wasserstein barycenter model ensembling

P Dognin, I Melnyk, Y Mroueh, J Ross… - arXiv preprint arXiv …, 2019 - arxiv.org

In this paper we propose to perform model ensembling in a multiclass or a multilabel

learning setting using Wasserstein (W.) barycenters. Optimal transport metrics, such as the

Wasserstein distance, allow incorporating semantic side information such as word …

  Cited by 5 Related articles All 4 versions 
<——2019———— 2019  ———- 670———  


 

[PDF] arxiv.org

On the computational complexity of finding a sparse wasserstein barycenter

S Borgwardt, S Patterson - arXiv preprint arXiv:1910.07568, 2019 - arxiv.org

The discrete Wasserstein barycenter problem is a minimum-cost mass transport problem for

a set of probability measures with finite support. In this paper, we show that finding a

barycenter of sparse support is hard, even in dimension 2 and for only 3 measures. We …

  Cited by 8 Related articles All 2 versions View as HTML 

2019  [PDF] mlr.press

On the complexity of approximating Wasserstein barycenters

A Kroshnin, N Tupitsa, D Dvinskikh… - International …, 2019 - proceedings.mlr.press

We study the complexity of approximating the Wasserstein barycenter of $ m $ discrete

measures, or histograms of size $ n $, by contrasting two alternative approaches that use

entropic regularization. The first approach is based on the Iterative Bregman Projections …

  Cited by 32 Related articles All 7 versions 

On the Complexity of Approximating Wasserstein Barycenters

P Dvurechensky - pdfs.semanticscholar.org

… over $\nu \in \mathcal{P}_2(\Omega)$ of $\frac{1}{m}\sum_{i=1}^{m} W(\mu_i,\nu)$, where $W(\mu,\nu)$ is the Wasserstein distance between measures $\mu$ and $\nu$ on $\Omega$. WB is efficient in machine learning problems with geometric data, e.g. template image reconstruction from a random sample (Figure: images from [Cuturi & Doucet, 2014]) …

Cited by 72 Related articles All 9 versions 
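A useful sanity check for such barycenter solvers is the one-dimensional case, where the W_2 barycenter has a closed form: its quantile function is the average of the input quantile functions (a standard fact, not taken from the cited works). A short sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
samples = [rng.normal(mu, 1.0, size=2000) for mu in (-2.0, 0.0, 3.0)]  # three input measures

qs = np.linspace(0.0, 1.0, 513)[1:-1]                       # quantile levels (avoid endpoints)
quantiles = np.stack([np.quantile(s, qs) for s in samples])
barycenter_quantiles = quantiles.mean(axis=0)               # W_2 barycenter, 1-D closed form

print(barycenter_quantiles[len(qs) // 2])                   # median, close to (-2 + 0 + 3) / 3
```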

 

2019  [PDF] researchgate.net

[PDF] Computation of Wasserstein barycenters via the Iterated Swapping Algorithm

G Puccetti, L Rüschendorf, S Vanduffel - 2019 - researchgate.net

In recent years, the Wasserstein barycenter has become an important notion in the analysis

of high dimensional data with a broad range of applications in applied probability,

economics, statistics and in particular to clustering and image processing. In our paper we …

 

2019

Progressive Wasserstein Barycenters of Persistence Diagrams

J Vidal, J Budin, J Tierny - IEEE Transactions on Visualization …, 2019 - ieeexplore.ieee.org

This paper presents an efficient algorithm for the progressive approximation of Wasserstein

barycenters of persistence diagrams, with applications to the visual analysis of ensemble

data. Given a set of scalar fields, our approach enables the computation of a persistence …

Cited by 23 Related articles All 17 versions


 

[PDF] arxiv.org

Graph Signal Representation with Wasserstein Barycenters

E Simou, P Frossard - ICASSP 2019-2019 IEEE International …, 2019 - ieeexplore.ieee.org

In many applications signals reside on the vertices of weighted graphs. Thus, there is the

need to learn low dimensional representations for graph signals that will allow for data

analysis and interpretation. Existing unsupervised dimensionality reduction methods for …

  Cited by 7 Related articles All 6 versions


2019  [PDF] arxiv.org

Data-driven regularization of Wasserstein barycenters with an application to multivariate density registration

J Bigot, E Cazelles, N Papadakis - Information and Inference: A …, 2019 - academic.oup.com

We present a framework to simultaneously align and smoothen data in the form of multiple

point clouds sampled from unknown densities with support in a-dimensional Euclidean

space. This work is motivated by applications in bioinformatics where researchers aim to …

 Cited by 21 Related articles All 7 versions


[PDF] nips.cc

Propagating uncertainty in reinforcement learning via wasserstein barycenters

AM Metelli, A Likmeta, M Restelli - Advances in Neural Information …, 2019 - papers.nips.cc

How does the uncertainty of the value function propagate when performing temporal

difference learning? In this paper, we address this question by proposing a Bayesian

framework in which we employ approximate posterior distributions to model the uncertainty …

  Cited by 3 Related articles All 7 versions 



 2019  [HTML] A 

Computing Wasserstein Barycenters via Linear Programming

G Auricchio, F Bassetti, S Gualandi… - … Conference on Integration …, 2019 - Springer

This paper presents a family of generative Linear Programming models that permit to

compute the exact Wasserstein Barycenter of a large set of two-dimensional images.

Wasserstein Barycenters were recently introduced to mathematically generalize the concept …

  Cited by 4 Related articles All 2 versions

Computing Wasserstein Barycenters via Linear Programming

M Veneroni - … of Constraint Programming, Artificial Intelligence, and …, 2019 - Springer

This paper presents a family of generative Linear Programming models that permit to

compute the exact Wasserstein Barycenter of a large set of two-dimensional images.

Wasserstein Barycenters were recently introduced to mathematically generalize the concept …

  Related articles All 2 versions

Cited by 4 Related articles All 2 versions

 2019  [PDF] arxiv.org

Learning with Wasserstein barycenters and applications

G Domazakis, D Drivaliaris, S Koukoulas… - arXiv preprint arXiv …, 2019 - arxiv.org

In this work, learning schemes for measure-valued data are proposed, ie data that their

structure can be more efficiently represented as probability measures instead of points on

$\R^ d $, employing the concept of probability barycenters as defined with respect to the …

  Related articles All 3 versions 

 2019  [PDF] arxiv.org

Barycenters in generalized Wasserstein spaces

NP Chung, TS Trinh - arXiv preprint arXiv:1909.05517, 2019 - arxiv.org

… In this note, following the streamline of Agueh and Carlier's work, we study the existence

and consistency of generalized Wasserstein barycenters. More precisely, first we show

the existence of generalized Wasserstein barycenters …

  Cited by 1 Related articles All 3 versions 
<——2019———— 2019  ———- 680——— 

2019

Wasserstein barycenters in the manifold of all positive definite matrices

E Nobari, B Ahmadi Kakavandi - Quarterly of Applied Mathematics, 2019 - ams.org

In this paper, we study the Wasserstein barycenter of finitely many Borel probability

measures on $\mathbb {P} _ {n} $, the Riemannian manifold of all $ n\times n $ real positive

definite matrices as well as its associated dual problem, namely the optimal transport …

  Related articles All 2 versions


[PDF] mlr.press

Understanding mcmc dynamics as flows on the wasserstein space

C Liu, J Zhuo, J Zhu - International Conference on Machine …, 2019 - proceedings.mlr.press

It is known that the Langevin dynamics used in MCMC is the gradient flow of the KL

divergence on the Wasserstein space, which helps convergence analysis and inspires

recent particle-based variational inference methods (ParVIs). But no more MCMC dynamics …

  Cited by 3 Related articles All 11 versions 


2019

On the Complexity of Approximating Wasserstein ...

proceedings.mlr.press › ...

PDF

In this paper, we focus on the computational aspects of optimal transport, namely on the complexity approximating a Wasserstein barycenter of a set of histograms.

by A Kroshnin · ‎2019 · ‎Cited by 32 · ‎Related articles

[CITATION] On the Complexity of Approximating Wasserstein Barycenter. eprint

A Kroshnin, D Dvinskikh, P Dvurechensky, A Gasnikov… - arXiv preprint arXiv …, 2019

  Cited by 3 Related articles


A general solver to the elliptical mixture model through an ...

https://deepai.org › publication › a-general-solver-to-the-e...

A general solver to the elliptical mixture model through an approximate Wasserstein manifold. 06/09/2019 by Shengxi Li, et al. This paper ...


[CITATION] A general solver to the elliptical mixture model through an approximate wasserstein manifold

S Li, Z Yu, M Xiang, D Mandic - arXiv preprint arXiv:1906.03700, 2019

  Cited by 1 Related articles

 

  2019  [HTML] nih.gov

[HTML] Identifying Imaging Markers for Predicting Cognitive Assessments Using Wasserstein Distances Based Matrix Regression

J Yan, C Deng, L Luo, X Wang, X Yao… - Frontiers in …, 2019 - ncbi.nlm.nih.gov

Cited by 2 Related articles All 10 versions 



 

 2019

[PDF] arxiv.org

The Pontryagin maximum principle in the Wasserstein space

B Bonnet, F Rossi - Calculus of Variations and Partial Differential …, 2019 - Springer

Abstract We prove a Pontryagin Maximum Principle for optimal control problems in the space of probability measures, where the dynamics is given by a transport equation with non-local velocity. We formulate this first-order optimality condition using the formalism of …

  Cited by 21 Related articles All 54 versions








2019

[PDF] arxiv.org

A partial Laplacian as an infinitesimal generator on the Wasserstein space

YT Chow, W Gangbo - Journal of Differential Equations, 2019 - Elsevier

In this manuscript, we consider special linear operators which we term partial Laplacians on the Wasserstein space, and which we show to be partial traces of the Wasserstein Hessian. We verify a distinctive smoothing effect of the “heat flows” they generated for a particular …

  Cited by 11 Related articles All 9 versions

[CITATION] A partial Laplacian as an infinitesimal generator on the Wasserstein space

W Gangbo, YT Chow - arXiv preprint arXiv:1710.10536, 2017

  Cited by 3 Related articles

A partial Laplacian as an infinitesimal generator on the Wasserstein space

Y Tin Chow, W Gangbo - arXiv, 2017 - ui.adsabs.harvard.edu

We study stochastic processes on the Wasserstein space, together with their infinitesimal generators. One of these processes is modeled after Brownian motion and plays a central role in our work. Its infinitesimal generator defines a partial Laplacian on the space of Borel …

<——2019———— 2019  ———- 690——


2019

Harmonic mappings valued in the Wasserstein space

H Lavenant - Journal of Functional Analysis, 2019 - Elsevier

We propose a definition of the Dirichlet energy (which is, roughly speaking, the integral of the square of the gradient) for mappings μ: Ω → (P(D), W_2) defined over a subset Ω of R^p and valued in the space P(D) of probability measures on a compact convex subset D of R^q …

  Cited by 12 Related articles All 14 versions

2019

[PDF] arxiv.org

Penalization of barycenters in the Wasserstein space

J Bigot, E Cazelles, N Papadakis - SIAM Journal on Mathematical Analysis, 2019 - SIAM

In this paper, a regularization of Wasserstein barycenters for random measures supported on R^d is introduced via convex penalization. The existence and uniqueness of such barycenters is first proved for a large class of penalization functions. The Bregman …

  Cited by 15 Related articles All 8 versions


2019

[PDF] arxiv.org

Fast convergence of empirical barycenters in Alexandrov spaces and the Wasserstein space

TL Gouic, Q Paris, P Rigollet, AJ Stromme - arXiv preprint arXiv …, 2019 - arxiv.org

This work establishes fast rates of convergence for empirical barycenters over a large class of geodesic spaces with curvature bounds in the sense of Alexandrov. More specifically, we show that parametric rates of convergence are achievable under natural conditions that …

  Cited by 8 Related articles All 2 versions 



2019

[PDF] arxiv.org

Randomized filtering and Bellman equation in Wasserstein space for partial observation control problem

E Bandini, A Cosso, M Fuhrman, H Pham - Stochastic Processes and their …, 2019 - Elsevier

We study a stochastic optimal control problem for a partially observed diffusion. By using the control randomization method in Bandini et al.(2018), we prove a corresponding randomized dynamic programming principle (DPP) for the value function, which is obtained …

  Cited by 18 Related articles All 14 versions


2019

[PDF] arxiv.org

Second-Order Models for Optimal Transport and Cubic Splines on the Wasserstein Space

JD Benamou, TO Gallouët, FX Vialard - Foundations of Computational …, 2019 - Springer

On the space of probability densities, we extend the Wasserstein geodesics to the case of higher-order interpolation such as cubic spline interpolation. After presenting the natural extension of cubic splines to the Wasserstein space, we propose a simpler approach based …

  Cited by 7 Related articles All 4 versions

2019

2019

[PDF] archives-ouvertes.fr

Diffusive processes on the Wasserstein space: Coalescing models, Regularization properties and McKean-Vlasov equations

V Marx - 2019 - tel.archives-ouvertes.fr

The aim of this thesis is to study a class of diffusive stochastic processes with values in the space of probability measures on the real line, called Wasserstein space if it is endowed with the Wasserstein metric W_2. The following issues are mainly addressed in this work …


2019

[PDF] uni-bielefeld.de

[PDF] Diffusions and PDEs on Wasserstein space

FY Wang - arXiv preprint arXiv:1903.02148, 2019 - sfb1283.uni-bielefeld.de

We propose a new type of SDE, whose coefficients depend on the image of solutions, to investigate the diffusion process on the Wasserstein space P_2 over R^d, generated by a time-dependent differential operator acting on f ∈ C^2 and involving terms of the form ∫_{R^d×R^d} σ(t, x, µ)σ(t, y, µ) D²f(µ)(x, …

  Cited by 2 Related articles All 2 versions 


2019

[PDF] arxiv.org

Busemann functions on the Wasserstein space

G Zhu, WL Li, X Cui - arXiv preprint arXiv:1905.05544, 2019 - arxiv.org

We study rays and co-rays in the Wasserstein space $P_p(\mathcal{X})$ ($p>1$) whose ambient space $\mathcal{X}$ is a complete, separable, non-compact, locally compact length space. We show that rays in the Wasserstein space can be represented as probability …

  Related articles All 2 versions 


2019

[PDF] iop.org

Wasserstein space as state space of quantum mechanics and optimal transport

MF Rosyid, K Wahyuningsih - Journal of Physics: Conference …, 2019 - iopscience.iop.org

In this work, we are in the position to view a measurement of a physical observable as an experiment in the sense of probability theory. To every physical observable, a sample space called the spectrum of the observable is therefore available. We have investigated the …

  Related articles All 2 versions

2019

Finsler structure for variable exponent Wasserstein space and gradient flows

A Marcos, A Soglo - arXiv preprint arXiv:1912.12450, 2019 - arxiv.org


  All 2 versions 

 <——2019———— 2019  ———- 700——


 

2019

[PDF] sciencedirect.com

Moreau–Yosida approximation and convergence of Hamiltonian systems on Wasserstein space

HK Kim - Journal of Differential Equations, 2013 - Elsevier

In this paper, we study the stability property of Hamiltonian systems on the Wasserstein space. Let H be a given Hamiltonian satisfying certain properties. We regularize H using the Moreau–Yosida approximation and denote it by H τ. We show that solutions of the …

  Cited by 1 Related articles All 7 versions


2019

[PDF] arxiv.org

Understanding mcmc dynamics as flows on the wasserstein space

C Liu, J Zhuo, J Zhu - arXiv preprint arXiv:1902.00282, 2019 - arxiv.org

It is known that the Langevin dynamics used in MCMC is the gradient flow of the KL divergence on the Wasserstein space, which helps convergence analysis and inspires recent particle-based variational inference methods (ParVIs). But no more MCMC dynamics …

  Cited by 3 Related articles All 12 versions 
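For context, the gradient-flow fact this abstract refers to can be stated compactly (standard Jordan–Kinderlehrer–Otto form, written in generic notation rather than the paper's): the marginal law ρ_t of the overdamped Langevin diffusion targeting π solves a Fokker–Planck equation that is the 2-Wasserstein gradient flow of the KL divergence,

\[ dX_t = \nabla \log \pi(X_t)\,dt + \sqrt{2}\,dW_t, \qquad \partial_t \rho_t = \nabla\cdot\Big(\rho_t \nabla \log \tfrac{\rho_t}{\pi}\Big) = -\,\mathrm{grad}_{W_2}\,\mathrm{KL}(\rho_t\,\|\,\pi). \]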


2019

[PDF] arxiv.org

From the backward Kolmogorov PDE on the Wasserstein space to propagation of chaos for McKean-Vlasov SDEs

N Frikha, PEC de Raynal - arXiv preprint arXiv:1907.01410, 2019 - arxiv.org

In this article, we provide some new quantitative estimates for propagation of chaos of non-linear stochastic differential equations (SDEs) in the sense of McKean-Vlasov. We obtain explicit error estimates, at the level of the trajectories, at the level of the semi-group and at …

  Cited by 1 Related articles All 17 versions 


2019

[PDF] arxiv.org

Local well-posedness in the Wasserstein space for a chemotaxis model coupled to Navier-Stokes equations

K Kang, HK Kim - arXiv preprint arXiv:1907.01895, 2019 - arxiv.org

We consider a coupled system of Keller-Segel type equations and the incompressible Navier-Stokes equations in spatial dimension two and three. In the previous work [19], we established the existence of a weak solution of a Fokker-Plank equation in the Wasserstein  …

  Related articles All 2 versions 


2019

[PDF] arxiv.org

Wasserstein Contraction of Stochastic Nonlinear Systems

J Bouvrie, JJ Slotine - arXiv preprint arXiv:1902.08567, 2019 - arxiv.org

We suggest that the tools of contraction analysis for deterministic systems can be applied

towards studying the convergence behavior of stochastic dynamical systems in the

Wasserstein metric. In particular, we consider the case of Ito diffusions with identical …

  Cited by 4 Related articles All 2 versions 

 

[PDF] arxiv.org

Quantitative spectral gap estimate and Wasserstein contraction of simple slice sampling

V Natarovskii, D Rudolf, B Sprungk - arXiv preprint arXiv:1903.03824, 2019 - arxiv.org

We prove Wasserstein contraction of simple slice sampling for approximate sampling wrt

distributions with log-concave and rotational invariant Lebesgue densities. This yields, in

particular, an explicit quantitative lower bound of the spectral gap of simple slice sampling …

  Related articles All 4 versions 


2019

[PDF] arxiv.org

Data-Driven Distributionally Robust Appointment Scheduling over Wasserstein Balls

R Jiang, M Ryu, G Xu - arXiv preprint arXiv:1907.03219, 2019 - arxiv.org

We study a single-server appointment scheduling problem with a fixed sequence of

appointments, for which we must determine the arrival time for each appointment. We

specifically examine two stochastic models. In the first model, we assume that all appointees …

  Cited by 3 Related articles All 3 versions 


2019

[PDF] theses.fr

Processus de diffusion sur l'espace de Wasserstein: modèles coalescents, propriétés de régularisation et équations de McKean-Vlasov

V Marx - 2019 - theses.fr

… Keywords: Wasserstein diffusion, interacting particle system, coalescing particles, regularization

properties, McKean-Vlasov equation, Fokker-Planck equation, restoration of uniqueness, notion

of weak solution, Bismut-Elworthy formula. Page 7. Page 8. Page 9. Page 10. Page 11 …

  Related articles All 3 versions 


2019

[PDF] arxiv.org

Modified massive Arratia flow and Wasserstein diffusion

V Konarovskyi, MK von Renesse - Communications on Pure …, 2019 - Wiley Online Library

Extending previous work by the first author we present a variant of the Arratia flow, which

consists of a collection of coalescing Brownian motions starting from every point of the unit

interval. The important new feature of the model is that individual particles carry mass that …

  Cited by 26 Related articles All 5 versions


2019

[PDF] arxiv.org

Wasserstein Diffusion Tikhonov Regularization

AT Lin, Y Dukler, W Li, G Montúfar - arXiv preprint arXiv:1909.06860, 2019 - arxiv.org

We propose regularization strategies for learning discriminative models that are robust to in-

class variations of the input data. We use the Wasserstein-2 geometry to capture

semantically meaningful neighborhoods in the space of images, and define a corresponding …

  Cited by 1 Related articles All 5 versions 

<—2019———— 2019  ———- 710——


2019

[PDF] arxiv.org

A convergent Lagrangian discretization for p-Wasserstein and flux-limited diffusion equations

B Söllner, O Junge - arXiv preprint arXiv:1906.01321, 2019 - arxiv.org

… We will note the key parts of the proofs where a difference has to be made if one is working with

flux-limiting cost. In the last part of the paper, we give some numerical examples to illustrate

the dynamics of the p-Wasserstein diffusion and flux-limiting diffusion …

  Cited by 2 Related articles All 2 versions 


2019

Approximation of stable law in Wasserstein-1 distance by Stein's method

L Xu - The Annals of Applied Probability, 2019 - projecteuclid.org

Abstract: Let $n\in\mathbb{N}$, let $\zeta_{n,1},\ldots,\zeta_{n,n}$ be a sequence of independent random variables with $\mathbb{E}\zeta_{n,i}=0$ and $\mathbb{E}|\zeta_{n,i}|<\infty$ for each $i$, and let $\mu$ be an $\alpha$-stable distribution having …

   Cited by 20 Related articles All 9 versions
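For reference, the Wasserstein-1 distance used in this and several nearby entries admits the Kantorovich–Rubinstein dual form (a standard fact, stated here in generic notation):

\[ W_1(\mu,\nu) = \inf_{\pi \in \Pi(\mu,\nu)} \int |x-y|\, d\pi(x,y) = \sup_{\mathrm{Lip}(f)\le 1} \Big( \int f\, d\mu - \int f\, d\nu \Big). \]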


2019  [PDF] esaim-cocv.org

Dynamic models of Wasserstein-1-type unbalanced transport

B Schmitzer, B Wirth - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

We consider a class of convex optimization problems modelling temporal mass transport

and mass change between two given mass distributions (the so-called dynamic formulation

of unbalanced transport), where we focus on those models for which transport costs are …

  Cited by 10 Related articles All 4 versions

2019  [PDF] arxiv.org

Wasserstein convergence rates for random bit approximations of continuous markov processes

S Ankirchner, T Kruse, M Urusov - arXiv preprint arXiv:1903.07880, 2019 - arxiv.org

We determine the convergence speed of a numerical scheme for approximating one-

dimensional continuous strong Markov processes. The scheme is based on the construction

of coin tossing Markov chains whose laws can be embedded into the process with a …


2019  [PDF] arxiv.org

Approximation rate in Wasserstein distance of probability measures on the real line by deterministic empirical measures

O Bencheikh, B Jourdain - arXiv preprint arXiv:2012.09729, 2020 - arxiv.org

We are interested in the approximation in Wasserstein distance with index $\rho\ge 1$ of a probability measure $\mu$ on the real line with finite moment of order $\rho$ by the empirical measure of $N$ deterministic points. The minimal error converges to $0$ as …

  Related articles All 3 versions 
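Related to the approximation-by-empirical-measures entries above: on the real line, the p-Wasserstein distance between two empirical measures with the same number of atoms reduces to matching order statistics. A minimal numpy sketch (the samples below are hypothetical and only for illustration):

    import numpy as np

    def empirical_wasserstein_1d(x, y, p=2.0):
        # W_p between equal-size empirical measures on R: sort and match order statistics.
        x_sorted, y_sorted = np.sort(x), np.sort(y)
        return np.mean(np.abs(x_sorted - y_sorted) ** p) ** (1.0 / p)

    rng = np.random.default_rng(0)
    a = rng.normal(0.0, 1.0, size=500)
    b = rng.normal(0.5, 1.0, size=500)
    print(empirical_wasserstein_1d(a, b))   # roughly 0.5 for these two Gaussians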


Robust Wasserstein profile inference and applications to machine learning

J Blanchet, Y Kang, K Murthy - Journal of Applied Probability, 2019 - cambridge.org

We show that several machine learning estimators, including square-root least absolute

shrinkage and selection and regularized logistic regression, can be represented as

solutions to distributionally robust optimization problems. The associated uncertainty regions …

Cited by 219 Related articles All 5 versions

[PDF] arxiv.org

Subspace robust wasserstein distances

FP Paty, M Cuturi - arXiv preprint arXiv:1901.08949, 2019 - arxiv.org

Making sense of Wasserstein distances between discrete measures in high-dimensional

settings remains a challenge. Recent work has advocated a two-step approach to improve

robustness and facilitate the computation of optimal transport, using for instance projections …

  Cited by 36 Related articles All 4 versions 

[CITATION] Subspace Robust Wasserstein Distances

Cited by 74 Related articles All 6 versions 

2019

[PDF] thecvf.com

Sliced wasserstein generative models

J Wu, Z Huang, D Acharya, W Li… - Proceedings of the …, 2019 - openaccess.thecvf.com

In generative modeling, the Wasserstein distance (WD) has emerged as a useful metric to

measure the discrepancy between generated and real data distributions. Unfortunately, it is

challenging to approximate the WD of high-dimensional distributions. In contrast, the sliced  …

  Cited by 36 Related articles All 10 versions 


2019  [PDF] mlr.press

Sliced-Wasserstein flows: Nonparametric generative modeling via optimal transport and diffusions

A Liutkus, U Simsekli, S Majewski… - International …, 2019 - proceedings.mlr.press

By building upon the recent theory that established the connection between implicit

generative modeling (IGM) and optimal transport, in this study, we propose a novel

parameter-free algorithm for learning the underlying distributions of complicated datasets …

  Cited by 28 Related articles All 11 versions 

Cited by 60 Related articles All 7 versions 

2019

Generalized sliced Wasserstein distances

S Kolouri, K Nadjahi, U Simsekli, R Badeau… - Advances in Neural …, 2019 - papers.nips.cc

The Wasserstein distance and its variations, eg, the sliced-Wasserstein (SW) distance, have

recently drawn attention from the machine learning community. The SW distance,

specifically, was shown to have similar properties to the Wasserstein distance, while being …

  Cited by 38 Related articles All 6 versions 
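Since several of the entries in this cluster concern sliced Wasserstein distances, a minimal Monte Carlo sketch may help fix ideas: project both point clouds onto random directions and average the closed-form 1D distances. Everything below (sample sizes, number of projections) is an illustrative choice, not taken from any of the cited papers:

    import numpy as np

    def sliced_wasserstein(X, Y, n_projections=200, p=2.0, seed=0):
        # Monte Carlo estimate of the sliced p-Wasserstein distance between two
        # equal-size point clouds in R^d: random 1D projections + sorted matching.
        rng = np.random.default_rng(seed)
        d = X.shape[1]
        total = 0.0
        for _ in range(n_projections):
            theta = rng.normal(size=d)
            theta /= np.linalg.norm(theta)
            x_proj, y_proj = np.sort(X @ theta), np.sort(Y @ theta)
            total += np.mean(np.abs(x_proj - y_proj) ** p)
        return (total / n_projections) ** (1.0 / p)

    rng = np.random.default_rng(1)
    X = rng.normal(0.0, 1.0, size=(400, 3))
    Y = rng.normal(0.3, 1.0, size=(400, 3))
    print(sliced_wasserstein(X, Y))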

<—2019———— 2019  ———- 720——     


[PDF] thecvf.com

Max-sliced wasserstein distance and its use for gans

I Deshpande, YT Hu, R Sun, A Pyrros… - Proceedings of the …, 2019 - openaccess.thecvf.com

Generative adversarial nets (GANs) and variational auto-encoders have significantly

improved our distribution modeling capabilities, showing promise for dataset augmentation,

image-to-image translation and feature learning. However, to model high-dimensional …

  Cited by 32 Related articles All 7 versions 


2019

[PDF] thecvf.com

Sliced wasserstein discrepancy for unsupervised domain adaptation

CY Lee, T Batra, MH Baig… - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

In this work, we connect two distinct concepts for unsupervised domain adaptation: feature

distribution alignment between domains by utilizing the task-specific decision boundary and

the Wasserstein metric. Our proposed sliced Wasserstein discrepancy (SWD) is designed to …

  Cited by 99 Related articles All 7 versions 


2019

[PDF] arxiv.org

Minimax Confidence Intervals for the Sliced Wasserstein Distance

T Manole, S Balakrishnan, L Wasserman - arXiv preprint arXiv:1909.07862, 2019 - arxiv.org

The Wasserstein distance has risen in popularity in the statistics and machine learning

communities as a useful metric for comparing probability distributions. We study the problem

of uncertainty quantification for the Sliced Wasserstein distance--an easily computable …

  Cited by 1 Related articles All 3 versions 

[CITATION] Minimax Confidence Intervals for the Sliced Wasserstein Distance

T Manole

2019

[PDF] openreview.net

A Greedy Approach to Max-Sliced Wasserstein GANs

A Horváth - 2019 - openreview.net

Generative Adversarial Networks have made data generation possible in various use cases,

but in case of complex, high-dimensional distributions it can be difficult to train them,

because of convergence problems and the appearance of mode collapse. Sliced  …

  Related articles All 2 versions 

2019

[PDF] arxiv.org

Hausdorff and Wasserstein metrics on graphs and other structured data

E Patterson - arXiv preprint arXiv:1907.00257, 2019 - arxiv.org

Optimal transport is widely used in pure and applied mathematics to find probabilistic

solutions to hard combinatorial matching problems. We extend the Wasserstein metric and

other elements of optimal transport from the matching of sets to the matching of graphs and …

  Cited by 2 Related articles All 2 versions 


2019

Mullins-Sekerka as the Wasserstein flow of the perimeter

A Chambolle, T Laux - arXiv preprint arXiv:1910.02508, 2019 - arxiv.org

We prove the convergence of an implicit time discretization for the Mullins-Sekerka equation 

proposed in [F. Otto, Arch. Rational Mech. Anal. 141 (1998) 63-103]. Our simple argument 

shows that the limit satisfies the equation in a distributional sense as well as an optimal 

energy-dissipation relation. The proof combines simple arguments from optimal transport, 

gradient flows & minimizing movements, and basic geometric measure theory.


2019

Unsupervised feature extraction based on improved Wasserstein generative adversarial network for hyperspectral classification

Q Sun, S Bourennane - Multimodal Sensing: Technologies …, 2019 - spiedigitallibrary.org

Accurate classification is one of the most important prerequisites for hyperspectral

applications and feature extraction is the key step of classification. Recently, deep learning

models have been successfully used to extract the spectral-spatial features in hyperspectral …

  Related articles All 4 versions


2019

[PDF] apsipa.org

Semi-supervised Multimodal Emotion Recognition with Improved Wasserstein GANs

J Liang, S Chen, Q Jin - 2019 Asia-Pacific Signal and …, 2019 - ieeexplore.ieee.org

Automatic emotion recognition has faced the challenge of lacking large-scale human

labeled dataset for model learning due to the expensive data annotation cost and inevitable

label ambiguity. To tackle such challenge, previous works have explored to transfer emotion …

  Related articles All 2 versions


2019

[PDF] arxiv.org

Duality and quotient spaces of generalized Wasserstein spaces

NP Chung, TS Trinh - arXiv preprint arXiv:1904.12461, 2019 - arxiv.org

In this article, using ideas of Liero, Mielke and Savaré in [21], we establish a Kantorovich duality for generalized Wasserstein distances $W_1^{a,b}$ on a generalized Polish metric space, introduced by Piccoli and Rossi. As a consequence, we give another proof that …

  Cited by 1 Related articles All 3 versions 


2019

[PDF] arxiv.org

Barycenters in generalized Wasserstein spaces

NP Chung, TS Trinh - arXiv preprint arXiv:1909.05517, 2019 - arxiv.org

In 2014, Piccoli and Rossi introduced generalized Wasserstein spaces which are combinations of Wasserstein distances and $L^1$-distances [11]. In this article, we follow the ideas of Agueh and Carlier [1] to study generalized Wasserstein barycenters. We show …

  Cited by 1 Related articles All 3 versions 


<-—2019———— 2019  ———- 730—— 


[PDF] iop.org

Speech Dereverberation Based on Improved Wasserstein Generative Adversarial Networks

L Rao, J Yang - Journal of Physics: Conference Series, 2020 - iopscience.iop.org

In reality, the sound we hear is not only disturbed by noise, but also the reverberant, whose

effects are rarely taken into account. Recently, deep learning has shown great advantages

in speech signal processing. But among the existing dereverberation approaches, very few …


On isometric embeddings of Wasserstein spaces–the discrete case

GP Gehér, T Titkos, D Virosztek - Journal of Mathematical Analysis and …, 2019 - Elsevier

The aim of this short paper is to offer a complete characterization of all (not necessarily

surjective) isometric embeddings of the Wasserstein space W p (X), where X is a countable

discrete metric space and 0< p<∞ is any parameter value. Roughly speaking, we will prove …

  Cited by 3 Related articles All 8 versions


2019  [PDF] arxiv.org

(q, p)-Wasserstein GANs: Comparing Ground Metrics for Wasserstein GANs

A Mallasto, J Frellsen, W Boomsma… - arXiv preprint arXiv …, 2019 - arxiv.org

Generative Adversarial Networks (GANs) have made a major impact in computer vision and

machine learning as generative models. Wasserstein GANs (WGANs) brought Optimal

Transport (OT) theory into GANs, by minimizing the $1 $-Wasserstein distance between …

  Cited by 3 Related articles All 2 versions 


2019  [PDF] arxiv.org

Nonembeddability of Persistence Diagrams with p > 2 Wasserstein Metric

A Wagner - arXiv preprint arXiv:1910.13935, 2019 - arxiv.org

… Hence, when applying kernel methods to persistence diagrams, the underlying feature map

necessarily causes distortion. We prove persistence diagrams with the p-Wasserstein metric

do not admit a coarse embedding into a Hilbert space when p > 2. 1. Introduction …

  Related articles All 2 versions 

Projection au sens de Wasserstein 2 sur des espaces structurés de mesures

L Lebrat - 2019 - theses.fr

… Abstract. This thesis is concerned with the approximation, for the 2-Wasserstein metric, of probability measures by a structured measure … Translated title: Projection in the 2-Wasserstein sense on structured measure space. Abstract …

Projection au sens de Wasserstein 2 sur des espaces structurés de mesures (thesis)


2019


Projection in the 2-Wasserstein sense on structured measure space

L Lebrat - 2019 - tel.archives-ouvertes.fr

This thesis focuses on the approximation for the 2-Wasserstein metric of probability

measures by structured measures. The set of structured measures under consideration is

made of consistent discretizations of measures carried by a smooth curve with a bounded …

  All 2 versions 


2019

[PDF] arxiv.org

2-wasserstein approximation via restricted convex potentials with application to improved training for gans

A Taghvaei, A Jalali - arXiv preprint arXiv:1902.07197, 2019 - arxiv.org

We provide a framework to approximate the 2-Wasserstein distance and the optimal

transport map, amenable to efficient training as well as statistical and geometric analysis.

With the quadratic cost and considering the Kantorovich dual form of the optimal …

  Cited by 9 Related articles All 2 versions 


2019

Elements of Statistical Inference in 2-Wasserstein Space

J Ebert, V Spokoiny, A Suvorikova - Topics in Applied Analysis and …, 2019 - Springer

This work addresses an issue of statistical inference for the datasets lacking underlying

linear structure, which makes impossible the direct application of standard inference

techniques and requires a development of a new tool-box taking into account properties of …

  Related articles All 3 versions


2019

[PDF] sgugit.ru

[PDF] Application of the Wasserstein metric to solving the inverse dynamic problem of seismics (in Russian)

AA Vasilenko - Interexpo Geo-Sibir, 2019 - geosib.sgugit.ru

The inverse dynamic problem of seismics consists in determining the velocity model of an elastic medium from recorded data. This work proposes using the Wasserstein metric to construct a functional characterizing …

  Related articles All 4 versions 

2019  [PDF] arxiv.org

Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula

X Fang, QM Shao, L Xu - Probability Theory and Related Fields, 2019 - Springer

Stein's method has been widely used for probability approximations. However, in the multi-

dimensional setting, most of the results are for multivariate normal approximation or for test

functions with bounded second-or higher-order derivatives. For a class of multivariate …

  Cited by 19 Related articles All 5 versions

2019

[HTML] springer.com

[HTML] Correction to: Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula

X Fang, QM Shao, L Xu - Probability Theory and Related Fields, 2019 - Springer

Under the above-strengthened Assumption 2.1, all the conclusions and examples in [1] still hold

true, except that all the constants \(C_\theta \) therein will depend on the constants in the new

assumption … Combining the previous three inequalities, we conclude that [1, (7.1)] still holds …

  Cited by 1 Related articles All 3 versions

2019

[CITATION] Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula (vol 174, pg 945, 2019)

X Fang, QM Shao, L Xu - PROBA

<—2019———— 2019  ———- 740——  


[PDF] arxiv.org

Fused Gromov-Wasserstein Alignment for Hawkes Processes

D Luo, H Xu, L Carin - arXiv preprint arXiv:1910.02096, 2019 - arxiv.org

We propose a novel fused Gromov-Wasserstein alignment method to jointly learn the

Hawkes processes in different event spaces, and align their event types. Given two Hawkes

processes, we use fused Gromov-Wasserstein discrepancy to measure their dissimilarity …

  Cited by 1 Related articles All 2 versions 

 

2019

[PDF] arxiv.org

Fast Tree Variants of Gromov-Wasserstein

T Le, N Ho, M Yamada - arXiv preprint arXiv:1910.04462, 2019 - arxiv.org

Gromov-Wasserstein (GW) is a powerful tool to compare probability measures whose

supports are in different metric spaces. GW suffers however from a computational drawback

since it requires to solve a complex non-convex quadratic program. We consider in this work …

  Cited by 1 Related articles All 5 versions


2019  [PDF] nips.cc

Scalable Gromov-Wasserstein learning for graph partitioning and matching

H Xu, D Luo, L Carin - Advances in neural information processing …, 2019 - papers.nips.cc

We propose a scalable Gromov-Wasserstein learning (S-GWL) method and establish a

novel and theoretically-supported paradigm for large-scale graph analysis. The proposed

method is based on the fact that Gromov-Wasserstein discrepancy is a pseudometric on …

  Cited by 42 Related articles All 10 versions 


2019  [PDF] arxiv.org

Sliced gromov-wasserstein

T Vayer, R Flamary, R Tavenard, L Chapel… - arXiv preprint arXiv …, 2019 - arxiv.org

Recently used in various machine learning contexts, the Gromov-Wasserstein distance (GW)

allows for comparing distributions whose supports do not necessarily lie in the same metric

space. However, this Optimal Transport (OT) distance requires solving a complex non …

 Cited by 17 Related articles All 9 versions 


2019  [PDF] arxiv.org

Gromov-wasserstein learning for graph matching and node embedding

H Xu, D Luo, H Zha, L Carin - arXiv preprint arXiv:1901.06003, 2019 - arxiv.org

A novel Gromov-Wasserstein learning framework is proposed to jointly match (align) graphs

and learn embedding vectors for the associated graph nodes. Using Gromov-Wasserstein

discrepancy, we measure the dissimilarity between two graphs and find their …

  Cited by 69 Related articles All 10 versions 

2019  [HTML] oup.com

The Gromov–Wasserstein distance between networks and stable network invariants

S Chowdhury, F Mémoli - Information and Inference: A Journal of …, 2019 - academic.oup.com

We define a metric—the network Gromov–Wasserstein distance—on weighted, directed networks that is sensitive to the presence of outliers. In addition to proving its theoretical properties, we supply network invariants based on optimal transport that approximate this …

  Cited by 16 Related articles
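For orientation, the Gromov–Wasserstein discrepancy appearing in this cluster of entries compares measures living on different metric spaces by matching pairwise distances rather than points; up to normalization conventions it reads

\[ \mathrm{GW}_2\big((X,d_X,\mu),(Y,d_Y,\nu)\big)^2 = \min_{\pi \in \Pi(\mu,\nu)} \iint \big| d_X(x,x') - d_Y(y,y') \big|^2 \, d\pi(x,y)\, d\pi(x',y'). \]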


2019

Tree-Sliced Variants of Wasserstein Distances - NeurIPS 2019

nips.cc › Conferences › ScheduleMultitrack

…propose the tree-sliced Wasserstein distance, computed by averaging the … 2019 Workshop: Optimal Transport for Machine Learning

Tree-sliced variants of wasserstein distances

2019

Wasserstein convergence rates for random bit approximations of continuous Markov processes.

2019, arXiv: Probability. 3 citations*

Attainability property for a probabilistic target in Wasserstein spaces

G Cavagnari, A Marigonda - Discrete & Continuous Dynamical Systems - A, 2021. 0 citations*

Keywords: nonholonomic system, probability measure

In this paper we establish an attainability result for the minimum time function of a control problem in the space of probability measures endowed with Wasserstein distance. The dynamics is provided by a suitable controlled continuity equation, where we impose a nonlocal nonholonomic constraint on t…


2019


[PDF] arxiv.org

Attainability property for a probabilistic target in Wasserstein spaces

G Cavagnari, A Marigonda - arXiv preprint arXiv:1904.10933, 2019 - arxiv.org

In this paper we establish an attainability result for the minimum time function of a control

problem in the space of probability measures endowed with Wasserstein distance. The

dynamics is provided by a suitable controlled continuity equation, where we impose a …

  Cited by 1 Related articles All 6 versions 

2019, arXiv: Optimization and Control

Artificial Neural Network with Histogram Data Time Series Forecasting: A Least Squares Approach Based on Wasserstein Distance

P Rakpho, W Yamaka (Chiang Mai University), K Zhu (Khon Kaen University) - 2021. 0 citations*

Keywords: histogram, least squares

This paper aims to predict histogram time series; we use high-frequency (5-minute) data to construct the histogram data for each day. We apply the Artificial Neural Network (ANN) to an Autoregressive (AR) structure and introduce the AR-ANN model to forecast this histogram tim…

<—2019———— 2019  ———- 750—


2019

[PDF] esaim-cocv.org

A Pontryagin Maximum Principle in Wasserstein Spaces for Constrained Optimal Control Problems

B Bonnet - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

In this paper, we prove a Pontryagin Maximum Principle for constrained optimal control

problems in the Wasserstein space of probability measures. The dynamics is described by a

transport equation with non-local velocities which are affine in the control, and is subject to …

  Cited by 7 Related articles All 131 versions


2019

[PDF] arxiv.org

Barycenters in generalized Wasserstein spaces

NP Chung, TS Trinh - arXiv preprint arXiv:1909.05517, 2019 - arxiv.org

In 2014, Piccoli and Rossi introduced generalized Wasserstein spaces which are

combinations of Wasserstein distances and $L^1$-distances [11]. In this article, we follow

the ideas of Agueh and Carlier [1] to study generalized Wasserstein barycenters. We show …

  Cited by 1 Related articles All 3 versions 


2019 [PDF] arxiv.org

Duality and quotient spaces of generalized Wasserstein spaces

NP Chung, TS Trinh - arXiv preprint arXiv:1904.12461, 2019 - arxiv.org

In this article, using ideas of Liero, Mielke and Savaré in [21], we establish a Kantorovich

duality for generalized Wasserstein distances $W_1^{a,b}$ on a generalized Polish metric

space, introduced by Piccoli and Rossi. As a consequence, we give another proof that …

  Cited by 1 Related articles All 3 versions 


2019 [PDF] arxiv.org

Attainability property for a probabilistic target in Wasserstein spaces

G Cavagnari, A Marigonda - arXiv preprint arXiv:1904.10933, 2019 - arxiv.org

In this paper we establish an attainability result for the minimum time function of a control

problem in the space of probability measures endowed with Wasserstein distance. The

dynamics is provided by a suitable controlled continuity equation, where we impose a …

  Cited by 1 Related articles All 3 versions 

2019

On isometric embeddings of Wasserstein spaces–the discrete case

GP Gehér, T Titkos, D Virosztek - Journal of Mathematical Analysis and …, 2019 - Elsevier

The aim of this short paper is to offer a complete characterization of all (not necessarily

surjective) isometric embeddings of the Wasserstein space W p (X), where X is a countable

discrete metric space and 0< p<∞ is any parameter value. Roughly speaking, we will prove …

  Cited by 1 Related articles All 8 versions


2019

[PDF] arxiv.org

Learning Embeddings into Entropic Wasserstein Spaces

C Frogner, F Mirzazadeh, J Solomon - arXiv preprint arXiv:1905.03329, 2019 - arxiv.org

Euclidean embeddings of data are fundamentally limited in their ability to capture latent

semantic structures, which need not conform to Euclidean spatial assumptions. Here we

consider an alternative, which embeds data as discrete probability distributions in a …

  Cited by 2 Related articles All 7 versions 


2019  [PDF] archives-ouvertes.fr

Optimal Control in Wasserstein Spaces

B Bonnet - 2019 - hal.archives-ouvertes.fr

A wealth of mathematical tools allowing to model and analyse multi-agent systems has been

brought forth as a consequence of recent developments in optimal transport theory. In this

thesis, we extend for the first time several of these concepts to the framework of control …

  Related articles All 14 versions 

[CITATION] Optimal Control in Wasserstein Spaces.(Commande Optimal dans les Espaces de Wasserstein).

B Bonnet - 2019 - Aix-Marseille University, France


2019  [PDF] arxiv.org

The existence of geodesics in Wasserstein spaces over path groups and loop groups

J Shao - Stochastic Processes and their Applications, 2019 - Elsevier

In this work we prove the existence and uniqueness of the optimal transport map for L p-

Wasserstein distance with p> 1, and particularly present an explicit expression of the optimal

transport map for the case p= 2. As an application, we show the existence of geodesics …

  Related articles All 8 versions


 Central limit theorem and bootstrap procedure for Wasserstein's variations with an application to structural relationships between distributions

E Del Barrio, P Gordaliza, H Lescornel… - Journal of Multivariate …, 2019 - Elsevier

Wasserstein barycenters and variance-like criteria based on the Wasserstein distance are

used in many problems to analyze the homogeneity of collections of distributions and

structural relationships between the observations. We propose the estimation of the …

  Cited by 18 Related articles All 13 versions


2019  [PDF] thecvf.com

Face Synthesis and Recognition Using Disentangled Representation-Learning Wasserstein GAN

GS Jison Hsu, CH Tang… - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

Abstract We propose the Disentangled Representation-learning Wasserstein GAN (DR-

WGAN) trained on augmented data for face recognition and face synthesis across pose. We

improve the state-of-the-art DR-GAN with the Wasserstein loss considered in the …

  Related articles 

[PDF] semanticscholar.org

Face Synthesis and Recognition Using Disentangled Representation-Learning Wasserstein GAN

GSJ Hsu, CH Tang, MH Yap - 2019 IEEE/CVF Conference on …, 2019 - ieeexplore.ieee.org

We propose the Disentangled Representation-learning Wasserstein GAN (DR-WGAN)

trained on augmented data for face recognition and face synthesis across pose. We improve

the state-of-the-art DR-GAN with the Wasserstein loss considered in the discriminator so that …

  Related articles All 2 versions

 <—2019———— 2019  ———- 760—


Unsupervised alignment of embeddings with wasserstein procrustes

E Grave, A Joulin, Q Berthet - The 22nd International …, 2019 - proceedings.mlr.press

We consider the task of aligning two sets of points in high dimension, which has many

applications in natural language processing and computer vision. As an example, it was

recently shown that it is possible to infer a bilingual lexicon, without supervised data, by …

  Cited by 76 Related articles All 3 versions 


[PDF] arxiv.org

On distributionally robust chance constrained programs with Wasserstein distance

W Xie - Mathematical Programming, 2019 - Springer

This paper studies a distributionally robust chance constrained program (DRCCP) with

Wasserstein ambiguity set, where the uncertain constraints should be satisfied with a

probability at least a given threshold for all the probability distributions of the uncertain …

  Cited by 37 Related articles All 9 versions
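The distributionally robust formulations in this entry (and in the earlier appointment-scheduling and profile-inference entries) share one template, written here in generic notation rather than any single paper's: minimize the worst-case expected loss over a Wasserstein ball around the empirical distribution,

\[ \min_{x}\ \sup_{Q\,:\,W_p(Q,\widehat{P}_N)\le \varepsilon}\ \mathbb{E}_{\xi\sim Q}\big[\ell(x,\xi)\big], \]

where $\widehat{P}_N$ is the empirical distribution of $N$ samples and $\varepsilon$ is the ambiguity radius.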


[PDF] ieee.org

A deep transfer model with wasserstein distance guided multi-adversarial networks for bearing fault diagnosis under different working conditions

M Zhang, D Wang, W Lu, J Yang, Z Li, B Liang - IEEE Access, 2019 - ieeexplore.ieee.org

In recent years, intelligent fault diagnosis technology with the deep learning algorithm has

been widely used in the manufacturing industry for substituting time-consuming human

analysis method to enhance the efficiency of fault diagnosis. The rolling bearing as the …

  Cited by 17 Related articles All 2 versions


[PDF] arxiv.org

Graph Signal Representation with Wasserstein Barycenters

E Simou, P Frossard - ICASSP 2019-2019 IEEE International …, 2019 - ieeexplore.ieee.org

In many applications signals reside on the vertices of weighted graphs. Thus, there is the

need to learn low dimensional representations for graph signals that will allow for data

analysis and interpretation. Existing unsupervised dimensionality reduction methods for …

  Cited by 7 Related articles All 6 versions


[PDF] arxiv.org

Topic Modeling with Wasserstein Autoencoders

F Nan, R Ding, R Nallapati, B Xiang - arXiv preprint arXiv:1907.12374, 2019 - arxiv.org

We propose a novel neural topic model in the Wasserstein autoencoders (WAE) framework.

Unlike existing variational autoencoder based models, we directly enforce Dirichlet prior on

the latent document-topic vectors. We exploit the structure of the latent space and apply a …

  Cited by 10 Related articles All 6 versions 


2019

Cross-domain Attention Network with Wasserstein Regularizers for E-commerce Search

M Qiu, B Wang, C Chen, X Zeng, J Huang… - Proceedings of the 28th …, 2019 - dl.acm.org

Product search and recommendation is a task that every e-commerce platform wants to

outperform their peers on. However, training a good search or recommendation model often

requires more data than what many platforms have. Fortunately, the search tasks on different …

  Related articles All 2 versions


[PDF] arxiv.org

Disentangled Representation Learning with Wasserstein Total Correlation

Y Xiao, WY Wang - arXiv preprint arXiv:1912.12818, 2019 - arxiv.org

Unsupervised learning of disentangled representations involves uncovering of different

factors of variations that contribute to the data generation process. Total correlation

penalization has been a key component in recent methods towards disentanglement …

  Cited by 1 Related articles All 2 versions 


 [PDF] arxiv.org

Hypothesis Test and Confidence Analysis with Wasserstein Distance on General Dimension

M Imaizumi, H Ota, T Hamaguchi - arXiv preprint arXiv:1910.07773, 2019 - arxiv.org

We develop a general framework for statistical inference with the Wasserstein distance.

Recently, the Wasserstein distance has attracted much attention and been applied to

various machine learning tasks due to its celebrated properties. Despite the importance …

  Related articles All 2 versions 


[PDF] arxiv.org

Personalized Purchase Prediction of Market Baskets with Wasserstein-Based Sequence Matching

M Kraus, S Feuerriegel - Proceedings of the 25th ACM SIGKDD …, 2019 - dl.acm.org

Personalization in marketing aims at improving the shopping experience of customers by

tailoring services to individuals. In order to achieve this, businesses must be able to make

personalized predictions regarding the next purchase. That is, one must forecast the exact …

  Cited by 4 Related articles All 5 versions


[PDF] Anomaly detection on time series with wasserstein GAN applied to PHM

M Ducoffe, I Haloui, JS Gupta… - PHM Applications of Deep …, 2019 - phmsociety.org

Modern vehicles are more and more connected. For instance, in the aerospace industry,

newer aircraft are already equipped with data concentrators and enough wireless

connectivity to transmit sensor data collected during the whole flight to the ground, usually …

  Cited by 2 Related articles All 2 versions 

<—2019———— 2019  ———- 770—— 


[PDF] aaai.org

Manifold-Valued Image Generation with Wasserstein Generative Adversarial Nets

Z Huang, J Wu, L Van Gool - Proceedings of the AAAI Conference on …, 2019 - aaai.org

Generative modeling over natural images is one of the most fundamental machine learning

problems. However, few modern generative models, including Wasserstein Generative

Adversarial Nets (WGANs), are studied on manifold-valued images that are frequently …

  Cited by 4 Related articles All 10 versions 


Adversarial Learning for Cross-Modal Retrieval with Wasserstein Distance

Q Cheng, Y Zhang, X Gu - International Conference on Neural Information …, 2019 - Springer

This paper presents a novel approach for cross-modal retrieval in an Adversarial Learning

with Wasserstein Distance (ALWD) manner, which aims at learning aligned representation

for various modalities in a GAN framework. The generator projects the image and the text …

  Related articles


[PDF] arxiv.org

Learning with Wasserstein barycenters and applications

G Domazakis, D Drivaliaris, S Koukoulas… - arXiv preprint arXiv …, 2019 - arxiv.org

In this work, learning schemes for measure-valued data are proposed, ie data that their

structure can be more efficiently represented as probability measures instead of points on

$\mathbb{R}^d$, employing the concept of probability barycenters as defined with respect to the …

  Related articles All 3 versions 

 

[HTML] deepai.org

[HTML] Manifold-valued image generation with wasserstein adversarial networks

EW GANs - 2019 - deepai.org

Unsupervised image generation has recently received an increasing amount of attention thanks

to the great success of generative adversarial networks (GANs), particularly Wasserstein

GANs. Inspired by the paradigm of real-valued image generation, this paper makes the first attempt …

  Cited by 3 Related articles 


[PDF] ntu.edu.sg

Poisson discretizations of Wiener functionals and Malliavin operators with Wasserstein estimates

N Privault, SCP Yam, Z Zhang - Stochastic Processes and their …, 2019 - Elsevier

This article proposes a global, chaos-based procedure for the discretization of functionals of

Brownian motion into functionals of a Poisson process with intensity λ> 0. Under this

discretization we study the weak convergence, as the intensity of the underlying Poisson …

Cited by 1 Related articles All 4 versions


2019


[PDF] openreview.net

Fairness with Wasserstein Adversarial Networks

L Jean-Michel, E Pauwels - 2019 - openreview.net

Quantifying, enforcing and implementing fairness emerged as a major topic in machine

learning. We investigate these questions in the context of deep learning. Our main

algorithmic and theoretical tool is the computational estimation of similarities between …

  Related articles 


[PDF] Fairness with Wasserstein Adversarial Networks

M Serrurier, JM Loubes, E Pauwels - 2019 - researchgate.net

Quantifying, enforcing and implementing fairness emerged as a major topic in machine

learning. We investigate these questions in the context of deep learning. Our main

algorithmic and theoretical tool is the computational estimation of similarities between …

  Cited by 4 Related articles 


[PDF] arxiv.org

A nonlocal free boundary problem with Wasserstein distance

A Karakhanyan - arXiv preprint arXiv:1904.06270, 2019 - arxiv.org

We study the probability measures $\rho\in\mathcal{M}(\mathbb{R}^2)$ minimizing the functional \[J[\rho]=\iint\log\frac{1}{|x-y|}\,d\rho(x)\,d\rho(y)+d^2(\rho,\rho_0),\] where $\rho_0$ is a given probability measure and $d(\rho,\rho_0)$ is the 2-Wasserstein distance of …

  Related articles All 2 versions 


[PDF] sciencedirect.com

Distributionally Robust Chance-Constraint Optimal Power Flow Considering Uncertain Renewables with Wasserstein-Moment Metric

J Liu, Y Chen, C Duan, J Lyu - Energy Procedia, 2019 - Elsevier

Chance-constraint optimal power flow has been proven as an efficient method to manage

the risk of volatile renewable energy sources. To address the uncertainties of renewable

energy sources, a novel distributionally robust chance-constraint OPF model is proposed in …

  Cited by 1 Related articles All 2 versions

[PDF] aau.dk

[PDF] Full-Band Music Genres Interpolations with Wasserstein Autoencoders

T Borghuis, A Tibo, S Conforti, L Brusci… - Workshop AI for Media …, 2019 - vbn.aau.dk

We compare different types of autoencoders for generating interpolations between four-

instruments musical patterns in the acid jazz, funk, and soul genres. Preliminary empirical

results suggest the superiority of Wasserstein autoencoders. The process of generation …

  Related articles All 3 versions 

<—2019———— 2019  ———- 780—— 

[PDF] koreascience.or.kr

Combining multi-task autoencoder with Wasserstein generative adversarial networks for improving speech recognition performance

CY Kao, H Ko - The Journal of the Acoustical Society of Korea, 2019 - koreascience.or.kr

As the presence of background noise in acoustic signal degrades the performance of

speech or acoustic event recognition, it is still challenging to extract noise-robust acoustic

features from noisy signal. In this paper, we propose a combined structure of Wasserstein …

  Related articles All 2 versions 


[PDF] ceur-ws.org

[PDF] Dialogue response generation with Wasserstein generative adversarial networks

SAS Gilani, E Jembere, AW Pillay - 2019 - ceur-ws.org

This research evaluates the effectiveness of a Generative Adversarial Network (GAN) for

open domain dialogue response systems. The research involves developing and evaluating

a Conditional Wasserstein GAN (CWGAN) for natural dialogue response generation. We …

  Related articles 


[PDF] arxiv.org

A variational finite volume scheme for Wasserstein gradient flows

C Cancès, TO Gallouët, G Todeschi - arXiv preprint arXiv:1907.08305, 2019 - arxiv.org

We propose a variational finite volume scheme to approximate the solutions to Wasserstein

gradient flows. The time discretization is based on an implicit linearization of the

Wasserstein distance expressed thanks to Benamou-Brenier formula, whereas space …

  Cited by 5 Related articles All 12 versions 


[PDF] arxiv.org

Straight-through estimator as projected Wasserstein gradient flow

P Cheng, C Liu, C Li, D Shen, R Henao… - arXiv preprint arXiv …, 2019 - arxiv.org

The Straight-Through (ST) estimator is a widely used technique for back-propagating

gradients through discrete random variables. However, this effective method lacks

theoretical justification. In this paper, we show that ST can be interpreted as the simulation of …

  Cited by 4 Related articles All 6 versions 

 

[PDF] arxiv.org

On the total variation Wasserstein gradient flow and the TV-JKO scheme

G Carlier, C Poon - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

We study the JKO scheme for the total variation, characterize the optimizers, prove some of

their qualitative properties (in particular a form of maximum principle and in some cases, a

minimum principle as well). Finally, we establish a convergence result as the time step goes …

  Cited by 7 Related articles All 7 versions
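The JKO scheme mentioned in the previous entry, and implicitly underlying the finite-volume scheme above, is the minimizing-movement time discretization of a Wasserstein gradient flow (standard form, generic energy):

\[ \rho^{k+1} \in \operatorname*{arg\,min}_{\rho}\ \Big\{ \frac{1}{2\tau} W_2^2(\rho,\rho^{k}) + \mathcal{E}(\rho) \Big\}, \]

whose piecewise-constant interpolation converges, as the time step $\tau \to 0$, to the gradient flow $\partial_t \rho = \nabla\cdot\big(\rho\,\nabla\,\delta\mathcal{E}/\delta\rho\big)$.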


2019

[PDF] wiley.com

A degenerate Cahn‐Hilliard model as constrained Wasserstein gradient flow

D Matthes, C Cancès, F Nabet - PAMM, 2019 - Wiley Online Library

Existence of solutions to a non‐local Cahn‐Hilliard model with degenerate mobility is

considered. The PDE is written as a gradient flow with respect to the L2‐Wasserstein metric

for two components that are coupled by an incompressibility constraint. Approximating …

  Related articles All 5 versions


Bounds for the Wasserstein mean with applications to the Lie-Trotter mean

J Hwang, S Kim - Journal of Mathematical Analysis and Applications, 2019 - Elsevier

Since barycenters in the Wasserstein space of probability distributions have been

introduced, the Wasserstein metric and the Wasserstein mean of positive definite Hermitian

matrices have been recently developed. In this paper, we explore some properties of …

  Cited by 3 Related articles All 5 versions


[PDF] arxiv.org

Statistical aspects of Wasserstein distances

VM Panaretos, Y Zemel - Annual review of statistics and its …, 2019 - annualreviews.org

Wasserstein distances are metrics on probability distributions inspired by the problem of

optimal mass transportation. Roughly speaking, they measure the minimal effort required to …

Cited by 82 Related articles All 10 versions


[PDF] nips.cc

Generalized sliced Wasserstein distances

S Kolouri, K Nadjahi, U Simsekli, R Badeau… - Advances in Neural …, 2019 - papers.nips.cc

The Wasserstein distance and its variations, eg, the sliced-Wasserstein (SW) distance, have

recently drawn attention from the machine learning community. The SW distance,

specifically, was shown to have similar properties to the Wasserstein distance, while being …

  Cited by 40 Related articles All 6 versions 


[PDF] arxiv.org

Subspace robust wasserstein distances

FP Paty, M Cuturi - arXiv preprint arXiv:1901.08949, 2019 - arxiv.org

Making sense of Wasserstein distances between discrete measures in high-dimensional

settings remains a challenge. Recent work has advocated a two-step approach to improve

robustness and facilitate the computation of optimal transport, using for instance projections …

  Cited by 36 Related articles All 4 versions 

[CITATION] Subspace Robust Wasserstein Distances

M Cuturi, FP Paty - 2019

  Cited by 37 Related articles All 5 versions 

<—2019———— 2019  ———- 790——  


[PDF] arxiv.org

Orthogonal estimation of wasserstein distances

M Rowland, J Hron, Y Tang, K Choromanski… - arXiv preprint arXiv …, 2019 - arxiv.org

Wasserstein distances are increasingly used in a wide variety of applications in machine

learning. Sliced Wasserstein distances form an important subclass which may be estimated

efficiently through one-dimensional sorting operations. In this paper, we propose a new …

  Cited by 8 Related articles All 8 versions 

[PDF] nips.cc

Tree-sliced variants of wasserstein distances

T Le, M Yamada, K Fukumizu, M Cuturi - Advances in neural …, 2019 - papers.nips.cc

Optimal transport (\OT) theory defines a powerful set of tools to compare probability

distributions.\OT~ suffers however from a few drawbacks, computational and statistical,

which have encouraged the proposal of several regularized variants of OT in the recent …

  Cited by 12 Related articles All 5 versions 

[PDF] nips.cc

[PDF] Tree-Sliced Variants of Wasserstein Distances

T Le, M Yamada, K Fukumizu, M Cuturi - Advances in Neural …, 2019 - papers.nips.cc

In this section, we give detailed proofs for the inequality in the connection with OT with

Euclidean ground metric (ie W2 metric) for TW distance, and investigate an empirical

relation between TSW and W2 metric, especially when one increases the number of tree …

  Related articles All 2 versions 



[PDF] arxiv.org

Estimation of wasserstein distances in the spiked transport model

J Niles-Weed, P Rigollet - arXiv preprint arXiv:1909.07513, 2019 - arxiv.org

We propose a new statistical model, the spiked transport model, which formalizes the

assumption that two probability distributions differ only on a low-dimensional subspace. We

study the minimax rate of estimation for the Wasserstein distance under this model and show …

  Cited by 12 Related articles All 2 versions 



[PDF] arxiv.org

Parameter estimation for biochemical reaction networks using Wasserstein distances

K Öcal, R Grima, G Sanguinetti - Journal of Physics A …, 2019 - iopscience.iop.org

We present a method for estimating parameters in stochastic models of biochemical reaction

networks by fitting steady-state distributions using Wasserstein distances. We simulate a

reaction network at different parameter settings and train a Gaussian process to learn the …

  Cited by 3 Related articles All 6 versions
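The fitting idea in this entry (and in the related Springer entry below) can be sketched with a stand-in simulator: compare observed and simulated steady-state samples with a 1D Wasserstein distance and minimize over parameters. The Poisson model and the numbers here are hypothetical placeholders for an actual stochastic-simulation algorithm:

    import numpy as np
    from scipy.stats import wasserstein_distance

    rng = np.random.default_rng(0)
    observed = rng.poisson(lam=12.0, size=2000)      # "measured" molecule counts

    def simulate(rate, n=2000):
        # Stand-in for a stochastic reaction-network simulator (e.g. SSA).
        return rng.poisson(lam=rate, size=n)

    def loss(rate):
        # 1D Wasserstein-1 distance between observed and simulated samples.
        return wasserstein_distance(observed, simulate(rate))

    rates = np.linspace(5.0, 20.0, 61)
    best = rates[np.argmin([loss(r) for r in rates])]
    print(best)   # should land near the true rate of 12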



[PDF] arxiv.org

Accelerated linear convergence of stochastic momentum methods in wasserstein distances

B Can, M Gurbuzbalaban, L Zhu - arXiv preprint arXiv:1901.07445, 2019 - arxiv.org

Momentum methods such as Polyak's heavy ball (HB) method, Nesterov's accelerated

gradient (AG) as well as accelerated projected gradient (APG) method have been commonly

used in machine learning practice, but their performance is quite sensitive to noise in the …

  Cited by 14 Related articles All 8 versions 


2019


[PDF] researchgate.net

[PDF] Tree-sliced approximation of wasserstein distances

T Le, M Yamada, K Fukumizu… - arXiv preprint arXiv …, 2019 - researchgate.net

Optimal transport (OT) theory provides a useful set of tools to compare probability

distributions. As a consequence, the field of OT is gaining traction and interest within the

machine learning community. A few deficiencies usually associated with OT include its high …

  Cited by 4 Related articles 


[PDF] arxiv.org

Inequalities for the Wasserstein mean of positive definite matrices

R Bhatia, T Jain, Y Lim - Linear Algebra and its Applications, 2019 - Elsevier

Let $A_1,\ldots,A_m$ be given positive definite matrices and let $w=(w_1,\ldots,w_m)$ be a vector of weights, i.e., $w_j\ge 0$ and $\sum_{j=1}^m w_j=1$. Then the (weighted) Wasserstein mean, or the Wasserstein barycentre, of $A_1,\ldots,A_m$ is defined as (2) $\Omega(w;A_1,\ldots,A_m)=\operatorname{argmin}_{X\in\mathbb{P}}\sum_{j=1}^m w$ …

  Cited by 12 Related articles All 5 versions
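For reference, the metric behind the Wasserstein mean of positive definite matrices in this and the Hwang–Kim entry above is the Bures–Wasserstein distance, i.e. the 2-Wasserstein distance between the centered Gaussians N(0,A) and N(0,B):

\[ d_W(A,B)^2 = \operatorname{tr}(A) + \operatorname{tr}(B) - 2\,\operatorname{tr}\Big( \big(A^{1/2} B A^{1/2}\big)^{1/2} \Big), \]

and the weighted Wasserstein mean $\Omega(w;A_1,\ldots,A_m)$ minimizes $\sum_j w_j\, d_W^2(X,A_j)$ over positive definite $X$.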

Wasserstein Distances for Estimating Parameters in Stochastic Reaction Networks

K Öcal, R Grima, G Sanguinetti - International Conference on …, 2019 - Springer

Modern experimental methods such as flow cytometry and fluorescence in-situ hybridization

(FISH) allow the measurement of cell-by-cell molecule numbers for RNA, proteins and other

substances for large numbers of cells at a time, opening up new possibilities for the …

  Related articles All 3 versions


[PDF] arxiv.org

On Efficient Multilevel Clustering via Wasserstein Distances

V Huynh, N Ho, N Dam, XL Nguyen… - arXiv preprint arXiv …, 2019 - arxiv.org

We propose a novel approach to the problem of multilevel clustering, which aims to

simultaneously partition data in each group and discover grouping patterns among groups

in a potentially large hierarchically structured corpus of data. Our method involves a joint …

  Related articles All 2 versions 


[PDF] researchgate.net

[PDF] Tropical Optimal Transport and Wasserstein Distances

W Lee, W Li, B Lin, A Monod - arXiv preprint arXiv:1911.05401, 2019 - researchgate.net

We study the problem of optimal transport in tropical geometry and define the Wasserstein-p

distances for probability measures in the continuous metric measure space setting of the

tropical projective torus. We specify the tropical metric—a combinatorial metric that has been …

  Cited by 1 

[PDF] arxiv.org

Tropical Optimal Transport and Wasserstein Distances in Phylogenetic Tree Space

W Lee, W Li, B Lin, A Monod - arXiv preprint arXiv:1911.05401, 2019 - arxiv.org

We study the problem of optimal transport on phylogenetic tree space from the perspective

of tropical geometry, and thus define the Wasserstein-$ p $ distances for probability

measures in this continuous metric measure space setting. With respect to the tropical metric …

  Related articles All 5 versions 

<—2019———— 2019  ———- 800——  


[PDF] arxiv.org

Wasserstein distances for evaluating cross-lingual embeddings

G Balikas, I Partalas - arXiv preprint arXiv:1910.11005, 2019 - arxiv.org

Word embeddings are high dimensional vector representations of words that capture their

semantic similarity in the vector space. There exist several algorithms for learning such

embeddings both for a single language as well as for several languages jointly. In this work …

 Cited by 1 Related articles All 4 versions 
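
For intuition, a small sketch of one standard way to compare two embedding clouds with optimal transport (my own toy setup, not the paper's pipeline): with equal-size clouds and uniform weights the optimal plan is a permutation, so the 1-Wasserstein distance reduces to an assignment problem.

# Minimal sketch (assumed setup, not from the paper): for two equal-size sets of
# word vectors with uniform weights, the optimal transport plan is a permutation,
# so the 1-Wasserstein distance reduces to an assignment problem.
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

rng = np.random.default_rng(1)
src = rng.normal(size=(50, 300))   # e.g. English embeddings (toy data)
tgt = rng.normal(size=(50, 300))   # e.g. French embeddings mapped to the same space

cost = cdist(src, tgt, metric="euclidean")   # pairwise ground costs
rows, cols = linear_sum_assignment(cost)     # optimal one-to-one matching
w1 = cost[rows, cols].mean()                 # 1-Wasserstein under uniform weights
print(f"W1 between embedding clouds: {w1:.3f}")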

[PDF] arxiv.org

The Wasserstein Distances Between Pushed-Forward Measures with Applications to Uncertainty Quantification

A Sagiv - arXiv preprint arXiv:1902.05451, 2019 - arxiv.org

In the study of dynamical and physical systems, the input parameters are often uncertain or

randomly distributed according to a measure $\varrho $. The system's response $ f $ pushes

forward $\varrho $ to a new measure $ f\circ\varrho $ which we would like to study. However …

Cited by 1 Related articles All 3 versions 

[PDF] d-nb.info

[PDF] Algorithms for Optimal Transport and Wasserstein Distances

J Schrieber - 2019 - d-nb.info

Optimal Transport and Wasserstein Distance are closely related terms that do not only have

a long history in the mathematical literature, but also have seen a resurgence in recent

years, particularly in the context of the many applications they are used in, which span a …

  Related articles 


[PDF] mlr.press

Wasserstein of Wasserstein loss for learning generative models

Y Dukler, W Li, A Lin… - … on Machine Learning, 2019 - proceedings.mlr.press

The Wasserstein distance serves as a loss function for unsupervised learning which

depends on the choice of a ground metric on sample space. We propose to use the

Wasserstein distance itself as the ground metric on the sample space of images. This …

Cited by 25 Related articles All 12 versions 

 

[PDF] unibocconi.it

[PDF] Bayesian model comparison based on Wasserstein distances

M Catalano, A Lijoi, I Pruenster - SIS 2019 Smart Statistics for …, 2019 - iris.unibocconi.it


Cited by 2 Related articles 

[PDF] arxiv.org

Universality of persistence diagrams and the bottleneck and Wasserstein distances

P Bubenik, A Elchesen - arXiv preprint arXiv:1912.02563, 2019 - arxiv.org

We undertake a formal study of persistence diagrams and their metrics. We show that

barcodes and persistence diagrams together with the bottleneck distance and the

Wasserstein distances are obtained via universal constructions and thus have …

  Cited by 3 Related articles All 4 versions 


Adapted Wasserstein Distances and Stability in Mathematical ...

arxiv.org › q-fin

by J Backhoff-Veraguas · ‎2019 · ‎Cited by 18 · ‎Related articles

Quantitative Finance > Mathematical Finance. arXiv:1901.07450 (q-fin). [Submitted on 22 Jan 2019 (v1), last revised 14 May 2020 (this version, v3)] ...

[CITATION] Adapted wasserstein distances and stability in mathematical finance. arXiv e-prints, page

J Backhoff-Veraguas, D Bartl, M Beiglböck, M Eder - arXiv preprint arXiv:1901.07450, 2019

  Cited by 4 Related articles

 2019  

On the Complexity of Approximating Wasserstein Barycenters

http://proceedings.mlr.press › ...

http://proceedings.mlr.press › ...PDF

by A Kroshnin · 2019 · Cited by 75 — Optimal transport distances lead to the concept of Wasserstein barycenter, which allows to define a mean of a set of complex objects, e.g. images, preserving ...

11 pages

[CITATION] On the complexity of computing Wasserstein distances

B Taskesen, S Shafieezadeh-Abadeh, D Kuhn - 2019 - Working paper

  Cited by 2 Related articles


2019z1   see 2018 2016  2017   

[CITATION] Application of optimal transport theory in seismology: Wasserstein distances for seismic full waveform inversion

M Yu - 2019 - theses.fr

… Application of optimal transport theory in seismology: Wasserstein distances for seismic full waveform inversion. By Miao Yu. Thesis project in Earth and atmospheric sciences. The defence is scheduled for 01-09-2019. Under the supervision of Jean-pierre …



[PDF] mlr.press

On the complexity of approximating Wasserstein barycenters

A Kroshnin, N Tupitsa, D Dvinskikh… - … conference on …, 2019 - proceedings.mlr.press

… We study the complexity of approximating the Wasserstein barycenter of m discrete

measures, or histograms of size n, by contrasting two alternative approaches that use entropic …

Cited by 73 Related articles All 9 versions 
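
For reference, a textbook Sinkhorn sketch of the entropic-regularization primitive that such barycenter algorithms build on (illustrative only; this is the two-measure transport step, not the paper's barycenter method, and the histograms and eps value are my own):

# Minimal Sinkhorn sketch: entropic regularization of optimal transport between
# two histograms, the primitive underlying entropic approaches to barycenters.
import numpy as np

def sinkhorn(a, b, cost, eps=0.05, n_iter=500):
    # a, b: histograms (nonnegative, summing to 1); cost: ground cost matrix.
    K = np.exp(-cost / eps)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    plan = u[:, None] * K * v[None, :]
    return np.sum(plan * cost)   # regularized transport cost

n = 50
x = np.linspace(0, 1, n)
cost = (x[:, None] - x[None, :]) ** 2
a = np.exp(-(x - 0.3) ** 2 / 0.01); a /= a.sum()
b = np.exp(-(x - 0.7) ** 2 / 0.01); b /= b.sum()
print(sinkhorn(a, b, cost))      # approximates the squared W2 between the histograms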


(q, p)-Wasserstein GANs: Comparing Ground Metrics for Wasserstein GANs

A Mallasto, J Frellsen, W Boomsma… - arXiv preprint arXiv …, 2019 - arxiv.org

Generative Adversial Networks (GANs) have made a major impact in computer vision and

machine learning as generative models. Wasserstein GANs (WGANs) brought Optimal

Transport (OT) theory into GANs, by minimizing the $1 $-Wasserstein distance between …

  Cited by 3 Related articles All 2 versions 


[PDF] arxiv.org

Orthogonal Wasserstein GANs

J Müller, R Klein, M Weinmann - arXiv preprint arXiv:1911.13060, 2019 - arxiv.org

Wasserstein-GANs have been introduced to address the deficiencies of generative

adversarial networks (GANs) regarding the problems of vanishing gradients and mode

collapse during the training, leading to improved convergence behaviour and improved …

  Cited by 1 Related articles All 2 versions 

<—2019———— 2019  ———- 810——  


[PDF] nips.cc

Quantum Wasserstein Generative Adversarial Networks

S Chakrabarti, H Yiming, T Li, S Feizi… - Advances in Neural …, 2019 - papers.nips.cc

The study of quantum generative models is well-motivated, not only because of its importance in quantum machine learning and quantum chemistry but also because of the perspective of its implementation on near-term quantum machines. Inspired by previous …

[PDF] umd.edu

[PDF] Quantum Wasserstein GANs

S Chakrabarti, Y Huang, L Tongyang, S Feizi… - 33rd Conference on …, 2019 - cs.umd.edu

We propose the first design of quantum Wasserstein Generative Adversarial Networks

(WGANs), which has been shown to improve the robustness and the scalability of the

adversarial training of quantum generative models even on noisy quantum hardware …

  Cited by 1 Related articles 


[PDF] arxiv.org

The boundary method for semi-discrete optimal transport partitions and Wasserstein distance computation

L Dieci, JD Walsh III - Journal of Computational and Applied Mathematics, 2019 - Elsevier

We introduce a new technique, which we call the boundary method, for solving semi-

discrete optimal transport problems with a wide range of cost functions. The boundary

method reduces the effective dimension of the problem, thus improving complexity. For cost …

  Cited by 9 Related articles All 5 versions


Improved Procedures for Training Primal Wasserstein GANs

T Zhang, Z Li, Q Zhu, D Zhang - 2019 IEEE SmartWorld …, 2019 - ieeexplore.ieee.org

Primal Wasserstein GANs are a variant of Generative Adversarial Networks (ie, GANs),

which optimize the primal form of empirical Wasserstein distance directly. However, the high

computational complexity and training instability are the main challenges of this framework …

  Related articles


[PDF] arxiv.org

Optimal Transport Relaxations with Application to Wasserstein GANs

S Mahdian, J Blanchet, P Glynn - arXiv preprint arXiv:1906.03317, 2019 - arxiv.org

We propose a family of relaxations of the optimal transport problem which regularize the

problem by introducing an additional minimization step over a small region around one of

the underlying transporting measures. The type of regularization that we obtain is related to …

  Related articles All 4 versions 


Training Wasserstein GANs for Estimating Depth Maps

AT Arslan, E Seke - 2019 3rd International Symposium on …, 2019 - ieeexplore.ieee.org

Depth maps depict pixel-wise depth association with a 2D digital image. Point clouds

generation and 3D surface reconstruction can be conducted by processing a depth map.

Estimating a corresponding depth map from a given input image is an important and difficult …

  Related articles

2019


[PDF] openreview.net

A Greedy Approach to Max-Sliced Wasserstein GANs

A Horváth - 2019 - openreview.net

Generative Adversarial Networks have made data generation possible in various use cases,

but in case of complex, high-dimensional distributions it can be difficult to train them,

because of convergence problems and the appearance of mode collapse. Sliced …

Related articles All 2 versions 



2019   see 2020

Bridging the Gap Between $ f $-GANs and Wasserstein GANs

https://arxiv.org › cs

by J Song · 2019 · Cited by 10 — Computer Science > Machine Learning. arXiv:1910.09779 (cs). [Submitted on 22 Oct 2019 (v1), last revised 17 Jun 2020 (this version, v2)] ...

[CITATION] Bridging the Gap Between f-GANs and Wasserstein GANs. arXiv e-prints, page

J Song, S Ermon - arXiv preprint arXiv:1910.09779, 2019

  Cited by 3

[PDF] nips.cc

Multi-marginal wasserstein gan

J Cao, L Mo, Y Zhang, K Jia, C Shen… - Advances in Neural …, 2019 - papers.nips.cc

Multiple marginal matching problem aims at learning mappings to match a source domain to

multiple target domains and it has attracted great attention in many applications, such as

multi-domain image translation. However, addressing this problem has two critical …

  Cited by 25 Related articles All 5 versions 

[PDF] nips.cc

[PDF] Multi-marginal Wasserstein GAN

J Cao, L Mo, Y Zhang, K Jia, C Shen… - Advances in Neural …, 2019 - papers.nips.cc

Theory part. In Section A, we provide preliminaries of multi-marginal optimal transport. In

Section B, we prove an equivalence theorem that solving Problem II is equivalent to solving

Problem III under a mild assumption. In Section C, we build the relationship between …

  Cited by 31 Related articles All 5 versions 


[PDF] thecvf.com

Wasserstein GAN with quadratic transport cost

H Liu, X Gu, D Samaras - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

Wasserstein GANs are increasingly used in Computer Vision applications as they are easier

to train. Previous WGAN variants mainly use the l_1 transport cost to compute the

Wasserstein distance between the real and synthetic data distributions. The l_1 transport …

  Cited by 14 Related articles All 3 versions 

[PDF] thecvf.com

[PDF] Wasserstein GAN with Quadratic Transport Cost Supplementary Material

H Liu, X Gu, D Samaras - openaccess.thecvf.com

(1) where \(I\) and \(J\) are disjoint sets, then for each \(x_j\) there exists a \(t\in I\) such that \(H_t-H_j=c(x_j,y_t)\). We prove this by contradiction, i.e., there exists one \(x_s\), \(s\in J\), such that we cannot find a \(y_i\) such that \(H_i-H_s=c(x_s,y_i)\), \(i\in I\). This means that \(H_s>\sup_{i\in I}\{H_i-c(x_s,y_i)\}\) …

 Cited by 17 Related articles All 5 versions 

<—2019———— 2019  ———- 820—— 


Wgansing: A multi-voice singing voice synthesizer based on the wasserstein-gan

P Chandna, M Blaauw, J Bonada… - … Processing Conference …, 2019 - ieeexplore.ieee.org

… Wasserstein-GAN model for singing voice synthesis. In this paper, we present a novel block-wise generative model for singing voice synthesis, trained using the Wasserstein …
Cited by 43 Related articles All 6 versions

[PDF] arxiv.org

Towards Diverse Paraphrase Generation Using Multi-Class Wasserstein GAN

Z An, S Liu - arXiv preprint arXiv:1909.13827, 2019 - arxiv.org

Paraphrase generation is an important and challenging natural language processing (NLP)

task. In this work, we propose a deep generative model to generate paraphrase with

diversity. Our model is based on an encoder-decoder architecture. An additional transcoder …

  Cited by 4 Related articles All 4 versions 


[PDF] monash.edu

[PDF] Threeplayer wasserstein gan via amortised duality

QH Nhan Dam, T Le, TD Nguyen… - Proc. of the 28th Int …, 2019 - research.monash.edu

We propose a new formulation for learning generative adversarial networks (GANs) using

optimal transport cost (the general form of Wasserstein distance) as the objective criterion to

measure the dissimilarity between target distribution and learned distribution. Our …

  Cited by 2 Related articles All 4 versions 


[PDF] aaai.org

EWGAN: Entropy-based Wasserstein GAN for imbalanced learning

J Ren, Y Liu, J Liu - Proceedings of the AAAI Conference on Artificial …, 2019 - aaai.org

In this paper, we propose a novel oversampling strategy dubbed Entropy-based

Wasserstein Generative Adversarial Network (EWGAN) to generate data samples for

minority classes in imbalanced learning. First, we construct an entropyweighted label vector …

  Cited by 1 Related articles All 5 versions 


[PDF] semanticscholar.org

Speech Enhancement for Noise-Robust Speech Synthesis Using Wasserstein GAN.

N Adiga, Y Pantazis, V Tsiaras, Y Stylianou - INTERSPEECH, 2019 - isca-speech.org

The quality of speech synthesis systems can be significantly deteriorated by the presence of

background noise in the recordings. Despite the existence of speech enhancement

techniques for effectively suppressing additive noise under low signal-tonoise (SNR) …

  Cited by 2 Related articles All 2 versions


2019

Music Classification using Multiclass Support Vector Machine and Multilevel Wasserstein Means

J Wei, C Jin, Z Cheng, X Lv… - 2019 IEEE/ACIS 18th …, 2019 - ieeexplore.ieee.org

Music classification is a challenging task in music information retrieval. In this article, we

compare the performance of the two types of models. The first category is classified by

Support Vector Machine (SVM). We use the feature extraction from audio as the basis of …

  Related articles All 2 versions

[PDF] phmsociety.org

[PDF] Anomaly detection on time series with wasserstein GAN applied to PHM

M Ducoffe, I Haloui, JS Gupta… - PHM Applications of Deep …, 2019 - phmsociety.org

Modern vehicles are more and more connected. For instance, in the aerospace industry,

newer aircraft are already equipped with data concentrators and enough wireless

connectivity to transmit sensor data collected during the whole flight to the ground, usually …

  Cited by 2 Related articles All 2 versions 


[PDF] thecvf.com

Face Synthesis and Recognition Using Disentangled Representation-Learning Wasserstein GAN

GS Jison Hsu, CH Tang… - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

Abstract We propose the Disentangled Representation-learning Wasserstein GAN (DR-

WGAN) trained on augmented data for face recognition and face synthesis across pose. We

improve the state-of-the-art DR-GAN with the Wasserstein loss considered in the …

  Related articles 


[PDF] semanticscholar.org

Face Synthesis and Recognition Using Disentangled Representation-Learning Wasserstein GAN

GSJ Hsu, CH Tang, MH Yap - 2019 IEEE/CVF Conference on …, 2019 - ieeexplore.ieee.org

We propose the Disentangled Representation-learning Wasserstein GAN (DR-WGAN)

trained on augmented data for face recognition and face synthesis across pose. We improve

the state-of-the-art DR-GAN with the Wasserstein loss considered in the discriminator so that …

  Related articles All 2 versions


[PDF] arxiv.org

Wasserstein GAN Can Perform PCA

J Cho, C Suh - 2019 57th Annual Allerton Conference on …, 2019 - ieeexplore.ieee.org

Generative Adversarial Networks (GANs) have become a powerful framework to learn

generative models that arise across a wide variety of domains. While there has been a

recent surge in the development of numerous GAN architectures with distinct optimization …

  Related articles All 6 versions

<—2019———— 2019  ———- 830——


Frame-level speech enhancement based on Wasserstein GAN

P Chuan, T Lan, M Li, S Li, Q Liu - … International Conference on …, 2019 - spiedigitallibrary.org

Speech enhancement is a challenging and critical task in the speech processing research

area. In this paper, we propose a novel speech enhancement model based on Wasserstein

generative adversarial networks, called WSEM. The proposed model operates on frame …

  Related articles All 2 versions

 

2019

Input limited Wasserstein GAN

C FD - 2019 - ir.sia.cn

Generative adversarial networks (GANs) has proven hugely successful, but suffer from train

instability. The recently proposed Wasserstein GAN (WGAN) has largely overcome the

problem, but can still fail to converge in some case or be to complex. It has been found that …

  Related articles 


[CITATION] Wasserstein gan. arXiv 2017

M Arjovsky, S Chintala, L Bottou - arXiv preprint arXiv:1701.07875, 2019

  Cited by 68 Related articles

[CITATION] GAN–Wasserstein GAN & WGAN-GP

J Hui - 2019

  Cited by 5 Related articles

 

基于 Wasserstein GAN 的文档表示模型

马永军, 李亚军, 汪睿, 陈海山 - 计算机工程与科学, 2019 - airitilibrary.com

A document representation model converts unstructured text data into structured data and underlies many natural language processing tasks, but current word-based models cannot represent documents directly. To address this problem, a generative adversarial network (GAN) can use two neural networks for adversarial learning and thereby learn the original data distribution well …

  All 2 versions

[Chinese  Document representation model based on Wasserstein GAN]


基于 Wasserstein GAN 的新一代人工智能小样本数据增强方法——以生物领域癌症分期数据为例

刘宇飞, 周源, 刘欣, 董放, 王畅, 王子鸿 - Engineering, 2019 - cnki.com.cn

Deep learning algorithms built on big data play a major role in driving the rapid development of new-generation artificial intelligence. However, the effective use of deep learning depends heavily on large numbers of labelled samples, which constrains its application in small-sample settings. This study proposes a generative adversarial network (GAN) based …

  Related articles 

[Chinese  A Wasserstein GAN-based small-sample data augmentation method for new-generation artificial intelligence, taking cancer staging data in the biological field as an example]


2019

https://zhuanlan.zhihu.com › ...

令人拍案叫绝的Wasserstein GAN

Today's protagonist, the Wasserstein GAN (WGAN for short), pulls off the following remarkable feats: it thoroughly resolves the instability of GAN training, so the training of the generator and the discriminator no longer needs to be carefully balanced; it largely resolves …

[Chinese  The astonishing Wasserstein GAN - Alibaba Cloud developer community]

[CITATION] 令人拍案叫绝的 Wasserstein GAN

郑华滨 - 2017-04-02 [2018-01-20]. https://zhuanlan. zhihu. com …, 2019 - 计算机工程与应用

  Cited by 2 Related articles

[Chinese  Zheng Huabin, The amazing Wasserstein GAN - Computer Engineering and Applications]


On the Bures–Wasserstein distance between positive definite matrices

R Bhatia, T Jain, Y Lim - Expositiones Mathematicae, 2019 - Elsevier

The metric \(d(A,B)=\bigl[\operatorname{tr}A+\operatorname{tr}B-2\operatorname{tr}\bigl(A^{1/2}BA^{1/2}\bigr)^{1/2}\bigr]^{1/2}\) on the manifold of \(n\times n\) positive definite matrices arises in various optimisation problems, in quantum information and in the theory of optimal transport. It is also related to Riemannian geometry. In the first …

  Cited by 88 Related articles All 5 versions
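
A small numerical sketch that simply evaluates the displayed formula (the helper name and test matrices are mine, not from the paper):

# Sketch: the Bures-Wasserstein distance between two positive definite matrices,
# evaluated directly from the formula quoted above.
import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein(A, B):
    Ah = sqrtm(A)                         # A^{1/2}
    cross = sqrtm(Ah @ B @ Ah)            # (A^{1/2} B A^{1/2})^{1/2}
    d2 = np.trace(A) + np.trace(B) - 2.0 * np.trace(cross)
    return np.sqrt(max(d2.real, 0.0))     # guard against tiny negative round-off

A = np.array([[2.0, 0.5], [0.5, 1.0]])
B = np.array([[1.0, 0.2], [0.2, 3.0]])
print(bures_wasserstein(A, B))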


[PDF] arxiv.org

Convergence of some classes of random flights in Wasserstein distance

A Falaleev, V Konakov - arXiv preprint arXiv:1910.03862, 2019 - arxiv.org

In this paper we consider a random walk of a particle in $\mathbb {R}^ d $. Convergence of different transformations of trajectories of random flights with Poisson switching moments has been obtained by Davydov and Konakov, as well as diffusion approximation of the …

  Related articles All 2 versions 


[PDF] thecvf.com

Max-sliced wasserstein distance and its use for gans

I Deshpande, YT Hu, R Sun, A Pyrros… - Proceedings of the …, 2019 - openaccess.thecvf.com

Generative adversarial nets (GANs) and variational auto-encoders have significantly

improved our distribution modeling capabilities, showing promise for dataset augmentation,

image-to-image translation and feature learning. However, to model high-dimensional …

  Cited by 33 Related articles All 7 versions 
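
An illustrative numpy sketch of the sliced construction (my own, not the paper's implementation): project both samples onto random unit directions, compute the 1D Wasserstein distance of the sorted projections, then either average over directions (sliced) or take the maximum over the sampled directions as a crude stand-in for the max-sliced distance.

# Sliced and (sampled) max-sliced 1-Wasserstein distances between point clouds.
import numpy as np

def sliced_w1(X, Y, n_dirs=200, seed=0):
    rng = np.random.default_rng(seed)
    dirs = rng.normal(size=(n_dirs, X.shape[1]))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)   # unit directions
    dists = []
    for theta in dirs:
        px, py = np.sort(X @ theta), np.sort(Y @ theta)   # 1D projections
        dists.append(np.mean(np.abs(px - py)))            # 1D W1 for equal-size samples
    dists = np.array(dists)
    return dists.mean(), dists.max()   # (sliced, max over sampled slices)

X = np.random.default_rng(1).normal(size=(500, 10))
Y = np.random.default_rng(2).normal(loc=0.5, size=(500, 10))
sw, msw = sliced_w1(X, Y)
print(sw, msw)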


[PDF] arxiv.org

Thermodynamic interpretation of Wasserstein distance

A Dechant, Y Sakurai - arXiv preprint arXiv:1912.08405, 2019 - arxiv.org

We derive a relation between the dissipation in a stochastic dynamics and the Wasserstein

distance. We show that the minimal amount of dissipation required to transform an initial

state to a final state during a diffusion process is given by the Wasserstein distance between …

  Cited by 5 Related articles All 2 versions 

<—2019———— 2019  ———- 840—— 


[PDF] arxiv.org

Estimation of smooth densities in Wasserstein distance

J Weed, Q Berthet - arXiv preprint arXiv:1902.01778, 2019 - arxiv.org

The Wasserstein distances are a set of metrics on probability distributions supported on

$\mathbb {R}^ d $ with applications throughout statistics and machine learning. Often, such

distances are used in the context of variational problems, in which the statistician employs in …

  Cited by 19 Related articles All 5 versions 


[PDF] nips.cc

Concentration of risk measures: A Wasserstein distance approach

SP Bhat, LA Prashanth - Advances in Neural Information Processing …, 2019 - papers.nips.cc

Known finite-sample concentration bounds for the Wasserstein distance between the

empirical and true distribution of a random variable are used to derive a two-sided

concentration bound for the error between the true conditional value-at-risk (CVaR) of a …

  Cited by 10 Related articles All 4 versions 

[PDF] semanticscholar.org

[PDF] Concentration of risk measures: A Wasserstein distance approach

LA Prashanth - To appear in the proceedings of NeurIPS, 2019 - pdfs.semanticscholar.org

Concentration of risk measures: A Wasserstein distance approach. Prashanth LA, joint work with Sanjay P. Bhat (IIT Madras / TCS Research). To appear in the proceedings of NeurIPS 2019. Slide headings: Introduction; Risk criteria; Conditional Value-at-Risk …

  Cited by 13 Related articles All 5 versions 


[PDF] arxiv.org

Approximate Bayesian computation with the Wasserstein distance

E Bernton, PE Jacob, M Gerber, CP Robert - arXiv preprint arXiv …, 2019 - arxiv.org

A growing number of generative statistical models do not permit the numerical evaluation of

their likelihood functions. Approximate Bayesian computation (ABC) has become a popular

approach to overcome this issue, in which one simulates synthetic data sets given …

  Cited by 34 Related articles All 11 versions 
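
A toy rejection-ABC sketch of the idea (the model, prior, tolerance and sample sizes are my own illustrative choices, not taken from the paper): accept parameter draws whose simulated data fall within a Wasserstein-distance tolerance of the observed data, so the likelihood is never evaluated.

# Rejection ABC with the 1-Wasserstein distance as the discrepancy measure.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
observed = rng.gamma(shape=3.0, scale=2.0, size=500)   # stand-in for real data

def simulate(theta, n=500):
    return rng.gamma(shape=theta, scale=2.0, size=n)

accepted = []
for _ in range(5000):
    theta = rng.uniform(0.5, 8.0)                       # prior draw
    if wasserstein_distance(observed, simulate(theta)) < 0.5:   # tolerance
        accepted.append(theta)

print(len(accepted), np.mean(accepted))                 # approximate posterior sample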

 

[HTML] oup.com

The Gromov–Wasserstein distance between networks and stable network invariants

S Chowdhury, F Mémoli - Information and Inference: A Journal of …, 2019 - academic.oup.com

We define a metric—the network Gromov–Wasserstein distance—on weighted, directed

networks that is sensitive to the presence of outliers. In addition to proving its theoretical

properties, we supply network invariants based on optimal transport that approximate this …

  Cited by 16 Related articles All 6 versions


[PDF] arxiv.org

On distributionally robust chance constrained programs with Wasserstein distance

W Xie - Mathematical Programming, 2019 - Springer

This paper studies a distributionally robust chance constrained program (DRCCP) with

Wasserstein ambiguity set, where the uncertain constraints should be satisfied with a

probability at least a given threshold for all the probability distributions of the uncertain …

  Cited by 37 Related articles All 9 versions
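
For context, a minimal statement of the construction these distributionally robust papers share (notation mine, not copied from the paper): the ambiguity set is a Wasserstein ball of radius \(\theta\) around the empirical (or nominal) distribution \(\hat{\mathbb P}_N\),
\[ \mathcal B_\theta(\hat{\mathbb P}_N)=\bigl\{\mathbb Q : W_p\bigl(\mathbb Q,\hat{\mathbb P}_N\bigr)\le\theta\bigr\}, \]
and the chance constraint (or objective) is required to hold for every \(\mathbb Q\) in this ball.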

 2019

Wasserstein distance based domain adaptation for object detection

P Xu, P Gurram, G Whipps, R Chellappa - arXiv preprint arXiv:1909.08675, 2019 - arxiv.org

In this paper, we present an adversarial unsupervised domain adaptation framework for

object detection. Prior approaches utilize adversarial training based on cross entropy

between the source and target domain distributions to learn a shared feature mapping that …

Cited by 12 Related articles All 3 versions 

[HTML] oup.com

On parameter estimation with the Wasserstein distance

E Bernton, PE Jacob, M Gerber… - … and Inference: A …, 2019 - academic.oup.com

Statistical inference can be performed by minimizing, over the parameter space, the

Wasserstein distance between model distributions and the empirical distribution of the data.

We study asymptotic properties of such minimum Wasserstein distance estimators …

  Cited by 16 Related articles All 7 versions


Hyperbolic Wasserstein distance for shape indexing

J Shi, Y Wang - IEEE Transactions on Pattern Analysis and …, 2019 - ieeexplore.ieee.org

Shape space is an active research topic in computer vision and medical imaging fields. The

distance defined in a shape space may provide a simple and refined index to represent a

unique shape. This work studies the Wasserstein space and proposes a novel framework to …

Cited by 6 Related articles All 8 versions

 

[PDF] arxiv.org

Wasserstein distance based deep adversarial transfer learning for intelligent fault diagnosis

C Cheng, B Zhou, G Ma, D Wu, Y Yuan - arXiv preprint arXiv:1903.06753, 2019 - arxiv.org

The demand of artificial intelligent adoption for condition-based maintenance strategy is

astonishingly increased over the past few years. Intelligent fault diagnosis is one critical

topic of maintenance solution for mechanical systems. Deep learning models, such as …

  Cited by 15 Related articles All 3 versions 

[PDF] harvard.edu

An information-theoretic view of generalization via Wasserstein distance

H Wang, M Diaz, JCS Santos Filho… - … on Information Theory …, 2019 - ieeexplore.ieee.org

We capitalize on the Wasserstein distance to obtain two information-theoretic bounds on the

generalization error of learning algorithms. First, we specialize the Wasserstein distance into

total variation, by using the discrete metric. In this case we derive a generalization bound …

  Cited by 4 Related articles All 4 versions

<—2019———— 2019  ———- 850—


[PDF] ieee.org

A deep transfer model with wasserstein distance guided multi-adversarial networks for bearing fault diagnosis under different working conditions

M Zhang, D Wang, W Lu, J Yang, Z Li, B Liang - IEEE Access, 2019 - ieeexplore.ieee.org

In recent years, intelligent fault diagnosis technology with the deep learning algorithm has

been widely used in the manufacturing industry for substituting time-consuming human

Cited by 54 Related articles All 6 versions

[PDF] researchgate.net

Grid-less DOA estimation using sparse linear arrays based on Wasserstein distance

M Wang, Z Zhang, A Nehorai - IEEE Signal Processing Letters, 2019 - ieeexplore.ieee.org

Sparse linear arrays, such as nested and co-prime arrays, are capable of resolving \(O(M^2)\) sources using only \(O(M)\) sensors by exploiting their so-called difference coarray model. One

popular approach to exploit the difference coarray model is to construct an augmented …

  Cited by 3 Related articles All 4 versions


[PDF] arxiv.org

Unsupervised adversarial domain adaptation based on the wasserstein distance for acoustic scene classification

K Drossos, P Magron, T Virtanen - 2019 IEEE Workshop on …, 2019 - ieeexplore.ieee.org

A challenging problem in deep learning-based machine listening field is the degradation of

the performance when using data from unseen conditions. In this paper we focus on the

acoustic scene classification (ASC) task and propose an adversarial deep learning method …

 Cited by 27 Related articles All 9 versions


[PDF] arxiv.org

Multivariate stable approximation in Wasserstein distance by Stein's method

P Chen, I Nourdin, L Xu, X Yang - arXiv preprint arXiv:1911.12917, 2019 - arxiv.org

We investigate regularity properties of the solution to Stein's equation associated with

multivariate integrable $\alpha $-stable distribution for a general class of spectral measures

and Lipschitz test functions. The obtained estimates induce an upper bound in Wasserstein  …

Cited by 5 Related articles All 4 versions 

Time delay estimation via Wasserstein distance minimization

JM Nichols, MN Hutchinson, N Menkart… - IEEE Signal …, 2019 - ieeexplore.ieee.org

Time delay estimation between signals propagating through nonlinear media is an important

problem with application to radar, underwater acoustics, damage detection, and

communications (to name a few). Here, we describe a simple approach for determining the …

  Cited by 3 Related articles All 2 versions

[PDF] mdpi.com

Wasserstein distance learns domain invariant feature representations for drift compensation of E-nose

Y Tao, C Li, Z Liang, H Yang, J Xu - Sensors, 2019 - mdpi.com

Abstract Electronic nose (E-nose), a kind of instrument which combines with the gas sensor

and the corresponding pattern recognition algorithm, is used to detect the type and

concentration of gases. However, the sensor drift will occur in realistic application scenario …

Cited by 8 Related articles All 8 versions

[PDF] arxiv.org

Normalized Wasserstein distance for mixture distributions with applications in adversarial learning and domain adaptation

Y Balaji, R Chellappa, S Feizi - arXiv preprint arXiv:1902.00415, 2019 - arxiv.org

Understanding proper distance measures between distributions is at the core of several

learning tasks such as generative models, domain adaptation, clustering, etc. In this work,

we focus on mixture distributions that arise naturally in several application domains where …

Cited by 8 Related articles All 2 versions 

[PDF] ieee.org

Prostate MR image segmentation with self-attention adversarial training based on wasserstein distance

C Su, R Huang, C Liu, T Yin, B Du - IEEE Access, 2019 - ieeexplore.ieee.org

Prostate diseases are very common in men. Accurate segmentation of the prostate plays a

significant role in further clinical treatment and diagnosis. There have been some methods

that combine the segmentation network and generative adversarial network, using the …

  Cited by 2 Related articles

 

[PDF] arxiv.org

Geometric convergence bounds for Markov chains in Wasserstein distance based on generalized drift and contraction conditions

Q Qin, JP Hobert - arXiv preprint arXiv:1902.02964, 2019 - arxiv.org

Let $\{X_n\} _ {n= 0}^\infty $ denote an ergodic Markov chain on a general state space that

has stationary distribution $\pi $. This article concerns upper bounds on the $ L_1 $-

Wasserstein distance between the distribution of $ X_n $ and $\pi $. In particular, an explicit …

  Cited by 7 Related articles All 2 versions 


Aggregated Wasserstein Distance and State Registration for Hidden Markov Models

Y Chen, J Ye, J Li - IEEE Transactions on Pattern Analysis and …, 2019 - ieeexplore.ieee.org

We propose a framework, named Aggregated Wasserstein, for computing a distance

between two Hidden Markov Models with state conditional distributions being Gaussian. For

such HMMs, the marginal distribution at any time position follows a Gaussian mixture …

  Cited by 4 Related articles All 2 versions

<—2019———— 2019  ———- 860—


[PDF] arxiv.org

Approximation of stable law in Wasserstein-1 distance by Stein's method

L Xu - Annals of Applied Probability, 2019 - projecteuclid.org

Abstract Let $ n\in\mathbb {N} $, let $\zeta_ {n, 1},\ldots,\zeta_ {n, n} $ be a sequence of

independent random variables with $\mathbb {E}\zeta_ {n, i}= 0$ and $\mathbb {E}|\zeta_ {n,

i}|<\infty $ for each $ i $, and let $\mu $ be an $\alpha $-stable distribution having …

  Cited by 19 Related articles All 7 versions


[PDF] arxiv.org

On the estimation of the Wasserstein distance in generative models

T Pinetz, D Soukup, T Pock - German Conference on Pattern Recognition, 2019 - Springer

Abstract Generative Adversarial Networks (GANs) have been used to model the underlying

probability distribution of sample based datasets. GANs are notoriuos for training difficulties

and their dependence on arbitrary hyperparameters. One recent improvement in GAN …

  Related articles All 5 versions


Sufficient condition for rectifiability involving Wasserstein distance 

D Dąbrowski - arXiv preprint arXiv:1904.11004, 2019 - arxiv.org

A Radon measure $\mu $ is $ n $-rectifiable if it is absolutely continuous with respect to

$\mathcal {H}^ n $ and $\mu $-almost all of $\text {supp}\,\mu $ can be covered by Lipschitz

images of $\mathbb {R}^ n $. In this paper we give two sufficient conditions for rectifiability …

  Cited by 4 Related articles All 3 versions 

[PDF] projecteuclid.org

Behavior of the empirical Wasserstein distance in \(\mathbb{R}^d\) under moment conditions

J Dedecker, F Merlevède - Electronic Journal of Probability, 2019 - projecteuclid.org

We establish some deviation inequalities, moment bounds and almost sure results for the

Wasserstein distance of order $ p\in [1,\infty) $ between the empirical measure of

independent and identically distributed ${\mathbb R}^ d $-valued random variables and the …

Cited by 8 Related articles All 18 versions


[PDF] springer.com

Convergence to equilibrium in Wasserstein distance for damped Euler equations with interaction forces

JA Carrillo, YP Choi, O Tse - Communications in Mathematical Physics, 2019 - Springer

We develop tools to construct Lyapunov functionals on the space of probability measures in

order to investigate the convergence to global equilibrium of a damped Euler system under

the influence of external and interaction potential forces with respect to the 2-Wasserstein  …

Cited by 14 Related articles All 12 versions

[PDF] ieee.org

Generating Adversarial Samples With Constrained Wasserstein Distance

K Wang, P Yi, F Zou, Y Wu - IEEE Access, 2019 - ieeexplore.ieee.org

In recent years, deep neural network (DNN) approaches prove to be useful in many machine

learning tasks, including classification. However, small perturbations that are carefully

crafted by attackers can lead to the misclassification of the images. Previous studies have …

  Cited by 1 Related articles


[PDF] arxiv.org

Convergence rate in Wasserstein distance and semiclassical limit for the defocusing logarithmic Schrödinger equation

G Ferriere - arXiv preprint arXiv:1903.04309, 2019 - arxiv.org

We consider the dispersive logarithmic Schrödinger equation in a semi-classical scaling.

We extend the results about the large time behaviour of the solution (dispersion faster than

usual with an additional logarithmic factor, convergence of the rescaled modulus of the …

  Cited by 7 Related articles All 4 versions 

 

[PDF] arxiv.org

Hypothesis Test and Confidence Analysis with Wasserstein Distance on General Dimension

M Imaizumi, H Ota, T Hamaguchi - arXiv preprint arXiv:1910.07773, 2019 - arxiv.org

We develop a general framework for statistical inference with the Wasserstein distance.

Recently, the Wasserstein distance has attracted much attention and been applied to

various machine learning tasks due to its celebrated properties. Despite the importance …

  Related articles All 2 versions 


[PDF] arxiv.org

Optimal Fusion of Elliptic Extended Target Estimates based on the Wasserstein Distance

K Thormann, M Baum - 2019 22nd International Conference on …, 2019 - ieeexplore.ieee.org

This paper considers the fusion of multiple estimates of a spatially extended object, where

the object extent is modeled as an ellipse parameterized by the orientation and semi-axes

lengths. For this purpose, we propose a novel systematic approach that employs a distance …

 Cited by 3 Related articles All 6 versions

[PDF] arxiv.org

Wasserstein Distance Guided Cross-Domain Learning

J Su - arXiv preprint arXiv:1910.07676, 2019 - arxiv.org

Domain adaptation aims to generalise a high-performance learner on target domain (non-

labelled data) by leveraging the knowledge from source domain (rich labelled data) which

comes from a different but related distribution. Assuming the source and target domains data …

  Related articles All 2 versions 

<—2019———— 2019  ———- 870——


[PDF] arxiv.org

Minimax Confidence Intervals for the Sliced Wasserstein Distance

T Manole, S Balakrishnan, L Wasserman - arXiv preprint arXiv:1909.07862, 2019 - arxiv.org

The Wasserstein distance has risen in popularity in the statistics and machine learning

communities as a useful metric for comparing probability distributions. We study the problem

of uncertainty quantification for the Sliced Wasserstein distance--an easily computable …

 Cited by 7 Related articles All 5 versions 


[HTML] nih.gov

Construction of 4D Neonatal Cortical Surface Atlases Using Wasserstein Distance

Z Chen, Z Wu, L Sun, F Wang, L Wang… - 2019 IEEE 16th …, 2019 - ieeexplore.ieee.org

Spatiotemporal (4D) neonatal cortical surface atlases with densely sampled ages are

important tools for understanding the dynamic early brain development. Conventionally,

after non-linear co-registration, surface atlases are constructed by simple Euclidean average …

Cited by 3 Related articles All 5 versions

[PDF] projecteuclid.org

Hybrid Wasserstein distance and fast distribution clustering

I Verdinelli, L Wasserman - Electronic Journal of Statistics, 2019 - projecteuclid.org

We define a modified Wasserstein distance for distribution clustering which inherits many of

the properties of the Wasserstein distance but which can be estimated easily and computed

quickly. The modified distance is the sum of two terms. The first term—which has a closed …

  Related articles All 3 versions

[PDF] arxiv.org

Kernel Wasserstein Distance

JH Oh, M Pouryahya, A Iyer, AP Apte… - arXiv preprint arXiv …, 2019 - arxiv.org

The Wasserstein distance is a powerful metric based on the theory of optimal transport. It

gives a natural measure of the distance between two distributions with a wide range of

applications. In contrast to a number of the common divergences on distributions such as …

  Cited by 3 Related articles All 3 versions 

 

[PDF] arxiv.org

Random Matrix-Improved Estimation of the Wasserstein Distance between two Centered Gaussian Distributions

M Tiomoko, R Couillet - 2019 27th European Signal Processing …, 2019 - ieeexplore.ieee.org

This article proposes a method to consistently estimate functionals $\frac {1}{p}\sum_ {i=

1}^{p} f (\lambda_ {i}(C_ {1} C_ {2})) $ of the eigenvalues of the product of two covariance

matrices $ C_ {1}, C_ {2}\in\mathbb {R}^{p\times p} $ based on the empirical estimates …

 Cited by 2 Related articles All 31 versions

 

[PDF] arxiv.org

The boundary method for semi-discrete optimal transport partitions and Wasserstein distance computation

L Dieci, JD Walsh III - Journal of Computational and Applied Mathematics, 2019 - Elsevier

We introduce a new technique, which we call the boundary method, for solving semi-

discrete optimal transport problems with a wide range of cost functions. The boundary

method reduces the effective dimension of the problem, thus improving complexity. For cost …

  Cited by 10 Related articles All 6 versions

 

Adversarial Learning for Cross-Modal Retrieval with Wasserstein Distance

Q Cheng, Y Zhang, X Gu - International Conference on Neural Information …, 2019 - Springer

This paper presents a novel approach for cross-modal retrieval in an Adversarial Learning

with Wasserstein Distance (ALWD) manner, which aims at learning aligned representation

for various modalities in a GAN framework. The generator projects the image and the text …

  Related articles


[PDF] arxiv.org

Rate of convergence in Wasserstein distance of piecewise-linear L\'evy-driven SDEs

A Arapostathis, G Pang, N Sandrić - arXiv preprint arXiv:1907.05250, 2019 - arxiv.org

In this paper, we study the rate of convergence under the Wasserstein metric of a broad

class of multidimensional piecewise Ornstein-Uhlenbeck processes with jumps. These are

governed by stochastic differential equations having a piecewise linear drift, and a fairly …

  Related articles All 5 versions 


[PDF] nsf.gov

Distributions with Maximum Spread Subject to Wasserstein Distance Constraints

JG Carlsson, Y Wang - Journal of the Operations Research Society of …, 2019 - Springer

Recent research on formulating and solving distributionally robust optimization problems

has seen many different approaches for describing one's ambiguity set, such as constraints

on first and second moments or quantiles. In this paper, we use the Wasserstein distance to …

  Related articles All 2 versions


[PDF] arxiv.org

Approximation of Wasserstein distance with Transshipment

N Papadakis - arXiv preprint arXiv:1901.09400, 2019 - arxiv.org

An algorithm for approximating the p-Wasserstein distance between histograms defined on

unstructured discrete grids is presented. It is based on the computation of a barycenter

constrained to be supported on a low dimensional subspace, which corresponds to a …

  Cited by 2 Related articles All 5 versions 

<—2019———— 2019  ———- 880—


[PDF] arxiv.org

Convergence of some classes of random flights in Wasserstein distance

A Falaleev, V Konakov - arXiv preprint arXiv:1910.03862, 2019 - arxiv.org

In this paper we consider a random walk of a particle in $\mathbb {R}^ d $. Convergence of

different transformations of trajectories of random flights with Poisson switching moments

has been obtained by Davydov and Konakov, as well as diffusion approximation of the …

  Related articles All 2 versions 


[PDF] arxiv.org

Distributionally Robust XVA via Wasserstein Distance Part 2: Wrong Way Funding Risk

D Singh, S Zhang - arXiv preprint arXiv:1910.03993, 2019 - arxiv.org

This paper investigates calculations of robust funding valuation adjustment (FVA) for over

the counter (OTC) derivatives under distributional uncertainty using Wasserstein distance as

the ambiguity measure. Wrong way funding risk can be characterized via the robust FVA …

  Related articles All 5 versions 


[PDF] sns.it

Sensitivity of the Compliance and of the Wasserstein Distance with Respect to a Varying Source

G BouchittéI FragalàI Lucardesi - Applied Mathematics & Optimization, 2019 - Springer

We show that the compliance functional in elasticity is differentiable with respect to

horizontal variations of the load term, when the latter is given by a possibly concentrated

measure; moreover, we provide an integral representation formula for the derivative as a …

  Related articles All 9 versions


[PDF] arxiv.org

Connections between Support Vector Machines, Wasserstein distance and gradient-penalty GANs

A Jolicoeur-Martineau, I Mitliagkas - arXiv preprint arXiv:1910.06922, 2019 - arxiv.org

We generalize the concept of maximum-margin classifiers (MMCs) to arbitrary norms and

non-linear functions. Support Vector Machines (SVMs) are a special case of MMC. We find

that MMCs can be formulated as Integral Probability Metrics (IPMs) or classifiers with some …

  Cited by 5 Related articles All 3 versions 


[PDF] arxiv.org

Distributionally Robust XVA via Wasserstein Distance Part 1: Wrong Way Counterparty Credit Risk

D Singh, S Zhang - arXiv preprint arXiv:1910.01781, 2019 - arxiv.org

This paper investigates calculations of robust CVA for OTC derivatives under distributional

uncertainty using Wasserstein distance as the ambiguity measure. Wrong way counterparty

credit risk can be characterized (and indeed quantified) via the robust CVA formulation. The …

  Cited by 1 Related articles All 5 versions 

[CITATION] Distributionally robust xva via wasserstein distance part 1

D Singh, S Zhang - arXiv preprint arXiv:1910.01781, 2019

  Cited by 2 Related articles

[PDF] arxiv.org

Bounding quantiles of Wasserstein distance between true and empirical measure

SN Cohen, MNA Tegnér, J Wiesel - arXiv preprint arXiv:1907.02006, 2019 - arxiv.org

Consider the empirical measure, $\hat {\mathbb {P}} _N $, associated to $ N $ iid samples of

a given probability distribution $\mathbb {P} $ on the unit interval. For fixed $\mathbb {P} $

the Wasserstein distance between $\hat {\mathbb {P}} _N $ and $\mathbb {P} $ is a random …

  Related articles All 3 versions 


[PDF] arxiv.org

A nonlocal free boundary problem with Wasserstein distance

A Karakhanyan - arXiv preprint arXiv:1904.06270, 2019 - arxiv.org

We study the probability measures $\rho\in\mathcal{M}(\mathbb{R}^2)$ minimizing the functional \[J[\rho]=\iint\log\frac{1}{|x-y|}\,d\rho(x)\,d\rho(y)+d^2(\rho,\rho_0),\] where $\rho_0$ is a given probability measure and $d(\rho,\rho_0)$ is the 2-Wasserstein distance of …

  Related articles All 2 versions 


[HTML] Correction to: Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula

X Fang, QM Shao, L Xu - Probability Theory and Related Fields, 2019 - Springer

Under the above-strengthened Assumption 2.1, all the conclusions and examples in [1] still hold

true, except that all the constants \(C_\theta \) therein will depend on the constants in the new

assumption … Combining the previous three inequalities, we conclude that [1, (7.1)] still holds …

  Cited by 1 Related articles All 3 versions


[CITATION] Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula (vol 174, pg 945, 2019)

X Fang, QM Shao, L Xu - PROBABILITY …, 2019

[PDF] arxiv.org

Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula

X Fang, QM Shao, L Xu - Probability Theory and Related Fields, 2019 - Springer

Stein's method has been widely used for probability approximations. However, in the multi-

dimensional setting, most of the results are for multivariate normal approximation or for test

functions with bounded second-or higher-order derivatives. For a class of multivariate …

 Cited by 29 Related articles All 7 versions


Approximation and Wasserstein distance for self-similar measures on the unit interval

E Lichtenegger, R Niedzialomski - Journal of Mathematical Analysis and …, 2019 - Elsevier

We study the Wasserstein distance between self-similar measures associated to two non-

overlapping linear contractions of the unit interval. The main theorem gives an explicit

formula for the Wasserstein distance between iterations of certain discrete approximations of …

  Related articles All 2 versions

<—2019———— 2019  ———- 890—


BIRCH Algorithm and Wasserstein Distance Metric Based Method for Generating Typical Scenarios of Wind Power Outputs

Q Li, X Tang, C Chen, X Liu, S Liu, X Shi… - … -Asia (ISGT Asia), 2019 - ieeexplore.ieee.org

With the ever-increasing penetration of renewable energy generation such as wind power

and solar photovoltaics, the power system concerned is suffering more extensive and

significant uncertainties. Scenario analysis has been utilized to solve this problem for power …

  Related articles

 

[PDF] arxiv.org

1-Wasserstein Distance on the Standard Simplex

A Frohmader, H Volkmer - arXiv preprint arXiv:1912.04945, 2019 - arxiv.org

Wasserstein distances provide a metric on a space of probability measures. We consider the

space $\Omega $ of all probability measures on the finite set $\chi=\{1,\dots, n\} $ where $ n

$ is a positive integer. 1-Wasserstein distance, $ W_1 (\mu,\nu) $ is a function from …

  Cited by 1 Related articles All 2 versions 
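
A quick numerical check of the 1D picture (assuming the ground metric \(|i-j|\) on \(\{1,\dots,n\}\), which may differ from the metric studied in the paper): the 1-Wasserstein distance between two probability vectors equals the L1 distance between their cumulative sums, and agrees with scipy's weighted routine.

# W1 between two probability vectors on {1,...,n}: closed form vs scipy.
import numpy as np
from scipy.stats import wasserstein_distance

n = 6
support = np.arange(1, n + 1)
mu = np.array([0.1, 0.2, 0.3, 0.2, 0.1, 0.1])
nu = np.array([0.3, 0.1, 0.1, 0.1, 0.2, 0.2])

closed_form = np.sum(np.abs(np.cumsum(mu) - np.cumsum(nu))[:-1])
scipy_value = wasserstein_distance(support, support, u_weights=mu, v_weights=nu)
print(closed_form, scipy_value)   # identical up to floating-point error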


[PDF] bayesiandeeplearning.org

[PDF] Nested-Wasserstein Distance for Sequence Generation

R Zhang, C Chen, Z Gan, Z Wen, W Wang, L Carin - bayesiandeeplearning.org

Reinforcement learning (RL) has been widely studied for improving sequencegeneration

models. However, the conventional rewards used for RL training typically cannot capture

sufficient semantic information and therefore render model bias. Further, the sparse and …

  Related articles 


2019  [PDF] arxiv.org

Group level MEG/EEG source imaging via optimal transport: minimum Wasserstein estimates

H Janati, T Bazeille, B Thirion, M Cuturi… - … Information Processing in …, 2019 - Springer

… We extend next the Wasserstein distance to signed measures. We adopt a similar idea to what … \(\mathbf{a}, \mathbf{b}\in \mathbb{R}^p\), we define the generalized Wasserstein distance as: …

Cited by 7 Related articles All 30 versions


Deconvolution for the Wasserstein distance

J Dedecker - smai.emath.fr

We consider the problem of estimating a probability measure on Rd from data observed with

an additive noise. We are interested in rates of convergence for the Wasserstein metric of

order p≥ 1. The distribution of the errors is assumed to be known and to belong to a class of …

  Related articles 


2019

[PDF] psu.edu

Aggregated Wasserstein distance for hidden Markov models and automated morphological characterization of placenta from photos

Y Chen - 2019 - etda.libraries.psu.edu

In the past decade, fueled by the rapid advances of big data technology and machine

learning algorithms, data science has become a new paradigm of science and has more

and more emerged into its own field. At the intersection of computational methods, data …

[PDF] csroc.org.tw

[PDF] Cross-domain Text Sentiment Classification Based on Wasserstein Distance

G Cai, Q Lin, N Chen - Journal of Computers, 2019 - csroc.org.tw

Text sentiment analysis is mainly to detect the sentiment polarity implicit in text data. Most

existing supervised learning algorithms are difficult to solve the domain adaptation problem

in text sentiment analysis. The key of cross-domain text sentiment analysis is how to extract …

  All 2 versions

 

[PDF] amazonaws.com

[PDF] Supplement to “Optimal Estimation of Wasserstein Distance on A Tree with An Application to Microbiome Studies”

S Wang, TT CaiH Li - pstorage-tf-iopjsd8797887.s3 …

Supplement to “Optimal Estimation of Wasserstein Distance on A Tree with An Application to Microbiome Studies”. Shulei Wang, T. Tony Cai and Hongzhe Li, University of Pennsylvania. In this supplementary material, we provide the proof for the main results (Section S1) and all the …

  All 3 versions 


[PDF] aau.dk

[PDF] Full-Band Music Genres Interpolations with Wasserstein Autoencoders

T Borghuis, A Tibo, S Conforti, L Brusci… - Workshop AI for Media …, 2019 - vbn.aau.dk

We compare different types of autoencoders for generating interpolations between four-

instruments musical patterns in the acid jazz, funk, and soul genres. Preliminary empirical

results suggest the superiority of Wasserstein autoencoders. The process of generation …

  Related articles All 4 versions 


[CITATION] Multisource wasserstein distance based domain adaptation

S Ghosh, S Prakash - 2019 - dspace.iiti.ac.in

… Please use this identifier to cite or link to this item: http://dspace.iiti.ac.in:8080/jspui/handle/

123456789/2064. Title: Multisource wasserstein distance based domain adaptation …

  

 <—2019———— 2019  ———- 900—


2019

[PDF] arxiv.org

Approximation of stable law in Wasserstein-1 distance by Stein's method

L Xu - The Annals of Applied Probability, 2019 - projecteuclid.org

Abstract Let $ n\in\mathbb {N} $, let $\zeta_ {n, 1},\ldots,\zeta_ {n, n} $ be a sequence of

independent random variables with $\mathbb {E}\zeta_ {n, i}= 0$ and $\mathbb {E}|\zeta_ {n,

i}|<\infty $ for each $ i $, and let $\mu $ be an $\alpha $-stable distribution having …

  Cited by 18 Related articles All 5 versions

[CITATION] Approximation of stable law in Wasserstein-1 distance by Stein's method. Accepted by Annals of Applied Probability

L Xu - arXiv preprint arXiv:1709.00805, 2017


2019

[PDF] arxiv.org

Multivariate stable approximation in Wasserstein distance by Stein's method

P Chen, I Nourdin, L Xu, X Yang - arXiv preprint arXiv:1911.12917, 2019 - arxiv.org

We investigate regularity properties of the solution to Stein's equation associated with

multivariate integrable $\alpha $-stable distribution for a general class of spectral measures

and Lipschitz test functions. The obtained estimates induce an upper bound in Wasserstein  …

  Cited by 3 Related articles All 4 versions 


2019

[PDF] arxiv.org

Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula

X Fang, QM Shao, L Xu - Probability Theory and Related Fields, 2019 - Springer

Stein's method has been widely used for probability approximations. However, in the multi-

dimensional setting, most of the results are for multivariate normal approximation or for test

functions with bounded second-or higher-order derivatives. For a class of multivariate …

  Cited by 19 Related articles All 5 versions

[CITATION] Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula (vol 174, pg 945, 2019)

X Fang, QM Shao, L Xu - PROBABILITY …, 2019

 

2019

[HTML] springer.com

[HTML] Correction to: Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula

X Fang, QM Shao, L Xu - Probability Theory and Related Fields, 2019 - Springer

Under the above-strengthened Assumption 2.1, all the conclusions and examples in [1] still hold

true, except that all the constants \(C_\theta \) therein will depend on the constants in the new

assumption … Combining the previous three inequalities, we conclude that [1, (7.1)] still holds …

  Cited by 1 Related articles All 3 versions


Unimodal-uniform constrained wasserstein training for medical diagnosis

X Liu, X Han, Y Qiao, Y Ge, S Li… - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

The labels in medical diagnosis task are usually discrete and successively distributed. For

example, the Diabetic Retinopathy Diagnosis (DR) involves five health risk levels: no DR (0),

mild DR (1), moderate DR (2), severe DR (3) and proliferative DR (4). This labeling system is …

  Cited by 15 Related articles All 7 versions 

2019

[PDF] researchgate.net

Wasserstein metric based distributionally robust approximate framework for unit commitment

R Zhu, H Wei, X Bai - IEEE Transactions on Power Systems, 2019 - ieeexplore.ieee.org

This paper proposed a Wasserstein metric-based distributionally robust approximate

framework (WDRA), for unit commitment problem to manage the risk from uncertain wind

power forecasted errors. The ambiguity set employed in the distributionally robust …

  Cited by 24 Related articles All 3 versions


 January 2019 

Hybrid Wasserstein distance and fast distribution clustering

Isabella Verdinelli, Larry Wasserman

Electronic Journal of Statistics Vol. 13, Issue 2 (Jan 2019), pg(s) 5088-5119

KEYWORDS: clustering, Wasserstein

Cited by 8 Related articles

[PDF] arxiv.org

On the minimax optimality of estimating the wasserstein metric

T Liang - arXiv preprint arXiv:1908.10324, 2019 - arxiv.org

We study the minimax optimal rate for estimating the Wasserstein-$1 $ metric between two

unknown probability measures based on $ n $ iid empirical samples from them. We show

that estimating the Wasserstein metric itself between probability measures, is not …

  Cited by 3 Related articles All 2 versions 


Calculating spatial configurational entropy of a landscape mosaic based on the Wasserstein metric

Y Zhao, X Zhang - Landscape Ecology, 2019 - Springer

Context Entropy is an important concept traditionally associated with thermodynamics and is

widely used to describe the degree of disorder in a substance, system, or process.

Configurational entropy has received more attention because it better reflects the …

  Cited by 4 Related articles All 2 versions


[PDF] arxiv.org

How Well Do WGANs Estimate the Wasserstein Metric?

A Mallasto, G Montúfar, A Gerolin - arXiv preprint arXiv:1910.03875, 2019 - arxiv.org

Generative modelling is often cast as minimizing a similarity measure between a data

distribution and a model distribution. Recently, a popular choice for the similarity measure

has been the Wasserstein metric, which can be expressed in the Kantorovich duality …

  Cited by 4 Related articles All 5 versions 

 <—2019———— 2019  ———- 910—   


Misfit function for full waveform inversion based on the Wasserstein metric with dynamic formulation

P Yong, W Liao, J Huang, Z Li, Y Lin - Journal of Computational Physics, 2019 - Elsevier

Conventional full waveform inversion (FWI) using least square distance (L 2 norm) between

the observed and predicted seismograms suffers from local minima. Recently, the

Wasserstein metric (W 1 metric) has been introduced to FWI to compute the misfit between …

  Cited by 1 Related articles All 2 versions


 

[PDF] researchgate.net

Least-squares reverse time migration via linearized waveform inversion using a Wasserstein metric

P Yong, J Huang, Z Li, W Liao, L Qu - Geophysics, 2019 - library.seg.org

Least-squares reverse time migration (LSRTM), an effective tool for imaging the structures of

the earth from seismograms, can be characterized as a linearized waveform inversion

problem. We have investigated the performance of three minimization functionals as the L 2 …

  Cited by 3 Related articles All 4 versions

[CITATION] Least-squares reverse time migration via linearized waveform inversion using a Wasserstein metric

P Yong, J Huang, Z Li, W Liao, L Qu - Geophysics, 2019

Cited by 10 Related articles All 5 versions

 

[PDF] projecteuclid.org

Convergence of the Population Dynamics algorithm in the Wasserstein metric

M Olvera-Cravioto - Electronic Journal of Probability, 2019 - projecteuclid.org

We study the convergence of the population dynamics algorithm, which produces sample

pools of random variables having a distribution that closely approximates that of the special

endogenous solution to a variety of branching stochastic fixed-point equations, including the …

  Cited by 3 Related articles All 6 versions


PWGAN: wasserstein GANs with perceptual loss for mode collapse

X Wu, C Shi, X Li, J He, X Wu, J Lv, J Zhou - Proceedings of the ACM …, 2019 - dl.acm.org

Generative adversarial network (GAN) plays an important part in image generation. It has

great achievements trained on large scene data sets. However, for small scene data sets,

we find that most of methods may lead to a mode collapse, which may repeatedly generate …

Related articles

Distributionally Robust Learning under the Wasserstein Metric

R Chen - 2019 - search.proquest.com

This dissertation develops a comprehensive statistical learning framework that is robust to

(distributional) perturbations in the data using Distributionally Robust Optimization (DRO)

under the Wasserstein metric. The learning problems that are studied include:(i) …

  Cited by 1 Related articles All 3 versions


2019

 

Use of the Wasserstein Metric to Solve the Inverse Dynamic Seismic Problem

AA Vasilenko - Geomodel 2019, 2019 - earthdoc.org

The inverse dynamic seismic problem consists in recovering the velocity model of elastic

medium based on the observed seismic data. In this work full waveform inversion method is

used to solve this problem. It consists in minimizing an objective functional measuring the …

  Related articles


Prioritized Experience Replay based on the Wasserstein Metric in Deep Reinforcement Learning: The regularizing effect of modelling return distributions

T Greevink - 2019 - repository.tudelft.nl

This thesis tests the hypothesis that distributional deep reinforcement learning (RL)

algorithms get an increased performance over expectation based deep RL because of the

regularizing effect of fitting a more complex model. This hypothesis was tested by comparing …

  

2019

[PDF] arxiv.org

A nonlocal free boundary problem with Wasserstein distance

A Karakhanyan - arXiv preprint arXiv:1904.06270, 2019 - arxiv.org

We study the probability measures $\rho\in\mathcal{M}(\mathbb{R}^2)$ minimizing the

functional $J[\rho]=\iint\log\frac{1}{|x-y|}\,d\rho(x)\,d\rho(y)+d^2(\rho,\rho_0)$, where $\rho_0$

is a given probability measure and $d(\rho,\rho_0)$ is the 2-Wasserstein distance of …

  Related articles All 2 versions 


2019   [PDF] springer.com

Convergence to equilibrium in Wasserstein distance for damped Euler equations with interaction forces

JA Carrillo, YP Choi, O Tse - Communications in Mathematical Physics, 2019 - Springer

We develop tools to construct Lyapunov functionals on the space of probability measures in

order to investigate the convergence to global equilibrium of a damped Euler system under

the influence of external and interaction potential forces with respect to the 2-Wasserstein  …

  Cited by 14 Related articles All 12 versions


2019 [PDF] arxiv.org

Fast convergence of empirical barycenters in Alexandrov spaces and the Wasserstein space

TL Gouic, Q Paris, P Rigollet, AJ Stromme - arXiv preprint arXiv …, 2019 - arxiv.org

This work establishes fast rates of convergence for empirical barycenters over a large class

of geodesic spaces with curvature bounds in the sense of Alexandrov. More specifically, we

show that parametric rates of convergence are achievable under natural conditions that …

  Cited by 9 Related articles All 2 versions 

 <—2019———— 2019  ———-   920—    


2019

Calculating spatial configurational entropy of a landscape mosaic based on the Wasserstein metric

Y Zhao, X Zhang - Landscape Ecology, 2019 - Springer

Context Entropy is an important concept traditionally associated with thermodynamics and is

widely used to describe the degree of disorder in a substance, system, or process.

Configurational entropy has received more attention because it better reflects the …

 Cited by 4 Related articles All 5 versions


2019

[PDF] aaai.org

EWGAN: Entropy-based Wasserstein GAN for imbalanced learning

J Ren, Y Liu, J Liu - Proceedings of the AAAI Conference on Artificial …, 2019 - aaai.org

In this paper, we propose a novel oversampling strategy dubbed Entropy-based

Wasserstein Generative Adversarial Network (EWGAN) to generate data samples for

minority classes in imbalanced learning. First, we construct an entropyweighted label vector …

  Cited by 1 Related articles All 5 versions 


2019 [PDF] arxiv.org

[PDF] Implementation of batched Sinkhorn iterations for entropy-regularized Wasserstein loss

T Viehmann - arXiv preprint arXiv:1907.01729, 2019 - arxiv.org

In this report, we review the calculation of entropy-regularised Wasserstein loss introduced

by Cuturi and document a practical implementation in PyTorch. Subjects: Machine Learning

(stat. ML); Machine Learning (cs. LG) Cite as: arXiv: 1907.01729 [stat. ML](or arXiv …

  Cited by 1 Related articles All 2 versions 
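As background for the entry above: the entropy-regularised loss it reviews is obtained by alternating row and column scalings of a Gibbs kernel. A minimal NumPy sketch (not Viehmann's PyTorch code; the grid, the regularisation strength eps and the iteration count are illustrative assumptions):

    import numpy as np

    def sinkhorn_cost(a, b, C, eps=0.1, n_iters=200):
        """Transport cost of the entropic plan between histograms a and b for cost matrix C."""
        K = np.exp(-C / eps)              # Gibbs kernel
        u = np.ones_like(a)
        for _ in range(n_iters):          # alternating Sinkhorn scaling updates
            v = b / (K.T @ u)
            u = a / (K @ v)
        P = u[:, None] * K * v[None, :]   # approximate optimal transport plan
        return np.sum(P * C)

    # toy example: two Gaussian-like histograms on a 1-D grid
    x = np.linspace(0, 1, 50)
    a = np.exp(-((x - 0.3) ** 2) / 0.01); a /= a.sum()
    b = np.exp(-((x - 0.7) ** 2) / 0.01); b /= b.sum()
    C = (x[:, None] - x[None, :]) ** 2
    print(sinkhorn_cost(a, b, C))

A batched version, as in the report, would simply carry an extra leading dimension through the same updates.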


2019 [PDF] arxiv.org

From the backward Kolmogorov PDE on the Wasserstein space to propagation of chaos for McKean-Vlasov SDEs

N Frikha, PEC de Raynal - arXiv preprint arXiv:1907.01410, 2019 - arxiv.org

In this article, we provide some new quantitative estimates for propagation of chaos of non-

linear stochastic differential equations (SDEs) in the sense of McKean-Vlasov. We obtain

explicit error estimates, at the level of the trajectories, at the level of the semi-group and at …

  Cited by 1 Related articles All 17 versions 

[CITATION] From the Backward Kolmogorov PDE on the Wasserstein space to propagation of chaos for McKean-Vlasov SDEs

PEC de Raynal, N Frikha - arXiv preprint arXiv:1907.01410, 2018

  Cited by 2 Related articles


2019 [PDF] arxiv.org

The Wasserstein-Fourier Distance for Stationary Time Series

E Cazelles, A Robert, F Tobar - arXiv preprint arXiv:1912.05509, 2019 - arxiv.org

We introduce a novel framework for analysing stationary time series based on optimal

transport distances and spectral embeddings. First, we represent time series by their power

spectral density (PSD), which summarises the signal energy spread across the Fourier  …

  Cited by 2 Related articles All 3 versions 




 Tropical Optimal Transport and Wasserstein Distances

W Lee, W Li, B Lin, A Monod - arXiv preprint arXiv:1911.05401, 2019 - arxiv.org

We study the problem of optimal transport in tropical geometry and define the Wasserstein-$

p $ distances for probability measures in the continuous metric measure space setting of the

tropical projective torus. We specify the tropical metric---a combinatorial metric that has been …

  Cited by 1 Related articles All 3 versions 

[PDF] ucla.edu

[PDF] Tropical Optimal Transport and Wasserstein Distances in Phylogenetic Tree Space

W Lee, W Li, B Lin, A Monod - arXiv preprint arXiv:1911.05401, 2019 - math.ucla.edu

We study the problem of optimal transport on phylogenetic tree space from the perspective

of tropical geometry, and thus define the Wasserstein-p distances for probability measures in

this continuous metric measure space setting. With respect to the tropical metric—a …

  Related articles All 2 versions 


2019 [PDF] arxiv.org

Topic modeling with Wasserstein autoencoders

F Nan, R Ding, R Nallapati, B Xiang - arXiv preprint arXiv:1907.12374, 2019 - arxiv.org

… We propose a novel neural topic model in the Wasserstein autoencoders (WAE) framework … The

most popular probabilistic topic model is the Latent Dirichlet Allocation (LDA) (Blei et al., 2003),

where the authors developed a … This work was done when the author was with …

  Cited by 11 Related articles All 5 versions 


 Wasserstein convergence rates for random bit approximations of continuous Markov processes

S Ankirchner, T Kruse, M Urusov - Journal of Mathematical Analysis and …, 2019 - Elsevier

We determine the convergence speed of a numerical scheme for approximating one-

dimensional continuous strong Markov processes. The scheme is based on the construction

of certain Markov chains whose laws can be embedded into the process with a sequence of …

  Cited by 3 Related articles All 4 versions



Primal dual methods for Wasserstein gradient flows

JA Carrillo, K Craig, L Wang, C Wei - arXiv preprint arXiv:1901.08081, 2019 - arxiv.org

Combining the classical theory of optimal transport with modern operator splitting

techniques, we develop a new numerical method for nonlinear, nonlocal partial differential

equations, arising in models of porous media, materials science, and biological swarming …

  Cited by 17 Related articles All 3 versions 


[PDF] mlr.press

Accelerated linear convergence of stochastic momentum methods in wasserstein distances

B Can, M Gurbuzbalaban, L Zhu - … Conference on Machine …, 2019 - proceedings.mlr.press

Momentum methods such as Polyak's heavy ball (HB) method, Nesterov's accelerated

gradient (AG) as well as accelerated projected gradient (APG) method have been commonly

used in machine learning practice, but their performance is quite sensitive to noise in the …

  Cited by 15 Related articles All 8 versions 
<—2019———— 2019  ———-   930—     



[PDF] arxiv.org

Approximation of stable law in Wasserstein-1 distance by Stein's method

L Xu - Annals of Applied Probability, 2019 - projecteuclid.org

Abstract Let $ n\in\mathbb {N} $, let $\zeta_ {n, 1},\ldots,\zeta_ {n, n} $ be a sequence of

independent random variables with $\mathbb {E}\zeta_ {n, i}= 0$ and $\mathbb {E}|\zeta_ {n,

i}|<\infty $ for each $ i $, and let $\mu $ be an $\alpha $-stable distribution having …

  Cited by 19 Related articles All 7 versions


[PDF] arxiv.org

Interior-point methods strike back: Solving the wasserstein barycenter problem

D Ge, H Wang, Z Xiong, Y Ye - arXiv preprint arXiv:1905.12895, 2019 - arxiv.org

Computing the Wasserstein barycenter of a set of probability measures under the optimal

transport metric can quickly become prohibitive for traditional second-order algorithms, such

as interior-point methods, as the support size of the measures increases. In this paper, we …

  Cited by 14 Related articles All 5 versions 
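For context on why second-order solvers become expensive here, the standard discrete formulation (a textbook statement, not taken from this paper) is

\[
\min_{\nu}\ \sum_{i=1}^{m}\lambda_i\,W_p^p(\mu_i,\nu),
\qquad
W_p^p(\mu,\nu)=\min_{P\ge 0}\Big\{\textstyle\sum_{j,k}C_{jk}P_{jk}\ :\ P\mathbf{1}=\mu,\ P^{\mathsf T}\mathbf{1}=\nu\Big\},
\]

so with $m$ measures each supported on $n$ points, the joint linear program has on the order of $mn^2$ variables, which is what drives the cost of interior-point methods as the support size grows.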


[PDF] arxiv.org

Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula

X Fang, QM Shao, L Xu - Probability Theory and Related Fields, 2019 - Springer

Stein's method has been widely used for probability approximations. However, in the multi-

dimensional setting, most of the results are for multivariate normal approximation or for test

functions with bounded second-or higher-order derivatives. For a class of multivariate …

  Cited by 20 Related articles All 7 versions

[CITATION] Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula (vol 174, pg 945, 2019)

X Fang, QM Shao, L Xu - PROBABILITY …, 2019 - … TIERGARTENSTRASSE 17, D …

[HTML] springer.com

[HTML] Correction to: Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula

X Fang, QM Shao, L Xu - Probability Theory and Related Fields, 2019 - Springer

Under the above-strengthened Assumption 2.1, all the conclusions and examples in [1] still hold

true, except that all the constants \(C_\theta \) therein will depend on the constants in the new

assumption … Combining the previous three inequalities, we conclude that [1, (7.1)] still holds …

  Cited by 1 Related articles All 2 versions



Wasserstein distance-based analogous method to predict distribution of non-uniform corrosion on reinforcements in concrete

Q Liu, RKL Su - Construction and Building Materials, 2019 - Elsevier

This paper presents an analogous method to predict the distribution of non-uniform

corrosion on reinforcements in concrete by minimizing the Wasserstein distance. A

comparison between the predicted and experimental results shows that the proposed …

  Cited by 5 Related articles All 3 versions


A virtual monochromatic imaging method for spectral CT based on Wasserstein generative adversarial network with a hybrid loss

Z Shi, J Li, H Li, Q Hu, Q Cao - IEEE Access, 2019 - ieeexplore.ieee.org

Spectral computed tomography (CT) has become a popular clinical diagnostic technique

because of its unique advantage in material distinction. Specifically, it can perform virtual

monochromatic imaging to obtain accurate tissue composition with less beam hardening …

  Cited by 8 Related articles All 2 versions


2019

[PDF] arxiv.org

Multivariate stable approximation in Wasserstein distance by Stein's method

P Chen, I Nourdin, L Xu, X Yang - arXiv preprint arXiv:1911.12917, 2019 - arxiv.org

We investigate regularity properties of the solution to Stein's equation associated with

multivariate integrable $\alpha $-stable distribution for a general class of spectral measures

and Lipschitz test functions. The obtained estimates induce an upper bound in Wasserstein  …

  Cited by 3 Related articles All 4 versions 


[PDF] arxiv.org

The boundary method for semi-discrete optimal transport partitions and Wasserstein distance computation

L Dieci, JD Walsh III - Journal of Computational and Applied Mathematics, 2019 - Elsevier

We introduce a new technique, which we call the boundary method, for solving semi-

discrete optimal transport problems with a wide range of cost functions. The boundary

method reduces the effective dimension of the problem, thus improving complexity. For cost …

  Cited by 6 Related articles All 5 versions


Data augmentation method of sar image dataset based on wasserstein generative adversarial networks

Q Lu, H Jiang, G Li, W Ye - 2019 International conference on …, 2019 - ieeexplore.ieee.org

The published Synthetic Aperture Radar (SAR) samples are not abundant enough, which is

not conducive to the application of deep learning methods in the field of SAR automatic

target recognition. Generative Adversarial Nets (GANs) is one of the most effective ways to …

  Cited by 1 Related articles All 2 versions


An Integrated Processing Method Based on Wasserstein Barycenter Algorithm for Automatic Music Transcription

C Jin, Z Li, Y Sun, H Zhang, X Lv, J Li, S Liu - International Conference on …, 2019 - Springer

Given a piece of acoustic musical signal, various automatic music transcription (AMT)

processing methods have been proposed to generate the corresponding music notations

without human intervention. However, the existing AMT methods based on signal …

  Related articles


BIRCH Algorithm and Wasserstein Distance Metric Based Method for Generating Typical Scenarios of Wind Power Outputs

Q Li, X Tang, C Chen, X Liu, S Liu, X Shi… - … -Asia (ISGT Asia), 2019 - ieeexplore.ieee.org

With the ever-increasing penetration of renewable energy generation such as wind power

and solar photovoltaics, the power system concerned is suffering more extensive and

significant uncertainties. Scenario analysis has been utilized to solve this problem for power …

  Related articles

<—2019———— 2019  ———-   940— 


2019

[PDF] projecteuclid.org

Hybrid Wasserstein distance and fast distribution clustering

I Verdinelli, L Wasserman - Electronic Journal of Statistics, 2019 - projecteuclid.org

We define a modified Wasserstein distance for distribution clustering which inherits many of

the properties of the Wasserstein distance but which can be estimated easily and computed

quickly. The modified distance is the sum of two terms. The first term—which has a closed …

  Cited by 1 Related articles All 5 versions


2019

[PDF] arxiv.org

The optimal convergence rate of monotone schemes for conservation laws in the Wasserstein distance

AM Ruf, E Sande, S Solem - Journal of Scientific Computing, 2019 - Springer

Abstract In 1994, Nessyahu, Tadmor and Tassa studied convergence rates of monotone

finite volume approximations of conservation laws. For compactly supported, $\mathrm{Lip}^+$-

bounded initial data they showed a first-order convergence rate in the Wasserstein distance …

  Cited by 8 Related articles All 6 versions


2019

[PDF] arxiv.org

Convergence rate in Wasserstein distance and semiclassical limit for the defocusing logarithmic Schrödinger equation

G Ferriere - arXiv preprint arXiv:1903.04309, 2019 - arxiv.org

We consider the dispersive logarithmic Schrödinger equation in a semi-classical scaling.

We extend the results about the large time behaviour of the solution (dispersion faster than

usual with an additional logarithmic factor, convergence of the rescaled modulus of the …

  Cited by 6 Related articles All 4 versions 


2019

[PDF] researchgate.net

[PDF] Rate of convergence in Wasserstein distance of piecewise-linear Lévy-driven SDEs

ARI ARAPOSTATHIS, G PANG… - arXiv preprint arXiv …, 2019 - researchgate.net

In this paper, we study the rate of convergence under the Wasserstein metric of a broad

class of multidimensional piecewise Ornstein–Uhlenbeck processes with jumps. These are

governed by stochastic differential equations having a piecewise linear drift, and a fairly …

  Related articles 

 


 2019

Reproducing-Kernel Hilbert space regression with notes on the Wasserstein Distance

S Page - 2019 - eprints.lancs.ac.uk

We study kernel least-squares estimators for the regression problem subject to a norm

constraint. We bound the squared L2 error of our estimators with respect to the covariate

distribution. We also bound the worst-case squared L2 error of our estimators with respect to …

  Related articles All 5 versions 




2019

A Wasserstein distance-based analogous method to predict distribution of non-uniform corrosion on reinforcements in concrete

Q Liu, RKL Su - Construction and Building Materials, 2019 - Elsevier

This paper presents an analogous method to predict the distribution of non-uniform

corrosion on reinforcements in concrete by minimizing the Wasserstein distance. A

comparison between the predicted and experimental results shows that the proposed …

  Cited by 5 Related articles All 3 versions


Multi-source medical image fusion based on Wasserstein generative adversarial networks

Z Yang, Y Chen, Z Le, F Fan, E Pan - IEEE Access, 2019 - ieeexplore.ieee.org

In this paper, we propose the medical Wasserstein generative adversarial networks

(MWGAN), an end-to-end model, for fusing magnetic resonance imaging (MRI) and positron

emission tomography (PET) medical images. Our method establishes two adversarial …

  Cited by 5 Related articles


A virtual monochromatic imaging method for spectral CT based on Wasserstein generative adversarial network with a hybrid loss

Z Shi, J Li, H Li, Q Hu, Q Cao - IEEE Access, 2019 - ieeexplore.ieee.org

Spectral computed tomography (CT) has become a popular clinical diagnostic technique

because of its unique advantage in material distinction. Specifically, it can perform virtual

monochromatic imaging to obtain accurate tissue composition with less beam hardening …

  Cited by 8 Related articles All 2 versions


[PDF] ieee.org

Prostate MR image segmentation with self-attention adversarial training based on wasserstein distance

C Su, R Huang, C Liu, T Yin, B Du - IEEE Access, 2019 - ieeexplore.ieee.org

Prostate diseases are very common in men. Accurate segmentation of the prostate plays a

significant role in further clinical treatment and diagnosis. There have been some methods

that combine the segmentation network and generative adversarial network, using the …

  Cited by 3 Related articles


[PDF] wustl.edu

Grid-less DOA estimation using sparse linear arrays based on Wasserstein distance

M Wang, Z Zhang, A Nehorai - IEEE Signal Processing Letters, 2019 - ieeexplore.ieee.org

Sparse linear arrays, such as nested and co-prime arrays, are capable of resolving O (M2)

sources using only O (M) sensors by exploiting their so-called difference coarray model. One

popular approach to exploit the difference coarray model is to construct an augmented …

  Cited by 3 Related articles All 3 versions

<—2019———— 2019  ———-   950—    



Gait recognition based on Wasserstein generating adversarial image inpainting network

L Xia, H Wang, W Guo - Journal of Central South University, 2019 - Springer

Aiming at the problem of small area human occlusion in gait recognition, a method based on

generating adversarial image inpainting network was proposed which can generate a

context consistent image for gait occlusion area. In order to reduce the effect of noise on …

  Cited by 1 Related articles


[PDF] uni-bielefeld.de

[PDF] Diffusions and PDEs on Wasserstein space

FY Wang - arXiv preprint arXiv:1903.02148, 2019 - sfb1283.uni-bielefeld.de

We propose a new type of SDE, whose coefficients depend on the image of solutions, to investigate

the diffusion process on the Wasserstein space $\mathbb{P}_2$ over $\mathbb{R}^d$, generated by the following

time-dependent differential operator for $f\in C^2$ … $\mathbb{R}^d\times\mathbb{R}^d$, $\sigma(t,x,\mu)\sigma(t,y,\mu)\,D^2f(\mu)(x$ …

  Cited by 2 Related articles 


Evasion attacks based on wasserstein generative adversarial network

J Zhang, Q Yan, M Wang - 2019 Computing, Communications …, 2019 - ieeexplore.ieee.org

Security issues have been accompanied by the development of the artificial intelligence

industry. Machine learning has been widely used for fraud detection, spam detection, and

malicious file detection, since it has the ability to dig the value of big data. However, for …

  Cited by 1 Related articles


Data augmentation method of sar image dataset based on wasserstein generative adversarial networks

Q Lu, H Jiang, G Li, W Ye - 2019 International conference on …, 2019 - ieeexplore.ieee.org

The published Synthetic Aperture Radar (SAR) samples are not abundant enough, which is

not conducive to the application of deep learning methods in the field of SAR automatic

target recognition. Generative Adversarial Nets (GANs) is one of the most effective ways to …

  Cited by 1 Related articles All 2 versions


[PDF] Bayesian model comparison based on Wasserstein distances

M Catalano, A Lijoi, I Pruenster - SIS 2019 Smart Statistics for …, 2019 - iris.unibocconi.it

Demography in the Digital Era: New Data Sources for Population Research …

Diego Alburez-Gutierrez, Samin Aref, Sofia Gil-Clavel, André Grow, Daniela V. Negraia, Emilio …

  Cited by 1 Related articles 





 [PDF] rit.edu

A Comparative Assessment of the Impact of Various Norms on Wasserstein Generative Adversarial Networks

C Ramesh - 2019 - scholarworks.rit.edu

Abstract Generative Adversarial Networks (GANs) provide a fascinating new paradigm in

machine learning and artificial intelligence, especially in the context of unsupervised

learning. GANs are quickly becoming a state of the art tool, used in various applications …

  Related articles All 2 versions 


Weibo Authorship Identification based on Wasserstein generative adversarial networks

W Tang, C Wu, X Chen, Y Sun… - 2019 IEEE International …, 2019 - ieeexplore.ieee.org

During the past years, authorship identification has played a significant role in the public

security area. Recently, deep learning based approaches have been used in authorship

identification. However, all approaches based on deep learning require a large amount of …

  Related articles

 

 [PDF] csroc.org.tw

[PDF] Cross-domain Text Sentiment Classification Based on Wasserstein Distance

G Cai, Q Lin, N Chen - Journal of Computers, 2019 - csroc.org.tw

Text sentiment analysis is mainly to detect the sentiment polarity implicit in text data. Most

existing supervised learning algorithms are difficult to solve the domain adaptation problem

in text sentiment analysis. The key of cross-domain text sentiment analysis is how to extract …

  Related articles All 2 versions 


Frame-level speech enhancement based on Wasserstein GAN

P Chuan, T Lan, M Li, S Li, Q Liu - … International Conference on …, 2019 - spiedigitallibrary.org

Speech enhancement is a challenging and critical task in the speech processing research

area. In this paper, we propose a novel speech enhancement model based on Wasserstein

generative adversarial networks, called WSEM. The proposed model operates on frame …

  Related articles All 2 versions

<—2019———— 2019  ———-   960—  


[PDF] arxiv.org

Statistical aspects of Wasserstein distances

VM Panaretos, Y Zemel - Annual review of statistics and its …, 2019 - annualreviews.org

Wasserstein distances are metrics on probability distributions inspired by the problem of

optimal mass transportation. Roughly speaking, they measure the minimal effort required to

reconfigure the probability mass of one distribution in order to recover the other distribution …

  Cited by 81 Related articles All 10 versions


[PDF] mlr.press

Wasserstein of Wasserstein loss for learning generative models

Y Dukler, W Li, A Lin… - … Conference on Machine …, 2019 - proceedings.mlr.press

The Wasserstein distance serves as a loss function for unsupervised learning which

depends on the choice of a ground metric on sample space. We propose to use the

Wasserstein distance itself as the ground metric on the sample space of images. This …

  Cited by 11 Related articles All 11 versions 


[PDF] mlr.press

Orthogonal estimation of wasserstein distances

M Rowland, J Hron, Y Tang… - The 22nd …, 2019 - proceedings.mlr.press

Wasserstein distances are increasingly used in a wide variety of applications in machine

learning. Sliced Wasserstein distances form an important subclass which may be estimated

efficiently through one-dimensional sorting operations. In this paper, we propose a new …

  Cited by 9 Related articles All 9 versions 
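The one-dimensional sorting mentioned in this abstract works because on the real line the optimal coupling is the monotone one. A minimal Monte-Carlo sketch of a sliced 2-Wasserstein estimate between equal-sized point clouds (the number of projections and the sample data are arbitrary; this is a generic illustration, not the estimator proposed in the paper):

    import numpy as np

    def sliced_wasserstein2(X, Y, n_proj=100, rng=np.random.default_rng(0)):
        """Sliced 2-Wasserstein distance between point clouds X, Y of shape (n, d)."""
        d = X.shape[1]
        total = 0.0
        for _ in range(n_proj):
            theta = rng.normal(size=d)
            theta /= np.linalg.norm(theta)   # random direction on the unit sphere
            x_proj = np.sort(X @ theta)      # sorting the 1-D projections gives
            y_proj = np.sort(Y @ theta)      # the optimal monotone coupling
            total += np.mean((x_proj - y_proj) ** 2)
        return np.sqrt(total / n_proj)

    X = np.random.default_rng(1).normal(size=(500, 3))
    Y = np.random.default_rng(2).normal(loc=1.0, size=(500, 3))
    print(sliced_wasserstein2(X, Y))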


[PDF] arxiv.org

Thermodynamic interpretation of Wasserstein distance

A Dechant, Y Sakurai - arXiv preprint arXiv:1912.08405, 2019 - arxiv.org

We derive a relation between the dissipation in a stochastic dynamics and the Wasserstein

distance. We show that the minimal amount of dissipation required to transform an initial

state to a final state during a diffusion process is given by the Wasserstein distance between …

  Cited by 5 Related articles All 2 versions 


[PDF] arxiv.org

Estimation of Wasserstein distances in the spiked transport model

J Niles-Weed, P Rigollet - arXiv preprint arXiv:1909.07513, 2019 - arxiv.org

We propose a new statistical model, the spiked transport model, which formalizes the

assumption that two probability distributions differ only on a low-dimensional subspace. We

study the minimax rate of estimation for the Wasserstein distance under this model and show …

  Cited by 13 Related articles All 2 versions 


2019

[PDF] arxiv.org

Tree-sliced variants of wasserstein distances

T Le, M Yamada, K Fukumizu, M Cuturi - arXiv preprint arXiv:1902.00342, 2019 - arxiv.org

Optimal transport (OT) theory defines a powerful set of tools to compare probability

distributions. OT suffers however from a few drawbacks, computational and statistical,

which have encouraged the proposal of several regularized variants of OT in the recent …

  Cited by 16 Related articles All 5 versions 

[CITATION] Supplementary Material for: Tree-Sliced Variants of Wasserstein Distances

T Le, M Yamada, K Fukumizu, M Cuturi


[PDF] arxiv.org

Strong equivalence between metrics of Wasserstein type

E Bayraktar, G Guo - arXiv preprint arXiv:1912.08247, 2019 - arxiv.org

The sliced Wasserstein and more recently max-sliced Wasserstein metrics $W_p$ have

attracted abundant attention in data sciences and machine learning due to its advantages to

tackle the curse of dimensionality. A question of particular importance is the strong …

  Cited by 2 Related articles All 2 versions 


[PDF] arxiv.org

Data-driven regularization of Wasserstein barycenters with an application to multivariate density registration

J Bigot, E Cazelles, N Papadakis - Information and Inference: A …, 2019 - academic.oup.com

We present a framework to simultaneously align and smoothen data in the form of multiple

point clouds sampled from unknown densities with support in a-dimensional Euclidean

space. This work is motivated by applications in bioinformatics where researchers aim to …

  Cited by 7 Related articles All 8 versions

[PDF] esaim-cocv.org

Dynamic models of Wasserstein-1-type unbalanced transport

B Schmitzer, B Wirth - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

We consider a class of convex optimization problems modelling temporal mass transport

and mass change between two given mass distributions (the so-called dynamic formulation

of unbalanced transport), where we focus on those models for which transport costs are …

  Cited by 6 Related articles All 5 versions


[PDF] researchgate.net

[PDF] Tree-sliced approximation of wasserstein distances

T Le, M Yamada, K Fukumizu… - arXiv preprint arXiv …, 2019 - researchgate.net

Optimal transport (OT) theory provides a useful set of tools to compare probability

distributions. As a consequence, the field of OT is gaining traction and interest within the

machine learning community. A few deficiencies usually associated with OT include its high …

  Cited by 2 Related articles 
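What makes tree(-sliced) variants cheap is the closed form of the 1-Wasserstein distance on a tree: the sum over edges of the edge length times the absolute difference of the two measures' masses in the subtree below that edge. A small self-contained sketch of that standard formula (the toy tree and the two histograms are made up purely for illustration):

    def tree_wasserstein(parent, length, mu, nu):
        """W_1 between histograms mu, nu on the nodes of a rooted tree.

        parent[i] is the parent of node i (root has parent -1),
        length[i] is the length of the edge from i to its parent.
        Nodes are assumed numbered so that children come after parents.
        """
        n = len(parent)
        sub = [mu[i] - nu[i] for i in range(n)]   # subtree mass differences
        for i in reversed(range(n)):              # accumulate from leaves to root
            if parent[i] >= 0:
                sub[parent[i]] += sub[i]
        return sum(length[i] * abs(sub[i]) for i in range(n) if parent[i] >= 0)

    # toy tree: node 0 is the root, nodes 1 and 2 hang off it, node 3 hangs off 1
    parent = [-1, 0, 0, 1]
    length = [0.0, 1.0, 2.0, 0.5]
    mu = [0.1, 0.2, 0.3, 0.4]
    nu = [0.4, 0.3, 0.2, 0.1]
    print(tree_wasserstein(parent, length, mu, nu))   # 0.55 for this toy example

The whole computation is a single pass over the tree, which is why averaging over several random trees remains fast.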

<—2019———— 2019  ———-   970—  



Optimal XL-insurance under Wasserstein-type ambiguity

C Birghila, GC Pflug - Insurance: Mathematics and Economics, 2019 - Elsevier

We study the problem of optimal insurance contract design for risk management under a budget constraint. The contract holder takes into consideration that the loss distribution is not entirely known and therefore faces an ambiguity problem. For a given set of models, we formulate a minimax optimization problem of finding an optimal insurance contract that minimizes the distortion risk functional of the retained loss with premium limitation. We demonstrate that under the average value-at-risk measure, the entrance-excess of loss …

Cited by 1 Related articles All 6 versions 

Investigators from University of Vienna Report New Data on Insurance Economics (Optimal Xl-insurance Under Wasserstein...

Insurance Business Weekly, 10/2019

Newsletter. Citation Online

 

MR3955014  Laschos, Vaios; Obermayer, Klaus; Shen, Yun; Stannat, Wilhelm A Fenchel-Moreau-Rockafellar type theorem on the Kantorovich-Wasserstein space with applications in partially observable Markov decision processes. J. Math. Anal. Appl. 477 (2019), no. 2, 1133–1156. (Reviewer: Onésimo Hernández Lerma) 49N15 (90C40)
[PDF] researchgate.net

A Fenchel-Moreau-Rockafellar type theorem on the Kantorovich-Wasserstein space with applications in partially observable Markov decision processes

V Laschos, K Obermayer, Y Shen, W Stannat - Journal of Mathematical …, 2019 - Elsevier

By using the fact that the space of all probability measures with finite support can be completed in two different fashions, one generating the Arens-Eells space and another generating the Kantorovich-Wasserstein (Wasserstein-1) space, and by exploiting the …

Related articles

New Mathematics Study Findings Recently Were Reported by Researchers at Technical University Berlin (A Fenchel-Moreau-Rockafellar type theorem on the Kantorovich-Wasserstein... 

Journal of Engineering, 09/2019

Newsletter 


[PDF] arxiv.org

Fast Tree Variants of Gromov-Wasserstein

T Le, N Ho, M Yamada - arXiv preprint arXiv:1910.04462, 2019 - arxiv.org

Flow-based Alignment Approaches for Probability Measures in Different

Spaces Tam Le RIKEN AIP, Japan tam.le@riken.jp Nhat Ho University of California,

Berkeley minhnhat@berkeley.edu Makoto Yamada Kyoto …

  Cited by 2 Related articles 


[PDF] mlr.press

Accelerated linear convergence of stochastic momentum methods in wasserstein distances

B Can, M Gurbuzbalaban, L Zhu - … Conference on Machine …, 2019 - proceedings.mlr.press

Momentum methods such as Polyak's heavy ball (HB) method, Nesterov's accelerated gradient (AG) as well as accelerated projected gradient (APG) method have been commonly used in machine learning practice, but their performance is quite sensitive to noise in the …

 Cited by 25 Related articles All 8 versions 


[PDF] arxiv.org

On isometric embeddings of Wasserstein spaces–the discrete case

GP Gehér, T Titkos, D Virosztek - Journal of Mathematical Analysis and …, 2019 - Elsevier

The aim of this short paper is to offer a complete characterization of all (not necessarily

surjective) isometric embeddings of the Wasserstein space W p (X), where X is a countable

discrete metric space and 0< p<∞ is any parameter value. Roughly speaking, we will prove …

  Cited by 1 Related articles All 8 versions



2019


[PDF] arxiv.org

Approximation of Wasserstein distance with Transshipment

N Papadakis - arXiv preprint arXiv:1901.09400, 2019 - arxiv.org

An algorithm for approximating the p-Wasserstein distance between histograms defined on

unstructured discrete grids is presented. It is based on the computation of a barycenter

constrained to be supported on a low dimensional subspace, which corresponds to a …

  Cited by 2 Related articles All 5 versions 


[PDF] arxiv.org

Wasserstein F-tests and Confidence Bands for the Fréchet Regression of Density Response Curves

A Petersen, X Liu, AA Divani - arXiv preprint arXiv:1910.13418, 2019 - arxiv.org

… (2.1) $d_W^2(f,g)=\int_{\mathbb{R}}\big(M^{\mathrm{opt}}_{f,g}(u)-u\big)^2 f(u)\,du=\int_0^1\big(F^{-1}(t)-G^{-1}(t)\big)^2\,dt$, where the last

equality follows by the change of variables t = F(u). A more proper term for this metric is the

Wasserstein-2 distance, since it is just one among an entire class of Wasserstein metrics …

  Cited by 3 Related articles All 2 versions 
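The quantile identity quoted in the snippet is also what makes the one-dimensional distance trivial to estimate from equal-sized samples: sorting realises the empirical quantile coupling. A tiny illustration (the distributions and sample size are chosen arbitrarily):

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(loc=0.0, size=1000)
    y = rng.normal(loc=1.0, size=1000)

    # empirical 2-Wasserstein distance on the line: couple sorted samples
    w2 = np.sqrt(np.mean((np.sort(x) - np.sort(y)) ** 2))
    print(w2)   # close to 1.0, the exact distance between N(0,1) and N(1,1)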

 

[PDF] projecteuclid.org

Behavior of the empirical Wasserstein distance in $\mathbb{R}^d$ under moment conditions

J Dedecker, F Merlevède - Electronic Journal of Probability, 2019 - projecteuclid.org

… In the same context, Bach and Weed [2] obtain sharper results by generalizing some ideas

going back to Dudley ([14], case p = 1). They introduce the notion of Wasserstein dimension

$d_p(\mu)$ of the measure $\mu$, and prove that $n^{p/s}\,\mathbb{E}(W_p^p(\mu_n,\mu))$ …

  Cited by 6 Related articles All 12 versions

[PDF] arxiv.org

Bounding quantiles of Wasserstein distance between true and empirical measure

SN CohenMNA Tegnér, J Wiesel - arXiv preprint arXiv:1907.02006, 2019 - arxiv.org

Consider the empirical measure, $\hat {\mathbb {P}} _N $, associated to $ N $ iid samples of

a given probability distribution $\mathbb {P} $ on the unit interval. For fixed $\mathbb {P} $

the Wasserstein distance between $\hat {\mathbb {P}} _N $ and $\mathbb {P} $ is a random …

  Related articles All 4 versions 


[PDF] researchgate.net

[PDF] Computation of Wasserstein barycenters via the Iterated Swapping Algorithm

G Puccetti, L Rüschendorf, S Vanduffel - 2019 - researchgate.net

In recent years, the Wasserstein barycenter has become an important notion in the analysis

of high dimensional data with a broad range of applications in applied probability,

economics, statistics and in particular to clustering and image processing. In our paper we …

  Related articles 

<—2019———— 2019  ———-   980— 



[PDF] arxiv.org

Tractable Reformulations of Distributionally Robust Two-stage Stochastic Programs with Wasserstein Distance

W Xie - arXiv preprint arXiv:1908.08454, 2019 - arxiv.org

… distance as τ ∞. Different types of Wasserstein ambiguity set might provide different

tractable results … (2018a), it still exhibits attractive convergent properties. The discussions

on advantages of Wasserstein ambiguity sets can be found in …

  Cited by 1 Related articles All 2 versions 


Data augmentation method of sar image dataset based on wasserstein generative adversarial networks

Q Lu, H Jiang, G Li, W Ye - 2019 International conference on …, 2019 - ieeexplore.ieee.org

… The contributions of this work are as follow: • We proposed the application of Wasserstein GAN

training to SAR images generation … [6] Gulrajani I, Ahmed F, Arjovsky M, et al. Improved training

of Wasserstein gans[C]. Advances in neural information processing systems …

  Cited by 1 Related articles All 2 versions

[PDF] amazonaws.com

[PDF] Supplement to “Optimal Estimation of Wasserstein Distance on A Tree with An Application to Microbiome Studies”

S Wang, TT Cai, H Li - pstorage-tf-iopjsd8797887.s3 …

Supplement to “Optimal Estimation of Wasserstein Distance on A Tree with An Application

to Microbiome Studies” Shulei Wang, T. Tony Cai and Hongzhe Li University of Pennsylvania In

this supplementary material, we provide the proof for the main results (Section S1) and all the …

  Related articles All 3 versions 


[PDF] psu.edu

Aggregated Wasserstein distance for hidden Markov models and automated morphological characterization of placenta from photos

Y Chen - 2019 - etda.libraries.psu.edu

The Pennsylvania State University The Graduate School AGGREGATED WASSERSTEIN

DISTANCE FOR HIDDEN MARKOV MODELS AND AUTOMATED MORPHOLOGICAL

CHARACTERIZATION OF PLACENTA FROM PHOTOS A Dissertation in …

  Related articles 


基于特征解耦的深度学习图像再渲染方法

甘志雄 - 2019 - cdmd.cnki.com.cn

[Translated from Chinese] … there is room to improve the convergence speed. To address these two problems, this thesis introduces continuous-valued feature labels into TD_GAN so that

the model learns continuous features, and the image-generation part adopts a Wasserstein-distance-based generative adversarial network, WGAN-

GP (Wasserstein Generative Adversarial Networks), using the Wasserstein …

All 2 versions

[Chinese  Deep learning image re-rendering method based on feature decoupling]


2019

A Wasserstein Inequality and Minimal Green Energy on Compact Manifolds

S Steinerberger - arXiv preprint arXiv:1907.09023, 2019 - arxiv.org

Let $ M $ be a smooth, compact $ d-$ dimensional manifold, $ d\geq 3, $ without boundary

and let $ G: M\times M\rightarrow\mathbb {R}\cup\left\{\infty\right\} $ denote the Green's

function of the Laplacian $-\Delta $(normalized to have mean value 0). We prove a bound …

  Cited by 2 Related articles All 2 versions 


 2019

Sampling of probability measures in the convex order by Wasserstein projection

J Corbetta, B Jourdain - 2019 - ideas.repec.org

In this paper, for $\mu $ and $\nu $ two probability measures on $\mathbb {R}^ d $ with finite

moments of order $\rho\ge 1$, we define the respective projections for the $ W_\rho $-

Wasserstein distance of $\mu $ and $\nu $ on the sets of probability measures dominated by …

  


2019

[CITATION] Multivariate Stein Factors from Wasserstein Decay

MA Erdogdu, L Mackey, O Shamir - 2019 - preparation

  Cited by 2 Related articles


 

  2019

Projection in the 2-Wasserstein sense on structured measure space

L Lebrat - 2019 - tel.archives-ouvertes.fr

This thesis focuses on the approximation for the 2-Wasserstein metric of probability

measures by structured measures. The set of structured measures under consideration is

made of consistent discretizations of measures carried by a smooth curve with a bounded …

  All 2 versions 


2019

Projection au sens de Wasserstein 2 sur des espaces structurés de mesures

L Lebrat - 2019 - theses.fr

[Translated from French] Abstract: This thesis is concerned with the approximation, for the 2-Wasserstein metric, of

probability measures by a structured measure. The structured measures under study are

consistent discretizations of measures carried by continuous curves with bounded speed and …

<—2019———— 2019  ———-   990—


2019

[PDF] arxiv.org

Geometric convergence bounds for Markov chains in Wasserstein distance based on generalized drift and contraction conditions

Q Qin, JP Hobert - arXiv preprint arXiv:1902.02964, 2019 - arxiv.org

Let $\{X_n\} _ {n= 0}^\infty $ denote an ergodic Markov chain on a general state space that

has stationary distribution $\pi $. This article concerns upper bounds on the $ L_1 $-

Wasserstein distance between the distribution of $ X_n $ and $\pi $. In particular, an explicit …

  Cited by 9 Related articles All 2 versions 


[PDF] arxiv.org

Fréchet means and Procrustes analysis in Wasserstein space

Y Zemel, VM Panaretos - Bernoulli, 2019 - projecteuclid.org

We consider two statistical problems at the intersection of functional and non-Euclidean data

analysis: the determination of a Fréchet mean in the Wasserstein space of multivariate

distributions; and the optimal registration of deformed random measures and point …

  Cited by 53 Related articles All 8 versions


2019

Sampling of probability measures in the convex order by Wasserstein projection

J Corbetta, B Jourdain - 2019 - ideas.repec.org

In this paper, for $\mu $ and $\nu $ two probability measures on $\mathbb {R}^ d $ with finite

moments of order $\rho\ge 1$, we define the respective projections for the $ W_\rho $-

Wasserstein distance of $\mu $ and $\nu $ on the sets of probability measures dominated by …


[PDF] arxiv.org

Multi-marginal wasserstein gan

J Cao, L Mo, Y Zhang, K Jia, C Shen, M Tan - arXiv preprint arXiv …, 2019 - arxiv.org

Multiple marginal matching problem aims at learning mappings to match a source domain to

multiple target domains and it has attracted great attention in many applications, such as

multi-domain image translation. However, addressing this problem has two critical …

  Cited by 31 Related articles All 5 versions 

[CITATION] Supplementary Materials: Multi-marginal Wasserstein GAN

J Cao, L Mo, Y Zhang, K Jia, C Shen, M Tan

  Related articles


[PDF] arxiv.org

Wasserstein robust reinforcement learning

…, HB Ammar, V Milenkovic, R Luo, M Zhang… - arXiv preprint arXiv …, 2019 - arxiv.org

Reinforcement learning algorithms, though successful, tend to over-fit to training

environments hampering their application to the real-world. This paper proposes

$\text{W}\text{R}^{2}\text{L}$ -- a robust reinforcement learning algorithm with significant robust …

 Cited by 33 Related articles All 7 versions 


2019

Artifact correction in low‐dose dental CT imaging using Wasserstein generative adversarial networks

Z Hu, C Jiang, F Sun, Q Zhang, Y Ge, Y Yang… - Medical …, 2019 - Wiley Online Library

Purpose In recent years, health risks concerning high‐dose x‐ray radiation have become a

major concern in dental computed tomography (CT) examinations. Therefore, adopting low‐

dose computed tomography (LDCT) technology has become a major focus in the CT …


2019  see 2020  Research article
Denoising of 3D magnetic resonance images using a residual encoder–decoder Wasserstein generative adversarial network
Medical Image Analysis, 5 May 2019...

Maosong Ran, Jinrong Hu, Yi Zhang
Cited by 111
 Related articles All 6 versions

[PDF] thecvf.com

Unimodal-uniform constrained wasserstein training for medical diagnosis

X Liu, X Han, Y Qiao, Y Ge, S Li… - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

The labels in medical diagnosis task are usually discrete and successively distributed. For

example, the Diabetic Retinopathy Diagnosis (DR) involves five health risk levels: no DR (0),

mild DR (1), moderate DR (2), severe DR (3) and proliferative DR (4). This labeling system is …

  Cited by 20 Related articles All 8 versions 

[PDF] ieee.org

Accelerating CS-MRI reconstruction with fine-tuning Wasserstein generative adversarial network

M Jiang, Z Yuan, X Yang, J Zhang, Y Gong, L Xia… - IEEE …, 2019 - ieeexplore.ieee.org

Compressed sensing magnetic resonance imaging (CS-MRI) is a time-efficient method to

acquire MR images by taking advantage of the highly under-sampled k-space data to

accelerate the time consuming acquisition process. In this paper, we proposed a de-aliasing …

  Cited by 11 Related articles

<—2019———— 2019 ———-   1000— 


[PDF] arxiv.org

Wasserstein-wasserstein auto-encoders

S Zhang, Y Gao, Y Jiao, J Liu, Y Wang… - arXiv preprint arXiv …, 2019 - arxiv.org

To address the challenges in learning deep generative models (eg, the blurriness of

variational auto-encoder and the instability of training generative adversarial networks, we

propose a novel deep generative model, named Wasserstein-Wasserstein auto-encoders …

Cited by 8 Related articles All 4 versions 

Calculating spatial configurational entropy of a landscape mosaic based on the Wasserstein metric

Y Zhao, X Zhang - Landscape Ecology, 2019 - Springer

Context Entropy is an important concept traditionally associated with thermodynamics and is

widely used to describe the degree of disorder in a substance, system, or process.

Configurational entropy has received more attention because it better reflects the …

  Cited by 4 Related articles All 5 versions


[PDF] wustl.edu

Grid-less DOA estimation using sparse linear arrays based on Wasserstein distance

M Wang, Z Zhang, A Nehorai - IEEE Signal Processing Letters, 2019 - ieeexplore.ieee.org

Sparse linear arrays, such as nested and co-prime arrays, are capable of resolving O (M2)

sources using only O (M) sensors by exploiting their so-called difference coarray model. One

popular approach to exploit the difference coarray model is to construct an augmented …

  Cited by 3 Related articles All 3 versions


[PDF] arxiv.org

Adaptive wasserstein hourglass for weakly supervised hand pose estimation from monocular RGB

Y Zhang, L Chen, Y Liu, J Yong, W Zheng - arXiv preprint arXiv …, 2019 - arxiv.org

Insufficient labeled training datasets is one of the bottlenecks of 3D hand pose estimation

from monocular RGB images. Synthetic datasets have a large number of images with

precise annotations, but the obvious difference with real-world datasets impacts the …

  Cited by 3 Related articles All 2 versions 


Wasserstein generative adversarial networks for motion artifact removal in dental CT imaging

C Jiang, Q Zhang, Y Ge, D Liang… - … 2019: Physics of …, 2019 - spiedigitallibrary.org

In dental computed tomography (CT) scanning, high-quality images are crucial for oral

disease diagnosis and treatment. However, many artifacts, such as metal artifacts,

Cited by 8 Related articles All 3 versions

 

Adversarial Learning for Cross-Modal Retrieval with Wasserstein Distance

Q Cheng, Y Zhang, X Gu - International Conference on Neural Information …, 2019 - Springer

This paper presents a novel approach for cross-modal retrieval in an Adversarial Learning

with Wasserstein Distance (ALWD) manner, which aims at learning aligned representation

for various modalities in a GAN framework. The generator projects the image and the text …

Cited by 11 Related articles All 3 versions

[PDF] arxiv.org

Distributionally Robust XVA via Wasserstein Distance Part 2: Wrong Way Funding Risk

D Singh, S Zhang - arXiv preprint arXiv:1910.03993, 2019 - arxiv.org

This paper investigates calculations of robust funding valuation adjustment (FVA) for over

the counter (OTC) derivatives under distributional uncertainty using Wasserstein distance as

the ambiguity measure. Wrong way funding risk can be characterized via the robust FVA …

  Related articles All 5 versions 


[PDF] arxiv.org

Distributionally Robust XVA via Wasserstein Distance: Wrong Way Counterparty Credit and Funding Risk

D Singh, S Zhang - arXiv preprint arXiv:1910.01781, 2019 - arxiv.org

This paper investigates calculations of robust XVA, in particular, credit valuation adjustment

(CVA) and funding valuation adjustment (FVA) for over-the-counter derivatives under

distributional uncertainty using Wasserstein distance as the ambiguity measure. Wrong way …

  Related articles All 8 versions 


An Integrated Processing Method Based on Wasserstein Barycenter Algorithm for Automatic Music Transcription

C Jin, Z Li, Y Sun, H Zhang, X Lv, J Li, S Liu - International Conference on …, 2019 - Springer

Given a piece of acoustic musical signal, various automatic music transcription (AMT)

processing methods have been proposed to generate the corresponding music notations

without human intervention. However, the existing AMT methods based on signal …

  Related articles

[CITATION] An Integrated Processing Method Based on Wasserstein Barycenter Algorithm for Automatic Music Transcription

C Jin, Z Li, Y Sun, H Zhang, X Lv, J Li, S Liu - International Conference on …, 2019 - Springer

  Related articles

<—2019———— 2019  ———-   1010—  


Improved Procedures for Training Primal Wasserstein GANs

T Zhang, Z Li, Q Zhu, D Zhang - 2019 IEEE SmartWorld …, 2019 - ieeexplore.ieee.org

Primal Wasserstein GANs are a variant of Generative Adversarial Networks (ie, GANs),

which optimize the primal form of empirical Wasserstein distance directly. However, the high

computational complexity and training instability are the main challenges of this framework …

  Cited by 4 Related articles


2019

A virtual monochromatic imaging method for spectral CT based on Wasserstein generative adversarial network with a hybrid loss

Z Shi, J Li, H Li, Q Hu, Q Cao - IEEE Access, 2019 - ieeexplore.ieee.org

Spectral computed tomography (CT) has become a popular clinical diagnostic technique

because of its unique advantage in material distinction. Specifically, it can perform virtual

monochromatic imaging to obtain accurate tissue composition with less beam hardening …

  Cited by 9 Related articles All 2 versions

Wasserstein-Bounded Generative Adversarial Networks

P Zhou, B Ni, L Xie, X Zhang, H Wang, C Geng, Q Tian - 2019 - openreview.net

In the field of Generative Adversarial Networks (GANs), how to design a stable training

strategy remains an open problem. Wasserstein GANs have largely promoted the stability

over the original GANs by introducing Wasserstein distance, but still remain unstable and …

  

[PDF] bayesiandeeplearning.org

[PDF] Nested-Wasserstein Distance for Sequence Generation

R Zhang, C Chen, Z Gan, Z Wen, W Wang, L Carin - bayesiandeeplearning.org

Reinforcement learning (RL) has been widely studied for improving sequence-generation

models. However, the conventional rewards used for RL training typically cannot capture

sufficient semantic information and therefore render model bias. Further, the sparse and …

  Related articles 


2019   see 2020

Distributionally Robust XVA via Wasserstein Distance Part 2 ...

https://arxiv.org › q-fin

by D Singh · 2019 · Cited by 1 — This paper investigates calculations of robust funding valuation adjustment (FVA) for over the counter (OTC) derivatives under distributional ...


[CITATION] Distributionally robust xva via wasserstein distance part 1

D Singh, S Zhang - arXiv preprint arXiv:1910.01781, 2019

  Cited by 2 Related articles


[PDF] ieee.org

A packet-length-adjustable attention model based on bytes embedding using flow-WGAN for smart cybersecurity

L Han, Y Sheng, X Zeng - IEEE Access, 2019 - ieeexplore.ieee.org

In the studies of cybersecurity, malicious traffic detection is attracting more and more

attention for its capability of detecting attacks. Almost all of the intrusion detection methods

based on deep learning have poor data processing capacity with the increase in the data …

  Cited by 3 Related articles


 [PDF] monash.edu

[PDF] Threeplayer wasserstein gan via amortised duality

QH Nhan Dam, T Le, TD Nguyen… - Proc. of the 28th Int …, 2019 - research.monash.edu

We propose a new formulation for learning generative adversarial networks (GANs) using 

optimal transport cost (the general form of Wasserstein distance) as the objective criterion to 

measure the dissimilarity between target distribution and learned distribution. Our …

Cited by 2 Related articles All 3 versions


 

[PDF] researchgate.net

Infrared and Visible Image Fusion via Multi-discriminators Wasserstein Generative Adversarial Network

J Li, H Huo, K Liu, C Li, S Li… - 2019 18th IEEE …, 2019 - ieeexplore.ieee.org

Generative adversarial network (GAN) has been widely applied to infrared and visible image 

fusion. However, the existing GAN-based image fusion methods only establish one 

discriminator in the network to make the fused image capture gradient information from the …

Cited by 1 Related articles All 3 versions



[PDF] arxiv.org

Distributionally Robust XVA via Wasserstein Distance: Wrong Way Counterparty Credit and Funding Risk

D Singh, S Zhang - arXiv preprint arXiv:1910.01781, 2019 - arxiv.org

This paper investigates calculations of robust XVA, in particular, credit valuation adjustment 

(CVA) and funding valuation adjustment (FVA) for over-the-counter derivatives under 

distributional uncertainty using Wasserstein distance as the ambiguity measure. Wrong way …

Related articles All 8 versions



Schrieber, Jörn

Algorithms for optimal transport and Wasserstein distances. (English) Zbl 1437.90004

Göttingen: Univ. Göttingen (Diss.). viii, 159 p. (2019).

MSC:  90-02 90B06


Cited by 3 Related articles All 3 versions

<—-2019———— 2019  ———-   1020—    

2019  [PDF] mlr.press

Gromov-wasserstein learning for graph matching and node embedding

H Xu, D Luo, H Zha, LC Duke - International conference on …, 2019 - proceedings.mlr.press

A novel Gromov-Wasserstein learning framework is proposed to jointly match (align) graphs

and learn embedding vectors for the associated graph nodes. Using Gromov-Wasserstein

discrepancy, we measure the dissimilarity between two graphs and find their …

  Cited by 45 Related articles All 9 versions 
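For readers who want to reproduce the basic quantity behind this line of work, the sketch below compares two point clouds purely through their intra-domain distance matrices, which is what the Gromov-Wasserstein discrepancy measures. It is a toy illustration, not the authors' joint matching-and-embedding framework; it assumes the POT package (import ot) is installed, and the helper name gw_discrepancy and the toy data are placeholders.

import numpy as np
import ot  # POT: Python Optimal Transport (assumed installed)
from scipy.spatial.distance import cdist

def gw_discrepancy(X, Y):
    # Gromov-Wasserstein compares the *internal* geometry of the two samples,
    # so no cross-domain cost between X and Y is ever needed.
    C1 = cdist(X, X)
    C2 = cdist(Y, Y)
    p = ot.unif(len(X))  # uniform node/point weights
    q = ot.unif(len(Y))
    return ot.gromov.gromov_wasserstein2(C1, C2, p, q, loss_fun='square_loss')

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 2))
Y = X @ np.array([[0.0, -1.0], [1.0, 0.0]])  # rotated copy: discrepancy should be near zero
print(gw_discrepancy(X, Y))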


2019  [PDF] arxiv.org

Scalable Gromov-Wasserstein learning for graph partitioning and matching

H Xu, D Luo, L Carin - arXiv preprint arXiv:1905.07645, 2019 - arxiv.org

We propose a scalable Gromov-Wasserstein learning (S-GWL) method and establish a

novel and theoretically-supported paradigm for large-scale graph analysis. The proposed

method is based on the fact that Gromov-Wasserstein discrepancy is a pseudometric on …

  Cited by 21 Related articles All 7 versions 


2019  [PDF] mlr.press

Unsupervised alignment of embeddings with wasserstein procrustes

E Grave, A Joulin, Q Berthet - The 22nd International …, 2019 - proceedings.mlr.press

We consider the task of aligning two sets of points in high dimension, which has many

applications in natural language processing and computer vision. As an example, it was

recently shown that it is possible to infer a bilingual lexicon, without supervised data, by …

  Cited by 92 Related articles All 3 versions 


2019 see 2018

[CITATION] Graph Classification with Gromov-Wasserstein Distance via Heat Kernel

지종호, 박성홍, 신현정 - Proceedings of the Korean Institute of Information Scientists and Engineers (KIISE) Conference, 2019 - dbpia.co.kr

Graph type of data has shed light on various domains such as novel chemical compound

design in pharmaceutical industry, community detection or influential node identification in

social network analysis, intrusion detection in network security, and so on. In order to use the …

  Related articles


2019

Graph signal representation with Wasserstein Barycenters

E Simou, P Frossard - ICASSP 2019-2019 IEEE International …, 2019 - ieeexplore.ieee.org

In many applications signals reside on the vertices of weighted graphs. Thus, there is the

need to learn low dimensional representations for graph signals that will allow for data

analysis and interpretation. Existing unsupervised dimensionality reduction methods for …

  Cited by 7 Related articles All 5 versions




[PDF] arxiv.org

A convergent Lagrangian discretization for p-Wasserstein and flux-limited diffusion equations

B Söllner, O Junge - arXiv preprint arXiv:1906.01321, 2019 - arxiv.org

We study a Lagrangian numerical scheme for solution of a nonlinear drift diffusion equation

of the form $\partial_t u=\partial_x (u\cdot c [\partial_x (h^\prime (u)+ v)]) $ on an interval.

This scheme will consist of a spatio-temporal discretization founded in the formulation of the …

  Cited by 2 Related articles All 5 versions 

[CITATION] A convergent Lagrangian discretization for p-Wasserstein and flux-limited diffusion equations

O Junge, B Söllner - arXiv preprint arXiv:1906.01321, 2019


2019

[PDF] theses.fr

Processus de diffusion sur l'espace de Wasserstein: modèles coalescents, propriétés de régularisation et équations de McKean-Vlasov

V Marx - 2019 - theses.fr

Abstract: The thesis aims to study a class of stochastic processes with values in

the space of probability measures on the real line, called the Wasserstein space

when it is equipped with the Wasserstein metric W2. This work mainly addresses the …

  Related articles All 3 versions 



[PDF] arxiv.org

Primal dual methods for Wasserstein gradient flows

JA Carrillo, K Craig, L Wang, C Wei - arXiv preprint arXiv:1901.08081, 2019 - arxiv.org

Combining the classical theory of optimal transport with modern operator splitting

techniques, we develop a new numerical method for nonlinear, nonlocal partial differential

equations, arising in models of porous media, materials science, and biological swarming …

  Cited by 17 Related articles All 3 versions 


[PDF] mlr.press

Understanding mcmc dynamics as flows on the wasserstein space

C Liu, J Zhuo, J Zhu - International Conference on Machine …, 2019 - proceedings.mlr.press

It is known that the Langevin dynamics used in MCMC is the gradient flow of the KL

divergence on the Wasserstein space, which helps convergence analysis and inspires

recent particle-based variational inference methods (ParVIs). But no more MCMC dynamics  …

  Cited by 3 Related articles All 11 versions 


[PDF] arxiv.org

Riemannian normalizing flow on variational wasserstein autoencoder for text modeling

PZ Wang, WY Wang - arXiv preprint arXiv:1904.02399, 2019 - arxiv.org

Recurrent Variational Autoencoder has been widely used for language modeling and text

generation tasks. These models often face a difficult optimization problem, also known as

the Kullback-Leibler (KL) term vanishing issue, where the posterior easily collapses to the …

  Cited by 14 Related articles All 5 versions 


<—2019———— 2019  ———-   1030—   



Riemannian Normalizing Flow on Variational Wasserstein Autoencoder for Text Modeling

P Zizhuang Wang, WY Wang - arXiv e-prints, 2019 - ui.adsabs.harvard.edu

Abstract Recurrent Variational Autoencoder has been widely used for language modeling

and text generation tasks. These models often face a difficult optimization problem, also

known as the Kullback-Leibler (KL) term vanishing issue, where the posterior easily …



[PDF] arxiv.org

Modified massive Arratia flow and Wasserstein diffusion

V Konarovskyi, MK von Renesse - Communications on Pure …, 2019 - Wiley Online Library

Extending previous work by the first author we present a variant of the Arratia flow, which

consists of a collection of coalescing Brownian motions starting from every point of the unit

interval. The important new feature of the model is that individual particles carry mass that …

  Cited by 27 Related articles All 7 versions


[PDF] arxiv.org

Mullins-Sekerka as the Wasserstein flow of the perimeter

A Chambolle, T Laux - arXiv preprint arXiv:1910.02508, 2019 - arxiv.org

We prove the convergence of an implicit time discretization for the one-phase Mullins-

Sekerka equation, possibly with additional non-local repulsion, proposed in [F. Otto, Arch.

Rational Mech. Anal. 141 (1998) 63--103]. Our simple argument shows that the limit satisfies …

  Cited by 1 Related articles All 4 versions 


[PDF] arxiv.org

Straight-through estimator as projected Wasserstein gradient flow

P Cheng, C Liu, C Li, D Shen, R Henao… - arXiv preprint arXiv …, 2019 - arxiv.org

The Straight-Through (ST) estimator is a widely used technique for back-propagating

gradients through discrete random variables. However, this effective method lacks

theoretical justification. In this paper, we show that ST can be interpreted as the simulation of …

  Cited by 4 Related articles All 5 versions 


[PDF] arxiv.org

On the total variation Wasserstein gradient flow and the TV-JKO scheme

G Carlier, C Poon - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

We study the JKO scheme for the total variation, characterize the optimizers, prove some of

their qualitative properties (in particular a form of maximum principle and in some cases, a

minimum principle as well). Finally, we establish a convergence result as the time step goes …

  Cited by 7 Related articles All 7 versions




[PDF] arxiv.org

Wasserstein gradient flow formulation of the time-fractional Fokker-Planck equation

MH Duong, B Jin - arXiv preprint arXiv:1908.09055, 2019 - arxiv.org

In this work, we investigate a variational formulation for a time-fractional Fokker-Planck

equation which arises in the study of complex physical systems involving anomalously slow

diffusion. The model involves a fractional-order Caputo derivative in time, and thus …

  Cited by 1 Related articles All 7 versions 

[PDF] projecteuclid.org

Convergence of the population dynamics algorithm in the Wasserstein metric

M Olvera-Cravioto - Electronic Journal of Probability, 2019 - projecteuclid.org

We study the convergence of the population dynamics algorithm, which produces sample

pools of random variables having a distribution that closely approximates that of the special

endogenous solution to a variety of branching stochastic fixed-point equations, including the …

  Cited by 3 Related articles All 6 versions


[PDF] sciencedirect.com

Distributionally Robust Chance-Constraint Optimal Power Flow Considering Uncertain Renewables with Wasserstein-Moment Metric

J Liu, Y Chen, C Duan, J Lyu - Energy Procedia, 2019 - Elsevier

Chance-constraint optimal power flow has been proven as an efficient method to manage

the risk of volatile renewable energy sources. To address the uncertainties of renewable

energy sources, a novel distributionally robust chance-constraint OPF model is proposed in …

  Cited by 1 Related articles All 2 versions


[PDF] arxiv.org

Finsler structure for variable exponent Wasserstein space and gradient flows

A Marcos, A Soglo - arXiv preprint arXiv:1912.12450, 2019 - arxiv.org

The variational approach requires the setting of new tools such as appropiate distance on the

probability space and an introduction of a Finsler metric in this space. The class of parabolic

equations is derived as the flow of a gradient with respect the Finsler structure. For q(x) q …

  Related articles All 2 versions 

<—2019———— 2019  ———-   1040—  


[PDF] wiley.com

A degenerate Cahn‐Hilliard model as constrained Wasserstein gradient flow

D Matthes, C Cancès, F Nabet - PAMM, 2019 - Wiley Online Library

Existence of solutions to a non‐local Cahn‐Hilliard model with degenerate mobility is

considered. The PDE is written as a gradient flow with respect to the L2‐Wasserstein metric

for two components that are coupled by an incompressibility constraint. Approximating …

  Related articles


[PDF] tum.de

Structure preserving discretization and approximation of gradient flows in Wasserstein-like space

S Plazotta - 2019 - mediatum.ub.tum.de

This thesis investigates structure-preserving, temporal semi-discretizations and

approximations for PDEs with gradient flow structure with the application to evolution

problems in the L²-Wasserstein space. We investigate the variational formulation of the time …

  Related articles All 3 versions 

Approximation of stable law in Wasserstein-1 distance by Stein's method

L Xu - Annals of Applied Probability, 2019 - projecteuclid.org

Abstract: Let $n\in\mathbb{N}$, let $\zeta_{n,1},\ldots,\zeta_{n,n}$ be a sequence of

independent random variables with $\mathbb{E}\zeta_{n,i}=0$ and $\mathbb{E}|\zeta_{n,i}|<\infty$

for each $i$, and let $\mu$ be an $\alpha$-stable distribution having …

  Cited by 19 Related articles All 7 versions


[PDF] arxiv.org

Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula

X Fang, QM Shao, L Xu - Probability Theory and Related Fields, 2019 - Springer

Stein's method has been widely used for probability approximations. However, in the multi-

dimensional setting, most of the results are for multivariate normal approximation or for test

functions with bounded second-or higher-order derivatives. For a class of multivariate …

  Cited by 20 Related articles All 7 versions

[HTML] springer.com

[HTML] Correction to: Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula

X Fang, QM Shao, L Xu - Probability Theory and Related Fields, 2019 - Springer

Under the above-strengthened Assumption 2.1, all the conclusions and examples in [1] still hold

true, except that all the constants \(C_\theta \) therein will depend on the constants in the new

assumption … Combining the previous three inequalities, we conclude that [1, (7.1)] still holds …

  Cited by 1 Related articles All 2 versions


[PDF] arxiv.org

Multivariate stable approximation in Wasserstein distance by Stein's method

P Chen, I Nourdin, L Xu, X Yang - arXiv preprint arXiv:1911.12917, 2019 - arxiv.org

We investigate regularity properties of the solution to Stein's equation associated with

multivariate integrable $\alpha $-stable distribution for a general class of spectral measures

and Lipschitz test functions. The obtained estimates induce an upper bound in Wasserstein  …

  Cited by 3 Related articles All 4 versions 


2019

[PDF] arxiv.org

Learning embeddings into entropic wasserstein spaces

C Frogner, F Mirzazadeh, J Solomon - arXiv preprint arXiv:1905.03329, 2019 - arxiv.org

Euclidean embeddings of data are fundamentally limited in their ability to capture latent

semantic structures, which need not conform to Euclidean spatial assumptions. Here we

consider an alternative, which embeds data as discrete probability distributions in a …

  Cited by 3 Related articles All 7 versions 
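Entries like the one above work in an entropy-regularized ("entropic") Wasserstein geometry, which in practice is computed with Sinkhorn iterations. The minimal sketch below shows those iterations only; it is not the embedding method of the paper, the function name sinkhorn_plan is an illustrative placeholder, and numerical stabilization (log-domain updates) is deliberately omitted.

import numpy as np

def sinkhorn_plan(a, b, C, reg=0.1, n_iter=500):
    # Entropy-regularized optimal transport between histograms a and b
    # with cost matrix C: alternating scaling updates (Sinkhorn iterations).
    K = np.exp(-C / reg)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]   # approximate transport plan
    return P, np.sum(P * C)           # plan and its (regularized) transport cost

# Toy usage: two histograms on a 1-D grid.
x = np.linspace(0, 1, 50)
C = (x[:, None] - x[None, :]) ** 2
a = np.exp(-(x - 0.3) ** 2 / 0.01); a /= a.sum()
b = np.exp(-(x - 0.7) ** 2 / 0.01); b /= b.sum()
print(sinkhorn_plan(a, b, C)[1])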


Generating EEG signals of an RSVP experiment by a class conditioned wasserstein generative adversarial network

S Panwar, P Rad, J Quarles… - 2019 IEEE International …, 2019 - ieeexplore.ieee.org

Electroencephalography (EEG) data is difficult to obtain due to complex experimental setups

and reduced comfort due to prolonged wearing. This poses challenges to train powerful

deep learning model due to the limited EEG data. Hence, being able to generate EEG data …

  Cited by 5 Related articles All 2 versions


[PDF] arxiv.org

Adaptive wasserstein hourglass for weakly supervised hand pose estimation from monocular RGB

Y Zhang, L Chen, Y Liu, J Yong, W Zheng - arXiv preprint arXiv …, 2019 - arxiv.org

Insufficient labeled training datasets is one of the bottlenecks of 3D hand pose estimation

from monocular RGB images. Synthetic datasets have a large number of images with

precise annotations, but the obvious difference with real-world datasets impacts the …

   Cited by 3 Related articles All 3 versions 


[PDF] arxiv.org

From the backward Kolmogorov PDE on the Wasserstein space to propagation of chaos for McKean-Vlasov SDEs

N Frikha, PEC de Raynal - arXiv preprint arXiv:1907.01410, 2019 - arxiv.org

In this article, we provide some new quantitative estimates for propagation of chaos of non-

linear stochastic differential equations (SDEs) in the sense of McKean-Vlasov. We obtain

explicit error estimates, at the level of the trajectories, at the level of the semi-group and at …

  Cited by 4 Related articles All 7 versions 


A semi-supervised wasserstein generative adversarial network for classifying driving fatigue from EEG signals

S Panwar, P Rad, J Quarles, E Golob… - … on Systems, Man and …, 2019 - ieeexplore.ieee.org

Predicting driver's cognitive states using deep learning from electroencephalography (EEG)

signals is considered this paper. To address the challenge posed by limited labeled training

samples, a semi-supervised Wasserstein Generative Adversarial Network with gradient …

  Cited by 4 Related articles All 2 versions

<—2019———— 2019  ———-   1050—     


Processus de diffusion sur l’espace de Wasserstein : modèles coalescents, propriétés...

by Marx, Victor

2019 thesis

The thesis aims to study a class of stochastic processes with values in the space of probability measures on the real line, called the Wasserstein space...

Dissertation/Thesis, Full Text Online

[PDF] psu.edu

Aggregated Wasserstein distance for hidden Markov models and automated morphological characterization of placenta from photos

Y Chen - 2019 - etda.libraries.psu.edu

In the past decade, fueled by the rapid advances of big data technology and machine

learning algorithms, data science has become a new paradigm of science and has more

and more emerged into its own field. At the intersection of computational methods, data …

  Related articles 


[PDF] thecvf.com

Max-sliced wasserstein distance and its use for gans

I Deshpande, YT Hu, R Sun, A Pyrros… - Proceedings of the …, 2019 - openaccess.thecvf.com

Generative adversarial nets (GANs) and variational auto-encoders have significantly

improved our distribution modeling capabilities, showing promise for dataset augmentation,

image-to-image translation and feature learning. However, to model high-dimensional …

  Cited by 35 Related articles All 8 versions 
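Since several entries here rely on sliced and max-sliced Wasserstein distances, a short sketch may help: project both samples onto random directions, compare the sorted 1-D projections, then average (sliced) or take the worst direction among those sampled (a crude stand-in for the max-sliced distance, which the paper actually optimizes over directions). The helper name sliced_w2 and the Monte Carlo setup are illustrative assumptions, not the authors' implementation.

import numpy as np

def sliced_w2(X, Y, n_proj=256, seed=0):
    # X, Y: (n, d) samples of equal size with uniform weights.
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    thetas = rng.normal(size=(n_proj, d))
    thetas /= np.linalg.norm(thetas, axis=1, keepdims=True)   # random unit directions
    px = np.sort(X @ thetas.T, axis=0)                        # sorted 1-D projections
    py = np.sort(Y @ thetas.T, axis=0)
    w2_sq = np.mean((px - py) ** 2, axis=0)                   # squared W2 along each direction
    return np.sqrt(w2_sq.mean()), np.sqrt(w2_sq.max())        # (sliced, max over sampled directions)

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 5))
Y = rng.normal(size=(500, 5)) + 0.5
print(sliced_w2(X, Y))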


2019

A unified formulation for the Bures-Wasserstein and Log-Euclidean/Log-Hilbert-Schmidt distances between positive definite operators

HQ Minh - International Conference on Geometric Science of …, 2019 - Springer

This work presents a parametrized family of distances, namely the Alpha Procrustes

distances, on the set of symmetric, positive definite (SPD) matrices. The Alpha Procrustes

distances provide a unified formulation encompassing both the Bures-Wasserstein and Log …

  Cited by 5 Related articles All 2 versions
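The plain Bures-Wasserstein distance that this unified formulation generalizes has a simple closed form for symmetric positive definite matrices, sketched below with SciPy. This is only the standard formula, not the Alpha Procrustes family introduced in the paper; the helper name bures_wasserstein is an illustrative placeholder.

import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein(A, B):
    # d(A, B)^2 = tr(A) + tr(B) - 2 tr((A^{1/2} B A^{1/2})^{1/2}) for SPD matrices A, B;
    # this equals the 2-Wasserstein distance between centered Gaussians N(0, A) and N(0, B).
    rA = sqrtm(A)
    d2 = np.trace(A) + np.trace(B) - 2.0 * np.trace(sqrtm(rA @ B @ rA))
    return np.sqrt(max(np.real(d2), 0.0))  # clip tiny negative values due to round-off

A = np.array([[2.0, 0.3], [0.3, 1.0]])
B = np.eye(2)
print(bures_wasserstein(A, B))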




2019

[PDF] archives-ouvertes.fr

Diffusive processes on the Wasserstein space: Coalescing models, Regularization properties and McKean-Vlasov equations

V Marx - 2019 - tel.archives-ouvertes.fr

The aim of this thesis is to study a class of diffusive stochastic processes with values in the

space of probability measures on the real line, called Wasserstein space if it is endowed

with the Wasserstein metric W2. The following issues are mainly addressed in this work: how …

  Cited by 2 Related articles All 9 versions 



<—2019———— 2019  ———-   1060—  



Tackling Algorithmic Bias in Neural-Network Classifiers using Wasserstein-2 Regularization

L Risser, Q Vincenot, JM Loubes - arXiv e-prints, 2019 - ui.adsabs.harvard.edu

The increasingly common use of neural network classifiers in industrial and social 

applications of image analysis has allowed impressive progress these last years. Such 

methods are however sensitive to algorithmic bias, ie to an under-or an over-representation …

Cited by 2 Related articles All 3 versions

Projection in the 2-Wasserstein sense on structured measure space

L Lebrat - 2019 - tel.archives-ouvertes.fr

This thesis focuses on the approximation for the 2-Wasserstein metric of probability 

measures by structured measures. The set of structured measures under consideration is 

made of consistent discretizations of measures carried by a smooth curve with a bounded …

All 2 versions


[PDF] ieee.org

Deep transfer model with Wasserstein distance guided multi-adversarial networks for bearing fault diagnosis under different working conditions

M Zhang, D Wang, W Lu, J Yang, Z Li, B Liang - IEEE Access, 2019 - ieeexplore.ieee.org

… In this paper, a new deep transfer model based on Wasserstein distance guided multi-adversarial …

is learning the shared feature representation by minimizing the Wasserstein distance …

 Cited by 59 Related articles All 6 versions


2019

Wasserstein F-tests and Confidence Bands for the Fréchet Regression of Density Response Curves

arxiv.org › stat

by A Petersen · 2019 · Cited by 3 — Data consisting of samples of probability density functions are increasingly prevalent, necessitating the development of methodologies for their analysis that respect the inherent nonlinearities associated with densities.


[PDF] arxiv.org

Tree-Wasserstein Barycenter for Large-Scale Multilevel Clustering and Scalable Bayes

T Le, V Huynh, N Ho, D Phung, M Yamada - arXiv preprint arXiv …, 2019 - arxiv.org

We study in this paper a variant of Wasserstein barycenter problem, which we refer to as tree-

Wasserstein barycenter, by leveraging a specific class of ground metrics, namely tree 

metrics, for Wasserstein distance. Drawing on the tree structure, we propose an efficient …

Related articles All 2 versions


2019


Improved Procedures for Training Primal Wasserstein GANs

T Zhang, Z Li, Q Zhu, D Zhang - 2019 IEEE SmartWorld …, 2019 - ieeexplore.ieee.org

Primal Wasserstein GANs are a variant of Generative Adversarial Networks (ie, GANs), 

which optimize the primal form of empirical Wasserstein distance directly. However, the high 

computational complexity and training instability are the main challenges of this framework …

Cited by 4 Related articles


[PDF] arxiv.org

Clustering measure-valued data with Wasserstein barycenters

G Domazakis, D Drivaliaris, S Koukoulas… - arXiv preprint arXiv …, 2019 - arxiv.org

In this work, learning schemes for measure-valued data are proposed, ie data that their 

structure can be more efficiently represented as probability measures instead of points on 

$\R^ d $, employing the concept of probability barycenters as defined with respect to the …

Related articles All 2 versions



[PDF] mlr.press

Estimation of smooth densities in Wasserstein distance

J Weed, Q Berthet - Conference on Learning Theory, 2019 - proceedings.mlr.press

The Wasserstein distances are a set of metrics on probability distributions supported on $\mathbb {R}^ d $ with applications throughout statistics and machine learning. Often, such distances are used in the context of variational problems, in which the statistician employs in …

  Cited by 23 Related articles All 4 versions 


[PDF] thecvf.com

Conservative wasserstein training for pose estimation

X Liu, Y Zou, T Che, P Ding, P Jia… - Proceedings of the …, 2019 - openaccess.thecvf.com

This paper targets the task with discrete and periodic class labels (eg, pose/orientation estimation) in the context of deep learning. The commonly used cross-entropy or regression loss is not well matched to this problem as they ignore the periodic nature of the labels and …

  Cited by 20 Related articles All 8 versions 


[PDF] mlr.press

Orthogonal estimation of wasserstein distances

M Rowland, J Hron, Y Tang… - The 22nd …, 2019 - proceedings.mlr.press

Wasserstein distances are increasingly used in a wide variety of applications in machine learning. Sliced Wasserstein distances form an important subclass which may be estimated efficiently through one-dimensional sorting operations. In this paper, we propose a new …

  Cited by 10 Related articles All 9 versions 
<—2019———— 2019  ———-   1070—  



[HTML] oup.com

On parameter estimation with the Wasserstein distance

E Bernton, PE Jacob, M Gerber… - … and Inference: A …, 2019 - academic.oup.com

Statistical inference can be performed by minimizing, over the parameter space, the Wasserstein distance between model distributions and the empirical distribution of the data. We study asymptotic properties of such minimum Wasserstein distance estimators …

  Cited by 19 Related articles All 6 versions


[PDF] arxiv.org

Estimation of Wasserstein distances in the spiked transport model

J Niles-Weed, P Rigollet - arXiv preprint arXiv:1909.07513, 2019 - arxiv.org

We propose a new statistical model, the spiked transport model, which formalizes the assumption that two probability distributions differ only on a low-dimensional subspace. We study the minimax rate of estimation for the Wasserstein distance under this model and show …

  Cited by 14 Related articles All 2 versions 




[PDF] arxiv.org

Confidence regions in wasserstein distributionally robust estimation

J Blanchet, K Murthy, N Si - arXiv preprint arXiv:1906.01614, 2019 - arxiv.org

Wasserstein distributionally robust optimization (DRO) estimators are obtained as solutions of min-max problems in which the statistician selects a parameter minimizing the worst-case loss among all probability models within a certain distance (in a Wasserstein sense) from the …

  Cited by 10 Related articles All 6 versions 



[PDF] ieee.org

Multi-source medical image fusion based on Wasserstein generative adversarial networks

Z Yang, Y Chen, Z Le, F Fan, E Pan - IEEE Access, 2019 - ieeexplore.ieee.org

In this paper, we propose the medical Wasserstein generative adversarial networks (MWGAN), an end-to-end model, for fusing magnetic resonance imaging (MRI) and positron emission tomography (PET) medical images. Our method establishes two adversarial …

  Cited by 5 Related articles


[PDF] arxiv.org

Parameter estimation for biochemical reaction networks using Wasserstein distances

K Öcal, R Grima, G Sanguinetti - Journal of Physics A …, 2019 - iopscience.iop.org

We present a method for estimating parameters in stochastic models of biochemical reaction networks by fitting steady-state distributions using Wasserstein distances. We simulate a reaction network at different parameter settings and train a Gaussian process to learn the …

  Cited by 6 Related articles All 7 versions


Prostate MR image segmentation with self-attention adversarial training based on wasserstein distance

C Su, R Huang, C Liu, T Yin, B Du - IEEE Access, 2019 - ieeexplore.ieee.org

Prostate diseases are very common in men. Accurate segmentation of the prostate plays a significant role in further clinical treatment and diagnosis. There have been some methods that combine the segmentation network and generative adversarial network, using the …

  Cited by 3 Related articles


[PDF] arxiv.org

Single image haze removal using conditional wasserstein generative adversarial networks

JP Ebenezer, B Das… - 2019 27th European …, 2019 - ieeexplore.ieee.org

We present a method to restore a clear image from a haze-affected image using a Wasserstein generative adversarial network. As the problem is ill-conditioned, previous methods have required a prior on natural images or multiple images of the same scene. We …

  Cited by 7 Related articles All 5 versions


Time delay estimation via Wasserstein distance minimization

JM Nichols, MN Hutchinson, N Menkart… - IEEE Signal …, 2019 - ieeexplore.ieee.org

Time delay estimation between signals propagating through nonlinear media is an important problem with application to radar, underwater acoustics, damage detection, and communications (to name a few). Here, we describe a simple approach for determining the …

  Cited by 3 Related articles All 2 versions
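As a hedged toy version of the idea in the entry above (not the authors' estimator), one can treat the magnitudes of a reference and a received signal as densities on the time axis and pick the shift that minimizes their 1-D Wasserstein distance. The helper name estimate_delay, the circular shift via np.roll, and the grid search over integer shifts are simplifying assumptions for illustration.

import numpy as np
from scipy.stats import wasserstein_distance

def estimate_delay(ref, obs, dt, max_shift):
    # Scan integer shifts of the reference and keep the one whose normalized
    # magnitude profile is closest to the observation in W1 on the time axis.
    t = np.arange(len(ref)) * dt
    w_obs = np.abs(obs) / np.abs(obs).sum()
    best_d, best_delay = np.inf, 0.0
    for k in range(-max_shift, max_shift + 1):
        shifted = np.roll(ref, k)                         # toy: circular shift
        w_ref = np.abs(shifted) / np.abs(shifted).sum()
        d = wasserstein_distance(t, t, w_ref, w_obs)
        if d < best_d:
            best_d, best_delay = d, k * dt
    return best_delay

dt = 0.01
t = np.arange(0, 2, dt)
pulse = np.exp(-((t - 0.5) / 0.05) ** 2)                  # reference pulse
obs = np.roll(pulse, 30)                                  # true delay: 30 * dt = 0.3
print(estimate_delay(pulse, obs, dt, max_shift=60))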
<—2019———— 2019  ———-   1080— 



[PDF] wustl.edu

Grid-less DOA estimation using sparse linear arrays based on Wasserstein distance

M Wang, Z Zhang, A Nehorai - IEEE Signal Processing Letters, 2019 - ieeexplore.ieee.org

Sparse linear arrays, such as nested and co-prime arrays, are capable of resolving O (M2) sources using only O (M) sensors by exploiting their so-called difference coarray model. One popular approach to exploit the difference coarray model is to construct an augmented …

  Cited by 3 Related articles All 3 versions




[PDF] aaai.org

Manifold-valued image generation with Wasserstein generative adversarial nets

Z Huang, J Wu, L Van Gool - Proceedings of the AAAI Conference on …, 2019 - ojs.aaai.org

Generative modeling over natural images is one of the most fundamental machine learning problems. However, few modern generative models, including Wasserstein Generative Adversarial Nets (WGANs), are studied on manifold-valued images that are frequently …

  Cited by 4 Related articles All 13 versions 




Gait recognition based on Wasserstein generating adversarial image inpainting network

L Xia, H Wang, W Guo - Journal of Central South University, 2019 - Springer

Aiming at the problem of small area human occlusion in gait recognition, a method based on generating adversarial image inpainting network was proposed which can generate a context consistent image for gait occlusion area. In order to reduce the effect of noise on …

  Cited by 2 Related articles


2019

[PDF] arxiv.org

On the estimation of the Wasserstein distance in generative models

T Pinetz, D Soukup, T Pock - German Conference on Pattern Recognition, 2019 - Springer

Abstract Generative Adversarial Networks (GANs) have been used to model the underlying probability distribution of sample based datasets. GANs are notoriuos for training difficulties and their dependence on arbitrary hyperparameters. One recent improvement in GAN …

  Related articles All 5 versions


Image Reflection Removal Using the Wasserstein Generative Adversarial Network

T Li, DPK Lun - … 2019-2019 IEEE International Conference on …, 2019 - ieeexplore.ieee.org

Imaging through a semi-transparent material such as glass often suffers from the reflection problem, which degrades the image quality. Reflection removal is a challenging task since it is severely ill-posed. Traditional methods, while all require long computation time on …

  Cited by 1 Related articles All 2 versions


Random Matrix-Improved Estimation of the Wasserstein Distance between two Centered Gaussian Distributions

M Tiomoko, R Couillet - 2019 27th European Signal Processing …, 2019 - ieeexplore.ieee.org

This article proposes a method to consistently estimate functionals $\frac{1}{p}\sum_{i=1}^{p} f(\lambda_i(C_1 C_2))$ of the eigenvalues of the product of two covariance matrices $C_1, C_2 \in \mathbb{R}^{p\times p}$, based on the empirical estimates $\lambda_i(\hat{C}_1 \hat{C}_2)$ (with $\hat{C}_a = \frac{1}{n_a}\sum_{i=1}^{n_a} x_i^{(a)} x_i^{(a)\top}$), when the size $p$ and …

  Cited by 1 Related articles All 7 versions


Data augmentation method of sar image dataset based on wasserstein generative adversarial networks

Q Lu, H Jiang, G Li, W Ye - 2019 International conference on …, 2019 - ieeexplore.ieee.org

The published Synthetic Aperture Radar (SAR) samples are not abundant enough, which is not conducive to the application of deep learning methods in the field of SAR automatic target recognition. Generative Adversarial Nets (GANs) is one of the most effective ways to …

  Cited by 1 Related articles All 2 versions


[PDF] arxiv.org

Stylized Text Generation Using Wasserstein Autoencoders with a Mixture of Gaussian Prior

A Ghabussi, L Mou, O Vechtomova - arXiv preprint arXiv:1911.03828, 2019 - arxiv.org

Wasserstein autoencoders are effective for text generation. They do not however provide any control over the style and topic of the generated sentences if the dataset has multiple classes and includes different topics. In this work, we present a semi-supervised approach …

  Related articles All 2 versions 

<—2019———— 2019  ———-   1090—  



[HTML] deepai.org

[HTML] Manifold-valued image generation with wasserstein adversarial networks

EW GANs - 2019 - deepai.org

Unsupervised image generation has recently received an increasing amount of attention thanks to the great success of generative adversarial networks (GANs), particularly Wasserstein GANs. Inspired by the paradigm of real-valued image generation, this paper makes the first attempt …

  Cited by 2 Related articles 


[PDF] koreascience.or.kr

Experimental Analysis of Equilibrization in Binary Classification for Non-Image Imbalanced Data Using Wasserstein GAN

ZY Wang, DK Kang - International Journal of Internet …, 2019 - koreascience.or.kr

In this paper, we explore the details of three classic data augmentation methods and two generative model based oversampling methods. The three classic data augmentation methods are random sampling (RANDOM), Synthetic Minority Over-sampling Technique …

  Cited by 2 Related articles All 3 versions 


[PDF] csroc.org.tw

[PDF] Cross-domain Text Sentiment Classification Based on Wasserstein Distance

G Cai, Q Lin, N Chen - Journal of Computers, 2019 - csroc.org.tw

Text sentiment analysis is mainly to detect the sentiment polarity implicit in text data. Most existing supervised learning algorithms are difficult to solve the domain adaptation problem in text sentiment analysis. The key of cross-domain text sentiment analysis is how to extract …

  Related articles All 2 versions 


[PDF] amazonaws.com

[PDF] Supplement to “Optimal Estimation of Wasserstein Distance on A Tree with An Application to Microbiome Studies”

S Wang, TT Cai, H Li - pstorage-tf-iopjsd8797887.s3 …

Page 1. Supplement to “Optimal Estimation of Wasserstein Distance on A Tree with An Application to Microbiome Studies” Shulei Wang, T. Tony Cai and Hongzhe Li University of Pennsylvania In this supplementary material, we provide the proof for the main results (Section S1) and all the …

  Related articles All 3 versions 



Artifact correction in low‐dose dental CT imaging using Wasserstein generative adversarial networks

Z Hu, C Jiang, F Sun, Q Zhang, Y Ge, Y Yang… - Medical …, 2019 - Wiley Online Library

Purpose In recent years, health risks concerning high‐dose x‐ray radiation have become a major concern in dental computed tomography (CT) examinations. Therefore, adopting low‐dose computed tomography (LDCT) technology has become a major focus in the CT …

  Cited by 27 Related articles All 5 versions




[PDF] arxiv.org

A bound on the Wasserstein-2 distance between linear combinations of independent random variables

B Arras, E Azmoodeh, G Poly, Y Swan - Stochastic processes and their …, 2019 - Elsevier

We provide a bound on a distance between finitely supported elements and general elements of the unit sphere of $\ell^2(\mathbb{N})$. We use this bound to estimate the Wasserstein-2 distance between random variables represented by linear combinations of independent …

  Cited by 20 Related articles All 15 versions


[PDF] arxiv.org

Geometric convergence bounds for Markov chains in Wasserstein distance based on generalized drift and contraction conditions

Q Qin, JP Hobert - arXiv preprint arXiv:1902.02964, 2019 - arxiv.org

Let $\{X_n\} _ {n= 0}^\infty $ denote an ergodic Markov chain on a general state space that has stationary distribution $\pi $. This article concerns upper bounds on the $ L_1 $-Wasserstein distance between the distribution of $ X_n $ and $\pi $. In particular, an explicit …

  Cited by 9 Related articles All 2 versions 


[PDF] arxiv.org

Subexponential upper and lower bounds in Wasserstein distance for Markov processes

A Arapostathis, G Pang, N Sandrić - arXiv preprint arXiv:1907.05250, 2019 - arxiv.org

In this article, relying on Foster-Lyapunov drift conditions, we establish subexponential upper and lower bounds on the rate of convergence in the $\mathrm {L}^ p $-Wasserstein distance for a class of irreducible and aperiodic Markov processes. We further discuss these …

  Cited by 2 Related articles All 3 versions 


[PDF] projecteuclid.org

Wasserstein-2 bounds in normal approximation under local dependence

X Fang - Electronic Journal of Probability, 2019 - projecteuclid.org

We obtain a general bound for the Wasserstein-2 distance in normal approximation for sums of locally dependent random variables. The proof is based on an asymptotic expansion for expectations of second-order differentiable functions of the sum. We apply the main result to …

  Cited by 3 Related articles All 3 versions


[PDF] arxiv.org

Bounds for the Wasserstein mean with applications to the Lie-Trotter mean

J Hwang, S Kim - Journal of Mathematical Analysis and Applications, 2019 - Elsevier

Since barycenters in the Wasserstein space of probability distributions have been introduced, the Wasserstein metric and the Wasserstein mean of positive definite Hermitian matrices have been recently developed. In this paper, we explore some properties of …

  Cited by 3 Related articles All 5 versions

<—2019———— 2019  ———-   1100—     


[PDF] aaai.org

Wasserstein soft label propagation on hypergraphs: Algorithm and generalization error bounds

T Gao, S Asoodeh, Y Huang, J Evans - Proceedings of the AAAI …, 2019 - ojs.aaai.org

Inspired by recent interest in developing machine learning and data mining algorithms on hypergraphs, we investigate in this paper the semi-supervised learning algorithm of propagating "soft labels" (e.g. probability distributions, class membership scores) over …

  Cited by 3 Related articles All 13 versions 


[CITATION] Improved concentration bounds for conditional value-at-risk and cumulative prospect theory using wasserstein distance

SP Bhat, LA Prashanth - arXiv preprint arXiv:1902.10709, 2019

  Cited by 1 Related articles


2019

Using Wasserstein generative adversarial networks for the design of Monte Carlo simulations

by Athey, Susan

Working paper series, 2019

Book, Citation Online

 

2019 online

Courbes et applications optimales à valeurs dans l'espace de Wasserstein [French: Curves and optimal maps with values in the Wasserstein space]

by Lavenant, Hugo

eBook, Full Text Online

Cited by 2 Related articles All 15 versions 

On differentiability in the Wasserstein space and well-posedness for Hamilton–Jacobi equations

W Gangbo, A Tudorascu - Journal de Mathématiques Pures et Appliquées, 2019 - Elsevier

In this paper we elucidate the connection between various notions of differentiability in the

Wasserstein space: some have been introduced intrinsically (in the Wasserstein space, by

using typical objects from the theory of Optimal Transport) and used by various authors to …

  Cited by 32 Related articles All 4 versions

2019

[PDF] arxiv.org

Parisi's formula is a Hamilton-Jacobi equation in Wasserstein space

JC Mourrat - arXiv preprint arXiv:1906.08471, 2019 - arxiv.org

Parisi's formula is a self-contained description of the infinite-volume limit of the free energy of

mean-field spin glass models. We show that this quantity can be recast as the solution of a

Hamilton-Jacobi equation in the Wasserstein space of probability measures on the positive …

  Cited by 6 Related articles All 3 versions 


[PDF] arxiv.org

Interior-point methods strike back: Solving the wasserstein barycenter problem

D Ge, H Wang, Z Xiong, Y Ye - arXiv preprint arXiv:1905.12895, 2019 - arxiv.org

Computing the Wasserstein barycenter of a set of probability measures under the optimal

transport metric can quickly become prohibitive for traditional second-order algorithms, such

as interior-point methods, as the support size of the measures increases. In this paper, we …

  Cited by 9 Related articles All 3 versions 
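In one dimension the Wasserstein-2 barycenter admits a closed form (its quantile function is the weighted average of the input quantile functions), which is handy for sanity-checking barycenter solvers such as the interior-point method above. The sketch below uses that 1-D shortcut only; it is not the paper's algorithm, and the helper name w2_barycenter_1d is an illustrative placeholder.

import numpy as np

def w2_barycenter_1d(samples, weights=None, grid_size=200):
    # 1-D W2 barycenter of empirical distributions: average their quantile functions.
    if weights is None:
        weights = np.full(len(samples), 1.0 / len(samples))
    qs = np.linspace(0.0, 1.0, grid_size)
    quantiles = np.stack([np.quantile(s, qs) for s in samples])  # (k, grid_size)
    return qs, np.average(quantiles, axis=0, weights=weights)

rng = np.random.default_rng(0)
samples = [rng.normal(-2.0, 1.0, size=1000), rng.normal(3.0, 0.5, size=1000)]
qs, bary_q = w2_barycenter_1d(samples)
print(bary_q[len(bary_q) // 2])  # barycenter median, roughly (-2 + 3) / 2 = 0.5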


[PDF] arxiv.org

Randomized filtering and Bellman equation in Wasserstein space for partial observation control problem

E Bandini, A Cosso, M Fuhrman, H Pham - Stochastic Processes and their …, 2019 - Elsevier

We study a stochastic optimal control problem for a partially observed diffusion. By using the

control randomization method in Bandini et al.(2018), we prove a corresponding

randomized dynamic programming principle (DPP) for the value function, which is obtained …

  Cited by 16 Related articles All 13 versions


[PDF] arxiv.org

Wasserstein stability estimates for covariance-preconditioned Fokker-Planck equations

JA Carrillo, U Vaes - arXiv preprint arXiv:1910.07555, 2019 - arxiv.org

We study the convergence to equilibrium of the mean field PDE associated with the

derivative-free methodologies for solving inverse problems. We show stability estimates in

the euclidean Wasserstein distance for the mean field PDE by using optimal transport …

  Cited by 7 Related articles All 4 versions 


[PDF] arxiv.org

Personalized purchase prediction of market baskets with Wasserstein-based sequence matching

M Kraus, S Feuerriegel - Proceedings of the 25th ACM SIGKDD …, 2019 - dl.acm.org

Personalization in marketing aims at improving the shopping experience of customers by

tailoring services to individuals. In order to achieve this, businesses must be able to make

personalized predictions regarding the next purchase. That is, one must forecast the exact …

  Cited by 4 Related articles All 4 versions

<—2019———— 2019  ———-   1110— 


[PDF] arxiv.org

Wasserstein Proximal Algorithms for the Schrödinger Bridge Problem: Density Control with Nonlinear Drift

KF Caluya, A Halder - arXiv preprint arXiv:1912.01244, 2019 - arxiv.org

We study the Schrödinger bridge problem (SBP) with nonlinear prior dynamics. In control-

theoretic language, this is a problem of minimum effort steering of a given joint state

probability density function (PDF) to another over a finite time horizon, subject to a controlled …

  Cited by 4 Related articles All 4 versions 


[PDF] inria.fr

On a Wasserstein-type distance between solutions to stochastic differential equations

J Bion–Nadal, D Talay - The Annals of Applied Probability, 2019 - projecteuclid.org

In this paper, we introduce a Wasserstein-type distance on the set of the probability

distributions of strong solutions to stochastic differential equations. This new distance is

defined by restricting the set of possible coupling measures. We prove that it may also be …

Cited by 15 Related articles All 9 versions

[PDF] arxiv.org

Convergence rate in Wasserstein distance and semiclassical limit for the defocusing logarithmic Schrödinger equation

G Ferriere - arXiv preprint arXiv:1903.04309, 2019 - arxiv.org

We consider the dispersive logarithmic Schrödinger equation in a semi-classical scaling.

We extend the results about the large time behaviour of the solution (dispersion faster than

usual with an additional logarithmic factor, convergence of the rescaled modulus of the …

  Cited by 6 Related articles All 4 versions 




[PDF] arxiv.org

Wasserstein stability of porous medium-type equations on manifolds with Ricci curvature bounded below

N De Ponti, M Muratori, C Orrieri - arXiv preprint arXiv:1908.03147, 2019 - arxiv.org

Given a complete, connected Riemannian manifold $\mathbb {M}^ n $ with Ricci curvature

bounded from below, we discuss the stability of the solutions of a porous medium-type

equation with respect to the 2-Wasserstein distance. We produce (sharp) stability estimates …

  Cited by 1 Related articles All 3 versions 


2019

[PDF] arxiv.org

Wasserstein norm for signed measures, with application to nonlocal transport equation with source term

B Piccoli, F Rossi, M Tournus - arXiv preprint arXiv:1910.05105, 2019 - arxiv.org

We introduce the optimal transportation interpretation of the Kantorovich norm on the space

of signed Radon measures with finite mass, based on a generalized Wasserstein

distance for measures with different masses. With the formulation and the new topological …

  Cited by 3 Related articles All 7 versions 


Data-driven distributionally robust shortest path problem using the Wasserstein ambiguity set

Z Wang, K You, S Song, C Shang - 2019 IEEE 15th …, 2019 - ieeexplore.ieee.org

This paper proposes a data-driven distributionally robust shortest path (DRSP) model where

the distribution of the travel time is only observable through a finite training dataset. Our

DRSP model adopts the Wasserstein metric to construct the ambiguity set of probability …

  Cited by 1 Related articles


[PDF] arxiv.org

A nonlocal free boundary problem with Wasserstein distance

A Karakhanyan - arXiv preprint arXiv:1904.06270, 2019 - arxiv.org

We study the probability measures $\rho\in\mathcal{M}(\mathbb{R}^2)$ minimizing the

functional $J[\rho]=\iint \log\frac{1}{|x-y|}\, d\rho(x)\, d\rho(y) + d^2(\rho,\rho_0)$, where $\rho_0$

is a given probability measure and $d(\rho,\rho_0)$ is the 2-Wasserstein distance of …

  Related articles All 2 versions 


Use of the Wasserstein Metric to Solve the Inverse Dynamic Seismic Problem

AA Vasilenko - Geomodel 2019, 2019 - earthdoc.org

The inverse dynamic seismic problem consists in recovering the velocity model of elastic

medium based on the observed seismic data. In this work full waveform inversion method is

used to solve this problem. It consists in minimizing an objective functional measuring the …

  Related articles

Data-driven Wasserstein distributionally robust optimization for biomass with agricultural waste-to-energy network design under uncertainty

C Ning, F You - Applied Energy, 2019 - Elsevier

This paper addresses the problem of biomass with agricultural waste-to-energy network design under uncertainty. We propose a novel data-driven Wasserstein distributionally robust optimization model for hedging against uncertainty in the optimal network design …

  Cited by 12 Related articles All 8 versions

<—2019———— 2019  ———-   1120—   


[PDF] arxiv.org

Fast convergence of empirical barycenters in Alexandrov spaces and the Wasserstein space

TL Gouic, Q Paris, P Rigollet, AJ Stromme - arXiv preprint arXiv …, 2019 - arxiv.org

This work establishes fast rates of convergence for empirical barycenters over a large class of geodesic spaces with curvature bounds in the sense of Alexandrov. More specifically, we show that parametric rates of convergence are achievable under natural conditions that …

  Cited by 9 Related articles All 2 versions 


[PDF] arxiv.org

Fast Tree Variants of Gromov-Wasserstein

T Le, N Ho, M Yamada - arXiv preprint arXiv:1910.04462, 2019 - arxiv.org

Gromov-Wasserstein (GW) is a powerful tool to compare probability measures whose supports are in different metric spaces. GW suffers however from a computational drawback since it requires to solve a complex non-convex quadratic program. We consider in this work …

  Cited by 2 Related articles 


[PDF] esaim-cocv.org

A Pontryagin Maximum Principle in Wasserstein spaces for constrained optimal control problems

B Bonnet - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

In this paper, we prove a Pontryagin Maximum Principle for constrained optimal control problems in the Wasserstein space of probability measures. The dynamics is described by a transport equation with non-local velocities which are affine in the control, and is subject to …

  Cited by 8 Related articles All 45 versions


[PDF] arxiv.org

Wasserstein Inequality and Minimal Green Energy on Compact Manifolds

S Steinerberger - arXiv preprint arXiv:1907.09023, 2019 - arxiv.org

Let $M$ be a smooth, compact $d$-dimensional manifold, $d\geq 3$, without boundary, and let $G: M\times M\rightarrow\mathbb{R}\cup\{\infty\}$ denote the Green's function of the Laplacian $-\Delta$ (normalized to have mean value 0). We prove a bound …

  Cited by 2 Related articles All 2 versions 


[PDF] mdpi.com

Data-driven distributionally robust stochastic control of energy storage for wind power ramp management using the Wasserstein metric

I Yang - Energies, 2019 - mdpi.com

The integration of wind energy into the power grid is challenging because of its variability, which causes high ramp events that may threaten the reliability and efficiency of power systems. In this paper, we propose a novel distributionally robust solution to wind power …

  Cited by 2 Related articles All 6 versions 


2019


[PDF] projecteuclid.org

Hybrid Wasserstein distance and fast distribution clustering

I Verdinelli, L Wasserman - Electronic Journal of Statistics, 2019 - projecteuclid.org

We define a modified Wasserstein distance for distribution clustering which inherits many of the properties of the Wasserstein distance but which can be estimated easily and computed quickly. The modified distance is the sum of two terms. The first term—which has a closed …

  Cited by 1 Related articles All 5 versions


[PDF] arxiv.org

Local Bures-Wasserstein Transport: A Practical and Fast Mapping Approximation

A Hoyos-Idrobo - arXiv preprint arXiv:1906.08227, 2019 - arxiv.org

Optimal transport (OT)-based methods have a wide range of applications and have attracted a tremendous amount of attention in recent years. However, most of the computational approaches of OT do not learn the underlying transport map. Although some algorithms …

  Related articles All 2 versions 

[CITATION] Local Bures-Wasserstein Transport: A Practical and Fast Mapping Approximation.

AH Idrobo - CoRR, 2019


BIRCH Algorithm and Wasserstein Distance Metric Based Method for Generating Typical Scenarios of Wind Power Outputs

Q Li, X Tang, C Chen, X Liu, S Liu, X Shi… - … -Asia (ISGT Asia), 2019 - ieeexplore.ieee.org

With the ever-increasing penetration of renewable energy generation such as wind power and solar photovoltaics, the power system concerned is suffering more extensive and significant uncertainties. Scenario analysis has been utilized to solve this problem for power …

  Related articles


Approximation of Discounted Minimax Markov Control Problems and Zero-Sum Markov Games Using Hausdorff and Wasserstein Distances

F Dufour, T Prieto-Rumeau - Dynamic Games and Applications, 2019 - Springer

This paper is concerned with a minimax control problem (also known as a robust Markov decision process (MDP) or a game against nature) with general state and action spaces under the discounted cost optimality criterion. We are interested in approximating …

  Related articles All 6 versions


融合 Faster-RCNN  Wasserstein 自编码器的图像检索方法研究及应用

张逸扬 - 2019 - cdmd.cnki.com.cn

With the rapid development of social networks and user-generated content, the Internet has accumulated massive amounts of image data, marking the arrival of the "era of reading images". How to meet people's needs for accurate, real-time image retrieval has become a pressing practical problem. Traditional image retrieval methods, owing to limitations such as manual data labeling and keyword matching, are hard to apply to large-scale image retrieval …

[Chinese: Research and application of an image retrieval method fusing Faster-RCNN and a Wasserstein autoencoder]

<—2019———— 2019  ———-   1130— 


2019 see 2018 2020 2021

Title: A Fast Globally Linearly Convergent Algorithm for the Computation of Wasserstein Barycenters ... dual problem and establish its global convergence and global linear convergence rate. ... [v2] Thu, 2 May 2019 02:36:49 GMT (534kb,D)

[CITATION] A Fast Globally Linearly Convergent Algorithm for the Computation of Wasserstein Barycenters. eprint

L Yang, J Li, D Sun, KC Toh - arXiv preprint arXiv:1809.04249, 2019

  Cited by 4 Related articles



2019

Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula

X Fang, QM Shao, L Xu - Probability Theory and Related Fields, 2019 - Springer

… be regarded as an extension of the result in [8] to the multi-dimensional setting … some of the techniques for removing it in the special case of multivariate normal approximation … In particular, we provide an error bound for the Wasserstein distance between the sampling distribution …

  Cited by 20 Related articles All 7 versions



[PDF] arxiv.org

Sharp asymptotic and finite-sample rates of convergence of empirical measures in Wasserstein distance

J Weed, F Bach - Bernoulli, 2019 - projecteuclid.org

The Wasserstein distance between two probability measures on a metric space is a measure of closeness with applications in statistics, probability, and machine learning. In this work, we consider the fundamental question of how quickly the empirical measure …

  Cited by 163 Related articles All 6 versions
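A quick numerical companion to results like the one above: in one dimension the W1 distance between two independent empirical samples of the same law shrinks at roughly the n^{-1/2} rate, which a few lines of SciPy make visible. This is only a hedged illustration of the phenomenon the paper studies, not its sharp multi-dimensional rates.

import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
for n in [100, 1_000, 10_000, 100_000]:
    # W1 between two fresh empirical samples of a standard normal.
    d = wasserstein_distance(rng.normal(size=n), rng.normal(size=n))
    print(f"n = {n:>6d}   empirical W1 ~ {d:.4f}")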


[PDF] arxiv.org

Wasserstein covariance for multiple random densities

A Petersen, HG Müller - Biometrika, 2019 - academic.oup.com

A common feature of methods for analysing samples of probability density functions is that they respect the geometry inherent to the space of densities. Once a metric is specified for this space, the Fréchet mean is typically used to quantify and visualize the average density …

  Cited by 11 Related articles All 12 versions


[PDF] mlr.press

Accelerated linear convergence of stochastic momentum methods in wasserstein distances

B Can, M Gurbuzbalaban, L Zhu - … Conference on Machine …, 2019 - proceedings.mlr.press

Momentum methods such as Polyak's heavy ball (HB) method, Nesterov's accelerated gradient (AG) as well as accelerated projected gradient (APG) method have been commonly used in machine learning practice, but their performance is quite sensitive to noise in the …




  Cited by 16 Related articles All 8 versions 




[PDF] arxiv.org

Fast convergence of empirical barycenters in Alexandrov spaces and the Wasserstein space

TL Gouic, Q Paris, P Rigollet, AJ Stromme - arXiv preprint arXiv …, 2019 - arxiv.org

This work establishes fast rates of convergence for empirical barycenters over a large class of geodesic spaces with curvature bounds in the sense of Alexandrov. More specifically, we show that parametric rates of convergence are achievable under natural conditions that …

  Cited by 9 Related articles All 2 versions 


[PDF] arxiv.org

The optimal convergence rate of monotone schemes for conservation laws in the Wasserstein distance

AM Ruf, E Sande, S Solem - Journal of Scientific Computing, 2019 - Springer

Abstract In 1994, Nessyahu, Tadmor and Tassa studied convergence rates of monotone finite volume approximations of conservation laws. For compactly supported, Lip^+ Lip+-bounded initial data they showed a first-order convergence rate in the Wasserstein distance …

  Cited by 8 Related articles All 6 versions


[PDF] arxiv.org

Convergence rate in Wasserstein distance and semiclassical limit for the defocusing logarithmic Schrödinger equation

G Ferriere - arXiv preprint arXiv:1903.04309, 2019 - arxiv.org

We consider the dispersive logarithmic Schrödinger equation in a semi-classical scaling. We extend the results about the large time behaviour of the solution (dispersion faster than usual with an additional logarithmic factor, convergence of the rescaled modulus of the …

  Cited by 6 Related articles All 4 versions 


[PDF] springer.com

Convergence to equilibrium in Wasserstein distance for damped Euler equations with interaction forces

JA Carrillo, YP Choi, O Tse - Communications in Mathematical Physics, 2019 - Springer

We develop tools to construct Lyapunov functionals on the space of probability measures in order to investigate the convergence to global equilibrium of a damped Euler system under the influence of external and interaction potential forces with respect to the 2-Wasserstein  …

  Cited by 11 Related articles All 11 versions

<—2019———— 2019  ———-   1140—  



[PDF] arxiv.org

Geometric convergence bounds for Markov chains in Wasserstein distance based on generalized drift and contraction conditions

Q Qin, JP Hobert - arXiv preprint arXiv:1902.02964, 2019 - arxiv.org

Let $\{X_n\} _ {n= 0}^\infty $ denote an ergodic Markov chain on a general state space that has stationary distribution $\pi $. This article concerns upper bounds on the $ L_1 $-Wasserstein distance between the distribution of $ X_n $ and $\pi $. In particular, an explicit …

  Cited by 9 Related articles All 2 versions 


[PDF] thecvf.com

[PDF] Face Synthesis and Recognition Using Disentangled Representation-Learning Wasserstein GAN.

GSJ Hsu, CH Tang, MH Yap - CVPR Workshops, 2019 - openaccess.thecvf.com

Abstract We propose the Disentangled Representation-learning Wasserstein GAN (DR-WGAN) trained on augmented data for face recognition and face synthesis across pose. We improve the state-of-the-art DR-GAN with the Wasserstein loss considered in the …

  Cited by 1 Related articles All 4 versions View as HTML 

[PDF] thecvf.com

Face Synthesis and Recognition Using Disentangled Representation-Learning Wasserstein GAN

GS Jison Hsu, CH Tang… - … and Pattern Recognition …, 2019 - openaccess.thecvf.com

Abstract We propose the Disentangled Representation-learning Wasserstein GAN (DR-WGAN) trained on augmented data for face recognition and face synthesis across pose. We improve the state-of-the-art DR-GAN with the Wasserstein loss considered in the …

  Related articles All 2 versions 


[PDF] arxiv.org

Weak convergence of empirical Wasserstein type distances

P Berthet, JC Fort - arXiv preprint arXiv:1911.02389, 2019 - arxiv.org

We estimate contrasts $\int_0^ 1\rho (F^{-1}(u)-G^{-1}(u)) du $ between two continuous distributions $ F $ and $ G $ on $\mathbb R $ such that the set $\{F= G\} $ is a finite union of intervals, possibly empty or $\mathbb {R} $. The non-negative convex cost function $\rho $ is …

  Cited by 2 Related articles All 6 versions 
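Such contrasts admit a simple plug-in estimate when both samples have the same size n: replace the quantile functions by order statistics, giving (1/n) Σ_i ρ(X_(i) − Y_(i)). The sketch below only illustrates that plug-in idea with ρ chosen as the square (so it reduces to the squared empirical 2-Wasserstein distance); it is not code from the paper, and the function name is invented for the example.

import numpy as np

def empirical_contrast(x, y, rho=np.square):
    """Plug-in estimate of the contrast between two equal-size samples,
    using order statistics as empirical quantiles. With rho = np.square
    this is the squared empirical 2-Wasserstein distance."""
    x, y = np.sort(np.asarray(x, float)), np.sort(np.asarray(y, float))
    assert x.size == y.size
    return float(np.mean(rho(x - y)))

rng = np.random.default_rng(0)
# squared W_2 between N(0,1) and N(1,1) is 1; the estimate is close to 1 up to sampling noise
print(empirical_contrast(rng.normal(0, 1, 1000), rng.normal(1, 1, 1000)))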


[PDF] apsipa.org

Semi-supervised Multimodal Emotion Recognition with Improved Wasserstein GANs

J Liang, S Chen, Q Jin - 2019 Asia-Pacific Signal and …, 2019 - ieeexplore.ieee.org

Automatic emotion recognition has faced the challenge of lacking large-scale human labeled dataset for model learning due to the expensive data annotation cost and inevitable label ambiguity. To tackle such challenge, previous works have explored to transfer emotion …

Cited by 2 Related articles All 2 versions


Gait recognition based on Wasserstein generating adversarial image inpainting network

L Xia, H Wang, W Guo - Journal of Central South University, 2019 - Springer

Aiming at the problem of small area human occlusion in gait recognition, a method based on generating adversarial image inpainting network was proposed which can generate a context consistent image for gait occlusion area. In order to reduce the effect of noise on …

Cited by 3 Related articles

[PDF] projecteuclid.org

Convergence of the population dynamics algorithm in the Wasserstein metric

M Olvera-Cravioto - Electronic Journal of Probability, 2019 - projecteuclid.org

We study the convergence of the population dynamics algorithm, which produces sample pools of random variables having a distribution that closely approximates that of the special endogenous solution to a variety of branching stochastic fixed-point equations, including the …

  Cited by 3 Related articles All 6 versions


[PDF] arxiv.org

Random Matrix-Improved Estimation of the Wasserstein Distance between two Centered Gaussian Distributions

M Tiomoko, R Couillet - 2019 27th European Signal Processing …, 2019 - ieeexplore.ieee.org

This article proposes a method to consistently estimate functionals $\frac{1}{p}\sum_{i=1}^{p} f(\lambda_i(C_1 C_2))$ of the eigenvalues of the product of two covariance matrices $C_1, C_2 \in \mathbb{R}^{p \times p}$ based on the empirical estimates $\lambda_i(\hat{C}_1 \hat{C}_2)$ ($\hat{C}_a = \frac{1}{n_a}\sum_{i=1}^{n_a} x_i^{(a)} x_i^{(a)\top}$), when the size $p$ and …

  Cited by 1 Related articles All 7 versions

[PDF] researchgate.net

[PDF] Rate of convergence in Wasserstein distance of piecewise-linear Lévy-driven SDEs

A Arapostathis, G Pang… - arXiv preprint arXiv …, 2019 - researchgate.net

In this paper, we study the rate of convergence under the Wasserstein metric of a broad class of multidimensional piecewise Ornstein–Uhlenbeck processes with jumps. These are governed by stochastic differential equations having a piecewise linear drift, and a fairly …

  Related articles 

[PDF] arxiv.org

Wgansing: A multi-voice singing voice synthesizer based on the wasserstein-gan

P Chandna, M Blaauw, J Bonada… - 2019 27th European …, 2019 - ieeexplore.ieee.org

We present a deep neural network based singing voice synthesizer, inspired by the Deep Convolutions Generative Adversarial Networks (DCGAN) architecture and optimized using the Wasserstein-GAN algorithm. We use vocoder parameters for acoustic modelling, to …

  Cited by 23 Related articles All 4 versions


[PDF] arxiv.org

Hypothesis Test and Confidence Analysis with Wasserstein Distance with General Dimension

M Imaizumi, H Ota, T Hamaguchi - arXiv preprint arXiv:1910.07773, 2019 - arxiv.org

We develop a general framework for statistical inference with the Wasserstein distance. Recently, the Wasserstein distance has attracted much attention and been applied to various machine learning tasks due to its celebrated properties. Despite the importance …

  Cited by 1 Related articles All 2 versions 

<—2019———— 2019  ———-   1150—  



Inequalities for the Wasserstein mean of positive definite matrices

R Bhatia, T Jain, Y Lim - Linear Algebra and its Applications, 2019 - Elsevier

Let $A_1, \dots, A_m$ be given positive definite matrices and let $w = (w_1, \dots, w_m)$ be a vector of weights; i.e., $w_j \ge 0$ and $\sum_{j=1}^{m} w_j = 1$. Then the (weighted) Wasserstein mean, or the Wasserstein barycentre, of $A_1, \dots, A_m$ is defined as (2) $\Omega(w; A_1, \dots, A_m) = \operatorname{argmin}_{X \in \mathbb{P}} \sum_{j=1}^{m} w$ …

  Cited by 12 Related articles All 5 versions
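The displayed definition is cut off by the snippet above. Written out in a standard form (a restatement for readability, not a quotation from the paper, with $d$ denoting the Bures-Wasserstein distance between positive definite matrices), the weighted Wasserstein mean reads

\[
\Omega(w; A_1,\dots,A_m) \;=\; \operatorname*{arg\,min}_{X \in \mathbb{P}} \; \sum_{j=1}^{m} w_j \, d^2(X, A_j),
\qquad
d(A,B) \;=\; \Big[\operatorname{tr} A + \operatorname{tr} B - 2\operatorname{tr}\big(A^{1/2} B A^{1/2}\big)^{1/2}\Big]^{1/2}.
\]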


[PDF] arxiv.org

Wasserstein stability estimates for covariance-preconditioned Fokker-Planck equations

JA Carrillo, U Vaes - arXiv preprint arXiv:1910.07555, 2019 - arxiv.org

We study the convergence to equilibrium of the mean field PDE associated with the derivative-free methodologies for solving inverse problems. We show stability estimates in the euclidean Wasserstein distance for the mean field PDE by using optimal transport …

  Cited by 7 Related articles All 4 versions 


[PDF] arxiv.org

Bridging Bayesian and minimax mean square error estimation via Wasserstein distributionally robust optimization

VA Nguyen, S Shafieezadeh-Abadeh, D Kuhn… - arXiv preprint arXiv …, 2019 - arxiv.org

We introduce a distributionally robust minimum mean square error estimation model with a Wasserstein ambiguity set to recover an unknown signal from a noisy observation. The proposed model can be viewed as a zero-sum game between a statistician choosing an …

  Cited by 7 Related articles All 6 versions 


Hyperbolic Wasserstein distance for shape indexing

J Shi, Y Wang - IEEE transactions on pattern analysis and …, 2019 - ieeexplore.ieee.org

Shape space is an active research topic in computer vision and medical imaging fields. The distance defined in a shape space may provide a simple and refined index to represent a unique shape. This work studies the Wasserstein space and proposes a novel framework to …

  Cited by 3 Related articles All 7 versions


[PDF] arxiv.org

Bounds for the Wasserstein mean with applications to the Lie-Trotter mean

J Hwang, S Kim - Journal of Mathematical Analysis and Applications, 2019 - Elsevier

Since barycenters in the Wasserstein space of probability distributions have been introduced, the Wasserstein metric and the Wasserstein mean of positive definite Hermitian matrices have been recently developed. In this paper, we explore some properties of …

  Cited by 3 Related articles All 5 versions




[PDF] arxiv.org

Closed‐form Expressions for Maximum Mean Discrepancy with Applications to Wasserstein Auto‐Encoders

RM Rustamov - Stat, 2019 - Wiley Online Library

Abstract The Maximum Mean Discrepancy (MMD) has found numerous applications in statistics and machine learning, most recently as a penalty in the Wasserstein Auto‐Encoder (WAE). In this paper we compute closed‐form expressions for estimating the Gaussian …

  Cited by 5 Related articles All 3 versions


[PDF] arxiv.org

Wasserstein stability of porous medium-type equations on manifolds with Ricci curvature bounded below

N De Ponti, M Muratori, C Orrieri - arXiv preprint arXiv:1908.03147, 2019 - arxiv.org

Given a complete, connected Riemannian manifold $\mathbb {M}^ n $ with Ricci curvature bounded from below, we discuss the stability of the solutions of a porous medium-type equation with respect to the 2-Wasserstein distance. We produce (sharp) stability estimates …

  Cited by 1 Related articles All 3 versions 


[PDF] arxiv.org

A two-phase two-fluxes degenerate Cahn–Hilliard model as constrained Wasserstein gradient flow

C Cancès, D Matthes, F Nabet - Archive for Rational Mechanics and …, 2019 - Springer

We study a non-local version of the Cahn–Hilliard dynamics for phase separation in a two-component incompressible and immiscible mixture with linear mobilities. Differently to the celebrated local model with nonlinear mobility, it is only assumed that the divergences of the …

  Cited by 7 Related articles All 17 versions


[PDF] arxiv.org

Parameterized Wasserstein mean with its properties

S Kim - arXiv preprint arXiv:1904.09385, 2019 - arxiv.org

A new least squares mean of positive definite matrices for the divergence associated with the sandwiched quasi-relative entropy has been introduced. It generalizes the well-known Wasserstein mean for covariance matrices of Gaussian distributions with mean zero, so we …

  Related articles All 2 versions 


Adapted Wasserstein Distances and Stability in Mathematical ...

arxiv.org › q-fin

by J Backhoff-Veraguas · 2019 · Cited by 18 — Quantitative Finance > Mathematical Finance. arXiv:1901.07450 (q-fin). [Submitted on 22 Jan 2019 (v1), last revised 14 May 2020 (this version, v3)] ...

[CITATION] Adapted wasserstein distances and stability in mathematical finance. arXiv e-prints, page

J Backhoff-Veraguas, D Bartl, M Beiglböck, M Eder - arXiv preprint arXiv:1901.07450, 2019

  Cited by 4 Related articles

<—2019———— 2019  ———-   1160—  



On the Bures–Wasserstein distance between positive definite matrices

R Bhatia, T Jain, Y Lim - Expositiones Mathematicae, 2019 - Elsevier

The metric $d(A,B) = \big[\operatorname{tr} A + \operatorname{tr} B - 2\operatorname{tr}\big(A^{1/2} B A^{1/2}\big)^{1/2}\big]^{1/2}$ on the manifold of $n \times n$ positive definite matrices arises in various optimisation problems, in quantum information and in the theory of optimal transport. It is also related to Riemannian geometry. In the first …

  Cited by 94 Related articles All 6 versions
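Because the formula above is fully explicit, it is easy to evaluate numerically. A minimal sketch follows (assuming NumPy and SciPy are available; the function name is illustrative and not from the paper):

import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein(A, B):
    """Bures-Wasserstein distance between two SPD matrices A and B."""
    root_A = sqrtm(A)
    cross = sqrtm(root_A @ B @ root_A)           # (A^{1/2} B A^{1/2})^{1/2}
    d2 = np.trace(A) + np.trace(B) - 2.0 * np.trace(cross)
    return np.sqrt(max(d2.real, 0.0))            # clip tiny negative round-off

if __name__ == "__main__":
    A = np.array([[2.0, 0.3], [0.3, 1.0]])
    B = np.array([[1.0, 0.0], [0.0, 1.5]])
    print(bures_wasserstein(A, B))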


[PDF] mlr.press

On the complexity of approximating Wasserstein barycenters

A Kroshnin, N Tupitsa, D Dvinskikh… - … conference on …, 2019 - proceedings.mlr.press

We study the complexity of approximating the Wasserstein barycenter of $ m $ discrete measures, or histograms of size $ n $, by contrasting two alternative approaches that use entropic regularization. The first approach is based on the Iterative Bregman Projections …

  Cited by 41 Related articles All 11 versions 

On the Complexity of Approximating Wasserstein Barycenters

P Dvurechensky - dev.icml.cc

On the Complexity of Approximating Wasserstein Barycenters. Alexey Kroshnin, Darina Dvinskikh, Pavel Dvurechensky, Alexander Gasnikov, Nazarii Tupitsa, César A. Uribe. International Conference on Machine Learning 2019. Wasserstein barycenter …

  All 4 versions 



[HTML] oup.com

On parameter estimation with the Wasserstein distance

E Bernton, PE Jacob, M Gerber… - … and Inference: A …, 2019 - academic.oup.com

Statistical inference can be performed by minimizing, over the parameter space, the Wasserstein distance between model distributions and the empirical distribution of the data. We study asymptotic properties of such minimum Wasserstein distance estimators …

  Cited by 21 Related articles All 6 versions


[PDF] arxiv.org

On the computational complexity of finding a sparse Wasserstein barycenter

S Borgwardt, S Patterson - arXiv preprint arXiv:1910.07568, 2019 - arxiv.org

The discrete Wasserstein barycenter problem is a minimum-cost mass transport problem for a set of probability measures with finite support. In this paper, we show that finding a barycenter of sparse support is hard, even in dimension 2 and for only 3 measures. We …

  Cited by 11 Related articles All 2 versions 


[PDF] arxiv.org

On distributionally robust chance constrained programs with Wasserstein distance

W Xie - Mathematical Programming, 2019 - Springer

This paper studies a distributionally robust chance constrained program (DRCCP) with Wasserstein ambiguity set, where the uncertain constraints should be satisfied with a probability at least a given threshold for all the probability distributions of the uncertain …

  Cited by 43 Related articles All 9 versions


[PDF] ucla.edu

On differentiability in the Wasserstein space and well-posedness for Hamilton–Jacobi equations

W Gangbo, A Tudorascu - Journal de Mathématiques Pures et Appliquées, 2019 - Elsevier

In this paper we elucidate the connection between various notions of differentiability in the Wasserstein space: some have been introduced intrinsically (in the Wasserstein space, by using typical objects from the theory of Optimal Transport) and used by various authors to …

  Cited by 32 Related articles All 4 versions


[PDF] arxiv.org

Wgansing: A multi-voice singing voice synthesizer based on the wasserstein-gan

P Chandna, M Blaauw, J Bonada… - 2019 27th European …, 2019 - ieeexplore.ieee.org

We present a deep neural network based singing voice synthesizer, inspired by the Deep Convolutions Generative Adversarial Networks (DCGAN) architecture and optimized using the Wasserstein-GAN algorithm. We use vocoder parameters for acoustic modelling, to …

  Cited by 23 Related articles All 4 versions


[PDF] arxiv.org

Riemannian normalizing flow on variational wasserstein autoencoder for text modeling

PZ Wang, WY Wang - arXiv preprint arXiv:1904.02399, 2019 - arxiv.org

Recurrent Variational Autoencoder has been widely used for language modeling and text generation tasks. These models often face a difficult optimization problem, also known as the Kullback-Leibler (KL) term vanishing issue, where the posterior easily collapses to the …

  Cited by 14 Related articles All 5 versions 

Riemannian Normalizing Flow on Variational Wasserstein Autoencoder for Text Modeling

P Zizhuang Wang, WY Wang - arXiv e-prints, 2019 - ui.adsabs.harvard.edu

Abstract Recurrent Variational Autoencoder has been widely used for language modeling and text generation tasks. These models often face a difficult optimization problem, also known as the Kullback-Leibler (KL) term vanishing issue, where the posterior easily …



[PDF] mlr.press

Understanding mcmc dynamics as flows on the wasserstein space

C Liu, J Zhuo, J Zhu - International Conference on Machine …, 2019 - proceedings.mlr.press

It is known that the Langevin dynamics used in MCMC is the gradient flow of the KL divergence on the Wasserstein space, which helps convergence analysis and inspires recent particle-based variational inference methods (ParVIs). But no more MCMC dynamics …

  Cited by 3 Related articles All 11 versions 


[PDF] arxiv.org

A bound on the Wasserstein-2 distance between linear combinations of independent random variables

B Arras, E Azmoodeh, G Poly, Y Swan - Stochastic processes and their …, 2019 - Elsevier

We provide a bound on a distance between finitely supported elements and general elements of the unit sphere of 2 (N). We use this bound to estimate the Wasserstein-2 distance between random variables represented by linear combinations of independent …

  Cited by 20 Related articles All 15 versions

<—2019———— 2019  ———-   1170—  


[PDF] biorxiv.org

Reproducibility test of radiomics using network analysis and Wasserstein K-means algorithm

JH Oh, AP Apte, E Katsoulakis, N Riaz, V Hatzoglou… - bioRxiv, 2019 - biorxiv.org

Purpose To construct robust and validated radiomic predictive models, the development of a reliable method that can identify reproducible radiomic features robust to varying image acquisition methods and other scanner parameters should be preceded with rigorous …

  Related articles All 3 versions 


2019 [PDF] arxiv.org

Wasserstein weisfeiler-lehman graph kernels

M Togninalli, E Ghisu, F Llinares-López… - arXiv preprint arXiv …, 2019 - arxiv.org

Most graph kernels are an instance of the class of $\mathcal {R} $-Convolution kernels, which measure the similarity of objects by comparing their substructures. Despite their empirical success, most graph kernels use a naive aggregation of the final set of …

  Cited by 38 Related articles All 10 versions 


[PDF] arxiv.org

Unsupervised adversarial domain adaptation based on the Wasserstein distance for acoustic scene classification

K Drossos, P Magron, T Virtanen - 2019 IEEE Workshop on …, 2019 - ieeexplore.ieee.org

A challenging problem in deep learning-based machine listening field is the degradation of the performance when using data from unseen conditions. In this paper we focus on the acoustic scene classification (ASC) task and propose an adversarial deep learning method …

  Cited by 14 Related articles All 5 versions


[PDF] arxiv.org

A partial Laplacian as an infinitesimal generator on the Wasserstein space

YT Chow, W Gangbo - Journal of Differential Equations, 2019 - Elsevier

In this manuscript, we consider special linear operators which we term partial Laplacians on the Wasserstein space, and which we show to be partial traces of the Wasserstein Hessian. We verify a distinctive smoothing effect of the “heat flows” they generated for a particular …

  Cited by 11 Related articles All 9 versions


The Pontryagin maximum principle in the Wasserstein space

B Bonnet, F Rossi - Calculus of Variations and Partial Differential …, 2019 - Springer

Abstract We prove a Pontryagin Maximum Principle for optimal control problems in the space of probability measures, where the dynamics is given by a transport equation with non-local velocity. We formulate this first-order optimality condition using the formalism of …

  Cited by 25 Related articles All 20 versions

 2019

 

Calculating spatial configurational entropy of a landscape mosaic based on the Wasserstein metric

Y Zhao, X Zhang - Landscape Ecology, 2019 - Springer

Context Entropy is an important concept traditionally associated with thermodynamics and is widely used to describe the degree of disorder in a substance, system, or process. Configurational entropy has received more attention because it better reflects the …

  Cited by 9 Related articles All 6 versions

[PDF] arxiv.org

Semi-supervised multitask learning on multispectral satellite images using wasserstein generative adversarial networks (gans) for predicting poverty

A Perez, S Ganguli, S Ermon, G Azzari, M Burke… - arXiv preprint arXiv …, 2019 - arxiv.org

Obtaining reliable data describing local poverty metrics at a granularity that is informative to policy-makers requires expensive and logistically difficult surveys, particularly in the developing world. Not surprisingly, the poverty stricken regions are also the ones which …

  Cited by 21 Related articles All 4 versions 


2019

Nonlinear model reduction on metric spaces. Application to one-dimensional conservative PDEs in Wasserstein spaces

V Ehrlacher, D Lombardi, O Mula… - arXiv preprint arXiv …, 2019 - esaim-m2an.org

We consider the problem of model reduction of parametrized PDEs where the goal is to approximate any function belonging to the set of solutions at a reduced computational cost. For this, the bottom line of most strategies has so far been based on the approximation of the …

  Cited by 4 Related articles All 19 versions


[PDF] arxiv.org

Geometric convergence bounds for Markov chains in Wasserstein distance based on generalized drift and contraction conditions

Q Qin, JP Hobert - arXiv preprint arXiv:1902.02964, 2019 - arxiv.org

Let $\{X_n\} _ {n= 0}^\infty $ denote an ergodic Markov chain on a general state space that has stationary distribution $\pi $. This article concerns upper bounds on the $ L_1 $-Wasserstein distance between the distribution of $ X_n $ and $\pi $. In particular, an explicit …

  Cited by 9 Related articles All 2 versions 


[PDF] mdpi.com

Multi-turn chatbot based on query-context attentions and dual wasserstein generative adversarial networks

J Kim, S Oh, OW Kwon, H Kim - Applied Sciences, 2019 - mdpi.com

To generate proper responses to user queries, multi-turn chatbot models should selectively consider dialogue histories. However, previous chatbot models have simply concatenated or averaged vector representations of all previous utterances without considering contextual …

  Cited by 6 Related articles All 3 versions 

<—2019———— 2019  ———-   1180— 



[PDF] inria.fr

On Wasserstein-type distance between solutions to stochastic differential equations

J Bion–Nadal, D Talay - The Annals of Applied Probability, 2019 - projecteuclid.org

In this paper, we introduce a Wasserstein-type distance on the set of the probability distributions of strong solutions to stochastic differential equations. This new distance is defined by restricting the set of possible coupling measures. We prove that it may also be …

  Cited by 10 Related articles All 9 versions


[PDF] arxiv.org

Second-Order Models for Optimal Transport and Cubic Splines on the Wasserstein Space

JD Benamou, TO Gallouët, FX Vialard - Foundations of Computational …, 2019 - Springer

On the space of probability densities, we extend the Wasserstein geodesics to the case of higher-order interpolation such as cubic spline interpolation. After presenting the natural extension of cubic splines to the Wasserstein space, we propose a simpler approach based …

  Cited by 8 Related articles All 5 versions


[PDF] arxiv.org

On the minimax optimality of estimating the wasserstein metric

T Liang - arXiv preprint arXiv:1908.10324, 2019 - arxiv.org

We study the minimax optimal rate for estimating the Wasserstein-$1 $ metric between two unknown probability measures based on $ n $ iid empirical samples from them. We show that estimating the Wasserstein metric itself between probability measures, is not …

  Cited by 3 Related articles All 3 versions 


[PDF] arxiv.org

Deep Distributional Sequence Embeddings Based on Wasserstein Loss

A Abdelwahab, N Landwehr - arXiv preprint arXiv:1912.01933, 2019 - arxiv.org

Deep metric learning employs deep neural networks to embed instances into a metric space such that distances between instances of the same class are small and distances between instances from different classes are large. In most existing deep metric learning techniques …

  Cited by 1 Related articles All 2 versions 


[PDF] arxiv.org

Hausdorff and Wasserstein metrics on graphs and other structured data

E Patterson - arXiv preprint arXiv:1907.00257, 2019 - arxiv.org

Optimal transport is widely used in pure and applied mathematics to find probabilistic solutions to hard combinatorial matching problems. We extend the Wasserstein metric and other elements of optimal transport from the matching of sets to the matching of graphs and …

  Cited by 2 Related articles All 3 versions 




[PDF] arxiv.org

On the total variation Wasserstein gradient flow and the TV-JKO scheme

G Carlier, C Poon - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

We study the JKO scheme for the total variation, characterize the optimizers, prove some of their qualitative properties (in particular a form of maximum principle and in some cases, a minimum principle as well). Finally, we establish a convergence result as the time step goes …

  Cited by 7 Related articles All 7 versions


On potentials of regularized Wasserstein generative adversarial networks for realistic hallucination of tiny faces

WZ Shao, JJ Xu, L Chen, Q Ge, LQ Wang, BK Bao… - Neurocomputing, 2019 - Elsevier

Super-resolution of facial images, aka face hallucination, has been intensively studied in the past decades due to the increasingly emerging analysis demands in video surveillance, eg, face detection, verification, identification. However, the actual performance of most previous …

  Cited by 2 Related articles All 3 versions


[PDF] arxiv.org

From the backward Kolmogorov PDE on the Wasserstein space to propagation of chaos for McKean-Vlasov SDEs

N Frikha, PEC de Raynal - arXiv preprint arXiv:1907.01410, 2019 - arxiv.org

In this article, we provide some new quantitative estimates for propagation of chaos of non-linear stochastic differential equations (SDEs) in the sense of McKean-Vlasov. We obtain explicit error estimates, at the level of the trajectories, at the level of the semi-group and at …

  Cited by 5 Related articles All 7 versions 


Wasserstein Inequality and Minimal Green Energy on Compact Manifolds

S Steinerberger - arXiv preprint arXiv:1907.09023, 2019 - arxiv.org

Let $ M $ be a smooth, compact $ d-$ dimensional manifold, $ d\geq 3, $ without boundary and let $ G: M\times M\rightarrow\mathbb {R}\cup\left\{\infty\right\} $ denote the Green's function of the Laplacian $-\Delta $(normalized to have mean value 0). We prove a bound …

  Cited by 2 Related articles All 2 versions 


[PDF] phmsociety.org

[PDF] Anomaly detection on time series with wasserstein gan applied to phm

M Ducoffe, I Haloui, JS Gupta… - International Journal of …, 2019 - phmsociety.org

Modern vehicles are more and more connected. For instance, in the aerospace industry, newer aircraft are already equipped with data concentrators and enough wireless connectivity to transmit sensor data collected during the whole flight to the ground, usually …

  Cited by 2 Related articles All 2 versions 

Misfit function for full waveform inversion based on the Wasserstein metric with dynamic formulation

P Yong, W Liao, J Huang, Z Li, Y Lin - Journal of Computational Physics, 2019 - Elsevier

Conventional full waveform inversion (FWI) using least square distance ($L_2$ norm) between the observed and predicted seismograms suffers from local minima. Recently, the Wasserstein metric ($W_1$ metric) has been introduced to FWI to compute the misfit between …

  Cited by 1 Related articles All 2 versions

<—2019———— 2019  ———-   1190—    



[PDF] arxiv.org

Zero-Sum Differential Games on the Wasserstein Space

J Moon, T Basar - arXiv preprint arXiv:1912.06084, 2019 - arxiv.org

We consider two-player zero-sum differential games (ZSDGs), where the state process (dynamical system) depends on the random initial condition and the state process's distribution, and the objective functional includes the state process's distribution and the …

  Cited by 1 Related articles All 2 versions 


[PDF] archives-ouvertes.fr

Diffusive processes on the Wasserstein space: Coalescing models, Regularization properties and McKean-Vlasov equations

V Marx - 2019 - tel.archives-ouvertes.fr

The aim of this thesis is to study a class of diffusive stochastic processes with values in the space of probability measures on the real line, called Wasserstein space if it is endowed with the Wasserstein metric W2. The following issues are mainly addressed in this work: how …

  Cited by 2 Related articles All 9 versions 


[PDF] arxiv.org

Wasserstein Adversarial Regularization (WAR) on label noise

BB Damodaran, K Fatras, S Lobry, R Flamary… - arXiv preprint arXiv …, 2019 - arxiv.org

Noisy labels often occur in vision datasets, especially when they are obtained from crowdsourcing or Web scraping. We propose a new regularization method, which enables learning robust classifiers in presence of noisy data. To achieve this goal, we propose a new …

  Cited by 1 Related articles All 2 versions 

Wasserstein Adversarial Regularization (WAR) on label noise

B Bhushan Damodaran, K Fatras, S Lobry… - arXiv e …, 2019 - ui.adsabs.harvard.edu

Noisy labels often occur in vision datasets, especially when they are obtained from crowdsourcing or Web scraping. We propose a new regularization method, which enables learning robust classifiers in presence of noisy data. To achieve this goal, we propose a new …


[PDF] aaai.org

Wasserstein soft label propagation on hypergraphs: Algorithm and generalization error bounds

T Gao, S Asoodeh, Y Huang, J Evans - … of the AAAI Conference on …, 2019 - ojs.aaai.org

Inspired by recent interests of developing machine learning and data mining algorithms on hypergraphs, we investigate in this paper the semi-supervised learning algorithm of propagating” soft labels”(eg probability distributions, class membership scores) over …

  Cited by 3 Related articles All 13 versions 


[PDF] arxiv.org

Busemann functions on the Wasserstein space

G Zhu, WL Li, X Cui - arXiv preprint arXiv:1905.05544, 2019 - arxiv.org

We study rays and co-rays in the Wasserstein space $ P_p (\mathcal {X}) $($ p> 1$) whose ambient space $\mathcal {X} $ is a complete, separable, non-compact, locally compact length space. We show that rays in the Wasserstein space can be represented as probability …

  Related articles All 2 versions 


  2019


[PDF] arxiv.org

On the estimation of the Wasserstein distance in generative models

T Pinetz, D Soukup, T Pock - German Conference on Pattern Recognition, 2019 - Springer

Abstract Generative Adversarial Networks (GANs) have been used to model the underlying probability distribution of sample based datasets. GANs are notoriuos for training difficulties and their dependence on arbitrary hyperparameters. One recent improvement in GAN …

  Related articles All 5 versions


[PDF] arxiv.org

Wasserstein stability of porous medium-type equations on manifolds with Ricci curvature bounded below

N De Ponti, M Muratori, C Orrieri - arXiv preprint arXiv:1908.03147, 2019 - arxiv.org

Given a complete, connected Riemannian manifold $\mathbb {M}^ n $ with Ricci curvature bounded from below, we discuss the stability of the solutions of a porous medium-type equation with respect to the 2-Wasserstein distance. We produce (sharp) stability estimates …

  Cited by 1 Related articles All 3 versions 


 [PDF] arxiv.org

Optimal Fusion of Elliptic Extended Target Estimates Based on the Wasserstein Distance

K Thormann, M Baum - 2019 22nd International Conference on …, 2019 - ieeexplore.ieee.org

This paper considers the fusion of multiple estimates of a spatially extended object, where the object extent is modeled as an ellipse parameterized by the orientation and semi-axes lengths. For this purpose, we propose a novel systematic approach that employs a distance …

  Cited by 1 Related articles All 5 versions


[PDF] arxiv.org

On isometric embeddings of Wasserstein spaces–the discrete case

GP Gehér, T Titkos, D Virosztek - Journal of Mathematical Analysis and …, 2019 - Elsevier

The aim of this short paper is to offer a complete characterization of all (not necessarily surjective) isometric embeddings of the Wasserstein space W p (X), where X is a countable discrete metric space and 0< p<∞ is any parameter value. Roughly speaking, we will prove …

  Cited by 2 Related articles All 8 versions


[PDF] arxiv.org

On Efficient Multilevel Clustering via Wasserstein Distances

V Huynh, N Ho, N Dam, XL Nguyen… - arXiv preprint arXiv …, 2019 - arxiv.org

We propose a novel approach to the problem of multilevel clustering, which aims to simultaneously partition data in each group and discover grouping patterns among groups in a potentially large hierarchically structured corpus of data. Our method involves a joint …

  Related articles All 2 versions 

<—2019———— 2019  ———-   1200— 


Aero-engine faults diagnosis based on K-means improved wasserstein GAN and relevant vector machine

Z Zhao, R Zhou, Z Dong - 2019 Chinese Control Conference …, 2019 - ieeexplore.ieee.org

The aero-engine faults diagnosis is essential to the safety of the long-endurance aircraft. The problem of fault diagnosis for aero-engines is essentially a sort of model classification problem. Due to the difficulty of the engine faults modeling, a data-driven approach is used …

  Cited by 1 Related articles


Unsupervised segmentation of colonic polyps in narrow-band imaging data based on manifold representation of images and Wasserstein distance

IN Figueiredo, L Pinto, PN Figueiredo, R Tsai - … Signal Processing and …, 2019 - Elsevier

Colorectal cancer (CRC) is one of the most common cancers worldwide and after a certain age (≥ 50) regular colonoscopy examination for CRC screening is highly recommended. One of the most prominent precursors of CRC are abnormal growths known as polyps. If a …

  Related articles All 4 versions


[PDF] arxiv.org

1-Wasserstein Distance on the Standard Simplex

A Frohmader, H Volkmer - arXiv preprint arXiv:1912.04945, 2019 - arxiv.org

Wasserstein distances provide a metric on a space of probability measures. We consider the space $\Omega $ of all probability measures on the finite set $\chi=\{1,\dots, n\} $ where $ n $ is a positive integer. 1-Wasserstein distance, $ W_1 (\mu,\nu) $ is a function from …

  Cited by 1 Related articles All 2 versions 
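On the ordered set {1, …, n} with ground metric |i − j| (an assumption made here purely for illustration, which need not match the paper's choice of metric on χ), $W_1$ between two histograms reduces to the L1 distance between their cumulative sums, as in the following sketch:

import numpy as np

def w1_on_line(mu, nu):
    """W_1 between two probability vectors on {1, ..., n} w.r.t. the metric |i - j|."""
    mu, nu = np.asarray(mu, float), np.asarray(nu, float)
    assert mu.shape == nu.shape
    return np.abs(np.cumsum(mu - nu)).sum()

# moving 0.3 of mass across two bins costs 0.6
print(w1_on_line([0.2, 0.3, 0.5], [0.5, 0.3, 0.2]))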


[PDF] aaai.org

Wasserstein soft label propagation on hypergraphs: Algorithm and generalization error bounds

T Gao, S Asoodeh, Y Huang, J Evans - Proceedings of the AAAI …, 2019 - ojs.aaai.org

Inspired by recent interests of developing machine learning and data mining algorithms on hypergraphs, we investigate in this paper the semi-supervised learning algorithm of propagating "soft labels" (eg probability distributions, class membership scores) over …

  Cited by 3 Related articles All 8 versions 


An Integrated Processing Method Based on Wasserstein Barycenter Algorithm for Automatic Music Transcription

C Jin, Z Li, Y Sun, H Zhang, X Lv, J Li, S Liu - International Conference on …, 2019 - Springer

Given a piece of acoustic musical signal, various automatic music transcription (AMT) processing methods have been proposed to generate the corresponding music notations without human intervention. However, the existing AMT methods based on signal …

  Related articles

2019

Approximation and Wasserstein distance for self-similar measures on the unit interval

E Lichtenegger, R Niedzialomski - Journal of Mathematical Analysis and …, 2019 - Elsevier

We study the Wasserstein distance between self-similar measures associated to two non-overlapping linear contractions of the unit interval. The main theorem gives an explicit formula for the Wasserstein distance between iterations of certain discrete approximations of …

  Related articles All 2 versions


Unsupervised feature extraction based on improved Wasserstein generative adversarial network for hyperspectral classification

Q Sun, S Bourennane - Multimodal Sensing: Technologies …, 2019 - spiedigitallibrary.org

Accurate classification is one of the most important prerequisites for hyperspectral applications and feature extraction is the key step of classification. Recently, deep learning models have been successfully used to extract the spectral-spatial features in hyperspectral …

  Related articles All 4 versions


[PDF] lancs.ac.uk

Reproducing-Kernel Hilbert space regression with notes on the Wasserstein Distance

S Page - 2019 - eprints.lancs.ac.uk

We study kernel least-squares estimators for the regression problem subject to a norm constraint. We bound the squared L2 error of our estimators with respect to the covariate distribution. We also bound the worst-case squared L2 error of our estimators with respect to …

  Related articles All 5 versions 


[PDF] arxiv.org

A Fenchel-Moreau-Rockafellar type theorem on the Kantorovich-Wasserstein space with applications in partially observable Markov decision processes

V Laschos, K Obermayer, Y Shen, W Stannat - Journal of Mathematical …, 2019 - Elsevier

By using the fact that the space of all probability measures with finite support can be completed in two different fashions, one generating the Arens-Eells space and another generating the Kantorovich-Wasserstein (Wasserstein-1) space, and by exploiting the …

  Cited by 1 Related articles All 5 versions


[PDF] amazonaws.com

[PDF] Supplement to “Optimal Estimation of Wasserstein Distance on A Tree with An Application to Microbiome Studies”

S Wang, TT Cai, H Li - pstorage-tf-iopjsd8797887.s3 …

Page 1. Supplement to “Optimal Estimation of Wasserstein Distance on A Tree with An Application to Microbiome Studies” Shulei Wang, T. Tony Cai and Hongzhe Li University of Pennsylvania In this supplementary material, we provide the proof for the main results (Section S1) and all the …

  Related articles All 3 versions 

<—2019———— 2019  ———-   1210— 


Prioritized Experience Replay based on the Wasserstein Metric in Deep Reinforcement Learning: The regularizing effect of modelling return distributions

T Greevink - 2019 - repository.tudelft.nl

This thesis tests the hypothesis that distributional deep reinforcement learning (RL) algorithms get an increased performance over expectation based deep RL because of the regularizing effect of fitting a more complex model. This hypothesis was tested by comparing …

  

Projection in the 2-Wasserstein sense on structured measure space

L Lebrat - 2019 - tel.archives-ouvertes.fr

This thesis focuses on the approximation for the 2-Wasserstein metric of probability measures by structured measures. The set of structured measures under consideration is made of consistent discretizations of measures carried by a smooth curve with a bounded …

  All 2 versions 


On the Complexity of Approximating Wasserstein Barycenter

Authors:Kroshnin, Alexey (Creator), Dvinskikh, Darina (Creator), Dvurechensky, Pavel (Creator), Gasnikov, Alexander (Creator), Tupitsa, Nazarii (Creator), Uribe, Cesar (Creator)
Summary:We study the complexity of approximating the Wasserstein barycenter of $m$ discrete measures, or histograms of size $n$, by contrasting two alternative approaches, both using entropic regularization. The first approach is based on the Iterative Bregman Projections (IBP) algorithm for which our novel analysis gives a complexity bound proportional to $\frac{mn^2}{\varepsilon^2}$ to approximate the original non-regularized barycenter. Using an alternative accelerated-gradient-descent-based approach, we obtain a complexity proportional to $\frac{mn^{2.5}}{\varepsilon}$. As a byproduct, we show that the regularization parameter in both approaches has to be proportional to $\varepsilon$, which causes instability of both algorithms when the desired accuracy is high. To overcome this issue, we propose a novel proximal-IBP algorithm, which can be seen as a proximal gradient method, which uses IBP on each iteration to make a proximal step. We also consider the question of scalability of these algorithms using approaches from distributed optimization and show that the first algorithm can be implemented in a centralized distributed setting (master/slave), while the second one is amenable to a more general decentralized distributed setting with an arbitrary network topology.
Downloadable Archival Material, 2019-01-24
Undefined
Publisher:2019-01-24 
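For context, the IBP/Sinkhorn-style fixed-point iteration mentioned in the summary can be sketched in a few lines for measures supported on a common grid. This is a generic textbook-style illustration with arbitrarily chosen regularization and iteration count, not the algorithm (or the proximal-IBP variant) analyzed above:

import numpy as np

def ibp_barycenter(hists, C, weights, eps=0.05, n_iter=500):
    """Entropy-regularized Wasserstein barycenter of the columns of `hists`
    (each an n-vector summing to 1) with cost matrix C (n x n), via
    iterative Bregman projections (Sinkhorn-style scaling updates)."""
    K = np.exp(-C / eps)                     # Gibbs kernel
    u = np.ones_like(hists)
    for _ in range(n_iter):
        v = hists / (K.T @ u)                # match the input marginals
        Ku = K @ v
        # weighted geometric mean across measures gives the current barycenter
        bary = np.exp((weights * np.log(Ku)).sum(axis=1))
        u = bary[:, None] / Ku               # match the barycenter marginal
    return bary

if __name__ == "__main__":
    grid = np.linspace(0, 1, 50)
    C = (grid[:, None] - grid[None, :]) ** 2
    h1 = np.exp(-(grid - 0.25) ** 2 / 0.005); h1 /= h1.sum()
    h2 = np.exp(-(grid - 0.75) ** 2 / 0.005); h2 /= h2.sum()
    bary = ibp_barycenter(np.stack([h1, h2], axis=1), C, np.array([0.5, 0.5]))
    print(bary.argmax())                     # mass concentrates near the middle of the grid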

 

Comparison of object functions for the inversion of seismic ...

by L Stracca · 2019 — Title: Comparison of object functions for the inversion of seismic data and study on the potentialities of the Wasserstein Metric. Internal authors: STRACCA ...

[CITATION] Comparison of object functions for the inversion of seismic data and study on the potentialities of the Wasserstein Metric

L Stracca, E Stucchi, A Mazzotti - GNGTS, 2019 - arpi.unipi.it



 A two-phase two-fluxes degenerate Cahn–Hilliard model as constrained Wasserstein gradient flow

C Cancès, D Matthes, F Nabet - Archive for Rational Mechanics and …, 2019 - Springer

We study a non-local version of the Cahn–Hilliard dynamics for phase separation in a two-component incompressible and immiscible mixture with linear mobilities. Differently to the celebrated local model with nonlinear mobility, it is only assumed that the divergences of the …

  Cited by 7 Related articles All 17 versions


2019


[PDF] arxiv.org

Local well-posedness in the Wasserstein space for a chemotaxis model coupled to Navier-Stokes equations

K Kang, HK Kim - arXiv preprint arXiv:1907.01895, 2019 - arxiv.org

We consider a coupled system of Keller-Segel type equations and the incompressible Navier-Stokes equations in spatial dimension two and three. In the previous work [19], we established the existence of a weak solution of a Fokker-Plank equation in the Wasserstein  …

  Related articles All 2 versions 


[PDF] wiley.com

A degenerate Cahn‐Hilliard model as constrained Wasserstein gradient flow

D Matthes, C Cances, F Nabet - PAMM, 2019 - Wiley Online Library

Existence of solutions to a non‐local Cahn‐Hilliard model with degenerate mobility is considered. The PDE is written as a gradient flow with respect to the L2‐Wasserstein metric for two components that are coupled by an incompressibility constraint. Approximating …

  Related articles


2019

A degenerate Cahn-Hilliard model as constrained ...

onlinelibrary.wiley.com › doi › pdf › pamm.201900158

by D Matthes · 2019 — A degenerate Cahn-Hilliard model as constrained Wasserstein gradient flow. Daniel Matthes, Clement Cances, and Flore Nabet. Technische Universität ...

[CITATION] A degenerate Cahn-Hilliard model as constrained Wasserstein gradient flow

C CANCES

  Related articles


[PDF] arxiv.org

Parisi's formula is a Hamilton-Jacobi equation in Wasserstein space

JC Mourrat - arXiv preprint arXiv:1906.08471, 2019 - arxiv.org

Parisi's formula is a self-contained description of the infinite-volume limit of the free energy of mean-field spin glass models. We show that this quantity can be recast as the solution of a Hamilton-Jacobi equation in the Wasserstein space of probability measures on the positive …

  Cited by 6 Related articles All 3 versions 


[PDF] researchgate.net

[PDF] Computationally efficient tree variants of gromov-wasserstein

T Le, N Ho, M Yamada - arXiv preprint arXiv:1910.04462, 2019 - researchgate.net

We propose two novel variants of Gromov-Wasserstein (GW) between probability measures in different probability spaces based on projecting these measures into the tree metric spaces. Our first proposed discrepancy, named flow-based tree Gromov-Wasserstein …

  Cited by 1 Related articles All 5 versions 

<—2019———— 2019  ———-   1220— 



2019

Music Classification using Multiclass Support Vector Machine and Multilevel Wasserstein Means

J Wei, C Jin, Z Cheng, X Lv… - 2019 IEEE/ACIS 18th …, 2019 - ieeexplore.ieee.org

Music classification is a challenging task in music information retrieval. In this article, we compare the performance of the two types of models. The first category is classified by Support Vector Machine (SVM). We use the feature extraction from audio as the basis of …

  Related articles All 2 versions


Sliced wasserstein discrepancy for unsupervised domain adaptation

CY Lee, T Batra, MH Baig… - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

In this work, we connect two distinct concepts for unsupervised domain adaptation: feature distribution alignment between domains by utilizing the task-specific decision boundary and the Wasserstein metric. Our proposed sliced Wasserstein discrepancy (SWD) is designed to …

  Cited by 105 Related articles All 7 versions 
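As background for the sliced construction: samples are projected onto random directions, and the one-dimensional Wasserstein distances of the projections (computable by sorting) are averaged. The following is a generic Monte-Carlo sketch of the plain sliced 2-Wasserstein distance between point clouds, an illustration only and not the task-specific SWD loss of the paper:

import numpy as np

def sliced_wasserstein_2(X, Y, n_proj=200, seed=0):
    """Monte-Carlo estimate of the squared sliced 2-Wasserstein distance
    between two equally sized point clouds X, Y of shape (n, d)."""
    rng = np.random.default_rng(seed)
    _, d = X.shape
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)
        px, py = np.sort(X @ theta), np.sort(Y @ theta)   # 1D optimal coupling = sorting
        total += np.mean((px - py) ** 2)
    return total / n_proj

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(0.0, 1.0, size=(500, 2))
    Y = rng.normal(2.0, 1.0, size=(500, 2))
    print(sliced_wasserstein_2(X, Y))   # ≈ |Δ|²/d = 4 for this 2-D mean shift, plus sampling noise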


[PDF] arxiv.org

Statistical aspects of Wasserstein distances

VM Panaretos, Y Zemel - Annual review of statistics and its …, 2019 - annualreviews.org

Wasserstein distances are metrics on probability distributions inspired by the problem of optimal mass transportation. Roughly speaking, they measure the minimal effort required to reconfigure the probability mass of one distribution in order to recover the other distribution …

  Cited by 86 Related articles All 10 versions

[PDF] arxiv.org


2019

Primal dual methods for Wasserstein gradient flows

JA Carrillo, K Craig, L Wang, C Wei - arXiv preprint arXiv:1901.08081, 2019 - arxiv.org

Combining the classical theory of optimal transport with modern operator splitting techniques, we develop a new numerical method for nonlinear, nonlocal partial differential equations, arising in models of porous media, materials science, and biological swarming …

  Cited by 18 Related articles All 3 versions 


2019

[PDF] arxiv.org

Statistical inference for Bures-Wasserstein barycenters

A Kroshnin, V Spokoiny, A Suvorikova - arXiv preprint arXiv:1901.00226, 2019 - arxiv.org

In this work we introduce the concept of Bures-Wasserstein barycenter $ Q_* $, that is essentially a Fréchet mean of some distribution $\mathbb {P} $ supported on a subspace of positive semi-definite Hermitian operators $\mathbb {H} _ {+}(d) $. We allow a barycenter to …

  Cited by 15 Related articles All 3 versions 

CITATION] Statistical inference for Bures-Wasserstein

A Kroshnin, V Spokoiny, A Suvorikova - arXiv preprint arXiv:1901.00226, 2019

  Cited by 2 Related articles


2019

2019

[PDF] arxiv.org

Statistical data analysis in the Wasserstein space

J Bigot - arXiv preprint arXiv:1907.08417, 2019 - arxiv.org

This paper is concerned by statistical inference problems from a data set whose elements may be modeled as random probability measures such as multiple histograms or point clouds. We propose to review recent contributions in statistics on the use of Wasserstein  …

  Cited by 3 Related articles All 2 versions 


[PDF] arxiv.org

Closed‐form Expressions for Maximum Mean Discrepancy with Applications to Wasserstein Auto‐Encoders

RM Rustamov - Stat, 2019 - Wiley Online Library

Abstract The Maximum Mean Discrepancy (MMD) has found numerous applications in statistics and machine learning, most recently as a penalty in the Wasserstein Auto‐Encoder (WAE). In this paper we compute closed‐form expressions for estimating the Gaussian …

  Cited by 5 Related articles All 3 versions


[PDF] mdpi.com

Multi-turn chatbot based on query-context attentions and dual wasserstein generative adversarial networks

J Kim, S Oh, OW Kwon, H Kim - Applied Sciences, 2019 - mdpi.com

To generate proper responses to user queries, multi-turn chatbot models should selectively consider dialogue histories. However, previous chatbot models have simply concatenated or averaged vector representations of all previous utterances without considering contextual …

  Cited by 6 Related articles All 3 versions 


[PDF] arxiv.org

On the minimax optimality of estimating the wasserstein metric

T Liang - arXiv preprint arXiv:1908.10324, 2019 - arxiv.org

We study the minimax optimal rate for estimating the Wasserstein-$1 $ metric between two unknown probability measures based on $ n $ iid empirical samples from them. We show that estimating the Wasserstein metric itself between probability measures, is not …

  Cited by 3 Related articles All 3 versions 


[PDF] researchgate.net

[PDF] Wasserstein distance: a flexible tool for statistical analysis

GVVLV Lucarini - 2019 - researchgate.net

The figure shows the Wasserstein distance calculated in the phase space composed by globally averaged temperature and precipitation. To provide some sort of benchmark, at the bottom of the figure is shown the value related to the NCEP reanalysis, which yields one of …

  Related articles All 4 versions 

<—2019———— 2019  ———-   1230—  


Elements of Statistical Inference in 2-Wasserstein Space

J Ebert, V Spokoiny, A Suvorikova - Topics in Applied Analysis and …, 2019 - Springer

This work addresses an issue of statistical inference for the datasets lacking underlying linear structure, which makes impossible the direct application of standard inference techniques and requires a development of a new tool-box taking into account properties of …

  Related articles All 3 versions

[PDF] nsf.gov

An information-theoretic view of generalization via Wasserstein distance

H Wang, M Diaz, JCS Santos Filho… - … on Information Theory …, 2019 - ieeexplore.ieee.org

We capitalize on the Wasserstein distance to obtain two information-theoretic bounds on the generalization error of learning algorithms. First, we specialize the Wasserstein distance into total variation, by using the discrete metric. In this case we derive a generalization bound …

  Cited by 9 Related articles All 5 versions


[PDF] arxiv.org

The optimal convergence rate of monotone schemes for conservation laws in the Wasserstein distance

AM Ruf, E Sande, S Solem - Journal of Scientific Computing, 2019 - Springer

Abstract In 1994, Nessyahu, Tadmor and Tassa studied convergence rates of monotone finite volume approximations of conservation laws. For compactly supported, Lip^+ Lip+-bounded initial data they showed a first-order convergence rate in the Wasserstein distance …

  Cited by 8 Related articles All 6 versions


[PDF] iiasa.ac.at

Optimal XL-insurance under Wasserstein-type ambiguity

C Birghila, GC Pflug - Insurance: Mathematics and Economics, 2019 - Elsevier

We study the problem of optimal insurance contract design for risk management under a budget constraint. The contract holder takes into consideration that the loss distribution is not entirely known and therefore faces an ambiguity problem. For a given set of models, we …

  Cited by 2 Related articles All 7 versions


[PDF] esaim-cocv.org

A Pontryagin Maximum Principle in Wasserstein spaces for constrained optimal control problems

B Bonnet - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

In this paper, we prove a Pontryagin Maximum Principle for constrained optimal control problems in the Wasserstein space of probability measures. The dynamics is described by a transport equation with non-local velocities which are affine in the control, and is subject to …

  Cited by 8 Related articles All 45 versions




[PDF] arxiv.org

On the minimax optimality of estimating the wasserstein metric

T Liang - arXiv preprint arXiv:1908.10324, 2019 - arxiv.org

We study the minimax optimal rate for estimating the Wasserstein-$1 $ metric between two unknown probability measures based on $ n $ iid empirical samples from them. We show that estimating the Wasserstein metric itself between probability measures, is not …

  Cited by 3 Related articles All 3 versions 


[PDF] arxiv.org

Second-Order Models for Optimal Transport and Cubic Splines on the Wasserstein Space

JD Benamou, TO Gallouët, FX Vialard - Foundations of Computational …, 2019 - Springer

On the space of probability densities, we extend the Wasserstein geodesics to the case of higher-order interpolation such as cubic spline interpolation. After presenting the natural extension of cubic splines to the Wasserstein space, we propose a simpler approach based …

  Cited by 8 Related articles All 5 versions


[PDF] arxiv.org

A convergent Lagrangian discretization for -Wasserstein and flux-limited diffusion equations

B Söllner, O Junge - arXiv preprint arXiv:1906.01321, 2019 - arxiv.org

We study a Lagrangian numerical scheme for solution of a nonlinear drift diffusion equation of the form $\partial_t u=\partial_x (u\cdot c [\partial_x (h^\prime (u)+ v)]) $ on an interval. This scheme will consist of a spatio-temporal discretization founded in the formulation of the …

  Cited by 2 Related articles All 5 versions 

[CITATION] A convergent Lagrangian discretization for -Wasserstein and flux-limited diffusion equations

O Junge, B Söllner - arXiv preprint arXiv:1906.01321, 2019



[PDF] arxiv.org

Tropical Optimal Transport and Wasserstein Distances

W Lee, W Li, B Lin, A Monod - arXiv preprint arXiv:1911.05401, 2019 - arxiv.org

We study the problem of optimal transport in tropical geometry and define the Wasserstein-$ p $ distances for probability measures in the continuous metric measure space setting of the tropical projective torus. We specify the tropical metric---a combinatorial metric that has been …

  Cited by 1 Related articles All 3 versions 

[PDF] Tropical Optimal Transport and Wasserstein Distances in Phylogenetic Tree Space

W Lee, W Li, B Lin, A Monod - arXiv preprint arXiv:1911.05401, 2019 - math.ucla.edu

We study the problem of optimal transport on phylogenetic tree space from the perspective of tropical geometry, and thus define the Wasserstein-p distances for probability measures in this continuous metric measure space setting. With respect to the tropical metric—a …

  Related articles All 2 versions 



[PDF] arxiv.org

The boundary method for semi-discrete optimal transport partitions and Wasserstein distance computation

L Dieci, JD Walsh III - Journal of Computational and Applied Mathematics, 2019 - Elsevier

We introduce a new technique, which we call the boundary method, for solving semi-discrete optimal transport problems with a wide range of cost functions. The boundary method reduces the effective dimension of the problem, thus improving complexity. For cost …

  Cited by 6 Related articles All 5 versions

<——2019—–—2019—-   1240—    


[PDF] sciencedirect.com

Distributionally Robust Chance-Constraint Optimal Power Flow Considering Uncertain Renewables with Wasserstein-Moment Metric

J Liu, Y Chen, C Duan, J Lyu - Energy Procedia, 2019 - Elsevier

Chance-constraint optimal power flow has been proven as an efficient method to manage the risk of volatile renewable energy sources. To address the uncertainties of renewable energy sources, a novel distributionally robust chance-constraint OPF model is proposed in …

  Cited by 1 Related articles All 2 versions


[PDF] arxiv.org

Optimal Fusion of Elliptic Extended Target Estimates Based on the Wasserstein Distance

K Thormann, M Baum - 2019 22nd International Conference on …, 2019 - ieeexplore.ieee.org

This paper considers the fusion of multiple estimates of a spatially extended object, where the object extent is modeled as an ellipse parameterized by the orientation and semi-axes lengths. For this purpose, we propose a novel systematic approach that employs a distance …

  Cited by 1 Related articles All 5 versions


[PDF] arxiv.org

Group level MEG/EEG source imaging via optimal transport: minimum Wasserstein estimates

H Janati, T Bazeille, B Thirion, M Cuturi… - … Information Processing in …, 2019 - Springer

Magnetoencephalography (MEG) and electroencephalography (EEG) are non-invasive modalities that measure the weak electromagnetic fields generated by neural activity. Inferring the location of the current sources that generated these magnetic fields is an ill …

  Cited by 5 Related articles All 14 versions


[PDF] arxiv.org

Optimal Transport Relaxations with Application to Wasserstein GANs

S Mahdian, J Blanchet, P Glynn - arXiv preprint arXiv:1906.03317, 2019 - arxiv.org

We propose a family of relaxations of the optimal transport problem which regularize the problem by introducing an additional minimization step over a small region around one of the underlying transporting measures. The type of regularization that we obtain is related to …

  Related articles All 4 versions 


[PDF] archives-ouvertes.fr

Optimal Control in Wasserstein Spaces

B Bonnet - 2019 - hal.archives-ouvertes.fr

A wealth of mathematical tools allowing to model and analyse multi-agent systems has been brought forth as a consequence of recent developments in optimal transport theory. In this thesis, we extend for the first time several of these concepts to the framework of control …

  Related articles All 8 versions 

[CITATION] Optimal Control in Wasserstein Spaces.(Commande Optimal dans les Espaces de Wasserstein).

B Bonnet - 2019 - Aix-Marseille University, France


2019

[PDF] d-nb.info

[PDF] Algorithms for Optimal Transport and Wasserstein Distances

J Schrieber - 2019 - d-nb.info

Optimal Transport and Wasserstein Distance are closely related terms that do not only have a long history in the mathematical literature, but also have seen a resurgence in recent years, particularly in the context of the many applications they are used in, which span a …

  Related articles All 2 versions 


[PDF] iop.org

Wasserstein space as state space of quantum mechanics and optimal transport

MF Rosyid, K Wahyuningsih - Journal of Physics: Conference …, 2019 - iopscience.iop.org

In this work, we are in the position to view a measurement of a physical observable as an experiment in the sense of probability theory. To every physical observable, a sample space called the spectrum of the observable is therefore available. We have investigated the …

  Related articles All 2 versions


[PDF] amazonaws.com

[PDF] Supplement to “Optimal Estimation of Wasserstein Distance on A Tree with An Application to Microbiome Studies”

S Wang, TT CaiH Li - pstorage-tf-iopjsd8797887.s3 …

Page 1. Supplement to “Optimal Estimation of Wasserstein Distance on A Tree with An Application to Microbiome Studies” Shulei Wang, T. Tony Cai and Hongzhe Li University of Pennsylvania In this supplementary material, we provide the proof for the main results (Section S1) and all the …

  Related articles All 3 versions 



[PDF] researchgate.net

[PDF] Parallel Wasserstein Generative Adversarial Nets with Multiple Discriminators.

Y Su, S Zhao, X Chen, I King, MR Lyu - IJCAI, 2019 - researchgate.net

… However, the existing algorithms with approximated Wasserstein loss converge slowly due

to heavy computation cost and usually generate unstable results as well. In this paper, we …

Cited by 3 Related articles All 3 versions 


 

[PDF] wiley.com


Data‐driven affinely adjustable distributionally robust framework for unit commitment based on Wasserstein metric

W Hou, R Zhu, H Wei… - IET Generation …, 2019 - Wiley Online Library

… In this paper, we construct a Wasserstein-metric-based affinely adjustable … Wasserstein-metric-based 

ambiguity set is introduced into the UC problem for the first time. The Wasserstein-…

Cited by 19 Related articles All 3 versions


<——2019—–—2019–––1250—  


Primal dual methods for Wasserstein gradient flows

JA Carrillo, K Craig, L Wang, C Wei - arXiv preprint arXiv:1901.08081, 2019 - arxiv.org

… Using the JKO scheme, along with the Benamou-Brenier dynamical characterization of the Wasserstein distance, we reduce computing the solution of these evolutionary PDEs to solving a sequence of fully discrete minimization problems, with strictly convex objective function …

  Cited by 18 Related articles All 3 versions 
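The JKO scheme referred to in this abstract is standard; stated generically (not in this paper's notation), one implicit time step of size τ for an energy functional \mathcal{E}(\rho) reads

\[ \rho^{k+1} \in \operatorname*{arg\,min}_{\rho} \left\{ \frac{1}{2\tau} W_2^2(\rho, \rho^{k}) + \mathcal{E}(\rho) \right\}, \]

so that, as τ → 0, the discrete iterates approximate the gradient flow of \mathcal{E} with respect to the 2-Wasserstein metric.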


[PDF] sciencedirect.com

Harmonic mappings valued in the Wasserstein space

H Lavenant - Journal of Functional Analysis, 2019 - Elsevier

… The idea is to start from curves valued in the Wasserstein space and the so-called Benamou-Brenier formula [6]. If I is a segment of R and μ : I P ( D ) is an absolutely continuous curve, then its Dirichlet energy, which is nothing else than the integral of the square of its metric …

Cited by 16 Related articles All 13 versions

[PDF] mlr.press

Regularity as regularization: Smooth and strongly convex brenier potentials in optimal transport

FP Paty, A d'Aspremont… - … Conference on Artificial …, 2020 - proceedings.mlr.press

… If fµ were to be exactly equal to ν, such a function would be called a Brenier potential. We quantify that nearness in terms of the Wasserstein distance between the push-foward of µ and ν to define: Definition 1. Let E be a partition of Rd and 0 ≤ l ≤ L. For µ, ν 2(Rd), we …

  Cited by 6 Related articles All 10 versions 

 


[PDF] arxiv.org

Wasserstein distance based domain adaptation for object detection

P Xu, P Gurram, G Whipps, R Chellappa - arXiv preprint arXiv:1909.08675, 2019 - arxiv.org

… Wasserstein distance over the CE loss for domain adaptation. Since our task focuses on

unsupervised domain adaptation in feature space, for a fair comparison, we do not compare our …

Cited by 12 Related articles All 3 versions 


[PDF] arxiv.org

Sharp asymptotic and finite-sample rates of convergence of empirical measures in Wasserstein distance

J Weed, F Bach - Bernoulli, 2019 - projecteuclid.org

The Wasserstein distance between two probability measures on a metric space is a measure of closeness with applications in statistics, probability, and machine learning. In this work, we consider the fundamental question of how quickly the empirical measure …

  Cited by 163 Related articles All 6 versions
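As a quick numerical illustration of how the empirical measure approaches the true one in Wasserstein distance, the Python sketch below (my own standard-normal example, not taken from the paper) compares samples of increasing size against a very large reference sample in one dimension; scipy.stats.wasserstein_distance returns the exact 1D W1 between the two empirical distributions.

import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
reference = rng.standard_normal(1_000_000)   # large sample standing in for the true measure
for n in (100, 1_000, 10_000):
    sample = rng.standard_normal(n)
    # empirical W1 to the reference shrinks as the sample size n grows
    print(n, wasserstein_distance(sample, reference))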


[PDF] arxiv.org

Wasserstein distance based domain adaptation for object detection

P Xu, P Gurram, G Whipps, R Chellappa - arXiv preprint arXiv:1909.08675, 2019 - arxiv.org

In this paper, we present an adversarial unsupervised domain adaptation framework for object detection. Prior approaches utilize adversarial training based on cross entropy between the source and target domain distributions to learn a shared feature mapping that …

  Cited by 6 Related articles All 2 versions 


[PDF] researchgate.net

[PDF] Tractable reformulations of distributionally robust two-stage stochastic programs with ∞-Wasserstein distance

W Xie - arXiv preprint arXiv:1908.08454, 2019 - researchgate.net

In the optimization under uncertainty, decision-makers first select a wait-and-see policy before any realization of uncertainty and then place a here-and-now decision after the uncertainty has been observed. Two-stage stochastic programming is a popular modeling …

 Cited by 13 Related articles
  

The wasserstein transform

F Memoli, Z Smith, Z Wan - International Conference on …, 2019 - proceedings.mlr.press

We introduce the Wasserstein transform, a method for enhancing and denoising datasets

defined on general metric spaces. The construction draws inspiration from Optimal

Transportation ideas. We establish the stability of our method under data perturbation and …

  Cited by 5 Related articles All 5 versions 


2019 [PDF] arxiv.org

Subexponential upper and lower bounds in Wasserstein distance for Markov processes

A Arapostathis, G Pang, N Sandrić - arXiv preprint arXiv:1907.05250, 2019 - arxiv.org

In this article, relying on Foster-Lyapunov drift conditions, we establish subexponential upper and lower bounds on the rate of convergence in the $\mathrm {L}^ p $-Wasserstein distance for a class of irreducible and aperiodic Markov processes. We further discuss these …

  Cited by 2 Related articles All 3 versions 

 

[PDF] researchgate.net

[PDF] Rate of convergence in Wasserstein distance of piecewise-linear Lévy-driven SDEs

A Arapostathis, G Pang… - arXiv preprint arXiv …, 2019 - researchgate.net

In this paper, we study the rate of convergence under the Wasserstein metric of a broad class of multidimensional piecewise Ornstein–Uhlenbeck processes with jumps. These are governed by stochastic differential equations having a piecewise linear drift, and a fairly …

  Related articles 

<——2019—–—2019 ——1260— 


[PDF] arxiv.org

Wasserstein-fisher-rao document distance

Z Wang, D Zhou, Y Zhang, H Wu, C Bao - arXiv preprint arXiv:1904.10294, 2019 - arxiv.org

As a fundamental problem of natural language processing, it is important to measure the distance between different documents. Among the existing methods, the Word Mover's Distance (WMD) has shown remarkable success in document semantic matching for its clear …

  Cited by 2 Related articles All 3 versions 


 

 [PDF] colostate.edu

[PDF] Morse Theory for Wasserstein Spaces

J Mirth - math.colostate.edu

Applied topology uses simplicial complexes to approximate a manifold based on data. This

approximation is known not to always recover the homotopy type of the manifold. In this work-

in-progress we investigate how to compute the homotopy type in such settings using …

  Related articles All 2 versions 


2019  [PDF] arxiv.org

Distributionally Robust XVA via Wasserstein Distance: Wrong Way Counterparty Credit and Funding Risk

D Singh, S Zhang - arXiv preprint arXiv:1910.01781, 2019 - arxiv.org

This paper investigates calculations of robust XVA, in particular, credit valuation adjustment (CVA) and funding valuation adjustment (FVA) for over-the-counter derivatives under distributional uncertainty using Wasserstein distance as the ambiguity measure. Wrong way …

  Related articles All 8 versions 

 

Minimax estimation of smooth densities in Wasserstein distance

J Niles-Weed, Q Berthet - arXiv e-prints, 2019 - ui.adsabs.harvard.edu

We study nonparametric density estimation problems where error is measured in the Wasserstein distance, a metric on probability distributions popular in many areas of statistics and machine learning. We give the first minimax-optimal rates for this problem for general …

Cited by 44 Related articles All 5 versions 

[PDF] cmu.edu

[PDF] Optimal Transport and Wasserstein Distance

S Kolouri - stat.cmu.edu

The Wasserstein distance — which arises from the idea of optimal transport — is being used more and more in Statistics and Machine Learning. In these notes we review some of the basics about this topic. Two good references for this topic are … Kolouri, Soheil, et al. Optimal Mass …

  All 2 versions 
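Several of the surveyed methods rely on entropically regularized optimal transport. A minimal Sinkhorn iteration for two discrete histograms, written as an illustrative Python sketch (the function name and the regularization value are my own choices, not tied to any particular entry above), looks like this:

import numpy as np

def sinkhorn(a, b, C, reg=0.05, n_iter=500):
    # entropic-regularized transport plan between histograms a (m,) and b (n,) with cost matrix C (m, n)
    K = np.exp(-C / reg)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]
    return P, float(np.sum(P * C))   # approximate plan and transport cost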


 [PDF] uchile.cl

[PDF] WASSERSTEIN-BASED DISTANCE FOR TIME SERIES ANALYSIS

E CAZELLES, A ROBERT, F TOBAR - cmm.uchile.cl

Background: for a stationary continuous-time time series x(t), the power spectral density is given by S(ξ) = lim_{T→∞} …

  Related articles 
 

[PDF] sfds.asso.fr

[PDF] Méthode de couplage en distance de Wasserstein pour la théorie des valeurs extrêmes

B Bobbia, C Dombry, D Varron - jds2019.sfds.asso.fr

[Translated from French] We propose a rereading of classical results from extreme value theory, which we study using the tools provided by optimal transport theory. In this framework, the normality of the estimators can be seen as a convergence of …

  Related articles All 2 versions 



[PDF] arxiv.org

Wasserstein dependency measure for representation learning

S Ozair, C Lynch, Y Bengio, A Oord, S Levine… - arXiv preprint arXiv …, 2019 - arxiv.org

Mutual information maximization has emerged as a powerful learning objective for unsupervised representation learning obtaining state-of-the-art performance in applications such as object recognition, speech recognition, and reinforcement learning. However, such …

  Cited by 23 Related articles All 5 versions 



[PDF] arxiv.org

Precise simulation of electromagnetic calorimeter showers using a Wasserstein Generative Adversarial Network

M Erdmann, J Glombitza, T Quast - Computing and Software for Big …, 2019 - Springer

Simulations of particle showers in calorimeters are computationally time-consuming, as they have to reproduce both energy depositions and their considerable fluctuations. A new approach to ultra-fast simulations is generative models where all calorimeter energy …

  Cited by 40 Related articles All 6 versions


[PDF] arxiv.org

Using wasserstein-2 regularization to ensure fair decisions with neural-network classifiers

L Risser, Q Vincenot, N Couellan… - arXiv preprint arXiv …, 2019 - arxiv.org

In this paper, we propose a new method to build fair Neural-Network classifiers by using a constraint based on the Wasserstein distance. More specifically, we detail how to efficiently compute the gradients of Wasserstein-2 regularizers for Neural-Networks. The proposed …

  Cited by 9 Related articles All 2 versions 

<——2019—–—2019 ——1270—  



[PDF] ieee.org

Accelerating CS-MRI reconstruction with fine-tuning Wasserstein generative adversarial network

M Jiang, Z Yuan, X Yang, J Zhang, Y Gong, L Xia… - IEEE …, 2019 - ieeexplore.ieee.org

Compressed sensing magnetic resonance imaging (CS-MRI) is a time-efficient method to acquire MR images by taking advantage of the highly under-sampled k-space data to accelerate the time consuming acquisition process. In this paper, we proposed a de-aliasing …

  Cited by 5 Related articles


[PDF] mlr.press

A gradual, semi-discrete approach to generative network training via explicit wasserstein minimization

Y Chen, M Telgarsky, C Zhang… - International …, 2019 - proceedings.mlr.press

This paper provides a simple procedure to fit generative networks to target distributions, with the goal of a small Wasserstein distance (or other optimal transport costs). The approach is based on two principles:(a) if the source randomness of the network is a continuous …

  Cited by 4 Related articles All 10 versions 


[PDF] researchgate.net

Infrared and Visible Image Fusion via Multi-discriminators Wasserstein Generative Adversarial Network

J Li, H Huo, K Liu, C Li, S Li… - 2019 18th IEEE …, 2019 - ieeexplore.ieee.org

Generative adversarial network (GAN) has been widely applied to infrared and visible image fusion. However, the existing GAN-based image fusion methods only establish one discriminator in the network to make the fused image capture gradient information from the …

  Cited by 1 Related articles All 3 versions


Image Reflection Removal Using the Wasserstein Generative Adversarial Network

T Li, DPK Lun - … 2019-2019 IEEE International Conference on …, 2019 - ieeexplore.ieee.org

Imaging through a semi-transparent material such as glass often suffers from the reflection problem, which degrades the image quality. Reflection removal is a challenging task since it is severely ill-posed. Traditional methods, while all require long computation time on …

  Cited by 1 Related articles All 2 versions


Cross-domain Attention Network with Wasserstein Regularizers for E-commerce Search

M Qiu, B Wang, C Chen, X Zeng, J Huang… - Proceedings of the 28th …, 2019 - dl.acm.org

Product search and recommendation is a task that every e-commerce platform wants to outperform their peers on. However, training a good search or recommendation model often requires more data than what many platforms have. Fortunately, the search tasks on different …

  Related articles



[PDF] mdpi.com

Wasserstein Generative Adversarial Network Based De-Blurring Using Perceptual Similarity

M Hong, Y Choe - Applied Sciences, 2019 - mdpi.com

The de-blurring of blurred images is one of the most important image processing methods and it can be used for the preprocessing step in many multimedia and computer vision applications. Recently, de-blurring methods have been performed by neural network  …

  Cited by 1 Related articles All 4 versions 


[PDF] arxiv.org

A measure approximation theorem for Wasserstein-robust expected values

G van Zyl - arXiv preprint arXiv:1912.12119, 2019 - arxiv.org

We consider the problem of finding the infimum, over probability measures being in a ball defined by Wasserstein distance, of the expected value of a bounded Lipschitz random variable on $\mathbf {R}^ d $. We show that if the $\sigma-$ algebra is approximated in by a …

  Related articles All 2 versions 



[PDF] arxiv.org

Clustering measure-valued data with Wasserstein barycenters

G Domazakis, D Drivaliaris, S Koukoulas… - arXiv preprint arXiv …, 2019 - arxiv.org

In this work, learning schemes for measure-valued data are proposed, ie data that their structure can be more efficiently represented as probability measures instead of points on $\R^ d $, employing the concept of probability barycenters as defined with respect to the …

  Related articles All 2 versions 


PWGAN: wasserstein GANs with perceptual loss for mode collapse

X Wu, C Shi, X Li, J He, X Wu, J Lv, J Zhou - Proceedings of the ACM …, 2019 - dl.acm.org

Generative adversarial network (GAN) plays an important part in image generation. It has great achievements trained on large scene data sets. However, for small scene data sets, we find that most of methods may lead to a mode collapse, which may repeatedly generate …

  Related articles


[PDF] biorxiv.org

Reproducibility test of radiomics using network analysis and Wasserstein K-means algorithm

JH Oh, AP Apte, E Katsoulakis, N Riaz, V Hatzoglou… - bioRxiv, 2019 - biorxiv.org

Purpose To construct robust and validated radiomic predictive models, the development of a reliable method that can identify reproducible radiomic features robust to varying image acquisition methods and other scanner parameters should be preceded with rigorous …

  Related articles All 3 versions 

<——2019—–—2019 ——1280—  


[PDF] dpi-proceedings.com

Isomorphic Wasserstein Generative Adversarial Network for Numeric Data Augmentation

W Wei, W Chuang, LI Yue - DEStech Transactions on …, 2019 - dpi-proceedings.com

GAN-based schemes are one of the most popular methods designed for image generation. Some recent studies have suggested using GAN for numeric data augmentation that is to generate data for completing the imbalanced numeric data. Compared to the conventional …

  Related articles All 2 versions 


A Cost Effective Solution for Road Crack Inspection using ...

[PDF] arxiv.org

by Q Mei · 2019 · Cited by 2 — method combining conditional Wasserstein generative adversarial network and connectivity maps is developed for pixel level crack detection. This paper is ...

[CITATION] A conditional wasserstein generative adversarial network for pixel-level crack detection using video extracted images

Q MeiM Gül - arXiv preprint arXiv:1907.06014, 2019

  Cited by 6 Related articles


Tackling Algorithmic Bias in Neural-Network Classifiers using Wasserstein-2 Regularization

L Risser, Q Vincenot, JM Loubes - arXiv e-prints, 2019 - ui.adsabs.harvard.edu

The increasingly common use of neural network classifiers in industrial and social applications of image analysis has allowed impressive progress these last years. Such methods are however sensitive to algorithmic bias, ie to an under-or an over-representation …



[PDF] mlr.press

Wasserstein regularization for sparse multi-task regression

H Janati, M Cuturi, A Gramfort - The 22nd International …, 2019 - proceedings.mlr.press

We focus in this paper on high-dimensional regression problems where each regressor can be associated to a location in a physical space, or more generally a generic geometric space. Such problems often employ sparse priors, which promote models using a small …

  Cited by 27 Related articles All 8 versions 


Deep joint two-stream Wasserstein auto-encoder and selective attention alignment for unsupervised domain adaptation

Z Chen, C Chen, X Jin, Y Liu, Z Cheng - Neural computing and …, 2019 - Springer

Abstract Domain adaptation refers to the process of utilizing the labeled source domain data to learn a model that can perform well in the target domain with limited or missing labels. Several domain adaptation methods combining image translation and feature alignment …

  Cited by 12 Related articles




Deep multi-Wasserstein unsupervised domain adaptation

TN Le, A Habrard, M Sebban - Pattern Recognition Letters, 2019 - Elsevier

In unsupervised domain adaptation (DA), one aims at learning from labeled source data and fully unlabeled target examples a model with a low error on the target domain. In this setting, standard generalization bounds prompt us to minimize the sum of three terms: (a) the source …

  Cited by 3 Related articles All 3 versions


[PDF] arxiv.org

Towards diverse paraphrase generation using multi-class wasserstein GAN

Z An, S Liu - arXiv preprint arXiv:1909.13827, 2019 - arxiv.org

Paraphrase generation is an important and challenging natural language processing (NLP) task. In this work, we propose a deep generative model to generate paraphrase with diversity. Our model is based on an encoder-decoder architecture. An additional transcoder …

  Cited by 4 Related articles All 3 versions 


[PDF] arxiv.org

2-wasserstein approximation via restricted convex potentials with application to improved training for gans

A Taghvaei, A Jalali - arXiv preprint arXiv:1902.07197, 2019 - arxiv.org

We provide a framework to approximate the 2-Wasserstein distance and the optimal transport map, amenable to efficient training as well as statistical and geometric analysis. With the quadratic cost and considering the Kantorovich dual form of the optimal …

  Cited by 9 Related articles All 3 versions 


[PDF] monash.edu

[PDF] Threeplayer wasserstein gan via amortised duality

QH Nhan Dam, T Le, TD Nguyen… - Proc. of the 28th Int …, 2019 - research.monash.edu

We propose a new formulation for learning generative adversarial networks (GANs) using optimal transport cost (the general form of Wasserstein distance) as the objective criterion to measure the dissimilarity between target distribution and learned distribution. Our …

  Cited by 2 Related articles All 3 versions 


[PDF] arxiv.org

Duality and quotient spaces of generalized Wasserstein spaces

NP Chung, TS Trinh - arXiv preprint arXiv:1904.12461, 2019 - arxiv.org

In this article, using ideas of Liero, Mielke and Savaré in [21], we establish a Kantorovich duality for generalized Wasserstein distances $ W_1^{a, b} $ on a generalized Polish metric space, introduced by Picolli and Rossi. As a consequence, we give another proof that …

  Cited by 3 Related articles All 3 versions 

<——2019—–—2019 ——1290—




[PDF] arxiv.org

A two-phase two-fluxes degenerate Cahn–Hilliard model as constrained Wasserstein gradient flow

C Cancès, D Matthes, F Nabet - Archive for Rational Mechanics and …, 2019 - Springer

We study a non-local version of the Cahn–Hilliard dynamics for phase separation in a two-component incompressible and immiscible mixture with linear mobilities. Differently to the celebrated local model with nonlinear mobility, it is only assumed that the divergences of the …

  Cited by 7 Related articles All 17 versions


[PDF] researchgate.net

Infrared and Visible Image Fusion via Multi-discriminators Wasserstein Generative Adversarial Network

J Li, H Huo, K Liu, C Li, S Li… - 2019 18th IEEE …, 2019 - ieeexplore.ieee.org

Generative adversarial network (GAN) has been widely applied to infrared and visible image fusion. However, the existing GAN-based image fusion methods only establish one discriminator in the network to make the fused image capture gradient information from the …

  Cited by 1 Related articles All 3 versions



Combining multi-task autoencoder with Wasserstein generative adversarial networks for improving speech recognition performance

CY Kao, H Ko - The Journal of the Acoustical Society of Korea, 2019 - koreascience.or.kr

As the presence of background noise in acoustic signal degrades the performance of speech or acoustic event recognition, it is still challenging to extract noise-robust acoustic features from noisy signal. In this paper, we propose a combined structure of Wasserstein  …

  Related articles All 3 versions 


[HTML] sciencedirect.com

[HTML] Wasserstein GAN-based small-sample augmentation for new-generation artificial intelligence: a case study of cancer-staging data in biology

Y Liu, Y Zhou, X Liu, F Dong, C Wang, Z Wang - Engineering, 2019 - Elsevier

It is essential to utilize deep-learning algorithms based on big data for the implementation of the new generation of artificial intelligence. Effective utilization of deep learning relies considerably on the number of labeled samples, which restricts the application of deep …

  Cited by 32 Related articles All 5 versions



Deep joint two-stream Wasserstein auto-encoder and selective attention alignment for unsupervised domain adaptation

Z Chen, C Chen, X Jin, Y Liu, Z Cheng - Neural computing and …, 2019 - Springer

Abstract Domain adaptation refers to the process of utilizing the labeled source domain data to learn a model that can perform well in the target domain with limited or missing labels. Several domain adaptation methods combining image translation and feature alignment …

  Cited by 12 Related articles


Deep multi-Wasserstein unsupervised domain adaptation

TN Le, A Habrard, M Sebban - Pattern Recognition Letters, 2019 - Elsevier

In unsupervised domain adaptation (DA), one aims at learning from labeled source data and fully unlabeled target examples a model with a low error on the target domain. In this setting, standard generalization bounds prompt us to minimize the sum of three terms: (a) the source …

  Cited by 3 Related articles All 3 versions


Improved Procedures for Training Primal Wasserstein GANs

T Zhang, Z Li, Q Zhu, D Zhang - 2019 IEEE SmartWorld …, 2019 - ieeexplore.ieee.org

Primal Wasserstein GANs are a variant of Generative Adversarial Networks (ie, GANs), which optimize the primal form of empirical Wasserstein distance directly. However, the high computational complexity and training instability are the main challenges of this framework …

  Related articles


[PDF] illinois.edu

Deep generative models via explicit Wasserstein minimization

Y Chen - 2019 - ideals.illinois.edu

This thesis provides a procedure to fit generative networks to target distributions, with the goal of a small Wasserstein distance (or other optimal transport costs). The approach is based on two principles:(a) if the source randomness of the network is a continuous …

  Related articles All 3 versions 


[PDF] arxiv.org

Statistical data analysis in the Wasserstein space

J Bigot - arXiv preprint arXiv:1907.08417, 2019 - arxiv.org

This paper is concerned by statistical inference problems from a data set whose elements may be modeled as random probability measures such as multiple histograms or point clouds. We propose to review recent contributions in statistics on the use of Wasserstein  …

  Cited by 3 Related articles All 2 versions 


[PDF] arxiv.org

Fréchet means and Procrustes analysis in Wasserstein space

Y Zemel, VM Panaretos - Bernoulli, 2019 - projecteuclid.org

We consider two statistical problems at the intersection of functional and non-Euclidean data analysis: the determination of a Fréchet mean in the Wasserstein space of multivariate distributions; and the optimal registration of deformed random measures and point …

  Cited by 47 Related articles All 8 versions

<——2019—–—2019 ——1300—

[PDF] mlr.press

Accelerated linear convergence of stochastic momentum methods in wasserstein distances

B Can, M Gurbuzbalaban, L Zhu - … Conference on Machine …, 2019 - proceedings.mlr.press

Momentum methods such as Polyak's heavy ball (HB) method, Nesterov's accelerated gradient (AG) as well as accelerated projected gradient (APG) method have been commonly used in machine learning practice, but their performance is quite sensitive to noise in the …

  Cited by 16 Related articles All 8 versions 


Computing Wasserstein Barycenters via linear programming

G Auricchio, F Bassetti, S Gualandi… - … Conference on Integration …, 2019 - Springer

This paper presents a family of generative Linear Programming models that permit to compute the exact Wasserstein Barycenter of a large set of two-dimensional images. Wasserstein Barycenters were recently introduced to mathematically generalize the concept …

  Cited by 4 Related articles All 2 versions
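The two-marginal building block behind such linear-programming formulations can be written down directly. The following Python sketch (my own illustration using scipy.optimize.linprog, not the authors' barycenter model) solves the exact discrete Kantorovich problem between two histograms; for the p-Wasserstein distance one passes the entrywise p-th powers of the ground distances as C and takes the p-th root of the optimal value.

import numpy as np
from scipy.optimize import linprog

def exact_ot(a, b, C):
    # exact optimal transport between histograms a (m,) and b (n,) with a.sum() == b.sum()
    m, n = C.shape
    A_eq = np.zeros((m + n, m * n))
    for i in range(m):
        A_eq[i, i * n:(i + 1) * n] = 1.0      # row sums of the plan equal a
    for j in range(n):
        A_eq[m + j, j::n] = 1.0               # column sums of the plan equal b
    res = linprog(C.ravel(), A_eq=A_eq, b_eq=np.concatenate([a, b]),
                  bounds=(0, None), method="highs")
    return float(res.fun), res.x.reshape(m, n)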

 

[PDF] thecvf.com

Order-Preserving Wasserstein Discriminant Analysis

B Su, J Zhou, Y Wu - Proceedings of the IEEE/CVF …, 2019 - openaccess.thecvf.com

Supervised dimensionality reduction for sequence data projects the observations in sequences onto a low-dimensional subspace to better separate different sequence classes. It is typically more challenging than conventional dimensionality reduction for static data …

  Cited by 2 Related articles All 6 versions 

[PDF] koreascience.or.kr


Experimental Analysis of Equilibrization in Binary Classification for Non-Image Imbalanced Data Using Wasserstein GAN

ZY Wang, DK Kang - International Journal of Internet …, 2019 - koreascience.or.kr

In this paper, we explore the details of three classic data augmentation methods and two generative model based oversampling methods. The three classic data augmentation methods are random sampling (RANDOM), Synthetic Minority Over-sampling Technique …

  Cited by 2 Related articles All 3 versions 



[PDF] arxiv.org

A First-Order Algorithmic Framework for Wasserstein Distributionally Robust Logistic Regression

J Li, S Huang, AMC So - arXiv preprint arXiv:1910.12778, 2019 - arxiv.org

Wasserstein distance-based distributionally robust optimization (DRO) has received much attention lately due to its ability to provide a robustness interpretation of various learning models. Moreover, many of the DRO problems that arise in the learning context admits exact …

  Cited by 1 Related articles All 7 versions 


2019

[PDF] nsf.gov

A Distributionally Robust Optimization Approach for Multivariate Linear Regression under the Wasserstein Metric

R Chen, IC Paschalidis - 2019 IEEE 58th Conference on …, 2019 - ieeexplore.ieee.org

We present a Distributionally Robust Optimization (DRO) approach for Multivariate Linear Regression (MLR), where multiple correlated response variables are to be regressed against a common set of predictors. We develop a regularized MLR formulation that is robust …

  Related articles All 3 versions



[PDF] thecvf.com

[PDF] Order-preserving Wasserstein Discriminant Analysis: Supplementary Material

B Su, J Zhou, Y Wu - openaccess.thecvf.com

Fig. 1 illustrates the learned barycenters for two sequence classes from the UCR Time Series Archive [1]. Note that the sequences are univariate sequences for illustration. In this paper, we tackle multivariate sequences. We can observe that each barycenter reflects the …

  All 2 versions 


[PDF] biorxiv.org

Reproducibility test of radiomics using network analysis and Wasserstein K-means algorithm

JH Oh, AP Apte, E Katsoulakis, N Riaz, V Hatzoglou… - bioRxiv, 2019 - biorxiv.org

Purpose To construct robust and validated radiomic predictive models, the development of a reliable method that can identify reproducible radiomic features robust to varying image acquisition methods and other scanner parameters should be preceded with rigorous …

  Related articles All 3 versions 


2019

[PDF] arxiv.org

Learning with minibatch Wasserstein: asymptotic and gradient properties

K Fatras, Y Zine, R Flamary, R Gribonval… - arXiv preprint arXiv …, 2019 - arxiv.org

Optimal transport distances are powerful tools to compare probability distributions and have

found many applications in machine learning. Yet their algorithmic complexity prevents their

direct use on large scale datasets. To overcome this challenge, practitioners compute these …

  Cited by 9 Related articles All 23 versions 

[PDF] arxiv.org


Kernelized wasserstein natural gradient

M Arbel, A Gretton, W Li, G Montúfar - arXiv preprint arXiv:1910.09652, 2019 - arxiv.org

Many machine learning problems can be expressed as the optimization of some cost

functional over a parametric family of probability distributions. It is often beneficial to solve

such optimization problems using natural gradient methods. These methods are invariant to …

  Cited by 6 Related articles All 7 versions 


 [PDF] arxiv.org

Tree-Wasserstein Barycenter for Large-Scale Multilevel Clustering and Scalable Bayes

T Le, V Huynh, N Ho, D Phung, M Yamada - arXiv preprint arXiv …, 2019 - arxiv.org

We study in this paper a variant of Wasserstein barycenter problem, which we refer to as tree-

Wasserstein barycenter, by leveraging a specific class of ground metrics, namely tree

metrics, for Wasserstein distance. Drawing on the tree structure, we propose an efficient …

 Cited by 1 Related articles All 4 versions 

<——2019—–—2019 ——1310—  


 2019

[PDF] arxiv.org

Statistical data analysis in the Wasserstein space

J Bigot - arXiv preprint arXiv:1907.08417, 2019 - arxiv.org

This paper is concerned by statistical inference problems from a data set whose elements

may be modeled as random probability measures such as multiple histograms or point

clouds. We propose to review recent contributions in statistics on the use of Wasserstein  …

  Cited by 3 Related articles All 2 versions 

 

[PDF] aclweb.org

Modeling personalization in continuous space for response generation via augmented wasserstein autoencoders

Z Chan, J Li, X Yang, X Chen, W Hu, D Zhao… - Proceedings of the 2019 …, 2019 - aclweb.org

Abstract Variational autoencoders (VAEs) and Wasserstein autoencoders (WAEs) have

achieved noticeable progress in open-domain response generation. Through introducing

latent variables in continuous space, these models are capable of capturing utterance-level …

  Cited by 11 Related articles All 3 versions 


[PDF] iop.org

Wasserstein space as state space of quantum mechanics and optimal transport

MF Rosyid, K Wahyuningsih - Journal of Physics: Conference …, 2019 - iopscience.iop.org

In this work, we are in the position to view a measurement of a physical observable as an

experiment in the sense of probability theory. To every physical observable, a sample space

called the spectrum of the observable is therefore available. We have investigated the …

  Related articles All 2 versions

 

[PDF] ucla.edu

[PDF] Tropical Optimal Transport and Wasserstein Distances in Phylogenetic Tree Space

W Lee, W LiB LinA Monod - arXiv preprint arXiv:1911.05401, 2019 - math.ucla.edu

We study the problem of optimal transport on phylogenetic tree space from the perspective

of tropical geometry, and thus define the Wasserstein-p distances for probability measures in

this continuous metric measure space setting. With respect to the tropical metric—a …

  Related articles All 2 versions 


[PDF] tum.de

Structure preserving discretization and approximation of gradient flows in Wasserstein-like space

S Plazotta - 2019 - mediatum.ub.tum.de

This thesis investigates structure-preserving, temporal semi-discretizations and

approximations for PDEs with gradient flow structure with the application to evolution

problems in the L²-Wasserstein space. We investigate the variational formulation of the time …

  Related articles All 3 versions 


2019


Distributions with Maximum Spread Subject to Wasserstein Distance Constraints

JG Carlsson, Y Wang - Journal of the Operations Research Society of …, 2019 - Springer

Recent research on formulating and solving distributionally robust optimization problems

has seen many different approaches for describing one's ambiguity set, such as constraints

on first and second moments or quantiles. In this paper, we use the Wasserstein distance to …

  Related articles All 3 versions


2019

[PDF] arxiv.org

Statistical data analysis in the Wasserstein space

J Bigot - arXiv preprint arXiv:1907.08417, 2019 - arxiv.org

This paper is concerned by statistical inference problems from a data set whose elements

may be modeled as random probability measures such as multiple histograms or point

clouds. We propose to review recent contributions in statistics on the use of Wasserstein  …

  Cited by 3 Related articles All 2 versions 


[PDF] arxiv.org

Data-driven chance constrained optimization under Wasserstein ambiguity sets

AR Hota, A Cherukuri, J Lygeros - 2019 American Control …, 2019 - ieeexplore.ieee.org

We present a data-driven approach for distri-butionally robust chance constrained

optimization problems (DRCCPs). We consider the case where the decision maker has

access to a finite number of samples or realizations of the uncertainty. The chance constraint …

  Cited by 19 Related articles All 4 versions



2019

Training Wasserstein GANs for Estimating Depth Maps

AT Arslan, E Seke - 2019 3rd International Symposium on …, 2019 - ieeexplore.ieee.org

Depth maps depict pixel-wise depth association with a 2D digital image. Point clouds

generation and 3D surface reconstruction can be conducted by processing a depth map.

Estimating a corresponding depth map from a given input image is an important and difficult …

  Related articles


[PDF] arxiv.org

Data-Driven Distributionally Robust Appointment Scheduling over Wasserstein Balls

R Jiang, M Ryu, G Xu - arXiv preprint arXiv:1907.03219, 2019 - arxiv.org

We study a single-server appointment scheduling problem with a fixed sequence of

appointments, for which we must determine the arrival time for each appointment. We

specifically examine two stochastic models. In the first model, we assume that all appointees …

  Cited by 3 Related articles All 3 versions 

<——2019—–—2019 ——1320—  


[PDF] arxiv.org

Minimax confidence intervals for the sliced Wasserstein distance

T Manole, S Balakrishnan, L Wasserman - arXiv preprint arXiv:1909.07862, 2019 - arxiv.org

Motivated by the growing popularity of variants of the Wasserstein distance in statistics and

machine learning, we study statistical inference for the Sliced Wasserstein distance--an

easily computable variant of the Wasserstein distance. Specifically, we construct confidence …

  Cited by 3 Related articles All 4 versions 
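Since the sliced Wasserstein distance only requires one-dimensional projections, a Monte Carlo estimate fits in a few lines. The Python sketch below (the projection count and seed are arbitrary choices of mine) averages exact 1D W1 distances over random directions:

import numpy as np
from scipy.stats import wasserstein_distance

def sliced_wasserstein(x, y, n_proj=200, seed=0):
    # Monte Carlo estimate of the sliced 1-Wasserstein distance between point clouds x, y in R^d
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_proj):
        theta = rng.standard_normal(x.shape[1])
        theta /= np.linalg.norm(theta)          # random direction on the unit sphere
        total += wasserstein_distance(x @ theta, y @ theta)
    return total / n_proj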


2019 [PDF] arxiv.org

Wasserstein distance based deep adversarial transfer learning for intelligent fault diagnosis

C Cheng, B Zhou, G Ma, D Wu, Y Yuan - arXiv preprint arXiv:1903.06753, 2019 - arxiv.org

The demand of artificial intelligent adoption for condition-based maintenance strategy is

astonishingly increased over the past few years. Intelligent fault diagnosis is one critical

topic of maintenance solution for mechanical systems. Deep learning models, such as …

   Cited by 24 Related articles All 3 versions 

[PDF] arxiv.org

The quadratic Wasserstein metric for inverse data matching

K Ren, Y Yang - arXiv preprint arXiv:1911.06911, 2019 - arxiv.org

This work characterizes, analytically and numerically, two major effects of the quadratic

Wasserstein ($ W_2 $) distance as the measure of data discrepancy in computational

solutions of inverse problems. First, we show, in the infinite-dimensional setup, that the …

  Cited by 1 Related articles 


Data-driven distributionally robust shortest path problem using the Wasserstein ambiguity set

Z Wang, K You, S Song, C Shang - 2019 IEEE 15th …, 2019 - ieeexplore.ieee.org

This paper proposes a data-driven distributionally robust shortest path (DRSP) model where

the distribution of the travel time is only observable through a finite training dataset. Our

DRSP model adopts the Wasserstein metric to construct the ambiguity set of probability …

  Cited by 1 Related articles


[PDF] arxiv.org

Clustering measure-valued data with Wasserstein barycenters

G Domazakis, D Drivaliaris, S Koukoulas… - arXiv preprint arXiv …, 2019 - arxiv.org

In this work, learning schemes for measure-valued data are proposed, ie data that their

structure can be more efficiently represented as probability measures instead of points on

$\R^ d $, employing the concept of probability barycenters as defined with respect to the …

  Related articles All 2 versions 


2019

[PDF] dpi-proceedings.com

Isomorphic Wasserstein Generative Adversarial Network for Numeric Data Augmentation

W Wei, W Chuang, LI Yue - DEStech Transactions on …, 2019 - dpi-proceedings.com

GAN-based schemes are one of the most popular methods designed for image generation.

Some recent studies have suggested using GAN for numeric data augmentation that is to

generate data for completing the imbalanced numeric data. Compared to the conventional …

  Related articles All 2 versions 


2019

Convergence rate in Wasserstein distance and semiclassical limit for the defocusing logarithmic Schrödinger equation

G Ferriere - arXiv preprint arXiv:1903.04309, 2019 - arxiv.org

We consider the dispersive logarithmic Schrödinger equation in a semi-classical scaling.

We extend the results about the large time behaviour of the solution (dispersion faster than

usual with an additional logarithmic factor, convergence of the rescaled modulus of the …

  Cited by 6 Related articles All 4 versions 

2019
Finsler structure for variable exponent Wasserstein space and gradient flows

A Marcos, A Soglo - arXiv preprint arXiv:1912.12450, 2019 - arxiv.org

The variational approach requires the setting of new tools such as appropiate distance on the

probability space and an introduction of a Finsler metric in this space. The class of parabolic

equations is derived as the flow of a gradient with respect the Finsler structure. For q(x) q …

  Related articles All 2 versions 


2019

[PDF] tum.de

Structure preserving discretization and approximation of gradient flows in Wasserstein-like space

S Plazotta - 2019 - mediatum.ub.tum.de

This thesis investigates structure-preserving, temporal semi-discretizations and

approximations for PDEs with gradient flow structure with the application to evolution

problems in the L²-Wasserstein space. We investigate the variational formulation of the time …

  Related articles All 3 versions 


Arterial Spin Labeling Images Synthesis via Locally-Constrained WGAN-GP Ensemble

W Huang, M Luo, X Liu, P Zhang, H Ding… - … Conference on Medical …, 2019 - Springer

Arterial spin labeling (ASL) images begin to receive much popularity in dementia diseases

diagnosis recently, yet it is still not commonly seen in well-established image datasets for

investigating dementia diseases. Hence, synthesizing ASL images from available data is …

 Cited by 2 Related articles All 2 versions

<——2019—–—2019 ——1330—

iWGAN: an Autoencoder WGAN for Inference

Y Chen, Q Gao, X Wang - 2019 - openreview.net

Generative Adversarial Networks (GANs) have been impactful on many problems and

applications but suffer from unstable training. Wasserstein GAN (WGAN) leverages the

Wasserstein distance to avoid the caveats in the minmax two-player training of GANs but …

  Related articles 


Super-Resolution Algorithm of Satellite Cloud Image Based on WGAN-GP

YY Luo, HG Lu, N Jia - 2019 International Conference on …, 2019 - ieeexplore.ieee.org

The resolution of an image is an important indicator for measuring image quality. The higher

the resolution, the more detailed information is contained in the image, which is more

conducive to subsequent image analysis and other tasks. Improving the resolution of images …

Cited by 3 Related articles
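Many of the WGAN-GP entries in this block share the same critic objective: the Wasserstein-1 surrogate plus a gradient penalty on interpolates between real and generated samples. A minimal PyTorch sketch of that generic loss (assuming image-shaped (B, C, H, W) tensors; this is the textbook WGAN-GP recipe, not any specific paper's code) is:

import torch

def wgan_gp_critic_loss(critic, real, fake, lambda_gp=10.0):
    # Wasserstein-1 surrogate E[critic(fake)] - E[critic(real)] plus gradient penalty
    eps = torch.rand(real.size(0), 1, 1, 1, device=real.device)
    mixed = (eps * real + (1.0 - eps) * fake.detach()).requires_grad_(True)
    grad = torch.autograd.grad(critic(mixed).sum(), mixed, create_graph=True)[0]
    penalty = ((grad.flatten(1).norm(2, dim=1) - 1.0) ** 2).mean()
    return critic(fake).mean() - critic(real).mean() + lambda_gp * penalty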

  

[PDF] ceaj.org

[PDF] 结合 FC-DenseNet  WGAN 的图像去雾算法

孙斌, 雎青青, 桑庆兵 - Journal of Frontiers of Computer Science and Technology (计算机科学与探索), 2019 - fcst.ceaj.org

[Translated from Chinese] To address the problem that existing image dehazing algorithms rely heavily on accurate estimation of intermediate quantities, an end-to-end image dehazing model based on the Wasserstein generative adversarial network (WGAN) is proposed. First, a fully convolutional dense-block network (FC-DenseNet) is used to fully learn the haze features in the image; second, the idea of residual learning is adopted to learn the features of the clear image directly from the degraded image …

  Cited by 1 Related articles All 2 versions 

[Chinese: Image dehazing algorithm combining FC-DenseNet and WGAN]

[HTML] cqupt.edu.cn

[HTML] 基于 WGAN 的语音增强算法研究

王怡斐, 韩俊刚, 樊良辉 - Journal of Chongqing University of Posts and Telecommunications (Natural Science Edition), 2019 - journal2.cqupt.edu.cn

[Translated from Chinese] Noisy speech can be viewed as a mixture of independent noise and speech signals produced in some fashion. Traditional speech enhancement methods must make assumptions about the independence and feature distributions of the noise and clean speech signals; unreasonable assumptions cause residual noise, speech distortion and other problems, so the enhancement is poor. In addition, the randomness and abrupt changes of the noise itself also affect traditional speech …

  Cited by 1 Related articles All 3 versions 

[Chinese: Research on Speech Enhancement Algorithm Based on WGAN]

[PDF] jsjkx.com

[PDF] 基于差分 WGAN 的网络安全态势预测

王婷婷, 朱江 - Computer Science (计算机科学), 2019 - jsjkx.com

[Translated from Chinese] This paper proposes a network security situation prediction mechanism based on a differential WGAN (Wasserstein GAN). The mechanism uses a generative adversarial network (GAN) to simulate the evolution of the security situation and achieves prediction along the time dimension. To overcome GAN's difficulties of hard training and mode collapse …

  Related articles All 3 versions 

[CITATION] 基于差分 WGAN 的网络安全态势预测 (Network Security Situation Forecast Based on Differential WGAN).

T Wang, J Zhu - 计算机科学, 2019


[PDF] opticsjournal.net

[PDF] 基于改进型 WGAN 的低剂量 CT 图像去噪方法

徐曾春, 叶超, 杜振龙, 李晓丽 - Optics & Optoelectronic Technology (光学与光电技术), 2019 - opticsjournal.net

[Translated from Chinese] To improve the quality of low-dose CT images, a low-dose CT image denoising method based on an improved Wasserstein generative adversarial network (WGAN-GP) is proposed. WGAN-GP adds a gradient penalty term on top of the WGAN network, which resolves WGAN's difficult training and slow convergence and further improves network performance. A new perceptual loss metric function is also introduced …

  Related articles All 3 versions 

[Chinese: Low-dose CT image denoising method based on improved WGAN]


基于 WGAN 网络的自然视频预测

李敏, 仝明磊, 范绿源, 南昊 - Instrument Technique (仪表技术), 2019 - cnki.com.cn

[Translated from Chinese] Computer vision has achieved great results in both academia and industry, and in recent years video prediction has become an important research area. Existing GAN-based video prediction models require carefully balancing the training of the generator and the discriminator, and the generated samples lack diversity. To address these problems, a Wasserstein generative adversarial network (WGAN) is proposed …

  Related articles 

[Chinese: Natural video prediction based on WGAN network]


Few-Shot Image Generation Using Conditional WGAN-GP (Conditional WGAN-GP 이용한 Few-Shot 이미지 생성) - DBpia

https://www.dbpia.co.kr › articleDetail

[CITATION] Few-Shot Image Generation Using Conditional WGAN-GP (Conditional WGAN-GP 이용한 Few-Shot 이미지 생성)

나상혁, 김준태 - Korean Institute of Information Scientists and Engineers Conference Proceedings (한국정보과학회 학술발표논문집), 2019 - dbpia.co.kr

[Translated from Korean] Recently, a variety of research and development using generative adversarial networks (GANs) has been carried out. A generative adversarial network is a method in which a generator and a discriminator network are trained adversarially against each other so as to generate data similar to real data. However, as in other deep learning fields, training …

  Related articles

[Korean: Few-Shot image generation using Conditional WGAN-GP]


Wasserstein robust reinforcement learning

MA Abdullah, H Ren, HB Ammar, V Milenkovic… - arXiv preprint arXiv …, 2019 - arxiv.org

Reinforcement learning algorithms, though successful, tend to over-fit to training

environments hampering their application to the real-world. This paper proposes $\mathrm{WR}^{2}\mathrm{L}$ -- a robust reinforcement learning algorithm with significant robust …

Cited by 31 Related articles All 7 versions 

[PDF] arxiv.org

Wasserstein distributionally robust optimization: Theory and applications in machine learning

D Kuhn, PM Esfahani, VA Nguyen… - … Science in the Age …, 2019 - pubsonline.informs.org

Many decision problems in science, engineering, and economics are affected by uncertain

parameters whose distribution is only indirectly observable through samples. The goal of

data-driven decision making is to learn a decision from finitely many training samples that …

 Cited by 157 Related articles All 8 versions
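The common template behind these distributionally robust entries is a min-max problem over a Wasserstein ball of radius ε around the empirical distribution \widehat{\mathbb{P}}_n; written generically (not as any one paper's exact model),

\[ \min_{\theta} \; \sup_{\mathbb{Q} :\, W_p(\mathbb{Q}, \widehat{\mathbb{P}}_n) \le \varepsilon} \; \mathbb{E}_{\mathbb{Q}}\big[\ell_{\theta}(\xi)\big]. \]

For p = 1 and losses that are Lipschitz in ξ, the inner supremum is known to reduce to the empirical loss plus ε times the Lipschitz constant, which is why many of the reformulations listed above turn into norm-regularized problems.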

<——2019—–—2019 ——1340— 


Decomposition algorithm for distributionally robust optimization using Wasserstein metric with an application to a class of regression models

F Luo, S Mehrotra - European Journal of Operational Research, 2019 - Elsevier

We study distributionally robust optimization (DRO) problems where the ambiguity set is

defined using the Wasserstein metric and can account for a bounded support. We show that

this class of DRO problems can be reformulated as decomposable semi-infinite programs …

Cited by 33 Related articles All 5 versions+

[PDF] researchgate.net

Wasserstein metric based distributionally robust approximate framework for unit commitment

R Zhu, H Wei, X Bai - IEEE Transactions on Power Systems, 2019 - ieeexplore.ieee.org

This paper proposed a Wasserstein metric-based distributionally robust approximate

framework (WDRA), for unit commitment problem to manage the risk from uncertain wind

power forecasted errors. The ambiguity set employed in the distributionally robust  …

  Cited by 27 Related articles All 3 versions


[PDF] arxiv.org

Confidence regions in wasserstein distributionally robust estimation

J Blanchet, K Murthy, N Si - arXiv preprint arXiv:1906.01614, 2019 - arxiv.org

Wasserstein distributionally robust optimization (DRO) estimators are obtained as solutions

of min-max problems in which the statistician selects a parameter minimizing the worst-case

loss among all probability models within a certain distance (in a Wasserstein sense) from the …

Cited by 23 Related articles All 7 versions 

[PDF] mlr.press

A gradual, semi-discrete approach to generative network training via explicit wasserstein minimization

Y Chen, M Telgarsky, C Zhang… - International …, 2019 - proceedings.mlr.press

This paper provides a simple procedure to fit generative networks to target distributions, with

the goal of a small Wasserstein distance (or other optimal transport costs). The approach is

based on two principles:(a) if the source randomness of the network is a continuous …

  Cited by 4 Related articles All 10 versions 


[PDF] semanticscholar.org

Speech Enhancement for Noise-Robust Speech Synthesis Using Wasserstein GAN.

N Adiga, Y Pantazis, V Tsiaras, Y Stylianou - INTERSPEECH, 2019 - isca-speech.org

The quality of speech synthesis systems can be significantly deteriorated by the presence of

background noise in the recordings. Despite the existence of speech enhancement

techniques for effectively suppressing additive noise under low signal-tonoise (SNR) …

  Cited by 2 Related articles All 4 versions


2019


A First-Order Algorithmic Framework for Wasserstein Distributionally Robust Logistic Regression

J Li, S Huang, AMC So - arXiv preprint arXiv:1910.12778, 2019 - arxiv.org

Wasserstein distance-based distributionally robust optimization (DRO) has received much

attention lately due to its ability to provide a robustness interpretation of various learning

models. Moreover, many of the DRO problems that arise in the learning context admits exact …

  Cited by 1 Related articles All 7 versions 


[PDF] arxiv.org

A measure approximation theorem for Wasserstein-robust expected values

G van Zyl - arXiv preprint arXiv:1912.12119, 2019 - arxiv.org

We consider the problem of finding the infimum, over probability measures being in a ball

defined by Wasserstein distance, of the expected value of a bounded Lipschitz random

variable on $\mathbf {R}^ d $. We show that if the $\sigma-$ algebra is approximated in by a …

  Related articles All 2 versions 


Distributionally robust learning under the wasserstein metric

R Chen - 2019 - search.proquest.com

This dissertation develops a comprehensive statistical learning framework that is robust to

(distributional) perturbations in the data using Distributionally Robust Optimization (DRO)

under the Wasserstein metric. The learning problems that are studied include:(i) …

  Cited by 1 Related articles All 3 versions


[PDF] arxiv.org

2D Wasserstein Loss for Robust Facial Landmark Detection

Y Yan, S Duffner, P Phutane, A Berthelier… - arXiv preprint arXiv …, 2019 - arxiv.org

The recent performance of facial landmark detection has been significantly improved by

using deep Convolutional Neural Networks (CNNs), especially the Heatmap Regression

Models (HRMs). Although their performance on common benchmark datasets has reached a …

  Related articles All 3 versions 


[PDF] nsf.gov

A Distributionally Robust Optimization Approach for Multivariate Linear Regression under the Wasserstein Metric

R Chen, IC Paschalidis - 2019 IEEE 58th Conference on …, 2019 - ieeexplore.ieee.org

We present a Distributionally Robust Optimization (DRO) approach for Multivariate Linear

Regression (MLR), where multiple correlated response variables are to be regressed

against a common set of predictors. We develop a regularized MLR formulation that is robust  …

  Related articles All 3 versions

<——2019—–—2019 ——1350— 


[PDF] openreview.net

A Greedy Approach to Max-Sliced Wasserstein GANs

A Horváth - 2019 - openreview.net

Generative Adversarial Networks have made data generation possible in various use cases,

but in case of complex, high-dimensional distributions it can be difficult to train them,

because of convergence problems and the appearance of mode collapse. Sliced …

  Related articles All 2 versions 


  [PDF] arxiv.org

Duality and quotient spaces of generalized Wasserstein spaces

NP Chung, TS Trinh - arXiv preprint arXiv:1904.12461, 2019 - arxiv.org

In this article, using ideas of Liero, Mielke and Savaré in [21], we establish a Kantorovich

duality for generalized Wasserstein distances $ W_1^{a, b} $ on a generalized Polish metric

space, introduced by Picolli and Rossi. As a consequence, we give another proof that …

  Cited by 3 Related articles All 3 versions 

 PDF

Optimal Control in Wasserstein Spaces - Archive ouverte HAL

hal.archives-ouvertes.fr › tel-02361353 › document

Nov 13, 2019 — [Translated from French] … Riemannian structure of Wasserstein spaces. Subsequently, we … and for their guidance in launching me into the academic world. A big thank-you therefore to … Optimal Control Problems in Wasserstein Spaces. 'Variational Analysis …


 [CITATION] Optimal Control in Wasserstein Spaces.(Commande Optimal dans les Espaces de Wasserstein).

B Bonnet - 2019 - Aix-Marseille University, France


[PDF] polimi.it

Propagating uncertainty in reinforcement learning via wasserstein barycenters

AM Metelli, A Likmeta, M Restelli - 33rd Conference on Neural …, 2019 - re.public.polimi.it

How does the uncertainty of the value function propagate when performing temporal

difference learning? In this paper, we address this question by proposing a Bayesian

framework in which we employ approximate posterior distributions to model the uncertainty …

Cited by 9 Related articles All 8 versions 


[PDF] arxiv.org

Penalization of barycenters in the Wasserstein space

J Bigot, E Cazelles, N Papadakis - SIAM Journal on Mathematical Analysis, 2019 - SIAM

In this paper, a regularization of Wasserstein barycenters for random measures supported

on R^d is introduced via convex penalization. The existence and uniqueness of such

barycenters is first proved for a large class of penalization functions. The Bregman …

  Cited by 27 Related articles All 10 versions

 

2019


Barycenters in generalized Wasserstein spaces

NP Chung, TS Trinh - arXiv preprint arXiv:1909.05517, 2019 - arxiv.org

In 2014, Piccoli and Rossi introduced generalized Wasserstein spaces which are

combinations of Wasserstein distances and $ L^ 1$-distances [11]. In this article, we follow

the ideas of Agueh and Carlier [1] to study generalized Wasserstein barycenters. We show …

  Cited by 1 Related articles All 3 versions 


 www.researchgate.net › publication › 338228807_Learni...

Dec 30, 2019 — ... of points on $\R^d$, employing the concept of probability barycenters as defined with respect to the Wasserstein metric. Such type of learning ...

[CITATION] Learning with Wasserstein barycenters and applications.

G Domazakis, D Drivaliaris, S Koukoulas… - CoRR, 2019


PWGAN: wasserstein GANs with perceptual loss for mode collapse

X Wu, C Shi, X Li, J He, X Wu, J Lv, J Zhou - Proceedings of the ACM …, 2019 - dl.acm.org

Generative adversarial network (GAN) plays an important part in image generation. It has

great achievements trained on large scene data sets. However, for small scene data sets,

we find that most of methods may lead to a mode collapse, which may repeatedly generate …

  Related articles All 2 versions


[PDF] arxiv.org

2-wasserstein approximation via restricted convex potentials with application to improved training for gans

A Taghvaei, A Jalali - arXiv preprint arXiv:1902.07197, 2019 - arxiv.org

We provide a framework to approximate the 2-Wasserstein distance and the optimal

transport map, amenable to efficient training as well as statistical and geometric analysis.

With the quadratic cost and considering the Kantorovich dual form of the optimal …

  Cited by 9 Related articles All 3 versions

 Bridging the Gap Between f-GANs and Wasserstein GANs

J Song, S Ermon - arXiv preprint arXiv:1910.09779, 2019 - arxiv.org

Generative adversarial networks (GANs) have enjoyed much success in learning high-

dimensional distributions. Learning objectives approximately minimize an $ f $-divergence

($ f $-GANs) or an integral probability metric (Wasserstein GANs) between the model and …

[CITATION] Bridging the Gap Between f-GANs and Wasserstein GANs. arXiv e-prints, page

J Song, S Ermon - arXiv preprint arXiv:1910.09779, 2019

<——2019—–—2019 ——1360— 


[PDF] arxiv.org

Parameterized Wasserstein mean with its properties

S Kim - arXiv preprint arXiv:1904.09385, 2019 - arxiv.org

… Note that this metric, denoted as d(A, B) and called the Bures-Wasserstein distance, coincides

with the Bures distance of density matrices in quantum information theory and is the matrix …

 Related articles All 2 versions 
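
For reference, the Bures–Wasserstein distance named in this abstract has the well-known closed form on positive definite matrices

\[
  d_{\mathrm{BW}}(A,B)^2 \;=\; \operatorname{tr}(A) + \operatorname{tr}(B)
  \;-\; 2\,\operatorname{tr}\!\left[\left(A^{1/2} B A^{1/2}\right)^{1/2}\right],
  \qquad A, B \succ 0,
\]

which coincides with the 2-Wasserstein distance between the centered Gaussian measures with covariances $A$ and $B$.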

[PDF] arxiv.org

Strong equivalence between metrics of Wasserstein type

E Bayraktar, G Guo - arXiv preprint arXiv:1912.08247, 2019 - arxiv.org

The sliced Wasserstein and more recently max-sliced Wasserstein metrics $W_p$ have

attracted abundant attention in data sciences and machine learning due to their advantages in

tackling the curse of dimensionality. A question of particular importance is the strong …

  Cited by 3 Related articles All 2 versions 
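
As background for the sliced and max-sliced metrics discussed here, the following is a minimal Python sketch (illustrative only, not code from the paper) that estimates both by Monte Carlo over random one-dimensional projections; the function name, projection count and toy data are assumptions made for the example:

import numpy as np
from scipy.stats import wasserstein_distance

def sliced_w1(x, y, n_projections=200, seed=None):
    # Average (sliced) and maximum (crude max-sliced proxy) of the exact
    # 1-D Wasserstein-1 distances between random projections of x and y.
    rng = np.random.default_rng(seed)
    d = x.shape[1]
    dists = []
    for _ in range(n_projections):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)      # uniform direction on the sphere
        dists.append(wasserstein_distance(x @ theta, y @ theta))
    dists = np.array(dists)
    return dists.mean(), dists.max()

x = np.random.default_rng(0).normal(size=(500, 10))
y = np.random.default_rng(1).normal(loc=0.5, size=(500, 10))
sw, msw = sliced_w1(x, y)
print(f"sliced W1 ~ {sw:.3f}, max over sampled slices ~ {msw:.3f}")

Note that the true max-sliced distance maximizes over all directions; the maximum over finitely many sampled directions only gives a lower bound.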


[PDF] arxiv.org

(q, p)-Wasserstein GANs: Comparing Ground Metrics for Wasserstein GANs

A Mallasto, J Frellsen, W Boomsma… - arXiv preprint arXiv …, 2019 - arxiv.org

Generative Adversarial Networks (GANs) have made a major impact in computer vision and

machine learning as generative models. Wasserstein GANs (WGANs) brought Optimal

Transport (OT) theory into GANs, by minimizing the $1 $-Wasserstein distance between …

Cited by 5 Related articles All 3 versions 
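
For context, the objective these WGAN entries refer to is, in its standard Kantorovich–Rubinstein dual form,

\[
  \min_{G}\;\max_{\|f\|_{\mathrm{Lip}}\le 1}\;
  \mathbb{E}_{x\sim P_r}\big[f(x)\big] \;-\; \mathbb{E}_{z\sim p(z)}\big[f(G(z))\big],
\]

where the inner maximum equals the 1-Wasserstein distance between the data distribution $P_r$ and the generator distribution; the $(q,p)$ variants above compare different ground metrics and orders entering this construction.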

[PDF] arxiv.org

Alpha Procrustes metrics between positive definite operators: a unifying formulation for the Bures-Wasserstein and Log-Euclidean/Log-Hilbert-Schmidt metrics

MH Quang - arXiv preprint arXiv:1908.09275, 2019 - arxiv.org

This work presents a parametrized family of distances, namely the Alpha Procrustes

distances, on the set of symmetric, positive definite (SPD) matrices. The Alpha Procrustes

distances provide a unified formulation encompassing both the Bures-Wasserstein and Log …

  Cited by 4 Related articles All 2 versions 


[PDF] arxiv.org

The quadratic Wasserstein metric for inverse data matching

K Ren, Y Yang - arXiv preprint arXiv:1911.06911, 2019 - arxiv.org

This work characterizes, analytically and numerically, two major effects of the quadratic

Wasserstein ($ W_2 $) distance as the measure of data discrepancy in computational

solutions of inverse problems. First, we show, in the infinite-dimensional setup, that the …

  Cited by 1 Related articles 

 

 2019


 Use of the Wasserstein Metric to Solve the Inverse Dynamic Seismic Problem

AA Vasilenko - Geomodel 2019, 2019 - earthdoc.org

The inverse dynamic seismic problem consists in recovering the velocity model of elastic

medium based on the observed seismic data. In this work full waveform inversion method is

used to solve this problem. It consists in minimizing an objective functional measuring the …

  Related articles


BIRCH Algorithm and Wasserstein Distance Metric Based Method for Generating Typical Scenarios of Wind Power Outputs

Q Li, X Tang, C Chen, X Liu, S Liu, X Shi… - … -Asia (ISGT Asia), 2019 - ieeexplore.ieee.org

With the ever-increasing penetration of renewable energy generation such as wind power

and solar photovoltaics, the power system concerned is suffering more extensive and

significant uncertainties. Scenario analysis has been utilized to solve this problem for power …



[HTML] oup.com

The gromov–wasserstein distance between networks and stable network invariants

S Chowdhury, F Mémoli - Information and Inference: A Journal of …, 2019 - academic.oup.com

We define a metric—the network Gromov–Wasserstein distance—on weighted, directed

networks that is sensitive to the presence of outliers. In addition to proving its theoretical

properties, we supply network invariants based on optimal transport that approximate this …

  Cited by 19 Related articles All 5 versions


[PDF] arxiv.org

Quantum Wasserstein generative adversarial networks

S Chakrabarti, Y Huang, T Li, S Feizi, X Wu - arXiv preprint arXiv …, 2019 - arxiv.org

The study of quantum generative models is well-motivated, not only because of its

importance in quantum machine learning and quantum chemistry but also because of the

perspective of its implementation on near-term quantum machines. Inspired by previous …

  Cited by 11 Related articles All 3 versions 


[PDF] arxiv.org

Wasserstein-2 generative networks

A Korotin, V Egiazarian, A Asadulaev, A Safin… - arXiv preprint arXiv …, 2019 - arxiv.org

Generative Adversarial Networks training is not easy due to the minimax nature of the

optimization objective. In this paper, we propose a novel end-to-end algorithm for training

generative models which uses a non-minimax objective simplifying model training. The …

  Cited by 8 Related articles All 3 versions 

<——2019—–—2019 ——1370—  


Evasion attacks based on wasserstein generative adversarial network

J Zhang, Q Yan, M Wang - 2019 Computing, Communications …, 2019 - ieeexplore.ieee.org

… Unfortunately, GAN has a problem of instability during training, and WGAN, which uses

Wasserstein distance calculations to calculate the difference between generating data and real …

 Cited by 5 Related articles

[PDF] nber.org

Using wasserstein generative adversarial networks for the design of monte carlo simulations

S Athey, GW Imbens, J Metzger, EM Munro - 2019 - nber.org

When researchers develop new econometric methods it is common practice to compare the

performance of the new methods to those of existing methods in Monte Carlo studies. The

credibility of such Monte Carlo studies is often limited because of the freedom the researcher …

  Cited by 13 Related articles All 8 versions 


[PDF] arxiv.org

Investigating under and overfitting in wasserstein generative adversarial networks

B Adlam, C Weill, A Kapoor - arXiv preprint arXiv:1910.14137, 2019 - arxiv.org

We investigate under and overfitting in Generative Adversarial Networks (GANs), using

discriminators unseen by the generator to measure generalization. We find that the model

capacity of the discriminator has a significant effect on the generator's model quality, and …

  Cited by 6 Related articles All 3 versions 

Investigating Under and Overfitting in Wasserstein Generative Adversarial Networks

A Kapoor, B Adlam, C Weill - 2019 - research.google

We investigate under and overfitting in Generative Adversarial Networks (GANs), using

discriminators unseen by the generator to measure generalization. We find that the model

capacity of the discriminator has a significant effect on the generator's model quality, and …


Music Classification using Multiclass Support Vector Machine and Multilevel Wasserstein Means

J Wei, C Jin, Z Cheng, X Lv… - 2019 IEEE/ACIS 18th …, 2019 - ieeexplore.ieee.org

Music classification is a challenging task in music information retrieval. In this article, we

compare the performance of the two types of models. The first category is classified by

Support Vector Machine (SVM). We use the feature extraction from audio as the basis of …

  Related articles All 2 versions


[PDF] ieee.org

Multi-source medical image fusion based on Wasserstein generative adversarial networks

Z Yang, Y Chen, Z Le, F Fan, E Pan - IEEE Access, 2019 - ieeexplore.ieee.org

In this paper, we propose the medical Wasserstein generative adversarial networks

(MWGAN), an end-to-end model, for fusing magnetic resonance imaging (MRI) and positron

emission tomography (PET) medical images. Our method establishes two adversarial …

  Cited by 6 Related articles


2019

[PDF] arxiv.org

Parameter estimation for biochemical reaction networks using Wasserstein distances

K Öcal, R Grima, G Sanguinetti - Journal of Physics A …, 2019 - iopscience.iop.org

We present a method for estimating parameters in stochastic models of biochemical reaction

networks by fitting steady-state distributions using Wasserstein distances. We simulate a

reaction network at different parameter settings and train a Gaussian process to learn the …

  Cited by 6 Related articles All 7 versions
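
A minimal Python sketch of the underlying idea (fitting parameters by minimizing a Wasserstein distance between simulated and observed steady-state samples); the Poisson simulator and the grid search below are hypothetical stand-ins for illustration, not the Gaussian-process method of the paper:

import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
observed = rng.poisson(lam=7.0, size=2000)       # pretend measured copy numbers

def simulate(rate, n=2000):
    # Stand-in for a stochastic reaction-network simulator at a given parameter.
    return rng.poisson(lam=rate, size=n)

candidate_rates = np.linspace(1.0, 15.0, 57)
losses = [wasserstein_distance(observed, simulate(r)) for r in candidate_rates]
best = candidate_rates[int(np.argmin(losses))]
print(f"estimated rate ~ {best:.2f} (true value 7.0)")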


[PDF] arxiv.org

Wasserstein stability of porous medium-type equations on manifolds with Ricci curvature bounded below

N De Ponti, M Muratori, C Orrieri - arXiv preprint arXiv:1908.03147, 2019 - arxiv.org

Given a complete, connected Riemannian manifold $\mathbb {M}^ n $ with Ricci curvature

bounded from below, we discuss the stability of the solutions of a porous medium-type

equation with respect to the 2-Wasserstein distance. We produce (sharp) stability estimates …

  Cited by 1 Related articles All 3 versions 

[PDF] rit.edu

A Comparative Assessment of the Impact of Various Norms on Wasserstein Generative Adversarial Networks

C Ramesh - 2019 - scholarworks.rit.edu

Abstract Generative Adversarial Networks (GANs) provide a fascinating new paradigm in

machine learning and artificial intelligence, especially in the context of unsupervised

learning. GANs are quickly becoming a state of the art tool, used in various applications …

  Related articles All 2 versions 


Wasserstein-Bounded Generative Adversarial Networks

P Zhou, B Ni, L Xie, X Zhang, H Wang, C Geng, Q Tian - 2019 - openreview.net

In the field of Generative Adversarial Networks (GANs), how to design a stable training

strategy remains an open problem. Wasserstein GANs have largely promoted the stability

over the original GANs by introducing Wasserstein distance, but still remain unstable and …

  

Data augmentation method of sar image dataset based on wasserstein generative adversarial networks

Q Lu, H Jiang, G Li, W Ye - 2019 International conference on …, 2019 - ieeexplore.ieee.org

The published Synthetic Aperture Radar (SAR) samples are not abundant enough, which is

not conducive to the application of deep learning methods in the field of SAR automatic

target recognition. Generative Adversarial Nets (GANs) is one of the most effective ways to …

  Cited by 1 Related articles All 2 versions

 


<——2019—–—2019 ——1380—

Weibo Authorship Identification based on Wasserstein generative adversarial networks

W Tang, C Wu, X Chen, Y Sun… - 2019 IEEE International …, 2019 - ieeexplore.ieee.org

During the past years, authorship identification has played a significant role in the public

security area. Recently, deep learning based approaches have been used in authorship

identification. However, all approaches based on deep learning require a large amount of …

  Related articles

[PDF] koreascience.or.kr

Combining multi-task autoencoder with Wasserstein generative adversarial networks for improving speech recognition performance

CY Kao, H Ko - The Journal of the Acoustical Society of Korea, 2019 - koreascience.or.kr

As the presence of background noise in acoustic signal degrades the performance of

speech or acoustic event recognition, it is still challenging to extract noise-robust acoustic

features from noisy signal. In this paper, we propose a combined structure of Wasserstein  …

  Related articles All 3 versions 

[PDF] utwente.nl

Wasserstein Generative Adversarial Privacy Networks

KE Mulder - 2019 - essay.utwente.nl

A method to filter private data from public data using generative adversarial networks has

been introduced in an article" Generative Adversarial Privacy" by Chong Huang et al. in

2018. We attempt to reproduce their results, and build further upon their work by introducing …

  Related articles All 2 versions 


[PDF] ceur-ws.org

[PDF] Dialogue response generation with Wasserstein generative adversarial networks

SAS Gilani, E Jembere, AW Pillay - 2019 - ceur-ws.org

This research evaluates the effectiveness of a Generative Adversarial Network (GAN) for

open domain dialogue response systems. The research involves developing and evaluating

a Conditional Wasserstein GAN (CWGAN) for natural dialogue response generation. We …

  Related articles 

2019

 paperswithcode.com › paper › painting-halos-from-3d-...

Painting halos from 3D dark matter fields using Wasserstein ...

Painting halos from 3D dark matter fields using Wasserstein mapping networks. 25 Mar 2019 • Doogesh Kodi Ramanah • Tom Charnock • Guilhem Lavaux ... novel halo painting network that learns to map approximate 3D dark matter fields to ...

[CITATION] Painting halos from 3D dark matter fields using Wasserstein mapping networks

D Kodi Ramanah, T Charnock, G Lavaux - arXiv preprint arXiv:1903.10524, 2019

 

2019


AEWGAN 이용한 고차원 불균형 데이터 이상 탐지 - 대한산업공학회 ...

www.dbpia.co.kr › articleDetail


[Korean: High-dimensional imbalanced data anomaly detection using AEWGAN - The Korean Society of Industrial Engineers ...]

[CITATION] AEWGAN 이용한 고차원 불균형 데이터 이상 탐지

송승환, 백준걸 - Proceedings of the Korean Institute of Industrial Engineers Fall Conference, 2019 - dbpia.co.kr

… 2) WGAN (Wasserstein Generative Adversarial Networks) - oversampling - existing oversampling methods have

the problem of not making use of the data distribution … this is called AEWGAN (Autoencoder Wasserstein

Generative Adversarial Networks) … 4. Experimental results 1) Description of the data …

Related articles


Distributionally Robust Learning Under the Wasserstein Metric

by R Chen - 2019 - Boston University     Related articles

The remainder of this dissertation is organized as follows. In Chapter 2, we develop the Wasserstein DRO formulation for linear regression under absolute error ...

Cited by 1 Related articles All 3 versions

Normalized wasserstein for mixture distributions with applications in adversarial learning and domain adaptation

Y Balaji, R Chellappa, S Feizi - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

Understanding proper distance measures between distributions is at the core of several

learning tasks such as generative models, domain adaptation, clustering, etc. In this work,

we focus on mixture distributions that arise naturally in several application domains where …

  Cited by 9 Related articles All 4 versions 


[PDF] uclouvain.be

Curvature of the manifold of fixed-rank positive-semidefinite matrices endowed with the Bures–Wasserstein metric

E Massart, JM Hendrickx, PA Absil - International Conference on …, 2019 - Springer

We consider the manifold of rank-p positive-semidefinite matrices of size n, seen as a

quotient of the set of full-rank n-by-p matrices by the orthogonal group in dimension p. The

resulting distance coincides with the Wasserstein distance between centered degenerate …

  Cited by 6 Related articles All 5 versions


[PDF] arxiv.org

Normalized Wasserstein distance for mixture distributions with applications in adversarial learning and domain adaptation

Y Balaji, R Chellappa, S Feizi - arXiv preprint arXiv:1902.00415, 2019 - arxiv.org

Understanding proper distance measures between distributions is at the core of several

learning tasks such as generative models, domain adaptation, clustering, etc. In this work,

we focus on mixture distributions that arise naturally in several application domains where …

  Cited by 5 Related articles All 2 versions 

<——2019—–—2019 ——1390— 


[PDF] nsf.gov

Distributions with Maximum Spread Subject to Wasserstein Distance Constraints

JG Carlsson, Y Wang - Journal of the Operations Research Society of …, 2019 - Springer

Recent research on formulating and solving distributionally robust optimization problems

has seen many different approaches for describing one's ambiguity set, such as constraints

on first and second moments or quantiles. In this paper, we use the Wasserstein distance to …

  Related articles All 3 versions


[PDF] arxiv.org

Random Matrix-Improved Estimation of the Wasserstein Distance between two Centered Gaussian Distributions

M Tiomoko, R Couillet - 2019 27th European Signal Processing …, 2019 - ieeexplore.ieee.org

This article proposes a method to consistently estimate functionals $\frac{1}{p}\sum_{i=1}^{p} f(\lambda_i(C_1 C_2))$

of the eigenvalues of the product of two covariance matrices $C_1, C_2 \in \mathbb{R}^{p\times p}$, based on the

empirical estimates $\lambda_i(\hat C_1 \hat C_2)$ (with $\hat C_a = \frac{1}{n_a}\sum_{i=1}^{n_a} x_i^{(a)} x_i^{(a)\top}$), when the size $p$ and …

  Cited by 1 Related articles All 7 versions


Prioritized Experience Replay based on the Wasserstein Metric in Deep Reinforcement Learning: The regularizing effect of modelling return distributions

T Greevink - 2019 - repository.tudelft.nl

This thesis tests the hypothesis that distributional deep reinforcement learning (RL)

algorithms get an increased performance over expectation based deep RL because of the

regularizing effect of fitting a more complex model. This hypothesis was tested by comparing …

2019

De Novo Protein Design for Novel Folds using Guided Conditional Wasserstein Generative Adversarial Networks (gcWGAN)

M Karimi, S Zhu, Y Cao, Y Shen - bioRxiv, 2019 - biorxiv.org

Motivation Facing data quickly accumulating on protein sequence and structure, this study is

addressing the following question: to what extent could current data alone reveal deep

insights into the sequence-structure relationship, such that new sequences can be designed …

  Cited by 6 Related articles All 4 versions 


[PDF] biorxiv.org

[PDF] De Novo Protein Design for Novel Folds using Guided Conditional Wasserstein Generative Adversarial Networks (gcWGAN)(Supporting Information)

M Karimi, S Zhu, Y Cao, Y Shen - Small - biorxiv.org

2.1 Methods Using a representative protein structure chosen by SCOPe for each of the

1,232 folds, we construct a pairwise similarity matrix of symmetrized TM scores (Zhang and

Skolnick, 2004) and added a properly-scaled identity matrix to it to make a positive-definite …

  Related articles 

year 2019 modified

[PDF] biorxiv.org

[PDF] De Novo Protein Design for Novel Folds using Guided Conditional Wasserstein Generative Adversarial Networks (gcWGAN)(Supporting Information)

M Karimi, S Zhu, Y Cao, Y Shen - Small - biorxiv.org

2.1 Methods Using a representative protein structure chosen by SCOPe for each of the 1,232

folds, we construct a pairwise similarity matrix of symmetrized TM scores (Zhang and Skolnick,

2004) and added a properly-scaled identity matrix to it to make a positive-definite Gram …

  Related articles

2019


[PDF] mdpi.com

Wasserstein Generative Adversarial Network Based De-Blurring Using Perceptual Similarity

M Hong, Y Choe - Applied Sciences, 2019 - mdpi.com

The de-blurring of blurred images is one of the most important image processing methods

and it can be used for the preprocessing step in many multimedia and computer vision

applications. Recently, de-blurring methods have been performed by neural network …

  Cited by 1 Related articles All 4 versions 


Commande Optimale dans les Espaces de Wasserstein

B Bonnet - 2019 - theses.fr

Abstract: A vast array of mathematical tools for modelling and analysing

multi-agent problems has recently been developed within the framework of optimal

transport theory. In this thesis, we extend for the first time several of these …

  

Projection au sens de Wasserstein 2 sur des espaces structurés de mesures

L Lebrat - 2019 - theses.fr

Abstract: This thesis is concerned with the approximation, in the 2-Wasserstein metric, of

probability measures by a structured measure. The structured measures studied are

consistent discretizations of measures carried by continuous curves with speed and …

  

[PDF] theses.fr

Processus de diffusion sur l'espace de Wasserstein: modèles coalescents, propriétés de régularisation et équations de McKean-Vlasov

V Marx - 2019 - theses.fr

Abstract: The thesis aims to study a class of stochastic processes taking values in

the space of probability measures on the real line, called the Wasserstein space

when it is equipped with the Wasserstein metric W2. This work mainly addresses the …

  Related articles All 3 versions 


[PDF] sfds.asso.fr

[PDF] Méthode de couplage en distance de Wasserstein pour la théorie des valeurs extrêmes

B Bobbia, C DombryD Varron - jds2019.sfds.asso.fr

We propose a rereading of classical results from extreme value theory, which we

study using the tools provided by optimal transport theory. In this

framework, the normality of the estimators can be viewed as a convergence of …

  Related articles All 2 versions 

<——2019—–—2019 ——1400— 


[PDF] uniandes.edu.co

[PDF] Problemas de clasificación: una perspectiva robusta con la métrica de Wasserstein

JA Acosta Melo - repositorio.uniandes.edu.co

The central objective of this work is to give context to classification problems for

the cases of support vector machines and logistic regression. The central idea is to tackle

these problems with a robust approach with the help of the Wasserstein metric, which is …

  Related articles 


DE NOVO PROTEIN DESIGN OF NOVEL FOLDS USING GUIDED CONDITIONAL WASSERSTEIN GENERATIVE ADVERSARIAL NETWORKS (GCWGAN)

S Zhu - 2019 - oaktrust.library.tamu.edu

In the research areas about proteins, it is always a significant topic to detect the

sequence-structure-function relationship. Fundamental questions remain for this topic: How

much could current data alone reveal deep insights about such relationship? And how much …

 

 2019

 [PDF] archives-ouvertes.fr

Courbes et applications optimales à valeurs dans l'espace de Wasserstein

H Lavenant - 2019 - tel.archives-ouvertes.fr

The Wasserstein space is the set of probability measures defined on a

fixed domain and equipped with the quadratic Wasserstein distance. In this work, we

study variational problems in which the unknowns are maps taking values in …

  Cited by 1 Related articles All 11 versions 

[CITATION] Courbes et applications optimales à valeurs dans l'espace de Wasserstein

P CARDALIAGUET - 2019 - Université Paris-Dauphine

  Related articles



Distribuciones de Máxima Entropía en Bolas de Wasserstein

https://math.uniandes.edu.co › Presentacion-vargas  [PDF]

by LFV Beltrán — Luis Felipe Vargas Beltrán (Universidad de Los Andes). Distribuciones de Máxima Entropía en Bolas de Wasserstein, 28 de Mayo, 2019. 26 pages

[Spanish: Maximum entropy distributions in Wasserstein balls]

[CITATION] Distribuciones de máxima entropía en bolas de Wasserstein

LFV Beltrán - 2019 - Uniandes

2019  PDF

hal.archives-ouvertes.fr › tel-02361353 › document

Optimal Control in Wasserstein Spaces - Archive ouverte HAL

Nov 13, 2019 — Riemannian structure of Wasserstein spaces. Subsequently, we ... and their guidance in launching me into the academic world. Many thanks therefore to ... Optimal Control Problems in Wasserstein Spaces. 'Variational Analysis ...

[CITATION] Optimal Control in Wasserstein Spaces (Commande Optimale dans les Espaces de Wasserstein).

B Bonnet - 2019 - Aix-Marseille University, France

[French: Optimal Control in Wasserstein Spaces]


2019 book
Wasserstein Variational Inference
Authors: L. Ambrogioni, U. Güçlü, Y. Güçlütürk, M. Hinne, M. van Gerven, E. Maris, S. Bengio, H. Wallach, H. Larochelle, K. Grauman
Summary:This paper introduces Wasserstein variational inference, a new form of approximate Bayesian inference based on optimal transport theory. Wasserstein variational inference uses a new family of divergences that includes both f-divergences and the Wasserstein distance as special cases. The gradients of the Wasserstein variational loss are obtained by backpropagating through the Sinkhorn iterations. This technique results in a very stable likelihood-free training method that can be used with implicit distributions and probabilistic programs. Using the Wasserstein variational inference framework, we introduce several new forms of autoencoders and test their robustness and performance against existing variational autoencoding techniques
Book, 2019
Publication:2019
Publisher:Neural Information Processing Systems Foundation, 2019
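
The Sinkhorn iterations referred to in this summary are, in generic form, the alternating scalings below; this is a self-contained numpy sketch of entropic optimal transport (the histograms a, b, cost matrix M and regularization eps are illustrative inputs), not the paper's training code, and the iterations being plain array operations is what makes them differentiable:

import numpy as np

def sinkhorn(a, b, M, eps=0.1, n_iters=500):
    # Entropic optimal transport between histograms a and b with cost matrix M.
    K = np.exp(-M / eps)                 # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)                # alternate scaling updates
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]      # regularized transport plan
    return P, float(np.sum(P * M))       # plan and its transport cost

n = 5
a = np.full(n, 1.0 / n)
b = np.full(n, 1.0 / n)
M = np.abs(np.subtract.outer(np.arange(n), np.arange(n))).astype(float)
P, cost = sinkhorn(a, b, M)
print(cost)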

2019

[PDF] arxiv.org

Wasserstein dependency measure for representation learning

S Ozair, C Lynch, Y Bengio, A Oord, S Levine… - arXiv preprint arXiv …, 2019 - arxiv.org

Mutual information maximization has emerged as a powerful learning objective for

unsupervised representation learning obtaining state-of-the-art performance in applications

such as object recognition, speech recognition, and reinforcement learning. However, such …

Cited by 164 Related articles All 8 versions


[PDF] arxiv.org

Wasserstein adversarial regularization (WAR) on label noise

K Fatras, BB Damodaran, S Lobry, R Flamary… - arXiv preprint arXiv …, 2019 - arxiv.org

… regularization scheme based on the Wasserstein distance. Using this … The OT problem

seeks an optimal coupling T … OT distances are classically expressed through the Wasserstein …

Cited by 6 Related articles All 3 versions 

[PDF] arxiv.org

Wasserstein robust reinforcement learning

MA Abdullah, H Ren, HB Ammar, V Milenkovic… - arXiv preprint arXiv …, 2019 - arxiv.org

Reinforcement learning algorithms, though successful, tend to over-fit to training

environments hampering their application to the real-world. This paper proposes $\text

{W}\text {R}^{2}\text {L} $--a robust reinforcement learning algorithm with significant robust …

Cited by 33 Related articles All 7 versions 

[PDF] arxiv.org

Learning with minibatch Wasserstein: asymptotic and gradient properties

K Fatras, Y Zine, R Flamary, R Gribonval… - arXiv preprint arXiv …, 2019 - arxiv.org

Optimal transport distances are powerful tools to compare probability distributions and have

found many applications in machine learning. Yet their algorithmic complexity prevents their

direct use on large scale datasets. To overcome this challenge, practitioners compute these …

Cited by 29 Related articles All 24 versions
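
A small Python illustration of the minibatch strategy discussed here (a generic sketch, not the paper's estimator or its analysis); for uniform weights and equal batch sizes each minibatch optimal-transport problem reduces to an assignment problem, and the average over random batches gives a cheap but biased estimate:

import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def batch_ot_cost(xb, yb):
    C = cdist(xb, yb)                    # pairwise ground costs
    r, c = linear_sum_assignment(C)      # optimal matching = OT for uniform weights
    return C[r, c].mean()

def minibatch_wasserstein(x, y, batch_size=64, n_batches=50, seed=None):
    rng = np.random.default_rng(seed)
    costs = []
    for _ in range(n_batches):
        xb = x[rng.choice(len(x), batch_size, replace=False)]
        yb = y[rng.choice(len(y), batch_size, replace=False)]
        costs.append(batch_ot_cost(xb, yb))
    return float(np.mean(costs))

x = np.random.default_rng(0).normal(size=(1000, 5))
y = np.random.default_rng(1).normal(loc=0.3, size=(1000, 5))
print(minibatch_wasserstein(x, y, seed=2))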


[PDF] arxiv.org

Donsker's theorem in Wasserstein-1 distance

L Coutin, L Decreusefond - arXiv preprint arXiv:1904.07045, 2019 - arxiv.org

We compute the Wasserstein-1 (or Kolmogorov-Rubinstein) distance between a random walk

in $ R^ d $ and the Brownian motion. The proof is based on a new estimate of the Lipschitz

modulus of the solution of the Stein's equation. As an application, we can evaluate the rate …

  Cited by 1 Related articles All 17 versions 

<—-2019—–—2019 ——1410—

Wasserstein adversarial imitation learning

H Xiao, M Herman, J Wagner, S Ziesche… - arXiv preprint arXiv …, 2019 - arxiv.org

Imitation Learning describes the problem of recovering an expert policy from

demonstrations. While inverse reinforcement learning approaches are known to be very

sample-efficient in terms of expert demonstrations, they usually require problem-dependent …

 Cited by 37 Related articles All 3 versions 

[PDF] arxiv.org

Asymptotic guarantees for learning generative models with the sliced-wasserstein distance

K Nadjahi, A Durmus, U Şimşekli, R Badeau - arXiv preprint arXiv …, 2019 - arxiv.org

Minimum expected distance estimation (MEDE) algorithms have been widely used for

probabilistic models with intractable likelihood functions and they have become increasingly

popular due to their use in implicit generative modeling (eg Wasserstein generative …

Cited by 29 Related articles All 10 versions 

[PDF] thecvf.com

Normalized wasserstein for mixture distributions with applications in adversarial learning and domain adaptation

Y Balaji, R Chellappa, S Feizi - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

Understanding proper distance measures between distributions is at the core of several

learning tasks such as generative models, domain adaptation, clustering, etc. In this work,

we focus on mixture distributions that arise naturally in several application domains where …

  Cited by 9 Related articles All 4 versions 


Wasserstein generative learning with kinematic constraints for probabilistic interactive driving behavior prediction

H Ma, J Li, W Zhan, M Tomizuka - 2019 IEEE Intelligent …, 2019 - ieeexplore.ieee.org

Since prediction plays a significant role in enhancing the performance of decision making

and planning procedures, the requirement of advanced methods of prediction becomes

urgent. Although many literatures propose methods to make prediction on a single agent …

Cited by 23 Related articles

2019  [PDF] arxiv.org

Wasserstein distance based deep adversarial transfer learning for intelligent fault diagnosis

C Cheng, B Zhou, G Ma, D Wu, Y Yuan - arXiv preprint arXiv:1903.06753, 2019 - arxiv.org

The demand of artificial intelligent adoption for condition-based maintenance strategy is

astonishingly increased over the past few years. Intelligent fault diagnosis is one critical

topic of maintenance solution for mechanical systems. Deep learning models, such as …

  Cited by 16 Related articles All 3 versions 

2019


[PDF] arxiv.org

Semi-supervised multitask learning on multispectral satellite images using wasserstein generative adversarial networks (gans) for predicting poverty

A Perez, S Ganguli, S Ermon, G Azzari, M Burke… - arXiv preprint arXiv …, 2019 - arxiv.org

Obtaining reliable data describing local poverty metrics at a granularity that is informative to

policy-makers requires expensive and logistically difficult surveys, particularly in the

developing world. Not surprisingly, the poverty stricken regions are also the ones which …

  Cited by 21 Related articles All 4 versions 

[PDF] arxiv.org

Normalized Wasserstein distance for mixture distributions with applications in adversarial learning and domain adaptation

Y Balaji, R Chellappa, S Feizi - arXiv preprint arXiv:1902.00415, 2019 - arxiv.org

Understanding proper distance measures between distributions is at the core of several

learning tasks such as generative models, domain adaptation, clustering, etc. In this work,

we focus on mixture distributions that arise naturally in several application domains where …

  Cited by 5 Related articles All 2 versions 


[PDF] polimi.it

Propagating uncertainty in reinforcement learning via wasserstein barycenters

AM Metelli, A Likmeta, M Restelli - 33rd Conference on Neural …, 2019 - re.public.polimi.it

How does the uncertainty of the value function propagate when performing temporal

difference learning? In this paper, we address this question by proposing a Bayesian

framework in which we employ approximate posterior distributions to model the uncertainty …

  Cited by 9 Related articles All 8 versions 

[PDF] arxiv.org

Disentangled representation learning with Wasserstein total correlation

Y Xiao, WY Wang - arXiv preprint arXiv:1912.12818, 2019 - arxiv.org

Unsupervised learning of disentangled representations involves uncovering of different

factors of variations that contribute to the data generation process. Total correlation

Cited by 6 Related articles All 2 versions 


[PDF] arxiv.org

Learning embeddings into entropic wasserstein spaces

C Frogner, F Mirzazadeh, J Solomon - arXiv preprint arXiv:1905.03329, 2019 - arxiv.org

Euclidean embeddings of data are fundamentally limited in their ability to capture latent

semantic structures, which need not conform to Euclidean spatial assumptions. Here we

consider an alternative, which embeds data as discrete probability distributions in a …

Cited by 12 Related articles All 7 versions 

<—-2019—–—2019 ——1420—  


[PDF] arxiv.org

1-Wasserstein Distance on the Standard Simplex

A Frohmader, H Volkmer - arXiv preprint arXiv:1912.04945, 2019 - arxiv.org

Wasserstein distances provide a metric on a space of probability measures. We consider the

space $\Omega $ of all probability measures on the finite set $\chi=\{1,\dots, n\} $ where $ n

$ is a positive integer. 1-Wasserstein distance, $ W_1 (\mu,\nu) $ is a function from …

  Cited by 1 Related articles All 2 versions 
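
A quick numerical check of the 1-Wasserstein distance on such a finite set, assuming the line ground metric d(i, j) = |i - j| (which need not be the metric studied in the paper), using the classical closed form via cumulative sums:

import numpy as np

def w1_on_line(mu, nu):
    # W1 between probability vectors on {1, ..., n} with ground metric |i - j|:
    # the sum of absolute differences of the two cumulative distribution functions.
    return float(np.abs(np.cumsum(mu - nu)).sum())

mu = np.array([0.5, 0.5, 0.0, 0.0])
nu = np.array([0.0, 0.0, 0.5, 0.5])
print(w1_on_line(mu, nu))   # 2.0: each half unit of mass moves two positions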


[PDF] arxiv.org

Zero-Sum Differential Games on the Wasserstein Space

J Moon, T Basar - arXiv preprint arXiv:1912.06084, 2019 - arxiv.org

We consider two-player zero-sum differential games (ZSDGs), where the state process

(dynamical system) depends on the random initial condition and the state process's

distribution, and the objective functional includes the state process's distribution and the …

  Cited by 1 Related articles All 2 versions 


[PDF] aaai.org

EWGAN: Entropy-based Wasserstein GAN for imbalanced learning

J Ren, Y Liu, J Liu - Proceedings of the AAAI Conference on Artificial …, 2019 - ojs.aaai.org

In this paper, we propose a novel oversampling strategy dubbed Entropy-based

Wasserstein Generative Adversarial Network (EWGAN) to generate data samples for

minority classes in imbalanced learning. First, we construct an entropyweighted label vector …

  Cited by 1 Related articles All 7 versions 


Distributionally robust learning under the wasserstein metric

R Chen - 2019 - search.proquest.com

This dissertation develops a comprehensive statistical learning framework that is robust to

(distributional) perturbations in the data using Distributionally Robust Optimization (DRO)

under the Wasserstein metric. The learning problems that are studied include:(i) …

  Cited by 1 Related articles All 3 versions


[PDF] arxiv.org

Wasserstein Distance Guided Cross-Domain Learning

J Su - arXiv preprint arXiv:1910.07676, 2019 - arxiv.org

Domain adaptation aims to generalise a high-performance learner on target domain (non-

labelled data) by leveraging the knowledge from source domain (rich labelled data) which

comes from a different but related distribution. Assuming the source and target domains data …

  Related articles All 2 versions 


2019


Adversarial Learning for Cross-Modal Retrieval with Wasserstein Distance

Q Cheng, Y Zhang, X Gu - International Conference on Neural Information …, 2019 - Springer

This paper presents a novel approach for cross-modal retrieval in an Adversarial Learning

with Wasserstein Distance (ALWD) manner, which aims at learning aligned representation

for various modalities in a GAN framework. The generator projects the image and the text …

  Related articles


Distributionally robust XVA via wasserstein distance part 1: Wrong way counterparty credit risk

D Singh, S Zhang - Unknown Journal, 2019 - experts.umn.edu

This paper investigates calculations of robust CVA for OTC derivatives under distributional

uncertainty using Wasserstein distance as the ambiguity measure. Wrong way counterparty

credit risk can be characterized (and indeed quantified) via the robust CVA formulation. The …

  [CITATION] Distributionally robust xva via wasserstein distance part 1

D Singh, S Zhang - arXiv preprint arXiv:1910.01781, 2019

  Cited by 3 Related articles


Learning embeddings into entropic wasserstein spaces

C Frogner, F Mirzazadeh, J Solomon - arXiv preprint arXiv:1905.03329, 2019 - arxiv.org

Euclidean embeddings of data are fundamentally limited in their ability to capture latent

semantic structures, which need not conform to Euclidean spatial assumptions. Here we

consider an alternative, which embeds data as discrete probability distributions in a …

  Cited by 3 Related articles All 7 versions 

[CITATION] Learning entropic Wasserstein embeddings

C Frogner, F Mirzazadeh, J Solomon - International Conference on Learning …, 2019

  Cited by 5 Related articles


Package 'transport'

cran.r-project.org › web › packages › transport

PDF

Aug 7, 2019 — Version 0.12-2. Date 2020-03-11. Title Computation of Optimal Transport Plans and Wasserstein Distances. Maintainer Dominic Schuhmacher ...

[CITATION] transport: Computation of Optimal Transport Plans and Wasserstein Distances, r package version 0.11-1

D Schuhmacher, B Bähre, C Gottschlich, V Hartmann… - 2019

  Cited by 2 Related articles


[PDF] mlr.press

Unsupervised alignment of embeddings with wasserstein procrustes

E Grave, A Joulin, Q Berthet - The 22nd International …, 2019 - proceedings.mlr.press

… for stochastic minimization of an objective involving a Wasserstein distance. Similarly, our 

algo… In Proceedings of the sixth annual ACM symposium on Theory of computing, pages 172–…

Cited by 124 Related articles All 5 versions
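
A small dense-problem sketch of the alternating idea behind Wasserstein Procrustes (orthogonal alignment alternated with optimal matching); the paper's actual method is a stochastic large-scale algorithm, so the function below, its iteration count and the toy data are only an illustration and may need restarts to avoid poor local optima:

import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def wasserstein_procrustes(X, Y, n_iters=20):
    Q = np.eye(X.shape[1])                       # current orthogonal map
    for _ in range(n_iters):
        C = cdist(X @ Q, Y) ** 2
        r, c = linear_sum_assignment(C)          # best one-to-one matching
        U, _, Vt = np.linalg.svd(X[r].T @ Y[c])  # orthogonal Procrustes step
        Q = U @ Vt
    return Q, c

rng = np.random.default_rng(0)
Y = rng.normal(size=(200, 8))
O, _ = np.linalg.qr(rng.normal(size=(8, 8)))     # hidden rotation
X = (Y @ O.T)[rng.permutation(200)]              # rotated and shuffled copy
Q, perm = wasserstein_procrustes(X, Y)           # Q should roughly recover O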

<——2019—–—2019 —–—1430—   

Wasserstein distributionally robust optimization: Theory and applications in machine learning

D Kuhn, PM Esfahani, VA Nguyen… - … Science in the Age …, 2019 - pubsonline.informs.org

Many decision problems in science, engineering, and economics are affected by uncertain

parameters whose distribution is only indirectly observable through samples. The goal of

data-driven decision making is to learn a decision from finitely many training samples that …

  Cited by 59 Related articles All 7 versions
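
The reformulation underlying much of this literature is the strong duality (a standard result in the Wasserstein DRO literature, stated here for context and valid under mild regularity conditions)

\[
  \sup_{Q:\; W_c(Q,\widehat{P}_N)\le \rho} \; \mathbb{E}_{Q}\big[\ell(\xi)\big]
  \;=\;
  \min_{\lambda \ge 0}\;
  \left\{ \lambda \rho + \frac{1}{N}\sum_{i=1}^{N}
  \sup_{\xi} \Big( \ell(\xi) - \lambda\, c(\xi,\widehat{\xi}_i) \Big) \right\},
\]

where $\widehat{P}_N$ is the empirical distribution of the samples $\widehat{\xi}_1,\dots,\widehat{\xi}_N$, $W_c$ is the optimal-transport distance with ground cost $c$, and $\rho$ is the radius of the ambiguity ball.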


Decomposition algorithm for distributionally robust optimization using Wasserstein metric with an application to a class of regression models

F Luo, S Mehrotra - European Journal of Operational Research, 2019 - Elsevier

We study distributionally robust optimization (DRO) problems where the ambiguity set is

defined using the Wasserstein metric and can account for a bounded support. We show that

this class of DRO problems can be reformulated as decomposable semi-infinite programs …

  Cited by 19 Related articles All 6 versions


[PDF] arxiv.org

Data-driven chance constrained optimization under Wasserstein ambiguity sets

AR Hota, A Cherukuri, J Lygeros - 2019 American Control …, 2019 - ieeexplore.ieee.org

We present a data-driven approach for distri-butionally robust chance constrained

optimization problems (DRCCPs). We consider the case where the decision maker has

access to a finite number of samples or realizations of the uncertainty. The chance constraint …

  Cited by 19 Related articles All 4 versions


Data-driven Wasserstein distributionally robust optimization for biomass with agricultural waste-to-energy network design under uncertainty

C Ning, F You - Applied Energy, 2019 - Elsevier

This paper addresses the problem of biomass with agricultural waste-to-energy network

design under uncertainty. We propose a novel data-driven Wasserstein distributionally

robust optimization model for hedging against uncertainty in the optimal network design …

  Cited by 12 Related articles All 8 versions


[PDF] arxiv.org

A first-order algorithmic framework for Wasserstein distributionally robust logistic regression

J Li, S Huang, AMC So - arXiv preprint arXiv:1910.12778, 2019 - arxiv.org

Wasserstein distance-based distributionally robust optimization (DRO) has received much

attention lately due to its ability to provide a robustness interpretation of various learning

models. Moreover, many of the DRO problems that arise in the learning context admits exact …

 Cited by 3 Related articles All 7 versions 

[CITATION] Anthony Man-Cho So. A first-order algorithmic framework for Wasserstein distributionally robust logistic regression

J Li, S Huang - Advances in Neural Information Processing Systems, 2019

 Cited by 4 Related articles


2019


[PDF] nsf.gov

A Distributionally Robust Optimization Approach for Multivariate Linear Regression under the Wasserstein Metric

R Chen, IC Paschalidis - 2019 IEEE 58th Conference on …, 2019 - ieeexplore.ieee.org

We present a Distributionally Robust Optimization (DRO) approach for Multivariate Linear

Regression (MLR), where multiple correlated response variables are to be regressed

against a common set of predictors. We develop a regularized MLR formulation that is robust …

  Related articles All 3 versions

Robust Wasserstein profile inference and applications to machine learning

J Blanchet, Y Kang, K Murthy - Journal of Applied Probability, 2019 - cambridge.org

We show that several machine learning estimators, including square-root least absolute

shrinkage and selection and regularized logistic regression, can be represented as

solutions to distributionally robust optimization problems. The associated uncertainty regions …

  Cited by 136 Related articles All 5 versions


[PDF] arxiv.org

Wasserstein distributionally robust optimization: Theory and applications in machine learning

D Kuhn, PM Esfahani, VA Nguyen… - … Science in the Age …, 2019 - pubsonline.informs.org

Many decision problems in science, engineering, and economics are affected by uncertain

parameters whose distribution is only indirectly observable through samples. The goal of

data-driven decision making is to learn a decision from finitely many training samples that …

Cited by 214 Related articles All 9 versions


[PDF] arxiv.org

Wasserstein metric-driven Bayesian inversion with applications to signal processing

M Motamed, D Appelo - International Journal for Uncertainty …, 2019 - dl.begellhouse.com

We present a Bayesian framework based on a new exponential likelihood function driven by

the quadratic Wasserstein metric. Compared to conventional Bayesian models based on

Gaussian likelihood functions driven by the least-squares norm (L 2 norm), the new …

  Cited by 8 Related articles All 4 versions


[PDF] arxiv.org

Data-driven chance constrained optimization under Wasserstein ambiguity sets

AR Hota, A Cherukuri, J Lygeros - 2019 American Control …, 2019 - ieeexplore.ieee.org

We present a data-driven approach for distri-butionally robust chance constrained

optimization problems (DRCCPs). We consider the case where the decision maker has

access to a finite number of samples or realizations of the uncertainty. The chance constraint …

  Cited by 19 Related articles All 4 versions

<——2019—–—2019 ——1440—  



Data-driven Wasserstein distributionally robust optimization for biomass with agricultural waste-to-energy network design under uncertainty

C Ning, F You - Applied Energy, 2019 - Elsevier

This paper addresses the problem of biomass with agricultural waste-to-energy network

design under uncertainty. We propose a novel data-driven Wasserstein distributionally

robust optimization model for hedging against uncertainty in the optimal network design …

  Cited by 12 Related articles All 8 versions


[PDF] arxiv.org

Data-driven regularization of Wasserstein barycenters with an application to multivariate density registration

J Bigot, E Cazelles, N Papadakis - Information and Inference: A …, 2019 - academic.oup.com

We present a framework to simultaneously align and smoothen data in the form of multiple

point clouds sampled from unknown densities with support in a-dimensional Euclidean

space. This work is motivated by applications in bioinformatics where researchers aim to …

  Cited by 8 Related articles All 8 versions


[PDF] arxiv.org

Data-Driven Distributionally Robust Appointment Scheduling over Wasserstein Balls

R Jiang, M Ryu, G Xu - arXiv preprint arXiv:1907.03219, 2019 - arxiv.org

We study a single-server appointment scheduling problem with a fixed sequence of

appointments, for which we must determine the arrival time for each appointment. We

specifically examine two stochastic models. In the first model, we assume that all appointees …

  Cited by 3 Related articles All 3 versions 


[PDF] mdpi.com

Data-driven distributionally robust stochastic control of energy storage for wind power ramp management using the Wasserstein metric

I Yang - Energies, 2019 - mdpi.com

The integration of wind energy into the power grid is challenging because of its variability,

which causes high ramp events that may threaten the reliability and efficiency of power

systems. In this paper, we propose a novel distributionally robust solution to wind power …

  Cited by 2 Related articles All 6 versions 


Data-driven distributionally robust shortest path problem using the Wasserstein ambiguity set

Z Wang, K You, S Song, C Shang - 2019 IEEE 15th …, 2019 - ieeexplore.ieee.org

This paper proposes a data-driven distributionally robust shortest path (DRSP) model where

the distribution of the travel time is only observable through a finite training dataset. Our

DRSP model adopts the Wasserstein metric to construct the ambiguity set of probability …

  Cited by 1 Related articles


2019

[PDF] lancs.ac.uk

Reproducing-Kernel Hilbert space regression with notes on the Wasserstein Distance

S Page - 2019 - eprints.lancs.ac.uk

We study kernel least-squares estimators for the regression problem subject to a norm

constraint. We bound the squared L2 error of our estimators with respect to the covariate

distribution. We also bound the worst-case squared L2 error of our estimators with respect to …

  Related articles All 5 versions 


[PDF] Rate of convergence in Wasserstein distance of piecewise-linear Lévy-driven SDEs

ARI ARAPOSTATHIS, G PANG… - arXiv preprint arXiv …, 2019 - researchgate.net

In this paper, we study the rate of convergence under the Wasserstein metric of a broad

class of multidimensional piecewise Ornstein–Uhlenbeck processes with jumps. These are

governed by stochastic differential equations having a piecewise linear drift, and a fairly …

  Related articles 


Music Classification using Multiclass Support Vector Machine and Multilevel Wasserstein Means

J Wei, C Jin, Z Cheng, X Lv… - 2019 IEEE/ACIS 18th …, 2019 - ieeexplore.ieee.org

Music classification is a challenging task in music information retrieval. In this article, we

compare the performance of the two types of models. The first category is classified by

Support Vector Machine (SVM). We use the feature extraction from audio as the basis of …

  Related articles All 2 versions


Decomposition algorithm for distributionally robust optimization using Wasserstein metric with an application to a class of regression models

F Luo, S Mehrotra - European Journal of Operational Research, 2019 - Elsevier

We study distributionally robust optimization (DRO) problems where the ambiguity set is

defined using the Wasserstein metric and can account for a bounded support. We show that

this class of DRO problems can be reformulated as decomposable semi-infinite programs …

 Cited by 23 Related articles All 6 versions


[PDF] arxiv.org

Data-driven regularization of Wasserstein barycenters with an application to multivariate density registration

J Bigot, E Cazelles, N Papadakis - Information and Inference: A …, 2019 - academic.oup.com

We present a framework to simultaneously align and smoothen data in the form of multiple

point clouds sampled from unknown densities with support in a-dimensional Euclidean

space. This work is motivated by applications in bioinformatics where researchers aim to …

  Cited by 8 Related articles All 8 versions

<——2019—–—2019 ——1450—  



[PDF] arxiv.org

2-wasserstein approximation via restricted convex potentials with application to improved training for gans

A Taghvaei, A Jalali - arXiv preprint arXiv:1902.07197, 2019 - arxiv.org

We provide a framework to approximate the 2-Wasserstein distance and the optimal

transport map, amenable to efficient training as well as statistical and geometric analysis.

With the quadratic cost and considering the Kantorovich dual form of the optimal …

  Cited by 9 Related articles All 3 versions 


[PDF] arxiv.org

A Wasserstein norm for signed measures, with application to nonlocal transport equation with source term

B Piccoli, F Rossi, M Tournus - arXiv preprint arXiv:1910.05105, 2019 - arxiv.org

We introduce the optimal transportation interpretation of the Kantorovich norm on the space

of signed Radon measures with finite mass, based on a generalized Wasserstein

distance for measures with different masses. With the formulation and the new topological …

  Cited by 3 Related articles All 7 versions 


[PDF] arxiv.org

Optimal Transport Relaxations with Application to Wasserstein GANs

S Mahdian, J Blanchet, P Glynn - arXiv preprint arXiv:1906.03317, 2019 - arxiv.org

We propose a family of relaxations of the optimal transport problem which regularize the

problem by introducing an additional minimization step over a small region around one of

the underlying transporting measures. The type of regularization that we obtain is related to …

  Related articles All 4 versions 


[PDF] arxiv.org

The existence of geodesics in Wasserstein spaces over path groups and loop groups

J Shao - Stochastic Processes and their Applications, 2019 - Elsevier

In this work we prove the existence and uniqueness of the optimal transport map for L p-

Wasserstein distance with p> 1, and particularly present an explicit expression of the optimal

transport map for the case p= 2. As an application, we show the existence of geodesics  …

  Related articles All 8 versions


[PDF] amazonaws.com

[PDF] Supplement to “Optimal Estimation of Wasserstein Distance on A Tree with An Application to Microbiome Studies”

S Wang, TT CaiH Li - pstorage-tf-iopjsd8797887.s3 …

Page 1. Supplement to “Optimal Estimation of Wasserstein Distance on A Tree with An Application

to Microbiome Studies” Shulei Wang, T. Tony Cai and Hongzhe Li University of Pennsylvania In

this supplementary material, we provide the proof for the main results (Section S1) and all the …

  Related articles All 3 versions 


2019


Nonlinear model reduction on metric spaces. Application to ...

arxiv.org › math

Sep 14, 2019 — Application to one-dimensional conservative PDEs in Wasserstein spaces. We give theoretical and numerical evidence of their efficiency to reduce complexity for one-dimensional conservative PDEs where the underlying metric space can be chosen to be the L^2-Wasserstein space. ...

[PDF] arxiv.org

[CITATION] Nonlinear model reduction on metric spaces. Application to one-dimensional conservative PDEs in Wasserstein spaces

V EhrlacherD LombardiO MulaFX Vialard - arXiv preprint arXiv:1909.06626, 2019

  Cited by 4 Related articles All 19 versions


[PDF] arxiv.org


Unsupervised adversarial domain adaptation based on the Wasserstein distance for acoustic scene classification

K Drossos, P Magron, T Virtanen - … Workshop on Applications of …, 2019 - ieeexplore.ieee.org

Wasserstein adversarial formulation As a second step, we aim at adapting MS to the target … 

to employ the order-1 Wasserstein distance (called Wasserstein distance from now on) W [18, …

 Cited by 28 Related articles All 9 versions

  

[PDF] arxiv.org

Asymptotic guarantees for learning generative models with the sliced-wasserstein distance

K Nadjahi, A Durmus, U Şimşekli, R Badeau - arXiv preprint arXiv …, 2019 - arxiv.org

Minimum expected distance estimation (MEDE) algorithms have been widely used for

probabilistic models with intractable likelihood functions and they have become increasingly

popular due to their use in implicit generative modeling (eg Wasserstein generative …

  Cited by 19 Related articles All 5 versions 


Robust Wasserstein profile inference and applications to machine learning

J Blanchet, Y Kang, K Murthy - Journal of Applied Probability, 2019 - cambridge.org

We show that several machine learning estimators, including square-root least absolute

shrinkage and selection and regularized logistic regression, can be represented as

solutions to distributionally robust optimization problems. The associated uncertainty regions …

  Cited by 136 Related articles All 5 versions



[PDF] arxiv.org

Wasserstein distributionally robust optimization: Theory and applications in machine learning

D Kuhn, PM Esfahani, VA Nguyen… - … Science in the Age …, 2019 - pubsonline.informs.org

Many decision problems in science, engineering, and economics are affected by uncertain

parameters whose distribution is only indirectly observable through samples. The goal of

data-driven decision making is to learn a decision from finitely many training samples that …

  Cited by 59 Related articles All 7 versions

<——2019—–—2019 ——1460—




[PDF] thecvf.com

Normalized wasserstein for mixture distributions with applications in adversarial learning and domain adaptation

Y Balaji, R Chellappa, S Feizi - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

Understanding proper distance measures between distributions is at the core of several

learning tasks such as generative models, domain adaptation, clustering, etc. In this work,

we focus on mixture distributions that arise naturally in several application domains where …

  Cited by 9 Related articles All 4 versions 


[PDF] arxiv.org

Wasserstein metric-driven Bayesian inversion with applications to signal processing

M Motamed, D Appelo - International Journal for Uncertainty …, 2019 - dl.begellhouse.com

We present a Bayesian framework based on a new exponential likelihood function driven by

the quadratic Wasserstein metric. Compared to conventional Bayesian models based on

Gaussian likelihood functions driven by the least-squares norm (L 2 norm), the new …

  Cited by 8 Related articles All 4 versions


[PDF] arxiv.org

Strong equivalence between metrics of Wasserstein type

E Bayraktar, G Guo - arXiv preprint arXiv:1912.08247, 2019 - arxiv.org

The sliced Wasserstein and more recently max-sliced Wasserstein metrics $W_p$ have

attracted abundant attention in data sciences and machine learning due to their advantages in

tackling the curse of dimensionality. A question of particular importance is the strong …

  Cited by 3 Related articles All 2 versions 


[PDF] arxiv.org

Normalized Wasserstein distance for mixture distributions with applications in adversarial learning and domain adaptation

Y Balaji, R Chellappa, S Feizi - arXiv preprint arXiv:1902.00415, 2019 - arxiv.org

Understanding proper distance measures between distributions is at the core of several

learning tasks such as generative models, domain adaptation, clustering, etc. In this work,

we focus on mixture distributions that arise naturally in several application domains where …

  Cited by 5 Related articles All 2 versions 


[PDF] arxiv.org

Bounds for the Wasserstein mean with applications to the Lie-Trotter mean

J Hwang, S Kim - Journal of Mathematical Analysis and Applications, 2019 - Elsevier

Since barycenters in the Wasserstein space of probability distributions have been

introduced, the Wasserstein metric and the Wasserstein mean of positive definite Hermitian

matrices have been recently developed. In this paper, we explore some properties of …

  Cited by 3 Related articles All 5 versions


2019


[PDF] arxiv.org

The Wasserstein Distances Between Pushed-Forward Measures with Applications to Uncertainty Quantification

A Sagiv - arXiv preprint arXiv:1902.05451, 2019 - arxiv.org

In the study of dynamical and physical systems, the input parameters are often uncertain or

randomly distributed according to a measure $\varrho $. The system's response $ f $ pushes

forward $\varrho $ to a new measure $ f\circ\varrho $ which we would like to study. However …

  Related articles All 3 versions 


[PDF] arxiv.org

A Fenchel-Moreau-Rockafellar type theorem on the Kantorovich-Wasserstein space with applications in partially observable Markov decision processes

V Laschos, K Obermayer, Y Shen, W Stannat - … Analysis and Applications, 2019 - Elsevier

By using the fact that the space of all probability measures with finite support can be

completed in two different fashions, one generating the Arens-Eells space and another

generating the Kantorovich-Wasserstein (Wasserstein-1) space, and by exploiting the …

  Cited by 2 Related articles All 5 versions


[PDF] archives-ouvertes.fr

Courbes et applications optimales à valeurs dans l'espace de Wasserstein

H Lavenant - 2019 - tel.archives-ouvertes.fr

[Translated from French] The Wasserstein space is the set of probability measures defined on a

fixed domain, endowed with the quadratic Wasserstein distance. In this work, we

study variational problems in which the unknowns are maps valued in …

  Cited by 1 Related articles All 11 versions 

[CITATION] Courbes et applications optimales à valeurs dans l'espace de Wasserstein

P CARDALIAGUET - 2019 - Université Paris-Dauphine

  Related articles


(PDF) Learning with Wasserstein barycenters and applications

www.researchgate.net › publication › 338228807_Learni...

Dec 30, 2019 — arXiv:1912.11801v1 [stat.ML] 26 Dec 2019. Learning with Wasserstein Barycenters. However, in the new era of data science, the nature of data ...

[CITATION] Learning with Wasserstein barycenters and applications.

G Domazakis, D Drivaliaris, S Koukoulas… - CoRR, 2019

 

[PDF] arxiv.org

Wasserstein distance based domain adaptation for object detection

P Xu, P Gurram, G Whipps, R Chellappa - arXiv preprint arXiv:1909.08675, 2019 - arxiv.org

In this paper, we present an adversarial unsupervised domain adaptation framework for

object detection. Prior approaches utilize adversarial training based on cross entropy

between the source and target domain distributions to learn a shared feature mapping that …

  Cited by 6 Related articles All 2 versions 

<——2019—–—2019 ——1470—


[PDF] phmsociety.org

[PDF] Anomaly detection on time series with wasserstein gan applied to phm

M Ducoffe, I Haloui, JS Gupta… - International Journal of …, 2019 - phmsociety.org

Modern vehicles are more and more connected. For instance, in the aerospace industry,

newer aircraft are already equipped with data concentrators and enough wireless

connectivity to transmit sensor data collected during the whole flight to the ground, usually …

  Cited by 2 Related articles All 2 versions 


[PDF] arxiv.org

2D Wasserstein Loss for Robust Facial Landmark Detection

Y Yan, S Duffner, P Phutane, A Berthelier… - arXiv preprint arXiv …, 2019 - arxiv.org

The recent performance of facial landmark detection has been significantly improved by

using deep Convolutional Neural Networks (CNNs), especially the Heatmap Regression

Models (HRMs). Although their performance on common benchmark datasets has reached a …

  Related articles All 3 versions 


2019  [PDF]  

A Conditional Wasserstein Generative Adversarial Network for Pixel-level Crack Detection using Video Extracted Images

www.semanticscholar.org › paper › A-Conditional-Wasse...

Automatic crack detection on pavement surfaces is an important research field in the ... A 121-layer densely connected neural network with deconvolution layers for ... A Conditional Wasserstein Generative Adversarial Network for Pixel-level ... Q. Mei, Mustafa Gül; Published 2019; Computer Science, Engineering; ArXiv.

[CITATION] A conditional wasserstein generative adversarial network for pixel-level crack detection using video extracted images

Q Mei, M Gül - arXiv preprint arXiv:1907.06014, 2019

  Cited by 6 Related articles

[CITATION] A convergent Lagrangian discretization for p-Wasserstein and flux-limited diffusion equations

O Junge, B Söllner - arXiv preprint arXiv:1906.01321, 2019



[PDF] arxiv.org

From the backward Kolmogorov PDE on the Wasserstein space to propagation of chaos for McKean-Vlasov SDEs

N Frikha, PEC de Raynal - arXiv preprint arXiv:1907.01410, 2019 - arxiv.org

In this article, we provide some new quantitative estimates for propagation of chaos of non-

linear stochastic differential equations (SDEs) in the sense of McKean-Vlasov. We obtain

explicit error estimates, at the level of the trajectories, at the level of the semi-group and at …

  Cited by 5 Related articles All 7 versions 


[PDF] uni-bielefeld.de

[PDF] Diffusions and PDEs on Wasserstein space

FY Wang - arXiv preprint arXiv:1903.02148, 2019 - sfb1283.uni-bielefeld.de

We propose a new type of SDE, whose coefficients depend on the image of solutions, to investigate

the diffusion process on the Wasserstein space $\mathbb{P}_2$ over $\mathbb{R}^d$, generated by a

time-dependent differential operator for $f \in C^2$, involving terms such as $\sigma(t,x,\mu)\sigma(t,y,\mu)$ and $D^2 f(\mu)(x, \dots$ …

  Cited by 2 Related articles 


2019


[PDF] arxiv.org

Optimal Fusion of Elliptic Extended Target Estimates Based on the Wasserstein Distance

K Thormann, M Baum - 2019 22th International Conference on …, 2019 - ieeexplore.ieee.org

This paper considers the fusion of multiple estimates of a spatially extended object, where

the object extent is modeled as an ellipse parameterized by the orientation and semi-axes

lengths. For this purpose, we propose a novel systematic approach that employs a distance …

  Cited by 1 Related articles All 5 versions


2019

 [PDF] nsf.gov

An information-theoretic view of generalization via Wasserstein distance

H Wang, M Diaz, JCS Santos Filho… - … on Information Theory …, 2019 - ieeexplore.ieee.org

We capitalize on the Wasserstein distance to obtain two information-theoretic bounds on the

generalization error of learning algorithms. First, we specialize the Wasserstein distance into

total variation, by using the discrete metric. In this case we derive a generalization bound …

  Cited by 9 Related articles All 5 versions


Zastosowanie metryki Wassersteina w problemie uczenia ...

www.mini.pw.edu.pl › ~mandziuk

PDF

Mar 27, 2019 — Zastosowanie metryki Wassersteina w problemie uczenia ograniczonych maszyn Boltzmanna. dr inż. Maksymilian Bujok. Zakład Algebry i ...

[Polish: Application of the Wasserstein metric to the problem of learning restricted Boltzmann machines]

www.mini.pw.edu.pl › ~mandziuk


Precise simulation of electromagnetic calorimeter showers using Wasserstein Generative Adversarial Network

M Erdmann, J Glombitza, T Quast - Computing and Software for Big …, 2019 - Springer

Simulations of particle showers in calorimeters are computationally time-consuming, as they

have to reproduce both energy depositions and their considerable fluctuations. A new

approach to ultra-fast simulations is generative models where all calorimeter energy …

  Cited by 42 Related articles All 6 versions


Artifact correction in low‐dose dental CT imaging using Wasserstein generative adversarial networks

Z Hu, C Jiang, F Sun, Q Zhang, Y Ge, Y Yang… - Medical …, 2019 - Wiley Online Library

Purpose In recent years, health risks concerning high‐dose x‐ray radiation have become a

major concern in dental computed tomography (CT) examinations. Therefore, adopting low‐

dose computed tomography (LDCT) technology has become a major focus in the CT …

  Cited by 30 Related articles All 5 versions

<——2019—–—2019 ——1450—  




Decomposition algorithm for distributionally robust optimization using Wasserstein metric with an application to a class of regression models

F Luo, S Mehrotra - European Journal of Operational Research, 2019 - Elsevier

We study distributionally robust optimization (DRO) problems where the ambiguity set is

defined using the Wasserstein metric and can account for a bounded support. We show that

this class of DRO problems can be reformulated as decomposable semi-infinite programs …

  Cited by 19 Related articles All 6 versions


[PDF] nber.org

Using wasserstein generative adversarial networks for the design of monte carlo simulations

S Athey, GW Imbens, J Metzger, EM Munro - 2019 - nber.org

When researchers develop new econometric methods it is common practice to compare the

performance of the new methods to those of existing methods in Monte Carlo studies. The

credibility of such Monte Carlo studies is often limited because of the freedom the researcher …

  Cited by 14 Related articles All 8 versions 


[PDF] arxiv.org

Denoising of 3D magnetic resonance images using a residual encoder–decoder Wasserstein generative adversarial network

M Ran, J Hu, Y Chen, H Chen, H Sun, J Zhou… - Medical image …, 2019 - Elsevier

Abstract Structure-preserved denoising of 3D magnetic resonance imaging (MRI) images is

a critical step in medical image analysis. Over the past few years, many algorithms with

impressive performances have been proposed. In this paper, inspired by the idea of deep …

  Cited by 31 Related articles All 9 versions


 

[PDF] arxiv.org

Parameter estimation for biochemical reaction networks using Wasserstein distances

K Öcal, R Grima, G Sanguinetti - Journal of Physics A …, 2019 - iopscience.iop.org

We present a method for estimating parameters in stochastic models of biochemical reaction

networks by fitting steady-state distributions using Wasserstein distances. We simulate a

reaction network at different parameter settings and train a Gaussian process to learn the …

  Cited by 7 Related articles All 7 versions


[PDF] arxiv.org

Towards diverse paraphrase generation using multi-class wasserstein GAN

Z An, S Liu - arXiv preprint arXiv:1909.13827, 2019 - arxiv.org

Paraphrase generation is an important and challenging natural language processing (NLP)

task. In this work, we propose a deep generative model to generate paraphrase with

diversity. Our model is based on an encoder-decoder architecture. An additional transcoder …

  Cited by 4 Related articles All 3 versions 


2019


2019 see 2020

Learning with minibatch Wasserstein: asymptotic and gradient properties

K Fatras, Y Zine, R Flamary, R Gribonval… - arXiv preprint arXiv …, 2019 - arxiv.org

Optimal transport distances are powerful tools to compare probability distributions and have

found many applications in machine learning. Yet their algorithmic complexity prevents their

direct use on large scale datasets. To overcome this challenge, practitioners compute these …

  Cited by 12 Related articles All 23 versions 
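
Fatras et al. analyze the estimator obtained by averaging exact optimal transport costs over random minibatches. A hedged sketch of that construction is below; it assumes the third-party POT library (import name ot), and the batch size, number of batches, and data are illustrative choices rather than the paper's settings:

# Sketch: minibatch Wasserstein estimator via exact OT on random minibatches (POT library assumed).
import numpy as np
import ot

def minibatch_ot_cost(x, y, batch_size=64, n_batches=50, seed=0):
    """Average exact OT cost (squared Euclidean ground cost) over random minibatches."""
    rng = np.random.default_rng(seed)
    costs = []
    for _ in range(n_batches):
        xb = x[rng.choice(len(x), batch_size, replace=False)]
        yb = y[rng.choice(len(y), batch_size, replace=False)]
        M = ot.dist(xb, yb)                        # pairwise squared Euclidean costs
        w = np.full(batch_size, 1.0 / batch_size)  # uniform weights on each minibatch
        costs.append(ot.emd2(w, w, M))             # exact OT cost on the minibatch
    return float(np.mean(costs))

x = np.random.default_rng(1).normal(size=(2000, 5))
y = np.random.default_rng(2).normal(loc=0.3, size=(2000, 5))
print(minibatch_ot_cost(x, y))

As the paper discusses, this minibatch average is a biased but cheap surrogate for the full optimal transport cost, which is what makes it usable on large datasets.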


[PDF] arxiv.org

Single image haze removal using conditional wasserstein generative adversarial networks

JP Ebenezer, B Das… - 2019 27th European …, 2019 - ieeexplore.ieee.org

We present a method to restore a clear image from a haze-affected image using a

Wasserstein generative adversarial network. As the problem is ill-conditioned, previous

methods have required a prior on natural images or multiple images of the same scene. We …

  Cited by 8 Related articles All 5 versions



[HTML] Identifying imaging markers for predicting cognitive assessments using wasserstein distances based matrix regression

J Yan, C Deng, L Luo, X Wang, X Yao, L Shen… - Frontiers in …, 2019 - frontiersin.org

Alzheimer's disease (AD) is a severe type of neurodegeneration which worsens human

memory, thinking and cognition along a temporal continuum. How to identify the informative

phenotypic neuroimaging markers and accurately predict cognitive assessment are crucial …

  Cited by 2 Related articles All 11 versions 


[PDF] wustl.edu

Grid-less DOA estimation using sparse linear arrays based on Wasserstein distance

M Wang, Z Zhang, A Nehorai - IEEE Signal Processing Letters, 2019 - ieeexplore.ieee.org

Sparse linear arrays, such as nested and co-prime arrays, are capable of resolving O (M2)

sources using only O (M) sensors by exploiting their so-called difference coarray model. One

popular approach to exploit the difference coarray model is to construct an augmented …

  Cited by 3 Related articles All 3 versions


[PDF] researchgate.net

Least-squares reverse time migration via linearized waveform inversion using Wasserstein metric

P Yong, J Huang, Z Li, W Liao, L Qu - Geophysics, 2019 - library.seg.org

Least-squares reverse time migration (LSRTM), an effective tool for imaging the structures of

the earth from seismograms, can be characterized as a linearized waveform inversion

problem. We have investigated the performance of three minimization functionals as the L 2 …

  Cited by 3 Related articles All 4 versions

Least-squares reverse time migration via linearized waveform inversion using Wasserstein metric for LSRTM

P Yong, J Huang, Z Li, W Liao, L Qu - Geophysics, 2019 - pubs.geoscienceworld.org

Least-squares reverse time migration (LSRTM), an effective tool for imaging the structures of

the earth from seismograms, can be characterized as a linearized waveform inversion

problem. We have investigated the performance of three minimization functionals as the L 2 …

  Related articles

<——2019—–—2019 ——1460— 


[PDF] biorxiv.org

De Novo Protein Design for Novel Folds using Guided Conditional Wasserstein Generative Adversarial Networks (gcWGAN)

M Karimi, S Zhu, Y Cao, Y Shen - bioRxiv, 2019 - biorxiv.org

Motivation Facing data quickly accumulating on protein sequence and structure, this study is

addressing the following question: to what extent could current data alone reveal deep

insights into the sequence-structure relationship, such that new sequences can be designed …

  Cited by 6 Related articles All 4 versions 

[PDF] biorxiv.org

[PDF] De Novo Protein Design for Novel Folds using Guided Conditional Wasserstein Generative Adversarial Networks (gcWGAN)(Supporting Information)

M Karimi, S Zhu, Y Cao, Y Shen - Small - biorxiv.org

2.1 Methods Using a representative protein structure chosen by SCOPe for each of the

1,232 folds, we construct a pairwise similarity matrix of symmetrized TM scores (Zhang and

Skolnick, 2004) and added a properly-scaled identity matrix to it to make a positive-definite …

  Related articles 


[PDF] thecvf.com

[PDF] Face Synthesis and Recognition Using Disentangled Representation-Learning Wasserstein GAN.

GSJ Hsu, CH Tang, MH Yap - CVPR Workshops, 2019 - openaccess.thecvf.com

Abstract We propose the Disentangled Representation-learning Wasserstein GAN (DR-

WGAN) trained on augmented data for face recognition and face synthesis across pose. We

improve the state-of-the-art DR-GAN with the Wasserstein loss considered in the …

  Cited by 1 Related articles All 4 versions 

[PDF] thecvf.com

Face Synthesis and Recognition Using Disentangled Representation-Learning Wasserstein GAN

GS Jison Hsu, CH Tang… - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

Abstract We propose the Disentangled Representation-learning Wasserstein GAN (DR-

WGAN) trained on augmented data for face recognition and face synthesis across pose. We

improve the state-of-the-art DR-GAN with the Wasserstein loss considered in the …

  Related articles All 2 versions 



[PDF] semanticscholar.org

Speech Enhancement for Noise-Robust Speech Synthesis Using Wasserstein GAN.

N Adiga, Y Pantazis, V Tsiaras, Y Stylianou - INTERSPEECH, 2019 - isca-speech.org

The quality of speech synthesis systems can be significantly deteriorated by the presence of

background noise in the recordings. Despite the existence of speech enhancement

techniques for effectively suppressing additive noise under low signal-tonoise (SNR) …

  Cited by 3 Related articles All 4 versions


Image Reflection Removal Using the Wasserstein Generative Adversarial Network

T Li, DPK Lun - … 2019-2019 IEEE International Conference on …, 2019 - ieeexplore.ieee.org

Imaging through a semi-transparent material such as glass often suffers from the reflection

problem, which degrades the image quality. Reflection removal is a challenging task since it

is severely ill-posed. Traditional methods, while all require long computation time on …

  Cited by 1 Related articles All 2 versions


[PDF] mdpi.com

Wasserstein Generative Adversarial Network Based De-Blurring Using Perceptual Similarity

M Hong, Y Choe - Applied Sciences, 2019 - mdpi.com

The de-blurring of blurred images is one of the most important image processing methods

and it can be used for the preprocessing step in many multimedia and computer vision

applications. Recently, de-blurring methods have been performed by neural network …

  Cited by 1 Related articles All 4 versions 


2019


[HTML] nih.gov

Construction of 4D Neonatal Cortical Surface Atlases Using Wasserstein Distance

Z Chen, Z Wu, L Sun, F Wang, L Wang… - 2019 IEEE 16th …, 2019 - ieeexplore.ieee.org

Spatiotemporal (4D) neonatal cortical surface atlases with densely sampled ages are

important tools for understanding the dynamic early brain development. Conventionally,

after non-linear co-registration, surface atlases are constructed by simple Euclidean average …

  Cited by 1 Related articles All 5 versions


Data-driven distributionally robust shortest path problem using the Wasserstein ambiguity set

Z Wang, K You, S Song, C Shang - 2019 IEEE 15th …, 2019 - ieeexplore.ieee.org

This paper proposes a data-driven distributionally robust shortest path (DRSP) model where

the distribution of the travel time is only observable through a finite training dataset. Our

DRSP model adopts the Wasserstein metric to construct the ambiguity set of probability …

  Cited by 1 Related articles



[PDF] arxiv.org

Stylized Text Generation Using Wasserstein Autoencoders with a Mixture of Gaussian Prior

A Ghabussi, L Mou, O Vechtomova - arXiv preprint arXiv:1911.03828, 2019 - arxiv.org

Wasserstein autoencoders are effective for text generation. They do not however provide

any control over the style and topic of the generated sentences if the dataset has multiple

classes and includes different topics. In this work, we present a semi-supervised approach …

  Related articles All 2 versions 


Improved concentration bounds for conditional value-at-risk and cumulative prospect theory using wasserstein distance

SP Bhat, LA Prashanth - 2019 - openreview.net

This paper presents a unified approach based on Wasserstein distance to derive

concentration bounds for empirical estimates for a broad class of risk measures. The results

cover two broad classes of risk measures which are defined in the paper. The classes of risk …

  Cited by 1 Related articles 


[PDF] koreascience.or.kr

Experimental Analysis of Equilibrization in Binary Classification for Non-Image Imbalanced Data Using Wasserstein GAN

ZY Wang, DK Kang - International Journal of Internet …, 2019 - koreascience.or.kr

In this paper, we explore the details of three classic data augmentation methods and two

generative model based oversampling methods. The three classic data augmentation

methods are random sampling (RANDOM), Synthetic Minority Over-sampling Technique …

  Cited by 2 Related articles All 3 versions 

<——2019—–—2019 ——1490— 



Music Classification using Multiclass Support Vector Machine and Multilevel Wasserstein Means

J Wei, C Jin, Z Cheng, X Lv… - 2019 IEEE/ACIS 18th …, 2019 - ieeexplore.ieee.org

Music classification is a challenging task in music information retrieval. In this article, we

compare the performance of the two types of models. The first category is classified by

Support Vector Machine (SVM). We use the feature extraction from audio as the basis of …

  Related articles All 2 versions


2019

Input limited Wasserstein GAN

C FD - 2019 - ir.sia.cn

Abstract: Generative adversarial networks (GANs) have proven hugely successful, but suffer from

training instability. The recently proposed Wasserstein GAN (WGAN) has largely overcome the

problem, but can still fail to converge in some cases or can be too complex. It has been found that …

  


A Conditional Wasserstein Generative Adversarial Network for ...

deepai.org › publication › a-conditional-wasserstein-ge...

A Conditional Wasserstein Generative Adversarial Network for Pixel-level Crack Detection using Video Extracted Images. 07/13/2019 by Qipei Mei, et al.

[CITATION] A conditional wasserstein generative adversarial network for pixel-level crack detection using video extracted images

Q Mei, M Gül - arXiv preprint arXiv:1907.06014, 2019

  Cited by 6 Related articles



[PDF] biorxiv.org

Reproducibility test of radiomics using network analysis and Wasserstein K-means algorithm

JH Oh, AP Apte, E Katsoulakis, N Riaz, V Hatzoglou… - bioRxiv, 2019 - biorxiv.org

Purpose To construct robust and validated radiomic predictive models, the development of a

reliable method that can identify reproducible radiomic features robust to varying image

acquisition methods and other scanner parameters should be preceded with rigorous …

  Related articles All 3 versions 


Tackling Algorithmic Bias in Neural-Network Classifiers using Wasserstein-2 Regularization

L Risser, Q Vincenot, JM Loubes - arXiv e-prints, 2019 - ui.adsabs.harvard.edu

The increasingly common use of neural network classifiers in industrial and social

applications of image analysis has allowed impressive progress these last years. Such

methods are however sensitive to algorithmic bias, ie to an under-or an over-representation …

 

2019


DE NOVO PROTEIN DESIGN OF NOVEL FOLDS USING GUIDED CONDITIONAL WASSERSTEIN GENERATIVE ADVERSARIAL NETWORKS (GCWGAN)

S Zhu - 2019 - oaktrust.library.tamu.edu

In the research areas about proteins, it is always a significant topic to detect the

sequencestructure-function relationship. Fundamental questions remain for this topic: How

much could current data alone reveal deep insights about such relationship? And how much …

  

Painting halos from 3D dark matter fields using Wasserstein ...

arxiver.moonhats.com › 2019/03/27 › painting-halos-fr...

Mar 27, 2019 — We present a novel halo painting network that learns to map approximate 3D dark matter fields to realistic halo distributions. This map is ...


Painting halos from 3D dark matter fields using ... - arXiv Vanity

www.arxiv-vanity.com › papers

Zhang et al. (2019) constructed a two-phase convolutional neural network architecture to map 3D dark matter fields to the corresponding galaxy distribution in ...

[CITATION] Painting halos from 3D dark matter fields using Wasserstein mapping networks

D Kodi Ramanah, T Charnock, G Lavaux - arXiv preprint arXiv:1903.10524, 2019


2019   

Time Series Generation using a One Dimensional Wasserstein GAN

KE Smith, A Smith - ITISE 2019. Proceedings of papers. Vol 2, 2019 - inis.iaea.org

[en] Time series data is an extremely versatile data type that can represent many real-world

events; however, the acquisition of event-specific time series requires special sensors and

devices to record the events, and the manpower to translate them to one-dimensional (1D)

data. This is a costly labor effort, and in many cases events are not frequent enough, which

results in a lack of time series data describing these events. This paper addresses the

shortage of event time series data by implementing a one-dimensional …

[CITATION] Time Series Generation using a One Dimensional Wasserstein GAN

EK Smith, OA Smith - ITISE 2019 International Conference on Time Series …, 2019

  Cited by 1

 

Sliced wasserstein discrepancy for unsupervised domain adaptation

CY Lee, T Batra, MH Baig… - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

In this work, we connect two distinct concepts for unsupervised domain adaptation: feature

distribution alignment between domains by utilizing the task-specific decision boundary and

the Wasserstein metric. Our proposed sliced Wasserstein discrepancy (SWD) is designed to …

  Cited by 111 Related articles All 7 versions 

<——2019—–—2019 ——1500— 


[PDF] mlr.press

Unsupervised alignment of embeddings with wasserstein procrustes

E Grave, A Joulin, Q Berthet - The 22nd International …, 2019 - proceedings.mlr.press

We consider the task of aligning two sets of points in high dimension, which has many

applications in natural language processing and computer vision. As an example, it was

recently shown that it is possible to infer a bilingual lexicon, without supervised data, by …

  Cited by 80 Related articles All 3 versions 


Deep multi-Wasserstein unsupervised domain adaptation

TN Le, A Habrard, M Sebban - Pattern Recognition Letters, 2019 - Elsevier

In unsupervised domain adaptation (DA), one aims at learning from labeled source data and

fully unlabeled target examples a model with a low error on the target domain. In this setting,

standard generalization bounds prompt us to minimize the sum of three terms:(a) the source …

  Cited by 3 Related articles All 3 versions


[PDF] mlr.press

On the complexity of approximating Wasserstein barycenters

A Kroshnin, N Tupitsa, D Dvinskikh… - International …, 2019 - proceedings.mlr.press

We study the complexity of approximating the Wasserstein barycenter of $ m $ discrete

measures, or histograms of size $ n $, by contrasting two alternative approaches that use

entropic regularization. The first approach is based on the Iterative Bregman Projections …

  Cited by 43 Related articles All 11 versions 

On the Complexity of Approximating Wasserstein Barycenters

P Dvurechensky - dev.icml.cc

… $\min_{\nu \in \mathcal{P}_2(\Omega)} \sum_{i=1}^{m} W(\mu_i, \nu)$, where $W(\mu, \nu)$ is the Wasserstein distance between measures $\mu$ and

$\nu$ on $\Omega$. The Wasserstein barycenter (WB) is efficient in machine learning problems with geometric data, e.g. template image

reconstruction from random samples (Figure: images from [Cuturi & Doucet, 2014]) …

  All 4 versions 
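
The barycenter objective shown above, minimizing the sum of Wasserstein distances to the input measures over ν, is typically approximated in practice through entropic regularization and Iterative Bregman Projections, the approach these works analyze. The following is only a small illustrative sketch of that idea, assuming the third-party POT library; the two Gaussian-shaped histograms and the regularization strength are invented for the example:

# Sketch: entropic Wasserstein barycenter of two 1D histograms (POT library assumed).
import numpy as np
import ot

n = 100
x = np.linspace(0.0, 1.0, n)
h1 = np.exp(-((x - 0.3) ** 2) / 0.01); h1 /= h1.sum()   # histogram centred at 0.3
h2 = np.exp(-((x - 0.7) ** 2) / 0.01); h2 /= h2.sum()   # histogram centred at 0.7

A = np.vstack([h1, h2]).T              # one input histogram per column
M = ot.utils.dist0(n)                  # squared-distance ground cost on the grid
M /= M.max()
bary = ot.bregman.barycenter(A, M, reg=1e-3, weights=np.array([0.5, 0.5]))
print(bary.sum())                      # the barycenter is again a probability vector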



[PDF] arxiv.org

Sharp asymptotic and finite-sample rates of convergence of empirical measures in Wasserstein distance

J Weed, F Bach - Bernoulli, 2019 - projecteuclid.org

The Wasserstein distance between two probability measures on a metric space is a

measure of closeness with applications in statistics, probability, and machine learning. In

this work, we consider the fundamental question of how quickly the empirical measure …

  Cited by 167 Related articles All 6 versions
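
A quick numerical illustration of the question Weed and Bach study, namely how fast the empirical measure approaches the true one in Wasserstein distance. The experiment below is only a hedged one-dimensional sketch using SciPy, with the reference measure replaced by a very large sample:

# Sketch: decay of W_1(empirical measure, reference) as the sample size grows (1D case).
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
reference = rng.normal(size=200_000)      # large sample standing in for the true measure
for n in (100, 1_000, 10_000):
    samples = rng.normal(size=n)
    print(n, wasserstein_distance(samples, reference))
# In this one-dimensional example the distance shrinks roughly like n**(-1/2);
# the paper's contribution is how this rate degrades with the intrinsic dimension.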

[PDF] arxiv.org


2019

On the computational complexity of finding a sparse Wasserstein barycenter

S Borgwardt, S Patterson - arXiv preprint arXiv:1910.07568, 2019 - arxiv.org

The discrete Wasserstein barycenter problem is a minimum-cost mass transport problem for

a set of probability measures with finite support. In this paper, we show that finding a

barycenter of sparse support is hard, even in dimension 2 and for only 3 measures. We …

  Cited by 11 Related articles All 2 versions 


2019


[PDF] arxiv.org

Unsupervised adversarial domain adaptation based on the Wasserstein distance for acoustic scene classification

K Drossos, P Magron, T Virtanen - 2019 IEEE Workshop on …, 2019 - ieeexplore.ieee.org

A challenging problem in deep learning-based machine listening field is the degradation of

the performance when using data from unseen conditions. In this paper we focus on the

acoustic scene classification (ASC) task and propose an adversarial deep learning method …

  Cited by 14 Related articles All 5 versions


Deep joint two-stream Wasserstein auto-encoder and selective attention alignment for unsupervised domain adaptation

Z Chen, C Chen, X Jin, Y Liu, Z Cheng - Neural computing and …, 2019 - Springer

Abstract Domain adaptation refers to the process of utilizing the labeled source domain data

to learn a model that can perform well in the target domain with limited or missing labels.

Several domain adaptation methods combining image translation and feature alignment …

  Cited by 12 Related articles


[PDF] arxiv.org

Fast convergence of empirical barycenters in Alexandrov spaces and the Wasserstein space

TL Gouic, Q Paris, P Rigollet, AJ Stromme - arXiv preprint arXiv …, 2019 - arxiv.org

This work establishes fast rates of convergence for empirical barycenters over a large class

of geodesic spaces with curvature bounds in the sense of Alexandrov. More specifically, we

show that parametric rates of convergence are achievable under natural conditions that …

  Cited by 9 Related articles All 2 versions 


[PDF] projecteuclid.org

Behavior of the empirical Wasserstein distance in ${\mathbb R}^d$ under moment conditions

J Dedecker, F Merlevède - Electronic Journal of Probability, 2019 - projecteuclid.org

We establish some deviation inequalities, moment bounds and almost sure results for the

Wasserstein distance of order $ p\in [1,\infty) $ between the empirical measure of

independent and identically distributed ${\mathbb R}^ d $-valued random variables and the …

  Cited by 7 Related articles All 12 versions




Weak convergence of empirical Wasserstein type distances

P Berthet, JC Fort - arXiv preprint arXiv:1911.02389, 2019 - arxiv.org

We estimate contrasts $\int_0^ 1\rho (F^{-1}(u)-G^{-1}(u)) du $ between two continuous

distributions $ F $ and $ G $ on $\mathbb R $ such that the set $\{F= G\} $ is a finite union of

intervals, possibly empty or $\mathbb {R} $. The non-negative convex cost function $\rho $ is …

  Cited by 2 Related articles All 6 versions 
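
Berthet and Fort's contrasts compare quantile functions, i.e. integrals of ρ(F⁻¹(u) − G⁻¹(u)) over u in (0, 1). A minimal numerical sketch of that quantity for two empirical samples (the choice ρ = |·| and the data are illustrative, not the paper's):

# Sketch: empirical Wasserstein-type contrast between two samples via quantile functions.
import numpy as np

def contrast(x, y, rho=np.abs, n_grid=1000):
    """Approximate the integral of rho(F^{-1}(u) - G^{-1}(u)) over u in (0, 1)."""
    u = (np.arange(n_grid) + 0.5) / n_grid      # midpoint grid on (0, 1)
    qx = np.quantile(x, u)                      # empirical quantile function of x
    qy = np.quantile(y, u)                      # empirical quantile function of y
    return float(np.mean(rho(qx - qy)))

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = rng.normal(loc=0.2, size=5000)
print(contrast(x, y))    # with rho = |.| this recovers the empirical W_1 distance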

<——2019—–—2019 ——1510—  



Unsupervised segmentation of colonic polyps in narrow-band imaging data based on manifold representation of images and Wasserstein distance

IN Figueiredo, L Pinto, PN Figueiredo, R Tsai - … Signal Processing and …, 2019 - Elsevier

Colorectal cancer (CRC) is one of the most common cancers worldwide and after a certain

age (≥ 50) regular colonoscopy examination for CRC screening is highly recommended.

One of the most prominent precursors of CRC are abnormal growths known as polyps. If a …

  Related articles All 4 versions



[PDF] arxiv.org

Bounding quantiles of Wasserstein distance between true and empirical measure

SN Cohen, MNA Tegnér, J Wiesel - arXiv preprint arXiv:1907.02006, 2019 - arxiv.org

Consider the empirical measure, $\hat {\mathbb {P}} _N $, associated to $ N $ iid samples of

a given probability distribution $\mathbb {P} $ on the unit interval. For fixed $\mathbb {P} $

the Wasserstein distance between $\hat {\mathbb {P}} _N $ and $\mathbb {P} $ is a random …

  Related articles All 4 versions 


Unsupervised feature extraction based on improved Wasserstein generative adversarial network for hyperspectral classification

Q Sun, S Bourennane - Multimodal Sensing: Technologies …, 2019 - spiedigitallibrary.org

Accurate classification is one of the most important prerequisites for hyperspectral

applications and feature extraction is the key step of classification. Recently, deep learning

models have been successfully used to extract the spectral-spatial features in hyperspectral …

  Related articles All 4 versions



[CITATION] On the complexity of computing Wasserstein distances

B Taskesen, S Shafieezadeh-Abadeh, D Kuhn - 2019 - Working paper

  Cited by 2 Related articles



Wasserstein dependency measure for representation learning

S Ozair, C Lynch, Y Bengio, A Oord, S Levine… - arXiv preprint arXiv …, 2019 - arxiv.org

Mutual information maximization has emerged as a powerful learning objective for

unsupervised representation learning obtaining state-of-the-art performance in applications

such as object recognition, speech recognition, and reinforcement learning. However, such …

  Cited by 27 Related articles All 5 versions 


2019


[PDF] arxiv.org

Graph signal representation with Wasserstein Barycenters

E Simou, P Frossard - ICASSP 2019-2019 IEEE International …, 2019 - ieeexplore.ieee.org

In many applications signals reside on the vertices of weighted graphs. Thus, there is the

need to learn low dimensional representations for graph signals that will allow for data

analysis and interpretation. Existing unsupervised dimensionality reduction methods for …

  Cited by 7 Related articles All 5 versions


[PDF] arxiv.org

Disentangled representation learning with Wasserstein total correlation

Y Xiao, WY Wang - arXiv preprint arXiv:1912.12818, 2019 - arxiv.org

Unsupervised learning of disentangled representations involves uncovering of different

factors of variations that contribute to the data generation process. Total correlation

penalization has been a key component in recent methods towards disentanglement …

  Cited by 1 Related articles All 2 versions 


[PDF] thecvf.com

Face Synthesis and Recognition Using Disentangled Representation-Learning Wasserstein GAN

GS Jison Hsu, CH Tang… - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

Abstract We propose the Disentangled Representation-learning Wasserstein GAN (DR-

WGAN) trained on augmented data for face recognition and face synthesis across pose. We

improve the state-of-the-art DR-GAN with the Wasserstein loss considered in the …

  Related articles All 2 versions 


[PDF] arxiv.org

Quantitative spectral gap estimate and Wasserstein contraction of simple slice sampling

V Natarovskii, D Rudolf, B Sprungk - arXiv preprint arXiv:1903.03824, 2019 - arxiv.org

We prove Wasserstein contraction of simple slice sampling for approximate sampling wrt

distributions with log-concave and rotational invariant Lebesgue densities. This yields, in

particular, an explicit quantitative lower bound of the spectral gap of simple slice sampling …

  Related articles All 4 versions 


2019

Sampling of probability measures in the convex order by Wasserstein projection

J Corbetta, B Jourdain - 2019 - ideas.repec.org

In this paper, for $\mu $ and $\nu $ two probability measures on $\mathbb {R}^ d $ with finite

moments of order $\rho\ge 1$, we define the respective projections for the $ W_\rho $-

Wasserstein distance of $\mu $ and $\nu $ on the sets of probability measures dominated by …

<——2019—–—2019 ——1520— 



Wasserstein gan with quadratic transport cost

H Liu, X Gu, D Samaras - Proceedings of the IEEE/CVF …, 2019 - openaccess.thecvf.com

Wasserstein GANs are increasingly used in Computer Vision applications as they are easier

to train. Previous WGAN variants mainly use the l_1 transport cost to compute the

Wasserstein distance between the real and synthetic data distributions. The l_1 transport  …

  Cited by 18 Related articles All 5 versions 

[PDF] Wasserstein GAN with Quadratic Transport Cost Supplementary Material

H Liu, X Gu, D Samaras - openaccess.thecvf.com

(1) where $I$ and $J$ are disjoint sets; then for each $x_j$, there exists a $t \in I$ such that $H_t - H_j = c(x_j, y_t)$.

We prove this by contradiction, i.e., there exists one $x_s$, $s \in J$, such that we cannot find

a $y_i$ such that $H_i - H_s = c(x_s, y_i)$, $i \in I$. This means that $H_s > \sup_{i \in I} \{H_i - c(x_s, y_i)\}$ …

  Related articles All 3 versions 



[PDF] arxiv.org

Estimation of Wasserstein distances in the spiked transport model

J Niles-Weed, P Rigollet - arXiv preprint arXiv:1909.07513, 2019 - arxiv.org

We propose a new statistical model, the spiked transport model, which formalizes the

assumption that two probability distributions differ only on a low-dimensional subspace. We

study the minimax rate of estimation for the Wasserstein distance under this model and show …

  Cited by 14 Related articles All 2 versions 


Wasserstein generative learning with kinematic constraints for probabilistic interactive driving behavior prediction

H Ma, J Li, W Zhan, M Tomizuka - 2019 IEEE Intelligent …, 2019 - ieeexplore.ieee.org

Since prediction plays a significant role in enhancing the performance of decision making

and planning procedures, the requirement of advanced methods of prediction becomes

urgent. Although many literatures propose methods to make prediction on a single agent …

  Cited by 16 Related articles


2019

[PDF] nips.cc

[PDF] Concentration of risk measures: A Wasserstein distance approach

SP Bhat, P LA - Advances in Neural Information Processing Systems, 2019 - papers.nips.cc

Abstract< p> Known finite-sample concentration bounds for the Wasserstein distance

between the empirical and true distribution of a random variable are used to derive a two-

sided concentration bound for the error between the true conditional value-at-risk (CVaR) of …

  Cited by 18 Related articles All 7 versions 

[PDF] iitm.ac.in

[PDF] Concentration of risk measures: A Wasserstein distance approach

LA Prashanth - To appear in the proceedings of NeurIPS, 2019 - cse.iitm.ac.in

… Proof idea: we use the following alternative characterization of the Wasserstein distance

$W_1(F_1, F_2) = \sup_f |E(f(X)) - E(f(Y))|$, where $X$ and $Y$ are random variables having CDFs $F_1$ and $F_2$,

respectively, and the supremum is over all 1-Lipschitz functions $f : \mathbb{R} \to \mathbb{R}$. The estimation error …

  Related articles All 4 versions 


[PDF] arxiv.org

Semi-supervised multitask learning on multispectral satellite images using wasserstein generative adversarial networks (gans) for predicting poverty

A Perez, S Ganguli, S Ermon, G Azzari, M Burke… - arXiv preprint arXiv …, 2019 - arxiv.org

Obtaining reliable data describing local poverty metrics at a granularity that is informative to

policy-makers requires expensive and logistically difficult surveys, particularly in the

developing world. Not surprisingly, the poverty stricken regions are also the ones which …

  Cited by 21 Related articles All 4 versions 


2019


[PDF] arxiv.org

Second-Order Models for Optimal Transport and Cubic Splines on the Wasserstein Space

JD Benamou, TO Gallouët, FX Vialard - Foundations of Computational …, 2019 - Springer

On the space of probability densities, we extend the Wasserstein geodesics to the case of

higher-order interpolation such as cubic spline interpolation. After presenting the natural

extension of cubic splines to the Wasserstein space, we propose a simpler approach based …

  Cited by 9 Related articles All 5 versions


[PDF] arxiv.org

Adaptive wasserstein hourglass for weakly supervised hand pose estimation from monocular RGB

Y Zhang, L Chen, Y Liu, J Yong, W Zheng - arXiv preprint arXiv …, 2019 - arxiv.org

Insufficient labeled training datasets is one of the bottlenecks of 3D hand pose estimation

from monocular RGB images. Synthetic datasets have a large number of images with

precise annotations, but the obvious difference with real-world datasets impacts the …

  Cited by 3 Related articles All 2 versions 


[PDF] esaim-cocv.org

Dynamic models of Wasserstein-1-type unbalanced transport

B Schmitzer, B Wirth - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

We consider a class of convex optimization problems modelling temporal mass transport

and mass change between two given mass distributions (the so-called dynamic formulation

of unbalanced transport), where we focus on those models for which transport costs are …

  Cited by 6 Related articles All 5 versions


A semi-supervised wasserstein generative adversarial network for classifying driving fatigue from EEG signals

S Panwar, P Rad, J Quarles, E Golob… - … on Systems, Man and …, 2019 - ieeexplore.ieee.org

Predicting driver's cognitive states using deep learning from electroencephalography (EEG)

signals is considered this paper. To address the challenge posed by limited labeled training

samples, a semi-supervised Wasserstein Generative Adversarial Network with gradient …

  Cited by 3 Related articles All 2 versions


[PDF] arxiv.org

Tropical Optimal Transport and Wasserstein Distances

W Lee, W Li, B Lin, A Monod - arXiv preprint arXiv:1911.05401, 2019 - arxiv.org

We study the problem of optimal transport in tropical geometry and define the Wasserstein-$

p $ distances for probability measures in the continuous metric measure space setting of the

tropical projective torus. We specify the tropical metric---a combinatorial metric that has been …

  Cited by 1 Related articles All 3 versions 

[PDF] ucla.edu

[PDF] Tropical Optimal Transport and Wasserstein Distances in Phylogenetic Tree Space

W Lee, W Li, B Lin, A Monod - arXiv preprint arXiv:1911.05401, 2019 - math.ucla.edu

We study the problem of optimal transport on phylogenetic tree space from the perspective

of tropical geometry, and thus define the Wasserstein-p distances for probability measures in

this continuous metric measure space setting. With respect to the tropical metric—a …

  Related articles All 2 versions 

<——2019—–—2019 ——1530— 



Semi-supervised Multimodal Emotion Recognition with Improved Wasserstein GANs

J Liang, S Chen, Q Jin - 2019 Asia-Pacific Signal and …, 2019 - ieeexplore.ieee.org

Automatic emotion recognition has faced the challenge of lacking large-scale human

labeled dataset for model learning due to the expensive data annotation cost and inevitable

label ambiguity. To tackle such challenge, previous works have explored to transfer emotion …

  Cited by 1 Related articles All 2 versions


[PDF] arxiv.org

The boundary method for semi-discrete optimal transport partitions and Wasserstein distance computation

L Dieci, JD Walsh III - Journal of Computational and Applied Mathematics, 2019 - Elsevier

We introduce a new technique, which we call the boundary method, for solving semi-

discrete optimal transport problems with a wide range of cost functions. The boundary

method reduces the effective dimension of the problem, thus improving complexity. For cost …

  Cited by 7 Related articles All 5 versions

 

[PDF] arxiv.org

Wasserstein norm for signed measures, with application to nonlocal transport equation with source term

B Piccoli, F Rossi, M Tournus - arXiv preprint arXiv:1910.05105, 2019 - arxiv.org

We introduce the optimal transportation interpretation of the Kantorovich norm on the space

of signed Radon measures with finite mass, based on a generalized Wasserstein distance …

  Cited by 6 Related articles All 11 versions

[PDF] nsf.gov

Distributions with Maximum Spread Subject to Wasserstein Distance Constraints

JG Carlsson, Y Wang - Journal of the Operations Research Society of …, 2019 - Springer

Recent research on formulating and solving distributionally robust optimization problems

has seen many different approaches for describing one's ambiguity set, such as constraints

on first and second moments or quantiles. In this paper, we use the Wasserstein distance to …

  Related articles All 3 versions


[PDF] arxiv.org

Group level MEG/EEG source imaging via optimal transport: minimum Wasserstein estimates

H Janati, T Bazeille, B Thirion, M Cuturi… - … Information Processing in …, 2019 - Springer

Magnetoencephalography (MEG) and electroencephalography (EEG) are non-invasive

modalities that measure the weak electromagnetic fields generated by neural activity.

Inferring the location of the current sources that generated these magnetic fields is an ill …

  Cited by 5 Related articles All 14 versions


2019


[PDF] arxiv.org

Optimal Transport Relaxations with Application to Wasserstein GANs

S Mahdian, J Blanchet, P Glynn - arXiv preprint arXiv:1906.03317, 2019 - arxiv.org

We propose a family of relaxations of the optimal transport problem which regularize the

problem by introducing an additional minimization step over a small region around one of

the underlying transporting measures. The type of regularization that we obtain is related to …

  Related articles All 4 versions 


[PDF] d-nb.info

[PDF] Algorithms for Optimal Transport and Wasserstein Distances

J Schrieber - 2019 - d-nb.info

Optimal Transport and Wasserstein Distance are closely related terms that do not only have

a long history in the mathematical literature, but also have seen a resurgence in recent

years, particularly in the context of the many applications they are used in, which span a …

  Related articles All 2 versions 


[PDF] iop.org

Wasserstein space as state space of quantum mechanics and optimal transport

MF Rosyid, K Wahyuningsih - Journal of Physics: Conference …, 2019 - iopscience.iop.org

In this work, we are in the position to view a measurement of a physical observable as an

experiment in the sense of probability theory. To every physical observable, a sample space

called the spectrum of the observable is therefore available. We have investigated the …

  Related articles All 2 versions


Wasserstein GAN · Depth First Learning

www.depthfirstlearning.com › WassersteinGAN

May 2, 2019 — The Wasserstein GAN (WGAN) is a GAN variant which uses the 1-Wasserstein ... By studying the WGAN, and its variant the WGAN-GP, we can learn a lot about ... learning as well as in both discriminative and generative methods. ... some approximation for a function we are trying to learn (an estimator).
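
The WGAN idea summarized in these notes trains a critic that approximates the Wasserstein-1 distance between real and generated data, and the WGAN-GP variant enforces the critic's 1-Lipschitz constraint with a gradient penalty. The snippet below is a heavily simplified sketch of that critic loss, assuming PyTorch; the toy critic, penalty weight, and data are illustrative assumptions, not any particular paper's code.

# Hedged sketch of a WGAN-GP style critic loss (PyTorch assumed).
import torch

def critic_loss(critic, real, fake, gp_weight=10.0):
    """Wasserstein critic loss with gradient penalty on random interpolates."""
    # The critic is trained to make E[critic(real)] - E[critic(fake)] large,
    # so we minimize the negated gap.
    gap = critic(fake).mean() - critic(real).mean()

    # Gradient penalty: push the critic's gradient norm towards 1 on interpolates.
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    interp = (eps * real + (1 - eps) * fake).requires_grad_(True)
    grads = torch.autograd.grad(critic(interp).sum(), interp, create_graph=True)[0]
    penalty = ((grads.flatten(1).norm(2, dim=1) - 1) ** 2).mean()

    return gap + gp_weight * penalty

# Toy usage on 2-D data with a small fully connected critic (illustrative only).
critic = torch.nn.Sequential(torch.nn.Linear(2, 64), torch.nn.ReLU(), torch.nn.Linear(64, 1))
real = torch.randn(32, 2)
fake = torch.randn(32, 2) + 0.5
print(critic_loss(critic, real, fake).item())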


Calculating spatial configurational entropy of a landscape mosaic based on the Wasserstein metric

Y Zhao, X Zhang - Landscape Ecology, 2019 - Springer

Context Entropy is an important concept traditionally associated with thermodynamics and is

widely used to describe the degree of disorder in a substance, system, or process.

Configurational entropy has received more attention because it better reflects the …

  Cited by 4 Related articles All 5 versions

<——2019—–—2019 ——1540—   



Personalized purchase prediction of market baskets with Wasserstein-based sequence matching

M Kraus, S Feuerriegel - Proceedings of the 25th ACM SIGKDD …, 2019 - dl.acm.org

Personalization in marketing aims at improving the shopping experience of customers by

tailoring services to individuals. In order to achieve this, businesses must be able to make

personalized predictions regarding the next purchase. That is, one must forecast the exact …

  Cited by 4 Related articles All 4 versions


[HTML] sciencedirect.com

[HTML] Wasserstein GAN-based small-sample augmentation for new-generation artificial intelligence: a case study of cancer-staging data in biology

Y Liu, Y Zhou, X Liu, F Dong, C Wang, Z Wang - Engineering, 2019 - Elsevier

It is essential to utilize deep-learning algorithms based on big data for the implementation of

the new generation of artificial intelligence. Effective utilization of deep learning relies

considerably on the number of labeled samples, which restricts the application of deep …

  Cited by 33 Related articles All 5 versions


[PDF] arxiv.org

Wgansing: A multi-voice singing voice synthesizer based on the wasserstein-gan

P Chandna, M Blaauw, J Bonada… - 2019 27th European …, 2019 - ieeexplore.ieee.org

We present a deep neural network based singing voice synthesizer, inspired by the Deep

Convolutions Generative Adversarial Networks (DCGAN) architecture and optimized using

the Wasserstein-GAN algorithm. We use vocoder parameters for acoustic modelling, to …

  Cited by 27 Related articles All 4 versions


[PDF] researchgate.net

Wasserstein metric based distributionally robust approximate framework for unit commitment

R Zhu, H Wei, X Bai - IEEE Transactions on Power Systems, 2019 - ieeexplore.ieee.org

This paper proposed a Wasserstein metric-based distributionally robust approximate

framework (WDRA), for unit commitment problem to manage the risk from uncertain wind

power forecasted errors. The ambiguity set employed in the distributionally robust …

  Cited by 28 Related articles All 3 versions


[PDF] arxiv.org

Wasserstein distance based domain adaptation for object detection

P Xu, P Gurram, G Whipps, R Chellappa - arXiv preprint arXiv:1909.08675, 2019 - arxiv.org

In this paper, we present an adversarial unsupervised domain adaptation framework for

object detection. Prior approaches utilize adversarial training based on cross entropy

between the source and target domain distributions to learn a shared feature mapping that …

  Cited by 6 Related articles All 2 versions 

2019


2019

[PDF] arxiv.org

Wasserstein distance based deep adversarial transfer learning for intelligent fault diagnosis

C Cheng, B Zhou, G Ma, D Wu, Y Yuan - arXiv preprint arXiv:1903.06753, 2019 - arxiv.org

The demand of artificial intelligent adoption for condition-based maintenance strategy is

astonishingly increased over the past few years. Intelligent fault diagnosis is one critical

topic of maintenance solution for mechanical systems. Deep learning models, such as …

  Cited by 16 Related articles All 3 versions 


[PDF] arxiv.org

Unsupervised adversarial domain adaptation based on the Wasserstein distance for acoustic scene classification

K Drossos, P Magron, T Virtanen - 2019 IEEE Workshop on …, 2019 - ieeexplore.ieee.org

A challenging problem in deep learning-based machine listening field is the degradation of

the performance when using data from unseen conditions. In this paper we focus on the

acoustic scene classification (ASC) task and propose an adversarial deep learning method …

  Cited by 14 Related articles All 5 versions


[PDF] ieee.org

Multi-source medical image fusion based on Wasserstein generative adversarial networks

Z Yang, Y Chen, Z Le, F Fan, E Pan - IEEE Access, 2019 - ieeexplore.ieee.org

In this paper, we propose the medical Wasserstein generative adversarial networks

(MWGAN), an end-to-end model, for fusing magnetic resonance imaging (MRI) and positron

emission tomography (PET) medical images. Our method establishes two adversarial …

  Cited by 6 Related articles


Wasserstein distance-based analogous method to predict distribution of non-uniform corrosion on reinforcements in concrete

Q Liu, RKL Su - Construction and Building Materials, 2019 - Elsevier

This paper presents an analogous method to predict the distribution of non-uniform

corrosion on reinforcements in concrete by minimizing the Wasserstein distance. A

comparison between the predicted and experimental results shows that the proposed …

  Cited by 6 Related articles All 3 versions


A virtual monochromatic imaging method for spectral CT based on Wasserstein generative adversarial network with a hybrid loss

Z Shi, J Li, H Li, Q Hu, Q Cao - IEEE Access, 2019 - ieeexplore.ieee.org

Spectral computed tomography (CT) has become a popular clinical diagnostic technique

because of its unique advantage in material distinction. Specifically, it can perform virtual

monochromatic imaging to obtain accurate tissue composition with less beam hardening …

  Cited by 8 Related articles All 2 versions

<——2019—–—2019 ——1550—  



[PDF] ieee.org

Prostate MR image segmentation with self-attention adversarial training based on wasserstein distance

C Su, R Huang, C Liu, T Yin, B Du - IEEE Access, 2019 - ieeexplore.ieee.org

Prostate diseases are very common in men. Accurate segmentation of the prostate plays a

significant role in further clinical treatment and diagnosis. There have been some methods

that combine the segmentation network and generative adversarial network, using the …

  Cited by 3 Related articles


[PDF] csroc.org.tw

[PDF] Cross-domain Text Sentiment Classification Based on Wasserstein Distance

G Cai, Q Lin, N Chen - Journal of Computers, 2019 - csroc.org.tw

Text sentiment analysis is mainly to detect the sentiment polarity implicit in text data. Most

existing supervised learning algorithms are difficult to solve the domain adaptation problem

in text sentiment analysis. The key of cross-domain text sentiment analysis is how to extract …

  Related articles All 2 versions 


[PDF] mdpi.com

Multi-turn chatbot based on query-context attentions and dual wasserstein generative adversarial networks

J Kim, S Oh, OW Kwon, H Kim - Applied Sciences, 2019 - mdpi.com

To generate proper responses to user queries, multi-turn chatbot models should selectively

consider dialogue histories. However, previous chatbot models have simply concatenated or

averaged vector representations of all previous utterances without considering contextual …

  Cited by 6 Related articles All 3 versions 


[PDF] wustl.edu

Grid-less DOA estimation using sparse linear arrays based on Wasserstein distance

M Wang, Z Zhang, A Nehorai - IEEE Signal Processing Letters, 2019 - ieeexplore.ieee.org

Sparse linear arrays, such as nested and co-prime arrays, are capable of resolving O (M2)

sources using only O (M) sensors by exploiting their so-called difference coarray model. One

popular approach to exploit the difference coarray model is to construct an augmented …

  Cited by 3 Related articles All 3 versions


[PDF] arxiv.org

Deep Distributional Sequence Embeddings Based on a Wasserstein Loss

A Abdelwahab, N Landwehr - arXiv preprint arXiv:1912.01933, 2019 - arxiv.org

Deep metric learning employs deep neural networks to embed instances into a metric space

such that distances between instances of the same class are small and distances between

instances from different classes are large. In most existing deep metric learning techniques …

  Cited by 1 Related articles All 2 versions 


2019


Misfit function for full waveform inversion based on the Wasserstein metric with dynamic formulation

P Yong, W Liao, J Huang, Z Li, Y Lin - Journal of Computational Physics, 2019 - Elsevier

Conventional full waveform inversion (FWI) using least square distance (L 2 norm) between

the observed and predicted seismograms suffers from local minima. Recently, the

Wasserstein metric (W 1 metric) has been introduced to FWI to compute the misfit between …

  Cited by 1 Related articles All 2 versions


[PDF] aaai.org

EWGAN: Entropy-based Wasserstein GAN for imbalanced learning

J Ren, Y Liu, J Liu - Proceedings of the AAAI Conference on Artificial …, 2019 - ojs.aaai.org

In this paper, we propose a novel oversampling strategy dubbed Entropy-based

Wasserstein Generative Adversarial Network (EWGAN) to generate data samples for

minority classes in imbalanced learning. First, we construct an entropyweighted label vector …

  Cited by 1 Related articles All 7 versions 


Gait recognition based on Wasserstein generating adversarial image inpainting network

L Xia, H Wang, W Guo - Journal of Central South University, 2019 - Springer

Aiming at the problem of small area human occlusion in gait recognition, a method based on

generating adversarial image inpainting network was proposed which can generate a

context consistent image for gait occlusion area. In order to reduce the effect of noise on …

  Cited by 2 Related articles


Evasion attacks based on wasserstein generative adversarial network

J Zhang, Q Yan, M Wang - 2019 Computing, Communications …, 2019 - ieeexplore.ieee.org

Security issues have been accompanied by the development of the artificial intelligence

industry. Machine learning has been widely used for fraud detection, spam detection, and

malicious file detection, since it has the ability to dig the value of big data. However, for …

  Cited by 1 Related articles


[PDF] arxiv.org

Optimal Fusion of Elliptic Extended Target Estimates Based on the Wasserstein Distance

K Thormann, M Baum - 2019 22th International Conference on …, 2019 - ieeexplore.ieee.org

This paper considers the fusion of multiple estimates of a spatially extended object, where

the object extent is modeled as an ellipse parameterized by the orientation and semi-axes

lengths. For this purpose, we propose a novel systematic approach that employs a distance …

  Cited by 1 Related articles All 5 versions

<——2019—–—2019 ——1560— 



[PDF] mdpi.com

Wasserstein Generative Adversarial Network Based De-Blurring Using Perceptual Similarity

M Hong, Y Choe - Applied Sciences, 2019 - mdpi.com

The de-blurring of blurred images is one of the most important image processing methods

and it can be used for the preprocessing step in many multimedia and computer vision

applications. Recently, de-blurring methods have been performed by neural network …

  Cited by 1 Related articles All 4 versions 


Data augmentation method of sar image dataset based on wasserstein generative adversarial networks

Q Lu, H Jiang, G Li, W Ye - 2019 International conference on …, 2019 - ieeexplore.ieee.org

The published Synthetic Aperture Radar (SAR) samples are not abundant enough, which is

not conducive to the application of deep learning methods in the field of SAR automatic

target recognition. Generative Adversarial Nets (GANs) is one of the most effective ways to …

  Cited by 1 Related articles All 2 versions


Aero-engine faults diagnosis based on K-means improved wasserstein GAN and relevant vector machine

Z Zhao, R Zhou, Z Dong - 2019 Chinese Control Conference …, 2019 - ieeexplore.ieee.org

The aero-engine faults diagnosis is essential to the safety of the long-endurance aircraft.

The problem of fault diagnosis for aero-engines is essentially a sort of model classification

problem. Due to the difficulty of the engine faults modeling, a data-driven approach is used …

  Cited by 2 Related articles


Unsupervised segmentation of colonic polyps in narrow-band imaging data based on manifold representation of images and Wasserstein distance

IN Figueiredo, L Pinto, PN Figueiredo, R Tsai - … Signal Processing and …, 2019 - Elsevier

Colorectal cancer (CRC) is one of the most common cancers worldwide and after a certain

age (≥ 50) regular colonoscopy examination for CRC screening is highly recommended.

One of the most prominent precursors of CRC are abnormal growths known as polyps. If a …

  Related articles All 4 versions


[PDF] unibocconi.it

[PDF] Bayesian model comparison based on Wasserstein distances

M Catalano, A Lijoi, I Pruenster - SIS 2019 Smart Statistics for …, 2019 - iris.unibocconi.it

Demography in the Digital Era: New Data Sources for Population Research

Diego Alburez-Gutierrez, Samin Aref, Sofia Gil-Clavel, André Grow, Daniela V. Negraia, Emilio …

  Cited by 2 Related articles 


2019


An Integrated Processing Method Based on Wasserstein Barycenter Algorithm for Automatic Music Transcription

C Jin, Z Li, Y Sun, H Zhang, X Lv, J Li, S Liu - International Conference on …, 2019 - Springer

Given a piece of acoustic musical signal, various automatic music transcription (AMT)

processing methods have been proposed to generate the corresponding music notations

without human intervention. However, the existing AMT methods based on signal …

  Related articles


[PDF] uchile.cl

[PDF] WASSERSTEIN-BASED DISTANCE FOR TIME SERIES ANALYSIS

E CAZELLES, A ROBERT, F TOBAR - cmm.uchile.cl

WASSERSTEIN-BASED DISTANCE FOR TIME SERIES ANALYSIS. ELSA CAZELLES,

ARNAUD ROBERT AND FELIPE TOBAR, UNIVERSIDAD DE CHILE. BACKGROUND: For a

stationary continuous-time time series x(t), the Power Spectral Density is given by S(ξ) = lim_{T→∞} …

  Related articles 
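For reference, the truncated formula in the snippet above is presumably the standard definition of the power spectral density. Stated here in the usual convention (my own completion, not recovered from the poster itself, and up to normalization choices):

$$S(\xi) \;=\; \lim_{T \to \infty} \frac{1}{T}\, \mathbb{E}\!\left[\,\left|\int_0^T x(t)\, e^{-i 2\pi \xi t}\, dt\right|^2\right].$$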


BIRCH Algorithm and Wasserstein Distance Metric Based Method for Generating Typical Scenarios of Wind Power Outputs

Q Li, X Tang, C Chen, X Liu, S Liu, X Shi… - … -Asia (ISGT Asia), 2019 - ieeexplore.ieee.org

With the ever-increasing penetration of renewable energy generation such as wind power

and solar photovoltaics, the power system concerned is suffering more extensive and

significant uncertainties. Scenario analysis has been utilized to solve this problem for power …

  Related articles


Unsupervised feature extraction based on improved Wasserstein generative adversarial network for hyperspectral classification

Q Sun, S Bourennane - Multimodal Sensing: Technologies …, 2019 - spiedigitallibrary.org

Accurate classification is one of the most important prerequisites for hyperspectral

applications and feature extraction is the key step of classification. Recently, deep learning

models have been successfully used to extract the spectral-spatial features in hyperspectral …

  Related articles All 4 versions


Weibo Authorship Identification based on Wasserstein generative adversarial networks

W Tang, C Wu, X Chen, Y Sun… - 2019 IEEE International …, 2019 - ieeexplore.ieee.org

During the past years, authorship identification has played a significant role in the public

security area. Recently, deep learning based approaches have been used in authorship

identification. However, all approaches based on deep learning require a large amount of …

  Related articles

<——2019—–—2019 ——1570— 



[PDF] csroc.org.tw

[PDF] Cross-domain Text Sentiment Classification Based on Wasserstein Distance

G Cai, Q Lin, N Chen - Journal of Computers, 2019 - csroc.org.tw

Text sentiment analysis is mainly to detect the sentiment polarity implicit in text data. Most

existing supervised learning algorithms are difficult to solve the domain adaptation problem

in text sentiment analysis. The key of cross-domain text sentiment analysis is how to extract …

  Related articles All 2 versions 


Frame-level speech enhancement based on Wasserstein GAN

P Chuan, T Lan, M Li, S Li, Q Liu - … International Conference on …, 2019 - spiedigitallibrary.org

Speech enhancement is a challenging and critical task in the speech processing research

area. In this paper, we propose a novel speech enhancement model based on Wasserstein

generative adversarial networks, called WSEM. The proposed model operates on frame …

  Related articles All 2 versions


Prioritized Experience Replay based on the Wasserstein Metric in Deep Reinforcement Learning: The regularizing effect of modelling return distributions

T Greevink - 2019 - repository.tudelft.nl

This thesis tests the hypothesis that distributional deep reinforcement learning (RL)

algorithms get an increased performance over expectation based deep RL because of the

regularizing effect of fitting a more complex model. This hypothesis was tested by comparing …



[PDF] arxiv.org

Wasserstein distance based domain adaptation for object detection

P Xu, P Gurram, G Whipps, R Chellappa - arXiv preprint arXiv:1909.08675, 2019 - arxiv.org

In this paper, we present an adversarial unsupervised domain adaptation framework for

object detection. Prior approaches utilize adversarial training based on cross entropy

between the source and target domain distributions to learn a shared feature mapping that …

  Cited by 6 Related articles All 2 versions 


 [PDF] arxiv.org

A Wasserstein Inequality and Minimal Green Energy on Compact Manifolds

S Steinerberger - arXiv preprint arXiv:1907.09023, 2019 - arxiv.org

Let $M$ be a smooth, compact $d$-dimensional manifold, $d \geq 3$, without boundary

and let $G: M \times M \rightarrow \mathbb{R} \cup \{\infty\}$ denote the Green's

function of the Laplacian $-\Delta$ (normalized to have mean value 0). We prove a bound …

  Cited by 3 Related articles All 2 versions 


[PDF] springer.com

Convergence to equilibrium in Wasserstein distance for damped Euler equations with interaction forces

JA Carrillo, YP Choi, O Tse - Communications in Mathematical Physics, 2019 - Springer

We develop tools to construct Lyapunov functionals on the space of probability measures in

order to investigate the convergence to global equilibrium of a damped Euler system under

the influence of external and interaction potential forces with respect to the 2-Wasserstein  …

  Cited by 11 Related articles All 11 versions


Artifact correction in low‐dose dental CT imaging using Wasserstein generative adversarial networks

Z Hu, C Jiang, F Sun, Q Zhang, Y Ge, Y Yang… - Medical …, 2019 - Wiley Online Library

… We used clinical dental CT images as the high-quality images

; then, we employed the FBP algorithm to reconstruct the …

  Cited by 64 Related articles All 4 versions

Kernelized wasserstein natural gradient

M Arbel, A Gretton, W Li, G Montúfar - arXiv preprint arXiv:1910.09652, 2019 - arxiv.org

Many machine learning problems can be expressed as the optimization of some cost

functional over a parametric family of probability distributions. It is often beneficial to solve

such optimization problems using natural gradient methods. These methods are invariant to …

  Cited by 6 Related articles All 7 versions 


[PDF] arxiv.org

On distributionally robust chance constrained programs with Wasserstein distance

W Xie - Mathematical Programming, 2019 - Springer

This paper studies a distributionally robust chance constrained program (DRCCP) with

Wasserstein ambiguity set, where the uncertain constraints should be satisfied with a

probability at least a given threshold for all the probability distributions of the uncertain …

  Cited by 47 Related articles All 9 versions


[PDF] arxiv.org

Data-driven chance constrained optimization under Wasserstein ambiguity sets

AR Hota, A Cherukuri, J Lygeros - 2019 American Control …, 2019 - ieeexplore.ieee.org

We present a data-driven approach for distri-butionally robust chance constrained

optimization problems (DRCCPs). We consider the case where the decision maker has

access to a finite number of samples or realizations of the uncertainty. The chance constraint  …

  Cited by 21 Related articles All 4 versions

<——2019—–—2019 ——1580— 



[PDF] mlr.press

A gradual, semi-discrete approach to generative network training via explicit wasserstein minimization

Y Chen, M Telgarsky, C Zhang… - International …, 2019 - proceedings.mlr.press

This paper provides a simple procedure to fit generative networks to target distributions, with

the goal of a small Wasserstein distance (or other optimal transport costs). The approach is

based on two principles:(a) if the source randomness of the network is a continuous …

  Cited by 4 Related articles All 10 versions 


[PDF] arxiv.org

The boundary method for semi-discrete optimal transport partitions and Wasserstein distance computation

L Dieci, JD Walsh III - Journal of Computational and Applied Mathematics, 2019 - Elsevier

We introduce a new technique, which we call the boundary method, for solving semi-

discrete optimal transport problems with a wide range of cost functions. The boundary

method reduces the effective dimension of the problem, thus improving complexity. For cost …

  Cited by 7 Related articles All 5 versions


[PDF] sciencedirect.com

Distributionally Robust Chance-Constraint Optimal Power Flow Considering Uncertain Renewables with Wasserstein-Moment Metric

J Liu, Y Chen, C Duan, J Lyu - Energy Procedia, 2019 - Elsevier

Chance-constraint optimal power flow has been proven as an efficient method to manage

the risk of volatile renewable energy sources. To address the uncertainties of renewable

energy sources, a novel distributionally robust chance-constraint OPF model is proposed in …

  Cited by 1 Related articles All 2 versions


[PDF] arxiv.org

On isometric embeddings of Wasserstein spaces–the discrete case

GP Gehér, T Titkos, D Virosztek - Journal of Mathematical Analysis and …, 2019 - Elsevier

The aim of this short paper is to offer a complete characterization of all (not necessarily

surjective) isometric embeddings of the Wasserstein space W p (X), where X is a countable

discrete metric space and 0< p<∞ is any parameter value. Roughly speaking, we will prove …

  Cited by 3 Related articles All 8 versions


arXiv  

Strongly Polynomial 2-Approximations of Discrete Wasserstein Barycenters

Steffen Borgwardt

Wasserstein barycenters correspond to optimal solutions of transportation problems for several marginals. They arise in applications from economics to statistics. In many applications, data is given as a set of probability measures with finite support. The discrete barycenters that arise in this setting exhibit favorable properties: All barycenters have finite support, and there always is one with a provably sparse support. Further, each barycenter allows a non-mass splitting optimal transport to each of the marginals.

It is open whether the computation of a discrete barycenter is possible in polynomial time. The best known algorithms are based on linear programming, but the sizes of these programs scale exponentially. In this paper, we prove that there is a strongly polynomial, tight 2-approximation, based on restricting the possible support of the approximate barycenter to the support of the measures. The resulting measure is sparse, but its optimal transport will generally split mass. We then exhibit an algorithm to recover the non-mass split property in strongly polynomial time. Finally, we present an iterative scheme that alternates between these two computations. It terminates with a 2-approximation that has a sparse support and does not split mass at the same time. We conclude with some practical computations.

Subjects:

Optimization and Control (math.OC)

MSC classes:

90B80, 90C05, 90C46, 90C90

Cite as:

arXiv:1704.05491 [math.OC]

 

(or arXiv:1704.05491v2 [math.OC] for this version)

Submission history

[v5] Mon, 9 Sep 2019 21:02:56 UTC (295 KB)

[v6] Wed, 22 Apr 2020 21:31:25 UTC (303 KB)
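To make the restriction described in the abstract concrete, here is a rough Python sketch (my own illustration, not Borgwardt's algorithm, and without the mass-split recovery step): the approximate barycenter's support is fixed to the union of the input supports, which turns the problem into a single linear program solvable with scipy.optimize.linprog. The helper name restricted_support_barycenter and the (points, weights) data layout are assumptions made for the example.

import numpy as np
from scipy.optimize import linprog

def restricted_support_barycenter(measures, lam=None):
    # measures: list of (points, weights) pairs; points is (n_k, d), weights sums to 1.
    support = np.vstack([pts for pts, _ in measures])        # candidate support: union of inputs
    S = len(support)
    if lam is None:
        lam = [1.0 / len(measures)] * len(measures)
    n_var = S + S * sum(len(w) for _, w in measures)          # z followed by one plan per measure
    c = np.zeros(n_var)
    A_eq, b_eq = [], []
    off = S
    for k, (pts, w) in enumerate(measures):
        nk = len(w)
        C = ((support[:, None, :] - pts[None, :, :]) ** 2).sum(-1)   # squared Euclidean costs
        c[off:off + S * nk] = lam[k] * C.ravel()
        for j in range(nk):                                   # marginal onto measure k
            row = np.zeros(n_var)
            row[off + j:off + S * nk:nk] = 1.0
            A_eq.append(row); b_eq.append(w[j])
        for s in range(S):                                    # couple plan k to barycenter weights z
            row = np.zeros(n_var)
            row[off + s * nk:off + (s + 1) * nk] = 1.0
            row[s] = -1.0
            A_eq.append(row); b_eq.append(0.0)
        off += S * nk
    res = linprog(c, A_eq=np.array(A_eq), b_eq=np.array(b_eq),
                  bounds=[(0, None)] * n_var, method="highs")
    return support, res.x[:S]                                 # barycenter support and weights

# Tiny example: three random planar measures with two atoms each.
rng = np.random.default_rng(0)
measures = [(rng.normal(size=(2, 2)), np.array([0.5, 0.5])) for _ in range(3)]
pts, z = restricted_support_barycenter(measures)

The resulting measure is sparse only up to the solver's output; the paper's additional steps to recover a non-mass-splitting transport are not reproduced here.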


2020


Unsupervised alignment of embeddings with wasserstein procrustes

E Grave, A Joulin, Q Berthet - The 22nd International …, 2019 - proceedings.mlr.press

We consider the task of aligning two sets of points in high dimension, which has many

applications in natural language processing and computer vision. As an example, it was

recently shown that it is possible to infer a bilingual lexicon, without supervised data, by …

  Cited by 81 Related articles All 3 versions 
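A rough sketch of the alternating scheme commonly associated with Wasserstein Procrustes (a simplification for illustration, not the authors' stochastic large-scale algorithm): alternate between the optimal point matching for a fixed orthogonal map, solved as an assignment problem, and the optimal orthogonal map for a fixed matching, solved by classical Procrustes via an SVD. The function name wasserstein_procrustes is assumed for the example.

import numpy as np
from scipy.optimize import linear_sum_assignment

def wasserstein_procrustes(X, Y, n_iter=10):
    # X, Y: (n, d) point sets to be aligned; returns orthogonal Q and a permutation.
    n, d = X.shape
    Q = np.eye(d)                                  # current orthogonal map
    perm = np.arange(n)                            # current matching
    for _ in range(n_iter):
        XQ = X @ Q
        cost = ((XQ[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        _, perm = linear_sum_assignment(cost)      # best matching for fixed Q
        U, _, Vt = np.linalg.svd(X.T @ Y[perm])    # best Q for fixed matching (Procrustes)
        Q = U @ Vt
    return Q, perm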


[PDF] mlr.press

Gromov-wasserstein learning for graph matching and node embedding

H Xu, D Luo, H Zha, LC Duke - International conference on …, 2019 - proceedings.mlr.press

A novel Gromov-Wasserstein learning framework is proposed to jointly match (align) graphs

and learn embedding vectors for the associated graph nodes. Using Gromov-Wasserstein

discrepancy, we measure the dissimilarity between two graphs and find their …

  Cited by 50 Related articles All 9 versions 


2020

[HTML] sciencedirect.com

[HTML] Wasserstein GAN-based small-sample augmentation for new-generation artificial intelligence: a case study of cancer-staging data in biology

Y Liu, Y Zhou, X Liu, F Dong, C Wang, Z Wang - Engineering, 2019 - Elsevier

It is essential to utilize deep-learning algorithms based on big data for the implementation of

the new generation of artificial intelligence. Effective utilization of deep learning relies

considerably on the number of labeled samples, which restricts the application of deep …

  Cited by 33 Related articles All 5 versions


[PDF] aclweb.org

Modeling personalization in continuous space for response generation via augmented wasserstein autoencoders

Z Chan, J Li, X Yang, X Chen, W Hu, D Zhao… - Proceedings of the 2019 …, 2019 - aclweb.org

Abstract Variational autoencoders (VAEs) and Wasserstein autoencoders (WAEs) have

achieved noticeable progress in open-domain response generation. Through introducing

latent variables in continuous space, these models are capable of capturing utterance-level …

  Cited by 14 Related articles All 3 versions 


[PDF] thecvf.com

Joint wasserstein autoencoders for aligning multimodal embeddings

S Mahajan, T Botschen… - Proceedings of the …, 2019 - openaccess.thecvf.com

One of the key challenges in learning joint embeddings of multiple modalities, eg of images

and text, is to ensure coherent cross-modal semantics that generalize across datasets. We

propose to address this through joint Gaussian regularization of the latent representations …

  Cited by 2 Related articles All 6 versions 

<——2019—–—2019 ——1590—


[PDF] arxiv.org

Learning embeddings into entropic wasserstein spaces

C Frogner, F Mirzazadeh, J Solomon - arXiv preprint arXiv:1905.03329, 2019 - arxiv.org

Euclidean embeddings of data are fundamentally limited in their ability to capture latent

semantic structures, which need not conform to Euclidean spatial assumptions. Here we

consider an alternative, which embeds data as discrete probability distributions in a …

  Cited by 3 Related articles All 7 versions 


[PDF] arxiv.org

Towards diverse paraphrase generation using multi-class wasserstein GAN

Z An, S Liu - arXiv preprint arXiv:1909.13827, 2019 - arxiv.org

Paraphrase generation is an important and challenging natural language processing (NLP)

task. In this work, we propose a deep generative model to generate paraphrase with

diversity. Our model is based on an encoder-decoder architecture. An additional transcoder …

  Cited by 4 Related articles All 3 versions 


[PDF] arxiv.org

Deep Distributional Sequence Embeddings Based on a Wasserstein Loss

A Abdelwahab, N Landwehr - arXiv preprint arXiv:1912.01933, 2019 - arxiv.org

Deep metric learning employs deep neural networks to embed instances into a metric space

such that distances between instances of the same class are small and distances between

instances from different classes are large. In most existing deep metric learning techniques …

  Cited by 1 Related articles All 2 versions 




Time delay estimation via Wasserstein distance minimization

JM Nichols, MN Hutchinson, N Menkart… - IEEE Signal …, 2019 - ieeexplore.ieee.org

Time delay estimation between signals propagating through nonlinear media is an important

problem with application to radar, underwater acoustics, damage detection, and

communications (to name a few). Here, we describe a simple approach for determining the …

  Cited by 3 Related articles All 2 versions
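As a generic illustration of the idea (assumptions mine, not necessarily the authors' estimator): if two nonnegative, unit-area signals are treated as densities over the time axis, the 1-D Wasserstein-1 distance reduces to an integral of the difference of their cumulative distribution functions, and a delay can then be estimated by minimizing this distance over candidate lags.

import numpy as np

def w1_signals(f, g, t):
    # W_1 between two nonnegative, unit-area signals viewed as densities on grid t.
    dt = t[1] - t[0]
    F, G = np.cumsum(f) * dt, np.cumsum(g) * dt               # empirical CDFs
    return float(np.sum(np.abs(F - G)) * dt)                  # W_1 = integral |F - G|

def estimate_delay(f, g, t, lags):
    # Return the lag (in samples) minimizing W_1 between f and the shifted g.
    costs = [w1_signals(f, np.roll(g, k), t) for k in lags]   # circular shift for simplicity
    return lags[int(np.argmin(costs))]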



[PDF] aaai.org

Manifold-valued image generation with Wasserstein generative adversarial nets

Z Huang, J Wu, L Van Gool - Proceedings of the AAAI Conference on …, 2019 - ojs.aaai.org

Generative modeling over natural images is one of the most fundamental machine learning

problems. However, few modern generative models, including Wasserstein Generative

Adversarial Nets (WGANs), are studied on manifold-valued images that are frequently …

  Cited by 4 Related articles All 13 versions 


2019


On isometric embeddings of Wasserstein spaces–the discrete case

GP Gehér, T Titkos, D Virosztek - Journal of Mathematical Analysis and …, 2019 - Elsevier

The aim of this short paper is to offer a complete characterization of all (not necessarily

surjective) isometric embeddings of the Wasserstein space W p (X), where X is a countable

discrete metric space and 0< p<∞ is any parameter value. Roughly speaking, we will prove …

  Cited by 3 Related articles All 8 versions


[PDF] arxiv.org

Wasserstein distances for evaluating cross-lingual embeddings

G Balikas, I Partalas - arXiv preprint arXiv:1910.11005, 2019 - arxiv.org

Word embeddings are high dimensional vector representations of words that capture their

semantic similarity in the vector space. There exist several algorithms for learning such

embeddings both for a single language as well as for several languages jointly. In this work …

  Related articles All 3 versions 


[PDF] arxiv.org

Stylized Text Generation Using Wasserstein Autoencoders with a Mixture of Gaussian Prior

A Ghabussi, L Mou, O Vechtomova - arXiv preprint arXiv:1911.03828, 2019 - arxiv.org

Wasserstein autoencoders are effective for text generation. They do not however provide

any control over the style and topic of the generated sentences if the dataset has multiple

classes and includes different topics. In this work, we present a semi-supervised approach …

  Related articles All 2 versions 


[HTML] deepai.org

[HTML] Manifold-valued image generation with wasserstein adversarial networks

EW GANs - 2019 - deepai.org

Unsupervised image generation has recently received an increasing amount of attention thanks

to the great success of generative adversarial networks (GANs), particularly Wasserstein

GANs. Inspired by the paradigm of real-valued image generation, this paper makes the first attempt …

  Cited by 2 Related articles 


[PDF] bayesiandeeplearning.org

[PDF] Nested-Wasserstein Distance for Sequence Generation

R Zhang, C Chen, Z Gan, Z Wen, W Wang, L Carin - bayesiandeeplearning.org

Reinforcement learning (RL) has been widely studied for improving sequencegeneration

models. However, the conventional rewards used for RL training typically cannot capture

sufficient semantic information and therefore render model bias. Further, the sparse and …

  Related articles 

<——2019—–—2019 ——1600—  


[PDF] ceur-ws.org

[PDF] Dialogue response generation with Wasserstein generative adversarial networks

SAS Gilani, E Jembere, AW Pillay - 2019 - ceur-ws.org

This research evaluates the effectiveness of a Generative Adversarial Network (GAN) for

open domain dialogue response systems. The research involves developing and evaluating

a Conditional Wasserstein GAN (CWGAN) for natural dialogue response generation. We …

  Related articles 


[PDF] arxiv.org

Learning Embeddings into Entropic Wasserstein Spaces

C Frogner, F Mirzazadeh, J Solomon - arXiv preprint arXiv:1905.03329, 2019 - arxiv.org

Euclidean embeddings of data are fundamentally limited in their ability to capture latent semantic structures, which need not conform to Euclidean spatial assumptions. Here we consider an alternative, which embeds data as discrete probability distributions in a …

Related articles

Learning Embeddings into Entropic Wasserstein Spaces

arxiv.org › cs

by C Frogner · 2019 · Cited by 3 — We exploit this flexibility by learning an embedding that captures semantic information in the Wasserstein distance between embedded ...

[CITATION] Learning entropic wasserstein embeddings

C Frogner, F Mirzazadeh, J Solomon - International Conference on Learning …, 2019

  Cited by 5 Related articles

[CITATION] Time Series Generation using a One Dimensional Wasserstein GAN

EK Smith, OA Smith - ITISE 2019 International Conference on Time Series …, 2019

  Cited by 11 Related articles All 7 versions


Can the Wasserstein loss be negative? - CodeRoad

coderoad.ru › Потеря-вассерштейна...

Jul 19, 2019 — I am currently training a WGAN in Keras with (approximately) the Wasserstein loss as shown below: def wasserstein_loss(y_true, y_pred): ...

[Translated from Russian: Can the Wasserstein distance loss be negative?]
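For context, the loss referred to in the question is usually implemented as below (a common community convention, not necessarily the asker's exact code). Because the WGAN critic outputs unbounded real scores rather than probabilities, this loss can legitimately take negative values.

from tensorflow.keras import backend as K

def wasserstein_loss(y_true, y_pred):
    # Real/fake samples are labelled +1/-1, so minimizing the mean of
    # y_true * y_pred implements the critic objective E[f(real)] - E[f(fake)];
    # the critic output is an unbounded score, hence the loss can be negative.
    return K.mean(y_true * y_pred)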

[PDF] arxiv.org

How Well Do WGANs Estimate the Wasserstein Metric?

A Mallasto, G Montúfar, A Gerolin - arXiv preprint arXiv:1910.03875, 2019 - arxiv.org

Generative modelling is often cast as minimizing a similarity measure between a data

distribution and a model distribution. Recently, a popular choice for the similarity measure

has been the Wasserstein metric, which can be expressed in the Kantorovich duality …

  Cited by 5 Related articles All 5 versions 


Feature augmentation for imbalanced classification with conditional mixture WGANs

Y Zhang, B Sun, Y Xiao, R Xiao, YG Wei - Signal Processing: Image …, 2019 - Elsevier

Heterogeneity of class distribution is an intrinsic property of a real-world dataset. Therefore,

imbalanced classification is a popular but challenging task. Several methods exist to

address this problem. Notably, the adversarial-based data augmentation method, which …

Cited by 17 Related articles All 2 versions


2019


CWGAN: Conditional wasserstein generative adversarial nets for fault data generation

Y Yu, B Tang, R Lin, S Han, T Tang… - 2019 IEEE International …, 2019 - ieeexplore.ieee.org

With the rapid development of modern industry and artificial intelligence technology, fault

diagnosis technology has become more automated and intelligent. The deep learning

based fault diagnosis model has achieved significant advantages over the traditional fault …

  Cited by 3 Related articles All 2 versions


[PDF] arxiv.org

Conditional WGANs with Adaptive Gradient Balancing for Sparse MRI Reconstruction

I Malkiel, S Ahn, V Taviani, A Menini, L Wolf… - arXiv preprint arXiv …, 2019 - arxiv.org

Recent sparse MRI reconstruction models have used Deep Neural Networks (DNNs) to

reconstruct relatively high-quality images from highly undersampled k-space data, enabling

much faster MRI scanning. However, these techniques sometimes struggle to reconstruct …

 Cited by 4 Related articles All 2 versions 


[PDF] aaai.org

EWGAN: Entropy-based Wasserstein GAN for imbalanced learning

J Ren, Y Liu, J Liu - Proceedings of the AAAI Conference on Artificial …, 2019 - ojs.aaai.org

In this paper, we propose a novel oversampling strategy dubbed Entropy-based

Wasserstein Generative Adversarial Network (EWGAN) to generate data samples for

minority classes in imbalanced learning. First, we construct an entropy-weighted label vector …

  Cited by 1 Related articles All 7 versions 


[PDF] arxiv.org

Study of Constrained Network Structures for WGANs on Numeric Data Generation

W Wang, C Wang, T Cui, Y Li - arXiv preprint arXiv:1911.01649, 2019 - arxiv.org

Some recent studies have suggested using GANs for numeric data generation such as to

generate data for completing the imbalanced numeric data. Considering the significant

difference between the dimensions of the numeric data and images, as well as the strong …

  Related articles All 2 versions 


[PDF] nii.ac.jp

An Attempt to Improve Generalization Performance in Reinforcement Learning with Deterministic World Models and WGANs

T Yu, Y Tsuruoka - 2019 - ipsj.ixsq.nii.ac.jp

Significant progress has been made in the field of Reinforcement Learning (RL) in recent

years. Using artificial neural networks, researchers are able to train agents that can play

video games as well as or even better than human experts. However, it is common that the …

  Related articles 

[CITATION] An Attempt to Improve Generalization Performance in Reinforcement Learning with Deterministic World Models and WGANs

Y Tianshuai, T Yoshimasa - Game Programming Workshop 2019 Proceedings …, 2019 - ci.nii.ac.jp

… An Attempt to Improve Generalization Performance in Reinforcement Learning with

Deterministic World Models and WGANs An Attempt to Improve Generalization Performance

in Reinforcement Learning with Deterministic World Models and WGANs …

  <——2019—–—2019 ——1610— 


[PDF] stanford.edu

[PDF] A Privacy Preserved Image-to-Image Translation Model in MRI: Distributed Learning of WGANs

T Ergen, B Ozturkler, B Isik - cs229.stanford.edu

In this project, we introduce a distributed training approach for Generative Adversarial

Networks (GANs) on Magnetic Resonance Imaging (MRI) tasks. In our distributed

framework, we have n discriminators and a single generator. We first generate fake images …

  Related articles 

[PDF] A PRIVACY PRESERVED IMAGE-TO-IMAGE TRANSLATION MODEL IN MRI: DISTRIBUTED LEARNING OF WGANS

B ISIK, B OZTURKLER, T ERGEN - cs229.stanford.edu

… for MNIST and an MRI dataset. • Our setting worked successfully on the MNIST dataset, as can

be seen from the evolution of fake images in Figures 2, 4, 6. Similarly, from Figures 3, 5, 7,

it is seen that we were able to generate fake images (two left-most images) very similar to …

  Related articles 


On the Bures–Wasserstein distance between positive definite matrices

R Bhatia, T Jain, Y Lim - Expositiones Mathematicae, 2019 - Elsevier

The metric $d(A,B) = \big[\operatorname{tr} A + \operatorname{tr} B - 2 \operatorname{tr}\big(A^{1/2} B A^{1/2}\big)^{1/2}\big]^{1/2}$ on the manifold of $n \times n$

positive definite matrices arises in various optimisation problems, in quantum information

and in the theory of optimal transport. It is also related to Riemannian geometry. In the first …

  Cited by 95 Related articles All 6 versions
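A minimal NumPy/SciPy sketch of evaluating this metric (my own illustration; the function name bures_wasserstein is an assumption, not code from the paper):

import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein(A, B):
    # d(A, B) = [tr A + tr B - 2 tr (A^{1/2} B A^{1/2})^{1/2}]^{1/2} for SPD A, B.
    sqrtA = sqrtm(A)
    cross = sqrtm(sqrtA @ B @ sqrtA)
    d2 = np.trace(A) + np.trace(B) - 2.0 * np.trace(cross)
    return float(np.sqrt(max(np.real(d2), 0.0)))              # clip tiny negative round-off

# Example with two random SPD matrices.
rng = np.random.default_rng(0)
X, Y = rng.normal(size=(4, 4)), rng.normal(size=(4, 4))
A, B = X @ X.T + np.eye(4), Y @ Y.T + np.eye(4)
print(bures_wasserstein(A, B))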


[PDF] arxiv.org

Estimation of Wasserstein distances in the spiked transport model

J Niles-Weed, P Rigollet - arXiv preprint arXiv:1909.07513, 2019 - arxiv.org

We propose a new statistical model, the spiked transport model, which formalizes the

assumption that two probability distributions differ only on a low-dimensional subspace. We

study the minimax rate of estimation for the Wasserstein distance under this model and show …

  Cited by 16 Related articles All 2 versions 


2019  [PDF] arxiv.org

Wasserstein barycenter model ensembling

P Dognin, I Melnyk, Y Mroueh, J Ross… - arXiv preprint arXiv …, 2019 - arxiv.org

In this paper we propose to perform model ensembling in a multiclass or a multilabel

learning setting using Wasserstein (W.) barycenters. Optimal transport metrics, such as the

Wasserstein distance, allow incorporating semantic side information such as word …

  Cited by 8 Related articles All 4 versions 


2019

[PDF] arxiv.org

Inequalities for the Wasserstein mean of positive definite matrices

R Bhatia, T Jain, Y Lim - Linear Algebra and its Applications, 2019 - Elsevier

Let $A_1, \ldots, A_m$ be given positive definite matrices and let $w = (w_1, \ldots, w_m)$ be a vector of

weights, i.e., $w_j \ge 0$ and $\sum_{j=1}^{m} w_j = 1$. Then the (weighted) Wasserstein mean, or the Wasserstein

barycentre, of $A_1, \ldots, A_m$ is defined as (2) $\Omega(w; A_1, \ldots, A_m) = \operatorname{argmin}_{X \in \mathbb{P}} \sum_{j=1}^{m} w\ldots$

  Cited by 12 Related articles All 5 versions
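The mean above minimizes a weighted sum of squared Bures-Wasserstein distances to the given matrices. A minimal numerical sketch (my own illustration, using the standard fixed-point iteration of Álvarez-Esteban et al. 2016 rather than anything from the paper; wasserstein_mean is an assumed name):

import numpy as np
from scipy.linalg import sqrtm, inv

def wasserstein_mean(As, w, iters=50):
    # Fixed-point iteration for the Bures-Wasserstein barycentre of SPD matrices.
    X = sum(wj * Aj for wj, Aj in zip(w, As))                 # start from the arithmetic mean
    for _ in range(iters):
        Xh = sqrtm(X).real
        Xh_inv = inv(Xh)
        S = sum(wj * sqrtm(Xh @ Aj @ Xh).real for wj, Aj in zip(w, As))
        X = Xh_inv @ S @ S @ Xh_inv                           # X <- X^{-1/2} S^2 X^{-1/2}
    return X

# Example: mean of two diagonal SPD matrices with equal weights.
A1, A2 = np.diag([1.0, 4.0]), np.diag([9.0, 1.0])
print(wasserstein_mean([A1, A2], [0.5, 0.5]))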


[PDF] ieee.org

A deep transfer model with wasserstein distance guided multi-adversarial networks for bearing fault diagnosis under different working conditions

M Zhang, D Wang, W Lu, J Yang, Z Li, B Liang - IEEE Access, 2019 - ieeexplore.ieee.org

In recent years, intelligent fault diagnosis technology with the deep learning algorithm has

been widely used in the manufacturing industry for substituting time-consuming human

analysis method to enhance the efficiency of fault diagnosis. The rolling bearing as the …

  Cited by 25 Related articles All 5 versions


[PDF] uclouvain.be

Curvature of the manifold of fixed-rank positive-semidefinite matrices endowed with the Bures–Wasserstein metric

E Massart, JM Hendrickx, PA Absil - International Conference on …, 2019 - Springer

We consider the manifold of rank-p positive-semidefinite matrices of size n, seen as a

quotient of the set of full-rank n-by-p matrices by the orthogonal group in dimension p. The

resulting distance coincides with the Wasserstein distance between centered degenerate …

  Cited by 6 Related articles All 5 versions


[PDF] arxiv.org

Alpha Procrustes metrics between positive definite operators: a unifying formulation for the Bures-Wasserstein and Log-Euclidean/Log-Hilbert-Schmidt metrics

MH Quang - arXiv preprint arXiv:1908.09275, 2019 - arxiv.org

This work presents a parametrized family of distances, namely the Alpha Procrustes

distances, on the set of symmetric, positive definite (SPD) matrices. The Alpha Procrustes

distances provide a unified formulation encompassing both the Bures-Wasserstein and Log …

  Cited by 4 Related articles All 2 versions 


Wasserstein norm for signed measures, with application to nonlocal transport equation with source term

B Piccoli, F Rossi, M Tournus - arXiv preprint arXiv:1910.05105, 2019 - arxiv.org

We introduce the optimal transportation interpretation of the Kantorovich norm on the space

of signed Radon measures with finite mass, based on a generalized Wasserstein

distance for measures with different masses. With the formulation and the new topological …

  Cited by 4 Related articles All 7 versions 

<——2019—–—2019 ——1620—  


[PDF] arxiv.org

A two-phase two-fluxes degenerate Cahn–Hilliard model as constrained Wasserstein gradient flow

C Cancès, D Matthes, F Nabet - Archive for Rational Mechanics and …, 2019 - Springer

We study a non-local version of the Cahn–Hilliard dynamics for phase separation in a two-

component incompressible and immiscible mixture with linear mobilities. Differently to the

celebrated local model with nonlinear mobility, it is only assumed that the divergences of the …

  Cited by 8 Related articles All 17 versions


[PDF] arxiv.org

Temporal Wasserstein non-negative matrix factorization for non-rigid motion segmentation and spatiotemporal deconvolution

E Varol, A Nejatbakhsh, C McGrory - arXiv preprint arXiv:1912.03463, 2019 - arxiv.org

Motion segmentation for natural images commonly relies on dense optic flow to yield point

trajectories which can be grouped into clusters through various means including spectral

clustering or minimum cost multicuts. However, in biological imaging scenarios, such as …

  Cited by 2 Related articles All 3 versions 


[PDF] unibocconi.it

[PDF] Bayesian model comparison based on Wasserstein distances

M Catalano, A Lijoi, I Pruenster - SIS 2019 Smart Statistics for …, 2019 - iris.unibocconi.it

Demography in the Digital Era: New Data Sources for Population Research

Diego Alburez-Gutierrez, Samin Aref, Sofia Gil-Clavel, André Grow, Daniela V. Negraia, Emilio …

  Cited by 2 Related articles 


[PDF] arxiv.org

Local well-posedness in the Wasserstein space for a chemotaxis model coupled to Navier-Stokes equations

K Kang, HK Kim - arXiv preprint arXiv:1907.01895, 2019 - arxiv.org

We consider a coupled system of Keller-Segel type equations and the incompressible

Navier-Stokes equations in spatial dimension two and three. In the previous work [19], we

established the existence of a weak solution of a Fokker-Plank equation in the Wasserstein  …

  Related articles All 2 versions 


Wasserstein barycenters in the manifold of all positive definite matrices

E Nobari, B Ahmadi Kakavandi - Quarterly of Applied Mathematics, 2019 - ams.org

In this paper, we study the Wasserstein barycenter of finitely many Borel probability

measures on $\mathbb {P} _ {n} $, the Riemannian manifold of all $ n\times n $ real positive

definite matrices as well as its associated dual problem, namely the optimal transport …

  Related articles All 2 versions


2019


[PDF] rit.edu

A Comparative Assessment of the Impact of Various Norms on Wasserstein Generative Adversarial Networks

C Ramesh - 2019 - scholarworks.rit.edu

Abstract Generative Adversarial Networks (GANs) provide a fascinating new paradigm in

machine learning and artificial intelligence, especially in the context of unsupervised

learning. GANs are quickly becoming a state of the art tool, used in various applications …

  Related articles All 2 versions 


A degenerate Cahn‐Hilliard model as constrained Wasserstein gradient flow

D Matthes, C Cancès, F Nabet - PAMM, 2019 - Wiley Online Library

Existence of solutions to a non‐local Cahn‐Hilliard model with degenerate mobility is

considered. The PDE is written as a gradient flow with respect to the L2‐Wasserstein metric

for two components that are coupled by an incompressibility constraint. Approximating …

  Related articles


2019 

[PDF] A general solver to the elliptical mixture model through an ...

https://www.semanticscholar.org › paper › A-general-solv...

https://www.semanticscholar.org › paper › A-general-solv...

This paper studies the problem of estimation for general finite mixture models, with a particular focus on the elliptical mixture models (EMMs).

[CITATION] A general solver to the elliptical mixture model through an approximate wasserstein manifold

S Li, Z Yu, M Xiang, D Mandic - arXiv preprint arXiv:1906.03700, 2019

  Cited by 1 Related articles


[PDF] mlr.press

Wasserstein of Wasserstein loss for learning generative models

Y Dukler, W Li, A Lin… - … Conference on Machine …, 2019 - proceedings.mlr.press

The Wasserstein distance serves as a loss function for unsupervised learning which

depends on the choice of a ground metric on sample space. We propose to use the

Wasserstein distance itself as the ground metric on the sample space of images. This …

  Cited by 12 Related articles All 11 versions 


[PDF] thecvf.com

Sliced wasserstein generative models

J Wu, Z Huang, D Acharya, W Li… - Proceedings of the …, 2019 - openaccess.thecvf.com

In generative modeling, the Wasserstein distance (WD) has emerged as a useful metric to

measure the discrepancy between generated and real data distributions. Unfortunately, it is

challenging to approximate the WD of high-dimensional distributions. In contrast, the sliced …

  Cited by 48 Related articles All 12 versions 

[PDF] arxiv.org

<——2019—–—2019 ——1630— 


[PDF] arxiv.org

Asymptotic guarantees for learning generative models with the sliced-wasserstein distance

K Nadjahi, A Durmus, U Şimşekli, R Badeau - arXiv preprint arXiv …, 2019 - arxiv.org

Minimum expected distance estimation (MEDE) algorithms have been widely used for

probabilistic models with intractable likelihood functions and they have become increasingly

popular due to their use in implicit generative modeling (eg Wasserstein generative …

  Cited by 19 Related articles All 5 versions 
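A minimal Monte Carlo sketch of the sliced Wasserstein distance itself (my own illustration, not code from the paper): project both sample sets onto random directions and average the 1-D Wasserstein distances, which for equal sample sizes reduce to comparing sorted projections.

import numpy as np

def sliced_wasserstein(X, Y, n_proj=200, p=2, rng=None):
    # Sliced W_p between equally sized sample sets X, Y of shape (n, d).
    rng = rng or np.random.default_rng()
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=X.shape[1])
        theta /= np.linalg.norm(theta)                        # random unit direction
        xp, yp = np.sort(X @ theta), np.sort(Y @ theta)       # 1-D projections
        total += np.mean(np.abs(xp - yp) ** p)                # closed-form 1-D W_p^p
    return (total / n_proj) ** (1.0 / p)

# Example: two Gaussian samples with shifted means.
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(500, 3))
Y = rng.normal(1.0, 1.0, size=(500, 3))
print(sliced_wasserstein(X, Y, rng=rng))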


 

[PDF] arxiv.org

On distributionally robust chance constrained programs with Wasserstein distance

W Xie - Mathematical Programming, 2019 - Springer

This paper studies a distributionally robust chance constrained program (DRCCP) with

Wasserstein ambiguity set, where the uncertain constraints should be satisfied with a

probability at least a given threshold for all the probability distributions of the uncertain …

  Cited by 51 Related articles All 9 versions


Aggregated wasserstein distance and state registration for hidden markov models

Y Chen, J Ye, J Li - IEEE transactions on pattern analysis and …, 2019 - ieeexplore.ieee.org

We propose a framework, named Aggregated Wasserstein, for computing a dissimilarity

measure or distance between two Hidden Markov Models with state conditional distributions

being Gaussian. For such HMMs, the marginal distribution at any time position follows a …

Cited by 7 Related articles All 7 versions


[PDF] arxiv.org

On the minimax optimality of estimating the wasserstein metric

T Liang - arXiv preprint arXiv:1908.10324, 2019 - arxiv.org

We study the minimax optimal rate for estimating the Wasserstein-$1 $ metric between two

unknown probability measures based on $ n $ iid empirical samples from them. We show

that estimating the Wasserstein metric itself between probability measures, is not …

  Cited by 3 Related articles All 3 versions 


[PDF] arxiv.org

Second-Order Models for Optimal Transport and Cubic Splines on the Wasserstein Space

JD Benamou, TO Gallouët, FX Vialard - Foundations of Computational …, 2019 - Springer

On the space of probability densities, we extend the Wasserstein geodesics to the case of

higher-order interpolation such as cubic spline interpolation. After presenting the natural

extension of cubic splines to the Wasserstein space, we propose a simpler approach based …

  Cited by 9 Related articles All 5 versions



 2019  


Dynamic models of Wasserstein-1-type unbalanced transport

B Schmitzer, B Wirth - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

We consider a class of convex optimization problems modelling temporal mass transport

and mass change between two given mass distributions (the so-called dynamic formulation

of unbalanced transport), where we focus on those models for which transport costs are …

  Cited by 6 Related articles All 5 versions


Nonlinear model reduction on metric spaces. Application to one-dimensional conservative PDEs in Wasserstein ...

arxiv.org › math

by V Ehrlacher · 2019 · Cited by 4 — Application to one-dimensional conservative PDEs in Wasserstein spaces. We give theoretical and numerical evidence of their efficiency to reduce complexity for one-dimensional conservative PDEs where the underlying metric space can be chosen to be the L^2-Wasserstein space. ...

[PDF] arxiv.org

[CITATION] Nonlinear model reduction on metric spaces. Application to one-dimensional conservative PDEs in Wasserstein spaces

V Ehrlacher, D Lombardi, O Mula, FX Vialard - arXiv preprint arXiv:1909.06626, 2019

  Cited by 4 Related articles All 19 versions


[PDF] archives-ouvertes.fr

Diffusive processes on the Wasserstein space: Coalescing models, Regularization properties and McKean-Vlasov equations

V Marx - 2019 - tel.archives-ouvertes.fr

The aim of this thesis is to study a class of diffusive stochastic processes with values in the

space of probability measures on the real line, called Wasserstein space if it is endowed

with the Wasserstein metric W2. The following issues are mainly addressed in this work: how …

  Cited by 2 Related articles All 9 versions 


[PDF] arxiv.org

On the estimation of the Wasserstein distance in generative models

T Pinetz, D Soukup, T Pock - German Conference on Pattern Recognition, 2019 - Springer

Abstract Generative Adversarial Networks (GANs) have been used to model the underlying

probability distribution of sample-based datasets. GANs are notorious for training difficulties

and their dependence on arbitrary hyperparameters. One recent improvement in GAN …

  Related articles All 5 versions


Wasserstein Distances for Estimating Parameters in Stochastic Reaction Networks

K Öcal, R Grima, G Sanguinetti - International Conference on …, 2019 - Springer

Modern experimental methods such as flow cytometry and fluorescence in-situ hybridization

(FISH) allow the measurement of cell-by-cell molecule numbers for RNA, proteins and other

substances for large numbers of cells at a time, opening up new possibilities for the …

  Related articles All 3 versions

<——2019—–—2019 ——1640— 


[PDF] ieee.org

Wgan-based robust occluded facial expression recognition

Y Lu, S Wang, W Zhao, Y Zhao - IEEE Access, 2019 - ieeexplore.ieee.org

… recognition of occluded facial expression images is a topic that should be explored. In this paper, we proposed a novel Wasserstein … information, the recognition is achieved by learning …

 Cited by 25 Related articles All 2 versions


De Novo Protein Design for Novel Folds using Guided Conditional Wasserstein Generative Adversarial Networks (gcWGAN)

M Karimi, S Zhu, Y Cao, Y Shen - bioRxiv, 2019 - biorxiv.org

Motivation Facing data quickly accumulating on protein sequence and structure, this study is

addressing the following question: to what extent could current data alone reveal deep

insights into the sequence-structure relationship, such that new sequences can be designed …

  Cited by 6 Related articles All 4 versions 

[PDF] biorxiv.org

[PDF] De Novo Protein Design for Novel Folds using Guided Conditional Wasserstein Generative Adversarial Networks (gcWGAN)(Supporting Information)

M Karimi, S Zhu, Y Cao, Y Shen - Small - biorxiv.org

2.1 Methods Using a representative protein structure chosen by SCOPe for each of the

1,232 folds, we construct a pairwise similarity matrix of symmetrized TM scores (Zhang and

Skolnick, 2004) and added a properly-scaled identity matrix to it to make a positive-definite …

  Related articles 



2019

On the computational complexity of finding a sparse Wasserstein barycenter

S Borgwardt, S Patterson - arXiv preprint arXiv:1910.07568, 2019 - arxiv.org

The discrete Wasserstein barycenter problem is a minimum-cost mass transport problem for

a set of probability measures with finite support. In this paper, we show that finding a

barycenter of sparse support is hard, even in dimension 2 and for only 3 measures. We …

  Cited by 11 Related articles All 2 versions 


[PDF] thecvf.com

Wasserstein gan with quadratic transport cost

H Liu, X Gu, D Samaras - Proceedings of the IEEE/CVF …, 2019 - openaccess.thecvf.com

Wasserstein GANs are increasingly used in Computer Vision applications as they are easier

to train. Previous WGAN variants mainly use the l_1 transport cost to compute the

Wasserstein distance between the real and synthetic data distributions. The l_1 transport …

  Cited by 18 Related articles All 5 versions 

[PDF] Wasserstein GAN with Quadratic Transport Cost Supplementary Material

H Liu, X Gu, D Samaras - openaccess.thecvf.com

(1) where $I$ and $J$ are disjoint sets; then for each $x_j$, there exists a $t \in I$ such that $H_t - H_j = c(x_j, y_t)$.

We prove this by contradiction, i.e., there exists one $x_s$, $s \in J$, such that we cannot find

a $y_i$ such that $H_i - H_s = c(x_s, y_i)$, $i \in I$. This means that $H_s > \sup_{i \in I}\{H_i - c(x_s, y_i)\}$ …

  Related articles All 3 versions 


[PDF] mlr.press

Wasserstein regularization for sparse multi-task regression

H Janati, M Cuturi, A Gramfort - The 22nd International …, 2019 - proceedings.mlr.press

We focus in this paper on high-dimensional regression problems where each regressor can

be associated to a location in a physical space, or more generally a generic geometric

space. Such problems often employ sparse priors, which promote models using a small …

  Cited by 28 Related articles All 8 versions 


2019


[PDF] arxiv.org

Topic modeling with Wasserstein autoencoders

F Nan, R Ding, R Nallapati, B Xiang - arXiv preprint arXiv:1907.12374, 2019 - arxiv.org

We propose a novel neural topic model in the Wasserstein autoencoders (WAE) framework.

Unlike existing variational autoencoder based models, we directly enforce Dirichlet prior on

the latent document-topic vectors. We exploit the structure of the latent space and apply a …

  Cited by 14 Related articles All 5 versions 


2019

[PDF] arxiv.org

Interior-point methods strike back: Solving the wasserstein barycenter problem

D Ge, H Wang, Z Xiong, Y Ye - arXiv preprint arXiv:1905.12895, 2019 - arxiv.org

Computing the Wasserstein barycenter of a set of probability measures under the optimal

transport metric can quickly become prohibitive for traditional second-order algorithms, such

as interior-point methods, as the support size of the measures increases. In this paper, we …

  Cited by 11 Related articles All 3 versions 


[PDF] arxiv.org

Sparsemax and relaxed Wasserstein for topic sparsity

T Lin, Z Hu, X Guo - Proceedings of the Twelfth ACM International …, 2019 - dl.acm.org

Topic sparsity refers to the observation that individual documents usually focus on several

salient topics instead of covering a wide variety of topics, and a real topic adopts a narrow

range of terms instead of a wide coverage of the vocabulary. Understanding this topic  …

  Cited by 10 Related articles All 5 versions


2019 see 2020  [PDF] arxiv.org

Wasserstein distance based deep adversarial transfer learning for intelligent fault diagnosis

C Cheng, B Zhou, G Ma, D Wu, Y Yuan - arXiv preprint arXiv:1903.06753, 2019 - arxiv.org

… approach is to adopt the Wasserstein distance to train a DTL … how Wasserstein distance

behaves in transfer learning due … Then, we build a Wasserstein distance based DTL (WD-DTL) …

  Cited by 31 Related articles All 3 versions 

[PDF] arxiv.org

Randomized filtering and Bellman equation in Wasserstein space for partial observation control problem

E Bandini, A Cosso, M Fuhrman, H Pham - Stochastic Processes and their …, 2019 - Elsevier

We study a stochastic optimal control problem for a partially observed diffusion. By using the

control randomization method in Bandini et al.(2018), we prove a corresponding

randomized dynamic programming principle (DPP) for the value function, which is obtained …

  Cited by 16 Related articles All 13 versions

<——2019—–—2019 ——1650—  


2019

De Novo Protein Design for Novel Folds using Guided Conditional Wasserstein Generative Adversarial Networks (gcWGAN)

M Karimi, S Zhu, Y Cao, Y Shen - bioRxiv, 2019 - biorxiv.org

Motivation Facing data quickly accumulating on protein sequence and structure, this study is

addressing the following question: to what extent could current data alone reveal deep

insights into the sequence-structure relationship, such that new sequences can be designed …

  Cited by 6 Related articles All 4 versions 


[PDF] wustl.edu

Grid-less DOA estimation using sparse linear arrays based on Wasserstein distance

M Wang, Z Zhang, A Nehorai - IEEE Signal Processing Letters, 2019 - ieeexplore.ieee.org

Sparse linear arrays, such as nested and co-prime arrays, are capable of resolving O (M2)

sources using only O (M) sensors by exploiting their so-called difference coarray model. One

popular approach to exploit the difference coarray model is to construct an augmented …

  Cited by 3 Related articles All 3 versions


CWGAN: Conditional wasserstein generative adversarial nets for fault data generation

Y Yu, B Tang, R Lin, S Han, T Tang… - 2019 IEEE International …, 2019 - ieeexplore.ieee.org

With the rapid development of modern industry and artificial intelligence technology, fault

diagnosis technology has become more automated and intelligent. The deep learning

based fault diagnosis model has achieved significant advantages over the traditional fault …

  Cited by 3 Related articles All 2 versions


[PDF] mdpi.com

Data-driven distributionally robust stochastic control of energy storage for wind power ramp management using the Wasserstein metric

I Yang - Energies, 2019 - mdpi.com

The integration of wind energy into the power grid is challenging because of its variability,

which causes high ramp events that may threaten the reliability and efficiency of power

systems. In this paper, we propose a novel distributionally robust solution to wind power …

  Cited by 2 Related articles All 6 versions 


2019

[PDF] researchgate.net

[PDF] RaspBary: Hawkes Point Process Wasserstein Barycenters as a Service

R Hosler, X Liu, J Carter, M Saper - 2019 - researchgate.net

We introduce an API for forecasting the intensity of spacetime events in urban environments

and spatially allocating vehicles during times of peak demand to minimize response time.

Our service is applicable to dynamic resource allocation problems that arise in ride sharing …

  Cited by 2 Related articles 


2019


Improved concentration bounds for conditional value-at-risk and cumulative prospect theory using wasserstein distance

SP Bhat, LA Prashanth - 2019 - openreview.net

This paper presents a unified approach based on Wasserstein distance to derive

concentration bounds for empirical estimates for a broad class of risk measures. The results

cover two broad classes of risk measures which are defined in the paper. The classes of risk …

  Cited by 1 Related articles 


Approximation of Discounted Minimax Markov Control Problems and Zero-Sum Markov Games Using Hausdorff and Wasserstein Distances

F Dufour, T Prieto-Rumeau - Dynamic Games and Applications, 2019 - Springer

This paper is concerned with a minimax control problem (also known as a robust Markov

decision process (MDP) or a game against nature) with general state and action spaces

under the discounted cost optimality criterion. We are interested in approximating …

  Related articles All 6 versions



A Conditional Wasserstein Generative Adversarial Network for Pixel-level Crack Detection using Video ... - DeepAI

deepai.org › publication › a-conditional-wasserstein-ge...

Jul 13, 2019 — 07/13/19 - Automatic crack detection on pavement surfaces is an important research field in the scope of developing an intelligent transporta...

[CITATION] A conditional wasserstein generative adversarial network for pixel-level crack detection using video extracted images

Q Mei, M Gül - arXiv preprint arXiv:1907.06014, 2019

  Cited by 6 Related articles



The Pontryagin maximum principle in the Wasserstein space

B Bonnet, F Rossi - Calculus of Variations and Partial Differential …, 2019 - Springer

Abstract We prove a Pontryagin Maximum Principle for optimal control problems in the

space of probability measures, where the dynamics is given by a transport equation with non-

local velocity. We formulate this first-order optimality condition using the formalism of …

  Cited by 24 Related articles All 20 versions


[PDF] esaim-cocv.org

Pontryagin Maximum Principle in Wasserstein spaces for constrained optimal control problems

B Bonnet - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

In this paper, we prove a Pontryagin Maximum Principle for constrained optimal control

problems in the Wasserstein space of probability measures. The dynamics is described by a

transport equation with non-local velocities which are affine in the control, and is subject to …

  Cited by 8 Related articles All 45 versions

<——2019—–—2019 ——1660—  



The Pontryagin maximum principle in the Wasserstein space

B Bonnet, F Rossi - Calculus of Variations and Partial Differential …, 2019 - Springer

Abstract We prove a Pontryagin Maximum Principle for optimal control problems in the

space of probability measures, where the dynamics is given by a transport equation with non-

local velocity. We formulate this first-order optimality condition using the formalism of …

  Cited by 24 Related articles All 20 versions


[PDF] esaim-cocv.org

A Pontryagin Maximum Principle in Wasserstein spaces for constrained optimal control problems

B Bonnet - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

In this paper, we prove a Pontryagin Maximum Principle for constrained optimal control

problems in the Wasserstein space of probability measures. The dynamics is described by a

transport equation with non-local velocities which are affine in the control, and is subject to …

  Cited by 8 Related articles All 45 versions


[PDF] arxiv.org

A convergent Lagrangian discretization for p-Wasserstein and flux-limited diffusion equations

B Söllner, O Junge - arXiv preprint arXiv:1906.01321, 2019 - arxiv.org

We study a Lagrangian numerical scheme for solution of a nonlinear drift diffusion equation

of the form $\partial_t u=\partial_x (u\cdot c [\partial_x (h^\prime (u)+ v)]) $ on an interval.

This scheme will consist of a spatio-temporal discretization founded in the formulation of the …

  Cited by 2 Related articles All 5 versions 

[CITATION] A convergent Lagrangian discretization for p-Wasserstein and flux-limited diffusion equations

O Junge, B Söllner - arXiv preprint arXiv:1906.01321



[PDF] apsipa.org

Semi-supervised Multimodal Emotion Recognition with Improved Wasserstein GANs

J Liang, S Chen, Q Jin - 2019 Asia-Pacific Signal and …, 2019 - ieeexplore.ieee.org

Automatic emotion recognition has faced the challenge of lacking large-scale human

labeled dataset for model learning due to the expensive data annotation cost and inevitable

label ambiguity. To tackle such challenge, previous works have explored to transfer emotion  …

  Cited by 1 Related articles All 2 versions


Input limited Wasserstein GAN

C FD - 2019 - ir.sia.cn

Abstract: Generative adversarial networks (GANs) have proven hugely successful, but suffer from

training instability. The recently proposed Wasserstein GAN (WGAN) has largely overcome the

problem, but can still fail to converge in some cases or be too complex. It has been found that …


2019


2019 thesis

MR4197822 Thesis Page, Stephen; Reproducing-Kernel Hilbert Space Regression with Notes on the Wasserstein Distance. Thesis (Ph.D.)–Lancaster University (United Kingdom). 2019. 276 pp. ISBN: 979-8691-27223-3, ProQuest LLC


[PDF] lancs.ac.uk

Reproducing-Kernel Hilbert space regression with notes on the Wasserstein Distance

S Page - 2019 - eprints.lancs.ac.uk

We study kernel least-squares estimators for the regression problem subject to a norm

constraint. We bound the squared L2 error of our estimators with respect to the covariate

distribution. We also bound the worst-case squared L2 error of our estimators with respect to …

  Related articles All 4 versions 

[PDF] lancs.ac.uk

Reproducing-Kernel Hilbert space regression with notes on the Wasserstein Distance

S Page - 2019 - eprints.lancs.ac.uk

We study kernel least-squares estimators for the regression problem subject to a norm constraint. We bound the squared L2 error of our estimators with respect to the covariate distribution. We also bound the worst-case squared L2 error of our estimators with respect to …


MR4051515 Thesis Chen, Ruidi; Distributionally Robust Learning Under the Wasserstein Metric. Thesis (Ph.D.)–Boston University. 2019. 206 pp. ISBN: 978-1687-99234-5, ProQuest LLC

Relaxed Wasserstein, Generative Adversarial Networks, Variational Autoencoders and Their Applications
Yang, Nan. University of California, Berkeley, ProQuest Dissertations Publishing, 2019. 22620074.


MR4049226 Reviewed Bernton, Espen; Jacob, Pierre E.; Gerber, Mathieu; Robert, Christian P. On parameter estimation with the Wasserstein distance. Inf. Inference 8 (2019), no. 4, 657–676. 62F10 (60B10 62F12)


Cited by 45 Related articles All 6 versions

Wasserstein regularization for sparse multi-task regression

H Janati, M Cuturi, A Gramfort - The 22nd International …, 2019 - proceedings.mlr.press

We focus in this paper on high-dimensional regression problems where each regressor can

be associated to a location in a physical space, or more generally a generic geometric

space. Such problems often employ sparse priors, which promote models using a small …

  Cited by 28 Related articles All 8 versions 


[PDF] arxiv.org

Using wasserstein-2 regularization to ensure fair decisions with neural-network classifiers

L Risser, Q Vincenot, N Couellan… - arXiv preprint arXiv …, 2019 - arxiv.org

In this paper, we propose a new method to build fair Neural-Network classifiers by using a

constraint based on the Wasserstein distance. More specifically, we detail how to efficiently

compute the gradients of Wasserstein-2 regularizers for Neural-Networks. The proposed …

  Cited by 9 Related articles All 2 versions 

<——2019—–—2019 ——1670— 



[PDF] arxiv.org

Wasserstein diffusion tikhonov regularization

AT Lin, Y Dukler, W Li, G Montúfar - arXiv preprint arXiv:1909.06860, 2019 - arxiv.org

We propose regularization strategies for learning discriminative models that are robust to in-

class variations of the input data. We use the Wasserstein-2 geometry to capture

semantically meaningful neighborhoods in the 

space of images, and define a corresponding …

  Cited by 2 Related articles All 6 versions 


[PDF] arxiv.org

Randomized filtering and Bellman equation in Wasserstein space for partial observation control problem

E Bandini, A Cosso, M Fuhrman, H Pham - Stochastic Processes and their …, 2019 - Elsevier

We study a stochastic optimal control problem for a partially observed diffusion. By using the

control randomization method in Bandini et al.(2018), we prove a corresponding

randomized dynamic programming principle (DPP) for the value function, which is obtained …

  Cited by 16 Related articles All 13 versions


[PDF] arxiv.org

Data-driven regularization of Wasserstein barycenters with an application to multivariate density registration

J Bigot, E Cazelles, N Papadakis - Information and Inference: A …, 2019 - academic.oup.com

We present a framework to simultaneously align and smoothen data in the form of multiple

point clouds sampled from unknown densities with support in a d-dimensional Euclidean

space. This work is motivated by applications in bioinformatics where researchers aim to …

  Cited by 11 Related articles All 8 versions


[PDF] arxiv.org

Wasserstein Adversarial Regularization (WAR) on label noise

BB Damodaran, K Fatras, S Lobry, R Flamary… - arXiv preprint arXiv …, 2019 - arxiv.org

Noisy labels often occur in vision datasets, especially when they are obtained from

crowdsourcing or Web scraping. We propose a new regularization method, which enables

learning robust classifiers in presence of noisy data. To achieve this goal, we propose a new …

  Cited by 1 Related articles All 2 versions 

Wasserstein Adversarial Regularization (WAR) on label noise

B Bhushan Damodaran, K FatrasS Lobry… - arXiv e …, 2019 - ui.adsabs.harvard.edu

Noisy labels often occur in vision datasets, especially when they are obtained from

crowdsourcing or Web scraping. We propose a new regularization method, which enables

learning robust classifiers in presence of noisy data. To achieve this goal, we


[PDF] archives-ouvertes.fr

Diffusive processes on the Wasserstein space: Coalescing models, Regularization properties and McKean-Vlasov equations

V Marx - 2019 - tel.archives-ouvertes.fr

The aim of this thesis is to study a class of diffusive stochastic processes with values in the

space of probability measures on the real line, called Wasserstein space if it is endowed

with the Wasserstein metric W2. The following issues are mainly addressed in this work: how …

  Cited by 2 Related articles All 9 versions 


2019


[PDF] arxiv.org

Wasserstein total variation filtering

E VarolA Nejatbakhsh - arXiv preprint arXiv:1910.10822, 2019 - arxiv.org

In this paper, we expand upon the theory of trend filtering by introducing the use of the

Wasserstein metric as a means to control the amount of spatiotemporal variation in filtered

time series data. While trend filtering utilizes regularization to produce signal estimates that …

  Related articles All 2 versions 


Distributionally robust xva via wasserstein distance part 1: Wrong way counterparty credit risk

D Singh, S Zhang - Unknown Journal, 2019 - experts.umn.edu

This paper investigates calculations of robust CVA for OTC derivatives under distributional

uncertainty using Wasserstein distance as the ambiguity measure. Wrong way counterparty

credit risk can be characterized (and indeed quantified) via the robust CVA formulation. The …

  




Estimation of smooth densities in Wasserstein distance

J WeedQ Berthet - Conference on Learning Theory, 2019 - proceedings.mlr.press

The Wasserstein distances are a set of metrics on probability distributions supported on

$\mathbb {R}^ d $ with applications throughout statistics and machine learning. Often, such

distances are used in the context of variational problems, in which the statistician employs in …

  Cited by 25 Related articles All 4 versions 


[PDF] arxiv.org

Wasserstein style transfer

Y Mroueh - arXiv preprint arXiv:1905.12828, 2019 - arxiv.org

We propose Gaussian optimal transport for Image style transfer in an Encoder/Decoder

framework. Optimal transport for Gaussian measures has closed forms Monge mappings

from source to target distributions. Moreover interpolates between a content and a style …

  Cited by 9 Related articles All 3 versions 
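
For context, the closed-form Gaussian optimal transport objects this abstract alludes to are standard; the following is our own sketch, and the symbols $m_0, m_1, \Sigma_0, \Sigma_1$ are our notation rather than the paper's. The Monge map from $\mathcal{N}(m_0, \Sigma_0)$ to $\mathcal{N}(m_1, \Sigma_1)$, for nonsingular $\Sigma_0$, is affine,

\[ T(x) = m_1 + A\,(x - m_0), \qquad A = \Sigma_0^{-1/2}\big(\Sigma_0^{1/2}\Sigma_1\Sigma_0^{1/2}\big)^{1/2}\Sigma_0^{-1/2}, \]

and the corresponding squared 2-Wasserstein distance is $\|m_0 - m_1\|^2 + \operatorname{tr}\big(\Sigma_0 + \Sigma_1 - 2(\Sigma_0^{1/2}\Sigma_1\Sigma_0^{1/2})^{1/2}\big)$. Pushing forward along $x \mapsto (1-t)\,x + t\,T(x)$ gives the standard displacement interpolation between the two Gaussians, which is the kind of content–style interpolation the abstract mentions.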


Wasserstein generative learning with kinematic constraints for probabilistic interactive driving behavior prediction

H Ma, J Li, W Zhan, M Tomizuka - 2019 IEEE Intelligent …, 2019 - ieeexplore.ieee.org

Since prediction plays a significant role in enhancing the performance of decision making

and planning procedures, the requirement of advanced methods of prediction becomes

urgent. Although many literatures propose methods to make prediction on a single agent …

  Cited by 16 Related articles


 [PDF] ieee.org

A deep transfer model with wasserstein distance guided multi-adversarial networks for bearing fault diagnosis under different working conditions

M Zhang, D Wang, W Lu, J Yang, Z Li, B Liang - IEEE Access, 2019 - ieeexplore.ieee.org

In recent years, intelligent fault diagnosis technology with the deep learning algorithm has

been widely used in the manufacturing industry for substituting time-consuming human

analysis method to enhance the efficiency of fault diagnosis. The rolling bearing as the …

  Cited by 25 Related articles All 5 versions

<——2019—–—2019 ——1680— 


Minimax estimation of smooth densities in Wasserstein distance

J Niles-WeedQ Berthet - arXiv e-prints, 2019 - ui.adsabs.harvard.edu

We study nonparametric density estimation problems where error is measured in the

Wasserstein distance, a metric on probability distributions popular in many areas of statistics

and machine learning. We give the first minimax-optimal rates for this problem for general …


Decomposition algorithm for distributionally robust optimization using Wasserstein metric with an application to a class of regression models

F LuoS Mehrotra - European Journal of Operational Research, 2019 - Elsevier

We study distributionally robust optimization (DRO) problems where the ambiguity set is

defined using the Wasserstein metric and can account for a bounded support. We show that

this class of DRO problems can be reformulated as decomposable semi-infinite programs …

  Cited by 20 Related articles All 6 versions


[PDF] arxiv.org

Tree-Wasserstein Barycenter for Large-Scale Multilevel Clustering and Scalable Bayes

T Le, V Huynh, N Ho, D Phung, M Yamada - arXiv preprint arXiv …, 2019 - arxiv.org

We study in this paper a variant of Wasserstein barycenter problem, which we refer to as tree-

Wasserstein barycenter, by leveraging a specific class of ground metrics, namely tree

metrics, for Wasserstein distance. Drawing on the tree structure, we propose an efficient …

  Related articles All 2 versions 

[PDF] thecvf.com


[PDF] Wasserstein GAN with Quadratic Transport Cost Supplementary Material

H Liu, X GuD Samaras - openaccess.thecvf.com

(1) where $I$ and $J$ are disjoint sets, then for each $x_j$ there exists a $t \in I$ such that $H_t - H_j = c(x_j, y_t)$. We prove this by contradiction, ie, there exists one $x_s$, $s \in J$, such that we cannot find a $y_i$ such that $H_i - H_s = c(x_s, y_i)$, $i \in I$. This means that $H_s > \sup_{i \in I} \{H_i - c(x_s, y_i)\}$ …

… Wasserstein distance guarantees that even if there is no support between the real and …

Cited by 41 Related articles All 5 versions 

[PDF] thecvf.com  Conference Paper

[PDF] Order-preserving Wasserstein Discriminant Analysis: Supplementary Material

B Su, J Zhou, Y Wu - openaccess.thecvf.com

Fig. 1 illustrates the learned barycenters for two sequence classes from the UCR Time

Series Archive [1]. Note that the sequences are univariate sequences for illustration. In this

paper, we tackle multivariate sequences. We can observe that each barycenter reflects the …

Cited by 8 Related articles All 5 versions

[PDF] neurips.cc

Tree-sliced variants of Wasserstein distances

T Le, M Yamada, K Fukumizu… - Advances in neural …, 2019 - proceedings.neurips.cc

… the sliced-Wasserstein distance is a particular case (the tree is a chain). We propose the

tree-sliced Wasserstein distance, computed by averaging the Wasserstein distance between …

Cited by 38 Related articles All 9 versions 

Tree-Sliced Variants of Wasserstein Distances

https://arxiv.org › stat

by T Le · 2019 · Cited by 21 — Optimal transport (OT) theory defines a powerful set of tools to compare probability distributions. OT suffers however from a few drawbacks, ...


[CITATION] Supplementary Material for: Tree-Sliced Variants of Wasserstein Distances

T Le, M Yamada, K Fukumizu, M Cuturi

 Cited by 38 Related articles All 9 versions
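
As background for these entries (a reading note in our own notation, not text from the papers): the sliced Wasserstein distance that the tree-sliced construction generalizes averages one-dimensional transport costs over random projections,

\[ \mathrm{SW}_p(\mu, \nu) = \Big( \int_{\mathbb{S}^{d-1}} W_p^p\big(\theta_{\#}\mu,\, \theta_{\#}\nu\big)\, d\sigma(\theta) \Big)^{1/p}, \]

where $\theta_{\#}\mu$ is the pushforward of $\mu$ under $x \mapsto \langle \theta, x \rangle$ and $\sigma$ is the uniform measure on the sphere. On a tree metric $d_T$ with edge weights $w_e$, optimal transport has the closed form $W_{d_T}(\mu, \nu) = \sum_{e} w_e\, |\mu(\Gamma_e) - \nu(\Gamma_e)|$, with $\Gamma_e$ the set of points below edge $e$; averaging this over sampled trees is the tree-sliced idea, and a chain (path) tree recovers the one-dimensional projected case.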

Tackling Algorithmic Bias in Neural-Network Classifiers using Wasserstein-2 Regularization

L Risser, Q Vincenot, JM Loubes - arXiv e-prints, 2019 - ui.adsabs.harvard.edu

The increasingly common use of neural network classifiers in industrial and social

applications of image analysis has allowed impressive progress these last years. Such

methods are however sensitive to algorithmic bias, ie to an under-or an over-representation …

Related articles All 3 versions 

[PDF] mlr.press

Wasserstein regularization for sparse multi-task regression

H Janati, M Cuturi, A Gramfort - The 22nd International …, 2019 - proceedings.mlr.press

We focus in this paper on high-dimensional regression problems where each regressor can

be associated to a location in a physical space, or more generally a generic geometric

space. Such problems often employ sparse priors, which promote models using a small …

  Cited by 28 Related articles All 8 versions 


[HTML] oup.com

Uncoupled isotonic regression via minimum Wasserstein deconvolution

P RigolletJ Weed - Information and Inference: A Journal of the …, 2019 - academic.oup.com

Isotonic regression is a standard problem in shape-constrained estimation where the goal is

to estimate an unknown non-decreasing regression function from independent pairs where.

While this problem is well understood both statistically and computationally, much less is …

  Cited by 37 Related articles All 8 versions


[PDF] arxiv.org

Concentration of risk measures: A Wasserstein distance approach

SP Bhat - arXiv preprint arXiv:1902.10709, 2019 - arxiv.org

Known finite-sample concentration bounds for the Wasserstein distance between the

empirical and true distribution of a random variable are used to derive a two-sided

concentration bound for the error between the true conditional value-at-risk (CVaR) of a …

  Cited by 13 Related articles All 5 versions 

<——2019—–—2019 ——1690— 


[PDF] iiasa.ac.at

Optimal XL-insurance under Wasserstein-type ambiguity

C Birghila, GC Pflug - Insurance: Mathematics and Economics, 2019 - Elsevier

We study the problem of optimal insurance contract design for risk management under a

budget constraint. The contract holder takes into consideration that the loss distribution is not

entirely known and therefore faces an ambiguity problem. For a given set of models, we …

  Cited by 3 Related articles All 7 versions


[PDF] arxiv.org

Refined basic couplings and Wasserstein-type distances for SDEs with Lévy noises

D Luo, J Wang - Stochastic Processes and their Applications, 2019 - Elsevier

We establish the exponential convergence with respect to the L 1-Wasserstein distance and

the total variation for the semigroup corresponding to the stochastic differential equation d X

t= d Z t+ b (X t) dt, where (Z t) t≥ 0 is a pure jump Lévy process whose Lévy measure ν fulfills …

  Cited by 17 Related articles All 7 versions


[PDF] inria.fr

On a Wasserstein-type distance between solutions to stochastic differential equations

J Bion–Nadal, D Talay - The Annals of Applied Probability, 2019 - projecteuclid.org

In this paper, we introduce a Wasserstein-type distance on the set of the probability

distributions of strong solutions to stochastic differential equations. This new distance is

defined by restricting the set of possible coupling measures. We prove that it may also be …

  Cited by 11 Related articles All 9 versions


[PDF] esaim-cocv.org

Dynamic models of Wasserstein-1-type unbalanced transport

B Schmitzer, B Wirth - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

We consider a class of convex optimization problems modelling temporal mass transport

and mass change between two given mass distributions (the so-called dynamic formulation

of unbalanced transport), where we focus on those models for which transport costs are …

  Cited by 6 Related articles All 5 versions


[PDF] arxiv.org

Weak convergence of empirical Wasserstein type distances

P Berthet, JC Fort - arXiv preprint arXiv:1911.02389, 2019 - arxiv.org

We estimate contrasts $\int_0^ 1\rho (F^{-1}(u)-G^{-1}(u)) du $ between two continuous

distributions $ F $ and $ G $ on $\mathbb R $ such that the set $\{F= G\} $ is a finite union of

intervals, possibly empty or $\mathbb {R} $. The non-negative convex cost function $\rho $ is …

  Cited by 2 Related articles All 6 versions 


2019


[PDF] arxiv.org

Wasserstein stability of porous medium-type equations on manifolds with Ricci curvature bounded below

N De Ponti, M Muratori, C Orrieri - arXiv preprint arXiv:1908.03147, 2019 - arxiv.org

Given a complete, connected Riemannian manifold $\mathbb {M}^ n $ with Ricci curvature

bounded from below, we discuss the stability of the solutions of a porous medium-type

equation with respect to the 2-Wasserstein distance. We produce (sharp) stability estimates …

  Cited by 1 Related articles All 3 versions 


[PDF] arxiv.org

Group level MEG/EEG source imaging via optimal transport: minimum Wasserstein estimates

H Janati, T Bazeille, B Thirion, M Cuturi… - … Information Processing in …, 2019 - Springer

Magnetoencephalography (MEG) and electroencephalography (EEG) are non-invasive

modalities that measure the weak electromagnetic fields generated by neural activity.

Inferring the location of the current sources that generated these magnetic fields is an ill …

  Cited by 5 Related articles All 14 versions


Improved concentration bounds for conditional value-at-risk and cumulative prospect theory using wasserstein distance

SP Bhat, LA Prashanth - 2019 - openreview.net

This paper presents a unified approach based on Wasserstein distance to derive

concentration bounds for empirical estimates for a broad class of risk measures. The results

cover two broad classes of risk measures which are defined in the paper. The classes of risk  …

  Cited by 1 Related articles 


[PDF] arxiv.org

Distributionally Robust XVA via Wasserstein Distance Part 2: Wrong Way Funding Risk

D Singh, S Zhang - arXiv preprint arXiv:1910.03993, 2019 - arxiv.org

This paper investigates calculations of robust funding valuation adjustment (FVA) for over

the counter (OTC) derivatives under distributional uncertainty using Wasserstein distance as

the ambiguity measure. Wrong way funding risk can be characterized via the robust FVA …

  Related articles All 5 versions 


[PDF] arxiv.org

Distributionally Robust XVA via Wasserstein Distance: Wrong Way Counterparty Credit and Funding Risk

D Singh, S Zhang - arXiv preprint arXiv:1910.01781, 2019 - arxiv.org

This paper investigates calculations of robust XVA, in particular, credit valuation adjustment

(CVA) and funding valuation adjustment (FVA) for over-the-counter derivatives under

distributional uncertainty using Wasserstein distance as the ambiguity measure. Wrong way …

  Cited by 1 Related articles All 8 versions 

<——2019—–—2019 ——1700— 



[PDF] koreascience.or.kr

Combining multi-task autoencoder with Wasserstein generative adversarial networks for improving speech recognition performance

CY Kao, H Ko - The Journal of the Acoustical Society of Korea, 2019 - koreascience.or.kr

As the presence of background noise in acoustic signal degrades the performance of

speech or acoustic event recognition, it is still challenging to extract noise-robust acoustic

features from noisy signal. In this paper, we propose a combined structure of Wasserstein  …

  Related articles All 3 versions 


[PDF] arxiv.org

A Fenchel-Moreau-Rockafellar type theorem on the Kantorovich-Wasserstein space with applications in partially observable Markov decision processes

V Laschos, K Obermayer, Y Shen, W Stannat - Journal of Mathematical …, 2019 - Elsevier

By using the fact that the space of all probability measures with finite support can be

completed in two different fashions, one generating the Arens-Eells space and another

generating the Kantorovich-Wasserstein (Wasserstein-1) space, and by exploiting the …

  Cited by 3 Related articles All 5 versions


Distributionally robust XVA via wasserstein distance part 1: Wrong way counterparty credit risk

D Singh, S Zhang - Unknown Journal, 2019 - experts.umn.edu

This paper investigates calculations of robust CVA for OTC derivatives under distributional

uncertainty using Wasserstein distance as the ambiguity measure. Wrong way counterparty

credit risk can be characterized (and indeed quantified) via the robust CVA formulation. The …


Sampling of probability measures in the convex order by Wasserstein projection

J Corbetta, B Jourdain - 2019 - ideas.repec.org

In this paper, for $\mu $ and $\nu $ two probability measures on $\mathbb {R}^ d $ with finite

moments of order $\rho\ge 1$, we define the respective projections for the $ W_\rho $-

Wasserstein distance of $\mu $ and $\nu $ on the sets of probability measures dominated by …

  


Data-driven chance constrained optimization under Wasserstein ambiguity sets

AR Hota, A Cherukuri, J Lygeros - 2019 American Control …, 2019 - ieeexplore.ieee.org

We present a data-driven approach for distri-butionally robust chance constrained

optimization problems (DRCCPs). We consider the case where the decision maker has

access to a finite number of samples or realizations of the uncertainty. The chance constraint …

  Cited by 21 Related articles All 4 versions
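
For orientation, here is a generic sketch of the construction these distributionally robust entries share; it is not the specific chance-constrained reformulation of the paper above, and the symbols are ours. Wasserstein DRO replaces the empirical distribution $\hat{P}_N = \frac{1}{N}\sum_{i=1}^{N} \delta_{\xi_i}$ by the ambiguity ball

\[ \mathcal{B}_{\varepsilon}(\hat{P}_N) = \big\{ Q \in \mathcal{P}(\Xi) : W_p(Q, \hat{P}_N) \le \varepsilon \big\}, \]

and hedges against the worst case, e.g. $\min_{x} \sup_{Q \in \mathcal{B}_{\varepsilon}(\hat{P}_N)} \mathbb{E}_{Q}[\ell(x, \xi)]$; chance-constrained variants instead require the constraint to hold with high probability under every $Q$ in the ball.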


2019

Investigating Under and Overfitting in Wasserstein Generative Adversarial Networks

A Kapoor, B AdlamC Weill - 2019 - research.google

We investigate under and overfitting in Generative Adversarial Networks (GANs), using

discriminators unseen by the generator to measure generalization. We find that the model

capacity of the discriminator has a significant effect on the generator's model quality, and …

[PDF] arxiv.org

Investigating under and overfitting in wasserstein generative adversarial networks

B Adlam, C Weill, A Kapoor - arXiv preprint arXiv:1910.14137, 2019 - arxiv.org

We investigate under and overfitting in Generative Adversarial Networks (GANs), using

discriminators unseen by the generator to measure generalization. We find that the model

capacity of the discriminator has a significant effect on the generator's model quality, and …

  Cited by 7 Related articles All 3 versions 


[PDF] iiasa.ac.at

Optimal XL-insurance under Wasserstein-type ambiguity

C Birghila, GC Pflug - Insurance: Mathematics and Economics, 2019 - Elsevier

We study the problem of optimal insurance contract design for risk management under a

budget constraint. The contract holder takes into consideration that the loss distribution is not

entirely known and therefore faces an ambiguity problem. For a given set of models, we …

  Cited by 3 Related articles All 7 versions


[PDF] projecteuclid.org

Behavior of the empirical Wasserstein distance in $\mathbb{R}^d$ under moment conditions

J Dedecker, F Merlevède - Electronic Journal of Probability, 2019 - projecteuclid.org

We establish some deviation inequalities, moment bounds and almost sure results for the

Wasserstein distance of order $ p\in [1,\infty) $ between the empirical measure of

independent and identically distributed ${\mathbb R}^ d $-valued random variables and the …

   Cited by 7 Related articles All 19 versions

[PDF] projecteuclid.org

Wasserstein-2 bounds in normal approximation under local dependence

X Fang - Electronic Journal of Probability, 2019 - projecteuclid.org

We obtain a general bound for the Wasserstein-2 distance in normal approximation for sums

of locally dependent random variables. The proof is based on an asymptotic expansion for

expectations of second-order differentiable functions of the sum. We apply the main result to …

  Cited by 3 Related articles All 3 versions


Distributionally robust learning under the wasserstein metric

R Chen - 2019 - search.proquest.com

This dissertation develops a comprehensive statistical learning framework that is robust to

(distributional) perturbations in the data using Distributionally Robust Optimization (DRO)

under the Wasserstein metric. The learning problems that are studied include:(i) …

  Cited by 1 Related articles All 3 versions

<——2019—–—2019 ——1710— 


[PDF] researchgate.net

[PDF] Wasserstein distance: a flexible tool for statistical analysis

GVVLV Lucarini - 2019 - researchgate.net

The figure shows the Wasserstein distance calculated in the phase space composed by

globally averaged temperature and precipitation. To provide some sort of benchmark, at the

bottom of the figure is shown the value related to the NCEP reanalysis, which yields one of …

  Related articles All 4 versions 


Sharp asymptotic and finite-sample rates of convergence of empirical measures in Wasserstein distance

J WeedF Bach - Bernoulli, 2019 - projecteuclid.org

The Wasserstein distance between two probability measures on a metric space is a

measure of closeness with applications in statistics, probability, and machine learning. In

this work, we consider the fundamental question of how quickly the empirical measure …

  Cited by 171 Related articles All 6 versions
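
A reading note on what "sharp rates" refers to (hedged; this is the standard baseline, not a quotation from the paper): for an absolutely continuous measure $\mu$ on a compact subset of $\mathbb{R}^d$ with $d > 2p$, the empirical measure $\hat{\mu}_n$ satisfies, up to constants,

\[ \mathbb{E}\, W_p(\hat{\mu}_n, \mu) \asymp n^{-1/d}, \]

i.e. the curse of dimensionality; the contribution of the cited work, as we read the abstract, is to replace the ambient dimension $d$ in the exponent by an intrinsic (Wasserstein) dimension of $\mu$, so that measures with low-dimensional structure enjoy faster rates.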


[HTML] sciencedirect.com

[HTML] Wasserstein GAN-based small-sample augmentation for new-generation artificial intelligence: a case study of cancer-staging data in biology

Y Liu, Y Zhou, X Liu, F Dong, C Wang, Z Wang - Engineering, 2019 - Elsevier

It is essential to utilize deep-learning algorithms based on big data for the implementation of

the new generation of artificial intelligence. Effective utilization of deep learning relies

considerably on the number of labeled samples, which restricts the application of deep …

  Cited by 34 Related articles All 5 versions


[PDF] arxiv.org

Bridging Bayesian and minimax mean square error estimation via Wasserstein distributionally robust optimization

VA Nguyen, S Shafieezadeh-Abadeh, D Kuhn… - arXiv preprint arXiv …, 2019 - arxiv.org

We introduce a distributionally robust minimium mean square error estimation model with a

Wasserstein ambiguity set to recover an unknown signal from a noisy observation. The

proposed model can be viewed as a zero-sum game between a statistician choosing an …

  Cited by 8 Related articles All 6 versions 


[PDF] ieee.org

Generating Adversarial Samples With Constrained Wasserstein Distance

K Wang, P Yi, F Zou, Y Wu - IEEE Access, 2019 - ieeexplore.ieee.org

In recent years, deep neural network (DNN) approaches prove to be useful in many machine

learning tasks, including classification. However, small perturbations that are carefully

crafted by attackers can lead to the misclassification of the images. Previous studies have …

  Cited by 1 Related articles


2019


[PDF] projecteuclid.org

Wasserstein-2 bounds in normal approximation under local dependence

X Fang - Electronic Journal of Probability, 2019 - projecteuclid.org

We obtain a general bound for the Wasserstein-2 distance in normal approximation for sums

of locally dependent random variables. The proof is based on an asymptotic expansion for

expectations of second-order differentiable functions of the sum. We apply the main result to …

  Cited by 3 Related articles All 3 versions


[PDF] aaai.org

Wasserstein soft label propagation on hypergraphs: Algorithm and generalization error bounds

T Gao, S Asoodeh, Y Huang, J Evans - Proceedings of the AAAI …, 2019 - ojs.aaai.org

Inspired by recent interests of developing machine learning and data mining algorithms on

hypergraphs, we investigate in this paper the semi-supervised learning algorithm of

propagating” soft labels”(eg probability distributions, class membership scores) over …

  Cited by 3 Related articles All 13 versions 


[PDF] arxiv.org

Local well-posedness in the Wasserstein space for a chemotaxis model coupled to Navier-Stokes equations

K Kang, HK Kim - arXiv preprint arXiv:1907.01895, 2019 - arxiv.org

We consider a coupled system of Keller-Segel type equations and the incompressible

Navier-Stokes equations in spatial dimension two and three. In the previous work [19], we

established the existence of a weak solution of a Fokker-Plank equation in the Wasserstein  …

  Related articles All 2 versions 

[CITATION] Local Bures-Wasserstein Transport: A Practical and Fast Mapping Approximation.

AH Idrobo - CoRR, 2019



Wasserstein adversarial examples via projected sinkhorn iterations

E Wong, F Schmidt, Z Kolter - International Conference on …, 2019 - proceedings.mlr.press

A rapidly growing area of work has studied the existence of adversarial examples,

datapoints which have been perturbed to fool a classifier, but the vast majority of these

works have focused primarily on threat models defined by $\ell_p $ norm-bounded …

  Cited by 71 Related articles All 8 versions 


 

[PDF] arxiv.org

Hypothesis Test and Confidence Analysis with Wasserstein Distance with General Dimension

M Imaizumi, H Ota, T Hamaguchi - arXiv preprint arXiv:1910.07773, 2019 - arxiv.org

We develop a general framework for statistical inference with the Wasserstein distance.

Recently, the Wasserstein distance has attracted much attention and been applied to

various machine learning tasks due to its celebrated properties. Despite the importance …

  Cited by 1 Related articles All 2 versions 

<——2019—–—2019 ——1720— 

 

[PDF] arxiv.org

Implementation of batched Sinkhorn iterations for entropy-regularized Wasserstein loss

T Viehmann - arXiv preprint arXiv:1907.01729, 2019 - arxiv.org

In this report, we review the calculation of entropy-regularised Wasserstein loss introduced

by Cuturi and document a practical implementation in PyTorch. Code is available at this

https URL Subjects: Machine Learning (stat. ML); Machine Learning (cs. LG) Cite as: arXiv …

  Cited by 1 Related articles All 2 versions 
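
As a concrete reference point, the following is our own minimal NumPy sketch of the standard Sinkhorn iteration from Cuturi's entropy-regularized formulation, not the PyTorch implementation documented in the report above; the function name sinkhorn and the variables a, b, C, eps are ours.

    import numpy as np

    def sinkhorn(a, b, C, eps=0.1, n_iters=200):
        """Entropy-regularized OT between histograms a and b with cost matrix C.

        Returns the transport plan P and the regularized transport cost <P, C>.
        Standard Sinkhorn scaling (Cuturi-style); a sketch, not the cited code.
        """
        K = np.exp(-C / eps)              # Gibbs kernel
        u = np.ones_like(a)
        for _ in range(n_iters):
            v = b / (K.T @ u)             # scale columns to match marginal b
            u = a / (K @ v)               # scale rows to match marginal a
        P = u[:, None] * K * v[None, :]   # plan with marginals approx. (a, b)
        return P, float(np.sum(P * C))

    # Tiny usage example: two 3-bin histograms on the points 0, 1, 2.
    x = np.arange(3.0)
    C = (x[:, None] - x[None, :]) ** 2    # squared-distance ground cost
    a = np.array([0.5, 0.3, 0.2])
    b = np.array([0.2, 0.3, 0.5])
    P, cost = sinkhorn(a, b, C)

Batching, as in the report, amounts to running these updates over a stack of (a, b, C) triples at once; for small eps, log-domain updates are usually preferred to avoid underflow in K.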


[PDF] archives-ouvertes.fr

Optimal Control in Wasserstein Spaces

B Bonnet - 2019 - hal.archives-ouvertes.fr

A wealth of mathematical tools allowing to model and analyse multi-agent systems has been

brought forth as a consequence of recent developments in optimal transport theory. In this

thesis, we extend for the first time several of these concepts to the framework of control  …

  Related articles All 8 versions 

[CITATION] Optimal Control in Wasserstein Spaces.(Commande Optimal dans les Espaces de Wasserstein).

B Bonnet - 2019 - Aix-Marseille University, France



[PDF] mdpi.com

Wasserstein distance learns domain invariant feature representations for drift compensation of E-nose

Y Tao, C Li, Z Liang, H Yang, J Xu - Sensors, 2019 - mdpi.com

Abstract Electronic nose (E-nose), a kind of instrument which combines with the gas sensor

and the corresponding pattern recognition algorithm, is used to detect the type and

concentration of gases. However, the sensor drift will occur in realistic application scenario …

 Cited by 6 Related articles All 8 versions 

 

[PDF] arxiv.org

Subexponential upper and lower bounds in Wasserstein distance for Markov processes

A Arapostathis, G Pang, N Sandrić - arXiv preprint arXiv:1907.05250, 2019 - arxiv.org

In this article, relying on Foster-Lyapunov drift conditions, we establish subexponential

upper and lower bounds on the rate of convergence in the $\mathrm {L}^ p $-Wasserstein

distance for a class of irreducible and aperiodic Markov processes. We further discuss these …

  Cited by 2 Related articles All 3 versions 


[PDF] arxiv.org

Wasserstein Contraction of Stochastic Nonlinear Systems

J BouvrieJJ Slotine - arXiv preprint arXiv:1902.08567, 2019 - arxiv.org

We suggest that the tools of contraction analysis for deterministic systems can be applied

towards studying the convergence behavior of stochastic dynamical systems in the

Wasserstein metric. In particular, we consider the case of Ito diffusions with identical …

  Cited by 4 Related articles All 2 versions 


2019

[PDF] arxiv.org

A measure approximation theorem for Wasserstein-robust expected values

G van Zyl - arXiv preprint arXiv:1912.12119, 2019 - arxiv.org

We consider the problem of finding the infimum, over probability measures being in a ball

defined by Wasserstein distance, of the expected value of a bounded Lipschitz random

variable on $\mathbf {R}^ d $. We show that if the $\sigma-$ algebra is approximated in by a …

  Related articles All 2 versions 

[PDF] researchgate.net

[PDF] Wasserstein distance: a flexible tool for statistical analysis

GVVLV Lucarini - 2019 - researchgate.net

The figure shows the Wasserstein distance calculated in the phase space composed by

globally averaged temperature and precipitation. To provide some sort of benchmark, at the

bottom of the figure is shown the value related to the NCEP reanalysis, which yields one of …

  Related articles All 4 versions 


[PDF] arxiv.org

[CITATION] Nonlinear model reduction on metric spaces. Application to one-dimensional conservative PDEs in Wasserstein spaces

V Ehrlacher, D Lombardi, O Mula, FX Vialard - arXiv preprint arXiv:1909.06626, 2019

  Cited by 4 Related articles All 19 versions




Robust Wasserstein profile inference and applications to machine learning

J Blanchet, Y KangK Murthy - Journal of Applied Probability, 2019 - cambridge.org

We show that several machine learning estimators, including square-root least absolute

shrinkage and selection and regularized logistic regression, can be represented as

solutions to distributionally robust optimization problems. The associated uncertainty regions …

  Cited by 145 Related articles All 5 versions

<——2019—–—2019 ——1730— 


[PDF] uclouvain.be

Curvature of the manifold of fixed-rank positive-semidefinite matrices endowed with the Bures–Wasserstein metric

E Massart, JM Hendrickx, PA Absil - International Conference on …, 2019 - Springer

We consider the manifold of rank-p positive-semidefinite matrices of size n, seen as a

quotient of the set of full-rank n-by-p matrices by the orthogonal group in dimension p. The

resulting distance coincides with the Wasserstein distance between centered degenerate …

  Cited by 6 Related articles All 5 versions


[PDF] arxiv.org

Wasserstein stability of porous medium-type equations on manifolds with Ricci curvature bounded below

N De Ponti, M Muratori, C Orrieri - arXiv preprint arXiv:1908.03147, 2019 - arxiv.org

Given a complete, connected Riemannian manifold $\mathbb {M}^ n $ with Ricci curvature

bounded from below, we discuss the stability of the solutions of a porous medium-type

equation with respect to the 2-Wasserstein distance. We produce (sharp) stability estimates …

  Cited by 1 Related articles All 3 versions 


Wasserstein covariance for multiple random densities

A PetersenHG Müller - Biometrika, 2019 - academic.oup.com

A common feature of methods for analysing samples of probability density functions is that

they respect the geometry inherent to the space of densities. Once a metric is specified for

this space, the Fréchet mean is typically used to quantify and visualize the average density …

  Cited by 12 Related articles All 12 versions


[PDF] arxiv.org

Confidence regions in wasserstein distributionally robust estimation

J Blanchet, K MurthyN Si - arXiv preprint arXiv:1906.01614, 2019 - arxiv.org

Wasserstein distributionally robust optimization (DRO) estimators are obtained as solutions

of min-max problems in which the statistician selects a parameter minimizing the worst-case

loss among all probability models within a certain distance (in a Wasserstein sense) from the …

  Cited by 10 Related articles All 6 versions 


[PDF] aclweb.org

Modeling personalization in continuous space for response generation via augmented wasserstein autoencoders

Z Chan, J Li, X Yang, X Chen, W Hu, D Zhao… - Proceedings of the 2019 …, 2019 - aclweb.org

Abstract Variational autoencoders (VAEs) and Wasserstein autoencoders (WAEs) have

achieved noticeable progress in open-domain response generation. Through introducing

latent variables in continuous space, these models are capable of capturing utterance-level …

  Cited by 14 Related articles All 3 versions 


2019


[PDF] arxiv.org

Wasserstein stability estimates for covariance-preconditioned Fokker-Planck equations

JA CarrilloU Vaes - arXiv preprint arXiv:1910.07555, 2019 - arxiv.org

We study the convergence to equilibrium of the mean field PDE associated with the

derivative-free methodologies for solving inverse problems. We show stability estimates in

the euclidean Wasserstein distance for the mean field PDE by using optimal transport …

  Cited by 7 Related articles All 4 versions 

 

[PDF] arxiv.org

Hypothesis Test and Confidence Analysis with Wasserstein Distance with General Dimension

M Imaizumi, H Ota, T Hamaguchi - arXiv preprint arXiv:1910.07773, 2019 - arxiv.org

We develop a general framework for statistical inference with the Wasserstein distance.

Recently, the Wasserstein distance has attracted much attention and been applied to

various machine learning tasks due to its celebrated properties. Despite the importance …

  Cited by 1 Related articles All 2 versions 


[PDF] arxiv.org

Minimax confidence intervals for the sliced Wasserstein distance

T Manole, S Balakrishnan, L Wasserman - arXiv preprint arXiv:1909.07862, 2019 - arxiv.org

Motivated by the growing popularity of variants of the Wasserstein distance in statistics and

machine learning, we study statistical inference for the Sliced Wasserstein distance--an

easily computable variant of the Wasserstein distance. Specifically, we construct confidence  …

  Cited by 3 Related articles All 4 versions 



[PDF] arxiv.org

Bounding quantiles of Wasserstein distance between true and empirical measure

SN Cohen, MNA Tegnér, J Wiesel - arXiv preprint arXiv:1907.02006, 2019 - arxiv.org

Consider the empirical measure, $\hat {\mathbb {P}} _N $, associated to $ N $ iid samples of

a given probability distribution $\mathbb {P} $ on the unit interval. For fixed $\mathbb {P} $

the Wasserstein distance between $\hat {\mathbb {P}} _N $ and $\mathbb {P} $ is a random …

  Related articles All 4 versions 


[PDF] ceur-ws.org

[PDF] Dialogue response generation with Wasserstein generative adversarial networks

SAS Gilani, E JembereAW Pillay - 2019 - ceur-ws.org

This research evaluates the effectiveness of a Generative Adversarial Network (GAN) for

open domain dialogue response systems. The research involves developing and evaluating

a Conditional Wasserstein GAN (CWGAN) for natural dialogue response generation. We …

  Related articles 

<——2019—–—2019 ——1740—  


On parameter estimation with the Wasserstein distance

E BerntonPE Jacob, M Gerber… - … and Inference: A …, 2019 - academic.oup.com

Statistical inference can be performed by minimizing, over the parameter space, the

Wasserstein distance between model distributions and the empirical distribution of the data.

We study asymptotic properties of such minimum Wasserstein distance estimators …

  Cited by 24 Related articles All 6 versions
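
In symbols (our paraphrase of the estimator described in the abstract; the notation is ours): given data $y_1, \dots, y_n$ with empirical measure $\hat{\mu}_n = \frac{1}{n}\sum_{i=1}^{n} \delta_{y_i}$ and a model family $\{\mu_\theta : \theta \in \Theta\}$, the minimum Wasserstein distance estimator is

\[ \hat{\theta}_n \in \operatorname*{argmin}_{\theta \in \Theta} \; W_p\big(\mu_\theta, \hat{\mu}_n\big), \]

whose existence, measurability and asymptotic behaviour are what the paper studies.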


[PDF] arxiv.org

Parameter estimation for biochemical reaction networks using Wasserstein distances

K Öcal, R GrimaG Sanguinetti - Journal of Physics A …, 2019 - iopscience.iop.org

We present a method for estimating parameters in stochastic models of biochemical reaction

networks by fitting steady-state distributions using Wasserstein distances. We simulate a

reaction network at different parameter settings and train a Gaussian process to learn the …

  Cited by 7 Related articles All 7 versions

 

CWGAN: Conditional wasserstein generative adversarial nets for fault data generation

Y Yu, B Tang, R Lin, S Han, T Tang… - 2019 IEEE International …, 2019 - ieeexplore.ieee.org

With the rapid development of modern industry and artificial intelligence technology, fault

diagnosis technology has become more automated and intelligent. The deep learning

based fault diagnosis model has achieved significant advantages over the traditional fault …

 Cited by 11 Related articles All 2 versions

2019 

Anomaly detection for high-dimensional imbalanced data using AEWGAN [in Korean: AEWGAN 이용한 고차원 불균형 데이터 이상 탐지] - DBpia

https://www.dbpia.co.kr › articleDetail

Anomaly detection for high-dimensional imbalanced data using AEWGAN · Korean Institute of Industrial Engineers (대한산업공학회) · Proceedings of the KIIE Fall Conference · 2019 KIIE Fall Conference.

[CITATION] Anomaly detection for high-dimensional imbalanced data using AEWGAN

Seung-hwan Song (송승환), Jun-geol Baek (백준걸) - Proceedings of the KIIE Fall Conference, 2019 - dbpia.co.kr

Seung-hwan Song (researcher) and Jun-geol Baek (professor), Department of Industrial and Management Engineering, Korea University, {ss-hwan, jungeol}@korea.ac.kr. Contents: 1. Background 2. Related work (anomaly detection; techniques for handling imbalanced data) 3. Proposed method …

  Related articles


Wasserstein of Wasserstein loss for learning generative models

Y DuklerW Li, A Lin… - … Conference on Machine …, 2019 - proceedings.mlr.press

The Wasserstein distance serves as a loss function for unsupervised learning which

depends on the choice of a ground metric on sample space. We propose to use the

Wasserstein distance itself as the ground metric on the sample space of images. This …

  Cited by 12 Related articles All 11 versions 
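
Background for the GAN-style losses in these entries (the standard Kantorovich–Rubinstein dual form of $W_1$, not the specific ground-metric construction of the paper above):

\[ W_1(\mu, \nu) = \sup_{\|f\|_{\mathrm{Lip}} \le 1} \; \mathbb{E}_{x \sim \mu}[f(x)] - \mathbb{E}_{y \sim \nu}[f(y)], \]

which is what a WGAN critic approximates. The Lipschitz constraint, and hence the loss, depends on the ground metric chosen on sample space; that choice is exactly what the paper above revisits by taking the ground metric itself to be a Wasserstein distance between images.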


2019

 

[PDF] mlr.press

Estimation of smooth densities in Wasserstein distance

J WeedQ Berthet - Conference on Learning Theory, 2019 - proceedings.mlr.press

The Wasserstein distances are a set of metrics on probability distributions supported on

$\mathbb {R}^ d $ with applications throughout statistics and machine learning. Often, such

distances are used in the context of variational problems, in which the statistician employs in …

  Cited by 28 Related articles All 4 versions 

Minimax estimation of smooth densities in Wasserstein distance

J Niles-WeedQ Berthet - arXiv e-prints, 2019 - ui.adsabs.harvard.edu

We study nonparametric density estimation problems where error is measured in the

Wasserstein distance, a metric on probability distributions popular in many areas of statistics

and machine learning. We give the first minimax-optimal rates for this problem for general …

 


[PDF] arxiv.org

Sharp asymptotic and finite-sample rates of convergence of empirical measures in Wasserstein distance

J WeedF Bach - Bernoulli, 2019 - projecteuclid.org

The Wasserstein distance between two probability measures on a metric space is a

measure of closeness with applications in statistics, probability, and machine learning. In

this work, we consider the fundamental question of how quickly the empirical measure …

  Cited by 173 Related articles All 6 versions


[PDF] mlr.press

Unsupervised alignment of embeddings with wasserstein procrustes

E Grave, A Joulin, Q Berthet - The 22nd International …, 2019 - proceedings.mlr.press

We consider the task of aligning two sets of points in high dimension, which has many

applications in natural language processing and computer vision. As an example, it was

recently shown that it is possible to infer a bilingual lexicon, without supervised data, by …

  Cited by 83 Related articles All 3 versions 
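
Roughly, the alignment problem described here couples an orthogonal Procrustes rotation with an optimal assignment; under our reading of the abstract (the notation below is ours, not a quotation), the objective has the form

\[ \min_{Q \in \mathcal{O}_d} \; \min_{P \in \mathcal{P}_n} \; \| X Q - P Y \|_F^2, \]

with $X, Y \in \mathbb{R}^{n \times d}$ the two embedding matrices, $\mathcal{O}_d$ the orthogonal group and $\mathcal{P}_n$ the set of permutation matrices; fixing $P$ gives a closed-form Procrustes step, fixing $Q$ gives an optimal transport (matching) step, and the two can be alternated.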


[PDF] mlr.press

On the complexity of approximating Wasserstein barycenters

A Kroshnin, N Tupitsa, D Dvinskikh… - International …, 2019 - proceedings.mlr.press

We study the complexity of approximating the Wasserstein barycenter of $ m $ discrete

measures, or histograms of size $ n $, by contrasting two alternative approaches that use

entropic regularization. The first approach is based on the Iterative Bregman Projections …

  Cited by 44 Related articles All 11 versions 

On the Complexity of Approximating Wasserstein Barycenters

P Dvurechensky - dev.icml.cc

… $\min_{\nu \in \mathcal{P}_2(\Omega)} \sum_{i=1}^{m} W(\mu_i, \nu)$, where $W(\mu, \nu)$ is the Wasserstein distance between measures $\mu$ and $\nu$ on $\Omega$. WB is efficient in machine learning problems with geometric data, eg template image reconstruction from random samples (figure: images from [Cuturi & Doucet, 2014]) …

  All 4 versions 


[PDF] nips.cc

[PDF] Concentration of risk measures: A Wasserstein distance approach

SP Bhat, P LA - Advances in Neural Information Processing Systems, 2019 - papers.nips.cc

Abstract<p> Known finite-sample concentration bounds for the Wasserstein distance

between the empirical and true distribution of a random variable are used to derive a two-

sided concentration bound for the error between the true conditional value-at-risk (CVaR) of  …

  Cited by 14 Related articles All 4 versions 

[PDF] iitm.ac.in

[PDF] Concentration of risk measures: A Wasserstein distance approach

LA Prashanth - To appear in the proceedings of NeurIPS, 2019 - cse.iitm.ac.in

… Risk measures covered: conditional value-at-risk [Brown et al.; Gao et al.], spectral risk measures, and cumulative prospect theory [Cheng et al. 2018]. Unified approach: for each bound, the estimation error is related to the Wasserstein distance between the empirical and true …

  Related articles All 4 versions 
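
A reading note on the mechanism behind these bounds (a standard relation stated in our notation, not the papers' exact statement): for a confidence level $\alpha \in (0,1)$, CVaR is Lipschitz in the distribution with respect to $W_1$,

\[ \big| \mathrm{CVaR}_\alpha(\mu) - \mathrm{CVaR}_\alpha(\nu) \big| \;\le\; \frac{1}{1-\alpha}\, W_1(\mu, \nu), \]

so any concentration bound on $W_1(\hat{\mu}_n, \mu)$ for the empirical measure $\hat{\mu}_n$ immediately gives a concentration bound for the empirical CVaR estimate; the cited works pursue this route for CVaR, spectral risk measures and cumulative prospect theory functionals.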

<——2019—–—2019 ——1750—

2019

[PDF] arxiv.org

On the computational complexity of finding a sparse Wasserstein barycenter

S Borgwardt, S Patterson - arXiv preprint arXiv:1910.07568, 2019 - arxiv.org

The discrete Wasserstein barycenter problem is a minimum-cost mass transport problem for

a set of probability measures with finite support. In this paper, we show that finding a

barycenter of sparse support is hard, even in dimension 2 and for only 3 measures. We …

  Cited by 11 Related articles All 2 versions 


[PDF] arxiv.org

Precise simulation of electromagnetic calorimeter showers using a Wasserstein Generative Adversarial Network

M Erdmann, J Glombitza, T Quast - Computing and Software for Big …, 2019 - Springer

Simulations of particle showers in calorimeters are computationally time-consuming, as they

have to reproduce both energy depositions and their considerable fluctuations. A new

approach to ultra-fast simulations is generative models where all calorimeter energy …

  Cited by 46 Related articles All 6 versions


[PDF] mlr.press

Accelerated linear convergence of stochastic momentum methods in wasserstein distances

B Can, M Gurbuzbalaban, L Zhu - … Conference on Machine …, 2019 - proceedings.mlr.press

Momentum methods such as Polyak's heavy ball (HB) method, Nesterov's accelerated

gradient (AG) as well as accelerated projected gradient (APG) method have been commonly

used in machine learning practice, but their performance is quite sensitive to noise in the …

  Cited by 18 Related articles All 8 versions 


2019 

[HTML] sciencedirect.com

[HTML] Wasserstein GAN-based small-sample augmentation for new-generation artificial intelligence: a case study of cancer-staging data in biology

Y Liu, Y Zhou, X Liu, F Dong, C Wang, Z Wang - Engineering, 2019 - Elsevier

It is essential to utilize deep-learning algorithms based on big data for the implementation of

the new generation of artificial intelligence. Effective utilization of deep learning relies

considerably on the number of labeled samples, which restricts the application of deep …

 Cited by 41 Related articles All 4 versions


[PDF] arxiv.org

A bound on the Wasserstein-2 distance between linear combinations of independent random variables

B Arras, E Azmoodeh, G Poly, Y Swan - Stochastic processes and their …, 2019 - Elsevier

We provide a bound on a distance between finitely supported elements and general

elements of the unit sphere of $\ell^2(\mathbb{N})$. We use this bound to estimate the Wasserstein-2

distance between random variables represented by linear combinations of independent …

  Cited by 20 Related articles All 15 versions


 2019


[PDF] arxiv.org

Approximation of stable law in Wasserstein-1 distance by Stein's method

L Xu - Annals of Applied Probability, 2019 - projecteuclid.org

Abstract Let $ n\in\mathbb {N} $, let $\zeta_ {n, 1},\ldots,\zeta_ {n, n} $ be a sequence of

independent random variables with $\mathbb {E}\zeta_ {n, i}= 0$ and $\mathbb {E}|\zeta_ {n,

i}|<\infty $ for each $ i $, and let $\mu $ be an $\alpha $-stable distribution having …

  Cited by 19 Related articles All 7 versions


[PDF] arxiv.org

Progressive wasserstein barycenters of persistence diagrams

J Vidal, J Budin, J Tierny - IEEE transactions on visualization …, 2019 - ieeexplore.ieee.org

This paper presents an efficient algorithm for the progressive approximation of Wasserstein

barycenters of persistence diagrams, with applications to the visual analysis of ensemble

data. Given a set of scalar fields, our approach enables the computation of a persistence …

  Cited by 13 Related articles All 16 versions


[PDF] arxiv.org

Denoising of 3D magnetic resonance images using a residual encoder–decoder Wasserstein generative adversarial network

M Ran, J Hu, Y Chen, H Chen, H Sun, J Zhou… - Medical image …, 2019 - Elsevier

Abstract Structure-preserved denoising of 3D magnetic resonance imaging (MRI) images is

a critical step in medical image analysis. Over the past few years, many algorithms with

impressive performances have been proposed. In this paper, inspired by the idea of deep …

  Cited by 32 Related articles All 9 versions

<——2019—–—2019 ——1760— 


 

 Fast convergence of empirical barycenters in Alexandrov spaces and the Wasserstein space

TL Gouic, Q Paris, P Rigollet, AJ Stromme - arXiv preprint arXiv …, 2019 - arxiv.org

This work establishes fast rates of convergence for empirical barycenters over a large class

of geodesic spaces with curvature bounds in the sense of Alexandrov. More specifically, we

show that parametric rates of convergence are achievable under natural conditions that …

  Cited by 9 Related articles All 2 versions 


[PDF] nsf.gov

An information-theoretic view of generalization via Wasserstein distance

H WangM Diaz, JCS Santos Filho… - … on Information Theory …, 2019 - ieeexplore.ieee.org

We capitalize on the Wasserstein distance to obtain two information-theoretic bounds on the

generalization error of learning algorithms. First, we specialize the Wasserstein distance into

total variation, by using the discrete metric. In this case we derive a generalization bound …

  Cited by 9 Related articles All 5 versions


[PDF] arxiv.org

Inequalities for the Wasserstein mean of positive definite matrices

R Bhatia, T Jain, Y Lim - Linear Algebra and its Applications, 2019 - Elsevier

Let $A_1, \dots, A_m$ be given positive definite matrices and let $w = (w_1, \dots, w_m)$ be a vector of weights; ie, $w_j \ge 0$ and $\sum_{j=1}^{m} w_j = 1$. Then the (weighted) Wasserstein mean, or the Wasserstein barycentre, of $A_1, \dots, A_m$ is defined as (2) $\Omega(w; A_1, \dots, A_m) = \operatorname{argmin}_{X \in \mathbb{P}} \sum_{j=1}^{m} w \dots$

  Cited by 12 Related articles All 5 versions
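
A note on the underlying metric (standard facts, stated in our notation rather than the paper's): for positive semidefinite $A, B$ the Bures–Wasserstein distance is

\[ d(A, B) = \Big[ \operatorname{tr} A + \operatorname{tr} B - 2\, \operatorname{tr}\big( A^{1/2} B A^{1/2} \big)^{1/2} \Big]^{1/2}, \]

which coincides with the 2-Wasserstein distance between the centred Gaussians $\mathcal{N}(0, A)$ and $\mathcal{N}(0, B)$; the Wasserstein mean above is then the minimizer of the weighted sum of squared distances $\sum_{j} w_j\, d^2(X, A_j)$ over positive definite $X$.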


Wasserstein distance-based analogous method to predict distribution of non-uniform corrosion on reinforcements in concrete

Q Liu, RKL Su - Construction and Building Materials, 2019 - Elsevier

This paper presents an analogous method to predict the distribution of non-uniform

corrosion on reinforcements in concrete by minimizing the Wasserstein distance. A

comparison between the predicted and experimental results shows that the proposed …

  Cited by 6 Related articles All 3 versions


[PDF] arxiv.org

Fast Tree Variants of Gromov-Wasserstein

T Le, N Ho, M Yamada - arXiv preprint arXiv:1910.04462, 2019 - arxiv.org

Gromov-Wasserstein (GW) is a powerful tool to compare probability measures whose

supports are in different metric spaces. GW suffers however from a computational drawback

since it requires to solve a complex non-convex quadratic program. We consider in this work …

  Cited by 2 Related articles 


2019


[PDF] arxiv.org

Personalized purchase prediction of market baskets with Wasserstein-based sequence matching

M KrausS Feuerriegel - Proceedings of the 25th ACM SIGKDD …, 2019 - dl.acm.org

Personalization in marketing aims at improving the shopping experience of customers by

tailoring services to individuals. In order to achieve this, businesses must be able to make

personalized predictions regarding the next purchase. That is, one must forecast the exact …

  Cited by 4 Related articles All 4 versions


[PDF] researchgate.net

[PDF] Tractable reformulations of distributionally robust two-stage stochastic programs with∞− Wasserstein distance

W Xie - arXiv preprint arXiv:1908.08454, 2019 - researchgate.net

In the optimization under uncertainty, decision-makers first select a wait-and-see policy

before any realization of uncertainty and then place a here-and-now decision after the

uncertainty has been observed. Two-stage stochastic programming is a popular modeling …

  Cited by 10 Related articles All 2 versions 

[PDF] arxiv.org

Tractable Reformulations of Distributionally Robust Two-stage Stochastic Programs with Wasserstein Distance

W Xie - arXiv preprint arXiv:1908.08454, 2019 - arxiv.org

In the optimization under uncertainty, decision-makers first select a wait-and-see policy

before any realization of uncertainty and then place a here-and-now decision after the

uncertainty has been observed. Two-stage stochastic programming is a popular modeling …

  Cited by 1 Related articles All 2 versions 


Calculating spatial configurational entropy of a landscape mosaic based on the Wasserstein metric

Y Zhao, X Zhang - Landscape Ecology, 2019 - Springer

Context Entropy is an important concept traditionally associated with thermodynamics and is

widely used to describe the degree of disorder in a substance, system, or process.

Configurational entropy has received more attention because it better reflects the …

  Cited by 4 Related articles All 5 versions



2019

[PDF] arxiv.org

Strong equivalence between metrics of Wasserstein type

E BayraktarG Guo - arXiv preprint arXiv:1912.08247, 2019 - arxiv.org

The sliced Wasserstein and more recently max-sliced Wasserstein metrics $\mathcal{W}_p$ have

attracted abundant attention in data sciences and machine learning due to its advantages to

tackle the curse of dimensionality. A question of particular importance is the strong …

  Cited by 3 Related articles All 2 versions 


[PDF] arxiv.org

Penalization of barycenters in the Wasserstein space

J Bigot, E Cazelles, N Papadakis - SIAM Journal on Mathematical Analysis, 2019 - SIAM

In this paper, a regularization of Wasserstein barycenters for random measures supported

on R^d is introduced via convex penalization. The existence and uniqueness of such

barycenters is first proved for a large class of penalization functions. The Bregman …

  Cited by 15 Related articles All 8 versions

<——2019—–—2019 ——1770—  



Curvature of the manifold of fixed-rank positive-semidefinite matrices endowed with the Bures–Wasserstein metric

E Massart, JM Hendrickx, PA Absil - … Conference on Geometric Science of …, 2019 - Springer

We consider the manifold of rank-p positive-semidefinite matrices of size n, seen as a

quotient of the set of full-rank n-by-p matrices by the orthogonal group in dimension p. The

resulting distance coincides with the Wasserstein distance between centered degenerate …

  Cited by 6 Related articles All 5 versions


[PDF] arxiv.org

The optimal convergence rate of monotone schemes for conservation laws in the Wasserstein distance

AM Ruf, E Sande, S Solem - Journal of Scientific Computing, 2019 - Springer

Abstract In 1994, Nessyahu, Tadmor and Tassa studied convergence rates of monotone

finite volume approximations of conservation laws. For compactly supported, Lip^+ Lip+-

bounded initial data they showed a first-order convergence rate in the Wasserstein distance …

  Cited by 10 Related articles All 6 versions


[PDF] projecteuclid.org

Behavior of the empirical Wasserstein distance in $\mathbb{R}^d$ under moment conditions

J Dedecker, F Merlevède - Electronic Journal of Probability, 2019 - projecteuclid.org

We establish some deviation inequalities, moment bounds and almost sure results for the

Wasserstein distance of order $ p\in [1,\infty) $ between the empirical measure of

independent and identically distributed ${\mathbb R}^ d $-valued random variables and the …

  Cited by 7 Related articles All 12 versions


Generating EEG signals of an RSVP experiment by a class conditioned wasserstein generative adversarial network

S Panwar, P Rad, J Quarles… - 2019 IEEE International …, 2019 - ieeexplore.ieee.org

Electroencephalography (EEG) data is difficult to obtain due to complex experimental setups

and reduced comfort due to prolonged wearing. This poses challenges to train powerful

deep learning model due to the limited EEG data. Hence, being able to generate EEG data …

  Cited by 5 Related articles All 2 versions


2019

Poincaré Wasserstein Autoencoder

I Ovinnikov - arXiv preprint arXiv:1901.01427, 2019 - arxiv.org

This work presents a reformulation of the recently proposed Wasserstein autoencoder

framework on a non-Euclidean manifold, the Poincaré ball model of the hyperbolic space.

By assuming the latent space to be hyperbolic, we can use its intrinsic hierarchy to impose …

  Cited by 20 Related articles All 4 versions 


2019


[PDF] arxiv.org

On the minimax optimality of estimating the wasserstein metric

T Liang - arXiv preprint arXiv:1908.10324, 2019 - arxiv.org

We study the minimax optimal rate for estimating the Wasserstein-$1 $ metric between two

unknown probability measures based on $ n $ iid empirical samples from them. We show

that estimating the Wasserstein metric itself between probability measures, is not …

  Cited by 3 Related articles All 3 versions 


On potentials of regularized Wasserstein generative adversarial networks for realistic hallucination of tiny faces

WZ ShaoJJ Xu, L Chen, Q Ge, LQ Wang, BK Bao… - Neurocomputing, 2019 - Elsevier

Super-resolution of facial images, aka face hallucination, has been intensively studied in the

past decades due to the increasingly emerging analysis demands in video surveillance, eg,

face detection, verification, identification. However, the actual performance of most previous …

  Cited by 2 Related articles All 3 versions

 

[PDF] arxiv.org

From the backward Kolmogorov PDE on the Wasserstein space to propagation of chaos for McKean-Vlasov SDEs

N Frikha, PEC de Raynal - arXiv preprint arXiv:1907.01410, 2019 - arxiv.org

In this article, we provide some new quantitative estimates for propagation of chaos of non-

linear stochastic differential equations (SDEs) in the sense of McKean-Vlasov. We obtain

explicit error estimates, at the level of the trajectories, at the level of the semi-group and at …

  Cited by 5 Related articles All 7 versions 










[PDF] arxiv.org

Duality and quotient spaces of generalized Wasserstein spaces

NP Chung, TS Trinh - arXiv preprint arXiv:1904.12461, 2019 - arxiv.org

In this article, using ideas of Liero, Mielke and Savaré in [21], we establish a Kantorovich

duality for generalized Wasserstein distances $ W_1^{a, b} $ on a generalized Polish metric

space, introduced by Piccoli and Rossi. As a consequence, we give another proof that …

  Cited by 3 Related articles All 3 versions 


[PDF] arxiv.org

Wasserstein Contraction of Stochastic Nonlinear Systems

J BouvrieJJ Slotine - arXiv preprint arXiv:1902.08567, 2019 - arxiv.org

We suggest that the tools of contraction analysis for deterministic systems can be applied

towards studying the convergence behavior of stochastic dynamical systems in the

Wasserstein metric. In particular, we consider the case of Ito diffusions with identical …

  Cited by 4 Related articles All 2 versions 

<——2019—–—2019 ——1780—  



[PDF] arxiv.org

Wasserstein gradient flow formulation of the time-fractional Fokker-Planck equation

MH DuongB Jin - arXiv preprint arXiv:1908.09055, 2019 - arxiv.org

In this work, we investigate a variational formulation for a time-fractional Fokker-Planck

equation which arises in the study of complex physical systems involving anomalously slow

diffusion. The model involves a fractional-order Caputo derivative in time, and thus …

  Cited by 1 Related articles All 7 versions 


[PDF] arxiv.org

Mullins-Sekerka as the Wasserstein flow of the perimeter

A Chambolle, T Laux - arXiv preprint arXiv:1910.02508, 2019 - arxiv.org

We prove the convergence of an implicit time discretization for the one-phase Mullins-

Sekerka equation, possibly with additional non-local repulsion, proposed in [F. Otto, Arch.

Rational Mech. Anal. 141 (1998) 63--103]. Our simple argument shows that the limit satisfies …

  Cited by 1 Related articles All 4 versions 


[PDF] researchgate.net

[PDF] Computationally efficient tree variants of gromov-wasserstein

T Le, N Ho, M Yamada - arXiv preprint arXiv:1910.04462, 2019 - researchgate.net

We propose two novel variants of Gromov-Wasserstein (GW) between probability measures

in different probability spaces based on projecting these measures into the tree metric

spaces. Our first proposed discrepancy, named flow-based tree Gromov-Wasserstein …

  Cited by 1 Related articles All 5 versions 


[PDF] arxiv.org

Weak convergence of empirical Wasserstein type distances

P Berthet, JC Fort - arXiv preprint arXiv:1911.02389, 2019 - arxiv.org

We estimate contrasts $\int_0^1 \rho(F^{-1}(u)-G^{-1}(u))\,du$ between two continuous

distributions $F$ and $G$ on $\mathbb{R}$ such that the set $\{F=G\}$ is a finite union of

intervals, possibly empty or $\mathbb{R}$. The non-negative convex cost function $\rho$ is …

  Cited by 2 Related articles All 6 versions 
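
As a concrete reading of such contrasts, here is a minimal numpy sketch (my own illustration, not code from the paper) of the empirical Wasserstein-p distance between two equal-size one-dimensional samples; it uses the fact that on the real line the optimal coupling matches order statistics. The function name and the equal-size assumption are mine.

    import numpy as np

    def empirical_wasserstein_1d(x, y, p=1):
        # On the real line, the optimal coupling between two equal-size empirical
        # measures matches sorted samples, so W_p^p is the mean matched cost.
        x, y = np.sort(np.asarray(x, float)), np.sort(np.asarray(y, float))
        if x.shape != y.shape:
            raise ValueError("this sketch assumes equal sample sizes")
        return np.mean(np.abs(x - y) ** p) ** (1.0 / p)

    rng = np.random.default_rng(0)
    print(empirical_wasserstein_1d(rng.normal(0, 1, 1000), rng.normal(0.5, 1, 1000), p=1))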


[PDF] projecteuclid.org

Convergence of the population dynamics algorithm in the Wasserstein metric

M Olvera-Cravioto - Electronic Journal of Probability, 2019 - projecteuclid.org

We study the convergence of the population dynamics algorithm, which produces sample

pools of random variables having a distribution that closely approximates that of the special

endogenous solution to a variety of branching stochastic fixed-point equations, including the …

  Cited by 3 Related articles All 6 versions


2019


[PDF] mdpi.com

Data-driven distributionally robust stochastic control of energy storage for wind power ramp management using the Wasserstein metric

I Yang - Energies, 2019 - mdpi.com

The integration of wind energy into the power grid is challenging because of its variability,

which causes high ramp events that may threaten the reliability and efficiency of power

systems. In this paper, we propose a novel distributionally robust solution to wind power …

  Cited by 2 Related articles All 6 versions 


[PDF] arxiv.org

Wasserstein stability of porous medium-type equations on manifolds with Ricci curvature bounded below

N De Ponti, M Muratori, C Orrieri - arXiv preprint arXiv:1908.03147, 2019 - arxiv.org

Given a complete, connected Riemannian manifold $\mathbb {M}^ n $ with Ricci curvature

bounded from below, we discuss the stability of the solutions of a porous medium-type

equation with respect to the 2-Wasserstein distance. We produce (sharp) stability estimates …

  Cited by 1 Related articles All 3 versions 



On the estimation of the Wasserstein distance in generative models

T Pinetz, D Soukup, T Pock - German Conference on Pattern Recognition, 2019 - Springer

Abstract Generative Adversarial Networks (GANs) have been used to model the underlying

probability distribution of sample based datasets. GANs are notorious for training difficulties

and their dependence on arbitrary hyperparameters. One recent improvement in GAN …

  Related articles All 5 versions


[PDF] arxiv.org

Optimal Fusion of Elliptic Extended Target Estimates Based on the Wasserstein Distance

K Thormann, M Baum - 2019 22nd International Conference on …, 2019 - ieeexplore.ieee.org

This paper considers the fusion of multiple estimates of a spatially extended object, where

the object extent is modeled as an ellipse parameterized by the orientation and semi-axes

lengths. For this purpose, we propose a novel systematic approach that employs a distance …

  Cited by 1 Related articles All 5 versions


[HTML] nih.gov

Construction of 4D Neonatal Cortical Surface Atlases Using Wasserstein Distance

Z Chen, Z Wu, L Sun, F Wang, L Wang… - 2019 IEEE 16th …, 2019 - ieeexplore.ieee.org

Spatiotemporal (4D) neonatal cortical surface atlases with densely sampled ages are

important tools for understanding the dynamic early brain development. Conventionally,

after non-linear co-registration, surface atlases are constructed by simple Euclidean average …

  Cited by 1 Related articles All 5 versions

<——2019—–—2019 ——1790— 



[PDF] arxiv.org

Random Matrix-Improved Estimation of the Wasserstein Distance between two Centered Gaussian Distributions

M Tiomoko, R Couillet - 2019 27th European Signal Processing …, 2019 - ieeexplore.ieee.org

This article proposes a method to consistently estimate functionals (1/p) Σ_{i=1}^p f(λ_i(C_1 C_2))

of the eigenvalues of the product of two covariance matrices C_1, C_2 ∈ ℝ^{p×p} based on the

empirical estimates λ_i(Ĉ_1 Ĉ_2) (Ĉ_a = (1/n_a) Σ_{i=1}^{n_a} x_i^{(a)} x_i^{(a)⊤}), when the size p and …

  Cited by 1 Related articles All 7 versions


[PDF] arxiv.org

Nonembeddability of Persistence Diagrams with  Wasserstein Metric

A Wagner - arXiv preprint arXiv:1910.13935, 2019 - arxiv.org

Persistence diagrams do not admit an inner product structure compatible with any

Wasserstein metric. Hence, when applying kernel methods to persistence diagrams, the

underlying feature map necessarily causes distortion. We prove persistence diagrams with …

  Cited by 2 Related articles All 2 versions 

 

[PDF] arxiv.org

Implementation of batched Sinkhorn iterations for entropy-regularized Wasserstein loss

T Viehmann - arXiv preprint arXiv:1907.01729, 2019 - arxiv.org

In this report, we review the calculation of entropy-regularised Wasserstein loss introduced

by Cuturi and document a practical implementation in PyTorch. Code is available at this

https URL Subjects: Machine Learning (stat. ML); Machine Learning (cs. LG) Cite as: arXiv …

  Cited by 1 Related articles All 2 versions 
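
For orientation, the following is a minimal, unbatched numpy sketch of entropy-regularised optimal transport computed by Sinkhorn iterations; the report itself documents a batched PyTorch implementation, so this is only an illustration under my own choice of variable names, regularisation strength and iteration count.

    import numpy as np

    def sinkhorn_cost(a, b, C, eps=0.1, n_iters=200):
        # a, b: histograms (nonnegative, summing to one); C: cost matrix.
        K = np.exp(-C / eps)              # Gibbs kernel
        u = np.ones_like(a)
        v = np.ones_like(b)
        for _ in range(n_iters):          # alternating marginal-matching scalings
            u = a / (K @ v)
            v = b / (K.T @ u)
        P = u[:, None] * K * v[None, :]   # entropic transport plan
        return float(np.sum(P * C))

    a = np.full(5, 1 / 5)
    b = np.full(7, 1 / 7)
    C = (np.linspace(0, 1, 5)[:, None] - np.linspace(0, 1, 7)[None, :]) ** 2
    print(sinkhorn_cost(a, b, C))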


Data augmentation method of sar image dataset based on wasserstein generative adversarial networks

Q Lu, H Jiang, G Li, W Ye - 2019 International conference on …, 2019 - ieeexplore.ieee.org

The published Synthetic Aperture Radar (SAR) samples are not abundant enough, which is

not conducive to the application of deep learning methods in the field of SAR automatic

target recognition. Generative Adversarial Nets (GANs) is one of the most effective ways to …

  Cited by 1 Related articles All 2 versions


Universality of persistence diagrams and the bottleneck and Wasserstein distances

P Bubenik, A Elchesen - arXiv preprint arXiv:1912.02563, 2019 - arxiv.org

We undertake a formal study of persistence diagrams and their metrics. We show that

barcodes and persistence diagrams together with the bottleneck distance and the

Wasserstein distances are obtained via universal constructions and thus have …

  Cited by 3 Related articles All 4 versions 


2019

Unsupervised segmentation of colonic polyps in narrow-band imaging data based on manifold representation of images and Wasserstein distance

IN Figueiredo, L Pinto, PN Figueiredo, R Tsai - … Signal Processing and …, 2019 - Elsevier

Colorectal cancer (CRC) is one of the most common cancers worldwide and after a certain

age (≥ 50) regular colonoscopy examination for CRC screening is highly recommended.

One of the most prominent precursors of CRC are abnormal growths known as polyps. If a …

  Related articles All 4 versions


[PDF] arxiv.org

Stylized Text Generation Using Wasserstein Autoencoders with a Mixture of Gaussian Prior

A Ghabussi, L Mou, O Vechtomova - arXiv preprint arXiv:1911.03828, 2019 - arxiv.org

Wasserstein autoencoders are effective for text generation. They do not however provide

any control over the style and topic of the generated sentences if the dataset has multiple

classes and includes different topics. In this work, we present a semi-supervised approach …

  Related articles All 2 versions 


[PDF] researchgate.net

[PDF] Rate of convergence in Wasserstein distance of piecewise-linear Lévy-driven SDEs

A Arapostathis, G Pang… - arXiv preprint arXiv …, 2019 - researchgate.net

In this paper, we study the rate of convergence under the Wasserstein metric of a broad

class of multidimensional piecewise Ornstein–Uhlenbeck processes with jumps. These are

governed by stochastic differential equations having a piecewise linear drift, and a fairly …

  Related articles 


[PDF] arxiv.org

Convergence of some classes of random flights in Wasserstein distance

A Falaleev, V Konakov - arXiv preprint arXiv:1910.03862, 2019 - arxiv.org

In this paper we consider a random walk of a particle in $\mathbb {R}^ d $. Convergence of

different transformations of trajectories of random flights with Poisson switching moments

has been obtained by Davydov and Konakov, as well as diffusion approximation of the …

  Related articles All 2 versions 


[PDF] koreascience.or.kr

Experimental Analysis of Equilibrization in Binary Classification for Non-Image Imbalanced Data Using Wasserstein GAN

ZY Wang, DK Kang - … Journal of Internet, Broadcasting and …, 2019 - koreascience.or.kr

In this paper, we explore the details of three classic data augmentation methods and two

generative model based oversampling methods. The three classic data augmentation

methods are random sampling (RANDOM), Synthetic Minority Over-sampling Technique …

  Cited by 2 Related articles All 3 versions 

<——2019—–—2019 ——1800— 



Wasserstein barycenters in the manifold of all positive definite matrices

E Nobari, B Ahmadi Kakavandi - Quarterly of Applied Mathematics, 2019 - ams.org

In this paper, we study the Wasserstein barycenter of finitely many Borel probability

measures on $\mathbb {P} _ {n} $, the Riemannian manifold of all $ n\times n $ real positive

definite matrices as well as its associated dual problem, namely the optimal transport …

  Related articles All 2 versions
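
For readers who want to experiment, one widely used fixed-point iteration from the optimal-transport literature for the 2-Wasserstein (Bures) barycenter of positive definite covariance matrices is sketched below; it is a generic scheme, not necessarily the construction or the dual approach studied in this paper, and the fixed iteration count is a crude assumption of mine.

    import numpy as np
    from scipy.linalg import sqrtm, inv

    def bures_wasserstein_barycenter(covs, weights, n_iters=50):
        # Iterates S <- S^{-1/2} ( sum_j w_j (S^{1/2} C_j S^{1/2})^{1/2} )^2 S^{-1/2}.
        S = sum(w * C for w, C in zip(weights, covs))   # start at the linear mean
        for _ in range(n_iters):
            R = np.real(sqrtm(S))
            R_inv = inv(R)
            T = sum(w * np.real(sqrtm(R @ C @ R)) for w, C in zip(weights, covs))
            S = R_inv @ T @ T @ R_inv
        return S

    covs = [np.diag([1.0, 2.0]), np.diag([3.0, 0.5])]
    print(bures_wasserstein_barycenter(covs, [0.5, 0.5]))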


 

[PDF] ntu.edu.sg

Poisson discretizations of Wiener functionals and Malliavin operators with Wasserstein estimates

N Privault, SCP Yam, Z Zhang - Stochastic Processes and their …, 2019 - Elsevier

This article proposes a global, chaos-based procedure for the discretization of functionals of

Brownian motion into functionals of a Poisson process with intensity λ> 0. Under this

discretization we study the weak convergence, as the intensity of the underlying Poisson …

  Related articles All 6 versions



The existence of geodesics in Wasserstein spaces over path groups and loop groups

J Shao - Stochastic Processes and their Applications, 2019 - Elsevier

In this work we prove the existence and uniqueness of the optimal transport map for L p-

Wasserstein distance with p> 1, and particularly present an explicit expression of the optimal

transport map for the case p= 2. As an application, we show the existence of geodesics …

  Related articles All 8 versions


[PDF] rit.edu

A Comparative Assessment of the Impact of Various Norms on Wasserstein Generative Adversarial Networks

C Ramesh - 2019 - scholarworks.rit.edu

Abstract Generative Adversarial Networks (GANs) provide a fascinating new paradigm in

machine learning and artificial intelligence, especially in the context of unsupervised

learning. GANs are quickly becoming a state of the art tool, used in various applications …

  Related articles All 2 versions 


[PDF] iop.org

Wasserstein space as state space of quantum mechanics and optimal transport

MF Rosyid, K Wahyuningsih - … of Physics: Conference Series, 2019 - iopscience.iop.org

In this work, we are in the position to view a measurement of a physical observable as an

experiment in the sense of probability theory. To every physical observable, a sample space

called the spectrum of the observable is therefore available. We have investigated the …

  Related articles All 2 versions


2019


[PDF] sns.it

Sensitivity of the Compliance and of the Wasserstein Distance with Respect to a Varying Source

G Bouchitté, I Fragalà, I Lucardesi - Applied Mathematics & Optimization, 2019 - Springer

We show that the compliance functional in elasticity is differentiable with respect to

horizontal variations of the load term, when the latter is given by a possibly concentrated

measure; moreover, we provide an integral representation formula for the derivative as a …

  Related articles All 9 versions



[PDF] researchgate.net

[PDF] Computation of Wasserstein barycenters via the Iterated Swapping Algorithm

G Puccetti, L Rüschendorf, S Vanduffel - 2019 - researchgate.net

In recent years, the Wasserstein barycenter has become an important notion in the analysis

of high dimensional data with a broad range of applications in applied probability,

economics, statistics and in particular to clustering and image processing. In our paper we …

  Related articles 


Approximation of Discounted Minimax Markov Control Problems and Zero-Sum Markov Games Using Hausdorff and Wasserstein Distances

F Dufour, T Prieto-Rumeau - Dynamic Games and Applications, 2019 - Springer

This paper is concerned with a minimax control problem (also known as a robust Markov

decision process (MDP) or a game against nature) with general state and action spaces

under the discounted cost optimality criterion. We are interested in approximating …

  Related articles All 6 versions


[PDF] tum.de

Structure preserving discretization and approximation of gradient flows in Wasserstein-like space

S Plazotta - 2019 - mediatum.ub.tum.de

This thesis investigates structure-preserving, temporal semi-discretizations and

approximations for PDEs with gradient flow structure with the application to evolution

problems in the L²-Wasserstein space. We investigate the variational formulation of the time …

  Related articles All 3 versions 


Reproducibility test of radiomics using network analysis and Wasserstein K-means algorithm

JH Oh, AP Apte, E Katsoulakis, N Riaz, V Hatzoglou… - bioRxiv, 2019 - biorxiv.org

Purpose To construct robust and validated radiomic predictive models, the development of a

reliable method that can identify reproducible radiomic features robust to varying image

acquisition methods and other scanner parameters should be preceded with rigorous …

  Related articles All 3 versions 

<——2019—–—2019 ——1810—



Use of the Wasserstein Metric to Solve the Inverse Dynamic Seismic Problem

AA Vasilenko - Geomodel 2019, 2019 - earthdoc.org

The inverse dynamic seismic problem consists in recovering the velocity model of elastic

medium based on the observed seismic data. In this work full waveform inversion method is

used to solve this problem. It consists in minimizing an objective functional measuring the …

  Related articles


2019

Sampling of probability measures in the convex order by Wasserstein projection

J Corbetta, B Jourdain - 2019 - ideas.repec.org

In this paper, for $\mu $ and $\nu $ two probability measures on $\mathbb {R}^ d $ with finite

moments of order $\rho\ge 1$, we define the respective projections for the $ W_\rho $-

Wasserstein distance of $\mu $ and $\nu $ on the sets of probability measures dominated by …

  

Elements of Statistical Inference in 2-Wasserstein Space

J Ebert, V Spokoiny, A Suvorikova - Topics in Applied Analysis and …, 2019 - Springer

This work addresses an issue of statistical inference for the datasets lacking underlying

linear structure, which makes impossible the direct application of standard inference

techniques and requires a development of a new tool-box taking into account properties of  …

  Related articles All 3 versions


DE NOVO PROTEIN DESIGN OF NOVEL FOLDS USING GUIDED CONDITIONAL WASSERSTEIN GENERATIVE ADVERSARIAL NETWORKS (GCWGAN)

S Zhu - 2019 - oaktrust.library.tamu.edu

In the research areas about proteins, it is always a significant topic to detect the

sequencestructure-function relationship. Fundamental questions remain for this topic: How

much could current data alone reveal deep insights about such relationship? And how much …

  

[CITATION] On the complexity of computing Wasserstein distances

B Taskesen, S Shafieezadeh-Abadeh, D Kuhn - 2019 - Working paper

On the Complexity of Approximating Wasserstein Barycenters

proceedings.mlr.press › ...

by A Kroshnin · 2019 · Cited by 44 — We study the complexity of approximating the Wasserstein barycenter of m discrete measures, or histograms of size n, by contrasting two alternative approaches ...

[CITATION] On the Complexity of Approximating Wasserstein Barycenter. eprint

A Kroshnin, D Dvinskikh, P Dvurechensky, A Gasnikov… - arXiv preprint arXiv …, 2019

Cited by 72 Related articles All 9 versions
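
One of the approaches usually contrasted in this line of work is entropic regularisation; a minimal numpy sketch of an iterative-Bregman-projection style barycenter for m histograms on a shared support is given below. It is my own illustration (names, regularisation level and the fixed support are assumptions), not the algorithm analysed in the paper.

    import numpy as np

    def entropic_barycenter(hists, C, weights, eps=0.05, n_iters=300):
        # hists: (m, n) rows are probability vectors; C: shared (n, n) ground cost.
        K = np.exp(-C / eps)
        v = np.ones_like(hists)
        for _ in range(n_iters):
            u = hists / (v @ K.T)              # row k holds u_k = p_k / (K v_k)
            Ktu = u @ K                        # row k holds K^T u_k
            b = np.exp(weights @ np.log(Ktu))  # weighted geometric mean of marginals
            v = b[None, :] / Ktu
        return b

    grid = np.linspace(0, 1, 50)
    C = (grid[:, None] - grid[None, :]) ** 2
    p1 = np.exp(-((grid - 0.25) ** 2) / 0.01); p1 /= p1.sum()
    p2 = np.exp(-((grid - 0.75) ** 2) / 0.01); p2 /= p2.sum()
    bary = entropic_barycenter(np.stack([p1, p2]), C, np.array([0.5, 0.5]))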


2019

Convergence rate in Wasserstein distance and semiclassical limit for the defocusing logarithmic Schrödinger equation
Author:Ferriere G.
Article, 2019
Publication:arXiv, 2019 03 11
Publisher:2019


[PDF] arxiv.org

Approximate Bayesian computation with the Wasserstein distance

E Bernton, PE Jacob, M Gerber… - Journal of the Royal …, 2019 - Wiley Online Library

… In the supplementary material, we also observe that the swapping distance can approximate

the Wasserstein distance more accurately than the Hilbert distance as the dimension urn:x-…

Cited by 88 Related articles All 12 versions

 

F) Comparison of poststack seismic inversion methods

https://www.researchgate.net › ... › Seismic Inversion


[CITATION] Comparison of object functions for the inversion of seismic data and study on the potentialities of the Wasserstein Metric

L Stracca, E Stucchi, A Mazzotti - GNGTS, 2019 - arpi.unipi.it


  

Wasserstein metric - Twitter

https://twitter.com › events

It features an excellent primer on the Wasserstein metric and its use in RL. Don't judge it by the quality of the uploaded video.

Jan 10, 2019

[PDF] arxiv.org

Hausdorff and Wasserstein metrics on graphs and other structured data

E Patterson - arXiv preprint arXiv:1907.00257, 2019 - arxiv.org

Optimal transport is widely used in pure and applied mathematics to find probabilistic

solutions to hard combinatorial matching problems. We extend the Wasserstein metric and

other elements of optimal transport from the matching of sets to the matching of graphs and …

  Cited by 5 Related articles All 3 versions 

<——2019—–—2019 ——1820— 


[PDF] arxiv.org

Robust Wasserstein profile inference and applications to machine learning

J Blanchet, Y Kang, K Murthy - Journal of Applied Probability, 2019 - cambridge.org

We show that several machine learning estimators, including square-root least absolute

shrinkage and selection and regularized logistic regression, can be represented as

solutions to distributionally robust optimization problems. The associated uncertainty regions …

  Cited by 146 Related articles All 5 versions

[PDF] thecvf.com

Max-sliced wasserstein distance and its use for gans

I Deshpande, YT Hu, R Sun, A Pyrros… - … Vision and Pattern …, 2019 - openaccess.thecvf.com

Generative adversarial nets (GANs) and variational auto-encoders have significantly

improved our distribution modeling capabilities, showing promise for dataset augmentation,

image-to-image translation and feature learning. However, to model high-dimensional …

  Cited by 43 Related articles All 8 versions 
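
As a rough illustration of the sliced idea (the paper optimises the projection direction, typically with a learned surrogate), the sketch below approximates a max-sliced Wasserstein-1 distance by brute-force search over random unit directions; the direction count and the equal-sample-size assumption are mine.

    import numpy as np

    def max_sliced_w1(X, Y, n_dirs=200, seed=0):
        # Project both point clouds on random unit directions and keep the largest
        # 1-D Wasserstein-1 distance; random search only approximates the true max.
        rng = np.random.default_rng(seed)
        best = 0.0
        for _ in range(n_dirs):
            theta = rng.normal(size=X.shape[1])
            theta /= np.linalg.norm(theta)
            px, py = np.sort(X @ theta), np.sort(Y @ theta)
            best = max(best, float(np.mean(np.abs(px - py))))
        return best

    rng = np.random.default_rng(1)
    print(max_sliced_w1(rng.normal(size=(500, 8)), rng.normal(0.3, 1.0, size=(500, 8))))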


[PDF] arxiv.org

Wasserstein distributionally robust optimization: Theory and applications in machine learning

D Kuhn, PM Esfahani, VA Nguyen… - … Science in the Age …, 2019 - pubsonline.informs.org

Many decision problems in science, engineering, and economics are affected by uncertain

parameters whose distribution is only indirectly observable through samples. The goal of

data-driven decision making is to learn a decision from finitely many training samples that …

  Cited by 69 Related articles All 7 versions


[PDF] mlr.press

Gromov-wasserstein learning for graph matching and node embedding

H Xu, D Luo, H Zha, LC Duke - International conference on …, 2019 - proceedings.mlr.press

A novel Gromov-Wasserstein learning framework is proposed to jointly match (align) graphs

and learn embedding vectors for the associated graph nodes. Using Gromov-Wasserstein

discrepancy, we measure the dissimilarity between two graphs and find their …

  Cited by 52 Related articles All 9 versions 


[PDF] arxiv.org

Learning with minibatch Wasserstein: asymptotic and gradient properties

K Fatras, Y Zine, R Flamary, R Gribonval… - arXiv preprint arXiv …, 2019 - arxiv.org

Optimal transport distances are powerful tools to compare probability distributions and have

found many applications in machine learning. Yet their algorithmic complexity prevents their

direct use on large scale datasets. To overcome this challenge, practitioners compute these …

  Cited by 12 Related articles All 23 versions 
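
To make the minibatch idea concrete, here is a small numpy/scipy sketch (my own, with assumed batch sizes) that averages exact squared-Euclidean optimal transport costs over random equal-size minibatches; for two uniform empirical measures of the same size the optimal plan is an assignment, so each term can be solved exactly with the Hungarian algorithm.

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def minibatch_ot_cost(X, Y, batch=64, n_batches=20, seed=0):
        rng = np.random.default_rng(seed)
        total = 0.0
        for _ in range(n_batches):
            xb = X[rng.choice(len(X), batch, replace=False)]
            yb = Y[rng.choice(len(Y), batch, replace=False)]
            C = np.sum((xb[:, None, :] - yb[None, :, :]) ** 2, axis=-1)
            rows, cols = linear_sum_assignment(C)   # exact OT for uniform weights
            total += C[rows, cols].mean()
        return total / n_batches

    rng = np.random.default_rng(2)
    print(minibatch_ot_cost(rng.normal(size=(1000, 2)), rng.normal(1.0, 1.0, (1000, 2))))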


2019


[PDF] arxiv.org

Scalable Gromov-Wasserstein learning for graph partitioning and matching

H Xu, D Luo, L Carin - arXiv preprint arXiv:1905.07645, 2019 - arxiv.org

We propose a scalable Gromov-Wasserstein learning (S-GWL) method and establish a

novel and theoretically-supported paradigm for large-scale graph analysis. The proposed

method is based on the fact that Gromov-Wasserstein discrepancy is a pseudometric on …

 Cited by 39 Related articles All 10 versions 


[HTML] oup.com

The gromov–wasserstein distance between networks and stable network invariants

S Chowdhury, F Mémoli - Information and Inference: A Journal of …, 2019 - academic.oup.com

We define a metric—the network Gromov–Wasserstein distance—on weighted, directed

networks that is sensitive to the presence of outliers. In addition to proving its theoretical

properties, we supply network invariants based on optimal transport that approximate this …

  Cited by 20 Related articles All 5 versions


[PDF] arxiv.org

Fréchet means and Procrustes analysis in Wasserstein space

Y Zemel, VM Panaretos - Bernoulli, 2019 - projecteuclid.org

We consider two statistical problems at the intersection of functional and non-Euclidean data

analysis: the determination of a Fréchet mean in the Wasserstein space of multivariate

distributions; and the optimal registration of deformed random measures and point …

  Cited by 51 Related articles All 8 versions


[PDF] ucla.edu

On differentiability in the Wasserstein space and well-posedness for Hamilton–Jacobi equations

W Gangbo, A Tudorascu - Journal de Mathématiques Pures et Appliquées, 2019 - Elsevier

In this paper we elucidate the connection between various notions of differentiability in the

Wasserstein space: some have been introduced intrinsically (in the Wasserstein space, by

using typical objects from the theory of Optimal Transport) and used by various authors to …

  Cited by 35 Related articles All 4 versions


Aggregated wasserstein distance and state registration for hidden markov models

Y Chen, J Ye, J Li - IEEE transactions on pattern analysis and …, 2019 - ieeexplore.ieee.org

We propose a framework, named Aggregated Wasserstein, for computing a dissimilarity

measure or distance between two Hidden Markov Models with state conditional distributions

being Gaussian. For such HMMs, the marginal distribution at any time position follows a …

  Cited by 5 Related articles All 6 versions

<——2019—–—2019 ——1830— 



[PDF] thecvf.com

Normalized wasserstein for mixture distributions with applications in adversarial learning and domain adaptation

Y Balaji, R Chellappa, S Feizi - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

Understanding proper distance measures between distributions is at the core of several

learning tasks such as generative models, domain adaptation, clustering, etc. In this work,

we focus on mixture distributions that arise naturally in several application domains where …

  Cited by 12 Related articles All 4 versions 


[PDF] ncl.ac.uk

Generative adversarial framework for depth filling via wasserstein metric, cosine transform and domain transfer

A Atapour-Abarghouei, S Akcay… - Pattern Recognition, 2019 - Elsevier

In this work, the issue of depth filling is addressed using a self-supervised feature learning

model that predicts missing depth pixel values based on the context and structure of the

scene. A fully-convolutional generative model is conditioned on the available depth …

  Cited by 17 Related articles All 4 versions


[PDF] arxiv.org

Bridging Bayesian and minimax mean square error estimation via Wasserstein distributionally robust optimization

VA Nguyen, S Shafieezadeh-Abadeh, D Kuhn… - arXiv preprint arXiv …, 2019 - arxiv.org

We introduce a distributionally robust minimum mean square error estimation model with a

Wasserstein ambiguity set to recover an unknown signal from a noisy observation. The

proposed model can be viewed as a zero-sum game between a statistician choosing an …

  Cited by 8 Related articles All 6 versions 

A unified formulation for the Bures-Wasserstein and Log-Euclidean/Log-Hilbert-Schmidt distances between positive definite operators

HQ Minh - International Conference on Geometric Science of …, 2019 - Springer

This work presents a parametrized family of distances, namely the Alpha Procrustes

distances, on the set of symmetric, positive definite (SPD) matrices. The Alpha Procrustes

distances provide a unified formulation encompassing both the Bures-Wasserstein and Log …

Cited by 14 Related articles All 2 versions


[PDF] arxiv.org

Modified massive Arratia flow and Wasserstein diffusion

V Konarovskyi, MK von Renesse - … on Pure and Applied …, 2019 - Wiley Online Library

Extending previous work by the first author we present a variant of the Arratia flow, which

consists of a collection of coalescing Brownian motions starting from every point of the unit

interval. The important new feature of the model is that individual particles carry mass that …

  Cited by 28 Related articles All 7 versions


2019


[PDF] arxiv.org

Investigating under and overfitting in wasserstein generative adversarial networks

B Adlam, C Weill, A Kapoor - arXiv preprint arXiv:1910.14137, 2019 - arxiv.org

We investigate under and overfitting in Generative Adversarial Networks (GANs), using

discriminators unseen by the generator to measure generalization. We find that the model

capacity of the discriminator has a significant effect on the generator's model quality, and  …

  Cited by 7 Related articles All 3 versions 

Investigating Under and Overfitting in Wasserstein Generative Adversarial Networks

A Kapoor, B Adlam, C Weill - 2019 - research.google

We investigate under and overfitting in Generative Adversarial Networks (GANs), using

discriminators unseen by the generator to measure generalization. We find that the model

capacity of the discriminator has a significant effect on the generator's model quality, and  …





[PDF] arxiv.org

Sparsemax and relaxed Wasserstein for topic sparsity

T Lin, Z Hu, X Guo - … ACM International Conference on Web Search and …, 2019 - dl.acm.org

Topic sparsity refers to the observation that individual documents usually focus on several

salient topics instead of covering a wide variety of topics, and a real topic adopts a narrow

range of terms instead of a wide coverage of the vocabulary. Understanding this topic …

  Cited by 10 Related articles All 5 versions


Deep joint two-stream Wasserstein auto-encoder and selective attention alignment for unsupervised domain adaptation

Z Chen, C Chen, X Jin, Y Liu, Z Cheng - Neural computing and …, 2019 - Springer

Abstract Domain adaptation refers to the process of utilizing the labeled source domain data

to learn a model that can perform well in the target domain with limited or missing labels.

Several domain adaptation methods combining image translation and feature alignment …

  Cited by 13 Related articles



Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula

X Fang, QM Shao, L Xu - Probability Theory and Related Fields, 2019 - Springer

Stein's method has been widely used for probability approximations. However, in the multi-

dimensional setting, most of the results are for multivariate normal approximation or for test

functions with bounded second-or higher-order derivatives. For a class of multivariate …

  Cited by 22 Related articles All 7 versions

[CITATION] Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula (vol 174, pg 945, 2019)

X Fang, QM Shao, L Xu - … THEORY AND …, 2019 - … TI

[HTML] springer.com

[HTML] Correction to: Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula

X Fang, QM Shao, L Xu - Probability Theory and Related Fields, 2019 - Springer

Under the above-strengthened Assumption 2.1, all the conclusions and examples in [1] still hold

true, except that all the constants \(C_\theta \) therein will depend on the constants in the new

assumption … Combining the previous three inequalities, we conclude that [1, (7.1)] still holds …

  Cited by 1 Related articles All 2 versions

<——2019—–—2019 ——1840—  



[PDF] arxiv.org

Connections between support vector machines, wasserstein distance and gradient-penalty GANs

A Jolicoeur-Martineau, I Mitliagkas - arXiv preprint arXiv:1910.06922, 2019 - arxiv.org

We generalize the concept of maximum-margin classifiers (MMCs) to arbitrary norms and

non-linear functions. Support Vector Machines (SVMs) are a special case of MMC. We find

that MMCs can be formulated as Integral Probability Metrics (IPMs) or classifiers with some …

  Cited by 8 Related articles All 3 versions 
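
For reference, a common gradient-penalty term of the kind this paper relates to maximum margins is sketched below in PyTorch; it is the usual penalty on random interpolates between real and generated samples, written under my own naming and not claimed to be the exact variant the authors analyse.

    import torch

    def gradient_penalty(critic, real, fake, lambda_gp=10.0):
        # Penalise deviations of the critic's gradient norm from 1 on interpolates.
        eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
        x_hat = (eps * real + (1 - eps) * fake).requires_grad_(True)
        grads = torch.autograd.grad(critic(x_hat).sum(), x_hat, create_graph=True)[0]
        grad_norm = grads.view(grads.size(0), -1).norm(2, dim=1)
        return lambda_gp * ((grad_norm - 1.0) ** 2).mean()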


[PDF] arxiv.org

Randomized filtering and Bellman equation in Wasserstein space for partial observation control problem

E Bandini, A Cosso, M Fuhrman, H Pham - Stochastic Processes and their …, 2019 - Elsevier

We study a stochastic optimal control problem for a partially observed diffusion. By using the

control randomization method in Bandini et al.(2018), we prove a corresponding

randomized dynamic programming principle (DPP) for the value function, which is obtained …

  Cited by 16 Related articles All 13 versions


[PDF] arxiv.org

A convergent Lagrangian discretization for p-Wasserstein and flux-limited diffusion equations

B Söllner, O Junge - arXiv preprint arXiv:1906.01321, 2019 - arxiv.org

We study a Lagrangian numerical scheme for solution of a nonlinear drift diffusion equation

of the form $\partial_t u=\partial_x (u\cdot c [\partial_x (h^\prime (u)+ v)]) $ on an interval.

This scheme will consist of a spatio-temporal discretization founded in the formulation of the …

  Cited by 2 Related articles All 5 versions 

[CITATION] A convergent Lagrangian discretization for p-Wasserstein and flux-limited diffusion equations

O Junge, B Söllner - arXiv preprint arXiv:1906.01321, 2019



[PDF] arxiv.org

Normalized Wasserstein distance for mixture distributions with applications in adversarial learning and domain adaptation

Y Balaji, R Chellappa, S Feizi - arXiv preprint arXiv:1902.00415, 2019 - arxiv.org

Understanding proper distance measures between distributions is at the core of several

learning tasks such as generative models, domain adaptation, clustering, etc. In this work,

we focus on mixture distributions that arise naturally in several application domains where …

  Cited by 5 Related articles All 2 versions 


[PDF] arxiv.org

Geometric convergence bounds for Markov chains in Wasserstein distance based on generalized drift and contraction conditions

Q Qin, JP Hobert - arXiv preprint arXiv:1902.02964, 2019 - arxiv.org

Let $\{X_n\} _ {n= 0}^\infty $ denote an ergodic Markov chain on a general state space that

has stationary distribution $\pi $. This article concerns upper bounds on the $ L_1 $-

Wasserstein distance between the distribution of $ X_n $ and $\pi $. In particular, an explicit …

  Cited by 9 Related articles All 2 versions 


2019


[PDF] arxiv.org

Refined basic couplings and Wasserstein-type distances for SDEs with Lévy noises

D Luo, J Wang - Stochastic Processes and their Applications, 2019 - Elsevier

We establish the exponential convergence with respect to the L1-Wasserstein distance and

the total variation for the semigroup corresponding to the stochastic differential equation

dX_t = dZ_t + b(X_t) dt, where (Z_t)_{t≥0} is a pure jump Lévy process whose Lévy measure ν fulfills …

  Cited by 17 Related articles All 7 versions


 

[PDF] arxiv.org

Convergence rate in Wasserstein distance and semiclassical limit for the defocusing logarithmic Schr {\" o} dinger equation

G Ferriere - arXiv preprint arXiv:1903.04309, 2019 - arxiv.org

We consider the dispersive logarithmic Schr {ö} dinger equation in a semi-classical scaling.

We extend the results about the large time behaviour of the solution (dispersion faster than

usual with an additional logarithmic factor, convergence of the rescaled modulus of the …

  Cited by 6 Related articles All 4 versions 


[PDF] arxiv.org

Second-Order Models for Optimal Transport and Cubic Splines on the Wasserstein Space

JD Benamou, TO Gallouët, FX Vialard - Foundations of Computational …, 2019 - Springer

On the space of probability densities, we extend the Wasserstein geodesics to the case of

higher-order interpolation such as cubic spline interpolation. After presenting the natural

extension of cubic splines to the Wasserstein space, we propose a simpler approach based …



On the total variation Wasserstein gradient flow and the TV-JKO scheme

G Carlier, C Poon - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

We study the JKO scheme for the total variation, characterize the optimizers, prove some of

their qualitative properties (in particular a form of maximum principle and in some cases, a

minimum principle as well). Finally, we establish a convergence result as the time step goes …

  Cited by 7 Related articles All 7 versions


[PDF] arxiv.org

Hausdorff and Wasserstein metrics on graphs and other structured data

E Patterson - arXiv preprint arXiv:1907.00257, 2019 - arxiv.org

Optimal transport is widely used in pure and applied mathematics to find probabilistic

solutions to hard combinatorial matching problems. We extend the Wasserstein metric and

other elements of optimal transport from the matching of sets to the matching of graphs and  …

  Cited by 4 Related articles All 3 versions 

<——2019—–—2019 ——1850—  



[PDF] arxiv.org

Subexponential upper and lower bounds in Wasserstein distance for Markov processes

A Arapostathis, G Pang, N Sandrić - arXiv preprint arXiv:1907.05250, 2019 - arxiv.org

In this article, relying on Foster-Lyapunov drift conditions, we establish subexponential

upper and lower bounds on the rate of convergence in the $\mathrm {L}^ p $-Wasserstein

distance for a class of irreducible and aperiodic Markov processes. We further discuss these …

  Cited by 2 Related articles All 3 versions 


[PDF] arxiv.org

Hypothesis Test and Confidence Analysis with Wasserstein Distance with General Dimension

M Imaizumi, H Ota, T Hamaguchi - arXiv preprint arXiv:1910.07773, 2019 - arxiv.org

We develop a general framework for statistical inference with the Wasserstein distance.

Recently, the Wasserstein distance has attracted much attention and been applied to

various machine learning tasks due to its celebrated properties. Despite the importance …

  Cited by 1 Related articles All 2 versions 




[PDF] arxiv.org

Wasserstein Inequality and Minimal Green Energy on Compact Manifolds

S Steinerberger - arXiv preprint arXiv:1907.09023, 2019 - arxiv.org

Let $ M $ be a smooth, compact $ d-$ dimensional manifold, $ d\geq 3, $ without boundary

and let $ G: M\times M\rightarrow\mathbb {R}\cup\left\{\infty\right\} $ denote the Green's

function of the Laplacian $-\Delta $(normalized to have mean value 0). We prove a bound …

  Cited by 4 Related articles All 2 versions 


[PDF] arxiv.org

Tropical Optimal Transport and Wasserstein Distances

W Lee, W Li, B Lin, A Monod - arXiv preprint arXiv:1911.05401, 2019 - arxiv.org

We study the problem of optimal transport in tropical geometry and define the Wasserstein-$

p $ distances for probability measures in the continuous metric measure space setting of the

tropical projective torus. We specify the tropical metric---a combinatorial metric that has been …

  Cited by 1 Related articles All 3 versions 

[PDF] ucla.edu

[PDF] Tropical Optimal Transport and Wasserstein Distances in Phylogenetic Tree Space

W Lee, W Li, B Lin, A Monod - arXiv preprint arXiv:1911.05401, 2019 - math.ucla.edu

We study the problem of optimal transport on phylogenetic tree space from the perspective

of tropical geometry, and thus define the Wasserstein-p distances for probability measures in

this continuous metric measure space setting. With respect to the tropical metric—a …

  Related articles All 2 versions 



[PDF] arxiv.org

The boundary method for semi-discrete optimal transport partitions and Wasserstein distance computation

L Dieci, JD Walsh III - Journal of Computational and Applied Mathematics, 2019 - Elsevier

We introduce a new technique, which we call the boundary method, for solving semi-

discrete optimal transport problems with a wide range of cost functions. The boundary

method reduces the effective dimension of the problem, thus improving complexity. For cost …

  Cited by 7 Related articles All 5 versions


2019


[PDF] thecvf.com

[PDF] Face Synthesis and Recognition Using Disentangled Representation-Learning Wasserstein GAN.

GSJ Hsu, CH Tang, MH Yap - CVPR Workshops, 2019 - openaccess.thecvf.com

Abstract We propose the Disentangled Representation-learning Wasserstein GAN (DR-

WGAN) trained on augmented data for face recognition and face synthesis across pose. We

improve the state-of-the-art DR-GAN with the Wasserstein loss considered in the …

  Cited by 1 Related articles All 4 versions 



[PDF] archives-ouvertes.fr

Diffusive processes on the Wasserstein space: Coalescing models, Regularization properties and McKean-Vlasov equations

V Marx - 2019 - tel.archives-ouvertes.fr

The aim of this thesis is to study a class of diffusive stochastic processes with values in the

space of probability measures on the real line, called Wasserstein space if it is endowed

with the Wasserstein metric W2. The following issues are mainly addressed in this work: how …

  Cited by 2 Related articles All 9 versions 


[PDF] researchgate.net

Infrared and Visible Image Fusion via Multi-discriminators Wasserstein Generative Adversarial Network

J Li, H Huo, K Liu, C Li, S Li… - … On Machine Learning And …, 2019 - ieeexplore.ieee.org

Generative adversarial network (GAN) has been widely applied to infrared and visible image

fusion. However, the existing GAN-based image fusion methods only establish one

discriminator in the network to make the fused image capture gradient information from the …

  Cited by 1 Related articles All 3 versions



Hybrid Wasserstein distance and fast distribution clustering

I Verdinelli, L Wasserman - Electronic Journal of Statistics, 2019 - projecteuclid.org

We define a modified Wasserstein distance for distribution clustering which inherits many of

the properties of the Wasserstein distance but which can be estimated easily and computed

quickly. The modified distance is the sum of two terms. The first term—which has a closed …

  Cited by 2 Related articles All 5 versions


[PDF] aaai.org

Wasserstein soft label propagation on hypergraphs: Algorithm and generalization error bounds

T Gao, S Asoodeh, Y Huang, J Evans - Proceedings of the AAAI …, 2019 - ojs.aaai.org

Inspired by recent interests of developing machine learning and data mining algorithms on

hypergraphs, we investigate in this paper the semi-supervised learning algorithm of

propagating” soft labels”(eg probability distributions, class membership scores) over …

  Cited by 3 Related articles All 13 versions 

[PDF] uni-bielefeld.de

<——2019—–—2019 ——1860— 



[PDF] Diffusions and PDEs on Wasserstein space

FY Wang - arXiv preprint arXiv:1903.02148, 2019 - sfb1283.uni-bielefeld.de

We propose a new type of SDE, whose coefficients depend on the image of solutions, to investigate

the diffusion process on the Wasserstein space P_2 over ℝ^d, generated by a

time-dependent differential operator acting on f ∈ C^2 …

  Cited by 2 Related articles 


[PDF] arxiv.org

Temporal Wasserstein non-negative matrix factorization for non-rigid motion segmentation and spatiotemporal deconvolution

E Varol, A Nejatbakhsh, C McGrory - arXiv preprint arXiv:1912.03463, 2019 - arxiv.org

Motion segmentation for natural images commonly relies on dense optic flow to yield point

trajectories which can be grouped into clusters through various means including spectral

clustering or minimum cost multicuts. However, in biological imaging scenarios, such as …

  Cited by 2 Related articles All 3 versions 


[PDF] arxiv.org

Tree-Wasserstein Barycenter for Large-Scale Multilevel Clustering and Scalable Bayes

T Le, V Huynh, N Ho, D Phung, M Yamada - arXiv preprint arXiv …, 2019 - arxiv.org

We study in this paper a variant of Wasserstein barycenter problem, which we refer to as tree-

Wasserstein barycenter, by leveraging a specific class of ground metrics, namely tree

metrics, for Wasserstein distance. Drawing on the tree structure, we propose an efficient …

  Related articles All 2 versions 


 
2019

[CITATION] Multivariate Stein Factors from Wasserstein Decay

MA Erdogdu, L Mackey, O Shamir - 2019 - preparation

  Cited by 2 Related articles

[PDF] arxiv.org

Local Bures-Wasserstein Transport: A Practical and Fast Mapping Approximation

A Hoyos-Idrobo - arXiv preprint arXiv:1906.08227, 2019 - arxiv.org

Optimal transport (OT)-based methods have a wide range of applications and have attracted

a tremendous amount of attention in recent years. However, most of the computational

approaches of OT do not learn the underlying transport map. Although some algorithms …

  Related articles All 2 versions 

[CITATION] Local Bures-Wasserstein Transport: A Practical and Fast Mapping Approximation.

AH Idrobo - CoRR, 2019

2019


Improved concentration bounds for conditional value-at-risk and cumulative prospect theory using wasserstein distance

SP Bhat, LA Prashanth - 2019 - openreview.net

This paper presents a unified approach based on Wasserstein distance to derive

concentration bounds for empirical estimates for a broad class of risk measures. The results

cover two broad classes of risk measures which are defined in the paper. The classes of risk …

  Cited by 1 Related articles 


[PDF] Algorithms for Optimal Transport and Wasserstein Distances

J Schrieber - 2019 - d-nb.info

Optimal Transport and Wasserstein Distance are closely related terms that do not only have

a long history in the mathematical literature, but also have seen a resurgence in recent

years, particularly in the context of the many applications they are used in, which span a …

  Related articles All 2 versions 


Approximation and Wasserstein distance for self-similar measures on the unit interval

E Lichtenegger, R Niedzialomski - Journal of Mathematical Analysis and …, 2019 - Elsevier

We study the Wasserstein distance between self-similar measures associated to two non-

overlapping linear contractions of the unit interval. The main theorem gives an explicit

formula for the Wasserstein distance between iterations of certain discrete approximations of …

  Related articles All 2 versions


 

[PDF] arxiv.org

Distributionally Robust XVA via Wasserstein Distance: Wrong Way Counterparty Credit and Funding Risk

D Singh, S Zhang - arXiv preprint arXiv:1910.01781, 2019 - arxiv.org

This paper investigates calculations of robust XVA, in particular, credit valuation adjustment

(CVA) and funding valuation adjustment (FVA) for over-the-counter derivatives under

distributional uncertainty using Wasserstein distance as the ambiguity measure. Wrong way …

  Cited by 1 Related articles All 8 versions 



[PDF] arxiv.org

Finsler structure for variable exponent Wasserstein space and gradient flows

A Marcos, A Soglo - arXiv preprint arXiv:1912.12450, 2019 - arxiv.org

The variational approach requires the setting of new tools such as appropiate distance on the

probability space and an introduction of a Finsler metric in this space. The class of parabolic

equations is derived as the flow of a gradient with respect the Finsler structure. For q(x) q …

  Related articles All 2 versions 

<—-2019—–—2019 ——1870—  


Music Classification using Multiclass Support Vector Machine and Multilevel Wasserstein Means

J Wei, C Jin, Z Cheng, X Lv… - … on Computer and …, 2019 - ieeexplore.ieee.org

Music classification is a challenging task in music information retrieval. In this article, we

compare the performance of the two types of models. The first category is classified by

Support Vector Machine (SVM). We use the feature extraction from audio as the basis of …

  Related articles All 2 versions

 

2019  see 2020

Adapted Wasserstein Distances and Stability in Mathematical ...

https://arxiv.org › q-fin

by J Backhoff-Veraguas · 2019 · Cited by 20 — Quantitative Finance > Mathematical Finance. arXiv:1901.07450 (q-fin). [Submitted on 22 Jan 2019 (v1), last revised 14 May 2020 (this version, v3)] ...

[CITATION] Adapted wasserstein distances and stability in mathematical finance. arXiv e-prints, page

J Backhoff-Veraguas, D Bartl, M Beiglböck, M Eder - arXiv preprint arXiv:1901.07450, 2019

  Cited by 4 Related articles

Bridging the Gap Between $ f $-GANs and Wasserstein GANs

https://arxiv.org › cs

by J Song · 2019 · Cited by 8 — Wasserstein GANs enjoy superior empirical performance, but in f-GANs the discriminator can be interpreted as a density ratio estimator which is necessary in some GAN applications. In this paper, we bridge the gap between f-GANs and Wasserstein GANs (WGANs).

 [CITATION] Bridging the Gap Between f-GANs and Wasserstein GANs. arXiv e-prints, page

J Song, S Ermon - arXiv preprint arXiv:1910.09779, 2019

  Cited by 3 Related articles


Clustering measure-valued data with Wasserstein barycenters

https://arxiv.org › stat

by G Domazakis · 2019 — Such type of learning approaches are highly appreciated in many ... real world applications: (a) clustering eurozone countries according to their ...

[CITATION] Learning with Wasserstein barycenters and applications.

G Domazakis, D Drivaliaris, S Koukoulas… - CoRR, 2019


Convergence to equilibrium in Wasserstein distance for damped Euler equations with interaction forces

JA Carrillo, YP Choi, O Tse - Communications in Mathematical Physics, 2019 - Springer

We develop tools to construct Lyapunov functionals on the space of probability measures in

order to investigate the convergence to global equilibrium of a damped Euler system under

the influence of external and interaction potential forces with respect to the 2-Wasserstein  …

  Cited by 12 Related articles All 11 versions


[PDF] arxiv.org

Wasserstein stability of porous medium-type equations on manifolds with Ricci curvature bounded below

N De Ponti, M Muratori, C Orrieri - arXiv preprint arXiv:1908.03147, 2019 - arxiv.org

Given a complete, connected Riemannian manifold $\mathbb {M}^ n $ with Ricci curvature

bounded from below, we discuss the stability of the solutions of a porous medium-type

equation with respect to the 2-Wasserstein distance. We produce (sharp) stability estimates …

  Cited by 1 Related articles All 3 versions 

[PDF] arxiv.org

A Wasserstein norm for signed measures, with application to nonlocal transport equation with source term

B Piccoli, F Rossi, M Tournus - arXiv preprint arXiv:1910.05105, 2019 - arxiv.org

We introduce the optimal transportation interpretation of the Kantorovich norm on thespace

of signed Radon measures with finite mass, based on a generalized Wasserstein

distancefor measures with different masses. With the formulation and the new topological …

  Cited by 4 Related articles All 7 versions 


[PDF] arxiv.org

A nonlocal free boundary problem with Wasserstein distance

A Karakhanyan - arXiv preprint arXiv:1904.06270, 2019 - arxiv.org

We study the probability measures $\rho\in\mathcal{M}(\mathbb{R}^2)$ minimizing the

functional $J[\rho]=\iint\log\frac{1}{|x-y|}\,d\rho(x)\,d\rho(y)+d^2(\rho,\rho_0)$, where $\rho_0$

is a given probability measure and $d(\rho,\rho_0)$ is the 2-Wasserstein distance of …

  Related articles All 2 versions 


[PDF] lancs.ac.uk

Reproducing-Kernel Hilbert space regression with notes on the Wasserstein Distance

S Page - 2019 - eprints.lancs.ac.uk

We study kernel least-squares estimators for the regression problem subject to a norm

constraint. We bound the squared L2 error of our estimators with respect to the covariate

distribution. We also bound the worst-case squared L2 error of our estimators with respect to …

  Related articles All 5 versions 


On the Bures–Wasserstein distance between positive definite matrices

R Bhatia, T Jain, Y Lim - Expositiones Mathematicae, 2019 - Elsevier

The metric d(A, B) = [tr A + tr B − 2 tr(A^{1/2} B A^{1/2})^{1/2}]^{1/2} on the manifold of n×n

positive definite matrices arises in various optimisation problems, in quantum information

and in the theory of optimal transport. It is also related to Riemannian geometry. In the first …

  Cited by 96 Related articles All 6 versions
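
A direct numerical transcription of this metric (my own few lines, using scipy's matrix square root rather than anything from the paper) is:

    import numpy as np
    from scipy.linalg import sqrtm

    def bures_wasserstein_distance(A, B):
        # d(A, B) = [ tr A + tr B - 2 tr (A^{1/2} B A^{1/2})^{1/2} ]^{1/2}
        root_A = np.real(sqrtm(A))
        cross = np.real(sqrtm(root_A @ B @ root_A))
        gap = np.trace(A) + np.trace(B) - 2.0 * np.trace(cross)
        return float(np.sqrt(max(gap, 0.0)))   # clip tiny negative round-off

    print(bures_wasserstein_distance(np.diag([1.0, 2.0]), np.diag([2.0, 1.0])))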

<——2019—–—2019 ——1880— 



[PDF] arxiv.org

Approximate Bayesian computation with the Wasserstein distance

E Bernton, PE Jacob, M Gerber, CP Robert - arXiv preprint arXiv …, 2019 - arxiv.org

A growing number of generative statistical models do not permit the numerical evaluation of

their likelihood functions. Approximate Bayesian computation (ABC) has become a popular

approach to overcome this issue, in which one simulates synthetic data sets given …

  Cited by 44 Related articles All 12 versions 
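
A toy rejection-ABC sketch in the spirit of this approach, with the 1-D Wasserstein-1 distance between observed and simulated samples as the discrepancy, is given below; the prior, simulator and acceptance rule are illustrative assumptions of mine, not the algorithm of the paper.

    import numpy as np

    def abc_wasserstein(y_obs, prior_sampler, simulator, n_draws=2000, keep=100):
        # Rank parameter draws by the 1-D W1 distance between sorted samples.
        y_obs = np.sort(np.asarray(y_obs, float))
        draws, dists = [], []
        for _ in range(n_draws):
            theta = prior_sampler()
            y_sim = np.sort(simulator(theta, len(y_obs)))
            draws.append(theta)
            dists.append(float(np.mean(np.abs(y_sim - y_obs))))
        keep_idx = np.argsort(dists)[:keep]
        return [draws[i] for i in keep_idx]

    rng = np.random.default_rng(3)
    y = rng.normal(2.0, 1.0, 200)
    post = abc_wasserstein(y, lambda: rng.uniform(-5, 5), lambda t, n: rng.normal(t, 1.0, n))
    print(np.mean(post))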


[HTML] oup.com

On parameter estimation with the Wasserstein distance

E Bernton, PE Jacob, M Gerber… - … : A Journal of the IMA, 2019 - academic.oup.com

Statistical inference can be performed by minimizing, over the parameter space, the

Wasserstein distance between model distributions and the empirical distribution of the data.

We study asymptotic properties of such minimum Wasserstein distance estimators …

  Cited by 24 Related articles All 6 versions


[PDF] mlr.press

The wasserstein transform

F Memoli, Z Smith, Z Wan - International Conference on …, 2019 - proceedings.mlr.press

We introduce the Wasserstein transform, a method for enhancing and denoising datasets

defined on general metric spaces. The construction draws inspiration from Optimal

Transportation ideas. We establish the stability of our method under data perturbation and …

  Cited by 5 Related articles All 5 versions 


[PDF] arxiv.org

The Pontryagin maximum principle in the Wasserstein space

B Bonnet, F Rossi - Calculus of Variations and Partial Differential …, 2019 - Springer

Abstract We prove a Pontryagin Maximum Principle for optimal control problems in the

space of probability measures, where the dynamics is given by a transport equation with non-

local velocity. We formulate this first-order optimality condition using the formalism of …

  Cited by 24 Related articles All 20 versions


[PDF] arxiv.org

How Well Do WGANs Estimate the Wasserstein Metric?

A Mallasto, G Montúfar, A Gerolin - arXiv preprint arXiv:1910.03875, 2019 - arxiv.org

Generative modelling is often cast as minimizing a similarity measure between a data

distribution and a model distribution. Recently, a popular choice for the similarity measure

has been the Wasserstein metric, which can be expressed in the Kantorovich duality …

  Cited by 5 Related articles All 5 versions 


2019


[PDF] arxiv.org

Asymptotic guarantees for learning generative models with the sliced-wasserstein distance

K Nadjahi, A Durmus, U Şimşekli, R Badeau - arXiv preprint arXiv …, 2019 - arxiv.org

Minimum expected distance estimation (MEDE) algorithms have been widely used for

probabilistic models with intractable likelihood functions and they have become increasingly

popular due to their use in implicit generative modeling (eg Wasserstein generative …

  Cited by 20 Related articles All 5 versions 


[PDF] arxiv.org

Wgansing: A multi-voice singing voice synthesizer based on the wasserstein-gan

P Chandna, M Blaauw, J Bonada… - 2019 27th European …, 2019 - ieeexplore.ieee.org

We present a deep neural network based singing voice synthesizer, inspired by the Deep

Convolutions Generative Adversarial Networks (DCGAN) architecture and optimized using

the Wasserstein-GAN algorithm. We use vocoder parameters for acoustic modelling, to …

  Cited by 27 Related articles All 4 versions


[PDF] ucla.edu

On differentiability in the Wasserstein space and well-posedness for Hamilton–Jacobi equations

W Gangbo, A Tudorascu - Journal de Mathématiques Pures et Appliquées, 2019 - Elsevier

In this paper we elucidate the connection between various notions of differentiability in the

Wasserstein space: some have been introduced intrinsically (in the Wasserstein space, by

using typical objects from the theory of Optimal Transport) and used by various authors to …

  Cited by 35 Related articles All 4 versions



Interior-point methods strike back: Solving the wasserstein barycenter problem

D Ge, H Wang, Z Xiong, Y Ye - arXiv preprint arXiv:1905.12895, 2019 - arxiv.org

Computing the Wasserstein barycenter of a set of probability measures under the optimal

transport metric can quickly become prohibitive for traditional second-order algorithms, such

as interior-point methods, as the support size of the measures increases. In this paper, we …

  Cited by 11 Related articles All 3 versions 


[PDF] mlr.press

Understanding mcmc dynamics as flows on the wasserstein space

C Liu, J Zhuo, J Zhu - International Conference on Machine …, 2019 - proceedings.mlr.press

It is known that the Langevin dynamics used in MCMC is the gradient flow of the KL

divergence on the Wasserstein space, which helps convergence analysis and inspires

recent particle-based variational inference methods (ParVIs). But no more MCMC dynamics …

  Cited by 3 Related articles All 11 versions 

<——2019—–—2019 ——1890— 



2019

[PDF] arxiv.org

Statistical data analysis in the Wasserstein space

J Bigot - arXiv preprint arXiv:1907.08417, 2019 - arxiv.org

This paper is concerned by statistical inference problems from a data set whose elements

may be modeled as random probability measures such as multiple histograms or point

clouds. We propose to review recent contributions in statistics on the use of Wasserstein  …

  Cited by 3 Related articles All 2 versions 


[PDF] arxiv.org

Unsupervised adversarial domain adaptation based on the Wasserstein distance for acoustic scene classification

K Drossos, P Magron, T Virtanen - 2019 IEEE Workshop on …, 2019 - ieeexplore.ieee.org

A challenging problem in deep learning-based machine listening field is the degradation of

the performance when using data from unseen conditions. In this paper we focus on the

acoustic scene classification (ASC) task and propose an adversarial deep learning method …

  Cited by 15 Related articles All 5 versions


[PDF] sciencedirect.com

Harmonic mappings valued in the Wasserstein space

H Lavenant - Journal of Functional Analysis, 2019 - Elsevier

We propose a definition of the Dirichlet energy (which is roughly speaking the integral of the

square of the gradient) for mappings μ: Ω → (P(D), W_2) defined over a subset Ω of ℝ^p and

valued in the space P(D) of probability measures on a compact convex subset D of ℝ^q …

  Cited by 12 Related articles All 12 versions


[PDF] arxiv.org

A partial Laplacian as an infinitesimal generator on the Wasserstein space

YT Chow, W Gangbo - Journal of Differential Equations, 2019 - Elsevier

In this manuscript, we consider special linear operators which we term partial Laplacians on

the Wasserstein space, and which we show to be partial traces of the Wasserstein Hessian.

We verify a distinctive smoothing effect of the “heat flows” they generated for a particular …

  Cited by 13 Related articles All 9 versions


[PDF] arxiv.org

Second-Order Models for Optimal Transport and Cubic Splines on the Wasserstein Space

JD Benamou, TO Gallouët, FX Vialard - Foundations of Computational …, 2019 - Springer

On the space of probability densities, we extend the Wasserstein geodesics to the case of

higher-order interpolation such as cubic spline interpolation. After presenting the natural

extension of cubic splines to the Wasserstein space, we propose a simpler approach based …

  Cited by 9 Related articles All 5 versions


2019


[PDF] arxiv.org

Convergence rate in Wasserstein distance and semiclassical limit for the defocusing logarithmic Schrödinger equation

G Ferriere - arXiv preprint arXiv:1903.04309, 2019 - arxiv.org

We consider the dispersive logarithmic Schrödinger equation in a semi-classical scaling.

We extend the results about the large time behaviour of the solution (dispersion faster than

usual with an additional logarithmic factor, convergence of the rescaled modulus of the  …

  Cited by 6 Related articles All 4 versions 


A unified formulation for the Bures-Wasserstein and Log-Euclidean/Log-Hilbert-Schmidt distances between positive definite operators

HQ Minh - International Conference on Geometric Science of …, 2019 - Springer

This work presents a parametrized family of distances, namely the Alpha Procrustes

distances, on the set of symmetric, positive definite (SPD) matrices. The Alpha Procrustes

distances provide a unified formulation encompassing both the Bures-Wasserstein and Log …

  Cited by 5 Related articles All 2 versions
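
For readers who want the Bures-Wasserstein distance that the entry above unifies, here is a minimal Python sketch of the standard formula d(A,B)^2 = tr A + tr B - 2 tr((A^{1/2} B A^{1/2})^{1/2}), i.e. the 2-Wasserstein distance between centered Gaussians with covariances A and B; it is only an illustration, not the paper's Alpha Procrustes family, and the matrices are made up.

import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein(A: np.ndarray, B: np.ndarray) -> float:
    sA = sqrtm(A)
    cross = sqrtm(sA @ B @ sA)               # (A^{1/2} B A^{1/2})^{1/2}
    d2 = np.trace(A) + np.trace(B) - 2.0 * np.trace(cross.real)
    return float(np.sqrt(max(d2, 0.0)))      # clip tiny negative round-off

A = np.array([[2.0, 0.5], [0.5, 1.0]])
B = np.array([[1.0, 0.0], [0.0, 3.0]])
print(bures_wasserstein(A, B))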


[PDF] arxiv.org

On the total variation Wasserstein gradient flow and the TV-JKO scheme

G Carlier, C Poon - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

We study the JKO scheme for the total variation, characterize the optimizers, prove some of

their qualitative properties (in particular a form of maximum principle and in some cases, a

minimum principle as well). Finally, we establish a convergence result as the time step goes …

  Cited by 7 Related articles All 7 versions



Alpha Procrustes metrics between positive definite operators: a unifying formulation for the Bures-Wasserstein and Log-Euclidean/Log-Hilbert-Schmidt metrics

MH Quang - arXiv preprint arXiv:1908.09275, 2019 - arxiv.org

This work presents a parametrized family of distances, namely the Alpha Procrustes

distances, on the set of symmetric, positive definite (SPD) matrices. The Alpha Procrustes

distances provide a unified formulation encompassing both the Bures-Wasserstein and Log …

  Cited by 4 Related articles All 2 versions 


2019

[PDF] arxiv.org

The quadratic Wasserstein metric for inverse data matching

K Ren, Y Yang - arXiv preprint arXiv:1911.06911, 2019 - arxiv.org

This work characterizes, analytically and numerically, two major effects of the quadratic

Wasserstein ($ W_2 $) distance as the measure of data discrepancy in computational

solutions of inverse problems. First, we show, in the infinite-dimensional setup, that the  …

  Cited by 1 Related articles 

<——2019—–—2019 ——1900— 



[PDF] arxiv.org

Minimax confidence intervals for the sliced Wasserstein distance

T Manole, S Balakrishnan, L Wasserman - arXiv preprint arXiv:1909.07862, 2019 - arxiv.org

Motivated by the growing popularity of variants of the Wasserstein distance in statistics and

machine learning, we study statistical inference for the Sliced Wasserstein distance--an

easily computable variant of the Wasserstein distance. Specifically, we construct confidence …

  Cited by 3 Related articles All 4 versions 
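
The sliced Wasserstein distance studied in the entry above admits a simple Monte Carlo approximation: project both samples onto random directions and average the resulting one-dimensional Wasserstein distances. The sketch below is only that generic estimator (with arbitrary sample sizes and number of projections), not the confidence-interval construction of the paper.

import numpy as np
from scipy.stats import wasserstein_distance

def sliced_wasserstein(X, Y, n_projections=200, seed=0):
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_projections):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)        # random unit direction
        total += wasserstein_distance(X @ theta, Y @ theta)
    return total / n_projections

rng = np.random.default_rng(1)
X = rng.normal(0.0, 1.0, size=(500, 3))
Y = rng.normal(0.5, 1.0, size=(500, 3))
print(sliced_wasserstein(X, Y))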


[PDF] arxiv.org

Modeling the Biological Pathology Continuum with HSIC-regularized Wasserstein Auto-encoders

D Wu, H Kobayashi, C Ding, L Cheng… - arXiv preprint arXiv …, 2019 - arxiv.org

A crucial challenge in image-based modeling of biomedical data is to identify trends and

features that separate normality and pathology. In many cases, the morphology of the

imaged object exhibits continuous change as it deviates from normality, and thus a …

  Cited by 4 Related articles All 2 versions 


[PDF] sciencedirect.com

Misfit function for full waveform inversion based on the Wasserstein metric with dynamic formulation

P Yong, W Liao, J Huang, Z Li, Y Lin - Journal of Computational Physics, 2019 - Elsevier

Conventional full waveform inversion (FWI) using the least-squares distance (L2 norm) between

the observed and predicted seismograms suffers from local minima. Recently, the

Wasserstein metric (W1 metric) has been introduced to FWI to compute the misfit between …

  Cited by 1 Related articles All 2 versions


[PDF] arxiv.org

Bounds for the Wasserstein mean with applications to the Lie-Trotter mean

J Hwang, S Kim - Journal of Mathematical Analysis and Applications, 2019 - Elsevier

Since barycenters in the Wasserstein space of probability distributions have been

introduced, the Wasserstein metric and the Wasserstein mean of positive definite Hermitian

matrices have been recently developed. In this paper, we explore some properties of …

  Cited by 3 Related articles All 5 versions


[PDF] arxiv.org

Zero-Sum Differential Games on the Wasserstein Space

J Moon, T Basar - arXiv preprint arXiv:1912.06084, 2019 - arxiv.org

We consider two-player zero-sum differential games (ZSDGs), where the state process

(dynamical system) depends on the random initial condition and the state process's

distribution, and the objective functional includes the state process's distribution and the  …

  Cited by 1 Related articles All 2 versions 


2019


[PDF] archives-ouvertes.fr

Diffusive processes on the Wasserstein space: Coalescing models, Regularization properties and McKean-Vlasov equations

V Marx - 2019 - tel.archives-ouvertes.fr

The aim of this thesis is to study a class of diffusive stochastic processes with values in the

space of probability measures on the real line, called Wasserstein space if it is endowed

with the Wasserstein metric W2. The following issues are mainly addressed in this work: how …

  Cited by 2 Related articles All 9 versions 


[PDF] arxiv.org

The boundary method for semi-discrete optimal transport partitions and Wasserstein distance computation

L Dieci, JD Walsh III - Journal of Computational and Applied Mathematics, 2019 - Elsevier

We introduce a new technique, which we call the boundary method, for solving semi-

discrete optimal transport problems with a wide range of cost functions. The boundary

method reduces the effective dimension of the problem, thus improving complexity. For cost …

  Cited by 7 Related articles All 5 versions


Pushing the right boundaries matters! wasserstein adversarial training for label noise

BB Damodaran, K Fatras, S Lobry, R Flamary, D Tuia… - 2019 - hal.laas.fr

Noisy labels often occur in vision datasets, especially when they are issued from

crowdsourcing or Web scraping. In this paper, we propose a new regularization method

which enables one to learn robust classifiers in presence of noisy data. To achieve this goal …

  Cited by 3 Related articles All 4 versions 


Distributionally robust learning under the wasserstein metric

R Chen - 2019 - search.proquest.com

This dissertation develops a comprehensive statistical learning framework that is robust to

(distributional) perturbations in the data using Distributionally Robust Optimization (DRO)

under the Wasserstein metric. The learning problems that are studied include:(i) …

  Cited by 1 Related articles All 3 versions


Image Reflection Removal Using the Wasserstein Generative Adversarial Network

T Li, DPK Lun - … 2019-2019 IEEE International Conference on …, 2019 - ieeexplore.ieee.org

Imaging through a semi-transparent material such as glass often suffers from the reflection

problem, which degrades the image quality. Reflection removal is a challenging task since it

is severely ill-posed. Traditional methods, while all require long computation time on …

  Cited by 1 Related articles All 2 versions

<——2019—–—2019 ——1910—  



[PDF] arxiv.org

Busemann functions on the Wasserstein space

G Zhu, WL Li, X Cui - arXiv preprint arXiv:1905.05544, 2019 - arxiv.org

We study rays and co-rays in the Wasserstein space $ P_p (\mathcal {X}) $($ p> 1$) whose

ambient space $\mathcal {X} $ is a complete, separable, non-compact, locally compact

length space. We show that rays in the Wasserstein space can be represented as probability …

  Related articles All 2 versions 


Data-driven distributionally robust shortest path problem using the Wasserstein ambiguity set

Z Wang, K You, S Song, C Shang - 2019 IEEE 15th …, 2019 - ieeexplore.ieee.org

This paper proposes a data-driven distributionally robust shortest path (DRSP) model where

the distribution of the travel time is only observable through a finite training dataset. Our

DRSP model adopts the Wasserstein metric to construct the ambiguity set of probability …

  Cited by 1 Related articles


[PDF] arxiv.org

The Wasserstein Distances Between Pushed-Forward Measures with Applications to Uncertainty Quantification

A Sagiv - arXiv preprint arXiv:1902.05451, 2019 - arxiv.org

In the study of dynamical and physical systems, the input parameters are often uncertain or

randomly distributed according to a measure $\varrho $. The system's response $ f $ pushes

forward $\varrho $ to a new measure $ f\circ\varrho $ which we would like to study. However …

  Related articles All 3 versions 


2019

[PDF] nsf.gov

A Distributionally Robust Optimization Approach for Multivariate Linear Regression under the Wasserstein Metric

R Chen, IC Paschalidis - 2019 IEEE 58th Conference on …, 2019 - ieeexplore.ieee.org

We present a Distributionally Robust Optimization (DRO) approach for Multivariate Linear

Regression (MLR), where multiple correlated response variables are to be regressed

against a common set of predictors. We develop a regularized MLR formulation that is robust …

  Related articles All 3 versions


[PDF] arxiv.org

Local well-posedness in the Wasserstein space for a chemotaxis model coupled to Navier-Stokes equations

K Kang, HK Kim - arXiv preprint arXiv:1907.01895, 2019 - arxiv.org

We consider a coupled system of Keller-Segel type equations and the incompressible

Navier-Stokes equations in spatial dimension two and three. In the previous work [19], we

established the existence of a weak solution of a Fokker-Plank equation in the Wasserstein  …

  Related articles All 2 versions 


2019


[PDF] arxiv.org

1-Wasserstein Distance on the Standard Simplex

A Frohmader, H Volkmer - arXiv preprint arXiv:1912.04945, 2019 - arxiv.org

Wasserstein distances provide a metric on a space of probability measures. We consider the

space $\Omega $ of all probability measures on the finite set $\chi=\{1,\dots, n\} $ where $ n

$ is a positive integer. 1-Wasserstein distance, $ W_1 (\mu,\nu) $ is a function from …

  Cited by 1 Related articles All 2 versions 
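
On the finite set {1, ..., n} with ground distance |i - j|, the 1-Wasserstein distance between two probability vectors reduces to the sum of absolute differences of their cumulative sums; the short sketch below illustrates only this elementary formula, not code from the paper above.

import numpy as np

def w1_simplex(mu: np.ndarray, nu: np.ndarray) -> float:
    assert mu.shape == nu.shape
    return float(np.abs(np.cumsum(mu - nu)).sum())

mu = np.array([0.5, 0.5, 0.0, 0.0])
nu = np.array([0.0, 0.0, 0.5, 0.5])
print(w1_simplex(mu, nu))   # each half unit of mass moves 2 steps -> 2.0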


Approximation and Wasserstein distance for self-similar measures on the unit interval

E Lichtenegger, R Niedzialomski - Journal of Mathematical Analysis and …, 2019 - Elsevier

We study the Wasserstein distance between self-similar measures associated to two non-

overlapping linear contractions of the unit interval. The main theorem gives an explicit

formula for the Wasserstein distance between iterations of certain discrete approximations of …

  Related articles All 2 versions


[PDF] lancs.ac.uk

Reproducing-Kernel Hilbert space regression with notes on the Wasserstein Distance

S Page - 2019 - eprints.lancs.ac.uk

We study kernel least-squares estimators for the regression problem subject to a norm

constraint. We bound the squared L2 error of our estimators with respect to the covariate

distribution. We also bound the worst-case squared L2 error of our estimators with respect to …

  Related articles All 5 versions 


A Fenchel-Moreau-Rockafellar type theorem on the Kantorovich-Wasserstein space with applications in partially observable Markov decision processes

V Laschos, K Obermayer, Y Shen, W Stannat - Journal of Mathematical …, 2019 - Elsevier

By using the fact that the space of all probability measures with finite support can be

completed in two different fashions, one generating the Arens-Eells space and another

generating the Kantorovich-Wasserstein (Wasserstein-1) space, and by exploiting the  …

  Cited by 3 Related articles All 5 versions


Deconvolution for the Wasserstein distance

J Dedecker - smai.emath.fr

We consider the problem of estimating a probability measure on Rd from data observed with

an additive noise. We are interested in rates of convergence for the Wasserstein metric of

order p≥ 1. The distribution of the errors is assumed to be known and to belong to a class of …

  Related articles 

<——2019—–—2019 ——1920— 


  

Projection in the 2-Wasserstein sense on structured measure space

L Lebrat - 2019 - tel.archives-ouvertes.fr

This thesis focuses on the approximation for the 2-Wasserstein metric of probability

measures by structured measures. The set of structured measures under consideration is

made of consistent discretizations of measures carried by a smooth curve with a bounded …

  All 2 versions 


Bridging the Gap Between $ f $-GANs and Wasserstein GANs

https://arxiv.org › cs

by J Song · 2019 · Cited by 8 — Next, we minimize over a Lagrangian relaxation of the constrained objective, and show that it generalizes critic objectives of both f-GAN and WGAN. ... Based on this generalization, we propose a novel practical objective, named KL-Wasserstein GAN (KL-WGAN

[CITATION] Bridging the Gap Between f-GANs and Wasserstein GANs. arXiv e-prints, page

J Song, S Ermon - arXiv preprint arXiv:1910.09779, 2019

  Cited by 3 Related articles


Solving General Elliptical Mixture Models through an ...

https://www.researchgate.net › ... › Mixture Models

Download Citation | Solving General Elliptical Mixture Models through an Approximate Wasserstein Manifold | We address the estimation problem for general ...

[CITATION] A general solver to the elliptical mixture model through an approximate wasserstein manifold

S Li, Z Yu, M Xiang, D Mandic - arXiv preprint arXiv:1906.03700, 2019

  Cited by 1 Related articles


Conservative wasserstein training for pose estimation

X Liu, Y Zou, T Che, P Ding, P Jia… - Proceedings of the …, 2019 - openaccess.thecvf.com

This paper targets the task with discrete and periodic class labels (eg, pose/orientation

estimation) in the context of deep learning. The commonly used cross-entropy or regression

loss is not well matched to this problem as they ignore the periodic nature of the labels and …

  Cited by 20 Related articles All 8 versions 


[PDF] arxiv.org

Wasserstein dependency measure for representation learning

S Ozair, C Lynch, Y Bengio, A Oord, S Levine… - arXiv preprint arXiv …, 2019 - arxiv.org

Mutual information maximization has emerged as a powerful learning objective for

unsupervised representation learning obtaining state-of-the-art performance in applications

such as object recognition, speech recognition, and reinforcement learning. However, such …

  Cited by 29 Related articles All 5 versions 


2020


[PDF] thecvf.com

Sliced wasserstein discrepancy for unsupervised domain adaptation

CY Lee, T Batra, MH Baig… - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

In this work, we connect two distinct concepts for unsupervised domain adaptation: feature

distribution alignment between domains by utilizing the task-specific decision boundary and

the Wasserstein metric. Our proposed sliced Wasserstein discrepancy (SWD) is designed to …

  Cited by 120 Related articles All 7 versions 

[CITATION] Sliced wasserstein discrepancy for unsupervised domain adaptation. In 2019 IEEE

C Lee, T Batra, MH Baig, D Ulbricht - CVF Conference on Computer Vision and …, 2019

  Cited by 1



[PDF] thecvf.com

Unimodal-uniform constrained wasserstein training for medical diagnosis

X Liu, X Han, Y Qiao, Y Ge, S Li… - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

The labels in medical diagnosis task are usually discrete and successively distributed. For

example, the Diabetic Retinopathy Diagnosis (DR) involves five health risk levels: no DR (0),

mild DR (1), moderate DR (2), severe DR (3) and proliferative DR (4). This labeling system is …

  Cited by 15 Related articles All 7 versions 



[PDF] arxiv.org

Primal dual methods for Wasserstein gradient flows

JA Carrillo, K Craig, L Wang, C Wei - arXiv preprint arXiv:1901.08081, 2019 - arxiv.org

Combining the classical theory of optimal transport with modern operator splitting

techniques, we develop a new numerical method for nonlinear, nonlocal partial differential

equations, arising in models of porous media, materials science, and biological swarming …

  Cited by 19 Related articles All 3 versions 


[PDF] arxiv.org

Wasserstein covariance for multiple random densities

A Petersen, HG Müller - Biometrika, 2019 - academic.oup.com

A common feature of methods for analysing samples of probability density functions is that

they respect the geometry inherent to the space of densities. Once a metric is specified for

this space, the Fréchet mean is typically used to quantify and visualize the average density …

  Cited by 12 Related articles All 12 versions


[PDF] mlr.press

Wasserstein regularization for sparse multi-task regression

H Janati, M Cuturi, A Gramfort - The 22nd International …, 2019 - proceedings.mlr.press

We focus in this paper on high-dimensional regression problems where each regressor can

be associated to a location in a physical space, or more generally a generic geometric

space. Such problems often employ sparse priors, which promote models using a small …

  Cited by 29 Related articles All 8 versions 

<——2019—–—2019 ——1930— 



Wasserstein distance based domain adaptation for object detection

P Xu, P Gurram, G Whipps, R Chellappa - arXiv preprint arXiv:1909.08675, 2019 - arxiv.org

In this paper, we present an adversarial unsupervised domain adaptation framework for

object detection. Prior approaches utilize adversarial training based on cross entropy

between the source and target domain distributions to learn a shared feature mapping that …

  Cited by 6 Related articles All 2 versions 


Wasserstein generative learning with kinematic constraints for probabilistic interactive driving behavior prediction

H Ma, J Li, W Zhan, M Tomizuka - 2019 IEEE Intelligent …, 2019 - ieeexplore.ieee.org

Since prediction plays a significant role in enhancing the performance of decision making

and planning procedures, the requirement of advanced methods of prediction becomes

urgent. Although many literatures propose methods to make prediction on a single agent …

  Cited by 16 Related articles


2019  [PDF] arxiv.org

Wasserstein distance based deep adversarial transfer learning for intelligent fault diagnosis

C Cheng, B Zhou, G Ma, D Wu, Y Yuan - arXiv preprint arXiv:1903.06753, 2019 - arxiv.org

The demand of artificial intelligent adoption for condition-based maintenance strategy is

astonishingly increased over the past few years. Intelligent fault diagnosis is one critical

topic of maintenance solution for mechanical systems. Deep learning models, such as …

  Cited by 16 Related articles All 3 versions 


[PDF] aclweb.org

Modeling personalization in continuous space for response generation via augmented wasserstein autoencoders

Z Chan, J Li, X Yang, X Chen, W Hu, D Zhao… - Proceedings of the 2019 …, 2019 - aclweb.org

Abstract Variational autoencoders (VAEs) and Wasserstein autoencoders (WAEs) have

achieved noticeable progress in open-domain response generation. Through introducing

latent variables in continuous space, these models are capable of capturing utterance-level …

  Cited by 14 Related articles All 3 versions 


[PDF] ieee.org

A deep transfer model with wasserstein distance guided multi-adversarial networks for bearing fault diagnosis under different working conditions

M Zhang, D Wang, W Lu, J Yang, Z Li, B Liang - IEEE Access, 2019 - ieeexplore.ieee.org

In recent years, intelligent fault diagnosis technology with the deep learning algorithm has

been widely used in the manufacturing industry for substituting time-consuming human

analysis method to enhance the efficiency of fault diagnosis. The rolling bearing as the …

   Cited by 35 Related articles All 6 versions

2019


Data-driven Wasserstein distributionally robust optimization for biomass with agricultural waste-to-energy network design under uncertainty

C Ning, F You - Applied Energy, 2019 - Elsevier

This paper addresses the problem of biomass with agricultural waste-to-energy network

design under uncertainty. We propose a novel data-driven Wasserstein distributionally

robust optimization model for hedging against uncertainty in the optimal network design …

  Cited by 15 Related articles All 8 versions


[PDF] arxiv.org

2-wasserstein approximation via restricted convex potentials with application to improved training for gans

A Taghvaei, A Jalali - arXiv preprint arXiv:1902.07197, 2019 - arxiv.org

We provide a framework to approximate the 2-Wasserstein distance and the optimal

transport map, amenable to efficient training as well as statistical and geometric analysis.

With the quadratic cost and considering the Kantorovich dual form of the optimal …

  Cited by 9 Related articles All 3 versions 


[PDF] arxiv.org

Parameter estimation for biochemical reaction networks using Wasserstein distances

K Öcal, R Grima, G Sanguinetti - Journal of Physics A …, 2019 - iopscience.iop.org

We present a method for estimating parameters in stochastic models of biochemical reaction

networks by fitting steady-state distributions using Wasserstein distances. We simulate a

reaction network at different parameter settings and train a Gaussian process to learn the …

  Cited by 7 Related articles All 7 versions
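
The idea in the entry above, fitting model parameters by matching simulated and observed distributions under a Wasserstein distance, can be caricatured in a few lines; the Poisson "reaction network" and the grid search below are toy stand-ins for the paper's stochastic simulations and Gaussian-process surrogate.

import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
observed = rng.poisson(lam=7.0, size=2000)        # pretend these are molecule counts

def simulate(rate, n=2000):
    return rng.poisson(lam=rate, size=n)          # toy stochastic model

candidate_rates = np.linspace(1.0, 15.0, 57)
losses = [wasserstein_distance(observed, simulate(r)) for r in candidate_rates]
print("estimated rate:", candidate_rates[int(np.argmin(losses))])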


Hyperbolic Wasserstein distance for shape indexing

J Shi, Y Wang - IEEE transactions on pattern analysis and …, 2019 - ieeexplore.ieee.org

Shape space is an active research topic in computer vision and medical imaging fields. The

distance defined in a shape space may provide a simple and refined index to represent a

unique shape. This work studies the Wasserstein space and proposes a novel framework to …

  Cited by 5 Related articles All 7 versions


A virtual monochromatic imaging method for spectral CT based on Wasserstein generative adversarial network with a hybrid loss

Z Shi, J Li, H Li, Q Hu, Q Cao - IEEE Access, 2019 - ieeexplore.ieee.org

Spectral computed tomography (CT) has become a popular clinical diagnostic technique

because of its unique advantage in material distinction. Specifically, it can perform virtual

monochromatic imaging to obtain accurate tissue composition with less beam hardening …

  Cited by 8 Related articles All 2 versions

<——2019—–—2019 ——1940— 



(q, p)-Wasserstein GANs: Comparing Ground Metrics for Wasserstein GANs

A Mallasto, J Frellsen, W Boomsma… - arXiv preprint arXiv …, 2019 - arxiv.org

Generative Adversarial Networks (GANs) have made a major impact in computer vision and

machine learning as generative models. Wasserstein GANs (WGANs) brought Optimal

Transport (OT) theory into GANs, by minimizing the $1 $-Wasserstein distance between …

  Cited by 3 Related articles All 3 versions 
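
The Wasserstein-GAN objective that entries like the one above compare and extend has the critic maximize E[f(real)] - E[f(fake)] over (approximately) 1-Lipschitz functions f. The sketch below shows only that critic step in PyTorch, with the weight clipping of the original WGAN (gradient-penalty variants differ); the network, data and hyperparameters are placeholders.

import torch
import torch.nn as nn

critic = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.RMSprop(critic.parameters(), lr=5e-5)

real = torch.randn(256, 2) + 2.0          # stand-in real batch
fake = torch.randn(256, 2)                # stand-in generator output

for _ in range(5):                        # a few critic updates
    loss = -(critic(real).mean() - critic(fake).mean())   # negate to maximize
    opt.zero_grad()
    loss.backward()
    opt.step()
    with torch.no_grad():                 # weight clipping keeps the critic roughly Lipschitz
        for p in critic.parameters():
            p.clamp_(-0.01, 0.01)
print(float(critic(real).mean() - critic(fake).mean()))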


2019 [PDF] arxiv.org

Semi-supervised multitask learning on multispectral satellite images using wasserstein generative adversarial networks (gans) for predicting poverty

A Perez, S Ganguli, S Ermon, G Azzari, M Burke… - arXiv preprint arXiv …, 2019 - arxiv.org

Obtaining reliable data describing local poverty metrics at a granularity that is informative to

policy-makers requires expensive and logistically difficult surveys, particularly in the

developing world. Not surprisingly, the poverty stricken regions are also the ones which …

 Cited by 21 Related articles All 6 versions 


[PDF] researchgate.net

Wasserstein Subsequence Kernel for Time Series

C Bock, M Togninalli, E Ghisu… - … Conference on Data …, 2019 - ieeexplore.ieee.org

Kernel methods are a powerful approach for learning on structured data. However, as we

show in this paper, simple but common instances of the popular R-convolution kernel

framework can be meaningless when assessing the similarity of two time series through …

  Cited by 3 Related articles All 10 versions


[PDF] arxiv.org

Sufficient condition for rectifiability involving Wasserstein distance 

D Dąbrowski - arXiv preprint arXiv:1904.11004, 2019 - arxiv.org

A Radon measure $\mu $ is $ n $-rectifiable if it is absolutely continuous with respect to

$\mathcal {H}^ n $ and $\mu $-almost all of $\text {supp}\,\mu $ can be covered by Lipschitz

images of $\mathbb {R}^ n $. In this paper we give two sufficient conditions for rectifiability …

  Cited by 4 Related articles All 3 versions 


[PDF] arxiv.org

Wasserstein stability estimates for covariance-preconditioned Fokker-Planck equations

JA Carrillo, U Vaes - arXiv preprint arXiv:1910.07555, 2019 - arxiv.org

We study the convergence to equilibrium of the mean field PDE associated with the

derivative-free methodologies for solving inverse problems. We show stability estimates in

the euclidean Wasserstein distance for the mean field PDE by using optimal transport …

  Cited by 8 Related articles All 4 versions 


2019


[PDF] springer.com

Convergence to equilibrium in Wasserstein distance for damped Euler equations with interaction forces

JA Carrillo, YP Choi, O Tse - Communications in Mathematical Physics, 2019 - Springer

We develop tools to construct Lyapunov functionals on the space of probability measures in

order to investigate the convergence to global equilibrium of a damped Euler system under

the influence of external and interaction potential forces with respect to the 2-Wasserstein  …

  Cited by 13 Related articles All 11 versions


[PDF] thecvf.com

Joint wasserstein autoencoders for aligning multimodal embeddings

S Mahajan, T Botschen… - Proceedings of the …, 2019 - openaccess.thecvf.com

One of the key challenges in learning joint embeddings of multiple modalities, eg of images

and text, is to ensure coherent cross-modal semantics that generalize across datasets. We

propose to address this through joint Gaussian regularization of the latent representations …

  Cited by 2 Related articles All 6 versions 


[PDF] esaim-cocv.org

A Pontryagin Maximum Principle in Wasserstein spaces for constrained optimal control problems

B Bonnet - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

In this paper, we prove a Pontryagin Maximum Principle for constrained optimal control

problems in the Wasserstein space of probability measures. The dynamics is described by a

transport equation with non-local velocities which are affine in the control, and is subject to …

  Cited by 8 Related articles All 45 versions


[HTML] frontiersin.org

[HTML] Identifying imaging markers for predicting cognitive assessments using wasserstein distances based matrix regression

J Yan, C Deng, L Luo, X Wang, X Yao, L Shen… - Frontiers in …, 2019 - frontiersin.org

Alzheimer's disease (AD) is a severe type of neurodegeneration which worsens human

memory, thinking and cognition along a temporal continuum. How to identify the informative

phenotypic neuroimaging markers and accurately predict cognitive assessment are crucial …

  Cited by 2 Related articles All 11 versions 


2019

[PDF] biorxiv.org

De Novo Protein Design for Novel Folds using Guided Conditional Wasserstein Generative Adversarial Networks (gcWGAN)

M Karimi, S Zhu, Y Cao, Y Shen - bioRxiv, 2019 - biorxiv.org

Motivation Facing data quickly accumulating on protein sequence and structure, this study is

addressing the following question: to what extent could current data alone reveal deep

insights into the sequence-structure relationship, such that new sequences can be designed …

  Cited by 6 Related articles All 4 versions 

<——2019—–—2019 ——1950— 

<——2019—–—2019 ——1960—  



Wasserstein generative adversarial networks for motion artifact removal in dental CT imaging

C Jiang, Q Zhang, Y Ge, D Liang… - … 2019: Physics of …, 2019 - spiedigitallibrary.org

In dental computed tomography (CT) scanning, high-quality images are crucial for oral

disease diagnosis and treatment. However, many artifacts, such as metal artifacts,

downsampling artifacts and motion artifacts, can degrade the image quality in practice. The …

  Cited by 5 Related articles All 3 versions


CWGAN: Conditional wasserstein generative adversarial nets for fault data generation

Y Yu, B Tang, R Lin, S Han, T Tang… - 2019 IEEE International …, 2019 - ieeexplore.ieee.org

With the rapid development of modern industry and artificial intelligence technology, fault

diagnosis technology has become more automated and intelligent. The deep learning

based fault diagnosis model has achieved significant advantages over the traditional fault …

  Cited by 3 Related articles All 2 versions


[PDF] arxiv.org

Adaptive wasserstein hourglass for weakly supervised hand pose estimation from monocular RGB

Y Zhang, L Chen, Y Liu, J Yong, W Zheng - arXiv preprint arXiv …, 2019 - arxiv.org

Insufficient labeled training datasets is one of the bottlenecks of 3D hand pose estimation

from monocular RGB images. Synthetic datasets have a large number of images with

precise annotations, but the obvious difference with real-world datasets impacts the …

  Cited by 3 Related articles All 2 versions 


A semi-supervised wasserstein generative adversarial network for classifying driving fatigue from EEG signals

S Panwar, P Rad, J Quarles, E Golob… - … on Systems, Man and …, 2019 - ieeexplore.ieee.org

Predicting driver's cognitive states using deep learning from electroencephalography (EEG)

signals is considered in this paper. To address the challenge posed by limited labeled training

samples, a semi-supervised Wasserstein Generative Adversarial Network with gradient …

  Cited by 3 Related articles All 2 versions


[PDF] semanticscholar.org

Speech Enhancement for Noise-Robust Speech Synthesis Using Wasserstein GAN.

N Adiga, Y Pantazis, V Tsiaras, Y Stylianou - INTERSPEECH, 2019 - isca-speech.org

The quality of speech synthesis systems can be significantly deteriorated by the presence of

background noise in the recordings. Despite the existence of speech enhancement

techniques for effectively suppressing additive noise under low signal-to-noise (SNR) …

  Cited by 4 Related articles All 4 versions


2019


[PDF] arxiv.org

A First-Order Algorithmic Framework for Wasserstein Distributionally Robust Logistic Regression

J Li, S Huang, AMC So - arXiv preprint arXiv:1910.12778, 2019 - arxiv.org

Wasserstein distance-based distributionally robust optimization (DRO) has received much

attention lately due to its ability to provide a robustness interpretation of various learning

models. Moreover, many of the DRO problems that arise in the learning context admits exact …

  Cited by 1 Related articles All 7 versions 
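
A reformulation that Wasserstein-DRO papers such as the one above rely on is that, when only the features may be perturbed, the worst case over a Wasserstein ball of radius eps adds a dual-norm penalty eps*||w||_* to the empirical logistic loss. The sketch below just evaluates that regularized objective with the Euclidean ground metric; the data and eps are illustrative, and this is not the paper's first-order algorithm.

import numpy as np

def dro_logistic_objective(w, X, y, eps):
    # y in {-1, +1}; empirical logistic loss plus the Wasserstein-robustness penalty
    margins = y * (X @ w)
    return np.mean(np.log1p(np.exp(-margins))) + eps * np.linalg.norm(w)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = np.sign(X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=200))
print(dro_logistic_objective(np.array([0.5, -1.0, 0.2]), X, y, eps=0.1))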


[PDF] aaai.org

EWGAN: Entropy-based Wasserstein GAN for imbalanced learning

J Ren, Y Liu, J Liu - Proceedings of the AAAI Conference on Artificial …, 2019 - ojs.aaai.org

In this paper, we propose a novel oversampling strategy dubbed Entropy-based

Wasserstein Generative Adversarial Network (EWGAN) to generate data samples for

minority classes in imbalanced learning. First, we construct an entropy-weighted label vector …

  Cited by 1 Related articles All 7 versions 


[PDF] arxiv.org

Fused Gromov-Wasserstein Alignment for Hawkes Processes

D Luo, H Xu, L Carin - arXiv preprint arXiv:1910.02096, 2019 - arxiv.org

We propose a novel fused Gromov-Wasserstein alignment method to jointly learn the

Hawkes processes in different event spaces, and align their event types. Given two Hawkes

processes, we use fused Gromov-Wasserstein discrepancy to measure their dissimilarity …

  Cited by 2 Related articles All 2 versions 


[PDF] arxiv.org

Attainability property for a probabilistic target in Wasserstein spaces

G Cavagnari, A Marigonda - arXiv preprint arXiv:1904.10933, 2019 - arxiv.org

In this paper we establish an attainability result for the minimum time function of a control

problem in the space of probability measures endowed with Wasserstein distance. The

dynamics is provided by a suitable controlled continuity equation, where we impose a …

  Cited by 1 Related articles All 6 versions 


[PDF] arxiv.org

A Wasserstein norm for signed measures, with application to nonlocal transport equation with source term

B Piccoli, F Rossi, M Tournus - arXiv preprint arXiv:1910.05105, 2019 - arxiv.org

We introduce the optimal transportation interpretation of the Kantorovich norm on the space

of signed Radon measures with finite mass, based on a generalized Wasserstein

distance for measures with different masses. With the formulation and the new topological …

  Cited by 4 Related articles All 7 versions 

<——2019—–—2019 ——1970—  


Cross-domain Attention Network with Wasserstein Regularizers for E-commerce Search

M Qiu, B Wang, C Chen, X Zeng, J Huang… - Proceedings of the 28th …, 2019 - dl.acm.org

Product search and recommendation is a task that every e-commerce platform wants to

outperform their peers on. However, training a good search or recommendation model often

requires more data than what many platforms have. Fortunately, the search tasks on different …

  Related articles


Wasserstein Distances for Estimating Parameters in Stochastic Reaction Networks

K Öcal, R Grima, G Sanguinetti - International Conference on …, 2019 - Springer

Modern experimental methods such as flow cytometry and fluorescence in-situ hybridization

(FISH) allow the measurement of cell-by-cell molecule numbers for RNA, proteins and other

substances for large numbers of cells at a time, opening up new possibilities for the …

  Related articles All 3 versions


[PDF] arxiv.org

Wasserstein distances for evaluating cross-lingual embeddings

G Balikas, I Partalas - arXiv preprint arXiv:1910.11005, 2019 - arxiv.org

Word embeddings are high dimensional vector representations of words that capture their

semantic similarity in the vector space. There exist several algorithms for learning such

embeddings both for a single language as well as for several languages jointly. In this work …

  Related articles All 3 versions 


Adversarial Learning for Cross-Modal Retrieval with Wasserstein Distance

Q Cheng, Y Zhang, X Gu - International Conference on Neural Information …, 2019 - Springer

This paper presents a novel approach for cross-modal retrieval in an Adversarial Learning

with Wasserstein Distance (ALWD) manner, which aims at learning aligned representation

for various modalities in a GAN framework. The generator projects the image and the text …

  Related articles


[PDF] arxiv.org

A measure approximation theorem for Wasserstein-robust expected values

G van Zyl - arXiv preprint arXiv:1912.12119, 2019 - arxiv.org

We consider the problem of finding the infimum, over probability measures being in a ball

defined by Wasserstein distance, of the expected value of a bounded Lipschitz random

variable on $\mathbf {R}^ d $. We show that if the $\sigma-$ algebra is approximated in by a …

  Related articles All 2 versions 


2019


Artifact correction in low‐dose dental CT imaging using Wasserstein generative adversarial networks

Z Hu, C Jiang, F Sun, Q Zhang, Y Ge, Y Yang… - Medical …, 2019 - Wiley Online Library

Purpose In recent years, health risks concerning high‐dose x‐ray radiation have become a

major concern in dental computed tomography (CT) examinations. Therefore, adopting low‐

dose computed tomography (LDCT) technology has become a major focus in the CT …

  Cited by 35 Related articles All 5 versions


Improved Procedures for Training Primal Wasserstein GANs

T Zhang, Z Li, Q Zhu, D Zhang - 2019 IEEE SmartWorld …, 2019 - ieeexplore.ieee.org

Primal Wasserstein GANs are a variant of Generative Adversarial Networks (ie, GANs),

which optimize the primal form of empirical Wasserstein distance directly. However, the high

computational complexity and training instability are the main challenges of this framework …

  Cited by 1 Related articles


Unsupervised feature extraction based on improved Wasserstein generative adversarial network for hyperspectral classification

Q Sun, S Bourennane - Multimodal Sensing: Technologies …, 2019 - spiedigitallibrary.org

Accurate classification is one of the most important prerequisites for hyperspectral

applications and feature extraction is the key step of classification. Recently, deep learning

models have been successfully used to extract the spectral-spatial features in hyperspectral …

  Related articles All 4 versions


PWGAN: wasserstein GANs with perceptual loss for mode collapse

X Wu, C Shi, X Li, J He, X Wu, J Lv, J Zhou - Proceedings of the ACM …, 2019 - dl.acm.org

Generative adversarial network (GAN) plays an important part in image generation. It has

great achievements trained on large scene data sets. However, for small scene data sets,

we find that most of methods may lead to a mode collapse, which may repeatedly generate …

  Related articles


Training Wasserstein GANs for Estimating Depth Maps

AT Arslan, E Seke - 2019 3rd International Symposium on …, 2019 - ieeexplore.ieee.org

Depth maps depict pixel-wise depth association with a 2D digital image. Point clouds

generation and 3D surface reconstruction can be conducted by processing a depth map.

Estimating a corresponding depth map from a given input image is an important and difficult …

  Related articles

<——2019—–—2019 ——1980—  



[PDF] arxiv.org

Statistical inference for Bures-Wasserstein barycenters

A Kroshnin, V Spokoiny, A Suvorikova - arXiv preprint arXiv:1901.00226, 2019 - arxiv.org

In this work we introduce the concept of Bures-Wasserstein barycenter $ Q_* $, that is

essentially a Fréchet mean of some distribution $\mathbb {P} $ supported on a subspace of

positive semi-definite Hermitian operators $\mathbb {H} _ {+}(d) $. We allow a barycenter to …

  Cited by 16 Related articles All 3 versions 


[PDF] researchgate.net

Wasserstein metric based distributionally robust approximate framework for unit commitment

R Zhu, H Wei, X Bai - IEEE Transactions on Power Systems, 2019 - ieeexplore.ieee.org

This paper proposed a Wasserstein metric-based distributionally robust approximate

framework (WDRA), for unit commitment problem to manage the risk from uncertain wind

power forecasted errors. The ambiguity set employed in the distributionally robust …

 Cited by 57 Related articles All 2 versions

[PDF] arxiv.org

Riemannian normalizing flow on variational wasserstein autoencoder for text modeling

PZ Wang, WY Wang - arXiv preprint arXiv:1904.02399, 2019 - arxiv.org

Recurrent Variational Autoencoder has been widely used for language modeling and text

generation tasks. These models often face a difficult optimization problem, also known as

the Kullback-Leibler (KL) term vanishing issue, where the posterior easily collapses to the …

  Cited by 15 Related articles All 5 versions 

Riemannian Normalizing Flow on Variational Wasserstein Autoencoder for Text Modeling

P Zizhuang Wang, WY Wang - arXiv e-prints, 2019 - ui.adsabs.harvard.edu

Abstract Recurrent Variational Autoencoder has been widely used for language modeling

and text generation tasks. These models often face a difficult optimization problem, also

known as the Kullback-Leibler (KL) term vanishing issue, where the posterior easily … 

  Cited by 18 Related articles All 6 versions 

[PDF] Wasserstein distance: a flexible tool for statistical analysis

GVVLV Lucarini - 2019 - researchgate.net

The figure shows the Wasserstein distance calculated in the phase space composed by

globally averaged temperature and precipitation. To provide some sort of benchmark, at the

bottom of the figure is shown the value related to the NCEP reanalysis, which yields one of …

  Related articles All 4 versions 


2019 master

Confronto di funzioni oggetto per l'inversione di dati sismici e studio delle potenzialità della Metrica di Wasserstein [Comparison of objective functions for seismic data inversion and a study of the potential of the Wasserstein metric]

L STRACCA - 2019 - etd.adm.unipi.it

An inverse problem aims at determining or estimating the unknown parameters of a

model, given the data it generates and the forward-modelling operator that

describes the relationship between a generic model and the corresponding predicted data. In any …

2019

[PDF] bayesiandeeplearning.org

[PDF] Nested-Wasserstein Distance for Sequence Generation

R Zhang, C Chen, Z Gan, Z Wen, W Wang, L Carin - bayesiandeeplearning.org

Reinforcement learning (RL) has been widely studied for improving sequence generation

models. However, the conventional rewards used for RL training typically cannot capture

sufficient semantic information and therefore render model bias. Further, the sparse and …

  Related articles 


[PDF] uchile.cl

[PDF] WASSERSTEIN-BASED DISTANCE FOR TIME SERIES ANALYSIS

E CAZELLES, A ROBERT, F TOBAR - cmm.uchile.cl

For a stationary continuous-time time series x(t), the Power Spectral Density is given by S(ξ) = lim_{T→∞} …

  Related articles 


[PDF] dpi-proceedings.com

Isomorphic Wasserstein Generative Adversarial Network for Numeric Data Augmentation

W Wei, W Chuang, LI Yue - DEStech Transactions on …, 2019 - dpi-proceedings.com

GAN-based schemes are one of the most popular methods designed for image generation.

Some recent studies have suggested using GAN for numeric data augmentation that is to

generate data for completing the imbalanced numeric data. Compared to the conventional …

  Related articles All 2 versions 


[PDF] colostate.edu

[PDF] Morse Theory for Wasserstein Spaces

J Mirth - math.colostate.edu

Applied topology uses simplicial complexes to approximate a manifold based on data. This

approximation is known not to always recover the homotopy type of the manifold. In this work-

in-progress we investigate how to compute the homotopy type in such settings using …

  Related articles All 2 versions 


<——2019—–—2019 ——1990— 


Statistical inference for Bures-Wasserstein barycenters

https://arxiv.org › math

by A Kroshnin · 2019 · Cited by 16 — Mathematics > Statistics Theory. arXiv:1901.00226 (math). [Submitted on 2 Jan 2019 (v1), last revised 11 Feb 2019 (this version, v2)] ...

[CITATION] Statistical inference for Bures-Wasserstein barycenters

A Kroshnin, V Spokoiny, A Suvorikova - arXiv preprint arXiv:1901.00226, 2019

  Cited by 2 Related articles

 PHom-GeM: Persistent Homology for Generative ... - ORBi lu

https://orbilu.uni.lu › SDS_PHomGeM(2)

PDF

by JHJ Charlier · 2019 · Cited by 3 — …Generative Adversarial Network (GAN) and Auto-Encoders (AE), are among the most ... Our experiments underline the potential of persistent homology for Wasserstein.

[CITATION] PHom-WAE: Persistent Homology for Wasserstein Auto-Encoders.

J Charlier, F Petit, G Ormazabal, Radu State, J Hilger - CoRR, 2019


[PDF] Concentration of risk measures: A Wasserstein distance approach

SP Bhat, P LA - Advances in Neural Information Processing Systems, 2019 - papers.nips.cc

Abstract: Known finite-sample concentration bounds for the Wasserstein distance

between the empirical and true distribution of a random variable are used to derive a two-

sided concentration bound for the error between the true conditional value-at-risk (CVaR) of …

  Cited by 14 Related articles All 4 versions 


2019

[PDF] arxiv.org

On the computational complexity of finding sparse Wasserstein barycenter

S Borgwardt, S Patterson - arXiv preprint arXiv:1910.07568, 2019 - arxiv.org

The discrete Wasserstein barycenter problem is a minimum-cost mass transport problem for a

set of probability measures with finite support. In this paper, we show that finding a

barycenter of sparse support is hard, even in dimension 2 and for only 3 measures. We …

  Cited by 11 Related articles All 2 versions 


[PDF] arxiv.org

Precise simulation of electromagnetic calorimeter showers using a Wasserstein Generative Adversarial Network

M Erdmann, J Glombitza, T Quast - Computing and Software for Big …, 2019 - Springer

Simulations of particle showers in calorimeters are computationally time-consuming, as they

have to reproduce both energy depositions and their considerable fluctuations. A new

approach to ultra-fast simulations is generative models where all calorimeter energy …

  Cited by 46 Related articles All 6 versions


2019


[PDF] arxiv.org

Wgansing: A multi-voice singing voice synthesizer based on the wasserstein-gan

P Chandna, M Blaauw, J Bonada… - 2019 27th European …, 2019 - ieeexplore.ieee.org

We present a deep neural network based singing voice synthesizer, inspired by the Deep

Convolutions Generative Adversarial Networks (DCGAN) architecture and optimized using

the Wasserstein-GAN algorithm. We use vocoder parameters for acoustic modelling, to …

  Cited by 28 Related articles All 4 versions


[PDF] arxiv.org

A bound on the Wasserstein-2 distance between linear combinations of independent random variables

B Arras, E Azmoodeh, G Poly, Y Swan - Stochastic processes and their …, 2019 - Elsevier

We provide a bound on the distance between finitely supported elements and general

elements of the unit sphere of ℓ²(N). We use this bound to estimate the Wasserstein-2

distance between random variables represented by linear combinations of independent …

  Cited by 20 Related articles All 15 versions


[PDF] arxiv.org

Denoising of 3D magnetic resonance images using residual encoder–decoder Wasserstein generative adversarial network

M Ran, J Hu, Y Chen, H Chen, H Sun, J Zhou… - Medical image …, 2019 - Elsevier

Abstract Structure-preserved denoising of 3D magnetic resonance imaging (MRI) images is

a critical step in medical image analysis. Over the past few years, many algorithms with

impressive performances have been proposed. In this paper, inspired by the idea of deep …

  Cited by 32 Related articles All 9 versions


[PDF] ieee.org

A deep transfer model with wasserstein distance guided multi-adversarial networks for bearing fault diagnosis under different working conditions

M Zhang, D Wang, W Lu, J Yang, Z Li, B Liang - IEEE Access, 2019 - ieeexplore.ieee.org

In recent years, intelligent fault diagnosis technology with the deep learning algorithm has

been widely used in the manufacturing industry for substituting time-consuming human

analysis method to enhance the efficiency of fault diagnosis. The rolling bearing as the …

  Cited by 27 Related articles All 5 versions


[PDF] arxiv.org

Parisi's formula is a Hamilton-Jacobi equation in Wasserstein space

JC Mourrat - arXiv preprint arXiv:1906.08471, 2019 - arxiv.org

Parisi's formula is a self-contained description of the infinite-volume limit of the free energy of

mean-field spin glass models. We show that this quantity can be recast as the solution of a

Hamilton-Jacobi equation in the Wasserstein space of probability measures on the positive …

  Cited by 7 Related articles All 3 versions 

<——2019—–—2019 ——2000— 



A partial Laplacian as an infinitesimal generator on the Wasserstein space

YT Chow, W Gangbo - Journal of Differential Equations, 2019 - Elsevier

In this manuscript, we consider special linear operators which we term partial Laplacians on

the Wasserstein space, and which we show to be partial traces of the Wasserstein Hessian.

We verify a distinctive smoothing effect of the “heat flows” they generated for a particular …

  Cited by 13 Related articles All 9 versions


[PDF] mlr.press

A gradual, semi-discrete approach to generative network training via explicit wasserstein minimization

Y Chen, M Telgarsky, C Zhang… - International …, 2019 - proceedings.mlr.press

This paper provides a simple procedure to fit generative networks to target distributions, with

the goal of small Wasserstein distance (or other optimal transport costs). The approach is

based on two principles:(a) if the source randomness of the network is continuous …

  Cited by 4 Related articles All 10 versions 


A Wasserstein distance-based analogous method to predict distribution of non-uniform corrosion on reinforcements in concrete

Q Liu, RKL Su - Construction and Building Materials, 2019 - Elsevier

This paper presents an analogous method to predict the distribution of non-uniform

corrosion on reinforcements in concrete by minimizing the Wasserstein distance. A

comparison between the predicted and experimental results shows that the proposed …

  Cited by 6 Related articles All 3 versions


A virtual monochromatic imaging method for spectral CT based on Wasserstein generative adversarial network with a hybrid loss

Z Shi, J Li, H Li, Q Hu, Q Cao - IEEE Access, 2019 - ieeexplore.ieee.org

Spectral computed tomography (CT) has become a popular clinical diagnostic technique

because of its unique advantage in material distinction. Specifically, it can perform virtual

monochromatic imaging to obtain accurate tissue composition with less beam hardening …

  Cited by 8 Related articles All 2 versions


Calculating spatial configurational entropy of landscape mosaic based on the Wasserstein metric

Y Zhao, X Zhang - Landscape Ecology, 2019 - Springer

Context Entropy is an important concept traditionally associated with thermodynamics and is

widely used to describe the degree of disorder in a substance, system, or process.

Configurational entropy has received more attention because it better reflects the …

  Cited by 4 Related articles All 5 versions


2019


[PDF] researchgate.net

A Wasserstein Subsequence Kernel for Time Series

C Bock, M Togninalli, E Ghisu… - … Conference on Data …, 2019 - ieeexplore.ieee.org

Kernel methods are a powerful approach for learning on structured data. However, as we

show in this paper, simple but common instances of the popular R-convolution kernel

framework can be meaningless when assessing the similarity of two time series through …

  Cited by 3 Related articles All 10 versions


[PDF] inria.fr

On a Wasserstein-type distance between solutions to stochastic differential equations

J Bion–Nadal, D Talay - The Annals of Applied Probability, 2019 - projecteuclid.org

In this paper, we introduce a Wasserstein-type distance on the set of the probability

distributions of strong solutions to stochastic differential equations. This new distance is

defined by restricting the set of possible coupling measures. We prove that it may also be …

  Cited by 11 Related articles All 9 versions


Generating EEG signals of an RSVP experiment by class conditioned wasserstein generative adversarial network

S Panwar, P Rad, J Quarles… - 2019 IEEE International …, 2019 - ieeexplore.ieee.org

Electroencephalography (EEG) data is difficult to obtain due to complex experimental setups

and reduced comfort due to prolonged wearing. This poses challenges to train powerful

deep learning model due to the limited EEG data. Hence, being able to generate EEG data …

  Cited by 5 Related articles All 2 versions


[PDF] esaim-cocv.org

A Pontryagin Maximum Principle in Wasserstein spaces for constrained optimal control problems

B Bonnet - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

In this paper, we prove a Pontryagin Maximum Principle for constrained optimal control

problems in the Wasserstein space of probability measures. The dynamics is described by a

transport equation with non-local velocities which are affine in the control, and is subject to …

  Cited by 8 Related articles All 45 versions


A unified formulation for the Bures-Wasserstein and Log-Euclidean/Log-Hilbert-Schmidt distances between positive definite operators

HQ Minh - International Conference on Geometric Science of …, 2019 - Springer

This work presents a parametrized family of distances, namely the Alpha Procrustes

distances, on the set of symmetric, positive definite (SPD) matrices. The Alpha Procrustes

distances provide a unified formulation encompassing both the Bures-Wasserstein and Log …

  Cited by 5 Related articles All 2 versions

<——2019—–—2019 ——2010— 


Alpha Procrustes metrics between positive definite operators: a unifying formulation for the Bures-Wasserstein and Log-Euclidean/Log-Hilbert-Schmidt metrics

MH Quang - arXiv preprint arXiv:1908.09275, 2019 - arxiv.org

This work presents a parametrized family of distances, namely the Alpha Procrustes

distances, on the set of symmetric, positive definite (SPD) matrices. The Alpha Procrustes

distances provide a unified formulation encompassing both the Bures-Wasserstein and Log …

  Cited by 4 Related articles All 2 versions 


[PDF] researchgate.net

Least-squares reverse time migration via linearized waveform inversion using a Wasserstein metric

P Yong, J Huang, Z Li, W Liao, L Qu - Geophysics, 2019 - library.seg.org

Least-squares reverse time migration (LSRTM), an effective tool for imaging the structures of

the earth from seismograms, can be characterized as a linearized waveform inversion

problem. We have investigated the performance of three minimization functionals as the L 2 …

  Cited by 8 Related articles All 5 versions

[CITATION] Least-squares reverse time migration via linearized waveform inversion using a Wasserstein metric: Wasserstein metric for LSRTM

P Yong, J Huang, Z Li, W Liao, L Qu - Geophysics, 2019

Cited by 8 Related articles All 5 versions

[PDF] arxiv.org

Deep Distributional Sequence Embeddings Based on a Wasserstein Loss

Abdelwahab, N Landwehr - arXiv preprint arXiv:1912.01933, 2019 - arxiv.org

Deep metric learning employs deep neural networks to embed instances into a metric space

such that distances between instances of the same class are small and distances between

instances from different classes are large. In most existing deep metric learning techniques …

  Cited by 1 Related articles All 2 versions 


[PDF] arxiv.org

A two-phase two-fluxes degenerate Cahn–Hilliard model as a constrained Wasserstein gradient flow

C Cancès, D Matthes, F Nabet - Archive for Rational Mechanics and …, 2019 - Springer

We study a non-local version of the Cahn–Hilliard dynamics for phase separation in a two-

component incompressible and immiscible mixture with linear mobilities. Differently to the

celebrated local model with nonlinear mobility, it is only assumed that the divergences of the …

  Cited by 8 Related articles All 17 versions


[PDF] arxiv.org

A Wasserstein Inequality and Minimal Green Energy on Compact Manifolds

S Steinerberger - arXiv preprint arXiv:1907.09023, 2019 - arxiv.org

Let $M$ be a smooth, compact $d$-dimensional manifold, $d\geq 3$, without boundary

and let $G: M\times M\rightarrow\mathbb{R}\cup\{\infty\}$ denote the Green's

function of the Laplacian $-\Delta$ (normalized to have mean value 0). We prove a bound …

  Cited by 2 Related articles All 2 versions 


2019


A semi-supervised wasserstein generative adversarial network for classifying driving fatigue from EEG signals

S PanwarP RadJ Quarles, E Golob… - … on Systems, Man and …, 2019 - ieeexplore.ieee.org

Predicting driver's cognitive states using deep learning from electroencephalography (EEG)

signals is considered in this paper. To address the challenge posed by limited labeled training …

Cited by 4 Related articles All 2 versions


[PDF] arxiv.org

A convergent Lagrangian discretization for p-Wasserstein and flux-limited diffusion equations

B Söllner, O Junge - arXiv preprint arXiv:1906.01321, 2019 - arxiv.org

We study a Lagrangian numerical scheme for the solution of a nonlinear drift diffusion equation

of the form $\partial_t u = \partial_x(u\cdot c[\partial_x(h^\prime(u)+v)])$ on an interval.

This scheme will consist of a spatio-temporal discretization founded in the formulation of the …

  Cited by 2 Related articles All 5 versions 

[CITATION] A convergent Lagrangian discretization for p-Wasserstein and flux-limited diffusion equations

O Junge, B Söllner - arXiv preprint arXiv:1906.01321, 2019



[PDF] arxiv.org

First-Order Algorithmic Framework for Wasserstein Distributionally Robust Logistic Regression

J Li, S Huang, AMC So - arXiv preprint arXiv:1910.12778, 2019 - arxiv.org

Wasserstein distance-based distributionally robust optimization (DRO) has received much

attention lately due to its ability to provide robustness interpretation of various learning

models. Moreover, many of the DRO problems that arise in the learning context admit exact …

  Cited by 1 Related articles All 7 versions 


[PDF] arxiv.org

Attainability property for a probabilistic target in Wasserstein spaces

G Cavagnari, A Marigonda - arXiv preprint arXiv:1904.10933, 2019 - arxiv.org

In this paper we establish an attainability result for the minimum time function of a control

problem in the space of probability measures endowed with Wasserstein distance. The

dynamics is provided by a suitable controlled continuity equation, where we impose a …

  Cited by 1 Related articles All 6 versions 


[PDF] arxiv.org

A Wasserstein norm for signed measures, with application to nonlocal transport equation with source term

B Piccoli, F Rossi, M Tournus - arXiv preprint arXiv:1910.05105, 2019 - arxiv.org

We introduce the optimal transportation interpretation of the Kantorovich norm on the space

of signed Radon measures with finite mass, based on a generalized Wasserstein

distance for measures with different masses. With the formulation and the new topological …

  Cited by 4 Related articles All 7 versions 

<——2019—–—2019 ——2020—  



[PDF] RaspBary: Hawkes Point Process Wasserstein Barycenters as a Service

R Hosler, X Liu, J Carter, M Saper - 2019 - researchgate.net

We introduce an API for forecasting the intensity of spacetime events in urban environments

and spatially allocating vehicles during times of peak demand to minimize response time.

Our service is applicable to dynamic resource allocation problems that arise in ride sharing …

  Cited by 2 Related articles 


[PDF] arxiv.org

Stylized Text Generation Using Wasserstein Autoencoders with Mixture of Gaussian Prior

Ghabussi, L Mou, O Vechtomova - arXiv preprint arXiv:1911.03828, 2019 - arxiv.org

Wasserstein autoencoders are effective for text generation. They do not however provide

any control over the style and topic of the generated sentences if the dataset has multiple

classes and includes different topics. In this work, we present a semi-supervised approach …

  Related articles All 2 versions 


[PDF] arxiv.org

A measure approximation theorem for Wasserstein-robust expected values

G van Zyl - arXiv preprint arXiv:1912.12119, 2019 - arxiv.org

We consider the problem of finding the infimum, over probability measures being in a ball

defined by the Wasserstein distance, of the expected value of a bounded Lipschitz random

variable on $\mathbf{R}^d$. We show that if the $\sigma$-algebra is approximated in by …

  Related articles All 2 versions 


[PDF] arxiv.org

A nonlocal free boundary problem with Wasserstein distance

Karakhanyan - arXiv preprint arXiv:1904.06270, 2019 - arxiv.org

We study the probability measures $\rho\in\mathcal M(\mathbb R^2)$ minimizing the

functional \[J[\rho]=\iint\log\frac{1}{|x-y|}\,d\rho(x)\,d\rho(y)+d^2(\rho,\rho_0),\] where $\rho_0$

is a given probability measure and $d(\rho,\rho_0)$ is the 2-Wasserstein distance of …

  Related articles All 2 versions 


[PDF] arxiv.org

Local Bures-Wasserstein Transport: Practical and Fast Mapping Approximation

Hoyos-Idrobo - arXiv preprint arXiv:1906.08227, 2019 - arxiv.org

Optimal transport (OT)-based methods have a wide range of applications and have attracted

a tremendous amount of attention in recent years. However, most of the computational

approaches of OT do not learn the underlying transport map. Although some algorithms …

  Related articles All 2 versions 

[CITATION] Local Bures-Wasserstein Transport: Practical and Fast Mapping Approximation.

AH Idrobo - CoRR, 2019


2019


[PDF] nsf.gov

Distributionally Robust Optimization Approach for Multivariate Linear Regression under the Wasserstein Metric

R Chen, IC Paschalidis - 2019 IEEE 58th Conference on …, 2019 - ieeexplore.ieee.org

We present a Distributionally Robust Optimization (DRO) approach for Multivariate Linear

Regression (MLR), where multiple correlated response variables are to be regressed

against a common set of predictors. We develop a regularized MLR formulation that is robust …

  Related articles All 3 versions


[PDF] arxiv.org

Local well-posedness in the Wasserstein space for chemotaxis model coupled to Navier-Stokes equations

K Kang, HK Kim - arXiv preprint arXiv:1907.01895, 2019 - arxiv.org

We consider a coupled system of Keller-Segel type equations and the incompressible

Navier-Stokes equations in spatial dimension two and three. In the previous work [19], we

established the existence of a weak solution of a Fokker-Planck equation in the Wasserstein …

  Related articles All 2 versions 


[PDF] rit.edu

Comparative Assessment of the Impact of Various Norms on Wasserstein Generative Adversarial Networks

C Ramesh - 2019 - scholarworks.rit.edu

Abstract Generative Adversarial Networks (GANs) provide a fascinating new paradigm in

machine learning and artificial intelligence, especially in the context of unsupervised

learning. GANs are quickly becoming a state-of-the-art tool, used in various applications …

  Related articles All 2 versions 
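
For orientation on the objective these GAN entries share, the Wasserstein critic maximizes the Kantorovich-Rubinstein dual E[D(real)] - E[D(fake)] over (approximately) 1-Lipschitz critics. Below is a minimal PyTorch sketch of a single critic update using the original weight-clipping variant; the architecture, batch data and hyperparameters are placeholders, not the setup studied in the thesis above.

import torch
import torch.nn as nn

critic = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.RMSprop(critic.parameters(), lr=5e-5)

def critic_step(real, fake, clip=0.01):
    # maximize E[D(real)] - E[D(fake)]  <=>  minimize its negative
    loss = -(critic(real).mean() - critic(fake).mean())
    opt.zero_grad()
    loss.backward()
    opt.step()
    for p in critic.parameters():      # crude Lipschitz control via weight clipping
        p.data.clamp_(-clip, clip)
    return loss.item()

real = torch.randn(128, 2) + 2.0       # placeholder "data" batch
fake = torch.randn(128, 2)             # placeholder generator output
print(critic_step(real, fake))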


[PDF] wiley.com

degenerate Cahn‐Hilliard model as constrained Wasserstein gradient flow

D Matthes, C Cancès, F Nabet - PAMM, 2019 - Wiley Online Library

Existence of solutions to a non-local Cahn-Hilliard model with degenerate mobility is

considered. The PDE is written as a gradient flow with respect to the L2-Wasserstein metric

for two components that are coupled by an incompressibility constraint. Approximating …

  Related articles


[PDF] arxiv.org

Fenchel-Moreau-Rockafellar type theorem on the Kantorovich-Wasserstein space with applications in partially observable Markov decision processes

V Laschos, K Obermayer, Y Shen, W Stannat - Journal of Mathematical …, 2019 - Elsevier

By using the fact that the space of all probability measures with finite support can be

completed in two different fashions, one generating the Arens-Eells space and another

generating the Kantorovich-Wasserstein (Wasserstein-1) space, and by exploiting the …

  Cited by 3 Related articles All 5 versions

<——2019—–—2019 ——2030— 



Sensitivity of the Compliance and of the Wasserstein Distance with Respect to a Varying Source

G Bouchitté, I Fragalà, I Lucardesi - Applied Mathematics & Optimization, 2019 - Springer

We show that the compliance functional in elasticity is differentiable with respect to

horizontal variations of the load term, when the latter is given by a possibly concentrated

measure; moreover, we provide an integral representation formula for the derivative as  …

  Related articles All 9 versions

 


[PDF] researchgate.net

[PDF] Wasserstein distance: flexible tool for statistical analysis

GVVLV Lucarini - 2019 - researchgate.net

The figure shows the Wasserstein distance calculated in the phase space composed by

globally averaged temperature and precipitation. To provide some sort of benchmark, at the

bottom of the figure is shown the value related to the NCEP reanalysis, which yields one of …

  Related articles All 4 versions 


[PDF] openreview.net

A Greedy Approach to Max-Sliced Wasserstein GANs

A Horváth - 2019 - openreview.net

Generative Adversarial Networks have made data generation possible in various use cases,

but in case of complex, high-dimensional distributions it can be difficult to train them,

because of convergence problems and the appearance of mode collapse. Sliced …

  Related articles All 2 versions 


[PDF] amazonaws.com

[PDF] Supplement to “Optimal Estimation of Wasserstein Distance on A Tree with An Application to Microbiome Studies”

S Wang, TT CaiH Li - pstorage-tf-iopjsd8797887.s3 …

Supplement to “Optimal Estimation of Wasserstein Distance on A Tree with An Application

to Microbiome Studies” Shulei Wang, T. Tony Cai and Hongzhe Li University of Pennsylvania In

this supplementary material, we provide the proof for the main results (Section S1) and all the …

  Related articles All 3 versions 


[CITATION] Time Series Generation using One Dimensional Wasserstein GAN

EK Smith, OA Smith - ITISE 2019 International Conference on Time Series …, 2019

  Cited by 1


2019


[PDF] arxiv.org

Approximation of stable law in Wasserstein-1 distance by Stein's method

L Xu - Annals of Applied Probability, 2019 - projecteuclid.org

Abstract Let $ n\in\mathbb {N} $, let $\zeta_ {n, 1},\ldots,\zeta_ {n, n} $ be a sequence of

independent random variables with $\mathbb {E}\zeta_ {n, i}= 0$ and $\mathbb {E}|\zeta_ {n,

i}|<\infty $ for each $ i $, and let $\mu $ be an $\alpha $-stable distribution having …

  Cited by 19 Related articles All 7 versions


2019

[PDF] arxiv.org

Statistical data analysis in the Wasserstein space

J Bigot - arXiv preprint arXiv:1907.08417, 2019 - arxiv.org

This paper is concerned with statistical inference problems from a data set whose elements

may be modeled as random probability measures such as multiple histograms or point

clouds. We propose to review recent contributions in statistics on the use of Wasserstein  …

  Cited by 3 Related articles All 2 versions 


[PDF] arxiv.org

Confidence regions in wasserstein distributionally robust estimation

J Blanchet, K MurthyN Si - arXiv preprint arXiv:1906.01614, 2019 - arxiv.org

Wasserstein distributionally robust optimization (DRO) estimators are obtained as solutions

of min-max problems in which the statistician selects a parameter minimizing the worst-case

loss among all probability models within a certain distance (in Wasserstein sense) from the …

  Cited by 10 Related articles All 6 versions 


[PDF] aclweb.org

Modeling personalization in continuous space for response generation via augmented wasserstein autoencoders

Z Chan, J Li, X Yang, X Chen, W Hu, D Zhao… - … on empirical methods in …, 2019 - aclweb.org

Abstract Variational autoencoders (VAEs) and Wasserstein autoencoders (WAEs) have

achieved noticeable progress in open-domain response generation. Through introducing

latent variables in continuous space, these models are capable of capturing utterance-level …

  Cited by 14 Related articles All 3 versions 


[PDF] arxiv.org

Using wasserstein-2 regularization to ensure fair decisions with neural-network classifiers

L Risser, Q Vincenot, N Couellan… - arXiv preprint arXiv …, 2019 - arxiv.org

In this paper, we propose a new method to build fair Neural-Network classifiers by using a

constraint based on the Wasserstein distance. More specifically, we detail how to efficiently

compute the gradients of Wasserstein-2 regularizers for Neural-Networks. The proposed …

  Cited by 9 Related articles All 2 versions 
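
As a rough illustration of the quantity being regularized, on the real line the squared 2-Wasserstein distance between two score distributions reduces to an average of squared quantile differences. The numpy sketch below is a generic stand-in for such a group-fairness penalty on synthetic scores; it is not the authors' differentiable neural-network implementation.

import numpy as np

def w2_penalty(scores_a, scores_b, n_q=100):
    # squared W2 between 1-D empirical score distributions via quantile functions
    q = np.linspace(0, 1, n_q, endpoint=False) + 0.5 / n_q
    qa, qb = np.quantile(scores_a, q), np.quantile(scores_b, q)
    return np.mean((qa - qb) ** 2)

rng = np.random.default_rng(0)
print(w2_penalty(rng.beta(2, 5, 1000), rng.beta(5, 2, 1000)))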

<——2019—–—2019 ——2040— 


Commande Optimale dans les Espaces de Wasserstein

B Bonnet - 2019 - theses.fr

Abstract: A wealth of mathematical tools allowing the modelling and analysis

of multi-agent problems has recently been developed within the framework of optimal

transport theory. In this thesis, we extend for the first time several of these …

[PDF] arxiv.org

Wasserstein distributionally robust optimization: Theory and applications in machine learning

D Kuhn, PM Esfahani, VA Nguyen… - … in the Age of …, 2019 - pubsonline.informs.org

Many decision problems in science, engineering, and economics are affected by uncertain

parameters whose distribution is only indirectly observable through samples. The goal of

data-driven decision making is to learn a decision from finitely many training samples that …

  Cited by 70 Related articles All 7 versions


[PDF] arxiv.org

Sharp asymptotic and finite-sample rates of convergence of empirical measures in Wasserstein distance

J Weed, F Bach - Bernoulli, 2019 - projecteuclid.org

The Wasserstein distance between two probability measures on a metric space is a

measure of closeness with applications in statistics, probability, and machine learning. In

this work, we consider the fundamental question of how quickly the empirical measure …

  Cited by 173 Related articles All 6 versions
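
On the real line the empirical Wasserstein distance studied here is computable by matching order statistics, which makes its decay as the sample size grows easy to observe numerically. A minimal numpy sketch on synthetic standard-normal samples (equal sample sizes, an illustration only):

import numpy as np

def empirical_wp_1d(x, y, p=1):
    # W_p between empirical measures of two equal-size 1-D samples: couple sorted values
    xs, ys = np.sort(x), np.sort(y)
    return np.mean(np.abs(xs - ys) ** p) ** (1.0 / p)

rng = np.random.default_rng(0)
for n in (100, 1000, 10000):
    x, y = rng.normal(size=n), rng.normal(size=n)
    print(n, empirical_wp_1d(x, y))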


[PDF] arxiv.org

Estimation of Wasserstein distances in the spiked transport model

J Niles-Weed, P Rigollet - arXiv preprint arXiv:1909.07513, 2019 - arxiv.org

We propose a new statistical model, the spiked transport model, which formalizes the

assumption that two probability distributions differ only on a low-dimensional subspace. We

study the minimax rate of estimation for the Wasserstein distance under this model and show …

  Cited by 17 Related articles All 2 versions 


[PDF] arxiv.org

The Pontryagin maximum principle in the Wasserstein space

B Bonnet, F Rossi - Calculus of Variations and Partial Differential …, 2019 - Springer

Abstract We prove a Pontryagin Maximum Principle for optimal control problems in the

space of probability measures, where the dynamics is given by a transport equation with non-

local velocity. We formulate this first-order optimality condition using the formalism of …

  Cited by 24 Related articles All 20 versions


2019


[PDF] arxiv.org

Fréchet means and Procrustes analysis in Wasserstein space

Y Zemel, VM Panaretos - Bernoulli, 2019 - projecteuclid.org

We consider two statistical problems at the intersection of functional and non-Euclidean data

analysis: the determination of a Fréchet mean in the Wasserstein space of multivariate

distributions; and the optimal registration of deformed random measures and point …

  Cited by 51 Related articles All 8 versions


[PDF] thecvf.com

Normalized wasserstein for mixture distributions with applications in adversarial learning and domain adaptation

Y Balaji, R Chellappa, S Feizi - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

Understanding proper distance measures between distributions is at the core of several

learning tasks such as generative models, domain adaptation, clustering, etc. In this work,

we focus on mixture distributions that arise naturally in several application domains where …

  Cited by 12 Related articles All 4 versions 


[PDF] mlr.press

Accelerated linear convergence of stochastic momentum methods in wasserstein distances

B Can, M Gurbuzbalaban, L Zhu - … Conference on Machine …, 2019 - proceedings.mlr.press

Momentum methods such as Polyak's heavy ball (HB) method, Nesterov's accelerated

gradient (AG) as well as accelerated projected gradient (APG) method have been commonly

used in machine learning practice, but their performance is quite sensitive to noise in the …

  Cited by 25 Related articles All 8 versions 


[PDF] ucla.edu

On differentiability in the Wasserstein space and well-posedness for Hamilton–Jacobi equations

W GangboA Tudorascu - Journal de Mathématiques Pures et Appliquées, 2019 - Elsevier

In this paper we elucidate the connection between various notions of differentiability in the

Wasserstein space: some have been introduced intrinsically (in the Wasserstein space, by

using typical objects from the theory of Optimal Transport) and used by various authors to  …

  Cited by 34 Related articles All 4 versions


Artifact correction in low‐dose dental CT imaging using Wasserstein generative adversarial networks

Z Hu, C Jiang, F Sun, Q Zhang, Y Ge, Y Yang… - Medical …, 2019 - Wiley Online Library

Purpose In recent years, health risks concerning high‐dose x‐ray radiation have become a

major concern in dental computed tomography (CT) examinations. Therefore, adopting low‐

dose computed tomography (LDCT) technology has become a major focus in the CT …

  Cited by 30 Related articles All 5 versions

<——2019—–—2019 ——2050— 



[PDF] arxiv.org

Investigating under and overfitting in wasserstein generative adversarial networks

B Adlam, C Weill, A Kapoor - arXiv preprint arXiv:1910.14137, 2019 - arxiv.org

We investigate under and overfitting in Generative Adversarial Networks (GANs), using

discriminators unseen by the generator to measure generalization. We find that the model

capacity of the discriminator has a significant effect on the generator's model quality, and …

  Cited by 7 Related articles All 3 versions 

Investigating Under and Overfitting in Wasserstein Generative Adversarial Networks

A Kapoor, B Adlam, C Weill - 2019 - research.google

We investigate under and overfitting in Generative Adversarial Networks (GANs), using

discriminators unseen by the generator to measure generalization. We find that the model

capacity of the discriminator has a significant effect on the generator's model quality, and …

 

[PDF] arxiv.org

Parisi's formula is a Hamilton-Jacobi equation in Wasserstein space

JC Mourrat - arXiv preprint arXiv:1906.08471, 2019 - arxiv.org

Parisi's formula is a self-contained description of the infinite-volume limit of the free energy of

mean-field spin glass models. We show that this quantity can be recast as the solution of a

Hamilton-Jacobi equation in the Wasserstein space of probability measures on the positive …

  Cited by 7 Related articles All 3 versions 

[PDF] arxiv.org

Fast convergence of empirical barycenters in Alexandrov spaces and the Wasserstein space

TL Gouic, Q Paris, P Rigollet, AJ Stromme - arXiv preprint arXiv …, 2019 - arxiv.org

This work establishes fast rates of convergence for empirical barycenters over a large class

of geodesic spaces with curvature bounds in the sense of Alexandrov. More specifically, we

show that parametric rates of convergence are achievable under natural conditions that …

  Cited by 9 Related articles All 2 versions 


[PDF] arxiv.org

Wasserstein metric-driven Bayesian inversion with applications to signal processing

M Motamed, D Appelo - International Journal for Uncertainty …, 2019 - dl.begellhouse.com

We present a Bayesian framework based on a new exponential likelihood function driven by

the quadratic Wasserstein metric. Compared to conventional Bayesian models based on

Gaussian likelihood functions driven by the least-squares norm (L 2 norm), the new …

  Cited by 8 Related articles All 4 versions


[PDF] arxiv.org

Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula

X Fang, QM Shao, L Xu - Probability Theory and Related Fields, 2019 - Springer

Stein's method has been widely used for probability approximations. However, in the multi-

dimensional setting, most of the results are for multivariate normal approximation or for test

functions with bounded second-or higher-order derivatives. For a class of multivariate …

  Cited by 22 Related articles All 7 versions

[CITATION] Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula (vol 174, pg 945, 2019)

X Fang, QM Shao, L Xu - PROBABILITY …, 2019 - … TIERGARTENSTRASSE 17, D …

[HTML] Correction to: Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula

X Fang, QM Shao, L Xu - Probability Theory and Related Fields, 2019 - Springer

Under the above-strengthened Assumption 2.1, all the conclusions and examples in [1] still hold

true, except that all the constants \(C_\theta \) therein will depend on the constants in the new

assumption … Combining the previous three inequalities, we conclude that [1, (7.1)] still holds …

  Cited by 1 Related articles All 2 versions


2019


[PDF] sciencedirect.com

Harmonic mappings valued in the Wasserstein space

H Lavenant - Journal of Functional Analysis, 2019 - Elsevier

We propose a definition of the Dirichlet energy (which is roughly speaking the integral of the

square of the gradient) for mappings μ: Ω → (P(D), W_2) defined over a subset Ω of R^p and

valued in the space P(D) of probability measures on a compact convex subset D of R^q …

  Cited by 12 Related articles All 12 versions


[PDF] mlr.press

A gradual, semi-discrete approach to generative network training via explicit wasserstein minimization

Y Chen, M Telgarsky, C Zhang… - International …, 2019 - proceedings.mlr.press

This paper provides a simple procedure to fit generative networks to target distributions, with

the goal of a small Wasserstein distance (or other optimal transport costs). The approach is

based on two principles:(a) if the source randomness of the network is a continuous …

  Cited by 4 Related articles All 10 versions 


Wasserstein distance-based analogous method to predict distribution of non-uniform corrosion on reinforcements in concrete

Q Liu, RKL Su - Construction and Building Materials, 2019 - Elsevier

This paper presents an analogous method to predict the distribution of non-uniform

corrosion on reinforcements in concrete by minimizing the Wasserstein distance. A

comparison between the predicted and experimental results shows that the proposed …

  Cited by 6 Related articles All 3 versions


[PDF] arxiv.org

Penalization of barycenters in the Wasserstein space

J Bigot, E Cazelles, N Papadakis - SIAM Journal on Mathematical Analysis, 2019 - SIAM

In this paper, a regularization of Wasserstein barycenters for random measures supported

on R^d is introduced via convex penalization. The existence and uniqueness of such

barycenters is first proved for a large class of penalization functions. The Bregman …

  Cited by 15 Related articles All 8 versions


[PDF] arxiv.org

Randomized filtering and Bellman equation in Wasserstein space for partial observation control problem

E Bandini, A Cosso, M Fuhrman, H Pham - Stochastic Processes and their …, 2019 - Elsevier

We study a stochastic optimal control problem for a partially observed diffusion. By using the

control randomization method in Bandini et al.(2018), we prove a corresponding

randomized dynamic programming principle (DPP) for the value function, which is obtained …

  Cited by 16 Related articles All 13 versions

<——2019—–—2019 ——2060—   



[PDF] springer.com

Convergence to equilibrium in Wasserstein distance for damped Euler equations with interaction forces

JA Carrillo, YP Choi, O Tse - Communications in Mathematical Physics, 2019 - Springer

We develop tools to construct Lyapunov functionals on the space of probability measures in

order to investigate the convergence to global equilibrium of a damped Euler system under

the influence of external and interaction potential forces with respect to the 2-Wasserstein  …

  Cited by 13 Related articles All 11 versions


Multivariate stable approximation in Wasserstein distance by Stein's method

P Chen, I Nourdin, L Xu, X Yang - arXiv preprint arXiv:1911.12917, 2019 - arxiv.org

We investigate regularity properties of the solution to Stein's equation associated with

multivariate integrable $\alpha $-stable distribution for a general class of spectral measures

and Lipschitz test functions. The obtained estimates induce an upper bound in Wasserstein  …

  Cited by 4 Related articles All 4 versions 


[PDF] arxiv.org

Geometric convergence bounds for Markov chains in Wasserstein distance based on generalized drift and contraction conditions

Q Qin, JP Hobert - arXiv preprint arXiv:1902.02964, 2019 - arxiv.org

Let $\{X_n\} _ {n= 0}^\infty $ denote an ergodic Markov chain on a general state space that

has stationary distribution $\pi $. This article concerns upper bounds on the $ L_1 $-

Wasserstein distance between the distribution of $ X_n $ and $\pi $. In particular, an explicit …

  Cited by 9 Related articles All 2 versions 


[PDF] projecteuclid.org

Behavior of the empirical Wasserstein distance in  under moment conditions

J Dedecker, F Merlevède - Electronic Journal of Probability, 2019 - projecteuclid.org

We establish some deviation inequalities, moment bounds and almost sure results for the

Wasserstein distance of order $ p\in [1,\infty) $ between the empirical measure of

independent and identically distributed ${\mathbb R}^ d $-valued random variables and the …

  Cited by 7 Related articles All 12 versions


2019

Nonlinear model reduction on metric spaces. Application to ...

https://arxiv.org › math


by V Ehrlacher · 2019 · Cited by 4 — Application to one-dimensional conservative PDEs in Wasserstein spaces. We consider the problem of model reduction of parametrized PDEs where the goal is to approximate any function belonging to the set of solutions at a reduced computational cost.

[PDF] arxiv.org

[CITATION] Nonlinear model reduction on metric spaces. Application to one-dimensional conservative PDEs in Wasserstein spaces

V Ehrlacher, D Lombardi, O Mula, FX Vialard - arXiv preprint arXiv:1909.06626, 2019

  Cited by 4 Related articles All 19 versions


2019


[PDF] inria.fr

On a Wasserstein-type distance between solutions to stochastic differential equations

J Bion–Nadal, D Talay - The Annals of Applied Probability, 2019 - projecteuclid.org

In this paper, we introduce a Wasserstein-type distance on the set of the probability

distributions of strong solutions to stochastic differential equations. This new distance is

defined by restricting the set of possible coupling measures. We prove that it may also be …

  Cited by 11 Related articles All 9 versions


[PDF] arxiv.org

Normalized Wasserstein distance for mixture distributions with applications in adversarial learning and domain adaptation

Y Balaji, R Chellappa, S Feizi - arXiv preprint arXiv:1902.00415, 2019 - arxiv.org

Understanding proper distance measures between distributions is at the core of several

learning tasks such as generative models, domain adaptation, clustering, etc. In this work,

we focus on mixture distributions that arise naturally in several application domains where …

  Cited by 5 Related articles All 2 versions 


[PDF] esaim-cocv.org

A Pontryagin Maximum Principle in Wasserstein spaces for constrained optimal control problems

B Bonnet - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

In this paper, we prove a Pontryagin Maximum Principle for constrained optimal control

problems in the Wasserstein space of probability measures. The dynamics is described by a

transport equation with non-local velocities which are affine in the control, and is subject to  …

  Cited by 8 Related articles All 45 versions


[PDF] polimi.it

Propagating uncertainty in reinforcement learning via wasserstein barycenters

AM Metelli, A Likmeta, M Restelli - 33rd Conference on Neural …, 2019 - re.public.polimi.it

How does the uncertainty of the value function propagate when performing temporal

difference learning? In this paper, we address this question by proposing a Bayesian

framework in which we employ approximate posterior distributions to model the uncertainty …

  Cited by 5 Related articles All 3 versions 


Approximation of Discounted Minimax Markov Control Problems and Zero-Sum Markov Games Using Hausdorff and Wasserstein Distances

F Dufour, T Prieto-Rumeau - Dynamic Games and Applications, 2019 - Springer

This paper is concerned with a minimax control problem (also known as a robust Markov

decision process (MDP) or a game against nature) with general state and action spaces

under the discounted cost optimality criterion. We are interested in approximating …

  Related articles All 6 versions

<——2019—–—2019 ——2070—  



2019  

Data-driven regularization of Wasserstein barycenters with an application to multivariate density registration

J Bigot, E Cazelles, N Papadakis - Information and Inference: A …, 2019 - academic.oup.com

We present a framework to simultaneously align and smoothen data in the form of multiple

point clouds sampled from unknown densities with support in a d-dimensional Euclidean

space. This work is motivated by applications in bioinformatics where researchers aim to  …

  Cited by 12 Related articles All 8 versions


[PDF] arxiv.org

Temporal Wasserstein non-negative matrix factorization for non-rigid motion segmentation and spatiotemporal deconvolution

E Varol, A Nejatbakhsh, C McGrory - arXiv preprint arXiv:1912.03463, 2019 - arxiv.org

Motion segmentation for natural images commonly relies on dense optic flow to yield point

trajectories which can be grouped into clusters through various means including spectral

clustering or minimum cost multicuts. However, in biological imaging scenarios, such as …

  Cited by 2 Related articles All 3 versions 


[PDF] ieee.org

Generating Adversarial Samples With Constrained Wasserstein Distance

K Wang, P Yi, F Zou, Y Wu - IEEE Access, 2019 - ieeexplore.ieee.org

In recent years, deep neural network (DNN) approaches prove to be useful in many machine

learning tasks, including classification. However, small perturbations that are carefully

crafted by attackers can lead to the misclassification of the images. Previous studies have …

  Cited by 1 Related articles


[PDF] arxiv.org

Subexponential upper and lower bounds in Wasserstein distance for Markov processes

A Arapostathis, G Pang, N Sandrić - arXiv preprint arXiv:1907.05250, 2019 - arxiv.org

In this article, relying on Foster-Lyapunov drift conditions, we establish subexponential

upper and lower bounds on the rate of convergence in the $\mathrm {L}^ p $-Wasserstein

distance for a class of irreducible and aperiodic Markov processes. We further discuss these …

  Cited by 2 Related articles All 3 versions 


[PDF] arxiv.org

From the backward Kolmogorov PDE on the Wasserstein space to propagation of chaos for McKean-Vlasov SDEs

N Frikha, PEC de Raynal - arXiv preprint arXiv:1907.01410, 2019 - arxiv.org

In this article, we provide some new quantitative estimates for propagation of chaos of non-

linear stochastic differential equations (SDEs) in the sense of McKean-Vlasov. We obtain

explicit error estimates, at the level of the trajectories, at the level of the semi-group and at …

  Cited by 5 Related articles All 7 versions 


2019


[PDF] projecteuclid.org

Wasserstein-2 bounds in normal approximation under local dependence

X Fang - Electronic Journal of Probability, 2019 - projecteuclid.org

We obtain a general bound for the Wasserstein-2 distance in normal approximation for sums

of locally dependent random variables. The proof is based on an asymptotic expansion for

expectations of second-order differentiable functions of the sum. We apply the main result to  …

  Cited by 4 Related articles All 3 versions


[PDF] phmsociety.org

Anomaly detection on time series with wasserstein gan applied to phm

M Ducoffe, I Haloui, JS Gupta - International Journal of …, 2019 - papers.phmsociety.org

Modern vehicles are more and more connected. For instance, in the aerospace industry,

newer aircraft are already equipped with data concentrators and enough wireless

connectivity to transmit sensor data collected during the whole flight to the ground, usually …

  Cited by 2 Related articles All 2 versions 


[PDF] arxiv.org

2-wasserstein approximation via restricted convex potentials with application to improved training for gans

A Taghvaei, A Jalali - arXiv preprint arXiv:1902.07197, 2019 - arxiv.org

We provide a framework to approximate the 2-Wasserstein distance and the optimal

transport map, amenable to efficient training as well as statistical and geometric analysis.

With the quadratic cost and considering the Kantorovich dual form of the optimal …

  Cited by 9 Related articles All 3 versions 


[PDF] arxiv.org

Bounds for the Wasserstein mean with applications to the Lie-Trotter mean

J Hwang, S Kim - Journal of Mathematical Analysis and Applications, 2019 - Elsevier

Since barycenters in the Wasserstein space of probability distributions have been

introduced, the Wasserstein metric and the Wasserstein mean of positive definite Hermitian

matrices have been recently developed. In this paper, we explore some properties of …

  Cited by 3 Related articles All 5 versions


[PDF] projecteuclid.org

Convergence of the population dynamics algorithm in the Wasserstein metric

M Olvera-Cravioto - Electronic Journal of Probability, 2019 - projecteuclid.org

We study the convergence of the population dynamics algorithm, which produces sample

pools of random variables having a distribution that closely approximates that of the special

endogenous solution to a variety of branching stochastic fixed-point equations, including the …

  Cited by 3 Related articles Al

<——2019—–—2019 ——2080—  


Barycenters in generalized Wasserstein spaces

NP Chung, TS Trinh - arXiv preprint arXiv:1909.05517, 2019 - arxiv.org

In 2014, Piccoli and Rossi introduced generalized Wasserstein spaces which are

combinations of Wasserstein distances and $ L^ 1$-distances [11]. In this article, we follow

the ideas of Agueh and Carlier [1] to study generalized Wasserstein barycenters. We show …

  Cited by 1 Related articles All 3 versions 


[PDF] arxiv.org

Attainability property for a probabilistic target in Wasserstein spaces

G Cavagnari, A Marigonda - arXiv preprint arXiv:1904.10933, 2019 - arxiv.org

In this paper we establish an attainability result for the minimum time function of a control

problem in the space of probability measures endowed with Wasserstein distance. The

dynamics is provided by a suitable controlled continuity equation, where we impose a …

  Cited by 1 Related articles All 6 versions 


[PDF] arxiv.org

Wasserstein norm for signed measures, with application to nonlocal transport equation with source term

B Piccoli, F Rossi, M Tournus - arXiv preprint arXiv:1910.05105, 2019 - arxiv.org

We introduce the optimal transportation interpretation of the Kantorovich norm on the space

of signed Radon measures with finite mass, based on a generalized Wasserstein

distance for measures with different masses. With the formulation and the new topological …

  Cited by 4 Related articles All 7 versions 


[PDF] arxiv.org

On the estimation of the Wasserstein distance in generative models

T Pinetz, D Soukup, T Pock - German Conference on Pattern Recognition, 2019 - Springer

Abstract Generative Adversarial Networks (GANs) have been used to model the underlying

probability distribution of sample based datasets. GANs are notorious for training difficulties

and their dependence on arbitrary hyperparameters. One recent improvement in GAN …

  Related articles All 5 versions


[PDF] arxiv.org

Graph signal representation with wasserstein barycenters

E Simou, P Frossard - … on Acoustics, Speech and Signal …, 2019 - ieeexplore.ieee.org

In many applications signals reside on the vertices of weighted graphs. Thus, there is the

need to learn low dimensional representations for graph signals that will allow for data

analysis and interpretation. Existing unsupervised dimensionality reduction methods for …

  Cited by 7 Related articles All 5 versions


2019


Wasserstein Distances for Estimating Parameters in Stochastic Reaction Networks

K Öcal, R Grima, G Sanguinetti - … on Computational Methods in Systems …, 2019 - Springer

Modern experimental methods such as flow cytometry and fluorescence in-situ hybridization

(FISH) allow the measurement of cell-by-cell molecule numbers for RNA, proteins and other

substances for large numbers of cells at a time, opening up new possibilities for the …

  Related articles All 3 versions


[PDF] arxiv.org

Optimal Transport Relaxations with Application to Wasserstein GANs

S Mahdian, J Blanchet, P Glynn - arXiv preprint arXiv:1906.03317, 2019 - arxiv.org

We propose a family of relaxations of the optimal transport problem which regularize the

problem by introducing an additional minimization step over a small region around one of

the underlying transporting measures. The type of regularization that we obtain is related to  …

  Related articles All 4 versions 


[PDF] arxiv.org

The Wasserstein Distances Between Pushed-Forward Measures with Applications to Uncertainty Quantification

A Sagiv - arXiv preprint arXiv:1902.05451, 2019 - arxiv.org

In the study of dynamical and physical systems, the input parameters are often uncertain or

randomly distributed according to a measure $\varrho $. The system's response $ f $ pushes

forward $\varrho $ to a new measure $ f\circ\varrho $ which we would like to study. However …

  Related articles All 3 versions 


Unsupervised segmentation of colonic polyps in narrow-band imaging data based on manifold representation of images and Wasserstein distance

IN Figueiredo, L Pinto, PN Figueiredo, R Tsai - … Signal Processing and …, 2019 - Elsevier

Colorectal cancer (CRC) is one of the most common cancers worldwide and after a certain

age (≥ 50) regular colonoscopy examination for CRC screening is highly recommended.

One of the most prominent precursors of CRC are abnormal growths known as polyps. If a …

  Related articles All 4 versions


[PDF] koreascience.or.kr

Experimental Analysis of Equilibrization in Binary Classification for Non-Image Imbalanced Data Using Wasserstein GAN

ZY Wang, DK Kang - International Journal of Internet …, 2019 - koreascience.or.kr

In this paper, we explore the details of three classic data augmentation methods and two

generative model based oversampling methods. The three classic data augmentation

methods are random sampling (RANDOM), Synthetic Minority Over-sampling Technique …

  Cited by 2 Related articles All 3 versions 

<——2019—–—2019 ——2090—  



[PDF] arxiv.org

Learning embeddings into entropic wasserstein spaces

C Frogner, F Mirzazadeh, J Solomon - arXiv preprint arXiv:1905.03329, 2019 - arxiv.org

Euclidean embeddings of data are fundamentally limited in their ability to capture latent

semantic structures, which need not conform to Euclidean spatial assumptions. Here we

consider an alternative, which embeds data as discrete probability distributions in a …

[PDF] Tropical Optimal Transport and Wasserstein Distances in Phylogenetic Tree Space

W Lee, W Li, B Lin, A Monod - arXiv preprint arXiv:1911.05401, 2019 - math.ucla.edu

We study the problem of optimal transport on phylogenetic tree space from the perspective

of tropical geometry, and thus define the Wasserstein-p distances for probability measures in

this continuous metric measure space setting. With respect to the tropical metric—a …

  Related articles All 2 versions 


[PDF] researchgate.net

[PDF] Rate of convergence in Wasserstein distance of piecewise-linear Lévy-driven SDEs

ARI ARAPOSTATHIS, G PANG… - arXiv preprint arXiv …, 2019 - researchgate.net

In this paper, we study the rate of convergence under the Wasserstein metric of a broad

class of multidimensional piecewise Ornstein–Uhlenbeck processes with jumps. These are

governed by stochastic differential equations having a piecewise linear drift, and a fairly …

  Related articles 


[PDF] archives-ouvertes.fr

Optimal Control in Wasserstein Spaces

B Bonnet - 2019 - hal.archives-ouvertes.fr

A wealth of mathematical tools allowing to model and analyse multi-agent systems has been

brought forth as a consequence of recent developments in optimal transport theory. In this

thesis, we extend for the first time several of these concepts to the framework of control …

  Related articles All 8 versions 

[CITATION] Optimal Control in Wasserstein Spaces (Commande Optimale dans les Espaces de Wasserstein).

B Bonnet - 2019 - Aix-Marseille University, France


[PDF] arxiv.org

Local well-posedness in the Wasserstein space for a chemotaxis model coupled to Navier-Stokes equations

K Kang, HK Kim - arXiv preprint arXiv:1907.01895, 2019 - arxiv.org

We consider a coupled system of Keller-Segel type equations and the incompressible

Navier-Stokes equations in spatial dimension two and three. In the previous work [19], we

established the existence of a weak solution of a Fokker-Planck equation in the Wasserstein …

  Related articles All 2 versions 


2019


[PDF] arxiv.org

Convergence of some classes of random flights in Wasserstein distance

A Falaleev, V Konakov - arXiv preprint arXiv:1910.03862, 2019 - arxiv.org

In this paper we consider a random walk of a particle in $\mathbb {R}^ d $. Convergence of

different transformations of trajectories of random flights with Poisson switching moments

has been obtained by Davydov and Konakov, as well as diffusion approximation of the …

  Related articles All 2 versions 


Wasserstein barycenters in the manifold of all positive definite matrices

E Nobari, B Ahmadi Kakavandi - Quarterly of Applied Mathematics, 2019 - ams.org

In this paper, we study the Wasserstein barycenter of finitely many Borel probability

measures on $\mathbb {P} _ {n} $, the Riemannian manifold of all $ n\times n $ real positive

definite matrices as well as its associated dual problem, namely the optimal transport …

  Related articles All 2 versions


[PDF] arxiv.org

The existence of geodesics in Wasserstein spaces over path groups and loop groups

J Shao - Stochastic Processes and their Applications, 2019 - Elsevier

In this work we prove the existence and uniqueness of the optimal transport map for L p-

Wasserstein distance with p> 1, and particularly present an explicit expression of the optimal

transport map for the case p= 2. As an application, we show the existence of geodesics …

  Related articles All 8 versions


 

[PDF] sns.it

Sensitivity of the Compliance and of the Wasserstein Distance with Respect to a Varying Source

G Bouchitté, I Fragalà, I Lucardesi - Applied Mathematics & Optimization, 2019 - Springer

We show that the compliance functional in elasticity is differentiable with respect to

horizontal variations of the load term, when the latter is given by a possibly concentrated

measure; moreover, we provide an integral representation formula for the derivative as a …

  Related articles All 9 versions


[PDF] openreview.net

A Greedy Approach to Max-Sliced Wasserstein GANs

A Horváth - 2019 - openreview.net

Generative Adversarial Networks have made data generation possible in various use cases,

but in case of complex, high-dimensional distributions it can be difficult to train them,

because of convergence problems and the appearance of mode collapse. Sliced …

  Related articles All 2 versions 

<——2019—–—2019 ——2100— 



[PDF] tum.de

Structure preserving discretization and approximation of gradient flows in Wasserstein-like space

S Plazotta - 2019 - mediatum.ub.tum.de

This thesis investigates structure-preserving, temporal semi-discretizations and

approximations for PDEs with gradient flow structure with the application to evolution

problems in the L²-Wasserstein space. We investigate the variational formulation of the time …

  Related articles All 3 versions 


Minimax estimation of smooth densities in Wasserstein distance

J Niles-Weed, Q Berthet - arXiv e-prints, 2019 - ui.adsabs.harvard.edu

We study nonparametric density estimation problems where error is measured in the

Wasserstein distance, a metric on probability distributions popular in many areas of statistics

and machine learning. We give the first minimax-optimal rates for this problem for general …

 

Use of the Wasserstein Metric to Solve the Inverse Dynamic Seismic Problem

AA Vasilenko - Geomodel 2019, 2019 - earthdoc.org

The inverse dynamic seismic problem consists in recovering the velocity model of an elastic

medium based on the observed seismic data. In this work the full waveform inversion method is

used to solve this problem. It consists in minimizing an objective functional measuring the …

  Related articles

  

[PDF] amazonaws.com

[PDF] Supplement to “Optimal Estimation of Wasserstein Distance on A Tree with An Application to Microbiome Studies”

S Wang, TT CaiH Li - pstorage-tf-iopjsd8797887.s3 …

Supplement to “Optimal Estimation of Wasserstein Distance on A Tree with An Application

to Microbiome Studies” Shulei Wang, T. Tony Cai and Hongzhe Li University of Pennsylvania

In this supplementary material, we provide the proof for the main results (Section S1) … belonging …

  Related articles All 3 versions 


Tackling Algorithmic Bias in Neural-Network Classifiers using Wasserstein-2 Regularization

L Risser, Q Vincenot, JM Loubes - arXiv e-prints, 2019 - ui.adsabs.harvard.edu

The increasingly common use of neural network classifiers in industrial and social

applications of image analysis has allowed impressive progress these last years. Such

methods are however sensitive to algorithmic bias, ie to an under-or an over-representation …

 

2019


Prioritized Experience Replay based on the Wasserstein Metric in Deep Reinforcement Learning: The regularizing effect of modelling return distributions

T Greevink - 2019 - repository.tudelft.nl

This thesis tests the hypothesis that distributional deep reinforcement learning (RL)

algorithms get an increased performance over expectation based deep RL because of the

regularizing effect of fitting a more complex model. This hypothesis was tested by comparing …

  

2019

Sampling of probability measures in the convex order by Wasserstein projection

J Corbetta, B Jourdain - 2019 - ideas.repec.org

In this paper, for $\mu $ and $\nu $ two probability measures on $\mathbb {R}^ d $ with finite

moments of order $\rho\ge 1$, we define the respective projections for the $ W_\rho $-

Wasserstein distance of $\mu $ and $\nu $ on the sets of probability measures dominated by …

  

Elements of Statistical Inference in 2-Wasserstein Space

J Ebert, V Spokoiny, A Suvorikova - Topics in Applied Analysis and …, 2019 - Springer

This work addresses an issue of statistical inference for the datasets lacking underlying

linear structure, which makes impossible the direct application of standard inference

techniques and requires a development of a new tool-box taking into account properties of …

Related articles All 3 versions

MR4410573

Projection in the 2-Wasserstein sense on structured measure space

L Lebrat - 2019 - tel.archives-ouvertes.fr

This thesis focuses on the approximation for the 2-Wasserstein metric of probability

measures by structured measures. The set of structured measures under consideration is

made of consistent discretizations of measures carried by a smooth curve with a bounded …

  All 2 versions 

[PDF] thecvf.com

Sliced wasserstein discrepancy for unsupervised domain adaptation

CY Lee, T Batra, MH Baig… - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

In this work, we connect two distinct concepts for unsupervised domain adaptation: feature 

distribution alignment between domains by utilizing the task-specific decision boundary and 

the Wasserstein metric. Our proposed sliced Wasserstein discrepancy (SWD) is designed to …

Cited by 24 Related articles All 10 versions

Sliced Wasserstein Discrepancy for Unsupervised Domain ...

http://ieeexplore.ieee.org › document

Sliced Wasserstein Discrepancy for Unsupervised Domain Adaptation ... Published in: 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition ...

Date Added to IEEE Xplore: 09 January 2020

Date of Conference: 15-20 June 2019

DOI: 10.1109/CVPR.2019.01053

[CITATION] Sliced wasserstein discrepancy for unsupervised domain adaptation. In 2019 IEEE

C Lee, T Batra, MH Baig, D Ulbricht - CVF Conference on Computer Vision and …, 2019

Cited by 293 Related articles All 10 versions
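
The sliced construction behind this discrepancy averages one-dimensional Wasserstein distances over random projection directions. The sketch below is a generic Monte-Carlo sliced Wasserstein-1 estimate in numpy/scipy on synthetic point clouds; it is not the task-specific SWD of the paper, which ties the comparison to classifier outputs.

import numpy as np
from scipy.stats import wasserstein_distance

def sliced_wasserstein(X, Y, n_proj=50, seed=0):
    # average 1-D W1 distances of the point clouds projected on random unit directions
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=X.shape[1])
        theta /= np.linalg.norm(theta)
        total += wasserstein_distance(X @ theta, Y @ theta)
    return total / n_proj

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 5))
Y = rng.normal(loc=1.0, size=(300, 5))
print(sliced_wasserstein(X, Y))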

[CITATION] A general solver to the elliptical mixture model through an approximate wasserstein manifold

S Li, Z Yu, M Xiang, D Mandic - arXiv preprint arXiv:1906.03700, 2019

<——2019—–—2019 ——2110— 


Optimistic distributionally robust optimization for nonparametric likelihood approximation

…, MC Yue, D Kuhn, W Wiesemann - Advances in …, 2019 - proceedings.neurips.cc

… We prove that the resulting posterior inference problems under the KL divergence and the

Wasserstein distance enjoy strong theoretical guarantees, and we illustrate their promising …

 Cited by 16 Related articles All 11 versions 

[PDF] arxiv.org

Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula

X Fang, QM Shao, L Xu - Probability Theory and Related Fields, 2019 - Springer

Stein's method has been widely used for probability approximations. However, in the multi-

dimensional setting, most of the results are for multivariate normal approximation or for test

functions with bounded second-or higher-order derivatives. For a class of multivariate  …

  Cited by 22 Related articles All 7 versions

[CITATION] Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula (vol 174, pg 945, 2019)

X Fang, QM Shao, L Xu - PROBABILITY …, 2019 - … TIERGARTENSTRASSE 17, D …

 [HTML] springer.com

[HTML] Correction to: Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula

X Fang, QM Shao, L Xu - Probability Theory and Related Fields, 2019 - Springer

Under the above-strengthened Assumption 2.1, all the conclusions and examples in [1] still hold

true, except that all the constants \(C_\theta \) therein will depend on the constants in the new

assumption … Combining the previous three inequalities, we conclude that [1, (7.1)] still holds …

  Cited by 1 Related articles All 2 versions


 

[PDF] arxiv.org

Multivariate stable approximation in Wasserstein distance by Stein's method

P Chen, I Nourdin, L Xu, X Yang - arXiv preprint arXiv:1911.12917, 2019 - arxiv.org

We investigate regularity properties of the solution to Stein's equation associated with

multivariate integrable $\alpha $-stable distribution for a general class of spectral measures

and Lipschitz test functions. The obtained estimates induce an upper bound in Wasserstein  …

  Cited by 4 Related articles All 4 versions 


 

[PDF] semanticscholar.org

Speech Enhancement for Noise-Robust Speech Synthesis Using Wasserstein GAN.

N Adiga, Y Pantazis, V Tsiaras, Y Stylianou - INTERSPEECH, 2019 - isca-speech.org

The quality of speech synthesis systems can be significantly deteriorated by the presence of

background noise in the recordings. Despite the existence of speech enhancement

techniques for effectively suppressing additive noise under low signal-to-noise (SNR) …

  Cited by 4 Related articles All 4 versions


[PDF] arxiv.org

Wasserstein Adversarial Regularization (WAR) on label noise

BB Damodaran, K Fatras, S Lobry, R Flamary… - arXiv preprint arXiv …, 2019 - arxiv.org

Noisy labels often occur in vision datasets, especially when they are obtained from

crowdsourcing or Web scraping. We propose a new regularization method, which enables

learning robust classifiers in presence of noisy data. To achieve this goal, we propose a new …

  Cited by 1 Related articles All 2 versions 

Wasserstein Adversarial Regularization (WAR) on label noise

B Bhushan Damodaran, K Fatras, S Lobry… - arXiv e …, 2019 - ui.adsabs.harvard.edu

Noisy labels often occur in vision datasets, especially when they are obtained from

crowdsourcing or Web scraping. We propose a new regularization method, which enables

learning robust classifiers in presence of noisy data. To achieve this goal, we propose a new …



Pushing the right boundaries matters! wasserstein adversarial training for label noise

BB Damodaran, K Fatras, S Lobry, R Flamary, D Tuia… - 2019 - hal.laas.fr

Noisy labels often occur in vision datasets, especially when they are issued from

crowdsourcing or Web scraping. In this paper, we propose a new regularization method

which enables one to learn robust classifiers in presence of noisy data. To achieve this goal …

  Cited by 3 Related articles All 4 versions 


 

[PDF] sciencedirect.com

Distributionally Robust Chance-Constraint Optimal Power Flow Considering Uncertain Renewables with Wasserstein-Moment Metric

J Liu, Y Chen, C Duan, J Lyu - Energy Procedia, 2019 - Elsevier

Chance-constraint optimal power flow has been proven as an efficient method to manage

the risk of volatile renewable energy sources. To address the uncertainties of renewable

energy sources, a novel distributionally robust chance-constraint OPF model is proposed in …

  Cited by 1 Related articles All 2 versions



Minimax estimation of smooth densities in Wasserstein distance

J Niles-Weed, Q Berthet - arXiv e-prints, 2019 - ui.adsabs.harvard.edu

We study nonparametric density estimation problems where error is measured in the

Wasserstein distance, a metric on probability distributions popular in many areas of statistics

and machine learning. We give the first minimax-optimal rates for this problem for general …





[CITATION] Multivariate Stein Factors from Wasserstein Decay

MA Erdogdu, L Mackey, O Shamir - 2019 - preparation

  Cited by 2 Related articles


[PDF] nsf.gov

An information-theoretic view of generalization via Wasserstein distance

H Wang, M Diaz, JCS Santos Filho… - … on Information Theory …, 2019 - ieeexplore.ieee.org

We capitalize on the Wasserstein distance to obtain two information-theoretic bounds on the

generalization error of learning algorithms. First, we specialize the Wasserstein distance into

total variation, by using the discrete metric. In this case we derive a generalization bound …

  Cited by 9 Related articles All 5 versions

<——2019—–—2019 ——2120—  




[PDF] ieee.org

Accelerating CS-MRI reconstruction with fine-tuning Wasserstein generative adversarial network

M Jiang, Z Yuan, X Yang, J Zhang, Y Gong, L Xia… - IEEE …, 2019 - ieeexplore.ieee.org

Compressed sensing magnetic resonance imaging (CS-MRI) is a time-efficient method to

acquire MR images by taking advantage of the highly under-sampled k-space data to

accelerate the time consuming acquisition process. In this paper, we proposed a de-aliasing …

  Cited by 5 Related articles


[PDF] arxiv.org

Parisi's formula is a Hamilton-Jacobi equation in Wasserstein space

JC Mourrat - arXiv preprint arXiv:1906.08471, 2019 - arxiv.org

Parisi's formula is a self-contained description of the infinite-volume limit of the free energy of

mean-field spin glass models. We show that this quantity can be recast as the solution of a

Hamilton-Jacobi equation in the Wasserstein space of probability measures on the positive …

  Cited by 7 Related articles All 3 versions 


[PDF] arxiv.org

A partial Laplacian as an infinitesimal generator on the Wasserstein space

YT Chow, W Gangbo - Journal of Differential Equations, 2019 - Elsevier

In this manuscript, we consider special linear operators which we term partial Laplacians on

the Wasserstein space, and which we show to be partial traces of the Wasserstein Hessian.

We verify a distinctive smoothing effect of the “heat flows” they generated for a particular …

  Cited by 13 Related articles All 9 versions


[PDF] ieee.org

Prostate MR image segmentation with self-attention adversarial training based on wasserstein distance

C Su, R Huang, C Liu, T Yin, B Du - IEEE Access, 2019 - ieeexplore.ieee.org

Prostate diseases are very common in men. Accurate segmentation of the prostate plays a

significant role in further clinical treatment and diagnosis. There have been some methods

that combine the segmentation network and generative adversarial network, using the …

  Cited by 3 Related articles


[PDF] arxiv.org

Straight-through estimator as projected Wasserstein gradient flow

P Cheng, C Liu, C Li, D Shen, R Henao… - arXiv preprint arXiv …, 2019 - arxiv.org

The Straight-Through (ST) estimator is a widely used technique for back-propagating

gradients through discrete random variables. However, this effective method lacks

theoretical justification. In this paper, we show that ST can be interpreted as the simulation of …

  Cited by 4 Related articles All 5 versions 


2019


Generating EEG signals of an RSVP experiment by a class conditioned wasserstein generative adversarial network

S Panwar, P Rad, J Quarles… - 2019 IEEE International …, 2019 - ieeexplore.ieee.org

Electroencephalography (EEG) data is difficult to obtain due to complex experimental setups

and reduced comfort due to prolonged wearing. This poses challenges to train powerful

deep learning model due to the limited EEG data. Hence, being able to generate EEG data …

  Cited by 5 Related articles All 2 versions


2019

Cross-domain Attention Network with Wasserstein Regularizers for E-commerce Search

M QiuB WangC Chen, X Zeng, J Huang… - Proceedings of the 28th …, 2019 - dl.acm.org

Product search and recommendation is a task that every e-commerce platform wants to

outperform their peers on. However, training a good search or recommendation model often

requires more data than what many platforms have. Fortunately, the search tasks on different …

  Related articles


 2019

An Integrated Processing Method Based on Wasserstein Barycenter Algorithm for Automatic Music Transcription

C Jin, Z Li, Y Sun, H Zhang, X Lv, J Li, S Liu - International Conference on …, 2019 - Springer

Given a piece of acoustic musical signal, various automatic music transcription (AMT)

processing methods have been proposed to generate the corresponding music notations

without human intervention. However, the existing AMT methods based on signal …

  Related articles



Wasserstein adversarial examples via projected sinkhorn iterations

E WongF SchmidtZ Kolter - International Conference on …, 2019 - proceedings.mlr.press

A rapidly growing area of work has studied the existence of adversarial examples,

datapoints which have been perturbed to fool a classifier, but the vast majority of these

works have focused primarily on threat models defined by $\ell_p $ norm-bounded …

  Cited by 72 Related articles All 8 versions 
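
For readers unfamiliar with the machinery behind this attack, the sketch below shows only the plain balanced Sinkhorn iteration for entropy-regularized optimal transport between two histograms; the paper's projected Sinkhorn adds a projection step that is not reproduced here, and the histograms, cost matrix and regularization strength eps are made up for illustration.

    import numpy as np

    def sinkhorn_plan(p, q, C, eps=1.0, n_iter=2000):
        """Approximate entropy-regularized OT plan between histograms p and q."""
        K = np.exp(-C / eps)              # Gibbs kernel
        v = np.ones_like(q)
        for _ in range(n_iter):
            u = p / (K @ v)               # rescale rows toward marginal p
            v = q / (K.T @ u)             # rescale columns toward marginal q
        return u[:, None] * K * v[None, :]

    # Toy example: histograms on five points of a line, squared-distance cost.
    x = np.arange(5, dtype=float)
    C = (x[:, None] - x[None, :]) ** 2
    p = np.array([0.4, 0.3, 0.2, 0.08, 0.02])
    q = p[::-1].copy()
    plan = sinkhorn_plan(p, q, C)
    print("max marginal error:", np.abs(plan.sum(axis=1) - p).max())
    print("regularized transport cost:", (plan * C).sum())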


[HTML] oup.com

Uncoupled isotonic regression via minimum Wasserstein deconvolution

P RigolletJ Weed - Information and Inference: A Journal of the …, 2019 - academic.oup.com

Isotonic regression is a standard problem in shape-constrained estimation where the goal is

to estimate an unknown non-decreasing regression function from independent pairs where.

While this problem is well understood both statistically and computationally, much less is …

  Cited by 39 Related articles All 8 versions

<——2019—–—2019 ——2130—



[PDF] ncl.ac.uk

Generative adversarial framework for depth filling via wasserstein metric, cosine transform and domain transfer

…, S Akcay, GP de La Garanderie… - Pattern Recognition, 2019 - Elsevier

In this work, the issue of depth filling is addressed using a self-supervised feature learning

model that predicts missing depth pixel values based on the context and structure of the

scene. A fully-convolutional generative model is conditioned on the available depth …

  Cited by 17 Related articles All 4 versions





Computing Wasserstein Barycenters via linear programming

G Auricchio, F Bassetti, S Gualandi… - … Conference on Integration …, 2019 - Springer

This paper presents a family of generative Linear Programming models that permit to

compute the exact Wasserstein Barycenter of a large set of two-dimensional images.

Wasserstein Barycenters were recently introduced to mathematically generalize the concept …

  Cited by 4 Related articles All 2 versions
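
The paper above computes exact barycenters of 2D images by linear programming; as a far smaller illustration of the object itself (my own sketch, unrelated to the paper's LP models), in one dimension the W2 barycenter can be approximated by averaging quantile functions, i.e. averaging sorted samples of equal size.

    import numpy as np

    rng = np.random.default_rng(0)
    a = np.sort(rng.normal(-2.0, 1.0, size=1000))   # samples from N(-2, 1)
    b = np.sort(rng.normal(+2.0, 1.0, size=1000))   # samples from N(+2, 1)

    # In 1D the W2 barycenter's quantile function is the average of the input
    # quantile functions, so averaging order statistics approximates it.
    barycenter = 0.5 * (a + b)
    print("barycenter mean ~", barycenter.mean())   # close to 0
    print("barycenter std  ~", barycenter.std())    # close to 1, as for N(0, 1)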


[PDF] polimi.it

Propagating uncertainty in reinforcement learning via wasserstein barycenters

AM MetelliA LikmetaM Restelli - 33rd Conference on Neural …, 2019 - re.public.polimi.it

How does the uncertainty of the value function propagate when performing temporal

difference learning? In this paper, we address this question by proposing a Bayesian

framework in which we employ approximate posterior distributions to model the uncertainty …

  Cited by 5 Related articles All 3 versions 


[PDF] arxiv.org

2-wasserstein approximation via restricted convex potentials with application to improved training for gans

A TaghvaeiA Jalali - arXiv preprint arXiv:1902.07197, 2019 - arxiv.org

We provide a framework to approximate the 2-Wasserstein distance and the optimal

transport map, amenable to efficient training as well as statistical and geometric analysis.

With the quadratic cost and considering the Kantorovich dual form of the optimal …

  Cited by 9 Related articles All 3 versions 


2019



[PDF] researchgate.net

Least-squares reverse time migration via linearized waveform inversion using a Wasserstein metric

P Yong, J Huang, Z Li, W LiaoL Qu - Geophysics, 2019 - library.seg.org

Least-squares reverse time migration (LSRTM), an effective tool for imaging the structures of

the earth from seismograms, can be characterized as a linearized waveform inversion

problem. We have investigated the performance of three minimization functionals as the L 2 …

  Cited by 3 Related articles All 4 versions

[CITATION] Least-squares reverse time migration via linearized waveform inversion using a Wasserstein metric (Wasserstein metric for LSRTM)

P Yong, J Huang, Z Li, W LiaoL Qu - Geophysics, 2019

  Related articles



[PDF] monash.edu

[PDF] Threeplayer wasserstein gan via amortised duality

QH Nhan Dam, T LeTD Nguyen… - Proc. of the 28th Int …, 2019 - research.monash.edu

We propose a new formulation for learning generative adversarial networks (GANs) using

optimal transport cost (the general form of Wasserstein distance) as the objective criterion to

measure the dissimilarity between target distribution and learned distribution. Our …

  Cited by 2 Related articles All 3 versions 


[PDF] researchgate.net

Infrared and Visible Image Fusion via Multi-discriminators Wasserstein Generative Adversarial Network

J Li, H Huo, K Liu, C Li, S Li… - 2019 18th IEEE …, 2019 - ieeexplore.ieee.org

Generative adversarial network (GAN) has been widely applied to infrared and visible image

fusion. However, the existing GAN-based image fusion methods only establish one

discriminator in the network to make the fused image capture gradient information from the …

  Cited by 1 Related articles All 3 versions


[PDF] arxiv.org

Group level MEG/EEG source imaging via optimal transport: minimum Wasserstein estimates

H JanatiT BazeilleB ThirionM Cuturi… - … Information Processing in …, 2019 - Springer

Magnetoencephalography (MEG) and electroencephalography (EEG) are non-invasive

modalities that measure the weak electromagnetic fields generated by neural activity.

Inferring the location of the current sources that generated these magnetic fields is an ill …

  Cited by 5 Related articles All 14 versions


[PDF] arxiv.org

On Efficient Multilevel Clustering via Wasserstein Distances

V HuynhN Ho, N Dam, XL Nguyen… - arXiv preprint arXiv …, 2019 - arxiv.org

We propose a novel approach to the problem of multilevel clustering, which aims to

simultaneously partition data in each group and discover grouping patterns among groups

in a potentially large hierarchically structured corpus of data. Our method involves a joint …

  Related articles All 2 versions 

<——2019—–—2019 ——2140—  



[PDF] arxiv.org

Distributionally Robust XVA via Wasserstein Distance: Wrong Way Counterparty Credit and Funding Risk

D Singh, S Zhang - arXiv preprint arXiv:1910.01781, 2019 - arxiv.org

This paper investigates calculations of robust XVA, in particular, credit valuation adjustment

(CVA) and funding valuation adjustment (FVA) for over-the-counter derivatives under

distributional uncertainty using Wasserstein distance as the ambiguity measure. Wrong way …

  Cited by 1 Related articles All 8 versions 


[PDF] arxiv.org

Distributionally Robust XVA via Wasserstein Distance Part 2: Wrong Way Funding Risk

D Singh, S Zhang - arXiv preprint arXiv:1910.03993, 2019 - arxiv.org

This paper investigates calculations of robust funding valuation adjustment (FVA) for over

the counter (OTC) derivatives under distributional uncertainty using Wasserstein distance as

the ambiguity measure. Wrong way funding risk can be characterized via the robust FVA …

  Related articles All 5 versions 



[CITATION] Distributionally robust xva via wasserstein distance part 1

D Singh, S Zhang - arXiv preprint arXiv:1910.01781, 2019

  Cited by 3 Related articles



[CITATION] On the complexity of computing Wasserstein distances

B TaskesenS Shafieezadeh-Abadeh, D Kuhn - 2019 - Working paper

  Cited by 2 Related articles


Data augmentation method of sar image dataset based on wasserstein generative adversarial networks

Q Lu, H Jiang, G Li, W Ye - 2019 International conference on …, 2019 - ieeexplore.ieee.org

The published Synthetic Aperture Radar (SAR) samples are not abundant enough, which is

not conducive to the application of deep learning methods in the field of SAR automatic

target recognition. Generative Adversarial Nets (GANs) is one of the most effective ways to …

  Cited by 1 Related articles All 2 versions


2019



2019

Kernel Wasserstein Distance

JH Oh, M Pouryahya, A Iyer, AP Apte… - arXiv preprint arXiv …, 2019 - arxiv.org

The Wasserstein distance is a powerful metric based on the theory of optimal transport. It

gives a natural measure of the distance between two distributions with a wide range of

applications. In contrast to a number of the common divergences on distributions such as …

  Cited by 4 Related articles All 3 versions 


[PDF] arxiv.org

Bridging Bayesian and minimax mean square error estimation via Wasserstein distributionally robust optimization

VA NguyenS Shafieezadeh-AbadehD Kuhn… - arXiv preprint arXiv …, 2019 - arxiv.org

We introduce a distributionally robust minimium mean square error estimation model with a

Wasserstein ambiguity set to recover an unknown signal from a noisy observation. The

proposed model can be viewed as a zero-sum game between a statistician choosing an …

  Cited by 8 Related articles All 6 versions 


[PDF] arxiv.org

Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula

X Fang, QM ShaoL Xu - Probability Theory and Related Fields, 2019 - Springer

Stein's method has been widely used for probability approximations. However, in the multi-

dimensional setting, most of the results are for multivariate normal approximation or for test

functions with bounded second-or higher-order derivatives. For a class of multivariate  …

  Cited by 22 Related articles All 7 versions

[CITATION] Multivariate approximations in Wasserstein distance by Stein's method and Bismut's formula (vol 174, pg 945, 2019)

X Fang, QM ShaoL Xu - PROBABILITY …



[PDF] researchgate.net

Wasserstein Subsequence Kernel for Time Series

C BockM TogninalliE Ghisu… - … Conference on Data …, 2019 - ieeexplore.ieee.org

Kernel methods are a powerful approach for learning on structured data. However, as we

show in this paper, simple but common instances of the popular R-convolution kernel

framework can be meaningless when assessing the similarity of two time series through …

  Cited by 3 Related articles All 10 versions


[PDF] arxiv.org

On the minimax optimality of estimating the wasserstein metric

T Liang - arXiv preprint arXiv:1908.10324, 2019 - arxiv.org

We study the minimax optimal rate for estimating the Wasserstein-$1 $ metric between two

unknown probability measures based on $ n $ iid empirical samples from them. We show

that estimating the Wasserstein metric itself between probability measures, is not …

  Cited by 3 Related articles All 3 versions 

<——2019—–—2019 ——2150 


Sharp asymptotic and finite-sample rates of convergence of empirical measures in Wasserstein distance

J WeedF Bach - Bernoulli, 2019 - projecteuclid.org

The Wasserstein distance between two probability measures on a metric space is a

measure of closeness with applications in statistics, probability, and machine learning. In

this work, we consider the fundamental question of how quickly the empirical measure …

  Cited by 173 Related articles All 6 versions
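
A toy numerical companion to this entry (my own illustration, not the paper's experiments): the 1-Wasserstein distance between the empirical measure of n standard-normal samples and a large reference sample shrinks as n grows, which is the phenomenon whose rates the paper quantifies.

    import numpy as np
    from scipy.stats import wasserstein_distance

    rng = np.random.default_rng(0)
    reference = rng.standard_normal(200_000)   # stands in for the true distribution
    for n in (100, 1_000, 10_000):
        sample = rng.standard_normal(n)
        print(n, wasserstein_distance(sample, reference))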


 

[PDF] arxiv.org

Tree-Wasserstein Barycenter for Large-Scale Multilevel Clustering and Scalable Bayes

T LeV HuynhN HoD PhungM Yamada - arXiv preprint arXiv …, 2019 - arxiv.org

We study in this paper a variant of Wasserstein barycenter problem, which we refer to as tree-

Wasserstein barycenter, by leveraging a specific class of ground metrics, namely tree

metrics, for Wasserstein distance. Drawing on the tree structure, we propose an efficient …

  Related articles All 2 versions 






Music Classification using Multiclass Support Vector Machine and Multilevel Wasserstein Means

J Wei, C Jin, Z Cheng, X Lv… - 2019 IEEE/ACIS 18th …, 2019 - ieeexplore.ieee.org

Music classification is a challenging task in music information retrieval. In this article, we

compare the performance of the two types of models. The first category is classified by

Support Vector Machine (SVM). We use the feature extraction from audio as the basis of …

  Related articles All 2 versions


2019


Distributionally robust xva via wasserstein distance part 1: Wrong way counterparty credit risk

D Singh, S Zhang - Unknown Journal, 2019 - experts.umn.edu

This paper investigates calculations of robust CVA for OTC derivatives under distributional

uncertainty using Wasserstein distance as the ambiguity measure. Wrong way counterparty

credit risk can be characterized (and indeed quantified) via the robust CVA formulation. The …

[CITATION] Distributionally robust xva via wasserstein distance part 1

D Singh, S Zhang - arXiv preprint arXiv:1910.01781, 2019

  Cited by 3 Related articles


2019  

[CITATION] エントロピー正則化 Wasserstein 距離に基づくマルチビュー Wasserstein 判別法 (放送技術)

笠井裕之 - 映像情報メディア学会技術報告= ITE technical report, 2019 - ci.nii.ac.jp

[Japanese: Multi-view Wasserstein discriminant method based on the entropy-regularized Wasserstein distance (broadcast technology)]




2019

Confronto di funzioni oggetto per l'inversione di dati sismici studio delle potenzialità della Metrica di Wasserstein

L STRACCA - 2019 - etd.adm.unipi.it

Un problema inverso ha come scopo la determinazione o la stima dei parametri incogniti di

un modello, conoscendo i dati da esso generati l'operatore di forward modelling che

descrive la relazione tra un modello generico il rispettivo dato predetto. In un qualunque …

[Italian: Comparison of objective functions for seismic data inversion and a study of the potential of the Wasserstein metric]


 year 2019

[PDF] uniandes.edu.co

[PDF] Problemas de clasificación: una perspectiva robusta con la métrica de Wasserstein

JA Acosta Melo - repositorio.uniandes.edu.co

El objetivo central de este trabajo es dar un contexto a los problemas de clasificación para

los casos de máquinas de soporte vectorial regresión logıstica. La idea central es abordar

estos problemas con un enfoque robusto con ayuda de la métrica de Wasserstein que se …

  Related articles 

[Spanish: Classification problems: a robust perspective with the Wasserstein metric]

<——2019—–—2019 ——2160— 


[PDF] arxiv.org

Precise simulation of electromagnetic calorimeter showers using a Wasserstein Generative Adversarial Network

M Erdmann, J Glombitza, T Quast - Computing and Software for Big …, 2019 - Springer

Simulations of particle showers in calorimeters are computationally time-consuming, as they

have to reproduce both energy depositions and their considerable fluctuations. A new

approach to ultra-fast simulations is generative models where all calorimeter energy …

  Cited by 46 Related articles All 6 versions

 

[PDF] arxiv.org

Hausdorff and Wasserstein metrics on graphs and other structured data

E Patterson - arXiv preprint arXiv:1907.00257, 2019 - arxiv.org

Optimal transport is widely used in pure and applied mathematics to find probabilistic

solutions to hard combinatorial matching problems. We extend the Wasserstein metric and

other elements of optimal transport from the matching of sets to the matching of graphs and …

  Cited by 5 Related articles All 3 versions 


[PDF] projecteuclid.org

Wasserstein-2 bounds in normal approximation under local dependence

X Fang - Electronic Journal of Probability, 2019 - projecteuclid.org

We obtain a general bound for the Wasserstein-2 distance in normal approximation for sums

of locally dependent random variables. The proof is based on an asymptotic expansion for

expectations of second-order differentiable functions of the sum. We apply the main result to …

  Cited by 4 Related articles All 3 versions


[PDF] arxiv.org

Temporal Wasserstein non-negative matrix factorization for non-rigid motion segmentation and spatiotemporal deconvolution

E VarolA Nejatbakhsh, C McGrory - arXiv preprint arXiv:1912.03463, 2019 - arxiv.org

Motion segmentation for natural images commonly relies on dense optic flow to yield point

trajectories which can be grouped into clusters through various means including spectral

clustering or minimum cost multicuts. However, in biological imaging scenarios, such as …

  Cited by 2 Related articles All 3 versions 


 

Wasserstein barycenters in the manifold of all positive definite matrices

E NobariB Ahmadi Kakavandi - Quarterly of Applied Mathematics, 2019 - ams.org

In this paper, we study the Wasserstein barycenter of finitely many Borel probability

measures on $\mathbb {P} _ {n} $, the Riemannian manifold of all $ n\times n $ real positive

definite matrices as well as its associated dual problem, namely the optimal transport …

  Related articles All 2 versions


2019

[PDF] sfds.asso.fr

[PDF] Méthode de couplage en distance de Wasserstein pour la théorie des valeurs extrêmes

B Bobbia, C DombryD Varron - jds2019.sfds.asso.fr

Nous proposons une relecture de résultats classiques de la théorie des valeurs extrêmes,

que nous étudions grâce aux outils que nous fournit la théorie du transport optimal. Dans ce

cadre, nous pouvons voir la normalité des estimateurs comme une convergence de …

  Related articles All 2 versions 


[PDF] aaai.org

Manifold-valued image generation with Wasserstein generative adversarial nets

Z Huang, J Wu, L Van Gool - Proceedings of the AAAI Conference on …, 2019 - ojs.aaai.org

Generative modeling over natural images is one of the most fundamental machine learning

problems. However, few modern generative models, including Wasserstein Generative

Adversarial Nets (WGANs), are studied on manifold-valued images that are frequently …

Cited by 16 Related articles All 12 versions 

 

[HTML] deepai.org

[HTML] Manifold-valued image generation with wasserstein adversarial networks

EW GANs - 2019 - deepai.org

Unsupervised image generation has recently received an increasing amount of attention thanks

to the great success of generative adversarial networks (GANs), particularly Wasserstein

GANs. Inspired by the paradigm of real-valued image generation, this paper makes the first attempt …

  Cited by 2 Related articles 


Unsupervised segmentation of colonic polyps in narrow-band imaging data based on manifold representation of images and Wasserstein distance

IN FigueiredoL PintoPN FigueiredoR Tsai - … Signal Processing and …, 2019 - Elsevier

Colorectal cancer (CRC) is one of the most common cancers worldwide and after a certain

age (≥ 50) regular colonoscopy examination for CRC screening is highly recommended.

One of the most prominent precursors of CRC are abnormal growths known as polyps. If a …

  Cited by 2 Related articles All 3 versions


<——2019—–—2019 ——2170 


Projection in the 2-Wasserstein sense on structured measure spaces

L Lebrat - 2019 - tel.archives-ouvertes.fr

This thesis focuses on the approximation for the 2-Wasserstein metric of probability

measures by structured measures. The set of structured measures under consideration is

made of consistent discretizations of measures carried by a smooth curve with a bounded …

  All 2 versions 


Projection au sens de Wasserstein 2 sur des espaces structurés de mesures

L Lebrat - 2019 - theses.fr

Résumé Cette thèse s' intéresse à l'approximation pour la métrique de 2-Wasserstein de

mesures de probabilité par une mesure structurée. Les mesures structurées étudiées sont

des discrétisations consistantes de mesures portées par des courbes continues à vitesse et …

  



Wasserstein Distributionally Robust Optimization - Delft ...

http://www.dcsc.tudelft.nl › 2019 › DRO_tutorial

PDF

by D Kuhn · Cited by 73 — Wasserstein distributionally robust optimization seeks data-driven decisions that ... independently from P. In addition, some structural properties of P may be known ... true distribution P, we must introduce a distance measure between probability ... we actually push down the risk under all distributions in the ambiguity set—in ...

[CITATION] Distributionally robust risk measures with structured Wasserstein ambiguity sets

VA Nguyen, D Filipovic, D Kuhn - 2019 - Working paper


Artifact correction in low‐dose dental CT imaging using Wasserstein generative adversarial networks

Z HuC Jiang, F Sun, Q Zhang, Y Ge, Y Yang… - Medical …, 2019 - Wiley Online Library

Purpose In recent years, health risks concerning high‐dose x‐ray radiation have become a

major concern in dental computed tomography (CT) examinations. Therefore, adopting low‐

dose computed tomography (LDCT) technology has become a major focus in the CT  …

  Cited by 30 Related articles All 5 versions


2019

[PDF] sciencedirect.com

Decomposition algorithm for distributionally robust optimization using Wasserstein metric with an application to a class of regression models

F LuoS Mehrotra - European Journal of Operational Research, 2019 - Elsevier

We study distributionally robust optimization (DRO) problems where the ambiguity set is

defined using the Wasserstein metric and can account for a bounded support. We show that

this class of DRO problems can be reformulated as decomposable semi-infinite programs …

  Cited by 23 Related articles All 6 versions


[PDF] arxiv.org

Data-driven chance constrained optimization under Wasserstein ambiguity sets

AR HotaA CherukuriJ Lygeros - 2019 American Control …, 2019 - ieeexplore.ieee.org

We present a data-driven approach for distri-butionally robust chance constrained

optimization problems (DRCCPs). We consider the case where the decision maker has

access to a finite number of samples or realizations of the uncertainty. The chance constraint …

  Cited by 21 Related articles All 4 versions


A virtual monochromatic imaging method for spectral CT based on Wasserstein generative adversarial network with a hybrid loss

Z Shi, J Li, H Li, Q Hu, Q Cao - IEEE Access, 2019 - ieeexplore.ieee.org

Spectral computed tomography (CT) has become a popular clinical diagnostic technique

because of its unique advantage in material distinction. Specifically, it can perform virtual

monochromatic imaging to obtain accurate tissue composition with less beam hardening …

  Cited by 9 Related articles All 2 versions


[PDF] iop.org

Wasserstein space as state space of quantum mechanics and optimal transport

MF Rosyid, K Wahyuningsih - Journal of Physics: Conference …, 2019 - iopscience.iop.org

In this work, we are in the position to view a measurement of a physical observable as an

experiment in the sense of probability theory. To every physical observable, a sample space

called the spectrum of the observable is therefore available. We have investigated the …

  Related articles All 2 versions


DE NOVO PROTEIN DESIGN OF NOVEL FOLDS USING GUIDED CONDITIONAL WASSERSTEIN GENERATIVE ADVERSARIAL NETWORKS (GCWGAN)

S Zhu - 2019 - oaktrust.library.tamu.edu

In the research areas about proteins, it is always a significant topic to detect the

sequencestructure-function relationship. Fundamental questions remain for this topic: How

much could current data alone reveal deep insights about such relationship? And how much …

<——2019—–—2019 ——2180 —


[PDF] nsf.gov

Distributions with Maximum Spread Subject to Wasserstein Distance Constraints

JG Carlsson, Y Wang - Journal of the Operations Research Society of …, 2019 - Springer

Recent research on formulating and solving distributionally robust optimization problems

has seen many different approaches for describing one's ambiguity set, such as constraints

on first and second moments or quantiles. In this paper, we use the Wasserstein distance to …

  Related articles All 3 versions


Wasserstein adversarial examples via projected sinkhorn iterations

E WongF SchmidtZ Kolter - International Conference on …, 2019 - proceedings.mlr.press

A rapidly growing area of work has studied the existence of adversarial examples,

datapoints which have been perturbed to fool a classifier, but the vast majority of these

works have focused primarily on threat models defined by $\ell_p $ norm-bounded …

  Cited by 73 Related articles All 8 versions 


[PDF] thecvf.com

Sliced wasserstein discrepancy for unsupervised domain adaptation

CY Lee, T Batra, MH Baig… - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

In this work, we connect two distinct concepts for unsupervised domain adaptation: feature

distribution alignment between domains by utilizing the task-specific decision boundary and

the Wasserstein metric. Our proposed sliced Wasserstein discrepancy (SWD) is designed to …

  Cited by 124 Related articles All 7 versions 

[CITATION] Sliced wasserstein discrepancy for unsupervised domain adaptation. In 2019 IEEE

C Lee, T Batra, MH Baig, D Ulbricht - CVF Conference on Computer Vision and …, 2019

  Cited by 1
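
For context, the generic sliced Wasserstein distance that this discrepancy builds on can be sketched in a few lines: project both point clouds onto random directions and average the closed-form one-dimensional distances. The code below is only that basic estimator for equal-size samples (all parameters invented for illustration), not the task-specific discrepancy of the paper.

    import numpy as np

    def sliced_wasserstein(X, Y, n_proj=200, seed=0):
        """Monte Carlo estimate of the sliced W2 distance between equal-size point clouds."""
        rng = np.random.default_rng(seed)
        total = 0.0
        for _ in range(n_proj):
            theta = rng.standard_normal(X.shape[1])
            theta /= np.linalg.norm(theta)           # random unit direction
            # 1D W2^2 between equal-weight samples: mean squared gap of sorted projections.
            px, py = np.sort(X @ theta), np.sort(Y @ theta)
            total += np.mean((px - py) ** 2)
        return np.sqrt(total / n_proj)

    rng = np.random.default_rng(1)
    X = rng.normal(0.0, 1.0, size=(500, 3))
    Y = rng.normal(0.5, 1.0, size=(500, 3))
    print(sliced_wasserstein(X, Y))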



[PDF] arxiv.org

Wasserstein distance based domain adaptation for object detection

P Xu, P GurramG WhippsR Chellappa - arXiv preprint arXiv:1909.08675, 2019 - arxiv.org

In this paper, we present an adversarial unsupervised domain adaptation framework for

object detection. Prior approaches utilize adversarial training based on cross entropy

between the source and target domain distributions to learn a shared feature mapping that …

  Cited by 6 Related articles All 2 versions 


[PDF] thecvf.com

Normalized wasserstein for mixture distributions with applications in adversarial learning and domain adaptation

Y BalajiR ChellappaS Feizi - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

Understanding proper distance measures between distributions is at the core of several

learning tasks such as generative models, domain adaptation, clustering, etc. In this work,

we focus on mixture distributions that arise naturally in several application domains where …

  Cited by 13 Related articles All 4 versions 


2019


[PDF] arxiv.org

Unsupervised adversarial domain adaptation based on the Wasserstein distance for acoustic scene classification

K DrossosP MagronT Virtanen - 2019 IEEE Workshop on …, 2019 - ieeexplore.ieee.org

A challenging problem in deep learning-based machine listening field is the degradation of

the performance when using data from unseen conditions. In this paper we focus on the

acoustic scene classification (ASC) task and propose an adversarial deep learning method …

  Cited by 15 Related articles All 5 versions


Deep multi-Wasserstein unsupervised domain adaptation

TN LeA HabrardM Sebban - Pattern Recognition Letters, 2019 - Elsevier

In unsupervised domain adaptation (DA), one aims at learning from labeled source data and

fully unlabeled target examples a model with a low error on the target domain. In this setting,

standard generalization bounds prompt us to minimize the sum of three terms:(a) the source …

  Cited by 3 Related articles All 3 versions


Deep joint two-stream Wasserstein auto-encoder and selective attention alignment for unsupervised domain adaptation

Z ChenC Chen, X Jin, Y LiuZ Cheng - Neural computing and …, 2019 - Springer

Abstract Domain adaptation refers to the process of utilizing the labeled source domain data

to learn a model that can perform well in the target domain with limited or missing labels.

Several domain adaptation methods combining image translation and feature alignment …

  Cited by 13 Related articles


[PDF] ncl.ac.uk

Generative adversarial framework for depth filling via wasserstein metric, cosine transform and domain transfer

A Atapour-AbarghoueiS Akcay… - Pattern Recognition, 2019 - Elsevier

In this work, the issue of depth filling is addressed using a self-supervised feature learning

model that predicts missing depth pixel values based on the context and structure of the

scene. A fully-convolutional generative model is conditioned on the available depth …

  Cited by 17 Related articles All 4 versions


[PDF] arxiv.org

Normalized Wasserstein distance for mixture distributions with applications in adversarial learning and domain adaptation

Y BalajiR ChellappaS Feizi - arXiv preprint arXiv:1902.00415, 2019 - arxiv.org

Understanding proper distance measures between distributions is at the core of several

learning tasks such as generative models, domain adaptation, clustering, etc. In this work,

we focus on mixture distributions that arise naturally in several application domains where …

  Cited by 5 Related articles All 2 versions 

<——2019—–—2019 ——2190— 



[PDF] mdpi.com

Wasserstein distance learns domain invariant feature representations for drift compensation of E-nose

Y Tao, C Li, Z Liang, H Yang, J Xu - Sensors, 2019 - mdpi.com

Abstract Electronic nose (E-nose), a kind of instrument which combines with the gas sensor

and the corresponding pattern recognition algorithm, is used to detect the type and

concentration of gases. However, the sensor drift will occur in realistic application scenario …

  Cited by 4 Related articles All 7 versions 






[PDF] csroc.org.tw

[PDF] Cross-domain Text Sentiment Classification Based on Wasserstein Distance

G Cai, Q Lin, N Chen - Journal of Computers, 2019 - csroc.org.tw

Text sentiment analysis is mainly to detect the sentiment polarity implicit in text data. Most

existing supervised learning algorithms are difficult to solve the domain adaptation problem

in text sentiment analysis. The key of cross-domain text sentiment analysis is how to extract …

  Related articles All 2 versions 


[CITATION] Multisource wasserstein distance based domain adaptation

S Ghosh, S Prakash - 2019 - dspace.iiti.ac.in

… Please use this identifier to cite or link to this item: http://dspace.iiti.ac.in:8080/jspui/handle/

123456789/2064. Title: Multisource wasserstein distance based domain adaptation …


2019


[PDF] ieee.org

A deep transfer model with wasserstein distance guided multi-adversarial networks for bearing fault diagnosis under different working conditions

M Zhang, D Wang, W Lu, J Yang, Z Li, B Liang - IEEE Access, 2019 - ieeexplore.ieee.org

In recent years, intelligent fault diagnosis technology with the deep learning algorithm has

been widely used in the manufacturing industry for substituting time-consuming human

analysis method to enhance the efficiency of fault diagnosis. The rolling bearing as the …

  Cited by 28 Related articles All 5 versions


2019

[PDF] biorxiv.org

De Novo Protein Design for Novel Folds using Guided Conditional Wasserstein Generative Adversarial Networks (gcWGAN)

M KarimiS ZhuY CaoY Shen - bioRxiv, 2019 - biorxiv.org

Motivation Facing data quickly accumulating on protein sequence and structure, this study is

addressing the following question: to what extent could current data alone reveal deep

insights into the sequence-structure relationship, such that new sequences can be designed …

  Cited by 6 Related articles All 4 versions 

 

[PDF] arxiv.org

Wasserstein Distance Guided Cross-Domain Learning

J Su - arXiv preprint arXiv:1910.07676, 2019 - arxiv.org

Domain adaptation aims to generalise a high-performance learner on target domain (non-

labelled data) by leveraging the knowledge from source domain (rich labelled data) which

comes from a different but related distribution. Assuming the source and target domains data …

  Related articles All 2 versions 


[PDF] aau.dk

[PDF] Full-Band Music Genres Interpolations with Wasserstein Autoencoders

T Borghuis, A Tibo, S Conforti, L Brusci… - Workshop AI for Media …, 2019 - vbn.aau.dk

We compare different types of autoencoders for generating interpolations between four-

instruments musical patterns in the acid jazz, funk, and soul genres. Preliminary empirical

results suggest the superiority of Wasserstein autoencoders. The process of generation …

  Related articles All 4 versions 



<——2019—–—2019 ——2200—



Gait recognition based on Wasserstein generating adversarial image inpainting network

L Xia, H Wang, W Guo - Journal of Central South University, 2019 - Springer

Aiming at the problem of small area human occlusion in gait recognition, a method based on

generating adversarial image inpainting network was proposed which can generate a

context consistent image for gait occlusion area. In order to reduce the effect of noise on …

  Cited by 2 Related articles


[PDF] arxiv.org

Deep Distributional Sequence Embeddings Based on a Wasserstein Loss

A Abdelwahab, N Landwehr - arXiv preprint arXiv:1912.01933, 2019 - arxiv.org

Deep metric learning employs deep neural networks to embed instances into a metric space

such that distances between instances of the same class are small and distances between

instances from different classes are large. In most existing deep metric learning techniques …

  Cited by 2 Related articles All 2 versions 


[PDF] ieee.org

Generating Adversarial Samples With Constrained Wasserstein Distance

K Wang, P Yi, F Zou, Y Wu - IEEE Access, 2019 - ieeexplore.ieee.org

In recent years, deep neural network (DNN) approaches prove to be useful in many machine

learning tasks, including classification. However, small perturbations that are carefully

crafted by attackers can lead to the misclassification of the images. Previous studies have …

  Cited by 1 Related articles

 

[PDF] bayesiandeeplearning.org

[PDF] Nested-Wasserstein Distance for Sequence Generation

R ZhangC ChenZ GanZ WenW WangL Carin - bayesiandeeplearning.org

Reinforcement learning (RL) has been widely studied for improving sequencegeneration

models. However, the conventional rewards used for RL training typically cannot capture

sufficient semantic information and therefore render model bias. Further, the sparse and …

  Related articles 


[PDF] arxiv.org

Wasserstein distributionally robust optimization: Theory and applications in machine learning

D KuhnPM EsfahaniVA Nguyen… - … Science in the Age …, 2019 - pubsonline.informs.org

Many decision problems in science, engineering, and economics are affected by uncertain

parameters whose distribution is only indirectly observable through samples. The goal of

data-driven decision making is to learn a decision from finitely many training samples that …

  Cited by 74 Related articles All 7 versions
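
One identity from this literature is worth recording next to the tutorial entry, stated informally and under the usual assumptions made there (type-1 Wasserstein ball of radius ε around the empirical distribution P̂_N, unbounded support, loss ℓ that is L-Lipschitz in the ground norm):

    sup { E_Q[ℓ(ξ)] : W_1(Q, P̂_N) ≤ ε } = E_{P̂_N}[ℓ(ξ)] + ε·L

so the worst case over the ambiguity ball is the empirical objective plus a Lipschitz regularization term proportional to the radius.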


2019


[PDF] sciencedirect.com

Harmonic mappings valued in the Wasserstein space

H Lavenant - Journal of Functional Analysis, 2019 - Elsevier

We propose a definition of the Dirichlet energy (which is roughly speaking the integral of the

square of the gradient) for mappings μ: Ω → (P(D), W2) defined over a subset Ω of R^p and

valued in the space P(D) of probability measures on a compact convex subset D of R^q …

  Cited by 12 Related articles All 12 versions


2019

[PDF] arxiv.org

The Wasserstein Distances Between Pushed-Forward Measures with Applications to Uncertainty Quantification

A Sagiv - arXiv preprint arXiv:1902.05451, 2019 - arxiv.org

In the study of dynamical and physical systems, the input parameters are often uncertain or

randomly distributed according to a measure $\varrho $. The system's response $ f $ pushes

forward $\varrho $ to a new measure $ f\circ\varrho $ which we would like to study. However …

  Related articles All 3 versions 


2019

[PDF] arxiv.org

Local Bures-Wasserstein Transport: A Practical and Fast Mapping Approximation

A Hoyos-Idrobo - arXiv preprint arXiv:1906.08227, 2019 - arxiv.org

Optimal transport (OT)-based methods have a wide range of applications and have attracted

a tremendous amount of attention in recent years. However, most of the computational

approaches of OT do not learn the underlying transport map. Although some algorithms …

  Related articles All 2 versions 


2019

[PDF] arxiv.org

Convergence of some classes of random flights in Wasserstein distance

A Falaleev, V Konakov - arXiv preprint arXiv:1910.03862, 2019 - arxiv.org

In this paper we consider a random walk of a particle in $\mathbb {R}^ d $. Convergence of

different transformations of trajectories of random flights with Poisson switching moments

has been obtained by Davydov and Konakov, as well as diffusion approximation of the …

  Related articles All 2 versions 


A convergent Lagrangian discretization for $ p $-Wasserstein ...

https://arxiv.org › math

by B Söllner · 2019 · Cited by 2 — ... discretization for p-Wasserstein and flux-limited diffusion equations ... numerical experiments including a numerical convergence analysis

<——2019—–—2019 ——2210—




[PDF] arxiv.org

Wasserstein robust reinforcement learning

MA Abdullah, H RenHB Ammar, V Milenkovic… - arXiv preprint arXiv …, 2019 - arxiv.org

Reinforcement learning algorithms, though successful, tend to over-fit to training

environments hampering their application to the real-world. This paper proposes $\text

{W}\text {R}^{2}\text {L} $--a robust reinforcement learning algorithm with significant robust …

  Cited by 16 Related articles All 4 versions 







Data-driven distributionally robust shortest path problem using the Wasserstein ambiguity set

Z Wang, K YouS SongC Shang - 2019 IEEE 15th …, 2019 - ieeexplore.ieee.org

This paper proposes a data-driven distributionally robust shortest path (DRSP) model where

the distribution of the travel time is only observable through a finite training dataset. Our

DRSP model adopts the Wasserstein metric to construct the ambiguity set of probability …

  Cited by 1 Related articles


[PDF] arxiv.org

The existence of geodesics in Wasserstein spaces over path groups and loop groups

J Shao - Stochastic Processes and their Applications, 2019 - Elsevier

In this work we prove the existence and uniqueness of the optimal transport map for L p-

Wasserstein distance with p> 1, and particularly present an explicit expression of the optimal

transport map for the case p= 2. As an application, we show the existence of geodesics …

  Related articles All 8 versions


2019


Training Wasserstein GANs for Estimating Depth Maps

AT Arslan, E Seke - 2019 3rd International Symposium on …, 2019 - ieeexplore.ieee.org

Depth maps depict pixel-wise depth association with a 2D digital image. Point clouds

generation and 3D surface reconstruction can be conducted by processing a depth map.

Estimating a corresponding depth map from a given input image is an important and difficult …

  Related articles


[PDF] arxiv.org

Wasserstein adversarial imitation learning

H XiaoM Herman, J Wagner, S Ziesche… - arXiv preprint arXiv …, 2019 - arxiv.org

Imitation Learning describes the problem of recovering an expert policy from

demonstrations. While inverse reinforcement learning approaches are known to be very

sample-efficient in terms of expert demonstrations, they usually require problem-dependent …

  Cited by 18 Related articles All 2 versions 


Time delay estimation via Wasserstein distance minimization

JM NicholsMN Hutchinson, N Menkart… - IEEE Signal …, 2019 - ieeexplore.ieee.org

… 18] GK Rohde, F. Bucholtz, and JM Nichols, “Maximum empirical likelihood estimation of time

delay in independently and identically distributed noise,” IET Signal Process., vol … J. Sun, and

BF Hamfeldt, “Application of optimal transport and the quadratic Wasserstein metric to …

  Cited by 4 Related articles All 2 versions


[PDF] researchgate.net

Wasserstein Subsequence Kernel for Time Series

C BockM TogninalliE Ghisu… - … Conference on Data …, 2019 - ieeexplore.ieee.org

… subsequences of two time series requires s2 distance calculations, each of which has to process

a sequence … CONCLUSION We developed a novel subsequence-based kernel that uses the

Wasserstein distance as an effective similarity measure for time series classification …

  Cited by 4 Related articles All 10 versions


[PDF] researchgate.net

Least-squares reverse time migration via linearized waveform inversion using a Wasserstein metric

P YongJ Huang, Z Li, W LiaoL Qu - Geophysics, 2019 - library.seg.org

… Recently, the Wasserstein metric, also known as the W 1 metric, has been introduced into

nonlinear waveform … L 1 norm, the W 1 metric frees us from the differentiability issue for time-domain

seismic … In addition, we have applied the W 1 metric to process the noisy field data to …

  Cited by 3 Related articles All 4 versions

<——2019—–—2019 ——2220—



[PDF] arxiv.org

Wasserstein gradient flow formulation of the time-fractional Fokker-Planck equation

MH DuongB Jin - arXiv preprint arXiv:1908.09055, 2019 - arxiv.org

… Keywords: Wasserstein gradient flow; time-fractional Fokker-Planck equation; convergence of

time-discretization … Therefore, the model (1.1) can be regarded as a time-fractional analogue … The

so-called subdiffusive process displays local motion occasionally interrupted by long …

  Cited by 1 Related articles All 7 versions 


[PDF] phmsociety.org

Anomaly detection on time series with wasserstein gan applied to phm

M Ducoffe, I Haloui, JS Gupta - International Journal of …, 2019 - papers.phmsociety.org

… Anomaly detection for time series consists of identifying whether the testing data conform to the …

First, we train a Wasserstein GAN: a discriminator D tries to maximize the expectation … This iterative

process approximates the minimization of the 1-Wasserstein distance between the …

  Cited by 2 Related articles All 2 versions 


[PDF] uchile.cl

[PDF] WASSERSTEIN-BASED DISTANCE FOR TIME SERIES ANALYSIS

E CAZELLES, A ROBERT, F TOBAR - cmm.uchile.cl

… sx and sy their respective NPSDs. We define the proposed Wasserstein-Fourier (WF) distance:

WF([x], [y]) = W2(sx, sy). Theorem … Properties. Let y be a zero-mean stationary discrete-time time

series, and let y … following implications: (i) if y is a band-limited process and lim n→∞ …

  Related articles 
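
A small sketch of the idea (my own toy code; it uses the 1-Wasserstein distance between spectra for convenience, whereas the slide defines the distance through W2): compare two signals via their normalized power spectral densities supported on the frequency axis.

    import numpy as np
    from scipy.stats import wasserstein_distance

    def npsd(x):
        """Normalized power spectral density from the real FFT."""
        s = np.abs(np.fft.rfft(x)) ** 2
        return s / s.sum()

    fs = 100                               # sampling rate in Hz
    t = np.arange(0, 10, 1 / fs)
    x = np.sin(2 * np.pi * 5 * t)          # 5 Hz tone
    y = np.sin(2 * np.pi * 8 * t)          # 8 Hz tone
    freqs = np.fft.rfftfreq(len(t), 1 / fs)

    # Distance between the two spectra over the frequency axis, roughly 3 Hz here.
    print(wasserstein_distance(freqs, freqs, npsd(x), npsd(y)))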



2019


[CITATION] Comparison of object functions for the inversion of seismic data and study on the potentialities of the Wasserstein Metric

L Stracca, E Stucchi, A Mazzotti - GNGTS, 2019 - arpi.unipi.it


  




2019

[PDF] arxiv.org

Nonembeddability of Persistence Diagrams with p>2 Wasserstein Metric

A Wagner - arXiv preprint arXiv:1910.13935, 2019 - arxiv.org

Persistence diagrams do not admit an inner product structure compatible with any

Wasserstein metric. Hence, when applying kernel methods to persistence diagrams, the

underlying feature map necessarily causes distortion. We prove persistence diagrams with …

  Cited by 3 Related articles All 2 versions 


2019


[PDF] arxiv.org

Universality of persistence diagrams and the bottleneck and Wasserstein distances

P Bubenik, A Elchesen - arXiv preprint arXiv:1912.02563, 2019 - arxiv.org

We undertake a formal study of persistence diagrams and their metrics. We show that

barcodes and persistence diagrams together with the bottleneck distance and the

Wasserstein distances are obtained via universal constructions and thus have …

  Cited by 3 Related articles All 4 versions 


2019

[PDF] arxiv.org

Progressive wasserstein barycenters of persistence diagrams

J Vidal, J BudinJ Tierny - IEEE transactions on visualization …, 2019 - ieeexplore.ieee.org

This paper presents an efficient algorithm for the progressive approximation of Wasserstein

barycenters of persistence diagrams, with applications to the visual analysis of ensemble

data. Given a set of scalar fields, our approach enables the computation of a persistence …

  Cited by 16 Related articles All 16 versions


2019


2019

How to Implement Wasserstein Loss for Generative ...

https://machinelearningmastery.com › how-to-impleme...

Jul 15, 2019 — The Wasserstein Generative Adversarial Network, or Wasserstein GAN, is an extension to the generative adversarial network that both ...

[CITATION] How to Implement Wasserstein Loss for Generative Adversarial Networks

J Brownlee - Machine Learning Mastery, Jul, 2019

  Cited by 1

How to Develop a Wasserstein Generative Adversarial ...

https://machinelearningmastery.com › Blog


Jul 17, 2019 — This tutorial is divided into three parts; they are: Wasserstein Generative Adversarial Network; Wasserstein GAN Implementation Details; How to ...

How to Implement Wasserstein Loss for Generative ...
Jul 15, 2019 — In this post, you will discover how to implement Wasserstein loss for Generative Adversarial Networks. After reading this post, you will know:.
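
The loss these tutorials describe is short enough to state directly. Below is a hedged NumPy sketch of the usual label trick, not the tutorials' Keras code: real and generated critic scores are tagged with opposite signs and the loss is simply the mean of label times score, so minimizing it pushes real scores up and generated scores down. The generator is then trained on the same expression with the generated batch labeled as if it were real.

    import numpy as np

    def wasserstein_loss(y_true, y_pred):
        """Mean of label * critic score; the labels carry the sign convention."""
        return np.mean(y_true * y_pred)

    # One common convention: real samples get label -1, generated samples +1.
    critic_scores = np.array([0.8, 1.2, -0.5, -0.9])   # first two real, last two fake
    labels = np.array([-1.0, -1.0, +1.0, +1.0])
    print(wasserstein_loss(labels, critic_scores))      # -0.85; lower is better for the critic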


2019 [PDF] arxiv.org

How Well Do WGANs Estimate the Wasserstein Metric?

A MallastoG MontúfarA Gerolin - arXiv preprint arXiv:1910.03875, 2019 - arxiv.org

Generative modelling is often cast as minimizing a similarity measure between a data

distribution and a model distribution. Recently, a popular choice for the similarity measure

has been the Wasserstein metric, which can be expressed in the Kantorovich duality …

  Cited by 7 Related articles All 5 versions 

<——2019—–—2019 ——2230— 




[PDF] arxiv.org

Single image haze removal using conditional wasserstein generative adversarial networks

JP EbenezerB Das… - 2019 27th European …, 2019 - ieeexplore.ieee.org

We present a method to restore a clear image from a haze-affected image using a

Wasserstein generative adversarial network. As the problem is ill-conditioned, previous

methods have required a prior on natural images or multiple images of the same scene. We …

  Cited by 11 Related articles All 5 versions


[PDF] aaai.org

Wasserstein soft label propagation on hypergraphs: Algorithm and generalization error bounds

T GaoS AsoodehY HuangJ Evans - Proceedings of the AAAI …, 2019 - ojs.aaai.org

Inspired by recent interests of developing machine learning and data mining algorithms on

hypergraphs, we investigate in this paper the semi-supervised learning algorithm of

propagating” soft labels”(eg probability distributions, class membership scores) over …

  Cited by 4 Related articles All 13 versions 






2019


2019

Artifact correction in low‐dose dental CT imaging using Wasserstein generative adversarial networks

Z HuC Jiang, F Sun, Q Zhang, Y Ge, Y Yang… - Medical …, 2019 - Wiley Online Library

Purpose In recent years, health risks concerning high‐dose x‐ray radiation have become a

major concern in dental computed tomography (CT) examinations. Therefore, adopting low‐

dose computed tomography (LDCT) technology has become a major focus in the CT …

  Cited by 31 Related articles All 5 versions


 2019

[PDF] arxiv.org

Adaptive wasserstein hourglass for weakly supervised hand pose estimation from monocular RGB

Y Zhang, L Chen, Y Liu, J Yong, W Zheng - arXiv preprint arXiv …, 2019 - arxiv.org

Insufficient labeled training datasets is one of the bottlenecks of 3D hand pose estimation

from monocular RGB images. Synthetic datasets have a large number of images with

precise annotations, but the obvious difference with real-world datasets impacts the …

  Cited by 3 Related articles All 2 versions 


[PDF] arxiv.org

Tree-sliced variants of wasserstein distances

T LeM YamadaK FukumizuM Cuturi - arXiv preprint arXiv:1902.00342, 2019 - arxiv.org

Optimal transport (OT) theory defines a powerful set of tools to compare probability

distributions. OT suffers however from a few drawbacks, computational and statistical,

which have encouraged the proposal of several regularized variants of OT in the recent …

  Cited by 21 Related articles All 5 versions 

[CITATION] Supplementary Material for: Tree-Sliced Variants of Wasserstein Distances

T LeM YamadaK FukumizuM Cuturi

  Related articles



[PDF] arxiv.org

Fast Tree Variants of Gromov-Wasserstein

T LeN HoM Yamada - arXiv preprint arXiv:1910.04462, 2019 - arxiv.org

Gromov-Wasserstein (GW) is a powerful tool to compare probability measures whose

supports are in different metric spaces. GW suffers however from a computational drawback

since it requires to solve a complex non-convex quadratic program. We consider in this work …

  Cited by 3 Related articles 



[PDF] arxiv.org

Denoising of 3D magnetic resonance images using a residual encoder–decoder Wasserstein generative adversarial network

M Ran, J Hu, Y Chen, H Chen, H Sun, J Zhou… - Medical image …, 2019 - Elsevier

Abstract Structure-preserved denoising of 3D magnetic resonance imaging (MRI) images is

a critical step in medical image analysis. Over the past few years, many algorithms with

impressive performances have been proposed. In this paper, inspired by the idea of deep …

  Cited by 35 Related articles All 9 versions

<——2019—–—2019 ——2240— 


[PDF] researchgate.net

[PDF] Tree-sliced approximation of wasserstein distances

T LeM YamadaK Fukumizu… - arXiv preprint arXiv …, 2019 - researchgate.net

Optimal transport (OT) theory provides a useful set of tools to compare probability

distributions. As a consequence, the field of OT is gaining traction and interest within the

machine learning community. A few deficiencies usually associated with OT include its high …

  Cited by 4 Related articles 


[PDF] researchgate.net

[PDF] Computationally efficient tree variants of gromov-wasserstein

T Le, N Ho, M Yamada - arXiv preprint arXiv:1910.04462, 2019 - researchgate.net

We propose two novel variants of Gromov-Wasserstein (GW) between probability measures

in different probability spaces based on projecting these measures into the tree metric

spaces. Our first proposed discrepancy, named flow-based tree Gromov-Wasserstein …

  Cited by 1 Related articles All 5 versions 


[PDF] arxiv.org

Wasserstein-fisher-rao document distance

Z Wang, D Zhou, Y Zhang, H Wu, C Bao - arXiv preprint arXiv:1904.10294, 2019 - arxiv.org

As a fundamental problem of natural language processing, it is important to measure the

distance between different documents. Among the existing methods, the Word Mover's

Distance (WMD) has shown remarkable success in document semantic matching for its clear …

  Cited by 3 Related articles All 3 versions 


[PDF] arxiv.org

Tree-Wasserstein Barycenter for Large-Scale Multilevel Clustering and Scalable Bayes

T Le, V Huynh, N Ho, D Phung, M Yamada - arXiv preprint arXiv …, 2019 - arxiv.org

We study in this paper a variant of Wasserstein barycenter problem, which we refer to as tree-

Wasserstein barycenter, by leveraging a specific class of ground metrics, namely tree

metrics, for Wasserstein distance. Drawing on the tree structure, we propose an efficient …

  Related articles All 2 versions 


Adversarial Learning for Cross-Modal Retrieval with Wasserstein Distance

Q Cheng, Y Zhang, X Gu - International Conference on Neural Information …, 2019 - Springer

This paper presents a novel approach for cross-modal retrieval in an Adversarial Learning

with Wasserstein Distance (ALWD) manner, which aims at learning aligned representation

for various modalities in a GAN framework. The generator projects the image and the text …

  Related articles


 2019


 Painting halos from 3D dark matter fields using Wasserstein ...

https://paperswithcode.com › paper › painting-halos-fro...

Painting halos from 3D dark matter fields using Wasserstein mapping networks. 25 Mar 2019 • Doogesh Kodi Ramanah • Tom Charnock ...

[CITATION] Painting halos from 3D dark matter fields using Wasserstein mapping networks

D Kodi Ramanah, T Charnock, G Lavaux - arXiv preprint arXiv:1903.10524, 2019

 Related articles

[PDF] arxiv.org

On the total variation Wasserstein gradient flow and the TV-JKO scheme

G Carlier, C Poon - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

We study the JKO scheme for the total variation, characterize the optimizers, prove some of

their qualitative properties (in particular a form of maximum principle and in some cases, a

minimum principle as well). Finally, we establish a convergence result as the time step goes …

  Cited by 7 Related articles All 7 versions


Cross-domain Attention Network with Wasserstein Regularizers for E-commerce Search

M Qiu, B Wang, C Chen, X Zeng, J Huang… - Proceedings of the 28th …, 2019 - dl.acm.org

Product search and recommendation is a task that every e-commerce platform wants to

outperform their peers on. However, training a good search or recommendation model often

requires more data than what many platforms have. Fortunately, the search tasks on different …

Cited by 3 Related articles

[PDF] arxiv.org

Wasserstein distances for evaluating cross-lingual embeddings

G Balikas, I Partalas - arXiv preprint arXiv:1910.11005, 2019 - arxiv.org

Word embeddings are high dimensional vector representations of words that capture their

semantic similarity in the vector space. There exist several algorithms for learning such

embeddings both for a single language as well as for several languages jointly. In this work …

  Related articles All 3 versions 


[PDF] arxiv.org

Wasserstein total variation filtering

E Varol, A Nejatbakhsh - arXiv preprint arXiv:1910.10822, 2019 - arxiv.org

In this paper, we expand upon the theory of trend filtering by introducing the use of the

Wasserstein metric as a means to control the amount of spatiotemporal variation in filtered

time series data. While trend filtering utilizes regularization to produce signal estimates that …

  Related articles All 2 versions 

<——2019—–—2019 ——2250—


[PDF] arxiv.org

Wasserstein Distance Guided Cross-Domain Learning

J Su - arXiv preprint arXiv:1910.07676, 2019 - arxiv.org

Domain adaptation aims to generalise a high-performance learner on target domain (non-

labelled data) by leveraging the knowledge from source domain (rich labelled data) which

comes from a different but related distribution. Assuming the source and target domains data …

  Related articles All 2 versions 


Adversarial Learning for Cross-Modal Retrieval with Wasserstein Distance

Q Cheng, Y Zhang, X Gu - International Conference on Neural Information …, 2019 - Springer

This paper presents a novel approach for cross-modal retrieval in an Adversarial Learning

with Wasserstein Distance (ALWD) manner, which aims at learning aligned representation

for various modalities in a GAN framework. The generator projects the image and the text …

  Related articles


[PDF] arxiv.org

Approximation of stable law in Wasserstein-1 distance by Stein's method

L Xu - Annals of Applied Probability, 2019 - projecteuclid.org

Abstract Let $n\in\mathbb N$, let $\zeta_{n,1},\ldots,\zeta_{n,n}$ be a sequence of

independent random variables with $\mathbb E\zeta_{n,i}=0$ and $\mathbb E|\zeta_{n,i}|<\infty$ for each

$i$, and let $\mu$ be an $\alpha$-stable distribution having …

  Cited by 20 Related articles All 7 versions


[PDF] arxiv.org

2-wasserstein approximation via restricted convex potentials with application to improved training for gans

A Taghvaei, A Jalali - arXiv preprint arXiv:1902.07197, 2019 - arxiv.org

We provide a framework to approximate the 2-Wasserstein distance and the optimal

transport map, amenable to efficient training as well as statistical and geometric analysis.

With the quadratic cost and considering the Kantorovich dual form of the optimal …

  Cited by 9 Related articles All 3 versions 


[PDF] arxiv.org

Multivariate stable approximation in Wasserstein distance by Stein's method

P Chen, I Nourdin, L Xu, X Yang - arXiv preprint arXiv:1911.12917, 2019 - arxiv.org

We investigate regularity properties of the solution to Stein's equation associated with

multivariate integrable $\alpha $-stable distribution for a general class of spectral measures

and Lipschitz test functions. The obtained estimates induce an upper bound in Wasserstein  …

  Cited by 4 Related articles All 4 versions 


2019


[PDF] researchgate.net

[PDF] Tree-sliced approximation of wasserstein distances

T Le, M Yamada, K Fukumizu… - arXiv preprint arXiv …, 2019 - researchgate.net

Optimal transport (OT) theory provides a useful set of tools to compare probability

distributions. As a consequence, the field of OT is gaining traction and interest within the

machine learning community. A few deficiencies usually associated with OT include its high …

  Cited by 4 Related articles 


[PDF] projecteuclid.org

Wasserstein-2 bounds in normal approximation under local dependence

X Fang - Electronic Journal of Probability, 2019 - projecteuclid.org

We obtain a general bound for the Wasserstein-2 distance in normal approximation for sums

of locally dependent random variables. The proof is based on an asymptotic expansion for

expectations of second-order differentiable functions of the sum. We apply the main result to …

  Cited by 4 Related articles All 3 versions


[PDF] arxiv.org

Approximation of Wasserstein distance with Transshipment

N Papadakis - arXiv preprint arXiv:1901.09400, 2019 - arxiv.org

An algorithm for approximating the p-Wasserstein distance between histograms defined on

unstructured discrete grids is presented. It is based on the computation of a barycenter

constrained to be supported on a low dimensional subspace, which corresponds to a …

  Cited by 2 Related articles All 5 versions 


[PDF] arxiv.org

A measure approximation theorem for Wasserstein-robust expected values

G van Zyl - arXiv preprint arXiv:1912.12119, 2019 - arxiv.org

We consider the problem of finding the infimum, over probability measures being in a ball

defined by Wasserstein distance, of the expected value of a bounded Lipschitz random

variable on $\mathbf{R}^d$. We show that if the $\sigma$-algebra is approximated in by a …

  Related articles All 2 versions 


 

[PDF] arxiv.org

Local Bures-Wasserstein Transport: A Practical and Fast Mapping Approximation

A Hoyos-Idrobo - arXiv preprint arXiv:1906.08227, 2019 - arxiv.org

Optimal transport (OT)-based methods have a wide range of applications and have attracted

a tremendous amount of attention in recent years. However, most of the computational

approaches of OT do not learn the underlying transport map. Although some algorithms …

  Related articles All 2 versions 

[CITATION] Local Bures-Wasserstein Transport: A Practical and Fast Mapping Approximation.

<——2019—–—2019 ——2260— 



Approximation and Wasserstein distance for self-similar measures on the unit interval

E Lichtenegger, R Niedzialomski - Journal of Mathematical Analysis and …, 2019 - Elsevier

We study the Wasserstein distance between self-similar measures associated to two non-

overlapping linear contractions of the unit interval. The main theorem gives an explicit

formula for the Wasserstein distance between iterations of certain discrete approximations of …

  Related articles All 2 versions



Approximation of Discounted Minimax Markov Control Problems and Zero-Sum Markov Games Using Hausdorff and Wasserstein Distances

F Dufour, T Prieto-Rumeau - Dynamic Games and Applications, 2019 - Springer

This paper is concerned with a minimax control problem (also known as a robust Markov

decision process (MDP) or a game against nature) with general state and action spaces

under the discounted cost optimality criterion. We are interested in approximating …

  Related articles All 6 versions


[PDF] tum.de

Structure preserving discretization and approximation of gradient flows in Wasserstein-like space

S Plazotta - 2019 - mediatum.ub.tum.de

This thesis investigates structure-preserving, temporal semi-discretizations and

approximations for PDEs with gradient flow structure with the application to evolution

problems in the L²-Wasserstein space. We investigate the variational formulation of the time …

  Related articles All 3 versions 


[PDF] arxiv.org

A partial Laplacian as an infinitesimal generator on the Wasserstein space

YT Chow, W Gangbo - Journal of Differential Equations, 2019 - Elsevier

In this manuscript, we consider special linear operators which we term partial Laplacians on

the Wasserstein space, and which we show to be partial traces of the Wasserstein Hessian.

We verify a distinctive smoothing effect of the “heat flows” they generated for a particular …

  Cited by 13 Related articles All 9 versions


[HTML] sciencedirect.com

[HTML] Wasserstein GAN-based small-sample augmentation for new-generation artificial intelligence: a case study of cancer-staging data in biology

Y Liu, Y Zhou, X Liu, F Dong, C Wang, Z Wang - Engineering, 2019 - Elsevier

It is essential to utilize deep-learning algorithms based on big data for the implementation of

the new generation of artificial intelligence. Effective utilization of deep learning relies

considerably on the number of labeled samples, which restricts the application of deep …

  Cited by 37 Related articles All 5 versions


2019


Projection in the 2-Wasserstein sense on structured measure space

L Lebrat - 2019 - tel.archives-ouvertes.fr

This thesis focuses on the approximation for the 2-Wasserstein metric of probability

measures by structured measures. The set of structured measures under consideration is

made of consistent discretizations of measures carried by a smooth curve with a bounded …

  All 2 versions 


[PDF] amazonaws.com

[PDF] Supplement to “Optimal Estimation of Wasserstein Distance on A Tree with An Application to Microbiome Studies”

S Wang, TT Cai, H Li - pstorage-tf-iopjsd8797887.s3 …

Page 1. Supplement to “Optimal Estimation of Wasserstein Distance on A Tree with An Application

to Microbiome Studies” Shulei Wang, T. Tony Cai and Hongzhe Li University of Pennsylvania In

this supplementary material, we provide the proof for the main results (Section S1) and all the …

  Related articles All 3 versions 


2019

[PDF] researchgate.net

Infrared and Visible Image Fusion via Multi-discriminators Wasserstein Generative Adversarial Network

J Li, H Huo, K Liu, C Li, S Li… - 2019 18th IEEE …, 2019 - ieeexplore.ieee.org

Generative adversarial network (GAN) has been widely applied to infrared and visible image

fusion. However, the existing GAN-based image fusion methods only establish one

discriminator in the network to make the fused image capture gradient information from the …

  Cited by 1 Related articles All 3 versions

 

 2019

[PDF] arxiv.org

Optimal Fusion of Elliptic Extended Target Estimates Based on the Wasserstein Distance

K Thormann, M Baum - … on Information Fusion (FUSION), 2019 - ieeexplore.ieee.org

This paper considers the fusion of multiple estimates of a spatially extended object, where

the object extent is modeled as an ellipse parameterized by the orientation and semi-axes

lengths. For this purpose, we propose a novel systematic approach that employs a distance …

  Cited by 1 Related articles All 5 versions


2019

[PDF] springer.com

Convergence to equilibrium in Wasserstein distance for damped Euler equations with interaction forces

JA Carrillo, YP Choi, O Tse - Communications in Mathematical Physics, 2019 - Springer

We develop tools to construct Lyapunov functionals on the space of probability measures in

order to investigate the convergence to global equilibrium of a damped Euler system under

the influence of external and interaction potential forces with respect to the 2-Wasserstein  …

  Cited by 13 Related articles All 11 versions

 <——2019—–—2019 ——2270—

 

[PDF] ieee.org

Multi-source medical image fusion based on Wasserstein generative adversarial networks

Z Yang, Y Chen, Z Le, F Fan, E Pan - IEEE Access, 2019 - ieeexplore.ieee.org

In this paper, we propose the medical Wasserstein generative adversarial networks

(MWGAN), an end-to-end model, for fusing magnetic resonance imaging (MRI) and positron

emission tomography (PET) medical images. Our method establishes two adversarial …

  Cited by 8 Related articles


2019

[PDF] arxiv.org

Hypothesis Test and Confidence Analysis with Wasserstein Distance with General Dimension

M Imaizumi, H Ota, T Hamaguchi - arXiv preprint arXiv:1910.07773, 2019 - arxiv.org

We develop a general framework for statistical inference with the Wasserstein distance.

Recently, the Wasserstein distance has attracted much attention and been applied to

various machine learning tasks due to its celebrated properties. Despite the importance …

  Cited by 1 Related articles All 2 versions 
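
One concrete way to turn the distance into a two-sample test, loosely in the spirit of the inference framework described above, is to use it as a permutation-test statistic; the sketch below is a generic illustration, not the authors' procedure (sample sizes, permutation count and toy data are arbitrary):

import numpy as np
from scipy.stats import wasserstein_distance
def wasserstein_perm_test(x, y, n_perm=2000, seed=0):
    # two-sample permutation test with the 1-D Wasserstein distance as test statistic
    rng = np.random.default_rng(seed)
    obs = wasserstein_distance(x, y)
    pooled = np.concatenate([x, y])
    exceed = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        if wasserstein_distance(perm[:len(x)], perm[len(x):]) >= obs:
            exceed += 1
    return obs, (exceed + 1) / (n_perm + 1)   # statistic and permutation p-value
rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 200)
y = rng.normal(0.3, 1.0, 220)
print(wasserstein_perm_test(x, y))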


 

 

 2019

[PDF] esaim-cocv.org

Dynamic models of Wasserstein-1-type unbalanced transport

B Schmitzer, B Wirth - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

We consider a class of convex optimization problems modelling temporal mass transport

and mass change between two given mass distributions (the so-called dynamic formulation

of unbalanced transport), where we focus on those models for which transport costs are …

  Cited by 6 Related articles All 5 versions



2019

[PDF] arxiv.org

Convergence of some classes of random flights in Wasserstein distance

A Falaleev, V Konakov - arXiv preprint arXiv:1910.03862, 2019 - arxiv.org

In this paper we consider a random walk of a particle in $\mathbb {R}^ d $. Convergence of

different transformations of trajectories of random flights with Poisson switching moments

has been obtained by Davydov and Konakov, as well as diffusion approximation of the …

  Related articles All 2 versions 


2019

[PDF] arxiv.org

Statistical aspects of Wasserstein distances

VM Panaretos, Y Zemel - Annual review of statistics and its …, 2019 - annualreviews.org

Wasserstein distances are metrics on probability distributions inspired by the problem of

optimal mass transportation. Roughly speaking, they measure the minimal effort required to

reconfigure the probability mass of one distribution in order to recover the other distribution …

  Cited by 114 Related articles All 10 versions
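
For readers skimming this review entry, the one-dimensional case is the easiest place to see the "minimal effort to reconfigure mass" interpretation, and scipy already ships an empirical estimator for it. A small illustration (the Gaussian toy samples are only for demonstration):

import numpy as np
from scipy.stats import wasserstein_distance
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=5000)   # samples from N(0, 1)
y = rng.normal(0.5, 1.0, size=5000)   # samples from N(0.5, 1)
# for a pure location shift between two identical shapes, W1 equals the shift (about 0.5 here)
print(wasserstein_distance(x, y))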



2019


[PDF] arxiv.org

How Well Do WGANs Estimate the Wasserstein Metric?

A Mallasto, G Montúfar, A Gerolin - arXiv preprint arXiv:1910.03875, 2019 - arxiv.org

Generative modelling is often cast as minimizing a similarity measure between a data

distribution and a model distribution. Recently, a popular choice for the similarity measure

has been the Wasserstein metric, which can be expressed in the Kantorovich duality …

  Cited by 6 Related articles All 5 versions 


[PDF] mlr.press

Sliced-Wasserstein flows: Nonparametric generative modeling via optimal transport and diffusions

A Liutkus, U Simsekli, S Majewski… - International …, 2019 - proceedings.mlr.press

By building upon the recent theory that established the connection between implicit

generative modeling (IGM) and optimal transport, in this study, we propose a novel

parameter-free algorithm for learning the underlying distributions of complicated datasets …

  Cited by 40 Related articles All 7 versions 
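
The sliced construction behind this entry reduces a d-dimensional comparison to many one-dimensional ones: project both samples onto random directions, solve each 1-D problem by sorting, and average. A minimal numpy sketch of the sliced 2-Wasserstein distance for equal-size samples (the function name, number of projections and toy data are assumptions of this sketch):

import numpy as np
def sliced_wasserstein_2(X, Y, n_proj=200, seed=0):
    # Monte-Carlo sliced W2 between two equal-size point clouds in R^d
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=X.shape[1])
        theta /= np.linalg.norm(theta)
        # 1-D optimal transport between equal-weight samples: match sorted projections
        px, py = np.sort(X @ theta), np.sort(Y @ theta)
        total += np.mean((px - py) ** 2)
    return np.sqrt(total / n_proj)
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 5))
Y = rng.normal(size=(1000, 5)) + 1.0
print(sliced_wasserstein_2(X, Y))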


2019

[PDF] arxiv.org

Alpha Procrustes metrics between positive definite operators: a unifying formulation for the Bures-Wasserstein and Log-Euclidean/Log-Hilbert-Schmidt metrics

MH Quang - arXiv preprint arXiv:1908.09275, 2019 - arxiv.org

This work presents a parametrized family of distances, namely the Alpha Procrustes

distances, on the set of symmetric, positive definite (SPD) matrices. The Alpha Procrustes

distances provide a unified formulation encompassing both the Bures-Wasserstein and Log …

  Cited by 4 Related articles All 2 versions 
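
For context, the Bures-Wasserstein term in this family has a well-known closed form on SPD matrices, namely the 2-Wasserstein distance between the corresponding zero-mean Gaussians. A short numpy/scipy sketch (the two matrices are arbitrary SPD examples, not taken from the paper):

import numpy as np
from scipy.linalg import sqrtm
def bures_wasserstein(A, B):
    # d(A, B) = sqrt( tr A + tr B - 2 tr (A^{1/2} B A^{1/2})^{1/2} )
    rA = sqrtm(A)
    cross = sqrtm(rA @ B @ rA)
    return np.sqrt(max(np.trace(A) + np.trace(B) - 2.0 * np.trace(cross).real, 0.0))
A = np.array([[2.0, 0.3], [0.3, 1.0]])
B = np.array([[1.0, 0.0], [0.0, 1.5]])
print(bures_wasserstein(A, B))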


[PDF] thecvf.com

Normalized wasserstein for mixture distributions with applications in adversarial learning and domain adaptation

Y Balaji, R Chellappa, S Feizi - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

Understanding proper distance measures between distributions is at the core of several

learning tasks such as generative models, domain adaptation, clustering, etc. In this work,

we focus on mixture distributions that arise naturally in several application domains where …

  Cited by 14 Related articles All 4 versions 


2019

[PDF] arxiv.org

Normalized Wasserstein distance for mixture distributions with applications in adversarial learning and domain adaptation

Y Balaji, R Chellappa, S Feizi - arXiv preprint arXiv:1902.00415, 2019 - arxiv.org

Understanding proper distance measures between distributions is at the core of several

learning tasks such as generative models, domain adaptation, clustering, etc. In this work,

we focus on mixture distributions that arise naturally in several application domains where …

  Cited by 5 Related articles All 2 versions 

<——2019—–—2019 ——2280—


[CITATION] A general solver to the elliptical mixture model through an approximate wasserstein manifold

S Li, Z Yu, M Xiang, D Mandic - arXiv preprint arXiv:1906.03700, 2019

  Cited by 1 Related articles


2019

On potentials of regularized Wasserstein generative adversarial networks for realistic hallucination of tiny faces

WZ Shao, JJ Xu, L Chen, Q Ge, LQ Wang, BK Bao… - Neurocomputing, 2019 - Elsevier

Super-resolution of facial images, aka face hallucination, has been intensively studied in the

past decades due to the increasingly emerging analysis demands in video surveillance, eg,

face detection, verification, identification. However, the actual performance of most previous …

  Cited by 3 Related articles All 3 versions



2019

[PDF] arxiv.org

Stylized Text Generation Using Wasserstein Autoencoders with a Mixture of Gaussian Prior

A Ghabussi, L Mou, O Vechtomova - arXiv preprint arXiv:1911.03828, 2019 - arxiv.org

Wasserstein autoencoders are effective for text generation. They do not however provide

any control over the style and topic of the generated sentences if the dataset has multiple

classes and includes different topics. In this work, we present a semi-supervised approach …

  Related articles All 2 versions 

year 2019

[PDF] utwente.nl

Wasserstein Generative Adversarial Privacy Networks

KE Mulder - 2019 - essay.utwente.nl

A method to filter private data from public data using generative adversarial networks has

been introduced in an article" Generative Adversarial Privacy" by Chong Huang et al. in

2018. We attempt to reproduce their results, and build further upon their work by introducing …

  Related articles All 2 versions 




2019

[PDF] arxiv.org

Using wasserstein-2 regularization to ensure fair decisions with neural-network classifiers

L Risser, Q Vincenot, N Couellan… - arXiv preprint arXiv …, 2019 - arxiv.org

In this paper, we propose a new method to build fair Neural-Network classifiers by using a

constraint based on the Wasserstein distance. More specifically, we detail how to efficiently

compute the gradients of Wasserstein-2 regularizers for Neural-Networks. The proposed …

  Cited by 9 Related articles All 2 versions 
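
In one dimension the Wasserstein-2 quantity such a regularizer needs can be read off empirical quantile functions, which is what makes its gradients cheap to evaluate. A minimal sketch of that quantile-based W2^2 between per-group classifier scores (the helper name, quantile grid, penalty weight and toy score arrays are illustrative assumptions, not the authors' implementation):

import numpy as np
def w2_squared_1d(a, b, n_quantiles=100):
    # W2^2 between two 1-D samples via their empirical quantile functions:
    # W2^2 = integral over t in (0,1) of |F_a^{-1}(t) - F_b^{-1}(t)|^2 dt
    qs = np.linspace(0.0, 1.0, n_quantiles)
    return float(np.mean((np.quantile(a, qs) - np.quantile(b, qs)) ** 2))
rng = np.random.default_rng(0)
scores_g0 = rng.beta(2, 5, size=400)   # classifier scores for group 0 (toy data)
scores_g1 = rng.beta(3, 4, size=350)   # classifier scores for group 1 (toy data)
# a fairness penalty of this kind would add lam * w2_squared_1d(scores_g0, scores_g1) to the loss
print(w2_squared_1d(scores_g0, scores_g1))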


2019


[PDF] researchgate.net

[PDF] Fairness with wasserstein adversarial networks

M Serrurier, JM Loubes, E Pauwels - 2019 - researchgate.net

Quantifying, enforcing and implementing fairness emerged as a major topic in machine

learning. We investigate these questions in the context of deep learning. Our main

algorithmic and theoretical tool is the computational estimation of similarities between …

  Cited by 4 Related articles 

[PDF] openreview.net

Fairness with Wasserstein Adversarial Networks

L Jean-Michel, E Pauwels - 2019 - openreview.net

Quantifying, enforcing and implementing fairness emerged as a major topic in machine

learning. We investigate these questions in the context of deep learning. Our main

algorithmic and theoretical tool is the computational estimation of similarities between …

  Related articles 




2019

[PDF] ncl.ac.uk

Generative adversarial framework for depth filling via wasserstein metric, cosine transform and domain transfer

A Atapour-Abarghouei, S Akcay… - Pattern Recognition, 2019 - Elsevier

In this work, the issue of depth filling is addressed using a self-supervised feature learning

model that predicts missing depth pixel values based on the context and structure of the

scene. A fully-convolutional generative model is conditioned on the available depth …

  Cited by 17 Related articles All 4 versions


2019  [PDF] researchgate.net

Full View

Data‐driven affinely adjustable distributionally robust framework for unit commitment based on Wasserstein metric

W Hou, R Zhu, H Wei… - IET Generation …, 2019 - Wiley Online Library

This study proposes a data‐driven distributionally robust framework for unit commitment

based on Wasserstein metric considering the wind power generation forecasting errors. The

objective of the constructed model is to minimise the expected operating cost, including the …

  Cited by 10 Related articles All 3 versions 


 

2019  [PDF] arxiv.org

A First-Order Algorithmic Framework for Wasserstein Distributionally Robust Logistic Regression

J Li, S Huang, AMC So - arXiv preprint arXiv:1910.12778, 2019 - arxiv.org

Wasserstein distance-based distributionally robust optimization (DRO) has received much

attention lately due to its ability to provide a robustness interpretation of various learning

models. Moreover, many of the DRO problems that arise in the learning context admits exact …

  Cited by 4 Related articles All 7 versions 


 

2019

[PDF] projecteuclid.org

Hybrid Wasserstein distance and fast distribution clustering

I Verdinelli, L Wasserman - Electronic Journal of Statistics, 2019 - projecteuclid.org

We define a modified Wasserstein distance for distribution clustering which inherits many of

the properties of the Wasserstein distance but which can be estimated easily and computed

quickly. The modified distance is the sum of two terms. The first term—which has a closed …

  Cited by 2 Related articles All 5 versions

<——2019—–—2019 ——2290— 


 Estimation of monthly reference evapotranspiration using novel hybrid machine learning approaches

Y Tikhamarine, A Malik, A Kumar… - Hydrological …, 2019 - Taylor & Francis

… SC) and genetic expression programming (GEP) to predict monthly ET o from 50 weather stations

located in … The following approach was used in this study: $m = 2n + 1$ (2). Estimation of monthly

reference evapotranspiration using novel hybrid machine learning approaches …

  Cited by 29 Related articles All 3 versions



[HTML] oup.com

The gromov–wasserstein distance between networks and stable network invariants

S Chowdhury, F Mémoli - Information and Inference: A Journal of …, 2019 - academic.oup.com

We define a metric—the network Gromov–Wasserstein distance—on weighted, directed

networks that is sensitive to the presence of outliers. In addition to proving its theoretical

properties, we supply network invariants based on optimal transport that approximate this …

  Cited by 25 Related articles All 5 versions


[PDF] arxiv.org

A bound on the Wasserstein-2 distance between linear combinations of independent random variables

B Arras, E Azmoodeh, G Poly, Y Swan - Stochastic processes and their …, 2019 - Elsevier

We provide a bound on a distance between finitely supported elements and general

elements of the unit sphere of 2 (N). We use this bound to estimate the Wasserstein-2

distance between random variables represented by linear combinations of independent …

  Cited by 21 Related articles All 15 versions


2019

[PDF] arxiv.org

Strong equivalence between metrics of Wasserstein type

E Bayraktar, G Guo - arXiv preprint arXiv:1912.08247, 2019 - arxiv.org

The sliced Wasserstein and more recently max-sliced Wasserstein metrics $W_p$ have

attracted abundant attention in data sciences and machine learning due to its advantages to

tackle the curse of dimensionality. A question of particular importance is the strong …

  Cited by 3 Related articles All 2 versions 


[PDF] inria.fr

On a Wasserstein-type distance between solutions to stochastic differential equations

J Bion–Nadal, D Talay - The Annals of Applied Probability, 2019 - projecteuclid.org

In this paper, we introduce a Wasserstein-type distance on the set of the probability

distributions of strong solutions to stochastic differential equations. This new distance is

defined by restricting the set of possible coupling measures. We prove that it may also be …

  Cited by 13 Related articles All 9 versions


 2019


[PDF] arxiv.org

Alpha Procrustes metrics between positive definite operators: a unifying formulation for the Bures-Wasserstein and Log-Euclidean/Log-Hilbert-Schmidt metrics

MH Quang - arXiv preprint arXiv:1908.09275, 2019 - arxiv.org

This work presents a parametrized family of distances, namely the Alpha Procrustes

distances, on the set of symmetric, positive definite (SPD) matrices. The Alpha Procrustes

distances provide a unified formulation encompassing both the Bures-Wasserstein and Log …

  Cited by 4 Related articles All 2 versions 


Bounding quantiles of Wasserstein distance between true and empirical measure

SN Cohen, MNA Tegnér, J Wiesel - arXiv preprint arXiv:1907.02006, 2019 - arxiv.org

Consider the empirical measure, $\hat{\mathbb P}_N$, associated to $N$ iid samples of

a given probability distribution $\mathbb P$ on the unit interval. For fixed $\mathbb P$

the Wasserstein distance between $\hat{\mathbb P}_N$ and $\mathbb P$ is a random …

  Related articles All 4 versions 


2019 see 2020

[CITATION] Bridging the Gap Between f-GANs and Wasserstein GANs. arXiv e-prints, page

J Song, S Ermon - arXiv preprint arXiv:1910.09779, 2019

  Cited by 3 Related articles


Personalized Multi-Turn Chatbot Based on Dual WGAN

S Oh, JT Kim, H Kim, JE Lee, S Kim… - Annual Conference on …, 2019 - koreascience.or.kr

A chatbot is a system in which a person and a computer exchange conversation in natural language. As research on chatbots

has recently become more active, studies have moved beyond merely mechanical responses toward chatbots that reflect the personal characteristics a user wants …

All 2 versions 

 

2019 [PDF] arxiv.org

1-Wasserstein Distance on the Standard Simplex

A Frohmader, H Volkmer - arXiv preprint arXiv:1912.04945, 2019 - arxiv.org

Wasserstein distances provide a metric on a space of probability measures. We consider the

space $\Omega$ of all probability measures on the finite set $\chi=\{1,\dots,n\}$, where $n$

is a positive integer. The 1-Wasserstein distance $W_1(\mu,\nu)$ is a function from …

  Cited by 1 Related articles All 2 versions 
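
On the ground set {1, …, n} with cost |i − j|, the 1-Wasserstein distance between two probability vectors collapses to the L1 distance between their cumulative sums, which is a convenient sanity check when reading results of this kind. A short numpy sketch (the example vectors are arbitrary):

import numpy as np
def w1_on_simplex(mu, nu):
    # W1 between probability vectors on {1,...,n} with ground cost |i - j|
    return float(np.sum(np.abs(np.cumsum(mu - nu))))
mu = np.array([0.5, 0.2, 0.3])
nu = np.array([0.1, 0.6, 0.3])
print(w1_on_simplex(mu, nu))   # 0.4: move 0.4 of mass one step to the right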

<——2019—–—2019 ——2300— 


[PDF] sns.it

Sensitivity of the Compliance and of the Wasserstein Distance with Respect to a Varying Source

G BouchittéI FragalàI Lucardesi - Applied Mathematics & Optimization, 2019 - Springer

We show that the compliance functional in elasticity is differentiable with respect to

horizontal variations of the load term, when the latter is given by a possibly concentrated

measure; moreover, we provide an integral representation formula for the derivative as a …

  Related articles All 9 versions


[PDF] sciencedirect.com

Harmonic mappings valued in the Wasserstein space

H Lavenant - Journal of Functional Analysis, 2019 - Elsevier

We propose a definition of the Dirichlet energy (which is roughly speaking the integral of the

square of the gradient) for mappings $\mu:\Omega\to(\mathcal P(D),W_2)$ defined over a subset $\Omega$ of $\mathbb R^p$ and

valued in the space $\mathcal P(D)$ of probability measures on a compact convex subset $D$ of $\mathbb R^q$ …

  Cited by 12 Related articles All 12 versions


2019

[PDF] arxiv.org

Disentangled representation learning with Wasserstein total correlation

Y Xiao, WY Wang - arXiv preprint arXiv:1912.12818, 2019 - arxiv.org

Unsupervised learning of disentangled representations involves uncovering of different

factors of variations that contribute to the data generation process. Total correlation

penalization has been a key component in recent methods towards disentanglement …

  Cited by 3 Related articles All 2 versions 


2019  [PDF] arxiv.org

Closed‐form Expressions for Maximum Mean Discrepancy with Applications to Wasserstein Auto‐Encoders

RM Rustamov - Stat, 2019 - Wiley Online Library

Abstract The Maximum Mean Discrepancy (MMD) has found numerous applications in

statistics and machine learning, most recently as a penalty in the Wasserstein Auto‐Encoder

(WAE). In this paper we compute closed‐form expressions for estimating the Gaussian …

  Cited by 5 Related articles All 3 versions
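
For comparison with the closed-form expressions this entry refers to, the plain Monte-Carlo estimate of the squared MMD with a Gaussian kernel is already short. A small numpy sketch (the bandwidth sigma and the toy "encoded data vs. prior samples" are assumptions of this sketch):

import numpy as np
def mmd2_gaussian(X, Y, sigma=1.0):
    # biased estimate of MMD^2 = E k(x,x') + E k(y,y') - 2 E k(x,y), with a Gaussian kernel k
    def gram(A, B):
        d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
        return np.exp(-d2 / (2 * sigma**2))
    return float(gram(X, X).mean() + gram(Y, Y).mean() - 2 * gram(X, Y).mean())
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 2))          # e.g. encoded data points
Y = rng.normal(size=(300, 2)) + 0.3    # e.g. samples from the prior
print(mmd2_gaussian(X, Y))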


[PDF] arxiv.org

On the total variation Wasserstein gradient flow and the TV-JKO scheme

G Carlier, C Poon - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

We study the JKO scheme for the total variation, characterize the optimizers, prove some of

their qualitative properties (in particular a form of maximum principle and in some cases, a

minimum principle as well). Finally, we establish a convergence result as the time step goes …

  Cited by 8 Related articles All 7 versions


2019


[PDF] arxiv.org

Wasserstein total variation filtering

E Varol, A Nejatbakhsh - arXiv preprint arXiv:1910.10822, 2019 - arxiv.org

In this paper, we expand upon the theory of trend filtering by introducing the use of the

Wasserstein metric as a means to control the amount of spatiotemporal variation in filtered

time series data. While trend filtering utilizes regularization to produce signal estimates that …

  Related articles All 2 versions 


 2019[PDF] arxiv.org

Randomized filtering and Bellman equation in Wasserstein space for partial observation control problem

E Bandini, A Cosso, M Fuhrman, H Pham - Stochastic Processes and their …, 2019 - Elsevier

We study a stochastic optimal control problem for a partially observed diffusion. By using the

control randomization method in Bandini et al.(2018), we prove a corresponding

randomized dynamic programming principle (DPP) for the value function, which is obtained …

  Cited by 17 Related articles All 13 versions


2019  [PDF] arxiv.org

How Well Do WGANs Estimate the Wasserstein Metric?

A Mallasto, G Montúfar, A Gerolin - arXiv preprint arXiv:1910.03875, 2019 - arxiv.org

Generative modelling is often cast as minimizing a similarity measure between a data

distribution and a model distribution. Recently, a popular choice for the similarity measure

has been the Wasserstein metric, which can be expressed in the Kantorovich duality …

  Cited by 6 Related articles All 5 versions 


Differentially Private Synthetic Mixed-Type Data Generation ...

https://arxiv.org › pdf


by U Tantipongpipat · 2019 · Cited by 1 — More recently, [1] introduced a framework for training deep learning models with differential privacy, which involved adding Gaussian noise to a ...

[CITATION] Differential Privacy Synthetic Data Generation using WGANs, 2019

M Alzantot, M Srivastava - URL https://github. com/nesl/nist_differential_privacy …

  Cited by 5 Related articles

[PDF] arxiv.org

Inequalities for the Wasserstein mean of positive definite matrices

R Bhatia, T Jain, Y Lim - Linear Algebra and its Applications, 2019 - Elsevier

Let $A_1,\ldots,A_m$ be given positive definite matrices and let $w=(w_1,\ldots,w_m)$ be a vector of

weights; i.e., $w_j\ge 0$ and $\sum_{j=1}^m w_j=1$. Then the (weighted) Wasserstein mean, or the Wasserstein

barycentre, of $A_1,\ldots,A_m$ is defined as (2) $\Omega(w;A_1,\ldots,A_m)=\operatorname{argmin}_{X}\sum_{j=1}^{m}w$ …

  Cited by 12 Related articles All 5 versions
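
Numerically, the Wasserstein mean discussed here is typically obtained with the standard fixed-point iteration for Gaussian barycenters, which is handy for checking such inequalities on examples. A rough numpy/scipy sketch under that assumption (the matrices, weights and iteration count are illustrative; this is not an algorithm from the paper):

import numpy as np
from scipy.linalg import sqrtm, inv
def wasserstein_mean(mats, weights, n_iter=50):
    # fixed point: S <- S^{-1/2} ( sum_j w_j (S^{1/2} A_j S^{1/2})^{1/2} )^2 S^{-1/2}
    S = sum(w * A for w, A in zip(weights, mats))   # warm start at the arithmetic mean
    for _ in range(n_iter):
        R = sqrtm(S)
        Rinv = inv(R)
        T = sum(w * sqrtm(R @ A @ R) for w, A in zip(weights, mats))
        S = (Rinv @ T @ T @ Rinv).real
    return S
A1 = np.array([[2.0, 0.2], [0.2, 1.0]])
A2 = np.array([[1.0, -0.1], [-0.1, 3.0]])
print(wasserstein_mean([A1, A2], [0.5, 0.5]))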

<——2019—–—2019 ——2310— 


[PDF] arxiv.org

Refined basic couplings and Wasserstein-type distances for SDEs with Lévy noises

D Luo, J Wang - Stochastic Processes and their Applications, 2019 - Elsevier

We establish the exponential convergence with respect to the L 1-Wasserstein distance and

the total variation for the semigroup corresponding to the stochastic differential equation d X

t= d Z t+ b (X t) dt, where (Z t) t≥ 0 is a pure jump Lévy process whose Lévy measure ν fulfills …

  Cited by 17 Related articles All 7 versions


[PDF] arxiv.org

Wasserstein metric-driven Bayesian inversion with applications to signal processing

M Motamed, D Appelo - International Journal for Uncertainty …, 2019 - dl.begellhouse.com

We present a Bayesian framework based on a new exponential likelihood function driven by

the quadratic Wasserstein metric. Compared to conventional Bayesian models based on

Gaussian likelihood functions driven by the least-squares norm (L 2 norm), the new …

  Cited by 8 Related articles All 3 versions


Generating EEG signals of an RSVP experiment by a class conditioned wasserstein generative adversarial network

S Panwar, P Rad, J Quarles… - 2019 IEEE International …, 2019 - ieeexplore.ieee.org

Electroencephalography (EEG) data is difficult to obtain due to complex experimental setups

and reduced comfort due to prolonged wearing. This poses challenges to train powerful

deep learning model due to the limited EEG data. Hence, being able to generate EEG data …

  Cited by 7 Related articles All 2 versions


 

2019

Wasserstein Adversarial Regularization (WAR) on label noise

https://arxiv.org › cs

by BB Damodaran · 2019 · Cited by 4 — Computer Science > Machine Learning. arXiv:1904.03936 (cs). [Submitted on 8 Apr 2019 (v1), last revised 29 Jun 2021 (this version, v3)] ...

Wasserstein adversarial regularization (WAR) on label noise


2019

[PDF] uni-bielefeld.de

[PDF] Diffusions and PDEs on Wasserstein space

FY Wang - arXiv preprint arXiv:1903.02148, 2019 - sfb1283.uni-bielefeld.de

We propose a new type of SDE, whose coefficients depend on the image of solutions, to investigate

the diffusion process on the Wasserstein space $\mathcal P_2$ over $\mathbb R^d$, generated by the following

time-dependent differential operator for $f\in C^2$ … $\mathbb R^d\times\mathbb R^d$ $\sigma(t,x,\mu)\sigma(t,y,\mu)$, $D^2f(\mu)(x$ …

  Cited by 2 Related articles All 3 versions 


2019


Existence of probability measure valued jump-diffusions in generalized Wasserstein spaces

M Larsson, S Svaluto-Ferro - arXiv preprint arXiv:1908.08080, 2019 - arxiv.org

We study existence of probability measure valued jump-diffusions described by martingale

problems. We develop a simple device that allows us to embed Wasserstein spaces and

other similar spaces of probability measures into locally compact spaces where classical …

  Cited by 2 Related articles All 2 versions 


2019

Improved concentration bounds for conditional value-at-risk and cumulative prospect theory using wasserstein distance

SP Bhat, LA Prashanth - 2019 - openreview.net

This paper presents a unified approach based on Wasserstein distance to derive

concentration bounds for empirical estimates for a broad class of risk measures. The results

cover two broad classes of risk measures which are defined in the paper. The classes of risk …

  Cited by 1 Related articles 


2019

[PDF] arxiv.org

The Wasserstein-Fourier distance for stationary time series

E Cazelles, A Robert, F Tobar - arXiv preprint arXiv:1912.05509, 2019 - arxiv.org

We propose the Wasserstein-Fourier (WF) distance to measure the (dis) similarity between

time series by quantifying the displacement of their energy across frequencies. The WF

distance operates by calculating the Wasserstein distance between the (normalised) power …

  Cited by 1 Related articles All 2 versions 
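
As described, this distance compares normalised power spectra as probability distributions over frequency. A minimal numpy/scipy sketch of that idea (the periodogram normalisation, the small epsilon keeping the weights strictly positive, and the toy sinusoids are assumptions of this sketch, not the authors' exact estimator):

import numpy as np
from scipy.stats import wasserstein_distance
def wasserstein_fourier(x, y, fs=1.0):
    # compare two equal-length stationary series through their normalised power spectra
    def norm_psd(z):
        p = np.abs(np.fft.rfft(z - z.mean())) ** 2 + 1e-12   # periodogram, kept strictly positive
        return p / p.sum()
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return wasserstein_distance(freqs, freqs, norm_psd(x), norm_psd(y))
rng = np.random.default_rng(0)
t = np.arange(2048)
x = np.sin(2 * np.pi * 0.05 * t) + 0.1 * rng.normal(size=t.size)
y = np.sin(2 * np.pi * 0.10 * t) + 0.1 * rng.normal(size=t.size)
print(wasserstein_fourier(x, y))   # roughly the gap between the dominant frequencies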

 

2019

[PDF] phmsociety.org

Anomaly detection on time series with wasserstein gan applied to phm

M Ducoffe, I Haloui, JS Gupta - International Journal of …, 2019 - papers.phmsociety.org

Modern vehicles are more and more connected. For instance, in the aerospace industry,

newer aircraft are already equipped with data concentrators and enough wireless

connectivity to transmit sensor data collected during the whole flight to the ground, usually …

  Cited by 2 Related articles All 2 versions 


2019

Time Series Generation using a One Dimensional Wasserstein GAN

KE Smith, A Smith - ITISE 2019. Proceedings of papers. Vol 2, 2019 - inis.iaea.org

[en] Time series data is an extremely versatile data type that can represent many real world

events; however the acquisition of event specific time series requires special sensors,

devices to record the events, and the manpower to translate them to one dimensional (1D) …

  Cited by 2 Related articles 

<——2019—–—2019 ——2320— 



 2019

[PDF] psu.edu

Aggregated Wasserstein distance for hidden Markov models and automated morphological characterization of placenta from photos

Y Chen - 2019 - etda.libraries.psu.edu

In the past decade, fueled by the rapid advances of big data technology and machine

learning algorithms, data science has become a new paradigm of science and has more

and more emerged into its own field. At the intersection of computational methods, data …

  Related articles 


2019

[PDF] arxiv.org

Precise simulation of electromagnetic calorimeter showers using a Wasserstein Generative Adversarial Network

M Erdmann, J Glombitza, T Quast - Computing and Software for Big …, 2019 - Springer

Simulations of particle showers in calorimeters are computationally time-consuming, as they

have to reproduce both energy depositions and their considerable fluctuations. A new

approach to ultra-fast simulations is generative models where all calorimeter energy …

  Cited by 55 Related articles All 6 versions

[CITATION] Precise simulation of electromagnetic calorimeter showers using a Wasserstein generative adversarial network. Comput Softw Big Sci 3 (1): 4

M Erdmann, J Glombitza, T Quast - arXiv preprint arXiv:1807.01954, 2019

  Cited by 12 Related articles



2019

[PDF] arxiv.org

The optimal convergence rate of monotone schemes for conservation laws in the Wasserstein distance

AM Ruf, E Sande, S Solem - Journal of Scientific Computing, 2019 - Springer

Abstract In 1994, Nessyahu, Tadmor and Tassa studied convergence rates of monotone

finite volume approximations of conservation laws. For compactly supported, Lip^+ Lip+-

bounded initial data they showed a first-order convergence rate in the Wasserstein distance …

  Cited by 8 Related articles All 6 versions


2019

[PDF] arxiv.org

On the total variation Wasserstein gradient flow and the TV-JKO scheme

G Carlier, C Poon - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

We study the JKO scheme for the total variation, characterize the optimizers, prove some of

their qualitative properties (in particular a form of maximum principle and in some cases, a

minimum principle as well). Finally, we establish a convergence result as the time step goes …

  Cited by 8 Related articles All 9 versions



2019

[PDF] arxiv.org

Denoising of 3D magnetic resonance images using a residual encoder–decoder Wasserstein generative adversarial network

M Ran, J Hu, Y Chen, H Chen, H Sun, J Zhou… - Medical image …, 2019 - Elsevier

Abstract Structure-preserved denoising of 3D magnetic resonance imaging (MRI) images is

a critical step in medical image analysis. Over the past few years, many algorithms with

impressive performances have been proposed. In this paper, inspired by the idea of deep …

  Cited by 41 Related articles All 8 versions


2019


Unsupervised feature extraction based on improved Wasserstein generative adversarial network for hyperspectral classification

Q Sun, S Bourennane - Multimodal Sensing: Technologies …, 2019 - spiedigitallibrary.org

Accurate classification is one of the most important prerequisites for hyperspectral

applications and feature extraction is the key step of classification. Recently, deep learning

models have been successfully used to extract the spectral-spatial features in hyperspectral …

  Related articles All 4 versions


2019

Ewgan: Entropy-based wasserstein gan for imbalanced learning

J Ren, Y Liu, J Liu - Proceedings of the AAAI Conference on Artificial …, 2019 - ojs.aaai.org

In this paper, we propose a novel oversampling strategy dubbed Entropy-based

Wasserstein Generative Adversarial Network (EWGAN) to generate data samples for

minority classes in imbalanced learning. First, we construct an entropyweighted label vector …

  Cited by 12  Related articles All 6 versions 


2019 [PDF] arxiv.org

Randomized filtering and Bellman equation in Wasserstein space for partial observation control problem

E Bandini, A Cosso, M Fuhrman, H Pham - Stochastic Processes and their …, 2019 - Elsevier

We study a stochastic optimal control problem for a partially observed diffusion. By using the

control randomization method in Bandini et al.(2018), we prove a corresponding

randomized dynamic programming principle (DPP) for the value function, which is obtained …

  Cited by 18 Related articles All 13 versions


2019

[PDF] mlr.press

Gromov-wasserstein learning for graph matching and node embedding

H Xu, D Luo, H Zha, LC Duke - International conference on …, 2019 - proceedings.mlr.press

A novel Gromov-Wasserstein learning framework is proposed to jointly match (align) graphs

and learn embedding vectors for the associated graph nodes. Using Gromov-Wasserstein

discrepancy, we measure the dissimilarity between two graphs and find their …

  Cited by 67 Related articles All 10 versions 
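
As a concrete starting point for experiments of this kind, the POT package exposes a Gromov-Wasserstein solver that returns a soft node-to-node coupling between two graphs given only their intra-graph distance matrices. A hedged sketch assuming POT's ot.gromov.gromov_wasserstein with its usual (C1, C2, p, q, loss) arguments (the random toy graphs and uniform node weights are illustrative, and this is not the paper's own learning framework):

import numpy as np
import ot   # the POT (Python Optimal Transport) package
rng = np.random.default_rng(0)
def toy_distance_matrix(n):
    C = rng.random((n, n))
    C = (C + C.T) / 2.0          # symmetric "distances" between the nodes of a toy graph
    np.fill_diagonal(C, 0.0)
    return C
C1, C2 = toy_distance_matrix(6), toy_distance_matrix(7)
p, q = ot.unif(6), ot.unif(7)    # uniform weights on the nodes of each graph
T = ot.gromov.gromov_wasserstein(C1, C2, p, q, 'square_loss')
print(T)                         # 6 x 7 soft matching between the two node sets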


2019

[PDF] arxiv.org

The quadratic Wasserstein metric for inverse data matching

K Ren, Y Yang - arXiv preprint arXiv:1911.06911, 2019 - arxiv.org

This work characterizes, analytically and numerically, two major effects of the quadratic

Wasserstein ($ W_2 $) distance as the measure of data discrepancy in computational

solutions of inverse problems. First, we show, in the infinite-dimensional setup, that the …

  Cited by 1 Related articles 

<——2019—–—2019 ——2330—



2019  [PDF] arxiv.org

Parameterized Wasserstein mean with its properties

S Kim - arXiv preprint arXiv:1904.09385, 2019 - arxiv.org

A new least squares mean of positive definite matrices for the divergence associated with

the sandwiched quasi-relative entropy has been introduced. It generalizes the well-known

Wasserstein mean for covariance matrices of Gaussian distributions with mean zero, so we …

  Related articles All 2 versions 


2019 [PDF] arxiv.org

Attainability property for a probabilistic target in Wasserstein spaces

G Cavagnari, A Marigonda - arXiv preprint arXiv:1904.10933, 2019 - arxiv.org

In this paper we establish an attainability result for the minimum time function of a control

problem in the space of probability measures endowed with Wasserstein distance. The

dynamics is provided by a suitable controlled continuity equation, where we impose a …

  Cited by 1 Related articles All 6 versions 


 

2019 [PDF] arxiv.org

Denoising of 3D magnetic resonance images using a residual encoder–decoder Wasserstein generative adversarial network

M Ran, J Hu, Y Chen, H Chen, H Sun, J Zhou… - Medical image …, 2019 - Elsevier

Abstract Structure-preserved denoising of 3D magnetic resonance imaging (MRI) images is

a critical step in medical image analysis. Over the past few years, many algorithms with

impressive performances have been proposed. In this paper, inspired by the idea of deep …

  Cited by 42 Related articles All 8 versions



2019

[PDF] Face Synthesis and Recognition Using Disentangled Representation-Learning Wasserstein GAN.

GSJ Hsu, CH Tang, MH Yap - CVPR Workshops, 2019 - openaccess.thecvf.com

Abstract We propose the Disentangled Representation-learning Wasserstein GAN (DR-

WGAN) trained on augmented data for face recognition and face synthesis across pose. We

improve the state-of-the-art DR-GAN with the Wasserstein loss considered in the …

  Cited by 2 Related articles All 4 versions 



2019

Adaptive quadratic Wasserstein full-waveform inversion

D Wang, P Wang - SEG International Exposition and Annual Meeting, 2019 - onepetro.org

Full-waveform inversion (FWI) has increasingly become standard practice in the industry to

resolve complex velocities. However, the current FWI research still exhibits a diverging

scene, with various flavors of FWI targeting different aspects of the problem. Outstanding …

  Cited by 4 Related articles All 3 versions

[PDF] googleapis.com



2019  [PDF] arxiv.org

Adaptive wasserstein hourglass for weakly supervised hand pose estimation from monocular RGB

Y Zhang, L Chen, Y Liu, J Yong, W Zheng - arXiv preprint arXiv …, 2019 - arxiv.org

Insufficient labeled training datasets is one of the bottlenecks of 3D hand pose estimation

from monocular RGB images. Synthetic datasets have a large number of images with

precise annotations, but the obvious difference with real-world datasets impacts the …

  Cited by 3 Related articles All 3 versions 



 2019  [PDF] thecvf.com

Order-preserving wasserstein discriminant analysis

B Su, J Zhou, Y Wu - Proceedings of the IEEE/CVF …, 2019 - openaccess.thecvf.com

Supervised dimensionality reduction for sequence data projects the observations in

sequences onto a low-dimensional subspace to better separate different sequence classes.

It is typically more challenging than conventional dimensionality reduction for static data …

  Cited by 2 Related articles All 6 versions 



[PDF] thecvf.com

Order-Preserving Wasserstein Discriminant Analysis

B Su, J Zhou, Y Wu - Proceedings of the IEEE International …, 2019 - openaccess.thecvf.com


Supervised dimensionality reduction for sequence data projects the observations in

sequences onto a low-dimensional subspace to better separate different sequence classes.

It is typically more challenging than conventional dimensionality reduction for static data,

because measuring the separability of sequences involves non-linear procedures to

manipulate the temporal structures. This paper presents a linear method, namely Order-

preserving Wasserstein Discriminant Analysis (OWDA), which learns the projection by …

  Cited by 2 Related articles All 6 versions 

<——2019—–—2019 ——2340—


Waserstein 거리를 이용한 feature selection - 한국정보과학회 ...

https://www.dbpia.co.kr › articleDetail


Distributionally robust newsvendor model using the Wasserstein distance · S Lee, H Kim, I Moon · Proceedings of the Korean Institute of Industrial Engineers Spring Joint Conference · Korean Institute of Industrial Engineers; 2019 ...

[Korean  Feature selection using waserstein distance]


Aero-engine faults diagnosis based on K-means improved wasserstein GAN and relevant vector machine

Z Zhao, R Zhou, Z Dong - 2019 Chinese Control Conference …, 2019 - ieeexplore.ieee.org

The aero-engine faults diagnosis is essential to the safety of the long-endurance aircraft.

The problem of fault diagnosis for aero-engines is essentially a sort of model classification

problem. Due to the difficulty of the engine faults modeling, a data-driven approach is used …

  Cited by 2 Related articles



 2019  [PDF] thecvf.com

Conservative wasserstein training for pose estimation

X Liu, Y Zou, T Che, P Ding, P Jia… - Proceedings of the …, 2019 - openaccess.thecvf.com

This paper targets the task with discrete and periodic class labels (eg, pose/orientation

estimation) in the context of deep learning. The commonly used cross-entropy or regression

loss is not well matched to this problem as they ignore the periodic nature of the labels and …

  Cited by 24 Related articles All 8 versions 



2019 [PDF] arxiv.org

Adaptive wasserstein hourglass for weakly supervised hand pose estimation from monocular RGB

Y Zhang, L Chen, Y Liu, J Yong, W Zheng - arXiv preprint arXiv …, 2019 - arxiv.org

Insufficient labeled training datasets is one of the bottlenecks of 3D hand pose estimation

from monocular RGB images. Synthetic datasets have a large number of images with

precise annotations, but the obvious difference with real-world datasets impacts the …

  Cited by 3 Related articles All 3 versions 


2019

Using wasserstein-2 regularization to ensure fair decisions with neural-network classifiers

L Risser, Q Vincenot, N Couellan, JM Loubes - 2019 - hal.archives-ouvertes.fr

In this paper, we propose a new method to build fair Neural-Network classifiers by using a

constraint based on the Wasserstein distance. More specifically, we detail how to efficiently

compute the gradients of Wasserstein-2 regularizers for Neural-Networks. The proposed …

  Cited by 9 Related articles 



[CITATION] Bridging the Gap Between f-GANs and Wasserstein GANs. arXiv e-prints, page

J Song, S Ermon - arXiv preprint arXiv:1910.09779, 2019

  Cited by 3 Related articles


[PDF] arxiv.org

Progressive wasserstein barycenters of persistence diagrams

J Vidal, J Budin, J Tierny - IEEE transactions on visualization …, 2019 - ieeexplore.ieee.org

This paper presents an efficient algorithm for the progressive approximation of Wasserstein

barycenters of persistence diagrams, with applications to the visual analysis of ensemble

data. Given a set of scalar fields, our approach enables the computation of a persistence …

  Cited by 19 Related articles All 16 versions


[PDF] arxiv.org

Bridging Bayesian and minimax mean square error estimation via Wasserstein distributionally robust optimization

VA Nguyen, S Shafieezadeh-Abadeh, D Kuhn… - arXiv preprint arXiv …, 2019 - arxiv.org

We introduce a distributionally robust minimum mean square error estimation model with a

Wasserstein ambiguity set to recover an unknown signal from a noisy observation. The

proposed model can be viewed as a zero-sum game between a statistician choosing an …

  Cited by 10 Related articles All 7 versions 


A unified formulation for the Bures-Wasserstein and Log-Euclidean/Log-Hilbert-Schmidt distances between positive definite operators

HQ Minh - International Conference on Geometric Science of …, 2019 - Springer

This work presents a parametrized family of distances, namely the Alpha Procrustes

distances, on the set of symmetric, positive definite (SPD) matrices. The Alpha Procrustes

distances provide a unified formulation encompassing both the Bures-Wasserstein and Log …

  Cited by 5 Related articles All 2 versions



Improved procedures for training primal wasserstein gans

T Zhang, Z Li, Q Zhu, D Zhang - 2019 IEEE SmartWorld …, 2019 - ieeexplore.ieee.org

Primal Wasserstein GANs are a variant of Generative Adversarial Networks (ie, GANs),

which optimize the primal form of empirical Wasserstein distance directly. However, the high

computational complexity and training instability are the main challenges of this framework …

  Cited by 4 Related articles

<——2019—–—2019 ——2350— 



[PDF] apsipa.org

Semi-supervised multimodal emotion recognition with improved wasserstein gans

J Liang, S Chen, Q Jin - 2019 Asia-Pacific Signal and …, 2019 - ieeexplore.ieee.org

Automatic emotion recognition has faced the challenge of lacking large-scale human

labeled dataset for model learning due to the expensive data annotation cost and inevitable

label ambiguity. To tackle such challenge, previous works have explored to transfer emotion …

  Cited by 2 Related articles All 2 versions

 

Aero-engine faults diagnosis based on K-means improved Wasserstein GAN and relevant vector machine

Z Zhao, R Zhou, Z Dong - 2019 Chinese Control Conference …, 2019 - ieeexplore.ieee.org

The aero-engine faults diagnosis is essential to the safety of the long-endurance aircraft.

The problem of fault diagnosis for aero-engines is essentially a sort of model classification

problem. Due to the difficulty of the engine faults modeling, a data-driven approach is used …

  Cited by 3 Related articles


[PDF] arxiv.org

Random Matrix-Improved Estimation of the Wasserstein Distance between two Centered Gaussian Distributions

M Tiomoko, R Couillet - 2019 27th European Signal Processing …, 2019 - ieeexplore.ieee.org

This article proposes a method to consistently estimate functionals $\frac{1}{p}\sum_{i=1}^{p} f(\lambda_i(C_1C_2))$

of the eigenvalues of the product of two covariance matrices $C_1, C_2\in\mathbb{R}^{p\times p}$ based on the

empirical estimates $\lambda_i(\hat C_1\hat C_2)$ ($\hat C_a=\frac{1}{n_a}\sum_{i=1}^{n_a} x_i^{(a)}x_i^{(a)\top}$), when the size $p$ and …

  Cited by 1 Related articles All 11 versions


Improved concentration bounds for conditional value-at-risk and cumulative prospect theory using wasserstein distance

SP Bhat, LA Prashanth - 2019 - openreview.net

This paper presents a unified approach based on Wasserstein distance to derive

concentration bounds for empirical estimates for a broad class of risk measures. The results

cover two broad classes of risk measures which are defined in the paper. The classes of risk …

  Cited by 1 Related articles 


[PDF] sciencedirect.com

Decomposition algorithm for distributionally robust optimization using Wasserstein metric with an application to a class of regression models

F Luo, S Mehrotra - European Journal of Operational Research, 2019 - Elsevier

We study distributionally robust optimization (DRO) problems where the ambiguity set is

defined using the Wasserstein metric and can account for a bounded support. We show that

this class of DRO problems can be reformulated as decomposable semi-infinite programs …

  Cited by 23 Related articles All 6 versions


 2019


[PDF] researchgate.net

Least-squares reverse time migration via linearized waveform inversion using a Wasserstein metric

P Yong, J Huang, Z Li, W Liao, L Qu - Geophysics, 2019 - library.seg.org

Least-squares reverse time migration (LSRTM), an effective tool for imaging the structures of

the earth from seismograms, can be characterized as a linearized waveform inversion

problem. We have investigated the performance of three minimization functionals as the L 2 …

  Cited by 5 Related articles All 4 versions


[PDF] researchgate.net

[PDF] Parallel Wasserstein Generative Adversarial Nets with Multiple Discriminators.

Y Su, S Zhao, X Chen, I King, MR Lyu - IJCAI, 2019 - researchgate.net

Abstract Wasserstein Generative Adversarial Nets (GANs) are newly proposed GAN

algorithms and widely used in computer vision, web mining, information retrieval, etc.

However, the existing algorithms with approximated Wasserstein loss converge slowly due …

  Cited by 3 Related articles All 2 versions 



CWGAN: Conditional wasserstein generative adversarial nets for fault data generation

Y Yu, B Tang, R Lin, S Han, T Tang… - 2019 IEEE International …, 2019 - ieeexplore.ieee.org

With the rapid development of modern industry and artificial intelligence technology, fault

diagnosis technology has become more automated and intelligent. The deep learning

based fault diagnosis model has achieved significant advantages over the traditional fault …

  Cited by 4 Related articles All 2 versions


[PDF] sciencedirect.com

The boundary method for semi-discrete optimal transport partitions and Wasserstein distance computation

L Dieci, JD Walsh III - Journal of Computational and Applied Mathematics, 2019 - Elsevier

We introduce a new technique, which we call the boundary method, for solving semi-

discrete optimal transport problems with a wide range of cost functions. The boundary

method reduces the effective dimension of the problem, thus improving complexity. For cost …

  Cited by 9 Related articles All 5 versions


[PDF] arxiv.org

Wasserstein GAN can perform PCA

J Cho, C Suh - 2019 57th Annual Allerton Conference on …, 2019 - ieeexplore.ieee.org

Generative Adversarial Networks (GANs) have become a powerful framework to learn

generative models that arise across a wide variety of domains. While there has been a

recent surge in the development of numerous GAN architectures with distinct optimization …

  Cited by 2 Related articles All 7 versions

<——2019—–—2019 ——2360— 



BIRCH Algorithm and Wasserstein Distance Metric Based Method for Generating Typical Scenarios of Wind Power Outputs

Q Li, X Tang, C Chen, X Liu, S Liu, X Shi… - … -Asia (ISGT Asia), 2019 - ieeexplore.ieee.org

With the ever-increasing penetration of renewable energy generation such as wind power

and solar photovoltaics, the power system concerned is suffering more extensive and

significant uncertainties. Scenario analysis has been utilized to solve this problem for power …

  Cited by 1 Related articles


[PDF] arxiv.org

A nonlocal free boundary problem with Wasserstein distance

A Karakhanyan - arXiv preprint arXiv:1904.06270, 2019 - arxiv.org

We study the probability measures $\rho \in \mathcal{M}(\mathbb{R}^2)$ minimizing the functional \[ J[\rho] = \iint \log\frac{1}{|x-y|}\, d\rho(x)\, d\rho(y) + d^2(\rho,\rho_0), \] where $\rho_0$ is a given probability measure and $d(\rho,\rho_0)$ is the 2-Wasserstein distance of …

  Related articles All 3 versions 


[PDF] researchgate.net

[PDF] Computation of Wasserstein barycenters via the Iterated Swapping Algorithm

G Puccetti, L Rüschendorf, S Vanduffel - 2019 - researchgate.net

In recent years, the Wasserstein barycenter has become an important notion in the analysis

of high dimensional data with a broad range of applications in applied probability,

economics, statistics and in particular to clustering and image processing. In our paper we …

  Related articles 


  [PDF] arxiv.org

A first-order algorithmic framework for wasserstein distributionally robust logistic regression

J Li, S Huang, AMC So - arXiv preprint arXiv:1910.12778, 2019 - arxiv.org

Wasserstein distance-based distributionally robust optimization (DRO) has received much

attention lately due to its ability to provide a robustness interpretation of various learning

models. Moreover, many of the DRO problems that arise in the learning context admits exact …

  Cited by 3 Related articles All 7 versions 

[CITATION] A first-order algorithmic framework for Wasserstein distributionally robust logistic regression

J Li, S Huang, AMC So - Advances in Neural Information Processing Systems, 2019

  Cited by 3 Related articles



[PDF] arxiv.org

Tackling Algorithmic Bias in Neural-Network Classifiers using Wasserstein-2 Regularization

L Risser, Q Vincenot, JM Loubes - arXiv preprint arXiv:1908.05783, 2019 - arxiv.org

The increasingly common use of neural network classifiers in industrial and social

applications of image analysis has allowed impressive progress these last years. Such

methods are however sensitive to algorithmic bias, ie to an under-or an over-representation …

  Related articles All 3 versions 


2019


Algorithms for Optimal Transport and Wasserstein Distances

J Schrieber - 2019 - oatd.org

Abstract Optimal Transport and Wasserstein Distance are closely related terms that do not

only have a long history in the mathematical literature, but also have seen a resurgence in

recent years, particularly in the context of the many applications they are used in, which …

  Related articles All 2 versions 



Wasserstein style transfer

Y Mroueh - arXiv preprint arXiv:1905.12828, 2019 - arxiv.org

We propose Gaussian optimal transport for Image style transfer in an Encoder/Decoder

framework. Optimal transport for Gaussian measures has closed forms Monge mappings

from source to target distributions. Moreover interpolates between a content and a style  …

  Cited by 13 Related articles All 3 versions 
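
The closed-form Monge map mentioned in this abstract is the standard Gaussian one, $T(x) = m_2 + A(x - m_1)$ with $A = \Sigma_1^{-1/2}(\Sigma_1^{1/2}\Sigma_2\Sigma_1^{1/2})^{1/2}\Sigma_1^{-1/2}$. A hedged sketch of that generic map applied to feature vectors (this is not the paper's encoder/decoder style-transfer pipeline):

```python
import numpy as np
from scipy.linalg import sqrtm, inv

def gaussian_monge_map(m1, s1, m2, s2):
    """Closed-form optimal map sending N(m1, s1) to N(m2, s2)."""
    s1_half = sqrtm(s1).real
    s1_half_inv = inv(s1_half)
    a = s1_half_inv @ sqrtm(s1_half @ s2 @ s1_half).real @ s1_half_inv
    return lambda x: m2 + (x - m1) @ a.T   # rows of x are feature vectors

# Toy usage: move "content" features toward the "style" statistics.
rng = np.random.default_rng(0)
content = rng.normal(size=(1000, 3))
style = rng.normal(size=(1000, 3)) @ np.diag([2.0, 0.5, 1.0]) + 1.0
t = gaussian_monge_map(content.mean(0), np.cov(content.T),
                       style.mean(0), np.cov(style.T))
transported = t(content)   # now approximately matches the style mean/covariance
```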

 


[PDF] arxiv.org

Data-driven distributionally robust appointment scheduling over Wasserstein balls

R Jiang, M Ryu, G Xu - arXiv preprint arXiv:1907.03219, 2019 - arxiv.org

We study a single-server appointment scheduling problem with a fixed sequence of

appointments, for which we must determine the arrival time for each appointment. We

specifically examine two stochastic models. In the first model, we assume that all appointees …

  Cited by 7 Related articles All 4 versions 

 

[PDF] mdpi.com

Wasserstein Generative Adversarial Network Based De-Blurring Using Perceptual Similarity

M Hong, Y Choe - Applied Sciences, 2019 - mdpi.com

The de-blurring of blurred images is one of the most important image processing methods

and it can be used for the preprocessing step in many multimedia and computer vision

applications. Recently, de-blurring methods have been performed by neural network …

  Cited by 1 Related articles All 5 versions 


[PDF] arxiv.org

The existence of geodesics in Wasserstein spaces over path groups and loop groups

J Shao - Stochastic Processes and their Applications, 2019 - Elsevier

In this work we prove the existence and uniqueness of the optimal transport map for L p-

Wasserstein distance with p> 1, and particularly present an explicit expression of the optimal

transport map for the case p= 2. As an application, we show the existence of geodesics …

  Related articles All 8 versions

<——2019—–—2019 ——2370— 


  

WGAN-Based Synthetic Minority Over-Sampling Technique: Improving Semantic Fine-Grained Classification for Lung Nodules in CT Images 

By: Wang, Qingfeng; Zhou, Xuehai; Wang, Chao; et al.

IEEE ACCESS   Volume: 7   Pages: 18450-18463   Published: 2019 

 WGAN-Based Synthetic Minority Over-Sampling Technique: Improving Semantic Fine-Grained Classification for Lung Nodules in CT Images

Q Wang, X Zhou, C Wang, Z Liu, J Huang, Y Zhou… - IEEE …, 2019 - ieeexplore.ieee.org

Data imbalance issue generally exists in most medical image analysis problems and maybe 

getting important with the popularization of data-hungry deep learning paradigms. We 

explore the cutting-edge Wasserstein generative adversarial networks (WGANs) to address …

Cited by 4 Related articles 

  WGAN-Based Synthetic Minority Over-Sampling Technique: Improving Semantic Fine-Grained Classification for Lung Nodules in CT Images

23 citations*

2019 IEEE ACCESS

Qingfeng Wang 1,Xuehai Zhou 1,Chao Wang 1,Zhiqin Liu 2,Jun Huang 2 see all 9 authors

1 University of Science and Technology of China ,2 Southwest University of Science and Technology

Deep learning

Convolutional neural network

View More (5+) 

Data imbalance issue generally exists in most medical image analysis problems and maybe getting important with the popularization of data-hungry deep learning paradigms. We explore the cutting-edge Wasserstein generative adversarial networks (WGANs) to address the data imbalance problem with oversam... View Full Abstract 

  Wasserstein Dependency Measure for Representation Learning

44 citations* for all

30 citations*

2019 NEURAL INFORMATION PROCESSING SYSTEMS

Sherjil Ozair 1,Corey Lynch 2,Yoshua Bengio 1,Aaron van den Oord 2,Sergey Levine 3 see all 6 authors

1 Université de Montréal ,2 Google ,3 University of California, Berkeley

Mutual information

Feature learning

View More (10+) 

Mutual information maximization has emerged as a powerful learning objective for unsupervised representation learning obtaining state-of-the-art performance in applications such as object recognition, speech recognition, and reinforcement learning. However, such approaches are fundamentally limited ... View Full Abstract 

Cited by 55 Related articles All 6 versions 

 

 Approximate Bayesian computation with the Wasserstein distance

55 citations*

2019 JOURNAL OF THE ROYAL STATISTICAL SOCIETY SERIES B-STATISTICAL METHODOLOGY

View More 

 Approximate Bayesian computation with the Wasserstein distance.

0 citations*

2019 ARXIV: METHODOLOGY

View More 

 Wasserstein CNN: Learning Invariant Features for NIR-VIS Face Recognition

142 citations* for all

132 citations*

2019 IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE

Ran He ,Xiang Wu ,Zhenan Sun ,Tieniu Tan

Chinese Academy of Sciences

Facial recognition system

Feature extraction

View More (21+) 

Heterogeneous face recognition (HFR) aims at matching facial images acquired from different sensing modalities with mission-critical applications in forensics, security and commercial sectors. However, HFR presents more challenging issues than traditional face recognition because of the large intra-... View Full Abstract 

Cited by 88 Related articles All 12 versions

Peer-reviewed
On the rate of convergence of empirical measure in $\infty$-Wasserstein distance for unbounded density function
Authors: Anning Liu, Jian-Guo Liu, Yulong Lu
Summary: We consider a sequence of identical independently distributed random samples from an absolutely continuous probability measure in one dimension with unbounded density. We establish a new rate of convergence of the $\infty$-Wasserstein distance between the empirical measure of the samples and the true distribution, which extends the previous convergence result by Trillos and Slepčev to the case that the true distribution has an unbounded density.
Downloadable Article, 2019
Publication:Quarterly of Applied Mathematics, 77, October 1, 2019, 811
Publisher:2019

Monte Carlo method

Estimator

View More (9+) 

Abstract When researchers develop new econometric methods it is common practice to compare the performance of the new methods to those of existing methods in Monte Carlo studies. The credibility of such Monte Carlo studies is often limited because of the discretion the researcher has in choosing t... View Full Abstract 

  Unsupervised Alignment of Embeddings with Wasserstein Procrustes

103 citations* for all

81 citations*

2019 INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS

Edouard Grave 1,Armand Joulin 1,Quentin Berthet 2

1 Facebook ,2 University of Cambridge

Permutation matrix

Word (computer architecture)

View More (8+) 

We consider the task of aligning two sets of points in high dimension, which has many applications in natural language processing and computer vision. As an example, it was recently shown that it is possible to infer a bilingual lexicon, without supervised data, by aligning word embeddings trained o... View Full Abstract 

Cited by 123 Related articles

 2019 

 wasserstein

 Sharp asymptotic and finite-sample rates of convergence of empirical measures in Wasserstein distance

124 citations*

2017 ARXIV: PROBABILITY

View More 

 On parameter estimation with the Wasserstein distance

43 citations* for all

25 citations*

2019 INFORMATION AND INFERENCE: A JOURNAL OF THE IMA

Espen Bernton 1,Pierre E. Jacob 1,Mathieu Gerber 2,Christian P. Robert 3

1 Harvard University ,2 School of Mathematics,3 Paris Dauphine University

Statistical inference

Minimum distance estimation

View More (8+) 

Statistical inference can be performed by minimizing, over the parameter space, the Wasserstein distance between model distributions and the empirical distribution of the data. We study asymptotic properties of such minimum Wasserstein distance estimators, complementing results derived by Bassetti, ... View Full Abstract 

Cited by 264 Related articles All 10 versions
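
To make the idea concrete, here is a minimal one-dimensional sketch of a minimum Wasserstein distance estimator (my illustration under simplifying assumptions, not the authors' code): choose the location parameter whose synthetic sample is closest in $W_1$ to the data.

```python
import numpy as np
from scipy.stats import wasserstein_distance
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
data = rng.normal(loc=1.5, scale=1.0, size=500)   # observed sample
model_noise = rng.normal(size=2000)               # fixed noise driving the model samples

def w1_to_data(theta):
    # empirical 1D W1 between model samples (location theta) and the data
    return wasserstein_distance(theta + model_noise, data)

theta_hat = minimize_scalar(w1_to_data, bounds=(-5.0, 5.0), method="bounded").x
print(theta_hat)   # should land close to the true location 1.5
```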

Sharp asymptotic and finite-sample rates of convergence of empirical measures in Wasserstein distance
Authors: Weed, Jonathan (Creator), Bach, Francis (Creator)
Summary: The Wasserstein distance between two probability measures on a metric space is a measure of closeness with applications in statistics, probability, and machine learning. In this work, we consider the fundamental question of how quickly the empirical measure obtained from $n$ independent samples from $\mu$ approaches $\mu$ in the Wasserstein distance of any order. We prove sharp asymptotic and finite-sample results for this rate of convergence for general measures on general compact metric spaces. Our finite-sample results show the existence of multi-scale behavior, where measures can exhibit radically different rates of convergence as $n$ grows.
Computer File, 2019-11-01
English
Publisher:Bernoulli Society for Mathematical Statistics and Probability, 2019-11-01


 Robust Wasserstein Profile Inference and Applications to Machine Learning

18 citations*

2016 ARXIV: STATISTICS THEORY

View More 

 Fast Algorithms for Computational Optimal Transport and Wasserstein Barycenter.

5 citations* for all

4 citations*

2020 INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS

Wenshuo Guo ,Nhat Ho ,Michael I. Jordan

University of California, Berkeley

Coordinate descent

Gradient descent

View More (7+) 

We provide theoretical complexity analysis for new algorithms to compute the optimal transport (OT) distance between two discrete probability distributions, and demonstrate their favorable practical performance over state-of-art primal-dual algorithms and their capability in solving other problems i... View Full Abstract 

Cited by 219 Related articles All 5 versions

 Sliced Wasserstein Generative Models

13 citations*

2019 ARXIV: COMPUTER VISION AND PATTERN RECOGNITION

View More 

 Max-Sliced Wasserstein Distance and Its Use for GANs

57 citations* for all

51 citations*

2019 COMPUTER VISION AND PATTERN RECOGNITION

Ishan Deshpande 1,Yuan-Ting Hu 1,Ruoyu Sun 1,Ayis Pyrros 2,Nasir Siddiqui 2 see all 9 authors

1 University of Illinois at Urbana–Champaign ,2 Dupagemd

Projection (set theory)

Feature learning

View More (7+) 

Generative adversarial nets (GANs) and variational auto-encoders have significantly improved our distribution modeling capabilities, showing promise for dataset augmentation, image-to-image translation and feature learning. However, to model high-dimensional distributions, sequential training and st... View Full Abstract 
Cited by 80 Related articles All 14 versions 


 Statistical Aspects of Wasserstein Distances.

2 citations*

2018 ARXIV: METHODOLOGY

View More 

 Precise Simulation of Electromagnetic Calorimeter Showers Using a Wasserstein Generative Adversarial Network

63 citations* for all

62 citations*

2019 COMPUTING AND SOFTWARE FOR BIG SCIENCE

Martin Erdmann 1,Jonas Glombitza 1,Thorben Quast 2

1 RWTH Aachen University ,2 CERN

Calorimeter

Energy (signal processing)

View More (8+) 

Simulations of particle showers in calorimeters are computationally time-consuming, as they have to reproduce both energy depositions and their considerable fluctuations. A new approach to ultra-fast simulations is generative models where all calorimeter energy depositions are generated simultaneous... View Full Abstract 

Cited by 193 Related articles All 8 versions

 Precise simulation of electromagnetic calorimeter showers using a Wasserstein Generative Adversarial Network

1 citations*

2018 ARXIV: INSTRUMENTATION AND DETECTORS

View More 

 Wasserstein Generative Learning with Kinematic Constraints for Probabilistic Interactive Driving Behavior Prediction

16 citations* for all

16 citations*

2019 IEEE INTELLIGENT VEHICLES SYMPOSIUM

Hengbo Ma ,Jiachen Li ,Wei Zhan ,Masayoshi Tomizuka

University of California, Berkeley

Generative model

Probabilistic logic

View More (9+) 

Since prediction plays a significant role in enhancing the performance of decision making and planning procedures, the requirement of advanced methods of prediction becomes urgent. Although many literatures propose methods to make prediction on a single agent, there is still a challenging and open p... View Full Abstract 

[CITATION] Precise simulation of electromagnetic calorimeter showers using a Wasserstein generative adversarial network. Comput Softw Big Sci (1): 4

M Erdmann, J Glombitza, T Quast - arXiv preprint arXiv:1807.01954, 2019

 … ultra-fast simulations is generative models where all calorimeter energy depositions …

Cited by 84 Related articles All 7 versions

<——2019—–—2019 ——2380— 


 Wasserstein Generative Learning with Kinematic Constraints for Probabilistic Interactive Driving Behavior Prediction.

0 citations*

2019 IV

Cited by 23 Related articles


 Wasserstein Fair Classification

35 citations* for all

25 citations*

2019 UNCERTAINTY IN ARTIFICIAL INTELLIGENCE

Ray Jiang 1,Aldo Pacchiano 2,Tom Stepleton 1,Heinrich Jiang 1,Silvia Chiappa 1

1 Google ,2 University of California, Berkeley

Benchmark (computing)

Classifier (linguistics)

View More (6+) 

We propose an approach to fair classification that enforces independence between the classifier outputs and sensitive information by minimizing Wasserstein-1 distances. The approach has desirable theoretical properties and is robust to specific choices of the threshold used to obtain class predictio... View Full Abstract 

 Wasserstein Fair Classification

10 citations*

2019 ARXIV: MACHINE LEARNING

View More 

 Wasserstein Adversarial Examples via Projected Sinkhorn Iterations.

84 citations* for all

37 citations*

2019 INTERNATIONAL CONFERENCE ON MACHINE LEARNING

Eric Wong 1,Frank R. Schmidt 1,J. Zico Kolter 2

1 Bosch ,2 Carnegie Mellon University

Contextual image classification

Robustness (computer science)

View More (8+) 

A rapidly growing area of work has studied the existence of adversarial examples, datapoints which have been perturbed to fool a classifier, but the vast majority of these works have focused primarily on threat models defined by $\ell_p$ norm-bounded perturbations. In this paper, we propose a new th... View Full Abstract 

 

 Wasserstein Adversarial Examples via Projected Sinkhorn Iterations.

47 citations*

2019 ARXIV: LEARNING

View More 

 Kernelized Wasserstein Natural Gradient

8 citations* for all

5 citations*

2020 INTERNATIONAL CONFERENCE ON LEARNING REPRESENTATIONS

M Arbel 1,A Gretton 1,W Li 2,G Montufar 2

1 University College London ,2 University of California, Los Angeles

Wasserstein metric

Reproducing kernel Hilbert space

View More (8+) 

Many machine learning problems can be expressed as the optimization of some cost functional over a parametric family of probability distributions. It is often beneficial to solve such optimization problems using natural gradient methods. These methods are invariant to the parametrization of the fami... View Full Abstract 


Cited by 123 Related articles All 8 versions 

 Kernelized Wasserstein Natural Gradient

3 citations*

2019 ARXIV: MACHINE LEARNING

Cited by 11 Related articles All 9 versions 

2019  [PDF] mdpi.com

Wasserstein distance learns domain invariant feature representations for drift compensation of E-nose

Y Tao, C Li, Z Liang, H Yang, J Xu - Sensors, 2019 - mdpi.com

Abstract Electronic nose (E-nose), a kind of instrument which combines with the gas sensor

and the corresponding pattern recognition algorithm, is used to detect the type and

concentration of gases. However, the sensor drift will occur in realistic application scenario …

  Cited by 6 Related articles All 8 versions 


2019


2019  [PDF] arxiv.org

Asymptotic guarantees for learning generative models with the sliced-wasserstein distance

K Nadjahi, A Durmus, U Şimşekli, R Badeau - arXiv preprint arXiv …, 2019 - arxiv.org

Minimum expected distance estimation (MEDE) algorithms have been widely used for

probabilistic models with intractable likelihood functions and they have become increasingly

popular due to their use in implicit generative modeling (eg Wasserstein generative …

  Cited by 22 Related articles All 7 versions 

  Interior-Point Methods Strike Back: Solving the Wasserstein Barycenter Problem

8 citations* for all

7 citations*

2019 NEURAL INFORMATION PROCESSING SYSTEMS

DongDong Ge 1,Haoyue Wang ,Zikai Xiong 2,Yinyu Ye 3

1 Shanghai University of Finance and Economics ,2 Massachusetts Institute of Technology ,3 Stanford University

Interior point method

MNIST database

View More (6+) 

Computing the Wasserstein barycenter of a set of probability measures under the optimal transport metric can quickly become prohibitive for traditional second-order algorithms, such as interior-point methods, as the support size of the measures increases. In this paper, we overcome the difficulty by... View Full Abstract 

  Cited by 15 Related articles All 6 versions 

  

 2019  [PDF] mlr.press

Subspace robust Wasserstein distances

FP Paty, M Cuturi - International Conference on Machine …, 2019 - proceedings.mlr.press

Making sense of Wasserstein distances between discrete measures in high-dimensional

settings remains a challenge. Recent work has advocated a two-step approach to improve …

 Cited by 60 Related articles All 5 versions 

 2019 

 Wasserstein GANs for MR Imaging: from Paired to Unpaired Training

4 citations*

2019 ARXIV: IMAGE AND VIDEO PROCESSING

View More 

 Subspace Robust Wasserstein Distances

53 citations* for all

45 citations*

2019 INTERNATIONAL CONFERENCE ON MACHINE LEARNING

François-Pierre Paty 1,Marco Cuturi 2

1 ENSAE ParisTech ,2 Université Paris-Saclay

Robustness (computer science)

Subspace topology

View More (8+) 

Making sense of Wasserstein distances between discrete measures in high-dimensional settings remains a challenge. Recent work has advocated a two-step approach to improve robustness and facilitate the computation of optimal transport, using for instance projections on random real lines, or a prelimi... View Full Abstract 

Cited by 79 Related articles All 6 versions 

 

 Gromov-Wasserstein Learning for Graph Matching and Node Embedding.

14 citations*

2019 ARXIV: LEARNING

Cited by 112 Related articles All 12 versions 

<——2019—–—2019 ——2390— 


2019  [PDF] arxiv.org

Fused Gromov-Wasserstein Alignment for Hawkes Processes

D Luo, H Xu, L Carin - arXiv preprint arXiv:1910.02096, 2019 - arxiv.org

We propose a novel fused Gromov-Wasserstein alignment method to jointly learn the

Hawkes processes in different event spaces, and align their event types. Given two Hawkes

processes, we use fused Gromov-Wasserstein discrepancy to measure their dissimilarity …

  Cited by 2 Related articles All 4 versions 


2019 see 2020 [PDF] researchgate.net

[PDF] Partial gromov-wasserstein with applications on positive-unlabeled learning

L Chapel, MZ Alaya, G Gasso - arXiv preprint arXiv:2002.08276, 2019 - researchgate.net

Optimal Transport (OT) framework allows defining similarity between probability distributions

and provides metrics such as the Wasserstein and Gromov-Wasserstein discrepancies.

Classical OT problem seeks a transportation map that preserves the total mass, requiring the …

  Cited by 6 Related articles All 2 versions 


2019  [PDF] phmsociety.org

Anomaly detection on time series with wasserstein gan applied to phm

M Ducoffe, I Haloui, JS Gupta - International Journal of …, 2019 - papers.phmsociety.org

Modern vehicles are more and more connected. For instance, in the aerospace industry,

newer aircraft are already equipped with data concentrators and enough wireless

connectivity to transmit sensor data collected during the whole flight to the ground, usually …

  Cited by 2 Related articles All 2 versions 


2019  [PDF] mlr.press

Subspace robust Wasserstein distances

FP Paty, M Cuturi - International Conference on Machine …, 2019 - proceedings.mlr.press

Making sense of Wasserstein distances between discrete measures in high-dimensional

settings remains a challenge. Recent work has advocated a two-step approach to improve

robustness and facilitate the computation of optimal transport, using for instance projections …

  Cited by 55 Related articles All 5 versions 

2019  [PDF] arxiv.org

Topic modeling with Wasserstein autoencoders

F Nan, R Ding, R Nallapati, B Xiang - arXiv preprint arXiv:1907.12374, 2019 - arxiv.org

We propose a novel neural topic model in the Wasserstein autoencoders (WAE) framework.

Unlike existing variational autoencoder based models, we directly enforce Dirichlet prior on

the latent document-topic vectors. We exploit the structure of the latent space and apply a …

  Cited by 21 Related articles All 5 versions 

 

2019


 2019  [PDF] mlr.press

Sliced-Wasserstein flows: Nonparametric generative modeling via optimal transport and diffusions

A Liutkus, U Simsekli, S Majewski… - International …, 2019 - proceedings.mlr.press

By building upon the recent theory that established the connection between implicit

generative modeling (IGM) and optimal transport, in this study, we propose a novel

parameter-free algorithm for learning the underlying distributions of complicated datasets …

  Cited by 44 Related articles All 7 versions 

[CITATION] … şimşekli, Szymon Majewski, Alain Durmus, and Fabian-Robert Stoter. Sliced-Wasserstein flows: Nonparametric generative modeling via optimal transport …

A Liutkus - International Conference on Machine Learning, 2019

  Cited by 2 Related articles


2019 [PDF] aclweb.org

Modeling personalization in continuous space for response generation via augmented wasserstein autoencoders

Z Chan, J Li, X Yang, X Chen, W Hu, D Zhao… - Proceedings of the 2019 …, 2019 - aclweb.org

Abstract Variational autoencoders (VAEs) and Wasserstein autoencoders (WAEs) have

achieved noticeable progress in open-domain response generation. Through introducing

latent variables in continuous space, these models are capable of capturing utterance-level …

  Cited by 18 Related articles All 2 versions 


2019  [PDF] arxiv.org

Modeling the biological pathology continuum with hsic-regularized wasserstein auto-encoders

D Wu, H Kobayashi, C Ding, L Cheng… - arXiv preprint arXiv …, 2019 - arxiv.org

A crucial challenge in image-based modeling of biomedical data is to identify trends and

features that separate normality and pathology. In many cases, the morphology of the

imaged object exhibits continuous change as it deviates from normality, and thus a …

  Cited by 4 Related articles All 2 versions 


2019  [PDF] harchaoui.org

[PDF] Wasserstein Adversarial Mixture for Deep Generative Modeling and Clustering

W Harchaoui, PA Mattei, A Almansa, C Bouveyron - 2019 - harchaoui.org

Unsupervised learning, and in particular clustering, is probably the most central problem in

learning theory nowadays. This work focuses on the clustering of complex data by

introducing a deep generative approach for both modeling and clustering. The proposed …

  Cited by 1 Related articles All 3 versions 


2019  [PDF] ieee.org

Multi-source medical image fusion based on Wasserstein generative adversarial networks

Z Yang, Y Chen, Z Le, F Fan, E Pan - IEEE Access, 2019 - ieeexplore.ieee.org

In this paper, we propose the medical Wasserstein generative adversarial networks

(MWGAN), an end-to-end model, for fusing magnetic resonance imaging (MRI) and positron

emission tomography (PET) medical images. Our method establishes two adversarial …

  Cited by 9 Related articles

<——2019—–—2019 ——2400—


2019  [PDF] arxiv.org

Group level MEG/EEG source imaging via optimal transport: minimum Wasserstein estimates

H Janati, T Bazeille, B Thirion, M Cuturi… - … Information Processing in …, 2019 - Springer

Magnetoencephalography (MEG) and electroencephalography (EEG) are non-invasive

modalities that measure the weak electromagnetic fields generated by neural activity.

Inferring the location of the current sources that generated these magnetic fields is an ill …

  Cited by 5 Related articles All 14 versions


2019

 Orthogonal Estimation of Wasserstein Distances

1 citations*

2019 ARXIV: MACHINE LEARNING

View Less 

Mark Rowland 1,Jiri Hron 1,Yunhao Tang 2,Krzysztof Choromanski 3,Tamas Sarlos 3 see all 6 authors

1 University of Cambridge ,2 Columbia University ,3 Google

Coupling (probability)

Sorting

View More (5+) 

Wasserstein distances are increasingly used in a wide variety of applications in machine learning. Sliced Wasserstein distances form an important subclass which may be estimated efficiently through one-dimensional sorting operations. In this paper, we propose a new variant of sliced Wasserstein distan... View Full Abstract 

 Uniform contractivity in Wasserstein metric for the original 1D Kac's model

9 citations* for all

9 citations*


 2019

 Uncoupled isotonic regression via minimum Wasserstein deconvolution

32 citations* for all

23 citations*

2019 INFORMATION AND INFERENCE: A JOURNAL OF THE IMA

Philippe Rigollet ,Jonathan Weed

Massachusetts Institute of Technology

 Cited by 52 Related articles All 7 versions

2019

Inequalities for the Wasserstein mean of positive definite matrices

R Bhatia, T Jain, Y Lim - Linear Algebra and its Applications, 2019 - Elsevier

Let $A_1, \dots, A_m$ be given positive definite matrices and let $w = (w_1, \dots, w_m)$ be a vector of weights; i.e., $w_j \ge 0$ and $\sum_{j=1}^{m} w_j = 1$. Then the (weighted) Wasserstein mean, or the Wasserstein barycentre, of $A_1, \dots, A_m$ is defined as (2) $\Omega(w; A_1, \dots, A_m) = \operatorname{argmin}_{X \in \mathbb{P}} \sum_{j=1}^{m} w_j \, …$

  Cited by 12 Related articles All 5 versions
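
For the definition above, a hedged numerical sketch of computing such a Wasserstein (Bures) barycentre of positive definite matrices with a standard fixed-point iteration from the optimal-transport literature (the paper itself concerns inequalities for the mean, not algorithms):

```python
import numpy as np
from scipy.linalg import sqrtm, inv

def wasserstein_mean(mats, weights, iters=50):
    """Fixed-point iteration S <- S^{-1/2} (sum_j w_j (S^{1/2} A_j S^{1/2})^{1/2})^2 S^{-1/2}."""
    s = np.mean(mats, axis=0)                  # any SPD initial guess
    for _ in range(iters):
        s_half = sqrtm(s).real
        s_half_inv = inv(s_half)
        t = sum(w * sqrtm(s_half @ a @ s_half).real for w, a in zip(weights, mats))
        s = s_half_inv @ t @ t @ s_half_inv
    return s

a1 = np.array([[2.0, 0.3], [0.3, 1.0]])
a2 = np.array([[1.0, -0.2], [-0.2, 3.0]])
print(wasserstein_mean([a1, a2], [0.5, 0.5]))
```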


 

2019  [PDF] arxiv.org

Alpha Procrustes metrics between positive definite operators: a unifying formulation for the Bures-Wasserstein and Log-Euclidean/Log-Hilbert-Schmidt metrics

MH Quang - arXiv preprint arXiv:1908.09275, 2019 - arxiv.org

This work presents a parametrized family of distances, namely the Alpha Procrustes

distances, on the set of symmetric, positive definite (SPD) matrices. The Alpha Procrustes

distances provide a unified formulation encompassing both the Bures-Wasserstein and Log …

  Cited by 4 Related articles All 2 versions 


2019  

2019  [PDF] arxiv.org

Parameter estimation for biochemical reaction networks using Wasserstein distances

K Öcal, R Grima, G Sanguinetti - Journal of Physics A …, 2019 - iopscience.iop.org

We present a method for estimating parameters in stochastic models of biochemical reaction

networks by fitting steady-state distributions using Wasserstein distances. We simulate a

reaction network at different parameter settings and train a Gaussian process to learn the …

  Cited by 11 Related articles All 9 versions

2019  [PDF] researchgate.net

[PDF] Parallel Wasserstein Generative Adversarial Nets with Multiple Discriminators.

Y Su, S Zhao, X Chen, I King, MR Lyu - IJCAI, 2019 - researchgate.net

Abstract Wasserstein Generative Adversarial Nets (GANs) are newly proposed GAN

algorithms and widely used in computer vision, web mining, information retrieval, etc.

However, the existing algorithms with approximated Wasserstein loss converge slowly due …

  Cited by 3 Related articles All 2 versions 


2019

Wasserstein Distances for Estimating Parameters in Stochastic Reaction Networks

K Öcal, R Grima, G Sanguinetti - International Conference on …, 2019 - Springer

Modern experimental methods such as flow cytometry and fluorescence in-situ hybridization

(FISH) allow the measurement of cell-by-cell molecule numbers for RNA, proteins and other

substances for large numbers of cells at a time, opening up new possibilities for the …

  Related articles All 4 versions


2019  [PDF] mlr.press

Orthogonal estimation of wasserstein distances

M Rowland, J Hron, Y Tang… - The 22nd …, 2019 - proceedings.mlr.press

Wasserstein distances are increasingly used in a wide variety of applications in machine

learning. Sliced Wasserstein distances form an important subclass which may be estimated

efficiently through one-dimensional sorting operations. In this paper, we propose a new …

  Cited by 13 Related articles All 9 versions 
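
As background to the sorting-based estimates this entry builds on, a plain Monte Carlo sliced-Wasserstein sketch (my own illustration using i.i.d. random directions, not the paper's orthogonally coupled ones):

```python
import numpy as np

def sliced_w2(x, y, n_projections=100, rng=None):
    """Sliced W2 between two equal-sized samples via 1D sorting."""
    if rng is None:
        rng = np.random.default_rng(0)
    dirs = rng.normal(size=(n_projections, x.shape[1]))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)   # unit directions
    total = 0.0
    for u in dirs:
        px, py = np.sort(x @ u), np.sort(y @ u)           # 1D OT = sorted matching
        total += np.mean((px - py) ** 2)
    return np.sqrt(total / n_projections)

rng = np.random.default_rng(1)
x = rng.normal(size=(500, 4))
y = rng.normal(loc=0.5, size=(500, 4))
print(sliced_w2(x, y))
```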

 

2019  [PDF] arxiv.org

Orthogonal Wasserstein GANs

J Müller, R Klein, M Weinmann - arXiv preprint arXiv:1911.13060, 2019 - arxiv.org

Wasserstein-GANs have been introduced to address the deficiencies of generative

adversarial networks (GANs) regarding the problems of vanishing gradients and mode

collapse during the training, leading to improved convergence behaviour and improved …

  Cited by 1 Related articles All 2 versions 

<——2019—–—2019 ——2410—


2019 [PDF] arxiv.org

Precise simulation of electromagnetic calorimeter showers using a Wasserstein Generative Adversarial Network

M Erdmann, J Glombitza, T Quast - Computing and Software for Big …, 2019 - Springer

Simulations of particle showers in calorimeters are computationally time-consuming, as they

have to reproduce both energy depositions and their considerable fluctuations. A new

approach to ultra-fast simulations is generative models where all calorimeter energy …

  Cited by 57 Related articles All 6 versions

[CITATION] Precise simulation of electromagnetic calorimeter showers using a Wasserstein generative adversarial network. Comput Softw Big Sci 3 (1): 4

M Erdmann, J Glombitza, T Quast - arXiv preprint arXiv:1807.01954, 2019

  Cited by 12 Related articles


2019  [HTML] oup.com

Uncoupled isotonic regression via minimum Wasserstein deconvolution

P Rigollet, J Weed - Information and Inference: A Journal of the …, 2019 - academic.oup.com

Isotonic regression is a standard problem in shape-constrained estimation where the goal is

to estimate an unknown non-decreasing regression function from independent pairs where.

While this problem is well understood both statistically and computationally, much less is …

  Cited by 43 Related articles All 8 versions


2019  [PDF] arxiv.org

The quadratic Wasserstein metric for inverse data matching

K Ren, Y Yang - arXiv preprint arXiv:1911.06911, 2019 - arxiv.org

This work characterizes, analytically and numerically, two major effects of the quadratic

Wasserstein ($ W_2 $) distance as the measure of data discrepancy in computational

solutions of inverse problems. First, we show, in the infinite-dimensional setup, that the …

  Cited by 1 Related articles 


2019

Use of the Wasserstein Metric to Solve the Inverse Dynamic Seismic Problem

AA Vasilenko - Geomodel 2019, 2019 - earthdoc.org

The inverse dynamic seismic problem consists in recovering the velocity model of elastic

medium based on the observed seismic data. In this work full waveform inversion method is

used to solve this problem. It consists in minimizing an objective functional measuring the …

  Related articles



2019

 Hausdorff and Wasserstein metrics on graphs and other structured data

4 citations*

2019 ARXIV: OPTIMIZATION AND CONTROL

View Less 

Evan Patterson

Stanford University

Wasserstein metric

View More (8+) 

Optimal transport is widely used in pure and applied mathematics to find probabilistic solutions to hard combinatorial matching problems. We extend the Wasserstein metric and other elements of optimal transport from the matching of sets to the matching of graphs and other structured data. This struc... View Full Abstract


 Wasserstein of Wasserstein Loss for Learning Generative Models

17 citations*

2019 INTERNATIONAL CONFERENCE ON MACHINE LEARNING

Yonatan Dukler 1,Wuchen Li 1,Alex Tong Lin 1,Guido Montúfar 2

1 University of California, Los Angeles ,2 Max Planck Society

Computer science

Artificial intelligence


2019


 Predictive density estimation under the Wasserstein loss

0 citations*

2019 ARXIV: STATISTICS THEORY

View More 

 Low Dose CT Image Denoising Using a Generative Adversarial Network with Wasserstein Distance and Perceptual Loss

330 citations*

2017 ARXIV: COMPUTER VISION AND PATTERN RECOGNITION

Qingsong Yang ,Pingkun Yan ,Yanbo Zhang ,Hengyong Yu ,Yongyi Shi see all 8 authors

Image noise

Feature vector

View More (9+) 

In this paper, we introduce a new CT image denoising method based on the generative adversarial network (GAN) with Wasserstein distance and perceptual similarity. The Wasserstein distance is a key concept of the optimal transform theory, and promises to improve the performance of the GAN. The percep... View Full Abstract 


2019

 Subspace Robust Wasserstein Distances

11 citations*

2019 ARXIV: LEARNING

View More 

 Wasserstein Robust Reinforcement Learning

14 citations*

2019 ARXIV: LEARNING

Mohammed Amin Abdullah ,Hang Ren ,Haitham Bou-Ammar ,Vladimir Milenkovic ,Rui Luo see all 7 authors

Reinforcement learning

Solver

View More (3+) 

Reinforcement learning algorithms, though successful, tend to over-fit to training environments hampering their application to the real-world. This paper proposes $\text{W}\text{R}^{2}\text{L}$ -- a robust reinforcement learning algorithm with significant robust performance on low and high-dimension... View Full Abstract 


2019

 Quantum Wasserstein Generative Adversarial Networks

1 citations*

2019 ARXIV: QUANTUM PHYSICS

Shouvanik Chakrabarti ,Yiming Huang ,Tongyang Li ,Soheil Feizi ,Xiaodi Wu

University of Maryland, College Park

Quantum machine learning

Quantum circuit

View More (7+) 

The study of quantum generative models is well-motivated, not only because of its importance in quantum machine learning and quantum chemistry but also because of the perspective of its implementation on near-term quantum machines. Inspired by previous studies on the adversarial training of classica... View Full Abstract 

Cited by 33 Related articles All 8 versions 

2019

 Data-Driven Distributionally Robust Appointment Scheduling over Wasserstein Balls

5 citations*

2019 ARXIV: OPTIMIZATION AND CONTROL

Ruiwei Jiang 1,Minseok Ryu 1,Guanglin Xu 2

1 University of Michigan ,2 University of Minnesota

Probability distribution

Robust optimization

View More (8+) 

We study a single-server appointment scheduling problem with a fixed sequence of appointments, for which we must determine the arrival time for each appointment. We specifically examine two stochastic models. In the first model, we assume that all appointees show up at the scheduled arrival times ye... View Full Abstract 

2019  [PDF] arxiv.org

On isometric embeddings of Wasserstein spaces–the discrete case

GP Gehér, T Titkos, D Virosztek - Journal of Mathematical Analysis and …, 2019 - Elsevier

The aim of this short paper is to offer a complete characterization of all (not necessarily

surjective) isometric embeddings of the Wasserstein space W p (X), where X is a countable

discrete metric space and 0< p<∞ is any parameter value. Roughly speaking, we will prove …

  Cited by 3 Related articles All 9 versions

<——2019—–—2019 ——2420—


2019  [HTML] nih.gov

Construction of 4D Neonatal Cortical Surface Atlases Using Wasserstein Distance

Z Chen, Z Wu, L Sun, F Wang, L Wang… - 2019 IEEE 16th …, 2019 - ieeexplore.ieee.org

Spatiotemporal (4D) neonatal cortical surface atlases with densely sampled ages are

important tools for understanding the dynamic early brain development. Conventionally,

after non-linear co-registration, surface atlases are constructed by simple Euclidean average …

  Cited by 1 Related articles All 5 versions

Isometric study of Wasserstein spaces---the real line

G Pál Gehér, T Titkos, D Virosztek - arXiv e-prints, 2020 - ui.adsabs.harvard.edu

Recently Kloeckner described the structure of the isometry group of the quadratic

Wasserstein space $\mathcal{W}_2(\mathbb{R}^n)$. It turned out that the case of

the real line is exceptional in the sense that there exists an exotic isometry flow. Following …

 

2019  [PDF] researchgate.net

[PDF] Tractable reformulations of distributionally robust two-stage stochastic programs with ∞−Wasserstein distance

W Xie - arXiv preprint arXiv:1908.08454, 2019 - researchgate.net

In the optimization under uncertainty, decision-makers first select a wait-and-see policy

before any realization of uncertainty and then place a here-and-now decision after the

uncertainty has been observed. Two-stage stochastic programming is a popular modeling …

  Cited by 10 Related articles 

[PDF] arxiv.org

Tractable Reformulations of Distributionally Robust Two-stage Stochastic Programs with Wasserstein Distance

W Xie - arXiv preprint arXiv:1908.08454, 2019 - arxiv.org

In the optimization under uncertainty, decision-makers first select a wait-and-see policy

before any realization of uncertainty and then place a here-and-now decision after the

uncertainty has been observed. Two-stage stochastic programming is a popular modeling …

  Cited by 2 Related articles All 2 versions 


2019  [PDF] arxiv.org

Zero-Sum Differential Games on the Wasserstein Space

J Moon, T Basar - arXiv preprint arXiv:1912.06084, 2019 - arxiv.org

We consider two-player zero-sum differential games (ZSDGs), where the state process

(dynamical system) depends on the random initial condition and the state process's

distribution, and the objective functional includes the state process's distribution and the …

  Cited by 1 Related articles All 2 versions 


2019  [PDF] ieee.org

Accelerating CS-MRI reconstruction with fine-tuning Wasserstein generative adversarial network

M Jiang, Z Yuan, X Yang, J Zhang, Y Gong, L Xia… - IEEE …, 2019 - ieeexplore.ieee.org

Compressed sensing magnetic resonance imaging (CS-MRI) is a time-efficient method to

acquire MR images by taking advantage of the highly under-sampled k-space data to

accelerate the time consuming acquisition process. In this paper, we proposed a de-aliasing …

  Cited by 7 Related articles


2019


2019  [PDF] arxiv.org

Tractable Reformulations of Distributionally Robust Two-stage Stochastic Programs with Wasserstein Distance

W Xie - arXiv preprint arXiv:1908.08454, 2019 - arxiv.org

In the optimization under uncertainty, decision-makers first select a wait-and-see policy

before any realization of uncertainty and then place a here-and-now decision after the

uncertainty has been observed. Two-stage stochastic programming is a popular modeling …

  Cited by 2 Related articles All 2 versions 



2019

 Nonembeddability of Persistence Diagrams with $p>2$ Wasserstein Metric

3 citations*

2019 ARXIV: FUNCTIONAL ANALYSIS

View More 

 Wasserstein statistics in one-dimensional location scale models

1 citations* for all

1 citations*

2021 ANNALS OF THE INSTITUTE OF STATISTICAL MATHEMATICS

Shun-ichi Amari ,Takeru Matsuda

RIKEN Center for Brain Science, Saitama, Japan

Information geometry

Asymptotic distribution

View More (8+) 

Wasserstein geometry and information geometry are two important structures to be introduced in a manifold of probability distributions. Wasserstein geometry is defined by using the transportation cost between two distributions, so it reflects the metric of the base manifold on which the distribution... View Full Abstract 

2019 see 2021

 The Wasserstein-Fourier Distance for Stationary Time Series

2 citations*

2019 ARXIV: MACHINE LEARNING

View More 

 Considering anatomical prior information for low-dose CT image enhancement using attribute-augmented Wasserstein generative adversarial networks

2 citations*

2021 NEUROCOMPUTING

Zhenxing Huang 1,Xinfeng Liu 2,Rongpin Wang 2,Jincai Chen 3,4,Ping Lu 3,4 see all 12 authors

1 Wuhan National Laboratory for Optoelectronics, Huazhong University of Science & Technology, Wuhan 430074, China,2 Department of Radiology, Guizhou Provincial People’s Hospital, Guiyang 550002, China,3 Huazhong University of Science and Technology ,4 Chinese Ministry of Education

Deep learning

Pattern recognition

View More (8+) 

Abstract Currently, many deep learning (DL)-based low-dose CT image postprocessing technologies fail to consider the anatomical differences in training data among different human body sites, such as the cranium, lung and pelvis. In addition, we can observe evident anatomical similarities at the sa... View Full Abstract 


2019 

 Estimation of smooth densities in Wasserstein distance

23 citations*

View More 

 Scalable Gromov-Wasserstein Learning for Graph Partitioning and Matching

40 citations* for all

29 citations*

2019 NEURAL INFORMATION PROCESSING SYSTEMS

Hongteng Xu ,Dixin Luo ,Lawrence Carin

Duke University

Graph partition

Matching (graph theory)

View More (8+) 

We propose a scalable Gromov-Wasserstein learning (S-GWL) method and establish a novel and theoretically-supported paradigm for large-scale graph analysis. The proposed method is based on the fact that Gromov-Wasserstein discrepancy is a pseudometric on graphs. Given two graphs, the optimal transpor... View Full Abstract 

Cited by 54 Related articles All 6 versions 

 2019 

 On differentiability in the Wasserstein space and well-posedness for Hamilton–Jacobi equations

32 citations*

2019 JOURNAL DE MATHÉMATIQUES PURES ET APPLIQUÉES

Wilfrid Gangbo 1,Adrian Tudorascu 2

1 University of California, Los Angeles ,2 West Virginia University

Hilbert space

Hamilton–Jacobi equation

View More (8+) 

Abstract In this paper we elucidate the connection between various notions of differentiability in the Wasserstein space: some have been introduced intrinsically (in the Wasserstein space, by using typical objects from the theory of Optimal Transport) and used by various authors to study gradient ... View Full Abstract 

Cited by 52 Related articles All 5 versions

<——2019—–—2019 ——2430— 


[PDF] arxiv.org

A gradual, semi-discrete approach to generative network training via explicit Wasserstein minimization

Y Chen, M Telgarsky, C Zhang, B Bailey, D Hsu… - arXiv preprint arXiv …, 2019 - arxiv.org

This paper provides a simple procedure to fit generative networks to target distributions, with the goal of a small Wasserstein distance (or other optimal transport costs). The approach is based on two principles:(a) if the source randomness of the network is a continuous …

Cited by 3 Related articles All 10 versions

 A gradual, semi-discrete approach to generative network training via explicit Wasserstein minimization

3 citations*

2019 ARXIV: LEARNING

View More 

 Confidence Regions in Wasserstein Distributionally Robust Estimation

7 citations* for all

0 citations*

2021 BIOMETRIKA

Jose Blanchet 1,Karthyek Murthy 2,Nian Si 1

1 Stanford University ,2 Singapore University of Technology and Design

Robust optimization

Estimator

View More (8+) 

Wasserstein distributionally robust optimization estimators are obtained as solutions of min-max problems in which the statistician selects a parameter minimizing the worst-case loss among all probability models within a certain distance (in a Wasserstein sense) from the underlying empirical measure... View Full Abstract 

Cited by 23 Related articles All 7 versions

 2019 

 Irregularity of distribution in Wasserstein distance

2 citations*

2019 ARXIV: CLASSICAL ANALYSIS AND ODES

View More 

 Unsupervised Feature Extraction in Hyperspectral Images Based on Wasserstein Generative Adversarial Network

62 citations*

2019 IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING

Mingyang Zhang 1,Maoguo Gong 1,Yishun Mao 1,Jun Li 2,Yue Wu 1

1 Xidian University ,2 Sun Yat-sen University

Feature extraction

Feature (computer vision)

View More (8+) 

Feature extraction (FE) is a crucial research area in hyperspectral image (HSI) processing. Recently, due to the powerful ability of deep learning (DL) to extract spatial and spectral features, DL-based FE methods have shown great potentials for HSI processing. However, most of the DL-based FE metho... View Full Abstract 


 2019 

 Unsupervised Adversarial Domain Adaptation Based on The Wasserstein Distance For Acoustic Scene Classification

13 citations* for all

13 citations*

2019 WORKSHOP ON APPLICATIONS OF SIGNAL PROCESSING TO AUDIO AND ACOUSTICS

Konstantinos Drossos ,Paul Magron ,Tuomas Virtanen

Tampere University,Audio Research Group,Tampere,Finland

Deep learning

Discriminative model

View More (9+) 

A challenging problem in deep learning-based machine listening field is the degradation of the performance when using data from unseen conditions. In this paper we focus on the acoustic scene classification (ASC) task and propose an adversarial deep learning method to allow adapting an acoustic scen... View Full Abstract 

Cited by 28 Related articles All 9 versions

 2019 

 Multi-marginal Wasserstein GAN.

0 citations*

2019 ARXIV: LEARNING

View More 

 Primal Dual Methods for Wasserstein Gradient Flows

25 citations* for all

4 citations*

2021 FOUNDATIONS OF COMPUTATIONAL MATHEMATICS

José A. Carrillo 1,Katy Craig 2,Li Wang 3,Chaozhen Wei 4

1 Imperial College London ,2 University of California, Santa Barbara ,3 University of Minnesota ,4 Hong Kong University of Science and Technology

Discretization

Discrete time and continuous time

View More (8+) 

Combining the classical theory of optimal transport with modern operator splitting techniques, we develop a new numerical method for nonlinear, nonlocal partial differential equations, arising in models of porous media, materials science, and biological swarming. Our method proceeds as follows: firs... View Full Abstract 

Cited by 51 Related articles All 11 versions 

2019

2019 see 2020

 Irregularity of distribution in Wasserstein distance

2 citations*

2019 ARXIV: CLASSICAL ANALYSIS AND ODES

View More 

 Low Dose CT Image Denoising Using a Generative Adversarial Network with Wasserstein Distance and Perceptual Loss

331 citations*

2017 ARXIV: COMPUTER VISION AND PATTERN RECOGNITION

Qingsong Yang ,Pingkun Yan ,Yanbo Zhang ,Hengyong Yu ,Yongyi Shi see all 8 authors

Distribution (mathematics)

Image noise

View More (9+) 

In this paper, we introduce a new CT image denoising method based on the generative adversarial network (GAN) with Wasserstein distance and perceptual similarity. The Wasserstein distance is a key concept of the optimal transform theory, and promises to improve the performance of the GAN. The percep... View Full Abstract 


2019 see 2020

 Modeling EEG data distribution with a Wasserstein Generative Adversarial Network to predict RSVP Events

1 citations*

2019 ARXIV: IMAGE AND VIDEO PROCESSING

View More 

 Distributionally Robust Stochastic Optimization with Wasserstein Distance

378 citations*

2016 ARXIV: OPTIMIZATION AND CONTROL

Rui Gao ,Anton J. Kleywegt

Georgia Institute of Technology

Distribution (mathematics)

Stochastic optimization

View More (8+) 

Distributionally robust stochastic optimization (DRSO) is an approach to optimization under uncertainty in which, instead of assuming that there is an underlying probability distribution that is known exactly, one hedges against a chosen set of distributions. In this paper we first point out that th... View Full Abstract 


2019  [PDF] arxiv.org

The Wasserstein-Fourier distance for stationary time series

E Cazelles, A Robert, F Tobar - arXiv preprint arXiv:1912.05509, 2019 - arxiv.org

We propose the Wasserstein-Fourier (WF) distance to measure the (dis) similarity between

time series by quantifying the displacement of their energy across frequencies. The WF

distance operates by calculating the Wasserstein distance between the (normalised) power …

  Cited by 1 Related articles All 2 versions 
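
A rough sketch of the recipe as stated in the abstract (my reading, with arbitrary toy signals): form each series' normalised power spectral density and take the one-dimensional Wasserstein distance between the two spectral distributions.

```python
import numpy as np
from scipy.signal import periodogram
from scipy.stats import wasserstein_distance

def wf_distance(x, y, fs=1.0):
    """W1 between the normalised power spectral densities of two series."""
    fx, px = periodogram(x, fs=fs)
    fy, py = periodogram(y, fs=fs)
    px, py = px / px.sum(), py / py.sum()   # normalise spectra to probability masses
    return wasserstein_distance(fx, fy, u_weights=px, v_weights=py)

rng = np.random.default_rng(0)
t = np.arange(2000)
x = np.sin(2 * np.pi * 0.05 * t) + 0.1 * rng.normal(size=t.size)
y = np.sin(2 * np.pi * 0.10 * t) + 0.1 * rng.normal(size=t.size)
print(wf_distance(x, y))   # larger when the dominant frequencies differ more
```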


2019  

[CITATION] A general solver to the elliptical mixture model through an approximate wasserstein manifold

S Li, Z Yu, M Xiang, D Mandic - arXiv preprint arXiv:1906.03700, 2019

  Cited by 1 Related articles

[PDF] A general solver to the elliptical mixture model through an ...

https://www.semanticscholar.org › paper › A-general-solv...

We thus resort to an efficient optimisation on a statistical manifold defined under an approximate Wasserstein distance, which allows for explicit metrics 


   

2019  [PDF] sciencedirect.com

On differentiability in the Wasserstein space and well-posedness for Hamilton–Jacobi equations

W GangboA Tudorascu - Journal de Mathématiques Pures et Appliquées, 2019 - Elsevier

In this paper we elucidate the connection between various notions of differentiability in the

Wasserstein space: some have been introduced intrinsically (in the Wasserstein space, by

using typical objects from the theory of Optimal Transport) and used by various authors to …

  Cited by 39 Related articles All 4 versions


 

2019

Generating EEG signals of an RSVP experiment by a class conditioned wasserstein generative adversarial network

S Panwar, P Rad, J Quarles… - 2019 IEEE International …, 2019 - ieeexplore.ieee.org

Electroencephalography (EEG) data is difficult to obtain due to complex experimental setups

and reduced comfort due to prolonged wearing. This poses challenges to train powerful

deep learning model due to the limited EEG data. Hence, being able to generate EEG data …

  Cited by 7 Related articles All 2 versions

<——2019—–—2019 ——2440— 


2019  [PDF] arxiv.org

Group level MEG/EEG source imaging via optimal transport: minimum Wasserstein estimates

H Janati, T Bazeille, B Thirion, M Cuturi… - … Information Processing in …, 2019 - Springer

Magnetoencephalography (MEG) and electroencephalography (EEG) are non-invasive

modalities that measure the weak electromagnetic fields generated by neural activity.

Inferring the location of the current sources that generated these magnetic fields is an ill …

  Cited by 5 Related articles All 14 versions


MR4141947 Prelim Pinetz, Thomas; Soukup, Daniel; Pock, Thomas; On the estimation of the Wasserstein distance in generative models. Pattern recognition, 156–170, Lecture Notes in Comput. Sci., 11824, Springer, Cham, [2019], ©2019. 94A16


[PDF] arxiv.org

On the estimation of the Wasserstein distance in generative models

T Pinetz, D Soukup, T Pock - German Conference on Pattern Recognition, 2019 - Springer

… is to use the Wasserstein distance as loss function leading to Wasserstein Generative … Using

this as a basis, we show various ways in which the Wasserstein distance is estimated for the …

Cited by 5 Related articles All 5 versions

2019

 Adapted Wasserstein Distances and Stability in Mathematical Finance

4 citations*

2019 ARXIV: MATHEMATICAL FINANCE

View More 

Adapted Wasserstein Distances and Stability in Mathematical Finance

2019 see 2020

[CITATION] Adapted wasserstein distances and stability in mathematical finance. arXiv e-prints, page

J Backhoff-Veraguas, D Bartl, M Beiglböck, M Eder - arXiv preprint arXiv:1901.07450, 2019

  Cited by 3 Related articles


2019 RESEARCH PAPERS IN ECONOMICS

View More 

 Quantum Statistical Learning via Quantum Wasserstein Natural Gradient

1 citations* for all

1 citations*

2021 JOURNAL OF STATISTICAL PHYSICS

Simon Becker 1,Wuchen Li 2

1 University of Cambridge ,2 University of South Carolina

Riemannian manifold

Wasserstein metric

View More (8+) 

In this article, we introduce a new approach towards the statistical learning problem $\operatorname{argmin}_{\rho(\theta) \in \mathcal{P}_{\theta}} W_{Q}^{2}(\rho_{\star}, \rho(\theta))$ to approxim... View Full Abstract 

  

2019

 On the Wasserstein Distance between Classical Sequences and the Lebesgue Measure.

1 citations*

2019 ARXIV: CLASSICAL ANALYSIS AND ODES

View More 


2019

 Wasserstein Index Generation Model: Automatic Generation of Time-series Index with Application to Economic Policy Uncertainty

0 citations*

2019 RESEARCH PAPERS IN ECONOMICS

View More 

 Wasserstein Index Generation Model: Automatic Generation of Time-series Index with Application to Economic Policy Uncertainty

0 citations*

2019 ARXIV: GENERAL ECONOMICS

View More 

 Solving General Elliptical Mixture Models through an Approximate Wasserstein Manifold.

3 citations* for all

1 citations*

2020 NATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE

Shengxi Li ,Zeyang Yu ,Min Xiang ,Danilo P. Mandic

Imperial College London

Divergence (statistics)

Statistical manifold

View More (8+) 

We address the estimation problem for general finite mixture models, with a particular focus on the elliptical mixture models (EMMs). Compared to the widely adopted Kullback-Leibler divergence, we show that the Wasserstein distance provides a more desirable optimisation space. We thus provide a stab... View Full Abstract 


2019

 Solving general elliptical mixture models through an approximate Wasserstein manifold

2 citations*

2019 ARXIV: LEARNING

View More 

 Differential inclusions in Wasserstein spaces: The Cauchy-Lipschitz framework

10 citations* for all

10 citations*

2021 JOURNAL OF DIFFERENTIAL EQUATIONS

Benoît Bonnet ,Hélène Frankowska

University of Paris

Differential inclusion

Lipschitz continuity

View More (8+) 

Abstract In this article, we propose a general framework for the study of differential inclusions in the Wasserstein space of probability measures. Based on earlier geometric insights on the structure of continuity equations, we define solutions of differential inclusions as absolutely continuous ... View Full Abstract 


2019

 The quadratic Wasserstein metric for inverse data matching

0 citations*

2019 ARXIV: NUMERICAL ANALYSIS

View More 

On the Wasserstein distance between mutually singular measures

4 citations*

2020 ADVANCES IN CALCULUS OF VARIATIONS

Giuseppe Buttazzo 1,Guillaume Carlier 2,Maxime Laborde 3

1 University of Pisa ,2 University of Paris ,3 McGill University

Constant (mathematics)

Bounded function

View More (8+) 

We study the Wasserstein distance between two measures µ, ν which are mutually singular. In particular, we are interested in minimization problems of the form \(W(\mu, A) = \inf\{W(\mu,\nu) : \nu \in A\}\), where µ is a given probability and A is contained in the class of probabilities that are singular with re... View Full Abstract 

Cited by 1 Related articles 

2019 

 Primal dual methods for Wasserstein gradient flows

21 citations*

2019 ARXIV: NUMERICAL ANALYSIS

View More 

 Learning Embeddings into Entropic Wasserstein Spaces

6 citations* for all

2 citations*

2019 INTERNATIONAL CONFERENCE ON LEARNING REPRESENTATIONS

Charlie Frogner 1,Farzaneh Mirzazadeh 2,Justin Solomon 1

1 Massachusetts Institute of Technology ,2 IBM

Embedding

Metric (mathematics)

View More (8+) 

Euclidean embeddings of data are fundamentally limited in their ability to capture latent semantic structures, which need not conform to Euclidean spatial assumptions. Here we consider an alternative, which embeds data as discrete probability distributions in a Wasserstein space, endowed with an opt... View Full Abstract 


 2019 

 Learning Embeddings into Entropic Wasserstein Spaces.

4 citations*

2019 ARXIV: LEARNING

View More 

 Parameter-Transferred Wasserstein Generative Adversarial Network (PT-WGAN) for Low-Dose PET Image Denoising

3 citations* for all

3 citations*

2021 IEEE TRANSACTIONS ON RADIATION AND PLASMA MEDICAL SCIENCES

Yu Gong 1,Hongming Shan 2,Yueyang Teng 1,Ning Tu 3,Ming Li 4 see all 8 authors

1 Northeastern University (China) ,2 Fudan University ,3 Wuhan University ,4 MI Research and Development Division, Neusoft Medical Systems Company, Ltd., Shenyang, China

Image noise

Noise reduction

View More (9+) 

Due to the widespread of positron emission tomography (PET) in clinical practice, the potential risk of PET-associated radiation dose to patients needs to be minimized. However, with the reduction in the radiation dose, the resultant images may suffer from noise and artifacts that compromise diagnos... View Full Abstract 

EXCERPTS (52)

Cited by 12 Related articles All 7 versions 

<——2019—–—2019 ——2450—


 
 2019 

 Parameter-Transferred Wasserstein Generative Adversarial Network (PT-WGAN) for Low-Dose PET Image Denoising

0 citations*

2019 ARXIV: IMAGE AND VIDEO PROCESSING

View More 

 WGANSing: A Multi-Voice Singing Voice Synthesizer Based on the Wasserstein-GAN

32 citations*

2019 EUROPEAN SIGNAL PROCESSING CONFERENCE

Pritish Chandna ,Merlijn Blaauw ,Jordi Bonada ,Emilia Gomez

Pompeu Fabra University

Speech synthesis

Timbre

View More (9+) 

We present a deep neural network based singing voice synthesizer, inspired by the Deep Convolutions Generative Adversarial Networks (DCGAN) architecture and optimized using the Wasserstein-GAN algorithm. We use vocoder parameters for acoustic modelling, to separate the influence of pitch and timbre.... View Full Abstract 


 2019 

 A Wasserstein Inequality and Minimal Green Energy on Compact Manifolds

5 citations*

2019 ARXIV: CLASSICAL ANALYSIS AND ODES

View More 

 Sparsemax and Relaxed Wasserstein for Topic Sparsity

17 citations* for all

17 citations*

2019 WEB SEARCH AND DATA MINING

Tianyi Lin ,Zhiyue Hu ,Xin Guo

University of California, Berkeley

Softmax function

Stability (learning theory)

View More (9+) 

Topic sparsity refers to the observation that individual documents usually focus on several salient topics instead of covering a wide variety of topics, and a real topic adopts a narrow range of terms instead of a wide coverage of the vocabulary. Understanding this topic sparsity is especially impor... View Full Abstract 


WassRank: Listwise document ranking using optimal transport theory

HT Yu, A Jatowt, H Joho, JM Jose, X Yang… - Proceedings of the …, 2019 - dl.acm.org

… from ground truth labels based on the Wasserstein distance. (2) … Wasserstein distance. In 

Section 3, we present how to perform listwise document ranking based on tailored Wasserstein

Cited by 17 Related articles All 2 versions
WassRank: Listwise Document Ranking Using Optimal ...

http://eprints.gla.ac.uk › ...

Oct 25, 2019 — We propose a novel ranking method, referred to as WassRank, under which the problem of listwise document ranking boils down to the task of ...

ISBN: 9781450359405



2019 

 Strong equivalence between metrics of Wasserstein type

6 citations*

2019 ARXIV: PROBABILITY

View More 

 Wasserstein GAN-Based Small-Sample Augmentation for New-Generation Artificial Intelligence: A Case Study of Cancer-Staging Data in Biology

58 citations*

2019 ENGINEERING

Yufei Liu 1,2,3,Yuan Zhou 3,Xin Liu 2,Fang Dong 3,Chang Wang 2 see all 6 authors

1 Chinese Academy of Engineering ,2 Huazhong University of Science and Technology ,3 Tsinghua University

Test set

Deep learning

View More (7+) 

Abstract It is essential to utilize deep-learning algorithms based on big data for the implementation of the new generation of artificial intelligence. Effective utilization of deep learning relies considerably on the number of labeled samples, which restricts the application of deep learning in a... View Full Abstract 

Wasserstein GAN-based small-sample augmentation for new-generation artificial intelligence: a case study of cancer-

 2019 

 Semi-Supervised Multitask Learning on Multispectral Satellite Images Using Wasserstein Generative Adversarial Networks (GANs) for Predicting Poverty

9 citations*

2019 ARXIV: COMPUTER VISION AND PATTERN RECOGNITION

Anthony Perez ,Swetava Ganguli ,Stefano Ermon ,George Azzari ,Marshall Burke see all 6 authors

Stanford University

Multi-task learning

Machine learning

View More (8+) 

Obtaining reliable data describing local poverty metrics at a granularity that is informative to policy-makers requires expensive and logistically difficult surveys, particularly in the developing world. Not surprisingly, the poverty stricken regions are also the ones which have a high probability o... View Full Abstract 


 2019 


 Modeling EEG data distribution with a Wasserstein Generative Adversarial Network to predict RSVP Events

0 citations*

2019 ARXIV: IMAGE AND VIDEO PROCESSING

View More 

 Multivariate approximations in Wasserstein distance by Stein’s method and Bismut’s formula

20 citations* for all

18 citations*

2019 PROBABILITY THEORY AND RELATED FIELDS

Xiao Fang 1,Qi-Man Shao 1,Lihu Xu 2

1 The Chinese University of Hong Kong ,2 University of Macau

Stein's method

Malliavin calculus

View More (8+) 

Stein’s method has been widely used for probability approximations. However, in the multi-dimensional setting, most of the results are for multivariate normal approximation or for test functions with bounded second- or higher-order derivatives. For a class of multivariate limiting distributions, we ... View Full Abstract 


 2019 

 Data-Driven Chance Constrained Optimization under Wasserstein Ambiguity Sets

21 citations* for all

20 citations*

2019 ADVANCES IN COMPUTING AND COMMUNICATIONS

Ashish R. Hota 1,Ashish Cherukuri 2,John Lygeros 3

1 Indian Institutes of Technology ,2 University of Groningen ,3 ETH Zurich

Wasserstein metric

Constraint (information theory)

View More (8+) 

We present a data-driven approach for distri-butionally robust chance constrained optimization problems (DRCCPs). We consider the case where the decision maker has access to a finite number of samples or realizations of the uncertainty. The chance constraint is then required to hold for all distribu... View Full Abstract 
Cited by 31
 Related articles All 7 versions


2019

  Risk-Based Distributionally Robust Optimal Gas-Power Flow With Wasserstein Distance

31 citations*

2019 IEEE TRANSACTIONS ON POWER SYSTEMS

Cheng Wang 1,Rui Gao 2,Wei Wei 3,Miadreza Shafie-khah 4,Tianshu Bi 1 see all 6 authors

1 North China Electric Power University ,2 University of Texas at Austin ,3 Tsinghua University ,4 Institute for Systems and Computer Engineering, Technology and Science, Porto, Portugal

Wind power

Robust optimization

View More (8+) 

Gas-fired units and power-to-gas facilities provide pivotal backups for power systems with volatile renewable generations. The deepened system interdependence calls for elaborate consideration of network models of both natural gas and power systems, as well as uncertain factors. This paper proposes ... View Full Abstract 


 2019 

 The Gromov-Wasserstein distance between networks and stable network invariants

27 citations* for all

19 citations*

2019 INFORMATION AND INFERENCE: A JOURNAL OF THE IMA

Samir Chowdhury ,Facundo Mémoli

Ohio State University

Stochastic block model

Generative model

View More (6+) 

We define a metric---the network Gromov-Wasserstein distance---on weighted, directed networks that is sensitive to the presence of outliers. In addition to proving its theoretical properties, we supply network invariants based on optimal transport that approximate this distance by means of lower bou... View Full Abstract 


 2019 

 Gromov-Wasserstein Factorization Models for Graph Clustering

3 citations*

2019 ARXIV: LEARNING

View More 

 Adapted Wasserstein distances and stability in mathematical finance

26 citations* for all

22 citations*

2020 FINANCE AND STOCHASTICS

Julio Backhoff-Veraguas 1,2,Daniel Bartl 2,Mathias Beiglböck 2,Manu Eder 2

1 University of Twente ,2 University of Vienna

Mathematical finance

Semimartingale

View More (8+) 

Assume that an agent models a financial asset through a measure Q with the goal to price/hedge some derivative or optimise some expected utility. Even if the model is chosen in the most skilful and sophisticated way, the agent is left with the possibility that Q does not provide an exact descripti... View Full Abstract 

<——2019—–—2019 ——2460— 


 

 Data-driven regularization of Wasserstein barycenters with an application to multivariate density registration

13 citations* for all

9 citations*

2019 INFORMATION AND INFERENCE: A JOURNAL OF THE IMA

Jérémie Bigot 1,Elsa Cazelles 1,Nicolas Papadakis 2

1 Institut de Mathématiques de Bordeaux,2 Centre national de la recherche scientifique

Wasserstein metric

Population

View More (8+) 

We present a framework to simultaneously align and smooth data in the form of multiple point clouds sampled from unknown densities with support in a d-dimensional Euclidean space. This work is motivated by applications in bio-informatics where researchers aim to automatically normalize large dataset... View Full Abstract 

Cited by 21 Related articles All 7 versions

2019

https://arxiv.org › q-fin

by J Backhoff-Veraguas · 2019 · Cited by 25 — Assume that an agent models a financial asset through a measure Q with the goal to price / hedge some derivative or optimize some expected ...

Cite as: arXiv:1901.07450

[CITATION] Adapted wasserstein distances and stability in mathematical finance. arXiv e-prints, page

J Backhoff-Veraguas, D Bartl, M Beiglböck, M Eder - arXiv preprint arXiv:1901.07450, 2019

  Cited by 3 Related articles


2019  [PDF] arxiv.org

Sparsemax and relaxed Wasserstein for topic sparsity

T Lin, Z Hu, X Guo - Proceedings of the Twelfth ACM International …, 2019 - dl.acm.org

Topic sparsity refers to the observation that individual documents usually focus on several

salient topics instead of covering a wide variety of topics, and a real topic adopts a narrow

range of terms instead of a wide coverage of the vocabulary. Understanding this topic …

  Cited by 17 Related articles All 6 versions


2019

A general solver to the elliptical mixture model ... - DeepAI

https://deepai.org › publication › a-general-solver-to-the-e...

Jun 9, 2019 — We thus resort to an efficient optimisation on a statistical manifold defined under an approximate Wasserstein distance, which allows for ...

[CITATION] A general solver to the elliptical mixture model through an approximate wasserstein manifold

S Li, Z Yu, M Xiang, D Mandic - arXiv preprint arXiv:1906.03700, 2019

  Cited by 1 Related articles

2019  [PDF] sciencedirect.com

A partial Laplacian as an infinitesimal generator on the Wasserstein space

YT Chow, W Gangbo - Journal of Differential Equations, 2019 - Elsevier

In this manuscript, we consider special linear operators which we term partial Laplacians on

the Wasserstein space, and which we show to be partial traces of the Wasserstein Hessian.

We verify a distinctive smoothing effect of the “heat flows” they generated for a particular …

  Cited by 14 Related articles All 9 versions


2019


Generating EEG signals of an RSVP experiment by a class conditioned wasserstein generative adversarial network

S Panwar, P Rad, J Quarles… - 2019 IEEE International …, 2019 - ieeexplore.ieee.org

Electroencephalography (EEG) data is difficult to obtain due to complex experimental setups

and reduced comfort due to prolonged wearing. This poses challenges to train powerful

deep learning model due to the limited EEG data. Hence, being able to generate EEG data …

  Cited by 7 Related articles All 2 versions


2019  [PDF] arxiv.org

Semi-supervised multitask learning on multispectral satellite images using wasserstein generative adversarial networks (gans) for predicting poverty

A Perez, S Ganguli, S Ermon, G Azzari, M Burke… - arXiv preprint arXiv …, 2019 - arxiv.org

Obtaining reliable data describing local poverty metrics at a granularity that is informative to

policy-makers requires expensive and logistically difficult surveys, particularly in the

developing world. Not surprisingly, the poverty stricken regions are also the ones which …

  Cited by 21 Related articles All 6 versions 


2019

Relaxed Wasserstein, Generative Adversarial Networks, Variational Autoencoders and their applications

N Yang - 2019 - escholarship.org

Statistical divergences play an important role in many data-driven applications. Two notable

examples are Distributionally Robust Optimization (DRO) problems and Generative …



2019

Riemannian normalizing flow on variational wasserstein autoencoder for text modeling

PZ Wang, WY Wang - arXiv preprint arXiv:1904.02399, 2019 - arxiv.org

Recurrent Variational Autoencoder has been widely used for language modeling and text

generation tasks. These models often face a difficult optimization problem, also known as …

 Cited by 18 Related articles All 6 versions 

 All 2 versions 

Riemannian Normalizing Flow on Variational Wasserstein Autoencoder for Text Modeling

P Zizhuang Wang, WY Wang - arXiv e-prints, 2019 - ui.adsabs.harvard.edu

Abstract Recurrent Variational Autoencoder has been widely used for language modeling

and text generation tasks. These models often face a difficult optimization problem, also …



2019  [PDF] mlr.press

On the complexity of approximating Wasserstein barycenters

A Kroshnin, N Tupitsa, D Dvinskikh… - International …, 2019 - proceedings.mlr.press

We study the complexity of approximating the Wasserstein barycenter of $ m $ discrete

measures, or histograms of size $ n $, by contrasting two alternative approaches that use …

 Cited by 58 Related articles All 12 versions 

[CITATION] On the Complexity of Approximating Wasserstein Barycenter. eprint

A Kroshnin, D Dvinskikh, P Dvurechensky, A Gasnikov… - arXiv preprint arXiv …, 2019

 Cited by 3 Related articles

On the Complexity of Approximating Wasserstein Barycenters

P Dvurechensky - icml.cc

… On the Complexity of Approximating Wasserstein Barycenters. Iterative Bregman Projections:

$$\min_{\pi_l \mathbf{1} = p_l,\ \pi_l^{T}\mathbf{1} = \pi_{l+1}^{T}\mathbf{1},\ \pi_l \in \mathbb{R}^{n\times n}_{+},\ l=1,\dots,m}\ \frac{1}{m}\sum_{l=1}^{m}\left\{\langle \pi_l, C_l\rangle + \gamma H(\pi_l)\right\}$$ …

 All 4 versions 
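
As a companion to the iterative-Bregman-projections objective quoted above, here is a minimal NumPy sketch of a fixed-support Wasserstein barycenter computed by iterative Bregman projections (in the spirit of Benamou et al.); the function name, the uniform weights, the regularisation value and the toy data are illustrative assumptions, not taken from the cited slides or paper.

import numpy as np

def ibp_barycenter(hists, C, gamma=0.05, iters=500):
    """Fixed-support Wasserstein barycenter of the rows of `hists`
    (histograms on a common grid) via iterative Bregman projections,
    with uniform weights and entropic regularisation `gamma`."""
    m, n = hists.shape
    K = np.exp(-C / gamma)                      # Gibbs kernel for cost matrix C
    v = np.ones((m, n))
    for _ in range(iters):
        u = hists / (K @ v.T).T                 # enforce pi_l 1 = p_l
        Ktu = (K.T @ u.T).T
        q = np.exp(np.log(v * Ktu).mean(axis=0))  # geometric-mean projection
        v = q / Ktu                             # enforce the common second marginal q
    return q

# toy usage: barycenter of three bumps on a 1-D grid with squared cost
x = np.linspace(0.0, 1.0, 60)
C = (x[:, None] - x[None, :]) ** 2
p = np.stack([np.exp(-(x - c) ** 2 / 0.01) for c in (0.2, 0.5, 0.8)])
p /= p.sum(axis=1, keepdims=True)
bary = ibp_barycenter(p, C)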

<——2019—–—2019 ——2470— 



 2019

Time delay estimation via Wasserstein distance minimization

JM Nichols, MN Hutchinson, N Menkart… - IEEE Signal …, 2019 - ieeexplore.ieee.org

Time delay estimation between signals propagating through nonlinear media is an important

problem with application to radar, underwater acoustics, damage detection, and …

 Cited by 4 Related articles All 2 versions


2019  [PDF] illinois.edu

Deep generative models via explicit Wasserstein minimization

Y Chen - 2019 - ideals.illinois.edu

This thesis provides a procedure to fit generative networks to target distributions, with the

goal of a small Wasserstein distance (or other optimal transport costs). The approach is …

 Related articles All 3 versions 


2019

[CITATION] Approximating wasserstein distances with pytorch

D Daza - 2019

 Cited by 4 Related articles
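
The note cited just above is a PyTorch tutorial; as a language-agnostic companion, the following NumPy sketch shows the same basic idea, the entropic (Sinkhorn) approximation of the Wasserstein cost between two histograms. The function name, regularisation value and toy data are illustrative assumptions rather than the cited code.

import numpy as np

def sinkhorn_distance(a, b, C, reg=0.5, iters=200):
    """Entropy-regularised approximation of the Wasserstein cost between
    histograms a and b under the ground-cost matrix C."""
    K = np.exp(-C / reg)               # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(iters):
        u = a / (K @ (b / (K.T @ u)))  # alternating Sinkhorn scalings
    v = b / (K.T @ u)
    P = u[:, None] * K * v[None, :]    # approximate optimal transport plan
    return float(np.sum(P * C))        # transport cost under that plan

# toy example: two histograms on the grid {0,...,9} with squared ground cost
x = np.arange(10, dtype=float)
C = (x[:, None] - x[None, :]) ** 2
a = np.ones(10) / 10
b = np.exp(-(x - 6.0) ** 2 / 4.0); b /= b.sum()
print(sinkhorn_distance(a, b, C))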



2019

 Modeling EEG data distribution with a Wasserstein Generative Adversarial Network to predict RSVP Events

1 citations*

2019 ARXIV: IMAGE AND VIDEO PROCESSING

View More 

 Distributionally Robust Stochastic Optimization with Wasserstein Distance

378 citations*

2016 ARXIV: OPTIMIZATION AND CONTROL

Rui Gao ,Anton J. Kleywegt

Georgia Institute of Technology

Distribution (mathematics)

Stochastic optimization

View More (8+) 

Distributionally robust stochastic optimization (DRSO) is an approach to optimization under uncertainty in which, instead of assuming that there is an underlying probability distribution that is known exactly, one hedges against a chosen set of distributions. In this paper we first point out that th... View Full Abstract 


 
2019

 Sliced Wasserstein Discrepancy for Unsupervised Domain Adaptation

178 citations* for all

152 citations*

2019 COMPUTER VISION AND PATTERN RECOGNITION

Chen-Yu Lee ,Tanmay Batra ,Mohammad Haris Baig ,Daniel Ulbricht

Apple Inc.

Wasserstein metric

Feature (machine learning)

View More (10+) 

In this work, we connect two distinct concepts for unsupervised domain adaptation: feature distribution alignment between domains by utilizing the task-specific decision boundary and the Wasserstein metric. Our proposed sliced Wasserstein discrepancy (SWD) is designed to capture the natural notion o... View Full Abstract 

EXCERPTS (126)

Cited by 318 Related articles All 10 versions 
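
The paper above builds the sliced Wasserstein discrepancy into a domain-adaptation loss; its generic ingredient, the sliced Wasserstein distance itself, can be estimated by random 1-D projections and sorting, as in the small sketch below (equal-size point clouds, uniform weights; all names and the toy data are illustrative assumptions, not the authors' code).

import numpy as np

def sliced_wasserstein(X, Y, n_proj=200, p=2, seed=0):
    """Monte-Carlo estimate of the sliced Wasserstein-p distance between
    two equal-size point clouds X and Y of shape (n, d)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)           # random direction on the sphere
        xs = np.sort(X @ theta)                  # 1-D projections of both clouds
        ys = np.sort(Y @ theta)
        total += np.mean(np.abs(xs - ys) ** p)   # 1-D W_p^p via sorted matching
    return (total / n_proj) ** (1.0 / p)

# toy usage: two Gaussian clouds in the plane, shifted by 2 along the x-axis
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 2))
Y = rng.normal(size=(500, 2)) + np.array([2.0, 0.0])
print(sliced_wasserstein(X, Y))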

2019

2019  [PDF] escholarship.org

Relaxed Wasserstein, Generative Adversarial Networks, Variational Autoencoders and their applications

N Yang - 2019 - escholarship.org

Statistical divergences play an important role in many data-driven applications. Two notable

examples are Distributionally Robust Optimization (DRO) problems and Generative …

 All 2 versions 


2019  [PDF] sciencedirect.com

On differentiability in the Wasserstein space and well-posedness for Hamilton–Jacobi equations

W Gangbo, A Tudorascu - Journal de Mathématiques Pures et Appliquées, 2019 - Elsevier

In this paper we elucidate the connection between various notions of differentiability in the

Wasserstein space: some have been introduced intrinsically (in the Wasserstein space, by …

 Cited by 42 Related articles All 4 versions


2019  [PDF] arxiv.org

Temporal Wasserstein non-negative matrix factorization for non-rigid motion segmentation and spatiotemporal deconvolution

E Varol, A Nejatbakhsh, C McGrory - arXiv preprint arXiv:1912.03463, 2019 - arxiv.org

Motion segmentation for natural images commonly relies on dense optic flow to yield point

trajectories which can be grouped into clusters through various means including spectral …

 Cited by 3 Related articles All 5 versions 


2019  [PDF] arxiv.org

Local well-posedness in the Wasserstein space for a chemotaxis model coupled to Navier-Stokes equations

K Kang, HK Kim - arXiv preprint arXiv:1907.01895, 2019 - arxiv.org

We consider a coupled system of Keller-Segel type equations and the incompressible

Navier-Stokes equations in spatial dimension two and three. In the previous work [19], we …

 Related articles All 3 versions 


2019  2019  [PDF] arxiv.org

Universality of persistence diagrams and the bottleneck and Wasserstein distances

P Bubenik, A Elchesen - arXiv preprint arXiv:1912.02563, 2019 - arxiv.org

We prove that persistence diagrams with the p-Wasserstein distance form the universal p-

subadditive commutative monoid on an underlying metric space with a distinguished subset …

 Cited by 6 Related articles All 4 versions 

<——2019—–—2019 ——2480— 



2019  Gromov-Wasserstein Learning for Graph Matching and Node ...
https://arxiv.org › pdf
[PDF] by H Xu · 2019 · Cited by 83 — A novel Gromov-Wasserstein learning framework is proposed to jointly match (align) graphs and learn embedding vectors for the associated ...

Cited by 146 Related articles All 11 versions

[PDF] arxiv.org

Wasserstein distributionally robust optimization: Theory and applications in machine learning

D Kuhn, PM Esfahani, VA Nguyen… - … Science in the Age …, 2019 - pubsonline.informs.org

Many decision problems in science, engineering, and economics are affected by uncertain

parameters whose distribution is only indirectly observable through samples. The goal of

data-driven decision making is to learn a decision from finitely many training samples that …

 Cited by 121 Related articles All 7 versions
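
For orientation, the Wasserstein ambiguity set that recurs in the distributionally robust entries collected here (the Kuhn et al. survey just above, and the chance-constrained and two-stage variants that follow) is usually written, in generic notation not taken from any single one of these papers, as

$$\mathbb{B}_{\varepsilon}(\widehat{\mathbb{P}}_N)=\bigl\{\mathbb{Q}\,:\,W(\mathbb{Q},\widehat{\mathbb{P}}_N)\le\varepsilon\bigr\},\qquad \min_{x\in X}\ \sup_{\mathbb{Q}\in\mathbb{B}_{\varepsilon}(\widehat{\mathbb{P}}_N)}\ \mathbb{E}_{\xi\sim\mathbb{Q}}\bigl[\ell(x,\xi)\bigr],$$

where \(\widehat{\mathbb{P}}_N\) is the empirical distribution of the \(N\) training samples, \(W\) a Wasserstein distance, \(\varepsilon\) the radius of the ball and \(\ell\) the loss; chance-constrained versions replace the worst-case expectation by a worst-case probability requirement.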


[PDF] arxiv.org

On distributionally robust chance constrained programs with Wasserstein distance

W Xie - Mathematical Programming, 2019 - Springer

This paper studies a distributionally robust chance constrained program (DRCCP) with

Wasserstein ambiguity set, where the uncertain constraints should be satisfied with a

probability at least a given threshold for all the probability distributions of the uncertain …

 Cited by 74 Related articles All 9 versions


[PDF] sciencedirect.com

Decomposition algorithm for distributionally robust optimization using Wasserstein metric with an application to a class of regression models

F Luo, S Mehrotra - European Journal of Operational Research, 2019 - Elsevier

We study distributionally robust optimization (DRO) problems where the ambiguity set is

defined using the Wasserstein metric and can account for a bounded support. We show that

this class of DRO problems can be reformulated as decomposable semi-infinite programs …

 Cited by 29 Related articles All 6 versions


[PDF] researchgate.net

Wasserstein metric based distributionally robust approximate framework for unit commitment

R Zhu, H Wei, X Bai - IEEE Transactions on Power Systems, 2019 - ieeexplore.ieee.org

This paper proposed a Wasserstein metric-based distributionally robust approximate

framework (WDRA), for unit commitment problem to manage the risk from uncertain wind

power forecasted errors. The ambiguity set employed in the distributionally robust …

 Cited by 43 Related articles All 3 versions


2019


[PDF] researchgate.net

[PDF] Tractable reformulations of distributionally robust two-stage stochastic programs with∞− wasserstein distance

W Xie - arXiv preprint arXiv:1908.08454, 2019 - researchgate.net

In the optimization under uncertainty, decision-makers first select a wait-and-see policy

before any realization of uncertainty and then place a here-and-now decision after the

uncertainty has been observed. Two-stage stochastic programming is a popular modeling …

 Cited by 13 Related articles 

Cited by 2 Related articles All 2 versions 



[PDF] researchgate.net

Full View

Data-driven affinely adjustable distributionally robust framework for unit commitment based on Wasserstein metric

W Hou, R Zhu, H Wei… - IET Generation …, 2019 - ieeexplore.ieee.org

This study proposes a data-driven distributionally robust framework for unit commitment

based on Wasserstein metric considering the wind power generation forecasting errors. The

objective of the constructed model is to minimise the expected operating cost, including the …

 Cited by 12 Related articles All 5 versions 


[PDF] arxiv.org

Data-driven distributionally robust appointment scheduling over Wasserstein balls

R Jiang, M Ryu, G Xu - arXiv preprint arXiv:1907.03219, 2019 - arxiv.org

We study a single-server appointment scheduling problem with a fixed sequence of

appointments, for which we must determine the arrival time for each appointment. We

specifically examine two stochastic models. In the first model, we assume that all appointees …

 Cited by 11 Related articles All 4 versions 


[PDF] mdpi.com

Data-driven distributionally robust stochastic control of energy storage for wind power ramp management using the Wasserstein metric

I Yang - Energies, 2019 - mdpi.com

The integration of wind energy into the power grid is challenging because of its variability,

which causes high ramp events that may threaten the reliability and efficiency of power

systems. In this paper, we propose a novel distributionally robust solution to wind power …

 Cited by 3 Related articles All 6 versions 


Distributionally robust learning under the wasserstein metric

R Chen - 2019 - search.proquest.com

This dissertation develops a comprehensive statistical learning framework that is robust to

(distributional) perturbations in the data using Distributionally Robust Optimization (DRO)

under the Wasserstein metric. The learning problems that are studied include:(i) …

 Cited by 2 Related articles All 3 versions

<——2019—–—2019 ——2490—-



Data-driven distributionally robust shortest path problem using the Wasserstein ambiguity set

Z Wang, K You, S Song, C Shang - 2019 IEEE 15th …, 2019 - ieeexplore.ieee.org

This paper proposes a data-driven distributionally robust shortest path (DRSP) model where

the distribution of the travel time is only observable through a finite training dataset. Our

DRSP model adopts the Wasserstein metric to construct the ambiguity set of probability …

 Cited by 1 Related articles


[PDF] arxiv.org

Distributionally Robust XVA via Wasserstein Distance: Wrong Way Counterparty Credit and Funding Risk

D Singh, S Zhang - arXiv preprint arXiv:1910.01781, 2019 - arxiv.org

This paper investigates calculations of robust XVA, in particular, credit valuation adjustment

(CVA) and funding valuation adjustment (FVA) for over-the-counter derivatives under

distributional uncertainty using Wasserstein distance as the ambiguity measure. Wrong way …

 Cited by 1 Related articles All 8 versions 


[PDF] sciencedirect.com

Distributionally Robust Chance-Constraint Optimal Power Flow Considering Uncertain Renewables with Wasserstein-Moment Metric

J Liu, Y Chen, C Duan, J Lyu - Energy Procedia, 2019 - Elsevier

Chance-constraint optimal power flow has been proven as an efficient method to manage

the risk of volatile renewable energy sources. To address the uncertainties of renewable

energy sources, a novel distributionally robust chance-constraint OPF model is proposed in …

 Cited by 1 Related articles All 2 versions


[PDF] nsf.gov

A Distributionally Robust Optimization Approach for Multivariate Linear Regression under the Wasserstein Metric

R Chen, IC Paschalidis - 2019 IEEE 58th Conference on …, 2019 - ieeexplore.ieee.org

We present a Distributionally Robust Optimization (DRO) approach for Multivariate Linear

Regression (MLR), where multiple correlated response variables are to be regressed

against a common set of predictors. We develop a regularized MLR formulation that is robust …

 Related articles All 3 versions


Distributionally robust xva via wasserstein distance part 1: Wrong way counterparty credit risk

D Singh, S Zhang - Unknown Journal, 2019 - experts.umn.edu

This paper investigates calculations of robust CVA for OTC derivatives under distributional

uncertainty using Wasserstein distance as the ambiguity measure. Wrong way counterparty

credit risk can be characterized (and indeed quantified) via the robust CVA formulation. The …

 Cited by 1 Related articles 


2019


Relaxed Wasserstein with Applications to GANs

Xin Guo, Johnny Hong, Tianyi Lin, Nan Yang

[v5] Sat, 4 May 2019 08:49:44 UTC (4,232 KB)

[CITATION] Relaxed Wasserstein, with applications to GANs and distributionally robust optimization

X Guo, J Hong, T Lin, N Yang - Arxive Preprint Series, arXiv, 2019

 Cited by 1 Related articles

2019 see 2020

[PDF] arxiv.org

Distributionally Robust XVA via Wasserstein Distance Part 2: Wrong Way Funding Risk

D Singh, S Zhang - arXiv preprint arXiv:1910.03993, 2019 - arxiv.org

This paper investigates calculations of robust funding valuation adjustment (FVA) for over

the counter (OTC) derivatives under distributional uncertainty using Wasserstein distance as

the ambiguity measure. Wrong way funding risk can be characterized via the robust FVA …

 Related articles All 6 versions 

[CITATION] Distributionally robust xva via wasserstein distance part 1

D Singh, S Zhang - arXiv preprint arXiv:1910.01781, 2019

 Cited by 3 Related articles

[CITATION] Distributionally robust risk measures with structured Wasserstein ambiguity sets

VA Nguyen, D Filipovic, D Kuhn - 2019 - Working paper

 Cited by 2 Related articles


2019  [PDF] arxiv.org

Finsler structure for variable exponent Wasserstein space and gradient flows

A Marcos, A Soglo - arXiv preprint arXiv:1912.12450, 2019 - arxiv.org

In this paper, we propose a variational approach based on optimal transportation to study

the existence and uniqueness of solutions for a class of parabolic equations involving the

$q(x)$-Laplacian operator …

 Related articles All 2 versions 

<——2019—–—2019 ——2500—- 



 [PDF] arxiv.org

Bounding quantiles of Wasserstein distance between true and empirical measure

SN Cohen, MNA Tegnér, J Wiesel - arXiv preprint arXiv:1907.02006, 2019 - arxiv.org

Consider the empirical measure, $\hat {\mathbb {P}} _N $, associated to $ N $ iid samples of

a given probability distribution $\mathbb {P} $ on the unit interval. For fixed $\mathbb {P} $

the Wasserstein distance between $\hat {\mathbb {P}} _N $ and $\mathbb {P} $ is a random …

 Related articles All 5 versions 
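
In the one-dimensional setting of the entry above (measures on the unit interval), the Wasserstein-1 distance between the empirical measure and \(\mathbb{P}\) reduces to the area between the two distribution functions, \(W_1=\int_0^1|\hat F_N(t)-F(t)|\,dt\). The sketch below evaluates that identity numerically; the grid size, function names and the uniform toy example are illustrative assumptions.

import numpy as np

def w1_empirical_vs_cdf(samples, cdf, grid_size=10_000):
    """W_1 on [0, 1] between the empirical measure of `samples` and the
    distribution with distribution function `cdf`, via the identity
    W_1 = integral over [0, 1] of |F_hat(t) - F(t)| dt."""
    t = np.linspace(0.0, 1.0, grid_size)
    xs = np.sort(np.asarray(samples))
    F_hat = np.searchsorted(xs, t, side="right") / xs.size  # empirical CDF
    dt = t[1] - t[0]
    return float(np.sum(np.abs(F_hat - cdf(t))) * dt)       # Riemann sum

# toy example: 200 uniform samples against the uniform CDF F(t) = t
rng = np.random.default_rng(2)
print(w1_empirical_vs_cdf(rng.uniform(size=200), lambda t: t))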


2019  [PDF] researchgate.net

[PDF] Parallel Wasserstein Generative Adversarial Nets with Multiple Discriminators.

Y Su, S Zhao, X Chen, I King, MR Lyu - IJCAI, 2019 - researchgate.net

Abstract Wasserstein Generative Adversarial Nets (GANs) are newly proposed GAN

algorithms and widely used in computer vision, web mining, information retrieval, etc.

However, the existing algorithms with approximated Wasserstein loss converge slowly due …

 Cited by 3 Related articles All 2 versions 


2019 see 2020   [PDF] arxiv.org

Wasserstein covariance for multiple random densities

A Petersen, HG Müller - Biometrika, 2019 - academic.oup.com

A common feature of methods for analysing samples of probability density functions is that

they respect the geometry inherent to the space of densities. Once a metric is specified for

this space, the Fréchet mean is typically used to quantify and visualize the average density …

 Cited by 19 Related articles All 12 versions


 

2019  [PDF] arxiv.org

Modified massive Arratia flow and Wasserstein diffusion

V Konarovskyi, MK von Renesse - Communications on Pure …, 2019 - Wiley Online Library

Extending previous work by the first author we present a variant of the Arratia flow, which

consists of a collection of coalescing Brownian motions starting from every point of the unit

interval. The important new feature of the model is that individual particles carry mass that …

 Cited by 32 Related articles All 7 versions


2019

Wasserstein-generative networks

A Korotin, V Egiazarian, A Asadulaev, A Safin… - arXiv preprint arXiv …, 2019 - arxiv.org

We propose a novel end-to-end non-minimax algorithm for training optimal transport

mappings for the quadratic cost (Wasserstein-distance). The algorithm uses input convex

neural networks and a cycle-consistency regularization to approximate Wasserstein- …

 Cited by 22 Related articles All 7 versions 


2019


Using wasserstein-regularization to ensure fair decisions with neural-network classifiers

L Risser, Q Vincenot, N Couellan, JM Loubes - 2019 - hal.archives-ouvertes.fr

In this paper, we propose a new method to build fair Neural-Network classifiers by using a

constraint based on the Wasserstein distance. More specifically, we detail how to efficiently

compute the gradients of Wasserstein-regularizers for Neural-Networks. The proposed …

 Cited by 12 Related articles 


2019  [PDF] sciencedirect.com

A bound on the 2-Wasserstein distance between linear combinations of independent random variables

B Arras, E Azmoodeh, G Poly, Y Swan - Stochastic processes and their …, 2019 - Elsevier

We provide a bound on a distance between finitely supported elements and general

elements of the unit sphere of  (N). We use this bound to estimate the Wasserstein-2

distance between random variables represented by linear combinations of independent …

 Cited by 23 Related articles All 15 versions


2019 [PDF] projecteuclid.org

Wasserstein-2 bounds in normal approximation under local dependence

X Fang - Electronic Journal of Probability, 2019 - projecteuclid.org

We obtain a general bound for the Wasserstein-2 distance in normal approximation for sums

of locally dependent random variables. The proof is based on an asymptotic expansion for

expectations of second-order differentiable functions of the sum. We apply the main result to …

 Cited by 5 Related articles All 4 versions



2019  [PDF] arxiv.org

Distributionally Robust XVA via Wasserstein Distance Part 2: Wrong Way Funding Risk

D Singh, S Zhang - arXiv preprint arXiv:1910.03993, 2019 - arxiv.org

This paper investigates calculations of robust funding valuation adjustment (FVA) for over

the counter (OTC) derivatives under distributional uncertainty using Wasserstein distance as

the ambiguity measure. Wrong way funding risk can be characterized via the robust FVA …

 Related articles All 6 versions 


2019  [PDF] archives-ouvertes.fr

[PDF] Diffusive processes on the Wasserstein space: Coalescing models, Regularization properties and McKean-Vlasov equations

V Marx - 2019 - tel.archives-ouvertes.fr

The aim of this thesis is to study a class of diffusive stochastic processes with values in the

space of probability measures on the real line, called Wasserstein space if it is endowed

with the Wasserstein metric W2. The following issues are mainly addressed in this work: how …

 Cited by 2 Related articles All 9 versions 

<——2019—–—2019 ——2510—- 



2019

Projection au sens de Wasserstein 2 sur des espaces structurés de mesures

L Lebrat - 2019 - theses.fr

Abstract: This thesis is concerned with approximating, in the 2-Wasserstein metric,

probability measures by a structured measure. The structured measures under study are

consistent discretizations of measures supported on continuous curves with speed and …



 

2019  [PDF] apsipa.org

Semi-supervised multimodal emotion recognition with improved wasserstein gans

J Liang, S Chen, Q Jin - 2019 Asia-Pacific Signal and …, 2019 - ieeexplore.ieee.org

… In this section, we present the proposed multi-modality semi-supervised emotion

recognition approach. We first explain the general multi-modality emotion recognition framework

and the algorithm of semi-supervised learning with CTGAN, then we present the multi-modality …

 Cited by 2 Related articles All 2 versions

[CITATION] … Emotion Recognition with Improved Wasserstein GANs. In 2019 Asia-Pacific Signal and Informatio



2019

Estimation of the Gromov–Wasserstein distance of spheres

https://mathoverflow.net › questions › estimation-of-the...

Feb 24, 2019 · 1 answer

... of Wasserstein-Gromov distance included (for discrete measures). ... where T is simply a projection of the spherical coordinates.


2019

Unsupervised alignment of embeddings with wasserstein procrustes

E Grave, A Joulin, Q Berthet - The 22nd International …, 2019 - proceedings.mlr.press

We consider the task of aligning two sets of points in high dimension, which has many

applications in natural language processing and computer vision. As an example, it was

recently shown that it is possible to infer a bilingual lexicon, without supervised data, by …

 Cited by 113 Related articles All 3 versions 
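
The Wasserstein–Procrustes approach above alternates between an orthogonal (Procrustes) update and an optimal-transport matching, solved stochastically with Sinkhorn iterations in the paper. The following deliberately simplified sketch replaces the stochastic OT step by an exact assignment on small, equal-size point sets (SciPy's Hungarian solver); all names and the toy data are illustrative assumptions, not the authors' algorithm.

import numpy as np
from scipy.optimize import linear_sum_assignment

def wasserstein_procrustes(X, Y, n_iter=20):
    """Alternately fit an orthogonal map Q and a permutation of the rows of Y
    so that X @ Q approximately matches the permuted Y (simplified sketch)."""
    n, d = X.shape
    Q = np.eye(d)
    perm = np.arange(n)
    for _ in range(n_iter):
        # optimal matching for the current Q: exact OT between two uniform
        # point clouds of equal size is an assignment problem
        cost = ((X @ Q)[:, None, :] - Y[None, :, :]) ** 2
        _, perm = linear_sum_assignment(cost.sum(axis=2))
        # orthogonal Procrustes update for the current matching (via SVD)
        U, _, Vt = np.linalg.svd(X.T @ Y[perm])
        Q = U @ Vt
    return Q, perm

# toy usage: Y is a rotated and shuffled copy of X
rng = np.random.default_rng(3)
X = rng.normal(size=(100, 3))
angle = 0.7
R = np.array([[np.cos(angle), -np.sin(angle), 0.0],
              [np.sin(angle),  np.cos(angle), 0.0],
              [0.0, 0.0, 1.0]])
Y = (X @ R)[rng.permutation(100)]
Q, perm = wasserstein_procrustes(X, Y)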



2019  [PDF] arxiv.org

Fréchet means and Procrustes analysis in Wasserstein space

Y Zemel, VM Panaretos - Bernoulli, 2019 - projecteuclid.org

We consider two statistical problems at the intersection of functional and non-Euclidean data

analysis: the determination of a Fréchet mean in the Wasserstein space of multivariate

distributions; and the optimal registration of deformed random measures and point …

 Cited by 68 Related articles All 9 versions

 

2019

2019  [PDF] inria.fr

On a Wasserstein-type distance between solutions to stochastic differential equations

J Bion–Nadal, D Talay - The Annals of Applied Probability, 2019 - projecteuclid.org

In this paper, we introduce a Wasserstein-type distance on the set of the probability

distributions of strong solutions to stochastic differential equations. This new distance is

defined by restricting the set of possible coupling measures. We prove that it may also be …

 Cited by 15 Related articles All 9 versions



Robust Wasserstein profile inference and applications to machine learning

J Blanchet, Y Kang, K Murthy - Journal of Applied Probability, 2019 - cambridge.org

We show that several machine learning estimators, including square-root least absolute shrinkage and selection and regularized logistic regression, can be represented as solutions to distributionally robust optimization problems. The associated uncertainty regions …

Cited by 198 Related articles All 5 versions



[PDF] mlr.press

Wasserstein adversarial examples via projected sinkhorn iterations

E Wong, F Schmidt, Z Kolter - International Conference on …, 2019 - proceedings.mlr.press

A rapidly growing area of work has studied the existence of adversarial examples, datapoints which have been perturbed to fool a classifier, but the vast majority of these works have focused primarily on threat models defined by $\ell_p $ norm-bounded …

Cited by 105 Related articles All 8 versions 

 


[PDF] arxiv.org

Straight-through estimator as projected Wasserstein gradient flow

P Cheng, C Liu, C Li, D Shen, R Henao… - arXiv preprint arXiv …, 2019 - arxiv.org

The Straight-Through (ST) estimator is a widely used technique for back-propagating gradients through discrete random variables. However, this effective method lacks theoretical justification. In this paper, we show that ST can be interpreted as the simulation of …

  Cited by 6 Related articles All 6 versions 


Elements of Statistical Inference in 2-Wasserstein Space

J Ebert, V Spokoiny, A Suvorikova - Topics in Applied Analysis and …, 2019 - Springer

This work addresses an issue of statistical inference for the datasets lacking underlying linear structure, which makes impossible the direct application of standard inference techniques and requires a development of a new tool-box taking into account properties of …

  Cited by 2 Related articles All 3 versions

<——2019—–—2019 ——2520—-


Statistical inference for Bures-Wasserstein barycenters

A Kroshnin, V Spokoiny, A Suvorikova - arXiv e-prints, 2019 - ui.adsabs.harvard.edu

In this work we introduce the concept of Bures-Wasserstein barycenter $ Q_* $, that is 

essentially a Fréchet mean of some distribution $\mathbb {P} $ supported on a subspace of 

positive semi-definite Hermitian operators $\mathbb {H} _ {+}(d) $. We allow a barycenter to …

Cite

[CITATION] Statistical inference for Bures-Wasserstein

A Kroshnin, V Spokoiny, A Suvorikova - arXiv preprint arXiv:1901.00226, 2019

Cited by 2 Related articles

 

[PDF] arxiv.org

Wasserstein contraction of stochastic nonlinear systems

J Bouvrie, JJ Slotine - arXiv preprint arXiv:1902.08567, 2019 - arxiv.org

We suggest that the tools of contraction analysis for deterministic systems can be applied towards studying the convergence behavior of stochastic dynamical systems in the Wasserstein metric. In particular, we consider the case of Ito diffusions with identical …

 Cited by 5 Related articles All 2 versions 


[PDF] mdpi.com

Data-driven distributionally robust stochastic control of energy storage for wind power ramp management using the Wasserstein metric

I Yang - Energies, 2019 - mdpi.com

The integration of wind energy into the power grid is challenging because of its variability, which causes high ramp events that may threaten the reliability and efficiency of power systems. In this paper, we propose a novel distributionally robust solution to wind power …

 Cited by 3 Related articles All 6 versions 


2019

Evasion attacks based on wasserstein generative adversarial network

J Zhang, Q Yan, M Wang - 2019 Computing, Communications …, 2019 - ieeexplore.ieee.org

Security issues have been accompanied by the development of the artificial intelligence 

industry. Machine learning has been widely used for fraud detection, spam detection, and 

malicious file detection, since it has the ability to dig the value of big data. However, for 

malicious attackers, there is a strong motivation to evade such algorithms. Because 

attackers do not know the specific parameters of the machine model, they can only carry out 

a black box attack. This paper proposes a method based on Wasserstein Generative …

Cited by 4 Related articles


2019 see 2020
[1907.12059] Wasserstein Fair Classification - arXiv

https://arxiv.org › stat

by R Jiang · 2019 · Cited by 59 — We propose an approach to fair classification that enforces independence between the classifier outputs and sensitive information by minimizing ...

Cite as: arXiv:1907.12059

Wasserstein fair classification


2019

Weibo Authorship Identification based on Wasserstein generative adversarial networks

W Tang, C Wu, X Chen, Y Sun… - … on Signal, Information and …, 2019 - ieeexplore.ieee.org

During the past years, authorship identification has played a significant role in the public 

security area. Recently, deep learning based approaches have been used in authorship 

identification. However, all approaches based on deep learning require a large amount of …

 Related articles


2019

Wasserstein distance-based analogous method to predict distribution of non-uniform corrosion on reinforcements in concrete

Q Liu, RKL Su - Construction and Building Materials, 2019 - Elsevier

… based on minimizing the Wasserstein distance (WD) to predict the distribution of the non-uniform

corrosion on reinforcements. The WD is a distance function defined between two …

 Cited by 8 Related articles All 3 versions


  

2019 see 2021  [PDF] thecvf.com

Order-preserving wasserstein discriminant analysis

B Su, J Zhou, Y Wu - Proceedings of the IEEE/CVF …, 2019 - openaccess.thecvf.com

Supervised dimensionality reduction for sequence data projects the observations in

sequences onto a low-dimensional subspace to better separate different sequence classes.

It is typically more challenging than conventional dimensionality reduction for static data …

Cited by 2 Related articles All 6 versions 

 

2019  [PDF] unibocconi.it

[PDF] Bayesian model comparison based on Wasserstein distances

M Catalano, A Lijoi, I Prünster - SIS 2019 Smart Statistics for Smart …, 2019 - iris.unibocconi.it

… We here propose a way to fill in this gap by exploiting the Wasserstein distance. While

simulations of the … After a brief recapitulation of basic notions about completely random

measures and the Wasserstein distance, in Section 3 we provide general upper and lower …

Cited by 2 Related articles 


Working Paper  Full Text
Lifted Wasserstein Matcher for Fast and Robust Topology Tracking
Soler, Maxime; Plainchault, Mélanie; Conche, Bruno; Tierny, Julien.arXiv.org; Ithaca, Jan 2, 2019.
Abstract/DetailsGet full text
Link to external site, this link will open in a new window
Show Abstract 

<——2019—–—2019 ——2530—-


Scholarly Journal  Citation/Abstract
An optimal scene reduction method based on Wasserstein distance and validity index
Dong, Zuoli; Sun, Yingyun; Pu, Tianjiao; Chen, Naishi; Sun, Zuo.Zhongguo Dianji Gongcheng Xuebao = Proceedings of the CSEE; Beijing Vol. 39, Iss. 16,  (2019): 4650.
Abstract/Details

Show Abstract 

Scholarly Journal  Full Text
Wasserstein Distance Learns Domain Invariant Feature Representations for Drift Compensation of E-Nose
Li, Chunyan; Yang, Haocheng; Xu, Juan.Sensors; Basel Vol. 19, Iss. 17,  (2019): 3703.
Abstract/DetailsFull textFull text - PDF (1 MB)‎

Show Abstract 
Cited by 7 Related articles All 8 versions    


Dissertation or Thesis  Preview Available
Reproducing-Kernel Hilbert Space Regression with Notes on the Wasserstein Distance
Page, Stephen.Lancaster University (United Kingdom). ProQuest Dissertations Publishing, 2019. 28277860.
Abstract/DetailsPreview - PDF (401 KB)‎

Order a copy

Show Abstract 


Dissertation or Thesis  Citation
Structure preserving discretization and approximation of gradient flows in Wasserstein-like space
Alternate title: Strukturerhaltende Diskretisierungen und Approximationen von Gradienten Flüssen in Wasserstein ähnlichen Räumen

Plazotta, Simon.Technische Universitaet Muenchen (Germany). ProQuest Dissertations Publishing, 2019. 27552212.
Details

Related articles All 3 versions 


Dissertation or Thesis  Citation
Algorithms for optimal transport and wasserstein distances
Schrieber, Jörn.Georg-August-Universitaet Goettingen (Germany). ProQuest Dissertations Publishing, 2019. 13888207.
Details

Show More 

 2019


Scholarly Journal  Citation/Abstract
The Optimal Convergence Rate of Monotone Schemes for Conservation Laws in the Wasserstein Distance
Ruf, Adrian M; Sande, Espen; Solem, Susanne.Journal of Scientific Computing; New York Vol. 80, Iss. 3,  (2019): 1764-1776.

Abstract/Details Get full textLink to external site, this link will open in a new window
 Show Abstract 

Scholarly Journal  Full Text
Central limit theorem and bootstrap procedure for Wasserstein’s variations with an application to structural relationships between distributions
del Barrio, Eustasio; Gordaliza, Paula; Lescornel, Hélène; Loubes, Jean-Michel.Journal of Multivariate Analysis Vol. 169,  (Jan 2019): 341-362.

Abstract/Details Get full textLink to external site, this link will open in a new window

Cited by (1)  References (34)
Show Abstract 

Scholarly Journal  Citation/Abstract
Gait recognition based on Wasserstein generating adversarial image inpainting network
Li-min, Xia; Wang, Hao; Wei-ting, Guo.Journal of Central South University; Heidelberg Vol. 26, Iss. 10,  (2019): 2759-2770.
Abstract/Details 
Show Abstract 


Conference Paper  Citation/Abstract

Normalized Wasserstein for Mixture Distributions With Applications in Adversarial Learning and Domain Adaptation
Balaji, Yogesh; Chellappa, Rama; Feizi, Soheil.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2019).

Cited by 21 Related articles All 4 versions 


Abstract/Details 
Show Abstract 
 
Conference Paper Citation/Abstract

Graph Signal Representation with Wasserstein Barycenters
Simou, Effrosyni; Frossard, Pascal.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2019).

Abstract/Details
Show Abstract 

<——2019—–—2019 ——2540—-


Conference Paper  Citation/Abstract

Aero-Engine Faults Diagnosis Based on K-Means Improved Wasserstein GAN and Relevant Vector Machine
Zhao, Zihe; Zhou, Rui; Dong, Zhuoning.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2019).

Abstract/Details    Show Abstract 
Cited by 5
 Related articles All 6 versions 


Conference Paper  Citation/Abstract

Joint Wasserstein Autoencoders for Aligning Multimodal Embeddings
Mahajan, Shweta; Botschen, Teresa; Gurevych, Iryna; Roth, Stefan.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2019).

Abstract/Details Get full text  Link to external site, this link will open in a new window
Show Abstract 
Cited by 5
 Related articles All 6 versions 


Conference Paper  Citation/Abstract

A Semi-Supervised Wasserstein Generative Adversarial Network for Classifying Driving Fatigue from EEG signals
Panwar, Sharaj; Rad, Paul; Quarles, John; Golob, Edward; Huang, Yufei.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2019).

Abstract/Details  Show Abstract 
Cited by 8
 Related articles All 4 versions


Conference Paper  Citation/Abstract

A Distributionally Robust Optimization Approach for Multivariate Linear Regression under the Wasserstein Metric

Chen, Ruidi; Paschalidis, Ioannis Ch.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2019).

Abstract/Details 

Show Abstract 

Conference Paper  Citation/Abstract

BIRCH Algorithm and Wasserstein Distance Metric Based Method for Generating Typical Scenarios of Wind Power Outputs

Li, Qiushi; Tang, Xianghua; Chen, Changming; Liu, Xinyi; Liu, Shengyuan; et al.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2019).

Abstract/Details 
Show Abstract 


 2019


Conference Paper  Citation/Abstract

Face Synthesis and Recognition Using Disentangled Representation-Learning Wasserstein GAN

Hsu, Gee-Sern Jison; Tang, Chia-Hao; Yap, Moi Hoon.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2019).

Abstract/Details     Show Abstract 

Cited by 3 Related articles All 4 versions 

Conference Paper  Citation/Abstract

Infrared and Visible Image Fusion via Multi-discriminators Wasserstein Generative Adversarial Network

Li, Jing; Huo, Hongtao; Liu, Kejian; Li, Chang; Li, Shuo; et al.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2019).

Abstract/Details 

Show Abstract 


Conference Paper  Citation/Abstract

Generating EEG signals of an RSVP Experiment by a Class Conditioned Wasserstein Generative Adversarial Network

Panwar, Sharaj; Rad, Paul; Quarles, John; Huang, Yufei.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2019).

Abstract/Details 

Show Abstract 


Conference Paper  Citation/Abstract

Construction of 4D Neonatal Cortical Surface Atlases Using Wasserstein Distance

Chen, Zengsi; Wu, Zhengwang; Sun, Liang; Wang, Fan; Wang, Li; et al.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2019).

Abstract/Details

Show Abstract 


Conference Paper  Citation/Abstract

A Wasserstein Subsequence Kernel for Time Series

Bock, Christian; Togninalli, Matteo; Ghisu, Elisabetta; Gumbsch, Thomas; Rieck, Bastian; et al.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2019).

Abstract/Details Get full textLink to external site, this link will open in a new window

Show Abstract 

Cited by 6 Related articles All 9 versions

<——2019—–—2019 ——2550—-

Conference Paper  Citation/Abstract

Wasserstein GAN Can Perform PCA

Cho, Jaewoong; Suh, Changho.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2019).

Abstract/Details Get full textLink to external site, this link will open in a new window

Show Abstract 


Conference Paper  Citation/Abstract

Semi-supervised Multimodal Emotion Recognition with Improved Wasserstein GANs

Liang, Jingjun; Chen, Shizhe; Jin, Qin.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2019).

Abstract/Details   Show Abstract 

 

Conference Paper  Citation/Abstract

An Information-Theoretic View of Generalization via Wasserstein Distance

Wang, Hao; Diaz, Mario; Santos Filho, Jose Candido S; Calmon, Flavio P.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2019).

Abstract/Details   Show Abstract 

Cited by 20 Related articles All 6 versions

Conference Paper  Citation/Abstract

Data Augmentation Method of SAR Image Dataset Based on Wasserstein Generative Adversarial Networks

Lu, Qinglin; Jiang, Haiyang; Li, Guojing; Ye, Wei.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2019).

Abstract/Details   Show Abstract 

Cited by 2 Related articles All 2 versions

 
Conference Paper  Citation/Abstract

Music Classification using Multiclass Support Vector Machine and Multilevel Wasserstein Means
Wei, Jin; Jin, Cong; Cheng, Zhiyuan; Lv, Xin; Song, Leiyu.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2019).

Abstract/Details  Show Abstract 

2019


Conference Paper  Citation/Abstract

Unimodal-Uniform Constrained Wasserstein Training for Medical Diagnosis
Liu, Xiaofen; Han, Xu; Qiao, Yukai; Ge, Yi; Li, Site; et al.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2019).

Abstract/Details Get full text  Link to external site, this link will open in a new window
Show Abstract 

Cited by 23 Related articles All 9 versions 


Conference Paper  Citation/Abstract

Weibo Authorship Identification based on Wasserstein generative adversarial networks
Tang, Wanbing; Wu, Chunhua; Chen, Xiaolong; Sun, Yudao; Li, Chen.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2019).

Abstract/Details 
Show Abstract 


Conference Paper Citation/Abstract

Optimal Fusion of Elliptic Extended Target Estimates based on the Wasserstein Distance
Thormann, Kolja; Baum, Marcus.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2019).
Abstract/Details
Show Abstract 


Conference Paper  Citation/Abstract
Training Wasserstein GANs for Estimating Depth Maps
Arslan, Abdullah Taha; Seke, Erol.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2019).

Abstract/Details   Show Abstract 
 
Conference Paper  Citation/Abstract
Single Image Haze Removal Using Conditional Wasserstein Generative Adversarial Networks
Ebenezer, Joshua Peter; Das, Bijaylaxmi; Mukhopadhyay, Sudipta.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2019).
Abstract/Details Get full textLink to external site, this link will open in a new window

Show Abstract 

Cited by 13 Related articles All 8 versions

<——2019—–—2019 ——2560—-



Conference Paper  Citation/Abstract

Random Matrix-Improved Estimation of the Wasserstein Distance between two Centered Gaussian Distributions
Tiomoko, Malik; Couillet, Romain.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2019).

Abstract/Details Get full textLink to external site, this link will open in a new window
Show Abstract 

Cited by 3 Related articles All 22 versions

Conference Paper  Citation/Abstract

Image Reflection Removal Using the Wasserstein Generative Adversarial Network
Li, Tingtian; Lun, Daniel PK.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2019).

Abstract/Details 

Wasserstein CNN: Learning Invariant Features for NIR-VIS Face Recognition
He, Ran; Wu, Xiang; Sun, Zhenan; Tan, Tieniu.IEEE Transactions on Pattern Analysis and Machine Intelligence; New York Vol. 41, Iss. 7,  (2019): 1761-1773.


Cited by (‎5)

 
Scholarly Journal  Citation/Abstract
Non-Local Texture Optimization With Wasserstein Regularization Under Convolutional Neural Network
Li, Jie; Xiang, Yong; Hou, Jingyu; Xu, Dan.IEEE Transactions on Multimedia; Piscataway Vol. 21, Iss. 6,  (2019): 1437-1449.


Citation/Abstract

Unsupervised Feature Extraction in Hyperspectral Images Based on Wasserstein Generative Adversarial Network
Zhang, Mingyang; Gong, Maoguo; Mao, Yishun; Li, Jun; Wu, Yue.IEEE Transactions on Geoscience and Remote Sensing; New York Vol. 57, Iss. 5,  (2019): 2669-2688.


Cited by (‎2)
 
 
Scholarly Journal  Citation/Abstract

Scene Classification Using Hierarchical Wasserstein CNN
Liu, Yishu; Suen, Ching Y; Liu, Yingbin; Ding, Liwang.IEEE Transactions on Geoscience and Remote Sensing; New York Vol. 57, Iss. 5,  (2019): 2494-2509.


2019

Scholarly Journal  Citation/Abstract

Time Delay Estimation Via Wasserstein Distance Minimization
Nichols, Jonathan M; Hutchinson, Meredith N; Menkart, Nicole; Cranch, Geoff A; Gustavo Kunde Rohde.IEEE Signal Processing Letters; New York Vol. 26, Iss. 6,  (2019): 908-912.


Cited by (‎1)


Scholarly Journal  Citation/Abstract

Scene Classification by Coupling Convolutional Neural Networks With Wasserstein Distance
Liu, Yishu; Liu, Yingbin; Ding, Liwang.IEEE Geoscience and Remote Sensing Letters; Piscataway Vol. 16, Iss. 5,  (2019): 722-726.


Scholarly Journal  Citation/Abstract

A Deep Transfer Model With Wasserstein Distance Guided Multi-Adversarial Networks for Bearing Fault Diagnosis Under Different Working Conditions
Zhang, Ming; Wang, Duo; Lu, Weining; Yang, Jun; Li, Zhiheng; et al.IEEE Access; Piscataway Vol. 7,  (2019): 65303-65318.


Scholarly Journal  Citation/Abstract

Prostate MR Image Segmentation With Self-Attention Adversarial Training Based on Wasserstein Distance
Su, Chengwei; Huang, Renxiang; Liu, Chang; Yin, Tailang; Du, Bo.IEEE Access; Piscataway Vol. 7,  (2019): 184276-184284.


Cited by 74 Related articles All 7 versions
 
Scholarly Journal  Citation/Abstract

Multi-Source Medical Image Fusion Based on Wasserstein Generative Adversarial Networks
Yang, Zhiguang; Chen, Youping; Le, Zhuliang; Fan, Fan; Pan, Erting.IEEE Access; Piscataway Vol. 7,  (2019): 175947-175958.


Cited by 15 Related articles

<——2019—–—2019 ——2570—-


Scholarly Journal  Citation/Abstract

Generating Adversarial Samples With Constrained Wasserstein Distance
Wang, Kedi; Yi, Ping; Zou, Futai; Wu, Yue.IEEE Access; Piscataway Vol. 7,  (2019): 136812-136821.


Scholarly Journal  Citation/Abstract

A Virtual Monochromatic Imaging Method for Spectral CT Based on Wasserstein Generative Adversarial Network With a Hybrid Loss
Shi, Zaifeng; Li, Jinzhuo; Li, Huilong; Hu, Qixing; Cao, Qingjie.IEEE Access; Piscataway Vol. 7,  (2019): 110992-111011.



Scholarly Journal  Citation/Abstract

Second-Order Models for Optimal Transport and Cubic Splines on the Wasserstein Space
Jean-David Benamou; Gallouët, Thomas O; Vialard, François-Xavier.Foundations of Computational Mathematics; New York Vol. 19, Iss. 5,  (2019): 1113-1143.

Cited by 18 Related articles All 7 versions

Scholarly Journal  Full Text

On the total variation Wasserstein gradient flow and the TV-JKO scheme
Carlier, Guillaume; Poon, Clarice.ESAIM. Control, Optimisation and Calculus of Variations; Les Ulis Vol. 25,  (2019).


 
Scholarly Journal  Full Text

A Pontryagin Maximum Principle in Wasserstein spaces for constrained optimal control problems
Bonnet, Benoît.ESAIM. Control, Optimisation and Calculus of Variations; Les Ulis Vol. 25,  (2019).

Cited by 16 Related articles All 47 versions

 2019


Scholarly Journal  Full Text

Data-Driven Distributionally Robust Stochastic Control of Energy Storage for Wind Power Ramp Management Using the Wasserstein Metric
Yang, Insoon.Energies; Basel Vol. 12, Iss. 23,  (2019): 4577.


Cited by 6
 Related articles All 5 versions 


Scholarly Journal  Citation/Abstract

Approximation of Discounted Minimax Markov Control Problems and Zero-Sum Markov Games Using Hausdorff and Wasserstein Distances
Dufour, François; Prieto-Rumeau, Tomás. Dynamic Games and Applications; Heidelberg Vol. 9, Iss. 1,  (2019): 68-102.


Cited by 6 Related articles All 5 versions 

Scholarly Journal  Citation/Abstract

Convergence to Equilibrium in Wasserstein Distance for Damped Euler Equations with Interaction Forces
Carrillo, José A; Young-Pil, Choi; Tse, Oliver.Communications in Mathematical Physics; Heidelberg Vol. 365, Iss. 1,  (2019): 329-361.


 
Scholarly Journal  Citation/Abstract

The Pontryagin Maximum Principle in the Wasserstein Space
Bonnet, Benoît; Rossi, Francesco. Calculus of Variations and Partial Differential Equations; Heidelberg Vol. 58, Iss. 1,  (2019): 1-36.


Cited by (‎1)
Cited by 33 Related articles All 21 versions


Scholarly Journal  Citation/Abstract

A Two-Phase Two-Fluxes Degenerate Cahn–Hilliard Model as Constrained Wasserstein Gradient Flow
Cancès, Clément; Matthes, Daniel; Nabet, Flore.Archive for Rational Mechanics and Analysis; Heidelberg Vol. 233, Iss. 2,  (2019): 837-866.


Cited by 14 Related articles All 17 versions

<——2019—–—2019 ——2580—-


Scholarly Journal  Full Text
Wasserstein Generative Adversarial Network Based De-Blurring Using Perceptual Similarity
 Hong, Minsoo; Choe, Yoonsik.Applied Sciences; Basel Vol. 9, Iss. 11,  (Jan 2019).


Scholarly Journal  Full Text
Multi-Turn Chatbot Based on Query-Context Attentions and Dual Wasserstein Generative Adversarial Networks
Kim, Jintae; Oh, Shinhyeok; Oh-Woog Kwon; Kim, Harksoo.Applied Sciences; Basel Vol. 9, Iss. 18,  (2019): 3908.


Slot based Image Captioning with WGAN

Z Xue, L Wang, P Guo - 2019 IEEE/ACIS 18th International …, 2019 - ieeexplore.ieee.org

Existing image captioning methods are always limited to the rules of words or syntax with 

single sentence and poor words. In this paper, this paper introduces a novel framework for 

image captioning tasks which reconciles slot filling approaches with neural network 

approaches. Our approach first generates a sentence template with many slot locations 

using Wasserstein Generative Adversarial Network (WGAN). Then the slots which are in 

visual regions will be filled by object detectors. Our model consists of a structured sentence …

 Related articles All 2 versions

Slot based Image Captioning with WGAN

Conference Paper  Citation/Abstract

Slot based Image Captioning with WGAN

Xue, Ziyu; Wang, Lei; Guo, Peiyu.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2019).




2019  [PDF] arxiv.org

Temporal Wasserstein non-negative matrix factorization for non-rigid motion segmentation and spatiotemporal deconvolution

E Varol, A Nejatbakhsh, C McGrory - arXiv preprint arXiv:1912.03463, 2019 - arxiv.org

… Thus, we cast motion segmentation as a temporal non-linear matrix factorization problem

with Wasserstein metric loss. The dictionary elements of this factorization yield segmentation

of motion into coherent objects while the loading coefficients allow for time-varying intensity …

Cited by 3 Related articles All 5 versions 


2019  [PDF] ieee.org

Prostate MR image segmentation with self-attention adversarial training based on wasserstein distance

C Su, R Huang, C Liu, T Yin, B Du - IEEE Access, 2019 - ieeexplore.ieee.org

… In this paper, we propose a segmentation network with self-attention adversarial training

based on Wasserstein distance to tackle the problem. First, a segmentation network with

residual connection and attention mechanism is deployed to generate the prostate segmentation …

Cited by 3 Related articles



2019


2019  [PDF] arxiv.org

Temporal Wasserstein non-negative matrix factorization for non-rigid motion segmentation and spatiotemporal deconvolution

E Varol, A Nejatbakhsh, C McGrory - arXiv preprint arXiv:1912.03463, 2019 - arxiv.org

… To this end, we propose an alternative paradigm for motion segmentation based on optimal

transport which models … segmentation as a temporal non-linear matrix factorization problem

with Wasserstein metric loss. The dictionary elements of this factorization yield segmentation …

Cited by 3 Related articles All 5 versions 


2019

Unsupervised segmentation of colonic polyps in narrow-band imaging data based on manifold representation of images and Wasserstein distance

IN Figueiredo, L Pinto, PN Figueiredo, R Tsai - … Signal Processing and …, 2019 - Elsevier

… Polyp segmentation is a crucial step towards an … segmentation model, involving the

Wasserstein distance. These histograms incorporate fused information about suitable image

descriptors, namely semi-local texture, geometry and color. To test the proposed segmentation …

Cited by 1 Related articles All 3 versions


2019  [PDF] biorxiv.org


De Novo Protein Design for Novel Folds using Guided Conditional Wasserstein Generative Adversarial Networks (gcWGAN)

M Karimi, S Zhu, Y Cao, Y Shen - bioRxiv, 2019 - biorxiv.org

… To overcome the aforementioned challenges in protein design, we have developed a

semi-supervised, guided conditional Wasserstein … known folds to learn fold representations

generalizable for novel folds. We then develop a novel Generative Adversarial Network (GAN) for …

Cited by 3 Related articles All 4 versions 


2019  [PDF] arxiv.org

Disentangled representation learning with Wasserstein total correlation

Y Xiao, WY Wang - arXiv preprint arXiv:1912.12818, 2019 - arxiv.org

… In this paper, we introduce a Wasserstein distance version of total correlation and propose

to learn disentangled … Wasserstein total correlation, a Wasserstein distance version of total

correlation, and apply it to disentangled representation learning; (2) we introduce Wasserstein …

Cited by 6 Related articles All 3 versions 

<——2019—–—2019 ——2590—- 


2019 [PDF] tamu.edu

DE NOVO PROTEIN DESIGN OF NOVEL FOLDS USING GUIDED CONDITIONAL WASSERSTEIN GENERATIVE ADVERSARIAL NETWORKS (GCWGAN)

S Zhu - 2019 - oaktrust.library.tamu.edu

… In this project two novel generative models, the conditional Wasserstein GAN (cWGAN) and

guided conditional Wasserstein GAN (… on the lowdimensional fold representation and we

called it conditional Wasserstein GAN. Based on that we also constructed a model that is guided …


2019  [PDF] thecvf.com

Face Synthesis and Recognition Using Disentangled Representation-Learning Wasserstein GAN

GS Jison Hsu, CH Tang… - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

… Disentangled Representation-learning Wasserstein GAN (DR-WGAN) trained on augmented

data for face recognition and face synthesis across pose. We improve the state-of-the-art

DR-GAN with the Wasserstein … training data on the disentangled facial representation learning, …

Related articles All 2 versions 

 

2019  [PDF] arxiv.org

(q, p)-Wasserstein GANs: Comparing Ground Metrics for Wasserstein GANs

A Mallasto, J Frellsen, W Boomsma… - arXiv preprint arXiv …, 2019 - arxiv.org

… This is a notable generalization as in the WGAN literature the OT distances are

commonly based on the l2 ground metric. We demonstrate the effect of different p-Wasserstein

distances in two toy examples. Furthermore, we show that the ground metric does make a difference, …

Cited by 5 Related articles All 3 versions 



2019 see 2020

Joint Wasserstein Autoencoders for Aligning Multimodal Embeddings 

arXiv:1909.06635v1 [cs.CV] 14 Sep 2019

https://arxiv.org › pdf


by S Mahajan · 2019 · Cited by 3 — Joint Wasserstein Autoencoders for Aligning Multimodal Embeddings ... encoded output of the GRU encoder and Gaussian regular-.

[PDF] thecvf.com

Joint wasserstein autoencoders for aligning multimodal embeddings

S Mahajan, T Botschen… - Proceedings of the …, 2019 - openaccess.thecvf.com

One of the key challenges in learning joint embeddings of multiple modalities, eg of images 

and text, is to ensure coherent cross-modal semantics that generalize across datasets. We 

propose to address this through joint Gaussian regularization of the latent representations …

Cited by 1 All 3 versions


[PDF] arxiv.org

Distributionally robust optimization: A review

H Rahimian, S Mehrotra - arXiv preprint arXiv:1908.05659, 2019 - arxiv.org

… to model the distributional ambiguity and discuss results for each of these ambiguity sets. … 

as the structural properties of the underlying unknown true distribution into the ambiguity set… 

distributionally robust two-stage stochastic linear program over a Wasserstein ambiguity set, …

 Cited by 170 Related articles All 5 versions

[CITATION] Distributionally robust risk measures with structured Wasserstein ambiguity sets

VA Nguyen, D Filipovic, D Kuhn - 2019 - Working paper

 Cited by 2 Related articles
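Note (added for orientation): the generic Wasserstein distributionally robust template behind the two entries above is usually written as

\[
\min_{x \in \mathcal{X}} \; \sup_{Q:\, W_p(Q,\widehat{P}_N)\le \rho} \; \mathbb{E}_{\xi\sim Q}\big[\ell(x,\xi)\big],
\]

where \(\widehat{P}_N\) is the empirical distribution of the N observed samples, \(\rho\) is the radius of the Wasserstein ball and \(\ell\) is the loss; this is the standard formulation, not a formula quoted from either paper.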


2019

Deep learning-based electroencephalography analysis

https://iopscience.iop.org › article


by Y Roy · 2019 · Cited by 414 — Features learned through a DNN might also be more effective or expressive than the ones engineered by humans. Second, as is the case in the multiple domains ...


2019

towards interpreting deep neural networks - OpenReview

https://openreview.net › pdf


by J Cao · 2019 — transport theory, we employ the Wasserstein distance (W-distance) to measure the ... mechanism of DNNs and investigate the across-layer behavior of a DNN ...


2019

A Wasserstein Inequality and Minimal Green Energy on ... - arXiv

https://arxiv.org › math


by S Steinerberger · 2019 · Cited by 6 — We use this to show that minimizers of the discrete Green energy on compact manifolds have optimal rate of convergence W_2\left( \frac{1}{n} ...


2019

[1908.04369] Wasserstein Index Generation Model: Automatic ...

https://arxiv.org › econ


by F Xie · 2019 · Cited by 7 — Abstract: I propose a novel method, the Wasserstein Index Generation model (WIG), to generate a public sentiment index automatically.


2019

Sliced Wasserstein Generative Models - CVPR 2019 Open ...

https://openaccess.thecvf.com › html › Wu_Sliced_Was...


by J Wu · 2019 · Cited by 72 — Abstract. In generative modeling, the Wasserstein distance (WD) has emerged as a useful metric to measure the discrepancy between generated and real data ..

 <——2019—–—2019 ——2600—- 

  

Posterior Collapse, Wasserstein Auto Encoder deterministic ...

https://parkgeonyeong.github.io › Po...


Aug 17, 2019 — To address this, many generative models [use] KL-divergence (VAE, equivalently cross entropy), f-diverge…

2019 see 2020

Quantifying the Empirical Wasserstein Distance to a Set of ...

https://proceedings.neurips.cc › paper › file


by M Phan · 2019 · Cited by 19 — We consider the problem of estimating the Wasserstein distance between …

 
2019

Wasserstein Dependency Measure for Representation Learning


https://arxiv.org › cs


by S Ozair · 2019 · Cited by 51 — In this work, we empirically demonstrate that mutual information-based representation learning approaches do fail to learn complete ...

Cited by 55 Related articles All 6 versions

2019

The Gromov-Wasserstein distance and distributional ...

https://mathematics.stanford.edu › events › topology


Oct 11, 2019 — This talk will overview the construction of the GW distance, the stability of distributional invariants, and will discuss some results regarding ...


2019

Semi-supervised multitask learning on multispectral satellite images using wasserstein generative adversarial networks (gans) for predicting poverty

A Perez, S Ganguli, S Ermon, G Azzari, M Burke… - arXiv preprint arXiv …, 2019 - arxiv.org

… Only 5% of the satellite images can be associated with labels (which are obtained from DHS Surveys) and thus a semi-supervised approach using a GAN (similar to the approach of Salimans, et al. (2016)), albeit with a more stable-to-train flavor of GANs called the …

Cited by 28 Related articles 


2019

2019  [PDF] mlr.press

The relative complexity of maximum likelihood estimation, map estimation, and sampling

C Tosh, S Dasgupta - Conference on Learning Theory, 2019 - proceedings.mlr.press

… By contrast, many of the problems that we have in mind—such as estimation of Gaussian mixture models or of topic distributions—take solutions in continuous spaces. The … Given this observation, we define the Wasserstein approximate posterior sampling problem as follows. …


2019

Quantile Propagation for Wasserstein-Approximate Gaussian ...

https://www.semanticscholar.org › paper › Quantile-Propa...


Dec 21, 2019 — A new approximation method to solve the analytically intractable Bayesian inference for Gaussian process models with factorizable Gaussian ...


2019

An Optimal Scenario Reduction Method Based on Wasserstein Distance and Validity Index,一种基于Wasserstein距离及有效性指标的最优场景约简方法

Zhongguo Dianji Gongcheng Xuebao/Proceedings of the Chinese Society of Electrical Engineering

2019 | Journal article

DOI: 10.13334/j.0258-8013.pcsee.181494

EID: 2-s2.0-85072658394

[CITATION] An optimal scenario reduction method based on Wasserstein distance and validity index

D Xiaochong, S Yingyun, P Tianjiao - Proceedings of the CSEE, 2019

Cited by 3 Related articles


2019  [PDF] github.io

[PDF] Why Wasserstein distance is better for training GANs: A summary

IPP Panangaden - 2019 - arnab39.github.io

… Let us wait for the Wasserstein distance until the last chapter when we formally define optimal 

… Before I introduce the Wasserstein metric, let us take a brief look at the crucial concepts of 

… This completes our understanding of why theoretically using Wasserstein distance is better …
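Note (added for orientation; standard definitions rather than text from the summary above): for distributions P and Q on a metric space \((X,d)\), the 1-Wasserstein distance is

\[
W_1(P,Q) \;=\; \inf_{\pi\in\Pi(P,Q)} \int_{X\times X} d(x,y)\,\mathrm{d}\pi(x,y)
\;=\; \sup_{\|f\|_{\mathrm{Lip}}\le 1} \mathbb{E}_{x\sim P}[f(x)] - \mathbb{E}_{y\sim Q}[f(y)],
\]

where \(\Pi(P,Q)\) is the set of couplings with marginals P and Q; the second (Kantorovich–Rubinstein dual) expression is the one a WGAN critic approximates with a 1-Lipschitz network.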

[PDF] thecvf.com

Normalized Wasserstein for Mixture Distributions With Applications in Adversarial Learning and Domain Adaptation

Y Balaji, R Chellappa, S Feizi - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

Understanding proper distance measures between distributions is at the core of several learning tasks such as generative models, domain adaptation, clustering, etc. In this work, we focus on mixture distributions that arise naturally in several application domains where …

2019  [PDF] thecvf.com

Normalized wasserstein for mixture distributions with applications in adversarial learning and domain adaptation

Y Balaji, R Chellappa, S Feizi - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

… By relaxing the marginal constraints of the classical Wasserstein distance (1), we introduce 

the Normalized Wasserstein measure (NW … In this section, we introduce the normalized 

Wasserstein measure and discuss its properties. Recall that G is an array of generator functions …

 Cited by 18 Related articles All 4 versions

<——2019—–—2019 ——2610—- 



2019  [PDF] arxiv.org

Personalized purchase prediction of market baskets with Wasserstein-based sequence matching

M Kraus, S Feuerriegel - Proceedings of the 25th ACM SIGKDD …, 2019 - dl.acm.org

market basket, we evaluate the agreement of the predicted basket and the real basket 

using the following metrics: • Wasserstein distance: The Wasserstein … For instance, a market 

basket comprising “red wine” has a small Wasserstein distance to a market basket containing “…

  Cited by 8 Related articles All 4 versions


2019

Normalized wasserstein for mixture distributions with applications in adversarial learning and domain adaptation

Y Balaji, R Chellappa, S Feizi - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

Understanding proper distance measures between distributions is at the core of several 

learning tasks such as generative models, domain adaptation, clustering, etc. In this work, 

we focus on mixture distributions that arise naturally in several application domains where 

the data contains different sub-populations. For mixture distributions, established distance 

measures such as the Wasserstein distance do not take into account imbalanced mixture 

proportions. Thus, even if two mixture distributions have identical mixture components but …

Cited by 24 Related articles All 4 versions

[CITATION] Normalized wasserstein for mixture distributions with applications in adversarial learning and domain adaptation. 2019 IEEE

Y Balaji, R Chellappa, S Feizi - CVF International Conference on Computer Vision …, 2019

Cited by 2 Related articles


2019

Poincaré Wasserstein Autoencoder | OpenReview

https://openreview.net › forum


by I Ovinnikov · 2019 · Cited by 24 — Review: In this paper, the authors proposed a Poincare Wasserstein autoencoder for representing and generating data with latent hierarchical structures. The ...


2019  [PDF] arxiv.org

Second-order models for optimal transport and cubic splines on the Wasserstein space

JD Benamou, TO Gallouët, FX Vialard - Foundations of Computational …, 2019 - Springer

… On the space of probability densities, we extend the Wasserstein geodesics to the case of

higher-order interpolation such as cubic spline interpolation. After presenting the natural

extension of cubic splines to the Wasserstein space, we propose a simpler approach based on the …

  Cited by 16 Related articles All 6 versions


2019

A virtual monochromatic imaging method for spectral CT based on Wasserstein generative adversarial network with a hybrid loss

Z Shi, J Li, H Li, Q Hu, Q Cao - IEEE Access, 2019 - ieeexplore.ieee.org

… tomography (CT) has become a popular clinical diagnostic technique because of its unique

advantage in material distinction. Specifically, it can perform virtual … Aiming at modeling

spatial and spectral correlations, this paper proposes a Wasserstein generative adversarial …

Cited by 9 Related articles All 2 versions


2019


Evasion attacks based on wasserstein generative adversarial network

J Zhang, Q Yan, M Wang - 2019 Computing, Communications …, 2019 - ieeexplore.ieee.org

… attacks methods for malicious PDF detection system require a prior knowledge of the target

classifier and feedback information from the detection system. However, in real scenarios,

the information is difficult to obtain. In this paper, we propose a WGAN-based evasion attacks …

Cited by 3 Related articles



2019

[PDF] phmsociety.org

Anomaly detection on time series with wasserstein gan applied to phm

M Ducoffe, I Haloui, JS Gupta - International Journal of …, 2019 - papers.phmsociety.org

… They measure the quality of the generated distribution with the 1-Wasserstein distance. In its

primal form, the Wasserstein distance requires … Instead, the formulation of Wasserstein GAN

relies on the dual expression of the 1-Wasserstein distance, which allows nicer optimization …

Cited by 4 Related articles All 2 versions 
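Several of the GAN entries in this block score generated samples with the 1-Wasserstein distance between one-dimensional empirical distributions. A minimal illustrative sketch of that computation using SciPy's built-in routine (example data are made up, not taken from any cited paper):

    import numpy as np
    from scipy.stats import wasserstein_distance

    rng = np.random.default_rng(0)
    real = rng.normal(loc=0.0, scale=1.0, size=1000)   # stand-in for real samples
    fake = rng.normal(loc=0.5, scale=1.2, size=1000)   # stand-in for generated samples

    # Empirical 1-Wasserstein distance between the two 1-D samples.
    print(wasserstein_distance(real, fake))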



2019  

Necessary condition for rectifiability involving Wasserstein ...
https://arxiv.org › math
by D Dąbrowski · 2019 · Cited by 9 — Abstract: A Radon measure \mu is n-rectifiable if \mu\ll\mathcal{H}^n and \mu-almost all of \text{supp}\,\mu can be covered by Lipschitz ...


2019

'A compelling model for the industry': New York Media CEO ...

https://digiday.com › media › compelling-model-indust...

Sep 25, 2019 — New York Media CEO Pam Wasserstein, who will become president of Vox Media, said the deal is based on shared ambition, along with a dose of ...


 2019

A kind of fuzzy detection seed set generation method and generator based on WGAN …

CN CN108549597A — 纪守领 (Ji Shouling), 浙江大学 (Zhejiang University)

Priority 2018-03-05 • Filed 2018-03-05 • Published 2018-09-18

The invention discloses a kind of fuzzy detection seed set generator based on WGAN models, including a training set acquisition module; it has a fuzzy detection tool based on a mutation algorithm, using common input as the seed to inspect multiple programs of identical input format, and it may be found that …

<——2019—–—2019 ——2620—- 



Using Wasserstein Generative Adversarial Networks for the Design of Monte Carlo Simulations


Using Wasserstein Generative Adversarial Networks for the ...

https://arxiv.org › econ

by S Athey · 2019 · Cited by 42 — Title:Using Wasserstein Generative Adversarial Networks for the Design of Monte Carlo Simulations ; Comments: 30 pages, 4 figures ; Subjects: .

2019  see 2020 Research article
Typical wind power scenario generation for multiple wind farms using conditional improved Wasserstein generative adversarial network
International Journal of Electrical Power & Energy Systems, 2 July 2019...

Yufan Zhang, Qian Ai, Tianguang Lu

S. Borysov, Jeppe Rich


Aero-engine faults diagnosis based on K-means improved Wasserstein GAN and relevant vector machine

Z Zhao, R Zhou, Z Dong - 2019 Chinese Control Conference …, 2019 - ieeexplore.ieee.org

… -GP on the aero-engine faults dataset for generating faults data in this paper, by using the

WGAN-GP the convergence s


2019 see 2020   Research article
Data supplement for a soft sensor using a new generative model based on a variational autoencoder and Wasserstein GAN
Journal of Process Control, 25 November 2019...

Xiao Wang, Han Liu


 
 

2019  Research article
Bounds for the Wasserstein mean with applications to the Lie-Trotter mean
Journal of Mathematical Analysis and Applications, 22 March 2019...

Jinmi Hwang, Sejong Kim

Cited by 7 Related articles All 4 versions

  2019


2019  see 2020  Research article
W-LDMM: A Wasserstein driven low-dimensional manifold model for noisy image restoration
Neurocomputing, 6 September 2019...

Ruiqiang He, Xiangchu Feng, Chunyu Yang

 

2019  Research article
Data-driven Wasserstein distributionally robust optimization for biomass with agricultural waste-to-energy network design under uncertainty
Applied Energy, 23 September 2019...

Chao Ning, Fengqi You

Cited by 30 Related articles All 7 versions

2019  see 2020  Research article
On the computation of Wasserstein barycenters
Journal of Multivariate Analysis, 13 December 2019...

Giovanni Puccetti, Ludger Rüschendorf, Steven Vanduffel


  2019  Research article
A partial Laplacian as an infinitesimal generator on the Wasserstein space
Journal of Differential Equations, 25 June 2019...

Yat Tin Chow, Wilfrid Gangbo

Cited by 16 Related articles All 9 versions

  

2019  see 2020  Research article
Generative adversarial networks based on Wasserstein distance for knowledge graph embeddings
Knowledge-Based Systems, 31 October 2019...

Yuanfei Dai, Shiping Wang, Wenzhong Guo

 <——2019—–—2019 ——2630—-

 

2019  Research article
On isometric embeddings of Wasserstein spaces – the discrete case
Journal of Mathematical Analysis and Applications, 21 August 2019...

György Pál Gehér, Tamás Titkos, Dániel Virosztek


2019  Research articleOpen access
Distributionally Robust Chance-Constraint Optimal Power Flow Considering Uncertain Renewables with Wasserstein-Moment Metric
Energy Procedia, February 2019...

Jun Liu, Yefu Chen, Jia Lyu


 

2019  Research article
A Fenchel-Moreau-Rockafellar type theorem on the Kantorovich-Wasserstein space with applications in partially observable Markov decision processes
Journal of Mathematical Analysis and Applications, 8 May 2019...

Vaios Laschos, Klaus Obermayer, Wilhelm Stannat

Cited by 3 Related articles All 5 versions

2019  Short communication
Deep multi-Wasserstein unsupervised domain adaptation
Pattern Recognition Letters, 30 April 2019...

Tien-Nam Le, Amaury Habrard, Marc Sebban

 Cited by 3 Related articles All 4 versions


2019  see 2020  Short communication
Wasserstein GAN based on Autoencoder with back-translation for cross-lingual embedding mappings
Pattern Recognition Letters, 23 November 2019...

Yuhong Zhang, Yuling Li, Xuegang Hu

  
2019

2019  Research article
Optimal XL-insurance under Wasserstein-type ambiguity
Insurance: Mathematics and Economics, 29 May 2019...

Corina Birghila, Georg Ch. Pflug

 Cited by 7 Related articles All 6 versions


2019  Research article
A Rademacher-type theorem on L2-Wasserstein spaces over closed Riemannian manifolds
Journal of Functional Analysis, 13 November 2019...

Lorenzo Dello Schiavo

Cited by 6 Related articles All 6 versions

 Data augmentation in fault diagnosis based on the Wasserstein generative adversarial network with gradient penalty

31 citations*

2020 NEUROCOMPUTING

Xin Gao, Fang Deng, Xianghu Yue

Beijing Institute of Technology

Fault detection and isolation

Fault (power engineering)


Abstract: Fault detection and diagnosis in industrial processes is essential for avoiding undesired events and ensuring the safety of operators and facilities. In the last few decades, various data-based machine learning algorithms have been widely studied to monitor machine condi...


2019

 Wasserstein Smoothing: Certified Robustness against Wasserstein Adversarial Attacks

6 citations*

2019 ARXIV: LEARNING


 On a Novel Application of Wasserstein-Procrustes for Unsupervised Cross-Lingual Learning.

1 citations*

2020 ARXIV: COMPUTATION AND LANGUAGE

Guillem Ramírez, Rumen Dangovski, Preslav Nakov, Marin Soljacic

Massachusetts Institute of Technology

Text corpus

Word (computer architecture)


The emergence of unsupervised word embeddings, pre-trained on very large monolingual text corpora, is at the core of the ongoing neural revolution in Natural Language Processing (NLP). Initially introduced for English, such pre-trained word embeddings quickly emerged for a number of other languages...


2019  Research article
Misfit function for full waveform inversion based on the Wasserstein metric with dynamic formulation
Journal of Computational Physics, 28 August 2019...

Peng Yong, Wenyuan Liao, Yaoting Lin

Cited by 16 Related articles All 3 versions

<——2019—–—2019 ——2640—- 

 

2019  Research article
Decomposition algorithm for distributionally robust optimization using Wasserstein metric with an application to a class of regression models
European Journal of Operational Research, 15 March 2019...

Fengqiao Luo, Sanjay Mehrotra

2019  Research article
On potentials of regularized Wasserstein generative adversarial networks for realistic hallucination of tiny faces
Neurocomputing, 22 July 2019...

Wen-Ze Shao, Jing-Jing Xu, Hai-Bo Li


 2019  Research article
Stacked Wasserstein Autoencoder
Neurocomputing, 19 July 2019...

Wenju Xu, Shawn Keshmiri, Guanghui Wang

2019  [PDF] researchgate.net

[PDF] Speech Enhancement for Noise-Robust Speech Synthesis Using Wasserstein GAN.

N Adiga, Y Pantazis, V Tsiaras, Y Stylianou - INTERSPEECH, 2019 - researchgate.net

… In this paper, we propose a speech enhancement technique based on generative adversarial 

networks (GANs) which acts as a … the speech enhancement generative adversarial network 

(SEGAN) approach and recent advances in deep learning, we propose to use Wasserstein …

Cited by 10 Related articles All 4 versions


2019  see 2020

An Integrated Processing Method Based on Wasserstein Barycenter Algorithm for Automatic Music Transcription

C Jin, Z Li, Y Sun, H Zhang, X Lv, J Li, S Liu - International Conference on …, 2019 - Springer

… processing or machine learning cannot perfectly restore the original music signal and have 

significant distortion. In this paper, we propose a novel processing … entropic regularized 

Wasserstein Barycenter algorithm to speed up the computation of the Wasserstein distance and …

Related articles


2019


2019  Research article
A Wasserstein distance-based analogous method to predict distribution of non-uniform corrosion on reinforcements in concrete
Construction and Building Materials, 8 August 2019...

Qifang Liu, Ray Kai Leung Su


 CWGAN: Conditional wasserstein generative adversarial nets for fault data generation

Y Yu, B Tang, R Lin, S Han, T Tang… - 2019 IEEE International …, 2019 - ieeexplore.ieee.org

… This expected value can be taken to the lower bound in all possible joint distributions and 

defined as the Wasserstein distance of the two distributions. The Wasserstein distance can be …

 Cited by 15 Related articles All 2 versions

[PDF] sciencedirect.com

Decomposition algorithm for distributionally robust optimization using Wasserstein metric with an application to a class of regression models

F Luo, S Mehrotra - European Journal of Operational Research, 2019 - Elsevier

… The use of Wasserstein metric to define an ambiguity set of … As shown in this article, the

Wasserstein metric results in a … and the metric used to define the Wasserstein metric uses l 1 or l …

Cited by 33 Related articles All 5 versions


[PDF] ncl.ac.uk

Generative adversarial framework for depth filling via wasserstein metric, cosine transform and domain transfer

A Atapour-Abarghouei, S Akcay… - Pattern Recognition, 2019 - Elsevier

… We propose a domain critic network, which uses the Wasserstein metric to measure the

distance between the source (synthetic data) and the target (real-world data) and minimizes this …

Cited by 19 Related articles All 7 versions


[PDF] mlr.press

Wasserstein of Wasserstein loss for learning generative models

Y Dukler, W Li, A Lin… - … Conference on Machine …, 2019 - proceedings.mlr.press

… a Wasserstein distance as the ground metric on the sample space of images. This ground

metric is … We derive the Wasserstein ground metric on image space and define a Riemannian …

Cited by 24 Related articles All 12 versions 

2019 TutORial: Wasserstein Distributionally Robust   Optimization- YouTube

2019 TutORial: Wasserstein Distributionally Robust Optimization

www.youtube.com › watch

Given by Daniel Kuhn at the 2019 INFORMS Annual Meeting in Seattle, WA. Many decision problems in science, engineering and economics are affected ...

YouTube · INFORMS · 

ICML 2019 Generative Adversarial Networks Paper ...

youtube.videoken.com › embed

Paper: Wasserstein of Wasserstein Loss for Learning Generative Models45:27 ... Paper: Flat Metric Minimization with Applications in Generative Modeling50:08.

VideoKen · 

Oct 13, 2019

2019 TutORial: Wasserstein Distributionally Robust Optimization

Wasserstein distributionally robust optimization seeks data-driven decisions that perform well under the most ...

Dec 20, 2019 · Uploaded by INFORMS 

<——2019—–—2019 ——2650—


[PDF] arxiv.org

On the Bures–Wasserstein distance between positive definite matrices

R Bhatia, T Jain, Y Lim - Expositiones Mathematicae, 2019 - Elsevier

… metric has … Wasserstein metric. If A and B are diagonal matrices, then d(A, B) reduces to

the Hellinger distance between probability distributions and is related to the Rao–Fisher metric …

Cited by 154 Related articles All 5 versions
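For orientation on the Bhatia–Jain–Lim entry above: the Bures–Wasserstein distance between positive definite matrices A and B coincides with the 2-Wasserstein distance between centred Gaussians with those covariances, and has the standard closed form

\[
d(A,B) \;=\; \Big[\operatorname{tr} A + \operatorname{tr} B - 2\,\operatorname{tr}\big(A^{1/2} B A^{1/2}\big)^{1/2}\Big]^{1/2},
\]

which reduces to the Hellinger distance when A and B commute (for instance, when both are diagonal).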


[PDF] researchgate.net

Data-driven affinely adjustable distributionally robust framework for unit commitment based on Wasserstein metric

W Hou, R Zhu, H Wei… - IET Generation …, 2019 - ieeexplore.ieee.org

… This paper focuses on the Wasserstein metric, because it has a tractable reformulation and

a … Ξ and the two probability distributions FN, F (Ξ), the Wasserstein metric is defined as …

Cited by 15 Related articles All 4 versions 


[PDF] arxiv.org

Wasserstein style transfer

Y Mroueh - arXiv preprint arXiv:1905.12828, 2019 - arxiv.org

… We show in Figure 3 the output of our mixing strategy using two of the geodesic metrics

namely Wasserstein and Fisher Rao barycenters. We give as baseline the AdaIn output for this (…

Cited by 20 Related articles All 6 versions 


Wire Feed  Full Text

Univ Chongqing Posts & Telecom Files Chinese Patent Application for Differential Wgan Based Network Security Situation Prediction Method

Global IP News. Security & Protection Patent News; New Delhi [New Delhi]. 14 Oct 2019. 



[PDF] arxiv.org

Fast convergence of empirical barycenters in Alexandrov spaces and the Wasserstein space

TL Gouic, Q Paris, P Rigollet, AJ Stromme - arXiv preprint arXiv …, 2019 - arxiv.org

… behavior of empirical barycenters in the context where (S, d) is the 2-Wasserstein space of

… (RD) equipped with the 2-Wasserstein metric. The Wasserstein space has recently played a …

  Cited by 19 Related articles All 4 versions 


2019


Data-driven Wasserstein distributionally robust optimization for biomass with agricultural waste-to-energy network design under uncertainty

C NingF You - Applied Energy, 2019 - Elsevier

… data-driven Wasserstein distributionally robust optimization … distributions based on the

Wasserstein metric, which is utilized … two-stage distributionally robust optimization model not only …

  Cited by 30 Related articles All 7 versions


Aggregated Wasserstein distance and state registration for hidden Markov models

Y Chen, J Ye, J Li - IEEE transactions on pattern analysis and …, 2019 - ieeexplore.ieee.org

… Wasserstein, for computing a dissimilarity measure or distance between two Hidden Markov

… of optimal transport and the Wasserstein metric between distributions. Specifically, the …

Cited by 13 Related articles All 7 versions


[PDF] neurips.cc

Quantum wasserstein generative adversarial networks

S Chakrabarti, H Yiming, T Li… - Advances in Neural …, 2019 - proceedings.neurips.cc

… of quantum generative models even on noisy quantum hardware. Specifically, we propose a

definition of the Wasserstein semimetric between quantum … to turn the quantum Wasserstein …

Cited by 32 Related articles All 8 versions 


[PDF] iop.org

Wasserstein space as state space of quantum mechanics and optimal transport

MF Rosyid, K Wahyuningsih - Journal of Physics: Conference …, 2019 - iopscience.iop.org

… space P2(Σ(A)) which is called Wasserstein space. Let B be any other observable. It can be

… We will investigate the Wasserstein spaces over the spectrums of a quantum observables, …

Related articles All 3 versions


[PDF] arxiv.org

Thermodynamic interpretation of Wasserstein distance

A Dechant, Y Sakurai - arXiv preprint arXiv:1912.08405, 2019 - arxiv.org

… stochastic dynamics and the Wasserstein distance. We show … is given by the Wasserstein

distance between the two states, … Using a lower bound on the Wasserstein distance, we further …

Cited by 19 Related articles All 2 versions 
<——2019—–—2019 ——2660—-

[PDF] neurips.cc

Wasserstein weisfeiler-lehman graph kernels

M Togninalli, E Ghisu… - Advances in …, 2019 - proceedings.neurips.cc

… We propose a novel method that relies on the Wasserstein distance between the node …

ordered strings through the aggregation of the labels of a node and its neighbours; those strings …

Cited by 93 Related articles All 13 versions 


[PDF] arxiv.org

Inequalities for the Wasserstein mean of positive definite matrices

R Bhatia, T Jain, Y Lim - Linear Algebra and its Applications, 2019 - Elsevier

… + B + (AB)^{1/2} + (BA)^{1/2}], and can be thought of as the Wasserstein mean of A and B.

… of the Wasserstein metric, mean and barycentre in various areas like quantum information, …

Cited by 15 Related articles All 6 versions

 

[PDF] ncl.ac.uk

Generative adversarial framework for depth filling via wasserstein metric, cosine transform and domain transfer

A Atapour-Abarghouei, S Akcay… - Pattern Recognition, 2019 - Elsevier

… We propose a domain critic network, which uses the Wasserstein metric to measure the

distance between the source (synthetic data) and the target (real-world data) and minimizes this …

Cited by 19 Related articles All 7 versions

 

[PDF] 

[PDF] arxiv.org

On isometric embeddings of Wasserstein spaces–the discrete case

GP Gehér, T Titkos, D Virosztek - Journal of Mathematical Analysis and …, 2019 - Elsevier

… Wigner's theorem about quantum mechanical symmetry … of all isometric embeddings of the

Wasserstein space W p ( X ) , where … In order to introduce the Wasserstein space W p ( X ) , we …

Cited by 6 Related articles All 9 versions

 

[PDF] arxiv.org

Wasserstein information matrix

W Li, J Zhao - arXiv preprint arXiv:1910.11248, 2019 - arxiv.org

… Another phenomenon appears when we consider the Wasserstein natural gradient applies

to Fisher scores. Specifically, we use log-likelihood function as a loss function and apply WIM …

Cited by 15 Related articles All 5 versions 

 

2019

[PDF] arxiv.org

Statistical aspects of Wasserstein distances

VM Panaretos, Y Zemel - Annual review of statistics and its …, 2019 - annualreviews.org

Wasserstein distances are metrics on probability distributions inspired by the problem of …

In this review, we provide a snapshot of the main concepts involved in Wasserstein distances …

Cited by 260 Related articles All 7 versions
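The review above treats the whole family of these metrics; for order \(p\ge 1\) and measures \(\mu,\nu\) with finite p-th moments, the p-Wasserstein distance is (standard definition)

\[
W_p(\mu,\nu) \;=\; \Big(\inf_{\pi\in\Pi(\mu,\nu)} \int \|x-y\|^p \,\mathrm{d}\pi(x,y)\Big)^{1/p}.
\]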

[PDF] arxiv.org

A Fenchel-Moreau-Rockafellar type theorem on the Kantorovich-Wasserstein space with applications in partially observable Markov decision processes

V Laschos, K Obermayer, Y Shen, W Stannat - Journal of Mathematical …, 2019 - Elsevier

… for proper convex functionals on Wasserstein-1. We retrieve … field of Partially observable

Markov decision processes (POMDPs… Wasserstein-1 space with the space of Lipschitz functions. …

Cited by 3 Related articles All 5 versions


[PDF] arxiv.org

Wasserstein metric-driven bayesian inversion with applications to signal processing

M Motamed, D Appelo - International Journal for Uncertainty …, 2019 - dl.begellhouse.com

… of the quadratic Wasserstein distance. In this paper, we focus on the quadratic Wasserstein

and … 3: Posterior histograms and the trace plots of the Markov chain samples in two cases: (a) …

Cited by 9 Related articles All 3 versions

 

[PDF] researchgate.net

[PDF] Connections between support vector machines, wasserstein distance and gradient-penalty gans

A Jolicoeur-Martineau, I Mitliagkas - arXiv preprint arXiv …, 2019 - researchgate.net

… As stated in Section 2.3, the popular approach of softly enforcing ||xf(x)||2 ≈ 1 at all

interpolations between real and fake samples does not ensure that we estimate the Wasserstein …

Cited by 12 Related articles All 2 versions 

<——2019—–—2019 ——2670—-



[PDF] iop.org

Wasserstein space as state space of quantum mechanics and optimal transport

MF Rosyid, K Wahyuningsih - Journal of Physics: Conference …, 2019 - iopscience.iop.org

… is homeomorphic to the Wasserstein space over the … can be formulated in the Wasserstein

space over the spectrum of each … of Newton equation of motion in the Wasserstein space over …

Related articles All 3 versions


[PDF] neurips.cc

Propagating uncertainty in reinforcement learning via wasserstein barycenters

AM Metelli, A Likmeta… - Advances in Neural …, 2019 - proceedings.neurips.cc

… We will denote the algorithm employing this update rule as Modified Wasserstein Q-Learning

(MWQL). The reason why we need to change the WTD lies in the fact that the uncertainty …

Cited by 9 Related articles All 8 versions 


[PDF] arxiv.org

A convergent Lagrangian discretization for p-Wasserstein and flux-limited diffusion equations

B Söllner, O Junge - arXiv preprint arXiv:1906.01321, 2019 - arxiv.org

… Formulating the Euler–Lagrange equation in the case of p-Wasserstein cost is straightforward.

In the case of flux-limiting cost, however, we have to make sure the minimization problem …

Cited by 2 Related articles All 6 versions 


[PDF] researchgate.net

[PDF] Rate of convergence in Wasserstein distance of piecewise-linear Lévy-driven SDEs

ARI ARAPOSTATHIS, G PANG… - arXiv preprint arXiv …, 2019 - researchgate.net

… some recent developments for Markov processes under the Wasserstein metric. Butkovsky

[… of general Markov processes (both discrete and continuous time) in the Wasserstein metric, …

Related articles 


2019 see 2020  [HTML] nih.gov

Hyperbolic Wasserstein distance for shape indexing

J ShiY Wang - IEEE transactions on pattern analysis and …, 2019 - ieeexplore.ieee.org

… on the Euler number of the surface is positive, zero, or negative, respectively. In other words,

surfaces with Euler … Let (S, g) be a surface with Euler number χ(S) < 0 and its hyperbolic …

Cited by 7 Related articles All 8 versions


2019


[PDF] arxiv.org

Wasserstein adversarial imitation learning

H Xiao, M Herman, J Wagner, S Ziesche… - arXiv preprint arXiv …, 2019 - arxiv.org

… An infinite horizon discounted Markov decision process setting M is defined by the tuple (S,

A, p, r, µ0,γ) consisting of the finite state space S, the finite action space A and the transition …

Cited by 41 Related articles All 3 versions 

[PDF] arxiv.org

On the total variation Wasserstein gradient flow and the TV-JKO scheme

G Carlier, C Poon - ESAIM: Control, Optimisation and Calculus of …, 2019 - esaim-cocv.org

We study the JKO scheme for the total variation, characterize the optimizers, prove some of

their qualitative properties (in particular a form of maximum principle and in some cases, a …

  Cited by 8 Related articles All 12 versions


del Barrio, Eustasio; Gordaliza, Paula; Lescornel, Hélène; Loubes, Jean-Michel

Central limit theorem and bootstrap procedure for Wasserstein’s variations with an application to structural relationships between distributions. (English) Zbl 1476.62089

J. Multivariate Anal. 169, 341-362 (2019).

Reviewer: N. G. Gamkrelid


Deconvolution for the Wasserstein distance

J Dedecker - smai.emath.fr

… We are interested in rates of convergence for the Wasserstein metric of order p ≥ 1. The

distribution of the errors is assumed to be known and to belong to a class of supersmooth or …

Related articles 


[PDF] arxiv.org

Parameter estimation for biochemical reaction networks using Wasserstein distances

K Öcal, R Grima, G Sanguinetti - Journal of Physics A …, 2019 - iopscience.iop.org

… the reaction network as a Markov chain whose states are … The transitions of the Markov

chain correspond to reactions, with … . The forward Kolmogorov equation for this Markov chain is …

Cited by 16 Related articles All 10 versions

<——2019—–—2019 ——2680—-



[PDF] unibocconi.it

[PDF] Bayesian model comparison based on Wasserstein distances

M Catalano, A Lijoi, I Prünster - SIS 2019 Smart Statistics for Smart …, 2019 - iris.unibocconi.it

… While simulations of the Wasserstein distance are easily achieved [19], analytical …

Wasserstein distance, in Section 3 we provide general upper and lower bounds for the Wasserstein …

Cited by 2 Related articles 


[PDF] arxiv.org

Refined basic couplings and Wasserstein-type distances for SDEs with Lévy noises

D Luo, J Wang - Stochastic Processes and their Applications, 2019 - Elsevier

We establish the exponential convergence with respect to the L 1 -Wasserstein distance and

the total variation for the semigroup corresponding to the stochastic differential equation d X …

Cited by 22 Related articles All 6 versions


[PDF] arxiv.org

Estimation of wasserstein distances in the spiked transport model

J Niles-Weed, P Rigollet - arXiv preprint arXiv:1909.07513, 2019 - arxiv.org

… and subgaussian concentration properties of the Wasserstein distance. In Section 6 we

propose and analyze an estimator for the Wasserstein distance under the spiked transport model…

Cited by 32 Related articles All 2 versions 

 


Wasserstein barycenters in the manifold of all positive definite matrices

E Nobari, B Ahmadi Kakavandi - Quarterly of Applied Mathematics, 2019 - ams.org

… Abstract: In this paper, we study the Wasserstein barycenter … optimal solutions and the

Wasserstein barycenter measure. … that the density of the Wasserstein barycenter measure can be …

Related articles All 2 versions


[PDF] arxiv.org

Orthogonal Wasserstein GANs

J Müller, R Klein, M Weinmann - arXiv preprint arXiv:1911.13060, 2019 - arxiv.org

… to compare WassersteinGAN discriminators based on their approximated Wasserstein distance

… weight orthogonalization during the training of Wasserstein-GANs to enforce its Lipschitz …

Cited by 4 Related articles All 2 versions 

2019


[PDF] arxiv.org

Wasserstein distributionally robust optimization: Theory and applications in machine learning

D Kuhn, PM Esfahani, VA Nguyen… - … science in the age …, 2019 - pubsonline.informs.org

… distribution within a certain Wasserstein distance from a nominal … We will also show that

Wasserstein distributionally robust … exceeds the inverse Fisher information matrix in a positive …

Cited by 164 Related articles All 8 versions

[PDF] arxiv.org

Fast convergence of empirical barycenters in Alexandrov spaces and the Wasserstein space

TL Gouic, Q Paris, P Rigollet, AJ Stromme - arXiv preprint arXiv …, 2019 - arxiv.org

… Banach space if and only if it is of type 2 [LT91], which is a property linked to the geometry of

the Banach … rates of convergence of barycenters on Wasserstein spaces, let alone general …

Cited by 19 Related articles All 4 versions 

 

[PDF] arxiv.org

Multivariate stable approximation in Wasserstein distance by Stein's method

P Chen, I Nourdin, L Xu, X Yang - arXiv preprint arXiv:1911.12917, 2019 - arxiv.org

… The Markov process we construct in the first step of Barbour’s program is the so-called

Ornstein-Uhlenbeck type process which is a simple stochastic differential equation (SDE) driven …

 Cited by 5 Related articles All 4 versions 


[PDF] thecvf.com

Max-sliced wasserstein distance and its use for gans

I Deshpande, YT Hu, R Sun, A Pyrros… - Proceedings of the …, 2019 - openaccess.thecvf.com

… In this paper, we analyzed the Wasserstein and sliced Wasserstein distance … Wasserstein

distance. We showed that this distance enjoys a better sample complexity than the Wasserstein …

Cited by 76 Related articles All 11 versions 
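For orientation on the max-sliced entry above (standard definitions, not quoted from the paper): writing \(\pi_\theta(x)=\langle\theta,x\rangle\) for projection onto a direction \(\theta\) on the unit sphere,

\[
\mathrm{SW}_p^p(\mu,\nu)=\int_{\mathbb{S}^{d-1}} W_p^p\big(\pi_{\theta\#}\mu,\pi_{\theta\#}\nu\big)\,\mathrm{d}\sigma(\theta),
\qquad
\max\text{-}\mathrm{SW}_p(\mu,\nu)=\max_{\theta\in\mathbb{S}^{d-1}} W_p\big(\pi_{\theta\#}\mu,\pi_{\theta\#}\nu\big);
\]

each projected distance is one-dimensional and can be computed from sorted samples, which is what makes these surrogates cheap.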

[PDF] arxiv.org

The Parisi formula is a Hamilton–Jacobi equation in Wasserstein space

JC Mourrat - Canadian Journal of Mathematics, 2019 - cambridge.org

… There already exists a rich literature on Hamilton-Jacobi equations in infinitedimensional

Banach spaces, as well as on the Wasserstein space of probability measures or more general …

 Cited by 11 Related articles All 7 versions

<——2019—–—2019 ——2692—-


[PDF] arxiv.org

Hypothesis test and confidence analysis with wasserstein distance with general dimension

M Imaizumi, H Ota, T Hamaguchi - arXiv preprint arXiv:1910.07773, 2019 - arxiv.org

… inference with the Wasserstein distance. Recently, the Wasserstein distance has attracted

much … Despite the importance, hypothesis tests and confidence analysis with the Wasserstein …

Cited by 3 Related articles All 2 versions 


2019 see 2020  [PDF] arxiv.org

Wasserstein gradient flow formulation of the time-fractional Fokker-Planck equation

MH Duong, B Jin - arXiv preprint arXiv:1908.09055, 2019 - arxiv.org

… Then in Section 3, we describe the L1 scheme, which is an extension of the backward

Euler method to the fractional case, and derive relevant approximation properties, which are …

Cited by 3 Related articles All 10 versions 

Optimal curves and mappings valued in the Wasserstein space

H Lavenant - HAL, 2019 - dml.mathdoc.fr

… the Wasserstein space. When the source space is a segment, that is, when the unknowns

are curves with values in the Wasserstein space, … the Euler equations. When the …

 Cited by 2 Related articles 


[PDF] arxiv.org

Modified massive Arratia flow and Wasserstein diffusion

V Konarovskyi, MK von Renesse - Communications on Pure …, 2019 - Wiley Online Library

… In this work we relate the induced measure-valued process to the Wasserstein diffusion of

… times that is governed by the quadratic Wasserstein distance. © 2018 Wiley Periodicals, Inc. …

 Cited by 33 Related articles All 9 versions


Approximation and Wasserstein distance for self-similar measures on the unit interval

E Lichtenegger, R Niedzialomski - Journal of Mathematical Analysis and …, 2019 - Elsevier

… with respect to the 1-Wasserstein distance. Hence by Banach's Contraction Principle it has

a unique … In particular, the 1-Wasserstein distance between μ n and ν n converges to the 1-…

 Cited by 1 Related articles All 2 versions


2019


[PDF] arxiv.org

Poincaré Wasserstein Autoencoder

I Ovinnikov - arXiv preprint arXiv:1901.01427, 2019 - arxiv.org

… geometry imposed by the Fisher information metric to enhance learning performance [1]. …

as opposed to our approach, which uses a Wasserstein formulation of the problem. [23] …

 Cited by 24 Related articles All 4 versions 


[PDF] arxiv.org

Gaussian approximation for penalized Wasserstein barycenters

N Buzun - arXiv preprint arXiv:1904.00891, 2019 - arxiv.org

… In this work we consider Wasserstein barycenters (average in Wasserstein distance) in Fourier

… Define additional Fisher matrix corresponded to the projection into first p elements of the …

 Related articles All 2 versions 

 [PDF] arxiv.org

How well do WGANs estimate the wasserstein metric?

A Mallasto, G Montúfar, A Gerolin - arXiv preprint arXiv:1910.03875, 2019 - arxiv.org

… For this reason, entropic relaxation of the 1-Wasserstein … and stability of computing the

Wasserstein metric through its dual … the Wasserstein distance does not produce the best looking …

Cited by 12 Related articles All 6 versions 


[PDF] neurips.cc

Asymptotic guarantees for learning generative models with the sliced-wasserstein distance

K Nadjahi, A Durmus, U Simsekli… - Advances in Neural …, 2019 - proceedings.neurips.cc

… Wasserstein generative adversarial networks, Wasserstein autoencoders). Emerging from

computational optimal transport, the Sliced-Wasserstein … in general Wasserstein spaces. Then …

Cited by 32 Related articles All 10 versions 

[PDF] mlr.press

Understanding mcmc dynamics as flows on the wasserstein space

C Liu, J Zhuo, J Zhu - International Conference on Machine …, 2019 - proceedings.mlr.press

… its Wasserstein space P(M). We then show that any regular MCMC dynamics is the fGH flow

on the Wasserstein … 2011) chooses D as the inverse Fisher metric so that M is the distribution …

 Cited by 11 Related articles All 14 versions 

<——2019—–—2019 ——2700—-


[PDF] mlr.press

Accelerated linear convergence of stochastic momentum methods in wasserstein distances

B Can, M Gurbuzbalaban, L Zhu - … Conference on Machine …, 2019 - proceedings.mlr.press

… Under Assumption 2, the iterations ξk forms a timehomogeneous Markov chain which we …

Therefore, if we set Sα,β in (1), we can consider the 2-Wasserstein distance between two Borel …

  Cited by 26 Related articles All 10 versions 

[PDF] arxiv.org

Data-driven distributionally robust appointment scheduling over wasserstein balls

R Jiang, M Ryu, G Xu - arXiv preprint arXiv:1907.03219, 2019 - arxiv.org

… a Wasserstein ball centered at the empirical distribution based on the historical data [26, 54,

61]. Accordingly, we consider two Wasserstein-based distributionally robust … the Wasserstein …

Cited by 18 Related articles All 4 versions 


[PDF] mlr.press

Wasserstein regularization for sparse multi-task regression

H Janati, M Cuturi, A Gramfort - The 22nd International …, 2019 - proceedings.mlr.press

… θ−) consists in estimating the barycenter of the θt … We show how our Multi-task Wasserstein

(MTW) model can be solved efficiently relying on proximal coordinate descent and Sinkhorn’s …

Cited by 35 Related articles All 9 versions 
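The Sinkhorn solver mentioned in this entry is a generic building block used throughout the papers listed on this page. A minimal sketch of the plain entropic Sinkhorn iterations (not the authors' multi-task MTW solver; the toy histograms a, b, the grid x, and the regularization eps below are illustrative assumptions):

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.05, n_iter=500):
    """Entropic optimal transport between histograms a and b with cost matrix C.
    Returns the transport plan and the regularized transport cost."""
    K = np.exp(-C / eps)                  # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)                 # scale to match column marginals
        u = a / (K @ v)                   # scale to match row marginals
    P = u[:, None] * K * v[None, :]       # approximate optimal plan
    return P, float(np.sum(P * C))

# toy example: two histograms on a 1-D grid
x = np.linspace(0.0, 1.0, 50)
a = np.exp(-(x - 0.3) ** 2 / 0.01); a /= a.sum()
b = np.exp(-(x - 0.7) ** 2 / 0.02); b /= b.sum()
C = (x[:, None] - x[None, :]) ** 2        # squared-distance ground cost
P, cost = sinkhorn(a, b, C)
print(cost)                               # entropically regularized transport cost
```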


[PDF] researchgate.net

Full View

Data-driven affinely adjustable distributionally robust framework for unit commitment based on Wasserstein metric

W Hou, R Zhu, H Wei… - IET Generation …, 2019 - ieeexplore.ieee.org

… robust framework for unit commitment based on Wasserstein metric … What is more important,

different from the conventional robust … This is realised by Wasserstein ball with an empirical …

Cited by 15 Related articles All 4 versions 


[PDF] researchgate.net

[PDF] Tractable reformulations of distributionally robust two-stage stochastic programs with ∞-Wasserstein distance

W Xie - arXiv preprint arXiv:1908.08454, 2019 - researchgate.net

… Wasserstein distance converges to ∞-Wasserstein distance as τ → ∞. Different types of

Wasserstein … The results of this paper reveal that ∞−Wasserstein ambiguity set indeed delivers …

Cited by 12 Related articles 


 2019

 

[PDF] researchgate.net

[PDF] RaspBary: Hawkes Point Process Wasserstein Barycenters as a Service

R Hosler, X Liu, J Carter, M Saper - 2019 - researchgate.net

… Secondly, Wasserstein barycenters have recently been in… We incorporate fast Wasserstein

barycenters computation … Both the Hawkes process and Wasserstein barycenters are …

Cited by 4 Related articles 


[PDF] researchgate.net

[PDF] Speech Enhancement for Noise-Robust Speech Synthesis Using Wasserstein GAN.

N Adiga, Y Pantazis, V Tsiaras, Y Stylianou - INTERSPEECH, 2019 - researchgate.net

… We propose to use Wasserstein distance with gradient penalty (WGAN) [14] which has shown

… on both objective metrics and subjective listening tests that both Wasserstein loss function …

Cited by 10 Related articles All 4 versions 
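The gradient penalty referenced in this snippet is the standard WGAN-GP term rather than anything specific to the speech-enhancement system. A minimal PyTorch sketch under that assumption, with the critic network treated as a given callable:

```python
import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    """WGAN-GP penalty: pushes the critic's gradient norm toward 1
    on random interpolates between real and fake batches."""
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    interp = (eps * real + (1 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads = torch.autograd.grad(outputs=scores, inputs=interp,
                                grad_outputs=torch.ones_like(scores),
                                create_graph=True)[0]
    grad_norm = grads.view(grads.size(0), -1).norm(2, dim=1)
    return lambda_gp * ((grad_norm - 1) ** 2).mean()

# usage inside a critic update: loss_c = wasserstein_surrogate + gradient_penalty(critic, real, fake)
```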


[PDF] mlr.press

Wasserstein adversarial examples via projected sinkhorn iterations

E Wong, F Schmidt, Z Kolter - International Conference on …, 2019 - proceedings.mlr.press

… perturbations, in Figure 4 we find that it is substantially more robust than either the standard …

Cited by 125 Related articles All 8 versions 

[PDF] arxiv.org

Clustering measure-valued data with Wasserstein barycenters

G Domazakis, D Drivaliaris, S Koukoulas… - arXiv preprint arXiv …, 2019 - arxiv.org

… Following the aforementioned explicit form results for the Wasserstein distance and

Wasserstein barycenter in the case of Location-Scatter family, we can obtain also analytic …

  Cited by 1 Related articles All 2 versions 


2019 see 2020

Adaptive quadratic Wasserstein full-waveform inversion

D Wang, P Wang - SEG International Exposition and Annual Meeting, 2019 - onepetro.org

… In this work, we present an FWI scheme based on the quadratic Wasserstein metric, with 

adaptive normalization and integral wavefield. We show that this scheme has better convexity …

Cited by 4 Related articles All 2 versions

<——2019—–—2019 ——2710—- 


[PDF] arxiv.org

The optimal convergence rate of monotone schemes for conservation laws in the Wasserstein distance

AM Ruf, E Sande, S Solem - Journal of Scientific Computing, 2019 - Springer

… a first-order convergence rate in the Wasserstein distance. Our main result is to prove that

… After an integration by parts, we can see that the time derivative of the Wasserstein distance …

Cited by 10 Related articles All 6 versions

 

 

2019 see 2020   [PDF] arxiv.org

Wasserstein barycenter model ensembling

P Dognin, I Melnyk, Y Mroueh, J Ross… - arXiv preprint arXiv …, 2019 - arxiv.org

… using Wasserstein (W.) barycenters. Optimal transport metrics, such as the Wasserstein

distance, … In this paper we propose to use the Frechet means with Wasserstein distance (d = W2 …

Cited by 21 Related articles All 5 versions 
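For intuition about Fréchet means under W2, the one-dimensional Gaussian case has a closed form (a textbook fact, not the paper's ensembling procedure; the weights and parameters in the example are made up):

```python
import numpy as np

def gaussian_w2_barycenter_1d(means, stds, weights):
    """W2 barycenter of 1-D Gaussians N(m_i, s_i^2): again Gaussian,
    with mean = weighted mean of means and std = weighted mean of stds."""
    means, stds, weights = map(np.asarray, (means, stds, weights))
    weights = weights / weights.sum()
    return float(weights @ means), float(weights @ stds)

# "ensembling" three model output distributions with uniform weights
m, s = gaussian_w2_barycenter_1d([0.0, 1.0, 2.5], [0.5, 1.0, 0.8], [1, 1, 1])
print(m, s)   # mean of means, mean of stds
```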

[PDF] arxiv.org

Hypothesis test and confidence analysis with wasserstein distance with general dimension

M Imaizumi, H Ota, T Hamaguchi - arXiv preprint arXiv:1910.07773, 2019 - arxiv.org

… Firstly, we provide a formal definition of the Wasserstein distance, which is a distance

between probability measures by using transportation between the measures. Let (X,d) be a …

Cited by 3 Related articles All 2 versions 
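The definition that the snippet begins to quote is the standard one. In LaTeX notation (a standard formulation, not copied from the paper), for probability measures \mu, \nu on a metric space (X, d):

W_p(\mu,\nu) = \left( \inf_{\pi \in \Pi(\mu,\nu)} \int_{X \times X} d(x,y)^p \, \mathrm{d}\pi(x,y) \right)^{1/p}, \qquad p \in [1,\infty),

where \Pi(\mu,\nu) denotes the set of couplings of \mu and \nu.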



[PDF] arxiv.org

Finsler structure for variable exponent Wasserstein space and gradient flows

A Marcos, A Soglo - arXiv preprint arXiv:1912.12450, 2019 - arxiv.org

… A such duality mapping exists because of the Hahn Banach … We define the Wasserstein

distance Wp(.) between ρ0 and … a definition of the tangent space of the Wasserstein space Pp(.)(Ω…

Related articles All 2 versions 

 

[PDF] projecteuclid.org

Behavior of the empirical Wasserstein distance in ℝ^d under moment conditions

J Dedecker, F Merlevède - Electronic Journal of Probability, 2019 - projecteuclid.org

… We establish some deviation inequalities, moment bounds and almost sure results for the

Wasserstein distance of order p ∈ [1, ∞) between the empirical measure of independent and …

Cited by 8 Related articles All 18 versions


2019


[PDF] uni-bielefeld.de

[PDF] Diffusions and PDEs on Wasserstein space

FY Wang - arXiv preprint arXiv:1903.02148, 2019 - sfb1283.uni-bielefeld.de

We propose a new type SDE, whose coefficients depend on the image of solutions, to

investigate the diffusion process on the Wasserstein space P2 over Rd, generated by the …

Cited by 2 Related articles 


[PDF] wiley.com

A degenerate Cahn‐Hilliard model as constrained Wasserstein gradient flow

D Matthes, C Cances, F Nabet - PAMM, 2019 - Wiley Online Library

… The PDE is written as a gradient flow with respect to the L2-Wasserstein metric for two

components that are coupled by an incompressibility constraint. Approximating solutions are …

Related articles All 2 versions

 

 

[PDF] aaai.org

Wasserstein soft label propagation on hypergraphs: Algorithm and generalization error bounds

T Gao, S Asoodeh, Y Huang, J Evans - Proceedings of the AAAI …, 2019 - ojs.aaai.org

… We will see that in this case hypergraph label propagation can be cast into a Wasserstein

propagation … With these notations, it is easy to write down the Euler-Lagrange equation of the …

Cited by 3 Related articles All 5 versions 


[PDF] arxiv.org

Wasserstein covariance for multiple random densities

A Petersen, HG Müller - Biometrika, 2019 - academic.oup.com

… , the Wasserstein metric is popular due to its theoretical appeal and interpretive value as an

optimal transport metric, leading to the Wasserstein–… Second, we introduce a Wasserstein …

Cited by 19 Related articles All 10 versions


2019 see 2020

Aggregated Wasserstein distance and state registration for hidden Markov models

Y Chen, J Ye, J Li - IEEE transactions on pattern analysis and …, 2019 - ieeexplore.ieee.org

… and the Wasserstein metric between distributions… Wasserstein metric for Gaussian distributions.

The solution of the optimization problem is a fast approximation to the Wasserstein metric …

Cited by 13 Related articles All 7 versions

<——2019—–—2019 ——2720—- 


[PDF] arxiv.org

Wasserstein contraction of stochastic nonlinear systems

J Bouvrie, JJ Slotine - arXiv preprint arXiv:1902.08567, 2019 - arxiv.org

… Wasserstein distance between the laws of any two solutions can be bounded by the Wasserstein

… Let P : [0,T] × R2d × B(R2d) → R+ denote the transition function of the Markov process …

Cited by 5 Related articles All 2 versions 


[PDF] arxiv.org

Wasserstein diffusion tikhonov regularization

AT Lin, Y Dukler, W Li, G Montúfar - arXiv preprint arXiv:1909.06860, 2019 - arxiv.org

… with the Wasserstein distance, in particular the Wasserstein-2 metric and geometry. … the

Wasserstein metric on input space and which integrates the entire set of Wasserstein Gaussian …

Cited by 3 Related articles All 7 versions 


2019 see 2020  [PDF] arxiv.org

Learning with minibatch Wasserstein: asymptotic and gradient properties

K Fatras, Y Zine, R Flamary, R Gribonval… - arXiv preprint arXiv …, 2019 - arxiv.org

… The original gradient flow algorithm uses an Euler scheme. Formally, starting from an

initial distribution at time t = 0, it means that at each iteration we integrate the ODE …

Cited by 27 Related articles All 24 versions 

 

[PDF] arxiv.org

Penalization of barycenters in the Wasserstein space

J Bigot, E Cazelles, N Papadakis - SIAM Journal on Mathematical Analysis, 2019 - SIAM

… Wasserstein distance W2 associated to the quadratic cost for the comparison of probability

measures (see, eg, [35] for a thorough introduction on the topic of Wasserstein … Wasserstein …

Cited by 27 Related articles All 10 versions


[PDF] sciencedirect.com

The boundary method for semi-discrete optimal transport partitions and Wasserstein distance computation

L Dieci, JD Walsh III - Journal of Computational and Applied Mathematics, 2019 - Elsevier

We introduce a new technique, which we call the boundary method, for solving semi-discrete

optimal transport problems with a wide range of cost functions. The boundary method …

Cited by 10 Related articles All 6 versions


2019

 [PDF] arxiv.org

A nonlocal free boundary problem with Wasserstein distance

A Karakhanyan - arXiv preprint arXiv:1904.06270, 2019 - arxiv.org

… The paper is organized as follows: In Section 2 we recall some facts on the Wasserstein …

Then we derive the Euler-Lagrange equation. From here we infer that ρ has L∞ density with …

Related articles All 3 versions 

 

[PDF] arxiv.org

Wasserstein norm for signed measures, with application to nonlocal transport equation with source term

B Piccoli, F Rossi, M Tournus - arXiv preprint arXiv:1910.05105, 2019 - arxiv.org

… In Section 3, we define the generalized Wasserstein distance for signed measures, we

show that it can be used to define a norm, and prove some topological properties. Section 4 is …

Cited by 11 Related articles All 32 versions 

 

[PDF] biorxiv.org

Free from Publisher

Reproducibility test of radiomics using network analysis and Wasserstein K-means algorithm

JH Oh, AP Apte, E Katsoulakis, N Riaz, V Hatzoglou… - bioRxiv, 2019 - biorxiv.org

… For phantom data, the Wasserstein distance on a largest … was much smaller than the

Wasserstein distance on the same … the Wasserstein distance metric and the proposed Wasserstein …

Related articles All 3 versions 


[PDF] arxiv.org

Fréchet means and Procrustes analysis in Wasserstein space

Y Zemel, VM Panaretos - Bernoulli, 2019 - projecteuclid.org

… of a Fréchet mean in the Wasserstein space of multivariate … Exploiting the tangent bundle

structure of Wasserstein space… iid realisations from a law on Wasserstein space, and indeed …

Cited by 78 Related articles All 10 versions

 

[PDF] archives-ouvertes.fr

[PDF] Diffusive processes on the Wasserstein space: coalescing models, regularization properties and McKean-Vlasov equations

V Marx - 2019 - tel.archives-ouvertes.fr

… [French: In our case, the diffusion on the Wasserstein space makes it possible to …] Wasserstein metric

on P2(R). Second, von Renesse and Sturm also stated a Varadhan-like formula for the Markov …

Cited by 2 Related articles All 9 versions 

<——2019—–—2019 ——2730—- 


Wasserstein Distances for Estimating Parameters in Stochastic Reaction Networks

K Öcal, R Grima, G Sanguinetti - International Conference on …, 2019 - Springer

… optimization has been successfully applied for identifying parameters in cosmology [23],

genomic prediction [24] and in the context of maximum likelihood estimation for general Markov …

  Related articles All 4 versions


[PDF] arxiv.org

Bounding quantiles of Wasserstein distance between true and empirical measure

SN Cohen, MNA Tegnér, J Wiesel - arXiv preprint arXiv:1907.02006, 2019 - arxiv.org

… For the same reasons our method of proof only covers the case of the 1-Wasserstein distance,

while we expect a similar result to hold for the p-Wasserstein distance for p > 1 also. We …

  Related articles All 4 versions 


[PDF] mlr.press

The wasserstein transform

F Memoli, Z Smith, Z Wan - International Conference on …, 2019 - proceedings.mlr.press

… A localization operator L is a map from Pf (X) to Markov kernels over X, ie, given α …

Wasserstein transform, it is possible to formulate a similar transform using the notion of lp-Wasserstein …

 Cited by 5 Related articles All 7 versions 

2019 see 2020 [PDF] arxiv.org

Kernel wasserstein distance

JH Oh, M Pouryahya, A Iyer, AP Apte… - arXiv preprint arXiv …, 2019 - arxiv.org

… The Wasserstein distance is a powerful metric based on the theory of optimal transport. It …

Wasserstein distance. In this work, we develop a novel method to compute the L2-Wasserstein …

 Cited by 9 Related articles All 3 versions 

2019


[HTML] frontiersin.org

[HTML] Identifying imaging markers for predicting cognitive assessments using wasserstein distances based matrix regression

J Yan, C Deng, L Luo, X Wang, X Yao, L Shen… - Frontiers in …, 2019 - frontiersin.org

… To tackle this problem, in this paper we consider Wasserstein distance as distance metric

for regression model. Different from L p distances (p ≥ 0) (Luo et al., 2017) or Kullback-Leibler …

 Cited by 2 Related articles All 9 versions 



Using wasserstein-2 regularization to ensure fair decisions with neural-network classifiers

L Risser, Q Vincenot, N Couellan, JM Loubes - 2019 - hal.archives-ouvertes.fr

… by using a constraint based on the Wasserstein distance. More specifically, we detail how

to efficiently compute the gradients of Wasserstein-2 regularizers for Neural-Networks. The …

Cited by 11 Related articles 


[PDF] arxiv.org

(q, p)-Wasserstein GANs: Comparing Ground Metrics for Wasserstein GANs

A Mallasto, J Frellsen, W Boomsma… - arXiv preprint arXiv …, 2019 - arxiv.org

… We demonstrate the effect of different p-Wasserstein distances in two toy examples.

Furthermore, we show that the ground metric does make a difference, by comparing different (q, p) …

 Cited by 5 Related articles All 3 versions 


[PDF] arxiv.org

Towards diverse paraphrase generation using multi-class wasserstein GAN

Z An, S Liu - arXiv preprint arXiv:1909.13827, 2019 - arxiv.org

… should have minimized Wasserstein distance to P (i) r , while its Wasserstein distance to

another … , 2015], where we use the Wasserstein distance between distributions to replace the L2 …

Cited by 9 Related articles All 4 versions 


[PDF] arxiv.org

Disentangled representation learning with Wasserstein total correlation

Y Xiao, WY Wang - arXiv preprint arXiv:1912.12818, 2019 - arxiv.org

… total correlation in both variational autoencoder and Wasserstein autoencoder settings to …

the Wasserstein total correlation term. We discuss the benefits of using Wasserstein distance …

 Cited by 6 Related articles All 2 versions 

<——2019—–—2019 ——2740—- 


[PDF] arxiv.org

Deep distributional sequence embeddings based on a wasserstein loss

A Abdelwahab, N Landwehr - arXiv preprint arXiv:1912.01933, 2019 - arxiv.org

… We propose a distance metric based on Wasserstein distances between the distributions

and a corresponding loss function for metric learning, which leads to a novel end-to-end …

 Cited by 6 Related articles All 2 versions 


 [PDF] arxiv.org

Refined basic couplings and Wasserstein-type distances for SDEs with Lévy noises

D Luo, J Wang - Stochastic Processes and their Applications, 2019 - Elsevier

… We establish the exponential convergence with respect to the L 1 -Wasserstein distance and

the total variation for the semigroup corresponding to the stochastic differential equation d X …

Cited by 22 Related articles All 6 versions

 

2019 see 2020  [PDF] arxiv.org

Progressive wasserstein barycenters of persistence diagrams

J Vidal, J Budin, J Tierny - IEEE transactions on visualization …, 2019 - ieeexplore.ieee.org

… Since Wasserstein distances are only approximated in this strategy, we suggest to relax

the overall stopping condition (Alg. 1) and stop the iterations after two successive increases in …

Cited by 23 Related articles All 17 versions


WASSERSTEIN METRIC-DRIVEN BAYESIAN INVERSION WITH APPLICATIONS TO SIGNAL PROCESSING
Authors:Mohammad Motamed, Daniel Appelo
Summary:We present a Bayesian framework based on a new exponential likelihood function driven by the quadratic Wasserstein metric. Compared to conventional Bayesian models based on Gaussian likelihood functions driven by the least-squares norm (L2 norm), the new framework features several advantages. First, the new framework does not rely on the likelihood of the measurement noise and hence can treat complicated noise structures such as combined additive and multiplicative noise. Second, unlike the normal likelihood function, the Wasserstein-based exponential likelihood function does not usually generate multiple local extrema. As a result, the new framework features better convergence to correct posteriors when a Markov Chain Monte Carlo sampling algorithm is employed. Third, in the particular case of signal processing problems, although a normal likelihood function measures only the amplitude differences between the observed and simulated signals, the new likelihood function can capture both amplitude and phase differences. We apply the new framework to a class of signal processing problems, that is, the inverse uncertainty quantification of waveforms, and demonstrate its advantages compared to Bayesian models with normal likelihood functions.
Downloadable Article
Publication:International Journal for Uncertainty Quantification, 9, 2019, 395
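In one dimension the quadratic Wasserstein distance used in such likelihoods reduces to a comparison of quantile functions. A generic NumPy sketch under that assumption (equal-weight sample sets of equal size; not the authors' implementation, and the observed/simulated arrays below are synthetic):

```python
import numpy as np

def w2_squared_1d(x, y):
    """Squared 2-Wasserstein distance between two equally weighted 1-D sample
    sets of the same size, computed via sorted samples (empirical quantiles)."""
    x, y = np.sort(np.asarray(x)), np.sort(np.asarray(y))
    assert x.shape == y.shape
    return float(np.mean((x - y) ** 2))

rng = np.random.default_rng(0)
obs = rng.normal(0.0, 1.0, 1000)   # stand-in for "observed" signal samples
sim = rng.normal(0.5, 1.2, 1000)   # stand-in for "simulated" signal samples
print(w2_squared_1d(obs, sim))
```

For waveforms, the signals would first have to be normalized into probability densities before this comparison makes sense, which is part of what the adaptive-normalization entries on this page address.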

2019 see 2020

Adaptive quadratic Wasserstein full-waveform inversion

D Wang, P Wang - SEG International Exposition and Annual Meeting, 2019 - onepetro.org

… based on the quadratic Wasserstein metric, with adaptive normalization and integral wavefield.

We show that this scheme has better convexity than traditional metrics, and therefore can …

 Cited by 4 Related articles All 2 versions


2019


[HTML] aclanthology.org

[HTML] Modeling personalization in continuous space for response generation via augmented wasserstein autoencoders

Z Chan, J Li, X Yang, X Chen, W Hu… - Proceedings of the …, 2019 - aclanthology.org

Variational autoencoders (VAEs) and Wasserstein autoencoders (WAEs) have achieved

noticeable progress in open-domain response


2019  

How to Develop a Wasserstein Generative Adversarial ...

https://machinelearningmastery.com › Blog

Jul 17, 2019 — The benefit of the WGAN is that the training process is more stable and less sensitive to model architecture and choice of hyperparameter ...

[CITATION] How to develop a wasserstein generative adversarial network (wgan) from scratch

J Brownlee - 2019

Cited by 3 Related articles

How to Develop a Wasserstein Generative Adversarial Network (WGAN) From Scratch

by Jason Brownlee on July 17, 2019 in Generative Adversarial Networks
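For readers following the tutorial entries above, the original WGAN recipe (critic trained on a two-expectation loss, plus weight clipping) fits in a few lines. A hedged toy sketch in PyTorch, with the 1-D data, network sizes, and hyperparameters chosen only for illustration:

```python
import torch
import torch.nn as nn

# Minimal 1-D toy WGAN: the critic scores samples, the generator maps noise to samples.
critic = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Linear(64, 1))
gen = nn.Sequential(nn.Linear(8, 64), nn.ReLU(), nn.Linear(64, 1))
opt_c = torch.optim.RMSprop(critic.parameters(), lr=5e-5)
opt_g = torch.optim.RMSprop(gen.parameters(), lr=5e-5)

def real_batch(n=64):
    return torch.randn(n, 1) * 0.5 + 2.0          # toy "real" data: N(2, 0.25)

for step in range(1000):
    for _ in range(5):                             # several critic updates per generator update
        real = real_batch()
        fake = gen(torch.randn(real.size(0), 8)).detach()
        loss_c = -(critic(real).mean() - critic(fake).mean())   # maximize the W1 surrogate
        opt_c.zero_grad(); loss_c.backward(); opt_c.step()
        for p in critic.parameters():              # crude Lipschitz control via clipping
            p.data.clamp_(-0.01, 0.01)
    fake = gen(torch.randn(64, 8))
    loss_g = -critic(fake).mean()                  # generator tries to raise critic scores on fakes
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
```

Weight clipping is the original, crude Lipschitz constraint; the gradient-penalty (WGAN-GP) variants listed elsewhere on this page replace the clamp with a penalty on the critic's gradient norm.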

[1910.06749] Parameter-Transferred Wasserstein ... - arXiv

https://arxiv.org › eess

by Y Gong · 2019 · Cited by 19 — In this paper, we propose a parameter-transferred Wasserstein generative adversarial network (PT-WGAN) for low

[PDF] stanford.edu

[PDF] A Privacy Preserved Image-to-Image Translation Model in MRI: Distributed Learning of WGANs

T Ergen, B Ozturkler, B Isik - cs229.stanford.edu

In this project, we introduce a distributed training approach for Generative Adversarial Networks 

(GANs) on Magnetic Resonance Imaging (MRI) tasks. In our distributed framework, we …

Related articles


[PDF] arxiv.org

Zero-sum differential games on the Wasserstein space

J Moon, T Basar - arXiv preprint arXiv:1912.06084, 2019 - arxiv.org

We consider two-player zero-sum differential games (ZSDGs), where the state process (dynamical 

system) depends on the random initial condition and the state process's distribution, …

Cited by 3 Related articles All 3 versions

<——2019—–—2019 ——2750—



 Computational optimal transport: With applications to data science

G Peyré, M Cuturi - Foundations and Trends® in Machine …, 2019 - nowpublishers.com

computational schemes. The main body of Chapters 2, 3, 4, 9, and 10 is devoted solely to the 

study of the geometry induced by optimal transport in … a certain Wasserstein distance of the …

 Cited by 1680 Related articles All 7 versions 

[CITATION] transport: Computation of Optimal Transport Plans and Wasserstein Distances, r package version 0.11-1

D Schuhmacher, B Bähre, C Gottschlich, V Hartmann… - 2019

Cited by 8 Related articles

[CITATION] DialogWAE: Multimodal response generation with conditional wasserstein auto-encoder

GU Xiaodong, K CHO - The 7th International Conference on Learning …, 2019


 [PDF] Full-Band Music Genres Interpolations with Wasserstein Autoencoders

T Borghuis, A Tibo, S Conforti, L Brusci… - … AI for Media and …, 2019 - vbn.aau.dk

We compare different types of autoencoders for generating interpolations between four-instruments 

musical patterns in the acid jazz, funk, and soul genres. Preliminary empirical results …

 Related articles All 3 versions 


Research article
Conditional Wasserstein generative adversarial network-gradient penalty-based approach to alleviating imbalanced data classification
Information Sciences, 13 October 2019 ...

Ming Zheng, Tong Li, Zifei Ma



2019  patent

Data association method in pedestrian tracking based on Wasserstein measurement

CN CN110110670B 郭春生 杭州电子科技大学

Priority 2019-05-09 • Filed 2019-05-09 • Granted 2022-03-25 • Published 2022-03-25

5. The data association method in pedestrian tracking based on Wasserstein measurement as claimed in claim 1, wherein said second step is specifically: the method comprises the steps that seven video clips on a train sequence of an MOT16 data set are used for making data sets, and the made training …


2019  patent

Wasserstein barycenter model ensembling

US US20200342361A1 Youssef Mroueh International Business Machines Corporation

Priority 2019-04-29 • Filed 2019-04-29 • Published 2020-10-29

10 . The system according to claim 9 , further comprising inputting side information into the barycenter, wherein the barycenter comprises a Wasserstein barycenter with a Wasserstein distance metric. 11 . The system according to claim 9 , further comprising a plurality of the barycenters to determine …


2019


2019 patent

… and embedded clustering based on depth self-coding of Sliced-Wasserstein


CN CN111178427B 郭春生 杭州电子科技大学

Priority 2019-12-27 • Filed 2019-12-27 • Granted 2022-07-26 • Published 2022-07-26

2. The method for image dimensionality reduction and embedded clustering based on depth self-coding of Sliced-Wasserstein distance according to claim 1, wherein in step S4, the cluster center of the self-coding embedded clustering network after initialization construction is initialized by an …


2019 patent

… for high-dimension unsupervised anomaly detection using kernalized wasserstein 

KR KR102202842B1 백명희조 서울대학교산학협력단

Priority 2019-08-13 • Filed 2019-08-13 • Granted 2021-01-14 • Published 2021-01-14

The present invention relates to a learning method and a learning apparatus for high-dimension unsupervised abnormality detection using a kernalized Wasserstein autoencoder to decrease excessive computations of a Christoffel function, and a test method and a test apparatus using the same.


2019 patent

A kind of uneven learning method based on WGAN-GP and over-sampling

CN CN109816044A 邓晓衡 中南大学

Priority 2019-02-11 • Filed 2019-02-11 • Published 2019-05-28

3. a kind of uneven learning method based on WGAN-GP and over-sampling as claimed in claim 2, which is characterized in that sentence The loss function of other device, as follows: Wherein, D (), G () respectively indicate the function expression of arbiter and Maker model, P r Indicate the number of …


2019 patent

Sketch based on WGAN-GP and U-NET-photo method for transformation

CN CN110175567A 王世刚 吉林大学

Priority 2019-05-28 • Filed 2019-05-28 • Published 2019-08-27

1. a kind of sketch based on WGAN-GP and U-NET -- photo method for transformation, it is characterised in that include the following steps: 1.1 obtain human face sketch -- picture data library: FERET, CUHK, IIIT-D 1.2 by sketch -- photo keeps the distribution proportion of its face of …


2019  patent

New energy capacity configuration method based on WGAN scene simulation and …

CN CN112994115A 马燕峰 华北电力大学(保定)

Priority 2019-12-18 • Filed 2019-12-18 • Published 2021-06-18

1. A new energy capacity configuration method based on Wasserstein generation countermeasure network (WGAN) scene simulation and time sequence production simulation is characterized by mainly comprising the following specific steps: step 1, simulating a large number of wind and light resource …

<——2019—–—2019 ——2760—- 


2019 patent

Method, device and storage medium for improving image enhancement based on WGAN

CN CN110493242B 王红玲 上海网达软件股份有限公司

Priority 2019-08-27 • Filed 2019-08-27 • Granted 2022-02-11 • Published 2022-02-11

1. A method for improved image enhancement based on WGAN-GP and U-net, comprising the steps of: the first step is as follows: de-encapsulating the input video stream or file to obtain a first video code stream and a first audio code stream; the second step is as follows: decoding the first video …


2019 patent

Convolutional neural networks based on Wasserstein distance fight transfer …

CN CN110414383A 袁烨 华中科技大学

Priority 2019-07-11 • Filed 2019-07-11 • Published 2019-11-05

In the step 3.2, the Wasserstein distance is the real number average value and target reality of the source domain set of real numbers The difference of the real number average value of manifold. 5. a kind of convolutional neural networks based on Wasserstein distance according to claim 2 fight …


2019 patent

… denoising model of confrontation network are generated based on Wasserstein

CN CN110097512A 张意 四川大学

Priority 2019-04-16 • Filed 2019-04-16 • Published 2019-08-06

4. the building of the three-dimensional MRI image denoising model of confrontation network is generated based on Wasserstein according to claim 3 Method, it is characterised in that the noise data of input coding device is successively handled by Three dimensional convolution, at normalization in …


2019 patent

Clean energy power supply planning method based on Wasserstein distance and …

CN CN110797919A 汪荣华 国网四川省电力公司经济技术研究院

Priority 2019-12-05 • Filed 2019-12-05 • Published 2020-02-14

8. The method for clean energy power planning based on Wasserstein distance and distribution robust optimization of claim 2, wherein the uncertain set formed by the hypercube is introduced as follows: wherein, theta is an uncertain set formed by the hypercube; is a normalized uncertain variable; r m …


2019 patent

Finger vein identification method based on deep learning and Wasserstein

CN CN110555382A 张娜 浙江理工大学

Priority 2019-07-31 • Filed 2019-07-31 • Published 2019-12-10

6. the finger vein recognition method based on deep learning and Wasserstein distance measurement in claim 1, wherein: the step S5 includes: S51, in the registration stage, acquiring a finger vein image through the step S1, further extracting a feature code G w (x) of the image through the steps S2 …


2019


2019 patent

Wasserstein distance-based fault diagnosis method for deep countermeasure …

CN CN110907176B 徐娟 合肥工业大学

Priority 2019-09-30 • Filed 2019-09-30 • Granted 2021-02-02 • Published 2021-02-02

3. The method for fault diagnosis of deep migration-resistant network based on Wasserstein distance as claimed in claim 2, wherein in step S4, the empirical loss L of domain discriminator as the objective function of fault diagnosis model is obtained D And a gradient penalty term L of the domain …


2019 patent

… typical scene generation method based on BIRCH clustering and Wasserstein

CN CN110929399A 汤向华 国网江苏省电力有限公司南通供电分公司

Priority 2019-11-21 • Filed 2019-11-21 • Published 2020-03-27

2. The method for generating a typical wind power output scene based on BIRCH clustering and Wasserstein distance as claimed in claim 1, wherein: the specific steps of the BIRCH clustering are as follows: a) setting threshold parameters B, L and T, and inputting wind power scene number S; b) number …


2019 patent

System and method for unsupervised domain adaptation via sliced-wasserstein …

WO EP CN EP3918527A1 Alexander J. GABOURIE HRL Laboratories, LLC

Priority 2019-01-30 • Filed 2019-12-18 • Published 2021-12-08

12. The computer program product as set forth in Claim 11, wherein the one or more processors further perform an operation of using sliced-Wasserstein (SW) distance as a dissimilarity measure for determining dissimilarity between the first input data distribution and the second input data …


VIDEO   

Wasserstein Geodesic between PacMan and Ghost

Carrillo, Jose ; Craig, Katy ; Wang, Li ; Chaozhen Wei, 2019

 Wasserstein Geodesic between PacMan and Ghost

OPEN ACCESS

Wasserstein Geodesic between PacMan and Ghost

No Online Access 

Carrillo, Jose 


<——2019—–—2019–——2770—  



2019  see 2018 2022

Data-Driven Chance Constrained ... - Open Collections

https://open.library.ubc.ca › collections › items

Jan 14, 2019 — We provide an exact deterministic reformulation for data-driven chance constrained programs over Wasserstein balls. For individual chance ...

Data-Driven Chance Constrained Programs over Wasserstein ...

We provide an exact deterministic reformulation for data-driven chance constrained programs over Wasserstein balls. For individual chance ...

Open Collections · Wiesemann, Wolfram · 

Jan 14, 2019


https://www.slideshare.net › 16wgan-...

십분딥러닝_16_WGAN (Wasserstein GANs) [Korean: "Ten-Minute Deep Learning" series] - SlideShare

Jan 18, 2019 — 십분딥러닝_16_WGAN (Wasserstein GANs). Jan 18, 2019. 476 views.


십분딥러닝_16_WGAN (Wasserstein GANs)_1 - YouTube

www.youtube.com › watch

17:41

임성빈, SlideShare, "Wasserstein GAN 수학 이해하기" [Korean: Understanding the Mathematics of Wasserstein GAN], https://www.slideshare.net/ssuser7e10... (Kantorovich ...

Jan 20, 2019 · Uploaded by 10min deep learning


PR-142: Wasserstein GAN - bilibili (哔哩哔哩)

http://bing.com PR-142: Wasserstein GAN. [Chinese: the subtitled version will be released later; stay tuned and welcome to join the AI ...]

Oct 15, 201

PR-142: Wasserstein GAN

YouTube   · 691 views   · 2/17/2019

by   Jinsung Yoon

Jan 20, 2019
PR-142: Wasserstein GAN - YouTube

www.youtube.com › watch

PR-142: Wasserstein GAN. 2,462 views. Jinsung Yoon.

YouTube · Jinsung Yoon

Feb 17, 2019

Eric Wong on Twitter: "New paper on Wasserstein adversarial ...

Postdoc at MIT working on optimization and robustness problems. Pittsburgh, PA. github.com/riceric22. Joined ...

Feb 23, 2019

PR-142: Wasserstein GAN - YouTube
Abstract. Guido Montufar - University of California, Los Angeles (UCLA), Mathematics and Statistics This lecture ...

Mar 14, 2019
 

 

Wasserstein GAN generating Graffiti Tags Smode effects

YouTube · 111 views · 

4/11/2019 · by mASLLSAm

Apr 11, 2019


2019


Learning 34: (1) Wasserstein Generative Adversarial Network (WGAN): Introduction

YouTube · 9,000+ views

 · 4/16/2019  · by Ahlad Kumar

Apr 16, 2019

(1) Wasserstein Generative Adversarial Network (WGAN ...

www.youtube.com › watch

Deep Learning 35: (2) Wasserstein Generative Adversarial Network (WGAN): Wasserstein metric. Ahlad Kumar.

YouTube · Ahlad Kumar · 

Apr 16, 2019


Deep Learning 34: (1) Wasserstein Generative Adversarial Network (WGAN): Introduction

4,053 views 

Apr 16, 2019

Deep Learning 34: (1) Wasserstein Generative Adversarial Network (WGAN): Introduction

YouTube  · 9,000+ views

 · 4/16/2019  · by  Ahlad Kumar

Deep Learning 35: (2) Wasserstein Generative Adversarial

 - Uploaded by Ahlad Kumar

In this lecture a detailed discussion on Wasserstein metric is carried out. (1) https://ieeexplore.ieee.org ...

Apr 22, 2019

(2) Wasserstein Generative Adversarial Network (WGAN)

www.youtube.com › watch

In this lecture a detailed discussion on Wasserstein metric is carried out.

YouTube · Ahlad Kumar · 

Apr 21, 2019


Deep Learning 36: (3) Wasserstein Generative Adversarial Network (WGAN): WGAN Understanding

1,902 views 

Apr 28, 2019

<——2019—–—2019–——2780—



2019 see 2017

20190430 Wasserstein Gans

YouTube · 298 views · 

4/30/2019 · by AGIST audio not English

 

Why don't we all use WGAN instead of GAN? - Reddit

https://www.reddit.com › comments › why_dont_we_al...

https://www.reddit.com › comments › why_dont_we_al...

Mar 5, 2019 — - Both generator and discriminator losses oscillate around an equilibrium. If the loss is at the equilibrium it either means that both G and D ...

[R] [1701.07875] Wasserstein GAN : r/MachineLearning - Reddit

Jan 30, 2017

[D] Why don't people use typical classification networks (e.g. ...

Jan 22, 2020

Why don't we all use WGAN instead of GAN

It seems to me that a Wasserstein-GAN has much better properties than a regular GAN. - In regular GANs, the ...

Mar 4, 2019


Deep Learning 37: (4) Wasserstein Generative Adversarial Network 

(WGAN): Coding using Tensor Flow

700 views 

www.youtube.com › watch

... of Wasserstein Generative Adversarial Network (WGAN) is performed in TensorFlow using Google Colab#wasserstein#tensorflow#GAN.

YouTube · Ahlad Kumar · 

May 5, 2019

 

Lecture 6: Generative Adversarial Networks (GAN) (2018), Wasserstein GAN (WGAN), Energy-based ...

May 19, 2019


On Wasserstein Gradient Flows and Particle-Based ...

https://slideslive.com › on-wasserstein-gradient-flows-a...

Jun 15, 2019 — Stein's method is a technique from probability theory for bounding the distance between probability measures using differential and difference ...

On Wasserstein Gradient Flows and Particle-Based ...

crossminds.ai › video › on-wasserstein-gradient-flows-and...

On Wasserstein Gradient Flows and Particle-Based Variational Inference. Ruiyi Zhang.

Jun 15, 2019


2019


GAN Lecture 6 (2018): WGAN, EBGAN - bilibili (哔哩哔哩)

http://bing.com GAN Lecture 6 (2018): WGAN, EBGAN. [Chinese: the subtitled version will be released later; please stay ...]

Sep 23, 2019



Talks & Posters – César's Webpage

cauribe.mit.edu › invited-talks-posters-and-abstracts

On the Complexity of Approximating Wasserstein Barycenters. INFORMS Annual Meeting 2019; Invited Seminar at Rensselaer Polytechnic Institute 2019 ...

César A. Uribe · CSL Student Conference · 

Sep 23, 2019


Estimation of wasserstein distances in the spiked transport model

J Niles-Weed, P Rigollet - arXiv preprint arXiv:1909.07513, 2019 - arxiv.org

… We study the minimax rate of estimation for the Wasserstein distance under this model and 

… , the plug-in estimator is nearly rate-optimal for estimating the Wasserstein distance in high …

 Cited by 46 Related articles All 2 versions 

Estimating the Wasserstein Metric - Jonathan Niles-Weed .

Uploaded by Institute for Advanced Study

Estimating the Wasserstein Metric - Jonathan Niles-Weed ... Regularized Wasserstein Distances & Minimum ...

Oct 4, 2019 - Uploaded by Institute for Advanced Study. 10,162 views

 

aiacademy: Generative Adversarial Networks (GAN) - Wasserstein GAN (WGAN)

Oct 4, 2019


Stochastic Wasserstein Autoencoder for Probabilistic ...

https://aclanthology.org › ...

by H Bahuleyan · 2019 · Cited by 21 — In this paper, we propose to use the Wasserstein autoencoder (WAE) for probabilistic sentence generation, where the encoder could be either stochastic or ...

Stochastic Wasserstein Autoencoder for Probabilistic 

This is "Stochastic Wasserstein Autoencoder for Probabilistic Sentence Generation" by TechTalksTV on Vimeo ...

Oct 7, 2019 · Upload

<——2019—–—2019–——2790—


Katharine Turner (12/3/19): Why should q=p in the ... - YouTube

Title: Why should q=p in the Wasserstein distance between persistence diagrams? Let me count the ways ...

Dec 3, 2019 · Uploaded by Applied Algebraic Topology Network


Optimal Transport - The Wasserstein Metric

2,238 views. Math 707: Optimal Transport

Brittany Hamfeldt

735 subscribers

 Dec 13, 2019


Optimal Transport - Wasserstein Barycentres - YouTube

www.youtube.com › watch

Math 707: Optimal Transport Wasserstein Barycentres October 21, ... Gradient descent algorithms for Bures-Wasserstein barycenters.

YouTube · Brittany Hamfeldt · 

Dec 13, 2019
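The Bures-Wasserstein geometry mentioned in this lecture listing refers to the closed form of W2 between Gaussians (a standard result, not taken from the video):

W_2^2\big(\mathcal{N}(m_1,\Sigma_1),\,\mathcal{N}(m_2,\Sigma_2)\big) = \|m_1 - m_2\|^2 + \operatorname{tr}\!\Big(\Sigma_1 + \Sigma_2 - 2\big(\Sigma_2^{1/2}\,\Sigma_1\,\Sigma_2^{1/2}\big)^{1/2}\Big).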


2019 see 2017

Improved Training of Wasserstein GANs Presentation Group J

... for COMP7404 Computational intelligence and machine learning at HKU. The original paper is improved-training-of-Wasserstein-GANs.

YouTube · Junyue Liu · 

Dec 15, 2019


2019
Study of Constrained Network Structures for WGANs on Numeric Data Generation
Authors:Wang, Wei (Creator), Wang, Chuang (Creator), Cui, Tao (Creator), Li, Yue (Creator)
Summary:Some recent studies have suggested using GANs for numeric data generation such as to generate data for completing the imbalanced numeric data. Considering the significant difference between the dimensions of the numeric data and images, as well as the strong correlations between features of numeric data, the conventional GANs normally face an overfitting problem, consequently leads to an ill-conditioning problem in generating numeric and structured data. This paper studies the constrained network structures between generator G and discriminator D in WGAN, designs several structures including isomorphic, mirror and self-symmetric structures. We evaluates the performances of the constrained WGANs in data augmentations, taking the non-constrained GANs and WGANs as the baselines. Experiments prove the constrained structures have been improved in 17/20 groups of experiments. In twenty experiments on four UCI Machine Learning Repository datasets, Australian Credit Approval data, German Credit data, Pima Indians Diabetes data and SPECT heart data facing five conventional classifiers. Especially, Isomorphic WGAN is the best in 15/20 experiments. Finally, we theoretically proves that the effectiveness of constrained structures by the directed graphic model (DGM) analysisShow more
Downloadable Archival Material, 2019-11-05
Undefined
Publisher:2019-11-05
Access Free

 

2019



SGD Learns One-Layer Networks in WGANs
Authors:Lei, Qi (Creator), Lee, Jason D. (Creator), Dimakis, Alexandros G. (Creator), Daskalakis, Constantinos (Creator)
Summary:Generative adversarial networks (GANs) are a widely used framework for learning generative models. Wasserstein GANs (WGANs), one of the most successful variants of GANs, require solving a minmax optimization problem to global optimality, but are in practice successfully trained using stochastic gradient descent-ascent. In this paper, we show that, when the generator is a one-layer network, stochastic gradient descent-ascent converges to a global solution with polynomial time and sample complexity.
Downloadable Archival Material, 2019-10-15
Undefined
Publisher:2019-10-15
Access Free

2019  Peer-reviewed
Feature augmentation for imbalanced classification with conditional mixture WGANs
Authors:Yinghui Zhang, Bo Sun, Yongkang Xiao, Rong Xiao, YunGang Wei
Summary:Heterogeneity of class distribution is an intrinsic property of a real-world dataset. Therefore, imbalanced classification is a popular but challenging task. Several methods exist to address this problem. Notably, the adversarial-based data augmentation method, which aims to directly learn the distribution of minority classes unlike simple data modification, has been applied to the challenging task. While effective, the method focuses on a certain domain and lacks universality, and the generated samples lack diversity due to the mode collapse of Generative Adversarial Networks (GANs). In this paper, we propose a general framework of data augmentation using GANs in feature space for imbalanced classification. The core of the framework comprises conditional mixture WGANs (cMWGANs), which are used to approximate true feature distribution and generate label preserved and diverse features for the minority class of various datasets. We conduct three experiments on SVHN, FER2013, and Amazon Review of Instant Video to demonstrate the versatility of the framework and better performance of our cMWGANs in single feature learning. The results show significant improvement with feature augmentation of cMWGANsShow more
Article
Publication:Signal Processing: Image Communication, 75, July 2019, 89

2019
How Well Do WGANs Estimate the Wasserstein Metric?
Authors:Mallasto, Anton (Creator), Montúfar, Guido (Creator), Gerolin, Augusto (Creator)
Summary:Generative modelling is often cast as minimizing a similarity measure between a data distribution and a model distribution. Recently, a popular choice for the similarity measure has been the Wasserstein metric, which can be expressed in the Kantorovich duality formulation as the optimum difference of the expected values of a potential function under the real data distribution and the model hypothesis. In practice, the potential is approximated with a neural network and is called the discriminator. Duality constraints on the function class of the discriminator are enforced approximately, and the expectations are estimated from samples. This gives at least three sources of errors: the approximated discriminator and constraints, the estimation of the expectation value, and the optimization required to find the optimal potential. In this work, we study how well the methods, that are used in generative adversarial networks to approximate the Wasserstein metric, perform. We consider, in particular, the $c$-transform formulation, which eliminates the need to enforce the constraints explicitly. We demonstrate that the $c$-transform allows for a more accurate estimation of the true Wasserstein metric from samples, but surprisingly, does not perform the best in the generative settingShow more
Downloadable Archival Material, 2019-10-09
Undefined
Publisher:2019-10-09
Access Free
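The Kantorovich duality and c-transform that this summary refers to can be written compactly (standard optimal-transport formulas, not quoted from the paper):

W_c(\mu,\nu) = \sup_{\varphi}\; \mathbb{E}_{x\sim\mu}[\varphi(x)] + \mathbb{E}_{y\sim\nu}[\varphi^c(y)], \qquad \varphi^c(y) = \inf_{x}\big(c(x,y) - \varphi(x)\big);

when c is a metric and \varphi is 1-Lipschitz, \varphi^c = -\varphi, which recovers the familiar WGAN objective W_1(\mu,\nu) = \sup_{\|\varphi\|_{\mathrm{Lip}} \le 1} \mathbb{E}_\mu[\varphi] - \mathbb{E}_\nu[\varphi].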

2019
Conditional WGANs with Adaptive Gradient Balancing for Sparse MRI Reconstruction
Authors:Malkiel, Itzik (Creator), Ahn, Sangtae (Creator), Taviani, Valentina (Creator), Menini, Anne (Creator), Wolf, Lior (Creator), Hardy, Christopher J. (Creator)
Summary:Recent sparse MRI reconstruction models have used Deep Neural Networks (DNNs) to reconstruct relatively high-quality images from highly undersampled k-space data, enabling much faster MRI scanning. However, these techniques sometimes struggle to reconstruct sharp images that preserve fine detail while maintaining a natural appearance. In this work, we enhance the image quality by using a Conditional Wasserstein Generative Adversarial Network combined with a novel Adaptive Gradient Balancing technique that stabilizes the training and minimizes the degree of artifacts, while maintaining a high-quality reconstruction that produces sharper images than other techniques.
Downloadable Archival Material, 2019-05-02
Undefined
Publisher:2019-05-02
Access Free

 2019  eBook
Using Wasserstein Generative Adversarial Networks for the design of Monte Carlo simulations
Authors:Susan Athey (Author), Guido Imbens (Author), Jonas Metzger (Author), Evan M. Munro (Author), National Bureau of Economic Research (Publisher)
Summary:When researchers develop new econometric methods it is common practice to compare the performance of the new methods to those of existing methods in Monte Carlo studies. The credibility of such Monte Carlo studies is often limited because of the freedom the researcher has in choosing the design. In recent years a new class of generative models emerged in the machine learning literature, termed Generative Adversarial Networks (GANs) that can be used to systematically generate artificial data that closely mimics real economic datasets, while limiting the degrees of freedom for the researcher and optionally satisfying privacy guarantees with respect to their training data. In addition if an applied researcher is concerned with the performance of a particular statistical method on a specific data set (beyond its theoretical properties in large samples), she may wish to assess the performance, e.g., the coverage rate of confidence intervals or the bias of the estimator, using simulated data which resembles her setting. To illustrate these methods we apply Wasserstein GANs (WGANs) to compare a number of different estimators for average treatment effects under unconfoundedness in three distinct settings (corresponding to three real data sets) and present a methodology for assessing the robustness of the results. In this example, we find that (i) there is not one estimator that outperforms the others in all three settings, so researchers should tailor their analytic approach to a given setting, and (ii) systematic simulation studies can be helpful for selecting among competing methods in this situationShow more
eBook, 2019
English
Publisher:National Bureau of Economic Research, Cambridge, Mass., 2019
Also available asPrint Book
View AllFormats & Editions
 

<——2019—–—2019–——2800—


Entropy-Based Wasserstein GAN for Imbalanced Learning
https://ojs.aaai.org › AAAI › article › view [PDF]

by J Ren · 2019 · Cited by 13 — EWGAN: Entropy-Based Wasserstein GAN for Imbalanced Learning. Jinfu Ren, Yang Liu, Jiming Liu


EWGAN: Entropy-based Wasserstein GAN for imbalanced learning


2019
Wasserstein Continuity of Entropy and Outer Bounds for Interference Channels
Authors:Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science (Contributor), Polyanskiy, Yury (Creator), Wu, Yihong (Creator)
Summary:It is shown that under suitable regularity conditions, differential entropy is O(n)-Lipschitz as a function of probability distributions on ℝⁿ with respect to the quadratic Wasserstein distance. Under similar conditions, (discrete) Shannon entropy is shown to be O(n)-Lipschitz in distributions over the product space with respect to Ornstein's d-distance (Wasserstein distance corresponding to the Hamming distance). These results together with Talagrand's and Marton's transportation-information inequalities allow one to replace the unknown multi-user interference with its independent identically distributed approximations. As an application, a new outer bound for the two-user Gaussian interference channel is proved, which, in particular, settles the missing corner point problem of Costa (1985).
Downloadable Archival Material, 2019-07-09T14:55:35Z
English
Publisher:Institute of Electrical and Electronics Engineers (IEEE), 2019-07-09T14:55:35Z

2019 computer file

Fréchet means and Procrustes analysis in Wasserstein space
Authors:Zemel, Yoav (Creator), Panaretos, Victor M. (Creator)
Summary:We consider two statistical problems at the intersection of functional and non-Euclidean data analysis: the determination of a Fréchet mean in the Wasserstein space of multivariate distributions; and the optimal registration of deformed random measures and point processes. We elucidate how the two problems are linked, each being in a sense dual to the other. We first study the finite sample version of the problem in the continuum. Exploiting the tangent bundle structure of Wasserstein space, we deduce the Fréchet mean via gradient descent. We show that this is equivalent to a Procrustes analysis for the registration maps, thus only requiring successive solutions to pairwise optimal coupling problems. We then study the population version of the problem, focussing on inference and stability: in practice, the data are i.i.d. realisations from a law on Wasserstein space, and indeed their observation is discrete, where one observes a proxy finite sample or point process. We construct regularised nonparametric estimators, and prove their consistency for the population mean, and uniform consistency for the population Procrustes registration mapsShow more
Computer File, 2019-05
English
Publisher:Bernoulli Society for Mathematical Statistics and Probability, 2019-05


2019
Parameter-Transferred Wasserstein Generative Adversarial Network (PT-WGAN) for Low-Dose PET Image Denoising
Authors:Gong, Yu (Creator), Shan, Hongming (Creator), Teng, Yueyang (Creator), Tu, Ning (Creator), Li, Ming (Creator), Liang, Guodong (Creator), Wang, Ge (Creator), Wang, Shanshan (Creator)
Summary:Due to the widespread use of positron emission tomography (PET) in clinical practice, the potential risk of PET-associated radiation dose to patients needs to be minimized. However, with the reduction in the radiation dose, the resultant images may suffer from noise and artifacts that compromise diagnostic performance. In this paper, we propose a parameter-transferred Wasserstein generative adversarial network (PT-WGAN) for low-dose PET image denoising. The contributions of this paper are twofold: i) a PT-WGAN framework is designed to denoise low-dose PET images without compromising structural details, and ii) a task-specific initialization based on transfer learning is developed to train PT-WGAN using trainable parameters transferred from a pretrained model, which significantly improves the training efficiency of PT-WGAN. The experimental results on clinical data show that the proposed network can suppress image noise more effectively while preserving better image fidelity than recently published state-of-the-art methods. We make our code available at https://github.com/90n9-yu/PT-WGANShow more
Downloadable Archival Material, 2019-10-12
Undefined
Publisher:2019-10-12
Access Free


2019

PaperView: Generalized Wasserstein Dice Score for ...

Scalable Gromov-Wasserstein Learning for Graph Partitioning ...

paperswithcode.com › paper › review

We propose a scalable Gromov-Wasserstein learning (S-GWL) method and establish a novel and theoretically-supported paradigm for large-scale graph analysis.

Papers With Code · Ross Taylor · 

May 20, 2019


Aswin Shriram U T (@AswinShriram) / Twitter

NASA Asteroid Watch ... approach for the low-data regime that calculates the Wasserstein distance (aka the earth mover's distance) between expert and agent, ...

Twitter · Oct 17, 2019


CAM Colloquium - Ziv Goldfeld (11/1/19) - YouTube

www.youtube.com › watch

CAM - Cornell Center for Applied Math Colloquium ... Unfortunately, empirical approximation under Wasserstein distances suffers from a ...

YouTube · CAM - Cornell Center for Applied Math Colloquium · 

Nov 4, 2019


2019  grant award

Robust Wasserstein Profile Inference

Award Number:1915967; Principal Investigator:Jose Blanchet; Co-Principal Investigator:; Organization:Stanford University;NSF Organization:DMS Start Date:07/01/2019; Award Amount:$250,000.00; Relevance:83.36;

Robust Wasserstein Profile Inference


Zhiwu Huang - Papers With Code

paperswithcode.com › search

In generative modeling, the Wasserstein distance (WD) has emerged as a useful ... (RL) based neural architecture search (NAS) methodology for effective and ...

Papers With Code · cantabilewq · 

Apr 11, 2019

<—–2019—–—2019–——2810—


Thomas Deliot (@thomasdeliot) / Twitter

twitter.com › thomasdeliot

2022 Bisection-Based Triangulation of Catmull-Clark subdivision by Jonathan ... how to synthesize beautiful textures using only our Sliced Wasserstein Loss.

Twitter · 

Feb 15, 2019


 2019
From GAN to WGAN
Author:Weng, Lilian (Creator)
Summary:This paper explains the math behind a generative adversarial network (GAN) model and why it is hard to be trained. Wasserstein GAN is intended to improve GANs' training by adopting a smooth metric for measuring the distance between two probability distributions.
Downloadable Archival Material, 2019-04-18
Undefined
Publisher:2019-04-18



2019
Remote sensing image deblurring algorithm based on WGAN
Authors:Xia H., Liu C.
16th International Conference on Service-Oriented Computing, ICSOC 2018
Article, 2019
Publication:Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 11434 LNCS, 2019, 113
Publisher:2019

 

Mobile Application Network Behavior Detection and Evaluation with WGAN and Bi-LSTM

Authors:Wei S., Jiang P., Yuan Q., Wang J.
2018 IEEE Region 10 Conference, TENCON 2018
Article, 2019
Publication:IEEE Region 10 Annual International Conference, Proceedings/TENCON, 2018-October, 2019 02 22, 44
Publisher:2019


2019
Arterial spin labeling images synthesis via locally-constrained WGAN-GP ensemble
Authors:Huang W., Luo M., Liu X., Zhang P., Ding H., Ni D.
22nd International Conference on Medical Image Computing and Computer-Assisted Intervention, MICCAI 2019
Article, 2019
Publication:Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 11767 LNCS, 2019, 768
Publisher:2019

 Arterial Spin Labeling Images Synthesis via Locally-Constrained WGAN-GP Ensemble

0 citations* 

2019 Medical Image Computing and Computer-Assisted Intervention

Wei Huang 1, Mingyuan Luo 1, Xi Liu 1, Peng Zhang 2, Huijun Ding 3 see all 6 authors 

1 Nanchang University , 

2 Northwestern Polytechnical University , 

3 Shenzhen University

Book Chapter  Full Text Online

Arterial Spin Labeling Images Synthesis via Locally-Constrained WGAN-GP Ensemble 

by Huang, Wei; Luo, Mingyuan; Liu, Xi; More... 

Related articles


2019


Generation of network traffic using WGAN-GP and a DFT filter for resolving data imbalance
Authors:Lee W.H., Noh B.N., Kim Y.S., Jeong K.M.
12th International Conference on Internet and Distributed Computing Systems, IDCS 2019
Article, 2019
Publication:Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 11874 LNCS, 2019, 306
Publisher:2019



Wasserstein total variation filtering

Authors:Varol, Erdem (Creator), Nejatbakhsh, Amin (Creator)
Summary:In this paper, we expand upon the theory of trend filtering by introducing the use of the Wasserstein metric as a means to control the amount of spatiotemporal variation in filtered time series data. While trend filtering utilizes regularization to produce signal estimates that are piecewise linear, in the case of $\ell_1$ regularization, or temporally smooth, in the case of $\ell_2$ regularization, it ignores the topology of the spatial distribution of signal. By incorporating the information about the underlying metric space of the pixel layout, the Wasserstein metric is an attractive choice as a regularizer to uncover spatiotemporal trends in time series data. We introduce a globally optimal algorithm for efficiently estimating the filtered signal under a Wasserstein finite differences operator. The efficacy of the proposed algorithm in preserving spatiotemporal trends in time series video is demonstrated in both simulated and fluorescent microscopy videos of the nematode caenorhabditis elegans and compared against standard trend filtering algorithms.
Downloadable Archival Material, 2019-10-23
Undefined
Publisher:2019-10-23

2019
Novel Bi-directional Images Synthesis Based on WGAN-GP with GMM-Based Noise Generation
Authors:Huang W., Luo M., Liu X., Zhang P., Ding H., Ni D.
10th International Workshop on Machine Learning in Medical Imaging, MLMI 2019, held in conjunction with the 22nd International Conference on Medical Image Computing and Computer-Assisted Intervention, MICCAI 2019
Article, 2019
Publication:Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 11861 LNCS, 2019, 160
Publisher:2019


2019
Using WGAN for improving imbalanced classification performance
Authors:Bhatia S., Dahyot R.
27th AIAI Irish Conference on Artificial Intelligence and Cognitive Science, AICS 2019
Article, 2019
Publication:CEUR Workshop Proceedings, 2563, 2019, 365
Publisher:2019


2019
Bridging the Gap Between $f$-GANs and Wasserstein GANs
Authors:Song, Jiaming (Creator), Ermon, Stefano (Creator)
Summary:Generative adversarial networks (GANs) have enjoyed much success in learning high-dimensional distributions. Learning objectives approximately minimize an $f$-divergence ($f$-GANs) or an integral probability metric (Wasserstein GANs) between the model and the data distribution using a discriminator. Wasserstein GANs enjoy superior empirical performance, but in $f$-GANs the discriminator can be interpreted as a density ratio estimator, which is necessary in some GAN applications. In this paper, we bridge the gap between $f$-GANs and Wasserstein GANs (WGANs). First, we list two constraints over variational $f$-divergence estimation objectives that preserve the optimal solution. Next, we minimize over a Lagrangian relaxation of the constrained objective, and show that it generalizes the critic objectives of both $f$-GAN and WGAN. Based on this generalization, we propose a novel practical objective, named KL-Wasserstein GAN (KL-WGAN). We demonstrate empirical success of KL-WGAN on synthetic datasets and real-world image generation benchmarks, and achieve state-of-the-art FID scores on CIFAR10 image generation.
Downloadable Archival Material, 2019-10-22
Undefined
Publisher:2019-10-22

<—–2019—–—2019–——2820—



(q,p)-Wasserstein GANs: Comparing Ground Metrics for Wasserstein GANs
Authors:Mallasto, Anton (Creator), Frellsen, Jes (Creator), Boomsma, Wouter (Creator), Feragen, Aasa (Creator)
Summary:Generative Adversarial Networks (GANs) have made a major impact in computer vision and machine learning as generative models. Wasserstein GANs (WGANs) brought Optimal Transport (OT) theory into GANs, by minimizing the $1$-Wasserstein distance between model and data distributions as their objective function. Since then, WGANs have gained considerable interest due to their stability and theoretical framework. We contribute to the WGAN literature by introducing the family of $(q,p)$-Wasserstein GANs, which allow the use of more general $p$-Wasserstein metrics for $p\geq 1$ in the GAN learning procedure. While the method is able to incorporate any cost function as the ground metric, we focus on studying the $l^q$ metrics for $q\geq 1$. This is a notable generalization, as in the WGAN literature the OT distances are commonly based on the $l^2$ ground metric. We demonstrate the effect of different $p$-Wasserstein distances in two toy examples. Furthermore, we show that the ground metric does make a difference, by comparing different $(q,p)$ pairs on the MNIST and CIFAR-10 datasets. Our experiments demonstrate that changing the ground metric and $p$ can notably improve on the common $(q,p) = (2,1)$ case.
Downloadable Archival Material, 2019-02-10
Undefined
Publisher:2019-02-10
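To make the role of the ground metric concrete, here is a small hedged sketch (written for this list, not taken from the paper; the function name qp_wasserstein and all parameter values are illustrative) that computes an empirical $p$-Wasserstein distance with an $l^q$ ground cost using the POT library:

```python
# Hedged sketch: empirical p-Wasserstein distance with an l^q ground metric
# between two point clouds, using the POT (Python Optimal Transport) library.
import numpy as np
import ot  # pip install pot

def qp_wasserstein(xs, xt, q=2.0, p=1.0):
    """W_p between the empirical measures of xs and xt with ||.||_q ground cost."""
    a = np.full(len(xs), 1.0 / len(xs))        # uniform weights on source points
    b = np.full(len(xt), 1.0 / len(xt))        # uniform weights on target points
    diffs = xs[:, None, :] - xt[None, :, :]    # pairwise differences, shape (n, m, d)
    M = np.linalg.norm(diffs, ord=q, axis=2) ** p   # cost matrix ||x_i - y_j||_q^p
    return ot.emd2(a, b, M) ** (1.0 / p)       # exact OT cost, then the p-th root

xs = np.random.randn(64, 2)
xt = np.random.randn(64, 2) + 1.0
print(qp_wasserstein(xs, xt, q=1.0, p=2.0))    # compare against the common (q, p) = (2, 1)
```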


2019
Quantum Wasserstein Generative Adversarial Networks
Authors:Chakrabarti, Shouvanik (Creator), Huang, Yiming (Creator), Li, Tongyang (Creator), Feizi, Soheil (Creator), Wu, Xiaodi (Creator)
Summary:The study of quantum generative models is well-motivated, not only because of its importance in quantum machine learning and quantum chemistry but also because of the perspective of its implementation on near-term quantum machines. Inspired by previous studies on the adversarial training of classical and quantum generative models, we propose the first design of quantum Wasserstein Generative Adversarial Networks (WGANs), which has been shown to improve the robustness and the scalability of the adversarial training of quantum generative models even on noisy quantum hardware. Specifically, we propose a definition of the Wasserstein semimetric between quantum data, which inherits a few key theoretical merits of its classical counterpart. We also demonstrate how to turn the quantum Wasserstein semimetric into a concrete design of quantum WGANs that can be efficiently implemented on quantum machines. Our numerical study, via classical simulation of quantum systems, shows the more robust and scalable numerical performance of our quantum WGANs over other quantum GAN proposals. As a surprising application, our quantum WGAN has been used to generate a 3-qubit quantum circuit of ~50 gates that well approximates a 3-qubit 1-d Hamiltonian simulation circuit that requires over 10k gates using standard techniques.
Downloadable Archival Material, 2019-10-31
Undefined
Publisher:2019-10-31

Cited by 49 Related articles All 7 versions


2019
Conditional WGANs with adaptive gradient balancing for sparse MRI reconstruction
Authors:Malkiel I.Ahn S.Hardy C.J.Wolf L.Taviani V.Menini A.
Article, 2019
Publication:arXiv, 2019 05 02
Publisher:2019

2019
Study of constrained network structures for WGANs on numeric data generation
Authors:Wang W.Wang C.Li Y.Cui T.
Article, 2019
Publication:arXiv, 2019 11 05
Publisher:2019



2019
SGD learns one-layer networks in WGANs
Authors:Lei Q.Dimakis A.G.Lee J.D.Daskalakis C.
Article, 2019
Publication:arXiv, 2019 10 15
Publisher:2019

2019


[PDF] arxiv.org

How Well Do WGANs Estimate the Wasserstein Metric?

A Mallasto, G Montúfar, A Gerolin - arXiv preprint arXiv:1910.03875, 2019 - arxiv.org

Generative modelling is often cast as minimizing a similarity measure between a data distribution and a model distribution. Recently, a popular choice for the similarity measure has been the Wasserstein metric, which can be expressed in the Kantorovich duality …

Cited by 14 Related articles All 6 versions
How well do WGANs estimate the Wasserstein metric?
Authors:Mallasto A.Montufar G.Gerolin A.
Article, 2019
Publication:arXiv, 2019 10 09
Publisher:2019


使用WGAN-GP對臉部馬賽克進行眼睛補圖 = Eye In-painting Using WGAN-GP for Face Images with Mosaic

Authors: 吳承軒 (Cheng Hsuan Wu); 張賢宗 (H. T. Chang)

Thesis, Dissertation, 2019 [Minguo 108]

Chinese, first edition

Publisher: 長庚大學 (Chang Gung University), Taoyuan, 2019 [Minguo 108]



2019

Lenz Belzner (@LenzBelzner) / Twitter

mobile.twitter.com › lenzbelzner

My internship work at Google Brain is out on Arxiv: https://arxiv.org/abs/1903.11780 . ... We propose an alternative based on the Wasserstein distance.

Twitter · 

Mar 4, 2019


2019

From shallow to deep learning for inverse imaging problems

www.youtube.com › watch

YouTube · The Alan Turing Institute · 

Jun 12, 2019


  2019

Tweets with replies by Balaji Lakshminarayanan ... - Twitter

mobile.twitter.com › balajiln › with_replies ... The Cramer Distance as a Solution to Biased Wasserstein Gradients.

Twitter · 

Dec 7, 2019

<—–2019—–—2019–——2830—



APPROXIMATION OF STABLE LAW IN WASSERSTEIN-1 DISTANCE BY STEIN’S METHOD
JOURNAL ARTICLE
APPROXIMATION OF STABLE LAW IN WASSERSTEIN-1 DISTANCE BY STEIN’S METHOD
Lihu Xu
The Annals of Applied Probability, Vol. 29, No. 1 (February 2019), pp. 458-504
...the Wasserstein-1 distance of $\mathcal{L}(S_n)$ and $\mu$ essentially by an $L^1$ discrepancy between two kernels. More precisely, the paper proves an inequality bounding $d_W(\mathcal{L}(S_n),\mu)$ by a constant $C$ times a sum over $i=1,\dots,n$ of integrals over $[-N,N]$ of the discrepancy between the kernels $K_\alpha(t,N)$ and $K_i(t,N)$, plus a remainder term $R_{N,n}$, where $d_W$ is the Wasserstein-1...



Peer-reviewed
Approximation of stable law in Wasserstein-1 distance by Stein’s method

Author:Xu L.
Article, 2019
Publication:Annals of Applied Probability, 29, 2019 02 01, 458
Publisher:2019


Learning and inference with Wasserstein metrics
Authors:Tomaso Poggio (Contributor), Massachusetts Institute of Technology Department of Brain and Cognitive Sciences (Contributor), Frogner, Charles (Charles Albert) (Creator)
Summary:Thesis: Ph. D., Massachusetts Institute of Technology, Department of Brain and Cognitive Sciences, 2018
Downloadable Archival Material, 2019-03-01T19:52:20Z
English
Publisher:Massachusetts Institute of Technology, 2019-03-01T19:52:20Z



2019 eBook

Nonlinear diffusion equations and curvature conditions in metric measure spaces
Authors:Luigi Ambrosio (Author), Andrea Mondino (Author), Giuseppe Savaré (Author)
Abstract:Aim of this paper is to provide new characterizations of the curvature dimension condition in the context of metric measure spaces (X, d, m). On the geometric side, our new approach takes into account suitable weighted action functionals which provide the natural modulus of K-convexity when one investigates the convexity properties of N-dimensional entropies. On the side of diffusion semigroups and evolution variational inequalities, our new approach uses the nonlinear diffusion semigroup induced by the N-dimensional entropy, in place of the heat flow. Under suitable assumptions (most notably the quadraticity of Cheeger's energy relative to the metric measure structure) both approaches are shown to be equivalent to the strong CD*(K, N) condition of Bacher–Sturm.
eBook, 2019
English
Publisher:American Mathematical Society, Providence, 2019
Also available asPrint Book
View AllFormats & Editions

Reproducing-kernel Hilbert space regression with notes on the Wasserstein distance
Authors:Stephen Page (Author), University of Lancaster (Degree granting institution)

Thesis, Dissertation, 2019
English

Publisher:Lancaster University, [Great Britain], 2019

Peer-reviewed
Multivariate approximations in Wasserstein distance by Stein’s method and Bismut’s formula
Authors:Xiao Fang; Qi-Man Shao; Lihu Xu
Summary:Stein’s method has been widely used for probability approximations. However, in the multi-dimensional setting, most of the results are for multivariate normal approximation or for test functions with bounded second- or higher-order derivatives. For a class of multivariate limiting distributions, we use Bismut’s formula in Malliavin calculus to control the derivatives of the Stein equation solutions by the first derivative of the test function. Combined with Stein’s exchangeable pair approach, we obtain a general theorem for multivariate approximations with near optimal error bounds on the Wasserstein distance. We apply the theorem to the unadjusted Langevin algorithm.
Article, 2019
Publication:Probability Theory and Related Fields, 174, 20190801, 945
Publisher:2019

2019


Poincaré Wasserstein Autoencoder
Author:Ovinnikov, Ivan (Creator)
Summary:This work presents a reformulation of the recently proposed Wasserstein autoencoder framework on a non-Euclidean manifold, the Poincaré ball model of the hyperbolic space. By assuming the latent space to be hyperbolic, we can use its intrinsic hierarchy to impose structure on the learned latent space representations. We demonstrate the model in the visual domain to analyze some of its properties and show competitive results on a graph link prediction task.
Downloadable Archival Material, 2019-01-05
Undefined
Publisher:2019-01-05

 
Inequalities of the Wasserstein mean with other matrix means
Authors:Sejong Kim; Hosoo Lee
Summary:Recently, a new Riemannian metric and a least squares mean of positive definite matrices have been introduced. They are called the Bures–Wasserstein metric and Wasserstein mean, which are different from the Riemannian trace metric and Karcher mean. In this paper we find relationships of the Wasserstein mean with other matrix means such as the power means, harmonic mean, and Karcher mean.
Article, 2019
Publication:Annals of Functional Analysis, 11, 20191201, 194
Publisher:2019


Peer-reviewed
Inequalities for the Wasserstein mean of positive definite matrices
Authors:Rajendra Bhatia; Tanvi Jain; Yongdo Lim
Summary:We prove majorization inequalities for different means of positive definite matrices. These include the Cartan mean (the Karcher mean), the log Euclidean mean, the Wasserstein mean and the power mean.
Article
Publication:Linear Algebra and Its Applications, 576, 2019-09-01, 108
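As standard background for the two entries above (not a statement taken from either paper): on positive definite matrices the Bures–Wasserstein distance and the associated Wasserstein mean of $A_1,\dots,A_n$ with weights $w_i$ are

$$ d_{BW}(A,B) = \Big(\operatorname{tr} A + \operatorname{tr} B - 2\operatorname{tr}\big(A^{1/2} B A^{1/2}\big)^{1/2}\Big)^{1/2}, \qquad \Omega(w;A_1,\dots,A_n) = \operatorname*{arg\,min}_{X \succ 0} \sum_{i=1}^{n} w_i\, d_{BW}^2(X, A_i), $$

which coincide with the 2-Wasserstein distance and barycenter of the centered Gaussian distributions having these matrices as covariances.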


Peer-reviewed
Randomized filtering and Bellman equation in Wasserstein space for partial observation control problem
Authors:Elena Bandini; Andrea Cosso; Marco Fuhrman; Huyên Pham
Summary:We study a stochastic optimal control problem for a partially observed diffusion. By using the control randomization method in Bandini et al. (2018), we prove a corresponding randomized dynamic programming principle (DPP) for the value function, which is obtained from a flow property of an associated filter process. This DPP is the key step towards our main result: a characterization of the value function of the partial observation control problem as the unique viscosity solution to the corresponding dynamic programming Hamilton-Jacobi-Bellman (HJB) equation. The latter is formulated as a new, fully nonlinear partial differential equation on the Wasserstein space of probability measures. An important feature of our approach is that it does not require any non-degeneracy condition on the diffusion coefficient, and no condition is imposed to guarantee existence of a density for the filter process solution to the controlled Zakai equation. Finally, we give an explicit solution to our HJB equation in the case of a partially observed non-Gaussian linear-quadratic model.
Article
Publication:Stochastic Processes and their Applications, 129, February 2019, 674

<—–2019—–—2019–——2840—



 Peer-reviewed
Deep multi-Wasserstein unsupervised domain adaptation
Authors:Tien-Nam Le; Amaury Habrard; Marc Sebban
Summary:• We address the problem of negative transfer in unsupervised domain adaptation by: • Minimizing the source true risk and the divergence between the domains. • While controlling the combined error of the ideal joint hypothesis. • We employ highly-confident target pseudo-labels and multiple Wasserstein distances. • Experimental results show that our model outperforms state of the art.

In unsupervised domain adaptation (DA), one aims at learning from labeled source data and fully unlabeled target examples a model with a low error on the target domain. In this setting, standard generalization bounds prompt us to minimize the sum of three terms: (a) the source true risk, (b) the divergence between the source and target domains, and (c) the combined error of the ideal joint hypothesis over the two domains. Many DA methods - especially those using deep neural networks - have focused on the first two terms by using different divergence measures to align the source and target distributions on a shared latent feature space, while ignoring the third term, assuming it is negligible to perform the adaptation. However, it has been shown that purely aligning the two distributions while minimizing the source error may lead to so-called negative transfer. In this paper, we address this issue with a new deep unsupervised DA method - called MCDA - minimizing the first two terms while controlling the third one. MCDA benefits from highly-confident target samples (using softmax predictions) to minimize class-wise Wasserstein distances and efficiently approximate the ideal joint hypothesis. Empirical results show that our approach outperforms state of the art methods.
Article, 2019
Publication:Pattern Recognition Letters, 125, 20190701, 249
Publisher:2019

Peer-reviewed
Hybrid Wasserstein distance and fast distribution clustering
Authors:Isabella Verdinelli; Larry Wasserman
Summary:We define a modified Wasserstein distance for distribution clustering which inherits many of the properties of the Wasserstein distance but which can be estimated easily and computed quickly. The modified distance is the sum of two terms. The first term — which has a closed form — measures the location-scale differences between the distributions. The second term is an approximation that measures the remaining distance after accounting for location-scale differences. We consider several forms of approximation with our main emphasis being a tangent space approximation that can be estimated using nonparametric regression and leads to fast and easy computation of barycenters which otherwise would be very difficult to compute. We evaluate the strengths and weaknesses of this approach on simulated and real examples.
Downloadable Article
Publication:Electron. J. Statist. 13, no. 2 (2019), 5088–5119; https://projecteuclid.org/euclid.ejs/1576119710

Peer-reviewed
Artifact correction in low-dose dental CT imaging using Wasserstein generative adversarial networks
Authors:Zhanli Hu; Changhui Jiang; Fengyi Sun; Qiyang Zhang; Yongshuai Ge; Yongfeng Yang; Xin Liu; Hairong Zheng; Dong Liang
Summary:Purpose: In recent years, health risks concerning high-dose x-ray radiation have become a major concern in dental computed tomography (CT) examinations. Therefore, adopting low-dose computed tomography (LDCT) technology has become a major focus in the CT imaging field. One of these LDCT technologies is downsampling data acquisition during low-dose x-ray imaging processes. However, reducing the radiation dose can adversely affect CT image quality by introducing noise and artifacts in the resultant image that can compromise diagnostic information. In this paper, we propose an artifact correction method for downsampling CT reconstruction based on deep learning. Method: We used clinical dental CT data with low-dose artifacts reconstructed by conventional filtered back projection (FBP) as inputs to a deep neural network and corresponding high-quality labeled normal-dose CT data during training. We trained a generative adversarial network (GAN) with Wasserstein distance (WGAN) and mean squared error (MSE) loss, called m-WGAN, to remove artifacts and obtain high-quality CT dental images in a clinical dental CT examination environment. Results: The experimental results confirmed that the proposed algorithm effectively removes low-dose artifacts from dental CT scans. In addition, we showed that the proposed method is efficient for removing noise from low-dose CT scan images compared to existing approaches. We compared the performances of the general GAN, convolutional neural networks, and m-WGAN. Through quantitative and qualitative analysis of the results, we concluded that the proposed m-WGAN method resulted in better artifact correction performance preserving the texture in dental CT scanning. Conclusions: The image quality evaluation metrics indicated that the proposed method effectively improves image quality when used as a postprocessing technique for dental CT images. To the best of our knowledge, this work is the first deep learning architecture used with a commercial cone-beam dental CT scanner. The artifact correction performance was rigorously evaluated and demonstrated to be effective. Therefore, we believe that the proposed algorithm represents a new direction in the research area of low-dose dental CT artifact correction.
Article, 2019
Publication:Medical Physics, 46, April 2019, 1686
Publisher:2019

Peer-reviewed
Wasserstein barycenters in the manifold of all positive definite matrices
Authors:Elham Nobari; Bijan Ahmadi Kakavandi
Summary:In this paper, we study the Wasserstein barycenter of finitely many Borel probability measures on $\mathbb{P}_{n}$, the Riemannian manifold of all $n\times n$ real positive definite matrices, as well as its associated dual problem, namely the optimal transport problem. Our results generalize some results of Agueh and Carlier on $\mathbb{R}^{n}$ to $\mathbb{P}_{n}$. We show the existence of the optimal solutions and the Wasserstein barycenter measure. Furthermore, via a discretization approach and using the BFGS (Broyden-Fletcher-Goldfarb-Shanno) method for nonsmooth convex optimization, we propose a numerical method for computing the potential functions of the optimal transport problem. Also, thanks to the so-called optimal transport Jacobian on Riemannian manifolds of Cordero-Erausquin, McCann, and Schmuckenschläger, we show that the density of the Wasserstein barycenter measure can be approximated numerically. The paper concludes with some numerical experiments.
Downloadable Article, 2019
Publication:Quarterly of Applied Mathematics, 77, July 1, 2019, 655
Publisher:2019

 
On a Wasserstein-type distance between solutions to stochastic differential equations
Authors:Jocelyne Bion–Nadal; Denis Talay
Summary:In this paper, we introduce a Wasserstein-type distance on the set of the probability distributions of strong solutions to stochastic differential equations. This new distance is defined by restricting the set of possible coupling measures. We prove that it may also be defined by means of the value function of a stochastic control problem whose Hamilton–Jacobi–Bellman equation has a smooth solution, which allows one to deduce a priori estimates or to obtain numerical evaluations. We exhibit an optimal coupling measure and characterize it as a weak solution to an explicit stochastic differential equation, and we finally describe procedures to approximate this optimal coupling measure. A notable application concerns the following modeling issue: given an exact diffusion model, how to select a simplified diffusion model within a class of admissible models under the constraint that the probability distribution of the exact model is preserved as much as possible?
Downloadable Article
Publication:Ann. Appl. Probab., 29, 2019-06, 1609; https://projecteuclid.org/euclid.aoap/1550566838

MR3914552  Bion-Nadal, Jocelyne; Talay, Denis On a Wasserstein-type distance between solutions to stochastic differential equations. Ann. Appl. Probab. 29 (2019), no. 3, 1609–1639. (Reviewer: Marco Fuhrman) 60J60 (28A33 93E20)

  On a Wasserstein-type distance between solutions to stochastic differential equations

J Bion–Nadal, D Talay - The Annals of Applied Probability, 2019 - projecteuclid.org

In this paper, we introduce a Wasserstein-type distance on the set of the probability 

distributions of strong solutions to stochastic differential equations. This new distance is 

defined by restricting the set of possible coupling measures. We prove that it may also be …

Cited by 18 Related articles All 10 versions

2019



2019 see 2020
A variational finite volume scheme for Wasserstein gradient flows
Authors:Cancès, Clément (Creator), Gallouët, Thomas O. (Creator), Todeschi, Gabriele (Creator)
Summary:We propose a variational finite volume scheme to approximate the solutions to Wasserstein gradient flows. The time discretization is based on an implicit linearization of the Wasserstein distance expressed thanks to the Benamou-Brenier formula, whereas space discretization relies on upstream mobility two-point flux approximation finite volumes. Our scheme is based on a first discretize then optimize approach in order to preserve the variational structure of the continuous model at the discrete level. Our scheme can be applied to a wide range of energies, guarantees non-negativity of the discrete solutions as well as decay of the energy. We show that our scheme admits a unique solution whatever the convex energy involved in the continuous problem, and we prove its convergence in the case of the linear Fokker-Planck equation with positive initial density. Numerical illustrations show that it is first order accurate in both time and space, and robust with respect to both the energy and the initial profile.
Downloadable Archival Material, 2019-07-18
Undefined
Publisher:2019-07-18

Adaptive Wasserstein Hourglass for Weakly Supervised Hand Pose Estimation from Monocular RGB
Authors:Zhang, Yumeng (Creator), Chen, Li (Creator), Liu, Yufeng (Creator), Yong, Junhai (Creator), Zheng, Wen (Creator)
Summary:Insufficient labeled training datasets is one of the bottlenecks of 3D hand pose estimation from monocular RGB images. Synthetic datasets have a large number of images with precise annotations, but the obvious difference with real-world datasets impacts the generalization. Little work has been done to bridge the gap between two domains over their wide difference. In this paper, we propose a domain adaptation method called Adaptive Wasserstein Hourglass (AW Hourglass) for weakly-supervised 3D hand pose estimation, which aims to distinguish the difference and explore the common characteristics (e.g. hand structure) of synthetic and real-world datasets. Learning the common characteristics helps the network focus on pose-related information. The similarity of the characteristics makes it easier to enforce domain-invariant constraints. During training, based on the relation between these common characteristics and 3D pose learned from fully-annotated synthetic datasets, it is beneficial for the network to restore the 3D pose of weakly labeled real-world datasets with the aid of 2D annotations and depth images. While in testing, the network predicts the 3D pose with the input of RGB.


2019

Wasserstein regularization for sparse multi-task regression

http://proceedings.mlr.press › ...


by H Janati · 2019 · Cited by 35 — Wasserstein regularization for sparse multi-task regression. Hicham Janati. Marco Cuturi ... ACM. ISBN 978-1-60558-516-1. doi: 10.1145/1553374.1553431.


2019

[PDF] openreview.net

Sliced Wasserstein auto-encoders

S Kolouri, PE Pope, CE Martin… - … Conference on Learning …, 2019 - openreview.net

… In short, we regularize the auto-encoder loss with the sliced-Wasserstein distance between

… similar capabilities to Wasserstein Auto-Encoders (WAE) and Variational Auto-Encoders (…

 Cited by 116 Related articles All 2 versions 
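Because sliced-Wasserstein regularizers recur in several entries of this list (the auto-encoder above and some of the patents further down), here is a hedged, self-contained sketch of the sample-based estimator; the function name sliced_wasserstein and the parameter choices are illustrative, not the authors' code:

```python
# Hedged sketch: Monte-Carlo estimate of the sliced p-Wasserstein distance between
# two empirical distributions given as (n, d) sample arrays of equal size.
import numpy as np

def sliced_wasserstein(x, y, n_projections=50, p=2, seed=None):
    rng = np.random.default_rng(seed)
    d = x.shape[1]
    theta = rng.normal(size=(n_projections, d))            # random directions
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)  # normalize to the unit sphere
    # project both samples on every direction; sorting gives the 1-D optimal coupling
    x_proj = np.sort(x @ theta.T, axis=0)
    y_proj = np.sort(y @ theta.T, axis=0)
    return float(np.mean(np.abs(x_proj - y_proj) ** p) ** (1.0 / p))

x = np.random.randn(256, 8)   # e.g. encoded training samples
y = np.random.randn(256, 8)   # e.g. samples from the prior
print(sliced_wasserstein(x, y))
```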


2019 see 2020

https://jcupt.bupt.edu.cn › j.cnki.1005-8885.2020.0004

 Remaining useful life prediction of lithium-ion batteries using a method based on Wasserstein GAN

This method achieves a more reliable and accurate RUL prediction of lithium-ion batteries by combining the artificial neural network (ANN) model which takes the ...

<—–2019—–—2019–——2850—


Dialogue response generation with Wasserstein generative adversarial networks
Authors: Gilani S.A.S.; Jembere E.; Pillay A.W. 2019 South African Forum for Artificial Intelligence Research, FAIR 2019
Article, 2019
Publication:CEUR Workshop Proceedings, 2540, 2019
Publisher:2019



2019 see 2020
A Distributionally Robust Optimization Approach for Multivariate Linear Regression under the Wasserstein Metric
Authors:Ruidi Chen; Ioannis Ch. Paschalidis. 2019 IEEE 58th Conference on Decision and Control (CDC)

Summary:We present a Distributionally Robust Optimization (DRO) approach for Multivariate Linear Regression (MLR), where multiple correlated response variables are to be regressed against a common set of predictors. We develop a regularized MLR formulation that is robust to large perturbations in the data, where the regularizer is the dual norm of the regression coefficient matrix in the sense of a newly defined matrix norm. We establish bounds on the prediction bias of the solution, offering insights on the role of the regularizer in controlling the prediction error. Experimental results show that, compared to a number of popular MLR methods, our approach leads to a lower out-of-sample Mean Squared Error (MSE) in various scenarios
Chapter, 2019
Publication:2019 IEEE 58th Conference on Decision and Control (CDC), 201912, 3655
Publisher:2019
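For orientation, and only as the generic template rather than the paper's exact formulation, a Wasserstein distributionally robust regression problem reads

$$ \min_{\theta}\; \sup_{Q:\, W(Q, \widehat{P}_n) \le \rho}\; \mathbb{E}_{\xi \sim Q}\big[\ell(\theta;\xi)\big], $$

where $\widehat{P}_n$ is the empirical distribution of the data, $W$ a Wasserstein distance, and $\rho$ the radius of the ambiguity set; for many loss functions this worst-case problem can be reformulated as a regularized empirical risk minimization, which is how the regularizer discussed in the summary above arises.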

Curvature of the Manifold of Fixed-Rank Positive-Semidefinite Matrices Endowed with the Bures–Wasserstein Metric
Authors:UCL - SST/ICTM/INMA - Pôle en ingénierie mathématique (Contributor), Massart, Estelle (Creator), Hendrickx, Julien (Creator), Absil, Pierre-Antoine (Creator), 4th International Conference, GSI 2019 (Creator)
Summary:We consider the manifold of rank-p positive-semidefinite matrices of size n, seen as a quotient of the set of full-rank n-by-p matrices by the orthogonal group in dimension p. The resulting distance coincides with the Wasserstein distance between centered degenerate Gaussian distributions. We obtain expressions for the Riemannian curvature tensor and the sectional curvature of the manifold. We also provide tangent vectors spanning planes associated with the extreme values of the sectional curvature
Downloadable Archival Material, 2019
English
Publisher:Frank Nielsen, Frédéric Barbaresco Eds, 2019


2019 thesis
結合Wasserstein Distance於對抗領域適應之研究 = A Generative Adversarial Network in Domain Adaptation by Utilizing the Wasserstein Distance
Authors: 許維哲 (Weizhe Xu); 劉立頌 (Lisong Liu)
Thesis, Dissertation, Minguo 108 [2019]
Chinese
Publisher: 許維哲, Chiayi, Minguo 108 [2019]


Sparsemax and Relaxed Wasserstein for Topic Sparsity
Authors:Tianyi Lin (Author), Zhiyue Hu (Author), Xin Guo (Author)

Summary:Topic sparsity refers to the observation that individual documents usually focus on several salient topics instead of covering a wide variety of topics, and a real topic adopts a narrow range of terms instead of a wide coverage of the vocabulary. Understanding this topic sparsity is especially important for analyzing user-generated web content and social media, which are featured in the form of extremely short posts and discussions. As topic sparsity of individual documents in online social media increases, so does the difficulty of analyzing the online text sources using traditional methods. In this paper, we propose two novel neural models by providing sparse posterior distributions over topics based on the Gaussian sparsemax construction, enabling efficient training by stochastic backpropagation. We construct an inference network conditioned on the input data and infer the variational distribution with the relaxed Wasserstein (RW) divergence. Unlike existing works based on Gaussian softmax construction and Kullback-Leibler (KL) divergence, our approaches can identify latent topic sparsity with training stability, predictive performance, and topic coherence. Experiments on different genres of large text corpora have demonstrated the effectiveness of our models as they outperform both probabilistic and neural methods
Chapter, 2019
Publication:Proceedings of the Twelfth ACM International Conference on Web Search and Data Mining, 20190130, 141
Publisher:2019


2019


Relaxed Wasserstein, Generative Adversarial Networks, Variational Autoencoders and their applications
Authors:Yang, Nan (Creator); Guo, Xin (Contributor)
Summary:Statistical divergences play an important role in many data-driven applications. Two notable examples are Distributionally Robust Optimization (DRO) problems and Generative Adversarial Networks (GANs). In the first section of my dissertation, we propose a novel class of statistical divergence called Relaxed Wasserstein (RW) divergence, which combines Wasserstein distance and Bregman divergence. We begin with its strong probabilistic properties, and then to illustrate its uses, we introduce Relaxed Wasserstein GANs (RWGANs) and compare it empirically with several state-of-the-art GANs in image generation. We show that it strikes a balance between training speed and image quality. We also discuss the potential use of Relaxed Wasserstein to construct ambiguity sets in DRO problems. In the second section of my dissertation, we show the application of another type of generative neural network, the Variational AutoEncoder (VAE), to metagenomic binning problems in bioinformatics. Shotgun sequencing is used to produce short reads from DNA sequences in a sample from a microbial community, which could contain thousands of species of discovered or unknown microbes. The short reads are then assembled by connecting overlapping subsequences and thus forming longer sequences called contigs. Metagenomic binning is the process of grouping contigs from multiple organisms based on their genomes of origin. We propose a new network structure called MetaAE, which combines compositional and reference-based information in a nonlinear way. We show that this binning algorithm improves the performance of state-of-the-art binners by 20% on two independent synthetic datasets.
Downloadable Archival Material, 2019-01-01
English
Publisher:eScholarship, University of California, 2019-01-01

2019 book
Predictive density estimation under the Wasserstein loss
Print Book, 2019.4
English
Publisher:Department of Mathematical Informatics, Graduate School of Information Science and Technology, the University of Tokyo, Tokyo, 2019.4

2019 thesis
Distribuciones de máxima entropía en bolas de Wasserstein = Maximum entropy distributions in Wasserstein balls
Authors:Luis Felipe Vargas Beltrán; Mauricio Fernando Velasco Gregory; Adolfo José Quiroz Salazar; Fabrice Gamboa
Summary:"We present a method for finding the maximum entropy distribution in the Wasserstein ball of a given radius t centered at the empirical distribution of n points. This distribution is the most general one (it minimizes the amount of prior information) at distance t from the empirical distribution, hence its importance in statistical inference. The method depends on a new cutting-plane algorithm and is generalized to other kinds of functions, among them subadditive Euclidean functionals. We also give a new generalization of Fortune's algorithm for generating the additively weighted Voronoi diagram, which allows faster optimization over Wasserstein balls." -- Translated from the thesis abstract
Thesis, Dissertation, 2019
Spanish
Publisher:Uniandes, Bogotá, 2019


Wasserstein clustering based video anomaly detection for traffic surveillance
Authors: Arivazhagan S.; Mary Rosaline M.; Sylvia Lilly Jebarani W.
Article, 2019
Publication:International Journal of Engineering and Advanced Technology, 9, 2019 10 01, 6438
Publisher:2019


Behavior of the empirical Wasserstein distance in R^d under moment conditions
Authors:Dedecker J.Merlevede F.
Article, 2019
Publication:Electronic Journal of Probability, 24, 2019
Publisher:2019

<—–2019—–—2019–——2860—



Scene Classification by Coupling Convolutional Neural Networks with Wasserstein Distance
Authors:Liu Y.Ding L.
Article, 2019
Publication:IEEE Geoscience and Remote Sensing Letters, 16, 2019 05 01, 722
Publisher:2019

Barycenters of Natural Images - Constrained Wasserstein Barycenters for Image Morphing
Authors:Simon D.Aberdam A.
Article, 2019
Publication:arXiv, 2019 12 24



Bridging Bayesian and minimax mean square error estimation via Wasserstein distributionally robust optimization
Authors:Nguyen V.A.Shafieezadeh-Abadeh S.Kuhn D.Esfahani P.M.
Article, 2019
Publication:arXiv, 2019 11 08
Publisher:2019


Rate of convergence in Wasserstein distance of piecewise-linear Lévy-driven SDEs
Authors:Arapostathis A.Pang G.Sandric N.
Article, 2019
Publication:arXiv, 2019 07 10
Publisher:2019

2019 SEE 2020
Stein’s method for normal approximation in Wasserstein distances with application to the multivariate Central Limit Theorem
Author:Bonis T.
Article, 2019
Publication:arXiv, 2019 05 31
Publisher:2019


2019


Training Wasserstein GANs for Estimating Depth Maps

AT Arslan, E Seke - 2019 3rd International Symposium on …, 2019 - ieeexplore.ieee.org

Depth maps depict pixel-wise depth association with a 2D digital image. Point clouds generation and 3D surface reconstruction can be conducted by processing a depth map. Estimating a corresponding depth map from a given input image is an important and difficult …

Training Wasserstein GANs for Estimating Depth Maps


Estimation of smooth densities in Wasserstein distance
Authors:Weed J.Berthet Q.
Article, 2019
Publication:arXiv, 2019 02 05
Publisher:2019


Tractable Reformulations of Distributionally Robust Two-stage Stochastic Programs with ∞−Wasserstein Distance
Author:Xie W.
Article, 2019
Publication:arXiv, 2019 08 22
Publisher:2019


2019 see 2020
Bridging the Gap Between f-GANs and Wasserstein GANs
Authors:Song J.Ermon S.
Article, 2019
Publication:arXiv, 2019 10 22
Publisher:2019


Donsker’s theorem in Wasserstein-1 distance
Authors:Coutin L.Decreusefond L.
Article, 2019
Publication:arXiv, 2019 04 15
Publisher:2019

<—–2019—–—2019–——2870—



Distributed Computation of Wasserstein Barycenters over Networks
Authors:Uribe C.A.Dvinskikh D.Dvurechensky P.Gasnikov A.Nedic A.57th IEEE Conference on Decision and Control, CDC 2018
Article, 2019
Publication:Proceedings of the IEEE Conference on Decision and Control, 2018-December, 2019 01 18, 6544
Publisher:2019


Wasserstein Contraction of Stochastic Nonlinear Systems
Authors:Bouvrie J.Slotine J.-J.
Article, 2019
Publication:arXiv, 2019 02 22
Publisher:2019

On the Wasserstein distance between classical sequences and the Lebesgue measure
Authors:Brown L.Steinerberger S.
Article, 2019
Publication:arXiv, 2019 09 19
Publisher:2019

2019 see 2020
A convergent Lagrangian discretization for p-Wasserstein and flux-limited diffusion equations
Authors:Sollner B.Junge O.
Article, 2019
Publication:arXiv, 2019 06 04
Publisher:2019



Necessary condition for rectifiability involving Wasserstein distance W_2
Author:Dabrowski D.
Article, 2019
Publication:arXiv, 2019 04 24
Publisher:2019

 
2019


Wasserstein Proximal Algorithms for the Schrödinger Bridge Problem: Density Control with Nonlinear Drift
Authors:Caluya K.F.Halder A.
Article, 2019
Publication:arXiv, 2019 12 03
Publisher:2019
 
Wasserstein stability estimates for covariance-preconditioned Fokker-Planck equations
Authors:Carrillo J.A.Vaes U.
Article, 2019
Publication:arXiv, 2019 10 16
Publisher:2019



PARISI’S FORMULA IS A HAMILTON-JACOBI EQUATION IN WASSERSTEIN SPACE
Author:Mourrat J.-C.
Article, 2019
Publication:arXiv, 2019 06 20
Publisher:2019


(q,p)-Wasserstein GANs: Comparing Ground Metrics for Wasserstein GANs
Authors:Mallasto A.Boomsma W.Feragen A.Frellsen J.
Article, 2019
Publication:arXiv, 2019 02 10
Publisher:2019


Wasserstein F-tests and confidence bands for the Fréchet regression of density response curves
Authors:Petersen A.; Liu X.; Divani A.A.
Article, 2019
Publication:arXiv, 2019 10 29
Publisher:2019

<—–2019—–—2019–——2880—



Unsupervised adversarial domain adaptation based on the Wasserstein distance for acoustic scene classification
Authors:Drossos K.Magron P.Virtanen T.
Article, 2019
Publication:arXiv, 2019 04 24
Publisher:2019

Sufficient Condition for Rectifiability Involving Wasserstein Distance W_2
Author:Dabrowski D.
Article, 2019
Publication:arXiv, 2019 04 24
Publisher:2019


PWGAN: Wasserstein GANs with perceptual loss for mode collapse
Authors:Xianyu Wu (Author), Canghong Shi (Author), Xiaojie Li (Author), Jia He (Author), Xi Wu (Author), Jiancheng Lv (Author), Jiliu Zhou (Author)
Summary:Generative adversarial network (GAN) plays an important part in image generation. It has great achievements trained on large scene data sets. However, for small scene data sets, we find that most of methods may lead to a mode collapse, which may repeatedly generate the same image with bad quality. To solve the problem, a novel Wasserstein Generative Adversarial Networks with perceptual loss function (PWGAN) is proposed in this paper. The proposed approach could be better to reflect the characteristics of the ground truth and the generated samples, and combining with the training adversarial loss, PWGAN can produce a perceptual realistic image. There are two benefits of PWGAN over state-of-the-art approaches on small scene data sets. First, PWGAN ensures the diversity of the generated samples, and basically solve mode collapse problem under the small scene data sets. Second, PWGAN enables the generator network quickly converge and improve training stability. Experimental results show that the images generated by PWGAN have achieved better quality in visual effect and stability than state-of-the-art approaches
Chapter, 2019
Publication:Proceedings of the ACM Turing Celebration Conference - China, 20190517, 1
Publisher:2019


[PDF] arxiv.org

Random matrix-improved estimation of the Wasserstein distance between two centered Gaussian distributions

M Tiomoko, R Couillet - 2019 27th European Signal Processing …, 2019 - ieeexplore.ieee.org

… However, computing the Wasserstein distance is expensive … are zero-mean Gaussian with

covariance matrices C1 and C2. … for the Wasserstein distance between two centered Gaussian …

Cited by 3 Related articles All 22 versions
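The quantity discussed above has a closed form for centered Gaussians, namely $W_2^2 = \operatorname{tr} C_1 + \operatorname{tr} C_2 - 2\operatorname{tr}(C_1^{1/2} C_2 C_1^{1/2})^{1/2}$. The following hedged sketch shows only the naive plug-in baseline that inserts sample covariances into this formula (the paper above studies improved, random-matrix-based estimators); the function name plugin_gaussian_w2 is illustrative:

```python
# Hedged sketch: naive plug-in estimate of the 2-Wasserstein distance between
# two centered Gaussians, from samples, via the closed-form covariance expression.
import numpy as np
from scipy.linalg import sqrtm

def plugin_gaussian_w2(X1, X2):
    """X1, X2: (n_samples, dim) arrays of (assumed centered) Gaussian samples."""
    C1 = X1.T @ X1 / X1.shape[0]               # sample covariance of the first set
    C2 = X2.T @ X2 / X2.shape[0]               # sample covariance of the second set
    s1 = sqrtm(C1)
    cross = sqrtm(s1 @ C2 @ s1)                # (C1^{1/2} C2 C1^{1/2})^{1/2}
    w2_sq = np.trace(C1) + np.trace(C2) - 2.0 * np.trace(cross)
    return float(np.sqrt(max(np.real(w2_sq), 0.0)))   # clip tiny negative round-off

X1 = np.random.randn(500, 5)
X2 = np.random.randn(500, 5) @ np.diag([1.0, 1.0, 2.0, 2.0, 3.0])
print(plugin_gaussian_w2(X1, X2))
```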


Math 707: Optimal Transport - The Wasserstein Metric

October 9, 2019. This is a lecture on "The Wasserstein Metric" given as a part of Brittany ...

YouTube · Brittany Hamfeldt · 

Dec 13, 2019


2019



2019 [PDF] arxiv.org

Optimal fusion of elliptic extended target estimates based on the Wasserstein distance

K Thormann, M Baum - 2019 22nd International Conference on …, 2019 - ieeexplore.ieee.org

… Wasserstein distance, as a cost function. We derive an explicit approximate expression for

the Minimum Mean Gaussian Wasserstein … for the fusion of extended target estimates. The …

Cited by 6 Related articles All 5 versions


2019

Wasserstein generative learning with kinematic constraints for probabilistic interactive driving behavior prediction

H Ma, J Li, W Zhan, M Tomizuka - 2019 IEEE Intelligent …, 2019 - ieeexplore.ieee.org

… inference method based on Wasserstein auto-encoder, which … • We incorporate the kinematic

model into the deep generative … In this paper, we propose a variant of Wasserstein auto-…

 Cited by 28 Related articles All 2 versions


2019. [PDF] arxiv.org

Weak optimal total variation transport problems and generalized Wasserstein barycenters

NP Chung, TS Trinh - arXiv preprint arXiv:1909.05517, 2019 - arxiv.org

In this paper, we establish a Kantorovich duality for weak optimal total variation transport 

As consequences, we recover a version of duality formula for partial optimal …

Related articles All 2 versions


2019 see 2017

[CITATION] Read-through: Wasserstein GAN

S Insightful - 2019 - June

Cited by 2 Related articles


2019 patent

Depth self-coding embedded clustering method based on Sliced-Wasserstein …

CN CN111178427A 郭春生 杭州电子科技大学

Priority 2019-12-27 • Filed 2019-12-27 • Published 2020-05-19

The invention discloses a depth self-coding embedded clustering method based on Sliced-Wasserstein distance, which comprises the following steps: s11, constructing a self-coding network module based on a Sliced-Wasserstein distance; s12, constructing a clustering module; s13, combining the built …

<—–2019—–—2019–——2890—



2019 patent

System and method for unsupervised domain adaptation via sliced-wasserstein …

US US20200125982A1 Alexander J. Gabourie Hrl Laboratories, Llc

Priority 2018-02-06 • Filed 2019-12-18 • Published 2020-04-23

The computer program product as set forth in claim 11 , wherein the one or more processors further perform an operation of using sliced-Wasserstein (SW) distance as a dissimilarity measure for determining dissimilarity between the first input data distribution and the second input data distribution.


2019 patent

System and method for unsupervised domain adaptation via sliced-wasserstein …

WO EP CN WO2020159638A1 Alexander J. GABOURIE Hrl Laboratories, Llc

Priority 2019-01-30 • Filed 2019-12-18 • Published 2020-08-06

12. The computer program product as set forth in Claim 11, wherein the one or more processors further perform an operation of using sliced-Wasserstein (SW) distance as a dissimilarity measure for determining dissimilarity between the first input data distribution and the second input data …


2019 patent

Clean energy power supply planning method based on Wasserstein distance and …

CN CN110797919B 汪荣华 国网四川省电力公司经济技术研究院

Priority 2019-12-05 • Filed 2019-12-05 • Granted 2020-09-01 • Published 2020-09-01

The invention discloses a clean energy power supply planning method based on Wasserstein distance and distribution robust optimization, which relates to the technical field of power system planning, and comprises the following steps: s1: constructing a wind-solar output uncertainty set based on …


2019 patent

… typical scene generation method based on BIRCH clustering and Wasserstein …

CN CN110929399A 汤向华 国网江苏省电力有限公司南通供电分公司

Priority 2019-11-21 • Filed 2019-11-21 • Published 2020-03-27

2. The method for generating a typical wind power output scene based on BIRCH clustering and Wasserstein distance as claimed in claim 1, wherein: the specific steps of the BIRCH clustering are as follows: a) setting threshold parameters B, L and T, and inputting wind power scene number S; b) number …


2019 patent

Wasserstein distance-based fault diagnosis method for deep countermeasure …

CN CN110907176A 徐娟 合肥工业大学

Priority 2019-09-30 • Filed 2019-09-30 • Published 2020-03-24

2. The method for fault diagnosis of the deep immunity migration network based on Wasserstein distance as claimed in claim 1, wherein in step S3, the objective function of the fault diagnosis model is determined, which includes the following specific steps: s301, extracting the source domain D from …


 

2019


2019 patent

… for high-dimension unsupervised anomaly detection using kernalized wasserstein …

KR KR102202842B1 백명희조 서울대학교산학협력단

Priority 2019-08-13 • Filed 2019-08-13 • Granted 2021-01-14 • Published 2021-01-14

The present invention relates to a learning method and a learning apparatus for high-dimension unsupervised abnormality detection using a kernalized Wasserstein autoencoder to decrease excessive computations of a Christoffel function, and a test method and a test apparatus using the same.


2019 patent

Finger vein identification method based on deep learning and Wasserstein …

CN CN110555382A 张娜 浙江理工大学

Priority 2019-07-31 • Filed 2019-07-31 • Published 2019-12-10

6. the finger vein recognition method based on deep learning and Wasserstein distance measurement in claim 1, wherein: the step S5 includes: S51, in the registration stage, acquiring a finger vein image through the step S1, further extracting a feature code G w (x) of the image through the steps S2 …


2019 patent

Convolutional neural networks based on Wasserstein distance fight transfer …

CN CN110414383A 袁烨 华中科技大学

Priority 2019-07-11 • Filed 2019-07-11 • Published 2019-11-05

In the step 3.2, the Wasserstein distance is the real number average value and target reality of the source domain set of real numbers The difference of the real number average value of manifold. 5. a kind of convolutional neural networks based on Wasserstein distance according to claim 2 fight …


2019 patent

Data association method in pedestrian tracking based on Wasserstein measurement

CN CN110110670B 郭春生 杭州电子科技大学

Priority 2019-05-09 • Filed 2019-05-09 • Granted 2022-03-25 • Published 2022-03-25

5. The data association method in pedestrian tracking based on Wasserstein measurement as claimed in claim 1, wherein said second step is specifically: the method comprises the steps that seven video clips on a train sequence of an MOT16 data set are used for making data sets, and the made training …


2019-2020 patent

Wasserstein barycenter model ensembling

US US20200342361A1 Youssef Mroueh International Business Machines Corporation

Priority 2019-04-29 • Filed 2019-04-29 • Published 2020-10-29

, wherein the side information includes class relationships represented by a graph or via an embedding space. 13 . The system according to claim 9 , wherein the optimal transport metric includes a Wasserstein distance. 14 . The system according to claim 11 , wherein the barycenter takes into account the …

<—–2019—–—2019–——2900—



2019 patent

… denoising model of confrontation network are generated based on Wasserstein

CN CN110097512A 张意 四川大学

Priority 2019-04-16 • Filed 2019-04-16 • Published 2019-08-06

The invention discloses a kind of construction method of three-dimensional MRI image denoising model that confrontation network is generated based on Wasserstein and applications, the present invention generates confrontation network as basic model using Wasserstein and handles MRI noise image, it …


2019 patent

Sketch based on WGAN-GP and U-NET-photo method for transformation

CN CN110175567A 王世刚 吉林大学

Priority 2019-05-28 • Filed 2019-05-28 • Published 2019-08-27

1. a kind of sketch based on WGAN-GP and U-NET -- photo method for transformation, it is characterised in that include the following steps: 1.1 obtain human face sketch -- picture data library: FERET, CUHK, IIIT-D 1.2 by sketch -- photo keeps the distribution proportion of its face of …


2019 patent

A kind of horizontal proliferation eGaN HEMT device of integrated backward …

CN CN110212028A 张士英 张士英

Priority 2019-05-22 • Filed 2019-05-22 • Published 2019-09-06

6. the horizontal proliferation eGaN HEMT of a kind of integrated backward dioded according to claim 1 and embedded drain electrode field plate Device, which is characterized in that MIS Schottky diode extended segment (104) and MIS Schottky diode insulating layer (105) are adopted MIS Schottky …


2019 patent

A kind of eGaN HEMT hybrid solenoid valve circuit and control method

CN CN110224579A 彭子和 南京航空航天大学

Priority 2019-05-16 • Filed 2019-05-16 • Published 2019-09-10

4. the method for controlling eGaN HEMT hybrid solenoid valve circuit as claimed in claim 2, it is characterised in that: keep driving electricity Potential source U dri In running order, DC current source I is opened in starting before eGaN HEMT is opened on_bias , enter in eGaN HEMT It is closed when …


2019 patent

A kind of uneven learning method based on WGAN-GP and over-sampling

CN CN109816044A 邓晓衡 中南大学

Priority 2019-02-11 • Filed 2019-02-11 • Published 2019-05-28

3. a kind of uneven learning method based on WGAN-GP and over-sampling as claimed in claim 2, which is characterized in that sentence The loss function of other device, as follows: Wherein, D (), G () respectively indicate the function expression of arbiter and Maker model, P r Indicate the number of …


2019


2019-2023 patent

Methods and devices performing adaptive quadratic Wasserstein full-waveform …

US BR GB MX GB2584196B Wang Diancheng Cgg Services Sas

Priority 2019-03-26 • Filed 2020-03-13 • Granted 2023-01-18 • Published 2023-01-18

<—–2019—–—2019–——2906—end 2019

including 4 titles with Vaserstein or вассерштейна,

1 with ВАССЕРШТЕЙН, 1 with ВАССЕРШТЕЙНА, 1 title with WassRank,

1 title with Wasserstein, and 1 title with CWGAN.

 WGAN = Wasserstein GAN 


start 2020 Wasserstein Vaserstein in title

MR4040747 Prelim Xie, Fangzhou; Wasserstein Index Generation Model: Automatic generation of time-series index with application to Economic Policy Uncertainty. Econom. Lett. 186 (2020), 108874. 91B84

 Wasserstein Index Generation Model: Automatic generation of time-series index with application to Economic Policy Uncertainty 

by Xie, Fangzhou 

Economics Letters, 01/2020, Volume 186

I propose a novel method, the Wasserstein Index Generation model (WIG), to generate a public sentiment index automatically. To test the model’s effectiveness,...

Journal Article: Full Text Online 


MR4038803 Prelim Luini, E.; Arbenz, P.; Density estimation of multivariate samples using Wasserstein distance. J. Stat. Comput. Simul. 90 (2020), no. 2, 181–210.

Density estimation of multivariate samples using Wasserstein distance

E Luini, P Arbenz - Journal of Statistical Computation and …, 2020 - Taylor & Francis

… To the knowledge of the authors, other publications involving the topics of Wasserstein distance

and hypothesis tests are [7] and [8]. The former introduced the Wasserstein distance in

nonparametric two-sample or homogeneity testing, the latter in uniformity and distributional …

Cited by 4 Related articles All 4 versions

 

MR4036051 Prelim Lei, Jing; Convergence and concentration of empirical measures under Wasserstein distance in unbounded functional spaces. Bernoulli 26 (2020), no. 1, 767–798. 60B10 (60B12 60E15 60G15 62G30)

 Convergence and concentration of empirical measures under Wasserstein distance in unbounded functional spaces 

By: Lei, Jing 

BERNOULLI  Volume: 26   Issue: 1   Pages: 767-798   Published: FEB 2020 

Cited by 42 Related articles All 5 versions


MR4058364 Prelim Liu, Yating; Pagès, Gilles; Characterization of probability distribution convergence in Wasserstein distance by L^p-quantization error function. Bernoulli 26 (2020), no. 2, 1171–1204.

Characterization of probability distribution convergence in Wasserstein distance by L^p-quantization error function

By: Liu, Yating; Pages, Gilles 

BERNOULLI  Volume: 26   Issue: 2   Pages: 1171-1204   Published: MAY 2020 

Zbl 07166560

Cited by 1 Related articles All 5 versions


[PDF] arxiv.org

Convergence and concentration of empirical measures under wasserstein distance in unbounded functional spaces

J Lei - Bernoulli, 2020 - projecteuclid.org

We provide upper bounds of the expected Wasserstein distance between a probability 

measure and its empirical version, generalizing recent results for finite dimensional 

Euclidean spaces and bounded functional spaces. Such a generalization can cover …

Cited by 21 Related articles All 2 versions 

Zbl 07140516   Lei, Jing  Convergence and concentration of empirical measures under Wasserstein distance in unbounded functional spaces. (English) 

Bernoulli 26, No. 1, 767-798 (2020).  MSC:  60 62 

Cited by 73 Related articles All 5 versions
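As a tiny numerical companion to the empirical-measure results above (purely illustrative; the sample sizes and the use of scipy.stats.wasserstein_distance are choices made for this list, not the paper's), one can watch the 1-D empirical Wasserstein distance shrink as the sample grows:

```python
# Hedged illustration: W_1 between a reference sample and empirical samples of
# increasing size; scipy computes the 1-D distance exactly from order statistics.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
reference = rng.standard_normal(100_000)    # stand-in for the underlying measure
for n in (10, 100, 1_000, 10_000):
    sample = rng.standard_normal(n)         # empirical measure with n atoms
    print(n, wasserstein_distance(sample, reference))
```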

2020

[PDF] arxiv.org

Wasserstein hamiltonian flows

SN Chow, W Li, H Zhou - Journal of Differential Equations, 2020 - Elsevier

We establish kinetic Hamiltonian flows in density space embedded with the L 2-Wasserstein 

metric tensor. We derive the Euler-Lagrange equation in density space, which introduces the 

associated Hamiltonian flows. We demonstrate that many classical equations, such as …

Cited by 1 Related articles All 4 versions
Zbl 07128929    Chow, Shui-Nee; Li, Wuchen; Zhou, Haomin

Wasserstein Hamiltonian flows. (English) 

J. Differ. Equations 268, No. 3, 1205-1219 (2020).   MSC:  35A15 47J35 

MR4029003 Prelim Chow, Shui-Nee; Li, Wuchen; Zhou, Haomin; Wasserstein Hamiltonian flows. J. Differential Equations 268 (2020), no. 3, 1205–1219. 58 (35Q41 35Q83)
 Wasserstein Hamiltonian flows 

by Chow, Shui-Nee; Li, Wuchen; Zhou, Haomin 

Journal of Differential Equations, 01/2020, Volume 268, Issue 3

We establish kinetic Hamiltonian flows in density space embedded with the L2-Wasserstein metric tensor. We derive the Euler-Lagrange equation in density space,...

Journal Article: Full Text Online 

Cited by 7 Related articles All 7 versions

Semi-supervised Surface Wave Tomography with Wasserstein Cycle-consistent GAN: Method and Application on Southern California Plate Boundary Region

A Cai, H Qiu, F Niu - 2020 - essoar.org

Machine learning algorithm is applied to shear wave velocity (Vs) inversion in surface wave

tomography, where a set of 1-D Vs profiles and the corresponding synthetic dispersion

curves are used in network training. Previous studies showed that performances of a trained …

 


[PDF] arxiv.org

Visual transfer for reinforcement learning via wasserstein domain confusion

J Roy, G Konidaris - arXiv preprint arXiv:2006.03465, 2020 - arxiv.org

We introduce Wasserstein Adversarial Proximal Policy Optimization (WAPPO), a novel

algorithm for visual transfer in Reinforcement Learning that explicitly learns to align the

distributions of extracted features between a source and target task. WAPPO approximates …

  Cited by 3 Related articles All 6 versions 



Bridging the gap between f-gans and wasserstein gans

J Song, S Ermon - International Conference on Machine …, 2020 - proceedings.mlr.press

Generative adversarial networks (GANs) variants approximately minimize divergences

between the model and the data distribution using a discriminator. Wasserstein GANs

(WGANs) enjoy superior empirical performance, however, unlike in f-GANs, the discriminator …

  Cited by 11 Related articles All 4 versions 


Typical wind power scenario generation for multiple wind farms using conditional improved Wasserstein generative adversarial network

Y Zhang, Q Ai, F Xiao, R Hao, T Lu - … Journal of Electrical Power & Energy …, 2020 - Elsevier

Because of environmental benefits, wind power is taking an increasing role meeting 

electricity demand. However, wind power tends to exhibit large uncertainty and is largely 

influenced by meteorological conditions. Apart from the variability, when multiple wind farms …

All 2 versions 

<——2020———————2020 ———————-10—


[PDF] researchgate.net

Data supplement for a soft sensor using a new generative model based on a variational autoencoder and Wasserstein GAN

X Wang, H Liu - Journal of Process Control, 2020 - Elsevier

In industrial process control, measuring some variables is difficult for environmental or cost 

reasons. This necessitates employing a soft sensor to predict these variables by using the 

collected data from easily measured variables. The prediction accuracy and computational …

Data supplement for a soft sensor using a new generative model based on a variational autoencoder and Wasserstein GAN 

by Wang, Xiao; Liu, Han 

Journal of Process Control, 01/2020, Volume 85

Journal Article: Full Text Online 

All 2 versions 

Cited by 39
Related articles All 2 versions

2020 [CITATION] Commande Optimale dans les Espaces de Wasserstein

B Bonnet - 2020 - theses.fr

… Optimal Control in Wasserstein Spaces (Commande Optimale dans les Espaces de Wasserstein). By Benoit Bonnet. Thesis project in automatic control, signal processing, production engineering, and robotics. The defense is scheduled for 17-01-2020, under the supervision of Francesco Rossi and Maxime Hauray. Thesis in preparation at Aix-Marseille …
see Optimal Control in Wasserstein


arXiv:2001.01700  [pdf, other  math.ST 

Gradient descent algorithms for Bures-Wasserstein barycenters 

Authors: Sinho Chewi, Tyler Maunu, Philippe Rigollet, Austin J. Stromme 

Abstract: We study first order methods to compute the barycenter of a probability distribution over the Bures-Wasserstein manifold. We derive global rates of convergence for both gradient descent and stochastic gradient descent despite the fact that the barycenter functional is not geodesically convex. Our analysis overcomes this technical hurdle by developing a Polyak-Lojasiewicz (PL) inequality, which is… More 

Submitted 6 January, 2020; originally announced January 2020. 

Comments: 24 pages, 5 figures 

MSC Class: Primary: 62F10; Secondary: 90C26; 58E; 68W25 

Gradient descent algorithms for Bures-Wasserstein barycenters 

by Chewi, Sinho; Maunu, Tyler; Rigollet, Philippe; More... 

01/2020 

Journal Article:  Full Text Online 

Cited by 36 Related articles All 9 versions
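
Not the algorithm analyzed in the paper (which studies gradient descent and stochastic gradient descent on the Bures-Wasserstein manifold), but as a concrete point of reference: a minimal NumPy/SciPy sketch of the classical fixed-point iteration for the barycenter of centered Gaussians with given covariances; the function name bw_barycenter and all defaults are illustrative.

    import numpy as np
    from scipy.linalg import sqrtm

    def bw_barycenter(covs, weights=None, iters=100, tol=1e-9):
        # Fixed-point map: S -> S^{-1/2} (sum_i w_i (S^{1/2} C_i S^{1/2})^{1/2})^2 S^{-1/2}
        n = covs[0].shape[0]
        w = np.full(len(covs), 1.0 / len(covs)) if weights is None else np.asarray(weights)
        S = np.eye(n)
        for _ in range(iters):
            root = np.real(sqrtm(S))
            inv_root = np.linalg.inv(root)
            M = sum(wi * np.real(sqrtm(root @ C @ root)) for wi, C in zip(w, covs))
            S_new = inv_root @ M @ M @ inv_root
            if np.linalg.norm(S_new - S) < tol:
                return S_new
            S = S_new
        return S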

Tyler Maunu (MIT) -- Gradient descent algorithms for Bures-Wasserstein 

41"35

Jonathan Niles-Weed (NYU/IAS) - Estimation of the Wasserstein distance in the spiked transport model ...

- Uploaded by MIFODS

Gradient descent algorithms for Bures-Wasserstein barycenters
Feb 5, 2020

Data Augmentation Based on Wasserstein Generative Adversarial Nets Under Few Samples

Y Jiang, B Zhu, Q Ma - IOP Conference Series: Materials Science …, 2020 - iopscience.iop.org

… proposed Wasserstein Generative Adversarial Nets (WGAN) in document [6]. This model uses Wasserstein distance (also known as Earth-mover, EM distance) instead of Jensen-Shannon (JS) divergence to evaluate the distance between actual samples and generated samples … 

Data Augmentation Based on Wasserstein Generative Adversarial Nets Under Few Samples 

by Jiang, Yuchen; Zhu, Bin; Ma, Qi 

IOP Conference Series: Materials Science and Engineering, 01/2020, Volume 711

Journal Article:  Full Text Online
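
As a reading aid for the Wasserstein-versus-JS point made in the abstract (a minimal sketch assuming a PyTorch critic module; names and the clipping constant are illustrative, and later WGAN variants replace weight clipping with a gradient penalty):

    import torch

    def wgan_critic_loss(critic, real, fake):
        # Kantorovich-Rubinstein dual: the critic maximizes E[f(real)] - E[f(fake)],
        # i.e. it minimizes the negated difference below.
        return critic(fake).mean() - critic(real).mean()

    def clip_critic_weights(critic, c=0.01):
        # Original WGAN enforces an approximate 1-Lipschitz constraint by weight clipping.
        for p in critic.parameters():
            p.data.clamp_(-c, c)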

MR4043394 Prelim Puccetti, Giovanni; Rüschendorf, Ludger; Vanduffel, Steven; On the computation of Wasserstein barycenters. J. Multivariate Anal. 176 (2020), 104581. 65C50 (60-08 68U10 68W25)

see 2019

2020


Wasserstein GAN based on Autoencoder with back-translation for cross-lingual embedding mappings 

by Zhang, Yuhong; Li, Yuling; Zhu, Yi; More... 

Pattern Recognition Letters, 01/2020, Volume 129

Journal Article:  Full Text Online  

Wasserstein GAN based on Autoencoder with back-translation for cross-lingual embedding mappings

Y Zhang, Y Li, Y Zhu, X Hu - Pattern Recognition Letters, 2020 - Elsevier

Recent works about learning cross-lingual word mappings (CWMs) focus on relaxing the 

requirement of bilingual signals through generative adversarial networks (GANs). GANs 

based models intend to enforce source embedding space to align target embedding space. 

However, existing GANs based models cannot exploit the underlying information of target-

side for an alignment standard in the training, which may lead to some suboptimal results of 

CWMs. To address this problem, we propose a novel method, named Wasserstein GAN …

Related articles All 2 versions

Cited by 6 Related articles All 3 versions

Wasserstein Distributionally Robust Motion Control for Collision Avoidance Using Conditional Value-at-Risk 

by Hakobyan, Astghik; Yang, Insoon  01/2020

Journal Article: Full Text Online
Wasserstein distributionally robust motion control for collision avoidance using conditional value-at-risk


Generative adversarial networks based on Wasserstein distance for knowledge graph embeddings 

by Dai, Yuanfei; Wang, Shiping; Chen, Xing; More... 

Knowledge-Based Systems, 02/2020, Volume 190

Journal Article:  Full Text Online  see 2019

Generative adversarial networks based on Wasserstein distance for knowledge graph embeddings

Y Dai, S Wang, X Chen, C Xu, W Guo - Knowledge-Based Systems, 2020 - Elsevier

Abstract Knowledge graph embedding aims to project entities and relations into low-

dimensional and continuous semantic feature spaces, which has captured more attention in 

recent years. Most of the existing models roughly construct negative samples via a uniformly 

random mode, by which these corrupted samples are practically trivial for training the 

embedding model. Inspired by generative adversarial networks (GANs), the generator can 

be employed to sample more plausible negative triplets, that boosts the discriminator to …

Cited by 15 Related articles All 2 versions

Wasserstein Exponential Kernels 

by De Plaen, Henri; Fanuel, Michaël; Suykens, Johan A. K 

02/2020

In the context of kernel methods, the similarity between data points is encoded by the kernel function which is often defined thanks to the Euclidean distance,...

Journal Article: Full Text Online 

online OPEN ACCESS

Wasserstein Exponential Kernels

by De Plaen, Henri; Fanuel, M; Suykens, J

07/2020

status: published

Conference ProceedingFull Text Online

[PDF] arxiv.org

Wasserstein Exponential Kernels

H De Plaen, M Fanuel, JAK Suykens - arXiv preprint arXiv:2002.01878, 2020 - arxiv.org

In the context of kernel methods, the similarity between data points is encoded by the kernel 

function which is often defined thanks to the Euclidean distance, a common example being 

the squared exponential kernel. Recently, other distances relying on optimal transport theory …

Cited by 5 Related articles All 5 versions
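
A minimal NumPy sketch of the idea described in the abstract, for one-dimensional histograms on a common grid (function names are mine; whether such a kernel is positive semi-definite is exactly the kind of question the paper addresses):

    import numpy as np

    def w1_histograms(u, v, grid):
        # 1-D Wasserstein-1 distance = L1 distance between the two CDFs.
        cdf_gap = np.abs(np.cumsum(u - v))[:-1]
        return float(np.sum(cdf_gap * np.diff(grid)))

    def wasserstein_exp_kernel(u, v, grid, sigma=1.0):
        # Squared-exponential kernel with the Euclidean distance replaced
        # by the Wasserstein distance between the two histograms.
        d = w1_histograms(u, v, grid)
        return float(np.exp(-d ** 2 / (2.0 * sigma ** 2)))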

 <——2020———————2020 —————-20—


Stochastic Approximation versus Sample Average Approximation for population Wasserstein barycenters

D Dvinskikh - arXiv e-prints, 2020 - ui.adsabs.harvard.edu

In machine learning and optimization community there are two main approaches for convex

risk minimization problem, namely, the Stochastic Approximation (SA) and the Sample

Average Approximation (SAA). In terms of oracle complexity (required number of stochastic

gradient evaluations), both approaches are considered equivalent on average (up to a

logarithmic factor). The total complexity depends on the specific problem, however, starting

from the work of Nemirovski et al. (2009) it was generally accepted that the SA is better than …

  Cited by 1

online  OPEN ACCESS

Stochastic Approximation versus Sample Average Approximation for population Wasserstein...

by Dvinskikh, Darina

01/2020

In machine learning and optimization community there are two main approaches for convex risk minimization problem, namely, the Stochastic Approximation (SA)...

Journal ArticleFull Text Online


[2001.06187]

Exponential contraction in Wasserstein distance on static and evolving manifolds 

by Cheng, Li-Juan; Thalmaier, Anton; Zhang, Shao-Qin  01/2020

Journal Article: Full Text Online 

[PDF] arxiv.org

Exponential contraction in Wasserstein distance on static and evolving manifolds

LJ Cheng, A Thalmaier, SQ Zhang - arXiv preprint arXiv:2001.06187, 2020 - arxiv.org

In this article, exponential contraction in Wasserstein distance for heat semigroups of diffusion processes on Riemannian manifolds is established under curvature conditions where Ricci curvature is not necessarily required to be non-negative. Compared to the …

Cited by 1 All 4 versions


arxiv.org › math

SA vs SAA for population Wasserstein barycenter calculation 

by Dvinskikh, Darina  01/2020

Journal Article: Full Text Online 

Cited by 4

 arxiv.org › math

TPFA Finite Volume Approximation of Wasserstein Gradient Flows 

by Natale, Andrea; Todeschi, Gabriele  01/2020

Journal Article: Full Text Online 

Related articles All 8 versions

Book ChapterFull Text Online
Cited by 3
 Related articles All 6 versions


arxiv.org › cs

Nested-Wasserstein Self-Imitation Learning for Sequence Generation 

by Zhang, Ruiyi; Chen, Changyou; Gan, Zhe; More... 

01/2020

Journal Article: Full Text Online 

All 2 versions 

R Zhang, C Chen, Z Gan, Z Wen, W Wang, L Carin - bayesiandeeplearning.org

… (i) A novel nested-Wasserstein self-imitation learning framework is … Nested-Wasserstein distance

provides a natural way to manifest semantic matching compared with the conventional rewards …

Alternatively, we can train a discriminator to learn the reward model, but empirically it … 


2020


[PDF] ceaj.org

[PDF] 结合 FC-DenseNet WGAN 的图像去雾算法

孙斌, 雎青青, 桑庆兵 - 计算机科学与探索, 2019 - fcst.ceaj.org

To address the problem that existing image dehazing algorithms depend heavily on accurate estimation of intermediate quantities, an end-to-end image dehazing model based on Wasserstein Generative Adversarial Networks (WGAN) is proposed. First, a Fully Convolutional DenseNets (FC-DenseNet) network is used to …

Related articles All 2 versions 

[Chinese  Image dehazing algorithm combining FC-DenseNet and WGAN]

Related articles All 2 versions

 Typical wind power scenario generation for multiple wind farms using conditional improved Wasserstein generative adversarial network

Y Zhang, Q Ai, F Xiao, R Hao, T Lu - … Journal of Electrical Power & Energy …, 2020 - Elsevier

Because of environmental benefits, wind power is taking an increasing role meeting electricity demand. However, wind power tends to exhibit large uncertainty and is largely influenced by meteorological conditions. Apart from the variability, when multiple wind farms …

Related articles All 2 versions 

[PDF] researchgate.net


 arXiv:2001.11005  [pdf, other]  physics.chem-ph physics.comp-ph 

Wasserstein metric for improved QML with adjacency matrix representations 

Authors: Onur Çaylak, O. Anatole von Lilienfeld, Björn Baumeier 

Abstract: We study the Wasserstein metric to measure distances between molecules represented by the atom index dependent adjacency "Coulomb" matrix, used in kernel ridge regression based supervised learning. Resulting quantum machine learning models exhibit improved training efficiency and result in smoother predictions of molecular distortions. We first demonstrate smoothness for the continuous extraction… More 

Submitted 29 January, 2020; originally announced January 2020.
[PDF] arxiv.org

Wasserstein metric for improved QML with adjacency matrix representations

O Çaylak, OA von Lilienfeld, B Baumeier - arXiv preprint arXiv:2001.11005, 2020 - arxiv.org

… The Wasserstein metric is permutation invariant and the MAE obtained with it is given by … various possible combinations of kernel functions, Wasserstein metric, and representations other than the CM … Montavon, K.-R. Müller, and M. Cuturi, "Wasserstein training of Boltzmann …

Related articles All 2 versions 


arXiv:2001.10655  [pdf, ps, other 

cs.LG cs.CR eess.SP math.OC math.ST stat.ML 

Regularization Helps with Mitigating Poisoning Attacks: Distributionally-Robust Machine Learning Using the Wasserstein Distance 

Authors: Farhad Farokhi 

Abstract: We use distributionally-robust optimization for machine learning to mitigate the effect of data poisoning attacks. We provide performance guarantees for the trained model on the original data (not including the poison records) by training the model for the worst-case distribution on a neighbourhood around the empirical distribution (extracted from the training dataset corrupted by a poisoning atta… More 

Submitted 28 January, 2020; originally announced January 2020. 

 

arXiv:2001.09993  [pdf, other 

cs.LG cs.AI stat.ML 

Generating Natural Adversarial Hyperspectral examples with a modified Wasserstein GAN 

Authors: Jean-Christophe Burnel, Kilian Fatras, Nicolas Courty 

Abstract: Adversarial examples are a hot topic due to their abilities to fool a classifier's prediction. There are two strategies to create such examples, one uses the attacked classifier's gradients, while the other only requires access to the clas-sifier's prediction. This is particularly appealing when the classifier is not full known (black box model). In this paper, we present a new method which is abl… More 

Submitted 27 January, 2020; originally announced January 2020. 

Comments: C&ESAR, Nov 2019, Rennes, France 

<——2020———————2020 ———————-30—


arXiv:2001.08056  [pdf, ps, other  math.ST 

Bures-Wasserstein Geometry 

Authors: Jesse van Oostrum 

Abstract: The Bures-Wasserstein distance is a Riemannian distance on the space of positive definite Hermitian matrices and is given by: $d(\Sigma,T)=\big[\operatorname{tr}(\Sigma)+\operatorname{tr}(T)-2\operatorname{tr}\big((\Sigma^{1/2}T\Sigma^{1/2})^{1/2}\big)\big]^{1/2}$. This distance function appears in the fields of optimal transport, quantum information, and optimisation theory. In this paper, the geometrical properties of this dis… More

Submitted 21 January, 2020; originally announced January 2020.

Bures-Wasserstein Geometry 

by van Oostrum, Jesse 

01/2020

The Bures-Wasserstein distance is a Riemannian distance on the space of positive definite Hermitian matrices and is given by: $d(\Sigma,T) =...

Journal Article:  Full Text Online 

All 2 versions  Related articles
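
A minimal NumPy/SciPy sketch evaluating the distance formula quoted in the abstract (the function name and the use of scipy.linalg.sqrtm are my choices, not the paper's):

    import numpy as np
    from scipy.linalg import sqrtm

    def bures_wasserstein(Sigma, T):
        # d(Sigma, T) = [tr(Sigma) + tr(T) - 2 tr((Sigma^{1/2} T Sigma^{1/2})^{1/2})]^{1/2}
        root = sqrtm(Sigma)
        cross = sqrtm(root @ T @ root)
        d2 = np.trace(Sigma) + np.trace(T) - 2.0 * np.real(np.trace(cross))
        return float(np.sqrt(max(np.real(d2), 0.0)))  # clip tiny negative round-off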


arXiv:2001.09817  [pdf, ps, other 

math.PR   doi 10.1214/19-EJP410

Exact rate of convergence of the mean Wasserstein distance between the empirical and true Gaussian distribution 

Authors: Philippe Berthet, Jean-Claude Fort 

Abstract: We study the Wasserstein distance $W_2$ for Gaussian samples. We establish the exact rate of convergence $\sqrt{\log\log n/n}$ of the expected value of the $W_2$ distance between the empirical and true c.d.f.'s for the normal distribution. We also show that the rate of weak convergence is unexpectedly $1/\sqrt{n}$ in the case of two correlated Gaussian samples.

Submitted 27 January, 2020; originally announced January 2020.

Journal ref: Electron. J. Probab. 25 (2020) 


Wasserstein distributionally robust shortest path problem 

by Wang, Zhuolin; You, Keyou; Song, Shiji; More... 

European Journal of Operational Research, 01/2020

Journal Article: Full Text Online 

Wasserstein Distributionally Robust Shortest Path Problem

Z Wang, K You, S Song, Y Zhang - European Journal of Operational …, 2020 - Elsevier

This paper proposes a data-driven distributionally robust shortest path (DRSP) model where the distribution of the travel time in the transportation network can only be partially observed through a finite number of samples. Specifically, we aim to find an optimal path to minimize the worst-case α-reliable mean-excess travel time (METT) over a Wasserstein ball, which is centered at the empirical distribution of the sample dataset and the ball radius quantifies the level of its confidence. In sharp contrast to the existing DRSP models, our model is …

Related articles 

MR4068569 Prelim Wang, Zhuolin; You, Keyou; Song, Shiji; Zhang, Yuli; Wasserstein distributionally robust shortest path problem. European J. Oper. Res. 284 (2020), no. 1, 31–43. 90C35 (90B06 90C10)
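
For context, the generic shape of a Wasserstein distributionally robust program, in my own notation (the paper's METT objective and path constraints are more specific):

$$\min_{x\in X}\ \sup_{Q:\,W_1(Q,\hat{P}_N)\le \epsilon}\ \mathbb{E}_{Q}\big[\ell(x,\xi)\big],$$

where $\hat{P}_N$ is the empirical distribution of the $N$ observed samples and the radius $\epsilon$ quantifies the level of confidence, as described in the abstract above.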


Nonpositive curvature, the variance functional, and the Wasserstein barycenter

Kim, Young-Heon; Pass, Brendan

Proceedings of the American Mathematical Society

2020 p. 1  FullText OnlineJournal Article

MR4069211 Prelim Kim, Young-Heon; Pass, Brendan Nonpositive curvature, the variance functional, and the Wasserstein barycenter. Proc. Amer. Math. Soc. 148 (2020), no. 4, 1745–1756. 53C21 (49Q20 49Q22)

 Nonpositive curvature, the variance functional, and the Wasserstein barycenter

YH Kim, B Pass - Proceedings of the American Mathematical Society, 2020 - ams.org

We show that a Riemannian manifold $ M $ has nonpositive sectional curvature and is 

simply connected if and only if the variance functional on the space $ P (M) $ of probability 

measures over $ M $ is displacement convex. We then establish convexity over Wasserstein 

barycenters of the variance, and derive an inequality between the variance of the 

Wasserstein and linear barycenters of a probability measure on $ P (M) $. These results are 

applied to invariant measures under isometry group actions, implying that the variance of the …

Cited by 3 Related articles All 3 versions


Missing features reconstruction using a Wasserstein generative adversarial imputation network

Authors: Friedjungova M.; Vasata D.; Balatsko M.; Jirina M.

20th International Conference on Computational Science, ICCS 2020

Article, 2020

Publication: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 12140 LNCS, 2020, 225

Publisher: 2020

 2020


Parameter estimation for biochemical reaction networks using Wasserstein distances

Öcal, Kaan; Grima, Ramon

Journal of Physics A: Mathematical and Theoretical

2020 v. 53 no. 3 p. 34002 

FullText Online Journal Article

Parameter estimation for biochemical reaction networks using Wasserstein distances 

by Öcal, Kaan; Grima, Ramon; Sanguinetti, Guido 

Journal of Physics A: Mathematical and Theoretical, 01/2020, Volume 53, Issue 3

Journal Article: Full Text Online 


[PDF] researchgate.net

On the computation of Wasserstein barycenters

G Puccetti, L Rüschendorf, S Vanduffel - Journal of Multivariate Analysis, 2020 - Elsevier

The Wasserstein barycenter is an important notion in the analysis of high dimensional data 

with a broad range of applications in applied probability, economics, statistics, and in 

particular to clustering and image processing. In this paper, we state a general version of the …

Cited by 3 Related articles All 4 versions 

Multivariate Analysis; New Findings on Multivariate Analysis Described by Investigators at University of Freiburg (On the computation of Wasserstein barycenters) 

Journal of Mathematics, Mar 3, 2020, 459

Newspaper Article:

Full Text Online 

  

Optimal Estimation of Wasserstein Distance on a Tree With an Application to Microbiome Studies
Wang, Shulei; Cai, T. Tony
Journal of the American Statistical Association
2020 pp. 1–17
Full Text Online  Journal Article

Optimal Estimation of Wasserstein Distance on a Tree With an Application to Microbiome Studies 

By: Wang, Shulei; Cai, T. Tony; Li, Hongzhe 

JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION     

Early Access: JAN 2020 

Cited by 1 Related articles All 4 versions
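
As background, the standard closed form for $W_1$ on a tree metric, in my notation (the paper concerns how accurately this functional can be estimated from sampled data, e.g. microbiome counts):

$$W_1(P,Q)=\sum_{e\in E} w_e\,\bigl|P(\Gamma_e)-Q(\Gamma_e)\bigr|,$$

where $w_e$ is the length of edge $e$ and $\Gamma_e$ is the set of nodes separated from the root by $e$.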

Wasserstein Learning of Determinantal Point Processes

Authors: Anquetil, Lucas (Creator); Gartrell, Mike (Creator); Rakotomamonjy, Alain (Creator); Tanielian, Ugo (Creator); Calauzènes, Clément (Creator)

Summary: Determinantal point processes (DPPs) have received significant attention as an elegant probabilistic model for discrete subset selection. Most prior work on DPP learning focuses on maximum likelihood estimation (MLE). While efficient and scalable, MLE approaches do not leverage any subset similarity information and may fail to recover the true generative distribution of discrete data. In this work, by deriving a differentiable relaxation of a DPP sampling algorithm, we present a novel approach for learning DPPs that minimizes the Wasserstein distance between the model and data composed of observed subsets. Through an evaluation on a real-world dataset, we show that our Wasserstein learning approach provides significantly improved predictive performance on a generative task compared to DPPs trained using MLE.

Downloadable Archival Material, 2020-11-19

Undefined

Publisher: 2020-11-19

[PDF] arxiv.org

Stein's method for normal approximation in Wasserstein distances with application to the multivariate Central Limit Theorem

T Bonis - Probability Theory and Related Fields, 2020 - Springer

We use Stein's method to bound the Wasserstein distance of order 2 between a measure $\nu$ and the Gaussian measure using a stochastic process $(X_t)_{t\ge 0}$ such that $X_t$ is drawn from $\nu$ for any $t>0$. If the stochastic process $(X_t)_{t\ge 0}$ satisfies an additional exchangeability assumption, we show it can also be used to obtain bounds on Wasserstein distances of any order $p\ge 1$. Using our results, we provide convergence rates for the multi-dimensional central limit theorem in terms of Wasserstein …

Cited by 3 Related articles All 2 versions

 <——2020———————2020 —————-40— 


 

arXiv:2002.00743  [pdf, other 

cs.CL cs.AI cs.LG stat.ML 

Unsupervised Multilingual Alignment using Wasserstein Barycenter 

Authors: Xin Lian, Kshitij Jain, Jakub Truszkowski, Pascal Poupart, Yaoliang Yu 

Abstract: We study unsupervised multilingual alignment, the problem of finding word-to-word translations between multiple languages without using any parallel data. One popular strategy is to reduce multilingual alignment to the much simplified bilingual setting, by picking one of the input languages as the pivot language that we transit through. However, it is well-known that transiting through a poorly ch… More 

Submitted 28 January, 2020; originally announced February 2020. 

Comments: Work in progress; comments welcome! 

 Unsupervised Multilingual Alignment using Wasserstein Barycenter 

by Lian, Xin; Jain, Kshitij; Truszkowski, Jakub; More... 

01/2020

We study unsupervised multilingual alignment, the problem of finding word-to-word translations between multiple languages without using any parallel data. One...

Journal Article: Full Text Online 

2020

 Lian, Xin. Unsupervised Multilingual Alignment using Wasserstein Barycenter. 

Degree: 2020, University of Waterloo

URL: http://hdl.handle.net/10012/15557 

► We investigate the language alignment problem when there are multiple languages, and we are interested in finding translation between all pairs of languages. The problem… (more)


Cited by 3 Related articles All 13 versions
Unsupervised Multilingual Alignment using Wasserstein Barycenter
thesis


Schiavo, Lorenzo Dello

A Rademacher-type theorem on L 2-Wasserstein spaces over closed Riemannian manifolds. (English) Zbl 07155075 

J. Funct. Anal. 278, No. 6, Article ID 108397, 51 p. (2020). 

MSC:  31C25 46G99 

MR4054103 Prelim Dello Schiavo, Lorenzo; A Rademacher-type theorem on L2-Wasserstein spaces over closed Riemannian manifolds. J. Funct. Anal. 278 (2020), no. 6, 108397. 31C25 (58D20)


Kim, Sejong; Lee, Hosoo  Inequalities of the Wasserstein mean with other matrix means. (English) Zbl 07154010 

Ann. Funct. Anal. 11, No. 1, 194-207 (2020). 

MSC:  15B48 47B65 

Inequalities of the Wasserstein mean with other matrix means 

By: Kim, Sejong; Lee, Hosoo 

ANNALS OF FUNCTIONAL ANALYSIS  Volume: 11   Issue: 1   Pages: 194-207   Published: JAN 2020 

MR4091410 Pending Kim, Sejong; Lee, Hosoo Inequalities of the Wasserstein mean with other matrix means. Ann. Funct. Anal. 11 (2020), no. 1, 194–207. 15B48 (15A45 47B65)

Cited by 6
Related articles All 2 versions


 

Regularization Helps with Mitigating Poisoning Attacks ...

arxiv.org › cs

by F Farokhi · 2020 · Cited by 5 — ... poisoning attack) defined using the Wasserstein distance. We relax the distributionally-robust machine learning problem by finding an upper ...

online OPEN ACCESS

Regularization Helps with Mitigating Poisoning Attacks: Distributionally-Robust Machine Learning Using the Wasserstein...

by Farokhi, Farhad

01/2020

We use distributionally-robust optimization for machine learning to mitigate the effect of data poisoning attacks. We provide performance guarantees for the...

Journal ArticleFull Text Online

 OPEN ACCESS

Regularization Helps with Mitigating Poisoning Attacks: Distributionally-Robust Machine Learning Using the Wasserstein...

by Farokhi, Farhad

01/2020

We use distributionally-robust optimization for machine learning to mitigate the effect of data poisoning attacks. We provide performance guarantees for the...

PublicationCitation Online


Isometric study of Wasserstein spaces---the real line

G Pál Gehér, T Titkos, D Virosztek - arXiv e-prints, 2020 - ui.adsabs.harvard.edu

Recently Kloeckner described the structure of the isometry group of the quadratic

Wasserstein space $\mathcal {W} _2\left (\mathbb {R}^ n\right) $. It turned out that the case of

the real line is exceptional in the sense that there exists an exotic isometry flow. Following …

2020

2020 see 2019

On the total variation Wasserstein gradient flow and the TV-JKO scheme 

By: Carlier, Guillaume; Poon, Clarice 

ESAIM-CONTROL OPTIMISATION AND CALCULUS OF VARIATIONS  Volume: 25     Published: SEP 20 2019 


Artifact correction in low-dose dental CT imaging using Wasserstein generative adversarial networks 

By: Hu, Zhanli; Jiang, Changhui; Sun, Fengyi; et al.

MEDICAL PHYSICS  Volume: 46   Issue: 4   Pages: 1686-1696   Published: APR 2019 


Hybrid Wasserstein distance and fast distribution clustering 

By: Verdinelli, Isabella; Wasserman, Larry 

ELECTRONIC JOURNAL OF STATISTICS  Volume: 13   Issue: 2   Pages: 5088-5119   Published: 2019 

 Free Full Text from Publisher 



Wasserstein loss with alternative reinforcement learning for severity-aware semantic segmentation

X Liu, Y Lu, X Liu, S Bai, S Li… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

Semantic segmentation is important for many real-world systems, eg, autonomous vehicles,

which predict the class of each pixel. Recently, deep networks achieved significant progress

wrt the mean Intersection-over Union (mIoU) with the cross-entropy loss. However, the cross …

  Cited by 3 Related articles

<——2020———————2020 ———————-50— 


An Integrated Processing Method Based on Wasserstein Barycenter Algorithm for Automatic Music Transcription

C Jin, Z Li, Y Sun, H Zhang, X Lv, J Li, S Liu - International Conference on …, 2019 - Springer

… First Online: 27 February 2020. 2 Downloads. Part of the Lecture Notes of the Institute for … Finally,

we drew a conclusion that our crowdsourcing method is very useful in improving … J., Lee, H., Slaney,

M.: A classification-based polyphonic piano transcription approach using learned …

  Related articles


 Person-independent facial expression recognition method based on improved Wasserstein generative adversarial networks in combination with identity aware

C Xu, Y Cui, Y Zhang, P Gao, J Xu - Multimedia Systems, 2020 - Springer

Since the distinction between two expressions is fairly vague, usually a subtle change in one part of the human face is enough to change a facial expression. Most of the existing facial expression recognition algorithms are not robust enough because they rely on general facial features or algorithms without considering differences between facial expression and facial identity. In this paper, we propose a person-independent recognition method based on Wasserstein generative adversarial networks for micro-facial expressions, where a facial …

 Cited by 8 Related articles All 2 versions

Person-independent facial expression recognition method based on improved Wasserstein generative adversarial networks in... 

by Xu, Caie; Cui, Yang; Zhang, Yunhui; More... 

Multimedia Systems, 02/2020, Volume 26, Issue 1

Journal Article:  Full Text Online 

   Person-independent facial expression recognition method based on improved Wasserstein

Research Conducted at University of Yamanashi Has Updated Our Knowledge about Multimedia (Person-independent facial expression recognition method based on improved Wasserstein... 

Journal of Technology & Science, 03/2020

Newsletter: Full Text Online 

Multimedia; Research Conducted at University of Yamanashi Has Updated Our Knowledge about Multimedia (Person-independent facial expression recognition method based on improved Wasserstein... 

Journal of Technology & Science, Mar 15, 2020, 2574


[HTML] nih.gov

[HTML] EEG Signal Reconstruction Using a Generative Adversarial Network With Wasserstein Distance and Temporal-Spatial-Frequency Loss

T Luo, Y Fan, L Chen, G Guo, C Zhou - Frontiers in …, 2020 - ncbi.nlm.nih.gov

Applications based on electroencephalography (EEG) signals suffer from the mutual

contradiction of high classification performance vs. low cost. The nature of this contradiction

makes EEG signal reconstruction with high sampling rates and sensitivity challenging …

  Cited by 3 Related articles All 3 versions


A generalized Vaserstein symbol

T Syed - Annals of K-Theory, 2020 - msp.org

Let $R$ be a commutative ring. For any projective $R$-module $P_0$ of constant rank 2 with a trivialization of its determinant, we define a generalized Vaserstein symbol on the orbit space of the set of epimorphisms $P_0\oplus R \to R$ under the action of the group of elementary …

Cited by 1 Related articles 

Cited by 3 Related articles All 7 versions Library Search


2020


A Survey on the Non-injectivity of the Vaserstein Symbol in Dimension Three

N Gupta, DR Rao, S Kolte - Leavitt Path Algebras and Classical K-Theory, 2020 - Springer

We give a recap of the study of the Vaserstein symbol \(V_A : Um_3(A)/E_3(A) \longrightarrow W_E(A)\), the elementary symplectic Witt group; when A is an affine threefold over a field k … LN Vaserstein in [20] proved that the orbit space of unimodular rows of length three modulo elementary …

All 2 versions

Gupta, NeenaRao, Dhvanita R.Kolte, Sagar

A survey on the non-injectivity of the Vaserstein symbol in dimension three. (English) Zbl 1442.19002

Ambily, A. A. (ed.) et al., Leavitt path algebras and classical K-theory. Based on the international workshop on Leavitt path algebras and K-theory, Kerala, India, July 1–3, 2017. Singapore: Springer. Indian Stat. Inst. Ser., 193-202 (2020).

MSC:  19B14 13C10 13F35


arXiv:2002.05366  [pdf, other 

cs.LG stat.ML 

Regularizing activations in neural networks via distribution matching with the Wasserstein metric 

Authors: Taejong Joo, Donggu Kang, Byunghoon Kim 

Abstract: Regularization and normalization have become indispensable components in training deep neural networks, resulting in faster training and improved generalization performance. We propose the projected error function regularization loss (PER) that encourages activations to follow the standard normal distribution. PER randomly projects activations onto one-dimensional space and computes the regulariza… More 

Submitted 13 February, 2020; originally announced February 2020. 

Comments: ICLR 2020 

propose the projected error function regularization loss (PER) that encourages activations to …

Cited by 4 Related articles All 7 versions


arXiv:2002.04783  [pdf, ps, other 

cs.CC cs.DS stat.ML 

Revisiting Fixed Support Wasserstein Barycenter: Computational Hardness and Efficient Algorithms 

Authors: Tianyi Lin, Nhat Ho, Xi Chen, Marco Cuturi, Michael I. Jordan 

Abstract: We study the fixed-support Wasserstein barycenter problem (FS-WBP), which consists in computing the Wasserstein barycenter of $m$ discrete probability measures supported on a finite metric space of size $n$. We show first that the constraint matrix arising from the linear programming (LP) representation of the FS-WBP is totally unimodular when $m\ge 3$ and $n=2$, but not totally unimodular whe… More

Submitted 11 February, 2020; originally announced February 2020. 

Comments: Under review, ICML 


arXiv:2002.04363  [pdf, ps, other 

math.ST cs.LG math.PR 

Wasserstein Control of Mirror Langevin Monte Carlo 

Authors: Kelvin Shuangjian Zhang, Gabriel Peyré, Jalal Fadili, Marcelo Pereyra 

Abstract: Discretized Langevin diffusions are efficient Monte Carlo methods for sampling from high dimensional target densities that are log-Lipschitz-smooth and (strongly) log-concave. In particular, the Euclidean Langevin Monte Carlo sampling algorithm has received much attention lately, leading to a detailed understanding of its non-asymptotic convergence properties and of the role that smoothness and lo… More 

Submitted 11 February, 2020; originally announced February 2020. 

Comments: 22 pages, 2 tables 

 

arXiv:2002.03035  [pdf, ps, other 

math.OC stat.ML 

Wasserstein Proximal Gradient 

Authors: Adil Salim, Anna Korba, Giulia Luise 

Abstract: We consider the task of sampling from a log-concave probability distribution. This target distribution can be seen as a minimizer of the relative entropy functional defined on the space of probability distributions. The relative entropy can be decomposed as the sum of a functional called the potential energy, assumed to be smooth, and a nonsmooth functional called the entropy. We adopt a Forward B… More 

Submitted 7 February, 2020; originally announced February 2020. 

<——2020———————2020 —————-60—

 

arXiv:2002.03016  [pdf, ps, other 

cs.LG stat.ML 

Safe Wasserstein Constrained Deep Q-Learning 

Authors: Aaron Kandel, Scott J. Moura 

Abstract: This paper presents a distributionally robust Q-Learning algorithm (DrQ) which leverages Wasserstein ambiguity sets to provide probabilistic out-of-sample safety guarantees during online learning. First, we follow past work by separating the constraint functions from the principal objective to create a hierarchy of machines within the constrained Markov decision process (CMDP). DrQ works within th… More 

Submitted 7 February, 2020; originally announced February 2020. 

Comments: 13 pages, 3 figures

Cited by 1 Related articles All 2 versions

arXiv:2002.08695  [pdf, other]  cs.LG math.OC stat.ML

Stochastic Optimization for Regularized Wasserstein Estimators

Authors: Marin Ballu, Quentin Berthet, Francis Bach

Abstract: Optimal transport is a foundational problem in optimization, that allows to compare probability distributions while taking into account geometric aspects. Its optimal objective value, the Wasserstein distance, provides an important loss between distributions that has been used in many applications throughout machine learning and statistics. Recent algorithmic progress on this problem and its regul… More

Submitted 20 February, 2020; originally announced February 2020.


arXiv:2002.08276  [pdf, other]   stat.ML cs.LG

Partial Gromov-Wasserstein with Applications on Positive-Unlabeled Learning

Authors: Laetitia Chapel, Mokhtar Z. Alaya, Gilles Gasso

Abstract: Optimal Transport (OT) framework allows defining similarity between probability distributions and provides metrics such as the Wasserstein and Gromov-Wasserstein discrepancies. Classical OT problem seeks a transportation map that preserves the total mass, requiring the mass of the source and target distributions to be the same. This may be too restrictive in certain applications such as color or s… More

Submitted 19 February, 2020; originally announced February 2020.


arXiv:2002.07501  [pdf, other  stat.ML cs.LG

A Wasserstein Minimum Velocity Approach to Learning Unnormalized Models

Authors: Ziyu Wang, Shuyu Cheng, Yueru Li, Jun Zhu, Bo Zhang

Abstract: Score matching provides an effective approach to learning flexible unnormalized models, but its scalability is limited by the need to evaluate a second-order derivative. In this paper, we present a scalable approximation to a general family of learning objectives including score matching, by observing a new connection between these objectives and Wasserstein gradient flows. We present applications… More

Submitted 18 February, 2020; originally announced February 2020.


arXiv:2002.07367  [pdf, other  stat.ML cs.LG stat.CO

Distributional Sliced-Wasserstein and Applications to Generative Modeling

Authors: Khai Nguyen, Nhat Ho, Tung Pham, Hung Bui

Abstract: Sliced-Wasserstein distance (SWD) and its variation, Max Sliced-Wasserstein distance (Max-SWD), have been widely used in the recent years due to their fast computation and scalability when the probability measures lie in very high dimension. However, these distances still have their weakness, SWD requires a lot of projection samples because it uses the uniform distribution to sample projecting dir… More

Submitted 17 February, 2020; originally announced February 2020.
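
For reference, a minimal NumPy sketch of the plain Monte Carlo sliced-Wasserstein distance that these papers build on (equal sample sizes and uniform random projections are assumed; the distributional and max variants discussed above replace the uniform choice of directions, which this sketch does not do):

    import numpy as np

    def sliced_wasserstein(X, Y, n_proj=100, p=2, seed=None):
        # X, Y: arrays of shape (N, d) with the same number of samples N.
        rng = np.random.default_rng(seed)
        d = X.shape[1]
        total = 0.0
        for _ in range(n_proj):
            theta = rng.normal(size=d)
            theta /= np.linalg.norm(theta)          # uniform direction on the sphere
            x1d = np.sort(X @ theta)
            y1d = np.sort(Y @ theta)                # 1-D OT matches sorted samples
            total += np.mean(np.abs(x1d - y1d) ** p)
        return (total / n_proj) ** (1.0 / p)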


2020


arXiv:2002.07261  [pdf, other  math.PR math.ST

Estimating processes in adapted Wasserstein distance

Authors: Julio Backhoff, Daniel Bartl, Mathias Beiglböck, Johannes Wiesel

Abstract: A number of researchers have independently introduced topologies on the set of laws of stochastic processes that extend the usual weak topology. Depending on the respective scientific background this was motivated by applications and connections to various areas (e.g. Plug-Pichler - stochastic programming, Hellwig - game theory, Aldous - stability of optimal stopping, Hoover-Keisler - model theory… More

Submitted 17 February, 2020; originally announced February 2020.

MSC Class: 60G42; 90C46; 58E30


arXiv:2002.07129  [pdf, other  [pdf, other]  math.CA math.AP

The existence of minimizers for an isoperimetric problem with Wasserstein penalty term in unbounded domains

Authors: Qinglan Xia, Bohan Zhou

Abstract: In this article, we consider the (double) minimization problem

$$\min\left\{P(E;\Omega)+\lambda W_p(E,F):\ E\subseteq\Omega,\ F\subseteq\mathbb{R}^d,\ \lvert E\cap F\rvert=0,\ \lvert E\rvert=\lvert F\rvert=1\right\},$$

where $p\ge 1$, $\Omega$ is a (possibly unbounded) domain in $\mathbb{R}^d$, $P(E;\Omega)$ denotes the relative perimeter of $E$ in $\Omega$ and $W_p$ denotes the $p$-Wasserstein distance. When $\Omega$ is unbounde… More

Submitted 17 February, 2020; originally announced February 2020.

MSC Class: 49J45; 49Q20; 49Q05; 49J20


2020  

arXiv:2002.06877  [pdf, ps, other]  math.PR

McKean-Vlasov SDEs with Drifts Discontinuous under Wasserstein Distance

Authors: Xing Huang, Feng-Yu Wang

Abstract: Existence and uniqueness are proved for McKean-Vlasov type distribution dependent SDEs with singular drifts satisfying an integrability condition in space variable and the Lipschitz condition in distribution variable with respect to $W_0$ or $W_0+W_\theta$ for some $\theta\ge 1$, where $W_0$ is the total variation distance and $W_\theta$ is the $L^\theta$-Wasserstein distance. This improves some existing results wher… More

Submitted 17 February, 2020; originally announced February 2020.

Comments: 14 pages

 Cited by 26 Related articles All 6 versions

arXiv:2002.06751  [pdf, other]  eess.SY

Second-order Conic Programming Approach for Wasserstein Distributionally Robust Two-stage Linear Programs

Authors: Zhuolin Wang, Keyou You, Shiji Song, Yuli Zhang

Abstract: This paper proposes a second-order conic programming (SOCP) approach to solve distributionally robust two-stage stochastic linear programs over 1-Wasserstein balls. We start from the case with distribution uncertainty only in the objective function and exactly reformulate it as an SOCP problem. Then, we study the case with distribution uncertainty only in constraints, and show that such a robust p… More

Submitted 16 February, 2020; originally announced February 2020.

Comments: AISTATS 2020

MR4068569 Prelim Wang, Zhuolin; You, Keyou; Song, Shiji; Zhang, Yuli; Wasserstein distributionally robust shortest path problem. European J. Oper. Res. 284 (2020), no. 1, 31–43. 90C35 (90B06 90C10)

 

arXiv:2002.06241  [pdf, other]  cs.CV cs.LG cs.MA cs.RO

Social-WaGDAT: Interaction-aware Trajectory Prediction via Wasserstein Graph Double-Attention Network

Authors: Jiachen Li, Hengbo Ma, Zhihao Zhang, Masayoshi Tomizuka

Abstract: Effective understanding of the environment and accurate trajectory prediction of surrounding dynamic obstacles are indispensable for intelligent mobile systems (like autonomous vehicles and social robots) to achieve safe and high-quality planning when they navigate in highly interactive and crowded scenarios. Due to the existence of frequent interactions and uncertainty in the scene evolution, it… More

Submitted 14 February, 2020; originally announced February 2020.

<——2020——————2020 ——————-70— 



Exponential convergence in the Wasserstein metric $W_1$ ...

www.aimsciences.org › article › doi › dcds.2020222

by L Cheng · 2020 · Cited by 1 — ... the semigroup $(P_t)$ of one-dimensional diffusion. ... Exponential convergence in the Wasserstein metric $W_1$ for one dimensional ... Discrete & Continuous Dynamical Systems, 2020, 40 (9): 5131-5148. doi: ... "$W_1$ for one dimensional diffusions" ...

Exponential convergence in the Wasserstein metric $W_1$ for one ...

www.researchgate.net › ... › Thermodynamics › Diffusion

Jan 4, 2021 — Request PDF | On Jan 1, 2020, Lingyan Cheng and others published ... Wasserstein metric $W_1$ for one ...

online

Exponential convergence in the Wasserstein metric $W_1$ for one...

by Cheng, Lingyan; Li, Ruinan; Wu, Liming

Discrete and continuous dynamical systems, 2020, Volume 40, Issue 9


Journal ArticleFull Text Online


A Rademacher-type theorem on L-2-Wasserstein spaces over closed Riemannian manifolds 

By: Dello Schiavo, Lorenzo 

JOURNAL OF FUNCTIONAL ANALYSIS  Volume: 278   Issue: 6     Article Number: UNSP 108397   Published: APR 1 2020 


 W-LDMM: A Wasserstein driven low-dimensional manifold model for noisy image restoration 

By: He, Ruiqiang; Feng, Xiangchu; Wang, Weiwei; et al.

NEUROCOMPUTING  Volume: 371   Pages: 108-123   Published: JAN 2 2020 

 Free Full Text from Publisher 

Computation - Neural Computation; Recent Findings in Neural Computation Described by Researchers from Xidian University (W-ldmm: a Wasserstein Driven Low-dimensional... 

Journal of Robotics & Machine Learning, Jan 6, 2020, 262

Newspaper Article:  Full Text Online 

Recent Findings in Neural Computation Described by Researchers from Xidian University (W-ldmm: a Wasser... 

Robotics & Machine Learning, 01/2020

NewsletterFull Text Online 

W-LDMM: A Wasserstein driven low-dimensional manifold model for noisy image restoration

R He, X Feng, W Wang, X Zhu, C Yang - Neurocomputing, 2020 - Elsevier

The Wasserstein distance originated from the optimal transport theory is a general and 

flexible statistical metric in a variety of image processing problems. In this paper, we propose 

a novel Wasserstein driven low-dimensional manifold model (W-LDMM), which tactfully …

All 2 versions 


[PDF] jst.go.jp

Orthogonal gradient penalty for fast training of wasserstein gan based multi-task autoencoder toward robust speech recognition

CY Kao, S Park, A Badi, DK Han… - IEICE TRANSACTIONS on …, 2020 - search.ieice.org

Performance in Automatic Speech Recognition (ASR) degrades dramatically in noisy

environments. To alleviate this problem, a variety of deep networks based on convolutional

neural networks and recurrent neural networks were proposed by applying L1 or L2 loss. In …

  Cited by 1 Related articles All 5 versions


Cross-Domain Text Sentiment Classification Based on Wasserstein Distance 

By: Cai, Guoyong; Lin, Qiang; Chen, Nannan 

Conference: 2nd International Conference on Security with Intelligent Computing and Big-data Services (SICBS) Location: Guilin, PEOPLES R CHINA Date: DEC 14-16, 2018 

SECURITY WITH INTELLIGENT COMPUTING AND BIG-DATA SERVICES  Book Series: Advances in Intelligent Systems and Computing   Volume: 895   Pages: 280-291   Published: 2020 


 2020


 Wasserstein Generative Adversarial Networks Based Data Augmentation for Radar Data Analysis

H Lee, J Kim, EK Kim, S Kim - Applied Sciences, 2020 - mdpi.com

Ground-based weather radar can observe a wide range with a high spatial and temporal resolution. They are beneficial to meteorological research and services by providing valuable information. Recent weather radar data related research has focused on applying machine learning and deep learning to solve complicated problems. It is a well-known fact that an adequate amount of data is a positively necessary condition in machine learning and deep learning. Generative adversarial networks (GANs) have received extensive attention …

Wasserstein Generative Adversarial Networks Based Data Augmentation for... 

by Lee, Hansoo; Kim, Jonggeun; Kim, Eun Kyeong; More... 

Applied Sciences, 02/2020, Volume 10, Issue 4

Ground-based weather radar can observe a wide range with a high spatial and temporal resolution. They are beneficial to meteorological research and services by...

Journal Article: Full Text Online 

Machine Learning; Pusan National University Researchers Have Published New Data on Machine Learning (Wasserstein Generative Adversarial Networks Based Data... 

Robotics & Machine Learning, Mar 16, 2020, 472

Newspaper Article: Full Text Online 


The existence of minimizers for an isoperimetric problem with Wasserstein penalty term in unbounded domains

Q Xia, B Zhou - arXiv preprint arXiv:2002.07129, 2020 - arxiv.org

In this article, we consider the (double) minimization problem $$\min\left\{P (E;\Omega)+\lambda W_p (E, F):~ E\subseteq\Omega,~ F\subseteq\mathbb {R}^ d,~\lvert E\cap F\rvert= 0,~\lvert E\rvert=\lvert F\rvert= 1\right\}, $$ where $ p\geqslant 1$, $\Omega $ is a (possibly unbounded) domain in $\mathbb {R}^ d $, $ P (E;\Omega) $ denotes the relative perimeter of $ E $ in $\Omega $ and $ W_p $ denotes the $ p $-Wasserstein distance. When $\Omega $ is unbounded and $ d\geqslant 3$, it is an open problem …

The existence of minimizers for an isoperimetric problem with Wasserstein... 

by Xia, Qinglan; Zhou, Bohan 

02/2020

In this article, we consider the (double) minimization problem $$\min\left\{P(E;\Omega)+\lambda W_p(E,F):~E\subseteq\Omega,~F\subseteq \mathbb{R}^d,~\lvert...

Journal Article: Full Text Online 


Limit Distribution Theory for Smooth Wasserstein Distance with Applications to Generative Modeling

Z Goldfeld, K Kato - arXiv preprint arXiv:2002.01012, 2020 - arxiv.org

The 1-Wasserstein distance ($\mathsf {W} _1 $) is a popular proximity measure between probability distributions. Its metric structure, robustness to support mismatch, and rich geometric structure fueled its wide adoption for machine learning tasks. Such tasks inherently rely on approximating distributions from data. This surfaces a central issue--empirical approximation under Wasserstein distances suffers from the curse of dimensionality, converging at rate $ n^{-1/d} $ where $ n $ is the sample size and $ d $ is …

Limit Distribution Theory for Smooth Wasserstein Distance with Applications... 

by Goldfeld, Ziv; Kato, Kengo 

02/2020

The 1-Wasserstein distance ($\mathsf{W}_1$) is a popular proximity measure between probability distributions. Its metric structure, robustness to support...

Journal Article: Full Text Online 

 Optimal Transport and Wasserstein Distance 1 Intro

arXiv

Limit Distribution Theory for Smooth Wasserstein Distance with Applications to Generative Modeling 

by Goldfeld, Ziv; Kato, Kengo 

02/2020

The 1-Wasserstein distance ($\mathsf{W}_1$) is a popular proximity measure between probability distributions. Its metric structure, robustness to support...

Journal Article:  Full Text Online

 arXiv:2002.10543  [pdf, other  cs.LG stat.ML 

Variational Wasserstein Barycenters for Geometric Clustering 

Authors: Liang Mi, Tianshu Yu, Jose Bento, Wen Zhang, Baoxin Li, Yalin Wang 

Abstract: We propose to compute Wasserstein barycenters (WBs) by solving for Monge maps with variational principle. We discuss the metric properties of WBs and explore their connections, especially the connections of Monge WBs, to K-means clustering and co-clustering. We also discuss the feasibility of Monge WBs on unbalanced measures and spherical domains. We propose two new problems -- regularized K-means… More 

Submitted 24 February, 2020; originally announced February 2020. 

Cited by 2 Related articles All 2 versions

arXiv:2002.10157  [pdf, ps, other  math.PR 

Infinite-dimensional regularization of McKean-Vlasov equation with a Wasserstein diffusion 

Authors: Victor Marx 

Abstract: Much effort has been spent in recent years on restoring uniqueness of McKean-Vlasov SDEs with non-smooth coefficients. As a typical instance, the velocity field is assumed to be bounded and measurable in its space variable and Lipschitz-continuous with respect to the distance in total variation in its measure variable, see [Jourdain, Mishura-Veretennikov]. In contrast with those works, we consider… More 

Submitted 24 February, 2020; originally announced February 2020. 

<——2020———————2020 —————-80—  


System and method for unsupervised domain adaptation via sliced-wasserstein distance

AJ Gabourie, M Rostami, S Kolouri… - US Patent App. 16 …, 2020 - freepatentsonline.com

Described is a system for unsupervised domain adaptation in an autonomous learning

agent. The system adapts a learned model with a set of unlabeled data from a target

domain, resulting in an adapted model. The learned model was previously trained to …

  Cited by 2 Related articles All 2 versions 

arXiv:2002.09221  [pdf, ps, other 

math.FA math.PR 

Self-improvement of the Bakry-Emery criterion for Poincaré inequalities and Wasserstein contraction using variable curvature bounds 

Authors: Patrick Cattiaux, Max Fathi, Arnaud Guillin 

Abstract: We study Poincaré inequalities and long-time behavior for diffusion processes on R^n under a variable curvature lower bound, in the sense of Bakry-Emery. We derive various estimates on the rate of convergence to equilibrium in L^1 optimal transport distance, as well as bounds on the constant in the Poincaré inequality in several situations of interest, including some where curvature may be neg… More 

Submitted 21 February, 2020; originally announced February 2020. 


arXiv:2002.08695  [pdf, other 

cs.LG math.OC stat.ML 

Stochastic Optimization for Regularized Wasserstein Estimators 

Authors: Marin Ballu, Quentin Berthet, Francis Bach 

Abstract: Optimal transport is a foundational problem in optimization, that allows to compare probability distributions while taking into account geometric aspects. Its optimal objective value, the Wasserstein distance, provides an important loss between distributions that has been used in many applications throughout machine learning and statistics. Recent algorithmic progress on this problem and its regul… More 

Submitted 20 February, 2020; originally announced February 2020. 

Cited by 13 Related articles All 13 versions
Stochastic Optimization for Regularized Wasserstein Estimators

slideslive.com › stochastic-optimization-for-regularized-w...


Its optimal objective value, the Wasserstein... ... areas such as machine vision, computational biology, speech recognition, and robotics.

SlidesLive · 

Jul 12, 2020
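
For orientation, the entropic-regularized transport problem referred to here is usually solved by Sinkhorn iterations; a minimal NumPy sketch follows (illustrative only, not the stochastic estimator proposed in the paper; all names are mine):

    import numpy as np

    def sinkhorn_cost(a, b, C, eps=0.1, iters=500):
        # a, b: histograms (nonnegative, summing to 1); C: cost matrix.
        a, b, C = np.asarray(a, float), np.asarray(b, float), np.asarray(C, float)
        K = np.exp(-C / eps)                   # Gibbs kernel
        u = np.ones_like(a)
        for _ in range(iters):
            v = b / (K.T @ u)
            u = a / (K @ v)
        P = u[:, None] * K * v[None, :]        # regularized optimal coupling
        return float(np.sum(P * C))            # transport cost under that coupling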


arXiv:2002.08276  [pdf, other 

stat.ML cs.LG 

Partial Gromov-Wasserstein with Applications on Positive-Unlabeled Learning 

Authors: Laetitia Chapel, Mokhtar Z. Alaya, Gilles Gasso 

Abstract: Optimal Transport (OT) framework allows defining similarity between probability distributions and provides metrics such as the Wasserstein and Gromov-Wasserstein discrepancies. Classical OT problem seeks a transportation map that preserves the total mass, requiring the mass of the source and target distributions to be the same. This may be too restrictive in certain applications such as color or s… More 

Submitted 19 February, 2020; originally announced February 2020. 


arXiv:2002.07501  [pdf, other 

stat.ML cs.LG 

A Wasserstein Minimum Velocity Approach to Learning Unnormalized Models 

Authors: Ziyu Wang, Shuyu Cheng, Yueru Li, Jun Zhu, Bo Zhang 

Abstract: Score matching provides an effective approach to learning flexible unnormalized models, but its scalability is limited by the need to evaluate a second-order derivative. In this paper, we present a scalable approximation to a general family of learning objectives including score matching, by observing a new connection between these objectives and Wasserstein gradient flows. We present applications… More 

Submitted 18 February, 2020; originally announced February 2020. 

Comments: AISTATS 2020 


2020

arXiv:2002.07367  [pdf, other 

stat.ML cs.LG stat.CO 

Distributional Sliced-Wasserstein and Applications to Generative Modeling 

Authors: Khai Nguyen, Nhat Ho, Tung Pham, Hung Bui 

Abstract: Sliced-Wasserstein distance (SWD) and its variation, Max Sliced-Wasserstein distance (Max-SWD), have been widely used in the recent years due to their fast computation and scalability when the probability measures lie in very high dimension. However, these distances still have their weakness, SWD requires a lot of projection samples because it uses the uniform distribution to sample projecting dir… More 

Submitted 17 February, 2020; originally announced February 2020. 

arXiv:2002.07261  [pdf, other 

Cited by 32 Related articles All 12 versions

Distributional Sliced-Wasserstein and Applications to ...

openreview.net › forum

Sliced-Wasserstein distance (SW) and its variant, Max Sliced-Wasserstein ... (https://github.com/ChengzijunAixiaoli/PPMM/blob/master/color%20transfer.ipynb).


math.PR math.ST 

Estimating processes in adapted Wasserstein distance 

Authors: Julio Backhoff, Daniel Bartl, Mathias Beiglböck, Johannes Wiesel 

Abstract: A number of researchers have independently introduced topologies on the set of laws of stochastic processes that extend the usual weak topology. Depending on the respective scientific background this was motivated by applications and connections to various areas (e.g. Plug-Pichler - stochastic programming, Hellwig - game theory, Aldous - stability of optimal stopping, Hoover-Keisler - model theory… More 

Submitted 17 February, 2020; originally announced February 2020. 

MSC Class: 60G42; 90C46; 58E30 

arXiv:2002.07129  [pdf, other 

 

[PDF] umn.edu

Data-driven Distributionally Robust Stochastic Optimization via Wasserstein Distance with Applications to Portfolio Risk Management and Inventory Control

D Singh - 2020 - conservancy.umn.edu

The central theme of this dissertation is stochastic optimization under distributional

ambiguity. One canthink of this as a two player game between a decision maker, who tries to

minimize some loss or maximize some reward, and an adversarial agent that chooses the …

  All 2 versions 

 

Input limited Wasserstein GAN

F Cao, H Zhao, P Liu, P Li - Second Target Recognition and …, 2020 - spiedigitallibrary.org

Generative adversarial networks (GANs) has proven hugely successful, but suffer from train 

instability. The recently proposed Wasserstein GAN (WGAN) has largely overcome the 

problem, but can still fail to converge in some case or be to complex. It has been found that …


arXiv:2003.02421  [pdf, other 

stat.ME math.DS 

Regularized Variational Data Assimilation for Bias Treatment using the Wasserstein Metric 

Authors: Sagar K. Tamang, Ardeshir Ebtehaj, Dongmian Zou, Gilad Lerman 

Abstract: This paper presents a new variational data assimilation (VDA) approach for the formal treatment of bias in both model outputs and observations. This approach relies on the Wasserstein metric stemming from the theory of optimal mass transport to penalize the distance between the probability histograms of the analysis state and an a priori reference dataset, which is likely to be more uncertain but… More 

Submitted 4 March, 2020; originally announced March 2020. 

Comments: 7 figures 

Information Technology - Information and Data Aggregation; Researchers from University of Minnesota Twin Cities Provide Details of New Studies and Findings in the Area of Information and Data Aggregation (Regularized Variational Data Assimilation for Bias Treatment Using the Wasserstein...

Computers, Networks & Communications, Aug 13, 2020, 708

Newspaper ArticleCitation Online

Researchers from University of Minnesota Twin Cities Provide Details of New Studies and Findings in the Area of Information and Data Aggregation (Regularized Variational Data Assimilation for Bias Treatment Using the Wasserstein...

Information Technology Newsweekly, 08/2020

NewsletterFull Text Online

 <——2020——————2020 ——————-90—

 

arXiv:2003.00389  [pdf, other  cs.CV 

Joint Wasserstein Distribution Matching 

Authors: Jiezhang Cao, Langyuan Mo, Qing Du, Yong Guo, Peilin Zhao, Junzhou Huang, Mingkui Tan 

Abstract: Joint distribution matching (JDM) problem, which aims to learn bidirectional mappings to match joint distributions of two domains, occurs in many machine learning and computer vision applications. This problem, however, is very difficult due to two critical challenges: (i) it is often difficult to exploit sufficient information from the joint distribution to conduct the matching; (ii) this problem… More 

Submitted 29 February, 2020; originally announced March 2020. 

Comments: This paper is accepted by Chinese Journal of Computers in 2020 

Related articles All 3 versions

arXiv:2003.00134  [pdf, other 

cs.IR cs.CV cs.LG 

Image Hashing by Minimizing Independent Relaxed Wasserstein Distance 

Authors: Khoa D. Doan, Amir Kimiyaie, Saurav Manchanda, Chandan K. Reddy 

Abstract: Image hashing is a fundamental problem in the computer vision domain with various challenges, primarily, in terms of efficiency and effectiveness. Existing hashing methods lack a principled characterization of the goodness of the hash codes and a principled approach to learn the discrete hash functions that are being optimized in the continuous space. Adversarial autoencoders are shown to be able… More 

Submitted 28 February, 2020; originally announced March 2020. 

Cited by 2 Related articles All 2 versions

РАСПРЕДЕЛЕННОЕ ВЫЧИСЛЕНИЕ БАРИЦЕНТРА ВАСЕРШТЕЙНА

DM Dvinskikh - soc-phys.ipu.ru

Quantitative models and methods in the study of complex networks … Dvinskikh D. M. (Moscow Institute of Physics and Technology, Moscow; Skolkovo Institute of Science and Technology, Moscow) … We define the entropy-regularized Wasserstein distance induced by …

Related articles 

[Russian: Distributed computation of the Wasserstein barycenter]

Updated pdf 2021:

РАСПРЕДЕЛЕННОЕ ВЫЧИСЛЕНИЕ БАРИЦЕНТРА ВАСЕРШТЕЙНА [Russian: Distributed computation of the Wasserstein barycenter]

DM Dvinskikh - soc-phys.ipu.ru

Quantitative models and methods in the study of complex networks … Dvinskikh D. M. (Moscow Institute of Physics and Technology, Moscow; Skolkovo Institute of Science and Technology, Moscow) … We define the entropy-regularized Wasserstein distance induced by …

Related articles
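The entries above concern distributed, entropy-regularized computation of Wasserstein barycenters; as a much simpler point of reference, on the real line the 2-Wasserstein barycenter can be computed exactly by averaging quantile functions. The sketch below is illustrative only (barycenter_1d is not a name from the cited work) and returns a quantile-grid representation of the barycenter of several 1D samples.

import numpy as np

def barycenter_1d(samples, weights=None, grid_size=200):
    # W2 barycenter of 1D empirical measures: its quantile function is the
    # weighted average of the input quantile functions.
    if weights is None:
        weights = np.ones(len(samples)) / len(samples)
    qs = np.linspace(0.0, 1.0, grid_size)
    quantiles = np.stack([np.quantile(s, qs) for s in samples])  # shape (k, grid)
    return np.asarray(weights) @ quantiles

# toy usage: the barycenter of two shifted Gaussians is centered halfway between them
rng = np.random.default_rng(0)
bary = barycenter_1d([rng.normal(-2, 1, 1000), rng.normal(2, 1, 1000)])
print(bary.mean())   # close to 0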

2020 book

An Invitation to Statistics in Wasserstein Space. Authors: Victor M. Panaretos, Yoav Zemel

Open Access Book IWFOS 2020 

ISBN 3030384381, 9783030384388

An Invitation to Statistics  in Wasserstein Space 

by Panaretos, Victor M; Zemel, Yoav 

This open access book presents the key aspects of statistics in Wasserstein spaces, i.e. statistics in the space of probability measures when endowed with the...

eBook  Full Text Online 

Cited by 63 Related articles All 8 versions

arXiv:2003.06725  [pdf, other 

math.OC math.ST 

Wasserstein Distance to Independence Models 

Authors: Türkü Özlüm Çelik, Asgar Jamneshan, Guido Montúfar, Bernd Sturmfels, Lorenzo Venturello 

Abstract: An independence model for discrete random variables is a Segre-Veronese variety in a probability simplex. Any metric on the set of joint states of the random variables induces a Wasserstein metric on the probability simplex. The unit ball of this polyhedral norm is dual to the Lipschitz polytope. Given any data distribution, we seek to minimize its Wasserstein distance to a fixed independence mode… More 

Submitted 14 March, 2020; originally announced March 2020. 

MSC Class: Polynomial Optimization; Algebraic Statistics; Computational Algebraic Geometry 


2020

arXiv:2003.06684  [pdf, other 

stat.ME 

Multivariate goodness-of-Fit tests based on Wasserstein distance 

Authors: Marc Hallin, Gilles Mordant, Johan Segers 

Abstract: Goodness-of-fit tests based on the empirical Wasserstein distance are proposed for simple and composite null hypotheses involving general multivariate distributions. This includes the important problem of testing for multivariate normality with unspecified mean vector and covariance matrix and, more generally, testing for elliptical symmetry with given standard radial density and unspecified locat… More 

Submitted 14 March, 2020; originally announced March 2020. 

Comments: 37 pages, 8 figures 

MSC Class: 62G30 

  Cited by 4 Related articles All 10 versions 
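The paper above treats general multivariate null hypotheses; as a hedged one-dimensional illustration of the idea, the sketch below computes a Wasserstein goodness-of-fit statistic for normality by comparing the sample with a Monte-Carlo draw from the normal distribution with matching moments. The names (w1_empirical, normality_gof_stat) and the parametric-bootstrap calibration mentioned in the comments are illustrative assumptions, not the authors' procedure.

import numpy as np

def w1_empirical(x, y):
    # Exact W1 between two equal-size 1D samples via sorted matching.
    return np.mean(np.abs(np.sort(x) - np.sort(y)))

def normality_gof_stat(x, seed=0):
    # Distance between the sample and a draw from N(mean(x), var(x)).
    rng = np.random.default_rng(seed)
    fitted = rng.normal(x.mean(), x.std(ddof=1), size=x.size)
    return w1_empirical(x, fitted)

# toy usage: heavy-tailed data should give a larger statistic than Gaussian data;
# a null distribution can be approximated by recomputing the statistic on samples
# simulated from the fitted normal (parametric bootstrap).
rng = np.random.default_rng(1)
print(normality_gof_stat(rng.standard_t(df=3, size=500)))
print(normality_gof_stat(rng.normal(size=500)))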


arXiv:2003.06048  [pdf, other 

cs.LG stat.ML 

Wasserstein-based Graph Alignment 

Authors: Hermina Petric Maretic, Mireille El Gheche, Matthias Minder, Giovanni Chierchia, Pascal Frossard 

Abstract: We propose a novel method for comparing non-aligned graphs of different sizes, based on the Wasserstein distance between graph signal distributions induced by the respective graph Laplacian matrices. Specifically, we cast a new formulation for the one-to-many graph alignment problem, which aims at matching a node in the smaller graph with one or more nodes in the larger graph. By integrating optim… More 

Submitted 12 March, 2020; originally announced March 2020. 

 Cited by 10 Related articles All 3 versions

arXiv:2003.05599  [pdf, other 

math.ST 

Posterior asymptotics in Wasserstein metrics on the real line 

Authors: Minwoo Chae, Pierpaolo De Blasi, Stephen G. Walker 

Abstract: In this paper, we use the class of Wasserstein metrics to study asymptotic properties of posterior distributions. Our first goal is to provide sufficient conditions for posterior consistency. In addition to the well-known Schwartz's Kullback--Leibler condition on the prior, the true distribution and most probability measures in the support of the prior are required to possess moments up to an orde… More 

Submitted 11 March, 2020; originally announced March 2020. 

Comments: 37 pages, 4 figures 

MSC Class: 62F15; 62G20; 62G07 


arXiv:2003.05479  [pdf, ps, other 

math.ST 

Wasserstein statistics in 1D location-scale model 

Authors: Shun-ichi Amari 

Abstract: Wasserstein geometry and information geometry are two important structures introduced in a manifold of probability distributions. The former is defined by using the transportation cost between two distributions, so it reflects the metric structure of the base manifold on which distributions are defined. Information geometry is constructed based on the invariance criterion that the geometry is inva… More 

Submitted 5 March, 2020; originally announced March 2020. 

Comments: 14 pages, 2 figures
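For context on the location-scale setting studied above: within the Gaussian location-scale family the 2-Wasserstein distance has the closed form W2(N(mu1, s1^2), N(mu2, s2^2)) = sqrt((mu1 - mu2)^2 + (s1 - s2)^2). The short check below (w2_gaussian_1d is an illustrative name, not code from the paper) compares this formula with the empirical sorted-sample estimate.

import numpy as np

def w2_gaussian_1d(mu1, s1, mu2, s2):
    # Closed-form W2 between two one-dimensional Gaussians.
    return np.sqrt((mu1 - mu2) ** 2 + (s1 - s2) ** 2)

# sanity check against the empirical estimate from sorted samples
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 100_000)
y = rng.normal(2.0, 3.0, 100_000)
empirical = np.sqrt(np.mean((np.sort(x) - np.sort(y)) ** 2))
print(w2_gaussian_1d(0.0, 1.0, 2.0, 3.0), empirical)   # both close to 2.83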

arXiv:2003.04874  [pdf, ps, other 

math.OC eess.SY 

Wasserstein Distributionally Robust Look-Ahead Economic Dispatch 

Authors: Bala Kameshwar Poolla, Ashish R. Hota, Saverio Bolognani, Duncan S. Callaway, Ashish Cherukuri 

Abstract: We present two data-driven distributionally robust optimization formulations for the look-ahead economic dispatch (LAED) problem with uncertain renewable energy generation. In particular, the goal is to minimize the cost of conventional energy generation subject to uncertain operational constraints. Furthermore, these constraints are required to hold for a family of distributions with similar char… More 

Submitted 10 March, 2020; originally announced March 2020. 

Cited by 13 Related articles All 8 versions

<——2020———————2020 —————-100— 

 

arXiv:2003.04033  [pdf, other 

cs.LG cs.AI stat.ML 

When can Wasserstein GANs minimize Wasserstein Distance? 

Authors: Yuanzhi Li, Zehao Dou 

Abstract: Generative Adversarial Networks (GANs) are widely used models to learn complex real-world distributions. In GANs, the training of the generator usually stops when the discriminator can no longer distinguish the generator's output from the set of training examples. A central question of GANs is that when the training stops, whether the generated distribution is actually close to the target distribu… More 

Submitted 9 March, 2020; originally announced March 2020. 

Comments: 45 pages 

[PDF] arxiv.org

When can Wasserstein GANs minimize Wasserstein Distance?

Y Li, Z Dou - arXiv preprint arXiv:2003.04033, 2020 - arxiv.org

Generative Adversarial Networks (GANs) are widely used models to learn complex real-

world distributions. In GANs, the training of the generator usually stops when the 

discriminator can no longer distinguish the generator's output from the set of training …

Cited by 2 Related articles All 3 versions
When can Wasserstein GANs minimize Wasserstein Distance?
 

by Li, Yuanzhi; Dou, Zehao 

03/2020

Generative Adversarial Networks (GANs) are widely used models to learn complex real-world distributions. In GANs, the training of the generator usually stops...

Journal ArticleFull Text Online 
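For readers new to the WGAN objective discussed in the entries above, here is a hedged PyTorch sketch of the standard Wasserstein critic loss with gradient penalty (the WGAN-GP formulation), not the specific construction analyzed in this paper; critic_loss_wgan_gp and the toy critic are illustrative names.

import torch
import torch.nn as nn

def critic_loss_wgan_gp(critic, real, fake, lambda_gp=10.0):
    # Wasserstein critic loss E[D(fake)] - E[D(real)] plus a gradient penalty
    # pushing the critic toward the 1-Lipschitz constraint of the dual form.
    loss_w = critic(fake).mean() - critic(real).mean()
    eps = torch.rand(real.size(0), 1, device=real.device)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    grad, = torch.autograd.grad(critic(interp).sum(), interp, create_graph=True)
    penalty = ((grad.norm(2, dim=1) - 1.0) ** 2).mean()
    return loss_w + lambda_gp * penalty

# toy usage on 2-dimensional data with a tiny critic network
critic = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))
real = torch.randn(128, 2) + 3.0
fake = torch.randn(128, 2)
print(critic_loss_wgan_gp(critic, real, fake).item())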

arXiv:2003.03803  [pdf, other 

math.NA 

 
[PDF] neurips.cc

[PDF] Asymptotic Guarantees for Generative Modeling Based on the Smooth Wasserstein Distance

Z Goldfeld, K Greenewald… - Advances in Neural …, 2020 - proceedings.neurips.cc

Asymptotic Guarantees for Generative Modeling Based on the Smooth Wasserstein Distance …

In this work, we conduct a thorough statistical study of the minimum smooth Wasserstein estimators

(MSWEs), first proving the estimator's measurability and asymptotic consistency … 

All 2 versions


Science - Management Science; Reports Outline Management Science Study Findings from Xi'an University of Technology... 

Science Letter, Mar 6, 2020, 1386

Newspaper Article: Full Text Online 

Reports Outline Management Science Study Findings from Xi'an University of Technology (Data Supplement for a Soft Sensor Using a New Generative Model Based On a Variational Au

Reports Outline Management Science Study Findings from Xi'an University of Technology (Data Supplement for a Soft Sensor Using a New Generative Model Based On a Variational... 

Science Letter, 03/2020

Newsletter: Full Text Online 


[PDF] openreview.net

[PDF] Sliced-Wasserstein Autoencoder: An Embarrassingly Simple Generative Model

ESG Model - openreview.net

In this paper we study generative modeling via autoencoders while using the elegant 

geometric properties of the optimal transport (OT) problem and the Wasserstein distances. 

We introduce Sliced-Wasserstein Autoencoders (SWAE), which are generative models that …

Related articles All 2 versions 2020


[PDF] arxiv.org

A Rademacher-type theorem on L2-Wasserstein spaces over closed Riemannian manifolds

LD Schiavo - Journal of Functional Analysis, 2020 - Elsevier

Let P be any Borel probability measure on the L 2-Wasserstein space (P 2 (M), W 2) over a 

closed Riemannian manifold M. We consider the Dirichlet form E induced by P and by the 

Wasserstein gradient on P 2 (M). Under natural assumptions on P, we show that W 2 …

Related articles All 4 versions 

   

 arXiv:2003.13976  [pdf, ps, other 

math.PR 

On Stein's factors for Poisson approximation in Wasserstein distance with non-linear transportation costs 

Authors: Zhong-Wei Liao, Yutao Ma, Aihua Xia 

Abstract: We establish various bounds on the solutions to a Stein equation for Poisson approximation in Wasserstein distance with non-linear transportation costs. The proofs are a refinement of those in [Barbour and Xia (2006)] using the results in [Liu and Ma (2009)]. As a corollary, we obtain an estimate of Poisson approximation error measured in L^2-Wasserstein distance. 

Submitted 31 March, 2020; originally announced March 2020. 

Comments: 21 pages 

MSC Class: 60F05; 60E15; 60J27 


arXiv:2003.13258  [pdf, other 

eess.SY math.OC 

Minimax control of ambiguous linear stochastic systems using the Wasserstein metric 

Authors: Kihyun Kim, Insoon Yang 

Abstract: In this paper, we propose a minimax linear-quadratic control method to address the issue of inaccurate distribution information in practical stochastic systems. To construct a control policy that is robust against errors in an empirical distribution of uncertainty, our method is to adopt an adversary, which selects the worst-case distribution. To systematically adjust the conservativeness of our m… More 

Submitted 30 March, 2020; originally announced March 2020. 

Cited by 4 Related articles All 4 versions

arXiv:2003.12685  [pdf, ps, other 

math.OC 

Distributionally Robust Chance-Constrained Programs with Right-Hand Side Uncertainty under Wasserstein Ambiguity 

Authors: Nam Ho-Nguyen, Fatma Kılınç-Karzan, Simge Küçükyavuz, Dabeen Lee 

Abstract: We consider exact deterministic mixed-integer programming (MIP) reformulations of distributionally robust chance-constrained programs (DR-CCP) with random right-hand sides over Wasserstein ambiguity sets. The existing MIP formulations are known to have weak continuous relaxation bounds, and, consequently, for hard instances with small radius, or with a large number of scenarios, the branch-and-bou… More 

Submitted 27 March, 2020; originally announced March 2020. 

Comments: 21 pages 

MSC Class: 90C11; 90C15 

[PDF] arxiv.org

Distributionally Robust Chance-Constrained Programs with Right-Hand Side Uncertainty under Wasserstein Ambiguity

N Ho-Nguyen, F Kılınç-Karzan, S Küçükyavuz… - arXiv preprint arXiv …, 2020 - arxiv.org

We consider exact deterministic mixed-integer programming (MIP) reformulations of

distributionally robust chance-constrained programs (DR-CCP) with random right-hand

sides over Wasserstein ambiguity sets. The existing MIP formulations are known to have …

  Cited by 6 Related articles All 5 versions



arXiv:2003.11403  [pdf, ps, other 

cs.LG eess.SY math.OC math.PR stat.ML 

Convergence of Recursive Stochastic Algorithms using Wasserstein Divergence 

Authors: Abhishek Gupta, William B. Haskell 

Abstract: This paper develops a unified framework, based on iterated random operator theory, to analyze the convergence of constant stepsize recursive stochastic algorithms (RSAs) in machine learning and reinforcement learning. RSAs use randomization to efficiently compute expectations, and so their iterates form a stochastic process. The key idea is to lift the RSA into an appropriate higher-dimensional sp… More 

Submitted 25 March, 2020; originally announced March 2020. 

Comments: 32 pages, submitted to SIMODS 

MSC Class: 93E35; 60J20; 68Q32 

 Related articles All 2 versions 

<——2020———————2020 ———————-110—   


arXiv:2003.10590  [pdf, ps, other]  math.PR 

Convergence rate to equilibrium in Wasserstein distance for reflected jump-diffusions 

Authors: Andrey Sarantsev 

Abstract: Convergence rate to the stationary distribution for continuous-time Markov processes can be studied using Lyapunov functions. Recent work by the author provided explicit rates of convergence in special case of a reflected jump-diffusion on a half-line. These results are proved for total variation distance and its generalizations: measure distances defined by test functions regardless of their cont… More 

Submitted 23 March, 2020; originally announced March 2020. 

Comments: 9 pages. Keywords: Lyapunov functions, convergence rate, Wasserstein distance, coupling, jump-diffusions 

MSC Class: 60J51; 60H10; 60J60 

MR4118939 Prelim Sarantsev, Andrey; Convergence rate to equilibrium in Wasserstein distance for reflected jump-diffusions. Statist. Probab. Lett. 165 (2020), 108860. 60J60 (60J76)

Review PDF Clipboard Journal Article 

STATISTICS & PROBABILITY LETTERS  Volume: ‏ 165     Article Number: 108860   Published: ‏ OCT 2020

arXiv:2003.07880  [pdf, ps, other 

math.DS math.OC math.PR math.ST 

High-Confidence Attack Detection via Wasserstein-Metric Computations 

Authors: Dan Li, Sonia Martínez 

Abstract: This paper considers a sensor attack and fault detection problem for linear cyber-physical systems, which are subject to possibly non-Gaussian noise that can have an unknown light-tailed distribution. We propose a new threshold-based detection mechanism that employs the Wasserstein metric, and which guarantees system performance with high confidence. The proposed detector may generate false alarms… More 

Submitted 17 March, 2020; originally announced March 2020. 

Comments: Submitted to Control system letters 

Cited by 1 Related articles All 5 versions


[PDF] arxiv.org

Wasserstein k-means with sparse simplex projection

T Fukunaga, H Kasai - arXiv preprint arXiv:2011.12542, 2020 - arxiv.org

This paper presents a proposal of a faster Wasserstein $ k $-means algorithm for histogram

data by reducing Wasserstein distance computations and exploiting sparse simplex

projection. We shrink data samples, centroids, and the ground cost matrix, which leads to …

  All 2 versions 
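The paper above speeds up Wasserstein k-means by shrinking the data and using a sparse simplex projection. The sketch below shows the standard Euclidean projection onto the probability simplex together with one plausible top-k sparse variant; both function names and the top-k reading of the sparse step are assumptions for illustration, not the authors' exact algorithm.

import numpy as np

def simplex_projection(v):
    # Euclidean projection of v onto the probability simplex {x >= 0, sum(x) = 1}.
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    idx = np.arange(1, v.size + 1)
    rho = np.nonzero(u * idx > (css - 1.0))[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

def sparse_simplex_projection(v, k):
    # Keep only the k largest entries, project them onto the simplex, zero the rest.
    out = np.zeros_like(v, dtype=float)
    top = np.argsort(v)[-k:]
    out[top] = simplex_projection(v[top].astype(float))
    return out

print(sparse_simplex_projection(np.array([0.4, 0.1, 0.3, 0.05, 0.15]), k=2))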


arXiv:2003.08295  [pdf]  bcs.NE 

Many-Objective Estimation of Distribution Optimization Algorithm Based on WGAN-GP 

Authors: Zhenyu Liang, Yunfan Li, Zhongwei Wan 

Abstract: Estimation of distribution algorithms (EDA) are stochastic optimization algorithms. EDA establishes a probability model to describe the distribution of solution from the perspective of population macroscopically by statistical learning method, and then randomly samples the probability model to generate a new population. EDA can better solve multi-objective optimal problems (MOPs). However, the per… More 

Submitted 15 March, 2020; originally announced March 2020. 

Comments: arXiv admin note: substantial text overlap with arXiv:2003.07013 

Related articles All 2 versions

2020 see 2019  [HTML] springer.com

[HTML] The Wasserstein Space

VM Panaretos, Y Zemel - International Workshop on Functional and …, 2020 - Springer

The Kantorovich problem described in the previous chapter gives rise to a metric structure, the Wasserstein distance, in the space of probability measures P(X) on a space X. The resulting metric space, a subspace of P(X), is …

Related articles


2020

  

[PDF] arxiv.org

Isometric study of Wasserstein spaces---the real line

GP Gehér, T Titkos, D Virosztek - arXiv preprint arXiv:2002.00859, 2020 - arxiv.org

Recently Kloeckner described the structure of the isometry group of the quadratic Wasserstein space W_2(R^n). It turned out that the case of the real line is exceptional in the sense that there exists an exotic isometry flow. Following …

Cited by 1

Isometric study of Wasserstein spaces – the real line 

by Gehér, György Pál; Titkos, Tamás; Virosztek, Dániel 

Transactions of the American Mathematical Society, 05/2020, Volume 373, Issue 8

Journal ArticleFull Text Online 

ISOMETRIC STUDY OF WASSERSTEIN SPACES - THE REAL LINE

By: Geher, Gyorgy Pal; Titkos, Tamas; Virosztek, Daniel

TRANSACTIONS OF THE AMERICAN MATHEMATICAL SOCIETY   Volume: ‏ 373   Issue: ‏ 8   Pages: ‏ 5855-5883   Published: ‏ AUG 2020

Get It Penn State Free Accepted Article From Repository

 arXiv:2002.00859  [pdf, ps, other 

math.MG math-ph math.FA math.PR 

Isometric study of Wasserstein spaces --- the real line 

Authors: György Pál Gehér, Tamás Titkos, Dániel Virosztek 

Abstract: Recently Kloeckner described the structure of the isometry group of the quadratic Wasserstein space W_2(R^n). It turned out that the case of the real line is exceptional in the sense that there exists an exotic isometry flow. Following this line of investigation, we compute Isom(W_p(R)), the isometry group of the Wasserstein… More 

Submitted 3 February, 2020; originally announced February 2020. 

Comments: 32 pages, 7 figures. Accepted for publication in Trans. Amer. Math. Soc 

MSC Class: Primary: 54E40; 46E27. Secondary: 60A10; 60B05


[PDF] ethz.ch

[PDF] Smooth Wasserstein Distance: Metric Structure and Statistical Efficiency

Z Goldfeld - International Zurich Seminar on Information …, 2020 - research-collection.ethz.ch

The Wasserstein distance has seen a surge of interest and applications in machine learning. Its popularity is driven by many advantageous properties it possesses, such as metric structure (metrization of weak convergence), robustness to support mismatch, compatibility …

Smooth Wasserstein Distance: Metric Structure and Statistical Efficiency (talk)

slideslive.com › smooth-wasserstein-distance-metric-struct...

Speaker: Ziv Goldfeld. SlidesLive, Aug 26, 2020


[PDF] epfl.ch

Wasserstein Distributionally Robust Learning

S Shafieezadeh Abadeh - 2020 - infoscience.epfl.ch

Many decision problems in science, engineering, and economics are affected by 

uncertainty, which is typically modeled by a random variable governed by an unknown 

probability distribution. For many practical applications, the probability distribution is only …

Wasserstein Distributionally Robust Learning

books.google.com › books

Soroosh Shafieezadeh Abadeh · 2020 · No preview

Author keywords: Distributionally robust optimization; Wasserstein distance; Regularization; Supervised Learning; Inverse optimization; Kalman filter; Frank-Wolfe algorithm. book


 

Wasserstein Random Forests at First Glance - Qiming Du

mgimm.github.io › doc › wrf-try

Feb 17, 2020 - 1.1 Motivation. The classical setting of supervised learning focuses on the estimation of the conditional expectation E[Y | X = x] for some ...

[C] Wasserstein Random Forests at First Glance

Q Du - 2020


[PDF] brown.edu

[PDF] Reduced-order modeling of transport equations using Wasserstein spaces

V Ehrlacher, D Lombardi, O Mula, FX Vialard - icerm.brown.edu

Introduction to Wasserstein spaces and barycenters. Model order reduction of parametric transport equations. Reduced-order modeling of transport equations using Wasserstein spaces. V. Ehrlacher (1), D. Lombardi (2), O. Mula (3), F.-X. Vialard (4). (1) Ecole des Ponts ParisTech & INRIA …

<——2020————2020 ————120—

   

  

 WGAN-based Autoencoder Training Over-the-air

S Dörner, M Henninger, S Cammerer… - arXiv preprint arXiv …, 2020 - arxiv.org

The practical realization of end-to-end training of communication systems is fundamentally limited by its accessibility of the channel gradient. To overcome this major burden, the idea of generative adversarial networks (GANs) that learn to mimic the actual channel behavior …

[CITATION] S. t. Brink,“WGAN-based autoencoder training over-the-air,”

S Dörner, M Henninger, S Cammerer - arXiv preprint arXiv:2003.02744, 2020

  Cited by 10 Related articles All 3 versions

Structure-preserving variational schemes for fourth order nonlinear partial differential equations with a Wasserstein gradient flow structure

B Ashworth - 2020 - ethos.bl.uk

… There is a growing interest in studying nonlinear partial differential equations which constitute

gradient flows in the Wasserstein metric and related structure preserving variational discretisations.

In this thesis, we focus on the fourth order Derrida-Lebowitz-Speer-Spohn (DLSS …

Technique Proposal to Stabilize Lipschitz Continuity of WGAN Based on Regularization Terms

HI Hahn - The Journal of The Institute of Internet, Broadcasting …, 2020 - koreascience.or.kr

The recently proposed Wasserstein generative adversarial network (WGAN) has improved some of the tricky and unstable training processes that are chronic problems of the generative adversarial network (GAN), but there are still cases where it generates poor …


정칙화 항에 기반한 WGAN 립쉬츠 연속 안정화 기법 제안

한희일 - 한국인터넷방송통신학회 논문지, 2020 - earticle.net

The recently proposed WGAN (Wasserstein generative adversarial network) has somewhat improved the tricky and unstable training process that is a chronic problem of GAN (generative adversarial network), but it still fails to converge in some cases or produces unnatural outputs …

[Korean: A technique proposal to stabilize Lipschitz continuity of WGAN based on regularization terms]


분별기의 립쉬츠 연속 안정화를 통한 WGAN 성능개선

한희일 - 전자공학회논문지, 2020 - dbpia.co.kr

The advent of GAN (generative adversarial network) brought a breakthrough in generative modeling, but training instability remains the biggest problem to be solved. The recently proposed WGAN (Wasserstein GAN) improves training stability and has become an alternative to GAN, but …

[Korean: Improving WGAN performance through Lipschitz continuity stabilization of the discriminator]

Improving the Performance of WGAN Using Stabilization of Lipschitz Continuity of the Discriminator

MR4076772 Prelim Berthet, Philippe; Fort, Jean-Claude; Klein, Thierry; A Central Limit Theorem for Wasserstein type distances between two distinct univariate distributions. Ann. Inst. Henri Poincaré Probab. Stat. 56 (2020), no. 2, 954–982. 62G30 (60F05 60F17 62G20)

Researchers at Institute of Mathematics Toulouse Report New Data on Probability Research (A Central Limit Theorem for Wasserstein... 

Mathematics Week, 06/2020

NewsletterFull Text Online 

Probability Research; Researchers at Institute of Mathematics Toulouse Report New Data on Probability Research (A Central Limit Theorem for Wasserstein... 

News of science (Atlanta, Ga.), Jun 14, 20

online

Researchers at Institute of Mathematics Toulouse Report New Data on Probability Research (A Central Limit Theorem for Wasserstein type distances between two distinct univariate distributions)

Mathematics Week, 06/2020

NewsletterFull Text Online

MR4075297 Prelim García Trillos, Nicolás; Gromov–Hausdorff limit of Wasserstein spaces on point clouds. Calc. Var. Partial Differential Equations 59 (2020), no. 2, Paper No. 73. 49J45 (35R03 49J55)

[PDF] arxiv.org

Gromov–Hausdorff limit of Wasserstein spaces on point clouds

NG Trillos - Calculus of Variations and Partial Differential …, 2020 - Springer

We consider a point cloud X_n := {x_1, …, x_n} uniformly distributed on the flat torus T^d := R^d / Z^d, and construct a geometric graph on the cloud by connecting points that are within distance ε of …

Cited by 6 Related articles


MR4073204 Prelim Jimenez, Chloé; Marigonda, Antonio; Quincampoix, Marc; Optimal control of multiagent systems in the Wasserstein space. Calc. Var. Partial Differential Equations 59 (2020), no. 2, Paper No. 58. 49L25 (34A60 49J52 49Q20)

Optimal control of multiagent systems in the Wasserstein space 

By: Jimenez, Chloe; Marigonda, Antonio; Quincampoix, Marc 

CALCULUS OF VARIATIONS AND PARTIAL DIFFERENTIAL EQUATIONS  Volume: ‏ 59   Issue: ‏ 2     Article Number: 58   Published: ‏ MAR 2 2020 

Cited by 22 Related articles All 6 versions

Input limited Wasserstein GAN - SPIE Digital Library

by F Cao - ‎2020

Jan 31, 2020 - Generative adversarial networks (GANs) has proven hugely successful, but suffer from train instability. The recently proposed Wasserstein GAN ...

Input limited Wasserstein GAN 

by Cao, Feidao; Zhao, Huaici; Liu, Pengfei; More... 

01/2020

Generative adversarial networks (GANs) has proven hugely successful, but suffer from train instability. The recently proposed Wasserstein GAN (WGAN) has...

Conference Proceeding:  Full Text Online
 Input limited Wasserstein GAN

F Cao, H Zhao, P Liu, P Li - Second Target Recognition and …, 2020 - spiedigitallibrary.org

Generative adversarial networks (GANs) have proven hugely successful, but suffer from training 

instability. The recently proposed Wasserstein GAN (WGAN) has largely overcome the 

problem, but can still fail to converge in some cases or be too complex. It has been found that …

Related articles


Chinese font translation with improved Wasserstein ...

by Y Miao - ‎2020

Jan 31, 2020 - Chinese font translation with improved Wasserstein generative adversarial network ... are selected as the core component to extract the features fully and enhance the information transmission between network layers.

Chinese font translation with improved Wasserstein generative adversarial network 

by Miao, Yalin; Jia, Huanhuan; Tang, Kaixu; More... 

01/2020

Nowadays, various fonts are applied in many fields, and the generation of multiple fonts by computer plays an important role in the inheritance, development...

Conference Proceeding: Full Text Online 


 <——2020———————2020 ————-130—


     Exponential Contraction in Wasserstein Distances for Diffusion Semigroups with Negative Curvature

by FY Wang - ‎2020 - ‎Cited by 16 - ‎Related articles

Feb 6, 2020 - Let Pt be the (Neumann) diffusion semigroup Pt generated by a weighted ... Exponential Contraction in Wasserstein Distances for Diffusion ... Bakry, D., Gentil, I., Ledoux, M.: Analysis and Geometry of Markov Diffusion Operators. ... In: Potential Theory and Its Related Fields, 61–80, RIMS Kôkyûroku ...

Studies from Beijing Normal University in the Area of Potential Analysis Described (Exponential Contraction In Wasserstein Distances for Diffusion Semigroups With Negative... 

Mathematics Week, 03/2020

Newsletter:  Full Text Online 

Potential Analysis; Studies from Beijing Normal University in the Area of Potential Analysis Described (Exponential Contraction In Wasserstein Distances for Diffusion Semigroups... 

Journal of Mathematics, Mar 24, 2020, 889

Newspaper Article:Full Text Online 

Cited by 16 Related articles

MR4140091 Prelim Wang, Feng-Yu; Exponential Contraction in Wasserstein Distances for Diffusion Semigroups with Negative Curvature. Potential Anal. 53 (2020), no. 3, 1123–1144.

Review PDF Clipboard Journal Article 

  Cited by 9 Related articles

 MGMDcGAN: Medical Image Fusion Using Multi-Generator Multi-Discriminator Conditional Generative Adversarial Network

J Huang, Z Le, Y Ma, F Fan, H Zhang, L Yang - IEEE Access, 2020 - ieeexplore.ieee.org

… However, a single medical imaging modality cannot provide sufficient information for its intended purpose … Discriminators DM and DY in the first cGAN are used to discriminate between source images and Im, respectively … J. Huang et al.: Medical Image Fusion Using MGMDcGAN … 


Engineering; New Engineering Findings from Wuhan University Discussed (Multi-source Medical Image Fusion Based On Wasserstein Generative Adversarial Networks) 

Journal of Engineering, Feb 24, 2020, 1511

Newspaper Article:

Full Text Online  see 2019

[PDF] ieee.org

Multi-source medical image fusion based on Wasserstein generative adversarial networks

Z Yang, Y Chen, Z Le, F Fan, E Pan - IEEE Access, 2019 - ieeexplore.ieee.org

… , we propose the medical Wasserstein generative adversarial … Different information from 

source images can be effectively … architecture to deal with source images of different resolutions. …

Cited by 16 Related articles

  

L'école hassidique est illégale, plaide l'avocat du couple Lowen-Wasserstein 

[French: The Hasidic school is illegal, argues the lawyer for the Lowen-Wasserstein couple]

by Giuseppe Valiante 

La Presse Canadienne, Feb 19, 2020

Newspaper Article: Full Text Online 


2020 see 2019

 Engineering; Data on Engineering Discussed by Researchers at Wuhan University (Prostate MR Image Segmentation With Self-Attention Adversarial Training Based on Wasserstein... 

Journal of Engineering, Feb 3, 2020, 252

Newspaper ArticleFull Text Online 


2020


Information Technology; Study Data from Seoul National University Provide New Insights into Information Technology (Data-Driven Distributionally Robust Stochastic Control of... 

Computers, Networks & Communications, Feb 13, 2020, 836

Newspaper Article: Full Text Online 

Data-Driven Distributionally Robust Stochastic Control of Energy Storage for Wind Power Ramp Management Using the Wasserstein Metric

I Yang - Energies, 2019 - mdpi.com

 Cited by 2 Related articles All 6 versions 


 WGAN-E: A Generative Adversarial Networks for Facial Feature Security

C Wu, B Ju, Y Wu, NN Xiong, S Zhang - Electronics, 2020 - mdpi.com

Artificial intelligence technology plays an increasingly important role in human life. For example, distinguishing different people is an essential capability of many intelligent systems. To achieve this, one possible technical means is to perceive and recognize people by optical imaging of faces, so-called face recognition technology. After decades of research and development, especially the emergence of deep learning technology in recent years, face recognition has made great progress with more and more applications in the fields of …

WGAN-E: A Generative Adversarial Networks for Facial Feature Security 

by Wu, Chunxue; Ju, Bobo; Wu, Yan; More... 

Electronics, 03/2020, Volume 9, Issue 3

Artificial intelligence technology plays an increasingly important role in human life. For example, distinguishing different people is an essential capability...

Journal Article: Full Text Online

Cited by 8 Related articles All 4 versions

Longtime WGAN morning host Ken Altshuler fired from station 

by Matt Byrne 

TCA Regional News, Mar 28, 2020

Newspaper Article:  Full Text Online 


[HTML] Qsmgan: Improved quantitative susceptibility mapping using 3d generative adversarial networks with increased receptive field

Y Chen, A Jakary, S Avadiappan, CP Hess, JM Lupo - NeuroImage, 2020 - Elsevier

… Generative adversarial networks improved the quality of the susceptibility maps … QSM maps from single orientation phase maps efficiently and performs significantly better than traditional … by utilizing a GAN to regularize the model training process and further improve the accuracy … 

Cited by 3 Related articles

[PDF] arxiv.org

Self-improvement of the Bakry-Emery criterion for Poincaré inequalities and Wasserstein contraction using variable curvature bounds

P Cattiaux, M Fathi, A Guillin - arXiv preprint arXiv:2002.09221, 2020 - arxiv.org

We study Poincaré inequalities and long-time behavior for diffusion processes on R^n under a variable curvature lower bound, in the sense of Bakry-Emery. We derive various estimates on the rate of convergence to equilibrium in L^1 optimal transport distance, as …

Related articles All 33 versions

<——2020—————2020 —————-140—  


2020

Deep joint two-stream Wasserstein auto-encoder and selective attention alignment for unsupervised domain adaptation

Z Chen, C Chen, X Jin, Y Liu, Z Cheng - Neural computing and …, 2020 - Springer

… In this work, we propose a method that joints two-stream Wasserstein auto-encoder (WAE)

and … 1b, the two-stream WAE minimizes the Wasserstein distance based on optimal transport …

Cited by 18 Related articles All 4 versions

Wasserstein Loss-Based Deep Object Detection

Y Han, X Liu, Z Sheng, Y Ren, X Han… - Proceedings of the …, 2020 - openaccess.thecvf.com

Object detection locates the objects with bounding boxes and identifies their classes, which is valuable in many computer vision applications (e.g. autonomous driving). Most existing deep learning-based methods output a probability vector for instance classification trained with the one-hot label. However, the limitation of these models lies in attribute perception because they do not take the severity of different misclassifications into consideration. In this paper, we propose a novel method based on the Wasserstein distance called Wasserstein …

  Cited by 11 Related articles All 5 versions 
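For intuition on using a Wasserstein loss in classification, as in the entry above: when the classes have a natural one-dimensional ordering and the ground cost between classes i and j is |i - j|, the 1-Wasserstein distance between the predicted probability vector and the one-hot target is simply the L1 distance between their cumulative distributions. This is a generic sketch with illustrative names, not necessarily the exact loss used by the authors.

import numpy as np

def wasserstein_loss_ordered_classes(p, q):
    # W1 between two discrete distributions over ordered classes 0..K-1
    # with ground cost |i - j|: the L1 distance between the CDFs.
    return np.sum(np.abs(np.cumsum(p) - np.cumsum(q)))

pred   = np.array([0.1, 0.2, 0.6, 0.1])   # predicted class probabilities
target = np.array([0.0, 0.0, 1.0, 0.0])   # one-hot label for class 2
print(wasserstein_loss_ordered_classes(pred, target))   # small: mass sits near the true class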


arXiv:2004.07162  [pdf, ps, other]  math.OC cs.LG
On Linear Optimization over Wasserstein Balls
Authors: Man-Chung Yue, Daniel Kuhn, Wolfram Wiesemann
Abstract: Wasserstein balls, which contain all probability measures within a pre-specified Wasserstein distance to a reference measure, have recently enjoyed wide popularity in the distributionally robust optimization and machine learning communities to formulate and solve data-driven optimization problems with rigorous statistical guarantees. In this technical note we prove that the Wasserstein ball is wea… More
Submitted 15 April, 2020; originally announced April 2020.


arXiv:2004.03981  [pdf, othermath.NA
A Wasserstein Coupled Particle Filter for Multilevel Estimation
Authors: Marco Ballesio, Ajay Jasra, Erik von Schwerin, Raul Tempone
Abstract: In this paper, we consider the filtering problem for partially observed diffusions, which are regularly observed at discrete times. We are concerned with the case when one must resort to time-discretization of the diffusion process if the transition density is not available in an appropriate form. In such cases, one must resort to advanced numerical algorithms such as particle filters to consisten…
More
Submitted 8 April, 2020; originally announced April 2020.
Comments: 48 pages
MSC Class: 65C05 (Primary) 65C20; 65C30 (Secondary) 


2020

arXiv:2004.03867  [pdf, othereess.IV cs.CV
S2A: Wasserstein GAN with Spatio-Spectral Laplacian Attention for Multi-Spectral Band Synthesis
Authors: Litu Rout, Indranil Misra, S Manthira Moorthi, Debajyoti Dhar
Abstract: Intersection of adversarial learning and satellite image processing is an emerging field in remote sensing. In this study, we intend to address synthesis of high resolution multi-spectral satellite imagery using adversarial learning. Guided by the discovery of attention mechanism, we regulate the process of band synthesis through spatio-spectral Laplacian attention. Further, we use Wasserstein GAN…
More
Submitted 8 April, 2020; originally announced April 2020.
Comments: Computer Vision and Pattern Recognition (CVPR) Workshop on Large Scale Computer Vision for Remote Sensing Imagery 

Conference ProceedingCitation Online


2020

arXiv:2004.03730  [pdf, othermath.ST math.NA
Stability of Gibbs Posteriors from the Wasserstein Loss for Bayesian Full Waveform Inversion
Authors: Matthew M. Dunlop, Yunan Yang
Abstract: Recently, the Wasserstein loss function has been proven to be effective when applied to deterministic full-waveform inversion (FWI) problems. We consider the application of this loss function in Bayesian FWI so that the uncertainty can be captured in the solution. Other loss functions that are commonly used in practice are also considered for comparison. Existence and stability of the resulting Gi…
More
Submitted 7 April, 2020; originally announced April 2020.
Comments: 26 pages, 7 figures
MSC Class: 62C10; 86A22; 65J22; 49K40 

optimal transport cost becomes the class of Wasserstein distance …

  Cited by 1 Related articles All 3 versions 

  

arXiv:2004.00999  [pdf, ps, othercs.LG cs.CL econ.GN
Pruned Wasserstein Index Generation Model and wigpy Package
Authors: Fangzhou Xie
Abstract: Recent proposal of Wasserstein Index Generation model (WIG) has shown a new direction for automatically generating indices. However, it is challenging in practice to fit large datasets for two reasons. First, the Sinkhorn distance is notoriously expensive to compute and suffers from dimensionality severely. Second, it requires to compute a full N×N matrix to be fit into memory, where N i… More
Submitted 9 July, 2020; v1 submitted 30 March, 2020; originally announced April 2020.
Comments: fix typos and errors

arXiv:2004.00759  [pdf, other]  eess.SY math.OC
Safe Zero-Shot Model-Based Learning and Control: A Wasserstein Distributionally Robust Approach
Authors: Aaron Kandel, Scott J. Moura

Abstract: This paper explores distributionally robust zero-shot model-based learning and control using Wasserstein ambiguity sets. Conventional model-based reinforcement learning algorithms struggle to guarantee feasibility throughout the online learning process. We address this open challenge with the following approach. Using a stochastic model-predictive control (MPC) strategy, we augment safety constrai…
More
Submitted 1 April, 2020; originally announced April 2020.
Comments: In review for CDC20 

Cited by 1 Related articles All 3 versions

arXiv:2003.13976  [pdf, ps, othermath.PR
On Stein's factors for Poisson approximation in Wasserstein distance with non-linear transportation costs
Authors:
Zhong-Wei Liao, Yutao Ma, Aihua Xia
Abstract: We establish various bounds on the solutions to a Stein equation for Poisson approximation in Wasserstein distance with non-linear transportation costs. The proofs are a refinement of those in [Barbour and Xia (2006)] using the results in [Liu and Ma (2009)]. As a corollary, we obtain an estimate of Poisson approximation error measured in L^2-Wasserstein distance.
Submitted 31 March, 2020; originally announced March 2020.
Comments: 21 pages
MSC Class: 60F05; 60E15; 60J27


arXiv:2007.08906  [pdf, ps, othermath.OC
Differential Inclusions in Wasserstein Spaces: The Cauchy-Lipschitz Framework
Authors:
Benoît Bonnet, Hélène Frankowska
Abstract: In this article, we propose a general framework for the study of differential inclusions in the Wasserstein space of probability measures. Based on earlier geometric insights on the structure of continuity equations, we define solutions of differential inclusions as absolutely continuous curves whose driving velocity fields are measurable selections of multifunction taking their values in the spac…
More
Submitted 17 July, 2020; originally announced July 2020.
Comments: To appear in Journal of Differential Equations
MSC Class: 28B20; 34A60; 34G20; 49J21; 49J4

<——2020—————2020 —————-150—

arXiv:2005.09290  [pdf, ps, othermath.PR
Convergence in Wasserstein Distance for Empirical Measures of Dirichlet Diffusion Processes on Manifolds
Authors:
Feng-Yu Wang
Abstract: Let M
Submitted 19 May, 2020; originally announced May 2020. 

Cited by 4 Related articles All 3 versions

arXiv:2005.06530  [pdf, other]  math.OC math-ph stat.ML
The Equivalence of Fourier-based and Wasserstein Metrics on Imaging Problems
Authors:
Gennaro Auricchio, Andrea Codegoni, Stefano Gualandi, Giuseppe Toscani, Marco Veneroni
Abstract: We investigate properties of some extensions of a class of Fourier-based probability metrics, originally introduced to study convergence to equilibrium for the solution to the spatially homogeneous Boltzmann equation. At difference with the original one, the new Fourier-based metrics are well-defined also for probability distributions with different centers of mass, and for discrete probability me…
More
Submitted 13 May, 2020; originally announced May 2020.
Comments: 18 pages, 2 figures, 1 table
MSC Class: 90C06; 90C08

MR4170640 Prelim Auricchio, Gennaro; Codegoni, Andrea; Gualandi, Stefano; Toscani, Giuseppe; Veneroni, Marco; The equivalence of Fourier-based and Wasserstein metrics on imaging problems. Atti Accad. Naz. Lincei Rend. Lincei Mat. Appl. 31 (2020), no. 3, 627–649. 60A10 (42A38 49Q20 60E15)

Review PDF Clipboard Journal Article 

The Equivalence of Fourier-based and Wasserstein Metrics on ...

arxiv.org › math

May 13, 2020 — The Equivalence of Fourier-based and Wasserstein Metrics on Imaging Problems. ... Numerical results then show that in benchmark problems of image processing, Fourier metrics provide a better runtime with respect to Wasserstein ones.

by G Auricchio · ‎2020 · ‎Related articles
Related articles
 All 7 versions 

Auricchio, Gennaro; Codegoni, Andrea; Gualandi, Stefano; Toscani, Giuseppe; Veneroni, Marco

The equivalence of Fourier-based and Wasserstein metrics on imaging problems. (English) Zbl 07326808

Atti Accad. Naz. Lincei, Cl. Sci. Fis. Mat. Nat., IX. Ser., Rend. Lincei, Mat. Appl. 31, No. 3, 627-649 (2020).

Reports on Mathematics from University of Pavia Provide New Insights 

(The Equivalence of Fourier-based and Wasserstein...

Journal of Technology & Science, 12/2020

NewsletterFull Text Online

arXiv:2005.05208  [pdf, othermath.ST
Wasserstein distance error bounds for the multivariate normal approximation of the maximum likelihood estimator
Authors:
Andreas Anastasiou, Robert E. Gaunt
Abstract: We obtain explicit Wasserstein distance error bounds between the distribution of the multi-parameter MLE and the multivariate normal distribution. Our general bounds are given for possibly high-dimensional, independent and identically distributed random vectors. Our general bounds are of the optimal O(n^{-1/2}) order. We apply our general bounds to derive Wasserstein distance error bou…
More
Submitted 11 May, 2020; originally announced May 2020.
Comments: 31 pages, 1 figure
MSC Class: 60F05; 62E17; 62F10; 62F12 

Cited by 1 Related articles All 4 versions 

arXiv:2005.04972  [pdf, ps, other]  math.PR
A Bismut-Elworthy inequality for a Wasserstein diffusion on the circle
Authors: Victor Marx
Abstract: We investigate in this paper a regularization property of a diffusion on the Wasserstein space P2(T) of the one-dimensional torus. The control obtained on the gradient of the semi-group is very much in the spirit of the Bismut-Elworthy-Li integration by parts formula for Brownian motions. Although the general strategy is based on Kunita's expansion as in Thalmaier and Wang's appr…
More
Submitted 11 May, 2020; originally announced May 2020. 


arXiv:2005.04925  [pdf, ps, other]  math.CA math.PR
Berry-Esseen smoothing inequality for the Wasserstein metric on compact Lie groups
Authors:
Bence Borda
Abstract: We prove a sharp general inequality estimating the distance of two probability measures on a compact Lie group in the Wasserstein metric in terms of their Fourier transforms. We use a generalized form of the Wasserstein metric, related by Kantorovich duality to the family of functions with an arbitrarily prescribed modulus of continuity. The proof is based on smoothing with a suitable kernel, and…
More
Submitted 23 June, 2020; v1 submitted 11 May, 2020; originally announced May 2020.
Comments: 24 pages; main result improved in version 2
MSC Class: 43A77; 60B15 


2020

arXiv:2005.00738  [pdf, ps, other]  math.PR
Asymptotics of smoothed Wasserstein distances
Authors: Hong-Bin Chen, Jonathan Niles-Weed
Abstract: We investigate contraction of the Wasserstein distances on R^d under Gaussian smoothing. It is well known that the heat semigroup is exponentially contractive with respect to the Wasserstein distances on manifolds of positive curvature; however, on flat Euclidean space---where the heat semigroup corresponds to smoothing the measures by Gaussian convolution---the situation is more subtle… More
Submitted 2 May, 2020; originally announced May 2020. 

Asymptotics of Smoothed Wasserstein Distances | Research ...

research.shanghai.nyu.edu › math › events › asymptoti...

— Hongbin Chen, NYU. Location: Via Zoom (Members of NYU Shanghai Community can join from Room 611 at Pudong campus).

Nov 30, 2020
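The Gaussian-smoothed Wasserstein distance discussed above compares two measures after convolving each with an isotropic Gaussian N(0, sigma^2). A simple Monte-Carlo sketch in one dimension (smoothed_w1_1d is an illustrative name, not code from the paper): add independent Gaussian noise to each sample and compute the empirical W1 of the noisy samples.

import numpy as np

def smoothed_w1_1d(x, y, sigma, seed=0):
    # Monte-Carlo estimate of W1(P * N(0, sigma^2), Q * N(0, sigma^2)) in 1D,
    # using sorted matching of equal-size noisy samples.
    rng = np.random.default_rng(seed)
    xs = np.sort(x + rng.normal(0.0, sigma, x.size))
    ys = np.sort(y + rng.normal(0.0, sigma, y.size))
    return np.mean(np.abs(xs - ys))

# toy usage: smoothing shrinks the distance between distributions with different spread
rng = np.random.default_rng(1)
x, y = rng.normal(0.0, 1.0, 5000), rng.normal(0.5, 2.0, 5000)
for s in (0.0, 1.0, 4.0):
    print(s, smoothed_w1_1d(x, y, s))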


arXiv:2004.14089  [pdf, ps, other]  math.PR
Equidistribution of random walks on compact groups II. The Wasserstein metric
Authors: Bence Borda
Abstract: We consider a random walk S
Submitted 30 April, 2020; v1 submitted 26 April, 2020; originally announced April 2020.
Comments: 23 pages 

[PDF] arxiv.org

Equidistribution of random walks on compact groups II. The Wasserstein metric

B Borda - arXiv preprint arXiv:2004.14089, 2020 - arxiv.org

We consider a random walk S_k with iid steps on a compact group equipped with a bi-invariant metric. We prove quantitative ergodic theorems for the sum \sum_{k=1}^N f(S_k) with Hölder continuous test functions f, including the central limit theorem, the …

  Related articles All 2 versions 

arXiv:2004.12478  [pdf, other]  cs.LG cs.CR stat.ML
Improved Image Wasserstein Attacks and Defenses
Authors: J. Edward Hu, Adith Swaminathan, Hadi Salman, Greg Yang
Abstract: Robustness against image perturbations bounded by a

 Submitted 16 April, 2020; originally announced April 2020.
MSC Class: 93Exx; 90C25; 65K10; 68U10
Cited by 7
Related articles All 4 versions


arXiv:2004.07537  [pdf, ps, other]  math.PR
Precise Limit in Wasserstein Distance for Conditional Empirical Measures of Dirichlet Diffusion Processes
Authors: Feng-Yu Wang
Abstract: Let M…
Submitted 13 May, 2020; v1 submitted 16 April, 2020; originally announced April 2020.
Comments: 21 pages 


arXiv:2004.07341  [pdf, other]  cs.LG cs.CY stat.ML
Wasserstein Adversarial Autoencoders for Knowledge Graph Embedding based Drug-Drug Interaction Prediction
Authors: Yuanfei Dai, Chenhao Guo, Wenzhong Guo, Carsten Eickhoff
Abstract: Interaction between pharmacological agents can trigger unexpected adverse events. Capturing richer and more comprehensive information about drug-drug interactions (DDI) is one of the key tasks in public health and drug development. Recently, several knowledge graph embedding approaches have received increasing attention in the DDI domain due to their capability of projecting drugs and interactions…
More
Submitted 15 April, 2020; originally announced April 2020.

<——2020———————2020 ———————-160—

arXiv:2006.03465  [pdf, other]  cs.LG stat.ML
Visual Transfer for Reinforcement Learning via Wasserstein Domain Confusion
Authors: Josh Roy, George Konidaris
Abstract: We introduce Wasserstein Adversarial Proximal Policy Optimization (WAPPO), a novel algorithm for visual transfer in Reinforcement Learning that explicitly learns to align the distributions of extracted features between a source and target task. WAPPO approximates and minimizes the Wasserstein-1 distance between the distributions of features from source and target domains via a novel Wasserstein Co…
More
Submitted 4 June, 2020; originally announced June 2020. 

Journal ArticleFull Text Online
Visual Transfer for Reinforcement Learning via Wasserstein Domain Confusion (video)

Jun 25, 2020 · Uploaded by Josh Roy


arXiv:2006.03416  [pdf, other]  stat.ML cs.LG
Entropy-Regularized 2-Wasserstein Distance between Gaussian Measures
Authors: Anton Mallasto, Augusto Gerolin, Hà Quang Minh
Abstract: Gaussian distributions are plentiful in applications dealing in uncertainty quantification and diffusivity. They furthermore stand as important special cases for frameworks providing geometries for probability measures, as the resulting geometry on Gaussians is often expressible in closed-form under the frameworks. In this work, we study the Gaussian geometry under the entropy-regularized 2-Wasser…
More
Submitted 5 June, 2020; originally announced June 2020. 
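The paper above derives closed forms in the entropy-regularized case; for context, the unregularized 2-Wasserstein (Bures) distance between Gaussians N(m1, S1) and N(m2, S2) is already explicit: W2^2 = ||m1 - m2||^2 + tr(S1 + S2 - 2 (S1^{1/2} S2 S1^{1/2})^{1/2}). A short sketch of that classical formula (w2_gaussians is an illustrative name, not the paper's code):

import numpy as np
from scipy.linalg import sqrtm

def w2_gaussians(m1, S1, m2, S2):
    # Unregularized 2-Wasserstein distance between two multivariate Gaussians.
    rS1 = sqrtm(S1)
    cross = sqrtm(rS1 @ S2 @ rS1)
    bures_sq = np.trace(S1 + S2 - 2.0 * cross)
    return np.sqrt(np.sum((m1 - m2) ** 2) + np.real(bures_sq))

# toy usage
m1, S1 = np.zeros(2), np.eye(2)
m2, S2 = np.ones(2), 2.0 * np.eye(2)
print(w2_gaussians(m1, S1, m2, S2))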


arXiv:2006.03333  [pdf, otherstat.ML cs.LG
Principled learning method for Wasserstein distributionally robust optimization with local perturbations
Authors: Yongchan Kwon, Wonyoung Kim, Joong-Ho Won, Myunghee Cho Paik
Abstract: Wasserstein distributionally robust optimization (WDRO) attempts to learn a model that minimizes the local worst-case risk in the vicinity of the empirical data distribution defined by Wasserstein ball. While WDRO has received attention as a promising tool for inference since its introduction, its theoretical understanding has not been fully matured. Gao et al. (2017) proposed a minimizer based on…
More
Submitted 22 June, 2020; v1 submitted 5 June, 2020; originally announced June 2020.
Comments: Accepted for ICML 2020 

Journal article

arXiv:2006.02682  [pdf, othercs.LG stat.ML
Some Theoretical Insights into Wasserstein GANs
Authors: Gérard Biau, Maxime Sangnier, Ugo Tanielian
Abstract: Generative Adversarial Networks (GANs) have been successful in producing outstanding results in areas as diverse as image, video, and text generation. Building on these successes, a large number of empirical studies have validated the benefits of the cousin approach called Wasserstein GANs (WGANs), which brings stabilization in the training process. In the present paper, we add a new stone to the…
More
Submitted 4 June, 2020; originally announced June 2020. 

Journal article
Cited by 1
Related articles All 10 versions

Principled learning method for Wasserstein distributionally ...

slideslive.com › principled-learning-method-for-wasserste...

Wasserstein distributionally robust optimization (WDRO) attempts to ... of the empirical data distribution defined by Wasserstein ball.

SlidesLive · 

Jul 12, 2020

Jul 9, 2020


arXiv:2006.02509  [pdf, other]  math.ST cs.LG
SVGD as a kernelized Wasserstein gradient flow of the chi-squared divergence

Authors:
Sinho Chewi, Thibaut Le Gouic, Chen Lu, Tyler Maunu, Philippe Rigollet
Abstract: Stein Variational Gradient Descent (SVGD), a popular sampling algorithm, is often described as the kernelized gradient flow for the Kullback-Leibler divergence in the geometry of optimal transport. We introduce a new perspective on SVGD that instead views SVGD as the (kernelized) gradient flow of the chi-squared divergence which, we show, exhibits a strong form of uniform exponential ergodicity un…
More
Submitted 3 June, 2020; originally announced June 2020.
Comments: 20 pages, 5 figures
Journal ArticleFull Text Online

Cited by 21 Related articles All 10 versions
SVGD as a Kernelized Wasserstein Gradient Flow of the Chi-Squared Divergence (talk)

slideslive.com › svgd-as-a-kernelized...

SlidesLive, Dec 6, 2020


2020

arXiv:2006.02068  [pdf, other]  cs.CV cs.RO
PLG-IN: Pluggable Geometric Consistency Loss with Wasserstein Distance in Monocular Depth Estimation
Authors: Noriaki Hirose, Satoshi Koide, Keisuke Kawano, Ruho Kondo
Abstract: We propose a novel objective to penalize geometric inconsistencies, to improve the performance of depth estimation from monocular camera images. Our objective is designed with the Wasserstein distance between two point clouds estimated from images with different camera poses. The Wasserstein distance can impose a soft and symmetric coupling between two point clouds, which suitably keeps geometric…
More
Submitted 3 June, 2020; originally announced June 2020.
Comments: 9 pages, 6 figures, 2 tables 

Journal ArticleFull Text Online

arXiv:2006.01397  [pdf, ps, other]  math.OC cs.LG eess.SY stat.ML
Online Stochastic Convex Optimization: Wasserstein Distance Variation
Authors:
Iman Shames, Farhad Farokhi
Abstract: Distributionally-robust optimization is often studied for a fixed set of distributions rather than time-varying distributions that can drift significantly over time (which is, for instance, the case in finance and sociology due to underlying expansion of economy and evolution of demographics). This motivates understanding conditions on probability distributions, using the Wasserstein distance, tha…
More
Submitted 2 June, 2020; originally announced June 2020.

Cited by 2
Related articles All 4 versions 

 

arXiv:2006.00945  [pdf, othercs.LG stat.ML
Robust Reinforcement Learning with Wasserstein Constraint
Authors: Linfang Hou, Liang Pang, Xin Hong, Yanyan Lan, Zhiming Ma, Dawei Yin
Abstract: Robust Reinforcement Learning aims to find the optimal policy with some extent of robustness to environmental dynamics. Existing learning algorithms usually enable the robustness through disturbing the current state or simulating environmental parameters in a heuristic way, which lack quantified robustness to the system dynamics (i.e. transition probability). To overcome this issue, we leverage Wa…
More
Submitted 1 June, 2020; originally announced June 2020. 

Journal article
Cited by 6
Related articles All 3 versions


arXiv:2005.13815  [pdf, ps, othercs.LG math.OC stat.ML
Adversarial Classification via Distributional Robustness with Wasserstein Ambiguity
Authors: Nam Ho-Nguyen, Stephen J. Wright
Abstract: We study a model for adversarial classification based on distributionally robust chance constraints. We show that under Wasserstein ambiguity, the model aims to minimize the conditional value-at-risk of the distance to misclassification, and we explore links to previous adversarial classification models and maximum margin classifiers. We also provide a reformulation of the distributionally robust…
More
Submitted 28 May, 2020; originally announced May 2020.
Comments: 32 pages 

Cited by 4 Related articles All 3 versions

arXiv:2005.09923  [pdf, otherstat.ML cs.LG math.OC
Tessellated Wasserstein Auto-Encoders
Authors: Kuo Gai, Shihua Zhang
Abstract: Non-adversarial generative models such as variational auto-encoder (VAE), Wasserstein auto-encoders with maximum mean discrepancy (WAE-MMD), sliced-Wasserstein auto-encoder (SWAE) are relatively easy to train and have less mode collapse compared to Wasserstein auto-encoder with generative adversarial network (WAE-GAN). However, they are not very accurate in approximating the target distribution in…
More
Submitted 20 May, 2020; originally announced May 2020.
Comments: 15 pages, 8 figures
MSC Class: 90-08; 68T01 ACM Class: I.2.6; I.5.1; I.4.0
<——2020———————2020 —————-170—


arXiv:2006.08012  [pdf, othermath.OC cs.CG cs.DS cs.LG
High-precision Wasserstein barycenters in polynomial time
Authors: Jason M. Altschuler, Enric Boix-Adsera
Abstract: Computing Wasserstein barycenters is a fundamental geometric problem with widespread applications in machine learning, statistics, and computer graphics. However, it is unknown whether Wasserstein barycenters can be computed in polynomial time, either exactly or to high precision (i.e., with polylog(1/ε) runtime dependence). This paper answers these questions in the affirmativ… More
Submitted 14 June, 2020; originally announced June 2020.
Comments: 11 pages, 3 figures 

Cited by 1 Related articles All 2 versions

arXiv:2006.07458  [pdf, othercs.LG math.OC stat.ML
Projection Robust Wasserstein Distance and Riemannian Optimization
Authors: Tianyi Lin, Chenyou Fan, Nhat Ho, Marco Cuturi, Michael I. Jordan
Abstract: Projection robust Wasserstein (PRW) distance, or Wasserstein projection pursuit (WPP), is a robust variant of the Wasserstein distance. Recent work suggests that this quantity is more robust than the standard Wasserstein distance, in particular when comparing probability measures in high-dimensions. However, it is ruled out for practical application because the optimization model is essentially no…
More
Submitted 28 June, 2020; v1 submitted 12 June, 2020; originally announced June 2020.
Comments: The first two authors contributed equally  

Projection robust Wasserstein distance and Riemannian optimization

Cited by 32 Related articles All 9 versions

arXiv:2006.07286  [pdf, other]  stat.ML cs.LG math.ST
Fair Regression with Wasserstein Barycenters
Authors: Evgenii Chzhen, Christophe Denis, Mohamed Hebiri, Luca Oneto, Massimiliano Pontil
Abstract: We study the problem of learning a real-valued function that satisfies the Demographic Parity constraint. It demands the distribution of the predicted output to be independent of the sensitive attribute. We consider the case that the sensitive attribute is available for prediction. We establish a connection between fair regression and optimal transport theory, based on which we derive a close form…
More
Submitted 23 June, 2020; v1 submitted 12 June, 2020; originally announced June 2020. 

Journal ArticleFull Text Online

arXiv:2006.06763  [pdf, other]  math.OC cs.LG stat.ML
Stochastic Saddle-Point Optimization for Wasserstein Barycenters
Authors: Daniil Tiapkin, Alexander Gasnikov, Pavel Dvurechensky
Abstract: We study the computation of non-regularized Wasserstein barycenters of probability measures supported on the finite set. The first result gives a stochastic optimization algorithm for the discrete distribution over the probability measures which is comparable with the current best algorithms. The second result extends the previous one to the arbitrary distribution using kernel methods. Moreover, t…
More
Submitted 11 June, 2020; originally announced June 2020. 


arXiv:2006.06090  [pdf, ps, other]  stat.ML cs.LG
Robustified Multivariate Regression and Classification Using Distributionally Robust Optimization under the
Wasserstein Metric
Authors:
Ruidi Chen, Ioannis Ch. Paschalidis
Abstract: We develop Distributionally Robust Optimization (DRO) formulations for Multivariate Linear Regression (MLR) and Multiclass Logistic Regression (MLG) when both the covariates and responses/labels may be contaminated by outliers. The DRO framework uses a probabilistic ambiguity set defined as a ball of distributions that are close to the empirical distribution of the training set in the sense of the…
More
Submitted 10 June, 2020; originally announced June 2020. 

Journal ArticleFull Text Online
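
As background for this and the other Wasserstein-DRO entries above, the type-1 Wasserstein ambiguity set leads, for a loss that is Lipschitz in the data and under mild conditions, to a regularized empirical risk. This is a standard identity from the Wasserstein-DRO literature, stated informally here; it is not the paper's exact MLR/MLG formulation.

\[
\sup_{Q:\; W_1(Q,\widehat{P}_n)\le \varepsilon} \; \mathbb{E}_{Q}\big[\ell_\beta(x,y)\big]
\;=\; \frac{1}{n}\sum_{i=1}^{n} \ell_\beta(x_i,y_i) \;+\; \varepsilon \,\operatorname{Lip}(\ell_\beta),
\]

where \(\widehat{P}_n\) is the empirical distribution of the training set and \(\operatorname{Lip}(\ell_\beta)\) is the Lipschitz constant of the loss with respect to the ground metric defining the transport cost.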

2020

arXiv:2006.05421  [pdf, other] cs.LG stat.ML
Conditional Sig-Wasserstein GANs for Time Series Generation
Authors: Hao Ni, Lukasz Szpruch, Magnus Wiese, Shujian Liao, Baoren Xiao
Abstract: Generative adversarial networks (GANs) have been extremely successful in generating samples, from seemingly high dimensional probability measures. However, these methods struggle to capture the temporal dependence of joint probability distributions induced by time-series data. Furthermore, long time-series data streams hugely increase the dimension of the target space, which may render generative…
More
Submitted 9 June, 2020; originally announced June 2020. 

Journal article
Cited by 25
Related 


2020

arXiv:2006.04709  [pdf, other] stat.ML cs.LG stat.ME
Wasserstein Random Forests and Applications in Heterogeneous Treatment Effects
Authors: Qiming Du, Gérard Biau, François Petit, Raphaël Porcher
Abstract: We present new insights into causal inference in the context of Heterogeneous Treatment Effects by proposing natural variants of Random Forests to estimate the key conditional distributions. To achieve this, we recast Breiman's original splitting criterion in terms of Wasserstein distances between empirical measures. This reformulation indicates that Random Forests are well adapted to estimate con…
More
Submitted 8 June, 2020; originally announced June 2020. 

Journal ArticleFull Text Online

arXiv:2006.04678  [pdf, other] cs.LG stat.ML
Primal Wasserstein Imitation Learning
Authors: Robert Dadashi, Léonard Hussenot, Matthieu Geist, Olivier Pietquin
Abstract: Imitation Learning (IL) methods seek to match the behavior of an agent with that of an expert. In the present work, we propose a new IL method based on a conceptually simple algorithm: Primal Wasserstein Imitation Learning (PWIL), which ties to the primal form of the Wasserstein distance between the expert and the agent state-action distributions. We present a reward function which is derived offl…
More
Submitted 8 June, 2020; originally announced June 2020.

Conference ProceedingFull Text Online

Journal ArticleFull Text Online

Cited by 45  Related articles All 18 versions
17.7k members in the reinforcementlearning community. Reinforcement learning is a subfield of AI/statistics ...

Nov 13, 2020 · Uploaded by Wesley Liao

[R] Primal Wasserstein Imitation Learning : MachineLearning

Finally, we show that the behavior of the agent we train matches the behavior of the expert with the Wasserstein ...

Jun 9, 2020


arXiv:2006.04163  [pdf, other] cs.LG math.MG stat.ML
Generalized Spectral Clustering via Gromov-Wasserstein Learning
Authors: Samir Chowdhury, Tom Needham
Abstract: We establish a bridge between spectral clustering and Gromov-Wasserstein Learning (GWL), a recent optimal transport-based approach to graph partitioning. This connection both explains and improves upon the state-of-the-art performance of GWL. The Gromov-Wasserstein framework provides probabilistic correspondences between nodes of source and target graphs via a quadratic programming relaxation of t…
More
Submitted 7 June, 2020; originally announced June 2020. 

Related articles All 2 versions

arXiv:2006.03503  [pdf, other] cs.LG cs.RO stat.ML
Wasserstein Distance guided Adversarial Imitation Learning with Reward Shape Exploration
Authors: Ming Zhang, Yawei Wang, Xiaoteng Ma, Li Xia, Jun Yang, Zhiheng Li, Xiu Li
Abstract: The generative adversarial imitation learning (GAIL) has provided an adversarial learning framework for imitating expert policy from demonstrations in high-dimensional continuous tasks. However, almost all GAIL and its extensions only design a kind of reward function of logarithmic form in the adversarial training strategy with the Jensen-Shannon (JS) divergence for all complex environments. The f…
More
Submitted 5 June, 2020; originally announced June 2020.

Conference ProceedingFull Text Online

Cited by 7 Related articles All 6 versions

<——2020———————2020 ————-180—


arXiv:2006.12640  [pdf, other] stat.ME
Wasserstein Autoregressive Models for Density Time Series
Authors:
Chao Zhang, Piotr Kokoszka, Alexander Petersen
Abstract: Data consisting of time-indexed distributions of cross-sectional or intraday returns have been extensively studied in finance, and provide one example in which the data atoms consist of serially dependent probability distributions. Motivated by such data, we propose an autoregressive model for density time series by exploiting the tangent space structure on the space of distributions that is induc…
More
Submitted 22 June, 2020; originally announced June 2020. 


arXiv:2006.12287  [pdf, other] math.ST
Gromov-Wasserstein Distance based Object Matching: Asymptotic Inference
Authors:
Christoph Alexander Weitkamp, Katharina Proksch, Carla Tameling, Axel Munk
Abstract: In this paper, we aim to provide a statistical theory for object matching based on the Gromov-Wasserstein distance. To this end, we model general objects as metric measure spaces. Based on this, we propose a simple and efficiently computable asymptotic statistical test for pose invariant object discrimination. This is based on an empirical version of a β-trimmed lower bound of the Gromov-Wassers…
More
Submitted 24 June, 2020; v1 submitted 22 June, 2020; originally announced June 2020.
Comments: For a version with the complete supplement see [v2]
MSC Class: 62E20; 62G20; 65C60 (Primary) 60E05 (Secondary) 

Journal article
Cited by 5 Related articles All 7 versions

Gromov-Wasserstein Distance based Object Matching: Asymptotic Inference  book
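
For reference, the Gromov-Wasserstein distance between metric measure spaces (X, d_X, μ) and (Y, d_Y, ν) that this entry (and the spectral-clustering entry above) builds on is, up to a conventional factor,

\[
\mathrm{GW}_p(X,Y) \;=\; \frac{1}{2}\left( \inf_{\pi \in \Pi(\mu,\nu)} \iint \big| d_X(x,x') - d_Y(y,y') \big|^{p} \, d\pi(x,y)\, d\pi(x',y') \right)^{1/p},
\]

where \(\Pi(\mu,\nu)\) is the set of couplings of μ and ν; the test statistic of the paper uses an empirical, trimmed lower bound of this quantity.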

arXiv:2006.11783  [pdf, other]  cs.LG stat.ML doi 10.1007/978-3-030-50423-6_17
Missing Features Reconstruction Using a Wasserstein Generative Adversarial Imputation Network
Authors:
Magda Friedjungová, Daniel Vašata, Maksym Balatsko, Marcel Jiřina
Abstract: Missing data is one of the most common preprocessing problems. In this paper, we experimentally research the use of generative and non-generative models for feature reconstruction. Variational Autoencoder with Arbitrary Conditioning (VAEAC) and Generative Adversarial Imputation Network (GAIN) were researched as representatives of generative models, while the denoising autoencoder (DAE) represented…
More
Submitted 21 June, 2020; originally announced June 2020.
Comments: Preprint of the conference paper (ICCS 2020), part of the Lecture Notes in Computer Science
Journal ref: Computational Science - ICCS 2020. ICCS 2020. Lecture Notes in Computer Science 12140 (2020) 225-239 

  Book ChapterFull Text Online


2020

arXiv:2006.10325  [pdf, other] stat.ML cs.LG
When OT meets MoM: Robust estimation of Wasserstein Distance
Authors:
Guillaume Staerman, Pierre Laforgue, Pavlo Mozharovskyi, Florence d'Alché-Buc
Abstract: Issued from Optimal Transport, the Wasserstein distance has gained importance in Machine Learning due to its appealing geometrical properties and the increasing availability of efficient approximations. In this work, we consider the problem of estimating the Wasserstein distance between two probability distributions when observations are polluted by outliers. To that end, we investigate how to lev…
More
Submitted 18 June, 2020; originally announced June 2020. 

Journal article

arXiv:2006.09660  [pdf, other] stat.ME
Wasserstein Regression
Authors:
Yaqing Chen, Zhenhua Lin, Hans-Georg Müller
Abstract: The analysis of samples of random objects that do not lie in a vector space has found increasing attention in statistics in recent years. An important class of such object data is univariate probability measures defined on the real line. Adopting the Wasserstein metric, we develop a class of regression models for such data, where random distributions serve as predictors and the responses are eithe…
More
Submitted 17 June, 2020; originally announced June 2020.

Cited by 1 All 2 versions
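
For univariate distributions (the setting of this entry), the Wasserstein distance has a closed form in terms of quantile functions, which is what makes regression with distributional predictors and responses tractable:

\[
W_p^{p}(\mu,\nu) \;=\; \int_0^1 \big| F_\mu^{-1}(u) - F_\nu^{-1}(u) \big|^{p} \, du ,
\]

where \(F_\mu^{-1}\) and \(F_\nu^{-1}\) are the quantile functions of μ and ν.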

 

2020

arXiv:2006.09430  [pdf, other] cs.LG stat.ML
Wasserstein Embedding for Graph Learning
Authors:
Soheil Kolouri, Navid Naderializadeh, Gustavo K. Rohde, Heiko Hoffmann
Abstract: We present Wasserstein Embedding for Graph Learning (WEGL), a novel and fast framework for embedding entire graphs in a vector space, in which various machine learning models are applicable for graph-level prediction tasks. We leverage new insights on defining similarity between graphs as a function of the similarity between their node embedding distributions. Specifically, we use the Wasserstein…
More
Submitted 16 June, 2020; originally announced June 2020. 

Journal ArticleFull Text Online

arXiv:2006.09304  [pdf, other]  physics.ao-ph cond-mat.stat-mech nlin.CD
Ranking IPCC Models Using the Wasserstein Distance
Authors:
Gabriele Vissio, Valerio Lembo, Valerio Lucarini, Michael Ghil
Abstract: We propose a methodology for evaluating the performance of climate models based on the use of the Wasserstein distance. This distance provides a rigorous way to measure quantitatively the difference between two probability distributions. The proposed approach is flexible and can be applied in any number of dimensions; it allows one to rank climate models taking into account all the moments of the…
More
Submitted 16 June, 2020; originally announced June 2020.
Comments: 22 pages, 5 figures, 3 tables 


arXiv:2006.09187  [pdf, other] math.NA
Time Discretizations of Wasserstein-Hamiltonian Flows
Authors:
Jianbo Cui, Luca Dieci, Haomin Zhou
Abstract: We study discretizations of Hamiltonian systems on the probability density manifold equipped with the L2-Wasserstein metric. Based on discrete optimal transport theory, several Hamiltonian systems on graph (lattice) with different weights are derived, which can be viewed as spatial discretizations to the original Hamiltonian systems. We prove the consistency and provide the approximate orders f…
More
Submitted 16 June, 2020; originally announced June 2020.
Comments: 34 pages
MSC Class: Primary 65P10; Secondary 35R02; 58B20; 65M12 

 Cited by 5 Related articles All 4 versions

arXiv:2006.08812  [pdf, other] cs
Augmented Sliced Wasserstein Distances
Authors:
Xiongjie Chen, Yongxin Yang, Yunpeng Li
Abstract: While theoretically appealing, the application of the Wasserstein distance to large-scale machine learning problems has been hampered by its prohibitive computational cost. The sliced Wasserstein distance and its variants improve the computational efficiency through random projection, yet they suffer from low projection efficiency because the majority of projections result in trivially small value…
More
Submitted 17 June, 2020; v1 submitted 15 June, 2020; originally announced June 2020.
Comments: 16 pages, 5 figures 

Cited by 4 Related articles All 5 versions
Augmented Sliced Wasserstein Distances - Papers With Code

paperswithcode.com › paper › review

1:05

While theoretically appealing, the application of the Wasserstein distance to large-scale machine learning ...

May 7, 2020 - Uploaded by Ross Taylor


arXiv:2006.08172  [pdf, other] math   
Faster Wasserstein Distance Estimation with the Sinkhorn Divergence
Authors:
Lenaic Chizat, Pierre Roussillon, Flavien Léger, François-Xavier Vialard, Gabriel Peyré
Abstract: The squared Wasserstein distance is a natural quantity to compare probability distributions in a non-parametric setting. This quantity is usually estimated with the plug-in estimator, defined via a discrete optimal transport problem. It can be solved to ε-accuracy by adding an entropic regularization of order ε and using for instance Sinkhorn's algorithm. In this work, we propose instead to es…
More
Submitted 15 June, 2020; originally announced June 2020. 

Cited by 62 Related articles All 9 versions

Carolina Parada · Robotics at Google - SlidesLive

slideslive.com › robotics-at-google

Robotics at Google. Dec 6, 2020. Speakers ...

 Faster Wasserstein Distance Estimation with the Sinkhorn Divergence. 03:21 ...

SlidesLive · 

Dec 6, 2020
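
As background for the entropic estimator discussed in this entry, here is a textbook Sinkhorn iteration for entropic optimal transport between two histograms. This is the basic plug-in approach whose entropic bias the paper's Sinkhorn-divergence estimator is designed to reduce, so treat it as an illustrative baseline only.

```python
import numpy as np

def sinkhorn_plan(a, b, C, eps=0.05, n_iters=500):
    """Basic Sinkhorn iterations for entropic OT between histograms a and b
    with ground-cost matrix C. Returns the transport plan P and <P, C>."""
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]
    return P, float(np.sum(P * C))

# Toy example: two Gaussian-like histograms on a grid in [0, 1].
x = np.linspace(0.0, 1.0, 50)
C = (x[:, None] - x[None, :]) ** 2
a = np.exp(-((x - 0.3) ** 2) / 0.01); a /= a.sum()
b = np.exp(-((x - 0.7) ** 2) / 0.01); b /= b.sum()
P, cost = sinkhorn_plan(a, b, C)
print(cost)  # close to (0.7 - 0.3)**2 = 0.16, plus a small entropic bias
```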

<——2020——  2020————-190—  


   

arXiv:2007.06750  [pdf, ps, other] math.OC
Strong Formulations for Distributionally Robust Chance-Constrained Programs with Left-Hand Side Uncertainty under Wasserstein Ambiguity
Authors:
Nam Ho-Nguyen, Fatma Kılınç-Karzan, Simge Küçükyavuz, Dabeen Lee
Abstract: Distributionally robust chance-constrained programs (DR-CCP) over Wasserstein ambiguity sets exhibit attractive out-of-sample performance and admit big-M-based mixed-integer programming (MIP) reformulations with conic constraints. However, the resulting formulations often suffer from scalability issues as sample size increases. To address this shortcoming, we derive stronger formulations that sc…
More
Submitted 13 July, 2020; originally announced July 2020.
MSC Class: 90C17; 90C15; 90C11; 90C57 

[PDF] arxiv.org

Strong Formulations for Distributionally Robust Chance-Constrained Programs with Left-Hand Side Uncertainty under Wasserstein Ambiguity

N Ho-Nguyen, F Kılınç-Karzan, S Küçükyavuz… - arXiv preprint arXiv …, 2020 - arxiv.org

Distributionally robust chance-constrained programs (DR-CCP) over Wasserstein ambiguity

sets exhibit attractive out-of-sample performance and admit big-$ M $-based mixed-integer

programming (MIP) reformulations with conic constraints. However, the resulting …

  Cited by 3 Related articles All 3 versions 

 

  arXiv:2007.04462  [pdf, other]  cs.LG math.OC stat.ML
Scalable Computations of Wasserstein Barycenter via Input Convex Neural Networks
Authors: Jiaojiao Fan, Amirhossein Taghvaei, Yongxin Chen
Abstract: Wasserstein Barycenter is a principled approach to represent the weighted mean of a given set of probability distributions, utilizing the geometry induced by optimal transport. In this work, we present a novel scalable algorithm to approximate the Wasserstein Barycenters aiming at high-dimensional applications in machine learning. Our proposed algorithm is based on the Kantorovich dual formulation…
More
Submitted 8 July, 2020; originally announced July 2020.
Comments: 16 pages,12 figures
MSC Class: 49Q22; 62Dxx; 62F15
Cited by 15
Related articles All 7 versions


2020

arXiv:2007.03408  [pdf, other]  cs.CV cs.LG stat.ML
Wasserstein Generative Models for Patch-based Texture Synthesis
Authors: Antoine Houdard, Arthur Leclaire, Nicolas Papadakis, Julien Rabin
Abstract: In this paper, we propose a framework to train a generative model for texture image synthesis from a single example. To do so, we exploit the local representation of images via the space of patches, that is, square sub-images of fixed size (e.g. 4×4). Our main contribution is to consider optimal transport to enforce the multiscale patch distribution of generated images, which leads to two…
More
Submitted 19 June, 2020; originally announced July 2020. 

Journal ArticleFull Text Online

 

arXiv:2007.03085  [pdf, other]  cs.CV cs.LG
Wasserstein Distances for Stereo Disparity Estimation
Authors: Divyansh Garg, Yan Wang, Bharath Hariharan, Mark Campbell, Kilian Q. Weinberger, Wei-Lun Chao
Abstract: Existing approaches to depth or disparity estimation output a distribution over a set of pre-defined discrete values. This leads to inaccurate results when the true depth or disparity does not match any of these values. The fact that this distribution is usually learned indirectly through a regression loss causes further problems in ambiguous regions around object boundaries. We address these issu…
More
Submitted 6 July, 2020; originally announced July 2020. 

  

arXiv:2006.16824  [pdf, other]  math.AT
Wasserstein Stability for Persistence Diagrams
Authors: Primoz Skraba, Katharine Turner
Abstract: The stability of persistence diagrams is among the most important results in applied and computational topology. Most results in the literature phrase stability in terms of the bottleneck distance between diagrams and the ∞-norm of perturbations. This has two main implications: it makes the space of persistence diagrams rather pathological and it often provides very pessimistic bounds wi…
More
Submitted 30 June, 2020; originally announced June 2020. 

Journal article
Cited by 18
Related articles All 2 versions

Wasserstein Stability for Persistence Diagrams · SlidesLive

slideslive.com › wasserstein-stability-for-persistence-diagr...

... Topological Data Analysis and Beyond; Wasserstein Stability for Persistence Diagrams ... Wasserstein Stability for Persistence Diagrams. Dec 6, 2020 ...

SlidesLive · Dec 6, 2020

arXiv:2006.14566  [pdf, other]  eess.IV cs.CV
Multimarginal Wasserstein Barycenter for Stain Normalization and Augmentation
Authors: Saad Nadeem, Travis Hollmann, Allen Tannenbaum
Abstract: Variations in hematoxylin and eosin (H&E) stained images (due to clinical lab protocols, scanners, etc) directly impact the quality and accuracy of clinical diagnosis, and hence it is important to control for these variations for a reliable diagnosis. In this work, we present a new approach based on the multimarginal Wasserstein barycenter to normalize and augment H&E stained images given one or m…
More
Submitted 25 June, 2020; originally announced June 2020.
Comments: To appear in MICCAI 2020 

Cited by 11 Related articles All 9 versions

 arXiv:2006.12915  [pdf, ps, other]  eess.IV cs.CV
Deep Attentive Wasserstein Generative Adversarial Networks for MRI Reconstruction with Recurrent Context-Awareness
Authors:
Yifeng Guo, Chengjia Wang, Heye Zhang, Guang Yang
Abstract: The performance of traditional compressive sensing-based MRI (CS-MRI) reconstruction is affected by its slow iterative procedure and noise-induced artefacts. Although many deep learning-based CS-MRI methods have been proposed to mitigate the problems of traditional methods, they have not been able to achieve more robust results at higher acceleration factors. Most of the deep learning-based CS-MRI…
More
Submitted 23 June, 2020; originally announced June 2020. 

  Deep attentive wasserstein generative adversarial networks for MRI reconstruction with recurrent context-awareness

Y Guo, C Wang, H Zhang, G Yang - International Conference on Medical …, 2020 - Springer

The performance of traditional compressive sensing-based MRI (CS-MRI) reconstruction is

affected by its slow iterative procedure and noise-induced artefacts. Although many deep

learning-based CS-MRI methods have been proposed to mitigate the problems of traditional

methods, they have not been able to achieve more robust results at higher acceleration

factors. Most of the deep learning-based CS-MRI methods still can not fully mine the

information from the k-space, which leads to unsatisfactory results in the MRI reconstruction …

 Cited by 21 Related articles All 5 versions

book chapter   Conference Proceeding


arXiv:2006.08265  [pdf, other] cs.LG cs.CR stat.ML

GS-WGAN: A Gradient-Sanitized Approach for Learning Differentially Private Generators

Authors: Dingfan Chen, Tribhuvanesh Orekondy, Mario Fritz

Abstract: The wide-spread availability of rich data has fueled the growth of machine learning applications in numerous domains. However, growth in domains with highly-sensitive data (e.g., medical) is largely hindered as the private nature of data prohibits it from being shared. To this end, we propose Gradient-sanitized Wasserstein Generative Adversarial Networks (GS-WGAN), which allows releasing a sanitiz… More

Submitted 15 June, 2020; originally announced June 2020.
Cited by 4 All 6 versions
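
Several entries in this block train Wasserstein GANs. For orientation, a generic WGAN critic loss with gradient penalty (the non-private baseline that gradient-sanitized and other variants modify) is sketched below in PyTorch; it is a standard formulation, not the specific mechanism of any of the papers above.

```python
import torch

def wgan_gp_critic_loss(critic, real, fake, lambda_gp=10.0):
    """Generic WGAN critic loss with gradient penalty (WGAN-GP).
    The critic plays the role of the Kantorovich potential of W1."""
    fake = fake.detach()  # critic update only; the generator is trained separately
    # Wasserstein term: push critic scores up on real data, down on fake data.
    w_term = critic(fake).mean() - critic(real).mean()
    # Gradient penalty evaluated on random interpolates between real and fake.
    alpha = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    interp = (alpha * real + (1.0 - alpha) * fake).requires_grad_(True)
    grad = torch.autograd.grad(critic(interp).sum(), interp, create_graph=True)[0]
    gp = ((grad.flatten(1).norm(2, dim=1) - 1.0) ** 2).mean()
    return w_term + lambda_gp * gp
```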

 

WGAN domain adaptation for the joint optic disc-and-cup segmentation in fundus images

S Kadambi, Z Wang, E Xing - … Journal of Computer Assisted Radiology and …, 2020 - Springer

Purpose The cup-to-disc ratio (CDR), a clinical metric of the relative size of the optic cup to 

the optic disc, is a key indicator of glaucoma, a chronic eye disease leading to loss of vision. 

CDR can be measured from fundus images through the segmentation of optic disc and optic …

Cited by 14 Related articles All 3 versions

 WGAN domain adaptation for the joint optic disc-and-cup segmentation in fundus images.

0 citations* 

2020 International Journal of Computer Assisted Radiology and Surgery

Shreya Kadambi , 

Zeya Wang , Eric P. Xing 

Petuum Inc., Pittsburgh, PA, 15222, USA.


RDA-UNET-WGAN: An Accurate Breast Ultrasound Lesion Segmentation Using Wasserstein Generative Adversarial Networks

A Negi, ANJ Raj, R Nersisson, Z Zhuang… - … FOR SCIENCE AND …, 2020 - Springer

Early-stage detection of lesions is the best possible way to fight breast cancer, a disease 

with the highest malignancy ratio among women. Though several methods primarily based 

on deep learning have been proposed for tumor segmentation, it is still a challenging …

Related articles
Science - Science and Engineering; Reports from Shantou University Provide New Insights into Science and Engineering (Rda-unet-wgan: an Accurate Breast Ultrasound Lesion Segmentation Using Wasserstein... 

Journal of Engineering, Apr 27, 2020, 1790

Newspaper ArticleFull Text Online 

All 4 versions

Publisher:2020

<——2020——2020————  200—   


[HTML] hindawi.com

[HTML] Motion Deblurring in Image Color Enhancement by WGAN

J Feng, S Qi - International Journal of Optics, 2020 - hindawi.com

Motion deblurring and image enhancement are active research areas over the years. 

Although the CNN-based model has an advanced state of the art in motion deblurring and 

image enhancement, it fails to produce multitask results when challenged with the images of …

All 3 versions


[PDF] researchsquare.com

[PDF] Res-WGAN: Image Classification for Plant Small-scale Datasets

M Jiaqi, Y Si, Y Xiande, G Wanlin, L Minzan, Z Lihua… - 2020 - researchsquare.com

Background: Artificial identification of rare plants is an important yet challenging problem 

in plant taxonomy. Although deep learning-based method can accurately predict rare 

plant category from training samples, accuracy requirements of only few experts are …

Related articles All 3 versions

 

[PDF] thecvf.com

Severity-aware semantic segmentation with reinforced wasserstein training

X Liu, W Ji, J You, GE Fakhri… - Proceedings of the IEEE …, 2020 - openaccess.thecvf.com

Semantic segmentation is a class of methods to classify each pixel in an image into 

semantic classes, which is critical for autonomous vehicles and surgery systems. Cross-…

Cited by 17 Related articles All 7 versions

 Conference ProceedingCitation Online

[PDF] arxiv.org

Fisher information regularization schemes for Wasserstein gradient flows

W Li, J Lu, L Wang - Journal of Computational Physics, 2020 - Elsevier

We propose a variational scheme for computing Wasserstein gradient flows. The scheme 

builds upon the Jordan–Kinderlehrer–Otto framework with the Benamou-Brenier's dynamic 

formulation of the quadratic Wasserstein metric and adds a regularization by the Fisher …

Cited by 6 Related articles All 8 versions 

MR4107049 Prelim Li, Wuchen; Lu, Jianfeng; Wang, Li; Fisher information regularization schemes for Wasserstein gradient flows. J. Comput. Phys. 416 (2020), 109449, 24 pp. 65M08 (49M37 49Q22 90C55) 

Review PDF Clipboard Journal Article 

Cited by 22 Related articles All 7 versions
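
Both gradient-flow entries here build on the JKO (Jordan-Kinderlehrer-Otto) time discretization mentioned in the abstract. For reference, one implicit step of size τ for an energy \(\mathcal{E}\) is

\[
\rho_{k+1} \;\in\; \operatorname*{arg\,min}_{\rho} \; \frac{1}{2\tau}\, W_2^{2}(\rho, \rho_k) \;+\; \mathcal{E}(\rho),
\]

and letting τ → 0 recovers the gradient flow of \(\mathcal{E}\) with respect to the quadratic Wasserstein metric; the scheme of this entry adds a Fisher-information regularization to this variational problem.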

2020  see 2019  [PDF] arxiv.org

 


[PDF] mlr.press

Approximate inference with wasserstein gradient flows

C Frogner, T Poggio - International Conference on Artificial …, 2020 - proceedings.mlr.press

We present a novel approximate inference method for diffusion processes, based on the 

Wasserstein gradient flow formulation of the diffusion. In this formulation, the time-dependent 

density of the diffusion is derived as the limit of implicit Euler steps that follow the gradients …

Cited by 18 Related articles All 6 versions


[PDF] aaai.org

[PDF] Importance-Aware Semantic Segmentation in Self-Driving with Discrete Wasserstein Training.

X Liu, Y Han, S Bai, Y Ge, T Wang, X Han, S Li, J You… - AAAI, 2020 - aaai.org

Semantic segmentation (SS) is an important perception manner for self-driving cars and 

robotics, which classifies each pixel into a pre-determined class. The widely-used cross 

entropy (CE) loss-based deep networks has achieved significant progress wrt the mean …

Cited by 8 Related articles All 6 versions

arXiv:2010.12440  [pdf, other] cs.CV cs.LG cs.RO 

Importance-Aware Semantic Segmentation in Self-Driving with Discrete Wasserstein Training 

Authors: Xiaofeng Liu, Yuzhuo Han, Song Bai, Yi Ge, Tianxing Wang, Xu Han, Site Li, Jane You, Ju Lu
Abstract: Semantic segmentation (SS) is an important perception manner for self-driving cars and robotics, which classifies each pixel into a pre-determined class. The widely-used cross entropy (CE) loss-based deep networks has achieved significant progress w.r.t. the mean Intersection-over Union (mIoU). However, the cross entropy loss can not take the different importance of each class in an self-driving s…
More
Submitted 21 October, 2020; originally announced October 2020.
Comments: Published in Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI) 2020. arXiv admin note: text overlap with arXiv:2008.04751 

 

MR4121100 Prelim Hwang, Jinmi; Kim, Sejong; Tensor product and Hadamard product for the Wasserstein means. Linear Algebra Appl. 603 (2020), 496–507. 15B48 (15A45 15A69)

Review PDF Clipboard Journal Article 

Tensor product and Hadamard product for the Wasserstein means 

By: Hwang, Jinmi; Kim, Sejong 

LINEAR ALGEBRA AND ITS APPLICATIONS  Volume: ‏ 603   Pages: ‏ 496-507   Published: ‏ OCT 2020

MR4119947 Prelim Chigarev, Vladimir; Kazakov, Alexey; Pikovsky, Arkady; Kantorovich-Rubinstein-Wasserstein distance between overlapping attractor and repeller. Chaos 30 (2020), no. 7, 073114, 10 pp. 37C70

Review PDF Clipboard Journal Article 

Kantorovich–Rubinstein–Wasserstein distance between ...

Jul 7, 2020 - We consider several examples of dynamical systems demonstrating overlapping attractor and repeller. These systems are constructed via ...

[CITATION] Kantorovich-Rubinstein-Wasserstein distance between overlapping attractor and repeller, Chaos 30

V Chigarev, A Kazakov, A Pikovsky - 2020

Cited by 2 Related articles 
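
For the Kantorovich–Rubinstein–Wasserstein distance used in this entry (and in the later attractor/repeller items), recall the defining primal/dual pair:

\[
W_1(\mu,\nu) \;=\; \inf_{\pi \in \Pi(\mu,\nu)} \int d(x,y)\, d\pi(x,y) \;=\; \sup_{\operatorname{Lip}(f) \le 1} \left( \int f \, d\mu - \int f \, d\nu \right).
\]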


MR4118990 Prelim Backhoff-Veraguas, Julio; Bartl, Daniel; Beiglböck, Mathias; Eder, Manu; Adapted Wasserstein distances and stability in mathematical finance. Finance Stoch. 24 (2020), no. 3, 601–632. 91G80 (49Q22 60G44 60H30 90C15) 

Review PDF Clipboard Journal Article 

[PDF] arxiv.org

Adapted wasserstein distances and stability in mathematical finance

BV Julio, D Bartl, B Mathias, E Manu - Finance and Stochastics, 2020 - Springer

Assume that an agent models a financial asset through a measure Q with the goal to 

price/hedge some derivative or optimise some expected utility. Even if the model Q is 

chosen in the most skilful and sophisticated way, the agent is left with the possibility that Q 

does not provide an exact description of reality. This leads us to the following question: will 

the hedge still be somewhat meaningful for models in the proximity of Q? If we measure 

proximity with the usual Wasserstein distance (say), the answer is No. Models which are …

Cited by 6 Related articles All 11 versions


MR4118923 Prelim Xie, Weijun; Tractable reformulations of two-stage distributionally robust linear programs over the type-∞ Wasserstein ball. Oper. Res. Lett. 48 (2020), no. 4, 513–523. 90C15 (90C05)

 Review PDF Clipboard Journal Article 

Tractable reformulations of two-stage distributionally robust ...

This paper studies a two-stage distributionally robust stochastic linear program under the type- ∞ Wasserstein ball by providing sufficient conditions under which ...

by W Xie - ‎2020

Tractable reformulations of two-stage distributionally robust ...

Jun 23, 2020 - This paper studies a two-stage distributionally robust stochastic linear program under the type- ∞ Wasserstein ball by providing sufficient ...

by W Xie - ‎2020

<-—2020—— —2020——— 210 —


MR4117397 Prelim Han, Wei; Wang, Lizhe; Feng, Ruyi; Gao, Lang; Chen, Xiaodao; Deng, Ze; Chen, Jia; Liu, Peng; Sample generation based on a supervised Wasserstein generative adversarial network for high-resolution remote-sensing scene classification. Inform. Sci. 539 (2020), 177–194. 94A08 

Review PDF Clipboard Journal Article 

Sample Generation based on a Supervised Wasserstein ...

https://www.researchgate.net › publication › 342254976_Sample_Generatio...

Jul 10, 2020 - Request PDF | Sample Generation based on a Supervised Wasserstein Generative Adversarial Network for High-resolution Remote-sensing ...


 

MR4117303 Prelim Sagiv, Amir; Steinerberger, Stefan; Transport and interface: an uncertainty principle for the Wasserstein distance. SIAM J. Math. Anal. 52 (2020), no. 3, 3039–3051. 28A75 (49Q22 58C40) 

Review PDF Clipboard Journal Article 

Transport and Interface: An Uncertainty Principle for ... - SIAM

https://epubs.siam.org › doi › abs

by A Sagiv - ‎2020 - ‎Cited by 1 - ‎Related articles

Transport and Interface: An Uncertainty Principle for the Wasserstein Distance. Related Databases. Web of Science. You must be logged in with an active ...

TRANSPORT AND INTERFACE: AN UNCERTAINTY PRINCIPLE FOR THE WASSERSTEIN DISTANCE 

By: Sagiv, Amir; Steinerberger, Stefan 

SIAM JOURNAL ON MATHEMATICAL ANALYSIS  Volume: ‏ 52   Issue: ‏ 3   Pages: ‏ 3039-3051   Published: ‏ 2020 

online

Data on Mathematical Analysis Described by Researchers at Tel Aviv University 

(Transport and Interface: an Uncertainty Principle for the Wasserstein...

Mathematics Week, 08/2020

NewsletterFull Text Online

 Preview 

 Cite this item Email this item Save this item More actions

Cited by 8 Related articles All 3 versions
 

MR4116705 Prelim Alfonsi, Aurélien; Corbetta, Jacopo; Jourdain, Benjamin; Sampling of probability measures in the convex order by Wasserstein projection. Ann. Inst. Henri Poincaré Probab. Stat. 56 (2020), no. 3, 1706–1729. 91G60 (49Q22 60E15 60G42 90C08)

 Review PDF Clipboard Journal Article 

Sampling of probability measures in the convex order by Wasserstein projection

A Alfonsi, J Corbetta, B Jourdain - Annales de l'Institut Henri …, 2020 - projecteuclid.org

In this paper, for μ and ν two probability measures on ℝ^d with finite moments of order ϱ ≥ 1, we define the respective projections for the W_ϱ-Wasserstein distance of μ and ν on the sets of probability measures …

Cited by 8 Related articles All 5 versions

Sampling of probability measures in the convex order by Wasserstein projection 

By: Alfonsi, Aurelien; Corbetta, Jacopo; Jourdain, Benjamin 

ANNALES DE L INSTITUT HENRI POINCARE-PROBABILITES ET STATISTIQUES  Volume: ‏ 56   Issue: ‏ 3   Pages: ‏ 1706-1729   Published: ‏ AUG 2020 



 

MR4114015 Prelim Marcos, Aboubacar; Soglo, Ambroise; Solutions of a class of degenerate kinetic equations using steepest descent in Wasserstein space. J. Math. 2020, Art. ID 7489532, 30 pp. 35K59 (35Q20 35Q83 45K05 65M75) 

Review PDF Clipboard Journal Article 

Solutions of a Class of Degenerate Kinetic Equations Using ...

www.researchgate.net › publication › 342068632_Solutio...

Jun 23, 2020 - PDF | We use the steepest descent method in an Orlicz–Wasserstein space to study the existence of solutions for a very broad class of kinetic ...


MR4093037 Indexed Li, Jing; Huo, Hongtao; Liu, Kejian; Li, Chang Infrared and visible image fusion using dual discriminators generative adversarial networks with Wasserstein distance. Inform. Sci. 529 (2020), 28–41. 94A08 

Review PDF Clipboard Journal Article 

[PDF] researchgate.net

Infrared and Visible Image Fusion Using Dual Discriminators Generative Adversarial Networks with Wasserstein Distance

J Li, H Huo, K Liu, C Li - Information Sciences, 2020 - Elsevier

Generative adversarial network (GAN) has shown great potential in infrared and visible 

image fusion. The existing GAN-based methods establish an adversarial game between 

generative image and source images to train the generator until the generative image 

contains enough meaningful information from source images. However, they only design 

one discriminator to force the fused result to complement gradient information from visible 

image, which may lose some detail information that existing in infrared image and omit some …

Related articles All 2 versions


2020

MR4085708 Pending Buttazzo, Giuseppe; Carlier, Guillaume; Laborde, Maxime On the Wasserstein distance between mutually singular measures. Adv. Calc. Var. 13 (2020), no. 2, 141–154. 49J45 (49M29 49Q20) 

Review PDF Clipboard Journal Article 

the Wasserstein distance between mutually singular ...

Jan 18, 2020 - We study the Wasserstein distance between two measures {\mu,\nu} which are mutually singular. In particular, we are interested in ...

On the Wasserstein Distance Between Mutually Singular Measures. Advances in Calculus of Variations, 2020;13(2):141-154. Advances in Calculus of Variations can be contacted at: Walter De Gruyter Gmbh, Genthiner Strasse 13, D-10785 Berlin, Germany.

On the Wasserstein distance between mutually singular ...

https://www.degruyter.com › doi › acv-2017-0036 › pdf

by G Buttazzo · 2020 · Cited by 1 — 142 | G. Buttazzo et al., On the Wasserstein distance between mutually singular measures. (see Proposition 3.6). If μ L1 (or in slightly more ...

On the Wasserstein distance between mutually singular measures 

By: Buttazzo, Giuseppe; Carlier, Guillaume; Laborde, Maxime 

ADVANCES IN CALCULUS OF VARIATIONS  Volume: ‏ 13   Issue: ‏ 2   Pages: ‏ 141-154   Published: ‏ APR 2020 

Newspaper ArticleFull Text Online
ArticleFull Text Online

[PDF] archives-ouvertes.fr

On the Wasserstein distance between mutually singular measures

G Buttazzo, G Carlier, M Laborde - Advances in Calculus of Variations, 2020 - De Gruyter

  Cited by 1 Related articles All 6 versions

Mathematics - Calculus; New Findings in Calculus Described from University of Pisa (On the Wasserstein... 

News of Science, May 24, 2020, 496

Newspaper ArticleFull Text Online

 

MR4083852 Pending Chae, Minwoo; Walker, Stephen G. Wasserstein upper bounds of the total variation for smooth densities. Statist. Probab. Lett. 163 (2020), 108771, 6 pp. 60E15 

Review PDF Clipboard Journal Article 

Wasserstein upper bounds of the total variation for smooth densities

M Chae, SG Walker - Statistics & Probability Letters, 2020 - Elsevier

The total variation distance between probability measures cannot be bounded by the Wasserstein metric in general. If we consider sufficiently smooth probability densities, however, it is possible to bound the total variation by a power of the Wasserstein distance. We provide a sharp upper bound which depends on the Sobolev norms of the densities involved.

Cited by 1 Related articles All 2 versions 

Wasserstein upper bounds of the total variation for smooth densities

May 10, 2020 - Request PDF | Wasserstein upper bounds of the total variation for smooth densities | The total variation distance between probability measures ...

Cited by 4 Related articles All 4 versions


Studies from Tsinghua University Have Provided New Information about Operations Science 

 (Wasserstein Distributionally Robust Shortest Path Problem).

Science Letter, 07/2020

NewsletterFull Text Online
 

Study Data from Arizona State University Provide New Insights into Machine Learning (Hyperbolic Wasserstein... 

Robotics & Machine Learning, 06/2020

NewsletterFull Text Online 

Machine Learning; Study Data from Arizona State University Provide New Insights into Machine Learning (Hyperbolic Wasserstein... 

Journal of robotics & machine learning, Jun 15, 2020, 411

Newspaper ArticleFull Text Online 

Cited by 11 Related articles All 7 versions

<——2020——2020———  220   

 

New Findings in Calculus Described from University of Pisa 

(On the Wasserstein Distance Between Mutually Singular... 

Mathematics Week, 05/2020

NewsletterFull Text Online 


 

Researchers Submit Patent Application, "System And Method For Unsupervised Domain Adaptation Via Sliced-Wasserstein... 

Information Technology Newsweekly, 05/2020

NewsletterFull Text Online

HRL Laboratories LLC; Researchers Submit Patent Application, "System And Method For Unsupervised Domain Adaptation Via Sliced-Wasserstein... 

Information Technology Newsweekly, May 12, 2020, 7353

Newspaper ArticleFull Text Online 


A Riemannian submersion‐based approach to the Wasserstein ...

https://onlinelibrary.wiley.com › doi › mma

A Riemannian submersion‐based approach to the Wasserstein barycenter of positive definite matrices ... First published: 05 March 2020 ... which there is a significant development of various metric‐based means for positive definite matrices.

Investigators from Beijing Institute of Technology Target Mathematics in Applied Science 

(A Riemannian Submersion-based Approach To the Wasserstein... 

Mathematics Week, 06/2020

NewsletterFull Text Online 


2020 see 2019
New Neural Computation Study Results from Zhejiang University Described 

(Deep Joint Two-stream Wasserst... 

Robotics & Machine Learning, 06/2020

NewsletterFull Text Online 


2020  


www.researchgate.net › publication › 341086740_Orthog...

May 3, 2020 - Request PDF | Orthogonal Gradient Penalty for Fast Training of Wasserstein GAN Based Multi-Task Autoencoder toward Robust Speech ...

Technology - Information Technology; Korea University Reports Findings in Information Technology (Orthogonal Gradient Penalty for Fast Training of Wasserstein... 

Computer technology journal, Jun 4, 2020, 526

Newspaper ArticleFull Text Online 


Kangwon National University Researchers Highlight Recent Research in Applied Sciences (Knowledge-Grounded Chatbot Based on Dual Wasserstein... 

Science Letter, 05/2020

NewsletterFull Text Online
Science - Applied Sciences; Kangwon National University Researchers Highlight Recent Research in Applied Sciences (Knowledge-Grounded Chatbot Based on Dual Wasserstein...
 

Science Letter, May 29, 2020, 524

Newspaper ArticleFull Text Online 

  

Kantorovich–Rubinstein–Wasserstein distance between overlapping attractor and repeller

V Chigarev, A Kazakov, A Pikovsky - Chaos: An Interdisciplinary …, 2020 - aip.scitation.org

We consider several examples of dynamical systems demonstrating overlapping attractor

and repeller. These systems are constructed via introducing controllable dissipation to

prototypic models with chaotic dynamics (Anosov cat map, Chirikov standard map, and …

  Cited by 7 Related articles All 5 versions


EEG Signal Reconstruction Using a Generative Adversarial ...

Apr 30, 2020 - The nature of this contradiction makes EEG signal reconstruction with ... on generative adversarial networks with the Wasserstein distance ... 1College of Mathematics and Informatics, Fujian Normal University, ... Research on “GANs conditioned by brain signals” (Kavasidis et al., ... Published: 30 April 2020.


Fujian Normal University Researchers Report Research in Neuroinformatics (EEG Signal Reconstruction Using a Generative Adversarial Network With Wasserstein... 

Health & Medicine Week, 05/2020

NewsletterFull Text Online 

Information Technology - Neuroinformatics; Fujian Normal University Researchers Report Research in Neuroinformatics 

(EEG Signal Reconstruction Using a Generative Adversarial Network With Wasserstein... 

Health & Medicine Week, May 22, 2020, 1967

[CITATION] EEG signal reconstruction using a generative adversarial network with Wasserstein distance and temporal-spatial-frequency loss

EEG Signal Reconstruction Using a Generative Adversarial Network With Wasserstein Distance and Temporal-Spatial-Frequency Loss 

By: Luo, Tian-jian; Fan, Yachao; Chen, Lifei; et al.

FRONTIERS IN NEUROINFORMATICS  Volume: ‏ 14     Article Number: 15   Published: ‏ APR 30 2020 

Cited by 15 Related articles All 5 versions
<——2020—— 2020—— 230 —

 

   

Gromov–Hausdorff limit of Wasserstein spaces on point clouds

NG Trillos - Calculus of Variations and Partial Differential …, 2020 - Springer

We consider a point cloud X_n := {x_1, …, x_n} uniformly distributed on the flat torus T^d := R^d / Z^d, and construct a geometric graph on the cloud by connecting points that are within distance ε of …

Cited by 11 Related articles All 3 versions

Studies from University of Wisconsin in the Area of Differential Equations Reported (Gromov-Hausdorff limit of Wasserstein... 

Mathematics Week, 04/2020

NewsletterFull Text Online 

Gromov–Hausdorff limit of Wasserstein spaces on point clouds 

by García Trillos, Nicolás 

Calculus of variations and partial differential equations, 04/2020, Volume 59, Issue 2

Article PDF Download PDF 

Journal ArticleFull Text Online

    

Aggregated Wasserstein distance for hidden Markov models and automated morphological characterization of... 

by Chen, Yukun; Wang, James Z 

2020

In the past decade, fueled by the rapid advances of big data technology and machine learning algorithms, data science has become a new paradigm of science and...

Dissertation/Thesis

ONLINE, Electronic thesis, ONLINE 


A Wasserstein based two-stage distributionally robust ...

A Wasserstein based two-stage distributionally robust optimization model for optimal operation of CCHP micro-grid under uncertainties. Author links open overlay ...

by Y Wang - ‎2020 - ‎Cited by 3 - ‎Related articles

Study Findings on Electric Power Are Outlined in Reports from North China Electric Power University (A Wasserst... 

Energy Weekly News, 07/2020

NewsletterFull Text Online 

Energy - Electric Power; Study Findings on Electric Power Are Outlined in Reports from North China Electric Power University

(A Wasserstein Based Two-stage Distributionally Robust Optimization Model for Optimal Operation of Cchp Micro-grid Under Uncertainties)

Energy weekly news, Jul 10, 2020, 1431

Newspaper ArticleFull Text Online 

A Wasserstein based two-stage distributionally robust optimization model for optimal operation of CCHP micro-grid under uncertainties 

By: Wang, Yuwei; Yang, Yuanjuan; Tang, Liu; et al.

INTERNATIONAL JOURNAL OF ELECTRICAL POWER & ENERGY SYSTEMS  Volume: ‏ 119     Article Number: 105941   Published: ‏ JUL 2020 


Optimal control theory - The equivalence of Fourier-based and Wasserstein metrics on imaging problems

Authors: Auricchio G.; Codegoni A.; Gualandi S.; Toscani G.; Veneroni M.
Article, 2020
Publication:Atti della Accademia Nazionale dei Lincei, Classe di Scienze Fisiche, Matematiche e Naturali, Rendiconti Lincei Matematica e Applicazioni, 31, 2020, 627
Publisher:2020
 

A differential privacy protection method for deep learning based on WGAN feedback

Tao Tao, Bai Jianshu - 2020 - cnki.com.cn

To address the problem that attackers may steal sensitive information from deep learning training datasets through techniques such as generative adversarial networks (GAN), this paper combines differential privacy theory and proposes a differential privacy protection method for deep learning whose parameters are tuned via feedback from a Wasserstein generative adversarial network (WGAN). The method uses stochastic gradient descent for optimization …

[Translated from Chinese; original title: 基于 WGAN 反馈的深度学习差分隐私保护方法]

 

 2020


A convergent Lagrangian discretization for p-Wasserstein and flux-limited diffusion equations

https://www.aimsciences.org › doi › cpaa.2020190

by B Söliver · 2020 · Cited by 2 — A convergent Lagrangian discretization for p-Wasserstein and flux-limited diffusion ... Keywords: Drift diffusion equation, optimal transport, Lagrangian scheme. ... Communications on Pure & Applied Analysis, 2020, 19 (9) : 4227-4256. doi: ...

online

Mathematics; New Mathematics Findings from Technical University Munich (TU Munich) Reported 

(A Convergent Lagrangian Discretization for P-wasserstein and Flux-limited Diffusion Equations)

Journal of mathematics (Atlanta, Ga.), Jul 21, 2020, 504

  

Wasserstein and Kolmogorov Error Bounds for Variance-Gamma Approximation via Stein’s Method I 

by Gaunt, Robert E 

Journal of theoretical probability, 2018, Volume 33, Issue 1

Article PDF Download PDF 

Journal ArticleFull Text Online 


Wasserstein Index Generation Model: Automatic generation of time-series index with application to... 

by Xie, Fangzhou 

Economics Letters, 01/2020, Volume 186

I propose a novel method, the Wasserstein Index Generation model (WIG), to generate a public sentiment index automatically. To test the model’s effectiveness,...

Article PDF Download PDF 

Journal ArticleFull Text Online 


2020

Hyperbolic Wasserstein Distance for Shape Indexing

https://www.computer.org › csdl › journal › 2020/06

 J Shi · 2020 · Cited by 5 — The resulting hyperbolic Wasserstein distance can intrinsically measure the ... Hyperbolic Wasserstein Distance for Shape Indexing. 2020, pp. 1362-1376, vol.

Hyperbolic Wasserstein Distance for Shape Indexing 

by Shi, Jie; Wang, Yalin 

IEEE transactions on pattern analysis and machine intelligence, 06/2020, Volume 42, Issue 6

Shape space is an active research topic in computer vision and medical imaging fields. The distance defined in a shape space may provide a simple and refined...

Article PDF Download PDF 

Journal ArticleFull Text Online 

 Study Data from Arizona State University Provide New Insights into Machine Learning (Hyperbolic Wasserstein Distance for Shape Indexing)

Robotics & Machine Learning, 06/2020

NewsletterFull Text Online

online

Machine Learning; Study Data from Arizona State University Provide New Insights into Machine Learning (Hyperbolic Wasserstein Distance for Shape Indexing)

Journal of robotics & machine learning, Jun 15, 2020, 411

Newspaper ArticleFull Text Online


Adversarial sliced Wasserstein domain adaptation networks 

by Zhang, Yun; Wang, Nianbin; Cai, Shaobin 

Image and vision computing, 07/2020

Article PDF Download PDF 

Journal ArticleFull Text Online 

Cited by 5 Related articles All 2 versions
<——2020—— 2020——— 240 —


 

Wasserstein autoencoders for collaborative filtering 

by Zhang, Xiaofeng; Zhong, Jingbin; Liu, Kai 

Neural computing & applications, 07/2020

Article PDF Download PDF 

Journal ArticleFull Text Online 

Computation - Neural Computation; New Data from School of Computer Science Illuminate Findings in Neural Computation (Wasserstein...

Journal of robotics & machine learning, Aug 10, 2020, 180

Newspaper ArticleCitation Online

(08/10/2020). "Computation - Neural Computation; New Data from School of Computer Science Illuminate Findings in Neural Computation 

(Wasserstein Autoencoders for Collaborative Filtering)". Journal of robotics & machine learning (1944-1851), p. 180.


Learning to Align via Wasserstein for Person Re-Identification 

by Zhang, Zhizhong; Xie, Yuan; Li, Ding; More... 

IEEE transactions on image processing, 2020, Volume 29

Existing successful person re-identification (Re-ID) models often employ the part-level representation to extract the fine-grained information, but commonly...

Article PDF Download PDF 

Journal ArticleFull Text Online 

Findings from Chinese Academy of Sciences Broaden Understanding of Imaging Technology (Learning To Align Via Wasserstein...
Journal of Technology & Science, 08/2020

NewsletterFull Text Online 

Technology - Imaging Technology; Findings from Chinese Academy of Sciences Broaden Understanding of Imaging Technology (Learning To Align Via Wasserstein...

Journal of technology & science, Aug 9, 2020, 657

Newspaper ArticleFull Text Online 

Learning to Align via Wasserstein for Person Re-Identification

Z Zhang, Y Xie, D Li, W Zhang… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

Existing successful person re-identification (Re-ID) models often employ the part-level representation to extract the fine-grained information, but commonly use the loss that is particularly designed for global features, ignoring the r…

Cited by 7 Related articles All 2 versions
Zbl 07586386

   

On the computation of Wasserstein barycenters 

by Puccetti, Giovanni; Rüschendorf, Ludger; Vanduffel, Steven 

Journal of Multivariate Analysis, 03/2020, Volume 176

The Wasserstein barycenter is an important notion in the analysis of high dimensional data with a broad range of applications in applied probability,...

Article PDF Download PDF 

Journal ArticleFull Text Online 

 of the input measures on the line, in which case problem (1) can be efficiently …

Cited by 15 Related articles All 8 versions

   2020  see 2019

The quadratic Wasserstein metric for inverse data matching 

by Engquist, Björn; Ren, Kui; Yang, Yunan 

Inverse problems, 05/2020, Volume 36, Issue 5

Article PDF Download PDF 

Cited by 22 Related articles All 7 versions


  Kantorovich–Rubinstein–Wasserstein distance between overlapping attractor and repeller 

by Chigarev, Vladimir; Kazakov, Alexey; Pikovsky, Arkady 

Chaos (Woodbury, N.Y.), 07/2020, Volume 30, Issue 7

We consider several examples of dynamical systems demonstrating overlapping attractor and repeller. These systems are constructed via introducing controllable...

Article PDF Download PDF 

Journal ArticleFull Text Online 

All 2 versions

Kantorovich–Rubinstein–Wasserstein distance between overlapping attractor and repeller

aip.scitation.org › doi

by V Chigarev · 2020 · Cited by 2 — Kantorovich–Rubinstein– ... Below, we apply the KRWD to characterize difference between attractors and repellers. ... panel (c) stars overlap with pluses].

ABSTRACT · ‎INTRODUCTION · ‎II. BASIC MODELS · ‎IV. DISCUSSION

[CITATION] Kantorovich-Rubinstein-Wasserstein distance between overlapping attractor and repeller, Chaos 30

V Chigarev, A Kazakov, A Pikovsky - 2020

Cited by 11 Related articles All 7 versions
   

2020  [PDF] arxiv.org

Differentiable maps between Wasserstein spaces

B Lessel, T Schick - arXiv preprint arXiv:2010.02131, 2020 - arxiv.org

A notion of differentiability is being proposed for maps between Wasserstein spaces of order

2 of smooth, connected and complete Riemannian manifolds. Due to the nature of the

tangent space construction on Wasserstein spaces, we only give a global definition of …

  Related articles All 2 versions 


Generating Hyperspectral Data Based on 3D CNN and Improved Wasserstein Generative Adversarial Network Using Homemade High-resolution Datasets

Y Li, D Huang - Proceedings of the International Conference on …, 2020 - dl.acm.org

Hyperspectral images contain rich information on the fingerprints of materials and are being

popularly used in the exploration of oil and gas, environmental monitoring, and remote

sensing. Since hyperspectral images cover a wide range of wavelengths with high

resolution, they can provide rich features for enhancing the subsequent target detection and

classification procedure. The recently proposed deep learning algorithms have been

frequently utilized to extract features from hyperspectral images. However, these algorithms …

  Related articles

online

Generating Hyperspectral Data Based on 3D CNN and Improved Wasserstein Generative Adversarial Network Using Homemade High-resolution Datasets

by Li, Yin; Huang, Da

Proceedings of the International Conference on wireless communication and sensor networks, 05/2020

Hyperspectral images contain rich information on the fingerprints of materials and are being popularly used in the exploration of oil and gas, environmental...

Conference ProceedingFull Text Online


   

Drift compensation algorithm based on Time-Wasserstein ...

ieeexplore.ieee.org › abstract › document

by Y Tao · 2020 — This paper proposes Time-Wasserstein dynamic distribution alignment (TWDDA) to solve drift compensation according to the domain adaptive ...

Date of Conference: 9-11 Aug. 2020

DOI: 10.1109/ICCC49849.2020.9238779

Date Added to IEE

online

Drift compensation algorithm based on Time-Wasserstein dynamic distribution alignment

by Tao, Yang; Zeng, Kewei; Liang, Zhifang

2020 IEEE/CIC International Conference on Communications in China (ICCC), 08/2020

The electronic nose (E-nose) is mainly used to detect different types and concentrations of gases. At present, the average life of E-nose is relatively short,...

Conference ProceedingFull Text Online


Wasserstein distributionally robust shortest path problem ...

The model is extended to solve the distributionally robust bi-criteria shortest path problem as well as minimum flow cost problems. Abstract. This paper proposes a ...

by Z Wang - ‎2020 - ‎Cited by 1 - ‎Related articles

Science - Operations Science; Studies from Tsinghua University Have Provided New Information about Operations Science (Wasserstein... 

(Wasserstein Distributionally Robust Shortest Path Problem). Science Letter. July 10, 2020; p 2320.

Science letter (Atlanta, Ga.), Jul 10, 2020, 2320

Newspaper ArticleFull Text Online 

online

Studies from Tsinghua University Have Provided New Information about Operations Science 

(Wasserstein Distributionally Robust Shortest Path Problem)

Science Letter, 07/2020

NewsletterFull Text Online

Science - Operations Science; Studies from Tsinghua University Have Provided New Information about Operations Science (Wasserstein Distributionally Robust Shortest Path Problem)

Science letter (Atlanta, Ga.), Jul 10, 2020, 2320

Newspaper ArticleFull Text Online
Cited by 11
Related articles All 7 versions

Palo Alto Research Center Submits United States Patent Application for Object Shape Regression Using Wasser... 

Global IP News. Information Technology Patent News, Jun 18, 2020

Newspaper ArticleFull Text Online
Palo Alto Research Center Incorporated; "Object Shape Regression Using Wasserstein Distance" in Patent... 

Journal of engineering (Atlanta, Ga.), Jul 6, 2020, 5046   patent

Newspaper ArticleFull Text Online 

[PDF] googleapis.com

Object shape regression using wasserstein distance

J Sun, SKP Kumar, R Bala - US Patent App. 16/222,062, 2020 - Google Patents

One embodiment can provide a system for detecting outlines of objects in images. During 

operation, the system receives an image that includes at least one object, generates a 

random noise signal, and provides the received image and the random noise signal to a …

All 2 versions

 OPEN ACCESS

OBJECT SHAPE REGRESSION USING WASSERSTEIN DISTANCE

by KALLUR PALLI KUMAR, Sricharan; BALA, Raja; SUN, Jin

06/2020

One embodiment can provide a system for detecting outlines of objects in images. During operation, the system receives an image that includes at least one...

PatentCitation Online

 Cited by 11 Related articles All 7 versions
 
<——2020———2020 ———  250—    


 

Engineering; Study Data from China Academy of Electronics and Information Technology Update Understanding of Engineering (Robust Multivehicle Tracking With Wasserstein... 

Journal of Engineering, May 4, 2020, 4570

Newspaper ArticleFull Text Online

IEEE ACCESS  Volume: ‏ 8   Pages: ‏ 47863-47876   Published: ‏ 2020 

Robust Multivehicle Tracking with Wasserstein ... - X-MOL

https://www.x-mol.com › paper › adv


Jan 1, 2020 — Vehicle tracking based on surveillance vid

HRL Laboratories Applies for Patent on System and Method for Unsupervised Domain Adaptation Via Sliced-Wasserstein Distance

System and method for unsupervised domain adaptation via sliced-wasserstein distance

AJ Gabourie, M Rostami, S Kolouri… - US Patent App. 16 …, 2020 - freepatentsonline.com

Described is a system for unsupervised domain adaptation in an autonomous learning 

agent. The system adapts a learned model with a set of unlabeled data from a target 

domain, resulting in an adapted model. The learned model was previously trained to …

Cited by 2 Related articles
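
The HRL patent record above concerns unsupervised domain adaptation via the sliced-Wasserstein distance. For orientation only (not the patented method), a minimal Monte-Carlo sketch of the sliced Wasserstein distance between two point clouds might look as follows; the function name, projection count, and quantile grid are illustrative assumptions.

```python
import numpy as np

def sliced_wasserstein(X, Y, n_projections=100, p=2, seed=None):
    """Monte-Carlo estimate of the sliced Wasserstein-p distance between two
    point clouds X (n, d) and Y (m, d) with uniform weights: project onto
    random directions and average the 1-D transport costs (quantile matching)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    thetas = rng.normal(size=(n_projections, d))
    thetas /= np.linalg.norm(thetas, axis=1, keepdims=True)   # unit directions

    qs = (np.arange(200) + 0.5) / 200        # common quantile grid
    total = 0.0
    for theta in thetas:
        xq = np.quantile(X @ theta, qs)      # 1-D optimal transport reduces to
        yq = np.quantile(Y @ theta, qs)      # matching quantiles of projections
        total += np.mean(np.abs(xq - yq) ** p)
    return (total / n_projections) ** (1.0 / p)

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(500, 5))
Y = rng.normal(0.5, 1.0, size=(400, 5))
print("sliced W_2 estimate:", sliced_wasserstein(X, Y, seed=1))
```
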


2020  see 2019

[PDF] arxiv.org

Adapted wasserstein distances and stability in mathematical finance

BV Julio, D Bartl, B Mathias, E Manu - Finance and Stochastics, 2020 - Springer

Assume that an agent models a financial asset through a measure Q with the goal to 

price/hedge some derivative or optimise some expected utility. Even if the model Q is 

chosen in the most skilful and sophisticated way, the agent is left with the possibility that Q 

does not provide an exact description of reality. This leads us to the following question: will 

the hedge still be somewhat meaningful for models in the proximity of Q? If we measure 

proximity with the usual Wasserstein distance (say), the answer is No. Models which are …

Cited by 6 Related articles All 11 versions

Finance - Finance and Stochastics; Study Results from University of Vienna Update Understanding of Finance and Stochastics (Adapted Wasserstein... 

Journal of mathematics (Atlanta, Ga.), Jun 30, 2020, 949

Newspaper ArticleFull Text Online 

Global IP News. Information Technology Patent News, Apr 23, 2020  patent

Newspaper ArticleFull Text Online 

Adapted Wasserstein distances and stability in mathematical finance 

By: Backhoff-Veraguas, Julio; Bartl, Daniel; Beiglboeck, Mathias; et al.

FINANCE AND STOCHASTICS  Volume: ‏ 24   Issue: ‏ 3   Pages: ‏ 601-632   Published: ‏ JUL 2020 

Early Access: JUN 2020 

online

Study Results from University of Vienna Update Understanding of Finance and Stochastics 

(Adapted Wasserstein Distances and Stability In Mathematical Finance)

Investment Weekly News, 07/2020

NewsletterFull Text Online

 online

Finance - Finance and Stochastics; Study Results from University of Vienna Update Understanding of Finance and Stochastics (Adapted Wasserstein Distances and Stability In Mathematical Finance)

Journal of mathematics (Atlanta, Ga.), Jun 30, 2020, 949

Newspaper ArticleFull Text Online

Wasserstein Generative Adversarial Network and ... - Hindawi

Wasserstein Generative Adversarial Network and Convolutional Neural Network (WG-CNN) for Bearing Fault Diagnosis. Hang Yin ,1,2 Zhongzhi Li ,2 Jiankai ...

by H Yin  2020  - ‎Related articles
New Findings in Mathematics Described from Zhongkai University of Agriculture and Engineering

 [Wasserstein Generative Adversarial Network and Convolutional Neural Network (WG-CNN) for Bearing Fault Diagnosis]. Journal of Technology. June 9, 2020; p 1248.

Mathematics; New Findings in Mathematics Described from Zhongkai University of Agriculture and Engineering [Wasserstein... 

Journal of technology (Atlanta, Ga.), Jun 9, 2020, 1248

Newspaper ArticleFull Text Online 

Wasserstein Generative Adversarial Network and Convolutional Neural Network (WG-CNN) for Bearing Fault Diagnosis 

By: Yin, Hang; Li, Zhongzhi; Zuo, Jiankai; et al.

MATHEMATICAL PROBLEMS IN ENGINEERING  Volume: ‏ 2020     Article Number: 2604191   Published: ‏ MAY 11 2020 

Cited by 41 Related articles All 7 versions

Recent Studies from University of Manchester Add New Data to Probability Research 

(Wasserstein and Kolmogorov Error Bounds for Variance-gamma Approximation Via Stein's Method I). Journal of Mathematics. June 2, 2020; p 9

Probability Research; Recent Studies from University of Manchester Add New Data to Probability Research (Wa... 

Journal of mathematics (Atlanta, Ga.), Jun 2, 2020, 939

Newspaper ArticleFull Text Online 


Reconstruction of shale image based on Wasserstein Generative Adversarial Networks with gradient penalty. 

Advances in Geo-Energy Research, 2020,4(1):107-114. (Advances in Geo-Energy Research - http://www.astp-

(PDF) Reconstruction of shale image based on Wasserstein ...

May 14, 2020 - PDF | Generative Adversarial Networks (GANs), as most popular artificial intelligence models in the current image generation field, have ...

Reconstruction of shale image based on Wasserstein ...

Reconstruction of shale image based on Wasserstein Generative Adversarial Networks with gradient penalty. Wenshu Zha, Xingbao Li, Yan Xing, Lei He, ...

by W Zha - ‎2020

Engineering - Geoenergy Research; Researchers from Hefei University of Technology Discuss Research in Geoenergy Research (Reconstruction of shale image based on Wasserstein... 

Network Weekly News, May 25, 2020, 610

Newspaper ArticleFull Text Online 

Researchers from Hefei University of Technology Discuss Research in Geoenergy Research (Reconstruction of shale image based on Wasserstein... 

Energy Weekly News, 05/2020

NewsletterFull Text Online 

Cited by 22 Related articles

 (PDF) Calculating the Wasserstein Metric-Based Boltzmann ...

Jun 23, 2020 - Calculating the Wasserstein Metric-Based Boltzmann Entropy of a Landscape Mosaic. Article (PDF Available) in Entropy 22(4):381 · March ...

Calculating the Wasserstein Metric-Based Boltzmann ... - MDPI

This study developed a new software tool for conveniently calculating the Wasserstein metric-based Boltzmann entropy. The tool provides a user-friendly ...

by H Zhang - ‎2020 - ‎Cited by 1 - ‎Related articles

Southwest Jiaotong University Researchers Add New Findings in the Area of Entropy 

(Calculating the Wasserstein Metric-Based Boltzmann Entropy of a Landscape Mosaic). Computer Weekly News. April 15, 2020; p 864.

Entropy; Southwest Jiaotong University Researchers Add New Findings in the Area of Entropy (Calculating the Wa... 

Computer Weekly News, Apr 15, 2020, 864

Newspaper ArticleFull Text Online 

Southwest Jiaotong University Researchers Add New Findings in the Area of Entropy 

(Calculating the Wasser... 

Computer Weekly News, 04/2020

NewsletterFull Text Online 

Cited by 8 Related articles All 9 versions

FRWCAE: joint faster-RCNN and Wasserstein convolutional ...

Mar 2, 2020 - FRWCAE: joint faster-RCNN and Wasserstein convolutional auto-encoder for instance retrieval. Yi-yang Zhang ,; Yong Feng ...

by Y Zhang - ‎2020 - ‎Related articles

Applied Intelligence; Investigators at Chongqing University Report Findings in Applied Intelligence 

(Frwcae: Joint Faster-rcnn and Wasserstein... 

Journal of Robotics & Machine Learning, Mar 30, 2020, 118

Newspaper ArticleFull Text Online 

FRWCAE: joint faster-RCNN and Wasserstein convolutional auto-encoder for instance retrieval 

By: Zhang, Yi-yang; Feng, Yong; Liu, Da-jiang; et al.

APPLIED INTELLIGENCE  Volume: ‏ 50   Issue: ‏ 7   Pages: ‏ 2208-2221   Published: ‏ JUL 2020 

Early Access: MAR 2020 

Cited by 3 Related articles All 4 versions  


[PDF] mlr.press

Wasserstein smoothing: Certified robustness against wasserstein adversarial attacks

A Levine, S Feizi - International Conference on Artificial …, 2020 - proceedings.mlr.press

In the last couple of years, several adversarial attack methods based on different threat 

models have been proposed for the image classification problem. Most existing defenses 

consider additive threat models in which sample perturbations have bounded L_p norms …

Cited by 6 Related articles All 2 versions
<——2020———2020—————  260—   

       

 

[PDF] psu.edu

[PDF] Subexponential upper and lower bounds in Wasserstein distance for Markov processes

A Arapostathis, G Pang, N Sandric - personal.psu.edu

In this article, relying on Foster-Lyapunov drift conditions, we establish subexponential 

upper and lower bounds on the rate of convergence in the Lp-Wasserstein distance for a 

class of irreducible and aperiodic Markov processes. We further discuss these results in the …

Related articles


[PDF] researchgate.net

[PDF] Distributionally Robust XVA via Wasserstein Distance: Wrong Way Counterparty Credit and Funding Risk

D Singh, S Zhang - researchgate.net

This paper investigates calculations of robust XVA, in particular, credit valuation adjustment 

(CVA) and funding valuation adjustment (FVA) for over-the-counter derivatives under 

distributional uncertainty using Wasserstein distance as the ambiguity measure. Wrong way …

Related articles All 2 versions


2020  [PDF] researchgate.net

[PDF] Dual Decomposition of Two-Stage Distributionally Robust Mixed-Integer Programming under the Wasserstein Ambiguity Set

K Kim - researchgate.net

We develop a dual decomposition of two-stage distributionally robust mixed-integer 

programming (DRMIP) under the Wasserstein ambiguity set. The dual decomposition is 

based on the Lagrangian dual of DRMIP, which results from the Lagrangian relaxation of the …

Cited by 1 Related articles All 2 versions


[PDF] optimization-online.org

[PDF] Dual Decomposition of Two-Stage Distributionally Robust Mixed-Integer Programming under the Wasserstein Ambiguity Set

K Kim - optimization-online.org

We develop a dual decomposition of two-stage distributionally robust mixed-integer

programming (DRMIP) under the Wasserstein ambiguity set. The dual decomposition is

based on the Lagrangian dual of DRMIP, which results from the Lagrangian relaxation of the …

  Cited by 1 Related articles All 2 versions 


 Improving Wasserstein Generative Models for Image ...

www.research-collection.ethz.ch › handle

Improving Wasserstein Generative Models for Image Synthesis and Enhancement ... 2020. Type. Doctoral Thesis. ETH Bibliography. yes. Altmetrics. Download.

by J Wu · ‎2020

[CITATION] Improving Wasserstein Generative Models for Image Synthesis and Enhancement

J Wu - 2020 - research-collection.ethz.ch



2020

[PDF] A Fast Globally Linearly Convergent Algorithm for the Computation of Wasserstein Barycenters

L Yang, J Li, D Sun, KC Toh - Journal of Machine Learning Research, 2020 - polyu.edu.hk

We consider the problem of computing a Wasserstein barycenter for a set of discrete probability distributions with finite supports, which finds many applications in areas such as statistics, machine learning and image processing. When the support points of the …

Mathematics > Optimization and Control

[Submitted on 12 Sep 2018 (v1), last revised 16 Apr 2020 (this version, v4)]

A Fast Globally Linearly Convergent Algorithm for the Computation of Wasserstein Barycenters

Lei Yang, Jia Li, Defeng Sun, Kim-Chuan Toh

Submission history

From: Lei Yang [view email] 

[v1] Wed, 12 Sep 2018 04:13:48 UTC (297 KB)

[v2] Thu, 2 May 2019 02:36:49 UTC (534 KB)

[v3] Sun, 28 Jul 2019 07:24:30 UTC (548 KB)

[v4] Thu, 16 Apr 2020 08:56:56 UTC (372 KB)


arXiv:2007.14667  [pdf, ps, other]  math.PR
Wasserstein Convergence Rate for Empirical Measures on Noncompact Manifolds
Authors:
Feng-Yu Wang
Abstract: Let X_t be the (reflecting) diffusion process generated by L := Δ + ∇V on a complete connected Riemannian manifold M, possibly with a boundary ∂M, where V ∈ C^1(M) is such that μ(dx) := e^{V(x)} dx is a probability measure. We estimate the convergence rate for the empirical measure μ_t := (1/t) ∫_0^t δ_{X_s} ds under the Wasserstein distance. As a typical example, when… More
Submitted 29 July, 2020; originally announced July 2020.
Comments: 18 pages 


arXiv:2007.12378  [pdf, other]  math.ST
Global sensitivity analysis and Wasserstein spaces
Authors:
Jean-Claude Fort, Thierry Klein, Agnès Lagnoux
Abstract: Sensitivity indices are commonly used to quantify the relative influence of any specific group of input variables on the output of a computer code. In this paper, we focus both on computer codes the output of which is a cumulative distribution function and on stochastic computer codes. We propose a way to perform a global sensitivity analysis for these kinds of computer codes. In the first setting, we d…
More
Submitted 24 July, 2020; originally announced July 2020

 

arXiv:2007.11465  [pdf, ps, other]  cs.LG cs.CV stat.ML
Wasserstein Routed Capsule Networks
Authors:
Alexander Fuchs, Franz Pernkopf
Abstract: Capsule networks offer interesting properties and provide an alternative to today's deep neural network architectures. However, recent approaches have failed to consistently achieve competitive results across different image datasets. We propose a new parameter efficient capsule architecture, that is able to tackle complex tasks by using neural networks trained with an approximate Wasserstein obje…
More
Submitted 22 July, 2020; originally announced July 2020.
Comments: 8 pages, 3 figures
ACM Class: I.2.10 

 Cited by 2 Related articles All 3 versions

arXiv:2007.11401  [pdf, ps, other]  math.ST
Wasserstein Statistics in One-dimensional Location-Scale Model
Authors: Shun-ichi Amari, Takeru Matsuda
Abstract: Wasserstein geometry and information geometry are two important structures to be introduced in a manifold of probability distributions. Wasserstein geometry is defined by using the transportation cost between two distributions, so it reflects the metric of the base manifold on which the distributions are defined. Information geometry is defined to be invariant under reversible transformations of t…
More
Submitted 21 July, 2020; originally announced July 2020.
Comments: arXiv admin note: substantial text overlap with arXiv:2003.05479 

   <——2020———2020———  270  


arXiv:2007.11247  [pdf]  physics.med-ph eess.IV
A material decomposition method for dual-energy CT via dual interactive Wasserstein generative adversarial networks
Authors: Zaifeng Shi, Huilong Li, Qingjie Cao, Zhongqi Wang, Ming Cheng
Abstract: Dual-energy computed tomography has great potential in material characterization and identification, whereas the reconstructed material-specific images always suffer from magnified noise and beam hardening artifacts. In this study, a data-driven approach using dual interactive Wasserstein generative adversarial networks is proposed to improve the material decomposition accuracy. Specifically, two…
More
Submitted 22 July, 2020; originally announced July 2020.
Comments: 40 pages, 10 figures, research article 


 

arXiv:2007.09456  [pdf, ps, other]  cs.CL cs.LG stat.ML
On a Novel Application of Wasserstein-Procrustes for Unsupervised Cross-Lingual Learning
Authors:
Guillem Ramírez, Rumen Dangovski, Preslav Nakov, Marin Soljačić
Abstract: The emergence of unsupervised word embeddings, pre-trained on very large monolingual text corpora, is at the core of the ongoing neural revolution in Natural Language Processing (NLP). Initially introduced for English, such pre-trained word embeddings quickly emerged for a number of other languages. Subsequently, there have been a number of attempts to align the embedding spaces across languages,…
More
Submitted 18 July, 2020; originally announced July 2020. 


[PDF] academia.edu

Data augmentation in fault diagnosis based on the Wasserstein generative adversarial network with gradient penalty

X Gao, F Deng, X Yue - Neurocomputing, 2020 - Elsevier

Fault detection and diagnosis in industrial process is an extremely essential part to keep away from undesired events and ensure the safety of operators and facilities. In the last few decades various data based machine learning algorithms have been widely studied to …

Cited by 10 Related articles

[PDF] Pattern-Based Music Generation with Wasserstein Autoencoders and PRCDescriptions

V Borghuis, L Angioloni, L Brusci, P Frasconi - ijcai.org

We present a pattern-based MIDI music generation system with a generation strategy based 

on Wasserstein autoencoders and a novel variant of pianoroll descriptions of patterns which 

employs separate channels for note velocities and note durations and can be fed into classic …

Data augmentation in fault diagnosis based on the Wasserstein generative adversarial network with gradient penalty 

By: Gao, Xin; Deng, Fang; Yue, Xianghu 

NEUROCOMPUTING  Volume: ‏ 396   Pages: ‏ 487-494   Published: ‏ JUL 5 2020 

ArticleFull Text Online
Cited by 109 Related articles All 2 versions


[PDF] uniroma1.it

[PDF] Nonparametric Density Estimation with Wasserstein Distance for Actuarial Applications

EG Luini - iris.uniroma1.it

Density estimation is a central topic in statistics and a fundamental task of actuarial sciences. 

In this work, we present an algorithm for approximating multivariate empirical densities with 

a piecewise constant distribution defined on a hyperrectangular-shaped partition of the …

Related articles All 2 versions

Two approaches for population Wasserstein barycenter problem: Stochastic Averaging versus Sample Average Approximation

D Dvinskikh, A Gasnikov - nnov.hse.ru

Abstract In Machine Learning and Optimization community there are two main approaches 

for convex risk minimization problem: Stochastic Averaging (SA) and Sample Average 

Approximation (SAA). At the moment, it is known that both approaches are on average …

[PDF] researchgate.net

[PDF] ADDENDUM TO "ISOMETRIC STUDY OF WASSERSTEIN SPACES – THE REAL LINE"

G PÁL GEHÉR, T TITKOS, D VIROSZTEK - researchgate.net

We show an example of a Polish metric space X whose quadratic Wasserstein space W2 (X) 

possesses an isometry that splits mass. This gives an affirmative answer to Kloeckner's 

question,[2, Question 2]. Let us denote the metric space ([0, 1],|·|), equipped with the usual …

Related articles

[CITATION] Estimating processes in adapted Wasserstein distance

J Backhoff-Veraguas, D Bartl, M Beiglböck, J Wiesel - arXiv preprint arXiv:2002.07261, 2020

Cited by 2 Related articles


2020

[PDF] semanticscholar.org

[PDF] Deconvolution for the Wasserstein metric and topological inference

B Michel - pdfs.semanticscholar.org

The SEE (Société de l'Electricité, de l'Electronique et des Technologies de l'Information et de la Communication – an association recognized as being of public utility, governed by the law of 1 July 1901) makes available to its members and to subscribers of its publications a …


[PDF] unipv.it

[PDF] On the equivalence between Fourier-based and Wasserstein metrics

G Auricchio, A Codegoni, S Gualandi, G Toscani… - eye - mate.unipv.it

We investigate properties of some extensions of a class of Fourierbased probability metrics, 

originally introduced to study convergence to equilibrium for the solution to the spatially 

homogeneous Boltzmann equation. At difference with the original one, the new Fourier …


Optimality in weighted L2-Wasserstein goodness-of-fit statistics

T de Wet, V Humble - South African Statistical Journal, 2020 - journals.co.za

In Del Barrio, Cuesta-Albertos, Matran and Rodriguez-Rodriguez (1999) and Del Barrio, 

Cuesta-Albertos and Matran (2000), the authors introduced a new class of goodness-of-fit 

statistics based on the L2-Wasserstein distance. It was shown that the desirable property of …

Related articles
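
The entry above builds goodness-of-fit statistics on the L2-Wasserstein distance. For orientation only (not the de Wet–Humble weighted statistic), a minimal sketch of the 1-D quantile formula W_2^2(F, G) = ∫_0^1 (F^{-1}(t) − G^{-1}(t))^2 dt against a normal null might look as follows; the grid size and parameters are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def w2_squared_to_normal(sample, mu=0.0, sigma=1.0, n_grid=1000):
    """Approximate squared L2-Wasserstein distance between the empirical
    distribution of `sample` and N(mu, sigma^2), via the 1-D quantile formula
    W_2^2 = int_0^1 (F^{-1}(t) - G^{-1}(t))^2 dt on a midpoint grid."""
    t = (np.arange(n_grid) + 0.5) / n_grid
    emp_q = np.quantile(sample, t)                    # empirical quantiles
    norm_q = stats.norm.ppf(t, loc=mu, scale=sigma)   # null quantiles
    return np.mean((emp_q - norm_q) ** 2)

rng = np.random.default_rng(0)
x = rng.normal(size=2000)
print("W_2^2 to N(0,1):", w2_squared_to_normal(x))    # close to 0 for normal data
```
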

  

[PDF] EE-559–Deep learning 11.2. Wasserstein GAN

F Fleuret - 2020 - fleuret.org

Page 1. EE-559 – Deep learning 11.2. Wasserstein GAN François Fleuret https://fleuret.org/ee559/

May 16, 2020 Page 2. Arjovsky et al. (2017) pointed out that DJS does not account [much] for the

metric structure of the space. François Fleuret EE-559 – Deep learning / 11.2. Wasserstein GAN …

Related articles All 2 versions
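
The lecture above covers the original WGAN objective of Arjovsky et al. A minimal PyTorch sketch of one critic/generator update with weight clipping is given below for orientation; the toy data, network sizes, clip value, and learning rates are illustrative assumptions, not taken from the slides.

```python
import torch
import torch.nn as nn

# Toy data and tiny networks; all sizes, rates, and the clip value are illustrative.
critic = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))
generator = nn.Sequential(nn.Linear(8, 64), nn.ReLU(), nn.Linear(64, 2))
opt_c = torch.optim.RMSprop(critic.parameters(), lr=5e-5)
opt_g = torch.optim.RMSprop(generator.parameters(), lr=5e-5)
clip_value = 0.01

def critic_step(real):
    z = torch.randn(real.size(0), 8)
    fake = generator(z).detach()
    # Critic maximizes E[f(real)] - E[f(fake)] (Kantorovich-Rubinstein dual form).
    loss = -(critic(real).mean() - critic(fake).mean())
    opt_c.zero_grad()
    loss.backward()
    opt_c.step()
    # Weight clipping is the original (crude) way to keep the critic ~1-Lipschitz.
    for p in critic.parameters():
        p.data.clamp_(-clip_value, clip_value)
    return loss.item()

def generator_step(batch_size=64):
    z = torch.randn(batch_size, 8)
    loss = -critic(generator(z)).mean()
    opt_g.zero_grad()
    loss.backward()
    opt_g.step()
    return loss.item()

real = torch.randn(64, 2) + torch.tensor([2.0, 0.0])  # toy "real" samples
for _ in range(5):        # several critic steps per generator step, as in WGAN
    critic_step(real)
generator_step()
```
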


2020

[PDF] mlr.press

Quantitative stability of optimal transport maps and linearization of the 2-wasserstein space

Q Mérigot, A Delalande… - … Conference on Artificial …, 2020 - proceedings.mlr.press

This work studies an explicit embedding of the set of probability measures into a Hilbert

space, defined using optimal transport maps from a reference probability density. This

embedding linearizes to some extent the 2-Wasserstein space and is shown to be bi-Hölder …

  Cited by 10 Related articles All 4 versions 

 <——2020———2020———  280 —  


WGAIN: Data Imputation using Wasserstein GAIN/submitted by Christina Halmich

C Halmich - 2020 - epub.jku.at

Missing data is a well known problem in the Machine Learning world. A lot of datasets that 

are used for training algorithms contain missing values, eg 45% of the datasets stored in the 

UCI Machine Learning Repository [16], which is a commonly used dataset collection …

 Related articles All 2 versions

[PDF] tum.de

Lp-Wasserstein and flux-limited gradient flows: Entropic discretization, convergence analysis and numerics

B Söllner - 2020 - mediatum.ub.tum.de

We analyse different discretizations of gradient flows in transport metrics with non-quadratic 

costs. Among others we discuss the p-Laplace equation and evolution equations with flux-

limitation. We prove comparison principles, free energy monotony, non-negativity and mass …

All 2 versions

Lp-Wasserstein and Flux-limited Gradient Flows: Entropic Discretization, Convergence Analysis and Numerics

By Benjamin Söllner · 2020 book
Lp-Wasserstein and flux-limited gradient flows: Entropic discretization, convergence analysis and numerics
  thesis

Lp-Wasserstein and flux-limited gradient flows: Entropic discretization, convergence analysis and numerics  thesis


[PDF] ecai2020.eu

[PDF] Dual Rejection Sampling for Wasserstein Auto-Encoders

L Hou, H Shen, X Cheng - ecai2020.eu

Deep generative models enhanced by Wasserstein distance have achieved remarkable 

success in recent years. Wasserstein Auto-Encoders (WAEs) are auto-encoder based 

generative models that aim to minimize the Wasserstein distance between the data …

All 2 versions 


[PDF] Lecture 3: Wasserstein Space

L Chizat - 2020 - lchizat.github.io

Let X, Y be compact metric spaces, c ∈ C(X × Y) the cost function and (µ, ν) ∈ P(X) × P(Y)

the marginals. In previous lectures, we have seen that the optimal transport problem can be 

formulated as an optimization over the space of transport plans Π (µ, ν)—the primal or …

Cited by 1 Related articles All 4 versions
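
For readers scanning this list, the primal problem referred to in the lecture snippet above is the standard Kantorovich formulation, stated here in generic notation (not quoted from the notes):

```latex
% Kantorovich primal problem over transport plans, and the p-Wasserstein distance
\min_{\pi \in \Pi(\mu,\nu)} \int_{X \times Y} c(x,y)\, \mathrm{d}\pi(x,y),
\qquad
\Pi(\mu,\nu) := \bigl\{ \pi \in \mathcal{P}(X \times Y) : \pi(\,\cdot \times Y) = \mu,\ \pi(X \times \cdot\,) = \nu \bigr\},

% and, for X = Y with cost c = d^p,
W_p(\mu,\nu) := \Bigl( \min_{\pi \in \Pi(\mu,\nu)} \int d(x,y)^p \, \mathrm{d}\pi(x,y) \Bigr)^{1/p}.
```
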

Isometric study of Wasserstein spaces---the real line

arxiv.org › math

by GP Gehér · 2020 · Cited by 3 — Isometric study of Wasserstein spaces --- the real line Recently Kloeckner described the structure of the isometry group of the quadratic Wasserstein space \mathcal{W}_2\left(\mathbb{R}^n\right). It turned out that the case of the real line is exceptional in the sense that there exists an exotic isometry flow.

online Cover Image  OPEN ACCESS

Isometric study of Wasserstein spaces -- the real line

by György Pál Gehér; Tamás Titkos; Dániel Virosztek

Transactions of the American Mathematical Society, 08/2020, Volume 373, Issue 8

Recently Kloeckner described the structure of the isometry group of the quadratic Wasserstein space \mathcal {W}_2(\mathbb{R}^n). It turned out that the case...


Journal ArticleFull Text Online

A Short Introduction to Optimal Transport and Wasserstein ...

http://alexhwilliams.info › itsneuronalblog › 2020/10/09

Oct 9, 2020 — Optimal transport theory is one way to construct an alternative notion of distance between probability distributions.

[PDF] aimsciences.org

Exponential convergence in the Wasserstein metric W1 for one dimensional diffusions

L Cheng, R Li, L Wu - Discrete & Continuous Dynamical Systems-A, 2020 - aimsciences.org

In this paper, we find some general and efficient sufficient conditions for the exponential convergence W_{1,d}(P_t(x,·), P_t(y,·)) ≤ K e^{−δt} d(x, y) for the semigroup (P_t) of a one-dimensional diffusion. Moreover, some sharp estimates of the involved constants K ≥ 1, δ > 0 …

Related articles All 2 versions

 EXPONENTIAL CONVERGENCE IN THE WASSERSTEIN METRIC W-1 FOR ONE DIMENSIONAL DIFFUSIONS 

By: Cheng, Lingyan; Li, Ruinan; Wu, Liming 

DISCRETE AND CONTINUOUS DYNAMICAL SYSTEMS  Volume: ‏ 40   Issue: ‏ 9   Pages: ‏ 5131-5148   Published: ‏ SEP 2020 


MR4128302 Prelim Cheng, Lingyan; Li, Ruinan; Wu, Liming; Exponential convergence in the Wasserstein metric  W1  for one dimensional diffusions. Discrete Contin. Dyn. Syst. 40 (2020), no. 9, 5131–5148. 60B10 (60H10 60J60)

 

[PDF] researchgate.net

[PDF] Wasserstein Barycenters for Bayesian Learning: Technical Report

G Rios - 2020 - researchgate.net

Within probabilistic modelling, a crucial but challenging task is that of learning (or fitting) the 

models. For models described by a finite set of parameters, this task is reduced to finding the 

best parameters, to feed them into the model and then calculate the posterior distribution to …

Related articles

[HTML] springer.com

[HTML] Fréchet Means in the Wasserstein Space

VM Panaretos, Y Zemel - International Workshop on Functional and …, 2020 - Springer

The concept of a Fréchet mean (Fréchet [55]) generalises the notion of mean to a more general 

metric space by replacing the usual “sum of squares” with a “sum of squared distances”, giving 

rise to the so-called Fréchet functional. A closely related notion is that of a Karcher mean (Karcher …

Related articles


Orthogonal Gradient Penalty for Fast Training of Wasserstein ...

... Penalty for Fast Training of Wasserstein GAN Based Multi-Task Autoencoder ... a new orthogonal gradient penalty (OGP) method for Wasserstein Generative ...

<——2020——–2020————290——   

[PDF] inria.fr

Graph Diffusion Wasserstein Distances

A Barbe, M Sebban, P Gonçalves, P Borgnat… - … on Machine Learning …, 2020 - hal.inria.fr

Optimal Transport (OT) for structured data has received much attention in the machine 

learning community, especially for addressing graph classification or graph transfer learning 

tasks. In this paper, we present the Diffusion Wasserstein (DW) distance, as a generalization …

 Cited by 17 Related articles All 4 versions

 Wind: Wasserstein Inception Distance For Evaluating Generative Adversarial Network Performance

P Dimitrakopoulos, G Sfikas… - ICASSP 2020-2020 IEEE …, 2020 - ieeexplore.ieee.org

In this paper, we present Wasserstein Inception Distance (WInD), a novel metric for 

evaluating performance of Generative Adversarial Networks (GANs). The proposed metric 

extends on the rationale of the previously proposed Frechet Inception Distance (FID), in the …

Related articles

Wind: Wasserstein Inception Distance For Evaluating ... - IEEE.tv

ieeetv.ieee.org › ondemand › wind-wasserstein-inception-...

In this paper, we present Wasserstein Inception Distance (WInD), a novel metric for evaluating performance of Generative Adversarial Networks (GANs).

IEEE.tv · 

May 3, 2020

[HTML] atlantis-press.com

[HTML] Multimedia Analysis and Fusion via Wasserstein Barycenter

C Jin, J Wang, J Wei, L Tan, S Liu… - … Journal of Networked …, 2020 - atlantis-press.com

Optimal transport distance, otherwise known as Wasserstein distance, recently has attracted 

attention in music signal processing and machine learning as powerful discrepancy 

measures for probability distributions. In this paper, we propose an ensemble approach with …

Related articles All 3 versions

 Multimedia Analysis and Fusion via Wasserstein Barycenter 

By: Jin, Cong; Wang, Junhao; Wei, Jin; et al.

INTERNATIONAL JOURNAL OF NETWORKED AND DISTRIBUTED COMPUTING  Volume: ‏ 8   Issue: ‏ 2   Pages: ‏ 58-66   Published: ‏ MAR 2020

A Riemannian submersion‐based approach to the Wasserstein barycenter of positive definite matrices

M Li, H Sun, D Li - Mathematical Methods in the Applied …, 2020 - Wiley Online Library

In this paper, we introduce a novel geometrization on the space of positive definite matrices, 

derived from the Riemannian submersion from the general linear group to the space of 

positive definite matrices, resulting in easier computation of its geometric structure. The …

Related articles


[PDF] uni-bonn.de

Diffusions on Wasserstein Spaces

L Dello Schiavo - 2020 - bonndoc.ulb.uni-bonn.de

We construct a canonical diffusion process on the space of probability measures over a 

closed Riemannian manifold, with invariant measure the Dirichlet–Ferguson measure. 

Together with a brief survey of the relevant literature, we collect several tools from the theory …

Related articles

Diffusions on Wasserstein Spaces

books.google.com › books

Lorenzo Dello Schiavo · 2020 · ‎ book


[PDF] researchgate.net

[PDF] Fixed-Support Wasserstein Barycenters: Computational Hardness and Fast Algorithm

T Lin, N Ho, X Chen, M Cuturi, MI Jordan - 2020 - researchgate.net

We study the fixed-support Wasserstein barycenter problem (FS-WBP), which consists in 

computing the Wasserstein barycenter of m discrete probability measures supported on a 

finite metric space of size n. We show first that the constraint matrix arising from the standard …

[PDF] researchgate.net

[PDF] Fixed-Support Wasserstein Barycenters: Computational Hardness and Fast Algorithm

T Lin, N Ho, X Chen, M Cuturi… - Advances in Neural …, 2020 - researchgate.net

We study the fixed-support Wasserstein barycenter problem (FS-WBP), which consists in 

computing the Wasserstein barycenter of m discrete probability measures supported on a 

finite metric space of size n. We show first that the constraint matrix arising from the standard …

Cited by 5 Related articles All 5 versions
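
Both records above concern the fixed-support Wasserstein barycenter problem. As a rough illustration only (not the authors' algorithm), the entropically regularized version can be computed by iterative Bregman projections in a few lines of numpy; the support size, regularization strength, and toy histograms below are assumptions.

```python
import numpy as np

def sinkhorn_barycenter(A, M, reg=0.02, weights=None, n_iter=300):
    """Entropic fixed-support Wasserstein barycenter via iterative Bregman projections.

    A : (n, k) array whose k columns are probability vectors on a common support.
    M : (n, n) ground-cost matrix on that support.
    Returns a length-n probability vector approximating the barycenter."""
    n, k = A.shape
    if weights is None:
        weights = np.full(k, 1.0 / k)
    K = np.exp(-M / reg)                       # Gibbs kernel
    v = np.ones((n, k))
    for _ in range(n_iter):
        u = A / (K @ v)                        # enforce the input marginals a_s
        Ktu = K.T @ u
        # Weighted geometric mean of the free marginals = current barycenter.
        b = np.exp(np.log(np.maximum(v * Ktu, 1e-300)) @ weights)
        v = b[:, None] / Ktu                   # enforce the common marginal b
    return b / b.sum()

# Toy example: barycenter of two bumps on a 1-D grid; its mass sits near 0.5.
x = np.linspace(0, 1, 50)
M = (x[:, None] - x[None, :]) ** 2
a1 = np.exp(-(x - 0.2) ** 2 / 0.005); a1 /= a1.sum()
a2 = np.exp(-(x - 0.8) ** 2 / 0.005); a2 /= a2.sum()
bary = sinkhorn_barycenter(np.stack([a1, a2], axis=1), M)
print(round(bary.sum(), 3), round(float(x @ bary), 3))
```
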


[PDF] projecteuclid.org

Donsker's theorem in Wasserstein-1 distance

L Coutin, L Decreusefond - Electronic Communications in …, 2020 - projecteuclid.org

We compute the Wassertein-1 (or Kantorovitch-Rubinstein) distance between a random 

walk in $\mathbf {R}^{d} $ and the Brownian motion. The proof is based on a new estimate of 

the modulus of continuity of the solution of the Stein's equation. As an application, we can …

Related articles All 18 versions


The Wasserstein Impact Measure (WIM): a generally ...

https://arxiv.org › stat

by F Ghaderinezhad · 2020 — The Wasserstein Impact Measure (WIM): a generally applicable, practical tool for quantifying prior impact in Bayesian statistics. The prior distribution is a crucial building block in Bayesian analysis, and its choice will impact the subsequent inference.


online  PEER-REVIEW

The Wasserstein Impact Measure (WIM) : a generally applicable, practical tool for quantifying prior...

by Ghaderinezhad, Fatemeh; Ley, Christophe; Serrien, Ben

2020

Journal ArticleFull Text Online

 

Infrared and visible image fusion using dual discriminators generative adversarial networks with Wasserstein distance

By: Li, Jing; Huo, Hongtao; Liu, Kejian; et al.

INFORMATION SCIENCES   Volume: ‏ 529   Pages: ‏ 28-41   Published: ‏ AUG 2020


 <——2020—–—2020————300——  

 

Horo-functions associated to atom sequences on the Wasserstein space

By: Zhu, Guomin; Wu, Hongguang; Cui, Xiaojun

ARCHIV DER MATHEMATIK     

Early Access: JUL 2020


Horo-functions associated to atom sequences on the Wasserstein space 

By: Zhu, Guomin; Wu, Hongguang; Cui, Xiaojun 

ARCHIV DER MATHEMATIK     

Early Access: JUL 2020  Zbl 07254624

arXiv:2008.02648  [pdf, other]  cs.LG stat.ML

Graph Wasserstein Correlation Analysis for Movie Retrieval 

Authors: Xueya Zhang, Tong Zhang, Xiaobin Hong, Zhen Cui, Jian Yang 

Abstract: Movie graphs play an important role to bridge heterogeneous modalities of videos and texts in human-centric retrieval. In this work, we propose Graph Wasserstein Correlation Analysis (GWCA) to deal with the core issue therein, i.e., cross heterogeneous graph comparison. Spectral graph filtering is introduced to encode graph signals, which are then embedded as probability distributions in a Wasserste… More

Submitted 6 August, 2020; originally announced August 2020. 


2020 see 2019  [PDF] esaim-proc.org

Statistical data analysis in the Wasserstein space

J Bigot - ESAIM: Proceedings and Surveys, 2020 - esaim-proc.org

This paper is concerned by statistical inference problems from a data set whose elements 

may be modeled as random probability measures such as multiple histograms or point 

clouds. We propose to review recent contributions in statistics on the use of Wasserstein 

distances and tools from optimal transport to analyse such data. In particular, we highlight 

the benefits of using the notions of barycenter and geodesic PCA in the Wasserstein space 

for the purpose of learning the principal modes of geometric variation in a dataset. In this …

Cited by 14 Related articles All 4 versions

2020

Gromov-Wasserstein Averaging in a Riemannian Framework

S Chowdhury, T Needham - Proceedings of the IEEE/CVF …, 2020 - openaccess.thecvf.com

We introduce a theoretical framework for performing statistical tasks-including, but not 

limited to, averaging and principal component analysis-on the space of (possibly 

asymmetric) matrices with arbitrary entries and sizes. This is carried out under the lens of the 

Gromov-Wasserstein (GW) distance, and our methods translate the Riemannian framework 

of GW distances developed by Sturm into practical, implementable tools for network data 

analysis. Our methods are illustrated on datasets of letter graphs, asymmetric stochastic …

Conference ProceedingCitation Online
Cited by 16
Related articles All 11 versions
Gromov-Wasserstein Learning in a Riemannian Framework

video.ucdavis.edu › media › Samir+ChowdhuryA+Gromo...

Video thumbnail for Samir Chowdhury: Gromov-Wasserstein Learning in a Riemannian Framework. 0:00. Off Air. / 1:00:50.

AggieVideo · 

Dec 8, 2020

Gromov-Wasserstein Learning in a Riemannian Framework

video.ucdavis.edu › media › Samir+ChowdhuryA+Gromo...

Video thumbnail for Samir Chowdhury: Gromov-Wasserstein Learning in a Riemannian Framework. 0:00. Off Air. / 1:00:50. Auto. Options; Off; English.

AggieVideo · 

Sep 29, 2020
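
The Gromov-Wasserstein entries above compare networks through their intra-domain distance matrices. A minimal sketch of a generic GW computation is shown below, assuming the POT (Python Optimal Transport) package is installed and exposes ot.gromov.gromov_wasserstein with the signature used here; this is not the Riemannian averaging framework of the paper.

```python
import numpy as np
import ot  # POT: Python Optimal Transport (assumed installed)

rng = np.random.default_rng(0)

# Two toy "networks": point clouds with their intra-domain cost matrices.
X = rng.normal(size=(30, 2))
Y = rng.normal(size=(40, 3))              # different size and ambient dimension
C1 = ot.dist(X, X)                        # pairwise squared Euclidean distances
C2 = ot.dist(Y, Y)
C1 /= C1.max()
C2 /= C2.max()
p = ot.unif(30)                           # uniform node weights
q = ot.unif(40)

# Coupling and objective value of the (unregularized) Gromov-Wasserstein problem.
T = ot.gromov.gromov_wasserstein(C1, C2, p, q, 'square_loss')
gw_value = ot.gromov.gromov_wasserstein2(C1, C2, p, q, 'square_loss')
print(T.shape, gw_value)
```
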


2020 see 2019

Wasserstein Collaborative Filtering for Item Cold-start Recommendation

Y Meng, X Yan, W Liu, H Wu, J Cheng - … of the 28th ACM Conference on …, 2020 - dl.acm.org

Item cold-start recommendation, which predicts user preference on new items that have no 

user interaction records, is an important problem in recommender systems. In this paper, we 

model the disparity between user preferences on warm items (those having interaction 

record) and that on cold-start items using the Wasserstein distance. On this basis, we 

propose Wasserstein Collaborative Filtering (WCF), which predicts user preference on cold-

start items by minimizing the Wasserstein distance under user embedding constraint. Our …

Cited by 11 Related articles All 4 versions


2020

[PDF] Gromov-Wasserstein Factorization Models for Graph Clustering

H Xu - AAAI, 2020 - aaai.org

We propose a new nonlinear factorization model for graphs that are with topological 

structures, and optionally, node attributes. This model is based on a pseudometric called 

Gromov-Wasserstein (GW) discrepancy, which compares graphs in a relational way. It 

estimates observed graphs as GW barycenters constructed by a set of atoms with different 

weights. By minimizing the GW discrepancy between each observed graph and its GW 

barycenter-based estimation, we learn the atoms and their weights associated with the …

Cited by 1 Related articles    All 3 versions        View as HTML 




    



  

Wasserstein Distance based Deep Adversarial Transfer Learning for Intelligent Fault Diagnosis with Unlabeled or Insufficient Labeled Data

C Cheng, B Zhou, G Ma, D Wu, Y Yuan - Neurocomputing, 2020 - Elsevier

Intelligent fault diagnosis is one critical topic of maintenance solution for mechanical 

systems. Deep learning models, such as convolutional neural networks (CNNs), have been 

successfully applied to fault diagnosis tasks and achieved promising results. However, one 

is that two datasets (in source and target domains) of similar tasks are with different feature 

distributions because of different operational conditions; another one is that insufficient or 

unlabeled data in real industry applications (target domains) limit the adaptability of the … 


Unsupervised Wasserstein Distance Guided Domain Adaptation for 3D Multi-domain Liver Segmentation

Authors: You C.; Duncan J.S.; Yang J.; Chapiro J.

3rd International Workshop on Interpretability of Machine Intelligence in Medical Image Computing, iMIMIC 2020, the 2nd International Workshop on Medical Image Learning with Less Labels and Imperfect Data, MIL3ID 2020, and the 5th International Workshop on Large-scale Annotation of Biomedical data and Expert Label Synthesis, LABELS 2020, held in conjunction with the 23rd International Conference on Medical Image Computing and Computer Assisted Intervention, MICCAI 2020

Article, 2020

Publication: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 12446 LNCS, 2020, 155

Publisher: 2020


[PDF] arxiv.org

Wasserstein distance error bounds for the multivariate normal approximation of the maximum likelihood estimator

A Anastasiou, RE Gaunt - arXiv preprint arXiv:2005.05208, 2020 - arxiv.org

We obtain explicit Wasserstein distance error bounds between the distribution of the multi-

parameter MLE and the multivariate normal distribution. Our general bounds are given for

possibly high-dimensional, independent and identically distributed random vectors. Our …

  Cited by 1 Related articles All 4 versions 


Domain-attention Conditional Wasserstein Distance for Multi-source Domain Adaptation

H Wu, Y Yan, MK Ng, Q Wu - ACM Transactions on Intelligent Systems …, 2020 - dl.acm.org

Multi-source domain adaptation has received considerable attention due to its effectiveness 

of leveraging the knowledge from multiple related sources with different distributions to 

enhance the learning performance. One of the fundamental challenges in multi-source 

domain adaptation is how to determine the amount of knowledge transferred from each 

source domain to the target domain. To address this issue, we propose a new algorithm, 

called Domain-attention Conditional Wasserstein Distance (DCWD), to learn transferred …

Cited by 14 Related articles All 5 versions

<——2020—— 2020——- 310 ——   

Knowledge-aware Attentive Wasserstein Adversarial Dialogue Response Generation

Y Zhang, Q Fang, S Qian, C Xu - ACM Transactions on Intelligent …, 2020 - dl.acm.org

Natural language generation has become a fundamental task in dialogue systems. RNN-

based natural response generation methods encode the dialogue context and decode it into 

a response. However, they tend to generate dull and simple responses. In this article, we 

propose a novel framework, called KAWA-DRG (Knowledge-aware Attentive Wasserstein 

Adversarial Dialogue Response Generation) to model conversation-specific external 

knowledge and the importance variances of dialogue context in a unified adversarial …

Related articles


node2coords: Graph Representation Learning with Wasserstein Barycenters

E Simou, D Thanou, P Frossard - arXiv preprint arXiv:2007.16056, 2020 - arxiv.org

In order to perform network analysis tasks, representations that capture the most relevant 

information in the graph structure are needed. However, existing methods do not learn 

representations that can be interpreted in a straightforward way and that are robust to 

perturbations to the graph structure. In this work, we address these two limitations by 

proposing node2coords, a representation learning algorithm for graphs, which learns 

simultaneously a low-dimensional space and coordinates for the nodes in that space. The …


[HTML] Imbalanced Fault Classification of Bearing via Wasserstein Generative Adversarial Networks with Gradient Penalty

B Han, S Jia, G Liu, J Wang - Shock and Vibration, 2020 - hindawi.com

Recently, generative adversarial networks (GANs) are widely applied to increase the 

amounts of imbalanced input samples in fault diagnosis. However, the existing GAN-based 

methods have convergence difficulties and training instability, which affect the fault 

diagnosis efficiency. This paper develops a novel framework for imbalanced fault 

classification based on Wasserstein generative adversarial networks with gradient penalty 

(WGAN-GP), which interpolates randomly between the true and generated samples to …

Related articles All 4 versions

Imbalanced Fault Classification of Bearing via Wasserstein Generative Adversarial Networks with Gradient Penalty 

By: Han, Baokun; Jia, Sixiang; Liu, Guifang; et al.

SHOCK AND VIBRATION  Volume: ‏ 2020     Article Number: 8836477   Published: ‏ JUL 21 2020 

Shock Research; Researchers from Shandong University of Science and Technology Discuss Findings in Shock Research (Imbalanced Fault Classification of Bearing via Wasserstein... 

Journal of technology (Atlanta, Ga.), Aug 18, 2020, 3318

... to news originating from Qingdao, People's Republic of China, by VerticalNews correspondents, research stated, "Recently, generative adversarial networks (GANs...

Newspaper ArticleCitation Online
  Related articles All 4 versions 
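
Several of the fault-diagnosis entries in this list rely on WGAN-GP. The sketch below shows only the gradient-penalty term of Gulrajani et al. on random interpolates, in PyTorch; the toy critic, feature size, and penalty weight are illustrative assumptions rather than the cited authors' setup.

```python
import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    """WGAN-GP term: lambda * E[(||grad_x critic(x_hat)||_2 - 1)^2] over random
    interpolates x_hat between real and generated samples."""
    eps = torch.rand(real.size(0), 1, device=real.device)     # per-sample mixing
    x_hat = (eps * real + (1 - eps) * fake).requires_grad_(True)
    scores = critic(x_hat)
    grads = torch.autograd.grad(outputs=scores, inputs=x_hat,
                                grad_outputs=torch.ones_like(scores),
                                create_graph=True)[0]
    return lambda_gp * ((grads.norm(2, dim=1) - 1.0) ** 2).mean()

# Minimal usage with a toy critic on flat 16-dimensional feature vectors.
critic = torch.nn.Sequential(torch.nn.Linear(16, 32), torch.nn.ReLU(),
                             torch.nn.Linear(32, 1))
real = torch.randn(8, 16)
fake = torch.randn(8, 16)
loss_gp = gradient_penalty(critic, real, fake)
loss_gp.backward()            # in training this is added to the critic loss
print(float(loss_gp))
```
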

 An LP-based, strongly-polynomial 2-approximation algorithm for sparse Wasserstein barycenters

S Borgwardt - Operational Research, 2020 - Springer

Discrete Wasserstein barycenters correspond to optimal solutions of transportation problems 

for a set of probability measures with finite support. Discrete barycenters are measures with 

finite support themselves and exhibit two favorable properties: there always exists one with a 

provably sparse support, and any optimal transport to the input measures is non-mass 

splitting. It is open whether a discrete barycenter can be computed in polynomial time. It is 

possible to find an exact barycenter through linear programming, but these programs may …

Related articles All 2 versions

arXiv:1704.05491  [pdf, other]  math.OC

An LP-based, strongly-polynomial 2-approximation algorithm for sparse Wasserstein barycenters 

By: Borgwardt, Steffen 

OPERATIONAL RESEARCH     

journal article


MR4127894 Prelim Gehér, György Pál; Titkos, Tamás; Virosztek, Dániel; Isometric study of Wasserstein spaces – the real line. Trans. Amer. Math. Soc. 373 (2020), no. 8, 5855–5883. 54E40 (46E27 60A10 60B05)


[PDF] ams.org  Full View

Isometric study of Wasserstein spaces–the real line

G Gehér, T Titkos, D Virosztek - Transactions of the American Mathematical …, 2020 - ams.org

Recently Kloeckner described the structure of the isometry group of the quadratic 

Wasserstein space $\mathcal {W} _2 (\mathbb {R}^ n) $. It turned out that the case of the real 

line is exceptional in the sense that there exists an exotic isometry flow. Following this line of 

investigation, we compute $\mathrm {Isom}(\mathcal {W} _p (\mathbb {R})) $, the isometry 

group of the Wasserstein space $\mathcal {W} _p (\mathbb {R}) $ for all $ p\in 

[1,\infty)\setminus\{2\} $. We show that $\mathcal {W} _2 (\mathbb {R}) $ is also exceptional …

Cited by 4 Related articles All 8 versions

MR4120535 Prelim Sagiv, Amir; The Wasserstein distances between pushed-forward measures with applications to uncertainty quantification. Commun. Math. Sci. 18 (2020), no. 3, 707-724. 60A10 (28A10 28A33 65C20)


 The Wasserstein Distances Between Pushed-Forward Measures with Applications to Uncertainty Quantification

A Sagiv - arXiv preprint arXiv:1902.05451, 2019 - arxiv.org

In the study of dynamical and physical systems, the input parameters are often uncertain or 

randomly distributed according to a measure $\varrho $. The system's response $ f $ pushes 

forward $\varrho $ to a new measure $ f\circ\varrho $ which we would like to study. However, 

we might not have access to $ f $ but only to its approximation $ g $. We thus arrive at a 

fundamental question--if $ f $ and $ g $ are close in $ L^ q $, does $ g\circ\varrho $ 

approximate $ f\circ\varrho $ well, and in what sense? Previously, we demonstrated that the …

Related articles All 2 versions 

online  

New Mathematical Sciences Study Findings Have Been Reported by Investigators at Tel Aviv University 

(The Wasserstein Distances Between Pushed-forward Measures With Applications To Uncertainty Quantification)

Mathematics Week, 09/2020

NewsletterFull Text Online

 

arXiv:2008.06088  [pdf, ps, other 

math.PR 

Stein factors for variance-gamma approximation in the Wasserstein and Kolmogorov distances 

Authors: Robert E. Gaunt 

Abstract: We obtain new bounds for the solution of the variance-gamma (VG) Stein equation that are of the correct form for approximations in terms of the Wasserstein and Kolmorogorov metrics. These bounds hold for all parameters values of the four parameter VG class. As an application we obtain explicit Wasserstein and Kolmogorov distance error bounds in a six moment theorem for VG approximation of double W… More 

Submitted 13 August, 2020; originally announced August 2020. 

Comments: 30 pages 

MSC Class: Primary 60F05; 62E17 

Cited by 1 All 3 versions


arXiv:2008.05824  [pdf, ps, other 

stat.AP q-fin.RM 

Risk Measures Estimation Under Wasserstein Barycenter 

Authors: M. Andrea Arias-Serna, Jean-Michel Loubes, Francisco J. Caro-Lopera 

Abstract: Randomness in financial markets requires modern and robust multivariate models of risk measures. This paper proposes a new approach for modeling multivariate risk measures under Wasserstein barycenters of probability measures supported on location-scatter families. Simple and advanced copulas multivariate Value at Risk models are compared with the derived technique. The performance of the model is… More 

Submitted 13 August, 2020; originally announced August 2020. 


arXiv:2008.04751  [pdf, other 

cs.CV cs.LG cs.PF cs.RO 

Reinforced Wasserstein Training for Severity-Aware Semantic Segmentation in Autonomous Driving 

Authors: Xiaofeng Liu, Yimeng Zhang, Xiongchang Liu, Song Bai, Site Li, Jane You 

Abstract: Semantic segmentation is important for many real-world systems, e.g., autonomous vehicles, which predict the class of each pixel. Recently, deep networks achieved significant progress w.r.t. the mean Intersection-over Union (mIoU) with the cross-entropy loss. However, the cross-entropy loss can essentially ignore the difference of severity for an autonomous car with different wrong prediction mist… More 

Submitted 11 August, 2020; originally announced August 2020. 

Comments: Accepted to IEEE Transactions on Intelligent Transportation Systems (T-ITS) 

 Reinforced wasserstein training for severity-aware semantic segmentation in autonomous driving

X Liu, Y Zhang, X Liu, S Bai, S Li, J You - arXiv preprint arXiv:2008.04751, 2020 - arxiv.org

Semantic segmentation is important for many real-world systems, eg, autonomous vehicles, which predict the class of each pixel. Recently, deep networks achieved significant progress wrt the mean Intersection-over Union (mIoU) with the cross-entropy loss. However, the cross-entropy loss can essentially ignore the difference of severity for an autonomous car with different wrong prediction mistakes. For example, predicting the car to the road is much more servery than recognize it as the bus. Targeting for this difficulty, we develop a Wasserstein 
Cited by 3
Related articles All 5 versions


arXiv:2008.04295  [pdf, other 

stat.AP stat.ML 

Segmentation analysis and the recovery of queuing parameters via the Wasserstein distance: a study of administrative data for patients with chronic obstructive pulmonary disease 

Authors: Henry Wilde, Vincent Knight, Jonathan Gillard, Kendal Smith 

Abstract: This work uses a data-driven approach to analyse how the resource requirements of patients with chronic obstructive pulmonary disease (COPD) may change, quantifying how those changes impact the hospital system with which the patients interact. This approach is composed of a novel combination of often distinct modes of analysis: segmentation, operational queuing theory, and the recovery of paramete… More 

Submitted 14 August, 2020; v1 submitted 10 August, 2020; originally announced August 2020. 

Comments: 24 pages, 11 figures (19 including subfigures) 

 <——2020——2020————————  320  ——  

arXiv:2008.02883  [pdf, other 

cs.LG stat.ML 

Stronger and Faster Wasserstein Adversarial Attacks 

Authors: Kaiwen Wu, Allen Houze Wang, Yaoliang Yu 

Abstract: Deep models, while being extremely flexible and accurate, are surprisingly vulnerable to "small, imperceptible" perturbations known as adversarial attacks. While the majority of existing attacks focus on measuring perturbations under the ℓp metric, Wasserstein distance, which takes geometry in pixel space into account, has long been known to be a suitable metric for measuring image quality a… More

Submitted 6 August, 2020; originally announced August 2020. 

Comments: 30 pages, accepted to ICML 2020 

Cited by 15 Related articles All 12 versions
Stronger and Faster Wasserstein Adversarial Attacks

slideslive.com › stronger-and-faster-wasserstein-adversaria...

While the majority of existing attacks focuses on measuring perturbations under the ℓp metric, Wasserstein distance, which takes geometry ...

SlidesLive · Jul 12, 2020  

Stronger and Faster Wasserstein Adversarial Attacks - ICML

icml.cc › virtual › poster

Stronger and Faster Wasserstein Adversarial Attacks. Kaiwen Wu, Allen Wang, Yaoliang Yu. Abstract.



online

Gromov-Wasserstein optimal transport to align single-cell multi-omics data (Updated November...

Life Science Weekly, 11/2020

NewsletterFull Text Online

Cited by 37 Related articles All 7 versions

arXiv:2008.04861  [pdf, other 

eess.IV cs.LG 

TextureWGAN: Texture Preserving WGAN with MLE Regularizer for Inverse Problems 

Authors: Masaki Ikuta, Jun Zhang 

Abstract: Many algorithms and methods have been proposed for inverse problems particularly with the recent surge of interest in machine learning and deep learning methods. Among all proposed methods, the most popular and effective method is the convolutional neural network (CNN) with mean square error (MSE). This method has been proven effective in super-resolution, image de-noising, and image reconstructio… More 

Submitted 11 August, 2020; v1 submitted 11 August, 2020; originally announced August 2020. 

Comments: Submitted to SPIE Medical Imaging Conference 2021 


Statistical learning in Wasserstein space

A Karimi, L Ripani, TT Georgiou - arXiv preprint arXiv:2004.07875, 2020 - arxiv.org

We seek a generalization of regression and principle component analysis (PCA) in a metric 

space where data points are distributions metrized by the Wasserstein metric. We recast 

these analyses as multimarginal optimal transport problems. The particular formulation 

allows efficient computation, ensures existence of optimal solutions, and admits a 

probabilistic interpretation over the space of paths (line segments). Application of the theory 

to the interpolation of empirical distributions, images, power spectra, as well as assessing …

Related articles All 4 versions    Published: ‏ JUL 2021


 Wasserstein based transfer network for cross-domain sentiment classification

By: Du, Yongping; He, Meng; Wang, Lulin; et al.

KNOWLEDGE-BASED SYSTEMS   Volume: ‏ 204     Article Number: 106162   Published: ‏ SEP 27 2020

 

 2020

[PDF] ieee.org

Joint Transfer of Model Knowledge and Fairness Over Domains Using Wasserstein Distance

T Yoon, J Lee, W Lee - IEEE Access, 2020 - ieeexplore.ieee.org

Owing to the increasing use of machine learning in our daily lives, the problem of fairness 

has recently become an important topic in machine learning societies. Recent studies 

regarding fairness in machine learning have been conducted to attempt to ensure statistical 

independence between individual model predictions and designated sensitive attributes. 

However, in reality, cases exist in which the sensitive variables of data used for learning 

models differ from the data upon which the model is applied. In this paper, we investigate a …

online

Machine Learning; Study Results from Seoul National University Provide New Insights into Machine Learning 

(Joint Transfer of Model Knowledge and Fairness Over Domains Using Wasserstein...

Journal of robotics & machine learning, Aug 24, 2020, 2850

Newspaper ArticleFull Text Online

 Cited by 4 Related articles All 2 versions

  

arXiv:2008.11135  [pdf, other 

math-ph cs.IT math.OC quant-ph 

Quantum statistical learning via Quantum Wasserstein natural gradient 

Authors: Simon Becker, Wuchen Li 

Abstract: In this article, we introduce a new approach towards the statistical learning problem…

 Submitted 25 August, 2020; originally announced August 2020. 


arXiv:2008.09202  [pdf, other  cs.LG 

Conditional Wasserstein GAN-based Oversampling of Tabular Data for Imbalanced Learning 

Authors: Justin Engelmann, Stefan Lessmann 

Abstract: Class imbalance is a common problem in supervised learning and impedes the predictive performance of classification models. Popular countermeasures include oversampling the minority class. Standard methods like SMOTE rely on finding nearest neighbours and linear interpolations which are problematic in case of high-dimensional, complex data distributions. Generative Adversarial Networks (GANs) have… More 

Submitted 20 August, 2020; originally announced August 2020. 

 Related articles All 5 versions 

arXiv:2008.09165  [pdf, other 

stat.ML cs.LG math.OC 

Linear Optimal Transport Embedding: Provable fast Wasserstein distance computation and classification for nonlinear problems 

Authors: Caroline Moosmüller, Alexander Cloninger 

Abstract: Discriminating between distributions is an important problem in a number of scientific fields. This motivated the introduction of Linear Optimal Transportation (LOT), which embeds the space of distributions into an L2-space. The transform is defined by computing the optimal transport of each distribution to a fixed reference distribution, and has a number of benefits when it comes to speed of c… More

Submitted 20 August, 2020; originally announced August 2020. 

Comments: 33 pages, 8 figures 

[CITATION] Linear optimal transport embedding: Provable fast Wasserstein distance computation and classification for nonlinear problems

C Moosmüller, A Cloninger - arXiv preprint arXiv:2008.09165, 2020

 Cited by 7 Related articles
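
The LOT embedding of the record above maps each distribution to its optimal transport map toward a fixed reference. In one dimension that map is given by quantile functions, so a small sketch (1-D samples, a uniform reference grid, all names mine) can illustrate how Euclidean distances between embeddings recover 2-Wasserstein distances.

```python
import numpy as np

def lot_embedding(sample, levels):
    """1-D LOT embedding: evaluate the optimal map T = F_sample^{-1} o F_ref
    at the reference support, represented here by its quantile levels."""
    return np.quantile(sample, levels)

levels = (np.arange(256) + 0.5) / 256      # uniform reference quantile grid

rng = np.random.default_rng(0)
samples = [rng.normal(mu, 1.0, size=1000) for mu in (0.0, 0.5, 3.0)]
emb = np.stack([lot_embedding(s, levels) for s in samples])

def emb_dist(a, b):
    # Euclidean distance in the embedding, normalized by the grid size,
    # equals the 2-Wasserstein distance exactly in one dimension.
    return np.linalg.norm(a - b) / np.sqrt(len(levels))

print(emb_dist(emb[0], emb[1]), emb_dist(emb[0], emb[2]))  # roughly 0.5 and 3.0
```
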

Wasserstein Introduces a Shutoff Valve and Smart Water Monitoring Device 

Plus Company Updates, Sep 10, 2020

Newspaper ArticleCitation Online 

<——2020—— —2020———— 330  ——   


Generalizing Point Embeddings using the Wasserstein Space of Elliptical Distributions

Embedding complex objects as vectors in low dimensional spaces is a longstanding problem in machine learning. We propose in this work an extension of that approach, which consists in embedding objects as elliptical probability distributions, namely distributions whose densities have elliptical level sets... (read more)


Wasserstein F-tests and Confidence Bands for the Fréchet Regression of Density Response Curves

Alexander Petersen, Xi Liu, Afshin A. Divani


Data consisting of samples of probability density functions are increasingly prevalent, necessitating the development of methodologies for their analysis that respect the inherent nonlinearities associated with densities. In many applications, density curves appear as functional response objects in a regression model with vector predictors. For such models, inference is key to understand the importance of density-predictor relationships, and the uncertainty associated with the estimated conditional mean densities, defined as conditional Fréchet means under a suitable metric. Using the Wasserstein geometry of optimal transport, we consider the Fréchet regression of density curve responses and develop tests for global and partial effects, as well as simultaneous confidence bands for estimated conditional mean densities. The asymptotic behavior of these objects is based on underlying functional central limit theorems within Wasserstein space, and we demonstrate that they are asymptotically of the correct size and coverage, with uniformly strong consistency of the proposed tests under sequences of contiguous alternatives. The accuracy of these methods, including nominal size, power, and coverage, is assessed through simulations, and their utility is illustrated through a regression analysis of post-intracerebral hemorrhage hematoma densities and their associations with a set of clinical and radiological covariates. 

Comments:

58 pages (with Appendix), 5 figures, accepted at Annals of Statistics

Subjects:

Methodology (stat.ME); Statistics Theory (math.ST)

Cite as:

arXiv:1910.13418 [stat.ME]

 

(or arXiv:1910.13418v2 [stat.ME] for this version) 


Submission history

From: Alexander Petersen [view email] 

[v1] Tue, 29 Oct 2019 17:30:57 UTC (393 KB)

[v2] Wed, 22 Jul 2020 16:37:19 UTC (393 KB)
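
For orientation, the conditional Fréchet mean mentioned in the abstract above can be written explicitly. With d_W the 2-Wasserstein distance on the space of densities, the regression target at a predictor value x is (a standard formulation, not a quotation from the paper)

\[ m_\oplus(x) \;=\; \operatorname*{arg\,min}_{\omega} \; \mathbb{E}\!\left[ d_W^2(Y, \omega) \,\middle|\, X = x \right], \]

and the F-tests and simultaneous confidence bands of the paper concern the estimated map x -> m_\oplus(x).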


 1 May 2020

Rates of convergence in de Finetti’s representation theorem, and Hausdorff moment problem

Emanuele Dolera, Stefano Favaro

Bernoulli Vol. 26, Issue 2 (May 2020), pg(s) 1294-1322

KEYWORDS: de Finetti’s law of large numbers; de Finetti’s representation theorem; Edgeworth expansions; exchangeability; Hausdorff moment problem; Kolmogorov distance; Wasserstein distance


[PDF] mlr.press

Wasserstein fair classification

R Jiang, A Pacchiano, T Stepleton… - Uncertainty in …, 2020 - proceedings.mlr.press

We propose an approach to fair classification that enforces independence between the 

classifier outputs and sensitive information by minimizing Wasserstein-1 distances. The 

approach has desirable theoretical properties and is robust to specific choices of the …

Cited by 96 Related articles All 5 versions

[PDF] mlr.press

Wasserstein smoothing: Certified robustness against wasserstein adversarial attacks

A Levine, S Feizi - International Conference on Artificial …, 2020 - proceedings.mlr.press

In the last couple of years, several adversarial attack methods based on different threat 

models have been proposed for the image classification problem. Most existing defenses 

consider additive threat models in which sample perturbations have bounded L_p norms …

Cited by 33 Related articles All 7 versions

  
2020

[CITATION] Improving the Robustness of Wasserstein Embedding by Adversarial PAC-Bayesian Learning.

D Ding, M Zhang, X Pan, M Yang, X He - AAAI, 2020

The Robustness of Wasserstein Embedding by ...

staff.ustc.edu.cn › papers › aaai20-adversarial-embedding

Improving the Robustness of Wasserstein Embedding by Adversarial PAC-Bayesian Learning. Daizong Ding, Mi Zhang, Xudong Pan, Min Yang


  

[PDF] arxiv.org

Predictive density estimation under the Wasserstein loss

T Matsuda, WE Strawderman - Journal of Statistical Planning and Inference, 2020 - Elsevier

We investigate predictive density estimation under the L 2 Wasserstein loss for location 

families and location-scale families. We show that plug-in densities form a complete class 

and that the Bayesian predictive density is given by the plug-in density with the posterior 

mean of the location and scale parameters. We provide Bayesian predictive densities that 

dominate the best equivariant one in normal models. Simulation results are also presented.

Cited by 1 Related articles All 4 versions

MR4101489 Reviewed Matsuda, Takeru; Strawderman, William E. Predictive density estimation under the Wasserstein loss. J. Statist. Plann. Inference 210 (2021), 53–63. 62A99 (62C05 62C99)

Journal Article  Zbl 07211894 

Predictive density estimation under the Wasserstein loss

By: Matsuda, Takeru; Strawderman, William E.

JOURNAL OF STATISTICAL PLANNING AND INFERENCE   Volume: ‏ 210   Pages: ‏ 53-63   Published: ‏ JAN 2021
 


MR4138415 Prelim Fan, Xiequan; Ma, Xiaohui; On the Wasserstein distance for a martingale central limit theorem. Statist. Probab. Lett. 167 (2020), 108892. 60G42 (60E15 60F25)

 Journal Article
On the Wasserstein distance for a martingale central limit theorem
 

by Fan, Xiequan; Ma, Xiaohui 

Statistics & probability letters, 12/2020, Volume 167

We prove an upper bound on the Wasserstein distance between normalized martingales and the standard normal random variable, which extends a result of Röllin...


Journal ArticleFull Text Online 

On the Wasserstein distance for a martingale central limit theorem

The Kolmogorov distance in central limit theorem for martingales has been intensely studied under various conditions. 

For instance, we recall the following result ...

by X Fan · ‎2020 · ‎Related articles


Related articles
All 9 versions


Wasserstein convergence rates for random bit approximations of continuous markov processes

S Ankirchner, T Kruse, M Urusov - Journal of Mathematical Analysis and …, 2020 - Elsevier

We determine the convergence speed of a numerical scheme for approximating one-dimensional continuous strong Markov processes. The scheme is based on the construction of certain Markov chains whose laws can be embedded into the process with a sequence of stopping times. Under a mild condition on the process' speed measure we prove that the approximating Markov chains converge at fixed times at the rate of 1/4 with respect to every p-th Wasserstein distance. For the convergence of paths, we prove any rate strictly smaller …

Cited by 3 Related articles All 3 versions 

  

Functional Data Clustering Analysis via the Learning of Gaussian Processes with Wasserstein Distance

T Li, J Ma - International Conference on Neural Information …, 2020 - Springer

Functional data clustering analysis becomes an urgent and challenging task in the new era

of big data. In this paper, we propose a new framework for functional data clustering

analysis, which adopts a similar structure as the k-means algorithm for the conventional …

  Related articles

  <——2020—————2020———  340   


arXiv:2009.01370  [pdf, ps, other 

math.FA 

On nonexpansiveness of metric projection operators on Wasserstein spaces 

Authors: Anshul Adve, Alpár Mészáros 

Abstract: In this note we investigate properties of metric projection operators onto closed and geodesically convex proper subsets of Wasserstein spaces … is isometrically isomorphic to a flat space wit… More 

Submitted 2 September, 2020; originally announced September 2020. 

Comments: 9 Pages

  Related articles All 3 versions 

arXiv:2009.00708  [pdf, other 

physics.geo-ph 

Velocity Inversion Using the Quadratic Wasserstein Metric 

Authors: Srinath Mahankali 

Abstract: Full-waveform inversion (FWI) is a method used to determine properties of the Earth from information on the surface. We use the squared Wasserstein distance (squared W2 distance) as an objective function to invert for the velocity as a function of position in the Earth, and we discuss its convexity with respect to the velocity parameter. In one dimension, we consider constant, piecewise increa… More 

Submitted 26 August, 2020; originally announced September 2020. 

Comments: 20 pages, 9 figures 

Related articles All 6 versions

arXiv:2008.12534  [pdf, other 

cs.LG stat.ML 

Continuous Regularized Wasserstein Barycenters 

Authors: Lingxiao Li, Aude Genevay, Mikhail Yurochkin, Justin Solomon 

Abstract: Wasserstein barycenters provide a geometrically meaningful way to aggregate probability distributions, built on the theory of optimal transport. They are difficult to compute in practice, however, leading previous work to restrict their supports to finite sets of points. Leveraging a new dual formulation for the regularized Wasserstein barycenter problem, we introduce a stochastic algorithm that c… More 

Submitted 28 August, 2020; originally announced August 2020. 

Cited by 15 Related articles All 7 versions
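
As background for the entry above: for measures mu_1, …, mu_m and weights lambda_i >= 0 with sum_i lambda_i = 1, the Wasserstein barycenter is the minimizer

\[ \bar{\nu} \;=\; \operatorname*{arg\,min}_{\nu} \; \sum_{i=1}^{m} \lambda_i \, W_2^2(\mu_i, \nu). \]

The regularized variant replaces each W_2^2 term by its entropy-regularized counterpart; the cited work uses a dual formulation of this problem so that the support of nu does not have to be fixed to a finite set in advance.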

Adaptive WGAN with loss change rate balancing 

Authors: Xu Ouyang, Gady Agam 

Abstract: Optimizing the discriminator in Generative Adversarial Networks (GANs) to completion in the inner training loop is computationally prohibitive, and on finite datasets would result in overfitting. To address this, a common update strategy is to alternate between k optimization steps for the discriminator D and one optimization step for the generator G. This strategy is repeated in various GAN algor… More 

Submitted 27 August, 2020; originally announced August 2020.

Wasserstein Barycenter and Its Application to Texture Mixing

Apr 5, 2020 - This paper proposes a new definition of the averaging of ...


2020


[PDF] arxiv.org

Revisiting fixed support wasserstein barycenter: Computational hardness and efficient algorithms

T Lin, N Ho, X Chen, M Cuturi, MI Jordan - arXiv preprint arXiv:2002.04783, 2020 - arxiv.org

We study the fixed-support Wasserstein barycenter problem (FS-WBP), which consists in computing the Wasserstein barycenter of $m$ discrete probability measures supported on a finite metric space of size $n$. We show first that the constraint matrix arising from the …

  Cited by 3 Related articles All 2 versions 
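
A hedged sketch of the fixed-support setting discussed above, using the entropically regularized barycenter routine from the POT library; the grid, the two toy histograms, the regularization value and the equal weights are all illustrative assumptions.

    import numpy as np
    import ot

    # Two discrete measures supported on the same fixed grid of n points.
    n = 50
    x = np.linspace(0.0, 1.0, n)
    a1 = np.exp(-((x - 0.3) ** 2) / 0.01); a1 /= a1.sum()
    a2 = np.exp(-((x - 0.7) ** 2) / 0.01); a2 /= a2.sum()
    A = np.vstack([a1, a2]).T              # shape (n, m): one column per measure

    M = ot.dist(x.reshape(-1, 1), x.reshape(-1, 1))   # squared Euclidean costs
    M /= M.max()

    # Entropic fixed-support barycenter (Sinkhorn-based), equal weights.
    bary = ot.bregman.barycenter(A, M, reg=0.05, weights=np.array([0.5, 0.5]))

The paper itself studies the linear-programming formulation of this problem (its computational hardness and faster exact algorithms); the Sinkhorn-based routine above is only one common practical solver, not the algorithm proposed in the paper.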


Statistical learning in Wasserstein space

A Karimi, L Ripani, TT Georgiou - arXiv preprint arXiv:2004.07875, 2020 - arxiv.org

We seek a generalization of regression and principle component analysis (PCA) in a metric space where data points are distributions metrized by the Wasserstein metric. We recast these analyses as multimarginal optimal transport problems. The particular formulation allows efficient computation, ensures existence of optimal solutions, and admits a probabilistic interpretation over the space of paths (line segments). Application of the theory to the interpolation of empirical distributions, images, power spectra, as well as assessing …

Cited by 9 Related articles All 8 versions


online Cover Image  PEER-REVIEW

Sample generation based on a supervised Wasserstein Generative Adversarial Network for...

by Han, Wei; Wang, Lizhe; Feng, Ruyi ; More...

Information sciences, 10/2020, Volume 539

As high-resolution remote-sensing (HRRS) images have become increasingly widely available, scene classification focusing on the smart classification of land...

Journal ArticleFull Text Online

 Cited by 22 Related articles

2020 online Cover Image PEER-REVIEW

De Novo Protein Design for Novel Folds Using Guided Conditional Wasserstein Generative Adversarial...

by Karimi, Mostafa; Zhu, Shaowen; Cao, Yue ; More...

Journal of chemical information and modeling, 12/2020, Volume 60, Issue 12

Although massive data is quickly accumulating on protein sequence and structure, there is a small and limited number of protein architectural types (or...

Journal ArticleFull Text Online


2020  see 2019

Aggregated Wasserstein Distance and State Registration for Hidden Markov Models 

By: Chen, Yukun; Ye, Jianbo; Li, Jia 

IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE  Volume: ‏ 42   Issue: ‏ 9   Pages: ‏ 2133-2147   Published: ‏ SEPT 1 2020 

Cited by 12 Related articles All 7 versions

<——2020————2020————————  350  ——   


Entropy-Regularized $2$-Wasserstein Distance between ...

https://arxiv.org › stat

by A Mallasto · 2020 · Cited by 6 — [Submitted on 5 Jun 2020]. Title:Entropy-Regularized 2-Wasserstein Distance between Gaussian Measures ... In this work, we study the Gaussian geometry

online  OPEN ACCESS

Entropy-Regularized $2$-Wasserstein Distance between Gaussian Measures

by Mallasto, Anton; Gerolin, Augusto; Minh, Hà Quang

06/2020

Gaussian distributions are plentiful in applications dealing in uncertainty quantification and diffusivity. They furthermore stand as important special cases...

Journal ArticleFull Text Online
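
For context on the entry above: between Gaussian measures the unregularized 2-Wasserstein distance already has a closed form,

\[ W_2^2\big(\mathcal{N}(m_1,\Sigma_1),\, \mathcal{N}(m_2,\Sigma_2)\big) \;=\; \lVert m_1 - m_2 \rVert^2 \;+\; \operatorname{tr}\!\Big( \Sigma_1 + \Sigma_2 - 2\big(\Sigma_1^{1/2}\,\Sigma_2\,\Sigma_1^{1/2}\big)^{1/2} \Big), \]

and the cited work derives the analogous closed-form expressions when an entropic regularization term is added to the transport problem.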


Fused Gromov-Wasserstein Distance for Structured Objects

T Vayer, L Chapel, R Flamary, R Tavenard, N Courty - Algorithms, 2020 - mdpi.com

Optimal transport theory has recently found many applications in machine learning thanks to its capacity to meaningfully compare various machine learning objects that are viewed as distributions. The Kantorovitch formulation, leading to the Wasserstein distance, focuses on the features of the elements of the objects, but treats them independently, whereas the Gromov–Wasserstein distance focuses on the relations between the elements, depicting the structure of the object, yet discarding its features. In this paper, we study the Fused Gromov …

Cited by 26 Related articles All 33 versions
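
A hedged illustration of how the fused Gromov-Wasserstein distance described above can be evaluated in practice with the POT library; the two toy "structured objects", their node features, and the trade-off parameter alpha below are made-up inputs, not data from the paper.

    import numpy as np
    import ot

    rng = np.random.default_rng(0)
    X1, X2 = rng.normal(size=(6, 3)), rng.normal(size=(8, 3))  # node features
    C1 = ot.dist(X1, X1)              # intra-object structure matrices
    C2 = ot.dist(X2, X2)
    M = ot.dist(X1, X2)               # cross-object feature cost
    p = np.full(6, 1 / 6)             # uniform weights on the nodes
    q = np.full(8, 1 / 8)

    # alpha balances the feature (Wasserstein) term against the
    # structure (Gromov-Wasserstein) term.
    fgw_dist = ot.gromov.fused_gromov_wasserstein2(
        M, C1, C2, p, q, loss_fun='square_loss', alpha=0.5)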

Conditional Wasserstein generative adversarial network-gradient penalty-based approach to alleviating imbalanced data classification

M Zheng, T Li, R Zhu, Y Tang, M Tang, L Lin, Z Ma - Information Sciences, 2020 - Elsevier

In data mining, common classification algorithms cannot effectively learn from imbalanced data. Oversampling addresses this problem by creating data for the minority class in order to balance the class distribution before the model is trained. Traditional oversampling approaches are based on the Synthetic Minority Oversampling TEchnique (SMOTE), which focuses on local information but generates insufficiently realistic data. In contrast, the Generative Adversarial Network (GAN) captures the true data distribution in order to …

  Cited by 15 Related articles All 2 versions
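
Several of the GAN-based entries in this section, including the one above, build on the Wasserstein GAN objective with a gradient penalty (WGAN-GP). A minimal PyTorch sketch of the penalty term is given below for tabular data of shape (batch, features); the critic network and the penalty weight lambda_gp = 10 are illustrative assumptions, not details of any cited paper.

    import torch

    def gradient_penalty(critic, real, fake, lambda_gp=10.0):
        # WGAN-GP: push the critic's gradient norm towards 1 along random
        # interpolations between real and generated samples.
        eps = torch.rand(real.size(0), 1, device=real.device).expand_as(real)
        interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
        score = critic(interp)
        grads = torch.autograd.grad(outputs=score.sum(), inputs=interp,
                                    create_graph=True)[0]
        grad_norm = grads.view(grads.size(0), -1).norm(2, dim=1)
        return lambda_gp * ((grad_norm - 1.0) ** 2).mean()

    # Critic loss for one batch (fake produced by the generator):
    # d_loss = critic(fake).mean() - critic(real).mean()
    #          + gradient_penalty(critic, real, fake)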

online Cover Image  PEER-REVIEW OPEN ACCESS

Network Intrusion Detection Based on Conditional Wasserstein Generative Adversarial Network and...

by Zhang, Guoling; Wang, Xiaodan; Li, Rui ; More...

IEEE access, 2020, Volume 8

In the field of intrusion detection, there is often a problem of data imbalance, and more and more unknown types of attacks make detection difficult. To...

Journal ArticleFull Text Online

 

A new Wasserstein distance- and cumulative sum-dependent health indicator and its application in prediction of remaining useful life of bearing 

By: Yin, Jiancheng; Xu, Minqiang; Zheng, Huailiang; et al.

JOURNAL OF THE BRAZILIAN SOCIETY OF MECHANICAL SCIENCES AND ENGINEERING  Volume: ‏ 42   Issue: ‏ 9     Article Number: 479   Published: ‏ AUG 18 2020 

Investigators from Harbin Institute of Technology Release New Data on Mechanical Science (A New Wasserstein Distance- and Cumulative... 

Journal of Technology & Science, 09/2020

NewsletterFull Text Online 

Science - Mechanical Science; Investigators from Harbin Institute of Technology Release New Data on Mechanical Science (A New Wasserstein... 

Journal of technology & science, Sep 13, 2020, 1581

Newspaper ArticleFull Text Online 


Stein's method for normal approximation in Wasserstein distances with application to the multivariate central limit theorem 

By: Bonis, Thomas 

PROBABILITY THEORY AND RELATED FIELDS     

Early Access: AUG 2020 


Data-driven Stochastic Programming with Distributionally Robust Constraints Under Wasserstein Distance: Asymptotic Properties 

By: Mei, Yu; Chen, Zhi-Ping; Ji, Bing-Bing; et al.

JOURNAL OF THE OPERATIONS RESEARCH SOCIETY OF CHINA    

Data-driven Stochastic Programming with Distributionally Robust Constraints Under Wasserstein Distance: Asymptotic Properties

Y Mei, ZP Chen, BB Ji, ZJ Xu, J Liu - … of the Operations Research Society of …, 2020 - Springer

Distributionally robust optimization is a dominant paradigm for decision-making problems 

where the distribution of random variables is unknown. We investigate a distributionally 

robust optimization problem with ambiguities in the objective function and countably infinite …

journal article

Cited by 2 Related articles
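
Both distributionally robust entries above use the same ambiguity-set construction. Writing \hat{P}_N for the empirical distribution of the N observed samples and \varepsilon > 0 for the radius of the Wasserstein ball, the decision problem takes the generic form (a standard formulation, not a quotation)

\[ \min_{x \in X} \; \sup_{Q \,:\, W(Q, \hat{P}_N) \le \varepsilon} \; \mathbb{E}_{Q}\big[ f(x, \xi) \big], \]

where W is a Wasserstein distance on distributions of the random parameter xi; the asymptotic analysis referred to above concerns the behaviour of the optimal value and solutions of this problem as the sample size N grows.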

World harmonized light-duty vehicles test procedure face recognition algorithm, comprises preprocessing human face image e.g. white balance and pixel compression and calculating Wasserstein distance 

Patent Number: CN111488763-A 

Patent Assignee: UNIV TIANJIN QINGDAO OCEAN TECHNOLOGY 

Inventor(s): XU J; SHI X; WANG R; et al.


Wasserstein metric for improved quantum machine learning with adjacency matrix representations

O Çaylak, OA von Lilienfeld… - … Learning: Science and …, 2020 - iopscience.iop.org

… Onur Çaylak et al 2020 Mach … To further investigate the performance of Wasserstein based kernels

as in equation (4) in QML models, we have turned to … emphasize that the observed solution of

the indexing problem and the simultaneous improvement of the predictions by using …

  Cited by 5 Related articles

online OPEN ACCESS

Wasserstein-based Projections with Applications to Inverse Problems

by Heaton, Howard; Fung, Samy Wu; Lin, Alex Tong ; More...

08/2020

Inverse problems consist of recovering a signal from a collection of noisy measurements. These are typically cast as optimization problems, with classic...

Journal ArticleFull Text Online

 

Modeling EEG data distribution with a Wasserstein Generative Adversarial Network to predict RSVP Events

S Panwar, P Rad, TP Jung… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

Electroencephalography (EEG) data are difficult to obtain due to complex experimental 

setups and reduced comfort with prolonged wearing. This poses challenges to train powerful 

deep learning model with the limited EEG data. Being able to generate EEG data 

computationally could address this limitation. We propose a novel Wasserstein Generative 

Adversarial Network with gradient penalty (WGAN-GP) to synthesize EEG data. This network 

addresses several modeling challenges of simulating time-series EEG data including …

Cited by 1 Related articles All 2 versions

Modeling EEG Data Distribution With a Wasserstein Generative Adversarial Network to Predict RSVP Events 

By: Panwar, Sharaj; Rad, Paul; Jung, Tzyy-Ping; et al.

IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION ENGINEERING  Volume: ‏ 28   Issue: ‏ 8   Pages: ‏ 1720-1730   Published: ‏ AUG 2020 

online

Findings from University of Texas San Antonio Update Knowledge of Data Distribution 

(Modeling Eeg Data Distribution With a Wasserstein...

Health & Medicine Week, 09/2020

NewsletterFull Text Online

Findings from University of Texas San Antonio Update Knowledge of Data Distribution (Modeling Eeg Data Distribution With a Wasserstein Generative Adversarial Network to predict RSVP 

Health & Medicine Week, 09/2020

NewsletterFull Text Online
Cited by 17
Related articles All 6 versions

 

Operational Research; Findings from University of Colorado Denver Broaden Understanding of Operational Research (An Lp-based, Strongly-polynomial 2-approximation Algorithm for Sparse Wasserstein...

Journal of mathematics (Atlanta, Ga.), Aug 25, 2020, 610

Newspaper ArticleCitation Online

(08/25/2020). "Operational Research; Findings from University of Colorado Denver Broaden Understanding of Operational Research (An Lp-based, Strongly-polynomial 2-approximation Algorithm for Sparse Wasserstein Barycenters)". Journal of mathematics (Atlanta, Ga.) (1945-8738), p. 610.

 <——2020———2020 ———————  360  ——   


An Integrated Processing Method Based on Wasserstein ...

www.researchgate.net › publication › 339516055_An_Int...

Download Citation | An Integrated Processing Method Based on Wasserstein Barycenter Algorithm for Automatic Music Transcription | Given a piece of acoustic ...

online

An Integrated Processing Method Based on Wasserstein Barycenter Algorithm for Automatic...

by Jin, Cong; Li, Zhongtong; Sun, Yuanyuan ; More...

Communications and Networking, 02/2020

Given a piece of acoustic musical signal, various automatic music transcription (AMT) processing methods have been proposed to generate the corresponding music...

Book ChapterFull Text Online

Infrared and visible image fusion using dual discriminators generative adversarial networks with Wasserstein distance 

By: Li, Jing; Huo, Hongtao; Liu, Kejian; et al.

INFORMATION SCIENCES  Volume: ‏ 529   Pages: ‏ 28-41   Published: ‏ AUG 2020 


Wasserstein distance based rapid image enhancing method, involves inputting constructed data set into deep learning model, and inputting to-be-processed motion blur image into deep learning model to obtain clear and full-color image 

Patent Number: CN111476721-A 

Patent Assignee: UNIV CHONGQING POSTS & TELECOM 

Inventor(s): FENG J; QI S; WU S.


Algorithm based on S-transform/Wasserstein GAN to solve unbalanced data leakage of brine pipeline, comprises e.g. transforming fault data to time-frequency-modular three-dimensional leak point picture through S transformation 

Patent Number: CN111460367-A 

Patent Assignee: HUAIYIN TECHNOLOGY INST 

Inventor(s): XU M; DING W; ZHAO J; et al.

 


Orthogonal gradient penalty for fast training of Wasserstein ...

https://koreauniv.pure.elsevier.com › publications › ort...

by CY Kao · 2020 · Cited by 1 — ... penalty for fast training of Wasserstein GaN based multi-task autoencoder toward robust ... In this Letter, we propose a new orthogonal gradient penalty (OGP) method for Wasserstein Generative ... Publication status, Published - 2020 May ...

online Cover Image

 PEER-REVIEW OPEN ACCESS

Orthogonal Gradient Penalty for Fast Training of Wasserstein GAN Based Multi-Task Autoencoder...

by KAO, Chao-Yuan; PARK, Sangwook; BADI, Alzahra ; More...

IEICE transactions on information and systems, 05/2020, Volume E103.D, Issue 5

Performance in Automatic Speech Recognition (ASR) degrades dramatically in noisy environments. To alleviate this problem, a variety of deep networks based on...

Journal ArticleFull Text Online

Method for performing adaptive image classification in depth domain based on Wasserstein distance, involves removing domain adaptation layer, and inputting target domain samples into model for classification to get accuracy 

Patent Number: CN111428803-A 

Patent Assignee: UNIV SHANDONG 

Inventor(s): WU Q; SUN S; LIU J; et al.


[PDF] arxiv.org

A Riemannian Block Coordinate Descent Method for Computing the Projection Robust Wasserstein Distance

M Huang, S Ma, L Lai - arXiv preprint arXiv:2012.05199, 2020 - arxiv.org

The Wasserstein distance has become increasingly important in machine learning and deep

learning. Despite its popularity, the Wasserstein distance is hard to approximate because of

the curse of dimensionality. A recently proposed approach to alleviate the curse of …

  Cited by 1 Related articles All 3 versions 



Typical wind power scenario generation for multiple wind ...

https://www.sciencedirect.com › science › article › pii

by Y Zhang · 2020 · Cited by 16 — Typical wind power scenario generation for multiple wind farms using ... on the conditional improved Wasserstein generative adversarial network (WGAN).

2020 onlineCover Image

 PEER-REVIEW OPEN ACCESS

Typical wind power scenario generation for multiple wind farms using conditional improved Wasserstein...

by Zhang, Yufan; Ai, Qian; Xiao, Fei ; More...

International journal of electrical power & energy systems, 01/2020, Volume 114

•Labeling model, conditional scenario generation and reduction are proposed.•The conditional WGAN-GP is trained to generate scenarios for multi-wind farms.•The...

Journal ArticleFull Text Online

online

Reports from Shanghai Jiao Tong University Describe Recent Advances in Wind Farms 

(Typical Wind Power Scenario Generation for Multiple Wind Farms Using Conditional Improved Wasserstein...

Energy Weekly News, 01/2020

NewsletterFull Text Online


 

[HTML] Wasserstein and Kolmogorov error bounds for variance-gamma approximation via Stein's method I

RE Gaunt - Journal of Theoretical Probability, 2020 - Springer

The variance-gamma (VG) distributions form a four-parameter family that includes as special

and limiting cases the normal, gamma and Laplace distributions. Some of the numerous

applications include financial modelling and approximation on Wiener space. Recently,

Stein's method has been extended to the VG distribution. However, technical difficulties

have meant that bounds for distributional approximations have only been given for smooth

test functions (typically requiring at least two derivatives for the test function). In this paper …

  Cited by 15 Related articles All 6 versions

online Cover Image PEER-REVIEW OPEN ACCESS

Wasserstein and Kolmogorov Error Bounds for Variance-Gamma Approximation via Stein’s Method I

by Gaunt, Robert E

Journal of theoretical probability, 03/2020, Volume 33, Issue 1

The variance-gamma (VG) distributions form a four-parameter family that includes as special and limiting cases the normal, gamma and Laplace distributions....

Journal ArticleFull Text Online

Cited by 29 Related articles All 10 versions


Auto encoder Wasserstein generative adversarial network based voice noise reduction method, involves obtaining and converting one-dimensional speech signals into one-dimensional discrete speech signals in test phase 

Patent Number: CN111564160-A 

Patent Assignee: UNIV CHONGQING POSTS & TELECOM 

Inventor(s): HU Z; XU X; LUO Y; et al.

 

Kantorovich-Rubinstein-Wasserstein distance between overlapping attractor and repeller 

By: Chigarev, Vladimir; Kazakov, Alexey; Pikovsky, Arkady 

CHAOS  Volume: ‏ 30   Issue: ‏ 7     Published: ‏ JUL 2020 

Chigarev, Vladimir; Kazakov, Alexey; Pikovsky, Arkady

Kantorovich-Rubinstein-Wasserstein distance between overlapping attractor and repeller. (English) Zbl 07269028 

Chaos 30, No. 7, 073114, 10 p. (2020). 

MSC:  37

   <——2020——2020——————  370  ——   



Tractable reformulations of two-stage distributionally robust linear programs over the type-infinity Wasserstein ball 

By: Xie, Weijun 

OPERATIONS RESEARCH LETTERS  Volume: ‏ 48   Issue: ‏ 4   Pages: ‏ 513-523   Published: ‏ JUL 2020 


year 2020

[PDF] optimization-online.org

[PDF] A Novel Solution Methodology for Wasserstein-based Data-Driven Distributionally Robust Problems

CA Gamboa, DM Valladao, A Street… - optimization-online.org

Distributionally robust optimization (DRO) is a mathematical framework to incorporate 

ambiguity over the actual data-generating probability distribution. Data-driven DRO 

problems based on the Wasserstein distance are of particular interest for their sound …


A Novel Solution Methodology for Wasserstein-based Data ...

http://www.optimization-online.org › 2020/10


Oct 21, 2020 — A Novel Solution Methodology for Wasserstein-based Data-Driven Distributionally Robust Problems · Carlos Gamboa(caagamboaro · Abstract: ...


[PDF] optimization-online.org

[PDF] Novel Solution Methodology for Wasserstein-based Data-Driven Distributionally Robust Problems

CA Gamboa, DM Valladao, A Street… - optimization-online.org

… Data-driven DRO problems based on the Wasserstein distance are of particular interest for

their sound mathematical properties. For right-… data-driven Wasserstein-based DROs with

right-hand-sided uncertainty and rectangular support. We propose a novel finite reformulation …

Related articles 



Dynamic Facial Expression Generation on Hilbert Hypersphere with Conditional Wasserstein Generative Adversarial Nets. 

By: Otberdout, Naima; Daoudi, Mohammed; Kacem, Anis; et al.

IEEE transactions on pattern analysis and machine intelligence  Volume: ‏ PP     Published: ‏ 2020-Jun-15 (Epub 2020 Jun 15) 

[PDF] archives-ouvertes.fr

Dynamic facial expression generation on hilbert hypersphere with conditional wasserstein generative adversarial nets

N Otberdout, M Daoudi, A Kacem… - … on Pattern Analysis …, 2020 - ieeexplore.ieee.org


  Cited by 1 All 8 versions


A Study on Accelerating the Wasserstein k-means Method via Sparse Simplex Projection [translated from Japanese]

Takumi Fukunaga, Hiroyuki Kasai - IEICE Conferences Archives, 2020 - ieice.org

k-means is one of the most widely used clustering methods, but because it requires computing the distances between all samples and the centroids (cluster centers) at every iteration, it is difficult to apply to large-scale data. Many acceleration techniques that reduce this computational cost have therefore been proposed. For distances other than the Euclidean distance, however, …

  All 2 versions 

An Improvement based on Wasserstein GAN for Alleviating Mode Collapsing

Y Chen, X Hou - 2020 International Joint Conference on Neural …, 2020 - ieeexplore.ieee.org

In the past few years, Generative Adversarial Networks as a deep generative model has

received more and more attention. Mode collapsing is one of the challenges in the study of

Generative Adversarial Networks. In order to solve this problem, we deduce a new algorithm

on the basis of Wasserstein GAN. We add a generated distribution entropy term to the

objective function of generator net and maximize the entropy to increase the diversity of fake

images. And then Stein Variational Gradient Descent algorithm is used for optimization. We …

  Related articles


Wasserstein distance based deep multi-feature adversarial transfer diagnosis approach under variable working conditions 

By: She, D.; Peng, N.; Jia, M.; et al.

JOURNAL OF INSTRUMENTATION  Volume: ‏ 15   Issue: ‏ 6     Article Number: P06002   Published: ‏ JUN 2020 

Cited by 5 Related articles All 3 versions

A CONVERGENT LAGRANGIAN DISCRETIZATION FOR p-WASSERSTEIN AND FLUX-LIMITED DIFFUSION EQUATIONS 

By: Soellner, Benjamin; Junge, Oliver 

COMMUNICATIONS ON PURE AND APPLIED ANALYSIS  Volume: ‏ 19   Issue: ‏ 9   Pages: ‏ 4227-4256   Published: ‏ JUN

Deep joint two-stream Wasserstein auto-encoder and selective attention alignment for unsupervised domain adaptation 

By: Chen, Zhihong; Chen, Chao; Jin, Xinyu; et al.

NEURAL COMPUTING & APPLICATIONS  Volume: ‏ 32   Issue: ‏ 11   Special Issue: ‏ SI   Pages: ‏ 7489-7502   Published: ‏ JUN 

A data-driven distributionally robust newsvendor model with a Wasserstein ambiguity set 

By: Lee, Sangyoon; Kim, Hyunwoo; Moon, Ilkyeong 

JOURNAL OF THE OPERATIONAL RESEARCH SOCIETY     

Early Access: MAY 2020

Aluminum electrolysis fire eye image repair method for Wasserstein deep convolution generating an anti-network, involves extracting the center of the fire eye as the center, where distance is used to define the loss of the generator 

Patent Number: CN111192221-A 

Patent Assignee: UNIV CENT SOUTH 

Inventor(s): CHEN X; PAN M; XIE Y; et al.

  <——2020——2020———————  380  ——   

 

CVaR-Based Approximations of Wasserstein Distributionally Robust Chance Constraints with Application to Process Scheduling 

By: Liu, Botong; Zhang, Qi; Ge, Xiaolong; et al.

INDUSTRIAL & ENGINEERING CHEMISTRY RESEARCH  Volume: ‏ 59   Issue: ‏ 20   Pages: ‏ 9562-9574   Published: ‏ MAY 20 2020 

CVaR-Based Approximations of Wasserstein Distributionally Robust Chance Constraints with... 

by Liu, Botong; Zhang, Qi; Ge, Xiaolong; More... 

Industrial & engineering chemistry research, 05/2020, Volume 59, Issue 20


Journal ArticleFull Text Online 

Cited by 5 Related articles All 4 versions

Method for deep self-coding embedded clustering based on Sliced-Wasserstein distance, involves processing self-encoding embedded clustering network when clustering network reaches preset threshold, and completing final clustering 

Patent Number: CN111178427-A 

Patent Assignee: UNIV HANGZHOU DIANZI 

Inventor(s): GUO C; RONG P; CHEN H; et al.


A Riemannian submersion-based approach to the Wasserstein barycenter of positive definite matrices 

By: Li, Mingming; Sun, Huafei; Li, Didong 

MATHEMATICAL METHODS IN THE APPLIED SCIENCES  Volume: ‏ 43   Issue: ‏ 7   Pages: ‏ 4927-4939   Published: ‏ MAY 15 2020 

online

Investigators from Beijing Institute of Technology Target Mathematics in Applied Science 

(A Riemannian Submersion-based Approach To the Wasserstein Barycenter of Positive Definite Matrices)

Mathematics Week, 06/2020

NewsletterFull Text Online


[PDF] arxiv.org

Wasserstein distributionally robust shortest path problem

Z Wang, K You, S Song, Y Zhang - European Journal of Operational …, 2020 - Elsevier

This paper proposes a data-driven distributionally robust shortest path (DRSP) model where

the distribution of the travel time in the transportation network can only be partially observed

through a finite number of samples. Specifically, we aim to find an optimal path to minimize …

  Cited by 3 Related articles All 8 versions

Orthogonal Gradient Penalty for Fast Training of Wasserstein GAN Based Multi-Task Autoencoder toward Robust Speech Recognition 

By: Kao, Chao-Yuan; Park, Sangwook; Badi, Alzahra; et al.

IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS  Volume: ‏ E103D   Issue: ‏ 5   Pages: ‏ 1195-1198   Published: ‏ MAY 2020 

 Cited by 3 Related articles All 6 versions


online

Technology - Information Technology; Korea University Reports Findings in Information Technology 

(Orthogonal Gradient Penalty for Fast Training of Wasserstein GAN Based Multi-Task Autoencoder toward...

Computer technology journal, Jun 4, 2020, 526

Newspaper ArticleFull Text Online

A novel kernel Wasserstein distance on Gaussian measures: An application of identifying dental artifacts in head and neck computed tomography 

By: Oh, Jung Hun; Pouryahya, Maryam; Iyer, Aditi; et al.

COMPUTERS IN BIOLOGY AND MEDICINE  Volume: ‏ 120     Article Number: 103731   Published: ‏ MAY 2020 

 Cited by 1 Related articles All 5 versions
online

New Findings from Memorial Sloan-Kettering Cancer Center Update Understanding of Computers 

(A Novel Kernel Wasserstein Distance On Gaussian Measures: an Application of Identifying Dental Artifacts In Head...

Computer Weekly News, 06/2020

NewsletterFull Text Online

 Computers; New Findings from Memorial Sloan-Kettering Cancer Center Update Understanding of Computers 

(A Novel Kernel Wasserstein Distance On Gaussian Measures: an Application of Identifying...

Computer Weekly News, Jun 10, 2020, 445

Newspaper ArticleCitation Online

 A novel kernel Wasserstein distance on Gaussian measures ...

A novel L2-Wasserstein distance in reproducing kernel Hilbert spaces was proposed. •. The resultant distance matrix was integrated with a hierarchical clustering ...

by JH Oh - ‎2020 - ‎Related articles



Knowledge-Grounded Chatbot Based on Dual Wasserstein Generative Adversarial Networks with Effective Attention Mechanisms 

By: Kim, Sihyung; Kwon, Oh-Woog; Kim, Harksoo 

APPLIED SCIENCES-BASEL  Volume: ‏ 10   Issue: ‏ 9     Article Number: 3335   Published: ‏ MAY 2020 


[PDF] sci-en-tech.com

[PDF] Entropy-regularized Wasserstein Distances for Analyzing Environmental and Ecological Data

H Yoshioka, Y Yoshioka, Y Yaegashi - THE 11TH …, 2020 - sci-en-tech.com

We explore applicability of entropy-regularized Wasserstein (pseudo-) distances as new

tools for analyzing environmental and ecological data. In this paper, the two specific

examples are considered and are numerically analyzed using the Sinkhorn algorithm. The …

[PDF] researchgate.net

Infrared and visible image fusion using dual discriminators generative adversarial networks with Wasserstein distance

J Li, H Huo, K Liu, C Li - Information Sciences, 2020 - Elsevier

Generative adversarial network (GAN) has shown great potential in infrared and visible

image fusion. The existing GAN-based methods establish an adversarial game between

generative image and source images to train the generator until the generative image

contains enough meaningful information from source images. However, they only design

one discriminator to force the fused result to complement gradient information from visible

image, which may lose some detail information that existing in infrared image and omit some …

 Cited by 22 Related articles All 2 versions

 online Cover Image  PEER-REVIEW

Infrared and visible image fusion using dual discriminators generative adversarial networks with Wasserstein...

by Li, Jing; Huo, Hongtao; Liu, Kejian ; More...

Information sciences, 08/2020, Volume 529

•We employ Generative Multi-Adversarial networks to fuse source images.•We design two discriminators to preserve more intensity and texture information.•We...

Journal ArticleFull Text Online

  

 

Face gender discrimination algorithm based on Wasserstein distance involves building comparison libraries, calculating distance between target distribution and different distributions and selecting type closest to comparison library 

Patent Number: CN111046708-A 

Patent Assignee: UNIV TIANJIN QINGDAO OCEAN TECHNOLOGY 

Inventor(s): XU J; SHI X; WANG R; et al.

  <——2020————2020————  390  ——   

     

Wasserstein distance and auto-encoder based semi-supervised deep learning fault diagnosis method, involves acquiring industrial continuous process data of fault category, inputting standardized data set to-be-tested into WASS-AE model 

Patent Number: CN111026058-A 

Patent Assignee: UNIV ZHEJIANG 


[PDF] iop.org

An Improved Defect Detection Method of Water Walls Using the WGAN

Y Zang, L Lu, Y Wang, Y Ding, J Yang… - Journal of Physics …, 2020 - iopscience.iop.org

This paper proposes an improved water wall defect detection method using Wasserstein 

generation adversarial network (WGAN). The method aims to improve the problems of poor 

safety and high level of maintenance personnel required by traditional inspection methods …

 

[PDF] arxiv.org

Statistical analysis of Wasserstein GANs with applications to time series forecasting

M Haas, S Richter - arXiv preprint arXiv:2011.03074, 2020 - arxiv.org

We provide statistical theory for conditional and unconditional Wasserstein generative

adversarial networks (WGANs) in the framework of dependent observations. We prove

upper bounds for the excess Bayes risk of the WGAN estimators with respect to a modified …

  Cited by 1 Related articles All 3 versions 


 Multi-view Wasserstein discriminant analysis with entropic regularized Wasserstein distance

H Kasai - ICASSP 2020-2020 IEEE International Conference …, 2020 - ieeexplore.ieee.org

Analysis of multi-view data has recently garnered growing attention because multi-view data

frequently appear in real-world applications, which are collected or taken from many sources

or captured using various sensors. A simple and popular promising approach is to learn a

latent subspace shared by multi-view data. Nevertheless, because one sample lies in

heterogeneous structure types, many existing multi-view data analyses show that

discrepancies in within-class data across multiple views have a larger value than …

  Cited by 4 Related articles

online

Multi-View Wasserstein Discriminant Analysis with Entropic Regularized Wasserstein Distance

by Kasai, Hiroyuki

ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 05/2020

Analysis of multi-view data has recently garnered growing attention because multi-view data frequently appear in real-world applications, which are collected...

Conference ProceedingFull Text Online


A Wasserstein gradient-penalty generative adversarial network with deep auto-encoder for bearing intelligent fault diagnosis 

By: Xiong, Xiong; Jiang, Hongkai; Li, Xingqiu; et al.

MEASUREMENT SCIENCE AND TECHNOLOGY  Volume: ‏ 31   Issue: ‏ 4     Article Number: 045006   Published: ‏ APR 2020 

Cited by 27 Related articles All 3 versions

BIRCH clustering and Wasserstein distance based wind power output typical scene generation method, involves obtaining non-leaf node branch parameter, leaf node branch parameter and wind electric scene maximum cluster radius threshold 

Patent Number: CN110929399-A 

Patent Assignee: STATE GRID JIANGSU ELECTRIC POWER CO LTD 

Inventor(s): TANG X; LI Q; WANG S; et al.

 

Fault diagnosis method for deep combat migration network based on Wasserstein distance, involves obtaining data set of source domain, an establishment is made that feature-based transfer learning fault diagnosis model has feature extractor 

Patent Number: CN110907176-A 

Patent Assignee: UNIV HEFEI TECHNOLOGY 

Inventor(s): XU J; HUANG J; ZHOU L; et al.


[PDF] ieee.org

Study of Restrained Network Structures for Wasserstein Generative Adversarial Networks (WGANs) on Numeric Data Augmentation

W Wang, C Wang, T Cui, Y Li - IEEE Access, 2020 - ieeexplore.ieee.org

Some recent studies have suggested using Generative Adversarial Network (GAN) for

numeric data over-sampling, which is to generate data for completing the imbalanced

numeric data. Compared with the conventional over-sampling methods, taken SMOTE as an …

  Cited by 1 Related articles

Gromov-Hausdorff limit of Wasserstein spaces on point clouds 

By: Garcia Trillos, Nicolas 

CALCULUS OF VARIATIONS AND PARTIAL DIFFERENTIAL EQUATIONS  Volume: ‏ 59   Issue: ‏ 2     Article Number: 73   Published: ‏ MAR 11 2020

Pattern-Based Music Generation with Wasserstein ...

https://research.tue.nl › publications › pattern-based-mu...

by VAJT Borghuis · 2020 — Pattern-Based Music Generation with Wasserstein 

Autoencoders and PRC Descriptions. V.A.J. (Tijn) ... Title of host publication, Proceedings of the 29th International Joint Conference on Artificial Intelligence, IJCAI 2020.

online OPEN ACCESS

Pattern-Based Music Generation with Wasserstein 

Autoencoders and PRC Descriptions

by Borghuis, V.A.J; Angioloni, Luca; Brusci, Lorenzo ; More...

Proceedings of the 29th International Joint Conference on Artificial Intelligence, IJCAI 2020, 07/2020

We demonstrate a pattern-based MIDI music generation system with a generation strategy based on Wasserstein autoencoders and a novel variant of pianoroll...

Conference ProceedingFull Text Online

Pattern-Based Music Generation with Wasserstein 

Autoencoders and PRC Descriptions  book
<——2020——2020———  400  ——  


[PDF] dergipark.org.tr

Wasserstein Riemannian Geometry on Statistical Manifold

C Ogouyandjou, N Wadagni - … Electronic Journal of Geometry, 2020 - dergipark.org.tr

In this paper, we study some geometric properties of statistical manifold equipped with the Riemannian Otto metric which is related to the L 2-Wasserstein distance of optimal mass transport. We construct some α-connections on such manifold and we prove that the …

  All 2 versions 

Wasserstein and Kolmogorov Error Bounds for Variance-Gamma Approximation via Stein's Method I 

By: Gaunt, Robert E. 

JOURNAL OF THEORETICAL PROBABILITY  Volume: ‏ 33   Issue: ‏ 1   Pages: ‏ 465-505   Published: ‏ MAR 2020 

 online

Probability Research; Recent Studies from University of Manchester Add New Data to Probability Research 

(Wasserstein and Kolmogorov Error Bounds for Variance-gamma Approximation Via Stein's Method I)

Journal of mathematics (Atlanta, Ga.), Jun 2, 2020, 939

Newspaper ArticleFull Text Online

Cited by 22 Related articles All 8 versions

Exact rate of convergence of the mean Wasserstein distance ...

https://arxiv.org › math

by P Berthet · 2020 · Cited by 3 — [Submitted on 27 Jan 2020]. Title:Exact rate of convergence of the mean Wasserstein distance between the empirical and true Gaussian distribution.

online OPEN ACCESS

Exact rate of convergence of the mean Wasserstein distance between the empirical and true Gaussian...

by Berthet, Philippe; Fort, Jean-Claude

01/2020

Electron. J. Probab. 25 (2020) We study the Wasserstein distance $W_2$ for Gaussian samples. We establish the exact rate of convergence $\sqrt{\log\log n/n}$...


Journal ArticleFull Text Online

2020

Wasserstein Collaborative Filtering for ... - ACM Digital Library

https://dl.acm.org › doi › abs

https://dl.acm.org › doi › abs

by Y Meng · 2020 · Cited by 10 — On this basis, we propose Wasserstein Collaborative Filtering (WCF), which predicts user preference on cold-start items by minimizing the ...


Training method for guaranteeing stable convergence of maximum and minimum loss function of generative confrontation network model (GAN) model, involves calculating countermeasure loss function Wasserstein format 

Patent Number: CN110826688-A 

Patent Assignee: JIANGSU AIJIA HOUSEHOLD PROD CO LTD 

Inventor(s): CHEN X; LV C; LIN S.


2020

 

Clean energy source power supply planning method involves constructing output uncertainty set, which is based on wind of the Wasserstein, and a power plan model is established under distributed robust optimization distance 

Patent Number: CN110797919-A 

Patent Assignee: STATE GRID SICHUAN ELECTRIC POWER CO ECO 

Inventor(s): WANG R; LIU Y; ZHU M; et al.


online Cover Image PEER-REVIEW

Wasserstein distance based deep adversarial transfer learning for intelligent fault diagnosis with unlabeled...

by Cheng, Cheng; Zhou, Beitong; Ma, Guijun ; More...

Neurocomputing (Amsterdam), 10/2020, Volume 409

•A Wasserstein distance based deep transfer learning (WD-DTL) network is designed for intelligent fault diagnosis, addressing industrial domain shift...

Journal ArticleFull Text Online

Cited by 109 Related articles All 5 versions


Conditional Wasserstein generative adversarial network-gradient penalty-based approach to alleviating imbalanced data

Peer-reviewed
Conditional Wasserstein generative adversarial network-gradient penalty-based approach to alleviating imbalanced data classification
Authors: Ming Zheng, Tong Li, Rui Zhu, Yahui Tang, Mingjing Tang, Leilei Lin, Zifei Ma
Article, 2020
Publication:Information sciences, 512, 2020, 1009
Publisher:2020

Wasserstein Generative Adversarial Networks Based Data Augmentation for Radar Data Analysis 

By: Lee, Hansoo; Kim, Jonggeun; Kim, Eun Kyeong; et al.

APPLIED SCIENCES-BASEL  Volume: ‏ 10   Issue: ‏ 4     Article Number: 1449   Published: ‏ FEB 2020 

 

Severity-aware semantic segmentation with reinforced wasserstein training

X Liu, W Ji, J You, GE Fakhri… - Proceedings of the IEEE …, 2020 - openaccess.thecvf.com

Semantic segmentation is a class of methods to classify each pixel in an image into

semantic classes, which is critical for autonomous vehicles and surgery systems. Cross-

entropy (CE) loss-based deep neural networks (DNN) achieved great success wrt the …

  Cited by 12 Related articles All 5 versions 

<—–2020———2020——— 410 —  

[PDF] researchgate.net

Tractable reformulations of two-stage distributionally robust linear programs over the type-∞ wasserstein ball

W Xie - Operations Research Letters, 2020 - Elsevier

JavaScript is disabled on your browser. Please enable JavaScript to use all

the features on this page. Skip to main content Skip to article …

  Cited by 6 Related articles All 4 versions

    

Experiment data in support of "Recovering hospital service times via the Wasserstein distance for segmentation analysis: a study in COPD patients in the Cwm Taf region" 

By: Wilde, Henry; Knight, Vincent; Gillard, Jonathan 

Zenodo

DOI: 10.5281/ZENODO.3924715 

Document Type: Data set 


Donsker's theorem in Wasserstein-1 distance - Project Euclid

projecteuclid.org › volume-25 › issue-none › 20-ECP308

by L Coutin · 2020 · Cited by 1 — Laurent Decreusefond. "Donsker's theorem in Wasserstein-1 distance." Electron. Commun. Probab. 25 1 - 13, 2020. https://doi.org/10.1214/20-ECP308 ...

2020 online

 PEER-REVIEW OPEN ACCESS

Donsker's theorem in {Wasserstein}-1 distance

by Coutin, L; Decreusefond, Laurent

Electronic communications in probability, 2020, Volume 25

We compute the Wassertein-1 (or Kantorovitch-Rubinstein) distance between a random walk in $R^d$ and the Brownian motion. The proof is based on a new estimate...

Journal ArticleFull Text Online

 OPEN ACCESS

Donsker's theorem in {Wasserstein}-1 distance

by Coutin, L; Decreusefond, Laurent

01/2020

International audience; We compute the Wassertein-1 (or Kantorovitch-Rubinstein) distance between a random walk in $R^d$ and the Brownian motion. The proof is...

PublicationCitation Online
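
For reference alongside the Donsker entries above: the Wasserstein-1 (Kantorovich-Rubinstein) distance used there admits the dual representation

\[ W_1(\mu, \nu) \;=\; \sup_{\operatorname{Lip}(f) \le 1} \left( \int f \, d\mu - \int f \, d\nu \right), \]

with the supremum taken over 1-Lipschitz test functions f.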


 Intelligent Fault Diagnosis with a Deep Transfer Network based on Wasserstein Distance

J Xu, J Huang, Y Zhao, L Zhou - Procedia Computer Science, 2020 - Elsevier

Intelligent fault-diagnosis methods based on deep-learning technology have been very

successful for complex industrial systems. The deep learning based fault classification

model requires a large number of labeled data. Moreover, the probability distribution of

training set and test data should be the same. These two conditions are often not satisfied in

practical working conditions. Thereby an intelligent fault-diagnosis method based on a deep

adversarial transfer network is proposed, when the target domain only has unlabeled …

  Related articles

Cover Image PEER-REVIEW OPEN ACCESS

Intelligent Fault Diagnosis with a Deep Transfer Network based on Wasserstein Distance

by Xu, Juan; Huang, Jingkun; Zhao, Yukun ; More...

Procedia computer science, 2020, Volume 174

Intelligent fault-diagnosis methods based on deep-learning technology have been very successful for complex industrial systems. The deep learning based fault...

Journal ArticleCitation Online

 

     

2020
Study of Restrained Network Structures for Wasserstein Generative Adversarial Networks (WGANs) on Numeric Data Augmentation
Authors: Wei Wang, Chuang Wang, Tao Cui, Yue Li
Summary: Some recent studies have suggested using Generative Adversarial Networks (GANs) for numeric data over-sampling, which is to generate data for completing imbalanced numeric data. Compared with conventional over-sampling methods, taking SMOTE as an example, the recently proposed GAN schemes fail to generate distinguishable augmentation results for classifiers. In this paper, we discuss the reason for such failures, based on which we further study the restrained conditions between G and D theoretically, and propose a quantitative indicator of the restrained structure, called Similarity of the Restrained Condition (SRC), to measure the restrained conditions. Practically, we propose several candidate solutions, which are isomorphic (IWGAN), mirror (MWGAN) and self-symmetric WGAN (SWGAN), for the restrained conditions. Besides, the restrained WGANs enhance the classification performance in AUC on five classifiers compared with the original data as the baseline, conventional SMOTE, and other GANs, adding up to 20 groups of experiments on four datasets. The restrained WGANs outperform all others in 17/20 groups, among which IWGAN accounts for 15/17 groups, and the SRC is an effective measure in evaluating the restraints, so that further GAN structures with G-D restraints could be designed on SRC. Multidimensional scaling (MDS) is introduced to eliminate the impact of datasets and evaluate the AUC in a composite index, and IWGAN decreases the MDS distance by 20% to 40%. Moreover, the convergence speed of IWGAN is increased, and the initial error of the loss function is reduced.
Article, 2020
Publication:IEEE Access, 8, 2020, 89812
Publisher:2020
Study of Restrained Network Structures for Wasserstein Generative Adversarial Networks (WGANs) on Numeric Data Augmentation 

By: Wang, Wei; Wang, Chuang; Cui, Tao; et al.

IEEE ACCESS  Volume: ‏ 8   Pages: ‏ 89812-89821   Published: ‏ 2020 

numeric data. Compared with the conventional over-sampling methods, taken SMOTE as an …

  Cited by 2 Related articles All 2 versions

Severity-Aware Semantic Segmentation with Reinforced Wasserstein Training
Authors: Liu X., Ji W., You J., El Fakhri G., Woo J.
2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2020
Article, 2020
Publication:Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2020, 12563
Publisher:2020


Donsker's theorem in Wasserstein-1 distance 

By: Coutin, Laure; Decreusefond, Laurent 

ELECTRONIC COMMUNICATIONS IN PROBABILITY  Volume: ‏ 25     Article Number: 27   Published: ‏ 2020 

Related articles All 18 versions
Donsker's theorem in Wasserstein-1 distance

Cited by 7 Related articles All 34 versions

2020

Data supplement for a soft sensor using a new generative ...

https://www.sciencedirect.com › science › article › pii

by X Wang · 2020 · Cited by 8 — In this study, a deep generative model combining the variational autoencoder (VAE) and the Wasserstein generative adversarial network (WGAN) is utilized to ...

online Cover Image PEER-REVIEW

Data supplement for a soft sensor using a new generative model based on a variational autoencoder and Wasserstein...

by Wang, Xiao; Liu, Han

Journal of process control, 01/2020, Volume 85

•We propose a generative model named VA-WGAN by integrating a VAE with WGAN to supplement training data for soft sensor modeling. The VA-WGAN generates the...


Journal ArticleFull Text Online

 [PDF] researchgate.net

Data supplement for a soft sensor using a new generative model based on a variational autoencoder and Wasserstein GAN

X Wang, H Liu - Journal of Process Control, 2020 - Elsevier

In industrial process control, measuring some variables is difficult for environmental or cost reasons. This necessitates employing a soft sensor to predict these variables by using the collected data from easily measured variables. The prediction accuracy and computational …

Cited by 25 Related articles All 2 versions

OPTIMALITY IN WEIGHTED L-2-WASSERSTEIN GOODNESS-OF-FIT STATISTICS 

By: de Wet, Tertius; Humble, Veronica 

SOUTH AFRICAN STATISTICAL JOURNAL  Volume: ‏ 54   Issue: ‏ 1   Pages: ‏ 1-13   Published: ‏ 2020 


2020

2012.09729] Approximation rate in Wasserstein distance of ...

https://arxiv.org › math


by O Bencheikh · 2020 — Title:Approximation rate in Wasserstein distance of probability measures on the real line by deterministic empirical measures.
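
For the one-dimensional setting of the entry above, recall that the Wasserstein distance between measures on the real line reduces to a comparison of quantile functions; for two samples of equal size with uniform weights this is simply a matching of order statistics. A small self-contained sketch with illustrative data:

    import numpy as np

    def wasserstein_p_1d(x, y, p=1):
        # W_p between the empirical measures of two equal-size 1-D samples:
        # sort both samples and compare order statistics (quantile functions).
        xs, ys = np.sort(x), np.sort(y)
        return np.mean(np.abs(xs - ys) ** p) ** (1.0 / p)

    rng = np.random.default_rng(0)
    sample_a = rng.normal(0.0, 1.0, size=1000)
    sample_b = rng.normal(0.5, 1.2, size=1000)
    print(wasserstein_p_1d(sample_a, sample_b, p=2))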


<——2020——2020———  420   ——  




Peer-reviewed
Data Augmentation Method for Switchgear Defect Samples Based on Wasserstein Generative Adversarial Network
Authors: Xueyou Huang, Jun Xiong, Yu Zhang, Jingyi Liang, Zhang Haoning, Hui Liu
Summary: The problem of sample imbalance leads to poor generalization ability of deep learning models and to overfitting during network training, which limits the accuracy of intelligent fault diagnosis of switchgear equipment. In view of this, this paper proposes a data augmentation method for switchgear defect samples based on a Wasserstein generative adversarial network, using partial discharge live detection data from the substation and real-time switchgear partial discharge simulation experimental data. This method can improve the imbalanced distribution of the data and solve problems such as vanishing gradients and model collapse in the classic generative adversarial network model, greatly improving the stability of training. It is verified through examples and compared with traditional data augmentation methods. The results show that the data augmentation method described in this paper can more effectively reduce the data imbalance, improve the performance of data-driven technology, and provide data support for subsequent fault diagnosis of switchgear equipment.
Article, 2020
Publication:1659, October 2020, 012056
Publisher:2020

Motion Deblurring in Image Color Enhancement by WGAN 

By: Feng, Jiangfan; Qi, Shuang 

INTERNATIONAL JOURNAL OF OPTICS   Volume: ‏ 2020     Article Number: 1295028   Published: ‏ JUN 24 2020 

 Free Full Text from Publisher 


[PDF] github.io

[PDF] Wasserstein Loss with Alternative Reinforcement Learning for Severity-Aware Semantic Segmentation

X Liu, Y Zhang, X Liu, S Bai, S Li, J You - liu-xiaofeng.github.io

Semantic segmentation is important for many realworld systems, eg, autonomous vehicles, which predict the class of each pixel. Recently, deep networks achieved significant progress wrt the mean Intersection-over Union (mIoU) with the cross-entropy loss. However, the cross …

Cited by 14 Related articles All 5 versions

[PDF] hbu.cn

[PDF] Bidirectional learning and inference based on the Wasserstein distance [translated from Chinese]

Qiang Hua, Yigong Liu, Feng Zhang, Chunru Dong - Journal of Hebei University (Natural Science Edition) - xbzrb.hbu.cn

The generative adversarial network based on the Wasserstein distance (WGAN) integrates an encoder and a generator bidirectionally into its model, thereby strengthening the learning ability of the generative model; however, it uses the KL divergence in its optimization objective to measure the discrepancy between distributions, which can cause vanishing or exploding gradients during training and reduces the robustness of the model. To overcome this problem …

All 2 versions View as HTML 

 

First arrival picking of microseismic signals based on nested U-Net and Wasserstein Generative Adversarial Network

JL Zhang, GQ Sheng - Journal of Petroleum Science and Engineering, 2020 - Elsevier

Picking the first arrival of microseismic signals quickly and accurately is the key to real-time data processing in microseismic monitoring. The traditional method cannot meet the high-accuracy and high-efficiency requirements of first-arrival microseismic picking in a low-SNR environment. Concentrating on the problem of relatively low microseismic SNR, this paper proposes the Residual Link Nested U-Net Network (RLU-Net), which can not only retain the spatial position information of the input signal and profile, but also realize the first …

Related articles All 2 versions

2020  Research article
First arrival picking of microseismic signals based on nested U-Net and Wasserstein Generative Adversarial Network
Journal of Petroleum Science and Engineering, 23 June 2020...

JingLan Zhang, GuanQun Sheng

Cited by 11 Related articles All 4 versions
 

[PDF] mdpi.com

Unpaired Image Denoising via Wasserstein GAN in Low-Dose CT Image with Multi-Perceptual Loss and Fidelity Loss

Z Yin, K Xia, Z He, J Zhang, S Wang, B Zu - Symmetry, 2021 - mdpi.com

The use of low-dose computed tomography (LDCT) in medical practice can effectively reduce the radiation risk of patients, but it may increase noise and artefacts, which can compromise diagnostic information. The methods based on deep learning can effectively …

Related articles

Unpaired Image Denoising via Wasserstein GAN in Low-Dose CT Image with Multi-Perceptual Loss and Fidelity Loss. Symmetry 2021, 13, 126

Z Yin, K Xia, Z He, J Zhang, S Wang, B Zu - 2021 - search.proquest.com


 

  

Wasserstein distance estimates for stochastic integrals by forward-backward stochastic calculus

JC Breton, N Privault - Potential Analysis, 2020 - Springer

We prove Wasserstein distance bounds between the probability distributions of stochastic integrals with jumps, based on the integrands appearing in their stochastic integral representations. Our approach does not rely on the Stein equation or on the propagation of convexity property for Markovian semigroups, and makes use instead of forward-backward stochastic calculus arguments. This allows us to consider a large class of target distributions constructed using Brownian stochastic integrals and pure jump martingales, which can be …

Related articles All 4 versions

Wasserstein metric for improved quantum machine learning with adjacency matrix representations

O Çaylak, OA von Lilienfeld… - … Learning: Science and …, 2020 - iopscience.iop.org

We study the Wasserstein metric to measure distances between molecules represented by the atom-index-dependent adjacency 'Coulomb' matrix, used in kernel-ridge-regression-based supervised learning. The resulting machine learning models of quantum properties, a.k.a. quantum machine learning models, exhibit improved training efficiency and result in smoother predictions of energies related to molecular distortions. We first illustrate smoothness for the continuous extraction of an atom from some organic molecule. Learning …
Cited by 13
Related articles All 7 versions


[PDF] researchgate.net

[PDF] Fixed-Support Wasserstein Barycenters: Computational Hardness and Fast Algorithm

T Lin, N Ho, X Chen, M Cuturi… - Advances in Neural …, 2020 - researchgate.net

We study the fixed-support Wasserstein barycenter problem (FS-WBP), which consists in computing the Wasserstein barycenter of m discrete probability measures supported on a finite metric space of size n. We show first that the constraint matrix arising from the standard …

  Cited by 5 Related articles All 5 versions 


[PDF] arxiv.org

Continuous regularized Wasserstein barycenters

L Li, A Genevay, M Yurochkin, J Solomon - arXiv preprint arXiv …, 2020 - arxiv.org

Wasserstein barycenters provide a geometrically meaningful way to aggregate probability distributions, built on the theory of optimal transport. They are difficult to compute in practice, however, leading previous work to restrict their supports to finite sets of points. Leveraging a …

  Cited by 5 Related articles All 4 versions 

<——2020———2020———  430  ——  

 

State Intellectual Property Office of China Releases Univ Nanjing Tech's Patent Application for a 

Blind Detection Method of an Image Repetition Region Based on Euclidean Metric of Wasserstein Histogram

Global IP News. Information Technology Patent News, Aug 31, 2020

Newspaper ArticleFull Text Online 

ProQuest


arXiv:2009.04651  [pdf, ps, other 

stat.ML cs.LG math.ST 

Universal consistency of Wasserstein k-NN classifier

Authors: Donlapark Ponnoprat

Abstract: The Wasserstein distance provides a notion of dissimilarities between probability measures, which has recent applications in learning of structured data with varying size such as images and text documents. In this work, we analyze the k-nearest neighbor classifier (k-NN) under the Wasserstein distance and establish the universal consistency on families of distributions. From previous results o… More

Submitted 9 September, 2020; originally announced September 2020. 

Comments: 12 pages 
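
The entry above studies k-NN classification when each data point is itself a distribution and distances between points are Wasserstein distances. Below is a minimal sketch of that setup for one-dimensional empirical samples, using scipy's 1D Wasserstein distance as the metric; the toy data, the value of k, and the helper name are illustrative, and this is a generic nearest-neighbor vote rather than the paper's construction.

```python
import numpy as np
from collections import Counter
from scipy.stats import wasserstein_distance

def wasserstein_knn_predict(train_samples, train_labels, query_sample, k=5):
    """Classify a 1-D empirical sample by a majority vote over its
    k nearest training samples under the 1-Wasserstein distance."""
    dists = [wasserstein_distance(query_sample, s) for s in train_samples]
    nearest = np.argsort(dists)[:k]
    votes = Counter(train_labels[i] for i in nearest)
    return votes.most_common(1)[0][0]

# toy example: class 0 samples centred at 0, class 1 samples centred at 3
rng = np.random.default_rng(0)
train_samples = [rng.normal(0, 1, 50) for _ in range(20)] + \
                [rng.normal(3, 1, 50) for _ in range(20)]
train_labels = [0] * 20 + [1] * 20
print(wasserstein_knn_predict(train_samples, train_labels, rng.normal(3, 1, 50)))
```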


arXiv:2009.04469  [pdf, ps, other 

quant-ph cs.IT math-ph math.FA math.PR 

The quantum Wasserstein distance of order 1 

Authors: Giacomo De Palma, Milad Marvian, Dario Trevisan, Seth Lloyd 

Abstract: We propose a generalization of the Wasserstein distance of order 1 to the quantum states of n qudits. The proposal recovers the Hamming distance for the vectors of the canonical basis, and more generally the classical Wasserstein distance for quantum states diagonal in the canonical basis. The proposed distance is invariant with respect to permutations of the qudits and unitary operations acting… More

Submitted 9 September, 2020; originally announced September 2020. 


arXiv:2009.04382  [pdf  cs.LG math.PR stat.ML 

Finite-Sample Guarantees for Wasserstein Distributionally Robust Optimization: Breaking the Curse of Dimensionality 

Authors: Rui Gao 

Abstract: Wasserstein distributionally robust optimization (DRO) aims to find robust and generalizable solutions by hedging against data perturbations in Wasserstein distance. Despite its recent empirical success in operations research and machine learning, existing performance guarantees for generic loss functions are either overly conservative due to the curse of dimensionality, or plausible only in large… More 

Submitted 9 September, 2020; originally announced September 2020.
Cited by 16
Related articles All 3 versions


arXiv:2009.04266  [pdf, other]  math.OC stat.ML 

The Unbalanced Gromov Wasserstein Distance: Conic Formulation and Relaxation 

Authors: Thibault Séjourné, François-Xavier Vialard, Gabriel Peyré 

Abstract: Comparing metric measure spaces (i.e. a metric space endowed with a probability distribution) is at the heart of many machine learning problems. This includes for instance predicting properties of molecules in quantum chemistry or generating graphs with varying connectivity. The most popular distance between such metric measure spaces is the Gromov-Wasserstein (GW) distance, which is the solution… More 

Submitted 9 September, 2020; originally announced September 2020.

2020

arXiv:2009.03443  [pdf, other  stat.ME 

Ensemble Riemannian Data Assimilation over the Wasserstein Space 

Authors: Sagar K. Tamang, Ardeshir Ebtehaj, Peter J. Van Leeuwen, Dongmian Zou, Gilad Lerman 

Abstract: In this paper, we present a new ensemble data assimilation paradigm over a Riemannian manifold equipped with the Wasserstein metric. Unlike Eulerian penalization of error in the Euclidean space, the Wasserstein metric can capture translation and shape difference between square integrable probability distributions of the background state and observations, enabling to formally penalize geophysical b… More 

Submitted 7 September, 2020; originally announced September 2020. 

All 7 versions

arXiv:2009.02831  [pdf, other 

cs.CV eess.IV 

Unsupervised Wasserstein Distance Guided Domain Adaptation for 3D Multi-Domain Liver Segmentation 

Authors: Chenyu You, Junlin Yang, Julius Chapiro, James S. Duncan 

Abstract: Deep neural networks have shown exceptional learning capability and generalizability in the source domain when massive labeled data is provided. However, the well-trained models often fail in the target domain due to the domain shift. Unsupervised domain adaptation aims to improve network performance when applying robust models trained on medical images from source domains to a new target domain.… More 

Submitted 6 September, 2020; originally announced September 2020. 

Cited by 20 Related articles All 3 versions

MR4146769 Prelim Bassetti, Federico; Gualandi, Stefano; Veneroni, Marco; On the Computation of Kantorovich–Wasserstein Distances Between Two-Dimensional Histograms by Uncapacitated Minimum Cost Flows. SIAM J. Optim. 30 (2020), no. 3, 2441–2469. 90C06 (49Q22 90C08)


On the Computation of Kantorovich--Wasserstein Distances ...

In this work, we present a method to compute the Kantorovich--Wasserstein distance of order 1 between a pair of two-dimensional histograms. Recent works in ...

journal PDF
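
The Bassetti–Gualandi–Veneroni paper above computes order-1 Kantorovich–Wasserstein distances between two-dimensional histograms via uncapacitated minimum-cost flows. As a much smaller-scale illustration of the underlying transportation problem (not their flow formulation), the sketch below solves the standard optimal-transport linear program between two tiny 2D histograms with scipy; the histogram sizes and the Euclidean ground metric are arbitrary choices.

```python
import numpy as np
from scipy.optimize import linprog

def w1_between_histograms(a, b):
    """Order-1 Kantorovich-Wasserstein distance between two 2-D histograms
    defined on the same grid, via the generic transportation LP."""
    a, b = a / a.sum(), b / b.sum()
    grid = np.array([(i, j) for i in range(a.shape[0]) for j in range(a.shape[1])])
    # ground cost: Euclidean distance between bin centres
    cost = np.linalg.norm(grid[:, None, :] - grid[None, :, :], axis=2)
    n = grid.shape[0]
    # marginal constraints: rows of the transport plan sum to a, columns to b
    A_eq = np.zeros((2 * n, n * n))
    for i in range(n):
        A_eq[i, i * n:(i + 1) * n] = 1.0   # mass leaving bin i
        A_eq[n + i, i::n] = 1.0            # mass arriving at bin i
    b_eq = np.concatenate([a.ravel(), b.ravel()])
    res = linprog(cost.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
    return res.fun

h1 = np.array([[4.0, 1.0], [1.0, 2.0]])
h2 = np.array([[1.0, 2.0], [3.0, 2.0]])
print(w1_between_histograms(h1, h2))
```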

[PDF] arxiv.org

Geometric Characteristics of Wasserstein Metric on SPD (n)

Y Luo, S Zhang, Y Cao, H Sun - arXiv preprint arXiv:2012.07106, 2020 - arxiv.org

Wasserstein distance, especially among symmetric positive-definite matrices, has broad and deep influences on development of artificial intelligence (AI) and other branches of computer science. A natural idea is to describe the geometry of $SPD(n)$ as a Riemannian …

  Related articles All 2 versions 


Method for training a conditional generative adversarial network of an electronic device, involves providing the conditional generative adversarial network with a generator, a discriminator, and a Wasserstein generative adversarial network

Patent Number: CN111582348-A 

Patent Assignee: UNIV WUHAN POLYTECHNIC 

Inventor(s): LI Y; XU X; YUAN C.

<——2020——2020————— 440  ——


2020 see 2019

Wasserstein Distances for Estimating Parameters in Stochastic Reaction Networks 

By: Ocal, Kaan; Grima, Ramon; Sanguinetti, Guido 

Conference: 17th International Conference on Computational Methods in Systems Biology (CMSB) Location: ‏ Univ Trieste, Trieste, ITALY Date: ‏ SEP 18-20, 2019 

Sponsor(s): ‏Univ Trieste, Dept Math & Geosciences 

COMPUTATIONAL METHODS IN SYSTEMS BIOLOGY (CMSB 2019)  Book Series: ‏ Lecture Notes in Bioinformatics   Volume: ‏ 11773   Pages: ‏ 347-351   Published: ‏ 2019 


2020

A Distributionally Robust Optimization Approach for Multivariate Linear Regression under the Wasserstein Metric 

By: Chen, Ruidi; Paschalidis, Ioannis Ch. 

Conference: 58th IEEE Conference on Decision and Control (CDC) Location: ‏ Nice, FRANCE Date: ‏ DEC 11-13, 2019 

Sponsor(s): ‏IEEE 

2019 IEEE 58TH CONFERENCE ON DECISION AND CONTROL (CDC)  Book Series: ‏ IEEE Conference on Decision and Control   Pages: ‏ 3655-3660   Published: ‏ 2019  


2020 see 2019

Unimodal-uniform Constrained Wasserstein Training for Medical Diagnosis 

By: Liu, Xiaofeng; Han, Xu; Qiao, Yukai; et al.

Conference: IEEE/CVF International Conference on Computer Vision (ICCV) Location: ‏ Seoul, SOUTH KOREA Date: ‏ OCT 27-NOV 02, 2019 

Sponsor(s): ‏IEEE; IEEE Comp Soc; CVF 

2019 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION WORKSHOPS (ICCVW)  Book Series: ‏ IEEE International Conference on Computer Vision Workshops   Pages: ‏ 332-341   Published: ‏ 2019 


Joint Wasserstein Autoencoders for Aligning Multimodal Embeddings 

By: Mahajan, Shweta; Botschen, Teresa; Gurevych, Iryna; et al.

Conference: IEEE/CVF International Conference on Computer Vision (ICCV) Location: ‏ Seoul, SOUTH KOREA Date: ‏ OCT 27-NOV 02, 2019 

Sponsor(s): ‏IEEE; IEEE Comp Soc; CVF 

2019 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION WORKSHOPS (ICCVW)  Book Series: ‏ IEEE International Conference on Computer Vision Workshops   Pages: ‏ 4561-4570   Published: ‏ 2019 


Generating Adversarial Samples With Constrained Wasserstein Distance 

By: Wang, Kedi; Yi, Ping; Zou, Futai; et al.

IEEE ACCESS  Volume: ‏ 7   Pages: ‏ 136812-136821   Published: ‏ 2019 

 

[PDF] arxiv.org

Modeling EEG data distribution with a Wasserstein Generative Adversarial Network to predict RSVP Events

S Panwar, P Rad, TP Jung… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

Electroencephalography (EEG) data are difficult to obtain due to complex experimental setups and reduced comfort with prolonged wearing. This poses challenges to train powerful deep learning model with the limited EEG data. Being able to generate EEG data …

  Cited by 1 Related articles All 5 versions

 


[PDF] ucsal.br

Adaptação do WGAN ao processo estocástico

RR Aquino - 2020 - ri.ucsal.br

In several areas of knowledge, data (various kinds of information) are valuable, and their analysis is even more valuable. In connection with the field of artificial intelligence, a new trend can be observed: the generation of synthetic data to make up for the lack of …

[Portuguese: Adaptation of the WGAN to the stochastic process]


Open Access 

First-Order Methods for Wasserstein Distributionally Robust MDP 

by Grand-Clement, Julien; Kroer, Christian 

09/2020

Markov Decision Processes (MDPs) are known to be sensitive to parameter specification. Distributionally robust MDPs alleviate this issue by allowing for...

Journal ArticleFull Text Online 

arXiv:2009.06790  [pdf, other  math.OC cs.GT 

First-Order Methods for Wasserstein Distributionally Robust MDP 

Authors: Julien Grand-Clement, Christian Kroer 

Abstract: Markov Decision Processes (MDPs) are known to be sensitive to parameter specification. Distributionally robust MDPs alleviate this issue by allowing for ambiguity sets which give a set of possible distributions over parameter sets. The goal is to find an optimal policy with respect to the worst-case parameter distribution. We propose a first-order methods framework for solving Distributionally rob… More 

Submitted 14 September, 2020; originally announced September 2020. 

Cited by 6 Related articles All 5 versions

Researchers from National Center for Scientific Research Discuss Research in Machine Learning (Fused Gromov-Wasserstein... 

Journal of Engineering, 09/2020

NewsletterCitation Online 


A Hybrid Deep-learning Framework Based on the Wasserstein-GAN with Non-Subsampled Contourlet Transform for Noise Reduction in Low-dose CT

Woosung Kim, PhD, Yonsei University, Wonju
<——2020——2020————— 450  ——

arXiv:2009.10590  [pdf, ps, other 

math.PR math-ph 

Cutoff thermalization for Ornstein-Uhlenbeck systems with small Lévy noise in the Wasserstein distance 

Authors: Gerardo Barrera, Michael A. Högele, Juan Carlos Pardo 

Abstract: This article establishes cutoff thermalization (also known as the cutoff phenomenon) for a general class of general Ornstein-Uhlenbeck systems

Submitted 22 September, 2020; originally announced September 2020. 

Comments: 44 pages 

MSC Class: 37H10; 60J60; 60J70; 60G51 


Wasserstein distance user manual — gudhi documentation

Aug 11, 2020 - ... Dmitriy Morozov, and Arnur Nigmetov. gudhi.hera.wasserstein_distance(X: numpy.ndarray[float64], Y: numpy.ndarray[float64], order: float ...
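
The gudhi documentation snippet above exposes a Wasserstein distance between persistence diagrams. A minimal usage sketch, assuming the gudhi Python package is installed with its POT-based gudhi.wasserstein module; the diagrams here are made-up (birth, death) pairs, and the hera variant quoted above takes the same kind of arrays.

```python
import numpy as np
from gudhi.wasserstein import wasserstein_distance

# two toy persistence diagrams given as arrays of (birth, death) pairs
dgm1 = np.array([[0.0, 1.0], [0.3, 0.8]])
dgm2 = np.array([[0.1, 1.1], [0.4, 0.6], [0.2, 0.25]])

# order is the q of the q-Wasserstein distance; internal_p is the ground L_p norm
print(wasserstein_distance(dgm1, dgm2, order=1.0, internal_p=2.0))
```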


wasserstein.distance: L_p q-Wasserstein Distance in ... - Rdrr.io

In kernelTDA: Statistical Learning with Kernel for Persistence Diagrams

Author(s). Tullia Padellini, Francesco Palini. The included C++ library is authored by Michael Kerber, Dmitriy Morozov, and Arnur Nigmetov ...

Sep 26, 2020

Compute the q-Wasserstein distance between persistence diagrams using an arbitrary L_p norm as ground metric. 


 STOCHASTIC EQUATION AND EXPONENTIAL ERGODICITY IN WASSERSTEIN DISTANCES FOR AFFINE PROCESSES 

By: Friesen, Martin; Jin, Peng; Rudiger, Barbara 

ANNALS OF APPLIED PROBABILITY   Volume: ‏ 30   Issue: ‏ 5   Pages: ‏ 2165-2195   Published: ‏ OCT 2020

Cited by 13 Related articles All 6 versions

2020  Open Access 

Cutoff thermalization for Ornstein-Uhlenbeck systems with small Lévy noise in the Wasserstein distance

by Barrera, Gerardo; Högele, Michael A; Pardo, Juan Carlos 

09/2020

This article establishes cutoff thermalization (also known as the cutoff phenomenon) for a general class of general Ornstein-Uhlenbeck systems...

Journal ArticleFull Text Online 


2020

Parameter-Transferred Wasserstein Generative Adversarial Network (PT-WGAN) for Low-Dose PET... 

by Gong, Yu; Shan, Hongming; Teng, Yueyang; More... 

IEEE transactions on radiation and plasma medical sciences, 09/2020

Due to the widespread use of positron emission tomography (PET) in clinical practice, the potential risk of PET-associated radiation dose to patients needs to...

Journal ArticleFull Text Online
Cited by 10
Related articles All 4 versions


2020  Open Access 

An Ensemble Wasserstein Generative Adversarial Network Method for Road Extraction from High... 

by Yang, Chuan; Wang, Zhenghong 

IEEE access, 09/2020

Road extraction from high resolution remote sensing (HR-RS) images is an important yet challenging computer vision task. In this study, we propose an ensemble...

Article PDF Download PDF 

Journal ArticleFull Text Online 

[PDF] ieee.org

An Ensemble Wasserstein Generative Adversarial Network Method for Road Extraction From High Resolution Remote Sensing Images in Rural Areas

C Yang, Z Wang - IEEE Access, 2020 - ieeexplore.ieee.org

Road extraction from high resolution remote sensing (HR-RS) images is an important yet challenging computer vision task. In this study, we propose an ensemble Wasserstein Generative Adversarial Network with Gradient Penalty (WGAN-GP) method called E-WGAN …

Cited by 12 Related articles All 3 versions
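
Several entries in this section (this road-extraction paper, the switchgear data augmentation paper, and the WGAN-GP steganography and eye in-painting papers) rely on the gradient-penalty form of the Wasserstein GAN objective. Below is a generic PyTorch-style sketch of that penalty term, not the specific architecture of any paper above; `critic` and the tensor shapes are placeholders.

```python
import torch

def wgan_gradient_penalty(critic, real, fake, lambda_gp=10.0):
    """WGAN-GP penalty: push the critic's gradient norm towards 1
    on random interpolates between real and generated samples."""
    batch = real.size(0)
    eps = torch.rand(batch, *([1] * (real.dim() - 1)), device=real.device)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads, = torch.autograd.grad(outputs=scores, inputs=interp,
                                 grad_outputs=torch.ones_like(scores),
                                 create_graph=True)
    grad_norm = grads.reshape(batch, -1).norm(2, dim=1)
    return lambda_gp * ((grad_norm - 1.0) ** 2).mean()

# critic loss for one step: maximize the score gap while keeping gradients near unit norm
# loss_D = critic(fake).mean() - critic(real).mean() + wgan_gradient_penalty(critic, real, fake)
```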


A Linear Programming Approximation of Distributionally ...

online

A Linear Programming Approximation of Distributionally Robust Chance-Constrained Dispatch With Wasserstein...

by Zhou, Anping; Yang, Ming; Wang, Mingqiang ; More...

IEEE transactions on power systems, 09/2020, Volume 35, Issue 5

This paper proposes a data-driven distributionally robust chance constrained real-time dispatch (DRCC-RTD) considering renewable generation forecasting errors....


Journal ArticleFull Text Online


Reports from Shandong University Add New Data to Findings in Power Systems (A Linear Programming Approximation of Distributionally Robust Chance-constrained Dispatch With Wasserstein... 

Energy Weekly News, 09/2020

NewsletterFull Text Online 

Engineering - Power Systems; Reports from Shandong University Add New Data to Findings in Power Systems (A Linear Programming Approximation of Distributionally Robust Chance-constrained Dispatch With Wasserstein... 

Energy weekly news, Sep 25, 2020, 650

Newspaper ArticleFull Text Online

A Linear Programming Approximation of Distributionally Robust Chance-Constrained Dispatch With Wasserstein Distance 

By: Zhou, Anping; Yang, Ming; Wang, Mingqiang; et al.

IEEE TRANSACTIONS ON POWER SYSTEMS  Volume: 35   Issue: 5   Pages: 3366-3377   Published: SEP 2020

  Cited by 3 Related articles All 2 versions

2020

Machine Learning; Researchers from National Center for Scientific Research (CNRS) Discuss Research in Machine Learning (Fused Gromov-Wasserstein... 

Robotics & machine learning, Sep 14, 2020, 1858

Newspaper ArticleCitation Online 


2020

Information Technology - Data Distribution; Findings from University of Texas San Antonio Update Knowledge of Data Distribution (Modeling Eeg Data Distribution With a Wasserstein... 

Computer technology journal, Sep 3, 2020, 164

Newspaper ArticleCitation Online

<——2020——————2020———  460

Simulating drug effects on blood glucose laboratory test time series with a conditional WGAN

0 citations* 

2020 medRxiv - Alexandre Yahi, Nicholas P Tatonetti

Health and Medicine; Simulating drug effects on blood glucose laboratory test time series with a conditional WGAN 

Health & medicine week, Aug 7, 2020, 6553

Newspaper ArticleFull Text Online 


2020

WGAN algorithm based building energy consumption prediction method, involves training GAN prediction model, updating optimal combination of hyper parameters, and predicting building energy consumption value by optimal GAN prediction model

Patent Number: CN111178626-A

Patent Assignee: UNIV SUZHOU SCI & TECHNOLOGY

Inventor(s): FU Q; SHEN Y; CHEN J; et al.


Spam transaction attack detection model based on GRU and WGAN-div

0 citations* 

2020 Computer Communications

Jin Yang, Tao Li, Gang Liang, YunPeng Wang, TianYu Gao

see all 6 authors 

Sichuan University

Publisher:2020


 Eye in-painting using WGAN-GP for face images with mosaic

CH Wu, HT Chang, A Amjad - 2020 International Conference …, 2020 - spiedigitallibrary.org

In order to protect personal privacy, news reports often use the mosaics upon the face of the protagonist in the photo. However, readers will feel uncomfortable and awkward to this kind of photos. In this research, we detect the eye mosaic and try to use eye complementing …


2020

A Generative Steganography Method Based on WGAN-GP

0 citations* 

Jun Li , Ke Niu , Liwei Liao , Lijie Wang , Jia Liu 

see all 7 authors 

Cited by 11 Related articles

arXiv:2010.01037  [pdf, other 

cs.LG cs.CV stat.ML 

Encoded Prior Sliced Wasserstein AutoEncoder for learning latent manifold representations 

Authors: Sanjukta Krishnagopal, Jacob Bedrossian 

Abstract: While variational autoencoders have been successful generative models for a variety of tasks, the use of conventional Gaussian or Gaussian mixture priors are limited in their ability to capture topological or geometric properties of data in the latent representation. In this work, we introduce an Encoded Prior Sliced Wasserstein AutoEncoder (EPSWAE) wherein an additional prior-encoder network lear… More 

Submitted 2 October, 2020; originally announced October 2020. 

Comments: 8 pages, 4 figures in the main text, Submitted to The International Conference on Learning Representations (ICLR)2021 

Related articles All 4 versions

arXiv:2009.14552  [pdf, other 

math.OC cs.LG stat.ML 

Wasserstein Distributionally Robust Inverse Multiobjective Optimization 

Authors: Chaosheng Dong, Bo Zeng 

Abstract: Inverse multiobjective optimization provides a general framework for the unsupervised learning task of inferring parameters of a multiobjective decision making problem (DMP), based on a set of observed decisions from the human expert. However, the performance of this framework relies critically on the availability of an accurate DMP, sufficient decisions of high quality, and a parameter space that… More 

Submitted 30 September, 2020; originally announced September 2020. 

Comments: 19 pages   

Cited by 2 Related articles All 8 versions


2020 see 2019
E-WACGAN: Enhanced Generative Model of Signaling Data Based on WGAN-GP and ACGAN
 

by Jin, Qimin; Lin, Rongheng; Yang, Fangchun 

IEEE systems journal, 09/2020, Volume 14, Issue 3

In recent years, the generative adversarial network (GAN) has achieved outstanding performance in the image field and the derivatives of GAN, namely auxiliary...

Article PDF Download PDF 

Journal Article Full Text Online 

Investigators at Beijing University of Posts and Telecommunications Report Findings in Information Technology (E-wacgan: Enhanced Generative Model of Signaling Data Based On Wgan... 

Telecommunications Weekly, 09/2020

NewsletterFull Text Online 

 

2020

Satisfaction evaluation method involves using the WGAN network to generate an expressionless standard image based on the existing expression pictures to form an expressionless standard atlas to be trained

Patent Number: CN111639518-A

Patent Assignee: SHANGHAI ZHUOFAN INFORMATION TECHNOLOGY

Inventor(s): ZHANG Q; GAO L; ZHONG J; et al.

  <——2020——2020——————  470—   


KBRI-Neuroinformatics/WGAN-for-RNASeq-analysis: First release of WGAN-for-RNASeq-analysis 

by KBRI-Neuroinformatics 

2020

This is the first release.

Computer FileCitation Online 

A practical application of generative adversarial networks for RNA-seq analysis to predict the molecular progress of Alzheimer's disease - KBRI-Neuroinformatics/WGAN-for-RNASeq-analysis.


 

eriknes/delta-wgans: First version 

by N, Erik 

2020

No description provided.

Computer FileCitation Online 

 

Open Access 

KBRI-Neuroinformatics/WGAN-for-RNASeq-analysis v1.0.1 

by KBRI-Neuroinformatics 

2020

A practical application of generative adversarial networks for RNA-seq analysis to predict the molecular progress of Alzheimer's disease

Computer FileCitation Online 

Preview 


Open Access 

정칙화 항에 기반한 WGAN의 립쉬츠 연속 안정화 기법 제안 

by 한희일 

한국인터넷방송통신학회 논문지, 02/2020, Volume 20, Issue 1

With the appearance of the recently proposed WGAN (Wasserstein generative adversarial network), the tricky and unstable training process that is a chronic problem of GANs (generative adversarial networks) has been somewhat improved, but convergence and natural-looking results are still ...

Journal ArticleCitation Online 

[Korean: Proposal of a Lipschitz-continuity stabilization technique for WGAN based on a regularization term]


 2020  online

New Findings from Petuum Inc. in Computer Assisted Radiology and Surgery Provides New Insights (Wgan Domain Adaptation for the Joint Optic Disc-and-cup... 

Medical Devices & Surgical Technology Week, 06/2020

NewsletterFull Text Online 

Surgery - Computer Assisted Radiology and Surgery; New Findings from Petuum Inc. in Computer Assisted Radiology and Surgery Provides New Insights (Wgan... 

Medical devices & surgical technology week, Jun 21, 2020, 728

Newspaper ArticleFull Text Online 


 

Open Access 

Building energy consumption prediction method based on WGAN algorithm and monitoring and prediction system 

by LU YOU; WANG ZHECHAO; WU HONGJIE; More... 

05/2020

The invention relates to a building energy consumption prediction method based on a WGAN algorithm and a building energy consumption monitoring and prediction...

PatentCitation Online 

Preview 


 

University of Shanghai for Science and Technology Researchers Publish New Study Findings on Information and Data Encoding and Encryption (WGAN-E: A... 

Information Technology Newsweekly, 04/2020

NewsletterCitation Online 

Information Technology - Information and Data Encoding and Encryption; University of Shanghai for Science and Technology Researchers Publish New Study Findings on Information and Data Encoding and Encryption (WGAN... 

Information technology newsweekly, Apr 14, 2020, 953

Newspaper ArticleCitation Online 

Preview 


 

Open Access 

一种基于改进AC-WGANs的无人机信号识别检测方法 

04/2020

PatentCitation Online 

[Chinese  An UAV signal recognition and detection method based on improved AC-WGANs]



MR4159156 Prelim Graham, Cole; Irregularity of Distribution in Wasserstein Distance. J. Fourier Anal. Appl. 26 (2020), no. 5, Paper No. 75.

Irregularity of Distribution in Wasserstein Distance 

by Graham, Cole 

The Journal of fourier analysis and applications, 10/2020, Volume 26, Issue 5

We study the non-uniformity of probability measures on the interval and circle. On the interval, we identify the Wasserstein-p distance with the classical...

Article PDF Download PDF 

Journal ArticleFull Text Online 

  Cited by 2 Related articles All 3 versions


MR4157964 Thesis Mirth, Joshua Robert; Vietoris—Rips Metric Thickenings and Wasserstein Spaces. Thesis (Ph.D.)–Colorado State University. 2020. 107 pp. ISBN: 979-8664-76221-1, ProQuest LLC
<——2020——2020—————————  480—   


Bassetti, Federico; Gualandi, Stefano; Veneroni, Marco

On the computation of Kantorovich-Wasserstein distances between two-dimensional histograms by uncapacitated minimum cost flows. (English) Zbl 07248646 

SIAM J. Optim. 30, No. 3, 2441-2469 (2020). 

MSC:  90C06 90C08 


The Wasserstein Loss Function - Jeevana Inala, Prafulla Dhariwal

PDF Feb 2020 MIT

i.e. KL divergence vs Wasserstein loss layer in Caffe. The data set is toy dataset 1, and the Wasserstein loss layer has Sinkhorn iterations = 50. The Wasserstein ...
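
The note above compares a KL-divergence loss with a Wasserstein loss layer computed by Sinkhorn iterations. Below is a minimal NumPy sketch of such an entropic-regularized Wasserstein loss between two discrete distributions, with the iteration count of 50 taken from the snippet; the cost matrix and regularization strength are illustrative choices, not the MIT report's settings.

```python
import numpy as np

def sinkhorn_wasserstein(p, q, cost, eps=0.1, n_iters=50):
    """Entropic-regularized Wasserstein loss between histograms p and q,
    computed by Sinkhorn iterations on the kernel K = exp(-cost / eps)."""
    K = np.exp(-cost / eps)
    u = np.ones_like(p)
    for _ in range(n_iters):
        v = q / (K.T @ u)
        u = p / (K @ v)
    plan = u[:, None] * K * v[None, :]   # approximate optimal transport plan
    return float(np.sum(plan * cost))

bins = np.arange(5, dtype=float)
cost = np.abs(bins[:, None] - bins[None, :])
p = np.array([0.1, 0.2, 0.4, 0.2, 0.1])
q = np.array([0.3, 0.3, 0.2, 0.1, 0.1])
print(sinkhorn_wasserstein(p, q, cost))
```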


GraphWGAN: Graph Representation Learning with Wasserstein Generative Adversarial Networks 

By: Yan, Rong; Shen, Huawei; Cao, Qi; et al.

Conference: IEEE International Conference on Big Data and Smart Computing (BigComp) Location: ‏ Busan, SOUTH KOREA Date: ‏ FEB 19-22, 2020 

Sponsor(s): ‏IEEE; IEEE Comp Soc; Korean Inst Informat Scientists & Engineers 

2020 IEEE INTERNATIONAL CONFERENCE ON BIG DATA AND SMART COMPUTING (BIGCOMP 2020)  Book Series: ‏ International Conference on Big Data and Smart Computing   Pages: ‏ 315-322   Published: ‏ 2020 


waspr: Wasserstein Barycenters of Subset Posteriors 

By: Cremers, Jolien 

Zenodo

DOI: http://dx.doi.org/10.5281/ZENODO.3971909

Document Type: Software 

waspr: Wasserstein Barycenters of Subset Posteriors | Zenodo

Aug 4, 2020 — Functions to compute Wasserstein barycenters of subset posteriors using the swapping algorithm. The Wasserstein barycenter is a geometric ...


2020 online PEER-REVIEW OPEN ACCESS

Wasserstein Distributionally Robust Stochastic Control: A Data-Driven Approach

by Yang, Insoon

IEEE transactions on automatic control, 10/2020

Standard stochastic control methods assume that the probability distribution of uncertain variables is available. Unfortunately, in practice, obtaining...


Journal ArticleFull Text Online


Cited by 45 Related articles All 3 versions

Convergence rate to equilibrium in Wasserstein distance for reflected jump–diffusions 

by Sarantsev, Andrey 

Statistics & probability letters, 10/2020, Volume 165

Convergence rate to the stationary distribution for continuous-time Markov processes can be studied using Lyapunov functions. Recent work by the author...

Article PDF Download PDF 

Journal ArticleFull Text Online 

… results for Wasserstein distance, … for Wasserstein distance than for total variation distance. …

 Cited by 1 Related articles All 5 versions

Open Access 

Differentiable maps between Wasserstein spaces 

by Lessel, Bernadette; Schick, Thomas 

10/2020

A notion of differentiability is being proposed for maps between Wasserstein spaces of order 2 of smooth, connected and complete Riemannian manifolds. Due to...

Journal ArticleFull Text Online 

arXiv:2010.02131  [pdf, ps, other]  math.MG 

Differentiable maps between Wasserstein spaces 

Authors: Bernadette Lessel, Thomas Schick 

Abstract: A notion of differentiability is being proposed for maps between Wasserstein spaces of order 2 of smooth, connected and complete Riemannian manifolds. Due to the nature of the tangent space construction on Wasserstein spaces, we only give a global definition of differentiability, i.e. without a prior notion of pointwise differentiability. With our definition, however, we recover the expected prope… More 

Submitted 5 October, 2020; originally announced October 2020. 

Comments: 16 pages

Open Access 

Permutation invariant networks to learn Wasserstein metrics 

by Sehanobish, Arijit; Ravindra, Neal; van Dijk, David 

10/2020

Understanding the space of probability measures on a metric space equipped with a Wasserstein distance is one of the fundamental questions in mathematical...

Journal ArticleFull Text Online 

arXiv:2010.05820  [pdf, other 

cs.LG math.PR stat.ML 

Permutation invariant networks to learn Wasserstein metrics 

Authors: Arijit Sehanobish, Neal Ravindra, David van Dijk 

Abstract: Understanding the space of probability measures on a metric space equipped with a Wasserstein distance is one of the fundamental questions in mathematical analysis. The Wasserstein metric has received a lot of attention in the machine learning community especially for its principled way of comparing distributions. In this work, we use a permutation invariant network to map samples from probability… More 

Submitted 12 October, 2020; originally announced October 2020. 

Comments: Work in progress
 Cited by 19 Related articles All 2 versions

Permutation Invariant Networks to Learn Wasserstein Metrics

Understanding the space of probability measures on a metric space equipped with a Wasserstein distance is one of the fundamental questions ...

CrossMind.ai · 

Dec 6, 2020

Open Access 

Chance-Constrained Set Covering with Wasserstein Ambiguity 

by Shen, Haoming; Jiang, Ruiwei 

10/2020

We study a generalized distributionally robust chance-constrained set covering problem (DRC) with a Wasserstein ambiguity set, where both decisions and...

Journal ArticleFull Text Online 

arXiv:2010.05671  [pdf, other 

math.OC 

Chance-Constrained Set Covering with Wasserstein Ambiguity 

Authors: Haoming Shen, Ruiwei Jiang 

Abstract: We study a generalized distributionally robust chance-constrained set covering problem (DRC) with a Wasserstein ambiguity set, where both decisions and uncertainty are binary-valued. We establish the NP-hardness of DRC and recast it as a two-stage stochastic program, which facilitates decomposition algorithms. Furthermore, we derive two families of valid inequalities. The first family targets the… More 

Submitted 12 October, 2020; originally announced October 2020. 

Comments: 39 pages, 3 figures
Cited by 2
Related articles All 2 versions


Open Access 

Efficient Wasserstein Natural Gradients for Reinforcement Learning 

by Moskovitz, Ted; Arbel, Michael; Huszar, Ferenc; More... 

10/2020

A novel optimization approach is proposed for application to policy gradient methods and evolution strategies for reinforcement learning (RL). The procedure...

Journal ArticleFull Text Online 

arXiv:2010.05380  [pdf, other]  cs.LG 

Efficient Wasserstein Natural Gradients for Reinforcement Learning 

Authors: Ted Moskovitz, Michael Arbel, Ferenc Huszar, Arthur Gretton 

Abstract: A novel optimization approach is proposed for application to policy gradient methods and evolution strategies for reinforcement learning (RL). The procedure uses a computationally efficient Wasserstein natural gradient (WNG) descent that takes advantage of the geometry induced by a Wasserstein penalty to speed optimization. This method follows the recent theme in RL of including a divergence penal… More 

Submitted 11 October, 2020; originally announced October 2020.
Cited by 3
Related articles All 6 versions 

<—–2020———2020—–490 — 

 


Open Access 

Improved Complexity Bounds in Wasserstein Barycenter Problem 

by Dvinskikh, Darina; Tiapkin, Daniil 

10/2020

In this paper, we focus on computational aspects of Wasserstein barycenter problem. We provide two algorithms to compute Wasserstein barycenter of $m$ discrete...

Journal ArticleFull Text Online 

arXiv:2010.04677  [pdf, other  math.OC 

Improved Complexity Bounds in Wasserstein Barycenter Problem 

Authors: Darina Dvinskikh, Daniil Tiapkin 

Abstract: In this paper, we focus on computational aspects of Wasserstein barycenter problem. We provide two algorithms to compute Wasserstein barycenter of m discrete measures of size n with accuracy ε. The first algorithm, based on mirror prox with some specific norm, meets the complexity of celebrated accelerated iterative Bregman projections (IBP), that is… More

Submitted 9 October, 2020; originally announced October 2020. 

Comments: 7 pages
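
The complexity results above are stated relative to the iterative Bregman projections (IBP) scheme for entropy-regularized Wasserstein barycenters. For reference, here is a small NumPy sketch of the classical fixed-support IBP iteration (in the Benamou et al. style), not the accelerated or mirror-prox variants the paper analyzes; the support, weights, and regularization are illustrative choices.

```python
import numpy as np

def ibp_barycenter(measures, cost, weights=None, eps=0.05, n_iters=200):
    """Entropic Wasserstein barycenter of discrete measures on a shared
    support, via iterative Bregman projections."""
    m, n = len(measures), cost.shape[0]
    weights = np.full(m, 1.0 / m) if weights is None else weights
    K = np.exp(-cost / eps)
    v = np.ones((m, n))
    for _ in range(n_iters):
        u = np.array([measures[k] / (K @ v[k]) for k in range(m)])
        # barycenter is the weighted geometric mean of the K^T u_k
        q = np.prod(np.array([(K.T @ u[k]) ** weights[k] for k in range(m)]), axis=0)
        v = np.array([q / (K.T @ u[k]) for k in range(m)])
    return q

support = np.linspace(0.0, 1.0, 50)
cost = (support[:, None] - support[None, :]) ** 2
p1 = np.exp(-((support - 0.25) ** 2) / 0.005); p1 /= p1.sum()
p2 = np.exp(-((support - 0.75) ** 2) / 0.005); p2 /= p2.sum()
print(ibp_barycenter([p1, p2], cost).round(3))
```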

Open Access 

Learning disentangled representations with the Wasserstein Autoencoder 

by Gaujac, Benoit; Feige, Ilya; Barber, David 

10/2020

Disentangled representation learning has undoubtedly benefited from objective function surgery. However, a delicate balancing act of tuning is still required...

Journal ArticleFull Text Online 

 arXiv:2010.03459  [pdf, other]  stat.ML cs.CV cs.LG 

Learning disentangled representations with the Wasserstein Autoencoder 

Authors: Benoit Gaujac, Ilya Feige, David Barber 

Abstract: Disentangled representation learning has undoubtedly benefited from objective function surgery. However, a delicate balancing act of tuning is still required in order to trade off reconstruction fidelity versus disentanglement. Building on previous successes of penalizing the total correlation in the latent variables, we propose TCWAE (Total Correlation Wasserstein Autoencoder). Working in the WAE… More 

Submitted 7 October, 2020; originally announced October 2020.

Open Access 

SWIFT: Scalable Wasserstein Factorization for Sparse Nonnegative Tensors 

by Afshar, Ardavan; Yin, Kejing; Yan, Sherry; More... 

10/2020

Existing tensor factorization methods assume that the input tensor follows some specific distribution (i.e. Poisson, Bernoulli and Gaussian), and solve the...

Journal ArticleFull Text Online 

arXiv:2010.04081  [pdf, other]  cs.LG 

SWIFT: Scalable Wasserstein Factorization for Sparse Nonnegative Tensors 

Authors: Ardavan Afshar, Kejing Yin, Sherry Yan, Cheng Qian, Joyce C. Ho, Haesun Park, Jimeng Sun 

Abstract: Existing tensor factorization methods assume that the input tensor follows some specific distribution (i.e. Poisson, Bernoulli and Gaussian), and solve the factorization by minimizing some empirical loss functions defined based on the corresponding distribution. However, it suffers from several drawbacks: 1) In reality, the underlying distributions are complicated and unknown, making it infeasible… More 

Submitted 8 October, 2020; originally announced October 2020.

Open Access 

Learning Deep-Latent Hierarchies by Stacking Wasserstein Autoencoders 

by Gaujac, Benoit; Feige, Ilya; Barber, David 

10/2020

Probabilistic models with hierarchical-latent-variable structures provide state-of-the-art results amongst non-autoregressive, unsupervised density-based...

Journal ArticleFull Text Online 

arXiv:2010.03467  [pdf, other]  stat.ML cs.CV cs.LG 

Learning Deep-Latent Hierarchies by Stacking Wasserstein Autoencoders 

Authors: Benoit Gaujac, Ilya Feige, David Barber 

Abstract: Probabilistic models with hierarchical-latent-variable structures provide state-of-the-art results amongst non-autoregressive, unsupervised density-based models. However, the most common approach to training such models based on Variational Autoencoders (VAEs) often fails to leverage deep-latent hierarchies; successful approaches require complex inference and optimisation schemes. Optimal Transpor… More 

Submitted 7 October, 2020; originally announced October 2020.
Related articles
All 4 versions

 

 

Open Access 

Averaging Atmospheric Gas Concentration Data using Wasserstein Barycenters 

by Barré, Mathieu; Giron, Clément; Mazzolini, Matthieu; More... 

10/2020

Hyperspectral satellite images report greenhouse gas concentrations worldwide on a daily basis. While taking simple averages of these images over time produces...

Journal ArticleFull Text Online 

arXiv:2010.02762  [pdf, other]  cs.LG math.OC 

Averaging Atmospheric Gas Concentration Data using Wasserstein Barycenters 

Authors: Mathieu Barré, Clément Giron, Matthieu Mazzolini, Alexandre d'Aspremont 

Abstract: Hyperspectral satellite images report greenhouse gas concentrations worldwide on a daily basis. While taking simple averages of these images over time produces a rough estimate of relative emission rates, atmospheric transport means that simple averages fail to pinpoint the source of these emissions. We propose using Wasserstein barycenters coupled with weather data to average gas concentration da… More 

Submitted 6 October, 2020; originally announced October 2020.

Open Access 

Improving Relational Regularized Autoencoders with Spherical Sliced Fused Gromov Wasserstein 

by Nguyen, Khai; Nguyen, Son; Ho, Nhat; More... 

10/2020

Relational regularized autoencoder (RAE) is a framework to learn the distribution of data by minimizing a reconstruction loss together with a relational...

Journal ArticleFull Text Online 

arXiv:2010.01787  [pdf, other]  stat.ML cs.LG 

Improving Relational Regularized Autoencoders with Spherical Sliced Fused Gromov Wasserstein 

Authors: Khai Nguyen, Son Nguyen, Nhat Ho, Tung Pham, Hung Bui 

Abstract: Relational regularized autoencoder (RAE) is a framework to learn the distribution of data by minimizing a reconstruction loss together with a relational regularization on the latent space. A recent attempt to reduce the inner discrepancy between the prior and aggregated posterior distributions is to incorporate sliced fused Gromov-Wasserstein (SFG) between these distributions. That approach has a… More 

Submitted 5 October, 2020; originally announced October 2020. 

Comments: 39 pages, 19 figures

  

Open Access 

Derivative over Wasserstein spaces along curves of densities 

by Buckdahn, Rainer; Li, Juan; Liang, Hao 

10/2020

In this paper, given any random variable $\xi$ defined over a probability space $(\Omega,\mathcal{F},Q)$, we focus on the study of the derivative of functions...

Journal ArticleFull Text Online 

arXiv:2010.01507  [pdf, ps, other]  math.PR 

Derivative over Wasserstein spaces along curves of densities 

Authors: Rainer Buckdahn, Juan Li, Hao Liang 

Abstract: In this paper, given any random variable $\xi$ defined over a probability space $(\Omega,\mathcal{F},Q)$, we focus on the study of the derivative of functions of the form … defined over the convex cone of densities … is a function over the space… More

Submitted 4 October, 2020; originally announced October 2020. 

Comments: 55 pages  

   

2020 Open Access 

Ripple-GAN: Lane line detection with Ripple Lane Line Detection Network and Wasserstein GAN 

by Zhang, Y; Lu, Z; Ma, D; More... 

11/2020

With artificial intelligence technology being advanced by leaps and bounds, intelligent driving has attracted a huge amount of attention recently in research...

Journal ArticleCitation Online 

Cited by 12 Related articles All 4 versions

 Semantic Inpainting with Multi-dimensional Adversarial Network and Wasserstein Distance 

by Wang, Haodi; Jiao, Libin; Bie, Rongfang; More... 

Pattern Recognition and Computer Vision, 10/2020

Inpainting represents a procedure which can restore the lost parts of an image based upon the residual information. We present an inpainting network that...

Book ChapterFull Text Online 

Semantic Inpainting with Multi-dimensional Adversarial Network and Wasserstein Distance

H Wang, L Jiao, R Bie, H Wu - Chinese Conference on Pattern …, 2020 - Springer

… images both in detail and in general. Compared with the traditional training procedure, our model combines with Wasserstein Distance that enhances the stability of network training. The network is training specifically on street …


<——2020————2020———  500 


Studies in the Area of Chemical Research Reported from Texas A&M University (De Novo Protein Design for Novel Folds Using Guided Conditional Wasserstein... 

Information Technology Newsweekly, 10/2020

NewsletterCitation Online 

 Chemical Research; Studies in the Area of Chemical Research Reported from Texas A&M University (De Novo Protein Design for Novel Folds Using Guided Conditional Wasserstein... 

Information technology newsweekly, Oct 13, 2020, 822

Newspaper ArticleCitation Online 


online

Researchers' Work from Stanford University Focuses on Fourier Analysis 

(Irregularity of Distribution In Wa...

Mathematics Week, 10/2020

NewsletterFull Text Online

Study Results from China University of Geosciences in the Area of Information Technology Reported (Sample Generation Based On a Supervised Wasserstein... 

Information Technology Newsweekly, 10/2020

NewsletterCitation Online 


U.S. Patent and Trademark Office Receives CGG Services SAS's Patent Application for Methods and Devices Performing Adaptive Quadratic Wasserstein... 

Global IP News: Engineering Patent News, Oct 2, 2020

Newspaper ArticleCitation Online 


Univ Sichuan Applies for Patent on Construction Method and Application of Three-Dimensional Mri Image Denoising Model Based on Wasserstein...

Global IP News: Medical Patent News, Oct 19, 2020

Newspaper ArticleCitation Online


2020


   arXiv:2010.07717  [pdf, other]  cs.CL cs.IR 

Wasserstein Distance Regularized Sequence Representation for Text Matching in Asymmetrical Domains 

Authors: Weijie Yu, Chen Xu, Jun Xu, Liang Pang, Xiaopeng Gao, Xiaozhao Wang, Ji-Rong Wen 

Abstract: One approach to matching texts from asymmetrical domains is projecting the input sequences into a common semantic space as feature vectors upon which the matching function can be readily defined and learned. In real-world matching practices, it is often observed that with the training goes on, the feature vectors projected from different domains tend to be indistinguishable. The phenomenon, howeve… More 

Submitted 15 October, 2020; originally announced October 2020.

  arXiv:2010.09989  [pdf, other]  cs.CV 

Wasserstein K-Means for Clustering Tomographic Projections 

Authors: Rohan Rao, Amit Moscovich, Amit Singer 

Abstract: Motivated by the 2D class averaging problem in single-particle cryo-electron microscopy (cryo-EM), we present a k-means algorithm based on a rotationally-invariant Wasserstein metric for images. Unlike existing methods that are based on Euclidean (L2) distances, we prove that the Wasserstein metric better accommodates for the out-of-plane angular differences between different particle views. We… More

Submitted 19 October, 2020; originally announced October 2020. 

Comments: 11 pages, 5 figures, 1 table 

MSC Class: 62H30 (Primary) 92C55; 68U10 (Secondary) ACM Class: I.5.3; I.4.0 

Cited by 4 Related articles All 7 versions

Wasserstein K-Means for Clustering Tomographic Projections. Table 1: Seconds per iteration averaged (over two runs) for the L2 and W1 metrics and the number of iterations ...

May 7, 2020 · Uploaded by Ross Taylor

Wasserstein K-Means for Clustering Tomographic Projections

slideslive.com › wasserstein-kmeans-for-clustering-tomog...

Wasserstein K-Means for Clustering Tomographic Projections. Dec 6, 2020 ... Machine Learning for Safety-Critical Robotics Applications.

SlidesLive · 

Dec 6, 2020
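
The k-means entry above clusters cryo-EM projections under a rotation-invariant Wasserstein metric. As a far simpler illustration of the general idea of Wasserstein k-means, the sketch below clusters one-dimensional empirical samples: assignment uses scipy's 1D Wasserstein distance, and each centroid is updated as the pointwise mean of the sorted cluster members (the quantile mean, i.e. the 1-D W2 barycenter). It is not the rotation-invariant image method of the paper, and equal sample sizes are assumed.

```python
import numpy as np
from scipy.stats import wasserstein_distance

def wasserstein_kmeans_1d(samples, k=2, n_iters=20, seed=0):
    """k-means over 1-D empirical samples of equal size: assign by W1
    distance, update each centroid as the pointwise mean of sorted members."""
    rng = np.random.default_rng(seed)
    sorted_samples = np.sort(np.asarray(samples), axis=1)
    centroids = sorted_samples[rng.choice(len(samples), k, replace=False)]
    for _ in range(n_iters):
        dists = np.array([[wasserstein_distance(s, c) for c in centroids]
                          for s in sorted_samples])
        labels = dists.argmin(axis=1)
        centroids = np.array([sorted_samples[labels == j].mean(axis=0)
                              if np.any(labels == j) else centroids[j]
                              for j in range(k)])
    return labels

rng = np.random.default_rng(1)
samples = [rng.normal(0, 1, 100) for _ in range(10)] + \
          [rng.normal(4, 1, 100) for _ in range(10)]
print(wasserstein_kmeans_1d(samples, k=2))
```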

arXiv:2010.09267  [pdf, ps, other 

math.ST stat.CO stat.ML 

Reweighting samples under covariate shift using a Wasserstein distance criterion 

Authors: Julien Reygner, Adrien Touboul 

Abstract: Considering two random variables with different laws to which we only have access through finite size iid samples, we address how to reweight the first sample so that its empirical distribution converges towards the true law of the second sample as the size of both samples goes to infinity. We study an optimal reweighting that minimizes the Wasserstein distance between the empirical measures of th… More 

Submitted 19 October, 2020; originally announced October 2020. 

Cited by 2 Related articles All 27 versions

arXiv:2010.08950  [pdf, ps, other 

math.PR 

Exponential Convergence in Entropy and Wasserstein Distance for McKean-Vlasov SDEs 

Authors: Panpan Ren, Feng-Yu Wang 

Abstract: The following type of exponential convergence is proved for (non-degenerate or degenerate) McKean-Vlasov SDEs: …

  

Convergence rate to equilibrium in Wasserstein distance for reflected jump-diffusions 

By: Sarantsev, Andrey 

STATISTICS & PROBABILITY LETTERS  Volume: ‏ 165     Article Number: 108860   Published: ‏ OCT 2020

 <-—-2020——2020————— 510 


arXiv:2010.12865  [pdf, other]  math.OC cs.LG stat.ML

Fast Epigraphical Projection-based Incremental Algorithms for Wasserstein Distributionally Robust Support Vector Machine

Authors: Jiajin Li, Caihua Chen, Anthony Man-Cho So

Abstract: Wasserstein Distributionally Robust Optimization (DRO) is concerned with finding decisions that perform well on data that are drawn from the worst-case probability distribution within a Wasserstein ball centered at a certain nominal distribution. In recent years, it has been shown that various DRO formulations of learning models admit tractable convex reformulations. How… More

Submitted 24 October, 2020; originally announced October 2020.

Comments: Accepted by NeurIPS 2020

[PDF] arxiv.org

Fast Epigraphical Projection-based Incremental Algorithms for Wasserstein Distributionally Robust Support Vector Machine

J Li, C ChenAMC So - arXiv preprint arXiv:2010.12865, 2020 - arxiv.org

Wasserstein Distributionally Robust Optimization (DRO) is concerned with finding decisions that perform well on data that are drawn from the worst-case probability distribution within a Wasserstein ball centered at a certain nominal …

Cited by 2 Related articles All 6 versions 


J Li, C Chen, AMC So - Advances in Neural Information …, 2020 - proceedings.neurips.cc

Abstract Wasserstein Distributionally Robust Optimization (DRO) is concerned with finding decisions that perform well on data that are drawn from the worst-case probability distribution within a Wasserstein ball centered at a certain nominal distribution. In recent years, it has been shown that various DRO formulations of learning models admit tractable convex reformulations. However, most existing works propose to solve these convex reformulations by general-purpose solvers, which are not well-suited for tackling large-scale …

All 7 versions View as HTML 


arXiv:2010.12522  [pdf, other]  stat.ME stat.CO

The Wasserstein Impact Measure (WIM): a generally applicable, practical tool for quantifying prior impact in Bayesian statistics

Authors: Fatemeh Ghaderinezhad, Christophe Ley, Ben Serrien

Abstract: The prior distribution is a crucial building block in Bayesian analysis, and its choice will impact the subsequent inference. It is therefore important to have a convenient way to quantify this impact, as such a measure of prior impact will help us to choose between two or more priors in a given situation. A recently proposed approach consists in determining the Wasserstein distance between poster… More
Submitted 23 October, 2020; originally announced October 2020. 

Related articles All 2 versions

arXiv:2010.12101  [pdf, other]  math.ST math.OC
Fast and Smooth Interpolation on Wasserstein Space
Authors: Sinho Chewi, Julien Clancy, Thibaut Le Gouic, Philippe Rigollet, George Stepaniants, Austin J. Stromme
Abstract: We propose a new method for smoothly interpolating probability measures using the geometry of optimal transport. To that end, we reduce this problem to the classical Euclidean setting, allowing us to directly leverage the extensive toolbox of spline interpolation. Unlike previous approaches to measure-valued splines, our interpolated curves (i) have a clear interpretation as governing particle flo…
More
Submitted 22 October, 2020; originally announced October 2020.
Comments: 38 pages, 5 figures 


arXiv:2010.11970  [pdf, other]  stat.ML cs.LG

Two-sample Test using Projected Wasserstein Distance: Breaking the Curse of Dimensionality

Authors: Jie Wang, Rui Gao, Yao Xie

Abstract: We develop a projected Wasserstein distance for the two-sample test, a fundamental problem in statistics and machine learning: given two sets of samples, to determine whether they are from the same distribution. In particular, we aim to circumvent the curse of dimensionality in Wasserstein distance: when the dimension is high, it has diminishing testing power, which is inherently due to the slow c… More
Submitted 22 October, 2020; originally announced October 2020.
Comments: 16 pages, 5 figures. Submitted to AISTATS 2021   

  Cited by 1 Related articles All 3 versions 
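
The two-sample test above projects high-dimensional samples before taking Wasserstein distances. Below is a rough sketch in the same spirit: a plain sliced-Wasserstein statistic with a permutation p-value, which is a generic stand-in rather than the paper's projected-Wasserstein construction; the number of projections and permutations are arbitrary choices.

```python
import numpy as np
from scipy.stats import wasserstein_distance

def sliced_wasserstein(x, y, n_proj=50, rng=None):
    """Average 1-D Wasserstein distance over random 1-D projections."""
    rng = np.random.default_rng() if rng is None else rng
    dirs = rng.normal(size=(n_proj, x.shape[1]))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    return np.mean([wasserstein_distance(x @ d, y @ d) for d in dirs])

def permutation_pvalue(x, y, n_perm=200, seed=0):
    """Two-sample test: compare the observed statistic with its permutation null."""
    rng = np.random.default_rng(seed)
    observed = sliced_wasserstein(x, y, rng=rng)
    pooled = np.vstack([x, y])
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(len(pooled))
        xs, ys = pooled[perm[:len(x)]], pooled[perm[len(x):]]
        count += sliced_wasserstein(xs, ys, rng=rng) >= observed
    return (count + 1) / (n_perm + 1)

rng = np.random.default_rng(2)
x = rng.normal(0, 1, size=(100, 5))
y = rng.normal(0.5, 1, size=(100, 5))
print(permutation_pvalue(x, y))
```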

2020

Convergence in Monge-Wasserstein Distance of Mean Field ...

jglobal.jst.go.jp › detail

Article “Convergence in Monge-Wasserstein Distance of Mean Field Systems with Locally Lipschitz Coefficients”. Detailed information from the J-GLOBAL service ... (the Japanese entry 局所Lipschitz係数を持つ平均場系のMonge-Wasserstein距離における収束 is a machine translation of the same title) ... Nguyen Dung Tien ... Nguyen Son Luu ... Du Nguyen Huu.

MR4164597 Prelim Nguyen, Dung Tien; Nguyen, Son Luu; Du, Nguyen Huu; Convergence in Monge-Wasserstein Distance of Mean Field Systems with Locally Lipschitz Coefficients. Acta Math. Vietnam. 45 (2020), no. 4, 875–896. 60 (93)

 Zbl 07270260 

 
2020 PDF

Carlos Ogouyandjou, Nestor Wadagni, Wasserstein ...

ijmcs.future-in-tech.net › R-Carlos

by C Ogouyandjou · 2020 — Wasserstein Riemannian geometry of Gamma densities. Carlos Ogouyandjou, Nestor Wadagni. Institut de Mathématiques et de Sciences Physiques.

MR4159966 Prelim Ogouyandjou, Carlos; Wadagni, Nestor; Wasserstein Riemannian geometry of Gamma densities. Int. J. Math. Comput. Sci. 15 (2020), no. 4, 1253–1270. 53 (60)

 


 Adaptive Wasserstein Hourglass for Weakly Supervised RGB 3D Hand Pose Estimation

Yumeng Zhang, Li Chen, Yufeng Liu, Wen Zheng, Junhai Yong

MM '20: Proceedings of the 28th ACM International Conference on Multimedia, October 2020, pp 2076–2084, https://doi.org/10.1145/3394171.3413651
The deficiency of labeled training data is one of the bottlenecks in 3D hand pose estimation from monocular RGB images. Synthetic datasets have a large number of images with precise annotations, but their obvious difference with real-world datasets ... 

 Cited by 2 Related articles


Synthesising Tabular Datasets Using Wasserstein Conditional GANS with Gradient Penalty (WCGAN-GP)

S McKeever, M Singh Walia - 2020 - arrow.tudublin.ie

Deep learning based methods based on Generative Adversarial Networks (GANs) have seen remarkable success in data synthesis of images and text. This study investigates the use of GANs for the generation of tabular mixed dataset. We apply Wasserstein Conditional …

  Related articles 


DECWA: Density-Based Clustering using Wasserstein Distance

Nabil El Malki,  Robin Cugny, Olivier Teste,  Franck Ravat

CIKM '20: Proceedings of the 29th ACM International Conference on Information & Knowledge Management, October 2020, pp 2005–2008, https://doi.org/10.1145/3340531.3412125
Clustering is a data analysis method for extracting knowledge by discovering groups of data called clusters. Among these methods, state-of-the-art density-based clustering methods have proven to be effective for arbitrary-shaped clusters. Despite their ..

Cited by 2 Related articles All 2 versions

arXiv:2011.03156  [pdf, other  cs.LG math.PR 

Wasserstein-based fairness interpretability framework for machine learning models 

Authors: Alexey Miroshnikov, Konstandinos Kotsiopoulos, Ryan Franks, Arjun Ravi Kannan 

Abstract: In this article, we introduce a fairness interpretability framework for measuring and explaining bias in classification and regression models at the level of a distribution. In our work, motivated by the ideas of Dwork et al. (2012), we measure the model bias across sub-population distributions using the Wasserstein metric. The transport theory characterization of the Wasserstein metric allows us… More 

Submitted 5 November, 2020; originally announced November 2020. 

Comments: 34 pages

MSC Class: 90C08; 91A12

Cited by 2 Related articles All 4 versions

<——2020———————2020——— 520  ——

arXiv:2011.03074  [pdf, other

math.ST stat.ML 

Statistical analysis of Wasserstein GANs with applications to time series forecasting 

Authors: Moritz Haas, Stefan Richter 

Abstract: We provide statistical theory for conditional and unconditional Wasserstein generative adversarial networks (WGANs) in the framework of dependent observations. We prove upper bounds for the excess Bayes risk of the WGAN estimators with respect to a modified Wasserstein-type distance. Furthermore, we formalize and derive statements on the weak convergence of the estimators and use them to develop c… More 

Submitted 5 November, 2020; originally announced November 2020. 

Comments: 47 pages, 4 figures 

MSC Class: 62M45 


arXiv:2011.01614  [pdf, other 

eess.IV cs.CV cs.LG 

Generalized Wasserstein Dice Score, Distributionally Robust Deep Learning, and Ranger for brain tumor segmentation: BraTS 2020 challenge 

Authors: Lucas Fidon, Sebastien Ourselin, Tom Vercauteren 

Abstract: Training a deep neural network is an optimization problem with four main ingredients: the design of the deep neural network, the per-sample loss function, the population loss function, and the optimizer. However, methods developed to compete in recent BraTS challenges tend to focus only on the design of deep neural network architectures, while paying less attention to the three other aspects. In t… More 

Submitted 3 November, 2020; originally announced November 2020. 

Comments: MICCAI 2020 BrainLes Workshop. Our method ranked fourth out of the 693 registered teams for the segmentation task of the BraTS 2020 challenge 


arXiv:2011.01300  [pdf, other 

cond-mat.mtrl-sci physics.atm-clus 

Classification of atomic environments via the Gromov-Wasserstein distance 

Authors: Sakura Kawano, Jeremy K. Mason 

Abstract: Interpreting molecular dynamics simulations usually involves automated classification of local atomic environments to identify regions of interest. Existing approaches are generally limited to a small number of reference structures and only include limited information about the local chemical composition. This work proposes to use a variant of the Gromov-Wasserstein (GW) distance to quantify the d… More 

Submitted 2 November, 2020; originally announced November 2020.

Cited by 1 Related articles All 3 versions
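
For orientation only, a minimal sketch of a plain Gromov-Wasserstein coupling between two toy point clouds, assuming the POT (Python Optimal Transport) package; the paper's variant of GW for atomic environments differs, and the point clouds here are purely illustrative.

    import numpy as np
    import ot  # POT: Python Optimal Transport

    # GW compares the internal pairwise-distance structure of two spaces,
    # so no correspondence between the two point sets is required.
    rng = np.random.default_rng(0)
    X, Y = rng.normal(size=(8, 3)), rng.normal(size=(10, 3))
    C1 = ot.dist(X, X)                      # intra-space distance matrices
    C2 = ot.dist(Y, Y)
    p = ot.unif(len(X))                     # uniform weights on each space
    q = ot.unif(len(Y))
    T = ot.gromov.gromov_wasserstein(C1, C2, p, q, 'square_loss')
    print(T.shape)                          # coupling matrix between the two point sets
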

arXiv:2011.00759  [pdf, other 

math.OC eess.SY 

Data-Driven Approximation of the Perron-Frobenius Operator Using the Wasserstein Metric 

Authors: Amirhossein Karimi, Tryphon T. Georgiou 

Abstract: This manuscript introduces a regression-type formulation for approximating the Perron-Frobenius Operator by relying on distributional snapshots of data. These snapshots may represent densities of particles. The Wasserstein metric is leveraged to define a suitable functional optimization in the space of distributions. The formulation allows seeking suitable dynamics so as to interpolate the distrib… More 

Submitted 2 November, 2020; originally announced November 2020. 

Comments: 11 pages 

MSC Class: 93E12; 93E35; 49J45; 49Q20; 49M29; 90C46 

  Related articles All 3 versions 

arXiv:2010.15285  [pdf, other 

stat.ME stat.ML 

Intrinsic Sliced Wasserstein Distances for Comparing Collections of Probability Distributions on Manifolds and Graphs 

Authors: Raif M. Rustamov, Subhabrata Majumdar 

Abstract: Collections of probability distributions arise in a variety of statistical applications ranging from user activity pattern analysis to brain connectomics. In practice these distributions are represented by histograms over diverse domain types including finite intervals, circles, cylinders, spheres, other manifolds, and graphs. This paper introduces an approach for detecting differences between two… More 

Submitted 28 October, 2020; originally announced October 2020. 

Report number: TD:102696/2020-10-08
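
The cited paper builds an intrinsic sliced construction on manifolds and graphs; for reference, a minimal sketch of the plain Euclidean Monte-Carlo sliced Wasserstein distance between two equal-size point clouds (illustrative names, not the authors' method).

    import numpy as np

    def sliced_wasserstein(X, Y, n_proj=100, p=2, seed=0):
        # Average the 1-D p-Wasserstein distance over random projection directions.
        rng = np.random.default_rng(seed)
        assert X.shape == Y.shape, "equal-size point clouds assumed in this sketch"
        total = 0.0
        for _ in range(n_proj):
            theta = rng.normal(size=X.shape[1])
            theta /= np.linalg.norm(theta)          # random unit direction
            xs, ys = np.sort(X @ theta), np.sort(Y @ theta)
            total += np.mean(np.abs(xs - ys) ** p)  # 1-D OT via sorted projections
        return (total / n_proj) ** (1.0 / p)

    # Example: two small 3-D point clouds.
    rng = np.random.default_rng(1)
    print(sliced_wasserstein(rng.normal(size=(200, 3)), rng.normal(1.0, 1.0, size=(200, 3))))
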

arXiv:2010.14877  [pdf, other 

stat.ML cs.LG 

Hierarchical Gaussian Processes with Wasserstein-2 Kernels 

Authors: Sebastian Popescu, David Sharp, James Cole, Ben Glocker 

Abstract: We investigate the usefulness of Wasserstein-2 kernels in the context of hierarchical Gaussian Processes. Stemming from an observation that stacking Gaussian Processes severely diminishes the model's ability to detect outliers, which when combined with non-zero mean functions, further extrapolates low variance to regions with low training data density, we posit that directly taking into account th… More 

Submitted 28 October, 2020; originally announced October 2020.
Cited by 3
Related articles All 3 versions


arXiv:2010.14325  [pdf, other]  math.OC 

Distributed Optimization with Quantization for Computing Wasserstein Barycenters 

Authors: Roman Krawtschenko, César A. Uribe, Alexander Gasnikov, Pavel Dvurechensky 

Abstract: We study the problem of the decentralized computation of entropy-regularized semi-discrete Wasserstein barycenters over a network. Building upon recent primal-dual approaches, we propose a sampling gradient quantization scheme that allows efficient communication and computation of approximate barycenters where the factor distributions are stored distributedly on arbitrary networks. The communicati… More 

Submitted 27 October, 2020; originally announced October 2020. 

online

 OPEN ACCESS

Distributed Optimization with Quantization for Computing Wasserstein Barycenters

by Krawtschenko, Roman; Uribe, César A; Gasnikov, Alexander ; More...

10/2020

We study the problem of the decentralized computation of entropy-regularized semi-discrete Wasserstein barycenters over a network. Building upon recent...

Journal ArticleFull Text Online

Distributed Optimization with Quantization for Computing Wasserstein Barycenters

www.youtube.com › watch

Simulation results for the paper "Distributed Optimization with Quantization for Computing Wasserstein Barycenters".

YouTube · César A. Uribe · Oct 22, 2020

Distributed optimization with quantization for computing Wasserstein barycenters Book

Data-Driven Distributionally Robust Unit Commitment With Wasserstein Metric: Tractable Formulation and Efficient... 

by Zheng, Xiaodong; Chen, Haoyong 

IEEE transactions on power systems, 11/2020, Volume 35, Issue 6

Article PDF Download PDF 

Journal ArticleFull Text Online 

Cited by 16

   

Researchers' Work from Stanford University Focuses on Fourier Analysis (Irregularity of Distribution In Wasserstein... 

Mathematics Week, 10/2020

NewsletterCitation Online 

 

Patent Application Titled "Methods And Devices Performing Adaptive Quadratic Wasserstein Full-Waveform... 

Information Technology Newsweekly, 10/2020

NewsletterCitation Online 

 OPEN ACCESS

METHODS AND DEVICES PERFORMING ADAPTIVE QUADRATIC WASSERSTEIN FULL-WAVEFORM...

by WANG, Diancheng; WANG, Ping

10/2020

Methods and devices for seismic exploration of an underground structure apply W2-based full-wave inversion to transformed synthetic and seismic data. Data...

PatentCitation Online
<——2020————2020 ———————-830—


[PDF] arxiv.org

A variational finite volume scheme for Wasserstein gradient flows

C Cancès, TO Gallouët, G Todeschi - Numerische Mathematik, 2020 - Springer

We propose a variational finite volume scheme to approximate the solutions to Wasserstein

gradient flows. The time discretization is based on an implicit linearization of the

Wasserstein distance expressed thanks to Benamou–Brenier formula, whereas space

discretization relies on upstream mobility two-point flux approximation finite volumes. The

scheme is based on a first discretize then optimize approach in order to preserve the

variational structure of the continuous model at the discrete level. It can be applied to a wide …

  Cited by 6 Related articles All 9 versions

Study Results from University of Lille Provide New Insights into Mathematics 

(A Variational Finite Volume Scheme for Wasserstein... 

Mathematics Week, 11/2020

NewsletterCitation Online 

MR4169480 Prelim Cancès, Clément; Gallouët, Thomas O.; Todeschi, Gabriele; A variational finite volume scheme for Wasserstein gradient flows. Numer. Math. 146 (2020), no. 3, 437–480. 65M08 (35K65 49M29 49Q22 65M12)

Review PDF Clipboard Journal Article 

A variational finite volume scheme for Wasserstein gradient flows 

By: Cances, Clement; Gallouet, Thomas O.; Todeschi, Gabriele 

NUMERISCHE MATHEMATIK  Volume: ‏ 146   Issue: ‏ 3   Pages: ‏ 437-480   Published: ‏ NOV 2020 

Early Access: OCT 2020

Findings from Harbin Engineering University Reveals New Findings on Computer Graphics (Adversarial Sliced Wass... 

Computer Weekly News, 11/2020

NewsletterFull Text Online 

Computers - Computer Graphics; Findings from Harbin Engineering University Reveals New Findings on Computer Graphics (Adversarial Sliced Wasserstein... 

Computer technology journal, Nov 5, 2020, 184

Newspaper ArticleCitation Online

Findings from Polytechnic University Milan Update Understanding of Optimization Research (On the Computation of Kantorovich Wasserstein... 

Mathematics Week, 11/2020

NewsletterCitation Online

Univ Hangzhou Dianzi Seeks Patent for Data Association Method in Pedestrian Tracking Based on Wasserstein... 

Global IP News. Measurement & Testing Patent News, Nov 1, 2020

Newspaper ArticleCitation Online

IBM Submits United States Patent Application for Wasserstein Barycenter Model Ensembling 

Global IP News. Information Technology Patent News, Oct 29, 2020

Newspaper ArticleFull Text Online 


2020


CGG Services SA; Patent Application Titled "Methods And Devices Performing Adaptive Quadratic Wasserstein... 

Computer Weekly News, Oct 21, 2020, 3200

Newspaper ArticleCitation Online 

CGG Services SA; Patent Application Titled "Methods And Devices Performing Adaptive Quadratic Wasserstein...

Computer Weekly News, Oct 21, 2020, 3200

Newspaper ArticleCitation Online

 

Optimization Research; Findings from Polytechnic University Milan Update Understanding of Optimization Research (On the Computation of Kantorovich Wasserstein... 

Journal of technology & science, Nov 8, 2020, 141

Newspaper ArticleFull Text Online 

online

Findings from Polytechnic University Milan Update Understanding of Optimization Research 

(On the Computation of Kantorovich Wasserstein...

Mathematics Week, 11/2020

NewsletterFull Text Online


Univ Sichuan Applies for Patent on Construction Method and Application of Three-Dimensional Mri Image Denoising Model Based on Wasserstein... 

Global IP News: Medical Patent News, Oct 19, 2020

Newspaper ArticleCitation Online 


Evaluating the Performance of Climate Models Based on Wasserstein Distance 

by Vissio, Gabriele; Lembo, Valerio; Lucarini, Valerio; More... 

Geophysical research letters, 11/2020, Volume 47, Issue 21

Article PDF Download PDF 

Journal ArticleFull Text Online 


WAE_RN: Integrating Wasserstein...

by Zhang, Xinxin; Liu, Xiaoming; Yang, Guan ; More... 

Chinese Computational Linguistics, 11/2020

Book ChapterFull Text Online 

 <——2020———————2020——— 540  ——    


arXiv:2011.09712  [pdf, other]  cs.LG math.CO 

Wasserstein Learning of Determinantal Point Processes 

Authors: Lucas Anquetil, Mike Gartrell, Alain Rakotomamonjy, Ugo Tanielian, Clément Calauzènes 

Abstract: Determinantal point processes (DPPs) have received significant attention as an elegant probabilistic model for discrete subset selection. Most prior work on DPP learning focuses on maximum likelihood estimation (MLE). While efficient and scalable, MLE approaches do not leverage any subset similarity information and may fail to recover the true generative distribution of discrete data. In this work… More 

Submitted 19 November, 2020; originally announced November 2020. 

  Related articles All 4 versions 

2020 see 2022 Oct 21 video

arXiv:2011.08151  [pdf, other]  math.NA math.AP 

The back-and-forth method for Wasserstein gradient flows 

Authors: Matt Jacobs, Wonjun Lee, Flavien Léger 

Abstract: We present a method to efficiently compute Wasserstein gradient flows. Our approach is based on a generalization of the back-and-forth method (BFM) introduced by Jacobs and Léger to solve optimal transport problems. We evolve the gradient flow by solving the dual problem to the JKO scheme. In general, the dual problem is much better behaved than the primal problem. This allows us to efficiently ru… More 

Submitted 16 November, 2020; originally announced November 2020. 

MSC Class: 65K10; 65M99 

Cited by 2 Related articles All 2 versions

arXiv:2011.07489  [pdf, ps, other]  stat.ML cs.LG math.PR 

Entropic regularization of Wasserstein distance between infinite-dimensional Gaussian measures and Gaussian processes 

Authors: Minh Ha Quang 

Abstract: This work studies the entropic regularization formulation of the 2-Wasserstein distance on an infinite-dimensional Hilbert space, in particular for the Gaussian setting. We first present the Minimum Mutual Information property, namely the joint measures of two Gaussian measures on Hilbert space with the smallest mutual information are joint Gaussian measures. This is the infinite-dimensional gener… More 

Submitted 15 November, 2020; originally announced November 2020. 

Comments: 92 pages 
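
The entry above treats the entropic 2-Wasserstein distance between infinite-dimensional Gaussian measures. For orientation, a minimal sketch of the standard Sinkhorn iteration for entropically regularized optimal transport between two finite histograms (not the paper's closed-form Gaussian results; function name and parameters are illustrative).

    import numpy as np

    def sinkhorn(a, b, C, eps=0.05, n_iter=200):
        # Entropic-regularized OT between histograms a and b with cost matrix C.
        K = np.exp(-C / eps)               # Gibbs kernel
        u = np.ones_like(a)
        for _ in range(n_iter):
            v = b / (K.T @ u)              # rescale so columns match marginal b
            u = a / (K @ v)                # rescale so rows match marginal a
        P = u[:, None] * K * v[None, :]    # approximate transport plan
        return float(np.sum(P * C)), P     # regularized transport cost and plan

    # Example: two histograms on a 1-D grid with squared-distance cost.
    x = np.linspace(0.0, 1.0, 50)
    a = np.exp(-((x - 0.3) ** 2) / 0.01); a /= a.sum()
    b = np.exp(-((x - 0.7) ** 2) / 0.01); b /= b.sum()
    C = (x[:, None] - x[None, :]) ** 2
    cost, _ = sinkhorn(a, b, C)
    print(cost)
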


MR4174419 Prelim Alfonsi, Aurélien; Jourdain, Benjamin; Squared quadratic Wasserstein distance: optimal couplings and Lions differentiability. ESAIM Probab. Stat. 24 (2020), 703–717. 90C08 (49J50 58B10 60E15 60G42)

Review PDF Clipboard Journal Article 

SQUARED QUADRATIC WASSERSTEIN DISTANCE: OPTIMAL COUPLINGS AND LIONS DIFFERENTIABILITY 

By: Alfonsi, Aurelien; Jourdain, Benjamin 

ESAIM-PROBABILITY AND STATISTICS  Volume: ‏ 24   Pages: ‏ 703-717   Published: ‏ NOV 16 2020 

 Free Full Text from Publisher 

Zbl 07285910

[PDF] esaim-ps.org

Squared quadratic Wasserstein distance: optimal couplings and Lions differentiability

A Alfonsi, B Jourdain - ESAIM: Probability and Statistics, 2020 - esaim-ps.org

In this paper, we remark that any optimal coupling for the quadratic Wasserstein distance between two probability measures μ and ν with finite second order moments on ℝ^d is the composition of a martingale coupling with an optimal transport map. We check the existence …

Related articles All 5 versions

Wasserstein距離を評価関数とする離散時間システムの最適 ...

https://www.jstage.jst.go.jp › -char

by 星野健太 · 2020 — Organizers: the Institute of Systems, Control and Information Engineers, the Society of Instrument and Control Engineers, the Japan Society of Mechanical Engineers, the Society of Chemical Engineers, Japan, the Japan Society for Precision Engineering, and others. Held 2020/11/21 - 2020/11/22. 63rd Joint Automatic Control Conference: on an optimal control problem for discrete-time systems with the Wasserstein distance as the evaluation function.

Wasserstein距離を評価関数とする離散時間システムの最適制御問題について

by 星野 健太

自動制御連合講演会講演論文集 (Proceedings of the Joint Automatic Control Conference), 2020, Volume 63

Journal ArticleCitation Online

[Japanese: On an optimal control problem for discrete-time systems with the Wasserstein distance as the evaluation function]

2020

 

MR4170073 Prelim Ogouyandjou, Carlos; Wadagni, Nestor; Wasserstein Riemannian geometry on statistical manifold. Int. Electron. J. Geom. 13 (2020), no. 2, 144–151. 53B12 (60D05 62B11)

Review PDF Clipboard Journal Article 

Wasserstein Riemannian Geometry on Statistical Manifold 

By: Ogouyandjou, Carlos; Wadagni, Nestor 

INTERNATIONAL ELECTRONIC JOURNAL OF GEOMETRY  Volume: ‏ 13   Issue: ‏ 2   Pages: ‏ 144-151   Published: ‏ OCT 15 2020

Wasserstein Riemannian Geometry on Statistical Manifold
https://dergipark.org.tr › iejg › issue

Oct 15, 2020 — In this paper, we study some geometric properties of statistical manifold equipped with the Riemannian Otto metric which is related to the L ...


MR4169690 Prelim Ehrlacher, Virginie; Lombardi, Damiano; Mula, Olga; Vialard, François-Xavier; Nonlinear model reduction on metric spaces. Application to one-dimensional conservative PDEs in Wasserstein spaces. ESAIM Math. Model. Numer. Anal. 54 (2020), no. 6, 2159–2197. 65M22 (65M12)

Review PDF Clipboard Journal Article 

Nonlinear model reduction on metric spaces. Application to one-dimensional conservative PDEs in Wasserstein spaces 

By: Ehrlacher, Virginie; Lombardi, Damiano; Mula, Olga; et al.

ESAIM-MATHEMATICAL MODELLING AND NUMERICAL ANALYSIS-MODELISATION MATHEMATIQUE ET ANALYSE NUMERIQUE  Volume: ‏ 54   Issue: ‏ 6   Pages: ‏ 2159-2197   Published: ‏ NOV 3 2020 

Reports on Mathematical Modelling Findings from INRIA Paris Provide New Insights (Nonlinear Model Reduction On Metric Spaces. Application To One-dimensional Conservative Pdes In Wasserstein... 

Mathematics Week, 12/2020

NewsletterCitation Online     arXiv 2019, 2020

Cited by 4 Related articles All 42 versions
Zbl 07357924


Image hashing by minimizing independent relaxed wasserstein distance

KD Doan, A Kimiyaie, S Manchanda… - arXiv preprint arXiv …, 2020 - arxiv.org

… The rapid growth of the visual data, especially images, brings many challenges to the problem

of … or O(nlog(n + d)). This is an order of magnitude faster than the … compare the performance of

the proposed method with various representative unsupervised image hashing methods …

  Cited by 2 Related articles All 2 versions 


MR4168389 Prelim Bonis, Thomas; Stein's method for normal approximation in Wasserstein distances with application to the multivariate central limit theorem. Probab. Theory Related Fields 178 (2020), no. 3-4, 827–860. 60E15 (26D10 60J05)

Review PDF Clipboard Journal Article 1 Citation 


[PDF] uwaterloo.ca

Wasserstein Adversarial Robustness

K Wu - 2020 - uwspace.uwaterloo.ca

… at Waterloo has created an excellent learning and research environment, which makes the thesis

possible … (2019) recently proposed the Wasserstein threat model, ie, adversarial examples are

subject to a perturbation budget measured by the Wasserstein distance (aka …

<——2020————2020 ———————-550—

arXiv:2011.12542  [pdf, ps, other]  cs.LG 

Wasserstein k-means with sparse simplex projection 

Authors: Takumi Fukunaga, Hiroyuki Kasai 

Abstract: This paper presents a proposal of a faster Wasserstein k-means algorithm for histogram data by reducing Wasserstein distance computations and exploiting sparse simplex projection. We shrink data samples, centroids, and the ground cost matrix, which leads to considerable reduction of the computations used to solve optimal transport problems without loss of clustering quality. Furthermore, we dyna… More 

Submitted 25 November, 2020; originally announced November 2020. 

Comments: Accepted in ICPR2020
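
The speed-up above hinges on a sparse simplex projection. A minimal sketch of one common choice is given below: keep the k largest coordinates and apply the usual sorting-based Euclidean projection onto the probability simplex. The function name and the exact sparsification rule are illustrative assumptions, not necessarily the authors' definition.

    import numpy as np

    def sparse_simplex_projection(x, k):
        # Keep the k largest entries of x, project them onto the simplex
        # (standard sorting-based projection), and zero the rest.
        x = np.asarray(x, float)
        k = min(k, x.size)
        idx = np.argsort(x)[::-1][:k]          # indices of the k largest entries
        u = x[idx]                              # already in descending order
        css = np.cumsum(u)
        rho = np.nonzero(u * np.arange(1, k + 1) > (css - 1.0))[0][-1]
        theta = (css[rho] - 1.0) / (rho + 1)    # shift so the kept entries sum to 1
        out = np.zeros_like(x)
        out[idx] = np.maximum(u - theta, 0.0)
        return out

    # Example: sparsify a dense weight vector to 3 active entries.
    print(sparse_simplex_projection([0.1, 0.4, 0.05, 0.3, 0.15], k=3))
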

arXiv:2011.11599  [pdf, ps, other]  math.PR 

Martingale Wasserstein inequality for probability measures in the convex order 

Authors: Benjamin Jourdain, William Margheriti 

Abstract: It is known since [24] that two one-dimensional probability measures in the convex order admit a martingale coupling with respect to which the integral of |x − y| is smaller than twice their W1-distance (Wasserstein distance with index 1). We showed in [24] that replacing |x − y| and…  

Submitted 23 November, 2020; originally announced November 2020. 

Speech Dereverberation Based on Improved Wasserstein Generative Adversarial Networks

L Rao, J Yang - Journal of Physics: Conference Series, 2020 - iopscience.iop.org

… For reverberant speech, first use WGAN-GP for pre-processing, then use MCLP method, so that it may get better dereverberation effect … [17] Park SR and Lee J 2016 A fully convolutional neural network for speech enhancement arXiv preprint arXiv:1609.07132 … 

Cited by 2 Related articles All 3 versions
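
Several entries on this page rely on the WGAN-GP training objective. A minimal PyTorch sketch of the gradient-penalty term of Gulrajani et al. (2017) follows; the critic interface and variable names are illustrative, not taken from any of the cited papers.

    import torch

    def gradient_penalty(critic, real, fake, lam=10.0):
        # Interpolate between real and fake samples with a per-sample random weight.
        eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
        mixed = (eps * real + (1.0 - eps) * fake).detach().requires_grad_(True)
        scores = critic(mixed)
        # create_graph=True lets the penalty itself be backpropagated to the critic.
        grads, = torch.autograd.grad(scores.sum(), mixed, create_graph=True)
        grad_norm = grads.flatten(1).norm(2, dim=1)
        # Penalize deviation of the critic's gradient norm from 1 (Lipschitz-1 target).
        return lam * ((grad_norm - 1.0) ** 2).mean()

    # Usage (illustrative): total critic loss = fake_score.mean() - real_score.mean()
    #                       + gradient_penalty(critic, real_batch, fake_batch)
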

A TextCNN and WGAN-gp based deep learning frame for unpaired text style transfer in multimedia services

M Hu, M He, W Su, A Chehri - Multimedia Systems, 2020 - Springer

… PDF. Special Issue Paper; Published: 23 November 2020. A TextCNN and WGAN-gp based deep learning frame for unpaired text style transfer in multimedia services … Full size image. WGAN-gp for content preservation. Guaranteeing … 

  

 [PDF] iop.org

An Improved Defect Detection Method of Water Walls Using the WGAN

Y Zhang, L Lu, Y Wang, Y Ding, J Yang… - Journal of Physics …, 2020 - iopscience.iop.org

… the future development direction of this experiment is to deploy this water-wall automatic defect detection system to further effectively collect multiple types of defect data, and re-use the WGAN network to … 8] I. Goodfellow, Pouget-Abadie, Generative Adversarial Nets, ArXiv, 2014 …

Bonis, Thomas

Stein’s method for normal approximation in Wasserstein distances with application to the multivariate central limit theorem. (English) Zbl 07271331 

Probab. Theory Relat. Fields 178, No. 3-4, 827-860 (2020). 

MSC:  60E15 26D10 60J05 

Drug-drug interaction prediction with Wasserstein Adversarial Autoencoder-based knowledge graph embeddings. 

By: Dai, Yuanfei; Guo, Chenhao; Guo, Wenzhong; et al.

Briefings in bioinformatics    Published: ‏ 2020-Oct-30 (Epub 2020 Oct 30) 

Zbl 07271331
Cited by 24
Related articles All 5 versions+


Method for model ensembling e.g. Wasserstein barycenter model ensembling, involves inputting set of models that predict different sets of attributes and determining source set of attributes and target set of attributes using barycenter 

Patent Number: US2020342361-A1 

Patent Assignee: INT BUSINESS MACHINES CORP 

Inventor(s): MROUEH Y; DOGNIN P L; MELNYK I; et al.


Anti-domain adaptive model training method comprises e.g. configuring source domain embedding extractor and target domain embedding extractor share some layer parameters to speaker discriminator to obtain speaker loss and Wasserstein loss 

Patent Number: CN111797844-A 

Patent Assignee: SUZHOU AISPEECH INFORMATION TECHNOLOGY C 

Inventor(s): QIAN Y; CHEN Z; WANG S.


Generative adversarial network based single-stage target detection method, involves calculating target detection loss value of loss function constructed based on Wasserstein distance based on predicted target image and sample label 

Patent Number: CN111767962-A 

Patent Assignee: CHINESE ACAD SCI AUTOMATION INST 

Inventor(s): TANG S; ZHENG Q; ZHU H; et al.

 <——2020———2020————— 560  ——      


MR4177281 Prelim Brown, Louis; Steinerberger, Stefan; On the Wasserstein distance between classical sequences and the Lebesgue measure. Trans. Amer. Math. Soc. 373 (2020), no. 12, 8943–8962. 11L07 (41A25 42B05 65D30)

On the Wasserstein distance between classical sequences and the Lebesgue... 

by Louis Brown; 

Stefan Steinerberger 

Transactions of the American Mathematical Society, 12/2020, Volume 373, Issue 12

We discuss the classical problem of measuring the regularity of distribution of sets of N points in \mathbb{T}^d. A recent line of investigation is to study...

Article PDF Download PDF 

Journal ArticleFull Text Online 

Cited by 9 Related articles All 5 versions

arXiv:2011.13384  [pdf, other]  cs.LG cs.CL 

Automatic coding of students' writing via Contrastive Representation Learning in the Wasserstein space 

Authors: Ruijie Jiang, Julia Gouvea, David Hammer, Shuchin Aeron 

Abstract: Qualitative analysis of verbal data is of central importance in the learning sciences. It is labor-intensive and time-consuming, however, which limits the amount of data researchers can include in studies. This work is a step towards building a statistical machine learning (ML) method for achieving an automated support for qualitative analyses of students' writing, here specifically in score labor… More 

Submitted 26 November, 2020; originally announced November 2020. 


[PDF] arxiv.org

A Wasserstein coupled particle filter for multilevel estimation

M Ballesio, A Jasra, E von Schwerin… - arXiv preprint arXiv …, 2020 - arxiv.org

In this paper, we consider the filtering problem for partially observed diffusions, which are

regularly observed at discrete times. We are concerned with the case when one must resort

to time-discretization of the diffusion process if the transition density is not available in an …

  Cited by 3 Related articles All 4 versions 


  

arXiv:2012.01252  [pdf, other]  cs.LG 

Partial Gromov-Wasserstein Learning for Partial Graph Matching 

Authors: Weijie Liu, Chao Zhang, Jiahao Xie, Zebang Shen, Hui Qian, Nenggan Zheng 

Abstract: Graph matching finds the correspondence of nodes across two graphs and is a basic task in graph-based machine learning. Numerous existing methods match every node in one graph to one node in the other graph whereas two graphs usually overlap partially in many real-world applications. In this paper, a partial Gromov-Wasserstein learning framework is proposed for partially matching two graphs, whi… More 

Submitted 2 December, 2020; originally announced December 2020. 


arXiv:2012.00780  [pdf, other]  cs.LG cs.AI stat.ML 

Refining Deep Generative Models via Wasserstein Gradient Flows 

Authors: Abdul Fatir Ansari, Ming Liang Ang, Harold Soh 

Abstract: Deep generative modeling has seen impressive advances in recent years, to the point where it is now commonplace to see simulated samples (e.g., images) that closely resemble real-world data. However, generation quality is generally inconsistent for any given model and can vary dramatically between samples. We introduce Discriminator Gradient flow (DGflow), a new technique that improves generated s… More 

Submitted 1 December, 2020; originally announced December 2020. 



 

Open Access 

Online Companion - Enhanced Wasserstein distributionally robust OPF with... 

by Arrigo, Adriano; Kazempour, Jalal; Grève, Zacharie De ; More... 

11/2020

This paper goes beyond the current state of the art related to Wasserstein distributionally robust optimal powerflow problems, by adding dependence structure...

Conference ProceedingCitation Online 



Functional Data Clustering Analysis via the Learning of Gaussian Processes with Wasserstein distance

by Li, Tao; Ma, Jinwen 

Neural Information Processing, 11/2020

Functional data clustering analysis becomes an urgent and challenging task in the new era of big data. In this paper, we propose a new framework for functional...

Book ChapterFull Text Online 

Springer link
Functional Data Clustering Analysis via the Learning of Gaussian Processes with Wasserstein Distance


Investigators from Federal Reserve Bank of Philadelphia Have Reported New Data on Entropy 

(Probability Forecast Combination Via Entropy Regularized Wasserstein Distance)

Investment Weekly News, 11/2020

NewsletterCitation Online 

Comments: Preprint 

Probability forecast combination via entropy regularized Wasserstein distance 

Probability Forecast Combination via Entropy Regularized Wasserstein Distance

R Cumings-Menon, M Shin - Entropy, 2020 - mdpi.com

We propose probability and density forecast combination methods that are defined using the entropy regularized Wasserstein distance. First, we provide a theoretical characterization of the combined density forecast based on the regularized Wasserstein distance under a Gaussian assumption. More specifically, we show that the regularized Wasserstein barycenter between multivariate Gaussian input densities is multivariate Gaussian, and provide a simple way to compute its mean and variance–covariance matrix. Second, we show how this …

Cited by 2 Related articles All 17 versions

Probability forecast combination via entropy regularized Wasserstein distance book
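
As a related, unregularized point of reference for the abstract above: the 2-Wasserstein barycenter of Gaussians is again Gaussian, with mean equal to the weighted mean of means and covariance given by a fixed point. A minimal sketch of the classical fixed-point iteration of Álvarez-Esteban et al. follows (shown only for orientation; the paper's entropy-regularized barycenter has a different closed form, and the function name is illustrative).

    import numpy as np
    from scipy.linalg import sqrtm

    def gaussian_w2_barycenter(means, covs, weights, n_iter=100):
        # Mean of the W2 barycenter of Gaussians: weighted average of the means.
        m = sum(w * mu for w, mu in zip(weights, means))
        # Covariance: S <- S^{-1/2} (sum_k w_k (S^{1/2} C_k S^{1/2})^{1/2})^2 S^{-1/2}
        S = np.eye(len(m))
        for _ in range(n_iter):
            R = np.real(sqrtm(S))
            R_inv = np.linalg.inv(R)
            M = sum(w * np.real(sqrtm(R @ C @ R)) for w, C in zip(weights, covs))
            S = R_inv @ (M @ M) @ R_inv
        return m, S

    # Example: combine two bivariate Gaussian forecasts with equal weights.
    m, S = gaussian_w2_barycenter([np.zeros(2), np.ones(2)],
                                  [np.eye(2), 2.0 * np.eye(2)], [0.5, 0.5])
    print(m, S)
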


Distributionally Robust Optimal Reactive Power Dispatch with Wasserstein Distance in Active Distribution Network 

By: Liu, Jun; Chen, Yefu; Duan, Chao; et al.

JOURNAL OF MODERN POWER SYSTEMS AND CLEAN ENERGY  Volume: ‏ 8   Issue: ‏ 3   Pages: ‏ 426-436   Article Number: 2196-5625(2020)8:3<426:DRORPD>2.0.TX;2-X   Published: ‏ MAY 2020 

 Free Full Text from Publisher 

Cited by 18 Related articles All 6 versions

 arXiv:2012.03809  [pdf, ps, other]  math.ST cs.AI cs.LG eess.SP stat.ML 

Independent Elliptical Distributions Minimize Their W2 Wasserstein Distance from Independent Elliptical Distributions with the Same Density Generator 

Authors: Song Fang, Quanyan Zhu 

Abstract: This short note is on a property of the W2 Wasserstein distance which indicates that independent elliptical distributions minimize their W2 Wasserstein distance from given independent elliptical distributions with the same density generators. Furthermore, we examine the implications of this property in the Gelbrich bound when the distributions are not necessarily elliptic… More 

Submitted 7 December, 2020; originally announced December 2020.
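
For Gaussian distributions (elliptical with Gaussian generator) the Gelbrich bound referred to above is attained, and W2 squared has a closed form in the means and covariances. A minimal sketch, with an illustrative function name:

    import numpy as np
    from scipy.linalg import sqrtm

    def w2_squared_gaussian(m1, C1, m2, C2):
        # W2^2(N(m1,C1), N(m2,C2)) = ||m1-m2||^2 + tr(C1 + C2 - 2 (C1^{1/2} C2 C1^{1/2})^{1/2});
        # for non-Gaussian laws with these first two moments this is the Gelbrich lower bound.
        R1 = np.real(sqrtm(C1))
        cross = np.real(sqrtm(R1 @ C2 @ R1))
        return float(np.sum((np.asarray(m1) - np.asarray(m2)) ** 2)
                     + np.trace(C1 + C2 - 2.0 * cross))

    # Example: two bivariate Gaussians.
    print(w2_squared_gaussian(np.zeros(2), np.eye(2), np.ones(2), 2.0 * np.eye(2)))
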
 <——2020——— 2020——————570 —    


 

arXiv:2012.03612  [pdf, ps, other]  cs.LG cs.AI cs.DS stat.ML 

LCS Graph Kernel Based on Wasserstein Distance in Longest Common Subsequence Metric Space 

Authors: Jianming Huang, Zhongxi Fang, Hiroyuki Kasai 

Abstract: For graph classification tasks, many methods use a common strategy to aggregate information of vertex neighbors. Although this strategy provides an efficient means of extracting graph topological features, it brings excessive amounts of information that might greatly reduce its accuracy when dealing with large-scale neighborhoods. Learning graphs using paths or walks will not suffer from this diff… More 

Submitted 7 December, 2020; originally announced December 2020. 


arXiv:2012.03564  [pdf, ps, other]  math.OA math-ph math.OC 

Quadratic Wasserstein metrics for von Neumann algebras via transport plans 

Authors: Rocco Duvenhage 

Abstract: We show how one can obtain a class of quadratic Wasserstein metrics, that is to say, Wasserstein metrics of order 2, on the set of faithful normal states of a von Neumann algebra A, via transport plans, rather than through a dynamical approach. Two key points to make this work, are a suitable formulation of the cost of transport arising from Tomita-Takesaki theory and relative tensor products of… More 

Submitted 7 December, 2020; originally announced December 2020. 

Comments: 20 pages 

Quadratic Wasserstein metrics for von Neumann algebras via transport plans

R Duvenhage - arXiv preprint arXiv:2012.03564, 2020 - arxiv.org

We show how one can obtain a class of quadratic Wasserstein metrics, that is to say, Wasserstein metrics of order 2, on the set of faithful normal states of a von Neumann algebra A, via transport plans, rather than through a dynamical approach. Two key points to make this work, are a suitable formulation of the cost of transport arising from Tomita-Takesaki theory and relative tensor products of bimodules (or correspondences in the sense of Connes). The triangle inequality, symmetry and W₂(μ, μ) = 0 all work quite …

  All 2 versions 

Quadratic Wasserstein metrics for von Neumann algebras via transport plans
by Duvenhage, Rocco
12/2020
We show how one can obtain a class of quadratic Wasserstein metrics, that is to say, Wasserstein metrics of order 2, on the set of faithful normal states of a...

Journal ArticleFull Text Online

 Cited by 6 Related articles All 2 versions

arXiv:2012.03420  [pdf, other]  cs.LG stat.ML 

Sobolev Wasserstein GAN 

Authors: Minkai Xu, Zhiming Zhou, Guansong Lu, Jian Tang, Weinan Zhang, Yong Yu 

Abstract: Wasserstein GANs (WGANs), built upon the Kantorovich-Rubinstein (KR) duality of Wasserstein distance, is one of the most theoretically sound GAN models. However, in practice it does not always outperform other variants of GANs. This is mostly due to the imperfect implementation of the Lipschitz condition required by the KR duality. Extensive work has been done in the community with different imple… More 

Submitted 6 December, 2020; originally announced December 2020. 

Comments: Accepted by AAAI 2021  


arXiv:2012.04500  [pdf, other 

q-fin.MF q-fin.PM q-fin.RM 

Portfolio Optimisation within a Wasserstein Ball 

Authors: Silvana Pesenti, Sebastian Jaimungal 

Abstract: We consider the problem of active portfolio management where a loss-averse and/or gain-seeking investor aims to outperform a benchmark strategy's risk profile while not deviating too much from it. Specifically, an investor considers alternative strategies that co-move with the benchmark and whose terminal wealth lies within a Wasserstein ball surrounding it. The investor then chooses the alternati… More 

Submitted 8 December, 2020; originally announced December 2020. 

Comments: 36 pages, 2 tables, 6 figures 

MSC Class: 91G10; 91G70; 91G05; 91B06 


Portfolio Optimisation within a Wasserstein Ball

S PesentiS Jaimungal - arXiv preprint arXiv:2012.04500, 2020 - arxiv.org

We consider the problem of active portfolio management where a loss-averse and/or gain-

seeking investor aims to outperform a benchmark strategy's risk profile while not deviating

too much from it. Specifically, an investor considers alternative strategies that co-move with

the benchmark and whose terminal wealth lies within a Wasserstein ball surrounding it. The

investor then chooses the alternative strategy that minimises their personal risk preferences,

modelled in terms of a distortion risk measure. In a general market model, we prove that an …

 Cited by 6 Related articles All 6 versions 

Portfolio Optimisation within a Wasserstein Ball
by Pesenti, Silvana; Jaimungal, Sebastian
12/2020
We consider the problem of active portfolio management where a loss-averse and/or gain-seeking investor aims to outperform a benchmark strategy's risk profile...

Journal ArticleFull Text Online


arXiv:2012.04023  [pdf, ps, other 

math.ST cs.LG eess.SP math.PR stat.ML 

The Spectral-Domain W2 Wasserstein Distance for Elliptical Processes and the Spectral-Domain Gelbrich Bound 

Authors: Song Fang, Quanyan Zhu 

Abstract: In this short note, we introduce the spectral-domain W2 Wasserstein distance for elliptical stochastic processes in terms of their power spectra. We also introduce the spectral-domain Gelbrich bound for processes that are not necessarily elliptical. 

Submitted 7 December, 2020; originally announced December 2020. 



Barycenters of Natural Images - Constrained Wasserstein Barycenters for Image Morphing
Authors: Simon D.; Aberdam A. - 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2020
Article, 2020
Publication: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2020, 7907
Publisher: 2020

On the Wasserstein distance for a martingale central limit ...

www.sciencedirect.com › science › article › pii


online PEER-REVIEW OPEN ACCESS

Data augmentation-based conditional Wasserstein generative adversarial network-gradient penalty for...

by Mokbal, Fawaz Mahiuob Mohammed; Wang, Dan; Wang, Xiaoxi ; More...

PeerJ. Computer science, 2020, Volume 6

The rapid growth of the worldwide web and accompanied opportunities of web applications in various aspects of life have attracted the attention of...

Article PDF Download Now (via Unpaywall)

Journal ArticleFull Text Online


Stein's method for normal approximation in Wasserstein ...

arxiv.org › math

May 31, 2019 — Stein's method for normal approximation in Wasserstein distances with application to the multivariate Central Limit Theorem. ... If the stochastic process (X_t)_{t \geq 0} satisfies an additional exchangeability assumption, we show it can also be used to obtain bounds on Wasserstein distances of any order p \geq 1.

by T Bonis · ‎2019 · ‎Cited by 5 · ‎Related articles

Stein’s method for normal approximation in Wasserstein distances with application to the multivariate central...
by Bonis, Thomas
Probability theory and related fields, 08/2020
Article PDF Download
Cited by 17
Related articles All 5 versions

On the Wasserstein Distance between Classical Sequences ...

arxiv.org › math

Sep 19, 2019 — On the Wasserstein Distance between Classical Sequences and the Lebesgue Measure. We discuss the classical problem of measuring the regularity of distribution of sets of N points in \mathbb{T}^d. ... We show that Kronecker sequences satisfy optimal transport distance in d \geq 3 dimensions.

by L Brown · ‎2019 · ‎Cited by 3 · ‎Related articles
  On the Wasserstein distance between classical sequences and the Lebesgue measure
by Louis Brown; Stefan Steinerberger
Transactions of the American Mathematical Society, 12/2020, Volume 373, Issue 12
We discuss the classical problem of measuring the regularity of distribution of sets of N points in \mathbb{T}^d. A recent line of investigation is to study...
Article PDF Download PDF 

 Partial Gromov-Wasserstein Learning for Partial Graph Matching
by Liu, Weijie; Zhang, Chao; Xie, Jiahao ; More...
12/2020
Graph matching finds the correspondence of nodes across two graphs and is a basic task in graph-based machine learning. Numerous existing methods match every...   

Cited by 10 Related articles All 5 versions

 <——2020————2020——— 580  ——    


Refining Deep Generative Models via Wasserstein Gradient ...

arxiv.org › cs

Dec 1, 2020 — We introduce Discriminator Gradient flow (DGflow), a new technique that improves generated samples via the gradient flow of ...

by AF Ansari · ‎2020
 Refining Deep Generative Models via Wasserstein Gradient Flows
by Ansari, Abdul Fatir; Ang, Ming Liang; Soh, Harold
12/2020
Deep generative modeling has seen impressive advances in recent years, to the point where…

Reports on Mathematical Modelling Findings from INRIA Paris Provide New Insights (Nonlinear Model Reduction On Metric Spaces. Application To One-dimensional Conservative Pdes In Wasserstein...

Mathematics Week, 12/2020


arXiv:2012.06397  [pdf, other]  stat.CO stat.ME

Randomised Wasserstein Barycenter Computation: Resampling with Statistical Guarantees

Authors: Florian Heinemann, Axel Munk, Yoav Zemel

Abstract: We propose a hybrid resampling method to approximate finitely supported Wasserstein barycenters on large-scale datasets, which can be combined with any exact solver. Nonasymptotic bounds on the expected error of the objective value as well as the barycenters themselves allow to calibrate computational cost and statistical accuracy. The rate of these upper bounds is shown to be optimal and independ…  More

Submitted 11 December, 2020; originally announced December 2020.


[PDF] Distributionally Robust XVA via Wasserstein Distance: Wrong Way Counterparty Credit and Funding Risk

D Singh, S Zhang - researchgate.net

This paper investigates calculations of robust XVA, in particular, credit valuation adjustment

(CVA) and funding valuation adjustment (FVA) for over-the-counter derivatives under

distributional uncertainty using Wasserstein distance as the ambiguity measure. Wrong way …

  Related articles All 2 versions 


 基于 Wasserstein 生成对抗网络的智能光通信

牟迪, 蒙文, 赵尚弘, 王翔, 刘文亚 - 中国激光, 2020 - opticsjournal.net

Abstract: The paper first introduces the advantages of laser-link communication, and then presents an end-to-end communication learning system based on generative adversarial networks (GANs), which improves the real-time performance and global optimization of the communication system. To address the mode collapse and training instability of traditional GANs, a Wasserstein GAN is introduced as an improvement. Finally, the Wasserstein GAN is applied to the end-to-…

[Chinese: Intelligent optical communication based on a Wasserstein generative adversarial network]


PDF arXiv:2002.08276v2 [stat.ML] 12 Jun 2020 - arXiv.org

arxiv.org › pdf

Jun 12, 2020 — of neither Wasserstein nor Gromov-Wasserstein are available yet. ... i,j is a coupling matrix with an entry Tij that describes the amount of mass ...

[CITATION] Gromov-Wasserstein Coupling Matrix

SERWC Matrix - hal.archives-ouvertes.fr

[Figure: panels titled "Gromov-Wasserstein Coupling Matrix" and "SubEmbedding Robust Wasserstein Coupling Matrix"; axis tick residue omitted]


2020

  

基于 Wasserstein 生成式对抗网络的开关柜缺陷样本增强方法

张宇, 熊俊, 黄雪莜, 桑成磊, 黎明 - 高电压技术, 2020 - hve.epri.cdqingdajiaoyu.com

Abstract: Sample imbalance causes poor generalization of deep learning models and overfitting during network training, which limits the accuracy of intelligent fault diagnosis for switchgear. In view of this, combining on-site partial-discharge live detection data from distribution substations with partial-discharge simulation data from full-scale switchgear experiments, this paper proposes a Wasserstein-GAN-based defect sample…

[Chinese: A defect sample augmentation method for switchgear based on a Wasserstein generative adversarial network]


 [PDF] googleapis.com

Methods and devices performing adaptive quadratic wasserstein full-waveform inversion

W Diancheng, P Wang - US Patent App. 16/662,644, 2020 - Google Patents

Methods and devices for seismic exploration of an underground structure apply W 2-based

full-wave inversion to transformed synthetic and seismic data. Data transformation ensures

that the synthetic and seismic data are positive definite and have the same mass using an …

  All 2 versions 

基于 Wasserstein 距离的条件双向学习推理

刘轶功 - 2020 - cdmd.cnki.com.cn

Image generation has long been an important research direction in computer vision, computer graphics, and related fields, and is widely applied in industry. Despite dedicated research by many scientists, results on deep-learning-based image generation remain unsatisfactory; the main difficulties and challenges lie in the diversity and realism of the generated images …

[Chinese: Conditional bidirectional learning and inference based on the Wasserstein distance]


Lp-Wasserstein and flux-limited gradient flows: Entropic discretization, convergence analysis and numerics

B Söllner - 2020 - mediatum.ub.tum.de

We analyse different discretizations of gradient flows in transport metrics with non-quadratic

costs. Among others we discuss the p-Laplace equation and evolution equations with flux-

limitation. We prove comparison principles, free energy monotony, non-negativity and mass …

  All 3 versions 

 A new Wasserstein distance-and cumulative sum-dependent health indicator and its application in prediction of remaining useful life of bearing

J Yin, M Xu, H Zheng, Y Yang - Journal of the Brazilian Society of …, 2020 - Springer

The safety and reliability of mechanical performance are affected by the condition (health

status) of the bearings. A health indicator (HI) with high monotonicity and robustness is a

helpful tool to simplify the predictive model and improve prediction accuracy. In this paper, a …

  Related articles

<——2020——  2020———————— 590  ——    


Wasserstein Random Forests and Applications in ...

arxiv.org › stat

Jun 8, 2020 — This reformulation indicates that Random Forests are well adapted to estimate conditional distributions and provides a natural extension of the algorithm to multivariate outputs. Following the ...

by Q Du · ‎2020 · ‎Related articles

[CITATION] Wasserstein Random Forests at First Glance

Q Du - 2020

  Related articles

[CITATION] Improving Wasserstein Generative Models for Image Synthesis and Enhancement

J Wu - 2020 - research-collection.ethz.ch



Stereoscopic image reflection removal based on Wasserstein Generative Adversarial Network

X Wang - 2020 - theses.lib.polyu.edu.hk

Reflection removal is a long-standing problem in computer vision. Although much progress

has been made in single-image solutions, the limitations are also obvious due to the

challenging nature of this problem. In this study, we propose to use stereoscopic image pairs …

  

[PDF] arxiv.org

SVGD as a kernelized Wasserstein gradient flow of the chi-squared divergence

S ChewiTL GouicC LuT Maunu… - arXiv preprint arXiv …, 2020 - arxiv.org

Stein Variational Gradient Descent (SVGD), a popular sampling algorithm, is often described

as the kernelized gradient flow for the Kullback-Leibler divergence in the geometry of

optimal transport. We introduce a new perspective on SVGD that instead views SVGD as the …

  Cited by 8 Related articles All 5 versions 


[PDF] googleapis.com

Object shape regression using wasserstein distance

J Sun, SKP Kumar, R Bala - US Patent App. 16/222,062, 2020 - Google Patents

One embodiment can provide a system for detecting outlines of objects in images. During

operation, the system receives an image that includes at least one object, generates a

random noise signal, and provides the received image and the random noise signal to a …

  All 3 versions 

[PDF] projecteuclid.org

Donsker's theorem in Wasserstein-1 distance

L Coutin, L Decreusefond - Electronic Communications in …, 2020 - projecteuclid.org

We compute the Wasserstein-1 (or Kantorovitch-Rubinstein) distance between a random walk in R^d and the Brownian motion. The proof is based on a new estimate of the modulus of continuity of the solution of the Stein's equation. As an application, we can …

  Related articles All 18 versions


Two approaches for population Wasserstein barycenter problem: Stochastic Averaging versus Sample Average Approximation

D Dvinskikh, A Gasnikov - nnov.hse.ru

Abstract In Machine Learning and Optimization community there are two main approaches

for convex risk minimization problem: Stochastic Averaging (SA) and Sample Average

Approximation (SAA). At the moment, it is known that both approaches are on average …

  Related articles 


2020

Diffusions on Wasserstein Spaces - bonndoc - Universität Bonn

bonndoc.ulb.uni-bonn.de › xmlui › handle

05.05.2020 ... Dello Schiavo, Lorenzo: Diffusions on Wasserstein Spaces. - Bonn, 2020. - Dissertation, Rheinische Friedrich-Wilhelms-Universität Bonn.

by L Dello Schiavo · ‎2020 · ‎Related articles

Related articles All 2 versions 

Diffusions on Wasserstein Spaces thesis


Isometric study of Wasserstein spaces---the real line

G Pál Gehér, T Titkos, D Virosztek - arXiv, 2020 - ui.adsabs.harvard.edu

Recently Kloeckner described the structure of the isometry group of the quadratic Wasserstein space $\mathcal{W}_2(\mathbb{R}^n)$. It turned out that the case of the real line is exceptional in the sense that there exists an exotic isometry flow. Following …

 


PDF  Lecture 3: Wasserstein Space - Lénaïc Chizat

lchizat.github.io › lecture3

Feb 26, 2020 — Mérigot's lecture notes and [3, 4]. 1 Reminders. Let X, Y be compact metric spaces, c ∈ C(X × Y) the cost function ...

[CITATION] Lecture 3: Wasserstein Space

L Chizat - 2020

  Related articles All 2 versions

 

A TextCNN and WGAN-gp based deep learning frame for unpaired text style transfer in multimedia services

M Hu, M He, W Su, A Chehri - Multimedia Systems, 2020 - Springer

With the rapid growth of big multimedia data, multimedia processing techniques are facing

some challenges, such as knowledge understanding, semantic modeling, feature

representation, etc. Hence, based on TextCNN and WGAN-gp (improved training of …


WGAN domain adaptation for the joint optic disc-and-cup segmentation in fundus images

S Kadambi, Z Wang, E Xing - … Journal of Computer Assisted Radiology and …, 2020 - Springer

Purpose The cup-to-disc ratio (CDR), a clinical metric of the relative size of the optic cup to

the optic disc, is a key indicator of glaucoma, a chronic eye disease leading to loss of vision.

CDR can be measured from fundus images through the segmentation of optic disc and optic …

  Cited by 1 Related articles All 2 versions


[PDF] arxiv.org

Adaptive WGAN with loss change rate balancing

X Ouyang, G Agam - arXiv preprint arXiv:2008.12463, 2020 - arxiv.org

Optimizing the discriminator in Generative Adversarial Networks (GANs) to completion in the

inner training loop is computationally prohibitive, and on finite datasets would result in

overfitting. To address this, a common update strategy is to alternate between k optimization …

  All 2 versions 

<——2020——————2020———    600  ——    


   An Improved Defect Detection Method of Water Walls Using the WGAN

Y Zhang, L Lu, Y Wang, Y Ding, J Yang… - Journal of Physics …, 2020 - iopscience.iop.org

This paper proposes an improved water wall defect detection method using Wasserstein

generation adversarial network (WGAN). The method aims to improve the problems of poor

safety and high level of maintenance personnel required by traditional inspection methods …

 Related articles All 2 versions


A Generative Steganography Method Based on WGAN-GP

J Li, K Niu, L Liao, L Wang, J Liu, Y Lei… - … Conference on Artificial …, 2020 - Springer

With the development of Generative Adversarial Networks (GAN), GAN-based

steganography and steganalysis techniques have attracted much attention from

researchers. In this paper, we propose a novel image steganography method without …
  Cited by 3 Related articles


[PDF] researchsquare.com

[PDF] Res-WGAN: Image Classification for Plant Small-scale Datasets

M Jiaqi, Y Si, Y Xiande, G Wanlin, L Minzan, Z Lihua… - 2020 - researchsquare.com

Background: Artificial identification of rare plants is an important yet challenging 12 problem

in plant taxonomy. Although deep learning-based method can accurately 13 predict rare

plant category from training samples, accuracy requirements of only few 14 experts are …

  Related articles All 3 versions 

Technique Proposal to Stabilize Lipschitz Continuity of WGAN Based on Regularization Terms

HI Hahn - The Journal of The Institute of Internet, Broadcasting …, 2020 - koreascience.or.kr

The recently proposed Wasserstein generative adversarial network (WGAN) has improved

some of the tricky and unstable training processes that are chronic problems of the

generative adversarial network (GAN), but there are still cases where it generates poor …

  Related articles 


Eye in-painting using WGAN-GP for face images with mosaic

CH Wu, HT Chang, A Amjad - 2020 International Conference …, 2020 - spiedigitallibrary.org

In order to protect personal privacy, news reports often use the mosaics upon the face of the

protagonist in the photo. However, readers will feel uncomfortable and awkward to this kind

of photos. In this research, we detect the eye mosaic and try to use eye complementing …

   Related articles All 3 versions


2020


Res-WGAN: Image Classification for Plant Small-scale Datasets

M Wang, M Jiaqi, H Xia, Y Si, G Wanlin, L Minzan… - 2020 - europepmc.org

Background: The artificial identification of rare plants is always a challenging problem in

plant taxonomy. Although the convolutional neural network (CNN) in the deep learning

method can better realize the automatic classification of plant samples through training, the …

  Related articles All 4 versions 


 [PDF] ucsal.br

Adaptação do WGAN ao processo estocástico

RR Aquino - 2020 - ri.ucsal.br

Within many areas of knowledge, data (of various kinds) are valuable and their analysis even more so. In connection with artificial intelligence, a new trend can thus be observed: the generation of synthetic data to make up for the lack of …

[Portuguese: Adaptation of the WGAN to the stochastic process]

   Related articles 


 [PDF] opticsjournal.net

[PDF] 基于改进 WGAN-GP 的多波段图像同步超分与融合方法

田嵩旺, 蔺素珍, 雷海卫, 李大威, 王丽芳 - 光学学报, 2020 - opticsjournal.net

Abstract: To address the problem that fusion results obtained from low-resolution source images are of low quality and hinder subsequent target extraction, a simultaneous multi-band image super-resolution and fusion method based on the gradient-penalty Wasserstein generative adversarial network (WGAN-GP) is proposed. First, the multi-band low-resolution source images are upscaled to the target size by bicubic interpolation; next, …

  All 2 versions 

[Chinese: A simultaneous multi-band image super-resolution and fusion method based on an improved WGAN-GP]  


정칙화 항에 기반한 WGAN  립쉬츠 연속 안정화 기법 제안

한희일 - 한국인터넷방송통신학회 논문지, 2020 - earticle.net

Abstract: With the recently proposed WGAN (Wasserstein generative adversarial network), the notoriously difficult and unstable training process of the GAN (generative adversarial network) has been somewhat improved, but there are still cases where training fails to converge or unnatural outputs are generated …

[Korean: A technique to stabilize the Lipschitz continuity of the WGAN based on regularization terms]

All 2 versions
 
 

분별기의 립쉬츠 연속 안정화를 통한 WGAN 성능개선

한희일 - 전자공학회논문지, 2020 - dbpia.co.kr

With the advent of GANs (generative adversarial networks), breakthrough progress has been made in generative modeling, but instability during training has emerged as the main problem to be solved. The recently proposed WGAN (Wasserstein GAN) improves training stability and has become an alternative to the GAN, but …

  Related articles

[Korean: Improving WGAN performance by stabilizing the Lipschitz continuity of the discriminator]


  基于 WGAN 反馈的深度学习差分隐私保护方法

陶陶, 柏建树 - 收藏, 2020 - cnki.com.cn

To address the problem that attackers may steal sensitive information from deep learning training datasets through techniques such as generative adversarial networks (GANs), this paper combines differential privacy theory and proposes a deep learning differential privacy protection method whose parameters are tuned via feedback from a Wasserstein generative adversarial network (WGAN). The method uses stochastic gradient descent for optimization …

[Chinese: A differential privacy protection method for deep learning based on WGAN feedback]

<——2020————2020———  610 ——   


 

[PDF] arxiv.org

TextureWGAN: Texture Preserving WGAN with MLE Regularizer for Inverse Problems

M Ikuta, J Zhang - arXiv preprint arXiv:2008.04861, 2020 - arxiv.org

Many algorithms and methods have been proposed for inverse problems particularly with

the recent surge of interest in machine learning and deep learning methods. Among all

proposed methods, the most popular and effective method is the convolutional neural …

  All 2 versions 


arXiv:2012.07106  [pdf, other]  math.DG
Geometric Characteristics of Wasserstein Metric on SPD(n)
Authors: Yihao Luo, Shiqiang Zhang, Yueqi Cao, Huafei Sun
Abstract: Wasserstein distance, especially among symmetric positive-definite matrices, has broad and deep influences on development of artificial intelligence (AI) and other branches of computer science. A natural idea is to describe the geometry of SPD(n) as a Riemannian manifold endowed with the Wasserstein metric. In this paper, by involving the fiber bundle, we obtain explicit expressions f…  More
Submitted 13 December, 2020; originally announced December 2020.
Comments: 29 pages, 4 figures
MSC Class: 53Z50 (Primary); 53B20 (Secondary)

arXiv:2012.06961  [pdf, other]  cs.LG math.OC
Online Stochastic Optimization with Wasserstein Based Non-stationarity
Authors: Jiashuo Jiang, Xiaocheng Li, Jiawei Zhang
Abstract: We consider a general online stochastic optimization problem with multiple budget constraints over a horizon of finite time periods. At each time period, a reward function and multiple cost functions, where each cost function is involved in the consumption of one corresponding budget, are drawn from an unknown distribution, which is assumed to be non-stationary across time. Then, a decision maker…  More
Submitted 12 December, 2020; originally announced December 2020.
Cited by 7
 Related articles All 3 versions 


arXiv:2012.06859  [pdf, other]  cs.CV
Spectral Unmixing With Multinomial Mixture Kernel and Wasserstein Generative Adversarial Loss
Authors: Savas Ozkan, Gozde Bozdagi Akar
Abstract: This study proposes a novel framework for spectral unmixing by using 1D convolution kernels and spectral uncertainty. High-level representations are computed from data, and they are further modeled with the Multinomial Mixture Model to estimate fractions under severe spectral uncertainty. Furthermore, a new trainable uncertainty term based on a nonlinear neural network model is introduced in the r…  More
Submitted 12 December, 2020; originally announced December 2020.
Comments: AI for Earth Sciences Workshop at NeurIPS 2020

[PDF] arxiv.org

Spectral Unmixing With Multinomial Mixture Kernel and Wasserstein Generative Adversarial Loss

S Ozkan, GB Akar - arXiv preprint arXiv:2012.06859, 2020 - arxiv.org

This study proposes a novel framework for spectral unmixing by using 1D convolution kernels and spectral uncertainty. High-level representations are computed from data, and they are further modeled with the Multinomial Mixture Model to estimate fractions under …

  Related articles All 2 versions 

Sobolev Wasserstein GAN

M Xu, Z Zhou, G Lu, J Tang, W Zhang, Y Yu - arXiv preprint arXiv …, 2020 - arxiv.org

Wasserstein GANs (WGANs), built upon the Kantorovich-Rubinstein (KR) duality of

Wasserstein distance, is one of the most theoretically sound GAN models. However, in

practice it does not always outperform other variants of GANs. This is mostly due to the …

  All 2 versions 

Sobolev Wasserstein GAN
by Xu, Minkai; Zhou, Zhiming; Lu, Guansong ; More...
12/2020
Wasserstein GANs (WGANs), built upon the Kantorovich-Rubinstein (KR) duality of Wasserstein distance, is one of the most theoretically sound GAN models....

Journal ArticleFull Text Online

 

online

EEG data augmentation using Wasserstein GAN

by Bouallegue, Ghaith; Djemal, Ridha

2020 20th International Conference on Sciences and Techniques of Automatic Control and Computer Engineering (STA), 12/2020

Electroencephalogram (EEG) presents a challenge during the classification task using machine learning and deep learning techniques due to the lack or to the...

Conference ProceedingFull Text Online
EEG data augmentation using Wasserstein GAN - IEEE Xplore

https://ieeexplore.ieee.org › document

by G Bouallegue · 2020 · Cited by 2 — EEG data augmentation using Wasserstein GAN. Abstract: Electroencephalogram (EEG) presents a challenge during the classification task using machine learning ...

Date Added to IEEE Xplore: 26 January 2021



 OPEN ACCESS

Data-driven Distributionally Robust Stochastic Optimization via Wasserstein Distance with...

by Singh, Derek  2020

University of Minnesota Ph.D. dissertation. December 2020. Major: Industrial Engineering. Advisor: Shuzhong Zhang. 1 computer file (PDF); xi, 190 pages. The...

Dissertation/ThesisCitation Online

LCS Graph Kernel Based on Wasserstein Distance in Longest Common Subsequence Metric Space
by Huang, Jianming; Fang, Zhongxi; Kasai, Hiroyuki
12/2020
For graph classification tasks, many methods use a common strategy to aggregate information of vertex neighbors. Although this strategy provides an efficient...

Journal ArticleFull Text Online


The Spectral-Domain W2 Wasserstein Distance for Elliptical Processes and the Spectral-Domain Gelbrich Bound

S FangQ Zhu - arXiv preprint arXiv:2012.04023, 2020 - arxiv.org

In this short note, we introduce the spectral-domain $\mathcal{W}_2$ Wasserstein distance for elliptical stochastic processes in terms of their power spectra. We also introduce the spectral-domain Gelbrich bound for processes that are not necessarily elliptical. Subjects: Statistics Theory (math.ST); Machine Learning (cs.LG); Signal Processing (eess.SP); Probability (math.PR); Machine Learning (stat.ML)

The Spectral-Domain $\mathcal{W}_2$ Wasserstein Distance for Elliptical Processes and the...
by Fang, Song; Zhu, Quanyan
12/2020
In this short note, we introduce the spectral-domain $\mathcal{W}_2$ Wasserstein distance for elliptical stochastic processes in terms of their power spectra....

Journal ArticleFull Text Online


Independent Elliptical Distributions Minimize Their W2 Wasserstein Distance from Independent Elliptical Distributions with the Same Density Generator

S FangQ Zhu - arXiv preprint arXiv:2012.03809, 2020 - arxiv.org

This short note is on a property of the $\mathcal {W} _2 $ Wasserstein distance which

indicates that independent elliptical distributions minimize their $\mathcal {W} _2 $

Wasserstein distance from given independent elliptical distributions with the same density

generators. Furthermore, we examine the implications of this property in the Gelbrich bound

when the distributions are not necessarily elliptical. Meanwhile, we also generalize the

results to the cases when the distributions are not independent. The primary purpose of this …

  All 2 versions 

Independent Elliptical Distributions Minimize Their $\mathcal{W}_2$ Wasserstein Distance from...
by Fang, Song; Zhu, Quanyan
12/2020
This short note is on a property of the $\mathcal{W}_2$ Wasserstein distance which indicates that independent elliptical distributions minimize their...

Journal ArticleFull Text Online

<——2020——2020——————  620 ——   


 

arXiv:2012.08850  [pdf, ps, other]  math.OC eess.SY
Consistency of Distributionally Robust Risk- and Chance-Constrained Optimization under Wasserstein Ambiguity Sets
Authors: Ashish Cherukuri, Ashish R. Hota
Abstract: We study stochastic optimization problems with chance and risk constraints, where in the latter, risk is quantified in terms of the conditional value-at-risk (CVaR). We consider the distributionally robust versions of these problems, where the constraints are required to hold for a family of distributions constructed from the observed realizations of the uncertainty via the Wasserstein distance. O…  More
Submitted 16 December, 2020; originally announced December 2020.

 

arXiv:2012.08674  [pdf, other]  cs.LG cs.CV
Wasserstein Contrastive Representation Distillation
Authors: Liqun Chen, Zhe Gan, Dong Wang, Jingjing Liu, Ricardo Henao, Lawrence Carin
Abstract: The primary goal of knowledge distillation (KD) is to encapsulate the information of a model learned from a teacher network into a student network, with the latter being more compact than the former. Existing work, e.g., using Kullback-Leibler divergence for distillation, may fail to capture important structural knowledge in the teacher network and often lacks the ability for feature generalizatio…  More
Submitted 15 December, 2020; originally announced December 2020.

arXiv:2012.08610  [pdf, other]  eess.SY cs.MA
Distributed Wasserstein Barycenters via Displacement Interpolation
Authors: Pedro Cisneros-Velarde, Francesco Bullo
Abstract: Consider a multi-agent system whereby each agent has an initial probability measure. In this paper, we propose a distributed algorithm based upon stochastic, asynchronous and pairwise exchange of information and displacement interpolation in the Wasserstein space. We characterize the evolution of this algorithm and prove it computes the Wasserstein barycenter of the initial measures under various…  More
Submitted 15 December, 2020; originally announced December 2020.
Comments: 25 pages, 4 figures
MSC Class: 60J20 (Primary); 49N99; 46N10 (Secondary)


[PDF] mlr.press

Wasserstein fair classification

R Jiang, A Pacchiano, T Stepleton… - Uncertainty in …, 2020 - proceedings.mlr.press

We propose an approach to fair classification that enforces independence between the

classifier outputs and sensitive information by minimizing Wasserstein-1 distances. The

approach has desirable theoretical properties and is robust to specific choices of the …

Cited by 78 Related articles All 5 versions 

[PDF] nips.cc

[PDF] Fair regression with wasserstein barycenters

E Chzhen, C Denis, M Hebiri, L Oneto… - Advances in Neural …, 2020 - papers.nips.cc

We study the problem of learning a real-valued function that satisfies the Demographic

Parity constraint. It demands the distribution of the predicted output to be independent of the

sensitive attribute. We consider the case that the sensitive attribute is available for …

Cited by 25 Related articles All 6 versions 

  [PDF] nips.cc

Projection robust Wasserstein distance and Riemannian optimization

T Lin, C Fan, N Ho, M Cuturi, M Jordan - Advances in Neural …, 2020 - papers.nips.cc

Projection robust Wasserstein (PRW) distance, or Wasserstein projection pursuit (WPP), is a

robust variant of the Wasserstein distance. Recent work suggests that this quantity is more

robust than the standard Wasserstein distance, in particular when comparing probability …

  Cited by 2 All 8 versions 


[PDF] nips.cc

[PDF] Continuous regularized wasserstein barycenters

L Li, A Genevay, M Yurochkin… - Advances in Neural …, 2020 - papers.nips.cc

Wasserstein barycenters provide a geometrically meaningful way to aggregate probability

distributions, built on the theory of optimal transport. They are difficult to compute in practice,

however, leading previous work to restrict their supports to finite sets of points. Leveraging a …

  Cited by 1 All 4 versions 

 

[PDF] arxiv.org

Improved complexity bounds in wasserstein barycenter problem

D Dvinskikh, D Tiapkin - arXiv preprint arXiv:2010.04677, 2020 - arxiv.org

In this paper, we focus on computational aspects of Wasserstein barycenter problem. We

provide two algorithms to compute Wasserstein barycenter of $ m $ discrete measures of

size $ n $ with accuracy $\varepsilon $. The first algorithm, based on mirror prox with some …

  Cited by 2 All 2 versions 


[PDF] nips.cc

[PDF] The Wasserstein Proximal Gradient Algorithm

A Salim, A Korba, G Luise - Advances in Neural Information …, 2020 - papers.nips.cc

Wasserstein gradient flows are continuous time dynamics that define curves of steepest

descent to minimize an objective function over the space of probability measures (ie, the

Wasserstein space). This objective is typically a divergence wrt a fixed target distribution. In …

   Cited by 35 Related articles All 5 versions

 
 

[HTML] mdpi.com

Fused Gromov-Wasserstein distance for structured objects

T Vayer, L Chapel, R Flamary, R Tavenard, N Courty - Algorithms, 2020 - mdpi.com

Optimal transport theory has recently found many applications in machine learning thanks to

its capacity to meaningfully compare various machine learning objects that are viewed as

distributions. The Kantorovitch formulation, leading to the Wasserstein distance, focuses on …

Cited by 20 Related articles All 32 versions 

<——2020—–—2020———————  630 ——    


[PDF] mlr.press

Stronger and faster Wasserstein adversarial attacks

K Wu, A Wang, Y Yu - International Conference on Machine …, 2020 - proceedings.mlr.press

Deep models, while being extremely flexible and accurate, are surprisingly vulnerable to

“small, imperceptible” perturbations known as adversarial attacks. While the majority of

existing attacks focus on measuring perturbations under the $\ell_p $ metric, Wasserstein  …

Cited by 8 Related articles All 10 versions 

[PDF] arxiv.org

Wasserstein distributionally robust stochastic control: A data-driven approach

I Yang - IEEE Transactions on Automatic Control, 2020 - ieeexplore.ieee.org

Standard stochastic control methods assume that the probability distribution of uncertain

variables is available. Unfortunately, in practice, obtaining accurate distribution information

is a challenging task. To resolve this issue, we investigate the problem of designing a control …

 Cited by 69 Related articles All 3 versions
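
As background for the distributionally robust entries collected here (standard material, not text from the article above): a Wasserstein ambiguity set is the ball $\mathcal{B}_\varepsilon(\hat{P}_N) = \{Q : W_p(Q, \hat{P}_N) \le \varepsilon\}$ around the empirical distribution $\hat{P}_N$ built from the observed samples, and the data-driven controller or decision is chosen to minimize the worst-case expected cost $\sup_{Q \in \mathcal{B}_\varepsilon(\hat{P}_N)} \mathbb{E}_Q[\ell(x,\xi)]$.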

Importance-Aware Semantic Segmentation in Self-Driving with ...

arxiv.org › cs

Oct 21, 2020 — Semantic segmentation (SS) is an important perception manner for self-driving cars and robotics, which classifies each pixel into a pre-determined class. ... In our extensive experiments, Wasserstein loss demonstrates superior segmentation performance on the predefined critical classes for safe-driving.

by X Liu · ‎2020 · ‎Cited by 6 · ‎Related articles

[PDF] aaai.org

[CITATION] Importance-Aware Semantic Segmentation in Self-Driving with Discrete Wasserstein Training.

X Liu, Y Han, S Bai, Y Ge, T Wang, X Han, S Li, J You… - AAAI, 2020

Cited by 21 Related articles All 7 versions 

 

Adapted wasserstein distances and stability in mathematical finance

J Backhoff-Veraguas, D Bartl, M Beiglböck, M Eder - Finance and Stochastics, 2020 - Springer

Assume that an agent models a financial asset through a measure Q with the goal to

price/hedge some derivative or optimise some expected utility. Even if the model Q is

chosen in the most skilful and sophisticated way, the agent is left with the possibility that Q …

Cited by 36 Related articles All 20 versions

[CITATION] Adapted wasserstein distances and stability in mathematical finance. arXiv e-prints, page

J Backhoff-Veraguas, D Bartl, M Beiglböck, M Eder - arXiv preprint arXiv:1901.07450, 2019

  Cited by 4 Related articles


[PDF] mlr.press

Bridging the gap between f-gans and wasserstein gans

J Song, S Ermon - International Conference on Machine …, 2020 - proceedings.mlr.press

Generative adversarial networks (GANs) variants approximately minimize divergences

between the model and the data distribution using a discriminator. Wasserstein GANs

(WGANs) enjoy superior empirical performance, however, unlike in f-GANs, the discriminator …

  Cited by 5 Related articles All 2 versions 


 2020


Wasserstein distributionally robust inverse multiobjective optimization

C Dong, B Zeng - arXiv preprint arXiv:2009.14552, 2020 - arxiv.org

Inverse multiobjective optimization provides a general framework for the unsupervised

learning task of inferring parameters of a multiobjective decision making problem (DMP),

based on a set of observed decisions from the human expert. However, the performance of …

  Cited by 2 All 2 versions 

 

[PDF] nips.cc

[PDF] Quantifying the Empirical Wasserstein Distance to a Set of Measures: Beating the Curse of Dimensionality

N Si, J Blanchet, S Ghosh… - Advances in Neural …, 2020 - papers.nips.cc

We consider the problem of estimating the Wasserstein distance between the empirical

measure and a set of probability measures whose expectations over a class of functions

(hypothesis class) are constrained. If this class is sufficiently rich to characterize a particular …

Cited by 8 Related articles All 3 versions 


 [PDF] mlr.press

Robust Document Distance with Wasserstein-Fisher-Rao metric

Z Wang, D Zhou, M Yang, Y Zhang… - Asian Conference on …, 2020 - proceedings.mlr.press

Computing the distance among linguistic objects is an essential problem in natural

language processing. The word mover's distance (WMD) has been successfully applied to

measure the document distance by synthesizing the low-level word similarity with the …

  Cited by 3 Related articles All 2 versions 

[PDF] nips.cc

[PDF] Deep Diffusion-Invariant Wasserstein Distributional Classification

SW Park, DW Shu, J Kwon - Advances in Neural Information …, 2020 - papers.nips.cc

In this paper, we present a novel classification method called deep diffusion-invariant

Wasserstein distributional classification (DeepWDC). DeepWDC represents input data and

labels as probability measures to address severe perturbations in input data. It can output 

 Related articles All 2 versions 

  

[PDF] iop.org

Full View

Wasserstein metric for improved quantum machine learning with adjacency matrix representations

O Çaylak, OA von Lilienfeld… - … Learning: Science and …, 2020 - iopscience.iop.org

We study the Wasserstein metric to measure distances between molecules represented by

the atom-index-dependent adjacency 'Coulomb' matrix, used in kernel ridge regression based

supervised learning. Resulting machine learning models of quantum properties, aka …

 Cited by 10 Related articles All 5 versions

<——2020———2020——————640 —  


[PDF] nips.cc

[PDF] Wasserstein Distances for Stereo Disparity Estimation

D Garg, Y Wang, B Hariharan, M Campbell… - Advances in Neural …, 2020 - papers.nips.cc

Existing approaches to depth or disparity estimation output a distribution over a set of pre-

defined discrete values. This leads to inaccurate results when the true depth or disparity

does not match any of these values. The fact that this distribution is usually learned indirectly …

Cited by 18 Related articles All 6 versions 



[PDF] researchgate.net

[PDF] STATISTICAL INFERENCE FOR BURES-WASSERSTEIN BARYCENTERS BY ALEXEY KROSHNIN, VLADIMIR SPOKOINY AND ALEXANDRA …

A KROSHNIN - researchgate.net

… –Wasserstein barycen… –Wasserstein distance, and explain, how the obtained results are

connected to optimal transportation theory and can be applied to statistical inference in quantum …

Related articles 

[PDF] mlr.press

Approximate inference with wasserstein gradient flows

C Frogner, T Poggio - International Conference on Artificial …, 2020 - proceedings.mlr.press

We present a novel approximate inference method for diffusion processes, based on the

Wasserstein gradient flow formulation of the diffusion. In this formulation, the time-dependent

density of the diffusion is derived as the limit of implicit Euler steps that follow the gradients …

Cited by 18 Related articles All 6 versions 


[PDF] arxiv.org

A Wasserstein-type distance in the space of Gaussian mixture models

J Delon, A Desolneux - SIAM Journal on Imaging Sciences, 2020 - SIAM

In this paper we introduce a Wasserstein-type distance on the set of Gaussian mixture

models. This distance is defined by restricting the set of possible coupling measures in the

optimal transport problem to Gaussian mixture models. We derive a very simple discrete …

 Cited by 37 Related articles All 7 versions
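
As a side note to the Gaussian-mixture entries above: the building block of such GMM-restricted Wasserstein distances is the closed-form $W_2$ distance between two Gaussians, $W_2^2(\mathcal{N}(m_1,\Sigma_1), \mathcal{N}(m_2,\Sigma_2)) = \|m_1 - m_2\|^2 + \mathrm{tr}\big(\Sigma_1 + \Sigma_2 - 2(\Sigma_2^{1/2}\Sigma_1\Sigma_2^{1/2})^{1/2}\big)$. A minimal Python sketch of this standard formula (my own illustration, not code from the cited paper):

import numpy as np
from scipy.linalg import sqrtm

def w2_gaussian(m1, S1, m2, S2):
    # Closed-form 2-Wasserstein distance between N(m1, S1) and N(m2, S2).
    S2_half = sqrtm(S2)
    cross = sqrtm(S2_half @ S1 @ S2_half).real  # discard tiny imaginary round-off
    bures_sq = np.trace(S1 + S2 - 2.0 * cross)
    return np.sqrt(np.sum((m1 - m2) ** 2) + max(bures_sq, 0.0))

# Example: two 2-D Gaussians
m1, S1 = np.zeros(2), np.eye(2)
m2, S2 = np.ones(2), 2.0 * np.eye(2)
print(w2_gaussian(m1, S1, m2, S2))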



[CITATION] Wasserstein 幾何学と情報幾何学 (特集 情報幾何学の探究: 基礎と応用, 現状と展望に迫る)

高津飛鳥 - 数理科学, 2020 - ci.nii.ac.jp


[Japanese: Wasserstein Geometry and Information Geometry (Special Feature: Exploring Information Geometry: Foundations and Applications, Current Status and Prospects)]


2020


Wasserstein距离的条件双向学习推理--《河北大学》2020 ... [Chinese: Conditional bidirectional learning and inference with the Wasserstein distance, Hebei University, 2020]

cdmd.cnki.com.cn › Article › CDMD...

by 刘轶功 · 2020 — [Translated from Chinese] The thesis proposes a bidirectional learning and inference (WBLI) model based on the Wasserstein distance, which integrates the encoder and the generator bidirectionally into a Wasserstein-distance-based generative adversarial network, in order to address the ... in generative adversarial networks.

[CITATION] 基于 Wasserstein 距离的双向学习推理

花强, 刘轶功, 张峰, 董春茹 - 河北大学学报 (自然科学版)

 [Chinese  Two-way learning and reasoning based on Wasserstein distance]


Convergence and concentration of empirical measures under Wasserstein distance in unbounded functional spaces

J Lei - Bernoulli, 2020 - projecteuclid.org

We provide upper bounds of the expected Wasserstein distance between a probability measure and its empirical version, generalizing recent results for finite dimensional Euclidean spaces and bounded functional spaces. Such a generalization can cover …

  Cited by 43 Related articles All 5 versions


Semi-supervised Data-driven Surface Wave Tomography ...

www.essoar.org › doi › essoar.10505231.1

by A Cai — Semi-supervised Data-driven Surface Wave Tomography using Wasserstein Cycle-consistent GAN: Application on Southern California Plate ...

[CITATION] Semi-supervised Data-driven Surface Wave Tomography using Wasserstein Cycle-consistent GAN: Application on Southern California Plate Boundary …

A Cai, H QiuF Niu - AGU Fall Meeting 2020, 2020 - agu.confex.com

Semi-supervised Data-driven Surface Wave Tomography using Wasserstein Cycle-consistent GAN: Application on Southern California Plate Boundary Region

A Cai, H QiuF Niu - 2020 - essoar.org

Current machine learning based shear wave velocity (Vs) inversion using surface wave

dispersion measurements utilizes synthetic dispersion curves calculated from existing 3-D

velocity models as training datasets. It is shown in the previous studies that the …

  All 3 versions

  

[PDF] projecteuclid.org

Donsker's theorem in Wasserstein-1 distance

L Coutin, L Decreusefond - Electronic Communications in …, 2020 - projecteuclid.org

We compute the Wasserstein-1 (or Kantorovitch-Rubinstein) distance between a random

walk in $\mathbf {R}^{d} $ and the Brownian motion. The proof is based on a new estimate of

the modulus of continuity of the solution of the Stein's equation. As an application, we can …

Cited by 6 Related articles All 51 versions


[v2] Thu, 4 Jun 2020 12:51:56 UTC (240 KB)

[CITATION] Wasserstein gradient flow formulation of the time-fractional Fokker-Planck equation

B Jin, MH Duong - Communications in Mathematical Sciences, 2020 - discovery.ucl.ac.uk

… Wasserstein gradient flow formulation of the time-fractional Fokker-Planck equation. Jin, B; Duong,

MH; (2020) Wasserstein gradient flow formulation of the time-fractional Fokker-Planck equation.

Communications in Mathematical Sciences (In press). …

<——2020—————2020—————650 —

 

Multi-Band Image Synchronous Super-Resolution and Fusion Method Based on Improved WGAN-GP

By: Tian Songwang, Lin Suzhen, Lei Haiwei; et al.

ACTA OPTICA SINICA  Volume: 40   Issue: 20   Article Number: 2010001   Published: OCT 25 2020


[HTML] hindawi.com

[HTML] Motion Deblurring in Image Color Enhancement by WGAN

J Feng, S Qi - International Journal of Optics, 2020 - hindawi.com

Motion deblurring and image enhancement are active research areas over the years.

Although the CNN-based model has an advanced state of the art in motion deblurring and

image enhancement, it fails to produce multitask results when challenged with the images of …

   Cited by 1 Related articles All 4 versions 
  

Spam transaction attack detection model based on GRU and WGAN-div

J Yang, T Li, G Liang, YP Wang, TY Gao… - Computer …, 2020 - Elsevier

A Spam Transaction attack is a kind of hostile attack activity specifically targeted against a

Cryptocurrency Network. Traditional network intrusion detection methods lack the capability

of automatic feature extraction for spam transaction attacks, and thus the detection efficiency …

   Cited by 1 Related articles All 2 versions


A TextCNN and WGAN-gp based deep learning frame for unpaired text style transfer in multimedia services

M Hu, M He, W Su, A Chehri - Multimedia Systems, 2020 - Springer

With the rapid growth of big multimedia data, multimedia processing techniques are facing

some challenges, such as knowledge understanding, semantic modeling, feature

representation, etc. Hence, based on TextCNN and WGAN-gp (improved training of …

 

 

An enhanced uncertainty principle for the Vaserstein distance

arxiv.org › math

Mar 6, 2020 — We improve some recent results of Sagiv and Steinerberger that quantify the following uncertainty principle: for a function f with mean zero, either ...

by T Carroll · ‎2020 ·

 Cited by 2 · ‎Related articles  Zbl 07326676    MR4224354 

An enhanced uncertainty principle for the Vaserstein distance

By: Carroll, Tom; Massaneda, Xavier; Ortega-Cerda, Joaquim

BULLETIN OF THE LONDON MATHEMATICAL SOCIETY  Volume: 52   Issue: 6   Pages: 1158-1173   Published: DEC

Early Access: JUL 2020    

arXiv:2003.03165  [pdf, ps, other]

An enhanced uncertainty principle for the Vaserstein distance 

Authors: Tom Carroll, Xavier Massaneda, Joaquim Ortega-Cerda 

Abstract: We improve on some recent results of Sagiv and Steinerberger that quantify the following uncertainty principle: for a function

f with mean zero, then either the size of the zero set of the function or the cost of transporting the mass of the positive part of

f to its negative part must be big. We also provide a sharp upper estimate of the transport cost of the positive part of an eigenfunction… More 

Submitted

[v1] Fri, 6 Mar 2020 12:52:08 UTC (14 KB)

[v2] Fri, 13 Mar 2020 15:25:38 UTC (14 KB)

Cited by 1 Related articles All 3 versions 

BULLETIN OF THE LONDON MATHEMATICAL SOCIETY     

Cited by 2 Related articles All 3 versions View as HTML


2020

   

arXiv:2012.09999  [pdf, other]  stat.ME
Interpretable Model Summaries Using the Wasserstein Distance
Authors: Eric Dunipace, Lorenzo Trippa
Abstract: In the current computing age, models can have hundreds or even thousands of parameters; however, such large models decrease the ability to interpret and communicate individual parameters. Reducing the dimensionality of the parameter space in the estimation phase is a commonly used technique, but less work has focused on selecting subsets of the parameters to focus on for interpretation--especially…  More
Submitted 17 December, 2020; originally announced December 2020.


Tensor product and Hadamard product for the Wasserstein means

J Hwang, S Kim - Linear Algebra and its Applications, 2020 - Elsevier

As one of the least squares mean, we consider the Wasserstein mean of positive definite

Hermitian matrices. We verify in this paper the inequalities of the Wasserstein mean related

with a strictly positive and unital linear map, the identity of the Wasserstein mean for tensor …

  Related articles All 2 versions


Wasserstein Metric Based Adaptive Fuzzy Clustering Methods ...

www.researchgate.net › publication › 307831044_Wasser...

Sep 30, 2020 — Wasserstein Metric Based Adaptive Fuzzy Clustering Methods for Symbolic Interval Data

[HTML] springer.com

[HTML] Wasserstein and Kolmogorov error bounds for variance-gamma approximation via Stein's method I

RE Gaunt - Journal of Theoretical Probability, 2020 - Springer

The variance-gamma (VG) distributions form a four-parameter family that includes as special

and limiting cases the normal, gamma and Laplace distributions. Some of the numerous

applications include financial modelling and approximation on Wiener space. Recently …

  Cited by 13 Related articles All 6 versions


[PDF] esaim-proc.org

Statistical data analysis in the Wasserstein space

J Bigot - ESAIM: Proceedings and Surveys, 2020 - esaim-proc.org

This paper is concerned by statistical inference problems from a data set whose elements

may be modeled as random probability measures such as multiple histograms or point

clouds. We propose to review recent contributions in statistics on the use of Wasserstein  …

  Cited by 3 Related articles All 3 versions
<——2020————————2020———————  660 —  


(PDF) Spectral gaps in Wasserstein distances and the 2D ...

www.researchgate.net › publication › 2126057_Spectral_...

Oct 6, 2020 — We finally show that the latter condition is satisfied by the two-dimensional stochastic Navier--Stokes equations, even in situations where the ...

Spectral gaps in Wasserstein distances and the 2D stochastic Navier-Stokes equations.


Functional Data Clustering Analysis via the Learning of Gaussian Processes with Wasserstein Distance

T Li, J Ma - International Conference on Neural Information …, 2020 - Springer

33 days ago - Functional data clustering analysis becomes an urgent and challenging task in

the new era of big data. In this paper, we propose a new framework for functional data

clustering analysis, which adopts a similar structure as the k-means algorithm for the …

Cited by 2 Related articles All 2 versions

[PDF] arxiv.org

Entropic regularization of Wasserstein distance between infinite-dimensional Gaussian measures and Gaussian processes

MH Quang - arXiv preprint arXiv:2011.07489, 2020 - arxiv.org

36 days ago - This work studies the entropic regularization formulation of the 2-Wasserstein

distance on an infinite-dimensional Hilbert space, in particular for the Gaussian setting. We

first present the Minimum Mutual Information property, namely the joint measures of two …

  All 2 versions 

[CITATION] Entropic regularization of Wasserstein distance between infinite-dimensional Gaussian measures and Gaussian processes

HQ Minh - preprint, 2020

Cited by 4 Related articles

[PDF] arxiv.org

Hierarchical Gaussian Processes with Wasserstein-2 Kernels

S Popescu, D Sharp, J Cole, B Glocker - arXiv preprint arXiv:2010.14877, 2020 - arxiv.org

54 days ago - We investigate the usefulness of Wasserstein-2 kernels in the context of

hierarchical Gaussian Processes. Stemming from an observation that stacking Gaussian

Processes severely diminishes the model's ability to detect outliers, which when combined …

Cited by 3 Related articles All 3 versions 

Learning Wasserstein Distance-Based Gaussian Graphical Model for Multivariate Time Series Classification

HU Xuegang, L Jianxing, LI Peipei… - 2020 IEEE …, 2020 - ieeexplore.ieee.org

134 days ago - Multivariate time series classification occupies an important position in time

series data mining tasks and has been applied in many fields. However, due to the statistical

coupling between different variables of Multivariate Time Series (MTS) data, traditional …

Cited by 1 Related articles All 3 versions


[PDF] arxiv.org

Entropy-Regularized 2-Wasserstein Distance between Gaussian Measures

A Mallasto, A Gerolin, HQ Minh - arXiv preprint arXiv:2006.03416, 2020 - arxiv.org

197 days ago - Gaussian distributions are plentiful in applications dealing in uncertainty

quantification and diffusivity. They furthermore stand as important special cases for

frameworks providing geometries for probability measures, as the resulting geometry on …

Cited by 9 Related articles All 2 versions 

[PDF] researchgate.net

Non-Gaussian BLE-Based Indoor Localization Via Gaussian Sum Filtering Coupled with Wasserstein Distance

P Malekzadeh, S Mehryar, P Spachos… - ICASSP 2020-2020 …, 2020 - ieeexplore.ieee.org

230 days ago - With recent breakthroughs in signal processing, communication and

networking systems, we are more and more surrounded by smart connected devices

empowered by the Internet of Thing (IoT). Bluetooth Low Energy (BLE) is considered as the …

Cited by 4 Related articles All 3 versions

A novel kernel Wasserstein distance on Gaussian measures: An application of identifying dental artifacts in head and neck computed tomography

JH Oh, M Pouryahya, A Iyer, AP Apte, JO Deasy… - Computers in Biology …, 2020 - Elsevier

270 days ago - The Wasserstein distance is a powerful metric based on the theory of optimal

mass transport. It gives a natural measure of the distance between two distributions with a

wide range of applications. In contrast to a number of the common divergences on …

 Cited by 12 Related articles All 4 versions


 Learning Graphons via Structured Gromov-Wasserstein Barycenters

H Xu, D Luo, L Carin, H Zha - arXiv preprint arXiv:2012.05644, 2020 - arxiv.org

11 days ago - We propose a novel and principled method to learn a nonparametric graph

model called graphon, which is defined in an infinite-dimensional space and represents

arbitrary-size graphs. Based on the weak regularity lemma from the theory of graphons, we …


[PDF] arxiv.org

Partial Gromov-Wasserstein Learning for Partial Graph Matching

W Liu, C Zhang, J Xie, Z Shen, H Qian… - arXiv preprint arXiv …, 2020 - arxiv.org

18 days ago - Graph matching finds the correspondence of nodes across two graphs and is a

basic task in graph-based machine learning. Numerous existing methods match every node

in one graph to one node in the other graph whereas two graphs usually overlap partially in …

  All 4 versions 

[PDF] arxiv.org

<——2020——2020————————  670——    


Classification of atomic environments via the Gromov–Wasserstein distance

S Kawano, JK Mason - Computational Materials Science, 2020 - Elsevier

47 days ago - Interpreting molecular dynamics simulations usually involves automated

classification of local atomic environments to identify regions of interest. Existing approaches

are generally limited to a small number of reference structures and only include limited …

  All 4 versions


[PDF] arxiv.org

Improving Relational Regularized Autoencoders with Spherical Sliced Fused Gromov Wasserstein

K Nguyen, S Nguyen, N Ho, T Pham, H Bui - arXiv preprint arXiv …, 2020 - arxiv.org

76 days ago - Relational regularized autoencoder (RAE) is a framework to learn the

distribution of data by minimizing a reconstruction loss together with a relational

regularization on the latent space. A recent attempt to reduce the inner discrepancy between …

  All 3 versions 


[PDF] openreview.net

[PDF] IMPROVING RELATIONAL REGULARIZED AUTOENCODERS WITH SPHERICAL SLICED FUSED GROMOV WASSERSTEIN

openreview.net

80 days ago - Relational regularized autoencoder (RAE) is a framework to learn the

distribution of data by minimizing a reconstruction loss together with a relational

regularization on the latent space. A recent attempt to reduce the inner discrepancy between …

  All 2 versions 


[PDF] arxiv.org

The Unbalanced Gromov Wasserstein Distance: Conic Formulation and Relaxation

T Séjourné, FX Vialard, G Peyré - arXiv preprint arXiv:2009.04266, 2020 - arxiv.org

103 days ago - Comparing metric measure spaces (ie a metric space endowed with a

probability distribution) is at the heart of many machine learning problems. This includes for

instance predicting properties of molecules in quantum chemistry or generating graphs with …

  Cited by 1 All 2 version


 

[PDF] arxiv.org

Gromov-Wasserstein Distance based Object Matching: Asymptotic Inference

CA Weitkamp, K Proksch, C Tameling… - arXiv preprint arXiv …, 2020 - arxiv.org

182 days ago - In this paper, we aim to provide a statistical theory for object matching based

on the Gromov-Wasserstein distance. To this end, we model general objects as metric

measure spaces. Based on this, we propose a simple and efficiently computable asymptotic …

Cited by 1 Related articles All 6 versions 


[PDF] arxiv.org

Generalized Spectral Clustering via Gromov-Wasserstein Learning

S Chowdhury, T Needham - arXiv preprint arXiv:2006.04163, 2020 - arxiv.org

195 days ago - We establish a bridge between spectral clustering and Gromov-Wasserstein

Learning (GWL), a recent optimal transport-based approach to graph partitioning. This

connection both explains and improves upon the state-of-the-art performance of GWL. The …

  Related articles All 2 versions 

[PDF] arxiv.org

Partial Gromov-Wasserstein with Applications on Positive-Unlabeled Learning

L Chapel, MZ Alaya, G Gasso - arXiv preprint arXiv:2002.08276, 2020 - arxiv.org

305 days ago - Optimal Transport (OT) framework allows defining similarity between

probability distributions and provides metrics such as the Wasserstein and Gromov-

Wasserstein discrepancies. Classical OT problem seeks a transportation map that preserves …


Gromov-Wasserstein optimal transport to align single-cell multi-omics data

P Demetci, R Santorella, B Sandstede, WS Noble… - BioRxiv, 2020 - biorxiv.org

236 days ago - Data integration of single-cell measurements is critical for our understanding of

cell development and disease, but the lack of correspondence between different types of

single-cell measurements makes such efforts challenging. Several unsupervised algorithms …

<——2020————2020—————————  680——    


[PDF] arxiv.org

Central limit theorems for Markov chains based on their convergence rates in Wasserstein distance

R Jin, A Tan - arXiv preprint arXiv:2002.09427, 2020 - arxiv.org

Many tools are available to bound the convergence rate of Markov chains in total variation

(TV) distance. Such results can be used to establish central limit theorems (CLT) that enable

error evaluations of Monte Carlo estimates in practice. However, convergence analysis …

  Cited by 1 Related articles All 3 versions 

[PDF] arxiv.org

A wasserstein-type distance in the space of gaussian mixture models

J Delon, A Desolneux - SIAM Journal on Imaging Sciences, 2020 - SIAM

In this paper we introduce a Wasserstein-type distance on the set of Gaussian mixture

models. This distance is defined by restricting the set of possible coupling measures in the

optimal transport problem to Gaussian mixture models. We derive a very simple discrete …

  Cited by 7 Related articles All 10 versions


[PDF] neurips.cc

[PDF] Quantile Propagation for Wasserstein-Approximate Gaussian Processes

R Zhang, C Walder, EV Bonilla… - Advances in Neural …, 2020 - proceedings.neurips.cc

Approximate inference techniques are the cornerstone of probabilistic methods based on

Gaussian process priors. Despite this, most work approximately optimizes standard

divergence measures such as the Kullback-Leibler (KL) divergence, which lack the basic …

 Cited by 2 Related articles All 7 versions 

[PDF] arxiv.org

Projection robust Wasserstein distance and Riemannian optimization

T Lin, C Fan, N Ho, M Cuturi, MI Jordan - arXiv preprint arXiv:2006.07458, 2020 - arxiv.org

Projection robust Wasserstein (PRW) distance, or Wasserstein projection pursuit (WPP), is a

robust variant of the Wasserstein distance. Recent work suggests that this quantity is more

robust than the standard Wasserstein distance, in particular when comparing probability …

  Cited by 2 All 7 versions 

[PDF] arxiv.org

Riemannian normalizing flow on variational wasserstein autoencoder for text modeling

PZ Wang, WY Wang - arXiv preprint arXiv:1904.02399, 2019 - arxiv.org

Recurrent Variational Autoencoder has been widely used for language modeling and text

generation tasks. These models often face a difficult optimization problem, also known as

the Kullback-Leibler (KL) term vanishing issue, where the posterior easily collapses to the …

  Cited by 12 Related articles All 4 versions 

Riemannian Normalizing Flow on Variational Wasserstein Autoencoder for Text Modeling

P Zizhuang Wang, WY Wang - arXiv, 2019 - ui.adsabs.harvard.edu

Abstract Recurrent Variational Autoencoder has been widely used for language modeling

and text generation tasks. These models often face a difficult optimization problem, also

known as the Kullback-Leibler (KL) term vanishing issue, where the posterior easily …



[PDF] thecvf.com

Gromov-Wasserstein Averaging in a Riemannian Framework

S Chowdhury, T Needham - Proceedings of the IEEE/CVF …, 2020 - openaccess.thecvf.com

We introduce a theoretical framework for performing statistical tasks-including, but not

limited to, averaging and principal component analysis-on the space of (possibly

asymmetric) matrices with arbitrary entries and sizes. This is carried out under the lens of the …

  Cited by 4 Related articles All 3 versions 


[PDF] arxiv.org

A Rademacher-type theorem on L2-Wasserstein spaces over closed Riemannian manifolds

LD Schiavo - Journal of Functional Analysis, 2020 - Elsevier

Let P be any Borel probability measure on the L2-Wasserstein space (P2(M), W2) over a closed Riemannian manifold M. We consider the Dirichlet form E induced by P and by the Wasserstein gradient on P2(M). Under natural assumptions on P, we show that W2 …

  Cited by 5 Related articles All 6 versions


[PDF] dergipark.org.tr

Wasserstein Riemannian Geometry on Statistical Manifold

C Ogouyandjou, N Wadagni - International Electronic Journal of …, 2020 - dergipark.org.tr

In this paper, we study some geometric properties of statistical manifold equipped with the

Riemannian Otto metric which is related to the L 2-Wasserstein distance of optimal mass

transport. We construct some α-connections on such manifold and we prove that the …

  All 2 versions 


[PDF] arxiv.org

Ensemble Riemannian Data Assimilation over the Wasserstein Space

SK Tamang, A Ebtehaj, PJ Van Leeuwen, D Zou… - arXiv preprint arXiv …, 2020 - arxiv.org

In this paper, we present a new ensemble data assimilation paradigm over a Riemannian

manifold equipped with the Wasserstein metric. Unlike Eulerian penalization of error in the

Euclidean space, the Wasserstein metric can capture translation and shape difference …

  All 4 versions 

<————2020———————2020 ———————-690—


[PDF] Wasserstein Riemannian geometry of Gamma densities

C Ogouyandjou, N Wadagni - Computer Science, 2020 - ijmcs.future-in-tech.net

Abstract A Wasserstein Riemannian Gamma manifold is a space of Gamma probability

density functions endowed with the Riemannian Otto metric which is related to the

Wasserstein distance. In this paper, we study some geometric properties of such Riemanian …


[PDF] arxiv.org

A Riemannian Block Coordinate Descent Method for Computing the Projection Robust Wasserstein Distance

M Huang, S Ma, L Lai - arXiv preprint arXiv:2012.05199, 2020 - arxiv.org

The Wasserstein distance has become increasingly important in machine learning and deep

learning. Despite its popularity, the Wasserstein distance is hard to approximate because of

the curse of dimensionality. A recently proposed approach to alleviate the curse of …

  

A Riemannian submersion‐based approach to the Wasserstein barycenter of positive definite matrices

M Li, H Sun, D Li - Mathematical Methods in the Applied …, 2020 - Wiley Online Library

In this paper, we introduce a novel geometrization on the space of positive definite matrices,

derived from the Riemannian submersion from the general linear group to the space of

positive definite matrices, resulting in easier computation of its geometric structure. The …

  Related articles


arXiv:2012.10701  [pdf, other]  math.AP math.PR
Entropic-Wasserstein barycenters: PDE characterization, regularity and CLT
Authors: Guillaume Carlier, Katharina Eichinger, Alexey Kroshnin
Abstract: In this paper, we investigate properties of entropy-penalized Wasserstein barycenters introduced by Bigot, Cazelles and Papadakis (2019) as a regularization of Wasserstein barycenters first presented by Agueh and Carlier (2011). After characterizing these barycenters in terms of a system of Monge-Ampère equations, we prove some global moment and Sobolev bounds as well as higher regularity properti…  More
Submitted 19 December, 2020; originally announced December 2020.
MSC Class: 49Q25; 35J96; 60B12

[PDF] arxiv.org

Entropic-Wasserstein barycenters: PDE characterization, regularity and CLT

G Carlier, K Eichinger, A Kroshnin - arXiv preprint arXiv:2012.10701, 2020 - arxiv.org

In this paper, we investigate properties of entropy-penalized Wasserstein barycenters

introduced by Bigot, Cazelles and Papadakis (2019) as a regularization of Wasserstein

barycenters first presented by Agueh and Carlier (2011). After characterizing these …

Related articles All 5 versions 

arXiv:2012.10514  [pdf, ps, other]  math.AT math.AP math.MG
Virtual persistence diagrams, signed measures, and Wasserstein distance
Authors: Peter Bubenik, Alex Elchesen
Abstract: Persistence diagrams, an important summary in topological data analysis, consist of a set of ordered pairs, each with positive multiplicity. Persistence diagrams are obtained via Mobius inversion and may be compared using a one-parameter family of metrics called Wasserstein distances. In certain cases, Mobius inversion produces sets of ordered pairs which may have negative multiplicity. We call th…  More
Submitted 18 December, 2020; originally announced December 2020.
Comments: 30 pages

[PDF] arxiv.org

Virtual persistence diagrams, signed measures, and Wasserstein distance

P Bubenik, A Elchesen - arXiv preprint arXiv:2012.10514, 2020 - arxiv.org

Persistence diagrams, an important summary in topological data analysis, consist of a set of

ordered pairs, each with positive multiplicity. Persistence diagrams are obtained via Mobius

inversion and may be compared using a one-parameter family of metrics called Wasserstein  …

  All 2 versions 


Kantorovich–Rubinstein–Wasserstein distance between overlapping attractor and repeller

V Chigarev, A Kazakov, A Pikovsky - Chaos: An Interdisciplinary …, 2020 - aip.scitation.org

We consider several examples of dynamical systems demonstrating overlapping attractor

and repeller. These systems are constructed via introducing controllable dissipation to

prototypic models with chaotic dynamics (Anosov cat map, Chirikov standard map, and …

 Cited by 11 Related articles All 7 versions

On the Computation of Kantorovich--Wasserstein Distances Between Two-Dimensional Histograms by Uncapacitated Minimum Cost Flows

F Bassetti, S Gualandi, M Veneroni - SIAM Journal on Optimization, 2020 - SIAM

In this work, we present a method to compute the Kantorovich--Wasserstein distance of

order 1 between a pair of two-dimensional histograms. Recent works in computer vision and

machine learning have shown the benefits of measuring Wasserstein distances of order 1 …

  Cited by 1 All 2 versions


[PDF] arxiv.org

Dynamic facial expression generation on hilbert hypersphere with conditional wasserstein generative adversarial nets

N Otberdout, M Daoudi, A Kacem… - … on Pattern Analysis …, 2020 - ieeexplore.ieee.org

In this work, we propose a novel approach for generating videos of the six basic facial

expressions given a neutral face image. We propose to exploit the face geometry by

modeling the facial landmarks motion as curves encoded as points on a hypersphere. By …

  Cited by 6 Related articles All 2 versions

[PDF] amazonaws.com

[PDF] Bayesian Wasserstein GAN and Application for Vegetable Disease Image Data

W Cho, MH Na, S Kang, S Kim - manuscriptlink-society-file.s3 …

Various GAN models have been proposed so far and they are used in various fields.

However, despite the excellent performance of these GANs, the biggest problem is that the

model collapse occurs in the simultaneous optimization of the generator and discriminator of …

September 13, 2020 


 Approximate Bayesian computation with the sliced-Wasserstein distance

K Nadjahi, V De Bortoli, A Durmus… - ICASSP 2020-2020 …, 2020 - ieeexplore.ieee.org

Approximate Bayesian Computation (ABC) is a popular method for approximate inference in

generative models with intractable but easy-to-sample likelihood. It constructs an

approximate posterior distribution by finding parameters for which the simulated data are …

  Cited by 2 Related articles All 7 versions
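
For orientation on the sliced-Wasserstein discrepancy used in the ABC entry above, here is a minimal sketch under my own assumptions (equal-size samples, Monte Carlo over random directions), not the authors' implementation: project both samples onto random unit directions and average the one-dimensional Wasserstein-2 distances, which for equal-size empirical measures reduce to comparing sorted projections.

import numpy as np

def sliced_wasserstein(X, Y, n_proj=100, seed=0):
    # X, Y: (n, d) arrays of samples with the same n (an assumption of this sketch).
    rng = np.random.default_rng(seed)
    theta = rng.normal(size=(n_proj, X.shape[1]))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)  # random unit directions
    xp = np.sort(X @ theta.T, axis=0)  # sorted 1-D projections of X
    yp = np.sort(Y @ theta.T, axis=0)  # sorted 1-D projections of Y
    return np.sqrt(np.mean((xp - yp) ** 2))  # sliced W2 estimate

X = np.random.default_rng(1).normal(size=(500, 3))
Y = np.random.default_rng(2).normal(size=(500, 3)) + 1.0
print(sliced_wasserstein(X, Y))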

<——2020———2020————-  700——


[PDF] aaai.org

Improving the Robustness of Wasserstein Embedding by Adversarial PAC-Bayesian Learning

D Ding, M Zhang, X Pan, M Yang, X He - Proceedings of the AAAI …, 2020 - ojs.aaai.org

Node embedding is a crucial task in graph analysis. Recently, several methods are

proposed to embed a node as a distribution rather than a vector to capture more information.

Although these methods achieved noticeable improvements, their extra complexity brings …

  

 [PDF] arxiv.org

Stability of Gibbs Posteriors from the Wasserstein Loss for Bayesian Full Waveform Inversion

MM Dunlop, Y Yang - arXiv preprint arXiv:2004.03730, 2020 - arxiv.org

Recently, the Wasserstein loss function has been proven to be effective when applied to

deterministic full-waveform inversion (FWI) problems. We consider the application of this

loss function in Bayesian FWI so that the uncertainty can be captured in the solution. Other …

  Cited by 1 Related articles All 3 versions 


(PDF) THE α-z-BURES WASSERSTEIN DIVERGENCE

www.researchgate.net › publication › 345187838_THE_a...

Nov 2, 2020 — PDF | In this paper, we introduce the α-z-Bures Wasserstein divergence for positive semidefinite matrices A and B as Φ(A, B) = Tr((1 − α)A + ...

[PDF] researchgate.net

[PDF] THE α-z-BURES WASSERSTEIN DIVERGENCE

TH DINH, CT LE, BK VO, TD VUONG - researchgate.net

Φ(A, B) = Tr((1 − α)A + αB) − Tr(Q_{α,z}(A, B)), where Q_{α,z}(A, B) = (A^{(1−α)/(2z)} B^{α/z} A^{(1−α)/(2z)})^z is the matrix function in the α-z-Rényi relative entropy. We show that for 0 ≤ α ≤ z ≤ 1, the quantity Φ(A, B) is a quantum divergence and satisfies the Data Processing Inequality in …
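
A small numerical sketch of the quantity displayed above (my own illustration of the formula, not code from the paper); it assumes strictly positive definite inputs so that the fractional matrix powers are well defined:

import numpy as np
from scipy.linalg import fractional_matrix_power as mpow

def alpha_z_bures(A, B, a, z):
    # Phi(A,B) = Tr((1-a)A + aB) - Tr((A^{(1-a)/(2z)} B^{a/z} A^{(1-a)/(2z)})^z)
    Ae = mpow(A, (1.0 - a) / (2.0 * z))
    Q = mpow(Ae @ mpow(B, a / z) @ Ae, z)
    return float(np.real(np.trace((1.0 - a) * A + a * B) - np.trace(Q)))

A = np.array([[2.0, 0.3], [0.3, 1.0]])
B = np.array([[1.0, 0.1], [0.1, 1.5]])
# With a = z = 1/2 the formula reduces to Tr((A+B)/2) - Tr((A^{1/2} B A^{1/2})^{1/2}).
print(alpha_z_bures(A, B, a=0.5, z=0.5))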

2019

[PDF] arxiv.org

Bures-Wasserstein Geometry

J van Oostrum - arXiv preprint arXiv:2001.08056, 2020 - arxiv.org

The Bures-Wasserstein distance is a Riemannian distance on the space of positive definite Hermitian matrices and is given by $d(\Sigma, T) = \left[\mathrm{tr}(\Sigma) + \mathrm{tr}(T) - 2\,\mathrm{tr}\!\left((\Sigma^{1/2} T \Sigma^{1/2})^{1/2}\right)\right]^{1/2}$. This distance function appears …

  Related articles All 3 versions 


 

[PDF] arxiv.org

Gradient descent algorithms for Bures-Wasserstein barycenters

S Chewi, T Maunu, P Rigollet, AJ Stromme - arXiv preprint arXiv …, 2020 - arxiv.org

We study first order methods to compute the barycenter of a probability distribution over the

Bures-Wasserstein manifold. We derive global rates of convergence for both gradient

descent and stochastic gradient descent despite the fact that the barycenter functional is not …

  Cited by 11 Related articles All 2 versions 


 

2019  [PDF] arxiv.org

Statistical inference for Bures-Wasserstein barycenters

A Kroshnin, V Spokoiny, A Suvorikova - arXiv preprint arXiv:1901.00226, 2019 - arxiv.org

In this work we introduce the concept of Bures-Wasserstein barycenter $ Q_* $, that is

essentially a Fréchet mean of some distribution $\mathbb {P} $ supported on a subspace of

positive semi-definite Hermitian operators $\mathbb {H} _ {+}(d) $. We allow a barycenter to …

  Cited by 12 Related articles All 4 versions 

[CITATION] Statistical inference for Bures-Wasserstein barycenters (2019)

A Kroshnin, V Spokoiny, A Suvorikova - arXiv preprint arXiv:1901.00226, 1901

  Cited by 2 Related articles


 2020

(PDF) Statistical inference for Bures-Wasserstein barycenters

www.researchgate.net › publication › 330076951_Statisti...

Sep 8, 2020 — PDF | In this work we introduce the concept of Bures-Wasserstein barycenter \(Q_*\), that is essentially a Fr\'echet mean of some distribution ...

In this work we introduce the concept of Bures-Wasserstein barycenter Q, that is essentially a Fréchet mean of some distribution P supported on a subspace of positive semi-definite d-dimensional Hermitian operators H+(d). We allow a barycenter to be constrained …

[PDF] arxiv.org

Statistical inference for Bures-Wasserstein barycenters

A Kroshnin, V Spokoiny, A Suvorikova - arXiv preprint arXiv:1901.00226, 2019 - arxiv.org

In this work we introduce the concept of Bures-Wasserstein barycenter $ Q_* $, that is

essentially a Fréchet mean of some distribution $\mathbb {P} $ supported on a subspace of

positive semi-definite Hermitian operators $\mathbb{H}_{+}(d)$. We allow a barycenter to …

  Cited by 13 Related articles All 4 versions 

Statistical inference for Bures-Wasserstein barycenters  eBook

2020  [PDF] arxiv.org

Fisher information regularization schemes for Wasserstein gradient flows

W Li, J Lu, L Wang - Journal of Computational Physics, 2020 - Elsevier

We propose a variational scheme for computing Wasserstein gradient flows. The scheme

builds upon the Jordan–Kinderlehrer–Otto framework with the Benamou-Brenier's dynamic

formulation of the quadratic Wasserstein metric and adds a regularization by the Fisher  …

 Cited by 19 Related articles All 12 versions
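
For background on the Wasserstein gradient-flow entries collected here (the standard JKO construction, not a quotation from the paper above): a gradient flow of an energy $E$ in the Wasserstein space is obtained as the limit of the implicit minimizing-movement steps $\rho^{k+1} \in \arg\min_{\rho} \tfrac{1}{2\tau} W_2^2(\rho, \rho^k) + E(\rho)$; the scheme in the entry above works with the Benamou-Brenier dynamic formulation of $W_2$ and adds a Fisher-information regularization to this variational problem.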



2020

Cahn-Hilliard and Thin Film equations with nonlinear mobility as gradient flows in weighted-Wasserstein metrics

www.researchgate.net › publication › 51988829_Cahn-Hi...

Oct 22, 2020 — Cahn-Hilliard and Thin Film equations with nonlinear mobility as ... Dirichlet energy with respect to a Wasserstein-like transport metric, and ... Concerning gradient flows in metrics defined by nonlinear mobility ...

<——2020———————2020————-  710——

 

[PDF] arxiv.org

The Wasserstein Impact Measure (WIM): a generally applicable, practical tool for quantifying prior impact in Bayesian statistics

F Ghaderinezhad, C LeyB Serrien - arXiv preprint arXiv:2010.12522, 2020 - arxiv.org

The prior distribution is a crucial building block in Bayesian analysis, and its choice will

impact the subsequent inference. It is therefore important to have a convenient way to

quantify this impact, as such a measure of prior impact will help us to choose between two or …

  All 3 versions 


Wasserstein Gradient Flows and the Fokker Planck Equation ...

statmech.stanford.edu › post › gradient_flows_00

May 26, 2020 — Wasserstein Gradient Flows and the Fokker Planck Equation (Part I) ... hole in the ground in such a way that the amount of work I d


Deep learning 11.2. Wasserstein GAN - fleuret.org

fleuret.org › materials › dlc-slides-11-2-Wasserstein-GAN

PDF

Nov 29, 2020 — Figure 4: JS estimates for an MLP generator (upper left) and a DCGAN generator (upper right) trained with the standard GAN procedure.

[CITATION] EE-559–Deep learning 11.2. Wasserstein GAN

F Fleuret - 2020

  Related articles All 2 versions


Wasserstein distance estimates for stochastic integrals ... - NTU

personal.ntu.edu.sg › wasserstein_forward-backward

PDF

Aug 7, 2020 — of jump-diffusion processes. In [BP08], lower and upper bounds on option prices have been obtained in one-dimensional jump-diffusion ...

[CITATION] Wasserstein distance estimates for jump-diffusion processes

JC Breton, N Privault - Preprint, 2020

  Cited by 1

  


Interpretable Model Summaries Using the Wasserstein Distance

E Dunipace, L Trippa - arXiv preprint arXiv:2012.09999, 2020 - arxiv.org

In the current computing age, models can have hundreds or even thousands of parameters;

however, such large models decrease the ability to interpret and communicate individual

parameters. Reducing the dimensionality of the parameter space in the estimation phase is

a commonly used technique, but less work has focused on selecting subsets of the

parameters to focus on for interpretation--especially in settings such as Bayesian inference

or bootstrapped frequentist inference that consider a distribution of estimates. Moreover …

  All 2 versions 

12/2020 


2020

 

2020 online Cover Image

Mullins-Sekerka as the Wasserstein flow of the perimeter

by Chambolle, Antonin; Laux, Tim

Proceedings of the American Mathematical Society, 12/2020

Journal ArticleFull Text Online

 

 

2020  online OPEN ACCESS

Extreme quantile regression : a coupling approach and Wasserstein distance

by Bobbia, Benjamin

Université Bourgogne Franche-Comté, 2020

This work is related with the estimation of conditional extreme quantiles. More precisely, we estimate high quantiles of a real distribution conditionally to...

Dissertation/ThesisFull Text Online

Entropic-Wasserstein barycenters: PDE characterization, regularity and CLT

G Carlier, K Eichinger, A Kroshnin - arXiv preprint arXiv:2012.10701, 2020 - arxiv.org

In this paper, we investigate properties of entropy-penalized Wasserstein barycenters

introduced by Bigot, Cazelles and Papadakis (2019) as a regularization of Wasserstein

barycenters first presented by Agueh and Carlier (2011). After characterizing these

barycenters in terms of a system of Monge-Ampère equations, we prove some global

moment and Sobolev bounds as well as higher regularity properties. We finally establish a

central limit theorem for entropic-Wasserstein barycenters.

  All 2 versions 

2/2020  

Virtual persistence diagrams, signed measures, and Wasserstein distance

P Bubenik, A Elchesen - arXiv preprint arXiv:2012.10514, 2020 - arxiv.org

Persistence diagrams, an important summary in topological data analysis, consist of a set of

ordered pairs, each with positive multiplicity. Persistence diagrams are obtained via Mobius

inversion and may be compared using a one-parameter family of metrics called Wasserstein

distances. In certain cases, Mobius inversion produces sets of ordered pairs which may

have negative multiplicity. We call these virtual persistence diagrams. Divol and Lacombe

recently showed that there is a Wasserstein distance for Radon measures on the half plane …

12/2020

 Cited by 5 Related articles

Approximation rate in Wasserstein distance of probability measures on the real line by deterministic empirical measures

O Bencheikh, B Jourdain - arXiv preprint arXiv:2012.09729, 2020 - arxiv.org

We are interested in the approximation in Wasserstein distance with index $\rho\ge 1$ of a

probability measure $\mu $ on the real line with finite moment of order $\rho $ by the

empirical measure of $ N $ deterministic points. The minimal error converges to $0 $ as $

N\to+\infty $ and we try to characterize the order associated with this convergence. Apart

when $\mu $ is a Dirac mass and the error vanishes, the order is not larger than $1 $. We

give a necessary condition and a sufficient condition for the order to be equal to this …

  All 3 versions 

12/2020  

 <——2020———2020—      ——-  720—— 


A Data-Driven Distributionally Robust Game Using Wasserstein Distance

G Peng, T Zhang, Q Zhu - International Conference on Decision and Game …, 2020 - Springer

This paper studies a special class of games, which enables the players to leverage the

information from a dataset to play the game. However, in an adversarial scenario, the

dataset may not be trustworthy. We propose a distributionally robust formulation to introduce

robustness against the worst-case scenario and tackle the curse of the optimizer. By

applying Wasserstein distance as the distribution metric, we show that the game considered

in this work is a generalization of the robust game and data-driven empirical game. We also …

Book ChapterFull Text Online

  Cited by 1 Related articles All 3 versions

Zbl 1484.91071

Risk-based distributionally robust optimal gas-power flow with wasserstein distance

C Wang, R Gao, W Wei, M Shafie-khah… - … on Power Systems, 2018 - ieeexplore.ieee.org

… Computational results validate the effectiveness of the proposed models and methods … in part

by the “111” project (B08013), and in part by the Fundamental Research Funds for … 14], reserve

procurement [15], and optimal power flow [16], yet DRO based OGPF studies have not …

  Cited by 19 Related articles All 2 versions

Study Results from Xi'an Jiaotong University Update Understanding of Modern Power Systems and Clean Energy (Distributionally Robust Optimal Reactive Power Dispatch With Wasserstein...

Energy Weekly News, 12/2020

NewsletterFull Text Online

Cited by 18 Related articles All 5 versions

 

The Equivalence of Fourier-based and Wasserstein Metrics on Imaging Problems

G Auricchio, A Codegoni, S Gualandi… - arXiv preprint arXiv …, 2020 - arxiv.org

… It is remarkable that the result of Theorem 3 can be extended to the D2 metric. Theorem 6. The function D2 defined in (12) is equivalent to the W2 distance. …

  Related articles All 4 versions 

Reports on Mathematics from University of Pavia Provide New Insights 

(The Equivalence of Fourier-based and W...

Journal of Technology & Science, 12/2020

NewsletterFull Text Online


2020   [PDF] arxiv.org

Precise Limit in Wasserstein Distance for Conditional Empirical Measures of Dirichlet Diffusion Processes

FY Wang - arXiv preprint arXiv:2004.07537, 2020 - arxiv.org

Let $M$ be a $d$-dimensional connected compact Riemannian manifold with boundary $\partial M$, let $V \in C^2(M)$ such that $\mu(dx) := e^{V(x)}\,dx$ is a probability measure, and let $X_t$ be the diffusion process generated by $L := \Delta + \nabla V$ with …

  Cited by 2 Related articles All 2 versions 


 [PDF] arxiv.org

Convergence in Wasserstein Distance for Empirical Measures of Dirichlet Diffusion Processes on Manifolds

FY Wang - arXiv preprint arXiv:2005.09290, 2020 - arxiv.org

Let $M$ be a $d$-dimensional connected compact Riemannian manifold with boundary $\partial M$, let $V \in C^2(M)$ such that $\mu({\rm d}x) := {\rm e}^{V(x)}\,{\rm d}x$ is a probability measure, and let $X_t$ be the diffusion process generated by …

Cited by 4 Related articles All 3 versions 

[CITATION] Convergence in Wasserstein Distance for empirical measures of Dirichlet diffusion processes on manifolds, Preprint (2020)

FY Wang - arXiv preprint arXiv:2005.09290

Cited by 3 Related articles


PDF] arxiv.org

Differential inclusions in Wasserstein spaces: The Cauchy-Lipschitz framework

B BonnetH Frankowska - Journal of Differential Equations, 2020 - Elsevier

In this article, we propose a general framework for the study of differential inclusions in the

Wasserstein space of probability measures. Based on earlier geometric insights on the

structure of continuity equations, we define solutions of differential inclusions as absolutely …

  All 6 versions


2020

[PDF] arxiv.org

Many-Objective Estimation of Distribution Optimization Algorithm Based on WGAN-GP

Z Liang, Y Li, Z Wan - arXiv preprint arXiv:2003.08295, 2020 - arxiv.org

Estimation of distribution algorithms (EDA) are stochastic optimization algorithms. EDA

establishes a probability model to describe the distribution of solution from the perspective of

population macroscopically by statistical learning method, and then randomly samples the …

  Related articles All 2 versions 


2020

Wasserstein Barycenters: Statistics and Optimization

books.google.com › books

Austin James Stromme · 2020 · ‎No preview book

We study a geometric notion of average, the barycenter, over 2-Wasserstein space.


… Fréchet functional. A closely related notion is that of a Karcher mean (Karcher …

   <——2020———-2020——————  730——

2020

[PDF] thecvf.com

S2A: Wasserstein GAN with Spatio-Spectral Laplacian Attention for Multi-Spectral Band Synthesis

L Rout, I Misra, S Manthira Moorthi… - Proceedings of the …, 2020 - openaccess.thecvf.com

Intersection of adversarial learning and satellite image processing is an emerging field in

remote sensing. In this study, we intend to address synthesis of high resolution multi-spectral

satellite imagery using adversarial learning. Guided by the discovery of attention …

  Cited by 2 Related articles All 4 versions 


2020

[PDF] arxiv.org

Time Discretizations of Wasserstein-Hamiltonian Flows

J Cui, L Dieci, H Zhou - arXiv preprint arXiv:2006.09187, 2020 - arxiv.org

We study discretizations of Hamiltonian systems on the probability density manifold

equipped with the $ L^ 2$-Wasserstein metric. Based on discrete optimal transport theory,

several Hamiltonian systems on graph (lattice) with different weights are derived, which can …

  All 3 versions 


2020
Wasserstein hamiltonian flows

SN Chow, W Li, H Zhou - Journal of Differential Equations, 2020 - Elsevier

We establish kinetic Hamiltonian flows in density space embedded with the L 2-Wasserstein

metric tensor. We derive the Euler-Lagrange equation in density space, which introduces the

associated Hamiltonian flows. We demonstrate that many classical equations, such as …

  Cited by 3 Related articles All 6 versions


2020

[PDF] arxiv.org

Lagrangian schemes for Wasserstein gradient flows

JA Carrillo, D Matthes, MT Wolfram - arXiv preprint arXiv:2003.03803, 2020 - arxiv.org

This paper reviews different numerical methods for specific examples of Wasserstein

gradient flows: we focus on nonlinear Fokker-Planck equations, but also discuss

discretizations of the parabolic-elliptic Keller-Segel model and of the fourth order thin film …

  Cited by 2 Related articles All 3 versions 


2020

[PDF] aimsciences.org

A convergent Lagrangian discretization for p-Wasserstein and flux-limited diffusion equations

B Söllner, O Junge - Communications on Pure & Applied Analysis, 2020 - aimsciences.org

We study a Lagrangian numerical scheme for solving a nonlinear drift diffusion equation of the form $\partial_t u = \partial_x\big(u \cdot \mathrm{c}[\partial_x h(u) + v]\big)$, like Fokker-Planck and $q$-Laplace equations, on an interval. This scheme will consist of a spatio-temporal discretization founded on the …


 2020

[PDF] arxiv.org

Risk Measures Estimation Under Wasserstein Barycenter

MA Arias-Serna, JM Loubes… - arXiv preprint arXiv …, 2020 - arxiv.org

… in market indices of United States generated by the financial crisis due to COVID-19 … above

discussion is organized in this paper as follows: preliminaries about Wasserstein barycenter

are … Then Section 3 defines the Wasserstein Barycenter in risk measures and results for VaR …

  All 5 versions


EEG data augmentation using Wasserstein GAN

G Bouallegue, R Djemal - 2020 20th International Conference …, 2020 - ieeexplore.ieee.org

Electroencephalogram (EEG) presents a challenge during the classification task using

machine learning and deep learning techniques due to the lack or to the low size of

available datasets for each specific neurological disorder. Therefore, the use of data …


[HTML] hindawi.com

[HTML] Imbalanced Fault Classification of Bearing via Wasserstein Generative Adversarial Networks with Gradient Penalty

B Han, S Jia, G Liu, J Wang - Shock and Vibration, 2020 - hindawi.com

Recently, generative adversarial networks (GANs) are widely applied to increase the amounts

of imbalanced input samples in fault diagnosis. However, the existing GAN-based methods have

convergence difficulties and training instability, which affect the fault diagnosis efficiency …

  All 4 versions 

  All 3 versions 

Wasserstein Distributionally Robust Motion Planning and Control with Safety Constraints Using Conditional Value-at-Risk

A Hakobyan, I Yang - 2020 IEEE International Conference on …, 2020 - ieeexplore.ieee.org

In this paper, we propose an optimization-based decision-making tool for safe motion

planning and control in an environment with randomly moving obstacles. The unique feature

of the proposed method is that it limits the risk of unsafety by a pre-specified threshold even …

  Cited by 1 All 2 versions


[PDF] arxiv.org

Necessary Condition for Rectifiability Involving Wasserstein Distance W2

D Dąbrowski - International Mathematics Research Notices, 2020 - academic.oup.com

A Radon measure $\mu$ is $n$-rectifiable if it is absolutely continuous with respect to the $n$-dimensional Hausdorff measure and $\mu$-almost all of $\operatorname{supp}\mu$ can be covered by Lipschitz images of $\mathbb{R}^n$. In this paper, we give a necessary condition for rectifiability in terms of the so-called $\alpha$ numbers …

  Cited by 5 Related articles All 4 versions

  MR4216708 Prelim Dąbrowski, Damian; Necessary condition for rectifiability involving Wasserstein distance W2

Int. Math. Res. Not. IMRN 2020, no. 22, 8936–8972. 28


  <——2020—————2020———-  740—— 


A data-driven distributionally robust newsvendor model with a Wasserstein ambiguity set

S Lee, H Kim, I Moon - Journal of the Operational …, 2020 - orsociety.tandfonline.com

In this paper, we derive a closed-form solution and an explicit characterization of the worst-

case distribution for the data-driven distributionally robust newsvendor model with an

ambiguity set based on the Wasserstein distance of order $p \in [1,\infty)$. We also consider the …

  Cited by 3 Related articles All 2 versions
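For orientation (generic background, not a result taken from the paper above): the data-driven Wasserstein DRO problems referenced in this and several later entries have the form

\[
\min_{x \in X} \; \sup_{Q \,:\, W_p(Q,\widehat{P}_N) \le \varepsilon} \; \mathbb{E}_{\xi \sim Q}\big[\ell(x,\xi)\big],
\]

where $\widehat{P}_N$ is the empirical distribution of the $N$ data points, $\varepsilon$ is the ambiguity radius and $\ell$ is the decision loss; the entry above derives closed forms for this problem with the newsvendor loss.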


[PDF] arxiv.org

Wasserstein Autoregressive Models for Density Time Series

C Zhang, P Kokoszka, A Petersen - arXiv preprint arXiv:2006.12640, 2020 - arxiv.org

… of the space of densities, the above notion of stationarity is defined by the Wasserstein mean

and … which is not equivalent to those traditional stationarity definitions of functional time series …

In particular, a conventional stationarity notion for a stochastic process is understood in the …

  Cited by 4 Related articles All 3 versions 

  

[PDF] arxiv.org

Averaging atmospheric gas concentration data using wasserstein barycenters

M Barré, C Giron, M Mazzolini… - arXiv preprint arXiv …, 2020 - arxiv.org

Hyperspectral satellite images report greenhouse gas concentrations worldwide on a daily

basis. While taking simple averages of these images over time produces a rough estimate of

relative emission rates, atmospheric transport means that simple averages fail to pinpoint …

Cited by 3 Related articles All 6 versions 


[PDF] arxiv.org

Efficient Wasserstein Natural Gradients for Reinforcement Learning

T Moskovitz, M Arbel, F Huszar, A Gretton - arXiv preprint arXiv …, 2020 - arxiv.org

A novel optimization approach is proposed for application to policy gradient methods and

evolution strategies for reinforcement learning (RL). The procedure uses a computationally

efficient Wasserstein natural gradient (WNG) descent that takes advantage of the geometry …

  Cited by 1 Related articles All 2 versions 


Density estimation of multivariate samples using Wasserstein distance

E Luini, P Arbenz - Journal of Statistical Computation and …, 2020 - Taylor & Francis

Density estimation is a central topic in statistics and a fundamental task of machine learning.

In this paper, we present an algorithm for approximating multivariate empirical densities with

a piecewise constant distribution defined on a hyperrectangular-shaped partition of the …

  Cited by 2 Related articles All 3 versions


[PDF] arxiv.org

Limit Distribution Theory for Smooth Wasserstein Distance with Applications to Generative Modeling

Z Goldfeld, K Kato - arXiv preprint arXiv:2002.01012, 2020 - arxiv.org

The 1-Wasserstein distance ($\mathsf {W} _1 $) is a popular proximity measure between

probability distributions. Its metric structure, robustness to support mismatch, and rich

geometric structure fueled its wide adoption for machine learning tasks. Such tasks …

  Cited by 1 Related articles All 2 versions 


[PDF] arxiv.org

Wasserstein routed capsule networks

A Fuchs, F Pernkopf - arXiv preprint arXiv:2007.11465, 2020 - arxiv.org

Capsule networks offer interesting properties and provide an alternative to today's deep

neural network architectures. However, recent approaches have failed to consistently

achieve competitive results across different image datasets. We propose a new parameter …

  Cited by 1 All 2 versions 

 Cited by 2 Related articles All 3 versions 

 Deep attentive wasserstein generative adversarial networks for MRI reconstruction with recurrent context-awareness

Y Guo, C Wang, H Zhang, G Yang - International Conference on Medical …, 2020 - Springer

The performance of traditional compressive sensing-based MRI (CS-MRI) reconstruction is

affected by its slow iterative procedure and noise-induced artefacts. Although many deep

learning-based CS-MRI methods have been proposed to mitigate the problems of traditional …

  Cited by 1 All 4 versions

 Deep Attentive Wasserstein Generative Adversarial Networks for MRI Reconstruction with Recurrent Context-Awareness

0 citations*

2020 ARXIV: IMAGE AND VIDEO PROCESSING


 Gromov-Wasserstein Factorization Models for Graph Clustering

11 citations* for all

7 citations*

2020 NATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE

Hongteng Xu

Duke University


We propose a new nonlinear factorization model for graphs that are with topological structures, and optionally, node attributes. This model is based on a pseudometric called Gromov-Wasserstein (GW) discrepancy, which compares graphs in a relational way. It estimates observed graphs as GW barycenters …


[PDF] arxiv.org

The Unbalanced Gromov Wasserstein Distance: Conic Formulation and Relaxation

T Séjourné, FX Vialard, G Peyré - arXiv preprint arXiv:2009.04266, 2020 - arxiv.org

Comparing metric measure spaces (ie a metric space endowed with a probability

distribution) is at the heart of many machine learning problems. This includes for instance

predicting properties of molecules in quantum chemistry or generating graphs with varying …

  Cited by 1 All 2 versions 


[PDF] arxiv.org

Classification of atomic environments via the Gromov–Wasserstein distance

S Kawano, JK Mason - Computational Materials Science, 2020 - Elsevier

Interpreting molecular dynamics simulations usually involves automated classification of

local atomic environments to identify regions of interest. Existing approaches are generally

limited to a small number of reference structures and only include limited information about …

  All 4 versions

  <——2020———2020———-  750—— 


Domain-attention Conditional Wasserstein Distance for Multi-source Domain Adaptation

H Wu, Y Yan, MK Ng, Q Wu - ACM Transactions on Intelligent Systems …, 2020 - dl.acm.org

Multi-source domain adaptation has received considerable attention due to its effectiveness

of leveraging the knowledge from multiple related sources with different distributions to

enhance the learning performance. One of the fundamental challenges in multi-source …

Cited by 10 Related articles All 3 versions

[PDF] arxiv.org

Regularized variational data assimilation for bias treatment using the Wasserstein metric

SK Tamang, A Ebtehaj, D Zou… - Quarterly Journal of the …, 2020 - Wiley Online Library

This article presents a new variational data assimilation (VDA) approach for the formal

treatment of bias in both model outputs and observations. This approach relies on the

Wasserstein metric, stemming from the theory of optimal mass transport, to penalize the …

  Cited by 1 Related articles All 3 versions

[PDF] ieee.org


Joint transfer of model knowledge and fairness over domains using wasserstein distance

T Yoon, J Lee, W Lee - IEEE Access, 2020 - ieeexplore.ieee.org

Owing to the increasing use of machine learning in our daily lives, the problem of fairness

has recently become an important topic in machine learning societies. Recent studies

regarding fairness in machine learning have been conducted to attempt to ensure statistical …

  Cited by 1


[PDF] arxiv.org

A Wasserstein Minimum Velocity Approach to Learning Unnormalized Models

Z Wang, S Cheng, Y Li, J Zhu, B Zhang - arXiv preprint arXiv:2002.07501, 2020 - arxiv.org

Score matching provides an effective approach to learning flexible unnormalized models,

but its scalability is limited by the need to evaluate a second-order derivative. In this paper,

we present a scalable approximation to a general family of learning objectives including …

Cited by 4 Related articles All 9 versions 


[PDF] arxiv.org

Adversarial Classification via Distributional Robustness with Wasserstein Ambiguity

N Ho-Nguyen, SJ Wright - arXiv preprint arXiv:2005.13815, 2020 - arxiv.org

We study a model for adversarial classification based on distributionally robust chance

constraints. We show that under Wasserstein ambiguity, the model aims to minimize the

conditional value-at-risk of the distance to misclassification, and we explore links to previous …

  Cited by 1 Related articles All 3 versions 


2020

Fast algorithms for computational optimal transport and wasserstein barycenter

W Guo, N Ho, M Jordan - International Conference on …, 2020 - proceedings.mlr.press

We provide theoretical complexity analysis for new algorithms to compute the optimal

transport (OT) distance between two discrete probability distributions, and demonstrate their

favorable practical performance compared to state-of-art primal-dual algorithms. First, we …

  Cited by 2 Related articles All 2 versions 
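As shared background for the many barycenter entries on this page (standard definition, not specific to the paper above): given probability measures $\mu_1,\ldots,\mu_m$ and weights $\lambda_i \ge 0$ with $\sum_i \lambda_i = 1$, a 2-Wasserstein barycenter is any minimizer

\[
\bar{\mu} \in \operatorname*{arg\,min}_{\nu} \; \sum_{i=1}^{m} \lambda_i \, W_2^2(\nu,\mu_i).
\]

The fixed-support variants discussed in later entries restrict $\nu$ to measures supported on a prescribed finite set, which turns the problem into a large linear program.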


[PDF] arxiv.org

Wasserstein K-Means for Clustering Tomographic Projections

R Rao, A Moscovich, A Singer - arXiv preprint arXiv:2010.09989, 2020 - arxiv.org

Motivated by the 2D class averaging problem in single-particle cryo-electron microscopy

(cryo-EM), we present a k-means algorithm based on a rotationally-invariant Wasserstein

metric for images. Unlike existing methods that are based on Euclidean ($ L_2 $) distances …

  Cited by 5 Related articles All 7 versions 

Knowledge-aware Attentive Wasserstein Adversarial Dialogue Response Generation

Y Zhang, Q Fang, S Qian, C Xu - ACM Transactions on Intelligent …, 2020 - dl.acm.org

Natural language generation has become a fundamental task in dialogue systems. RNN-

based natural response generation methods encode the dialogue context and decode it into

a response. However, they tend to generate dull and simple responses. In this article, we …

  Cited by 2 Related articles


[PDF] arxiv.org

Wasserstein statistics in 1D location-scale model

S Amari - arXiv preprint arXiv:2003.05479, 2020 - arxiv.org

Wasserstein geometry and information geometry are two important structures introduced in a

manifold of probability distributions. The former is defined by using the transportation cost

between two distributions, so it reflects the metric structure of the base manifold on which …

  Cited by 1 Related articles All 2 versions 


DECWA: Density-Based Clustering using Wasserstein Distance

N El Malki, R Cugny, O Teste, F Ravat - Proceedings of the 29th ACM …, 2020 - dl.acm.org

Clustering is a data analysis method for extracting knowledge by discovering groups of data

called clusters. Among these methods, state-of-the-art density-based clustering methods

have proven to be effective for arbitrary-shaped clusters. Despite their encouraging results …

Cited by 2 Related articles All 2 versions

 <——2020———2020———-  760—— 


 [PDF] arxiv.org

Regularizing activations in neural networks via distribution matching with the Wasserstein metric

T Joo, D Kang, B Kim - arXiv preprint arXiv:2002.05366, 2020 - arxiv.org

Regularization and normalization have become indispensable components in training deep

neural networks, resulting in faster training and improved generalization performance. We

propose the projected error function regularization loss (PER) that encourages activations to …

  Cited by 3 Related articles All 3 versions 


[PDF] arxiv.org

The Wasserstein Impact Measure (WIM): a generally applicable, practical tool for quantifying prior impact in Bayesian statistics

F Ghaderinezhad, C Ley, B Serrien - arXiv preprint arXiv:2010.12522, 2020 - arxiv.org

The prior distribution is a crucial building block in Bayesian analysis, and its choice will

impact the subsequent inference. It is therefore important to have a convenient way to

quantify this impact, as such a measure of prior impact will help us to choose between two or …

  All 3 versions 


[PDF] arxiv.org

Automatic coding of students' writing via Contrastive Representation Learning in the Wasserstein space

R Jiang, J Gouvea, D Hammer, S Aeron - arXiv preprint arXiv:2011.13384, 2020 - arxiv.org

Qualitative analysis of verbal data is of central importance in the learning sciences. It is labor-

intensive and time-consuming, however, which limits the amount of data researchers can

include in studies. This work is a step towards building a statistical machine learning (ML) …

  All 2 versions 


[PDF] upenn.edu

Optimal Estimation of Wasserstein Distance on A Tree with An Application to Microbiome Studies

S Wang, TT Cai, H Li - Journal of the American Statistical …, 2020 - Taylor & Francis

The weighted UniFrac distance, a plug-in estimator of the Wasserstein distance of read

counts on a tree, has been widely used to measure the microbial community difference in

microbiome studies. Our investigation however shows that such a plug-in estimator …

  Related articles All 3 versions


Probability forecast combination via entropy regularized wasserstein distance

R Cumings-Menon, M Shin - Entropy, 2020 - mdpi.com

We propose probability and density forecast combination methods that are defined using the

entropy regularized Wasserstein distance. First, we provide a theoretical characterization of

the combined density forecast based on the regularized Wasserstein distance under the …

  Cited by 1 All 15 versions 

[PDF] arxiv.org

Generalized Wasserstein Dice Score, Distributionally Robust Deep Learning, and Ranger for brain tumor segmentation: BraTS 2020 challenge

L Fidon, S Ourselin, T Vercauteren - arXiv preprint arXiv:2011.01614, 2020 - arxiv.org

Training a deep neural network is an optimization problem with four main ingredients: the

design of the deep neural network, the per-sample loss function, the population loss

function, and the optimizer. However, methods developed to compete in recent BraTS …

  Cited by 10 Related articles All 6 versions

[PDF] arxiv.org

Social-wagdat: Interaction-aware trajectory prediction via wasserstein graph double-attention network

J Li, H Ma, Z Zhang, M Tomizuka - arXiv preprint arXiv:2002.06241, 2020 - arxiv.org

Effective understanding of the environment and accurate trajectory prediction of surrounding

dynamic obstacles are indispensable for intelligent mobile systems (like autonomous

vehicles and social robots) to achieve safe and high-quality planning when they navigate in …

  Cited by 10 Related articles All 3 versions 


[PDF] arxiv.org

Quantum statistical learning via Quantum Wasserstein natural gradient

S Becker, W Li - arXiv preprint arXiv:2008.11135, 2020 - arxiv.org

In this article, we introduce a new approach towards the statistical learning problem $\operatorname{argmin}_{\rho(\theta)\in\mathcal{P}_{\theta}} W_{Q}^{2}(\rho_{\star},\rho(\theta))$ to approximate a target quantum state $\rho_{\star}$ by a set of parametrized …

  Cited by 1 All 3 versions 


Wasserstein upper bounds of the total variation for smooth densities

M Chae, SG Walker - Statistics & Probability Letters, 2020 - Elsevier

The total variation distance between probability measures cannot be bounded by the

Wasserstein metric in general. If we consider sufficiently smooth probability densities,

however, it is possible to bound the total variation by a power of the Wasserstein distance …

  Cited by 4 Related articles All 4 versions


[PDF] arxiv.org

Revisiting Fixed Support Wasserstein Barycenter: Computational Hardness and Efficient Algorithms

T Lin, N Ho, X Chen, M Cuturi, MI Jordan - arXiv preprint arXiv:2002.04783, 2020 - arxiv.org

We study the fixed-support Wasserstein barycenter problem (FS-WBP), which consists in

computing the Wasserstein barycenter of $ m $ discrete probability measures supported on

a finite metric space of size $ n $. We show first that the constraint matrix arising from the …

  Cited by 1 Related articles All 3 versions 


[CITATION] Revisiting Fixed Support Wasserstein Barycenter: Computational Hardness and Efficient Algorithms. Cornell University

T Lin, N Ho, X Chen, M Cuturi, MI Jordan - Computer Science, Computational …, 2020

  Cited by 2 Related articles

<——2020———————2020 ———————-770—


[PDF] arxiv.org

A Rademacher-type theorem on L2-Wasserstein spaces over closed Riemannian manifolds

LD Schiavo - Journal of Functional Analysis, 2020 - Elsevier

Let P be any Borel probability measure on the L 2-Wasserstein space (P 2 (M), W 2) over a

closed Riemannian manifold M. We consider the Dirichlet form E induced by P and by the

Wasserstein gradient on P 2 (M). Under natural assumptions on P, we show that W 2 …

  Cited by 5 Related articles All 6 versions


[PDF] arxiv.org

Wasserstein Learning of Determinantal Point Processes

L Anquetil, M Gartrell, A Rakotomamonjy… - arXiv preprint arXiv …, 2020 - arxiv.org

Determinantal point processes (DPPs) have received significant attention as an elegant

probabilistic model for discrete subset selection. Most prior work on DPP learning focuses

on maximum likelihood estimation (MLE). While efficient and scalable, MLE approaches do …

Cited by 1 Related articles All 4 versions 


Gromov-Wasserstein Factorization Models for Graph Clustering

H Xu - AAAI, 2020 - ojs.aaai.org

… In this paper, we borrow the terms “atoms” and …

Cited by 2 Related articles

[PDF] arxiv.org

[CITATION] Gromov-Wasserstein Factorization Models for Graph Clustering.

H Xu - AAAI, 2020

 Cited by 8 Related articles All 5 versions 


[PDF] arxiv.org

Gromov–Hausdorff limit of Wasserstein spaces on point clouds

NG Trillos - Calculus of Variations and Partial Differential …, 2020 - Springer

We consider a point cloud $X_n := \{\mathbf{x}_1,\ldots,\mathbf{x}_n\}$ uniformly distributed on the flat torus $\mathbb{T}^d := \mathbb{R}^d/\mathbb{Z}^d$, and construct a geometric graph on the cloud by connecting points that are within distance $\varepsilon$ of …

 Cited by 15 Related articles All 4 versions

[HTML] thelancet.com

[HTML] Remdesivir for COVID-19: challenges of underpowered studies

JD Norrie - The Lancet, 2020 - thelancet.com

… 12: Wasserstein RL; Schirm AL; Lazar NA. Moving to a world beyond “p … 2020 https://www.who.

int/blueprint/priority-diseases/key-action/COVID-19_Treatment_Trial_Design_Master_Protoc

ol_synopsis_Final_18022020.pdf … First case of 2019 novel coronavirus in the United States …

  Cited by 37 Related articles All 3 versions


 

2020

Wasserstein based transfer network for cross-domain sentiment classification

Y Du, M He, L Wang, H Zhang - Knowledge-Based Systems, 2020 - Elsevier

Automatic sentiment analysis of social media texts is of great significance for identifying

people's opinions that can help people make better decisions. Annotating data is time

consuming and laborious, and effective sentiment analysis on domains lacking of labeled …

Cited by 14 Related articles All 2 versions

[PDF] arxiv.org

Fast and Smooth Interpolation on Wasserstein Space

S Chewi, J Clancy, TL Gouic, P Rigollet… - arXiv preprint arXiv …, 2020 - arxiv.org

We propose a new method for smoothly interpolating probability measures using the

geometry of optimal transport. To that end, we reduce this problem to the classical Euclidean

setting, allowing us to directly leverage the extensive toolbox of spline interpolation. Unlike …

  All 2 versions 

[PDF] arxiv.org

Visual Transfer for Reinforcement Learning via Wasserstein Domain Confusion

J Roy, G Konidaris - arXiv preprint arXiv:2006.03465, 2020 - arxiv.org

We introduce Wasserstein Adversarial Proximal Policy Optimization (WAPPO), a novel

algorithm for visual transfer in Reinforcement Learning that explicitly learns to align the

distributions of extracted features between a source and target task. WAPPO approximates …

  Cited by 2 Related articles All 3 versions 

[PDF] arxiv.org

Gradient descent algorithms for Bures-Wasserstein barycenters

S Chewi, T Maunu, P Rigollet, AJ Stromme - arXiv preprint arXiv …, 2020 - arxiv.org

We study first order methods to compute the barycenter of a probability distribution over the

Bures-Wasserstein manifold. We derive global rates of convergence for both gradient

descent and stochastic gradient descent despite the fact that the barycenter functional is not …

 Cited by 36 Related articles All 9 versions 
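For readers unfamiliar with the Bures-Wasserstein geometry mentioned above: restricted to Gaussian measures, the 2-Wasserstein distance has a closed form, and the short numpy/scipy sketch below evaluates it (the function name and the toy matrices are ours, for illustration only; this is not code from the paper).

import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein_sq(m1, S1, m2, S2):
    # Squared 2-Wasserstein distance between N(m1, S1) and N(m2, S2):
    # ||m1 - m2||^2 + tr(S1 + S2 - 2 (S1^{1/2} S2 S1^{1/2})^{1/2})
    r1 = sqrtm(S1)
    cross = sqrtm(r1 @ S2 @ r1)
    # sqrtm may return a complex array with negligible imaginary parts; keep the real part
    return float(np.sum((m1 - m2) ** 2) + np.trace(S1 + S2 - 2.0 * np.real(cross)))

# toy example with two 2x2 covariance matrices
m1, m2 = np.zeros(2), np.ones(2)
S1 = np.array([[2.0, 0.3], [0.3, 1.0]])
S2 = np.array([[1.0, -0.2], [-0.2, 1.5]])
print(bures_wasserstein_sq(m1, S1, m2, S2))

Barycenters over this manifold average covariance matrices with respect to exactly this metric.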


[PDF] arxiv.org

Learning disentangled representations with the Wasserstein Autoencoder

B Gaujac, I Feige, D Barber - arXiv preprint arXiv:2010.03459, 2020 - arxiv.org

Disentangled representation learning has undoubtedly benefited from objective function

surgery. However, a delicate balancing act of tuning is still required in order to trade off

reconstruction fidelity versus disentanglement. Building on previous successes of penalizing …

  All 2 versions 

<——2020————————2020——————-  780—— 


[PDF] epfl.ch

[PDF] THE CONTINUOUS FORMULATION OF SHALLOW NEURAL NETWORKS AS WASSERSTEIN-TYPE GRADIENT FLOWS

X FERNÁNDEZ-REAL, A FIGALLI - sma.epfl.ch

It has been recently observed that the training of a single hidden layer artificial neural

network can be reinterpreted as a Wasserstein gradient flow for the weights for the error

functional. In the limit, as the number of parameters tends to infinity, this gives rise to a family …

  Related articles 

[HTML] aimsciences.org

[HTML] RWRM: Residual Wasserstein regularization model for image restoration

R He, X Feng, X Zhu, H Huang… - Inverse Problems & …, 2020 - aimsciences.org

Existing image restoration methods mostly make full use of various image prior information.

However, they rarely exploit the potential of residual histograms, especially their role as

ensemble regularization constraint. In this paper, we propose a residual Wasserstein  …


 Related articles All 2 versions 


2020 see 2019

Partial Gromov-Wasserstein with Applications on Positive-Unlabeled Learning

L Chapel, MZ Alaya, G Gasso - arXiv preprint arXiv:2002.08276, 2020 - arxiv.org

Optimal Transport (OT) framework allows defining similarity between probability distributions

and provides metrics such as the Wasserstein and Gromov-Wasserstein discrepancies.

Classical OT problem seeks a transportation map that preserves the total mass, requiring the …

  Cited by 5 Related articles All 5 versions 


[PDF] arxiv.org

Partial Gromov-Wasserstein Learning for Partial Graph Matching

W Liu, C Zhang, J Xie, Z Shen, H Qian… - arXiv preprint arXiv …, 2020 - arxiv.org

Graph matching finds the correspondence of nodes across two graphs and is a basic task in

graph-based machine learning. Numerous existing methods match every node in one graph

to one node in the other graph whereas two graphs usually overlap partially in …

  Related articles All 5 versions 


[PDF] arxiv.org

Wasserstein hamiltonian flows

SN Chow, W Li, H Zhou - Journal of Differential Equations, 2020 - Elsevier

We establish kinetic Hamiltonian flows in density space embedded with the L 2-Wasserstein

metric tensor. We derive the Euler-Lagrange equation in density space, which introduces the

associated Hamiltonian flows. We demonstrate that many classical equations, such as …

Cited by 7 Related articles All 7 versions

2020

2020  [PDF] arxiv.org

On a Novel Application of Wasserstein-Procrustes for Unsupervised Cross-Lingual Learning

G Ramírez, R Dangovski, P Nakov… - arXiv preprint arXiv …, 2020 - arxiv.org

The emergence of unsupervised word embeddings, pre-trained on very large monolingual

text corpora, is at the core of the ongoing neural revolution in Natural Language Processing

(NLP). Initially introduced for English, such pre-trained word embeddings quickly emerged …

 Cited by 1 Related articles All 3 versions 


[PDF] thecvf.com

Gromov-Wasserstein Averaging in a Riemannian Framework

S Chowdhury, T Needham - Proceedings of the IEEE/CVF …, 2020 - openaccess.thecvf.com

We introduce a theoretical framework for performing statistical tasks-including, but not

limited to, averaging and principal component analysis-on the space of (possibly

asymmetric) matrices with arbitrary entries and sizes. This is carried out under the lens of the …

  Cited by 15 Related articles

[PDF] mlr.press

Nested-wasserstein self-imitation learning for sequence generation

R Zhang, C Chen, Z Gan, Z Wen… - International …, 2020 - proceedings.mlr.press

Reinforcement learning (RL) has been widely studied for improving sequence-generation

models. However, the conventional rewards used for RL training typically cannot capture

sufficient semantic information and therefore render model bias. Further, the sparse and …

  Cited by 2 Related articles All 6 versions 

Nested-Wasserstein Self-Imitation Learning for Sequence Generation

L Carin - 2020 - openreview.net

Reinforcement learning (RL) has been widely studied for improving sequence-generation

models. However, the conventional rewards used for RL training typically cannot capture

sufficient semantic information and therefore render model bias. Further, the sparse and …


[PDF] arxiv.org

Some Theoretical Insights into Wasserstein GANs

G Biau, M Sangnier, U Tanielian - arXiv preprint arXiv:2006.02682, 2020 - arxiv.org

Generative Adversarial Networks (GANs) have been successful in producing outstanding

results in areas as diverse as image, video, and text generation. Building on these

successes, a large number of empirical studies have validated the benefits of the cousin …

  Cited by 3 Related articles All 3 versions 


Trajectories from Distribution-Valued Functional Curves: A Unified Wasserstein Framework

A Sharma, G Gerig - … Conference on Medical Image Computing and …, 2020 - Springer

Temporal changes in medical images are often evaluated along a parametrized function that

represents a structure of interest (eg white matter tracts). By attributing samples along these

functions with distributions of image properties in the local neighborhood, we create …

  All 2 versions

 <——2020———2020———-  790—— 


A Data-Driven Distributionally Robust Game Using Wasserstein Distance

G Peng, T Zhang, Q Zhu - International Conference on Decision and Game …, 2020 - Springer

This paper studies a special class of games, which enables the players to leverage the

information from a dataset to play the game. However, in an adversarial scenario, the

dataset may not be trustworthy. We propose a distributionally robust formulation to introduce …

  

[PDF] researchgate.net

[PDF] Fixed-Support Wasserstein Barycenters: Computational Hardness and Fast Algorithm

T Lin, N Ho, X Chen, M Cuturi… - Advances in Neural …, 2020 - researchgate.net

We study the fixed-support Wasserstein barycenter problem (FS-WBP), which consists in

computing the Wasserstein barycenter of m discrete probability measures supported on a

finite metric space of size n. We show first that the constraint matrix arising from the standard …

  All 4 versions 


[PDF] thecvf.com

Wasserstein Loss-Based Deep Object Detection

Y Han, X Liu, Z Sheng, Y Ren, X Han… - Proceedings of the …, 2020 - openaccess.thecvf.com

Object detection locates the objects with bounding boxes and identifies their classes, which

is valuable in many computer vision applications (eg autonomous driving). Most existing

deep learning-based methods output a probability vector for instance classification trained …

Cited by 14 Related articles All 5 versions 


[PDF] arxiv.org

Infinite-dimensional regularization of McKean-Vlasov equation with a Wasserstein diffusion

V Marx - arXiv preprint arXiv:2002.10157, 2020 - arxiv.org

Much effort has been spent in recent years on restoring uniqueness of McKean-Vlasov

SDEs with non-smooth coefficients. As a typical instance, the velocity field is assumed to be

bounded and measurable in its space variable and Lipschitz-continuous with respect to the …

  Cited by 1 Related articles All 20 versions 


[PDF] arxiv.org

SA vs SAA for population Wasserstein barycenter calculation

D Dvinskikh - arXiv preprint arXiv:2001.07697, 2020 - arxiv.org

In Machine Learning and Optimization community there are two main approaches for convex

risk minimization problem. The first approach is Stochastic Averaging (SA)(online) and the

second one is Stochastic Average Approximation (SAA)(Monte Carlo, Empirical Risk …

  Cited by 3 Related articles All 2 versions 


  

[PDF] arxiv.org

A Riemannian Block Coordinate Descent Method for Computing the Projection Robust Wasserstein Distance

M Huang, S Ma, L Lai - arXiv preprint arXiv:2012.05199, 2020 - arxiv.org

The Wasserstein distance has become increasingly important in machine learning and deep

learning. Despite its popularity, the Wasserstein distance is hard to approximate because of

the curse of dimensionality. A recently proposed approach to alleviate the curse of …

  All 2 versions 


[PDF] arxiv.org

Stochastic Optimization for Regularized Wasserstein Estimators

M Ballu, Q Berthet, F Bach - arXiv preprint arXiv:2002.08695, 2020 - arxiv.org

Optimal transport is a foundational problem in optimization, that allows to compare

probability distributions while taking into account geometric aspects. Its optimal objective

value, the Wasserstein distance, provides an important loss between distributions that has …

  Cited by 5 Related articles All 3 versions 


Adaptive Wasserstein Hourglass for Weakly Supervised RGB 3D Hand Pose Estimation

Y Zhang, L Chen, Y Liu, W Zheng, J Yong - Proceedings of the 28th ACM …, 2020 - dl.acm.org

The deficiency of labeled training data is one of the bottlenecks in 3D hand pose estimation

from monocular RGB images. Synthetic datasets have a large number of images with

precise annotations, but their obvious difference with real-world datasets limits the …

  Related articles


[PDF] arxiv.org

A Wasserstein coupled particle filter for multilevel estimation

M Ballesio, A Jasra, E von Schwerin… - arXiv preprint arXiv …, 2020 - arxiv.org

In this paper, we consider the filtering problem for partially observed diffusions, which are

regularly observed at discrete times. We are concerned with the case when one must resort

to time-discretization of the diffusion process if the transition density is not available in an …

 Cited by 8 Related articles All 5 versions 


[PDF] arxiv.org

Approximation rate in Wasserstein distance of probability measures on the real line by deterministic empirical measures

O Bencheikh, B Jourdain - arXiv preprint arXiv:2012.09729, 2020 - arxiv.org

We are interested in the approximation in Wasserstein distance with index $\rho\ge 1$ of a

probability measure $\mu $ on the real line with finite moment of order $\rho $ by the

empirical measure of $ N $ deterministic points. The minimal error converges to $0 $ as …

  All 3 versions
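A one-line reminder that is useful for this entry and for the other one-dimensional results on this page: on the real line the Wasserstein distance of index $\rho \ge 1$ reduces to a comparison of quantile functions,

\[
W_\rho(\mu,\nu)^{\rho} = \int_0^1 \big| F_\mu^{-1}(u) - F_\nu^{-1}(u) \big|^{\rho} \, du ,
\]

so approximating $\mu$ by an $N$-point deterministic empirical measure, as in the entry above, is essentially a quantization problem for the quantile function.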

<——2020———————2020 ———————-800—


[PDF] mlr.press

Wasserstein smoothing: Certified robustness against wasserstein adversarial attacks

A Levine, S Feizi - International Conference on Artificial …, 2020 - proceedings.mlr.press

In the last couple of years, several adversarial attack methods based on different threat

models have been proposed for the image classification problem. Most existing defenses

consider additive threat models in which sample perturbations have bounded L_p norms …

  Cited by 12 Related articles All 2 versions 


 [PDF] arxiv.org

Convergence and concentration of empirical measures under wasserstein distance in unbounded functional spaces

J Lei - Bernoulli, 2020 - projecteuclid.org

We provide upper bounds of the expected Wasserstein distance between a probability

measure and its empirical version, generalizing recent results for finite dimensional

Euclidean spaces and bounded functional spaces. Such a generalization can cover …

  Cited by 38 Related articles All 5 versions


[PDF] wiley.com

Evaluating the performance of climate models based on Wasserstein distance

G Vissio, V Lembo, V Lucarini… - Geophysical Research …, 2020 - Wiley Online Library

We propose a methodology for intercomparing climate models and evaluating their

performance against benchmarks based on the use of the Wasserstein distance (WD). This

distance provides a rigorous way to measure quantitatively the difference between two …

  Cited by 2 All 13 versions


[PDF] neurips.cc

[PDF] Faster Wasserstein Distance Estimation with the Sinkhorn Divergence

L Chizat, P Roussillon, F Léger… - Advances in Neural …, 2020 - proceedings.neurips.cc

The squared Wasserstein distance is a natural quantity to compare probability distributions

in a non-parametric setting. This quantity is usually estimated with the plug-in estimator,

defined via a discrete optimal transport problem which can be solved to $\epsilon …

 Cited by 53 Related articles All 7 versions 
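The Sinkhorn divergence in the entry above is built from entropically regularized optimal transport. As a point of reference, the standard Sinkhorn iteration for two histograms a, b with cost matrix C can be sketched as follows (this is not the estimator analyzed in the paper; variable names and the toy example are illustrative):

import numpy as np

def sinkhorn_cost(a, b, C, eps=0.05, n_iter=1000):
    # Entropic OT between histograms a (n,) and b (m,) with cost matrix C (n, m)
    K = np.exp(-C / eps)               # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)              # scale columns to match marginal b
        u = a / (K @ v)                # scale rows to match marginal a
    P = u[:, None] * K * v[None, :]    # approximate optimal transport plan
    return np.sum(P * C)               # transport cost of the entropic plan

# toy example: two discretized Gaussians on a 1D grid with squared-distance cost
x = np.linspace(0.0, 1.0, 50)
a = np.exp(-(x - 0.3) ** 2 / 0.01); a /= a.sum()
b = np.exp(-(x - 0.7) ** 2 / 0.02); b /= b.sum()
C = (x[:, None] - x[None, :]) ** 2
print(sinkhorn_cost(a, b, C))

The Sinkhorn divergence then debiases this quantity, roughly $S_\varepsilon(a,b) = \mathrm{OT}_\varepsilon(a,b) - \tfrac12\mathrm{OT}_\varepsilon(a,a) - \tfrac12\mathrm{OT}_\varepsilon(b,b)$, which is the debiased estimator the paper's title refers to.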

[PDF] arxiv.org

When ot meets mom: Robust estimation of wasserstein distance

G Staerman, P Laforgue, P Mozharovskyi… - arXiv preprint arXiv …, 2020 - arxiv.org

Issued from Optimal Transport, the Wasserstein distance has gained importance in Machine

Learning due to its appealing geometrical properties and the increasing availability of

efficient approximations. In this work, we consider the problem of estimating the Wasserstein  …

  Cited by 2 All 4 versions


2020

 

Severity-aware semantic segmentation with reinforced wasserstein training

X Liu, W Ji, J You, GE Fakhri… - Proceedings of the IEEE …, 2020 - openaccess.thecvf.com

Semantic segmentation is a class of methods to classify each pixel in an image into

semantic classes, which is critical for autonomous vehicles and surgery systems. Cross-

entropy (CE) loss-based deep neural networks (DNN) achieved great success wrt the …

  Cited by 7 Related articles 

 

[PDF] arxiv.org

Wasserstein regression

Y Chen, Z Lin, HG Müller - arXiv preprint arXiv:2006.09660, 2020 - arxiv.org

The analysis of samples of random objects that do not lie in a vector space has found

increasing attention in statistics in recent years. An important class of such object data is

univariate probability measures defined on the real line. Adopting the Wasserstein metric …

  Cited by 1 All 2 versions 


   [PDF] archives-ouvertes.fr

Learning with minibatch Wasserstein: asymptotic and gradient properties

K Fatras, Y Zine, R Flamary… - the 23rd …, 2020 - hal.archives-ouvertes.fr

Optimal transport distances are powerful tools to compare probability distributions and have

found many applications in machine learning. Yet their algorithmic complexity prevents their

direct use on large scale datasets. To overcome this challenge, practitioners compute these …

  Cited by 86 Related articles All 3 versions


2020  see 2019  [PDF] academia.edu

Data augmentation in fault diagnosis based on the Wasserstein generative adversarial network with gradient penalty

X Gao, F Deng, X Yue - Neurocomputing, 2020 - Elsevier

Fault detection and diagnosis in industrial process is an extremely essential part to keep

away from undesired events and ensure the safety of operators and facilities. In the last few

decades various data based machine learning algorithms have been widely studied to …

Cited by 90 Related articles All 3 versions
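Several entries on this page train Wasserstein GANs with a gradient penalty (WGAN-GP). For reference, the standard WGAN-GP critic objective (Gulrajani et al.; stated here as general background, not as this paper's contribution) is

\[
L = \mathbb{E}_{\tilde{x} \sim P_g}\big[D(\tilde{x})\big] - \mathbb{E}_{x \sim P_r}\big[D(x)\big] + \lambda \, \mathbb{E}_{\hat{x}}\Big[\big(\lVert \nabla_{\hat{x}} D(\hat{x}) \rVert_2 - 1\big)^2\Big],
\]

where $P_r$ and $P_g$ are the real and generated distributions and $\hat{x}$ is sampled uniformly on segments between real and generated samples; the penalty softly enforces the 1-Lipschitz constraint under which the critic estimates the Wasserstein-1 distance.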


  [PDF] mlr.press

A fast proximal point method for computing exact wasserstein distance

Y Xie, X Wang, R Wang, H Zha - Uncertainty in Artificial …, 2020 - proceedings.mlr.press

Wasserstein distance plays increasingly important roles in machine learning, stochastic

programming and image processing. Major efforts have been under way to address its high

computational complexity, some leading to approximate or regularized variations such as …

 Cited by 85 Related articles All 6 versions 
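For contrast with the entropic approximations discussed elsewhere on this page, the "exact" Wasserstein distance in the entry above is the optimal value of the Kantorovich linear program. A deliberately naive dense-LP sketch using scipy follows (a baseline illustration, not the proximal point method of the paper; function and variable names are ours):

import numpy as np
from scipy.optimize import linprog

def exact_ot_cost(a, b, C):
    # Exact Kantorovich OT between histograms a (n,) and b (m,) with cost matrix C (n, m):
    # minimize <P, C> subject to P >= 0, row sums = a, column sums = b
    n, m = C.shape
    A_eq = np.zeros((n + m, n * m))
    for i in range(n):
        A_eq[i, i * m:(i + 1) * m] = 1.0   # sum_j P[i, j] = a[i]
    for j in range(m):
        A_eq[n + j, j::m] = 1.0            # sum_i P[i, j] = b[j]
    res = linprog(C.ravel(), A_eq=A_eq, b_eq=np.concatenate([a, b]),
                  bounds=(0, None), method="highs")
    return res.fun

# tiny example: 3-point and 4-point histograms with random costs
rng = np.random.default_rng(0)
a = np.array([0.2, 0.5, 0.3])
b = np.array([0.25, 0.25, 0.25, 0.25])
C = rng.random((3, 4))
print(exact_ot_cost(a, b, C))

The dense LP scales poorly with n and m, which is precisely what motivates the proximal point, Sinkhorn and barycenter algorithms collected in the surrounding entries.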

<——2020———2020————-  810——

 

[PDF] arxiv.org

When can Wasserstein GANs minimize Wasserstein Distance?

Y Li, Z Dou - arXiv preprint arXiv:2003.04033, 2020 - arxiv.org

Generative Adversarial Networks (GANs) are widely used models to learn complex real-

world distributions. In GANs, the training of the generator usually stops when the

discriminator can no longer distinguish the generator's output from the set of training …

Cited by 7 Related articles All 2 versions 

[PDF] arxiv.org

Fisher information regularization schemes for Wasserstein gradient flows

W Li, J Lu, L Wang - Journal of Computational Physics, 2020 - Elsevier

We propose a variational scheme for computing Wasserstein gradient flows. The scheme

builds upon the Jordan–Kinderlehrer–Otto framework with the Benamou-Brenier's dynamic

formulation of the quadratic Wasserstein metric and adds a regularization by the Fisher …

  Cited by 8 Related articles All 8 versions


Learning to Align via Wasserstein for Person Re-Identification

Z Zhang, Y Xie, D Li, W Zhang… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

Existing successful person re-identification (Re-ID) models often employ the part-level

representation to extract the fine-grained information, but commonly use the loss that is

particularly designed for global features, ignoring the relationship between semantic parts …

 Cited by 13 Related articles All 2 versions

 [PDF] researchgate.net

The quadratic Wasserstein metric for inverse data matching

B Engquist, K Ren, Y Yang - Inverse Problems, 2020 - iopscience.iop.org

This work characterizes, analytically and numerically, two major effects of the quadratic

Wasserstein (W 2) distance as the measure of data discrepancy in computational solutions

of inverse problems. First, we show, in the infinite-dimensional setup, that the W 2 distance …

Cited by 17 Related articles All 6 versions

[PDF] arxiv.org

Lagrangian schemes for Wasserstein gradient flows

JA Carrillo, D Matthes, MT Wolfram - arXiv preprint arXiv:2003.03803, 2020 - arxiv.org

This paper reviews different numerical methods for specific examples of Wasserstein

gradient flows: we focus on nonlinear Fokker-Planck equations, but also discuss

discretizations of the parabolic-elliptic Keller-Segel model and of the fourth order thin film …

  Cited by 2 Related articles All 3 versions 


2020


[PDF] arxiv.org

The back-and-forth method for wasserstein gradient flows

M Jacobs, W Lee, F Léger - arXiv preprint arXiv:2011.08151, 2020 - arxiv.org

We present a method to efficiently compute Wasserstein gradient flows. Our approach is

based on a generalization of the back-and-forth method (BFM) introduced by Jacobs and

Léger to solve optimal transport problems. We evolve the gradient flow by solving the dual …

  Cited by 1 All 2 versions 


[PDF] arxiv.org

Wasserstein autoencoders for collaborative filtering

X Zhang, J Zhong, K Liu - Neural Computing and Applications, 2020 - Springer

The recommender systems have long been studied in the literature. The collaborative

filtering is one of the most widely adopted recommendation techniques which is usually

applied to the explicit data, eg, rating scores. However, the implicit data, eg, click data, is …

  Cited by 9 Related articles All 3 versions


Gradient descent algorithms for Bures-Wasserstein barycenters

S Chewi, T Maunu, P Rigollet… - … on Learning Theory, 2020 - proceedings.mlr.press

We study first order methods to compute the barycenter of a probability distribution $ P $

over the space of probability measures with finite second moment. We develop a framework

to derive global rates of convergence for both gradient descent and stochastic gradient …

  Cited by 17 Related articles All 5 versions 


[PDF] arxiv.org

Sampling of probability measures in the convex order by Wasserstein projection

A Alfonsi, J Corbetta, B Jourdain - Annales de l'Institut Henri …, 2020 - projecteuclid.org

In this paper, for $\mu $ and $\nu $ two probability measures on $\mathbb {R}^{d} $ with

finite moments of order $\varrho\ge 1$, we define the respective projections for the $ W_

{\varrho} $-Wasserstein distance of $\mu $ and $\nu $ on the sets of probability measures …

  Cited by 14 Related articles All 5 versions


[PDF] arxiv.org

Multimarginal wasserstein barycenter for stain normalization and augmentation

S Nadeem, T Hollmann, A Tannenbaum - International Conference on …, 2020 - Springer

Variations in hematoxylin and eosin (H&E) stained images (due to clinical lab protocols,

scanners, etc) directly impact the quality and accuracy of clinical diagnosis, and hence it is

important to control for these variations for a reliable diagnosis. In this work, we present a …

  Cited by 2 All 3 versions

Multimarginal Wasserstein Barycenter for Stain Normalization and Augmentation.

By: Nadeem, Saad; Hollmann, Travis; Tannenbaum, Allen

Medical image computing and computer-assisted intervention : MICCAI ... International Conference on Medical Image Computing and Computer-Assisted Intervention  Volume: 12265   Pages: 362-371   Published: 2020-Oct (Epub 2020 Sep 29)


   <——2020———2020————-  820——


[PDF] arxiv.org

The quantum Wasserstein distance of order 1

G De Palma, M Marvian, D Trevisan, S Lloyd - arXiv preprint arXiv …, 2020 - arxiv.org

We propose a generalization of the Wasserstein distance of order 1 to the quantum states of

$ n $ qudits. The proposal recovers the Hamming distance for the vectors of the canonical

basis, and more generally the classical Wasserstein distance for quantum states diagonal in …

  Cited by 1 All 3 versions 


Generative adversarial networks based on Wasserstein distance for knowledge graph embeddings

Y Dai, S Wang, X Chen, C Xu, W Guo - Knowledge-Based Systems, 2020 - Elsevier

Abstract Knowledge graph embedding aims to project entities and relations into low-

dimensional and continuous semantic feature spaces, which has captured more attention in

recent years. Most of the existing models roughly construct negative samples via a uniformly …

 Cited by 15 Related articles All 2 versions

[PDF] arxiv.org

Wasserstein GANs for MR imaging: from paired to unpaired training

K Lei, M Mardani, JM Pauly… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

Lack of ground-truth MR images impedes the common supervised training of neural

networks for image reconstruction. To cope with this challenge, this paper leverages

unpaired adversarial training for reconstruction networks, where the inputs are …

 Cited by 24 Related articles All 9 versions

  [PDF] arxiv.org

Image hashing by minimizing independent relaxed wasserstein distance

KD Doan, A Kimiyaie, S Manchanda… - arXiv preprint arXiv …, 2020 - arxiv.org

Image hashing is a fundamental problem in the computer vision domain with various

challenges, primarily, in terms of efficiency and effectiveness. Existing hashing methods lack

a principled characterization of the goodness of the hash codes and a principled approach …

  Cited by 2 Related articles All 3 versions 


De novo protein design for novel folds using guided conditional Wasserstein generative adversarial networks

M Karimi, S Zhu, Y Cao, Y Shen - Journal of Chemical Information …, 2020 - ACS Publications

Although massive data is quickly accumulating on protein sequence and structure, there is a

small and limited number of protein architectural types (or structural folds). This study is

addressing the following question: how well could one reveal underlying sequence …

  Cited by 1 All 3 versions


[PDF] researchgate.net

On the computation of Wasserstein barycenters

G Puccetti, L Rüschendorf, S Vanduffel - Journal of Multivariate Analysis, 2020 - Elsevier

The Wasserstein barycenter is an important notion in the analysis of high dimensional data

with a broad range of applications in applied probability, economics, statistics, and in

particular to clustering and image processing. In this paper, we state a general version of the …

  Cited by 7 Related articles All 7 versions

Cited by 15 Related articles All 8 versions

[PDF] arxiv.org

Global sensitivity analysis and Wasserstein spaces

JC Fort, T Klein, A Lagnoux - arXiv preprint arXiv:2007.12378, 2020 - arxiv.org

Sensitivity indices are commonly used to quantify the relative influence of any specific group of input variables on the output of a computer code. In this paper, we focus both on computer codes the output of which is a cumulative distribution function and on stochastic computer …

  Cited by 1 All 8 versions 

 
[PDF] arxiv.org

Stein factors for variance-gamma approximation in the Wasserstein and Kolmogorov distances

RE Gaunt - arXiv preprint arXiv:2008.06088, 2020 - arxiv.org

We obtain new bounds for the solution of the variance-gamma (VG) Stein equation that are

of the correct form for approximations in terms of the Wasserstein and Kolmogorov metrics.

These bounds hold for all parameters values of the four parameter VG class. As an …

  Cited by 2 All 3 versions 


[PDF] core.ac.uk thesis  Ashworth.pdf 

[PDF] Structure-preserving variational schemes for fourth order nonlinear partial differential equations with a Wasserstein gradient flow structure

B Ashworth - 2020 - core.ac.uk

There is a growing interest in studying nonlinear partial differential equations which

constitute gradient flows in the Wasserstein metric and related structure preserving

variational discretisations. In this thesis, we focus on the fourth order Derrida-Lebowitz …

Related articles All 2 versions 

[PDF] arxiv.org

Wasserstein Embedding for Graph Learning

S Kolouri, N Naderializadeh, GK Rohde… - arXiv preprint arXiv …, 2020 - arxiv.org

We present Wasserstein Embedding for Graph Learning (WEGL), a novel and fast

framework for embedding entire graphs in a vector space, in which various machine

learning models are applicable for graph-level prediction tasks. We leverage new insights …

Cited by 17 Related articles All 5 versions 

<——2020———2020————-  830——


[PDF] arxiv.org

A variational finite volume scheme for Wasserstein gradient flows

C Cancès, TO Gallouët, G Todeschi - Numerische Mathematik, 2020 - Springer

We propose a variational finite volume scheme to approximate the solutions to Wasserstein

gradient flows. The time discretization is based on an implicit linearization of the

Wasserstein distance expressed thanks to Benamou–Brenier formula, whereas space …

 Cited by 12 Related articles All 11 versions

 

year 2020

[CITATION] Data augmentation method for power transformer fault diagnosis based on conditional Wasserstein generative adversarial network

YP Liu, Z Xu, J He, Q Wang, SG Gao, J Zhao - Power System Technology, 2020

  Cited by 3 Related articles

W-LDMM: A wasserstein driven low-dimensional manifold model for noisy image restoration

R He, X Feng, W Wang, X Zhu, C Yang - Neurocomputing, 2020 - Elsevier

The Wasserstein distance originated from the optimal transport theory is a general and

flexible statistical metric in a variety of image processing problems. In this paper, we propose

a novel Wasserstein driven low-dimensional manifold model (W-LDMM), which tactfully …

  Cited by 2 Related articles All 2 versions

 
2020  
[PDF] arxiv.org

Scalable computations of wasserstein barycenter via input convex neural networks

J Fan, A Taghvaei, Y Chen - arXiv preprint arXiv:2007.04462, 2020 - arxiv.org

Wasserstein Barycenter is a principled approach to represent the weighted mean of a given

set of probability distributions, utilizing the geometry induced by optimal transport. In this

work, we present a novel scalable algorithm to approximate the Wasserstein Barycenters …

  Cited by 1 All 3 versions 


2020

Remaining useful life prediction of lithium-ion batteries using a fusion method based on Wasserstein GAN

周温丁, 鲍士兼, 许方敏, 赵成林 - The Journal of China Universities of Posts and Telecommunications (English edition), 2020 - jcupt.bupt.edu.cn

Lithium-ion batteries are the main power supply equipment in many fields due to their

advantages of no memory, high energy density, long cycle life and no pollution to the

environment. Accurate prediction for the remaining useful life (RUL) of lithium-ion batteries …

  All 2 versions 

[PDF] arxiv.org

node2coords: Graph representation learning with wasserstein barycenters

E Simou, D Thanou, P Frossard - arXiv preprint arXiv:2007.16056, 2020 - arxiv.org

In order to perform network analysis tasks, representations that capture the most relevant

information in the graph structure are needed. However, existing methods do not learn

representations that can be interpreted in a straightforward way and that are robust to …

  Cited by 1 All 2 versions 


[PDF] optimization-online.org

Data-driven distributionally robust chance-constrained optimization with Wasserstein metric

R Ji, MA Lejeune - Journal of Global Optimization, 2020 - Springer

We study distributionally robust chance-constrained programming (DRCCP) optimization

problems with data-driven Wasserstein ambiguity sets. The proposed algorithmic and

reformulation framework applies to all types of distributionally robust chance-constrained …

  Cited by 9 Related articles All 3 versions


Data-driven distributionally robust unit commitment with Wasserstein metric: Tractable formulation and efficient solution method

X Zheng, H Chen - IEEE Transactions on Power Systems, 2020 - ieeexplore.ieee.org

In this letter, we propose a tractable formulation and an efficient solution method for the

Wasserstein-metric-based distributionally robust unit commitment (DRUC-dW) problem.

First, a distance-based data aggregation method is introduced to hedge against the …

  Cited by 3 All 2 versions


[PDF] arxiv.org

Limit Distribution Theory for Smooth Wasserstein Distance with Applications to Generative Modeling

Z Goldfeld, K Kato - arXiv preprint arXiv:2002.01012, 2020 - arxiv.org

The 1-Wasserstein distance ($\mathsf {W} _1 $) is a popular proximity measure between

probability distributions. Its metric structure, robustness to support mismatch, and rich

geometric structure fueled its wide adoption for machine learning tasks. Such tasks …

  Cited by 1 Related articles All 2 versions 

 <——2020———2020————-  840——


[HTML] mdpi.com

Calculating the Wasserstein metric-based Boltzmann entropy of a landscape mosaic

H Zhang, Z Wu, T Lan, Y Chen, P Gao - Entropy, 2020 - mdpi.com

Shannon entropy is currently the most popular method for quantifying the disorder or

information of a spatial data set such as a landscape pattern and a cartographic map.

However, its drawback when applied to spatial data is also well documented; it is incapable …

Cited by 8 Related articles All 9 versions 

  [PDF] arxiv.org

Graph Wasserstein Correlation Analysis for Movie Retrieval

X Zhang, T Zhang, X Hong, Z Cui, J Yang - European Conference on …, 2020 - Springer

Movie graphs play an important role to bridge heterogenous modalities of videos and texts

in human-centric retrieval. In this work, we propose Graph Wasserstein Correlation Analysis

(GWCA) to deal with the core issue therein, ie, cross heterogeneous graph comparison …

 Related articles All 5 versions


[PDF] arxiv.org

Stochastic saddle-point optimization for wasserstein barycenters

D Tiapkin, A Gasnikov, P Dvurechensky - arXiv preprint arXiv:2006.06763, 2020 - arxiv.org

We study the computation of non-regularized Wasserstein barycenters of probability

measures supported on the finite set. The first result gives a stochastic optimization

algorithm for the discrete distribution over the probability measures which is comparable …

  Cited by 2 All 3 versions 


 [PDF] arxiv.org

Exponential contraction in Wasserstein distances for diffusion semigroups with negative curvature

FY Wang - Potential Analysis, 2020 - Springer

Let P t be the (Neumann) diffusion semigroup P t generated by a weighted Laplacian on a

complete connected Riemannian manifold M without boundary or with a convex boundary. It

is well known that the Bakry-Emery curvature is bounded below by a positive constant> 0 …

  Cited by 18 Related articles All 3 versions
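The exponential contraction referred to above is an estimate of the following shape (the constants and the admissible range of $p$ depend on the curvature assumptions of the paper; this is only the generic form):

\[
W_p(\mu P_t, \nu P_t) \le c \, e^{-\lambda t} \, W_p(\mu,\nu), \qquad t \ge 0,
\]

for some $c, \lambda > 0$, i.e., the diffusion semigroup $P_t$ brings any two initial distributions together exponentially fast in Wasserstein distance.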


[PDF] arxiv.org

Finite-sample guarantees for Wasserstein distributionally robust optimization: Breaking the curse of dimensionality

R Gao - arXiv preprint arXiv:2009.04382, 2020 - arxiv.org

Wasserstein distributionally robust optimization (DRO) aims to find robust and generalizable

solutions by hedging against data perturbations in Wasserstein distance. Despite its recent

empirical success in operations research and machine learning, existing performance …

 Cited by 17 Related articles All 3 versions 
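
To see why such distributionally robust problems behave like regularized empirical risk minimization, recall the Kantorovich–Rubinstein bound (a standard fact, stated here as editorial context rather than a result of the paper above): if the loss \(\ell_\theta\) is \(L\)-Lipschitz in the data, then

\[ \sup_{Q:\, W_1(Q,\hat P_n)\le \varepsilon} \mathbb{E}_{Q}[\ell_\theta(X)] \;\le\; \mathbb{E}_{\hat P_n}[\ell_\theta(X)] + \varepsilon L, \]

with equality for losses that keep increasing at rate \(L\) arbitrarily far from the data (for instance, linear losses on \(\mathbb{R}^d\)). The worst-case risk over the Wasserstein ball is therefore the empirical risk plus a Lipschitz penalty scaled by the radius \(\varepsilon\).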

  

2020

[PDF] researchgate.net

Sample generation based on a supervised Wasserstein Generative Adversarial Network for high-resolution remote-sensing scene classification

W HanL Wang, R Feng, L Gao, X Chen, Z Deng… - Information …, 2020 - Elsevier

As high-resolution remote-sensing (HRRS) images have become increasingly widely

available, scene classification focusing on the smart classification of land cover and land

use has also attracted more attention. However, mainstream methods encounter a severe …

Cited  by 7 Related articles All 3 versions

  

Primal heuristics for wasserstein barycenters

PY Bouchet, S GualandiLM Rousseau - International Conference on …, 2020 - Springer

This paper presents primal heuristics for the computation of Wasserstein Barycenters of a

given set of discrete probability measures. The computation of a Wasserstein Barycenter is

formulated as an optimization problem over the space of discrete probability measures. In …

  Cited by 1


On the Computation of Kantorovich--Wasserstein Distances Between Two-Dimensional Histograms by Uncapacitated Minimum Cost Flows

F Bassetti, S Gualandi, M Veneroni - SIAM Journal on Optimization, 2020 - SIAM

In this work, we present a method to compute the Kantorovich--Wasserstein distance of

order 1 between a pair of two-dimensional histograms. Recent works in computer vision and

machine learning have shown the benefits of measuring Wasserstein distances of order 1 …

Cited by 10 Related articles All 2 versions
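
The discrete problem behind such computations is a linear program: given histogram weights a and b and a ground-cost matrix C, minimize the total cost of a coupling whose row sums are a and whose column sums are b. As an editorial illustration only (the paper above uses a far more efficient uncapacitated min-cost-flow formulation), here is a generic sketch with scipy's LP solver; all names are ours and a recent scipy (with the "highs" method) is assumed.

# Discrete optimal transport between two histograms as a linear program (sketch).
import numpy as np
from scipy.optimize import linprog

def discrete_ot(a, b, C):
    """Optimal transport cost between histograms a (m,) and b (n,) under cost C (m, n)."""
    m, n = C.shape
    A_eq = np.zeros((m + n, m * n))
    for i in range(m):
        A_eq[i, i * n:(i + 1) * n] = 1.0   # row sums of the coupling equal a
    for j in range(n):
        A_eq[m + j, j::n] = 1.0            # column sums of the coupling equal b
    res = linprog(C.ravel(), A_eq=A_eq, b_eq=np.concatenate([a, b]),
                  bounds=(0, None), method="highs")
    return res.fun

# Two tiny histograms on the grid {0, 1, 2} with |x - y| as ground cost.
x = np.arange(3.0)
C = np.abs(x[:, None] - x[None, :])
print(discrete_ot(np.array([0.5, 0.5, 0.0]), np.array([0.0, 0.5, 0.5]), C))  # -> 1.0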


[PDF] arxiv.org

Distributed Optimization with Quantization for Computing Wasserstein Barycenters

R Krawtschenko, CA UribeA Gasnikov… - arXiv preprint arXiv …, 2020 - arxiv.org

We study the problem of the decentralized computation of entropy-regularized semi-discrete

Wasserstein barycenters over a network. Building upon recent primal-dual approaches, we

propose a sampling gradient quantization scheme that allows efficient communication and …

  Cited by 1 All 3 versions 


[PDF] unifi.it

[PDF] Conlon: A pseudo-song generator based on a new pianoroll, wasserstein autoencoders, and optimal interpolations

L Angioloni, T Borghuis, L Brusci… - Proceedings of the 21st …, 2020 - flore.unifi.it

We introduce CONLON, a pattern-based MIDI generation method that employs a new

lossless pianoroll-like data description in which velocities and durations are stored in

separate channels. CONLON uses Wasserstein autoencoders as the underlying generative …

 Cited by 1 Related articles All 12 versions 

<——2020———2020————-  850-——


[PDF] arxiv.org

First-Order Methods for Wasserstein Distributionally Robust MDP

J Grand-ClementC Kroer - arXiv preprint arXiv:2009.06790, 2020 - arxiv.org

Markov Decision Processes (MDPs) are known to be sensitive to parameter specification.

Distributionally robust MDPs alleviate this issue by allowing for ambiguity sets which give a

set of possible distributions over parameter sets. The goal is to find an optimal policy with …

  Cited by 1 All 3 versions 


[PDF] arxiv.org

Irregularity of distribution in Wasserstein distance

C Graham - Journal of Fourier Analysis and Applications, 2020 - Springer

We study the non-uniformity of probability measures on the interval and circle. On the

interval, we identify the Wasserstein-p distance with the classical\(L^ p\)-discrepancy. We

thereby derive sharp estimates in Wasserstein distances for the irregularity of distribution of …

  Cited by 1 Related articles All 2 versions
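
The identification mentioned in this abstract is very concrete in practice: for two empirical measures of the same size on the line, the p-Wasserstein distance is just an L^p norm of the difference of the sorted samples (equivalently, of the empirical quantile functions). A short numpy illustration of that textbook fact (our own sketch, not code from the paper):

# 1-D p-Wasserstein distance via sorted samples / quantile functions.
import numpy as np

def wasserstein_p_1d(x, y, p=2):
    """W_p between two equal-size empirical measures on the real line."""
    x, y = np.sort(x), np.sort(y)
    return np.mean(np.abs(x - y) ** p) ** (1.0 / p)

rng = np.random.default_rng(1)
u = rng.uniform(size=1000)       # i.i.d. Uniform(0, 1) sample
g = np.linspace(0.0, 1.0, 1000)  # a very evenly spread point set on [0, 1]
print(wasserstein_p_1d(u, g, p=1), wasserstein_p_1d(u, g, p=2))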


Wasserstein Loss With Alternative Reinforcement Learning for Severity-Aware Semantic Segmentation

X Liu, Y Lu, X Liu, S Bai, S Li… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

Semantic segmentation is important for many real-world systems, eg, autonomous vehicles,

which predict the class of each pixel. Recently, deep networks achieved significant progress

wrt the mean Intersection-over Union (mIoU) with the cross-entropy loss. However, the cross …

Cited by 16 Related articles All 6 versions

 

[PDF] arxiv.org

Cutoff thermalization for Ornstein-Uhlenbeck systems with small Lévy noise in the Wasserstein distance

G BarreraMA HögeleJC Pardo - arXiv preprint arXiv:2009.10590, 2020 - arxiv.org

This article establishes cutoff thermalization (also known as the cutoff phenomenon) for a

general class of general Ornstein-Uhlenbeck systems $(X^\epsilon_t (x)) _ {t\geq 0} $ under

$\epsilon $-small additive Lévy noise with initial value $ x $. The driving noise processes …

  Cited by 1 All 3 versions 


[PDF] arxiv.org

Approximate Bayesian computation with the sliced-Wasserstein distance

K Nadjahi, V De Bortoli, A Durmus… - ICASSP 2020-2020 …, 2020 - ieeexplore.ieee.org

Approximate Bayesian Computation (ABC) is a popular method for approximate inference in

generative models with intractable but easy-to-sample likelihood. It constructs an

approximate posterior distribution by finding parameters for which the simulated data are …

  Cited by 2 Related articles All 7 versions
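
The sliced-Wasserstein distance used as the ABC discrepancy here averages cheap one-dimensional Wasserstein distances over random projection directions. A minimal Monte Carlo sketch with numpy and scipy (our own illustration with hypothetical parameter names, not the authors' implementation):

# Sliced-Wasserstein distance by averaging 1-D distances over random directions.
import numpy as np
from scipy.stats import wasserstein_distance

def sliced_wasserstein(X, Y, n_projections=100, seed=0):
    """Approximate sliced W_1 between point clouds X (n, d) and Y (m, d)."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_projections):
        theta = rng.normal(size=X.shape[1])
        theta /= np.linalg.norm(theta)            # uniform direction on the sphere
        total += wasserstein_distance(X @ theta, Y @ theta)
    return total / n_projections

rng = np.random.default_rng(2)
X = rng.normal(0.0, 1.0, (500, 5))
Y = rng.normal(0.5, 1.0, (500, 5))
print(sliced_wasserstein(X, Y))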


[PDF] neurips.cc

[PDF] Asymptotic Guarantees for Generative Modeling Based on the Smooth Wasserstein Distance

Z GoldfeldK Greenewald… - Advances in Neural …, 2020 - proceedings.neurips.cc

Minimum distance estimation (MDE) gained recent attention as a formulation of (implicit)

generative modeling. It considers minimizing, over model parameters, a statistical distance

between the empirical data distribution and the model. This formulation lends itself well to …

Isometric study of Wasserstein spaces–the real line

G GehérT TitkosD Virosztek - Transactions of the American Mathematical …, 2020 - ams.org

Recently Kloeckner described the structure of the isometry group of the quadratic

Wasserstein space $\mathcal {W} _2 (\mathbb {R}^ n) $. It turned out that the case of the real

line is exceptional in the sense that there exists an exotic isometry flow. Following this line of …

Cited by 7 Related articles All 5 versions 


 [PDF] arxiv.org

Two-sample Test using Projected Wasserstein Distance: Breaking the Curse of Dimensionality

J Wang, R Gao, Y Xie - arXiv preprint arXiv:2010.11970, 2020 - arxiv.org

We develop a projected Wasserstein distance for the two-sample test, a fundamental

problem in statistics and machine learning: given two sets of samples, to determine whether

they are from the same distribution. In particular, we aim to circumvent the curse of …

  Cited by 1 All 3 versions 
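
One simple way to calibrate a projection-based Wasserstein statistic as a two-sample test is a permutation test: compare the observed statistic with its distribution under random relabelings of the pooled sample. The sketch below uses a single random projection purely for illustration (the paper above optimizes the projection, which this sketch does not attempt):

# Permutation two-sample test built on a projected 1-D Wasserstein statistic (sketch).
import numpy as np
from scipy.stats import wasserstein_distance

def projected_w_test(X, Y, n_perm=500, seed=0):
    rng = np.random.default_rng(seed)
    theta = rng.normal(size=X.shape[1])
    theta /= np.linalg.norm(theta)                # one fixed, data-independent direction
    stat = wasserstein_distance(X @ theta, Y @ theta)
    pooled, n = np.vstack([X, Y]) @ theta, len(X)
    perm_stats = []
    for _ in range(n_perm):
        idx = rng.permutation(len(pooled))
        perm_stats.append(wasserstein_distance(pooled[idx[:n]], pooled[idx[n:]]))
    p_value = (1 + np.sum(np.array(perm_stats) >= stat)) / (n_perm + 1)
    return stat, p_value

rng = np.random.default_rng(3)
print(projected_w_test(rng.normal(0.0, 1.0, (200, 4)), rng.normal(0.3, 1.0, (200, 4))))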


[PDF] ieee.org

An Ensemble Wasserstein Generative Adversarial Network Method for Road Extraction From High Resolution Remote Sensing Images in Rural Areas

C Yang, Z Wang - IEEE Access, 2020 - ieeexplore.ieee.org

Road extraction from high resolution remote sensing (HR-RS) images is an important yet

challenging computer vision task. In this study, we propose an ensemble Wasserstein

Generative Adversarial Network with Gradient Penalty (WGAN-GP) method called E-WGAN …

  Cited by 1 All 2 versions


[PDF] arxiv.org

Wasserstein Distributionally Robust Motion Control for Collision Avoidance Using Conditional Value-at-Risk

A HakobyanI Yang - arXiv preprint arXiv:2001.04727, 2020 - arxiv.org

In this paper, a risk-aware motion control scheme is considered for mobile robots to avoid

randomly moving obstacles when the true probability distribution of uncertainty is unknown.

We propose a novel model predictive control (MPC) method for limiting the risk of unsafety …

  Cited by 3 Related articles All 2 versions 

<——2020———2020———-  860——

[PDF] arxiv.org

Wasserstein Distance to Independence Models

TÖ Çelik, A Jamneshan, G Montúfar… - arXiv preprint arXiv …, 2020 - arxiv.org

An independence model for discrete random variables is a Segre-Veronese variety in a

probability simplex. Any metric on the set of joint states of the random variables induces a

Wasserstein metric on the probability simplex. The unit ball of this polyhedral norm is dual to …

  Related articles All 2 versions 

 

[PDF] aaai.org

Regularized Wasserstein means for aligning distributional data

L MiW ZhangY Wang - Proceedings of the AAAI Conference on …, 2020 - ojs.aaai.org

We propose to align distributional data from the perspective of Wasserstein means. We raise

the problem of regularizing Wasserstein means and propose several terms tailored to tackle

different problems. Our formulation is based on the variational transportation to distribute a …

Cited by 4 Related articles All 8 versions 

[PDF] arxiv.org

Ranking IPCC Models Using the Wasserstein Distance

G Vissio, V Lembo, V LucariniM Ghil - arXiv preprint arXiv:2006.09304, 2020 - arxiv.org

We propose a methodology for evaluating the performance of climate models based on the

use of the Wasserstein distance. This distance provides a rigorous way to measure

quantitatively the difference between two probability distributions. The proposed approach is …

  All 5 versions 

[PDF] archives-ouvertes.fr

Dynamic facial expression generation on hilbert hypersphere with conditional wasserstein generative adversarial nets

N OtberdoutM DaoudiA Kacem… - … on Pattern Analysis …, 2020 - ieeexplore.ieee.org

In this work, we propose a novel approach for generating videos of the six basic facial

expressions given a neutral face image. We propose to exploit the face geometry by

modeling the facial landmarks motion as curves encoded as points on a hypersphere. By …

Cited by 42 Related articles All 12 versions

[PDF] arxiv.org

Wasserstein distributionally robust shortest path problem

Z Wang, K YouS SongY Zhang - European Journal of Operational …, 2020 - Elsevier

This paper proposes a data-driven distributionally robust shortest path (DRSP) model where

the distribution of the travel time in the transportation network can only be partially observed

through a finite number of samples. Specifically, we aim to find an optimal path to minimize …

  Cited by 2 Related articles All 8 versions

[PDF] arxiv.org

Asymptotics of smoothed Wasserstein distances

HB Chen, J Niles-Weed - arXiv preprint arXiv:2005.00738, 2020 - arxiv.org

We investigate contraction of the Wasserstein distances on $\mathbb {R}^ d $ under

Gaussian smoothing. It is well known that the heat semigroup is exponentially contractive

with respect to the Wasserstein distances on manifolds of positive curvature; however, on flat …

  Related articles All 2 versions 
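
The Gaussian-smoothed distance studied here is the ordinary Wasserstein distance applied after convolving both measures with the same isotropic Gaussian; empirically one can mimic this by adding independent Gaussian noise to both samples. A one-dimensional Monte Carlo illustration (our own sketch; since convolution with a common kernel is a contraction, the population smoothed distance is never larger than the unsmoothed one):

# Gaussian-smoothed W1 in 1-D: add N(0, sigma^2) noise to both samples, then compare.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(4)
x = rng.normal(0.0, 1.0, 5000)
y = rng.normal(0.0, 2.0, 5000)
sigma = 1.0
w_plain = wasserstein_distance(x, y)
w_smooth = wasserstein_distance(x + rng.normal(0.0, sigma, x.size),
                                y + rng.normal(0.0, sigma, y.size))
print(w_plain, w_smooth)  # the smoothed estimate is, up to sampling noise, the smaller one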


[PDF] arxiv.org

Robust Reinforcement Learning with Wasserstein Constraint

L Hou, L PangX HongY Lan, Z Ma, D Yin - arXiv preprint arXiv …, 2020 - arxiv.org

Robust Reinforcement Learning aims to find the optimal policy with some extent of

robustness to environmental dynamics. Existing learning algorithms usually enable the

robustness through disturbing the current state or simulating environmental parameters in a …

 Cited by 6 Related articles All 3 versions 

[PDF] thecvf.com

Channel Pruning for Accelerating Convolutional Neural Networks via Wasserstein Metric

H Duan, H Li - Proceedings of the Asian Conference on …, 2020 - openaccess.thecvf.com

Channel pruning is an effective way to accelerate deep convolutional neural networks.

However, it is still a challenge to reduce the computational complexity while preserving the

performance of deep models. In this paper, we propose a novel channel pruning method via …

  Related articles All 2 versions

 

[HTML] hindawi.com

[HTML] An Integrated Consensus Improving Strategy Based on PL-Wasserstein Distance and Its Application in the Evaluation of Network Public Opinion Emergencies

S Zhang, Z Ma, X Liu, Z Wang, L Jiang - Complexity, 2020 - hindawi.com

In real life, multiple network public opinion emergencies may break out in a certain place at

the same time. So, it is necessary to invite emergency decision experts in multiple fields for

timely evaluating the comprehensive crisis of the online public opinion, and then limited …


  Related articles All 7 versions 

 An Integrated Consensus Improving Strategy Based on PL-Wasserstein Distance and Its Application in the Evaluation of Network Public Opinion Emergencies

By: Zhang, Shitao; Ma, Zhenzhen; Liu, Xiaodi; et al.

COMPLEXITY  Volume: ‏ 2020     Article Number: 9870620   Published: ‏ DEC 1 2020



Reconstruction of shale image based on Wasserstein Generative Adversarial Networks with gradient penalty

W Zha, X Li, Y Xing, L He, D Li - Advances in Geo-Energy …, 2020 - yandy-ager.com

Abstract Generative Adversarial Networks (GANs), as most popular artificial intelligence

models in the current image generation field, have excellent image generation capabilities.

Based on Wasserstein GANs with gradient penalty, this paper proposes a novel digital core …

  Cited by 7 Related articles 
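
Several entries in this list rely on the WGAN-GP objective: the critic maximizes the gap between its mean scores on real and generated samples while a gradient penalty keeps it approximately 1-Lipschitz, so that the gap approximates a Wasserstein-1 distance. A generic PyTorch-style sketch of that critic loss (editorial illustration only; `critic`, `real` and `fake` are placeholder names, and this is not the architecture of the paper above):

# Generic WGAN-GP critic loss (sketch); `critic` maps a batch of inputs to one score each.
import torch

def critic_loss_wgan_gp(critic, real, fake, lambda_gp=10.0):
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    x_hat = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    grad = torch.autograd.grad(outputs=critic(x_hat).sum(), inputs=x_hat,
                               create_graph=True)[0]
    penalty = ((grad.flatten(1).norm(2, dim=1) - 1.0) ** 2).mean()
    # The critic maximizes E[critic(real)] - E[critic(fake)], so the loss to minimize
    # is the negative of that gap plus the gradient penalty.
    return critic(fake).mean() - critic(real).mean() + lambda_gp * penalty

# Toy usage with a linear critic on 1-D data.
critic = torch.nn.Linear(1, 1)
real, fake = torch.randn(64, 1), torch.randn(64, 1) + 1.0
print(critic_loss_wgan_gp(critic, real, fake).item())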

<——2020——————2020 ———————-870—


[PDF] arxiv.org

Joint Wasserstein Distribution Matching

JZ Cao, L Mo, Q Du, Y GuoP ZhaoJ Huang… - arXiv preprint arXiv …, 2020 - arxiv.org

Joint distribution matching (JDM) problem, which aims to learn bidirectional mappings to

match joint distributions of two domains, occurs in many machine learning and computer

vision applications. This problem, however, is very difficult due to two critical challenges:(i) it …

  Related articles All 2 versions 


[PDF] arxiv.org

Statistical learning in Wasserstein space

A Karimi, L Ripani, TT Georgiou - arXiv preprint arXiv:2004.07875, 2020 - arxiv.org

We seek a generalization of regression and principle component analysis (PCA) in a metric

space where data points are distributions metrized by the Wasserstein metric. We recast

these analyses as multimarginal optimal transport problems. The particular formulation …

  Related articles All 4 versions 


[PDF] arxiv.org

A Bismut-Elworthy inequality for a Wasserstein diffusion on the circle

V Marx - arXiv preprint arXiv:2005.04972, 2020 - arxiv.org

We investigate in this paper a regularization property of a diffusion on the Wasserstein

space $\mathcal {P} _2 (\mathbb {T}) $ of the one-dimensional torus. The control obtained

on the gradient of the semi-group is very much in the spirit of Bismut-Elworthy-Li integration …

  Related articles All 21 versions 


[PDF] arxiv.org

Wasserstein Stability for Persistence Diagrams

P SkrabaK Turner - arXiv preprint arXiv:2006.16824, 2020 - arxiv.org

The stability of persistence diagrams is among the most important results in applied and

computational topology. Most results in the literature phrase stability in terms of the

bottleneck distance between diagrams and the $\infty $-norm of perturbations. This has two …

 Cited by 31 Related articles All 2 versions 

[PDF] mlr.press

Quantitative stability of optimal transport maps and linearization of the 2-wasserstein space

Q Mérigot, A Delalande… - … Conference on Artificial …, 2020 - proceedings.mlr.press

This work studies an explicit embedding of the set of probability measures into a Hilbert

space, defined using optimal transport maps from a reference probability density. This

embedding linearizes to some extent the 2-Wasserstein space and is shown to be bi-Hölder …

  Cited by 9 Related articles All 4 versions 



2020

CVaR-Based Approximations of Wasserstein Distributionally Robust Chance Constraints with Application to Process Scheduling

B Liu, Q Zhang, X Ge, Z Yuan - Industrial & Engineering Chemistry …, 2020 - ACS Publications

Distributionally robust chance constrained programming is a stochastic optimization

approach that considers uncertainty in model parameters as well as uncertainty in the

underlying probability distribution. It ensures a specified probability of constraint satisfaction …

  Related articles


[PDF] arxiv.org

Precise Limit in Wasserstein Distance for Conditional Empirical Measures of Dirichlet Diffusion Processes

FY Wang - arXiv preprint arXiv:2004.07537, 2020 - arxiv.org

Let $ M $ be a $ d $-dimensional connected compact Riemannian manifold with boundary

$\partial M $, let $ V\in C^ 2 (M) $ such that $\mu (dx):= e^{V (x)} dx $ is a probability

measure, and let $ X_t $ be the diffusion process generated by $ L:=\Delta+\nabla V $ with …

  Cited by 2 Related articles All 2 versions 


A Wasserstein gradient-penalty generative adversarial network with deep auto-encoder for bearing intelligent fault diagnosis

X Xiong, J Hongkai, X Li, M Niu - Measurement Science and …, 2020 - iopscience.iop.org

It is a great challenge to manipulate unbalanced fault data in the field of rolling bearings

intelligent fault diagnosis. In this paper, a novel intelligent fault diagnosis method called the

Wasserstein gradient-penalty generative adversarial network with deep auto-encoder is …

 Cited by 21 Related articles All 3 versions


  [PDF] arxiv.org

Tessellated Wasserstein Auto-Encoders

K Gai, S Zhang - arXiv preprint arXiv:2005.09923, 2020 - arxiv.org

Non-adversarial generative models such as variational auto-encoder (VAE), Wasserstein

auto-encoders with maximum mean discrepancy (WAE-MMD), sliced-Wasserstein auto-

encoder (SWAE) are relatively easy to train and have less mode collapse compared to …

  Related articles All 2 versions 


 [PDF] arxiv.org

The Equivalence of Fourier-based and Wasserstein Metrics on Imaging Problems

G Auricchio, A CodegoniS Gualandi… - arXiv preprint arXiv …, 2020 - arxiv.org

We investigate properties of some extensions of a class of Fourier-based probability metrics,

originally introduced to study convergence to equilibrium for the solution to the spatially

homogeneous Boltzmann equation. At difference with the original one, the new Fourier …

  Related articles All 4 versions 

 <——2020———2020———-  880——  


FRWCAE: joint faster-RCNN and Wasserstein convolutional auto-encoder for instance retrieval

Y Zhang, Y Feng, D Liu, J Shang, B Qiang - Applied Intelligence, 2020 - Springer

Based on the powerful feature extraction capability of deep convolutional neural networks,

image-level retrieval methods have achieved superior performance compared to the hand-

crafted features and indexing algorithms. However, people tend to focus on foreground …

  Cited by 1 Related articles


[PDF] researchgate.net

Infrared and Visible Image Fusion Using Dual Discriminators Generative Adversarial Networks with Wasserstein Distance

J Li, H Huo, K Liu, C Li - Information Sciences, 2020 - Elsevier

Generative adversarial network (GAN) has shown great potential in infrared and visible

image fusion. The existing GAN-based methods establish an adversarial game between

generative image and source images to train the generator until the generative image …

  Cited by 3 Related articles All 2 versions



[PDF] arxiv.org

Distributional Sliced-Wasserstein and Applications to Generative Modeling

K NguyenN HoT PhamH Bui - arXiv preprint arXiv:2002.07367, 2020 - arxiv.org

Sliced-Wasserstein distance (SWD) and its variation, Max Sliced-Wasserstein distance (Max-

SWD), have been widely used in the recent years due to their fast computation and

scalability when the probability measures lie in very high dimension. However, these …

Cited by 33 Related articles All 12 versions 

 

[PDF] arxiv.org

Exponential contraction in Wasserstein distance on static and evolving manifolds

LJ Cheng, A Thalmaier, SQ Zhang - arXiv preprint arXiv:2001.06187, 2020 - arxiv.org

In this article, exponential contraction in Wasserstein distance for heat semigroups of

diffusion processes on Riemannian manifolds is established under curvature conditions

where Ricci curvature is not necessarily required to be non-negative. Compared to the …

Cited by 3 Related articles All 7 versions 

 

Adversarial sliced Wasserstein domain adaptation networks

Y Zhang, N Wang, S Cai - Image and Vision Computing, 2020 - Elsevier

Abstract Domain adaptation has become a resounding success in learning a domain

agnostic model that performs well on target dataset by leveraging source dataset which has

related data distribution. Most of existing works aim at learning domain-invariant features …

Cited by 5 Related articles All 2 versions

[PDF] arxiv.org

Multivariate goodness-of-Fit tests based on Wasserstein distance

M HallinG MordantJ Segers - arXiv preprint arXiv:2003.06684, 2020 - arxiv.org

Goodness-of-fit tests based on the empirical Wasserstein distance are proposed for simple

and composite null hypotheses involving general multivariate distributions. This includes the

important problem of testing for multivariate normality with unspecified mean vector and …

  Cited by 3 Related articles All 9 versions 

 

[PDF] aaai.org

Improving the Robustness of Wasserstein Embedding by Adversarial PAC-Bayesian Learning

D Ding, M Zhang, X PanM YangX He - Proceedings of the AAAI …, 2020 - ojs.aaai.org

Node embedding is a crucial task in graph analysis. Recently, several methods are

proposed to embed a node as a distribution rather than a vector to capture more information.

Although these methods achieved noticeable improvements, their extra complexity brings …


 
   

[PDF] thecvf.com

S2A: Wasserstein GAN with Spatio-Spectral Laplacian Attention for Multi-Spectral Band Synthesis

L RoutI MisraS Manthira Moorthi… - Proceedings of the …, 2020 - openaccess.thecvf.com

Intersection of adversarial learning and satellite image processing is an emerging field in

remote sensing. In this study, we intend to address synthesis of high resolution multi-spectral

satellite imagery using adversarial learning. Guided by the discovery of attention …

 Cited by 3 Related articles All 10 versions 


[HTML] nih.gov

[HTML] EEG Signal Reconstruction Using a Generative Adversarial Network With Wasserstein Distance and Temporal-Spatial-Frequency Loss

T Luo, Y Fan, L Chen, G Guo, C Zhou - Frontiers in …, 2020 - ncbi.nlm.nih.gov

Applications based on electroencephalography (EEG) signals suffer from the mutual

contradiction of high classification performance vs. low cost. The nature of this contradiction

makes EEG signal reconstruction with high sampling rates and sensitivity challenging …

  Cited by 3 Related articles All 3 versions

<——2020———————2020 ———————-890—-


[PDF] brown.edu

Drug–drug interaction prediction with Wasserstein Adversarial Autoencoder-based knowledge graph embeddings

Y Dai, C Guo, W Guo, C Eickhoff - Briefings in Bioinformatics, 2020 - academic.oup.com

An interaction between pharmacological agents can trigger unexpected adverse events.

Capturing richer and more comprehensive information about drug–drug interactions (DDIs)

is one of the key tasks in public health and drug development. Recently, several knowledge …

  Related articles All 3 versions

 

[PDF] arxiv.org

Differentiable maps between Wasserstein spaces

B Lessel, T Schick - arXiv preprint arXiv:2010.02131, 2020 - arxiv.org

A notion of differentiability is being proposed for maps between Wasserstein spaces of order

2 of smooth, connected and complete Riemannian manifolds. Due to the nature of the

tangent space construction on Wasserstein spaces, we only give a global definition of …

  Related articles All 2 versions 


[PDF] arxiv.org

Wasserstein Autoregressive Models for Density Time Series

C Zhang, P KokoszkaA Petersen - arXiv preprint arXiv:2006.12640, 2020 - arxiv.org

Data consisting of time-indexed distributions of cross-sectional or intraday returns have

been extensively studied in finance, and provide one example in which the data atoms

consist of serially dependent probability distributions. Motivated by such data, we propose …

  All 3 versions 


[PDF] mlr.press

Wasserstein smoothing: Certified robustness against wasserstein adversarial attacks

A Levine, S Feizi - International Conference on Artificial …, 2020 - proceedings.mlr.press

In the last couple of years, several adversarial attack methods based on different threat

models have been proposed for the image classification problem. Most existing defenses

consider additive threat models in which sample perturbations have bounded L_p norms …

  Cited by 17 Related articles All 5 versions 


[PDF] aaai.org

Importance-aware semantic segmentation in self-driving with discrete wasserstein training

X Liu, Y Han, S Bai, Y Ge, T Wang, X HanS Li… - Proceedings of the …, 2020 - ojs.aaai.org

Semantic segmentation (SS) is an important perception manner for self-driving cars and

robotics, which classifies each pixel into a pre-determined class. The widely-used cross

entropy (CE) loss-based deep networks has achieved significant progress wrt the mean …

  Cited by 9 Related articles All 6 versions 


[PDF] arxiv.org

Improved Image Wasserstein Attacks and Defenses

JE HuA SwaminathanH SalmanG Yang - arXiv preprint arXiv …, 2020 - arxiv.org

Robustness against image perturbations bounded by a $\ell_p $ ball have been well-

studied in recent literature. Perturbations in the real-world, however, rarely exhibit the pixel

independence that $\ell_p $ threat models assume. A recently proposed Wasserstein  …

  Cited by 3 Related articles All 3 versions 


[PDF] arxiv.org

Online Stochastic Convex Optimization: Wasserstein Distance Variation

I ShamesF Farokhi - arXiv preprint arXiv:2006.01397, 2020 - arxiv.org

Distributionally-robust optimization is often studied for a fixed set of distributions rather than

time-varying distributions that can drift significantly over time (which is, for instance, the case

in finance and sociology due to underlying expansion of economy and evolution of …

 Cited by 2 Related articles All 4 versions 

2020  [PDF] arxiv.org

Wasserstein-based fairness interpretability framework for machine learning models

A Miroshnikov, K Kotsiopoulos, R Franks… - arXiv preprint arXiv …, 2020 - arxiv.org

In this article, we introduce a fairness interpretability framework for measuring and

explaining bias in classification and regression models at the level of a distribution. In our

work, motivated by the ideas of Dwork et al.(2012), we measure the model bias across sub …

  Cited by 1 Related articles All 2 versions 


[PDF] arxiv.org

Wasserstein Generative Models for Patch-based Texture Synthesis

A HoudardA LeclaireN Papadakis… - arXiv preprint arXiv …, 2020 - arxiv.org

In this paper, we propose a framework to train a generative model for texture image

synthesis from a single example. To do so, we exploit the local representation of images via

the space of patches, that is, square sub-images of fixed size (eg $4\times 4$). Our main …

  Related articles 

<——2020———2020———-  900—— 

         

[PDF] arxiv.org

Conditional Sig-Wasserstein GANs for Time Series Generation

H NiL SzpruchM Wiese, S Liao, B Xiao - arXiv preprint arXiv:2006.05421, 2020 - arxiv.org

Generative adversarial networks (GANs) have been extremely successful in generating

samples, from seemingly high dimensional probability measures. However, these methods

struggle to capture the temporal dependence of joint probability distributions induced by …

Cited by 27 Related articles All 3 versions 

 

[PDF] arxiv.org

SVGD as a kernelized Wasserstein gradient flow of the chi-squared divergence

S ChewiTL Gouic, C Lu, T Maunu… - arXiv preprint arXiv …, 2020 - arxiv.org

Stein Variational Gradient Descent (SVGD), a popular sampling algorithm, is often described

as the kernelized gradient flow for the Kullback-Leibler divergence in the geometry of

optimal transport. We introduce a new perspective on SVGD that instead views SVGD as the …

Cited by 15 Related articles All 9 versions 


[PDF] arxiv.org

Primal Wasserstein Imitation Learning

R DadashiL HussenotM GeistO Pietquin - arXiv preprint arXiv …, 2020 - arxiv.org

Imitation Learning (IL) methods seek to match the behavior of an agent with that of an expert.

In the present work, we propose a new IL method based on a conceptually simple algorithm:

Primal Wasserstein Imitation Learning (PWIL), which ties to the primal form of the …

Cited by 31 Related articles All 18 versions 

[PDF] ieee.org

Distributionally Robust Optimal Reactive Power Dispatch with Wasserstein Distance in Active Distribution Network

J Liu, Y Chen, C Duan, J Lin… - Journal of Modern Power …, 2020 - ieeexplore.ieee.org

The uncertainties from renewable energy sources (RESs) will not only introduce significant

influences to active power dispatch, but also bring great challenges to the analysis of

optimal reactive power dispatch (ORPD). To address the influence of high penetration of …

  Cited by 1 Related articles All 2 versions


2020

De novo protein design for novel folds using guided conditional Wasserstein generative adversarial networks

M Karimi, S Zhu, Y CaoY Shen - Journal of Chemical Information …, 2020 - ACS Publications

Although massive data is quickly accumulating on protein sequence and structure, there is a

small and limited number of protein architectural types (or structural folds). This study is

addressing the following question: how well could one reveal underlying sequence …

  Cited by 4 Related articles All 5 versions

[PDF] ucl.ac.uk

Ripple-GAN: Lane Line Detection With Ripple Lane Line Detection Network and Wasserstein GAN

Y Zhang, Z Lu, D Ma, JH Xue… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

With artificial intelligence technology being advanced by leaps and bounds, intelligent

driving has attracted a huge amount of attention recently in research and development. In

intelligent driving, lane line detection is a fundamental but challenging task particularly …

  Related articles All 2 versions


[PDF] arxiv.org

PLG-IN: Pluggable Geometric Consistency Loss with Wasserstein Distance in Monocular Depth Estimation

N HiroseS KoideK KawanoR Kondo - arXiv preprint arXiv:2006.02068, 2020 - arxiv.org

We propose a novel objective to penalize geometric inconsistencies, to improve the

performance of depth estimation from monocular camera images. Our objective is designed

with the Wasserstein distance between two point clouds estimated from images with different …

  Cited by 1 Related articles All 2 versions 


[PDF] arxiv.org

Wasserstein Distributionally Robust Look-Ahead Economic Dispatch

BK PoollaAR HotaS BolognaniDS Callaway… - arXiv preprint arXiv …, 2020 - arxiv.org

We present two data-driven distributionally robust optimization formulations for the look-

ahead economic dispatch (LAED) problem with uncertain renewable energy generation. In

particular, the goal is to minimize the cost of conventional energy generation subject to …

Cited by 28 Related articles All 10 versions

[PDF] arxiv.org

On Linear Optimization over Wasserstein Balls

MC YueD KuhnW Wiesemann - arXiv preprint arXiv:2004.07162, 2020 - arxiv.org

Wasserstein balls, which contain all probability measures within a pre-specified Wasserstein

distance to a reference measure, have recently enjoyed wide popularity in the

distributionally robust optimization and machine learning communities to formulate and …

  Cited by 3 Related articles All 5 versions 


[PDF] arxiv.org

Derivative over Wasserstein spaces along curves of densities

R Buckdahn, J Li, H Liang - arXiv preprint arXiv:2010.01507, 2020 - arxiv.org

In this paper, given any random variable $\xi $ defined over a probability space

$(\Omega,\mathcal {F}, Q) $, we focus on the study of the derivative of functions of the form $

L\mapsto F_Q (L):= f\big ((LQ) _ {\xi}\big), $ defined over the convex cone of densities …

  All 2 versions 

<——2020———2020———-  910—

[PDF] ieee.org

Robust Multivehicle Tracking With Wasserstein Association Metric in Surveillance Videos

Y Zeng, X Fu, L Gao, J Zhu, H Li, Y Li - IEEE Access, 2020 - ieeexplore.ieee.org

Vehicle tracking based on surveillance videos is of great significance in the highway traffic

monitoring field. In real-world vehicle-tracking applications, partial occlusion and objects

with similarly appearing distractors pose significant challenges. For addressing the above …

Cited by 9 Related articles All 2 versions


[PDF] arxiv.org

Universal consistency of Wasserstein  k-NN classifier

D Ponnoprat - arXiv preprint arXiv:2009.04651, 2020 - arxiv.org

The Wasserstein distance provides a notion of dissimilarities between probability measures,

which has recent applications in learning of structured data with varying size such as images

and text documents. In this work, we analyze the $ k $-nearest neighbor classifier ($ k $-NN) …

  All 2 versions 


[PDF] arxiv.org

Convergence of Recursive Stochastic Algorithms using Wasserstein Divergence

A Gupta, WB Haskell - arXiv preprint arXiv:2003.11403, 2020 - arxiv.org

This paper develops a unified framework, based on iterated random operator theory, to

analyze the convergence of constant stepsize recursive stochastic algorithms (RSAs) in

machine learning and reinforcement learning. RSAs use randomization to efficiently …

  Related articles All 2 versions 


[PDF] arxiv.org

A material decomposition method for dual-energy CT via dual interactive Wasserstein generative adversarial networks

Z Shi, H Li, Q Cao, Z Wang, M Cheng - arXiv preprint arXiv:2007.11247, 2020 - arxiv.org

Dual-energy computed tomography has great potential in material characterization and

identification, whereas the reconstructed material-specific images always suffer from

magnified noise and beam hardening artifacts. In this study, a data-driven approach using …

  All 3 versions 


[PDF] sns.it

Optimal control of multiagent systems in the Wasserstein space

C Jimenez, A Marigonda, M Quincampoix - Calculus of Variations and …, 2020 - Springer

This paper concerns a class of optimal control problems, where a central planner aims to

control a multi-agent system in\({\mathbb {R}}^ d\) in order to minimize a certain cost of Bolza

type. At every time and for each agent, the set of admissible velocities, describing his/her …

  Cited by 4 Related articles All 3 versions


[PDF] arxiv.org

Variational Wasserstein Barycenters for Geometric Clustering

L Mi, T Yu, J BentoW ZhangB LiY Wang - arXiv preprint arXiv …, 2020 - arxiv.org

We propose to compute Wasserstein barycenters (WBs) by solving for Monge maps with

variational principle. We discuss the metric properties of WBs and explore their connections,

especially the connections of Monge WBs, to K-means clustering and co-clustering. We also …

  Cited by 2 Related articles All 2 versions 

 

[PDF] arxiv.org

Augmented Sliced Wasserstein Distances

X Chen, Y Yang, Y Li - arXiv preprint arXiv:2006.08812, 2020 - arxiv.org

While theoretically appealing, the application of the Wasserstein distance to large-scale

machine learning problems has been hampered by its prohibitive computational cost. The

sliced Wasserstein distance and its variants improve the computational efficiency through …

 Cited by 5 Related articles All 5 versions 


[PDF] arxiv.org

Adversarial Classification via Distributional Robustness with Wasserstein Ambiguity

N Ho-NguyenSJ Wright - arXiv preprint arXiv:2005.13815, 2020 - arxiv.org

We study a model for adversarial classification based on distributionally robust chance

constraints. We show that under Wasserstein ambiguity, the model aims to minimize the

conditional value-at-risk of the distance to misclassification, and we explore links to previous …

  Related articles All 3 versions 


High-precision Wasserstein barycenters in polynomial time

JM Altschuler, E Boix-Adsera - arXiv preprint arXiv:2006.08012, 2020 - arxiv.org

Computing Wasserstein barycenters is a fundamental geometric problem with widespread

applications in machine learning, statistics, and computer graphics. However, it is unknown

whether Wasserstein barycenters can be computed in polynomial time, either exactly or to …

  All 3 versions 


[PDF] projecteuclid.org

Existence of probability measure valued jump-diffusions in generalized Wasserstein spaces

M Larsson, S Svaluto-Ferro - Electronic Journal of Probability, 2020 - projecteuclid.org

We study existence of probability measure valued jump-diffusions described by martingale

problems. We develop a simple device that allows us to embed Wasserstein spaces and

other similar spaces of probability measures into locally compact spaces where classical …

  Cited by 2 Related articles All 2 versions

<——2020———————2020 ———————-920—


[PDF] arxiv.org

Wasserstein Exponential Kernels

H De Plaen, M Fanuel, JAK Suykens - arXiv preprint arXiv:2002.01878, 2020 - arxiv.org

In the context of kernel methods, the similarity between data points is encoded by the kernel

function which is often defined thanks to the Euclidean distance, a common example being

the squared exponential kernel. Recently, other distances relying on optimal transport theory …

  Related articles All 3 versions 
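
The construction considered here swaps the Euclidean distance inside a squared-exponential kernel for a Wasserstein distance between data points that are themselves distributions (histograms, point clouds, and so on). A small sketch of such a Gram matrix for one-dimensional samples (our own illustration; note that kernels of this form are not guaranteed to be positive definite in general):

# Wasserstein exponential kernel Gram matrix for 1-D empirical distributions (sketch).
import numpy as np
from scipy.stats import wasserstein_distance

def wasserstein_exp_gram(samples, sigma=1.0):
    n = len(samples)
    K = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            w = wasserstein_distance(samples[i], samples[j])
            K[i, j] = np.exp(-w ** 2 / (2.0 * sigma ** 2))
    return K

rng = np.random.default_rng(5)
data = [rng.normal(mu, 1.0, 300) for mu in (0.0, 0.5, 3.0)]
print(wasserstein_exp_gram(data).round(3))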


[PDF] arxiv.org

Ensemble Riemannian Data Assimilation over the Wasserstein Space

SK Tamang, A Ebtehaj, PJ Van Leeuwen, D Zou… - arXiv preprint arXiv …, 2020 - arxiv.org

In this paper, we present a new ensemble data assimilation paradigm over a Riemannian

manifold equipped with the Wasserstein metric. Unlike Eulerian penalization of error in the

Euclidean space, the Wasserstein metric can capture translation and shape difference …

  All 4 versions 


[PDF] arxiv.org

Stability of Gibbs Posteriors from the Wasserstein Loss for Bayesian Full Waveform Inversion

MM DunlopY Yang - arXiv preprint arXiv:2004.03730, 2020 - arxiv.org

Recently, the Wasserstein loss function has been proven to be effective when applied to

deterministic full-waveform inversion (FWI) problems. We consider the application of this

loss function in Bayesian FWI so that the uncertainty can be captured in the solution. Other …

  Cited by 1 Related articles All 3 versions 

MR4329989

Conditional Wasserstein Auto-Encoder for Interactive Vehicle Trajectory Prediction

C Fei, X He, S Kawahara, N Shirou… - 2020 IEEE 23rd …, 2020 - ieeexplore.ieee.org

Trajectory prediction is a crucial task required for autonomous driving. The highly

interactions and uncertainties in real-world traffic scenarios make it a challenge to generate

trajectories that are accurate, reasonable and covering diverse modality as much as …

 

[PDF] arxiv.org

Learning Deep-Latent Hierarchies by Stacking Wasserstein Autoencoders

B Gaujac, I FeigeD Barber - arXiv preprint arXiv:2010.03467, 2020 - arxiv.org

Probabilistic models with hierarchical-latent-variable structures provide state-of-the-art

results amongst non-autoregressive, unsupervised density-based models. However, the

most common approach to training such models based on Variational Autoencoders (VAEs) …

  All 2 versions 



[PDF] arxiv.org

Conditional Wasserstein GAN-based Oversampling of Tabular Data for Imbalanced Learning

J Engelmann, S Lessmann - arXiv preprint arXiv:2008.09202, 2020 - arxiv.org

Class imbalance is a common problem in supervised learning and impedes the predictive

performance of classification models. Popular countermeasures include oversampling the

minority class. Standard methods like SMOTE rely on finding nearest neighbours and linear …

  All 3 versions 


[PDF] arxiv.org

Safe Wasserstein Constrained Deep Q-Learning

A KandelSJ Moura - arXiv preprint arXiv:2002.03016, 2020 - arxiv.org

This paper presents a distributionally robust Q-Learning algorithm (DrQ) which leverages

Wasserstein ambiguity sets to provide probabilistic out-of-sample safety guarantees during

online learning. First, we follow past work by separating the constraint functions from the …

  Related articles All 2 versions 


Nonparametric Different-Feature Selection Using Wasserstein Distance

W Zheng, FY WangC Gou - 2020 IEEE 32nd International …, 2020 - ieeexplore.ieee.org

In this paper, we propose a feature selection method that characterizes the difference

between two kinds of probability distributions. The key idea is to view the feature selection

problem as a sparsest k-subgraph problem that considers Wasserstein distance between …

 

[HTML] peerj.com

[HTML] Correcting nuisance variation using Wasserstein distance

G TabakM FanS YangS Hoyer, G Davis - PeerJ, 2020 - peerj.com

Profiling cellular phenotypes from microscopic imaging can provide meaningful biological

information resulting from various factors affecting the cells. One motivating application is

drug development: morphological cell features can be captured from images, from which …

  Cited by 2 Related articles All 8 versions 

[PDF] neurips.cc

[PDF] Quantile Propagation for Wasserstein-Approximate Gaussian Processes

R ZhangC WalderEV Bonilla… - Advances in Neural …, 2020 - proceedings.neurips.cc

Approximate inference techniques are the cornerstone of probabilistic methods based on

Gaussian process priors. Despite this, most work approximately optimizes standard

divergence measures such as the Kullback-Leibler (KL) divergence, which lack the basic …

   Related articles All 6 versions 

<——2020———2020———-  930—— 


 

System and method for unsupervised domain adaptation via sliced-wasserstein distance

AJ GabourieM RostamiS Kolouri… - US Patent App. 16 …, 2020 - freepatentsonline.com

Described is a system for unsupervised domain adaptation in an autonomous learning

agent. The system adapts a learned model with a set of unlabeled data from a target

domain, resulting in an adapted model. The learned model was previously trained to …


 

[PDF] arxiv.org

Posterior asymptotics in Wasserstein metrics on the real line

M ChaeP De Blasi, SG Walker - arXiv preprint arXiv:2003.05599, 2020 - arxiv.org

In this paper, we use the class of Wasserstein metrics to study asymptotic properties of

posterior distributions. Our first goal is to provide sufficient conditions for posterior

consistency. In addition to the well-known Schwartz's Kullback--Leibler condition on the …

  Related articles All 2 versions 


[PDF] arxiv.org

Wasserstein Distance guided Adversarial Imitation Learning with Reward Shape Exploration

M ZhangY WangX Ma, L Xia, J Yang, Z Li… - arXiv preprint arXiv …, 2020 - arxiv.org

The generative adversarial imitation learning (GAIL) has provided an adversarial learning

framework for imitating expert policy from demonstrations in high-dimensional continuous

tasks. However, almost all GAIL and its extensions only design a kind of reward function of …

  Cited by 1 Related articles All 2 versions 


[PDF] arxiv.org

Symmetric Skip Connection Wasserstein GAN for High-Resolution Facial Image Inpainting

J JamC KendrickV DrouardK Walker… - arXiv preprint arXiv …, 2020 - arxiv.org

We propose a Symmetric Skip Connection Wasserstein Generative Adversarial Network (S-

WGAN) for high-resolution facial image inpainting. The architecture is an encoder-decoder

with convolutional blocks, linked by skip connections. The encoder is a feature extractor that …

  Cited by 3 Related articles All 2 versions 

[PDF] arxiv.org

Symmetric skip connection wasserstein gan for high-resolution facial image inpainting

J JamC KendrickV DrouardK Walker… - arXiv preprint arXiv …, 2020 - arxiv.org

The state-of-the-art facial image inpainting methods achieved promising results but face realism preservation remains a challenge. This is due to limitations such as; failures in preserving edges and blurry artefacts. To overcome these limitations, we propose a …

  Cited by 5 Related articles All 3 versions 



[PDF] arxiv.org

Wasserstein Random Forests and Applications in Heterogeneous Treatment Effects

Q Du, G BiauF Petit, R Porcher - arXiv preprint arXiv:2006.04709, 2020 - arxiv.org

We present new insights into causal inference in the context of Heterogeneous Treatment

Effects by proposing natural variants of Random Forests to estimate the key conditional

distributions. To achieve this, we recast Breiman's original splitting criterion in terms of …

  Related articles All 2 versions 



[PDF] arxiv.org

Generating Natural Adversarial Hyperspectral examples with a modified Wasserstein GAN

JC Burnel, K FatrasN Courty - arXiv preprint arXiv:2001.09993, 2020 - arxiv.org

Adversarial examples are a hot topic due to their abilities to fool a classifier's prediction.

There are two strategies to create such examples, one uses the attacked classifier's

gradients, while the other only requires access to the clas-sifier's prediction. This is …

  Related articles All 8 versions 


[PDF] mlr.press

Fast algorithms for computational optimal transport and wasserstein barycenter

W Guo, N Ho, M Jordan - International Conference on …, 2020 - proceedings.mlr.press

We provide theoretical complexity analysis for new algorithms to compute the optimal

transport (OT) distance between two discrete probability distributions, and demonstrate their

favorable practical performance compared to state-of-art primal-dual algorithms. First, we …

  Cited by 5 Related articles All 4 versions 
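
A standard baseline in this line of work is Sinkhorn's algorithm for entropy-regularized optimal transport, which alternately rescales the rows and columns of a Gibbs kernel until both marginals match. A bare-bones numpy sketch of the classical iteration (editorial illustration; this is not the accelerated method analyzed in the paper above, and small regularization values may need log-domain stabilization in practice):

# Entropy-regularized optimal transport via plain Sinkhorn iterations (sketch).
import numpy as np

def sinkhorn(a, b, C, reg=0.1, n_iter=500):
    """Approximate optimal coupling between histograms a and b for cost matrix C."""
    K = np.exp(-C / reg)                  # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)                 # enforce the column marginals
        u = a / (K @ v)                   # enforce the row marginals
    return u[:, None] * K * v[None, :]    # coupling with (approximately) correct marginals

x = np.linspace(0.0, 1.0, 50)
C = (x[:, None] - x[None, :]) ** 2
a = np.full(50, 1.0 / 50)
b = np.exp(-(x - 0.7) ** 2 / 0.02); b /= b.sum()
P = sinkhorn(a, b, C, reg=0.05)
print(np.sum(P * C))                      # regularized transport cost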


[PDF] thecvf.com

Barycenters of Natural Images Constrained Wasserstein Barycenters for Image Morphing

D SimonA Aberdam - … of the IEEE/CVF Conference on …, 2020 - openaccess.thecvf.com

Image interpolation, or image morphing, refers to a visual transition between two (or more)

input images. For such a transition to look visually appealing, its desirable properties are (i)

to be smooth;(ii) to apply the minimal required change in the image; and (iii) to seem" real" …

  Cited by 2 Related articles All 4 versions 


Wasserstein GAN based on Autoencoder with back-translation for cross-lingual embedding mappings

Y Zhang, Y Li, Y Zhu, X Hu - Pattern Recognition Letters, 2020 - Elsevier

Recent works about learning cross-lingual word mappings (CWMs) focus on relaxing the

requirement of bilingual signals through generative adversarial networks (GANs). GANs

based models intend to enforce source embedding space to align target embedding space …

    Cited by 3 Related articles All 3 versions

 

[PDF] arxiv.org

Data-Driven Approximation of the Perron-Frobenius Operator Using the Wasserstein Metric

A KarimiTT Georgiou - arXiv preprint arXiv:2011.00759, 2020 - arxiv.org

This manuscript introduces a regression-type formulation for approximating the Perron-

Frobenius Operator by relying on distributional snapshots of data. These snapshots may

represent densities of particles. The Wasserstein metric is leveraged to define a suitable …

  All 2 versions 

<——2020———2020———-  940——

[PDF] researchgate.net

[PDF] Ranking IPCC Model Performance Using the Wasserstein Distance

G VissioV LemboV Lucarini… - arXiv preprint arXiv …, 2020 - researchgate.net

We propose a methodology for intercomparing climate models and evaluating their

performance against benchmarks based on the use of the Wasserstein distance (WD). This

distance provides a rigorous way to measure quantitatively the difference between two …

  

[PDF] arxiv.org

Segmentation analysis and the recovery of queuing parameters via the Wasserstein distance: a study of administrative data for patients with chronic obstructive …

H Wilde, V KnightJ Gillard, K Smith - arXiv preprint arXiv:2008.04295, 2020 - arxiv.org

This work uses a data-driven approach to analyse how the resource requirements of

patients with chronic obstructive pulmonary disease (COPD) may change, and quantifies

how those changes affect the strains of the hospital system the patients interact with. This is …

  Cited by 1 Related articles All 3 versions


[PDF] arxiv.org

Principled Learning Method for Wasserstein Distributionally Robust Optimization with Local Perturbations

Y Kwon, W Kim, JH Won, MC Paik - arXiv preprint arXiv:2006.03333, 2020 - arxiv.org

Wasserstein distributionally robust optimization (WDRO) attempts to learn a model that

minimizes the local worst-case risk in the vicinity of the empirical data distribution defined by

Wasserstein ball. While WDRO has received attention as a promising tool for inference since …

  Related articles All 2 versions 


[PDF] arxiv.org

Hierarchical Gaussian Processes with Wasserstein-2 Kernels

S PopescuD SharpJ ColeB Glocker - arXiv preprint arXiv:2010.14877, 2020 - arxiv.org

We investigate the usefulness of Wasserstein-2 kernels in the context of hierarchical

Gaussian Processes. Stemming from an observation that stacking Gaussian Processes

severely diminishes the model's ability to detect outliers, which when combined with non …

  All 2 versions 


[PDF] arxiv.org

Wasserstein Adversarial Autoencoders for Knowledge Graph Embedding based Drug-Drug Interaction Prediction

Y Dai, C Guo, W Guo, C Eickhoff - arXiv preprint arXiv:2004.07341, 2020 - arxiv.org

Interaction between pharmacological agents can trigger unexpected adverse events.

Capturing richer and more comprehensive information about drug-drug interactions (DDI) is

one of the key tasks in public health and drug development. Recently, several knowledge …

  Cited by 1 Related articles All 2 versions 



[PDF] arxiv.org

Independent Elliptical Distributions Minimize Their W2 Wasserstein Distance from Independent Elliptical Distributions with the Same Density Generator

S Fang, Q Zhu - arXiv preprint arXiv:2012.03809, 2020 - arxiv.org

This short note is on a property of the $\mathcal {W} _2 $ Wasserstein distance which

indicates that independent elliptical distributions minimize their $\mathcal {W} _2 $

Wasserstein distance from given independent elliptical distributions with the same density …

  Related articles All 2 versions 

[CITATION] Independent Elliptical Distributions Minimize Their W2 Wasserstein Distance from Independent Elliptical Distributions with the Same Density Generator.

S Fang, Q Zhu - arXiv preprint, 2020
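
For context, the W2 distance between elliptical distributions of the same family admits a closed form; in the Gaussian case it combines the squared mean difference with the Bures distance between covariance matrices, \( W_2^2(\mathcal{N}(m_1,\Sigma_1),\mathcal{N}(m_2,\Sigma_2)) = \|m_1-m_2\|^2 + \mathrm{tr}\big(\Sigma_1+\Sigma_2-2(\Sigma_1^{1/2}\Sigma_2\Sigma_1^{1/2})^{1/2}\big) \). A quick numerical check of this standard formula (our own sketch, not code from the paper above):

# Closed-form 2-Wasserstein distance between two Gaussians (Bures-Wasserstein formula).
import numpy as np
from scipy.linalg import sqrtm

def w2_gaussian(m1, S1, m2, S2):
    s1_half = np.real(sqrtm(S1))
    cross = np.real(sqrtm(s1_half @ S2 @ s1_half))
    return np.sqrt(np.sum((m1 - m2) ** 2) + np.trace(S1 + S2 - 2.0 * cross))

m1, S1 = np.zeros(2), np.eye(2)
m2, S2 = np.ones(2), 4.0 * np.eye(2)
print(w2_gaussian(m1, S1, m2, S2))  # mean term 2 plus Bures term 2 -> sqrt(4) = 2.0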



[PDF] researchgate.net

Inequalities of the Wasserstein mean with other matrix means

S Kim, H Lee - Annals of Functional Analysis, 2020 - Springer

Recently, a new Riemannian metric and a least squares mean of positive definite matrices

have been introduced. They are called the Bures–Wasserstein metric and Wasserstein

mean, which are different from the Riemannian trace metric and Karcher mean. In this paper …

  Cited by 2 Related articles All 2 versions

[PDF] arxiv.org

Chance-Constrained Set Covering with Wasserstein Ambiguity

H Shen, R Jiang - arXiv preprint arXiv:2010.05671, 2020 - arxiv.org

We study a generalized distributionally robust chance-constrained set covering problem

(DRC) with a Wasserstein ambiguity set, where both decisions and uncertainty are binary-

valued. We establish the NP-hardness of DRC and recast it as a two-stage stochastic …

  All 2 versions 


[PDF] arxiv.org

Central limit theorems for Markov chains based on their convergence rates in Wasserstein distance

R Jin, A Tan - arXiv preprint arXiv:2002.09427, 2020 - arxiv.org

Many tools are available to bound the convergence rate of Markov chains in total variation

(TV) distance. Such results can be used to establish central limit theorems (CLT) that enable

error evaluations of Monte Carlo estimates in practice. However, convergence analysis …

  Related articles All 2 versions 



<——2020———2020———-  950——


[PDF] arxiv.org

Wasserstein Distance guided Adversarial Imitation Learning with Reward Shape Exploration

M ZhangY WangX Ma, L Xia, J Yang… - 2020 IEEE 9th Data …, 2020 - ieeexplore.ieee.org

The generative adversarial imitation learning (GAIL) has provided an adversarial learning

framework for imitating expert policy from demonstrations in high-dimensional continuous

tasks. However, almost all GAIL and its extensions only design a kind of reward function of …

  Cited by 3 Related articles All 5 versions

 

[PDF] arxiv.org

Wasserstein Distance Regularized Sequence Representation for Text Matching in Asymmetrical Domains

W Yu, C Xu, J XuL Pang, X Gao, X Wang… - arXiv preprint arXiv …, 2020 - arxiv.org

One approach to matching texts from asymmetrical domains is projecting the input

sequences into a common semantic space as feature vectors upon which the matching

function can be readily defined and learned. In real-world matching practices, it is often …

  Related articles All 3 versions 


[PDF] iop.org

Data Augmentation Based on Wasserstein Generative Adversarial Nets Under Few Samples

Y Jiang, B Zhu, Q Ma - IOP Conference Series: Materials Science …, 2020 - iopscience.iop.org

Aiming at the problem of low accuracy of image classification under the condition of few

samples, an improved method based on Wasserstein Generative Adversarial Nets is

proposed. The small data sets are augmented by generating target samples through …

  Cited by 1 Related articles All 2 versions

[PDF] sciencedirect.com

Intelligent Fault Diagnosis with a Deep Transfer Network based on Wasserstein Distance

J Xu, J Huang, Y Zhao, L Zhou - Procedia Computer Science, 2020 - Elsevier

Intelligent fault-diagnosis methods based on deep-learning technology have been very

successful for complex industrial systems. The deep learning based fault classification

model requires a large number of labeled data. Moreover, the probability distribution of …

 

[PDF] arxiv.org

Time Discretizations of Wasserstein-Hamiltonian Flows

J Cui, L Dieci, H Zhou - arXiv preprint arXiv:2006.09187, 2020 - arxiv.org

We study discretizations of Hamiltonian systems on the probability density manifold

equipped with the $ L^ 2$-Wasserstein metric. Based on discrete optimal transport theory,

several Hamiltonian systems on graph (lattice) with different weights are derived, which can …

  All 3 versions 


2020



[HTML] hindawi.com

[HTML] Wasserstein Generative Adversarial Network and Convolutional Neural Network (WG-CNN) for Bearing Fault Diagnosis

H Yin, Z Li, J Zuo, H Liu, K Yang, F Li - Mathematical Problems in …, 2020 - hindawi.com

In recent years, intelligent fault diagnosis technology with deep learning algorithms has

been widely used in industry, and they have achieved gratifying results. Most of these

methods require large amount of training data. However, in actual industrial systems, it is …

 Cited by 14 Related articles All 7 versions 

[PDF] arxiv.org

Wasserstein Statistics in One-dimensional Location-Scale Model

S AmariT Matsuda - arXiv preprint arXiv:2007.11401, 2020 - arxiv.org

Wasserstein geometry and information geometry are two important structures to be

introduced in a manifold of probability distributions. Wasserstein geometry is defined by

using the transportation cost between two distributions, so it reflects the metric of the base …

Cited by 2 Related articles All 5 versions 

[PDF] arxiv.org

Reweighting samples under covariate shift using a Wasserstein distance criterion

J Reygner, A Touboul - arXiv preprint arXiv:2010.09267, 2020 - arxiv.org

Considering two random variables with different laws to which we only have access through

finite size iid samples, we address how to reweight the first sample so that its empirical

distribution converges towards the true law of the second sample as the size of both …

  All 25 versions 

 

[PDF] arxiv.org

High-Confidence Attack Detection via Wasserstein-Metric Computations

D LiS Martínez - arXiv preprint arXiv:2003.07880, 2020 - arxiv.org

This paper considers a sensor attack and fault detection problem for linear cyber-physical

systems, which are subject to possibly non-Gaussian noise that can have an unknown light-

tailed distribution. We propose a new threshold-based detection mechanism that employs …

  Cited by 1 Related articles All 5 versions

<——2020———2020———-  960—


[PDF] arxiv.org

Statistical analysis of Wasserstein GANs with applications to time series forecasting

M Haas, S Richter - arXiv preprint arXiv:2011.03074, 2020 - arxiv.org

We provide statistical theory for conditional and unconditional Wasserstein generative

adversarial networks (WGANs) in the framework of dependent observations. We prove

upper bounds for the excess Bayes risk of the WGAN estimators with respect to a modified …

 Cited by 3 Related articles All 3 versions 


[PDF] iop.org

Speech Dereverberation Based on Improved Wasserstein Generative Adversarial Networks

L Rao, J Yang - Journal of Physics: Conference Series, 2020 - iopscience.iop.org

In reality, the sound we hear is not only disturbed by noise, but also the reverberant, whose

effects are rarely taken into account. Recently, deep learning has shown great advantages

in speech signal processing. But among the existing dereverberation approaches, very few …


GraphWGAN: Graph Representation Learning with Wasserstein Generative Adversarial Networks

R Yan, H Shen, C Qi, K Cen… - 2020 IEEE International …, 2020 - ieeexplore.ieee.org

Graph representation learning aims to represent vertices as low-dimensional and real-

valued vectors to facilitate subsequent downstream tasks, ie, node classification, link

predictions. Recently, some novel graph representation learning frameworks, which try to …

  Related articles All 2 versions


[PDF] arxiv.org

Intrinsic Sliced Wasserstein Distances for Comparing Collections of Probability Distributions on Manifolds and Graphs

RM Rustamov, S Majumdar - arXiv preprint arXiv:2010.15285, 2020 - arxiv.org

Collections of probability distributions arise in a variety of statistical applications ranging

from user activity pattern analysis to brain connectomics. In practice these distributions are

represented by histograms over diverse domain types including finite intervals, circles …

  All 2 versions 
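
As background for the entry above, a minimal sketch of the classical (Euclidean) sliced Wasserstein distance between point clouds, which the intrinsic construction above generalizes to manifolds and graphs; the sample data and number of projections are illustrative assumptions:

import numpy as np
from scipy.stats import wasserstein_distance  # 1-D W_1 between empirical measures

def sliced_wasserstein(X, Y, n_projections=200, seed=0):
    # Average the 1-D Wasserstein distances of projections onto random unit directions.
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_projections):
        theta = rng.normal(size=X.shape[1])
        theta /= np.linalg.norm(theta)
        total += wasserstein_distance(X @ theta, Y @ theta)
    return total / n_projections

X = np.random.default_rng(1).normal(size=(500, 3))
Y = np.random.default_rng(2).normal(loc=1.0, size=(500, 3))
print(sliced_wasserstein(X, Y))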


2020 [PDF] aaai.org

Importance-aware semantic segmentation in self-driving with discrete wasserstein training

X Liu, Y Han, S Bai, Y Ge, T Wang, X Han, S Li… - Proceedings of the …, 2020 - ojs.aaai.org

Semantic segmentation (SS) is an important perception manner for self-driving cars and

robotics, which classifies each pixel into a pre-determined class. The widely-used cross

entropy (CE) loss-based deep networks has achieved significant progress wrt the mean …

  Cited by 12 Related articles All 6 versions 
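
To illustrate the discrete Wasserstein loss idea in the entry above, a minimal sketch assuming the POT library (pip install POT); the class set and the importance-aware ground-cost matrix are made-up placeholders, not the paper's actual setup:

import numpy as np
import ot  # Python Optimal Transport

classes = ["road", "car", "pedestrian", "sky"]  # label order for the cost matrix below
# Illustrative ground cost: mistaking a pedestrian for anything else is most expensive.
M = np.array([[0.0, 1.0, 2.0, 0.5],
              [1.0, 0.0, 2.0, 1.0],
              [2.0, 2.0, 0.0, 2.0],
              [0.5, 1.0, 2.0, 0.0]])

pred = np.array([0.6, 0.2, 0.1, 0.1])    # softmax output for one pixel
target = np.array([0.0, 0.0, 1.0, 0.0])  # one-hot ground truth ("pedestrian")
loss = ot.emd2(pred, target, M)          # exact discrete Wasserstein (EMD) cost
print(loss)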

[PDF] researchgate.net

Inequalities of the Wasserstein mean with other matrix means

S Kim, H Lee - Annals of Functional Analysis, 2020 - Springer

Recently, a new Riemannian metric and a least squares mean of positive definite matrices

have been introduced. They are called the Bures–Wasserstein metric and Wasserstein

Cited by 6 Related articles All 2 versions
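
For reference, the Bures-Wasserstein distance mentioned above between positive definite matrices A and B is d(A, B)^2 = tr(A) + tr(B) - 2 tr((A^{1/2} B A^{1/2})^{1/2}); a minimal numerical sketch (the example matrices are arbitrary):

import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein(A, B):
    rA = sqrtm(A)                              # principal matrix square root
    cross = sqrtm(rA @ B @ rA)
    val = np.trace(A) + np.trace(B) - 2.0 * np.trace(cross).real
    return np.sqrt(max(val, 0.0))              # clip tiny negative round-off

A = np.array([[2.0, 0.3], [0.3, 1.0]])
B = np.array([[1.0, -0.2], [-0.2, 3.0]])
print(bures_wasserstein(A, B))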

Drift compensation algorithm based on Time-Wasserstein dynamic distribution alignment

Y Tao, K Zeng, Z Liang - 2020 IEEE/CIC International …, 2020 - ieeexplore.ieee.org

The electronic nose (E-nose) is mainly used to detect different types and concentrations of

gases. At present, the average life of E-nose is relatively short, mainly due to the drift of the

sensor resulting in a decrease in the effect. Therefore, it is the focus of research in this field …

 

Convergence rates of the blocked Gibbs sampler with random scan in the Wasserstein metric

NY Wang, G Yin - Stochastics, 2020 - Taylor & Francis

… This paper establishes …

  Related articles All 4 versions


[PDF] iop.org

Data Augmentation Method for Switchgear Defect Samples Based on Wasserstein Generative Adversarial Network

X Huang, J Xiong, Y Zhang, J Liang… - Journal of Physics …, 2020 - iopscience.iop.org

The problem of sample imbalance will lead to poor generalization ability of the deep

learning model algorithm, and the phenomenon of overfitting during network training, which

limits the accuracy of intelligent fault diagnosis of switchgear equipment. In view of this, this …

  Related articles


An Improvement based on Wasserstein GAN for Alleviating Mode Collapsing

Y Chen, X Hou - 2020 International Joint Conference on Neural …, 2020 - ieeexplore.ieee.org

In the past few years, Generative Adversarial Networks as a deep generative model has

received more and more attention. Mode collapsing is one of the challenges in the study of

Generative Adversarial Networks. In order to solve this problem, we deduce a new algorithm …

 Cited by 1 Related articles

<——2020———2020———-  970——


[PDF] arxiv.org

Convergence rate to equilibrium in Wasserstein distance for reflected jump-diffusions

A Sarantsev - arXiv preprint arXiv:2003.10590, 2020 - arxiv.org

Convergence rate to the stationary distribution for continuous-time Markov processes can be

studied using Lyapunov functions. Recent work by the author provided explicit rates of

convergence in special case of a reflected jump-diffusion on a half-line. These results are …

   Cited by 1 Related articles All 7 versions


[PDF] arxiv.org

Martingale Wasserstein inequality for probability measures in the convex order

B Jourdain, W Margheriti - arXiv preprint arXiv:2011.11599, 2020 - arxiv.org

It is known since [24] that two one-dimensional probability measures in the convex order

admit a martingale coupling with respect to which the integral of $\vert x-y\vert $ is smaller

than twice their $\mathcal W_1 $-distance (Wasserstein distance with index $1 $). We …

  All 7 versions 


DPIR-Net: Direct PET image reconstruction based on the Wasserstein generative adversarial network

Z Hu, H Xue, Q Zhang, J Gao, N Zhang… - … on Radiation and …, 2020 - ieeexplore.ieee.org

Positron emission tomography (PET) is an advanced medical imaging technique widely

used in various clinical applications, such as tumor detection and neurologic disorders.

Reducing the radiotracer dose is desirable in PET imaging because it decreases the …

Cited by 32 Related articles


[PDF] arxiv.org

Pruned Wasserstein Index Generation Model and wigpy Package

F Xie - arXiv preprint arXiv:2004.00999, 2020 - arxiv.org

Recent proposal of Wasserstein Index Generation model (WIG) has shown a new direction

for automatically generating indices. However, it is challenging in practice to fit large

datasets for two reasons. First, the Sinkhorn distance is notoriously expensive to compute …

  Related articles All 5 versions 
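
Since the entry above centers on the cost of Sinkhorn computations, here is a minimal sketch of the plain Sinkhorn iteration for entropically regularized OT between two histograms; the cost normalization, epsilon and iteration count are illustrative, and no log-domain stabilization is attempted:

import numpy as np

def sinkhorn_cost(a, b, C, eps=0.05, n_iter=500):
    K = np.exp(-C / eps)                  # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)                 # alternating scaling updates
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]       # approximate optimal coupling
    return np.sum(P * C)                  # transport cost under the regularized plan

rng = np.random.default_rng(0)
x, y = np.sort(rng.normal(size=50)), np.sort(rng.normal(loc=1.0, size=50))
C = (x[:, None] - y[None, :]) ** 2
C /= C.max()                              # keep exp(-C/eps) well scaled
a = b = np.full(50, 1.0 / 50)
print(sinkhorn_cost(a, b, C))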


[PDF] arxiv.org

Geometric Characteristics of Wasserstein Metric on SPD (n)

Y Luo, S Zhang, Y Cao, H Sun - arXiv preprint arXiv:2012.07106, 2020 - arxiv.org

Wasserstein distance, especially among symmetric positive-definite matrices, has broad and

deep influences on development of artificial intelligence (AI) and other branches of computer

science. A natural idea is to describe the geometry of $ SPD\left (n\right) $ as a Riemannian …

  All 2 versions 


[PDF] arxiv.org

Wasserstein metric for improved QML with adjacency matrix representations

O Çaylak, OA von Lilienfeld, B Baumeier - arXiv preprint arXiv:2001.11005, 2020 - arxiv.org

We study the Wasserstein metric to measure distances between molecules represented by

the atom index dependent adjacency" Coulomb" matrix, used in kernel ridge regression

based supervised learning. Resulting quantum machine learning models exhibit improved …

  Cited by 1 Related articles All 2 versions 


[PDF] arxiv.org

Second-order Conic Programming Approach for Wasserstein Distributionally Robust Two-stage Linear Programs

Z Wang, K You, S Song, Y Zhang - arXiv preprint arXiv:2002.06751, 2020 - arxiv.org

This paper proposes a second-order conic programming (SOCP) approach to solve

distributionally robust two-stage stochastic linear programs over 1-Wasserstein balls. We

start from the case with distribution uncertainty only in the objective function and exactly …

  Related articles All 3 versions 
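
As generic background for the distributionally robust formulation above (the order of the Wasserstein distance, the radius and the loss are placeholders, not the paper's specific model), the ambiguity set and the resulting min-max problem read

\mathcal{B}_\rho(\hat P_N) = \{ Q : W_1(Q, \hat P_N) \le \rho \}, \qquad \min_{x \in X} \ \sup_{Q \in \mathcal{B}_\rho(\hat P_N)} \ \mathbb{E}_{\xi \sim Q}[f(x, \xi)],

where \hat P_N is the empirical distribution of the N observed samples and \rho > 0 is the radius of the Wasserstein ball.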


[PDF] arxiv.org

Entropic regularization of Wasserstein distance between infinite-dimensional Gaussian measures and Gaussian processes

MH Quang - arXiv preprint arXiv:2011.07489, 2020 - arxiv.org

This work studies the entropic regularization formulation of the 2-Wasserstein distance on an

infinite-dimensional Hilbert space, in particular for the Gaussian setting. We first present the

Minimum Mutual Information property, namely the joint measures of two Gaussian measures …

  All 2 versions 


[PDF] arxiv.org

Minimax control of ambiguous linear stochastic systems using the Wasserstein metric

K Kim, I Yang - arXiv preprint arXiv:2003.13258, 2020 - arxiv.org

In this paper, we propose a minimax linear-quadratic control method to address the issue of

inaccurate distribution information in practical stochastic systems. To construct a control

policy that is robust against errors in an empirical distribution of uncertainty, our method is to …

  Related articles All 2 versions 


Multi-View Wasserstein Discriminant Analysis with Entropic Regularized Wasserstein Distance

H Kasai - ICASSP 2020-2020 IEEE International Conference …, 2020 - ieeexplore.ieee.org

Analysis of multi-view data has recently garnered growing attention because multi-view data

frequently appear in real-world applications, which are collected or taken from many sources

or captured using various sensors. A simple and popular promising approach is to learn a …

  Cited by 2 Related articles All 2 versions

<——2020———2020———-  980——


[PDF] arxiv.org

Permutation invariant networks to learn Wasserstein metrics

A Sehanobish, N Ravindra, D van Dijk - arXiv preprint arXiv:2010.05820, 2020 - arxiv.org

Understanding the space of probability measures on a metric space equipped with a

Wasserstein distance is one of the fundamental questions in mathematical analysis. The

Wasserstein metric has received a lot of attention in the machine learning community …

  Related articles All 5 versions 

[PDF] arxiv.org

SWIFT: Scalable Wasserstein Factorization for Sparse Nonnegative Tensors

A Afshar, K Yin, S Yan, C Qian, JC Ho, H Park… - arXiv preprint arXiv …, 2020 - arxiv.org

Existing tensor factorization methods assume that the input tensor follows some specific

distribution (ie Poisson, Bernoulli and Gaussian), and solve the factorization by minimizing

some empirical loss functions defined based on the corresponding distribution. However, it …

  All 2 versions 


[PDF] researchgate.net

[PDF] Computational Hardness and Fast Algorithm for Fixed-Support Wasserstein Barycenter

T Lin, N Ho, X Chen, M Cuturi, MI Jordan - 2020 - researchgate.net

We study in this paper the fixed-support Wasserstein barycenter problem (FS-WBP), which

consists in computing the Wasserstein barycenter of m discrete probability measures

supported on a finite metric space of size n. We show first that the constraint matrix arising …

  Cited by 3 Related articles All 2 versions 
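
A minimal sketch of a fixed-support barycenter computation, assuming the POT library and using entropic regularization rather than the exact LP analysed above; the grid, the two input histograms and the regularization strength are illustrative:

import numpy as np
import ot

n = 100
x = np.linspace(0.0, 1.0, n)
M = ot.dist(x.reshape(-1, 1), x.reshape(-1, 1))   # squared Euclidean ground cost
M /= M.max()

def gauss_hist(mean, std):
    h = np.exp(-0.5 * ((x - mean) / std) ** 2)
    return h / h.sum()

A = np.column_stack([gauss_hist(0.25, 0.05), gauss_hist(0.70, 0.08)])  # two input measures
bary = ot.bregman.barycenter(A, M, reg=1e-2)      # entropic barycenter on the fixed grid
print(bary.sum())                                  # a probability vector summing to 1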

[PDF] mdpi.com

Knowledge-Grounded Chatbot Based on Dual Wasserstein Generative Adversarial Networks with Effective Attention Mechanisms

S Kim, OW Kwon, H Kim - Applied Sciences, 2020 - mdpi.com

A conversation is based on internal knowledge that the participants already know or external

knowledge that they have gained during the conversation. A chatbot that communicates with

humans by using its internal and external knowledge is called a knowledge-grounded …

  Cited by 2 Related articles All 3 versions 


[PDF] arxiv.org

Quadratic Wasserstein metrics for von Neumann algebras via transport plans

R Duvenhage - arXiv preprint arXiv:2012.03564, 2020 - arxiv.org

We show how one can obtain a class of quadratic Wasserstein metrics, that is to say,

Wasserstein metrics of order 2, on the set of faithful normal states of a von Neumann algebra

$ A $, via transport plans, rather than through a dynamical approach. Two key points to …

Cited by 5 Related articles All 2 versions 


[PDF] arxiv.org

Safe Zero-Shot Model-Based Learning and Control: A Wasserstein Distributionally Robust Approach

A Kandel, SJ Moura - arXiv preprint arXiv:2004.00759, 2020 - arxiv.org

This paper explores distributionally robust zero-shot model-based learning and control

using Wasserstein ambiguity sets. Conventional model-based reinforcement learning

algorithms struggle to guarantee feasibility throughout the online learning process. We …

  Related articles All 2 versions 


[PDF] arxiv.org

Unsupervised Multilingual Alignment using Wasserstein Barycenter

X Lian, K Jain, J Truszkowski, P Poupart… - arXiv preprint arXiv …, 2020 - arxiv.org

We study unsupervised multilingual alignment, the problem of finding word-to-word

translations between multiple languages without using any parallel data. One popular

strategy is to reduce multilingual alignment to the much simplified bilingual setting, by …

  Cited by 1 Related articles All 4 versions 


[PDF] iop.org

A collaborative filtering recommendation framework based on Wasserstein GAN

R Li, F Qian, X Du, S Zhao… - Journal of Physics …, 2020 - iopscience.iop.org

Compared with the original GAN, Wasserstein GAN minimizes the Wasserstein Distance

between the generative distribution and the real distribution, can well capture the potential

distribution of data and has achieved excellent results in image generation. However, the …

 

[PDF] archives-ouvertes.fr

On the Wasserstein distance between mutually singular measures

G Buttazzo, G Carlier, M Laborde - Advances in Calculus of …, 2020 - degruyter.com

We study the Wasserstein distance between two measures μ, ν which are mutually singular.

In particular, we are interested in minimization problems of the form W(μ, 𝒜) = inf{W(μ,

ν) : ν ∈ 𝒜}, where μ is a given probability and 𝒜 is contained in a given class of probabilities …

  Cited by 1 Related articles All 8 versions

[PDF] archives-ouvertes.fr

[CITATION] On the Wasserstein distance between mutually singular measures

G Buttazzo, G Carlier, M Laborde - Advances in Calculus of Variations, 2020 - De Gruyter

  Cited by 1 Related articles All 6 versions


[PDF] ntu.edu.sg

Wasserstein distance estimates for stochastic integrals by forward-backward stochastic calculus

JC Breton, N Privault - Potential Analysis, 2020 - Springer

We prove Wasserstein distance bounds between the probability distributions of stochastic

integrals with jumps, based on the integrands appearing in their stochastic integral

representations. Our approach does not rely on the Stein equation or on the propagation of …

  All 4 versions

<——2020———2020———-  990

[PDF] arxiv.org

Hierarchical Low-Rank Approximation of Regularized Wasserstein distance

M Motamed - arXiv preprint arXiv:2004.12511, 2020 - arxiv.org

Sinkhorn divergence is a measure of dissimilarity between two probability measures. It is

obtained through adding an entropic regularization term to Kantorovich's optimal transport

problem and can hence be viewed as an entropically regularized Wasserstein distance …

  Related articles All 2 versions 


[HTML] springer.com

[HTML] Missing Features Reconstruction Using a Wasserstein Generative Adversarial Imputation Network

M Friedjungová, D Vašata, M Balatsko… - … on Computational Science, 2020 - Springer

Missing data is one of the most common preprocessing problems. In this paper, we

experimentally research the use of generative and non-generative models for feature

reconstruction. Variational Autoencoder with Arbitrary Conditioning (VAEAC) and …

  Related articles


[PDF] inria.fr

Graph Diffusion Wasserstein Distances

A Barbe, M Sebban, P Gonçalves, P Borgnat… - … on Machine Learning …, 2020 - hal.inria.fr

Optimal Transport (OT) for structured data has received much attention in the machine

learning community, especially for addressing graph classification or graph transfer learning

tasks. In this paper, we present the Diffusion Wasserstein (DW) distance, as a generalization …

 Cited by 15 Related articles All 3 versions

[PDF] researchgate.net

[PDF] STATISTICAL INFERENCE FOR BURES-WASSERSTEIN BARYCENTERS BY ALEXEY KROSHNIN, VLADIMIR SPOKOINY AND ALEXANDRA …

A KROSHNIN - researchgate.net

In this work we introduce the concept of Bures–Wasserstein barycenter Q, that is essentially a Fréchet mean of some distribution P supported on a subspace of positive semi-definite d-dimensional Hermitian operators H+(d). We allow a barycenter to be constrained …

 Cite Related articles 


Wasserstein Embeddings for Nonnegative Matrix Factorization

M Febrissy, M Nadif - … Conference on Machine Learning, Optimization, and …, 2020 - Springer

In the field of document clustering (or dictionary learning), the fitting error called the

Wasserstein (In this paper, we use “Wasserstein”,“Earth Mover's”,“Kantorovich–Rubinstein”

interchangeably) distance showed some advantages for measuring the approximation of the …

  Related articles


[PDF] ibpsa.org

[PDF] Comparing Bottom-Up Energy Consumption Models Using The Wasserstein Distance Between Load Profile Histograms

E Sanderson, A Fragaki, J Simo… - BSO-V 2020: IBPSA …, 2020 - ibpsa.org

This paper presents a comparison of bottom up models that generate appliance load profiles. The comparison is based on their ability to accurately distribute load over time-of-day. This is a key feature of model performance if the model is used to assess the impact of …

  Related articles All 2 versions 


[PDF] arxiv.org

Predictive density estimation under the Wasserstein loss

T Matsuda, WE Strawderman - Journal of Statistical Planning and Inference, 2020 - Elsevier

We investigate predictive density estimation under the L 2 Wasserstein loss for location

families and location-scale families. We show that plug-in densities form a complete class

and that the Bayesian predictive density is given by the plug-in density with the posterior …

  Cited by 1 Related articles All 4 versions


[PDF] arxiv.org

Data-Driven Approximation of the Perron-Frobenius Operator Using the Wasserstein Metric

A Karimi, TT Georgiou - arXiv preprint arXiv:2011.00759, 2020 - arxiv.org

This manuscript introduces a regression-type formulation for approximating the Perron-

Frobenius Operator by relying on distributional snapshots of data. These snapshots may

represent densities of particles. The Wasserstein metric is leveraged to define a suitable …

  Related articles All 3 versions 


[PDF] arxiv.org

Velocity Inversion Using the Quadratic Wasserstein Metric

S Mahankali - arXiv preprint arXiv:2009.00708, 2020 - arxiv.org

Full--waveform inversion (FWI) is a method used to determine properties of the Earth from

information on the surface. We use the squared Wasserstein distance (squared $ W_2 $

distance) as an objective function to invert for the velocity as a function of position in the …

  All 4 versions 


Discrete Wasserstein Autoencoders for Document Retrieval

Y Zhang, H Zhu - … 2020-2020 IEEE International Conference on …, 2020 - ieeexplore.ieee.org

Learning to hash via generative models has became a promising paradigm for fast similarity

search in document retrieval. The binary hash codes are treated as Bernoulli latent variables

when training a variational autoencoder (VAE). However, the prior of discrete distribution (ie …

  Related articles

<——2020———2020———-  1000—— 


[PDF] arxiv.org

Wasserstein Convergence Rate for Empirical Measures on Noncompact Manifolds

FY Wang - arXiv preprint arXiv:2007.14667, 2020 - arxiv.org

Let $ X_t $ be the (reflecting) diffusion process generated by $ L:=\Delta+\nabla V $ on a

complete connected Riemannian manifold $ M $ possibly with a boundary $\partial M $,

where $ V\in C^ 1 (M) $ such that $\mu (dx):= e^{V (x)} dx $ is a probability measure. We …

  All 2 versions 


[PDF] arxiv.org

The existence of minimizers for an isoperimetric problem with Wasserstein penalty term in unbounded domains

Q Xia, B Zhou - arXiv preprint arXiv:2002.07129, 2020 - arxiv.org

In this article, we consider the (double) minimization problem $$\min\left\{P

(E;\Omega)+\lambda W_p (E, F):~ E\subseteq\Omega,~ F\subseteq\mathbb {R}^ d,~\lvert

E\cap F\rvert= 0,~\lvert E\rvert=\lvert F\rvert= 1\right\}, $$ where $ p\geqslant 1$, $\Omega …

  Related articles All 4 versions 


[PDF] arxiv.org

The Spectral-Domain W2 Wasserstein Distance for Elliptical Processes and the Spectral-Domain Gelbrich Bound

S Fang, Q Zhu - arXiv preprint arXiv:2012.04023, 2020 - arxiv.org

In this short note, we introduce the spectral-domain $\mathcal {W} _2 $ Wasserstein distance

for elliptical stochastic processes in terms of their power spectra. We also introduce the

spectral-domain Gelbrich bound for processes that are not necessarily elliptical. Subjects …

The Spectral-Domain W2 Wasserstein Distance for Elliptical Processes and the Spectral-Domain Gelbrich Bound

Song, Fang; Zhu, Quanyan. arXiv.org; Ithaca, Jan 6, 2021.


Wasserstein cycle-consistent generative adversarial network for improved seismic impedance inversion: Example on 3D SEAM model

A Cai, H Di, Z Li, H Maniar, A Abubakar - SEG Technical Program …, 2020 - library.seg.org

The convolutional neural networks (CNNs) have attracted great attentions in seismic

exploration applications by their capability of learning the representations of data with

multiple level of abstractions, given an adequate amount of labeled data. In seismic …

Cited by 10 Related articles All 2 versions

[PDF] arxiv.org

An LP-based, strongly-polynomial 2-approximation algorithm for sparse Wasserstein barycenters

S Borgwardt - Operational Research, 2020 - Springer

Discrete Wasserstein barycenters correspond to optimal solutions of transportation problems

for a set of probability measures with finite support. Discrete barycenters are measures with

finite support themselves and exhibit two favorable properties: there always exists one with a …

  Cited by 2 Related articles All 2 versions


[PDF] arxiv.org

Hierarchical Low-Rank Approximation of Regularized Wasserstein distance

M Motamed - arXiv preprint arXiv:2004.12511, 2020 - arxiv.org

Sinkhorn divergence is a measure of dissimilarity between two probability measures. It is

obtained through adding an entropic regularization term to Kantorovich's optimal transport

problem and can hence be viewed as an entropically regularized Wasserstein distance …

  Related articles All 3 versions 


Generating Hyperspectral Data Based on 3D CNN and Improved Wasserstein Generative Adversarial Network Using Homemade High-resolution Datasets

Y Li, D Huang - Proceedings of the International Conference on …, 2020 - dl.acm.org

Hyperspectral images contain rich information on the fingerprints of materials and are being

popularly used in the exploration of oil and gas, environmental monitoring, and remote

sensing. Since hyperspectral images cover a wide range of wavelengths with high …

   Related articles


Wasserstein-Distance-Based Temporal Clustering for Capacity-Expansion Planning in Power Systems

L Condeixa, F Oliveira… - … Conference on Smart …, 2020 - ieeexplore.ieee.org

As variable renewable energy sources are steadily incorporated in European power

systems, the need for higher temporal resolution in capacity-expansion models also

increases. Naturally, there exists a trade-off between the amount of temporal data used to …

 

Convergence in Monge-Wasserstein Distance of Mean Field Systems with Locally Lipschitz Coefficients

DT Nguyen, SL Nguyen, NH Du - Acta Mathematica Vietnamica, 2020 - Springer

This paper focuses on stochastic systems of weakly interacting particles whose dynamics

depend on the empirical measures of the whole populations. The drift and diffusion

coefficients of the dynamical systems are assumed to be locally Lipschitz continuous and …

 Related articles


[PDF] aimsciences.org

Exponential convergence in the Wasserstein metric W_1 for one dimensional diffusions

L Cheng, R Li, L Wu - Discrete & Continuous Dynamical Systems-A, 2020 - aimsciences.org

In this paper, we find some general and efficient sufficient conditions for the exponential

convergence W_{1,d}(P_t(x,·), P_t(y,·)) ≤ K e^{−δt} d(x, y) for the semigroup (P_t) of one-

dimensional diffusion. Moreover, some sharp estimates of the involved constants K≥ 1, δ> 0 …

  Related articles All 2 versions 

<——2020———2020———-  1010—— 


W-LDMM: A wasserstein driven low-dimensional manifold model for noisy image restoration

R He, X Feng, W Wang, X Zhu, C Yang - Neurocomputing, 2020 - Elsevier

The Wasserstein distance originated from the optimal transport theory is a general and

flexible statistical metric in a variety of image processing problems. In this paper, we propose

a novel Wasserstein driven low-dimensional manifold model (W-LDMM), which tactfully …

  Cited by 3 Related articles All 2 versions


[PDF] arxiv.org

Berry-Esseen smoothing inequality for the Wasserstein metric on compact Lie groups

B Borda - arXiv preprint arXiv:2005.04925, 2020 - arxiv.org

We prove a general inequality estimating the distance of two probability measures on a

compact Lie group in the Wasserstein metric in terms of their Fourier transforms. The result is

close to being sharp. We use a generalized form of the Wasserstein metric, related by …

  Related articles All 2 versions 


RDA-UNET-WGAN: An Accurate Breast Ultrasound Lesion Segmentation Using Wasserstein Generative Adversarial Networks

A Negi, ANJ Raj, R Nersisson, Z Zhuang… - … FOR SCIENCE AND …, 2020 - Springer

Early-stage detection of lesions is the best possible way to fight breast cancer, a disease

with the highest malignancy ratio among women. Though several methods primarily based

on deep learning have been proposed for tumor segmentation, it is still a challenging …

  Cited by 3 Related articles


[PDF] researchgate.net

Non-Gaussian BLE-Based Indoor Localization Via Gaussian Sum Filtering Coupled with Wasserstein Distance

P Malekzadeh, S Mehryar, P Spachos… - ICASSP 2020-2020 …, 2020 - ieeexplore.ieee.org

With recent breakthroughs in signal processing, communication and networking systems, we

are more and more surrounded by smart connected devices empowered by the Internet of

Thing (IoT). Bluetooth Low Energy (BLE) is considered as the main-stream technology to …

   Cited by 2 Related articles All 3 versions


[PDF] arxiv.org

Independent Elliptical Distributions Minimize Their W2 Wasserstein Distance from Independent Elliptical Distributions with the Same Density Generator

S Fang, Q Zhu - arXiv preprint arXiv:2012.03809, 2020 - arxiv.org

This short note is on a property of the $\mathcal {W} _2 $ Wasserstein distance which

indicates that independent elliptical distributions minimize their $\mathcal {W} _2 $

Wasserstein distance from given independent elliptical distributions with the same density …

  All 2 versions 


Functional Data Clustering Analysis via the Learning of Gaussian Processes with Wasserstein Distance

T Li, J Ma - International Conference on Neural Information …, 2020 - Springer

Functional data clustering analysis becomes an urgent and challenging task in the new era

of big data. In this paper, we propose a new framework for functional data clustering

analysis, which adopts a similar structure as the k-means algorithm for the conventional …

 

[PDF] arxiv.org

Wasserstein Collaborative Filtering for Item Cold-start Recommendation

Y Meng, X Yan, W Liu, H Wu, J Cheng - … of the 28th ACM Conference on …, 2020 - dl.acm.org

Item cold-start recommendation, which predicts user preference on new items that have no

user interaction records, is an important problem in recommender systems. In this paper, we

model the disparity between user preferences on warm items (those having interaction …

  Related articles All 3 versions


[PDF] epfl.ch

Wasserstein Distributionally Robust Learning

S Shafieezadeh Abadeh - 2020 - infoscience.epfl.ch

Many decision problems in science, engineering, and economics are affected by

uncertainty, which is typically modeled by a random variable governed by an unknown

probability distribution. For many practical applications, the probability distribution is only …

  

Numeric Data Augmentation using Structural Constraint Wasserstein Generative Adversarial Networks

W Wang, C Wang, T Cui, R Gong… - … on Circuits and …, 2020 - ieeexplore.ieee.org

Some recent studies have suggested using GANs for numeric data generation such as to

generate data for completing the imbalanced numeric data. Considering the significant

difference between the dimensions of the numeric data and images, as well as the strong …

 

[PDF] uwaterloo.ca

Wasserstein Adversarial Robustness

K Wu - 2020 - uwspace.uwaterloo.ca

Deep models, while being extremely flexible and accurate, are surprisingly vulnerable

to``small, imperceptible''perturbations known as adversarial attacks. While the majority of

existing attacks focus on measuring perturbations under the $\ell_p $ metric, Wasserstein  …

<——2020———2020———-  1020——

[PDF] arxiv.org

Encoded Prior Sliced Wasserstein AutoEncoder for learning latent manifold representations

S Krishnagopal, J Bedrossian - arXiv preprint arXiv:2010.01037, 2020 - arxiv.org

While variational autoencoders have been successful generative models for a variety of

tasks, the use of conventional Gaussian or Gaussian mixture priors are limited in their ability

to capture topological or geometric properties of data in the latent representation. In this …

 Related articles All 3 versions 

[PDF] mit.edu

Wasserstein barycenters: statistics and optimization

AJ Stromme - 2020 - dspace.mit.edu

We study a geometric notion of average, the barycenter, over 2-Wasserstein space. We

significantly advance the state of the art by introducing extendible geodesics, a simple

synthetic geometric condition which implies non-asymptotic convergence of the empirical …

  

Semantic Inpainting with Multi-dimensional Adversarial Network and Wasserstein Distance

H Wang, L Jiao, R Bie, H Wu - Chinese Conference on Pattern …, 2020 - Springer

Inpainting represents a procedure which can restore the lost parts of an image based upon

the residual information. We present an inpainting network that consists of an Encoder-

Decoder pipeline and a multi-dimensional adversarial network. The Encoder-Decoder …

 

[PDF] colostate.edu

[PDF] Vietoris–Rips metric thickenings and Wasserstein spaces

J Mirth - 2020 - math.colostate.edu

If the vertex set, X, of a simplicial complex, K, is a metric space, then K can be interpreted as

a subset of the Wasserstein space of probability measures on X. Such spaces are called

simplicial metric thickenings, and a prominent example is the Vietoris–Rips metric …

  Cited by 1 Related articles All 2 versions 


[PDF] tins.ro

Enhancing the Classification of EEG Signals using Wasserstein Generative Adversarial Networks

VM Petruţiu, LD Palcu, C Lemnaur… - 2020 IEEE 16th …, 2020 - ieeexplore.ieee.org

Collecting EEG signal data during a human visual recognition task is a costly and time-

consuming process. However, training good classification models usually requires a large

amount of quality data. We propose a data augmentation method based on Generative …

  All 2 versions


[PDF] biorxiv.org


Gromov-Wasserstein optimal transport to align single-cell multi-omics data

P Demetci, R Santorella, B Sandstede, WS Noble… - BioRxiv, 2020 - biorxiv.org

Data integration of single-cell measurements is critical for our understanding of cell

development and disease, but the lack of correspondence between different types of single-

cell measurements makes such efforts challenging. Several unsupervised algorithms are …

Cited by 35 Related articles All 7 versions

[PDF] arxiv.org

Entropy-Regularized 2-Wasserstein Distance between Gaussian Measures

A Mallasto, A Gerolin, HQ Minh - arXiv preprint arXiv:2006.03416, 2020 - arxiv.org

Gaussian distributions are plentiful in applications dealing in uncertainty quantification and

diffusivity. They furthermore stand as important special cases for frameworks providing

geometries for probability measures, as the resulting geometry on Gaussians is often …

Cited by 7 Related articles All 3 versions 
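
For comparison with the entropic quantity above, the unregularized 2-Wasserstein distance between Gaussian measures has the classical closed form

W_2^2(\mathcal{N}(m_1, C_1), \mathcal{N}(m_2, C_2)) = \|m_1 - m_2\|^2 + \operatorname{tr}\left(C_1 + C_2 - 2\,(C_1^{1/2} C_2\, C_1^{1/2})^{1/2}\right),

which the entropy-regularized version recovers in the limit of vanishing regularization.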

[PDF] dergipark.org.tr

Wasserstein Riemannian Geometry on Statistical Manifold

C Ogouyandjou, N Wadagni - International Electronic Journal of …, 2020 - dergipark.org.tr

In this paper, we study some geometric properties of statistical manifold equipped with the

Riemannian Otto metric which is related to the L 2-Wasserstein distance of optimal mass

transport. We construct some α-connections on such manifold and we prove that the …

  All 2 versions 

MR4170073 Pending Ogouyandjou, Carlos; Wadagni, Nestor; Wasserstein Riemannian geometry on statistical manifold. Int. 

Electron. J. Geom. 13 (2020), no. 2, 144–151. 53B12 (60D05 62B11)

Related articles
 All 4 versions 




A Generative Model for Zero-Shot Learning via Wasserstein Auto-encoder

X Luo, Z Cai, F Wu, J Xiao-Yuan - 2020 IEEE International …, 2020 - ieeexplore.ieee.org

Zero-shot learning aims to use the labeled instances to train the model, and then classifies

the instances that belong to a class without labeled instances. However, the training

instances and test instances are disjoint. Thus, the description of the classes (eg text …

<——2020———2020—————-—-  1030—— 


[PDF] arxiv.org

Exponential Convergence in Entropy and Wasserstein Distance for McKean-Vlasov SDEs

P Ren, FY Wang - arXiv preprint arXiv:2010.08950, 2020 - arxiv.org

The following type exponential convergence is proved for (non-degenerate or degenerate)

McKean-Vlasov SDEs: $$ W_2 (\mu_t,\mu_\infty)^ 2+{\rm Ent}(\mu_t|\mu_\infty)\le c {\rm e}^{-

\lambda t}\min\big\{W_2 (\mu_0,\mu_\infty)^ 2,{\rm Ent}(\mu_0|\mu_\infty)\big\},\\t\ge 1 …

  All 2 versions 


[PDF] arxiv.org

Unsupervised Wasserstein Distance Guided Domain Adaptation for 3D Multi-domain Liver Segmentation

C You, J Yang, J Chapiro, JS Duncan - Interpretable and Annotation …, 2020 - Springer

Deep neural networks have shown exceptional learning capability and generalizability in

the source domain when massive labeled data is provided. However, the well-trained

models often fail in the target domain due to the domain shift. Unsupervised domain …

Cited by 19 Related articles All 3 versions


[PDF] iop.org

Data Augmentation Method for Switchgear Defect Samples Based on Wasserstein Generative Adversarial Network

X Huang, J Xiong, Y Zhang, J Liang… - Journal of Physics …, 2020 - iopscience.iop.org

The problem of sample imbalance will lead to poor generalization ability of the deep

learning model algorithm, and the phenomenon of overfitting during network training, which

limits the accuracy of intelligent fault diagnosis of switchgear equipment. In view of this, this …

 

[HTML] peerj.com

[HTML] Correcting nuisance variation using Wasserstein distance

G Tabak, M Fan, S Yang, S Hoyer, G Davis - PeerJ, 2020 - peerj.com

Profiling cellular phenotypes from microscopic imaging can provide meaningful biological

information resulting from various factors affecting the cells. One motivating application is

drug development: morphological cell features can be captured from images, from which …

  Related articles All 8 versions 


Fixed-Support Wasserstein Barycenter: Computational Hardness and Efficient Algorithms

T Lin, N Ho, X Chen, M Cuturi, MI Jordan - 2020 - research.google

We study in this paper the finite-support Wasserstein barycenter problem (FS-WBP), which

consists in computing the Wasserstein barycenter of $ m $ discrete probability measures

supported on a finite metric space of size $ n $. We show first that the constraint matrix …

  


[PDF] sjtu.edu.cn

[PDF] Kalman-Wasserstein Gradient Flows

F Hoffmann - 2020 - ins.sjtu.edu.cn

Parameter calibration and uncertainty in complex computer models. Ensemble Kalman

Inversion (for optimization). Ensemble Kalman Sampling (for sampling). Kalman-Wasserstein

gradient flow structure … Minimize E : Ω → R, where Ω ⊆ R^N … Dynamical …

  Related articles All 4 versions 

Convergence rates of the blocked Gibbs sampler with random scan in the Wasserstein metric

NY Wang, G Yin - Stochastics, 2020 - Taylor & Francis

To approximate μ, various scan Gibbs samplers with updating blocks are often used [1 J.

Besag, P. Green, D. Higdon, and K. Mengersen, Bayesian computation and stochastic

systems, Statist. Sci. 10(1) (1995), pp. 3–41. doi: 10.1214/ss/1177010123[Crossref], [Web of …

  Related articles All 3 versions


[PDF] unipv.it

[PDF] On the equivalence between Fourier-based and Wasserstein metrics

G Auricchio, A Codegoni, S Gualandi, G Toscani… - eye - mate.unipv.it

We investigate properties of some extensions of a class of Fourierbased probability metrics,

originally introduced to study convergence to equilibrium for the solution to the spatially

homogeneous Boltzmann equation. At difference with the original one, the new Fourier …

  

Learning Wasserstein Distance-Based Gaussian Graphical Model for Multivariate Time Series Classification

HU Xuegang, L Jianxing, LI Peipei… - 2020 IEEE …, 2020 - ieeexplore.ieee.org

Multivariate time series classification occupies an important position in time series data

mining tasks and has been applied in many fields. However, due to the statistical coupling

between different variables of Multivariate Time Series (MTS) data, traditional classification …

Cited by 1 Related articles All 2 versions

A Super Resolution Method for Remote Sensing Images Based on Cascaded Conditional Wasserstein GANs

B Liu, H Li, Y Zhou, Y Peng, A Elazab… - 2020 IEEE 3rd …, 2020 - ieeexplore.ieee.org

High-resolution (HR) remote sensing imagery is quite beneficial for subsequent

interpretation. Obtaining HR images can be achieved by upgrading the imaging device. Yet,

the cost to perform this task is very huge. Thus, it is necessary to obtain HR images from low …

 <——2020—————2020———-  1040——


  [HTML] springer.com

[HTML] Fréchet Means in the Wasserstein Space 

VM Panaretos, Y Zemel - International Workshop on Functional and …, 2020 - Springer

The concept of a Fréchet mean (Fréchet [55]) generalises the notion of mean to a more general

metric space by replacing the usual “sum of squares” with a “sum of squared distances”, giving

rise to the so-called Fréchet functional. A closely related notion is that of a Karcher mean (Karcher …

  Related articles


[PDF] jst.go.jp

Orthogonal Gradient Penalty for Fast Training of Wasserstein GAN Based Multi-Task Autoencoder toward Robust Speech Recognition

CY Kao, S Park, A Badi, DK Han… - IEICE TRANSACTIONS on …, 2020 - search.ieice.org

Performance in Automatic Speech Recognition (ASR) degrades dramatically in noisy

environments. To alleviate this problem, a variety of deep networks based on convolutional

neural networks and recurrent neural networks were proposed by applying L1 or L2 loss. In …

 Cited by 1 Related articles All 5 versions
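
For context, a minimal PyTorch sketch of the standard WGAN gradient penalty of Gulrajani et al., which the orthogonal variant above modifies; "critic" is any module mapping a batch to scalar scores, and the penalty weight is the usual illustrative default:

import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    # Interpolate between real and generated samples, then penalize ||grad critic|| != 1.
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    mixed = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(mixed)
    grads = torch.autograd.grad(scores.sum(), mixed, create_graph=True)[0]
    grad_norm = grads.flatten(start_dim=1).norm(2, dim=1)
    return lambda_gp * ((grad_norm - 1.0) ** 2).mean()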


2020 thesis
Wasserstein barycenters: statistics and optimization
Authors: Austin J. Stromme, Massachusetts Institute of Technology
Abstract: We study a geometric notion of average, the barycenter, over 2-Wasserstein space. We significantly advance the state of the art by introducing extendible geodesics, a simple synthetic geometric condition which implies non-asymptotic convergence of the empirical barycenter in non-negatively curved spaces such as Wasserstein space. We further establish convergence of first-order methods in the Gaussian case, overcoming the nonconvexity of the barycenter functional. These results are accomplished by various novel geometrically inspired estimates for the barycenter functional including a variance inequality, new so-called quantitative stability estimates, and a Polyak-Łojasiewicz (PL) inequality. These inequalities may be of independent interest.
Thesis, Dissertation, 2020
English
Publisher:2020

 

[PDF] ethz.ch

[PDF] Smooth Wasserstein Distance: Metric Structure and Statistical Efficiency

Z Goldfeld - International Zurich Seminar on Information …, 2020 - research-collection.ethz.ch

The Wasserstein distance has seen a surge of interest and applications in machine learning.

Its popularity is driven by many advantageous properties it possesses, such as metric

structure (metrization of weak convergence), robustness to support mismatch, compatibility …

  Related articles All 5 versions 
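
A minimal sketch of how the Gaussian-smoothed Wasserstein distance discussed above can be estimated from samples in one dimension: convolving each measure with N(0, sigma^2) amounts to adding independent Gaussian noise to the samples before computing the plain empirical W_1; sigma and the sample sizes are illustrative:

import numpy as np
from scipy.stats import wasserstein_distance

def smoothed_w1(x, y, sigma=0.5, seed=0):
    rng = np.random.default_rng(seed)
    return wasserstein_distance(x + sigma * rng.normal(size=x.shape),
                                y + sigma * rng.normal(size=y.shape))

x = np.random.default_rng(1).normal(size=2000)
y = np.random.default_rng(2).normal(loc=0.3, size=2000)
print(wasserstein_distance(x, y), smoothed_w1(x, y))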


[PDF] researchgate.net

[PDF] Wasserstein Barycenters for Bayesian Learning: Technical Report

G Rios - 2020 - researchgate.net

Within probabilistic modelling, a crucial but challenging task is that of learning (or fitting) the

models. For models described by a finite set of parameters, this task is reduced to finding the

best parameters, to feed them into the model and then calculate the posterior distribution to …

  Related articles 

Input limited Wasserstein GAN

F Cao, H Zhao, P Liu, P Li - Second Target Recognition and …, 2020 - spiedigitallibrary.org

Generative adversarial networks (GANs) has proven hugely successful, but suffer from train

instability. The recently proposed Wasserstein GAN (WGAN) has largely overcome the

problem, but can still fail to converge in some case or be to complex. It has been found that …

Related articles All 3 versions


[PDF] aimsciences.org

A convergent Lagrangian discretization for p-Wasserstein and flux-limited diffusion equations

B Söllner, O Junge - Communications on Pure & Applied Analysis, 2020 - aimsciences.org

We study a Lagrangian numerical scheme for solving a nonlinear drift diffusion equations of

the form ∂_t u = ∂_x (u·(c)[∂_x h(u) + v]), like Fokker-Planck and q-Laplace equations, on an

interval. This scheme will consist of a spatio-temporal discretization founded on the …

  All 3 versions 


Synthesising Tabular Datasets Using Wasserstein Conditional GANS with Gradient Penalty (WCGAN-GP)

S McKeever, M Singh Walia - 2020 - arrow.tudublin.ie

Deep learning based methods based on Generative Adversarial Networks (GANs) have

seen remarkable success in data synthesis of images and text. This study investigates the

use of GANs for the generation of tabular mixed dataset. We apply Wasserstein Conditional …

  

[PDF] researchgate.net

[PDF] THE α-z-BURES WASSERSTEIN DIVERGENCE

TH DINH, CT LE, BK VO, TD VUONG - researchgate.net

Φ(A, B) = Tr((1−α)A + αB) − Tr(Q_{α,z}(A, B)), where Q_{α,z}(A, B) = (A^{(1−α)/(2z)} B^{α/z} A^{(1−α)/(2z)})^z

is the matrix function in the α-z-Rényi relative entropy. We show that for 0 ≤ α ≤ z ≤ 1, the

quantity Φ (A, B) is a quantum divergence and satisfies the Data Processing Inequality in …

  

[PDF] future-in-tech.net

[PDF] Wasserstein Riemannian geometry of Gamma densities

C Ogouyandjou, N Wadagni - Computer Science, 2020 - ijmcs.future-in-tech.net

Abstract A Wasserstein Riemannian Gamma manifold is a space of Gamma probability

density functions endowed with the Riemannian Otto metric which is related to the

Wasserstein distance. In this paper, we study some geometric properties of such Riemanian …
Related articles
 

<——2020———2020———-  1050——

[PDF] sabanciuniv.edu

Cyclic Adversarial Framework with Implicit Autoencoder and Wasserstein Loss (CAFIAWL)

E Bonabi Mobaraki - 2020 - research.sabanciuniv.edu

Since the day that the Simple Perceptron was invented, Artificial Neural Networks (ANNs)

attracted many researchers. Technological improvements in computers and the internet

paved the way for unseen computational power and an immense amount of data that …

 

Synthetic Data Generation Using Wasserstein Conditional Gans With Gradient Penalty (WCGANS-GP)

M Singh Walia - 2020 - arrow.tudublin.ie

With data protection requirements becoming stricter, the data privacy has become

increasingly important and more crucial than ever. This has led to restrictions on the

availability and dissemination of real-world datasets. Synthetic data offers a viable solution …

  

[PDF] jku.at

WGAIN: Data Imputation using Wasserstein GAIN/submitted by Christina Halmich

C Halmich - 2020 - epub.jku.at

Missing data is a well known problem in the Machine Learning world. A lot of datasets that

are used for training algorithms contain missing values, eg 45% of the datasets stored in the

UCI Machine Learning Repository [16], which is a commonly used dataset collection …

  All 2 versions 


[PDF] ibpsa.org

[PDF] Comparing Bottom-Up Energy Consumption Models Using The Wasserstein Distance Between Load Profile Histograms

E Sanderson, A Fragaki, J Simo… - BSO-V 2020: IBPSA …, 2020 - ibpsa.org

This paper presents a comparison of bottom up models that generate appliance load

profiles. The comparison is based on their ability to accurately distribute load over time-of-

day. This is a key feature of model performance if the model is used to assess the impact of …

   Related articles All 2 versions 


2020

Artificial Neural Network with Histogram Data Time Series Forecasting: A Least Squares Approach Based on Wasserstein Distance

P Rakpho, W Yamaka, K Zhu - Behavioral Predictive Modeling in …, 2020 - Springer

This paper aims to predict the histogram time series, and we use the high-frequency data

with 5-min to construct the Histogram data for each day. In this paper, we apply the Artificial

Neural Network (ANN) to Autoregressive (AR) structure and introduce the AR—ANN model …

  All 3 versions

[PDF] uniroma1.it

[PDF] Nonparametric Density Estimation with Wasserstein Distance for Actuarial Applications

EG Luini - iris.uniroma1.it

Density estimation is a central topic in statistics and a fundamental task of actuarial sciences.

In this work, we present an algorithm for approximating multivariate empirical densities with

a piecewise constant distribution defined on a hyperrectangular-shaped partition of the …

  Related articles All 2 versions 

 

Evaluation of seismic data interpolation performance using U-Net and cWGAN (original title in Korean)

유지윤, 윤대웅 - Geophysics and Geophysical Exploration (지구물리와 물리탐사), 2022 - papersearch.net

… , and conditional Wasserstein GAN (cWGAN) were used as seismic data … cWGAN showed

better prediction performance than U-Net with higher PSNR and SSIM. However, cWGAN …


[PDF] esaim-ps.org

Squared quadratic Wasserstein distance: optimal couplings and Lions differentiability

A Alfonsi, B Jourdain - ESAIM: Probability and Statistics, 2020 - esaim-ps.org

In this paper, we remark that any optimal coupling for the quadratic Wasserstein distance

between two probability measures μ and ν with finite second order moments on ℝ^d is the

composition of a martingale coupling with an optimal transport map. We check the existence …

 MR4174419 Prelim Alfonsi, Aurélien; Jourdain, Benjamin; Squared quadratic Wasserstein distance: optimal couplings and Lions differentiability. ESAIM Probab. Stat. 24 (2020), 703–717. 49Q22 (49J50 58B10 60E15 60G42)

Cited by 1 Related articles All 8 versions


[PDF] sci-en-tech.com

[PDF] Entropy-regularized Wasserstein Distances for Analyzing Environmental and Ecological Data

H Yoshioka, Y Yoshioka, Y Yaegashi - THE 11TH …, 2020 - sci-en-tech.com

We explore applicability of entropy-regularized Wasserstein (pseudo-) distances as new

tools for analyzing environmental and ecological data. In this paper, the two specific

examples are considered and are numerically analyzed using the Sinkhorn algorithm. The …

  All 2 versions 


Wasserstein Control of Mirror Langevin Monte Carlo

KS Zhang, G Peyré, J Fadili, M Pereyra - arXiv, 2020 - ui.adsabs.harvard.edu

Discretized Langevin diffusions are efficient Monte Carlo methods for sampling from high

dimensional target densities that are log-Lipschitz-smooth and (strongly) log-concave. In

particular, the Euclidean Langevin Monte Carlo sampling algorithm has received much …

 Cited by 14 Related articles All 16 versions 
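
For orientation, a minimal sketch of the plain (Euclidean) unadjusted Langevin algorithm that the mirror variant above generalizes, x_{k+1} = x_k - h * grad U(x_k) + sqrt(2h) * xi_k, here targeting a standard Gaussian; step size and chain length are illustrative:

import numpy as np

def ula(grad_U, x0, step=0.05, n_steps=5000, seed=0):
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    samples = []
    for _ in range(n_steps):
        x = x - step * grad_U(x) + np.sqrt(2.0 * step) * rng.normal(size=x.shape)
        samples.append(x.copy())
    return np.array(samples)

samples = ula(lambda x: x, x0=np.zeros(2))        # U(x) = ||x||^2 / 2, so grad U(x) = x
print(samples.mean(axis=0), samples.var(axis=0))  # roughly 0 and 1 for the Gaussian target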

<——2020———2020———-  1060——


  

[PDF] optimization-online.org

[PDF] A Novel Solution Methodology for Wasserstein-based Data-Driven Distributionally Robust Problems

CA Gamboa, DM Valladao, A Street… - optimization-online.org

Distributionally robust optimization (DRO) is a mathematical framework to incorporate

ambiguity over the actual data-generating probability distribution. Data-driven DRO

problems based on the Wasserstein distance are of particular interest for their sound …

  

Isometries of Wasserstein spaces

GP Gehér, T Titkos, D Virosztek - halgebra.math.msu.su

Due to its nice theoretical properties and an astonishing number of applications via optimal

transport problems, probably the most intensively studied metric nowadays is the p-

Wasserstein metric. Given a complete and separable metric space X and a real number p≥ …

System and method for unsupervised domain adaptation via sliced-wasserstein distance

AJ Gabourie, M Rostami, S Kolouri… - US Patent App. 16 …, 2020 - freepatentsonline.com

Described is a system for unsupervised domain adaptation in an autonomous learning

agent. The system adapts a learned model with a set of unlabeled data from a target

domain, resulting in an adapted model. The learned model was previously trained to …

  Cited by 2 Related articles 


Considering Anatomical Prior Information for Low-dose CT Image Enhancement Using Attribute-Augmented Wasserstein Generative Adversarial Networks

Z Huang, X Liu, R Wang, J Chen, P Lu, Q Zhang… - Neurocomputing, 2020 - Elsevier

Currently, many deep learning (DL)-based low-dose CT image postprocessing technologies

fail to consider the anatomical differences in training data among different human body sites,

such as the cranium, lung and pelvis. In addition, we can observe evident anatomical …

 

[PDF] psu.edu

[PDF] Subexponential upper and lower bounds in Wasserstein distance for Markov processes

A Arapostathis, G Pang, N Sandric - personal.psu.edu

In this article, relying on Foster-Lyapunov drift conditions, we establish subexponential

upper and lower bounds on the rate of convergence in the Lp-Wasserstein distance for a

class of irreducible and aperiodic Markov processes. We further discuss these results in the …

  Related articles 


Optimality in weighted L2-Wasserstein goodness-of-fit statistics

T de Wet, V Humble - South African Statistical Journal, 2020 - journals.co.za

In Del Barrio, Cuesta-Albertos, Matran and Rodriguez-Rodriguez (1999) and Del Barrio,

Cuesta-Albertos and Matran (2000), the authors introduced a new class of goodness-of-fit

statistics based on the L2-Wasserstein distance. It was shown that the desirable property of …

  Related articles


[PDF] researchgate.net

[PDF] ADDENDUM TO "ISOMETRIC STUDY OF WASSERSTEIN SPACES – THE REAL LINE"

GP GEHÉR, T TITKOS, D VIROSZTEK - researchgate.net

We show an example of a Polish metric space X whose quadratic Wasserstein space W2 (X)

possesses an isometry that splits mass. This gives an affirmative answer to Kloeckner's

question,[2, Question 2]. Let us denote the metric space ([0, 1],|·|), equipped with the usual …

  Related articles 


[PDF] amazonaws.com

[PDF] Bayesian Wasserstein GAN and Application for Vegetable Disease Image Data

W Cho, MH Na, S Kang, S Kim - manuscriptlink-society-file.s3 …

Various GAN models have been proposed so far and they are used in various fields.

However, despite the excellent performance of these GANs, the biggest problem is that the

model collapse occurs in the simultaneous optimization of the generator and discriminator of …

  

[PDF] brown.edu

[PDF] Reduced-order modeling of transport equations using Wasserstein spaces

V Ehrlacher, D Lombardi, O MulaFX Vialard - icerm.brown.edu

Page 1. Introduction to Wassertein spaces and barycenters Model order reduction of parametric

transport equations Reduced-order modeling of transport equations using Wasserstein spaces

V. Ehrlacher1, D. Lombardi 2, O. Mula 3, F.-X. Vialard 4 1Ecole des Ponts ParisTech & INRIA …

  Related articles 

 

[PDF] semanticscholar.org

[PDF] Deconvolution for the Wasserstein metric and topological inference

B Michel - pdfs.semanticscholar.org


[CITATION] Deconvolution for the Wasserstein metric and topological inference

B Michel

<——2020———2020———-  1070——


 

Semi-supervised Surface Wave Tomography with Wasserstein Cycle-consistent GAN: Method and Application on Southern California Plate Boundary Region

A Cai, H Qiu, F Niu - 2020 - essoar.org

Machine learning algorithm is applied to shear wave velocity (Vs) inversion in surface wave

tomography, where a set of 1-D Vs profiles and the corresponding synthetic dispersion

curves are used in network training. Previous studies showed that performances of a trained …

 

 

 

[PDF] ams.org

Nonpositive curvature, the variance functional, and the Wasserstein barycenter

YH Kim, B Pass - Proceedings of the American Mathematical Society, 2020 - ams.org

We show that a Riemannian manifold $ M $ has nonpositive sectional curvature and is

simply connected if and only if the variance functional on the space $ P (M) $ of probability

measures over $ M $ is displacement convex. We then establish convexity over Wasserstein  …

  Cited by 1 Related articles All 2 versions


2020

[PDF] esaim-proc.org

Statistical data analysis in the Wasserstein space

J Bigot - ESAIM: Proceedings and Surveys, 2020 - esaim-proc.org

This paper is concerned by statistical inference problems from a data set whose elements may be modeled as random probability measures such as multiple histograms or point clouds. We propose to review recent contributions in statistics on the use of Wasserstein  …

  Cited by 2 Related articles

  

[PDF] jst.go.jp

On an optimal control problem for discrete-time systems with the Wasserstein distance as the evaluation function (original title in Japanese)

星野健太 - Proceedings of the 63rd Japan Joint Automatic Control Conference (自動制御連合講演会講演論文集), 2020 - jstage.jst.go.jp

Abstract– This paper discusses an optimal control problem with the terminal cost given by the

Wasser- stein distance. The problem is formulated as the control problem regarding the probability

distributions of the state variables. This paper discusses a necessary condition of the optimality …

 


[PDF] polimi.it

Wasserstein K-means for clustering of probability measures and applications (original title in Italian)

R TAFFONI - 2020 - politesi.polimi.it

Abstract (translated from Italian): The thesis studies the Wasserstein distance, in the general case and in the

discrete case, applied to the K-means algorithm, which is described step by step. Finally, the

algorithm is applied to artificial data and a dataset …


Related articles

[PDF] Dual Rejection Sampling for Wasserstein Auto-Encoders

L Hou, H Shen, X Cheng - 24th European Conference on Artificial …, 2020 - ecai2020.eu

Deep generative models enhanced by Wasserstein distance have achieved remarkable

success in recent years. Wasserstein Auto-Encoders (WAEs) are auto-encoder based

generative models that aim to minimize the Wasserstein distance between the data …

  Cited by 1 All 2 versions 


[PDF] arxiv.org

A variational finite volume scheme for Wasserstein gradient flows

C Cancès, TO Gallouët, G Todeschi - Numerische Mathematik, 2020 - Springer

We propose a variational finite volume scheme to approximate the solutions to Wasserstein

gradient flows. The time discretization is based on an implicit linearization of the …

 Cited by 8 Related articles All 10 versions

Importance-aware semantic segmentation in self-driving with discrete wasserstein training

X Liu, Y Han, S Bai, Y Ge, T Wang, X Han, S Li… - Proceedings of the …, 2020 - ojs.aaai.org

Semantic segmentation (SS) is an important perception manner for self-driving cars and

robotics, which classifies each pixel into a pre-determined class. The widely-used cross

entropy (CE) loss-based deep networks has achieved significant progress wrt the mean …

  Cited by 9 Related articles All 6 versions 

2020

[PDF] Wasserstein Barycenters for Bayesian Learning: Technical Report

G Rios - 2020 - researchgate.net

Within probabilistic modelling, a crucial but challenging task is that of learning (or fitting) the

models. For models described by a finite set of parameters, this task is reduced to finding the

best parameters, to feed them into the model and then calculate the posterior distribution to …

  Related articles 

<——2020———2020———-  1080——

book  [PDF] oapen.org

[BOOK] An Invitation to Statistics in Wasserstein Space

VM Panaretos, Y Zemel - 2020 - library.oapen.org

This open access book presents the key aspects of statistics in Wasserstein spaces, ie

statistics in the space of probability measures when endowed with the geometry of optimal

transportation. Further to reviewing state-of-the-art aspects, it also provides an accessible …

Cited by 64 Related articles All 8 versions 

[PDF] arxiv.org

McKean-Vlasov SDEs with Drifts Discontinuous under Wasserstein Distance

X Huang, FY Wang - arXiv preprint arXiv:2002.06877, 2020 - arxiv.org

Existence and uniqueness are proved for Mckean-Vlasov type distribution dependent SDEs

with singular drifts satisfying an integrability condition in space variable and the Lipschitz

condition in distribution variable with respect to $ W_0 $ or $ W_0+ W_\theta $ for some …

  Cited by 7 Related articles All 4 versions 

 MR4211198 Prelim Huang, Xing; Wang, Feng-Yu; McKean-Vlasov SDEs with drifts discontinuous under Wasserstein distance. Discrete Contin. Dyn. Syst. 41 (2021), no. 4, 1667–1679. 60H10

Review PDF Clipboard Journal Article

[PDF] arxiv.org

Transport and Interface: an Uncertainty Principle for the Wasserstein distance

A Sagiv, S Steinerberger - SIAM Journal on Mathematical Analysis, 2020 - SIAM

Let $f:(0,1)^d \to \mathbb{R}$ be a continuous function with zero mean and interpret $f_+=\max(f,0)$ and $f_-=-\min(f,0)$ as the densities of two measures. We prove th…

Attainability property for a probabilistic target in wasserstein ...

www.aimsciences.org › article › doi › dcds.2020300

Attainability property for a probabilistic target in wasserstein spaces. Discrete & Continuous Dynamical Systems - A, 2021, 41 (2) : 777-812. doi: 10.3934/dcds.

by G Cavagnari · ‎2020 · ‎Cited by 1 · ‎Related articles


High-Confidence Attack Detection via Wasserstein-Metric Computations

D Li, S Martínez - arXiv preprint arXiv:2003.07880, 2020 - arxiv.org

This paper considers a sensor attack and fault detection problem for linear cyber-physical

systems, which are subject to possibly non-Gaussian noise that can have an unknown light-

tailed distribution. We propose a new threshold-based detection mechanism that employs

the Wasserstein metric, and which guarantees system performance with high confidence.

The proposed detector may generate false alarms with a rate $\Delta $ in normal operation,

where $\Delta $ can be tuned to be arbitrarily small by means of a benchmark distribution …

Cited by 8 Related articles All 5 versions


2020


Typical wind power scenario generation for multiple wind farms using conditional improved Wasserstein generative adversarial network

Y Zhang, Q Ai, F Xiao, R Hao, T Lu - … Journal of Electrical Power & Energy …, 2020 - Elsevier

Because of environmental benefits, wind power is taking an increasing role meeting

electricity demand. However, wind power tends to exhibit large uncertainty and is largely

influenced by meteorological conditions. Apart from the variability, when multiple wind farms …

  Cited by 11 Related articles All 2 versions

[PDF] arxiv.org


2020 Dissertation or Thesis

Vietoris–Rips metric thickenings and Wasserstein spaces

mountainscholar.org › handle

Vietoris–Rips metric thickenings and Wasserstein spaces. ... Date Issued. 2020. Format. born digital; doctoral dissertations ...

by J Mirth · ‎Cited by 1 · ‎Related articles

Colorado State  pdf

Cited by 3 Related articles All 3 versions


A convergent Lagrangian discretization for p-Wasserstein and flux-limited diffusion equations

www.aimsciences.org › article › doi › cpaa.2020190

by B Söllner · 2020 · Cited by 2 — We study a Lagrangian numerical scheme for solving a nonlinear drift ... Fokker-Planck and $q$-Laplace equations, on an interval. ... A convergent Lagrangian discretization for p-Wasserstein and flux-limited ... Communications on Pure & Applied Analysis, 2020, 19 (9) : 4227-4256. doi: ...

online

 PEER-REVIEW

A convergent Lagrangian discretization for p-Wasserstein...

by Söllner, Benjamin; Junge, Oliver

Communications on pure and applied analysis, 2020, Volume 19, Issue 9

Article PDF Download Now (via Unpaywall) BrowZine PDF Icon

Journal ArticleFull Text Online

2020 PDF

Wasserstein Distributionally Robust Learning - Infoscience

infoscience.epfl.ch › record › files › EPFL_TH10012

2020. Présentée le 17 juin 2020. Prof. R. Seifert, président du jury. Prof. ... In this thesis, we use the Wasserstein distance to construct an ambiguity set with.

by S Shafieezadeh Abadeh · 2020

Cited by 1 Related articles

 2020

arXiv:2006.07458v6 [cs.LG] 20 Dec 2020 - arXiv.org

arxiv.org › pdf

Dec 20, 2020 — Projection Robust Wasserstein Distance and Riemannian ... [2020] further provided several fundamental statistical bounds for PRW ... PhD thesis, Ph. D. Dissertation, Dissertation de Mastere, Université College Gublin, Irlande ...

by T Lin · ‎2020 · ‎Cited by 2

<——2020——2020———1090——   


2020

Wu, Jiqing. Improving Wasserstein Generative Models for Image Synthesis and Enhancement.

Degree: 2020, ETH Zürich

URL: http://hdl.handle.net/20.500.11850/414485 

Subjects/Keywords: info:eu-repo/classification/ddc/004; Data processing, computer science

Improving Wasserstein Generative Models for Image ...

www.research-collection.ethz.ch › handle

by J Wu · 2020 — Improving Wasserstein Generative Models for Image Synthesis and Enhancement. Mendeley · CSV · RIS · BibTeX. Download. Full text (PDF, 56.08Mb).

[CITATION] Improving Wasserstein Generative Models for Image Synthesis and Enhancement

J Wu - 2020 - research-collection.ethz.ch

… Improving Wasserstein Generative Models for Image Synthesis and Enhancement … Download. Full text (PDF, 56.08Mb). Open access. Author. Wu, Jiqing. Date. 2020. Type …



2020

Wasserstein barycenters : statistics and optimization

Thesis: S.M., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, May, 2020. Cataloged from the official PDF of ...

by AJ Stromme · ‎2020


2020

Asset Allocation in ... - Erasmus University Thesis Repository

Asset Allocation in Emerging Market Space using Wasserstein Generative Adversarial Networks

Bakx, A. 2020-07-14. Asset Allocation in Emerging Market Space using Wasserstein Generative Adversarial Networks ... Thesis Advisor, Vermeulen, S.H.L.C.G..


2020

Joshua Mirth Ph.D. Colorado State University 2020  

Dissertation: Vietoris-Rips metric thickenings and Wasserstein spaces

Mathematics Subject Classification: 55—Algebraic topology


2020

 Wu, Kaiwen. Wasserstein Adversarial Robustness. 

Degree: 2020, University of Waterloo

URL: http://hdl.handle.net/10012/16345 

► Deep models, while being extremely flexible and accurate, are surprisingly vulnerable to ``small, imperceptible'' perturbations known as adversarial attacks. While the majority of existing attacks… (more)

Subjects/Keywords: Wasserstein distance; adversarial robustness; optimization


Related articles

2020

DISSERTATION

Aggregated Wasserstein Distance for Hidden Markov Models and Automated Morphological Characterization of Placenta from Photos

Chen, Yukun ; 2020

Aggregated Wasserstein Distance for Hidden Markov Models and Automated Morphological Characterization of Placenta from Photos

Online Access Available 

Aggregated Wasserstein distance for hidden Markov models and automated morphological characterization of Placenta from Photos

by Chen, Yukun; Wang, James Z
In the past decade, fueled by the rapid advances of big data technology and machine learning algorithms, data science has become a new paradigm of science and...
Dissertation/Thesis
Check Availability

 MR4327033 Thesis 

2020

Wasserstein Loss With Alternative Reinforcement Learning for Severity-Aware Semantic Segmentation

X Liu, Y Lu, X Liu, S Bai, S Li… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

Semantic segmentation is important for many real-world systems, eg, autonomous vehicles,

which predict the class of each pixel. Recently, deep networks achieved significant progress

wrt the mean Intersection-over Union (mIoU) with the cross-entropy loss. However, the cross …

 

2020

[PDF] arxiv.org

Stability of Gibbs Posteriors from the Wasserstein Loss for Bayesian Full Waveform Inversion

MM Dunlop, Y Yang - arXiv preprint arXiv:2004.03730, 2020 - arxiv.org

Recently, the Wasserstein loss function has been proven to be effective when applied to

deterministic full-waveform inversion (FWI) problems. We consider the application of this

loss function in Bayesian FWI so that the uncertainty can be captured in the solution. Other …

  Cited by 1 Related articles All 3 versions 


2020

W-LDMM: A wasserstein driven low-dimensional manifold model for noisy image restoration

R He, X Feng, W Wang, X Zhu, C Yang - Neurocomputing, 2020 - Elsevier

The Wasserstein distance originated from the optimal transport theory is a general and

flexible statistical metric in a variety of image processing problems. In this paper, we propose

a novel Wasserstein driven low-dimensional manifold model (W-LDMM), which tactfully …

  Cited by 2 Related articles All 2 versions

2020

[PDF] arxiv.org

Predictive density estimation under the Wasserstein loss

T Matsuda, WE Strawderman - Journal of Statistical Planning and Inference, 2020 - Elsevier

We investigate predictive density estimation under the L 2 Wasserstein loss for location

families and location-scale families. We show that plug-in densities form a complete class

and that the Bayesian predictive density is given by the plug-in density with the posterior …

  Cited by 1 Related articles All 4 versions

<——2020———2020———1100— 


Wasserstein Metric And Large-Time Asymptotics Of Nonlinear ...

www.researchgate.net › publication › 268014629_Wasser...

Sep 11, 2020 — We review various recent applications of Wasserstein-type metrics to both nonlinear partial ... in nonlinear diffusion equations of porous medium type. ... is known as the Kantorovich-Wasserstein distance of F and G ... Pattern formation, stability of equilibria and dependence on the main mechanisms ...

2020

[PDF] sabanciuniv.edu

Cyclic Adversarial Framework with Implicit Autoencoder and Wasserstein Loss (CAFIAWL)

E Bonabi Mobaraki - 2020 - research.sabanciuniv.edu

Since the day that the Simple Perceptron was invented, Artificial Neural Networks (ANNs)

attracted many researchers. Technological improvements in computers and the internet

paved the way for unseen computational power and an immense amount of data that …

2020  [PDF] arxiv.org

Partial Gromov-Wasserstein Learning for Partial Graph Matching

W Liu, C Zhang, J Xie, Z Shen, H Qian… - arXiv preprint arXiv …, 2020 - arxiv.org

Graph matching finds the correspondence of nodes across two graphs and is a basic task in

graph-based machine learning. Numerous existing methods match every node in one graph

to one node in the other graph whereas two graphs usually overlap partially in …

 Cited by 2 Related articles All 3 versions 

2020  [PDF] arxiv.org

Wasserstein Learning of Determinantal Point Processes

L Anquetil, M Gartrell, A Rakotomamonjy… - arXiv preprint arXiv …, 2020 - arxiv.org

Determinantal point processes (DPPs) have received significant attention as an elegant

probabilistic model for discrete subset selection. Most prior work on DPP learning focuses

on maximum likelihood estimation (MLE). While efficient and scalable, MLE approaches do …

  All 4 versions 


2020  [PDF] thecvf.com

Severity-aware semantic segmentation with reinforced wasserstein training

X Liu, W Ji, J You, GE Fakhri… - Proceedings of the IEEE …, 2020 - openaccess.thecvf.com

Semantic segmentation is a class of methods to classify each pixel in an image into

semantic classes, which is critical for autonomous vehicles and surgery systems. Cross-

entropy (CE) loss-based deep neural networks (DNN) achieved great success wrt the …

Cited by 17 Related articles All 7 versions 

Severity-Aware Semantic Segmentation With Reinforced

Wasserstein Training

To sidestep this, in this work, we propose to incorporate the severity-aware inter-class correlation into our Wasserstein training framework ...

YouTube · ComputerVisionFoundation Videos · 

Jul 18, 2020


2020

Importance-Aware Semantic Segmentation in Self-Driving with ...

arxiv.org › cs

Oct 21, 2020 — Semantic segmentation (SS) is an important perception manner for self-driving cars and robotics, which classifies each pixel into a pre-determined class. ... In our extensive experiments, Wasserstein loss demonstrates superior segmentation performance on the predefined critical classes for safe driving.

by X Liu · ‎2020 · ‎Cited by 6 · ‎Related articles

[PDF] aaai.org

[CITATION] Importance-Aware Semantic Segmentation in Self-Driving with Discrete Wasserstein Training.

X Liu, Y Han, S Bai, Y Ge, T Wang, X Han, S Li, J You… - AAAI, 2020

 Cited by 24 Related articles All 8 versions


2020  [PDF] arxiv.org

Reinforced wasserstein training for severity-aware semantic segmentation in autonomous driving

X Liu, Y Zhang, X Liu, S Bai, S Li, J You - arXiv preprint arXiv:2008.04751, 2020 - arxiv.org

Semantic segmentation is important for many real-world systems, eg, autonomous vehicles,

which predict the class of each pixel. Recently, deep networks achieved significant progress

wrt the mean Intersection-over Union (mIoU) with the cross-entropy loss. However, the cross …

Cited by 4 Related articles All 5 versions 

 

12. On some nonlinear evolution systems which are ...

www.researchgate.net › publication › 318646857_12_On...

Sep 27, 2020 — On some nonlinear evolution systems which are perturbations of Wasserstein gradient flows: In the Applied Sciences | This chapter presents existence and ... In book: Topological Optimization and Optimal Transport. Authors:.

2020

[PDF] Computational Hardness and Fast Algorithm for Fixed-Support Wasserstein Barycenter

T Lin, N Ho, X Chen, M Cuturi, MI Jordan - 2020 - researchgate.net

We study in this paper the fixed-support Wasserstein barycenter problem (FS-WBP), which

consists in computing the Wasserstein barycenter of m discrete probability measures

supported on a finite metric space of size n. We show first that the constraint matrix arising …

  Cited by 3 Related articles All 2 versions 


2020  [PDF] arxiv.org

Unsupervised Multilingual Alignment using Wasserstein Barycenter

X Lian, K Jain, J Truszkowski, P Poupart… - arXiv preprint arXiv …, 2020 - arxiv.org

We study unsupervised multilingual alignment, the problem of finding word-to-word

translations between multiple languages without using any parallel data. One popular

strategy is to reduce multilingual alignment to the much simplified bilingual setting, by …

 Cited by 3 Related articles All 11 versions 

<——2020————2020———1110——


[PDF] googleapis.com

Wasserstein barycenter model ensembling

Y Mroueh, PL Dognin, I Melnyk, J Ross… - US Patent App. 16 …, 2020 - Google Patents

A method, system and apparatus of ensembling, including inputting a set of models that

predict different sets of attributes, determining a source set of attributes and a target set of

attributes using a barycenter with an optimal transport metric, and determining a consensus …

  All 2 versions 


 2020  [PDF] arxiv.org

Improved complexity bounds in wasserstein barycenter problem

D Dvinskikh, D Tiapkin - arXiv preprint arXiv:2010.04677, 2020 - arxiv.org

In this paper, we focus on computational aspects of Wasserstein barycenter problem. We

provide two algorithms to compute Wasserstein barycenter of $ m $ discrete measures of

size $ n $ with accuracy $\varepsilon $. The first algorithm, based on mirror prox with some …

  Cited by 2 All 2 versions 


 

SA vs SAA for population Wasserstein barycenter calculation

D Dvinskikh - arXiv preprint arXiv:2001.07697, 2020 - arxiv.org

In Machine Learning and Optimization community there are two main approaches for convex

risk minimization problem. The first approach is Stochastic Averaging (SA)(online) and the

second one is Stochastic Average Approximation (SAA)(Monte Carlo, Empirical Risk …

  Cited by 3 Related articles All 2 versions 


[PDF] arxiv.org

Scalable computations of wasserstein barycenter via input convex neural networks

J Fan, A Taghvaei, Y Chen - arXiv preprint arXiv:2007.04462, 2020 - arxiv.org

Wasserstein Barycenter is a principled approach to represent the weighted mean of a given

set of probability distributions, utilizing the geometry induced by optimal transport. In this

work, we present a novel scalable algorithm to approximate the Wasserstein Barycenters …

Cited by 13 Related articles All 7 versions 


[PDF] arxiv.org

Randomised Wasserstein Barycenter Computation: Resampling with Statistical Guarantees

F Heinemann, A Munk, Y Zemel - arXiv preprint arXiv:2012.06397, 2020 - arxiv.org

We propose a hybrid resampling method to approximate finitely supported Wasserstein

barycenters on large-scale datasets, which can be combined with any exact solver.

Nonasymptotic bounds on the expected error of the objective value as well as the …

  All 2 versions 


Fast algorithms for computational optimal transport and wasserstein barycenter

W Guo, N Ho, M Jordan - International Conference on …, 2020 - proceedings.mlr.press

We provide theoretical complexity analysis for new algorithms to compute the optimal

transport (OT) distance between two discrete probability distributions, and demonstrate their

favorable practical performance compared to state-of-art primal-dual algorithms. First, we …

  Cited by 2 Related articles All 2 versions 
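Note: many of the computational entries collected here (the Cuturi-style solvers, the fixed-support barycenter papers, the primal-dual algorithms) build on entropic regularization of discrete optimal transport. As a point of reference, the following is a minimal, hedged sketch of the standard Sinkhorn iteration in plain NumPy; the function name, toy histograms, and the regularization value are illustrative and are not taken from any one cited paper.

import numpy as np

def sinkhorn(a, b, C, reg=0.05, n_iters=1000):
    """a, b: marginal probability vectors; C: cost matrix; returns (plan, cost)."""
    K = np.exp(-C / reg)               # Gibbs kernel of the cost
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iters):
        u = a / (K @ v)                # scale rows to match marginal a
        v = b / (K.T @ u)              # scale columns to match marginal b
    P = u[:, None] * K * v[None, :]    # transport plan with (approximate) marginals a, b
    return P, float(np.sum(P * C))     # plan and the associated transport cost

if __name__ == "__main__":
    x = np.linspace(0.0, 1.0, 50)
    a = np.full(50, 1.0 / 50)                          # uniform source histogram
    b = np.exp(-(x - 0.7) ** 2 / 0.02); b /= b.sum()   # peaked target histogram
    C = (x[:, None] - x[None, :]) ** 2                 # squared-distance cost on the grid
    P, cost = sinkhorn(a, b, C)
    print(round(cost, 4))

In practice a log-domain variant or an established OT library is preferable for small regularization; the loop above is only the basic primitive that the fast-algorithm papers listed here start from.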


 2020

[PDF] arxiv.org

Revisiting Fixed Support Wasserstein Barycenter: Computational Hardness and Efficient Algorithms

T Lin, N Ho, X Chen, M Cuturi, MI Jordan - arXiv preprint arXiv:2002.04783, 2020 - arxiv.org

We study the fixed-support Wasserstein barycenter problem (FS-WBP), which consists in

computing the Wasserstein barycenter of $ m $ discrete probability measures supported on

a finite metric space of size $ n $. We show first that the constraint matrix arising from the …

  Cited by 1 Related articles All 3 versions 

2020

[PDF] arxiv.org

Risk Measures Estimation Under Wasserstein Barycenter

MA Arias-Serna, JM Loubes… - arXiv preprint arXiv …, 2020 - arxiv.org

Randomness in financial markets requires modern and robust multivariate models of risk

measures. This paper proposes a new approach for modeling multivariate risk measures

under Wasserstein barycenters of probability measures supported on location-scatter …

  All 5 versions 


2020 book

[PDF] oapen.org

[BOOK] An invitation to statistics in Wasserstein space

VM Panaretos, Y Zemel - 2020 - library.oapen.org

This open access book presents the key aspects of statistics in Wasserstein spaces, ie

statistics in the space of probability measures when endowed with the geometry of optimal

transportation. Further to reviewing state-of-the-art aspects, it also provides an accessible …

  Cited by 27 Related articles All 7 versions 

 


<——2020———-—2020———1120—


2020

A Riemannian submersion‐based approach to the Wasserstein barycenter of positive definite matrices

M Li, H Sun, D Li - Mathematical Methods in the Applied …, 2020 - Wiley Online Library

In this paper, we introduce a novel geometrization on the space of positive definite matrices,

derived from the Riemannian submersion from the general linear group to the space of

positive definite matrices, resulting in easier computation of its geometric structure. The …

  Related articles


   2020

Fixed-Support Wasserstein Barycenter: Computational Hardness and Efficient Algorithms

T Lin, N Ho, X Chen, M Cuturi, MI Jordan - 2020 - research.google

We study in this paper the finite-support Wasserstein barycenter problem (FS-WBP), which

consists in computing the Wasserstein barycenter of $ m $ discrete probability measures

supported on a finite metric space of size $ n $. We show first that the constraint matrix …

2020

[PDF] arxiv.org

Fair regression with wasserstein barycenters

E Chzhen, C Denis, M Hebiri, L Oneto… - arXiv preprint arXiv …, 2020 - arxiv.org

We study the problem of learning a real-valued function that satisfies the Demographic

Parity constraint. It demands the distribution of the predicted output to be independent of the

sensitive attribute. We consider the case that the sensitive attribute is available for …

  Cited by 3 All 4 versions 


2020

[PDF] arxiv.org

Continuous regularized wasserstein barycenters

L Li, A Genevay, M Yurochkin, J Solomon - arXiv preprint arXiv …, 2020 - arxiv.org

Wasserstein barycenters provide a geometrically meaningful way to aggregate probability

distributions, built on the theory of optimal transport. They are difficult to compute in practice,

however, leading previous work to restrict their supports to finite sets of points. Leveraging a …

Cited by 15 Related articles All 7 versions 

[PDF] researchgate.net

On the computation of Wasserstein barycenters

G Puccetti, L Rüschendorf, S Vanduffel - Journal of Multivariate Analysis, 2020 - Elsevier

The Wasserstein barycenter is an important notion in the analysis of high dimensional data

with a broad range of applications in applied probability, economics, statistics, and in

particular to clustering and image processing. In this paper, we state a general version of the …

  Cited by 7 Related articles All 7 versions
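For intuition alongside these barycenter entries: in one dimension the 2-Wasserstein barycenter has a closed form obtained by averaging quantile functions. The sketch below (NumPy only, illustrative names) is a minimal instance of that textbook special case; it is not the algorithm of any particular paper listed here, where the hard setting is the multi-dimensional, fixed-support problem.

import numpy as np

def wasserstein2_barycenter_1d(samples, weights=None, grid_size=200):
    """samples: list of 1-D arrays; returns the barycenter's quantile values on a grid."""
    k = len(samples)
    weights = np.full(k, 1.0 / k) if weights is None else np.asarray(weights, dtype=float)
    # Evaluate each empirical quantile function on a common grid in (0, 1).
    u = (np.arange(grid_size) + 0.5) / grid_size
    quantiles = np.stack([np.quantile(s, u) for s in samples])   # shape (k, grid_size)
    # The W2 barycenter's quantile function is the weighted average of the inputs'.
    return weights @ quantiles

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    bary_q = wasserstein2_barycenter_1d(
        [rng.normal(-2.0, 1.0, 1000), rng.normal(3.0, 0.5, 1000)]
    )
    print(bary_q[:3], bary_q[-3:])   # first and last quantile values of the barycenter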

2020

 

[PDF] arxiv.org

node2coords: Graph representation learning with wasserstein barycenters

E Simou, D Thanou, P Frossard - arXiv preprint arXiv:2007.16056, 2020 - arxiv.org

In order to perform network analysis tasks, representations that capture the most relevant

information in the graph structure are needed. However, existing methods do not learn

representations that can be interpreted in a straightforward way and that are robust to …

Cited by 4 Related articles All 6 versions


2020 [PDF] arxiv.org

Stochastic saddle-point optimization for wasserstein barycenters

D Tiapkin, A Gasnikov, P Dvurechensky - arXiv preprint arXiv:2006.06763, 2020 - arxiv.org

We study the computation of non-regularized Wasserstein barycenters of probability

measures supported on the finite set. The first result gives a stochastic optimization

algorithm for the discrete distribution over the probability measures which is comparable …

Cited by 4 Related articles All 4 versions 

2020  [PDF] arxiv.org

Distributed Optimization with Quantization for Computing Wasserstein Barycenters

R Krawtschenko, CA Uribe, A Gasnikov… - arXiv preprint arXiv …, 2020 - arxiv.org

We study the problem of the decentralized computation of entropy-regularized semi-discrete

Wasserstein barycenters over a network. Building upon recent primal-dual approaches, we

propose a sampling gradient quantization scheme that allows efficient communication and …

Cited by 3 Related articles All 3 versions 

 2020

Primal heuristics for wasserstein barycenters

PY Bouchet, S Gualandi, LM Rousseau - International Conference on …, 2020 - Springer

This paper presents primal heuristics for the computation of Wasserstein Barycenters of a

given set of discrete probability measures. The computation of a Wasserstein Barycenter is

formulated as an optimization problem over the space of discrete probability measures. In …

  Cited by 1

2020  [PDF] researchgate.net

[PDF] Fixed-Support Wasserstein Barycenters: Computational Hardness and Fast Algorithm

T Lin, N Ho, X Chen, M Cuturi… - Advances in Neural …, 2020 - researchgate.net

We study the fixed-support Wasserstein barycenter problem (FS-WBP), which consists in

computing the Wasserstein barycenter of m discrete probability measures supported on a

finite metric space of size n. We show first that the constraint matrix arising from the standard …

 Cited by 26 Related articles All 9 versions 

<——2020——2020———1130—— 


 

[PDF] arxiv.org

Averaging atmospheric gas concentration data using wasserstein barycenters

M Barré, C Giron, M Mazzolini… - arXiv preprint arXiv …, 2020 - arxiv.org

Hyperspectral satellite images report greenhouse gas concentrations worldwide on a daily

basis. While taking simple averages of these images over time produces a rough estimate of

relative emission rates, atmospheric transport means that simple averages fail to pinpoint …

 Cited by 3 Related articles All 6 versions 


2020  [PDF] thecvf.com

Barycenters of Natural Images Constrained Wasserstein Barycenters for Image Morphing

D Simon, A Aberdam - … of the IEEE/CVF Conference on …, 2020 - openaccess.thecvf.com

Image interpolation, or image morphing, refers to a visual transition between two (or more)

input images. For such a transition to look visually appealing, its desirable properties are (i)

to be smooth;(ii) to apply the minimal required change in the image; and (iii) to seem" real" …

  Cited by 2 Related articles All 4 versions 


2020  [PDF] arxiv.org

Learning Graphons via Structured Gromov-Wasserstein Barycenters

H Xu, D Luo, L Carin, H Zha - arXiv preprint arXiv:2012.05644, 2020 - arxiv.org

We propose a novel and principled method to learn a nonparametric graph model called

graphon, which is defined in an infinite-dimensional space and represents arbitrary-size

graphs. Based on the weak regularity lemma from the theory of graphons, we leverage a …

Cited by 5 Related articles All 6 versions 

2020  [PDF] arxiv.org

Variational Wasserstein Barycenters for Geometric Clustering

L Mi, T Yu, J Bento, W Zhang, B Li, Y Wang - arXiv preprint arXiv …, 2020 - arxiv.org

We propose to compute Wasserstein barycenters (WBs) by solving for Monge maps with

variational principle. We discuss the metric properties of WBs and explore their connections,

especially the connections of Monge WBs, to K-means clustering and co-clustering. We also …

  Cited by 2 Related articles All 2 versions 


 

[PDF] arxiv.org

High-precision Wasserstein barycenters in polynomial time

JM Altschuler, E Boix-Adsera - arXiv preprint arXiv:2006.08012, 2020 - arxiv.org

Computing Wasserstein barycenters is a fundamental geometric problem with widespread

applications in machine learning, statistics, and computer graphics. However, it is unknown

whether Wasserstein barycenters can be computed in polynomial time, either exactly or to …


 

[PDF] mit.edu

Wasserstein barycenters: statistics and optimization

AJ Stromme - 2020 - dspace.mit.edu

We study a geometric notion of average, the barycenter, over 2-Wasserstein space. We

significantly advance the state of the art by introducing extendible geodesics, a simple

synthetic geometric condition which implies non-asymptotic convergence of the empirical …

  Related articles 


2020  [PDF] researchgate.net

[PDF] Wasserstein Barycenters for Bayesian Learning: Technical Report

G Rios - 2020 - researchgate.net

Within probabilistic modelling, a crucial but challenging task is that of learning (or fitting) the

models. For models described by a finite set of parameters, this task is reduced to finding the

best parameters, to feed them into the model and then calculate the posterior distribution to …

  Related articles 


2020  [PDF] ams.org

Nonpositive curvature, the variance functional, and the Wasserstein barycenter

YH KimB Pass - Proceedings of the American Mathematical Society, 2020 - ams.org

We show that a Riemannian manifold $ M $ has nonpositive sectional curvature and is

simply connected if and only if the variance functional on the space $ P (M) $ of probability

measures over $ M $ is displacement convex. We then establish convexity over Wasserstein …

Cited by 3 Related articles All 3 versions

2020  [PDF] arxiv.org

An LP-based, strongly-polynomial 2-approximation algorithm for sparse Wasserstein barycenters

S Borgwardt - Operational Research, 2020 - Springer

Discrete Wasserstein barycenters correspond to optimal solutions of transportation problems

for a set of probability measures with finite support. Discrete barycenters are measures with

finite support themselves and exhibit two favorable properties: there always exists one with a …

  Cited by 2 Related articles All 2 versions


2020

Distributed Wasserstein Barycenters via Displacement Interpolation

P Cisneros-Velarde, F Bullo - arXiv preprint arXiv:2012.08610, 2020 - arxiv.org

Consider a multi-agent system whereby each agent has an initial probability measure. In this

paper, we propose a distributed algorithm based upon stochastic, asynchronous and

pairwise exchange of information and displacement interpolation in the Wasserstein space …

  All 2 versions 

<——2020———-—2020———1140—— 


year 2020

[CITATION] Gromov-Wasserstein Coupling Matrix / SubEmbedding Robust Wasserstein Coupling Matrix

hal.archives-ouvertes.fr › file › GW_SERW_map

PDF

[Figure residue only: the linked PDF page shows two heatmap panels titled "Gromov-Wasserstein Coupling Matrix" and "SubEmbedding Robust Wasserstein Coupling Matrix"; no abstract text was extracted.]


year 2020

Two approaches for population Wasserstein barycenter problem: Stochastic Averaging versus Sample Average Approximation

D Dvinskikh, A Gasnikov - nnov.hse.ru

Abstract In Machine Learning and Optimization community there are two main approaches

for convex risk minimization problem: Stochastic Averaging (SA) and Sample Average

Approximation (SAA). At the moment, it is known that both approaches are on average …

2020  [PDF] researchgate.net

Non-Gaussian BLE-Based Indoor Localization Via Gaussian Sum Filtering Coupled with Wasserstein Distance

P Malekzadeh, S Mehryar, P Spachos… - ICASSP 2020-2020 …, 2020 - ieeexplore.ieee.org

… identified in Eq. (6) and update the overall point estimate $x_{k|k}$ and its error covariance $P_{k|k}$ by collapsing the $m_c$ filtered components into one single Gaussian via the Wasserstein-Based Clustering GMR algorithm. In other words …

Related articles All 2 versions 

Conference ProceedingFull Text Online
Cited by 2
Related articles All 3 versions


Network Intrusion Detection Based on Conditional ...

ieeexplore.ieee.org › iel7

by G Zhang · 2020 — ... Detection Based on Conditional Wasserstein Generative Adversarial ... systems (IDS) …

[PDF] ieee.org

[CITATION] Network Intrusion Detection Based on Conditional Wasserstein Generative Adversarial Network and Cost-Sensitive Stacked Autoencoder

G Zhang, X Wang, R Li, Y Song, J He, J Lai - IEEE Access, 2020 - ieeexplore.ieee.org

In the field of intrusion detection, there is often a problem of data imbalance, and more and more unknown types of attacks make detection difficult. To resolve above issues, this article proposes a network intrusion detection model called CWGAN-CSSAE, which combines …

  Related articles

2020  [PDF] arxiv.org

Online Stochastic Optimization with Wasserstein Based Non-stationarity

J Jiang, X Li, J Zhang - arXiv preprint arXiv:2012.06961, 2020 - arxiv.org

We consider a general online stochastic optimization problem with multiple budget constraints over a horizon of finite time periods. At each time period, a reward function and multiple cost functions, where each cost function is involved in the consumption of one …



Synthetic Data Generation Using Wasserstein Conditional Gans With Gradient Penalty (WCGANS-GP)

M Singh Walia - 2020 - arrow.tudublin.ie

… Dissertations. Title … Disciplines. Computer Sciences. Publication Details. A dissertation submitted in partial fulfilment of the requirements of Technological University Dublin for the degree of M.Sc. in Computer Science (Data Analytics) September 2020. Abstract …
Related articles
All 2 versions


Lagrangian schemes for Wasserstein gradient flows

JA Carrillo, D Matthes, MT Wolfram - arXiv preprint arXiv:2003.03803, 2020 - arxiv.org

… 1. Introduction. In most general terms, $L^2$-Wasserstein gradient flows are evolution equations for a time-dependent probability density $\rho(\cdot): [0,T] \times \Omega \to \mathbb{R}_{\ge 0}$ on a domain $\Omega \subset \mathbb{R}^d$ that can be written as follows … arXiv:2003.03803v1 [math.NA] 8 Mar 2020 …


2020

[HTML] Fréchet Means in the Wasserstein Space 

VM Panaretos, Y Zemel - International Workshop on Functional and …, 2020 - Springer

The concept of a Fréchet mean (Fréchet [55]) generalises the notion of mean to a more general metric space by replacing the usual “sum of squares” with a “sum of squared distances”, giving rise to the so-called Fréchet functional. A closely related notion is that of a Karcher mean (Karcher …

  Related articles


2020

Horo-functions associated to atom sequences on the Wasserstein space

G Zhu, H Wu, X Cui - Archiv der Mathematik, 2020 - Springer

On the Wasserstein space over a complete, separable, non-compact, locally compact length space, we consider the horo-functions associated to sequences of atomic measures. We show the existence of co-rays for any prescribed initial probability measure with respect to a …


2020


MR4111670 Prelim Delon, Julie; Desolneux, Agnès A Wasserstein-type distance in the space of Gaussian mixture models. SIAM J. Imaging Sci. 13 (2020), no. 2, 936–970. 49Q22 (65K05 65K10 68Q25 68R10 68U05 68U10 90C05) 

Review PDF Clipboard Journal Article 

A Wasserstein-Type Distance in the Space of Gaussian ... - SIAM

https://epubs.siam.org › doi › abs

by J Delon - ‎2020 - ‎Cited by 5 - ‎Related articles

Multiscale Modeling & Simulation. SIAM Journal on Applied Algebra and Geometry. SIAM Journal on Applied Dynamical Systems. SIAM Journal on Applied ...

[PDF] arxiv.org

A wasserstein-type distance in the space of gaussian mixture models

J Delon, A Desolneux - SIAM Journal on Imaging Sciences, 2020 - SIAM

… We write $\mathcal P(\mathbb R^d)$ for the set of probability measures on $\mathbb R^d$. For $p \ge 1$, the Wasserstein space $\mathcal P_p(\mathbb R^d)$ is defined as the set of probability measures $\mu$ with a finite moment of order $p$, i.e., such that $\int_{\mathbb R^d} \|x\|^p \, d\mu(x) < +\infty$, with $\|\cdot\|$ the Euclidean norm on $\mathbb R^d$ …

 Cited by 52 Related articles All 9 versions

A Wasserstein-Type Distance in the Space of Gaussian Mixture Models 

By: Delon, Julie; Desolneux, Agnes 

SIAM JOURNAL ON IMAGING SCIENCES  Volume: ‏ 13   Issue: ‏ 2   Pages: ‏ 936-970   Published: ‏ 2020

<——2020—————2020———1150—— 


2020

[HTML] springer.com

[HTML] The Wasserstein Space

VM Panaretos, Y Zemel - International Workshop on Functional and …, 2020 - Springer

The Kantorovich problem described in the previous chapter gives rise to a metric structure, the Wasserstein distance, in the space of probability measures $P(\mathcal X)$ on a space $\mathcal X$. The resulting metric space, a subspace of $P(\mathcal X)$, is …

  Related articles


2020

[PDF] arxiv.org

Fast and Smooth Interpolation on Wasserstein Space

S Chewi, J Clancy, TL Gouic, P Rigollet… - arXiv preprint arXiv …, 2020 - arxiv.org

We propose a new method for smoothly interpolating probability measures using the geometry of optimal transport. To that end, we reduce this problem to the classical Euclidean setting, allowing us to directly leverage the extensive toolbox of spline interpolation. Unlike …

  All 2 versions 


[PDF] sciencedirect.com

A novel kernel Wasserstein distance on Gaussian measures: An application of identifying dental artifacts in head and neck computed tomography

JH Oh, M Pouryahya, A Iyer, AP Apte, JO Deasy… - Computers in biology …, 2020 - Elsevier

The Wasserstein distance is a powerful metric based on the theory of optimal mass

transport. It gives a natural measure of the distance between two distributions with a wide

range of applications. In contrast to a number of the common divergences on distributions …

  Cited by 3 Related articles All 5 versions


2020  see 2019   [PDF] mlr.press

Quantitative stability of optimal transport maps and linearization of the 2-wasserstein space

Q Mérigot, A Delalande… - … Conference on Artificial …, 2020 - proceedings.mlr.press

This work studies an explicit embedding of the set of probability measures into a Hilbert space, defined using optimal transport maps from a reference probability density. This embedding linearizes to some extent the 2-Wasserstein space and is shown to be bi-Hölder …

  Cited by 24 Related articles All 6 versions

2020  [PDF] sns.it

Optimal control of multiagent systems in the Wasserstein space

C Jimenez, A Marigonda, M Quincampoix - Calculus of Variations and …, 2020 - Springer

This paper concerns a class of optimal control problems, where a central planner aims to control a multi-agent system in $\mathbb{R}^d$ in order to minimize a certain cost of Bolza type. At every time and for each agent, the set of admissible velocities, describing his/her …

  Cited by 4 Related articles All 3 versions


2020

[PDF] arxiv.org

Ensemble Riemannian Data Assimilation over the Wasserstein Space

SK Tamang, A Ebtehaj, PJ Van Leeuwen, D Zou… - arXiv preprint arXiv …, 2020 - arxiv.org

In this paper, we present a new ensemble data assimilation paradigm over a Riemannian manifold equipped with the Wasserstein metric. Unlike Eulerian penalization of error in the Euclidean space, the Wasserstein metric can capture translation and shape difference …

  All 4 versions 

 

2020  [PDF] esaim-proc.org

Statistical data analysis in the Wasserstein space

J Bigot - ESAIM: Proceedings and Surveys, 2020 - esaim-proc.org

This paper is concerned by statistical inference problems from a data set whose elements may be modeled as random probability measures such as multiple histograms or point clouds. We propose to review recent contributions in statistics on the use of Wasserstein …

  Cited by 3 Related articles All 3 versions


2020

[PDF] oapen.org

[BOOK] An Invitation to Statistics in Wasserstein Space

VM Panaretos, Y Zemel - 2020 - library.oapen.org

This open access book presents the key aspects of statistics in Wasserstein spaces, ie statistics in the space of probability measures when endowed with the geometry of optimal transportation. Further to reviewing state-of-the-art aspects, it also provides an accessible …

  Cited by 16 Related articles All 7 versions 


2020

[PDF] arxiv.org

Automatic coding of students' writing via Contrastive Representation Learning in the Wasserstein space

R Jiang, J Gouvea, D Hammer, S Aeron - arXiv preprint arXiv:2011.13384, 2020 - arxiv.org

Qualitative analysis of verbal data is of central importance in the learning sciences. It is labor-intensive and time-consuming, however, which limits the amount of data researchers can include in studies. This work is a step towards building a statistical machine learning (ML) …

  All 2 versions 


2020

[PDF] future-in-tech.net

[PDF] Wasserstein Riemannian geometry of Gamma densities

C Ogouyandjou, N Wadagni - Computer Science, 2020 - ijmcs.future-in-tech.net

Abstract A Wasserstein Riemannian Gamma manifold is a space of Gamma probability density functions endowed with the Riemannian Otto metric, which is related to the Wasserstein distance. In this paper, we study some geometric properties of such Riemannian …

<——2020—————2020———1160——   


arXiv:2012.14310  [pdfpsother math.PR math.ST
Unajusted Langevin algorithm with multiplicative noise: Total variation and Wasserstein bounds
Authors: Gilles Pages, Fabien Panloup
Abstract: In this paper, we focus on non-asymptotic bounds related to the Euler scheme of an ergodic diffusion with a possibly multiplicative diffusion term (non-constant diffusion coefficient). More precisely, the objective of this paper is to control the distance of the standard Euler scheme with decreasing step (usually called Unajusted Langevin Algorithm in the Monte-Carlo literature) to the invariant d…  More
Submitted 28 December, 2020; originally announced December 2020.

Related articles All 2 versions 

2020  [PDF] arxiv.org

On Linear Optimization over Wasserstein Balls

MC Yue, D Kuhn, W Wiesemann - arXiv preprint arXiv:2004.07162, 2020 - arxiv.org

Wasserstein balls, which contain all probability measures within a pre-specified Wasserstein

distance to a reference measure, have recently enjoyed wide popularity in the

distributionally robust optimization and machine learning communities to formulate and …

  Cited by 3 Related articles All 5 versions 


2020  [PDF] arxiv.org

Self-improvement of the Bakry-Emery criterion for Poincaré inequalities and Wasserstein contraction using variable curvature bounds

P Cattiaux, M Fathi, A Guillin - arXiv preprint arXiv:2002.09221, 2020 - arxiv.org

We study Poincaré inequalities and long-time behavior for diffusion processes on $\mathbb{R}^n$

under a variable curvature lower bound, in the sense of Bakry-Emery. We derive various

estimates on the rate of convergence to equilibrium in $L^1$ optimal transport distance, as …

  Related articles All 33 versions 

Cited by 1 Related articles All 15 versions 

2020

[PDF] arxiv.org

Portfolio Optimisation within a Wasserstein Ball

SM Pesenti, S Jaimungal - Available at SSRN, 2020 - papers.ssrn.com

We consider the problem of active portfolio management where a loss-averse and/or gain-

seeking investor aims to outperform a benchmark strategy's risk profile while not deviating

too much from it. Specifically, an investor considers alternative strategies that co-move with …

  All 6 versions


2020  PDF] aimsciences.org

A convergent Lagrangian discretization for p-Wasserstein and flux-limited diffusion equations

B Söllner, O Junge - Communications on Pure & Applied Analysis, 2020 - aimsciences.org

… pp. 4227–4256 A CONVERGENT LAGRANGIAN DISCRETIZATION FOR p-WASSERSTEIN AND FLUX-LIMITED DIFFUSION EQUATIONS … $\sup_{r\in\mathbb{R}}(sr - c(r))$. We consider a family of functions for c, that can be called "p-Wasserstein-like" cost functions …

  All 3 versions 


2020


2020

[PDF] arxiv.org

Infinite-dimensional regularization of McKean-Vlasov equation with a Wasserstein diffusion

V Marx - arXiv preprint arXiv:2002.10157, 2020 - arxiv.org

Much effort has been spent in recent years on restoring uniqueness of McKean-Vlasov

SDEs with non-smooth coefficients. As a typical instance, the velocity field is assumed to be

bounded and measurable in its space variable and Lipschitz-continuous with respect to the …

  Cited by 1 Related articles All 20 versions 


2020

[PDF] arxiv.org

A Bismut-Elworthy inequality for a Wasserstein diffusion on the circle

V Marx - arXiv preprint arXiv:2005.04972, 2020 - arxiv.org

We investigate in this paper a regularization property of a diffusion on the Wasserstein

space $\mathcal {P} _2 (\mathbb {T}) $ of the one-dimensional torus. The control obtained

on the gradient of the semi-group is very much in the spirit of Bismut-Elworthy-Li integration …

  Related articles All 21 versions 


2020  [PDF] projecteuclid.org

Donsker's theorem in Wasserstein-1 distance

L Coutin, L Decreusefond - Electronic Communications in …, 2020 - projecteuclid.org

We compute the Wasserstein-1 (or Kantorovitch-Rubinstein) distance between a random walk in $\mathbf{R}^d$ and the Brownian motion. The proof is based on a new estimate of the modulus of continuity of the solution of the Stein's equation. As an application, we can …

   Cited by 3 Related articles All 31 versions


2020 
online Cover Image  PEER-REVIEW

Deep joint two-stream Wasserstein auto-encoder and selective attention alignment for unsupervised...

by Chen, Zhihong; Chen, Chao; Jin, Xinyu ; More...

Neural computing & applications, 06/2020, Volume 32, Issue 11

Domain adaptation refers to the process of utilizing the labeled source domain data to learn a model that can perform well in the target domain with limited or...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

 Deep joint two-stream Wasserstein auto-encoder and ...

https://www.researchgate.net › ... › Alignment

Oct 19, 2020 — Request PDF | Deep joint two-stream Wasserstein auto-encoder and selective attention alignment for unsupervised domain adaptation .

New Neural Computation Study Results from Zhejiang University Described 

(Deep Joint Two-stream Wasserstein Auto-encoder and Selective Attention Alignment for Unsupervised Domain Adaptation)

Robotics & Machine Learning, 06/2020

NewsletterFull Text Online
Cited by 14
Related articles All 4 versions


2020 [PDF] arxiv.org

Convergence of Recursive Stochastic Algorithms using Wasserstein Divergence

A Gupta, WB Haskell - arXiv preprint arXiv:2003.11403, 2020 - arxiv.org

This paper develops a unified framework, based on iterated random operator theory, to

analyze the convergence of constant stepsize recursive stochastic algorithms (RSAs) in

machine learning and reinforcement learning. RSAs use randomization to efficiently …

  Related articles All 2 versions 

<——2020—————2020———1170—


year 2020

[PDF] A particle model for Wasserstein type diffusion

V Konarovskyi, M von Renesse - math.uni-leipzig.de

… 2-Wasserstein distance on $P_2(\mathbb{R}^d)$. There is known a "singular" Γ such that the DK equation has a solution $\mu_t$ on $P_2([0,1])$, called the Wasserstein diffusion, that is a Markov process and satisfies the Varadhan formula $P\{\mu_t = \nu\} \approx e^{-d_W^2(\mu_0,\nu)/(2t)}$, $t \ll 1$ …

  Related articles 


 

[PDF] arxiv.org

Projection robust Wasserstein distance and Riemannian optimization

T Lin, C Fan, N Ho, M Cuturi, MI Jordan - arXiv preprint arXiv:2006.07458, 2020 - arxiv.org

Projection robust Wasserstein (PRW) distance, or Wasserstein projection pursuit (WPP), is a

robust variant of the Wasserstein distance. Recent work suggests that this quantity is more

robust than the standard Wasserstein distance, in particular when comparing probability …
 Cited by 19 Related articles All 8 versions 


Augmented Sliced Wasserstein Distances

X Chen, Y Yang, Y Li - arXiv preprint arXiv:2006.08812, 2020 - arxiv.org

While theoretically appealing, the application of the Wasserstein distance to large-scale

machine learning problems has been hampered by its prohibitive computational cost. The

sliced Wasserstein distance and its variants improve the computational efficiency through …

  All 2 versions 
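A note for the sliced-Wasserstein entries here and below: the basic estimator behind all of these variants projects both samples onto random directions and averages the resulting one-dimensional distances, which are cheap because 1-D optimal transport reduces to sorting. A minimal NumPy sketch follows; function and variable names are illustrative, it assumes equal sample sizes, and it is the plain estimator rather than the augmented or projection-robust variants of the papers above.

import numpy as np

def sliced_wasserstein2(X, Y, n_projections=100, rng=None):
    """Monte Carlo estimate of SW_2 between samples X, Y of shape (n, d), equal n."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    assert Y.shape == X.shape, "this sketch assumes equal sample sizes"
    # Draw random directions uniformly on the unit sphere in R^d.
    theta = rng.normal(size=(n_projections, d))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)
    # Project both clouds onto each direction (shape: n_projections x n).
    X_proj = theta @ X.T
    Y_proj = theta @ Y.T
    # In 1-D the optimal coupling sorts both samples, so W_2^2 is the mean
    # squared difference of the order statistics.
    X_proj.sort(axis=1)
    Y_proj.sort(axis=1)
    w2_sq_per_direction = np.mean((X_proj - Y_proj) ** 2, axis=1)
    # Average over directions, then take the square root.
    return float(np.sqrt(np.mean(w2_sq_per_direction)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(0.0, 1.0, size=(500, 3))
    Y = rng.normal(1.0, 1.0, size=(500, 3))
    print(sliced_wasserstein2(X, Y, n_projections=200, rng=1))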


2020

Adversarial sliced Wasserstein domain adaptation networks

Y Zhang, N Wang, S Cai - Image and Vision Computing, 2020 - Elsevier

Abstract Domain adaptation has become a resounding success in learning a domain

agnostic model that performs well on target dataset by leveraging source dataset which has

related data distribution. Most of existing works aim at learning domain-invariant features …

  All 2 versions

2020

[PDF] arxiv.org

Approximate Bayesian computation with the sliced-Wasserstein distance

K Nadjahi, V De Bortoli, A Durmus… - ICASSP 2020-2020 …, 2020 - ieeexplore.ieee.org

Approximate Bayesian Computation (ABC) is a popular method for approximate inference in

generative models with intractable but easy-to-sample likelihood. It constructs an

approximate posterior distribution by finding parameters for which the simulated data are …

  Cited by 2 Related articles All 7 versions


2020

[PDF] arxiv.org

Intrinsic Sliced Wasserstein Distances for Comparing Collections of Probability Distributions on Manifolds and Graphs

RM Rustamov, S Majumdar - arXiv preprint arXiv:2010.15285, 2020 - arxiv.org

Collections of probability distributions arise in a variety of statistical applications ranging

from user activity pattern analysis to brain connectomics. In practice these distributions are

represented by histograms over diverse domain types including finite intervals, circles …

  All 2 versions 

2020 [PDF] arxiv.org

Encoded Prior Sliced Wasserstein AutoEncoder for learning latent manifold representations

S Krishnagopal, J Bedrossian - arXiv preprint arXiv:2010.01037, 2020 - arxiv.org

While variational autoencoders have been successful generative models for a variety of

tasks, the use of conventional Gaussian or Gaussian mixture priors are limited in their ability

to capture topological or geometric properties of data in the latent representation. In this …

  All 2 versions 

  Cited by 2 All 6 versions 

2020 [PDF] arxiv.org

Gromov–Hausdorff limit of Wasserstein spaces on point clouds

NG Trillos - Calculus of Variations and Partial Differential …, 2020 - Springer

We consider a point cloud $X_n := \{\mathbf{x}_1,\ldots,\mathbf{x}_n\}$ uniformly distributed on the flat torus $\mathbb{T}^d := \mathbb{R}^d/\mathbb{Z}^d$, and construct a geometric graph on the cloud by connecting points that are within distance $\varepsilon$ of …

  Cited by 12 Related articles All 3 versions


2020 [PDF] arxiv.org

The Equivalence of Fourier-based and Wasserstein Metrics on Imaging Problems

G Auricchio, A Codegoni, S Gualandi… - arXiv preprint arXiv …, 2020 - arxiv.org

We investigate properties of some extensions of a class of Fourier-based probability metrics,

originally introduced to study convergence to equilibrium for the solution to the spatially

homogeneous Boltzmann equation. At difference with the original one, the new Fourier …

  Related articles All 4 versions 


2020

Posterior asymptotics in Wasserstein metrics on the real line

M Chae, P De Blasi, SG Walker - arXiv preprint arXiv:2003.05599, 2020 - arxiv.org

In this paper, we use the class of Wasserstein metrics to study asymptotic properties of

posterior distributions. Our first goal is to provide sufficient conditions for posterior

consistency. In addition to the well-known Schwartz's Kullback--Leibler condition on the …

  Related articles All 2 versions 
<——2020——2020———1180——

year 2020

[PDF] On the equivalence between Fourier-based and Wasserstein metrics

G Auricchio, A Codegoni, S Gualandi, G Toscani… - mate.unipv.it

We investigate properties of some extensions of a class of Fourierbased probability metrics,

originally introduced to study convergence to equilibrium for the solution to the spatially

homogeneous Boltzmann equation. At difference with the original one, the new Fourier …

  

2020  online Cover Image

De Novo Protein Design for Novel Folds Using Guided Conditional Wasserstein Generative...

by Karimi, Mostafa; Zhu, Shaowen; Cao, Yue ; More...

Journal of chemical information and modeling, 12/2020, Volume 60, Issue 12

Although massive data is quickly accumulating on protein sequence and structure, there is a small and limited number of protein architectural types (or...

Article PDF Download PDF BrowZine PDF Icon

Journal ArticleFull Text Online

De novo protein design for novel folds using guided conditional Wasserstein generative adversarial networks

M Karimi, S Zhu, Y Cao, Y Shen - Journal of Chemical Information …, 2020 - ACS Publications

Although massive data is quickly accumulating on protein sequence and structure, there is a

small and limited number of protein architectural types (or structural folds). This study is

addressing the following question: how well could one reveal underlying sequence …

  Cited by 4 Related articles All 5 versions

 

2020  online

Wasserstein GANs for MR Imaging: from Paired to Unpaired Training

by Lei, Ke; Mardani, Morteza; Pauly, John M ; More...

IEEE transactions on medical imaging, 09/2020, Volume 40, Issue 1

Lack of ground-truth MR images impedes the common supervised training of neural networks for image reconstruction. To cope with this challenge, this paper...

Article PDF Download PDF BrowZine PDF Icon

Journal ArticleFull Text Online
Cited by 28
Related articles All 10 versions

2020

Necessary Condition for Rectifiability Involving Wasserstein Distance W-2

By: Dabrowski, Damian

INTERNATIONAL MATHEMATICS RESEARCH NOTICES  Volume: ‏ 2020   Issue: ‏ 22   Pages: ‏ 8936-8972   Published: ‏ NOV 2020


2020

Person-independent facial expression recognition method based on improved Wasserstein generative adversarial networks in combination with identity aware

C Xu, Y Cui, Y Zhang, P Gao, J Xu - Multimedia Systems, 2020 - Springer

Since the distinction between two expressions is fairly vague, usually a subtle change in one

part of the human face is enough to change a facial expression. Most of the existing facial

expression recognition algorithms are not robust enough because they rely on general facial …

   Cited by 8 Related articles All 2 versions


2020


2020  [PDF] projecteuclid.org

Existence of probability measure valued jump-diffusions in generalized Wasserstein spaces

M Larsson, S Svaluto-Ferro - Electronic Journal of Probability, 2020 - projecteuclid.org

We study existence of probability measure valued jump-diffusions described by martingale

problems. We develop a simple device that allows us to embed Wasserstein spaces and

other similar spaces of probability measures into locally compact spaces where classical …

  Cited by 2 Related articles All 2 versions


2020



2020

Chinese font translation with improved Wasserstein generative adversarial network

Y Miao, H Jia, K Tang, Y Ji - Twelfth International Conference …, 2020 - spiedigitallibrary.org

Nowadays, various fonts are applied in many fields, and the generation of multiple fonts by

computer plays an important role in the inheritance, development and innovation of Chinese

culture. Aiming at the existing font generation methods, which have some problems 

such as …

 Related articles All 3 versions


2020

Generating Hyperspectral Data Based on 3D CNN and Improved Wasserstein Generative Adversarial Network Using Homemade High-resolution Datasets

Y Li, D Huang - Proceedings of the International Conference on …, 2020 - dl.acm.org

Hyperspectral images contain rich information on the fingerprints of materials and are being

popularly used in the exploration of oil and gas, environmental monitoring, and remote

sensing. Since hyperspectral images cover a wide range of wavelengths with high …

 Related articles


Data Augmentation Method for Fault Diagnosis of Mechanical Equipment Based on Improved Wasserstein GAN

W Liu, L Duan, Y Tang, J Yang - 2020 11th International …, 2020 - ieeexplore.ieee.org

Most of the time the mechanical equipment is in normal operation state, which results in high

imbalance between fault data and normal data. In addition, traditional signal processing

methods rely heavily on expert experience, making it difficult for classification or prediction …

  Related articles
<——2020—————2020———1190——   


2020 see 2019  [PDF] iop.org

Speech Dereverberation Based on Improved Wasserstein Generative Adversarial Networks

L Rao, J Yang - Journal of Physics: Conference Series, 2020 - iopscience.iop.org

In reality, the sound we hear is not only disturbed by noise, but also the reverberant, whose

effects are rarely taken into account. Recently, deep learning has shown great advantages

in speech signal processing. But among the existing dereverberation approaches, very few …

Cited by 2 Related articles All 2 versions



2020

Метрика Вассерштейна - Wasserstein metric - qaz.wiki

ru.qaz.wiki › wiki › Wasserstein_met...

Dec 21, 2020 — В математике , то Вассерстины расстояние или метрика Канторович-Рубинштейн является функцией расстояния , определенной ...

[Russian (machine-translated Wikipedia article): "In mathematics, the Wasserstein distance or Kantorovich–Rubinstein metric is a distance function defined ..." Sections: Definition · Intuition and connections · Examples · Properties]


 

2020

Обращение полного волнового поля с использованием метрики Вассерштейна

АА Василенко - МНСК-2020, 2020 - elibrary.ru

Обратная динамическая задача сейсмики заключается в определении параметров

упругой среды по зарегистрированным в ходе полевых работ данным. Данная задача

сводится к минимизации целевого функционала, измеряющего отклонение …

[Russian: Full waveform inversion using the Wasserstein metric. "The inverse dynamic problem of seismics consists in determining the parameters of an elastic medium from data recorded during field surveys; this problem reduces to minimizing an objective functional that measures the misfit …"]

2020
Расстояние Канторовича-Рубинштейна-Вассерштейна между аттрактором и репеллером

АО Казаков, АС Пиковский, ВГ Чигарев - Математическое …, 2020 - elibrary.ru

Мы рассматриваем несколько примеров динамических систем, демонстрирующих

пересечение аттрактора и репеллера. Эти системы строятся с помощью добавления

контролируемой диссипации в базовые модели с хаотической динамикой …

  All 2 versions

[Russian: The Kantorovich–Rubinstein–Wasserstein distance between an attractor and a repeller]

2020

2020

[PDF] arxiv.org

Bismut-Elworthy inequality for a Wasserstein diffusion on the circle

V Marx - arXiv preprint arXiv:2005.04972, 2020 - arxiv.org

We investigate in this paper a regularization property of a diffusion on the Wasserstein

space $\mathcal {P} _2 (\mathbb {T}) $ of the one-dimensional torus. The control obtained

on the gradient of the semi-group is very much in the spirit of Bismut-Elworthy-Li integration …

  Related articles All 21 versions 


 Workshop | Schedules | Computing Wasserstein ... - MSRI
Computing Wasserstein barycenters using gradient descent algorithms

 May 04, 2020 (02:00 PM PDT - 03:00 PM PDT)

Speaker: Philippe Rigollet (Massachusetts Institute of Technology)


Workshop | Schedules | A Deeper Understanding of ... - MSRI

A Deeper Understanding of the Quadratic Wasserstein Metric in Inverse Data Matching

 May 04, 2020 - May 08, 2020

May 05, 2020 (11:00 AM PDT - 12:00 PM PDT)

Speaker: Yunan Yang (New York University, Courant Institute)

A Deeper Understanding Of The Quadratic Wasserstein Metric ...

www.msri.org › Workshop › Schedules

We show that the quadratic Wasserstein metric has a "smoothing" effect on the inversion process, making ...

May 5, 2020

Data augmentation-based conditional Wasserstein generative adversarial network-gradient penalty for XSS attack detection system

By: Mokbal, Fawaz Mahiuob Mohammed; Wang, Dan; Wang, Xiaoxi; et al.

PEERJ COMPUTER SCIENCE    Article Number: e328   Published: ‏ DEC 14 2020


[CITATION] Data Augmentation Method for Power Transformer Fault Diagnosis Based on Conditional Wasserstein Generative Adversarial Network [J]

Y Liu, Z Xu, J He, Q Wang, SG Gao, J Zhao - Power System Technology,  

 Cited by 58 Related articles All 2 versions

Wasserstein upper bounds of the total variation for smooth densities

M Chae, SG Walker - Statistics & Probability Letters, 2020 - Elsevier

The total variation distance between probability measures cannot be bounded by the

Wasserstein metric in general. If we consider sufficiently smooth probability densities,

however, it is possible to bound the total variation by a power of the Wasserstein distance …

  Cited by 3 Related articles All 5 versions
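For orientation, the two quantities compared in the Chae–Walker entry can be evaluated numerically; a minimal SciPy sketch (the Gaussian pair, sample sizes, and integration limits are illustrative choices, not taken from the paper):

import numpy as np
from scipy.integrate import quad
from scipy.stats import norm, wasserstein_distance

# Two smooth densities: a standard normal and a mean-shifted normal.
p, q = norm(0.0, 1.0), norm(0.5, 1.0)

# Total variation distance: (1/2) * integral of |p - q|.
tv = 0.5 * quad(lambda t: abs(p.pdf(t) - q.pdf(t)), -10, 10)[0]

# 1-Wasserstein distance estimated from samples (for a pure location
# shift of 0.5 the true value is 0.5).
w1 = wasserstein_distance(p.rvs(size=50_000, random_state=0),
                          q.rvs(size=50_000, random_state=1))
print(tv, w1)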

<——2020———2020————1200—


Necessary Condition for Rectifiability Involving Wasserstein Distance W-2

By: Dabrowski, Damian

INTERNATIONAL MATHEMATICS RESEARCH NOTICES  Volume: ‏ 2020   Issue: ‏ 22   Pages: ‏ 8936-8972   Published: ‏ NOV 2020

[PDF] arxiv.org

Necessary Condition for Rectifiability Involving Wasserstein Distance W2

D Dąbrowski - International Mathematics Research Notices, 2020 - academic.oup.com

A Radon measure $\mu$ is $n$-rectifiable if it is absolutely continuous with respect to the $n$-dimensional

Hausdorff measure and $\mu$-almost all of $\operatorname{supp}\mu$ can be covered by Lipschitz images of $\mathbb{R}^n$. In this paper,

we give a necessary condition for rectifiability in terms of the so-called $\alpha_2$ numbers …

Cited by 10 Related articles All 7 versions

 2020

 Wasserstein Distance to Independence Models

0 citations*

2020 ARXIV: OPTIMIZATION AND CONTROL


 High-Confidence Attack Detection via Wasserstein-Metric Computations

2 citations*

2021 IEEE CONTROL SYSTEMS LETTERS

Dan Li, Sonia Martinez

University of California, San Diego

Wasserstein metric

Fault detection and isolation

This letter considers a sensor attack and fault detection problem for linear cyber-physical systems, which are subject to system noise that can obey an unknown light-tailed distribution. We propose a new threshold-based detection mechanism that employs the Wasserstein metric, and which guarantees sy...

 2020

 Statistical learning in Wasserstein space

0 citations*

2020 ARXIV: OPTIMIZATION AND CONTROL


 Differential inclusions in Wasserstein spaces: The Cauchy-Lipschitz framework

1 citations*

2021 JOURNAL OF DIFFERENTIAL EQUATIONS

Benoît Bonnet, Hélène Frankowska

University of Paris. Keywords: Lipschitz continuity, Differential inclusion

In this article, we propose a general framework for the study of differential inclusions in the Wasserstein space of probability measures. Based on earlier geometric insights on the structure of continuity equations, we define solutions of differential inclusions as absolutely continuous ...

[PDF] arxiv.org

Differential inclusions in Wasserstein spaces: The Cauchy-Lipschitz framework

B BonnetH Frankowska - Journal of Differential Equations, 2020 - Elsevier

In this article, we propose a general framework for the study of differential inclusions in the

Wasserstein space of probability measures. Based on earlier geometric insights on the

structure of continuity equations, we define solutions of differential inclusions as absolutely …

Cited by 6 Related articles All 7 versions 




2020  [PDF] ams.org

Full View

Isometric study of Wasserstein spaces–the real line

G GehérT TitkosD Virosztek - Transactions of the American Mathematical …, 2020 - ams.org

Recently Kloeckner described the structure of the isometry group of the quadratic

Wasserstein space $\mathcal {W} _2 (\mathbb {R}^ n) $. It turned out that the case of the real

line is exceptional in the sense that there exists an exotic isometry flow. Following this line of …

  Cited by 2 Related articles All 5 versions

[PDF] researchgate.net

[PDF] ADDENDUM TO "ISOMETRIC STUDY OF WASSERSTEIN SPACES–THE REAL LINE"

G PÁL GEHÉR, T TITKOS, D VIROSZTEK - researchgate.net

We show an example of a Polish metric space X whose quadratic Wasserstein space W2 (X)

possesses an isometry that splits mass. This gives an affirmative answer to Kloeckner's

question,[2, Question 2]. Let us denote the metric space ([0, 1],|·|), equipped with the usual …

  Related articles 

2020  [PDF] arxiv.org

Gromov–Hausdorff limit of Wasserstein spaces on point clouds

NG Trillos - Calculus of Variations and Partial Differential …, 2020 - Springer

We consider a point cloud\(X_n:=\{{\mathbf {x}} _1,\ldots,{\mathbf {x}} _n\}\) uniformly

distributed on the flat torus\({\mathbb {T}}^ d:=\mathbb {R}^ d/\mathbb {Z}^ d\), and construct

a geometric graph on the cloud by connecting points that are within distance\(\varepsilon\) of …

  Cited by 12 Related articles All 3 versions

2020  [PDF] arxiv.org

Derivative over Wasserstein spaces along curves of densities

R Buckdahn, J Li, H Liang - arXiv preprint arXiv:2010.01507, 2020 - arxiv.org

In this paper, given any random variable $\xi $ defined over a probability space

$(\Omega,\mathcal {F}, Q) $, we focus on the study of the derivative of functions of the form $

L\mapsto F_Q (L):= f\big ((LQ) _ {\xi}\big), $ defined over the convex cone of densities …

  All 2 versions 



<——2020——2020———1210——   


2020

[PDF] arxiv.org

A Rademacher-type theorem on L2-Wasserstein spaces over closed Riemannian manifolds

LD Schiavo - Journal of Functional Analysis, 2020 - Elsevier

Let P be any Borel probability measure on the L 2-Wasserstein space (P 2 (M), W 2) over a

closed Riemannian manifold M. We consider the Dirichlet form E induced by P and by the

Wasserstein gradient on P 2 (M). Under natural assumptions on P, we show that W 2 …

  Cited by 5 Related articles All 6 versions

2020

[PDF] arxiv.org

Differentiable maps between Wasserstein spaces

B Lessel, T Schick - arXiv preprint arXiv:2010.02131, 2020 - arxiv.org

A notion of differentiability is being proposed for maps between Wasserstein spaces of order

2 of smooth, connected and complete Riemannian manifolds. Due to the nature of the

tangent space construction on Wasserstein spaces, we only give a global definition of 

  All 2 versions 

[PDF] arxiv.org

Irregularity of distribution in Wasserstein distance

C Graham - Journal of Fourier Analysis and Applications, 2020 - Springer

We study the non-uniformity of probability measures on the interval and circle. On the

interval, we identify the Wasserstein-p distance with the classical\(L^ p\)-discrepancy. We

thereby derive sharp estimates in Wasserstein distances for the irregularity of distribution of …

  Cited by 3 Related articles All 3 versions
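The identification on the interval mentioned in this entry (Wasserstein-p equals the classical L^p-discrepancy) makes the p = 1 case easy to evaluate; a minimal NumPy sketch, with the grid resolution an illustrative choice:

import numpy as np

def w1_to_uniform(points, grid=10_000):
    # W1 on [0,1] between the empirical measure of `points` and the Lebesgue
    # measure, via the one-dimensional formula W1 = integral of |F_N(t) - t|.
    t = np.linspace(0.0, 1.0, grid)
    F = np.searchsorted(np.sort(points), t, side="right") / len(points)
    return np.trapz(np.abs(F - t), t)

print(w1_to_uniform(np.arange(1, 11) / 10.0 - 0.05))  # ten equispaced midpoints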

 

2020

[PDF] arxiv.org

Global sensitivity analysis and Wasserstein spaces

JC Fort, T Klein, A Lagnoux - arXiv preprint arXiv:2007.12378, 2020 - arxiv.org

Sensitivity indices are commonly used to quantity the relative inuence of any specic group of

input variables on the output of a computer code. In this paper, we focus both on computer

codes the output of which is a cumulative distribution function and on stochastic computer …

  Cited by 1 All 8 versions 


2020

Diffusions on Wasserstein Spaces

L Dello Schiavo - 2020 - bonndoc.ulb.uni-bonn.de

We construct a canonical diffusion process on the space of probability measures over a

closed Riemannian manifold, with invariant measure the Dirichlet–Ferguson measure.

Together with a brief survey of the relevant literature, we collect several tools from the theory …

  Related articles 



2020

[PDF] colostate.edu

[PDF] Vietoris–Rips metric thickenings and Wasserstein spaces

J Mirth - 2020 - math.colostate.edu

If the vertex set, X, of a simplicial complex, K, is a metric space, then K can be interpreted as

a subset of the Wasserstein space of probability measures on X. Such spaces are called

simplicial metric thickenings, and a prominent example is the Vietoris–Rips metric …

  Cited by 1 Related articles All 2 versions 

2020

Morse Theory for Wasserstein Spaces – Joshua Mirth

www.math.colostate.edu › ~mirth › jmm_talk › jmm_talk

Morse theory for Wasserstein Spaces. JMM 2020. Joshua Mirth – Colorado State University. Motivation: Persistent homology takes as input a filtered ...

[PDF] colostate.edu

[PDF] Morse Theory for Wasserstein Spaces

J Mirth - math.colostate.edu

Applied topology uses simplicial complexes to approximate a manifold based on data. This

approximation is known not to always recover the homotopy type of the manifold. In this work-

in-progress we investigate how to compute the homotopy type in such settings using …

  Related articles All 2 versions 


(PDF) A bound on the 2-Wasserstein distance between linear ...

Nov 23, 2020 — We use this bound to estimate the Wasserstein-2 distance between random variables represented by linear combinations of independent ...

by B Arras · ‎2019 · ‎Cited by 20 · ‎Related articles


Inequalities for the Wasserstein mean of positive definite ...
www.researchgate.net › publication › 323694739_Inequa...


Nov 10, 2020 — PDF | We prove majorization inequalities for different means of positive definite matrices. These include the Cartan mean (the Karcher mean), ...


Penalization of barycenters for $\varphi $-exponential ...

arxiv.org › math

[Submitted on 15 Jun 2020] ... Abstract: In this paper we study the penalization of barycenters in the Wasserstein space for \varphi-exponential distributions.

by S Kum · ‎2020

<——2020—————2020———1220——   


2020

Learning Wasserstein Distance-Based Gaussian Graphical Model for Multivariate Time Series Classification

HU Xuegang, L Jianxing, LI Peipei… - 2020 IEEE …, 2020 - ieeexplore.ieee.org

Multivariate time series classification occupies an important position in time series data

mining tasks and has been applied in many fields. However, due to the statistical coupling

between different variables of Multivariate Time Series (MTS) data, traditional classification …

  All 2 versions

2020

A Generative Model for Zero-Shot Learning via Wasserstein Auto-encoder

X Luo, Z Cai, F Wu, J Xiao-Yuan - 2020 IEEE International …, 2020 - ieeexplore.ieee.org

… We test our model on four benchmark datasets including CUB, SUN, AWA2 and aPY,

the results of which demonstrate the effectiveness of our model. Index Terms—Zero-

Shot Learning, Wasserstein Auto-encoder, Generative Model …

Cited by 1 Related articles

Approximate inference with wasserstein gradient flows

C FrognerT Poggio - International Conference on Artificial …, 2020 - proceedings.mlr.press

We present a novel approximate inference method for diffusion processes, based on the

Wasserstein gradient flow formulation of the diffusion. In this formulation, the time-dependent

density of the diffusion is derived as the limit of implicit Euler steps that follow the gradients …

  Cited by 12 Related articles All 2 versions 


[PDF] arxiv.org

Fair regression with wasserstein barycenters

E Chzhen, C Denis, M HebiriL Oneto… - arXiv preprint arXiv …, 2020 - arxiv.org

We study the problem of learning a real-valued function that satisfies the Demographic

Parity constraint. It demands the distribution of the predicted output to be independent of the

sensitive attribute. We consider the case that the sensitive attribute is available for …

  Cited by 4 All 4 versions 
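In one dimension, as in the real-valued regression setting of this entry, the W2 barycenter can be written down explicitly by averaging quantile functions; a minimal NumPy sketch (grid size and uniform weights are illustrative choices):

import numpy as np

def barycenter_1d(samples_list, weights=None, grid=1_000):
    # W2 barycenter of one-dimensional empirical distributions: its quantile
    # function is the weighted average of the input quantile functions.
    k = len(samples_list)
    w = np.full(k, 1.0 / k) if weights is None else np.asarray(weights, dtype=float)
    q = np.linspace(0.0, 1.0, grid)
    quantiles = np.stack([np.quantile(np.asarray(s, dtype=float), q) for s in samples_list])
    return w @ quantiles  # quantile values of the barycenter on the grid q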


[PDF] mlr.press

Robust Document Distance with Wasserstein-Fisher-Rao metric

Z Wang, D Zhou, M Yang, Y Zhang… - Asian Conference on …, 2020 - proceedings.mlr.press

Computing the distance among linguistic objects is an essential problem in natural

language processing. The word mover's distance (WMD) has been successfully applied to

measure the document distance by synthesizing the low-level word similarity with the …

  Cited by 1 



[PDF] arxiv.org

node2coords: Graph representation learning with wasserstein barycenters

E SimouD ThanouP Frossard - IEEE Transactions on Signal …, 2020 - ieeexplore.ieee.org

In order to perform network analysis tasks, representations that capture the most relevant

information in the graph structure are needed. However, existing methods do not learn

representations that can be interpreted in a straightforward way and that are stable to …

  Cited by 1 All 3 versions


Data-driven distributionally robust unit commitment with Wasserstein metric: Tractable formulation and efficient solution method

X ZhengH Chen - IEEE Transactions on Power Systems, 2020 - ieeexplore.ieee.org

In this letter, we propose a tractable formulation and an efficient solution method for the

Wasserstein-metric-based distributionally robust unit commitment (DRUC-dW) problem.

First, a distance-based data aggregation method is introduced to hedge against the …

  Cited by 3 All 2 versions


[PDF] arxiv.org

Online Stochastic Optimization with Wasserstein Based Non-stationarity

J Jiang, X Li, J Zhang - arXiv preprint arXiv:2012.06961, 2020 - arxiv.org

We consider a general online stochastic optimization problem with multiple budget

constraints over a horizon of finite time periods. At each time period, a reward function and

multiple cost functions, where each cost function is involved in the consumption of one …

  All 2 versions 


[PDF] arxiv.org

Robust Reinforcement Learning with Wasserstein Constraint

L Hou, L PangX HongY Lan, Z Ma, D Yin - arXiv preprint arXiv …, 2020 - arxiv.org

Robust Reinforcement Learning aims to find the optimal policy with some extent of

robustness to environmental dynamics. Existing learning algorithms usually enable the

robustness through disturbing the current state or simulating environmental parameters in a …

  Related articles All 3 versions 


[PDF] researchgate.net

Infrared and Visible Image Fusion Using Dual Discriminators Generative Adversarial Networks with Wasserstein Distance

J Li, H Huo, K Liu, C Li - Information Sciences, 2020 - Elsevier

Generative adversarial network (GAN) has shown great potential in infrared and visible

image fusion. The existing GAN-based methods establish an adversarial game between

generative image and source images to train the generator until the generative image …

  Cited by 3 Related articles All 2 versions

<——2020————2020———1230——   


Drug–drug interaction prediction with Wasserstein Adversarial Autoencoder-based knowledge graph embeddings

Y Dai, C Guo, W Guo, C Eickhoff - Briefings in Bioinformatics, 2020 - academic.oup.com

An interaction between pharmacological agents can trigger unexpected adverse events.

Capturing richer and more comprehensive information about drug–drug interactions (DDIs)

is one of the key tasks in public health and drug development. Recently, several knowledge …

  All 3 versions


[PDF] ieee.org

Distributionally Robust Optimal Reactive Power Dispatch with Wasserstein Distance in Active Distribution Network

J Liu, Y Chen, C Duan, J Lin… - Journal of Modern Power …, 2020 - ieeexplore.ieee.org

The uncertainties from renewable energy sources (RESs) will not only introduce significant

influences to active power dispatch, but also bring great challenges to the analysis of

optimal reactive power dispatch (ORPD). To address the influence of high penetration of …

  Cited by 1 Related articles All 2 versions


[PDF] arxiv.org

PLG-IN: Pluggable Geometric Consistency Loss with Wasserstein Distance in Monocular Depth Estimation

N HiroseS KoideK KawanoR Kondo - arXiv preprint arXiv:2006.02068, 2020 - arxiv.org

We propose a novel objective to penalize geometric inconsistencies, to improve the

performance of depth estimation from monocular camera images. Our objective is designed

with the Wasserstein distance between two point clouds estimated from images with different …

  Cited by 1 Related articles All 2 versions 

[PDF] ieee.org

Robust Multivehicle Tracking With Wasserstein Association Metric in Surveillance Videos

Y Zeng, X Fu, L Gao, J Zhu, H Li, Y Li - IEEE Access, 2020 - ieeexplore.ieee.org

Vehicle tracking based on surveillance videos is of great significance in the highway traffic

monitoring field. In real-world vehicle-tracking applications, partial occlusion and objects

with similarly appearing distractors pose significant challenges. For addressing the above …

  Cited by 3 Related articles


Obtaining PET/CT images from non-attenuation corrected PET images in a single PET system using Wasserstein generative adversarial networks

Z Hu, Y Li, S Zou, H Xue, Z Sang, X Liu… - Physics in Medicine …, 2020 - iopscience.iop.org

Positron emission tomography (PET) imaging plays an indispensable role in early disease

detection and postoperative patient staging diagnosis. However, PET imaging requires not

only additional computed tomography (CT) imaging to provide detailed anatomical …

  Cited by 6 Related articles All 5 versions


2020

[PDF] arxiv.org

Adversarial Classification via Distributional Robustness with Wasserstein Ambiguity

N Ho-NguyenSJ Wright - arXiv preprint arXiv:2005.13815, 2020 - arxiv.org

We study a model for adversarial classification based on distributionally robust chance

constraints. We show that under Wasserstein ambiguity, the model aims to minimize the

conditional value-at-risk of the distance to misclassification, and we explore links to previous …

  Related articles All 3 versions 


[PDF] arxiv.org

Hierarchical Gaussian Processes with Wasserstein-2 Kernels

S PopescuD SharpJ ColeB Glocker - arXiv preprint arXiv:2010.14877, 2020 - arxiv.org

We investigate the usefulness of Wasserstein-2 kernels in the context of hierarchical

Gaussian Processes. Stemming from an observation that stacking Gaussian Processes

severely diminishes the model's ability to detect outliers, which when combined with non …

  All 2 versions 


[PDF] arxiv.org

Chance-Constrained Set Covering with Wasserstein Ambiguity

H Shen, R Jiang - arXiv preprint arXiv:2010.05671, 2020 - arxiv.org

We study a generalized distributionally robust chance-constrained set covering problem

(DRC) with a Wasserstein ambiguity set, where both decisions and uncertainty are binary-

valued. We establish the NP-hardness of DRC and recast it as a two-stage stochastic …

  All 2 versions 


GraphWGAN: Graph Representation Learning with Wasserstein Generative Adversarial Networks

R Yan, H Shen, C Qi, K Cen… - 2020 IEEE International …, 2020 - ieeexplore.ieee.org

Graph representation learning aims to represent vertices as low-dimensional and real-

valued vectors to facilitate subsequent downstream tasks, ie, node classification, link

predictions. Recently, some novel graph representation learning frameworks, which try to …

  Related articles All 2 versions

[PDF] lewissoft.com

Wasserstein Distributionally Robust Motion Planning and Control with Safety Constraints Using Conditional Value-at-Risk

A HakobyanI Yang - 2020 IEEE International Conference on …, 2020 - ieeexplore.ieee.org

… 3: Quadrotor trajectories at different stages controlled by the standard SAA method and the

proposed distributionally robust method with Wasserstein ball radii θ = 1.5×10−3, 2×10−3,

3×10−3. The quadrotor controlled by the SAA method collides with the first obstacle at t = 16 …

 Cited by 1 All 2 versions

Conference Proceeding, Citation Online

Cited by 17 Related articles

<——2020————2020———1240——   


[PDF] arxiv.org

The existence of minimizers for an isoperimetric problem with Wasserstein penalty term in unbounded domains

Q Xia, B Zhou - arXiv preprint arXiv:2002.07129, 2020 - arxiv.org

In this article, we consider the (double) minimization problem $$\min\left\{P

(E;\Omega)+\lambda W_p (E, F):~ E\subseteq\Omega,~ F\subseteq\mathbb {R}^ d,~\lvert

E\cap F\rvert= 0,~\lvert E\rvert=\lvert F\rvert= 1\right\}, $$ where $ p\geqslant 1$, $\Omega …

  Related articles All 4 versions 


[PDF] arxiv.org

Randomised Wasserstein Barycenter Computation: Resampling with Statistical Guarantees

F Heinemann, A MunkY Zemel - arXiv preprint arXiv:2012.06397, 2020 - arxiv.org

Randomised Wasserstein Barycenter Computation: Resampling with Statistical Guarantees. Florian Heinemann, Axel Munk, Yoav Zemel. December 14, 2020. We propose a hybrid resampling method to …

  All 2 versions 


2020


2020 see 2019

De novo protein design for novel folds using guided conditional Wasserstein generative adversarial networks

M Karimi, S Zhu, Y CaoY Shen - Journal of Chemical Information …, 2020 - ACS Publications

Although massive data is quickly accumulating on protein sequence and structure, there is a

small and limited number of protein architectural types (or structural folds). This study is …

 Cited by 15 Related articles All 5 versions

De Novo Protein Design for Novel Folds Using Guided Conditional Wasserstein Generative Adversarial Networks

Karimi, M; Zhu, SW; (...); Shen, Y

Dec 28 2020 | JOURNAL OF CHEMICAL INFORMATION AND MODELING 60 (12) , pp.5667-5681

Although massive data is quickly accumulating on protein sequence and structure, there is a small and limited number of protein architectural types (or structural folds). This study is addressing the following question: how well could one reveal underlying sequence-structure relationships and design protein sequences for an arbitrary, potentially novel, structural fold? In response to the question, we have developed novel deep generative models, namely, semisupervised gcWGAN (guided, conditional, Wasserstein Generative Adversarial Networks). To overcome training difficulties and improve design qualities, we build our models on conditional Wasserstein GAN (WGAN) that uses Wasserstein distance in the loss function. Our major contributions include (1) constructing a low-dimensional and generalizable representation of the fold space for the conditional input, (2) developing an ultrafast sequence-to-fold predictor (or oracle) and incorporating its feedback into WGAN as a loss to guide model training, and (3) exploiting sequence data with and without paired structures to enable a semisupervised training strategy. Assessed by the oracle over 100 novel folds not in the training set, gcWGAN generates more successful designs and covers 3.5 times more target folds compared to a competing data-driven method (cVAE). Assessed by sequence- and structure-based predictors, gcWGAN designs are physically and biologically sound. Assessed by a structure predictor over representative novel folds, including one not even part of basis folds, gcWGAN designs have comparable or better fold accuracy yet much more sequence diversity and novelty than cVAE. The ultrafast data-driven model is further shown to boost the success of a principle-driven de novo method (RosettaDesign), through generating design seeds and tailoring design space. In conclusion, gcWGAN explores uncharted sequence space to design proteins by learning generalizable principles from current sequence-structure data. Data, source codes, and trained models are available at https://github.com/Shen-Lab/gcWGAN


Functional Data Clustering Analysis via the Learning of Gaussian Processes with Wasserstein Distance

T Li, J Ma - International Conference on Neural Information …, 2020 - Springer

Functional data clustering analysis becomes an urgent and challenging task in the new era

of big data. In this paper, we propose a new framework for functional data clustering

analysis, which adopts a similar structure as the k-means algorithm for the conventional …

 

Learning Wasserstein Distance-Based Gaussian Graphical Model for Multivariate Time Series Classification

HU Xuegang, L Jianxing, LI Peipei… - 2020 IEEE …, 2020 - ieeexplore.ieee.org

Multivariate time series classification occupies an important position in time series data

mining tasks and has been applied in many fields. However, due to the statistical coupling

between different variables of Multivariate Time Series (MTS) data, traditional classification

methods cannot find complex dependencies between different variables, so most existing

methods perform not well in MTS classification with many variables. Thus, in this paper, a

novel model-based classification method is proposed, called Wasserstein Distance-based …

  Related articles All 2 versions

online

Learning Wasserstein Distance-Based Gaussian Graphical Model for Multivariate Time Series...

by HU, Xuegang; LIAO, Jianxing; LI, Peipei ; More...

2020 IEEE International Conference on Knowledge Graph (ICKG), 08/2020

Multivariate time series classification occupies an important position in time series data mining tasks and has been applied in many fields. However, due to...

Conference Proceeding, Full Text Online

 Learning Wasserstein Distance-Based Gaussian Graphical Model for Multivariate Time Series Classification


2020


Wasserstein 거리 척도 기반 SRGAN 이용한 위성 영상 해상도 ...

https://www.dbpia.co.kr › articleDetail


[Korean  Satellite image resolution using SRGAN based on Wasserstein distance scale …]

online

Wasserstein 거리 척도 기반 SRGAN 이용한 위성 영상 해상도 향상

by 황지언; 유초시; 신요안

한국통신학회 학술대회논문집, 2020, Volume 2020, Issue 8

Journal Article, Full Text Online

[Korean: Enhancement of satellite image resolution using SRGAN based on the Wasserstein distance metric]


 

[PDF] unifi.it

[PDF] Pattern-Based Music Generation with Wasserstein Autoencoders and PRC Descriptions

V Borghuis, L Angioloni, L Brusci… - 29th International Joint …, 2020 - flore.unifi.it

We present a pattern-based MIDI music generation system with a generation strategy based

on Wasserstein autoencoders and a novel variant of pianoroll descriptions of patterns which

employs separate channels for note velocities and note durations and can be fed into classic …

  All 4 versions 


[PDF] kweku.me

[PDF] Measuring Bias with Wasserstein Distance

K Kwegyir-Aggrey, SM Brown - kweku.me

In fair classification, we often ask:" what does it mean to be fair, and how is fairness

measured?" Previous approaches to defining and enforcing fairness rely on a set of

statistical fairness definitions, with each definition providing its own unique measurement of …


[PDF] uniroma1.it

[PDF] Nonparametric Density Estimation with Wasserstein Distance for Actuarial Applications

EG Luini - iris.uniroma1.it

Density estimation is a central topic in statistics and a fundamental task of actuarial sciences.

In this work, we present an algorithm for approximating multivariate empirical densities with

a piecewise constant distribution defined on a hyperrectangular-shaped partition of the …

  Related articles All 2 versions 



<——2020————2020———1250——   


[PDF] arxiv.org

Fisher information regularization schemes for Wasserstein gradient flows

W LiJ LuL Wang - Journal of Computational Physics, 2020 - Elsevier

We propose a variational scheme for computing Wasserstein gradient flows. The scheme

builds upon the Jordan–Kinderlehrer–Otto framework with the Benamou-Brenier's dynamic

formulation of the quadratic Wasserstein metric and adds a regularization by the Fisher …

  Cited by 7 Related articles All 8 versions


[PDF] arxiv.org

Lagrangian schemes for Wasserstein gradient flows

JA CarrilloD MatthesMT Wolfram - arXiv preprint arXiv:2003.03803, 2020 - arxiv.org

This paper reviews different numerical methods for specific examples of Wasserstein

gradient flows: we focus on nonlinear Fokker-Planck equations, but also discuss

discretizations of the parabolic-elliptic Keller-Segel model and of the fourth order thin film …

  Cited by 3 Related articles All 3 versions 
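For orientation (this is standard background rather than anything specific to this paper), the implicit minimizing-movement (JKO) step that such Lagrangian schemes discretize reads, in LaTeX,

\[
  \rho^{k+1} \in \operatorname*{arg\,min}_{\rho} \;\frac{1}{2\tau}\, W_2^2\!\left(\rho, \rho^{k}\right) + \mathcal{E}(\rho),
\]

so that, as the time step $\tau \to 0$, the sequence $(\rho^k)$ approximates the gradient flow of the energy $\mathcal{E}$ in the quadratic Wasserstein metric.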


[PDF] arxiv.org

The back-and-forth method for wasserstein gradient flows

M Jacobs, W Lee, F Léger - arXiv preprint arXiv:2011.08151, 2020 - arxiv.org

We present a method to efficiently compute Wasserstein gradient flows. Our approach is

based on a generalization of the back-and-forth method (BFM) introduced by Jacobs and

Léger to solve optimal transport problems. We evolve the gradient flow by solving the dual …

  Cited by 1 All 2 versions 


[PDF] arxiv.org

Refining Deep Generative Models via Wasserstein Gradient Flows

AF Ansari, ML Ang, H Soh - arXiv preprint arXiv:2012.00780, 2020 - arxiv.org

Deep generative modeling has seen impressive advances in recent years, to the point

where it is now commonplace to see simulated samples (eg, images) that closely resemble

real-world data. However, generation quality is generally inconsistent for any given model …

  

Wasserstein gradient-penalty generative adversarial network with deep auto-encoder for bearing intelligent fault diagnosis

X Xiong, J Hongkai, X Li, M Niu - Measurement Science and …, 2020 - iopscience.iop.org

It is a great challenge to manipulate unbalanced fault data in the field of rolling bearings

intelligent fault diagnosis. In this paper, a novel intelligent fault diagnosis method called the

Wasserstein gradient-penalty generative adversarial network with deep auto-encoder is …

  Cited by 2 Related articles All 2 versions


2020

[PDF] arxiv.org

SVGD as a kernelized Wasserstein gradient flow of the chi-squared divergence

S ChewiTL Gouic, C Lu, T Maunu… - arXiv preprint arXiv …, 2020 - arxiv.org

Stein Variational Gradient Descent (SVGD), a popular sampling algorithm, is often described

as the kernelized gradient flow for the Kullback-Leibler divergence in the geometry of

optimal transport. We introduce a new perspective on SVGD that instead views SVGD as the …

  Cited by 2 Related articles All 2 versions 
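As background for this entry, the SVGD particle update that it reinterprets fits in a few lines; a minimal NumPy sketch, with the RBF bandwidth h and the step size as illustrative choices:

import numpy as np

def svgd_step(x, grad_log_p, step=0.1, h=1.0):
    # One SVGD update for particles x of shape (n, d);
    # grad_log_p(x) returns the (n, d) array of scores grad log p(x_j).
    n = x.shape[0]
    diff = x[:, None, :] - x[None, :, :]                 # diff[i, j] = x_i - x_j
    K = np.exp(-np.sum(diff ** 2, axis=-1) / (2.0 * h))  # RBF kernel matrix
    scores = grad_log_p(x)
    # phi_i = (1/n) * sum_j [ k(x_j, x_i) * score_j + grad_{x_j} k(x_j, x_i) ]
    phi = (K @ scores + (diff * K[..., None]).sum(axis=1) / h) / n
    return x + step * phi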


[PDF] archives-ouvertes.fr

Learning with minibatch Wasserstein: asymptotic and gradient properties

K Fatras, Y Zine, R Flamary… - the 23nd …, 2020 - hal.archives-ouvertes.fr

HAL Id: hal-02502329, https://hal.archives-ouvertes.fr/hal-02502329 (submitted on 9 Mar 2020).

  Cited by 5 Related articles All 77 versions 
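For context on the minibatch estimators studied in this entry, the exact OT cost between two equal-size minibatches reduces to an assignment problem; a minimal SciPy sketch (batch construction is left to the caller, and the squared-Euclidean cost is an illustrative choice):

import numpy as np
from scipy.optimize import linear_sum_assignment

def minibatch_ot_cost(X, Y):
    # Exact OT cost between the uniform empirical measures of two
    # equal-size minibatches X, Y of shape (n, d), squared-Euclidean cost.
    C = np.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    rows, cols = linear_sum_assignment(C)
    return C[rows, cols].mean()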


[PDF] arxiv.org

TPFA Finite Volume Approximation of Wasserstein Gradient Flows

A NataleG Todeschi - International Conference on Finite Volumes for …, 2020 - Springer

Numerous infinite dimensional dynamical systems arising in different fields have been

shown to exhibit a gradient flow structure in the Wasserstein space. We construct Two Point

Flux Approximation Finite Volume schemes discretizing such problems which preserve the …

  Cited by 1 Related articles All 8 versions


[PDF] sjtu.edu.cn

[PDF] Kalman-Wasserstein Gradient Flows

F Hoffmann - 2020 - ins.sjtu.edu.cn

Parameter calibration and uncertainty in complex computer models. Ensemble Kalman

Inversion (for optimization). Ensemble Kalman Sampling (for sampling). Kalman-Wasserstein

gradient flow structure … Minimize $E : \Omega \to \mathbb{R}$, where $\Omega \subseteq \mathbb{R}^N$ … Dynamical …

  Related articles All 4 versions 


[PDF] researchgate.net

[PDF] Potential Analysis of Wasserstein GAN as an Anomaly Detection Method for Industrial Images

A Misik - researchgate.net

The task of detecting anomalies in images is a crucial part of current industrial optical

monitoring systems. In recent years, neural networks have proven to be an efficient method

for this problem, especially autoencoders and generative adversarial networks (GAN). A …

  

<——2020——2020———1260—— 



[CITATION] Wasserstein gradient flow formulation of the time-fractional Fokker-Planck equation

B Jin, MH Duong - Communications in Mathematical Sciences, 2020 - discovery.ucl.ac.uk

… Wasserstein gradient flow formulation of the time-fractional Fokker-Planck equation. Jin, B; Duong,

MH; (2020) Wasserstein gradient flow formulation of the time-fractional Fokker-Planck equation.

Communications in Mathematical Sciences (In press). …

Limit Distribution Theory for Smooth Wasserstein Distance with Applications to Generative Modeling

Z GoldfeldK Kato - arXiv preprint arXiv:2002.01012, 2020 - arxiv.org

The 1-Wasserstein distance ($\mathsf {W} _1 $) is a popular proximity measure between

probability distributions. Its metric structure, robustness to support mismatch, and rich

geometric structure fueled its wide adoption for machine learning tasks. Such tasks …

  Cited by 1 Related articles All 2 versions 


[PDF] ethz.ch

[PDF] Smooth Wasserstein Distance: Metric Structure and Statistical Efficiency

Z Goldfeld - International Zurich Seminar on Information …, 2020 - research-collection.ethz.ch

The Wasserstein distance has seen a surge of interest and applications in machine learning.

Its popularity is driven by many advantageous properties it possesses, such as metric

structure (metrization of weak convergence), robustness to support mismatch, compatibility …

  Related articles All 5 versions 


 Projection Robust Wasserstein Distance and Riemannian Optimization

4 citations* for all

1 citations*

2020 NEURAL INFORMATION PROCESSING SYSTEMS

Tianyi Lin 1,Chenyou Fan 2,Nhat Ho 1,Marco Cuturi 2,Michael I. Jordan 3

1 University of California, Berkeley ,2 Google ,3 Stanford University

Applied mathematics

Mathematics

Cited by 24 Related articles All 8 versions 

 

Adapted wasserstein distances and stability in mathematical finance

BV JulioD BartlB Mathias, E Manu - Finance and Stochastics, 2020 - Springer

Assume that an agent models a financial asset through a measure Q with the goal to

price/hedge some derivative or optimise some expected utility. Even if the model Q is

chosen in the most skilful and sophisticated way, the agent is left with the possibility that Q …

  Cited by 15 Related articles All 11 versions

[PDF] arxiv.org

Exponential contraction in Wasserstein distances for diffusion semigroups with negative curvature

FY Wang - Potential Analysis, 2020 - Springer

Let P t be the (Neumann) diffusion semigroup P t generated by a weighted Laplacian on a

complete connected Riemannian manifold M without boundary or with a convex boundary. It

is well known that the Bakry-Emery curvature is bounded below by a positive constant> 0 …

  Cited by 20 Related articles All 3 versions


[PDF] arxiv.org

Stochastic equation and exponential ergodicity in Wasserstein distances for affine processes

M Friesen, P Jin, B Rüdiger - Annals of Applied Probability, 2020 - projecteuclid.org

This work is devoted to the study of conservative affine processes on the canonical state

space $ D=\mathbb {R} _ {+}^{m}\times\mathbb {R}^{n} $, where $ m+ n> 0$. We show that

each affine process can be obtained as the pathwise unique strong solution to a stochastic …

Cited by 15 Related articles All 6 versions

[PDF] arxiv.org

Stein's method for normal approximation in Wasserstein distances with application to the multivariate Central Limit Theorem

T Bonis - Probability Theory and Related Fields, 2020 - Springer

We use Stein's method to bound the Wasserstein distance of order 2 between a

measure\(\nu\) and the Gaussian measure using a stochastic process\((X_t) _ {t\ge 0}\) such

that\(X_t\) is drawn from\(\nu\) for any\(t> 0\). If the stochastic process\((X_t) _ {t\ge 0}\) …

  Cited by 5 Related articles All 2 versions


[PDF] arxiv.org

Wasserstein Distances for Stereo Disparity Estimation

D GargY WangB HariharanM Campbell… - arXiv preprint arXiv …, 2020 - arxiv.org

Existing approaches to depth or disparity estimation output a distribution over a set of pre-

defined discrete values. This leads to inaccurate results when the true depth or disparity

does not match any of these values. The fact that this distribution is usually learned indirectly …

Cited by 15 Related articles All 6 versions 

[CITATION] Supplementary Material: Wasserstein Distances for Stereo Disparity Estimation

D Garg, Y Wang, B HariharanM Campbell

<——2020————2020———1270——   


[PDF] arxiv.org

Asymptotics of smoothed Wasserstein distances

HB ChenJ Niles-Weed - arXiv preprint arXiv:2005.00738, 2020 - arxiv.org

We investigate contraction of the Wasserstein distances on $\mathbb {R}^ d $ under

Gaussian smoothing. It is well known that the heat semigroup is exponentially contractive

with respect to the Wasserstein distances on manifolds of positive curvature; however, on flat …

  Related articles All 2 versions 


[PDF] arxiv.org

Augmented Sliced Wasserstein Distances

X Chen, Y Yang, Y Li - arXiv preprint arXiv:2006.08812, 2020 - arxiv.org

While theoretically appealing, the application of the Wasserstein distance to large-scale

machine learning problems has been hampered by its prohibitive computational cost. The

sliced Wasserstein distance and its variants improve the computational efficiency through …

  All 2 versions 
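As a baseline for the variants discussed in these entries, the plain sliced Wasserstein distance is straightforward to estimate by Monte Carlo over random projections; a minimal NumPy sketch assuming equal-size point clouds (the number of projections is an illustrative choice):

import numpy as np

def sliced_w1(X, Y, n_proj=100, seed=0):
    # Monte Carlo estimate of the sliced 1-Wasserstein distance between two
    # point clouds X, Y of shape (n, d); each 1D projection is solved by sorting.
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=X.shape[1])
        theta /= np.linalg.norm(theta)
        total += np.mean(np.abs(np.sort(X @ theta) - np.sort(Y @ theta)))
    return total / n_proj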


[PDF] arxiv.org

Intrinsic Sliced Wasserstein Distances for Comparing Collections of Probability Distributions on Manifolds and Graphs

RM RustamovS Majumdar - arXiv preprint arXiv:2010.15285, 2020 - arxiv.org

Collections of probability distributions arise in a variety of statistical applications ranging

from user activity pattern analysis to brain connectomics. In practice these distributions are

represented by histograms over diverse domain types including finite intervals, circles …

  All 2 versions 


[PDF] inria.fr

Graph Diffusion Wasserstein Distances

A Barbe, M Sebban, P Gonçalves, P Borgnat… - … on Machine Learning …, 2020 - hal.inria.fr

Optimal Transport (OT) for structured data has received much attention in the machine

learning community, especially for addressing graph classification or graph transfer learning

tasks. In this paper, we present the Diffusion Wasserstein (DW) distance, as a generalization …

  Cited by 1 Related articles All 3 versions 

[PDF] arxiv.org

Wasserstein coupled particle filter for multilevel estimation

M Ballesio, A Jasravon Schwerin… - arXiv preprint arXiv …, 2020 - arxiv.org

In this paper, we consider the filtering problem for partially observed diffusions, which are

regularly observed at discrete times. We are concerned with the case when one must resort

to time-discretization of the diffusion process if the transition density is not available in an …

  Cited by 3 Related articles All 4 versions 


[PDF] arxiv.org

When can Wasserstein GANs minimize Wasserstein Distance?

Y LiZ Dou - arXiv preprint arXiv:2003.04033, 2020 - arxiv.org

Generative Adversarial Networks (GANs) are widely used models to learn complex real-

world distributions. In GANs, the training of the generator usually stops when the

discriminator can no longer distinguish the generator's output from the set of training …

  Cited by 4 Related articles All 3 versions 


[PDF] arxiv.org

Wasserstein GANs for MR imaging: from paired to unpaired training

K Lei, M MardaniJM Pauly… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

Lack of ground-truth MR images impedes the common supervised training of neural

networks for image reconstruction. To cope with this challenge, this paper leverages

unpaired adversarial training for reconstruction networks, where the inputs are …

  Cited by 5 Related articles All 2 versions


[PDF] arxiv.org

Some Theoretical Insights into Wasserstein GANs

G BiauM SangnierU Tanielian - arXiv preprint arXiv:2006.02682, 2020 - arxiv.org

Generative Adversarial Networks (GANs) have been successful in producing outstanding

results in areas as diverse as image, video, and text generation. Building on these

successes, a large number of empirical studies have validated the benefits of the cousin …

  Cited by 5 Related articles All 5 versions 


[PDF] arxiv.org

Conditional Sig-Wasserstein GANs for Time Series Generation

H NiL SzpruchM Wiese, S Liao, B Xiao - arXiv preprint arXiv:2006.05421, 2020 - arxiv.org

Generative adversarial networks (GANs) have been extremely successful in generating

samples, from seemingly high dimensional probability measures. However, these methods

struggle to capture the temporal dependence of joint probability distributions induced by …

  Cited by 1 Related articles All 2 versions 


[PDF] arxiv.org

Statistical analysis of Wasserstein GANs with applications to time series forecasting

M Haas, S Richter - arXiv preprint arXiv:2011.03074, 2020 - arxiv.org

We provide statistical theory for conditional and unconditional Wasserstein generative

adversarial networks (WGANs) in the framework of dependent observations. We prove

upper bounds for the excess Bayes risk of the WGAN estimators with respect to a modified …

  All 3 versions 

<——2020————2020———1280——

A Super Resolution Method for Remote Sensing Images Based on Cascaded Conditional Wasserstein GANs

B Liu, H Li, Y Zhou, Y Peng, A Elazab… - 2020 IEEE 3rd …, 2020 - ieeexplore.ieee.org

High-resolution (HR) remote sensing imagery is quite beneficial for subsequent

interpretation. Obtaining HR images can be achieved by upgrading the imaging device. Yet,

the cost to perform this task is very huge. Thus, it is necessary to obtain HR images from low …

 

[PDF] ceur-ws.org

[PDF] Synthesising Tabular Data using Wasserstein Conditional GANs with Gradient Penalty (WCGAN-GP)

M Walia, B TierneyS McKeever - ceur-ws.org

… pp. 2672–2680 (2014) … Gulrajani, I., Ahmed, F., Arjovsky, M., Dumoulin, V., Courville, A.C.: Improved training of Wasserstein GANs. In: Advances in Neural Information Processing Systems, pp. 5767–5777 (2017)
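Since several of the entries above and below build on the gradient-penalty variant of WGAN cited here (Gulrajani et al.), a minimal PyTorch-style sketch of that penalty term may help; the critic interface and the (n, d) batch shape are illustrative assumptions, not taken from any particular paper:

import torch

def gradient_penalty(critic, real, fake):
    # WGAN-GP penalty: push the critic's gradient norm towards 1 on random
    # interpolates between a real batch and a fake batch of the same shape.
    real, fake = real.detach(), fake.detach()
    eps = torch.rand(real.size(0), 1, device=real.device).expand_as(real)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads = torch.autograd.grad(outputs=scores, inputs=interp,
                                grad_outputs=torch.ones_like(scores),
                                create_graph=True)[0]
    return ((grads.view(grads.size(0), -1).norm(2, dim=1) - 1.0) ** 2).mean()

In a training loop this term is typically added to the critic loss with a weight (often denoted lambda) on top of the usual Wasserstein critic objective.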

[PDF] arxiv.org

Sobolev Wasserstein GAN

M XuZ ZhouG LuJ TangW ZhangY Yu - arXiv preprint arXiv …, 2020 - arxiv.org

Wasserstein GANs (WGANs), built upon the Kantorovich-Rubinstein (KR) duality of

Wasserstein distance, is one of the most theoretically sound GAN models. However, in

practice it does not always outperform other variants of GANs. This is mostly due to the …

  All 2 versions 

 

[PDF] ucl.ac.uk

Ripple-GAN: Lane Line Detection With Ripple Lane Line Detection Network and Wasserstein GAN

Y Zhang, Z Lu, D Ma, JH Xue… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

With artificial intelligence technology being advanced by leaps and bounds, intelligent

driving has attracted a huge amount of attention recently in research and development. In

intelligent driving, lane line detection is a fundamental but challenging task particularly …

  Related articles All 2 versions


[PDF] thecvf.com

S2A: Wasserstein GAN with Spatio-Spectral Laplacian Attention for Multi-Spectral Band Synthesis

L RoutI MisraS Manthira Moorthi… - Proceedings of the …, 2020 - openaccess.thecvf.com

Intersection of adversarial learning and satellite image processing is an emerging field in

remote sensing. In this study, we intend to address synthesis of high resolution multi-spectral

satellite imagery using adversarial learning. Guided by the discovery of attention …

  Cited by 2 Related articles All 4 versions 



[PDF] arxiv.org

Conditional Wasserstein GAN-based Oversampling of Tabular Data for Imbalanced Learning

J Engelmann, S Lessmann - arXiv preprint arXiv:2008.09202, 2020 - arxiv.org

Class imbalance is a common problem in supervised learning and impedes the predictive

performance of classification models. Popular countermeasures include oversampling the

minority class. Standard methods like SMOTE rely on finding nearest neighbours and linear …

  All 3 versions 


[PDF] arxiv.org

Generating Natural Adversarial Hyperspectral examples with a modified Wasserstein GAN

JC Burnel, K FatrasN Courty - arXiv preprint arXiv:2001.09993, 2020 - arxiv.org

Adversarial examples are a hot topic due to their abilities to fool a classifier's prediction.

There are two strategies to create such examples, one uses the attacked classifier's

gradients, while the other only requires access to the clas-sifier's prediction. This is …

  Related articles All 8 versions 


[PDF] arxiv.org

Symmetric Skip Connection Wasserstein GAN for High-Resolution Facial Image Inpainting

J JamC KendrickV DrouardK Walker… - arXiv preprint arXiv …, 2020 - arxiv.org

We propose a Symmetric Skip Connection Wasserstein Generative Adversarial Network (S-

WGAN) for high-resolution facial image inpainting. The architecture is an encoder-decoder

with convolutional blocks, linked by skip connections. The encoder is a feature extractor that …

  Cited by 3 Related articles All 2 versions 


Wasserstein GAN based on Autoencoder with back-translation for cross-lingual embedding mappings

Y Zhang, Y Li, Y Zhu, X Hu - Pattern Recognition Letters, 2020 - Elsevier

Recent works about learning cross-lingual word mappings (CWMs) focus on relaxing the

requirement of bilingual signals through generative adversarial networks (GANs). GANs

based models intend to enforce source embedding space to align target embedding space …

  Related articles All 2 versions


An Improvement based on Wasserstein GAN for Alleviating Mode Collapsing

Y Chen, X Hou - 2020 International Joint Conference on Neural …, 2020 - ieeexplore.ieee.org

In the past few years, Generative Adversarial Networks as a deep generative model has

received more and more attention. Mode collapsing is one of the challenges in the study of

Generative Adversarial Networks. In order to solve this problem, we deduce a new algorithm …

All 2 versions 

<——2020——2020———1290—— 


[PDF] iop.org

A collaborative filtering recommendation framework based on Wasserstein GAN

R Li, F Qian, X Du, S Zhao… - Journal of Physics …, 2020 - iopscience.iop.org

Compared with the original GAN, Wasserstein GAN minimizes the Wasserstein Distance

between the generative distribution and the real distribution, can well capture the potential

distribution of data and has achieved excellent results in image generation. However, the …


2020

Input limited Wasserstein GAN

F Cao, H Zhao, P Liu, P Li - Second Target Recognition and …, 2020 - spiedigitallibrary.org

Generative adversarial networks (GANs) has proven hugely successful, but suffer from train

instability. The recently proposed Wasserstein GAN (WGAN) has largely overcome the

problem, but can still fail to converge in some case or be to complex. It has been found that …

  Related articles


Data Augmentation Method for Fault Diagnosis of Mechanical Equipment Based on Improved Wasserstein GAN

W Liu, L Duan, Y Tang, J Yang - 2020 11th International …, 2020 - ieeexplore.ieee.org

Most of the time the mechanical equipment is in normal operation state, which results in high

imbalance between fault data and normal data. In addition, traditional signal processing

methods rely heavily on expert experience, making it difficult for classification or prediction …



[PDF] jst.go.jp

Orthogonal Gradient Penalty for Fast Training of Wasserstein GAN Based Multi-Task Autoencoder toward Robust Speech Recognition

CY Kao, S Park, A Badi, DK Han… - IEICE TRANSACTIONS on …, 2020 - search.ieice.org

Performance in Automatic Speech Recognition (ASR) degrades dramatically in noisy

environments. To alleviate this problem, a variety of deep networks based on convolutional

neural networks and recurrent neural networks were proposed by applying L1 or L2 loss. In …

  Cited by 1 Related articles All 2 versions


Improving EEG-based motor imagery classification with conditional Wasserstein GAN

Z Li, Y Yu - 2020 International Conference on Image, Video …, 2020 - spiedigitallibrary.org

Deep learning based algorithms have made huge progress in the field of image

classification and speech recognition. There is an increasing number of researchers

beginning to use deep learning to process electroencephalographic (EEG) brain signals …

[PDF] amazonaws.com

[PDF] Bayesian Wasserstein GAN and Application for Vegetable Disease Image Data

W Cho, MH Na, S Kang, S Kim - manuscriptlink-society-file.s3 …

Various GAN models have been proposed so far and they are used in various fields.

However, despite the excellent performance of these GANs, the biggest problem is that the

model collapse occurs in the simultaneous optimization of the generator and discriminator of …

  

[PDF] fleuret.org

[PDF] Deep learning 11.2. Wasserstein GAN

F Fleuret - 2020 - fleuret.org

Deep learning 11.2. Wasserstein GAN. François Fleuret, https://fleuret.org/dlc/, Dec 20, 2020. Arjovsky et al. (2017) pointed out that D_JS (the Jensen–Shannon divergence) does not account [much] for the metric structure of the space.

  All 2 versions 

[PDF] minegrado.ovh

[CITATION] EE-559–Deep learning 11.2. Wasserstein GAN

F Fleuret - 2020

  Related articles All 2 versions


[PDF] arxiv.org

Convergence and concentration of empirical measures under wasserstein distance in unbounded functional spaces

J Lei - Bernoulli, 2020 - projecteuclid.org

We provide upper bounds of the expected Wasserstein distance between a probability

measure and its empirical version, generalizing recent results for finite dimensional

Euclidean spaces and bounded functional spaces. Such a generalization can cover …

  Cited by 39 Related articles All 5 versions


[PDF] wiley.com

Evaluating the performance of climate models based on Wasserstein distance

G VissioV LemboV Lucarini… - Geophysical Research …, 2020 - Wiley Online Library

We propose a methodology for intercomparing climate models and evaluating their

performance against benchmarks based on the use of the Wasserstein distance (WD). This

distance provides a rigorous way to measure quantitatively the difference between two …

  Cited by 2 All 13 versions


[PDF] neurips.cc

[PDF] Faster Wasserstein Distance Estimation with the Sinkhorn Divergence

L ChizatP RoussillonF Léger… - Advances in Neural …, 2020 - proceedings.neurips.cc

The squared Wasserstein distance is a natural quantity to compare probability distributions

in a non-parametric setting. This quantity is usually estimated with the plug-in estimator,

defined via a discrete optimal transport problem which can be solved to $\epsilon …

Cited by 49 Related articles All 7 versions 

[PDF] semanticscholar.org

[PDF] Faster Wasserstein Distance Estimation with the Sinkhorn Divergence

FX VialardG Peyré - pdfs.semanticscholar.org

Faster Wasserstein Distance Estimation with the Sinkhorn Divergence. Lénaïc Chizat, joint work with Pierre Roussillon, Flavien Léger, François-Xavier Vialard and Gabriel Peyré. July 8th, 2020 - Optimal Transport: Regularization and Applications.
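As background for the plug-in and Sinkhorn estimators discussed in this entry, the standard entropy-regularized (Sinkhorn) iteration fits in a few lines; a minimal NumPy sketch, with the regularization strength and iteration count as illustrative choices:

import numpy as np

def sinkhorn_cost(a, b, C, reg=0.1, n_iter=500):
    # Entropy-regularized OT between histograms a (n,) and b (m,) with
    # cost matrix C (n, m); returns the transport cost <P, C> of the
    # regularized plan P obtained by Sinkhorn scaling.
    K = np.exp(-C / reg)
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iter):
        u = a / (K @ v)
        v = b / (K.T @ u)
    P = u[:, None] * K * v[None, :]
    return float(np.sum(P * C))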

<——2020——2020———1300—— 


[HTML] mdpi.com

Fused Gromov-Wasserstein distance for structured objects

T Vayer, L Chapel, R FlamaryR TavenardN Courty - Algorithms, 2020 - mdpi.com

Optimal transport theory has recently found many applications in machine learning thanks to

its capacity to meaningfully compare various machine learning objects that are viewed as

distributions. The Kantorovitch formulation, leading to the Wasserstein distance, focuses on …

  Cited by 4 All 32 versions 


[PDF] arxiv.org

When ot meets mom: Robust estimation of wasserstein distance

G StaermanP LaforgueP Mozharovskyi… - arXiv preprint arXiv …, 2020 - arxiv.org

Issued from Optimal Transport, the Wasserstein distance has gained importance in Machine

Learning due to its appealing geometrical properties and the increasing availability of

efficient approximations. In this work, we consider the problem of estimating the Wasserstein  …

  Cited by 2 All 4 versions 


[PDF] mlr.press

A fast proximal point method for computing exact wasserstein distance

Y Xie, X Wang, R Wang, H Zha - Uncertainty in Artificial …, 2020 - proceedings.mlr.press

Wasserstein distance plays increasingly important roles in machine learning, stochastic

programming and image processing. Major efforts have been under way to address its high

computational complexity, some leading to approximate or regularized variations such as …

  Cited by 17 Related articles All 3 versions 


[PDF] arxiv.org

Projection robust Wasserstein distance and Riemannian optimization

T Lin, C Fan, N Ho, M Cuturi, MI Jordan - arXiv preprint arXiv:2006.07458, 2020 - arxiv.org

Projection robust Wasserstein (PRW) distance, or Wasserstein projection pursuit (WPP), is a

robust variant of the Wasserstein distance. Recent work suggests that this quantity is more

robust than the standard Wasserstein distance, in particular when comparing probability …

  Cited by 19 Related articles All 8 versions


[PDF] arxiv.org

When can Wasserstein GANs minimize Wasserstein Distance?

Y Li, Z Dou - arXiv preprint arXiv:2003.04033, 2020 - arxiv.org

Generative Adversarial Networks (GANs) are widely used models to learn complex real-

world distributions. In GANs, the training of the generator usually stops when the

discriminator can no longer distinguish the generator's output from the set of training …

  Cited by 4 Related articles All 3 versions 

Kantorovich–Rubinstein–Wasserstein distance between overlapping attractor and repeller

V Chigarev, A Kazakov, A Pikovsky - Chaos: An Interdisciplinary …, 2020 - aip.scitation.org

We consider several examples of dynamical systems demonstrating overlapping attractor

and repeller. These systems are constructed via introducing controllable dissipation to

prototypic models with chaotic dynamics (Anosov cat map, Chirikov standard map, and …

  Cited by 2 All 5 versions


[PDF] arxiv.org

The quantum Wasserstein distance of order 1

G De Palma, M Marvian, D Trevisan, S Lloyd - arXiv preprint arXiv …, 2020 - arxiv.org

We propose a generalization of the Wasserstein distance of order 1 to the quantum states of

$ n $ qudits. The proposal recovers the Hamming distance for the vectors of the canonical

basis, and more generally the classical Wasserstein distance for quantum states diagonal in …

  Cited by 1 All 3 versions 


Generative adversarial networks based on Wasserstein distance for knowledge graph embeddings

Y Dai, S Wang, X Chen, C Xu, W Guo - Knowledge-Based Systems, 2020 - Elsevier

Abstract Knowledge graph embedding aims to project entities and relations into low-

dimensional and continuous semantic feature spaces, which has captured more attention in

recent years. Most of the existing models roughly construct negative samples via a uniformly …

  Cited by 3 Related articles All 2 versions


[PDF] ams.org

On the Wasserstein distance between classical sequences and the Lebesgue measure

L Brown, S Steinerberger - Transactions of the American Mathematical …, 2020 - ams.org

We discuss the classical problem of measuring the regularity of distribution of sets of $ N $

points in $\mathbb {T}^ d $. A recent line of investigation is to study the cost ($= $ mass

$\times $ distance) necessary to move Dirac measures placed on these points to the uniform …

  Cited by 3 Related articles All 2 versions

<——2020——2020———1310—— 


[PDF] archives-ouvertes.fr

Study of the aggregation procedure: patch fusion and generalized Wasserstein barycenters

A Saint-Dizier - 2020 - tel.archives-ouvertes.fr

… In this thesis, we focus on photographs that could have been taken by any personal imaging 

device, that we shall call natural i

Related articles All 4 versions

Selective Multi-source Transfer Learning with Wasserstein Domain Distance for Financial Fraud Detection

Y Sun, L Lan, X Zhao, M Fan, Q Guo, C Li - … Intelligent Computing and …, 2020 - Springer

As financial enterprises have moved their services to the internet, financial fraud detection

has become an ever-growing problem causing severe economic losses for the financial

industry. Recently, machine learning has gained significant attention to handle the financial …


[PDF] bciml.cn

Wasserstein distance based deep adversarial transfer learning for intelligent fault diagnosis with unlabeled or insufficient labeled data

C Cheng, B Zhou, G Ma, D Wu, Y Yuan - Neurocomputing, 2020 - Elsevier

Intelligent fault diagnosis is one critical topic of maintenance solution for mechanical

systems. Deep learning models, such as convolutional neural networks (CNNs), have been

successfully applied to fault diagnosis tasks and achieved promising results. However, one …

  Cited by 3 All 2 versions


[PDF] arxiv.org

Limit Distribution Theory for Smooth Wasserstein Distance with Applications to Generative Modeling

Z Goldfeld, K Kato - arXiv preprint arXiv:2002.01012, 2020 - arxiv.org

The 1-Wasserstein distance ($\mathsf {W} _1 $) is a popular proximity measure between

probability distributions. Its metric structure, robustness to support mismatch, and rich

geometric structure fueled its wide adoption for machine learning tasks. Such tasks …

  Cited by 1 Related articles All 2 versions 


Wasserstein distributionally robust motion planning and control with safety constraints using conditional value-at-risk

A Hakobyan, I Yang - 2020 IEEE International Conference on …, 2020 - ieeexplore.ieee.org

In this paper, we propose an optimization-based decision-making tool for safe motion

planning and control in an environment with randomly moving obstacles. The unique feature

of the proposed method is that it limits the risk of unsafety by a pre-specified threshold even …

 Cited by 9 Related articles All 2 versions

[PDF] upenn.edu

Optimal Estimation of Wasserstein Distance on A Tree with An Application to Microbiome Studies

S Wang, TT Cai, H Li - Journal of the American Statistical …, 2020 - Taylor & Francis

The weighted UniFrac distance, a plug-in estimator of the Wasserstein distance of read

counts on a tree, has been widely used to measure the microbial community difference in

microbiome studies. Our investigation however shows that such a plug-in estimator …

  Related articles All 3 versions


[PDF] arxiv.org

Linear Optimal Transport Embedding: Provable fast Wasserstein distance computation and classification for nonlinear problems

C Moosmüller, A Cloninger - arXiv preprint arXiv:2008.09165, 2020 - arxiv.org

Discriminating between distributions is an important problem in a number of scientific fields.

This motivated the introduction of Linear Optimal Transportation (LOT), which embeds the

space of distributions into an $ L^ 2$-space. The transform is defined by computing the …

  Cited by 2 All 2 versions 
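In one dimension the idea reduces to something very simple, sketched below as intuition only (a drastic simplification of the general LOT framework): each distribution is represented by its quantile function on a fixed grid, i.e. by the Monge map from a uniform reference, and plain Euclidean distances between these embeddings recover the 2-Wasserstein distance.

    # 1D sketch of the linear optimal transport embedding idea.
    import numpy as np

    def lot_embed_1d(sample, levels):
        # quantile function of `sample`, i.e. the Monge map from Unif(0, 1)
        return np.quantile(sample, levels)

    levels = np.linspace(0.005, 0.995, 200)
    rng = np.random.default_rng(2)
    emb_a = lot_embed_1d(rng.normal(0.0, 1.0, 5000), levels)
    emb_b = lot_embed_1d(rng.normal(2.0, 1.0, 5000), levels)
    # in 1D the L2 distance between quantile functions equals W2 (here roughly 2)
    print(np.sqrt(np.mean((emb_a - emb_b) ** 2)))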


 [PDF] ieee.org

Joint transfer of model knowledge and fairness over domains using wasserstein distance

T Yoon, J Lee, W Lee - IEEE Access, 2020 - ieeexplore.ieee.org

Owing to the increasing use of machine learning in our daily lives, the problem of fairness

has recently become an important topic in machine learning societies. Recent studies

regarding fairness in machine learning have been conducted to attempt to ensure statistical …

Cited by 5 Related articles All 2 versions


[PDF] arxiv.org

Estimating processes in adapted Wasserstein distance

J Backhoff, D Bartl, M Beiglböck, J Wiesel - arXiv preprint arXiv …, 2020 - arxiv.org

A number of researchers have independently introduced topologies on the set of laws of

stochastic processes that extend the usual weak topology. Depending on the respective

scientific background this was motivated by applications and connections to various areas …

  Cited by 2 Related articles All 3 versions 

[CITATION] Estimating processes in adapted Wasserstein distance

J Backhoff-Veraguas, D Bartl, M Beiglböck, J Wiesel - arXiv preprint arXiv:2002.07261, 2020

  Cited by 2 Related articles


Domain-attention Conditional Wasserstein Distance for Multi-source Domain Adaptation

H Wu, Y Yan, MK Ng, Q Wu - ACM Transactions on Intelligent Systems …, 2020 - dl.acm.org

Multi-source domain adaptation has received considerable attention due to its effectiveness

of leveraging the knowledge from multiple related sources with different distributions to

enhance the learning performance. One of the fundamental challenges in multi-source …

  Related articles

<——2020——2020———1320—


2020 book

[HTML] mdpi.com

Probability forecast combination via entropy regularized wasserstein distance

R Cumings-Menon, M Shin - Entropy, 2020 - mdpi.com

We propose probability and density forecast combination methods that are defined using the

entropy regularized Wasserstein distance. First, we provide a theoretical characterization of

the combined density forecast based on the regularized Wasserstein distance under the …

  Cited by 1 All 15 versions 

MR4220095 Prelim Cumings-Menon, Ryan; Shin, Minchul; Probability forecast combination via entropy regularized Wasserstein distance. Entropy 22 (2020), no. 9, Paper No. 929, 18 pp. 62 (60)



[PDF] arxiv.org

Transport and Interface: an Uncertainty Principle for the Wasserstein distance

A Sagiv, S Steinerberger - SIAM Journal on Mathematical Analysis, 2020 - SIAM

Let $f:(0,1)^d \to \mathbb{R}$ be a continuous function with zero mean and interpret $f_+=\max(f,0)$ and $f_-=-\min(f,0)$ as the densities of two measures. We prove that if the cost of transport from $f_+$ to $f_-$ is small, in terms of the Wasserstein distance $W_1(f_+,f_-)$, then the Hausdorff measure of …

  Cited by 2 Related articles All 2 versions


[PDF] arxiv.org

Irregularity of distribution in Wasserstein distance

C Graham - Journal of Fourier Analysis and Applications, 2020 - Springer

We study the non-uniformity of probability measures on the interval and circle. On the

interval, we identify the Wasserstein-p distance with the classical\(L^ p\)-discrepancy. We

thereby derive sharp estimates in Wasserstein distances for the irregularity of distribution of …

  Cited by 1 Related articles All 2 versions


[PDF] arxiv.org

Entropic regularization of Wasserstein distance between infinite-dimensional Gaussian measures and Gaussian processes

MH Quang - arXiv preprint arXiv:2011.07489, 2020 - arxiv.org

This work studies the entropic regularization formulation of the 2-Wasserstein distance on an

infinite-dimensional Hilbert space, in particular for the Gaussian setting. We first present the

Minimum Mutual Information property, namely the joint measures of two Gaussian measures …

  Cited by 1 All 2 versions 


[PDF] arxiv.org

Cutoff thermalization for Ornstein-Uhlenbeck systems with small Lévy noise in the Wasserstein distance

G Barrera, MA Högele, JC Pardo - arXiv preprint arXiv:2009.10590, 2020 - arxiv.org

This article establishes cutoff thermalization (also known as the cutoff phenomenon) for a

general class of general Ornstein-Uhlenbeck systems $(X^\epsilon_t (x)) _ {t\geq 0} $ under

$\epsilon $-small additive Lévy noise with initial value $ x $. The driving noise processes …

  Cited by 1 All 3 versions 


[PDF] arxiv.org

The Unbalanced Gromov Wasserstein Distance: Conic Formulation and Relaxation

T Séjourné, FX Vialard, G Peyré - arXiv preprint arXiv:2009.04266, 2020 - arxiv.org

Comparing metric measure spaces (ie a metric space endowed with a probability

distribution) is at the heart of many machine learning problems. This includes for instance

predicting properties of molecules in quantum chemistry or generating graphs with varying …

  Cited by 1 All 2 versions 


[PDF] arxiv.org

McKean-Vlasov SDEs with Drifts Discontinuous under Wasserstein Distance

X Huang, FY Wang - arXiv preprint arXiv:2002.06877, 2020 - arxiv.org

Existence and uniqueness are proved for Mckean-Vlasov type distribution dependent SDEs

with singular drifts satisfying an integrability condition in space variable and the Lipschitz

condition in distribution variable with respect to $ W_0 $ or $ W_0+ W_\theta $ for some …

Cited by 23 Related articles All 5 versions 

[PDF] arxiv.org

Two-sample Test using Projected Wasserstein Distance: Breaking the Curse of Dimensionality

J Wang, R Gao, Y Xie - arXiv preprint arXiv:2010.11970, 2020 - arxiv.org

We develop a projected Wasserstein distance for the two-sample test, a fundamental

problem in statistics and machine learning: given two sets of samples, to determine whether

they are from the same distribution. In particular, we aim to circumvent the curse of …

  Cited by 1 All 3 versions 
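A rough sketch of the projection idea follows (my own simplified rendering, not the authors' exact statistic or calibration): project both d-dimensional samples onto random unit directions, compute one-dimensional Wasserstein distances, and calibrate the resulting statistic by permutation.

    # Projection-based two-sample statistic with permutation calibration (toy sketch).
    import numpy as np
    from scipy.stats import wasserstein_distance

    def projected_w1(X, Y, n_proj=50, rng=None):
        rng = np.random.default_rng() if rng is None else rng
        dirs = rng.standard_normal((n_proj, X.shape[1]))
        dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
        return max(wasserstein_distance(X @ u, Y @ u) for u in dirs)

    def permutation_pvalue(X, Y, n_perm=200, seed=0):
        rng = np.random.default_rng(seed)
        obs = projected_w1(X, Y, rng=rng)
        Z, n = np.vstack([X, Y]), len(X)
        hits = 0
        for _ in range(n_perm):
            idx = rng.permutation(len(Z))
            if projected_w1(Z[idx[:n]], Z[idx[n:]], rng=rng) >= obs:
                hits += 1
        return (hits + 1) / (n_perm + 1)

    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 5))
    Y = rng.standard_normal((200, 5)) + 0.5
    print(permutation_pvalue(X, Y))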


[PDF] arxiv.org

Wasserstein Distance to Independence Models

TÖ Çelik, A Jamneshan, G Montúfar… - arXiv preprint arXiv …, 2020 - arxiv.org

An independence model for discrete random variables is a Segre-Veronese variety in a

probability simplex. Any metric on the set of joint states of the random variables induces a

Wasserstein metric on the probability simplex. The unit ball of this polyhedral norm is dual to …

  Related articles All 2 versions 

Wasserstein Distance to Independence Models

TÖ Çelik, A Jamneshan, G Montúfar… - arXiv, 2020 - ui.adsabs.harvard.edu

An independence model for discrete random variables is a Segre-Veronese variety in a

probability simplex. Any metric on the set of joint states of the random variables induces a

Wasserstein metric on the probability simplex. The unit ball of this polyhedral norm is dual to …

 

[PDF] arxiv.org

Ranking IPCC Models Using the Wasserstein Distance

G Vissio, V Lembo, V Lucarini, M Ghil - arXiv preprint arXiv:2006.09304, 2020 - arxiv.org

We propose a methodology for evaluating the performance of climate models based on the

use of the Wasserstein distance. This distance provides a rigorous way to measure

quantitatively the difference between two probability distributions. The proposed approach is …

  All 5 versions 

<——2020——-—2020———1330——  


Person-independent facial expression recognition method based on improved Wasserstein generative adversarial networks in combination with identity aware

C Xu, Y Cui, Y Zhang, P Gao, J Xu - Multimedia Systems, 2020 - Springer

Since the distinction between two expressions is fairly vague, usually a subtle change in one

part of the human face is enough to change a facial expression. Most of the existing facial

expression recognition algorithms are not robust enough because they rely on general facial …

  Cited by 7 Related articles

[PDF] arxiv.org

Exponential Convergence in Entropy and Wasserstein Distance for McKean-Vlasov SDEs

P Ren, FY Wang - arXiv preprint arXiv:2010.08950, 2020 - arxiv.org

The following type of exponential convergence is proved for (non-degenerate or degenerate) McKean-Vlasov SDEs: $$W_2(\mu_t,\mu_\infty)^2 + {\rm Ent}(\mu_t|\mu_\infty) \le c\,{\rm e}^{-\lambda t}\min\big\{W_2(\mu_0,\mu_\infty)^2,\ {\rm Ent}(\mu_0|\mu_\infty)\big\}, \quad t\ge 1 …$$

  Cited by 1 All 2 versions 


[PDF] arxiv.org

Classification of atomic environments via the Gromov–Wasserstein distance

S Kawano, JK Mason - Computational Materials Science, 2020 - Elsevier

Interpreting molecular dynamics simulations usually involves automated classification of

local atomic environments to identify regions of interest. Existing approaches are generally

limited to a small number of reference structures and only include limited information about …

  Related articles All 4 versions


[PDF] arxiv.org

Precise Limit in Wasserstein Distance for Conditional Empirical Measures of Dirichlet Diffusion Processes

FY Wang - arXiv preprint arXiv:2004.07537, 2020 - arxiv.org

Let $ M $ be a $ d $-dimensional connected compact Riemannian manifold with boundary

$\partial M $, let $ V\in C^ 2 (M) $ such that $\mu (dx):= e^{V (x)} dx $ is a probability

measure, and let $ X_t $ be the diffusion process generated by $ L:=\Delta+\nabla V $ with …

  Cited by 2 Related articles All 2 versions 


 

[PDF] arxiv.org

Exponential contraction in Wasserstein distance on static and evolving manifolds

LJ Cheng, A Thalmaier, SQ Zhang - arXiv preprint arXiv:2001.06187, 2020 - arxiv.org

In this article, exponential contraction in Wasserstein distance for heat semigroups of

diffusion processes on Riemannian manifolds is established under curvature conditions

where Ricci curvature is not necessarily required to be non-negative. Compared to the …

  Cited by 3 Related articles All 7 versions

[PDF] arxiv.org

Social-wagdat: Interaction-aware trajectory prediction via wasserstein graph double-attention network

J Li, H Ma, Z Zhang, M Tomizuka - arXiv preprint arXiv:2002.06241, 2020 - arxiv.org

Effective understanding of the environment and accurate trajectory prediction of surrounding

dynamic obstacles are indispensable for intelligent mobile systems (like autonomous

vehicles and social robots) to achieve safe and high-quality planning when they navigate in …

  Cited by 20 Related articles All 3 versions 


[PDF] arxiv.org

Multivariate goodness-of-Fit tests based on Wasserstein distance

M Hallin, G Mordant, J Segers - arXiv preprint arXiv:2003.06684, 2020 - arxiv.org

Goodness-of-fit tests based on the empirical Wasserstein distance are proposed for simple

and composite null hypotheses involving general multivariate distributions. This includes the

important problem of testing for multivariate normality with unspecified mean vector and …

  Cited by 3 Related articles All 9 versions 


[PDF] arxiv.org

Gromov-Wasserstein Distance based Object Matching: Asymptotic Inference

CA Weitkamp, K Proksch, C Tameling… - arXiv preprint arXiv …, 2020 - arxiv.org

In this paper, we aim to provide a statistical theory for object matching based on the Gromov-

Wasserstein distance. To this end, we model general objects as metric measure spaces.

Based on this, we propose a simple and efficiently computable asymptotic statistical test for …

 Cited by 2 Related articles All 6 versions 
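For orientation only, the Gromov-Wasserstein distance underlying this line of work can be computed numerically with the POT library; the snippet below is a generic usage sketch (it assumes the POT package and its ot.gromov.gromov_wasserstein2 helper, whose signature may vary across versions), not the paper's asymptotic test.

    # Toy Gromov-Wasserstein computation between two small metric-measure spaces.
    # Assumes the POT library (pip install pot); not the paper's test statistic.
    import numpy as np
    import ot
    from scipy.spatial.distance import cdist

    rng = np.random.default_rng(0)
    X = rng.standard_normal((30, 2))          # point cloud 1
    Y = rng.standard_normal((40, 3))          # point cloud 2, different ambient space
    C1 = cdist(X, X)                          # intra-space distance matrices
    C2 = cdist(Y, Y)
    p = np.full(30, 1 / 30)
    q = np.full(40, 1 / 40)
    gw = ot.gromov.gromov_wasserstein2(C1, C2, p, q, loss_fun='square_loss')
    print(gw)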

Wasserstein distance based deep multi-feature adversarial transfer diagnosis approach under variable working conditions

D She, N Peng, M Jia, MG Pecht - Journal of Instrumentation, 2020 - iopscience.iop.org

Intelligent mechanical fault diagnosis is a crucial measure to ensure the safe operation of

equipment. To solve the problem that network features is not fully utilized in the adversarial

transfer learning, this paper develops a Wasserstein distance based deep multi-feature …

  Cited by 1 Related articles All 2 versions

A novel kernel Wasserstein distance on Gaussian measures: An application of identifying dental artifacts in head and neck computed tomography

JH Oh, M Pouryahya, A Iyer, AP Apte, JO Deasy… - Computers in Biology …, 2020 - Elsevier

The Wasserstein distance is a powerful metric based on the theory of optimal mass

transport. It gives a natural measure of the distance between two distributions with a wide

range of applications. In contrast to a number of the common divergences on distributions …

  Related articles All 4 versions

<——2020———2020———1340——


[HTML] nih.gov

[HTML] EEG Signal Reconstruction Using a Generative Adversarial Network With Wasserstein Distance and Temporal-Spatial-Frequency Loss

T Luo, Y Fan, L Chen, G Guo, C Zhou - Frontiers in …, 2020 - ncbi.nlm.nih.gov

Applications based on electroencephalography (EEG) signals suffer from the mutual

contradiction of high classification performance vs. low cost. The nature of this contradiction

makes EEG signal reconstruction with high sampling rates and sensitivity challenging …

  Cited by 21 Related articles All 6 versions

[PDF] arxiv.org

Online Stochastic Convex Optimization: Wasserstein Distance Variation

I Shames, F Farokhi - arXiv preprint arXiv:2006.01397, 2020 - arxiv.org

Distributionally-robust optimization is often studied for a fixed set of distributions rather than

time-varying distributions that can drift significantly over time (which is, for instance, the case

in finance and sociology due to underlying expansion of economy and evolution of …

  Related articles All 2 versions 


[PDF] arxiv.org

Wasserstein Distance Regularized Sequence Representation for Text Matching in Asymmetrical Domains

W Yu, C Xu, J Xu, L Pang, X Gao, X Wang… - arXiv preprint arXiv …, 2020 - arxiv.org

One approach to matching texts from asymmetrical domains is projecting the input

sequences into a common semantic space as feature vectors upon which the matching

function can be readily defined and learned. In real-world matching practices, it is often …

  All 3 versions 


[PDF] arxiv.org

Virtual persistence diagrams, signed measures, and Wasserstein distance

P Bubenik, A Elchesen - arXiv preprint arXiv:2012.10514, 2020 - arxiv.org

Persistence diagrams, an important summary in topological data analysis, consist of a set of

ordered pairs, each with positive multiplicity. Persistence diagrams are obtained via Mobius

inversion and may be compared using a one-parameter family of metrics called Wasserstein …

  All 2 versions 

[PDF] arxiv.org

The Wasserstein-Fourier distance for stationary time series

E Cazelles, A Robert, F Tobar - IEEE Transactions on Signal …, 2020 - ieeexplore.ieee.org

… displacement of their energy across frequencies. The WF distance operates by

calculating the Wasserstein distance between the (normalised) power spectral densities

(NPSD) of time series. Yet this rationale has been considered …

Cited by 8 Related articles All 36 versions
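The recipe described in the abstract is easy to prototype; the sketch below is my reading of the idea, with arbitrary Welch settings, comparing two signals through the 1-Wasserstein distance between their normalized power spectral densities.

    # Wasserstein distance between normalized PSDs of two time series (toy sketch).
    import numpy as np
    from scipy.signal import welch
    from scipy.stats import wasserstein_distance

    def wf_like_distance(x, y, fs=1.0):
        fx, Px = welch(x, fs=fs)
        fy, Py = welch(y, fs=fs)
        Px = Px / Px.sum()                    # normalize PSDs to probability masses
        Py = Py / Py.sum()
        return wasserstein_distance(fx, fy, u_weights=Px, v_weights=Py)

    rng = np.random.default_rng(1)
    t = np.arange(4096)
    x = np.sin(0.05 * t) + 0.1 * rng.standard_normal(t.size)
    y = np.sin(0.08 * t) + 0.1 * rng.standard_normal(t.size)
    print(wf_like_distance(x, y))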

[PDF] arxiv.org

PLG-IN: Pluggable Geometric Consistency Loss with Wasserstein Distance in Monocular Depth Estimation

N Hirose, S Koide, K Kawano, R Kondo - arXiv preprint arXiv:2006.02068, 2020 - arxiv.org

We propose a novel objective to penalize geometric inconsistencies, to improve the

performance of depth estimation from monocular camera images. Our objective is designed

with the Wasserstein distance between two point clouds estimated from images with different …

  Cited by 1 Related articles All 2 versions 

 

Distributionally Robust Optimal Reactive Power Dispatch with Wasserstein Distance in Active Distribution Network

J Liu, Y Chen, C Duan, J Lin… - Journal of Modern Power …, 2020 - ieeexplore.ieee.org

The uncertainties from renewable energy sources (RESs) will not only introduce significant

influences to active power dispatch, but also bring great challenges to the analysis of

optimal reactive power dispatch (ORPD). To address the influence of high penetration of …

  Cited by 1 Related articles All 2 versions


[PDF] nsf.gov

A Data-Driven Distributionally Robust Game Using Wasserstein Distance

G Peng, T Zhang, Q Zhu - International Conference on Decision and Game …, 2020 - Springer

This paper studies a special class of games, which enables the players to leverage the

information from a dataset to play the game. However, in an adversarial scenario, the

dataset may not be trustworthy. We propose a distributionally robust formulation to introduce …

  All 2 versions


Nonparametric Different-Feature Selection Using Wasserstein Distance

W Zheng, FY Wang, C Gou - 2020 IEEE 32nd International …, 2020 - ieeexplore.ieee.org

In this paper, we propose a feature selection method that characterizes the difference

between two kinds of probability distributions. The key idea is to view the feature selection

problem as a sparsest k-subgraph problem that considers Wasserstein distance between …

  All 2 versions


[PDF] arxiv.org

Interpretable Model Summaries Using the Wasserstein Distance

E Dunipace, L Trippa - arXiv preprint arXiv:2012.09999, 2020 - arxiv.org

In the current computing age, models can have hundreds or even thousands of parameters;

however, such large models decrease the ability to interpret and communicate individual

parameters. Reducing the dimensionality of the parameter space in the estimation phase is …

  All 2 versions 

<——2020———2020———1350—— 


[PDF] arxiv.org

Wasserstein Distance guided Adversarial Imitation Learning with Reward Shape Exploration

M Zhang, Y Wang, X Ma, L Xia, J Yang, Z Li… - arXiv preprint arXiv …, 2020 - arxiv.org

The generative adversarial imitation learning (GAIL) has provided an adversarial learning

framework for imitating expert policy from demonstrations in high-dimensional continuous

tasks. However, almost all GAIL and its extensions only design a kind of reward function of …

  Cited by 1 Related articles All 2 versions 


[PDF] researchgate.net

[PDF] Ranking IPCC Model Performance Using the Wasserstein Distance

G Vissio, V Lembo, V Lucarini… - arXiv preprint arXiv …, 2020 - researchgate.net

We propose a methodology for intercomparing climate models and evaluating their

performance against benchmarks based on the use of the Wasserstein distance (WD). This

distance provides a rigorous way to measure quantitatively the difference between two …

  

[PDF] arxiv.org

Wasserstein distance error bounds for the multivariate normal approximation of the maximum likelihood estimator

A Anastasiou, RE Gaunt - arXiv preprint arXiv:2005.05208, 2020 - arxiv.org

We obtain explicit Wasserstein distance error bounds between the distribution of the multi-

parameter MLE and the multivariate normal distribution. Our general bounds are given for

possibly high-dimensional, independent and identically distributed random vectors. Our …

  Cited by 1 Related articles All 3 versions 


A Generative Model for Zero-Shot Learning via Wasserstein Auto-encoder

X Luo, Z Cai, F Wu, J Xiao-Yuan - 2020 IEEE International …, 2020 - ieeexplore.ieee.org

Zero-shot learning aims to use the labeled instances to train the model, and then classifies

the instances that belong to a class without labeled instances. However, the training

instances and test instances are disjoint. Thus, the description of the classes (eg text …

  Related articles


A new Wasserstein distance-and cumulative sum-dependent health indicator and its application in prediction of remaining useful life of bearing

J Yin, M Xu, H Zheng, Y Yang - Journal of the Brazilian Society of …, 2020 - Springer

The safety and reliability of mechanical performance are affected by the condition (health

status) of the bearings. A health indicator (HI) with high monotonicity and robustness is a

helpful tool to simplify the predictive model and improve prediction accuracy. In this paper, a …

 Cited by 2 Related articles


[PDF] mlr.press

Robust Document Distance with Wasserstein-Fisher-Rao metric

Z Wang, D Zhou, M Yang, Y Zhang… - Asian Conference on …, 2020 - proceedings.mlr.press

… Measuring the similarity between linguistic objects plays an important role in natural lan- guage

processing. Word Mover's Distance (WMD) (Kusner et al., 2015) measures the Wasserstein

distance of documents as bag of words distributed in word embedding space …

Cited by 3 Related articles All 2 versions


[PDF] arxiv.org

Reweighting samples under covariate shift using a Wasserstein distance criterion

J Reygner, A Touboul - arXiv preprint arXiv:2010.09267, 2020 - arxiv.org

Considering two random variables with different laws to which we only have access through

finite size iid samples, we address how to reweight the first sample so that its empirical

distribution converges towards the true law of the second sample as the size of both …

  All 25 versions 


[PDF] arxiv.org

Wasserstein K-Means for Clustering Tomographic Projections

R Rao, A Moscovich, A Singer - arXiv preprint arXiv:2010.09989, 2020 - arxiv.org

Motivated by the 2D class averaging problem in single-particle cryo-electron microscopy

(cryo-EM), we present a k-means algorithm based on a rotationally-invariant Wasserstein

metric for images. Unlike existing methods that are based on Euclidean ($ L_2 $) distances …

  Cited by 1 Related articles All 5 versions 
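The general recipe (Lloyd iterations under a Wasserstein metric) is easiest to state in one dimension, where quantile functions make both the distance and the barycenter explicit; the sketch below is a generic toy version and ignores the rotational invariance that is central to the cryo-EM application.

    # Toy 1D Wasserstein k-means: W2 via quantile functions, centroid = mean quantile function.
    import numpy as np

    def wasserstein_kmeans_1d(samples, k=2, n_iter=10, n_q=100, seed=0):
        rng = np.random.default_rng(seed)
        qs = np.linspace(0.005, 0.995, n_q)
        Q = np.array([np.quantile(s, qs) for s in samples])   # one quantile function per sample
        centers = Q[rng.choice(len(Q), size=k, replace=False)]
        for _ in range(n_iter):
            # squared W2 between 1D laws = mean squared gap between quantile functions
            d = ((Q[:, None, :] - centers[None, :, :]) ** 2).mean(axis=2)
            labels = d.argmin(axis=1)
            for j in range(k):
                if np.any(labels == j):
                    # 1D W2 barycenter = average of the assigned quantile functions
                    centers[j] = Q[labels == j].mean(axis=0)
        return labels, centers

    rng = np.random.default_rng(3)
    data = [rng.normal(0, 1, 500) for _ in range(10)] + [rng.normal(4, 1, 500) for _ in range(10)]
    labels, _ = wasserstein_kmeans_1d(data, k=2)
    print(labels)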


[PDF] arxiv.org

Convergence rate to equilibrium in Wasserstein distance for reflected jump-diffusions

A Sarantsev - arXiv preprint arXiv:2003.10590, 2020 - arxiv.org

Convergence rate to the stationary distribution for continuous-time Markov processes can be

studied using Lyapunov functions. Recent work by the author provided explicit rates of

convergence in special case of a reflected jump-diffusion on a half-line. These results are …

  Related articles All 2 versions 


[PDF] arxiv.org

Approximation rate in Wasserstein distance of probability measures on the real line by deterministic empirical measures

O Bencheikh, B Jourdain - arXiv preprint arXiv:2012.09729, 2020 - arxiv.org

We are interested in the approximation in Wasserstein distance with index $\rho\ge 1$ of a

probability measure $\mu $ on the real line with finite moment of order $\rho $ by the

empirical measure of $ N $ deterministic points. The minimal error converges to $0 $ as …

  All 3 versions 
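To see the effect numerically in a simple case (a natural mid-quantile construction, not necessarily the paper's), one can place N equally weighted points at quantile levels (2i-1)/(2N) of a standard normal and watch the 1-Wasserstein error shrink with N:

    # Deterministic N-point approximation of N(0, 1) and its 1-Wasserstein error.
    import numpy as np
    from scipy.stats import norm, wasserstein_distance

    grid = np.linspace(-5, 5, 4001)
    w = norm.pdf(grid)
    w /= w.sum()                              # discretized target law

    for N in [10, 100, 1000]:
        pts = norm.ppf((2 * np.arange(1, N + 1) - 1) / (2 * N))   # mid-quantile points
        print(N, wasserstein_distance(pts, grid, v_weights=w))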

<——2020————-—2020———1360—


[PDF] arxiv.org

A Riemannian Block Coordinate Descent Method for Computing the Projection Robust Wasserstein Distance

M Huang, S Ma, L Lai - arXiv preprint arXiv:2012.05199, 2020 - arxiv.org

The Wasserstein distance has become increasingly important in machine learning and deep

learning. Despite its popularity, the Wasserstein distance is hard to approximate because of

the curse of dimensionality. A recently proposed approach to alleviate the curse of …

  All 3 versions 


Multi-View Wasserstein Discriminant Analysis with Entropic Regularized Wasserstein Distance

H Kasai - ICASSP 2020-2020 IEEE International Conference …, 2020 - ieeexplore.ieee.org

Analysis of multi-view data has recently garnered growing attention because multi-view data

frequently appear in real-world applications, which are collected or taken from many sources

or captured using various sensors. A simple and popular promising approach is to learn a …

Cited by 9 Related articles

[PDF] archives-ouvertes.fr

On the Wasserstein distance between mutually singular measures

G Buttazzo, G Carlier, M Laborde - Advances in Calculus of …, 2020 - degruyter.com

We study the Wasserstein distance between two measures μ, ν which are mutually singular.

In particular, we are interested in minimization problems of the form W(μ, 𝒜) = inf{W(μ, ν) : ν ∈ 𝒜}, where μ is a given probability and 𝒜 is contained in the class of probabilities …


  Cited by 1 Related articles All 8 versions

[PDF] ntu.edu.sg

Wasserstein distance estimates for stochastic integrals by forward-backward stochastic calculus

JC Breton, N Privault - Potential Analysis, 2020 - Springer

We prove Wasserstein distance bounds between the probability distributions of stochastic

integrals with jumps, based on the integrands appearing in their stochastic integral

representations. Our approach does not rely on the Stein equation or on the propagation of …

  All 4 versions


[PDF] arxiv.org

Hierarchical Low-Rank Approximation of Regularized Wasserstein distance

M Motamed - arXiv preprint arXiv:2004.12511, 2020 - arxiv.org

Sinkhorn divergence is a measure of dissimilarity between two probability measures. It is

obtained through adding an entropic regularization term to Kantorovich's optimal transport

problem and can hence be viewed as an entropically regularized Wasserstein distance …

  Related articles All 2 versions 


Covariance Steering of Discrete-Time Stochastic Linear Systems Based on Wasserstein Distance Terminal Cost

IM Balci, E Bakolas - IEEE Control Systems Letters, 2020 - ieeexplore.ieee.org

We consider a class of stochastic optimal control problems for discrete-time linear systems

whose objective is the characterization of control policies that will steer the probability

distribution of the terminal state of the system close to a desired Gaussian distribution. In our …

 Cited by 7 Related articles All 4 versions


[PDF] arxiv.org

LCS Graph Kernel Based on Wasserstein Distance in Longest Common Subsequence Metric Space

J Huang, Z Fang, H Kasai - arXiv preprint arXiv:2012.03612, 2020 - arxiv.org

For graph classification tasks, many methods use a common strategy to aggregate

information of vertex neighbors. Although this strategy provides an efficient means of

extracting graph topological features, it brings excessive amounts of information that might …

  All 2 versions 

[PDF] sciencedirect.com

Intelligent Fault Diagnosis with a Deep Transfer Network based on Wasserstein Distance

J Xu, J Huang, Y Zhao, L Zhou - Procedia Computer Science, 2020 - Elsevier

Intelligent fault-diagnosis methods based on deep-learning technology have been very

successful for complex industrial systems. The deep learning based fault classification

model requires a large number of labeled data. Moreover, the probability distribution of …

 Cited by 2 Related articles


[PDF] arxiv.org

The Spectral-Domain W2 Wasserstein Distance for Elliptical Processes and the Spectral-Domain Gelbrich Bound

S Fang, Q Zhu - arXiv preprint arXiv:2012.04023, 2020 - arxiv.org

In this short note, we introduce the spectral-domain $\mathcal {W} _2 $ Wasserstein distance

for elliptical stochastic processes in terms of their power spectra. We also introduce the

spectral-domain Gelbrich bound for processes that are not necessarily elliptical. Subjects …

  All 2 versions 

[PDF] arxiv.org

Intrinsic Sliced Wasserstein Distances for Comparing Collections of Probability Distributions on Manifolds and Graphs

RM Rustamov, S Majumdar - arXiv preprint arXiv:2010.15285, 2020 - arxiv.org

Collections of probability distributions arise in a variety of statistical applications ranging

from user activity pattern analysis to brain connectomics. In practice these distributions are

represented by histograms over diverse domain types including finite intervals, circles …

  Cited by 3 Related articles All 2 versions 

<——2020——-—2020———1370—-


Convergence in Monge-Wasserstein Distance of Mean Field Systems with Locally Lipschitz Coefficients

DT Nguyen, SL Nguyen, NH Du - Acta Mathematica Vietnamica, 2020 - Springer

This paper focuses on stochastic systems of weakly interacting particles whose dynamics

depend on the empirical measures of the whole populations. The drift and diffusion

coefficients of the dynamical systems are assumed to be locally Lipschitz continuous and …

 

[PDF] researchgate.net

Non-Gaussian BLE-Based Indoor Localization Via Gaussian Sum Filtering Coupled with Wasserstein Distance

P Malekzadeh, S Mehryar, P Spachos… - ICASSP 2020-2020 …, 2020 - ieeexplore.ieee.org

With recent breakthroughs in signal processing, communication and networking systems, we

are more and more surrounded by smart connected devices empowered by the Internet of

Thing (IoT). Bluetooth Low Energy (BLE) is considered as the main-stream technology to …

  Related articles All 2 versions


Functional Data Clustering Analysis via the Learning of Gaussian Processes with Wasserstein Distance

T Li, J Ma - International Conference on Neural Information …, 2020 - Springer

Functional data clustering analysis becomes an urgent and challenging task in the new era

of big data. In this paper, we propose a new framework for functional data clustering

analysis, which adopts a similar structure as the k-means algorithm for the conventional …

 

[PDF] arxiv.org

Independent Elliptical Distributions Minimize Their W2 Wasserstein Distance from Independent Elliptical Distributions with the Same Density Generator

S Fang, Q Zhu - arXiv preprint arXiv:2012.03809, 2020 - arxiv.org

This short note is on a property of the $\mathcal {W} _2 $ Wasserstein distance which

indicates that independent elliptical distributions minimize their $\mathcal {W} _2 $

Wasserstein distance from given independent elliptical distributions with the same density …

  All 2 versions 

online OPEN ACCESS

Independent Elliptical Distributions Minimize Their $\mathcal{W}_2$ Wasserstein Distance from...

by Fang, Song; Zhu, Quanyan

12/2020

This short note is on a property of the $\mathcal{W}_2$ Wasserstein distance which indicates that independent elliptical distributions minimize their...

Journal Article, Full Text Online

 Related articles All 2 versions 

[PDF] arxiv.org

Convergence in Wasserstein Distance for Empirical Measures of Dirichlet Diffusion Processes on Manifolds

FY Wang - arXiv preprint arXiv:2005.09290, 2020 - arxiv.org

Let $ M $ be a $ d $-dimensional connected compact Riemannian manifold with boundary

$\partial M $, let $ V\in C^ 2 (M) $ such that $\mu ({\rm d} x):={\rm e}^{V (x)}{\rm d} x $ is a

probability measure, and let $ X_t $ be the diffusion process generated by …

  Cited by 1 Related articles All 2 versions 


2020

Semantic Inpainting with Multi-dimensional Adversarial Network and Wasserstein Distance

H Wang, L Jiao, R Bie, H Wu - Chinese Conference on Pattern …, 2020 - Springer

Inpainting represents a procedure which can restore the lost parts of an image based upon

the residual information. We present an inpainting network that consists of an Encoder-

Decoder pipeline and a multi-dimensional adversarial network. The Encoder-Decoder …

 

[PDF] arxiv.org

Entropy-Regularized 2-Wasserstein Distance between Gaussian Measures

A Mallasto, A Gerolin, HQ Minh - arXiv preprint arXiv:2006.03416, 2020 - arxiv.org

Gaussian distributions are plentiful in applications dealing in uncertainty quantification and

diffusivity. They furthermore stand as important special cases for frameworks providing

geometries for probability measures, as the resulting geometry on Gaussians is often …

  Cited by 3 Related articles All 2 versions 
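As a point of reference for this Gaussian setting, the unregularized 2-Wasserstein distance between two Gaussians has the well-known closed form sketched below (the Gelbrich/Bures formula); the entropic-regularized quantity studied in the paper modifies this expression, so this is only the classical baseline.

    # Closed-form 2-Wasserstein distance between N(m1, C1) and N(m2, C2).
    import numpy as np
    from scipy.linalg import sqrtm

    def w2_gaussian(m1, C1, m2, C2):
        s1 = sqrtm(C1)
        cross = sqrtm(s1 @ C2 @ s1)                        # (C1^{1/2} C2 C1^{1/2})^{1/2}
        bures2 = np.trace(C1 + C2 - 2 * np.real(cross))    # squared Bures distance
        return np.sqrt(np.sum((m1 - m2) ** 2) + bures2)

    m1, C1 = np.zeros(2), np.eye(2)
    m2, C2 = np.ones(2), np.array([[2.0, 0.3], [0.3, 1.0]])
    print(w2_gaussian(m1, C1, m2, C2))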


Learning Wasserstein Distance-Based Gaussian Graphical Model for Multivariate Time Series Classification

HU Xuegang, L Jianxing, LI Peipei… - 2020 IEEE …, 2020 - ieeexplore.ieee.org

Multivariate time series classification occupies an important position in time series data

mining tasks and has been applied in many fields. However, due to the statistical coupling

between different variables of Multivariate Time Series (MTS) data, traditional classification …

  All 2 versions

[PDF] arxiv.org

Unsupervised Wasserstein Distance Guided Domain Adaptation for 3D Multi-domain Liver Segmentation

C You, J Yang, J Chapiro, JS Duncan - Interpretable and Annotation …, 2020 - Springer

Deep neural networks have shown exceptional learning capability and generalizability in

the source domain when massive labeled data is provided. However, the well-trained

models often fail in the target domain due to the domain shift. Unsupervised domain …

  All 3 versions


[HTML] Correcting nuisance variation using Wasserstein distance

G Tabak, M Fan, S Yang, S Hoyer, G Davis - PeerJ, 2020 - peerj.com

Profiling cellular phenotypes from microscopic imaging can provide meaningful biological

information resulting from various factors affecting the cells. One motivating application is

drug development: morphological cell features can be captured from images, from which …

  Related articles All 8 versions 

<——2020——-—2020———1380——


[PDF] polimi.it

Wasserstein K-means per clustering di misure di probabilità e applicazioni

R TAFFONI - 2020 - politesi.polimi.it

Abstract (in Italian): The thesis studies the Wasserstein distance, in its general and discrete cases, applied to the K-means algorithm, which is described step by step. Finally, the algorithm is applied to artificial data and a dataset …

[PDF] ethz.ch

[PDF] Smooth Wasserstein Distance: Metric Structure and Statistical Efficiency

Z Goldfeld - International Zurich Seminar on Information …, 2020 - research-collection.ethz.ch

The Wasserstein distance has seen a surge of interest and applications in machine learning.

Its popularity is driven by many advantageous properties it possesses, such as metric

structure (metrization of weak convergence), robustness to support mismatch, compatibility …

  Related articles All 5 versions 


2020 PDF

Quantifying the Empirical Wasserstein Distance to a Set of ...

papers.nips.cc › paper › file

There is a rapidly growing research literature discussing the statistical properties of the Wasserstein distance and how to beat the curse of dimensionality. Weed and Bach [25] claim that the Wasserstein distance enjoys a faster convergence rate if the true measure has support on a lower-dimensional manifold.

Cited by 8 Related articles All 3 versions 

[CITATION] Quantifying the Empirical Wasserstein Distance to a Set of Measures: Beating the Curse of Dimensionality

N Si, J Blanchet, S Ghosh, M Squillante - Advances in Neural Information Processing …, 2020

  Related articles

SlidesLive video: Quantifying the Empirical Wasserstein Distance to a Set of Measures: Beating the Curse of Dimensionality, Dec 6, 2020 (slideslive.com)

[PDF] kweku.me

[PDF] Measuring Bias with Wasserstein Distance

K Kwegyir-Aggrey, SM Brown - kweku.me

In fair classification, we often ask:" what does it mean to be fair, and how is fairness

measured?" Previous approaches to defining and enforcing fairness rely on a set of

statistical fairness definitions, with each definition providing its own unique measurement of …

  

[PDF] unipv.it

[PDF] On the equivalence between Fourier-based and Wasserstein metrics

G Auricchio, A Codegoni, S Gualandi, G Toscani… - mate.unipv.it

We investigate properties of some extensions of a class of Fourierbased probability metrics,

originally introduced to study convergence to equilibrium for the solution to the spatially

homogeneous Boltzmann equation. At difference with the original one, the new Fourier …

  Related articles 

  All 2 versions


[PDF] uniroma1.it

[PDF] Nonparametric Density Estimation with Wasserstein Distance for Actuarial Applications

EG Luini - iris.uniroma1.it

Density estimation is a central topic in statistics and a fundamental task of actuarial sciences.

In this work, we present an algorithm for approximating multivariate empirical densities with

a piecewise constant distribution defined on a hyperrectangular-shaped partition of the …

  Related articles All 2 versions 


2020

Artificial Neural Network with Histogram Data Time Series Forecasting: A Least Squares Approach Based on Wasserstein Distance

P Rakpho, W Yamaka, K Zhu - Behavioral Predictive Modeling in …, 2020 - Springer

This paper aims to predict the histogram time series, and we use the high-frequency data

with 5-min to construct the Histogram data for each day. In this paper, we apply the Artificial

Neural Network (ANN) to Autoregressive (AR) structure and introduce the AR—ANN model …

  All 3 versions

Artificial Neural Network with Histogram Data Time Series Forecasting: A Least Squares Approach Based on Wasserstein...

by Rakpho, Pichayakone; Yamaka, Woraphon; Zhu, Kongliang

01/2020

This paper aims to predict the histogram time series, and we use the high-frequency data with 5-min to construct the Histogram data for each day. In this...

Book Chapter, Citation Online

 

 [PDF] esaim-ps.org

Squared quadratic Wasserstein distance: optimal couplings and Lions differentiability

A Alfonsi, B Jourdain - ESAIM: Probability and Statistics, 2020 - esaim-ps.org

In this paper, we remark that any optimal coupling for the quadratic Wasserstein distance between two probability measures μ and ν with finite second order moments on ℝ^d is the composition of a martingale coupling with an optimal transport map. We check the existence …

 

[PDF] psu.edu

[PDF] Subexponential upper and lower bounds in Wasserstein distance for Markov processes

A Arapostathis, G Pang, N Sandric - personal.psu.edu

In this article, relying on Foster-Lyapunov drift conditions, we establish subexponential

upper and lower bounds on the rate of convergence in the Lp-Wasserstein distance for a

class of irreducible and aperiodic Markov processes. We further discuss these results in the …

 Cited by 1 Related articles 


[2002.01012] Asymptotic Guarantees for Generative Modeling Based on the Smooth Wasserstein Distance

arxiv.org › math

by Z Goldfeld · 2020 · Cited by 1 — Asymptotic Guarantees for Generative Modeling Based on the Smooth Wasserstein Distance. Minimum distance estimation (MDE) gained recent attention as a formulation of (implicit) generative modeling.

[CITATION] Asymptotic Guarantees for Generative Modeling Based on the Smooth Wasserstein Distance

Z Goldfeld, K Greenewald, K Kato - Advances in Neural Information Processing …, 2020

  Cited by 10 Related articles All 5 versions

<——2020————-—2020———1390—


[PDF] googleapis.com

Object shape regression using wasserstein distance

J Sun, SKP Kumar, R Bala - US Patent App. 16/222,062, 2020 - Google Patents

One embodiment can provide a system for detecting outlines of objects in images. During

operation, the system receives an image that includes at least one object, generates a

random noise signal, and provides the received image and the random noise signal to a …

  All 3 versions 


[PDF] researchgate.net

[PDF] Distributionally Robust XVA via Wasserstein Distance: Wrong Way Counterparty Credit and Funding Risk

D Singh, S Zhang - researchgate.net

This paper investigates calculations of robust XVA, in particular, credit valuation adjustment

(CVA) and funding valuation adjustment (FVA) for over-the-counter derivatives under

distributional uncertainty using Wasserstein distance as the ambiguity measure. Wrong way …

  Related articles All 2 versions 


An Improvement based on Wasserstein GAN for Alleviating Mode Collapsing

Y Chen, X Hou - 2020 International Joint Conference on Neural …, 2020 - ieeexplore.ieee.org

In the past few years, Generative Adversarial Networks as a deep generative model has

received more and more attention. Mode collapsing is one of the challenges in the study of

Generative Adversarial Networks. In order to solve this problem, we deduce a new algorithm

on the basis of Wasserstein GAN. We add a generated distribution entropy term to the

objective function of generator net and maximize the entropy to increase the diversity of fake

images. And then Stein Variational Gradient Descent algorithm is used for optimization. We …

  Related articles

online

An Improvement based on Wasserstein GAN for Alleviating Mode Collapsing

by Chen, Yingying; Hou, Xinwen

2020 International Joint Conference on Neural Networks (IJCNN), 07/2020

In the past few years, Generative Adversarial Networks as a deep generative model has received more and more attention. Mode collapsing is one of the...

Conference Proceeding, Full Text Online

Wasserstein distance estimates for stochastic integrals ... - NTU

personal.ntu.edu.sg › wasserstein_forward-backward

PDF

Aug 7, 2020 — of jump-diffusion processes. In [BP08], lower and upper bounds on option prices have been obtained in one-dimensional jump-diffusion ...

[CITATION] Wasserstein distance estimates for jump-diffusion processes

JC Breton, N Privault - Preprint, 2020


2020  [PDF] arxiv.org

Stein's method for normal approximation in Wasserstein distances with application to the multivariate Central Limit Theorem

T Bonis - Probability Theory and Related Fields, 2020 - Springer

We use Stein's method to bound the Wasserstein distance of order 2 between a measure $\nu$ and the Gaussian measure using a stochastic process $(X_t)_{t\ge 0}$ such that $X_t$ is drawn from $\nu$ for any $t>0$. If the stochastic process $(X_t)_{t\ge 0}$ …

  Cited by 5 Related articles All 2 versions


[PDF] arxiv.org

High-Confidence Attack Detection via Wasserstein-Metric Computations

D Li, S Martínez - IEEE Control Systems Letters, 2020 - ieeexplore.ieee.org

This letter considers a sensor attack and fault detection problem for linear cyber-physical

systems, which are subject to system noise that can obey an unknown light-tailed

distribution. We propose a new threshold-based detection mechanism that employs the …

  Cited by 2 Related articles All 5 versions


2020

[PDF] arxiv.org

On Stein's factors for Poisson approximation in Wasserstein distance with non-linear transportation costs

ZW Liao, Y Ma, A Xia - arXiv preprint arXiv:2003.13976, 2020 - arxiv.org

We establish various bounds on the solutions to a Stein equation for Poisson approximation

in Wasserstein distance with non-linear transportation costs. The proofs are a refinement of

those in [Barbour and Xia (2006)] using the results in [Liu and Ma (2009)]. As a corollary, we …

  Related articles All 2 versions 


2020   [PDF] arxiv.org

Stein factors for variance-gamma approximation in the Wasserstein and Kolmogorov distances

RE Gaunt - arXiv preprint arXiv:2008.06088, 2020 - arxiv.org

We obtain new bounds for the solution of the variance-gamma (VG) Stein equation that are

of the correct form for approximations in terms of the Wasserstein and Kolmogorov metrics. These bounds hold for all parameter values of the four-parameter VG class. As an …

  Cited by 2 All 3 versions 


The quadratic Wasserstein metric for inverse data matching

B Engquist, K Ren, Y Yang - Inverse Problems, 2020 - iopscience.iop.org

This work characterizes, analytically and numerically, two major effects of the quadratic

Wasserstein (W 2) distance as the measure of data discrepancy in computational solutions

of inverse problems. First, we show, in the infinite-dimensional setup, that the W 2 distance …

  Cited by 3 Related articles All 5 versions

The quadratic Wasserstein metric for inverse data matching

K Ren, Y Yang - arXiv preprint arXiv:1911.06911, 2019 - arxiv.org

This work characterizes, analytically and numerically, two major effects of the quadratic

Wasserstein ($ W_2 $) distance as the measure of data discrepancy in computational

solutions of inverse problems. First, we show, in the infinite-dimensional setup, that the …

  Cited by 1 Related articles 


[PDF] iop.org  Full View

Wasserstein metric for improved quantum machine learning with adjacency matrix representations

O Çaylak, OA von Lilienfeld… - … Learning: Science and …, 2020 - iopscience.iop.org

We study the Wasserstein metric to measure distances between molecules represented by

the atom index dependent adjacency 'Coulomb' matrix, used in kernel ridge regression based

supervised learning. Resulting machine learning models of quantum properties, aka …

  Cited by 3

<——2020——2020———1400——


[HTML] mdpi.com

Calculating the Wasserstein metric-based Boltzmann entropy of a landscape mosaic

H Zhang, Z Wu, T Lan, Y Chen, P Gao - Entropy, 2020 - mdpi.com

Shannon entropy is currently the most popular method for quantifying the disorder or

information of a spatial data set such as a landscape pattern and a cartographic map.

However, its drawback when applied to spatial data is also well documented; it is incapable …

  Cited by 3 Related articles All 4 versions 

Data-driven distributionally robust unit commitment with Wasserstein metric: Tractable formulation and efficient solution method

X Zheng, H Chen - IEEE Transactions on Power Systems, 2020 - ieeexplore.ieee.org

In this letter, we propose a tractable formulation and an efficient solution method for the

Wasserstein-metric-based distributionally robust unit commitment (DRUC-dW) problem.

First, a distance-based data aggregation method is introduced to hedge against the …

  Cited by 3 All 2 versions


[PDF] esaim-proc.org

Statistical data analysis in the Wasserstein space

J Bigot - ESAIM: Proceedings and Surveys, 2020 - esaim-proc.org

This paper is concerned by statistical inference problems from a data set whose elements

may be modeled as random probability measures such as multiple histograms or point

clouds. We propose to review recent contributions in statistics on the use of Wasserstein  …

  Cited by 2 Related articles

[PDF] arxiv.org

Regularized variational data assimilation for bias treatment using the Wasserstein metric

SK Tamang, A Ebtehaj, D Zou… - Quarterly Journal of the …, 2020 - Wiley Online Library

This article presents a new variational data assimilation (VDA) approach for the formal

treatment of bias in both model outputs and observations. This approach relies on the

Wasserstein metric, stemming from the theory of optimal mass transport, to penalize the …

  Cited by 1 Related articles All 3 versions


[PDF] arxiv.org

Regularizing activations in neural networks via distribution matching with the Wasserstein metric

T Joo, D Kang, B Kim - arXiv preprint arXiv:2002.05366, 2020 - arxiv.org

Regularization and normalization have become indispensable components in training deep

neural networks, resulting in faster training and improved generalization performance. We

propose the projected error function regularization loss (PER) that encourages activations to …

Cited by 3 Related articles All 5 versions

[PDF] arxiv.org

Distributed Wasserstein Barycenters via Displacement Interpolation

P Cisneros-Velarde, F Bullo - arXiv preprint arXiv:2012.08610, 2020 - arxiv.org

Consider a multi-agent system whereby each agent has an initial probability measure. In this

paper, we propose a distributed algorithm based upon stochastic, asynchronous and

pairwise exchange of information and displacement interpolation in the Wasserstein space …

  Related articles All 2 versions 

  

 

[PDF] arxiv.org

Data-Driven Approximation of the Perron-Frobenius Operator Using the Wasserstein Metric

A Karimi, TT Georgiou - arXiv preprint arXiv:2011.00759, 2020 - arxiv.org

This manuscript introduces a regression-type formulation for approximating the Perron-

Frobenius Operator by relying on distributional snapshots of data. These snapshots may

represent densities of particles. The Wasserstein metric is leveraged to define a suitable …

  Cited by 1 Related articles All 6 versions

[PDF] arxiv.org

High-Confidence Attack Detection via Wasserstein-Metric Computations

D Li, S Martínez - arXiv preprint arXiv:2003.07880, 2020 - arxiv.org

This paper considers a sensor attack and fault detection problem for linear cyber-physical

systems, which are subject to possibly non-Gaussian noise that can have an unknown light-

tailed distribution. We propose a new threshold-based detection mechanism that employs …

  Cited by 1 Related articles All 2 versions 


[PDF] arxiv.org

Geometric Characteristics of Wasserstein Metric on SPD (n)

Y Luo, S Zhang, Y Cao, H Sun - arXiv preprint arXiv:2012.07106, 2020 - arxiv.org

Wasserstein distance, especially among symmetric positive-definite matrices, has broad and

deep influences on development of artificial intelligence (AI) and other branches of computer

science. A natural idea is to describe the geometry of $ SPD\left (n\right) $ as a Riemannian …

  All 2 versions 


[PDF] arxiv.org

Wasserstein metric for improved QML with adjacency matrix representations

O Çaylak, OA von Lilienfeld, B Baumeier - arXiv preprint arXiv:2001.11005, 2020 - arxiv.org

We study the Wasserstein metric to measure distances between molecules represented by

the atom index dependent adjacency "Coulomb" matrix, used in kernel ridge regression

based supervised learning. Resulting quantum machine learning models exhibit improved …

  Cited by 1 Related articles All 2 versions 

<——2020—————2020———1410——


[PDF] arxiv.org

Minimax control of ambiguous linear stochastic systems using the Wasserstein metric

K Kim, I Yang - arXiv preprint arXiv:2003.13258, 2020 - arxiv.org

In this paper, we propose a minimax linear-quadratic control method to address the issue of

inaccurate distribution information in practical stochastic systems. To construct a control

policy that is robust against errors in an empirical distribution of uncertainty, our method is to …

  Related articles All 2 versions 


[PDF] arxiv.org

Velocity Inversion Using the Quadratic Wasserstein Metric

S Mahankali - arXiv preprint arXiv:2009.00708, 2020 - arxiv.org

Full--waveform inversion (FWI) is a method used to determine properties of the Earth from

information on the surface. We use the squared Wasserstein distance (squared $ W_2 $

distance) as an objective function to invert for the velocity as a function of position in the …

  All 5 versions 
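
For one-dimensional signals with equal numbers of equally weighted samples, the squared 2-Wasserstein distance reduces to the mean squared difference of sorted values, which is the quantity such objective functions build on. A small sketch under that equal-size assumption (FWI itself additionally requires normalizing seismic traces into probability densities, which is omitted here):

import numpy as np

def w2_squared_1d(x, y):
    # Squared 2-Wasserstein distance between two equal-weight 1D samples,
    # obtained by matching order statistics.
    x, y = np.sort(np.asarray(x)), np.sort(np.asarray(y))
    assert x.size == y.size, "equal sample sizes assumed in this sketch"
    return np.mean((x - y) ** 2)

print(w2_squared_1d(np.random.randn(1000), np.random.randn(1000) + 2.0))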

2020

(PDF) Learning Graphons via Structured Gromov-Wasserstein ...

https://www.researchgate.net › ... › Psychology › Learning

by H Xu · 2021 · Cited by 3 — Dec 10, 2020 — We propose a novel and principled method to learn a nonparametric graph model called graphon, which is defined in an infinite-dimensional space ...

Cited by 5 Related articles All 6 versions


[PDF] arxiv.org

Berry-Esseen smoothing inequality for the Wasserstein metric on compact Lie groups

B Borda - arXiv preprint arXiv:2005.04925, 2020 - arxiv.org

We prove a general inequality estimating the distance of two probability measures on a

compact Lie group in the Wasserstein metric in terms of their Fourier transforms. The result is

close to being sharp. We use a generalized form of the Wasserstein metric, related by …

  Related articles All 2 versions 


[PDF] aimsciences.org

Exponential convergence in the Wasserstein metric for one-dimensional diffusions

L Cheng, R Li, L Wu - Discrete & Continuous Dynamical Systems-A, 2020 - aimsciences.org

In this paper, we find some general and efficient sufficient conditions for the exponential

convergence $W_{1,d}(P_t(x,\cdot), P_t(y,\cdot)) \le K e^{-\delta t} d(x,y)$ for the semigroup $(P_t)$ of one-

dimensional diffusion. Moreover, some sharp estimates of the involved constants $K \ge 1$, $\delta > 0$ …

  Related articles All 2 versions 

 

 


Wasserstein metric-based Boltzmann entropy of a landscape mosaic: a clarification, correction, and evaluation of thermodynamic consistency

P Gao, H Zhang, Z Wu - Landscape Ecology - Springer

Objectives The first objective is to provide a clarification of and a correction to the

Wasserstein metric-based method. The second is to evaluate the method in terms of

thermodynamic consistency using different implementations. Methods Two implementation …

 

Convergence rates of the blocked Gibbs sampler with random scan in the Wasserstein metric

NY Wang, G Yin - Stochastics, 2020 - Taylor & Francis

To approximate μ, various scan Gibbs samplers with updating blocks are often used [1 J.

Besag, P. Green, D. Higdon, and K. Mengersen, Bayesian computation and stochastic

systems, Statist. Sci. 10(1) (1995), pp. 3–41. doi: 10.1214/ss/1177010123 …

  Related articles All 3 versions


 

[PDF] semanticscholar.org

[PDF] Deconvolution for the Wasserstein metric and topological inference

B Michel - pdfs.semanticscholar.org



Semi-supervised Surface Wave Tomography with Wasserstein Cycle-consistent GAN: Method and Application on Southern California Plate Boundary Region

A Cai, H Qiu, F Niu - 2020 - essoar.org

Machine learning algorithm is applied to shear wave velocity (Vs) inversion in surface wave

tomography, where a set of 1-D Vs profiles and the corresponding synthetic dispersion

curves are used in network training. Previous studies showed that performances of a trained …

 



<——2020——2020———1420——   


2020

Entropy Regularized Power k-Means Clustering

arxiv.org › pdf

PDF

Jan 10, 2020 — entropy regularization to learn feature relevance while annealing. ... (2012; Chakraborty and Das, 2017), but pairwise Euclidean distances become decreasingly informative as the ... are sampled from a standard normal distribution (further details are described later in Simulation 2 of ...

 

2020

[PDF] arxiv.org 2020

Entropy-Regularized 2-Wasserstein Distance between Gaussian Measures

Anton Mallasto, Augusto Gerolin, Hà Quang Minh

Gaussian distributions are plentiful in applications dealing in uncertainty quantification and diffusivity. They furthermore stand as important special cases for frameworks providing geometries for probability measures, as the resulting geometry on Gaussians is often expressible in closed-form under the frameworks. In this work, we study the Gaussian geometry under the entropy-regularized 2-Wasserstein distance, by providing closed-form solutions for the distance and interpolations between elements. Furthermore, we provide a fixed-point characterization of a population barycenter when restricted to the manifold of Gaussians, which allows computations through the fixed-point iteration algorithm. As a consequence, the results yield closed-form expressions for the 2-Sinkhorn divergence. As the geometries change by varying the regularization magnitude, we study the limiting cases of vanishing and infinite magnitudes, reconfirming well-known results on the limits of the Sinkhorn divergence. Finally, we illustrate the resulting geometries with a numerical study.
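
For reference, the unregularized 2-Wasserstein distance between Gaussians, which the entropic version regularizes, has the well-known closed form W2^2 = ||m1 - m2||^2 + tr(S1 + S2 - 2(S2^{1/2} S1 S2^{1/2})^{1/2}). A small sketch of that baseline formula (not the paper's entropic or Sinkhorn expressions):

import numpy as np
from scipy.linalg import sqrtm

def w2_gaussian(m1, S1, m2, S2):
    # Closed-form 2-Wasserstein distance between N(m1, S1) and N(m2, S2).
    cross = sqrtm(sqrtm(S2) @ S1 @ sqrtm(S2))
    bures = np.trace(S1 + S2 - 2 * np.real(cross))
    return np.sqrt(np.sum((m1 - m2) ** 2) + bures)

print(w2_gaussian(np.zeros(2), np.eye(2), np.ones(2), 2 * np.eye(2)))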


[PDF] arxiv.org

Exponential Convergence in Entropy and Wasserstein Distance for McKean-Vlasov SDEs

P Ren, FY Wang - arXiv preprint arXiv:2010.08950, 2020 - arxiv.org

The following type exponential convergence is proved for (non-degenerate or degenerate)

McKean-Vlasov SDEs: $W_2(\mu_t,\mu_\infty)^2 + {\rm Ent}(\mu_t|\mu_\infty) \le c\,{\rm e}^{-\lambda t}\min\big\{W_2(\mu_0,\mu_\infty)^2, {\rm Ent}(\mu_0|\mu_\infty)\big\}$, $t \ge 1$ …

  Cited by 1 All 2 versions 


[HTML] mdpi.com

Calculating the Wasserstein metric-based Boltzmann entropy of a landscape mosaic

H Zhang, Z Wu, T Lan, Y Chen, P Gao - Entropy, 2020 - mdpi.com

Shannon entropy is currently the most popular method for quantifying the disorder or

information of a spatial data set such as a landscape pattern and a cartographic map.

However, its drawback when applied to spatial data is also well documented; it is incapable …

  Cited by 3 Related articles All 4 versions 


2020

[PDF] uni-bielefeld.de

[PDF] Exponential Convergence in Entropy and Wasserstein for McKean-Vlasov SDEs

P Ren, FY Wang - 2020 - sfb1283.uni-bielefeld.de

The convergence in entropy for stochastic systems is an important topic in both probability theory

and mathematical physics, and has been well studied for Markov processes by using the

log-Sobolev inequality, see for instance [5] and references therein. However, the existing results …

  All 2 versions




2020

[PDF] arxiv.org

Stein factors for variance-gamma approximation in the Wasserstein and Kolmogorov distances

RE Gaunt - arXiv preprint arXiv:2008.06088, 2020 - arxiv.org

We obtain new bounds for the solution of the variance-gamma (VG) Stein equation that are

of the correct form for approximations in terms of the Wasserstein and Kolmogorov metrics.

These bounds hold for all parameters values of the four parameter VG class. As an …

  Cited by 2 All 3 versions 



2020

[PDF] arxiv.org

The Equivalence of Fourier-based and Wasserstein Metrics on Imaging Problems

G Auricchio, A Codegoni, S Gualandi… - arXiv preprint arXiv …, 2020 - arxiv.org

We investigate properties of some extensions of a class of Fourier-based probability metrics,

originally introduced to study convergence to equilibrium for the solution to the spatially

homogeneous Boltzmann equation. At difference with the original one, the new Fourier …

  Related articles All 4 versions 


2020

Parameter-Transferred Wasserstein Generative Adversarial Network (PT-WGAN) for Low-Dose PET Image Denoising

Y Gong, H Shan, Y Teng, N Tu, M Li… - … on Radiation and …, 2020 - ieeexplore.ieee.org

Due to the widespread use of positron emission tomography (PET) in clinical practice, the

potential risk of PET-associated radiation dose to patients needs to be minimized. However,

with the reduction in the radiation dose, the resultant images may suffer from noise and …

  Cited by 12 Related articles All 4 versions

2020

RDA-UNET-WGAN: An Accurate Breast Ultrasound Lesion ...

www.springerprofessional.de › rda-unet-wgan-an-accur...

Apr 3, 2020 — In this paper, we propose a Generative Adversarial Network (GAN) based algorithm for segmenting the tumor in Breast Ultrasound images.

RDA-UNET-WGAN: An Accurate Breast Ultrasound Lesion Segmentation Using Wasserstein Generative Adversarial Networks

A Negi, ANJ Raj, R Nersisson, Z Zhuang… - … FOR SCIENCE AND …, 2020 - Springer

Early-stage detection of lesions is the best possible way to fight breast cancer, a disease

with the highest malignancy ratio among women. Though several methods primarily based

on deep learning have been proposed for tumor segmentation, it is still a challenging …

  Cited by 3 Related articles


  2020  [HTML] springer.com

[HTML] Wasserstein and Kolmogorov error bounds for variance-gamma approximation via Stein's method I

RE Gaunt - Journal of Theoretical Probability, 2020 - Springer

The variance-gamma (VG) distributions form a four-parameter family that includes as special

and limiting cases the normal, gamma and Laplace distributions. Some of the numerous

applications include financial modelling and approximation on Wiener space. Recently …

  Cited by 13 Related articles All 6 versions


 <——2020—————2020———1430—


  year 2020

[PDF] unipv.it

[PDF] On the equivalence between Fourier-based and Wasserstein metrics

G Auricchio, A Codegoni, S Gualandi, G Toscani… - mate.unipv.it

We investigate properties of some extensions of a class of Fourierbased probability metrics,

originally introduced to study convergence to equilibrium for the solution to the spatially

homogeneous Boltzmann equation. At difference with the original one, the new Fourier …


2020

 [BOOK] An invitation to statistics in Wasserstein space

VM Panaretos, Y Zemel - 2020 - library.oapen.org

This open access book presents the key aspects of statistics in Wasserstein spaces, ie

statistics in the space of probability measures when endowed with the geometry of optimal

transportation. Further to reviewing state-of-the-art aspects, it also provides an accessible …

  Cited by 18 Related articles All 7 versions 


2020  [PDF] arxiv.org

Wasserstein statistics in 1D location-scale model

S Amari - arXiv preprint arXiv:2003.05479, 2020 - arxiv.org

Wasserstein geometry and information geometry are two important structures introduced in a

manifold of probability distributions. The former is defined by using the transportation cost

between two distributions, so it reflects the metric structure of the base manifold on which …

  Cited by 1 Related articles All 2 versions 

 

 2020

[PDF] arxiv.org

Wasserstein statistics in one-dimensional location-scale model

S Amari, T Matsuda - arXiv preprint arXiv:2007.11401, 2020 - arxiv.org

Wasserstein geometry and information geometry are two important structures to be

introduced in a manifold of probability distributions. Wasserstein geometry is defined by

using the transportation cost between two distributions, so it reflects the metric of the base …

  Cited by 1 Related articles All 2 versions 


2020  [PDF] mit.edu

Wasserstein barycenters: statistics and optimization

AJ Stromme - 2020 - dspace.mit.edu

We study a geometric notion of average, the barycenter, over 2-Wasserstein space. We

significantly advance the state of the art by introducing extendible geodesics, a simple

synthetic geometric condition which implies non-asymptotic convergence of the empirical …

  Related articles 


2020 [PDF] unipd.it

[PDF] Weighted L2-Wasserstein Goodness-of-Fit Statistics

T de Wet - stat.unipd.it

In two recent papers, del Barrio et al.[2] and del Barrio et al.[3], the authors introduced and

studied a new class of goodness-of-fit statistics for location-scale families, based on L2-

functionals of the empirical quantile process. These functionals measure the Wasserstein  …

  Related articles 

[CITATION] Optimality in weighted L2-Wasserstein goodness-of-fit statistics

T De Wet, V Humble - South African …, 2020 - South African Statistical Association …

  Related articles All 2 versions


Concentration of risk measures: A Wasserstein distance ...

arxiv.org › math

Oct 21, 2020 — Previous concentration bounds are available only for specific risk measures such as CVaR and CPT-value. The bounds derived in this paper are shown to either match or improve upon previous bounds in cases where they ...
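
As background for the snippet's mention of CVaR, one common empirical estimator averages the worst (1 - alpha) fraction of sampled losses; the helper below is an illustrative assumption, not the paper's concentration bound.

import numpy as np

def empirical_cvar(losses, alpha=0.95):
    # Empirical Conditional Value-at-Risk: mean of the worst (1 - alpha) losses.
    losses = np.sort(np.asarray(losses))
    k = int(np.ceil(alpha * losses.size))
    return losses[k:].mean() if k < losses.size else losses[-1]

print(empirical_cvar(np.random.standard_t(df=4, size=10000), alpha=0.95))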


[PDF] academia.edu

Data augmentation in fault diagnosis based on the Wasserstein generative adversarial network with gradient penalty

X Gao, F Deng, X Yue - Neurocomputing, 2020 - Elsevier

Fault detection and diagnosis in industrial process is an extremely essential part to keep

away from undesired events and ensure the safety of operators and facilities. In the last few

decades various data based machine learning algorithms have been widely studied to …

  Cited by 36 Related articles All 3 versions
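
The gradient-penalty term referenced in the title is, in its standard form, a penalty on deviations of the critic's gradient norm from 1 along interpolates between real and generated samples. A generic PyTorch sketch of that standard term (the critic D, batch shapes, and the weight lambda_gp are assumptions; this is not the paper's specific architecture):

import torch

def gradient_penalty(D, real, fake, lambda_gp=10.0):
    # Standard WGAN-GP penalty: push the critic's gradient norm toward 1
    # on random interpolates between real and generated batches.
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    interp = (eps * real.detach() + (1 - eps) * fake.detach()).requires_grad_(True)
    scores = D(interp)
    grads = torch.autograd.grad(scores.sum(), interp, create_graph=True)[0]
    grad_norm = grads.flatten(1).norm(2, dim=1)
    return lambda_gp * ((grad_norm - 1) ** 2).mean()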


Wasserstein distance to independence models

TÖ Çelik, A Jamneshan, G Montúfar, B Sturmfels… - Journal of Symbolic …, 2020 - Elsevier

An independence model for discrete random variables is a Segre-Veronese variety in a

probability simplex. Any metric on the set of joint states of the random variables induces a

Wasserstein metric on the probability simplex. The unit ball of this polyhedral norm is dual to …

  Related articles All 3 versions


A fast proximal point method for computing exact wasserstein distance

Y Xie, X Wang, R Wang, H Zha - Uncertainty in Artificial …, 2020 - proceedings.mlr.press

Wasserstein distance plays increasingly important roles in machine learning, stochastic

programming and image processing. Major efforts have been under way to address its high

computational complexity, some leading to approximate or regularized variations such as …

  Cited by 48 Related articles All 5 versions 
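
For small discrete problems, the exact Wasserstein (optimal transport) cost that such methods target can be computed directly as a linear program over transport plans; the baseline sketch below uses scipy's LP solver and is not the paper's proximal point method.

import numpy as np
from scipy.optimize import linprog

def exact_ot_cost(a, b, M):
    # Exact discrete optimal-transport cost between histograms a and b with
    # ground-cost matrix M, written as a linear program over the transport plan.
    n, m = M.shape
    A_eq = np.zeros((n + m, n * m))
    for i in range(n):
        A_eq[i, i * m:(i + 1) * m] = 1.0   # row sums equal a
    for j in range(m):
        A_eq[n + j, j::m] = 1.0            # column sums equal b
    res = linprog(M.ravel(), A_eq=A_eq, b_eq=np.concatenate([a, b]),
                  bounds=(0, None), method="highs")
    return res.fun

a = np.array([0.5, 0.5]); b = np.array([0.25, 0.75])
M = np.abs(np.subtract.outer(np.arange(2.0), np.arange(2.0)))
print(exact_ot_cost(a, b, M))  # 0.25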

<——2020——2020———1440——  



[PDF] arxiv.org

The back-and-forth method for wasserstein gradient flows

M Jacobs, W Lee, F Léger - arXiv preprint arXiv:2011.08151, 2020 - arxiv.org

We present a method to efficiently compute Wasserstein gradient flows. Our approach is

based on a generalization of the back-and-forth method (BFM) introduced by Jacobs and

Léger to solve optimal transport problems. We evolve the gradient flow by solving the dual …

  Cited by 1 Related articles All 2 versions 


[PDF] mlr.press

Principled learning method for Wasserstein distributionally robust optimization with local perturbations

Y Kwon, W Kim, JH Won… - … Conference on Machine …, 2020 - proceedings.mlr.press

Wasserstein distributionally robust optimization (WDRO) attempts to learn a model that

minimizes the local worst-case risk in the vicinity of the empirical data distribution defined by

Wasserstein ball. While WDRO has received attention as a promising tool for inference since …

  Related articles All 5 versions 


Person-independent facial expression recognition method based on improved Wasserstein generative adversarial networks in combination with identity aware

C Xu, Y Cui, Y Zhang, P Gao, J Xu - Multimedia Systems, 2020 - Springer

Since the distinction between two expressions is fairly vague, usually a subtle change in one

part of the human face is enough to change a facial expression. Most of the existing facial

expression recognition algorithms are not robust enough because they rely on general facial …

  Cited by 4 Related articles


[PDF] arxiv.org

First-Order Methods for Wasserstein Distributionally Robust MDP

J Grand-Clement, C Kroer - arXiv preprint arXiv:2009.06790, 2020 - arxiv.org

Markov Decision Processes (MDPs) are known to be sensitive to parameter specification.

Distributionally robust MDPs alleviate this issue by allowing for ambiguity sets which give a

set of possible distributions over parameter sets. The goal is to find an optimal policy with …

  Cited by 1 Related articles All 3 versions 

 

[PDF] ieee.org

An Ensemble Wasserstein Generative Adversarial Network Method for Road Extraction From High Resolution Remote Sensing Images in Rural Areas

C Yang, Z Wang - IEEE Access, 2020 - ieeexplore.ieee.org

Road extraction from high resolution remote sensing (HR-RS) images is an important yet

challenging computer vision task. In this study, we propose an ensemble Wasserstein

Generative Adversarial Network with Gradient Penalty (WGAN-GP) method called E-WGAN …

  Cited by 2 Related articles All 2 versions



[PDF] arxiv.org

A material decomposition method for dual-energy CT via dual interactive Wasserstein generative adversarial networks

Z Shi, H Li, Q Cao, Z Wang, M Cheng - arXiv preprint arXiv:2007.11247, 2020 - arxiv.org

Dual-energy computed tomography has great potential in material characterization and

identification, whereas the reconstructed material-specific images always suffer from

magnified noise and beam hardening artifacts. In this study, a data-driven approach using …

  Related articles All 3 versions 


[PDF] ieee.org

A New Data-Driven Distributionally Robust Portfolio Optimization Method based on Wasserstein Ambiguity Set

N Du, Y Liu, Y Liu - IEEE Access, 2020 - ieeexplore.ieee.org

Since optimal portfolio strategy depends heavily on the distribution of uncertain returns, this

paper proposes a new method for the portfolio optimization problem with respect to

distribution uncertainty. When the distributional information of the uncertain return rate is …

  Related articles


A Riemannian Block Coordinate Descent Method for Computing the Projection Robust Wasserstein Distance

M Huang, S Ma, L Lai - arXiv preprint arXiv:2012.05199, 2020 - arxiv.org

The Wasserstein distance has become increasingly important in machine learning and deep

learning. Despite its popularity, the Wasserstein distance is hard to approximate because of

the curse of dimensionality. A recently proposed approach to alleviate the curse of …

  Related articles All 3 versions 


[PDF] arxiv.org

Stein's method for normal approximation in Wasserstein distances with application to the multivariate Central Limit Theorem

T Bonis - Probability Theory and Related Fields, 2020 - Springer

We use Stein's method to bound the Wasserstein distance of order 2 between a

measure\(\nu\) and the Gaussian measure using a stochastic process\((X_t) _ {t\ge 0}\) such

that\(X_t\) is drawn from\(\nu\) for any\(t> 0\). If the stochastic process\((X_t) _ {t\ge 0}\) …

  Cited by 6 Related articles All 3 versions


[PDF] iop.org

Data Augmentation Method for Switchgear Defect Samples Based on Wasserstein Generative Adversarial Network

X Huang, J Xiong, Y Zhang, J Liang… - Journal of Physics …, 2020 - iopscience.iop.org

The problem of sample imbalance will lead to poor generalization ability of the deep

learning model algorithm, and the phenomenon of overfitting during network training, which

limits the accuracy of intelligent fault diagnosis of switchgear equipment. In view of this, this …

  Related articles

<——2020——2020———1450——


A Super Resolution Method for Remote Sensing Images Based on Cascaded Conditional Wasserstein GANs

B Liu, H Li, Y Zhou, Y Peng, A Elazab… - 2020 IEEE 3rd …, 2020 - ieeexplore.ieee.org

High-resolution (HR) remote sensing imagery is quite beneficial for subsequent

interpretation. Obtaining HR images can be achieved by upgrading the imaging device. Yet,

the cost to perform this task is very huge. Thus, it is necessary to obtain HR images from low …

  Related articles


Data Augmentation Method for Fault Diagnosis of Mechanical Equipment Based on Improved Wasserstein GAN

W Liu, L Duan, Y Tang, J Yang - 2020 11th International …, 2020 - ieeexplore.ieee.org

Most of the time the mechanical equipment is in normal operation state, which results in high

imbalance between fault data and normal data. In addition, traditional signal processing

methods rely heavily on expert experience, making it difficult for classification or prediction …

  Related articles


System and method for unsupervised domain adaptation via sliced-wasserstein distance

AJ Gabourie, M Rostami, S Kolouri… - US Patent App. 16 …, 2020 - freepatentsonline.com

Described is a system for unsupervised domain adaptation in an autonomous learning

agent. The system adapts a learned model with a set of unlabeled data from a target

domain, resulting in an adapted model. The learned model was previously trained to …

  Cited by 2 Related articles All 2 versions 
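
The sliced Wasserstein distance that the patent application builds on is, in its generic form, an average of one-dimensional Wasserstein distances over random projection directions. A minimal Monte Carlo sketch under the assumption of two equal-size point clouds (not the patented system itself):

import numpy as np

def sliced_wasserstein2(X, Y, n_projections=128, seed=0):
    # Monte Carlo sliced 2-Wasserstein distance between two d-dimensional
    # point clouds with the same number of points.
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_projections):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)
        x_proj, y_proj = np.sort(X @ theta), np.sort(Y @ theta)
        total += np.mean((x_proj - y_proj) ** 2)
    return np.sqrt(total / n_projections)

print(sliced_wasserstein2(np.random.randn(500, 3), np.random.randn(500, 3) + 1.0))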


Semi-supervised Surface Wave Tomography with Wasserstein Cycle-consistent GAN: Method and Application on Southern California Plate Boundary Region

A Cai, H Qiu, F Niu - 2020 - essoar.org

Machine learning algorithm is applied to shear wave velocity (Vs) inversion in surface wave

tomography, where a set of 1-D Vs profiles and the corresponding synthetic dispersion

curves are used in network training. Previous studies showed that performances of a trained …


2020

Remaining useful life prediction of lithium-ion batteries using a fusion method based on Wasserstein GAN

周温丁, 鲍士兼, 许方敏, 赵成林 - The Journal of China Universities of Posts and Telecommunications (English edition), 2020 - jcupt.bupt.edu.cn

Lithium-ion batteries are the main power supply equipment in many fields due to their

advantages of no memory, high energy density, long cycle life and no pollution to the

environment. Accurate prediction for the remaining useful life (RUL) of lithium-ion batteries …

  All 2 versions 


 

[PDF] bciml.cn

Wasserstein distance based deep adversarial transfer learning for intelligent fault diagnosis with unlabeled or insufficient labeled data

C Cheng, B Zhou, G Ma, D Wu, Y Yuan - Neurocomputing, 2020 - Elsevier

… and unsupervised learning is required. Inspired by Wasserstein distance of optimal transport,

in this paper, we propose a novel Wasserstein Distance-based Deep Transfer Learning (…

 Cited by 70 Related articles All 3 versions


 2020

Drift compensation algorithm based on Time-Wasserstein dynamic distribution alignment

Y Tao, K Zeng, Z Liang - 2020 IEEE/CIC International …, 2020 - ieeexplore.ieee.org

The electronic nose (E-nose) is mainly used to detect different types and concentrations of

gases. At present, the average life of E-nose is relatively short, mainly due to the drift of the

sensor resulting in a decrease in the effect. Therefore, it is the focus of research in this field …

  Related articles

2020  [PDF] arxiv.org

Characterization of probability distribution convergence in Wasserstein distance by L^p-quantization error function

Y Liu, G Pagès - Bernoulli, 2020 - projecteuclid.org

We establish conditions to characterize probability measures by their $ L^{p} $-quantization

error functions in both $\mathbb {R}^{d} $ and Hilbert settings. This characterization is two-

fold: static (identity of two distributions) and dynamic (convergence for the $ L^{p} …

  Cited by 1 Related articles All 5 versions


2020

[PDF] arxiv.org

Reinforced wasserstein training for severity-aware semantic segmentation in autonomous driving

X Liu, Y Zhang, X Liu, S Bai, S Li, J You - arXiv preprint arXiv:2008.04751, 2020 - arxiv.org

Semantic segmentation is important for many real-world systems, eg, autonomous vehicles,

which predict the class of each pixel. Recently, deep networks achieved significant progress

wrt the mean Intersection-over Union (mIoU) with the cross-entropy loss. However, the cross …

  Cited by 1 Related articles All 3 versions 

2020  [PDF] arxiv.org

Convergence rate to equilibrium in Wasserstein distance for reflected jump–diffusions

A Sarantsev - Statistics & Probability Letters, 2020 - Elsevier

Convergence rate to the stationary distribution for continuous-time Markov processes can be

studied using Lyapunov functions. Recent work by the author provided explicit rates of

convergence in special case of a reflected jump–diffusion on a half-line. These results are …

  Related articles All 7 versions

<——2020——2020———1460——



2020

[PDF] arxiv.org

Wasserstein Convergence Rate for Empirical Measures on Noncompact Manifolds

FY Wang - arXiv preprint arXiv:2007.14667, 2020 - arxiv.org

Let $ X_t $ be the (reflecting) diffusion process generated by $ L:=\Delta+\nabla V $ on a

complete connected Riemannian manifold $ M $ possibly with a boundary $\partial M $,

where $ V\in C^ 1 (M) $ such that $\mu (dx):= e^{V (x)} dx $ is a probability measure. We …

  Related articles All 2 versions 


 

2020

[PDF] thecvf.com

Severity-aware semantic segmentation with reinforced wasserstein training

X Liu, W Ji, J You, GE Fakhri… - Proceedings of the IEEE …, 2020 - openaccess.thecvf.com

Semantic segmentation is a class of methods to classify each pixel in an image into

semantic classes, which is critical for autonomous vehicles and surgery systems. Cross-

entropy (CE) loss-based deep neural networks (DNN) achieved great success wrt the …

  Cited by 10 Related articles All 5 versions 


2020

A Wasserstein based two-stage distributionally robust optimization model for optimal operation of CCHP micro-grid under uncertainties

Y Wang, Y Yang, L Tang, W Sun, B Li - International Journal of Electrical …, 2020 - Elsevier

Combined cooling, heating and power (CCHP) micro-grids are getting increasing attentions

due to the realization of cleaner production and high energy efficiency. However, with the

features of complex tri-generation structure and renewable power uncertainties, it is …

  Cited by 19 Related articles All 2 versions


2020 [PDF] researchgate.net

Tractable reformulations of two-stage distributionally robust linear programs over the type-∞ wasserstein ball

W Xie - Operations Research Letters, 2020 - Elsevier

This paper studies a two-stage distributionally robust stochastic linear program under the

type-∞ Wasserstein ball by providing sufficient conditions under which the program can be …

  Cited by 12 Related articles All 4 versions


2020

Limit distribution theory for smooth Wasserstein distance with applications to generative modeling

Z Goldfeld, K Kato - arXiv preprint arXiv:2002.01012, 2020 - arxiv.org

The 1-Wasserstein distance ($\mathsf {W} _1 $) is a popular proximity measure between

probability distributions. Its metric structure, robustness to support mismatch, and rich

geometric structure fueled its wide adoption for machine learning tasks. Such tasks …

  Cited by 2 Related articles All 2 versions 
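
The smooth (Gaussian-smoothed) Wasserstein distance studied here compares the two distributions after convolving both with an isotropic Gaussian. In one dimension this can be estimated by adding independent Gaussian noise to both samples and computing the usual empirical W1; the sketch below is a Monte Carlo illustration, not the paper's limit theory.

import numpy as np
from scipy.stats import wasserstein_distance

def smooth_w1_1d(x, y, sigma=0.5, seed=0):
    # Monte Carlo estimate of W1(mu * N(0, sigma^2), nu * N(0, sigma^2)) in 1D,
    # using samples convolved with Gaussian noise.
    rng = np.random.default_rng(seed)
    return wasserstein_distance(x + sigma * rng.normal(size=x.size),
                                y + sigma * rng.normal(size=y.size))

print(smooth_w1_1d(np.random.randn(2000), np.random.randn(2000) + 1.0, sigma=0.5))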


2020

[PDF] arxiv.org

Joint Wasserstein Distribution Matching

JZ Cao, L Mo, Q Du, Y Guo, P Zhao, J Huang… - arXiv preprint arXiv …, 2020 - arxiv.org

Joint distribution matching (JDM) problem, which aims to learn bidirectional mappings to

match joint distributions of two domains, occurs in many machine learning and computer

vision applications. This problem, however, is very difficult due to two critical challenges:(i) it …

  Related articles All 2 versions 


 2020

[PDF] arxiv.org

Irregularity of distribution in Wasserstein distance

C Graham - Journal of Fourier Analysis and Applications, 2020 - Springer

We study the non-uniformity of probability measures on the interval and circle. On the

interval, we identify the Wasserstein-p distance with the classical\(L^ p\)-discrepancy. We

thereby derive sharp estimates in Wasserstein distances for the irregularity of distribution of …

  Cited by 2 Related articles All 3 versions


2020

[PDF] arxiv.org

Regularizing activations in neural networks via distribution matching with the Wasserstein metric

T Joo, D Kang, B Kim - arXiv preprint arXiv:2002.05366, 2020 - arxiv.org

Regularization and normalization have become indispensable components in training deep

neural networks, resulting in faster training and improved generalization performance. We

propose the projected error function regularization loss (PER) that encourages activations to …

  Cited by 3 Related articles All 5 versions 


 2020

[PDF] arxiv.org

Modeling EEG data distribution with a Wasserstein Generative Adversarial Network to predict RSVP Events

S Panwar, P Rad, TP Jung… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

Electroencephalography (EEG) data are difficult to obtain due to complex experimental

setups and reduced comfort with prolonged wearing. This poses challenges to train powerful

deep learning model with the limited EEG data. Being able to generate EEG data …

  Cited by 1 Related articles All 5 versions


2020

Trajectories from Distribution-Valued Functional Curves: A Unified Wasserstein Framework

A Sharma, G Gerig - … Conference on Medical Image Computing and …, 2020 - Springer

Temporal changes in medical images are often evaluated along a parametrized function that

represents a structure of interest (eg white matter tracts). By attributing samples along these

functions with distributions of image properties in the local neighborhood, we create …

  Related articles All 2 versions

<——2020——2020———1470—— 

 


[PDF] arxiv.org

Multivariate goodness-of-Fit tests based on Wasserstein distance

M Hallin, G Mordant, J Segers - arXiv preprint arXiv:2003.06684, 2020 - arxiv.org

Goodness-of-fit tests based on the empirical Wasserstein distance are proposed for simple

and composite null hypotheses involving general multivariate distributions. This includes the

important problem of testing for multivariate normality with unspecified mean vector and …

  Cited by 4 Related articles All 10 versions 
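
As a one-dimensional analogue of such tests (the paper itself treats general multivariate distributions), a two-sample permutation test can use the empirical Wasserstein-1 distance as its statistic; the helper below is an illustrative sketch, not the authors' procedure.

import numpy as np
from scipy.stats import wasserstein_distance

def w1_permutation_test(x, y, n_perm=999, seed=0):
    # Two-sample permutation test with the 1D Wasserstein-1 distance as statistic.
    rng = np.random.default_rng(seed)
    observed = wasserstein_distance(x, y)
    pooled = np.concatenate([x, y])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if wasserstein_distance(pooled[:len(x)], pooled[len(x):]) >= observed:
            count += 1
    return observed, (count + 1) / (n_perm + 1)

print(w1_permutation_test(np.random.randn(200), np.random.randn(200) + 0.3))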


Reconstruction of shale image based on Wasserstein Generative Adversarial Networks with gradient penalty

W Zha, X Li, Y Xing, L He, D Li - Advances in Geo-Energy …, 2020 - yandy-ager.com

Abstract Generative Adversarial Networks (GANs), as most popular artificial intelligence

models in the current image generation field, have excellent image generation capabilities.

Based on Wasserstein GANs with gradient penalty, this paper proposes a novel digital core …

  Cited by 7 Related articles 


Generative adversarial networks based on Wasserstein distance for knowledge graph embeddings

Y Dai, S Wang, X Chen, C Xu, W Guo - Knowledge-Based Systems, 2020 - Elsevier

Abstract Knowledge graph embedding aims to project entities and relations into low-

dimensional and continuous semantic feature spaces, which has captured more attention in

recent years. Most of the existing models roughly construct negative samples via a uniformly …

  Cited by 4 Related articles All 2 versions


 

[PDF] arxiv.org

Fast and Smooth Interpolation on Wasserstein Space

S Chewi, J Clancy, TL Gouic, P Rigollet… - arXiv preprint arXiv …, 2020 - arxiv.org

We propose a new method for smoothly interpolating probability measures using the

geometry of optimal transport. To that end, we reduce this problem to the classical Euclidean

setting, allowing us to directly leverage the extensive toolbox of spline interpolation. Unlike …

  Related articles All 2 versions 




[PDF] ieee.org

A New Data-Driven Distributionally Robust Portfolio Optimization Method based on Wasserstein Ambiguity Set

N Du, Y Liu, Y Liu - IEEE Access, 2020 - ieeexplore.ieee.org

Since optimal portfolio strategy depends heavily on the distribution of uncertain returns, this

paper proposes a new method for the portfolio optimization problem with respect to

distribution uncertainty. When the distributional information of the uncertain return rate is …

  Related articles


An Improvement based on Wasserstein GAN for Alleviating Mode Collapsing

Y Chen, X Hou - 2020 International Joint Conference on Neural …, 2020 - ieeexplore.ieee.org

In the past few years, Generative Adversarial Networks as a deep generative model has

received more and more attention. Mode collapsing is one of the challenges in the study of

Generative Adversarial Networks. In order to solve this problem, we deduce a new algorithm …

  Related articles


[PDF] iop.org

Data Augmentation Based on Wasserstein Generative Adversarial Nets Under Few Samples

Y Jiang, B Zhu, Q Ma - IOP Conference Series: Materials Science …, 2020 - iopscience.iop.org

Aiming at the problem of low accuracy of image classification under the condition of few

samples, an improved method based on Wasserstein Generative Adversarial Nets is

proposed. The small data sets are augmented by generating target samples through …

  Cited by 1 Related articles All 2 versions


[PDF] iop.org

A collaborative filtering recommendation framework based on Wasserstein GAN

R Li, F Qian, X Du, S Zhao… - Journal of Physics …, 2020 - iopscience.iop.org

Compared with the original GAN, Wasserstein GAN minimizes the Wasserstein Distance

between the generative distribution and the real distribution, can well capture the potential

distribution of data and has achieved excellent results in image generation. However, the …

  Related articles


On nonexpansiveness of metric projection operators on Wasserstein spaces

A Adve, A Mészáros - arXiv preprint arXiv:2009.01370, 2020 - arxiv.org

In this note we investigate properties of metric projection operators onto closed and

geodesically convex proper subsets of Wasserstein spaces $(\mathcal {P} _p (\mathbf {R}^

d), W_p). $ In our study we focus on the particular subset of probability measures having …

  Related articles All 3 versions 

<——2020——2020———1480——



Covariance Steering of Discrete-Time Stochastic Linear Systems Based on Wasserstein Distance Terminal Cost

IM Balci, E Bakolas - IEEE Control Systems Letters, 2020 - ieeexplore.ieee.org

We consider a class of stochastic optimal control problems for discrete-time linear systems

whose objective is the characterization of control policies that will steer the probability

distribution of the terminal state of the system close to a desired Gaussian distribution. In our …

  Related articles


[PDF] arxiv.org

LCS Graph Kernel Based on Wasserstein Distance in Longest Common Subsequence Metric Space

J Huang, Z Fang, H Kasai - arXiv preprint arXiv:2012.03612, 2020 - arxiv.org

For graph classification tasks, many methods use a common strategy to aggregate

information of vertex neighbors. Although this strategy provides an efficient means of

extracting graph topological features, it brings excessive amounts of information that might …

  Related articles All 2 versions 


[PDF] sciencedirect.com

Intelligent Fault Diagnosis with a Deep Transfer Network based on Wasserstein Distance

J Xu, J Huang, Y Zhao, L Zhou - Procedia Computer Science, 2020 - Elsevier

Intelligent fault-diagnosis methods based on deep-learning technology have been very

successful for complex industrial systems. The deep learning based fault classification

model requires a large number of labeled data. Moreover, the probability distribution of …

  Related articles


[PDF] uni-bonn.de

Diffusions on Wasserstein Spaces

L Dello Schiavo - 2020 - bonndoc.ulb.uni-bonn.de

We construct a canonical diffusion process on the space of probability measures over a

closed Riemannian manifold, with invariant measure the Dirichlet–Ferguson measure.

Together with a brief survey of the relevant literature, we collect several tools from the theory …

  Related articles All 3 versions 


[PDF] iop.org

Data Augmentation Method for Switchgear Defect Samples Based on Wasserstein Generative Adversarial Network

X Huang, J Xiong, Y Zhang, J Liang… - Journal of Physics …, 2020 - iopscience.iop.org

The problem of sample imbalance will lead to poor generalization ability of the deep

learning model algorithm, and the phenomenon of overfitting during network training, which

limits the accuracy of intelligent fault diagnosis of switchgear equipment. In view of this, this …

  Related articles


2020


Stereoscopic image reflection removal based on Wasserstein Generative Adversarial Network

X Wang, Y Pan, DPK Lun - 2020 IEEE International …, 2020 - ieeexplore.ieee.org

Reflection removal is a long-standing problem in computer vision. In this paper, we consider

the reflection removal problem for stereoscopic images. By exploiting the depth information

of stereoscopic images, a new background edge estimation algorithm based on the …

  Related articles All 2 versions


Data Augmentation Method for Switchgear Defect Samples Based on Wasserstein Generative Adversarial Network

X Huang, J Xiong, Y Zhang, J Liang… - Journal of Physics …, 2020 - iopscience.iop.org

… in Table 1. Table 1. Model performance comparison under different augmented data … diagnosis

of switchgear, this paper proposes an augmentation method of defect samples … and Efficient

Processing of Distribution Equipment Condition Detection Data, No.082100KK52190004 …

  Related articles

[CITATION] Data augmentation method for power transformer fault diagnosis based on conditional Wasserstein generative adversarial network

YP Liu, Z Xu, J He, Q Wang, SG Gao, J Zhao - Power System Technology, 2020

  Cited by 2


[PDF] arxiv.org

Geometric Characteristics of Wasserstein Metric on SPD (n)

Y Luo, S Zhang, Y Cao, H Sun - arXiv preprint arXiv:2012.07106, 2020 - arxiv.org


  Related articles All 2 versions 


Data Augmentation Method for Fault Diagnosis of Mechanical Equipment Based on Improved Wasserstein GAN

W Liu, L Duan, Y Tang, J Yang - 2020 11th International …, 2020 - ieeexplore.ieee.org

… results. In view of the above problem, this paper proposed a method to augment

failure data for mechanical equipment diagnosis based on Wasserstein generative

adversarial networks with gradient penalty (WGAN-GP). First …

  Related articles


When ot meets mom: Robust estimation of wasserstein distance

G Staerman, P Laforgue, P Mozharovskyi… - arXiv preprint arXiv …, 2020 - arxiv.org

Issued from Optimal Transport, the Wasserstein distance has gained importance in Machine

Learning due to its appealing geometrical properties and the increasing availability of

efficient approximations. In this work, we consider the problem of estimating the Wasserstein …

  Cited by 2 Related articles All 4 versions 

<——2020——2020———1490—— 


[PDF] core.ac.uk

On the computation of Wasserstein barycenters

G Puccetti, L Rüschendorf, S Vanduffel - Journal of Multivariate Analysis, 2020 - Elsevier

The Wasserstein barycenter is an important notion in the analysis of high dimensional data

with a broad range of applications in applied probability, economics, statistics, and in

particular to clustering and image processing. In this paper, we state a general version of the …

  Cited by 7 Related articles All 9 versions
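
In the one-dimensional special case, the 2-Wasserstein barycenter has a closed form: its quantile function is the weighted average of the input quantile functions. A minimal sketch of that special case (the paper addresses the much harder general setting):

import numpy as np

def barycenter_quantiles_1d(samples, weights=None, grid=np.linspace(0.01, 0.99, 99)):
    # Quantiles of the 2-Wasserstein barycenter of 1D samples: the weighted
    # average of the input quantile functions.
    if weights is None:
        weights = np.full(len(samples), 1.0 / len(samples))
    quantiles = np.stack([np.quantile(s, grid) for s in samples])
    return np.asarray(weights) @ quantiles

samples = [np.random.randn(1000), np.random.randn(1000) + 3.0]
print(barycenter_quantiles_1d(samples)[:5])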


[PDF] arxiv.org

Scalable computations of wasserstein barycenter via input convex neural networks

J Fan, A Taghvaei, Y Chen - arXiv preprint arXiv:2007.04462, 2020 - arxiv.org

Wasserstein Barycenter is a principled approach to represent the weighted mean of a given

set of probability distributions, utilizing the geometry induced by optimal transport. In this

work, we present a novel scalable algorithm to approximate the Wasserstein Barycenters …

  Cited by 1 Related articles All 3 versions 


[PDF] ams.org

Isometric study of Wasserstein spaces–the real line

G Gehér, T Titkos, D Virosztek - Transactions of the American Mathematical …, 2020 - ams.org

Recently Kloeckner described the structure of the isometry group of the quadratic

Wasserstein space $\mathcal {W} _2 (\mathbb {R}^ n) $. It turned out that the case of the real

line is exceptional in the sense that there exists an exotic isometry flow. Following this line of …

  Cited by 2 Related articles All 8 versions


[PDF] neurips.cc

[PDF] Ratio Trace Formulation of Wasserstein Discriminant Analysis

H Liu, Y Cai, YL Chen, P Li - Advances in Neural …, 2020 - proceedings.neurips.cc

Abstract: We reformulate the Wasserstein Discriminant Analysis (WDA) as a ratio trace

problem and present an eigensolver-based algorithm to compute the discriminative

subspace of WDA. This new formulation, along with the proposed algorithm, can be served …

  Related articles All 3 versions 

 

2020

[PDF] arxiv.org


Entropic regularization of Wasserstein distance between infinite-dimensional Gaussian measures and Gaussian processes

MH Quang - arXiv preprint arXiv:2011.07489, 2020 - arxiv.org

This work studies the entropic regularization formulation of the 2-Wasserstein distance on an

infinite-dimensional Hilbert space, in particular for the Gaussian setting. We first present the

Minimum Mutual Information property, namely the joint measures of two Gaussian measures …

  Cited by 2 Related articles All 2 versions 


Finite-Horizon Control of Nonlinear Discrete-Time Systems with Terminal Cost of Wasserstein Distance

K Hoshino - 2020 59th IEEE Conference on Decision and …, 2020 - ieeexplore.ieee.org

This study explores a finite-horizon optimal control problem of nonlinear discrete-time

systems for steering a probability distribution of initial states as close as possible to a

desired probability distribution of terminal states. The problem is formulated as an optimal …

  Cited by 1 Related articles


[PDF] aaai.org

Improving the Robustness of Wasserstein Embedding by Adversarial PAC-Bayesian Learning

D Ding, M Zhang, X Pan, M Yang, X He - Proceedings of the AAAI …, 2020 - ojs.aaai.org

Node embedding is a crucial task in graph analysis. Recently, several methods are

proposed to embed a node as a distribution rather than a vector to capture more information.

Although these methods achieved noticeable improvements, their extra complexity brings …

  Related articles All 3 versions 


CVaR-Based Approximations of Wasserstein Distributionally Robust Chance Constraints with Application to Process Scheduling

B Liu, Q Zhang, X Ge, Z Yuan - Industrial & Engineering Chemistry …, 2020 - ACS Publications

Distributionally robust chance constrained programming is a stochastic optimization

approach that considers uncertainty in model parameters as well as uncertainty in the

underlying probability distribution. It ensures a specified probability of constraint satisfaction …

  Related articles All 4 versions


[PDF] upenn.edu

Optimal Estimation of Wasserstein Distance on a Tree With an Application to Microbiome Studies

S Wang, TT CaiH Li - Journal of the American Statistical …, 2020 - Taylor & Francis

The weighted UniFrac distance, a plug-in estimator of the Wasserstein distance of read

counts on a tree, has been widely used to measure the microbial community difference in

microbiome studies. Our investigation however shows that such a plug-in estimator …

  Related articles All 4 versions

<——2020——2020———1500——


[PDF] arxiv.org

TPFA Finite Volume Approximation of Wasserstein Gradient Flows

A Natale, G Todeschi - International Conference on Finite Volumes for …, 2020 - Springer

Numerous infinite dimensional dynamical systems arising in different fields have been

shown to exhibit a gradient flow structure in the Wasserstein space. We construct Two Point

Flux Approximation Finite Volume schemes discretizing such problems which preserve the …

  Cited by 2 Related articles All 6 versions


[PDF] arxiv.org

Universal consistency of Wasserstein k-NN classifier

D Ponnoprat - arXiv preprint arXiv:2009.04651, 2020 - arxiv.org

The Wasserstein distance provides a notion of dissimilarities between probability measures,

which has recent applications in learning of structured data with varying size such as images

and text documents. In this work, we analyze the $ k $-nearest neighbor classifier ($ k $-NN) …

  Related articles All 2 versions 


 

[PDF] arxiv.org

On a Novel Application of Wasserstein-Procrustes for Unsupervised Cross-Lingual Learning

G Ramírez, R Dangovski, P Nakov… - arXiv preprint arXiv …, 2020 - arxiv.org

The emergence of unsupervised word embeddings, pre-trained on very large monolingual

text corpora, is at the core of the ongoing neural revolution in Natural Language Processing

(NLP). Initially introduced for English, such pre-trained word embeddings quickly emerged …

  Related articles All 3 versions 


 

Time Discretizations of Wasserstein-Hamiltonian Flows

J Cui, L Dieci, H Zhou - arXiv preprint arXiv:2006.09187, 2020 - arxiv.org

We study discretizations of Hamiltonian systems on the probability density manifold

equipped with the $ L^ 2$-Wasserstein metric. Based on discrete optimal transport theory,

several Hamiltonian systems on graph (lattice) with different weights are derived, which can …

  Cited by 1 Related articles All 3 versions 


On the Computation of Kantorovich--Wasserstein Distances Between Two-Dimensional Histograms by Uncapacitated Minimum Cost Flows

F Bassetti, S Gualandi, M Veneroni - SIAM Journal on Optimization, 2020 - SIAM

… of fit between distributions [38, 53] as well as an alternative to the usual g-divergences as cost

function in minimum distance point estimation problems [7, 8]. It was used to compare two-

dimensional (2D) histograms, but only considering the 1-norm as a cost structure …

  Cited by 1 Related articles All 2 versions


2020

 

[PDF] arxiv.org

Hierarchical Low-Rank Approximation of Regularized Wasserstein distance

M Motamed - arXiv preprint arXiv:2004.12511, 2020 - arxiv.org

… The application of Wasserstein metric may however be limited to cases where the probability

measures are supported on low-dimensional spaces, because its numerical computation can

quickly become prohibitive as the dimension increases; see eg [13] …

  Related articles All 3 versions 


Isometries of Wasserstein spaces

GP Gehér, T Titkos, D Virosztek - halgebra.math.msu.su

Due to its nice theoretical properties and an astonishing number of applications via optimal

transport problems, probably the most intensively studied metric nowadays is the p-

Wasserstein metric. Given a complete and separable metric space X and a real number p≥ …

  Related articles 

 

A new Wasserstein distance- and cumulative sum-dependent health indicator and its application in prediction of remaining useful life of bearing

J Yin, M Xu, H Zheng, Y Yang - Journal of the Brazilian Society of …, 2020 - Springer

… 5, the conclusion of this paper is drawn. Theory Background. The theory of Wasserstein distance.

The Wasserstein distance (WD) is a similarity measurement method of the distance between

two distributions, and its essence is to measure the distance for weighted point sets …

  Related articles


[PDF] researchgate.net

[PDF] ADDENDUM TO "ISOMETRIC STUDY OF WASSERSTEIN SPACES – THE REAL LINE"

GP GEHÉR, T TITKOS, D VIROSZTEK - researchgate.net

We show an example of a Polish metric space X whose quadratic Wasserstein space W2 (X)

possesses an isometry that splits mass. This gives an affirmative answer to Kloeckner's

question,[2, Question 2]. Let us denote the metric space ([0, 1],|·|), equipped with the usual …

  Related articles 


[PDF] arxiv.org

Consistency of Distributionally Robust Risk-and Chance-Constrained Optimization Under Wasserstein Ambiguity Sets

A Cherukuri, AR Hota - IEEE Control Systems Letters, 2020 - ieeexplore.ieee.org

… Motivated by these attractive features, several recent works have proposed approximations

and finite-dimensional reformulations of Wasserstein distributionally robust

chance and CVaR constrained programs [13]–[16] …

Cited by 4 Related articles All 4 versions

<——2020——2020———1510—— 


2020

Regularized Wasserstein Means for Aligning Distributional Data.

By: Mi, Liang; Zhang, Wen; Wang, Yalin

Proceedings of the ... AAAI Conference on Artificial Intelligence. AAAI Conference on Artificial Intelligence  Volume: 34   Issue: 4   Pages: 5166-5173   Published: 2020 (Epub 2020 Apr 03)

 



2020

Sampling of probability measures in the convex order by Wasserstein projection

A Alfonsi, J Corbetta, B Jourdain - Annales de l'Institut Henri …, 2020 - projecteuclid.org

In this paper, for $\mu $ and $\nu $ two probability measures on $\mathbb {R}^{d} $ with

finite moments of order $\varrho\ge 1$, we define the respective projections for the $ W_

{\varrho} $-Wasserstein distance of $\mu $ and $\nu $ on the sets of probability measures …

Cited by 26 Related articles All 9 versions

[CITATION] Sampling of probability measures in the convex order by Wasserstein projection. arXiv e-prints, page

Cited by 26 Related articles All 11 versions

2020

[PDF] arxiv.org

Projection robust Wasserstein distance and Riemannian optimization

T Lin, C Fan, N Ho, M Cuturi, MI Jordan - arXiv preprint arXiv:2006.07458, 2020 - arxiv.org

Projection robust Wasserstein (PRW) distance, or Wasserstein projection pursuit (WPP), is a

robust variant of the Wasserstein distance. Recent work suggests that this quantity is more

robust than the standard Wasserstein distance, in particular when comparing probability …

  Cited by 2 Related articles All 6 versions 


2020

[PDF] arxiv.org

A new approach to posterior contraction rates via Wasserstein dynamics

E Dolera, S Favaro, E Mainini - arXiv preprint arXiv:2011.14425, 2020 - arxiv.org

This paper presents a new approach to the classical problem of quantifying posterior

contraction rates (PCRs) in Bayesian statistics. Our approach relies on Wasserstein

distance, and it leads to two main contributions which improve on the existing literature of …

  Cited by 1 Related articles All 2 versions 


2020

[PDF] arxiv.org

Exponential contraction in Wasserstein distance on static and evolving manifolds

LJ Cheng, A Thalmaier, SQ Zhang - arXiv preprint arXiv:2001.06187, 2020 - arxiv.org

In this article, exponential contraction in Wasserstein distance for heat semigroups of

diffusion processes on Riemannian manifolds is established under curvature conditions

where Ricci curvature is not necessarily required to be non-negative. Compared to the …

  Cited by 2 Related articles All 5 versions 


2020

[PDF] arxiv.org

Fast Epigraphical Projection-based Incremental Algorithms for Wasserstein Distributionally Robust Support Vector Machine

J Li, C Chen, AMC So - arXiv preprint arXiv:2010.12865, 2020 - arxiv.org

Wasserstein Distributionally Robust Optimization (DRO) is

concerned with finding decisions that perform well on data that are drawn from the worst-

case probability distribution within a Wasserstein ball centered at a certain nominal …

  Related articles All 5 versions 


2020

[PDF] arxiv.org

Wasserstein k-means with sparse simplex projection

T Fukunaga, H Kasai - arXiv preprint arXiv:2011.12542, 2020 - arxiv.org

This paper presents a proposal of a faster Wasserstein $ k $-means algorithm for histogram

data by reducing Wasserstein distance computations and exploiting sparse simplex

projection. We shrink data samples, centroids, and the ground cost matrix, which leads to …

  Related articles All 2 versions 
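
A standard building block for simplex-constrained clustering of histograms is the Euclidean projection onto the probability simplex; the sketch below shows that classical projection only (the paper's contribution is a sparse variant combined with Wasserstein k-means, which is not reproduced here).

import numpy as np

def project_to_simplex(v):
    # Euclidean projection of v onto {w : w >= 0, sum(w) = 1} via the
    # classical sort-and-threshold algorithm.
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u + (1 - css) / (np.arange(v.size) + 1) > 0)[0][-1]
    theta = (1 - css[rho]) / (rho + 1)
    return np.maximum(v + theta, 0)

print(project_to_simplex(np.array([0.4, 1.2, -0.3])))  # -> [0.1, 0.9, 0.0]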


 2020

Exponential contraction in Wasserstein distances for diffusion semigroups with negative curvature

FY Wang - Potential Analysis, 2020 - Springer

Let P t be the (Neumann) diffusion semigroup P t generated by a weighted Laplacian on a

complete connected Riemannian manifold M without boundary or with a convex boundary. It

is well known that the Bakry-Emery curvature is bounded below by a positive constant> 0 …

  Cited by 4 Related articles


2020

[PDF] arxiv.org

On nonexpansiveness of metric projection operators on Wasserstein spaces

A Adve, A Mészáros - arXiv preprint arXiv:2009.01370, 2020 - arxiv.org

In this note we investigate properties of metric projection operators onto closed and

geodesically convex proper subsets of Wasserstein spaces $(\mathcal {P} _p (\mathbf {R}^

d), W_p). $ In our study we focus on the particular subset of probability measures having …

  Related articles All 3 versions 


<——2020——2020———1520——

[PDF] arxiv.org

 FRWCAE: joint faster-RCNN and Wasserstein convolutional auto-encoder for instance retrieval

Y Zhang, Y Feng, D Liu, J Shang, B Qiang - Applied Intelligence, 2020 - Springer

Based on the powerful feature extraction capability of deep convolutional neural networks,

image-level retrieval methods have achieved superior performance compared to the hand-

crafted features and indexing algorithms. However, people tend to focus on foreground …

  Related articles


WGAN domain adaptation for the joint optic disc-and-cup segmentation in fundus images

S Kadambi, Z Wang, E Xing - … Journal of Computer Assisted Radiology and …, 2020 - Springer

Purpose The cup-to-disc ratio (CDR), a clinical metric of the relative size of the optic cup to

the optic disc, is a key indicator of glaucoma, a chronic eye disease leading to loss of vision.

CDR can be measured from fundus images through the segmentation of optic disc and optic …

  Cited by 1 Related articles All 3 versions


Evaluating the Performance of Climate Models Based on Wasserstein Distance

Authors: Gabriele Vissio, Valerio Lembo, Valerio Lucarini, Michael Ghil
Article, 2020
Publication: Geophysical Research Letters, 47, 2020
Publisher: 2020

Considering anatomical prior information for low-dose CT image enhancement using attribute-augmented Wasserstein generative adversarial networks

Z Huang, X Liu, R Wang, J Chen, P Lu, Q Zhang… - Neurocomputing, 2021 - Elsevier

Currently, many deep learning (DL)-based low-dose CT image postprocessing technologies fail to consider the anatomical differences in training data among different human body sites, such as the cranium, lung and pelvis. In addition, we can observe evident anatomical …

  Related articles

 

[PDF] arxiv.org

Distributed Wasserstein Barycenters via Displacement Interpolation

P Cisneros-Velarde, F Bullo - arXiv preprint arXiv:2012.08610, 2020 - arxiv.org

Consider a multi-agent system whereby each agent has an initial probability measure. In this 

paper, we propose a distributed algorithm based upon stochastic, asynchronous and 

pairwise exchange of information and displacement interpolation in the Wasserstein space …

Related articles All 2 versions 


2020

Social-wagdat: Interaction-aware trajectory prediction via wasserstein graph double-attention network

J Li, H Ma, Z Zhang, M Tomizuka - arXiv preprint arXiv:2002.06241, 2020 - arxiv.org

Effective understanding of the environment and accurate trajectory prediction of surrounding

dynamic obstacles are indispensable for intelligent mobile systems (like autonomous

vehicles and social robots) to achieve safe and high-quality planning when they navigate in …

  Cited by 12 Related articles All 3 versions 


2020  [PDF] arxiv.org

Wasserstein-based graph alignment

HP Maretic, ME Gheche, M Minder, G Chierchia… - arXiv preprint arXiv …, 2020 - arxiv.org

We propose a novel method for comparing non-aligned graphs of different sizes, based on

the Wasserstein distance between graph signal distributions induced by the respective

graph Laplacian matrices. Specifically, we cast a new formulation for the one-to-many graph  …

  Cited by 5 Related articles All 2 versions 


2020[PDF] aaai.org

Gromov-wasserstein factorization models for graph clustering

H Xu - Proceedings of the AAAI Conference on Artificial …, 2020 - ojs.aaai.org

We propose a new nonlinear factorization model for graphs that are with topological

structures, and optionally, node attributes. This model is based on a pseudometric called

Gromov-Wasserstein (GW) discrepancy, which compares graphs in a relational way. It …

  Cited by 3 Related articles All 5 versions 


2020  [PDF] arxiv.org

node2coords: Graph representation learning with wasserstein barycenters

E Simou, D Thanou, P Frossard - IEEE Transactions on Signal …, 2020 - ieeexplore.ieee.org

In order to perform network analysis tasks, representations that capture the most relevant

information in the graph structure are needed. However, existing methods do not learn

representations that can be interpreted in a straightforward way and that are stable to …

  Cited by 1 Related articles All 3 versions


2020

[PDF] arxiv.org

Wasserstein Embedding for Graph Learning

S Kolouri, N Naderializadeh, GK Rohde… - arXiv preprint arXiv …, 2020 - arxiv.org

We present Wasserstein Embedding for Graph Learning (WEGL), a novel and fast

framework for embedding entire graphs in a vector space, in which various machine

learning models are applicable for graph-level prediction tasks. We leverage new insights …

  Cited by 1 Related articles All 3 versions 


<——2020——2020———1530—— 



2020

Generative adversarial networks based on Wasserstein distance for knowledge graph embeddings

Y Dai, S Wang, X Chen, C Xu, W Guo - Knowledge-Based Systems, 2020 - Elsevier

Abstract Knowledge graph embedding aims to project entities and relations into low-

dimensional and continuous semantic feature spaces, which has captured more attention in

recent years. Most of the existing models roughly construct negative samples via a uniformly …

  Cited by 4 Related articles All 2 versions


2020

[PDF] arxiv.org

Graph Wasserstein Correlation Analysis for Movie Retrieval

X Zhang, T Zhang, X Hong, Z Cui, J Yang - European Conference on …, 2020 - Springer

Movie graphs play an important role to bridge heterogenous modalities of videos and texts

in human-centric retrieval. In this work, we propose Graph Wasserstein Correlation Analysis

(GWCA) to deal with the core issue therein, ie, cross heterogeneous graph comparison …

  Related articles All 5 versions


2020

[PDF] arxiv.org

Wasserstein Adversarial Autoencoders for Knowledge Graph Embedding based Drug-Drug Interaction Prediction

Y Dai, C Guo, W Guo, C Eickhoff - arXiv preprint arXiv:2004.07341, 2020 - arxiv.org

Interaction between pharmacological agents can trigger unexpected adverse events.

Capturing richer and more comprehensive information about drug-drug interactions (DDI) is

one of the key tasks in public health and drug development. Recently, several knowledge …

  Cited by 1 Related articles All 2 versions 


2020

[PDF] inria.fr

Graph diffusion wasserstein distances

A Barbe, M Sebban, P Gonçalves, P Borgnat… - … on Machine Learning …, 2020 - hal.inria.fr

Optimal Transport (OT) for structured data has received much attention in the machine

learning community, especially for addressing graph classification or graph transfer learning

tasks. In this paper, we present the Diffusion Wasserstein (DW) distance, as a generalization …

  Cited by 1 Related articles 


 2020

[PDF] arxiv.org

Partial Gromov-Wasserstein Learning for Partial Graph Matching

W Liu, C Zhang, J Xie, Z Shen, H Qian… - arXiv preprint arXiv …, 2020 - arxiv.org

Graph matching finds the correspondence of nodes across two graphs and is a basic task in

graph-based machine learning. Numerous existing methods match every node in one graph

to one node in the other graph whereas two graphs usually overlap partially in …

  Related articles All 4 versions 




2020

GraphWGAN: Graph Representation Learning with Wasserstein Generative Adversarial Networks

R Yan, H Shen, C Qi, K Cen… - 2020 IEEE International …, 2020 - ieeexplore.ieee.org

Graph representation learning aims to represent vertices as low-dimensional and real-

valued vectors to facilitate subsequent downstream tasks, ie, node classification, link

predictions. Recently, some novel graph representation learning frameworks, which try to …

  Related articles All 2 versions


2020

[PDF] arxiv.org

LCS Graph Kernel Based on Wasserstein Distance in Longest Common Subsequence Metric Space

J Huang, Z Fang, H Kasai - arXiv preprint arXiv:2012.03612, 2020 - arxiv.org

For graph classification tasks, many methods use a common strategy to aggregate

information of vertex neighbors. Although this strategy provides an efficient means of

extracting graph topological features, it brings excessive amounts of information that might …

  Related articles All 2 versions 


 MR4195561 Prelim Duong, Manh Hong; Jin, Bangti; Wasserstein gradient flow formulation of the time-fractional Fokker-Planck equation. Commun. Math. Sci. 18 (2020), no. 7, 1949–1975. 65M06 (35Q84 60G22 65M12)

Review PDF Clipboard Journal Article


MR4193900 Prelim Larsson, Martin; Svaluto-Ferro, Sara; Existence of probability measure valued jump-diffusions in generalized Wasserstein spaces. Electron. J. Probab. 25 (2020), Paper No. 159, 1–25. 60J60 (60G57 60J76)

Review PDF Clipboard Journal Article


Precise limit in Wasserstein distance for conditional empirical measures of Dirichlet diffusion processes

FY Wang - arXiv preprint arXiv:2004.07537, 2020 - arxiv.org

Let $ M $ be a $ d $-dimensional connected compact Riemannian manifold with boundary

$\partial M $, let $ V\in C^ 2 (M) $ such that $\mu (dx):= e^{V (x)} dx $ is a probability

measure, and let $ X_t $ be the diffusion process generated by $ L:=\Delta+\nabla V $ with …

  Cited by 2 Related articles All 3 versions 

<——2020——2020———1540——  



2020

[PDF] arxiv.org

Convergence in Wasserstein Distance for Empirical Measures of Dirichlet Diffusion Processes on Manifolds

FY Wang - arXiv preprint arXiv:2005.09290, 2020 - arxiv.org

Let $ M $ be a $ d $-dimensional connected compact Riemannian manifold with boundary

$\partial M $, let $ V\in C^ 2 (M) $ such that $\mu ({\rm d} x):={\rm e}^{V (x)}{\rm d} x $ is a

probability measure, and let $ X_t $ be the diffusion process generated by …

  Cited by 1 Related articles All 3 versions 


2020

Exponential contraction in Wasserstein distances for diffusion semigroups with negative curvature

FY Wang - Potential Analysis, 2020 - Springer

Let P t be the (Neumann) diffusion semigroup P t generated by a weighted Laplacian on a

complete connected Riemannian manifold M without boundary or with a convex boundary. It

is well known that the Bakry-Emery curvature is bounded below by a positive constant> 0 …

  Cited by 4 Related articles


2020


NeurIPS 2020 : Deep Diffusion-Invariant Wasserstein ...

Dec 7, 2020 — Abstract: In this paper, we present a novel classification method called deep diffusion-invariant Wasserstein distributional classification ...

[CITATION] Deep Diffusion-Invariant Wasserstein Distributional Classification

SW Park, DW Shu, J Kwon - Advances in Neural Information Processing Systems, 2020

  Related articles


2020

Wasserstein distance estimates for stochastic integrals by forward-backward stochastic calculus

JC Breton, N Privault - Potential Analysis, 2020 - Springer

… 7. Breton, J. -C., Privault, N.: Bounds on option prices in point process diffusion models. Int. J. Theor. Appl Financ. 11(6), 597–610 (2008) … 9. Breton, J. -C., Privault, N.: Wasserstein distance estimates for jump-diffusion processes. Preprint, pp. 22 (2020). 10 …

  Related articles All 4 versions

[CITATION] Wasserstein distance estimates for jump-diffusion processes

JC Breton, N Privault - Preprint, 2020

  Cited by 2 Related articles


2020

[PDF] arxiv.org

Minimax control of ambiguous linear stochastic systems using the Wasserstein metric

K Kim, I Yang - 2020 59th IEEE Conference on Decision and …, 2020 - ieeexplore.ieee.org

In this paper, we propose a minimax linear-quadratic control method to address the issue of

inaccurate distribution information in practical stochastic systems. To construct a control

policy that is robust against errors in an empirical distribution of uncertainty, our method …

  Cited by 4 Related articles All 3 versions

[PDF] mlr.press

Approximate inference with wasserstein gradient flows

C Frogner, T Poggio - International Conference on Artificial …, 2020 - proceedings.mlr.press

We present a novel approximate inference method for diffusion processes, based on the

Wasserstein gradient flow formulation of the diffusion. In this formulation, the time-dependent

density of the diffusion is derived as the limit of implicit Euler steps that follow the gradients …

  Cited by 11 Related articles All 3 versions 
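
(Reference note: the implicit Euler steps mentioned in this abstract are the standard Jordan–Kinderlehrer–Otto (JKO) scheme; in generic notation, not specific to this paper, one step of size $\tau$ reads
$\rho^{k+1} \in \operatorname{arg\,min}_{\rho} \big\{ \tfrac{1}{2\tau} W_2^2(\rho, \rho^k) + \mathcal{F}(\rho) \big\},$
so that, as $\tau \to 0$, the discrete sequence approximates the gradient flow of the free energy $\mathcal{F}$ in the 2-Wasserstein metric.)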


[PDF] arxiv.org

Lagrangian schemes for Wasserstein gradient flows

JA Carrillo, D Matthes, MT Wolfram - arXiv preprint arXiv:2003.03803, 2020 - arxiv.org

This paper reviews different numerical methods for specific examples of Wasserstein

gradient flows: we focus on nonlinear Fokker-Planck equations, but also discuss

discretizations of the parabolic-elliptic Keller-Segel model and of the fourth order thin film …

  Cited by 4 Related articles All 3 versions 


[PDF] arxiv.org

Fisher information regularization schemes for Wasserstein gradient flows

W Li, J Lu, L Wang - Journal of Computational Physics, 2020 - Elsevier

We propose a variational scheme for computing Wasserstein gradient flows. The scheme

builds upon the Jordan–Kinderlehrer–Otto framework with the Benamou-Brenier's dynamic

formulation of the quadratic Wasserstein metric and adds a regularization by the Fisher …

  Cited by 9 Related articles All 10 versions


[PDF] arxiv.org

The back-and-forth method for wasserstein gradient flows

M Jacobs, W Lee, F Léger - arXiv preprint arXiv:2011.08151, 2020 - arxiv.org

We present a method to efficiently compute Wasserstein gradient flows. Our approach is

based on a generalization of the back-and-forth method (BFM) introduced by Jacobs and

Léger to solve optimal transport problems. We evolve the gradient flow by solving the dual …

  Cited by 1 Related articles All 2 versions 

<——2020——2020———1550——  


[PDF] arxiv.org

SVGD as a kernelized Wasserstein gradient flow of the chi-squared divergence

S Chewi, TL Gouic, C Lu, T Maunu… - arXiv preprint arXiv …, 2020 - arxiv.org

Stein Variational Gradient Descent (SVGD), a popular sampling algorithm, is often described

as the kernelized gradient flow for the Kullback-Leibler divergence in the geometry of

optimal transport. We introduce a new perspective on SVGD that instead views SVGD as the …

  Cited by 2 Related articles All 5 versions 


[PDF] arxiv.org

Wasserstein hamiltonian flows

SN Chow, W Li, H Zhou - Journal of Differential Equations, 2020 - Elsevier

We establish kinetic Hamiltonian flows in density space embedded with the L 2-Wasserstein

metric tensor. We derive the Euler-Lagrange equation in density space, which introduces the

associated Hamiltonian flows. We demonstrate that many classical equations, such as …

  Cited by 4 Related articles All 7 versions


[PDF] arxiv.org

A variational finite volume scheme for Wasserstein gradient flows

C Cancès, TO Gallouët, G Todeschi - Numerische Mathematik, 2020 - Springer

We propose a variational finite volume scheme to approximate the solutions to Wasserstein

gradient flows. The time discretization is based on an implicit linearization of the

Wasserstein distance expressed thanks to Benamou–Brenier formula, whereas space …

  Cited by 5 Related articles All 9 versions


 [PDF] arxiv.org

Refining Deep Generative Models via Wasserstein Gradient Flows

AF Ansari, ML Ang, H Soh - arXiv preprint arXiv:2012.00780, 2020 - arxiv.org

Deep generative modeling has seen impressive advances in recent years, to the point

where it is now commonplace to see simulated samples (eg, images) that closely resemble

real-world data. However, generation quality is generally inconsistent for any given model …

  Related articles 


[PDF] arxiv.org

TPFA Finite Volume Approximation of Wasserstein Gradient Flows

A Natale, G Todeschi - International Conference on Finite Volumes for …, 2020 - Springer

Numerous infinite dimensional dynamical systems arising in different fields have been

shown to exhibit a gradient flow structure in the Wasserstein space. We construct Two Point

Flux Approximation Finite Volume schemes discretizing such problems which preserve the …

  Cited by 2 Related articles All 6 versions

  
2020


[PDF] sjtu.edu.cn

[PDF] Kalman-Wasserstein Gradient Flows

F Hoffmann - 2020 - ins.sjtu.edu.cn

Parameter calibration and uncertainty in complex computer models. Ensemble Kalman Inversion (for optimization). Ensemble Kalman Sampling (for sampling). Kalman-Wasserstein gradient flow structure … Minimize $E : \Omega \to \mathbb{R}$, where $\Omega \subseteq \mathbb{R}^N$ … Dynamical …

  Related articles All 5 versions 


[PDF] tum.de

Lp-Wasserstein and flux-limited gradient flows: Entropic discretization, convergence analysis and numerics

B Söllner - 2020 - mediatum.ub.tum.de

We analyse different discretizations of gradient flows in transport metrics with non-quadratic

costs. Among others we discuss the p-Laplace equation and evolution equations with flux-

limitation. We prove comparison principles, free energy monotony, non-negativity and mass …

  Related articles All 3 versions 


Multi-view Wasserstein discriminant analysis with entropic regularized Wasserstein distance

H Kasai - ICASSP 2020-2020 IEEE International Conference …, 2020 - ieeexplore.ieee.org

Analysis of multi-view data has recently garnered growing attention because multi-view data

frequently appear in real-world applications, which are collected or taken from many sources

or captured using various sensors. A simple and popular promising approach is to learn a …

  Cited by 2 Related articles


[HTML] nih.gov

Wasserstein GANs for MR imaging: from paired to unpaired training

K Lei, M Mardani, JM Pauly… - IEEE transactions on …, 2020 - ieeexplore.ieee.org

Lack of ground-truth MR images impedes the common supervised training of neural

networks for image reconstruction. To cope with this challenge, this paper leverages

unpaired adversarial training for reconstruction networks, where the inputs are …

  Cited by 4 Related articles All 7 versions


[PDF] arxiv.org

Image hashing by minimizing independent relaxed wasserstein distance

KD Doan, A Kimiyaie, S Manchanda… - arXiv preprint arXiv …, 2020 - arxiv.org

Image hashing is a fundamental problem in the computer vision domain with various

challenges, primarily, in terms of efficiency and effectiveness. Existing hashing methods lack

a principled characterization of the goodness of the hash codes and a principled approach …

  Cited by 2 Related articles All 2 versions 

<——2020——2020———1560—— 


Obtaining PET/CT images from non-attenuation corrected PET images in a single PET system using Wasserstein generative adversarial networks

Z Hu, Y Li, S Zou, H Xue, Z Sang, X Liu… - Physics in Medicine …, 2020 - iopscience.iop.org

Positron emission tomography (PET) imaging plays an indispensable role in early disease

detection and postoperative patient staging diagnosis. However, PET imaging requires not

only additional computed tomography (CT) imaging to provide detailed anatomical …

  Cited by 3 Related articles All 5 versions


[PDF] arxiv.org

Entropic regularization of Wasserstein distance between infinite-dimensional Gaussian measures and Gaussian processes

MH Quang - arXiv preprint arXiv:2011.07489, 2020 - arxiv.org

This work studies the entropic regularization formulation of the 2-Wasserstein distance on an

infinite-dimensional Hilbert space, in particular for the Gaussian setting. We first present the

Minimum Mutual Information property, namely the joint measures of two Gaussian measures …

  Cited by 1 Related articles All 2 versions 
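
(Reference note: the unregularized quantity behind this entry has a well-known closed form in the Gaussian case; in finite dimension,
$W_2^2\big(\mathcal{N}(m_1,\Sigma_1), \mathcal{N}(m_2,\Sigma_2)\big) = \|m_1 - m_2\|^2 + \operatorname{tr}\big(\Sigma_1 + \Sigma_2 - 2(\Sigma_1^{1/2}\Sigma_2\Sigma_1^{1/2})^{1/2}\big),$
and the paper above studies an entropically regularized analogue of this distance in infinite-dimensional Hilbert spaces.)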


[PDF] arxiv.org

Chance-Constrained Set Covering with Wasserstein Ambiguity

H Shen, R Jiang - arXiv preprint arXiv:2010.05671, 2020 - arxiv.org

We study a generalized distributionally robust chance-constrained set covering problem

(DRC) with a Wasserstein ambiguity set, where both decisions and uncertainty are binary-

valued. We establish the NP-hardness of DRC and recast it as a two-stage stochastic …

  Cited by 1 Related articles All 2 versions


[PDF] arxiv.org

Stability of Gibbs posteriors from the Wasserstein loss for Bayesian full waveform inversion

MM Dunlop, Y Yang - arXiv preprint arXiv:2004.03730, 2020 - arxiv.org

Recently, the Wasserstein loss function has been proven to be effective when applied to

deterministic full-waveform inversion (FWI) problems. We consider the application of this

loss function in Bayesian FWI so that the uncertainty can be captured in the solution. Other …

  Cited by 1 Related articles All 3 versions 


Precipitation forecasting using machine-learning-based ensemble aggregation with Wasserstein-guided weighting

F O'Donncha, K Dipietro, SC James… - AGU Fall Meeting …, 2020 - ui.adsabs.harvard.edu

Precipitation forecasting is one of the most complex modeling tasks, requiring the resolution

of numerous spatial and temporal patterns that are sensitive to the accurate representation

of many secondary variables (precipitable water column, air humidity, pressure, etc.) …

  Related articles All 2 versions

[CITATION] Precipitation forecasting using machine-learning-based ensemble aggregation with Wasserstein-guided weighting

F O'Donncha, K Dipietro, SC James, B Byars… - AGU Fall Meeting …, 2020 - agu.confex.com

2020

[PDF] aaai.org

Improving the Robustness of Wasserstein Embedding by Adversarial PAC-Bayesian Learning

D Ding, M Zhang, X Pan, M Yang, X He - Proceedings of the AAAI …, 2020 - ojs.aaai.org

Node embedding is a crucial task in graph analysis. Recently, several methods are

proposed to embed a node as a distribution rather than a vector to capture more information.

Although these methods achieved noticeable improvements, their extra complexity brings …

  Related articles All 3 versions 


Trajectories from Distribution-Valued Functional Curves: A Unified Wasserstein Framework

A Sharma, G Gerig - … Conference on Medical Image Computing and …, 2020 - Springer

Temporal changes in medical images are often evaluated along a parametrized function that

represents a structure of interest (eg white matter tracts). By attributing samples along these

functions with distributions of image properties in the local neighborhood, we create …

  Related articles All 2 versions


[PDF] arxiv.org

Approximation rate in Wasserstein distance of probability measures on the real line by deterministic empirical measures

O Bencheikh, B Jourdain - arXiv preprint arXiv:2012.09729, 2020 - arxiv.org

We are interested in the approximation in Wasserstein distance with index $\rho\ge 1$ of a

probability measure $\mu $ on the real line with finite moment of order $\rho $ by the

empirical measure of $ N $ deterministic points. The minimal error converges to $0 $ as …

  Related articles All 3 versions 
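
Illustration (not code from the paper above): on the real line, the $W_p$ distance between two uniform empirical measures with the same number of atoms reduces to matching order statistics, which is the mechanism behind one-dimensional results such as this one. A minimal NumPy sketch of that generic fact:

    import numpy as np

    def wasserstein_1d(x, y, p=1):
        # W_p between (1/n) sum_i delta_{x_i} and (1/n) sum_i delta_{y_i},
        # computed by pairing sorted samples (equal sample sizes assumed).
        x = np.sort(np.asarray(x, dtype=float))
        y = np.sort(np.asarray(y, dtype=float))
        return float(np.mean(np.abs(x - y) ** p) ** (1.0 / p))

    # Example: two shifted Gaussian samples
    rng = np.random.default_rng(0)
    print(wasserstein_1d(rng.normal(0.0, 1.0, 1000), rng.normal(0.5, 1.0, 1000), p=2))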


[PDF] ntu.edu.sg

Wasserstein distance estimates for stochastic integrals by forward-backward stochastic calculus

JC Breton, N Privault - Potential Analysis, 2020 - Springer

We prove Wasserstein distance bounds between the probability distributions of stochastic

integrals with jumps, based on the integrands appearing in their stochastic integral

representations. Our approach does not rely on the Stein equation or on the propagation of …

  Related articles All 4 versions


[PDF] arxiv.org

Learning Deep-Latent Hierarchies by Stacking Wasserstein Autoencoders

B Gaujac, I Feige, D Barber - arXiv preprint arXiv:2010.03467, 2020 - arxiv.org

Probabilistic models with hierarchical-latent-variable structures provide state-of-the-art

results amongst non-autoregressive, unsupervised density-based models. However, the

most common approach to training such models based on Variational Autoencoders (VAEs) …

  Related articles All 4 versions 

<——2020——2020———1570—— 



[PDF] arxiv.org

Tensor product and Hadamard product for the Wasserstein means

J Hwang, S Kim - Linear Algebra and its Applications, 2020 - Elsevier

As one of the least squares mean, we consider the Wasserstein mean of positive definite

Hermitian matrices. We verify in this paper the inequalities of the Wasserstein mean related

with a strictly positive and unital linear map, the identity of the Wasserstein mean for tensor …

  Related articles All 5 versions
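
(Reference note: the Wasserstein mean considered here is the least-squares mean for the Bures–Wasserstein metric on positive definite matrices, which in its standard form reads
$d_{BW}(A,B)^2 = \operatorname{tr}(A) + \operatorname{tr}(B) - 2\,\operatorname{tr}\big((A^{1/2} B A^{1/2})^{1/2}\big),$
i.e. the 2-Wasserstein distance between the centered Gaussians $\mathcal{N}(0,A)$ and $\mathcal{N}(0,B)$.)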


Image Hashing by Minimizing Discrete Component-wise Wasserstein Distance

KD Doan, S Manchanda, S Badirli… - arXiv e-prints, 2020 - ui.adsabs.harvard.edu

Image hashing is one of the fundamental problems that demand both efficient and effective

solutions for various practical scenarios. Adversarial autoencoders are shown to be able to

implicitly learn a robust, locality-preserving hash function that generates balanced and high …

 

[PDF] jku.at

WGAIN: Data Imputation using Wasserstein GAIN/submitted by Christina Halmich

C Halmich - 2020 - epub.jku.at

Missing data is a well known problem in the Machine Learning world. A lot of datasets that

are used for training algorithms contain missing values, eg 45% of the datasets stored in the

UCI Machine Learning Repository [16], which is a commonly used dataset collection …

  Related articles All 2 versions 


[PDF] researchgate.net

[PDF] Statistical inference for Bures–Wasserstein barycenters, by Alexey Kroshnin, Vladimir Spokoiny and Alexandra …

A KROSHNIN - researchgate.net

In this work we introduce the concept of Bures–Wasserstein barycenter Q, that is

essentially a Fréchet mean of some distribution P supported on a subspace of positive semi-

definite d-dimensional Hermitian operators H+(d). We allow a barycenter to be constrained …

  Related articles 










2020

Расстояние Канторовича-Рубинштейна-Вассерштейна между аттрактором и репеллером

АО Казаков, АС Пиковский, ВГ Чигарев - Математическое …, 2020 - elibrary.ru

We consider several examples of dynamical systems exhibiting an intersection of an attractor and a repeller. These systems are constructed by adding controlled dissipation to basic models with chaotic dynamics …

Related articles

[Russian: Kantorovich-Rubinstein-Wasserstein distance between an attractor and a repeller]


 

2020

Обращение полного волнового поля с использованием метрики Вассерштейна

АА Василенко - МНСК-2020, 2020 - elibrary.ru

The inverse dynamical problem of seismics consists in determining the parameters of an elastic medium from data recorded during field work. This problem reduces to minimizing an objective functional that measures the deviation …

Related articles

[Russian: Full waveform inversion using the Wasserstein metric]


Bridging the gap between f-gans and wasserstein gans

J Song, S Ermon - International Conference on Machine …, 2020 - proceedings.mlr.press

Generative adversarial networks (GANs) variants approximately minimize divergences

between the model and the data distribution using a discriminator. Wasserstein GANs

(WGANs) enjoy superior empirical performance, however, unlike in f-GANs, the discriminator …

  Cited by 5 Related articles All 4 versions 


[PDF] arxiv.org

When can Wasserstein GANs minimize Wasserstein Distance?

Y Li, Z Dou - arXiv preprint arXiv:2003.04033, 2020 - arxiv.org

Generative Adversarial Networks (GANs) are widely used models to learn complex real-

world distributions. In GANs, the training of the generator usually stops when the

discriminator can no longer distinguish the generator's output from the set of training …

  Cited by 5 Related articles All 3 versions 



 2020

 McKean-Vlasov SDEs with drifts discontinuous under Wasserstein distance

X Huang, FY Wang - arXiv preprint arXiv:2002.06877, 2020 - arxiv.org

Existence and uniqueness are proved for Mckean-Vlasov type distribution dependent SDEs

with singular drifts satisfying an integrability condition in space variable and the Lipschitz

condition in distribution variable with respect to $ W_0 $ or $ W_0+ W_\theta $ for some …

  Cited by 7 Related articles All 4 versions 

<——2020——2020———1580——


2020

[PDF] arxiv.org

Infinite-dimensional regularization of McKean-Vlasov equation with a Wasserstein diffusion

V Marx - arXiv preprint arXiv:2002.10157, 2020 - arxiv.org

Much effort has been spent in recent years on restoring uniqueness of McKean-Vlasov

SDEs with non-smooth coefficients. As a typical instance, the velocity field is assumed to be

bounded and measurable in its space variable and Lipschitz-continuous with respect to the …

  Cited by 2 Related articles All 9 versions 


2020

[PDF] arxiv.org

Exponential Convergence in Entropy and Wasserstein Distance for McKean-Vlasov SDEs

P Ren, FY Wang - arXiv preprint arXiv:2010.08950, 2020 - arxiv.org

The following type exponential convergence is proved for (non-degenerate or degenerate)

McKean-Vlasov SDEs: $$ W_2 (\mu_t,\mu_\infty)^ 2+{\rm Ent}(\mu_t|\mu_\infty)\le c {\rm e}^{-

\lambda t}\min\big\{W_2 (\mu_0,\mu_\infty)^ 2,{\rm Ent}(\mu_0|\mu_\infty)\big\},\\t\ge 1 …

  Cited by 1 Related articles All 2 versions 





  2020

Convergence in Monge-Wasserstein Distance of Mean Field Systems with Locally Lipschitz Coefficients

DT Nguyen, SL Nguyen, NH Du - Acta Mathematica Vietnamica, 2020 - Springer

This paper focuses on stochastic systems of weakly interacting particles whose dynamics

depend on the empirical measures of the whole populations. The drift and diffusion

coefficients of the dynamical systems are assumed to be locally Lipschitz continuous and …

  Related articles





2020

Convergence rates of the blocked Gibbs sampler with random scan in the Wasserstein metric

NY Wang, G Yin - Stochastics, 2020 - Taylor & Francis

This paper establishes …

  Related articles All 4 versions


 2020

[PDF] arxiv.org

Berry-Esseen smoothing inequality for the Wasserstein metric on compact Lie groups

B Borda - arXiv preprint arXiv:2005.04925, 2020 - arxiv.org

We prove a general inequality estimating the distance of two probability measures on a

compact Lie group in the Wasserstein metric in terms of their Fourier transforms. The result is

close to being sharp. We use a generalized form of the Wasserstein metric, related by …

  Related articles All 2 versions 


[PDF] mlr.press

Bridging the gap between f-gans and wasserstein gans

J Song, S Ermon - International Conference on Machine …, 2020 - proceedings.mlr.press

Generative adversarial networks (GANs) variants approximately minimize divergences

between the model and the data distribution using a discriminator. Wasserstein GANs

(WGANs) enjoy superior empirical performance, however, unlike in f-GANs, the discriminator …

  Cited by 9 Related articles All 4 versions 

 

[PDF] researchgate.net

[PDF] Statistical inference for Bures–Wasserstein barycenters, by Alexey Kroshnin, Vladimir Spokoiny and Alexandra Suvorikova

A KROSHNIN - researchgate.net

In this work we introduce the concept of Bures–Wasserstein barycenter Q, that is 

essentially a Fréchet mean of some distribution P supported on a subspace of positive semi-

definite d-dimensional Hermitian operators H+(d). We allow a barycenter to be constrained …

Related articles

<——2020——2020———1590—— 


2020

On Distributionally Robust Chance Constrained Programs ...

www.optimization-online.org › DB_FILE › 2018/06

by W Xie · 2020 · Cited by 41 — We study distributionally robust chance constrained programs (DRCCPs) of the form: ... Also, (1) is termed a DRCCP with right-hand uncertainty if η1 = 0, η2 = 1 ... In this paper, we consider a Wasserstein ambiguity set P, i.e., we make the ... (iii) Finally, let Z denote the set in the right-hand side of (4); we only need to show ...


Network-based de-artifact finite angle CT reconstruction method, involves sending training set into WGAN-GP network model and dividing to-be-processed image into trained model to output limited angle CT reconstruction image

Patent Number: CN110648376-A

Patent Assignee: UNIV NANJING POST & TELECOM

Inventor(s): XU H; XIE S.


US-WGAN

By: Sun, Hongyu

IEEE DataPort

DOI: ‏ http://dx.doi.org.ezaccess.libraries.psu.edu/10.21227/G99Z-8645

Document Type: Data set

The link is from Web of Science (direct link).


[HTML] Data augmentation-based conditional Wasserstein generative adversarial network-gradient penalty for XSS attack detection system

FMM Mokbal, D Wang, X Wang, L Fu - PeerJ Computer Science, 2020 - peerj.com

The rapid growth of the worldwide web and accompanied opportunities of web applications in various aspects of life have attracted the attention of organizations, governments, and individuals. Consequently, web applications have increasingly become the target of …

  Related articles All 5 versions 

 

Faster Wasserstein distance estimation with the Sinkhorn divergence

L Chizat, P Roussillon, F Léger, FX Vialard… - arXiv preprint arXiv …, 2020 - arxiv.org

The squared Wasserstein distance is a natural quantity to compare probability distributions in a non-parametric setting. This quantity is usually estimated with the plug-in estimator, defined via a discrete optimal transport problem. It can be solved to $\epsilon $-accuracy by …

  Cited by 2 Related articles All 6 versions 

[PDF] semanticscholar.org

[PDF] Faster Wasserstein Distance Estimation with the Sinkhorn Divergence

FX Vialard, G Peyré - pdfs.semanticscholar.org

Faster Wasserstein Distance Estimation with the Sinkhorn Divergence. Lénaïc Chizat, joint work with Pierre Roussillon, Flavien Léger, François-Xavier Vialard and Gabriel Peyré. July 8th, 2020 - Optimal Transport: Regularization and Applications. CNRS and Université …
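
Illustration (generic algorithm, not the authors' code): the entropic plug-in estimators discussed in the two listings above are computed with Sinkhorn's matrix-scaling iterations; the debiasing step that defines the Sinkhorn divergence itself is omitted in this minimal NumPy sketch:

    import numpy as np

    def sinkhorn_plan(a, b, C, eps, n_iter=1000):
        # Entropically regularized OT between histograms a and b with cost matrix C.
        # Returns the plan P = diag(u) K diag(v) with Gibbs kernel K = exp(-C / eps).
        K = np.exp(-C / eps)
        u = np.ones_like(a)
        for _ in range(n_iter):
            v = b / (K.T @ u)   # scale columns to match marginal b
            u = a / (K @ v)     # scale rows to match marginal a
        return u[:, None] * K * v[None, :]

    # Tiny example: two histograms on a 1D grid, squared-distance cost
    x = np.linspace(0.0, 1.0, 50)
    C = (x[:, None] - x[None, :]) ** 2
    a = np.exp(-(x - 0.3) ** 2 / 0.01); a /= a.sum()
    b = np.exp(-(x - 0.7) ** 2 / 0.01); b /= b.sum()
    P = sinkhorn_plan(a, b, C, eps=1e-2)
    print((P * C).sum())        # entropic (plug-in) estimate of W_2(a, b)^2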

2020


[PDF] arxiv.org

When ot meets mom: Robust estimation of wasserstein distance

G Staerman, P Laforgue, P Mozharovskyi… - arXiv preprint arXiv …, 2020 - arxiv.org

Issued from Optimal Transport, the Wasserstein distance has gained importance in Machine Learning due to its appealing geometrical properties and the increasing availability of efficient approximations. In this work, we consider the problem of estimating the Wasserstein  …

  Cited by 2 Related articles All 4 versions 


[PDF] arxiv.org

Improved image wasserstein attacks and defenses

JE Hu, A Swaminathan, H Salman, G Yang - arXiv preprint arXiv …, 2020 - arxiv.org

Robustness against image perturbations bounded by a $\ell_p $ ball have been well-studied in recent literature. Perturbations in the real-world, however, rarely exhibit the pixel independence that $\ell_p $ threat models assume. A recently proposed Wasserstein  …

  Cited by 4 Related articles All 4 versions 


Reconstruction of shale image based on Wasserstein Generative Adversarial Networks with gradient penalty

W Zha, X Li, Y Xing, L He, D Li - Advances in Geo-Energy …, 2020 - yandy-ager.com

Abstract Generative Adversarial Networks (GANs), as most popular artificial intelligence models in the current image generation field, have excellent image generation capabilities. Based on Wasserstein GANs with gradient penalty, this paper proposes a novel digital core …

  Cited by 11 Related articles 
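
Illustration: a minimal PyTorch-style sketch of the generic gradient-penalty term that gives WGAN-GP its name; `critic`, `real` and `fake` are hypothetical stand-ins, not the shale-reconstruction code of the paper above.

    import torch

    def gradient_penalty(critic, real, fake, lam=10.0):
        # Penalize deviations of the critic's gradient norm from 1 on random
        # interpolates between real and generated batches (NCHW image tensors assumed).
        eps = torch.rand(real.size(0), 1, 1, 1, device=real.device)
        x_hat = (eps * real + (1.0 - eps) * fake.detach()).requires_grad_(True)
        d_hat = critic(x_hat)
        grads = torch.autograd.grad(outputs=d_hat, inputs=x_hat,
                                    grad_outputs=torch.ones_like(d_hat),
                                    create_graph=True)[0]
        grad_norm = grads.view(grads.size(0), -1).norm(2, dim=1)
        return lam * ((grad_norm - 1.0) ** 2).mean()

    # The critic loss for one batch would then be, schematically:
    # loss_D = critic(fake).mean() - critic(real).mean() + gradient_penalty(critic, real, fake)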


[PDF] researchgate.net

Infrared and visible image fusion using dual discriminators generative adversarial networks with Wasserstein distance

J Li, H Huo, K Liu, C Li - Information Sciences, 2020 - Elsevier

Generative adversarial network (GAN) has shown great potential in infrared and visible image fusion. The existing GAN-based methods establish an adversarial game between generative image and source images to train the generator until the generative image  …

  Cited by 4 Related articles All 3 versions


DPIR-Net: Direct PET image reconstruction based on the Wasserstein generative adversarial network

Z Hu, H Xue, Q Zhang, J Gao, N Zhang… - … on Radiation and …, 2020 - ieeexplore.ieee.org

Positron emission tomography (PET) is an advanced medical imaging technique widely used in various clinical applications, such as tumor detection and neurologic disorders. Reducing the radiotracer dose is desirable in PET imaging because it decreases the …

  Cited by 14 Related articles

<——2020——2020———1600——


[PDF] arxiv.org

Image hashing by minimizing independent relaxed wasserstein distance

KD Doan, A Kimiyaie, S Manchanda… - arXiv preprint arXiv …, 2020 - arxiv.org

Image hashing is a fundamental problem in the computer vision domain with various challenges, primarily, in terms of efficiency and effectiveness. Existing hashing methods lack a principled characterization of the goodness of the hash codes and a principled approach …

  Cited by 2 Related articles All 2 versions 


A Super Resolution Method for Remote Sensing Images Based on Cascaded Conditional Wasserstein GANs

B Liu, H Li, Y Zhou, Y Peng, A Elazab… - 2020 IEEE 3rd …, 2020 - ieeexplore.ieee.org

High-resolution (HR) remote sensing imagery is quite beneficial for subsequent

interpretation. Obtaining HR images can be achieved by upgrading the imaging device. Yet,

the cost to perform this task is very huge. Thus, it is necessary to obtain HR images from low …

  Related articles

 

W-LDMM: A wasserstein driven low-dimensional manifold model for noisy image restoration

R He, X Feng, W Wang, X Zhu, C Yang - Neurocomputing, 2020 - Elsevier

The Wasserstein distance originated from the optimal transport theory is a general and flexible statistical metric in a variety of image processing problems. In this paper, we propose a novel Wasserstein driven low-dimensional manifold model (W-LDMM), which tactfully …

  Cited by 3 Related articles All 2 versions

[PDF] arxiv.org

Wasserstein coupled particle filter for multilevel estimation

M Ballesio, A Jasra, E von Schwerin… - arXiv preprint arXiv …, 2020 - arxiv.org

In this paper, we consider the filtering problem for partially observed diffusions, which are regularly observed at discrete times. We are concerned with the case when one must resort to time-discretization of the diffusion process if the transition density is not available in an …

  Cited by 2 Related articles All 4 versions 


[PDF] thecvf.com

Barycenters of natural images constrained wasserstein barycenters for image morphing

D Simon, A Aberdam - … of the IEEE/CVF Conference on …, 2020 - openaccess.thecvf.com

Image interpolation, or image morphing, refers to a visual transition between two (or more) input images. For such a transition to look visually appealing, its desirable properties are (i) to be smooth;(ii) to apply the minimal required change in the image; and (iii) to seem" real" …

Cited by 9 Related articles All 8 versions 

2020

[PDF] arxiv.org

PLG-IN: Pluggable Geometric Consistency Loss with Wasserstein Distance in Monocular Depth Estimation

N Hirose, S Koide, K Kawano, R Kondo - arXiv preprint arXiv:2006.02068, 2020 - arxiv.org

We propose a novel objective to penalize geometric inconsistencies, to improve the performance of depth estimation from monocular camera images. Our objective is designed with the Wasserstein distance between two point clouds estimated from images with different …

  Cited by 2 Related articles All 2 versions 

Density estimation of multivariate samples using Wasserstein distance

E Luini, P Arbenz - Journal of Statistical Computation and …, 2020 - Taylor & Francis

Density estimation is a central topic in statistics and a fundamental task of machine learning. In this paper, we present an algorithm for approximating multivariate empirical densities with a piecewise constant distribution defined on a hyperrectangular-shaped partition of the …

  Cited by 2 Related articles All 3 versions


[PDF] arxiv.org

Wasserstein Distances for Stereo Disparity Estimation

D Garg, Y Wang, B Hariharan, M Campbell… - arXiv preprint arXiv …, 2020 - arxiv.org

Existing approaches to depth or disparity estimation output a distribution over a set of pre-defined discrete values. This leads to inaccurate results when the true depth or disparity does not match any of these values. The fact that this distribution is usually learned indirectly …

  Cited by 2 Related articles All 3 versions 

[CITATION] Supplementary Material: Wasserstein Distances for Stereo Disparity Estimation

D Garg, Y Wang, B Hariharan, M Campbell

  Related articles


[PDF] upenn.edu

Optimal Estimation of Wasserstein Distance on a Tree With an Application to Microbiome Studies

S Wang, TT Cai, H Li - Journal of the American Statistical …, 2020 - Taylor & Francis

The weighted UniFrac distance, a plug-in estimator of the Wasserstein distance of read counts on a tree, has been widely used to measure the microbial community difference in microbiome studies. Our investigation however shows that such a plug-in estimator …

  Related articles All 4 versions


[PDF] arxiv.org

Wasserstein Distance Regularized Sequence Representation for Text Matching in Asymmetrical Domains

W Yu, C Xu, J Xu, L Pang, X Gao, X Wang… - arXiv preprint arXiv …, 2020 - arxiv.org

One approach to matching texts from asymmetrical domains is projecting the input sequences into a common semantic space as feature vectors upon which the matching function can be readily defined and learned. In real-world matching practices, it is often …

  Related articles All 3 versions 

<——2020——2020———1610——  


[HTML] aimsciences.org

[HTML] RWRM: Residual Wasserstein regularization model for image restoration

R He, X Feng, X Zhu, H Huang… - Inverse Problems & …, 2020 - aimsciences.org

Existing image restoration methods mostly make full use of various image prior information. However, they rarely exploit the potential of residual histograms, especially their role as ensemble regularization constraint. In this paper, we propose a residual Wasserstein  …

  Related articles All 2 versions 


[PDF] imstat.org

Central Limit Theorem for Wasserstein type distances between two distinct univariate distributions

P Berthet, JC Fort, T Klein - Annales de l'Institut Henri Poincaré …, 2020 - projecteuclid.org

In this article we study the natural nonparametric estimator of a Wasserstein type cost

between two distinct continuous distributions $ F $ and $ G $ on $\mathbb {R} $. The

estimator is based on the order statistics of a sample having marginals $ F $, $ G $ and any …

  Related articles All 4 versions

[PDF] thecvf.com

Illegible Text to Readable Text: An Image-to-Image Transformation using Conditional Sliced Wasserstein Adversarial Networks

M Karimi, G Veni, YY Yu - … of the IEEE/CVF Conference on …, 2020 - openaccess.thecvf.com

Automatic text recognition from ancient handwritten record images is an important problem in the genealogy domain. However, critical challenges such as varying noise conditions, vanishing texts, and variations in handwriting makes the recognition task difficult. We tackle …

  Related articles All 7 versions 
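
Illustration (generic estimator, not the authors' implementation): the sliced Wasserstein construction used above averages one-dimensional Wasserstein distances over random projection directions. A minimal NumPy sketch, assuming two point clouds of equal size:

    import numpy as np

    def sliced_wasserstein(X, Y, n_proj=100, p=2, seed=0):
        # Monte-Carlo sliced W_p between point clouds X, Y of shape (n, d):
        # project onto random unit directions, sort, and average the 1D costs.
        rng = np.random.default_rng(seed)
        d = X.shape[1]
        total = 0.0
        for _ in range(n_proj):
            theta = rng.normal(size=d)
            theta /= np.linalg.norm(theta)
            x_proj = np.sort(X @ theta)
            y_proj = np.sort(Y @ theta)
            total += np.mean(np.abs(x_proj - y_proj) ** p)
        return (total / n_proj) ** (1.0 / p)

    rng = np.random.default_rng(1)
    print(sliced_wasserstein(rng.normal(0, 1, (500, 2)), rng.normal(1, 1, (500, 2))))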


[PDF] aclweb.org

WAERN: Integrating Wasserstein Autoencoder and Relational Network for Text Sequence

X Zhang, X Liu, G Yang, F Li, W Liu - China National Conference on …, 2020 - Springer

Abstract One challenge in Natural Language Processing (NLP) area is to learn semantic representation in different contexts. Recent works on pre-trained language model have received great attentions and have been proven as an effective technique. In spite of the …

  Related articles All 5 versions


[PDF] arxiv.org

Risk Measures Estimation Under Wasserstein Barycenter

MA Arias-Serna, JM Loubes… - arXiv preprint arXiv …, 2020 - arxiv.org

Randomness in financial markets requires modern and robust multivariate models of risk measures. This paper proposes a new approach for modeling multivariate risk measures under Wasserstein barycenters of probability measures supported on location-scatter …

  Related articles All 5 versions 


2020

A Novel Data-to-Text Generation Model with Transformer Planning and a Wasserstein Auto-Encoder

X Xu, T He, H Wang - 2020 IEEE International Conference on …, 2020 - ieeexplore.ieee.org

Existing methods for data-to-text generation have difficulty producing diverse texts with low duplication rates. In this paper, we propose a novel data-to-text generation model with Transformer planning and a Wasserstein auto-encoder, which can convert constructed data …

  Related articles All 2 versions


Stereoscopic image reflection removal based on Wasserstein Generative Adversarial Network

X Wang, Y Pan, DPK Lun - … Visual Communications and Image …, 2020 - ieeexplore.ieee.org

Reflection removal is a long-standing problem in computer vision. In this paper, we consider the reflection removal problem for stereoscopic images. By exploiting the depth information of stereoscopic images, a new background edge estimation algorithm based on the …

  Related articles All 2 versions


Image Hashing by Minimizing Discrete Component-wise Wasserstein Distance

KD Doan, S Manchanda, S Badirli… - arXiv e-prints, 2020 - ui.adsabs.harvard.edu

Image hashing is one of the fundamental problems that demand both efficient and effective solutions for various practical scenarios. Adversarial autoencoders are shown to be able to implicitly learn a robust, locality-preserving hash function that generates balanced and high …

 

[PDF] uniroma1.it

[PDF] Nonparametric Density Estimation with Wasserstein Distance for Actuarial Applications

EG Luini - iris.uniroma1.it

Density estimation is a central topic in statistics and a fundamental task of actuarial sciences. In this work, we present an algorithm for approximating multivariate empirical densities with a piecewise constant distribution defined on a hyperrectangular-shaped partition of the …

  Related articles All 2 versions 


  

[PDF] amazonaws.com

[PDF] Bayesian Wasserstein GAN and Application for Vegetable Disease Image Data

W Cho, MH Na, S Kang, S Kim - 2020 - manuscriptlink-society-file.s3 …

Various GAN models have been proposed so far and they are used in various fields. However, despite the excellent performance of these GANs, the biggest problem is that the model collapse occurs in the simultaneous optimization of the generator and discriminator of …

  Related articles 

<——2020——2020———1620—— 


Data Augmentation using Pre-trained Transformer Models

Varun Kumar, Ashutosh Choudhary, Eunah Cho

paperswithcode.com › paper › data-augmentation-using 

4 Mar 2020 • Varun Kumar • Ashutosh Choudhary • Eunah Cho Language model based pre-trained models such as BERT have provided significant gains across different NLP tasks. In this paper, we study different types of transformer based pre-trained models such as auto-regressive models (GPT-2), auto-encoder models (BERT), and seq2seq models ...


Computationally Efficient Wasserstein loss for Structured Labels

jglobal.jst.go.jp › detail › JGLOBAL_ID=20200221658...

Volume 34th Page ROMBUNNO.1J5-GS-2-02 (WEB ONLY) Publication year 2020. JST Material Number U1701A Document type Proceedings

2020 OPEN ACCESS

Computationally Efficient Wasserstein loss for Structured Labels

by TOYOKUNI, Ayato; YOKOI, Sho; KASHIMA, Hisashi ; More...

Proceedings of the Annual Conference of JSAI, 2020

The problem of estimating the probability distribution of a label given an input has been widely studied as Label Distribution Learning (LDL). In this paper, we...

Journal ArticleCitation Online


Improved complexity bounds in wasserstein barycenter problem

D Dvinskikh, D Tiapkin - arXiv preprint arXiv:2010.04677, 2020 - arxiv.org

In this paper, we focus on computational aspects of Wasserstein barycenter problem. We provide two algorithms to compute Wasserstein barycenter of $ m $ discrete measures of size $ n $ with accuracy $\varepsilon $. The first algorithm, based on mirror prox with some …

  Cited by 2 Related articles All 2 versions 
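
(Reference note: the barycenter problem referred to here is, in its generic form, the Fréchet-mean problem
$\nu^\star \in \operatorname{arg\,min}_{\nu} \sum_{i=1}^{m} w_i\, W_2^2(\mu_i, \nu), \qquad w_i \ge 0,\ \sum_i w_i = 1,$
with the minimization taken over probability measures, or over measures on a fixed finite support in the fixed-support variant appearing later in this list.)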


[PDF] arxiv.org

Necessary Condition for Rectifiability Involving Wasserstein Distance W2

D Dąbrowski - International Mathematics Research Notices, 2020 - academic.oup.com

A Radon measure $\mu$ is $n$-rectifiable if it is absolutely continuous with respect to the $n$-dimensional Hausdorff measure and $\mu$-almost all of its support can be covered by Lipschitz images of $\mathbb{R}^n$. In this paper, we give a necessary condition for rectifiability in terms of the so-called $\alpha_2$ numbers …

  Cited by 6 Related articles All 5 versions

Wasserstein upper bounds of the total variation for smooth densities

M Chae, SG Walker - Statistics & Probability Letters, 2020 - Elsevier

The total variation distance between probability measures cannot be bounded by the Wasserstein metric in general. If we consider sufficiently smooth probability densities, however, it is possible to bound the total variation by a power of the Wasserstein distance …

  Cited by 3 Related articles All 5 versions

2020

[PDF] arxiv.org

Symmetric skip connection wasserstein gan for high-resolution facial image inpainting

J Jam, C Kendrick, V Drouard, K Walker… - arXiv preprint arXiv …, 2020 - arxiv.org

The state-of-the-art facial image inpainting methods achieved promising results but face realism preservation remains a challenge. This is due to limitations such as; failures in preserving edges and blurry artefacts. To overcome these limitations, we propose a …

  Cited by 3 Related articles All 3 versions 


W-LDMM: A wasserstein driven low-dimensional manifold model for noisy image restoration

R He, X Feng, W Wang, X Zhu, C Yang - Neurocomputing, 2020 - Elsevier

The Wasserstein distance originated from the optimal transport theory is a general and flexible statistical metric in a variety of image processing problems. In this paper, we propose a novel Wasserstein driven low-dimensional manifold model (W-LDMM), which tactfully …

  Cited by 2 Related articles All 2 versions


[PDF] arxiv.org

Wasserstein distance error bounds for the multivariate normal approximation of the maximum likelihood estimator

A Anastasiou, RE Gaunt - arXiv preprint arXiv:2005.05208, 2020 - arxiv.org

We obtain explicit Wasserstein distance error bounds between the distribution of the multi-parameter MLE and the multivariate normal distribution. Our general bounds are given for possibly high-dimensional, independent and identically distributed random vectors. Our …

  Cited by 1 Related articles All 4 versions 


[HTML] springer.com

[HTML] Wasserstein and Kolmogorov error bounds for variance-gamma approximation via Stein's method I

RE Gaunt - Journal of Theoretical Probability, 2020 - Springer

The variance-gamma (VG) distributions form a four-parameter family that includes as special and limiting cases the normal, gamma and Laplace distributions. Some of the numerous applications include financial modelling and approximation on Wiener space. Recently …

  Cited by 13 Related articles All 6 versions



2020

[PDF] archives-ouvertes.fr

Unadjusted Langevin algorithm with multiplicative noise: Total variation and Wasserstein bounds

G Pages, F Panloup - 2020 - hal.archives-ouvertes.fr

In this paper, we focus on non-asymptotic bounds related to the Euler scheme of an ergodic diffusion with a possibly multiplicative diffusion term (non-constant diffusion coefficient). More precisely, the objective of this paper is to control the distance of the standard Euler …

  Related articles All 5 versions 

<——2020——2020———1630——

[PDF] arxiv.org

Hierarchical Low-Rank Approximation of Regularized Wasserstein distance

M Motamed - arXiv preprint arXiv:2004.12511, 2020 - arxiv.org

Sinkhorn divergence is a measure of dissimilarity between two probability measures. It is obtained through adding an entropic regularization term to Kantorovich's optimal transport problem and can hence be viewed as an entropically regularized Wasserstein distance …

  Related articles All 3 versions 


year 2020   [PDF] kweku.me

[PDF] Measuring Bias with Wasserstein Distance

K Kwegyir-Aggrey, SM Brown - kweku.me

In fair classification, we often ask:" what does it mean to be fair, and how is fairness

measured?" Previous approaches to defining and enforcing fairness rely on a set of

statistical fairness definitions, with each definition providing its own unique measurement of …

 Related articles 

[PDF] uniandes.edu.co

[PDF] arxiv.org

The Spectral-Domain W2 Wasserstein Distance for Elliptical Processes and the Spectral-Domain Gelbrich Bound

S Fang, Q Zhu - arXiv preprint arXiv:2012.04023, 2020 - arxiv.org

In this short note, we introduce the spectral-domain $\mathcal {W} _2 $ Wasserstein distance for elliptical stochastic processes in terms of their power spectra. We also introduce the spectral-domain Gelbrich bound for processes that are not necessarily elliptical. Subjects …

  Related articles All 2 versions 


[PDF] arxiv.org

Regularized variational data assimilation for bias treatment using the Wasserstein metric

SK Tamang, A Ebtehaj, D Zou… - Quarterly Journal of the …, 2020 - Wiley Online Library

This article presents a new variational data assimilation (VDA) approach for the formal

treatment of bias in both model outputs and observations. This approach relies on the

Wasserstein metric, stemming from the theory of optimal mass transport, to penalize the …

  Cited by 1 Related articles All 4 versions


[PDF] arxiv.org

Improved complexity bounds in wasserstein barycenter problem

D Dvinskikh, D Tiapkin - arXiv preprint arXiv:2010.04677, 2020 - arxiv.org

In this paper, we focus on computational aspects of Wasserstein barycenter problem. We

provide two algorithms to compute Wasserstein barycenter of $ m $ discrete measures of

size $ n $ with accuracy $\varepsilon $. The first algorithm, based on mirror prox with some …

  Cited by 2 Related articles All 2 versions 

[PDF] arxiv.org

2020

Stochastic equation and exponential ergodicity in Wasserstein distances for affine processes

M Friesen, P Jin, B Rüdiger - Annals of Applied Probability, 2020 - projecteuclid.org

This work is devoted to the study of conservative affine processes on the canonical state

space $ D=\mathbb {R} _ {+}^{m}\times\mathbb {R}^{n} $, where $ m+ n> 0$. We show that

each affine process can be obtained as the pathwise unique strong solution to a stochastic …

  Cited by 8 Related articles All 5 versions


[PDF] arxiv.org

Wasserstein distributionally robust shortest path problem

Z Wang, K You, S Song, Y Zhang - European Journal of Operational …, 2020 - Elsevier

This paper proposes a data-driven distributionally robust shortest path (DRSP) model where

the distribution of the travel time in the transportation network can only be partially observed

through a finite number of samples. Specifically, we aim to find an optimal path to minimize …

  Cited by 3 Related articles All 8 versions
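
(Reference note: the ambiguity sets used in this and the other distributionally robust entries in this list are, in generic notation, Wasserstein balls around the empirical distribution $\hat{P}_N$ of the observed samples,
$\mathcal{P}_\varepsilon = \{ Q : W_p(Q, \hat{P}_N) \le \varepsilon \}, \qquad \min_x \; \sup_{Q \in \mathcal{P}_\varepsilon} \; \mathbb{E}_{\xi \sim Q}[\ell(x,\xi)],$
where the radius $\varepsilon$ controls the level of robustness.)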


[PDF] arxiv.org

Infinite-dimensional regularization of McKean-Vlasov equation with a Wasserstein diffusion

V Marx - arXiv preprint arXiv:2002.10157, 2020 - arxiv.org

Much effort has been spent in recent years on restoring uniqueness of McKean-Vlasov

SDEs with non-smooth coefficients. As a typical instance, the velocity field is assumed to be

bounded and measurable in its space variable and Lipschitz-continuous with respect to the …

  Cited by 1 Related articles All 9 versions 


[PDF] arxiv.org

The existence of minimizers for an isoperimetric problem with Wasserstein penalty term in unbounded domains

Q Xia, B Zhou - arXiv preprint arXiv:2002.07129, 2020 - arxiv.org

In this article, we consider the (double) minimization problem $$\min\big\{P(E;\Omega) + \lambda W_p(E, F) :\ E \subseteq \Omega,\ F \subseteq \mathbb{R}^d,\ \lvert E \cap F\rvert = 0,\ \lvert E\rvert = \lvert F\rvert = 1\big\},$$ where $p \geqslant 1$, $\Omega$ …

  Related articles All 4 versions 


[HTML] hindawi.com

[HTML] Solutions of a Class of Degenerate Kinetic Equations Using Steepest Descent in Wasserstein Space

A Marcos, A Soglo - Journal of Mathematics, 2020 - hindawi.com

We use the steepest descent method in an Orlicz–Wasserstein space to study the existence

of solutions for a very broad class of kinetic equations, which include the Boltzmann

equation, the Vlasov–Poisson equation, the porous medium equation, and the parabolic p …

  Related articles All 6 versions 

<——2020——2020———1640—— 


2020 book

[PDF] core.ac.uk

[PDF] Structure-preserving variational schemes for fourth order nonlinear partial differential equations with a Wasserstein gradient flow structure

B Ashworth - 2020 - core.ac.uk

There is a growing interest in studying nonlinear partial differential equations which

constitute gradient flows in the Wasserstein metric and related structure preserving

variational discretisations. In this thesis, we focus on the fourth order Derrida-Lebowitz …

  

[PDF] brown.edu

[PDF] Reduced-order modeling of transport equations using Wasserstein spaces

V Ehrlacher, D Lombardi, O Mula, FX Vialard - icerm.brown.edu

… Introduction to Wasserstein spaces and barycenters. Model order reduction of parametric transport equations. Wasserstein distance and barycenter in dimension 1: for all $u, v \in \mathcal{P}_2(\Omega)$, the 2-Wasserstein distance between $u$ and $v$ is equal to $W_2(u, v) := \|\mathrm{icdf}_u - \mathrm{icdf}_v\|_{L^2(0,1)}$. Let $U := (u_1, …$

  Related articles 


Two approaches for population Wasserstein barycenter problem: Stochastic Averaging versus Sample Average Approximation

D Dvinskikh, A Gasnikov - nnov.hse.ru

Abstract In Machine Learning and Optimization community there are two main approaches

for convex risk minimization problem: Stochastic Averaging (SA) and Sample Average

Approximation (SAA). At the moment, it is known that both approaches are on average …

  Related articles 



Stronger and faster Wasserstein adversarial attacks

K Wu, A Wang, Y Yu - International Conference on Machine …, 2020 - proceedings.mlr.press

Deep models, while being extremely flexible and accurate, are surprisingly vulnerable to “small, imperceptible” perturbations known as adversarial attacks. While the majority of existing attacks focus on measuring perturbations under the $\ell_p $ metric, Wasserstein  …

  Cited by 2 Related articles All 7 versions 


[PDF] arxiv.org

Faster Wasserstein distance estimation with the Sinkhorn divergence

L Chizat, P Roussillon, F Léger, FX Vialard… - arXiv preprint arXiv …, 2020 - arxiv.org

The squared Wasserstein distance is a natural quantity to compare probability distributions in a non-parametric setting. This quantity is usually estimated with the plug-in estimator, defined via a discrete optimal transport problem. It can be solved to $\epsilon $-accuracy by …

  Cited by 2 Related articles All 6 versions 
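Several of the entries above concern plug-in and Sinkhorn-type estimators of the Wasserstein distance, so a generic textbook Sinkhorn iteration may help fix ideas. This is only a sketch under stated assumptions (two discrete histograms a, b and a cost matrix C); it is not the estimator analysed in the Chizat et al. paper, and all names and parameter values are illustrative.

import numpy as np

def sinkhorn_cost(a, b, C, eps=0.05, n_iters=500):
    """Entropic-regularized optimal transport cost <P, C> between histograms
    a and b for cost matrix C, computed by Sinkhorn matrix scaling."""
    K = np.exp(-C / eps)              # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)             # scale to match column marginals
        u = a / (K @ v)               # scale to match row marginals
    P = u[:, None] * K * v[None, :]   # approximate transport plan
    return float(np.sum(P * C))

# toy example: uniform grid on [0, 1] vs the same grid shifted by 0.3,
# squared-distance cost; the result is near 0.3**2 = 0.09 plus an O(eps) bias
x = np.linspace(0.0, 1.0, 50)
y = np.linspace(0.3, 1.3, 60)
C = (x[:, None] - y[None, :]) ** 2
a = np.full(50, 1 / 50)
b = np.full(60, 1 / 60)
print(sinkhorn_cost(a, b, C))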

[PDF] semanticscholar.org

[PDF] Faster Wasserstein Distance Estimation with the Sinkhorn Divergence

FX Vialard, G Peyré - pdfs.semanticscholar.org

… Let $H(\mu) = \int \log(\mu(x))\,\mu(x)\,dx$ and $\mu, \nu$ with bounded densities. Theorem (Yasue formulation of the Schrödinger problem): $T_\lambda(\mu, \nu) + d\lambda\log(2\pi\lambda) + \lambda(H(\mu) + H(\nu)) = \min_{\rho,v} \int_0^1 \int_{\mathbb{R}^d} \big( \tfrac{\lvert v(t,x)\rvert^2}{2}$ [kinetic energy] $+ \tfrac{\lambda^2}{4}\,\tfrac{\lvert \nabla_x \log \rho(t,x)\rvert^2}{2}$ [Fisher information] $\big)$ …


[PDF] mlr.press

A fast proximal point method for computing exact Wasserstein distance

Y Xie, X Wang, R Wang, H Zha - Uncertainty in Artificial …, 2020 - proceedings.mlr.press

Wasserstein distance plays increasingly important roles in machine learning, stochastic programming and image processing. Major efforts have been under way to address its high computational complexity, some leading to approximate or regularized variations such as …

  Cited by 51 Related articles All 5 versions 


[PDF] researchgate.net

[PDF] Computational hardness and fast algorithm for fixed-support wasserstein barycenter

T Lin, N Ho, X Chen, M Cuturi… - arXiv preprint arXiv …, 2020 - researchgate.net

We study in this paper the fixed-support Wasserstein barycenter problem (FS-WBP), which consists in computing the Wasserstein barycenter of m discrete probability measures supported on a finite metric space of size n. We show first that the constraint matrix arising …

  Cited by 3 Related articles All 2 versions 


Typical wind power scenario generation for multiple wind farms using conditional improved Wasserstein generative adversarial network

Y Zhang, Q Ai, F Xiao, R Hao, T Lu - … Journal of Electrical Power & Energy …, 2020 - Elsevier

Because of environmental benefits, wind power is taking an increasing role meeting electricity demand. However, wind power tends to exhibit large uncertainty and is largely influenced by meteorological conditions. Apart from the variability, when multiple wind farms …

  Cited by 12 Related articles


[PDF] researchgate.net

[PDF] Fixed-Support Wasserstein Barycenters: Computational Hardness and Fast Algorithm

T Lin, N Ho, X Chen, M Cuturi… - Advances in Neural …, 2020 - researchgate.net

We study the fixed-support Wasserstein barycenter problem (FS-WBP), which consists in computing the Wasserstein barycenter of m discrete probability measures supported on a finite metric space of size n. We show first that the constraint matrix arising from the standard …

Cited by 10 Related articles All 8 versions 
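For orientation, the fixed-support Wasserstein barycenter problem (FS-WBP) discussed in the two Lin et al. entries above is a linear program. The display below is a standard textbook formulation written in our own notation (weights $\lambda_k$, common support of size $n$), not a transcription from either paper.

\begin{aligned}
\min_{w \in \Delta_n,\; \Pi_1,\dots,\Pi_m \ge 0} \quad & \sum_{k=1}^{m} \lambda_k \,\langle C_k, \Pi_k \rangle \\
\text{subject to} \quad & \Pi_k \mathbf{1}_n = w, \qquad \Pi_k^{\top} \mathbf{1}_n = \mu_k, \qquad k = 1,\dots,m,
\end{aligned}

where $C_k$ is the matrix of pairwise ground costs, the $\mu_k$ are the given measures on the common support, and $\Delta_n$ is the probability simplex; the constraint matrix of this LP is the object whose structure the hardness results in the entries above concern.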


[PDF] mlr.press

Fast algorithms for computational optimal transport and wasserstein barycenter

W Guo, N Ho, M Jordan - International Conference on …, 2020 - proceedings.mlr.press

We provide theoretical complexity analysis for new algorithms to compute the optimal transport (OT) distance between two discrete probability distributions, and demonstrate their favorable practical performance compared to state-of-art primal-dual algorithms. First, we …

  Cited by 2 Related articles All 4 versions 

<——2020——2020———1650—— 


[PDF] arxiv.org

Linear Optimal Transport Embedding: Provable fast Wasserstein distance computation and classification for nonlinear problems

C Moosmüller, A Cloninger - arXiv preprint arXiv:2008.09165, 2020 - arxiv.org

Discriminating between distributions is an important problem in a number of scientific fields. This motivated the introduction of Linear Optimal Transportation (LOT), which embeds the space of distributions into an $ L^ 2$-space. The transform is defined by computing the …

Cited by 4 Related articles All 2 versions 

FRWCAE: joint faster-RCNN and Wasserstein convolutional auto-encoder for instance retrieval

Y Zhang, Y Feng, D Liu, J Shang, B Qiang - Applied Intelligence, 2020 - Springer

Based on the powerful feature extraction capability of deep convolutional neural networks, image-level retrieval methods have achieved superior performance compared to the hand-crafted features and indexing algorithms. However, people tend to focus on foreground …

  Related articles

[CITATION] Frwcae: joint faster-rcnn and wasserstein convolutional auto-encoder for instance retrieval

Z Yy, Y Feng, L Dj, S Jx, Q Bh - Applied Intelligence, 2020

  Cited by 2 Related articles


[PDF] arxiv.org

Fast and Smooth Interpolation on Wasserstein Space

S Chewi, J Clancy, TL Gouic, P Rigollet… - arXiv preprint arXiv …, 2020 - arxiv.org

We propose a new method for smoothly interpolating probability measures using the geometry of optimal transport. To that end, we reduce this problem to the classical Euclidean setting, allowing us to directly leverage the extensive toolbox of spline interpolation. Unlike …

  Related articles All 2 versions 


[PDF] arxiv.org

The equivalence of Fourier-based and Wasserstein metrics on imaging problems

G Auricchio, A Codegoni, S Gualandi… - arXiv preprint arXiv …, 2020 - arxiv.org

We investigate properties of some extensions of a class of Fourier-based probability metrics, originally introduced to study convergence to equilibrium for the solution to the spatially homogeneous Boltzmann equation. At difference with the original one, the new Fourier …

  Cited by 1 Related articles All 7 versions 


[PDF] arxiv.org

A material decomposition method for dual-energy CT via dual interactive Wasserstein generative adversarial networks

Z Shi, H Li, Q Cao, Z Wang, M Cheng - arXiv preprint arXiv:2007.11247, 2020 - arxiv.org

Dual-energy computed tomography has great potential in material characterization and identification, whereas the reconstructed material-specific images always suffer from magnified noise and beam hardening artifacts. In this study, a data-driven approach using …

  Related articles All 3 versions 


2020


[PDF] arxiv.org

Fast Epigraphical Projection-based Incremental Algorithms for Wasserstein Distributionally Robust Support Vector Machine

J Li, C Chen, AMC So - arXiv preprint arXiv:2010.12865, 2020 - arxiv.org

Wasserstein\textbf {D} istributionally\textbf {R} obust\textbf {O} ptimization (DRO) is concerned with finding decisions that perform well on data that are drawn from the worst-case probability distribution within a Wasserstein ball centered at a certain nominal …

  Related articles All 5 versions 


[PDF] bessarion.gr

Wind: Wasserstein Inception Distance For Evaluating Generative Adversarial Network Performance

P Dimitrakopoulos, G Sfikas… - ICASSP 2020-2020 IEEE …, 2020 - ieeexplore.ieee.org

In this paper, we present Wasserstein Inception Distance (WInD), a novel metric for evaluating performance of Generative Adversarial Networks (GANs). The proposed metric extends on the rationale of the previously proposed Frechet Inception Distance (FID), in the …

Cited by 6 Related articles All 5 versions

[PDF] jst.go.jp

Orthogonal gradient penalty for fast training of wasserstein gan based multi-task autoencoder toward robust speech recognition

CY Kao, S Park, A Badi, DK Han… - IEICE TRANSACTIONS on …, 2020 - search.ieice.org

Performance in Automatic Speech Recognition (ASR) degrades dramatically in noisy environments. To alleviate this problem, a variety of deep networks based on convolutional neural networks and recurrent neural networks were proposed by applying L1 or L2 loss. In …

  Cited by 1 Related articles All 5 versions


2020

A Fast Globally Linearly Convergent Algorithm for the ... - PolyU

www.polyu.edu.hk › profile › dfsun › WB_jmlr_final

by L Yang · 2020 · Cited by 7 — The computational results show that our sGS-ADMM is highly competitive. 4. Page 5. Fast Algorithm for Computing Wasserstein Barycenters compared to IBP and ...


Person-independent facial expression recognition method based on improved Wasserstein generative adversarial networks in combination with identity aware

C Xu, Y Cui, Y Zhang, P Gao, J Xu - Multimedia Systems, 2020 - Springer

Since the distinction between two expressions is fairly vague, usually a subtle change in one part of the human face is enough to change a facial expression. Most of the existing facial expression recognition algorithms are not robust enough because they rely on general facial …

Cited by 11 Related articles All 3 versions

<——2020——2020———1660——  


[PDF] arxiv.org

Wasserstein Random Forests and Applications in Heterogeneous Treatment Effects

Q Du, G Biau, F Petit, R Porcher - arXiv preprint arXiv:2006.04709, 2020 - arxiv.org

We present new insights into causal inference in the context of Heterogeneous Treatment Effects by proposing natural variants of Random Forests to estimate the key conditional distributions. To achieve this, we recast Breiman's original splitting criterion in terms of …

  Related articles All 4 versions 


[PDF] arxiv.org

Exponential Convergence in Entropy and Wasserstein Distance for McKean-Vlasov SDEs

P Ren, FY Wang - arXiv preprint arXiv:2010.08950, 2020 - arxiv.org

The following type of exponential convergence is proved for (non-degenerate or degenerate) McKean-Vlasov SDEs: $$W_2(\mu_t,\mu_\infty)^2+{\rm Ent}(\mu_t|\mu_\infty)\le c\,{\rm e}^{-\lambda t}\min\big\{W_2(\mu_0,\mu_\infty)^2,{\rm Ent}(\mu_0|\mu_\infty)\big\},\quad t\ge 1 …

  Cited by 1 Related articles All 2 versions 


[PDF] arxiv.org

Convergence in Wasserstein Distance for Empirical Measures of Dirichlet Diffusion Processes on Manifolds

FY Wang - arXiv preprint arXiv:2005.09290, 2020 - arxiv.org

Let $ M $ be a $ d $-dimensional connected compact Riemannian manifold with boundary $\partial M $, let $ V\in C^ 2 (M) $ such that $\mu ({\rm d} x):={\rm e}^{V (x)}{\rm d} x $ is a probability measure, and let $ X_t $ be the diffusion process generated by …

  Cited by 2 Related articles All 3 versions 


[PDF] arxiv.org

Wasserstein Convergence Rate for Empirical Measures on Noncompact Manifolds

FY Wang - arXiv preprint arXiv:2007.14667, 2020 - arxiv.org

Let $ X_t $ be the (reflecting) diffusion process generated by $ L:=\Delta+\nabla V $ on a complete connected Riemannian manifold $ M $ possibly with a boundary $\partial M $, where $ V\in C^ 1 (M) $ such that $\mu (dx):= e^{V (x)} dx $ is a probability measure. We …

  Cited by 1 Related articles All 2 versions 


[PDF] arxiv.org

Convergence rate to equilibrium in Wasserstein distance for reflected jump–diffusions

A Sarantsev - Statistics & Probability Letters, 2020 - Elsevier

Convergence rate to the stationary distribution for continuous-time Markov processes can be studied using Lyapunov functions. Recent work by the author provided explicit rates of convergence in special case of a reflected jump–diffusion on a half-line. These results are …

  Related articles All 7 versions

[PDF] arxiv.org

Convergence of Recursive Stochastic Algorithms using Wasserstein Divergence

A Gupta, WB Haskell - arXiv preprint arXiv:2003.11403, 2020 - arxiv.org

This paper develops a unified framework, based on iterated random operator theory, to analyze the convergence of constant stepsize recursive stochastic algorithms (RSAs) in machine learning and reinforcement learning. RSAs use randomization to efficiently …

  Related articles All 2 versions 


2020

[PDF] arxiv.org

Approximation rate in Wasserstein distance of probability measures on the real line by deterministic empirical measures

O Bencheikh, B Jourdain - arXiv preprint arXiv:2012.09729, 2020 - arxiv.org

We are interested in the approximation in Wasserstein distance with index $\rho\ge 1$ of a probability measure $\mu $ on the real line with finite moment of order $\rho $ by the empirical measure of $ N $ deterministic points. The minimal error converges to $0 $ as …

  Related articles All 3 versions 


Central limit theorems for Markov chains based on their convergence rates in Wasserstein distance

R Jin, A Tan - arXiv preprint arXiv:2002.09427, 2020 - arxiv.org

Many tools are available to bound the convergence rate of Markov chains in total variation (TV) distance. Such results can be used to establish central limit theorems (CLT) that enable error evaluations of Monte Carlo estimates in practice. However, convergence analysis …

  Related articles All 2 versions 


[PDF] arxiv.org

Equidistribution of random walks on compact groups II. The Wasserstein metric

B Borda - arXiv preprint arXiv:2004.14089, 2020 - arxiv.org

We consider a random walk $ S_k $ with iid steps on a compact group equipped with a bi-invariant metric. We prove quantitative ergodic theorems for the sum $\sum_ {k= 1}^ N f (S_k) $ with Hölder continuous test functions $ f $, including the central limit theorem, the …

  Related articles All 2 versions 


Convergence in Monge-Wasserstein Distance of Mean Field Systems with Locally Lipschitz Coefficients

DT Nguyen, SL Nguyen, NH Du - Acta Mathematica Vietnamica, 2020 - Springer

This paper focuses on stochastic systems of weakly interacting particles whose dynamics depend on the empirical measures of the whole populations. The drift and diffusion coefficients of the dynamical systems are assumed to be locally Lipschitz continuous and …

  Related articles

<——2020——2020———1670—— 


[PDF] arxiv.org

Characterization of probability distribution convergence in Wasserstein distance by -quantization error function

Y Liu, G Pagès - Bernoulli, 2020 - projecteuclid.org

We establish conditions to characterize probability measures by their $ L^{p} $-quantization error functions in both $\mathbb {R}^{d} $ and Hilbert settings. This characterization is two-fold: static (identity of two distributions) and dynamic (convergence for the $ L^{p} …

  Cited by 1 Related articles All 5 versions


Convergence rates of the blocked Gibbs sampler with random scan in the Wasserstein metric

NY Wang, G Yin - Stochastics, 2020 - Taylor & Francis

… This paper establishes …

  Related articles All 4 versions


[PDF] tum.de

Lp-Wasserstein and flux-limited gradient flows: Entropic discretization, convergence analysis and numerics

B Söllner - 2020 - mediatum.ub.tum.de

We analyse different discretizations of gradient flows in transport metrics with non-quadratic costs. Among others we discuss the p-Laplace equation and evolution equations with flux-limitation. We prove comparison principles, free energy monotony, non-negativity and mass …

  Related articles All 3 versions 


[PDF] arxiv.org

Multivariate goodness-of-Fit tests based on Wasserstein distance

M Hallin, G Mordant, J Segers - arXiv preprint arXiv:2003.06684, 2020 - arxiv.org

Goodness-of-fit tests based on the empirical Wasserstein distance are proposed for simple and composite null hypotheses involving general multivariate distributions. This includes the important problem of testing for multivariate normality with unspecified mean vector and …

  Cited by 5 Related articles All 10 versions 


[PDF] arxiv.org

Visual transfer for reinforcement learning via wasserstein domain confusion

J RoyG Konidaris - arXiv preprint arXiv:2006.03465, 2020 - arxiv.org

We introduce Wasserstein Adversarial Proximal Policy Optimization (WAPPO), a novel algorithm for visual transfer in Reinforcement Learning that explicitly learns to align the distributions of extracted features between a source and target task. WAPPO approximates …

  Cited by 2 Related articles All 6 versions 


[PDF] arxiv.org

Two-sample Test using Projected Wasserstein Distance: Breaking the Curse of Dimensionality

J Wang, R Gao, Y Xie - arXiv preprint arXiv:2010.11970, 2020 - arxiv.org

We develop a projected Wasserstein distance for the two-sample test, a fundamental problem in statistics and machine learning: given two sets of samples, to determine whether they are from the same distribution. In particular, we aim to circumvent the curse of …

  Cited by 1 Related articles All 3 versions 


[PDF] arxiv.org

Wasserstein Stability for Persistence Diagrams

P Skraba, K Turner - arXiv preprint arXiv:2006.16824, 2020 - arxiv.org

The stability of persistence diagrams is among the most important results in applied and computational topology. Most results in the literature phrase stability in terms of the bottleneck distance between diagrams and the $\infty $-norm of perturbations. This has two …

  Cited by 2 Related articles All 2 versions 


2020

[PDF] springer.com

[PDF] Adapted Wasserstein distances and stability in mathematical finance

J Backhoff-Veraguas, D Bartl, M Beiglböck… - Finance and …, 2020 - Springer

Assume that an agent models a financial asset through a measure with the goal to price/hedge some derivative or optimise some expected utility. Even if the model is chosen in the most skilful and sophisticated way, the agent is left with the possibility that  …

  Cited by 18 Related articles All 12 versions


[PDF] researchgate.net

Inequalities of the Wasserstein mean with other matrix means

S Kim, H Lee - Annals of Functional Analysis, 2020 - Springer

Recently, a new Riemannian metric and a least squares mean of positive definite matrices have been introduced. They are called the Bures–Wasserstein metric and Wasserstein mean, which are different from the Riemannian trace metric and Karcher mean. In this paper …

  Cited by 2 Related articles All 2 versions


[PDF] arxiv.org

Stability of Gibbs posteriors from the Wasserstein loss for Bayesian full waveform inversion

MM Dunlop, Y Yang - arXiv preprint arXiv:2004.03730, 2020 - arxiv.org

Recently, the Wasserstein loss function has been proven to be effective when applied to deterministic full-waveform inversion (FWI) problems. We consider the application of this loss function in Bayesian FWI so that the uncertainty can be captured in the solution. Other …

  Cited by 1 Related articles All 3 versions 

<——2020——2020———1680—— 


[PDF] mlr.press

Quantitative stability of optimal transport maps and linearization of the 2-wasserstein space

Q Mérigot, A Delalande… - … Conference on Artificial …, 2020 - proceedings.mlr.press

This work studies an explicit embedding of the set of probability measures into a Hilbert space, defined using optimal transport maps from a reference probability density. This embedding linearizes to some extent the 2-Wasserstein space and is shown to be bi-Hölder …

  Cited by 12 Related articles All 5 versions 


[PDF] arxiv.org

Wasserstein Distance guided Adversarial Imitation Learning with Reward Shape Exploration

M Zhang, Y Wang, X Ma, L Xia, J Yang… - 2020 IEEE 9th Data …, 2020 - ieeexplore.ieee.org

The generative adversarial imitation learning (GAIL) has provided an adversarial learning framework for imitating expert policy from demonstrations in high-dimensional continuous tasks. However, almost all GAIL and its extensions only design a kind of reward function of …

  Cited by 8 Related articles All 7 versions



[PDF] googleapis.com

Object shape regression using wasserstein distance

J Sun, SKP Kumar, R Bala - US Patent App. 16/222,062, 2020 - Google Patents

One embodiment can provide a system for detecting outlines of objects in images. During operation, the system receives an image that includes at least one object, generates a random noise signal, and provides the received image and the random noise signal to a …

  All 3 versions 


2020

Weak KAM Theory on the Wasserstein Torus with Multidimensional Underlying Space

www.researchgate.net › publication › 259536795_Weak_...

Oct 11, 2020 — Proving the $L_p$-boundedness of such integral operators is the key step in constructing an $L_p$-theory for linear stochastic partial differential ...



Evaluating the performance of climate models based on Wasserstein distance

G Vissio, V Lembo, V Lucarini… - Geophysical Research …, 2020 - Wiley Online Library

We propose a methodology for intercomparing climate models and evaluating their performance against benchmarks based on the use of the Wasserstein distance (WD). This distance provides a rigorous way to measure quantitatively the difference between two …

  Cited by 2 Related articles All 13 versions


[PDF] academia.edu

Data augmentation in fault diagnosis based on the Wasserstein generative adversarial network with gradient penalty

X Gao, F Deng, X Yue - Neurocomputing, 2020 - Elsevier

Fault detection and diagnosis in industrial process is an extremely essential part to keep away from undesired events and ensure the safety of operators and facilities. In the last few decades various data based machine learning algorithms have been widely studied to …

  Cited by 25 Related articles All 3 versions
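The gradient penalty named in this entry's title is a standard WGAN-GP construction. Below is a minimal sketch of the critic loss, assuming PyTorch and 2-D feature tensors of shape (batch, features); critic, real, fake and the weight lam are placeholder names, and this is not code from the cited paper.

import torch

def critic_loss_wgan_gp(critic, real, fake, lam=10.0):
    """WGAN-GP critic loss: Wasserstein term plus a penalty pushing the
    critic's gradient norm towards 1 on random interpolates."""
    eps = torch.rand(real.size(0), 1, device=real.device)   # per-sample mixing weight
    interp = (eps * real + (1 - eps) * fake).requires_grad_(True)
    grad = torch.autograd.grad(critic(interp).sum(), interp,
                               create_graph=True)[0]
    penalty = ((grad.norm(2, dim=1) - 1.0) ** 2).mean()
    # the critic is trained to maximize E[critic(real)] - E[critic(fake)],
    # so the minimized loss flips the sign and adds the penalty
    return critic(fake).mean() - critic(real).mean() + lam * penalty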


[PDF] arxiv.org

On linear optimization over wasserstein balls

MC Yue, D Kuhn, W Wiesemann - arXiv preprint arXiv:2004.07162, 2020 - arxiv.org

Wasserstein balls, which contain all probability measures within a pre-specified Wasserstein distance to a reference measure, have recently enjoyed wide popularity in the distributionally robust optimization and machine learning communities to formulate and …

  Cited by 3 Related articles All 6 versions 


[PDF] arxiv.org

Partial gromov-wasserstein with applications on positive-unlabeled learning

L Chapel, MZ Alaya, G Gasso - arXiv preprint arXiv:2002.08276, 2020 - arxiv.org

Optimal Transport (OT) framework allows defining similarity between probability distributions and provides metrics such as the Wasserstein and Gromov-Wasserstein discrepancies. Classical OT problem seeks a transportation map that preserves the total mass, requiring the …

  Cited by 5 Related articles All 3 versions 


[PDF] core.ac.uk

On the computation of Wasserstein barycenters

G Puccetti, L Rüschendorf, S Vanduffel - Journal of Multivariate Analysis, 2020 - Elsevier

The Wasserstein barycenter is an important notion in the analysis of high dimensional data with a broad range of applications in applied probability, economics, statistics, and in particular to clustering and image processing. In this paper, we state a general version of the …

  Cited by 7 Related articles All 9 versions

<——2020——2020———1690——  


Sample generation based on a supervised Wasserstein Generative Adversarial Network for high-resolution remote-sensing scene classification

W Han, L Wang, R Feng, L Gao, X Chen, Z Deng… - Information …, 2020 - Elsevier

As high-resolution remote-sensing (HRRS) images have become increasingly widely available, scene classification focusing on the smart classification of land cover and land use has also attracted more attention. However, mainstream methods encounter a severe …

  Cited by 3 Related articles All 3 versions


[PDF] arxiv.org

Dynamic facial expression generation on hilbert hypersphere with conditional wasserstein generative adversarial nets

N Otberdout, M Daoudi, A Kacem… - … Transactions on …, 2020 - ieeexplore.ieee.org

In this work, we propose a novel approach for generating videos of the six basic facial expressions given a neutral face image. We propose to exploit the face geometry by modeling the facial landmarks motion as curves encoded as points on a hypersphere. By …

  Cited by 6 Related articles All 10 versions


2020  [PDF] ieee.org

Using improved conditional generative adversarial networks to detect social bots on Twitter

B Wu, L Liu, Y Yang, K Zheng, X Wang - IEEE Access, 2020 - ieeexplore.ieee.org

… disseminating false information with malicious intent. Therefore, it is a crucial and urgent 

task to detect and remove malicious social bots in … Therefore, in this study, we improve 

the CGAN by introducing the Wasserstein distance with a gradient penalty and a condition-generation …

Cited by 14 Related articles



On the Computation of Kantorovich--Wasserstein Distances Between Two-Dimensional Histograms by Uncapacitated Minimum Cost Flows

F Bassetti, S Gualandi, M Veneroni - SIAM Journal on Optimization, 2020 - SIAM

In this work, we present a method to compute the Kantorovich--Wasserstein distance of order 1 between a pair of two-dimensional histograms. Recent works in computer vision and machine learning have shown the benefits of measuring Wasserstein distances of order 1 …

  Cited by 3 Related articles All 2 versions


 [PDF] arxiv.org

Asymptotic Guarantees for Generative Modeling Based on the Smooth Wasserstein Distance

Z Goldfeld, K Greenewald, K Kato - arXiv preprint arXiv:2002.01012, 2020 - arxiv.org

Minimum distance estimation (MDE) gained recent attention as a formulation of (implicit) generative modeling. It considers minimizing, over model parameters, a statistical distance between the empirical data distribution and the model. This formulation lends itself well to …

  Cited by 2 Related articles All 2 versions 

[CITATION] Asymptotic Guarantees for Generative Modeling Based on the Smooth Wasserstein Distance

Z Goldfeld, K Greenewald, K Kato - Advances in Neural Information Processing …, 2020

  Cited by 1 Related articles


[PDF] arxiv.org

Gromov–Hausdorff limit of Wasserstein spaces on point clouds

NG Trillos - Calculus of Variations and Partial Differential …, 2020 - Springer

We consider a point cloud X_n:={x _1, ..., x _n\} X n:= x 1,…, xn uniformly distributed on the flat torus T^ d:= R^ d/Z^ d T d:= R d/Z d, and construct a geometric graph on the cloud by connecting points that are within distance ε ε of each other. We let P (X_n) P (X n) be the …

  Cited by 11 Related articles All 4 versions


[PDF] arxiv.org

Intrinsic Sliced Wasserstein Distances for Comparing Collections of Probability Distributions on Manifolds and Graphs

RM Rustamov, S Majumdar - arXiv preprint arXiv:2010.15285, 2020 - arxiv.org

Collections of probability distributions arise in a variety of statistical applications ranging from user activity pattern analysis to brain connectomics. In practice these distributions are represented by histograms over diverse domain types including finite intervals, circles …

  Cited by 2 Related articles All 2 versions 
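As background for this entry: the standard (extrinsic, Euclidean) sliced Wasserstein distance averages one-dimensional Wasserstein distances over random projections. The sketch below assumes equal-size point clouds and is only loosely related to the intrinsic variants proposed in the paper; names and parameters are illustrative.

import numpy as np

def sliced_wasserstein(X, Y, n_proj=200, p=2, seed=0):
    """Monte Carlo estimate of the sliced p-Wasserstein distance between the
    empirical measures of point clouds X and Y (equal sample sizes assumed)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)        # random unit direction
        x_proj = np.sort(X @ theta)           # in 1-D the optimal coupling
        y_proj = np.sort(Y @ theta)           #   matches sorted samples
        total += np.mean(np.abs(x_proj - y_proj) ** p)
    return (total / n_proj) ** (1.0 / p)

X = np.random.default_rng(1).normal(size=(500, 3))
Y = np.random.default_rng(2).normal(size=(500, 3)) + 1.0   # shifted cloud
print(sliced_wasserstein(X, Y))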


[PDF] unifi.it

[PDF] Conlon: A pseudo-song generator based on a new pianoroll, wasserstein autoencoders, and optimal interpolations

L Angioloni, T Borghuis, L Brusci… - Proceedings of the 21st …, 2020 - flore.unifi.it

We introduce CONLON, a pattern-based MIDI generation method that employs a new lossless pianoroll-like data description in which velocities and durations are stored in separate channels. CONLON uses Wasserstein autoencoders as the underlying generative …

  Cited by 1 Related articles All 7 versions 


[PDF] arxiv.org

A Rademacher-type theorem on L2-Wasserstein spaces over closed Riemannian manifolds

LD Schiavo - Journal of Functional Analysis, 2020 - Elsevier

Let P be any Borel probability measure on the L 2-Wasserstein space (P 2 (M), W 2) over a closed Riemannian manifold M. We consider the Dirichlet form E induced by P and by the Wasserstein gradient on P 2 (M). Under natural assumptions on P, we show that W 2 …

  Cited by 5 Related articles All 6 versions

<——2020——2020———1700——


[PDF] ams.org

On the Wasserstein distance between classical sequences and the Lebesgue measure

L Brown, S Steinerberger - Transactions of the American Mathematical …, 2020 - ams.org

We discuss the classical problem of measuring the regularity of distribution of sets of $ N $ points in $\mathbb {T}^ d $. A recent line of investigation is to study the cost ($= $ mass $\times $ distance) necessary to move Dirac measures placed on these points to the uniform …

  Cited by 3 Related articles All 4 versions


DPIR-Net: Direct PET image reconstruction based on the Wasserstein generative adversarial network

Z Hu, H Xue, Q Zhang, J Gao, N Zhang… - … Transactions on …, 2020 - ieeexplore.ieee.org

Positron emission tomography (PET) is an advanced medical imaging technique widely used in various clinical applications, such as tumor detection and neurologic disorders. Reducing the radiotracer dose is desirable in PET imaging because it decreases the …

  Cited by 4 Related articles


[HTML] hindawi.com

[HTML] An Integrated Consensus Improving Strategy Based on PL-Wasserstein Distance and Its Application in the Evaluation of Network Public Opinion Emergencies

S Zhang, Z Ma, X Liu, Z Wang, L Jiang - Complexity, 2020 - hindawi.com

In real life, multiple network public opinion emergencies may break out in a certain place at the same time. So, it is necessary to invite emergency decision experts in multiple fields for timely evaluating the comprehensive crisis of the online public opinion, and then limited …

  Related articles All 7 versions 


[PDF] mdpi.com

Knowledge-grounded chatbot based on dual wasserstein generative adversarial networks with effective attention mechanisms

S Kim, OW Kwon, H Kim - Applied Sciences, 2020 - mdpi.com

A conversation is based on internal knowledge that the participants already know or external knowledge that they have gained during the conversation. A chatbot that communicates with humans by using its internal and external knowledge is called a knowledge-grounded …

  Cited by 3 Related articles All 4 versions 


[PDF] arxiv.org

A Bismut-Elworthy inequality for a Wasserstein diffusion on the circle

V Marx - arXiv preprint arXiv:2005.04972, 2020 - arxiv.org

We investigate in this paper a regularization property of a diffusion on the Wasserstein space $\mathcal {P} _2 (\mathbb {T}) $ of the one-dimensional torus. The control obtained on the gradient of the semi-group is very much in the spirit of Bismut-Elworthy-Li integration …

  Related articles All 9 versions 





2020

[PDF] upenn.edu

Optimal Estimation of Wasserstein Distance on a Tree With an Application to Microbiome Studies

S Wang, TT CaiH Li - Journal of the American Statistical …, 2020 - Taylor & Francis

The weighted UniFrac distance, a plug-in estimator of the Wasserstein distance of read counts on a tree, has been widely used to measure the microbial community difference in microbiome studies. Our investigation however shows that such a plug-in estimator …

  Related articles All 4 versions


[PDF] arxiv.org

Exponential contraction in Wasserstein distance on static and evolving manifolds

LJ Cheng, A Thalmaier, SQ Zhang - arXiv preprint arXiv:2001.06187, 2020 - arxiv.org

In this article, exponential contraction in Wasserstein distance for heat semigroups of diffusion processes on Riemannian manifolds is established under curvature conditions where Ricci curvature is not necessarily required to be non-negative. Compared to the …

  Cited by 2 Related articles All 5 versions 


A novel kernel Wasserstein distance on Gaussian measures: An application of identifying dental artifacts in head and neck computed tomography

JH Oh, M Pouryahya, A Iyer, AP Apte, JO Deasy… - Computers in biology …, 2020 - Elsevier

The Wasserstein distance is a powerful metric based on the theory of optimal mass transport. It gives a natural measure of the distance between two distributions with a wide range of applications. In contrast to a number of the common divergences on distributions …

  Cited by 1 Related articles All 5 versions


[PDF] arxiv.org

Posterior asymptotics in Wasserstein metrics on the real line

M Chae, P De Blasi, SG Walker - arXiv preprint arXiv:2003.05599, 2020 - arxiv.org

In this paper, we use the class of Wasserstein metrics to study asymptotic properties of posterior distributions. Our first goal is to provide sufficient conditions for posterior consistency. In addition to the well-known Schwartz's Kullback--Leibler condition on the …

  Related articles All 2 versions 

<——2020——2020———1710——



First arrival picking of microseismic signals based on nested U-Net and Wasserstein Generative Adversarial Network

JL Zhang, GQ Sheng - Journal of Petroleum Science and Engineering, 2020 - Elsevier

Picking the first arrival of microseismic signals, quickly and accurately, is the key for real-time data processing of microseismic monitoring. The traditional method cannot meet the high-accuracy and high-efficiency requirements for the firstarrival microseismic picking, in a low …

  Related articles All 2 versions


2020

Nonlinear model reduction on metric spaces. Application to one-dimensional conservative PDEs in Wasserstein spaces

V Ehrlacher, D Lombardi, O Mula… - … and Numerical Analysis, 2020 - search.proquest.com

We consider the problem of model reduction of parametrized PDEs where the goal is to approximate any function belonging to the set of solutions at a reduced computational cost. For this, the bottom line of most strategies has so far been based on the approximation of the …

  Related articles All 2 versions


[PDF] iop.org

Speech Dereverberation Based on Improved Wasserstein Generative Adversarial Networks

L Rao, J Yang - Journal of Physics: Conference Series, 2020 - iopscience.iop.org

In reality, the sound we hear is not only disturbed by noise, but also the reverberant, whose effects are rarely taken into account. Recently, deep learning has shown great advantages in speech signal processing. But among the existing dereverberation approaches, very few …

  Related articles All 2 versions




Drift compensation algorithm based on Time-Wasserstein dynamic distribution alignment

Y Tao, K Zeng, Z Liang - … IEEE/CIC International Conference on …, 2020 - ieeexplore.ieee.org

The electronic nose (E-nose) is mainly used to detect different types and concentrations of gases. At present, the average life of E-nose is relatively short, mainly due to the drift of the sensor resulting in a decrease in the effect. Therefore, it is the focus of research in this field …

  Related articles


 




[PDF] arxiv.org

Geometric Characteristics of Wasserstein Metric on SPD (n)

Y Luo, S Zhang, Y Cao, H Sun - arXiv preprint arXiv:2012.07106, 2020 - arxiv.org

Wasserstein distance, especially among symmetric positive-definite matrices, has broad and deep influences on development of artificial intelligence (AI) and other branches of computer science. A natural idea is to describe the geometry of $ SPD\left (n\right) $ as a Riemannian …

  Related articles All 2 versions 


[PDF] iop.org

A collaborative filtering recommendation framework based on Wasserstein GAN

R Li, F Qian, X Du, S Zhao… - Journal of Physics …, 2020 - iopscience.iop.org

Compared with the original GAN, Wasserstein GAN minimizes the Wasserstein Distance between the generative distribution and the real distribution, can well capture the potential distribution of data and has achieved excellent results in image generation. However, the …

  Related articles


[PDF] arxiv.org

On nonexpansiveness of metric projection operators on Wasserstein spaces

A Adve, A Mészáros - arXiv preprint arXiv:2009.01370, 2020 - arxiv.org

In this note we investigate properties of metric projection operators onto closed and geodesically convex proper subsets of Wasserstein spaces $(\mathcal {P} _p (\mathbf {R}^ d), W_p). $ In our study we focus on the particular subset of probability measures having …

  Related articles All 3 versions 


[PDF] arxiv.org

On the Wasserstein distance for a martingale central limit theorem

X Fan, X Ma - Statistics & Probability Letters, 2020 - Elsevier

… On the Wasserstein distance for a martingale central limit theorem. Author links open overlay panelXiequanFan XiaohuiMa. Show more … Abstract. We prove an upper bound on the Wasserstein distance between normalized martingales and the standard normal random variable, which …

  Related articles All 8 versions

<——2020——2020———1720——



Horo-functions associated to atom sequences on the Wasserstein space

G Zhu, H Wu, X Cui - Archiv der Mathematik, 2020 - Springer

On the Wasserstein space over a complete, separable, non-compact, locally compact length space, we consider the horo-functions associated to sequences of atomic measures. We show the existence of co-rays for any prescribed initial probability measure with respect to a …

  Related articles


Wasserstein GAN based on Autoencoder with back-translation for cross-lingual embedding mappings

Y Zhang, Y Li, Y Zhu, X Hu - Pattern Recognition Letters, 2020 - Elsevier

Recent works about learning cross-lingual word mappings (CWMs) focus on relaxing the requirement of bilingual signals through generative adversarial networks (GANs). GANs based models intend to enforce source embedding space to align target embedding space …

  Related articles All 2 versions


Generating Hyperspectral Data Based on 3D CNN and Improved Wasserstein Generative Adversarial Network Using Homemade High-resolution Datasets

Y Li, D Huang - Proceedings of the International Conference on …, 2020 - dl.acm.org

Hyperspectral images contain rich information on the fingerprints of materials and are being popularly used in the exploration of oil and gas, environmental monitoring, and remote sensing. Since hyperspectral images cover a wide range of wavelengths with high …

  Related articles


[PDF] arxiv.org

On Stein's factors for Poisson approximation in Wasserstein distance with non-linear transportation costs

ZW Liao, Y Ma, A Xia - arXiv preprint arXiv:2003.13976, 2020 - arxiv.org

We establish various bounds on the solutions to a Stein equation for Poisson approximation in Wasserstein distance with non-linear transportation costs. The proofs are a refinement of those in [Barbour and Xia (2006)] using the results in [Liu and Ma (2009)]. As a corollary, we …

  Related articles All 2 versions 

2020

[PDF] arxiv.org

Berry-Esseen smoothing inequality for the Wasserstein metric on compact Lie groups

B Borda - arXiv preprint arXiv:2005.04925, 2020 - arxiv.org

We prove a general inequality estimating the distance of two probability measures on a compact Lie group in the Wasserstein metric in terms of their Fourier transforms. The result is close to being sharp. We use a generalized form of the Wasserstein metric, related by …

  Related articles All 2 versions 




[PDF] archives-ouvertes.fr

On the Wasserstein distance between mutually singular measures

G Buttazzo, G Carlier, M Laborde - Advances in Calculus of …, 2020 - degruyter.com

We study the Wasserstein distance between two measures μ, ν which are mutually singular. In particular, we are interested in minimization problems of the form W(μ, 𝒜) = inf{W(μ, ν) : ν ∈ 𝒜}, where μ is a given probability and 𝒜 is contained in the class of probabilities …

  Cited by 1 Related articles All 6 versions


[PDF] dergipark.org.tr

Wasserstein Riemannian Geometry on Statistical Manifold

C Ogouyandjou, N Wadagni - International Electronic Journal of …, 2020 - dergipark.org.tr

In this paper, we study some geometric properties of statistical manifold equipped with the Riemannian Otto metric which is related to the L 2-Wasserstein distance of optimal mass transport. We construct some α-connections on such manifold and we prove that the …

  Related articles All 2 versions 


A Super Resolution Method for Remote Sensing Images Based on Cascaded Conditional Wasserstein GANs

B Liu, H Li, Y Zhou, Y Peng, A Elazab… - … Conference on …, 2020 - ieeexplore.ieee.org

High-resolution (HR) remote sensing imagery is quite beneficial for subsequent interpretation. Obtaining HR images can be achieved by upgrading the imaging device. Yet, the cost to perform this task is very huge. Thus, it is necessary to obtain HR images from low …

  Related articles

<——2020——2020———1730——


[PDF] unipv.it

[PDF] On the equivalence between Fourier-based and Wasserstein metrics

G Auricchio, A Codegoni, S Gualandi, G Toscani… - mate.unipv.it

We investigate properties of some extensions of a class of Fourierbased probability metrics, originally introduced to study convergence to equilibrium for the solution to the spatially homogeneous Boltzmann equation. At difference with the original one, the new Fourier …

  Related articles 


[PDF] researchgate.net

Infrared and visible image fusion using dual discriminators generative adversarial networks with Wasserstein distance

J Li, H Huo, K Liu, C Li - Information Sciences, 2020 - Elsevier

Generative adversarial network (GAN) has shown great potential in infrared and visible image fusion. The existing GAN-based methods establish an adversarial game between generative image and source images to train the generator until the generative image …

  Cited by 4 Related articles All 3 versions


2020

[PDF] esaim-proc.org

Statistical data analysis in the Wasserstein space

J Bigot - ESAIM: Proceedings and Surveys, 2020 - esaim-proc.org

This paper is concerned by statistical inference problems from a data set whose elements may be modeled as random probability measures such as multiple histograms or point clouds. We propose to review recent contributions in statistics on the use of Wasserstein  …

  Cited by 2 Related articles



[PDF] ecai2020.eu

[PDF] Dual Rejection Sampling for Wasserstein Auto-Encoders

L Hou, H Shen, X Cheng - 24th European Conference on Artificial …, 2020 - ecai2020.eu

Deep generative models enhanced by Wasserstein distance have achieved remarkable success in recent years. Wasserstein Auto-Encoders (WAEs) are auto-encoder based generative models that aim to minimize the Wasserstein distance between the data …

  Cited by 1 Related articles All 3 versions 


2020

[PDF] arxiv.org

Randomised Wasserstein Barycenter Computation: Resampling with Statistical Guarantees

F Heinemann, A Munk, Y Zemel - arXiv preprint arXiv:2012.06397, 2020 - arxiv.org

We propose a hybrid resampling method to approximate finitely supported Wasserstein barycenters on large-scale datasets, which can be combined with any exact solver. Nonasymptotic bounds on the expected error of the objective value as well as the …

  Related articles All 2 versions 



[PDF] arxiv.org

Statistical learning in Wasserstein space

A Karimi, L Ripani, TT Georgiou - IEEE Control Systems Letters, 2020 - ieeexplore.ieee.org

We seek a generalization of regression and principle component analysis (PCA) in a metric space where data points are distributions metrized by the Wasserstein metric. We recast these analyses as multimarginal optimal transport problems. The particular formulation …

  Related articles All 7 versions 


[PDF] arxiv.org

Statistical analysis of Wasserstein GANs with applications to time series forecasting

M Haas, S Richter - arXiv preprint arXiv:2011.03074, 2020 - arxiv.org

We provide statistical theory for conditional and unconditional Wasserstein generative adversarial networks (WGANs) in the framework of dependent observations. We prove upper bounds for the excess Bayes risk of the WGAN estimators with respect to a modified …

  Cited by 2 Related articles All 3 versions 



<——2020——2020———1740——


[PDF] ethz.ch

[PDF] Smooth Wasserstein Distance: Metric Structure and Statistical Efficiency

Z Goldfeld - International Zurich Seminar on Information …, 2020 - research-collection.ethz.ch

The Wasserstein distance has seen a surge of interest and applications in machine learning. Its popularity is driven by many advantageous properties it possesses, such as metric structure (metrization of weak convergence), robustness to support mismatch, compatibility …

  Related articles All 4 versions 


2020

[PDF] Statistical Inference for Bures-Wasserstein Barycenters, by Alexey Kroshnin, Vladimir Spokoiny and Alexandra Suvorikova

A KROSHNIN - researchgate.net

In this work we introduce the concept of Bures–Wasserstein barycenter Q, that is essentially a Fréchet mean of some distribution P supported on a subspace of positive semi-definite d-dimensional Hermitian operators H+(d). We allow a barycenter to be constrained …

  Related articles 



[PDF] optimization-online.org

[PDF] Dual Decomposition of Two-Stage Distributionally Robust Mixed-Integer Programming under the Wasserstein Ambiguity Set

K Kim - optimization-online.org

We develop a dual decomposition of two-stage distributionally robust mixed-integer programming (DRMIP) under the Wasserstein ambiguity set. The dual decomposition is based on the Lagrangian dual of DRMIP, which results from the Lagrangian relaxation of the …

  Related articles All 2 versions 


Optimality in weighted L2-Wasserstein goodness-of-fit statistics

T De Wet, V Humble - South African Statistical Journal, 2020 - journals.co.za

Abstract In Del Barrio, Cuesta-Albertos, Matran and Rodriguez-Rodriguez (1999) and Del Barrio, Cuesta-Albertos and Matran (2000), the authors introduced a new class of goodness-of-fit statistics based on the L 2-Wasserstein distance. It was shown that the desirable …

  Related articles All 2 versions


Fused Gromov-Wasserstein distance for structured objects

T Vayer, L Chapel, R Flamary, R Tavenard, N Courty - Algorithms, 2020 - mdpi.com

Optimal transport theory has recently found many applications in machine learning thanks to its capacity to meaningfully compare various machine learning objects that are viewed as distributions. The Kantorovitch formulation, leading to the Wasserstein distance, focuses on …

  Cited by 4 Related articles All 33 versions 



2020


[PDF] sns.it

Optimal control of multiagent systems in the Wasserstein space

C Jimenez, A Marigonda, M Quincampoix - Calculus of Variations and …, 2020 - Springer

This paper concerns a class of optimal control problems, where a central planner aims to control a multi-agent system in R^ d R d in order to minimize a certain cost of Bolza type. At every time and for each agent, the set of admissible velocities, describing his/her underlying …

  Cited by 6 Related articles All 3 versions


[PDF] stanford.edu

A class of optimal transport regularized formulations with applications to wasserstein gans

S Mahdian, JH Blanchet… - 2020 Winter Simulation …, 2020 - ieeexplore.ieee.org

Optimal transport costs (eg Wasserstein distances) are used for fitting high-dimensional

distributions. For example, popular artificial intelligence algorithms such as Wasserstein

Generative Adversarial Networks (WGANs) can be interpreted as fitting a black-box …

  All 3 versions




[PDF] biorxiv.org

Gromov-Wasserstein optimal transport to align single-cell multi-omics data

P DemetciR SantorellaB SandstedeWS Noble… - BioRxiv, 2020 - biorxiv.org

Data integration of single-cell measurements is critical for understanding cell development and disease, but the lack of correspondence between different types of measurements makes such efforts challenging. Several unsupervised algorithms can align heterogeneous …

  Cited by 4 Related articles All 3 versions 




<——2020——2020———1750——




2020



[PDF] esaim-ps.org

Squared quadratic Wasserstein distance: optimal couplings and Lions differentiability

A Alfonsi, B Jourdain - ESAIM: Probability and Statistics, 2020 - esaim-ps.org

In this paper, we remark that any optimal coupling for the quadratic Wasserstein distance between two probability measures μ and ν with finite second order moments on d is the composition of a martingale coupling with an optimal transport map. We check the existence …

  Related articles All 5 versions


[PDF] A CLASS OF OPTIMAL TRANSPORT REGULARIZED FORMULATIONS WITH APPLICATIONS TO WASSERSTEIN GANS

KH Bae, B Feng, S Kim, S Lazarova-Molnar, Z Zheng… - stanford.edu

Optimal transport costs (eg Wasserstein distances) are used for fitting high-dimensional distributions. For example, popular artificial intelligence algorithms such as Wasserstein Generative Adversarial Networks (WGANs) can be interpreted as fitting a black-box …

  

2020



2020

2020  modified

[PDF] cnrs.fr

[PDF] SUPPLEMENTARY MATERIALS: Wasserstein Dictionary Learning: Optimal Transport-based unsupervised non-linear dictionary learning

MA Schmitz, M Heitz, N Bonneel, F Ngole, D Coeurjolly… - perso.liris.cnrs.fr

SUPPLEMENTARY MATERIALS: Wasserstein Dictionary Learning: Optimal Transport-based unsupervised non-linear dictionary learning … Morgan A. Schmitz† , Matthieu Heitz‡ , Nicolas Bonneel‡ , Fred Ngolè§ , David Coeurjolly‡ , Marco Cuturi¶, Gabriel Peyré , and Jean-Luc …

  Related articles All 2 versions 


Numeric Data Augmentation using Structural Constraint Wasserstein Generative Adversarial Networks

W Wang, C Wang, T Cui, R Gong… - … on Circuits and …, 2020 - ieeexplore.ieee.org

Some recent studies have suggested using GANs for numeric data generation such as to generate data for completing the imbalanced numeric data. Considering the significant difference between the dimensions of the numeric data and images, as well as the strong …

 

2020

Intelligent Optical Communication Based on Wasserstein Generative Adversarial Network

By: Mu Di; Meng Wen; Zhao Shanghong; et al.

CHINESE JOURNAL OF LASERS-ZHONGGUO JIGUANG  Volume: 47   Issue: 11     Article Number: 1106005   Published: NOV 2020


2020

[PDF] arxiv.org

A generalized Vaserstein symbol

T Syed - Annals of K-Theory, 2020 - msp.org

Let R be a commutative ring. For any projective R-module $P_0$ of constant rank 2 with a trivialization of its determinant, we define a generalized Vaserstein symbol on the orbit space of the set of epimorphisms $P_0 \oplus R \rightarrow R$ under the action of the group of elementary …

  Cited by 4 Related articles All 9 versions 


2020

A Survey on the Non-injectivity of the Vaserstein Symbol in Dimension Three

N Gupta, DR Rao, S Kolte - Leavitt Path Algebras and Classical K-Theory, 2020 - Springer

We give a recap of the study of the Vaserstein symbol \(V_A : Um_3(A)/E_3(A) \longrightarrow W_E(A)\), the elementary symplectic Witt group; when A is an affine threefold over a field k … LN Vaserstein in [20] proved that the orbit space of unimodular rows of length three modulo elementary …

  Cited by 1 Related articles All 2 versions

<——2020——2020———1760——  



An enhanced uncertainty principle for the Vaserstein distance

T Carroll, X Massaneda… - Bulletin of the London …, 2020 - Wiley Online Library

We improve some recent results of Sagiv and Steinerberger that quantify the following uncertainty principle: for a function f with mean zero, then either the size of the zero set of the function or the cost of transporting the mass of the positive part of f to its negative part must …

  Cited by 7 Related articles All 6 versions

MR4224354 Prelim Carroll, Tom; Massaneda, Xavier; Ortega-Cerdà, Joaquim; An enhanced uncertainty principle for the Vaserstein distance. Bull. Lond. Math. Soc. 52 (2020), no. 6, 1158–1173. 35P20 (28A75 49Q22 58C40)

Review PDF Clipboard Journal Article

[PDF] arxiv.org

When can Wasserstein GANs minimize Wasserstein Distance?

Y Li, Z Dou - arXiv preprint arXiv:2003.04033, 2020 - arxiv.org

Generative Adversarial Networks (GANs) are widely used models to learn complex real-world distributions. In GANs, the training of the generator usually stops when the discriminator can no longer distinguish the generator's output from the set of training …

  Cited by 5 Related articles All 3 versions 

 

Kantorovich–Rubinstein–Wasserstein distance between overlapping attractor and repeller

V Chigarev, A Kazakov, A Pikovsky - Chaos: An Interdisciplinary …, 2020 - aip.scitation.org

We consider several examples of dynamical systems demonstrating overlapping attractor and repeller. These systems are constructed via introducing controllable dissipation to prototypic models with chaotic dynamics (Anosov cat map, Chirikov standard map, and …

  Cited by 4 Related articles All 5 versions


2020

A Linear Programming Approximation of Distributionally Robust Chance-Constrained Dispatch With Wasserstein Distance

A Zhou, M Yang, M Wang… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

This paper proposes a data-driven distributionally robust chance constrained real-time dispatch (DRCC-RTD) considering renewable generation forecasting errors. The proposed DRCC-RTD model minimizes the expected quadratic cost function and guarantees that the …

 Cited by 19 Related articles All 2 versions

Stochastic optimization for regularized wasserstein estimators

M Ballu, Q Berthet, F Bach - International Conference on …, 2020 - proceedings.mlr.press

Optimal transport is a foundational problem in optimization, that allows to compare probability distributions while taking into account geometric aspects. Its optimal objective …

 Cite Cited by 11 Related articles All 6 versions 

Stochastic Optimization for Regularized Wasserstein Estimators

F Bach, M Ballu, Q Berthet - 2020 - research.google

Optimal transport is a foundational problem in optimization, that allows to compare probability distributions while taking into account geometric aspects. Its optimal objective …

2020

[PDF] arxiv.org

Regularization helps with mitigating poisoning attacks: Distributionally-robust machine learning using the wasserstein distance

F Farokhi - arXiv preprint arXiv:2001.10655, 2020 - arxiv.org

We use distributionally-robust optimization for machine learning to mitigate the effect of data poisoning attacks. We provide performance guarantees for the trained model on the original data (not including the poison records) by training the model for the worst-case distribution …

  Cited by 5 Related articles All 3 versions View as HTML 
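
A note on the generic formulation behind entries like the one above (distributionally robust learning over a Wasserstein ball); this is the standard textbook form, not a claim about this particular paper's model. With empirical distribution $\hat P_n$, radius $\varepsilon$ and loss $\ell$, Wasserstein DRO solves

$$\min_{\theta} \ \sup_{Q:\, W_p(Q, \hat P_n) \le \varepsilon} \ \mathbb{E}_{Z \sim Q}\big[\ell(\theta; Z)\big],$$

i.e. the model is trained against the worst-case distribution within Wasserstein distance $\varepsilon$ of the observed data.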


2020

[PDF] arxiv.org

Approximate bayesian computation with the sliced-wasserstein distance

K Nadjahi, V De Bortoli, A Durmus… - ICASSP 2020-2020 …, 2020 - ieeexplore.ieee.org

Approximate Bayesian Computation (ABC) is a popular method for approximate inference in generative models with intractable but easy-to-sample likelihood. It constructs an approximate posterior distribution by finding parameters for which the simulated data are …

Cited by 12 Related articles All 8 versions
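
For readers unfamiliar with the sliced-Wasserstein distance used in the entry above, here is a minimal Monte Carlo sketch in Python. It is illustrative only, not the authors' code; the function name and parameters are made up for this note. It averages one-dimensional Wasserstein distances over random projection directions, using scipy.stats.wasserstein_distance for the 1-D computations.

    import numpy as np
    from scipy.stats import wasserstein_distance

    def sliced_wasserstein(X, Y, n_projections=100, seed=0):
        # Monte Carlo estimate of the sliced 1-Wasserstein distance between two
        # point clouds X (n, d) and Y (m, d): project both onto random unit
        # directions and average the resulting 1-D Wasserstein distances.
        rng = np.random.default_rng(seed)
        d = X.shape[1]
        total = 0.0
        for _ in range(n_projections):
            theta = rng.normal(size=d)
            theta /= np.linalg.norm(theta)
            total += wasserstein_distance(X @ theta, Y @ theta)
        return total / n_projections

The appeal in ABC settings is that each projection only requires sorting, so the cost stays low even for many simulated datasets.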

 

Finite-Horizon Control of Nonlinear Discrete-Time Systems with Terminal Cost of Wasserstein Distance

K Hoshino - 2020 59th IEEE Conference on Decision and …, 2020 - ieeexplore.ieee.org

This study explores a finite-horizon optimal control problem of nonlinear discrete-time systems for steering a probability distribution of initial states as close as possible to a desired probability distribution of terminal states. The problem is formulated as an optimal …

  Cited by 1 Related articles

[PDF] arxiv.org

wasserstein-type distance in the space of gaussian mixture models

J Delon, A Desolneux - SIAM Journal on Imaging Sciences, 2020 - SIAM

In this paper we introduce a Wasserstein-type distance on the set of Gaussian mixture models. This distance is defined by restricting the set of possible coupling measures in the optimal transport problem to Gaussian mixture models. We derive a very simple discrete …

  Cited by 9 Related articles All 7 versions

Cited by 37 Related articles All 7 versions
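
Background for the entry above: the Wasserstein-type GMM distance builds on the closed-form quadratic Wasserstein distance between Gaussians, a standard fact stated here for reference:

$$W_2^2\big(\mathcal N(m_1,\Sigma_1),\, \mathcal N(m_2,\Sigma_2)\big) = \|m_1 - m_2\|^2 + \operatorname{tr}\Big(\Sigma_1 + \Sigma_2 - 2\big(\Sigma_1^{1/2}\Sigma_2\Sigma_1^{1/2}\big)^{1/2}\Big).$$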

[PDF] arxiv.org

The Wasserstein-Fourier distance for stationary time series

E Cazelles, A Robert, F Tobar - IEEE Transactions on Signal …, 2020 - ieeexplore.ieee.org

We propose the Wasserstein-Fourier (WF) distance to measure the (dis) similarity between time series by quantifying the displacement of their energy across frequencies. The WF distance operates by calculating the Wasserstein distance between the (normalised) power …

  Cited by 6 Related articles All 45 versions
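
A rough Python sketch of the idea described in the abstract above (comparing normalised power spectra with a Wasserstein distance). This is an illustration under simplifying assumptions (equal-length series, order-1 distance for simplicity), not the authors' implementation.

    import numpy as np
    from scipy.stats import wasserstein_distance

    def wasserstein_fourier(x, y, fs=1.0):
        # Treat the normalised power spectra of x and y as probability
        # distributions over frequency and compare them with the
        # 1-Wasserstein distance on the common frequency grid.
        f = np.fft.rfftfreq(len(x), d=1.0 / fs)
        px = np.abs(np.fft.rfft(x)) ** 2
        py = np.abs(np.fft.rfft(y)) ** 2
        px /= px.sum()
        py /= py.sum()
        return wasserstein_distance(f, f, u_weights=px, v_weights=py)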

<——2020——2020———1770—— 


Convergence rate to equilibrium in Wasserstein distance for reflected jump–diffusions

A Sarantsev - Statistics & Probability Letters, 2020 - Elsevier

Convergence rate to the stationary distribution for continuous-time Markov processes can be studied using Lyapunov functions. Recent work by the author provided explicit rates of convergence in special case of a reflected jump–diffusion on a half-line. These results are …

  Related articles All 7 versions

  

[PDF] mlr.press

Robust Document Distance with Wasserstein-Fisher-Rao metric

Z Wang, D Zhou, M Yang, Y Zhang… - Asian Conference on …, 2020 - proceedings.mlr.press

Computing the distance among linguistic objects is an essential problem in natural language processing. The word mover's distance (WMD) has been successfully applied to measure the document distance by synthesizing the low-level word similarity with the …

  Cited by 1 Related articles 


[PDF] bessarion.gr

Wind: Wasserstein Inception Distance For Evaluating Generative Adversarial Network Performance

P Dimitrakopoulos, G Sfikas… - ICASSP 2020-2020 IEEE …, 2020 - ieeexplore.ieee.org

In this paper, we present Wasserstein Inception Distance (WInD), a novel metric for evaluating performance of Generative Adversarial Networks (GANs). The proposed metric extends on the rationale of the previously proposed Frechet Inception Distance (FID), in the …

  Related articles All 3 versions

Cited by 6 Related articles All 5 versions

Spatial-aware Network using Wasserstein Distance for Unsupervised Domain Adaptation

L Long, L Bin, F Jiang - 2020 Chinese Automation Congress …, 2020 - ieeexplore.ieee.org

In a general scenario, the purpose of Unsupervised Domain Adaptation (UDA) is to classify unlabeled target domain data as much as possible, but the source domain data has a large number of labels. To address this situation, this paper introduces the optimal transport theory …


[PDF] arxiv.org

Social-wagdat: Interaction-aware trajectory prediction via wasserstein graph double-attention network

J Li, H Ma, Z Zhang, M Tomizuka - arXiv preprint arXiv:2002.06241, 2020 - arxiv.org

Effective understanding of the environment and accurate trajectory prediction of surrounding dynamic obstacles are indispensable for intelligent mobile systems (like autonomous vehicles and social robots) to achieve safe and high-quality planning when they navigate in …

 Cited by 38 Related articles All 3 versions 

Conditional Wasserstein generative adversarial network-gradient penalty-based approach to alleviating imbalanced data classification

M Zheng, T Li, R Zhu, Y Tang, M Tang, L Lin, Z Ma - Information Sciences, 2020 - Elsevier

In data mining, common classification algorithms cannot effectively learn from imbalanced data. Oversampling addresses this problem by creating data for the minority class in order to balance the class distribution before the model is trained. The Traditional oversampling …

  Cited by 12 Related articles All 2 versions
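
Several entries in this list rely on the WGAN-GP critic penalty (Gulrajani et al., 2017). A minimal PyTorch sketch of that penalty is given here as general background only; it is not taken from any of the cited papers, and the function name is made up for this note.

    import torch

    def gradient_penalty(critic, real, fake, lambda_gp=10.0):
        # Penalise deviations of the critic's gradient norm from 1 on random
        # interpolates between real and generated samples (WGAN-GP).
        alpha = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
        interp = (alpha * real + (1 - alpha) * fake).detach().requires_grad_(True)
        scores = critic(interp)
        grads = torch.autograd.grad(outputs=scores, inputs=interp,
                                    grad_outputs=torch.ones_like(scores),
                                    create_graph=True)[0]
        grad_norm = grads.flatten(1).norm(2, dim=1)
        return lambda_gp * ((grad_norm - 1) ** 2).mean()

The penalty is added to the critic loss at every update, replacing the weight clipping used in the original WGAN.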


[HTML] hindawi.com

[HTML] Wasserstein generative adversarial network and convolutional neural network (WG-CNN) for bearing fault diagnosis

H Yin, Z Li, J Zuo, H Liu, K Yang, F Li - Mathematical Problems in …, 2020 - hindawi.com

In recent years, intelligent fault diagnosis technology with deep learning algorithms has been widely used in industry, and they have achieved gratifying results. Most of these methods require large amount of training data. However, in actual industrial systems, it is …

  Cited by 7 Related articles All 7 versions 


A Wasserstein gradient-penalty generative adversarial network with deep auto-encoder for bearing intelligent fault diagnosis

X Xiong, J Hongkai, X Li, M Niu - Measurement Science and …, 2020 - iopscience.iop.org

It is a great challenge to manipulate unbalanced fault data in the field of rolling bearings intelligent fault diagnosis. In this paper, a novel intelligent fault diagnosis method called the Wasserstein gradient-penalty generative adversarial network with deep auto-encoder is …

  Cited by 4 Related articles All 2 versions


[PDF] ucl.ac.uk

Ripple-GAN: Lane line detection with ripple lane line detection network and Wasserstein GAN

Y Zhang, Z Lu, D Ma, JH Xue… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

With artificial intelligence technology being advanced by leaps and bounds, intelligent driving has attracted a huge amount of attention recently in research and development. In intelligent driving, lane line detection is a fundamental but challenging task particularly …

  Cited by 1 Related articles All 2 versions


Wasserstein based transfer network for cross-domain sentiment classification

Y Du, M He, L Wang, H Zhang - Knowledge-Based Systems, 2020 - Elsevier

Automatic sentiment analysis of social media texts is of great significance for identifying people's opinions that can help people make better decisions. Annotating data is time consuming and laborious, and effective sentiment analysis on domains lacking of labeled …

  Cited by 1 Related articles All 2 versions


[PDF] arxiv.org

Modeling EEG data distribution with a Wasserstein Generative Adversarial Network to predict RSVP Events

S Panwar, P Rad, TP Jung… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

Electroencephalography (EEG) data are difficult to obtain due to complex experimental setups and reduced comfort with prolonged wearing. This poses challenges to train powerful deep learning model with the limited EEG data. Being able to generate EEG data …

  Cited by 1 Related articles All 5 versions

<——2020——2020——1780—— 


The Wasserstein Impact Measure (WIM): a generally applicable, practical tool for quantifying prior impact in Bayesian statistics

F Ghaderinezhad, C Ley, B Serrien - arXiv preprint arXiv:2010.12522, 2020 - arxiv.org

The prior distribution is a crucial building block in Bayesian analysis, and its choice will impact the subsequent inference. It is therefore important to have a convenient way to quantify this impact, as such a measure of prior impact will help us to choose between two or …

  Related articles All 3 versions 


[PDF] projecteuclid.org

Existence of probability measure valued jump-diffusions in generalized Wasserstein spaces

M Larsson, S Svaluto-Ferro - Electronic Journal of Probability, 2020 - projecteuclid.org

We study existence of probability measure valued jump-diffusions described by martingale problems. We develop a simple device that allows us to embed Wasserstein spaces and other similar spaces of probability measures into locally compact spaces where classical …

  Cited by 2 Related articles All 3 versions



[PDF] aclweb.org

WAERN: Integrating Wasserstein Autoencoder and Relational Network for Text Sequence

X Zhang, X Liu, G Yang, F Li, W Liu - China National Conference on …, 2020 - Springer

Abstract One challenge in Natural Language Processing (NLP) area is to learn semantic representation in different contexts. Recent works on pre-trained language model have received great attentions and have been proven as an effective technique. In spite of the …

  Related articles All 4 versions


[HTML] springer.com

[HTML] Missing Features Reconstruction Using a Wasserstein Generative Adversarial Imputation Network

M Friedjungová, D Vašata, M Balatsko… - … on Computational Science, 2020 - Springer

Missing data is one of the most common preprocessing problems. In this paper, we experimentally research the use of generative and non-generative models for feature reconstruction. Variational Autoencoder with Arbitrary Conditioning (VAEAC) and …

  Cited by 1 Related articles All 8 versions


[HTML] peerj.com

[HTML] Data augmentation-based conditional Wasserstein generative adversarial network-gradient penalty for XSS attack detection system

FMM Mokbal, D Wang, X Wang, L Fu - PeerJ Computer Science, 2020 - peerj.com

The rapid growth of the worldwide web and accompanied opportunities of web applications in various aspects of life have attracted the attention of organizations, governments, and individuals. Consequently, web applications have increasingly become the target of …

  Related articles All 5 versions 

 

2020

Chinese font translation with improved Wasserstein generative adversarial network

Y Miao, H Jia, K Tang, Y Ji - Twelfth International Conference …, 2020 - spiedigitallibrary.org

Nowadays, various fonts are applied in many fields, and the generation of multiple fonts by computer plays an important role in the inheritance, development and innovation of Chinese culture. Aiming at the existing font generation methods, which have some problems such as …

  Related articles All 2 versions


[PDF] arxiv.org

Parameter-Transferred Wasserstein Generative Adversarial Network (PT-WGAN) for Low-Dose PET Image Denoising

Y Gong, H Shan, Y Teng, N Tu, M Li… - … on Radiation and …, 2020 - ieeexplore.ieee.org

Due to the widespread use of positron emission tomography (PET) in clinical practice, the potential risk of PET-associated radiation dose to patients needs to be minimized. However, with the reduction in the radiation dose, the resultant images may suffer from noise and …

  Related articles All 3 versions


[PDF] arxiv.org

A Rademacher-type theorem on L2-Wasserstein spaces over closed Riemannian manifolds

LD Schiavo - Journal of Functional Analysis, 2020 - Elsevier

Let P be any Borel probability measure on the L 2-Wasserstein space (P 2 (M), W 2) over a closed Riemannian manifold M. We consider the Dirichlet form E induced by P and by the Wasserstein gradient on P 2 (M). Under natural assumptions on P, we show that W 2 …

  Cited by 5 Related articles All 6 versions


A Wasserstein based two-stage distributionally robust optimization model for optimal operation of CCHP micro-grid under uncertainties

Y Wang, Y Yang, L Tang, W Sun, B Li - International Journal of Electrical …, 2020 - Elsevier

Combined cooling, heating and power (CCHP) micro-grids are getting increasing attentions due to the realization of cleaner production and high energy efficiency. However, with the features of complex tri-generation structure and renewable power uncertainties, it is …

  Cited by 13 Related articles All 2 versions


[PDF] arxiv.org

Scalable computations of wasserstein barycenter via input convex neural networks

J Fan, A Taghvaei, Y Chen - arXiv preprint arXiv:2007.04462, 2020 - arxiv.org

Wasserstein Barycenter is a principled approach to represent the weighted mean of a given set of probability distributions, utilizing the geometry induced by optimal transport. In this work, we present a novel scalable algorithm to approximate the Wasserstein Barycenters …

  Cited by 3 Related articles All 3 versions 
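
For reference, the object computed in the barycenter entries above and below (standard definition, not specific to any one paper): given measures $\nu_1,\dots,\nu_N$ and weights $\lambda_i \ge 0$ with $\sum_i \lambda_i = 1$, the Wasserstein barycenter is

$$\bar\mu \in \arg\min_{\mu} \ \sum_{i=1}^{N} \lambda_i \, W_2^2(\mu, \nu_i).$$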

<——2020——2020———1790——



[PDF] researchgate.net

Tractable reformulations of two-stage distributionally robust linear programs over the type-∞ wasserstein ball

W Xie - Operations Research Letters, 2020 - Elsevier

This paper studies a two-stage distributionally robust stochastic linear program under the type-∞ Wasserstein ball by providing sufficient conditions under which the program can be efficiently computed via a tractable convex program. By exploring the properties of binary …

  Cited by 6 Related articles All 4 versions




[PDF] imstat.org

A Central Limit Theorem for Wasserstein type distances between two distinct univariate distributions

P Berthet, JC Fort, T Klein - Annales de l'Institut Henri Poincaré …, 2020 - projecteuclid.org

In this article we study the natural nonparametric estimator of a Wasserstein type cost between two distinct continuous distributions $ F $ and $ G $ on $\mathbb {R} $. The estimator is based on the order statistics of a sample having marginals $ F $, $ G $ and any …

  Related articles All 4 versions
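
The estimator mentioned in the abstract above rests on the classical quantile representation of Wasserstein distances on the real line (a standard fact, stated here for context): for univariate distribution functions $F$ and $G$,

$$W_p^p(F, G) = \int_0^1 \big|F^{-1}(u) - G^{-1}(u)\big|^p \, du,$$

so the natural nonparametric estimator replaces $F^{-1}$ and $G^{-1}$ by empirical quantile functions, i.e. by the order statistics of the two samples.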


[PDF] arxiv.org

Martingale Wasserstein inequality for probability measures in the convex order

B Jourdain, W Margheriti - arXiv preprint arXiv:2011.11599, 2020 - arxiv.org

It is known since [24] that two one-dimensional probability measures in the convex order admit a martingale coupling with respect to which the integral of $\vert xy\vert $ is smaller than twice their $\mathcal W_1 $-distance (Wasserstein distance with index $1 $). We …

  Related articles All 7 versions 



[PDF] jst.go.jp

Orthogonal gradient penalty for fast training of wasserstein gan based multi-task autoencoder toward robust speech recognition

CY Kao, S Park, A Badi, DK Han… - IEICE TRANSACTIONS on …, 2020 - search.ieice.org

Performance in Automatic Speech Recognition (ASR) degrades dramatically in noisy environments. To alleviate this problem, a variety of deep networks based on convolutional neural networks and recurrent neural networks were proposed by applying L1 or L2 loss. In …

  Cited by 1 Related articles All 5 versions



[PDF] arxiv.org

Second-order Conic Programming Approach for Wasserstein Distributionally Robust Two-stage Linear Programs

Z Wang, K You, S Song, Y Zhang - arXiv preprint arXiv:2002.06751, 2020 - arxiv.org

This paper proposes a second-order conic programming (SOCP) approach to solve distributionally robust two-stage stochastic linear programs over 1-Wasserstein balls. We start from the case with distribution uncertainty only in the objective function and exactly …

  Related articles All 3 versions 

Unpaired Image Denoising via Wasserstein GAN in Low-Dose CT Image with Multi-Perceptual Loss and Fidelity Loss. Symmetry 2021, 13, 126

Z Yin, K Xia, Z He, J Zhang, S Wang, B Zu - 2021 - search.proquest.com

The use of low-dose computed tomography (LDCT) in medical practice can effectively reduce the radiation risk of patients, but it may increase noise and artefacts, which can compromise diagnostic information. The methods based on deep learning can effectively …

[PDF] mdpi.com

Unpaired Image Denoising via Wasserstein GAN in Low-Dose CT Image with Multi-Perceptual Loss and Fidelity Loss

Z Yin, K Xia, Z He, J Zhang, S Wang, B Zu - Symmetry, 2021 - mdpi.com

The use of low-dose computed tomography (LDCT) in medical practice can effectively reduce the radiation risk of patients, but it may increase noise and artefacts, which can compromise diagnostic information. The methods based on deep learning can effectively …

  Related articles 


[PDF] optimization-online.org

[PDF] Dual Decomposition of Two-Stage Distributionally Robust Mixed-Integer Programming under the Wasserstein Ambiguity Set

K Kim - optimization-online.org

We develop a dual decomposition of two-stage distributionally robust mixed-integer programming (DRMIP) under the Wasserstein ambiguity set. The dual decomposition is based on the Lagrangian dual of DRMIP, which results from the Lagrangian relaxation of the …

  Related articles All 2 versions 


Two approaches for population Wasserstein barycenter problem: Stochastic Averaging versus Sample Average Approximation

D Dvinskikh, A Gasnikov - nnov.hse.ru

Abstract In Machine Learning and Optimization community there are two main approaches for convex risk minimization problem: Stochastic Averaging (SA) and Sample Average Approximation (SAA). At the moment, it is known that both approaches are on average …

  Related articles 


A new approach to posterior contraction rates via Wasserstein dynamics

E Dolera, S Favaro, E Mainini - arXiv preprint arXiv:2011.14425, 2020 - arxiv.org

This paper presents a new approach to the classical problem of quantifying posterior contraction rates (PCRs) in Bayesian statistics. Our approach relies on Wasserstein distance, and it leads to two main contributions which improve on the existing literature of …

  Cited by 1 Related articles All 2 versions 


[PDF] thecvf.com

Wasserstein loss-based deep object detection

Y Han, X Liu, Z Sheng, Y Ren, X Han… - Proceedings of the …, 2020 - openaccess.thecvf.com

Object detection locates the objects with bounding boxes and identifies their classes, which is valuable in many computer vision applications (eg autonomous driving). Most existing deep learning-based methods output a probability vector for instance classification trained …

 Cited by 14 Related articles All 5 versions 

<——2020——2020———1800—— 


[PDF] aaai.org

Importance-aware semantic segmentation in self-driving with discrete wasserstein training

X Liu, Y Han, S Bai, Y Ge, T Wang, X Han, S Li… - Proceedings of the …, 2020 - ojs.aaai.org

Semantic segmentation (SS) is an important perception manner for self-driving cars and robotics, which classifies each pixel into a pre-determined class. The widely-used cross entropy (CE) loss-based deep networks has achieved significant progress wrt the mean …

  Cited by 8 Related articles All 6 versions 


[PDF] arxiv.org

Primal wasserstein imitation learning

R Dadashi, L Hussenot, M Geist, O Pietquin - arXiv preprint arXiv …, 2020 - arxiv.org

Imitation Learning (IL) methods seek to match the behavior of an agent with that of an expert. In the present work, we propose a new IL method based on a conceptually simple algorithm: Primal Wasserstein Imitation Learning (PWIL), which ties to the primal form of the …

  Cited by 4 Related articles All 2 versions 


Conditional Wasserstein generative adversarial network-gradient penalty-based approach to alleviating imbalanced data classification

M Zheng, T Li, R Zhu, Y Tang, M Tang, L Lin, Z Ma - Information Sciences, 2020 - Elsevier

In data mining, common classification algorithms cannot effectively learn from imbalanced data. Oversampling addresses this problem by creating data for the minority class in order to balance the class distribution before the model is trained. The Traditional oversampling …

  Cited by 12 Related articles All 2 versions


[PDF] arxiv.org

Deep attentive wasserstein generative adversarial networks for MRI reconstruction with recurrent context-awareness

Y Guo, C Wang, H Zhang, G Yang - International Conference on Medical …, 2020 - Springer

The performance of traditional compressive sensing-based MRI (CS-MRI) reconstruction is affected by its slow iterative procedure and noise-induced artefacts. Although many deep learning-based CS-MRI methods have been proposed to mitigate the problems of traditional …

  Cited by 16 Related articles All 4 versions


Wasserstein gradient-penalty generative adversarial network with deep auto-encoder for bearing intelligent fault diagnosis

X Xiong, J Hongkai, X Li, M Niu - Measurement Science and …, 2020 - iopscience.iop.org

It is a great challenge to manipulate unbalanced fault data in the field of rolling bearings intelligent fault diagnosis. In this paper, a novel intelligent fault diagnosis method called the Wasserstein gradient-penalty generative adversarial network with deep auto-encoder is …

  Cited by 4 Related articles All 2 versions


Primal heuristics for wasserstein barycenters

PY Bouchet, S Gualandi, LM Rousseau - International Conference on …, 2020 - Springer

This paper presents primal heuristics for the computation of Wasserstein Barycenters of a given set of discrete probability measures. The computation of a Wasserstein Barycenter is formulated as an optimization problem over the space of discrete probability measures. In …

  Cited by 1 Related articles


[PDF] arxiv.org

Refining Deep Generative Models via Wasserstein Gradient Flows

AF Ansari, ML Ang, H Soh - arXiv preprint arXiv:2012.00780, 2020 - arxiv.org

Deep generative modeling has seen impressive advances in recent years, to the point where it is now commonplace to see simulated samples (eg, images) that closely resemble real-world data. However, generation quality is generally inconsistent for any given model …

  Related articles 

Refining Deep Generative Models via Wasserstein Gradient Flows

A Fatir Ansari, ML Ang, H Soh - arXiv e-prints, 2020 - ui.adsabs.harvard.edu

Deep generative modeling has seen impressive advances in recent years, to the point where it is now commonplace to see simulated samples (eg, images) that closely resemble real-world data. However, generation quality is generally inconsistent for any given model …

 

[PDF] arxiv.org

Generalized Wasserstein Dice Score, Distributionally Robust Deep Learning, and Ranger for brain tumor segmentation: BraTS 2020 challenge

L Fidon, S Ourselin, T Vercauteren - arXiv preprint arXiv:2011.01614, 2020 - arxiv.org

Training a deep neural network is an optimization problem with four main ingredients: the design of the deep neural network, the per-sample loss function, the population loss function, and the optimizer. However, methods developed to compete in recent BraTS …

  Related articles All 2 versions 


[HTML] hindawi.com

[HTML] Imbalanced Fault Classification of Bearing via Wasserstein Generative Adversarial Networks with Gradient Penalty

B Han, S Jia, G Liu, J Wang - Shock and Vibration, 2020 - hindawi.com

Recently, generative adversarial networks (GANs) are widely applied to increase the amounts of imbalanced input samples in fault diagnosis. However, the existing GAN-based methods have convergence difficulties and training instability, which affect the fault …

Cited by 3 Related articles All 5 versions 

[PDF] arxiv.org

The existence of minimizers for an isoperimetric problem with Wasserstein penalty term in unbounded domains

Q Xia, B Zhou - arXiv preprint arXiv:2002.07129, 2020 - arxiv.org

In this article, we consider the (double) minimization problem $$\min\left\{P(E;\Omega)+\lambda W_p(E, F) :~ E\subseteq\Omega,~ F\subseteq\mathbb{R}^d,~ \lvert E\cap F\rvert = 0,~ \lvert E\rvert=\lvert F\rvert= 1\right\},$$ where $p\geqslant 1$, $\Omega …

  Related articles All 4 versions 

<——2020——2020———1810——

[PDF] arxiv.org

Safe Wasserstein Constrained Deep Q-Learning

A Kandel, SJ Moura - arXiv preprint arXiv:2002.03016, 2020 - arxiv.org

This paper presents a distributionally robust Q-Learning algorithm (DrQ) which leverages Wasserstein ambiguity sets to provide probabilistic out-of-sample safety guarantees during online learning. First, we follow past work by separating the constraint functions from the …

  Related articles All 2 versions 

[PDF] jst.go.jp

Orthogonal gradient penalty for fast training of wasserstein gan based multi-task autoencoder toward robust speech recognition

CY Kao, S Park, A Badi, DK Han… - IEICE TRANSACTIONS on …, 2020 - search.ieice.org

Performance in Automatic Speech Recognition (ASR) degrades dramatically in noisy environments. To alleviate this problem, a variety of deep networks based on convolutional neural networks and recurrent neural networks were proposed by applying L1 or L2 loss. In …

  Cited by 1 Related articles All 5 versions


[PDF] arxiv.org

Wasserstein k-means with sparse simplex projection

T Fukunaga, H Kasai - arXiv preprint arXiv:2011.12542, 2020 - arxiv.org

This paper presents a proposal of a faster Wasserstein $ k $-means algorithm for histogram data by reducing Wasserstein distance computations and exploiting sparse simplex projection. We shrink data samples, centroids, and the ground cost matrix, which leads to …

  Related articles All 2 versions end


2020

[PDF] researchgate.net

Inequalities of the Wasserstein mean with other matrix means

S Kim, H Lee - Annals of Functional Analysis, 2020 - Springer

Recently, a new Riemannian metric and a least squares mean of positive definite matrices have been introduced. They are called the Bures–Wasserstein metric and Wasserstein mean, which are different from the Riemannian trace metric and Karcher mean. In this paper …

  Cited by 3 Related articles All 2 versions

 

  

Synthetic Data Generation Using Wasserstein Conditional Gans With Gradient Penalty (WCGANS-GP)

M Singh Walia - 2020 - arrow.tudublin.ie

With data protection requirements becoming stricter, the data privacy has become increasingly important and more crucial than ever. This has led to restrictions on the availability and dissemination of real-world datasets. Synthetic data offers a viable solution …

  Related articles 


2020

Synthesising Tabular Datasets Using Wasserstein Conditional GANS with Gradient Penalty (WCGAN-GP)

S McKeever, M Singh Walia - 2020 - arrow.tudublin.ie

Deep learning based methods based on Generative Adversarial Networks (GANs) have seen remarkable success in data synthesis of images and text. This study investigates the use of GANs for the generation of tabular mixed dataset. We apply Wasserstein Conditional …

  Related articles 


[PDF] ceur-ws.org

[PDF] Synthesising Tabular Data using Wasserstein Conditional GANs with Gradient Penalty (WCGAN-GP)

M Walia, B Tierney, S McKeever - ceur-ws.org

Deep learning based methods based on Generative Adversarial Networks (GANs) have seen remarkable success in data synthesis of images and text. This study investigates the use of GANs for the generation of tabular mixed dataset. We apply Wasserstein Conditional …

  Related articles All 2 versions 


[PDF] fleuret.org

[PDF] Deep learning 11.2. Wasserstein GAN

F Fleuret - 2020 - fleuret.org

Deep learning 11.2. Wasserstein GAN. François Fleuret, https://fleuret.org/dlc/, Dec 20, 2020. Arjovsky et al. (2017) pointed out that DJS does not account [much] for the metric structure of the space. …

  Related articles All 2 versions 


[PDF] minegrado.ovh

[PDF] EE-559–Deep learning 10.2. Wasserstein GAN

F Fleuret - 2020 - minegrado.ovh

… If the clipping parameter is large, then it can take a long time for any weights to reach their limit, thereby making it harder to train the critic till optimality. If the clipping is small, this can easily lead to vanishing gradients when the number of layers is big, or batch normalization …

  Related articles All 2 versions 
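
The clipping discussed in the slide excerpt above refers to the critic update in the original WGAN. A minimal PyTorch sketch of that step is shown here for illustration only (it assumes a critic module; it is not taken from the slides):

    import torch

    def clip_critic_weights(critic, c=0.01):
        # After each critic update, clamp every parameter to [-c, c] as a
        # crude way of enforcing a Lipschitz constraint (original WGAN).
        with torch.no_grad():
            for p in critic.parameters():
                p.clamp_(-c, c)

The trade-off described in the excerpt is exactly about choosing c: too large and the critic trains slowly, too small and gradients vanish.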


[PDF] Ratio Trace Formulation of Wasserstein Discriminant Analysis

H Liu, Y Cai, YL Chen, P Li - Advances in Neural Information …, 2020 - research.baidu.com

Abstract We reformulate the Wasserstein Discriminant Analysis (WDA) as a ratio trace problem and present an eigensolver-based algorithm to compute the discriminative subspace of WDA. This new formulation, along with the proposed algorithm, can be served …

  Related articles All 2 versions 

<——2020——2020———1820——  

[PDF] arxiv.org

Global sensitivity analysis and Wasserstein spaces

JC Fort, T Klein, A Lagnoux - arXiv preprint arXiv:2007.12378, 2020 - arxiv.org

Sensitivity indices are commonly used to quantify the relative influence of any specific group of input variables on the output of a computer code. In this paper, we focus both on computer codes the output of which is a cumulative distribution function and on stochastic computer …

  Cited by 1 Related articles All 9 versions 

 

[PDF] arxiv.org

Graph Wasserstein Correlation Analysis for Movie Retrieval

X Zhang, T Zhang, X Hong, Z Cui, J Yang - European Conference on …, 2020 - Springer

Movie graphs play an important role to bridge heterogenous modalities of videos and texts in human-centric retrieval. In this work, we propose Graph Wasserstein Correlation Analysis (GWCA) to deal with the core issue therein, ie, cross heterogeneous graph comparison …

  Related articles All 5 versions


[HTML] atlantis-press.com

[HTML] Multimedia analysis and fusion via Wasserstein Barycenter

C Jin, J Wang, J Wei, L Tan, S Liu… - … Journal of Networked …, 2020 - atlantis-press.com

Optimal transport distance, otherwise known as Wasserstein distance, recently has attracted attention in music signal processing and machine learning as powerful discrepancy measures for probability distributions. In this paper, we propose an ensemble approach with …

  Cited by 2 Related articles All 2 versions 


[PDF] arxiv.org

First-Order Methods for Wasserstein Distributionally Robust MDP

J Grand-Clement, C Kroer - arXiv preprint arXiv:2009.06790, 2020 - arxiv.org

Markov Decision Processes (MDPs) are known to be sensitive to parameter specification. Distributionally robust MDPs alleviate this issue by allowing for ambiguity sets which give a set of possible distributions over parameter sets. The goal is to find an optimal policy with …

  Cited by 1 Related articles All 3 versions 


[PDF] arxiv.org

Minimax control of ambiguous linear stochastic systems using the Wasserstein metric

K Kim, I Yang - 2020 59th IEEE Conference on Decision and …, 2020 - ieeexplore.ieee.org

In this paper, we propose a minimax linear-quadratic control method to address the issue of inaccurate distribution information in practical stochastic systems. To construct a control policy that is robust against errors in an empirical distribution of uncertainty, our method …

  Cited by 1 Related articles All 3 versions


2020

[PDF] arxiv.org

Second-order Conic Programming Approach for Wasserstein Distributionally Robust Two-stage Linear Programs

Z Wang, K You, S Song, Y Zhang - arXiv preprint arXiv:2002.06751, 2020 - arxiv.org

This paper proposes a second-order conic programming (SOCP) approach to solve distributionally robust two-stage stochastic linear programs over 1-Wasserstein balls. We start from the case with distribution uncertainty only in the objective function and exactly …

  Related articles All 3 versions 


[PDF] arxiv.org

Statistical analysis of Wasserstein GANs with applications to time series forecasting

M Haas, S Richter - arXiv preprint arXiv:2011.03074, 2020 - arxiv.org

We provide statistical theory for conditional and unconditional Wasserstein generative adversarial networks (WGANs) in the framework of dependent observations. We prove upper bounds for the excess Bayes risk of the WGAN estimators with respect to a modified …

  Cited by 2 Related articles All 3 versions 

  

[PDF] brown.edu

[PDF] Reduced-order modeling of transport equations using Wasserstein spaces

V Ehrlacher, D Lombardi, O Mula, FX Vialard - icerm.brown.edu

… Let $u_1, u_2 \in P_2(\Omega)$. Then $W_2(u_1, u_2)^2 := \inf_{\pi} \int_{\Omega\times\Omega} |x-y|^2 \, \pi(dx,dy)$, where the infimum is taken over couplings $\pi \in P(\Omega\times\Omega)$ with marginals $\int_{y\in\Omega}\pi(dx,dy) = u_1(dx)$ and $\int_{x\in\Omega}\pi(dx,dy) = u_2(dy)$, and $P(\Omega\times\Omega)$ is the set of probability measures on $\Omega\times\Omega$. Kantorovich formulation of the optimal transport problem. Several numerical …

  Related articles 


[PDF] mlr.press

Gradient descent algorithms for Bures-Wasserstein barycenters

S Chewi, T Maunu, P Rigollet… - … on Learning Theory, 2020 - proceedings.mlr.press

We study first order methods to compute the barycenter of a probability distribution $ P $ over the space of probability measures with finite second moment. We develop a framework to derive global rates of convergence for both gradient descent and stochastic gradient …

  Cited by 16 Related articles All 5 versions 
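
In the Bures–Wasserstein (Gaussian) setting of the entry above, the barycenter of centred Gaussians $\mathcal N(0,\Sigma_i)$ with weights $\lambda_i$ is itself Gaussian, $\mathcal N(0,\bar\Sigma)$, with covariance characterised by the fixed-point equation (a known result, quoted here for context):

$$\bar\Sigma = \sum_{i} \lambda_i \big(\bar\Sigma^{1/2}\, \Sigma_i\, \bar\Sigma^{1/2}\big)^{1/2},$$

which is what fixed-point and gradient-descent schemes iterate on.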


[PDF] arxiv.org

Wasserstein proximal gradient

A Salim, A Korba, G Luise - arXiv preprint arXiv:2002.03035, 2020 - arxiv.org

We consider the task of sampling from a log-concave probability distribution. This target distribution can be seen as a minimizer of the relative entropy functional defined on the space of probability distributions. The relative entropy can be decomposed as the sum of a …

  Cited by 2 Related articles All 2 versions 

<——2020——2020———1830—— 


[PDF] Synthesising Tabular Data using Wasserstein Conditional GANs with Gradient Penalty (WCGAN-GP)

M Walia, B Tierney, S McKeever - ceur-ws.org

Deep learning based methods based on Generative Adversarial Networks (GANs) have seen remarkable success in data synthesis of images and text. This study investigates the use of GANs for the generation of tabular mixed dataset. We apply Wasserstein Conditional …

  Related articles All 2 versions 


The Wasserstein Proximal Gradient Algorithm

A Salim, A Korba, G Luise - arXiv e-prints, 2020 - ui.adsabs.harvard.edu

Wasserstein gradient flows are continuous time dynamics that define curves of steepest descent to minimize an objective function over the space of probability measures (ie, the Wasserstein space). This objective is typically a divergence wrt a fixed target distribution. In …

  Related articles


[PDF] jst.go.jp

Orthogonal gradient penalty for fast training of wasserstein gan based multi-task autoencoder toward robust speech recognition

CY Kao, S Park, A Badi, DK Han… - IEICE TRANSACTIONS on …, 2020 - search.ieice.org

Performance in Automatic Speech Recognition (ASR) degrades dramatically in noisy environments. To alleviate this problem, a variety of deep networks based on convolutional neural networks and recurrent neural networks were proposed by applying L1 or L2 loss. In …

  Cited by 1 Related articles All 5 versions


[PDF] arxiv.org

Statistical learning in Wasserstein space

A Karimi, L Ripani, TT Georgiou - IEEE Control Systems Letters, 2020 - ieeexplore.ieee.org

We seek a generalization of regression and principle component analysis (PCA) in a metric space where data points are distributions metrized by the Wasserstein metric. We recast these analyses as multimarginal optimal transport problems. The particular formulation …

  Related articles All 7 versions 


[PDF] arxiv.org

Regularized variational data assimilation for bias treatment using the Wasserstein metric

SK Tamang, A Ebtehaj, D Zou… - Quarterly Journal of the …, 2020 - Wiley Online Library

This article presents a new variational data assimilation (VDA) approach for the formal treatment of bias in both model outputs and observations. This approach relies on the Wasserstein metric, stemming from the theory of optimal mass transport, to penalize the …

  Cited by 1 Related articles All 4 versions


2020


data-driven distributionally robust newsvendor model with a Wasserstein ambiguity set

S Lee, H Kim, I Moon - Journal of the Operational …, 2020 - orsociety.tandfonline.com

In this paper, we derive a closed-form solution and an explicit characterization of the worst-case distribution for the data-driven distributionally robust newsvendor model with an ambiguity set based on the Wasserstein distance of order p ∈ [1,∞). We also consider the …

  Cited by 4 Related articles All 2 versions


[PDF] aaai.org

Regularized Wasserstein means for aligning distributional data

L Mi, W Zhang, Y Wang - Proceedings of the AAAI Conference on …, 2020 - ojs.aaai.org

We propose to align distributional data from the perspective of Wasserstein means. We raise the problem of regularizing Wasserstein means and propose several terms tailored to tackle different problems. Our formulation is based on the variational transportation to distribute a …

  Cited by 2 Related articles All 5 versions 


Averaging atmospheric gas concentration data using wasserstein barycenters

M Barré, C Giron, M Mazzolini… - arXiv preprint arXiv …, 2020 - arxiv.org

Hyperspectral satellite images report greenhouse gas concentrations worldwide on a daily basis. While taking simple averages of these images over time produces a rough estimate of relative emission rates, atmospheric transport means that simple averages fail to pinpoint …

  Cited by 1 Related articles All 3 versions 


[PDF] biorxiv.org

Gromov-Wasserstein optimal transport to align single-cell multi-omics data

P Demetci, R Santorella, B Sandstede, WS Noble… - BioRxiv, 2020 - biorxiv.org

Data integration of single-cell measurements is critical for understanding cell development and disease, but the lack of correspondence between different types of measurements makes such efforts challenging. Several unsupervised algorithms can align heterogeneous …

  Cited by 4 Related articles All 3 versions 


[PDF] arxiv.org

Conditional Wasserstein GAN-based Oversampling of Tabular Data for Imbalanced Learning

J Engelmann, S Lessmann - arXiv preprint arXiv:2008.09202, 2020 - arxiv.org

Class imbalance is a common problem in supervised learning and impedes the predictive performance of classification models. Popular countermeasures include oversampling the minority class. Standard methods like SMOTE rely on finding nearest neighbours and linear …

  Cited by 1 Related articles All 5 versions 

<——2020——2020———1840——


[PDF] arxiv.org

Ensemble Riemannian Data Assimilation over the Wasserstein Space

SK Tamang, A Ebtehaj, PJ Van Leeuwen, D Zou… - arXiv preprint arXiv …, 2020 - arxiv.org

In this paper, we present a new ensemble data assimilation paradigm over a Riemannian manifold equipped with the Wasserstein metric. Unlike Eulerian penalization of error in the Euclidean space, the Wasserstein metric can capture translation and shape difference …

  Related articles All 4 versions 


A Novel Data-to-Text Generation Model with Transformer Planning and a Wasserstein Auto-Encoder

X Xu, T He, H Wang - 2020 IEEE International Conference on …, 2020 - ieeexplore.ieee.org

Existing methods for data-to-text generation have difficulty producing diverse texts with low duplication rates. In this paper, we propose a novel data-to-text generation model with Transformer planning and a Wasserstein auto-encoder, which can convert constructed data …

  Related articles All 2 versions


[HTML] peerj.com

[HTML] Data augmentation-based conditional Wasserstein generative adversarial network-gradient penalty for XSS attack detection system

FMM Mokbal, D Wang, X Wang, L Fu - PeerJ Computer Science, 2020 - peerj.com

The rapid growth of the worldwide web and accompanied opportunities of web applications in various aspects of life have attracted the attention of organizations, governments, and individuals. Consequently, web applications have increasingly become the target of …

  Related articles All 5 versions 


EEG data augmentation using Wasserstein GAN

G Bouallegue, R Djemal - 2020 20th International Conference …, 2020 - ieeexplore.ieee.org

Electroencephalogram (EEG) presents a challenge during the classification task using machine learning and deep learning techniques due to the lack or to the low size of available datasets for each specific neurological disorder. Therefore, the use of data …

 

Numeric Data Augmentation using Structural Constraint Wasserstein Generative Adversarial Networks

W Wang, C Wang, T Cui, R Gong… - … on Circuits and …, 2020 - ieeexplore.ieee.org

Some recent studies have suggested using GANs for numeric data generation such as to generate data for completing the imbalanced numeric data. Considering the significant difference between the dimensions of the numeric data and images, as well as the strong …

  Related articles


2020

[PDF] mdpi.com

Wasserstein Generative Adversarial Networks Based Data Augmentation for Radar Data Analysis

H Lee, J Kim, EK Kim, S Kim - Applied Sciences, 2020 - mdpi.com

Ground-based weather radar can observe a wide range with a high spatial and temporal resolution. They are beneficial to meteorological research and services by providing valuable information. Recent weather radar data related research has focused on applying …

  Related articles All 2 versions 


Synthetic Data Generation Using Wasserstein Conditional Gans With Gradient Penalty (WCGANS-GP)

M Singh Walia - 2020 - arrow.tudublin.ie

With data protection requirements becoming stricter, the data privacy has become increasingly important and more crucial than ever. This has led to restrictions on the availability and dissemination of real-world datasets. Synthetic data offers a viable solution …

  Related articles 


[PDF] optimization-online.org

[PDF] A Novel Solution Methodology for Wasserstein-based Data-Driven Distributionally Robust Problems

CA Gamboa, DM Valladao, A Street… - optimization-online.org

Distributionally robust optimization (DRO) is a mathematical framework to incorporate ambiguity over the actual data-generating probability distribution. Data-driven DRO problems based on the Wasserstein distance are of particular interest for their sound …

  Related articles 


[PDF] amazonaws.com

[PDF] Bayesian Wasserstein GAN and Application for Vegetable Disease Image Data

W Cho, MH Na, S Kang, S Kim - 2020 - manuscriptlink-society-file.s3 …

Various GAN models have been proposed so far and they are used in various fields. However, despite the excellent performance of these GANs, the biggest problem is that the model collapse occurs in the simultaneous optimization of the generator and discriminator of …

  Related articles 


[PDF] sci-en-tech.com

[PDF] Entropy-regularized Wasserstein Distances for Analyzing Environmental and Ecological Data

H Yoshioka, Y Yoshioka, Y Yaegashi - THE 11TH …, 2020 - sci-en-tech.com

We explore applicability of entropy-regularized Wasserstein (pseudo-) distances as new tools for analyzing environmental and ecological data. In this paper, the two specific examples are considered and are numerically analyzed using the Sinkhorn algorithm. The …

  Related articles All 2 versions 
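
A compact Python sketch of the Sinkhorn iterations behind the entropy-regularised distances used in the entry above; this is a generic textbook version, not the authors' code, and the function and variable names are arbitrary.

    import numpy as np

    def sinkhorn_cost(a, b, C, eps=0.1, n_iter=200):
        # Entropy-regularised optimal transport between histograms a and b
        # with ground-cost matrix C, via Sinkhorn's matrix scaling.
        K = np.exp(-C / eps)              # Gibbs kernel
        u = np.ones_like(a)
        for _ in range(n_iter):
            v = b / (K.T @ u)
            u = a / (K @ v)
        P = u[:, None] * K * v[None, :]   # regularised transport plan
        return float(np.sum(P * C))       # transport cost under the plan

Smaller eps approximates the unregularised Wasserstein cost more closely but needs more iterations to converge.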

<——2020——2020———1850——   


Multi-view Wasserstein discriminant analysis with entropic regularized Wasserstein distance

H Kasai - ICASSP 2020-2020 IEEE International Conference …, 2020 - ieeexplore.ieee.org

Analysis of multi-view data has recently garnered growing attention because multi-view data frequently appear in real-world applications, which are collected or taken from many sources or captured using various sensors. A simple and popular promising approach is to learn a …

  Cited by 2 Related articles


[PDF] ceur-ws.org

[PDF] Synthesising Tabular Data using Wasserstein Conditional GANs with Gradient Penalty (WCGAN-GP)

M Walia, B Tierney, S McKeever - ceur-ws.org

Deep learning based methods based on Generative Adversarial Networks (GANs) have seen remarkable success in data synthesis of images and text. This study investigates the use of GANs for the generation of tabular mixed dataset. We apply Wasserstein Conditional …

  Related articles All 2 versions 

2020

 Evaluating the performance of climate models based on Wasserstein distance

G Vissio, V Lembo, V Lucarini… - Geophysical Research …, 2020 - Wiley Online Library

We propose a methodology for intercomparing climate models and evaluating their performance against benchmarks based on the use of the Wasserstein distance (WD). This distance provides a rigorous way to measure quantitatively the difference between two …

  Cited by 2 Related articles All 13 versions

WGAN domain adaptation for the joint optic disc-and-cup segmentation in fundus images

S Kadambi, Z Wang, E Xing - … Journal of Computer Assisted Radiology and …, 2020 - Springer

Purpose The cup-to-disc ratio (CDR), a clinical metric of the relative size of the optic cup to the optic disc, is a key indicator of glaucoma, a chronic eye disease leading to loss of vision. CDR can be measured from fundus images through the segmentation of optic disc and optic …

  Cited by 1 Related articles All 3 versions


Image dehazing algorithm based on FC-DenseNet and WGAN

B SUN, Q JU, Q SANG - … of Frontiers of Computer Science and …, 2020 - engine.scichina.com

The existing image dehazing algorithms rely heavily on the accurate estimation of the intermediate variables. This paper proposes an end-to-end image dehazing model based on Wasserstein generative adversarial networks (WGAN). Firstly, the fully convolutional …

  Cited by 3 Related articles 


2020


[PDF] preprints.org

Panchromatic Image Super-Resolution via Self Attention-augmented WGAN

J Du, K Cheng, Y Yu, D Wang, H Zhou - 2020 - preprints.org

Panchromatic (PAN) images contain abundant spatial information that is useful for earth observation, but always suffer from low-resolution due to the sensor limitation and large-scale view field. The current super-resolution (SR) methods based on traditional attention …

  Related articles All 4 versions 


[PDF] netinfo-security.org

Research on Mobile Malicious Adversarial Sample Generation Based on WGAN

LI Hongjiao, C Hongyan - Netinfo Security, 2020 - netinfo-security.org

In recent years, using machine learning algorithm to detect mobile terminal malware has become a research hotspot. In order to make the malware evade detection, malware producers use various methods to make malicious adversarial samples. This paper …

 

  

[PDF] netinfo-security.org

基于 WGAN 的移动恶意对抗样本生成研究 [Research on generating mobile malicious adversarial samples based on WGAN]

李红娇, 陈红艳 [Li Hongjiao, Chen Hongyan] - 信息网络安全 (Netinfo Security), 2020 - netinfo-security.org

In recent years, using machine learning algorithms to detect malware on mobile terminals has become a research hotspot, and malware authors use various methods to craft malicious adversarial samples so that their malware can evade detection. This paper proposes a Wasserstein GAN (WGAN)-based algorithm, MalWGAN, to generate malicious adversarial samples for mobile terminals …

  Related articles 

[Chinese  Research on the generation of mobile malicious adversarial samples based on WGAN]


[CITATION] 使用 WGAN-GP 對臉部馬賽克進行眼睛補圖 [Using WGAN-GP to inpaint the eyes in mosaicked face images]

HT Chang - 2020 - 長庚大學 (Chang Gung University)

 

Wasserstein distributionally robust shortest path problem

Z Wang, K You, S Song, Y Zhang - European Journal of Operational …, 2020 - Elsevier

This paper proposes a data-driven distributionally robust shortest path (DRSP) model where the distribution of the travel time in the transportation network can only be partially observed through a finite number of samples. Specifically, we aim to find an optimal path to minimize …

Cited by 11 Related articles All 7 versions

<——2020——2020———1860——


A wasserstein minimum velocity approach to learning unnormalized models

Z Wang, S Cheng, L Yueru, J Zhu… - International …, 2020 - proceedings.mlr.press

Score matching provides an effective approach to learning flexible unnormalized models, but its scalability is limited by the need to evaluate a second-order derivative. In this paper, we present a scalable approximation to a general family of learning objectives including …

  Cited by 4 Related articles All 9 versions 


[PDF] ieee.org

Robust multivehicle tracking with wasserstein association metric in surveillance videos

Y Zeng, X Fu, L Gao, J Zhu, H Li, Y Li - IEEE Access, 2020 - ieeexplore.ieee.org

Vehicle tracking based on surveillance videos is of great significance in the highway traffic monitoring field. In real-world vehicle-tracking applications, partial occlusion and objects with similarly appearing distractors pose significant challenges. For addressing the above …

Cited by 9 Related articles All 2 versions

A Wasserstein based two-stage distributionally robust optimization model for optimal operation of CCHP micro-grid under uncertainties

Y Wang, Y Yang, L Tang, W Sun, B Li - International Journal of Electrical …, 2020 - Elsevier

Combined cooling, heating and power (CCHP) micro-grids are getting increasing attentions due to the realization of cleaner production and high energy efficiency. However, with the features of complex tri-generation structure and renewable power uncertainties, it is …

  Cited by 15 Related articles All 2 versions


[PDF] arxiv.org

Wasserstein distributionally robust inverse multiobjective optimization

C Dong, B Zeng - arXiv preprint arXiv:2009.14552, 2020 - arxiv.org

Inverse multiobjective optimization provides a general framework for the unsupervised learning task of inferring parameters of a multiobjective decision making problem (DMP), based on a set of observed decisions from the human expert. However, the performance of …

  Cited by 2 Related articles All 2 versions 


[PDF] arxiv.org

A new approach to posterior contraction rates via Wasserstein dynamics

E Dolera, S Favaro, E Mainini - arXiv preprint arXiv:2011.14425, 2020 - arxiv.org

This paper presents a new approach to the classical problem of quantifying posterior contraction rates (PCRs) in Bayesian statistics. Our approach relies on Wasserstein distance, and it leads to two main contributions which improve on the existing literature of …

  Cited by 1 Related articles All 2 versions 


[PDF] arxiv.org

Wasserstein distributionally robust look-ahead economic dispatch

BK Poolla, AR Hota, S Bolognani… - … on Power Systems, 2020 - ieeexplore.ieee.org

We consider the problem of look-ahead economic dispatch (LAED) with uncertain renewable energy generation. The goal of this problem is to minimize the cost of conventional energy generation subject to uncertain operational constraints. These …

  Cited by 2 Related articles All 3 versions

Wasserstein Distributionally Robust Look-Ahead Economic Dispatch

B Kameshwar Poolla, AR Hota, S Bolognani… - arXiv e …, 2020 - ui.adsabs.harvard.edu

We consider the problem of look-ahead economic dispatch (LAED) with uncertain renewable energy generation. The goal of this problem is to minimize the cost of conventional energy generation subject to uncertain operational constraints. These …

Cited by 12 Related articles All 8 versions

[PDF] researchgate.net

Tractable reformulations of two-stage distributionally robust linear programs over the type-∞ wasserstein ball

W Xie - Operations Research Letters, 2020 - Elsevier

This paper studies a two-stage distributionally robust stochastic linear program under the type-∞ Wasserstein ball by providing sufficient conditions under which the program can be efficiently computed via a tractable convex program. By exploring the properties of binary …

  Cited by 6 Related articles All 4 versions
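
For orientation (a generic definition, not necessarily the notation used in the paper above): the type-∞ Wasserstein distance between distributions P and Q is $W_\infty(P,Q) = \inf_{\pi \in \Pi(P,Q)} \operatorname{ess\,sup}_{(\xi,\xi') \sim \pi} \|\xi - \xi'\|$, and the type-∞ Wasserstein ball of radius $\epsilon$ around the empirical distribution $\hat{P}_N$ is the ambiguity set $\{P : W_\infty(P, \hat{P}_N) \le \epsilon\}$ over which the worst case in the two-stage program is taken.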


[PDF] arxiv.org

Finite-sample guarantees for Wasserstein distributionally robust optimization: Breaking the curse of dimensionality

R Gao - arXiv preprint arXiv:2009.04382, 2020 - arxiv.org

Wasserstein distributionally robust optimization (DRO) aims to find robust and generalizable

solutions by hedging against data perturbations in Wasserstein distance. Despite its recent

empirical success in operations research and machine learning, existing performance …

  Cited by 1 Related articles All 3 versions 


[PDF] mlr.press

Principled learning method for Wasserstein distributionally robust optimization with local perturbations

Y Kwon, W Kim, JH Won… - … Conference on Machine …, 2020 - proceedings.mlr.press

Wasserstein distributionally robust optimization (WDRO) attempts to learn a model that

minimizes the local worst-case risk in the vicinity of the empirical data distribution defined by

Wasserstein ball. While WDRO has received attention as a promising tool for inference since …

  Related articles All 5 versions 


[PDF] arxiv.org

Wasserstein distributionally robust motion control for collision avoidance using conditional value-at-risk

A Hakobyan, I Yang - arXiv preprint arXiv:2001.04727, 2020 - arxiv.org

In this paper, a risk-aware motion control scheme is considered for mobile robots to avoid

randomly moving obstacles when the true probability distribution of uncertainty is unknown.

We propose a novel model predictive control (MPC) method for limiting the risk of unsafety …

  Cited by 4 Related articles All 2 versions 

<——2020——2020———1870——  


[PDF] lewissoft.com

Wasserstein Distributionally Robust Motion Planning and Control with Safety Constraints Using Conditional Value-at-Risk

A Hakobyan, I Yang - 2020 IEEE International Conference on …, 2020 - ieeexplore.ieee.org

In this paper, we propose an optimization-based decision-making tool for safe motion

planning and control in an environment with randomly moving obstacles. The unique feature

of the proposed method is that it limits the risk of unsafety by a pre-specified threshold even …

Cited by 9 Related articles


[PDF] arxiv.org

First-Order Methods for Wasserstein Distributionally Robust MDP

J Grand-Clement, C Kroer - arXiv preprint arXiv:2009.06790, 2020 - arxiv.org

Markov Decision Processes (MDPs) are known to be sensitive to parameter specification.

Distributionally robust MDPs alleviate this issue by allowing for ambiguity sets which give a

set of possible distributions over parameter sets. The goal is to find an optimal policy with …

Cited by 15 Related articles All 2 versions


CVaR-Based Approximations of Wasserstein Distributionally Robust Chance Constraints with Application to Process Scheduling

B Liu, Q Zhang, X Ge, Z Yuan - Industrial & Engineering Chemistry …, 2020 - ACS Publications

Distributionally robust chance constrained programming is a stochastic optimization

approach that considers uncertainty in model parameters as well as uncertainty in the

underlying probability distribution. It ensures a specified probability of constraint satisfaction …

Cited by 5 Related articles All 4 versions
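
The CVaR-based approximation in the title is, in its generic form (independent of the paper's specific process-scheduling model), the replacement of a chance constraint $P(g(x,\xi) \le 0) \ge 1-\alpha$ by the convex inner approximation $\mathrm{CVaR}_{1-\alpha}(g(x,\xi)) \le 0$, where $\mathrm{CVaR}_{1-\alpha}(Z) = \inf_{t \in \mathbb{R}} \{ t + \frac{1}{\alpha}\, \mathbb{E}[(Z-t)^+] \}$; any $x$ feasible for the CVaR constraint is also feasible for the original chance constraint.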


[PDF] arxiv.org

Strong Formulations for Distributionally Robust Chance-Constrained Programs with Left-Hand Side Uncertainty under Wasserstein Ambiguity

N Ho-Nguyen, F Kılınç-Karzan, S Küçükyavuz… - arXiv preprint arXiv …, 2020 - arxiv.org

Distributionally robust chance-constrained programs (DR-CCP) over Wasserstein ambiguity

sets exhibit attractive out-of-sample performance and admit big-$ M $-based mixed-integer

programming (MIP) reformulations with conic constraints. However, the resulting …

  Cited by 2 Related articles All 3 versions 


[PDF] arxiv.org

Generalized Wasserstein Dice Score, Distributionally Robust Deep Learning, and Ranger for brain tumor segmentation: BraTS 2020 challenge

L Fidon, S Ourselin, T Vercauteren - arXiv preprint arXiv:2011.01614, 2020 - arxiv.org

Training a deep neural network is an optimization problem with four main ingredients: the

design of the deep neural network, the per-sample loss function, the population loss

function, and the optimizer. However, methods developed to compete in recent BraTS …

Cited by 10 Related articles All 6 versions

2020


[PDF] arxiv.org

Robust Reinforcement Learning with Wasserstein Constraint

L Hou, L Pang, X Hong, Y Lan, Z Ma, D Yin - arXiv preprint arXiv …, 2020 - arxiv.org

Robust Reinforcement Learning aims to find the optimal policy with some extent of

robustness to environmental dynamics. Existing learning algorithms usually enable the

robustness through disturbing the current state or simulating environmental parameters in a …

  Related articles All 4 versions 


[PDF] arxiv.org

Consistency of Distributionally Robust Risk-and Chance-Constrained Optimization Under Wasserstein Ambiguity Sets

A Cherukuri, AR Hota - IEEE Control Systems Letters, 2020 - ieeexplore.ieee.org

We study stochastic optimization problems with chance and risk constraints, where in the

latter, risk is quantified in terms of the conditional value-at-risk (CVaR). We consider the

distributionally robust versions of these problems, where the constraints are required to hold …

Cited by 4 Related articles All 6 versions

[PDF] arxiv.org

Robustified Multivariate Regression and Classification Using Distributionally Robust Optimization under the Wasserstein Metric

R Chen, IC Paschalidis - arXiv preprint arXiv:2006.06090, 2020 - arxiv.org

We develop Distributionally Robust Optimization (DRO) formulations for Multivariate Linear

Regression (MLR) and Multiclass Logistic Regression (MLG) when both the covariates and

responses/labels may be contaminated by outliers. The DRO framework uses a probabilistic …

Cited by 2 Related articles All 5 versions 


[PDF] arxiv.org

Safe Zero-Shot Model-Based Learning and Control: A Wasserstein Distributionally Robust Approach

A Kandel, SJ Moura - arXiv preprint arXiv:2004.00759, 2020 - arxiv.org

This paper explores distributionally robust zero-shot model-based learning and control

using Wasserstein ambiguity sets. Conventional model-based reinforcement learning

algorithms struggle to guarantee feasibility throughout the online learning process. We …

  Related articles All 2 versions 


Second-order Conic Programming Approach for Wasserstein Distributionally Robust Two-stage Linear Programs

Z Wang, K You, S Song, Y Zhang - arXiv preprint arXiv:2002.06751, 2020 - arxiv.org

This paper proposes a second-order conic programming (SOCP) approach to solve

distributionally robust two-stage stochastic linear programs over 1-Wasserstein balls. We

start from the case with distribution uncertainty only in the objective function and exactly …

  Related articles All 3 versions 

<——2020——2020———1880——



A Riemannian submersion‐based approach to the Wasserstein barycenter of positive definite matrices

M Li, H Sun, D Li - Mathematical Methods in the Applied …, 2020 - Wiley Online Library

In this paper, we introduce a novel geometrization on the space of positive definite matrices,

derived from the Riemannian submersion from the general linear group to the space of

positive definite matrices, resulting in easier computation of its geometric structure. The …

  Related articles


[PDF] epfl.ch

Wasserstein Distributionally Robust Learning

S Shafieezadeh Abadeh - 2020 - infoscience.epfl.ch

Many decision problems in science, engineering, and economics are affected by

uncertainty, which is typically modeled by a random variable governed by an unknown

probability distribution. For many practical applications, the probability distribution is only …

  Related articles 

[CITATION] Wasserstein Distributionally Robust Learning

OS Abadeh - 2020 - Ecole Polytechnique Fédérale de …


[PDF] optimization-online.org

[PDF] Dual Decomposition of Two-Stage Distributionally Robust Mixed-Integer Programming under the Wasserstein Ambiguity Set

K Kim - optimization-online.org

We develop a dual decomposition of two-stage distributionally robust mixed-integer

programming (DRMIP) under the Wasserstein ambiguity set. The dual decomposition is

based on the Lagrangian dual of DRMIP, which results from the Lagrangian relaxation of the …

Cited by 3 Related articles All 2 versions 

2020  [PDF] jmlr.org

[PDF] Wasserstein barycenters can be computed in polynomial time in fixed dimension

JM Altschuler, E Boix-Adsera - Journal of Machine Learning Research, 2021 - jmlr.org

Computing Wasserstein barycenters is a fundamental geometric problem with widespread

applications in machine learning, statistics, and computer graphics. However, it is unknown

whether Wasserstein barycenters can be computed in polynomial time, either exactly or to …

  All 14 versions 


2020  [PDF] arxiv.org

Wasserstein barycenters are NP-hard to compute

JM Altschuler, E Boix-Adsera - arXiv preprint arXiv:2101.01100, 2021 - arxiv.org

The problem of computing Wasserstein barycenters (aka Optimal Transport barycenters) has

attracted considerable recent attention due to many applications in data science. While there

exist polynomial-time algorithms in any fixed dimension, all known runtimes suffer …

  Cited by 1 Related articles All 2 versions 


 2020

Isometries of Wasserstein spaces

GP Gehér, T Titkos, D Virosztek - halgebra.math.msu.su

Due to its nice theoretical properties and an astonishing number of applications via optimal

transport problems, probably the most intensively studied metric nowadays is the p-

Wasserstein metric. Given a complete and separable metric space X and a real number p≥ …

  Related articles 


[PDF] brown.edu

[PDF] Reduced-order modeling of transport equations using Wasserstein spaces

V Ehrlacher, D Lombardi, O Mula, FX Vialard - icerm.brown.edu

Page 1. Introduction to Wassertein spaces and barycenters Model order reduction of parametric

transport equations Reduced-order modeling of transport equations using Wasserstein spaces

V. Ehrlacher1, D. Lombardi 2, O. Mula 3, F.-X. Vialard 4 1Ecole des Ponts ParisTech & INRIA …

  Related articles 


Isometric study of Wasserstein spaces---the real line

G Pál Gehér, T Titkos, D Virosztek - arXiv e-prints, 2020 - ui.adsabs.harvard.edu

Recently Kloeckner described the structure of the isometry group of the quadratic

Wasserstein space $\mathcal {W} _2\left (\mathbb {R}^ n\right) $. It turned out that the case of

the real line is exceptional in the sense that there exists an exotic isometry flow. Following …


정칙화 항에 기반한 WGAN 의 립쉬츠 연속 안정화 기법 제안

한희일 - 한국인터넷방송통신학회 논문지, 2020 - dbpia.co.kr

최근에 제안된 WGAN(Wasserstein generative adversarial network) 등장으로 GAN(generative

adversarial network) 고질적인 문제인 까다롭고 불안정한 학습과정이 다소 개선되기는 하였으나

여전히 수렴이 안되거나 자연스럽지 못한 출력물을 생성하는 등의 경우가 발생한다 …

  All 2 versions

[Korean: Proposal of a Lipschitz Continuity Stabilization Method for WGAN Based on a Regularization Term

Hee-Il Han - The Journal of The Korean Institute of Internet, Broadcasting and Communication, 2020 - dbpia.co.kr

… With the emergence of the recently proposed WGAN (Wasserstein generative adversarial network), the difficult and unstable training process, a chronic problem of GANs (generative adversarial networks), has been somewhat improved, but there are still cases where training fails to converge or unnatural outputs are generated.]


 

[PDF] urfu.ru

[PDF] Вероятностные методы анализа игровых задач управления: диссертация на соискание ученой степени доктора физико-математических наук: 01.01 …

[Russian: Probabilistic methods for the analysis of game-theoretic control problems: dissertation for the degree of Doctor of Physical and Mathematical Sciences: 01.01 …]

ЮВ Авербух (Yu. V. Averboukh) - 2020 - elar.urfu.ru

… [translated from Russian] In this case, the motion of each element of the system is determined by the generator 𝐿𝑡[𝑚(𝑡), 𝑢]. In the English-language literature the name Wasserstein metric (Wasserstein distance) is common; the question of the exact naming is clarified in [BK12, § 1.1]. …

  All 2 versions 

<——2020——2020———1890——



Berthet, Fort, Klein: A Central Limit Theorem for Wasserstein ...

projecteuclid.org › euclid.aihp

by P Berthet · 2020 · Related articles

A Central Limit Theorem for Wasserstein type distances between two distinct univariate distributions. Philippe Berthet, Jean-Claude Fort, and Thierry Klein ...
Received: 2 February 2018; Revised: 6 March 2019; Accepted: 29 March 2019; Published: May 2020

First available in Project Euclid: 16 March 2020



[PDF] stanford.edu

[PDF] A CLASS OF OPTIMAL TRANSPORT REGULARIZED FORMULATIONS WITH APPLICATIONS TO WASSERSTEIN GANS

KH Bae, B Feng, S Kim, S Lazarova-Molnar, Z Zheng… - stanford.edu

Optimal transport costs (eg Wasserstein distances) are used for fitting high-dimensional

distributions. For example, popular artificial intelligence algorithms such as Wasserstein

Generative Adversarial Networks (WGANs) can be interpreted as fitting a black-box …

  

Synthesising Tabular Datasets Using Wasserstein Conditional GANS with Gradient Penalty (WCGAN-GP)

S McKeever, M Singh Walia - 2020 - arrow.tudublin.ie

Deep learning based methods based on Generative Adversarial Networks (GANs) have

seen remarkable success in data synthesis of images and text. This study investigates the

use of GANs for the generation of tabular mixed dataset. We apply Wasserstein Conditional …


[PDF] minegrado.ovh

[PDF] EE-559–Deep learning 10.2. Wasserstein GAN

F Fleuret - 2020 - minegrado.ovh

Page 1. EE-559 – Deep learning 10.2. Wasserstein GAN François Fleuret https://fleuret.org/ee559/

Mon Feb 18 13:32:59 UTC 2019 ÉCOLE POLYTECHNIQUE FÉDÉRALE DE LAUSANNE Arjovsky

et al. (2017) point out that DJS does not account [much] for the metric structure of the space. δ x …

  Related articles All 2 versions 
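
For reference, the quantity behind the WGAN entries above is the 1-Wasserstein (earth mover's) distance, which by Kantorovich–Rubinstein duality can be written $W_1(P_r, P_g) = \sup_{\|f\|_{\mathrm{Lip}} \le 1} \mathbb{E}_{x \sim P_r}[f(x)] - \mathbb{E}_{x \sim P_g}[f(x)]$; the WGAN critic approximates the 1-Lipschitz function $f$, and the generator is trained to minimize the resulting estimate.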


2020

Quadratic Wasserstein metrics for von Neumann algebras via transport plans

R Duvenhage - arXiv preprint arXiv:2012.03564, 2020 - arxiv.org

We show how one can obtain a class of quadratic Wasserstein metrics, that is to say,

Wasserstein metrics of order 2, on the set of faithful normal states of a von Neumann algebra

$ A $, via transport plans, rather than through a dynamical approach. Two key points to …

  Cited by 1 Related articles All 2 versions 


2020  [PDF] arxiv.org

Permutation invariant networks to learn Wasserstein metrics

A Sehanobish, N Ravindra, D van Dijk - arXiv preprint arXiv:2010.05820, 2020 - arxiv.org

Understanding the space of probability measures on a metric space equipped with a

Wasserstein distance is one of the fundamental questions in mathematical analysis. The

Wasserstein metric has received a lot of attention in the machine learning community …

  Related articles All 4 versions 


Robust Document Distance with Wasserstein-Fisher-Rao metric

Z Wang, D Zhou, M Yang, Y Zhang… - Asian Conference on …, 2020 - proceedings.mlr.press

Computing the distance among linguistic objects is an essential problem in natural

language processing. The word mover's distance (WMD) has been successfully applied to

measure the document distance by synthesizing the low-level word similarity with the …

  Cited by 1 Related articles 


[PDF] ieee.org

Robust multivehicle tracking with wasserstein association metric in surveillance videos

Y Zeng, X Fu, L Gao, J Zhu, H Li, Y Li - IEEE Access, 2020 - ieeexplore.ieee.org

Vehicle tracking based on surveillance videos is of great significance in the highway traffic

monitoring field. In real-world vehicle-tracking applications, partial occlusion and objects

with similarly appearing distractors pose significant challenges. For addressing the above …

  Cited by 3 Related articles

 

[PDF] ethz.ch

[PDF] Smooth Wasserstein Distance: Metric Structure and Statistical Efficiency

Z Goldfeld - International Zurich Seminar on Information …, 2020 - research-collection.ethz.ch

The Wasserstein distance has seen a surge of interest and applications in machine learning.

Its popularity is driven by many advantageous properties it possesses, such as metric

structure (metrization of weak convergence), robustness to support mismatch, compatibility …

  Related articles All 4 versions 


[PDF] arxiv.org

Regularizing activations in neural networks via distribution matching with the Wasserstein metric

T Joo, D Kang, B Kim - arXiv preprint arXiv:2002.05366, 2020 - arxiv.org

Regularization and normalization have become indispensable components in training deep

neural networks, resulting in faster training and improved generalization performance. We

propose the projected error function regularization loss (PER) that encourages activations to …

  Cited by 3 Related articles All 5 versions 

<——2020——2020———1900—— 


Reconstruction of shale image based on Wasserstein Generative Adversarial Networks with gradient penalty

W Zha, X Li, Y Xing, L He, D Li - Advances in Geo-Energy …, 2020 - yandy-ager.com

Abstract Generative Adversarial Networks (GANs), as most popular artificial intelligence

models in the current image generation field, have excellent image generation capabilities.

Cited by 22 Related articles 
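
Several WGAN entries in this block rely on the gradient-penalty variant (WGAN-GP). Below is a minimal sketch of the standard gradient penalty term, assuming a hypothetical PyTorch critic module and 4-D image batches real and fake; it is a generic illustration, not the code of the paper above.

```python
import torch

def gradient_penalty(critic, real, fake, lam=10.0):
    # Sample points on straight lines between real and generated images.
    eps = torch.rand(real.size(0), 1, 1, 1, device=real.device)
    x_hat = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(x_hat)
    # Gradient of the critic's output with respect to the interpolated inputs.
    grads = torch.autograd.grad(scores.sum(), x_hat, create_graph=True)[0]
    grad_norm = grads.view(grads.size(0), -1).norm(2, dim=1)
    # Penalize deviations of the gradient norm from 1 (soft Lipschitz constraint).
    return lam * ((grad_norm - 1.0) ** 2).mean()
```

The penalty is added to the usual WGAN critic loss, E[D(fake)] − E[D(real)], during critic updates.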


Data Augmentation Method for Fault Diagnosis of Mechanical Equipment Based on Improved Wasserstein GAN

W Liu, L Duan, Y Tang, J Yang - 2020 11th International …, 2020 - ieeexplore.ieee.org

Most of the time the mechanical equipment is in normal operation state, which results in high

imbalance between fault data and normal data. In addition, traditional signal processing

methods rely heavily on expert experience, making it difficult for classification or prediction …

  Related articles


[PDF] researchgate.net

Infrared and visible image fusion using dual discriminators generative adversarial networks with Wasserstein distance

J Li, H Huo, K Liu, C Li - Information Sciences, 2020 - Elsevier

Generative adversarial network (GAN) has shown great potential in infrared and visible

image fusion. The existing GAN-based methods establish an adversarial game between

generative image and source images to train the generator until the generative image …

Cited by 30 Related articles

Generative adversarial networks based on Wasserstein distance for knowledge graph embeddings

Y Dai, S Wang, X Chen, C Xu, W Guo - Knowledge-Based Systems, 2020 - Elsevier

Abstract Knowledge graph embedding aims to project entities and relations into low-

dimensional and continuous semantic feature spaces, which has captured more attention in

recent years. Most of the existing models roughly construct negative samples via a uniformly …

  Cited by 4 Related articles All 2 versions


[PDF] arxiv.org

Deep attentive wasserstein generative adversarial networks for MRI reconstruction with recurrent context-awareness

Y Guo, C Wang, H Zhang, G Yang - International Conference on Medical …, 2020 - Springer

The performance of traditional compressive sensing-based MRI (CS-MRI) reconstruction is

affected by its slow iterative procedure and noise-induced artefacts. Although many deep

learning-based CS-MRI methods have been proposed to mitigate the problems of traditional …

  Cited by 3 Related articles All 4 versions


2020


[PDF] arxiv.org

Wasserstein Distances for Stereo Disparity Estimation

D Garg, Y Wang, B Hariharan, M Campbell… - arXiv preprint arXiv …, 2020 - arxiv.org

Existing approaches to depth or disparity estimation output a distribution over a set of pre-

defined discrete values. This leads to inaccurate results when the true depth or disparity

does not match any of these values. The fact that this distribution is usually learned indirectly …

  Cited by 2 Related articles All 3 versions 

[PDF] arxiv.org

Scalable computations of wasserstein barycenter via input convex neural networks

J Fan, A Taghvaei, Y Chen - arXiv preprint arXiv:2007.04462, 2020 - arxiv.org

Wasserstein Barycenter is a principled approach to represent the weighted mean of a given

set of probability distributions, utilizing the geometry induced by optimal transport. In this

work, we present a novel scalable algorithm to approximate the Wasserstein Barycenters …

 Cited by 13 Related articles All 7 versions 
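
For readers skimming the barycenter entries: given probability measures $\mu_1, \dots, \mu_m$ and weights $\lambda_i \ge 0$ with $\sum_i \lambda_i = 1$, the 2-Wasserstein barycenter is the minimizer $\bar{\mu} = \arg\min_{\mu} \sum_{i=1}^m \lambda_i\, W_2^2(\mu, \mu_i)$; the paper above approximates this optimization with input convex neural networks.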

[PDF] arxiv.org

Wasserstein routed capsule networks

A Fuchs, F Pernkopf - arXiv preprint arXiv:2007.11465, 2020 - arxiv.org

Capsule networks offer interesting properties and provide an alternative to today's deep

neural network architectures. However, recent approaches have failed to consistently

achieve competitive results across different image datasets. We propose a new parameter …

  Cited by 1 Related articles All 2 versions 


2020 [PDF] arxiv.org

Social-wagdat: Interaction-aware trajectory prediction via wasserstein graph double-attention network

J Li, H Ma, Z Zhang, M Tomizuka - arXiv preprint arXiv:2002.06241, 2020 - arxiv.org

Effective understanding of the environment and accurate trajectory prediction of surrounding

dynamic obstacles are indispensable for intelligent mobile systems (like autonomous

vehicles and social robots) to achieve safe and high-quality planning when they navigate in …

  Cited by 21 Related articles All 3 versions 

RDA-UNET-WGAN: An Accurate Breast Ultrasound Lesion Segmentation Using Wasserstein Generative Adversarial Networks

A Negi, ANJ Raj, R Nersisson, Z Zhuang… - Arabian Journal for …, 2020 - Springer

Early-stage detection of lesions is the best possible way to fight breast cancer, a disease

with the highest malignancy ratio among women. Though several methods primarily based

on deep learning have been proposed for tumor segmentation, it is still a challenging …

  Cited by 4 Related articles

<——2020——2020———1910——



[PDF] thecvf.com

Channel Pruning for Accelerating Convolutional Neural Networks via Wasserstein Metric

H Duan, H Li - Proceedings of the Asian Conference on …, 2020 - openaccess.thecvf.com

Channel pruning is an effective way to accelerate deep convolutional neural networks.

However, it is still a challenge to reduce the computational complexity while preserving the

performance of deep models. In this paper, we propose a novel channel pruning method via …

  Related articles 


[PDF] mdpi.com

Knowledge-grounded chatbot based on dual wasserstein generative adversarial networks with effective attention mechanisms

S Kim, OW Kwon, H Kim - Applied Sciences, 2020 - mdpi.com

A conversation is based on internal knowledge that the participants already know or external

knowledge that they have gained during the conversation. A chatbot that communicates with

humans by using its internal and external knowledge is called a knowledge-grounded …

  Cited by 3 Related articles All 4 versions 


Adversarial sliced Wasserstein domain adaptation networks

Y Zhang, N Wang, S Cai - Image and Vision Computing, 2020 - Elsevier

Abstract Domain adaptation has become a resounding success in learning a domain

agnostic model that performs well on target dataset by leveraging source dataset which has

related data distribution. Most of existing works aim at learning domain-invariant features …

  Cited by 1 Related articles All 2 versions


[PDF] ieee.org

Study of Restrained Network Structures for Wasserstein Generative Adversarial Networks (WGANs) on Numeric Data Augmentation

W Wang, C Wang, T Cui, Y Li - IEEE Access, 2020 - ieeexplore.ieee.org

Some recent studies have suggested using Generative Adversarial Network (GAN) for

numeric data over-sampling, which is to generate data for completing the imbalanced

numeric data. Compared with the conventional over-sampling methods, taken SMOTE as an …

  Related articles


[PDF] iop.org

Speech Dereverberation Based on Improved Wasserstein Generative Adversarial Networks

L Rao, J Yang - Journal of Physics: Conference Series, 2020 - iopscience.iop.org

In reality, the sound we hear is not only disturbed by noise, but also the reverberant, whose

effects are rarely taken into account. Recently, deep learning has shown great advantages

in speech signal processing. But among the existing dereverberation approaches, very few …

  Related articles All 2 versions


2020




[PDF] arxiv.org

Permutation invariant networks to learn Wasserstein metrics

A Sehanobish, N Ravindra, D van Dijk - arXiv preprint arXiv:2010.05820, 2020 - arxiv.org

Understanding the space of probability measures on a metric space equipped with a

Wasserstein distance is one of the fundamental questions in mathematical analysis. The

Wasserstein metric has received a lot of attention in the machine learning community …

  Related articles All 4 versions View as HTML 



 


Numeric Data Augmentation using Structural Constraint Wasserstein Generative Adversarial Networks

W Wang, C Wang, T Cui, R Gong… - … on Circuits and …, 2020 - ieeexplore.ieee.org

Some recent studies have suggested using GANs for numeric data generation such as to

generate data for completing the imbalanced numeric data. Considering the significant

difference between the dimensions of the numeric data and images, as well as the strong …

  Related articles


First arrival picking of microseismic signals based on nested U-Net and Wasserstein Generative Adversarial Network

JL Zhang, GQ Sheng - Journal of Petroleum Science and Engineering, 2020 - Elsevier

Picking the first arrival of microseismic signals, quickly and accurately, is the key for real-time

data processing of microseismic monitoring. The traditional method cannot meet the high-

accuracy and high-efficiency requirements for the first-arrival microseismic picking, in a low …

  Cited by 1 Related articles All 2 versions

<——2020——2020———1920——


[PDF] mdpi.com

Wasserstein Generative Adversarial Networks Based Data Augmentation for Radar Data Analysis

H Lee, J Kim, EK Kim, S Kim - Applied Sciences, 2020 - mdpi.com

Ground-based weather radar can observe a wide range with a high spatial and temporal

resolution. They are beneficial to meteorological research and services by providing

valuable information. Recent weather radar data related research has focused on applying …

  Related articles All 2 versions 

[PDF] epfl.ch

[PDF] THE CONTINUOUS FORMULATION OF SHALLOW NEURAL NETWORKS AS WASSERSTEIN-TYPE GRADIENT FLOWS

X FERNÁNDEZ-REAL, A FIGALLI - sma.epfl.ch

It has been recently observed that the training of a single hidden layer artificial neural

network can be reinterpreted as a Wasserstein gradient flow for the weights for the error

functional. In the limit, as the number of parameters tends to infinity, this gives rise to a family …

  Related articles  2


 

 

2020

[PDF] amazonaws.com

[PDF] Wasserstein 거리 척도 기반 SRGAN 이용한 위성 영상 해상도 향상

황지언, 유초시, 신요안 - 한국통신학회 …, 2020 - journal-home.s3.ap-northeast-2 …

… A. Cunningham, A. Acosta, A. Aitken, A. Tejani, J. Totz, Z. Wang, and W. Shi, "Photo-realistic single

image super-resolution using a genera- tive adversarial network," Proc … [4] M. Arjovsky, S. Chintala,

and L. Bottou, "Wasserstein genera- tive adversarial networks," Proc …

Related articles All 2 versions

[Korean: Improving the resolution of satellite images using SRGAN based on the Wasserstein distance metric]


2020


한희일 - 한국인터넷방송통신학회 논문지, 2020 - dbpia.co.kr

… 최근에 제안된 WGAN(Wasserstein generative adversarial network)의 등장으로 GAN(generative

adversarial network)의 고질적인 문제인 까다롭고 불안정한 학습과정이 다소 개선되기는 하였으나

여전히 수렴이 안되거나 자연스럽지 못한 출력물을 생성하는 등의 경우가 발생한다 …

All 2 versions

[Korean: The Journal of The Korean Institute of Internet, Broadcasting and Communication, 2020 - dbpia.co.kr

… With the advent of the recently proposed WGAN (Wasserstein generative adversarial network), the difficult and unstable training process, a chronic problem of GANs (generative adversarial networks), has been somewhat improved, but there are still cases where training fails to converge or unnatural outputs are generated.]



[PDF] Computational hardness and fast algorithm for fixed-support wasserstein barycenter

T Lin, N Ho, X Chen, M Cuturi… - arXiv preprint arXiv …, 2020 - researchgate.net

We study in this paper the fixed-support Wasserstein barycenter problem (FS-WBP), which

consists in computing the Wasserstein barycenter of m discrete probability measures

supported on a finite metric space of size n. We show first that the constraint matrix arising …

  Cited by 3 Related articles All 2 versions 


2020


[PDF] arxiv.org

Revisiting fixed support wasserstein barycenter: Computational hardness and efficient algorithms

T Lin, N Ho, X Chen, M Cuturi, MI Jordan - arXiv preprint arXiv:2002.04783, 2020 - arxiv.org

We study the fixed-support Wasserstein barycenter problem (FS-WBP), which consists in

computing the Wasserstein barycenter of $ m $ discrete probability measures supported on

a finite metric space of size $ n $. We show first that the constraint matrix arising from the …

  Cited by 2 Related articles All 2 versions 


[PDF] arxiv.org

Efficient Wasserstein Natural Gradients for Reinforcement Learning

T Moskovitz, M Arbel, F Huszar, A Gretton - arXiv preprint arXiv …, 2020 - arxiv.org

A novel optimization approach is proposed for application to policy gradient methods and

evolution strategies for reinforcement learning (RL). The procedure uses a computationally

efficient Wasserstein natural gradient (WNG) descent that takes advantage of the geometry …

  Cited by 1 Related articles All 2 versions 


Fused Gromov-Wasserstein distance for structured objects

T Vayer, L Chapel, R Flamary, R Tavenard, N Courty - Algorithms, 2020 - mdpi.com

Optimal transport theory has recently found many applications in machine learning thanks to

its capacity to meaningfully compare various machine learning objects that are viewed as

distributions. The Kantorovitch formulation, leading to the Wasserstein distance, focuses on …

  Cited by 7 Related articles All 33 versions 
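
As a pointer to the definition (written in the generic notation of the fused Gromov–Wasserstein literature, which may differ from the paper's): for two structured objects with feature distances $d(a_i, b_j)$ and structure matrices $C_1, C_2$, $\mathrm{FGW}_{q,\alpha} = \min_{\pi \in \Pi(\mu,\nu)} \sum_{i,j,k,l} \big[(1-\alpha)\, d(a_i,b_j)^q + \alpha\, |C_1(i,k) - C_2(j,l)|^q\big]\, \pi_{i,j}\, \pi_{k,l}$, so that $\alpha = 0$ recovers a pure Wasserstein (feature) cost and $\alpha = 1$ a pure Gromov–Wasserstein (structure) cost.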


Symmetric Wasserstein Autoencoders

S Sun, H Guo - 2020 - openreview.net

Leveraging the framework of Optimal Transport, we introduce a new family of generative

autoencoders with a learnable prior, called Symmetric Wasserstein Autoencoders (SWAEs).

We propose to symmetrically match the joint distributions of the observed data and the latent …

[PDF] nsf.gov

A Data-Driven Distributionally Robust Game Using Wasserstein Distance

G Peng, T Zhang, Q Zhu - … on Decision and Game Theory for Security, 2020 - Springer

This paper studies a special class of games, which enables the players to leverage the

information from a dataset to play the game. However, in an adversarial scenario, the

dataset may not be trustworthy. We propose a distributionally robust formulation to introduce …

  Related articles All 2 versions

<——2020——2020———1930——


[PDF] arxiv.org

Independent Elliptical Distributions Minimize Their $\mathcal{W}_2$ Wasserstein Distance from Independent Elliptical Distributions with the Same Density Generator

S Fang, Q Zhu - arXiv preprint arXiv:2012.03809, 2020 - arxiv.org

This short note is on a property of the $\mathcal {W} _2 $ Wasserstein distance which

indicates that independent elliptical distributions minimize their $\mathcal {W} _2 $

Wasserstein distance from given independent elliptical distributions with the same density …

  Related articles All 2 versions 


2020

De novo protein design for novel folds using guided conditional Wasserstein generative adversarial networks

M Karimi, S Zhu, Y Cao, Y Shen - Journal of Chemical Information …, 2020 - ACS Publications

Although massive data is quickly accumulating on protein sequence and structure, there is a

small and limited number of protein architectural types (or structural folds). This study is

addressing the following question: how well could one reveal underlying sequence …

  Cited by 2 Related articles All 5 versions


 [PDF] mlr.press

A wasserstein minimum velocity approach to learning unnormalized models

Z Wang, S Cheng, L Yueru, J Zhu… - International …, 2020 - proceedings.mlr.press

Score matching provides an effective approach to learning flexible unnormalized models,

but its scalability is limited by the need to evaluate a second-order derivative. In this paper,

we present a scalable approximation to a general family of learning objectives including …

  Cited by 4 Related articles All 9 versions 
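
For context on the "second-order derivative" mentioned in this abstract: the standard (Hyvärinen) score matching objective for an unnormalized model $p_\theta$ is $J(\theta) = \mathbb{E}_{x \sim p_{\mathrm{data}}}\big[\tfrac{1}{2} \|\nabla_x \log p_\theta(x)\|^2 + \operatorname{tr}(\nabla_x^2 \log p_\theta(x))\big]$, and it is the Hessian-trace term that becomes expensive in high dimension; the paper proposes a scalable approximation to a family of objectives that includes such losses (this formula is the generic score matching loss, not the paper's own notation).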


Learning to Align via Wasserstein for Person Re-Identification

Z Zhang, Y Xie, D Li, W Zhang… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

Existing successful person re-identification (Re-ID) models often employ the part-level

representation to extract the fine-grained information, but commonly use the loss that is

particularly designed for global features, ignoring the relationship between semantic parts …

Cited by 7 Related articles All 2 versions


[PDF] arxiv.org

Visual transfer for reinforcement learning via wasserstein domain confusion

J Roy, G Konidaris - arXiv preprint arXiv:2006.03465, 2020 - arxiv.org

We introduce Wasserstein Adversarial Proximal Policy Optimization (WAPPO), a novel

algorithm for visual transfer in Reinforcement Learning that explicitly learns to align the

distributions of extracted features between a source and target task. WAPPO approximates …

  Cited by 3 Related articles All 6 versions 


2020


[PDF] mlr.press

Nested-wasserstein self-imitation learning for sequence generation

R Zhang, C Chen, Z Gan, Z Wen… - International …, 2020 - proceedings.mlr.press

Reinforcement learning (RL) has been widely studied for improving sequence-generation

models. However, the conventional rewards used for RL training typically cannot capture

sufficient semantic information and therefore render model bias. Further, the sparse and …

  Cited by 2 Related articles All 6 versions 


Nested-Wasserstein Self-Imitation Learning for Sequence Generation

L Carin - 2020 - openreview.net

Reinforcement learning (RL) has been widely studied for improving sequence-generation

models. However, the conventional rewards used for RL training typically cannot capture

sufficient semantic information and therefore render model bias. Further, the sparse and …


[PDF] arxiv.org

The quantum Wasserstein distance of order 1

G De Palma, M Marvian, D Trevisan, S Lloyd - arXiv preprint arXiv …, 2020 - arxiv.org

We propose a generalization of the Wasserstein distance of order 1 to the quantum states of

$ n $ qudits. The proposal recovers the Hamming distance for the vectors of the canonical

basis, and more generally the classical Wasserstein distance for quantum states diagonal in …

  Cited by 2 Related articles All 3 versions 


[PDF] arxiv.org

Primal wasserstein imitation learning

R Dadashi, L Hussenot, M Geist, O Pietquin - arXiv preprint arXiv …, 2020 - arxiv.org

Imitation Learning (IL) methods seek to match the behavior of an agent with that of an expert.

In the present work, we propose a new IL method based on a conceptually simple algorithm:

Primal Wasserstein Imitation Learning (PWIL), which ties to the primal form of the …

 Cited by 31 Related articles All 18 versions 

Wasserstein loss with alternative reinforcement learning for severity-aware semantic segmentation

X Liu, Y Lu, X Liu, S Bai, S Li… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

Semantic segmentation is important for many real-world systems, eg, autonomous vehicles,

which predict the class of each pixel. Recently, deep networks achieved significant progress

wrt the mean Intersection-over Union (mIoU) with the cross-entropy loss. However, the cross …

Cited by 11 Related articles All 2 versions

<——2020——2020———1940——


[PDF] iop.org

Wasserstein metric for improved quantum machine learning with adjacency matrix representations

O Çaylak, OA von Lilienfeld… - Machine Learning …, 2020 - iopscience.iop.org

We study the Wasserstein metric to measure distances between molecules represented by

the atom index dependent adjacency'Coulomb'matrix, used in kernel ridge regression based

supervised learning. Resulting machine learning models of quantum properties, aka …

Cited by 10 Related articles All 5 versions

[PDF] arxiv.org

Wasserstein Embedding for Graph Learning

S Kolouri, N Naderializadeh, GK Rohde… - arXiv preprint arXiv …, 2020 - arxiv.org

We present Wasserstein Embedding for Graph Learning (WEGL), a novel and fast

framework for embedding entire graphs in a vector space, in which various machine

learning models are applicable for graph-level prediction tasks. We leverage new insights …

Cited by 19 Related articles All 5 versions 

[PDF] arxiv.org

node2coords: Graph representation learning with wasserstein barycenters

E Simou, D Thanou, P Frossard - IEEE Transactions on Signal …, 2020 - ieeexplore.ieee.org

In order to perform network analysis tasks, representations that capture the most relevant

information in the graph structure are needed. However, existing methods do not learn

representations that can be interpreted in a straightforward way and that are stable to …

  Cited by 1 Related articles All 3 versions


[PDF] mlr.press

Principled learning method for Wasserstein distributionally robust optimization with local perturbations

Y Kwon, W Kim, JH Won… - … on Machine Learning, 2020 - proceedings.mlr.press

Wasserstein distributionally robust optimization (WDRO) attempts to learn a model that

minimizes the local worst-case risk in the vicinity of the empirical data distribution defined by

Wasserstein ball. While WDRO has received attention as a promising tool for inference since …

  Related articles All 5 versions 


[PDF] bciml.cn

Wasserstein distance based deep adversarial transfer learning for intelligent fault diagnosis with unlabeled or insufficient labeled data

C Cheng, B Zhou, G Ma, D Wu, Y Yuan - Neurocomputing, 2020 - Elsevier

Intelligent fault diagnosis is one critical topic of maintenance solution for mechanical

systems. Deep learning models, such as convolutional neural networks (CNNs), have been

successfully applied to fault diagnosis tasks and achieved promising results. However, one …

  Cited by 5 Related articles All 3 versions


2020


[PDF] arxiv.org

Regularization helps with mitigating poisoning attacks: Distributionally-robust machine learning using the wasserstein distance

F Farokhi - arXiv preprint arXiv:2001.10655, 2020 - arxiv.org

We use distributionally-robust optimization for machine learning to mitigate the effect of data

poisoning attacks. We provide performance guarantees for the trained model on the original

data (not including the poison records) by training the model for the worst-case distribution …

  Cited by 5 Related articles All 3 versions 
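
Generically, the distributionally robust training problem underlying this and the other Wasserstein DRO learning entries is $\min_\theta\ \sup_{Q:\, W_p(Q, \hat{P}_N) \le \epsilon}\ \mathbb{E}_{\xi \sim Q}[\ell(\theta; \xi)]$, i.e. the model is trained against the worst-case distribution within a Wasserstein ball of radius $\epsilon$ around the empirical distribution $\hat{P}_N$ of the (possibly poisoned) training data; this is a generic statement of Wasserstein DRO, not the paper's exact formulation.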


PDF] arxiv.org

Robust Reinforcement Learning with Wasserstein Constraint

L Hou, L Pang, X Hong, Y Lan, Z Ma, D Yin - arXiv preprint arXiv …, 2020 - arxiv.org

Robust Reinforcement Learning aims to find the optimal policy with some extent of

robustness to environmental dynamics. Existing learning algorithms usually enable the

robustness through disturbing the current state or simulating environmental parameters in a …

  Related articles All 4 versions 


[PDF] arxiv.org

Statistical learning in Wasserstein space

A Karimi, L Ripani, TT Georgiou - IEEE Control Systems Letters, 2020 - ieeexplore.ieee.org

We seek a generalization of regression and principle component analysis (PCA) in a metric

space where data points are distributions metrized by the Wasserstein metric. We recast

these analyses as multimarginal optimal transport problems. The particular formulation …

Cited by 6 Related articles All 7 versions 

 

Learning Graphons via Structured Gromov-Wasserstein Barycenters

H Xu, D Luo, L Carin, H Zha - arXiv preprint arXiv:2012.05644, 2020 - arxiv.org

We propose a novel and principled method to learn a nonparametric graph model called

graphon, which is defined in an infinite-dimensional space and represents arbitrary-size

graphs. Based on the weak regularity lemma from the theory of graphons, we leverage a …

  Cited by 1 Related articles All 2 versions 


Learning Wasserstein Isometric Embedding for Point Clouds

K Kawano, S Koide, T Kutsuna - 2020 International Conference …, 2020 - ieeexplore.ieee.org

The Wasserstein distance has been employed for determining the distance between point

clouds, which have variable numbers of points and invariance of point order. However, the

high computational cost associated with the Wasserstein distance hinders its practical …

  All 2 versions

<——2020——2020———1950—— 

 

[PDF] arxiv.org

Learning disentangled representations with the Wasserstein Autoencoder

B Gaujac, I Feige, D Barber - arXiv preprint arXiv:2010.03459, 2020 - arxiv.org

Disentangled representation learning has undoubtedly benefited from objective function

surgery. However, a delicate balancing act of tuning is still required in order to trade off

reconstruction fidelity versus disentanglement. Building on previous successes of penalizing …

  Related articles All 2 versions 


[PDF] arxiv.org

Automatic coding of students' writing via Contrastive Representation Learning in the Wasserstein space

R Jiang, J Gouvea, D Hammer, S Aeron - arXiv preprint arXiv:2011.13384, 2020 - arxiv.org

Qualitative analysis of verbal data is of central importance in the learning sciences. It is labor-

intensive and time-consuming, however, which limits the amount of data researchers can

include in studies. This work is a step towards building a statistical machine learning (ML) …

  Related articles All 2 versions 

 

[PDF] arxiv.org

Wasserstein-based fairness interpretability framework for machine learning models

A Miroshnikov, K Kotsiopoulos, R Franks… - arXiv preprint arXiv …, 2020 - arxiv.org

In this article, we introduce a fairness interpretability framework for measuring and

explaining bias in classification and regression models at the level of a distribution. In our

work, motivated by the ideas of Dwork et al.(2012), we measure the model bias across sub …

  Related articles All 2 versions 


[PDF] arxiv.org

Robust Reinforcement Learning with Wasserstein Constraint

L Hou, L Pang, X Hong, Y Lan, Z Ma, D Yin - arXiv preprint arXiv …, 2020 - arxiv.org

Robust Reinforcement Learning aims to find the optimal policy with some extent of

robustness to environmental dynamics. Existing learning algorithms usually enable the

robustness through disturbing the current state or simulating environmental parameters in a …

Cited by 6 Related articles


[PDF] arxiv.org

Wasserstein Distance guided Adversarial Imitation Learning with Reward Shape Exploration

M Zhang, Y Wang, X Ma, L Xia, J Yang… - … Control and Learning …, 2020 - ieeexplore.ieee.org

The generative adversarial imitation learning (GAIL) has provided an adversarial learning

framework for imitating expert policy from demonstrations in high-dimensional continuous

tasks. However, almost all GAIL and its extensions only design a kind of reward function of …

  Cited by 2 Related articles All 5 versions


2020


[PDF] arxiv.org

Generalized Wasserstein Dice Score, Distributionally Robust Deep Learning, and Ranger for brain tumor segmentation: BraTS 2020 challenge

L Fidon, S Ourselin, T Vercauteren - arXiv preprint arXiv:2011.01614, 2020 - arxiv.org

Training a deep neural network is an optimization problem with four main ingredients: the

design of the deep neural network, the per-sample loss function, the population loss

function, and the optimizer. However, methods developed to compete in recent BraTS …

  Related articles All 2 versions 


[PDF] arxiv.org

On a Novel Application of Wasserstein-Procrustes for Unsupervised Cross-Lingual Learning

G Ramírez, R Dangovski, P Nakov… - arXiv preprint arXiv …, 2020 - arxiv.org

The emergence of unsupervised word embeddings, pre-trained on very large monolingual

text corpora, is at the core of the ongoing neural revolution in Natural Language Processing

(NLP). Initially introduced for English, such pre-trained word embeddings quickly emerged …

  Cited by 1 Related articles All 3 versions 


Conditional Wasserstein GAN-based Oversampling of Tabular Data for Imbalanced Learning

J Engelmann, S Lessmann - arXiv preprint arXiv:2008.09202, 2020 - arxiv.org

Class imbalance is a common problem in supervised learning and impedes the predictive

performance of classification models. Popular countermeasures include oversampling the

minority class. Standard methods like SMOTE rely on finding nearest neighbours and linear …

  Cited by 1 Related articles All 5 versions 


GraphWGAN: Graph Representation Learning with Wasserstein Generative Adversarial Networks

R Yan, H Shen, C Qi, K Cen… - 2020 IEEE International …, 2020 - ieeexplore.ieee.org

Graph representation learning aims to represent vertices as low-dimensional and real-

valued vectors to facilitate subsequent downstream tasks, ie, node classification, link

predictions. Recently, some novel graph representation learning frameworks, which try to …

  Related articles All 2 versions


[PDF] arxiv.org

Safe Zero-Shot Model-Based Learning and Control: A Wasserstein Distributionally Robust Approach

A Kandel, SJ Moura - arXiv preprint arXiv:2004.00759, 2020 - arxiv.org

This paper explores distributionally robust zero-shot model-based learning and control

using Wasserstein ambiguity sets. Conventional model-based reinforcement learning

algorithms struggle to guarantee feasibility throughout the online learning process. We …

  Related articles All 2 versions 

<——2020——2020———1960——



[PDF] epfl.ch

Wasserstein Distributionally Robust Learning

S Shafieezadeh Abadeh - 2020 - infoscience.epfl.ch

Many decision problems in science, engineering, and economics are affected by

uncertainty, which is typically modeled by a random variable governed by an unknown

probability distribution. For many practical applications, the probability distribution is only …

  Related articles 

[CITATION] Wasserstein Distributionally Robust Learning

OS Abadeh - 2020 - Ecole Polytechnique Fédérale de …



[PDF] arxiv.org

Encoded Prior Sliced Wasserstein AutoEncoder for learning latent manifold representations

S Krishnagopal, J Bedrossian - arXiv preprint arXiv:2010.01037, 2020 - arxiv.org

While variational autoencoders have been successful generative models for a variety of

tasks, the use of conventional Gaussian or Gaussian mixture priors are limited in their ability

to capture topological or geometric properties of data in the latent representation. In this …

  Related articles All 2 versions 


[PDF] arxiv.org

Learning Deep-Latent Hierarchies by Stacking Wasserstein Autoencoders

B Gaujac, I Feige, D Barber - arXiv preprint arXiv:2010.03467, 2020 - arxiv.org

Probabilistic models with hierarchical-latent-variable structures provide state-of-the-art

results amongst non-autoregressive, unsupervised density-based models. However, the

most common approach to training such models based on Variational Autoencoders (VAEs) …

  Related articles All 4 versions 


Functional Data Clustering Analysis via the Learning of Gaussian Processes with Wasserstein Distance

T Li, J Ma - International Conference on Neural Information …, 2020 - Springer

Functional data clustering analysis becomes an urgent and challenging task in the new era

of big data. In this paper, we propose a new framework for functional data clustering

analysis, which adopts a similar structure as the k-means algorithm for the conventional …

  Related articles


A Generative Model for Zero-Shot Learning via Wasserstein Auto-encoder

X Luo, Z Cai, F Wu, J Xiao-Yuan - 2020 IEEE International …, 2020 - ieeexplore.ieee.org

Zero-shot learning aims to use the labeled instances to train the model, and then classifies

the instances that belong to a class without labeled instances. However, the training

instances and test instances are disjoint. Thus, the description of the classes (eg text …

  Related articles


 2020

[PDF] researchgate.net

[PDF] Wasserstein Barycenters for Bayesian Learning: Technical Report

G Rios - 2020 - researchgate.net

Within probabilistic modelling, a crucial but challenging task is that of learning (or fitting) the

models. For models described by a finite set of parameters, this task is reduced to finding the

best parameters, to feed them into the model and then calculate the posterior distribution to …

  Related articles 

  

[PDF] minegrado.ovh

[PDF] EE-559–Deep learning 10.2. Wasserstein GAN

F Fleuret - 2020 - minegrado.ovh

… µ DJS(µ, µ ) = min(δ, |x|) (1 δ log ( 1 + 1 2δ ) − ( 1 + 1 δ ) log ( 1 + 1 δ )) Hence all |x| greater than

δ are seen the same. François Fleuret EE-559 – Deep learning / 10.2. Wasserstein GAN 1 / 16

Page 2. An alternative choice is the “earth moving distance”, which intuitively is the minimum …

  Related articles All 2 versions 


[PDF] fleuret.org

[PDF] Deep learning 11.2. Wasserstein GAN

F Fleuret - 2020 - fleuret.org

… Wasserstein GAN. An alternative choice is the “earth moving distance”, or

Wasserstein distance, which intuitively is the minimum mass displacement to transform one

distribution into the other. [Slide figure: an example piecewise-constant measure, $\mu = \frac{1}{4}\mathbf{1}_{[1,2]} + \frac{1}{4}\mathbf{1}_{[3,4]} + \dots$]

  Related articles All 2 versions 



 2020

[PDF] arxiv.org

Online Stochastic Optimization with Wasserstein Based Non-stationarity

J Jiang, X Li, J Zhang - arXiv preprint arXiv:2012.06961, 2020 - arxiv.org

We consider a general online stochastic optimization problem with multiple budget

constraints over a horizon of finite time periods. At each time period, a reward function and

multiple cost functions, where each cost function is involved in the consumption of one …

  Related articles All 2 versions 


2020

[PDF] arxiv.org

Online Stochastic Convex Optimization: Wasserstein Distance Variation

I Shames, F Farokhi - arXiv preprint arXiv:2006.01397, 2020 - arxiv.org

Distributionally-robust optimization is often studied for a fixed set of distributions rather than

time-varying distributions that can drift significantly over time (which is, for instance, the case

in finance and sociology due to underlying expansion of economy and evolution of …

  Related articles All 3 versions 

<——2020——2020———1970—— 




Stochastic optimization for regularized wasserstein estimators

M Ballu, Q Berthet, F Bach - International Conference on …, 2020 - proceedings.mlr.press

Optimal transport is a foundational problem in optimization, that allows to compare

probability distributions while taking into account geometric aspects. Its optimal objective

value, the Wasserstein distance, provides an important loss between distributions that has …

  Cited by 7 Related articles All 4 versions 
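
The regularized estimators above build on entropic regularization of optimal transport. A minimal NumPy sketch of the classical Sinkhorn iteration for the entropically regularized transport cost between two histograms (a generic illustration, unrelated to the specific stochastic algorithm of the paper):

```python
import numpy as np

def sinkhorn_cost(a, b, C, gamma=0.1, n_iter=500):
    """Entropic OT cost between histograms a, b (each summing to 1) and cost matrix C."""
    K = np.exp(-C / gamma)               # Gibbs kernel
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iter):
        u = a / (K @ v)                  # rescale rows to match marginal a
        v = b / (K.T @ u)                # rescale columns to match marginal b
    P = u[:, None] * K * v[None, :]      # approximate optimal transport plan
    return float(np.sum(P * C))          # transport cost under the regularized plan
```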

Stochastic Optimization for Regularized Wasserstein Estimators

F Bach, M Ballu, Q Berthet - 2020 - research.google

Optimal transport is a foundational problem in optimization, that allows to compare

probability distributions while taking into account geometric aspects. Its optimal objective

value, the Wasserstein distance, provides an important loss between distributions that has …

 

[PDF] arxiv.org

Projection robust Wasserstein distance and Riemannian optimization

T Lin, C Fan, N Ho, M Cuturi, MI Jordan - arXiv preprint arXiv:2006.07458, 2020 - arxiv.org

Projection robust Wasserstein (PRW) distance, or Wasserstein projection pursuit (WPP), is a

robust variant of the Wasserstein distance. Recent work suggests that this quantity is more

robust than the standard Wasserstein distance, in particular when comparing probability …

  Cited by 5 Related articles All 6 versions 


[PDF] arxiv.org

On linear optimization over wasserstein balls

MC Yue, D Kuhn, W Wiesemann - arXiv preprint arXiv:2004.07162, 2020 - arxiv.org

Wasserstein balls, which contain all probability measures within a pre-specified Wasserstein

distance to a reference measure, have recently enjoyed wide popularity in the

distributionally robust optimization and machine learning communities to formulate and …

  Cited by 4 Related articles All 6 versions 
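
For the record, the ball in the title is the ambiguity set $\mathbb{B}_\epsilon(\hat{P}) = \{Q : W_p(Q, \hat{P}) \le \epsilon\}$ with $W_p(Q, \hat{P}) = \big( \inf_{\pi \in \Pi(Q, \hat{P})} \int \|\xi - \xi'\|^p \, \mathrm{d}\pi(\xi, \xi') \big)^{1/p}$; linear optimization over the ball means optimizing an expectation (which is linear in the measure) over all $Q$ in this set.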

A Wasserstein based two-stage distributionally robust optimization model for optimal operation of CCHP micro-grid under uncertainties

Y Wang, Y Yang, L Tang, W Sun, B Li - International Journal of Electrical …, 2020 - Elsevier

Combined cooling, heating and power (CCHP) micro-grids are getting increasing attentions

due to the realization of cleaner production and high energy efficiency. However, with the

features of complex tri-generation structure and renewable power uncertainties, it is …

  Cited by 16 Related articles All 2 versions


[PDF] arxiv.org

Wasserstein distributionally robust inverse multiobjective optimization

C Dong, B Zeng - arXiv preprint arXiv:2009.14552, 2020 - arxiv.org

Inverse multiobjective optimization provides a general framework for the unsupervised

learning task of inferring parameters of a multiobjective decision making problem (DMP),

based on a set of observed decisions from the human expert. However, the performance of …

  Cited by 2 Related articles All 2 versions 


2020

[PDF] arxiv.org

Distributed optimization with quantization for computing Wasserstein barycenters

R Krawtschenko, CA Uribe, A Gasnikov… - arXiv preprint arXiv …, 2020 - arxiv.org

We study the problem of the decentralized computation of entropy-regularized semi-discrete

Wasserstein barycenters over a network. Building upon recent primal-dual approaches, we

propose a sampling gradient quantization scheme that allows efficient communication and …

  Cited by 2 Related articles All 3 versions 


[PDF] mlr.press

Principled learning method for Wasserstein distributionally robust optimization with local perturbations

Y Kwon, W Kim, JH Won… - … Conference on Machine …, 2020 - proceedings.mlr.press

Wasserstein distributionally robust optimization (WDRO) attempts to learn a model that

minimizes the local worst-case risk in the vicinity of the empirical data distribution defined by

Wasserstein ball. While WDRO has received attention as a promising tool for inference since …

  Related articles All 5 versions 


[PDF] arxiv.org

Stochastic saddle-point optimization for wasserstein barycenters

D Tiapkin, A Gasnikov, P Dvurechensky - arXiv preprint arXiv:2006.06763, 2020 - arxiv.org

We study the computation of non-regularized Wasserstein barycenters of probability

measures supported on the finite set. The first result gives a stochastic optimization

algorithm for the discrete distribution over the probability measures which is comparable …

  Cited by 2 Related articles All 3 versions 


[PDF] optimization-online.org

Data-driven distributionally robust chance-constrained optimization with Wasserstein metric

R Ji, MA Lejeune - Journal of Global Optimization, 2020 - Springer

We study distributionally robust chance-constrained programming (DRCCP) optimization

problems with data-driven Wasserstein ambiguity sets. The proposed algorithmic and

reformulation framework applies to all types of distributionally robust chance-constrained …

  Cited by 9 Related articles All 3 versions
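
The distributionally robust chance constraint studied in entries like this one has the generic form $\inf_{P \in \mathcal{B}_\epsilon(\hat{P}_N)} P\big(g(x,\xi) \le 0\big) \ge 1-\alpha$, i.e. the constraint must hold with probability at least $1-\alpha$ under every distribution in a Wasserstein ball around the empirical data distribution (a generic statement, not this paper's specific reformulation).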

[PDF] arxiv.org


Finite-sample guarantees for Wasserstein distributionally robust optimization: Breaking the curse of dimensionality

R Gao - arXiv preprint arXiv:2009.04382, 2020 - arxiv.org

Wasserstein distributionally robust optimization (DRO) aims to find robust and generalizable

solutions by hedging against data perturbations in Wasserstein distance. Despite its recent

empirical success in operations research and machine learning, existing performance …

  Cited by 1 Related articles All 3 versions 

<——2020——2020———1980—— 



[PDF] ieee.org

A New Data-Driven Distributionally Robust Portfolio Optimization Method based on Wasserstein Ambiguity Set

N Du, Y Liu, Y Liu - IEEE Access, 2020 - ieeexplore.ieee.org

Since optimal portfolio strategy depends heavily on the distribution of uncertain returns, this

paper proposes a new method for the portfolio optimization problem with respect to

distribution uncertainty. When the distributional information of the uncertain return rate is …

  Related articles



[PDF] arxiv.org

Primal wasserstein imitation learning

R Dadashi, L Hussenot, M Geist, O Pietquin - arXiv preprint arXiv …, 2020 - arxiv.org

Imitation Learning (IL) methods seek to match the behavior of an agent with that of an expert.

In the present work, we propose a new IL method based on a conceptually simple algorithm:

Primal Wasserstein Imitation Learning (PWIL), which ties to the primal form of the …

  Cited by 6 Related articles All 2 versions 



[PDF] mit.edu

Wasserstein barycenters: statistics and optimization

AJ Stromme - 2020 - dspace.mit.edu

We study a geometric notion of average, the barycenter, over 2-Wasserstein space. We

significantly advance the state of the art by introducing extendible geodesics, a simple

synthetic geometric condition which implies non-asymptotic convergence of the empirical …

  Related articles 

  

[PDF] umn.edu

Data-driven Distributionally Robust Stochastic Optimization via Wasserstein Distance with Applications to Portfolio Risk Management and Inventory Control

D Singh - 2020 - conservancy.umn.edu

The central theme of this dissertation is stochastic optimization under distributional

ambiguity. One can think of this as a two-player game between a decision maker, who tries to

minimize some loss or maximize some reward, and an adversarial agent that chooses the …

  All 2 versions 


Data-driven Risk-sensitive Appointment Scheduling: A Wasserstein Distributionally Robust Optimization Approach

Z Pang, S Wang - Available at SSRN 3740083, 2020 - papers.ssrn.com

We consider an optimal appointment scheduling problem for a single-server healthcare

delivery system with random durations, focusing on the tradeoff between overtime work and

patient delays which are measured under conditional value-at-risk (CVaR). To address the …

 

2020

 




Wasserstein distributionally robust stochastic control: A data-driven approach

I Yang - IEEE Transactions on Automatic Control, 2020 - ieeexplore.ieee.org

Standard stochastic control methods assume that the probability distribution of uncertain

variables is available. Unfortunately, in practice, obtaining accurate distribution information

is a challenging task. To resolve this issue, in this article we investigate the problem of …

  Cited by 23 Related articles All 3 versions


[PDF] iop.org


Wasserstein metric for improved quantum machine learning with adjacency matrix representations

O Çaylak, OA von Lilienfeld… - Machine Learning …, 2020 - iopscience.iop.org

We study the Wasserstein metric to measure distances between molecules represented by

the atom index dependent adjacency 'Coulomb' matrix, used in kernel ridge regression based

supervised learning. Resulting machine learning models of quantum properties, aka …

  Cited by 4 Related articles


A data-driven distributionally robust newsvendor model with a Wasserstein ambiguity set

S Lee, H Kim, I Moon - Journal of the Operational …, 2020 - orsociety.tandfonline.com

In this paper, we derive a closed-form solution and an explicit characterization of the worst-

case distribution for the data-driven distributionally robust newsvendor model with an

ambiguity set based on the Wasserstein distance of order p ∈ [1,∞). We also consider the …

  Cited by 4 Related articles All 2 versions


Data-driven distributionally robust unit commitment with Wasserstein metric: Tractable formulation and efficient solution method

X Zheng, H Chen - IEEE Transactions on Power Systems, 2020 - ieeexplore.ieee.org

In this letter, we propose a tractable formulation and an efficient solution method for the

Wasserstein-metric-based distributionally robust unit commitment (DRUC-dW) problem.

First, a distance-based data aggregation method is introduced to hedge against the …

  Cited by 3 Related articles All 2 versions

<——2020——2020———1990——


[PDF] arxiv.org

Regularization helps with mitigating poisoning attacks: Distributionally-robust machine learning using the wasserstein distance

F Farokhi - arXiv preprint arXiv:2001.10655, 2020 - arxiv.org

We use distributionally-robust optimization for machine learning to mitigate the effect of data

poisoning attacks. We provide performance guarantees for the trained model on the original

data (not including the poison records) by training the model for the worst-case distribution …

  Cited by 5 Related articles All 3 versions 




Data-driven stochastic programming with distributionally robust constraints under Wasserstein distance: asymptotic properties

Y Mei, ZP Chen, BB Ji, ZJ Xu, J Liu - … of the Operations Research Society of …, 2020 - Springer

Distributionally robust optimization is a dominant paradigm for decision-making problems

where the distribution of random variables is unknown. We investigate a distributionally

robust optimization problem with ambiguities in the objective function and countably infinite …

  Cited by 1 Related articles

[PDF] nsf.gov

A Data-Driven Distributionally Robust Game Using Wasserstein Distance

G Peng, T Zhang, Q Zhu - International Conference on Decision and Game …, 2020 - Springer

This paper studies a special class of games, which enables the players to leverage the

information from a dataset to play the game. However, in an adversarial scenario, the

dataset may not be trustworthy. We propose a distributionally robust formulation to introduce …

  Related articles All 2 versions


Wasserstein-based fairness interpretability framework for machine learning models

A Miroshnikov, K Kotsiopoulos, R Franks… - arXiv preprint arXiv …, 2020 - arxiv.org

In this article, we introduce a fairness interpretability framework for measuring and

explaining bias in classification and regression models at the level of a distribution. In our

work, motivated by the ideas of Dwork et al.(2012), we measure the model bias across sub …

  Related articles All 2 versions 


2020

[PDF] arxiv.org

Data-Driven Approximation of the Perron-Frobenius Operator Using the Wasserstein Metric

A Karimi, TT Georgiou - arXiv preprint arXiv:2011.00759, 2020 - arxiv.org

This manuscript introduces a regression-type formulation for approximating the Perron-

Frobenius Operator by relying on distributional snapshots of data. These snapshots may

represent densities of particles. The Wasserstein metric is leveraged to define a suitable …

  Related articles All 3 versions 




[PDF] arxiv.org

Quadratic Wasserstein metrics for von Neumann algebras via transport plans

R Duvenhage - arXiv preprint arXiv:2012.03564, 2020 - arxiv.org

We show how one can obtain a class of quadratic Wasserstein metrics, that is to say,

Wasserstein metrics of order 2, on the set of faithful normal states of a von Neumann algebra

$ A $, via transport plans, rather than through a dynamical approach. Two key points to …

  Cited by 1 Related articles All 2 versions 





<——2020——2020———2000—— 


Semi-supervised Data-driven Surface Wave Tomography using Wasserstein Cycle-consistent GAN: Application on Southern California Plate Boundary Region

A Cai, H Qiu, F Niu - 2020 - essoar.org

Current machine learning based shear wave velocity (Vs) inversion using surface wave

dispersion measurements utilizes synthetic dispersion curves calculated from existing 3-D

velocity models as training datasets. It is shown in the previous studies that the …

  All 2 versions


[PDF] optimization-online.org

[PDF] A Novel Solution Methodology for Wasserstein-based Data-Driven Distributionally Robust Problems

CA Gamboa, DM Valladao, A Street… - optimization-online.org

Distributionally robust optimization (DRO) is a mathematical framework to incorporate

ambiguity over the actual data-generating probability distribution. Data-driven DRO

problems based on the Wasserstein distance are of particular interest for their sound …

  Related articles 


Distributional sliced-Wasserstein and applications to generative modeling

K Nguyen, N Ho, T Pham, H Bui - arXiv preprint arXiv:2002.07367, 2020 - arxiv.org

Sliced-Wasserstein distance (SWD) and its variation, Max Sliced-Wasserstein distance (Max-

SWD), have been widely used in the recent years due to their fast computation and

scalability when the probability measures lie in very high dimension. However, these …

  Cited by 7 Related articles All 4 versions 
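As a rough illustration of the sliced-Wasserstein idea behind this line of work (a hedged sketch, not the authors' implementation; the function and parameter names below are invented for the example), SWD can be estimated by projecting both samples onto random unit directions and averaging the resulting one-dimensional Wasserstein distances:

import numpy as np
from scipy.stats import wasserstein_distance

def sliced_wasserstein(X, Y, n_projections=100, seed=0):
    # Monte-Carlo estimate of the sliced-Wasserstein distance between two
    # empirical measures given as (n_samples x dim) arrays X and Y.
    rng = np.random.default_rng(seed)
    dim = X.shape[1]
    total = 0.0
    for _ in range(n_projections):
        theta = rng.normal(size=dim)
        theta /= np.linalg.norm(theta)                       # random unit direction
        total += wasserstein_distance(X @ theta, Y @ theta)  # 1-D OT reduces to sorting
    return total / n_projections

X = np.random.default_rng(1).normal(size=(500, 10))
Y = np.random.default_rng(2).normal(loc=0.5, size=(500, 10))
print(sliced_wasserstein(X, Y))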

 

[PDF] arxiv.org

Improving Relational Regularized Autoencoders with Spherical Sliced Fused Gromov Wasserstein

K Nguyen, S Nguyen, N Ho, T Pham, H Bui - arXiv preprint arXiv …, 2020 - arxiv.org

Relational regularized autoencoder (RAE) is a framework to learn the distribution of data by

minimizing a reconstruction loss together with a relational regularization on the latent space.

A recent attempt to reduce the inner discrepancy between the prior and aggregated …

  Cited by 2 Related articles All 3 versions 

 

GraphWGAN: Graph Representation Learning with Wasserstein Generative Adversarial Networks

R Yan, H Shen, C Qi, K Cen… - 2020 IEEE International …, 2020 - ieeexplore.ieee.org

Graph representation learning aims to represent vertices as low-dimensional and real-

valued vectors to facilitate subsequent downstream tasks, ie, node classification, link

predictions. Recently, some novel graph representation learning frameworks, which try to …

  Related articles All 2 versions


2020


Wasserstein Index Generation Model: Automatic generation of time-series index with application to Economic Policy Uncertainty

F Xie - Economics Letters, 2020 - Elsevier

Automatic time-series index generation as a black-box method … Comparable results with existing

ones, tested on EPU … Applicable to any text corpus to produce sentiment indices … I propose

a novel method, the Wasserstein Index Generation model (WIG), to generate a public sentiment …

  Cited by 6 Related articles All 11 versions


2020


 

2020

[PDF] upenn.edu

Optimal Estimation of Wasserstein Distance on a Tree With an Application to Microbiome Studies

S Wang, TT Cai, H Li - Journal of the American Statistical …, 2020 - Taylor & Francis

The weighted UniFrac distance, a plug-in estimator of the Wasserstein distance of read

counts on a tree, has been widely used to measure the microbial community difference in

microbiome studies. Our investigation however shows that such a plug-in estimator …

  Related articles All 4 versions


A novel kernel Wasserstein distance on Gaussian measures: An application of identifying dental artifacts in head and neck computed tomography

JH Oh, M Pouryahya, A Iyer, AP Apte, JO Deasy… - Computers in biology …, 2020 - Elsevier

The Wasserstein distance is a powerful metric based on the theory of optimal mass

transport. It gives a natural measure of the distance between two distributions with a wide

range of applications. In contrast to a number of the common divergences on distributions …

  Cited by 1 Related articles All 5 versions


[PDF] arxiv.org

On a Novel Application of Wasserstein-Procrustes for Unsupervised Cross-Lingual Learning

G Ramírez, R Dangovski, P Nakov… - arXiv preprint arXiv …, 2020 - arxiv.org

The emergence of unsupervised word embeddings, pre-trained on very large monolingual

text corpora, is at the core of the ongoing neural revolution in Natural Language Processing

(NLP). Initially introduced for English, such pre-trained word embeddings quickly emerged …

  Related articles All 3 versions 

<——2020——2020———2010——


[PDF] arxiv.org

Portfolio Optimisation within a Wasserstein Ball

SM Pesenti, S Jaimungal - Available at SSRN, 2020 - papers.ssrn.com

We consider the problem of active portfolio management where a loss-averse and/or gain-

seeking investor aims to outperform a benchmark strategy's risk profile while not deviating

too much from it. Specifically, an investor considers alternative strategies that co-move with …

  Related articles All 7 versions


[PDF] jku.at

WGAIN: Data Imputation using Wasserstein GAIN/submitted by Christina Halmich

C Halmich - 2020 - epub.jku.at

Missing data is a well known problem in the Machine Learning world. A lot of datasets that

are used for training algorithms contain missing values, eg 45% of the datasets stored in the

UCI Machine Learning Repository [16], which is a commonly used dataset collection …

  Related articles All 2 versions 


Stein's method for normal approximation in Wasserstein distances with application to the multivariate Central Limit Theorem

T Bonis - Probability Theory and Related Fields, 2020 - Springer

We use Stein's method to bound the Wasserstein distance of order 2 between a

measure \(\nu\) and the Gaussian measure using a stochastic process \((X_t)_{t\ge 0}\) such

that \(X_t\) is drawn from \(\nu\) for any \(t> 0\). If the stochastic process \((X_t)_{t\ge 0}\) …

   Cited by 8 Related articles All 3 versions


[PDF] amazonaws.com

[PDF] Bayesian Wasserstein GAN and Application for Vegetable Disease Image Data

W Cho, MH Na, S Kang, S Kim - 2020 - manuscriptlink-society-file.s3 …

Various GAN models have been proposed so far and they are used in various fields.

However, despite the excellent performance of these GANs, the biggest problem is that the

model collapse occurs in the simultaneous optimization of the generator and discriminator of …

  Related articles 

2020

Nonlinear model reduction on metric spaces. Application to one-dimensional conservative PDEs in Wasserstein spaces

V Ehrlacher, D Lombardi, O Mula… - … and Numerical Analysis, 2020 - search.proquest.com

We consider the problem of model reduction of parametrized PDEs where the goal is to

approximate any function belonging to the set of solutions at a reduced computational cost.

For this, the bottom line of most strategies has so far been based on the approximation of the …

  Related articles All 2 versions


2020



[PDF] arxiv.org

Strong Formulations for Distributionally Robust Chance-Constrained Programs with Left-Hand Side Uncertainty under Wasserstein Ambiguity

N Ho-Nguyen, F Kılınç-Karzan, S Küçükyavuz… - arXiv preprint arXiv …, 2020 - arxiv.org

Distributionally robust chance-constrained programs (DR-CCP) over Wasserstein ambiguity

sets exhibit attractive out-of-sample performance and admit big-$ M $-based mixed-integer

programming (MIP) reformulations with conic constraints. However, the resulting …

  Cited by 2 Related articles All 3 versions 


[PDF] arxiv.org

Wasserstein Random Forests and Applications in Heterogeneous Treatment Effects

Q Du, G Biau, F Petit, R Porcher - arXiv preprint arXiv:2006.04709, 2020 - arxiv.org

We present new insights into causal inference in the context of Heterogeneous Treatment

Effects by proposing natural variants of Random Forests to estimate the key conditional

distributions. To achieve this, we recast Breiman's original splitting criterion in terms of …

  Related articles All 4 versions 


[PDF] arxiv.org

Statistical analysis of Wasserstein GANs with applications to time series forecasting

M Haas, S Richter - arXiv preprint arXiv:2011.03074, 2020 - arxiv.org

We provide statistical theory for conditional and unconditional Wasserstein generative

adversarial networks (WGANs) in the framework of dependent observations. We prove

upper bounds for the excess Bayes risk of the WGAN estimators with respect to a modified …

  Cited by 2 Related articles All 3 versions 


[PDF] uniroma1.it

[PDF] Nonparametric Density Estimation with Wasserstein Distance for Actuarial Applications

EG Luini - iris.uniroma1.it

Density estimation is a central topic in statistics and a fundamental task of actuarial sciences.

In this work, we present an algorithm for approximating multivariate empirical densities with

a piecewise constant distribution defined on a hyperrectangular-shaped partition of the …

  Related articles All 2 versions 

<——2020——2020———2020 titles ——



Wasserstein loss-based deep object detection

Y Han, X Liu, Z Sheng, Y Ren, X Han… - Proceedings of the …, 2020 - openaccess.thecvf.com

Object detection locates the objects with bounding boxes and identifies their classes, which

is valuable in many computer vision applications (eg autonomous driving). Most existing

deep learning-based methods output a probability vector for instance classification trained …

  Cited by 8 Related articles All 5 versions 


[PDF] mlr.press

Stronger and faster Wasserstein adversarial attacks

K Wu, A Wang, Y Yu - International Conference on Machine …, 2020 - proceedings.mlr.press

Deep models, while being extremely flexible and accurate, are surprisingly vulnerable to

“small, imperceptible” perturbations known as adversarial attacks. While the majority of

existing attacks focus on measuring perturbations under the $\ell_p $ metric, Wasserstein  …

  Cited by 2 Related articles All 7 versions 


[PDF] ucl.ac.uk

Ripple-GAN: Lane line detection with ripple lane line detection network and Wasserstein GAN

Y Zhang, Z Lu, D Ma, JH Xue… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

With artificial intelligence technology being advanced by leaps and bounds, intelligent

driving has attracted a huge amount of attention recently in research and development. In

intelligent driving, lane line detection is a fundamental but challenging task particularly …

  Cited by 1 Related articles All 2 versions


[PDF] arxiv.org

High-Confidence Attack Detection via Wasserstein-Metric Computations

D Li, S Martínez - IEEE Control Systems Letters, 2020 - ieeexplore.ieee.org

This letter considers a sensor attack and fault detection problem for linear cyber-physical

systems, which are subject to system noise that can obey an unknown light-tailed

distribution. We propose a new threshold-based detection mechanism that employs the …

  Cited by 1 Related articles All 5 versions
 


[PDF] ieee.org

[CITATION] Network Intrusion Detection Based on Conditional Wasserstein Generative Adversarial Network and Cost-Sensitive Stacked Autoencoder

G Zhang, X Wang, R Li, Y Song, J He, J Lai - IEEE Access, 2020 - ieeexplore.ieee.org

In the field of intrusion detection, there is often a problem of data imbalance, and more and

more unknown types of attacks make detection difficult. To resolve above issues, this article

proposes a network intrusion detection model called CWGAN-CSSAE, which combines …

  Related articles


2020


[PDF] arxiv.org

Entropic-Wasserstein barycenters: PDE characterization, regularity and CLT

G Carlier, K Eichinger, A Kroshnin - arXiv preprint arXiv:2012.10701, 2020 - arxiv.org

In this paper, we investigate properties of entropy-penalized Wasserstein barycenters

introduced by Bigot, Cazelles and Papadakis (2019) as a regularization of Wasserstein

barycenters first presented by Agueh and Carlier (2011). After characterizing these …

  Related articles All 5 versions 



[PDF] arxiv.org

Automatic coding of students' writing via Contrastive Representation Learning in the Wasserstein space

R Jiang, J Gouvea, D Hammer, S Aeron - arXiv preprint arXiv:2011.13384, 2020 - arxiv.org

Qualitative analysis of verbal data is of central importance in the learning sciences. It is labor-

intensive and time-consuming, however, which limits the amount of data researchers can

include in studies. This work is a step towards building a statistical machine learning (ML) …

  Related articles All 2 versions 


Remaining useful life prediction of lithium-ion batteries using a fusion method based on Wasserstein GAN

周温丁, 鲍士兼, 许方敏, 赵成林 - The Journal of China Universities of Posts and Telecommunications, 2020 - jcupt.bupt.edu.cn

Lithium-ion batteries are the main power supply equipment in many fields due to their

advantages of no memory, high energy density, long cycle life and no pollution to the

environment. Accurate prediction for the remaining useful life (RUL) of lithium-ion batteries …

  All 2 versions 


2020  [PDF] arxiv.org

A generalized Vaserstein symbol

T Syed - Annals of K-Theory, 2020 - msp.org

Let R be a commutative ring. For any projective R-module P_0 of constant rank 2 with a

trivialization of its determinant, we define a generalized Vaserstein symbol on the orbit

space of the set of epimorphisms P_0 ⊕ R → R under the action of the group of elementary …

   Cited by 5 Related articles All 11 versions 


2020 see 2014

A Survey on the Non-injectivity of the Vaserstein Symbol in Dimension Three

N Gupta, DR Rao, S Kolte - Leavitt Path Algebras and Classical K-Theory, 2020 - Springer

We give a recap of the study of the Vaserstein symbol \(V_A : Um_3(A)/E_3(A) \longrightarrow W_E(A)\),

the elementary symplectic Witt group; when A is an affine threefold over a field k …

LN Vaserstein in [20] proved that the orbit space of unimodular rows of length three modulo elementary …

  Cited by 1 Related articles All 2 versions

<——2020——2020———2030—— 


Ranking IPCC Model Performance Using the Wasserstein Distance

by G Vissio · 2020 · Cited by 2 — … performance against benchmarks based on the use of the Wasserstein distance (WD). ... The samples used in the WD calculations are drawn by performing a Ulam ...
Abstract: We propose a methodology for intercomparing climate models and evaluating their performance against benchmarks based on the use of the Wasserstein distance (WD). This distance provides a rigorous way to measure quantitatively the difference between two probability distributions. The proposed approach is flexible and can be applied in any number of dimensions; it allows one to rank climate models taking into account all the moments of the distributions. By selecting the combination of climatic variables and the regions of interest, it is possible to highlight specific model deficiencies. The WD enables a comprehensive evaluation of the skill of a climate model. We apply this approach to a selected number of physical fields, ranking the models in terms of their performance in simulating them and pinpointing their weaknesses in the simulation of some of the selected physical …

Related articles All 2 versions
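For readers who want to see the WD in action, here is a minimal illustrative sketch (not the authors' code; the sample arrays and names are made up) that ranks two synthetic "model" outputs against a benchmark sample with SciPy's one-dimensional Wasserstein distance:

import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
benchmark = rng.normal(loc=14.0, scale=1.0, size=10_000)  # stand-in for observed data
model_a = rng.normal(loc=14.3, scale=1.1, size=10_000)    # hypothetical model with a warm bias
model_b = rng.normal(loc=13.0, scale=0.8, size=10_000)    # hypothetical model with a cold bias

for name, sample in [("model_a", model_a), ("model_b", model_b)]:
    # smaller distance to the benchmark means a better-ranked model
    print(name, wasserstein_distance(sample, benchmark))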

2020 Sep 11

Master's Thesis Presentation • Machine Learning — Wasserstein Adversarial Robustness
cs.uwaterloo.ca › events › masters-thesis-presentation-m...

Master's Thesis Presentation • Machine Learning — Wasserstein Autoencoders with Mixture of Gaussian Priors for Stylized Text Generation

Please note: This master's thesis presentation will be given online. Amirpasha Ghabussi ... Variational autoencoders and Wasserstein autoencoders are two widely used methods for text. ... Thursday, January 21, 2021 — 10:00 AM EST ...

Amirpasha Ghabussi, Master’s candidate
David R. Cheriton School of Computer Science

Supervisor: Professor Olga Vechtomova


2020 video

Wasserstein Loss - Week 3: Wasserstein GANs with Gradient Penalty

www.coursera.org › lecture › wasserstein-loss-vy3To

Oct 21, 2020 — I really liked how well the sections on Wasserstein Loss and Conditional & Controllable ... Week 3: Wasserstein GANs with Gradient Penalty ... Online · Master of Applied Data Science · Global MBA · Master's of Innovation & ...

Wasserstein Loss - Week 3: Wasserstein GANs with Gradient Penalty

www.coursera.org › lecture › wasserstein-loss-vy3To

Video created by DeepLearning.AI for the course "Build Basic Generative Adversarial Networks (GANs)". Learn advanced techniques to reduce ...

Oct 1, 2020

Lecture 11.4: Wasserstein Generative Adversarial Networks

YouTube · UniHeidelberg · Oct 15, 2020

What Are GANs? | Generative Adversarial Networks Explained | Deep Learning With Python | Edureka

YouTube · edureka!

Wasserstein Loss - Week 3: Wasserstein GANs with Gradient Penalty

Video created by DeepLearning.AI for the course "Build Basic Generative Adversarial Networks (GANs)" ...

Oct 2, 2020 · Uploaded by Eric Zelikman
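For orientation, the gradient-penalty objective these lectures cover can be sketched as follows. This is a hedged PyTorch sketch under common defaults (for example lambda_gp = 10); critic, real, and fake are placeholder names, and nothing here is taken from the course material itself:

import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    # Penalize the critic's gradient norm on random interpolates between real and fake samples.
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    interp = (eps * real + (1 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads, = torch.autograd.grad(scores.sum(), interp, create_graph=True)
    return lambda_gp * ((grads.flatten(1).norm(2, dim=1) - 1.0) ** 2).mean()

def critic_loss(critic, real, fake):
    # Wasserstein critic objective written as a loss to minimize, plus the gradient penalty.
    return critic(fake).mean() - critic(real).mean() + gradient_penalty(critic, real, fake)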



2020 online

Primal Heuristics for Wasserstein Barycenters

by Bouchet, Pierre-Yves; Gualandi, Stefano; Rousseau, Louis-Martin

Integration of Constraint Programming, Artificial Intelligence, and Operations Research, 09/2020

This paper presents primal heuristics for the computation of Wasserstein Barycenters of a given set of discrete probability measures...

Book Chapter. Full Text Online

 Cited by 1 Related articles


2020 online   OPEN ACCESS

Missing Features Reconstruction Using a Wasserstein Generative Adversarial Imputation...

by Friedjungová, Magda; Vašata, Daniel; Balatsko, Maksym ; More...

Computational Science – ICCS 2020, 06/2020

...). Moreover, we introduce WGAIN as the Wasserstein modification of GAIN, which turns out to be the best imputation model when the degree of missingness is less...

Book Chapter. Full Text Online

Cited by 6 Related articles All 10 versions

2020

2020  [PDF] thecvf.com

Gromov-Wasserstein averaging in a Riemannian framework

S Chowdhury, T Needham - Proceedings of the IEEE/CVF …, 2020 - openaccess.thecvf.com

We introduce a theoretical framework for performing statistical tasks-including, but not

limited to, averaging and principal component analysis-on the space of (possibly

asymmetric) matrices with arbitrary entries and sizes. This is carried out under the lens of the …

  Cited by 10 Related articles All 6 versions 


2020

De novo protein design for novel folds using guided conditional Wasserstein generative adversarial networks

M Karimi, S Zhu, Y Cao, Y Shen - Journal of Chemical Information …, 2020 - ACS Publications

Although massive data is quickly accumulating on protein sequence and structure, there is a

small and limited number of protein architectural types (or structural folds). This study is

addressing the following question: how well could one reveal underlying sequence …

  Cited by 2 Related articles All 5 versions





[HTML] nih.gov

[HTML] EEG signal reconstruction using a generative adversarial network with Wasserstein distance and temporal-spatial-frequency loss

T Luo, Y Fan, L Chen, G Guo, C Zhou - Frontiers in …, 2020 - ncbi.nlm.nih.gov

Applications based on electroencephalography (EEG) signals suffer from the mutual

contradiction of high classification performance vs. low cost. The nature of this contradiction

makes EEG signal reconstruction with high sampling rates and sensitivity challenging …

  Cited by 6 Related articles All 5 versions


Typical wind power scenario generation for multiple wind farms using conditional improved Wasserstein generative adversarial network

Y Zhang, Q Ai, F Xiao, R Hao, T Lu - … Journal of Electrical Power & Energy …, 2020 - Elsevier

Because of environmental benefits, wind power is taking an increasing role meeting

electricity demand. However, wind power tends to exhibit large uncertainty and is largely

influenced by meteorological conditions. Apart from the variability, when multiple wind farms …

  Cited by 12 Related articles


[PDF] researchgate.net

Infrared and visible image fusion using dual discriminators generative adversarial networks with Wasserstein distance

J Li, H Huo, K Liu, C Li - Information Sciences, 2020 - Elsevier

Generative adversarial network (GAN) has shown great potential in infrared and visible

image fusion. The existing GAN-based methods establish an adversarial game between

generative image and source images to train the generator until the generative image …

  Cited by 4 Related articles All 3 versions

 

<——2020——2020———2040——



[PDF] arxiv.org

Regularized variational data assimilation for bias treatment using the Wasserstein metric

SK Tamang, A Ebtehaj, D Zou… - Quarterly Journal of the …, 2020 - Wiley Online Library

This article presents a new variational data assimilation (VDA) approach for the formal

treatment of bias in both model outputs and observations. This approach relies on the

Wasserstein metric, stemming from the theory of optimal mass transport, to penalize the …

  Cited by 1 Related articles All 4 versions


[PDF] arxiv.org

Scalable computations of Wasserstein barycenter via input convex neural networks

J Fan, A Taghvaei, Y Chen - arXiv preprint arXiv:2007.04462, 2020 - arxiv.org

Wasserstein Barycenter is a principled approach to represent the weighted mean of a given

set of probability distributions, utilizing the geometry induced by optimal transport. In this

work, we present a novel scalable algorithm to approximate the Wasserstein Barycenters …

  Cited by 3 Related articles All 3 versions 


[PDF] arxiv.org

LCS Graph Kernel Based on Wasserstein Distance in Longest Common Subsequence Metric Space

J Huang, Z Fang, H Kasai - arXiv preprint arXiv:2012.03612, 2020 - arxiv.org

For graph classification tasks, many methods use a common strategy to aggregate

information of vertex neighbors. Although this strategy provides an efficient means of

extracting graph topological features, it brings excessive amounts of information that might …

  Cited by 1 Related articles All 2 versions 

[PDF] researchgate.net

Data supplement for a soft sensor using a new generative model based on a variational autoencoder and Wasserstein GAN

X Wang, H Liu - Journal of Process Control, 2020 - Elsevier

In industrial process control, measuring some variables is difficult for environmental or cost

reasons. This necessitates employing a soft sensor to predict these variables by using the

collected data from easily measured variables. The prediction accuracy and computational …

  Cited by 6 Related articles All 3 versions


 

[PDF] arxiv.org

Wasserstein distributionally robust motion control for collision avoidance using conditional value-at-risk

A Hakobyan, I Yang - arXiv preprint arXiv:2001.04727, 2020 - arxiv.org

In this paper, a risk-aware motion control scheme is considered for mobile robots to avoid

randomly moving obstacles when the true probability distribution of uncertainty is unknown.

We propose a novel model predictive control (MPC) method for limiting the risk of unsafety …

  Cited by 5 Related articles All 2 versions 


2020


Joint transfer of model knowledge and fairness over domains using wasserstein distance

T Yoon, J Lee, W Lee - IEEE Access, 2020 - ieeexplore.ieee.org

Owing to the increasing use of machine learning in our daily lives, the problem of fairness

has recently become an important topic in machine learning societies. Recent studies

regarding fairness in machine learning have been conducted to attempt to ensure statistical …

  Cited by 1 Related articles


[PDF] lewissoft.com

Wasserstein Distributionally Robust Motion Planning and Control with Safety Constraints Using Conditional Value-at-Risk

A Hakobyan, I Yang - 2020 IEEE International Conference on …, 2020 - ieeexplore.ieee.org

In this paper, we propose an optimization-based decision-making tool for safe motion

planning and control in an environment with randomly moving obstacles. The unique feature

of the proposed method is that it limits the risk of unsafety by a pre-specified threshold even …

Cited by 25 Related articles All 9 versions

RDA-UNET-WGAN: An Accurate Breast Ultrasound Lesion Segmentation Using Wasserstein Generative Adversarial Networks

A Negi, ANJ Raj, R Nersisson, Z Zhuang… - Arabian Journal for …, 2020 - Springer

Early-stage detection of lesions is the best possible way to fight breast cancer, a disease

with the highest malignancy ratio among women. Though several methods primarily based

on deep learning have been proposed for tumor segmentation, it is still a challenging …

Cited by 29 Related articles All 2 versions


Density estimation of multivariate samples using Wasserstein distance

E Luini, P Arbenz - Journal of Statistical Computation and …, 2020 - Taylor & Francis

Density estimation is a central topic in statistics and a fundamental task of machine learning.

In this paper, we present an algorithm for approximating multivariate empirical densities with

a piecewise constant distribution defined on a hyperrectangular-shaped partition of the …

  Cited by 2 Related articles All 3 versions


DECWA: Density-Based Clustering using Wasserstein Distance

N El Malki, R Cugny, O Teste, F Ravat - Proceedings of the 29th ACM …, 2020 - dl.acm.org

Clustering is a data analysis method for extracting knowledge by discovering groups of data

called clusters. Among these methods, state-of-the-art density-based clustering methods

have proven to be effective for arbitrary-shaped clusters. Despite their encouraging results …

  Related articles All 2 versions

<——2020——2020———2050——  


[PDF] arxiv.org

Two-sample Test using Projected Wasserstein Distance: Breaking the Curse of Dimensionality

J Wang, R Gao, Y Xie - arXiv preprint arXiv:2010.11970, 2020 - arxiv.org

We develop a projected Wasserstein distance for the two-sample test, a fundamental

problem in statistics and machine learning: given two sets of samples, to determine whether

they are from the same distribution. In particular, we aim to circumvent the curse of …

  Cited by 1 Related articles All 3 versions 


[PDF] arxiv.org

Ranking IPCC Models Using the Wasserstein Distance

G Vissio, V Lembo, V Lucarini, M Ghil - arXiv preprint arXiv:2006.09304, 2020 - arxiv.org

We propose a methodology for evaluating the performance of climate models based on the

use of the Wasserstein distance. This distance provides a rigorous way to measure

quantitatively the difference between two probability distributions. The proposed approach is …

  Related articles All 5 versions 


[PDF] arxiv.org

Averaging atmospheric gas concentration data using wasserstein barycenters

M Barré, C Giron, M Mazzolini… - arXiv preprint arXiv …, 2020 - arxiv.org

Hyperspectral satellite images report greenhouse gas concentrations worldwide on a daily

basis. While taking simple averages of these images over time produces a rough estimate of

relative emission rates, atmospheric transport means that simple averages fail to pinpoint …

  Cited by 1 Related articles All 3 versions 


[PDF] nsf.gov

A Data-Driven Distributionally Robust Game Using Wasserstein Distance

G Peng, T Zhang, Q Zhu - International Conference on Decision and Game …, 2020 - Springer

This paper studies a special class of games, which enables the players to leverage the

information from a dataset to play the game. However, in an adversarial scenario, the

dataset may not be trustworthy. We propose a distributionally robust formulation to introduce …

  Cited by 1 Related articles All 2 versions


[PDF] arxiv.org

Unsupervised Multilingual Alignment using Wasserstein Barycenter

X Lian, K Jain, J Truszkowski, P Poupart… - arXiv preprint arXiv …, 2020 - arxiv.org

We study unsupervised multilingual alignment, the problem of finding word-to-word

translations between multiple languages without using any parallel data. One popular

strategy is to reduce multilingual alignment to the much simplified bilingual setting, by …

  Cited by 1 Related articles All 8 versions 

2020


Nonparametric Different-Feature Selection Using Wasserstein Distance

W Zheng, FY Wang, C Gou - 2020 IEEE 32nd International …, 2020 - ieeexplore.ieee.org

In this paper, we propose a feature selection method that characterizes the difference

between two kinds of probability distributions. The key idea is to view the feature selection

problem as a sparsest k-subgraph problem that considers Wasserstein distance between …

  Related articles All 2 versions


[PDF] arxiv.org

Convergence of Recursive Stochastic Algorithms using Wasserstein Divergence

A Gupta, WB Haskell - arXiv preprint arXiv:2003.11403, 2020 - arxiv.org

This paper develops a unified framework, based on iterated random operator theory, to

analyze the convergence of constant stepsize recursive stochastic algorithms (RSAs) in

machine learning and reinforcement learning. RSAs use randomization to efficiently …

Cited by 2 Related articles All 4 versions 

[PDF] researchgate.net

[PDF] Ranking IPCC Model Performance Using the Wasserstein Distance

G Vissio, V Lembo, V Lucarini… - arXiv preprint arXiv …, 2020 - researchgate.net

We propose a methodology for intercomparing climate models and evaluating their

performance against benchmarks based on the use of the Wasserstein distance (WD). This

distance provides a rigorous way to measure quantitatively the difference between two …

  Related articles 




[PDF] arxiv.org

Minimax control of ambiguous linear stochastic systems using the Wasserstein metric

K Kim, I Yang - 2020 59th IEEE Conference on Decision and …, 2020 - ieeexplore.ieee.org

In this paper, we propose a minimax linear-quadratic control method to address the issue of

inaccurate distribution information in practical stochastic systems. To construct a control

policy that is robust against errors in an empirical distribution of uncertainty, our method …

  Cited by 3 Related articles All 3 versions

<——2020——2020———2060—— 


[PDF] arxiv.org

Interpretable Model Summaries Using the Wasserstein Distance

E Dunipace, L Trippa - arXiv preprint arXiv:2012.09999, 2020 - arxiv.org

In the current computing age, models can have hundreds or even thousands of parameters;

however, such large models decrease the ability to interpret and communicate individual

parameters. Reducing the dimensionality of the parameter space in the estimation phase is …

  Related articles All 2 versions 


[PDF] arxiv.org

Robustified Multivariate Regression and Classification Using Distributionally Robust Optimization under the Wasserstein Metric

R Chen, IC Paschalidis - arXiv preprint arXiv:2006.06090, 2020 - arxiv.org

We develop Distributionally Robust Optimization (DRO) formulations for Multivariate Linear

Regression (MLR) and Multiclass Logistic Regression (MLG) when both the covariates and

responses/labels may be contaminated by outliers. The DRO framework uses a probabilistic …

  Related articles All 3 versions 


[PDF] arxiv.org

High-precision Wasserstein barycenters in polynomial time

JM Altschuler, E Boix-Adsera - arXiv preprint arXiv:2006.08012, 2020 - arxiv.org

… weight vector λ, and an accuracy ε > 0, computes an ε-additively approximate Wasserstein

barycenter in … corresponding tuple (j_1,…,j_k) ∈ [n]^k is computable in O(nk polylog U) time by

computing … non-empty cell F_{j_1,…,j_k} contains at least one cell in H, this process enumerates all …

  Related articles All 3 versions 

[HTML] springer.com

[HTML] Missing Features Reconstruction Using Wasserstein Generative Adversarial Imputation Network

M Friedjungová, D Vašata, M Balatsko… - … on Computational Science, 2020 - Springer

Missing data is one of the most common preprocessing problems. In this paper, we

experimentally research the use of generative and non-generative models for feature

reconstruction. Variational Autoencoder with Arbitrary Conditioning (VAEAC) and …

  Cited by 1 Related articles All 8 versions


Biosignal Oversampling Using Wasserstein Generative Adversarial Network

MS Munia, M Nourani, S Houari - 2020 IEEE International …, 2020 - ieeexplore.ieee.org

Oversampling plays a vital role in improving the minority-class classification accuracy for

imbalanced biomedical datasets. In this work, we propose a single-channel biosignal data

generation method by exploiting the advancements in well-established image-based …

  All 2 versions


2020

[PDF] arxiv.org

Velocity Inversion Using the Quadratic Wasserstein Metric

S Mahankali - arXiv preprint arXiv:2009.00708, 2020 - arxiv.org

Full--waveform inversion (FWI) is a method used to determine properties of the Earth from

information on the surface. We use the squared Wasserstein distance (squared $ W_2 $

distance) as an objective function to invert for the velocity as a function of position in the …

  Related articles All 6 versions 


2020 

[PDF] thecvf.com

Illegible Text to Readable Text: An Image-to-Image Transformation using Conditional Sliced Wasserstein Adversarial Networks

M Karimi, G Veni, YY Yu - … of the IEEE/CVF Conference on …, 2020 - openaccess.thecvf.com

Automatic text recognition from ancient handwritten record images is an important problem

in the genealogy domain. However, critical challenges such as varying noise conditions,

vanishing texts, and variations in handwriting makes the recognition task difficult. We tackle …

  Related articles All 7 versions 


EEG data augmentation using Wasserstein GAN

G Bouallegue, R Djemal - 2020 20th International Conference …, 2020 - ieeexplore.ieee.org

Electroencephalogram (EEG) presents a challenge during the classification task using

machine learning and deep learning techniques due to the lack or to the low size of

available datasets for each specific neurological disorder. Therefore, the use of data …

 

[HTML] peerj.com

[HTML] Correcting nuisance variation using Wasserstein distance

G Tabak, M Fan, S Yang, S Hoyer, G Davis - PeerJ, 2020 - peerj.com

Profiling cellular phenotypes from microscopic imaging can provide meaningful biological

information resulting from various factors affecting the cells. One motivating application is

drug development: morphological cell features can be captured from images, from which …

  Cited by 1 Related articles All 8 versions 


Numeric Data Augmentation using Structural Constraint Wasserstein Generative Adversarial Networks

W Wang, C Wang, T Cui, R Gong… - … on Circuits and …, 2020 - ieeexplore.ieee.org

Some recent studies have suggested using GANs for numeric data generation such as to

generate data for completing the imbalanced numeric data. Considering the significant

difference between the dimensions of the numeric data and images, as well as the strong …

  Related articles

<——2020——2020———2070—— 


[HTML] EEG signal reconstruction using a generative adversarial network with Wasserstein distance and temporal-spatial-frequency loss

T Luo, Y Fan, L Chen, G Guo, C Zhou - Frontiers in neuroinformatics, 2020 - frontiersin.org

Applications based on electroencephalography (EEG) signals suffer from the mutual

contradiction of high classification performance versus low cost. The nature of this

contradiction makes EEG signal reconstruction with high sampling rate and sensitivity …

  Cited by 8 Related articles All 5 versions 

[HTML] Solutions of a Class of Degenerate Kinetic Equations Using Steepest Descent in Wasserstein Space

A Marcos, A Soglo - Journal of Mathematics, 2020 - hindawi.com

We use the steepest descent method in an Orlicz–Wasserstein space to study the existence

of solutions for a very broad class of kinetic equations, which include the Boltzmann

equation, the Vlasov–Poisson equation, the porous medium equation, and the parabolic p …

  Related articles All 6 versions 


Synthetic Data Generation Using Wasserstein Conditional Gans With Gradient Penalty (WCGANS-GP)

M Singh Walia - 2020 - arrow.tudublin.ie

With data protection requirements becoming stricter, the data privacy has become

increasingly important and more crucial than ever. This has led to restrictions on the

availability and dissemination of real-world datasets. Synthetic data offers a viable solution …

  Related articles 


Synthesising Tabular Datasets Using Wasserstein Conditional GANS with Gradient Penalty (WCGAN-GP)

S McKeever, M Singh Walia - 2020 - arrow.tudublin.ie

Deep learning based methods based on Generative Adversarial Networks (GANs) have

seen remarkable success in data synthesis of images and text. This study investigates the

use of GANs for the generation of tabular mixed dataset. We apply Wasserstein Conditional …

  Related articles 


Spatial-aware Network using Wasserstein Distance for Unsupervised Domain Adaptation

L Long, L Bin, F Jiang - 2020 Chinese Automation Congress …, 2020 - ieeexplore.ieee.org

In a general scenario, the purpose of Unsupervised Domain Adaptation (UDA) is to classify

unlabeled target domain data as much as possible, but the source domain data has a large

number of labels. To address this situation, this paper introduces the optimal transport theory …

 

2020




2020

Input limited Wasserstein GAN

F Cao, H Zhao, P Liu, P Li - Second Target Recognition and …, 2020 - spiedigitallibrary.org

Generative adversarial networks (GANs) has proven hugely successful, but suffer from train

instability. The recently proposed Wasserstein GAN (WGAN) has largely overcome the

problem, but can still fail to converge in some case or be to complex. It has been found that …

  Related articles All 2 versions


[PDF] ibpsa.org

[PDF] Comparing Bottom-Up Energy Consumption Models Using The Wasserstein Distance Between Load Profile Histograms

E Sanderson, A Fragaki, J Simo… - BSO-V 2020: IBPSA …, 2020 - ibpsa.org

This paper presents a comparison of bottom up models that generate appliance load

profiles. The comparison is based on their ability to accurately distribute load over time-of-

day. This is a key feature of model performance if the model is used to assess the impact of …

  Related articles All 2 versions 


[PDF] ceur-ws.org

[PDF] Synthesising Tabular Data using Wasserstein Conditional GANs with Gradient Penalty (WCGAN-GP)

M Walia, B Tierney, S McKeever - ceur-ws.org

Deep learning based methods based on Generative Adversarial Networks (GANs) have

seen remarkable success in data synthesis of images and text. This study investigates the

use of GANs for the generation of tabular mixed dataset. We apply Wasserstein Conditional …

  Related articles All 2 versions 


[PDF] brown.edu

[PDF] Reduced-order modeling of transport equations using Wasserstein spaces

V Ehrlacher, D Lombardi, O Mula, FX Vialard - icerm.brown.edu

Introduction to Wasserstein spaces and barycenters. Model order reduction of parametric

transport equations. Reduced-order modeling of transport equations using Wasserstein spaces.

V. Ehrlacher (Ecole des Ponts ParisTech & INRIA), D. Lombardi, O. Mula, F.-X. Vialard …

  Related articles 

<——2020——2020———2080—— 







Convergence and concentration of empirical measures under Wasserstein distance in unbounded functional spaces

J Lei - Bernoulli, 2020 - projecteuclid.org

We provide upper bounds of the expected Wasserstein distance between a probability

measure and its empirical version, generalizing recent results for finite dimensional

Euclidean spaces and bounded functional spaces. Such a generalization can cover …

  Cited by 45 Related articles All 5 versions
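For reference, the order-p Wasserstein distance that this and several of the entries below concern is the standard optimal-transport quantity (a textbook statement, not a result of the paper above):

\[
  W_p(\mu,\nu) \;=\; \Bigl( \inf_{\pi \in \Pi(\mu,\nu)} \int d(x,y)^p \, \pi(\mathrm{d}x,\mathrm{d}y) \Bigr)^{1/p},
\]

where \(\Pi(\mu,\nu)\) denotes the set of couplings of \(\mu\) and \(\nu\); the empirical-measure results listed here quantify how fast \(W_p(\mu_n,\mu)\) shrinks as the sample size n grows.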


[HTML] mdpi.com

Fused Gromov-Wasserstein distance for structured objects

T Vayer, L Chapel, R Flamary, R Tavenard, N Courty - Algorithms, 2020 - mdpi.com

Optimal transport theory has recently found many applications in machine learning thanks to

its capacity to meaningfully compare various machine learning objects that are viewed as

distributions. The Kantorovitch formulation, leading to the Wasserstein distance, focuses on …

  Cited by 6 Related articles All 33 versions 


[PDF] arxiv.org

Improved complexity bounds in Wasserstein barycenter problem

D Dvinskikh, D Tiapkin - arXiv preprint arXiv:2010.04677, 2020 - arxiv.org

In this paper, we focus on computational aspects of Wasserstein barycenter problem. We

provide two algorithms to compute Wasserstein barycenter of $ m $ discrete measures of

size $ n $ with accuracy $\varepsilon $. The first algorithm, based on mirror prox with some …

  Cited by 3 Related articles All 2 versions 


2020


[PDF] stanford.edu

[PDF] Quantifying the Empirical Wasserstein Distance to a Set of Measures: Beating the Curse of Dimensionality

N Si, J Blanchet, S Ghosh, M Squillante - Advances in Neural …, 2020 - stanford.edu

Quantifying the Empirical Wasserstein Distance to a Set of Measures: Beating the Curse of

Dimensionality. Nian Si, joint work with Jose Blanchet, Soumyadip Ghosh, and Mark Squillante.

NeurIPS 2020, October 22, 2020. niansi@stanford.edu (Stanford). Wasserstein Projection, October …

  Related articles 


[PDF] arxiv.org

Precise limit in Wasserstein distance for conditional empirical measures of Dirichlet diffusion processes

FY Wang - arXiv preprint arXiv:2004.07537, 2020 - arxiv.org

Let $M$ be a $d$-dimensional connected compact Riemannian manifold with boundary

$\partial M$, let $V\in C^2(M)$ such that $\mu(dx):=e^{V(x)}dx$ is a probability

measure, and let $X_t$ be the diffusion process generated by $L:=\Delta+\nabla V$ with …

  Cited by 3 Related articles All 3 versions 


[PDF] arxiv.org

Convergence in Wasserstein Distance for Empirical Measures of Dirichlet Diffusion Processes on Manifolds

FY Wang - arXiv preprint arXiv:2005.09290, 2020 - arxiv.org

Let $ M $ be a $ d $-dimensional connected compact Riemannian manifold with boundary

$\partial M $, let $ V\in C^ 2 (M) $ such that $\mu ({\rm d} x):={\rm e}^{V (x)}{\rm d} x $ is a

probability measure, and let $ X_t $ be the diffusion process generated by …

  Cited by 2 Related articles All 3 versions 





<——2020——2020———2090——



[PDF] arxiv.org

Wasserstein Convergence Rate for Empirical Measures on Noncompact Manifolds

FY Wang - arXiv preprint arXiv:2007.14667, 2020 - arxiv.org

Let $ X_t $ be the (reflecting) diffusion process generated by $ L:=\Delta+\nabla V $ on a

complete connected Riemannian manifold $ M $ possibly with a boundary $\partial M $,

where $ V\in C^ 1 (M) $ such that $\mu (dx):= e^{V (x)} dx $ is a probability measure. We …

  Cited by 1 Related articles All 2 versions 



Unsupervised Wasserstein Distance Guided Domain Adaptation for 3D Multi-domain Liver Segmentation

C You, J Yang, J Chapiro, JS Duncan - Interpretable and Annotation …, 2020 - Springer

Deep neural networks have shown exceptional learning capability and generalizability in

the source domain when massive labeled data is provided. However, the well-trained

models often fail in the target domain due to the domain shift. Unsupervised domain …

  Related articles All 3 versions




2020

Sampling of probability measures in the convex order by Wasserstein projection

A Alfonsi, J Corbetta, B Jourdain - Annales de l'Institut Henri …, 2020 - projecteuclid.org

In this paper, for $\mu$ and $\nu$ two probability measures on $\mathbb{R}^{d}$ with

finite moments of order $\varrho\ge 1$, we define the respective projections for the

$W_{\varrho}$-Wasserstein distance of $\mu$ and $\nu$ on the sets of probability measures …

  Cited by 19 Related articles All 9 versions


[PDF] arxiv.org

node2coords: Graph representation learning with wasserstein barycenters

E Simou, D Thanou, P Frossard - IEEE Transactions on Signal …, 2020 - ieeexplore.ieee.org

In order to perform network analysis tasks, representations that capture the most relevant

information in the graph structure are needed. However, existing methods do not learn

representations that can be interpreted in a straightforward way and that are stable to …

  Cited by 1 Related articles All 3 versions


2021


[PDF] arxiv.org

Wasserstein Contrastive Representation Distillation

L Chen, Z Gan, D Wang, J Liu, R Henao… - arXiv preprint arXiv …, 2020 - arxiv.org

The primary goal of knowledge distillation (KD) is to encapsulate the information of a model

learned from a teacher network into a student network, with the latter being more compact

than the former. Existing work, eg, using Kullback-Leibler divergence for distillation, may fail …

  Related articles All 2 versions 


[PDF] ecai2020.eu

[PDF] Dual Rejection Sampling for Wasserstein Auto-Encoders

L Hou, H Shen, X Cheng - 24th European Conference on Artificial …, 2020 - ecai2020.eu

Deep generative models enhanced by Wasserstein distance have achieved remarkable

success in recent years. Wasserstein Auto-Encoders (WAEs) are auto-encoder based

generative models that aim to minimize the Wasserstein distance between the data …

  Cited by 1 Related articles All 3 versions 


[PDF] arxiv.org

Automatic coding of students' writing via Contrastive Representation Learning in the Wasserstein space

R Jiang, J Gouvea, D Hammer, S Aeron - arXiv preprint arXiv:2011.13384, 2020 - arxiv.org

Qualitative analysis of verbal data is of central importance in the learning sciences. It is labor-

intensive and time-consuming, however, which limits the amount of data researchers can

include in studies. This work is a step towards building a statistical machine learning (ML) …

  Related articles All 2 versions 


[PDF] arxiv.org

Wasserstein Distance Regularized Sequence Representation for Text Matching in Asymmetrical Domains

W Yu, C Xu, J Xu, L Pang, X Gao, X Wang… - arXiv preprint arXiv …, 2020 - arxiv.org

One approach to matching texts from asymmetrical domains is projecting the input

sequences into a common semantic space as feature vectors upon which the matching

function can be readily defined and learned. In real-world matching practices, it is often …

  Related articles All 3 versions 


GraphWGAN: Graph Representation Learning with Wasserstein Generative Adversarial Networks

R Yan, H Shen, C Qi, K Cen… - 2020 IEEE International …, 2020 - ieeexplore.ieee.org

Graph representation learning aims to represent vertices as low-dimensional and real-

valued vectors to facilitate subsequent downstream tasks, ie, node classification, link

predictions. Recently, some novel graph representation learning frameworks, which try to …

  Related articles All 2 versions

<——2020——2020———2100——



A Super Resolution Method for Remote Sensing Images Based on Cascaded Conditional Wasserstein GANs

B Liu, H Li, Y Zhou, Y Peng, A Elazab… - 2020 IEEE 3rd …, 2020 - ieeexplore.ieee.org

High-resolution (HR) remote sensing imagery is quite beneficial for subsequent

interpretation. Obtaining HR images can be achieved by upgrading the imaging device. Yet,

the cost to perform this task is very huge. Thus, it is necessary to obtain HR images from low …

  Related articles



Remote Sensing Image Segmentation based on Generative Adversarial Network with Wasserstein divergence

X Cao, C Song, J Zhang, C Liu - 2020 3rd International Conference on …, 2020 - dl.acm.org

In the image segmentation fields, traditional methods can be classified into four main

categories: threshold-based (eg Otsu [1]), edge-based (eg Canny [2], Hough transform [3]),

region-based (eg Super pixel [4]), and energy functional-based segmentation methods (eg …

 


Sample generation based on a supervised Wasserstein Generative Adversarial Network for high-resolution remote-sensing scene classification

W Han, L Wang, R Feng, L Gao, X Chen, Z Deng… - Information …, 2020 - Elsevier

As high-resolution remote-sensing (HRRS) images have become increasingly widely

available, scene classification focusing on the smart classification of land cover and land

use has also attracted more attention. However, mainstream methods encounter a severe …

  Cited by 4 Related articles All 3 versions


[PDF] arxiv.org

Transport and Interface: an Uncertainty Principle for the Wasserstein distance

A Sagiv, S Steinerberger - SIAM Journal on Mathematical Analysis, 2020 - SIAM

Let $f:(0,1)^d \to \mathbb{R}$ be a continuous function with zero mean and interpret $f_+ = \max(f,0)$ and

$f_- = -\min(f,0)$ as the densities of two measures. We prove that if the cost of transport from $f_+$ to

$f_-$ is small, in terms of the Wasserstein distance $W_1(f_+, f_-)$, then the Hausdorff measure of …

  Cited by 3 Related articles All 3 versions


[PDF] mlr.press

Fast algorithms for computational optimal transport and wasserstein barycenter

W Guo, N Ho, M Jordan - International Conference on …, 2020 - proceedings.mlr.press

We provide theoretical complexity analysis for new algorithms to compute the optimal

transport (OT) distance between two discrete probability distributions, and demonstrate their

favorable practical performance compared to state-of-art primal-dual algorithms. First, we …

  Cited by 2 Related articles All 4 versions 
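
For orientation, the baseline that such papers accelerate is the entropic-regularized Sinkhorn scheme for discrete optimal transport. The sketch below is the generic textbook iteration, not the algorithms analyzed above, and all variable names are illustrative:

    # Generic Sinkhorn iteration for entropy-regularized OT between two histograms.
    import numpy as np

    def sinkhorn(a, b, cost, eps=0.05, n_iter=500):
        K = np.exp(-cost / eps)               # Gibbs kernel
        u = np.ones_like(a)
        for _ in range(n_iter):
            v = b / (K.T @ u)                 # alternating scaling updates
            u = a / (K @ v)
        plan = u[:, None] * K * v[None, :]    # approximate optimal transport plan
        return float(np.sum(plan * cost))     # regularized OT cost

    x = np.linspace(0, 1, 50)
    a = np.exp(-(x - 0.3) ** 2 / 0.01); a /= a.sum()
    b = np.exp(-(x - 0.7) ** 2 / 0.02); b /= b.sum()
    cost = (x[:, None] - x[None, :]) ** 2
    print(sinkhorn(a, b, cost))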


2020


[PDF] arxiv.org

Linear Optimal Transport Embedding: Provable fast Wasserstein distance computation and classification for nonlinear problems

C Moosmüller, A Cloninger - arXiv preprint arXiv:2008.09165, 2020 - arxiv.org

Discriminating between distributions is an important problem in a number of scientific fields.

This motivated the introduction of Linear Optimal Transportation (LOT), which embeds the

space of distributions into an $ L^ 2$-space. The transform is defined by computing the …

  Cited by 2 Related articles All 2 versions 


[PDF] lewissoft.com

 


CVaR-Based Approximations of Wasserstein Distributionally Robust Chance Constraints with Application to Process Scheduling

B Liu, Q Zhang, X Ge, Z Yuan - Industrial & Engineering Chemistry …, 2020 - ACS Publications

Distributionally robust chance constrained programming is a stochastic optimization

approach that considers uncertainty in model parameters as well as uncertainty in the

underlying probability distribution. It ensures a specified probability of constraint satisfaction …

 roximation of an individual distributionally robust chance constraint with Wasserstein …

Cited by 6 Related articles All 4 versions

[PDF] mlr.press

Quantitative stability of optimal transport maps and linearization of the 2-wasserstein space

Q Mérigot, A Delalande… - … Conference on Artificial …, 2020 - proceedings.mlr.press

This work studies an explicit embedding of the set of probability measures into a Hilbert

space, defined using optimal transport maps from a reference probability density. This

embedding linearizes to some extent the 2-Wasserstein space and is shown to be bi-Hölder …

  Cited by 16 Related articles All 5 versions 


[PDF] biorxiv.org

Gromov-Wasserstein optimal transport to align single-cell multi-omics data

P Demetci, R Santorella, B Sandstede, WS Noble… - BioRxiv, 2020 - biorxiv.org

Data integration of single-cell measurements is critical for understanding cell development

and disease, but the lack of correspondence between different types of measurements

makes such efforts challenging. Several unsupervised algorithms can align heterogeneous …

  Cited by 5 Related articles All 3 versions 
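
A minimal sketch of a Gromov-Wasserstein coupling between two point clouds of different dimension, using the third-party POT package rather than the authors' own pipeline (assumes POT is installed; the exact ot.gromov signature may vary across versions):

    # Illustrative GW coupling between two small point clouds with POT (pip install pot).
    import numpy as np
    import ot

    rng = np.random.default_rng(1)
    X = rng.normal(size=(30, 5))              # e.g. one data modality
    Y = rng.normal(size=(40, 8))              # another modality, different dimension
    C1 = ot.dist(X, X); C1 /= C1.max()        # intra-domain distance matrices
    C2 = ot.dist(Y, Y); C2 /= C2.max()
    p, q = ot.unif(30), ot.unif(40)           # uniform weights on the two clouds
    coupling = ot.gromov.gromov_wasserstein(C1, C2, p, q, 'square_loss')
    print(coupling.shape)                     # (30, 40) soft correspondence matrix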

<——2020——2020———2110—— 



Data-driven stochastic programming with distributionally robust constraints under Wasserstein distance: asymptotic properties

Y Mei, ZP Chen, BB Ji, ZJ Xu, J Liu - … of the Operations Research Society of …, 2020 - Springer

Distributionally robust optimization is a dominant paradigm for decision-making problems

where the distribution of random variables is unknown. We investigate a distributionally

robust optimization problem with ambiguities in the objective function and countably infinite …

  Cited by 1 Related articles


Adaptive Wasserstein Hourglass for Weakly Supervised RGB 3D Hand Pose Estimation

Y Zhang, L Chen, Y Liu, W Zheng, J Yong - Proceedings of the 28th ACM …, 2020 - dl.acm.org

The deficiency of labeled training data is one of the bottlenecks in 3D hand pose estimation

from monocular RGB images. Synthetic datasets have a large number of images with

precise annotations, but their obvious difference with real-world datasets limits the …

  Related articles



Quadratic Wasserstein metrics for von Neumann algebras via transport plans

R Duvenhage - arXiv preprint arXiv:2012.03564, 2020 - arxiv.org

We show how one can obtain a class of quadratic Wasserstein metrics, that is to say,

Wasserstein metrics of order 2, on the set of faithful normal states of a von Neumann algebra

$ A $, via transport plans, rather than through a dynamical approach. Two key points to …

  Cited by 1 Related articles All 2 versions 


[PDF] arxiv.org

On Stein's factors for Poisson approximation in Wasserstein distance with non-linear transportation costs

ZW Liao, Y Ma, A Xia - arXiv preprint arXiv:2003.13976, 2020 - arxiv.org

We establish various bounds on the solutions to a Stein equation for Poisson approximation

in Wasserstein distance with non-linear transportation costs. The proofs are a refinement of

those in [Barbour and Xia (2006)] using the results in [Liu and Ma (2009)]. As a corollary, we …

  Related articles All 2 versions 


Semi-supervised Surface Wave Tomography with Wasserstein Cycle-consistent GAN: Method and Application on Southern California Plate Boundary Region

A Cai, H Qiu, F Niu - 2020 - essoar.org

Machine learning algorithm is applied to shear wave velocity (Vs) inversion in surface wave

tomography, where a set of 1-D Vs profiles and the corresponding synthetic dispersion

curves are used in network training. Previous studies showed that performances of a trained …

 

2020


[PDF] stanford.edu

[PDF] A CLASS OF OPTIMAL TRANSPORT REGULARIZED FORMULATIONS WITH APPLICATIONS TO WASSERSTEIN GANS

KH Bae, B Feng, S Kim, S Lazarova-Molnar, Z Zheng… - stanford.edu

Optimal transport costs (eg Wasserstein distances) are used for fitting high-dimensional

distributions. For example, popular artificial intelligence algorithms such as Wasserstein

Generative Adversarial Networks (WGANs) can be interpreted as fitting a black-box …

  

Semi-supervised Data-driven Surface Wave Tomography using Wasserstein Cycle-consistent GAN: Application on Southern California Plate Boundary Region

A Cai, H Qiu, F Niu - 2020 - essoar.org

Current machine learning based shear wave velocity (Vs) inversion using surface wave

dispersion measurements utilizes synthetic dispersion curves calculated from existing 3-D

velocity models as training datasets. It is shown in the previous studies that the …

  All 2 versions


[PDF] brown.edu

[PDF] Reduced-order modeling of transport equations using Wasserstein spaces

V Ehrlacher, D Lombardi, O Mula, FX Vialard - icerm.brown.edu

… [slides] Introduction to Wasserstein spaces and barycenters; model order reduction of parametric transport equations; comparison between the Wasserstein and L2(Ω) interpolation [Kolouri et al. 2016]; interesting property of the Wasserstein metric …

  Related articles 



Wasserstein-based graph alignment

HP Maretic, ME Gheche, M Minder, G Chierchia… - arXiv preprint arXiv …, 2020 - arxiv.org

We propose a novel method for comparing non-aligned graphs of different sizes, based on

the Wasserstein distance between graph signal distributions induced by the respective

graph Laplacian matrices. Specifically, we cast a new formulation for the one-to-many graph …

  Cited by 5 Related articles All 2 versions 

Wasserstein-based Graph Alignment

H Petric Maretic, M El Gheche, M Minder… - arXiv e …, 2020 - ui.adsabs.harvard.edu

We propose a novel method for comparing non-aligned graphs of different sizes, based on

the Wasserstein distance between graph signal distributions induced by the respective

graph Laplacian matrices. Specifically, we cast a new formulation for the one-to-many graph …


A Wasserstein based two-stage distributionally robust optimization model for optimal operation of CCHP micro-grid under uncertainties

Y Wang, Y Yang, L Tang, W Sun, B Li - International Journal of Electrical …, 2020 - Elsevier

Combined cooling, heating and power (CCHP) micro-grids are getting increasing attentions

due to the realization of cleaner production and high energy efficiency. However, with the

features of complex tri-generation structure and renewable power uncertainties, it is …

  Cited by 16 Related articles All 2 versions

<——2020——2020———2120——  



[PDF] wiley.com

Evaluating the performance of climate models based on Wasserstein distance

G Vissio, V Lembo, V Lucarini… - Geophysical Research …, 2020 - Wiley Online Library

We propose a methodology for intercomparing climate models and evaluating their

performance against benchmarks based on the use of the Wasserstein distance (WD). This

distance provides a rigorous way to measure quantitatively the difference between two …

  Cited by 2 Related articles All 13 versions


[PDF] thecvf.com

Wasserstein loss-based deep object detection

Y Han, X Liu, Z Sheng, Y Ren, X Han… - Proceedings of the …, 2020 - openaccess.thecvf.com

Object detection locates the objects with bounding boxes and identifies their classes, which

is valuable in many computer vision applications (eg autonomous driving). Most existing

deep learning-based methods output a probability vector for instance classification trained …

  Cited by 8 Related articles All 5 versions 


[PDF] arxiv.org

Multivariate goodness-of-Fit tests based on Wasserstein distance

M Hallin, G Mordant, J Segers - arXiv preprint arXiv:2003.06684, 2020 - arxiv.org

Goodness-of-fit tests based on the empirical Wasserstein distance are proposed for simple

and composite null hypotheses involving general multivariate distributions. This includes the

important problem of testing for multivariate normality with unspecified mean vector and …

  Cited by 5 Related articles All 10 versions 
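
A much cruder relative of such tests is a permutation two-sample test that uses the 1-D Wasserstein distance as its statistic; the sketch below is illustrative only and does not attempt the multivariate construction of the paper above:

    # Toy permutation test with the 1-D Wasserstein distance as the test statistic.
    import numpy as np
    from scipy.stats import wasserstein_distance

    rng = np.random.default_rng(0)
    x = rng.normal(0.0, 1.0, size=300)
    y = rng.normal(0.3, 1.0, size=300)
    observed = wasserstein_distance(x, y)

    pooled = np.concatenate([x, y])
    exceed, n_perm = 0, 999
    for _ in range(n_perm):
        rng.shuffle(pooled)                   # relabel the pooled sample at random
        if wasserstein_distance(pooled[:300], pooled[300:]) >= observed:
            exceed += 1
    print("approximate p-value:", (exceed + 1) / (n_perm + 1))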


[PDF] academia.edu

Data augmentation in fault diagnosis based on the Wasserstein generative adversarial network with gradient penalty

X Gao, F Deng, X Yue - Neurocomputing, 2020 - Elsevier

Fault detection and diagnosis in industrial process is an extremely essential part to keep

away from undesired events and ensure the safety of operators and facilities. In the last few

decades various data based machine learning algorithms have been widely studied to …

  Cited by 27 Related articles All 3 versions
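
Many of the "Wasserstein GAN with gradient penalty" entries in this list share the same penalty term; a minimal PyTorch sketch of that generic Gulrajani-style term (not this paper's specific architecture or data; all names are illustrative) is:

    # Minimal WGAN-GP critic loss with the standard gradient-penalty term.
    import torch

    def gradient_penalty(critic, real, fake, lam=10.0):
        eps = torch.rand(real.size(0), 1, device=real.device)
        interp = (eps * real + (1 - eps) * fake).requires_grad_(True)
        scores = critic(interp)
        grads, = torch.autograd.grad(scores.sum(), interp, create_graph=True)
        return lam * ((grads.norm(2, dim=1) - 1) ** 2).mean()

    critic = torch.nn.Sequential(torch.nn.Linear(16, 64), torch.nn.ReLU(),
                                 torch.nn.Linear(64, 1))
    real = torch.randn(32, 16)                # placeholder "real" feature vectors
    fake = torch.randn(32, 16)                # placeholder generator output
    loss_d = critic(fake).mean() - critic(real).mean() + gradient_penalty(critic, real, fake)
    loss_d.backward()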



Reconstruction of shale image based on Wasserstein Generative Adversarial Networks with gradient penalty

W Zha, X Li, Y Xing, L He, D Li - Advances in Geo-Energy …, 2020 - yandy-ager.com

Abstract Generative Adversarial Networks (GANs), as most popular artificial intelligence

models in the current image generation field, have excellent image generation capabilities.

Based on Wasserstein GANs with gradient penalty, this paper proposes a novel digital core …

  Cited by 8 Related articles 


2020


[HTML] hindawi.com

[HTML] Imbalanced Fault Classification of Bearing via Wasserstein Generative Adversarial Networks with Gradient Penalty

B Han, S Jia, G Liu, J Wang - Shock and Vibration, 2020 - hindawi.com

Recently, generative adversarial networks (GANs) are widely applied to increase the

amounts of imbalanced input samples in fault diagnosis. However, the existing GAN-based

methods have convergence difficulties and training instability, which affect the fault …

  Related articles All 4 versions 


[PDF] researchgate.net

Data supplement for a soft sensor using a new generative model based on a variational autoencoder and Wasserstein GAN

X Wang, H Liu - Journal of Process Control, 2020 - Elsevier

In industrial process control, measuring some variables is difficult for environmental or cost

reasons. This necessitates employing a soft sensor to predict these variables by using the

collected data from easily measured variables. The prediction accuracy and computational …

  Cited by 8 Related articles All 3 versions

 


[HTML] mdpi.com

Calculating the Wasserstein metric-based Boltzmann entropy of a landscape mosaic

H Zhang, Z Wu, T Lan, Y Chen, P Gao - Entropy, 2020 - mdpi.com

Shannon entropy is currently the most popular method for quantifying the disorder or

information of a spatial data set such as a landscape pattern and a cartographic map.

However, its drawback when applied to spatial data is also well documented; it is incapable …

  Cited by 3 Related articles All 9 versions   MR4217458 


Online Stochastic Optimization with Wasserstein Based Non-stationarity

J Jiang, X Li, J Zhang - arXiv preprint arXiv:2012.06961, 2020 - arxiv.org

We consider a general online stochastic optimization problem with multiple budget

constraints over a horizon of finite time periods. At each time period, a reward function and

multiple cost functions, where each cost function is involved in the consumption of one …

  Related articles All 2 versions 


Wasserstein based transfer network for cross-domain sentiment classification

Y Du, M He, L Wang, H Zhang - Knowledge-Based Systems, 2020 - Elsevier

Automatic sentiment analysis of social media texts is of great significance for identifying

people's opinions that can help people make better decisions. Annotating data is time

consuming and laborious, and effective sentiment analysis on domains lacking of labeled …

  Cited by 1 Related articles All 2 versions

<——2020——2020———2130——



DPIR-Net: Direct PET image reconstruction based on the Wasserstein generative adversarial network

Z Hu, H Xue, Q Zhang, J Gao, N Zhang… - … on Radiation and …, 2020 - ieeexplore.ieee.org

Positron emission tomography (PET) is an advanced medical imaging technique widely

used in various clinical applications, such as tumor detection and neurologic disorders.

Reducing the radiotracer dose is desirable in PET imaging because it decreases the …

  Cited by 5 Related articles


Generative adversarial networks based on Wasserstein distance for knowledge graph embeddings

Y Dai, S Wang, X Chen, C Xu, W Guo - Knowledge-Based Systems, 2020 - Elsevier

Abstract Knowledge graph embedding aims to project entities and relations into low-

dimensional and continuous semantic feature spaces, which has captured more attention in

recent years. Most of the existing models roughly construct negative samples via a uniformly …

  Cited by 6 Related articles All 2 versions


[PDF] researchgate.net

Sample generation based on a supervised Wasserstein Generative Adversarial Network for high-resolution remote-sensing scene classification

W Han, L Wang, R Feng, L Gao, X Chen, Z Deng… - Information …, 2020 - Elsevier

As high-resolution remote-sensing (HRRS) images have become increasingly widely

available, scene classification focusing on the smart classification of land cover and land

use has also attracted more attention. However, mainstream methods encounter a severe …

  Cited by 4 Related articles All 3 versions


[PDF] bciml.cn

Wasserstein distance based deep adversarial transfer learning for intelligent fault diagnosis with unlabeled or insufficient labeled data

C Cheng, B Zhou, G Ma, D Wu, Y Yuan - Neurocomputing, 2020 - Elsevier

Intelligent fault diagnosis is one critical topic of maintenance solution for mechanical

systems. Deep learning models, such as convolutional neural networks (CNNs), have been

successfully applied to fault diagnosis tasks and achieved promising results. However, one …

  Cited by 8 Related articles All 3 versions


[PDF] researchgate.net

Data supplement for a soft sensor using a new generative model based on a variational autoencoder and Wasserstein GAN

X Wang, H Liu - Journal of Process Control, 2020 - Elsevier

In industrial process control, measuring some variables is difficult for environmental or cost

reasons. This necessitates employing a soft sensor to predict these variables by using the

collected data from easily measured variables. The prediction accuracy and computational …

  Cited by 6 Related articles All 3 versions


2020


Person-independent facial expression recognition method based on improved Wasserstein generative adversarial networks in combination with identity aware

C Xu, Y Cui, Y Zhang, P Gao, J Xu - Multimedia Systems, 2020 - Springer

Since the distinction between two expressions is fairly vague, usually a subtle change in one

part of the human face is enough to change a facial expression. Most of the existing facial

expression recognition algorithms are not robust enough because they rely on general facial …

  Cited by 5 Related articles


[PDF] unifi.it

[PDF] Conlon: A pseudo-song generator based on a new pianoroll, wasserstein autoencoders, and optimal interpolations

L Angioloni, T Borghuis, L Brusci… - Proceedings of the 21st …, 2020 - flore.unifi.it

We introduce CONLON, a pattern-based MIDI generation method that employs a new

lossless pianoroll-like data description in which velocities and durations are stored in

separate channels. CONLON uses Wasserstein autoencoders as the underlying generative …

  Cited by 1 Related articles All 7 versions 


Asymptotic Guarantees for Generative Modeling Based on the Smooth Wasserstein Distance

Z Goldfeld, K Greenewald, K Kato - arXiv preprint arXiv:2002.01012, 2020 - arxiv.org

Minimum distance estimation (MDE) gained recent attention as a formulation of (implicit)

generative modeling. It considers minimizing, over model parameters, a statistical distance

between the empirical data distribution and the model. This formulation lends itself well to …

  Cited by 2 Related articles All 2 versions 


[PDF] arxiv.org

Wasserstein-based fairness interpretability framework for machine learning models

A Miroshnikov, K Kotsiopoulos, R Franks… - arXiv preprint arXiv …, 2020 - arxiv.org

In this article, we introduce a fairness interpretability framework for measuring and

explaining bias in classification and regression models at the level of a distribution. In our

work, motivated by the ideas of Dwork et al.(2012), we measure the model bias across sub …

  Related articles All 2 versions 

<——2020——2020———2140——



DECWA: Density-Based Clustering using Wasserstein Distance

N El Malki, R Cugny, O Teste, F Ravat - Proceedings of the 29th ACM …, 2020 - dl.acm.org

Clustering is a data analysis method for extracting knowledge by discovering groups of data

called clusters. Among these methods, state-of-the-art density-based clustering methods

have proven to be effective for arbitrary-shaped clusters. Despite their encouraging results …

  Related articles All 2 versions
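
A toy stand-in for clustering distributions with Wasserstein distances is sketched below, using plain hierarchical clustering on a pairwise 1-D Wasserstein distance matrix rather than the DECWA algorithm itself (names and parameters are illustrative):

    # Cluster 1-D sample sets by their pairwise Wasserstein distances.
    import numpy as np
    from scipy.stats import wasserstein_distance
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import squareform

    rng = np.random.default_rng(0)
    groups = [rng.normal(mu, 1.0, size=200) for mu in (0.0, 0.1, 3.0, 3.2)]
    n = len(groups)
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            D[i, j] = D[j, i] = wasserstein_distance(groups[i], groups[j])
    labels = fcluster(linkage(squareform(D), method="average"), t=2, criterion="maxclust")
    print(labels)                             # the two nearby pairs should share a label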


[PDF] unifi.it

[PDF] Conlon: A pseudo-song generator based on a new pianoroll, wasserstein autoencoders, and optimal interpolations

L Angioloni, T Borghuis, L Brusci… - Proceedings of the 21st …, 2020 - flore.unifi.it

We introduce CONLON, a pattern-based MIDI generation method that employs a new

lossless pianoroll-like data description in which velocities and durations are stored in

separate channels. CONLON uses Wasserstein autoencoders as the underlying generative …

  Cited by 1 Related articles All 7 versions 

[HTML] hindawi.com

[HTML] An Integrated Consensus Improving Strategy Based on PL-Wasserstein Distance and Its Application in the Evaluation of Network Public Opinion Emergencies

S Zhang, Z Ma, X Liu, Z Wang, L Jiang - Complexity, 2020 - hindawi.com

In real life, multiple network public opinion emergencies may break out in a certain place at

the same time. So, it is necessary to invite emergency decision experts in multiple fields for

timely evaluating the comprehensive crisis of the online public opinion, and then limited …

  Related articles All 7 versions 


[PDF] mdpi.com

Knowledge-grounded chatbot based on dual wasserstein generative adversarial networks with effective attention mechanisms

S Kim, OW Kwon, H Kim - Applied Sciences, 2020 - mdpi.com

A conversation is based on internal knowledge that the participants already know or external

knowledge that they have gained during the conversation. A chatbot that communicates with

humans by using its internal and external knowledge is called a knowledge-grounded …

  Cited by 3 Related articles All 4 versions 


CVaR-Based Approximations of Wasserstein Distributionally Robust Chance Constraints with Application to Process Scheduling

B Liu, Q Zhang, X Ge, Z Yuan - Industrial & Engineering Chemistry …, 2020 - ACS Publications

Distributionally robust chance constrained programming is a stochastic optimization

approach that considers uncertainty in model parameters as well as uncertainty in the

underlying probability distribution. It ensures a specified probability of constraint satisfaction …

  Related articles All 4 versions


2020


[PDF] arxiv.org

The equivalence of Fourier-based and Wasserstein metrics on imaging problems

G Auricchio, A Codegoni, S Gualandi… - arXiv preprint arXiv …, 2020 - arxiv.org

We investigate properties of some extensions of a class of Fourier-based probability metrics,

originally introduced to study convergence to equilibrium for the solution to the spatially

homogeneous Boltzmann equation. At difference with the original one, the new Fourier …

  Cited by 1 Related articles All 7 versions 
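
For orientation, the Fourier-based probability metrics in this line of work are typically of the form

$$ d_s(\mu,\nu) \;=\; \sup_{\xi \neq 0} \frac{|\widehat{\mu}(\xi) - \widehat{\nu}(\xi)|}{|\xi|^{s}}, \qquad s > 0, $$

where $\widehat{\mu}$ denotes the characteristic function of $\mu$; the displayed form is the standard one from the Boltzmann-equation literature, quoted here as background rather than from the paper above, whose contribution is the comparison of such metrics with Wasserstein distances.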


[PDF] brown.edu

Drug–drug interaction prediction with Wasserstein Adversarial Autoencoder-based knowledge graph embeddings

Y Dai, C Guo, W Guo, C Eickhoff - Briefings in Bioinformatics, 2020 - academic.oup.com

An interaction between pharmacological agents can trigger unexpected adverse events.

Capturing richer and more comprehensive information about drug–drug interactions (DDIs)

is one of the key tasks in public health and drug development. Recently, several knowledge …

  Related articles All 3 versions


[PDF] arxiv.org

Gromov-Wasserstein Distance based Object Matching: Asymptotic Inference

CA Weitkamp, K Proksch, C Tameling… - arXiv preprint arXiv …, 2020 - arxiv.org

In this paper, we aim to provide a statistical theory for object matching based on the Gromov-

Wasserstein distance. To this end, we model general objects as metric measure spaces.

Based on this, we propose a simple and efficiently computable asymptotic statistical test for …

  Related articles All 2 versions 


Wasserstein Generative Models for Patch-based Texture Synthesis

A Houdard, A Leclaire, N Papadakis… - arXiv preprint arXiv …, 2020 - arxiv.org

In this paper, we propose a framework to train a generative model for texture image

synthesis from a single example. To do so, we exploit the local representation of images via

the space of patches, that is, square sub-images of fixed size (eg $4\times 4$). Our main …

  Cited by 1 Related articles All 10 versions 


[PDF] arxiv.org

Fast Epigraphical Projection-based Incremental Algorithms for Wasserstein Distributionally Robust Support Vector Machine

J Li, C Chen, AMC So - arXiv preprint arXiv:2010.12865, 2020 - arxiv.org

Wasserstein Distributionally Robust Optimization (DRO) is

concerned with finding decisions that perform well on data that are drawn from the worst-

case probability distribution within a Wasserstein ball centered at a certain nominal …

  Cited by 1 Related articles All 5 versions 

<——2020——2020———2150—— 



[PDF] arxiv.org

Conditional Wasserstein GAN-based Oversampling of Tabular Data for Imbalanced Learning

J Engelmann, S Lessmann - arXiv preprint arXiv:2008.09202, 2020 - arxiv.org

Class imbalance is a common problem in supervised learning and impedes the predictive

performance of classification models. Popular countermeasures include oversampling the

minority class. Standard methods like SMOTE rely on finding nearest neighbours and linear …

  Cited by 1 Related articles All 5 versions 


First arrival picking of microseismic signals based on nested U-Net and Wasserstein Generative Adversarial Network

JL Zhang, GQ Sheng - Journal of Petroleum Science and Engineering, 2020 - Elsevier

Picking the first arrival of microseismic signals, quickly and accurately, is the key for real-time

data processing of microseismic monitoring. The traditional method cannot meet the high-

accuracy and high-efficiency requirements for the first-arrival microseismic picking, in a low …

  Related articles All 2 versions


[PDF] jst.go.jp

Orthogonal gradient penalty for fast training of wasserstein gan based multi-task autoencoder toward robust speech recognition

CY Kao, S Park, A Badi, DK Han… - IEICE TRANSACTIONS on …, 2020 - search.ieice.org

Performance in Automatic Speech Recognition (ASR) degrades dramatically in noisy

environments. To alleviate this problem, a variety of deep networks based on convolutional

neural networks and recurrent neural networks were proposed by applying L1 or L2 loss. In …

  Cited by 1 Related articles All 5 versions


[PDF] ieee.org

A New Data-Driven Distributionally Robust Portfolio Optimization Method based on Wasserstein Ambiguity Set

N Du, Y Liu, Y Liu - IEEE Access, 2020 - ieeexplore.ieee.org

Since optimal portfolio strategy depends heavily on the distribution of uncertain returns, this

paper proposes a new method for the portfolio optimization problem with respect to

distribution uncertainty. When the distributional information of the uncertain return rate is …

  Related articles


[PDF] arxiv.org

Central limit theorems for Markov chains based on their convergence rates in Wasserstein distance

R Jin, A Tan - arXiv preprint arXiv:2002.09427, 2020 - arxiv.org

Many tools are available to bound the convergence rate of Markov chains in total variation

(TV) distance. Such results can be used to establish central limit theorems (CLT) that enable

error evaluations of Monte Carlo estimates in practice. However, convergence analysis …

  Related articles All 2 versions 


2020


[PDF] iop.org

Speech Dereverberation Based on Improved Wasserstein Generative Adversarial Networks

L Rao, J Yang - Journal of Physics: Conference Series, 2020 - iopscience.iop.org

In reality, the sound we hear is not only disturbed by noise, but also the reverberant, whose

effects are rarely taken into account. Recently, deep learning has shown great advantages

in speech signal processing. But among the existing dereverberation approaches, very few …

  Related articles All 2 versions


[PDF] arxiv.org

Safe Zero-Shot Model-Based Learning and Control: A Wasserstein Distributionally Robust Approach

A Kandel, SJ Moura - arXiv preprint arXiv:2004.00759, 2020 - arxiv.org

This paper explores distributionally robust zero-shot model-based learning and control

using Wasserstein ambiguity sets. Conventional model-based reinforcement learning

algorithms struggle to guarantee feasibility throughout the online learning process. We …

  Related articles All 2 versions 


Drift compensation algorithm based on Time-Wasserstein dynamic distribution alignment

Y Tao, K Zeng, Z Liang - 2020 IEEE/CIC International …, 2020 - ieeexplore.ieee.org

The electronic nose (E-nose) is mainly used to detect different types and concentrations of

gases. At present, the average life of E-nose is relatively short, mainly due to the drift of the

sensor resulting in a decrease in the effect. Therefore, it is the focus of research in this field …

  Related articles



An Improvement based on Wasserstein GAN for Alleviating Mode Collapsing

Y Chen, X Hou - 2020 International Joint Conference on Neural …, 2020 - ieeexplore.ieee.org

In the past few years, Generative Adversarial Networks as a deep generative model has

received more and more attention. Mode collapsing is one of the challenges in the study of

Generative Adversarial Networks. In order to solve this problem, we deduce a new algorithm …

  Related articles


[PDF] iop.org

Data Augmentation Based on Wasserstein Generative Adversarial Nets Under Few Samples

Y Jiang, B Zhu, Q Ma - IOP Conference Series: Materials Science …, 2020 - iopscience.iop.org

Aiming at the problem of low accuracy of image classification under the condition of few

samples, an improved method based on Wasserstein Generative Adversarial Nets is

proposed. The small data sets are augmented by generating target samples through …

  Cited by 1 Related articles All 2 versions

<——2020——2020———2160—— 



[PDF] arxiv.org

LCS Graph Kernel Based on Wasserstein Distance in Longest Common Subsequence Metric Space

J Huang, Z Fang, H Kasai - arXiv preprint arXiv:2012.03612, 2020 - arxiv.org

For graph classification tasks, many methods use a common strategy to aggregate

information of vertex neighbors. Although this strategy provides an efficient means of

extracting graph topological features, it brings excessive amounts of information that might …

  Cited by 1 Related articles All 2 versions 


Covariance Steering of Discrete-Time Stochastic Linear Systems Based on Wasserstein Distance Terminal Cost

IM Balci, E Bakolas - IEEE Control Systems Letters, 2020 - ieeexplore.ieee.org

We consider a class of stochastic optimal control problems for discrete-time linear systems

whose objective is the characterization of control policies that will steer the probability

distribution of the terminal state of the system close to a desired Gaussian distribution. In our …

  Related articles
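
When both distributions are Gaussian, the squared 2-Wasserstein distance has the well-known closed form $W_2^2(\mathcal{N}(m_1,\Sigma_1),\mathcal{N}(m_2,\Sigma_2)) = \|m_1-m_2\|^2 + \mathrm{tr}\big(\Sigma_1+\Sigma_2-2(\Sigma_2^{1/2}\Sigma_1\Sigma_2^{1/2})^{1/2}\big)$, which is what makes Gaussian terminal costs of this kind tractable. A small numerical check of the generic formula (illustrative names, not the paper's code):

    # Closed-form squared 2-Wasserstein distance between two Gaussians.
    import numpy as np
    from scipy.linalg import sqrtm

    def gaussian_w2_sq(m1, S1, m2, S2):
        rootS2 = sqrtm(S2)
        cross = sqrtm(rootS2 @ S1 @ rootS2)   # (S2^{1/2} S1 S2^{1/2})^{1/2}
        bures = np.trace(S1 + S2 - 2.0 * np.real(cross))
        return float(np.sum((m1 - m2) ** 2) + bures)

    m1, m2 = np.zeros(2), np.array([1.0, -1.0])
    S1 = np.array([[1.0, 0.2], [0.2, 0.5]])
    S2 = np.eye(2)
    print(gaussian_w2_sq(m1, S1, m2, S2))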


[HTML] peerj.com

[HTML] Data augmentation-based conditional Wasserstein generative adversarial network-gradient penalty for XSS attack detection system

FMM Mokbal, D Wang, X Wang, L Fu - PeerJ Computer Science, 2020 - peerj.com

The rapid growth of the worldwide web and accompanied opportunities of web applications

in various aspects of life have attracted the attention of organizations, governments, and

individuals. Consequently, web applications have increasingly become the target of …

  Related articles All 5 versions 


[PDF] iop.org

A collaborative filtering recommendation framework based on Wasserstein GAN

R Li, F Qian, X Du, S Zhao… - Journal of Physics …, 2020 - iopscience.iop.org

Compared with the original GAN, Wasserstein GAN minimizes the Wasserstein Distance

between the generative distribution and the real distribution, can well capture the potential

distribution of data and has achieved excellent results in image generation. However, the …

  Related articles


A Riemannian submersion‐based approach to the Wasserstein barycenter of positive definite matrices

M Li, H Sun, D Li - Mathematical Methods in the Applied …, 2020 - Wiley Online Library

In this paper, we introduce a novel geometrization on the space of positive definite matrices,

derived from the Riemannian submersion from the general linear group to the space of

positive definite matrices, resulting in easier computation of its geometric structure. The …

  Related articles


 2020

 

[PDF] researchgate.net

Non-Gaussian BLE-Based Indoor Localization Via Gaussian Sum Filtering Coupled with Wasserstein Distance

P Malekzadeh, S Mehryar, P Spachos… - ICASSP 2020-2020 …, 2020 - ieeexplore.ieee.org

With recent breakthroughs in signal processing, communication and networking systems, we

are more and more surrounded by smart connected devices empowered by the Internet of

Thing (IoT). Bluetooth Low Energy (BLE) is considered as the main-stream technology to …

  Cited by 1 Related articles All 2 versions


Wasserstein GAN based on Autoencoder with back-translation for cross-lingual embedding mappings

Y Zhang, Y Li, Y Zhu, X Hu - Pattern Recognition Letters, 2020 - Elsevier

Recent works about learning cross-lingual word mappings (CWMs) focus on relaxing the

requirement of bilingual signals through generative adversarial networks (GANs). GANs

based models intend to enforce source embedding space to align target embedding space …

  Related articles All 2 versions


[PDF] sciencedirect.com

Intelligent Fault Diagnosis with a Deep Transfer Network based on Wasserstein Distance

J Xu, J Huang, Y Zhao, L Zhou - Procedia Computer Science, 2020 - Elsevier

Intelligent fault-diagnosis methods based on deep-learning technology have been very

successful for complex industrial systems. The deep learning based fault classification

model requires a large number of labeled data. Moreover, the probability distribution of …

  Related articles


[PDF] aalto.fi

Wasserstein-Distance-Based Temporal Clustering for Capacity-Expansion Planning in Power Systems

L Condeixa, F Oliveira… - … Conference on Smart …, 2020 - ieeexplore.ieee.org

As variable renewable energy sources are steadily incorporated in European power

systems, the need for higher temporal resolution in capacity-expansion models also

increases. Naturally, there exists a trade-off between the amount of temporal data used to …

  Related articles


Generating Hyperspectral Data Based on 3D CNN and Improved Wasserstein Generative Adversarial Network Using Homemade High-resolution Datasets

Y Li, D Huang - Proceedings of the International Conference on …, 2020 - dl.acm.org

Hyperspectral images contain rich information on the fingerprints of materials and are being

popularly used in the exploration of oil and gas, environmental monitoring, and remote

sensing. Since hyperspectral images cover a wide range of wavelengths with high …

  Related articles

<——2020——2020———2170——



[PDF] mdpi.com

Wasserstein Generative Adversarial Networks Based Data Augmentation for Radar Data Analysis

H Lee, J Kim, EK Kim, S Kim - Applied Sciences, 2020 - mdpi.com

Ground-based weather radar can observe a wide range with a high spatial and temporal

resolution. They are beneficial to meteorological research and services by providing

valuable information. Recent weather radar data related research has focused on applying …

  Related articles All 2 versions 


Learning Wasserstein Distance-Based Gaussian Graphical Model for Multivariate Time Series Classification

HU Xuegang, L Jianxing, LI Peipei… - 2020 IEEE …, 2020 - ieeexplore.ieee.org

Multivariate time series classification occupies an important position in time series data

mining tasks and has been applied in many fields. However, due to the statistical coupling

between different variables of Multivariate Time Series (MTS) data, traditional classification …

  Related articles All 2 versions


[PDF] iop.org

Data Augmentation Method for Switchgear Defect Samples Based on Wasserstein Generative Adversarial Network

X Huang, J Xiong, Y Zhang, J Liang… - Journal of Physics …, 2020 - iopscience.iop.org

The problem of sample imbalance will lead to poor generalization ability of the deep

learning model algorithm, and the phenomenon of overfitting during network training, which

limits the accuracy of intelligent fault diagnosis of switchgear equipment. In view of this, this …

  Related articles


A Super Resolution Method for Remote Sensing Images Based on Cascaded Conditional Wasserstein GANs

B Liu, H Li, Y Zhou, Y Peng, A Elazab… - 2020 IEEE 3rd …, 2020 - ieeexplore.ieee.org

High-resolution (HR) remote sensing imagery is quite beneficial for subsequent

interpretation. Obtaining HR images can be achieved by upgrading the imaging device. Yet,

the cost to perform this task is very huge. Thus, it is necessary to obtain HR images from low …

  Related articles


Remote Sensing Image Segmentation based on Generative Adversarial Network with Wasserstein divergence

X Cao, C Song, J Zhang, C Liu - 2020 3rd International Conference on …, 2020 - dl.acm.org

In the image segmentation fields, traditional methods can be classified into four main

categories: threshold-based (eg Otsu [1]), edge-based (eg Canny [2], Hough transform [3]),

region-based (eg Super pixel [4]), and energy functional-based segmentation methods (eg …

 

2020

 

Data Augmentation Method for Fault Diagnosis of Mechanical Equipment Based on Improved Wasserstein GAN

W Liu, L Duan, Y Tang, J Yang - 2020 11th International …, 2020 - ieeexplore.ieee.org

Most of the time the mechanical equipment is in normal operation state, which results in high

imbalance between fault data and normal data. In addition, traditional signal processing

methods rely heavily on expert experience, making it difficult for classification or prediction …

  Related articles


[PDF] unifi.it

[PDF] Pattern-Based Music Generation with Wasserstein Autoencoders and PRCDescriptions

V Borghuis, L Angioloni, L Brusci… - 29th International Joint …, 2020 - flore.unifi.it

We present a pattern-based MIDI music generation system with a generation strategy based

on Wasserstein autoencoders and a novel variant of pianoroll descriptions of patterns which

employs separate channels for note velocities and note durations and can be fed into classic …

  Related articles All 4 versions 


Stereoscopic image reflection removal based on Wasserstein Generative Adversarial Network

X Wang, Y Pan, DPK Lun - 2020 IEEE International …, 2020 - ieeexplore.ieee.org

Reflection removal is a long-standing problem in computer vision. In this paper, we consider

the reflection removal problem for stereoscopic images. By exploiting the depth information

of stereoscopic images, a new background edge estimation algorithm based on the …

  Related articles All 2 versions


[PDF] unipv.it

[PDF] On the equivalence between Fourier-based and Wasserstein metrics

G Auricchio, A Codegoni, S Gualandi, G Toscani… - mate.unipv.it

We investigate properties of some extensions of a class of Fourier-based probability metrics,

originally introduced to study convergence to equilibrium for the solution to the spatially

homogeneous Boltzmann equation. At difference with the original one, the new Fourier …

  Related articles 


Improving EEG-based motor imagery classification with conditional Wasserstein GAN

Z Li, Y Yu - 2020 International Conference on Image, Video …, 2020 - spiedigitallibrary.org

Deep learning based algorithms have made huge progress in the field of image

classification and speech recognition. There is an increasing number of researchers

beginning to use deep learning to process electroencephalographic (EEG) brain signals …

  Related articles

<——2020——2020———2180——


Remaining useful life prediction of lithium-ion batteries using a fusion method based on Wasserstein GAN

周温丁, 鲍士兼, 许方敏, 赵成林 - The Journal of China Universities of Posts and Telecommunications, 2020 - jcupt.bupt.edu.cn

Lithium-ion batteries are the main power supply equipment in many fields due to their

advantages of no memory, high energy density, long cycle life and no pollution to the

environment. Accurate prediction for the remaining useful life (RUL) of lithium-ion batteries …

  All 2 versions Cached 


Wasserstein loss-based deep object detection

Y Han, X Liu, Z Sheng, Y Ren, X Han… - Proceedings of the …, 2020 - openaccess.thecvf.com

Object detection locates the objects with bounding boxes and identifies their classes, which

is valuable in many computer vision applications (eg autonomous driving). Most existing

deep learning-based methods output a probability vector for instance classification trained …

  Cited by 8 Related articles All 5 versions 


[PDF] arxiv.org

Cutoff thermalization for Ornstein-Uhlenbeck systems with small L\'evy noise in the Wasserstein distance

G Barrera, MA Högele, JC Pardo - arXiv preprint arXiv:2009.10590, 2020 - arxiv.org

This article establishes cutoff thermalization (also known as the cutoff phenomenon) for a

general class of general Ornstein-Uhlenbeck systems $(X^\epsilon_t (x)) _ {t\geq 0} $ under

$\epsilon $-small additive Lévy noise with initial value $ x $. The driving noise processes …

  Cited by 1 Related articles All 3 versions 


[PDF] arxiv.org

A Bismut-Elworthy inequality for a Wasserstein diffusion on the circle

V Marx - arXiv preprint arXiv:2005.04972, 2020 - arxiv.org

We investigate in this paper a regularization property of a diffusion on the Wasserstein

space $\mathcal {P} _2 (\mathbb {T}) $ of the one-dimensional torus. The control obtained

on the gradient of the semi-group is very much in the spirit of Bismut-Elworthy-Li integration …

  Related articles All 9 versions 


[PDF] arxiv.org

Martingale Wasserstein inequality for probability measures in the convex order

B Jourdain, W Margheriti - arXiv preprint arXiv:2011.11599, 2020 - arxiv.org

It is known since [24] that two one-dimensional probability measures in the convex order

admit a martingale coupling with respect to which the integral of $\vert x-y\vert$ is smaller

than twice their $\mathcal{W}_1$-distance (Wasserstein distance with index $1$). We …

  Related articles All 7 versions 
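
In symbols, the inequality recalled in the abstract says that for one-dimensional $\mu$ and $\nu$ in the convex order there exists a martingale coupling $M$ of $(\mu,\nu)$ with

$$ \int \vert x-y\vert \, M(dx,dy) \;\le\; 2\,\mathcal{W}_1(\mu,\nu). $$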


 2020


[PDF] arxiv.org

Gromov-Wasserstein Distance based Object Matching: Asymptotic Inference

CA Weitkamp, K Proksch, C Tameling… - arXiv preprint arXiv …, 2020 - arxiv.org

In this paper, we aim to provide a statistical theory for object matching based on the Gromov-

Wasserstein distance. To this end, we model general objects as metric measure spaces.

Based on this, we propose a simple and efficiently computable asymptotic statistical test for …

  Related articles All 2 versions 


 [PDF] arxiv.org

Convergence rate to equilibrium in Wasserstein distance for reflected jump–diffusions

A Sarantsev - Statistics & Probability Letters, 2020 - Elsevier

Convergence rate to the stationary distribution for continuous-time Markov processes can be

studied using Lyapunov functions. Recent work by the author provided explicit rates of

convergence in special case of a reflected jump–diffusion on a half-line. These results are …

  Related articles All 7 versions


Importance-aware semantic segmentation in self-driving with discrete wasserstein training

X Liu, Y Han, S Bai, Y Ge, T Wang, X Han, S Li… - Proceedings of the …, 2020 - ojs.aaai.org

Semantic segmentation (SS) is an important perception manner for self-driving cars and

robotics, which classifies each pixel into a pre-determined class. The widely-used cross

entropy (CE) loss-based deep networks has achieved significant progress wrt the mean …

  Cited by 8 Related articles All 6 versions 


[PDF] arxiv.org

Efficient Wasserstein Natural Gradients for Reinforcement Learning

T Moskovitz, M Arbel, F Huszar, A Gretton - arXiv preprint arXiv …, 2020 - arxiv.org

A novel optimization approach is proposed for application to policy gradient methods and

evolution strategies for reinforcement learning (RL). The procedure uses a computationally

efficient Wasserstein natural gradient (WNG) descent that takes advantage of the geometry …

  Cited by 1 Related articles All 2 versions 


A Linear Programming Approximation of Distributionally Robust Chance-Constrained Dispatch With Wasserstein Distance

A Zhou, M Yang, M Wang… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

This paper proposes a data-driven distributionally robust chance constrained real-time

dispatch (DRCC-RTD) considering renewable generation forecasting errors. The proposed

DRCC-RTD model minimizes the expected quadratic cost function and guarantees that the …

  Cited by 5 Related articles All 2 versions

<——2020——2020———2190—— 



[PDF] optimization-online.org

Data-driven distributionally robust chance-constrained optimization with Wasserstein metric

R Ji, MA Lejeune - Journal of Global Optimization, 2020 - Springer

We study distributionally robust chance-constrained programming (DRCCP) optimization

problems with data-driven Wasserstein ambiguity sets. The proposed algorithmic and

reformulation framework applies to all types of distributionally robust chance-constrained …

  Cited by 10 Related articles All 3 versions
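
The Wasserstein ambiguity sets used in such distributionally robust formulations are generically of the form

$$ \mathcal{B}_\varepsilon(\widehat{P}_N) \;=\; \{\, Q : W_p(Q, \widehat{P}_N) \le \varepsilon \,\}, \qquad \widehat{P}_N = \frac{1}{N}\sum_{i=1}^N \delta_{\xi_i}, $$

and a distributionally robust chance constraint then requires $\inf_{Q \in \mathcal{B}_\varepsilon(\widehat{P}_N)} Q\big(g(x,\xi) \le 0\big) \ge 1-\alpha$; this is the generic statement, written here for orientation rather than quoted from the paper above.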


[PDF] thecvf.com

Barycenters of natural images constrained wasserstein barycenters for image morphing

D Simon, A Aberdam - … of the IEEE/CVF Conference on …, 2020 - openaccess.thecvf.com

Image interpolation, or image morphing, refers to a visual transition between two (or more)

input images. For such a transition to look visually appealing, its desirable properties are (i)

to be smooth;(ii) to apply the minimal required change in the image; and (iii) to seem" real" …

  Cited by 3 Related articles All 7 versions 


Finite-Horizon Control of Nonlinear Discrete-Time Systems with Terminal Cost of Wasserstein Distance

K Hoshino - 2020 59th IEEE Conference on Decision and …, 2020 - ieeexplore.ieee.org

This study explores a finite-horizon optimal control problem of nonlinear discrete-time

systems for steering a probability distribution of initial states as close as possible to a

desired probability distribution of terminal states. The problem is formulated as an optimal …

  Cited by 1 Related articles


[PDF] arxiv.org

Robust Reinforcement Learning with Wasserstein Constraint

L Hou, L Pang, X Hong, Y Lan, Z Ma, D Yin - arXiv preprint arXiv …, 2020 - arxiv.org

Robust Reinforcement Learning aims to find the optimal policy with some extent of

robustness to environmental dynamics. Existing learning algorithms usually enable the

robustness through disturbing the current state or simulating environmental parameters in a …

  Related articles All 4 versions 


CVaR-Based Approximations of Wasserstein Distributionally Robust Chance Constraints with Application to Process Scheduling

B Liu, Q Zhang, X Ge, Z Yuan - Industrial & Engineering Chemistry …, 2020 - ACS Publications

Distributionally robust chance constrained programming is a stochastic optimization

approach that considers uncertainty in model parameters as well as uncertainty in the

underlying probability distribution. It ensures a specified probability of constraint satisfaction …

  Related articles All 4 versions


2020


[PDF] arxiv.org

Strong Formulations for Distributionally Robust Chance-Constrained Programs with Left-Hand Side Uncertainty under Wasserstein Ambiguity

N Ho-Nguyen, F Kılınç-Karzan, S Küçükyavuz… - arXiv preprint arXiv …, 2020 - arxiv.org

Distributionally robust chance-constrained programs (DR-CCP) over Wasserstein ambiguity

sets exhibit attractive out-of-sample performance and admit big-$ M $-based mixed-integer

programming (MIP) reformulations with conic constraints. However, the resulting …

  Cited by 3 Related articles All 3 versions 


[PDF] arxiv.org

Consistency of Distributionally Robust Risk-and Chance-Constrained Optimization Under Wasserstein Ambiguity Sets

A Cherukuri, AR Hota - IEEE Control Systems Letters, 2020 - ieeexplore.ieee.org

We study stochastic optimization problems with chance and risk constraints, where in the

latter, risk is quantified in terms of the conditional value-at-risk (CVaR). We consider the

distributionally robust versions of these problems, where the constraints are required to hold …

  Cited by 1 Related articles All 3 versions


Discrete Wasserstein Autoencoders for Document Retrieval

Y Zhang, H Zhu - … 2020-2020 IEEE International Conference on …, 2020 - ieeexplore.ieee.org

Learning to hash via generative models has became a promising paradigm for fast similarity

search in document retrieval. The binary hash codes are treated as Bernoulli latent variables

when training a variational autoencoder (VAE). However, the prior of discrete distribution (ie …

  Related articles


Covariance Steering of Discrete-Time Stochastic Linear Systems Based on Wasserstein Distance Terminal Cost

IM Balci, E Bakolas - IEEE Control Systems Letters, 2020 - ieeexplore.ieee.org

We consider a class of stochastic optimal control problems for discrete-time linear systems

whose objective is the characterization of control policies that will steer the probability

distribution of the terminal state of the system close to a desired Gaussian distribution. In our …

  Related articles


[PDF] arxiv.org

Generating Natural Adversarial Hyperspectral examples with a modified Wasserstein GAN

JC Burnel, K Fatras, N Courty - arXiv preprint arXiv:2001.09993, 2020 - arxiv.org

Adversarial examples are a hot topic due to their abilities to fool a classifier's prediction.

There are two strategies to create such examples, one uses the attacked classifier's

gradients, while the other only requires access to the clas-sifier's prediction. This is …

  Related articles All 4 versions 

<——2020——2020———2200—— 


Numeric Data Augmentation using Structural Constraint Wasserstein Generative Adversarial Networks

W Wang, C Wang, T Cui, R Gong… - … on Circuits and …, 2020 - ieeexplore.ieee.org

Some recent studies have suggested using GANs for numeric data generation such as to

generate data for completing the imbalanced numeric data. Considering the significant

difference between the dimensions of the numeric data and images, as well as the strong …

  Related articles


Image Hashing by Minimizing Discrete Component-wise Wasserstein Distance

KD Doan, S Manchanda, S Badirli… - arXiv e-prints, 2020 - ui.adsabs.harvard.edu

Image hashing is one of the fundamental problems that demand both efficient and effective

solutions for various practical scenarios. Adversarial autoencoders are shown to be able to

implicitly learn a robust, locality-preserving hash function that generates balanced and high …



Wasserstein Embedding for Graph Learning

S Kolouri, N Naderializadeh, GK Rohde… - arXiv preprint arXiv …, 2020 - arxiv.org

We present Wasserstein Embedding for Graph Learning (WEGL), a novel and fast

framework for embedding entire graphs in a vector space, in which various machine

learning models are applicable for graph-level prediction tasks. We leverage new insights …

  Cited by 3 Related articles All 3 versions 


2020


[PDF] mlr.press

Nested-wasserstein self-imitation learning for sequence generation

R Zhang, C Chen, Z Gan, Z Wen… - International …, 2020 - proceedings.mlr.press

Reinforcement learning (RL) has been widely studied for improving sequence-generation

models. However, the conventional rewards used for RL training typically cannot capture

sufficient semantic information and therefore render model bias. Further, the sparse and …

  Cited by 2 Related articles All 6 versions 


Generative adversarial networks based on Wasserstein distance for knowledge graph embeddings

Y Dai, S Wang, X Chen, C Xu, W Guo - Knowledge-Based Systems, 2020 - Elsevier

Abstract Knowledge graph embedding aims to project entities and relations into low-

dimensional and continuous semantic feature spaces, which has captured more attention in

recent years. Most of the existing models roughly construct negative samples via a uniformly …

 Cited by 15 Related articles All 2 versions

2020


[PDF] aaai.org

Improving the Robustness of Wasserstein Embedding by Adversarial PAC-Bayesian Learning

D Ding, M Zhang, X Pan, M Yang, X He - Proceedings of the AAAI …, 2020 - ojs.aaai.org

Node embedding is a crucial task in graph analysis. Recently, several methods are

proposed to embed a node as a distribution rather than a vector to capture more information.

Although these methods achieved noticeable improvements, their extra complexity brings …

  Related articles All 3 versions 


Typical wind power scenario generation for multiple wind farms using conditional improved Wasserstein generative adversarial network

Y Zhang, Q Ai, F Xiao, R Hao, T Lu - … Journal of Electrical Power & Energy …, 2020 - Elsevier

Because of environmental benefits, wind power is taking an increasing role meeting

electricity demand. However, wind power tends to exhibit large uncertainty and is largely

influenced by meteorological conditions. Apart from the variability, when multiple wind farms …

  Cited by 14 Related articles


Knowledge-aware attentive wasserstein adversarial dialogue response generation

Y Zhang, Q Fang, S Qian, C Xu - ACM Transactions on Intelligent …, 2020 - dl.acm.org

Natural language generation has become a fundamental task in dialogue systems. RNN-

based natural response generation methods encode the dialogue context and decode it into

a response. However, they tend to generate dull and simple responses. In this article, we …

  Cited by 2 Related articles


[PDF] researchgate.net

Sample generation based on a supervised Wasserstein Generative Adversarial Network for high-resolution remote-sensing scene classification

W Han, L Wang, R Feng, L Gao, X Chen, Z Deng… - Information …, 2020 - Elsevier

As high-resolution remote-sensing (HRRS) images have become increasingly widely

available, scene classification focusing on the smart classification of land cover and land

use has also attracted more attention. However, mainstream methods encounter a severe …

  Cited by 4 Related articles All 3 versions


[PDF] arxiv.org

Wasserstein Index Generation Model: Automatic generation of time-series index with application to Economic Policy Uncertainty

F Xie - Economics Letters, 2020 - Elsevier

Automatic time-series index generation as a black-box method … Comparable results with existing

ones, tested on EPU … Applicable to any text corpus to produce sentiment indices … I propose

a novel method, the Wasserstein Index Generation model (WIG), to generate a public sentiment …

  Cited by 6 Related articles All 11 versions

<——2020——2020———2210——



Dynamic facial expression generation on hilbert hypersphere with conditional wasserstein generative adversarial nets

N Otberdout, M Daoudi, A Kacem… - … on Pattern Analysis …, 2020 - ieeexplore.ieee.org

In this work, we propose a novel approach for generating videos of the six basic facial

expressions given a neutral face image. We propose to exploit the face geometry by

modeling the facial landmarks motion as curves encoded as points on a hypersphere. By …

  Cited by 7 Related articles All 10 versions


[PDF] arxiv.org

Conditional Sig-Wasserstein GANs for Time Series Generation

H Ni, L Szpruch, M Wiese, S Liao, B Xiao - arXiv preprint arXiv:2006.05421, 2020 - arxiv.org

Generative adversarial networks (GANs) have been extremely successful in generating

samples, from seemingly high dimensional probability measures. However, these methods

struggle to capture the temporal dependence of joint probability distributions induced by …

  Cited by 4 Related articles All 3 versions 


Linear Optimal Transport Embedding: Provable fast Wasserstein distance computation and classification for nonlinear problems

C Moosmüller, A Cloninger - arXiv preprint arXiv:2008.09165, 2020 - arxiv.org

Discriminating between distributions is an important problem in a number of scientific fields.

This motivated the introduction of Linear Optimal Transportation (LOT), which embeds the

space of distributions into an $ L^ 2$-space. The transform is defined by computing the …

  Cited by 2 Related articles All 2 versions 


[PDF] arxiv.org

Wasserstein Adversarial Autoencoders for Knowledge Graph Embedding based Drug-Drug Interaction Prediction

Y Dai, C Guo, W Guo, C Eickhoff - arXiv preprint arXiv:2004.07341, 2020 - arxiv.org

Interaction between pharmacological agents can trigger unexpected adverse events.

Capturing richer and more comprehensive information about drug-drug interactions (DDI) is

one of the key tasks in public health and drug development. Recently, several knowledge …

  Cited by 1 Related articles All 2 versions 


Learning Wasserstein Isometric Embedding for Point Clouds

K Kawano, S Koide, T Kutsuna - 2020 International Conference …, 2020 - ieeexplore.ieee.org

The Wasserstein distance has been employed for determining the distance between point

clouds, which have variable numbers of points and invariance of point order. However, the

high computational cost associated with the Wasserstein distance hinders its practical …

  All 2 versions


 2020

 

Semantics-assisted Wasserstein Learning for Topic and Word Embeddings

C Li, X Li, J Ouyang, Y Wang - 2020 IEEE International …, 2020 - ieeexplore.ieee.org

Wasserstein distance, defined as the cost (measured by word embeddings) of optimal

transport plan for moving between two histograms, has been proven effective in tasks of

natural language processing. In this paper, we extend Nonnegative Matrix Factorization …

  All 2 versions


Wasserstein Embeddings for Nonnegative Matrix Factorization

M Febrissy, M Nadif - … Conference on Machine Learning, Optimization, and …, 2020 - Springer

In the field of document clustering (or dictionary learning), the fitting error called the

Wasserstein (In this paper, we use “Wasserstein”,“Earth Mover's”,“Kantorovich–Rubinstein”

interchangeably) distance showed some advantages for measuring the approximation of the …

  Related articles


[PDF] arxiv.org

Pruned Wasserstein Index Generation Model and wigpy Package

F Xie - arXiv preprint arXiv:2004.00999, 2020 - arxiv.org

Recent proposal of Wasserstein Index Generation model (WIG) has shown a new direction

for automatically generating indices. However, it is challenging in practice to fit large

datasets for two reasons. First, the Sinkhorn distance is notoriously expensive to compute …

  Related articles All 8 versions 


A Novel Data-to-Text Generation Model with Transformer Planning and a Wasserstein Auto-Encoder

X Xu, T He, H Wang - 2020 IEEE International Conference on …, 2020 - ieeexplore.ieee.org

Existing methods for data-to-text generation have difficulty producing diverse texts with low

duplication rates. In this paper, we propose a novel data-to-text generation model with

Transformer planning and a Wasserstein auto-encoder, which can convert constructed data …

  Related articles All 2 versions


[PDF] ntu.edu.sg

Wasserstein distance estimates for stochastic integrals by forward-backward stochastic calculus

JC Breton, N Privault - Potential Analysis, 2020 - Springer

We prove Wasserstein distance bounds between the probability distributions of stochastic

integrals with jumps, based on the integrands appearing in their stochastic integral

representations. Our approach does not rely on the Stein equation or on the propagation of …

  Related articles All 4 versions

<——2020——2020———2220——  



[PDF] Pattern-Based Music Generation with Wasserstein Autoencoders and PRCDescriptions

V Borghuis, L Angioloni, L Brusci… - 29th International Joint …, 2020 - flore.unifi.it

We present a pattern-based MIDI music generation system with a generation strategy based

on Wasserstein autoencoders and a novel variant of pianoroll descriptions of patterns which

employs separate channels for note velocities and note durations and can be fed into classic …

  Related articles All 4 versions 


Synthetic Data Generation Using Wasserstein Conditional Gans With Gradient Penalty (WCGANS-GP)

M Singh Walia - 2020 - arrow.tudublin.ie

With data protection requirements becoming stricter, the data privacy has become

increasingly important and more crucial than ever. This has led to restrictions on the

availability and dissemination of real-world datasets. Synthetic data offers a viable solution …

  Related articles 


Rethinking Wasserstein-Procrustes for Aligning Word Embeddings Across Languages

G Ramírez Santos - 2020 - upcommons.upc.edu

The emergence of unsupervised word embeddings, pre-trained on very large monolingual

text corpora, is at the core of the ongoing neural revolution in Natural Language Processing

(NLP). Initially introduced for English, such pre-trained word embeddings quickly emerged …


Тренировочная устойчивость Вассерштейна Ганса - CodeRoad

coderoad.ru › Тренировочная-усто...

May 2, 2020 — ... the point is that GANs, not having a single objective function (there are two networks), do not ... which are guaranteed by the Wasserstein loss ...

[Russian: Training stability of Wasserstein GAN]



wasserstein minimum velocity approach to learning unnormalized models

Z Wang, S Cheng, L Yueru, J Zhu… - International …, 2020 - proceedings.mlr.press

Score matching provides an effective approach to learning flexible unnormalized models,

but its scalability is limited by the need to evaluate a second-order derivative. In this paper,

we present a scalable approximation to a general family of learning objectives including …

  Cited by 4 Related articles All 9 versions 


2020


[PDF] wiley.com

Evaluating the performance of climate models based on Wasserstein distance

G Vissio, V Lembo, V Lucarini… - Geophysical Research …, 2020 - Wiley Online Library

We propose a methodology for intercomparing climate models and evaluating their

performance against benchmarks based on the use of the Wasserstein distance (WD). This

distance provides a rigorous way to measure quantitatively the difference between two …

  Cited by 2 Related articles All 13 versions
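The climate-model entries above and below compare distributions of scalar diagnostics with the Wasserstein distance. A minimal sketch of that kind of comparison, with made-up Gaussian samples standing in for a model run and a benchmark (the variable names and numbers are illustrative assumptions, not data from the papers):

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
# hypothetical samples of a scalar climate diagnostic (e.g. daily mean temperature)
model_run = rng.normal(loc=14.2, scale=6.0, size=5000)
benchmark = rng.normal(loc=15.0, scale=5.5, size=5000)

# Wasserstein-1 distance between the two empirical distributions
print(wasserstein_distance(model_run, benchmark))
```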


[PDF] arxiv.org

wasserstein-type distance in the space of gaussian mixture models

J Delon, A Desolneux - SIAM Journal on Imaging Sciences, 2020 - SIAM

In this paper we introduce a Wasserstein-type distance on the set of Gaussian mixture

models. This distance is defined by restricting the set of possible coupling measures in the

optimal transport problem to Gaussian mixture models. We derive a very simple discrete …

  Cited by 10 Related articles All 7 versions
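The mixture distance above is assembled from 2-Wasserstein distances between Gaussian components. For reference, a sketch of the standard closed form between two Gaussians, which is the building block rather than the mixture distance itself (example means and covariances are made up):

```python
import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2(m1, C1, m2, C2):
    # W2^2 = ||m1 - m2||^2 + Tr(C1 + C2 - 2 (C1^{1/2} C2 C1^{1/2})^{1/2})
    root = np.real(sqrtm(C1))
    cross = np.real(sqrtm(root @ C2 @ root))
    bures = np.trace(C1 + C2 - 2.0 * cross)
    return np.sqrt(np.sum((m1 - m2) ** 2) + bures)

m1, C1 = np.zeros(2), np.eye(2)
m2, C2 = np.array([1.0, 0.0]), np.diag([2.0, 0.5])
print(gaussian_w2(m1, C1, m2, C2))
```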


[PDF] arxiv.org

Estimating processes in adapted Wasserstein distance

J Backhoff, D Bartl, M Beiglböck, J Wiesel - arXiv preprint arXiv …, 2020 - arxiv.org

A number of researchers have independently introduced topologies on the set of laws of

stochastic processes that extend the usual weak topology. Depending on the respective

scientific background this was motivated by applications and connections to various areas …

  Cited by 3 Related articles All 4 versions 

[CITATION] Estimating processes in adapted wasserstein distance. arXiv e-prints: 2002.07261

J Backhoff-Veraguas, D Bartl, M Beiglböck, J Wiesel - 2020 - February

  Cited by 3

[CITATION] Estimating processes in adapted Wasserstein distance

J Backhoff-Veraguas, D Bartl, M Beiglböck, J Wiesel - Preprint, 2020

  Cited by 2 Related articles


 

[PDF] aaai.org

Gromov-wasserstein factorization models for graph clustering

H Xu - Proceedings of the AAAI Conference on Artificial …, 2020 - ojs.aaai.org

We propose a new nonlinear factorization model for graphs that are with topological

structures, and optionally, node attributes. This model is based on a pseudometric called

Gromov-Wasserstein (GW) discrepancy, which compares graphs in a relational way. It …

  Cited by 5 Related articles All 5 versions 


Wasserstein fair classification

R Jiang, A Pacchiano, T Stepleton… - Uncertainty in …, 2020 - proceedings.mlr.press

We propose an approach to fair classification that enforces independence between the

classifier outputs and sensitive information by minimizing Wasserstein-1 distances. The

approach has desirable theoretical properties and is robust to specific choices of the …

  Cited by 38 Related articles All 4 versions 
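The fair-classification entry above enforces independence by penalizing Wasserstein-1 distances between classifier output distributions across groups. A small sketch of that diagnostic quantity on synthetic scores (this is not the paper's training procedure, just the kind of gap it aims to shrink):

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
# hypothetical classifier scores for two groups defined by a sensitive attribute
scores_a = rng.beta(2.0, 5.0, size=2000)
scores_b = rng.beta(3.0, 4.0, size=2000)

# Wasserstein-1 gap between the score distributions; 0 means the scores
# are identically distributed across the two groups
print(wasserstein_distance(scores_a, scores_b))
```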

 <——2020——2020———2230—— 



[PDF] arxiv.org

Wasserstein Autoregressive Models for Density Time Series

C Zhang, P Kokoszka, A Petersen - arXiv preprint arXiv:2006.12640, 2020 - arxiv.org

Data consisting of time-indexed distributions of cross-sectional or intraday returns have

been extensively studied in finance, and provide one example in which the data atoms

consist of serially dependent probability distributions. Motivated by such data, we propose …

  Cited by 2 Related articles All 3 versions 


[PDF] arxiv.org

Robustified Multivariate Regression and Classification Using Distributionally Robust Optimization under the Wasserstein Metric

R Chen, IC Paschalidis - arXiv preprint arXiv:2006.06090, 2020 - arxiv.org

We develop Distributionally Robust Optimization (DRO) formulations for Multivariate Linear

Regression (MLR) and Multiclass Logistic Regression (MLG) when both the covariates and

responses/labels may be contaminated by outliers. The DRO framework uses a probabilistic …

  Related articles All 3 versions 


Refining Deep Generative Models via Wasserstein Gradient Flows

AF Ansari, ML Ang, H Soh - arXiv preprint arXiv:2012.00780, 2020 - arxiv.org

Deep generative modeling has seen impressive advances in recent years, to the point

where it is now commonplace to see simulated samples (eg, images) that closely resemble

real-world data. However, generation quality is generally inconsistent for any given model …

  Related articles 

Refining Deep Generative Models via Wasserstein Gradient Flows

A Fatir Ansari, ML Ang, H Soh - arXiv e-prints, 2020 - ui.adsabs.harvard.edu

Deep generative modeling has seen impressive advances in recent years, to the point

where it is now commonplace to see simulated samples (eg, images) that closely resemble

real-world data. However, generation quality is generally inconsistent for any given model …



[PDF] aaai.org

Solving general elliptical mixture models through an approximate Wasserstein manifold

S Li, Z Yu, M Xiang, D Mandic - Proceedings of the AAAI Conference on …, 2020 - ojs.aaai.org

We address the estimation problem for general finite mixture models, with a particular focus

on the elliptical mixture models (EMMs). Compared to the widely adopted Kullback–Leibler

divergence, we show that the Wasserstein distance provides a more desirable optimisation …

  Cited by 2 Related articles All 4 versions 


[PDF] arxiv.org

Ranking IPCC Models Using the Wasserstein Distance

G Vissio, V Lembo, V Lucarini, M Ghil - arXiv preprint arXiv:2006.09304, 2020 - arxiv.org

We propose a methodology for evaluating the performance of climate models based on the

use of the Wasserstein distance. This distance provides a rigorous way to measure

quantitatively the difference between two probability distributions. The proposed approach is …

  Related articles All 5 versions 


2020


[PDF] arxiv.org

Wasserstein Generative Models for Patch-based Texture Synthesis

A Houdard, A Leclaire, N Papadakis… - arXiv preprint arXiv …, 2020 - arxiv.org

In this paper, we propose a framework to train a generative model for texture image

synthesis from a single example. To do so, we exploit the local representation of images via

the space of patches, that is, square sub-images of fixed size (eg $4\times 4$). Our main …

  Cited by 1 Related articles All 10 versions 


[PDF] arxiv.org

Wasserstein-based fairness interpretability framework for machine learning models

A Miroshnikov, K Kotsiopoulos, R Franks… - arXiv preprint arXiv …, 2020 - arxiv.org

In this article, we introduce a fairness interpretability framework for measuring and

explaining bias in classification and regression models at the level of a distribution. In our

work, motivated by the ideas of Dwork et al.(2012), we measure the model bias across sub …

  Related articles All 2 versions 


2020

Nonlinear model reduction on metric spaces. Application to one-dimensional conservative PDEs in Wasserstein spaces

V Ehrlacher, D Lombardi, O Mula… - … and Numerical Analysis, 2020 - search.proquest.com

We consider the problem of model reduction of parametrized PDEs where the goal is to

approximate any function belonging to the set of solutions at a reduced computational cost.

For this, the bottom line of most strategies has so far been based on the approximation of the …

  Related articles All 2 versions


[PDF] arxiv.org

Safe Zero-Shot Model-Based Learning and Control: A Wasserstein Distributionally Robust Approach

A Kandel, SJ Moura - arXiv preprint arXiv:2004.00759, 2020 - arxiv.org

This paper explores distributionally robust zero-shot model-based learning and control

using Wasserstein ambiguity sets. Conventional model-based reinforcement learning

algorithms struggle to guarantee feasibility throughout the online learning process. We …

  Related articles All 2 versions 


2020

Wasserstein Distance to Independence Models

T Özlüm Çelik, A Jamneshan, G Montúfar… - arXiv e …, 2020 - ui.adsabs.harvard.edu

An independence model for discrete random variables is a Segre-Veronese variety in a

probability simplex. Any metric on the set of joint states of the random variables induces a

Wasserstein metric on the probability simplex. The unit ball of this polyhedral norm is dual to …

 <——2020——2020———2240—— 



[PDF] ibpsa.org

[PDF] Comparing Bottom-Up Energy Consumption Models Using The Wasserstein Distance Between Load Profile Histograms

E Sanderson, A Fragaki, J Simo… - BSO-V 2020: IBPSA …, 2020 - ibpsa.org

This paper presents a comparison of bottom up models that generate appliance load

profiles. The comparison is based on their ability to accurately distribute load over time-of-

day. This is a key feature of model performance if the model is used to assess the impact of …

  Related articles All 2 versions 


[CITATION] Improving Wasserstein Generative Models for Image Synthesis and Enhancement

J Wu - 2020 - research-collection.ethz.ch



Wasserstein based two-stage distributionally robust optimization model for optimal operation of CCHP micro-grid under uncertainties

Y Wang, Y Yang, L Tang, W Sun, B Li - International Journal of Electrical …, 2020 - Elsevier

Combined cooling, heating and power (CCHP) micro-grids are getting increasing attentions

due to the realization of cleaner production and high energy efficiency. However, with the

features of complex tri-generation structure and renewable power uncertainties, it is …

  Cited by 16 Related articles All 2 versions


[PDF] arxiv.org

Wasserstein Index Generation Model: Automatic generation of time-series index with application to Economic Policy Uncertainty

F Xie - Economics Letters, 2020 - Elsevier

Automatic time-series index generation as a black-box method … Comparable results with existing

ones, tested on EPU … Applicable to any text corpus to produce sentiment indices … I propose

a novel method, the Wasserstein Index Generation model (WIG), to generate a public sentiment …

  Cited by 6 Related articles All 11 versions


A data-driven distributionally robust newsvendor model with a Wasserstein ambiguity set

S Lee, H Kim, I Moon - Journal of the Operational …, 2020 - orsociety.tandfonline.com

In this paper, we derive a closed-form solution and an explicit characterization of the worst-

case distribution for the data-driven distributionally robust newsvendor model with an

ambiguity set based on the Wasserstein distance of order p[1,∞). We also consider the …

  Cited by 4 Related articles All 2 versions
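For orientation, the distributionally robust entries in this list (the CCHP dispatch and newsvendor models above, and several more below) all optimize against a Wasserstein ball around the empirical distribution; schematically, and not as any single paper's exact formulation,

$$\min_{x}\ \sup_{Q\,:\ W_p(Q,\widehat{P}_N)\le\varepsilon}\ \mathbb{E}_{\xi\sim Q}\big[\ell(x,\xi)\big],$$

where $\widehat{P}_N$ is the empirical distribution of the $N$ data points, $\varepsilon$ is the radius of the Wasserstein ambiguity set, and $\ell$ is the decision loss.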


2020


[PDF] researchgate.net

Data supplement for a soft sensor using a new generative model based on a variational autoencoder and Wasserstein GAN

X Wang, H Liu - Journal of Process Control, 2020 - Elsevier

In industrial process control, measuring some variables is difficult for environmental or cost

reasons. This necessitates employing a soft sensor to predict these variables by using the

collected data from easily measured variables. The prediction accuracy and computational …

  Cited by 6 Related articles All 3 versions


[HTML] mdpi.com

Probability forecast combination via entropy regularized wasserstein distance

R Cumings-Menon, M Shin - Entropy, 2020 - mdpi.com

We propose probability and density forecast combination methods that are defined using the

entropy regularized Wasserstein distance. First, we provide a theoretical characterization of

the combined density forecast based on the regularized Wasserstein distance under the …

  Cited by 2 Related articles All 15 versions 
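The entropy-regularized Wasserstein distance used in the forecast-combination entry above is typically evaluated with Sinkhorn iterations. A minimal NumPy sketch on two histograms over a shared 1-D grid (grid, bandwidths, and regularization strength are made-up choices, and no convergence check is included):

```python
import numpy as np

def sinkhorn_cost(a, b, cost, eps=0.05, iters=500):
    # plain Sinkhorn iterations for entropy-regularized optimal transport
    K = np.exp(-cost / eps)
    u = np.ones_like(a)
    for _ in range(iters):
        v = b / (K.T @ u)
        u = a / (K @ v)
    plan = u[:, None] * K * v[None, :]
    return np.sum(plan * cost)

x = np.linspace(0.0, 1.0, 50)
a = np.exp(-(x - 0.3) ** 2 / 0.01); a /= a.sum()
b = np.exp(-(x - 0.7) ** 2 / 0.02); b /= b.sum()
cost = (x[:, None] - x[None, :]) ** 2
print(sinkhorn_cost(a, b, cost))
```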


 

W-LDMM: A wasserstein driven low-dimensional manifold model for noisy image restoration

R He, X Feng, W Wang, X Zhu, C Yang - Neurocomputing, 2020 - Elsevier

The Wasserstein distance originated from the optimal transport theory is a general and

flexible statistical metric in a variety of image processing problems. In this paper, we propose

a novel Wasserstein driven low-dimensional manifold model (W-LDMM), which tactfully …

  Cited by 3 Related articles All 2 versions


[PDF] ieee.org

Joint transfer of model knowledge and fairness over domains using wasserstein distance

T Yoon, J Lee, W Lee - IEEE Access, 2020 - ieeexplore.ieee.org

Owing to the increasing use of machine learning in our daily lives, the problem of fairness

has recently become an important topic in machine learning societies. Recent studies

regarding fairness in machine learning have been conducted to attempt to ensure statistical …

  Cited by 1 Related articles


[PDF] arxiv.org

Gromov–Hausdorff limit of Wasserstein spaces on point clouds

NG Trillos - Calculus of Variations and Partial Differential …, 2020 - Springer

We consider a point cloud $X_n := \{x_1, \ldots, x_n\}$ uniformly distributed on the

flat torus $\mathbb{T}^d := \mathbb{R}^d/\mathbb{Z}^d$, and construct a geometric graph on the cloud by

connecting points that are within distance $\varepsilon$ of each other. We let $P(X_n)$ be the …

  Cited by 12 Related articles All 4 versions

<——2020——2020———2250——  




[PDF] arxiv.org

Wasserstein statistics in 1D location-scale model

S Amari - arXiv preprint arXiv:2003.05479, 2020 - arxiv.org

Wasserstein geometry and information geometry are two important structures introduced in a

manifold of probability distributions. The former is defined by using the transportation cost

between two distributions, so it reflects the metric structure of the base manifold on which …

  Cited by 1 Related articles All 2 versions 


[PDF] imstat.org

A Central Limit Theorem for Wasserstein type distances between two distinct univariate distributions

P Berthet, JC Fort, T Klein - Annales de l'Institut Henri Poincaré …, 2020 - projecteuclid.org

In this article we study the natural nonparametric estimator of a Wasserstein type cost

between two distinct continuous distributions $ F $ and $ G $ on $\mathbb {R} $. The

estimator is based on the order statistics of a sample having marginals $ F $, $ G $ and any …

  Related articles All 4 versions


[HTML] aimsciences.org

[HTML] RWRM: Residual Wasserstein regularization model for image restoration

R He, X Feng, X Zhu, H Huang… - Inverse Problems & …, 2020 - aimsciences.org

Existing image restoration methods mostly make full use of various image prior information.

However, they rarely exploit the potential of residual histograms, especially their role as

ensemble regularization constraint. In this paper, we propose a residual Wasserstein  …

  Related articles All 2 versions 


[PDF] arxiv.org

Pruned Wasserstein Index Generation Model and wigpy Package

F Xie - arXiv preprint arXiv:2004.00999, 2020 - arxiv.org

Recent proposal of Wasserstein Index Generation model (WIG) has shown a new direction

for automatically generating indices. However, it is challenging in practice to fit large

datasets for two reasons. First, the Sinkhorn distance is notoriously expensive to compute …

  Related articles All 8 versions 


[PDF] researchgate.net

[PDF] Ranking IPCC Model Performance Using the Wasserstein Distance

G Vissio, V Lembo, V Lucarini… - arXiv preprint arXiv …, 2020 - researchgate.net

We propose a methodology for intercomparing climate models and evaluating their

performance against benchmarks based on the use of the Wasserstein distance (WD). This

distance provides a rigorous way to measure quantitatively the difference between two …

  Related articles 


2020


2020

Nonlinear model reduction on metric spaces. Application to one-dimensional conservative PDEs in Wasserstein spaces

V Ehrlacher, D Lombardi, O Mula… - … and Numerical Analysis, 2020 - search.proquest.com

We consider the problem of model reduction of parametrized PDEs where the goal is to

approximate any function belonging to the set of solutions at a reduced computational cost.

For this, the bottom line of most strategies has so far been based on the approximation of the …

  Related articles All 2 versions


[PDF] arxiv.org

Interpretable Model Summaries Using the Wasserstein Distance

E Dunipace, L Trippa - arXiv preprint arXiv:2012.09999, 2020 - arxiv.org

In the current computing age, models can have hundreds or even thousands of parameters;

however, such large models decrease the ability to interpret and communicate individual

parameters. Reducing the dimensionality of the parameter space in the estimation phase is …

  Related articles All 2 versions 


[PDF] arxiv.org

Central limit theorems for Markov chains based on their convergence rates in Wasserstein distance

R Jin, A Tan - arXiv preprint arXiv:2002.09427, 2020 - arxiv.org

Many tools are available to bound the convergence rate of Markov chains in total variation

(TV) distance. Such results can be used to establish central limit theorems (CLT) that enable

error evaluations of Monte Carlo estimates in practice. However, convergence analysis …

  Related articles All 2 versions 


[PDF] arxiv.org

Safe Zero-Shot Model-Based Learning and Control: A Wasserstein Distributionally Robust Approach

A Kandel, SJ Moura - arXiv preprint arXiv:2004.00759, 2020 - arxiv.org

This paper explores distributionally robust zero-shot model-based learning and control

using Wasserstein ambiguity sets. Conventional model-based reinforcement learning

algorithms struggle to guarantee feasibility throughout the online learning process. We …

  Related articles All 2 versions 


Exponential contraction in Wasserstein distances for diffusion semigroups with negative curvature

FY Wang - Potential Analysis, 2020 - Springer

Let P t be the (Neumann) diffusion semigroup P t generated by a weighted Laplacian on a

complete connected Riemannian manifold M without boundary or with a convex boundary. It

is well known that the Bakry-Emery curvature is bounded below by a positive constant> 0 …

  Cited by 5 Related articles

<——2020——2020———2260——  



A Novel Data-to-Text Generation Model with Transformer Planning and a Wasserstein Auto-Encoder

X Xu, T He, H Wang - 2020 IEEE International Conference on …, 2020 - ieeexplore.ieee.org

Existing methods for data-to-text generation have difficulty producing diverse texts with low

duplication rates. In this paper, we propose a novel data-to-text generation model with

Transformer planning and a Wasserstein auto-encoder, which can convert constructed data …

  Related articles All 2 versions


[PDF] arxiv.org

On the Wasserstein distance for a martingale central limit theorem

X Fan, X Ma - Statistics & Probability Letters, 2020 - Elsevier

… Following Bolthausen again, Mourrat (2013) has extended the term $\min\{\|V_n^2-1\|_\infty^{1/2},\,\|V_n^2-1\|_1^{1/3}\}$

of (2) to the more general term $\big(\|V_n^2-1\|_p^p + s_n^{-2p}\big)^{1/(2p+1)}$, for $p \ge 1$.

Recently, with the methods of Grama and Haeusler (2000) (see also Fan et …

  Related articles All 8 versions


2020

[PDF] udl.cat

[PDF] Cálculo privado de la distancia de Wasserstein (Earth Mover)

A Blanco-Justicia, J Domingo-Ferrer - recsi2020.udl.cat

The Wasserstein distance, better known in English as the Earth Mover's Distance (EMD),

is a distance measure between two probability distributions. The EMD is widely used

for comparing images and documents, and it forms part of models of …

[Spanish: Private computation of the Wasserstein distance (Earth Mover)]


A Generative Model for Zero-Shot Learning via Wasserstein Auto-encoder

X Luo, Z Cai, F Wu, J Xiao-Yuan - 2020 IEEE International …, 2020 - ieeexplore.ieee.org

Zero-shot learning aims to use the labeled instances to train the model, and then classifies

the instances that belong to a class without labeled instances. However, the training

instances and test instances are disjoint. Thus, the description of the classes (eg text …

  Related articles


Learning Wasserstein Distance-Based Gaussian Graphical Model for Multivariate Time Series Classification

HU Xuegang, L Jianxing, LI Peipei… - 2020 IEEE …, 2020 - ieeexplore.ieee.org

Multivariate time series classification occupies an important position in time series data

mining tasks and has been applied in many fields. However, due to the statistical coupling

between different variables of Multivariate Time Series (MTS) data, traditional classification …

  Related articles All 2 versions


2020

 2020 [PDF] googleapis.com

Wasserstein barycenter model ensembling

Y Mroueh, PL Dognin, I Melnyk, J Ross… - US Patent App. 16 …, 2020 - Google Patents

A method, system and apparatus of ensembling, including inputting a set of models that

predict different sets of attributes, determining a source set of attributes and a target set of

attributes using a barycenter with an optimal transport metric, and determining a consensus …

  All 2 versions 


Sgd learns one-layer networks in wgans

Q Lei, J Lee, A Dimakis… - … Conference on Machine …, 2020 - proceedings.mlr.press

Generative adversarial networks (GANs) are a widely used framework for learning

generative models. Wasserstein GANs (WGANs), one of the most successful variants of

GANs, require solving a minmax optimization problem to global optimality, but are in practice …

Cited by 31 Related articles All 10 versions 
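For reference, WGAN-type objectives such as those in the entries above approximate the Kantorovich–Rubinstein dual form of the Wasserstein-1 distance (a standard identity, stated here only for orientation):

$$W_1(P,Q)=\sup_{\|f\|_{\mathrm{Lip}}\le 1}\ \mathbb{E}_{x\sim P}[f(x)]-\mathbb{E}_{y\sim Q}[f(y)],$$

with the critic network playing the role of the 1-Lipschitz function $f$.
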

CWGAN: A Graph Vector Based Traffic Missing Data Adversarial Generation Approach

M Kang, Y Yang, D Chen, W Yu - 2020 Chinese Automation …, 2020 - ieeexplore.ieee.org

Traffic speed prediction is an important and basic application in intelligent transportation

system. But due to the equipment failure, the sampled time-series data are often corrupted,

which induces data missing problems and creates difficulties in traffic speed prediction. In …

Cited by 1 Related articles

Pengfei Jiang - dblp

dblp.org › Persons

Dec 2, 2020 — 基于深度森林与CWGAN-GP的移动应用网络行为分类与评估 ... Network Behavior Based on Deep Forest and CWGAN-GP). ... CISS 2020: 1-6.


[PDF] jsjkx.com

[PDF] 基于深度森林与 CWGAN-GP 的移动应用网络行为分类与评估

蒋鹏飞, 魏松杰 - 计算机科学 - jsjkx.com

Abstract: Addressing the problems that mobile applications are currently huge in number, complex in functionality, and mixed with all kinds of malicious applications,

this work analyses application network behavior on the Android platform, designs appropriate network-behavior trigger events for different categories of applications

to simulate network interaction, proposes network event behavior sequences, and uses an improved deep forest model to classify applications …

  Cited by 1 Related articles All 2 versions 

[Chinese: Mobile application network behavior classification and evaluation based on deep forest and CWGAN-GP]

<——2020——2020———2270—— 


基于深度森林与CWGAN-GP的移动应用网络行为分类与评估

www.jsjkx.com › jsjkx.181102118

by 蒋鹏飞 · Cited by 1 — Classification and Evaluation of Mobile Application Network Behavior Based on Deep Forest and CWGAN-GP. JIANG Peng-fei, WEI Song-jie.

[CITATION] 基于深度森林与CWGAN-GP的移动应用网络行为分类与评估 (Classification and Evaluation of Mobile Application Network Behavior Based on Deep Forest and CWGAN-GP)

P Jiang, S Wei - 计算机科学 (Computer Science), 2020


2020 see 2019  [PDF] mlr.press

Sgd learns one-layer networks in wgans

Q Lei, J Lee, A Dimakis… - … Conference on Machine …, 2020 - proceedings.mlr.press

Generative adversarial networks (GANs) are a widely used framework for learning generative 

models. Wasserstein GANs (WGANs), one of the most successful variants of GANs, require …

Cited by 30 Related articles All 11 versions
SGD Learns One-Layer Networks in WGANs

334 views  Streamed live on Dec 16, 2020  Qi Lei (Princeton University)…

Simons Institute

Dec 12, 2020


p-Wasserstein and flux-limited diffusion equations. (English) Zbl 07326889

Commun. Pure Appl. Anal. 19, No. 9, 4227-4256 (2020).

MSC:  35K30 35Q99 65M12 35B40



On Optimal Control of Discrete-time Systems with Wasserstein Terminal Cost

https://www.jstage.jst.go.jp › article › jacc › _article › -char

... 22, 2020. The 63rd Japan Joint Automatic Control Conference.

 OPEN ACCESS

K Hoshino - Proceedings of the Japan Joint Automatic Control Conference, 2020

 

Студент ПМИ нашел оптимальный алгоритм решения задачи поиска барицентра Вассерштейна

30 сентября 2020 г.

[Russian: A student of the Applied Mathematics and Informatics programme at HSE found an optimal algorithm for the Wasserstein barycenter search problem. Sep 30, 2020; student Dan Tyapkin]


2020



Wasserstein learning of deep generative point process models

S Xiao, M Farajtabar, X Ye, J Yan, L Song… - arXiv preprint arXiv …, 2017 - arxiv.org

Point processes are becoming very popular in modeling asynchronous sequential data due

to their sound mathematical foundation and strength in modeling a variety of real-world

phenomena. Currently, they are often characterized via intensity function which limits …

  Cited by 95 Related articles All 10 versions 

[CITATION] Le Song, and Hongyuan Zha. 2017.“Wasserstein Learning of Deep Generative Point Process Models”

S Xiao, M Farajtabar, X Ye, Y Junchi - Proceedings of the 31st International Conference on …

  Cited by 3 Related articles



[PDF] mlr.press

A fast proximal point method for computing exact wasserstein distance

Y Xie, X Wang, R Wang, H Zha - Uncertainty in Artificial …, 2020 - proceedings.mlr.press

Wasserstein distance plays increasingly important roles in machine learning, stochastic

programming and image processing. Major efforts have been under way to address its high

computational complexity, some leading to approximate or regularized variations such as …

  Cited by 54 Related articles All 5 versions 


[PDF] arxiv.org

Generalizing point embeddings using the wasserstein space of elliptical distributions

B Muzellec, M Cuturi - arXiv preprint arXiv:1805.07594, 2018 - arxiv.org

Embedding complex objects as vectors in low dimensional spaces is a longstanding

problem in machine learning. We propose in this work an extension of that approach, which

consists in embedding objects as elliptical probability distributions, namely distributions …

  Cited by 48 Related articles All 7 versions 


[HTML] sciencedirect.com

[HTML] A fixed-point approach to barycenters in Wasserstein space

PC Álvarez-Esteban, E Del Barrio… - Journal of Mathematical …, 2016 - Elsevier

Let P 2, ac be the set of Borel probabilities on R d with finite second moment and absolutely

continuous with respect to Lebesgue measure. We consider the problem of finding the

barycenter (or Fréchet mean) of a finite set of probabilities ν 1,…, ν k P 2, ac with respect to …

  Cited by 83 Related articles All 6 versions
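For centered Gaussian inputs, the barycenter problem in the entry above admits a fixed-point iteration on covariance matrices. The sketch below is written in that spirit; it assumes nondegenerate covariances, runs a fixed number of iterations instead of testing convergence, and may differ in details from the paper:

```python
import numpy as np
from scipy.linalg import sqrtm

def gaussian_barycenter_cov(covs, weights, iters=50):
    # iterate S <- S^{-1/2} ( sum_i w_i (S^{1/2} C_i S^{1/2})^{1/2} )^2 S^{-1/2}
    S = np.eye(covs[0].shape[0])
    for _ in range(iters):
        root = np.real(sqrtm(S))
        inv_root = np.linalg.inv(root)
        T = sum(w * np.real(sqrtm(root @ C @ root)) for w, C in zip(weights, covs))
        S = inv_root @ (T @ T) @ inv_root
    return S

covs = [np.diag([1.0, 0.2]), np.diag([0.3, 2.0])]
print(gaussian_barycenter_cov(covs, [0.5, 0.5]))
```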


2020

[PDF] arxiv.org

Interior-point methods strike back: Solving the wasserstein barycenter problem

D Ge, H Wang, Z Xiong, Y Ye - arXiv preprint arXiv:1905.12895, 2019 - arxiv.org

Computing the Wasserstein barycenter of a set of probability measures under the optimal

transport metric can quickly become prohibitive for traditional second-order algorithms, such

as interior-point methods, as the support size of the measures increases. In this paper, we …

  Cited by 11 Related articles All 3 versions 

<——2020——2020———2280——  



[PDF] arxiv.org

Stochastic saddle-point optimization for wasserstein barycenters

D Tiapkin, A Gasnikov, P Dvurechensky - arXiv preprint arXiv:2006.06763, 2020 - arxiv.org

We study the computation of non-regularized Wasserstein barycenters of probability

measures supported on the finite set. The first result gives a stochastic optimization

algorithm for the discrete distribution over the probability measures which is comparable …

  Cited by 2 Related articles All 3 versions 


[PDF] arxiv.org

Gromov–Hausdorff limit of Wasserstein spaces on point clouds

NG Trillos - Calculus of Variations and Partial Differential …, 2020 - Springer

We consider a point cloud $X_n := \{x_1, \ldots, x_n\}$ uniformly distributed on the

flat torus $\mathbb{T}^d := \mathbb{R}^d/\mathbb{Z}^d$, and construct a geometric graph on the cloud by

connecting points that are within distance $\varepsilon$ of each other. We let $P(X_n)$ be the …

  Cited by 12 Related articles All 4 versions


Learning Wasserstein Isometric Embedding for Point Clouds

K Kawano, S Koide, T Kutsuna - 2020 International Conference …, 2020 - ieeexplore.ieee.org

The Wasserstein distance has been employed for determining the distance between point

clouds, which have variable numbers of points and invariance of point order. However, the

high computational cost associated with the Wasserstein distance hinders its practical …

  All 2 versions


Multiscale Nonrigid Point Cloud Registration Using Rotation-Invariant Sliced-Wasserstein Distance via Laplace--Beltrami Eigenmap

R Lai, H Zhao - SIAM Journal on Imaging Sciences, 2017 - SIAM

In this work, we propose computational models and algorithms for point cloud registration

with nonrigid transformation. First, point clouds sampled from manifolds originally embedded

in some Euclidean space are transformed to new point clouds embedded in R^n by the …

  Cited by 16 Related articles
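As a rough illustration of the sliced-Wasserstein idea used for point clouds in the entry above (random 1-D projections followed by sorting), here is a minimal NumPy sketch for two clouds with the same number of points; the rotation-invariant construction and the registration pipeline of the paper are not reproduced:

```python
import numpy as np

def sliced_w2(X, Y, n_proj=200, seed=0):
    # Monte-Carlo estimate of the sliced 2-Wasserstein distance
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=X.shape[1])
        theta /= np.linalg.norm(theta)
        px, py = np.sort(X @ theta), np.sort(Y @ theta)
        total += np.mean((px - py) ** 2)
    return np.sqrt(total / n_proj)

rng = np.random.default_rng(1)
X = rng.normal(size=(256, 3))
Y = rng.normal(size=(256, 3)) + np.array([1.0, 0.0, 0.0])
print(sliced_w2(X, Y))
```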


[PDF] arxiv.org

Wasserstein Learning of Determinantal Point Processes

L Anquetil, M Gartrell, A Rakotomamonjy… - arXiv preprint arXiv …, 2020 - arxiv.org

Determinantal point processes (DPPs) have received significant attention as an elegant

probabilistic model for discrete subset selection. Most prior work on DPP learning focuses

on maximum likelihood estimation (MLE). While efficient and scalable, MLE approaches do …

  Related articles All 4 versions 


2010


Multi-scale non-rigid point cloud registration using robust sliced-Wasserstein distance via Laplace-Beltrami eigenmap

R Lai, H Zhao - arXiv preprint arXiv:1406.3758, 2014 - arxiv.org

In this work, we propose computational models and algorithms for point cloud registration

with non-rigid transformation. First, point clouds sampled from manifolds originally

embedded in some Euclidean space $\mathbb {R}^ D $ are transformed to new point clouds …

  Cited by 10 Related articles All 12 versions 


2020

[PDF] researchgate.net

[PDF] RaspBary: Hawkes Point Process Wasserstein Barycenters as a Service

R Hosler, X Liu, J Carter, M Saper - 2019 - researchgate.net

We introduce an API for forecasting the intensity of spacetime events in urban environments

and spatially allocating vehicles during times of peak demand to minimize response time.

Our service is applicable to dynamic resource allocation problems that arise in ride sharing …

  Cited by 2 Related articles 


[HTML] sciencedirect.com

[HTML] Wasserstein metric convergence method for Fokker–Planck equations with point controls

L Petrelli, AJ Kearsley - Applied mathematics letters, 2009 - Elsevier

Monge–Kantorovich mass transfer theory is employed to obtain an existence and

uniqueness result for solutions to Fokker–Planck Equations with time dependent point

control. Existence for an approximate problem is established together with a convergence …

  Cited by 1 Related articles All 9 versions


[PDF] psu.edu

[PDF] Notes on a Wasserstein metric convergence method for Fokker-Planck equations with point controls

L Petrelli - 2004 - Citeseer

Abstract We employ the Monge-Kantorovich mass transfer theory to obtain an existence and

uniqueness result for Fokker-Planck Equations with time dependent point control. We prove

existence for an approximate problem and then show convergence in the Wasserstein  …

  Cited by 1 Related articles All 7 versions 


A Fast Proximal Point Method for Computing Exact Wasserstein Distance

proceedings.mlr.press › ...

by Y Xie · 2020 · Cited by 54 — A Fast Proximal Point Method for Computing Exact Wasserstein Distance. Yujia Xie, Xiangfeng Wang, Ruijia Wang, Hongyuan Zha. Wasserstein distance ...

[CITATION] A fast proximal point method for computing wasserstein distance

Y Xie, X Wang, R Wang, H Zha - arXiv preprint arXiv:1802.04307, 2018

Cited by 94 Related articles All 6 versions 

<——2020——2020———2290—— 


Transport and Interface: an Uncertainty Principle for the Wasserstein distance

A Sagiv, S Steinerberger - SIAM Journal on Mathematical Analysis, 2020 - SIAM

Let $f:(0,1)^d \to \mathbb{R}$ be a continuous function with zero mean and interpret $f_+=\max(f,0)$ and

$f_-=-\min(f,0)$ as the densities of two measures. We prove that if the cost of transport from $f_+$ to

$f_-$ is small, in terms of the Wasserstein distance $W_1(f_+,f_-)$, then the Hausdorff measure of …

  Cited by 4 Related articles All 3 versions


[PDF] arxiv.org

Entropic-Wasserstein barycenters: PDE characterization, regularity and CLT

G Carlier, K Eichinger, A Kroshnin - arXiv preprint arXiv:2012.10701, 2020 - arxiv.org

In this paper, we investigate properties of entropy-penalized Wasserstein barycenters

introduced by Bigot, Cazelles and Papadakis (2019) as a regularization of Wasserstein

barycenters first presented by Agueh and Carlier (2011). After characterizing these …

  Related articles All 5 versions 


2020

Input limited Wasserstein GAN

F Cao, H Zhao, P Liu, P Li - Second Target Recognition and …, 2020 - spiedigitallibrary.org

Generative adversarial networks (GANs) has proven hugely successful, but suffer from train

instability. The recently proposed Wasserstein GAN (WGAN) has largely overcome the

problem, but can still fail to converge in some case or be to complex. It has been found that …

  Related articles All 2 versions


[PDF] tum.de

Lp-Wasserstein and flux-limited gradient flows: Entropic discretization, convergence analysis and numerics

B Söllner - 2020 - mediatum.ub.tum.de

We analyse different discretizations of gradient flows in transport metrics with non-quadratic

costs. Among others we discuss the p-Laplace equation and evolution equations with flux-

limitation. We prove comparison principles, free energy monotony, non-negativity and mass …

  Related articles All 3 versions 


Wasserstein smoothing: Certified robustness against wasserstein adversarial attacks

A Levine, S Feizi - International Conference on Artificial …, 2020 - proceedings.mlr.press

In the last couple of years, several adversarial attack methods based on different threat

models have been proposed for the image classification problem. Most existing defenses

consider additive threat models in which sample perturbations have bounded L_p norms …

  Cited by 15 Related articles All 5 versions 


2020

[PDF] arxiv.org

Fisher information regularization schemes for Wasserstein gradient flows

W Li, J Lu, L Wang - Journal of Computational Physics, 2020 - Elsevier

We propose a variational scheme for computing Wasserstein gradient flows. The scheme

builds upon the Jordan–Kinderlehrer–Otto framework with the Benamou-Brenier's dynamic

formulation of the quadratic Wasserstein metric and adds a regularization by the Fisher …

  Cited by 10 Related articles All 10 versions


[PDF] arxiv.org

Wasserstein autoencoders for collaborative filtering

X Zhang, J Zhong, K Liu - Neural Computing and Applications, 2020 - Springer

The recommender systems have long been studied in the literature. The collaborative

filtering is one of the most widely adopted recommendation techniques which is usually

applied to the explicit data, eg, rating scores. However, the implicit data, eg, click data, is …

  Cited by 10 Related articles All 3 versions


 

[PDF] arxiv.org

Regularization helps with mitigating poisoning attacks: Distributionally-robust machine learning using the wasserstein distance

F Farokhi - arXiv preprint arXiv:2001.10655, 2020 - arxiv.org

We use distributionally-robust optimization for machine learning to mitigate the effect of data

poisoning attacks. We provide performance guarantees for the trained model on the original

data (not including the poison records) by training the model for the worst-case distribution …

  Cited by 5 Related articles All 3 versions 


[HTML] aimsciences.org

[HTML] RWRM: Residual Wasserstein regularization model for image restoration

R He, X Feng, X Zhu, H Huang… - Inverse Problems & …, 2020 - aimsciences.org

Existing image restoration methods mostly make full use of various image prior information.

However, they rarely exploit the potential of residual histograms, especially their role as

ensemble regularization constraint. In this paper, we propose a residual Wasserstein  …

  Related articles All 2 versions 


[PDF] arxiv.org

Entropic regularization of Wasserstein distance between infinite-dimensional Gaussian measures and Gaussian processes

MH Quang - arXiv preprint arXiv:2011.07489, 2020 - arxiv.org

This work studies the entropic regularization formulation of the 2-Wasserstein distance on an

infinite-dimensional Hilbert space, in particular for the Gaussian setting. We first present the

Minimum Mutual Information property, namely the joint measures of two Gaussian measures …

  Cited by 2 Related articles All 2 versions 

<——2020——2020———2300——



[PDF] arxiv.org

Infinite-dimensional regularization of McKean-Vlasov equation with a Wasserstein diffusion

V Marx - arXiv preprint arXiv:2002.10157, 2020 - arxiv.org

Much effort has been spent in recent years on restoring uniqueness of McKean-Vlasov

SDEs with non-smooth coefficients. As a typical instance, the velocity field is assumed to be

bounded and measurable in its space variable and Lipschitz-continuous with respect to the …

  Cited by 2 Related articles All 9 versions 


[PDF] aaai.org

Improving the Robustness of Wasserstein Embedding by Adversarial PAC-Bayesian Learning

D Ding, M Zhang, X Pan, M Yang, X He - Proceedings of the AAAI …, 2020 - ojs.aaai.org

Node embedding is a crucial task in graph analysis. Recently, several methods are

proposed to embed a node as a distribution rather than a vector to capture more information.

Although these methods achieved noticeable improvements, their extra complexity brings …

  Related articles All 3 versions 


[PDF] arxiv.org

Wasserstein Collaborative Filtering for Item Cold-start Recommendation

Y Meng, X Yan, W Liu, H Wu, J Cheng - … of the 28th ACM Conference on …, 2020 - dl.acm.org

Item cold-start recommendation, which predicts user preference on new items that have no

user interaction records, is an important problem in recommender systems. In this paper, we

model the disparity between user preferences on warm items (those having interaction …

  Cited by 2 Related articles All 4 versions



[HTML] hindawi.com

[HTML] An Integrated Consensus Improving Strategy Based on PL-Wasserstein Distance and Its Application in the Evaluation of Network Public Opinion Emergencies

S Zhang, Z Ma, X Liu, Z Wang, L Jiang - Complexity, 2020 - hindawi.com

In real life, multiple network public opinion emergencies may break out in a certain place at

the same time. So, it is necessary to invite emergency decision experts in multiple fields for

timely evaluating the comprehensive crisis of the online public opinion, and then limited …

  Related articles All 7 versions 


[PDF] arxiv.org

Adversarial Classification via Distributional Robustness with Wasserstein Ambiguity

N Ho-Nguyen, SJ Wright - arXiv preprint arXiv:2005.13815, 2020 - arxiv.org

We study a model for adversarial classification based on distributionally robust chance

constraints. We show that under Wasserstein ambiguity, the model aims to minimize the

conditional value-at-risk of the distance to misclassification, and we explore links to previous …

  Cited by 1 Related articles All 3 versions 


2020


[PDF] arxiv.org

Improving Relational Regularized Autoencoders with Spherical Sliced Fused Gromov Wasserstein

K Nguyen, S Nguyen, N Ho, T Pham, H Bui - arXiv preprint arXiv …, 2020 - arxiv.org

Relational regularized autoencoder (RAE) is a framework to learn the distribution of data by

minimizing a reconstruction loss together with a relational regularization on the latent space.

A recent attempt to reduce the inner discrepancy between the prior and aggregated …

  Cited by 2 Related articles All 3 versions 


[PDF] iop.org

A collaborative filtering recommendation framework based on Wasserstein GAN

R Li, F Qian, X Du, S Zhao… - Journal of Physics …, 2020 - iopscience.iop.org

Compared with the original GAN, Wasserstein GAN minimizes the Wasserstein Distance

between the generative distribution and the real distribution, can well capture the potential

distribution of data and has achieved excellent results in image generation. However, the …

  Related articles


 

[PDF] uwaterloo.ca

Wasserstein Adversarial Robustness

K Wu - 2020 - uwspace.uwaterloo.ca

Deep models, while being extremely flexible and accurate, are surprisingly vulnerable

to``small, imperceptible''perturbations known as adversarial attacks. While the majority of

existing attacks focus on measuring perturbations under the $\ell_p $ metric, Wasserstein  …

  Related articles 


Improving EEG-based motor imagery classification with conditional Wasserstein GAN

Z Li, Y Yu - 2020 International Conference on Image, Video …, 2020 - spiedigitallibrary.org

Deep learning based algorithms have made huge progress in the field of image

classification and speech recognition. There is an increasing number of researchers

beginning to use deep learning to process electroencephalographic (EEG) brain signals …

Related articles All 3 versions


[CITATION] Improving Wasserstein Generative Models for Image Synthesis and Enhancement

J Wu - 2020 - research-collection.ethz.ch


<——2020——2020———2310—— 


[PDF] Faster Wasserstein Distance Estimation with the Sinkhorn Divergence

L Chizat, P Roussillon, F Léger… - Advances in Neural …, 2020 - proceedings.neurips.cc

The squared Wasserstein distance is a natural quantity to compare probability distributions

in a non-parametric setting. This quantity is usually estimated with the plug-in estimator,

defined via a discrete optimal transport problem which can be solved to $\epsilon …

  Cited by 8 Related articles All 7 versions 


[PDF] arxiv.org

Visual transfer for reinforcement learning via wasserstein domain confusion

J Roy, G Konidaris - arXiv preprint arXiv:2006.03465, 2020 - arxiv.org

We introduce Wasserstein Adversarial Proximal Policy Optimization (WAPPO), a novel

algorithm for visual transfer in Reinforcement Learning that explicitly learns to align the

distributions of extracted features between a source and target task. WAPPO approximates …

  Cited by 3 Related articles All 6 versions 


[PDF] arxiv.org

SVGD as a kernelized Wasserstein gradient flow of the chi-squared divergence

S Chewi, TL Gouic, C Lu, T Maunu… - arXiv preprint arXiv …, 2020 - arxiv.org

Stein Variational Gradient Descent (SVGD), a popular sampling algorithm, is often described

as the kernelized gradient flow for the Kullback-Leibler divergence in the geometry of

optimal transport. We introduce a new perspective on SVGD that instead views SVGD as the …

  Cited by 3 Related articles All 5 versions 


[PDF] bciml.cn

Wasserstein distance based deep adversarial transfer learning for intelligent fault diagnosis with unlabeled or insufficient labeled data

C Cheng, B Zhou, G Ma, D Wu, Y Yuan - Neurocomputing, 2020 - Elsevier

Intelligent fault diagnosis is one critical topic of maintenance solution for mechanical

systems. Deep learning models, such as convolutional neural networks (CNNs), have been

successfully applied to fault diagnosis tasks and achieved promising results. However, one …

  Cited by 10 Related articles All 3 versions


Wasserstein upper bounds of the total variation for smooth densities

M Chae, SG Walker - Statistics & Probability Letters, 2020 - Elsevier

The total variation distance between probability measures cannot be bounded by the

Wasserstein metric in general. If we consider sufficiently smooth probability densities,

however, it is possible to bound the total variation by a power of the Wasserstein distance …

  Cited by 3 Related articles All 5 versions

Wasserstein distance based deep multi-feature adversarial transfer diagnosis approach under variable working conditions


2020


[PDF] ieee.org

Joint transfer of model knowledge and fairness over domains using wasserstein distance

T Yoon, J Lee, W Lee - IEEE Access, 2020 - ieeexplore.ieee.org

Owing to the increasing use of machine learning in our daily lives, the problem of fairness

has recently become an important topic in machine learning societies. Recent studies

regarding fairness in machine learning have been conducted to attempt to ensure statistical …

  Cited by 1 Related articles

 

Wasserstein based transfer network for cross-domain sentiment classification

Y Du, M He, L Wang, H Zhang - Knowledge-Based Systems, 2020 - Elsevier

Automatic sentiment analysis of social media texts is of great significance for identifying

people's opinions that can help people make better decisions. Annotating data is time

consuming and laborious, and effective sentiment analysis on domains lacking of labeled …

  Cited by 2 Related articles All 2 versions


Remote Sensing Image Segmentation based on Generative Adversarial Network with Wasserstein divergence

X Cao, C Song, J Zhang, C Liu - 2020 3rd International Conference on …, 2020 - dl.acm.org

In the image segmentation fields, traditional methods can be classified into four main

categories: threshold-based (eg Otsu [1]),. edge-based (eg Canny [2], Hough transform [3]),

region-based (eg Super pixel [4]), and energy functional-based segmentation methods (eg …

 


[PDF] researchgate.net

[PDF] THE α-z-BURES WASSERSTEIN DIVERGENCE

T Hoa Dinh, CT Le, BK Vo, TD Vuong - researchgate.net

Φ(A, B) = Tr((1−α)A + αB) − Tr(Q_{α,z}(A, B)), where Q_{α,z}(A, B) = (A^{(1−α)/(2z)} B^{α/z} A^{(1−α)/(2z)})^z

is the matrix function in the α-z-Rényi relative entropy. We show that for 0 ≤ α ≤ z ≤ 1, the

quantity Φ(A, B) is a quantum divergence and satisfies the Data Processing Inequality in …

  Related articles 


[PDF] sciencedirect.com

Intelligent Fault Diagnosis with a Deep Transfer Network based on Wasserstein Distance

J Xu, J Huang, Y Zhao, L Zhou - Procedia Computer Science, 2020 - Elsevier

Intelligent fault-diagnosis methods based on deep-learning technology have been very

successful for complex industrial systems. The deep learning based fault classification

model requires a large number of labeled data. Moreover, the probability distribution of …

  Related articles

<——2020——2020———2320——   



Isometric study of Wasserstein spaces–the real line

G Gehér, T Titkos, D Virosztek - Transactions of the American Mathematical …, 2020 - ams.org

Recently Kloeckner described the structure of the isometry group of the quadratic

Wasserstein space $\mathcal{W}_2(\mathbb{R}^n)$. It turned out that the case of the real

line is exceptional in the sense that there exists an exotic isometry flow. Following this line of …

  Cited by 3 Related articles All 8 versions

 

[PDF] arxiv.org

material decomposition method for dual-energy CT via dual interactive Wasserstein generative adversarial networks

Z Shi, H Li, Q Cao, Z Wang, M Cheng - arXiv preprint arXiv:2007.11247, 2020 - arxiv.org

Dual-energy computed tomography has great potential in material characterization and

identification, whereas the reconstructed material-specific images always suffer from

magnified noise and beam hardening artifacts. In this study, a data-driven approach using …

  Related articles All 3 versions 



[PDF] arxiv.org

Posterior asymptotics in Wasserstein metrics on the real line

M Chae, P De Blasi, SG Walker - arXiv preprint arXiv:2003.05599, 2020 - arxiv.org

In this paper, we use the class of Wasserstein metrics to study asymptotic properties of

posterior distributions. Our first goal is to provide sufficient conditions for posterior

consistency. In addition to the well-known Schwartz's Kullback--Leibler condition on the …

  Related articles All 2 versions 


[PDF] arxiv.org

Wasserstein statistics in one-dimensional location-scale model

S Amari, T Matsuda - arXiv preprint arXiv:2007.11401, 2020 - arxiv.org

Wasserstein geometry and information geometry are two important structures to be

introduced in a manifold of probability distributions. Wasserstein geometry is defined by

using the transportation cost between two distributions, so it reflects the metric of the base …

  Cited by 1 Related articles All 2 versions 


[PDF] arxiv.org

Approximation rate in Wasserstein distance of probability measures on the real line by deterministic empirical measures

O Bencheikh, B Jourdain - arXiv preprint arXiv:2012.09729, 2020 - arxiv.org

We are interested in the approximation in Wasserstein distance with index $\rho\ge 1$ of a

probability measure $\mu $ on the real line with finite moment of order $\rho $ by the

empirical measure of $ N $ deterministic points. The minimal error converges to $0 $ as …

  Related articles All 3 versions 


2020


[PDF] optimization-online.org

[PDF] Dual Decomposition of Two-Stage Distributionally Robust Mixed-Integer Programming under the Wasserstein Ambiguity Set

K Kim - optimization-online.org

We develop a dual decomposition of two-stage distributionally robust mixed-integer

programming (DRMIP) under the Wasserstein ambiguity set. The dual decomposition is

based on the Lagrangian dual of DRMIP, which results from the Lagrangian relaxation of the …

  Related articles All 2 versions 


[PDF] researchgate.net

[PDF] ADDENDUM TO "ISOMETRIC STUDY OF WASSERSTEIN SPACES – THE REAL LINE"

G Pál Gehér, T Titkos, D Virosztek - researchgate.net

We show an example of a Polish metric space X whose quadratic Wasserstein space W2 (X)

possesses an isometry that splits mass. This gives an affirmative answer to Kloeckner's

question,[2, Question 2]. Let us denote the metric space ([0, 1],|·|), equipped with the usual …

  Related articles 


Isometric study of Wasserstein spaces---the real line

G Pál Gehér, T Titkos, D Virosztek - arXiv e-prints, 2020 - ui.adsabs.harvard.edu

Recently Kloeckner described the structure of the isometry group of the quadratic

Wasserstein space $\mathcal{W}_2\left(\mathbb{R}^n\right)$. It turned out that the case of

the real line is exceptional in the sense that there exists an exotic isometry flow. Following …

 




[PDF] neurips.cc

Wasserstein distances for stereo disparity estimation

D Garg, Y Wang, B Hariharan… - Advances in …, 2020 - proceedings.neurips.cc

… Wasserstein distance [39] to measure the divergence. While computing the exact Wasserstein

… In this paper, we choose the Wasserstein distance for one particular reason: p(d |u, v) and …

 Cited by 18 Related articles All 6 versions 

[CITATION] Supplementary Material: Wasserstein Distances for Stereo Disparity Estimation

D Garg, Y Wang, B Hariharan, M Campbell

 Cited by 18 Related articles All 6 versions 
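The stereo entry above compares predicted disparity distributions with a Wasserstein loss. A small sketch of that comparison for a single pixel, with a made-up discrete disparity grid and two candidate probability vectors (not the paper's network or training loss):

```python
import numpy as np
from scipy.stats import wasserstein_distance

disparities = np.arange(0, 64, dtype=float)
p = np.exp(-(disparities - 20.3) ** 2 / 8.0);  p /= p.sum()
q = np.exp(-(disparities - 22.0) ** 2 / 18.0); q /= q.sum()

# Wasserstein-1 distance between two distributions supported on the disparity grid
print(wasserstein_distance(disparities, disparities, u_weights=p, v_weights=q))
```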

<——2020——2020———2330——   


[PDF] ieee.org

Distributionally Robust Optimal Reactive Power Dispatch with Wasserstein Distance in Active Distribution Network

J Liu, Y Chen, C Duan, J Lin… - Journal of Modern Power …, 2020 - ieeexplore.ieee.org

The uncertainties from renewable energy sources (RESs) will not only introduce significant influences to active power dispatch, but also bring great challenges to the analysis of optimal reactive power dispatch (ORPD). To address the influence of high penetration of …

  Cited by 3 Related articles All 3 versions


[PDF] mlr.press

Wasserstein minimum velocity approach to learning unnormalized models

Z Wang, S Cheng, L Yueru, J Zhu… - International …, 2020 - proceedings.mlr.press

Score matching provides an effective approach to learning flexible unnormalized models,

but its scalability is limited by the need to evaluate a second-order derivative. In this paper,

we present a scalable approximation to a general family of learning objectives including …

  Cited by 4 Related articles All 9 versions 


[PDF] arxiv.org

Wasserstein-type distance in the space of Gaussian mixture models

J Delon, A Desolneux - SIAM Journal on Imaging Sciences, 2020 - SIAM

In this paper we introduce a Wasserstein-type distance on the set of Gaussian mixture

models. This distance is defined by restricting the set of possible coupling measures in the

optimal transport problem to Gaussian mixture models. We derive a very simple discrete …

  Cited by 11 Related articles All 7 versions
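
For orientation, the construction above can be sketched numerically: restricting couplings to Gaussian mixtures reduces the problem to discrete optimal transport between the mixture weights, with ground costs given by the closed-form 2-Wasserstein distance between Gaussian components. The sketch below is illustrative only (it assumes the POT package for the discrete solve) and is not the authors' reference implementation.

# Sketch of a mixture-Wasserstein-type distance between two Gaussian mixtures:
# discrete OT over component weights, ground cost = pairwise Gaussian W2^2.
import numpy as np
import ot                              # POT package (pip install pot), assumed available
from scipy.linalg import sqrtm

def w2_gaussian_sq(m0, C0, m1, C1):
    """Squared 2-Wasserstein distance between N(m0, C0) and N(m1, C1)."""
    r0 = sqrtm(C0)
    cross = np.real(sqrtm(r0 @ C1 @ r0))
    return float(np.sum((m0 - m1) ** 2) + np.trace(C0 + C1 - 2 * cross))

def mixture_w2_sq(w0, means0, covs0, w1, means1, covs1):
    """Discrete OT between mixture weights with Gaussian W2^2 ground costs."""
    M = np.array([[w2_gaussian_sq(m0, C0, m1, C1)
                   for m1, C1 in zip(means1, covs1)]
                  for m0, C0 in zip(means0, covs0)])
    return ot.emd2(np.asarray(w0, float), np.asarray(w1, float), M)

# Toy example: two 2-component mixtures in R^2.
w0, w1 = [0.4, 0.6], [0.5, 0.5]
means0 = [np.zeros(2), np.ones(2)]
means1 = [np.array([0.5, 0.0]), np.array([2.0, 1.0])]
covs0 = [np.eye(2), 0.5 * np.eye(2)]
covs1 = [np.eye(2), np.diag([0.3, 0.7])]
print(mixture_w2_sq(w0, means0, covs0, w1, means1, covs1))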


[PDF] arxiv.org

Wasserstein proximal gradient

A Salim, A Korba, G Luise - arXiv preprint arXiv:2002.03035, 2020 - arxiv.org

We consider the task of sampling from a log-concave probability distribution. This target

distribution can be seen as a minimizer of the relative entropy functional defined on the

space of probability distributions. The relative entropy can be decomposed as the sum of a …

  Cited by 2 Related articles All 2 versions 


[PDF] mlr.press

A fast proximal point method for computing exact Wasserstein distance

Y Xie, X Wang, R Wang, H Zha - Uncertainty in Artificial …, 2020 - proceedings.mlr.press

Wasserstein distance plays increasingly important roles in machine learning, stochastic

programming and image processing. Major efforts have been under way to address its high

computational complexity, some leading to approximate or regularized variations such as …

  Cited by 54 Related articles All 5 versions 
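
A minimal sketch of a proximal-point-style iteration of this kind: each outer step solves an entropy-regularized subproblem with a KL proximal term via a Sinkhorn-like inner update, in the spirit of the method above. Step counts and variable names are illustrative assumptions, not the authors' code.

# Proximal-point-style iteration for discrete optimal transport (illustrative sketch).
import numpy as np

def prox_point_ot(mu, nu, C, beta=1.0, n_outer=200, n_inner=1):
    n, m = C.shape
    G = np.exp(-C / beta)              # Gibbs kernel of the proximal subproblem
    T = np.ones((n, m)) / (n * m)      # current transport plan
    b = np.ones(m)
    for _ in range(n_outer):
        Q = G * T                      # kernel of the KL-prox subproblem
        for _ in range(n_inner):       # Sinkhorn-style scaling updates
            a = mu / (Q @ b)
            b = nu / (Q.T @ a)
        T = a[:, None] * Q * b[None, :]
    return T, float(np.sum(T * C))     # plan and transport cost

# Toy example: two random histograms with squared-distance cost on a 1D grid.
rng = np.random.default_rng(1)
x = np.linspace(0, 1, 50)
mu = rng.random(50); mu /= mu.sum()
nu = rng.random(50); nu /= nu.sum()
C = (x[:, None] - x[None, :]) ** 2
plan, cost = prox_point_ot(mu, nu, C)
print(cost)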


2020


[PDF] arxiv.org

Dynamic facial expression generation on Hilbert hypersphere with conditional Wasserstein generative adversarial nets

N Otberdout, M Daoudi, A Kacem… - … on Pattern Analysis …, 2020 - ieeexplore.ieee.org

In this work, we propose a novel approach for generating videos of the six basic facial

expressions given a neutral face image. We propose to exploit the face geometry by

modeling the facial landmarks motion as curves encoded as points on a hypersphere. By …

  Cited by 8 Related articles All 10 versions


[PDF] researchgate.net

Tractable reformulations of two-stage distributionally robust linear programs over the type-∞ Wasserstein ball

W Xie - Operations Research Letters, 2020 - Elsevier

This paper studies a two-stage distributionally robust stochastic linear program under the

type-∞ Wasserstein ball by providing sufficient conditions under which the program can be

efficiently computed via a tractable convex program. By exploring the properties of binary …

  Cited by 8 Related articles All 4 versions  Zbl 07331158


Fixed-Support Wasserstein Barycenter: Computational Hardness and Efficient Algorithms

T Lin, N Ho, X Chen, M Cuturi, MI Jordan - 2020 - research.google

We study in this paper the finite-support Wasserstein barycenter problem (FS-WBP), which

consists in computing the Wasserstein barycenter of $ m $ discrete probability measures

supported on a finite metric space of size $ n $. We show first that the constraint matrix …

Person-independent facial expression recognition method based on improved Wasserstein generative adversarial networks in combination with identity aware

C Xu, Y Cui, Y Zhang, P Gao, J Xu - Multimedia Systems, 2020 - Springer

Since the distinction between two expressions is fairly vague, usually a subtle change in one

part of the human face is enough to change a facial expression. Most of the existing facial

expression recognition algorithms are not robust enough because they rely on general facial  …

  Cited by 5 Related articles


[PDF] imstat.org

A Central Limit Theorem for Wasserstein type distances between two distinct univariate distributions

P Berthet, JC Fort, T Klein - Annales de l'Institut Henri Poincaré …, 2020 - projecteuclid.org

In this article we study the natural nonparametric estimator of a Wasserstein type cost

between two distinct continuous distributions $ F $ and $ G $ on $\mathbb {R} $. The

estimator is based on the order statistics of a sample having marginals $ F $, $ G $ and any …

  Related articles All 4 versions
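
The estimator discussed above is easy to reproduce in the univariate case: couple the order statistics of the two samples, i.e. the empirical quantile functions. A minimal sketch (equal sample sizes and the choice p = 2 are assumptions of the illustration):

# Plug-in estimator of a Wasserstein-type cost between two univariate samples,
# obtained by pairing order statistics (empirical quantile functions).
import numpy as np

def empirical_wp(x, y, p=2):
    """Empirical W_p between two equal-size samples, via sorted samples."""
    xs, ys = np.sort(np.asarray(x)), np.sort(np.asarray(y))
    assert xs.shape == ys.shape, "equal sample sizes assumed in this sketch"
    return (np.mean(np.abs(xs - ys) ** p)) ** (1.0 / p)

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=2000)    # sample from F
y = rng.normal(0.5, 1.5, size=2000)    # sample from a distinct G
print(empirical_wp(x, y, p=2))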

<——2020——2020———2340—— 


[PDF] arxiv.org

A Rademacher-type theorem on L2-Wasserstein spaces over closed Riemannian manifolds

LD Schiavo - Journal of Functional Analysis, 2020 - Elsevier

Let P be any Borel probability measure on the L 2-Wasserstein space (P 2 (M), W 2) over a

closed Riemannian manifold M. We consider the Dirichlet form E induced by P and by the

Wasserstein gradient on P 2 (M). Under natural assumptions on P, we show that W 2 …

  Cited by 5 Related articles All 6 versions


The Wasserstein Proximal Gradient Algorithm

A Salim, A Korba, G Luise - arXiv e-prints, 2020 - ui.adsabs.harvard.edu

Wasserstein gradient flows are continuous time dynamics that define curves of steepest

descent to minimize an objective function over the space of probability measures (ie, the

Wasserstein space). This objective is typically a divergence wrt a fixed target distribution. In …

  Related articles


[PDF] arxiv.org

Risk Measures Estimation Under Wasserstein Barycenter

MA Arias-Serna, JM Loubes… - arXiv preprint arXiv …, 2020 - arxiv.org

Randomness in financial markets requires modern and robust multivariate models of risk

measures. This paper proposes a new approach for modeling multivariate risk measures

under Wasserstein barycenters of probability measures supported on location-scatter …

  Related articles All 5 versions 


[PDF] jst.go.jp

Orthogonal gradient penalty for fast training of Wasserstein GAN based multi-task autoencoder toward robust speech recognition

CY Kao, S Park, A Badi, DK Han… - IEICE TRANSACTIONS on …, 2020 - search.ieice.org

Performance in Automatic Speech Recognition (ASR) degrades dramatically in noisy

environments. To alleviate this problem, a variety of deep networks based on convolutional

neural networks and recurrent neural networks were proposed by applying L1 or L2 loss. In …

  Cited by 1 Related articles All 5 versions


[PDF] arxiv.org

Consistency of Distributionally Robust Risk-and Chance-Constrained Optimization Under Wasserstein Ambiguity Sets

A Cherukuri, AR Hota - IEEE Control Systems Letters, 2020 - ieeexplore.ieee.org

We study stochastic optimization problems with chance and risk constraints, where in the

latter, risk is quantified in terms of the conditional value-at-risk (CVaR). We consider the

distributionally robust versions of these problems, where the constraints are required to hold …

  Cited by 1 Related articles All 3 versions


[PDF] umn.edu

Data-driven Distributionally Robust Stochastic Optimization via Wasserstein Distance with Applications to Portfolio Risk Management and Inventory Control

D Singh - 2020 - conservancy.umn.edu

The central theme of this dissertation is stochastic optimization under distributional

ambiguity. One can think of this as a two player game between a decision maker, who tries to

minimize some loss or maximize some reward, and an adversarial agent that chooses the …

  All 3 versions 

Dissertation or Thesis

Convergence and concentration of empirical measures under Wasserstein distance in unbounded functional spaces

J Lei - Bernoulli, 2020 - projecteuclid.org

We provide upper bounds of the expected Wasserstein distance between a probability

measure and its empirical version, generalizing recent results for finite dimensional

Euclidean spaces and bounded functional spaces. Such a generalization can cover …

  Cited by 47 Related articles All 5 versions


[PDF] arxiv.org

Sampling of probability measures in the convex order by Wasserstein projection

A Alfonsi, J Corbetta, B Jourdain - Annales de l'Institut Henri …, 2020 - projecteuclid.org

In this paper, for $\mu $ and $\nu $ two probability measures on $\mathbb {R}^{d} $ with

finite moments of order $\varrho\ge 1$, we define the respective projections for the $ W_

{\varrho} $-Wasserstein distance of $\mu $ and $\nu $ on the sets of probability measures …

  Cited by 19 Related articles All 9 versions


[PDF] arxiv.org

McKean-Vlasov SDEs with drifts discontinuous under Wasserstein distance

X Huang, FY Wang - arXiv preprint arXiv:2002.06877, 2020 - arxiv.org

Existence and uniqueness are proved for Mckean-Vlasov type distribution dependent SDEs

with singular drifts satisfying an integrability condition in space variable and the Lipschitz

condition in distribution variable with respect to $ W_0 $ or $ W_0+ W_\theta $ for some …

  Cited by 8 Related articles All 4 versions 


Wasserstein distance based deep multi-feature adversarial transfer diagnosis approach under variable working conditions

D She, N Peng, M Jia, MG Pecht - Journal of Instrumentation, 2020 - iopscience.iop.org

Intelligent mechanical fault diagnosis is a crucial measure to ensure the safe operation of

equipment. To solve the problem that network features is not fully utilized in the adversarial

transfer learning, this paper develops a Wasserstein distance based deep multi-feature …

Cited by 6 Related articles All 3 versions

<——2020——2020———2350——


[HTML] mdpi.com

Probability forecast combination via entropy regularized Wasserstein distance

R Cumings-Menon, M Shin - Entropy, 2020 - mdpi.com

We propose probability and density forecast combination methods that are defined using the

entropy regularized Wasserstein distance. First, we provide a theoretical characterization of

the combined density forecast based on the regularized Wasserstein distance under the …

  Cited by 2 Related articles All 15 versions 
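
As a rough illustration of the construction, an entropy-regularized Wasserstein barycenter of discretized forecast densities can be computed by Sinkhorn-style iterative Bregman projections. The grid, regularization strength, and combination weights below are assumptions of the sketch, not the authors' settings or code.

# Entropy-regularized Wasserstein barycenter of discretized density forecasts,
# via Sinkhorn-style iterative Bregman projections (illustrative sketch).
import numpy as np

def sinkhorn_barycenter(hists, lam, C, eps=0.05, n_iter=500):
    K = np.exp(-C / eps)                 # Gibbs kernel on the shared grid
    v = np.ones_like(hists)              # one scaling vector per forecast
    for _ in range(n_iter):
        u = hists / (v @ K.T)            # row scalings, shape (k, n)
        b = np.exp(lam @ np.log(u @ K))  # weighted geometric mean -> barycenter
        v = b[None, :] / (u @ K)         # column scalings
    return b / b.sum()

grid = np.linspace(-5, 5, 200)
C = (grid[:, None] - grid[None, :]) ** 2
C = C / C.max()                          # normalized squared-distance cost

def gauss(m, s):
    p = np.exp(-(grid - m) ** 2 / (2 * s ** 2))
    return p / p.sum()

hists = np.stack([gauss(-1.0, 0.8), gauss(0.5, 1.2), gauss(1.5, 0.6)])  # 3 forecasts
lam = np.array([0.5, 0.3, 0.2])          # combination weights
combined = sinkhorn_barycenter(hists, lam, C)
print(combined.argmax(), combined.sum())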


[PDF] arxiv.org

Intrinsic Sliced Wasserstein Distances for Comparing Collections of Probability Distributions on Manifolds and Graphs

RM Rustamov, S Majumdar - arXiv preprint arXiv:2010.15285, 2020 - arxiv.org

Collections of probability distributions arise in a variety of statistical applications ranging

from user activity pattern analysis to brain connectomics. In practice these distributions are

represented by histograms over diverse domain types including finite intervals, circles …

  Cited by 2 Related articles All 2 versions 
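
For intuition, the standard (Euclidean) sliced Wasserstein distance between point clouds averages one-dimensional Wasserstein distances over random projections; the paper's intrinsic version replaces these Euclidean slices with ones defined on the manifold or graph itself. A sketch of the basic Euclidean variant only:

# Standard (Euclidean) sliced Wasserstein distance between two point clouds.
import numpy as np
from scipy.stats import wasserstein_distance

def sliced_wasserstein(X, Y, n_proj=100, seed=None):
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)                        # random direction on the sphere
        total += wasserstein_distance(X @ theta, Y @ theta)   # 1D W1 of the projections
    return total / n_proj

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(500, 3))
Y = rng.normal(0.5, 1.0, size=(500, 3))
print(sliced_wasserstein(X, Y))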




2020


Existence of probability measure valued jump-diffusions in generalized Wasserstein spaces

M Larsson, S Svaluto-Ferro - Electronic Journal of Probability, 2020 - projecteuclid.org

We study existence of probability measure valued jump-diffusions described by martingale

problems. We develop a simple device that allows us to embed Wasserstein spaces and

other similar spaces of probability measures into locally compact spaces where classical …

  Cited by 2 Related articles All 3 versions


2020




[PDF] stanford.edu

A Class of Optimal Transport Regularized Formulations with Applications to Wasserstein GANs

S Mahdian, JH Blanchet… - 2020 Winter Simulation …, 2020 - ieeexplore.ieee.org

Optimal transport costs (eg Wasserstein distances) are used for fitting high-dimensional

distributions. For example, popular artificial intelligence algorithms such as Wasserstein

Generative Adversarial Networks (WGANs) can be interpreted as fitting a black-box …

  All 2 versions




[PDF] arxiv.org

Martingale Wasserstein inequality for probability measures in the convex order

B Jourdain, W Margheriti - arXiv preprint arXiv:2011.11599, 2020 - arxiv.org

It is known since [24] that two one-dimensional probability measures in the convex order

admit a martingale coupling with respect to which the integral of $\vert xy\vert $ is smaller

than twice their $\mathcal W_1 $-distance (Wasserstein distance with index $1 $). We …

  Related articles All 7 versions 


[PDF] arxiv.org

Time Discretizations of Wasserstein-Hamiltonian Flows

J Cui, L Dieci, H Zhou - arXiv preprint arXiv:2006.09187, 2020 - arxiv.org


  Cited by 1 Related articles All 3 versions 

<——2020——2020———2360——


Finite-Horizon Control of Nonlinear Discrete-Time Systems with Terminal Cost of Wasserstein Distance

K Hoshino - 2020 59th IEEE Conference on Decision and …, 2020 - ieeexplore.ieee.org

… Among those studies, the study in [6] addresses a finite-horizon optimal control problem of linear

continuous-time systems with the terminal cost of the Wasserstein distance, with the aim of steering

a given initial probability distribution of state variables to a desired probability …

  Cited by 1 Related articles+


[PDF] arxiv.org

Characterization of probability distribution convergence in Wasserstein distance by $L^p$-quantization error function

Y Liu, G Pagès - Bernoulli, 2020 - projecteuclid.org

We establish conditions to characterize probability measures by their $ L^{p} $-quantization

error functions in both $\mathbb {R}^{d} $ and Hilbert settings. This characterization is two-

fold: static (identity of two distributions) and dynamic (convergence for the $ L^{p} …

  Cited by 1 Related articles All 5 versions



Global sensitivity analysis and Wasserstein spaces

JC Fort, T Klein, A Lagnoux - arXiv preprint arXiv:2007.12378, 2020 - arxiv.org

Sensitivity indices are commonly used to quantify the relative influence of any specific group of

input variables on the output of a computer code. In this paper, we focus both on computer

codes the output of which is a cumulative distribution function and on stochastic computer …

  Cited by 1 Related articles All 9 versions 


[PDF] researchgate.net

Sample generation based on a supervised Wasserstein Generative Adversarial Network for high-resolution remote-sensing scene classification

W Han, L Wang, R Feng, L Gao, X Chen, Z Deng… - Information …, 2020 - Elsevier

As high-resolution remote-sensing (HRRS) images have become increasingly widely

available, scene classification focusing on the smart classification of land cover and land

use has also attracted more attention. However, mainstream methods encounter a severe …

  Cited by 5 Related articles All 3 versions


[PDF] arxiv.org

Wasserstein Autoregressive Models for Density Time Series

C Zhang, P Kokoszka, A Petersen - arXiv preprint arXiv:2006.12640, 2020 - arxiv.org

Data consisting of time-indexed distributions of cross-sectional or intraday returns have

been extensively studied in finance, and provide one example in which the data atoms

consist of serially dependent probability distributions. Motivated by such data, we propose …

  Cited by 2 Related articles All 3 versions 
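
In one dimension, the Wasserstein geometry lets each density be represented by its quantile function, and an autoregressive structure can be fitted to the deviations from the average quantile function (the Wasserstein mean). The toy sketch below only illustrates that idea; it is not the authors' estimator or framework, and monotonicity of the predicted quantile function is not enforced.

# Toy autoregression for a time series of 1D densities via quantile functions.
import numpy as np

rng = np.random.default_rng(0)
T, n_obs = 60, 500
probs = np.linspace(0.01, 0.99, 50)

# Simulate densities whose location drifts in an AR(1) fashion.
loc = 0.0
Q = np.empty((T, probs.size))                # quantile-function representation
for t in range(T):
    loc = 0.8 * loc + rng.normal(scale=0.3)
    Q[t] = np.quantile(rng.normal(loc, 1.0, n_obs), probs)

Qbar = Q.mean(axis=0)                        # Wasserstein mean = average quantile fn
Z = Q - Qbar                                 # centred (tangent-space) curves
A, *_ = np.linalg.lstsq(Z[:-1], Z[1:], rcond=None)   # lag-1 least-squares fit
pred_next = Qbar + Z[-1] @ A                 # one-step-ahead predicted quantiles
print(pred_next[:5])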

 

2020

Density estimation of multivariate samples using Wasserstein distance

E Luini, P Arbenz - Journal of Statistical Computation and …, 2020 - Taylor & Francis

Density estimation is a central topic in statistics and a fundamental task of machine learning.

In this paper, we present an algorithm for approximating multivariate empirical densities with

a piecewise constant distribution defined on a hyperrectangular-shaped partition of the …

  Cited by 2 Related articles All 3 versions


[PDF] arxiv.org

Finite-sample guarantees for Wasserstein distributionally robust optimization: Breaking the curse of dimensionality

R Gao - arXiv preprint arXiv:2009.04382, 2020 - arxiv.org

Wasserstein distributionally robust optimization (DRO) aims to find robust and generalizable

solutions by hedging against data perturbations in Wasserstein distance. Despite its recent

empirical success in operations research and machine learning, existing performance …

  Cited by 1 Related articles All 3 versions 


[PDF] arxiv.org

Wasserstein distance error bounds for the multivariate normal approximation of the maximum likelihood estimator

A Anastasiou, RE Gaunt - arXiv preprint arXiv:2005.05208, 2020 - arxiv.org

We obtain explicit Wasserstein distance error bounds between the distribution of the multi-

parameter MLE and the multivariate normal distribution. Our general bounds are given for

possibly high-dimensional, independent and identically distributed random vectors. Our …

  Cited by 1 Related articles All 4 versions 


[HTML] springer.com

[HTML] Wasserstein and Kolmogorov error bounds for variance-gamma approximation via Stein's method I

RE Gaunt - Journal of Theoretical Probability, 2020 - Springer

The variance-gamma (VG) distributions form a four-parameter family that includes as special

and limiting cases the normal, gamma and Laplace distributions. Some of the numerous

applications include financial modelling and approximation on Wiener space. Recently …

  Cited by 14 Related articles All 6 versions


DECWA: Density-Based Clustering using Wasserstein Distance

N El Malki, R Cugny, O Teste, F Ravat - Proceedings of the 29th ACM …, 2020 - dl.acm.org

Clustering is a data analysis method for extracting knowledge by discovering groups of data

called clusters. Among these methods, state-of-the-art density-based clustering methods

have proven to be effective for arbitrary-shaped clusters. Despite their encouraging results …

  Related articles All 2 versions
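
To see where a Wasserstein dissimilarity enters such a pipeline, the sketch below clusters objects that are themselves one-dimensional samples using pairwise Wasserstein distances and ordinary hierarchical clustering. This is only an illustration of the dissimilarity; it is not the DECWA algorithm, which is density-based.

# Clustering distribution-valued objects with a Wasserstein dissimilarity matrix.
import numpy as np
from scipy.stats import wasserstein_distance
from scipy.spatial.distance import squareform
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# 20 objects, each a sample: two groups drawn from different distributions.
objects = [rng.normal(0.0, 1.0, 300) for _ in range(10)] + \
          [rng.normal(3.0, 0.5, 300) for _ in range(10)]

n = len(objects)
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        D[i, j] = D[j, i] = wasserstein_distance(objects[i], objects[j])

labels = fcluster(linkage(squareform(D), method="average"), t=2, criterion="maxclust")
print(labels)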

<——2020——2020———2370——



[PDF] arxiv.org

Two-sample Test using Projected Wasserstein Distance: Breaking the Curse of Dimensionality

J Wang, R Gao, Y Xie - arXiv preprint arXiv:2010.11970, 2020 - arxiv.org

We develop a projected Wasserstein distance for the two-sample test, a fundamental

problem in statistics and machine learning: given two sets of samples, to determine whether

they are from the same distribution. In particular, we aim to circumvent the curse of …

  Cited by 2 Related articles All 3 versions 
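
A hedged sketch of a two-sample test built on a projected Wasserstein statistic, calibrated by permutation. For simplicity the statistic here averages over random one-dimensional projections (a sliced variant), which is an assumption of the sketch rather than the paper's optimized projection.

# Permutation two-sample test with a projected (sliced) Wasserstein statistic.
import numpy as np
from scipy.stats import wasserstein_distance

def projected_w_stat(X, Y, dirs):
    return np.mean([wasserstein_distance(X @ d, Y @ d) for d in dirs])

def perm_test(X, Y, n_proj=50, n_perm=200, seed=0):
    rng = np.random.default_rng(seed)
    dirs = rng.normal(size=(n_proj, X.shape[1]))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    obs = projected_w_stat(X, Y, dirs)
    Z = np.vstack([X, Y]); n = len(X)
    null = []
    for _ in range(n_perm):
        idx = rng.permutation(len(Z))
        null.append(projected_w_stat(Z[idx[:n]], Z[idx[n:]], dirs))
    return obs, float(np.mean(np.array(null) >= obs))   # statistic and p-value

rng = np.random.default_rng(1)
X = rng.normal(0.0, 1.0, size=(200, 5))
Y = rng.normal(0.3, 1.0, size=(200, 5))
print(perm_test(X, Y))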

 

Drift compensation algorithm based on Time-Wasserstein dynamic distribution alignment

Y Tao, K Zeng, Z Liang - 2020 IEEE/CIC International …, 2020 - ieeexplore.ieee.org

… Third, the distribution weights are calculated according to the Wasserstein distance and

time span, and the distribution adaptation is dynamically adjusted. Finally, learn the classifier …

III. TIME-WASSERSTEIN DYNAMIC DISTRIBUTION ALIGNMENT …

  Related articles


[PDF] arxiv.org

Conditional Sig-Wasserstein GANs for Time Series Generation

H Ni, L Szpruch, M Wiese, S Liao, B Xiao - arXiv preprint arXiv:2006.05421, 2020 - arxiv.org

… of the linear functional L. We use L(SN (Xt−p+1:t)) as an estimator for Eν[SM (Xt+1:t+q)|Xt−p+

1:t]. Given Xt−p+1:t, we sample the noise from the distribution of the latent process Zt+1 … In this

paper, we developed the conditional Sig-Wasserstein GAN for time series generation …

  Cited by 4 Related articles All 3 versions 

[PDF] arxiv.org

Stein's method for normal approximation in Wasserstein distances with application to the multivariate Central Limit Theorem

T Bonis - Probability Theory and Related Fields, 2020 - Springer

We use Stein's method to bound the Wasserstein distance of order 2 between a

measure\(\nu\) and the Gaussian measure using a stochastic process\((X_t) _ {t\ge 0}\) such

that\(X_t\) is drawn from\(\nu\) for any\(t> 0\). If the stochastic process\((X_t) _ {t\ge 0}\) …

  Cited by 7 Related articles All 3 versions

 



2020


[PDF] iop.org

Data Augmentation Method for Switchgear Defect Samples Based on Wasserstein Generative Adversarial Network

X Huang, J Xiong, Y Zhang, J Liang… - Journal of Physics …, 2020 - iopscience.iop.org

The problem of sample imbalance will lead to poor generalization ability of the deep

learning model algorithm, and the phenomenon of overfitting during network training, which

limits the accuracy of intelligent fault diagnosis of switchgear equipment. In view of this, this …

  Related articles


[PDF] uniroma1.it

[PDF] Nonparametric Density Estimation with Wasserstein Distance for Actuarial Applications

EG Luini - iris.uniroma1.it

Density estimation is a central topic in statistics and a fundamental task of actuarial sciences.

In this work, we present an algorithm for approximating multivariate empirical densities with

a piecewise constant distribution defined on a hyperrectangular-shaped partition of the …

  Related articles All 2 versions 


Stochastic Approximation versus Sample Average Approximation for population Wasserstein barycenters

D Dvinskikh - arXiv e-prints, 2020 - ui.adsabs.harvard.edu

In machine learning and optimization community there are two main approaches for convex

risk minimization problem, namely, the Stochastic Approximation (SA) and the Sample

Average Approximation (SAA). In terms of oracle complexity (required number of stochastic …

  Cited by 1



Wasserstein control of mirror Langevin Monte Carlo

KS Zhang, G Peyré, J Fadili… - Conference on Learning …, 2020 - proceedings.mlr.press

Discretized Langevin diffusions are efficient Monte Carlo methods for sampling from high

dimensional target densities that are log-Lipschitz-smooth and (strongly) log-concave. In

particular, the Euclidean Langevin Monte Carlo sampling algorithm has received much …

  Cited by 7 Related articles All 10 versions 


[PDF] arxiv.org

Wasserstein distributionally robust stochastic control: A data-driven approach

I Yang - IEEE Transactions on Automatic Control, 2020 - ieeexplore.ieee.org

Standard stochastic control methods assume that the probability distribution of uncertain

variables is available. Unfortunately, in practice, obtaining accurate distribution information

is a challenging task. To resolve this issue, in this article we investigate the problem of …

  Cited by 24 Related articles All 3 versions

<——2020——2020———2380——



[PDF] sns.it

Optimal control of multiagent systems in the Wasserstein space

C Jimenez, A Marigonda, M Quincampoix - Calculus of Variations and …, 2020 - Springer

This paper concerns a class of optimal control problems, where a central planner aims to

control a multi-agent system in R^ d R d in order to minimize a certain cost of Bolza type. At

every time and for each agent, the set of admissible velocities, describing his/her underlying …

  Cited by 8 Related articles All 3 versions

 

[PDF] researchgate.net

Data supplement for a soft sensor using a new generative model based on a variational autoencoder and Wasserstein GAN

X Wang, H Liu - Journal of Process Control, 2020 - Elsevier

In industrial process control, measuring some variables is difficult for environmental or cost

reasons. This necessitates employing a soft sensor to predict these variables by using the

collected data from easily measured variables. The prediction accuracy and computational …

  Cited by 6 Related articles All 3 versions


[PDF] arxiv.org

Wasserstein distributionally robust motion control for collision avoidance using conditional value-at-risk

A Hakobyan, I Yang - arXiv preprint arXiv:2001.04727, 2020 - arxiv.org

In this paper, a risk-aware motion control scheme is considered for mobile robots to avoid

randomly moving obstacles when the true probability distribution of uncertainty is unknown.

We propose a novel model predictive control (MPC) method for limiting the risk of unsafety …

  Cited by 5 Related articles All 2 versions 


[PDF] lewissoft.com

Wasserstein Distributionally Robust Motion Planning and Control with Safety Constraints Using Conditional Value-at-Risk

A Hakobyan, I Yang - 2020 IEEE International Conference on …, 2020 - ieeexplore.ieee.org

In this paper, we propose an optimization-based decision-making tool for safe motion

planning and control in an environment with randomly moving obstacles. The unique feature

of the proposed method is that it limits the risk of unsafety by a pre-specified threshold even …

  Cited by 2 Related articles All 2 versions


[PDF] unifi.it

[PDF] Conlon: A pseudo-song generator based on a new pianoroll, Wasserstein autoencoders, and optimal interpolations

L Angioloni, T Borghuis, L Brusci… - Proceedings of the 21st …, 2020 - flore.unifi.it

We introduce CONLON, a pattern-based MIDI generation method that employs a new

lossless pianoroll-like data description in which velocities and durations are stored in

separate channels. CONLON uses Wasserstein autoencoders as the underlying generative …

  Cited by 1 Related articles All 7 versions 


2020


Finite-Horizon Control of Nonlinear Discrete-Time Systems with Terminal Cost of Wasserstein Distance

K Hoshino - … 59th IEEE Conference on Decision and Control  …, 2020 - ieeexplore.ieee.org

This study explores a finite-horizon optimal control problem of nonlinear discrete-time

systems for steering a probability distribution of initial states as close as possible to a

desired probability distribution of terminal states. The problem is formulated as an optimal …

  Cited by 1 Related articles


[PDF] arxiv.org

Minimax control of ambiguous linear stochastic systems using the Wasserstein metric

K Kim, I Yang - … 59th IEEE Conference on Decision and Control …, 2020 - ieeexplore.ieee.org

In this paper, we propose a minimax linear-quadratic control method to address the issue of

inaccurate distribution information in practical stochastic systems. To construct a control

policy that is robust against errors in an empirical distribution of uncertainty, our method …

  Cited by 4 Related articles All 3 versions


A new Wasserstein distance- and cumulative sum-dependent health indicator and its application in prediction of remaining useful life of bearing

J Yin, M Xu, H Zheng, Y Yang - Journal of the Brazilian Society of …, 2020 - Springer

The safety and reliability of mechanical performance are affected by the condition (health

status) of the bearings. A health indicator (HI) with high monotonicity and robustness is a

helpful tool to simplify the predictive model and improve prediction accuracy. In this paper, a …

  Related articles


[PDF] ieee.org

A New Data-Driven Distributionally Robust Portfolio Optimization Method based on Wasserstein Ambiguity Set

N Du, Y Liu, Y Liu - IEEE Access, 2020 - ieeexplore.ieee.org

Since optimal portfolio strategy depends heavily on the distribution of uncertain returns, this

paper proposes a new method for the portfolio optimization problem with respect to

distribution uncertainty. When the distributional information of the uncertain return rate is …

  Related articles


[PDF] arxiv.org

Safe Zero-Shot Model-Based Learning and Control: A Wasserstein Distributionally Robust Approach

A Kandel, SJ Moura - arXiv preprint arXiv:2004.00759, 2020 - arxiv.org

This paper explores distributionally robust zero-shot model-based learning and control

using Wasserstein ambiguity sets. Conventional model-based reinforcement learning

algorithms struggle to guarantee feasibility throughout the online learning process. We …

  Related articles All 2 versions 

<——2020——2020———2390——  



Learning Wasserstein Distance-Based Gaussian Graphical Model for Multivariate Time Series Classification

HU Xuegang, L Jianxing, LI Peipei… - 2020 IEEE …, 2020 - ieeexplore.ieee.org

Multivariate time series classification occupies an important position in time series data

mining tasks and has been applied in many fields. However, due to the statistical coupling

between different variables of Multivariate Time Series (MTS) data, traditional classification …

  Related articles All 2 versions



Wasserstein upper bounds of the total variation for smooth densities

M Chae, SG Walker - Statistics & Probability Letters, 2020 - Elsevier

The total variation distance between probability measures cannot be bounded by the

Wasserstein metric in general. If we consider sufficiently smooth probability densities,

however, it is possible to bound the total variation by a power of the Wasserstein distance …

  Cited by 3 Related articles All 5 versions

 



[PDF] arxiv.org

The Wasserstein Impact Measure (WIM): a generally applicable, practical tool for quantifying prior impact in Bayesian statistics

F Ghaderinezhad, C Ley, B Serrien - arXiv preprint arXiv:2010.12522, 2020 - arxiv.org

The prior distribution is a crucial building block in Bayesian analysis, and its choice will

impact the subsequent inference. It is therefore important to have a convenient way to

quantify this impact, as such a measure of prior impact will help us to choose between two or …

Cited by 1 Related articles All 2 versions 

2020


Nonlinear model reduction on metric spaces. Application to one-dimensional conservative PDEs in Wasserstein spaces

V Ehrlacher, D Lombardi, O Mula… - … and Numerical Analysis, 2020 - search.proquest.com

We consider the problem of model reduction of parametrized PDEs where the goal is to

approximate any function belonging to the set of solutions at a reduced computational cost.

For this, the bottom line of most strategies has so far been based on the approximation of the …

  Related articles All 2 versions


Drift compensation algorithm based on Time-Wasserstein dynamic distribution alignment

Y Tao, K Zeng, Z Liang - 2020 IEEE/CIC International …, 2020 - ieeexplore.ieee.org

The electronic nose (E-nose) is mainly used to detect different types and concentrations of

gases. At present, the average life of E-nose is relatively short, mainly due to the drift of the

sensor resulting in a decrease in the effect. Therefore, it is the focus of research in this field …

  Related articles


[PDF] Structure-preserving variational schemes for fourth order nonlinear partial differential equations with a Wasserstein gradient flow structure

B Ashworth - 2020 - core.ac.uk

There is a growing interest in studying nonlinear partial differential equations which

constitute gradient flows in the Wasserstein metric and related structure preserving

variational discretisations. In this thesis, we focus on the fourth order Derrida-Lebowitz …


Exponential contraction in Wasserstein distances for diffusion semigroups with negative curvature

FY Wang - Potential Analysis, 2020 - Springer

Let P_t be the (Neumann) diffusion semigroup generated by a weighted Laplacian on a

complete connected Riemannian manifold M without boundary or with a convex boundary. It

is well known that the Bakry-Emery curvature is bounded below by a positive constant > 0 …

 Cited by 10 Related articles

[PDF] ams.org

Full View

Nonpositive curvature, the variance functional, and the Wasserstein barycenter

YH Kim, B Pass - Proceedings of the American Mathematical Society, 2020 - ams.org

We show that a Riemannian manifold $ M $ has nonpositive sectional curvature and is

simply connected if and only if the variance functional on the space $ P (M) $ of probability

measures over $ M $ is displacement convex. We then establish convexity over Wasserstein  …

  Cited by 2 Related articles All 3 versions

<——2020——2020———2400——


Knowledge-aware attentive Wasserstein adversarial dialogue response generation

Y Zhang, Q Fang, S Qian, C Xu - ACM Transactions on Intelligent …, 2020 - dl.acm.org

Natural language generation has become a fundamental task in dialogue systems. RNN-

based natural response generation methods encode the dialogue context and decode it into

response. However, they tend to generate dull and simple responses. In this article, we …

  Cited by 2 Related articles


[PDF] arxiv.org

High-Confidence Attack Detection via Wasserstein-Metric Computations

D Li, S Martínez - IEEE Control Systems Letters, 2020 - ieeexplore.ieee.org

This letter considers a sensor attack and fault detection problem for linear cyber-physical

systems, which are subject to system noise that can obey an unknown light-tailed

distribution. We propose a new threshold-based detection mechanism that employs the …

  Cited by 2 Related articles All 5 versions





Trajectories from Distribution-Valued Functional Curves: A Unified Wasserstein Framework

A Sharma, G Gerig - … Conference on Medical Image Computing and …, 2020 - Springer

Temporal changes in medical images are often evaluated along a parametrized function that

represents a structure of interest (eg white matter tracts). By attributing samples along these

functions with distributions of image properties in the local neighborhood, we create …

  Related articles All 2 versions


[PDF] arxiv.org

Derivative over Wasserstein spaces along curves of densities

R Buckdahn, J Li, H Liang - arXiv preprint arXiv:2010.01507, 2020 - arxiv.org

In this paper, given any random variable $\xi $ defined over a probability space

$(\Omega,\mathcal {F}, Q) $, we focus on the study of the derivative of functions of the form $

L\mapsto F_Q (L):= f\big ((LQ) _ {\xi}\big), $ defined over the convex cone of densities …

  Related articles All 2 versions 


2020


Covariance Steering of Discrete-Time Stochastic Linear Systems Based on Wasserstein Distance Terminal Cost

IM Balci, E Bakolas - IEEE Control Systems Letters, 2020 - ieeexplore.ieee.org

We consider a class of stochastic optimal control problems for discrete-time linear systems

whose objective is the characterization of control policies that will steer the probability

distribution of the terminal state of the system close to a desired Gaussian distribution. In our …

  Cited by 2 Related articles


Entropic regularization of Wasserstein distance between infinite-dimensional Gaussian measures and Gaussian processes

MH Quang - arXiv preprint arXiv:2011.07489, 2020 - arxiv.org

This work studies the entropic regularization formulation of the 2-Wasserstein distance on an

infinite-dimensional Hilbert space, in particular for the Gaussian setting. We first present the

Minimum Mutual Information property, namely the joint measures of two Gaussian measures …

  Cited by 2 Related articles All 2 versions 




year 2020


 

Convergence and concentration of empirical measures under Wasserstein distance in unbounded functional spaces

J Lei - Bernoulli, 2020 - projecteuclid.org

We provide upper bounds of the expected Wasserstein distance between a probability

measure and its empirical version, generalizing recent results for finite dimensional

Euclidean spaces and bounded functional spaces. Such a generalization can cover …

  Cited by 49 Related articles All 5 versions

<——2020——2020———2410—— 


   

[PDF] mlr.press

Wasserstein control of mirror Langevin Monte Carlo

KS Zhang, G Peyré, J Fadili… - Conference on Learning …, 2020 - proceedings.mlr.press

Discretized Langevin diffusions are efficient Monte Carlo methods for sampling from high

dimensional target densities that are log-Lipschitz-smooth and (strongly) log-concave. In

particular, the Euclidean Langevin Monte Carlo sampling algorithm has received much …

  Cited by 7 Related articles All 10 versions 

Wasserstein Control of Mirror Langevin Monte Carlo

K Shuangjian Zhang, G Peyré, J Fadili… - arXiv e …, 2020 - ui.adsabs.harvard.edu

Discretized Langevin diffusions are efficient Monte Carlo methods for sampling from high

dimensional target densities that are log-Lipschitz-smooth and (strongly) log-concave. In

particular, the Euclidean Langevin Monte Carlo sampling algorithm has received much …



[PDF] wiley.com

Evaluating the performance of climate models based on Wasserstein distance

G Vissio, V Lembo, V Lucarini… - Geophysical Research …, 2020 - Wiley Online Library

We propose a methodology for intercomparing climate models and evaluating their

performance against benchmarks based on the use of the Wasserstein distance (WD). This

distance provides a rigorous way to measure quantitatively the difference between two …

  Cited by 2 Related articles All 13 versions




2020


Reconstruction of shale image based on Wasserstein Generative Adversarial Networks with gradient penalty

W Zha, X Li, Y Xing, L He, D Li - Advances in Geo-Energy …, 2020 - yandy-ager.com

Abstract Generative Adversarial Networks (GANs), as most popular artificial intelligence

models in the current image generation field, have excellent image generation capabilities.

Based on Wasserstein GANs with gradient penalty, this paper proposes a novel digital core …

  Cited by 8 Related articles 
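
Several entries in this list build on the WGAN objective with gradient penalty. As a reference point (not the authors' code; network sizes are stand-ins), a minimal PyTorch sketch of the penalty term added to the critic loss:

# Minimal PyTorch sketch of the WGAN-GP critic objective: maximize D(real) - D(fake)
# while a gradient penalty keeps the critic's gradient norm near 1 on interpolates.
import torch
import torch.nn as nn

def gradient_penalty(critic, real, fake):
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    interp = (eps * real + (1 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads, = torch.autograd.grad(scores, interp,
                                 grad_outputs=torch.ones_like(scores),
                                 create_graph=True)
    grads = grads.view(grads.size(0), -1)
    return ((grads.norm(2, dim=1) - 1) ** 2).mean()

critic = nn.Sequential(nn.Linear(64, 128), nn.LeakyReLU(0.2), nn.Linear(128, 1))
real = torch.randn(16, 64)     # stand-in for real training patches
fake = torch.randn(16, 64)     # stand-in for generator output
lambda_gp = 10.0
critic_loss = critic(fake).mean() - critic(real).mean() \
              + lambda_gp * gradient_penalty(critic, real, fake)
critic_loss.backward()
print(float(critic_loss))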

 

Wasserstein based two-stage distributionally robust optimization model for optimal operation of CCHP micro-grid under uncertainties

Y Wang, Y Yang, L Tang, W Sun, B Li - … Journal of Electrical Power & …, 2020 - Elsevier

Combined cooling, heating and power (CCHP) micro-grids are getting increasing attentions

due to the realization of cleaner production and high energy efficiency. However, with the

features of complex tri-generation structure and renewable power uncertainties, it is …

  Cited by 20 Related articles All 2 versions


[HTML] mdpi.com

Calculating the Wasserstein metric-based Boltzmann entropy of a landscape mosaic

H Zhang, Z Wu, T Lan, Y Chen, P Gao - Entropy, 2020 - mdpi.com

Shannon entropy is currently the most popular method for quantifying the disorder or

information of a spatial data set such as a landscape pattern and a cartographic map.

However, its drawback when applied to spatial data is also well documented; it is incapable …

  Cited by 3 Related articles All 9 versions 


The quantum Wasserstein distance of order 1

G De Palma, M Marvian, D Trevisan, S Lloyd - arXiv preprint arXiv …, 2020 - arxiv.org

We propose a generalization of the Wasserstein distance of order 1 to the quantum states of

$ n $ qudits. The proposal recovers the Hamming distance for the vectors of the canonical

basis, and more generally the classical Wasserstein distance for quantum states diagonal in …

  Cited by 3 Related articles All 3 versions 


[PDF] arxiv.org

SVGD as a kernelized Wasserstein gradient flow of the chi-squared divergence

S Chewi, TL Gouic, C Lu, T Maunu… - arXiv preprint arXiv …, 2020 - arxiv.org

Stein Variational Gradient Descent (SVGD), a popular sampling algorithm, is often described

as the kernelized gradient flow for the Kullback-Leibler divergence in the geometry of

optimal transport. We introduce a new perspective on SVGD that instead views SVGD as the …

  Cited by 4 Related articles All 5 versions 

<——2020——2020———2420—— 



[PDF] researchgate.net

Tractable reformulations of two-stage distributionally robust linear programs over the type-∞ Wasserstein ball

W Xie - Operations Research Letters, 2020 - Elsevier

This paper studies a two-stage distributionally robust stochastic linear program under the

type-∞ Wasserstein ball by providing sufficient conditions under which the program can be

efficiently computed via a tractable convex program. By exploring the properties of binary …

  Cited by 7 Related articles All 4 versions


A Linear Programming Approximation of Distributionally Robust Chance-Constrained Dispatch With Wasserstein Distance

A Zhou, M Yang, M Wang… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

This paper proposes a data-driven distributionally robust chance constrained real-time

dispatch (DRCC-RTD) considering renewable generation forecasting errors. The proposed

DRCC-RTD model minimizes the expected quadratic cost function and guarantees that the …

  Cited by 5 Related articles All 2 versions



[PDF] arxiv.org

Wasserstein Index Generation Model: Automatic generation of time-series index with application to Economic Policy Uncertainty

F Xie - Economics Letters, 2020 - Elsevier

Automatic time-series index generation as a black-box method … Comparable results with existing

ones, tested on EPU … Applicable to any text corpus to produce sentiment indices … I propose

a novel method, the Wasserstein Index Generation model (WIG), to generate a public sentiment …

  Cited by 6 Related articles All 11 versions


[PDF] arxiv.org

Irregularity of distribution in Wasserstein distance

C Graham - Journal of Fourier Analysis and Applications, 2020 - Springer

We study the non-uniformity of probability measures on the interval and circle. On the

interval, we identify the Wasserstein-p distance with the classical\(L^ p\)-discrepancy. We

thereby derive sharp estimates in Wasserstein distances for the irregularity of distribution of  …

  Cited by 2 Related articles All 3 versions


[PDF] arxiv.org

Asymptotics of smoothed Wasserstein distances

HB Chen, J Niles-Weed - arXiv preprint arXiv:2005.00738, 2020 - arxiv.org

We investigate contraction of the Wasserstein distances on $\mathbb {R}^ d $ under

Gaussian smoothing. It is well known that the heat semigroup is exponentially contractive

with respect to the Wasserstein distances on manifolds of positive curvature; however, on flat …

  Cited by 2 Related articles All 2 versions 


2020





[PDF] ieee.org

Joint transfer of model knowledge and fairness over domains using Wasserstein distance

T Yoon, J Lee, W Lee - IEEE Access, 2020 - ieeexplore.ieee.org

Owing to the increasing use of machine learning in our daily lives, the problem of fairness

has recently become an important topic in machine learning societies. Recent studies

regarding fairness in machine learning have been conducted to attempt to ensure statistical …

  Cited by 1 Related articles


[PDF] thecvf.com

Barycenters of natural images constrained Wasserstein barycenters for image morphing

D Simon, A Aberdam - Proceedings of the IEEE/CVF …, 2020 - openaccess.thecvf.com

Image interpolation, or image morphing, refers to a visual transition between two (or more)

input images. For such a transition to look visually appealing, its desirable properties are (i)

to be smooth;(ii) to apply the minimal required change in the image; and (iii) to seem" real" …

  Cited by 3 Related articles All 7 versions 


On the Computation of Kantorovich--Wasserstein Distances Between Two-Dimensional Histograms by Uncapacitated Minimum Cost Flows

F Bassetti, S Gualandi, M Veneroni - SIAM Journal on Optimization, 2020 - SIAM

In this work, we present a method to compute the Kantorovich--Wasserstein distance of

order 1 between a pair of two-dimensional histograms. Recent works in computer vision and

machine learning have shown the benefits of measuring Wasserstein distances of order 1 …

  Cited by 5 Related articles All 2 versions
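
On small grids the same quantity can be checked against a generic discrete OT solver by flattening the histograms and using Euclidean ground distances between bin centers; the paper's uncapacitated min-cost-flow formulation is what makes this scale. A small-scale sketch assuming the POT package:

# Order-1 Kantorovich-Wasserstein distance between two small 2D histograms,
# via generic discrete OT (illustrative; not the paper's min-cost-flow solver).
import numpy as np
import ot                                  # POT package, assumed available

def w1_between_2d_histograms(H1, H2):
    nx, ny = H1.shape
    xs, ys = np.meshgrid(np.arange(nx), np.arange(ny), indexing="ij")
    pts = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)    # bin centers
    M = np.sqrt(((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1))   # Euclidean costs
    a = H1.ravel() / H1.sum()
    b = H2.ravel() / H2.sum()
    return ot.emd2(a, b, M)

rng = np.random.default_rng(0)
H1 = rng.random((16, 16))
H2 = rng.random((16, 16))
print(w1_between_2d_histograms(H1, H2))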


[PDF] arxiv.org

Gromov–Hausdorff limit of Wasserstein spaces on point clouds

NG Trillos - Calculus of Variations and Partial Differential …, 2020 - Springer

We consider a point cloud X_n:={x _1, ..., x _n\} X n:= x 1,…, xn uniformly distributed on the

flat torus T^ d:= R^ d/Z^ d T d:= R d/Z d, and construct a geometric graph on the cloud by

connecting points that are within distance ε ε of each other. We let P (X_n) P (X n) be the …

  Cited by 12 Related articles All 4 versions

<——2020——2020———2430——


[PDF] arxiv.org

Stability of Gibbs posteriors from the Wasserstein loss for Bayesian full waveform inversion

MM Dunlop, Y Yang - arXiv preprint arXiv:2004.03730, 2020 - arxiv.org

Recently, the Wasserstein loss function has been proven to be effective when applied to

deterministic full-waveform inversion (FWI) problems. We consider the application of this

loss function in Bayesian FWI so that the uncertainty can be captured in the solution. Other …

  Cited by 1 Related articles All 3 versions 




[PDF] stanford.edu

[PDF] Quantifying the Empirical Wasserstein Distance to a Set of Measures: Beating the Curse of Dimensionality

N Si, J Blanchet, S Ghosh, M Squillante - Advances in Neural …, 2020 - stanford.edu

… 4 Statistical convergence niansi@stanford.edu (Stanford) Wasserstein Projection October

22, 2020 2 / 10 Page 3. Wasserstein Distances and the Curse of Dimensionality Wasserstein

Distances and the Curse of Dimensionality Definition of the Wasserstein Distance (earth …

  Related articles 




2020

[PDF] researchgate.net

Inequalities of the Wasserstein mean with other matrix means

S Kim, H Lee - Annals of Functional Analysis, 2020 - Springer

Recently, a new Riemannian metric and a least squares mean of positive definite matrices

have been introduced. They are called the Bures–Wasserstein metric and Wasserstein

mean, which are different from the Riemannian trace metric and Karcher mean. In this paper …

  Cited by 2 Related articles All 2 versions


[PDF] arxiv.org

Infinite-dimensional regularization of McKean-Vlasov equation with a Wasserstein diffusion

V Marx - arXiv preprint arXiv:2002.10157, 2020 - arxiv.org

Much effort has been spent in recent years on restoring uniqueness of McKean-Vlasov

SDEs with non-smooth coefficients. As a typical instance, the velocity field is assumed to be

bounded and measurable in its space variable and Lipschitz-continuous with respect to the …

  Cited by 1 Related articles All 9 versions 




   [PDF] arxiv.org

The equivalence of Fourier-based and Wasserstein metrics on imaging problems

G Auricchio, A Codegoni, S Gualandi… - arXiv preprint arXiv …, 2020 - arxiv.org

We investigate properties of some extensions of a class of Fourier-based probability metrics,

originally introduced to study convergence to equilibrium for the solution to the spatially

homogeneous Boltzmann equation. At difference with the original one, the new Fourier …

  Cited by 1 Related articles All 7 versions 

<——2020——2020———2440——


A novel kernel Wasserstein distance on Gaussian measures: An application of identifying dental artifacts in head and neck computed tomography

JH Oh, M Pouryahya, A Iyer, AP Apte, JO Deasy… - Computers in biology …, 2020 - Elsevier

The Wasserstein distance is a powerful metric based on the theory of optimal mass

transport. It gives a natural measure of the distance between two distributions with a wide

range of applications. In contrast to a number of the common divergences on distributions …

Cited by 10 Related articles All 6 versions


[PDF] arxiv.org

Wasserstein Learning of Determinantal Point Processes

L Anquetil, M Gartrell, A Rakotomamonjy… - arXiv preprint arXiv …, 2020 - arxiv.org

Determinantal point processes (DPPs) have received significant attention as an elegant

probabilistic model for discrete subset selection. Most prior work on DPP learning focuses

on maximum likelihood estimation (MLE). While efficient and scalable, MLE approaches do …

  Related articles All 4 versions 


[PDF] arxiv.org

Automatic coding of students' writing via Contrastive Representation Learning in the Wasserstein space

R Jiang, J Gouvea, D Hammer, S Aeron - arXiv preprint arXiv:2011.13384, 2020 - arxiv.org

Qualitative analysis of verbal data is of central importance in the learning sciences. It is labor-

intensive and time-consuming, however, which limits the amount of data researchers can

include in studies. This work is a step towards building a statistical machine learning (ML) …

  Related articles All 2 versions 


[PDF] arxiv.org

Convergence in Wasserstein Distance for Empirical Measures of Dirichlet Diffusion Processes on Manifolds

FY Wang - arXiv preprint arXiv:2005.09290, 2020 - arxiv.org

Let $ M $ be a $ d $-dimensional connected compact Riemannian manifold with boundary

$\partial M $, let $ V\in C^ 2 (M) $ such that $\mu ({\rm d} x):={\rm e}^{V (x)}{\rm d} x $ is a

probability measure, and let $ X_t $ be the diffusion process generated by …

  Cited by 3 Related articles All 3 versions 


[PDF] mlr.press

Quantitative stability of optimal transport maps and linearization of the 2-wasserstein space

Q Mérigot, A Delalande… - … Conference on Artificial …, 2020 - proceedings.mlr.press

This work studies an explicit embedding of the set of probability measures into a Hilbert

space, defined using optimal transport maps from a reference probability density. This

embedding linearizes to some extent the 2-Wasserstein space and is shown to be bi-Hölder …

  Cited by 16 Related articles All 5 versions 
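A one-dimensional illustration of what such a linearization does (a sketch only; in 1-D the embedding through optimal maps from a reference measure is exactly isometric, while the paper's bi-Hölder results concern the harder multi-dimensional case; names below are made up for illustration). Embedding each sample by its quantile function turns the 2-Wasserstein distance into a plain L2 distance between embeddings.

import numpy as np

def embed(sample, qs):
    # Optimal map from the uniform reference measure, evaluated on a quantile grid;
    # in 1-D this is simply the sample's empirical quantile function.
    return np.quantile(sample, qs)

qs = (np.arange(1000) + 0.5) / 1000
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=5000)
y = rng.normal(2.0, 1.5, size=5000)

# L2 distance between embeddings approximates W_2(x, y);
# for these Gaussians the exact value is sqrt(2^2 + (1.5 - 1)^2) ~ 2.06.
print(np.sqrt(np.mean((embed(x, qs) - embed(y, qs)) ** 2)))
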


2020


[HTML] hindawi.com

[HTML] Imbalanced Fault Classification of Bearing via Wasserstein Generative Adversarial Networks with Gradient Penalty

B Han, S Jia, G Liu, J Wang - Shock and Vibration, 2020 - hindawi.com

Recently, generative adversarial networks (GANs) are widely applied to increase the

amounts of imbalanced input samples in fault diagnosis. However, the existing GAN-based

methods have convergence difficulties and training instability, which affect the fault …

  Related articles All 4 versions 
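For context, a minimal sketch of the gradient-penalty term that gives WGAN-GP its name (generic PyTorch, assuming flat feature vectors and a critic returning one score per sample; this is not the authors' fault-diagnosis architecture):

import torch

def gradient_penalty(critic, real, fake, lam=10.0):
    # Sample points on straight lines between real and generated batches
    # and push the critic's gradient norm there towards 1.
    eps = torch.rand(real.size(0), 1, device=real.device)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads = torch.autograd.grad(outputs=scores, inputs=interp,
                                grad_outputs=torch.ones_like(scores),
                                create_graph=True)[0]
    return lam * ((grads.norm(2, dim=1) - 1.0) ** 2).mean()

# Critic loss per batch (sketch):
# critic(fake).mean() - critic(real).mean() + gradient_penalty(critic, real, fake)
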


[PDF] arxiv.org

Derivative over Wasserstein spaces along curves of densities

R Buckdahn, J Li, H Liang - arXiv preprint arXiv:2010.01507, 2020 - arxiv.org

In this paper, given any random variable $\xi $ defined over a probability space

$(\Omega,\mathcal {F}, Q) $, we focus on the study of the derivative of functions of the form $

L\mapsto F_Q (L):= f\big ((LQ) _ {\xi}\big), $ defined over the convex cone of densities …

  Related articles All 2 versions 


Conditional Wasserstein GAN-based Oversampling of Tabular Data for Imbalanced Learning

J Engelmann, S Lessmann - arXiv preprint arXiv:2008.09202, 2020 - arxiv.org

Class imbalance is a common problem in supervised learning and impedes the predictive

performance of classification models. Popular countermeasures include oversampling the

minority class. Standard methods like SMOTE rely on finding nearest neighbours and linear …

  Cited by 3 Related articles All 5 versions 


[PDF] projecteuclid.org

Existence of probability measure valued jump-diffusions in generalized Wasserstein spaces

M Larsson, S Svaluto-Ferro - Electronic Journal of Probability, 2020 - projecteuclid.org

We study existence of probability measure valued jump-diffusions described by martingale

problems. We develop a simple device that allows us to embed Wasserstein spaces and

other similar spaces of probability measures into locally compact spaces where classical …

  Cited by 2 Related articles All 3 versions


[PDF] arxiv.org

Convergence of Recursive Stochastic Algorithms using Wasserstein Divergence

A Gupta, WB Haskell - arXiv preprint arXiv:2003.11403, 2020 - arxiv.org

This paper develops a unified framework, based on iterated random operator theory, to

analyze the convergence of constant stepsize recursive stochastic algorithms (RSAs) in

machine learning and reinforcement learning. RSAs use randomization to efficiently …

  Related articles All 2 versions 

<——2020——2020———2450——



First arrival picking of microseismic signals based on nested U-Net and Wasserstein Generative Adversarial Network

JL Zhang, GQ Sheng - Journal of Petroleum Science and Engineering, 2020 - Elsevier

Picking the first arrival of microseismic signals, quickly and accurately, is the key for real-time

data processing of microseismic monitoring. The traditional method cannot meet the high-

accuracy and high-efficiency requirements for the firstarrival microseismic picking, in a low …

  Related articles All 2 versions


[PDF] arxiv.org

Segmentation analysis and the recovery of queuing parameters via the Wasserstein distance: a study of administrative data for patients with chronic obstructive …

H Wilde, V Knight, J Gillard, K Smith - arXiv preprint arXiv:2008.04295, 2020 - arxiv.org

This work uses a data-driven approach to analyse how the resource requirements of

patients with chronic obstructive pulmonary disease (COPD) may change, and quantifies

how those changes affect the strains of the hospital system the patients interact with. This is …

  Related articles All 3 versions 


[PDF] arxiv.org

Data-Driven Approximation of the Perron-Frobenius Operator Using the Wasserstein Metric

A Karimi, TT Georgiou - arXiv preprint arXiv:2011.00759, 2020 - arxiv.org

This manuscript introduces a regression-type formulation for approximating the Perron-

Frobenius Operator by relying on distributional snapshots of data. These snapshots may

represent densities of particles. The Wasserstein metric is leveraged to define a suitable …

  Related articles All 3 versions 


[PDF] arxiv.org

Minimax control of ambiguous linear stochastic systems using the Wasserstein metric

K Kim, I Yang - 2020 59th IEEE Conference on Decision and …, 2020 - ieeexplore.ieee.org

In this paper, we propose a minimax linear-quadratic control method to address the issue of

inaccurate distribution information in practical stochastic systems. To construct a control

policy that is robust against errors in an empirical distribution of uncertainty, our method …

  Cited by 4 Related articles All 3 versions


 

[PDF] arxiv.org

Consistency of Distributionally Robust Risk-and Chance-Constrained Optimization Under Wasserstein Ambiguity Sets

A Cherukuri, AR Hota - IEEE Control Systems Letters, 2020 - ieeexplore.ieee.org

We study stochastic optimization problems with chance and risk constraints, where in the

latter, risk is quantified in terms of the conditional value-at-risk (CVaR). We consider the

distributionally robust versions of these problems, where the constraints are required to hold …

  Cited by 1 Related articles All 3 versions


2020


[PDF] ieee.org

Study of Restrained Network Structures for Wasserstein Generative Adversarial Networks (WGANs) on Numeric Data Augmentation

W Wang, C Wang, T Cui, Y Li - IEEE Access, 2020 - ieeexplore.ieee.org

Some recent studies have suggested using Generative Adversarial Network (GAN) for

numeric data over-sampling, which is to generate data for completing the imbalanced

numeric data. Compared with the conventional over-sampling methods, taken SMOTE as an …

  Cited by 1 Related articles


[PDF] arxiv.org

Approximation rate in Wasserstein distance of probability measures on the real line by deterministic empirical measures

O Bencheikh, B Jourdain - arXiv preprint arXiv:2012.09729, 2020 - arxiv.org

We are interested in the approximation in Wasserstein distance with index $\rho\ge 1$ of a

probability measure $\mu $ on the real line with finite moment of order $\rho $ by the

empirical measure of $ N $ deterministic points. The minimal error converges to $0 $ as …

  Related articles All 3 versions
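A small numerical illustration of the quantity studied above (a sketch under simplifying assumptions, not the paper's analysis): on the real line W_1(mu, nu) equals the integral of |F_mu - F_nu|, so one can measure how well N deterministic mid-quantile points approximate a standard normal. The function name and grid are illustrative choices.

import numpy as np
from scipy.stats import norm

def w1_to_point_cloud(points, grid):
    # W_1 between N(0,1) and the uniform measure on `points`, via the CDF formula,
    # integrated numerically on `grid`.
    F = norm.cdf(grid)
    F_N = np.searchsorted(np.sort(points), grid, side="right") / len(points)
    return np.trapz(np.abs(F - F_N), grid)

grid = np.linspace(-8.0, 8.0, 20001)
for N in (10, 100, 1000):
    pts = norm.ppf((2 * np.arange(N) + 1) / (2 * N))   # N deterministic mid-quantile points
    print(N, w1_to_point_cloud(pts, grid))             # error shrinks roughly like 1/N
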


  

[PDF] arxiv.org

Hierarchical Low-Rank Approximation of Regularized Wasserstein distance

M Motamed - arXiv preprint arXiv:2004.12511, 2020 - arxiv.org

Sinkhorn divergence is a measure of dissimilarity between two probability measures. It is

obtained through adding an entropic regularization term to Kantorovich's optimal transport

problem and can hence be viewed as an entropically regularized Wasserstein distance …

  Related articles All 3 versions 
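For reference, a minimal sketch of the entropically regularized transport problem underlying the Sinkhorn divergence (generic matrix scaling on a small 1-D grid; this is a textbook baseline, not the paper's hierarchical low-rank scheme, and the parameter values are arbitrary):

import numpy as np

def sinkhorn_plan(a, b, C, eps=0.05, n_iter=500):
    # Alternating scaling: the regularized optimal plan has the form diag(u) K diag(v),
    # with K = exp(-C / eps).
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iter):
        u = a / (K @ v)
        v = b / (K.T @ u)
    return u[:, None] * K * v[None, :]

x = np.linspace(0.0, 1.0, 50)
C = (x[:, None] - x[None, :]) ** 2
a = np.exp(-(x - 0.3) ** 2 / 0.01); a /= a.sum()
b = np.exp(-(x - 0.7) ** 2 / 0.01); b /= b.sum()
P = sinkhorn_plan(a, b, C)
print(float(np.sum(P * C)))   # entropically regularized W_2^2, roughly (0.7 - 0.3)^2
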


Covariance Steering of Discrete-Time Stochastic Linear Systems Based on Wasserstein Distance Terminal Cost

IM Balci, E Bakolas - IEEE Control Systems Letters, 2020 - ieeexplore.ieee.org

We consider a class of stochastic optimal control problems for discrete-time linear systems

whose objective is the characterization of control policies that will steer the probability

distribution of the terminal state of the system close to a desired Gaussian distribution. In our …

  Cited by 2 Related articles


[PDF] stanford.edu

A Class of Optimal Transport Regularized Formulations with Applications to Wasserstein GANs

S Mahdian, JH Blanchet… - 2020 Winter Simulation …, 2020 - ieeexplore.ieee.org

Optimal transport costs (eg Wasserstein distances) are used for fitting high-dimensional

distributions. For example, popular artificial intelligence algorithms such as Wasserstein

Generative Adversarial Networks (WGANs) can be interpreted as fitting a black-box …

  All 2 versions

<——2020——2020———2460——



Self-improvement of the Bakry-Emery criterion for Poincaré inequalities and Wasserstein contraction using variable curvature bounds

P Cattiaux, M Fathi, A Guillin - arXiv preprint arXiv:2002.09221, 2020 - arxiv.org

We study Poincaré inequalities and long-time behavior for diffusion processes on R^n

under a variable curvature lower bound, in the sense of Bakry-Emery. We derive various

estimates on the rate of convergence to equilibrium in L^ 1 optimal transport distance, as …

  Cited by 1 Related articles All 15 versions 


A Riemannian submersion‐based approach to the Wasserstein barycenter of positive definite matrices

M Li, H Sun, D Li - Mathematical Methods in the Applied …, 2020 - Wiley Online Library

In this paper, we introduce a novel geometrization on the space of positive definite matrices,

derived from the Riemannian submersion from the general linear group to the space of

positive definite matrices, resulting in easier computation of its geometric structure. The …

  Related articles


[PDF] arxiv.org

Equidistribution of random walks on compact groups II. The Wasserstein metric

B Borda - arXiv preprint arXiv:2004.14089, 2020 - arxiv.org

We consider a random walk $ S_k $ with iid steps on a compact group equipped with a bi-

invariant metric. We prove quantitative ergodic theorems for the sum $\sum_ {k= 1}^ N f

(S_k) $ with Hölder continuous test functions $ f $, including the central limit theorem, the …

  Related articles All 2 versions 


[PDF] arxiv.org

The existence of minimizers for an isoperimetric problem with Wasserstein penalty term in unbounded domains

Q Xia, B Zhou - arXiv preprint arXiv:2002.07129, 2020 - arxiv.org

In this article, we consider the (double) minimization problem $$\min\left\{P

(E;\Omega)+\lambda W_p (E, F):~ E\subseteq\Omega,~ F\subseteq\mathbb {R}^ d,~\lvert

E\cap F\rvert= 0,~\lvert E\rvert=\lvert F\rvert= 1\right\}, $$ where $ p\geqslant 1$, $\Omega …

  Related articles All 4 versions 


Convergence in Monge-Wasserstein Distance of Mean Field Systems with Locally Lipschitz Coefficients

DT Nguyen, SL Nguyen, NH Du - Acta Mathematica Vietnamica, 2020 - Springer

This paper focuses on stochastic systems of weakly interacting particles whose dynamics

depend on the empirical measures of the whole populations. The drift and diffusion

coefficients of the dynamical systems are assumed to be locally Lipschitz continuous and …

  Related articles


Functional Data Clustering Analysis via the Learning of Gaussian Processes with Wasserstein Distance

T Li, J Ma - International Conference on Neural Information …, 2020 - Springer

Functional data clustering analysis becomes an urgent and challenging task in the new era

of big data. In this paper, we propose a new framework for functional data clustering

analysis, which adopts a similar structure as the k-means algorithm for the conventional …

  Related articles


[PDF] tins.ro

Enhancing the Classification of EEG Signals using Wasserstein Generative Adversarial Networks

VM Petruţiu, LD Palcu, C Lemnaur… - 2020 IEEE 16th …, 2020 - ieeexplore.ieee.org

Collecting EEG signal data during a human visual recognition task is a costly and time-

consuming process. However, training good classification models usually requires a large

amount of quality data. We propose a data augmentation method based on Generative …

  Cited by 1 Related articles All 2 versions


[PDF] arxiv.org

Characterization of probability distribution convergence in Wasserstein distance by Lp-quantization error function

Y Liu, G Pagès - Bernoulli, 2020 - projecteuclid.org

We establish conditions to characterize probability measures by their $ L^{p} $-quantization

error functions in both $\mathbb {R}^{d} $ and Hilbert settings. This characterization is two-

fold: static (identity of two distributions) and dynamic (convergence for the $ L^{p} …

  Cited by 1 Related articles All 5 versions


Wasserstein metric-based Boltzmann entropy of a landscape mosaic: a clarification, correction, and evaluation of thermodynamic consistency

P Gao, H Zhang, Z Wu - Landscape Ecology - Springer

Objectives The first objective is to provide a clarification of and a correction to the

Wasserstein metric-based method. The second is to evaluate the method in terms of

thermodynamic consistency using different implementations. Methods Two implementation …

  Related articles


Data Augmentation Method for Fault Diagnosis of Mechanical Equipment Based on Improved Wasserstein GAN

W Liu, L Duan, Y Tang, J Yang - 2020 11th International …, 2020 - ieeexplore.ieee.org

Most of the time the mechanical equipment is in normal operation state, which results in high

imbalance between fault data and normal data. In addition, traditional signal processing

methods rely heavily on expert experience, making it difficult for classification or prediction …

  Related articles

<——2020——2020———2470——



[PDF] arxiv.org

On nonexpansiveness of metric projection operators on Wasserstein spaces

A Adve, A Mészáros - arXiv preprint arXiv:2009.01370, 2020 - arxiv.org

In this note we investigate properties of metric projection operators onto closed and

geodesically convex proper subsets of Wasserstein spaces $(\mathcal {P} _p (\mathbf {R}^

d), W_p). $ In our study we focus on the particular subset of probability measures having …

  Related articles All 3 versions 


[HTML] Solutions of a Class of Degenerate Kinetic Equations Using Steepest Descent in Wasserstein Space

A Marcos, A Soglo - Journal of Mathematics, 2020 - hindawi.com

We use the steepest descent method in an Orlicz–Wasserstein space to study the existence

of solutions for a very broad class of kinetic equations, which include the Boltzmann

equation, the Vlasov–Poisson equation, the porous medium equation, and the parabolic p …

  Related articles All 6 versions 


Convergence rates of the blocked Gibbs sampler with random scan in the Wasserstein metric

NY Wang, G Yin - Stochastics, 2020 - Taylor & Francis

This paper establishes …

  Related articles All 4 versions


[PDF] future-in-tech.net

[PDF] Wasserstein Riemannian geometry of Gamma densities

C Ogouyandjou, N Wadagni - Computer Science, 2020 - ijmcs.future-in-tech.net

Abstract A Wasserstein Riemannian Gamma manifold is a space of Gamma probability

density functions endowed with the Riemannian Otto metric which is related to the

Wasserstein distance. In this paper, we study some geometric properties of such Riemanian …

  Related articles 


[PDF] optimization-online.org

[PDF] Dual Decomposition of Two-Stage Distributionally Robust Mixed-Integer Programming under the Wasserstein Ambiguity Set

K Kim - optimization-online.org

We develop a dual decomposition of two-stage distributionally robust mixed-integer

programming (DRMIP) under the Wasserstein ambiguity set. The dual decomposition is

based on the Lagrangian dual of DRMIP, which results from the Lagrangian relaxation of the …

  Related articles All 2 versions 


2020


[PDF] researchgate.net

[PDF] Potential Analysis of Wasserstein GAN as an Anomaly Detection Method for Industrial Images

A Misik - researchgate.net

The task of detecting anomalies in images is a crucial part of current industrial optical

monitoring systems. In recent years, neural networks have proven to be an efficient method

for this problem, especially autoencoders and generative adversarial networks (GAN). A …

  

[PDF] brown.edu

[PDF] Reduced-order modeling of transport equations using Wasserstein spaces

V Ehrlacher, D Lombardi, O Mula, FX Vialard - icerm.brown.edu

… $\mathrm{Bar}(U,\Lambda) = \operatorname{argmin}_{u \in \mathcal{P}_2(\Omega)} \sum_{i=1}^{n} \lambda_i \, W_2(u, u_i)^2$. The measure $\mathrm{Bar}(U,\Lambda)$ is unique and is called

the Wasserstein barycenter of $U$ with weights $\Lambda$. This object is the Wasserstein counterpart of the

$L^2(\Omega)$ barycenter of a set of functions $(\rho_1, \cdots, \rho_n) \in L^2(\Omega)^n$ with barycentric weights $\Lambda$. Indeed …

  Related articles 


[PDF] epfl.ch

[PDF] THE CONTINUOUS FORMULATION OF SHALLOW NEURAL NETWORKS AS WASSERSTEIN-TYPE GRADIENT FLOWS

X FERNÁNDEZ-REAL, A FIGALLI - sma.epfl.ch

It has been recently observed that the training of a single hidden layer artificial neural

network can be reinterpreted as a Wasserstein gradient flow for the weights for the error

functional. In the limit, as the number of parameters tends to infinity, this gives rise to a family …

  Related articles 


 Remaining useful life prediction of lithium-ion batteries using a fusion method based on Wasserstein GAN

Zhou Wending, Bao Shijian, Xu Fangmin, Zhao Chenglin - The Journal of China Universities of Posts and Telecommunications (English edition), 2020 - jcupt.bupt.edu.cn

Lithium-ion batteries are the main power supply equipment in many fields due to their

advantages of no memory, high energy density, long cycle life and no pollution to the

environment. Accurate prediction for the remaining useful life (RUL) of lithium-ion batteries …

  All 2 versions 


Optimality in weighted L2-Wasserstein goodness-of-fit statistics


T De Wet, V Humble - South African Statistical Journal, 2020 - journals.co.za

In Del Barrio, Cuesta-Albertos, Matran and Rodriguez-Rodriguez (1999) and Del Barrio,

Cuesta-Albertos and Matran (2000), the authors introduced a new class of goodness-of-fit

statistics based on the L2-Wasserstein distance. It was shown that the desirable property of  …

  Related articles All 3 versions

<——2020——2020———2480——


[PDF] projecteuclid.org

Existence of probability measure valued jump-diffusions in generalized Wasserstein spaces

M Larsson, S Svaluto-Ferro - Electronic Journal of Probability, 2020 - projecteuclid.org

We study existence of probability measure valued jump-diffusions described by martingale 

problems. We develop a simple device that allows us to embed Wasserstein spaces and 

other similar spaces of probability measures into locally compact spaces where classical …

Cited by 3 Related articles All 2 versions


[PDF] researchgate.net

[PDF] Computational hardness and fast algorithm for fixed-support wasserstein barycenter

T Lin, N Ho, X Chen, M Cuturi… - arXiv preprint arXiv …, 2020 - researchgate.net

We study in this paper the fixed-support Wasserstein barycenter problem (FS-WBP), which 

consists in computing the Wasserstein barycenter of m discrete probability measures 

supported on a finite metric space of size n. We show first that the constraint matrix arising …

Cited by 3 Related articles All 2 versions

 

[PDF] mlr.press

Bridging the gap between f-gans and wasserstein gans

J Song, S Ermon - International Conference on Machine …, 2020 - proceedings.mlr.press

Generative adversarial networks (GANs) variants approximately minimize divergences 

between the model and the data distribution using a discriminator. Wasserstein GANs 

(WGANs) enjoy superior empirical performance, however, unlike in f-GANs, the discriminator …

Cited by 8 Related articles All 4 versions

[CITATION] Bridging the Gap Between f-GANs and Wasserstein GANs. arXiv e-prints, page

J Song, S Ermon - arXiv preprint arXiv:1910.09779, 2019

  Cited by 3 Related articles


Fast algorithms for computational optimal transport and wasserstein barycenter

W Guo, N Ho, M Jordan - … on Artificial Intelligence and …, 2020 - proceedings.mlr.press

We provide theoretical complexity analysis for new algorithms to compute the optimal 

transport (OT) distance between two discrete probability distributions, and demonstrate their 

favorable practical performance compared to state-of-art primal-dual algorithms. First, we …

Cited by 2 Related articles All 4 versions
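As a point of reference for what such algorithms compute, here is a minimal sketch of a fixed-support barycenter via the standard entropic (iterative Bregman projection) scheme on a shared 1-D grid; this is only a textbook baseline under arbitrary parameter choices, not the accelerated algorithms analyzed in the paper.

import numpy as np

def entropic_barycenter(B, C, weights, eps=0.02, n_iter=1000):
    # B: (n, m) array whose m columns are histograms on a shared support of size n;
    # returns an entropically smoothed barycenter of size n.
    n, m = B.shape
    K = np.exp(-C / eps)
    V = np.ones((n, m))
    for _ in range(n_iter):
        U = B / (K @ V)                                 # match each input marginal
        a = np.prod((K.T @ U) ** weights, axis=1)       # weighted geometric mean -> barycenter
        V = a[:, None] / (K.T @ U)                      # match the shared barycenter marginal
    return a

x = np.linspace(0.0, 1.0, 60)
C = (x[:, None] - x[None, :]) ** 2
b1 = np.exp(-(x - 0.25) ** 2 / 0.005); b1 /= b1.sum()
b2 = np.exp(-(x - 0.75) ** 2 / 0.005); b2 /= b2.sum()
bary = entropic_barycenter(np.stack([b1, b2], axis=1), C, np.array([0.5, 0.5]))
print(bary.argmax())   # the barycenter's mass sits between the two inputs
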


2020 [PDF] unifi.it

[PDF] Conlon: A pseudo-song generator based on a new pianoroll, wasserstein autoencoders, and optimal interpolations

L Angioloni, T Borghuis, L Brusci… - Proceedings of the 21st …, 2020 - flore.unifi.it

We introduce CONLON, a pattern-based MIDI generation method that employs a new 

lossless pianoroll-like data description in which velocities and durations are stored in 

separate channels. CONLON uses Wasserstein autoencoders as the underlying generative …

Cited by 1 Related articles All 7 versions

CONLON: A pseudo-song generator based on a new pianoroll, Wasserstein autoencoders, and optimal interpolations book

2020

 

Semantics-assisted Wasserstein Learning for Topic and Word Embeddings

C Li, X Li, J Ouyang, Y Wang - 2020 IEEE International …, 2020 - ieeexplore.ieee.org

Wasserstein distance, defined as the cost (measured by word embeddings) of optimal 

transport plan for moving between two histograms, has been proven effective in tasks of 

natural language processing. In this paper, we extend Nonnegative Matrix Factorization …

All 2 versions 


[PDF] arxiv.org

Spectral Unmixing With Multinomial Mixture Kernel and Wasserstein Generative Adversarial Loss

S Ozkan, GB Akar - arXiv preprint arXiv:2012.06859, 2020 - arxiv.org

This study proposes a novel framework for spectral unmixing by using 1D convolution 

kernels and spectral uncertainty. High-level representations are computed from data, and 

they are further modeled with the Multinomial Mixture Model to estimate fractions under …

Related articles All 2 versions


[PDF] aclweb.org

WAE-RN: Integrating Wasserstein Autoencoder and Relational Network for Text Sequence

X Zhang, X Liu, G Yang, F Li, W Liu - China National Conference on …, 2020 - Springer

Abstract One challenge in Natural Language Processing (NLP) area is to learn semantic 

representation in different contexts. Recent works on pre-trained language model have 

received great attentions and have been proven as an effective technique. In spite of the …

Related articles All 4 versions 


Covariance Steering of Discrete-Time Stochastic Linear Systems Based on Wasserstein Distance Terminal Cost

IM Balci, E Bakolas - IEEE Control Systems Letters, 2020 - ieeexplore.ieee.org

… on knowledge of good initial guesses and thus, in general, a systematic process for the … consider

the case in which the terminal cost corresponds to the squared Wasserstein distance between …

distribution but in contrast with the latter reference, we consider the discrete-time case …

  Cited by 2 Related articles


[PDF] arxiv.org

Unadjusted Langevin algorithm with multiplicative noise: Total variation and Wasserstein bounds

F Panloup - arXiv preprint arXiv:2012.14310, 2020 - arxiv.org

In this paper, we focus on non-asymptotic bounds related to the Euler scheme of an ergodic 

diffusion with a possibly multiplicative diffusion term (non-constant diffusion coefficient). 

More precisely, the objective of this paper is to control the distance of the standard Euler …

Related articles All 2 versions

2020

Unadjusted Langevin algorithm with multiplicative noise: Total variation and Wasserstein bounds

F Panloup - arXiv preprint arXiv:2012.14310, 2020 - arxiv.org

… In this paper, we focus on non-asymptotic bounds related to the Euler scheme of an ergodic diffusion with a possibly multiplicative … improve) such bounds for Total Variation and L1-Wasserstein distances in both multiplicative and additive frameworks. These bounds rely on …

Related articles All 3 versions

[CITATION] Total Variation and Wasserstein bounds for the ergodic Euler-Maruyama scheme for diffusions

G Pages, F Panloup - Preprint, 2020

  Cited by 2


[PDF] archives-ouvertes.fr

Unadjusted Langevin algorithm with multiplicative noise: Total variation and Wasserstein bounds

G Pages, F Panloup - 2020 - hal.archives-ouvertes.fr

In this paper, we focus on non-asymptotic bounds related to the Euler scheme of an ergodic 

diffusion with a possibly multiplicative diffusion term (non-constant diffusion coefficient). 

More precisely, the objective of this paper is to control the distance of the standard Euler …

Related articles All 5 versions

[CITATION] Unadjusted Langevin algorithm with multiplicative noise: Total variation and Wasserstein bounds. arXiv e-prints, page

G Pagès, F Panloup - arXiv preprint arXiv:2012.14310, 2020

 Cited by 2

<——2020——2020———2490—



2020 see 2019

[PDF] arxiv.org

Wasserstein gradient flow formulation of the time-fractional Fokker-Planck equation

MH Duong, B Jin - arXiv preprint arXiv:1908.09055, 2019 - arxiv.org

In this work, we investigate a variational formulation for a time-fractional Fokker-Planck

equation which arises in the study of complex physical systems involving anomalously slow

diffusion. The model involves a fractional-order Caputo derivative in time, and thus

inherently nonlocal. The study follows the Wasserstein gradient flow approach pioneered by

[26]. We propose a JKO type scheme for discretizing the model, using the L1 scheme for the

Caputo fractional derivative in time, and establish the convergence of the scheme as the …

Cited by 3 Related articles All 10 versions 


[CITATION] Wasserstein gradient flow formulation of the time-fractional Fokker-Planck equation

B Jin, MH Duong - Communications in Mathematical Sciences, 2020 - discovery.ucl.ac.uk

… Wasserstein gradient flow formulation of the time-fractional Fokker-Planck equation. Jin, B; Duong,

MH; (2020) Wasserstein gradient flow formulation of the time-fractional Fokker-Planck equation.

Communications in Mathematical Sciences (In press). [img], Text fracFPE_cms_revised.pdf …
 

A Novel Data-to-Text Generation Model with Transformer Planning and a Wasserstein Auto-Encoder

X Xu, T He, H Wang - 2020 IEEE International Conference on …, 2020 - ieeexplore.ieee.org

Existing methods for data-to-text generation have difficulty producing diverse texts with low 

duplication rates. In this paper, we propose a novel data-to-text generation model with 

Transformer planning and a Wasserstein auto-encoder, which can convert constructed data …

Cited by 3 Related articles All 3 versions



 

Fixed-Support Wasserstein Barycenter: Computational Hardness and Efficient Algorithms

T Lin, N Ho, X Chen, M Cuturi, MI Jordan - 2020 - research.google

We study in this paper the finite-support Wasserstein barycenter problem (FS-WBP), which 

consists in computing the Wasserstein barycenter of $ m $ discrete probability measures 

supported on a finite metric space of size $ n $. We show first that the constraint matrix …


 

[PDF] sabanciuniv.edu

Cyclic Adversarial Framework with Implicit Autoencoder and Wasserstein Loss (CAFIAWL)

E Bonabi Mobaraki - 2020 - research.sabanciuniv.edu

Since the day that the Simple Perceptron was invented, Artificial Neural Networks (ANNs) 

attracted many researchers. Technological improvements in computers and the internet 

paved the way for unseen computational power and an immense amount of data that …

Related articles 


2020 dissertation

Data-driven Distributionally Robust Stochastic Optimization via Wasserstein Distance with Applications to Portfolio Risk Management and Inventory Control

D Singh - 2020 - conservancy.umn.edu

The central theme of this dissertation is stochastic optimization under distributional 

ambiguity. One can think of this as a two player game between a decision maker, who tries to 

minimize some loss or maximize some reward, and an adversarial agent that chooses the …

 Related articles All 4 versions

2020

[PDF] amazonaws.com

[PDF] Bayesian Wasserstein GAN and Application for Vegetable Disease Image Data

W Cho, MH Na, S Kang, S Kim - 2020 - manuscriptlink-society-file.s3 …

Various GAN models have been proposed so far and they are used in various fields. 

However, despite the excellent performance of these GANs, the biggest problem is that the 

model collapse occurs in the simultaneous optimization of the generator and discriminator of …

Related articles


[PDF] sci-en-tech.com

[PDF] Entropy-regularized Wasserstein Distances for Analyzing Environmental and Ecological Data

H Yoshioka, Y Yoshioka, Y Yaegashi - THE 11TH …, 2020 - sci-en-tech.com

We explore applicability of entropy-regularized Wasserstein (pseudo-) distances as new 

tools for analyzing environmental and ecological data. In this paper, the two specific 

examples are considered and are numerically analyzed using the Sinkhorn algorithm. The …

Related articles All 2 versions


Semi-supervised Surface Wave Tomography with Wasserstein Cycle-consistent GAN: Method and Application on Southern California Plate Boundary Region

A Cai, H Qiu, F Niu - 2020 - essoar.org

Machine learning algorithm is applied to shear wave velocity (Vs) inversion in surface wave 

tomography, where a set of 1-D Vs profiles and the corresponding synthetic dispersion 

curves are used in network training. Previous studies showed that performances of a trained …


[PDF] semanticscholar.org

[PDF] Deconvolution for the Wasserstein metric and topological inference

B Michel - pdfs.semanticscholar.org

The SEE (Société de l'Electricité, de l'Electronique et des Technologies de l'Information et de

la Communication, an association recognized as serving the public interest, governed by the law of 1 July

1901) makes available to its members and to subscribers to its publications a …

[CITATION] Deconvolution for the Wasserstein metric and topological inference

B Michel

[PDF] googleapis.com

Methods and devices performing adaptive quadratic wasserstein full-waveform inversion

W Diancheng, P Wang - US Patent App. 16/662,644, 2020 - Google Patents

Methods and devices for seismic exploration of an underground structure apply W 2-based 

full-wave inversion to transformed synthetic and seismic data. Data transformation ensures 

that the synthetic and seismic data are positive definite and have the same mass using an …

Cited by 1 Related articles All 2 versions

<——2020—–—2020———2500—


Network Intrusion Detection Based on Conditional ...

http://ieeexplore.ieee.org › document


Oct 19, 2020 — Network Intrusion Detection Based on Conditional Wasserstein Generative Adversarial Network and Cost-Sensitive Stacked Autoencoder.

DOI: 10.1109/ACCESS.2020.3031892

[PDF] ieee.org  

[CITATION] Network Intrusion Detection Based on Conditional Wasserstein Generative Adversarial Network and Cost-Sensitive Stacked Autoencoder

G Zhang, X Wang, R Li, Y Song, J He, J Lai - IEEE Access, 2020 - ieeexplore.ieee.org

In the field of intrusion detection, there is often a problem of data imbalance, and more and 

more unknown types of attacks make detection difficult. To resolve above issues, this article 

proposes a network intrusion detection model called CWGAN-CSSAE, which combines …

Cited by 18 Related articles


Super Resolution Method for Remote Sensing Images Based on Cascaded Conditional Wasserstein GANs

B Liu, H Li, Y Zhou, Y Peng, Elazab… - 2020 IEEE 3rd …, 2020 - ieeexplore.ieee.org

High-resolution (HR) remote sensing imagery is quite beneficial for subsequent

interpretation. Obtaining HR images can be achieved by upgrading the imaging device. Yet,

the cost to perform this task is very huge. Thus, it is necessary to obtain HR images from low …

  Related articles


Wasserstein Distributionally Robust Chance ... - DTU Orbit

https://orbit.dtu.dk › files › EJOR_Paper

PDF

by A Arrigo · Cited by 6 — Wasserstein Distributionally Robust Chance-Constrained Optimization for Energy and Reserve Dispatch: An Exact and Physically-Bounded ...

[CITATION] Wasserstein distributionally robust chance-constrained optimization for energy and reserve dispatch: An exact and physically-bounded formulation

A Arrigo, C Ordoudis, J Kazempour, Z De Grève… - Eur. J. Oper. Res. under …, 2020

Cited by 2 


2020  [PDF] arxiv.org

The existence of minimizers for an isoperimetric problem with Wasserstein penalty term in unbounded domains

Q Xia, B Zhou - arXiv preprint arXiv:2002.07129, 2020 - arxiv.org

In this article, we consider the (double) minimization problem $$\min\left\{P

(E;\Omega)+\lambda W_p (E, F):~ E\subseteq\Omega,~ F\subseteq\mathbb {R}^ d,~\lvert

E\cap F\rvert= 0,~\lvert E\rvert=\lvert F\rvert= 1\right\}, $$ where $ p\geqslant 1$, $\Omega …

  Related articles All 4 versions 


Theory Seminar: Smooth Wasserstein Distance: Metric ...

https://www.cs.cornell.edu › content › theory-seminar-s...

Speaker: Ziv Goldfeld, Cornell · Monday, April 27, 2020 - 15:45 · Streaming via Zoom. Host: Bobby Kleinberg · Spring 2020 Theory ...

2020


2020 SEE 2019

Minimax estimation of smooth densities in Wasserstein distance

https://talks.cam.ac.uk › talk › index

Nov 6, 2020 — https://maths-cam-ac-uk.zoom.us/j/92821218455?pwd=aHFOZWw5bzVReUNYR2d5OWc1Tk15Zz09. If you have a question about this talk, ...

 Minimax estimation of smooth densities in Wasserstein distance


Jonathan Niles-Weed (Courant Institute)

Friday 06 November 2020 zoom


2020

Theory Seminar: Smooth Wasserstein Distance: Metric ...

https://www.cs.cornell.edu › content › theory-seminar-s...

Apr 27, 2020 — Streaming via Zoom. Host: ... Abstract: The Wasserstein distance has seen a surge of interest and applications in machine learning. This stems ...

Theory Seminar: Smooth Wasserstein Distance: Metric ...

www.cs.cornell.edu › content › theory-seminar-smooth...

Apr 27, 2020 — Streaming via Zoom ... This talk proposes a novel smooth 1-Wasserstein distance (W1), t\

 Faster Wasserstein Distance Estimation with the Sinkhorn Divergence

L Chizat, P Roussillon, F Léger… - Advances in Neural …, 2020 - proceedings.neurips.cc

The squared Wasserstein distance is a natural quantity to compare probability distributions

in a non-parametric setting. This quantity is usually estimated with the plug-in estimator,

defined via a discrete optimal transport problem which can be solved to $\epsilon …

  Cited by 8 Related articles All 7 versions 

[PDF] semanticscholar.org

[PDF] Faster Wasserstein Distance Estimation with the Sinkhorn Divergence

FX Vialard, G Peyré - pdfs.semanticscholar.org

… (slides: CNRS, Université Paris-Sud, ENS Paris, Université Gustave Eiffel) Optimal Transport & Entropic Regularization; Statistical Optimal Transport:

Estimation of the Squared Wasserstein Distance. Let µ and ν be probability densities on the unit ball in R^d. Given ˆµ …


[PDF] academia.edu

Data augmentation in fault diagnosis based on the Wasserstein generative adversarial network with gradient penalty

X Gao, F Deng, X Yue - Neurocomputing, 2020 - Elsevier

Fault detection and diagnosis in industrial process is an extremely essential part to keep

away from undesired events and ensure the safety of operators and facilities. In the last few

decades various data based machine learning algorithms have been widely studied to …

  Cited by 31 Related articles All 3 versions


[PDF] mlr.press

Bridging the gap between f-gans and wasserstein gans

J Song, S Ermon - International Conference on Machine …, 2020 - proceedings.mlr.press

Generative adversarial networks (GANs) variants approximately minimize divergences

between the model and the data distribution using a discriminator. Wasserstein GANs

(WGANs) enjoy superior empirical performance, however, unlike in f-GANs, the discriminator …

  Cited by 8 Related articles All 4 versions 

<——2020——2020———2510——



[PDF] researchgate.net

The quadratic Wasserstein metric for inverse data matching

B Engquist, K Ren, Y Yang - Inverse Problems, 2020 - iopscience.iop.org

This work characterizes, analytically and numerically, two major effects of the quadratic

Wasserstein (W 2) distance as the measure of data discrepancy in computational solutions

of inverse problems. First, we show, in the infinite-dimensional setup, that the W 2 distance …

  Cited by 5 Related articles All 6 versions


[PDF] arxiv.org

The back-and-forth method for wasserstein gradient flows

M Jacobs, W Lee, F Léger - arXiv preprint arXiv:2011.08151, 2020 - arxiv.org

We present a method to efficiently compute Wasserstein gradient flows. Our approach is

based on a generalization of the back-and-forth method (BFM) introduced by Jacobs and

Léger to solve optimal transport problems. We evolve the gradient flow by solving the dual …

  Cited by 1 Related articles All 2 versions 


[PDF] arxiv.org

Regularized variational data assimilation for bias treatment using the Wasserstein metric

SK Tamang, A Ebtehaj, D Zou… - Quarterly Journal of the …, 2020 - Wiley Online Library

This article presents a new variational data assimilation (VDA) approach for the formal

treatment of bias in both model outputs and observations. This approach relies on the

Wasserstein metric, stemming from the theory of optimal mass transport, to penalize the  …

  Cited by 1 Related articles All 4 versions


[PDF] ams.org

On the Wasserstein distance between classical sequences and the Lebesgue measure

L Brown, S Steinerberger - … of the American Mathematical Society, 2020 - ams.org

We discuss the classical problem of measuring the regularity of distribution of sets of $ N $

points in $\mathbb {T}^ d $. A recent line of investigation is to study the cost ($= $ mass

$\times $ distance) necessary to move Dirac measures placed on these points to the uniform …

  Cited by 5 Related articles All 4 versions


[PDF] esaim-proc.org

Statistical data analysis in the Wasserstein space

J Bigot - ESAIM: Proceedings and Surveys, 2020 - esaim-proc.org

This paper is concerned by statistical inference problems from a data set whose elements

may be modeled as random probability measures such as multiple histograms or point

clouds. We propose to review recent contributions in statistics on the use of Wasserstein  …

  Cited by 2 Related articles



2020


DPIR-Net: Direct PET image reconstruction based on the Wasserstein generative adversarial network

Z Hu, H Xue, Q Zhang, J Gao, N Zhang… - … on Radiation and …, 2020 - ieeexplore.ieee.org

Positron emission tomography (PET) is an advanced medical imaging technique widely

used in various clinical applications, such as tumor detection and neurologic disorders.

Reducing the radiotracer dose is desirable in PET imaging because it decreases the  …

  Cited by 5 Related articles



Stein factors for variance-gamma approximation in the Wasserstein and Kolmogorov distances

RE Gaunt - arXiv preprint arXiv:2008.06088, 2020 - arxiv.org

We obtain new bounds for the solution of the variance-gamma (VG) Stein equation that are

of the correct form for approximations in terms of the Wasserstein and Kolmogorov metrics.

These bounds hold for all parameters values of the four parameter VG class. As an …

  Cited by 4 Related articles All 3 versions 


[PDF] arxiv.org

Regularization helps with mitigating poisoning attacks: Distributionally-robust machine learning using the wasserstein distance

F Farokhi - arXiv preprint arXiv:2001.10655, 2020 - arxiv.org

We use distributionally-robust optimization for machine learning to mitigate the effect of data

poisoning attacks. We provide performance guarantees for the trained model on the original

data (not including the poison records) by training the model for the worst-case distribution …

  Cited by 5 Related articles All 3 versions 


[PDF] arxiv.org

Transport and Interface: an Uncertainty Principle for the Wasserstein distance

A Sagiv, S Steinerberger - SIAM Journal on Mathematical Analysis, 2020 - SIAM

Let $f:(0,1)^d \to \mathbb{R}$ be a continuous function with zero mean and interpret $f_+=\max(f,0)$ and

$f_-=-\min(f,0)$ as the densities of two measures. We prove that if the cost of transport from $f_+$ to

$f_-$ is small, in terms of the Wasserstein distance $W_1(f_+,f_-)$, then the Hausdorff measure of …

  Cited by 4 Related articles All 3 versions


[PDF] arxiv.org

Approximate bayesian computation with the sliced-wasserstein distance

K Nadjahi, V De Bortoli, A Durmus… - ICASSP 2020-2020 …, 2020 - ieeexplore.ieee.org

Approximate Bayesian Computation (ABC) is a popular method for approximate inference in

generative models with intractable but easy-to-sample likelihood. It constructs an

approximate posterior distribution by finding parameters for which the simulated data are …

  Cited by 4 Related articles All 8 versions

<——2020——2020———2520——

 

[PDF] arxiv.org

Regularizing activations in neural networks via distribution matching with the Wasserstein metric

T Joo, D Kang, B Kim - arXiv preprint arXiv:2002.05366, 2020 - arxiv.org

Regularization and normalization have become indispensable components in training deep

neural networks, resulting in faster training and improved generalization performance. We

propose the projected error function regularization loss (PER) that encourages activations to …

 Cited by 5 Related articles All 7 versions

[PDF] arxiv.org

Asymptotic Guarantees for Generative Modeling Based on the Smooth Wasserstein Distance

Z Goldfeld, K Greenewald, K Kato - arXiv preprint arXiv:2002.01012, 2020 - arxiv.org

Minimum distance estimation (MDE) gained recent attention as a formulation of (implicit)

generative modeling. It considers minimizing, over model parameters, a statistical distance

between the empirical data distribution and the model. This formulation lends itself well to …

  Cited by 2 Related articles All 2 versions 

[CITATION] Asymptotic Guarantees for Generative Modeling Based on the Smooth Wasserstein Distance

Z Goldfeld, K Greenewald, K Kato - Advances in Neural Information Processing …, 2020

  Cited by 3 Related articles


[PDF] arxiv.org

The Unbalanced Gromov Wasserstein Distance: Conic Formulation and Relaxation

T Séjourné, FX Vialard, G Peyré - arXiv preprint arXiv:2009.04266, 2020 - arxiv.org

Comparing metric measure spaces (ie a metric space endowed with a probability

distribution) is at the heart of many machine learning problems. This includes for instance

predicting properties of molecules in quantum chemistry or generating graphs with varying …

  Cited by 5 Related articles All 2 versions 


[PDF] arxiv.org

Cutoff thermalization for Ornstein-Uhlenbeck systems with small Lévy noise in the Wasserstein distance

G Barrera, MA Högele, JC Pardo - arXiv preprint arXiv:2009.10590, 2020 - arxiv.org

This article establishes cutoff thermalization (also known as the cutoff phenomenon) for a

general class of general Ornstein-Uhlenbeck systems $(X^\epsilon_t (x)) _ {t\geq 0} $ under

$\epsilon $-small additive Lévy noise with initial value $ x $. The driving noise processes …

  Cited by 1 Related articles All 3 versions 


[PDF] arxiv.org

Ranking IPCC Models Using the Wasserstein Distance

G Vissio, V Lembo, V Lucarini, M Ghil - arXiv preprint arXiv:2006.09304, 2020 - arxiv.org

We propose a methodology for evaluating the performance of climate models based on the

use of the Wasserstein distance. This distance provides a rigorous way to measure

quantitatively the difference between two probability distributions. The proposed approach is …

  Related articles All 5 versions 


2020


[PDF] arxiv.org

A Bismut-Elworthy inequality for a Wasserstein diffusion on the circle

V Marx - arXiv preprint arXiv:2005.04972, 2020 - arxiv.org

We investigate in this paper a regularization property of a diffusion on the Wasserstein

space $\mathcal {P} _2 (\mathbb {T}) $ of the one-dimensional torus. The control obtained

on the gradient of the semi-group is very much in the spirit of Bismut-Elworthy-Li integration …

  Related articles All 9 versions 


 

The Wasserstein Impact Measure (WIM): a generally applicable, practical tool for quantifying prior impact in Bayesian statistics

F Ghaderinezhad, C Ley, B Serrien - arXiv preprint arXiv:2010.12522, 2020 - arxiv.org

The prior distribution is a crucial building block in Bayesian analysis, and its choice will

impact the subsequent inference. It is therefore important to have a convenient way to

quantify this impact, as such a measure of prior impact will help us to choose between two or …

  Related articles All 3 versions 


[PDF] arxiv.org

Learning disentangled representations with the Wasserstein Autoencoder

B Gaujac, I Feige, D Barber - arXiv preprint arXiv:2010.03459, 2020 - arxiv.org

Disentangled representation learning has undoubtedly benefited from objective function

surgery. However, a delicate balancing act of tuning is still required in order to trade off

reconstruction fidelity versus disentanglement. Building on previous successes of penalizing …

  Related articles All 2 versions 


[PDF] researchgate.net

[PDF] Ranking IPCC Model Performance Using the Wasserstein Distance

G Vissio, V Lembo, V Lucarini… - arXiv preprint arXiv …, 2020 - researchgate.net

We propose a methodology for intercomparing climate models and evaluating their

performance against benchmarks based on the use of the Wasserstein distance (WD). This

distance provides a rigorous way to measure quantitatively the difference between two …

  Related articles 


[PDF] arxiv.org

Posterior asymptotics in Wasserstein metrics on the real line

M Chae, P De Blasi, SG Walker - arXiv preprint arXiv:2003.05599, 2020 - arxiv.org

In this paper, we use the class of Wasserstein metrics to study asymptotic properties of

posterior distributions. Our first goal is to provide sufficient conditions for posterior

consistency. In addition to the well-known Schwartz's Kullback--Leibler condition on the  …

  Related articles All 2 versions 

<——2020——2020———2530——



[HTML] springer.com

[HTML] The Wasserstein Space

VM Panaretos, Y Zemel - International Workshop on Functional and …, 2020 - Springer

The Kantorovich problem described in the previous chapter gives rise to a metric structure,

the Wasserstein distance, in the space of probability measures $P(\mathcal X)$ on a

space $\mathcal X$. The resulting metric space, a subspace of $P(\mathcal X)$, is …

  Related articles


Data-driven Risk-sensitive Appointment Scheduling: A Wasserstein Distributionally Robust Optimization Approach

Z Pang, S Wang - Available at SSRN 3740083, 2020 - papers.ssrn.com

We consider an optimal appointment scheduling problem for a single-server healthcare

delivery system with random durations, focusing on the tradeoff between overtime work and

patient delays which are measured under conditional value-at-risk (CVaR). To address the …

 

[PDF] arxiv.org

Interpretable Model Summaries Using the Wasserstein Distance

E Dunipace, L Trippa - arXiv preprint arXiv:2012.09999, 2020 - arxiv.org

In the current computing age, models can have hundreds or even thousands of parameters;

however, such large models decrease the ability to interpret and communicate individual

parameters. Reducing the dimensionality of the parameter space in the estimation phase is …

  Related articles All 2 versions 


[PDF] arxiv.org

Martingale Wasserstein inequality for probability measures in the convex order

B Jourdain, W Margheriti - arXiv preprint arXiv:2011.11599, 2020 - arxiv.org

It is known since [24] that two one-dimensional probability measures in the convex order

admit a martingale coupling with respect to which the integral of $\vert xy\vert $ is smaller

than twice their $\mathcal W_1 $-distance (Wasserstein distance with index $1 $). We …

  Related articles All 7 versions 


[PDF] researchgate.net

Non-Gaussian BLE-Based Indoor Localization Via Gaussian Sum Filtering Coupled with Wasserstein Distance

P Malekzadeh, S Mehryar, P Spachos… - ICASSP 2020-2020 …, 2020 - ieeexplore.ieee.org

With recent breakthroughs in signal processing, communication and networking systems, we

are more and more surrounded by smart connected devices empowered by the Internet of

Thing (IoT). Bluetooth Low Energy (BLE) is considered as the main-stream technology to …

  Cited by 1 Related articles All 2 versions


2020


[PDF] arxiv.org

Velocity Inversion Using the Quadratic Wasserstein Metric

S Mahankali - arXiv preprint arXiv:2009.00708, 2020 - arxiv.org

Full--waveform inversion (FWI) is a method used to determine properties of the Earth from

information on the surface. We use the squared Wasserstein distance (squared $ W_2 $

distance) as an objective function to invert for the velocity as a function of position in the  …

  Related articles All 6 versions 


[HTML] springer.com

[HTML] Fréchet Means in the Wasserstein Space 

VM Panaretos, Y Zemel - International Workshop on Functional and …, 2020 - Springer

The concept of a Fréchet mean (Fréchet [55]) generalises the notion of mean to a more general

metric space by replacing the usual “sum of squares” with a “sum of squared distances”, giving

rise to the so-called Fréchet functional. A closely related notion is that of a Karcher mean (Karcher …

  Related articles


[PDF] arxiv.org

On the Wasserstein distance for a martingale central limit theorem

X Fan, X Ma - Statistics & Probability Letters, 2020 - Elsevier

Abstract: We prove an upper bound on the Wasserstein

distance between normalized martingales and the standard normal random variable, which …

  Related articles All 8 versions


Horo-functions associated to atom sequences on the Wasserstein space

G Zhu, H Wu, X Cui - Archiv der Mathematik, 2020 - Springer

On the Wasserstein space over a complete, separable, non-compact, locally compact length

space, we consider the horo-functions associated to sequences of atomic measures. We

show the existence of co-rays for any prescribed initial probability measure with respect to a …

  Related articles


[PDF] arxiv.org

Tensor product and Hadamard product for the Wasserstein means

J Hwang, S Kim - Linear Algebra and its Applications, 2020 - Elsevier

As one of the least squares mean, we consider the Wasserstein mean of positive definite

Hermitian matrices. We verify in this paper the inequalities of the Wasserstein mean related

with a strictly positive and unital linear map, the identity of the Wasserstein mean for tensor …

  Related articles All 5 versions

<——2020——2020———2540——


[PDF] arxiv.org

The Spectral-Domain $\mathcal{W}_2$ Wasserstein Distance for Elliptical Processes and the Spectral-Domain Gelbrich Bound

S Fang, Q Zhu - arXiv preprint arXiv:2012.04023, 2020 - arxiv.org

In this short note, we introduce the spectral-domain $\mathcal {W} _2 $ Wasserstein distance

for elliptical stochastic processes in terms of their power spectra. We also introduce the

spectral-domain Gelbrich bound for processes that are not necessarily elliptical. Subjects …

  Related articles All 2 versions 


The Wasserstein Proximal Gradient Algorithm

A Salim, A Korba, G Luise - arXiv e-prints, 2020 - ui.adsabs.harvard.edu

Wasserstein gradient flows are continuous time dynamics that define curves of steepest

descent to minimize an objective function over the space of probability measures (ie, the

Wasserstein space). This objective is typically a divergence wrt a fixed target distribution. In …

  Related articles




[PDF] ams.org

Full View

Nonpositive curvature, the variance functional, and the Wasserstein barycenter

YH Kim, B Pass - Proceedings of the American Mathematical Society, 2020 - ams.org

We show that a Riemannian manifold $ M $ has nonpositive sectional curvature and is

simply connected if and only if the variance functional on the space $ P (M) $ of probability

measures over $ M $ is displacement convex. We then establish convexity over Wasserstein  …

  Cited by 2 Related articles All 3 versions


[PDF] ibpsa.org

[PDF] Comparing Bottom-Up Energy Consumption Models Using The Wasserstein Distance Between Load Profile Histograms

E Sanderson, A Fragaki, J Simo… - BSO-V 2020: IBPSA …, 2020 - ibpsa.org

This paper presents a comparison of bottom up models that generate appliance load

profiles. The comparison is based on their ability to accurately distribute load over time-of-

day. This is a key feature of model performance if the model is used to assess the impact of …

  Related articles All 2 versions 


2020


[PDF] On the equivalence between Fourier-based and Wasserstein metrics

G Auricchio, A Codegoni, S Gualandi, G Toscani… - eye - mate.unipv.it

We investigate properties of some extensions of a class of Fourierbased probability metrics,

originally introduced to study convergence to equilibrium for the solution to the spatially

homogeneous Boltzmann equation. At difference with the original one, the new Fourier …

  Related articles 


[PDF] researchgate.net

[PDF] THE α-z-BURES WASSERSTEIN DIVERGENCE

THOA DINH, CT LE, BK VO, TD VUONG - researchgate.net

$\Phi(A,B) = \mathrm{Tr}((1-\alpha)A + \alpha B) - \mathrm{Tr}(Q_{\alpha,z}(A,B))$, where $Q_{\alpha,z}(A,B) = \big(A^{\frac{1-\alpha}{2z}} B^{\frac{\alpha}{z}} A^{\frac{1-\alpha}{2z}}\big)^{z}$

is the matrix function in the α-z-Rényi relative entropy. We show that for $0 \le \alpha \le z \le 1$, the

quantity $\Phi(A,B)$ is a quantum divergence and satisfies the Data Processing Inequality in …

  Related articles 

  

[PDF] semanticscholar.org

[PDF] Deconvolution for the Wasserstein metric and topological inference

B Michel - pdfs.semanticscholar.org

The SEE (Société de l'Electricité, de l'Electronique et des Technologies de l'Information et de

la Communication, an association recognized as serving the public interest, governed by the law of 1 July

1901) makes available to its members and to subscribers to its publications a …


Data Augmentation Method for Fault Diagnosis of Mechanical ...

https://www.semanticscholar.org › paper › Data-Augment...

In view of the above problem, this paper proposed a method to augment failure data for mechanical equipment diagnosis based on Wasserstein generative ...


Data Augmentation Method for Fault Diagnosis of Mechanical Equipment Based on Improved Wasserstein GAN

W Liu, L Duan, Y Tang, J Yang - 2020 11th International …, 2020 - ieeexplore.ieee.org

Most of the time the mechanical equipment is in normal operation state, which results in high

imbalance between fault data and normal data. In addition, traditional signal processing

methods rely heavily on expert experience, making it difficult for classification or prediction …

  Related articles

[CITATION] Data Augmentation Method for Power Transformer Fault Diagnosis Based on Conditional Wasserstein Generative Adversarial Network [J]

Y Liu, Z Xu, J He, Q Wang, SG Gao, J Zhao - Power System Technology, 2020

  Cited by 1


Supplementary Material: Wasserstein Distances for Stereo Disparity ...

https://docplayer.net › 200806740-Supplementary-materi...

Supplementary Material: Wasserstein Distances for Stereo Disparity Estimation Divyansh Garg 1 Yan Wang 1 Bharath Hariharan 1 Mark Campbell 1 Kilian Q.

[CITATION] Supplementary Material: Wasserstein Distances for Stereo Disparity Estimation

D Garg, Y Wang, B Hariharan, M Campbell

 Cited by 18 Related articles All 6 versions

Wasserstein Distances for Stereo Disparity Estimation_

The authors use a new neural network architecture that can output arbitrary depth values and a new loss function to address these problems; this loss function is derived from the ground-truth ...

Oct 23, 2020

 Wasserstein Distances for Stereo Disparity Estimation

www.youtube.com › watch


Key moments: Motivation of the Work · Depth Estimation from Images · Robotic ...

YouTube · Computer Vision Talks · 

Nov 1, 2020

<——2020——2020———2550——


SA vs SAA for population Wasserstein barycenter calculation

D Dvinskikh - arXiv preprint arXiv:2001.07697, 2020 - arxiv.org

In Machine Learning and Optimization community there are two main approaches for convex

risk minimization problem. The first approach is Stochastic Averaging (SA)(online) and the

second one is Sample Average Approximation (SAA) (Monte Carlo, Empirical Risk …

  Cited by 4 Related articles 


Learning to Align via Wasserstein for Person Re-Identification

Z Zhang, Y Xie, D Li, W Zhang… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

Existing successful person re-identification (Re-ID) models often employ the part-level

representation to extract the fine-grained information, but commonly use the loss that is

particularly designed for global features, ignoring the relationship between semantic parts …

  Cited by 1 Related articles All 2 versions


[PDF] mlr.press

Stochastic optimization for regularized wasserstein estimators

M Ballu, Q Berthet, F Bach - International Conference on …, 2020 - proceedings.mlr.press

Optimal transport is a foundational problem in optimization, that allows to compare

probability distributions while taking into account geometric aspects. Its optimal objective

value, the Wasserstein distance, provides an important loss between distributions that has …

  Cited by 7 Related articles All 4 versions 
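
Several entries in this block build on entropically regularized optimal transport. As background, here is a minimal sketch of the plain Sinkhorn-Knopp iteration for the regularized Wasserstein cost between two histograms; it is generic background, not the stochastic estimator proposed in the paper above, and all names are mine.

import numpy as np

def sinkhorn(a, b, M, reg=0.05, n_iter=500):
    # entropic-regularized optimal transport between histograms a, b with cost matrix M
    K = np.exp(-M / reg)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]          # regularized transport plan
    return np.sum(P * M)                     # regularized OT cost

# toy example: two histograms on a 1D grid
x = np.linspace(0, 1, 50)
M = (x[:, None] - x[None, :]) ** 2
a = np.exp(-(x - 0.3) ** 2 / 0.01); a /= a.sum()
b = np.exp(-(x - 0.7) ** 2 / 0.01); b /= b.sum()
print(sinkhorn(a, b, M))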

Stochastic Optimization for Regularized Wasserstein Estimators

F Bach, M Ballu, Q Berthet - 2020 - research.google

Optimal transport is a foundational problem in optimization, that allows to compare

probability distributions while taking into account geometric aspects. Its optimal objective

value, the Wasserstein distance, provides an important loss between distributions that has …



[PDF] mlr.press

A fast proximal point method for computing exact wasserstein distance

Y Xie, X Wang, R Wang, H Zha - Uncertainty in Artificial …, 2020 - proceedings.mlr.press

Wasserstein distance plays increasingly important roles in machine learning, stochastic

programming and image processing. Major efforts have been under way to address its high

computational complexity, some leading to approximate or regularized variations such as …

  Cited by 54 Related articles All 5 versions 


[PDF] mlr.press

Gradient descent algorithms for Bures-Wasserstein barycenters

S Chewi, T Maunu, P Rigollet… - … on Learning Theory, 2020 - proceedings.mlr.press

We study first order methods to compute the barycenter of a probability distribution $ P $

over the space of probability measures with finite second moment. We develop a framework

to derive global rates of convergence for both gradient descent and stochastic gradient …

  Cited by 17 Related articles All 5 versions 
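
For centered Gaussians, the Bures-Wasserstein barycenter reduces to a fixed-point problem on covariance matrices. The sketch below implements the classical fixed-point iteration (Álvarez-Esteban et al.) rather than the gradient-descent schemes analyzed in the paper above; it is included only to make the object being computed concrete, and all names are mine.

import numpy as np
from scipy.linalg import sqrtm, inv

def bures_wasserstein_barycenter(covs, weights, n_iter=50):
    # fixed-point iteration for the barycenter covariance of centered Gaussians N(0, C_k)
    S = sum(w * C for w, C in zip(weights, covs))            # any SPD initialization works
    for _ in range(n_iter):
        R = np.real(sqrtm(S))
        R_inv = inv(R)
        T = sum(w * np.real(sqrtm(R @ C @ R)) for w, C in zip(weights, covs))
        S = R_inv @ T @ T @ R_inv                            # S <- S^{-1/2} (sum_k w_k (S^{1/2} C_k S^{1/2})^{1/2})^2 S^{-1/2}
    return S

covs = [np.diag([1.0, 2.0]), np.diag([3.0, 0.5])]
print(bures_wasserstein_barycenter(covs, [0.5, 0.5]))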


2020


[HTML] mdpi.com

Fused Gromov-Wasserstein distance for structured objects

T Vayer, L Chapel, R Flamary, R Tavenard, N Courty - Algorithms, 2020 - mdpi.com

Optimal transport theory has recently found many applications in machine learning thanks to

its capacity to meaningfully compare various machine learning objects that are viewed as

distributions. The Kantorovitch formulation, leading to the Wasserstein distance, focuses on …

  Cited by 7 Related articles All 33 versions 

Researchers from National Center for Scientific Research Discuss Research in Machine Learning 

(Fused Gromov-Wasserstein...

Journal of Engineering, 09/2020

Newsletter Full Text Online

[PDF] arxiv.org

Lagrangian schemes for Wasserstein gradient flows

JA Carrillo, D Matthes, MT Wolfram - arXiv preprint arXiv:2003.03803, 2020 - arxiv.org

This paper reviews different numerical methods for specific examples of Wasserstein

gradient flows: we focus on nonlinear Fokker-Planck equations, but also discuss

discretizations of the parabolic-elliptic Keller-Segel model and of the fourth order thin film …

  Cited by 4 Related articles All 3 versions 


[PDF] arxiv.org

Visual transfer for reinforcement learning via wasserstein domain confusion

J Roy, G Konidaris - arXiv preprint arXiv:2006.03465, 2020 - arxiv.org

We introduce Wasserstein Adversarial Proximal Policy Optimization (WAPPO), a novel

algorithm for visual transfer in Reinforcement Learning that explicitly learns to align the

distributions of extracted features between a source and target task. WAPPO approximates …

  Cited by 3 Related articles All 6 versions 


[PDF] mlr.press

Nested-wasserstein self-imitation learning for sequence generation

R Zhang, C Chen, Z Gan, Z Wen… - International …, 2020 - proceedings.mlr.press

Reinforcement learning (RL) has been widely studied for improving sequence-generation

models. However, the conventional rewards used for RL training typically cannot capture

sufficient semantic information and therefore render model bias. Further, the sparse and …

  Cited by 2 Related articles All 6 versions 

Nested-Wasserstein Self-Imitation Learning for Sequence Generation

L Carin - 2020 - openreview.net

Reinforcement learning (RL) has been widely studied for improving sequence-generation

models. However, the conventional rewards used for RL training typically cannot capture

sufficient semantic information and therefore render model bias. Further, the sparse and …


[PDF] arxiv.org

Fisher information regularization schemes for Wasserstein gradient flows

W Li, J Lu, L Wang - Journal of Computational Physics, 2020 - Elsevier

We propose a variational scheme for computing Wasserstein gradient flows. The scheme

builds upon the Jordan–Kinderlehrer–Otto framework with the Benamou-Brenier's dynamic

formulation of the quadratic Wasserstein metric and adds a regularization by the Fisher …

  Cited by 10 Related articles All 10 versions

<——2020——2020———2560——



Wasserstein autoencoders for collaborative filtering

X Zhang, J Zhong, K Liu - Neural Computing and Applications, 2020 - Springer

The recommender systems have long been studied in the literature. The collaborative

filtering is one of the most widely adopted recommendation techniques which is usually

applied to the explicit data, eg, rating scores. However, the implicit data, eg, click data, is …

  Cited by 11 Related articles All 3 versions


Wasserstein loss with alternative reinforcement learning for severity-aware semantic segmentation

X Liu, Y Lu, X Liu, S Bai, S Li… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

Semantic segmentation is important for many real-world systems, eg, autonomous vehicles,

which predict the class of each pixel. Recently, deep networks achieved significant progress

wrt the mean Intersection-over Union (mIoU) with the cross-entropy loss. However, the cross …

  Cited by 2 Related articles


[PDF] iop.org


Wasserstein metric for improved quantum machine learning with adjacency matrix representations

O Çaylak, OA von Lilienfeld… - … Learning: Science and …, 2020 - iopscience.iop.org

We study the Wasserstein metric to measure distances between molecules represented by

the atom index dependent adjacency 'Coulomb' matrix, used in kernel ridge regression based

supervised learning. Resulting machine learning models of quantum properties, aka …

  Cited by 4 Related articles


De novo protein design for novel folds using guided conditional Wasserstein generative adversarial networks

M Karimi, S Zhu, Y Cao, Y Shen - Journal of Chemical Information …, 2020 - ACS Publications

Although massive data is quickly accumulating on protein sequence and structure, there is a

small and limited number of protein architectural types (or structural folds). This study is

addressing the following question: how well could one reveal underlying sequence …

  Cited by 3 Related articles All 5 versions


[PDF] aaai.org

Gromov-wasserstein factorization models for graph clustering

H Xu - Proceedings of the AAAI Conference on Artificial …, 2020 - ojs.aaai.org

We propose a new nonlinear factorization model for graphs that are with topological

structures, and optionally, node attributes. This model is based on a pseudometric called

Gromov-Wasserstein (GW) discrepancy, which compares graphs in a relational way. It …

  Cited by 5 Related articles All 5 versions 


2020


Domain-attention Conditional Wasserstein Distance for Multi-source Domain Adaptation

H Wu, Y Yan, MK Ng, Q Wu - ACM Transactions on Intelligent Systems …, 2020 - dl.acm.org

Multi-source domain adaptation has received considerable attention due to its effectiveness

of leveraging the knowledge from multiple related sources with different distributions to

enhance the learning performance. One of the fundamental challenges in multi-source …

  Cited by 1 Related articles All 2 versions


[PDF] arxiv.org

Wasserstein Stability for Persistence Diagrams

P Skraba, K Turner - arXiv preprint arXiv:2006.16824, 2020 - arxiv.org

The stability of persistence diagrams is among the most important results in applied and

computational topology. Most results in the literature phrase stability in terms of the

bottleneck distance between diagrams and the $\infty $-norm of perturbations. This has two …

  Cited by 4 Related articles All 2 versions 


[PDF] researchgate.net

Typical wind power scenario generation for multiple wind farms using conditional improved Wasserstein generative adversarial network

Y Zhang, Q Ai, F Xiao, R Hao, T Lu - … Journal of Electrical Power & Energy …, 2020 - Elsevier

Because of environmental benefits, wind power is taking an increasing role meeting

electricity demand. However, wind power tends to exhibit large uncertainty and is largely

influenced by meteorological conditions. Apart from the variability, when multiple wind farms …

  Cited by 15 Related articles


[PDF] arxiv.org

Wasserstein Embedding for Graph Learning

S Kolouri, N Naderializadeh, GK Rohde… - arXiv preprint arXiv …, 2020 - arxiv.org

We present Wasserstein Embedding for Graph Learning (WEGL), a novel and fast

framework for embedding entire graphs in a vector space, in which various machine

learning models are applicable for graph-level prediction tasks. We leverage new insights …

  Cited by 3 Related articles All 3 versions 


[HTML] nih.gov

Wasserstein GANs for MR imaging: from paired to unpaired training

K Lei, M Mardani, JM Pauly… - IEEE transactions on …, 2020 - ieeexplore.ieee.org

Lack of ground-truth MR images impedes the common supervised training of neural

networks for image reconstruction. To cope with this challenge, this paper leverages

unpaired adversarial training for reconstruction networks, where the inputs are …

  Cited by 5 Related articles All 7 versions

<——2020——2020———2570——



Generative adversarial networks based on Wasserstein distance for knowledge graph embeddings

Y Dai, S Wang, X Chen, C Xu, W Guo - Knowledge-Based Systems, 2020 - Elsevier

Abstract Knowledge graph embedding aims to project entities and relations into low-

dimensional and continuous semantic feature spaces, which has captured more attention in

recent years. Most of the existing models roughly construct negative samples via a uniformly …

  Cited by 7 Related articles All 2 versions


[PDF] ieee.org

An Ensemble Wasserstein Generative Adversarial Network Method for Road Extraction From High Resolution Remote Sensing Images in Rural Areas

C Yang, Z Wang - IEEE Access, 2020 - ieeexplore.ieee.org

Road extraction from high resolution remote sensing (HR-RS) images is an important yet

challenging computer vision task. In this study, we propose an ensemble Wasserstein

Generative Adversarial Network with Gradient Penalty (WGAN-GP) method called E-WGAN …

Cited by 13 Related articles All 3 versions

[PDF] arxiv.org

Variational wasserstein barycenters for geometric clustering

L Mi, T Yu, J Bento, W Zhang, B Li, Y Wang - arXiv preprint arXiv …, 2020 - arxiv.org

We propose to compute Wasserstein barycenters (WBs) by solving for Monge maps with

variational principle. We discuss the metric properties of WBs and explore their connections,

especially the connections of Monge WBs, to K-means clustering and co-clustering. We also …

  Cited by 2 Related articles All 2 versions 


[PDF] bciml.cn

Wasserstein distance based deep adversarial transfer learning for intelligent fault diagnosis with unlabeled or insufficient labeled data

C Cheng, B Zhou, G Ma, D Wu, Y Yuan - Neurocomputing, 2020 - Elsevier

Intelligent fault diagnosis is one critical topic of maintenance solution for mechanical

systems. Deep learning models, such as convolutional neural networks (CNNs), have been

successfully applied to fault diagnosis tasks and achieved promising results. However, one …

  Cited by 11 Related articles All 3 versions


[PDF] arxiv.org

Deep attentive wasserstein generative adversarial networks for MRI reconstruction with recurrent context-awareness

Y Guo, C Wang, H Zhang, G Yang - International Conference on Medical …, 2020 - Springer

The performance of traditional compressive sensing-based MRI (CS-MRI) reconstruction is

affected by its slow iterative procedure and noise-induced artefacts. Although many deep

learning-based CS-MRI methods have been proposed to mitigate the problems of traditional …

  Cited by 4 Related articles All 4 versions


2020


[PDF] arxiv.org

Wasserstein Autoregressive Models for Density Time Series

C Zhang, P Kokoszka, A Petersen - arXiv preprint arXiv:2006.12640, 2020 - arxiv.org

Data consisting of time-indexed distributions of cross-sectional or intraday returns have

been extensively studied in finance, and provide one example in which the data atoms

consist of serially dependent probability distributions. Motivated by such data, we propose …

  Cited by 2 Related articles All 3 versions 


[PDF] arxiv.org

Conditional Sig-Wasserstein GANs for Time Series Generation

H Ni, L Szpruch, M Wiese, S Liao, B Xiao - arXiv preprint arXiv:2006.05421, 2020 - arxiv.org

Generative adversarial networks (GANs) have been extremely successful in generating

samples, from seemingly high dimensional probability measures. However, these methods

struggle to capture the temporal dependence of joint probability distributions induced by …

  Cited by 4 Related articles All 3 versions 


[PDF] arxiv.org

Necessary Condition for Rectifiability Involving Wasserstein Distance W2

D Dąbrowski - International Mathematics Research Notices, 2020 - academic.oup.com

A Radon measure μ is n-rectifiable if it is absolutely continuous with respect to the n-dimensional

Hausdorff measure and μ-almost all of its support can be covered by Lipschitz images of ℝⁿ. In this paper,

we give a necessary condition for rectifiability in terms of the so-called α₂ numbers …

  Cited by 6 Related articles All 5 versions


[PDF] arxiv.org

Graph Wasserstein Correlation Analysis for Movie Retrieval

X Zhang, T Zhang, X Hong, Z Cui, J Yang - European Conference on …, 2020 - Springer

Movie graphs play an important role to bridge heterogenous modalities of videos and texts

in human-centric retrieval. In this work, we propose Graph Wasserstein Correlation Analysis

(GWCA) to deal with the core issue therein, ie, cross heterogeneous graph comparison …

  Related articles All 5 versions


[PDF] arxiv.org

Distributed optimization with quantization for computing Wasserstein barycenters

R Krawtschenko, CA Uribe, A Gasnikov… - arXiv preprint arXiv …, 2020 - arxiv.org

We study the problem of the decentralized computation of entropy-regularized semi-discrete

Wasserstein barycenters over a network. Building upon recent primal-dual approaches, we

propose a sampling gradient quantization scheme that allows efficient communication and …

  Cited by 2 Related articles All 3 versions 

<——2020——2020———2580——



S2a: Wasserstein gan with spatio-spectral laplacian attention for multi-spectral band synthesis

L Rout, I Misra, SM Moorthi… - Proceedings of the IEEE …, 2020 - openaccess.thecvf.com

Intersection of adversarial learning and satellite image processing is an emerging field in

remote sensing. In this study, we intend to address synthesis of high resolution multi-spectral

satellite imagery using adversarial learning. Guided by the discovery of attention …

  Cited by 3 Related articles All 9 versions 


[PDF] mlr.press

Principled learning method for Wasserstein distributionally robust optimization with local perturbations

Y Kwon, W Kim, JH Won… - … Conference on Machine …, 2020 - proceedings.mlr.press

Wasserstein distributionally robust optimization (WDRO) attempts to learn a model that

minimizes the local worst-case risk in the vicinity of the empirical data distribution defined by

Wasserstein ball. While WDRO has received attention as a promising tool for inference since …

  Related articles All 5 versions 
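
For reference, the WDRO objective studied in this and several neighboring entries can be written as follows (notation mine: ℓ is a loss, \widehat{P}_n the empirical distribution, ε the Wasserstein radius):

\[
\min_{\theta}\;\; \sup_{Q \,:\, W_p(Q,\widehat{P}_n) \,\le\, \varepsilon}\;\; \mathbb{E}_{\xi \sim Q}\big[\ell(\theta;\xi)\big]
\]

that is, the model is chosen against the worst-case distribution inside a Wasserstein ball of radius ε around the empirical data distribution.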


[PDF] arxiv.org

Stochastic saddle-point optimization for wasserstein barycenters

D Tiapkin, A Gasnikov, P Dvurechensky - arXiv preprint arXiv:2006.06763, 2020 - arxiv.org

We study the computation of non-regularized Wasserstein barycenters of probability

measures supported on the finite set. The first result gives a stochastic optimization

algorithm for the discrete distribution over the probability measures which is comparable …

  Cited by 2 Related articles All 3 versions 


[PDF] arxiv.org

Reinforced wasserstein training for severity-aware semantic segmentation in autonomous driving

X Liu, Y Zhang, X Liu, S Bai, S Li, J You - arXiv preprint arXiv:2008.04751, 2020 - arxiv.org

Semantic segmentation is important for many real-world systems, eg, autonomous vehicles,

which predict the class of each pixel. Recently, deep networks achieved significant progress

wrt the mean Intersection-over Union (mIoU) with the cross-entropy loss. However, the cross …

  Cited by 1 Related articles All 3 versions 


[PDF] arxiv.org

A variational finite volume scheme for Wasserstein gradient flows

C Cancès, TO Gallouët, G Todeschi - Numerische Mathematik, 2020 - Springer

We propose a variational finite volume scheme to approximate the solutions to Wasserstein

gradient flows. The time discretization is based on an implicit linearization of the

Wasserstein distance expressed thanks to Benamou–Brenier formula, whereas space …

  Cited by 6 Related articles All 9 versions


2020


[PDF] arxiv.org

Wasserstein coupled particle filter for multilevel estimation

M Ballesio, A Jasra, E von Schwerin… - arXiv preprint arXiv …, 2020 - arxiv.org

In this paper, we consider the filtering problem for partially observed diffusions, which are

regularly observed at discrete times. We are concerned with the case when one must resort

to time-discretization of the diffusion process if the transition density is not available in an …

  Cited by 2 Related articles All 4 versions 


[PDF] arxiv.org

Symmetric skip connection wasserstein gan for high-resolution facial image inpainting

J Jam, C Kendrick, V Drouard, K Walker… - arXiv preprint arXiv …, 2020 - arxiv.org

The state-of-the-art facial image inpainting methods achieved promising results but face

realism preservation remains a challenge. This is due to limitations such as; failures in

preserving edges and blurry artefacts. To overcome these limitations, we propose a …

  Cited by 3 Related articles All 3 versions 


W-LDMM: A wasserstein driven low-dimensional manifold model for noisy image restoration

R He, X Feng, W Wang, X Zhu, C Yang - Neurocomputing, 2020 - Elsevier

The Wasserstein distance originated from the optimal transport theory is a general and

flexible statistical metric in a variety of image processing problems. In this paper, we propose

a novel Wasserstein driven low-dimensional manifold model (W-LDMM), which tactfully …

  Cited by 3 Related articles All 2 versions


[PDF] arxiv.org

Wasserstein distributionally robust motion control for collision avoidance using conditional value-at-risk

A Hakobyan, I Yang - arXiv preprint arXiv:2001.04727, 2020 - arxiv.org

In this paper, a risk-aware motion control scheme is considered for mobile robots to avoid

randomly moving obstacles when the true probability distribution of uncertainty is unknown.

We propose a novel model predictive control (MPC) method for limiting the risk of unsafety …

  Cited by 5 Related articles All 2 versions 


 Semantics-assisted Wasserstein learning for topic and word embeddings

C Li, X Li, J Ouyang, Y Wang - 2020 IEEE International …, 2020 - ieeexplore.ieee.org

… Wasserstein NMF topic model, namely Semantics-Assisted Wasserstein Learning (SAWL),

with simultaneous learning of … SAWL model with Wasserstein learning, which simultaneously …

Cited by 3 Related articles All 2 versions

<——2020——2020———2590——

Primal heuristics for wasserstein barycenters

PY Bouchet, S Gualandi, LM Rousseau - International Conference on …, 2020 - Springer

This paper presents primal heuristics for the computation of Wasserstein Barycenters of a

given set of discrete probability measures. The computation of a Wasserstein Barycenter is

formulated as an optimization problem over the space of discrete probability measures. In …

  Cited by 1 Related articles


[PDF] arxiv.org

First-Order Methods for Wasserstein Distributionally Robust MDP

J Grand-Clement, C Kroer - arXiv preprint arXiv:2009.06790, 2020 - arxiv.org

Markov Decision Processes (MDPs) are known to be sensitive to parameter specification.

Distributionally robust MDPs alleviate this issue by allowing for ambiguity sets which give a

set of possible distributions over parameter sets. The goal is to find an optimal policy with …

  Cited by 1 Related articles All 3 versions 


Wasserstein based transfer network for cross-domain sentiment classification

Y Du, M He, L Wang, H Zhang - Knowledge-Based Systems, 2020 - Elsevier

Automatic sentiment analysis of social media texts is of great significance for identifying

people's opinions that can help people make better decisions. Annotating data is time

consuming and laborious, and effective sentiment analysis on domains lacking of labeled …

  Cited by 2 Related articles All 2 versions


[PDF] aaai.org

Regularized Wasserstein means for aligning distributional data

L Mi, W Zhang, Y Wang - Proceedings of the AAAI Conference on …, 2020 - ojs.aaai.org

We propose to align distributional data from the perspective of Wasserstein means. We raise

the problem of regularizing Wasserstein means and propose several terms tailored to tackle

different problems. Our formulation is based on the variational transportation to distribute a …

  Cited by 3 Related articles All 5 versions 


[PDF] ieee.org

An Ensemble Wasserstein Generative Adversarial Network Method for Road Extraction From High Resolution Remote Sensing Images in Rural Areas

C Yang, Z Wang - IEEE Access, 2020 - ieeexplore.ieee.org

Road extraction from high resolution remote sensing (HR-RS) images is an important yet

challenging computer vision task. In this study, we propose an ensemble Wasserstein

Generative Adversarial Network with Gradient Penalty (WGAN-GP) method called E-WGAN …

  Cited by 2 Related articles All 2 versions


2020


[PDF] arxiv.org

Wasserstein metric for improved QML with adjacency matrix representations

O Çaylak, OA von Lilienfeld, B Baumeier - arXiv preprint arXiv:2001.11005, 2020 - arxiv.org

We study the Wasserstein metric to measure distances between molecules represented by

the atom index dependent adjacency" Coulomb" matrix, used in kernel ridge regression

based supervised learning. Resulting quantum machine learning models exhibit improved …

  Cited by 1 Related articles All 2 versions 


[PDF] arxiv.org

Efficient Wasserstein Natural Gradients for Reinforcement Learning

T Moskovitz, M Arbel, F Huszar, A Gretton - arXiv preprint arXiv …, 2020 - arxiv.org

A novel optimization approach is proposed for application to policy gradient methods and

evolution strategies for reinforcement learning (RL). The procedure uses a computationally

efficient Wasserstein natural gradient (WNG) descent that takes advantage of the geometry …

  Cited by 1 Related articles All 2 versions 


[PDF] arxiv.org

Wasserstein Adversarial Autoencoders for Knowledge Graph Embedding based Drug-Drug Interaction Prediction

Y Dai, C Guo, W Guo, C Eickhoff - arXiv preprint arXiv:2004.07341, 2020 - arxiv.org

Interaction between pharmacological agents can trigger unexpected adverse events.

Capturing richer and more comprehensive information about drug-drug interactions (DDI) is

one of the key tasks in public health and drug development. Recently, several knowledge …

  Cited by 1 Related articles All 2 versions 


[PDF] thecvf.com

Channel Pruning for Accelerating Convolutional Neural Networks via Wasserstein Metric

H Duan, H Li - Proceedings of the Asian Conference on …, 2020 - openaccess.thecvf.com

Channel pruning is an effective way to accelerate deep convolutional neural networks.

However, it is still a challenge to reduce the computational complexity while preserving the

performance of deep models. In this paper, we propose a novel channel pruning method via …


[PDF] Dual Rejection Sampling for Wasserstein Auto-Encoders

L Hou, H Shen, X Cheng - 24th European Conference on Artificial …, 2020 - ecai2020.eu

Deep generative models enhanced by Wasserstein distance have achieved remarkable

success in recent years. Wasserstein Auto-Encoders (WAEs) are auto-encoder based

generative models that aim to minimize the Wasserstein distance between the data …

  Cited by 1 Related articles All 3 versions 

<——2020——2020———2600——



[PDF] arxiv.org

Wasserstein K-Means for Clustering Tomographic Projections

R Rao, A Moscovich, A Singer - arXiv preprint arXiv:2010.09989, 2020 - arxiv.org

Motivated by the 2D class averaging problem in single-particle cryo-electron microscopy

(cryo-EM), we present a k-means algorithm based on a rotationally-invariant Wasserstein

metric for images. Unlike existing methods that are based on Euclidean ($ L_2 $) distances …

  Cited by 1 Related articles All 5 versions 
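
To make the "k-means with a Wasserstein metric" idea concrete, here is a heavily simplified sketch on 1D empirical measures, where the 2-Wasserstein distance and barycenter have closed forms via sorted samples. It only illustrates the generic assignment/update loop, not the rotationally invariant image metric used for cryo-EM projections in the paper; all names are mine.

import numpy as np

def wasserstein_kmeans_1d(samples, k, n_iter=20, seed=0):
    # samples: (n_points, n_atoms) array; each row is an empirical 1D measure
    rng = np.random.default_rng(seed)
    Xs = np.sort(samples, axis=1)                                   # quantile representation
    centers = Xs[rng.choice(len(Xs), size=k, replace=False)]
    for _ in range(n_iter):
        # squared W2 between equal-size empirical measures = mean squared difference of sorted values
        d2 = ((Xs[:, None, :] - centers[None, :, :]) ** 2).mean(axis=2)
        labels = d2.argmin(axis=1)
        # the W2 barycenter of 1D empirical measures is the average of their quantile vectors
        centers = np.stack([Xs[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
                            for j in range(k)])
    return labels, centers

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(0, 1, (30, 100)), rng.normal(5, 1, (30, 100))])
labels, _ = wasserstein_kmeans_1d(data, k=2)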


Learning Wasserstein Isometric Embedding for Point Clouds

K Kawano, S Koide, T Kutsuna - 2020 International Conference …, 2020 - ieeexplore.ieee.org

The Wasserstein distance has been employed for determining the distance between point

clouds, which have variable numbers of points and invariance of point order. However, the

high computational cost associated with the Wasserstein distance hinders its practical …

  All 2 versions


[PDF] arxiv.org

Strong Formulations for Distributionally Robust Chance-Constrained Programs with Left-Hand Side Uncertainty under Wasserstein Ambiguity

N Ho-Nguyen, F Kılınç-Karzan, S Küçükyavuz… - arXiv preprint arXiv …, 2020 - arxiv.org

Distributionally robust chance-constrained programs (DR-CCP) over Wasserstein ambiguity

sets exhibit attractive out-of-sample performance and admit big-$ M $-based mixed-integer

programming (MIP) reformulations with conic constraints. However, the resulting …

  Cited by 3 Related articles All 3 versions 


[PDF] arxiv.org

Wasserstein Collaborative Filtering for Item Cold-start Recommendation

Y Meng, X Yan, W Liu, H Wu, J Cheng - … of the 28th ACM Conference on …, 2020 - dl.acm.org

Item cold-start recommendation, which predicts user preference on new items that have no

user interaction records, is an important problem in recommender systems. In this paper, we

model the disparity between user preferences on warm items (those having interaction …

  Cited by 2 Related articles All 4 versions


[PDF] arxiv.org

Partial Gromov-Wasserstein Learning for Partial Graph Matching

W Liu, C Zhang, J Xie, Z Shen, H Qian… - arXiv preprint arXiv …, 2020 - arxiv.org

Graph matching finds the correspondence of nodes across two graphs and is a basic task in

graph-based machine learning. Numerous existing methods match every node in one graph

to one node in the other graph whereas two graphs usually overlap partially in …

  Related articles All 4 versions 


2020


[PDF] arxiv.org

Wasserstein Generative Models for Patch-based Texture Synthesis

A Houdard, A Leclaire, N Papadakis… - arXiv preprint arXiv …, 2020 - arxiv.org

In this paper, we propose a framework to train a generative model for texture image

synthesis from a single example. To do so, we exploit the local representation of images via

the space of patches, that is, square sub-images of fixed size (eg $4\times 4$). Our main …

  Cited by 1 Related articles All 10 versions 


[PDF] imstat.org

A Central Limit Theorem for Wasserstein type distances between two distinct univariate distributions

P Berthet, JC Fort, T Klein - Annales de l'Institut Henri Poincaré …, 2020 - projecteuclid.org

In this article we study the natural nonparametric estimator of a Wasserstein type cost

between two distinct continuous distributions $ F $ and $ G $ on $\mathbb {R} $. The

estimator is based on the order statistics of a sample having marginals $ F $, $ G $ and any …

  Related articles All 4 versions
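
The "natural nonparametric estimator" referred to in the snippet is the quantile (order-statistics) coupling. A minimal sketch, assuming two equal-size samples from F and G on the real line (names mine):

import numpy as np

def empirical_wasserstein_p(x, y, p=2):
    # order-statistics estimator of the W_p cost between two univariate samples of equal size
    return np.mean(np.abs(np.sort(x) - np.sort(y)) ** p) ** (1 / p)

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 10000)
y = rng.normal(0.5, 1.0, 10000)
print(empirical_wasserstein_p(x, y))   # close to 0.5, the W_2 distance between N(0,1) and N(0.5,1)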


[PDF] arxiv.org

A material decomposition method for dual-energy CT via dual interactive Wasserstein generative adversarial networks

Z Shi, H Li, Q Cao, Z Wang, M Cheng - arXiv preprint arXiv:2007.11247, 2020 - arxiv.org

Dual-energy computed tomography has great potential in material characterization and

identification, whereas the reconstructed material-specific images always suffer from

magnified noise and beam hardening artifacts. In this study, a data-driven approach using …

  Related articles All 3 versions 


[PDF] arxiv.org

Wasserstein Distance Regularized Sequence Representation for Text Matching in Asymmetrical Domains

W Yu, C Xu, J Xu, L Pang, X Gao, X Wang… - arXiv preprint arXiv …, 2020 - arxiv.org

One approach to matching texts from asymmetrical domains is projecting the input

sequences into a common semantic space as feature vectors upon which the matching

function can be readily defined and learned. In real-world matching practices, it is often …

  Related articles All 3 versions 


Wasserstein-based fairness interpretability framework for machine learning models

A Miroshnikov, K Kotsiopoulos, R Franks… - arXiv preprint arXiv …, 2020 - arxiv.org

In this article, we introduce a fairness interpretability framework for measuring and

explaining bias in classification and regression models at the level of a distribution. In our

work, motivated by the ideas of Dwork et al.(2012), we measure the model bias across sub …

  Related articles All 2 versions 

<——2020——2020———2610——


[PDF] arxiv.org

Fast Epigraphical Projection-based Incremental Algorithms for Wasserstein Distributionally Robust Support Vector Machine

J Li, C Chen, AMC So - arXiv preprint arXiv:2010.12865, 2020 - arxiv.org

Wasserstein Distributionally Robust Optimization (DRO) is

concerned with finding decisions that perform well on data that are drawn from the worst-

case probability distribution within a Wasserstein ball centered at a certain nominal …

Cited by 2 Related articles All 6 versions 


[PDF] neurips.cc

[PDF] Quantile Propagation for Wasserstein-Approximate Gaussian Processes

R Zhang, C Walder, EV Bonilla… - Advances in Neural …, 2020 - proceedings.neurips.cc

Approximate inference techniques are the cornerstone of probabilistic methods based on

Gaussian process priors. Despite this, most work approximately optimizes standard

divergence measures such as the Kullback-Leibler (KL) divergence, which lack the basic …

  Related articles All 6 versions 


[HTML] aimsciences.org

[HTML] RWRM: Residual Wasserstein regularization model for image restoration

R He, X Feng, X Zhu, H Huang… - Inverse Problems & …, 2020 - aimsciences.org

Existing image restoration methods mostly make full use of various image prior information.

However, they rarely exploit the potential of residual histograms, especially their role as

ensemble regularization constraint. In this paper, we propose a residual Wasserstein  …

  Related articles All 2 versions 


Adaptive Wasserstein Hourglass for Weakly Supervised RGB 3D Hand Pose Estimation

Y Zhang, L Chen, Y Liu, W Zheng, J Yong - Proceedings of the 28th ACM …, 2020 - dl.acm.org

The deficiency of labeled training data is one of the bottlenecks in 3D hand pose estimation

from monocular RGB images. Synthetic datasets have a large number of images with

precise annotations, but their obvious difference with real-world datasets limits the …

  Related articles


Conditional Wasserstein Auto-Encoder for Interactive Vehicle Trajectory Prediction

C Fei, X He, S Kawahara, N Shirou… - 2020 IEEE 23rd …, 2020 - ieeexplore.ieee.org

Trajectory prediction is a crucial task required for autonomous driving. The highly

interactions and uncertainties in real-world traffic scenarios make it a challenge to generate

trajectories that are accurate, reasonable and covering diverse modality as much as …

  Related articles


2020


[PDF] arxiv.org

Convergence rate to equilibrium in Wasserstein distance for reflected jump–diffusions

A Sarantsev - Statistics & Probability Letters, 2020 - Elsevier

Convergence rate to the stationary distribution for continuous-time Markov processes can be

studied using Lyapunov functions. Recent work by the author provided explicit rates of

convergence in special case of a reflected jump–diffusion on a half-line. These results are …

  Related articles All 7 versions


[PDF] arxiv.org

An LP-based, strongly-polynomial 2-approximation algorithm for sparse Wasserstein barycenters

S Borgwardt - Operational Research, 2020 - Springer

Discrete Wasserstein barycenters correspond to optimal solutions of transportation problems

for a set of probability measures with finite support. Discrete barycenters are measures with

finite support themselves and exhibit two favorable properties: there always exists one with a …

  Cited by 4 Related articles All 3 versions


[PDF] arxiv.org

SWIFT: Scalable Wasserstein Factorization for Sparse Nonnegative Tensors

A Afshar, K Yin, S Yan, C Qian, JC Ho, H Park… - arXiv preprint arXiv …, 2020 - arxiv.org

Existing tensor factorization methods assume that the input tensor follows some specific

distribution (ie Poisson, Bernoulli and Gaussian), and solve the factorization by minimizing

some empirical loss functions defined based on the corresponding distribution. However, it …

  Cited by 1 Related articles All 3 versions 


Exponential contraction in Wasserstein distances for diffusion semigroups with negative curvature

FY Wang - Potential Analysis, 2020 - Springer

Let P_t be the (Neumann) diffusion semigroup generated by a weighted Laplacian on a

complete connected Riemannian manifold M without boundary or with a convex boundary. It

is well known that the Bakry-Emery curvature is bounded below by a positive constant > 0 …

  Cited by 5 Related articles


Wasserstein distance estimates for stochastic integrals by forward-backward stochastic calculus

JC Breton, N Privault - Potential Analysis, 2020 - Springer

We prove Wasserstein distance bounds between the probability distributions of stochastic

integrals with jumps, based on the integrands appearing in their stochastic integral

representations. Our approach does not rely on the Stein equation or on the propagation of …

  Related articles All 4 versions

<——2020——2020———2620——



Wasserstein Embeddings for Nonnegative Matrix Factorization

M Febrissy, M Nadif - … Conference on Machine Learning, Optimization, and …, 2020 - Springer

In the field of document clustering (or dictionary learning), the fitting error called the

Wasserstein (In this paper, we use “Wasserstein”,“Earth Mover's”,“Kantorovich–Rubinstein”

interchangeably) distance showed some advantages for measuring the approximation of the …

  Related articles


[PDF] arxiv.org

Encoded Prior Sliced Wasserstein AutoEncoder for learning latent manifold representations

S Krishnagopal, J Bedrossian - arXiv preprint arXiv:2010.01037, 2020 - arxiv.org

While variational autoencoders have been successful generative models for a variety of

tasks, the use of conventional Gaussian or Gaussian mixture priors are limited in their ability

to capture topological or geometric properties of data in the latent representation. In this …

  Related articles All 2 versions 


Wasserstein cycle-consistent generative adversarial network for improved seismic impedance inversion: Example on 3D SEAM model

A Cai, H Di, Z Li, H Maniar, A Abubakar - SEG Technical Program …, 2020 - library.seg.org

The convolutional neural networks (CNNs) have attracted great attentions in seismic

exploration applications by their capability of learning the representations of data with

multiple level of abstractions, given an adequate amount of labeled data. In seismic …

  Cited by 2 Related articles


Wasserstein GAN based on Autoencoder with back-translation for cross-lingual embedding mappings

Y Zhang, Y Li, Y Zhu, X Hu - Pattern Recognition Letters, 2020 - Elsevier

Recent works about learning cross-lingual word mappings (CWMs) focus on relaxing the

requirement of bilingual signals through generative adversarial networks (GANs). GANs

based models intend to enforce source embedding space to align target embedding space …

  Related articles All 2 versions


[PDF] arxiv.org

On Stein's factors for Poisson approximation in Wasserstein distance with non-linear transportation costs

ZW Liao, Y Ma, A Xia - arXiv preprint arXiv:2003.13976, 2020 - arxiv.org

We establish various bounds on the solutions to a Stein equation for Poisson approximation

in Wasserstein distance with non-linear transportation costs. The proofs are a refinement of

those in [Barbour and Xia (2006)] using the results in [Liu and Ma (2009)]. As a corollary, we …

  Related articles All 2 versions 


2020


[PDF] aalto.fi

Wasserstein-Distance-Based Temporal Clustering for Capacity-Expansion Planning in Power Systems

L Condeixa, F Oliveira… - … Conference on Smart …, 2020 - ieeexplore.ieee.org

As variable renewable energy sources are steadily incorporated in European power

systems, the need for higher temporal resolution in capacity-expansion models also

increases. Naturally, there exists a trade-off between the amount of temporal data used to …

  Cited by 1 Related articles


[PDF] arxiv.org

Wasserstein Convergence Rate for Empirical Measures on Noncompact Manifolds

FY Wang - arXiv preprint arXiv:2007.14667, 2020 - arxiv.org

Let $ X_t $ be the (reflecting) diffusion process generated by $ L:=\Delta+\nabla V $ on a

complete connected Riemannian manifold $ M $ possibly with a boundary $\partial M $,

where $ V\in C^ 1 (M) $ such that $\mu (dx):= e^{V (x)} dx $ is a probability measure. We …

  Cited by 2 Related articles All 2 versions 


A Generative Model for Zero-Shot Learning via Wasserstein Auto-encoder

X Luo, Z Cai, F Wu, J Xiao-Yuan - 2020 IEEE International …, 2020 - ieeexplore.ieee.org

Zero-shot learning aims to use the labeled instances to train the model, and then classifies

the instances that belong to a class without labeled instances. However, the training

instances and test instances are disjoint. Thus, the description of the classes (eg text …

  Related articles


Wasserstein Generative Adversarial Networks Based Data Augmentation for Radar Data Analysis

H Lee, J Kim, EK Kim, S Kim - Applied Sciences, 2020 - mdpi.com

Ground-based weather radar can observe a wide range with a high spatial and temporal

resolution. They are beneficial to meteorological research and services by providing

valuable information. Recent weather radar data related research has focused on applying …

  Cited by 1 Related articles All 2 versions 


Learning Wasserstein Distance-Based Gaussian Graphical Model for Multivariate Time Series Classification

HU Xuegang, L Jianxing, LI Peipei… - 2020 IEEE …, 2020 - ieeexplore.ieee.org

Multivariate time series classification occupies an important position in time series data

mining tasks and has been applied in many fields. However, due to the statistical coupling

between different variables of Multivariate Time Series (MTS) data, traditional classification …

  Related articles All 2 versions

<——2020——2020———2630—— 



[PDF] arxiv.org

Unsupervised Wasserstein Distance Guided Domain Adaptation for 3D Multi-domain Liver Segmentation

C You, J Yang, J Chapiro, JS Duncan - … Annotation-Efficient Learning for …, 2020 - Springer

Deep neural networks have shown exceptional learning capability and generalizability in

the source domain when massive labeled data is provided. However, the well-trained

models often fail in the target domain due to the domain shift. Unsupervised domain …

  Related articles All 3 versions


[PDF] iop.org

Data Augmentation Method for Switchgear Defect Samples Based on Wasserstein Generative Adversarial Network

X Huang, J Xiong, Y Zhang, J Liang… - Journal of Physics …, 2020 - iopscience.iop.org

The problem of sample imbalance will lead to poor generalization ability of the deep

learning model algorithm, and the phenomenon of overfitting during network training, which

limits the accuracy of intelligent fault diagnosis of switchgear equipment. In view of this, this …

  Related articles


A Super Resolution Method for Remote Sensing Images Based on Cascaded Conditional Wasserstein GANs

B Liu, H Li, Y Zhou, Y Peng, A Elazab… - 2020 IEEE 3rd …, 2020 - ieeexplore.ieee.org

High-resolution (HR) remote sensing imagery is quite beneficial for subsequent

interpretation. Obtaining HR images can be achieved by upgrading the imaging device. Yet,

the cost to perform this task is very huge. Thus, it is necessary to obtain HR images from low …

  Related articles


A Sliced Wasserstein Loss for Neural Texture Synthesis

E Heitz, K Vanhoey, T Chambon… - arXiv preprint …, 2020 - arxiv-export-lb.library.cornell.edu

We address the problem of computing a textural loss based on the statistics extracted from

the feature activations of a convolutional neural network optimized for object recognition (eg

VGG-19). The underlying mathematical problem is the measure of the distance between two …
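
As background for the sliced-Wasserstein loss above, here is a minimal Monte-Carlo sketch of the sliced 2-Wasserstein distance between two point clouds (random 1D projections, then sorted comparisons). It operates on generic point clouds rather than the VGG feature activations used in the paper; the names are mine.

import numpy as np

def sliced_wasserstein(X, Y, n_proj=200, seed=0):
    # sliced 2-Wasserstein distance between point clouds X, Y of shape (n, d)
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_proj):
        theta = rng.standard_normal(X.shape[1])
        theta /= np.linalg.norm(theta)                       # random direction on the sphere
        total += np.mean((np.sort(X @ theta) - np.sort(Y @ theta)) ** 2)
    return np.sqrt(total / n_proj)

rng = np.random.default_rng(1)
print(sliced_wasserstein(rng.normal(0, 1, (500, 3)), rng.normal(1, 1, (500, 3))))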

  

[PDF] researchgate.net

[PDF] Wasserstein Barycenters for Bayesian Learning: Technical Report

G Rios - 2020 - researchgate.net

Within probabilistic modelling, a crucial but challenging task is that of learning (or fitting) the

models. For models described by a finite set of parameters, this task is reduced to finding the

best parameters, to feed them into the model and then calculate the posterior distribution to …

  Related articles 


2020


[PDF] core.ac.uk

[PDF] Structure-preserving variational schemes for fourth order nonlinear partial differential equations with a Wasserstein gradient flow structure

B Ashworth - 2020 - core.ac.uk

There is a growing interest in studying nonlinear partial differential equations which

constitute gradient flows in the Wasserstein metric and related structure preserving

variational discretisations. In this thesis, we focus on the fourth order Derrida-Lebowitz …

  

Spatial-aware Network using Wasserstein Distance for Unsupervised Domain Adaptation

L Long, L Bin, F Jiang - 2020 Chinese Automation Congress …, 2020 - ieeexplore.ieee.org

In a general scenario, the purpose of Unsupervised Domain Adaptation (UDA) is to classify

unlabeled target domain data as much as possible, but the source domain data has a large

number of labels. To address this situation, this paper introduces the optimal transport theory …

 

Stochastic Approximation versus Sample Average Approximation for population Wasserstein barycenters

D Dvinskikh - arXiv e-prints, 2020 - ui.adsabs.harvard.edu

In machine learning and optimization community there are two main approaches for convex

risk minimization problem, namely, the Stochastic Approximation (SA) and the Sample

Average Approximation (SAA). In terms of oracle complexity (required number of stochastic …

  Cited by 1


  

Rethinking Wasserstein-Procrustes for Aligning Word Embeddings Across Languages

G Ramírez Santos - 2020 - upcommons.upc.edu

The emergence of unsupervised word embeddings, pre-trained on very large monolingual

text corpora, is at the core of the ongoing neural revolution in Natural Language Processing

(NLP). Initially introduced for English, such pre-trained word embeddings quickly emerged …

  

[PDF] uniroma1.it

[PDF] Nonparametric Density Estimation with Wasserstein Distance for Actuarial Applications

EG Luini - iris.uniroma1.it

Density estimation is a central topic in statistics and a fundamental task of actuarial sciences.

In this work, we present an algorithm for approximating multivariate empirical densities with

a piecewise constant distribution defined on a hyperrectangular-shaped partition of the …

  Related articles All 2 versions 

<——2020——2020———2640—— 



[PDF] optimization-online.org

[PDF] A Novel Solution Methodology for Wasserstein-based Data-Driven Distributionally Robust Problems

CA Gamboa, DM Valladao, A Street… - optimization-online.org

Distributionally robust optimization (DRO) is a mathematical framework to incorporate

ambiguity over the actual data-generating probability distribution. Data-driven DRO

problems based on the Wasserstein distance are of particular interest for their sound …

  Related articles 


Two approaches for population Wasserstein barycenter problem: Stochastic Averaging versus Sample Average Approximation

D Dvinskikh, A Gasnikov - nnov.hse.ru

Abstract In Machine Learning and Optimization community there are two main approaches

for convex risk minimization problem: Stochastic Averaging (SA) and Sample Average

Approximation (SAA). At the moment, it is known that both approaches are on average …

  Related articles 


[CITATION] Wasserstein distance estimates for jump-diffusion processes

JC Breton, N Privault - Preprint, 2020

  Cited by 2 Related articles



[PDF] mlr.press

A wasserstein minimum velocity approach to learning unnormalized models

Z Wang, S Cheng, L Yueru, J Zhu… - International …, 2020 - proceedings.mlr.press

Score matching provides an effective approach to learning flexible unnormalized models,

but its scalability is limited by the need to evaluate second-order derivative. In this paper,

we present scalable approximation to general family of learning objectives including …

  Cited by 4 Related articles All 9 versions 


[PDF] mlr.press

A fast proximal point method for computing exact Wasserstein distance

Y Xie, X Wang, R Wang, H Zha - Uncertainty in Artificial …, 2020 - proceedings.mlr.press

Wasserstein distance plays increasingly important roles in machine learning, stochastic

programming and image processing. Major efforts have been under way to address its high

computational complexity, some leading to approximate or regularized variations such as …

  Cited by 54 Related articles All 5 versions 


 

2020

[PDF] arxiv.org

Wasserstein distributionally robust stochastic control: a data-driven approach

I Yang - IEEE Transactions on Automatic Control, 2020 - ieeexplore.ieee.org

Standard stochastic control methods assume that the probability distribution of uncertain

variables is available. Unfortunately, in practice, obtaining accurate distribution information

is a challenging task. To resolve this issue, in this article we investigate the problem of …

  Cited by 24 Related articles All 3 versions


[PDF] thecvf.com

Gromov-Wasserstein averaging in a Riemannian framework

S ChowdhuryT Needham - Proceedings of the IEEE/CVF …, 2020 - openaccess.thecvf.com

We introduce a theoretical framework for performing statistical tasks, including, but not

limited to, averaging and principal component analysis, on the space of (possibly

asymmetric) matrices with arbitrary entries and sizes. This is carried out under the lens of the …

  Cited by 7 Related articles All 6 versions 


A Wasserstein based two-stage distributionally robust optimization model for optimal operation of CCHP micro-grid under uncertainties

Y Wang, Y Yang, L Tang, W Sun, B Li - International Journal of Electrical …, 2020 - Elsevier

Combined cooling, heating and power (CCHP) micro-grids are getting increasing attentions

due to the realization of cleaner production and high energy efficiency. However, with the

features of complex tri-generation structure and renewable power uncertainties, it is …

  Cited by 16 Related articles All 2 versions



[PDF] arxiv.org

A new approach to posterior contraction rates via Wasserstein dynamics

E Dolera, S Favaro, E Mainini - arXiv preprint arXiv:2011.14425, 2020 - arxiv.org

This paper presents a new approach to the classical problem of quantifying posterior

contraction rates (PCRs) in Bayesian statistics. Our approach relies on Wasserstein

distance, and it leads to two main contributions which improve on the existing literature of …

  Cited by 1 Related articles All 2 versions 


[HTML] mdpi.com

Calculating the Wasserstein metric-based Boltzmann entropy of a landscape mosaic

H Zhang, Z Wu, T Lan, Y Chen, P Gao - Entropy, 2020 - mdpi.com

Shannon entropy is currently the most popular method for quantifying the disorder or

information of a spatial data set such as a landscape pattern or a cartographic map.

However, its drawback when applied to spatial data is also well documented; it is incapable …

  Cited by 3 Related articles All 9 versions 

<——2020——2020———2650—— 



[HTML] nih.gov

[HTML] EEG signal reconstruction using generative adversarial network with wasserstein distance and temporal-spatial-frequency loss

T Luo, Y Fan, L Chen, G Guo, C Zhou - Frontiers in …, 2020 - ncbi.nlm.nih.gov

Applications based on electroencephalography (EEG) signals suffer from the mutual

contradiction of high classification performance vs. low cost. The nature of this contradiction

makes EEG signal reconstruction with high sampling rates and sensitivity challenging …

  Cited by 6 Related articles All 5 versions


[PDF] arxiv.org

SVGD as a kernelized Wasserstein gradient flow of the chi-squared divergence

S Chewi, TL Gouic, C Lu, T Maunu… - arXiv preprint arXiv …, 2020 - arxiv.org

Stein Variational Gradient Descent (SVGD), a popular sampling algorithm, is often described

as the kernelized gradient flow for the Kullback-Leibler divergence in the geometry of

optimal transport. We introduce a new perspective on SVGD that instead views SVGD as the …

  Cited by 4 Related articles All 5 versions 
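
For context, the SVGD update that the paper reinterprets as a kernelized gradient flow is itself a short computation. The sketch below is the standard update with a fixed-bandwidth RBF kernel, not the chi-squared-divergence machinery developed in the paper; the names and the toy target are mine.

import numpy as np

def svgd_step(X, grad_log_p, step=0.1, h=1.0):
    # one Stein variational gradient descent update with RBF kernel exp(-||x - y||^2 / (2 h^2))
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq / (2 * h ** 2))                               # symmetric kernel matrix
    grads = grad_log_p(X)                                        # row j = grad log p(x_j)
    attract = K @ grads                                          # kernel-weighted score term
    repulse = (X * K.sum(axis=0)[:, None] - K @ X) / h ** 2      # gradient of the kernel (repulsion)
    return X + step * (attract + repulse) / X.shape[0]

# example: push particles toward a standard Gaussian target (grad log p(x) = -x)
X = np.random.default_rng(0).normal(3.0, 1.0, size=(200, 1))
for _ in range(300):
    X = svgd_step(X, grad_log_p=lambda x: -x, step=0.05)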


Sample generation based on supervised Wasserstein Generative Adversarial Network for high-resolution remote-sensing scene classification

W HanL Wang, R Feng, L Gao, X Chen, Z Deng… - Information …, 2020 - Elsevier

As high-resolution remote-sensing (HRRS) images have become increasingly widely

available, scene classification focusing on the smart classification of land cover and land

use has also attracted more attention. However, mainstream methods encounter severe …

  Cited by 5 Related articles All 3 versions


Linear Programming Approximation of Distributionally Robust Chance-Constrained Dispatch With Wasserstein Distance

Zhou, M Yang, M Wang… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

This paper proposes a data-driven distributionally robust chance constrained real-time

dispatch (DRCC-RTD) considering renewable generation forecasting errors. The proposed

DRCC-RTD model minimizes the expected quadratic cost function and guarantees that the …

  Cited by 5 Related articles All 2 versions


Obtaining PET/CT images from non-attenuation corrected PET images in single PET system using Wasserstein generative adversarial networks

Z Hu, Y Li, S Zou, H Xue, Z Sang, X Liu… - Physics in Medicine …, 2020 - iopscience.iop.org

Positron emission tomography (PET) imaging plays an indispensable role in early disease

detection and postoperative patient staging diagnosis. However, PET imaging requires not

only additional computed tomography (CT) imaging to provide detailed anatomical …

  Cited by 6 Related articles All 5 versions

Obtaining PET/CT images from non-attenuation corrected PET images in a single PET system using Wasserstein...

by Hu, Zhanli; Li, Yongchang; Zou, Sijuan; More...

Physics in medicine & biology, 11/2020, Volume 65, Issue 21

Positron emission tomography (PET) imaging plays an indispensable role in early disease detection and postoperative patient staging diagnosis. However, PET...

Journal Article Full Text Online

2020


A data-driven distributionally robust newsvendor model with a Wasserstein ambiguity set

S Lee, H Kim, I Moon - Journal of the Operational …, 2020 - orsociety.tandfonline.com

In this paper, we derive a closed-form solution and an explicit characterization of the worst-

case distribution for the data-driven distributionally robust newsvendor model with an

ambiguity set based on the Wasserstein distance of order p ∈ [1, ∞). We also consider the …

  Cited by 4 Related articles All 2 versions


[PDF] researchgate.net

Data supplement for soft sensor using a new generative model based on variational autoencoder and Wasserstein GAN

X Wang, H Liu - Journal of Process Control, 2020 - Elsevier

In industrial process control, measuring some variables is difficult for environmental or cost

reasons. This necessitates employing a soft sensor to predict these variables by using the

collected data from easily measured variables. The prediction accuracy and computational …

  Cited by 7 Related articles All 3 versions


[PDF] arxiv.org

A Wasserstein coupled particle filter for multilevel estimation

M Ballesio, A Jasra, E von Schwerin… - arXiv preprint arXiv …, 2020 - arxiv.org

In this paper, we consider the filtering problem for partially observed diffusions, which are

regularly observed at discrete times. We are concerned with the case when one must resort

to time-discretization of the diffusion process if the transition density is not available in an …

  Cited by 2 Related articles All 4 versions 


[PDF] arxiv.org

A variational finite volume scheme for Wasserstein gradient flows

C CancèsTO GallouëtG Todeschi - Numerische Mathematik, 2020 - Springer

We propose a variational finite volume scheme to approximate the solutions to Wasserstein

gradient flows. The time discretization is based on an implicit linearization of the

Wasserstein distance expressed thanks to Benamou–Brenier formula, whereas space …

  Cited by 6 Related articles All 9 versions


W-LDMM: A wasserstein driven low-dimensional manifold model for noisy image restoration

R He, X Feng, W Wang, X Zhu, C Yang - Neurocomputing, 2020 - Elsevier

The Wasserstein distance originated from the optimal transport theory is a general and

flexible statistical metric in a variety of image processing problems. In this paper, we propose

a novel Wasserstein driven low-dimensional manifold model (W-LDMM), which tactfully …

  Cited by 3 Related articles All 2 versions

<——2020——2020———2660—— 



A Wasserstein gradient-penalty generative adversarial network with deep auto-encoder for bearing intelligent fault diagnosis

X Xiong, J Hongkai, X Li, M Niu - Measurement Science and …, 2020 - iopscience.iop.org

It is a great challenge to manipulate unbalanced fault data in the field of rolling bearings

intelligent fault diagnosis. In this paper, a novel intelligent fault diagnosis method called the

Wasserstein gradient-penalty generative adversarial network with deep auto-encoder is …

  Cited by 6 Related articles All 2 versions


[PDF] unifi.it

[PDF] Conlon: a pseudo-song generator based on a new pianoroll, wasserstein autoencoders, and optimal interpolations

L Angioloni, T Borghuis, L Brusci… - Proceedings of the 21st …, 2020 - flore.unifi.it

We introduce CONLON, a pattern-based MIDI generation method that employs a new

lossless pianoroll-like data description in which velocities and durations are stored in

separate channels. CONLON uses Wasserstein autoencoders as the underlying generative …

  Cited by 1 Related articles All 7 versions


[PDF] Quantifying the Empirical Wasserstein Distance to a Set of Measures: Beating the Curse of Dimensionality

N Si, J Blanchet, S Ghosh, M Squillante - Advances in Neural …, 2020 - stanford.edu

Page 1. Quantifying the Empirical Wasserstein Distance to a Set of Measures: Beating the Curse

of Dimensionality Nian Si Joint work with Jose Blanchet, Soumyadip Ghosh, and Mark Squillante

NeurIPS 2020 October 22, 2020 niansi@stanford.edu (Stanford) Wasserstein Projection October …

  Related articles 


 

 

[PDF] arxiv.org

Infinite-dimensional regularization of McKean-Vlasov equation with a Wasserstein diffusion

V Marx - arXiv preprint arXiv:2002.10157, 2020 - arxiv.org

Much effort has been spent in recent years on restoring uniqueness of McKean-Vlasov

SDEs with non-smooth coefficients. As typical instance, the velocity field is assumed to be

bounded and measurable in its space variable and Lipschitz-continuous with respect to the …

  Cited by 1 Related articles All 9 versions 


[PDF] arxiv.org

Modeling EEG data distribution with a Wasserstein Generative Adversarial Network to predict RSVP Events

S Panwar, P Rad, TP Jung… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

Electroencephalography (EEG) data are difficult to obtain due to complex experimental

setups and reduced comfort with prolonged wearing. This poses challenges to train a powerful

deep learning model with the limited EEG data. Being able to generate EEG data …

  Cited by 4 Related articles All 5 versions


2020


[PDF] arxiv.org

Bismut-Elworthy inequality for a Wasserstein diffusion on the circle

V Marx - arXiv preprint arXiv:2005.04972, 2020 - arxiv.org

We investigate in this paper a regularization property of a diffusion on the Wasserstein

space $\mathcal{P}_2(\mathbb{T})$ of the one-dimensional torus. The control obtained

on the gradient of the semi-group is very much in the spirit of Bismut-Elworthy-Li integration …

  Related articles All 9 versions 


[PDF] arxiv.org

The Wasserstein Impact Measure (WIM): a generally applicable, practical tool for quantifying prior impact in Bayesian statistics

F Ghaderinezhad, C Ley, B Serrien - arXiv preprint arXiv:2010.12522, 2020 - arxiv.org

The prior distribution is a crucial building block in Bayesian analysis, and its choice will

impact the subsequent inference. It is therefore important to have a convenient way to

quantify this impact, as such a measure of prior impact will help us to choose between two or …

  Related articles All 3 versions 


2020

[PDF] upenn.edu

Optimal Estimation of Wasserstein Distance on a Tree With an Application to Microbiome Studies

S Wang, TT Cai, H Li - Journal of the American Statistical …, 2020 - Taylor & Francis

The weighted UniFrac distance, a plug-in estimator of the Wasserstein distance of read

counts on a tree, has been widely used to measure the microbial community difference in

microbiome studies. Our investigation however shows that such a plug-in estimator …

  Related articles All 4 versions


[PDF] sciencedirect.com

A novel kernel Wasserstein distance on Gaussian measures: An application of identifying dental artifacts in head and neck computed tomography

…, Iyer, AP Apte, JO Deasy, A Tannenbaum - Computers in biology …, 2020 - Elsevier

The Wasserstein distance is a powerful metric based on the theory of optimal mass

transport. It gives a natural measure of the distance between two distributions with a wide

range of applications. In contrast to a number of the common divergences on distributions …

  Cited by 3 Related articles All 5 versions


[PDF] imstat.org

A Central Limit Theorem for Wasserstein type distances between two distinct univariate distributions

P Berthet, JC Fort, T Klein - Annales de l'Institut Henri Poincaré …, 2020 - projecteuclid.org

In this article we study the natural nonparametric estimator of a Wasserstein type cost

between two distinct continuous distributions $F$ and $G$ on $\mathbb{R}$. The

estimator is based on the order statistics of a sample having marginals $F$, $G$ and any …

  Related articles All 4 versions
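
The order-statistics construction mentioned in this entry is easy to state concretely. The sketch below is only an illustration of that idea (plain NumPy, equal sample sizes, my own function names, not the authors' code): on the real line the optimal coupling between two equal-size empirical measures pairs order statistics, so the empirical Wasserstein-p cost reduces to sorting.

import numpy as np

def empirical_wasserstein_1d(x, y, p=2):
    # On the real line, the optimal coupling between two equal-size empirical
    # measures pairs order statistics, so W_p^p is the mean of |X_(i) - Y_(i)|^p.
    x = np.sort(np.asarray(x, dtype=float))
    y = np.sort(np.asarray(y, dtype=float))
    assert x.shape == y.shape, "sketch assumes equal sample sizes"
    return np.mean(np.abs(x - y) ** p) ** (1.0 / p)

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=10_000)
y = rng.normal(0.5, 1.0, size=10_000)
print(empirical_wasserstein_1d(x, y, p=2))  # approximately 0.5 for these two Gaussians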

<——2020——2020———2670—— 




[PDF] arxiv.org

A material decomposition method for dual-energy CT via dual interactive Wasserstein generative adversarial networks

Z Shi, H Li, Q Cao, Z Wang, M Cheng - arXiv preprint arXiv:2007.11247, 2020 - arxiv.org

Dual-energy computed tomography has great potential in material characterization and

identification, whereas the reconstructed material-specific images always suffer from

magnified noise and beam hardening artifacts. In this study, a data-driven approach using …

  Related articles All 3 versions 



Trajectories from Distribution-Valued Functional Curves: A Unified Wasserstein Framework

A Sharma, G Gerig - … Conference on Medical Image Computing and …, 2020 - Springer

Temporal changes in medical images are often evaluated along a parametrized function that

represents a structure of interest (eg white matter tracts). By attributing samples along these

functions with distributions of image properties in the local neighborhood, we create …

  Related articles All 2 versions


[PDF] nsf.gov

Data-Driven Distributionally Robust Game Using Wasserstein Distance

G Peng, T Zhang, Q Zhu - International Conference on Decision and Game …, 2020 - Springer

This paper studies a special class of games, which enables the players to leverage the

information from a dataset to play the game. However, in an adversarial scenario, the

dataset may not be trustworthy. We propose a distributionally robust formulation to introduce …

  Cited by 1 Related articles All 2 versions


[PDF] arxiv.org

On a Novel Application of Wasserstein-Procrustes for Unsupervised Cross-Lingual Learning

G Ramírez, R Dangovski, P Nakov… - arXiv preprint arXiv …, 2020 - arxiv.org

The emergence of unsupervised word embeddings, pre-trained on very large monolingual

text corpora, is at the core of the ongoing neural revolution in Natural Language Processing

(NLP). Initially introduced for English, such pre-trained word embeddings quickly emerged …

  Related articles All 3 versions 


System and method for unsupervised domain adaptation via sliced-wasserstein distance

AJ Gabourie, M Rostami, S Kolouri… - US Patent App. 16 …, 2020 - freepatentsonline.com

Described is a system for unsupervised domain adaptation in an autonomous learning

agent. The system adapts a learned model with a set of unlabeled data from a target

domain, resulting in an adapted model. The learned model was previously trained to …

  Cited by 2 Related articles All 2 versions 


2020


A new Wasserstein distance- and cumulative sum-dependent health indicator and its application in prediction of remaining useful life of bearing

J Yin, M Xu, H Zheng, Y Yang - Journal of the Brazilian Society of …, 2020 - Springer

The safety and reliability of mechanical performance are affected by the condition (health

status) of the bearings. A health indicator (HI) with high monotonicity and robustness is a

helpful tool to simplify the predictive model and improve prediction accuracy. In this paper, …

  Related articles


[PDF] ieee.org

A New Data-Driven Distributionally Robust Portfolio Optimization Method based on Wasserstein Ambiguity Set

N Du, Y Liu, Y Liu - IEEE Access, 2020 - ieeexplore.ieee.org

Since the optimal portfolio strategy depends heavily on the distribution of uncertain returns, this

paper proposes a new method for the portfolio optimization problem with respect to

distribution uncertainty. When the distributional information of the uncertain return rate is …

  Related articles


[PDF] arxiv.org

A Riemannian Block Coordinate Descent Method for Computing the Projection Robust Wasserstein Distance

M Huang, S Ma, L Lai - arXiv preprint arXiv:2012.05199, 2020 - arxiv.org

The Wasserstein distance has become increasingly important in machine learning and deep

learning. Despite its popularity, the Wasserstein distance is hard to approximate because of

the curse of dimensionality. A recently proposed approach to alleviate the curse of …

  Cited by 1 Related articles All 3 versions 


[PDF] arxiv.org

Safe Zero-Shot Model-Based Learning and Control: A Wasserstein Distributionally Robust Approach

A Kandel, SJ Moura - arXiv preprint arXiv:2004.00759, 2020 - arxiv.org

This paper explores distributionally robust zero-shot model-based learning and control

using Wasserstein ambiguity sets. Conventional model-based reinforcement learning

algorithms struggle to guarantee feasibility throughout the online learning process. We …

  Related articles All 2 versions 


[PDF] arxiv.org

Reweighting samples under covariate shift using a Wasserstein distance criterion

J Reygner, A Touboul - arXiv preprint arXiv:2010.09267, 2020 - arxiv.org

Considering two random variables with different laws to which we only have access through

finite size iid samples, we address how to reweight the first sample so that its empirical

distribution converges towards the true law of the second sample as the size of both …

  Related articles All 26 versions 

<——2020——2020———2680—— 



A Novel Data-to-Text Generation Model with Transformer Planning and a Wasserstein Auto-Encoder

X Xu, T He, H Wang - 2020 IEEE International Conference on …, 2020 - ieeexplore.ieee.org

Existing methods for data-to-text generation have difficulty producing diverse texts with low

duplication rates. In this paper, we propose a novel data-to-text generation model with

Transformer planning and a Wasserstein auto-encoder, which can convert constructed data …

  Related articles All 2 versions


[HTML] Missing Features Reconstruction Using a Wasserstein Generative Adversarial Imputation Network

M Friedjungová, D Vašata, M Balatsko… - … on Computational Science, 2020 - Springer

Missing data is one of the most common preprocessing problems. In this paper, we

experimentally research the use of generative and non-generative models for feature

reconstruction. Variational Autoencoder with Arbitrary Conditioning (VAEAC) and …

  Cited by 1 Related articles All 8 versions


[PDF] stanford.edu

A Class of Optimal Transport Regularized Formulations with Applications to Wasserstein GANs

S Mahdian, JH Blanchet… - 2020 Winter Simulation …, 2020 - ieeexplore.ieee.org

Optimal transport costs (eg Wasserstein distances) are used for fitting high-dimensional

distributions. For example, popular artificial intelligence algorithms such as Wasserstein

Generative Adversarial Networks (WGANs) can be interpreted as fitting a black-box …

  All 2 versions


[PDF] iop.org

A collaborative filtering recommendation framework based on Wasserstein GAN

R Li, F Qian, X Du, S Zhao… - Journal of Physics …, 2020 - iopscience.iop.org

Compared with the original GAN, Wasserstein GAN minimizes the Wasserstein distance

between the generative distribution and the real distribution; it can well capture the potential

distribution of data and has achieved excellent results in image generation. However, the …

  Related articles


[PDF] arxiv.org

On the Wasserstein distance for a martingale central limit theorem

X Fan, X Ma - Statistics & Probability Letters, 2020 - Elsevier

We prove an upper bound on the Wasserstein

distance between normalized martingales and the standard normal random variable, which …

  Related articles All 8 versions


2020


[PDF] arxiv.org

Portfolio Optimisation within a Wasserstein Ball

SM Pesenti, S Jaimungal - Available at SSRN, 2020 - papers.ssrn.com

We consider the problem of active portfolio management where a loss-averse and/or gain-

seeking investor aims to outperform a benchmark strategy's risk profile while not deviating

too much from it. Specifically, an investor considers alternative strategies that co-move with …

  Related articles All 7 versions


A Riemannian submersion‐based approach to the Wasserstein barycenter of positive definite matrices

M Li, H Sun, D Li - Mathematical Methods in the Applied …, 2020 - Wiley Online Library

In this paper, we introduce a novel geometrization on the space of positive definite matrices,

derived from the Riemannian submersion from the general linear group to the space of

positive definite matrices, resulting in easier computation of its geometric structure. The …

  Related articles


[PDF] sciencedirect.com

Intelligent Fault Diagnosis with Deep Transfer Network based on Wasserstein Distance

J Xu, J Huang, Y Zhao, L Zhou - Procedia Computer Science, 2020 - Elsevier

Intelligent fault-diagnosis methods based on deep-learning technology have been very

successful for complex industrial systems. The deep learning based fault classification

model requires a large number of labeled data. Moreover, the probability distribution of …

  Related articles


[PDF] arxiv.org

Generating Natural Adversarial Hyperspectral examples with a modified Wasserstein GAN

JC Burnel, K Fatras, N Courty - arXiv preprint arXiv:2001.09993, 2020 - arxiv.org

Adversarial examples are a hot topic due to their ability to fool a classifier's prediction.

There are two strategies to create such examples: one uses the attacked classifier's

gradients, while the other only requires access to the classifier's prediction. This is …

  Related articles All 4 versions 


Generative Model for Zero-Shot Learning via Wasserstein Auto-encoder

X Luo, Z Cai, F Wu, J Xiao-Yuan - 2020 IEEE International …, 2020 - ieeexplore.ieee.org

Zero-shot learning aims to use the labeled instances to train the model, and then classifies

the instances that belong to a class without labeled instances. However, the training

instances and test instances are disjoint. Thus, the description of the classes (eg text …

  Related articles

<——2020——2020———2690——



[HTML] Solutions of a Class of Degenerate Kinetic Equations Using Steepest Descent in Wasserstein Space

A Marcos, A Soglo - Journal of Mathematics, 2020 - hindawi.com

We use the steepest descent method in an Orlicz–Wasserstein space to study the existence

of solutions for a very broad class of kinetic equations, which include the Boltzmann

equation, the Vlasov–Poisson equation, the porous medium equation, and the parabolic p …

  Cited by 1 Related articles All 6 versions 


Sliced Wasserstein Loss for Neural Texture Synthesis

E Heitz, K Vanhoey, T Chambon… - arXiv preprint …, 2020 - arxiv-export-lb.library.cornell.edu

We address the problem of computing a textural loss based on the statistics extracted from

the feature activations of a convolutional neural network optimized for object recognition (eg

VGG-19). The underlying mathematical problem is the measure of the distance between two …
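
As a side note, the sliced construction behind such a loss is simple to write down. The following is a minimal NumPy sketch of a Monte-Carlo sliced 2-Wasserstein distance between two point sets (for instance, flattened feature activations); it illustrates the general technique only, is not the paper's implementation, and the function name is mine.

import numpy as np

def sliced_wasserstein(X, Y, n_projections=64, seed=0):
    # X, Y: arrays of shape (n, d); the sketch assumes equal numbers of points.
    # Each random direction reduces the problem to 1D transport, solved by sorting.
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_projections):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)
        px = np.sort(X @ theta)
        py = np.sort(Y @ theta)
        total += np.mean((px - py) ** 2)
    return np.sqrt(total / n_projections)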

  

Artificial Neural Network with Histogram Data Time Series Forecasting: A Least Squares Approach Based on Wasserstein Distance

P Rakpho, W Yamaka, K Zhu - Behavioral Predictive Modeling in …, 2020 - Springer

This paper aims to predict histogram time series, and we use high-frequency data

at 5-min intervals to construct the histogram data for each day. In this paper, we apply the Artificial

Neural Network (ANN) to an Autoregressive (AR) structure and introduce the AR–ANN model …

  Related articles All 3 versions


[PDF] ethz.ch

[PDF] Smooth Wasserstein Distance: Metric Structure and Statistical Efficiency

Z Goldfeld - International Zurich Seminar on Information …, 2020 - research-collection.ethz.ch

The Wasserstein distance has seen a surge of interest and applications in machine learning.

Its popularity is driven by many advantageous properties it possesses, such as metric

structure (metrization of weak convergence), robustness to support mismatch, compatibility …

  Related articles All 4 versions 


Wasserstein metric-based Boltzmann entropy of landscape mosaic: clarification, correction, and evaluation of thermodynamic consistency

P Gao, H Zhang, Z Wu - Landscape Ecology - Springer

Objectives: The first objective is to provide a clarification of and correction to the

Wasserstein metric-based method. The second is to evaluate the method in terms of

thermodynamic consistency using different implementations. Methods: Two implementation …

  Related articles


2020


Data-driven Risk-sensitive Appointment Scheduling: A Wasserstein Distributionally Robust Optimization Approach

Z Pang, S Wang - Available at SSRN 3740083, 2020 - papers.ssrn.com

We consider an optimal appointment scheduling problem for a single-server healthcare

delivery system with random durations, focusing on the tradeoff between overtime work and

patient delays which are measured under conditional value-at-risk (CVaR). To address the …

 

[PDF] optimization-online.org

[PDF] A Novel Solution Methodology for Wasserstein-based Data-Driven Distributionally Robust Problems

CA Gamboa, DM Valladao, A Street… - optimization-online.org

Distributionally robust optimization (DRO) is a mathematical framework to incorporate

ambiguity over the actual data-generating probability distribution. Data-driven DRO

problems based on the Wasserstein distance are of particular interest for their sound …

  Related articles 


Remaining useful life prediction of lithium-ion batteries using a fusion method based on Wasserstein GAN

周温丁, 鲍士兼, 许方敏, 赵成林 - The Journal of China Universities of Posts and Telecommunications (English edition), 2020 - jcupt.bupt.edu.cn

Lithium-ion batteries are the main power supply equipment in many fields due to their

advantages of no memory, high energy density, long cycle life and no pollution to the

environment. Accurate prediction for the remaining useful life (RUL) of lithium-ion batteries …

  All 2 versions 


[BOOK] An invitation to statistics in Wasserstein space

VM Panaretos, Y Zemel - 2020 - library.oapen.org

This open access book presents the key aspects of statistics in Wasserstein spaces, ie

statistics in the space of probability measures when endowed with the geometry of optimal

transportation. Further to reviewing state-of-the-art aspects, it also provides an accessible …

  Cited by 21 Related articles All 7 versions 


[PDF] arxiv.org

Convergence and concentration of empirical measures under Wasserstein distance in unbounded functional spaces

J Lei - Bernoulli, 2020 - projecteuclid.org

We provide upper bounds of the expected Wasserstein distance between a probability

measure and its empirical version, generalizing recent results for finite dimensional

 Cited by 54 Related articles All 6 versions

<——2020——2020———2700—


[PDF] springer.com

[PDF] Adapted Wasserstein distances and stability in mathematical finance

J Backhoff-Veraguas, D Bartl, M Beiglböck… - Finance and …, 2020 - Springer

Assume that an agent models a financial asset through a measure with the goal to

price/hedge some derivative or optimise some expected utility. Even if the model is

chosen in the most skilful and sophisticated way, the agent is left with the possibility that  …

  Cited by 20 Related articles All 12 versions

[CITATION] Adapted wasserstein distances and stability in mathematical finance. arXiv e-prints, page

J Backhoff-Veraguas, D Bartl, M Beiglböck, M Eder - arXiv preprint arXiv:1901.07450, 2019

  Cited by 4 Related articles


Convergence rate to equilibrium in Wasserstein distance for reflected jump–diffusions

A Sarantsev - Statistics & Probability Letters, 2020 - Elsevier

Convergence rate to the stationary distribution for continuous-time Markov processes can be

studied using Lyapunov functions. Recent work by the author provided explicit rates of

convergence in the special case of a reflected jump–diffusion on a half-line. These results are …

  Related articles All 

[CITATION] Convergence Rate to Equilibrium in Wasserstein Distance for Reflected Jump-Diffusions (2020)

A Sarantsev - Statistics and Probability Letters, 2019

  Cited by 1


[PDF] mlr.press

A Wasserstein minimum velocity approach to learning unnormalized models

Z Wang, S Cheng, L Yueru, J Zhu… - International …, 2020 - proceedings.mlr.press

Score matching provides an effective approach to learning flexible unnormalized models,

but its scalability is limited by the need to evaluate a second-order derivative. In this paper,

we present a scalable approximation to a general family of learning objectives including …

  Cited by 4 Related articles All 9 versions 



[PDF] arxiv.org

A Wasserstein-type distance in the space of Gaussian mixture models

J Delon, A Desolneux - SIAM Journal on Imaging Sciences, 2020 - SIAM

In this paper we introduce a Wasserstein-type distance on the set of Gaussian mixture

models. This distance is defined by restricting the set of possible coupling measures in the

optimal transport problem to Gaussian mixture models. We derive a very simple discrete …

  Cited by 12 Related articles All 7 versions
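
For readers who want to see the construction in code: restricting couplings to Gaussian mixtures makes the mixture-level distance a small discrete optimal transport problem between components, with pairwise costs given by squared 2-Wasserstein (Bures) distances between Gaussians. The sketch below (NumPy/SciPy, my own function names, a generic LP solver rather than the authors' algorithm) illustrates that reduction under the assumption that means and covariances are passed as NumPy arrays.

import numpy as np
from scipy.linalg import sqrtm
from scipy.optimize import linprog

def w2_gaussians_sq(m1, C1, m2, C2):
    # Squared 2-Wasserstein (Bures) distance between Gaussians N(m1, C1), N(m2, C2).
    m1, C1, m2, C2 = map(np.asarray, (m1, C1, m2, C2))
    s = sqrtm(C2)
    cross = sqrtm(s @ C1 @ s)
    return float(np.sum((m1 - m2) ** 2) + np.trace(C1 + C2 - 2.0 * np.real(cross)))

def discrete_ot(a, b, M):
    # Small transport LP: minimize <T, M> subject to T 1 = a, T^T 1 = b, T >= 0.
    n, m = M.shape
    A_eq = np.zeros((n + m, n * m))
    for i in range(n):
        A_eq[i, i * m:(i + 1) * m] = 1.0   # i-th row sum equals a[i]
    for j in range(m):
        A_eq[n + j, j::m] = 1.0            # j-th column sum equals b[j]
    res = linprog(M.ravel(), A_eq=A_eq, b_eq=np.concatenate([a, b]),
                  bounds=(0, None), method="highs")
    return res.fun

def mixture_w2_sq(weights1, means1, covs1, weights2, means2, covs2):
    # Mixture-level cost: discrete OT between components with Gaussian W2^2 costs.
    M = np.array([[w2_gaussians_sq(mu1, C1, mu2, C2)
                   for mu2, C2 in zip(means2, covs2)]
                  for mu1, C1 in zip(means1, covs1)])
    return discrete_ot(np.asarray(weights1, float), np.asarray(weights2, float), M)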



[PDF] arxiv.org

Estimating processes in adapted Wasserstein distance

J Backhoff, D Bartl, M Beiglböck, J Wiesel - arXiv preprint arXiv …, 2020 - arxiv.org

A number of researchers have independently introduced topologies on the set of laws of

stochastic processes that extend the usual weak topology. Depending on the respective

scientific background this was motivated by applications and connections to various areas …

  Cited by 3 Related articles All 4 versions 

[CITATION] Estimating processes in adapted Wasserstein distance

J Backhoff-Veraguas, D Bartl, M Beiglböck, J Wiesel - Preprint, 2020

  Cited by 2 Related articles


2020


[PDF] aaai.org

Importance-aware semantic segmentation in self-driving with discrete Wasserstein training

X Liu, Y Han, S Bai, Y Ge, T Wang, X Han, S Li… - Proceedings of the …, 2020 - ojs.aaai.org

Semantic segmentation (SS) is an important perception manner for self-driving cars and

robotics, which classifies each pixel into a pre-determined class. The widely-used cross

entropy (CE) loss-based deep networks have achieved significant progress wrt the mean …

  Cited by 9 Related articles All 6 versions 


[PDF] arxiv.org

Some Theoretical Insights into Wasserstein GANs

G Biau, M Sangnier, U Tanielian - arXiv preprint arXiv:2006.02682, 2020 - arxiv.org

Generative Adversarial Networks (GANs) have been successful in producing outstanding

results in areas as diverse as image, video, and text generation. Building on these

successes, a large number of empirical studies have validated the benefits of the cousin …

  Cited by 5 Related articles All 5 versions 


[PDF] academia.edu

Data augmentation in fault diagnosis based on the Wasserstein generative adversarial network with gradient penalty

X Gao, F Deng, X Yue - Neurocomputing, 2020 - Elsevier

Fault detection and diagnosis in industrial process is an extremely essential part to keep

away from undesired events and ensure the safety of operators and facilities. In the last few

decades various data based machine learning algorithms have been widely studied to  …

  Cited by 31 Related articles All 3 versions
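
For context, the gradient penalty that gives WGAN-GP its name is a one-function addition to the critic loss. Below is a hedged PyTorch-style sketch (critic, real, fake and lambda_gp are placeholder names; this is a generic illustration of the technique, not the authors' code).

import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    # WGAN-GP penalty: push the critic's gradient norm towards 1 on random
    # interpolates between real and generated samples.
    batch = real.size(0)
    eps = torch.rand(batch, *([1] * (real.dim() - 1)), device=real.device)
    mixed = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(mixed)
    grads = torch.autograd.grad(outputs=scores.sum(), inputs=mixed,
                                create_graph=True)[0]
    grad_norm = grads.flatten(1).norm(2, dim=1)
    return lambda_gp * ((grad_norm - 1.0) ** 2).mean()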


Learning to Align via Wasserstein for Person Re-Identification

Z Zhang, Y Xie, D Li, W Zhang… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

Existing successful person re-identification (Re-ID) models often employ the part-level

representation to extract the fine-grained information, but commonly use the loss that is

particularly designed for global features, ignoring the relationship between semantic parts …

  Cited by 1 Related articles All 2 versions


Optimal control of multiagent systems in the Wasserstein space

C Jimenez, A Marigonda, M Quincampoix - Calculus of Variations and …, 2020 - Springer

This paper concerns a class of optimal control problems, where a central planner aims to

control a multi-agent system in $\mathbb{R}^d$ in order to minimize a certain cost of Bolza type. At

every time and for each agent, the set of admissible velocities, describing his/her underlying …

  Cited by 8 Related articles All 3 versions

<——2020——2020———2710—



2020

[PDF] arxiv.org

Sampling of probability measures in the convex order by Wasserstein projection

A Alfonsi, J Corbetta, B Jourdain - Annales de l'Institut Henri …, 2020 - projecteuclid.org

In this paper, for $\mu $ and $\nu $ two probability measures on $\mathbb {R}^{d} $ with

finite moments of order $\varrho\ge 1$, we define the respective projections for the $ W_

{\varrho} $-Wasserstein distance of $\mu $ and $\nu $ on the sets of probability measures …

  Cited by 19 Related articles All 9 versions


[PDF] arxiv.org

Distributional sliced-Wasserstein and applications to generative modeling

K Nguyen, N Ho, T Pham, H Bui - arXiv preprint arXiv:2002.07367, 2020 - arxiv.org

Sliced-Wasserstein distance (SWD) and its variation, Max Sliced-Wasserstein distance (Max-

SWD), have been widely used in the recent years due to their fast computation and

scalability when the probability measures lie in very high dimension. However, these …

  Cited by 7 Related articles All 4 versions 


[PDF] thecvf.com

Gromov-Wasserstein averaging in a Riemannian framework

S Chowdhury, T Needham - Proceedings of the IEEE/CVF …, 2020 - openaccess.thecvf.com

We introduce a theoretical framework for performing statistical tasks-including, but not

limited to, averaging and principal component analysis-on the space of (possibly

asymmetric) matrices with arbitrary entries and sizes. This is carried out under the lens of the …

  Cited by 8 Related articles All 6 versions 


Conditional Wasserstein generative adversarial network-gradient penalty-based approach to alleviating imbalanced data classification

M Zheng, T Li, R Zhu, Y Tang, M Tang, L Lin, Z Ma - Information Sciences, 2020 - Elsevier

In data mining, common classification algorithms cannot effectively learn from imbalanced

data. Oversampling addresses this problem by creating data for the minority class in order to

balance the class distribution before the model is trained. The traditional oversampling …

Cited by 44 Related articles All 2 versions

[PDF] ieee.org

Robust multivehicle tracking with Wasserstein association metric in surveillance videos

Y Zeng, X Fu, L Gao, J Zhu, H Li, Y Li - IEEE Access, 2020 - ieeexplore.ieee.org

Vehicle tracking based on surveillance videos is of great significance in the highway traffic

monitoring field. In real-world vehicle-tracking applications, partial occlusion and objects

with similarly appearing distractors pose significant challenges. For addressing the above …

  Cited by 3 Related articles


2020


[PDF] arxiv.org

A new approach to posterior contraction rates via Wasserstein dynamics

E Dolera, S Favaro, E Mainini - arXiv preprint arXiv:2011.14425, 2020 - arxiv.org

This paper presents a new approach to the classical problem of quantifying posterior

contraction rates (PCRs) in Bayesian statistics. Our approach relies on Wasserstein

distance, and it leads to two main contributions which improve on the existing literature of …

  Cited by 1 Related articles All 2 versions 


2020  [PDF] esaim-proc.org

Statistical data analysis in the Wasserstein space

J Bigot - ESAIM: Proceedings and Surveys, 2020 - esaim-proc.org

This paper is concerned by statistical inference problems from a data set whose elements

may be modeled as random probability measures such as multiple histograms or point

clouds. We propose to review recent contributions in statistics on the use of Wasserstein  …

  Cited by 2 Related articles


[HTML] nih.gov

Wasserstein GANs for MR imaging: from paired to unpaired training

K Lei, M Mardani, JM Pauly… - IEEE transactions on …, 2020 - ieeexplore.ieee.org

Lack of ground-truth MR images impedes the common supervised training of neural

networks for image reconstruction. To cope with this challenge, this paper leverages

unpaired adversarial training for reconstruction networks, where the inputs are …

  Cited by 5 Related articles All 7 versions


[PDF] arxiv.org

Stein factors for variance-gamma approximation in the Wasserstein and Kolmogorov distances

RE Gaunt - arXiv preprint arXiv:2008.06088, 2020 - arxiv.org

We obtain new bounds for the solution of the variance-gamma (VG) Stein equation that are

of the correct form for approximations in terms of the Wasserstein and Kolmogorov metrics.

These bounds hold for all parameter values of the four-parameter VG class. As an …

  Cited by 4 Related articles All 3 versions 


Obtaining PET/CT images from non-attenuation corrected PET images in a single PET system using Wasserstein generative adversarial networks

Z Hu, Y Li, S Zou, H Xue, Z Sang, X Liu… - Physics in Medicine …, 2020 - iopscience.iop.org

Positron emission tomography (PET) imaging plays an indispensable role in early disease

detection and postoperative patient staging diagnosis. However, PET imaging requires not

only additional computed tomography (CT) imaging to provide detailed anatomical …

  Cited by 6 Related articles All 5 versions

<——2020——2020———2720—

[PDF] arxiv.org

Stochastic equation and exponential ergodicity in Wasserstein distances for affine processes

M Friesen, P Jin, B Rüdiger - Annals of Applied Probability, 2020 - projecteuclid.org

This work is devoted to the study of conservative affine processes on the canonical state

space $ D=\mathbb {R} _ {+}^{m}\times\mathbb {R}^{n} $, where $ m+ n> 0$. We show that

each affine process can be obtained as the pathwise unique strong solution to a stochastic …

  Cited by 8 Related articles All 5 versions


[PDF] arxiv.org

Irregularity of distribution in Wasserstein distance

C Graham - Journal of Fourier Analysis and Applications, 2020 - Springer

We study the non-uniformity of probability measures on the interval and circle. On the

interval, we identify the Wasserstein-p distance with the classical \(L^p\)-discrepancy. We

thereby derive sharp estimates in Wasserstein distances for the irregularity of distribution of …

  Cited by 3 Related articles All 3 versions

 


 


[PDF] arxiv.org

Regularizing activations in neural networks via distribution matching with the Wasserstein metric

T Joo, D Kang, B Kim - arXiv preprint arXiv:2002.05366, 2020 - arxiv.org

Regularization and normalization have become indispensable components in training deep

neural networks, resulting in faster training and improved generalization performance. We

propose the projected error function regularization loss (PER) that encourages activations to  …

  Cited by 3 Related articles All 5 versions 


Existence of probability measure valued jump-diffusions in generalized Wasserstein spaces

M Larsson, S Svaluto-Ferro - Electronic Journal of Probability, 2020 - projecteuclid.org

We study existence of probability measure valued jump-diffusions described by martingale

problems. We develop a simple device that allows us to embed Wasserstein spaces and

other similar spaces of probability measures into locally compact spaces where classical …

  Cited by 2 Related articles All 3 versions


2020

[PDF] arxiv.org

Wasserstein statistics in 1D location-scale model

S Amari - arXiv preprint arXiv:2003.05479, 2020 - arxiv.org

Wasserstein geometry and information geometry are two important structures introduced in a

manifold of probability distributions. The former is defined by using the transportation cost

between two distributions, so it reflects the metric structure of the base manifold on which …

  Cited by 1 Related articles All 2 versions 

[PDF] arxiv.org

Online Stochastic Optimization with Wasserstein Based Non-stationarity

J Jiang, X Li, J Zhang - arXiv preprint arXiv:2012.06961, 2020 - arxiv.org

We consider a general online stochastic optimization problem with multiple budget

constraints over a horizon of finite time periods. At each time period, a reward function and

multiple cost functions, where each cost function is involved in the consumption of one …

  Related articles All 2 versions 


[PDF] Quantifying the Empirical Wasserstein Distance to a Set of Measures: Beating the Curse of Dimensionality

N Si, J Blanchet, S Ghosh, M Squillante - Advances in Neural …, 2020 - stanford.edu

… Duality results: connections with the Integral Probability Metric (IPM),

$\mathrm{IPM}_{\mathcal{F}}(P, P_n) = \sup_{f \in \mathcal{F}} \int f \, dP - \int f \, dP_n$.

$R_n$ is not a metric in general. We add a new modeling feature, which is the …

  Related articles 
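
For context, and independent of these slides: by Kantorovich–Rubinstein duality the Wasserstein-1 distance is itself an integral probability metric, obtained by taking $\mathcal{F}$ to be the class of 1-Lipschitz functions:

$$W_1(P, Q) \;=\; \sup_{\|f\|_{\mathrm{Lip}} \le 1} \Big( \int f \, dP - \int f \, dQ \Big).$$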


[PDF] arxiv.org

Cutoff thermalization for Ornstein-Uhlenbeck systems with small Lévy noise in the Wasserstein distance

G BarreraMA HögeleJC Pardo - arXiv preprint arXiv:2009.10590, 2020 - arxiv.org

This article establishes cutoff thermalization (also known as the cutoff phenomenon) for a

general class of general Ornstein-Uhlenbeck systems $(X^\epsilon_t (x)) _ {t\geq 0} $ under

$\epsilon $-small additive Lévy noise with initial value $ x $. The driving noise processes …

  Cited by 1 Related articles All 3 versions 


[PDF] ieee.org

Distributionally Robust Optimal Reactive Power Dispatch with Wasserstein Distance in Active Distribution Network

J Liu, Y Chen, C Duan, J Lin… - Journal of Modern Power …, 2020 - ieeexplore.ieee.org

The uncertainties from renewable energy sources (RESs) will not only introduce significant

influences to active power dispatch, but also bring great challenges to the analysis of

optimal reactive power dispatch (ORPD). To address the influence of high penetration of …

  Cited by 5 Related articles All 3 versions

<——2020——2020———2730—


[PDF] arxiv.org

Exponential Convergence in Entropy and Wasserstein Distance for McKean-Vlasov SDEs

P Ren, FY Wang - arXiv preprint arXiv:2010.08950, 2020 - arxiv.org

The following type of exponential convergence is proved for (non-degenerate or degenerate)

McKean-Vlasov SDEs: $$W_2(\mu_t,\mu_\infty)^2+{\rm Ent}(\mu_t|\mu_\infty)\le c\,{\rm e}^{-\lambda t}\min\big\{W_2(\mu_0,\mu_\infty)^2,{\rm Ent}(\mu_0|\mu_\infty)\big\},\quad t\ge 1 …$$

  Cited by 1 Related articles All 2 versions 


[PDF] arxiv.org

Modeling EEG data distribution with a Wasserstein Generative Adversarial Network to predict RSVP Events

S Panwar, P Rad, TP Jung… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

Electroencephalography (EEG) data are difficult to obtain due to complex experimental

setups and reduced comfort with prolonged wearing. This poses challenges to train a powerful

deep learning model with the limited EEG data. Being able to generate EEG data …

  Cited by 4 Related articles All 5 versions

 

[PDF] biorxiv.org

Gromov-Wasserstein optimal transport to align single-cell multi-omics data

P Demetci, R Santorella, B Sandstede, WS Noble… - BioRxiv, 2020 - biorxiv.org

Data integration of single-cell measurements is critical for understanding cell development

and disease, but the lack of correspondence between different types of measurements

makes such efforts challenging. Several unsupervised algorithms can align heterogeneous …

  Cited by 8 Related articles All 3 versions 

Exponential contraction in Wasserstein distance on static and evolving manifolds

LJ Cheng, A Thalmaier, SQ Zhang - arXiv preprint arXiv:2001.06187, 2020 - arxiv.org

In this article, exponential contraction in Wasserstein distance for heat semigroups of

diffusion processes on Riemannian manifolds is established under curvature conditions

where Ricci curvature is not necessarily required to be non-negative. Compared to the …

  Cited by 2 Related articles All 5 versions 


2020

[PDF] upenn.edu

Optimal Estimation of Wasserstein Distance on a Tree With an Application to Microbiome Studies

S Wang, TT Cai, H Li - Journal of the American Statistical …, 2020 - Taylor & Francis

The weighted UniFrac distance, a plug-in estimator of the Wasserstein distance of read

counts on a tree, has been widely used to measure the microbial community difference in

microbiome studies. Our investigation however shows that such a plug-in estimator …

  Related articles All 4 versions


 2020


[PDF] arxiv.org

Statistical learning in Wasserstein space

A Karimi, L Ripani, TT Georgiou - IEEE Control Systems Letters, 2020 - ieeexplore.ieee.org

We seek a generalization of regression and principal component analysis (PCA) in a metric

space where data points are distributions metrized by the Wasserstein metric. We recast

these analyses as multimarginal optimal transport problems. The particular formulation …

  Cited by 2 Related articles All 7 versions 


[PDF] arxiv.org

Convergence in Wasserstein Distance for Empirical Measures of Dirichlet Diffusion Processes on Manifolds

FY Wang - arXiv preprint arXiv:2005.09290, 2020 - arxiv.org

Let $ M $ be a $ d $-dimensional connected compact Riemannian manifold with boundary

$\partial M $, let $ V\in C^ 2 (M) $ such that $\mu ({\rm d} x):={\rm e}^{V (x)}{\rm d} x $ is a

probability measure, and let $ X_t $ be the diffusion process generated by …

  Cited by 3 Related articles All 3 versions 


 [PDF] aaai.org

Solving general elliptical mixture models through an approximate Wasserstein manifold

S Li, Z Yu, M Xiang, D Mandic - Proceedings of the AAAI Conference on …, 2020 - ojs.aaai.org

… , we show that the Wasserstein distance provides a more … a manifold of an approximate 

Wasserstein distance. To this end, we … , especially under the Wasserstein distance. To relieve this …

Cited by 6 Related articles All 5 versions

[PDF] arxiv.org

Wasserstein Distance Regularized Sequence Representation for Text Matching in Asymmetrical Domains

W Yu, C Xu, J Xu, L Pang, X Gao, X Wang… - arXiv preprint arXiv …, 2020 - arxiv.org

One approach to matching texts from asymmetrical domains is projecting the input

sequences into a common semantic space as feature vectors upon which the matching

function can be readily defined and learned. In real-world matching practices, it is often …

Cited by 3 Related articles All 7 versions 

[PDF] projecteuclid.org

Existence of probability measure valued jump-diffusions in generalized Wasserstein spaces

M Larsson, S Svaluto-Ferro - Electronic Journal of Probability, 2020 - projecteuclid.org

We study existence of probability measure valued jump-diffusions described by martingale

problems. We develop a simple device that allows us to embed Wasserstein spaces and

other similar spaces of probability measures into locally compact spaces where classical …

  Cited by 2 Related articles All 3 versions

<——2020——2020———2740— 



[PDF] arxiv.org

High-precision Wasserstein barycenters in polynomial time

JM Altschuler, E Boix-Adsera - arXiv preprint arXiv:2006.08012, 2020 - arxiv.org

Computing Wasserstein barycenters is a fundamental geometric problem with widespread

applications in machine learning, statistics, and computer graphics. However, it is unknown

whether Wasserstein barycenters can be computed in polynomial time, either exactly or to  …

  Related articles All 3 versions 


[PDF] arxiv.org

Posterior asymptotics in Wasserstein metrics on the real line

M Chae, P De Blasi, SG Walker - arXiv preprint arXiv:2003.05599, 2020 - arxiv.org

In this paper, we use the class of Wasserstein metrics to study asymptotic properties of

posterior distributions. Our first goal is to provide sufficient conditions for posterior

consistency. In addition to the well-known Schwartz's Kullback--Leibler condition on the …

  Related articles All 2 versions 


2020

Nonlinear model reduction on metric spaces. Application to one-dimensional conservative PDEs in Wasserstein spaces

V Ehrlacher, D Lombardi, O Mula… - … and Numerical Analysis, 2020 - search.proquest.com

We consider the problem of model reduction of parametrized PDEs where the goal is to

approximate any function belonging to the set of solutions at a reduced computational cost.

For this, the bottom line of most strategies has so far been based on the approximation of the …

  Cited by 1 Related articles All 2 versions


[PDF] arxiv.org

Stein's method for normal approximation in Wasserstein distances with application to the multivariate Central Limit Theorem

T Bonis - Probability Theory and Related Fields, 2020 - Springer

We use Stein's method to bound the Wasserstein distance of order 2 between a

measure \(\nu\) and the Gaussian measure using a stochastic process \((X_t)_{t\ge 0}\) such

that \(X_t\) is drawn from \(\nu\) for any \(t > 0\). If the stochastic process \((X_t)_{t\ge 0}\) …

  Cited by 7 Related articles All 3 versions


[PDF] arxiv.org

Strong Formulations for Distributionally Robust Chance-Constrained Programs with Left-Hand Side Uncertainty under Wasserstein Ambiguity

N Ho-Nguyen, F Kılınç-Karzan, S Küçükyavuz… - arXiv preprint arXiv …, 2020 - arxiv.org

Distributionally robust chance-constrained programs (DR-CCP) over Wasserstein ambiguity

sets exhibit attractive out-of-sample performance and admit big-$ M $-based mixed-integer

programming (MIP) reformulations with conic constraints. However, the resulting …

  Cited by 3 Related articles All 3 versions 


2020


[PDF] arxiv.org

Statistical analysis of Wasserstein GANs with applications to time series forecasting

M Haas, S Richter - arXiv preprint arXiv:2011.03074, 2020 - arxiv.org

We provide statistical theory for conditional and unconditional Wasserstein generative

adversarial networks (WGANs) in the framework of dependent observations. We prove

upper bounds for the excess Bayes risk of the WGAN estimators with respect to a modified …

  Cited by 2 Related articles All 3 versions 


[PDF] arxiv.org

Approximation rate in Wasserstein distance of probability measures on the real line by deterministic empirical measures

O Bencheikh, B Jourdain - arXiv preprint arXiv:2012.09729, 2020 - arxiv.org

We are interested in the approximation in Wasserstein distance with index $\rho\ge 1$ of a

probability measure $\mu $ on the real line with finite moment of order $\rho $ by the

empirical measure of $ N $ deterministic points. The minimal error converges to $0 $ as …

  Related articles All 3 versions 
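
A concrete instance of such a deterministic approximation, under the assumption of a distribution with an explicit quantile (ppf) function, is to place the i-th of N points at the quantile level (2i-1)/(2N). The NumPy/SciPy sketch below (my own, purely illustrative and not taken from the paper) estimates the resulting W1 error for the standard normal on a fine quantile grid.

import numpy as np
from scipy import stats

def deterministic_points(dist, N):
    # N deterministic support points at the mid-quantile levels (2i-1)/(2N).
    u = (2 * np.arange(1, N + 1) - 1) / (2.0 * N)
    return dist.ppf(u)

dist, N = stats.norm(), 50
pts = deterministic_points(dist, N)
# Estimate W1 = integral of |quantile difference| over a fine grid of levels.
grid = np.linspace(0.5 / 10_000, 1 - 0.5 / 10_000, 10_000)
emp_q = pts[np.minimum((grid * N).astype(int), N - 1)]   # quantile function of the N-point measure
w1_error = np.mean(np.abs(dist.ppf(grid) - emp_q))
print(w1_error)  # the error shrinks as N grows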


Exponential contraction in Wasserstein distances for diffusion semigroups with negative curvature

FY Wang - Potential Analysis, 2020 - Springer

Let $P_t$ be the (Neumann) diffusion semigroup generated by a weighted Laplacian on a

complete connected Riemannian manifold M without boundary or with a convex boundary. It

is well known that the Bakry-Emery curvature is bounded below by a positive constant $>0$ …

  Cited by 5 Related articles


[PDF] arxiv.org

Martingale Wasserstein inequality for probability measures in the convex order

B Jourdain, W Margheriti - arXiv preprint arXiv:2011.11599, 2020 - arxiv.org

It is known since [24] that two one-dimensional probability measures in the convex order

admit a martingale coupling with respect to which the integral of $\vert x-y\vert$ is smaller

than twice their $\mathcal{W}_1$-distance (Wasserstein distance with index $1$). We …

  Related articles All 7 versions 


[PDF] arxiv.org

Central limit theorems for Markov chains based on their convergence rates in Wasserstein distance

R Jin, A Tan - arXiv preprint arXiv:2002.09427, 2020 - arxiv.org

Many tools are available to bound the convergence rate of Markov chains in total variation

(TV) distance. Such results can be used to establish central limit theorems (CLT) that enable

error evaluations of Monte Carlo estimates in practice. However, convergence analysis …

  Related articles All 2 versions 

<——2020——2020———2750— 



[PDF] arxiv.org

Permutation invariant networks to learn Wasserstein metrics

A Sehanobish, N Ravindra, D van Dijk - arXiv preprint arXiv:2010.05820, 2020 - arxiv.org

Understanding the space of probability measures on a metric space equipped with a

Wasserstein distance is one of the fundamental questions in mathematical analysis. The

Wasserstein metric has received a lot of attention in the machine learning community …

  Related articles All 4 versions 


 

[PDF] arxiv.org

LCS Graph Kernel Based on Wasserstein Distance in Longest Common Subsequence Metric Space

J Huang, Z Fang, H Kasai - arXiv preprint arXiv:2012.03612, 2020 - arxiv.org

For graph classification tasks, many methods use a common strategy to aggregate

information of vertex neighbors. Although this strategy provides an efficient means of

extracting graph topological features, it brings excessive amounts of information that might …

  Cited by 1 Related articles All 2 versions 


[PDF] stanford.edu

A Class of Optimal Transport Regularized Formulations with Applications to Wasserstein GANs

S Mahdian, JH Blanchet… - 2020 Winter Simulation …, 2020 - ieeexplore.ieee.org

Optimal transport costs (eg Wasserstein distances) are used for fitting high-dimensional

distributions. For example, popular artificial intelligence algorithms such as Wasserstein

Generative Adversarial Networks (WGANs) can be interpreted as fitting a black-box …

  All 2 versions


[PDF] thecvf.com

Illegible Text to Readable Text: An Image-to-Image Transformation using Conditional Sliced Wasserstein Adversarial Networks

M Karimi, G Veni, YY Yu - … of the IEEE/CVF Conference on …, 2020 - openaccess.thecvf.com

Automatic text recognition from ancient handwritten record images is an important problem

in the genealogy domain. However, critical challenges such as varying noise conditions,

vanishing texts, and variations in handwriting makes the recognition task difficult. We tackle …


  Cited by 1 Related articles All 7 versions 

A Riemannian submersion‐based approach to the Wasserstein barycenter of positive definite matrices

M Li, H Sun, D Li - … Methods in the Applied Sciences, 2020 - Wiley Online Library

In this paper, we introduce a novel geometrization on the space of positive definite matrices,

derived from the Riemannian submersion from the general linear group to the space of

positive definite matrices, resulting in easier computation of its geometric structure. The …

  Related articles

 2020


[PDF] arxiv.org

The existence of minimizers for an isoperimetric problem with Wasserstein penalty term in unbounded domains

Q Xia, B Zhou - arXiv preprint arXiv:2002.07129, 2020 - arxiv.org

In this article, we consider the (double) minimization problem $$\min\left\{P(E;\Omega)+\lambda W_p(E,F):\ E\subseteq\Omega,\ F\subseteq\mathbb{R}^d,\ \lvert E\cap F\rvert=0,\ \lvert E\rvert=\lvert F\rvert=1\right\},$$ where $p\geqslant 1$, $\Omega$ …

  Related articles All 4 versions 


Horo-functions associated to atom sequences on the Wasserstein space

G Zhu, H Wu, X Cui - Archiv der Mathematik, 2020 - Springer

On the Wasserstein space over a complete, separable, non-compact, locally compact length

space, we consider the horo-functions associated to sequences of atomic measures. We

show the existence of co-rays for any prescribed initial probability measure with respect to a …

  Related articles


[PDF] arxiv.org

On Stein's factors for Poisson approximation in Wasserstein distance with non-linear transportation costs

ZW Liao, Y Ma, A Xia - arXiv preprint arXiv:2003.13976, 2020 - arxiv.org

We establish various bounds on the solutions to a Stein equation for Poisson approximation

in Wasserstein distance with non-linear transportation costs. The proofs are a refinement of

those in [Barbour and Xia (2006)] using the results in [Liu and Ma (2009)]. As a corollary, we …

  Related articles All 2 versions 


Convergence in Monge-Wasserstein Distance of Mean Field Systems with Locally Lipschitz Coefficients

DT Nguyen, SL Nguyen, NH Du - Acta Mathematica Vietnamica, 2020 - Springer

This paper focuses on stochastic systems of weakly interacting particles whose dynamics

depend on the empirical measures of the whole populations. The drift and diffusion

coefficients of the dynamical systems are assumed to be locally Lipschitz continuous and …

  Related articles


Wasserstein-Distance-Based Temporal Clustering for Capacity-Expansion Planning in Power Systems

L Condeixa, F Oliveira… - … Conference on Smart …, 2020 - ieeexplore.ieee.org

As variable renewable energy sources are steadily incorporated in European power

systems, the need for higher temporal resolution in capacity-expansion models also

increases. Naturally, there exists a trade-off between the amount of temporal data used to  …

  Cited by 1 Related articles

<——2020——2020———2760—


[PDF] arxiv.org

Characterization of probability distribution convergence in Wasserstein distance by $L^p$-quantization error function

Y Liu, G Pagès - Bernoulli, 2020 - projecteuclid.org

We establish conditions to characterize probability measures by their $ L^{p} $-quantization

error functions in both $\mathbb {R}^{d} $ and Hilbert settings. This characterization is two-

fold: static (identity of two distributions) and dynamic (convergence for the $ L^{p} …

  Cited by 1 Related articles All 5 versions


[HTML] springer.com

[HTML] Fréchet Means in the Wasserstein Space 

VM Panaretos, Y Zemel - International Workshop on Functional and …, 2020 - Springer

The concept of a Fréchet mean (Fréchet [55]) generalises the notion of mean to a more general

metric space by replacing the usual “sum of squares” with a “sum of squared distances”, giving

rise to the so-called Fréchet functional. A closely related notion is that of a Karcher mean (Karcher …

  Related articles


[HTML] hindawi.com

[HTML] Solutions of a Class of Degenerate Kinetic Equations Using Steepest Descent in Wasserstein Space

A Marcos, A Soglo - Journal of Mathematics, 2020 - hindawi.com

We use the steepest descent method in an Orlicz–Wasserstein space to study the existence

of solutions for a very broad class of kinetic equations, which include the Boltzmann

equation, the Vlasov–Poisson equation, the porous medium equation, and the parabolic p …

  Cited by 1 Related articles All 6 versions 


Convergence rates of the blocked Gibbs sampler with random scan in the Wasserstein metric

NY Wang, G Yin - Stochastics, 2020 - Taylor & Francis

This paper establishes …

  Related articles All 4 versions


[PDF] ibpsa.org

[PDF] Comparing Bottom-Up Energy Consumption Models Using The Wasserstein Distance Between Load Profile Histograms

E Sanderson, A Fragaki, J Simo… - BSO-V 2020: IBPSA …, 2020 - ibpsa.org

This paper presents a comparison of bottom up models that generate appliance load

profiles. The comparison is based on their ability to accurately distribute load over time-of-

day. This is a key feature of model performance if the model is used to assess the impact of …

  Related articles All 2 versions 

2020


[PDF] projecteuclid.org

Donsker's theorem in Wasserstein-1 distance

L Coutin, L Decreusefond - Electronic Communications in …, 2020 - projecteuclid.org

We compute the Wasserstein-1 (or Kantorovitch-Rubinstein) distance between a random

walk in $\mathbf{R}^{d}$ and the Brownian motion. The proof is based on a new estimate of

the modulus of continuity of the solution of Stein's equation. As an application, we can …

  Cited by 1 Related articles All 18 versions


[PDF] researchgate.net

[PDF] ADDENDUM TO "ISOMETRIC STUDY OF WASSERSTEIN SPACES – THE REAL LINE"

G PÁL GEHÉR, T TITKOS, D VIROSZTEK - researchgate.net

We show an example of a Polish metric space X whose quadratic Wasserstein space W2(X)

possesses an isometry that splits mass. This gives an affirmative answer to Kloeckner's

question [2, Question 2]. Let us denote the metric space ([0, 1], |·|), equipped with the usual …

Cited by 6 Related articles All 8 versions

Optimality in weighted L2-Wasserstein goodness-of-fit statistics

T De Wet, V Humble - South African Statistical Journal, 2020 - journals.co.za

In Del Barrio, Cuesta-Albertos, Matran and Rodriguez-Rodriguez (1999) and Del Barrio,

Cuesta-Albertos and Matran (2000), the authors introduced a new class of goodness-of-fit

statistics based on the L2-Wasserstein distance. It was shown that the desirable property of …

  Related articles All 2 versions


[PDF] researchgate.net

Typical wind power scenario generation for multiple wind farms using conditional improved Wasserstein generative adversarial network

Y Zhang, Q Ai, F Xiao, R Hao, T Lu - … Journal of Electrical Power & Energy …, 2020 - Elsevier

Because of environmental benefits, wind power is taking an increasing role in meeting

electricity demand. However, wind power tends to exhibit large uncertainty and is largely

influenced by meteorological conditions. Apart from the variability, when multiple wind farms …

  Cited by 15 Related articles


Density estimation of multivariate samples using Wasserstein distance

E Luini, P Arbenz - Journal of Statistical Computation and …, 2020 - Taylor & Francis

Density estimation is a central topic in statistics and a fundamental task of machine learning.

In this paper, we present an algorithm for approximating multivariate empirical densities with

a piecewise constant distribution defined on a hyperrectangular-shaped partition of the …

  Cited by 2 Related articles All 3 versions

<——2020——2020———2770— 


State Intellectual Property Office of China Releases Univ Nanjing Tech's Patent Application for a Blind Detection Method of an Image Repetition Region
Based on Euclidean Metric of Wasserstein...

Global IP News: Information Technology Patent News, Aug 31, 2020

Newspaper Article. Citation Online

(08/31/2020). "State Intellectual Property Office of China Releases Univ Nanjing Tech's Patent Application for a Blind Detection Method of an Image Repetition Region Based on Euclidean Metric of Wasserstein Histogram". Global IP News: Information Technology Patent News


Robustified Multivariate Regression and Classification Using Distributionally Robust Optimization under the Wasserstein Metric

R Chen, IC Paschalidis - arXiv preprint arXiv:2006.06090, 2020 - arxiv.org

We develop Distributionally Robust Optimization (DRO) formulations for Multivariate Linear

Regression (MLR) and Multiclass Logistic Regression (MLG) when both the covariates and

responses/labels may be contaminated by outliers. The DRO framework uses a probabilistic …

  Related articles All 3 versions 


[PDF] arxiv.org

Unadjusted Langevin algorithm with multiplicative noise: Total variation and Wasserstein bounds

F Panloup - arXiv preprint arXiv:2012.14310, 2020 - arxiv.org

In this paper, we focus on non-asymptotic bounds related to the Euler scheme of an ergodic

diffusion with a possibly multiplicative diffusion term (non-constant diffusion coefficient).

More precisely, the objective of this paper is to control the distance of the standard Euler …

  Related articles All 2 versions 

Unadjusted Langevin algorithm with multiplicative noise: Total variation and Wasserstein bounds

G Pages, F Panloup - 2020 - hal.archives-ouvertes.fr

In this paper, we focus on non-asymptotic bounds related to the Euler scheme of an ergodic

diffusion with a possibly multiplicative diffusion term (non-constant diffusion coefficient).

More precisely, the objective of this paper is to control the distance of the standard Euler …

  Related articles All 5 versions 


Image Hashing by Minimizing Discrete Component-wise Wasserstein Distance

KD Doan, S Manchanda, S Badirli… - arXiv e-prints, 2020 - ui.adsabs.harvard.edu

Image hashing is one of the fundamental problems that demand both efficient and effective

solutions for various practical scenarios. Adversarial autoencoders are shown to be able to

implicitly learn a robust, locality-preserving hash function that generates balanced and high …

 

[PDF] researchgate.net

[PDF] Potential Analysis of Wasserstein GAN as an Anomaly Detection Method for Industrial Images

A Misik - researchgate.net

The task of detecting anomalies in images is a crucial part of current industrial optical

monitoring systems. In recent years, neural networks have proven to be an efficient method

for this problem, especially autoencoders and generative adversarial networks (GAN). A …

  

2020


Cross-domain Attention Network with Wasserstein Regularizers for E-commerce Search

M Qiu, B Wang, C Chen, X Zeng, J Huang… - Proceedings of the 28th …, 2019 - dl.acm.org

Product search and recommendation is a task that every e-commerce platform wants to

outperform their peers on. However, training a good search or recommendation model often

requires more data than what many platforms have. Fortunately, the search tasks on different …

  Related articles


2020 [PDF] arxiv.org

Distributionally Robust XVA via Wasserstein Distance Part 2: Wrong Way Funding Risk

D Singh, S Zhang - arXiv preprint arXiv:1910.03993, 2019 - arxiv.org

This paper investigates calculations of robust funding valuation adjustment (FVA) for over

the counter (OTC) derivatives under distributional uncertainty using Wasserstein distance as

the ambiguity measure. Wrong way funding risk can be characterized via the robust FVA …

  Related articles All 5 versions 


An Integrated Processing Method Based on Wasserstein Barycenter Algorithm for Automatic Music Transcription

C Jin, Z Li, Y Sun, H Zhang, X Lv, J Li, S Liu - International Conference on …, 2019 - Springer

Given a piece of acoustic musical signal, various automatic music transcription (AMT)

processing methods have been proposed to generate the corresponding music notations

without human intervention. However, the existing AMT methods based on signal …

  Related articles


2020

Distributionally robust XVA via Wasserstein distance part 1: Wrong way counterparty credit risk

D Singh, S Zhang - Unknown Journal, 2019 - experts.umn.edu

This paper investigates calculations of robust CVA for OTC derivatives under distributional

uncertainty using Wasserstein distance as the ambiguity measure. Wrong way counterparty

credit risk can be characterized (and indeed quantified) via the robust CVA formulation. The …

  

2020

[PDF] amazonaws.com

[PDF] Supplement to “Optimal Estimation of Wasserstein Distance on A Tree with An Application to Microbiome Studies”

S Wang, TT Cai, H Li - pstorage-tf-iopjsd8797887.s3 …

Page 1. Supplement to “Optimal Estimation of Wasserstein Distance on A Tree with An Application

to Microbiome Studies” Shulei Wang, T. Tony Cai and Hongzhe Li University of Pennsylvania In

this supplementary material, we provide the proof for the main results (Section S1) and all the …

  Related articles All 3 versions 
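
Note (illustrative, not the paper's own code): on a tree metric the 1-Wasserstein distance has the closed form W_1(mu, nu) = sum over edges e of w_e * |mu(T_e) - nu(T_e)|, where T_e is the subtree below edge e. A minimal Python sketch, with the made-up helper name tree_wasserstein and the tree encoded by parent pointers and edge lengths:

import numpy as np

def tree_wasserstein(parent, edge_len, mu, nu):
    # parent[i]  : parent of node i (the root has parent -1)
    # edge_len[i]: length of the edge from node i to its parent (ignored at the root)
    # mu, nu     : nonnegative node weights, each summing to 1
    n = len(parent)
    diff = np.asarray(mu, dtype=float) - np.asarray(nu, dtype=float)
    depth = []
    for i in range(n):
        d, j = 0, i
        while parent[j] != -1:
            j, d = parent[j], d + 1
        depth.append(d)
    cost = 0.0
    for i in sorted(range(n), key=lambda k: -depth[k]):  # children before parents
        if parent[i] != -1:
            cost += edge_len[i] * abs(diff[i])  # mass that must cross edge (i, parent[i])
            diff[parent[i]] += diff[i]
    return cost

# e.g. on the path 0-1-2 with unit edge lengths:
# tree_wasserstein([-1, 0, 1], [0.0, 1.0, 1.0], [1, 0, 0], [0, 0, 1])  ->  2.0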

<——2020——2020———2780— 


[HTML] An Integrated Consensus Improving Strategy Based on PL-Wasserstein Distance and Its Application in the Evaluation of Network Public Opinion Emergencies

S Zhang, Z Ma, X Liu, Z Wang, L Jiang - Complexity, 2020 - hindawi.com

In real life, multiple network public opinion emergencies may break out in a certain place at

the same time. So, it is necessary to invite emergency decision experts in multiple fields for

timely evaluating the comprehensive crisis of the online public opinion, and then limited …

  Related articles All 7 versions 


[PDF] researchgate.net

[PDF] Wasserstein distributionally robust chance-constrained optimization for energy and reserve dispatch: An exact and physically-bounded formulation

A Arrigo, C Ordoudis, J Kazempour… - Eur. J. Oper. Res …, 2020 - researchgate.net

In the context of transition towards sustainable, cost-efficient and reliable energy systems,

the improvement of current energy and reserve dispatch models is crucial to properly cope

with the uncertainty of weather-dependent renewable power generation. In contrast to …

  Cited by 2 All 3 versions 


 

An Improvement based on Wasserstein GAN for Alleviating Mode Collapsing

Y Chen, X Hou - 2020 International Joint Conference on Neural …, 2020 - ieeexplore.ieee.org

In the past few years, Generative Adversarial Networks as a deep generative model has

received more and more attention. Mode collapsing is one of the challenges in the study of

Generative Adversarial Networks. In order to solve this problem, we deduce a new algorithm …

  Related articles


[PDF] arxiv.org

An LP-based, strongly-polynomial 2-approximation algorithm for sparse Wasserstein barycenters

S Borgwardt - Operational Research, 2020 - Springer

Discrete Wasserstein barycenters correspond to optimal solutions of transportation problems

for a set of probability measures with finite support. Discrete barycenters are measures with

finite support themselves and exhibit two favorable properties: there always exists one with a …

  Cited by 4 Related articles All 3 versions
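
Note (a generic sketch rather than the strongly-polynomial algorithm of the entry above): the underlying discrete optimal-transport problem is a linear program and can be solved directly, here with scipy.optimize.linprog; the function name discrete_wasserstein_lp is illustrative:

import numpy as np
from scipy.optimize import linprog

def discrete_wasserstein_lp(x, y, a, b, p=2):
    # x: (n, d) support points with weights a; y: (m, d) support points with weights b
    n, m = len(a), len(b)
    C = np.linalg.norm(x[:, None, :] - y[None, :, :], axis=2) ** p  # cost matrix
    A_eq = np.zeros((n + m, n * m))
    for i in range(n):
        A_eq[i, i * m:(i + 1) * m] = 1.0      # row sums of the plan equal a
    for j in range(m):
        A_eq[n + j, j::m] = 1.0               # column sums of the plan equal b
    res = linprog(C.ravel(), A_eq=A_eq, b_eq=np.concatenate([a, b]),
                  bounds=(0, None), method="highs")
    return res.fun ** (1.0 / p)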


[PDF] thecvf.com

Illegible Text to Readable Text: An Image-to-Image Transformation using Conditional Sliced Wasserstein Adversarial Networks

M Karimi, G Veni, YY Yu - … of the IEEE/CVF Conference on …, 2020 - openaccess.thecvf.com

Automatic text recognition from ancient handwritten record images is an important problem

in the genealogy domain. However, critical challenges such as varying noise conditions,

vanishing texts, and variations in handwriting makes the recognition task difficult. We tackle …

  Cited by 1 Related articles All 7 versions 
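
Note (illustrative only, not the paper's network): the sliced Wasserstein distance used in such models averages one-dimensional Wasserstein distances over random projection directions; a minimal sketch assuming two equally sized, uniformly weighted point clouds:

import numpy as np

def sliced_wasserstein(x, y, n_projections=100, seed=0):
    # x, y: (n, d) point clouds of equal size with uniform weights
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_projections):
        theta = rng.normal(size=x.shape[1])
        theta /= np.linalg.norm(theta)                    # random direction on the sphere
        px, py = np.sort(x @ theta), np.sort(y @ theta)   # 1D projections
        total += np.mean((px - py) ** 2)                  # 1D squared W2 via sorted samples
    return np.sqrt(total / n_projections)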


2020


[PDF] wiley.com

Evaluating the performance of climate models based on Wasserstein distance

G Vissio, V Lembo, V Lucarini… - Geophysical Research …, 2020 - Wiley Online Library

We propose a methodology for intercomparing climate models and evaluating their

performance against benchmarks based on the use of the Wasserstein distance (WD). This

distance provides a rigorous way to measure quantitatively the difference between two …

  Cited by 2 Related articles All 13 versions


Learning to Align via Wasserstein for Person Re-Identification

Z Zhang, Y Xie, D Li, W Zhang… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

Existing successful person re-identification (Re-ID) models often employ the part-level

representation to extract the fine-grained information, but commonly use the loss that is

particularly designed for global features, ignoring the relationship between semantic parts …

  Cited by 1 Related articles All 2 versions


[PDF] arxiv.org

Visual transfer for reinforcement learning via wasserstein domain confusion

J Roy, G Konidaris - arXiv preprint arXiv:2006.03465, 2020 - arxiv.org

We introduce Wasserstein Adversarial Proximal Policy Optimization (WAPPO), a novel

algorithm for visual transfer in Reinforcement Learning that explicitly learns to align the

distributions of extracted features between a source and target task. WAPPO approximates …

  Cited by 3 Related articles All 6 versions 


[PDF] arxiv.org

Social-wagdat: Interaction-aware trajectory prediction via wasserstein graph double-attention network

J Li, H Ma, Z Zhang, M Tomizuka - arXiv preprint arXiv:2002.06241, 2020 - arxiv.org

Effective understanding of the environment and accurate trajectory prediction of surrounding

dynamic obstacles are indispensable for intelligent mobile systems (like autonomous

vehicles and social robots) to achieve safe and high-quality planning when they navigate in …

  Cited by 19 Related articles All 3 versions 

<——2020——2020———2790—


[PDF] arxiv.org

A new approach to posterior contraction rates via Wasserstein dynamics

E Dolera, S Favaro, E Mainini - arXiv preprint arXiv:2011.14425, 2020 - arxiv.org

This paper presents a new approach to the classical problem of quantifying posterior

contraction rates (PCRs) in Bayesian statistics. Our approach relies on Wasserstein

distance, and it leads to two main contributions which improve on the existing literature of …

  Cited by 1 Related articles All 2 versions 


Illumination-invariant flotation froth color measuring via Wasserstein distance-based CycleGAN with structure-preserving constraint

J Liu, J He, Y Xie, W Gui, Z Tang, T Ma… - IEEE transactions on …, 2020 - ieeexplore.ieee.org

Froth color can be referred to as a direct and instant indicator to the key flotation production

index, for example, concentrate grade. However, it is intractable to measure the froth color

robustly due to the adverse interference of time-varying and uncontrollable multisource …

  Cited by 7 Related articles All 3 versions


[PDF] arxiv.org

Distributed optimization with quantization for computing Wasserstein barycenters

R Krawtschenko, CA Uribe, A Gasnikov… - arXiv preprint arXiv …, 2020 - arxiv.org

We study the problem of the decentralized computation of entropy-regularized semi-discrete

Wasserstein barycenters over a network. Building upon recent primal-dual approaches, we

propose a sampling gradient quantization scheme that allows efficient communication and …

  Cited by 2 Related articles All 3 versions 


[PDF] arxiv.org

Scalable computations of wasserstein barycenter via input convex neural networks

J Fan, A Taghvaei, Y Chen - arXiv preprint arXiv:2007.04462, 2020 - arxiv.org

Wasserstein Barycenter is a principled approach to represent the weighted mean of a given

set of probability distributions, utilizing the geometry induced by optimal transport. In this

work, we present a novel scalable algorithm to approximate the Wasserstein Barycenters …

  Cited by 3 Related articles All 3 versions 


[HTML] atlantis-press.com

[HTML] Multimedia analysis and fusion via Wasserstein Barycenter

C Jin, J Wang, J Wei, L Tan, S Liu… - … Computing, 2020 - atlantis-press.com

Optimal transport distance, otherwise known as Wasserstein distance, recently has attracted

attention in music signal processing and machine learning as powerful discrepancy

measures for probability distributions. In this paper, we propose an ensemble approach with …

  Cited by 2 Related articles All 2 versions 


 2020


Regularizing activations in neural networks via distribution matching with the Wasserstein metric

T Joo, D Kang, B Kim - arXiv preprint arXiv:2002.05366, 2020 - arxiv.org

Regularization and normalization have become indispensable components in training deep

neural networks, resulting in faster training and improved generalization performance. We

propose the projected error function regularization loss (PER) that encourages activations to …

  Cited by 3 Related articles All 5 versions 


[HTML] mdpi.com

Probability forecast combination via entropy regularized wasserstein distance

R Cumings-Menon, M Shin - Entropy, 2020 - mdpi.com

We propose probability and density forecast combination methods that are defined using the

entropy regularized Wasserstein distance. First, we provide a theoretical characterization of

the combined density forecast based on the regularized Wasserstein distance under the …

  Cited by 2 Related articles All 15 versions 
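
Note (illustrative): entropy-regularized Wasserstein distances of the kind used above are typically computed with Sinkhorn iterations; a minimal sketch for two histograms a, b and a cost matrix C (the function name sinkhorn_cost is made up):

import numpy as np

def sinkhorn_cost(a, b, C, reg=0.1, n_iter=200):
    # a, b: float histograms summing to 1; C: cost matrix; reg: entropic regularization strength
    K = np.exp(-C / reg)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    T = u[:, None] * K * v[None, :]   # regularized transport plan
    return float(np.sum(T * C))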


[PDF] arxiv.org

Refining Deep Generative Models via Wasserstein Gradient Flows

AF Ansari, ML Ang, H Soh - arXiv preprint arXiv:2012.00780, 2020 - arxiv.org

Deep generative modeling has seen impressive advances in recent years, to the point

where it is now commonplace to see simulated samples (eg, images) that closely resemble

real-world data. However, generation quality is generally inconsistent for any given model …

  Related articles 


[PDF] arxiv.org

A material decomposition method for dual-energy CT via dual interactive Wasserstein generative adversarial networks

Z Shi, H Li, Q Cao, Z Wang, M Cheng - arXiv preprint arXiv:2007.11247, 2020 - arxiv.org

Dual-energy computed tomography has great potential in material characterization and

identification, whereas the reconstructed material-specific images always suffer from

magnified noise and beam hardening artifacts. In this study, a data-driven approach using …

  Related articles All 3 versions 


[PDF] arxiv.org

Automatic coding of students' writing via Contrastive Representation Learning in the Wasserstein space

R Jiang, J Gouvea, D Hammer, S Aeron - arXiv preprint arXiv:2011.13384, 2020 - arxiv.org

Qualitative analysis of verbal data is of central importance in the learning sciences. It is labor-

intensive and time-consuming, however, which limits the amount of data researchers can

include in studies. This work is a step towards building a statistical machine learning (ML) …

  Related articles All 2 versions 

<——2020——2020———2800—


Learning Graphons via Structured Gromov-Wasserstein Barycenters

H Xu, D Luo, L Carin, H Zha - arXiv preprint arXiv:2012.05644, 2020 - arxiv.org

We propose a novel and principled method to learn a nonparametric graph model called

graphon, which is defined in an infinite-dimensional space and represents arbitrary-size

graphs. Based on the weak regularity lemma from the theory of graphons, we leverage a …

  Cited by 1 Related articles All 2 versions 

 

[PDF] arxiv.org

Adversarial Classification via Distributional Robustness with Wasserstein Ambiguity

N Ho-Nguyen, SJ Wright - arXiv preprint arXiv:2005.13815, 2020 - arxiv.org

We study a model for adversarial classification based on distributionally robust chance

constraints. We show that under Wasserstein ambiguity, the model aims to minimize the

conditional value-at-risk of the distance to misclassification, and we explore links to previous …

  Cited by 1 Related articles All 3 versions 
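
Note: papers in this strand generally study a min-max problem over a Wasserstein ambiguity ball around the empirical distribution; in generic notation (not this paper's exact model),

\min_{\theta}\; \sup_{Q:\, W_p(Q,\widehat{P}_n)\le \rho}\; \mathbb{E}_{\xi\sim Q}\big[\ell(\theta;\xi)\big],

where \widehat{P}_n is the empirical distribution of the data, \rho is the radius of the Wasserstein ambiguity set, and \ell is the loss.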


[HTML] hindawi.com

[HTML] Imbalanced Fault Classification of Bearing via Wasserstein Generative Adversarial Networks with Gradient Penalty

B Han, S Jia, G Liu, J Wang - Shock and Vibration, 2020 - hindawi.com

Recently, generative adversarial networks (GANs) are widely applied to increase the

amounts of imbalanced input samples in fault diagnosis. However, the existing GAN-based

methods have convergence difficulties and training instability, which affect the fault …

  Related articles All 4 versions 
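
Note (a standard sketch of the WGAN-GP penalty named in the title, following Gulrajani et al. 2017 rather than this paper's code): in PyTorch the gradient penalty term added to the Wasserstein critic loss can be written as

import torch

def gradient_penalty(critic, real, fake, lam=10.0):
    # interpolate between real and generated samples
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    x_hat = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    grads, = torch.autograd.grad(critic(x_hat).sum(), x_hat, create_graph=True)
    grad_norm = grads.reshape(grads.size(0), -1).norm(2, dim=1)
    return lam * ((grad_norm - 1.0) ** 2).mean()

# critic loss: critic(fake).mean() - critic(real).mean() + gradient_penalty(critic, real, fake)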


[PDF] researchgate.net

[PDF] Ranking IPCC Model Performance Using the Wasserstein Distance

G Vissio, V Lembo, V Lucarini… - arXiv preprint arXiv …, 2020 - researchgate.net

We propose a methodology for intercomparing climate models and evaluating their

performance against benchmarks based on the use of the Wasserstein distance (WD). This

distance provides a rigorous way to measure quantitatively the difference between two …

  Related articles 


[PDF] umn.edu

Data-driven Distributionally Robust Stochastic Optimization via Wasserstein Distance with Applications to Portfolio Risk Management and Inventory Control

D Singh - 2020 - conservancy.umn.edu

The central theme of this dissertation is stochastic optimization under distributional

ambiguity. One can think of this as a two-player game between a decision maker, who tries to

minimize some loss or maximize some reward, and an adversarial agent that chooses the …

  All 3 versions 

 

2020


[PDF] polimi.it

Wasserstein K-means per clustering di misure di probabilità e applicazioni

R TAFFONI - 2020 - politesi.polimi.it

Abstract (translated from Italian): The thesis studies the Wasserstein distance, in both its general and discrete cases, applied to the K-means algorithm, which is described step by step. Finally, this algorithm is applied to artificial data and to a dataset …

[Italian: Wasserstein K-means for clustering probability measures, and applications]

  Related articles 
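
Note (an illustrative sketch of the general idea, not the thesis code): in one dimension the 2-Wasserstein distance reduces to an L2 distance between quantile functions, so a Wasserstein K-means can run on quantile embeddings, with cluster barycenters obtained by averaging quantiles; all names below are made up for illustration:

import numpy as np

def wasserstein_kmeans_1d(samples, k, n_quantiles=50, n_iter=20, seed=0):
    # samples: list of 1D arrays, each an empirical distribution
    rng = np.random.default_rng(seed)
    q = np.linspace(0.01, 0.99, n_quantiles)
    Q = np.stack([np.quantile(s, q) for s in samples])               # quantile embedding
    centers = Q[rng.choice(len(Q), size=k, replace=False)]
    for _ in range(n_iter):
        d2 = ((Q[:, None, :] - centers[None, :, :]) ** 2).mean(-1)   # approximate squared W2
        labels = d2.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = Q[labels == j].mean(axis=0)             # 1D Wasserstein barycenter
    return labels, centers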


Regularized Wasserstein means for aligning distributional data

L Mi, W Zhang, Y Wang - Proceedings of the AAAI Conference on …, 2020 - ojs.aaai.org

We propose to align distributional data from the perspective of Wasserstein means. We raise

the problem of regularizing Wasserstein means and propose several terms tailored to tackle

different problems. Our formulation is based on the variational transportation to distribute a …

  Cited by 3 Related articles All 5 versions 


Density estimation of multivariate samples using Wasserstein distance

E Luini, P Arbenz - Journal of Statistical Computation and …, 2020 - Taylor & Francis

Density estimation is a central topic in statistics and a fundamental task of machine learning.

In this paper, we present an algorithm for approximating multivariate empirical densities with

a piecewise constant distribution defined on a hyperrectangular-shaped partition of the …

  Cited by 2 Related articles All 3 versions


[PDF] arxiv.org

Wasserstein distance error bounds for the multivariate normal approximation of the maximum likelihood estimator

A Anastasiou, RE Gaunt - arXiv preprint arXiv:2005.05208, 2020 - arxiv.org

We obtain explicit Wasserstein distance error bounds between the distribution of the multi-

parameter MLE and the multivariate normal distribution. Our general bounds are given for

possibly high-dimensional, independent and identically distributed random vectors. Our …

  Cited by 1 Related articles All 4 versions 


[PDF] researchgate.net

Inequalities of the Wasserstein mean with other matrix means

S Kim, H Lee - Annals of Functional Analysis, 2020 - Springer

Recently, a new Riemannian metric and a least squares mean of positive definite matrices

have been introduced. They are called the Bures–Wasserstein metric and Wasserstein

mean, which are different from the Riemannian trace metric and Karcher mean. In this paper …

  Cited by 2 Related articles All 2 versions
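
Note (generic formula, not taken from the paper): the Bures-Wasserstein distance between positive definite matrices discussed here is d(A, B)^2 = tr(A) + tr(B) - 2 tr((A^{1/2} B A^{1/2})^{1/2}); a small numpy/scipy sketch, with the function name bures_wasserstein chosen for illustration:

import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein(A, B):
    # d(A, B)^2 = tr(A) + tr(B) - 2 tr((A^{1/2} B A^{1/2})^{1/2}) for SPD matrices A, B
    rootA = sqrtm(A)
    cross = sqrtm(rootA @ B @ rootA)
    d2 = np.trace(A) + np.trace(B) - 2.0 * np.trace(cross)
    return float(np.sqrt(max(np.real(d2), 0.0)))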

<——2020——2020———2810— 


[PDF] aaai.org

Solving general elliptical mixture models through an approximate Wasserstein manifold

S Li, Z Yu, M Xiang, D Mandic - Proceedings of the AAAI Conference on …, 2020 - ojs.aaai.org

We address the estimation problem for general finite mixture models, with a particular focus

on the elliptical mixture models (EMMs). Compared to the widely adopted Kullback–Leibler

divergence, we show that the Wasserstein distance provides a more desirable optimisation …

  Cited by 2 Related articles All 4 versions 



[PDF] amazonaws.com

[PDF] Wasserstein 거리 척도 기반 SRGAN 이용한 위성 영상 해상도 향상

황지언, 유초시, 신요안 - 한국통신학회 …, 2020 - journal-home.s3.ap-northeast-2 …

Abstract (translated from Korean): In this paper, we propose a method to further improve the resolution of satellite images by applying the Wasserstein distance metric to the conventional SRGAN (Super Resolution Generative Adversarial Network). To alleviate mode collapse, a known drawback of GANs, the Wasserstein distance and a Gradient Penalty are …

  Related articles All 2 versions 

[Korean: Satellite image resolution enhancement using SRGAN based on the Wasserstein distance metric]


2020

[PDF] jst.go.jp

Wasserstein 距離を評価関数とする離散時間システムの最適制御問題について

星野健太 - 自動制御連合講演会講演論文集 63 回自動制御連合 …, 2020 - jstage.jst.go.jp

Abstract– This paper discusses an optimal control problem with the terminal cost given by the

Wasser- stein distance. The problem is formulated as the control problem regarding the probability

distributions of the state variables. This paper discusses a necessary condition of the optimality …

  Related articles

[Japanese: On an optimal control problem for discrete-time systems with the Wasserstein distance as the terminal cost]




[PDF] arxiv.org

A Riemannian Block Coordinate Descent Method for Computing the Projection Robust Wasserstein Distance

M Huang, S Ma, L Lai - arXiv preprint arXiv:2012.05199, 2020 - arxiv.org

The Wasserstein distance has become increasingly important in machine learning and deep

learning. Despite its popularity, the Wasserstein distance is hard to approximate because of

the curse of dimensionality. A recently proposed approach to alleviate the curse of …

  Cited by 1 Related articles All 3 versions 


2020


[PDF] arxiv.org

Encoded Prior Sliced Wasserstein AutoEncoder for learning latent manifold representations

S Krishnagopal, J Bedrossian - arXiv preprint arXiv:2010.01037, 2020 - arxiv.org

While variational autoencoders have been successful generative models for a variety of

tasks, the use of conventional Gaussian or Gaussian mixture priors are limited in their ability

to capture topological or geometric properties of data in the latent representation. In this …

  Related articles All 2 versions 


[PDF] dergipark.org.tr

Wasserstein Riemannian Geometry on Statistical Manifold

C Ogouyandjou, N Wadagni - International Electronic Journal of …, 2020 - dergipark.org.tr

In this paper, we study some geometric properties of statistical manifold equipped with the

Riemannian Otto metric which is related to the L 2-Wasserstein distance of optimal mass

transport. We construct some α-connections on such manifold and we prove that the …

  Related articles All 2 versions 


2020 patent 

 OPEN ACCESS

基于改进WGAN-GP的多波段图像同步融合与增强方法

[Chinese: Multi-band image synchronous fusion and enhancement method based on improved WGAN-GP]

09/2020

PatentAvailable Online

2020 patent

 Wasserstein barycenter model ensembling

US US20200342361A1 Youssef Mroueh International Business Machines Corporation

Priority 2019-04-29 • Filed 2019-04-29 • Published 2020-10-29

10 . The system according to claim 9 , further comprising inputting side information into the barycenter, wherein the barycenter comprises a Wasserstein barycenter with a Wasserstein distance metric. 11 . The system according to claim 9 , further comprising a plurality of the barycenters to determine …


2020 patent

Differential privacy greedy grouping method adopting Wasserstein distance

CN CN112307514A 杨悦 哈尔滨工程大学

Priority 2020-11-26 • Filed 2020-11-26 • Published 2021-02-02

1. A differential privacy greedy grouping method adopting Wasserstein distance is characterized by comprising the following steps: step 1: reading a data set D received at the ith time point i Step 2: will D i Data set D released from last time point i-1 Performing Wasserstein distance similarity …

<——2020——2020———2820—


2020 patent

Wi-Fi indoor positioning method based on signal distribution Wasserstein …

CN CN111741429A 周牧 重庆邮电大学

Priority 2020-06-23 • Filed 2020-06-23 • Published 2020-10-02

2. The Wi-Fi indoor positioning method based on the signal distribution Wasserstein distance metric according to claim 1, wherein said ninth step comprises the steps of: step nine (one), w corresponding to each RP m,n Sequencing all RPs from small to large to obtain an RP sequencing set u related to …


2020 patent

Wasserstein distance-based image rapid enhancement method

CN CN111476721A 丰江帆 重庆邮电大学

Priority 2020-03-10 • Filed 2020-03-10 • Published 2020-07-31

4. The Wasserstein distance-based image rapid enhancement method according to claim 3, characterized in that: in step S21, the motion-blurred image has 256 features, including texture features, color features, and edge features. 5. The Wasserstein distance-based image rapid enhancement method …


Enhancing the Classification of EEG Signals using Wasserstein Generative Adversarial Networks

VM Petruţiu, LD Palcu, C Lemnaur… - 2020 IEEE 16th …, 2020 - ieeexplore.ieee.org

Collecting EEG signal data during a human visual recognition task is a costly and time-

consuming process. However, training good classification models usually requires a large

amount of quality data. We propose a data augmentation method based on Generative …

  Cited by 1 Related articles All 2 versions


Improving EEG-based motor imagery classification with conditional Wasserstein GAN

Z Li, Y Yu - 2020 International Conference on Image, Video …, 2020 - spiedigitallibrary.org

Deep learning based algorithms have made huge progress in the field of image

classification and speech recognition. There is an increasing number of researchers

beginning to use deep learning to process electroencephalographic (EEG) brain signals …

  Related articles


Invariant Adversarial Learning for Distributional Robustness

J Liu, Z Shen, P Cui, L Zhou, K Kuang, B Li… - arXiv preprint arXiv …, 2020 - arxiv.org

… satisfies c(z, z) = 0, for probability measures P and Q supported on Z, the Wasserstein distance

between … 2.1 indicates that the correlation between stable covariates S and the target Y stays

invariant across environments, which is quite similar to the invariance in [22 …

  Related articles All 2 versions 

[CITATION] Deep Diffusion-Invariant Wasserstein Distributional Classification

SW Park, DW Shu, J Kwon - Advances in Neural Information Processing Systems, 2020

  Related articles


2020


[PDF] thecvf.com

Barycenters of natural images - constrained Wasserstein barycenters for image morphing

D Simon, A Aberdam - … of the IEEE/CVF Conference on …, 2020 - openaccess.thecvf.com

Image interpolation, or image morphing, refers to a visual transition between two (or more)

input images. For such a transition to look visually appealing, its desirable properties are (i)

to be smooth;(ii) to apply the minimal required change in the image; and (iii) to seem" real" …

  Cited by 3 Related articles All 7 versions 

[PDF] arxiv.org

Safe Wasserstein Constrained Deep Q-Learning

A Kandel, SJ Moura - arXiv preprint arXiv:2002.03016, 2020 - arxiv.org

This paper presents a distributionally robust Q-Learning algorithm (DrQ) which leverages

Wasserstein ambiguity sets to provide probabilistic out-of-sample safety guarantees during

online learning. First, we follow past work by separating the constraint functions from the …

  Related articles All 2 versions 


[PDF] arxiv.org

A material decomposition method for dual-energy CT via dual interactive Wasserstein generative adversarial networks

Z Shi, H Li, Q Cao, Z Wang, M Cheng - arXiv preprint arXiv:2007.11247, 2020 - arxiv.org

Dual-energy computed tomography has great potential in material characterization and

identification, whereas the reconstructed material-specific images always suffer from

magnified noise and beam hardening artifacts. In this study, a data-driven approach using …

  Related articles All 3 versions 


[PDF] arxiv.org

Image hashing by minimizing independent relaxed wasserstein distance

KD Doan, A Kimiyaie, S Manchanda… - arXiv preprint arXiv …, 2020 - arxiv.org

Image hashing is a fundamental problem in the computer vision domain with various

challenges, primarily, in terms of efficiency and effectiveness. Existing hashing methods lack

a principled characterization of the goodness of the hash codes and a principled approach …

  Cited by 2 Related articles All 2 versions 

<——2020——2020———2830—


2020

Person-independent facial expression recognition method based on improved Wasserstein generative adversarial networks in combination with identity aware

C Xu, Y Cui, Y Zhang, P Gao, J Xu - Multimedia Systems, 2020 - Springer

Since the distinction between two expressions is fairly vague, usually a subtle change in one

part of the human face is enough to change a facial expression. Most of the existing facial

expression recognition algorithms are not robust enough because they rely on general facial …

  Cited by 6 Related articles


2020[PDF] aaai.org

Improving the Robustness of Wasserstein Embedding by Adversarial PAC-Bayesian Learning

D Ding, M Zhang, X Pan, M Yang, X He - Proceedings of the AAAI …, 2020 - ojs.aaai.org

Node embedding is a crucial task in graph analysis. Recently, several methods are

proposed to embed a node as a distribution rather than a vector to capture more information.

Although these methods achieved noticeable improvements, their extra complexity brings …

  Related articles All 3 versions 


 2020 [PDF] arxiv.org

Generating Natural Adversarial Hyperspectral examples with a modified Wasserstein GAN

JC Burnel, K Fatras, N Courty - arXiv preprint arXiv:2001.09993, 2020 - arxiv.org

Adversarial examples are a hot topic due to their abilities to fool a classifier's prediction.

There are two strategies to create such examples, one uses the attacked classifier's

gradients, while the other only requires access to the clas-sifier's prediction. This is …

  Related articles All 4 versions 


2020 [PDF] arxiv.org

Independent Elliptical Distributions Minimize Their $\mathcal{W}_2$ Wasserstein Distance from Independent Elliptical Distributions with the Same Density Generator

S Fang, Q Zhu - arXiv preprint arXiv:2012.03809, 2020 - arxiv.org

This short note is on a property of the $\mathcal{W}_2$ Wasserstein distance which

indicates that independent elliptical distributions minimize their $\mathcal{W}_2$

Wasserstein distance from given independent elliptical distributions with the same density …

  Related articles All 2 versions 
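
Note: for Gaussian distributions (and elliptical ones sharing a density generator, as in this entry) the 2-Wasserstein distance has the Gelbrich closed form

W_2^2\big(\mathcal{N}(m_1,\Sigma_1),\,\mathcal{N}(m_2,\Sigma_2)\big) = \|m_1-m_2\|_2^2 + \operatorname{tr}\!\Big(\Sigma_1+\Sigma_2-2\big(\Sigma_2^{1/2}\Sigma_1\Sigma_2^{1/2}\big)^{1/2}\Big),

which is the quantity the minimization property above refers to.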


 2020 OPEN ACCESS

Algorithm for solving imbalance of leakage data of halogen conveying pipeline based on S transformation/WGAN

by ZHAO JIANYANG; SHAN JINGSONG; DING WEIHONG ; More...

07/2020

The invention relates to the technical field of halogen conveying pipeline detection, and discloses an algorithm for solving imbalance of leakage data of a...

PatentCitation Online

2020

 

 2020 OPEN ACCESS  

一种基于WGAN的超参数动态调整方法

09/2020

PatentCitation Online

 [Chinese  dynamic adjustment method of hyperparameters based on WGAN]


 

 2020 online

Face Inpainting based on Improved WGAN-modified

by Zhao, Yue; Liu, Lijun; Liu, Han ; More...

2020 International Symposium on Autonomous Systems (ISAS), 12/2020

Image Inpainting aims to use the technical methods to repair and reconstruct the corrupted region of the corrupted image, so that the reconstructed image looks...

Conference ProceedingFull Text Online

 

2020 online  OPEN ACCESS

Accelerated WGAN update strategy with loss change rate balancing

by Ouyang, Xu; Agam, Gady

08/2020

Optimizing the discriminator in Generative Adversarial Networks (GANs) to completion in the inner training loop is computationally prohibitive, and on finite...

Journal ArticleFull Text Online

 

2020 Cover Image  OPEN ACCESS

A TextCNN and WGAN-gp based deep learning frame for unpaired text style transfer in multimedia...

by Hu, Mingxuan; He, Min; Su, Wei ; More...

11/2020

With the rapid growth of big multimedia data, multimedia processing techniques are facing some challenges, such as knowledge understanding, semantic modeling,...


Journal ArticleCitation Online


2020  online

Findings from Chongqing University of Posts and Telecommunications Update Knowledge of Optics Research 

(Motion Deblurring In Image Color Enhancement By Wgan...

Science Letter, 12/2020

NewsletterFull Text Online

<——2020——2020———2840— 

 

2020  OPEN ACCESS

Building energy consumption prediction method based on WGAN algorithm and monitoring and...

by LU YOU; WANG ZHECHAO; WU HONGJIE ; More...

05/2020

The invention relates to a building energy consumption prediction method based on a WGAN algorithm and a building energy consumption monitoring and prediction...

PatentCitation Online

2020 online

Univ South China Tech Submits Patent Application for a Feature Recalibration Convolution Method Based on WGAN...

Global IP News. Broadband and Wireless Network News, Aug 31, 2020

Newspaper ArticleFull Text Online

2020 online

Information Technology; Investigators at Beijing University of Posts and Telecommunications Report Findings in Information Technology (E-wacgan: Enhanced Generative Model of Signaling Data Based On Wgan...

Telecommunications weekly, Sep 30, 2020, 70

Newspaper ArticleFull Text Online

2020

Régression quantile extrême : une approche par couplage et ...

https://hal.inria.fr › UMR6623


by B Bobbia · 2020 — (Translated from French) More precisely, the estimation of quantiles of a real distribution ... Régression quantile extrême : une approche par couplage et distance de Wasserstein [Extreme quantile regression: a coupling and Wasserstein distance approach]. Benjamin Bobbia 1 ... Université Bourgogne Franche-Comté, 2020.

 OPEN ACCESS

Régression quantile extrême : une approche par couplage et distance de Wasserstein

by Bobbia, Benjamin

2020

(Translated from French) This work concerns the estimation of conditional extreme quantiles; more precisely, the estimation of quantiles of a real distribution as a function of...

Dissertation/ThesisCitation Online


[PDF] academia.edu

Data augmentation in fault diagnosis based on the Wasserstein generative adversarial network with gradient penalty

X Gao, F Deng, X Yue - Neurocomputing, 2020 - Elsevier

Fault detection and diagnosis in industrial process is an extremely essential part to keep

away from undesired events and ensure the safety of operators and facilities. In the last few

decades various data based machine learning algorithms have been widely studied to …

  Cited by 32 Related articles All 3 versions

 

2020


RDA-UNET-WGAN: An Accurate Breast Ultrasound Lesion Segmentation Using Wasserstein Generative Adversarial Networks

A Negi, ANJ Raj, R Nersisson, Z Zhuang… - Arabian Journal for …, 2020 - Springer

Early-stage detection of lesions is the best possible way to fight breast cancer, a disease

with the highest malignancy ratio among women. Though several methods primarily based

on deep learning have been proposed for tumor segmentation, it is still a challenging …

  Cited by 5 Related articles


[HTML] hindawi.com

[HTML] Imbalanced Fault Classification of Bearing via Wasserstein Generative Adversarial Networks with Gradient Penalty

B Han, S Jia, G Liu, J Wang - Shock and Vibration, 2020 - hindawi.com

Recently, generative adversarial networks (GANs) are widely applied to increase the

amounts of imbalanced input samples in fault diagnosis. However, the existing GAN-based

methods have convergence difficulties and training instability, which affect the fault  …

  Related articles All 4 versions 


[PDF] sciencedirect.com

Intelligent Fault Diagnosis with a Deep Transfer Network based on Wasserstein Distance

J Xu, J Huang, Y Zhao, L Zhou - Procedia Computer Science, 2020 - Elsevier

Intelligent fault-diagnosis methods based on deep-learning technology have been very

successful for complex industrial systems. The deep learning based fault classification

model requires a large number of labeled data. Moreover, the probability distribution of …

  Related articles


Data Augmentation Method for Fault Diagnosis of Mechanical Equipment Based on Improved Wasserstein GAN

W Liu, L Duan, Y Tang, J Yang - 2020 11th International …, 2020 - ieeexplore.ieee.org

Most of the time the mechanical equipment is in normal operation state, which results in high

imbalance between fault data and normal data. In addition, traditional signal processing

methods rely heavily on expert experience, making it difficult for classification or prediction …

  Related articles


[PDF] theses.fr

Régression quantile extrême: une approche par couplage et distance de Wasserstein.

B Bobbia - 2020 - theses.fr

Abstract (translated from French): This work concerns the estimation of conditional extreme quantiles; more precisely, the estimation of quantiles of a real distribution as a function of a high-dimensional covariate. To carry out such an estimation, we present a model …

  All 8 versions

<——2020——2020———2850— 

online  OPEN ACCESS

Wasserstein Contrastive Representation Distillation

by Chen, Liqun; Wang, Dong; Gan, Zhe ; More...

12/2020

The primary goal of knowledge distillation (KD) is to encapsulate the information of a model learned from a teacher network into a student network, with the...

Journal ArticleFull Text Online

 

Exponential convergence in the Wasserstein metric W_1 for one dimensional diffusions

Authors: Lingyan Cheng, Ruinan Li, Liming Wu

Article, 2020
Publication: Discrete and Continuous Dynamical Systems, 40, 2020, 5131
Publisher: 2020

 

Finite-Horizon Control of Nonlinear Discrete-Time Systems with Terminal Cost of Wasserstein Distance

by Hoshino, Kenta

2020 59th IEEE Conference on Decision and Control (CDC), 12/2020

This study explores a finite-horizon optimal control problem of nonlinear discrete-time systems for steering a probability distribution of initial states as...

Conference ProceedingCitation Online

 

online

A Class of Optimal Transport Regularized Formulations with Applications to Wasserstein GANs

by Mahdian, Saied; Blanchet, Jose H; Glynn, Peter W

2020 Winter Simulation Conference (WSC), 12/2020

Optimal transport costs (e.g. Wasserstein distances) are used for fitting high-dimensional distributions. For example, popular artificial intelligence...

Conference ProceedingFull Text Online

 

online  OPEN ACCESS

A Riemannian Block Coordinate Descent Method for Computing the Projection Robust Wasserstein...

by Huang, Minhui; Ma, Shiqian; Lai, Lifeng

12/2020

The Wasserstein distance has become increasingly important in machine learning and deep learning. Despite its popularity, the Wasserstein distance is hard 

to...

Journal ArticleFull Text Online

 

2020

 OPEN ACCESS

Wasserstein gradient flow formulation of the time-fractional Fokker–Planck equation

by Jin, B; Duong, MH

12/2020

In this work, we investigate a variational formulation for a time-fractional Fokker–Planck equation which arises in the study of complex physical systems...

Journal ArticleCitation Online

 

online OPEN ACCESS

The Spectral-Domain $\mathcal{W}_2$ Wasserstein Distance for Elliptical Processes and the...

by Fang, Song; Zhu, Quanyan

12/2020

In this short note, we introduce the spectral-domain $\mathcal{W}_2$ Wasserstein distance for elliptical stochastic processes in terms of their power spectra....

Journal ArticleFull Text Online

 


2020


Semantic Inpainting with Multi-dimensional Adversarial Network and Wasserstein Distance

H Wang, L Jiao, R Bie, H Wu - Chinese Conference on Pattern …, 2020 - Springer

Inpainting represents a procedure which can restore the lost parts of an image based upon

the residual information. We present an inpainting network that consists of an Encoder-

Decoder pipeline and a multi-dimensional adversarial network. The Encoder-Decoder …

  Related articles

online

Reports on Information Technology from George Mason University Provide New Insights 

(Data-driven Distributionally Robust Chance-constrained Optimization With Wasserstein...

Information Technology Newsweekly, 12/2020

NewsletterFull Text Online


online

Cover Image  OPEN ACCESS

A collaborative filtering recommendation framework based on Wasserstein GAN

by Li, Rui; Qian, Fulan; Du, Xiuquan ; More...

Journal of physics. Conference series, 11/2020, Volume 1684, Issue 1

Compared with the original GAN, Wasserstein GAN minimizes the Wasserstein Distance between the generative distribution and the real distribution, can well...


Journal ArticleFull Text Online


Cited by 3 Related articles All 2 versions

<——2020——2020—2860—  


online  Cover Image  OPEN ACCESS

Squared quadratic Wasserstein distance : optimal couplings and Lions differentiability

by Alfonsi, Aurélien; Jourdain, Benjamin

Probability and statistics, 03/2020, Volume 24

In this paper, we remark that any optimal coupling for the quadratic Wasserstein distance $W^2_2(\mu,\nu)$ between two probability measures $\mu$ and $\nu$...


Journal ArticleFull Text Online


  

Nonparametric Different-Feature Selection Using Wasserstein Distance

by Zheng, Wenbo; Wang, Fei-Yue; Gou, Chao

2020 IEEE 32nd International Conference on Tools with Artificial Intelligence (ICTAI), 11/2020

In this paper, we propose a feature selection method that characterizes the difference between two kinds of probability distributions. The key idea is to view...

Conference ProceedingCitation Online

Nonparametric Different-Feature Selection Using Wasserstein Distance chapter

 

Learning Wasserstein Isometric Embedding for Point Clouds

by Kawano, Keisuke; Koide, Satoshi; Kutsuna, Takuro

2020 International Conference on 3D Vision (3DV), 11/2020

The Wasserstein distance has been employed for determining the distance between point clouds, which have variable numbers of points and invariance of point...

Conference ProceedingCitation Online

Cited by 2 Related articles All 3 versions

  

online

Biosignal Oversampling Using Wasserstein Generative Adversarial Network

by Munia, Munawara Saiyara; Nourani, Mehrdad; Houari, Sammy

2020 IEEE International Conference on Healthcare Informatics (ICHI), 11/2020

Oversampling plays a vital role in improving the minority-class classification accuracy for imbalanced biomedical datasets. In this work, we propose a...

Conference ProceedingFull Text Online

Cited by 1 Related articles All 2 versions

 

online  Cover Image

Nonlinear model reduction on metric spaces. Application to one-dimensional conservative PDEs in Wasserstein...

by Ehrlacher, Virginie; Lombardi, Damiano; Mula, Olga ; More...

ESAIM: Mathematical Modelling and Numerical Analysis, 03/2020, Volume 54, Issue 6

We consider the problem of model reduction of parametrized PDEs where the goal is to approximate any function belonging to the set of solutions at a reduced...


Journal ArticleFull Text Online


 Cited by 17 Related articles All 25 versions


2020

online  OPEN ACCESS

A new approach to posterior contraction rates via Wasserstein dynamics

by Dolera, Emanuele; Favaro, Stefano; Mainini, Edoardo

11/2020

This paper presents a new approach to the classical problem of quantifying posterior contraction rates (PCRs) in Bayesian statistics. Our approach relies on...

Journal ArticleFull Text Online


online

Improving EEG-based motor imagery classification with conditional Wasserstein GAN

by Li, Zheng; Yu, Yang

11/2020

Deep learning based algorithms have made huge progress in the field of image classification and speech recognition. There is an increasing number of...

Conference ProceedingFull Text Online

Improving EEG-based motor imagery classification with conditional Wasserstein GAN

online

Spatial-aware Network using Wasserstein Distance for Unsupervised Domain Adaptation

by Long, Liu; Bin, Luo; Jiang, Fan

2020 Chinese Automation Congress (CAC), 11/2020

In a general scenario, the purpose of Unsupervised Domain Adaptation (UDA) is to classify unlabeled target domain data as much as possible, but the source...

Conference ProceedingFull Text Online

 Related articles All 3 versions


online  OPEN ACCESS

Automatic coding of students' writing via Contrastive Representation Learning in the Wasserstein space

by Jiang, Ruijie; Gouvea, Julia; Hammer, David ; More...

11/2020

Qualitative analysis of verbal data is of central importance in the learning sciences. It is labor-intensive and time-consuming, however, which limits the...

Journal ArticleFull Text Online

  Cited by 2 Related articles All 4 versions

online   OPEN ACCESS

Entropic regularization of Wasserstein distance between infinite-dimensional Gaussian measures...

by Quang, Minh Ha

11/2020

This work studies the entropic regularization formulation of the 2-Wasserstein distance on an infinite-dimensional Hilbert space, in particular for the...

Journal ArticleFull Text Online

 Cited by 2 Related articles All 3 versions

<——2020——2020—2870—


Wasserstein幾何学と情報幾何学

by 高津 飛鳥

数理科学, 11/2020, Volume 58, Issue 11

Journal ArticleCitation Online

 [Japanese  Wasserstein Geometry and Information Geometry]

 

"Wasserstein Barycenter Model Ensembling" in Patent Application Approval Process (USPTO...

Technology Business Journal, 11/2020

NewsletterCitation Online

Wasserstein barycenter model ensembling

[PDF] arxiv.org

Visual transfer for reinforcement learning via wasserstein domain confusion

J RoyG Konidaris - arXiv preprint arXiv:2006.03465, 2020 - arxiv.org

We introduce Wasserstein Adversarial Proximal Policy Optimization (WAPPO), a novel

algorithm for visual transfer in Reinforcement Learning that explicitly learns to align the

distributions of extracted features between a source and target task. WAPPO approximates …

  Cited by 3 Related articles All 6 versions 


Domain-attention Conditional Wasserstein Distance for Multi-source Domain Adaptation

H Wu, Y Yan, MK Ng, Q Wu - ACM Transactions on Intelligent Systems …, 2020 - dl.acm.org

Multi-source domain adaptation has received considerable attention due to its effectiveness

of leveraging the knowledge from multiple related sources with different distributions to

enhance the learning performance. One of the fundamental challenges in multi-source …

  Cited by 1 Related articles All 2 versions


Adversarial sliced Wasserstein domain adaptation networks

Y Zhang, N Wang, S Cai - Image and Vision Computing, 2020 - Elsevier

Abstract Domain adaptation has become a resounding success in learning a domain

agnostic model that performs well on target dataset by leveraging source dataset which has

related data distribution. Most of existing works aim at learning domain-invariant features …

  Cited by 4 Related articles All 2 versions


2020


[PDF] arxiv.org

Wasserstein Index Generation Model: Automatic generation of time-series index with application to Economic Policy Uncertainty

F Xie - Economics Letters, 2020 - Elsevier

Automatic time-series index generation as a black-box method … Comparable results with existing

ones, tested on EPU … Applicable to any text corpus to produce sentiment indices … I propose

a novel method, the Wasserstein Index Generation model (WIG), to generate a public sentiment …

  Cited by 6 Related articles All 11 versions


Wasserstein based transfer network for cross-domain sentiment classification

Y Du, M He, L Wang, H Zhang - Knowledge-Based Systems, 2020 - Elsevier

Automatic sentiment analysis of social media texts is of great significance for identifying

people's opinions that can help people make better decisions. Annotating data is time

consuming and laborious, and effective sentiment analysis on domains lacking of labeled …

  Cited by 2 Related articles All 2 versions


[PDF] ieee.org

Distributionally Robust Optimal Reactive Power Dispatch with Wasserstein Distance in Active Distribution Network

J Liu, Y Chen, C Duan, J Lin… - Journal of Modern Power …, 2020 - ieeexplore.ieee.org

The uncertainties from renewable energy sources (RESs) will not only introduce significant

influences to active power dispatch, but also bring great challenges to the analysis of

optimal reactive power dispatch (ORPD). To address the influence of high penetration of …

  Cited by 8 Related articles All 3 versions


[PDF] arxiv.org

Automatic coding of students' writing via Contrastive Representation Learning in the Wasserstein space

R Jiang, J Gouvea, D Hammer, E Miller… - arXiv preprint arXiv …, 2020 - arxiv.org

Qualitative analysis of verbal data is of central importance in the learning sciences. It is labor-

intensive and time-consuming, however, which limits the amount of data researchers can

include in studies. This work is a step towards building a statistical machine learning (ML) …

  Related articles All 2 versions 


Nonparametric Different-Feature Selection Using Wasserstein Distance

W Zheng, FY Wang, C Gou - 2020 IEEE 32nd International …, 2020 - ieeexplore.ieee.org

In this paper, we propose a feature selection method that characterizes the difference

between two kinds of probability distributions. The key idea is to view the feature selection

problem as a sparsest k-subgraph problem that considers Wasserstein distance between …

  Related articles All 2 versions

<——2020——2020—2880—


[PDF] arxiv.org

The Spectral-Domain $\mathcal{W}_2$ Wasserstein Distance for Elliptical Processes and the Spectral-Domain Gelbrich Bound

S Fang, Q Zhu - arXiv preprint arXiv:2012.04023, 2020 - arxiv.org

In this short note, we introduce the spectral-domain $\mathcal{W}_2$ Wasserstein distance

for elliptical stochastic processes in terms of their power spectra. We also introduce the

spectral-domain Gelbrich bound for processes that are not necessarily elliptical. Subjects …

  Related articles All 2 versions 


[PDF] arxiv.org

Unsupervised Wasserstein Distance Guided Domain Adaptation for 3D Multi-domain Liver Segmentation

C You, J Yang, J Chapiro, JS Duncan - Interpretable and Annotation …, 2020 - Springer

Deep neural networks have shown exceptional learning capability and generalizability in

the source domain when massive labeled data is provided. However, the well-trained

models often fail in the target domain due to the domain shift. Unsupervised domain  …

  Related articles All 3 versions

 

System and method for unsupervised domain adaptation via sliced-wasserstein distance

AJ Gabourie, M Rostami, S Kolouri… - US Patent App. 16 …, 2020 - freepatentsonline.com

Described is a system for unsupervised domain adaptation in an autonomous learning

agent. The system adapts a learned model with a set of unlabeled data from a target

domain, resulting in an adapted model. The learned model was previously trained to …

  Cited by 2 Related articles All 2 versions 


Spatial-aware Network using Wasserstein Distance for Unsupervised Domain Adaptation

L Long, L Bin, F Jiang - 2020 Chinese Automation Congress …, 2020 - ieeexplore.ieee.org

In a general scenario, the purpose of Unsupervised Domain Adaptation (UDA) is to classify

unlabeled target domain data as much as possible, but the source domain data has a large

number of labels. To address this situation, this paper introduces the optimal transport theory …


2020

De novo protein design for novel folds using guided conditional Wasserstein generative adversarial networks

M Karimi, S Zhu, Y Cao, Y Shen - Journal of Chemical Information …, 2020 - ACS Publications

Although massive data is quickly accumulating on protein sequence and structure, there is a

small and limited number of protein architectural types (or structural folds). This study is

addressing the following question: how well could one reveal underlying sequence …

Cited by 23 Related articles All 4 versions

 De Novo Protein Design for Novel Folds Using Guided Conditional Wasserstein Generative Adversarial Networks

By: Karimi, Mostafa; Zhu, Shaowen; Cao, Yue; et al.

JOURNAL OF CHEMICAL INFORMATION AND MODELING  Volume: 60   Issue: 12   Pages: 5667-5681   Published: DEC 28 2020


Times Cited: 1
Cited by 23
 Related articles All 4 versions


2020


[PDF] arxiv.org

Wasserstein Distance guided Adversarial Imitation Learning with Reward Shape Exploration

M Zhang, Y Wang, X Ma, L Xia, J Yang… - 2020 IEEE 9th Data …, 2020 - ieeexplore.ieee.org

The generative adversarial imitation learning (GAIL) has provided an adversarial learning

framework for imitating expert policy from demonstrations in high-dimensional continuous

tasks. However, almost all GAIL and its extensions only design a kind of reward function of …

  Cited by 3 Related articles All 5 versions


[PDF] carloalberto.org

[PDF] Measuring dependence in the Wasserstein distance for Bayesian nonparametric models

M Catalano, A Lijoi, I Prünster - Under revision. xi, 2020 - carloalberto.org

The proposal and study of dependent Bayesian nonparametric models has been one of the

most active research lines in the last two decades, with random vectors of measures

representing a natural and popular tool to define them. Nonetheless a principled approach …

  Cited by 1 


A new Wasserstein distance-and cumulative sum-dependent health indicator and its application in prediction of remaining useful life of bearing

J Yin, M Xu, H Zheng, Y Yang - Journal of the Brazilian Society of …, 2020 - Springer

The safety and reliability of mechanical performance are affected by the condition (health

status) of the bearings. A health indicator (HI) with high monotonicity and robustness is a

helpful tool to simplify the predictive model and improve prediction accuracy. In this paper, a …

  Related articles


[PDF] arxiv.org

Unsupervised Wasserstein Distance Guided Domain Adaptation for 3D Multi-domain Liver Segmentation

C You, J Yang, J Chapiro, JS Duncan - Interpretable and Annotation …, 2020 - Springer

Deep neural networks have shown exceptional learning capability and generalizability in

the source domain when massive labeled data is provided. However, the well-trained

models often fail in the target domain due to the domain shift. Unsupervised domain …

  Related articles All 3 versions


online Cover Image PEER-REVIEW

Drug-drug interaction prediction with Wasserstein Adversarial Autoencoder-based knowledge graph...

Briefings in bioinformatics, 10/2020


Journal ArticleFull Text Online


<——2020——2020—2890—  

online Cover Image  OPEN ACCESS

Data Augmentation Method for Switchgear Defect Samples Based on Wasserstein Generative...

by Huang, Xueyou; Xiong, Jun; Zhang, Yu ; More...

Journal of physics. Conference series, 10/2020, Volume 1659, Issue 1

The problem of sample imbalance will lead to poor generalization ability of the deep learning model algorithm, and the phenomenon of overfitting during network...


Journal ArticleFull Text Online


 [CITATION] Data augmentation method for power transformer fault diagnosis based on conditional Wasserstein generative adversarial network

YP Liu, Z Xu, J He, Q Wang, SG Gao, J Zhao - Power System Technology, 2020

Cited by 6 Related articles

 

online  OPEN ACCESS

Intrinsic Sliced Wasserstein Distances for Comparing Collections of Probability Distributions...

by Rustamov, Raif M; Majumdar, Subhabrata

10/2020

Collections of probability distributions arise in a variety of statistical applications ranging from user activity pattern analysis to brain connectomics. In...

Journal ArticleFull Text Online

Cited by 6 Related articles All 5 versions

transport: Computation of Optimal Transport Plans and ...

rdrr.io/cran/transport 

Mar 13, 2020 · Solve optimal transport problems. Compute Wasserstein distances (a.k.a. Kantorovitch, Fortet--Mourier, Mallows, Earth Mover's, or minimal L_p distances), return the corresponding transference plans, and display them graphically. Objects that can be compared include grey-scale images, (weighted) point patterns, and mass vectors. 
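
Note: the one-dimensional "mass vectors" case that this R package handles can be reproduced in Python with SciPy (a hedged analogue using scipy.stats.wasserstein_distance, not the package's own routine):

import numpy as np
from scipy.stats import wasserstein_distance

x  = np.array([0.0, 1.0, 3.0])    # support of the first measure
y  = np.array([0.5, 2.0])         # support of the second measure
wx = np.array([0.2, 0.5, 0.3])    # mass vector on x
wy = np.array([0.6, 0.4])         # mass vector on y
print(wasserstein_distance(x, y, u_weights=wx, v_weights=wy))  # 1-Wasserstein (earth mover's) distance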


  

online OPEN ACCESS

Wasserstein Distance Regularized Sequence Representation for Text Matching in Asymmetrical...

by Yu, Weijie; Xu, Chen; Xu, Jun ; More...

10/2020

One approach to matching texts from asymmetrical domains is projecting the input sequences into a common semantic space as feature vectors upon which the...

Journal ArticleFull Text Online

Cited by 7 Related articles All 7 versions

 

online  OPEN ACCESS

Data-driven Distributionally Robust Optimal Stochastic Control Using the Wasserstein Metric

by Zhao, Feiran; You, Keyou

10/2020

Optimal control of a stochastic dynamical system usually requires a good dynamical model with probability distributions, which is difficult to obtain due to...

Journal ArticleFull Text Online

 Related articles All 2 versions 

 

2020

2020 online OPEN ACCESS

CONLON: A pseudo-song generator based on a new pianoroll, Wasserstein autoencoders, and optimal interpolations

by Angioloni, Luca; Borghuis, V.A.J; Brusci, Lorenzo ; More...

Proceedings of the 21st International Society for Music Information Retrieval Conference, 10/2020

We introduce CONLON, a pattern-based MIDI generation method that employs a new lossless pianoroll-like data description in which velocities and durations are...

Conference ProceedingFull Text Online

 

online Cover Image PEER-REVIEW

Horo-functions associated to atom sequences on the Wasserstein space

by Zhu, Guomin; Wu, Hongguang; Cui, Xiaojun

Archiv der Mathematik, 07/2020, Volume 115, Issue 5

On the Wasserstein space over a complete, separable, non-compact, locally compact length space, we consider the horo-functions associated to sequences of...


Journal ArticleFull Text Online


News results for "(TitleCombined:(wasserstein))"

Paid Notice: Deaths; WASSERSTEIN, FLORENCE

The New York Times, Jul 27, 2020, 20

WASSERSTEIN--Florence. On July 22, beloved mother, devoted grandmother, proud great-grandmother, passed away at the age of 97. Her love of life and family will...

Newspaper ArticleCitation Online

 Related articles

 

 OPEN ACCESS patent

基于信号分布Wasserstein距离度量的Wi-Fi室内定位方法

10/2020 ...

PatentCitation Online

 [Chinese  Wi-Fi indoor positioning method based on signal distribution Wasserstein distance metric]

 CN CN111741429A 周牧 重庆邮电大学


online

Fourier Analysis; Researchers' Work from Stanford University Focuses on Fourier Analysis (Irregularity of Distribution In Wasserstein...

Journal of mathematics (Atlanta, Ga.), Oct 27, 2020, 750

Newspaper ArticleFull Text Online


[PDF] thecvf.com

Severity-aware semantic segmentation with reinforced wasserstein training

X Liu, W Ji, J You, GE Fakhri… - Proceedings of the IEEE …, 2020 - openaccess.thecvf.com

Semantic segmentation is a class of methods to classify each pixel in an image into

semantic classes, which is critical for autonomous vehicles and surgery systems. Cross-

entropy (CE) loss-based deep neural networks (DNN) achieved great success wrt the …

  Cited by 10 Related articles All 5 versions 

<——2020—–—2020—––2900— 


[PDF] mlr.press

Nested-wasserstein self-imitation learning for sequence generation

R Zhang, C Chen, Z Gan, Z Wen… - International …, 2020 - proceedings.mlr.press

Reinforcement learning (RL) has been widely studied for improving sequence-generation

models. However, the conventional rewards used for RL training typically cannot capture

sufficient semantic information and therefore render model bias. Further, the sparse and …

  Cited by 2 Related articles All 6 versions 


Nested-Wasserstein Self-Imitation Learning for Sequence Generation

L Carin - 2020 - openreview.net

Reinforcement learning (RL) has been widely studied for improving sequence-generation

models. However, the conventional rewards used for RL training typically cannot capture

sufficient semantic information and therefore render model bias. Further, the sparse and …

  

[PDF] brown.edu

Drug–drug interaction prediction with Wasserstein Adversarial Autoencoder-based knowledge graph embeddings

Y Dai, C Guo, W Guo, C Eickhoff - Briefings in Bioinformatics, 2020 - academic.oup.com

An interaction between pharmacological agents can trigger unexpected adverse events.

Capturing richer and more comprehensive information about drug–drug interactions (DDIs)

is one of the key tasks in public health and drug development. Recently, several knowledge …

  Related articles All 3 versions


 Generating Hyperspectral Data Based on 3D CNN and Improved Wasserstein Generative Adversarial Network Using Homemade High-resolution Datasets

Y Li, D Huang - Proceedings of the International Conference on …, 2020 - dl.acm.org

Hyperspectral images contain rich information on the fingerprints of materials and are being

popularly used in the exploration of oil and gas, environmental monitoring, and remote

sensing. Since hyperspectral images cover a wide range of wavelengths with high …

  Related articles


[PDF] arxiv.org

Generating Natural Adversarial Hyperspectral examples with a modified Wasserstein GAN

JC Burnel, K Fatras, N Courty - arXiv preprint arXiv:2001.09993, 2020 - arxiv.org

Adversarial examples are a hot topic due to their abilities to fool a classifier's prediction.

There are two strategies to create such examples, one uses the attacked classifier's

gradients, while the other only requires access to the clas-sifier's prediction. This is …

  Cited by 1 Related articles All 4 versions 


2020


online OPEN ACCESS

Universal consistency of Wasserstein $k$-NN classifier

by Ponnoprat, Donlapark

09/2020

The Wasserstein distance provides a notion of dissimilarities between probability measures, which has recent applications in learning of structured data with...

Journal ArticleFull Text Online

Universal consistency of Wasserstein $k$-NN classifier

https://www.researchgate.net › publication › 344198079_...

In this work, we analyze the $k$-nearest neighbor classifier ($k$-NN) under the Wasserstein distance and establish the universal consistency on families of ...
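
As a rough illustration of the object studied in this entry, the sketch below classifies 1-D empirical distributions with a k-NN rule under the Wasserstein distance; the toy data, function names, and parameters are illustrative assumptions, not the paper's code.

```python
# Minimal k-NN classifier over 1-D empirical distributions compared with the
# Wasserstein-1 distance. Toy example only; all names/data are assumptions.
import numpy as np
from scipy.stats import wasserstein_distance  # exact 1-D Wasserstein-1

def knn_predict(train_samples, train_labels, query_sample, k=3):
    """Label a query distribution by majority vote over its k Wasserstein-nearest neighbors."""
    dists = [wasserstein_distance(s, query_sample) for s in train_samples]
    nearest = np.argsort(dists)[:k]
    votes = [train_labels[i] for i in nearest]
    return max(set(votes), key=votes.count)

rng = np.random.default_rng(0)
# Two classes of distributions: N(0,1) vs N(2,1), each observed through 200 draws.
train_samples = [rng.normal(0, 1, 200) for _ in range(10)] + [rng.normal(2, 1, 200) for _ in range(10)]
train_labels = [0] * 10 + [1] * 10
query = rng.normal(2, 1, 200)
print(knn_predict(train_samples, train_labels, query, k=3))  # expected: 1
```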


Conditional Wasserstein Auto-Encoder for Interactive Vehicle Trajectory Prediction

C Fei, X He, S Kawahara, N Shirou… - 2020 IEEE 23rd …, 2020 - ieeexplore.ieee.org

Trajectory prediction is a crucial task required for autonomous driving. The highly

interactions and uncertainties in real-world traffic scenarios make it a challenge to generate

trajectories that are accurate, reasonable and covering diverse modality as much as

possible. This paper propose a conditional Wasserstein auto-encoder trajectory prediction

model (TrajCWAE) that combines the representation learning and variational inference to

generate predictions with multi-modal nature. TrajCWAE model leverages a context …

 Cited by 3 Related articles All 3 versions

Conference Proceeding

 

 Optimal Control Theory--The equivalence of Fourier-based and Wasserstein metrics

by G Auricchio · 2020 · Cited by 1 — Optimal Control Theory--The equivalence of Fourier-based and Wasserstein metrics on imaging problems. Citation metadata. Author: Gennaro Auricchio, Andrea ...

Journal Article

 


Enhancing the Classification of EEG Signals using Wasserstein Generative Adversarial Networks

VM Petruţiu, LD Palcu, C Lemnaur… - 2020 IEEE 16th …, 2020 - ieeexplore.ieee.org

Collecting EEG signal data during a human visual recognition task is a costly and time-

consuming process. However, training good classification models usually requires a large

amount of quality data. We propose a data augmentation method based on Generative

Adversarial Networks (GANs) to generate artificial EEG signals from existing data, in order to

improve classification performance on data collected during a visual recognition task. We

evaluate the quality of the artificially generated signal in terms of the accuracy of a …

  Cited by 1 Related articles All 2 versions

Conference Proceeding

[HTML] hindawi.com

Cited by 2 Related articles All 2 versions

 A Super Resolution Method for Remote Sensing Images Based on Cascaded Conditional Wasserstein GANs

B Liu, H Li, Y Zhou, Y Peng, A Elazab… - 2020 IEEE 3rd …, 2020 - ieeexplore.ieee.org

High-resolution (HR) remote sensing imagery is quite beneficial for subsequent

interpretation. Obtaining HR images can be achieved by upgrading the imaging device. Yet,

the cost to perform this task is very huge. Thus, it is necessary to obtain HR images from low-

resolution (LR) ones. In the literature, the super-resolution image reconstruction methods

based on deep learning have unparalleled advantages in comparison to traditional

reconstruction methods. This work is inspired by these current mainstream methods and …

  Related articles

Conference Proceeding

.<——2020——2020—2910—  


 Small Object Detection from Remote Sensing Images with the Help of Object-Focused Super-Resolution Using Wasserstein GANs

L Courtrai, MT Pham, C Friguet… - IGARSS 2020-2020 …, 2020 - ieeexplore.ieee.org

In this paper, we investigate and improve the use of a super-resolution approach to benefit

the detection of small objects from aerial and satellite remote sensing images. The main

idea is to focus the super-resolution on target objects within the training phase. Such a

technique requires a reduced number of network layers depending on the desired scale

factor and the reduced size of the target objects. The learning of our super-resolution

network is performed using deep residual blocks integrated in a Wasserstein Generative …

  All 7 versions

Conference Proceeding


2020 [PDF] ieee.org

Joint transfer of model knowledge and fairness over domains using wasserstein distance

T Yoon, J Lee, W Lee - IEEE Access, 2020 - ieeexplore.ieee.org

Owing to the increasing use of machine learning in our daily lives, the problem of fairness

has recently become an important topic in machine learning societies. Recent studies

regarding fairness in machine learning have been conducted to attempt to ensure statistical …

  Cited by 2 Related articles

 

online  

Investigators at Beijing University of Technology Detail Findings in Knowledge Engineering 

(Wasserstein Based Transfer Network for Cross-domain Sentiment Classification)

Robotics & Machine Learning, 09/2020

NewsletterFull Text Online

Lifted and geometric differentiability of the squared quadratic ...

https://hal-enpc.archives-ouvertes.fr › ...

In this paper, we remark that any optimal coupling for the quadratic Wasserstein distance W_2^2(μ, ν) between two probability measures μ and ν with finite second ...

2020 online Cover Image

Lifted and geometric differentiability of the squared quadratic Wasserstein distance

by Alfonsi, Aurélien; Jourdain, Benjamin

Probability and statistics, 2020

In this paper, we remark that any optimal coupling for the quadratic Wasserstein distance W_2^2(μ, ν) between two probability measures μ and ν with finite second...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

[PDF] arxiv.org

Social-wagdat: Interaction-aware trajectory prediction via wasserstein graph double-attention network

J Li, H Ma, Z Zhang, M Tomizuka - arXiv preprint arXiv:2002.06241, 2020 - arxiv.org

Effective understanding of the environment and accurate trajectory prediction of surrounding

dynamic obstacles are indispensable for intelligent mobile systems (like autonomous

vehicles and social robots) to achieve safe and high-quality planning when they navigate in …

  Cited by 19 Related articles All 3 versions 



Super Resolution Method for Remote Sensing Images Based on Cascaded Conditional Wasserstein GANs

B Liu, H Li, Y Zhou, Y Peng, A Elazab… - 2020 IEEE 3rd …, 2020 - ieeexplore.ieee.org

High-resolution (HR) remote sensing imagery is quite beneficial for subsequent

interpretation. Obtaining HR images can be achieved by upgrading the imaging device. Yet,

the cost to perform this task is very huge. Thus, it is necessary to obtain HR images from low …

  Related articles

2020 [PDF] mlr.press

Quantitative stability of optimal transport maps and linearization of the 2-wasserstein space

Q Mérigot, A Delalande… - … Conference on Artificial …, 2020 - proceedings.mlr.press

This work studies an explicit embedding of the set of probability measures into a Hilbert

space, defined using optimal transport maps from a reference probability density. This

embedding linearizes to some extent the 2-Wasserstein space and is shown to be bi-Hölder …

  Cited by 16 Related articles All 5 versions 

 patent OPEN ACCESS

Image rapid enhancement method based on Wasserstein distance

by QI SHUANG; WU SHANHONG; FENG JIANGFAN

07/2020

The invention relates to an image rapid enhancement method based on a Wasserstein distance, and belongs to the field of computer vision. The method comprises...

PatentCitation Online

 patent OPEN ACCESS

Depth domain adaptive image classification method based on Wasserstein distance

by WU QIANG; LIU JU; SUN SHUANG ; More...

07/2020

The invention provides a Wasserstein distance-based depth domain adaptive image classification method and apparatus, and a computer readable storage medium....

PatentCitation Online


[PDF] arxiv.org

The Wasserstein-Fourier distance for stationary time series

E Cazelles, A Robert, F Tobar - IEEE Transactions on Signal …, 2020 - ieeexplore.ieee.org

… This is particularly useful for Gaussian process (GPs, [21]), as it allows us to define a proper … NPSDs

that are distributions supported on R^d, d > 1, and for which the Wasserstein distance can … However,

in most time series applications presented here, such as the experiments in Sec …

  Cited by 3 Related articles All 4 versions

.<——2020——2020—2920— 


2020 [PDF] arxiv.org

Wasserstein Stability for Persistence Diagrams

P Skraba, K Turner - arXiv preprint arXiv:2006.16824, 2020 - arxiv.org

The stability of persistence diagrams is among the most important results in applied and

computational topology. Most results in the literature phrase stability in terms of the

bottleneck distance between diagrams and the $\infty $-norm of perturbations. This has two …

  Cited by 4 Related articles All 2 versions 


 2020

[PDF] springer.com

Adapted Wasserstein distances and stability in mathematical finance

J Backhoff-Veraguas, D Bartl, M Beiglböck… - Finance and …, 2020 - Springer

Assume that an agent models a financial asset through a measure with the goal to

price/hedge some derivative or optimise some expected utility. Even if the model is

chosen in the most skilful and sophisticated way, the agent is left with the possibility that  …

  Cited by 21 Related articles All 12 versions

[CITATION] Adapted wasserstein distances and stability in mathematical finance. arXiv e-prints, page

J Backhoff-Veraguas, D Bartl, M Beiglböck, M Eder - arXiv preprint arXiv:1901.07450, 2019

  Cited by 4 Related articles


2020 [PDF] arxiv.org

Virtual persistence diagrams, signed measures, and Wasserstein distance

P Bubenik, A Elchesen - arXiv preprint arXiv:2012.10514, 2020 - arxiv.org

Persistence diagrams, an important summary in topological data analysis, consist of a set of

ordered pairs, each with positive multiplicity. Persistence diagrams are obtained via Mobius

inversion and may be compared using a one-parameter family of metrics called Wasserstein  …

  Related articles All 2 versions 


2020 [PDF] arxiv.org

High-precision Wasserstein barycenters in polynomial time

JM Altschuler, E Boix-Adsera - arXiv preprint arXiv:2006.08012, 2020 - arxiv.org

Computing Wasserstein barycenters is a fundamental geometric problem with widespread

applications in machine learning, statistics, and computer graphics. However, it is unknown

whether Wasserstein barycenters can be computed in polynomial time, either exactly or to …

  Cited by 1 Related articles All 3 versions 


2020

Horo-functions associated to atom sequences on the Wasserstein space

G Zhu, H Wu, X Cui - Archiv der Mathematik, 2020 - Springer

On the Wasserstein space over a complete, separable, non-compact, locally compact length

space, we consider the horo-functions associated to sequences of atomic measures. We

show the existence of co-rays for any prescribed initial probability measure with respect to a …

  Related articles


2020


[PDF] arxiv.org

Spectral Unmixing With Multinomial Mixture Kernel and Wasserstein Generative Adversarial Loss

S Ozkan, GB Akar - arXiv preprint arXiv:2012.06859, 2020 - arxiv.org

This study proposes a novel framework for spectral unmixing by using 1D convolution

kernels and spectral uncertainty. High-level representations are computed from data, and

they are further modeled with the Multinomial Mixture Model to estimate fractions under …

  Related articles All 2 versions 


 2020

S2a: Wasserstein gan with spatio-spectral laplacian attention for multi-spectral band synthesis

L Rout, I Misra, SM Moorthi… - Proceedings of the IEEE …, 2020 - openaccess.thecvf.com

Intersection of adversarial learning and satellite image processing is an emerging field in

remote sensing. In this study, we intend to address synthesis of high resolution multi-spectral

satellite imagery using adversarial learning. Guided by the discovery of attention …

  Cited by 3 Related articles All 9 versions 


 2020

De novo protein design for novel folds using guided conditional Wasserstein generative adversarial networks

M Karimi, S Zhu, Y Cao, Y Shen - Journal of Chemical Information …, 2020 - ACS Publications

Although massive data is quickly accumulating on protein sequence and structure, there is a

small and limited number of protein architectural types (or structural folds). This study is

addressing the following question: how well could one reveal underlying sequence …

  Cited by 4 Related articles All 5 versions

[PDF] biorxiv.org


Wasserstein loss-based deep object detection

Y Han, X Liu, Z Sheng, Y Ren, X Han… - Proceedings of the …, 2020 - openaccess.thecvf.com

Object detection locates the objects with bounding boxes and identifies their classes, which

is valuable in many computer vision applications (eg autonomous driving). Most existing

deep learning-based methods output a probability vector for instance classification trained

with the one-hot label. However, the limitation of these models lies in attribute perception

because they do not take the severity of different misclassifications into consideration. In this

paper, we propose a novel method based on the Wasserstein distance called Wasserstein …

Wasserstein Loss based Deep Object Detection

by Han, Yuzhuo; Liu, Xiaofeng; Sheng, Zhenfei ; More...

2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 06/2020

Object detection locates the objects with bounding boxes and identifies their classes, which is valuable in many computer vision applications (e.g. autonomous...

Conference ProceedingCitation Online
Cited by 16 Related articles All 5 versions

A Sliced Wasserstein Loss for Neural Texture Synthesis ...

https://paperswithcode.com › paper › pitfalls-of-the-gra...

Jun 12, 2020 — A Sliced Wasserstein Loss for Neural Texture Synthesis ... of a convolutional neural network optimized for object recognition (e.g. VGG-19).     

online  OPEN ACCESS

A Sliced Wasserstein Loss for Neural Texture Synthesis

by Heitz, Eric; Vanhoey, Kenneth; Chambon, Thomas ; More...

06/2020

IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2021. We address the problem of computing a textural loss based on the statistics...

Journal ArticleFull Text Online
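
For readers unfamiliar with the sliced loss referenced in the two entries above, here is a generic sketch of a sliced 2-Wasserstein distance between two feature point clouds (random 1-D projections, then sorting); it assumes pre-extracted features (e.g. VGG-19 activations) and is not the authors' implementation.

```python
# Monte Carlo sliced 2-Wasserstein distance between two point clouds of equal size.
import numpy as np

def sliced_wasserstein_sq(X, Y, n_projections=64, rng=None):
    """Estimate of the squared sliced W2 between point clouds X, Y of shape (n, d)."""
    rng = np.random.default_rng() if rng is None else rng
    d = X.shape[1]
    total = 0.0
    for _ in range(n_projections):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)                   # random direction on the sphere
        px, py = np.sort(X @ theta), np.sort(Y @ theta)  # 1-D optimal transport = sorting
        total += np.mean((px - py) ** 2)
    return total / n_projections

rng = np.random.default_rng(1)
X = rng.normal(0.0, 1.0, size=(512, 8))   # e.g. flattened feature activations
Y = rng.normal(0.5, 1.0, size=(512, 8))
print(sliced_wasserstein_sq(X, Y, rng=rng))
```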

.<——2020——2020—2930—

Wasserstein barycenters can be computed in polynomial time ...

https://ui.adsabs.harvard.edu › abs › abstract

by JM Altschuler · 2020 · Cited by 1 — This paper answers these questions in the affirmative for any fixed dimension. Ou

2020 online  OPEN ACCESS

Wasserstein barycenters can be computed in polynomial time in fixed dimension

by Altschuler, Jason M; Boix-Adsera, Enric

06/2020

Computing Wasserstein barycenters is a fundamental geometric problem with widespread applications in machine learning, statistics, and computer graphics....

Journal ArticleFull Text Online
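
The barycenter papers above concern the hard fixed-dimension case; on the real line the 2-Wasserstein barycenter is classically available in closed form by averaging quantile functions, which the short sketch below illustrates (toy Gaussian samples, illustrative grid size).

```python
# 1-D W2 barycenter: the barycenter's quantile function is the weighted
# average of the input quantile functions. Toy sketch only.
import numpy as np

def barycenter_1d(samples, weights, grid_size=200):
    """Approximate the W2 barycenter of 1-D samples by averaging empirical quantiles."""
    qs = np.linspace(0.005, 0.995, grid_size)              # quantile levels
    quantiles = np.stack([np.quantile(s, qs) for s in samples])
    return weights @ quantiles                              # barycenter's quantiles

rng = np.random.default_rng(0)
samples = [rng.normal(-2, 1, 1000), rng.normal(3, 2, 1000)]
bary_q = barycenter_1d(samples, weights=np.array([0.5, 0.5]))
print(bary_q[:5])  # quantiles of a distribution roughly centred at 0.5
```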

2020

Illegible Text to Readable Text: An Image-to ... - IEEE Xplore

https://ieeexplore.ieee.org › document

by M Karimi · 2020 · Cited by 1 — Illegible Text to Readable Text: An Image-to-Image Transformation using Conditional Sliced Wasserstein Adversarial Networks. Abstract: Automatic text ...

Illegible Text to Readable Text: An Image-to-Image Transformation using Conditional Sliced Wasserstein Adversarial Networks

by Karimi, Mostafa; Veni, Gopalkrishna; Yu, Yen-Yun

2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 06/2020

Automatic text recognition from ancient handwritten record images is an important problem in the genealogy domain. However, critical challenges such as varying...

Conference ProceedingCitation Online

Cited by 3 Related articles All 7 versions

[PDF] arxiv.org

Ranking IPCC Models Using the Wasserstein Distance

G Vissio, V Lembo, V Lucarini, M Ghil - arXiv preprint arXiv:2006.09304, 2020 - arxiv.org

We propose a methodology for evaluating the performance of climate models based on the

use of the Wasserstein distance. This distance provides a rigorous way to measure

quantitatively the difference between two probability distributions. The proposed approach is …

… benchmarks based on the use of the Wasserstein distance (WD). This distance …

 Cited by 14 Related articles All 11 versions


2020 [PDF] researchgate.net

[PDF] Ranking IPCC Model Performance Using the Wasserstein Distance

G Vissio, V Lembo, V Lucarini… - arXiv preprint arXiv …, 2020 - researchgate.net

We propose a methodology for intercomparing climate models and evaluating their

performance against benchmarks based on the use of the Wasserstein distance (WD). This

distance provides a rigorous way to measure quantitatively the difference between two …

  Related articles 
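
A hedged sketch of the ranking idea in these two entries: score each model by the 1-D Wasserstein distance between its output distribution and a reference distribution, then sort. The synthetic arrays and model names below are stand-ins, not data from the paper.

```python
# Rank candidate models by 1-D Wasserstein distance to a reference distribution.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(42)
reference = rng.normal(14.0, 1.0, 5000)               # "observed" variable, e.g. temperatures
models = {
    "model_A": rng.normal(14.1, 1.1, 5000),           # small bias, similar spread
    "model_B": rng.normal(15.0, 0.7, 5000),           # warm bias, too little spread
    "model_C": rng.normal(13.0, 2.0, 5000),           # cold bias, too much spread
}
scores = {name: wasserstein_distance(out, reference) for name, out in models.items()}
for name, wd in sorted(scores.items(), key=lambda kv: kv[1]):
    print(f"{name}: WD = {wd:.3f}")                    # smaller = closer to the reference
```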


[PDF] biorxiv.org

Gromov-Wasserstein optimal transport to align single-cell multi-omics data

P Demetci, R Santorella, B Sandstede, WS Noble… - BioRxiv, 2020 - biorxiv.org

Data integration of single-cell measurements is critical for understanding cell development

and disease, but the lack of correspondence between different types of measurements

makes such efforts challenging. Several unsupervised algorithms can align heterogeneous …

  Cited by 10 Related articles All 3 versions 

2020


2020 [PDF] arxiv.org

Graph Wasserstein Correlation Analysis for Movie Retrieval

X Zhang, T Zhang, X Hong, Z Cui, J Yang - European Conference on …, 2020 - Springer

Movie graphs play an important role to bridge heterogenous modalities of videos and texts

in human-centric retrieval. In this work, we propose Graph Wasserstein Correlation Analysis

(GWCA) to deal with the core issue therein, ie, cross heterogeneous graph comparison …

  Related articles All 5 versions


2020 [PDF] arxiv.org

Equidistribution of random walks on compact groups II. The Wasserstein metric

B Borda - arXiv preprint arXiv:2004.14089, 2020 - arxiv.org

We consider a random walk $ S_k $ with iid steps on a compact group equipped with a bi-

invariant metric. We prove quantitative ergodic theorems for the sum $\sum_ {k= 1}^ N f

(S_k) $ with Hölder continuous test functions $ f $, including the central limit theorem, the …

  Cited by 1 Related articles All 2 versions 


2020 [PDF] researchgate.net

Data supplement for a soft sensor using a new generative model based on a variational autoencoder and Wasserstein GAN

X Wang, H Liu - Journal of Process Control, 2020 - Elsevier

In industrial process control, measuring some variables is difficult for environmental or cost

reasons. This necessitates employing a soft sensor to predict these variables by using the

collected data from easily measured variables. The prediction accuracy and computational …

  Cited by 8 Related articles All 3 versions


[PDF] arxiv.org

On the Wasserstein distance for a martingale central limit theorem

X Fan, X Ma - Statistics & Probability Letters, 2020 - Elsevier

… On the Wasserstein distance for a martingale central limit theorem … Previous article in issue;

Next article in issue. MSC. 60G42. 60E15. 60F25. Keywords. Martingales. Central limit theorem.

Wasserstein metric. 1. Introduction and main result. Let n ≥ 1. Assume that X = (X_i)_{1 ≤ i ≤ n} …

  Related articles All 8 versions


Discrete Wasserstein Autoencoders for Document Retrieval

Y Zhang, H Zhu - … 2020-2020 IEEE International Conference on …, 2020 - ieeexplore.ieee.org

Learning to hash via generative models has became a promising paradigm for fast similarity

search in document retrieval. The binary hash codes are treated as Bernoulli latent variables

when training a variational autoencoder (VAE). However, the prior of discrete distribution (ie,

Bernoulli distribution) is short of some structure regularization to generate more effi-cient

binary codes. In this paper, we present an end-to-end Wasserstein Autoencoder (WAE) for

text hashing to avoid in-differentiable operators in the reparameterization trick, where the …

  Related articles

online

Discrete Wasserstein Autoencoders for Document Retrieval

by Zhang, Yifei; Zhu, Hao

ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 05/2020

Learning to hash via generative models has became a promising paradigm for fast similarity search in document retrieval. The binary hash codes are treated as...

Conference ProceedingFull Text Online

 ——2020——2020—2940— 

   

 

2020 patent OPEN ACCESS

Method for embedding and clustering depth self-coding based on Sliced-Wasserstein distance

by CHEN HUAHUA; YING NA; GUO CHUNSHENG ; More...

05/2020

The invention discloses a deep self-encoding embedding clustering method based on a Sliced-Wasserstein distance. The method comprises the following steps: S11,...

PatentCitation Online


2020 patent OPEN ACCESS

一种基于Wasserstein GAN的光伏阵列故障诊断方法

[Chinese: A photovoltaic array fault diagnosis method based on Wasserstein GAN]

05/2020

The invention relates to a photovoltaic array fault diagnosis method based on Wasserstein GAN. First, time-series current and voltage data of the photovoltaic array are collected; the acquired time-series current and voltage data are then plotted as curves and saved as samples; the discriminator D and the generator G of the Wasserstein GAN network are then designed; the Wasserstein...

PatentCitation Online


[PDF] mlr.press

Robust Document Distance with Wasserstein-Fisher-Rao metric

Z Wang, D Zhou, M Yang, Y Zhang… - Asian Conference on …, 2020 - proceedings.mlr.press

Computing the distance among linguistic objects is an essential problem in natural

language processing. The word mover's distance (WMD) has been successfully applied to

measure the document distance by synthesizing the low-level word similarity with the …

  Cited by 1 Related articles 


[PDF] arxiv.org

Graph Wasserstein Correlation Analysis for Movie Retrieval

X Zhang, T Zhang, X Hong, Z Cui, J Yang - European Conference on …, 2020 - Springer

Movie graphs play an important role to bridge heterogenous modalities of videos and texts

in human-centric retrieval. In this work, we propose Graph Wasserstein Correlation Analysis

(GWCA) to deal with the core issue therein, ie, cross heterogeneous graph comparison …

  Related articles All 5 versions

 

[PDF] arxiv.org

Wasserstein statistics in 1D location-scale model

S Amari - arXiv preprint arXiv:2003.05479, 2020 - arxiv.org

Wasserstein geometry and information geometry are two important structures introduced in a

manifold of probability distributions. The former is defined by using the transportation cost

between two distributions, so it reflects the metric structure of the base manifold on which …

  Cited by 1 Related articles All 2 versions 


2020

 

FRWCAE: joint faster-RCNN and Wasserstein convolutional auto-encoder for instance retrieval

Y Zhang, Y Feng, D Liu, J Shang, B Qiang - Applied Intelligence, 2020 - Springer

Based on the powerful feature extraction capability of deep convolutional neural networks,

image-level retrieval methods have achieved superior performance compared to the hand-

crafted features and indexing algorithms. However, people tend to focus on foreground …

  Related articles

[CITATION] Frwcae: joint faster-rcnn and wasserstein convolutional auto-encoder for instance retrieval

Z Yy, Y Feng, L Dj, S Jx, Q Bh - Applied Intelligence, 2020

  Cited by 2 Related articles


[PDF] upenn.edu

Optimal Estimation of Wasserstein Distance on a Tree With an Application to Microbiome Studies

S Wang, TT Cai, H Li - Journal of the American Statistical …, 2020 - Taylor & Francis

The weighted UniFrac distance, a plug-in estimator of the Wasserstein distance of read

counts on a tree, has been widely used to measure the microbial community difference in

microbiome studies. Our investigation however shows that such a plug-in estimator …

  Related articles All 4 versions


Discrete Wasserstein Autoencoders for Document Retrieval

Y Zhang, H Zhu - … 2020-2020 IEEE International Conference on …, 2020 - ieeexplore.ieee.org

Learning to hash via generative models has became a promising paradigm for fast similarity

search in document retrieval. The binary hash codes are treated as Bernoulli latent variables

when training a variational autoencoder (VAE). However, the prior of discrete distribution (ie …

  Related articles


Generating Hyperspectral Data Based on 3D CNN and Improved Wasserstein Generative Adversarial Network Using Homemade High-resolution Datasets

Y Li, D Huang - Proceedings of the International Conference on …, 2020 - dl.acm.org

Hyperspectral images contain rich information on the fingerprints of materials and are being

popularly used in the exploration of oil and gas, environmental monitoring, and remote

sensing. Since hyperspectral images cover a wide range of wavelengths with high …

  Related articles


Wasserstein upper bounds of the total variation for smooth densities

M Chae, SG Walker - Statistics & Probability Letters, 2020 - Elsevier

The total variation distance between probability measures cannot be bounded by the

Wasserstein metric in general. If we consider sufficiently smooth probability densities,

however, it is possible to bound the total variation by a power of the Wasserstein distance …

  Cited by 3 Related articles All 5 versions

<——2020——2020—2950—


Wasserstein based transfer network for cross-domain sentiment classification

Y Du, M He, L Wang, H Zhang - Knowledge-Based Systems, 2020 - Elsevier

Automatic sentiment analysis of social media texts is of great significance for identifying

people's opinions that can help people make better decisions. Annotating data is time

consuming and laborious, and effective sentiment analysis on domains lacking of labeled …

  Cited by 3 Related articles All 2 versions


[PDF] arxiv.org

Online Stochastic Convex Optimization: Wasserstein Distance Variation

I Shames, F Farokhi - arXiv preprint arXiv:2006.01397, 2020 - arxiv.org

Distributionally-robust optimization is often studied for a fixed set of distributions rather than

time-varying distributions that can drift significantly over time (which is, for instance, the case

in finance and sociology due to underlying expansion of economy and evolution of …

  Related articles All 3 versions 


[PDF] arxiv.org

Stein factors for variance-gamma approximation in the Wasserstein and Kolmogorov distances

RE Gaunt - arXiv preprint arXiv:2008.06088, 2020 - arxiv.org

We obtain new bounds for the solution of the variance-gamma (VG) Stein equation that are

of the correct form for approximations in terms of the Wasserstein and Kolmogorov metrics.

These bounds hold for all parameters values of the four parameter VG class. As an …

  Cited by 4 Related articles All 3 versions 


A Linear Programming Approximation of Distributionally Robust Chance-Constrained Dispatch With Wasserstein Distance

A Zhou, M Yang, M Wang… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

This paper proposes a data-driven distributionally robust chance constrained real-time

dispatch (DRCC-RTD) considering renewable generation forecasting errors. The proposed

DRCC-RTD model minimizes the expected quadratic cost function and guarantees that the …

  Cited by 7 Related articles All 2 versions


TPFA Finite Volume Approximation of Wasserstein Gradient Flows

A Natale, G Todeschi - International Conference on Finite Volumes for …, 2020 - Springer

Numerous infinite dimensional dynamical systems arising in different fields have been

shown to exhibit a gradient flow structure in the Wasserstein space. We construct Two Point

Flux Approximation Finite Volume schemes discretizing such problems which preserve the …

Cited by 3 Related articles All 6 versions


2020


[PDF] arxiv.org

Wasserstein Generative Models for Patch-based Texture Synthesis

A Houdard, A Leclaire, N Papadakis… - arXiv preprint arXiv …, 2020 - arxiv.org

In this paper, we propose a framework to train a generative model for texture image

synthesis from a single example. To do so, we exploit the local representation of images via

the space of patches, that is, square sub-images of fixed size (eg $4\times 4$). Our main …

  Cited by 1 Related articles All 10 versions 


[PDF] arxiv.org

Chance-Constrained Set Covering with Wasserstein Ambiguity

H Shen, R Jiang - arXiv preprint arXiv:2010.05671, 2020 - arxiv.org

We study a generalized distributionally robust chance-constrained set covering problem

(DRC) with a Wasserstein ambiguity set, where both decisions and uncertainty are binary-

valued. We establish the NP-hardness of DRC and recast it as a two-stage stochastic …

  Cited by 1 Related articles All 2 versions 


[PDF] arxiv.org

Data-Driven Approximation of the Perron-Frobenius Operator Using the Wasserstein Metric

A Karimi, TT Georgiou - arXiv preprint arXiv:2011.00759, 2020 - arxiv.org

This manuscript introduces a regression-type formulation for approximating the Perron-

Frobenius Operator by relying on distributional snapshots of data. These snapshots may

represent densities of particles. The Wasserstein metric is leveraged to define a suitable …

  Related articles All 3 versions 


[PDF] arxiv.org

Approximation rate in Wasserstein distance of probability measures on the real line by deterministic empirical measures

O Bencheikh, B Jourdain - arXiv preprint arXiv:2012.09729, 2020 - arxiv.org

We are interested in the approximation in Wasserstein distance with index $\rho\ge 1$ of a

probability measure $\mu $ on the real line with finite moment of order $\rho $ by the

empirical measure of $ N $ deterministic points. The minimal error converges to $0 $ as …

  Related articles All 3 versions 


[PDF] arxiv.org

Hierarchical Low-Rank Approximation of Regularized Wasserstein distance

M Motamed - arXiv preprint arXiv:2004.12511, 2020 - arxiv.org

Sinkhorn divergence is a measure of dissimilarity between two probability measures. It is

obtained through adding an entropic regularization term to Kantorovich's optimal transport

problem and can hence be viewed as an entropically regularized Wasserstein distance …

  Related articles All 3 versions
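
Since this entry builds on the entropically regularized (Sinkhorn) formulation of optimal transport, a minimal textbook Sinkhorn sketch is included below for orientation; it is the standard alternating-scaling scheme, not the paper's hierarchical low-rank method, and the histograms are toy examples.

```python
# Standard Sinkhorn iterations for entropically regularized optimal transport.
import numpy as np

def sinkhorn_cost(a, b, C, eps=0.05, n_iter=500):
    """Entropic OT cost <P, C> between histograms a, b with cost matrix C."""
    K = np.exp(-C / eps)                      # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)                     # match second marginal
        u = a / (K @ v)                       # match first marginal
    P = u[:, None] * K * v[None, :]           # transport plan
    return float(np.sum(P * C))

x = np.linspace(0, 1, 50)
a = np.exp(-((x - 0.3) ** 2) / 0.01); a /= a.sum()    # source histogram
b = np.exp(-((x - 0.7) ** 2) / 0.02); b /= b.sum()    # target histogram
C = (x[:, None] - x[None, :]) ** 2                    # squared-distance cost
print(sinkhorn_cost(a, b, C))                          # roughly (0.7 - 0.3)^2 for small eps
```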

<——2020——2020—2960—  


[PDF] arxiv.org

An LP-based, strongly-polynomial 2-approximation algorithm for sparse Wasserstein barycenters

S Borgwardt - Operational Research, 2020 - Springer

Discrete Wasserstein barycenters correspond to optimal solutions of transportation problems

for a set of probability measures with finite support. Discrete barycenters are measures with

finite support themselves and exhibit two favorable properties: there always exists one with a …

  Cited by 4 Related articles All 3 versions


[PDF] arxiv.org

Stein's method for normal approximation in Wasserstein distances with application to the multivariate Central Limit Theorem

T Bonis - Probability Theory and Related Fields, 2020 - Springer

We use Stein's method to bound the Wasserstein distance of order 2 between a

measure\(\nu\) and the Gaussian measure using a stochastic process\((X_t) _ {t\ge 0}\) such

that\(X_t\) is drawn from\(\nu\) for any\(t> 0\). If the stochastic process\((X_t) _ {t\ge 0}\) …

  Cited by 8 Related articles All 3 versions


[PDF] arxiv.org

On Stein's factors for Poisson approximation in Wasserstein distance with non-linear transportation costs

ZW Liao, Y Ma, A Xia - arXiv preprint arXiv:2003.13976, 2020 - arxiv.org

We establish various bounds on the solutions to a Stein equation for Poisson approximation

in Wasserstein distance with non-linear transportation costs. The proofs are a refinement of

those in [Barbour and Xia (2006)] using the results in [Liu and Ma (2009)]. As a corollary, we …

  Related articles All 2 versions 


Stochastic Approximation versus Sample Average Approximation for population Wasserstein barycenters

D Dvinskikh - arXiv e-prints, 2020 - ui.adsabs.harvard.edu

In machine learning and optimization community there are two main approaches for convex

risk minimization problem, namely, the Stochastic Approximation (SA) and the Sample

Average Approximation (SAA). In terms of oracle complexity (required number of stochastic …

Cited by 4 Related articles All 3 versions 

[CITATION] Stochastic approximation versus sample average approximation for population Wasserstein barycenter calculation. arXiv e-prints, art

D Dvinskikh - arXiv preprint arXiv:2001.07697, 2020


[PDF] arxiv.org

Partial Gromov-Wasserstein Learning for Partial Graph Matching

W Liu, C Zhang, J Xie, Z Shen, H Qian… - arXiv preprint arXiv …, 2020 - arxiv.org

Graph matching finds the correspondence of nodes across two graphs and is a basic task in

graph-based machine learning. Numerous existing methods match every node in one graph

to one node in the other graph whereas two graphs usually overlap partially in …

  Related articles All 4 versions 

2020


2020   

Remaining useful life prediction of lithium-ion batteries using a fusion method based on Wasserstein GAN

周温丁, 鲍士兼, 许方敏, 赵成林 - The Journal of China Universities of Posts and Telecommunications (English edition), 2020 - jcupt.bupt.edu.cn

Lithium-ion batteries are the main power supply equipment in many fields due to their

advantages of no memory, high energy density, long cycle life and no pollution to the

environment. Accurate prediction for the remaining useful life (RUL) of lithium-ion batteries …

  All 2 versions 


[C] Remaining useful life prediction of lithium-ion batteries using a fusion method based on Wasserstein GAN

周温丁, 鲍士兼, 许方敏… - The Journal of China Universities of Posts and Telecommunications, 2020 - journal13.magtechjournal.com

Abstract Lithium-ion batteries are the main power supply equipment in many fields due to their advantages of no memory, high energy density, long cycle life and no pollution to the environment. Accurate prediction for the remaining useful life (RUL) of lithium-ion batteries can …

[CITATION] Remaining useful life prediction of lithium-ion batteries using a fusion method based on Wasserstein GAN

周温丁, 鲍士兼, 许方敏, 赵成林 - The Journal of China Universities of Posts and Telecommunications (English edition), 2020 - jcupt.bupt.edu.cn

Abstract Lithium-ion batteries are the main power supply equipment in many fields due to their

advantages of no memory, high energy density, long cycle life and no pollution to the

environment. Accurate prediction for the remaining useful life (RUL) of lithium-ion batteries can …

  All 2 versions


 2020 [PDF] arxiv.org

Wasserstein Collaborative Filtering for Item Cold-start Recommendation

Y Meng, X Yan, W Liu, H Wu, J Cheng - … of the 28th ACM Conference on …, 2020 - dl.acm.org

Item cold-start recommendation, which predicts user preference on new items that have no

user interaction records, is an important problem in recommender systems. In this paper, we

model the disparity between user preferences on warm items (those having interaction …

  Cited by 4 Related articles All 4 versions


 2020 [PDF] arxiv.org

The Unbalanced Gromov Wasserstein Distance: Conic Formulation and Relaxation

T Séjourné, FX Vialard, G Peyré - arXiv preprint arXiv:2009.04266, 2020 - arxiv.org

Comparing metric measure spaces (ie a metric space endowed with a probability

distribution) is at the heart of many machine learning problems. This includes for instance

predicting properties of molecules in quantum chemistry or generating graphs with varying …

  Cited by 5 Related articles All 2 versions 


2020

Remaining useful life prediction of lithium-ion batteries using a fusion method based on Wasserstein GAN

周温丁, 鲍士兼, 许方敏, 赵成林 - The Journal of China Universities of Posts and Telecommunications (English edition), 2020 - jcupt.bupt.edu.cn

Lithium-ion batteries are the main power supply equipment in many fields due to their

advantages of no memory, high energy density, long cycle life and no pollution to the

environment. Accurate prediction for the remaining useful life (RUL) of lithium-ion batteries …

  All 2 versions 


2020 [PDF] ieee.org

Robust multivehicle tracking with wasserstein association metric in surveillance videos

Y Zeng, X Fu, L Gao, J Zhu, H Li, Y Li - IEEE Access, 2020 - ieeexplore.ieee.org

Vehicle tracking based on surveillance videos is of great significance in the highway traffic

monitoring field. In real-world vehicle-tracking applications, partial occlusion and objects

with similarly appearing distractors pose significant challenges. For addressing the above …

  Cited by 4 Related articles

<——2020——2020—2970—  


[PDF] arxiv.org

Wasserstein distributionally robust look-ahead economic dispatch

BK Poolla, AR Hota, S Bolognani… - … on Power Systems, 2020 - ieeexplore.ieee.org

We consider the problem of look-ahead economic dispatch (LAED) with uncertain

renewable energy generation. The goal of this problem is to minimize the cost of

conventional energy generation subject to uncertain operational constraints. These …

  Cited by 3 Related articles All 3 versions

 

[PDF] arxiv.org

Wasserstein statistics in 1D location-scale model

S Amari - arXiv preprint arXiv:2003.05479, 2020 - arxiv.org

Wasserstein geometry and information geometry are two important structures introduced in a

manifold of probability distributions. The former is defined by using the transportation cost

between two distributions, so it reflects the metric structure of the base manifold on which …

  Cited by 1 Related articles All 2 versions 


[PDF] arxiv.org

A wasserstein-type distance in the space of gaussian mixture models

J Delon, A Desolneux - SIAM Journal on Imaging Sciences, 2020 - SIAM

In this paper we introduce a Wasserstein-type distance on the set of Gaussian mixture

models. This distance is defined by restricting the set of possible coupling measures in the

optimal transport problem to Gaussian mixture models. We derive a very simple discrete …

  Cited by 17 Related articles All 7 versions
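
The mixture distance above reuses the classical closed form for the 2-Wasserstein distance between two Gaussians; that standard formula is sketched below (generic implementation, toy means and covariances, not the paper's code).

```python
# Closed-form W2 between Gaussians (Bures metric between covariances).
import numpy as np
from scipy.linalg import sqrtm

def w2_gaussians(m1, C1, m2, C2):
    """W2^2( N(m1,C1), N(m2,C2) ) = |m1-m2|^2 + Tr(C1 + C2 - 2 (C1^1/2 C2 C1^1/2)^1/2)."""
    s1 = sqrtm(C1)
    cross = sqrtm(s1 @ C2 @ s1)
    bures = np.trace(C1 + C2 - 2.0 * np.real(cross))   # drop tiny imaginary round-off
    return float(np.sum((m1 - m2) ** 2) + bures)

m1, C1 = np.zeros(2), np.eye(2)
m2, C2 = np.array([1.0, 0.0]), np.diag([2.0, 0.5])
print(w2_gaussians(m1, C1, m2, C2))
```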

2020 [PDF] aaai.org

Solving general elliptical mixture models through an approximate Wasserstein manifold

S Li, Z Yu, M Xiang, D Mandic - Proceedings of the AAAI Conference on …, 2020 - ojs.aaai.org

We address the estimation problem for general finite mixture models, with a particular focus

on the elliptical mixture models (EMMs). Compared to the widely adopted Kullback–Leibler

divergence, we show that the Wasserstein distance provides a more desirable optimisation …

  Cited by 2 Related articles All 4 versions 


2020 [PDF] arxiv.org

Spectral Unmixing With Multinomial Mixture Kernel and Wasserstein Generative Adversarial Loss

S Ozkan, GB Akar - arXiv preprint arXiv:2012.06859, 2020 - arxiv.org

This study proposes a novel framework for spectral unmixing by using 1D convolution

kernels and spectral uncertainty. High-level representations are computed from data, and

they are further modeled with the Multinomial Mixture Model to estimate fractions under …

  Related articles All 2 versions 


 

2020

year 2020 [PDF] amazonaws.com

[PDF] INFORMS Optimization Society 2020 Young Researchers Prize Expository Article: On Distributionally Robust Chance Constrained Programs with Wasserstein …

W Xie - higherlogicdownload.s3.amazonaws …

I am truly honored and grateful to be awarded the 2020 INFORMS Optimization Society Young

Researcher Prize for the work “On Distributionally Robust Chance Constrained Program with

Wasserstein Distance.” I would like to thank the committee members (Prof. Sam Burer, Prof. Hande …


2020

Wasserstein cycle-consistent generative adversarial network for improved seismic impedance inversion: Example on 3D SEAM model

A Cai, H Di, Z Li, H Maniar, A Abubakar - SEG Technical Program …, 2020 - library.seg.org

The convolutional neural networks (CNNs) have attracted great attentions in seismic

exploration applications by their capability of learning the representations of data with

multiple level of abstractions, given an adequate amount of labeled data. In seismic …

Cited by 10 Related articles All 2 versions

[CITATION] Wasserstein cycle-consistent generative adversarial network for improved seismic impedance inversion: Example on 3D SEAM model: 90th Annual …

A Cai, H Di, Z Li, H Maniar, A Abubakar - 2020 - Abstract

 Cited by 10 Related articles All 2 versions

year 2020 [PDF] kweku.me

[PDF] Measuring Bias with Wasserstein Distance

K Kwegyir-Aggrey, SM Brown - kweku.me

In fair classification, we often ask: "what does it mean to be fair, and how is fairness

measured?" Previous approaches to defining and enforcing fairness rely on a set of

statistical fairness definitions, with each definition providing its own unique measurement of …

  Related articles 
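
As a hedged illustration of the idea in this entry, the snippet below summarizes the disparity between a model's score distributions for two groups by their 1-D Wasserstein-1 distance (0 would mean identical score distributions); the group scores are synthetic stand-ins, not the authors' data or measure.

```python
# Bias summarized as the W1 distance between per-group score distributions.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(7)
scores_group_a = rng.beta(2, 5, 2000)      # model scores for group A (toy)
scores_group_b = rng.beta(3, 4, 2000)      # model scores for group B (toy)
bias = wasserstein_distance(scores_group_a, scores_group_b)
print(f"W1 disparity between group score distributions: {bias:.4f}")
```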

2020 [PDF] unifi.it

[PDF] Pattern-Based Music Generation with Wasserstein Autoencoders and PRC Descriptions

V Borghuis, L Angioloni, L Brusci… - 29th International Joint …, 2020 - flore.unifi.it

We present a pattern-based MIDI music generation system with a generation strategy based

on Wasserstein autoencoders and a novel variant of pianoroll descriptions of patterns which

employs separate channels for note velocities and note durations and can be fed into classic …

  Related articles All 4 versions 


2020 [PDF] sabanciuniv.edu

Cyclic Adversarial Framework with Implicit Autoencoder and Wasserstein Loss (CAFIAWL)

E Bonabi Mobaraki - 2020 - research.sabanciuniv.edu

Since the day that the Simple Perceptron was invented, Artificial Neural Networks (ANNs)

attracted many researchers. Technological improvements in computers and the internet

paved the way for unseen computational power and an immense amount of data that …

  Related articles

<——2020——2020—2980—  


2020 [PDF] iop.org

A collaborative filtering recommendation framework based on Wasserstein GAN

R Li, F Qian, X Du, S Zhao… - Journal of Physics …, 2020 - iopscience.iop.org

Compared with the original GAN, Wasserstein GAN minimizes the Wasserstein Distance

between the generative distribution and the real distribution, can well capture the potential

distribution of data and has achieved excellent results in image generation. However, the …

  Related articles

 

 2020 [PDF] mlr.press

Wasserstein fair classification

R Jiang, A Pacchiano, T Stepleton… - Uncertainty in …, 2020 - proceedings.mlr.press

We propose an approach to fair classification that enforces independence between the

classifier outputs and sensitive information by minimizing Wasserstein-1 distances. The

approach has desirable theoretical properties and is robust to specific choices of the …

 Cited by 76 Related articles All 5 versions 


2020 [PDF] thecvf.com

Gromov-wasserstein averaging in a riemannian framework

S Chowdhury, T Needham - Proceedings of the IEEE/CVF …, 2020 - openaccess.thecvf.com

We introduce a theoretical framework for performing statistical tasks-including, but not

limited to, averaging and principal component analysis-on the space of (possibly

asymmetric) matrices with arbitrary entries and sizes. This is carried out under the lens of the …

  Cited by 10 Related articles All 6 versions 


2020 [PDF] arxiv.org

Fair regression with wasserstein barycenters

E Chzhen, C Denis, M Hebiri, L Oneto… - arXiv preprint arXiv …, 2020 - arxiv.org

We study the problem of learning a real-valued function that satisfies the Demographic

Parity constraint. It demands the distribution of the predicted output to be independent of the

sensitive attribute. We consider the case that the sensitive attribute is available for …

  Cited by 6 Related articles All 4 versions 

2020 [PDF] arxiv.org

Averaging atmospheric gas concentration data using wasserstein barycenters

M Barré, C Giron, M Mazzolini… - arXiv preprint arXiv …, 2020 - arxiv.org

Hyperspectral satellite images report greenhouse gas concentrations worldwide on a daily

basis. While taking simple averages of these images over time produces a rough estimate of

relative emission rates, atmospheric transport means that simple averages fail to pinpoint …

  Cited by 2 Related articles All 3 versions 

2020

[PDF] arxiv.org

Wasserstein-based fairness interpretability framework for machine learning models

A Miroshnikov, K Kotsiopoulos, R Franks… - arXiv preprint arXiv …, 2020 - arxiv.org

In this article, we introduce a fairness interpretability framework for measuring and

explaining bias in classification and regression models at the level of a distribution. In our

work, motivated by the ideas of Dwork et al.(2012), we measure the model bias across sub …

  Cited by 1 Related articles All 2 versions 


Probability forecast combination via entropy regularized wasserstein distance

R Cumings-Menon, M Shin - Entropy, 2020 - mdpi.com

We propose probability and density forecast combination methods that are defined using the

entropy regularized Wasserstein distance. First, we provide a theoretical characterization of

the combined density forecast based on the regularized Wasserstein distance under the …

  Cited by 2 Related articles All 15 versions 

2020 [PDF] arxiv.org

Averaging atmospheric gas concentration data using wasserstein barycenters

M Barré, C Giron, M Mazzolini… - arXiv preprint arXiv …, 2020 - arxiv.org

Hyperspectral satellite images report greenhouse gas concentrations worldwide on a daily

basis. While taking simple averages of these images over time produces a rough estimate of

relative emission rates, atmospheric transport means that simple averages fail to pinpoint …

  Cited by 2 Related articles All 3 versions 


2020

Wasserstein GAN based on Autoencoder with back-translation for cross-lingual embedding mappings

Y Zhang, Y Li, Y Zhu, X Hu - Pattern Recognition Letters, 2020 - Elsevier

Recent works about learning cross-lingual word mappings (CWMs) focus on relaxing the

requirement of bilingual signals through generative adversarial networks (GANs). GANs

based models intend to enforce source embedding space to align target embedding space …

  Cited by 1 Related articles All 2 versions


2020

Chinese font translation with improved Wasserstein generative adversarial network

Y Miao, H Jia, K Tang, Y Ji - Twelfth International Conference …, 2020 - spiedigitallibrary.org

Nowadays, various fonts are applied in many fields, and the generation of multiple fonts by

computer plays an important role in the inheritance, development and innovation of Chinese

culture. Aiming at the existing font generation methods, which have some problems such as …

  Related articles All 2 versions


2020

Precipitation forecasting using machine-learning-based ensemble aggregation with Wasserstein-guided weighting

F O'Donncha, K Dipietro, SC James… - AGU Fall Meeting …, 2020 - ui.adsabs.harvard.edu

Precipitation forecasting is one of the most complex modeling tasks, requiring the resolution

of numerous spatial and temporal patterns that are sensitive to the accurate representation

of many secondary variables (precipitable water column, air humidity, pressure, etc.) …

  All 2 versions

<——2020——2020—2990—


Stochastic equation and exponential ergodicity in Wasserstein distances for affine processes

M Friesen, P Jin, B Rüdiger - Annals of Applied Probability, 2020 - projecteuclid.org

This work is devoted to the study of conservative affine processes on the canonical state

space $ D=\mathbb {R} _ {+}^{m}\times\mathbb {R}^{n} $, where $ m+ n> 0$. We show that

each affine process can be obtained as the pathwise unique strong solution to a stochastic …

  Cited by 9 Related articles All 5 versions


2020

Semi-supervised Surface Wave Tomography with Wasserstein Cycle-consistent GAN: Method and Application on Southern California Plate Boundary Region

A Cai, H Qiu, F Niu - 2020 - essoar.org

Machine learning algorithm is applied to shear wave velocity (Vs) inversion in surface wave

tomography, where a set of 1-D Vs profiles and the corresponding synthetic dispersion

curves are used in network training. Previous studies showed that performances of a trained …

 

2020

Semi-supervised Data-driven Surface Wave Tomography using Wasserstein Cycle-consistent GAN: Application on Southern California Plate Boundary Region

A Cai, H Qiu, F Niu - 2020 - essoar.org

Current machine learning based shear wave velocity (Vs) inversion using surface wave

dispersion measurements utilizes synthetic dispersion curves calculated from existing 3-D

velocity models as training datasets. It is shown in the previous studies that the …

[PDF] researchgate.net

Typical wind power scenario generation for multiple wind farms using conditional improved Wasserstein generative adversarial network

Y Zhang, Q Ai, F Xiao, R Hao, T Lu - … Journal of Electrical Power & Energy …, 2020 - Elsevier

Because of environmental benefits, wind power is taking an increasing role meeting

electricity demand. However, wind power tends to exhibit large uncertainty and is largely

influenced by meteorological conditions. Apart from the variability, when multiple wind farms …

  Cited by 19 Related articles


[PDF] ams.org

On the Wasserstein distance between classical sequences and the Lebesgue measure

L Brown, S Steinerberger - Transactions of the American Mathematical …, 2020 - ams.org

We discuss the classical problem of measuring the regularity of distribution of sets of $ N $

points in $\mathbb {T}^ d $. A recent line of investigation is to study the cost ($= $ mass

$\times $ distance) necessary to move Dirac measures placed on these points to the uniform …

  Cited by 5 Related articles All 4 versions
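
A small numerical companion to this entry, in the simplest setting (the unit interval rather than T^d): the W1 cost of transporting the empirical measure of N points to the Lebesgue measure equals the integral of |F_N(x) − x|, where F_N is the empirical CDF. The two point sets below are familiar examples chosen for illustration, not the sequences analysed in the paper.

```python
# W1 distance on [0,1] between an N-point empirical measure and Lebesgue measure.
import numpy as np

def w1_to_uniform(points, grid=100_000):
    """Integrate |F_N(x) - x| over [0,1] on a fine grid."""
    x = np.linspace(0.0, 1.0, grid)
    F_N = np.searchsorted(np.sort(points), x, side="right") / len(points)
    return np.trapz(np.abs(F_N - x), x)

N = 256
golden = ((1 + 5 ** 0.5) / 2 * np.arange(1, N + 1)) % 1.0   # Kronecker (golden-ratio) sequence
random_pts = np.random.default_rng(0).uniform(size=N)
print(w1_to_uniform(golden), w1_to_uniform(random_pts))      # low-discrepancy points cost less
```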


2020


[PDF] arxiv.org

Symmetric skip connection wasserstein gan for high-resolution facial image inpainting

J Jam, C Kendrick, V Drouard, K Walker… - arXiv preprint arXiv …, 2020 - arxiv.org

The state-of-the-art facial image inpainting methods achieved promising results but face

realism preservation remains a challenge. This is due to limitations such as; failures in

preserving edges and blurry artefacts. To overcome these limitations, we propose a …

  Cited by 4 Related articles All 3 versions 


On the Computation of Kantorovich--Wasserstein Distances Between Two-Dimensional Histograms by Uncapacitated Minimum Cost Flows

F Bassetti, S Gualandi, M Veneroni - SIAM Journal on Optimization, 2020 - SIAM

In this work, we present a method to compute the Kantorovich--Wasserstein distance of

order 1 between a pair of two-dimensional histograms. Recent works in computer vision and

machine learning have shown the benefits of measuring Wasserstein distances of order 1 …

  Cited by 6 Related articles All 2 versions


[PDF] arxiv.org

Entropy-Regularized 2-Wasserstein Distance between Gaussian Measures

A Mallasto, A Gerolin, HQ Minh - arXiv preprint arXiv:2006.03416, 2020 - arxiv.org

Gaussian distributions are plentiful in applications dealing in uncertainty quantification and

diffusivity. They furthermore stand as important special cases for frameworks providing

geometries for probability measures, as the resulting geometry on Gaussians is often …

  Cited by 7 Related articles All 3 versions 


[PDF] arxiv.org

Entropic regularization of Wasserstein distance between infinite-dimensional Gaussian measures and Gaussian processes

MH Quang - arXiv preprint arXiv:2011.07489, 2020 - arxiv.org

This work studies the entropic regularization formulation of the 2-Wasserstein distance on an

infinite-dimensional Hilbert space, in particular for the Gaussian setting. We first present the

Minimum Mutual Information property, namely the joint measures of two Gaussian measures …

  Cited by 3 Related articles All 2 versions 

2020 [PDF] arxiv.org

Global sensitivity analysis and Wasserstein spaces

JC Fort, T Klein, A Lagnoux - arXiv preprint arXiv:2007.12378, 2020 - arxiv.org

Sensitivity indices are commonly used to quantify the relative influence of any specific group of

input variables on the output of a computer code. In this paper, we focus both on computer

codes the output of which is a cumulative distribution function and on stochastic computer …

  Cited by 1 Related articles All 9 versions 

<——2020——2020—3000— 

 

[PDF] arxiv.org

Dynamic facial expression generation on hilbert hypersphere with conditional wasserstein generative adversarial nets

N Otberdout, M Daoudi, A Kacem… - … on Pattern Analysis …, 2020 - ieeexplore.ieee.org

In this work, we propose a novel approach for generating videos of the six basic facial

expressions given a neutral face image. We propose to exploit the face geometry by

modeling the facial landmarks motion as curves encoded as points on a hypersphere. By …

  Cited by 12 Related articles All 10 versions


2020

Learning Wasserstein Isometric Embedding for Point Clouds

K Kawano, S Koide, T Kutsuna - 2020 International Conference …, 2020 - ieeexplore.ieee.org

The Wasserstein distance has been employed for determining the distance between point

clouds, which have variable numbers of points and invariance of point order. However, the

high computational cost associated with the Wasserstein distance hinders its practical …

  All 2 versions


2020 [HTML] hindawi.com

[HTML] Imbalanced Fault Classification of Bearing via Wasserstein Generative Adversarial Networks with Gradient Penalty

B Han, S Jia, G Liu, J Wang - Shock and Vibration, 2020 - hindawi.com

Recently, generative adversarial networks (GANs) are widely applied to increase the

amounts of imbalanced input samples in fault diagnosis. However, the existing GAN-based

methods have convergence difficulties and training instability, which affect the fault …

  Related articles All 4 versions 

 

2020

Synthetic Data Generation Using Wasserstein Conditional Gans With Gradient Penalty (WCGANS-GP)

M Singh Walia - 2020 - arrow.tudublin.ie

With data protection requirements becoming stricter, the data privacy has become

increasingly important and more crucial than ever. This has led to restrictions on the

availability and dissemination of real-world datasets. Synthetic data offers a viable solution …

  Related articles 



 

 [PDF] Structure-preserving variational schemes for fourth order nonlinear partial differential equations with a Wasserstein gradient flow structure

B Ashworth - 2020 - core.ac.uk

There is a growing interest in studying nonlinear partial differential equations which

constitute gradient flows in the Wasserstein metric and related structure preserving

variational discretisations. In this thesis, we focus on the fourth order Derrida-Lebowitz …

 Related articles 

2020


[PDF] arxiv.org

Stochastic Approximation versus Sample Average Approximation for population Wasserstein barycenters

D Dvinskikh - arXiv preprint arXiv:2001.07697, 2020 - arxiv.org

In machine learning and optimization community there are two main approaches for convex

risk minimization problem, namely, the Stochastic Approximation (SA) and the Sample

Average Approximation (SAA). In terms of oracle complexity (required number of stochastic …

  Cited by 3 Related articles All 2 versions 

[CITATION] Stochastic approximation versus sample average approximation for population Wasserstein barycenter calculation. arXiv e-prints, art

D Dvinskikh - arXiv preprint arXiv:2001.07697, 2020

  Cited by 2 Related articles


2020 [PDF] arxiv.org

Regularized variational data assimilation for bias treatment using the Wasserstein metric

SK Tamang, A Ebtehaj, D Zou… - Quarterly Journal of the …, 2020 - Wiley Online Library

This article presents a new variational data assimilation (VDA) approach for the formal

treatment of bias in both model outputs and observations. This approach relies on the

Wasserstein metric, stemming from the theory of optimal mass transport, to penalize the …

  Cited by 1 Related articles All 6 versions


2020

Convergence and concentration of empirical measures under Wasserstein distance in unbounded functional spaces

J Lei - Bernoulli, 2020 - projecteuclid.org

We provide upper bounds of the expected Wasserstein distance between a probability

measure and its empirical version, generalizing recent results for finite dimensional

Euclidean spaces and bounded functional spaces. Such a generalization can cover …

  Cited by 51 Related articles All 6 versions


2020 [PDF] mlr.press

Wasserstein control of mirror langevin monte carlo

KS Zhang, G Peyré, J Fadili… - Conference on Learning …, 2020 - proceedings.mlr.press

Discretized Langevin diffusions are efficient Monte Carlo methods for sampling from high

dimensional target densities that are log-Lipschitz-smooth and (strongly) log-concave. In

particular, the Euclidean Langevin Monte Carlo sampling algorithm has received much …

  Cited by 8 Related articles All 14 versions 

Wasserstein Control of Mirror Langevin Monte Carlo

K Shuangjian Zhang, G Peyré, J Fadili… - arXiv e …, 2020 - ui.adsabs.harvard.edu

Discretized Langevin diffusions are efficient Monte Carlo methods for sampling from high

dimensional target densities that are log-Lipschitz-smooth and (strongly) log-concave. In

particular, the Euclidean Langevin Monte Carlo sampling algorithm has received much …
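
The mirror scheme analysed above generalises the plain unadjusted Langevin algorithm. For orientation, the following numpy sketch shows the Euclidean (non-mirror) version only; grad_log_pi, the gradient of the log target density, is assumed to be supplied by the user.

import numpy as np

def ula(grad_log_pi, x0, step=1e-2, n_iter=10_000, rng=None):
    # Unadjusted Langevin algorithm: gradient step plus Gaussian noise.
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=float)
    samples = []
    for _ in range(n_iter):
        noise = rng.standard_normal(x.shape)
        x = x + step * grad_log_pi(x) + np.sqrt(2.0 * step) * noise
        samples.append(x.copy())
    return np.array(samples)

# Example target: standard Gaussian, so grad_log_pi(x) = -x
# samples = ula(lambda x: -x, x0=np.zeros(2))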

 

  

2020 [PDF] arxiv.org

Averaging atmospheric gas concentration data using wasserstein barycenters

M Barré, C Giron, M Mazzolini… - arXiv preprint arXiv …, 2020 - arxiv.org

Hyperspectral satellite images report greenhouse gas concentrations worldwide on a daily

basis. While taking simple averages of these images over time produces a rough estimate of

relative emission rates, atmospheric transport means that simple averages fail to pinpoint …

  Cited by 2 Related articles All 6 versions 

<——2020——2020—3010—


2020 [PDF] projecteuclid.org

Existence of probability measure valued jump-diffusions in generalized Wasserstein spaces

M Larsson, S Svaluto-Ferro - Electronic Journal of Probability, 2020 - projecteuclid.org

We study existence of probability measure valued jump-diffusions described by martingale

problems. We develop a simple device that allows us to embed Wasserstein spaces and

other similar spaces of probability measures into locally compact spaces where classical …

  Cited by 1 Related articles All 2 versions


2020 [PDF] uni-bonn.de

Diffusions on Wasserstein Spaces

L Dello Schiavo - 2020 - bonndoc.ulb.uni-bonn.de

We construct a canonical diffusion process on the space of probability measures over a

closed Riemannian manifold, with invariant measure the Dirichlet–Ferguson measure.

Together with a brief survey of the relevant literature, we collect several tools from the theory …

  Related articles All 3 versions 

2020 [PDF] arxiv.org

Conditional sig-wasserstein gans for time series generation

H Ni, L Szpruch, M Wiese, S Liao, B Xiao - arXiv preprint arXiv:2006.05421, 2020 - arxiv.org

Generative adversarial networks (GANs) have been extremely successful in generating

samples, from seemingly high dimensional probability measures. However, these methods

struggle to capture the temporal dependence of joint probability distributions induced by …

  Cited by 7 Related articles All 3 versions 


2020 [PDF] archives-ouvertes.fr

The Wasserstein-Fourier distance for stationary time series

E Cazelles, A Robert, F Tobar - IEEE Transactions on Signal …, 2020 - ieeexplore.ieee.org

We propose the Wasserstein-Fourier (WF) distance to measure the (dis) similarity between

time series by quantifying the displacement of their energy across frequencies. The WF

distance operates by calculating the Wasserstein distance between the (normalised) power …

  Cited by 2 Related articles All 35 versions
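
The Wasserstein-Fourier idea can be illustrated in a few lines: compare two signals through a 1D Wasserstein distance between their normalised power spectral densities. The numpy sketch below is only an illustration of the concept, not the authors' implementation; it approximates the 2-Wasserstein distance through the quantile-function formula.

import numpy as np

def npsd(signal):
    # Normalised power spectral density on the one-sided frequency grid.
    power = np.abs(np.fft.rfft(signal)) ** 2
    return power / power.sum()

def w2_on_grid(grid, p, q, n_quantiles=1000):
    # 1D 2-Wasserstein distance between two histograms on the same grid,
    # via W2^2 = integral_0^1 |F^{-1}(u) - G^{-1}(u)|^2 du (assumes an
    # effectively increasing CDF; good enough for an illustration).
    u = (np.arange(n_quantiles) + 0.5) / n_quantiles
    qp = np.interp(u, np.cumsum(p), grid)
    qq = np.interp(u, np.cumsum(q), grid)
    return np.sqrt(np.mean((qp - qq) ** 2))

# freqs = np.fft.rfftfreq(len(x))           # x, y: two equal-length signals
# d = w2_on_grid(freqs, npsd(x), npsd(y))   # Wasserstein-Fourier-style distance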


2020 [PDF] arxiv.org

Fisher information regularization schemes for Wasserstein gradient flows

W Li, J Lu, L Wang - Journal of Computational Physics, 2020 - Elsevier

We propose a variational scheme for computing Wasserstein gradient flows. The scheme

builds upon the Jordan–Kinderlehrer–Otto framework with the Benamou-Brenier's dynamic

formulation of the quadratic Wasserstein metric and adds a regularization by the Fisher …

  Cited by 15 Related articles All 12 versions


2020

[PDF] arxiv.org

Lagrangian schemes for Wasserstein gradient flows

JA Carrillo, D Matthes, MT Wolfram - 2020 - books.google.com

This chapter reviews different numerical methods for specific examples of Wasserstein

gradient flows: we focus on nonlinear Fokker-Planck equations, but also discuss

discretizations of the parabolic-elliptic Keller-Segel model and of the fourth order thin film …

  Cited by 4 Related articles All 4 versions


2020 [PDF] arxiv.org

A variational finite volume scheme for Wasserstein gradient flows

C Cancès, TO Gallouët, G Todeschi - Numerische Mathematik, 2020 - Springer

We propose a variational finite volume scheme to approximate the solutions to Wasserstein

gradient flows. The time discretization is based on an implicit linearization of the

Wasserstein distance expressed thanks to Benamou–Brenier formula, whereas space …

  Cited by 7 Related articles All 10 versions


2020 [PDF] core.ac.uk

[PDF] Structure-preserving variational schemes for fourth order nonlinear partial differential equations with a Wasserstein gradient flow structure

B Ashworth - 2020 - core.ac.uk

There is a growing interest in studying nonlinear partial differential equations which

constitute gradient flows in the Wasserstein metric and related structure preserving

variational discretisations. In this thesis, we focus on the fourth order Derrida-Lebowitz …

  Related articles All 2 versions 

2020

Optimality in weighted L2-Wasserstein goodness-of-fit statistics

T De Wet, V Humble - South African Statistical Journal, 2020 - journals.co.za

In Del Barrio, Cuesta-Albertos, Matran and Rodriguez-Rodriguez (1999) and Del Barrio,

Cuesta-Albertos and Matran (2000), the authors introduced a new class of goodness-of-fit

statistics based on the L2-Wasserstein distance. It was shown that the desirable property of …

  Related articles All 3 versions


2020 [PDF] arxiv.org

Conditional wasserstein gan-based oversampling of tabular data for imbalanced learning

J Engelmann, S Lessmann - arXiv preprint arXiv:2008.09202, 2020 - arxiv.org

Class imbalance is a common problem in supervised learning and impedes the predictive

performance of classification models. Popular countermeasures include oversampling the

minority class. Standard methods like SMOTE rely on finding nearest neighbours and linear …

  Cited by 7 Related articles All 5 versions 

<——2020——2020—3020—


[PDF] researchgate.net

Biosignal Oversampling Using Wasserstein Generative Adversarial Network

MS Munia, M Nourani, S Houari - 2020 IEEE International …, 2020 - ieeexplore.ieee.org

Oversampling plays a vital role in improving the minority-class classification accuracy for

imbalanced biomedical datasets. In this work, we propose a single-channel biosignal data

generation method by exploiting the advancements in well-established image-based …

  Related articles All 2 versions


2020 [PDF] arxiv.org

Data-Driven Approximation of the Perron-Frobenius Operator Using the Wasserstein Metric

A Karimi, TT Georgiou - arXiv preprint arXiv:2011.00759, 2020 - arxiv.org

This manuscript introduces a regression-type formulation for approximating the Perron-

Frobenius Operator by relying on distributional snapshots of data. These snapshots may

represent densities of particles. The Wasserstein metric is leveraged to define a suitable …

  Related articles All 3 versions 


2020 [PDF] arxiv.org

Distributed Wasserstein Barycenters via Displacement Interpolation

P Cisneros-Velarde, F Bullo - arXiv preprint arXiv:2012.08610, 2020 - arxiv.org

Consider a multi-agent system whereby each agent has an initial probability measure. In this

paper, we propose a distributed algorithm based upon stochastic, asynchronous and

pairwise exchange of information and displacement interpolation in the Wasserstein space …

  Related articles All 2 versions 

[PDF] nsf.gov

The quadratic Wasserstein metric for inverse data matching

B Engquist, K Ren, Y Yang - Inverse Problems, 2020 - iopscience.iop.org

This work characterizes, analytically and numerically, two major effects of the quadratic

Wasserstein (W 2) distance as the measure of data discrepancy in computational solutions

of inverse problems. First, we show, in the infinite-dimensional setup, that the W 2 distance …

  Cited by 6 Related articles All 6 versions


[PDF] arxiv.org

Gromov-Wasserstein Distance based Object Matching: Asymptotic Inference

CA Weitkamp, K Proksch, C Tameling… - arXiv preprint arXiv …, 2020 - arxiv.org

In this paper, we aim to provide a statistical theory for object matching based on the Gromov-

Wasserstein distance. To this end, we model general objects as metric measure spaces.

Based on this, we propose a simple and efficiently computable asymptotic statistical test for …

  Cited by 1 Related articles All 6 versions 
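
One computable ingredient behind Gromov-Wasserstein object matching is a cheap lower bound obtained by comparing the distributions of within-object pairwise distances of the two metric measure spaces. A hedged scipy sketch of that bound (not the paper's test statistic) follows.

from scipy.spatial.distance import pdist
from scipy.stats import wasserstein_distance

def distance_distribution_bound(x, y):
    # x: (n, d), y: (m, d) point clouds sampled from the two objects.
    # Compare the empirical distributions of within-cloud pairwise distances
    # with a 1D Wasserstein distance; this lower-bounds the GW distance.
    return wasserstein_distance(pdist(x), pdist(y))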


2020


[PDF] arxiv.org

Joint Wasserstein Distribution Matching

JZ Cao, L Mo, Q Du, Y Guo, P Zhao, J Huang… - arXiv preprint arXiv …, 2020 - arxiv.org

Joint distribution matching (JDM) problem, which aims to learn bidirectional mappings to

match joint distributions of two domains, occurs in many machine learning and computer

vision applications. This problem, however, is very difficult due to two critical challenges:(i) it …

  Related articles All 3 versions 


2020

A Novel Ant Colony Shape Matching Algorithm Based on the Gromov-Wasserstein Distance

J Zhang, L Zhang, E Saucan - 2020 8th International …, 2020 - ieeexplore.ieee.org

Shape matching has always been and still is an important task in the graphics and imaging

research. The optimization of the minimum distance among the feature points on two

surfaces of the same topological types, is a core to match shapes. Therefore, we propose in …

  Related articles

[PDF] thecvf.com

Illegible Text to Readable Text: An Image-to-Image Transformation using Conditional Sliced Wasserstein Adversarial Networks

M Karimi, G Veni, YY Yu - … of the IEEE/CVF Conference on …, 2020 - openaccess.thecvf.com

Automatic text recognition from ancient handwritten record images is an important problem

in the genealogy domain. However, critical challenges such as varying noise conditions,

vanishing texts, and variations in handwriting makes the recognition task difficult. We tackle …

  Cited by 1 Related articles All 7 versions 
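
For reference, the sliced Wasserstein distance used by such adversarial models averages 1D Wasserstein distances over random projections. A minimal numpy sketch, assuming two point clouds with the same number of points, is given below; it illustrates the distance only, not the paper's network.

import numpy as np

def sliced_wasserstein(x, y, n_proj=50, rng=None):
    # x, y: (n, d) arrays with the same number of points n.
    rng = np.random.default_rng() if rng is None else rng
    d = x.shape[1]
    total = 0.0
    for _ in range(n_proj):
        theta = rng.standard_normal(d)
        theta /= np.linalg.norm(theta)
        px, py = np.sort(x @ theta), np.sort(y @ theta)
        total += np.mean(np.abs(px - py))  # 1D W1 between equal-size samples
    return total / n_proj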


Remote Sensing Image Segmentation based on Generative Adversarial Network with Wasserstein divergence

X Cao, C Song, J Zhang, C Liu - 2020 3rd International Conference on …, 2020 - dl.acm.org

In the image segmentation fields, traditional methods can be classified into four main

categories: threshold-based (eg Otsu [1]),. edge-based (eg Canny [2], Hough transform [3]),

region-based (eg Super pixel [4]), and energy functional-based segmentation methods (eg …

  Related articles


Hyperspectral Image Classification Approach Based on Wasserstein Generative Adversarial Networks

N Chen, C Li - … on Machine Learning and Cybernetics (ICMLC), 2020 - ieeexplore.ieee.org

Hyperspectral image classification is an important research direction in the application of

remote sensing technology. In the process of labeling different types of objects based on

spectral information and geometric spatial characteristics, noise interference often exists in …

<——2020——2020—3030—  


2020

Knowledge-aware attentive wasserstein adversarial dialogue response generation

Y Zhang, Q Fang, S Qian, C Xu - ACM Transactions on Intelligent …, 2020 - dl.acm.org

Natural language generation has become a fundamental task in dialogue systems. RNN-

based natural response generation methods encode the dialogue context and decode it into

a response. However, they tend to generate dull and simple responses. In this article, we …

  Cited by 3 Related articles


2020 [PDF] arxiv.org

Reinforced wasserstein training for severity-aware semantic segmentation in autonomous driving

X Liu, Y Zhang, X Liu, S Bai, S Li, J You - arXiv preprint arXiv:2008.04751, 2020 - arxiv.org

Semantic segmentation is important for many real-world systems, eg, autonomous vehicles,

which predict the class of each pixel. Recently, deep networks achieved significant progress

wrt the mean Intersection-over Union (mIoU) with the cross-entropy loss. However, the cross …

  Cited by 1 Related articles All 4 versions 


2020

Wasserstein cycle-consistent generative adversarial network for improved seismic impedance inversion: Example on 3D SEAM model

A Cai, H Di, Z Li, H Maniar, A Abubakar - SEG Technical Program …, 2020 - library.seg.org

The convolutional neural networks (CNNs) have attracted great attentions in seismic

exploration applications by their capability of learning the representations of data with

multiple level of abstractions, given an adequate amount of labeled data. In seismic …

  Cited by 4 Related articles All 2 versions

2020

Spatial-aware Network using Wasserstein Distance for Unsupervised Domain Adaptation

L Long, L Bin, F Jiang - 2020 Chinese Automation Congress …, 2020 - ieeexplore.ieee.org

In a general scenario, the purpose of Unsupervised Domain Adaptation (UDA) is to classify

unlabeled target domain data as much as possible, but the source domain data has a large

number of labels. To address this situation, this paper introduces the optimal transport theory …

  Related articles

 

2020 [PDF] researchgate.net

[PDF] Image hashing by minimizing independent relaxed wasserstein distance

K Doan, A Kimiyaie, S Manchanda… - arXiv preprint arXiv …, 2020 - researchgate.net

Image hashing is a fundamental problem in the computer vision domain with various

challenges, primarily, in terms of efficiency and effectiveness. Existing hashing methods lack

a principled characterization of the goodness of the hash codes and a principled approach …

  Cited by 1 Related articles 


2020

2020 [PDF] arxiv.org

Image Hashing by Minimizing Discrete Component-wise Wasserstein Distance

KD Doan, S Manchanda, S Badirli… - arXiv preprint arXiv …, 2020 - arxiv.org

Image hashing is one of the fundamental problems that demand both efficient and effective

solutions for various practical scenarios. Adversarial autoencoders are shown to be able to

implicitly learn a robust, locality-preserving hash function that generates balanced and high …

  Related articles All 2 versions 


2020

Obtaining PET/CT images from non-attenuation corrected PET images in a single PET system using Wasserstein generative adversarial networks

Z Hu, Y Li, S Zou, H Xue, Z Sang, X Liu… - Physics in Medicine …, 2020 - iopscience.iop.org

Positron emission tomography (PET) imaging plays an indispensable role in early disease

detection and postoperative patient staging diagnosis. However, PET imaging requires not

only additional computed tomography (CT) imaging to provide detailed anatomical …

  Cited by 12 Related articles All 5 versions


2020 [PDF] thecvf.com

Barycenters of natural images constrained wasserstein barycenters for image morphing

D Simon, A Aberdam - … of the IEEE/CVF Conference on …, 2020 - openaccess.thecvf.com

Image interpolation, or image morphing, refers to a visual transition between two (or more)

input images. For such a transition to look visually appealing, its desirable properties are (i)

to be smooth;(ii) to apply the minimal required change in the image; and (iii) to seem" real" …

  Cited by 4 Related articles All 7 versions 


2020  [PDF] arxiv.org

Generating natural adversarial hyperspectral examples with a modified wasserstein GAN

JC Burnel, K Fatras, N Courty - arXiv preprint arXiv:2001.09993, 2020 - arxiv.org

Adversarial examples are a hot topic due to their abilities to fool a classifier's prediction.

There are two strategies to create such examples, one uses the attacked classifier's

gradients, while the other only requires access to the clas-sifier's prediction. This is …

  Cited by 3 Related articles All 5 versions 

2020

Small object detection from remote sensing images with the help of object-focused super-resolution using Wasserstein GANs

L Courtrai, MT Pham, C Friguet… - IGARSS 2020-2020 …, 2020 - ieeexplore.ieee.org

In this paper, we investigate and improve the use of a super-resolution approach to benefit

the detection of small objects from aerial and satellite remote sensing images. The main

idea is to focus the super-resolution on target objects within the training phase. Such a …

  Cited by 1 Related articles All 7 versions

<——2020——2020—3040— 


2020

Hyperspectral Image Classification Approach Based on Wasserstein Generative Adversarial Networks

N Chen, C Li - … on Machine Learning and Cybernetics (ICMLC), 2020 - ieeexplore.ieee.org

Hyperspectral image classification is an important research direction in the application of

remote sensing technology. In the process of labeling different types of objects based on

spectral information and geometric spatial characteristics, noise interference often exists in …

 

2020

A Super Resolution Method for Remote Sensing Images Based on Cascaded Conditional Wasserstein GANs

B Liu, H Li, Y Zhou, Y Peng, A Elazab… - 2020 IEEE 3rd …, 2020 - ieeexplore.ieee.org

High-resolution (HR) remote sensing imagery is quite beneficial for subsequent

interpretation. Obtaining HR images can be achieved by upgrading the imaging device. Yet,

the cost to perform this task is very huge. Thus, it is necessary to obtain HR images from low …

  Cited by 1 Related articles


2020

Generating Hyperspectral Data Based on 3D CNN and Improved Wasserstein Generative Adversarial Network Using Homemade High-resolution Datasets

Y Li, D Huang - Proceedings of the International Conference on …, 2020 - dl.acm.org

Hyperspectral images contain rich information on the fingerprints of materials and are being

popularly used in the exploration of oil and gas, environmental monitoring, and remote

sensing. Since hyperspectral images cover a wide range of wavelengths with high …

  Related articles


2020

Unsupervised band selection based on weighted information entropy and 3D discrete cosine transform for hyperspectral image classification

SS Sawant, P Manoharan - International Journal of Remote …, 2020 - Taylor & Francis

Band selection is an effective means of reducing the dimensionality of the hyperspectral

image by selecting the most informative and distinctive bands. Bands are usually selected

by adopting information theoretic measures, such as, the information entropy or mutual …

  Cited by 23 Related articles All 3 versions

Variable p

 

year 2020  [PDF] researchgate.net

[PDF] Potential Analysis of Wasserstein GAN as an Anomaly Detection Method for Industrial Images

A Misik - researchgate.net

The task of detecting anomalies in images is a crucial part of current industrial optical

monitoring systems. In recent years, neural networks have proven to be an efficient method

for this problem, especially autoencoders and generative adversarial networks (GAN). A …

  Related articles 

2020

2020 see 2019

Methods and devices performing adaptive quadratic wasserstein full-waveform inversion

W Diancheng, P Wang - US Patent App. 16/662,644, 2020 - Google Patents

Methods and devices for seismic exploration of an underground structure apply W 2-based

full-wave inversion to transformed synthetic and seismic data. Data transformation ensures

that the synthetic and seismic data are positive definite and have the same mass using an …

  All 2 versions 

 

2020

Wasserstein generative models for patch-based texture synthesis

A Houdard, A Leclaire, N Papadakis… - arXiv preprint arXiv …, 2020 - arxiv.org

In this paper, we propose a framework to train a generative model for texture image

synthesis from a single example. To do so, we exploit the local representation of images via

the space of patches, that is, square sub-images of fixed size (eg $4\times 4$). Our main …

  Cited by 1 Related articles All 12 versions 

[PDF] thecvf.com


 2020

Adaptive Wasserstein Hourglass for Weakly Supervised RGB 3D Hand Pose Estimation

Y Zhang, L Chen, Y Liu, W Zheng, J Yong - Proceedings of the 28th ACM …, 2020 - dl.acm.org

The deficiency of labeled training data is one of the bottlenecks in 3D hand pose estimation

from monocular RGB images. Synthetic datasets have a large number of images with

precise annotations, but their obvious difference with real-world datasets limits the …

  Related articles


2020  [PDF] arxiv.org

Equidistribution of random walks on compact groups II. The Wasserstein metric

B Borda - arXiv preprint arXiv:2004.14089, 2020 - arxiv.org

We consider a random walk $ S_k $ with iid steps on a compact group equipped with a bi-

invariant metric. We prove quantitative ergodic theorems for the sum $\sum_ {k= 1}^ N f

(S_k) $ with H\" older continuous test functions $ f $, including the central limit theorem, the …

  Cited by 1 Related articles All 4 versions 


 2020

Multi-view Wasserstein discriminant analysis with entropic regularized Wasserstein distance

H Kasai - ICASSP 2020-2020 IEEE International Conference …, 2020 - ieeexplore.ieee.org

Analysis of multi-view data has recently garnered growing attention because multi-view data

frequently appear in real-world applications, which are collected or taken from many sources

or captured using various sensors. A simple and popular promising approach is to learn a …

  Cited by 6 Related articles
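
The entropic-regularised Wasserstein distance referred to above is typically computed with Sinkhorn iterations. A generic numpy sketch (not the paper's multi-view algorithm) is shown here; a and b are assumed to be histograms and C the ground-cost matrix.

import numpy as np

def sinkhorn(a, b, C, eps=0.05, n_iter=500):
    # Entropic-regularised optimal transport between histograms a and b.
    K = np.exp(-C / eps)
    u = np.ones_like(a, dtype=float)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]   # regularised transport plan
    return np.sum(P * C), P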

<——2020——2020—3050—


2020  [PDF] neurips.cc

[PDF] Ratio Trace Formulation of Wasserstein Discriminant Analysis

H Liu, Y Cai, YL Chen, P Li - Advances in Neural …, 2020 - proceedings.neurips.cc

Abstract< p> We reformulate the Wasserstein Discriminant Analysis (WDA) as a ratio trace

problem and present an eigensolver-based algorithm to compute the discriminative

subspace of WDA. This new formulation, along with the proposed algorithm, can be served …

  Related articles All 4 versions 


[PDF] iop.org

Data Augmentation Method for Switchgear Defect Samples Based on Wasserstein Generative Adversarial Network

X Huang, J Xiong, Y Zhang, J Liang… - Journal of Physics …, 2020 - iopscience.iop.org

… in Table 1. Table 1. Model performance comparison under different augmented data … diagnosis

of switchgear, this paper proposes an augmentation method of defect samples … and Efficient

Processing of Distribution Equipment Condition Detection Data, No.082100KK52190004 …

  Related articles All 2 versions


[PDF] iop.org

Data Augmentation Method for Switchgear Defect Samples Based on Wasserstein Generative Adversarial Network

X Huang, J Xiong, Y Zhang, J Liang… - Journal of Physics …, 2020 - iopscience.iop.org

… in the intelligent diagnosis of switchgear, this paper proposes an augmentation method of defect

samples based … Imbalanced data processing algorithm based on boundary mixed sampling[J].

Control and … J]. Journal of Frontiers of Computer Science and Technology,2020,14(03 …

  Cited by 1 Related articles All 2 versions

Hybrid distance-guided adversarial network for intelligent fault diagnosis under different working conditions

B Han, X Zhang, Wang, Z An, S Jia, G Zhang - Measurement, 2021 - Elsevier

… Although the above methods have achieved extraordinary success in fault diagnosis, they all

use only one … Section IV applies two experimental bearing datasets to verify the effectiveness

of the proposed method … D s and X t D t . The training data and testing data drawn from …

  Cited by 5 Related articles All 2 versions

[CITATION] Data Augmentation Method for Power Transformer Fault Diagnosis Based on Conditional Wasserstein Generative Adversarial Network [J]

Y Liu, Z Xu, J He, Q Wang, SG Gao, J Zhao - Power System Technology, 2020

  Cited by 1


https://kowshikchilamkurthy.medium.com › wasserstein...

Wasserstein Distance, Contraction Mapping, and Modern RL

Unlike the Kullback-Leibler divergence, the Wasserstein metric is a true probability metric and considers both the probability of and the distance between ...


2020 [PDF] neurips.cc

[PDF] Ratio Trace Formulation of Wasserstein Discriminant Analysis

H LiuY CaiYL Chen, P Li - Advances in Neural …, 2020 - proceedings.neurips.cc

Abstract< p> We reformulate the Wasserstein Discriminant Analysis (WDA) as a ratio trace

problem and present an eigensolver-based algorithm to compute the discriminative

subspace of WDA. This new formulation, along with the proposed algorithm, can be served …

  Related articles All 4 versions 


2020


2020 see 2019  [PDF] bciml.cn

Wasserstein distance based deep adversarial transfer learning for intelligent fault diagnosis with unlabeled or insufficient labeled data

C Cheng, B Zhou, G Ma, D Wu, Y Yuan - Neurocomputing, 2020 - Elsevier

Intelligent fault diagnosis is one critical topic of maintenance solution for mechanical

systems. Deep learning models, such as convolutional neural networks (CNNs), have been

successfully applied to fault diagnosis tasks and achieved promising results. However, one …

  Cited by 27 Related articles All 3 versions


 

2020

Adaptive Wasserstein Hourglass for Weakly Supervised RGB 3D Hand Pose Estimation

Y Zhang, L Chen, Y Liu, W Zheng, J Yong - Proceedings of the 28th ACM …, 2020 - dl.acm.org

The deficiency of labeled training data is one of the bottlenecks in 3D hand pose estimation

from monocular RGB images. Synthetic datasets have a large number of images with

precise annotations, but their obvious difference with real-world datasets limits the …

  Related articles


2020  [HTML] springer.com

[HTML] Missing features reconstruction using a wasserstein generative adversarial imputation network

M Friedjungová, D Vašata, M Balatsko… - … on Computational Science, 2020 - Springer

Missing data is one of the most common preprocessing problems. In this paper, we

experimentally research the use of generative and non-generative models for feature

reconstruction. Variational Autoencoder with Arbitrary Conditioning (VAEAC) and …

  Cited by 4



2020

DPIR-Net: Direct PET image reconstruction based on the Wasserstein generative adversarial network

Z Hu, H Xue, Q Zhang, J Gao, N Zhang… - … on Radiation and …, 2020 - ieeexplore.ieee.org

Positron emission tomography (PET) is an advanced medical imaging technique widely

used in various clinical applications, such as tumor detection and neurologic disorders.

Reducing the radiotracer dose is desirable in PET imaging because it decreases the …

  Cited by 15 Related articles



2020

Obtaining PET/CT images from non-attenuation corrected PET images in a single PET system using Wasserstein generative adversarial networks

Z Hu, Y Li, S Zou, H Xue, Z Sang, X Liu… - Physics in Medicine …, 2020 - iopscience.iop.org

Positron emission tomography (PET) imaging plays an indispensable role in early disease

detection and postoperative patient staging diagnosis. However, PET imaging requires not

only additional computed tomography (CT) imaging to provide detailed anatomical …

  Cited by 13 Related articles All 5 versions

<——2020——2020—3060— 


2020  [PDF] sciencedirect.com

Intelligent Fault Diagnosis with a Deep Transfer Network based on Wasserstein Distance

J Xu, J Huang, Y Zhao, L Zhou - Procedia Computer Science, 2020 - Elsevier

Intelligent fault-diagnosis methods based on deep-learning technology have been very

successful for complex industrial systems. The deep learning based fault classification

model requires a large number of labeled data. Moreover, the probability distribution of …

  Related articles


 2020  [PDF] academia.edu

Data augmentation in fault diagnosis based on the Wasserstein generative adversarial network with gradient penalty

X Gao, F Deng, X Yue - Neurocomputing, 2020 - Elsevier

Fault detection and diagnosis in industrial process is an extremely essential part to keep

away from undesired events and ensure the safety of operators and facilities. In the last few

decades various data based machine learning algorithms have been widely studied to …

  Cited by 41 Related articles All 3 versions


2020  [PDF] arxiv.org

Gromov–Hausdorff limit of Wasserstein spaces on point clouds

NG Trillos - Calculus of Variations and Partial Differential …, 2020 - Springer

We consider a point cloud X_n:={x _1, ..., x _n\} X n:= x 1,…, xn uniformly distributed on the

flat torus T^ d:= R^ d/Z^ d T d:= R d/Z d, and construct a geometric graph on the cloud by

connecting points that are within distance ε ε of each other. We let P (X_n) P (X n) be the …

  Cited by 12 Related articles All 5 versions


2020 see 2019  [PDF] bciml.cn

Wasserstein distance based deep adversarial transfer learning for intelligent fault diagnosis with unlabeled or insufficient labeled data

C Cheng, B Zhou, G Ma, D Wu, Y Yuan - Neurocomputing, 2020 - Elsevier

Intelligent fault diagnosis is one critical topic of maintenance solution for mechanical

systems. Deep learning models, such as convolutional neural networks (CNNs), have been

successfully applied to fault diagnosis tasks and achieved promising results. However, one …

  Cited by 27 Related articles All 3 versions

 Wasserstein distance based deep adversarial transfer learning for intelligent fault diagnosis with unlabeled or insufficient labeled data

25 citations*

2020 NEUROCOMPUTING

Cheng Cheng ,Beitong Zhou ,Guijun Ma ,Dongrui Wu ,Ye Yuan

Huazhong University of Science and Technology

Unsupervised learning

Deep learning

View More (9+) 

Abstract Intelligent fault diagnosis is one critical topic of maintenance solution for mechanical systems. Deep learning models, such as convolutional neural networks (CNNs), have been successfully applied to fault diagnosis tasks and achieved promising results. However, one is that two datasets (... View Full Abstract 
Cited by 68
 Related articles All 3 versions


2020

Adaptive Wasserstein Hourglass for Weakly Supervised RGB 3D Hand Pose Estimation

Y Zhang, L Chen, Y Liu, W Zheng, J Yong - Proceedings of the 28th ACM …, 2020 - dl.acm.org

The deficiency of labeled training data is one of the bottlenecks in 3D hand pose estimation

from monocular RGB images. Synthetic datasets have a large number of images with

precise annotations, but their obvious difference with real-world datasets limits the …

  Related articles


2020


2020  [PDF] sciencedirect.com

Intelligent Fault Diagnosis with a Deep Transfer Network based on Wasserstein Distance

J Xu, J Huang, Y Zhao, L Zhou - Procedia Computer Science, 2020 - Elsevier

Intelligent fault-diagnosis methods based on deep-learning technology have been very

successful for complex industrial systems. The deep learning based fault classification

model requires a large number of labeled data. Moreover, the probability distribution of …

  Related articles


[PDF] arxiv.org

Two-sample test using projected Wasserstein distance: Breaking the curse of dimensionality

J Wang, R Gao, Y Xie - arXiv preprint arXiv:2010.11970, 2020 - arxiv.org

We develop a projected Wasserstein distance for the two-sample test, a fundamental problem in statistics and machine learning: given two sets of samples, to determine whether they are from the same distribution. In particular, we aim to circumvent the curse of …

  Cited by 4 Related articles All 3 versions 
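
A hedged sketch of a projection-based two-sample test in the spirit of this entry: the statistic is the largest 1D Wasserstein distance over random projections, calibrated by a permutation null. The actual paper optimises the projection; the random search below (with fresh projections per permutation) is only illustrative.

import numpy as np
from scipy.stats import wasserstein_distance

def projected_w1(x, y, n_proj=100, rng=None):
    # Largest 1D Wasserstein distance over random unit projections.
    rng = np.random.default_rng() if rng is None else rng
    dirs = rng.standard_normal((n_proj, x.shape[1]))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    return max(wasserstein_distance(x @ d, y @ d) for d in dirs)

def permutation_pvalue(x, y, n_perm=200, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    stat = projected_w1(x, y, rng=rng)
    pooled = np.vstack([x, y])
    null = []
    for _ in range(n_perm):
        idx = rng.permutation(len(pooled))
        null.append(projected_w1(pooled[idx[:len(x)]], pooled[idx[len(x):]], rng=rng))
    return (1 + sum(s >= stat for s in null)) / (1 + n_perm)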

2020

 Data Augmentation Method for Switchgear Defect Samples Based on Wasserstein Generative Adversarial Network

X Huang, J Xiong, Y Zhang, J Liang… - Journal of Physics …, 2020 - iopscience.iop.org

… in Table 1. Table 1. Model performance comparison under different augmented data … diagnosis

of switchgear, this paper proposes an augmentation method of defect samples … and Efficient

Processing of Distribution Equipment Condition Detection Data, No.082100KK52190004 …

  Related articles All 2 versions


2020  [PDF] mlr.press

Bridging the gap between f-gans and wasserstein gans

J Song, S Ermon - International Conference on Machine …, 2020 - proceedings.mlr.press

Generative adversarial networks (GANs) variants approximately minimize divergences

between the model and the data distribution using a discriminator. Wasserstein GANs

(WGANs) enjoy superior empirical performance, however, unlike in f-GANs, the discriminator …

  Cited by 11 Related articles All 4 versions 


[PDF] arxiv.org

Improved image wasserstein attacks and defenses

JE Hu, A Swaminathan, H Salman, G Yang - arXiv preprint arXiv …, 2020 - arxiv.org

Robustness against image perturbations bounded by a $\ell_p $ ball have been well-

studied in recent literature. Perturbations in the real-world, however, rarely exhibit the pixel

independence that $\ell_p $ threat models assume. A recently proposed Wasserstein  …

  Cited by 5 Related articles All 4 versions 

<——2020——2020—3070—


[PDF] iop.org  Full View

Wasserstein metric for improved quantum machine learning with adjacency matrix representations

O Çaylak, OA von Lilienfeld… - … Learning: Science and …, 2020 - iopscience.iop.org

We study the Wasserstein metric to measure distances between molecules represented by

the atom index dependent adjacency'Coulomb'matrix, used in kernel ridge regression based

supervised learning. Resulting machine learning models of quantum properties, aka …

  Cited by 10 Related articles All 5 versions


Trajectories from Distribution-Valued Functional Curves: A Unified Wasserstein Framework

A SharmaG Gerig - … Conference on Medical Image Computing and …, 2020 - Springer

Temporal changes in medical images are often evaluated along a parametrized function that

represents a structure of interest (eg white matter tracts). By attributing samples along these

functions with distributions of image properties in the local neighborhood, we create …

  Cited by 1 Related articles All 2 versions


[PDF] arxiv.org

Wasserstein generative models for patch-based texture synthesis

A Houdard, A Leclaire, N Papadakis… - arXiv preprint arXiv …, 2020 - arxiv.org

In this paper, we propose a framework to train a generative model for texture image

synthesis from a single example. To do so, we exploit the local representation of images via

the space of patches, that is, square sub-images of fixed size (eg $4\times 4$). Our main …

  Cited by 1 Related articles All 12 versions 


[PDF] arxiv.org

The Wasserstein Proximal Gradient Algorithm

A Salim, A Korba, G Luise - arXiv preprint arXiv:2002.03035, 2020 - arxiv.org

Wasserstein gradient flows are continuous time dynamics that define curves of steepest

descent to minimize an objective function over the space of probability measures (ie, the

Wasserstein space). This objective is typically a divergence wrt a fixed target distribution. In …

  Cited by 4 Related articles All 3 versions 


[PDF] Computational hardness and fast algorithm for fixed-support wasserstein barycenter

T Lin, N Ho, X Chen, M Cuturi… - arXiv preprint arXiv …, 2020 - researchgate.net

We study in this paper the fixed-support Wasserstein barycenter problem (FS-WBP), which

consists in computing the Wasserstein barycenter of m discrete probability measures

supported on a finite metric space of size n. We show first that the constraint matrix arising …

  Cited by 3 Related articles All 2 versions 
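
For context, fixed-support barycenters are often approximated in practice with the entropically regularised iterative Bregman projection scheme rather than the exact LP studied above. A minimal numpy sketch follows, assuming m histograms on a common n-point grid with cost matrix C; it is an illustration, not the paper's algorithm.

import numpy as np

def sinkhorn_barycenter(hists, C, weights=None, eps=0.05, n_iter=200):
    # hists: (m, n) array of histograms sharing one support grid.
    m, n = hists.shape
    weights = np.full(m, 1.0 / m) if weights is None else weights
    K = np.exp(-C / eps)
    u = np.ones((m, n))
    v = np.ones((m, n))
    for _ in range(n_iter):
        bary = np.ones(n)
        for i in range(m):
            u[i] = hists[i] / (K @ v[i])
            bary *= (K.T @ u[i]) ** weights[i]   # geometric mean of marginals
        for i in range(m):
            v[i] = bary / (K.T @ u[i])
    return bary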


2020


[PDF] arxiv.org

An LP-based, strongly-polynomial 2-approximation algorithm for sparse Wasserstein barycenters

S Borgwardt - Operational Research, 2020 - Springer

Discrete Wasserstein barycenters correspond to optimal solutions of transportation problems

for a set of probability measures with finite support. Discrete barycenters are measures with

finite support themselves and exhibit two favorable properties: there always exists one with a …

  Cited by 4 Related articles All 3 versions


 

Drift compensation algorithm based on Time-Wasserstein dynamic distribution alignment

Y Tao, K Zeng, Z Liang - 2020 IEEE/CIC International …, 2020 - ieeexplore.ieee.org

The electronic nose (E-nose) is mainly used to detect different types and concentrations of

gases. At present, the average life of E-nose is relatively short, mainly due to the drift of the

sensor resulting in a decrease in the effect. Therefore, it is the focus of research in this field …

  Related articles


[PDF] arxiv.org

Unajusted Langevin algorithm with multiplicative noise: Total variation and Wasserstein bounds

F Panloup - arXiv preprint arXiv:2012.14310, 2020 - arxiv.org

In this paper, we focus on non-asymptotic bounds related to the Euler scheme of an ergodic

diffusion with a possibly multiplicative diffusion term (non-constant diffusion coefficient).

More precisely, the objective of this paper is to control the distance of the standard Euler …

  Related articles All 3 versions 

[PDF] archives-ouvertes.fr

Unajusted Langevin algorithm with multiplicative noise: Total variation and Wasserstein bounds

G Pages, F Panloup - 2020 - hal.archives-ouvertes.fr

In this paper, we focus on non-asymptotic bounds related to the Euler scheme of an ergodic

diffusion with a possibly multiplicative diffusion term (non-constant diffusion coefficient).

More precisely, the objective of this paper is to control the distance of the standard Euler …

  Related articles All 8 versions 



A Novel Ant Colony Shape Matching Algorithm Based on the Gromov-Wasserstein Distance

J Zhang, L Zhang, E Saucan - 2020 8th International …, 2020 - ieeexplore.ieee.org

Shape matching has always been and still is an important task in the graphics and imaging

research. The optimization of the minimum distance among the feature points on two

surfaces of the same topological types, is a core to match shapes. Therefore, we propose in …

  Related articles


[PDF] mlr.press

Gradient descent algorithms for Bures-Wasserstein barycenters

S Chewi, T Maunu, P Rigollet… - … on Learning Theory, 2020 - proceedings.mlr.press

We study first order methods to compute the barycenter of a probability distribution $ P $

over the space of probability measures with finite second moment. We develop a framework

to derive global rates of convergence for both gradient descent and stochastic gradient …

  Cited by 24 Related articles All 6 versions 
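
In the Gaussian (Bures-Wasserstein) case the barycenter also admits a classical fixed-point iteration, against which the paper's gradient methods can be compared. A minimal scipy sketch for zero-mean Gaussians N(0, Sigma_i) is given below, as an illustration only.

import numpy as np
from scipy.linalg import sqrtm

def bures_barycenter(sigmas, weights, n_iter=100):
    # Classical fixed-point iteration for the barycenter covariance.
    S = np.mean(sigmas, axis=0)               # initial guess
    for _ in range(n_iter):
        root = sqrtm(S)
        inner = sum(w * sqrtm(root @ Sig @ root) for w, Sig in zip(weights, sigmas))
        S = np.real(np.linalg.inv(root) @ inner @ inner @ np.linalg.inv(root))
    return S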

<——2020——2020—3080—



DPIR-Net: Direct PET image reconstruction based on the Wasserstein generative adversarial network

Z Hu, H Xue, Q Zhang, J Gao, N Zhang… - … on Radiation and …, 2020 - ieeexplore.ieee.org

Positron emission tomography (PET) is an advanced medical imaging technique widely

used in various clinical applications, such as tumor detection and neurologic disorders.

Reducing the radiotracer dose is desirable in PET imaging because it decreases the …

  Cited by 15 Related articles


[PDF] arxiv.org

Central limit theorems for Markov chains based on their convergence rates in Wasserstein distance

R Jin, A Tan - arXiv preprint arXiv:2002.09427, 2020 - arxiv.org

Many tools are available to bound the convergence rate of Markov chains in total variation

(TV) distance. Such results can be used to establish central limit theorems (CLT) that enable

error evaluations of Monte Carlo estimates in practice. However, convergence analysis …

  Related articles All 2 versions 




[PDF] arxiv.org

Hierarchical gaussian processes with wasserstein-2 kernels

S Popescu, D Sharp, J Cole, B Glocker - arXiv preprint arXiv:2010.14877, 2020 - arxiv.org

We investigate the usefulness of Wasserstein-2 kernels in the context of hierarchical

Gaussian Processes. Stemming from an observation that stacking Gaussian Processes

severely diminishes the model's ability to detect outliers, which when combined with non …

  Cited by 3 Related articles All 3 versions 


Trajectories from Distribution-Valued Functional Curves: A Unified Wasserstein Framework

A Sharma, G Gerig - … Conference on Medical Image Computing and …, 2020 - Springer

Temporal changes in medical images are often evaluated along a parametrized function that

represents a structure of interest (eg white matter tracts). By attributing samples along these

functions with distributions of image properties in the local neighborhood, we create …

  Cited by 1 Related articles All 2 versions


[PDF] ams.org

Full View

Nonpositive curvature, the variance functional, and the Wasserstein barycenter

YH Kim, B Pass - Proceedings of the American Mathematical Society, 2020 - ams.org

We show that a Riemannian manifold $ M $ has nonpositive sectional curvature and is

simply connected if and only if the variance functional on the space $ P (M) $ of probability

measures over $ M $ is displacement convex. We then establish convexity over Wasserstein  …

  Cited by 4 Related articles All 5 versions


2020


[PDF] openreview.net

Functional Priors for Bayesian Neural Networks through Wasserstein Distance Minimization to Gaussian Processes

BH Tran, D Milios, S Rossi… - Third Symposium on …, 2020 - openreview.net

The Bayesian treatment of neural networks dictates that a prior distribution is considered

over the weight and bias parameters of the network. The non-linear nature of the model

implies that any distribution of the parameters has an unpredictable effect on the distribution …

  Related articles All 2 versions 




Functional Data Clustering Analysis via the Learning of Gaussian Processes with Wasserstein Distance

T Li, J Ma - International Conference on Neural Information …, 2020 - Springer

Functional data clustering analysis becomes an urgent and challenging task in the new era

of big data. In this paper, we propose a new framework for functional data clustering

analysis, which adopts a similar structure as the k-means algorithm for the conventional …

  Related articles


[PDF] arxiv.org

The unbalanced Gromov Wasserstein distance: Conic formulation and relaxation

T Séjourné, FX Vialard, G Peyré - arXiv preprint arXiv:2009.04266, 2020 - arxiv.org

Comparing metric measure spaces (ie a metric space endowed with a probability

distribution) is at the heart of many machine learning problems. This includes for instance

predicting properties of molecules in quantum chemistry or generating graphs with varying …

  Cited by 5 Related articles All 3 versions 


[PDF] ieee.org

Joint transfer of model knowledge and fairness over domains using wasserstein distance

T Yoon, J Lee, W Lee - IEEE Access, 2020 - ieeexplore.ieee.org

Owing to the increasing use of machine learning in our daily lives, the problem of fairness

has recently become an important topic in machine learning societies. Recent studies

regarding fairness in machine learning have been conducted to attempt to ensure statistical …

  Cited by 2 Related articles


[PDF] arxiv.org

A Rademacher-type theorem on L2-Wasserstein spaces over closed Riemannian manifolds

LD Schiavo - Journal of Functional Analysis, 2020 - Elsevier

Let P be any Borel probability measure on the L 2-Wasserstein space (P 2 (M), W 2) over a

closed Riemannian manifold M. We consider the Dirichlet form E induced by P and by the

Wasserstein gradient on P 2 (M). Under natural assumptions on P, we show that W 2 …

  Cited by 5 Related articles All 6 versions
<——2020——2020—3090—   


[PDF] thecvf.com

Illegible Text to Readable Text: An Image-to-Image Transformation using Conditional Sliced Wasserstein Adversarial Networks

M Karimi, G Veni, YY Yu - … of the IEEE/CVF Conference on …, 2020 - openaccess.thecvf.com

Automatic text recognition from ancient handwritten record images is an important problem

in the genealogy domain. However, critical challenges such as varying noise conditions,

vanishing texts, and variations in handwriting makes the recognition task difficult. We tackle …

  Cited by 1 Related articles All 7 versions 


[PDF] arxiv.org

Derivative over Wasserstein spaces along curves of densities

R Buckdahn, J Li, H Liang - arXiv preprint arXiv:2010.01507, 2020 - arxiv.org

In this paper, given any random variable $\xi $ defined over a probability space

$(\Omega,\mathcal {F}, Q) $, we focus on the study of the derivative of functions of the form $

L\mapsto F_Q (L):= f\big ((LQ) _ {\xi}\big), $ defined over the convex cone of densities …

  Related articles All 2 versions 


 GS-WGAN: A Gradient-Sanitized Approach for Learning Differentially Private Generators

15 citations* for all

5 citations*

2020 NEURAL INFORMATION PROCESSING SYSTEMS

Dingfan Chen, Tribhuvanesh Orekondy, Mario Fritz

Max Planck Society

 Cited by 18 Related articles All 11 versions 
Cited by 59
Related articles All 10 versions
[Video] NeurIPS 2020: GS-WGAN: A Gradient-Sanitized Approach for Learning Differentially Private Generators (uploaded by Dingfan Chen, Oct 31, 2020)


2020 see 2019  [PDF] arxiv.org

Kernelized wasserstein natural gradient

M Arbel, A Gretton, W Li, G Montúfar - arXiv preprint arXiv:1910.09652, 2019 - arxiv.org

… 4 Page 5. Published as a conference paper at ICLR 2020 … 3 KERNELIZED WASSERSTEIN

NATURAL GRADIENT In this section we propose an estimator for the Wasserstein natural gradient

using kernel methods and exploiting the formulation in (9). We restrict to the case of the …

  Cited by 7 Related articles All 6 versions 

 

Open Access

Derivative over Wasserstein spaces along curves of densities 

by Buckdahn, Rainer; Li, Juan; Liang, Hao 

10/2020

In this paper, given any random variable $\xi$ defined over a probability space $(\Omega,\mathcal{F},Q)$, we focus on the study of the derivative of functions...

Journal ArticleFull Text Online 

arXiv:2010.01507  [pdfpsother]  math.PR 

Derivative over Wasserstein spaces along curves of densities 

Authors: Rainer Buckdahn, Juan Li, Hao Liang

Abstract: In this paper, given any random variable…

defined over a probability space…

, we focus on the study of the derivative of functions of the form

 ….  defined over the convex cone of densities 

is a function over the space…  More 

Submitted 4 October, 2020; originally announced October 2020. 

Comments: 55 pages  


2020

 2020  see 2021

 The back-and-forth method for Wasserstein gradient flows

2 citations*

2020 ARXIV: NUMERICAL ANALYSIS

View More 

 Optimal control of multiagent systems in the Wasserstein space

16 citations*

2020 CALCULUS OF VARIATIONS AND PARTIAL DIFFERENTIAL EQUATIONS

Chloé Jimenez 1,Antonio Marigonda 2,Marc Quincampoix 1

1 Centre national de la recherche scientifique ,2 University of Verona

Type (model theory)

Hamilton–Jacobi–Bellman equation

View More (8+) 

This paper concerns a class of optimal control problems, where a central planner aims to control a multi-agent system in $${\mathbb {R}}^d$$ in order to minimize a certain cost of Bolza type. At every time and for each agent, the set of admissible velocities, describing his/her underlying microscopi... View Full Abstract 

 Graph Wasserstein Correlation Analysis for Movie Retrieval

0 citations*

2020 ARXIV: LEARNING

View More 

 Principled learning method for Wasserstein distributionally robust optimization with local perturbations

0 citations* for all

0 citations*

2020 INTERNATIONAL CONFERENCE ON MACHINE LEARNING

Yongchan Kwon 1,Wonyoung Kim 2,Joong-Ho Won 2,Myunghee Cho Paik 2

1 Stanford University ,2 Seoul National University

Robust optimization

Robustness (computer science)

View More (8+) 

Wasserstein distributionally robust optimization (WDRO) attempts to learn a model that minimizes the local worst-case risk in the vicinity of the empirical data distribution defined by Wasserstein ball. While WDRO has received attention as a promising tool for inference since its introduction, its t... View Full Abstract 

Cited by 1 Related articles All 7 versions

 

 Principled learning method for Wasserstein distributionally robust optimization with local perturbations

0 citations*

2020 ARXIV: MACHINE LEARNING

View More 

 SVGD as a kernelized Wasserstein gradient flow of the chi-squared divergence

10 citations* for all

8 citations*

2020 NEURAL INFORMATION PROCESSING SYSTEMS

Sinho Chewi ,Thibaut Le Gouic ,Chen Lu ,Tyler Maunu ,Philippe Rigollet

Massachusetts Institute of Technology

Gradient descent

Laplace operator

View More (8+) 

Stein Variational Gradient Descent (SVGD), a popular sampling algorithm, is often described as the kernelized gradient flow for the Kullback-Leibler divergence in the geometry of optimal transport. We introduce a new perspective on SVGD that instead views SVGD as the (kernelized) gradient flow of th... View Full Abstract 

Cited by 15 Related articles All 9 versions 

 

 

 Stochastic Optimization for Regularized Wasserstein Estimators.

5 citations*

2020 ARXIV: LEARNING

    

 Fixed-Support Wasserstein Barycenters: Computational Hardness and Fast Algorithm.

2020 ARXIV: COMPUTATIONAL COMPLEXITY

View More 



<——2020——2020—3100—  


 Stochastic Optimization for Regularized Wasserstein Estimators.

5 citations*

2020 ARXIV: LEARNING

View More 

 Continuous Regularized Wasserstein Barycenters

7 citations* for all

6 citations*

2020 NEURAL INFORMATION PROCESSING SYSTEMS

Lingxiao Li 1,Aude Genevay 1,Mikhail Yurochkin 2,Justin M. Solomon 1

1 Massachusetts Institute of Technology ,2 IBM

Stochastic gradient descent

Strong duality

View More (8+) 

Wasserstein barycenters provide a geometrically meaningful way to aggregate probability distributions, built on the theory of optimal transport. They are difficult to compute in practice, however, leading previous work to restrict their supports to finite sets of points. Leveraging a new dual formul... View Full Abstract 

Cited by 12 Related articles All 12 versions 

 Continuous Regularized Wasserstein Barycenters

1 citations*

2020 ARXIV: LEARNING

View More 

 Bridging the Gap Between f-GANs and Wasserstein GANs

13 citations* for all

6 citations*

2020 INTERNATIONAL CONFERENCE ON MACHINE LEARNING

Jiaming Song ,Stefano Ermon

Stanford University

Lagrangian relaxation

Estimator

View More (6+) 

Generative adversarial networks (GANs) have enjoyed much success in learning high-dimensional distributions. Learning objectives approximately minimize an $f$-divergence ($f$-GANs) or an integral probability metric (Wasserstein GANs) between the model and the data distribution using a discriminator.... View Full Abstract 

Cited by 25 Related articles All 5 versions 

2020 

 Bridging the Gap Between f-GANs and Wasserstein GANs

https://papertalk.org › papertalks

This is an embedded video. Talk and the respective paper are published at ICML 2020 virtual conference. If you are one of the authors of the paper and want to ...

[CITATION] Bridging the Gap Between f-GANs and Wasserstein GANs. arXiv e-prints, page

J Song, S Ermon - arXiv preprint arXiv:1910.09779, 2019

Cited by 25 Related articles All 5 versions 
Bridging the Gap Between f-GANs and Wasserstein GANs

slideslive.com › bridging-the-gap-between-fgans-and-was...

slideslive.com › bridging-the-gap-between-fgans-and-was...

 premier gathering of professionals dedicated to the advancement of the branch of ...

SlidesLive · 

Jul 12, 2020


2020

DPIR-Net: Direct PET image reconstruction based on the Wasserstein generative adversarial network

Z Hu, H Xue, Q Zhang, J Gao, N Zhang… - … on Radiation and …, 2020 - ieeexplore.ieee.org

Positron emission tomography (PET) is an advanced medical imaging technique widely

used in various clinical applications, such as tumor detection and neurologic disorders.

Reducing the radiotracer dose is desirable in PET imaging because it decreases the …

  Cited by 17 Related articles

 Deep Attentive Wasserstein Generative Adversarial Networks for MRI Reconstruction with Recurrent Context-Awareness

4 citations* for all

4 citations*

2020 MEDICAL IMAGE COMPUTING AND COMPUTER-ASSISTED INTERVENTION

Yifeng Guo 1,Chengjia Wang 2,Heye Zhang 1,Guang Yang 3

1 Sun Yat-sen University ,2 University of Edinburgh ,3 National Institutes of Health

Deep learning

Recurrent neural network

View More (8+) 

The performance of traditional compressive sensing-based MRI (CS-MRI) reconstruction is affected by its slow iterative procedure and noise-induced artefacts. Although many deep learning-based CS-MRI methods have been proposed to mitigate the problems of traditional methods, they have not been able t... View Full Abstract 

  Cited by 16 Related articles All 4 versions


2020

1 May 2020

Characterization of probability distribution convergence in Wasserstein distance by Lp-quantization error function

Yating Liu, Gilles Pagès

Bernoulli Vol. 26, Issue 2 (May 2020), pp. 1171-1204

KEYWORDS: probability distribution characterization, vector quantization, Voronoï diagram, Wasserstein convergence


2020  [PDF] arxiv.org

Finite-sample guarantees for Wasserstein distributionally robust optimization: Breaking the curse of dimensionality

R Gao - arXiv preprint arXiv:2009.04382, 2020 - arxiv.org

Wasserstein distributionally robust optimization (DRO) aims to find robust and generalizable

solutions by hedging against data perturbations in Wasserstein distance. Despite its recent

empirical success in operations research and machine learning, existing performance …

  Cited by 7 Related articles All 3 versions 

 

2020  [PDF] arxiv.org

Asymptotic guarantees for generative modeling based on the smooth wasserstein distance

Z Goldfeld, K Greenewald, K Kato - arXiv preprint arXiv:2002.01012, 2020 - arxiv.org

Minimum distance estimation (MDE) gained recent attention as a formulation of (implicit)

generative modeling. It considers minimizing, over model parameters, a statistical distance

between the empirical data distribution and the model. This formulation lends itself well to …

  Cited by 5 Related articles All 3 versions 



2020  [PDF] arxiv.org

Generalized Wasserstein Dice Score, Distributionally Robust Deep Learning, and Ranger for brain tumor segmentation: BraTS 2020 challenge

L Fidon, S Ourselin, T Vercauteren - arXiv preprint arXiv:2011.01614, 2020 - arxiv.org

Training a deep neural network is an optimization problem with four main ingredients: the

design of the deep neural network, the per-sample loss function, the population loss

function, and the optimizer. However, methods developed to compete in recent BraTS …

  Cited by 1 Related articles All 3 versions 


2020

Illumination-invariant flotation froth color measuring via Wasserstein distance-based CycleGAN with structure-preserving constraint

J Liu, J He, Y Xie, W Gui, Z Tang, T Ma… - IEEE transactions on …, 2020 - ieeexplore.ieee.org

Froth color can be referred to as a direct and instant indicator to the key flotation production

index, for example, concentrate grade. However, it is intractable to measure the froth color

robustly due to the adverse interference of time-varying and uncontrollable multisource …

  Cited by 13 Related articles All 3 versions

<——2020——2020—3110— 



2020  [PDF] arxiv.org

A new approach to posterior contraction rates via Wasserstein dynamics

E Dolera, S Favaro, E Mainini - arXiv preprint arXiv:2011.14425, 2020 - arxiv.org

This paper presents a new approach to the classical problem of quantifying posterior

contraction rates (PCRs) in Bayesian statistics. Our approach relies on Wasserstein

distance, and it leads to two main contributions which improve on the existing literature of …

  Cited by 1 Related articles All 2 versions 


2020  [PDF] thecvf.com

Illegible Text to Readable Text: An Image-to-Image Transformation using Conditional Sliced Wasserstein Adversarial Networks

M Karimi, G Veni, YY Yu - … of the IEEE/CVF Conference on …, 2020 - openaccess.thecvf.com

Automatic text recognition from ancient handwritten record images is an important problem

in the genealogy domain. However, critical challenges such as varying noise conditions,

vanishing texts, and variations in handwriting makes the recognition task difficult. We tackle …

  Cited by 1 Related articles All 7 versions 



2020  [PDF] arxiv.org

Permutation invariant networks to learn Wasserstein metrics

A Sehanobish, N Ravindra, D van Dijk - arXiv preprint arXiv:2010.05820, 2020 - arxiv.org

Understanding the space of probability measures on a metric space equipped with a

Wasserstein distance is one of the fundamental questions in mathematical analysis. The

Wasserstein metric has received a lot of attention in the machine learning community …

  Related articles All 4 versions 

[PDF] projecteuclid.org

Posterior asymptotics in Wasserstein metrics on the real line

M Chae, P De Blasi, SG Walker - Electronic Journal of Statistics, 2021 - projecteuclid.org

In this paper, we use the class of Wasserstein metrics to study asymptotic properties of

posterior distributions. Our first goal is to provide sufficient conditions for posterior

consistency. In addition to the well-known Schwartz's Kullback–Leibler condition on the …

  Related articles All 2 versions


2020

Deep Diffusion-Invariant Wasserstein Distributional ...

https://papers.nips.cc › paper › 2020 › hash

Authors. Sung Woo Park, Dong Wook Shu, Junseok Kwon. Abstract. In this paper, we present a novel classification method called deep diffusion-invariant ...

[CITATION] Deep Diffusion-Invariant Wasserstein Distributional Classification

SW Park, DW Shu, J Kwon - Advances in Neural Information Processing Systems, 2020

  Related articles

 

2020

 Hausdorff and Wasserstein metrics on graphs and other structured data

5 citations* for all

1 citations*

2020 INFORMATION AND INFERENCE: A JOURNAL OF THE IMA

Evan Patterson

Stanford University

Wasserstein metric

Matching (graph theory)

View More (8+) 

Optimal transport is widely used in pure and applied mathematics to find probabilistic solutions to hard combinatorial matching problems. We extend the Wasserstein metric and other elements of optimal transport from the matching of sets to the matching of graphs and other structured data. This struc... View Full Abstract 


2020  [PDF] arxiv.org

Wasserstein distances for stereo disparity estimation

D Garg, Y Wang, B Hariharan, M Campbell… - arXiv preprint arXiv …, 2020 - arxiv.org

Existing approaches to depth or disparity estimation output a distribution over a set of pre-

defined discrete values. This leads to inaccurate results when the true depth or disparity

does not match any of these values. The fact that this distribution is usually learned indirectly …

  Cited by 4 Related articles All 4 versions 

[CITATION] Supplementary Material: Wasserstein Distances for Stereo Disparity Estimation

D Garg, Y Wang, B Hariharan, M Campbell

 

 Reinforced Wasserstein Training for Severity-Aware Semantic Segmentation in Autonomous Driving

1 citations*

2020 ARXIV: COMPUTER VISION AND PATTERN RECOGNITION

Xiaofeng Liu ,Yimeng Zhang ,Xiongchang Liu ,Song Bai ,Site Li see all 6 authors

Harvard University

Metric (mathematics)

Set (psychology)

View More (9+) 

Semantic segmentation is important for many real-world systems, e.g., autonomous vehicles, which predict the class of each pixel. Recently, deep networks achieved significant progress w.r.t. the mean Intersection-over Union (mIoU) with the cross-entropy loss. However, the cross-entropy loss can esse... View Full Abstract 

 Cited by 4 Related articles All 5 versions 

2020  [PDF] arxiv.org

Wasserstein-based graph alignment

HP Maretic, ME Gheche, M Minder, G Chierchia… - arXiv preprint arXiv …, 2020 - arxiv.org

We propose a novel method for comparing non-aligned graphs of different sizes, based on

the Wasserstein distance between graph signal distributions induced by the respective

graph Laplacian matrices. Specifically, we cast a new formulation for the one-to-many graph …

  Cited by 6 Related articles All 3 versions 

Wasserstein-based Graph Alignment

H Petric Maretic, M El Gheche, M Minder… - arXiv e …, 2020 - ui.adsabs.harvard.edu

We propose a novel method for comparing non-aligned graphs of different sizes, based on

the Wasserstein distance between graph signal distributions induced by the respective

graph Laplacian matrices. Specifically, we cast a new formulation for the one-to-many graph …

 


2020 see 2019

Deep joint two-stream Wasserstein auto-encoder and selective attention alignment for unsupervised domain adaptation

Z Chen, C Chen, X Jin, Y Liu, Z Cheng - Neural computing and …, 2020 - Springer

Abstract Domain adaptation refers to the process of utilizing the labeled source domain data

to learn a model that can perform well in the target domain with limited or missing labels.

Several domain adaptation methods combining image translation and feature alignment  …

  Cited by 13 Related articles All 3 versions


2020  [PDF] arxiv.org

Unsupervised Multilingual Alignment using Wasserstein Barycenter

X Lian, K Jain, J Truszkowski, P Poupart… - arXiv preprint arXiv …, 2020 - arxiv.org

We study unsupervised multilingual alignment, the problem of finding word-to-word

translations between multiple languages without using any parallel data. One popular

strategy is to reduce multilingual alignment to the much simplified bilingual setting, by …

  Cited by 1 Related articles All 8 versions 

<——2020——2020—3120— 



2020  [PDF] arxiv.org

Modeling EEG data distribution with a wasserstein generative adversarial network to predict rsvp events

S Panwar, P Rad, TP Jung… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

Electroencephalography (EEG) data are difficult to obtain due to complex experimental

setups and reduced comfort with prolonged wearing. This poses challenges to train powerful

deep learning model with the limited EEG data. Being able to generate EEG data …

  Cited by 5 Related articles All 5 versions



2020  [PDF] arxiv.org

Asymptotic guarantees for generative modeling based on the smooth wasserstein distance

Z Goldfeld, K Greenewald, K Kato - arXiv preprint arXiv:2002.01012, 2020 - arxiv.org

Minimum distance estimation (MDE) gained recent attention as a formulation of (implicit)

generative modeling. It considers minimizing, over model parameters, a statistical distance

between the empirical data distribution and the model. This formulation lends itself well to …

  Cited by 5 Related articles All 3 versions 



2020 [PDF] openreview.net

Wasserstein Distributional Normalization: Nonparametric Stochastic Modeling for Handling Noisy Labels

SW Park, J Kwon - 2020 - openreview.net

We propose a novel Wasserstein distributional normalization (WDN) algorithm to handle

noisy labels for accurate classification. In this paper, we split our data into uncertain and

certain samples based on small loss criteria. We investigate the geometric relationship …

  Related articles 


year 2020  [PDF] brown.edu

[PDF] Reduced-order modeling of transport equations using Wasserstein spaces

V Ehrlacher, D Lombardi, O Mula, FX Vialard - icerm.brown.edu

Introduction to Wasserstein spaces and barycenters. Model order reduction of parametric transport equations. Reduced-order modeling of transport equations using Wasserstein spaces. V. Ehrlacher, D. Lombardi, O. Mula, F.-X. Vialard … For all u, v ∈ P2(Ω), the 2-Wasserstein …

  Related articles 



2020  [PDF] arxiv.org

Reinforced wasserstein training for severity-aware semantic segmentation in autonomous driving

X Liu, Y Zhang, X Liu, S Bai, S Li, J You - arXiv preprint arXiv:2008.04751, 2020 - arxiv.org

Semantic segmentation is important for many real-world systems, eg, autonomous vehicles,

which predict the class of each pixel. Recently, deep networks achieved significant progress

wrt the mean Intersection-over Union (mIoU) with the cross-entropy loss. However, the cross …

  Cited by 1 Related articles All 4 versions 


2020


 SVGD as a kernelized Wasserstein gradient flow of the chi-squared divergence

2 citations*

2020 ARXIV: STATISTICS THEORY

View More 

 The Wasserstein Gradient Flow of the Fisher Information and the Quantum Drift-diffusion Equation

182 citations*

2009 ARCHIVE FOR RATIONAL MECHANICS AND ANALYSIS

Ugo Gianazza ,Giuseppe Savaré ,Giuseppe Toscani

University of Pavia

Delta-v (physics)

Probability measure

View More (9+) 

We prove the global existence of non-negative variational solutions to the “drift diffusion” evolution equation $$\partial_t u + \mathrm{div}\left(u\,\mathrm{D}\left(2\,\frac{\Delta \sqrt{u}}{\sqrt{u}} - f\right)\right) = 0$$ ... View Full Abstract 




2020  [PDF] arxiv.org

SVGD as a kernelized Wasserstein gradient flow of the chi-squared divergence

S Chewi, TL Gouic, C Lu, T Maunu… - arXiv preprint arXiv …, 2020 - arxiv.org

Stein Variational Gradient Descent (SVGD), a popular sampling algorithm, is often described

as the kernelized gradient flow for the Kullback-Leibler divergence in the geometry of

optimal transport. We introduce a new perspective on SVGD that instead views SVGD as the …

  Cited by 11 Related articles All 7 versions 



2020  [HTML] hindawi.com

[HTML] An Integrated Consensus Improving Strategy Based on PL-Wasserstein Distance and Its Application in the Evaluation of Network Public Opinion Emergencies

S Zhang, Z Ma, X Liu, Z Wang, L Jiang - Complexity, 2020 - hindawi.com

In real life, multiple network public opinion emergencies may break out in a certain place at

the same time. So, it is necessary to invite emergency decision experts in multiple fields for

timely evaluating the comprehensive crisis of the online public opinion, and then limited …

  Related articles All 9 versions 



2020  [PDF] jst.go.jp

Orthogonal gradient penalty for fast training of wasserstein gan based multi-task autoencoder toward robust speech recognition

CY Kao, S Park, A Badi, DK Han… - IEICE TRANSACTIONS on …, 2020 - search.ieice.org

Performance in Automatic Speech Recognition (ASR) degrades dramatically in noisy

environments. To alleviate this problem, a variety of deep networks based on convolutional

neural networks and recurrent neural networks were proposed by applying L1 or L2 loss. In …

  Cited by 1 Related articles All 5 versions


 2020 see 2019  [PDF] nsf.gov

The quadratic Wasserstein metric for inverse data matching

B Engquist, K Ren, Y Yang - Inverse Problems, 2020 - iopscience.iop.org

This work characterizes, analytically and numerically, two major effects of the quadratic

Wasserstein (W 2) distance as the measure of data discrepancy in computational solutions

of inverse problems. First, we show, in the infinite-dimensional setup, that the W 2 distance …

  Cited by 7 Related articles All 6 versions

<——2020——2020—3130— 


2020  [PDF] arxiv.org

Wasserstein-based Projections with Applications to Inverse Problems

H Heaton, SW Fung, AT Lin, S Osher, W Yin - arXiv preprint arXiv …, 2020 - arxiv.org

Inverse problems consist of recovering a signal from a collection of noisy measurements.

These are typically cast as optimization problems, with classic approaches using a data

fidelity term and an analytic regularizer that stabilizes recovery. Recent Plug-and-Play (PnP) …

  Cited by 1 Related articles All 2 versions 


2020  [PDF] arxiv.org

Wasserstein distributionally robust inverse multiobjective optimization

C Dong, B Zeng - arXiv preprint arXiv:2009.14552, 2020 - arxiv.org

Inverse multiobjective optimization provides a general framework for the unsupervised

learning task of inferring parameters of a multiobjective decision making problem (DMP),

based on a set of observed decisions from the human expert. However, the performance of …

  Cited by 2 Related articles All 5 versions 

2020

 Wasserstein Loss With Alternative Reinforcement Learning for Severity-Aware Semantic Segmentation

6 citations*

2020 IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS

Xiaofeng Liu 1,Yunhong Lu 1,Xiongchang Liu 2,Song Bai 3,Site Li 4 see all 6 authors

1 Harvard University ,2 China University of Mining and Technology ,3 University of California, Berkeley ,4 Carnegie Mellon University

Metric (mathematics)

Reinforcement learning

View More (9+) 

Semantic segmentation is important for many real-world systems, e.g., autonomous vehicles, which predict the class of each pixel. Recently, deep networks achieved significant progress w.r.t. the mean Intersection-over Union (mIoU) with the cross-entropy loss. However, the cross entropy loss can esse... View Full Abstract 

2020

 Projection Robust Wasserstein Distance and Riemannian Optimization

0 citations*

2020 ARXIV: LEARNING

View More 

 Robust Wasserstein Profile Inference and Applications to Machine Learning

168 citations* for all

150 citations*

2019 JOURNAL OF APPLIED PROBABILITY

Jose Blanchet 1,Yang Kang 2,Karthyek Murthy 3

1 Stanford University ,2 Columbia University ,3 Singapore University of Technology and Design

Robust optimization

Inference

View More (9+) 

We show that several machine learning estimators, including square-root least absolute shrinkage and selection and regularized logistic regression, can be represented as solutions to distributionally robust optimization problems. The associated uncertainty regions are based on suitably defined Wasse... View Full Abstract 

2020

 The quantum Wasserstein distance of order 1

3 citations*

2020 ARXIV: QUANTUM PHYSICS

View More 

 Quantum Statistical Learning via Quantum Wasserstein Natural Gradient

1 citations* for all

1 citations*

2021 JOURNAL OF STATISTICAL PHYSICS

Simon Becker 1,Wuchen Li 2

1 University of Cambridge ,2 University of South Carolina

Riemannian manifold

Wasserstein metric

View More (8+) 

In this article, we introduce a new approach towards the statistical learning problem $$\mathrm{argmin}_{\rho (\theta ) \in {\mathcal {P}}_{\theta }} W_{Q}^2 (\rho _{\star },\rho (\theta ))$$ to approxim... View Full Abstract 



2020


 Wasserstein metric for improved quantum machine learning with adjacency matrix representations

2020

9 citations*

2020 MACHINE LEARNING: SCIENCE AND TECHNOLOGY

Onur Çaylak 1,2,O. Anatole von Lilienfeld 2,3,Björn Baumeier 1

1 Eindhoven University of Technology ,2 University of California, Los Angeles ,3 University of Basel

Wasserstein metric

Quantum machine learning

View More (3+) 

Cited by 10 Related articles All 5 versions

2020

 On Linear Optimization over Wasserstein Balls

4 citations*

2020 ARXIV: OPTIMIZATION AND CONTROL

View More 

 Data-Driven Chance Constrained Programs over Wasserstein Balls

51 citations*

2018 ARXIV: OPTIMIZATION AND CONTROL

Zhi Chen ,Daniel Kuhn ,Wolfram Wiesemann

Orthant

Ball (bearing)

View More (8+) 

We provide an exact deterministic reformulation for data-driven chance constrained programs over Wasserstein balls. For individual chance constraints as well as joint chance constraints with right-hand side uncertainty, our reformulation amounts to a mixed-integer conic program. In the special case ... View Full Abstract 

2020  [PDF] ams.org

Isometric study of Wasserstein spaces–the real line

G Gehér, T Titkos, D Virosztek - Transactions of the American Mathematical …, 2020 - ams.org

Recently Kloeckner described the structure of the isometry group of the quadratic

Wasserstein space $\mathcal {W} _2 (\mathbb {R}^ n) $. It turned out that the case of the real

line is exceptional in the sense that there exists an exotic isometry flow. Following this line of …

  Cited by 3 Related articles All 9 versions

Learning Wasserstein Isometric Embedding for Point Clouds

K Kawano, S Koide, T Kutsuna - 2020 International Conference …, 2020 - ieeexplore.ieee.org

The Wasserstein distance has been employed for determining the distance between point

clouds, which have variable numbers of points and invariance of point order. However, the

high computational cost associated with the Wasserstein distance hinders its practical …

  Related articles All 2 versions

[PDF] researchgate.net

[PDF] ADDENDUM TO “ISOMETRIC STUDY OF WASSERSTEIN SPACES–THE REAL LINE”

GP Gehér, T Titkos, D Virosztek - researchgate.net

We show an example of a Polish metric space X whose quadratic Wasserstein space W2 (X)

possesses an isometry that splits mass. This gives an affirmative answer to Kloeckner's

question,[2, Question 2]. Let us denote the metric space ([0, 1],|·|), equipped with the usual …

  Related articles 



2020  [PDF] researchgate.net

Tractable reformulations of two-stage distributionally robust linear programs over the type-∞ wasserstein ball

W Xie - Operations Research Letters, 2020 - Elsevier

This paper studies a two-stage distributionally robust stochastic linear program under the

type-∞ Wasserstein ball by providing sufficient conditions under which the program can be

efficiently computed via a tractable convex program. By exploring the properties of binary …

  Cited by 12 Related articles All 4 versions


2020  [PDF] ams.org

On the Wasserstein distance between classical sequences and the Lebesgue measure

L Brown, S Steinerberger - Transactions of the American Mathematical …, 2020 - ams.org

We discuss the classical problem of measuring the regularity of distribution of sets of $ N $

points in $\mathbb {T}^ d $. A recent line of investigation is to study the cost ($= $ mass

$\times $ distance) necessary to move Dirac measures placed on these points to the uniform …

  Cited by 6 Related articles All 5 versions

<——2020——2020—3140—


2020  [PDF] researchgate.net

[PDF] Computational hardness and fast algorithm for fixed-support wasserstein barycenter

T Lin, N Ho, X Chen, M Cuturi… - arXiv preprint arXiv …, 2020 - researchgate.net

We study in this paper the fixed-support Wasserstein barycenter problem (FS-WBP), which

consists in computing the Wasserstein barycenter of m discrete probability measures

supported on a finite metric space of size n. We show first that the constraint matrix arising …

  Cited by 3 Related articles All 2 versions 
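
As context for the fixed-support barycenter problem described in the entry above, one standard linear-programming formulation (a generic statement of FS-WBP, not the specific reformulation analyzed in the paper) is: given discrete measures p_1, …, p_m on a common support of size n, cost matrices C_i, and weights w_i summing to one, solve

$$ \min_{q,\ \pi_1,\dots,\pi_m \ge 0} \ \sum_{i=1}^{m} w_i \,\langle C_i, \pi_i \rangle \quad \text{s.t.}\quad \pi_i \mathbf{1} = q, \quad \pi_i^{\top}\mathbf{1} = p_i, \quad i=1,\dots,m, $$

where q ranges over the probability simplex and each π_i is an n × n transport plan between the barycenter q and the input measure p_i.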

 



2020  [PDF] arxiv.org

Equidistribution of random walks on compact groups II. The Wasserstein metric

B Borda - arXiv preprint arXiv:2004.14089, 2020 - arxiv.org

We consider a random walk $S_k$ with iid steps on a compact group equipped with a bi-invariant metric. We prove quantitative ergodic theorems for the sum $\sum_{k=1}^{N} f(S_k)$ with Hölder continuous test functions $f$, including the central limit theorem, the …

  Cited by 1 Related articles All 4 versions 


2020

Reconstruction of shale image based on Wasserstein Generative Adversarial Networks with gradient penalty

W Zha, X Li, Y Xing, L He, D Li - Advances in Geo-Energy …, 2020 - yandy-ager.com

Abstract Generative Adversarial Networks (GANs), as most popular artificial intelligence

models in the current image generation field, have excellent image generation capabilities.

Based on Wasserstein GANs with gradient penalty, this paper proposes a novel digital core …

  Cited by 11 Related articles 


2020  [HTML] frontiersin.org

[HTML] Eeg signal reconstruction using a generative adversarial network with wasserstein distance and temporal-spatial-frequency loss

T Luo, Y Fan, L Chen, G Guo, C Zhou - Frontiers in neuroinformatics, 2020 - frontiersin.org

Applications based on electroencephalography (EEG) signals suffer from the mutual

contradiction of high classification performance versus low cost. The nature of this

contradiction makes EEG signal reconstruction with high sampling rate and sensitivity …

  Cited by 10 Related articles All 5 versions 


2020

2020  [HTML] springer.com

[HTML] Missing features reconstruction using a wasserstein generative adversarial imputation network

M Friedjungová, D Vašata, M Balatsko… - … on Computational Science, 2020 - Springer

Missing data is one of the most common preprocessing problems. In this paper, we

experimentally research the use of generative and non-generative models for feature

reconstruction. Variational Autoencoder with Arbitrary Conditioning (VAEAC) and …

  Cited by 4 Related articles All 8 versions



2020

Research of MRI Reconstruction Method by Using De-aliasing Wasserstein Generative Adversarial Networks with Gradient Penalty

Z YUAN, M JIANG, Y LI, M ZHI, Z ZHU - ACTA ELECTRONICA SINICA, 2020 - ejournal.org.cn

In this paper, we propose an improved Wasserstein generative adversarial network (WGAN),

de-aliasing Wasserstein generative adversarial network with Gradient Penalty (DAWGAN-

GP), for magnetic resonance imaging (MRI) reconstruction. This method uses WGAN to …

  All 2 versions 


2020  [PDF] arxiv.org

Distributed Wasserstein Barycenters via Displacement Interpolation

P Cisneros-Velarde, F Bullo - arXiv preprint arXiv:2012.08610, 2020 - arxiv.org

Consider a multi-agent system whereby each agent has an initial probability measure. In this

paper, we propose a distributed algorithm based upon stochastic, asynchronous and

pairwise exchange of information and displacement interpolation in the Wasserstein space …

  Related articles All 2 versions


 2020  [PDF] arxiv.org

PLG-IN: Pluggable Geometric Consistency Loss with Wasserstein Distance in Monocular Depth Estimation

N Hirose, S Koide, K Kawano, R Kondo - arXiv preprint arXiv:2006.02068, 2020 - arxiv.org

We propose a novel objective for penalizing geometric inconsistencies to improve the depth

and pose estimation performance of monocular camera images. Our objective is designed

using the Wasserstein distance between two point clouds, estimated from images with …

  Cited by 2 Related articles All 2 versions 



2020  [PDF] arxiv.org

Consistency of Distributionally Robust Risk-and Chance-Constrained Optimization Under Wasserstein Ambiguity Sets

A Cherukuri, AR Hota - IEEE Control Systems Letters, 2020 - ieeexplore.ieee.org

We study stochastic optimization problems with chance and risk constraints, where in the

latter, risk is quantified in terms of the conditional value-at-risk (CVaR). We consider the

distributionally robust versions of these problems, where the constraints are required to hold …

  Cited by 3 Related articles All 4 versions

 <——2020——2020—3150—  


2020  [PDF] arxiv.org

Universal consistency of Wasserstein -NN classifier

D Ponnoprat - arXiv preprint arXiv:2009.04651, 2020 - arxiv.org

The Wasserstein distance provides a notion of dissimilarities between probability measures,

which has recent applications in learning of structured data with varying size such as images

and text documents. In this work, we analyze the $ k $-nearest neighbor classifier ($ k $-NN) …

  Related articles All 3 versions 
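
To make the k-NN construction in the entry above concrete, here is a minimal sketch for one-dimensional empirical distributions. The helper name knn_predict and the use of SciPy's wasserstein_distance (which computes the 1-D 1-Wasserstein distance between samples) are illustrative choices, not the paper's setup.

import numpy as np
from scipy.stats import wasserstein_distance

def knn_predict(train_samples, train_labels, query_sample, k=3):
    # train_samples: list of 1-D arrays, each an empirical sample from one distribution.
    dists = np.array([wasserstein_distance(query_sample, s) for s in train_samples])
    nearest = np.argsort(dists)[:k]
    labels = np.asarray(train_labels)[nearest]
    values, counts = np.unique(labels, return_counts=True)
    return values[np.argmax(counts)]   # majority vote among the k nearest distributions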



2020  [PDF] mlr.press

Quantitative stability of optimal transport maps and linearization of the 2-wasserstein space

Q Mérigot, A Delalande… - … Conference on Artificial …, 2020 - proceedings.mlr.press

This work studies an explicit embedding of the set of probability measures into a Hilbert

space, defined using optimal transport maps from a reference probability density. This

embedding linearizes to some extent the 2-Wasserstein space and is shown to be bi-Hölder …

  Cited by 15 Related articles All 5 versions 
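
The linearization studied in the entry above can be summarized as follows (standard linearized optimal transport, stated only as background): fixing a reference density ρ, each measure μ is represented by the optimal transport (Monge) map T_μ pushing ρ onto μ, and the embedding μ ↦ T_μ ∈ L²(ρ) relates to W₂ through

$$ W_2(\mu,\nu) \;\le\; \| T_\mu - T_\nu \|_{L^2(\rho)}, $$

since $(T_\mu, T_\nu)_{\#}\rho$ is a coupling of μ and ν; the paper quantifies how far this upper bound can be from an equality (the bi-Hölder property mentioned in the abstract).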


2020

A new Wasserstein distance-and cumulative sum-dependent health indicator and its application in prediction of remaining useful life of bearing

J Yin, M Xu, H Zheng, Y Yang - Journal of the Brazilian Society of …, 2020 - Springer

The safety and reliability of mechanical performance are affected by the condition (health

status) of the bearings. A health indicator (HI) with high monotonicity and robustness is a

helpful tool to simplify the predictive model and improve prediction accuracy. In this paper, a …

  Cited by 1 Related articles


2020

 Differential Inclusions in Wasserstein Spaces: The Cauchy-Lipschitz Framework.

0 citations*

2020 ARXIV: OPTIMIZATION AND CONTROL

View More 

 On Linear Optimization over Wasserstein Balls

4 citations* for all

0 citations*

2021 MATHEMATICAL PROGRAMMING

Man-Chung Yue 1,Daniel Kuhn 2,Wolfram Wiesemann 3

1 Hong Kong Polytechnic University ,2 École Polytechnique Fédérale de Lausanne ,3 Imperial College London

Wasserstein metric

Infinite-dimensional optimization

View More (8+) 

Wasserstein balls, which contain all probability measures within a pre-specified Wasserstein distance to a reference measure, have recently enjoyed wide popularity in the distributionally robust optimization and machine learning communities to formulate and solve data-driven optimization problems wi... View Full Abstract 


2020

 A Riemannian Block Coordinate Descent Method for Computing the Projection Robust Wasserstein Distance

2 citations*

2020 ARXIV: LEARNING

View More 

 Improving Relational Regularized Autoencoders with Spherical Sliced Fused Gromov Wasserstein

3 citations* for all

3 citations*

2021 INTERNATIONAL CONFERENCE ON LEARNING REPRESENTATIONS

Khai Nguyen 1,Son Nguyen 2,Nhat Ho 3,Tung Pham 4,Hung Bui 5

1 VinAI Research, Vietnam,2 Worcester Polytechnic Institute ,3 University of Texas at Austin ,4 Vietnam National University, Hanoi ,5 Google

Autoencoder

Discriminative model

View More (8+) 

Relational regularized autoencoder (RAE) is a framework to learn the distribution of data by minimizing a reconstruction loss together with a relational regularization on the prior of latent space. A recent attempt to reduce the inner discrepancy between the prior and aggregated posterior distributi... View Full Abstract 

Cited by 7 Related articles All 10 versions 

2020

[PDF] usp.br

[PDF] Acoplamento de vaserstein e associação de sistemas markovianos de partículas

PA Ferrari - teses.usp.br

This document is only for private use for research and teaching activities. Reproduction for

commercial use is forbidden. This rights cover the whole data about this document as well

as its contents. Any uses or copies of this document in whole or in part must include the …

[Portuguese: Vaserstein coupling and association of Markovian particle systems]


2020 Open Access 

Ripple-GAN: Lane line detection with Ripple Lane Line Detection Network and Wasserstein GAN 

by Zhang, Y; Lu, Z; Ma, D; More... 

11/2020

With artificial intelligence technology being advanced by leaps and bounds, intelligent driving has attracted a huge amount of attention recently in research...

Journal ArticleCitation Online 

Cited by 14 Related articles All 4 versions

 Semantic Inpainting with Multi-dimensional Adversarial Network and Wasserstein Distance 

by Wang, Haodi; Jiao, Libin; Bie, Rongfang; More... 

Pattern Recognition and Computer Vision, 10/2020

Inpainting represents a procedure which can restore the lost parts of an image based upon the residual information. We present an inpainting network that...

Book ChapterFull Text Online 

Semantic Inpainting with Multi-dimensional Adversarial Network and Wasserstein Distance

H Wang, L Jiao, R Bie, H Wu - Chinese Conference on Pattern …, 2020 - Springer

… images both in detail and in general. Compared with the traditional training procedure,

our model combines with Wasserstein Distance that enhances the stability of network

training. The network is training specifically on street …

online

Semantic Inpainting with Multi-dimensional Adversarial Network and Wasserstein Distance

Related articles

[PDF] medrxiv.org

Simulating drug effects on blood glucose laboratory test time series with a conditional WGAN

A Yahi, NP Tatonetti - medRxiv, 2020 - medrxiv.org

The unexpected effects of medications has led to more than 14 million drug adverse events

reported to the Food and Drug Administration (FDA) over the past 10 years in the United

States alone, with a little over 1.3 million of them linked to death, and represents a medical …

  Related articles All 3 versions 


 A TextCNN and WGAN-gp based deep learning frame for unpaired text style transfer in multimedia services

M Hu, M He, W Su, A Chehri - Multimedia Systems, 2020 - Springer

With the rapid growth of big multimedia data, multimedia processing techniques are facing

some challenges, such as knowledge understanding, semantic modeling, feature

representation, etc. Hence, based on TextCNN and WGAN-gp (improved training of …

  Cited by 1 Related articles All 2 versions

<——2020—–—2020—––3160— 

 

Face Inpainting based on Improved WGAN-modified

Y Zhao, L Liu, H Liu, G Xie… - … on Autonomous Systems …, 2020 - ieeexplore.ieee.org

Image Inpainting aims to use the technical methods to repair and reconstruct the corrupted

region of the corrupted image, so that the reconstructed image looks more authentic. In this

paper, the improved Wasserstein Generative Adversarial Network combined with the …

  Related articles


[PDF] uepg.br

Generating synthetic 2019-nCoV samples with WGAN to increase the precision of an Ensemble Classifier

A Santos, DR Carvalho - Iberoamerican Journal of Applied …, 2020 - revistas.uepg.br

The objective of this research is to present an alternative data augmentation technique

based on WGAN to improve the precision in detection of positive 2019-nCoV samples, as

well as compare it with other traditional data augmentation techniques, using a dataset …

  All 2 versions 


WGAN-GP overriding Model.train_step

https://colab.research.google.com › generative › ipynb

May 9, 2020 — The WGAN-GP method proposes an alternative to weight clipping to ensure smooth training. Instead …

[CITATION] Wgan-gp overriding model. train step

AK Nain - 2020 - May

  Cited by 2
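
Since several entries in this section rely on the WGAN-GP objective mentioned above, a minimal sketch of the gradient-penalty term is included here. It is written in TensorFlow-style Python from the standard formulation; the critic model and image-shaped batch tensors are assumptions, and this is not the code of the tutorial cited above.

import tensorflow as tf

def gradient_penalty(critic, real, fake):
    # Interpolate between real and generated samples (one mixing coefficient per example).
    eps = tf.random.uniform([tf.shape(real)[0], 1, 1, 1], 0.0, 1.0)
    interp = eps * real + (1.0 - eps) * fake
    with tf.GradientTape() as tape:
        tape.watch(interp)
        scores = critic(interp, training=True)
    grads = tape.gradient(scores, interp)
    # Penalize deviation of the critic's gradient norm from 1 (soft 1-Lipschitz constraint).
    norms = tf.sqrt(tf.reduce_sum(tf.square(grads), axis=[1, 2, 3]) + 1e-12)
    return tf.reduce_mean((norms - 1.0) ** 2)

# Critic loss: mean(critic(fake)) - mean(critic(real)) + lambda_gp * gradient_penalty(critic, real, fake)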


2020

[CITATION] Insulator target detection based on image deblurring of WGAN

DW Wang, YD Li - Journal of Electric Power Automation Equipment, 2020

  Cited by 1 Related articles


2020 

 Interpretable Model Summaries Using the Wasserstein Distance

0 citations*

2020 ARXIV: METHODOLOGY

Eric Dunipace ,Lorenzo Trippa

Harvard University

Statistical model

Bayesian inference

View More (7+) 

Statistical models often include thousands of parameters. However, large models decrease the investigator's ability to interpret and communicate the estimated parameters. Reducing the dimensionality of the parameter space in the estimation phase is a commonly used approach, but less work has focused... View Full Abstract 


 Distributional Sliced-Wasserstein and Applications to Generative Modeling

4 citations*

2020 ARXIV: MACHINE LEARNING

View More 

 Strong equivalence between metrics of Wasserstein type

7 citations* for all

1 citations*

2021 ELECTRONIC COMMUNICATIONS IN PROBABILITY

Erhan Bayraktar 1,Gaoyue Guo 2

1 University of Michigan ,2 University of Paris

Wasserstein metric

Equivalence (measure theory)

View More (5+) 

The sliced Wasserstein metric Wp and more recently max-sliced Wasserstein metric Wp have attracted abundant attention in data sciences and machine learning due to their advantages to tackle the curse of dimensionality, see e.g. [15], [6]. A question of particular importance is the strong equivalenc... View Full Abstract 
Cited by 33 Related articles All 12 versions 
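
For context on the entry above, the sliced and max-sliced Wasserstein metrics it compares are usually defined as follows (standard definitions, restated here rather than quoted from the paper): for probability measures μ, ν on $\mathbb{R}^d$,

$$ \mathcal{SW}_p(\mu,\nu) = \Big( \int_{\mathbb{S}^{d-1}} W_p^p\big(\theta_{\#}\mu,\ \theta_{\#}\nu\big)\, \mathrm{d}\sigma(\theta) \Big)^{1/p}, \qquad \overline{W}_p(\mu,\nu) = \max_{\theta \in \mathbb{S}^{d-1}} W_p\big(\theta_{\#}\mu,\ \theta_{\#}\nu\big), $$

where $\theta_{\#}\mu$ is the pushforward of μ under the one-dimensional projection $x \mapsto \langle \theta, x\rangle$ and σ is the uniform measure on the unit sphere.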


 2020 

 Learning Graphons via Structured Gromov-Wasserstein Barycenters

1 citations*

2020 ARXIV: LEARNING

View More 

 Revisiting Fixed Support Wasserstein Barycenter: Computational Hardness and Efficient Algorithms.

3 citations*

Tianyi Lin ,Nhat Ho ,Xi Chen ,Marco Cuturi ,Michael I. Jordan

Unimodular matrix

Metric space

View More (8+) 

We study the fixed-support Wasserstein barycenter problem (FS-WBP), which consists in computing the Wasserstein barycenter of $m$ discrete probability measures supported on a finite metric space of size $n$. We show first that the constraint matrix arising from the standard linear programming (LP) r... View Full Abstract 

  

 2020 

 Improved Image Wasserstein Attacks and Defenses.

5 citations*

2020 ARXIV: LEARNING

View More 

 Wasserstein Contrastive Representation Distillation

1 citations* for all

0 citations*

2021 COMPUTER VISION AND PATTERN RECOGNITION

Liqun Chen 1,Dong Wang 2,Zhe Gan 3,Jingjing Liu 3,Ricardo Henao 4 see all 6 authors

1 University of Surrey ,2 Duke University ,3 Microsoft ,4 Center for Applied Genomics

Knowledge transfer

Representation (mathematics)

View More (8+) 

The primary goal of knowledge distillation (KD) is to encapsulate the information of a model learned from a teacher network into a student network, with the latter being more compact than the former. Existing work, e.g., using Kullback-Leibler divergence for distillation, may fail to capture importa... View Full Abstract 

Cited by 7 Related articles All 4 versions 

 2020 

 Modeling EEG data distribution with a Wasserstein Generative Adversarial Network to predict RSVP Events

1 citations*

2019 ARXIV: IMAGE AND VIDEO PROCESSING

View More 

 Nested-Wasserstein Self-Imitation Learning for Sequence Generation

2 citations* for all

2 citations*

2020 INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS

Ruiyi Zhang 1,Changyou Chen 2,Zhe Gan 3,Zheng Wen 4,Wenlin Wang 1 see all 6 authors

1 Duke University ,2 State University of New York System ,3 Microsoft ,4 DeepMind

Semantic matching

Reinforcement learning

View More (9+) 

Reinforcement learning (RL) has been widely studied for improving sequence-generation models. However, the conventional rewards used for RL training typically cannot capture sufficient semantic information and therefore render model bias. Further, the sparse and delayed rewards make RL exploration i... View Full Abstract 


2020 see 2019

 Modeling EEG Data Distribution With a Wasserstein Generative Adversarial Network to Predict RSVP Events

5 citations* for all

4 citations*

2020 INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY

Sharaj Panwar 1,Paul Rad 1,Tzyy-Ping Jung 2,Yufei Huang 1

1 University of Texas at San Antonio ,2 University of California, San Diego

Deep learning

Convolutional neural network

View More (12+) 

Electroencephalography (EEG) data are difficult to obtain due to complex experimental setups and reduced comfort with prolonged wearing. This poses challenges to train powerful deep learning model with the limited EEG data. Being able to generate EEG data computationally could address this limitatio... View Full Abstract 

<——2020——2020—3170— 


2020  [PDF] arxiv.org

Scalable computations of wasserstein barycenter via input convex neural networks

J Fan, A Taghvaei, Y Chen - arXiv preprint arXiv:2007.04462, 2020 - arxiv.org

Wasserstein Barycenter is a principled approach to represent the weighted mean of a given

set of probability distributions, utilizing the geometry induced by optimal transport. In this

work, we present a novel scalable algorithm to approximate the Wasserstein Barycenters …

  Cited by 8 Related articles All 3 versions 


2020  [PDF] mlr.press

Nested-wasserstein self-imitation learning for sequence generation

R Zhang, C Chen, Z Gan, Z Wen… - International …, 2020 - proceedings.mlr.press

Reinforcement learning (RL) has been widely studied for improving sequence-generation

models. However, the conventional rewards used for RL training typically cannot capture

sufficient semantic information and therefore render model bias. Further, the sparse and …

Cited by 5 Related articles All 9 versions 

2020 see 2019  [PDF] archives-ouvertes.fr

The Wasserstein-Fourier distance for stationary time series

E Cazelles, A Robert, F Tobar - IEEE Transactions on Signal …, 2020 - ieeexplore.ieee.org

We propose the Wasserstein-Fourier (WF) distance to measure the (dis) similarity between

time series by quantifying the displacement of their energy across frequencies. The WF

distance operates by calculating the Wasserstein distance between the (normalised) power …

  Cited by 2 Related articles All 35 versions

2020  [PDF] aaai.org

[PDF] Swift: Scalable wasserstein factorization for sparse nonnegative tensors

A Afshar, K Yin, S Yan, C Qian, JC Ho, H Park… - arXiv preprint arXiv …, 2020 - aaai.org

Existing tensor factorization methods assume that the input tensor follows some specific

distribution (ie Poisson, Bernoulli, and Gaussian), and solve the factorization by minimizing

some empirical loss functions defined based on the corresponding distribution. However, it …

  Cited by 2 Related articles All 7 versions 


2020

First arrival picking of microseismic signals based on nested U-Net and Wasserstein Generative Adversarial Network

JL Zhang, GQ Sheng - Journal of Petroleum Science and Engineering, 2020 - Elsevier

Picking the first arrival of microseismic signals, quickly and accurately, is the key for real-time

data processing of microseismic monitoring. The traditional method cannot meet the high-

accuracy and high-efficiency requirements for the first-arrival microseismic picking, in a low …

  Cited by 4 Related articles All 2 versions

 

2020


Rethinking Wasserstein-Procrustes for Aligning Word Embeddings Across Languages

G Ramírez Santos - 2020 - upcommons.upc.edu

The emergence of unsupervised word embeddings, pre-trained on very large monolingual

text corpora, is at the core of the ongoing neural revolution in Natural Language Processing

(NLP). Initially introduced for English, such pre-trained word embeddings quickly emerged …

Rethinking Wasserstein-Procrustes for Aligning Word Embeddings Across Languages

2020  [HTML] frontiersin.org

[HTML] Eeg signal reconstruction using a generative adversarial network with wasserstein distance and temporal-spatial-frequency loss

T Luo, Y Fan, L Chen, G Guo, C Zhou - Frontiers in neuroinformatics, 2020 - frontiersin.org

Applications based on electroencephalography (EEG) signals suffer from the mutual

contradiction of high classification performance versus low cost. The nature of this

contradiction makes EEG signal reconstruction with high sampling rate and sensitivity …

  Cited by 10 Related articles All 5 versions 



2020  [PDF] arxiv.org

Modeling EEG data distribution with a wasserstein generative adversarial network to predict rsvp events

S Panwar, P Rad, TP Jung… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

Electroencephalography (EEG) data are difficult to obtain due to complex experimental

setups and reduced comfort with prolonged wearing. This poses challenges to train powerful

deep learning model with the limited EEG data. Being able to generate EEG data …

  Cited by 5 Related articles All 5 versions


 2020  [PDF] tins.ro

Enhancing the classification of EEG signals using Wasserstein generative adversarial networks

VM Petruţiu, LD Palcu, C Lemnaur… - 2020 IEEE 16th …, 2020 - ieeexplore.ieee.org

Collecting EEG signal data during a human visual recognition task is a costly and time-

consuming process. However, training good classification models usually requires a large

amount of quality data. We propose a data augmentation method based on Generative …

  Cited by 1 Related articles All 2 versions


2020  [PDF] arxiv.org

Interpretable Model Summaries Using the Wasserstein Distance

E Dunipace, L Trippa - arXiv preprint arXiv:2012.09999, 2020 - arxiv.org

Statistical models often include thousands of parameters. However, large models decrease

the investigator's ability to interpret and communicate the estimated parameters. Reducing

the dimensionality of the parameter space in the estimation phase is a commonly used …

  Related articles All 2 versions 

<——2020——2020—3180—


2020

EEG data augmentation using Wasserstein GAN

G Bouallegue, R Djemal - 2020 20th International Conference …, 2020 - ieeexplore.ieee.org

Electroencephalogram (EEG) presents a challenge during the classification task using

machine learning and deep learning techniques due to the lack or to the low size of

available datasets for each specific neurological disorder. Therefore, the use of data …

  Cited by 1 Related articles


2020  [PDF] arxiv.org

Wasserstein distributionally robust look-ahead economic dispatch

BK Poolla, AR Hota, S Bolognani… - … on Power Systems, 2020 - ieeexplore.ieee.org

We consider the problem of look-ahead economic dispatch (LAED) with uncertain

renewable energy generation. The goal of this problem is to minimize the cost of

conventional energy generation subject to uncertain operational constraints. The risk of …

  Cited by 4 Related articles All 3 versions


2020

Wasserstein Distributionally Robust Look-Ahead Economic Dispatch

B Kameshwar Poolla, AR Hota, S Bolognani… - arXiv e …, 2020 - ui.adsabs.harvard.edu

We consider the problem of look-ahead economic dispatch (LAED) with uncertain

renewable energy generation. The goal of this problem is to minimize the cost of

conventional energy generation subject to uncertain operational constraints. These …


2020  [PDF] arxiv.org

Cutoff thermalization for Ornstein-Uhlenbeck systems with small L\'evy noise in the Wasserstein distance

G Barrera, MA Högele, JC Pardo - arXiv preprint arXiv:2009.10590, 2020 - arxiv.org

This article establishes cutoff thermalization (also known as the cutoff phenomenon) for a

general class of general Ornstein-Uhlenbeck systems $(X^\epsilon_t (x)) _ {t\geq 0} $ under

$\epsilon $-small additive L\'evy noise with initial value $ x $. The driving noise processes …

  Cited by 3 Related articles All 3 versions 


 2020 see 2019

Irregularity of distribution in Wasserstein distance

C Graham - Journal of Fourier Analysis and Applications, 2020 - Springer

We study the non-uniformity of probability measures on the interval and circle. On the

interval, we identify the Wasserstein-p distance with the classical L^ p L p-discrepancy. We

thereby derive sharp estimates in Wasserstein distances for the irregularity of distribution of

sequences on the interval and circle. Furthermore, we prove an L^ p L p-adapted Erdős–

Turán inequality, and use it to extend a well-known bound of Pólya and Vinogradov on the

equidistribution of quadratic residues in finite fields.

 Cited by 7 Related articles All 3 versions

   Irregularity of Distribution in Wasserstein Distance

3 citations* for all

1 citations*

2020 JOURNAL OF FOURIER ANALYSIS AND APPLICATIONS

Cole Graham

Stanford University

Probability measure

Distribution (number theory)

View More (8+) 

We study the non-uniformity of probability measures on the interval and circle. On the interval, we identify the Wasserstein-p distance with the classical $$L^p$$ -discrepancy. We thereby derive sharp estimates in Wasserstein distances for the irregularity of distribution of sequences on the inter... View Full Abstract 

Cited by 9 Related articles All 3 versions

2020

MR4142495 Prelim Bigot, Jérémie; Statistical data analysis in the Wasserstein space. Journées MAS 2018—Sampling and processes, 1–19, ESAIM Proc. Surveys, 68, EDP Sci., Les Ulis, 2020. 62R20 (49Q22 60B10 62G05 62H25)


 Cited by 6 Related articles All 2 versions


 2020

 Symmetric Skip Connection Wasserstein GAN for High-Resolution Facial Image Inpainting

7 citations*

2020 ARXIV: COMPUTER VISION AND PATTERN RECOGNITION

View More 

 Robust W-GAN-Based Estimation Under Wasserstein Contamination.

0 citations*

2021 ARXIV: STATISTICS THEORY

Zheng Liu 1,Po-Ling Loh 2

1 University of Wisconsin-Madison ,2 University of Cambridge

Estimator

Minimax

View More (8+) 

Robust estimation is an important problem in statistics which aims at providing a reasonable estimator when the data-generating distribution lies within an appropriately defined ball around an uncontaminated distribution. Although minimax rates of estimation have been established in recent years, ma... View Full Abstract 


2020

 node2coords: Graph Representation Learning with Wasserstein Barycenters

2 citations*

2020 ARXIV: LEARNING

View More 

 Prediction and analysis of multiple protein lysine modified sites based on conditional wasserstein generative adversarial networks

0 citations*

2021 BMC BIOINFORMATICS

Yingxi Yang 1,Hui Wang 2,Wen Li 1,Xiaobo Wang 1,Shizhao Wei 3 see all 7 authors

1 University of Science and Technology Beijing ,2 Chinese Academy of Sciences ,3 No. 15 Research Institute, China Electronics Technology Group Corporation, Beijing, 100083, China.

Matthews correlation coefficient

Pearson product-moment correlation coefficient

View More (16+) 

Protein post-translational modification (PTM) is a key issue to investigate the mechanism of protein’s function. With the rapid development of proteomics technology, a large amount of protein sequence data has been generated, which highlights the importance of the in-depth study and analysis of PTMs... View Full Abstract 

Cited by 5 Related articles All 6 versions

2020

 Wasserstein k-means with sparse simplex projection

2 citations*

2020 ARXIV: LEARNING

View More 

 Wasserstein Autoregressive Models for Density Time Series

7 citations* for all

1 citations*

2021 JOURNAL OF TIME SERIES ANALYSIS

Chao Zhang 1,Piotr Kokoszka 2,Alexander Petersen 1,3

1 University of California, Santa Barbara ,2 Colorado State University ,3 Brigham Young University

Wasserstein metric

Autoregressive model

View More (8+) 

Data consisting of time-indexed distributions of cross-sectional or intraday returns have been extensively studied in finance, and provide one example in which the data atoms consist of serially dependent probability distributions. Motivated by such data, we propose an autoregressive model for densi... View Full Abstract 


2020

 Wasserstein Autoregressive Models for Density Time Series

6 citations*

2020 ARXIV: METHODOLOGY

View More 

 Distributional robustness in minimax linear quadratic control with Wasserstein distance.

1 citations*

2021 ARXIV E-PRINTS

Kihyun Kim ,Insoon Yang

Seoul National University

Algebraic Riccati equation

Wasserstein metric

View More (8+) 

To address the issue of inaccurate distributions in practical stochastic systems, a minimax linear-quadratic control method is proposed using the Wasserstein metric. Our method aims to construct a control policy that is robust against errors in an empirical distribution of underlying uncertainty, by... View Full Abstract 

<——2020——2020—3190—  

2020

 Second-order Conic Programming Approach for Wasserstein Distributionally Robust Two-stage Linear Programs

1 citations*

2020 ARXIV: OPTIMIZATION AND CONTROL

View More 

 2020 

 Quantum statistical learning via Quantum Wasserstein natural gradient

0 citations*

2020 ARXIV: MATHEMATICAL PHYSICS

View More 

 Wasserstein Distance guided Adversarial Imitation Learning with Reward Shape Exploration

5 citations* for all

3 citations*

2020 2020 IEEE 9TH DATA DRIVEN CONTROL AND LEARNING SYSTEMS CONFERENCE (DDCLS)

Ming Zhang 1,Yawei Wang 1,Xiaoteng Ma 1,Li Xia 2,Jun Yang 1 see all 7 authors

1 Tsinghua University ,2 Sun Yat-sen University

Reinforcement learning

Divergence (statistics)

View More (8+) 

The generative adversarial imitation learning (GAIL) has provided an adversarial learning framework for imitating expert policy from demonstrations in high-dimensional continuous tasks. However, almost all GAIL and its extensions only design a kind of reward function of logarithmic form in the adver... View Full Abstract 

 2020 

 Wasserstein and Kolmogorov Error Bounds for Variance-Gamma Approximation via Stein’s Method I

21 citations* for all

19 citations*

2020 JOURNAL OF THEORETICAL PROBABILITY

Robert E. Gaunt

University of Manchester

Stein's method

Laplace's method

View More (8+) 

The variance-gamma (VG) distributions form a four-parameter family that includes as special and limiting cases the normal, gamma and Laplace distributions. Some of the numerous applications include financial modelling and approximation on Wiener space. Recently, Stein’s method has been extended to t... View Full Abstract 

Cited by 22 Related articles All 8 versions

 2020 

 Minimax Control of Ambiguous Linear Stochastic Systems Using the Wasserstein Metric

4 citations* for all

4 citations*

2020 CONFERENCE ON DECISION AND CONTROL

Kihyun Kim ,Insoon Yang

Seoul National University

Algebraic Riccati equation

Riccati equation

View More (8+) 

In this paper, we propose a minimax linear-quadratic control method to address the issue of inaccurate distribution information in practical stochastic systems. To construct a control policy that is robust against errors in an empirical distribution of uncertainty, our method adopts an adversary, wh... View Full Abstract 

Cited by 7 Related articles All 6 versions

 2020 see 2019

 Wasserstein Smoothing: Certified Robustness against Wasserstein Adversarial Attacks

25 citations* for all

19 citations*

2020 INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS

Alexander Levine ,Soheil Feizi

University of Maryland, College Park

Wasserstein metric

Smoothing

View More (8+) 

In the last couple of years, several adversarial attack methods based on different threat models have been proposed for the image classification problem. Most existing defenses consider additive threat models in which sample perturbations have bounded L_p norms. These defenses, however, can be vulne... View Full Abstract 

Cited by 32 Related articles All 7 versions 

2020

 2020 see 2018

 Convergence and Concentration of Empirical Measures under Wasserstein Distance in Unbounded Functional Spaces

56 citations* for all

53 citations*

2020 BERNOULLI

Jing Lei

Probability measure

Concentration inequality

View More (8+) 

We provide upper bounds of the expected Wasserstein distance between a probability measure and its empirical version, generalizing recent results for finite dimensional Euclidean spaces and bounded functional spaces. Such a generalization can cover Euclidean spaces with large dimensionality, with th... View Full Abstract 

Cited by 71 Related articles All 5 versions

 2020 

 First-Order Methods for Wasserstein Distributionally Robust MDP

1 citations*

2020 ARXIV: OPTIMIZATION AND CONTROL

View More 

 Distributionally robust chance-constrained programs with right-hand side uncertainty under Wasserstein ambiguity

8 citations* for all

1 citations*

2021 MATHEMATICAL PROGRAMMING

Nam Ho-Nguyen 1,Fatma Kılınç-Karzan 2,Simge Küçükyavuz 3,Dabeen Lee 4

1 University of Sydney ,2 Carnegie Mellon University ,3 Northwestern University ,4 Discrete Mathematics Group, Institute for Basic Science (IBS), Daejeon, Republic of Korea

Relaxation (approximation)

Transportation theory

View More (7+) 

We consider exact deterministic mixed-integer programming (MIP) reformulations of distributionally robust chance-constrained programs (DR-CCP) with random right-hand sides over Wasserstein ambiguity sets. The existing MIP formulations are known to have weak continuous relaxation bounds, and, consequ... View Full Abstract 

Cited by 6 Related articles All 5 versions 

 2020 

 Distributionally Robust Chance-Constrained Programs with Right-Hand Side Uncertainty under Wasserstein Ambiguity

7 citations*

2020 ARXIV: OPTIMIZATION AND CONTROL

View More 

 Ensemble Riemannian data assimilation over the Wasserstein space

0 citations* for all

0 citations*

2021 NONLINEAR PROCESSES IN GEOPHYSICS

Sagar K. Tamang 1,Ardeshir Ebtehaj 1,Peter J. van Leeuwen 2,Dongmian Zou 3,Gilad Lerman 1

1 University of Minnesota ,2 Colorado State University ,3 Duke University

Wasserstein metric

Euclidean distance

View More (8+) 

Abstract. In this paper, we present an ensemble data assimilation paradigm over a Riemannian manifold equipped with the Wasserstein metric. Unlike the Euclidean distance used in classic data assimilation methodologies, the Wasserstein metric can capture the translation and difference between the sha... View Full Abstract 



 2020 

 Regularized Variational Data Assimilation for Bias Treatment using the Wasserstein Metric

0 citations*

2020 ARXIV: METHODOLOGY

View More 

 The quadratic Wasserstein metric for inverse data matching

8 citations* for all

8 citations*

2020 INVERSE PROBLEMS

Björn Engquist ,Kui Ren ,Yunan Yang

Wasserstein metric

Smoothing

View More (8+) 

This work characterizes, analytically and numerically, two major effects of the quadratic Wasserstein ($W_2$) distance as the measure of data discrepancy in computational solutions of inverse problems. First, we show, in the infinite-dimensional setup, that the $W_2$ distance has a smoothing effect ... View Full Abstract 
Cited by 19
 Related articles All 7 versions


 2020 

 Sample generation based on a supervised Wasserstein Generative Adversarial Network for high-resolution remote-sensing scene classification

7 citations*

2020 INFORMATION SCIENCES

Wei Han ,Lizhe Wang ,Ruyi Feng ,Lang Gao ,Xiaodao Chen see all 8 authors

China University of Geosciences (Wuhan)

Convolutional neural network

Classifier (UML)

Abstract As high-resolution remote-sensing (HRRS) images have become increasingly widely available, scene classification focusing on the smart classification of land cover and land use has also attracted more attention. However, mainstream methods encounter a severe problem in that many annotation... View Full Abstract 

Cited by 21 Related articles All 2 versions

 <——2020——2020—3200— 


 Obtaining PET/CT images from non-attenuation corrected PET images in a single PET system using Wasserstein generative adversarial networks.

13 citations*

2020 PHYSICS IN MEDICINE AND BIOLOGY

Zhanli Hu ,Yongchang Li ,Sijuan Zou ,Hengzhi Xue ,Ziru Sang see all 10 authors

Chinese Academy of Sciences

PET-CT

Positron emission tomography

View More (17+) 

Positron emission tomography (PET) imaging plays an indispensable role in early disease detection and postoperative patient staging diagnosis. However, PET imaging requires not only additional computed tomography (CT) imaging to provide detailed anatomical information but also attenuation correction... View Full Abstract 
 Cited by 17 Related articles All 5 versions


 2020 

 Wasserstein Exponential Kernels

1 citations* for all

0 citations*

2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORK

Henri De Plaen ,Michael Fanuel ,Johan A. K. Suykens

Katholieke Universiteit Leuven

Kernel method

Kernel (statistics)

View More (8+) 

In the context of kernel methods, the similarity between data points is encoded by the kernel function which is often defined thanks to the Euclidean distance; the squared exponential kernel is a common example. Recently, other distances relying on optimal transport theory – such as the Wasserstein ... View Full Abstract 


Cited by 5 Related articles All 5 versions
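
A minimal illustration of the kind of kernel discussed in the entry above, for one-dimensional empirical samples, is given below. The exponential-of-W1 form, the bandwidth sigma, and the use of SciPy's 1-D solver are illustrative assumptions rather than the paper's exact construction.

import numpy as np
from scipy.stats import wasserstein_distance

def wasserstein_exp_gram(samples, sigma=1.0):
    # samples: list of 1-D arrays; returns the Gram matrix K[i, j] = exp(-W1(s_i, s_j) / sigma).
    n = len(samples)
    K = np.empty((n, n))
    for i in range(n):
        for j in range(i, n):
            K[i, j] = K[j, i] = np.exp(-wasserstein_distance(samples[i], samples[j]) / sigma)
    return K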

2020  [HTML] proquest.com

[HTML] Classification of atomic environments via the Gromov-Wasserstein distance

S Kawano - 2020 - search.proquest.com

Interpreting molecular dynamics simulations usually involves automated classification of

local atomic environments to identify regions of interest. Existing approaches are generally

limited to a small number of reference structures and only include limited information about …

  Cited by 1 Related articles All 3 versions


2020

Wasserstein distance based deep multi-feature adversarial transfer diagnosis approach under variable working conditions

D She, N Peng, M Jia, MG Pecht - Journal of Instrumentation, 2020 - iopscience.iop.org

Intelligent mechanical fault diagnosis is a crucial measure to ensure the safe operation of

equipment. To solve the problem that network features is not fully utilized in the adversarial

transfer learning, this paper develops a Wasserstein distance based deep multi-feature  …

  Cited by 2 Related articles All 2 versions


2020

Nonparametric Different-Feature Selection Using Wasserstein Distance

W Zheng, FY Wang, C Gou - 2020 IEEE 32nd International …, 2020 - ieeexplore.ieee.org

In this paper, we propose a feature selection method that characterizes the difference

between two kinds of probability distributions. The key idea is to view the feature selection

problem as a sparsest k-subgraph problem that considers Wasserstein distance between …

  Related articles All 2 versions


2020


2020  [PDF] arxiv.org

Estimating processes in adapted Wasserstein distance

J Backhoff, D Bartl, M Beiglböck, J Wiesel - arXiv preprint arXiv …, 2020 - arxiv.org

A number of researchers have independently introduced topologies on the set of laws of

stochastic processes that extend the usual weak topology. Depending on the respective

scientific background this was motivated by applications and connections to various areas …

  Cited by 7 Related articles All 4 versions 

[CITATION] Estimating processes in adapted Wasserstein distance

J Backhoff-Veraguas, D Bartl, M Beiglböck, J Wiesel - Preprint, 2020

  Cited by 3 Related articles



2020  [PDF] springer.com

[PDF] Adapted Wasserstein distances and stability in mathematical finance

J Backhoff-Veraguas, D Bartl, M Beiglböck… - Finance and …, 2020 - Springer

Assume that an agent models a financial asset through a measure with the goal to

price/hedge some derivative or optimise some expected utility. Even if the model is

chosen in the most skilful and sophisticated way, the agent is left with the possibility that  …

  Cited by 25 Related articles All 13 versions


2020  [PDF] arxiv.org

Stein factors for variance-gamma approximation in the Wasserstein and Kolmogorov distances

RE Gaunt - arXiv preprint arXiv:2008.06088, 2020 - arxiv.org

We obtain new bounds for the solution of the variance-gamma (VG) Stein equation that are

of the correct form for approximations in terms of the Wasserstein and Kolmorogorov metrics.

These bounds hold for all parameters values of the four parameter VG class. As an …

  Cited by 6 Related articles All 3 versions 


2020  [PDF] arxiv.org

Wasserstein distributionally robust look-ahead economic dispatch

BK Poolla, AR Hota, S Bolognani… - … on Power Systems, 2020 - ieeexplore.ieee.org

We consider the problem of look-ahead economic dispatch (LAED) with uncertain

renewable energy generation. The goal of this problem is to minimize the cost of

conventional energy generation subject to uncertain operational constraints. The risk of …

  Cited by 4 Related articles All 3 versions

Wasserstein Distributionally Robust Look-Ahead Economic Dispatch

B Kameshwar Poolla, AR Hota, S Bolognani… - arXiv e …, 2020 - ui.adsabs.harvard.edu

We consider the problem of look-ahead economic dispatch (LAED) with uncertain

renewable energy generation. The goal of this problem is to minimize the cost of

conventional energy generation subject to uncertain operational constraints. These …

  

2020  [PDF] arxiv.org

Parameter-transferred wasserstein generative adversarial network (PT-WGAN) for low-dose pet image denoising

Y Gong, H Shan, Y Teng, N Tu, M Li… - … on Radiation and …, 2020 - ieeexplore.ieee.org

Due to the widespread of positron emission tomography (PET) in clinical practice, the

potential risk of PET-associated radiation dose to patients needs to be minimized. However,

with the reduction in the radiation dose, the resultant images may suffer from noise and …

  Cited by 3 Related articles All 4 versions

<——2020——2020—3210— 



2020  [PDF] future-in-tech.net

[PDF] Wasserstein Riemannian geometry of Gamma densities

C Ogouyandjou, N Wadagni - Computer Science, 2020 - ijmcs.future-in-tech.net

Abstract A Wasserstein Riemannian Gamma manifold is a space of Gamma probability

density functions endowed with the Riemannian Otto metric which is related to the

Wasserstein distance. In this paper, we study some geometric properties of such Riemannian …

  Related articles 


2020  [PDF] arxiv.org

Conditional sig-wasserstein gans for time series generation

H Ni, L Szpruch, M Wiese, S Liao, B Xiao - arXiv preprint arXiv:2006.05421, 2020 - arxiv.org

Generative adversarial networks (GANs) have been extremely successful in generating

samples, from seemingly high dimensional probability measures. However, these methods

struggle to capture the temporal dependence of joint probability distributions induced by …

  Cited by 11 Related articles All 3 versions 


2020

Obtaining PET/CT images from non-attenuation corrected PET images in a single PET system using Wasserstein generative adversarial networks

Z Hu, Y Li, S Zou, H Xue, Z Sang, X Liu… - Physics in Medicine …, 2020 - iopscience.iop.org

Positron emission tomography (PET) imaging plays an indispensable role in early disease

detection and postoperative patient staging diagnosis. However, PET imaging requires not

only additional computed tomography (CT) imaging to provide detailed anatomical …

  Cited by 13 Related articles All 5 versions


2020  [PDF] arxiv.org

First-Order Methods for Wasserstein Distributionally Robust MDP

J Grand-Clément, C Kroer - arXiv preprint arXiv:2009.06790, 2020 - arxiv.org

Markov Decision Processes (MDPs) are known to be sensitive to parameter specification.

Distributionally robust MDPs alleviate this issue by allowing for ambiguity sets which give a

set of possible distributions over parameter sets. The goal is to find an optimal policy with …

  Cited by 3 Related articles All 3 versions 



2020  [PDF] ieee.org

An ensemble Wasserstein generative adversarial network method for road extraction from high resolution remote sensing images in rural areas

C Yang, Z Wang - IEEE Access, 2020 - ieeexplore.ieee.org

Road extraction from high resolution remote sensing (HR-RS) images is an important yet

challenging computer vision task. In this study, we propose an ensemble Wasserstein

Generative Adversarial Network with Gradient Penalty (WGAN-GP) method called E-WGAN …

  Cited by 4 Related articles All 2 versions




2020  [PDF] aaai.org

Solving general elliptical mixture models through an approximate Wasserstein manifold

S Li, Z Yu, M Xiang, D Mandic - Proceedings of the AAAI Conference on …, 2020 - ojs.aaai.org

We address the estimation problem for general finite mixture models, with a particular focus

on the elliptical mixture models (EMMs). Compared to the widely adopted Kullback–Leibler

divergence, we show that the Wasserstein distance provides a more desirable optimisation …

  Cited by 2 Related articles All 3 versions 


 2020

The Spectral-Domain $\mathcal{W}_2$ Wasserstein Distance ...

https://arxiv.org › math

by S Fang · 2020 — We also introduce the spectral-domain Gelbrich bound for processes that are not necessarily elliptical. Subjects: Statistics Theory (math.ST); ...

[CITATION] The Spectral-Domain W2 Wasserstein Distance for Elliptical Processes and the Spectral-Domain Gelbrich Bound.

S Fang, Q Zhu - CoRR, 2020


2020

Independent Elliptical Distributions Minimize Their $\mathcal{W}_2$ Wasserstein Distance from Independent Elliptical Distributions with the Same Density Generator

https://arxiv.org › math

by S Fang · 2020 — This short note is on a property of the \mathcal{W}_2 Wasserstein distance which indicates that independent elliptical distributions minimize ...


[CITATION] Independent Elliptical Distributions Minimize Their W2 Wasserstein Distance from Independent Elliptical Distributions with the Same Density Generator.

S Fang, Q Zhu - arXiv preprint, 2020

  Cited by 1 Related articles


 

2020  [PDF] arxiv.org

The equivalence of Fourier-based and Wasserstein metrics on imaging problems

G Auricchio, A Codegoni, S Gualandi… - … Lincei-Matematica e …, 2020 - ems.press

We investigate properties of some extensions of a class of Fourier-based probability metrics,

originally introduced to study convergence to equilibrium for the solution to the spatially

homogeneous Boltzmann equation. At difference with the original one, the new Fourier …

  Cited by 1 Related articles All 8 versions


 

2020

G. Auricchio, A. Codegoni, S. Gualandi, G. Toscani and M. Veneroni. 

The Equivalence of Fourier-based and Wasserstein Metrics on Imaging Problems. Accepted for publication in Rendiconti Lincei. Matematica e Applicazioni. ( ArXiv Preprint, 2020).


<——2020——2020—3220— 


[PDF] archives-ouvertes.fr

The Wasserstein-Fourier distance for stationary time series

E Cazelles, A Robert, F Tobar - IEEE Transactions on Signal …, 2020 - ieeexplore.ieee.org

… For time series x ∈ [x], y ∈ [y] and z ∈ [z], WF verifies: i) non-negativity: WF([x], [y]) ≥ 0, which is direct by the non-negativity of W2; ii) identity of indiscernibles: WF([x], [y]) = W2(sx, sy) = 0 is equivalent to sx = sy, and by definition of the equivalence class, to [x] = [y] …

Cited by 6 Related articles All 45 versions
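Since the WF distance above is, in essence, the 2-Wasserstein distance between normalized power spectral densities, a rough numerical illustration fits in a few lines of NumPy. This is a sketch of the general idea, not the authors' code; the function and variable names are placeholders.

import numpy as np

def w2_between_spectra(freqs, psd_x, psd_y, n_quantiles=1000):
    # Normalize the PSDs so they integrate to one (probability densities over frequency).
    sx = psd_x / np.trapz(psd_x, freqs)
    sy = psd_y / np.trapz(psd_y, freqs)
    # Build CDFs on the frequency grid by trapezoidal accumulation.
    Fx = np.concatenate(([0.0], np.cumsum(np.diff(freqs) * 0.5 * (sx[1:] + sx[:-1]))))
    Fy = np.concatenate(([0.0], np.cumsum(np.diff(freqs) * 0.5 * (sy[1:] + sy[:-1]))))
    Fx, Fy = Fx / Fx[-1], Fy / Fy[-1]
    # 1-D identity: W2^2 = integral over t of (F_x^{-1}(t) - F_y^{-1}(t))^2 dt,
    # with quantile functions approximated by inverting the CDFs via interpolation.
    t = (np.arange(n_quantiles) + 0.5) / n_quantiles
    qx = np.interp(t, Fx, freqs)
    qy = np.interp(t, Fy, freqs)
    return np.sqrt(np.mean((qx - qy) ** 2))

# Toy usage: two narrow spectra of identical shape centred at 0.2 and 0.6;
# the printed value is close to the 0.4 shift between them.
f = np.linspace(0.0, 1.0, 512)
print(w2_between_spectra(f, np.exp(-(f - 0.2) ** 2 / 0.002), np.exp(-(f - 0.6) ** 2 / 0.002)))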

year 2020  [PDF] unipv.it

[PDF] On the equivalence between Fourier-based and Wasserstein metrics

G Auricchio, A Codegoni, S Gualandi, G Toscani… - eye - mate.unipv.it

We investigate properties of some extensions of a class of Fourier-based probability metrics,

originally introduced to study convergence to equilibrium for the solution to the spatially

homogeneous Boltzmann equation. At difference with the original one, the new Fourier …

  Related articles 


(PDF) THE α-z-BURES WASSERSTEIN DIVERGENCE

https://www.researchgate.net › ... › Quantum

Nov 2, 2020 — TRUNG-HOA DINH, CONG-TRINH LE, BICH-KHUE VO AND TRUNG-DUNG VUONG. Abstract. In this paper, we introduce the α-z-Bures Wasserstein divergence.


(PDF) The α-z-Bures Wasserstein divergence - ResearchGate
https://www.researchgate.net › ... › Quantum
Jun 17, 2021 — TRUNG HOA DINH, CONG TRINH LE, BICH KHUE VO AND TRUNG DUNG VUONG. Abstract. In this paper, we introduce the α-z-Bures Wasserstein divergence.

Waserstein  or  Вассерштейн  

including 4 titles with Vaserstein and

28 titles with WGAN-Wasserstein

Cutoff thermalization for Ornstein-Uhlenbeck systems with small Lévy noise in the Wasserstein distance

by G Barrera · 2020 · Cited by 3 — Cutoff thermalization for Ornstein-Uhlenbeck systems with small Lévy noise in the Wasserstein distance. Authors:Gerardo Barrera, Michael A.


2020


Stochastic Approximation versus Sample Average Approximation for population Wasserstein barycenters

D Dvinskikh - arXiv preprint arXiv:2001.07697, 2020 - arxiv.org

In machine learning and optimization community there are two main approaches for convex

risk minimization problem, namely, the Stochastic Approximation (SA) and the Sample

Average Approximation (SAA). In terms of oracle complexity (required number of stochastic

gradient evaluations), both approaches are considered equivalent on average (up to a

logarithmic factor). The total complexity depends on the specific problem, however, starting

from work\cite {nemirovski2009robust} it was generally accepted that the SA is better than …

Cited by 4 Related articles All 3 versions
[CITATION] Stochastic approximation versus sample average approximation for population Wasserstein barycenter calculation. arXiv e-prints, art

D Dvinskikh - arXiv preprint arXiv:2001.07697, 2020

Cited by 2 Related articles


The Wasserstein Impact Measure (WIM): a generally ... - arXiv

https://arxiv.org › stat

by F Ghaderinezhad · 2020 — Title:The Wasserstein Impact Measure (WIM): a generally applicable, practical tool for quantifying prior impact in Bayesian statistics.


Equidistribution of random walks on compact groups II ... - arXiv

https://arxiv.org › math

by B Borda · 2020 · Cited by 2 — The proof uses a new Berry--Esseen type inequality for the p-Wasserstein metric on the torus,and the simultaneous Diophantine approximation ...

 On Stein's factors for Poisson approximation in Wasserstein ...

https://arxiv.org › math

by ZW Liao · 2020 — Abstract: We establish various bounds on the solutions to a Stein equation for Poisson approximation in Wasserstein distance with non-linear ...


2020  [PDF] arxiv.org

Continuous regularized Wasserstein barycenters

L LiA GenevayM YurochkinJ Solomon - arXiv preprint arXiv …, 2020 - arxiv.org

Wasserstein barycenters provide a geometrically meaningful way to aggregate probability

distributions, built on the theory of optimal transport. They are difficult to compute in practice …

 Cite Cited by 9 Related articles All 5 versions 
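As background for barycenter entries like this one: for measures on the real line, the 2-Wasserstein barycenter has a closed form obtained by averaging quantile functions, which makes a compact sanity check possible. The sketch below illustrates only that special case, not the regularized, continuous method of the paper; all names are illustrative.

import numpy as np

def w2_barycenter_1d(samples_list, weights=None, n_quantiles=200):
    # In 1-D the barycenter's quantile function is the weighted average of the
    # input quantile functions: Q_bar(t) = sum_k w_k * Q_k(t).
    k = len(samples_list)
    weights = np.full(k, 1.0 / k) if weights is None else np.asarray(weights, float)
    t = np.linspace(0.0, 1.0, n_quantiles)
    quantiles = np.stack([np.quantile(np.asarray(s, float), t) for s in samples_list])
    return quantiles.T @ weights  # barycenter quantiles evaluated at the grid t

# Usage: the barycenter of two Gaussian samples sits halfway between them.
rng = np.random.default_rng(0)
bar = w2_barycenter_1d([rng.normal(-2, 1, 5000), rng.normal(2, 1, 5000)])
print(bar.mean())  # close to 0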


2020  [PDF] arxiv.org

Improving relational regularized autoencoders with spherical sliced fused Gromov Wasserstein

K Nguyen, S Nguyen, N Ho, T Pham, H Bui - arXiv preprint arXiv …, 2020 - arxiv.org

Relational regularized autoencoder (RAE) is a framework to learn the distribution of data by

minimizing a reconstruction loss together with a relational regularization on the latent space …

 Cite Cited by 6 Related articles All 6 versions 

<——2020——2020—3230— 



 2020 [PDF] aaai.org

Regularized Wasserstein means for aligning distributional data

L Mi, W Zhang, Y Wang - Proceedings of the AAAI Conference on …, 2020 - ojs.aaai.org

We propose to align distributional data from the perspective of Wasserstein means. We raise

the problem of regularizing Wasserstein means and propose several terms tailored to tackle …

 Cite Cited by 3 Related articles All 8 versions 


2020 [HTML] mdpi.com

Probability forecast combination via entropy regularized wasserstein distance

R Cumings-Menon, M Shin - Entropy, 2020 - mdpi.com

We propose probability and density forecast combination methods that are defined using the

entropy regularized Wasserstein distance. First, we provide a theoretical characterization of …

 Cite Cited by 2 Related articles All 15 versions 


 2020 [PDF] arxiv.org

Regularized variational data assimilation for bias treatment using the Wasserstein metric

SK Tamang, A Ebtehaj, D Zou… - Quarterly Journal of the …, 2020 - Wiley Online Library

This article presents a new variational data assimilation (VDA) approach for the formal

treatment of bias in both model outputs and observations. This approach relies on the …

 Cite Cited by 3 Related articles All 6 versions



2020  [PDF] arxiv.org

Wasserstein Distance Regularized Sequence Representation for Text Matching in Asymmetrical Domains

W Yu, C Xu, J Xu, L Pang, X Gao, X Wang… - arXiv preprint arXiv …, 2020 - arxiv.org

One approach to matching texts from asymmetrical domains is projecting the input

sequences into a common semantic space as feature vectors upon which the matching …

 Cite Cited by 1 Related articles All 4 versions 



2020

A class of optimal transport regularized formulations with applications to wasserstein gans

S Mahdian, JH Blanchet… - 2020 Winter Simulation …, 2020 - ieeexplore.ieee.org

Optimal transport costs (eg Wasserstein distances) are used for fitting high-dimensional

distributions. For example, popular artificial intelligence algorithms such as Wasserstein  …

 Cite Related articles All 3 versions




2020 [PDF] arxiv.org

Reweighting samples under covariate shift using a Wasserstein distance criterion

J Reygner, A Touboul - arXiv preprint arXiv:2010.09267, 2020 - arxiv.org

Considering two random variables with different laws to which we only have access through

finite size iid samples, we address how to reweight the first sample so that its empirical …

 Cite Cited by 1 Related articles All 29 versions 


 2020 [PDF] arxiv.org

Hierarchical Low-Rank Approximation of Regularized Wasserstein Distance

M Motamed - arXiv preprint arXiv:2004.12511, 2020 - arxiv.org

Sinkhorn divergence is a measure of dissimilarity between two probability measures. It is

obtained through adding an entropic regularization term to Kantorovich's optimal transport …

 Cite Related articles All 3 versions 
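For readers unfamiliar with the Sinkhorn divergence mentioned here, the underlying entropy-regularized transport problem is usually solved by Sinkhorn's matrix-scaling iterations. The NumPy sketch below shows that generic building block only, under the standard formulation; it is not the hierarchical low-rank scheme of the paper.

import numpy as np

def sinkhorn(a, b, cost, reg=0.05, n_iters=500):
    # Entropy-regularized OT between histograms a and b:
    #   min_P <P, C> - reg * H(P)  subject to P having marginals a and b,
    # solved by alternately scaling the Gibbs kernel K = exp(-C / reg).
    K = np.exp(-cost / reg)
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)   # match column marginals
        u = a / (K @ v)     # match row marginals
    P = u[:, None] * K * v[None, :]  # entropic optimal coupling
    return np.sum(P * cost)          # regularized transport cost <P, C>

# Usage: two histograms on a 1-D grid with squared-distance cost.
x = np.linspace(0, 1, 50)
a = np.exp(-(x - 0.3) ** 2 / 0.01); a /= a.sum()
b = np.exp(-(x - 0.7) ** 2 / 0.01); b /= b.sum()
C = (x[:, None] - x[None, :]) ** 2
print(sinkhorn(a, b, C))  # roughly (0.7 - 0.3)^2 for small reg

Smaller values of reg approach the unregularized transport cost but need more iterations and more care with numerical underflow.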


2020 [PDF] sci-en-tech.com

[PDF] Entropy-regularized Wasserstein Distances for Analyzing Environmental and Ecological Data

H Yoshioka, Y Yoshioka, Y Yaegashi - THE 11TH …, 2020 - sci-en-tech.com

We explore applicability of entropy-regularized Wasserstein (pseudo-) distances as new

tools for analyzing environmental and ecological data. In this paper, the two specific …

 Cite Related articles All 2 versions 





2020  [PDF] arxiv.org

The Wasserstein Impact Measure (WIM): a generally applicable, practical tool for quantifying prior impact in Bayesian statistics

F Ghaderinezhad, C Ley, B Serrien - arXiv preprint arXiv:2010.12522, 2020 - arxiv.org

The prior distribution is a crucial building block in Bayesian analysis, and its choice will

impact the subsequent inference. It is therefore important to have a convenient way to …

 Related articles All 2 versions 

<——2020——2020—3240— 



2020 [PDF] arxiv.org

Variational wasserstein barycenters for geometric clustering

L Mi, T Yu, J Bento, W Zhang, B Li, Y Wang - arXiv preprint arXiv …, 2020 - arxiv.org

We propose to compute Wasserstein barycenters (WBs) by solving for Monge maps with

variational principle. We discuss the metric properties of WBs and explore their connections …

 Cited by 2 Related articles All 2 versions 


2020 [PDF] arxiv.org

Encoded Prior Sliced Wasserstein AutoEncoder for learning latent manifold representations

S Krishnagopal, J Bedrossian - arXiv preprint arXiv:2010.01037, 2020 - arxiv.org

While variational autoencoders have been successful generative models for a variety of

tasks, the use of conventional Gaussian or Gaussian mixture priors are limited in their ability …

 Related articles All 3 versions 

 

2020  [PDF] core.ac.uk

[PDF] Structure-preserving variational schemes for fourth order nonlinear partial differential equations with a Wasserstein gradient flow structure

B Ashworth - 2020 - core.ac.uk

There is a growing interest in studying nonlinear partial differential equations which

constitute gradient flows in the Wasserstein metric and related structure preserving …

 Related articles All 2 versions 


Two-sample Test using Projected Wasserstein Distance - arXiv

by J Wang · 2020 · Cited by 4 — We develop a projected Wasserstein distance for the two-sample test, a fundamental problem in statistics and machine learning: given two sets of ...




Fast PCA in 1-D Wasserstein Spaces via B-splines Representation and Metric Projection

Pegoraro, M and Beraha, M

35th AAAI Conference on Artificial Intelligence / 33rd Conference on Innovative Applications of Artificial Intelligence / 11th Symposium on Educational Advances in Artificial Intelligence

2021 | THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE 35 , pp.9342-9349

We address the problem of performing Principal Component Analysis over a family of probability measures on the real line, using the Wasserstein geometry. We present a novel representation of the 2-Wasserstein space, based on a well known isometric bijection and a B-spline expansion. Thanks to this representation, we are able to reinterpret previous work and derive more efficient optimization routines for existing approaches. As shown in our simulations, the solution of these optimization problems can be costly in practice and thus pose a limit to their usage. We propose a novel definition of Principal Component Analysis in the Wasserstein space that, when used in combination with the B-spline representation, yields a straightforward optimization problem that is extremely fast to compute. Through extensive simulation studies, we show how our PCA performs similarly to the ones already proposed in the literature while retaining a much smaller computational cost. We apply our method to a real dataset of mortality rates due to Covid-19 in the US, concluding that our analyses are consistent with the current scientific consensus on the disease.
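The "well known isometric bijection" invoked in this abstract is presumably the standard identification of the 2-Wasserstein space over the real line with a subset of L^2(0, 1) via quantile functions,

W_2(\mu, \nu)^2 = \int_0^1 ( F_\mu^{-1}(t) - F_\nu^{-1}(t) )^2 \, dt,

so that distances, geodesics and, as here, PCA-type constructions can be handled in an ordinary Hilbert space.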


2020

A New Data-Driven Distributionally Robust Portfolio Optimization Method Based on Wasserstein Ambiguity Set

N Du, Y Liu, Y Liu - IEEE Access, 2020 - ieeexplore.ieee.org

Since optimal portfolio strategy depends heavily on the distribution of uncertain returns, this

article proposes a new method for the portfolio optimization problem with respect to

distribution uncertainty. When the distributional information of the uncertain return rate is

only observable through a finite sample dataset, we model the portfolio selection problem

with a robust optimization method from the data-driven perspective. We first develop an

ambiguous mean-CVaR portfolio optimization model, where the ambiguous distribution set …

Cited by 3 Related articles


2020


Cutoff thermalization for Ornstein-Uhlenbeck systems with ...

https://arxiv.org › math

by G Barrera · 2020 · Cited by 4 — Cutoff thermalization for Ornstein-Uhlenbeck systems with small Lévy noise in the Wasserstein distance. Authors:Gerardo Barrera, Michael A.


 
Illumination-invariant flotation froth color measuring via Wasserstein distance-based CycleGAN with structure-preserving constraint

J Liu, J He, Y Xie, W Gui, Z Tang, T Ma… - IEEE transactions on …, 2020 - ieeexplore.ieee.org

Froth color can be referred to as a direct and instant indicator to the key flotation production

index, for example, concentrate grade. However, it is intractable to measure the froth color

robustly due to the adverse interference of time-varying and uncontrollable multisource

illuminations in the flotation process monitoring. In this article, we proposed an illumination-

invariant froth color measuring method by solving a structure-preserved image-to-image

color translation task via an introduced Wasserstein distance-based structure-preserving …

Cited by 31 Related articles All 3 versions

2020  [PDF] aaai.org

Gromov-Wasserstein factorization models for graph clustering

H Xu - Proceedings of the AAAI Conference on Artificial …, 2020 - ojs.aaai.org

We propose a new nonlinear factorization model for graphs that are with topological

structures, and optionally, node attributes. This model is based on a pseudometric called …

 Cited by 7 Related articles All 4 versions 

 

2020  [PDF] aaai.org

[PDF] Swift: Scalable wasserstein factorization for sparse nonnegative tensors

A Afshar, K Yin, S Yan, C Qian, JC Ho, H Park… - arXiv preprint arXiv …, 2020 - aaai.org

Existing tensor factorization methods assume that the input tensor follows some specific

distribution (ie Poisson, Bernoulli, and Gaussian), and solve the factorization by minimizing …

 Cited by 3 Related articles All 7 versions 


2020  [PDF] arxiv.org

Safe Wasserstein Constrained Deep Q-Learning

A Kandel, SJ Moura - arXiv preprint arXiv:2002.03016, 2020 - arxiv.org

This paper presents a distributionally robust Q-Learning algorithm (DrQ) which leverages

Wasserstein ambiguity sets to provide probabilistic out-of-sample safety guarantees during …

 Cited by 1 Related articles All 2 versions 

<——2020——2020—3250— 


 

A Generative Model for Zero-Shot Learning via Wasserstein Auto-encoder

X Luo, Z Cai, F Wu, J Xiao-Yuan - 2020 IEEE International …, 2020 - ieeexplore.ieee.org

Zero-shot learning aims to use the labeled instances to train the model, and then classifies

the instances that belong to a class without labeled instances. However, the training …

 Related articles


2020  [PDF] arxiv.org

Independent Elliptical Distributions Minimize Their $\mathcal{W}_2$ Wasserstein Distance from Independent Elliptical Distributions with the Same Density Generator

S Fang, Q Zhu - arXiv preprint arXiv:2012.03809, 2020 - arxiv.org

This short note is on a property of the $\mathcal{W}_2$ Wasserstein distance which indicates that independent elliptical distributions minimize their $\mathcal{W}_2$ …

 Related articles All 2 versions 


2020

Numeric Data Augmentation using Structural Constraint Wasserstein Generative Adversarial Networks

W Wang, C Wang, T Cui, R Gong… - … on Circuits and …, 2020 - ieeexplore.ieee.org

Some recent studies have suggested using GANs for numeric data generation such as to

generate data for completing the imbalanced numeric data. Considering the significant …

 Related articles


2020

Selective Multi-source Transfer Learning with Wasserstein Domain Distance for Financial Fraud Detection

Y Sun, L Lan, X Zhao, M Fan, Q Guo, C Li - … Intelligent Computing and …, 2020 - Springer

As financial enterprises have moved their services to the internet, financial fraud detection

has become an ever-growing problem causing severe economic losses for the financial  …


2020  [PDF] arxiv.org

Safe Zero-Shot Model-Based Learning and Control: A Wasserstein Distributionally Robust Approach

A Kandel, SJ Moura - arXiv preprint arXiv:2004.00759, 2020 - arxiv.org

This paper explores distributionally robust zero-shot model-based learning and control

using Wasserstein ambiguity sets. Conventional model-based reinforcement learning …

 Cited by 1 Related articles All 2 versions 


2020

 

Wasserstein Embeddings for Nonnegative Matrix Factorization

M Febrissy, M Nadif - … Conference on Machine Learning, Optimization, and …, 2020 - Springer

In the field of document clustering (or dictionary learning), the fitting error called the

Wasserstein (In this paper, we use “Wasserstein”,“Earth Mover's”,“Kantorovich–Rubinstein” …

 Related articles


2020  [PDF] arxiv.org

Augmented sliced Wasserstein distances

X Chen, Y Yang, Y Li - arXiv preprint arXiv:2006.08812, 2020 - arxiv.org

While theoretically appealing, the application of the Wasserstein distance to large-scale

machine learning problems has been hampered by its prohibitive computational cost. The …

 Cited by 2 Related articles All 3 versions 
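For context, the plain sliced Wasserstein distance that this paper augments replaces one d-dimensional transport problem by many one-dimensional ones along random directions, each solved exactly by sorting. A rough Monte-Carlo sketch of that baseline (generic, not the augmented variant proposed here; names are placeholders):

import numpy as np

def sliced_w2(X, Y, n_projections=100, rng=None):
    # Monte-Carlo sliced 2-Wasserstein distance between equal-size samples X, Y of shape (n, d).
    rng = np.random.default_rng() if rng is None else rng
    thetas = rng.normal(size=(n_projections, X.shape[1]))
    thetas /= np.linalg.norm(thetas, axis=1, keepdims=True)  # directions on the unit sphere
    sq = 0.0
    for theta in thetas:
        px, py = np.sort(X @ theta), np.sort(Y @ theta)      # 1-D W2 via order statistics
        sq += np.mean((px - py) ** 2)
    return np.sqrt(sq / n_projections)

# Usage: two Gaussian clouds in 10 dimensions.
rng = np.random.default_rng(1)
print(sliced_w2(rng.normal(0, 1, (2000, 10)), rng.normal(0.5, 1, (2000, 10)), rng=rng))

Each one-dimensional problem is solved exactly by sorting, which is what keeps the sliced distance cheap relative to the full Wasserstein distance.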


2020  [PDF] arxiv.org

Stochastic Approximation versus Sample Average Approximation for population Wasserstein barycenters

D Dvinskikh - arXiv preprint arXiv:2001.07697, 2020 - arxiv.org

In machine learning and optimization community there are two main approaches for convex

risk minimization problem, namely, the Stochastic Approximation (SA) and the Sample …

 Cited by 3 Related articles All 2 versions 

[CITATION] Stochastic approximation versus sample average approximation for population Wasserstein barycenter calculation. arXiv e-prints, art

D Dvinskikh - arXiv preprint arXiv:2001.07697, 2020

 Cited by 2 Related articles



2020  see 2021

A Fast Globally Linearly Convergent Algorithm for the Computation of Wasserstein Barycenters
Yang, Lei; Li, Jia; Sun, Defeng; Kim-Chuan Toh. arXiv.org; Ithaca, Dec 26, 2020.


Data augmentation-based conditional Wasserstein generative adversarial network-gradient penalty for XSS attack detection system
Fawaz Mahiuob Mohammed Mokbal; Wang, Dan; Wang, Xiaoxi; Fu, Lihua. PeerJ Computer Science; San Diego (Dec 14, 2020).

 
<——2020——2020—3260—


2020 see 2019

Wasserstein-2 Generative Networks
Korotin, Alexander; Egiazarian, Vage; Arip Asadulaev; Safin, Alexander; Burnaev, Evgeny. arXiv.org; Ithaca, Dec 10, 2020.


Partial Gromov-Wasserstein Learning for Partial Graph Matching
Liu, Weijie; Zhang, Chao; Xie, Jiahao; Shen, Zebang; Qian, Hui; et al. arXiv.org; Ithaca, Dec 9, 2020.

Cited by 2
 Related articles All 3 versions 


Wasserstein barycenters can be computed in polynomial time in fixed dimension
Altschuler, Jason M; Boix-Adsera, Enric. arXiv.org; Ithaca, Dec 9, 2020.



Stein’s method for normal approximation in Wasserstein distances with application to the multivariate central limit theorem
Bonis, Thomas. Probability Theory and Related Fields; Heidelberg Vol. 178, Iss. 3-4,  (Dec 2020): 827-860.


 

A new approach to posterior contraction rates via Wasserstein dynamics
Dolera, Emanuele; Favaro, Stefano; Mainini, Edoardo. arXiv.org; Ithaca, Nov 29, 2020.

Cited by 2
 Related articles All 2 versions 


2020


  2020 see 2019

Sliced Gromov-Wasserstein
Vayer, Titouan; Flamary, Rémi; Tavenard, Romain; Chapel, Laetitia; Courty, Nicolas. arXiv.org; Ithaca, Dec 11, 2020.


Cited by 15 Related articles All 2 versions


2020  see 2021  Dissertation or Thesis  Full Text

Classification of Atomic Environments Via the Gromov-Wasserstein Distance
Kawano, Sakura. University of California, Davis. ProQuest Dissertations Publishing, 2020. 27998403.

 
Conference Paper Citation/Abstract

Barycenters of Natural Images - Constrained Wasserstein Barycenters for Image Morphing
Simon, Dror; Aberdam, Aviad. The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2020).
 

 
Conference Paper  Citation/Abstract

Wasserstein Loss based Deep Object Detection
Han, Yuzhuo; Liu, Xiaofeng; Sheng, Zhenfei; Ren, Yutao; Han, Xu; et al. The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2020).



Stereoscopic image reflection removal based on Wasserstein Generative Adversarial Network

Wang, XY, Pan, YK and Lun, DPK

IEEE International Conference on Visual Communications and Image Processing (VCIP)

2020 | 2020 IEEE INTERNATIONAL CONFERENCE ON VISUAL COMMUNICATIONS AND IMAGE PROCESSING (VCIP) , pp.148-151

Reflection removal is a long-standing problem in computer vision. In this paper, we consider the reflection removal problem for stereoscopic images. By exploiting the depth information of stereoscopic images, a new background edge estimation algorithm based on the Wasserstein Generative Adversarial Network (WGAN) is proposed to distinguish the edges of the background image from the reflection. The background edges are then used to reconstruct the background image. We compare the proposed approach with the state-of-the-art reflection removal methods. Results show that the proposed approach can outperform the traditional single-image based methods and is comparable to the multiple-image based approach while having a much simpler imaging hardware requirement.


<——2020——2020—3270—


Wasserstein-Distance-Based Temporal Clustering for Capacity-Expansion Planning in Power Systems

Condeixa, L, Oliveira, F and Siddiqui, AS

3rd International Conference on Smart Energy Systems and Technologies (SEST)

2020 | 2020 INTERNATIONAL CONFERENCE ON SMART ENERGY SYSTEMS AND TECHNOLOGIES (SEST)

As variable renewable energy sources are steadily incorporated in European power systems, the need for higher temporal resolution in capacity-expansion models also increases.

Naturally, there exists a trade-off between the amount of temporal data used to plan power systems for decades ahead and time resolution needed to represent renewable energy variability accurately. We propose the use of the Wasserstein distance as a measure of cluster discrepancy using it to cluster demand, wind availability, and solar availability data. When compared to the Euclidean distance and the maximal distance, the hierarchical clustering performed using the Wasserstein distance leads to capacity-expansion planning that 1) more accurately estimates system costs and 2) more efficiently adopts storage resources. Numerical results indicate an improvement in cost estimation by up to 5% vis-a-vis the Euclidean distance and a reduction of storage investment that is equivalent to nearly 100% of the installed capacity under the benchmark full time resolution.


Cited by 1 Related articles All 7 versions

Spoken Keyword Detection Based on Wasserstein Generative Adversarial Network

W Zhao, S Kun, C Hao - 2020 5th International Conference on …, 2020 - ieeexplore.ieee.org

With the rapid development of artificial neural networks, it's applied to all areas of computer

technologies. This paper combines deep neural network and keyword detection technology

to propose a Wasserstein Generative Adversarial Network-based spoken keyword detection

which is widely different from the existing methods. With the ability of Wasserstein

Generative Adversarial Network (WGAN) to generates data autonomously, new sequences

are generated, through which it analyzes whether keywords presence and where the …

 Related articles All 2 versions

Conference Paper Citation/Abstract

Spoken Keyword Detection Based on Wasserstein Generative Adversarial Network

Zhao, Wen; Kun, She; Hao, Chen. The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2020).


 

Correlated Wasserstein Autoencoder ... - IEEE Computer Society
https://www.computer.org › csdl › download-article › pdf
by L Yao · 2020 — Abstract—Recommender systems for implicit data, e.g., browsing data, have attracted more and more research efforts. Most existing approaches assume the ...

Conference Paper Citation/Abstract

Correlated Wasserstein Autoencoder for Implicit Data Recommendation

Zhong, Jingbin; Zhang, Xiaofeng; Luo, Linhao. The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2020).

Cited by 1 Related articles All 2 versions

 

Gromov-Wasserstein Distance based Object Matching - math

https://arxiv.org › math

by CA Weitkamp · 2020 · Cited by 1 — Abstract: In this paper, we aim to provide a statistical theory for object matching based on the Gromov-Wasserstein distance.

Conference Paper Citation/Abstract

A Novel Ant Colony Shape Matching Algorithm Based on the Gromov-Wasserstein Distance

Zhang, Lu; Saucan, Emil. The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2020).



Wasserstein-2 Generative Networks | OpenReview

https://openreview.net › forum

by A Korotin · 2020 · Cited by 21 — 28 Sept 2020 (modified: 13 Mar 2021), ICLR 2021 Poster. Keywords: wasserstein-2 distance, optimal transport maps, non-minimax ...


Wasserstein Embedding for Graph Learning | OpenReview

Oct 17, 2020

Learning to generate Wasserstein barycenters | OpenReview

Oct 26, 2020

An Improved Composite Functional Gradient Learning by ...

Oct 5, 2021

Augmented Sliced Wasserstein Distances | OpenReview

Oct 7, 2020


 Cited by 20 Related articles All 5 versions

2020  see 2021 [PDF] arxiv.org

First-Order Methods for Wasserstein Distributionally Robust MDP

J Grand-Clément, C Kroer - arXiv preprint arXiv:2009.06790, 2020 - arxiv.org

Markov Decision Processes (MDPs) are known to be sensitive to parameter specification.

Distributionally robust MDPs alleviate this issue by allowing for ambiguity sets which give a

set of possible distributions over parameter sets. The goal is to find an optimal policy with …

 Cited by 4 Related articles All 3 versions 


 

[PDF] arxiv.org

Wasserstein distributionally robust stochastic control: A data-driven approach

I Yang - IEEE Transactions on Automatic Control, 2020 - ieeexplore.ieee.org

Standard stochastic control methods assume that the probability distribution of uncertain

variables is available. Unfortunately, in practice, obtaining accurate distribution information

is a challenging task. To resolve this issue, in this article we investigate the problem of …

 Cited by 31 Related articles All 3 versions


[PDF] arxiv.org

Finite-sample guarantees for Wasserstein distributionally robust optimization: Breaking the curse of dimensionality

R Gao - arXiv preprint arXiv:2009.04382, 2020 - arxiv.org

Wasserstein distributionally robust optimization (DRO) aims to find robust and generalizable

solutions by hedging against data perturbations in Wasserstein distance. Despite its recent

empirical success in operations research and machine learning, existing performance …

 Cited by 10 Related articles All 3 versions 
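For orientation, the object studied in this and the surrounding DRO entries is the worst-case risk over a Wasserstein ball around the empirical distribution \hat{P}_n,

\min_{\theta} \; \sup_{Q \,:\, W_1(Q, \hat{P}_n) \le \varepsilon} \; \mathbb{E}_{Q}[ \ell(\theta; \xi) ],

and a standard upper bound is that, when \ell(\theta; \cdot) is L-Lipschitz in \xi, the inner supremum is at most \mathbb{E}_{\hat{P}_n}[ \ell(\theta; \xi) ] + \varepsilon L, which is the regularization interpretation that several of the papers listed here exploit.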


[PDF] arxiv.org

Wasserstein distributionally robust inverse multiobjective optimization

C Dong, B Zeng - arXiv preprint arXiv:2009.14552, 2020 - arxiv.org

Inverse multiobjective optimization provides a general framework for the unsupervised

learning task of inferring parameters of a multiobjective decision making problem (DMP),

based on a set of observed decisions from the human expert. However, the performance of …

 Cited by 3 Related articles All 5 versions 

<——2020——2020—3280—- 


 
 

Wasserstein distributionally robust motion planning and control with safety constraints using conditional value-at-risk

A Hakobyan, I Yang - 2020 IEEE International Conference on …, 2020 - ieeexplore.ieee.org

In this paper, we propose an optimization-based decision-making tool for safe motion

planning and control in an environment with randomly moving obstacles. The unique feature

of the proposed method is that it limits the risk of unsafety by a pre-specified threshold even …

 Cited by 9 Related articles All 2 versions


[PDF] arxiv.org

Wasserstein distributionally robust look-ahead economic dispatch

BK Poolla, AR Hota, S Bolognani… - … on Power Systems, 2020 - ieeexplore.ieee.org

We consider the problem of look-ahead economic dispatch (LAED) with uncertain

renewable energy generation. The goal of this problem is to minimize the cost of

conventional energy generation subject to uncertain operational constraints. The risk of …

 Cited by 6 Related articles All 3 versions

Wasserstein Distributionally Robust Look-Ahead Economic Dispatch

B Kameshwar Poolla, AR Hota, S Bolognani… - arXiv e …, 2020 - ui.adsabs.harvard.edu

We consider the problem of look-ahead economic dispatch (LAED) with uncertain

renewable energy generation. The goal of this problem is to minimize the cost of

conventional energy generation subject to uncertain operational constraints. These …




CVaR-based approximations of Wasserstein Distributionally robust chance constraints with application to process scheduling

B Liu, Q Zhang, X Ge, Z Yuan - Industrial & Engineering Chemistry …, 2020 - ACS Publications

Distributionally robust chance constrained programming is a stochastic optimization

approach that considers uncertainty in model parameters as well as uncertainty in the

underlying probability distribution. It ensures a specified probability of constraint satisfaction …

 Cited by 5 Related articles All 5 versions



[PDF] mlr.press

Principled learning method for Wasserstein distributionally robust optimization with local perturbations

Y Kwon, W Kim, JH Won… - … Conference on Machine …, 2020 - proceedings.mlr.press

Wasserstein distributionally robust optimization (WDRO) attempts to learn a model that

minimizes the local worst-case risk in the vicinity of the empirical data distribution defined by

Wasserstein ball. While WDRO has received attention as a promising tool for inference since …

 Related articles All 7 versions 



Wasserstein based two-stage distributionally robust optimization model for optimal operation of CCHP micro-grid under uncertainties

Y Wang, Y Yang, L Tang, W Sun, B Li - International Journal of Electrical …, 2020 - Elsevier

Combined cooling, heating and power (CCHP) micro-grids are getting increasing attentions

due to the realization of cleaner production and high energy efficiency. However, with the

features of complex tri-generation structure and renewable power uncertainties, it is …

 Cited by 20 Related articles All 2 versions


2020  [PDF] arxiv.org

Wasserstein distributionally robust shortest path problem

Z Wang, K YouS SongY Zhang - European Journal of Operational …, 2020 - Elsevier

This paper proposes a data-driven distributionally robust shortest path (DRSP) model where

the distribution of the travel time in the transportation network can only be partially observed

through a finite number of samples. Specifically, we aim to find an optimal path to minimize …

 Cited by 9 Related articles All 8 versions


A linear programming approximation of distributionally robust chance-constrained dispatch with wasserstein distance

A Zhou, M Yang, M Wang… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

This paper proposes a data-driven distributionally robust chance constrained real-time

dispatch (DRCC-RTD) considering renewable generation forecasting errors. The proposed

DRCC-RTD model minimizes the expected quadratic cost function and guarantees that the …

 Cited by 16 Related articles All 2 versions



[PDF] researchgate.net

Tractable reformulations of two-stage distributionally robust linear programs over the type-∞ wasserstein ball

W Xie - Operations Research Letters, 2020 - Elsevier

This paper studies a two-stage distributionally robust stochastic linear program under the

type-∞ Wasserstein ball by providing sufficient conditions under which the program can be

efficiently computed via a tractable convex program. By exploring the properties of binary …

 Cited by 13 Related articles All 4 versions


[PDF] ieee.org

Distributionally robust optimal reactive power dispatch with wasserstein distance in active distribution network

J Liu, Y Chen, C Duan, J Lin… - Journal of Modern Power …, 2020 - ieeexplore.ieee.org

The uncertainties from renewable energy sources (RESs) will not only introduce significant

influences to active power dispatch, but also bring great challenges to the analysis of

optimal reactive power dispatch (ORPD). To address the influence of high penetration of …

 Cited by 13 Related articles All 3 versions


Data-driven distributionally robust unit commitment with Wasserstein metric: Tractable formulation and efficient solution method

X Zheng, H Chen - IEEE Transactions on Power Systems, 2020 - ieeexplore.ieee.org

In this letter, we propose a tractable formulation and an efficient solution method for the

Wasserstein-metric-based distributionally robust unit commitment (DRUC-dW) problem.

First, a distance-based data aggregation method is introduced to hedge against the …

 Cited by 8 Related articles All 2 versions

<——2020——2020—3290—-


[PDF] arxiv.org

Safe Zero-Shot Model-Based Learning and Control: A Wasserstein Distributionally Robust Approach

A Kandel, SJ Moura - arXiv preprint arXiv:2004.00759, 2020 - arxiv.org

This paper explores distributionally robust zero-shot model-based learning and control

using Wasserstein ambiguity sets. Conventional model-based reinforcement learning

algorithms struggle to guarantee feasibility throughout the online learning process. We …

 Cited by 1 Related articles All 2 versions 


[PDF] arxiv.org

Consistency of Distributionally Robust Risk-and Chance-Constrained Optimization Under Wasserstein Ambiguity Sets

A Cherukuri, AR Hota - IEEE Control Systems Letters, 2020 - ieeexplore.ieee.org

We study stochastic optimization problems with chance and risk constraints, where in the

latter, risk is quantified in terms of the conditional value-at-risk (CVaR). We consider the

distributionally robust versions of these problems, where the constraints are required to hold …

 Cited by 3 Related articles All 4 versions


[PDF] nsf.gov

A data-driven distributionally robust game using wasserstein distance

G Peng, T Zhang, Q Zhu - International Conference on Decision and Game …, 2020 - Springer

This paper studies a special class of games, which enables the players to leverage the

information from a dataset to play the game. However, in an adversarial scenario, the

dataset may not be trustworthy. We propose a distributionally robust formulation to introduce …

 Cited by 1 Related articles All 3 versions


[PDF] arxiv.org

Strong formulations for distributionally robust chance-constrained programs with left-hand side uncertainty under Wasserstein ambiguity

N Ho-Nguyen, F Kılınç-Karzan, S Küçükyavuz… - arXiv preprint arXiv …, 2020 - arxiv.org

Distributionally robust chance-constrained programs (DR-CCP) over Wasserstein ambiguity

sets exhibit attractive out-of-sample performance and admit big-$ M $-based mixed-integer

programming (MIP) reformulations with conic constraints. However, the resulting …

 Cited by 4 Related articles All 3 versions 

 

[PDF] arxiv.org

Generalized Wasserstein Dice Score, Distributionally Robust Deep Learning, and Ranger for brain tumor segmentation: BraTS 2020 challenge

L Fidon, S Ourselin, T Vercauteren - arXiv preprint arXiv:2011.01614, 2020 - arxiv.org

Training a deep neural network is an optimization problem with four main ingredients: the

design of the deep neural network, the per-sample loss function, the population loss

function, and the optimizer. However, methods developed to compete in recent BraTS …

 Cited by 3 Related articles All 3 versions 


2020


Data-driven stochastic programming with distributionally robust constraints under Wasserstein distance: asymptotic properties

Y Mei, ZP Chen, BB Ji, ZJ Xu, J Liu - … of the Operations Research Society of …, 2020 - Springer

Distributionally robust optimization is a dominant paradigm for decision-making problems

where the distribution of random variables is unknown. We investigate a distributionally

robust optimization problem with ambiguities in the objective function and countably infinite …

 Cited by 2 Related articles


[PDF] epfl.ch

Wasserstein Distributionally Robust Learning

S Shafieezadeh Abadeh - 2020 - infoscience.epfl.ch

Many decision problems in science, engineering, and economics are affected by

uncertainty, which is typically modeled by a random variable governed by an unknown

probability distribution. For many practical applications, the probability distribution is only …

 Related articles 

[CITATION] Wasserstein Distributionally Robust Learning

OS Abadeh - 2020 - Ecole Polytechnique Fédérale de …

[PDF] openreview.net

Wasserstein Distributionally Robust Optimization: A Three-Player Game Framework

Z Tu, S You, T Huang, D Tao - 2020 - openreview.net

Wasserstein distributionally robust optimization (DRO) has recently received significant

attention in machine learning due to its connection to generalization, robustness and

regularization. Existing methods only consider a limited class of loss functions or apply to …

 Related articles 


[PDF] arxiv.org

Data-driven Distributionally Robust Optimal Stochastic Control Using the Wasserstein Metric

F Zhao, K You - arXiv preprint arXiv:2010.06794, 2020 - arxiv.org

Optimal control of a stochastic dynamical system usually requires a good dynamical model

with probability distributions, which is difficult to obtain due to limited measurements and/or

complicated dynamics. To solve it, this work proposes a data-driven distributionally robust …

 Related articles All 2 versions 


[PDF] optimization-online.org

[PDF] Dual Decomposition of Two-Stage Distributionally Robust Mixed-Integer Programming under the Wasserstein Ambiguity Set

K Kim - Preprint manuscript, 2020 - optimization-online.org

We develop a dual decomposition of two-stage distributionally robust mixed-integer

programming (DRMIP) under the Wasserstein ambiguity set. The dual decomposition is

based on the Lagrangian dual of DRMIP, which results from the Lagrangian relaxation of the …

 Cited by 1 Related articles All 2 versions 

<——2020——2020—3300—- 


[PDF] ieee.org

A New Data-Driven Distributionally Robust Portfolio Optimization Method Based on Wasserstein Ambiguity Set

N Du, Y Liu, Y Liu - IEEE Access, 2020 - ieeexplore.ieee.org

Since optimal portfolio strategy depends heavily on the distribution of uncertain returns, this

article proposes a new method for the portfolio optimization problem with respect to

distribution uncertainty. When the distributional information of the uncertain return rate is …

 Cited by 3 Related articles


[PDF] umn.edu

Data-driven Distributionally Robust Stochastic Optimization via Wasserstein Distance with Applications to Portfolio Risk Management and Inventory Control

DR Singh - 2020 - search.proquest.com

The central theme of this dissertation is stochastic optimization under distributional

ambiguity. One can think of this as a two player game between a decision maker, who tries

to minimize some loss or maximize some reward, and an adversarial agent that chooses the …

 Related articles All 4 versions


Relaxed Wasserstein with Applications to GAN

Xin Guo, Johnny Hong, Tianyi Lin, Nan Yang

Wasserstein Generative Adversarial Networks (WGANs) provide a versatile class of models, which have attracted great attention in various applications. However, this framework has two main drawbacks: (i) Wasserstein-1 (or Earth-Mover) distance is restrictive such that WGANs cannot always fit data geometry well; (ii) It is difficult to achieve fast training of WGANs. In this paper, we propose a new class of \textit{Relaxed Wasserstein} (RW) distances by generalizing Wasserstein-1 distance with Bregman cost functions. We show that RW distances achieve nice statistical properties while not sacrificing the computational tractability. Combined with the GANs framework, we develop Relaxed WGANs (RWGANs) which are not only statistically flexible but can be approximated efficiently using heuristic approaches. Experiments on real images demonstrate that the RWGAN with Kullback-Leibler (KL) cost function outperforms other competing approaches, e.g., WGANs, even with gradient penalty.

[v1] Fri, 19 May 2017 19:51:34 UTC (2,234 KB)

[v2] Tue, 31 Oct 2017 08:39:34 UTC (4,223 KB)

[v3] Fri, 30 Mar 2018 01:54:01 UTC (4,223 KB)

[v4] Sun, 16 Sep 2018 20:50:00 UTC (4,224 KB)

[v5] Sat, 4 May 2019 08:49:44 UTC (4,232 KB)

[v6] Thu, 22 Oct 2020 08:18:42 UTC (4,230 KB)

[v7] Sat, 6 Feb 2021 09:33:22 UTC (4,226 KB)

[v8] Sat, 17 Jul 2021 06:03:54 UTC (4,225 KB)
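For context on the WGAN baseline that Relaxed Wasserstein generalizes, the critic estimates the Kantorovich-Rubinstein dual of W1, commonly with a gradient penalty that pushes the critic toward 1-Lipschitz behaviour. A hedged PyTorch sketch of that standard critic loss (generic WGAN-GP, not the RWGAN of this paper; critic, real and fake are placeholder names):

import torch

def wgan_gp_critic_loss(critic, real, fake, gp_weight=10.0):
    # One critic-update loss for WGAN with gradient penalty (Gulrajani et al., 2017).
    # critic maps a batch of samples to scalar scores; fake is assumed detached from the generator.
    wasserstein_term = critic(fake).mean() - critic(real).mean()

    # Gradient penalty on random interpolates between real and fake samples.
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    interp = (eps * real + (1.0 - eps) * fake).detach().requires_grad_(True)
    grad = torch.autograd.grad(critic(interp).sum(), interp, create_graph=True)[0]
    penalty = ((grad.flatten(1).norm(2, dim=1) - 1.0) ** 2).mean()

    return wasserstein_term + gp_weight * penalty

In the usual training loop the critic takes several steps with this loss per generator step, while the generator minimizes -critic(fake).mean().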

Wasserstein Generative Adversarial Networks (WGAN & WGAN-GP) in ... TensorFlow Tutorial 5 - Adding ...

Nov 3, 2020 · Uploaded by Aladdin P


2020

Illumination-invariant flotation froth color measuring via Wasserstein distance-based CycleGAN with structure-preserving constraint

J Liu, J He, Y Xie, W Gui, Z Tang, T Ma… - IEEE transactions on …, 2020 - ieeexplore.ieee.org

Froth color can be referred to as a direct and instant indicator to the key flotation production

index, for example, concentrate grade. However, it is intractable to measure the froth color

robustly due to the adverse interference of time-varying and uncontrollable multisource …

 Cited by 21 Related articles All 3 versions


Closed-form Expressions for Maximum Mean Discrepancy with Applications to Wasserstein Auto-Encoders

[Submitted on 10 Jan 2019 (v1), last revised 2 Jun 2020 (this version, v2)]

by RM Rustamov · 2019 · Cited by 6 — In this paper we compute closed-form expressions for estimating the Gaussian kernel based MMD between a given distribution and the standard ...


Closed‐form expressions for maximum mean discrepancy with applications to Wasserstein...

by Rustamov, Raif M

Stat (International Statistical Institute), 12/2021, Volume 10, Issue 1

Journal ArticleCitation Online


2020


Probability distribution fitting with Wasserstein metrics - Cross ...

https://stats.stackexchange.com › questions › probabilit...

Nov 15, 2020 — I have a relatively complex physical model for a process occurring, and I can numerically solve the relevant differential e

2020 [PDF] arxiv.org

Hierarchical gaussian processes with wasserstein-kernels

S Popescu, D Sharp, J Cole, B Glocker - arXiv preprint arXiv:2010.14877, 2020 - arxiv.org

We investigate the usefulness of Wasserstein-kernels in the context of hierarchical

Gaussian Processes. Stemming from an observation that stacking Gaussian Processes

severely diminishes the model's ability to detect outliers, which when combined with non …

 Cited by 3 Related articles All 3 versions 


2020  [PDF] ieee.org

Robust multivehicle tracking with wasserstein association metric in surveillance videos

Y Zeng, X Fu, L Gao, J Zhu, H Li, Y Li - IEEE Access, 2020 - ieeexplore.ieee.org

Vehicle tracking based on surveillance videos is of great significance in the highway traffic

monitoring field. In real-world vehicle-tracking applications, partial occlusion and objects

with similarly appearing distractors pose significant challenges. For addressing the above …

 Cited by 6 Related articles


2020  [PDF] arxiv.org

An LP-based, strongly-polynomial 2-approximation algorithm for sparse Wasserstein barycenters

S Borgwardt - Operational Research, 2020 - Springer

Discrete Wasserstein barycenters correspond to optimal solutions of transportation problems

for a set of probability measures with finite support. Discrete barycenters are measures with

finite support themselves and exhibit two favorable properties: there always exists one with a …

 Cited by 5 Related articles All 3 versions


2020  [PDF] ucl.ac.uk

Wasserstein-distance-based temporal clustering for capacity-expansion planning in power systems

L Condeixa, F Oliveira… - … Conference on Smart …, 2020 - ieeexplore.ieee.org

As variable renewable energy sources are steadily incorporated in European power

systems, the need for higher temporal resolution in capacity-expansion models also

increases. Naturally, there exists a trade-off between the amount of temporal data used to …

 Cited by 1 Related articles All 6 versions

Distributionally Safe Path PlanningWasserstein Safe RRT

P Lathrop, B Boardman… - IEEE Robotics and …, 2021 - ieeexplore.ieee.org

In this paper, we propose a Wasserstein metric-based random path planning algorithm.

Wasserstein Safe RRT (W-Safe RRT) provides finite-sample probabilistic guarantees on the

safety of a returned path in an uncertain obstacle environment. Vehicle and obstacle states …

Cited by 3 Related articles All 7 versions

<——2020——2020—3310—-


2020  [PDF] github.io

[PDF] Lecture 3: Wasserstein Space

L Chizat - 2020 - lchizat.github.io

Let X, Y be compact metric spaces, c ∈ C(X × Y) the cost function and (µ, ν) ∈ P(X) × P(Y)

the marginals. In previous lectures, we have seen that the optimal transport problem can be

formulated as an optimization over the space of transport plans Π (µ, ν)—the primal or …

 All 2 versions 
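For readers skimming this lecture entry, the primal problem referred to is the Kantorovich formulation

OT_c(\mu, \nu) = \inf_{\pi \in \Pi(\mu, \nu)} \int_{X \times Y} c(x, y) \, d\pi(x, y),

and, taking X = Y with cost c = d^p, it induces the Wasserstein distance W_p(\mu, \nu) = OT_{d^p}(\mu, \nu)^{1/p} on P(X).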


[PDF] ieee.org

A New Data-Driven Distributionally Robust Portfolio Optimization Method Based on Wasserstein Ambiguity Set

N Du, Y Liu, Y Liu - IEEE Access, 2020 - ieeexplore.ieee.org

Since optimal portfolio strategy depends heavily on the distribution of uncertain returns, this

article proposes a new method for the portfolio optimization problem with respect to

distribution uncertainty. When the distributional information of the uncertain return rate is …

 Cited by 3 Related articles


[PDF] arxiv.org

Portfolio Optimisation within a Wasserstein Ball

SM Pesenti, S Jaimungal - Available at SSRN, 2020 - papers.ssrn.com

We consider the problem of active portfolio management where a loss-averse and/or gain-

seeking investor aims to outperform a benchmark strategy's risk profile while not deviating

too much from it. Specifically, an investor considers alternative strategies that co-move with …

 Cited by 1 Related articles All 7 versions


2020 see 2019

Investigating Under and Overfitting in Wasserstein Generative ...

https://www.researchgate.net › ... › Discrimination

Oct 22, 2020 — We investigate under and overfitting in Generative Adversarial Networks (GANs), using discriminators unseen by the generator to measure ...

[CITATION] Investigating under and overfitting in wasserstein generative adversarial networks. arXiv

B Adlam, C Weill, A Kapoor - ar


2020

Posterior summaries for the Wasserstein barycenter of subset...

https://rdrr.io › CRAN › waspr

Jul 25, 2020 — summary gives a posterior summary (mean, mode, sd, HPD)


2020


Tao, Tao; Bai, Jianshu; Liu, Heng; Hou, Shudong; Zheng, Xiao

Differential privacy protection method for deep learning based on WGAN feedback. (English) Zbl 07448687

J. Univ. Sci. Technol. China 50, No. 8, 1064-1071 (2020).

Zbl 1488.68025

2020

 Interacting Langevin diffusions: Gradient structure and ensemble Kalman sampler

A Garbuno-Inigo, F Hoffmann, W Li, AM Stuart - SIAM Journal on Applied …, 2020 - SIAM

… In summary, the objective of the inverse problem is to find information about the truth u†

underlying … based methodologies of current interest include Stein variational gradient descent

[61… and exhibit a novel Kalman--Wasserstein gradient flow structure in the associated nonlinear …

 Cited by 75 Related articles All 8 versions


 

Sensitivity analysis of Wasserstein distributionally robust ...

https://arxiv.org › math

by D Bartl · 2020 · Cited by 10 — We consider sensitivity of a generic stochastic optimization problem to model uncertainty. We take a non-parametric approach and capture model ...



2020  [HTML] hindawi.com

[HTML] Solutions of a class of degenerate kinetic equations using steepest descent in wasserstein space

A Marcos, A Soglo - Journal of Mathematics, 2020 - hindawi.com

We use the steepest descent method in an Orlicz–Wasserstein space to study the existence

of solutions for a very broad class of kinetic equations, which include the Boltzmann

equation, the Vlasov–Poisson equation, the porous medium equation, and the parabolic p …

  Cited by 2 Related articles All 7 versions 


2020  [PDF] upc.edu

Rethinking Wasserstein-Procrustes for Aligning Word Embeddings Across Languages

G Ramírez Santos - 2020 - upcommons.upc.edu

The emergence of unsupervised word embeddings, pre-trained on very large monolingual

text corpora, is at the core of the ongoing neural revolution in Natural Language Processing

(NLP). Initially introduced for English, such pre-trained word embeddings quickly emerged …

<——2020——2020—3320—- 



[PDF] mlr.press

Approximate inference with wasserstein gradient flows

C Frogner, T Poggio - International Conference on Artificial …, 2020 - proceedings.mlr.press

We present a novel approximate inference method for diffusion processes, based on the Wasserstein gradient flow formulation of the diffusion. In this formulation, the time-dependent density of the diffusion is derived as the limit of implicit Euler steps that follow the gradients …

  Cited by 17 Related articles All 3 versions 
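The "implicit Euler steps" mentioned in this abstract are usually written as the JKO (Jordan-Kinderlehrer-Otto) scheme; a minimal LaTeX sketch, where F denotes the free-energy functional driving the diffusion and τ > 0 the step size (both symbols are assumptions here, not taken from the entry):

\[
  \rho_{k+1} \;\in\; \operatorname*{arg\,min}_{\rho \in \mathcal{P}_2(\mathbb{R}^d)}
  \; F(\rho) + \frac{1}{2\tau}\, W_2^2(\rho, \rho_k),
\]

and letting τ → 0 recovers the Wasserstein gradient flow of F as the limit of these proximal steps.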


[PDF] mlr.press

Stochastic optimization for regularized wasserstein estimators

M Ballu, Q Berthet, F Bach - International Conference on …, 2020 - proceedings.mlr.press

Optimal transport is a foundational problem in optimization, that allows to compare probability distributions while taking into account geometric aspects. Its optimal objective value, the Wasserstein distance, provides an important loss between distributions that has …

  Cited by 12 Related articles All 6 versions 

Stochastic Optimization for Regularized Wasserstein Estimators

F Bach, M Ballu, Q Berthet - 2020 - research.google

Optimal transport is a foundational problem in optimization, that allows to compare probability distributions while taking into account geometric aspects. Its optimal objective value, the Wasserstein distance, provides an important loss between distributions that has …

 

[PDF] arxiv.org

Wasserstein distributionally robust stochastic control: A data-driven approach

I Yang - IEEE Transactions on Automatic Control, 2020 - ieeexplore.ieee.org

Standard stochastic control methods assume that the probability distribution of uncertain variables is available. Unfortunately, in practice, obtaining accurate distribution information is a challenging task. To resolve this issue, in this article we investigate the problem of …

  Cited by 31 Related articles All 3 versions


[PDF] arxiv.org

Stochastic equation and exponential ergodicity in Wasserstein distances for affine processes

M Friesen, P Jin, B Rüdiger - The Annals of Applied Probability, 2020 - projecteuclid.org

This work is devoted to the study of conservative affine processes on the canonical state space $ D=\mathbb {R} _ {+}^{m}\times\mathbb {R}^{n} $, where $ m+ n> 0$. We show that each affine process can be obtained as the pathwise unique strong solution to a stochastic  …

  Cited by 11 Related articles All 6 versions


[PDF] arxiv.org

Online Stochastic Optimization with Wasserstein Based Non-stationarity

J Jiang, X Li, J Zhang - arXiv preprint arXiv:2012.06961, 2020 - arxiv.org

We consider a general online stochastic optimization problem with multiple budget constraints over a horizon of finite time periods. In each time period, a reward function and multiple cost functions are revealed, and the decision maker needs to specify an action from …

 Cited by 4 Related articles All 2 versions 


2020



[PDF] arxiv.org

Minimax control of ambiguous linear stochastic systems using the Wasserstein metric

K Kim, I Yang - 2020 59th IEEE Conference on Decision and …, 2020 - ieeexplore.ieee.org

In this paper, we propose a minimax linear-quadratic control method to address the issue of inaccurate distribution information in practical stochastic systems. To construct a control policy that is robust against errors in an empirical distribution of uncertainty, our method …

 Cited by 4 Related articles All 4 versions


[PDF] arxiv.org

Stochastic Approximation versus Sample Average Approximation for population Wasserstein barycenters

D Dvinskikh - arXiv preprint arXiv:2001.07697, 2020 - arxiv.org

In machine learning and optimization community there are two main approaches for convex risk minimization problem, namely, the Stochastic Approximation (SA) and the Sample Average Approximation (SAA). In terms of oracle complexity (required number of stochastic  …

  Cited by 3 Related articles All 2 versions 

[CITATION] Stochastic approximation versus sample average approximation for population Wasserstein barycenter calculation. arXiv e-prints, art

D Dvinskikh - arXiv preprint arXiv:2001.07697, 2020



Covariance Steering of Discrete-Time Stochastic Linear Systems Based on Wasserstein Distance Terminal Cost

IM Balci, E Bakolas - IEEE Control Systems Letters, 2020 - ieeexplore.ieee.org

We consider a class of stochastic optimal control problems for discrete-time linear systems whose objective is the characterization of control policies that will steer the probability distribution of the terminal state of the system close to a desired Gaussian distribution. In our …

  Cited by 6 Related articles All 2 versions


[PDF] arxiv.org

Stochastic saddle-point optimization for wasserstein barycenters

D Tiapkin, A Gasnikov, P Dvurechensky - arXiv preprint arXiv:2006.06763, 2020 - arxiv.org

We consider population Wasserstein barycenter problem for random probability measures supported on a finite set of points and generated by an online stream of data. This leads to a complicated stochastic optimization problem where the objective is given as an expectation …

  Cited by 3 Related articles All 4 versions 


[PDF] arxiv.org

Data-driven Distributionally Robust Optimal Stochastic Control Using the Wasserstein Metric

F Zhao, K You - arXiv preprint arXiv:2010.06794, 2020 - arxiv.org

Optimal control of a stochastic dynamical system usually requires a good dynamical model with probability distributions, which is difficult to obtain due to limited measurements and/or complicated dynamics. To solve it, this work proposes a data-driven distributionally robust …

  Related articles All 2 versions 

<——2020——2020—3330—- 


[PDF] ntu.edu.sg

Wasserstein distance estimates for stochastic integrals by forward-backward stochastic calculus

JC Breton, N Privault - Potential Analysis, 2020 - Springer

We prove Wasserstein distance bounds between the probability distributions of stochastic integrals with jumps, based on the integrands appearing in their stochastic integral representations. Our approach does not rely on the Stein equation or on the propagation of …

  Related articles All 4 versions


[PDF] arxiv.org

Online Stochastic Convex Optimization: Wasserstein Distance Variation

I Shames, F Farokhi - arXiv preprint arXiv:2006.01397, 2020 - arxiv.org

Distributionally-robust optimization is often studied for a fixed set of distributions rather than time-varying distributions that can drift significantly over time (which is, for instance, the case in finance and sociology due to underlying expansion of economy and evolution of …

  Cited by 2 Related articles All 4 versions 


[PDF] umn.edu

Data-driven Distributionally Robust Stochastic Optimization via Wasserstein Distance with Applications to Portfolio Risk Management and Inventory Control

DR Singh - 2020 - search.proquest.com

The central theme of this dissertation is stochastic optimization under distributional ambiguity. One can think of this as a two player game between a decision maker, who tries to minimize some loss or maximize some reward, and an adversarial agent that chooses the …

 Related articles All 4 versions


2020

Data augmentation-based conditional Wasserstein ... - PeerJ

https://peerj.com › articles

by FMM Mokbal · 2020 · Cited by 3 — Despite the model's complexity, the DR score was 0.9480, which is inadequate for detecting malicious attacks. Moreover, the model has a high FP ...

20 pages
  

 2020

Selective Multi-source Transfer Learning with Wasserstein Domain Distance for Financial Fraud Detection

Y Sun, L Lan, X Zhao, M Fan, Q Guo, C Li - … Intelligent Computing and …, 2020 - Springer

… For XGBoost, we did parameter tuning via an extensive grid search on parameters such 

as learning rate, sub-tree number, max tree depth, sub-sample columns, and so on. For Mix-Finetune 

and AVE-trans, we used the same neural network structure with our WM-trans model for …


2020


wasp: Compute Wasserstein barycenters of subset posteriors

https://rdrr.io › CRAN › waspr

Jul 25, 2020 — This function computes Wasserstein Barycenters of subset posteriors and gives posterior summaries for the full posterior.
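As a rough illustration of what such a barycenter computation does in the simplest setting, the Python sketch below averages quantile functions of one-dimensional subset-posterior samples, which yields their 2-Wasserstein barycenter. It is a minimal sketch of the concept only, not the waspr implementation, and every name in it (wasserstein_barycenter_1d, the toy subsets) is made up for illustration.

import numpy as np

def wasserstein_barycenter_1d(subset_samples, grid_size=1000):
    """2-Wasserstein barycenter of one-dimensional empirical distributions.

    In one dimension the W2 barycenter is obtained by averaging quantile
    functions, so no optimization is required.
    """
    qs = np.linspace(0.0, 1.0, grid_size)
    # Quantile function of each subset posterior on a common grid.
    quantiles = np.stack([np.quantile(s, qs) for s in subset_samples])
    # Averaging the quantile functions gives the barycenter's quantiles.
    return quantiles.mean(axis=0)

# Toy usage: three subset posteriors with different locations.
rng = np.random.default_rng(0)
subsets = [rng.normal(loc=m, scale=1.0, size=5000) for m in (-1.0, 0.0, 1.5)]
bary = wasserstein_barycenter_1d(subsets)
print("barycenter mean:", bary.mean(), "sd:", bary.std())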


Стохастические уравнения со взаимодействием. Лекция 4. Расстояние Вассерштейна

https://www.youtube.com › watch

Стохастические уравнения со взаимодействием. Лекция 4. Расстояние Вассерштейна. 110 views. Sep 27, 2020.

youtube video

[Russian Stochastic equations with interaction. Lecture 4. The Wasserstein distance]

Sep 27, 2020.

   2020

[2012.12687] Wasserstein Dropout

https://arxiv.org › cs

by J Sicking · 2020 — Abstract: Despite of its importance for safe machine learning, uncertainty quantification for neural networks is far from being solved.
Conference Paper  Citation/Abstract

Data Augmentation Method for Fault Diagnosis of Mechanical Equipment Based on Improved Wasserstein GAN
Duan, Lixiang; Tang, Yu; Yang, Jialing.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2020).
Abstract/Details   Show Abstract 
Cited by 1
Related articles
 
Conference Paper  Citation/Abstract
Hyperspectral Image Classification Approach Based on Wasserstein Generative Adversarial Networks
Chen, Naigeng.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2020).
Abstract/Details 
Show Abstract 

<——2020——2020—3340—-



Conference Paper  Citation/Abstract

EEG data augmentation using Wasserstein GAN
Bouallegue, Ghaith; Djemal, Ridha.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2020).
Abstract/Details  Show Abstract 


Conference Paper  Citation/Abstract

Numeric Data Augmentation using Structural Constraint Wasserstein Generative Adversarial Networks

Wang, Wei; Wang, Chuang; Cui, Tao; Gong, Ruohan; Tang, Zuqi; et al.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2020).

Abstract/Details  Show Abstract 

 
Conference Paper  Citation/Abstract

Biosignal Oversampling Using Wasserstein Generative Adversarial Network
Nourani, Mehrdad; Houari, Sammy.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2020).

Abstract/Details  Show Abstract 
Cited by 2
 Related articles All 3 versions
 
Scholarly Journal  Citation/Abstract

Progressive Wasserstein Barycenters of Persistence Diagrams
Vidal, Jules; Budin, Joseph; Tierny, Julien.IEEE Transactions on Visualization and Computer Graphics; New York Vol. 26, Iss. 1,  (2020): 151-161.

Abstract/Details Get full textLink to external site, this link will open in a new window
Show Abstract 


Scholarly Journal  Citation/Abstract

Aggregated Wasserstein Distance and State Registration for Hidden Markov Models
Chen, Yukun; Ye, Jianbo; Li, Jia.IEEE Transactions on Pattern Analysis and Machine Intelligence; New York Vol. 42, Iss. 9,  (2020): 2133-2147.
Abstract/Details 
Show Abstract 

2020
 
Scholarly Journal  Citation/Abstract

Hyperbolic Wasserstein Distance for Shape Indexing
Shi, Jie; Wang, Yalin.IEEE Transactions on Pattern Analysis and Machine Intelligence; New York Vol. 42, Iss. 6,  (2020): 1362-1376.
Abstract/Details Get full textLink to external site, this link will open in a new window
Cited by (‎1)

Show Abstract 

Scholarly Journal  Citation/Abstract

Modeling EEG Data Distribution With a Wasserstein Generative Adversarial Network to Predict RSVP Events
Panwar, Sharaj; Rad, Paul; Jung, Tzyy-Ping; Huang, Yufei.IEEE Transactions on Neural Systems and Rehabilitation Engineering; New York Vol. 28, Iss. 8,  (2020): 1720-1730.

Abstract/Details Get full textLink to external site, this link will open in a new window
Show Abstract 

Cited by 23 Related articles All 6 versions

Scholarly Journal  Citation/Abstract

Learning to Align via Wasserstein for Person Re-Identification
Zhang, Zhizhong; Xie, Yuan; Ding, Li; Zhang, Wensheng; Tian, Qi.IEEE Transactions on Image Processing; New York Vol. 29,  (2020): 7104-7116.
Abstract/Details   Show Abstract 

Cited by 8 Related articles All 2 versions


Scholarly Journal  Citation/Abstract

Study of Restrained Network Structures for Wasserstein Generative Adversarial Networks (WGANs) on Numeric Data Augmentation
Wang, Wei; Wang, Chuang; Cui, Tao; Li, Yue.IEEE Access; Piscataway Vol. 8,  (2020): 89812-89821.
Abstract/Details Get full textLink to external site, this link will open in a new window
Show Abstract 

Network Intrusion Detection Based on Conditional Wasserstein Generative Adversarial Network and Cost-Sensitive Stacked Autoencoder
Zhang, Guoling; Wang, Xiaodan; Li, Rui; Song, Yafei; He, Jiaxing; et al. IEEE Access; Piscataway Vol. 8,  (2020): 190431-190447.
Abstract/Details Get full textLink to external site, this link will open in a new window
Show Abstract 

<——2020——2020————3350—- 


Working Paper  Full Text

Probability Forecast Combination via Entropy Regularized Wasserstein Distance
Cumings-Menon, Ryan; Shin, Minchul.IDEAS Working Paper Series from RePEc; St. Louis, 2020.
Abstract/Details Get full textLink to external site, this link will open in a new window

Show Abstract
Scholarly Journal  Full Text

Probability Forecast Combination via Entropy Regularized Wasserstein Distance
Cumings-Menon, Ryan; Shin, Minchul.Entropy; Basel Vol. 22, Iss. 9,  (2020): 929.
Abstract/DetailsFull textFull text - PDF (931
 
Working Paper  Full Text
Multivariate Goodness-of-Fit Tests Based on Wasserstein Distance
Hallin, Marc; Mordant, Gilles; Segers, Johan.IDEAS Working Paper Series from RePEc; St. Louis, 2020.
Abstract/DetailsGet full text
Link to external site, this link will open in a new window

Working Paper  Full Text
Pruned Wasserstein Index Generation Model and wigpy Package
Xie, Fangzhou.IDEAS Working Paper Series from RePEc; St. Louis,
Abstract/DetailsGet full text
Link to external site, this link will open in a new window

Scholarly Journal  Citation/Abstract

Multiple Voltage Sag Events Homology Detection Based on Wasserstein Distance
Xiao, Xianyong; Gui, Liangyu; Li, Chengxin; Zhang, Huaying; Li, Hongxin; et al.Dianwang Jishu = Power System Technology; Beijing Iss. 12,  (2020): 4684.

Abstract/DetailsShow Abstract 
Scholarly Journal Citation/Abstract

Data Augmentation Method for Power Transformer Fault Diagnosis Based on Conditional Wasserstein Generative Adversarial Network
Liu, Yunpeng; Xu, Ziqiang; He, Jiahui; Wang, Quan; Gao, Shuguo; et al.Dianwang Jishu = Power System Technology; Beijing Iss. 4,  (2020): 1505.

Abstract/Details

Show Abstract 
Scholarly Journal  Full Text
An Integrated Consensus Improving Strategy Based on PL-Wasserstein Distance and Its Application in the Evaluation of Network Public Opinion Emergencies
Zhang, Shitao; Ma, Zhenzhen; Liu, Xiaodi; Wang, Zhiying; Jiang, Lihui.Complexity; Hoboken Vol. 2020,  (2020).
Abstract/DetailsFull textFull text - PDF (752 KB)
Show Abstract 


2020

Remote Sensing Image Segmentation based on Generative Adversarial Network with Wasserstein divergence

X Cao, C Song, J Zhang, C Liu - 2020 3rd International Conference on …, 2020 - dl.acm.org

… In this study, we have proposed an image segmentation method based on the generative

adversarial technique for remote sensing. To improve the segmentation performance, we

have adopted L1 regression loss for the generative model and W-div loss for the discriminator …

Related articles

 

2020
 

Scholarly Journal  Citation/Abstract

Optimal control of multiagent systems in the Wasserstein space
Jimenez, Chloé; Marigonda, Antonio; Quincampoix, Marc. Calculus of Variations and Partial Differential Equations; Heidelberg Vol. 59, Iss. 2,  (2020).

Abstract/Details References (‎31)

Show Abstract 


2020  [PDF] ucl.ac.uk

Wasserstein-distance-based temporal clustering for capacity-expansion planning in power systems

L Condeixa, F Oliveira… - … Conference on Smart …, 2020 - ieeexplore.ieee.org

… of temporal data used to plan power systems for decades ahead and time resolution needed

to represent renewable energy variability accurately. We propose the use of the Wasserstein

… , the hierarchical clustering performed using the Wasserstein distance leads to capacity-…

Cited by 1 Related articles All 6 versions


2020  [PDF] thecvf.com

Severity-aware semantic segmentation with reinforced wasserstein training

X Liu, W Ji, J You, GE Fakhri… - Proceedings of the IEEE …, 2020 - openaccess.thecvf.com

… to the Wasserstein distance as an alternative for cross-entropy loss. The 1st Wasserstein

distance can be the optimal transport for transferring the probability masses from a source

distribution to a target distribution [41]. For each pixel, we can calculate the Wasserstein distance …

Cited by 13 Related articles All 5 versions 


2020  [PDF] aaai.org

Importance-aware semantic segmentation in self-driving with discrete wasserstein training

X Liu, Y Han, S Bai, Y Ge, T Wang, X Han, S Li… - Proceedings of the …, 2020 - ojs.aaai.org

… inter-class correlation in a Wasserstein training framework by configuring its ground distance

… In our extensive experiments, Wasserstein loss demonstrates superior segmentation … yet

effective loss function for semantic segmentation based on the Wasserstein distance. It is an …

Cited by 15 Related articles All 6 versions 
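A minimal sketch of the per-pixel idea described in the two segmentation entries above: the discrete Wasserstein loss between a predicted class distribution and a one-hot target is an exact optimal-transport linear program over a chosen ground-distance matrix between classes. The ground matrix, class count, and helper name below are illustrative assumptions; the papers' own training pipelines are not reproduced here.

import numpy as np
from scipy.optimize import linprog

def discrete_wasserstein(p, q, ground):
    """Exact discrete OT cost between histograms p and q (each summing to 1),
    where ground[i, j] is the cost of moving mass from class i to class j."""
    n = len(p)
    c = ground.reshape(-1)
    # Row-sum constraints: sum_j gamma[i, j] = p[i].
    a_rows = np.zeros((n, n * n))
    for i in range(n):
        a_rows[i, i * n:(i + 1) * n] = 1.0
    # Column-sum constraints: sum_i gamma[i, j] = q[j].
    a_cols = np.zeros((n, n * n))
    for j in range(n):
        a_cols[j, j::n] = 1.0
    res = linprog(c, A_eq=np.vstack([a_rows, a_cols]),
                  b_eq=np.concatenate([p, q]), bounds=(0, None), method="highs")
    return res.fun

# Toy example: 4 classes whose "severity" ground distance is |i - j|.
ground = np.abs(np.subtract.outer(np.arange(4), np.arange(4))).astype(float)
pred = np.array([0.1, 0.6, 0.2, 0.1])    # softmax output at one pixel
target = np.array([0.0, 0.0, 1.0, 0.0])  # one-hot ground truth
print("per-pixel Wasserstein loss:", discrete_wasserstein(pred, target, ground))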


2020  [PDF] arxiv.org

Segmentation analysis and the recovery of queuing parameters via the Wasserstein distance: a study of administrative data for patients with chronic obstructive …

H Wilde, V Knight, J Gillard, K Smith - arXiv preprint arXiv:2008.04295, 2020 - arxiv.org

This work uses a data-driven approach to analyse how the resource requirements of

patients with chronic obstructive pulmonary disease (COPD) may change, quantifying how

those changes impact the hospital system with which the patients interact. This approach is …

Cited by 1 Related articles All 3 versions 

<——2020——2020—3360 —

2020  [HTML] frontiersin.org

[HTML] Eeg signal reconstruction using a generative adversarial network with wasserstein distance and temporal-spatial-frequency loss

T Luo, Y Fan, L Chen, G Guo, C Zhou - Frontiers in neuroinformatics, 2020 - frontiersin.org

… only aim to minimize the temporal mean-squared-error (MSE) under generic penalties. Instead 

of using temporal MSE according to … algorithm based on generative adversarial networks 

with the Wasserstein distance (WGAN) and a temporal-spatial-frequency (TSF-MSE) loss …

Cited by 21 Related articles All 6 versions

2020   [PDF] sciencedirect.com

A novel kernel Wasserstein distance on Gaussian measures: an application of identifying dental artifacts in head and neck computed tomography

JH Oh, M Pouryahya, A Iyer, AP Apte, JO Deasy… - Computers in biology …, 2020 - Elsevier

… proposed via the Wasserstein distance. In this work, we develop a novel method to compute

the L 2 -Wasserstein distance in reproducing … In this section, we introduce the classical L 2

-Wasserstein distance between Gaussian measures and then propose a novel approach to …

Cited by 10 Related articles All 6 versions

2020  [PDF] arxiv.org

Geometric Characteristics of Wasserstein Metric on SPD (n)

Y Luo, S Zhang, Y Cao, H Sun - arXiv preprint arXiv:2012.07106, 2020 - arxiv.org

… A natural idea is to describe the geometry of SPD (n) as a Riemannian manifold endowed with

the Wasserstein metric. In this paper, by involving the fiber bundle, we obtain explicit expressions

for some locally … In this part, we will study the Wasserstein Jacobi fields on SPD (n). …

Cited by 1 Related articles All 2 versions 


2020  [HTML] nih.gov

De novo protein design for novel folds using guided conditional Wasserstein generative adversarial networks

M Karimi, S Zhu, Y Cao, Y Shen - Journal of Chemical Information …, 2020 - ACS Publications

… , we have developed novel deep generative models, namely, semisupervised gcWGAN (guided,

conditional, Wasserstein Generative … design qualities, we build our models on conditional

Wasserstein GAN (WGAN) that uses Wasserstein distance in the loss function. Our major …

Cited by 17 Related articles All 5 versions


2020  [PDF] arxiv.org

On a Novel Application of Wasserstein-Procrustes for Unsupervised Cross-Lingual Learning

G Ramírez, R Dangovski, P Nakov… - arXiv preprint arXiv …, 2020 - arxiv.org

… are, intrinsically, versions of the Wasserstein-Procrustes problem. Hence, we devise an

approach to solve Wasserstein-Procrustes in a direct way, … We believe that our rethinking of the

Wasserstein-Procrustes problem could enable further research, thus helping to develop better …

Cited by 1 Related articles All 3 versions 


2020


Novel Data-to-Text Generation Model with Transformer Planning and a Wasserstein Auto-Encoder

X Xu, T He, H Wang - 2020 IEEE International Conference on …, 2020 - ieeexplore.ieee.org

… In this paper, we propose a novel data-to-text generation model that can produce summary

text from structured data. At the same time, we … Diversity is enhanced and the duplication of

the generated texts is decreased by adding the distributed samples of the Wasserstein auto-…

Related articles All 3 versions


2020

Novel Ant Colony Shape Matching Algorithm Based on the Gromov-Wasserstein Distance

J Zhang, L Zhang, E Saucan - 2020 8th International …, 2020 - ieeexplore.ieee.org

… paper a novel ant colony shape matching algorithm based on the Gromov-Wasserstein

distance. Firstly, the Gromov-Wasserstein distance … Secondly, we make appeal to the geometric

distance optimization method to establish the Gromov-Wasserstein distance model based on …

Related articles


DPIR-Net: Direct PET image reconstruction based on the Wasserstein generative adversarial network

Z Hu, H Xue, Q Zhang, J Gao, N Zhang… - … on Radiation and …, 2020 - ieeexplore.ieee.org

… reconstruction network (DPIR-Net) using an improved Wasserstein generative adversarial … 

Second, we combine perceptual loss, mean square error, and the Wasserstein distance as … 

PET reconstruction network, and our proposed DPIR-Net method and evaluated the proposed …

Cited by 47 Related articles

 

  2020

Knowledge-Grounded Chatbot Based on Dual Wasserstein ...

https://www.mdpi.com › pdf

by S Kim · 2020 · Cited by 8 — Wasserstein Generative Adversarial Networks with ... encoded by Bi-GRU, and cj denotes the j-th utterance vector encoded by Uni-GRU.


2020

Finite-Horizon Control of Nonlinear Discrete-Time Systems with Terminal Cost of Wasserstein Distance

By: Hoshino, Kenta

Conference: 59th IEEE Conference on Decision and Control (CDC) Location: ‏ ELECTR NETWORK Date: ‏ DEC 14-18, 2020

Sponsor(s): ‏IEEE; Korea Tourism Org; MathWorks; LG Electron; Mando; Jusung Engn; Koh Young Technol; LG Chem; Inst Control Robot & Syst; Soc Ind & Appl Math; Hyundai Motor Co; LS Elect; RS Automat; SOS Lab; Elsevier; Hancom MDS

2020 59TH IEEE CONFERENCE ON DECISION AND CONTROL (CDC)  Book Series: ‏ IEEE Conference on Decision and Control   Pages: ‏ 4268-4

Cited by 3 Related articles

<——2020——2020—3370 —


 

 Method for estimating time delay distance, involves calculating time delay for each unique pair of multiple sensors by minimizing Wasserstein distance between two cumulative distribution transforms corresponding to unique pair

Patent Number: US2021281361-A1

Patent Assignee: US SEC OF NAVY

June 16, 2020
 

2020

[2003.06725] Wasserstein Distance to Independence Models

https://arxiv.org › math

by TÖ Çelik · 2020 · Cited by 5 — Abstract: An independence model for discrete random variables is a Segre-Veronese variety in a probability simplex.


Bernd Sturmfels, Wasserstein Distance to Independence Models, AlCoVE 2020

38 views

•Jun 26, 2020

2020

The Wasserstein Space | SpringerLink

https://link.springer.com › chapter

by VM Panaretos · 2020 — Abstract. The Kantorovich problem described in the previous chapter gives rise to a metric structure, the Wasserstein distance, in the space ...

2020  [PDF] thecvf.com

Barycenters of natural images constrained wasserstein barycenters for image morphing

D Simon, A Aberdam - … of the IEEE/CVF Conference on …, 2020 - openaccess.thecvf.com

Image interpolation, or image morphing, refers to a visual transition between two (or more) 

input images. For such a transition to look visually appealing, its desirable properties are (i) 

to be smooth;(ii) to apply the minimal required change in the image; and (iii) to seem" real" …

Cited by 9 Related articles All 7 versions
Barycenters of Natural Images Constrained Wasserstein ...

To obtain a smooth and straightforward transition, one may adopt the well-known Wasserstein Barycenter Problem (WBP).

YouTube · ComputerVisionFoundation Videos · 

Jul 17, 2020  

Constrained Wasserstein Barycenters for Image Morphing

www.youtube.com › watch

A short description of our CVPR 2020 paper https://arxiv.org/abs/1912.11545.

YouTube · Dror Simon · 

Apr 26, 2020


2020

The Wasserstein Proximal Gradient Algorithm - NeurIPS ...

https://proceedings.neurips.cc › paper › hash

by A SALIM · 2020 · Cited by 7 — Wasserstein gradient flows are continuous time dynamics that define curves of steepest descent to minimize an objective function over the space of ...

Cited by 14 Related arti
The Wasserstein Proximal Gradient Algorithm [NeurIPS 2020]

crossminds.ai › video › the-wasserstein-proximal-gradient...

Abstract: Wasserstein gradient flows are continuous time dynamics that define curves of steepest descent to minimize an objective function ...

CrossMind.ai · 

Nov 8, 2020


2020 

[PDF] stanford.edu

[PDF] Quantifying the Empirical Wasserstein Distance to a Set of Measures: Beating the Curse of Dimensionality.

N Si, JH Blanchet, S Ghosh, MS Squillante - NeurIPS, 2020 - stanford.edu

… Definition of the Wasserstein Distance (earth mover’s distance, optimal cost): for any measures P, Q, … How to explain the good empirical performance, e.g., Wasserstein GAN? … (the standard formula is written out below)

Cited by 8 Related articles All 3 versions
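The truncated "Definition of the Wasserstein Distance" in the slide excerpt above presumably refers to the standard formula; a minimal LaTeX sketch (the order p and the Euclidean cost are assumptions, since the excerpt is cut off):

\[
  W_p(P, Q) \;=\; \Bigl( \min_{\pi \in \Pi(P, Q)} \; \mathbb{E}_{(X, Y) \sim \pi} \, \|X - Y\|^p \Bigr)^{1/p},
\]

where Π(P, Q) is the set of couplings with marginals P and Q; for p = 1 this optimal cost is the earth mover's distance.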
 

2020

Sparse-View CT Reconstruction Using Wasserstein GANs ...

https://www.semanticscholar.org › paper › Sparse-View-C...

This work proposes a 2D computed tomography slice image reconstruction method from a limited number of projection images using Wasserstein generative ...


 

2020

Semi-supervised Data-driven Surface Wave ... - NASA/ADS

https://ui.adsabs.harvard.edu › abs › abstract

by A Cai · 2020 — The algorithm is termed Wasserstein cycle-consistent generative adversarial networks (Wcycle-GAN), which combines the architecture of cycle-consistent GAN with ...

All 3 versions

Fair Regression with Wasserstein Barycenters - NeurIPS ...

https://proceedings.neurips.cc › paper › file

by E Chzhen · 2020 · Cited by 15 — Unlike the case of fair classification, fair regression has received limited attention to date; we are only aware of few works on this topic that are supported ...

11 pages
Fair regression with Wasserstein barycenters - CrossMind.ai

crossminds.ai › video › fair-regression-with-wasserstein-b...

Fair regression with Wasserstein barycenters. Dec 06, 2020. |. arXiv link. 0. Evgenii Chzhen. Follow. Machine Learning Fairness.

CrossMind.ai · 

Dec 6, 2020



2020 see 2019

Quantile Propagation for Wasserstein-Approximate Gaussian ...

https://proceedings.neurips.cc › paper › file

by R Zhang · 2020 — Experiments on classification and Poisson regression show that QP outperforms both EP and variational Bayes. 1 Introduction. Gaussian process (GP) models have ...

13 pages

<——2020——2020—3380—


 

Quantile Propagation for Wasserstein-Approximate ... - Algorithm

https://algorithm.data61.csiro.au › 2020/10 › Qua...

by R Zhang — Quantile Propagation for Wasserstein-Approximate Gaussian Processes. Rui Zhang 1 2 Christian J. Walder 1 2 Edwin V. Bonilla 2 Marian-Andrei Rizoiu 3 2 ...


2020

Correcting nuisance variation using Wasserstein distance

https://peerj.com › articles

by G Tabak · 2020 · Cited by 5 — One motivating application is drug development: morphological cell features can be captured from images, from which similarities between ...

Correcting nuisance variation using Wasserstein distance 

By: Tabak, Gil; Fan, Minjie; Yang, Samuel; et al.

PEERJ  Volume: ‏ 8     Article Number: e8594   Published: ‏ FEB 28 2020
Cited by 6
Related articles All 7 versions


2020  see 2019  [PDF] ethz.ch

Wasserstein Weisfeiler-Lehman Graph Kernels

M Togninalli, E Ghisu… - Advances in …, 2020 - research-collection.ethz.ch

… the graph Wasserstein distance, a new distance between graphs based on their node feature 

representations, and we discuss how kernels can … that works for both categorically labelled 

and continuously attributed graphs, and we couple it with our graph Wasserstein distance; …

2020 see 2019

Wasserstein Style Transfer

Mroueh, Y

23rd International Conference on Artificial Intelligence and Statistics (AISTATS)

2020 | INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108 108 , pp.842-851

We propose Gaussian optimal transport for image style transfer in an Encoder/Decoder framework. Optimal transport for Gaussian measures has closed forms Monge mappings from source to target distributions. Moreover, interpolating between a content and a style image can be seen as geodesics in the Wasserstein Geometry. Using this insight, we show how to mix different target styles, using Wasserstein barycenter of Gaussian measures. Since Gaussians are closed under Wasserstein barycenter, this allows us a simple style transfer and style mixing and interpolation. Moreover we show how mixing different styles can be achieved using other geodesic metrics between gaussians such as the Fisher Rao metric, while the transport of the content to the new interpolate style is still performed with Gaussian OT maps. Our simple methodology allows to generate new stylized content interpolating between many artistic styles. The metric used in the interpolation results in different stylizations. A demo is available on https://wasserstein-transfer.github.io.

 Citation 36
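The closed forms this abstract relies on are the standard Gaussian optimal-transport formulas; the sketch below computes the 2-Wasserstein (Bures) distance and the affine Monge map between two Gaussians. It is an illustration only, not the authors' released code, and the toy "content"/"style" statistics are made up.

import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2(m1, c1, m2, c2):
    """Closed-form 2-Wasserstein distance between N(m1, c1) and N(m2, c2)."""
    c2_half = sqrtm(c2)
    cross = sqrtm(c2_half @ c1 @ c2_half)
    bures_sq = np.trace(c1 + c2 - 2.0 * cross)
    return np.sqrt(np.sum((m1 - m2) ** 2) + np.real(bures_sq))

def gaussian_monge_map(m1, c1, m2, c2):
    """Affine Monge map T(x) = m2 + A (x - m1) pushing N(m1, c1) onto N(m2, c2)."""
    c1_half = sqrtm(c1)
    c1_half_inv = np.linalg.inv(c1_half)
    a = np.real(c1_half_inv @ sqrtm(c1_half @ c2 @ c1_half) @ c1_half_inv)
    return lambda x: m2 + (x - m1) @ a.T

# Toy "content" and "style" feature statistics (assumed values).
m_c, c_c = np.zeros(2), np.array([[2.0, 0.3], [0.3, 1.0]])
m_s, c_s = np.ones(2), np.array([[1.0, -0.2], [-0.2, 0.5]])
print("W2(content, style):", gaussian_w2(m_c, c_c, m_s, c_s))
x = np.random.default_rng(1).normal(size=(5, 2)) @ np.linalg.cholesky(c_c).T + m_c
print(gaussian_monge_map(m_c, c_c, m_s, c_s)(x))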


2020

Wasserstein Riemannian geometry of Gamma densities

Ogouyandjou, C and Wadagni, N

2020 | INTERNATIONAL JOURNAL OF MATHEMATICS AND COMPUTER SCIENCE 15 (4) , pp.1253-1270

A Wasserstein Riemannian Gamma manifold is a space of Gamma probability density functions endowed with the Riemannian Otto metric which is related to the Wasserstein distance. In this paper, we study some geometric properties of such Riemanian manifold. In particular we compute the coefficients of alpha-connections and the sectional curvature of those manifolds.

 

2020


Multi-Band Image Synchronous Super-Resolution and Fusion ...

https://www.researching.cn › articles

by S Tian — Multi-Band Image Synchronous Super-Resolution and Fusion Method Based on Improved WGAN- ... adversarial network with gradient penalty (WGAN-GP) is proposed.

Multi-Band Image Synchronous Super-Resolution and Fusion Method Based on Improved WGAN-GP

Tian, SW; Lin, SZ; (...); Wang, LF

Oct 25 2020 | ACTA OPTICA SINICA 40 (20)

Aiming at the problem that the fused results of low resolution source images are not good for the subsequent target extraction, a multi-band image synchronous super-resolution and fusion method based on Wasserstein generative adversarial network with gradient penalty (WGAN-GP) is proposed. Firstly, the multi-band low-resolution source images are enlarged to the target size respectively based on the bicubic interpolation method. Secondly, the enlarged results are input to a feature extraction (encoding) network to extract features respectively and combine them in a high-level feature space. Then, the initial fused images are reconstructed by decoding network. Finally, a high-resolution fused image is obtained through a dynamic game between the generator and the discriminator. The experimental results show that the proposed method can not only achieve multi-band images super-resolution and fusion simultaneously, but also the information amount, clarity, and visual quality of the fused images are significantly higher than other representative methods.
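The "gradient penalty" in WGAN-GP is typically implemented as below; a minimal PyTorch sketch under the assumption of a critic network mapping images to scalar scores (the tiny critic here is a stand-in, not the paper's fusion network).

import torch

def gradient_penalty(critic, real, fake, lam=10.0):
    """WGAN-GP penalty: lam * ((||grad_x critic(x)||_2 - 1)^2) at random interpolates."""
    eps = torch.rand(real.size(0), 1, 1, 1, device=real.device)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads = torch.autograd.grad(outputs=scores.sum(), inputs=interp,
                                create_graph=True)[0]
    grad_norm = grads.flatten(1).norm(2, dim=1)
    return lam * ((grad_norm - 1.0) ** 2).mean()

# Toy usage with a stand-in critic.
critic = torch.nn.Sequential(torch.nn.Conv2d(3, 8, 3, padding=1),
                             torch.nn.AdaptiveAvgPool2d(1),
                             torch.nn.Flatten(),
                             torch.nn.Linear(8, 1))
real, fake = torch.randn(4, 3, 32, 32), torch.randn(4, 3, 32, 32)
gp = gradient_penalty(critic, real, fake)
# A full critic loss would be critic(fake).mean() - critic(real).mean() + gp.
print(gp.item())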


    

RWRM: RESIDUAL WASSERSTEIN REGULARIZATION ...

https://www.aimsciences.org › article › exportPdf

by R He · 2020 — Existing image restoration methods mostly make full use of various ... RWRM: Residual Wasserstein regularization model for image restoration.

RWRM: RESIDUAL WASSERSTEIN REGULARIZATION MODEL FOR IMAGE RESTORATION

He, RQ; Feng, XC; (...); Wei, BZ

Dec 2021 | Aug 2020 (Early Access) | INVERSE PROBLEMS AND IMAGING 15 (6) , pp.1307-1332

Existing image restoration methods mostly make full use of various image prior information. However, they rarely exploit the potential of residual histograms, especially their role as ensemble regularization constraint. In this paper, we propose a residual Wasserstein regularization model (RWRM), in which a residual histogram constraint is subtly embedded into a type of variational minimization problems. Specifically, utilizing the Wasserstein distance from the optimal transport theory, this scheme is achieved by enforcing the observed image residual histogram as close as possible to the reference residual histogram. Furthermore, the RWRM unifies the residual Wasserstein regularization and image prior regularization to improve image restoration performance. The robustness of parameter selection in the RWRM makes the proposed algorithms easier to implement. Finally, extensive experiments have confirmed that our RWRM applied to Gaussian denoising and non-blind deconvolution is effective.



Learning with minibatch Wasserstein : asymptotic and gradient properties - OpenReview

https://openreview.net › forum

by K Fatras · 2020 · Cited by 24 — Learning with minibatch Wasserstein : asymptotic and gradient properties ... and have found many applications in machine learning.

Learning with minibatch Wasserstein : asymptotic and gradient properties

Fatras, K; Zine, Y; (...); Courty, N

23rd International Conference on Artificial Intelligence and Statistics (AISTATS)

2020 | INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108 108 , pp.2131-2140

Optimal transport distances are powerful tools to compare probability distributions and have found many applications in machine learning. Yet their algorithmic complexity prevents their direct use on large scale datasets. To overcome this challenge, practitioners compute these distances on minibatches i.e. they average the outcome of several smaller optimal transport problems. We propose in this paper an analysis of this practice, which effects are not well understood so far. We notably argue that it is equivalent to an implicit regularization of the original problem, with appealing properties such as unbiased estimators, gradients and a concentration bound around the expectation, but also with defects such as loss of distance property. Along with this theoretical analysis, we also conduct empirical experiments on gradient flows, GANs or color transfer that highlight the practical interest of this strategy.
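A minimal sketch of the minibatch practice this abstract analyzes, assuming the POT (Python Optimal Transport) library for the exact per-batch solver; the point clouds and batch sizes are made up, and the returned value is the average of exact OT costs over random minibatches rather than the full-dataset cost.

import numpy as np
import ot  # POT: Python Optimal Transport

def minibatch_ot_loss(xs, xt, batch_size=64, n_batches=10, rng=None):
    """Average exact OT cost over random minibatches of two point clouds."""
    rng = rng if rng is not None else np.random.default_rng(0)
    costs = []
    for _ in range(n_batches):
        a = xs[rng.choice(len(xs), batch_size, replace=False)]
        b = xt[rng.choice(len(xt), batch_size, replace=False)]
        m = ot.dist(a, b)                      # squared Euclidean ground cost
        w = np.full(batch_size, 1.0 / batch_size)
        costs.append(ot.emd2(w, w, m))         # exact OT cost on the minibatch
    return float(np.mean(costs))

rng = np.random.default_rng(1)
source = rng.normal(size=(1000, 2))
target = rng.normal(loc=2.0, size=(1000, 2))
print(minibatch_ot_loss(source, target, rng=rng))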

   


Stereoscopic image reflection removal based on Wasserstein Generative Adversarial Network

X Wang, Y Pan, DPK Lun - 2020 IEEE International …, 2020 - ieeexplore.ieee.org

Reflection removal is a long-standing problem in computer vision. In this paper, we consider

the reflection removal problem for stereoscopic images. By exploiting the depth information

of stereoscopic images, a new background edge estimation algorithm based on the

Wasserstein Generative Adversarial Network (WGAN) is proposed to distinguish the edges

of the background image from the reflection. The background edges are then used to

reconstruct the background image. We compare the proposed approach with the state-of-the …

Cited by 1 Related articles All 3 versions

Stereoscopic image reflection removal based on Wasserstein Generative Adversarial Network

Wang, XY; Pan, YK and Lun, DPK

IEEE International Conference on Visual Communications and Image Processing (VCIP)

2020 | 2020 IEEE INTERNATIONAL CONFERENCE ON VISUAL COMMUNICATIONS AND IMAGE PROCESSING (VCIP) , pp.148-151

Reflection removal is a long-standing problem in computer vision. In this paper, we consider the reflection removal problem for stereoscopic images. By exploiting the depth information of stereoscopic images, a new background edge estimation algorithm based on the Wasserstein Generative Adversarial Network (WGAN) is proposed to distinguish the edges of the background image from the reflection. The background edges are then used to reconstruct the background image. We compare the proposed approach with the state-of-the-art reflection removal methods. Results show that the proposed approach can outperform the traditional single-image based methods and is comparable to the multiple-image based approach while having a much simpler imaging hardware requirement.

Numeric Data Augmentation using Structural Constraint ...

https://ieeexplore.ieee.org › document

by W Wang · 2020 — Numeric Data Augmentation using Structural Constraint Wasserstein Generative Adversarial Networks. Abstract: Some recent studies have suggested using GANs ...

INSPEC Accession Number: 20727647

DOI: 10.1109/ISCAS45731.2020.9181232

Date Added to IEEE Xplore: 28 September 2020


Numeric Data Augmentation Using Structural Constraint Wasserstein Generative Adversarial Networks

Wang, W; Wang, C; (...); Li, Y

IEEE International Symposium on Circuits and Systems (ISCAS)

2020 | 2020 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS)

Some recent studies have suggested using GANs for numeric data generation such as to generate data for completing the imbalanced numeric data. Considering the significant difference between the dimensions of the numeric data and images, as well as the strong correlations between features of numeric data, the conventional GANs normally face an overfitting problem, consequently leads to an ill-conditioning problem in generating numeric and structured data. This paper studies the constrained network structures between generator G and discriminator D in WGAN, designs several structures including isomorphic, mirror and self-symmetric structures. We evaluates the performances of the constrained WGANs in data augmentations, taking the non-constrained GANs and WGANs as the baselines. Experiments prove the constrained structures have been improved in 16/20 groups of experiments. In twenty experiments on four UCI Machine Learning Repository datasets, Australian Credit Approval data, German Credit data, Pima Indians Diabetes data and SPECT heart data facing five conventional classifiers. Especially, Isomorphic WGAN is the best in 15/20 experiments. Finally, we theoretically proves that the effectiveness of constrained structures by the directed graphic model (DGM) analysis.

 <——2020——2020—3390 —



2020  [PDF] jsjkx.com

[PDF] 基于深度森林与 CWGAN-GP 的移动应用网络行为分类与评估

[Chinese Classification and Evaluation of Mobile Application Network Behavior Based on Deep Forest and CWGAN-GP]

蒋鹏飞, 魏松杰 - 计算机科学 (Computer Science), 2020 - jsjkx.com

Addressing the huge number of mobile applications, their complex functionality, and the variety of malicious applications mixed among them, this work analyzes application network behavior on the Android platform, designs reasonable network-behavior trigger events for different categories of applications to simulate network interaction behavior, proposes network event behavior sequences, and uses an improved deep forest model to classify and identify applications, …

Cited by 1 Related articles All 2 versions

[CITATION] AEWGAN 이용한 고차원 불균형 데이터 이상 탐지

[Korean Anomaly detection for high-dimensional imbalanced data using AEWGAN]

송승환, 백준걸 - 대한산업공학회 추계학술대회 논문집 (Proceedings of the Korean Institute of Industrial Engineers Fall Conference), 2019 - dbpia.co.kr

… Architecture of AEWGAN - the latent space represents the data using combinations of attributes that capture its variability - after oversampling with WGAN, when abnormal data are fed into the initially trained model they are expressed by a different combination of attributes than before and are therefore judged to be outliers …

Related articles


2020

Hyperbolic Wasserstein Distance for Shape ... - EurekaMag.com

https://eurekamag.com › research

This work studies the Wasserstein space and proposes a novel framework to compute the Wasserstein distance between general topological surfaces by integrating ...


 

2020  see 2019  [PDF] psu.edu

Aggregated Wasserstein Distance for Hidden Markov Models and Automated Morphological Characterization of Placenta from Photos

Y Chen - 2020 - search.proquest.com

… Within that big picture, the first part of this thesis proposed a new fundamental tool – 

aggregated Wasserstein distances for hidden Markov … between components is the Wasserstein 

metric for Gaussian distributions. The solution of such optimization is a fast approximation to the …

Related articles All 2 versions

2020 

基于Wasserstein距离的多电压暂降事件同源检测方法- CNKI

[Chinese Homology detection method for multiple voltage sag events based on the Wasserstein distance - CNKI]

https://global.cnki.net › detail › detail

Nov 10, 2020 — Multiple Voltage Sag Events Homology Detection Based on Wasserstein ... and presents the MSHD method based on the Wasserstein distance.

Multiple Voltage Sag Events Homology Detection Based on Wasserstein Distance

[CITATION] Multi-voltage sag event detection method based on Wasserstein distance [J]

XY Xiao, LY Gui, CX Li - Power System Technology, 2020

Cited by 3



2020  [PDF] arxiv.org

Wasserstein distributionally robust inverse multiobjective optimization

C Dong, B Zeng - arXiv preprint arXiv:2009.14552, 2020 - arxiv.org

… • We present a novel Wasserstein distributionally robust framework for constructing inverse

multiobjective optimization estimator. We use the prominent Wasserstein metric to construct

the uncertainty set centering at the empirical distribution of observed decisions. • We show …

 Cited by 3 Related articles All 5 versions 


2020


2020  [PDF] neurips.cc

Quantile Propagation for Wasserstein-Approximate Gaussian Processes

R Zhang, C Walder, EV Bonilla… - Advances in Neural …, 2020 - proceedings.neurips.cc

… Our method—dubbed Quantile Propagation (QP)—is similar to expectation propagation (EP)

but minimizes the L2 Wasserstein distance (… We show that QP matches quantile functions

rather than moments as in EP and has the same mean update but a smaller variance update …

Related articles All 8 versions 


2020  [PDF] researchgate.net

[PDF] Image hashing by minimizing independent relaxed wasserstein distance

KD Doan, A Kimiyaie, S Manchanda… - arXiv preprint arXiv …, 2020 - researchgate.net

… estimate, SWD needs a very large number of random directions in order to accurately

estimate the Wasserstein distance. In this … minimizing a variant of the Wasserstein distance.

Our proposed method does not employ any discriminator/critic and can estimate the Wasserstein …

Cited by 2 Related articles 



2020  [PDF] arxiv.org

Image Hashing by Minimizing Discrete Component-wise Wasserstein Distance

KD Doan, S Manchanda, S Badirli… - arXiv preprint arXiv …, 2020 - arxiv.org

… distance by averaging the one-dimensional Wasserstein distances of the data points …

Wasserstein distance along this direction [7]. In this paper, we address the limitations of these

GAN-based approaches by robustly and efficiently minimizing a novel variant of the Wasserstein …

Cited by 1 Related articles All 2 versions 



2020  [PDF] theses.fr

Régression quantile extrême: une approche par couplage et distance de Wasserstein.

[French Extreme quantile regression: an approach via coupling and the Wasserstein distance.]

B Bobbia - 2020 - theses.fr

This work concerns the estimation of conditional extreme quantiles, more precisely the estimation of quantiles of a real-valued distribution as a function of a high-dimensional covariate. To perform such an estimation, we present a model, called the model of …

Related articles All 9 versions 



2020  [PDF] arxiv.org

Improved image wasserstein attacks and defenses

JE HuA SwaminathanH SalmanG Yang - arXiv preprint arXiv …, 2020 - arxiv.org

… (2019), the perturbed image often uses less than 50% of the Wasserstein budget, and

thus rarely goes outside of the Wasserstein ball despite this unsafe clamping. However, as

we derive stronger attacks that project points onto the boundary of the Wasserstein ball, ad-hoc …

Cited by 6 Related articles All 4 versions 

<——2020——2020—3400 —



2020  [PDF] mlr.press

Stronger and faster Wasserstein adversarial attacks

K Wu, A Wang, Y Yu - International Conference on Machine …, 2020 - proceedings.mlr.press

… To generate stronger Wasserstein adversarial attacks, we introduce two faster and more

accurate algorithms for Wasserstein constrained optimization. Each algorithm has its own

advantage thus they complement each other nicely: PGD with dual projection employs exact …

Cited by 6 Related articles All 8 versions 



2020  [PDF] arxiv.org

Regularization helps with mitigating poisoning attacks: Distributionally-robust machine learning using the wasserstein distance

F Farokhi - arXiv preprint arXiv:2001.10655, 2020 - arxiv.org

… using the adversariallymanipulated training data) defined using the Wasserstein distance. The

Wasserstein distance can be seen as the … work on poisoning attacks in adversarial machine

learning in Section 2. Background information on the Wasserstein distance is presented …

Cited by 7 Related articles All 3 versions 


2020  [PDF] arxiv.org

Virtual persistence diagrams, signed measures, and Wasserstein distance

P Bubenik, A Elchesen - arXiv preprint arXiv:2012.10514, 2020 - arxiv.org

… the Wasserstein distance for persistence diagrams and the classical Wasserstein distance from

optimal transport theory. Following this work, we define compatible Wasserstein … Furthermore,

we show that the 1-Wasserstein distance extends to virtual persistence diagrams and to …

Cited by 4 Related articles All 2 versions 

[CITATION] Virtual persistence diagrams, signed measures, Wasserstein distances, and Banach spaces

P Bubenik, A Elchesen - arXiv preprint arXiv:2012.10514, 2020


2020  Semi-supervised Surface Wave Tomography with Wasserstein Cycle-consistent GAN: Method and Application on Southern California Plate Boundary Region

A Cai, H Qiu, F Niu - 2020 - essoar.org

… algorithm is applied to shear wave velocity (Vs) inversion in surface wave tomography, where

… The algorithm is termed Wasserstein cycle-consistent generative adversarial networks (…

separate data generating network, while Wasserstein metric provides improved training stability …



 2020

Wasserstein distance computed between the input and output ...

https://www.researchgate.net › figure › Wasserstein-distan...

This paper describes a deep latent variable model of speech power spectrograms and its application to semi-supervised speech enhancement with a deep speech ...

 2020


Loss Functions | Generative Adversarial Networks - Google ...

https://developers.google.com › machine-learning › gan

Feb 10, 2020 — Wasserstein loss: The default loss function for TF-GAN Estimators. ... G(z) is the generator's output when given noise z.
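To complement that one-line description, a minimal sketch of the Wasserstein losses themselves in the same notation (D denotes the critic, G the generator, z the noise; D is an assumption here, since the snippet only names G(z)). This is the generic WGAN formulation written as a hypothetical snippet, not code copied from TF-GAN.

import torch

def critic_loss(d_real, d_fake):
    # The critic maximizes D(x) - D(G(z)), i.e. minimizes the negative.
    return d_fake.mean() - d_real.mean()

def generator_loss(d_fake):
    # The generator maximizes D(G(z)), i.e. minimizes -D(G(z)).
    return -d_fake.mean()

# d_real = D(x) on real data, d_fake = D(G(z)) on generated data.
d_real, d_fake = torch.randn(64), torch.randn(64)
print(critic_loss(d_real, d_fake).item(), generator_loss(d_fake).item())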


2020

Wasserstein Exponential Kernels | IEEE Conference Publication

https://ieeexplore.ieee.org › document

by H De Plaen · 2020 · Cited by 5 — ... interest for supervised learning problems involving shapes and images. Empirically, Wasserstein squared exponential kernels are shown to yield smaller ...

Date of Conference: 19-24 July 2020

Cited by 5 Related articles All 5 versions
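A minimal sketch of the kind of kernel this entry describes, restricted to one-dimensional empirical samples where SciPy provides the Wasserstein distance directly; the squared-exponential form and the bandwidth are assumptions rather than details taken from the paper, and such kernels are not guaranteed to be positive definite in general.

import numpy as np
from scipy.stats import wasserstein_distance

def wasserstein_exp_kernel(samples, sigma=1.0):
    """Gram matrix K[i, j] = exp(-W_1(x_i, x_j)^2 / (2 sigma^2)) for 1-D samples."""
    n = len(samples)
    k = np.zeros((n, n))
    for i in range(n):
        for j in range(i, n):
            w = wasserstein_distance(samples[i], samples[j])
            k[i, j] = k[j, i] = np.exp(-w ** 2 / (2.0 * sigma ** 2))
    return k

rng = np.random.default_rng(0)
data = [rng.normal(loc=m, size=200) for m in (0.0, 0.5, 3.0)]
print(wasserstein_exp_kernel(data))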

2020 online

Trajectories from Distribution-Valued Functional Curves: A Unified Wasserstein Framework

by Sharma, Anuja; Gerig, Guido

Medical Image Computing and Computer Assisted Intervention – MICCAI 2020, 09/2020

.... This is achieved under the unifying scheme of Wasserstein distance metric. The regression problem is formulated as a constrained optimization problem and solved using an alternating projection algorithm...

Book ChapterFull Text Online

Cited by 1 Related articles All 7 versions


2020

Multivariate Wasserstein metric for $n$-dimensions - Cross ...

https://stats.stackexchange.com › questions › multivariat...

Oct 1, 2020 — In the case where the two vectors a and b are of unequal length, it appears that this function interpolates, inserting values within each vector, which are ...



2020

Wasserstein Dictionary Learning: Optimal Transport-Based ...

https://epubs.siam.org › doi

by MA Schmitz · 2018 · Cited by 98 — (2022) Sketched Stochastic Dictionary Learning for large‐scale data and application ... 2021 IEEE/CVF International Conference on Computer Vision Workshops ...

<——2020——2020—3410 —


2020

Joint Transfer of Model Knowledge and Fairness Over ... - DOI

https://doi.org › ACCESS.2020.3005987

Jun 30, 2020 — This is done by controlling the Wasserstein distances between relevant distributions. ... Published in: IEEE Access ( Volume: 8 ).

DOI: 10.1109/ACCESS.2020.3005987


 2020 patent

Wasserstein distance-based image rapid enhancement method

CN CN111476721A 丰江帆 重庆邮电大学

Priority 2020-03-10 • Filed 2020-03-10 • Published 2020-07-31

4. The Wasserstein distance-based image rapid enhancement method according to claim 3, characterized in that: in step S21, the motion-blurred image has 256 features, including texture features, color features, and edge features. 5. The Wasserstein distance-based image rapid enhancement method …  

  

Building energy consumption prediction method and monitoring prediction system based on WGAN algorithm

CN CN111178626A 傅启明 苏州科技大学

Priority 2019-12-30 • Filed 2019-12-30 • Published 2020-05-19

8. the WGAN algorithm-based building energy consumption prediction method according to claim 1, wherein after completing one GAN prediction model training in step S400, the reinforcement learning algorithm is used to optimize the hyperparameters in GAN, LSTM, and CNN, find and update the optimal …


v

2020  Research article
A novel kernel Wasserstein distance on Gaussian measures: An application of identifying dental artifacts in head and neck computed tomography
Computers in Biology and Medicine, 26 March 2020...

Jung Hun Oh, Maryam Pouryahya, Allen Tannenbaum

Cited by 12 Related articles All 5 versions
 
.  

2020  Research article
Chapter 4: Lagrangian schemes for Wasserstein gradient flows
Handbook of Numerical Analysis, 11 November 2020...

Jose A. Carrillo, Daniel Matthes, Marie-Therese Wolfram

 

2020


2020  Short communication
Convergence rate to equilibrium in Wasserstein distance for reflected jump–diffusions
Statistics & Probability Letters, 27 June 2020...

Andrey Sarantsev

.Cited by 1 Related articles All 6 versions

2020  Research article
Tensor product and Hadamard product for the Wasserstein means
Linear Algebra and its Applications, 3 July 2020...

Jinmi Hwang, Sejong K..

Related articles All 5 versions

2020 see 2019   Research article
Wasserstein convergence rates for random bit approximations of continuous Markov processes

Journal of Mathematical Analysis and Applications, 28 August 2020...

Stefan Ankirchner, Thomas Kruse, Mikhail Urusov

 

  .  

2020  Short communication
Tractable reformulations of two-stage distributionally robust linear programs over the type-∞ Wasserstein ball
Operations Research Letters, 23 June 2020...

Weijun Xie

Cited by 19 Related articles All 4 versions

2020 see 2019    Research article
Considering anatomical prior information for low-dose CT image enhancement using attribute-augmented Wasserstein generative adversarial networks
Neurocomputing, 6 November 2020...

Zhenxing Huang, Xinfeng Liu, Zhanli Hu

<——2020——2020—3420 —


2020  Research article
The quadratic Wasserstein metric for earthquake location
Journal of Computational Physics, 9 July 2018...

Jing Chen, Yifan Chen, Dinghui Yang


Peer-reviewed
Characterization of probability distribution convergence in Wasserstein distance by L^p-quantization error function
Show more
Authors: Y. Liu, G. Pagès
Article, 2020
Publication:Bernoulli, 26, 2020, 1171
Publisher:2020

Scholarly Journal  Citation/Abstract

Person-independent facial expression recognition method based on improved Wasserstein generative adversarial networks in combination with identity aware

Xu Caie; Cui, Yang; Zhang, Yunhui; Gao, Peng; Xu, Jiayi.Multimedia Systems; Heidelberg Vol. 26, Iss. 1,  (2020): 53-61.

Abstract/Details   References (‎27)

Show Abstract 

Cited by 11 Related articles All 3 versions


Scholarly Journal  Citation/Abstract

Image Recognition with WGAN

Hu, Long-Hui; Wang, Chao-Li; Sun, Zhan-Quan; Yang, Ai-Jun.Kongzhi Gongcheng = Control Engineering of China; Shenyang Iss. 12,  (2020): 2168.

Abstract/Details  Show Abstract 


Scholarly Journal  Full Text

Solutions of a Class of Degenerate Kinetic Equations Using Steepest Descent in Wasserstein Space

Aboubacar Marcos; Ambroise Soglo.Journal of Mathematics; Cairo Vol. 2020,  (2020).

Abstract/DetailsFull textFull text - PDF (6 MB)‎

Show Abstract 

Cited by 3 Related articles All 7 versions

Conference Paper  Citation/Abstract

Spatial-aware Network using Wasserstein Distance for Unsupervised Domain Adaptation

Bin, Luo; Jiang, Fan.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2020).

Abstract/Details   Show Abstract 


Conference Paper  Citation/Abstract

Spoken Keyword Detection Based on Wasserstein Generative Adversarial Network

Zhao, Wen; Kun, She; Hao, Chen.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2020).

Abstract/Details   Show Abstract 

Related articles All 2 versions

 

Conference Paper  Citation/Abstract

S2A: Wasserstein GAN with Spatio-Spectral Laplacian Attention for Multi-Spectral Band Synthesis

Rout, Litu; Misra, Indranil; Moorthi, S Manthira; Dhar, Debajyoti.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2020).

Abstract/Details Get full textLink to external site, this linkwill open in a new window

Show Abstract 

Cited by 6 Related articles All 11 versions  

   

Conference Paper  Citation/Abstract

Learning Wasserstein Isometric Embedding for Point Clouds

Koide, Satoshi; Kutsuna, Takuro.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2020).

Abstract/Details   Show Abstract 

Cited by 1 Related articles All 2 versions

<——2020——2020—3430 —


Conference Paper  Citation/Abstract

Correlated Wasserstein Autoencoder for Implicit Data Recommendation

Zhong, Jingbin; Zhang, Xiaofeng; Luo, Linhao.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2020).

Abstract/Details   Show Abstract 


 

Conference Paper  Citation/Abstract

Semantics-assisted Wasserstein Learning for Topic and Word Embeddings

Li, Changchun; Ouyang, Jihong; Wang, Yiming.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2020).

Cited by 3 Related articles All 2 versions


Conference Paper  Citation/Abstract

A Novel Ant Colony Shape Matching Algorithm Based on the Gromov-Wasserstein Distance
Zhang, Lu; Saucan, Emil.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2020).
Abstract/Details   Show Abstract 
Related articles
All 2 versions


Scholarly Journal  Citation/Abstract

Network Intrusion Detection Based on Conditional Wasserstein Generative Adversarial Network and Cost-Sensitive Stacked Autoencoder

Zhang, Guoling; Wang, Xiaodan; Li, Rui; Song, Yafei; He, Jiaxing; et al.IEEE Access; Piscataway Vol. 8,  (2020): 190431-190447.

Abstract/Details Get full text  Link to external site

Show Abstract 

Free Full Text from Publisher

5 Citations  46 References  Related records 

2020 see 2021  Scholarly Journal  Citation/Abstract

Wasserstein Index Generation Model: Automatic generation of time-series index with application to Economic Policy Uncertainty

Xie, Fangzhou.Economics Letters; Amsterdam Vol. 186,  (Jan 2020): 1.

Abstract/Details Get full text  Link to external site

Show Abstract 


2020


Scholarly Journal  Citation/Abstract

Multiple Voltage Sag Events Homology Detection Based on Wasserstein Distance

Xiao, Xianyong; Gui, Liangyu; Li, Chengxin; Zhang, Huaying; Li, Hongxin; et al.Dianwang Jishu = Power System Technology; Beijing Iss. 12,  (2020): 4684.


2020

Intelligent Optical Communication Based on Wasserstein Generative Adversarial Network

D Mu, W Meng, S Zhao, X Wang, W Liu - Chinese Journal of Lasers - researching.cn

Compared with the KL and JS divergences used by the original GAN, the advantage of the Wasserstein distance is that even when two distributions do not overlap, the Wasserstein distance

can still reflect how far apart they are. The KL divergence [fails to give a useful] gradient, but the Wasserstein distance can provide a meaningful gradient.

Building on this advantage of the Wasserstein distance, WGAN defines it as the generator's loss function; the Wasserstein distance …

 All 2 versions View as HTML 
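
The snippet above summarizes why WGAN swaps the JS-divergence objective for the Wasserstein distance. As a hedged, minimal sketch (not the cited paper's code), the critic and generator losses of a WGAN can be written as follows; "critic" stands for any 1-Lipschitz scoring network, and the names are illustrative.

```python
# Minimal sketch of the WGAN objectives (illustrative only, not the paper's code).
import torch

def critic_loss(scores_real, scores_fake):
    # The critic maximizes E[f(real)] - E[f(fake)]; return the quantity to minimize.
    return scores_fake.mean() - scores_real.mean()

def generator_loss(scores_fake):
    # The generator tries to raise the critic's score on generated samples.
    return -scores_fake.mean()

# toy check with random critic scores
print(critic_loss(torch.randn(8), torch.randn(8)).item())
print(generator_loss(torch.randn(8)).item())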


2020  [PDF] arxiv.org

Application of an unbalanced optimal transport distance and a mixed L1/Wasserstein distance to full waveform inversion

D Li, MP Lamoureux, W Liao - arXiv preprint arXiv:2004.05237, 2020 - arxiv.org

… distance and 1-Wasserstein distance in [22] and [39]. Compared to the 2-Wasserstein 

trace-by-trace strategy in [37], the UOT distance and mixed L1/Wasserstein distance provide more …

 Related articles All 2 versions


 

2020  [PDF] researchgate.net

[PDF] Potential Analysis of Wasserstein GAN as an Anomaly Detection Method for Industrial Images

A Misik - researchgate.net

… Abstract—The task of detecting anomalies in images is a crucial part of current industrial 

to use Wasserstein GAN (WGAN) as a method for anomaly detection in industrial images is …

 Related articles

[CITATION] Potential Analysis of Wasserstein GAN as an Anomaly Detection Method for Industrial Images


2020

Revisiting Fixed Support Wasserstein Barycenter - DeepAI

https://deepai.org › publication › revisiting-fixed-support-...

https://deepai.org › publication › revisiting-fixed-support-...

Feb 12, 2020 — 02/12/20 - We study the fixed-support Wasserstein barycenter problem (FS-WBP), which consists in computing the Wasserstein barycenter of m ...

<——2020——2020—3440 —


[PDF] arxiv.org

Nonlinear model reduction on metric spaces. Application to one-dimensional conservative PDEs in Wasserstein spaces

V EhrlacherD LombardiO Mula… - … and Numerical Analysis, 2020 - esaim-m2an.org

… metric spaces, and more specifically on the L2-Wasserstein … model reduction of conservative

PDEs in metric spaces. For two … for model reduction in general metric spaces. We also make …

 Cited by 17 Related articles All 25 versions


MR4377772 Prelim Gupta, Neena; Rao, Dhvanita R.; Kolte, Sagar; A survey on the non-injectivity of the vaserstein symbol in dimension three. Leavitt path algebras and classical K-theory, 193–202, Indian Stat. Inst. Ser., Springer, Singapore, [2020], ©2020. 19G12

Review PDF Clipboard Series Chapter



2020 see 2019  [HTML] springer.com

[HTML] Adapted Wasserstein distances and stability in mathematical finance

J Backhoff-VeraguasD BartlM Beiglböck… - Finance and …, 2020 - Springer

… Wasserstein distance (say), the answer is No. Models which are similar with respect to the

Wasserstein distance … adapted version of the Wasserstein distance which takes the temporal …

 Cited by 35 Related articles All 20 versions

 

Improving Perceptual Quality by Phone-Fortified Perceptual Loss using Wasserstein Distance for Speech Enhancement

TA HsiehC YuSW FuX LuY Tsao - arXiv preprint arXiv:2010.15174, 2020 - arxiv.org

… the PFPL, which is a perceptual loss incorporated with Wasserstein distance in detail. … to

replace the Lp distance and use the Wasserstein distance as the distance measure to compute …

 Cited by 15 Related articles All 4 versions 

Illumination-invariant flotation froth color measuring via Wasserstein distance-based CycleGAN with structure-preserving constraint

J Liu, J He, Y Xie, W Gui, Z Tang, T Ma… - IEEE Transactions …, 2020 - ieeexplore.ieee.org

… Wasserstein distance-based structure-preserving CycleGAN, called WDSPCGAN.

WDSPCGAN is comprised of two generative adversarial networks (GANs), which have their own …

 Cited by 27 Related articles All 3 versions


 2020

Multi-view Wasserstein discriminant analysis with entropic regularized Wasserstein distance

H Kasai - ICASSP 2020-2020 IEEE International Conference …, 2020 - ieeexplore.ieee.org

… To evaluate this discrepancy, this paper presents a proposal of a multi-view Wasserstein

discriminant analysis, designated as MvWDA, which exploits a recently developed optimal …

 Cited by 8 Related articles

On the Computation of Kantorovich--Wasserstein Distances Between Two-Dimensional Histograms by Uncapacitated Minimum Cost Flows

F Bassetti, S GualandiM Veneroni - SIAM Journal on Optimization, 2020 - SIAM

… In section 2 we review the Kantorovich distance (ie, the Wasserstein distance of order 1)

in the discrete setting, and we show its connection with linear programming (subsection 2.2) …

 Cited by 10 Related articles All 2 versions
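
The entry above computes order-1 Kantorovich–Wasserstein distances between two-dimensional histograms with minimum-cost-flow LPs. As a hedged point of reference only (not the paper's method), the one-dimensional order-1 distance needs no flow solver: it is the area between the cumulative histograms. The function name below is illustrative; SciPy's sample-based routine is used as a cross-check.

```python
import numpy as np
from scipy.stats import wasserstein_distance  # SciPy's 1-D W1 between samples

def w1_histograms_1d(p, q, bin_centers):
    """Order-1 Wasserstein distance between two 1-D histograms on shared bins."""
    p = np.asarray(p, float) / np.sum(p)
    q = np.asarray(q, float) / np.sum(q)
    cdf_gap = np.cumsum(p - q)                       # difference of cumulative histograms
    widths = np.diff(np.asarray(bin_centers, float))
    return float(np.sum(np.abs(cdf_gap[:-1]) * widths))

centers = [0.0, 1.0, 2.0, 3.0]
print(w1_histograms_1d([0.5, 0.5, 0.0, 0.0], [0.0, 0.0, 0.5, 0.5], centers))  # 2.0
print(wasserstein_distance([0.0, 1.0], [2.0, 3.0]))                            # 2.0
```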


[PDF] arxiv.org

Virtual persistence diagrams, signed measures, Wasserstein distances, and Banach spaces

P Bubenik, A Elchesen - arXiv preprint arXiv:2012.10514, 2020 - arxiv.org

Wasserstein distance from optimal transport theory. Following this work, we define compatible 

Wasserstein distances for … We show that the 1-Wasserstein distance extends to virtual …

 Cited by 2 Related articles


[PDF] ieee.org

Robust multivehicle tracking with wasserstein association metric in surveillance videos

Y Zeng, X Fu, L Gao, J Zhu, H Li, Y Li - IEEE Access, 2020 - ieeexplore.ieee.org

… , we propose a robust multivehicle tracking with Wasserstein association metric (MTWAM) 

method. In MTWAM, we analyze the advantage of the 1-Wasserstein distance (WD-1) on …

  Cited by 9 Related articles All 2 versions


[PDF] neurips.cc

Fair regression with wasserstein barycenters

E Chzhen, C Denis, M Hebiri… - Advances in Neural …, 2020 - proceedings.neurips.cc

… Specifically, we show that the distribution of this optimum is the Wasserstein barycenter of

the distributions induced by the standard regression function on the sensitive groups. This …

Cited by 20 Related articles All 6 versions 

<——2020——2020—3450 —


[PDF] thecvf.com

Barycenters of natural images constrained wasserstein barycenters for image morphing

D SimonA Aberdam - … of the IEEE/CVF Conference on …, 2020 - openaccess.thecvf.com

… Wasserstein barycenters have been used for various applications in image processing and

… of the Wasserstein barycenter problem, and use it to obtain a natural-looking barycenter of …

Cited by 9 Related articles All 8 versions 


[HTML] nih.gov

Multimarginal Wasserstein barycenter for stain normalization and augmentation

S NadeemT HollmannA Tannenbaum - International Conference on …, 2020 - Springer

… on the multimarginal Wasserstein barycenter to normalize and augment H&E stained

images given one or more references. Specifically, we provide a mathematically robust way of …

Cited by 7 Related articles All 8 versions


[PDF] mlr.press

Fast algorithms for computational optimal transport and wasserstein barycenter

W GuoN HoM Jordan - International Conference on …, 2020 - proceedings.mlr.press

… Wasserstein barycenters for multiple probability distributions. … algorithms for computing

Wasserstein barycenters. Another … to approximate the Wasserstein barycenters. It remains as an …

Cited by 13 Related articles All 6 versions 


[PDF] arxiv.org

Wasserstein distributionally robust stochastic control: A data-driven approach

I Yang - IEEE Transactions on Automatic Control, 2020 - ieeexplore.ieee.org

… the distributionally robust stochastic control problem with Wasserstein ambiguity sets. … of

the distributionally robust policy. In Section V, we present the Wasserstein penalty problem and …

Cited by 42 Related articles All 3 versions


[PDF] arxiv.org

Finite-sample guarantees for Wasserstein distributionally robust optimization: Breaking the curse of dimensionality

R Gao - arXiv preprint arXiv:2009.04382, 2020 - arxiv.org

… for Wasserstein robust … generic Wasserstein DRO problems without suffering from the curse

of dimensionality. Our results highlight the bias-variation trade-off intrinsic in the Wasserstein …

Cited by 16 Related articles All 3 versions 


 2020

[PDF] arxiv.org

Stochastic approximation versus sample average approximation for population wasserstein barycenters

D Dvinskikh - arXiv preprint arXiv:2001.07697, 2020 - arxiv.org

… We show that for the Wasserstein barycenter problem this … implementations calculating

barycenters defined with respect to … confidence intervals for the barycenter defined with respect to …

Cited by 4 Related articles All 3 versions 


A linear programming approximation of distributionally robust chance-constrained dispatch with Wasserstein distance

A ZhouM Yang, M Wang… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org

… This paper studies a distributionally robust chance constrained real-time dispatch problem

where the adjustable Wasserstein-distance-based ambiguity set is embedded into the …

 Cited by 18 Related articles All 2 versions


[PDF] openreview.net

Functional Priors for Bayesian Neural Networks through Wasserstein Distance Minimization to Gaussian Processes

BH Tran, D Milios, S Rossi… - Third Symposium on …, 2020 - openreview.net

… Nonetheless, this kind of generative priors on the functions is very different from shallow … 

We consider the Wasserstein distance between the distribution of functions induced by bnn …

Cited by 1 Related articles All 2 versions

[PDF] arxiv.org

Virtual persistence diagrams, signed measures, Wasserstein distances, and Banach spaces

P Bubenik, A Elchesen - arXiv preprint arXiv:2012.10514, 2020 - arxiv.org

… Wasserstein distance from optimal transport theory. Following this work, we define compatible

Wasserstein … We show that the 1-Wasserstein distance extends to virtual persistence …

 Cited by 2 Related articles 


2020 see 2019  [PDF] psu.edu

Aggregated Wasserstein Distance for Hidden Markov Models and Automated Morphological Characterization of Placenta from Photos

Y Chen - 2020 - search.proquest.com

… fundamental tool – aggregated Wasserstein distances for hidden Markov models (HMMs)

with … where the cost between components is the Wasserstein metric for Gaussian distributions. …

Related articles All 2 versions

<——2020——2020—3460 —


2020 see 2019 [PDF] arxiv.org

Bures-Wasserstein Geometry

J van Oostrum - arXiv preprint arXiv:2001.08056, 2020 - arxiv.org

… densities, where it is called the Wasserstein distance. In quantum information theory, this is

a distance measure between quantum states or density matrices, called the Bures distance. …

Cited by 4 Related articles All 2 versions 

 [PDF] neurips.cc

Ratio trace formulation of wasserstein discriminant analysis

H LiuY CaiYL Chen, P Li - Advances in Neural …, 2020 - proceedings.neurips.cc

… reduction technique that generalizes the classical Fisher Discriminant Analysis (FDA) [16] …

’s regularized Wasserstein distances to the intra-class’s regularized Wasserstein distances. …

 Cited by 1 Related articles All 4 versions 


[PDF] openreview.net

Wasserstein Distributional Normalization: Nonparametric Stochastic Modeling for Handling Noisy Labels

SW Park, J Kwon - 2020 - openreview.net

… Gaussian measures by imposing geometric constraints in the 2-Wasserstein space. We

simulated discrete SDE using the Euler-Maruyama scheme, which makes our method fast, …

 Related articles 


[PDF] arxiv.org

Consistency of distributionally robust risk-and chance-constrained optimization under Wasserstein ambiguity sets

A CherukuriAR Hota - IEEE Control Systems Letters, 2020 - ieeexplore.ieee.org

… We consider ambiguity sets defined by the Wasserstein metric and the empirical distribution

… markov decision processes with Wasserstein distance,” IEEE Control Syst. Lett., vol. 1, no. …

 Cited by 4 Related articles All 6 versions


[PDF] harchaoui.org

[PDF] Wasserstein Clustering

W HARCHAOUI - 2020 - harchaoui.org

… minimizes the Wasserstein distance … the Wasserstein distances between the associated

groups. These two mechanisms are compatible with model selection according to a Wasserstein …

 Related articles 


2020


[PDF] ieee.org

Distributionally robust optimal reactive power dispatch with wasserstein distance in active distribution network

J Liu, Y Chen, C Duan, J Lin… - Journal of Modern Power …, 2020 - ieeexplore.ieee.org

… new distributionally robust chance-constraint (DRCC) ORPD model with Wasserstein distance

… 2) We firstly apply Wasserstein distance to the DRCCbased ORPD model to construct the …

 Cited by 17 Related articles All 5 versions


Data-driven distributionally robust unit commitment with Wasserstein metric: Tractable formulation and efficient solution method

X ZhengH Chen - IEEE Transactions on Power Systems, 2020 - ieeexplore.ieee.org

… In recent years, Wasserstein-metric-based distributionally robust optimization method …

robust optimization problem with Wasserstein metric can be reformulated as a two-stage robust …

 Cited by 15 Related articles All 2 versions




[PDF] arxiv.org

Regularization helps with mitigating poisoning attacks: Distributionally-robust machine learning using the wasserstein distance

F Farokhi - arXiv preprint arXiv:2001.10655, 2020 - arxiv.org

… Background information on the Wasserstein distance is presented in Section 3… -robust

machine learning problem using the Wasserstein distance and transform the distributionally-robust …

  Cited by 7 Related articles All 4 versions 


[PDF] neurips.cc

Fast Epigraphical Projection-based Incremental Algorithms for Wasserstein Distributionally Robust Support Vector Machine

J LiC ChenAMC So - Advances in Neural Information …, 2020 - proceedings.neurips.cc

… Wasserstein Distributionally Robust Optimization (DRO) is … probability distribution within a

Wasserstein ball centered at a … a family of Wasserstein distributionally robust support vector …

  Cited by 3 Related articles All 6 versions 

 Fast Epigraphical Projection-based Incremental Algorithms for ...

slideslive.com › fast-epigraphical-projectionbased-increm...

slideslive.com › fast-epigraphical-projectionbased-increm...

Fast Epigraphical Projection-based Incremental Algorithms for Wasserstein Distributionally Robust Support Vector Machine. Dec 6, 2020 ...

SlidesLive · 

Dec 6, 2020

[PDF] mlr.press

Robust document distance with wasserstein-fisher-rao metric

Z Wang, D Zhou, M Yang, Y Zhang… - Asian Conference on …, 2020 - proceedings.mlr.press

… In this paper, we propose to address this overestimation issue with a novel Wasserstein-Fisher-Rao

(WFR) document distance grounded on unbalanced optimal transport theory. …

 Cited by 3 Related articles All 2 versions 

<——2020——2020—3470 —


[PDF] ieee.org

A new data-driven distributionally robust portfolio optimization method based on wasserstein ambiguity set

N Du, Y Liu, Y Liu - IEEE Access, 2020 - ieeexplore.ieee.org

… problem with respect to the Wasserstein metric. A new robust mean-CVaR portfolio …

computationally tractable equivalent form is derived with respect to the Wasserstein metric. …

  Cited by 3 Related articles All 2 versions


[PDF] atlantis-press.com

[PDF] Multimedia Analysis and Fusion via Wasserstein Barycenter.

C Jin, J Wang, J Wei, L Tan, S Liu… - Int. J. Networked …, 2020 - atlantis-press.com

… digital interface (MIDI) format and Wasserstein Barycenter as the algorithm of data fusion. …

we propos 

 Cited by 2 Related articles All 3 versions 

[PDF] nsf.gov

A data-driven distributionally robust game using wasserstein distance

G PengT ZhangQ Zhu - International Conference on Decision and Game …, 2020 - Springer

… We propose a distributionally robust formulation to … Wasserstein distance as the distribution

metric, we show that the game considered in this work is a generalization of the robust game …

  Cited by 1 Related articles All 3 versions


[PDF] thecvf.com

Channel Pruning for Accelerating Convolutional Neural Networks via Wasserstein Metric

H Duan, H Li - Proceedings of the Asian Conference on …, 2020 - openaccess.thecvf.com

… the Wasserstein metric. First, the output features of a channel are aggregated through the

Wasserstein … the above issues, we propose a novel pruning method via the Wasserstein metric. …

Related articles All 2 versions 

 

[PDF] arxiv.org

Convergence in Wasserstein distance for empirical measures of Dirichlet diffusion processes on manifolds

FY Wang - arXiv preprint arXiv:2005.09290, 2020 - arxiv.org

… Comparing with Eν[W2(µt,µ0)2|t<τ], in µν t the conditional expectation inside the Wasserstein

distance. According to [20], W2(µν t ,µ0)2 behaves as t−2, whereas the following result says …

 Cited by 4 Related articles All 3 versions 


2020


[PDF] colostate.edu

Vietoris–Rips metric thickenings and Wasserstein spaces

JR Mirth - 2020 - search.proquest.com

… : metric geometry, optimal transport, and category theory. Using the geodesic structure of

Wasserstein … The relation between this definition and the Wasserstein metric in this chapter is …

 Cited by 3 Related articles All 3 versions


[PDF] ieee.org

A new data-driven distributionally robust portfolio optimization method based on wasserstein ambiguity set

N Du, Y Liu, Y Liu - IEEE Access, 2020 - ieeexplore.ieee.org

… the portfolio optimization problem with respect to the Wasserstein metric. A new robust mean-…

computationally tractable equivalent form is derived with respect to the Wasserstein metric. …

Cited by 3 Related articles All 2 versions


Density estimation of multivariate samples using Wasserstein distance

E Luini, P Arbenz - Journal of Statistical Computation and …, 2020 - Taylor & Francis

… , presents the admissibility criteria, which determine the stopping rule of our partitioning

algorithm, and details the characteristics of the Wasserstein distance based hypothesis test. In …

 Related articles All 4 versions


[PDF] neurips.cc

Deep Diffusion-Invariant Wasserstein Distributional Classification

SW Park, DW Shu, J Kwon - Advances in Neural …, 2020 - proceedings.neurips.cc

… 2.4 Evaluation Metric To evaluate the classification performance, we propose the following 

… Thus, we can solve the metric-based classification problem in the Wasserstein space. …

Related articles All 2 versions


[PDF] arxiv.org

Central limit theorems for Markov chains based on their convergence rates in Wasserstein distance

R Jin, A Tan - arXiv preprint arXiv:2002.09427, 2020 - arxiv.org

… ours, most importantly the contraction of Wasserstein distance at the geometric rate, which is

… based on their convergence in Wasserstein distance, among other regularity conditions. To …

Cited by 1 Related articles All 3 versions 

<——2020——2020—3480 —


2020 see 2019  [PDF] aaai.org

Solving general elliptical mixture models through an approximate Wasserstein manifold

S Li, Z Yu, M Xiang, D Mandic - Proceedings of the AAAI Conference on …, 2020 - ojs.aaai.org

… Compared to the widely adopted Kullback– Leibler divergence, we show that the Wasserstein

distance provides a more desirable optimisation space. We thus provide a stable solution …

Cited by 4 Related articles All 5 versions 


  Wasserstein learning of determinantal point processes

L AnquetilM GartrellA Rakotomamonjy… - arXiv preprint arXiv …, 2020 - arxiv.org

… learning approach that minimizes the Wasserstein distance between the training data and …

Wasserstein distance with the chosen cost function in (1), we define the following Wasserstein …

Cited by 1 Related articles All 4 versions 

 

[PDF] arxiv.org

Characterization of probability distribution convergence in Wasserstein distance by -quantization error function

Y Liu, G Pagès - Bernoulli, 2020 - projecteuclid.org

… 1.1 some properties of the Wasserstein distance tp. Then in … distance AN,p := eN,p(μ,·) −

eN,p(ν,·) sup which we will prove to be topologically equivalent to the Wasserstein distance tp. …

 Cited by 2 Related articles All 7 versions


[PDF] oapen.org

[BOOK] An invitation to statistics in Wasserstein space

VM Panaretos, Y Zemel - 2020 - library.oapen.org

… topic of this book and is coming to be known as ‘statistics in Wasserstein spaces’ … chapter, 

we introduce the problem of optimal transport, which is the main concept behind Wasserstein …

Cited by 82 Related articles All 8 versions

Approximate inference with wasserstein gradient flows

C FrognerT Poggio - International Conference on Artificial …, 2020 - proceedings.mlr.press

… We propose to approximate this by m steps of the Wasserstein gradient flow (1), with … We

apply the Wasserstein gradient flow to approximate the predictive density of the diffusion, which …

Cited by 18 Related articles All 6 versions 


2020


[PDF] arxiv.org

Approximate Bayesian computation with the sliced-Wasserstein distance

K NadjahiV De BortoliA Durmus… - ICASSP 2020-2020 …, 2020 - ieeexplore.ieee.org

… , Wasserstein-ABC has been recently proposed, and compares the datasets via the Wasserstein

… , called Sliced-Wasserstein ABC and based on the Sliced-Wasserstein distance, which …

Cited by 12 Related articles All 8 versions
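
The Sliced-Wasserstein ABC entry above compares observed and simulated datasets through the sliced Wasserstein distance. Below is a hedged sketch of that distance only (not the ABC sampler): random one-dimensional projections, for which optimal transport reduces to sorting. The function name and defaults are illustrative.

```python
import numpy as np

def sliced_wasserstein_2(x, y, n_projections=200, seed=0):
    """Monte Carlo sliced 2-Wasserstein distance between equal-size point clouds."""
    rng = np.random.default_rng(seed)
    d = x.shape[1]
    acc = 0.0
    for _ in range(n_projections):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)
        px, py = np.sort(x @ theta), np.sort(y @ theta)   # 1-D OT = sorting
        acc += np.mean((px - py) ** 2)
    return np.sqrt(acc / n_projections)

x = np.random.default_rng(1).normal(size=(512, 2))
y = x + np.array([3.0, 0.0])   # pure translation
# For a translation in 2-D the sliced distance is roughly |shift| / sqrt(2) ~ 2.12.
print(sliced_wasserstein_2(x, y))
```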


[PDF] aaai.org

Solving general elliptical mixture models through an approximate Wasserstein manifold

S Li, Z Yu, M Xiang, D Mandic - Proceedings of the AAAI Conference on …, 2020 - ojs.aaai.org

… , we show that the Wasserstein distance provides a more … along a manifold of an approximate

Wasserstein distance. To this … , especially under the Wasserstein distance. To relieve this …

Cited by 4 Related articles All 5 versions 


[PDF] arxiv.org

Nonlinear model reduction on metric spaces. Application to one-dimensional conservative PDEs in Wasserstein spaces

V EhrlacherD LombardiO Mula… - … and Numerical Analysis, 2020 - esaim-m2an.org

… where the goal is to approximate any function belonging to … so far been based on the

approximation of the solution set by … and require to study nonlinear approximation methods. In this …

Cited by 18 Related articles All 25 versions


[HTML] springer.com

[HTML] The Wasserstein Space

VM Panaretos, Y Zemel - An Invitation to Statistics in Wasserstein Space, 2020 - Springer

… Although the topological properties below still hold at that level of generality (except when p = 0 or p = ∞), for the sake of simplifying the notation we restrict the discussion to Banach …

 Related articles


2020
Energy - Wind Farms; Reports from Shanghai Jiao Tong University Describe Recent Advances in Wind Farms (Typical Wind Power Scenario Generation for Multiple Wind Farms Using Conditional Improved Wasserstein... 

Energy Weekly News, Jan 10, 2020, 908

Newspaper Article:  Full Text Online 

Typical wind power scenario generation for multiple wind farms using conditional improved Wasserstein generative adversarial network

Zhang, Yufan; Ai, Qian

International Journal of Electrical Power and Energy Systems

2020 v. 114 p. 105388 

FullText Online Journal Article

Document Title: "Energy - Wind Farms; Reports from Shanghai Jiao Tong University Describe Recent Advances in Wind Farms (Typical Wind Power Scenario Generation for Multiple Wind Farms Using Conditional Improved Wasserstein Generative Adversarial Network)"AndStart Page: 908AndISSN: 19456980

Energy Weekly News, Jan 10, 2020, 908   Newspaper Article:  Full Text Online 

Cited by 60 Related articles

<——2020——2020—––3490 —


[PDF] arxiv.org

Efficient wasserstein natural gradients for reinforcement learning

T MoskovitzM ArbelF HuszarA Gretton - arXiv preprint arXiv …, 2020 - arxiv.org

… Wasserstein natural gradient vs Fisher natural gradient While Figure 1 (c) shows that both methods seem to reach the same solution, a closer inspection of the loss, as shown in Figure …

Cited by 3 Related articles All 6 versions 

[HTML] nih.gov

Wasserstein GANs for MR imaging: from paired to unpaired training

K Lei, M MardaniJM Pauly… - IEEE transactions on …, 2020 - ieeexplore.ieee.org

… Wasserstein distance is a measure of the distance between two probability distributions [25]… look at Wasserstein-1 distance in this paper. Here we first introduce Wasserstein-1 distance …

Cited by 34 Related articles All 10 versions


[PDF] kaust.edu.sa

Wasserstein proximal gradient

A Salim, A Korba, G Luise - 2020 - repository.kaust.edu.sa

… We adopt a Forward Backward (FB) Euler scheme for the discretization of the gradient flow of the relative entropy. This FB algorithm can be seen as a proximal gradient algorithm to …

Cited by 1 Related articles All 2 versions 


[PDF] thecvf.com

Illegible text to readable text: An image-to-image transformation using conditional sliced wasserstein adversarial networks

M KarimiG VeniYY Yu - … Vision and Pattern Recognition …, 2020 - openaccess.thecvf.com

… tions followed by computing average Wasserstein distance of these one-dimensional distributions [16]. We have developed a novel conditional sliced Wasserstein GAN with three …

 Cited by 3 Related articles All 7 versions 


[PDF] arxiv.org

Estimating processes in adapted Wasserstein distance

J BackhoffD BartlM BeiglböckJ Wiesel - arXiv preprint arXiv …, 2020 - arxiv.org

… As already mentioned in the abstract, to overcome this flaw of the Wasserstein distance (or rather, … Below we present an adapted extension of the classical Wasserstein distance which …

Cited by 11 Related articles All 4 versions 

[CITATION] Estimating processes in adapted Wasserstein distance

J Backhoff-Veraguas, D Bartl, M Beiglböck, J Wiesel - arXiv preprint arXiv:2002.07261, 2020

  Cited by 2 Related articles


2020 


2020  [PDF] aaai.org

2020  Quantifying the empirical wasserstein distance to a set of measures: Beating the curse of dimensionality

N SiJ BlanchetS Ghosh… - Advances in Neural …, 2020 - proceedings.neurips.cc

We consider the problem of estimating the Wasserstein distance between the empirical

measure and a set of probability measures whose expectations over a class of functions …

  Cited by 8 Related articles All 3 versions 


[PDF] aconf.org

LBWGAN: Label Based Shape Synthesis From Text With WGANs

B Li, Y Yu, Y Li - … International Conference on Virtual Reality and …, 2020 - ieeexplore.ieee.org

In this work, we purpose a novel method of voxel-based shape synthesis, which can build a

connection between the natural language text and the color shapes. The state-of-the-art …

Related articles All 3 versions


Generating synthetic financial time series with WGANs

https://towardsdatascience.com › generating-synthetic-fi...

Jun 19, 2020 — Generating synthetic financial time series with WGANs. A first experiment with Pytorch code. Introduction. Overfitting is one of the problems ...

[CITATION] Generating synthetic financial time series with WGANs

M Savasta - A first experiment with Pytorch code in June, 2020

Cited by 3 Related articles


[PDF] Exponential Convergence in Entropy and Wasserstein for McKean-Vlasov SDEs

P Renc, FY Wanga - 2020 - sfb1283.uni-bielefeld.de

… The convergence in entropy for stochastic systems is an important topic in both probability

theory and mathematical physics, and has been well studied for Markov processes by using …

All 2 versions 


2020 patent

Wasserstein-based high-energy image synthesis method and device for generating …

CN CN112634390A 郑海荣 深圳先进技术研究院

Priority 2020-12-17 • Filed 2020-12-17 • Published 2021-04-09

updating the preset generation countermeasure network model based on the first loss value and the first judgment result until the preset generation countermeasure network model converges, and determining the converged preset generation countermeasure network model as the Wasserstein generation …

<——2020——–2020—––3500—



2020 patent

High-energy image synthesis method and device based on wasserstein generative …

WO WO2022126480A1 郑海荣 深圳先进技术研究院

Filed 2020-12-17 • Published 2022-06-23

The preset generative adversarial network model is updated based on the first loss value and the first discrimination result until the preset generative adversarial network model converges, and the converged preset generative adversarial network model is determined as the Wasserstein generative …


2020 patent

Difference privacy greedy grouping method adopting Wasserstein distance

CN CN112307514A 杨悦 哈尔滨工程大学

Priority 2020-11-26 • Filed 2020-11-26 • Published 2021-02-02

1. A differential privacy greedy grouping method adopting Wasserstein distance is characterized by comprising the following steps: step 1: reading a data set D received at the ith time point i Step 2: will D i Data set D released from last time point i-1 Performing Wasserstein distance similarity …


2020 patent

System and Method for Generaring Highly Dense 3D Point Clouds using Wasserstein

KR20220088216A 권준석 중앙대학교 산학협력단

Filed 2020-12-18 • Published 2022-06-27

The present invention generates a high-resolution 3D point cloud using a Wasserstein distribution to generate a set of several 3D points by generating several input vectors from a prior distribution and expressing it as a Wasserstein distribution A prior distribution input unit for inputting a …


2020 patent

Multi-band image synchronous fusion and enhancement method based on improved WGAN-GP

CN CN111696066A 李大威 中北大学

Priority 2020-06-13 • Filed 2020-06-13 • Published 2020-09-22

1. The multiband image synchronous fusion and enhancement method based on the improved WGAN-GP is characterized by comprising the following steps of: designing and constructing a generation countermeasure network: generating a countermeasure network into a generator model and a discriminator model; 


LBWGAN: Label Based Shape Synthesis From Text With WGANs

B Li, Y Yu, Y Li - … International Conference on Virtual Reality and …, 2020 - ieeexplore.ieee.org

WGANs, a lot of research get a good result, they use WGANs … Compared to traditional GANs, 

WGANs completely solves the … With the advanced capability of WGANs, this paper uses it …

 Related articles All 3 versions


Wasserstein based two-stage distributionally robust optimization model for optimal operation of CCHP micro-grid under uncertainties
International Journal of Electrical Power & Energy Systems20 February 2020...

Yuwei WangYuanjuan YangBingkang Li


2020


 

2020 see 2019

Konarovskyi, Vitalii

On number of particles in coalescing-fragmentating Wasserstein dynamics. (English) Zbl 1488.60237

Theory Stoch. Process. 25, No. 2, 74-80 (2020).

MSC:  60K35 60H05 60G44

PDF BibTeX XML Cite

Full Text: arXiv 


2020 see 2019

Wasserstein Style Transfer

Mroueh, Y

23rd International Conference on Artificial Intelligence and Statistics (AISTATS)

2020 | INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108 108 , pp.842-851

We propose Gaussian optimal transport for image style transfer in an Encoder/Decoder framework. Optimal transport for Gaussian measures has closed forms Monge mappings from source to target distributions. Moreover, interpolating between a content and a style image can be seen as geodesics in the Wasserstein Geometry. Using this insight, we show how to mix different target styles, using Wasserst

Show more

3 Citations  36 References  Related records
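
The Wasserstein Style Transfer entry above relies on the closed-form Monge map between Gaussian measures. As a hedged sketch of that textbook formula only (not the paper's implementation; the function name is illustrative), the map N(m1, S1) -> N(m2, S2) is affine, T(x) = A x + b:

```python
import numpy as np
from scipy.linalg import sqrtm, inv

def gaussian_monge_map(m1, S1, m2, S2):
    """Return (A, b) with T(x) = A @ x + b pushing N(m1, S1) onto N(m2, S2)."""
    S1h = sqrtm(S1)
    S1h_inv = inv(S1h)
    A = np.real(S1h_inv @ sqrtm(S1h @ S2 @ S1h) @ S1h_inv)   # sqrtm may return complex dtype
    b = m2 - A @ m1
    return A, b

m1, S1 = np.zeros(2), np.eye(2)
m2, S2 = np.ones(2), np.diag([4.0, 0.25])
A, b = gaussian_monge_map(m1, S1, m2, S2)
print(np.allclose(A @ S1 @ A.T, S2))   # pushed-forward covariance matches S2
```

In the style-transfer setting the same map is applied to encoder feature statistics, and interpolating A and b traces the Wasserstein geodesic between content and style.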


DECWA : Density-Based Clustering using Wasserstein Distance

El Malki, NCugny, R; (...); Ravat, F

29th ACM International Conference on Information and Knowledge Management (CIKM)

2020 | CIKM '20: PROCEEDINGS OF THE 29TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT , pp.2005-2008

Clustering is a data analysis method for extracting knowledge by discovering groups of data called clusters. Among these methods, state-of-the-art density-based clustering methods have proven to be effective for arbitrary-shaped clusters. Despite their encouraging results, they suffer to find low-density clusters, near clusters with similar densities, and high-dimensional data. Our proposals ar

Show more

Full Text at Publisher 23  References  Related records


Learning with minibatch Wasserstein : asymptotic and gradient properties

Fatras, KZine, Y; (...); Courty, N

23rd International Conference on Artificial Intelligence and Statistics (AISTATS)

2020 | INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108 108 , pp.2131-2140

Optimal transport distances are powerful tools to compare probability distributions and have found many applications in machine learning. Yet their algorithmic complexity prevents their direct use on large scale datasets. To overcome this challenge, practitioners compute these distances on minibatches i.e. they average the outcome of several smaller optimal transport problems. We propose in thi

Show more

3 Citations  44 References  Related records


Chinese Font Translation with Improved Wasserstein Generative Adversarial Network

Miao, YLJia, HH; (...); Ji, YC

12th International Conference on Machine Vision (ICMV)

2020 | TWELFTH INTERNATIONAL CONFERENCE ON MACHINE VISION (ICMV 2019) 11433

Nowadays, various fonts are applied in many fields, and the generation of multiple fonts by computer plays an important role in the inheritance, development and innovation of Chinese culture. Aiming at the existing font generation methods, which have some problems such as stroke deletion, artifact and blur, this paper proposes Chinese font translation with improved wasserstein generative advers

Show more

Full Text at Publisher

1 Citation  17 References  Related records

<——2020——–2020—––3510—


Improving EEG-based Motor Imagery Classification with Conditional Wasserstein GAN

Li, Z and Yu, Y

International Conference on Image, Video Processing and Artificial Intelligence

2020 | 2020 INTERNATIONAL CONFERENCE ON IMAGE, VIDEO PROCESSING AND ARTIFICIAL INTELLIGENCE 11584

Deep learning based algorithms have made huge progress in the field of image classification and speech recognition. There is an increasing number of researchers beginning to use deep learning to process electroencephalographic(EEG) brain signals. However, at the same time, due to the complexity of the experimental device and the expensive collection cost, we cannot train a powerful deep learnin

Show more

Full Text at Publisher  13 References  Related records


Cross-Domain Text Sentiment Classification Based on Wasserstein Distance

Cai, GYLin, Q and Chen, NN

2nd International Conference on Security with Intelligent Computing and Big-data Services (SICBS)

2020 | SECURITY WITH INTELLIGENT COMPUTING AND BIG-DATA SERVICES 895 , pp.280-291

Text sentiment analysis is mainly to detect the sentiment polarity implicit in text data. Most existing supervised learning algorithms are difficult to solve the domain adaptation problem in text sentiment analysis. The key of cross-domain text sentiment analysis is how to extract the domain shared features of different domains in the deep feature space. The proposed method uses denoising autoen

Show more

Full Text at Publisher

2 Citations 16  References  Related records


OPTIMALITY IN WEIGHTED L-2-WASSERSTEIN GOODNESS-OF-FIT STATISTICS

de Wet, T and Humble, V

2020 | SOUTH AFRICAN STATISTICAL JOURNAL 54 (1) , pp.1-13

In Del Barrio, Cuesta-Albertos, Matran and Rodriguez-Rodriguez (1999) and Del Barrio, Cuesta-Albertos and Matran (2000), the authors introduced a new class of goodness-of-fit statistics based on the L-2-Wasserstein distance. It was shown that the desirable property of loss of degrees-of-freedom holds only under normality. Furthermore, these statistics have some limitations in their applicabilit

Show more

View full text  22 References Related records


An Improvement based on Wasserstein GAN for Alleviating Mode Collapsing

Chen, YY and Hou, XW

International Joint Conference on Neural Networks (IJCNN) held as part of the IEEE World Congress on Computational Intelligence (IEEE WCCI)

2020 | 2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN)

In the past few years, Generative Adversarial Networks as a deep generative model has received more and more attention. Mode collapsing is one of the challenges in the study of Generative Adversarial Networks. In order to solve this problem, we deduce a new algorithm on the basis of Wasserstein GAN. We add a generated distribution entropy term to the objective function of generator net and maxi

Show more

25 References  Related records


THE WASSERSTEIN DISTANCES BETWEEN PUSHED-FORWARD MEASURES WITH APPLICATIONS TO UNCERTAINTY QUANTIFICATION

Sagiv, A

2020 | COMMUNICATIONS IN MATHEMATICAL SCIENCES 18 (3) , pp.707-724

In the study of dynamical and physical systems, the input parameters are often uncertain or randomly distributed according to a measure rho. The system's response f pushes forward rho to a new measure f(*)rho which we would like to study. However, we might not have access to f, but to its approximation g. This problem is common in the use of surrogate models for numerical uncertainty quantifica

Show more

3 Citations  50 References  Related records


2020


Pattern-Based Music Generation with Wasserstein Autoencoders and PR(C)Descriptions

Borghuis, VAngioloni, L; (...); Frasconi, P

29th International Joint Conference on Artificial Intelligence

2020 | PROCEEDINGS OF THE TWENTY-NINTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE , pp.5225-5227

We present a pattern-based MIDI music generation system with a generation strategy based on Wasserstein autoencoders and a novel variant of pianoroll descriptions of patterns which employs separate channels for note velocities and note durations and can be fed into classic DCGAN-style convolutional architectures. We trained the system on two new datasets composed by musicians in our team with m

Show more  15 References  Related records


ON THE COMPUTATION OF KANTOROVICH WASSERSTEIN DISTANCES BETWEEN TWO-DIMENSIONAL HISTOGRAMS BY UNCAPACITATED MINIMUM COST FLOWS

Bassetti, FGualandi, S and Veneroni, M

2020 | SIAM JOURNAL ON OPTIMIZATION 30 (3) , pp.2441-2469

In this work, we present a method to compute the Kantorovich Wasserstein distance of order 1 between a pair of two-dimensional histograms. Recent works in computer vision and machine learning have shown the benefits of measuring Wasserstein distances of order 1 between histograms with n bins by solving a classical transportation problem on very large complete bipartite graphs with n nodes and n

Show more

View full text 6 Citations  57 References  Related records


Data supplement for a soft sensor using a new generative model based on a variational autoencoder and Wasserstein GAN

Wang, X and Liu, H

Jan 2020 | JOURNAL OF PROCESS CONTROL 85 , pp.91-99

In industrial process control, measuring some variables is difficult for environmental or cost reasons. This necessitates employing a soft sensor to predict these variables by using the collected data from easily measured variables. The prediction accuracy and computational speed in the modeling procedure of soft sensors could be improved with adequate training samples. However, the rough envir

Show more

View full text


1 次元最適速度モデルの緩和過程における Wasserstein 計量空間における特徴量変化と感応度パラメーター変化の関係

石渡龍輔, 杉山雄規 - 日本物理学会講演概要集 75.1, 2020 - jstage.jst.go.jp

We carried out an analysis of the distance … using the Wasserstein metric. The ψ1-axis direction obtained by metric multidimensional scaling of the Wasserstein metric was the component that most strongly affected changes in the Wasserstein metric …

 All 2 versions

[Japanese: Relation between changes of features in the Wasserstein metric space and changes of the sensitivity parameter in the relaxation process of the one-dimensional optimal velocity model]



<——2020——–2020—––3520—


2020 patent

… data leakage of halogen conveying pipeline based on S transformation/WGAN

CN CN111460367A 徐敏 淮阴工学院

Priority 2020-03-20 • Filed 2020-03-20 • Published 2020-07-28

5. The algorithm for solving the imbalance of the leakage data of the brine transportation pipeline based on the S transformation/WGAN as claimed in claim 4, wherein the objective function and the loss function in the training of the WGAN model in Step3 are as follows: an objective function: loss …


2020 patent

Method for generating biological Raman spectrum data based on WGAN (WGAN) …

CN CN112712857A 祝连庆 北京信息科技大学

Priority 2020-12-08 • Filed 2020-12-08 • Published 2021-04-27

1. A method of generating bio-raman spectral data based on a WGAN antagonistic generation network, the method comprising the steps of: a, extracting part of Raman spectrum data from a Raman spectrum database to serve as a real sample, and preprocessing the Raman spectrum data; b, creating a normal …



2020 patent

… attack flow data enhancement method and system combining self-encoder and WGAN

CN CN112688928A 姚叶鹏 中国科学院信息工程研究所

Priority 2020-12-18 • Filed 2020-12-18 • Published 2021-04-20

The invention discloses a network attack flow data enhancement method and system combining a self-encoder and a WGAN, which relate to the field of network space security, abnormal flow detection of a communication network and the field of artificial intelligence.


2020 patent

Method for image restoration based on WGAN network

CN CN112488956A 方巍 南京信息工程大学

Priority 2020-12-14 • Filed 2020-12-14 • Published 2021-03-12

3. The method for image inpainting based on WGAN network of claim 1, wherein in the step (1.3), through optimizing parameters and function algorithm: wherein, the activation function is specifically described as follows: 4. the method for image restoration based on WGAN network of claim 1, wherein …


Marcin Szelest i Paweł Kowalczyk, Zastosowanie metryki Wasserstein ...

https://www.ptm.org.pl › zawartosc

Mar 3, 2020 — ZASTOSOWANIE METRYKI WASSERSTEINA DO STATYSTYCZNEJ ANALIZY DANYCH W SAMOCHODOWYCH SYSTEMACH PERCEPCJI OTOCZENIA. Serdecznie zapraszamy,

[Polish: Marcin Szelest and Paweł Kowalczyk, Application of the Wasserstein metric to the statistical analysis of data in automotive environment-perception systems. You are cordially invited,]


2020


2020

Обращение полного волнового поля с использованием метрики Вассерштейна

АА Василенко - МНСК-2020, 2020 - elibrary.ru

… In this work, the Wasserstein metric is proposed for measuring the misfit. The metric under consideration has significant advantages over …

Related articles

[Russian: Full waveform inversion using the Wasserstein metric]


 CWGAN-GP-based multi-task learning model for consumer credit scoring

Y Kang, L Chen, N Jia, W Wei, J Deng… - Expert Systems with …, 2022 - Elsevier

… (MTL) model (CWGAN-GP-MTL) for consumer credit scoring. First, the CWGAN-GP model is

… through augmenting synthetic bad data generated by CWGAN-GP. Next, we design an MTL …

Cited by 1 Related articles


Multi-classification of arrhythmias using ResNet with CBAM on CWGAN-GP augmented ECG Gramian Angular Summation Field

K Ma, AZ Chang'an, F Yang - Biomedical Signal Processing and Control, 2022 - Elsevier

… generative adversarial network with gradient penalty (CWGAN-GP) model to augment the …

features for arrhythmia classification, and that CWGAN-GP based data augmentation provides …

Cited by 2 Related articles

[CITATION] Corrigendum to “Multi-classification of arrhythmias using ResNet with CBAM on CWGAN-GP augmented ECG Angular Summation Field”[Biomed. Signal …

K Ma, AZ Chang'an, F Yang - Biomedical Signal Processing and Control, 2022 - Elsevier

Related articles



Gene-CWGAN: a data enhancement method for gene expression profile based on improved CWGAN-GP

F Han, S Zhu, Q Ling, H Han, H Li, X Guo… - Neural Computing and …, 2022 - Springer

… data based on CWGAN-GP (Gene-CWGAN) is proposed in … is adopted in Gene-CWGAN

to make the distribution of … a Gene-CWGAN based on a proxy model (Gene-CWGAN-PS) …

Cited by 2 Related articles


[PDF] ieee.org

BWGAN-GP: An EEG Data Generation Method for Class Imbalance Problem in RSVP Tasks

M Xu, Y Chen, Y Wang, D Wang, Z Liu… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org

… This balanced WGAN model with a gradient penalty (BWGAN-GP) combines an autoencoder

… This study also surveyed the performance of BWGAN-GP when reducing the proportion of …

Cited by 1 Related articles All 3 versions

<——2020——–2020—––3530—




[HTML] hindawi.com

[HTML] Bearing Remaining Useful Life Prediction Based on AdCNN and CWGAN under Few Samples

J Man, M Zheng, Y Liu, Y Shen, Q Li - Shock and Vibration, 2022 - hindawi.com

At present, deep learning is widely used to predict the remaining useful life (RUL) of rotation

machinery in failure prediction and health management (PHM). However, in the actual …

All 4 versions 


Based on CWGAN Deep Learning Architecture to Predict Chronic Wound Depth Image

CL Chin, TY Sun, JC Lin, CY Li… - 2022 IEEE …, 2022 - ieeexplore.ieee.org

… In this paper, the wound depth image predictions using the CWGAN have been studied. The

experimental results show that the wound depth image quality generated by the CWGAN is …



Fault Diagnosis Method Based on CWGAN-GP-1DCNN

H Yin, Y Gao, C Liu, S Liu - 2021 IEEE 24th International …, 2021 - ieeexplore.ieee.org

… CWGAN-GP To address the problem of failure sample, conditional WGAN-GP (CWGAN-GP) …

This paper proposes a fault diagnosis model combining 1D-CNN and CWGAN-GP for the …

Related articles All 2 versions



Classification and Evaluation of Mobile Application Network ...

https://www.jsjkx.com › abstract › abstract18857

Based on the network event behavior sequence, the improved deep forest model is used to classify and identify applications. The optimal classification ...
 
[CITATION] Classification and evaluation of mobile application network behavior based on deep forest and CWGAN-GP

PF Jiang, SJ Wei - Comput Sci, 2020

Cited by 2 Related articles


CWGAN-DNN: 一种条件 Wasserstein 生成对抗网络入侵检测方法

贺佳星, 王晓丹, 宋亚飞, 来杰 - 空军工程大学学报, 2021 - kjgcdx.cnjournals.com

… an intrusion detection method (CWGAN-DNN) based on a conditional Wasserstein generative adversarial network (CWGAN) and a deep neural network (DNN). CWGAN-DNN … decomposes the Gaussian mixture distribution of the continuous features; CWGAN is then used to learn the distribution of the preprocessed data and to generate new …

All 2 versions  

[Chinese: CWGAN-DNN: A Conditional Wasserstein Generative Adversarial Network Intrusion Detection Method]


2020 [PDF] arxiv.org

Novelty detection via robust variational autoencoding

CH Lai, D Zou, G Lerman - arXiv preprint arXiv:2006.05534, 2020 - arxiv.org

Wasserstein metric gives rise to outliers-robust estimation and is suitable to the low-rank 

modeling of inliers by MAW. … use of the Wasserstein metric in WAE is different than that of MAW. …

Save Cite Cited by 4 Related articles All 5 versions


2020 video

Wasserstein Distributionally Robust Learning

books.google.com › books

books.google.com › books

Soroosh Shafieezadeh Abadeh · 2020 · No preview

Author keywords: Distributionally robust optimization; Wasserstein distance; Regularization; Supervised Learning; Inverse optimization; Kalman filter; Frank-Wolfe algorithm.

 Wasserstein Distributionally Robust Learning


  VIDEO 3

The Quadratic Wasserstein Metric for Inverse Data Matching Problems

Yang, Yunan (New York University)2020

This talk focuses on two major effects of the quadratic Wasserstein (W2) distance as the measure of data discrepancy in computational solutions of inverse problems...

OPEN ACCESS

The Quadratic Wasserstein Metric for Inverse Data Matching Problems

No Online Access 

Shuangjian Zhang, Kelvin (École Normale Supérieure)


VIDEO 6

Gradient Flows in the Wasserstein Metric: From Discrete to Continuum via Regularization

Craig, Katy (University of California, Santa Barbara)2020

 ...: the Wasserstein metric provides a new notion of distance for classifying distributions and a rich geometry for interpolating...

OPEN ACCESS

Gradient Flows in the Wasserstein Metric: From Discrete to Continuum via Regularization

No Online Access 

Craig, Katy (University of California, Santa Barbara)
Optimal Transport - Gradient Flows in the Wasserstein Metric

Math 707: Optimal Transport — Gradient Flows in the Wasserstein Metric — December 2, 2019. This is a lecture on "Gradient Flows in the Wasserstein ...

YouTube · Brittany Hamfeldt · 

Dec 13, 2019

VIDEO 7

On Wasserstein Gradient Flows and the Search of Neural Network Architectures

Garcia Trillos, Nicolas (University of Wisconsin, Madison)2020

OPEN ACCESS

On Wasserstein Gradient Flows and the Search of Neural Network Architectures

No Online Access 

<——2020——–2020—––3540—



Wasserstein Natural Gradients for Reinforcement Learning

https://talks.cam.ac.uk › talk › index

https://talks.cam.ac.uk › talk › index

Dec 1, 2020 — In this talk I present new optimization approach which can be applied to policy optimisation as well as evolution strategies for reinforcement ...

Wasserstein Natural Gradients for Reinforcement ... - talks.cam

talks.cam.ac.uk › talk › index

Dec 1, 2020 — Zoom. If you have a question about this talk, please contact Mateja Jamnik. Join us on Zoom. Policy Gradient methods can learn complex ...


A Non-Commutative Analog of the 2-Wasserstein Metric for ...

video.ias.edu/analysis/carlen 

The Fermionic Fokker-Planck equation is a quantum-mechanical analog of the classical Fokker-Planck equation with which it has much in common, such as the same optimal hypercontractivity properties.
 
Dec 2, 2020


 

Samir Chowdhury: Gromov-Wasserstein Learning in a ...

But also solving a principal component analysis and graphs is also of interest. And ideally we can do all of this ...

Dec 8, 2020



Introduction to the Wasserstein distance — 17,335 views  

Introduction to the Wasserstein distance - YouTube

Title: Introduction to the Wasserstein distance Abstract: I give an introduction to the Wasserstein distance, which is also called the ...

YouTube · Applied Algebraic Topology Network · 

Oct 26, 2020


 2020 PDF

Interpolating between f-Divergences and Integral Probability ...

https://www.jmlr.org › papers › volume23

by MA Katsoulakis · 2022 — We develop a rigorous and general framework for constructing information-theoretic divergences that subsume both f-divergences and integral probability ...

70 pages

Interpolating Between f-Divergences and Wasserstein Metrics

I will present a general framework for constructing new information-theoreticdivergences that interpolate between and combine crucial ...

YouTube · Columbia University, Probability · 

Nov 22, 20


2020


Prrojection Robust Wasserstein Distance and Riemannian ...

https://arxiv.org › cs

https://arxiv.org › cs

by T Lin · 2020 · Cited by 35 — Abstract: Projection robust Wasserstein (PRW) distance, or Wasserstein projection pursuit (WPP), is a robust variant of the Wasserstein ...

slideslive.com › projection-robust-wasserstein-distance-an..

Projection Robust Wasserstein Distance and Riemannian Optimization ... Spatio-Temporal Persistent Homology for Dynamic Metric Spaces.

SlidesLive · 

Dec 6, 2020


Fixed-support Wasserstein barycenters: Computational hardness and fast algorithm

T Lin, N Ho, X Chen, M Cuturi… - Advances in Neural …, 2020 - proceedings.neurips.cc

We study the fixed-support Wasserstein barycenter problem (FS-WBP), which consists in 

computing the Wasserstein barycenter of m discrete probability measures supported on 

a finite metric space of size n. We show first that the constraint matrix arising from the 

standard linear programming (LP) representation of the FS-WBP is not totally 

unimodular when m ≥ 3 and n ≥ 3. This result resolves an open question 

Save Cite Cited by 39 Related articles All 10 versions
Fixed-Support Wasserstein Barycenters: Computational ...

slideslive.com › ...

Fixed-Support Wasserstein Barycenters: Computational Hardness and Fast Algorithm · Speakers · Organizer · About NeurIPS 2020 · Store presentation ...

SlidesLive · 

Dec 6, 2020

Fixed-Support Wasserstein Barycenters: Computational ...

paperswithcode.com › paper › review

Fixed-Support Wasserstein Barycenters: Computational Hardness and Fast Algorithm. We study the fixed-support Wasserstein barycenter problem (FS-WBP), ...

Papers With Code · Ross Taylor · 

May 30, 2020
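
The fixed-support barycenter entries above concern discrete measures on a general finite metric space, which requires LP or Sinkhorn-type solvers. As a hedged contrast only (this is not the FS-WBP algorithm), the one-dimensional W2 barycenter has a closed form: average the quantile functions. The helper name below is illustrative.

```python
import numpy as np

def w2_barycenter_1d(samples, weights=None, n_quantiles=200):
    """Samples of the W2 barycenter of 1-D empirical distributions on a quantile grid."""
    k = len(samples)
    w = np.full(k, 1.0 / k) if weights is None else np.asarray(weights, float)
    grid = np.linspace(0.0, 1.0, n_quantiles)
    quantiles = np.stack([np.quantile(np.asarray(s, float), grid) for s in samples])
    return w @ quantiles          # weighted average of quantile functions

rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, 1000)
b = rng.normal(4.0, 1.0, 1000)
print(w2_barycenter_1d([a, b]).mean())   # close to 2.0, midway between the inputs
```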

Projection Robust Wasserstein Distance and Riemannian ...

https://proceedings.neurips.cc › paper › 2020 › hash

https://proceedings.neurips.cc › paper › 2020 › hash

by T Lin · 2020 · Cited by 35 — Projection Robust Wasserstein Distance and Riemannian Optimization. Part of Advances in Neural Information Processing Systems 33 (NeurIPS 2020).

NeurIPS 2020 : Projection Robust Wasserstein Distance and ...

neurips.cc › virtual › protected


Dec 10, 2020


WGAN-GP overriding `Model.train_step` - Keras

https://keras.io › examples › generative › wgan_gp

https://keras.io › examples › generative › wgan_gp

May 9, 2020 — The original Wasserstein GAN leverages the Wasserstein distance to produce a value function that has better theoretical properties than the ...

Wasserstein GAN (WGAN... · ‎Create the discriminator (the... · ‎Create the generator

Wgan keras. How to Develop a Wasserstein Generative .

Algorithm for the Wasserstein Generative Adversarial Networks. Taken from: Wasserstein GAN. The DCGAN ...

Dec 12, 2020 · Uploaded by AI Journal
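
The keras.io example above implements the WGAN-GP gradient penalty inside Model.train_step. As a hedged, generic sketch of the same penalty (not the Keras example's code), written in PyTorch; tensor shapes assume image batches (N, C, H, W) and the function name is illustrative.

```python
import torch

def gradient_penalty(critic, real, fake):
    # Random per-sample interpolation between real and generated images.
    eps = torch.rand(real.size(0), 1, 1, 1, device=real.device)
    xhat = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(xhat)
    grads = torch.autograd.grad(
        outputs=scores, inputs=xhat,
        grad_outputs=torch.ones_like(scores),
        create_graph=True, retain_graph=True,
    )[0]
    grad_norm = grads.reshape(grads.size(0), -1).norm(2, dim=1)
    # Penalize deviation of the critic's gradient norm from 1 (added to the
    # critic loss with a weight, commonly denoted lambda).
    return ((grad_norm - 1.0) ** 2).mean()
```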


Gromov-Wasserstein based optimal transport to align... - Rebecca Santorella - MLCSB - ISMB 2020

476 views — Gromov-Wasserstein based optimal transport to align single-cell multi-omics data - Rebecca Santorella

ISCB

2.7K subscribers

Dec 23, 2020
Gromov-Wasserstein optimal transport to align single ... - bioRxiv

https://www.biorxiv.org › 2020.04.28.066787v1

https://www.biorxiv.org › 2020.04.28.066787v1

Apr 29, 2020 — We present Single-Cell alignment using Optimal Transport (SCOT), ... uses Gromov Wasserstein-based optimal transport to align single-cell ...

Gromov–Wasserstein Optimal Transport to Align Single-Cell ...

https://icml-compbio.github.io › 2020 › papers

https://icml-compbio.github.io › 2020 › papers

PDF

by P Demetci · Cited by 43 — Another method, UnionCom (Cao et al., 2020), performs unsupervised topological alignment for single-cell multi-omics data to emphasize both local and ...


Gromov-Wasserstein based optimal transport to align... - MLCSB

www.youtube.com › watch

www.youtube.com › watch

Gromov-Wasserstein based optimal transport to align single-cell multi-omics data ... Optimal transport for machine learning - Gabriel Peyre, ...

YouTube · ISCB · 

Dec 23, 2020

<——2020——–2020—––3550—



https://www.coursera.org › lecture › earth-movers-dista...

Earth Mover's Distance - Week 3: Wasserstein GANs with ...

Sep 29, 2020 — In this course, you will: - Learn about GANs and their applications - Understand the intuition behind the fundamental components of GANs ...

Earth Mover's Distance - Week 3: Wasserstein GANs with ...

Implement a WGAN to ... ... Week 3: Wasserstein GANs with Gradient Penalty ...

Coursera · DeepLearning.AI · 

Sep 29, 2020


Week 3: Wasserstein GANs with Gradient Penalty | Coursera

Video created by DeepLearning.AI for the course "Build Basic Generative Adversarial Networks (GANs)". Learn advanced techniques to reduce ...

Coursera · DeepLearning.AI ·

 Sep 29, 2020



1-Lipschitz Continuity Enforcement - Week 3 - Coursera

https://www.coursera.org › lecture › 1-lipschitz-continu...

https://www.coursera.org › lecture › 1-lipschitz-continu...

Oct 6, 2020 — In this course, you will: - Learn about GANs and their applications - Understand the intuition behind the fundamental components of GANs ...

1-Lipschitz Continuity Enforcement - Week 3: Wasserstein ...

Week 3: Wasserstein GANs with Gradient Penalty ... Using the L2 norm is very common here, which just ...

Oct 1, 2020



https://www.coursera.org › lecture › condition-on-wass...

Condition on Wasserstein Critic - Week 3 - Coursera

20 — In this course, you will: - Learn about GANs and their applications - Understand the intuition behind the fundamental components of GANs ...
 Condition on Wasserstein Critic - Week 3: Wasserstein GANs .

Video created by DeepLearning.AI for the course "Build Basic Generative Adversarial Networks (GANs ...

Oct 2, 2020 · Uploaded by Eric Zelikman


 


Optimal transport: constant geodesic in Wasserstein space_ ...

这是我讨论班上和大家分享的内容。关于Wp空间上的测地线。参考书目optimal transport for applied ...
[Chinese: This is material I shared in our seminar, on geodesics in the Wp (Wasserstein) space. Reference: Optimal Transport for Applied ...]

Oct 16, 2020


2020

 


2020 see 2019 2021

 A Wasserstein Norm for Signed Measures

. Friday October 23rd at 3pm on Zoom ... I will make this talk pedagogical, explain using examples what is the ...

 Oct 23, 2020


 2020 10/29

Section 4.5 part 2 - "Wasserstein distance and entropy ...

Section 4.5 part 2 - "Wasserstein distance and entropy". 2 views • Sep 14, 2020.

Oct 29, 2020  by Dmitry Panchenko


Д. Н. Тяпкин: Ускорение сведением к седловым задачам с приложением к поиску барицентров Вассерштейна. 125 views

Oct 29, 2020

[Russian: D. N. Tyapkin: Acceleration by reduction to saddle point problems with an application to the search for Wasserstein barycenters. 125 views]

In this video we implement WGAN and WGAN-GP in PyTorch

In this video we implement WGAN and WGAN-GP in PyTorch. Both of these improvements are based on the ...

Nov 3, 2020 · Uploaded by Aladdin Persson



WGAN implementation from scratch (with gradient penalty ...
Building our first simple GAN. Aladdin ... Pytorch Conditional GAN Tutorial. Aladdin ... [GAN by Hung-yi Lee ...

 · Uploaded by Aladdin Persson

Nov 3, 2020

WGAN implementation from scratch (with gradient penalty)15,729 views

<——2020——–2020—––3560——



 


Wasserstein Robust RL - YouTube

www.youtube.com › watch

Wasserstein Robust RL. 549 views 2 years ago ... Safe Reinforcement Learning| Robotics| Machine Learning. Machine Learning and AI Academy.

YouTube · Machine Learning and AI Academy · 

Jul 22, 2020


2020 see 2021

Darina Dvinskikh - "Decentralized Algorithms for Wasserstein ...

www.youtube.com › watch

Gradient descent algorithms for Bures-Wasserstein barycenters

www.youtube.com › watch

Gradient descent algorithms for Bures-Wasserstein barycenters by Austin Stromme, Philippe Rigollet, Sinho Chewi, Tyler Maunu. Watch also on ...

YouTube · COLT · 

Aug 4, 2020


Gradient descent algorithms for Bures-Wasserstein barycenters

slideslive.com › gradient-descent-algorithms-for-bureswas...

Gradient descent algorithms for Bures-Wasserstein barycenters

SlidesLive · 

 

 

Principled Training of Generative Adversarial Networks with Wasserstein metric and Gradient Penalty

Principled Training of Generative Adversarial Networks with Wasserstein metric and Gradient Penalty.

YouTube · Quantil Matemáticas Aplicadas · 

Aug 5, 2020


AK on Twitter: "Wasserstein Generative Adversarial Networks

. https://arxiv.org/pdf/2008.03992.pdf … abs: https://arxiv.org/abs/2008.03992 project page: https ...

Aug 10, 2020


 2020


wasserstein ganwgan)是什么? -技术百科的定义- 音讯- 2021
[Chinese: What is Wasserstein GAN (WGAN)? - Definition from the technology encyclopedia - Audio - 2021]

Wasserstein GANWGAN)是库恩特数学科学研究所的Martin ArjovskySoumith Chintala ...
[Chinese: Wasserstein GAN (WGAN) is the work of Martin Arjovsky and Soumith Chintala of the Courant Institute of Mathematical Sciences ...]

Aug 19, 2020

 

Section 4.4 part 1 - "Wasserstein distance" - YouTube
Metrics on probability measures ... In part 1 we discuss the definition and basic properties of the Wasserstein distance between probability ...

YouTube · Probability Theory · 

Sep 14, 2020
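A small numerical illustration of the definition discussed in that lecture: in one dimension the 1-Wasserstein (earth mover's) distance between two empirical distributions reduces to comparing their quantile functions. The snippet below is only a sketch using SciPy's built-in routine:

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=5000)   # samples from N(0, 1)
y = rng.normal(2.0, 1.0, size=5000)   # samples from N(2, 1)

# For two normals with equal variance, W1 is the distance between the means (2 here).
print(wasserstein_distance(x, y))

# Equivalent quantile view: mean absolute difference of sorted samples of equal size.
print(np.mean(np.abs(np.sort(x) - np.sort(y))))
```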

 Gabriel Peyré on Twitter: "Comparison of the Wasserstein ...

https://twitter.com › gabrielpeyre › status


Aug 29, 2020 — It's more clear for the other metrics but for the Wasserstein seems like its barely changing while you move the 2nd Gaussian. 1.

Gabriel Peyré on Twitter: "Comparison of the Wasserstein

Comparison of the Wasserstein, Hellinger, Kullback-Leibler and reverse KL on the space of ...

Aug 28, 2020

Google AI on Twitter: "Introducing an #ImitationLearning ...

Introducing an #ImitationLearning approach for the low-data regime that calculates the Wasserstein distance ...

Sept 15, 2020

Gabriel Peyré on Twitter: "Comparison of the Wasserstein ...

twitter.com › gabrielpeyre › status

Comparison of the Wasserstein, Hellinger, Kullback-Leibler and ... It's more clear for the other metrics but for the Wasserstein seems like ...

Twitter · 

Aug 29, 2020



Problem with BCE Loss - Week 3: Wasserstein GANs with ...

https://www.coursera.org › lecture › problem-with-bce-...

Sep 29, 2020 — In this course, you will: - Learn about GANs and their applications - Understand the intuition behind the fundamental components of GANs ...

Problem with BCE Loss - Week 3: Wasserstein GANs with ...

Implement a WGAN to ... Week 3: Wasserstein GANs with Gradient Penalty ...

Coursera · DeepLearning.AI · 

Sep 29, 2020

<——2020——–2020—––3570——



Statistical and Computational Aspects of Learning with Complex Structure

S van de Geer, M Reiß, P Rigollet - Oberwolfach Reports, 2020 - ems.press

Computational aspects of structured learning: Ankur Moitra … complementary presentations on the computational aspects of two-… details the concept of Bures-Wasserstein barycenter Q

Save Cite Related articles All 3 versions

Statistical and Computational aspects of Wasserstein Barycenters

Rigollet @ MAD+ (8 Apr 2020). 439 ...

Apr 21, 2020

LATENT | Wasserstein GP GAN Loss Landscape morphology ...
www.youtube.com › watch

Video for wasserstein space 1:00

Loss Landscape generated with real data: wasserstein GP Gan, celebA dataset, sgd-adam, bs=64, train mod ...

Apr 26, 2020 - Uploaded by Javier ideami



2020 see 2019
From GAN to WGAN - Papers With Code

paperswithcode.com › paper › from-gan-to-wgan › review 1:05

This paper explains the math behind a generative adversarial network (GAN) model and why it is hard to be ...

May 7, 2020 · Uploaded by Ross Taylor


The latest in Machine Learning | Papers With Code

Distributional Sliced-Wasserstein and Applications to Generative Modeling. Sliced-Wasserstein distance (SW ...

May 7, 2020 · Uploaded by Ross Taylor

 


진보된 GAN - CGAN WGAN - YouTube

이전 시간에 살펴본 DCGAN에서 보다 진보된 GAN CGAN WGAN 살펴봤습니다.* 강의자료 다운로드: ...
[Korean: Building on the DCGAN covered last time, we look at the more advanced GANs, CGAN and WGAN. * Download the lecture materials: ...]

May 24, 2020 · Uploaded by 룩팍

[Korean    Advanced GAN - CGAN and WGAN - YouTube]



2020


2020 see 2022
Arif Dönmez (@ArifDoenmez) / Twitter

twitter.com › arifdoenmez

MSc in Mathematics w/ minor in Computer Science ... New small preprint "Stability of Entropic Wasserstein Barycenters and application to random geometric ...

Twitter · 

May 30, 2020


2020 see 2017

Parallel Streaming Wasserstein Barycenters - Papers With Code

paperswithcode.com › paper › review

One principled way to fuse probability distributions is via the lens of optimal transport: the Wasserstein barycenter is a single distribution that summarizes a ...

Papers With Code · Ross Taylor · 

May 31, 2020
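In one dimension the Wasserstein barycenter mentioned in this abstract has a closed form: its quantile function is the weighted average of the input quantile functions. A small sketch of that special case (the paper itself is about a parallel streaming algorithm, which this does not reproduce):

```python
import numpy as np

def barycenter_1d(samples, weights=None, n_quantiles=200):
    """W2 barycenter of 1-D empirical distributions via quantile averaging."""
    k = len(samples)
    w = np.full(k, 1.0 / k) if weights is None else np.asarray(weights, dtype=float)
    qs = np.linspace(0.0, 1.0, n_quantiles)
    quantiles = np.stack([np.quantile(s, qs) for s in samples])   # shape (k, n_quantiles)
    return w @ quantiles   # quantile function of the barycenter, sampled at qs

rng = np.random.default_rng(1)
bary_q = barycenter_1d([rng.normal(-2, 1, 1000), rng.normal(3, 0.5, 1000)])
```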

WGAN WGAN-GP 손실 함수가 이렇게 생겨먹은 이유 ...
[Korean: Why the WGAN and WGAN-GP loss functions look the way they do ...]

WGAN WGAN-GP 수학적으로 아름답게GAN 손실 함수를 해석했습니다. 지난 시간에 가볍게 살펴봤던 WGAN 오늘은 조금 ...
[Korean: A mathematically elegant interpretation of the WGAN and WGAN-GP loss functions. Having briefly looked at WGAN last time, today we go a little ...]

Jun 14, 2020 · Uploaded by 룩팍



Wasserstein Arlo Pro and Pro 2 Protective Silicone Skins

www.homedepot.com › ... › Home Safety Accessories

Gently peel open the skin from the back and insert the Arlo Pro into the skin. Adjust the skin to cover the camera perfectly. Wasserstein Silicone Skins for ...

The Home Depot · 

Jun 15, 2020


Luis Polanco (7/13/20): Data driven torsion coordinates and  Wasserstein stability

www.youtube.com › watch

... coordinates and Wasserstein stability. Abstract: We introduce a framework to construct coordinates in \emph{finite} Lens spaces for ...

YouTube · Applied Algebraic Topology Network · 

<——2020——–2020—––3580——


  

Rémi Flamary (@RFlamary) / Twitter

twitter.com › rflamary

For the L2-Gromov-Wasserstein distance, we study the structure of minimizers in ... Graphs with Fused Gromov-Wasserstein Barycenters" in collaboration with.

Twitter · 

Jan 1, 2020


WGAN-GP - YouTube

We aim to generate 32x32 pixel images of celebrity faces from the CelebA image data set. Here we used a ...

Mar 7, 2020 · Uploaded by Temporarily Anonymous

WGAN 64x64 - YouTube

We aim to generate 64x64 pixel images of celebrity faces from the CelebA image data set. Here we used a ...

Mar 27, 2020 · Uploaded by Temporarily Anonymous


 
2020 see 2019

Daniel Kuhn: "Wasserstein Distributionally Robust Optimization: Theory and Applications in Machine Learning"

Daniel Kuhn: "Wasserstein Distributionally Robust ... - YouTube

1:01:16

... that perform well under the most adverse distribution within a certain Wasserstein distance from a nominal ...

Apr 9, 2020 - Uploaded by Institute for Pure & Applied Mathematics (IPAM)



   2020

An Invitation to Statistics in Wasserstein Space - YES24

www.yes24.com › Product › Goods

 This open access book presents the key aspects of statistics in Wasserstein spaces, i.e. statistics in the space of probability measures ...

YES24 · 예스티비 · 

Oct 9, 2020


2020


An Invitation to Statistics in Wasserstein Space - YES24

An Invitation to Statistics in Wasserstein Space · ...

YES24 · 예스티비 · 

May 2, 2020

An Invitation to Statistics in Wasserstein Space  book


 2020

Lecture 11.4: Wasserstein Generative Adversarial Networks 

What Are GANs? | Generative Adversarial Networks Explained | Deep Learning With Python | Edureka. edureka! edureka!

YouTube · UniHeidelberg · 

Oct 15, 2020

[n libraries] An Invitation to Statistics in Wasserstein Space. Authors: Victor M. Panaretos, Yoav Zemel
book

2020 video

Gromov-Wasserstein Optimal Transport to Align Single-cell ...

crossminds.ai › video › gromov-wasserstein-optimal-trans...

Gromov-Wasserstein Optimal Transport to Align Single-cell ...

CrossMind.ai · 


Dexterous Robotic Grasping with Object-Centric Visual ...

slideslive.com › dexterous-robotic-grasping-with-objectce...

Dexterous Robotic Grasping with Object-Centric Visual Affordances. Dec 6, 2020 ... 

Wasserstein Distances for Stereo Disparity Estimation.

SlidesLive · 

Dec 6, 2020


Wasserstein Information Geometry in Generative and Discriminative Learning. 

Dec 06, 2020. Guido Montúfar.

CrossMind.ai · 

Dec 6, 2020

<——2020——–2020—––3590——


2020 see 2-10  BOOK CHAPTER

An Integrated Processing Method Based on Wasserstein Barycenter Algorithm for Automatic Music Transcription

Jin, Cong ; Li, Zhongtong ; Sun, Yuanyuan ; Zhang, Haiyin ; Lv, Xin ; Li, Jianguang ; Liu, Shouxun. Communications and Networking, 2020, p.230-240

An Integrated Processing Method Based on Wasserstein Barycenter Algorithm for Automatic Music Transcription

Available Online 


2020 thesis
Classification of atomic environments via the Gromov-Wasserstein distance
Author:Sakura Kawano (Author)
Summary:Interpreting molecular dynamics simulations usually involves automated classification of local atomic environments to identify regions of interest. Existing approaches are generally limited to a small number of reference structures and only include limited information about the local chemical composition. This work proposes to use a variant of the Gromov-Wasserstein (GW) distance to quantify the difference between a local atomic environment and a set of arbitrary reference environments in a way that is sensitive to atomic displacements, missing atoms, and differences in chemical composition. This involves describing a local atomic environment as a finite metric measure space, which has the additional advantages of not requiring the local environment to be centered on an atom and of not making any assumptions about the material class. Numerical examples illustrate the efficacy and versatility of the algorithmShow mor
Thesis, Dissertation, 2020
English
Publisher:University of California, Davis, Davis, Calif., 2020


Conference Paper

An Intelligent Maritime Communication Signal Recognition Algorithm based on ACWGAN

Caidan, Zhao; Zeping, He; Gege, Luo; Caiyun, Chen.

The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2020).



2020

TextureWGAN: Texture Preserving WGAN with MLE Regularizer for Inverse Problems
Authors: Ikuta, Masaki (Creator), Zhang, Jun (Creator)

Summary:Many algorithms and methods have been proposed for inverse problems particularly with the recent surge of interest in machine learning and deep learning methods. Among all proposed methods, the most popular and effective method is the convolutional neural network (CNN) with mean square error (MSE). This method has been proven effective in super-resolution, image de-noising, and image reconstruction. However, this method is known to over-smooth images due to the nature of MSE. MSE based methods minimize Euclidean distance for all pixels between a baseline image and a generated image by CNN and ignore the spatial information of the pixels such as image texture. In this paper, we proposed a new method based on Wasserstein GAN (WGAN) for inverse problems. We showed that the WGAN-based method was effective to preserve image texture. It also used a maximum likelihood estimation (MLE) regularizer to preserve pixel fidelity. Maintaining image texture and pixel fidelity is the most important requirement for medical imaging. We used Peak Signal to Noise Ratio (PSNR) and Structure Similarity (SSIM) to evaluate the proposed method quantitatively. We also conducted first-order and second-order statistical image texture analysis to assess image texture
Downloadable Archival Material, 2020-08-11
Undefined
Publisher:2020-08-11


2020
Peer-reviewed
A TextCNN and WGAN-gp based deep learning frame for unpaired text style transfer in multimedia services
Authors: Mingxuan Hu, Min He, Wei Su, Abdellah Chehri
Summary:Abstract: With the rapid growth of big multimedia data, multimedia processing techniques are facing some challenges, such as knowledge understanding, semantic modeling, feature representation, etc. Hence, based on TextCNN and WGAN-gp (improved training of Wasserstein GANs), a deep learning framework is suggested to improve the efficiency of discriminating the specific style features and the style-independent content features in unpaired text style transfer for multimedia services. To redact a sentence with the requested style and preserve the style-independent content, the encoder-decoder framework is usually adopted. However, lacking of same-content sentence pairs with different style for training, some works fail to capture the original content and generate satisfied style properties accurately in the transferred sentences. In this paper, we adopt TextCNN to extract the style features in the transferred sentences, and align the style features with the target style label by the generator (encoder and decoder). Meanwhile, WGAN-gp is utilized subtly to preserve the content features of original sentences. Experiments demonstrate that the performances of our framework on automatic evaluation and human evaluation are much better than the former works. Thus, it provides an effective method for unpaired text style transfer in multimedia servicesShow more
Article, 
Publication:Multimedia Systems, 27, 20201123, 723
Publisher:2020


2020


Accelerated WGAN update strategy with loss change rate balancing
Authors: Ouyang, Xu (Creator), Agam, Gady (Creator)

Summary:Optimizing the discriminator in Generative Adversarial Networks (GANs) to completion in the inner training loop is computationally prohibitive, and on finite datasets would result in overfitting. To address this, a common update strategy is to alternate between k optimization steps for the discriminator D and one optimization step for the generator G. This strategy is repeated in various GAN algorithms where k is selected empirically. In this paper, we show that this update strategy is not optimal in terms of accuracy and convergence speed, and propose a new update strategy for Wasserstein GANs (WGAN) and other GANs using the WGAN loss (e.g. WGAN-GP, Deblur GAN, and Super-resolution GAN). The proposed update strategy is based on a loss change ratio comparison of G and D. We demonstrate that the proposed strategy improves both convergence speed and accuracy

Downloadable Archival Material, 2020-08-27

Undefined

Publisher:2020-08-27
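One plausible reading of the "loss change ratio comparison" described in this abstract, written as a hedged Python sketch; the paper's actual rule may differ in its details:

```python
def choose_next_update(d_losses, g_losses, eps=1e-8):
    """Pick which network (critic 'D' or generator 'G') to update next,
    based on whose loss has changed less between the last two recorded steps."""
    def rel_change(hist):
        if len(hist) < 2:
            return float("inf")          # too little history: prioritize this network
        prev, last = hist[-2], hist[-1]
        return abs(last - prev) / (abs(prev) + eps)
    # Update the network whose loss is currently stagnating more.
    return "D" if rel_change(d_losses) <= rel_change(g_losses) else "G"
```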

2020  An Intelligent Maritime Communication Signal Recognition Algorithm based on ACWGAN
Authors: Zhao Caidan, He Zeping, Luo Gege, Chen Caiyun
2020 15th International Conference on Computer Science & Education (ICCSE)
Summary:In maritime communications systems, there are marine VHF communications systems that meet GMDSS standards in addition to AIS and VDES systems which use very high-frequency signals for information communications such as security rescue. But because its communication does not contain the identity information, the channel is easy to be occupied maliciously, thus interferes in normal maritime communication. This paper studies and analyzes the individual identification technology of the VHF signal based on the rf fingerprint technology of signal. Using the improved adversarial generation network ACWGAN(Auxiliary Classifier Wasserstein Generative Adversarial Networks) to train and identify, we obtain a better classification result. The recognition rate can reach 94% when the SNR is 5 dB for 10 different classes of VHF signalShow more
Chapter, 2020
Publication:2020 15th International Conference on Computer Science & Education (ICCSE), 202008, 197
Publisher:2020


2020 book

Accelerated WGAN update strategy with loss change rate balancing

Authors: Xu Ouyang, Gady Agam

Summary:Optimizing the discriminator in Generative Adversarial Networks (GANs) to completion in the inner training loop is computationally prohibitive, and on finite datasets would result in overfitting. To address this, a common update strategy is to alternate between k optimization steps for the discriminator D and one optimization step for the generator G. This strategy is repeated in various GAN algorithms where k is selected empirically. In this paper, we show that this update strategy is not optimal in terms of accuracy and convergence speed, and propose a new update strategy for Wasserstein GANs (WGAN) and other GANs using the WGAN loss(e.g. WGAN-GP, Deblur GAN, and Super-resolution GAN). The proposed update strategy is based on a loss change ratio comparison of G and D. We demonstrate that the proposed strategy improves both convergence speed and accuracy


Book, 2020

Publication:arXiv.org, Nov 3, 2020, n/a

Publisher:Cornell University Library, arXiv.org, Ithaca, 2020


2020 comp file

Sampling of probability measures in the convex order by Wasserstein projection
Authors: Alfonsi, Aurélien (Creator), Corbetta, Jacopo (Creator), Jourdain, Benjamin (Creator)
Summary:In this paper, for $\mu $ and $\nu $ two probability measures on $\mathbb{R}^{d}$ with finite moments of order $\varrho \ge 1$, we define the respective projections for the $W_{\varrho}$-Wasserstein distance of $\mu $ and $\nu $ on the sets of probability measures dominated by $\nu $ and of probability measures larger than $\mu $ in the convex order. The $W_{2}$-projection of $\mu $ can be easily computed when $\mu $ and $\nu $ have finite support by solving a quadratic optimization problem with linear constraints. In dimension $d=1$, Gozlan et al. (Ann. Inst. Henri Poincaré Probab. Stat. 54 (3) (2018) 1667–1693) have shown that the projection of $\mu$ does not depend on $\varrho $. We explicit their quantile functions in terms of those of $\mu $ and $\nu $. The motivation is the design of sampling techniques preserving the convex order in order to approximate Martingale Optimal Transport problems by using linear programming solvers. We prove convergence of the Wasserstein projection based sampling methods as the sample sizes tend to infinity and illustrate them by numerical experimentsShow more
Computer File, 2020-08
English
Publisher:Institut Henri Poincaré, 2020-08


Mode Collapse - Wasserstein GANs with Gradient Penalty

www.coursera.org › lecture › mode-collapse-Terkm

Video created by DeepLearning.AI for the course "Build Basic Generative Adversarial Networks (GANs)". Learn advanced techniques to reduce ...

Coursera · DeepLearning.AI · 

Sep 29, 2020


Youssef Mroueh · Wasserstein Style Transfer - SlidesLive

slideslive.com › wasserstein-style-transfer

... of researchers at the intersection of computer science, artificial intelligence, machine learning, statistics, and related areas.

SlidesLive · 

Aug 26, 2020

<——2020——–2020—––3600——


Wasserstein Embedding for Graph Learning - Papers With Code

paperswithcode.com › paper › review

We present Wasserstein Embedding for Graph Learning (WEGL), a novel and fast framework for embedding entire graphs in a vector space, in which various ...

Papers With Code · Ross Taylor · 



2020

Yan Wang, 'Wasserstein subsampling: Theory and empirical ...

tads.research.iastate.edu › yan-wang-wasserstein-subsampl...

The Wasserstein distance is proposed as a metric for subsampling such $m$ points. Risk bounds are established in terms of the Wasserstein ...

HDR TRIPODS: D4 (Dependable Data-Driven Discovery ... · Computer Science · 

Aug 20, 2020


2020

Wasserstein Learning of Determinantal Point Processes

slideslive.com › wasserstein-learning-of-deeterminantal-p...

Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes ...

SlidesLive · 

Dec 6, 2020


Stochastics and Statistics Seminar - Jose Blanchet - YouTube

www.youtube.com › watch

Statistical Aspects of Wasserstein Distributionally Robust ... Machine Learning and Robust Optimization, Fengqi You, Cornell University.

YouTube · MIT Institute for Data, Systems, and Society · 

Nov 4, 2020


 [HTML] nih.gov

Wasserstein GANs for MR imaging: from paired to unpaired training

K Lei, M Mardani, JM Pauly… - … on medical imaging, 2020 - ieeexplore.ieee.org

… Wasserstein distance is a measure of the distance between two probability distributions [25]…

look at Wasserstein-1 distance in this paper. Here we first introduce Wasserstein-1 distance …

Cited by 30 Related articles All 10 versions


2020


Tweets with replies by Evgeny (@burnaevevgeny) / Twitter

mobile.twitter.com › burnaevevgeny › with_replies

per "Large-Scale Wasserstein Gradient Flows" we show how to solve the ... Or need a NAS benchmark based on recurrent architectures for NLP tasks?

Twitter · 

Dec 4, 2020



2020

Review of Convolutions - Week 2: Deep Convolutional GANs

www.coursera.org › lecture › build-basic-generative-adve...

Controllable Generation, WGANs, Conditional Generation, Components of GANs, DCGANs ...

Coursera · DeepLearning.AI · 

Sep 29, 2020

  

2020

Welcome to Week 1 - Week 1: Intro to GANs - Coursera

www.coursera.org › lecture › welcome-to-week-1-mgIKX

Controllable Generation, WGANs, Conditional Generation, Components of GANs, DCGANs ...

Coursera · DeepLearning.AI · 

Sep 29, 2020

 

2020

Classifier Gradients - Week 4: Conditional GAN & Controllable ...

www.coursera.org › lecture › build-basic-generative-adve...

Controllable Generation, WGANs, Conditional Generation, Components of GANs, DCGANs ...

Coursera · DeepLearning.AI · 

Oct 7, 2020


2020

BCE Cost Function - Week 1: Intro to GANs - Coursera

www.coursera.org › lecture › build-basic-generative-adve...

Controllable Generation, WGANs, Conditional Generation, Components of GANs, DCGANs ...

Coursera · DeepLearning.AI · 

Sep 29, 2020

<——2020——–2020—––3610——



2020

Conditional Generation: Inputs - Coursera

www.coursera.org › lecture › build-basic-generative-adve...

Controllable Generation, WGANs, Conditional Generation, Components of GANs, DCGANs ...

Coursera · DeepLearning.AI · 

Sep 29, 2020


2020

Week 4: Conditional GAN & Controllable Generation | Coursera

www.coursera.org › lecture › build-basic-generative-adve...

Controllable Generation, WGANs, Conditional Generation, Components of GANs, DCGANs ...

Coursera · DeepLearning.AI · 

Sep 29, 2020



2020

Felipe Arevalo (@Pipe_ArevaloC) / Twitter

mobile.twitter.com › pipe_arevaloc

César A. Uribe's paper, "Approximate Wasserstein attraction flows for ... Have you submitted your paper for the LatinX in NeurIPS at #NeurIPS 2022 just yet?

Twitter · Jun 19, 2020


2020 see 2019

Parameter-Transferred Wasserstein Generative Adversarial Network (PT-WGAN) for Low-Dose PET Image Denoising

Authors: Yu Gong, Hongming Shan, Yueyang Teng, Ning Tu, Ming Li, Guodong Liang, Ge Wang, Shanshan Wang

Summary:Due to the widespread use of positron emission tomography (PET) in clinical practice, the potential risk of PET-associated radiation dose to patients needs to be minimized. However, with the reduction in the radiation dose, the resultant images may suffer from noise and artifacts that compromise diagnostic performance. In this paper, we propose a parameter-transferred Wasserstein generative adversarial network (PT-WGAN) for low-dose PET image denoising. The contributions of this paper are twofold: i) a PT-WGAN framework is designed to denoise low-dose PET images without compromising structural details, and ii) a task-specific initialization based on transfer learning is developed to train PT-WGAN using trainable parameters transferred from a pretrained model, which significantly improves the training efficiency of PT-WGAN. The experimental results on clinical data show that the proposed network can suppress image noise more effectively while preserving better image fidelity than recently published state-of-the-art methods. We make our code available at https://github.com/90n9-yu/PT-WGAN


Book, 2020

Publication:arXiv.org, Aug 26, 2020, n/a

Publisher:Cornell University Library, arXiv.org, Ithaca, 2020



2020 see arxiv

Many-Objective Estimation of Distribution Optimization Algorithm Based on WGAN-GP

Authors: Zhenyu Liang, Yunfan Li, Zhongwei Wan

Summary:Estimation of distribution algorithms (EDA) are stochastic optimization algorithms. EDA establishes a probability model to describe the distribution of solution from the perspective of population macroscopically by statistical learning method, and then randomly samples the probability model to generate a new population. EDA can better solve multi-objective optimal problems (MOPs). However, the performance of EDA decreases in solving many-objective optimal problems (MaOPs), which contains more than three objectives. Reference Vector Guided Evolutionary Algorithm (RVEA), based on the EDA framework, can better solve MaOPs. In our paper, we use the framework of RVEA. However, we generate the new population by Wasserstein Generative Adversarial Networks-Gradient Penalty (WGAN-GP) instead of using crossover and mutation. WGAN-GP have advantages of fast convergence, good stability and high sample quality. WGAN-GP learn the mapping relationship from standard normal distribution to given data set distribution based on a given data set subject to the same distribution. It can quickly generate populations with high diversity and good convergence. To measure the performance, RM-MEDA, MOPSO and NSGA-II are selected to perform comparison experiments over DTLZ and LSMOP test suites with 3-, 5-, 8-, 10- and 15-objective


Book, Mar 16, 2020

Publication:arXiv.org, Mar 16, 2020, n/a

Publisher:Mar 16, 2020


2020



Peer-reviewed
Wasserstein Hamiltonian flows
Authors: Shui-Nee Chow, Wuchen Li, Haomin Zhou
Summary:We establish kinetic Hamiltonian flows in density space embedded with the L 2 -Wasserstein metric tensor. We derive the Euler-Lagrange equation in density space, which introduces the associated Hamiltonian flows. We demonstrate that many classical equations, such as Vlasov equation, Schrödinger equation and Schrödinger bridge problem, can be rewritten as the formalism of Hamiltonian flows in density spaceShow more
Article, 2020
Publication:Journal of Differential Equations, 268, 20200115, 1205
Publisher:2020
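For orientation, the standard form of a Wasserstein Hamiltonian flow in density space, recalled here from the general literature on the topic (the paper's own notation and assumptions may differ):

```latex
\mathcal{H}(\rho,\Phi) = \tfrac{1}{2}\int |\nabla\Phi(x)|^{2}\,\rho(x)\,dx + \mathcal{F}(\rho),
\qquad
\partial_t \rho = -\nabla\cdot(\rho\,\nabla\Phi),
\qquad
\partial_t \Phi = -\tfrac{1}{2}|\nabla\Phi|^{2} - \frac{\delta \mathcal{F}}{\delta \rho}(\rho).
```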



2020  Peer-reviewed
Spam transaction attack detection model based on GRU and WGAN-div
Authors: Jin Yang, Tao Li, Gang Liang, YunPeng Wang, TianYu Gao, FangDong Zhu
Summary:A Spam Transaction attack is a kind of hostile attack activity specifically targeted against a Cryptocurrency Network. Traditional network intrusion detection methods lack the capability of automatic feature extraction for spam transaction attacks, and thus the detection efficiency is low. Worse still, these kinds of attack methods and the key intrusion behaviour process are usually concealed and submerged into a large number of normal data packages; therefore, the captured threat test samples are too small, which easily leads to insufficient training of detection model, low detection accuracy rate, and high false alarm rate. In this paper, a spam transaction intrusion detection model based on GRU(Gated Recurrent Unit) is proposed, which takes advantage of the excellent features of deep learning and uses repeated and multilevel learning to perform automatic feature extraction for network intrusion behaviour. The model has extremely high learning ability and massive data processing ability. Moreover, it has a quicker and more accurate spam transaction attack detection ability than traditional intrusion detection algorithms. Additionally, a generation method of spam transaction-samples based on WGAN-div is proposed, which obtains new samples by learning training samples and solves the problems of insufficient original samples and unbalanced samples. A series of experiments were performed to verify the proposed models. The proposed models can distinguish between normal and abnormal transaction behaviours with an accuracy reaching to 99.86%. The experimental results indicate that the proposed models in this paper have higher efficiency and accuracy in detecting spam transaction attacks, which provides a novel and better idea for research of spam transaction attack detection systemsShow more
Article, 2020
Publication:Computer Communications, 161, 20200901, 172
Publisher:2020


2020  Peer-reviewed
WGAN domain adaptation for the joint optic disc-and-cup segmentation in fundus images
Authors: Shreya Kadambi, Zeya Wang, Eric Xing
Summary:Purpose: The cup-to-disc ratio (CDR), a clinical metric of the relative size of the optic cup to the optic disc, is a key indicator of glaucoma, a chronic eye disease leading to loss of vision. CDR can be measured from fundus images through the segmentation of optic disc and optic cup . Deep convolutional networks have been proposed to achieve biomedical image segmentation with less time and more accuracy, but requires large amounts of annotated training data on a target domain, which is often unavailable. Unsupervised domain adaptation framework alleviates this problem through leveraging off-the-shelf labeled data from its relevant source domains, which is realized by learning domain invariant features and improving the generalization capabilities of the segmentation model. Methods: In this paper, we propose a WGAN domain adaptation framework for detecting optic disc-and-cup boundary in fundus images. Specifically, we build a novel adversarial domain adaptation framework that is guided by Wasserstein distance, therefore with better stability and convergence than typical adversarial methods. We finally evaluate our approach on publicly available datasets. Results: Our experiments show that the proposed approach improves Intersection-over-Union score for optic disc-and-cup segmentation, Dice score and reduces the root-mean-square error of cup-to-disc ratio, when we compare it with direct transfer learning and other state-of-the-art adversarial domain adaptation methods. Conclusion: With this work, we demonstrate that WGAN guided domain adaptation obtains a state-of-the-art performance for the joint optic disc-and-cup segmentation in fundus imagesShow more
Downloadable Article, 2020
Publication:International Journal of Computer Assisted Radiology and Surgery : A journal for interdisciplinary research, development and applications of image guided diagnosis and therapy, 15, 202007, 1205
Publisher:2020

 

2020
Many-Objective Estimation of Distribution Optimization Algorithm Based on WGAN-GP
Authors:Liang, Zhenyu (Creator), Li, Yunfan (Creator), Wan, Zhongwei (Creator)
Summary:Estimation of distribution algorithms (EDA) are stochastic optimization algorithms. EDA establishes a probability model to describe the distribution of solution from the perspective of population macroscopically by statistical learning method, and then randomly samples the probability model to generate a new population. EDA can better solve multi-objective optimal problems (MOPs). However, the performance of EDA decreases in solving many-objective optimal problems (MaOPs), which contains more than three objectives. Reference Vector Guided Evolutionary Algorithm (RVEA), based on the EDA framework, can better solve MaOPs. In our paper, we use the framework of RVEA. However, we generate the new population by Wasserstein Generative Adversarial Networks-Gradient Penalty (WGAN-GP) instead of using crossover and mutation. WGAN-GP have advantages of fast convergence, good stability and high sample quality. WGAN-GP learn the mapping relationship from standard normal distribution to given data set distribution based on a given data set subject to the same distribution. It can quickly generate populations with high diversity and good convergence. To measure the performance, RM-MEDA, MOPSO and NSGA-II are selected to perform comparison experiments over DTLZ and LSMOP test suites with 3-, 5-, 8-, 10- and 15-objectiveShow more
Downloadable Archival Material, 2020-03-15
Undefined
Publisher:2020-03-15

 

2020
E-WACGAN: Enhanced Generative Model of Signaling Data Based on WGAN-GP and ACGAN
Authors: Qimin Jin, Rongheng Lin, Fangchun Yang
Summary:In recent years, the generative adversarial network (GAN) has achieved outstanding performance in the image field and the derivatives of GAN, namely auxiliary classifier GAN (ACGAN) and Wasserstein GAN with gradient penalty (WGAN-GP) have also been widely used, but the GAN applications of nonimage domain are not wide. At the time when the telecommunication fraud is rampant, the signaling data of telephone contain a lot of useful information, which is helpful for distinguishing fraudulent and nonfraudulent calls. In this article, aiming at the problem of limited amount of data and information leakage in the research of telephone signaling data, we adopt WGAN-GP and ACGAN to generate analog data, which confirms distribution of true data. In order to solve the problem of category accuracy of analog data and to enhance the stability and speed of training, we proposed a new network structure for discriminator of GAN based on WGAN-GP and ACGAN. The experiments on telecommunication fraud dataset found that our method obtains better performance on signaling data than base modelShow more
Article, 2020
Publication:IEEE Systems Journal, 14, 202009, 3289
Publisher:2020

<——2020——–2020—––3620——



GS-WGAN: A Gradient-Sanitized Approach for Learning Differentially Private Generators
Authors:Chen, Dingfan (Creator), Orekondy, Tribhuvanesh (Creator), Fritz, Mario (Creator)
Summary:The wide-spread availability of rich data has fueled the growth of machine learning applications in numerous domains. However, growth in domains with highly-sensitive data (e.g., medical) is largely hindered as the private nature of data prohibits it from being shared. To this end, we propose Gradient-sanitized Wasserstein Generative Adversarial Networks (GS-WGAN), which allows releasing a sanitized form of the sensitive data with rigorous privacy guarantees. In contrast to prior work, our approach is able to distort gradient information more precisely, and thereby enabling training deeper models which generate more informative samples. Moreover, our formulation naturally allows for training GANs in both centralized and federated (i.e., decentralized) data scenarios. Through extensive experiments, we find our approach consistently outperforms state-of-the-art approaches across multiple metrics (e.g., sample quality) and datasetsShow more
Downloadable Archival Material, 2020-06-15
Undefined
Publisher:2020-06-15
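A generic sketch of gradient sanitization in the differential-privacy spirit described above (bound the gradient norm, then add calibrated Gaussian noise); GS-WGAN's precise mechanism is defined in the paper, and this is not it verbatim:

```python
import torch

def sanitize_gradient(grad, clip_norm=1.0, noise_multiplier=1.1):
    """Clip the L2 norm of a gradient tensor, then add Gaussian noise scaled
    to the clipping bound. A hedged, generic DP-style sketch only."""
    scale = torch.clamp(clip_norm / (grad.norm(2) + 1e-12), max=1.0)
    return grad * scale + torch.randn_like(grad) * noise_multiplier * clip_norm
```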

2020
E-WACGAN: Enhanced Generative Model of Signaling Data Based on WGAN-GP and ACGAN
Authors: Jin Q., Lin R., Yang F.
2020  Article
Publication:IEEE Systems Journal, 14, 2020 09 01, 3289
Publisher:2020



2020  Peer-reviewed
Motion Deblurring in Image Color Enhancement by WGAN
Authors: Jiangfan Feng, Adrian Podoleanu (Editor), Shuang Qi
Summary:Motion deblurring and image enhancement are active research areas over the years. Although the CNNbased model has an advanced state of the art in motion deblurring and image enhancement, it fails to produce multitask results when challenged with the images of challenging illumination conditions. The key idea of this paper is to introduce a novel multitask learning algorithm for image motion deblurring and color enhancement, which enables us to enhance the color effect of an image while eliminating motion blur. To achieve this, we explore the synchronization of processing two tasks for the first time by using the framework of generative adversarial networks (GANs). We add L1 loss to the generator loss to simulate the model to match the target image at the pixel level. To make the generated image closer to the target image at the visual level, we also integrate perceptual style loss into generator loss. After a lot of experiments, we get an effective configuration scheme. The best model trained for about one week has achieved stateoftheart performance in both deblurring and enhancement. Also, its image processing speed is approximately 1.75 times faster than the best competitorShow more
Downloadable Article, 2020
Publication:International Journal of Optics., 2020, 1
Publisher:2020

2020
WGAN-based Autoencoder Training Over-the-air
Authors:Dörner, Sebastian (Creator), Henninger, Marcus (Creator), Cammerer, Sebastian (Creator), Brink, Stephan ten (Creator)
Summary:The practical realization of end-to-end training of communication systems is fundamentally limited by its accessibility of the channel gradient. To overcome this major burden, the idea of generative adversarial networks (GANs) that learn to mimic the actual channel behavior has been recently proposed in the literature. Contrarily to handcrafted classical channel modeling, which can never fully capture the real world, GANs promise, in principle, the ability to learn any physical impairment, enabled by the data-driven learning algorithm. In this work, we verify the concept of GAN-based autoencoder training in actual over-the-air (OTA) measurements. To improve training stability, we first extend the concept to conditional Wasserstein GANs and embed it into a state-of-the-art autoencoder-architecture, including bitwise estimates and an outer channel code. Further, in the same framework, we compare the existing three different training approaches: model-based pre-training with receiver finetuning, reinforcement learning (RL) and GAN-based channel modeling. For this, we show advantages and limitations of GAN-based end-to-end training. In particular, for non-linear effects, it turns out that learning the whole exploration space becomes prohibitively complex. Finally, we show that the training strategy benefits from a simpler (training) data acquisition when compared to RL-based training, which requires continuous transmitter weight updates. This becomes an important practical bottleneck due to limited bandwidth and latency between transmitter and training algorithm that may even operate at physically different locationsShow more
Downloadable Archival Material, 2020-03-05
Undefined
Publisher:2020-03-05

2020 see 2019
VAE/WGAN-Based Image Representation Learning For Pose-Preserving Seamless Identity Replacement In Facial Images
Authors:Kawai, Hiroki (Creator), Chen, Jiawei (Creator), Ishwar, Prakash (Creator), Konrad, Janusz (Creator)
Summary:We present a novel variational generative adversarial network (VGAN) based on Wasserstein loss to learn a latent representation from a face image that is invariant to identity but preserves head-pose information. This facilitates synthesis of a realistic face image with the same head pose as a given input image, but with a different identity. One application of this network is in privacy-sensitive scenarios; after identity replacement in an image, utility, such as head pose, can still be recovered. Extensive experimental validation on synthetic and real human-face image datasets performed under 3 threat scenarios confirms the ability of the proposed network to preserve head pose of the input image, mask the input identity, and synthesize a good-quality realistic face image of a desired identity. We also show that our network can be used to perform pose-preserving identity morphing and identity-preserving pose morphing. The proposed method improves over a recent state-of-the-art method in terms of quantitative metrics as well as synthesized image qualityShow more
Downloadable Archival Material, 2020-03-01
Undefined
Publisher:2020-03-01


2020


2020  Peer-reviewed
Motion Deblurring in Image Color Enhancement by WGAN
Authors: Jiangfan Feng, Shuang Qi
Summary:Motion deblurring and image enhancement are active research areas over the years. Although the CNN-based model has an advanced state of the art in motion deblurring and image enhancement, it fails to produce multitask results when challenged with the images of challenging illumination conditions. The key idea of this paper is to introduce a novel multitask learning algorithm for image motion deblurring and color enhancement, which enables us to enhance the color effect of an image while eliminating motion blur. To achieve this, we explore the synchronization of processing two tasks for the first time by using the framework of generative adversarial networks (GANs). We add _L_1 loss to the generator loss to simulate the model to match the target image at the pixel level. To make the generated image closer to the target image at the visual level, we also integrate perceptual style loss into generator loss. After a lot of experiments, we get an effective configuration scheme. The best model trained for about one week has achieved state-of-the-art performance in both deblurring and enhancement. Also, its image processing speed is approximately 1.75 times faster than the best competitorShow more
Article, 2020
Publication:International Journal of Optics, 2020, 20200624
Publisher:2020


2020
Image Dehazing Algorithm Based on FC-DenseNet and WGAN
Author:SUN Bin, JU Qingqing, SANG Qingbing
Summary:The existing image dehazing algorithms rely heavily on the accurate estimation of the intermediate variables. This paper proposes an end-to-end image dehazing model based on Wasserstein generative adversarial networks(WGAN). Firstly, the fully convolutional DenseNets (FC-DenseNet) is used to fully learn the features of the hazy in image. Secondly, the residual learning concept is used to directly learn the features of the clear image from the degraded image, and realize end-to-end image dehazing. Finally, the mean square error and perceptual structural error function are used as the loss function of the model to ensure the image structure and content information, and WGAN is used to finely optimize the generated results to produce clear and realistic clear images. Experimental results show that the proposed algorithm improves the structural similarity by 4% compared with other comparison algorithms on the synthetic hazy dataset, and on the natural hazy image, the image restored by the algorithm has higher definition and contrast, and is superior to other comparison algorithms on the subjective evaluationShow more
Downloadable Article
Publication:Jisuanji kexue yu tansuo, 14, 20200801, 1380
Access Free

结合FC-DenseNet和WGAN的图像去雾算法 [Chinese: Image Dehazing Algorithm Based on FC-DenseNet and WGAN]

Online:2020-08-01 Published:2020-08-07



2020
Symmetric Skip Connection Wasserstein GAN for High-Resolution Facial Image Inpainting
Authors:Jam, Jireh (Creator), Kendrick, Connah (Creator), Drouard, Vincent (Creator), Walker, Kevin (Creator), Hsu, Gee-Sern (Creator), Yap, Moi Hoon (Creator)
Summary:The state-of-the-art facial image inpainting methods achieved promising results but face realism preservation remains a challenge. This is due to limitations such as; failures in preserving edges and blurry artefacts. To overcome these limitations, we propose a Symmetric Skip Connection Wasserstein Generative Adversarial Network (S-WGAN) for high-resolution facial image inpainting. The architecture is an encoder-decoder with convolutional blocks, linked by skip connections. The encoder is a feature extractor that captures data abstractions of an input image to learn an end-to-end mapping from an input (binary masked image) to the ground-truth. The decoder uses learned abstractions to reconstruct the image. With skip connections, S-WGAN transfers image details to the decoder. Additionally, we propose a Wasserstein-Perceptual loss function to preserve colour and maintain realism on a reconstructed image. We evaluate our method and the state-of-the-art methods on CelebA-HQ dataset. Our results show S-WGAN produces sharper and more realistic images when visually compared with other methods. The quantitative measures show our proposed S-WGAN achieves the best Structure Similarity Index Measure (SSIM) of 0.94Show more
Downloadable Archival Material, 2020-01-11
Undefined
Publisher:2020-01-11

Symmetric Skip Connection Wasserstein GAN for High-Resolution Facial Image Inpainting

Jam, Jireh; Kendrick, Connah  2020 

Full Text Online; Journal Article

01/2020   MSC Class: 62B99 

r. Additionally, we propose a Wasserstein-Perceptual loss function to pre…

 Cited by 6 Related articles All 3 versions

2020  Peer-reviewed
An Improved Defect Detection Method of Water Walls Using the WGAN
Authors: Zhang Y., Wang Y., Ding Y., Lu L., Yang J., Xu Z., Ma B., Lin X. 2020 4th International Conference on Electrical, Automation and Mechanical Engineering, EAME 2020
Article, 2020
Publication:Journal of Physics: Conference Series, 1626, 2020 11 06
Publisher:2020


2020
Towards Generalized Implementation of Wasserstein Distance in GANs
Authors:Xu, Minkai (Creator), Zhou, Zhiming (Creator), Lu, Guansong (Creator), Tang, Jian (Creator), Zhang, Weinan (Creator), Yu, Yong (Creator)
Summary:Wasserstein GANs (WGANs), built upon the Kantorovich-Rubinstein (KR) duality of Wasserstein distance, is one of the most theoretically sound GAN models. However, in practice it does not always outperform other variants of GANs. This is mostly due to the imperfect implementation of the Lipschitz condition required by the KR duality. Extensive work has been done in the community with different implementations of the Lipschitz constraint, which, however, is still hard to satisfy the restriction perfectly in practice. In this paper, we argue that the strong Lipschitz constraint might be unnecessary for optimization. Instead, we take a step back and try to relax the Lipschitz constraint. Theoretically, we first demonstrate a more general dual form of the Wasserstein distance called the Sobolev duality, which relaxes the Lipschitz constraint but still maintains the favorable gradient property of the Wasserstein distance. Moreover, we show that the KR duality is actually a special case of the Sobolev duality. Based on the relaxed duality, we further propose a generalized WGAN training scheme named Sobolev Wasserstein GAN (SWGAN), and empirically demonstrate the improvement of SWGAN over existing methods with extensive experimentsShow more
Downloadable Archival Material, 2020-12-06
Undefined
Publisher:2020-12-06
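The Kantorovich-Rubinstein duality this abstract starts from, stated for reference in the 1-Wasserstein case (this is the form a WGAN critic approximates with a 1-Lipschitz function f):

```latex
W_1(P, Q) \;=\; \sup_{\|f\|_{\mathrm{Lip}} \le 1}\;
  \mathbb{E}_{x \sim P}[f(x)] \;-\; \mathbb{E}_{x \sim Q}[f(x)].
```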

<——2020——–2020—––3630——


When OT meets MoM: Robust estimation of Wasserstein Distance
Authors: Staerman, Guillaume (Creator), Laforgue, Pierre (Creator), Mozharovskyi, Pavlo (Creator), d'Alché-Buc, Florence (Creator)

Summary:Issued from Optimal Transport, the Wasserstein distance has gained importance in Machine Learning due to its appealing geometrical properties and the increasing availability of efficient approximations. In this work, we consider the problem of estimating the Wasserstein distance between two probability distributions when observations are polluted by outliers. To that end, we investigate how to leverage Medians of Means (MoM) estimators to robustify the estimation of Wasserstein distance. Exploiting the dual Kantorovitch formulation of Wasserstein distance, we introduce and discuss novel MoM-based robust estimators whose consistency is studied under a data contamination model and for which convergence rates are provided. These MoM estimators enable to make Wasserstein Generative Adversarial Network (WGAN) robust to outliers, as witnessed by an empirical study on two benchmarks CIFAR10 and Fashion MNIST. Eventually, we discuss how to combine MoM with the entropy-regularized approximation of the Wasserstein distance and propose a simple MoM-based re-weighting scheme that could be used in conjunction with the Sinkhorn algorithm

Downloadable Archival Material, 2020-06-18

Undefined

Publisher:2020-06-18 
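Since the abstract mentions combining the MoM reweighting with the Sinkhorn algorithm, here is a minimal NumPy sketch of plain Sinkhorn iterations for entropy-regularized optimal transport between two histograms (the MoM reweighting itself is not reproduced here):

```python
import numpy as np

def sinkhorn(a, b, C, reg=0.1, n_iter=200):
    """Entropy-regularized OT between histograms a and b with cost matrix C:
    alternately rescale rows and columns of the Gibbs kernel K = exp(-C/reg)."""
    K = np.exp(-C / reg)
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]   # regularized transport plan
    return np.sum(P * C)              # transport cost under that plan

# Toy usage: two histograms on 5 grid points with squared-distance cost.
x = np.linspace(0.0, 1.0, 5)
C = (x[:, None] - x[None, :]) ** 2
a = np.array([0.4, 0.3, 0.2, 0.1, 0.0]) + 1e-9
b = np.array([0.0, 0.1, 0.2, 0.3, 0.4]) + 1e-9
print(sinkhorn(a / a.sum(), b / b.sum(), C))
```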


Spectral Unmixing With Multinomial Mixture Kernel and Wasserstein Generative Adversarial Loss
Authors:Ozkan, Savas (Creator), Akar, Gozde Bozdagi (Creator)
Summary:This study proposes a novel framework for spectral unmixing by using 1D convolution kernels and spectral uncertainty. High-level representations are computed from data, and they are further modeled with the Multinomial Mixture Model to estimate fractions under severe spectral uncertainty. Furthermore, a new trainable uncertainty term based on a nonlinear neural network model is introduced in the reconstruction step. All uncertainty models are optimized by Wasserstein Generative Adversarial Network (WGAN) to improve stability and capture uncertainty. Experiments are performed on both real and synthetic datasets. The results validate that the proposed method obtains state-of-the-art performance, especially for the real datasets compared to the baselines. Project page at: https://github.com/savasozkan/dscnShow more
Downloadable Archival Material, 2020-12-12
Undefined
Publisher:2020-12-12


alexis jacq (@Alexis_D_Jacq) / Twitter

mobile.twitter.com › alexis_d_jacq

mobile.twitter.com › alexis_d_jacq

Research Scientist at Google Brain ... colab.research.google.com ... that calculates the Wasserstein distance (aka the earth mover's distance) between ...

Twitter · 

Sep 15, 2020


Hanjun Dai (@hanjundai) / Twitter

mobile.twitter.com › hanjundai

mobile.twitter.com › hanjundai

Discrete Langevin Sampler via Wasserstein Gradient Flow. Recently, a family of locally balanced (LB) ... Our team at Google Brain (w/ Dale Schuurmans,.

Twitter · 

Oct 2, 2020


Research - Abacus.AI

abacus.ai › research

abacus.ai › research

... of GANs including Wasserstein GANs and MMD GANs address some of these issues. ... Google Brain's scientists also explored attribution of predictions to ...

Abacus.AI - Effortlessly Embed Cutting Edge AI In Your ... · 

Jul 14, 2020


2020


Linear Optimal Transport Embedding: Provable Wasserstein classification for certain rigid transformations and perturbations

Authors:Moosmüller, Caroline (Creator), Cloninger, Alexander (Creator)

Summary:Discriminating between distributions is an important problem in a number of scientific fields. This motivated the introduction of Linear Optimal Transportation (LOT), which embeds the space of distributions into an $L^2$-space. The transform is defined by computing the optimal transport of each distribution to a fixed reference distribution, and has a number of benefits when it comes to speed of computation and to determining classification boundaries. In this paper, we characterize a number of settings in which LOT embeds families of distributions into a space in which they are linearly separable. This is true in arbitrary dimension, and for families of distributions generated through perturbations of shifts and scalings of a fixed distribution.We also prove conditions under which the $L^2$ distance of the LOT embedding between two distributions in arbitrary dimension is nearly isometric to Wasserstein-2 distance between those distributions. This is of significant computational benefit, as one must only compute $N$ optimal transport maps to define the $N^2$ pairwise distances between $N$ distributions. We demonstrate the benefits of LOT on a number of distribution classification problems


Downloadable Archival Material, 2020-08-
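A toy illustration of the LOT idea in one dimension, where the optimal map to any fixed reference is monotone and the embedding reduces to the quantile function, so pairwise L2 distances between embeddings recover W2 up to discretization. This is only a sketch of the concept, not the paper's construction:

```python
import numpy as np

def lot_embedding_1d(sample, n_points=200):
    """1-D sketch of a Linear Optimal Transport embedding: encode a distribution
    by its quantile function evaluated on a fixed grid of quantile levels."""
    qs = (np.arange(n_points) + 0.5) / n_points
    return np.quantile(sample, qs)

rng = np.random.default_rng(2)
e1 = lot_embedding_1d(rng.normal(1.0, 1.0, 5000))
e2 = lot_embedding_1d(rng.normal(-1.0, 2.0, 5000))
# L2 distance between embeddings approximates W2 between the two distributions.
print(np.sqrt(np.mean((e1 - e2) ** 2)))
```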


 


توظيف تقنية التقليص المويجي في تقدير نموذج الإنحدار الجمعي اللامعلمي المعمم "WGAM": دراسة مقارنة مع المحاكاة والتطبيق
Authors:حمودات، آلاء عبدالستار داؤدالطالب، بشار عبدالعزيز
Summary [translated from Arabic]: This study addresses the problem of data whose probability distribution is unknown and employs the nonparametric Generalized Additive Models (GAM) estimation method based on smoothing splines as smoothers, handled iteratively. The wavelet shrinkage technique is used to estimate the Wavelet Generalized Additive Model (WGAM) regression, proposed here as a data smoother by using certain wavelets as filters when computing the discrete wavelet transform. Several statistical criteria are used to compare the estimation methods through simulation and the analysis of real data. The efficiency of the proposed method is tested on data collected from Ibn Sina Teaching Hospital on cases of short stature; the wavelet shrinkage filters gave the best results compared with the ordinary GAM method and helped smooth the data, yielding the most efficient results.
[Arabic: Employing the Wavelet Shrinkage Technique in Estimating the Generalized Nonparametric Additive Regression Model "WGAM": A Comparative Study with Simulation and Application. Authors: Hammoudat, Alaa Abdul Sattar Daoud; Al-Talib, Bashar Abdul Aziz]
Article, 2020
Publication:Iraqi Journal of Statistical Science, 2020, 9
Publisher:2020



Peer-reviewed
Wasserstein autoencoders for collaborative filtering
Authors: Xiaofeng Zhang, Jingbin Zhong, Kai Liu
Summary:Abstract: The recommender systems have long been studied in the literature. The collaborative filtering is one of the most widely adopted recommendation techniques which is usually applied to the explicit data, e.g., rating scores. However, the implicit data, e.g., click data, is believed to be able to discover user’s latent preferences. Consequently, a number of research attempts have been made toward this issue. To the best of our knowledge, this paper is the first attempt to adapt the Wasserstein autoencoders to collaborative filtering problem. Particularly, we propose a new loss function by introducing an regularization term to learn a sparse low-rank representation form to represent latent variables. Then, we carefully design (1) a new cost function to minimize the data reconstruction error, and (2) the appropriate distance metrics for the calculation of KL divergence between the learned distribution of latent variables and the underlying true data distribution. Rigorous experiments are performed on three widely adopted datasets. Both the state-of-the-art approaches, e.g., Mult-VAE and Mult-DAE, and the baseline models are evaluated on these datasets. The promising experimental results demonstrate that the proposed approach is superior to the compared approaches with respect to evaluation criteria Recall@R and NDCG@RShow more
Article, 2020
Publication:Neural Computing and Applications, 33, 20200713, 2793
Publisher:2020

Peer-reviewed
On the Wasserstein distance for a martingale central limit theorem
Authors:Xiequan FanXiaohui Ma
Summary:We prove an upper bound on the Wasserstein distance between normalized martingales and the standard normal random variable, which extends a result of Röllin (2018). The proof is based on a method of Bolthausen (1982)Show more
Article, 2020
Publication:Statistics and Probability Letters, 167, 202012
Publisher:2020


The Spectral-Domain $\mathcal{W}_2$ Wasserstein Distance for Elliptical Processes and the Spectral-Domain Gelbrich Bound
Authors:Fang, Song (Creator), Zhu, Quanyan (Creator)
Summary:In this short note, we introduce the spectral-domain $\mathcal{W}_2$ Wasserstein distance for elliptical stochastic processes in terms of their power spectra. We also introduce the spectral-domain Gelbrich bound for processes that are not necessarily ellipticalShow more

Downloadable Archival Material, 2020-12-07
Undefined
Publisher:2020-12-07
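Both this note and the next rely on the Gelbrich bound, which for Gaussian laws, and more generally for elliptical laws sharing a density generator, is attained with equality. A hedged NumPy/SciPy sketch of that closed form (function name ours, not from the papers):

import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2(m1, S1, m2, S2):
    # W2 between N(m1, S1) and N(m2, S2); the same expression is the Gelbrich
    # lower bound for arbitrary laws with these means and covariances.
    cross = sqrtm(sqrtm(S1) @ S2 @ sqrtm(S1)).real
    return float(np.sqrt(np.sum((m1 - m2) ** 2) + np.trace(S1 + S2 - 2 * cross)))

m1, m2 = np.zeros(2), np.ones(2)
S1, S2 = np.eye(2), np.diag([2.0, 0.5])
print(gaussian_w2(m1, S1, m2, S2))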

<——2020——–2020—––3640——



Independent Elliptical Distributions Minimize Their $\mathcal{W}_2$ Wasserstein Distance from Independent Elliptical Distributions with the Same Density Generator
Authors:Fang, Song (Creator), Zhu, Quanyan (Creator)
Summary:This short note is on a property of the $\mathcal{W}_2$ Wasserstein distance which indicates that independent elliptical distributions minimize their $\mathcal{W}_2$ Wasserstein distance from given independent elliptical distributions with the same density generators. Furthermore, we examine the implications of this property in the Gelbrich bound when the distributions are not necessarily elliptical. Meanwhile, we also generalize the results to the cases when the distributions are not independent. The primary purpose of this note is for the referencing of papers that need to make use of this property or its implicationsShow more
Downloadable Archival Material, 2020-12-07
Undefined
Publisher:2020-12-07

Peer-reviewed
Tensor product and Hadamard product for the Wasserstein means
Authors:Jinmi HwangSejong Kim
Summary:As one of the least squares mean, we consider the Wasserstein mean of positive definite Hermitian matrices. We verify in this paper the inequalities of the Wasserstein mean related with a strictly positive and unital linear map, the identity of the Wasserstein mean for tensor product, and several inequalities of the Wasserstein mean for Hadamard productShow more
Article, 2020
Publication:Linear Algebra and Its Applications, 603, 20201015, 496
Publisher:2020
 
 

Peer-reviewed
Wasserstein Distance Estimates for Stochastic Integrals by Forward-Backward Stochastic Calculus
Authors:Jean-Christophe BretonNicolas Privault
Summary:Abstract: We prove Wasserstein distance bounds between the probability distributions of stochastic integrals with jumps, based on the integrands appearing in their stochastic integral representations. Our approach does not rely on the Stein equation or on the propagation of convexity property for Markovian semigroups, and makes use instead of forward-backward stochastic calculus arguments. This allows us to consider a large class of target distributions constructed using Brownian stochastic integrals and pure jump martingales, which can be specialized to infinitely divisible target distributions with finite Lévy measure and Gaussian componentsShow more
Article, 2020
Publication:Potential Analysis : An International Journal Devoted to the Interactions between Potential Theory, Probability Theory, Geometry and Functional Analysis, 56, 20200829, 1
Publisher:2020

 2020

Peer-reviewed
Data-driven distributionally robust chance-constrained optimization with Wasserstein metric
Authors:Ran JiMiguel A. Lejeune
Summary:Abstract: We study distributionally robust chance-constrained programming (DRCCP) optimization problems with data-driven Wasserstein ambiguity sets. The proposed algorithmic and reformulation framework applies to all types of distributionally robust chance-constrained optimization problems subjected to individual as well as joint chance constraints, with random right-hand side and technology vector, and under two types of uncertainties, called uncertain probabilities and continuum of realizations. For the uncertain probabilities (UP) case, we provide new mixed-integer linear programming reformulations for DRCCP problems. For the continuum of realizations case with random right-hand side, we propose an exact mixed-integer second-order cone programming (MISOCP) reformulation and a linear programming (LP) outer approximation. For the continuum of realizations (CR) case with random technology vector, we propose two MISOCP and LP outer approximations. We show that all proposed relaxations become exact reformulations when the decision variables are binary or bounded general integers. For DRCCP with individual chance constraint and random right-hand side under both the UP and CR cases, we also propose linear programming reformulations which need the ex-ante derivation of the worst-case value-at-risk via the solution of a finite series of linear programs determined via a bisection-type procedure. We evaluate the scalability and tightness of the proposed MISOCP and (MI)LP formulations on a distributionally robust chance-constrained knapsack problemShow more
Article, 2020
Publication:Journal of Global Optimization : An International Journal Dealing with Theoretical and Computational Aspects of Seeking Global Optima and Their Applications in Science, Management and Engineering, 79, 20201117, 779
Publisher:2020
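The Wasserstein ambiguity sets referred to above take the following generic form (notation ours and hedged, not copied from the paper): a ball of radius $\varepsilon$ around the empirical distribution, inside which the chance constraint must hold for every candidate distribution.

\[
  \mathcal{B}_\varepsilon(\widehat{P}_N) \;=\; \bigl\{\, Q \;:\; W\bigl(Q,\widehat{P}_N\bigr) \le \varepsilon \,\bigr\},
  \qquad
  \inf_{Q \in \mathcal{B}_\varepsilon(\widehat{P}_N)} Q\bigl(\xi : g(x,\xi) \le 0\bigr) \;\ge\; 1-\alpha,
\]

where $\widehat{P}_N$ is the empirical distribution of the $N$ samples, $\varepsilon$ quantifies the level of confidence, and $g(x,\xi)\le 0$ stands for the (individual or joint) chance constraint.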
 
Peer-reviewed
Adapted Wasserstein distances and stability in mathematical finance
Authors:Julio Backhoff-VeraguasDaniel BartlMathias BeiglböckManu Eder
Summary:Assume that an agent models a financial asset through a measure Q with the goal to price/hedge some derivative or optimise some expected utility. Even if the model Q is chosen in the most skilful and sophisticated way, the agent is left with the possibility that Q does not provide an exact description of reality. This leads us to the following question: will the hedge still be somewhat meaningful for models in the proximity of Q? If we measure proximity with the usual Wasserstein distance (say), the answer is No. Models which are similar with respect to the Wasserstein distance may provide dramatically different information on which to base a hedging strategy. Remarkably, this can be overcome by considering a suitable adapted version of the Wasserstein distance which takes the temporal structure of pricing models into account. This adapted Wasserstein distance is most closely related to the nested distance as pioneered by Pflug and Pichler (SIAM J. Optim. 20:1406–1420, 2009, SIAM J. Optim. 22:1–23, 2012, Multistage Stochastic Optimization, 2014). It allows us to establish Lipschitz properties of hedging strategies for semimartingale models in discrete and continuous time. Notably, these abstract results are sharp already for Brownian motion and European call options.
Article
Publication:Finance and stochastics, 24, 2020

 
Peer-reviewed
Wasserstein upper bounds of the total variation for smooth densities
Authors:Minwoo ChaeStephen G. Walker
Summary:The total variation distance between probability measures cannot be bounded by the Wasserstein metric in general. If we consider sufficiently smooth probability densities, however, it is possible to bound the total variation by a power of the Wasserstein distance. We provide a sharp upper bound which depends on the Sobolev norms of the densities involvedShow more

 

Peer-reviewed
Convergence rates of the blocked Gibbs sampler with random scan in the Wasserstein metric
Authors:Neng-Yi WangGuosheng Yin
Summary:This paper establishes explicit estimates of convergence rates for the blocked Gibbs sampler with random scan under the Dobrushin conditions. The estimates of convergence in the Wasserstein metric are obtained by taking purely analytic approachesShow more
Article
Publication:Stochastics, 92, 20200217, 265

Peer-reviewed
Exponential Contraction in Wasserstein Distances for Diffusion Semigroups with Negative Curvature
Author:Feng-Yu Wang
Summary:Abstract: Let $P_t$ be the (Neumann) diffusion semigroup generated by a weighted Laplacian on a complete connected Riemannian manifold M without boundary or with a convex boundary. It is well known that the Bakry-Emery curvature is bounded below by a positive constant $\lambda>0$ if and only if $W_p(\mu_1 P_t, \mu_2 P_t)\le e^{-\lambda t}\,W_p(\mu_1,\mu_2)$ holds for all $t\ge 0$, $p\ge 1$ and all probability measures $\mu_1$ and $\mu_2$ on M, where $W_p$ is the $L^p$ Wasserstein distance induced by the Riemannian distance. In this paper, we prove the exponential contraction $W_p(\mu_1 P_t, \mu_2 P_t)\le c\,e^{-\lambda t}\,W_p(\mu_1,\mu_2)$ for some constants $c,\lambda>0$ for a class of diffusion semigroups with negative curvature, where the constant c is essentially larger than 1. Similar results are derived for SDEs with multiplicative noise by using explicit conditions on the coefficients, which are new even for SDEs with additive noise.
Article, 2020
Publication:Potential Analysis : An International Journal Devoted to the Interactions between Potential Theory, Probability Theory, Geometry and Functional Analysis, 53, 20200206, 1123
Publisher:2020

<——2020——–2020—––3650——


 

Peer-reviewed
Characterization of probability distribution convergence in Wasserstein distance by $L^{p}$-quantization error function
Authors:Yating LiuGilles Pagès
Summary:We establish conditions to characterize probability measures by their $L^{p}$-quantization error functions in both $\mathbb{R}^{d}$ and Hilbert settings. This characterization is two-fold: static (identity of two distributions) and dynamic (convergence for the $L^{p}$-Wasserstein distance). We first propose a criterion on the quantization level $N$, valid for any norm on $\mathbb{R}^{d}$ and any order $p$ based on a geometrical approach involving the Voronoï diagram. Then, we prove that in the $L^{2}$-case on a (separable) Hilbert space, the condition on the level $N$ can be reduced to $N=2$, which is optimal. More quantization based characterization cases in dimension 1 and a discussion of the completeness of a distance defined by the quantization error function can be found at the end of this paperShow more
Downloadable Article
Publication:https://projecteuclid.org/euclid.bj/1580461576 Bernoulli, 26, 2020-05, 1171
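As a concrete reading of the quantity involved: the $L^2$-quantization error at level $N$ is the best root-mean-square error achievable when the distribution is represented by $N$ points, and it can be estimated with a k-means run. A hedged sketch assuming NumPy and scikit-learn (names and all parameters ours, purely illustrative):

import numpy as np
from sklearn.cluster import KMeans

def quantization_error(X, N):
    # empirical L2-quantization error e_{N,2}: RMS distance to the nearest of
    # the N optimized points (k-means centroids approximate the optimal grid)
    km = KMeans(n_clusters=N, n_init=10, random_state=0).fit(X)
    d2 = ((X - km.cluster_centers_[km.labels_]) ** 2).sum(axis=1)
    return float(np.sqrt(d2.mean()))

X = np.random.default_rng(0).normal(size=(2000, 2))
print([round(quantization_error(X, N), 3) for N in (2, 4, 8, 16)])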

 

Peer-reviewed
Regularized variational data assimilation for bias treatment using the Wasserstein metric
Authors:Sagar K. TamangArdeshir EbtehajDongmian ZouGilad Lerman
Summary:This article presents a new variational data assimilation (VDA) approach for the formal treatment of bias in both model outputs and observations. This approach relies on the Wasserstein metric, stemming from the theory of optimal mass transport, to penalize the distance between the probability histograms of the analysis state and an a priori reference dataset, which is likely to be more uncertain but less biased than both model and observations. Unlike previous bias-aware VDA approaches, the new Wasserstein metric VDA (WM-VDA) treats systematic biases of unknown magnitude and sign dynamically in both model and observations, through assimilation of the reference data in the probability domain, and can recover the probability histogram of the analysis state fully. The performance of WM-VDA is compared with the classic three-dimensional VDA (3D-Var) scheme for first-order linear dynamics and the chaotic Lorenz attractor. Under positive systematic biases in both model and observations, we consistently demonstrate a significant reduction in the forecast bias and unbiased root-mean-squared error.

Classical variational data assimilation techniques used for improving the forecast skill of weather and land models are based on the unrealistic assumption of zero-mean Gaussian errors. The Wasserstein metric variational data assimilation (WM-VDA) developed here assimilates climatologically unbiased information dynamically in probability space. As shown in the figure, under systematic biases in both model output and observation, WM-VDA (right) demonstrates better performance for bias treatment (represented by lighter colour) compared with the classic three-dimensional variational algorithm (left).
Article, 2020
Publication:Quarterly Journal of the Royal Meteorological Society, 146, July 2020 Part A, 2332
Publisher:2020

Peer-reviewed
Density estimation of multivariate samples using Wasserstein distance
Authors:E. LuiniP. Arbenz
Summary:Density estimation is a central topic in statistics and a fundamental task of machine learning. In this paper, we present an algorithm for approximating multivariate empirical densities with a piecewise constant distribution defined on a hyperrectangular-shaped partition of the domain. The piecewise constant distribution is constructed through a hierarchical bisection scheme, such that locally, the sample cannot be statistically distinguished from a uniform distribution. The Wasserstein distance has been used to measure the uniformity of the sample data points lying in each partition element. Since the resulting density estimator requires significantly less memory to be stored, it can be used in a situation where the information contained in a multivariate sample needs to be preserved, transferred or analysedShow more
Article
Publication:Journal of Statistical Computation and Simulation, 90, 20200122, 181
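The split criterion described above, namely checking whether the sample in a cell is statistically indistinguishable from a uniform distribution, can be illustrated in one coordinate with SciPy's empirical Wasserstein distance. A hedged sketch (the function name and the discretised uniform reference are ours; the paper works with a multivariate, hierarchical version of this test):

import numpy as np
from scipy.stats import wasserstein_distance

def uniformity_gap(points, lo, hi):
    # W1 distance between the empirical sample on [lo, hi] and a discretised
    # uniform reference on the same interval; large values suggest splitting.
    u = np.linspace(lo, hi, 512)
    return wasserstein_distance(points, u)

rng = np.random.default_rng(0)
print(uniformity_gap(rng.uniform(0, 1, 500), 0, 1))   # small
print(uniformity_gap(rng.beta(5, 1, 500), 0, 1))      # noticeably larger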


Peer-reviewed
FRWCAE: joint faster-RCNN and Wasserstein convolutional auto-encoder for instance retrieval
Authors:Yi-yang ZhangYong FengDa-jiang LiuJia-xing ShangBao-hua Qiang
Summary:Abstract: Based on the powerful feature extraction capability of deep convolutional neural networks, image-level retrieval methods have achieved superior performance compared to hand-crafted features and indexing algorithms. However, people tend to focus on foreground objects of interest in images. Locating objects accurately and using object-level features for retrieval become the essential tasks of instance search. In this work, we propose a novel instance retrieval method FRWCAE, which combines the Faster R-CNN framework for object-level feature extraction with a brand-new Wasserstein Convolutional Auto-encoder for dimensionality reduction. In addition, we propose a considerate category-first spatial re-rank strategy to improve instance-level retrieval accuracy. Extensive experiments on four large datasets, Oxford 5K, Paris 6K, Oxford 105K and Paris 106K, show that our approach achieves significant performance gains compared to the state of the art.
Article, 2020
Publication:Applied Intelligence : The International Journal of Research on Intelligent Systems for Real Life Complex Problems, 50, 20200302, 2208
Publisher:2020
 
Peer-reviewed
Wasserstein GAN based on Autoencoder with back-translation for cross-lingual embedding mappings
Authors:Yuhong ZhangYuling LiYi ZhuXuegang Hu
Summary:• We propose a novel framework to learn cross-lingual word embeddings with a supervised manner. • We propose a back-translation with target-side to improve the performance of GANs based model. • We impose a weak orthogonal constraint in our model. • We design a series of experiments on three language pairs.

Recent works about learning cross-lingual word mappings (CWMs) focus on relaxing the requirement of bilingual signals through generative adversarial networks (GANs). GANs based models intend to enforce source embedding space to align target embedding space. However, existing GANs based models cannot exploit the underlying information of target-side for an alignment standard in the training, which may lead to some suboptimal results of CWMs. To address this problem, we propose a novel method, named Wasserstein GAN based on autoencoder with back-translation (ABWGAN) that can effectively exploit the target-side information and improve the performance of GANs based models. ABWGAN is an innovative combination of preliminary mappings learning and back-translation with target-side (BT-TS). In the proposed BT-TS, we back-translate target-side embeddings with preliminary CWMs to learn the final cross-lingual mappings, which enables to improve the quality of the preliminary mappings by reusing the target-side samples. Experimental results on three language pairs demonstrate the effectiveness of the proposed ABWGANShow more
Article, 2020
Publication:Pattern Recognition Letters, 129, 202001, 311
Publisher:2020

Cited by 10 Related articles All 2 versions

2020



Accelerated WGAN update strategy with loss change rate balancing
Authors:Ouyang, Xu (Creator), Agam, Gady (Creator)
Summary:Optimizing the discriminator in Generative Adversarial Networks (GANs) to completion in the inner training loop is computationally prohibitive, and on finite datasets would result in overfitting. To address this, a common update strategy is to alternate between k optimization steps for the discriminator D and one optimization step for the generator G. This strategy is repeated in various GAN algorithms where k is selected empirically. In this paper, we show that this update strategy is not optimal in terms of accuracy and convergence speed, and propose a new update strategy for Wasserstein GANs (WGAN) and other GANs using the WGAN loss (e.g. WGAN-GP, Deblur GAN, and Super-resolution GAN). The proposed update strategy is based on a loss change ratio comparison of G and D. We demonstrate that the proposed strategy improves both convergence speed and accuracy.
Downloadable Archival Material, 2020-08-27
Undefined
Publisher:2020-08-27
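For reference, the fixed-k schedule that the entry above revisits looks as follows in a minimal, runnable PyTorch sketch on a toy 1-D target (weight clipping stands in for the Lipschitz constraint; the network sizes, k = 5 and all other hyperparameters are arbitrary illustrative choices, not taken from the paper):

import torch
import torch.nn as nn

critic = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Linear(64, 1))
gen = nn.Sequential(nn.Linear(4, 64), nn.ReLU(), nn.Linear(64, 1))
opt_c = torch.optim.RMSprop(critic.parameters(), lr=5e-5)
opt_g = torch.optim.RMSprop(gen.parameters(), lr=5e-5)
k = 5  # empirically chosen critic/generator update ratio

for step in range(2000):
    for _ in range(k):                                   # k critic updates ...
        real = torch.randn(128, 1) * 0.5 + 3.0           # toy target N(3, 0.25)
        fake = gen(torch.randn(128, 4)).detach()
        loss_c = -(critic(real).mean() - critic(fake).mean())
        opt_c.zero_grad(); loss_c.backward(); opt_c.step()
        for p in critic.parameters():                    # crude Lipschitz control
            p.data.clamp_(-0.01, 0.01)
    fake = gen(torch.randn(128, 4))                      # ... then one generator update
    loss_g = -critic(fake).mean()
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()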
 


Peer-reviewed
An Improved Defect Detection Method of Water Walls Using the WGAN
Authors:Zhang Y.Wang Y.Ding Y.Lu L.Yang J.Xu Z.Ma B.Lin X.
2020 4th International Conference on Electrical, Automation and Mechanical Engineering, EAME 2020


Article, 2020
Publication:Journal of Physics: Conference Series, 1626, 2020 11 06
Publisher:2020


Insulator object detection based on image deblurring by WGAN
Authors:Wang D.Li Y.
Article, 2020
Publication:Dianli Zidonghua Shebei/Electric Power Automation Equipment, 40, 2020 05 10, 188
Publisher:2020

Multi-Band Image Synchronous Super-Resolution and Fusion Method Based on Improved WGAN-GP
Authors:Tian S.Lin S.Lei H.Li D.Wang L.
Article, 2020
Publication:Guangxue Xuebao/Acta Optica Sinica, 40, 2020 10 25
Publisher:2020

<——2020——–2020—––3660——



Capacity allocation of new energy source based on wind and solar resource scenario simulation using WGAN and sequential production simulation
Authors:Ma Y.Fu Y.Zhao S.Yang X.Wang Z.Zeng F.Dong L.
 
Article, 2020
Publication:Dianli Zidonghua Shebei/Electric Power Automation Equipment, 40, 2020 11 10, 77
Publisher:2020

Generating and utilizing targeted adversarial examples by AE-WGAN transformation
Authors:Zhang J.Zhang Z.
Article, 2020
Publication:Nanjing Youdian Daxue Xuebao (Ziran Kexue Ban)/Journal of Nanjing University of Posts and Telecommunications (Natural Science), 40, 2020 02 01, 63
Publisher:2020


Eye in-painting using WGAN-GP for face images with mosaic
Authors:Ammar Amjad, Hsien-Tsung Chang, Ruidan Su, Cheng-Hsuan Wu
Third International Conference on Image, Video Processing and Artificial Intelligence (paper 2570944), 2020-08-21 to 2020-08-23, Shanghai, China; Proceedings of SPIE 11584, Image Processing and Applications
Summary:In order to protect personal privacy, news reports often use the mosaics upon the face of the protagonist in the photo. However, readers will feel uncomfortable and awkward to this kind of photos. In this research, we detect the eye mosaic and try to use eye complementing which is not the same with original picture but matches the nearby texture. It can arouse readers' interest in reading. Traditional in-painting research is not suitable for filling special or large objects, such as eyes or boxes on the ground. They only can fill a small area of missing parts or a single background refer to nearby textures, such as landscape photos. We use WGAN-GP that can refer to nearby textures to generate special objects for in-painting eyes. We also divide the training set into male and female according to gender to avoid eye makeup appearing in all pictures. The experiment result shows our method get higher score in blind testingShow more
Chapter, 2020
Publication:11584, 20201110, 115840O
Publisher:2020
 
정칙화 항에 기반한 WGAN 립쉬츠 연속 안정화 기법 제안 [Korean: A Proposal for Stabilizing the Lipschitz Continuity of WGAN Based on a Regularization Term]
Authors:한희일Hee-Il Hahn
Summary:[Korean abstract; English translation] The recently proposed Wasserstein generative adversarial network (WGAN) has improved some of the tricky and unstable training behaviour that is a chronic problem of the generative adversarial network (GAN), but there are still cases where training fails to converge or unnatural samples are generated. To solve these problems, this paper proposes algorithms that improve the sampling process so that the discriminator can more accurately estimate the data probability distribution to be modeled, while stably maintaining the Lipschitz continuity of the discriminator function. Through various experiments, we analyze the characteristics of the proposed techniques and verify their performance.
Downloadable Article, 2020
Publication:The journal of the institute of internet, broadcasting and communication : JIIBC, 20, 2020, 239
Publisher:2020

VAE/WGAN-BASED IMAGE REPRESENTATION LEARNING FOR POSE-PRESERVING SEAMLESS IDENTITY REPLACEMENT IN FACIAL IMAGES
Show more
Authors:Chen J.Ishwar P.Konrad J.Kawai H.
Article, 2020
Publication:arXiv, 2020 03 01
Publisher:2020
 


A Generative Steganography Method Based on WGAN-GP
Authors:Li J.Niu K.Liu J.Lei Y.Zhang M.Liao L.Wang L.
6th International Conference on Artificial Intelligence and Security, ICAIS 2020
Article, 2020
Publication:Communications in Computer and Information Science, 1252 CCIS, 2020, 386
Publisher:2020
 
WGAN-E: A generative adversarial networks for facial feature security
Authors:Wu C.Ju B.Zhang S.Wu Y.Xiong N.N.
 Article, 2020
Publication:Electronics (Switzerland), 9, 2020 03 01
Publisher:2020

Many-Objective Estimation of Distribution Optimization Algorithm Based on WGAN-GP
Authors:Liang Z.Li Y.Wan Z.
Article, 2020
Publication:arXiv, 2020 03 15
Publisher:2020
 
Eye in-painting using WGAN-GP for face images with mosaic
Authors:Wu C.-H.Chang H.-T.Amjad A.
2020 International Conference on Image, Video Processing and Artificial Intelligence
Article, 2020
Publication:Proceedings of SPIE - The International Society for Optical Engineering, 11584, 2020
Publisher:2020
 
GS-WGAN: A gradient-sanitized approach for learning differentially private generators
Authors:Chen D.Fritz M.Orekondy T.
Article, 2020
Publication:arXiv, 2020 06 15
Publisher:2020

<——2020——–2020—––3670——


WGAN-based Autoencoder Training Over-The-Air
Authors:Dorner S.Henninger M.Cammerer S.Ten Brink S.
21st IEEE International Workshop on Signal Processing Advances in Wireless Communications, SPAWC 2020
2020
Publication:IEEE Workshop on Signal Processing Advances in Wireless Communications, SPAWC, 2020-May, 2020 05 01
Publisher:2020


TextureWGAN: Texture preserving WGAN with MLE regularizer for inverse problems
Authors:Ikuta M.Zhang J.
Article, 2020
Publication:arXiv, 2020 08 11
Publisher:2020
 


Adaptive WGAN with loss change rate balancing
Authors:Ouyang X.Agam G.
Article, 2020
Publication:arXiv, 2020 08 27
Publisher:2020


使用WGAN-GP對臉部馬賽克進行眼睛補圖 = Eye In-painting Using WGAN-GP for Face Images with Mosaic
/ Shi yong WGAN-GP dui lian bu ma sai ke jin xing yan jing bu tu = Eye In-painting Using WGAN-GP for Face Images with Mosaic
Authors:吳承軒 (Cheng-Hsuan Wu), 張賢宗 (Hsien-Tsung Chang)

Peer-reviewed
Wasserstein based transfer network for cross-domain sentiment classification
Authors:Yongping DuMeng HeLulin WangHaitong Zhang
Summary:Automatic sentiment analysis of social media texts is of great significance for identifying people’s opinions that can help people make better decisions. Annotating data is time consuming and laborious, and effective sentiment analysis on domains lacking of labeled data has become a problem. Cross-domain sentiment classification is a promising task, which leverages the source domain data with rich sentiment labels to analyze the sentiment polarity of the target domain lacking supervised information. Most of the existing researches usually explore algorithms that select common features manually to bridge different domains. In this paper, we propose a Wasserstein based Transfer Network (WTN) to share the domain-invariant information of source and target domains. We benefit from BERT to achieve rich knowledge and obtain deep level semantic information of text. The recurrent neural network with attention is used to capture features automatically, and Wasserstein distance is applied to estimate feature representations of source and target domains, which could help to capture significant domain-invariant features by adversarial training. Extensive experiments on Amazon datasets demonstrate that WTN outperforms other state-of-the-art methods significantly. Especially, the model behaves more stable across different domainsShow more
Article, 2020
Publication:Knowledge-Based Systems, 204, 20200927
Publisher:2020
 

2020


Peer-reviewed
Wasserstein distributionally robust shortest path problem
Authors:Zhuolin WangKeyou YouShiji SongYuli Zhang
Summary:This paper proposes a data-driven distributionally robust shortest path (DRSP) model where the distribution of the travel time in the transportation network can only be partially observed through a finite number of samples. Specifically, we aim to find an optimal path to minimize the worst-case α-reliable mean-excess travel time (METT) over a Wasserstein ball, which is centered at the empirical distribution of the sample dataset and the ball radius quantifies the level of its confidence. In sharp contrast to the existing DRSP models, our model is equivalently reformulated as a tractable mixed 0-1 convex problem, e.g., 0-1 linear program or 0-1 second-order cone program. Moreover, we also explicitly derive the distribution achieving the worst-case METT by simply perturbing each sample. Experiments demonstrate the advantages of our DRSP model in terms of the out-of-sample performance and computational complexity. Finally, our DRSP model is easily extended to solve the distributionally robust bi-criteria shortest path problem and the minimum cost flow problemShow more
Article
Publication:European Journal of Operational Research, 284, 2020-07-01, 31

Peer-reviewed
Convergence rate to equilibrium in Wasserstein distance for reflected jump-diffusions
Author:Andrey Sarantsev
Summary:Convergence rate to the stationary distribution for continuous-time Markov processes can be studied using Lyapunov functions. Recent work by the author provided explicit rates of convergence in special case of a reflected jump-diffusion on a half-line. These results are proved for total variation distance and its generalizations: measure distances defined by test functions regardless of their continuity. Here we prove similar results for Wasserstein distance, convergence in which is related to convergence for continuous test functions. In some cases, including the reflected Ornstein-Uhlenbeck process, we get faster exponential convergence rates for Wasserstein distance than for total variation distanceShow more
Article
Publication:Statistics and Probability Letters, 165, October 2020


Peer-reviewed
A Rademacher-type theorem on $L^2$-Wasserstein spaces over closed Riemannian manifolds
Author:Lorenzo Dello Schiavo
Summary:Let $P$ be any Borel probability measure on the $L^2$-Wasserstein space $(P_2(M), W_2)$ over a closed Riemannian manifold M. We consider the Dirichlet form $E$ induced by $P$ and by the Wasserstein gradient on $P_2(M)$. Under natural assumptions on $P$, we show that $W_2$-Lipschitz functions on $P_2(M)$ are contained in the Dirichlet space $D(E)$ and that $W_2$ is dominated by the intrinsic metric induced by $E$. We illustrate our results by giving several detailed examples.
Article, 2020
Publication:Journal of Functional Analysis, 278, 20200401
Publisher:2020

 
Peer-reviewed
Generative adversarial networks based on Wasserstein distance for knowledge graph embeddings
Authors:Yuanfei DaiShiping WangXing ChenChaoyang XuWenzhong Guo
Summary:Knowledge graph embedding aims to project entities and relations into low-dimensional and continuous semantic feature spaces, which has captured more attention in recent years. Most of the existing models roughly construct negative samples via a uniformly random mode, by which these corrupted samples are practically trivial for training the embedding model. Inspired by generative adversarial networks (GANs), the generator can be employed to sample more plausible negative triplets, that boosts the discriminator to improve its embedding performance further. However, vanishing gradient on discrete data is an inherent problem in traditional GANs. In this paper, we propose a generative adversarial network based knowledge graph representation learning model by introducing the Wasserstein distance to replace traditional divergence for settling this issue. Moreover, the additional weak supervision information is also absorbed to refine the performance of embedding model since these textual information contains detailed semantic description and offers abundant semantic relevance. In the experiments, we evaluate our method on the tasks of link prediction and triplet classification. The experimental results indicate that the Wasserstein distance is capable of solving the problem of vanishing gradient on discrete data and accelerating the convergence, additional weak supervision information also can significantly improve the performance of the modelShow more
Article, 2020
Publication:Knowledge-Based Systems, 190, 20200229
Publisher:2020

Peer-reviewed
Irregularity of Distribution in Wasserstein Distance
Author:Cole Graham
Summary:Abstract: We study the non-uniformity of probability measures on the interval and circle. On the interval, we identify the Wasserstein-p distance with the classical $L^p$-discrepancy. We thereby derive sharp estimates in Wasserstein distances for the irregularity of distribution of sequences on the interval and circle. Furthermore, we prove an $L^p$-adapted Erdős–Turán inequality, and use it to extend a well-known bound of Pólya and Vinogradov on the equidistribution of quadratic residues in finite fields.
Article, 2020
Publication:Journal of Fourier Analysis and Applications, 26, 20200929
Publisher:2020

<——2020——–2020—––3680——


 
A Wasserstein Coupled Particle Filter for Multilevel Estimation
Authors:Ballesio, Marco (Creator), Jasra, Ajay (Creator), von Schwerin, Erik (Creator), Tempone, Raul (Creator)
Summary:In this paper, we consider the filtering problem for partially observed diffusions, which are regularly observed at discrete times. We are concerned with the case when one must resort to time-discretization of the diffusion process if the transition density is not available in an appropriate form. In such cases, one must resort to advanced numerical algorithms such as particle filters to consistently estimate the filter. It is also well known that the particle filter can be enhanced by considering hierarchies of discretizations and the multilevel Monte Carlo (MLMC) method, in the sense of reducing the computational effort to achieve a given mean square error (MSE). A variety of multilevel particle filters (MLPF) have been suggested in the literature, e.g., in Jasra et al., SIAM J, Numer. Anal., 55, 3068--3096. Here we introduce a new alternative that involves a resampling step based on the optimal Wasserstein coupling. We prove a central limit theorem (CLT) for the new method. On considering the asymptotic variance, we establish that in some scenarios, there is a reduction, relative to the approach in the aforementioned paper by Jasra et al., in computational effort to achieve a given MSE. These findings are confirmed in numerical examples. We also consider filtering diffusions with unstable dynamics; we empirically show that in such cases a change of measure technique seems to be required to maintain our findingsShow more
Downloadable Archival Material, 2020-04-08
Undefined
Publisher:2020-04-08

Hierarchical Low-Rank Approximation of Regularized Wasserstein Distance
Author:Motamed, Mohammad (Creator)
Summary:Sinkhorn divergence is a measure of dissimilarity between two probability measures. It is obtained through adding an entropic regularization term to Kantorovich's optimal transport problem and can hence be viewed as an entropically regularized Wasserstein distance. Given two discrete probability vectors in the $n$-simplex and supported on two bounded spaces in ${\mathbb R}^d$, we present a fast method for computing Sinkhorn divergence when the cost matrix can be decomposed into a $d$-term sum of asymptotically smooth Kronecker product factors. The method combines Sinkhorn's matrix scaling iteration with a low-rank hierarchical representation of the scaling matrices to achieve a near-linear complexity ${\mathcal O}(n \log^3 n)$. This provides a fast and easy-to-implement algorithm for computing Sinkhorn divergence, enabling its applicability to large-scale optimization problems, where the computation of classical Wasserstein metric is not feasible. We present a numerical example related to signal processing to demonstrate the applicability of quadratic Sinkhorn divergence in comparison with quadratic Wasserstein distance and to verify the accuracy and efficiency of the proposed methodShow more
Downloadable Archival Material, 2020-04-26
Undefined
Publisher:2020-04-26
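The matrix-scaling iteration referred to above is short enough to state in full; the hierarchical low-rank structure is what removes the dense n-by-n kernel that the following naive NumPy sketch (names and parameters ours) builds explicitly:

import numpy as np

def sinkhorn_cost(a, b, C, eps=0.05, iters=500):
    # Entropic OT between histograms a and b with cost matrix C:
    # alternately rescale rows and columns of the kernel K = exp(-C/eps).
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(iters):
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]      # approximate transport plan
    return float(np.sum(P * C))          # regularized transport cost

x = np.linspace(0, 1, 200)
a = np.exp(-((x - 0.3) ** 2) / 0.01); a /= a.sum()
b = np.exp(-((x - 0.7) ** 2) / 0.01); b /= b.sum()
C = (x[:, None] - x[None, :]) ** 2
print(sinkhorn_cost(a, b, C))   # roughly (0.7 - 0.3)**2 = 0.16, up to entropic bias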

S2A: Wasserstein GAN with Spatio-Spectral Laplacian Attention for Multi-Spectral Band Synthesis
Authors:Rout, Litu (Creator), Misra, Indranil (Creator), Moorthi, S Manthira (Creator), Dhar, Debajyoti (Creator)
Summary:Intersection of adversarial learning and satellite image processing is an emerging field in remote sensing. In this study, we intend to address synthesis of high resolution multi-spectral satellite imagery using adversarial learning. Guided by the discovery of attention mechanism, we regulate the process of band synthesis through spatio-spectral Laplacian attention. Further, we use Wasserstein GAN with gradient penalty norm to improve training and stability of adversarial learning. In this regard, we introduce a new cost function for the discriminator based on spatial attention and domain adaptation loss. We critically analyze the qualitative and quantitative results compared with state-of-the-art methods using widely adopted evaluation metrics. Our experiments on datasets of three different sensors, namely LISS-3, LISS-4, and WorldView-2 show that attention learning performs favorably against state-of-the-art methods. Using the proposed method we provide an additional data product in consistent with existing high resolution bands. Furthermore, we synthesize over 4000 high resolution scenes covering various terrains to analyze scientific fidelity. At the end, we demonstrate plausible large scale real world applications of the synthesized bandShow more
 
Downloadable Archival Material, 2020-04-08
Undefined
Publisher:2020-04-08
 

Wasserstein Exponential Kernels

Authors:De Plaen, Henri (Creator), Fanuel, Michaël (Creator), Suykens, Johan A. K. (Creator)
Summary:In the context of kernel methods, the similarity between data points is encoded by the kernel function which is often defined thanks to the Euclidean distance, a common example being the squared exponential kernel. Recently, other distances relying on optimal transport theory - such as the Wasserstein distance between probability distributions - have shown their practical relevance for different machine learning techniques. In this paper, we study the use of exponential kernels defined thanks to the regularized Wasserstein distance and discuss their positive definiteness. More specifically, we define Wasserstein feature maps and illustrate their interest for supervised learning problems involving shapes and images. Empirically, Wasserstein squared exponential kernels are shown to yield smaller classification errors on small training sets of shapes, compared to analogous classifiers using Euclidean distancesShow more
Downloadable Archival Material, 2020-02-05
Undefined
Publisher:2020-02-05
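A hedged sketch of the kind of kernel discussed above, on one-dimensional empirical distributions (SciPy's wasserstein_distance is the order-1 distance, whereas the paper works with the regularized Wasserstein distance, and positive definiteness of such kernels is exactly the point under discussion, so this is illustrative only; names ours):

import numpy as np
from scipy.stats import wasserstein_distance
from sklearn.svm import SVC

def wasserstein_rbf_gram(samples, sigma=1.0):
    # exp(-W(x, y)^2 / (2 sigma^2)) over all pairs of empirical distributions
    n = len(samples)
    W = np.array([[wasserstein_distance(samples[i], samples[j])
                   for j in range(n)] for i in range(n)])
    return np.exp(-(W ** 2) / (2 * sigma ** 2))

rng = np.random.default_rng(0)
samples = [rng.normal(loc=c, scale=0.5, size=200) for c in (0, 0, 3, 3)]
labels = [0, 0, 1, 1]
clf = SVC(kernel="precomputed").fit(wasserstein_rbf_gram(samples), labels)
print(clf.predict(wasserstein_rbf_gram(samples)))   # training-set sanity check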

Adversarial Classification via Distributional Robustness with Wasserstein Ambiguity
Authors:Ho-Nguyen, Nam (Creator), Wright, Stephen J. (Creator)
Summary:We study a model for adversarial classification based on distributionally robust chance constraints. We show that under Wasserstein ambiguity, the model aims to minimize the conditional value-at-risk of the distance to misclassification, and we explore links to adversarial classification models proposed earlier and to maximum-margin classifiers. We also provide a reformulation of the distributionally robust model for linear classification, and show it is equivalent to minimizing a regularized ramp loss objective. Numerical experiments show that, despite the nonconvexity of this formulation, standard descent methods appear to converge to the global minimizer for this problem. Inspired by this observation, we show that, for a certain class of distributions, the only stationary point of the regularized ramp loss minimization problem is the global minimizerShow more
Downloadable Archival Material, 2020-05-28
Undefined
Publisher:2020-05-28   


2020 2020 2020


Graph Wasserstein Correlation Analysis for Movie Retrieval

Authors:Zhang, Xueya (Creator), Zhang, Tong (Creator), Hong, Xiaobin (Creator), Cui, Zhen (Creator), Yang, Jian (Creator)
Summary:Movie graphs play an important role to bridge heterogenous modalities of videos and texts in human-centric retrieval. In this work, we propose Graph Wasserstein Correlation Analysis (GWCA) to deal with the core issue therein, i.e, cross heterogeneous graph comparison. Spectral graph filtering is introduced to encode graph signals, which are then embedded as probability distributions in a Wasserstein space, called graph Wasserstein metric learning. Such a seamless integration of graph signal filtering together with metric learning results in a surprise consistency on both learning processes, in which the goal of metric learning is just to optimize signal filters or vice versa. Further, we derive the solution of the graph comparison model as a classic generalized eigenvalue decomposition problem, which has an exactly closed-form solution. Finally, GWCA together with movie/text graphs generation are unified into the framework of movie retrieval to evaluate our proposed method. Extensive experiments on MovieGrpahs dataset demonstrate the effectiveness of our GWCA as well as the entire frameworkShow more
Downloadable Archival Material, 2020-08-06
Undefined
Publisher:2020-08-06


Wasserstein sensitivity of Risk and Uncertainty Propagation
Authors:Ernst, Oliver G. (Creator), Pichler, Alois (Creator), Sprungk, Björn (Creator)
Summary:When propagating uncertainty in the data of differential equations, the probability laws describing the uncertainty are typically themselves subject to uncertainty. We present a sensitivity analysis of uncertainty propagation for differential equations with random inputs to perturbations of the input measures. We focus on the elliptic diffusion equation with random coefficient and source term, for which the probability measure of the solution random field is shown to be Lipschitz-continuous in both total variation and Wasserstein distance. The result generalizes to the solution map of any differential equation with locally Hölder dependence on input parameters. In addition, these results extend to Lipschitz continuous quantities of interest of the solution as well as to coherent risk functionals of these applied to evaluate the impact of their uncertainty. Our analysis is based on the sensitivity of risk functionals and pushforward measures for locally Hölder mappings with respect to the Wasserstein distance of perturbed input distributions. The established results are applied, in particular, to the case of lognormal diffusion and the truncation of series representations of input random fields.
Downloadable Archival Material, 2020-03-06
Undefined
Publisher:2020-03-06 



Some Theoretical Insights into Wasserstein GANs
Authors:Biau, Gérard (Creator), Sangnier, Maxime (Creator), Tanielian, Ugo (Creator)
Summary:Generative Adversarial Networks (GANs) have been successful in producing outstanding results in areas as diverse as image, video, and text generation. Building on these successes, a large number of empirical studies have validated the benefits of the cousin approach called Wasserstein GANs (WGANs), which brings stabilization in the training process. In the present paper, we add a new stone to the edifice by proposing some theoretical advances in the properties of WGANs. First, we properly define the architecture of WGANs in the context of integral probability metrics parameterized by neural networks and highlight some of their basic mathematical features. We stress in particular interesting optimization properties arising from the use of a parametric 1-Lipschitz discriminator. Then, in a statistically-driven approach, we study the convergence of empirical WGANs as the sample size tends to infinity, and clarify the adversarial effects of the generator and the discriminator by underlining some trade-off properties. These features are finally illustrated with experiments using both synthetic and real-world datasetsShow more
Downloadable Archival Material, 2020-06-04
Undefined
Publisher:2020-06-04


Online Stochastic Optimization with Wasserstein Based Non-stationarity
Authors:Jiang, Jiashuo (Creator), Li, Xiaocheng (Creator), Zhang, Jiawei (Creator)
Summary:We consider a general online stochastic optimization problem with multiple budget constraints over a horizon of finite time periods. In each time period, a reward function and multiple cost functions are revealed, and the decision maker needs to specify an action from a convex and compact action set to collect the reward and consume the budget. Each cost function corresponds to the consumption of one budget. In each period, the reward and cost functions are drawn from an unknown distribution, which is non-stationary across time. The objective of the decision maker is to maximize the cumulative reward subject to the budget constraints. This formulation captures a wide range of applications including online linear programming and network revenue management, among others. In this paper, we consider two settings: (i) a data-driven setting where the true distribution is unknown but a prior estimate (possibly inaccurate) is available; (ii) an uninformative setting where the true distribution is completely unknown. We propose a unified Wasserstein-distance based measure to quantify the inaccuracy of the prior estimate in setting (i) and the non-stationarity of the system in setting (ii). We show that the proposed measure leads to a necessary and sufficient condition for the attainability of a sublinear regret in both settings. For setting (i), we propose a new algorithm, which takes a primal-dual perspective and integrates the prior information of the underlying distributions into an online gradient descent procedure in the dual space. The algorithm also naturally extends to the uninformative setting (ii). Under both settings, we show the corresponding algorithm achieves a regret of optimal order. In numerical experiments, we demonstrate how the proposed algorithms can be naturally integrated with the re-solving technique to further boost the empirical performanceShow more
Downloadable Archival Material, 2020-12-12
Undefined
Publisher:2020-12-12
 


Importance-Aware Semantic Segmentation in Self-Driving with Discrete Wasserstein Training
Authors:Liu, Xiaofeng (Creator), Han, Yuzhuo (Creator), Bai, Song (Creator), Ge, Yi (Creator), Wang, Tianxing (Creator), Han, Xu (Creator), Li, Site (Creator), You, Jane (Creator), Lu, Ju (Creator)Show more
Summary:Semantic segmentation (SS) is an important perception task for self-driving cars and robotics, which classifies each pixel into a pre-determined class. The widely used cross entropy (CE) loss-based deep networks have achieved significant progress w.r.t. the mean Intersection-over-Union (mIoU). However, the cross entropy loss cannot take the different importance of each class in a self-driving system into account. For example, pedestrians in the image should be much more important than the surrounding buildings when making driving decisions, so their segmentation results are expected to be as accurate as possible. In this paper, we propose to incorporate the importance-aware inter-class correlation in a Wasserstein training framework by configuring its ground distance matrix. The ground distance matrix can be pre-defined a priori for a specific task, and previous importance-agnostic methods become particular cases. From an optimization perspective, we also extend our ground metric to a linear, convex or concave increasing function w.r.t. the pre-defined ground distance. We evaluate our method on the CamVid and Cityscapes datasets with different backbones (SegNet, ENet, FCN and Deeplab) in a plug-and-play fashion. In our extensive experiments, the Wasserstein loss demonstrates superior segmentation performance on the predefined critical classes for safe driving.
Downloadable Archival Material, 2020-10-21
Undefined

Publisher:2020-10-21

<——2020——–2020—––3690——



A Wasserstein Minimum Velocity Approach to Learning Unnormalized Models
Authors:Wang, Ziyu (Creator), Cheng, Shuyu (Creator), Li, Yueru (Creator), Zhu, Jun (Creator), Zhang, Bo (Creator)
Summary:Score matching provides an effective approach to learning flexible unnormalized models, but its scalability is limited by the need to evaluate a second-order derivative. In this paper, we present a scalable approximation to a general family of learning objectives including score matching, by observing a new connection between these objectives and Wasserstein gradient flows. We present applications with promise in learning neural density estimators on manifolds, and training implicit variational and Wasserstein auto-encoders with a manifold-valued priorShow more
Downloadable Archival Material, 2020-02-18
Undefined
Publisher:2020-02-18

Improved Image Wasserstein Attacks and Defenses
Authors:Hu, J. Edward (Creator), Swaminathan, Adith (Creator), Salman, Hadi (Creator), Yang, Greg (Creator)
Summary:Robustness against image perturbations bounded by a $\ell_p$ ball have been well-studied in recent literature. Perturbations in the real-world, however, rarely exhibit the pixel independence that $\ell_p$ threat models assume. A recently proposed Wasserstein distance-bounded threat model is a promising alternative that limits the perturbation to pixel mass movements. We point out and rectify flaws in previous definition of the Wasserstein threat model and explore stronger attacks and defenses under our better-defined framework. Lastly, we discuss the inability of current Wasserstein-robust models in defending against perturbations seen in the real world. Our code and trained models are available at https://github.com/edwardjhu/improved_wassersteinShow more
Downloadable Archival Material, 2020-04-26
Undefined
Publisher:2020-04-26


SWIFT: Scalable Wasserstein Factorization for Sparse Nonnegative Tensors
Authors:Afshar, Ardavan (Creator), Yin, Kejing (Creator), Yan, Sherry (Creator), Qian, Cheng (Creator), Ho, Joyce C. (Creator), Park, Haesun (Creator), Sun, Jimeng (Creator)Show more
Summary:Existing tensor factorization methods assume that the input tensor follows some specific distribution (i.e. Poisson, Bernoulli, and Gaussian), and solve the factorization by minimizing some empirical loss functions defined based on the corresponding distribution. However, it suffers from several drawbacks: 1) In reality, the underlying distributions are complicated and unknown, making it infeasible to be approximated by a simple distribution. 2) The correlation across dimensions of the input tensor is not well utilized, leading to sub-optimal performance. Although heuristics were proposed to incorporate such correlation as side information under Gaussian distribution, they can not easily be generalized to other distributions. Thus, a more principled way of utilizing the correlation in tensor factorization models is still an open challenge. Without assuming any explicit distribution, we formulate the tensor factorization as an optimal transport problem with Wasserstein distance, which can handle non-negative inputs. We introduce SWIFT, which minimizes the Wasserstein distance that measures the distance between the input tensor and that of the reconstruction. In particular, we define the N-th order tensor Wasserstein loss for the widely used tensor CP factorization and derive the optimization algorithm that minimizes it. By leveraging sparsity structure and different equivalent formulations for optimizing computational efficiency, SWIFT is as scalable as other well-known CP algorithms. Using the factor matrices as features, SWIFT achieves up to 9.65% and 11.31% relative improvement over baselines for downstream prediction tasks. Under the noisy conditions, SWIFT achieves up to 15% and 17% relative improvements over the best competitors for the prediction tasksShow more
Downloadable Archival Material, 2020-10-08
Undefined



Scalable Computations of Wasserstein Barycenter via Input Convex Neural Networks
Authors:Fan, Jiaojiao (Creator), Taghvaei, Amirhossein (Creator), Chen, Yongxin (Creator)
Summary:Wasserstein Barycenter is a principled approach to represent the weighted mean of a given set of probability distributions, utilizing the geometry induced by optimal transport. In this work, we present a novel scalable algorithm to approximate the Wasserstein Barycenters aiming at high-dimensional applications in machine learning. Our proposed algorithm is based on the Kantorovich dual formulation of the Wasserstein-2 distance as well as a recent neural network architecture, input convex neural network, that is known to parametrize convex functions. The distinguishing features of our method are: i) it only requires samples from the marginal distributions; ii) unlike the existing approaches, it represents the Barycenter with a generative model and can thus generate infinite samples from the barycenter without querying the marginal distributions; iii) it works similar to Generative Adversarial Model in one marginal case. We demonstrate the efficacy of our algorithm by comparing it with the state-of-art methods in multiple experimentsShow more
Downloadable Archival Material, 2020-07-08
Undefined
Publisher:2020-07-08
Cited by 31
Related articles All 4 versions
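For intuition about the object the network above is trained to produce, the one-dimensional case has a closed form: the W2 barycenter's quantile function is the weighted average of the input quantile functions. A hedged NumPy sketch of that special case (names ours; the paper's contribution is the scalable, high-dimensional ICNN parametrization, not this formula):

import numpy as np

def barycenter_quantiles_1d(samples, weights=None, n_quantiles=200):
    # average the quantile functions of the input empirical distributions
    q = (np.arange(n_quantiles) + 0.5) / n_quantiles
    Q = np.stack([np.quantile(np.asarray(s), q) for s in samples])
    w = np.full(len(samples), 1.0 / len(samples)) if weights is None else np.asarray(weights)
    return w @ Q   # quantile function of the W2 barycenter on the grid q

rng = np.random.default_rng(0)
bary = barycenter_quantiles_1d([rng.normal(-2, 1, 1000), rng.normal(2, 1, 1000)])
print(bary.mean())   # ~0: the equal-weight barycenter of N(-2,1) and N(2,1) is ~N(0,1)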


Projection Robust Wasserstein Distance and Riemannian Optimization
Authors:Lin, Tianyi (Creator), Fan, Chenyou (Creator), Ho, Nhat (Creator), Cuturi, Marco (Creator), Jordan, Michael I. (Creator)
Summary:Projection robust Wasserstein (PRW) distance, or Wasserstein projection pursuit (WPP), is a robust variant of the Wasserstein distance. Recent work suggests that this quantity is more robust than the standard Wasserstein distance, in particular when comparing probability measures in high dimensions. However, it is ruled out for practical application because the optimization model is essentially non-convex and non-smooth, which makes the computation intractable. Our contribution in this paper is to revisit the original motivation behind WPP/PRW, but take the hard route of showing that, despite its non-convexity and non-smoothness, and even despite some hardness results proved by Niles-Weed and Rigollet (2019) in a minimax sense, the original formulation for PRW/WPP can be efficiently computed in practice using Riemannian optimization, yielding in relevant cases better behavior than its convex relaxation. More specifically, we provide three simple algorithms with solid theoretical guarantees on their complexity bounds (one in the appendix), and demonstrate their effectiveness and efficiency by conducting extensive experiments on synthetic and real data. This paper provides a first step into a computational theory of the PRW distance and provides the links between optimal transport and Riemannian optimization.
Downloadable Archival Material, 2020-06-12
Undefined
Publisher:2020-06-12


2020


Encoded Prior Sliced Wasserstein AutoEncoder for learning latent manifold representations
Authors:Krishnagopal, Sanjukta (Creator), Bedrossian, Jacob (Creator)
Summary:While variational autoencoders have been successful in several tasks, the use of conventional priors are limited in their ability to encode the underlying structure of input data. We introduce an Encoded Prior Sliced Wasserstein AutoEncoder wherein an additional prior-encoder network learns an embedding of the data manifold which preserves topological and geometric properties of the data, thus improving the structure of latent space. The autoencoder and prior-encoder networks are iteratively trained using the Sliced Wasserstein distance. The effectiveness of the learned manifold encoding is explored by traversing latent space through interpolations along geodesics which generate samples that lie on the data manifold and hence are more realistic compared to Euclidean interpolation. To this end, we introduce a graph-based algorithm for exploring the data manifold and interpolating along network-geodesics in latent space by maximizing the density of samples along the path while minimizing total energy. We use the 3D-spiral data to show that the prior encodes the geometry underlying the data unlike conventional autoencoders, and to demonstrate the exploration of the embedded data manifold through the network algorithm. We apply our framework to benchmarked image datasets to demonstrate the advantages of learning data representations in outlier generation, latent structure, and geodesic interpolationShow more
Downloadable Archival Material, 2020-10-02
Undefined
Publisher:2020-10-02
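The sliced Wasserstein distance used to train the two networks above reduces to one-dimensional transport along random directions, which is what makes it cheap to compute. A minimal NumPy sketch for equal-size point clouds (names and parameters ours):

import numpy as np

def sliced_w2(X, Y, n_proj=100, seed=0):
    # project both point clouds on random unit directions and average the
    # squared 1-D W2 distances, each obtained by sorting the projections
    rng = np.random.default_rng(seed)
    thetas = rng.normal(size=(n_proj, X.shape[1]))
    thetas /= np.linalg.norm(thetas, axis=1, keepdims=True)
    px = np.sort(X @ thetas.T, axis=0)
    py = np.sort(Y @ thetas.T, axis=0)
    return float(np.sqrt(((px - py) ** 2).mean()))

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 2))
Y = rng.normal(size=(500, 2)) + np.array([3.0, 0.0])
print(sliced_w2(X, Y))   # roughly |shift| / sqrt(2) for a pure translation in 2-D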


Lagrangian schemes for Wasserstein gradient flows
Authors:Carrillo, Jose A. (Creator), Matthes, Daniel (Creator), Wolfram, Marie-Therese (Creator)
Summary:This paper reviews different numerical methods for specific examples of Wasserstein gradient flows: we focus on nonlinear Fokker-Planck equations, but also discuss discretizations of the parabolic-elliptic Keller-Segel model and of the fourth order thin film equation. The methods under review are of Lagrangian nature, that is, the numerical approximations trace the characteristics of the underlying transport equation rather than solving the evolution equation for the mass density directly. The two main approaches are based on integrating the equation for the Lagrangian maps on the one hand, and on solution of coupled ODEs for individual mass particles on the other hand.
Downloadable Archival Material, 2020-03-08
Undefined
Publisher:2020-03-08


Wasserstein Generative Models for Patch-based Texture Synthesis
Authors:Houdard, Antoine (Creator), Leclaire, Arthur (Creator), Papadakis, Nicolas (Creator), Rabin, Julien (Creator)
Summary:In this paper, we propose a framework to train a generative model for texture image synthesis from a single example. To do so, we exploit the local representation of images via the space of patches, that is, square sub-images of fixed size (e.g. $4\times 4$). Our main contribution is to consider optimal transport to enforce the multiscale patch distribution of generated images, which leads to two different formulations. First, a pixel-based optimization method is proposed, relying on discrete optimal transport. We show that it is related to a well-known texture optimization framework based on iterated patch nearest-neighbor projections, while avoiding some of its shortcomings. Second, in a semi-discrete setting, we exploit the differential properties of Wasserstein distances to learn a fully convolutional network for texture generation. Once estimated, this network produces realistic and arbitrarily large texture samples in real time. The two formulations result in non-convex concave problems that can be optimized efficiently with convergence properties and improved stability compared to adversarial approaches, without relying on any regularization. By directly dealing with the patch distribution of synthesized images, we also overcome limitations of state-of-the art techniques, such as patch aggregation issues that usually lead to low frequency artifacts (e.g. blurring) in traditional patch-based approaches, or statistical inconsistencies (e.g. color or patterns) in learning approachesShow more
Downloadable Archival Material, 2020-06-19
Undefined
Publisher:2020-06-19



Stability of Gibbs Posteriors from the Wasserstein Loss for Bayesian Full Waveform Inversion
Authors:Dunlop, Matthew M. (Creator), Yang, Yunan (Creator)
Summary:Recently, the Wasserstein loss function has been proven to be effective when applied to deterministic full-waveform inversion (FWI) problems. We consider the application of this loss function in Bayesian FWI so that the uncertainty can be captured in the solution. Other loss functions that are commonly used in practice are also considered for comparison. Existence and stability of the resulting Gibbs posteriors are shown on function space under weak assumptions on the prior and model. In particular, the distribution arising from the Wasserstein loss is shown to be quite stable with respect to high-frequency noise in the data. We then illustrate the difference between the resulting distributions numerically, using Laplace approximations to estimate the unknown velocity field and uncertainty associated with the estimatesShow more
Downloadable Archival Material, 2020-04-07
Undefined
Publisher:2020-04-07


The Quantum Wasserstein Distance of Order 1
Authors:De Palma, Giacomo (Creator), Marvian, Milad (Creator), Trevisan, Dario (Creator), Lloyd, Seth (Creator)
Summary:We propose a generalization of the Wasserstein distance of order 1 to the quantum states of $n$ qudits. The proposal recovers the Hamming distance for the vectors of the canonical basis, and more generally the classical Wasserstein distance for quantum states diagonal in the canonical basis. The proposed distance is invariant with respect to permutations of the qudits and unitary operations acting on one qudit and is additive with respect to the tensor product. Our main result is a continuity bound for the von Neumann entropy with respect to the proposed distance, which significantly strengthens the best continuity bound with respect to the trace distance. We also propose a generalization of the Lipschitz constant to quantum observables. The notion of quantum Lipschitz constant allows us to compute the proposed distance with a semidefinite program. We prove a quantum version of Marton's transportation inequality and a quantum Gaussian concentration inequality for the spectrum of quantum Lipschitz observables. Moreover, we derive bounds on the contraction coefficients of shallow quantum circuits and of the tensor product of one-qudit quantum channels with respect to the proposed distance. We discuss other possible applications in quantum machine learning, quantum Shannon theory, and quantum many-body systemsShow more
Downloadable Archival Material, 2020-09-09
Undefined
Publisher:2020-09-09

<——2020——–2020—––3700——



Augmented Sliced Wasserstein Distances
Authors:Chen, Xiongjie (Creator), Yang, Yongxin (Creator), Li, Yunpeng (Creator)
Summary:While theoretically appealing, the application of the Wasserstein distance to large-scale machine learning problems has been hampered by its prohibitive computational cost. The sliced Wasserstein distance and its variants improve the computational efficiency through the random projection, yet they suffer from low accuracy if the number of projections is not sufficiently large, because the majority of projections result in trivially small values. In this work, we propose a new family of distance metrics, called augmented sliced Wasserstein distances (ASWDs), constructed by first mapping samples to higher-dimensional hypersurfaces parameterized by neural networks. It is derived from a key observation that (random) linear projections of samples residing on these hypersurfaces would translate to much more flexible nonlinear projections in the original sample space, so they can capture complex structures of the data distribution. We show that the hypersurfaces can be optimized by gradient ascent efficiently. We provide the condition under which the ASWD is a valid metric and show that this can be obtained by an injective neural network architecture. Numerical results demonstrate that the ASWD significantly outperforms other Wasserstein variants for both synthetic and real-world problemsShow more
Downloadable Archival Material, 2020-06-15
Undefined
Publisher:2020-06-15



Visual Transfer for Reinforcement Learning via Wasserstein Domain Confusion
Authors:Roy, Josh (Creator), Konidaris, George (Creator)
Summary:We introduce Wasserstein Adversarial Proximal Policy Optimization (WAPPO), a novel algorithm for visual transfer in Reinforcement Learning that explicitly learns to align the distributions of extracted features between a source and target task. WAPPO approximates and minimizes the Wasserstein-1 distance between the distributions of features from source and target domains via a novel Wasserstein Confusion objective. WAPPO outperforms the prior state-of-the-art in visual transfer and successfully transfers policies across Visual Cartpole and two instantiations of 16 OpenAI Procgen environments.
Downloadable Archival Material, 2020-06-04
Undefined
Publisher:2020-06-04

Pruned Wasserstein Index Generation Model and wigpy Package
Author:Xie, Fangzhou (Creator)
Summary:The recent proposal of the Wasserstein Index Generation model (WIG) has shown a new direction for automatically generating indices. However, it is challenging in practice to fit large datasets, for two reasons. First, the Sinkhorn distance is notoriously expensive to compute and suffers severely from dimensionality. Second, it requires computing a full $N\times N$ matrix to be fit into memory, where $N$ is the dimension of the vocabulary. When the dimensionality is too large, it is impossible to compute at all. I hereby propose a Lasso-based shrinkage method to reduce the dimensionality of the vocabulary as a pre-processing step prior to fitting the WIG model. After obtaining the word embeddings from a Word2Vec model, we cluster these high-dimensional vectors by $k$-means clustering and pick the most frequent tokens within each cluster to form the "base vocabulary". Non-base tokens are then regressed on the vectors of the base tokens to get a transformation weight, so the whole vocabulary can be represented by only the "base tokens". This variant, called pruned WIG (pWIG), enables us to shrink the vocabulary dimension at will while still achieving high accuracy. I also provide a wigpy module in Python to carry out the computation in both flavors. An application to the Economic Policy Uncertainty (EPU) index is showcased as a comparison with existing methods of generating time-series sentiment indices.
Downloadable Archival Material, 2020-03-30
Undefined
Publisher:2020-03-30
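A minimal sketch of the vocabulary-pruning step described in the abstract, assuming NumPy and scikit-learn and hypothetical inputs vectors (Word2Vec embeddings) and freqs (token counts); it is not the wigpy package itself, and the cluster count is illustrative.

# Hedged sketch of the pWIG pruning idea: cluster embeddings, keep the most
# frequent token per cluster as the base vocabulary, and express every token as
# a least-squares combination of the base tokens.
import numpy as np
from sklearn.cluster import KMeans

def prune_vocabulary(vectors, freqs, n_clusters=50):
    freqs = np.asarray(freqs)
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(vectors)
    # most frequent token inside each cluster forms the "base vocabulary"
    base_idx = [np.flatnonzero(labels == c)[np.argmax(freqs[labels == c])]
                for c in range(n_clusters)]
    B = vectors[base_idx]                                  # base-token embeddings
    # least-squares weights expressing every token through the base tokens
    W, *_ = np.linalg.lstsq(B.T, vectors.T, rcond=None)
    return base_idx, W.T                                   # W.T[i] re-expresses token i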
 
Wasserstein Dropout
Authors:Sicking, Joachim (Creator), Akila, Maram (Creator), Pintz, Maximilian (Creator), Wirtz, Tim (Creator), Fischer, Asja (Creator), Wrobel, Stefan (Creator)
Summary:Despite its importance for safe machine learning, uncertainty quantification for neural networks is far from being solved. State-of-the-art approaches to estimating neural uncertainties are often hybrid, combining parametric models with explicit or implicit (dropout-based) ensembling. We take another pathway and propose a novel approach to uncertainty quantification for regression tasks, Wasserstein dropout, that is purely non-parametric. Technically, it captures aleatoric uncertainty by means of dropout-based sub-network distributions. This is accomplished by a new objective which minimizes the Wasserstein distance between the label distribution and the model distribution. An extensive empirical analysis shows that Wasserstein dropout outperforms state-of-the-art methods, on vanilla test data as well as under distributional shift, in terms of producing more accurate and stable uncertainty estimates.
Downloadable Archival Material, 2020-12-23
Undefined
Publisher:2020-12-23


Variational Wasserstein Barycenters for Geometric Clustering
Authors:Mi, Liang (Creator), Yu, Tianshu (Creator), Bento, Jose (Creator), Zhang, Wen (Creator), Li, Baoxin (Creator), Wang, Yalin (Creator)
Summary:We propose to compute Wasserstein barycenters (WBs) by solving for Monge maps with a variational principle. We discuss the metric properties of WBs and explore their connections, especially the connections of Monge WBs, to K-means clustering and co-clustering. We also discuss the feasibility of Monge WBs on unbalanced measures and spherical domains. We propose two new problems -- regularized K-means and Wasserstein barycenter compression. We demonstrate the use of VWBs in solving these clustering-related problems.
Downloadable Archival Material, 2020-02-24
Undefined
Publisher:2020-02-24


2020


Improving Perceptual Quality by Phone-Fortified Perceptual Loss using Wasserstein Distance for Speech Enhancement
Authors:Hsieh, Tsun-An (Creator), Yu, Cheng (Creator), Fu, Szu-Wei (Creator), Lu, Xugang (Creator), Tsao, Yu (Creator)
Summary:Speech enhancement (SE) aims to improve speech quality and intelligibility, which are both related to a smooth transition in speech segments that may carry linguistic information, e.g. phones and syllables. In this study, we propose a novel phone-fortified perceptual loss (PFPL) that takes phonetic information into account for training SE models. To effectively incorporate the phonetic information, the PFPL is computed based on latent representations of the wav2vec model, a powerful self-supervised encoder that renders rich phonetic information. To more accurately measure the distribution distances of the latent representations, the PFPL adopts the Wasserstein distance as the distance measure. Our experimental results first reveal that the PFPL is more correlated with the perceptual evaluation metrics, as compared to signal-level losses. Moreover, the results showed that the PFPL can enable a deep complex U-Net SE model to achieve highly competitive performance in terms of standardized quality and intelligibility evaluations on the Voice Bank-DEMAND datasetShow more
Downloadable Archival Material, 2020-10-28
Undefined
Publisher:2020-10-28

Wasserstein Distances for Stereo Disparity Estimation
Authors:Garg, Divyansh (Creator), Wang, Yan (Creator), Hariharan, Bharath (Creator), Campbell, Mark (Creator), Weinberger, Kilian Q. (Creator), Chao, Wei-Lun (Creator)
Summary:Existing approaches to depth or disparity estimation output a distribution over a set of pre-defined discrete values. This leads to inaccurate results when the true depth or disparity does not match any of these values. The fact that this distribution is usually learned indirectly through a regression loss causes further problems in ambiguous regions around object boundaries. We address these issues using a new neural network architecture that is capable of outputting arbitrary depth values, and a new loss function that is derived from the Wasserstein distance between the true and the predicted distributions. We validate our approach on a variety of tasks, including stereo disparity and depth estimation, and the downstream 3D object detection. Our approach drastically reduces the error in ambiguous regions, especially around object boundaries that greatly affect the localization of objects in 3D, achieving the state-of-the-art in 3D object detection for autonomous driving. Our code will be available at https://github.com/Div99/W-Stereo-Disp
Downloadable Archival Material, 2020-07-06
Undefined
Publisher:2020-07-06
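For illustration only, a small PyTorch sketch of a 1-D Wasserstein-1 loss between a predicted distribution over disparity bins and the ground truth, using the CDF formulation; it assumes unit-spaced bins and is not the authors' released code (see the GitHub link in the entry above).

# Hedged sketch: W1 between a softmax over K unit-spaced disparity bins and the
# step CDF of the true disparity, i.e. sum of |F_pred - F_true| over the bins.
import torch

def disparity_w1_loss(logits, gt_disparity, bin_values):
    """logits: (B, K) scores over K unit-spaced bins; bin_values: (K,) bin centers."""
    probs = torch.softmax(logits, dim=1)
    cdf_pred = torch.cumsum(probs, dim=1)
    # ground-truth CDF is a step function jumping at the true disparity
    cdf_true = (bin_values.unsqueeze(0) >= gt_disparity.unsqueeze(1)).float()
    return (cdf_pred - cdf_true).abs().sum(dim=1).mean()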


2020

Wasserstein Distances for Stereo Disparity Estimation

www.youtube.com › watch

Paper: https://arxiv.org/abs/2007.03085 Speaker: Divyansh is currently pursuing a Master's at Stanford University. https://divyanshgarg.

YouTube · Computer Vision Talks · 


Multimarginal Wasserstein Barycenter for Stain Normalization and Augmentation
Authors:Nadeem, Saad (Creator), Hollmann, Travis (Creator), Tannenbaum, Allen (Creator)
Summary:Variations in hematoxylin and eosin (H&E) stained images (due to clinical lab protocols, scanners, etc) directly impact the quality and accuracy of clinical diagnosis, and hence it is important to control for these variations for a reliable diagnosis. In this work, we present a new approach based on the multimarginal Wasserstein barycenter to normalize and augment H&E stained images given one or more references. Specifically, we provide a mathematically robust way of naturally incorporating additional images as intermediate references to drive stain normalization and augmentation simultaneously. The presented approach showed superior results quantitatively and qualitatively as compared to state-of-the-art methods for stain normalization. We further validated our stain normalization and augmentations in the nuclei segmentation task on a publicly available dataset, achieving state-of-the-art results against competing approachesShow more
Downloadable Archival Material, 2020-06-25
Undefined
Publisher:2020-06-25



2020

LBWGAN: Label Based Shape Synthesis From Text With WGANs
Authors:Bowen Li, Yue Yu, Ying Li
2020 International Conference on Virtual Reality and Visualization (ICVRV)
Summary:In this work, we propose a novel method of voxel-based shape synthesis, which builds a connection between natural language text and colored shapes. The state-of-the-art method uses Generative Adversarial Networks (GANs) to achieve this task, and some achievements have been made with it. It is a very advanced framework on this subject, but it largely ignores the role of class labels. Labels can guide shape synthesis because shapes with different labels have different characteristics. Therefore, this work attempts to create a deeper connection between the labels and the generated results. It is based on a new structure and lets the labels guide the shape synthesis. A key idea is to establish a new set of relationships outside the generator and discriminator to guide the training process. This paper introduces an independent class classifier in the new structure and makes it grow together with the generator, so that the generated results have more distinctive class features. Experiments show that our method performs better on the synthesis of complex shapes, produces more realistic results, and has better structural integrity. Besides, our approach can extract the implied shape information from the descriptions to realize shape synthesis.
Chapter, 2020
Publication:2020 International Conference on Virtual Reality and Visualization (ICVRV), 202011, 47
Publisher:2020



Distributed Wasserstein Barycenters via Displacement Interpolation
Authors:Cisneros-Velarde, Pedro (Creator), Bullo, Francesco (Creator)
Summary:Consider a multi-agent system whereby each agent has an initial probability measure. In this paper, we propose a distributed algorithm based upon stochastic, asynchronous and pairwise exchange of information and displacement interpolation in the Wasserstein space. We characterize the evolution of this algorithm and prove it computes the Wasserstein barycenter of the initial measures under various conditions. One version of the algorithm computes a standard Wasserstein barycenter, i.e., a barycenter based upon equal weights; and the other version computes a randomized Wasserstein barycenter, i.e., a barycenter based upon random weights for the initial measures. Finally, we specialize our algorithm to Gaussian distributions and draw a connection with the modeling of opinion dynamics in mathematical sociologyShow more
Downloadable Archival Material, 2020-12-15
Undefined
Publisher:2020-12-15
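A toy sketch of the pairwise-gossip idea, specialized as in the abstract to 1-D Gaussians, where the midpoint displacement interpolation of N(m1, s1^2) and N(m2, s2^2) is N((m1+m2)/2, ((s1+s2)/2)^2); the update schedule and function name are illustrative assumptions, not the paper's algorithm.

# Hedged sketch: random pairwise midpoint interpolation of 1-D Gaussian agents.
import numpy as np

def gossip_barycenter(means, stds, n_rounds=2000, seed=0):
    rng = np.random.default_rng(seed)
    m, s = np.array(means, float), np.array(stds, float)
    for _ in range(n_rounds):
        i, j = rng.choice(len(m), size=2, replace=False)   # random pair of agents
        m[i] = m[j] = 0.5 * (m[i] + m[j])                   # midpoint displacement interpolation
        s[i] = s[j] = 0.5 * (s[i] + s[j])
    return m.mean(), s.mean()    # agents end up near a common Gaussian; report its parameters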

<——2020——–2020—––3710——



Stochastic Saddle-Point Optimization for Wasserstein Barycenters
Authors:Tiapkin, Daniil (Creator), Gasnikov, Alexander (Creator), Dvurechensky, Pavel (Creator)
Summary:We consider the population Wasserstein barycenter problem for random probability measures supported on a finite set of points and generated by an online stream of data. This leads to a complicated stochastic optimization problem where the objective is given as an expectation of a function given as a solution to a random optimization problem. We employ the structure of the problem and obtain a convex-concave stochastic saddle-point reformulation of this problem. In the setting when the distribution of random probability measures is discrete, we propose a stochastic optimization algorithm and estimate its complexity. The second result, based on kernel methods, extends the previous one to the arbitrary distribution of random probability measures. Moreover, this new algorithm has a total complexity better than the Stochastic Approximation approach combined with the Sinkhorn algorithm in many cases. We also illustrate our developments by a series of numerical experimentsShow more
Downloadable Archival Material, 2020-06-11
Undefined
Publisher:2020-06-11


Regularizing activations in neural networks via distribution matching with the Wasserstein metric
Authors:Joo, Taejong (Creator), Kang, Donggu (Creator), Kim, Byunghoon (Creator)
Summary:Regularization and normalization have become indispensable components in training deep neural networks, resulting in faster training and improved generalization performance. We propose the projected error function regularization loss (PER) that encourages activations to follow the standard normal distribution. PER randomly projects activations onto one-dimensional space and computes the regularization loss in the projected space. PER is similar to the Pseudo-Huber loss in the projected space, thus taking advantage of both $L^1$ and $L^2$ regularization losses. Besides, PER can capture the interaction between hidden units by projection vector drawn from a unit sphere. By doing so, PER minimizes the upper bound of the Wasserstein distance of order one between an empirical distribution of activations and the standard normal distribution. To the best of the authors' knowledge, this is the first work to regularize activations via distribution matching in the probability distribution space. We evaluate the proposed method on the image classification task and the word-level language modeling taskShow more
Downloadable Archival Material, 2020-02-13
Undefined
Publisher:2020-02-13
Cited by 5
Related articles All 6 versions
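The exact PER loss is not reproduced in this abstract, so the following PyTorch snippet is only a hedged stand-in: a sliced regularizer that compares random 1-D projections of a batch of activations with a standard-normal sample via an empirical 1-D Wasserstein-1 term. The name and projection count are illustrative, not the paper's formula.

# Hedged stand-in (not the paper's PER): push activations toward N(0, I) by
# matching sorted 1-D projections of the batch against sorted N(0, 1) samples.
import torch

def sliced_normal_regularizer(h, n_proj=16):
    """h: (B, D) activations of one layer."""
    B, D = h.shape
    theta = torch.randn(D, n_proj, device=h.device)
    theta = theta / theta.norm(dim=0, keepdim=True)        # random unit directions
    proj = h @ theta                                       # (B, n_proj) projected activations
    ref = torch.randn(B, n_proj, device=h.device)          # projections of an N(0, I) sample
    # empirical 1-D W1 between sorted samples, averaged over directions
    return (proj.sort(dim=0).values - ref.sort(dim=0).values).abs().mean()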


Missing Features Reconstruction Using a Wasserstein Generative Adversarial Imputation Network
Authors:Friedjungová, Magda (Creator), Vašata, Daniel (Creator), Balatsko, Maksym (Creator), Jiřina, Marcel (Creator)
Summary:Missing data is one of the most common preprocessing problems. In this paper, we experimentally research the use of generative and non-generative models for feature reconstruction. Variational Autoencoder with Arbitrary Conditioning (VAEAC) and Generative Adversarial Imputation Network (GAIN) were researched as representatives of generative models, while the denoising autoencoder (DAE) represented non-generative models. Performance of the models is compared to traditional methods k-nearest neighbors (k-NN) and Multiple Imputation by Chained Equations (MICE). Moreover, we introduce WGAIN as the Wasserstein modification of GAIN, which turns out to be the best imputation model when the degree of missingness is less than or equal to 30%. Experiments were performed on real-world and artificial datasets with continuous features where different percentages of features, varying from 10% to 50%, were missing. Evaluation of algorithms was done by measuring the accuracy of the classification model previously trained on the uncorrupted dataset. The results show that GAIN and especially WGAIN are the best imputers regardless of the conditions. In general, they outperform or are comparative to MICE, k-NN, DAE, and VAEACShow more
Downloadable Archival Material, 2020-06-21
Undefined
Publisher:2020-06-21


Tessellated Wasserstein Auto-Encoders
Authors:Gai, Kuo (Creator), Zhang, Shihua (Creator)
Summary:Non-adversarial generative models such as variational auto-encoder (VAE), Wasserstein auto-encoders with maximum mean discrepancy (WAE-MMD), sliced-Wasserstein auto-encoder (SWAE) are relatively easy to train and have less mode collapse compared to Wasserstein auto-encoder with generative adversarial network (WAE-GAN). However, they are not very accurate in approximating the target distribution in the latent space because they don't have a discriminator to detect the minor difference between real and fake. To this end, we develop a novel non-adversarial framework called Tessellated Wasserstein Auto-encoders (TWAE) to tessellate the support of the target distribution into a given number of regions by the centroidal Voronoi tessellation (CVT) technique and design batches of data according to the tessellation instead of random shuffling for accurate computation of discrepancy. Theoretically, we demonstrate that the error of estimate to the discrepancy decreases when the numbers of samples $n$ and regions $m$ of the tessellation become larger with rates of $\mathcal{O}(\frac{1}{\sqrt{n}})$ and $\mathcal{O}(\frac{1}{\sqrt{m}})$, respectively. Given fixed $n$ and $m$, a necessary condition for the upper bound of measurement error to be minimized is that the tessellation is the one determined by CVT. TWAE is very flexible to different non-adversarial metrics and can substantially enhance their generative performance in terms of Fr\'{e}chet inception distance (FID) compared to VAE, WAE-MMD, SWAE. Moreover, numerical results indeed demonstrate that TWAE is competitive to the adversarial model WAE-GAN, demonstrating its powerful generative abilityShow more
Downloadable Archival Material, 2020-05-20
Undefined
Publisher:2020-05-20


Universal consistency of Wasserstein $k$-NN classifier: Negative and Positive Results
Author:Ponnoprat, Donlapark (Creator)
Summary:The Wasserstein distance provides a notion of dissimilarities between probability measures, which has recent applications in learning of structured data with varying size such as images and text documents. In this work, we study the $k$-nearest neighbor classifier ($k$-NN) of probability measures under the Wasserstein distance. We show that the $k$-NN classifier is not universally consistent on the space of measures supported in $(0,1)$. As any Euclidean ball contains a copy of $(0,1)$, one should not expect to obtain universal consistency without some restriction on the base metric space, or the Wasserstein space itself. To this end, via the notion of $\sigma$-finite metric dimension, we show that the $k$-NN classifier is universally consistent on spaces of measures supported in a $\sigma$-uniformly discrete set. In addition, by studying the geodesic structures of the Wasserstein spaces for $p=1$ and $p=2$, we show that the $k$-NN classifier is universally consistent on the space of measures supported on a finite set, the space of Gaussian measures, and the space of measures with densities expressed as finite wavelet seriesShow more
Downloadable Archival Material, 2020-09-09
Undefined
Publisher:2020-09-09
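A minimal sketch of the object the abstract studies, in its simplest 1-D form: a k-NN classifier over empirical measures under the Wasserstein-1 distance, assuming NumPy and SciPy; names and the value of k are illustrative.

# Hedged sketch: nearest-neighbor voting among 1-D empirical measures under W1.
import numpy as np
from scipy.stats import wasserstein_distance

def knn_predict(train_samples, train_labels, query_samples, k=5):
    """train_samples: list of 1-D arrays (one empirical measure per example);
    train_labels: non-negative integer labels; query_samples: 1-D array."""
    d = np.array([wasserstein_distance(query_samples, s) for s in train_samples])
    nearest = np.argsort(d)[:k]
    votes = np.array(train_labels)[nearest]
    return np.bincount(votes).argmax()      # majority vote among the k nearest measures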

2020


Two-sample Test using Projected Wasserstein Distance: Breaking the Curse of Dimensionality
Authors:Wang, Jie (Creator), Gao, Rui (Creator), Xie, Yao (Creator)
Summary:We develop a projected Wasserstein distance for the two-sample test, a fundamental problem in statistics and machine learning: given two sets of samples, to determine whether they are from the same distribution. In particular, we aim to circumvent the curse of dimensionality in Wasserstein distance: when the dimension is high, it has diminishing testing power, which is inherently due to the slow concentration property of Wasserstein metrics in the high dimension space. A key contribution is to couple optimal projection to find the low dimensional linear mapping to maximize the Wasserstein distance between projected probability distributions. We characterize the theoretical property of the finite-sample convergence rate on IPMs and present practical algorithms for computing this metric. Numerical examples validate our theoretical resultsShow more
Downloadable Archival Material, 2020-10-22
Undefined
Publisher:2020-10-22
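A hedged sketch of a projected-Wasserstein two-sample statistic: the paper optimizes the low-dimensional projection, while this toy version just takes the best of a set of random unit directions and calibrates by permutation; all names and counts are illustrative.

# Hedged sketch: best-of-random-directions projected Wasserstein statistic with a
# permutation test for calibration.
import numpy as np
from scipy.stats import wasserstein_distance

def projected_w_statistic(X, Y, n_dir=200, seed=0):
    rng = np.random.default_rng(seed)
    dirs = rng.standard_normal((n_dir, X.shape[1]))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)    # unit projection directions
    return max(wasserstein_distance(X @ u, Y @ u) for u in dirs)

def permutation_pvalue(X, Y, n_perm=200, seed=0):
    rng = np.random.default_rng(seed)
    obs, Z = projected_w_statistic(X, Y), np.vstack([X, Y])
    count = 0
    for _ in range(n_perm):
        idx = rng.permutation(len(Z))
        count += projected_w_statistic(Z[idx[:len(X)]], Z[idx[len(X):]]) >= obs
    return (count + 1) / (n_perm + 1)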

Learning Deep-Latent Hierarchies by Stacking Wasserstein Autoencoders
Authors:Gaujac, Benoit (Creator), Feige, Ilya (Creator), Barber, David (Creator)
Summary:Probabilistic models with hierarchical-latent-variable structures provide state-of-the-art results amongst non-autoregressive, unsupervised density-based models. However, the most common approach to training such models based on Variational Autoencoders (VAEs) often fails to leverage deep-latent hierarchies; successful approaches require complex inference and optimisation schemes. Optimal Transport is an alternative, non-likelihood-based framework for training generative models with appealing theoretical properties, in principle allowing easier training convergence between distributions. In this work we propose a novel approach to training models with deep-latent hierarchies based on Optimal Transport, without the need for highly bespoke models and inference networks. We show that our method enables the generative model to fully leverage its deep-latent hierarchy, avoiding the well known "latent variable collapse" issue of VAEs; therefore, providing qualitatively better sample generations as well as more interpretable latent representation than the original Wasserstein Autoencoder with Maximum Mean Discrepancy divergenceShow more
Downloadable Archival Material, 2020-10-07
Undefined
Publisher:2020-10-07
 
Wasserstein K-Means for Clustering Tomographic Projections
Authors:Rao, Rohan (Creator), Moscovich, Amit (Creator), Singer, Amit (Creator)
Summary:Motivated by the 2D class averaging problem in single-particle cryo-electron microscopy (cryo-EM), we present a k-means algorithm based on a rotationally-invariant Wasserstein metric for images. Unlike existing methods that are based on Euclidean ($L_2$) distances, we prove that the Wasserstein metric better accommodates the out-of-plane angular differences between different particle views. We demonstrate on a synthetic dataset that our method gives superior results compared to an $L_2$ baseline. Furthermore, there is little computational overhead, thanks to the use of a fast linear-time approximation to the Wasserstein-1 metric, also known as the Earthmover's distance.
Downloadable Archival Material, 2020-10-19
Undefined
Publisher:2020-10-19
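A toy 1-D version of Wasserstein k-means, assuming NumPy: measures are represented by their quantile functions, assignment uses the resulting W2 geometry, and the centroid update averages quantiles (the 1-D barycenter). This stands in for, and is much simpler than, the rotation-invariant image metric and fast W1 approximation used in the paper.

# Hedged toy version: k-means on 1-D empirical measures via quantile functions.
import numpy as np

def wasserstein_kmeans_1d(samples, k=3, n_iter=20, seed=0):
    """samples: list of 1-D arrays, each an empirical measure."""
    rng = np.random.default_rng(seed)
    q = np.linspace(0.0, 1.0, 64)
    Q = np.stack([np.quantile(s, q) for s in samples])      # quantile functions on a grid
    centers = Q[rng.choice(len(Q), k, replace=False)]
    for _ in range(n_iter):
        # assignment: for 1-D measures, W2 is the L2 distance between quantile functions
        labels = np.argmin(((Q[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for c in range(k):                                   # update: quantile-wise mean = 1-D barycenter
            if np.any(labels == c):
                centers[c] = Q[labels == c].mean(axis=0)
    return labels, centers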

Image Hashing by Minimizing Discrete Component-wise Wasserstein Distance
Authors:Doan, Khoa D. (Creator), Manchanda, Saurav (Creator), Badirli, Sarkhan (Creator), Reddy, Chandan K. (Creator)
Summary:Image hashing is one of the fundamental problems that demand both efficient and effective solutions for various practical scenarios. Adversarial autoencoders are shown to be able to implicitly learn a robust, locality-preserving hash function that generates balanced and high-quality hash codes. However, the existing adversarial hashing methods are inefficient to be employed for large-scale image retrieval applications. Specifically, they require an exponential number of samples to be able to generate optimal hash codes and a significantly high computational cost to train. In this paper, we show that the high sample-complexity requirement often results in sub-optimal retrieval performance of the adversarial hashing methods. To address this challenge, we propose a new adversarial-autoencoder hashing approach that has a much lower sample requirement and computational cost. Specifically, by exploiting the desired properties of the hash function in the low-dimensional, discrete space, our method efficiently estimates a better variant of Wasserstein distance by averaging a set of easy-to-compute one-dimensional Wasserstein distances. The resulting hashing approach has an order-of-magnitude better sample complexity, thus better generalization property, compared to the other adversarial hashing methods. In addition, the computational cost is significantly reduced using our approach. We conduct experiments on several real-world datasets and show that the proposed method outperforms the competing hashing methods, achieving up to 10% improvement over the current state-of-the-art image hashing methods. The code accompanying this paper is available on Github (https://github.com/khoadoan/adversarial-hashing)Show more
Downloadable Archival Material, 2020-02-28
Undefined
Publisher:2020-02-28
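A minimal sketch of the distance estimate the abstract describes: average easy-to-compute 1-D Wasserstein distances over the components of the code, via sorted samples; assumes equal batch sizes and NumPy, and is not the authors' released code (see their GitHub link above).

# Hedged sketch: component-wise 1-D W1 between two batches of low-dimensional codes.
import numpy as np

def componentwise_w1(codes_a, codes_b):
    """codes_a, codes_b: (N, D) batches of D-dimensional codes (equal N for simplicity)."""
    a = np.sort(codes_a, axis=0)             # per-dimension sorted samples
    b = np.sort(codes_b, axis=0)
    per_dim_w1 = np.abs(a - b).mean(axis=0)  # 1-D W1 between equal-size empirical measures
    return per_dim_w1.mean()                 # average over the D components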


Averaging Atmospheric Gas Concentration Data using Wasserstein Barycenters
Authors:Barré, Mathieu (Creator), Giron, Clément (Creator), Mazzolini, Matthieu (Creator), d'Aspremont, Alexandre (Creator)
Summary:Hyperspectral satellite images report greenhouse gas concentrations worldwide on a daily basis. While taking simple averages of these images over time produces a rough estimate of relative emission rates, atmospheric transport means that simple averages fail to pinpoint the source of these emissions. We propose using Wasserstein barycenters coupled with weather data to average gas concentration data sets and better concentrate the mass around significant sources.
Downloadable Archival Material, 2020-10-06
Undefined
Publisher:2020-10-06
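For illustration, an entropic Wasserstein barycenter of several normalized concentration maps on a common grid, using the POT library; the grid, regularization strength, and uniform weights are assumptions, and the paper additionally couples the averaging with weather data, which is omitted here.

# Hedged sketch: entropic barycenter of flattened, normalized concentration fields.
import numpy as np
import ot  # Python Optimal Transport

def concentration_barycenter(maps, grid_xy, reg=1e-2):
    """maps: (T, n) nonnegative fields on n grid cells; grid_xy: (n, 2) cell coordinates."""
    A = np.asarray(maps, float).T                  # POT expects one histogram per column
    A /= A.sum(axis=0, keepdims=True)              # normalize each map to a probability measure
    M = ot.dist(grid_xy, grid_xy)                  # squared-distance cost between grid cells
    M /= M.max()
    return ot.bregman.barycenter(A, M, reg)        # entropic barycenter with uniform weights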

<——2020——–2020—––3720——



Consistency of Distributionally Robust Risk- and Chance-Constrained Optimization under Wasserstein Ambiguity Sets
Authors:Cherukuri, Ashish (Creator), Hota, Ashish R. (Creator)
Summary:We study stochastic optimization problems with chance and risk constraints, where in the latter, risk is quantified in terms of the conditional value-at-risk (CVaR). We consider the distributionally robust versions of these problems, where the constraints are required to hold for a family of distributions constructed from the observed realizations of the uncertainty via the Wasserstein distance. Our main results establish that if the samples are drawn independently from an underlying distribution and the problems satisfy suitable technical assumptions, then the optimal value and optimizers of the distributionally robust versions of these problems converge to the respective quantities of the original problems, as the sample size increasesShow more
Downloadable Archival Material, 2020-12-16
Undefined
Publisher:2020-12-16

online  OPEN ACCESS

Consistency of Distributionally Robust Risk- and Chance-Constrained Optimization under Wasserst...

by Cherukuri, Ashish; Hota, Ashish R

12/2020

We study stochastic optimization problems with chance and risk constraints, where in the latter, risk is quantified in terms of the conditional value-at-risk...

Journal Article; Full Text Online

Conditional Wasserstein GAN-based Oversampling of Tabular Data for Imbalanced Learning
Authors:Engelmann, Justin (Creator), Lessmann, Stefan (Creator)
Summary:Class imbalance is a common problem in supervised learning and impedes the predictive performance of classification models. Popular countermeasures include oversampling the minority class. Standard methods like SMOTE rely on finding nearest neighbours and linear interpolations, which are problematic in the case of high-dimensional, complex data distributions. Generative Adversarial Networks (GANs) have been proposed as an alternative method for generating artificial minority examples as they can model complex distributions. However, prior research on GAN-based oversampling does not incorporate recent advancements from the literature on generating realistic tabular data with GANs. Previous studies also focus on numerical variables whereas categorical features are common in many business applications of classification methods such as credit scoring. The paper proposes an oversampling method based on a conditional Wasserstein GAN that can effectively model tabular datasets with numerical and categorical variables and pays special attention to the downstream classification task through an auxiliary classifier loss. We benchmark our method against standard oversampling methods and the imbalanced baseline on seven real-world datasets. Empirical results evidence the competitiveness of GAN-based oversampling.
Downloadable Archival Material, 2020-08-20
Undefined
Publisher:2020-08-20
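A minimal, hedged PyTorch sketch of one critic update in a plain WGAN (weight-clipping variant) for tabular data; the paper's model further conditions on the class label and adds an auxiliary classifier loss, both omitted here, and all names are illustrative.

# Hedged sketch: one WGAN critic step with weight clipping as a crude Lipschitz constraint.
import torch

def critic_step(critic, gen, real, opt_c, noise_dim=64, clip=0.01):
    z = torch.randn(real.size(0), noise_dim, device=real.device)
    fake = gen(z).detach()
    loss = -(critic(real).mean() - critic(fake).mean())    # negative Wasserstein-1 surrogate
    opt_c.zero_grad()
    loss.backward()
    opt_c.step()
    for p in critic.parameters():                           # clip weights after each update
        p.data.clamp_(-clip, clip)
    return loss.item()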

Unsupervised Wasserstein Distance Guided Domain Adaptation for 3D Multi-Domain Liver Segmentation
Authors:You, Chenyu (Creator), Yang, Junlin (Creator), Chapiro, Julius (Creator), Duncan, James S. (Creator)
Summary:Deep neural networks have shown exceptional learning capability and generalizability in the source domain when massive labeled data is provided. However, the well-trained models often fail in the target domain due to the domain shift. Unsupervised domain adaptation aims to improve network performance when applying robust models trained on medical images from source domains to a new target domain. In this work, we present an approach based on the Wasserstein distance guided disentangled representation to achieve 3D multi-domain liver segmentation. Concretely, we embed images onto a shared content space capturing shared feature-level information across domains and domain-specific appearance spaces. The existing mutual information-based representation learning approaches often fail to capture complete representations in multi-domain medical imaging tasks. To mitigate these issues, we utilize Wasserstein distance to learn more complete representation, and introduces a content discriminator to further facilitate the representation disentanglement. Experiments demonstrate that our method outperforms the state-of-the-art on the multi-modality liver segmentation taskShow more
Downloadable Archival Material, 2020-09-06
Undefined
Publisher:2020-09-06


Robust Reinforcement Learning with Wasserstein Constraint
Authors:Hou, Linfang (Creator), Pang, Liang (Creator), Hong, Xin (Creator), Lan, Yanyan (Creator), Ma, Zhiming (Creator), Yin, Dawei (Creator)
Summary:Robust Reinforcement Learning aims to find the optimal policy with some extent of robustness to environmental dynamics. Existing learning algorithms usually enable the robustness through disturbing the current state or simulating environmental parameters in a heuristic way, which lack quantified robustness to the system dynamics (i.e. transition probability). To overcome this issue, we leverage Wasserstein distance to measure the disturbance to the reference transition kernel. With Wasserstein distance, we are able to connect transition kernel disturbance to the state disturbance, i.e. reduce an infinite-dimensional optimization problem to a finite-dimensional risk-aware problem. Through the derived risk-aware optimal Bellman equation, we show the existence of optimal robust policies, provide a sensitivity analysis for the perturbations, and then design a novel robust learning algorithm--Wasserstein Robust Advantage Actor-Critic algorithm (WRAAC). The effectiveness of the proposed algorithm is verified in the Cart-Pole environmentShow more
Downloadable Archival Material, 2020-06-01
Undefined
Publisher:2020-06-01


2020


Quantum statistical learning via Quantum Wasserstein natural gradient
Authors:Becker, Simon (Creator), Li, Wuchen (Creator)
Summary:In this article, we introduce a new approach towards the statistical learning problem $\operatorname{argmin}_{\rho(\theta) \in \mathcal P_{\theta}} W_{Q}^2 (\rho_{\star},\rho(\theta))$ to approximate a target quantum state $\rho_{\star}$ by a set of parametrized quantum states $\rho(\theta)$ in a quantum $L^2$-Wasserstein metric. We solve this estimation problem by considering Wasserstein natural gradient flows for density operators on finite-dimensional $C^*$ algebras. For continuous parametric models of density operators, we pull back the quantum Wasserstein metric such that the parameter space becomes a Riemannian manifold with quantum Wasserstein information matrix. Using a quantum analogue of the Benamou-Brenier formula, we derive a natural gradient flow on the parameter space. We also discuss certain continuous-variable quantum states by studying the transport of the associated Wigner probability distributionsShow more
Downloadable Archival Material, 2020-08-25
Undefined
Publisher:2020-08-25

Stronger and Faster Wasserstein Adversarial Attacks
Authors:Wu, Kaiwen (Creator), Wang, Allen Houze (Creator), Yu, Yaoliang (Creator)
Summary:Deep models, while being extremely flexible and accurate, are surprisingly vulnerable to "small, imperceptible" perturbations known as adversarial attacks. While the majority of existing attacks focus on measuring perturbations under the $\ell_p$ metric, Wasserstein distance, which takes geometry in pixel space into account, has long been known to be a suitable metric for measuring image quality and has recently risen as a compelling alternative to the $\ell_p$ metric in adversarial attacks. However, constructing an effective attack under the Wasserstein metric is computationally much more challenging and calls for better optimization algorithms. We address this gap in two ways: (a) we develop an exact yet efficient projection operator to enable a stronger projected gradient attack; (b) we show that the Frank-Wolfe method equipped with a suitable linear minimization oracle works extremely fast under Wasserstein constraints. Our algorithms not only converge faster but also generate much stronger attacks. For instance, we decrease the accuracy of a residual network on CIFAR-10 to $3.4\%$ within a Wasserstein perturbation ball of radius $0.005$, in contrast to $65.6\%$ using the previous Wasserstein attack based on an \emph{approximate} projection operator. Furthermore, employing our stronger attacks in adversarial training significantly improves the robustness of adversarially trained modelsShow more
Downloadable Archival Material, 2020-08-06
Undefined
Publisher:2020-08-06



Permutation invariant networks to learn Wasserstein metrics
Authors:Sehanobish, Arijit (Creator), Ravindra, Neal (Creator), van Dijk, David (Creator)
Summary:Understanding the space of probability measures on a metric space equipped with a Wasserstein distance is one of the fundamental questions in mathematical analysis. The Wasserstein metric has received a lot of attention in the machine learning community especially for its principled way of comparing distributions. In this work, we use a permutation invariant network to map samples from probability measures into a low-dimensional space such that the Euclidean distance between the encoded samples reflects the Wasserstein distance between probability measures. We show that our network can generalize to correctly compute distances between unseen densities. We also show that these networks can learn the first and the second moments of probability distributionsShow more
Downloadable Archival Material, 2020-10-12
Undefined
Publisher:2020-10-12


Reinforced Wasserstein Training for Severity-Aware Semantic Segmentation in Autonomous Driving
Authors:Liu, Xiaofeng (Creator), Zhang, Yimeng (Creator), Liu, Xiongchang (Creator), Bai, Song (Creator), Li, Site (Creator), You, Jane (Creator)
Summary:Semantic segmentation, which predicts the class of each pixel, is important for many real-world systems, e.g., autonomous vehicles. Recently, deep networks have achieved significant progress w.r.t. the mean Intersection-over-Union (mIoU) with the cross-entropy loss. However, the cross-entropy loss essentially ignores how the severity of a wrong prediction differs for an autonomous car. For example, misclassifying a car as road is much more severe than recognizing it as a bus. Targeting this difficulty, we develop a Wasserstein training framework to explore the inter-class correlation by defining its ground metric as the misclassification severity. The ground metric of the Wasserstein distance can be pre-defined based on experience with a specific task. From the optimization perspective, we further propose to set the ground metric as an increasing function of the pre-defined ground metric. Furthermore, an adaptive learning scheme for the ground matrix is proposed that utilizes the high-fidelity CARLA simulator. Specifically, we follow a reinforcement alternative learning scheme. Experiments on both the CamVid and Cityscapes datasets evidence the effectiveness of our Wasserstein loss. The SegNet, ENet, FCN and Deeplab networks can be adapted in a plug-in manner. We achieve significant improvements on the predefined important classes, and much longer continuous playtime in our simulator.
Downloadable Archival Material, 2020-08-11
Undefined
Publisher:2020-08-11

Continuous Regularized Wasserstein Barycenters
Authors:Li, Lingxiao (Creator), Genevay, Aude (Creator), Yurochkin, Mikhail (Creator), Solomon, Justin (Creator)
Summary:Wasserstein barycenters provide a geometrically meaningful way to aggregate probability distributions, built on the theory of optimal transport. They are difficult to compute in practice, however, leading previous work to restrict their supports to finite sets of points. Leveraging a new dual formulation for the regularized Wasserstein barycenter problem, we introduce a stochastic algorithm that constructs a continuous approximation of the barycenter. We establish strong duality and use the corresponding primal-dual relationship to parametrize the barycenter implicitly using the dual potentials of regularized transport problems. The resulting problem can be solved with stochastic gradient descent, which yields an efficient online algorithm to approximate the barycenter of continuous distributions given sample access. We demonstrate the effectiveness of our approach and compare against previous work on synthetic examples and real-world applicationsShow more
Downloadable Archival Material, 2020-08-28
Undefined
Publisher:2020-08-28

<——2020——–2020—––3730——



When OT meets MoM: Robust estimation of Wasserstein Distance
Authors:Staerman, Guillaume (Creator), Laforgue, Pierre (Creator), Mozharovskyi, Pavlo (Creator), d'Alché-Buc, Florence (Creator)
Summary:Issued from Optimal Transport, the Wasserstein distance has gained importance in Machine Learning due to its appealing geometrical properties and the increasing availability of efficient approximations. In this work, we consider the problem of estimating the Wasserstein distance between two probability distributions when observations are polluted by outliers. To that end, we investigate how to leverage Medians of Means (MoM) estimators to robustify the estimation of Wasserstein distance. Exploiting the dual Kantorovitch formulation of Wasserstein distance, we introduce and discuss novel MoM-based robust estimators whose consistency is studied under a data contamination model and for which convergence rates are provided. These MoM estimators enable to make Wasserstein Generative Adversarial Network (WGAN) robust to outliers, as witnessed by an empirical study on two benchmarks CIFAR10 and Fashion MNIST. Eventually, we discuss how to combine MoM with the entropy-regularized approximation of the Wasserstein distance and propose a simple MoM-based re-weighting scheme that could be used in conjunction with the Sinkhorn algorithmShow more
Downloadable Archival Material, 2020-06-18
Undefined
Publisher:2020-06-18
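A hedged sketch of the simplest median-of-means flavor applied to an empirical 1-D Wasserstein distance, assuming NumPy and SciPy: split both samples into disjoint blocks, compute the distance block by block, and report the median. The paper analyzes more general MoM constructions and the Kantorovich dual; this is only the basic idea.

# Hedged sketch: median of block-wise 1-D Wasserstein distances for outlier robustness.
import numpy as np
from scipy.stats import wasserstein_distance

def mom_wasserstein_1d(x, y, n_blocks=10, seed=0):
    rng = np.random.default_rng(seed)
    xb = np.array_split(rng.permutation(x), n_blocks)   # disjoint blocks of each sample
    yb = np.array_split(rng.permutation(y), n_blocks)
    return np.median([wasserstein_distance(a, b) for a, b in zip(xb, yb)])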

A Sliced Wasserstein Loss for Neural Texture Synthesis
Authors:Heitz, Eric (Creator), Vanhoey, Kenneth (Creator), Chambon, Thomas (Creator), Belcour, Laurent (Creator)
Summary:We address the problem of computing a textural loss based on the statistics extracted from the feature activations of a convolutional neural network optimized for object recognition (e.g. VGG-19). The underlying mathematical problem is the measure of the distance between two distributions in feature space. The Gram-matrix loss is the ubiquitous approximation for this problem but it is subject to several shortcomings. Our goal is to promote the Sliced Wasserstein Distance as a replacement for it. It is theoretically proven, practical, simple to implement, and achieves results that are visually superior for texture synthesis by optimization or training generative neural networks.
Downloadable Archival Material, 2020-06-12
Undefined
Publisher:2020-06-12
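A minimal PyTorch sketch of a sliced Wasserstein-2 loss between two sets of feature vectors treated as point clouds (e.g. VGG activations), as a drop-in alternative to a Gram-matrix loss; the projection count and names are illustrative, not the authors' implementation.

# Hedged sketch: sliced W2 via sorted random 1-D projections of equal-size feature sets.
import torch

def sliced_w2_loss(feat_a, feat_b, n_proj=32):
    """feat_a, feat_b: (N, C) feature samples with equal N."""
    C = feat_a.size(1)
    theta = torch.randn(C, n_proj, device=feat_a.device)
    theta = theta / theta.norm(dim=0, keepdim=True)        # random unit directions
    pa = (feat_a @ theta).sort(dim=0).values               # sorted 1-D projections
    pb = (feat_b @ theta).sort(dim=0).values
    return ((pa - pb) ** 2).mean()                          # average squared quantile differences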


Safe Wasserstein Constrained Deep Q-Learning
Authors:Kandel, Aaron (Creator), Moura, Scott J. (Creator)
Summary:This paper presents a distributionally robust Q-Learning algorithm (DrQ) which leverages Wasserstein ambiguity sets to provide idealistic probabilistic out-of-sample safety guarantees during online learning. First, we follow past work by separating the constraint functions from the principal objective to create a hierarchy of machines which estimate the feasible state-action space within the constrained Markov decision process (CMDP). DrQ works within this framework by augmenting constraint costs with tightening offset variables obtained through Wasserstein distributionally robust optimization (DRO). These offset variables correspond to worst-case distributions of modeling error characterized by the TD-errors of the constraint Q-functions. This procedure allows us to safely approach the nominal constraint boundaries. Using a case study of lithium-ion battery fast charging, we explore how idealistic safety guarantees translate to generally improved safety relative to conventional methodsShow more
Downloadable Archival Material, 2020-02-07
Undefined
Publisher:2020-02-07



Wasserstein Distance Regularized Sequence Representation for Text Matching in Asymmetrical Domains
Authors:Yu, Weijie (Creator), Xu, Chen (Creator), Xu, Jun (Creator), Pang, Liang (Creator), Gao, Xiaopeng (Creator), Wang, Xiaozhao (Creator), Wen, Ji-Rong (Creator)
Summary:One approach to matching texts from asymmetrical domains is projecting the input sequences into a common semantic space as feature vectors upon which the matching function can be readily defined and learned. In real-world matching practices, it is often observed that with the training goes on, the feature vectors projected from different domains tend to be indistinguishable. The phenomenon, however, is often overlooked in existing matching models. As a result, the feature vectors are constructed without any regularization, which inevitably increases the difficulty of learning the downstream matching functions. In this paper, we propose a novel match method tailored for text matching in asymmetrical domains, called WD-Match. In WD-Match, a Wasserstein distance-based regularizer is defined to regularize the features vectors projected from different domains. As a result, the method enforces the feature projection function to generate vectors such that those correspond to different domains cannot be easily discriminated. The training process of WD-Match amounts to a game that minimizes the matching loss regularized by the Wasserstein distance. WD-Match can be used to improve different text matching methods, by using the method as its underlying matching model. Four popular text matching methods have been exploited in the paper. Experimental results based on four publicly available benchmarks showed that WD-Match consistently outperformed the underlying methods and the baselinesShow more
Downloadable Archival Material, 2020-10-15
Undefined
Publisher:2020-10-15


Wasserstein Distance guided Adversarial Imitation Learning with Reward Shape Exploration
Authors:Zhang, Ming (Creator), Wang, Yawei (Creator), Ma, Xiaoteng (Creator), Xia, Li (Creator), Yang, Jun (Creator), Li, Zhiheng (Creator), Li, Xiu (Creator)
Summary:The generative adversarial imitation learning (GAIL) has provided an adversarial learning framework for imitating expert policy from demonstrations in high-dimensional continuous tasks. However, almost all GAIL and its extensions only design a kind of reward function of logarithmic form in the adversarial training strategy with the Jensen-Shannon (JS) divergence for all complex environments. The fixed logarithmic type of reward function may be difficult to solve all complex tasks, and the vanishing gradients problem caused by the JS divergence will harm the adversarial learning process. In this paper, we propose a new algorithm named Wasserstein Distance guided Adversarial Imitation Learning (WDAIL) for promoting the performance of imitation learning (IL). There are three improvements in our method: (a) introducing the Wasserstein distance to obtain more appropriate measure in the adversarial training process, (b) using proximal policy optimization (PPO) in the reinforcement learning stage which is much simpler to implement and makes the algorithm more efficient, and (c) exploring different reward function shapes to suit different tasks for improving the performance. The experiment results show that the learning procedure remains remarkably stable, and achieves significant performance in the complex continuous control tasks of MuJoCoShow more
Downloadable Archival Material, 2020-06-05
Undefined
Publisher:2020-06-05


2020


Minimax control of ambiguous linear stochastic systems using the Wasserstein metric
Authors:Kim, Kihyun (Creator), Yang, Insoon (Creator)
Summary:In this paper, we propose a minimax linear-quadratic control method to address the issue of inaccurate distribution information in practical stochastic systems. To construct a control policy that is robust against errors in an empirical distribution of uncertainty, our method is to adopt an adversary, which selects the worst-case distribution. To systematically adjust the conservativeness of our method, the opponent receives a penalty proportional to the amount, measured with the Wasserstein metric, of deviation from the empirical distribution. In the finite-horizon case, using a Riccati equation, we derive a closed-form expression of the unique optimal policy and the opponent's policy that generates the worst-case distribution. This result is then extended to the infinite-horizon setting by identifying conditions under which the Riccati recursion converges to the unique positive semi-definite solution to an associated algebraic Riccati equation (ARE). The resulting optimal policy is shown to stabilize the expected value of the system state under the worst-case distribution. We also discuss that our method can be interpreted as a distributional generalization of the $H_\infty$-methodShow more
Downloadable Archival Material, 2020-03-30
Undefined
Publisher:2020-03-30


Joint Wasserstein Distribution Matching
Authors:Cao, JieZhang (Creator), Mo, Langyuan (Creator), Du, Qing (Creator), Guo, Yong (Creator), Zhao, Peilin (Creator), Huang, Junzhou (Creator), Tan, Mingkui (Creator)
Summary:Joint distribution matching (JDM) problem, which aims to learn bidirectional mappings to match joint distributions of two domains, occurs in many machine learning and computer vision applications. This problem, however, is very difficult due to two critical challenges: (i) it is often difficult to exploit sufficient information from the joint distribution to conduct the matching; (ii) this problem is hard to formulate and optimize. In this paper, relying on optimal transport theory, we propose to address JDM problem by minimizing the Wasserstein distance of the joint distributions in two domains. However, the resultant optimization problem is still intractable. We then propose an important theorem to reduce the intractable problem into a simple optimization problem, and develop a novel method (called Joint Wasserstein Distribution Matching (JWDM)) to solve it. In the experiments, we apply our method to unsupervised image translation and cross-domain video synthesis. Both qualitative and quantitative comparisons demonstrate the superior performance of our method over several state-of-the-artsShow more
Downloadable Archival Material, 2020-02-29
Undefined
Publisher:2020-02-29



Wasserstein Routed Capsule Networks
Authors:Fuchs, Alexander (Creator), Pernkopf, Franz (Creator)
Summary:Capsule networks offer interesting properties and provide an alternative to today's deep neural network architectures. However, recent approaches have failed to consistently achieve competitive results across different image datasets. We propose a new parameter efficient capsule architecture, that is able to tackle complex tasks by using neural networks trained with an approximate Wasserstein objective to dynamically select capsules throughout the entire architecture. This approach focuses on implementing a robust routing scheme, which can deliver improved results using little overhead. We perform several ablation studies verifying the proposed concepts and show that our network is able to substantially outperform other capsule approaches by over 1.2 % on CIFAR-10, using fewer parametersShow more
Downloadable Archival Material, 2020-07-22
Undefined
Publisher:2020-07-22



Improving Relational Regularized Autoencoders with Spherical Sliced Fused Gromov Wasserstein
Authors:Nguyen, Khai (Creator), Nguyen, Son (Creator), Ho, Nhat (Creator), Pham, Tung (Creator), Bui, Hung (Creator)
Summary:Relational regularized autoencoder (RAE) is a framework to learn the distribution of data by minimizing a reconstruction loss together with a relational regularization on the latent space. A recent attempt to reduce the inner discrepancy between the prior and aggregated posterior distributions is to incorporate sliced fused Gromov-Wasserstein (SFG) between these distributions. That approach has a weakness since it treats every slicing direction similarly, meanwhile several directions are not useful for the discriminative task. To improve the discrepancy and consequently the relational regularization, we propose a new relational discrepancy, named spherical sliced fused Gromov Wasserstein (SSFG), that can find an important area of projections characterized by a von Mises-Fisher distribution. Then, we introduce two variants of SSFG to improve its performance. The first variant, named mixture spherical sliced fused Gromov Wasserstein (MSSFG), replaces the vMF distribution by a mixture of von Mises-Fisher distributions to capture multiple important areas of directions that are far from each other. The second variant, named power spherical sliced fused Gromov Wasserstein (PSSFG), replaces the vMF distribution by a power spherical distribution to improve the sampling time in high dimension settings. We then apply the new discrepancies to the RAE framework to achieve its new variants. Finally, we conduct extensive experiments to show that the new proposed autoencoders have favorable performance in learning latent manifold structure, image generation, and reconstructionShow more
Downloadable Archival Material, 2020-10-05
Undefined
Publisher:2020-10-05 


Learning disentangled representations with the Wasserstein Autoencoder
Authors:Gaujac, Benoit (Creator), Feige, Ilya (Creator), Barber, David (Creator)
Summary:Disentangled representation learning has undoubtedly benefited from objective function surgery. However, a delicate balancing act of tuning is still required in order to trade off reconstruction fidelity versus disentanglement. Building on previous successes of penalizing the total correlation in the latent variables, we propose TCWAE (Total Correlation Wasserstein Autoencoder). Working in the WAE paradigm naturally enables the separation of the total-correlation term, thus providing disentanglement control over the learned representation, while offering more flexibility in the choice of reconstruction cost. We propose two variants using different KL estimators and perform extensive quantitative comparisons on data sets with known generative factors, showing competitive results relative to state-of-the-art techniques. We further study the trade off between disentanglement and reconstruction on more-difficult data sets with unknown generative factors, where the flexibility of the WAE paradigm in the reconstruction term improves reconstructionsShow more
Downloadable Archival Material, 2020-10-07
Undefined
Publisher:2020-10-07

<——2020——–2020—––3740—— 



node2coords: Graph Representation Learning with Wasserstein Barycenters
Authors:Simou, Effrosyni (Creator), Thanou, Dorina (Creator), Frossard, Pascal (Creator)
Summary:In order to perform network analysis tasks, representations that capture the most relevant information in the graph structure are needed. However, existing methods do not learn representations that can be interpreted in a straightforward way and that are robust to perturbations to the graph structure. In this work, we address these two limitations by proposing node2coords, a representation learning algorithm for graphs, which learns simultaneously a low-dimensional space and coordinates for the nodes in that space. The patterns that span the low dimensional space reveal the graph's most important structural information. The coordinates of the nodes reveal the proximity of their local structure to the graph structural patterns. In order to measure this proximity by taking into account the underlying graph, we propose to use Wasserstein distances. We introduce an autoencoder that employs a linear layer in the encoder and a novel Wasserstein barycentric layer at the decoder. Node connectivity descriptors, that capture the local structure of the nodes, are passed through the encoder to learn the small set of graph structural patterns. In the decoder, the node connectivity descriptors are reconstructed as Wasserstein barycenters of the graph structural patterns. The optimal weights for the barycenter representation of a node's connectivity descriptor correspond to the coordinates of that node in the low-dimensional space. Experimental results demonstrate that the representations learned with node2coords are interpretable, lead to node embeddings that are stable to perturbations of the graph structure and achieve competitive or superior results compared to state-of-the-art methods in node classificationShow more
Downloadable Archival Material, 2020-07-31
Undefined
Publisher:2020-07-31


Wasserstein Control of Mirror Langevin Monte Carlo
Authors:Zhang, Kelvin Shuangjian (Creator), Peyré, Gabriel (Creator), Fadili, Jalal (Creator), Pereyra, Marcelo (Creator)
Summary:Discretized Langevin diffusions are efficient Monte Carlo methods for sampling from high dimensional target densities that are log-Lipschitz-smooth and (strongly) log-concave. In particular, the Euclidean Langevin Monte Carlo sampling algorithm has received much attention lately, leading to a detailed understanding of its non-asymptotic convergence properties and of the role that smoothness and log-concavity play in the convergence rate. Distributions that do not possess these regularity properties can be addressed by considering a Riemannian Langevin diffusion with a metric capturing the local geometry of the log-density. However, the Monte Carlo algorithms derived from discretizations of such Riemannian Langevin diffusions are notoriously difficult to analyze. In this paper, we consider Langevin diffusions on a Hessian-type manifold and study a discretization that is closely related to the mirror-descent scheme. We establish for the first time a non-asymptotic upper-bound on the sampling error of the resulting Hessian Riemannian Langevin Monte Carlo algorithm. This bound is measured according to a Wasserstein distance induced by a Riemannian metric ground cost capturing the Hessian structure and closely related to a self-concordance-like condition. The upper-bound implies, for instance, that the iterates contract toward a Wasserstein ball around the target density whose radius is made explicit. Our theory recovers existing Euclidean results and can cope with a wide variety of Hessian metrics related to highly non-flat geometriesShow more
Downloadable Archival Material, 2020-02-11
Undefined
Publisher:2020-02-11
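A toy sketch of the kind of mirror-Langevin step analyzed in the entry above, using the entropic mirror map on the positive half-line to sample approximately from a Gamma target. The target, mirror map, step size and iteration count are arbitrary illustrative choices, not the paper's setting, and the finite-step bias is ignored.

import numpy as np

def mirror_langevin_gamma(a=3.0, h=1e-3, n_steps=20000, seed=0):
    # Target: Gamma(a, 1) on (0, inf), potential V(x) = x - (a - 1) * log(x).
    # Entropic mirror map phi(x) = x log x - x: grad phi = log, inverse = exp,
    # Hessian = 1/x, so the injected noise is scaled by x^{-1/2}.
    rng = np.random.default_rng(seed)
    x, samples = 1.0, []
    for _ in range(n_steps):
        grad_V = 1.0 - (a - 1.0) / x
        y = np.log(x) - h * grad_V + np.sqrt(2.0 * h / x) * rng.normal()
        x = np.exp(y)                     # map back to the primal (positive) space
        samples.append(x)
    return np.array(samples)

s = mirror_langevin_gamma()
print("sample mean ~", s[5000:].mean(), "(Gamma(3, 1) target has mean 3)")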


LCS Graph Kernel Based on Wasserstein Distance in Longest Common Subsequence Metric Space
Authors:Huang, Jianming (Creator), Fang, Zhongxi (Creator), Kasai, Hiroyuki (Creator)
Summary:For graph learning tasks, many existing methods utilize a message-passing mechanism where vertex features are updated iteratively by aggregating neighbor information. This strategy provides an efficient means of graph feature extraction, but the features obtained after many iterations may contain too much information from other vertices and tend to become similar to each other, which makes the representations less expressive. Learning graphs using paths, on the other hand, is less affected by this problem because it does not involve all vertex neighbors. However, most path-based methods can only compare paths of the same length, which may cause information loss. To resolve this difficulty, we propose a new graph kernel based on a Longest Common Subsequence (LCS) similarity. Moreover, we find that the widely used R-convolution framework is unsuitable for path-based graph kernels because a huge number of comparisons between dissimilar paths may degrade the computation of graph distances. Therefore, we propose a novel metric space exploiting the LCS-based similarity, and compute a new Wasserstein-based graph distance in this metric space, which emphasizes the comparison between similar paths. Furthermore, to reduce the computational cost, we propose an adjacent point merging operation to sparsify the point clouds in the metric space.
Downloadable Archival Material, 2020-12-07
Undefined
Publisher:2020-12-07

Primal Wasserstein Imitation Learning
Authors:Dadashi, Robert (Creator), Hussenot, Léonard (Creator), Geist, Matthieu (Creator), Pietquin, Olivier (Creator)
Summary:Imitation Learning (IL) methods seek to match the behavior of an agent with that of an expert. In the present work, we propose a new IL method based on a conceptually simple algorithm: Primal Wasserstein Imitation Learning (PWIL), which ties to the primal form of the Wasserstein distance between the expert and the agent state-action distributions. We present a reward function which is derived offline, as opposed to recent adversarial IL algorithms that learn a reward function through interactions with the environment, and which requires little fine-tuning. We show that we can recover expert behavior on a variety of continuous control tasks of the MuJoCo domain in a sample efficient manner in terms of agent interactions and of expert interactions with the environment. Finally, we show that the behavior of the agent we train matches the behavior of the expert with the Wasserstein distance, rather than the commonly used proxy of performance.
Downloadable Archival Material, 2020-06-08
Undefined
Publisher:2020-06-08
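A heavily simplified sketch of the offline, distance-based reward idea described above: the agent is rewarded for visiting state-action pairs close to not-yet-matched expert pairs. The actual PWIL reward is built from a greedy primal coupling with per-step weight budgets and a specific exponential scaling; the class below, its name and its scale parameter are invented for illustration only.

import numpy as np

class DistanceImitationReward:
    # Illustrative only: greedy one-to-one matching of agent state-action pairs
    # to remaining expert atoms, with an exponentially decaying reward in the gap.
    def __init__(self, expert_sa, scale=5.0):
        self.remaining = [np.asarray(e, dtype=float) for e in expert_sa]
        self.scale = scale

    def __call__(self, sa):
        if not self.remaining:
            return 0.0
        sa = np.asarray(sa, dtype=float)
        dists = [np.linalg.norm(sa - e) for e in self.remaining]
        j = int(np.argmin(dists))
        self.remaining.pop(j)             # consume the matched expert atom
        return float(np.exp(-self.scale * dists[j]))

expert = np.random.default_rng(0).normal(size=(100, 4))      # fake expert state-action pairs
reward = DistanceImitationReward(expert)
print(reward(expert[0] + 0.01), reward(np.zeros(4) + 10.0))  # close pair vs far pair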


A Riemannian Block Coordinate Descent Method for Computing the Projection Robust Wasserstein Distance
Authors:Huang, Minhui (Creator), Ma, Shiqian (Creator), Lai, Lifeng (Creator)
Summary:The Wasserstein distance has become increasingly important in machine learning and deep learning. Despite its popularity, the Wasserstein distance is hard to approximate because of the curse of dimensionality. A recently proposed approach to alleviate the curse of dimensionality is to project the sampled data from the high dimensional probability distribution onto a lower-dimensional subspace, and then compute the Wasserstein distance between the projected data. However, this approach requires to solve a max-min problem over the Stiefel manifold, which is very challenging in practice. The only existing work that solves this problem directly is the RGAS (Riemannian Gradient Ascent with Sinkhorn Iteration) algorithm, which requires to solve an entropy-regularized optimal transport problem in each iteration, and thus can be costly for large-scale problems. In this paper, we propose a Riemannian block coordinate descent (RBCD) method to solve this problem, which is based on a novel reformulation of the regularized max-min problem over the Stiefel manifold. We show that the complexity of arithmetic operations for RBCD to obtain an $\epsilon$-stationary point is $O(\epsilon^{-3})$. This significantly improves the corresponding complexity of RGAS, which is $O(\epsilon^{-12})$. Moreover, our RBCD has very low per-iteration complexity, and hence is suitable for large-scale problems. Numerical results on both synthetic and real datasets demonstrate that our method is more efficient than existing methods, especially when the number of sampled data is very largeShow more
Downloadable Archival Material, 2020-12-09
Undefined
Publisher:2020-12-09


2020


Principled learning method for Wasserstein distributionally robust optimization with local perturbations
Authors:Kwon, Yongchan (Creator), Kim, Wonyoung (Creator), Won, Joong-Ho (Creator), Paik, Myunghee Cho (Creator)
Summary:Wasserstein distributionally robust optimization (WDRO) attempts to learn a model that minimizes the local worst-case risk in the vicinity of the empirical data distribution defined by Wasserstein ball. While WDRO has received attention as a promising tool for inference since its introduction, its theoretical understanding has not been fully matured. Gao et al. (2017) proposed a minimizer based on a tractable approximation of the local worst-case risk, but without showing risk consistency. In this paper, we propose a minimizer based on a novel approximation theorem and provide the corresponding risk consistency results. Furthermore, we develop WDRO inference for locally perturbed data that include the Mixup (Zhang et al., 2017) as a special case. We show that our approximation and risk consistency results naturally extend to the cases when data are locally perturbed. Numerical experiments demonstrate robustness of the proposed method using image classification datasets. Our results show that the proposed method achieves significantly higher accuracy than baseline models on noisy datasetsShow more
Downloadable Archival Material, 2020-06-05
Undefined
Publisher:2020-06-05 


Fair Regression with Wasserstein Barycenters
Authors:Chzhen, Evgenii (Creator), Denis, Christophe (Creator), Hebiri, Mohamed (Creator), Oneto, Luca (Creator), Pontil, Massimiliano (Creator)
Summary:We study the problem of learning a real-valued function that satisfies the Demographic Parity constraint, which demands that the distribution of the predicted output be independent of the sensitive attribute. We consider the case where the sensitive attribute is available for prediction. We establish a connection between fair regression and optimal transport theory, from which we derive a closed-form expression for the optimal fair predictor. Specifically, we show that the distribution of this optimum is the Wasserstein barycenter of the distributions induced by the standard regression function on the sensitive groups. This result offers an intuitive interpretation of the optimal fair prediction and suggests a simple post-processing algorithm to achieve fairness. We establish risk and distribution-free fairness guarantees for this procedure. Numerical experiments indicate that our method is very effective in learning fair models, with a relative increase in error rate that is smaller than the relative gain in fairness.
Downloadable Archival Material, 2020-06-12
Undefined
Publisher:2020-06-12
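A rough empirical sketch of the post-processing suggested by the result above: in one dimension the Wasserstein barycenter of the group-wise prediction distributions has a quantile function equal to the group-weighted average of the group quantile functions, so each prediction is sent to its within-group rank and then read off that averaged quantile curve. The function name, quantile grid and interpolation are illustrative choices, not the authors' estimator.

import numpy as np

def fair_post_process(preds, groups):
    # preds: base regression predictions; groups: sensitive attribute per sample
    preds, groups = np.asarray(preds, float), np.asarray(groups)
    uniq, counts = np.unique(groups, return_counts=True)
    weights = counts / counts.sum()
    levels = np.linspace(0.0, 1.0, 201)
    # weighted average of the per-group empirical quantile functions = 1-D barycenter
    bary_q = sum(w * np.quantile(preds[groups == g], levels) for g, w in zip(uniq, weights))
    out = np.empty_like(preds)
    for g in uniq:
        m = groups == g
        ranks = np.searchsorted(np.sort(preds[m]), preds[m], side="right") / m.sum()
        out[m] = np.interp(ranks, levels, bary_q)   # push group-g predictions onto the barycenter
    return out

rng = np.random.default_rng(0)
g = rng.integers(0, 2, 1000)
p = rng.normal(loc=1.0 * g, scale=1.0)              # group-dependent base predictions
fair = fair_post_process(p, g)
print(fair[g == 0].mean(), fair[g == 1].mean())     # group means roughly agree after adjustment

After this mapping the predicted output distribution is (approximately) the same in every group, which is exactly the Demographic Parity requirement discussed in the abstract.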



Convergence of Recursive Stochastic Algorithms using Wasserstein Divergence
Authors:Gupta, Abhishek (Creator), Haskell, William B. (Creator)
Summary:This paper develops a unified framework, based on iterated random operator theory, to analyze the convergence of constant stepsize recursive stochastic algorithms (RSAs). RSAs use randomization to efficiently compute expectations, and so their iterates form a stochastic process. The key idea of our analysis is to lift the RSA into an appropriate higher-dimensional space and then express it as an equivalent Markov chain. Instead of determining the convergence of this Markov chain (which may not converge under constant stepsize), we study the convergence of the distribution of this Markov chain. To study this, we define a new notion of Wasserstein divergence. We show that if the distribution of the iterates in the Markov chain satisfy a contraction property with respect to the Wasserstein divergence, then the Markov chain admits an invariant distribution. We show that convergence of a large family of constant stepsize RSAs can be understood using this framework, and we provide several detailed examplesShow more
Downloadable Archival Material, 2020-03-25
Undefined
Publisher:2020-03-25



Uncoupled isotonic regression via minimum Wasserstein deconvolution
Authors:Massachusetts Institute of Technology Department of Mathematics (Contributor), Rigollet, Philippe (Creator), Weed, Jonathan (Creator)
Summary:Isotonic regression is a standard problem in shape-constrained estimation where the goal is to estimate an unknown nondecreasing regression function $f$ from independent pairs $(x_i, y_i)$ where $\mathbb{E}[y_i] = f(x_i)$, $i = 1, \ldots, n$. While this problem is well understood both statistically and computationally, much less is known about its uncoupled counterpart, where one is given only the unordered sets $\{x_1, \ldots, x_n\}$ and $\{y_1, \ldots, y_n\}$. In this work, we leverage tools from optimal transport theory to derive minimax rates under weak moment conditions on $y_i$ and to give an efficient algorithm achieving optimal rates. Both upper and lower bounds employ moment-matching arguments that are also pertinent to learning mixtures of distributions and deconvolution.
Downloadable Archival Material, 2020-08-21T13:00:11Z
English
Publisher:Oxford University Press (OUP), 2020-08-21T13:00:11Z


SVGD as a kernelized Wasserstein gradient flow of the chi-squared divergence
Authors:Chewi, Sinho (Creator), Gouic, Thibaut Le (Creator), Lu, Chen (Creator), Maunu, Tyler (Creator), Rigollet, Philippe (Creator)
Summary:Stein Variational Gradient Descent (SVGD), a popular sampling algorithm, is often described as the kernelized gradient flow for the Kullback-Leibler divergence in the geometry of optimal transport. We introduce a new perspective on SVGD that instead views SVGD as the (kernelized) gradient flow of the chi-squared divergence which, we show, exhibits a strong form of uniform exponential ergodicity under conditions as weak as a Poincar\'e inequality. This perspective leads us to propose an alternative to SVGD, called Laplacian Adjusted Wasserstein Gradient Descent (LAWGD), that can be implemented from the spectral decomposition of the Laplacian operator associated with the target density. We show that LAWGD exhibits strong convergence guarantees and good practical performance.
Downloadable Archival Material, 2020-06-03
Undefined
Publisher:2020-06-03
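For reference, a compact sketch of the plain SVGD particle update that the entry above reinterprets as a kernelized gradient flow; the RBF kernel, median-heuristic bandwidth and step size are the usual illustrative defaults, not the paper's LAWGD construction (which instead uses the spectral decomposition of a Laplacian tied to the target density).

import numpy as np

def svgd_step(x, grad_logp, step=0.1, bandwidth=None):
    # x: (n, d) particles; grad_logp(x): (n, d) gradients of the log target density
    n = x.shape[0]
    diff = x[:, None, :] - x[None, :, :]                   # diff[j, i] = x_j - x_i
    sq = (diff ** 2).sum(-1)
    if bandwidth is None:
        bandwidth = np.median(sq) / np.log(n + 1) + 1e-12  # median heuristic
    k = np.exp(-sq / bandwidth)                            # RBF kernel matrix
    grad_k = -2.0 / bandwidth * k[:, :, None] * diff       # gradient of k in its first argument
    phi = (k @ grad_logp(x) + grad_k.sum(axis=0)) / n      # driving + repulsive terms
    return x + step * phi

rng = np.random.default_rng(0)
x = rng.uniform(-4, 4, size=(200, 2))
for _ in range(300):
    x = svgd_step(x, lambda z: -z)                         # standard normal target
print(x.mean(axis=0), x.std(axis=0))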

<——2020——–2020—––3750— 



Peer-reviewed
Sampling of probability measures in the convex order by Wasserstein projection
Authors:Aurélien Alfonsi, Jacopo Corbetta, Benjamin Jourdain
Summary:In this paper, for $\mu $ and $\nu $ two probability measures on $\mathbb{R}^{d}$ with finite moments of order $\varrho \ge 1$, we define the respective projections for the $W_{\varrho}$-Wasserstein distance of $\mu $ and $\nu $ on the sets of probability measures dominated by $\nu $ and of probability measures larger than $\mu $ in the convex order. The $W_{2}$-projection of $\mu $ can be easily computed when $\mu $ and $\nu $ have finite support by solving a quadratic optimization problem with linear constraints. In dimension $d=1$, Gozlan et al. (Ann. Inst. Henri Poincaré Probab. Stat. 54 (3) (2018) 1667–1693) have shown that the projection of $\mu$ does not depend on $\varrho $. We explicit their quantile functions in terms of those of $\mu $ and $\nu $. The motivation is the design of sampling techniques preserving the convex order in order to approximate Martingale Optimal Transport problems by using linear programming solvers. We prove convergence of the Wasserstein projection based sampling methods as the sample sizes tend to infinity and illustrate them by numerical experimentsShow more
Downloadable Article
Publication:https://projecteuclid.org/euclid.aihp/1593137306 Ann. Inst. H. Poincaré Probab. Statist., 56, 2020-08, 1706
 
Semi-supervised biomedical translation with cycle Wasserstein regression GaNs
Authors:Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science (Contributor), Massachusetts Institute of Technology Computer Science and Artificial Intelligence Laboratory (Contributor), McDermott, Matthew (Creator), Yan, Tom (Creator), Naumann, Tristan (Creator), Hunt, Nathan (Creator), Suresh, Harini S. (Creator), Szolovits, Peter (Creator), Ghassemi, Marzyeh (Creator)
Summary:The biomedical field offers many learning tasks that share unique challenges: large amounts of unpaired data, and a high cost to generate labels. In this work, we develop a method to address these issues with semi-supervised learning in regression tasks (e.g., translation from source to target). Our model uses adversarial signals to learn from unpaired datapoints, and imposes a cycle-loss reconstruction error penalty to regularize mappings in either direction against one another. We first evaluate our method on synthetic experiments, demonstrating two primary advantages of the system: 1) distribution matching via the adversarial loss and 2) regularization towards invertible mappings via the cycle loss. We then show a regularization effect and improved performance when paired data is supplemented by additional unpaired data on two real biomedical regression tasks: estimating the physiological effect of medical treatments, and extrapolating gene expression (transcriptomics) signals. Our proposed technique is a promising initial step towards more robust use of adversarial signals in semi-supervised regression, and could be useful for other tasks (e.g., causal inference or modality translation) in the biomedical fieldShow more
Downloadable Archival Material, 2020-04-15T18:40:19Z
English
Publisher:2020-04-15T18:40:19Z


Wasserstein Distributionally Robust Inverse Multiobjective Optimization
Authors:Dong, Chaosheng (Creator), Zeng, Bo (Creator)
Summary:Inverse multiobjective optimization provides a general framework for the unsupervised learning task of inferring parameters of a multiobjective decision making problem (DMP), based on a set of observed decisions from the human expert. However, the performance of this framework relies critically on the availability of an accurate DMP, sufficient decisions of high quality, and a parameter space that contains enough information about the DMP. To hedge against the uncertainties in the hypothetical DMP, the data, and the parameter space, we investigate in this paper the distributionally robust approach for inverse multiobjective optimization. Specifically, we leverage the Wasserstein metric to construct a ball centered at the empirical distribution of these decisions. We then formulate a Wasserstein distributionally robust inverse multiobjective optimization problem (WRO-IMOP) that minimizes a worst-case expected loss function, where the worst case is taken over all distributions in the Wasserstein ball. We show that the excess risk of the WRO-IMOP estimator has a sub-linear convergence rate. Furthermore, we propose the semi-infinite reformulations of the WRO-IMOP and develop a cutting-plane algorithm that converges to an approximate solution in finite iterations. Finally, we demonstrate the effectiveness of our method on both a synthetic multiobjective quadratic program and a real world portfolio optimization problemShow more
Downloadable Archival Material, 2020-09-30
Undefined
Publisher:2020-09-30

First-Order Methods for Wasserstein Distributionally Robust MDP
Authors:Grand-Clément, Julien (Creator), Kroer, Christian (Creator)
Summary:Markov decision processes (MDPs) are known to be sensitive to parameter specification. Distributionally robust MDPs alleviate this issue by allowing for \emph{ambiguity sets} which give a set of possible distributions over parameter sets. The goal is to find an optimal policy with respect to the worst-case parameter distribution. We propose a framework for solving Distributionally robust MDPs via first-order methods, and instantiate it for several types of Wasserstein ambiguity sets. By developing efficient proximal updates, our algorithms achieve a convergence rate of $O\left(NA^{2.5}S^{3.5}\log(S)\log(\epsilon^{-1})\epsilon^{-1.5} \right)$ for the number of kernels $N$ in the support of the nominal distribution, states $S$, and actions $A$; this rate varies slightly based on the Wasserstein setup. Our dependence on $N,A$ and $S$ is significantly better than existing methods, which have a complexity of $O\left(N^{3.5}A^{3.5}S^{4.5}\log^{2}(\epsilon^{-1}) \right)$. Numerical experiments show that our algorithm is significantly more scalable than state-of-the-art approaches across several domainsShow more
Downloadable Archival Material, 2020-09-14
Undefined
Publisher:2020-09-14



Fixed-Support Wasserstein Barycenters: Computational Hardness and Fast Algorithm
Authors:Lin, Tianyi (Creator), Ho, Nhat (Creator), Chen, Xi (Creator), Cuturi, Marco (Creator), Jordan, Michael I. (Creator)
Summary:We study the fixed-support Wasserstein barycenter problem (FS-WBP), which consists in computing the Wasserstein barycenter of $m$ discrete probability measures supported on a finite metric space of size $n$. We show first that the constraint matrix arising from the standard linear programming (LP) representation of the FS-WBP is \textit{not totally unimodular} when $m \geq 3$ and $n \geq 3$. This result resolves an open question pertaining to the relationship between the FS-WBP and the minimum-cost flow (MCF) problem since it proves that the FS-WBP in the standard LP form is not an MCF problem when $m \geq 3$ and $n \geq 3$. We also develop a provably fast \textit{deterministic} variant of the celebrated iterative Bregman projection (IBP) algorithm, named \textsc{FastIBP}, with a complexity bound of $\tilde{O}(mn^{7/3}\varepsilon^{-4/3})$, where $\varepsilon \in (0, 1)$ is the desired tolerance. This complexity bound is better than the best known complexity bound of $\tilde{O}(mn^2\varepsilon^{-2})$ for the IBP algorithm in terms of $\varepsilon$, and that of $\tilde{O}(mn^{5/2}\varepsilon^{-1})$ from accelerated alternating minimization algorithm or accelerated primal-dual adaptive gradient algorithm in terms of $n$. Finally, we conduct extensive experiments with both synthetic data and real images and demonstrate the favorable performance of the \textsc{FastIBP} algorithm in practiceShow more
Downloadable Archival Material, 2020-02-11
Undefined
Publisher:2020-02-11
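For context, a small sketch of the classical iterative Bregman projection (IBP) loop for fixed-support barycenters that the FastIBP method above accelerates; the entropic regularization eps, the iteration count and the variable names are illustrative, and none of the paper's complexity improvements are attempted.

import numpy as np

def ibp_barycenter(P, C, weights, eps=0.05, n_iter=500):
    # P: (n, m) matrix whose columns are m probability vectors on a shared support,
    # C: (n, n) ground cost, weights: (m,) barycenter weights summing to 1
    K = np.exp(-C / eps)
    V = np.ones_like(P)
    for _ in range(n_iter):
        U = P / (K @ V)                                      # match the input marginals
        q = np.prod((K.T @ U) ** weights[None, :], axis=1)   # geometric mean = barycenter iterate
        V = q[:, None] / (K.T @ U)                           # match the barycenter marginal
    return q / q.sum()

grid = np.linspace(0.0, 1.0, 50)
C = (grid[:, None] - grid[None, :]) ** 2
p1 = np.exp(-((grid - 0.2) ** 2) / 0.005)
p2 = np.exp(-((grid - 0.8) ** 2) / 0.005)
P = np.stack([p1 / p1.sum(), p2 / p2.sum()], axis=1)
q = ibp_barycenter(P, C, np.array([0.5, 0.5]))
print("barycenter mode near", grid[np.argmax(q)])            # close to 0.5 for the quadratic cost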

2020


Safe Zero-Shot Model-Based Learning and Control: A Wasserstein Distributionally Robust Approach
Authors:Kandel, Aaron (Creator), Moura, Scott J. (Creator)
Summary:This paper presents a novel application of Wasserstein ambiguity sets to robustify online zero-shot learning and control. We identify and focus on scenarios of learning and controlling a system from scratch, starting with a randomly initialized model based on the strongest possible limitations on our prior knowledge of the dynamics. This paper labels this scenario as a "zero-shot" control problem, based on popular zero-shot transfer problems in machine learning. In this case, we adopt a loosely similar nomenclature to refer to a controller that must safely control a system it has truly never experienced or interacted with. Popular and current state-of-the-art methods in learning and control typically place more emphasis on model adaptation, and frequently require significant a-priori assumptions on knowledge of system dynamics and safe reference trajectories. Episodic designs are also commonplace in such applications, where constraint violation frequently occurs with gradually reduced frequency over the course of many sequential episodes of learning. We address the identified problem of single-episode zero-shot control by presenting a Wasserstein distributionally robust approach which, coupled with a receding horizon control scheme, can safely learn and control a dynamical system in a single episodeShow more
Downloadable Archival Material, 2020-04-01
Undefined
Publisher:2020-04-01


Peer-reviewed
A Central Limit Theorem for Wasserstein type distances between two distinct univariate distributions
Authors:Philippe Berthet, Jean-Claude Fort, Thierry Klein
Summary:In this article we study the natural nonparametric estimator of a Wasserstein type cost between two distinct continuous distributions $F$ and $G$ on $\mathbb{R}$. The estimator is based on the order statistics of a sample having marginals $F$, $G$ and any joint distribution. We prove a central limit theorem under general conditions relating the tails and the cost function. In particular, these conditions are satisfied by Wasserstein distances of order $p>1$ and compatible classical probability distributions.
Downloadable Article
Publication:https://projecteuclid.org/euclid.aihp/1584345626 Ann. Inst. H. Poincaré Probab. Statist., 56, 2020-05, 954
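The estimator studied above is simple to state in code: for two equal-size univariate samples, the empirical Wasserstein-p cost pairs the order statistics of the samples. A minimal version (illustrative names; the paper's contribution is the central limit theorem for this statistic, not its computation):

import numpy as np

def empirical_wp(x, y, p=2):
    # empirical W_p between two equal-size 1-D samples via sorted (order-statistic) pairing
    x, y = np.sort(np.asarray(x, float)), np.sort(np.asarray(y, float))
    assert len(x) == len(y), "this shortcut assumes equal sample sizes"
    return np.mean(np.abs(x - y) ** p) ** (1.0 / p)

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 5000)
y = rng.normal(0.5, 1.3, 5000)
print(empirical_wp(x, y, p=2))   # close to the true W_2 between the two Gaussians (~0.58)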


Online Stochastic Convex Optimization: Wasserstein Distance Variation
Authors:Shames, Iman (Creator), Farokhi, Farhad (Creator)
Summary:Distributionally-robust optimization is often studied for a fixed set of distributions rather than time-varying distributions that can drift significantly over time (which is, for instance, the case in finance and sociology due to underlying expansion of economy and evolution of demographics). This motivates understanding conditions on probability distributions, using the Wasserstein distance, that can be used to model time-varying environments. We can then use these conditions in conjunction with online stochastic optimization to adapt the decisions. We considers an online proximal-gradient method to track the minimizers of expectations of smooth convex functions parameterised by a random variable whose probability distributions continuously evolve over time at a rate similar to that of the rate at which the decision maker acts. We revisit the concepts of estimation and tracking error inspired by systems and control literature and provide bounds for them under strong convexity, Lipschitzness of the gradient, and bounds on the probability distribution drift. Further, noting that computing projections for a general feasible sets might not be amenable to online implementation (due to computational constraints), we propose an exact penalty method. Doing so allows us to relax the uniform boundedness of the gradient and establish dynamic regret bounds for tracking and estimation error. We further introduce a constraint-tightening approach and relate the amount of tightening to the probability of satisfying the constraintsShow more
Downloadable Archival Material, 2020-06-02
Undefined
Publisher:2020-06-02


Unsupervised Multilingual Alignment using Wasserstein Barycenter
Authors:Lian, Xin (Creator), Jain, Kshitij (Creator), Truszkowski, Jakub (Creator), Poupart, Pascal (Creator), Yu, Yaoliang (Creator)
Summary:We study unsupervised multilingual alignment, the problem of finding word-to-word translations between multiple languages without using any parallel data. One popular strategy is to reduce multilingual alignment to the much simplified bilingual setting, by picking one of the input languages as the pivot language that we transit through. However, it is well-known that transiting through a poorly chosen pivot language (such as English) may severely degrade the translation quality, since the assumed transitive relations among all pairs of languages may not be enforced in the training process. Instead of going through a rather arbitrarily chosen pivot language, we propose to use the Wasserstein barycenter as a more informative "mean" language: it encapsulates information from all languages and minimizes all pairwise transportation costs. We evaluate our method on standard benchmarks and demonstrate state-of-the-art performancesShow more
Downloadable Archival Material, 2020-01-28
Undefined
Publisher:2020-01-28


Wasserstein barycenters can be computed in polynomial time in fixed dimension
Authors:Altschuler, Jason M. (Creator), Boix-Adsera, Enric (Creator)
Summary:Computing Wasserstein barycenters is a fundamental geometric problem with widespread applications in machine learning, statistics, and computer graphics. However, it is unknown whether Wasserstein barycenters can be computed in polynomial time, either exactly or to high precision (i.e., with $\textrm{polylog}(1/\varepsilon)$ runtime dependence). This paper answers these questions in the affirmative for any fixed dimension. Our approach is to solve an exponential-size linear programming formulation by efficiently implementing the corresponding separation oracle using techniques from computational geometryShow more
Downloadable Archival Material, 2020-06-14
Undefined

Publisher:2020-06-14

<——2020——–2020—––3760—



Data-driven Distributionally Robust Optimal Stochastic Control Using the Wasserstein Metric
Authors:Zhao, Feiran (Creator), You, Keyou (Creator)
Summary:Optimal control of a stochastic dynamical system usually requires a good dynamical model with probability distributions, which is difficult to obtain due to limited measurements and/or complicated dynamics. To solve it, this work proposes a data-driven distributionally robust control framework with the Wasserstein metric via a constrained two-player zero-sum Markov game, where the adversarial player selects the probability distribution from a Wasserstein ball centered at an empirical distribution. Then, the game is approached by its penalized version, an optimal stabilizing solution of which is derived explicitly in a linear structure under the Riccati-type iterations. Moreover, we design a model-free Q-learning algorithm with global convergence to learn the optimal controller. Finally, we verify the effectiveness of the proposed learning algorithm and demonstrate its robustness to the probability distribution errors via numerical examplesShow more
Downloadable Archival Material, 2020-10-13
Undefined
Publisher:2020-10-13


Stochastic Optimization for Regularized Wasserstein Estimators
Authors:Ballu, Marin (Creator), Berthet, Quentin (Creator), Bach, Francis (Creator)
Summary:Optimal transport is a foundational problem in optimization, that allows to compare probability distributions while taking into account geometric aspects. Its optimal objective value, the Wasserstein distance, provides an important loss between distributions that has been used in many applications throughout machine learning and statistics. Recent algorithmic progress on this problem and its regularized versions have made these tools increasingly popular. However, existing techniques require solving an optimization problem to obtain a single gradient of the loss, thus slowing down first-order methods to minimize the sum of losses, that require many such gradient computations. In this work, we introduce an algorithm to solve a regularized version of this problem of Wasserstein estimators, with a time per step which is sublinear in the natural dimensions of the problem. We introduce a dual formulation, and optimize it with stochastic gradient steps that can be computed directly from samples, without solving additional optimization problems at each step. Doing so, the estimation and computation tasks are performed jointly. We show that this algorithm can be extended to other tasks, including estimation of Wasserstein barycenters. We provide theoretical guarantees and illustrate the performance of our algorithm with experiments on synthetic dataShow more
Downloadable Archival Material, 2020-02-20
Undefined
Publisher:2020-02-20


On Linear Optimization over Wasserstein Balls
Authors:Yue, Man-Chung (Creator), Kuhn, Daniel (Creator), Wiesemann, Wolfram (Creator)
Summary:Wasserstein balls, which contain all probability measures within a pre-specified Wasserstein distance to a reference measure, have recently enjoyed wide popularity in the distributionally robust optimization and machine learning communities to formulate and solve data-driven optimization problems with rigorous statistical guarantees. In this technical note we prove that the Wasserstein ball is weakly compact under mild conditions, and we offer necessary and sufficient conditions for the existence of optimal solutions. We also characterize the sparsity of solutions if the Wasserstein ball is centred at a discrete reference measure. In comparison with the existing literature, which has proved similar results under different conditions, our proofs are self-contained and shorter, yet mathematically rigorous, and our necessary and sufficient conditions for the existence of optimal solutions are easily verifiable in practiceShow more
Downloadable Archival Material, 2020-04-15
Undefined
Publisher:2020-04-15



Peer-reviewed
An LP-based, strongly-polynomial 2-approximation algorithm for sparse Wasserstein barycenters
Author:Steffen Borgwardt
Summary:Abstract: Discrete Wasserstein barycenters correspond to optimal solutions of transportation problems for a set of probability measures with finite support. Discrete barycenters are measures with finite support themselves and exhibit two favorable properties: there always exists one with a provably sparse support, and any optimal transport to the input measures is non-mass splitting. It is open whether a discrete barycenter can be computed in polynomial time. It is possible to find an exact barycenter through linear programming, but these programs may scale exponentially. In this paper, we prove that there is a strongly-polynomial 2-approximation algorithm based on linear programming. First, we show that an exact computation over the union of supports of the input measures gives a tight 2-approximation. This computation can be done through a linear program with setup and solution in strongly-polynomial time. The resulting measure is sparse, but an optimal transport may split mass. We then devise a second, strongly-polynomial algorithm to improve this measure to one with a non-mass splitting transport of lower cost. The key step is an update of the possible support set to resolve mass split. Finally, we devise an iterative scheme that alternates between these two algorithms. The algorithm terminates with a 2-approximation that has both a sparse support and an associated non-mass splitting optimal transport. We conclude with some sample computations and an analysis of the scaling of our algorithms, exhibiting vast improvements in running time over exact LP-based computations and low practical errorsShow more
Article, 2020
Publication:Operational Research : An International Journal, 22, 20200803, 1511
Publisher:2020


Convergence in Monge-Wasserstein Distance of Mean Field Systems with Locally Lipschitz Coefficients
Authors:Dung Tien Nguyen, Son Luu Nguyen, Nguyen Huu Du
Summary:Abstract: This paper focuses on stochastic systems of weakly interacting particles whose dynamics depend on the empirical measures of the whole populations. The drift and diffusion coefficients of the dynamical systems are assumed to be locally Lipschitz continuous and satisfy global linear growth condition. The limits of such systems as the number of particles tends to infinity are studied, and the rate of convergence of the sequences of empirical measures to their limits in terms of pth Monge-Wasserstein distance is established. We also investigate the existence, uniqueness, and boundedness, and continuity of solutions of the limiting McKean-Vlasov equations associated to the systemsShow more
Article, 2020
Publication:Acta Mathematica Vietnamica, 45, 20200818, 875
Publisher:2020

 2020

Peer-reviewed
Data Augmentation Method for Switchgear Defect Samples Based on Wasserstein Generative Adversarial Network
Authors:Xueyou Huang, Jun Xiong, Yu Zhang, Jingyi Liang, Zhang Haoning, Hui Liu
Summary:Sample imbalance leads to poor generalization of deep learning models and to overfitting during network training, which limits the accuracy of intelligent fault diagnosis of switchgear equipment. In view of this, this paper proposes a data augmentation method for switchgear defect samples based on a Wasserstein generative adversarial network, using live partial discharge detection data from substations and real-time switchgear partial discharge simulation data. The method improves the imbalanced distribution of the data, mitigates problems such as vanishing gradients and mode collapse in the classic generative adversarial network model, and greatly improves training stability. The approach is verified on examples and compared with traditional data augmentation methods. The results show that the proposed data augmentation method more effectively reduces the data imbalance, improves the performance of data-driven techniques, and provides data support for subsequent fault diagnosis of switchgear equipment.
Article, 2020
Publication:1659, October 2020, 012056
Publisher:2020
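For readers unfamiliar with the training objective used in this and the other WGAN-GP entries, a short PyTorch-style sketch of the critic loss with gradient penalty; the penalty weight and the way the critic is called are generic defaults, not this paper's network or data pipeline.

import torch

def gradient_penalty(critic, real, fake, lam=10.0):
    # WGAN-GP: push the critic's gradient norm toward 1 on random interpolates
    alpha = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    inter = (alpha * real + (1.0 - alpha) * fake).requires_grad_(True)
    scores = critic(inter)
    grads = torch.autograd.grad(outputs=scores, inputs=inter,
                                grad_outputs=torch.ones_like(scores),
                                create_graph=True)[0]
    return lam * ((grads.flatten(1).norm(2, dim=1) - 1.0) ** 2).mean()

# critic loss (minimized):    critic(fake).mean() - critic(real).mean() + gradient_penalty(critic, real, fake)
# generator loss (minimized): -critic(fake).mean()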

Peer-reviewed
Tractable reformulations of two-stage distributionally robust linear programs over the type-∞ Wasserstein ball
Author:Weijun Xie
Summary:This paper studies a two-stage distributionally robust stochastic linear program under the type-∞ Wasserstein ball by providing sufficient conditions under which the program can be efficiently computed via a tractable convex program. By exploring the properties of binary variables, the developed reformulation techniques are extended to those with mixed binary random parameters. The main tractable reformulations are projected into the original decision space. The complexity analysis demonstrates that these tractable results are tight under the setting of this paper.
Article, 2020
Publication:Operations Research Letters, 48, 202007, 513
Publisher:2020

Peer-reviewed
A variational finite volume scheme for Wasserstein gradient flows
Authors:Clément Cancès, Thomas O. Gallouët, Gabriele Todeschi
Summary:Abstract: We propose a variational finite volume scheme to approximate the solutions to Wasserstein gradient flows. The time discretization is based on an implicit linearization of the Wasserstein distance expressed thanks to the Benamou–Brenier formula, whereas space discretization relies on upstream mobility two-point flux approximation finite volumes. The scheme is based on a first discretize then optimize approach in order to preserve the variational structure of the continuous model at the discrete level. It can be applied to a wide range of energies, guarantees non-negativity of the discrete solutions as well as decay of the energy. We show that the scheme admits a unique solution whatever the convex energy involved in the continuous problem, and we prove its convergence in the case of the linear Fokker–Planck equation with positive initial density. Numerical illustrations show that it is first order accurate in both time and space, and robust with respect to both the energy and the initial profile.
Article, 2020
Publication:Numerische Mathematik, 146, 20201008, 437


Peer-reviewed
Obtaining PET/CT images from non-attenuation corrected PET images in a single PET system using Wasserstein generative adversarial networks
Authors:Zhanli Hu, Yongchang Li, Sijuan Zou, Hengzhi Xue, Ziru Sang, Xin Liu, Yongfeng Yang, Xiaohua Zhu, Dong Liang, Hairong Zheng
Summary:Positron emission tomography (PET) imaging plays an indispensable role in early disease detection and postoperative patient staging diagnosis. However, PET imaging requires not only additional computed tomography (CT) imaging to provide detailed anatomical information but also attenuation correction (AC) maps calculated from CT images for precise PET quantification, which inevitably demands that patients undergo additional doses of ionizing radiation. To reduce the radiation dose and simultaneously obtain high-quality PET/CT images, in this work, we present an alternative based on deep learning that can estimate synthetic attenuation corrected PET (sAC PET) and synthetic CT (sCT) images from non-attenuation corrected PET (NAC PET) scans for whole-body PET/CT imaging. Our model consists of two stages: the first stage removes noise and artefacts in the NAC PET images to generate sAC PET images, and the second stage synthesizes CT images from the sAC PET images obtained in the first stage. Both stages employ the same deep Wasserstein generative adversarial network and identical loss functions, which encourage the proposed model to generate more realistic and satisfying output images. To evaluate the performance of the proposed algorithm, we conducted a comprehensive study on a total of 45 sets of paired PET/CT images of clinical patients. The final experimental results demonstrated that both the generated sAC PET and sCT images showed great similarity to true AC PET and true CT images based on both qualitative and quantitative analyses. These results also indicate that in the future, our proposed algorithm has tremendous potential for reducing the need for additional anatomic imaging in hybrid PET/CT systems or the need for lengthy MR sequence acquisition in hybrid PET/MRI systemsShow more
Article, 2020
Publication:65, 07 November 2020, 215010
Publisher:2020


Peer-reviewed
Fisher information regularization schemes for Wasserstein gradient flows
Authors:Wuchen Li, Jianfeng Lu, Li Wang
Summary:We propose a variational scheme for computing Wasserstein gradient flows. The scheme builds upon the Jordan-Kinderlehrer-Otto framework with the Benamou-Brenier's dynamic formulation of the quadratic Wasserstein metric and adds a regularization by the Fisher information. This regularization can be derived in terms of energy splitting and is closely related to the Schrödinger bridge problem. It improves the convexity of the variational problem and automatically preserves the non-negativity of the solution. As a result, it allows us to apply sequential quadratic programming to solve the sub-optimization problem. We further save the computational cost by showing that no additional time interpolation is needed in the underlying dynamic formulation of the Wasserstein-2 metric, and therefore, the dimension of the problem is vastly reduced. Several numerical examples, including porous media equation, nonlinear Fokker-Planck equation, aggregation diffusion equation, and Derrida-Lebowitz-Speer-Spohn equation, are provided. These examples demonstrate the simplicity and stableness of the proposed schemeShow more
Article
Publication:Journal of Computational Physics, 416, 2020-09-01

<——2020——–2020—––3770—



Peer-reviewed
Gromov–Hausdorff limit of Wasserstein spaces on point clouds
Author:Nicolás García Trillos
Summary:Abstract: We consider a point cloud $X_n$ uniformly distributed on the flat torus $\mathbb{T}^d = \mathbb{R}^d / \mathbb{Z}^d$, and construct a geometric graph on the cloud by connecting points that are within distance $\varepsilon_n$ of each other. We let $\mathcal{P}(X_n)$ be the space of probability measures on $X_n$ and endow it with a discrete Wasserstein distance $W_n$ as introduced independently in Chow et al. (Arch Ration Mech Anal 203:969–1008, 2012), Maas (J Funct Anal 261:2250–2292, 2011) and Mielke (Nonlinearity 24:1329–1346, 2011) for general finite Markov chains. We show that as long as $\varepsilon_n$ decays towards zero slower than an explicit rate depending on the level of uniformity of $X_n$, then the space $(\mathcal{P}(X_n), W_n)$ converges in the Gromov–Hausdorff sense towards the space of probability measures on $\mathbb{T}^d$ endowed with the Wasserstein distance. The analysis presented in this paper is a first step in the study of stability of evolution equations defined over random point clouds as the number of points grows to infinity.
Article, 2020
Publication:Calculus of Variations and Partial Differential Equations, 59, 20200311
Publisher:2020

 
Peer-reviewed
Adversarial sliced Wasserstein domain adaptation networks
Authors:Yun Zhang, Nianbin Wang, Shaobin Cai
Summary:Domain adaptation has become a resounding success in learning a domain-agnostic model that performs well on a target dataset by leveraging a source dataset with a related data distribution. Most existing works aim at learning domain-invariant features across different domains, but they ignore the discriminability of the learned features, although it is important for improving the model's performance. This paper proposes a novel adversarial sliced Wasserstein domain adaptation network (AWDAN) that uses a shared encoder and classifier along with a domain classifier to enhance the discriminability of the domain-invariant features. AWDAN utilizes adversarial learning to learn domain-invariant features in the feature space and simultaneously minimizes the sliced Wasserstein distance in the label space to enforce that the generated features are discriminative, which guarantees the transfer performance. Meanwhile, we propose to fix the weights of the pre-trained CNN backbone to guarantee its adaptability. We provide theoretical analysis to demonstrate the efficacy of AWDAN. Experimental results show that the proposed AWDAN significantly outperforms existing domain adaptation methods on three visual domain adaptation tasks. Feature visualizations verify that AWDAN learns both domain-invariant and discriminative features, and can achieve domain-agnostic feature learning.
Article, 2020
Publication:Image and Vision Computing, 102, 202010
Publisher:2020

Peer-reviewed
Optimal control of multiagent systems in the Wasserstein space
Authors:Chloé Jimenez, Antonio Marigonda, Marc Quincampoix
Summary:Abstract: This paper concerns a class of optimal control problems, where a central planner aims to control a multi-agent system in $\mathbb{R}^d$ in order to minimize a certain cost of Bolza type. At every time and for each agent, the set of admissible velocities, describing his/her underlying microscopic dynamics, depends both on his/her position and on the configuration of all the other agents at the same time. So the problem is naturally stated in the space of probability measures on $\mathbb{R}^d$ equipped with the Wasserstein distance. The main result of the paper gives a new characterization of the value function as the unique viscosity solution of a first order partial differential equation. We introduce and discuss several equivalent formulations of the concept of viscosity solutions in Wasserstein spaces suitable for obtaining a comparison principle for the Hamilton-Jacobi-Bellman equation associated with the above control problem.
Article, 2020
Publication:Calculus of Variations and Partial Differential Equations, 59, 20200302
Publisher:2020



Peer-reviewed
Speech Dereverberation Based on Improved Wasserstein Generative Adversarial Networks
Authors:Lufang Rao, Junmei Yang
Summary:In reality, the sound we hear is disturbed not only by noise but also by reverberation, whose effects are rarely taken into account. Recently, deep learning has shown great advantages in speech signal processing, yet among existing dereverberation approaches very few methods apply deep learning at the waveform level. In addition, in the case of severe reverberation, conventional speech dereverberation methods such as MCLP (multi-channel linear prediction) perform poorly. In this paper we propose a new speech dereverberation method based on an improved WGAN (Wasserstein Generative Adversarial Network), called WGAN-GP, whose generator uses strided convolutional networks and whose discriminator is structured on DNNs. Due to the addition of the gradient penalty term, WGAN-GP improves the stability of training and the generalization of the model. According to the experimental results, the proposed system performs better than MCLP in the case of severe reverberation. As the proposed WGAN-GP-based method can improve speech quality, speech signal processing systems may be able to apply it in the pre-processing stage.
Article, 2020
Publication:1621, August 2020, 012089
Publisher:2020


Intelligent Fault Diagnosis with a Deep Transfer Network based on Wasserstein Distance
Authors:Juan Xu, Jingkun Huang, Yukun Zhao, Long Zhou
Summary:Intelligent fault-diagnosis methods based on deep-learning technology have been very successful for complex industrial systems. However, a deep-learning-based fault classification model requires a large amount of labeled data, and the probability distributions of the training and test data should be the same. These two conditions are often not satisfied under practical working conditions. We therefore propose an intelligent fault-diagnosis method based on a deep adversarial transfer network for the case where the target domain has only unlabeled samples. The Wasserstein distance is used as a metric to learn a domain-independent feature through adversarial training between the generator and the domain discriminator. Meanwhile, a loss function for fault classification is designed which ensures that the learned feature contains no domain information while retaining fault classification information. As a result, cross-domain fault classification can be solved even when there is no labeled vibration data in the target domain. The experimental results show that in transfer tasks under different working conditions the fault classification accuracy exceeds 90%, which is approximately 10% higher than that of the comparison method.
Article
Publication:Procedia Computer Science, 174, 2020, 406

2020

 
Peer-reviewed
The quadratic Wasserstein metric for inverse data matching
Authors:Björn Engquist, Kui Ren, Yunan Yang
Summary:This work characterizes, analytically and numerically, two major effects of the quadratic Wasserstein ($W_2$) distance as the measure of data discrepancy in computational solutions of inverse problems. First, we show, in the infinite-dimensional setup, that the $W_2$ distance has a smoothing effect on the inversion process, making it robust against high-frequency noise in the data but leading to a reduced resolution for the reconstructed objects at a given noise level. Second, we demonstrate that, for some finite-dimensional problems, the $W_2$ distance leads to optimization problems that have better convexity than the classical $L^2$ and $\dot H^{-1}$ distances, making it a more preferred distance to use when solving such inverse matching problems.
Article, 2020
Publication:36, May 2020, 055001
Publisher:2020
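A small numerical illustration of the convexity point made above: when a unit-mass bump is fitted to a shifted copy of itself, the L2 misfit saturates once the bumps no longer overlap, whereas the W2 misfit keeps growing with the shift (so its square stays convex in the shift). The grid, the Gaussian profiles and the quantile-based W2 approximation are arbitrary illustrative choices.

import numpy as np

x = np.linspace(-10.0, 10.0, 2001)
ref = np.exp(-0.5 * (x + 2.0) ** 2)                      # reference "data" bump centred at -2

def w2_1d(f, g, x):
    # quadratic Wasserstein distance between two 1-D densities via their quantile functions
    f, g = f / np.trapz(f, x), g / np.trapz(g, x)
    dx = x[1] - x[0]
    F, G = np.cumsum(f) * dx, np.cumsum(g) * dx
    q = np.linspace(1e-3, 1.0 - 1e-3, 1000)
    return np.sqrt(np.trapz((np.interp(q, F, x) - np.interp(q, G, x)) ** 2, q))

for s in [0.0, 2.0, 4.0, 8.0]:
    model = np.exp(-0.5 * (x - (-2.0 + s)) ** 2)         # same bump shifted by s
    l2 = np.sqrt(np.trapz((model - ref) ** 2, x))
    print(f"shift {s:4.1f}:  L2 misfit {l2:6.3f}   W2 misfit {w2_1d(model, ref, x):6.3f}")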

Peer-reviewed
A collaborative filtering recommendation framework based on Wasserstein GAN
Authors:Rui Li, Fulan Qian, Xiuquan Du, Shu Zhao, Yanping Zhang
Summary:Compared with the original GAN, Wasserstein GAN minimizes the Wasserstein Distance between the generative distribution and the real distribution, can well capture the potential distribution of data and has achieved excellent results in image generation. However, the exploration of Wasserstein GAN on recommendation systems has received relatively less scrutiny. In this paper, we propose a collaborative filtering recommendation framework based on Wasserstein GAN called CFWGAN to improve recommendation accuracy. By learning the real user distribution, we can mine the potential nonlinear interactions between users and items, and capture users’ preferences for all items. Besides, we combine two positive and negative item sampling methods and add the reconstruction loss to the generator’s loss. This can well handle the problem of discrete data in recommendation (relative to the continuity of image data). By continuously approximating the generative distribution to the real user distribution, we can finally obtain better users’ preference information and provide higher accuracy in recommendation. We evaluate the CFWGAN model on three real-world datasets, and the empirical results show that our method is competitive with or superior to state-of-the-art approaches on the benchmark top-N recommendation taskShow more
Article, 2020
Publication:1684, November 2020, 012057
Publisher:2020

Peer-reviewed
Parameter estimation for biochemical reaction networks using Wasserstein distances
Authors:Kaan Öcal, Ramon Grima, Guido Sanguinetti
Summary:We present a method for estimating parameters in stochastic models of biochemical reaction networks by fitting steady-state distributions using Wasserstein distances. We simulate a reaction network at different parameter settings and train a Gaussian process to learn the Wasserstein distance between observations and the simulator output for all parameters. We then use Bayesian optimization to find parameters minimizing this distance based on the trained Gaussian process. The effectiveness of our method is demonstrated on the three-stage model of gene expression and a genetic feedback loop for which moment-based methods are known to perform poorly. Our method is applicable to any simulator model of stochastic reaction networks, including Brownian dynamics.
Article, 2020
Publication:53, 24 January 2020, 034002
Publisher:2020
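A stripped-down sketch of the fitting loop described above, with a naive grid search standing in for the paper's Gaussian-process surrogate and Bayesian optimization, and a Poisson sampler standing in for a real stochastic simulator of the reaction network; all rates and sample sizes are made up.

import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(1)
observed = rng.poisson(lam=20.0, size=2000)       # stand-in for measured steady-state copy numbers

def simulate_steady_state(k, gamma=1.0, n=2000):
    # placeholder: a birth-death process with birth rate k and decay rate gamma has a
    # Poisson(k / gamma) steady state; a real application would call an SSA/CME simulator here
    return rng.poisson(lam=k / gamma, size=n)

grid = np.linspace(5.0, 40.0, 36)
dists = [wasserstein_distance(observed, simulate_steady_state(k)) for k in grid]
print("best birth rate on the grid:", grid[int(np.argmin(dists))])   # close to the true value 20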

Peer-reviewed
Data supplement for a soft sensor using a new generative model based on a variational autoencoder and Wasserstein GAN
Authors:Xiao Wang, Han Liu
Summary:• We propose a generative model named VA-WGAN by integrating a VAE with WGAN to supplement training data for soft sensor modeling. The VA-WGAN generates the same distributions of real data from industrial processes, which is hard to achieve by traditional regression methods. • We merge and improve the optimization objectives of the VAE and WGAN to be the loss function of the model. In addition, the training procedure is improved to obtain stable convergence and high-quality generated samples.

In industrial process control, measuring some variables is difficult for environmental or cost reasons. This necessitates employing a soft sensor to predict these variables by using the collected data from easily measured variables. The prediction accuracy and computational speed in the modeling procedure of soft sensors could be improved with adequate training samples. However, the rough environment of some industrial fields makes it difficult to acquire enough samples for soft sensor modeling. Generative adversarial networks (GANs) and the variational autoencoder (VAE) are two prominent methods that have been employed for learning generative models. In the current work, the VA-WGAN combining VAE with Wasserstein generative adversarial networks (WGAN) as a generative model is established to produce new samples for soft sensors by using the decoder of VAE as the generator in WGAN. An actual industrial soft sensor with insufficient data is used to verify the data generation capability of the proposed model. According to the experimental results, the samples obtained with the proposed model more closely resemble the true samples compared with the other four common generative models. Moreover, the insufficiency of the training data and the prediction precision of soft sensors could be improved via these constructed samplesShow more
Article, 2020
Publication:Journal of Process Control, 85, 202001, 91
Publisher:2020
 


Multimedia Analysis and Fusion via Wasserstein Barycenter
Authors:Cong Jin, Junhao Wang, Jin Wei, Lifeng Tan, Shouxun Liu, Wei Zhao, Shan Liu, Xin Lv
Summary:Optimal transport distance, otherwise known as Wasserstein distance, recently has attracted attention in music signal processing and machine learning as powerful discrepancy measures for probability distributions. In this paper, we propose an ensemble approach with Wasserstein distance to integrate various music transcription methods and combine different music classification models so as to achieve a more robust solution. The main idea is to model the ensemble as a problem of Wasserstein Barycenter, where our two experimental results show that our ensemble approach outperforms existing methods to a significant extent. Our proposal offers a new visual angle on the application of Wasserstein distance through music transcription and music classification in multimedia analysis and fusion tasksShow more
Downloadable Article, 2020
Publication:International Journal of Networked and Distributed Computing (IJNDC), 2, 20200201
Publisher:2020

<——2020——–2020—––3780—



Peer-reviewed
Data augmentation in fault diagnosis based on the Wasserstein generative adversarial network with gradient penalty
Authors:Xin Gao, Fang Deng, Xianghu Yue
Summary:Fault detection and diagnosis in industrial processes is an extremely essential part of avoiding undesired events and ensuring the safety of operators and facilities. In the last few decades, various data-based machine learning algorithms have been widely studied to monitor machine condition and detect process faults. However, faulty datasets in industrial processes are hard to acquire, so scarce faulty data or imbalanced data distributions are common in industrial processes, making it difficult for many algorithms to accurately identify different faults. Therefore, in this paper, data augmentation approaches based on the Wasserstein generative adversarial network with gradient penalty (WGAN-GP) are investigated to generate data samples that supplement the low-data input set in the fault diagnosis field and help improve fault diagnosis accuracy. To verify their effectiveness, various classifiers are used and three industrial benchmark datasets are involved to evaluate the GAN-based data augmentation ability. The results show that the fault diagnosis accuracies of the classifiers increase on all datasets after employing the GAN-based data augmentation techniques.
Article, 2020
Publication:Neurocomputing, 396, 20200705, 487
Publisher:2020
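For readers unfamiliar with the gradient-penalty term that WGAN-GP adds to the critic loss, the following minimal PyTorch sketch shows one common way to compute it. It is an illustrative sketch assuming a generic critic network and batches real and fake of equal shape, not the implementation used in the paper above.

import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    # Sample random points on the lines between real and generated samples.
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    # Gradient of the critic output with respect to the interpolated inputs.
    grads = torch.autograd.grad(outputs=scores, inputs=interp,
                                grad_outputs=torch.ones_like(scores),
                                create_graph=True, retain_graph=True)[0]
    grads = grads.view(grads.size(0), -1)
    # Penalize deviation of the gradient norm from 1 (the Lipschitz constraint).
    return lambda_gp * ((grads.norm(2, dim=1) - 1.0) ** 2).mean()

The penalty is added to the critic loss; the original WGAN-GP paper uses a penalty weight of 10.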



Peer-reviewed
Nonpositive curvature, the variance functional, and the Wasserstein barycenter
Authors:Young-Heon KimBrendan Pass
Summary:We show that a Riemannian manifold $M$ has nonpositive sectional curvature and is simply connected if and only if the variance functional on the space $P(M)$ of probability measures over $M$ is displacement convex. We then establish convexity over Wasserstein barycenters of the variance, and derive an inequality between the variance of the Wasserstein and linear barycenters of a probability measure on $P(M)$. These results are applied to invariant measures under isometry group actions, implying that the variance of the Wasserstein projection to the set of invariant measures is less than that of the $L^2$ projection to the same set.
Downloadable Article, 2020
Publication:Proceedings of the American Mathematical Society, 148, April 1, 2020, 1745
Publisher:2020


  Peer-reviewed
Hyperbolic Wasserstein Distance for Shape Indexing
Authors:Yalin WangJie Shi
Summary:Shape space is an active research topic in computer vision and medical imaging fields. The distance defined in a shape space may provide a simple and refined index to represent a unique shape. This work studies the Wasserstein space and proposes a novel framework to compute the Wasserstein distance between general topological surfaces by integrating hyperbolic Ricci flow, hyperbolic harmonic map, and hyperbolic power Voronoi diagram algorithms. The resulting hyperbolic Wasserstein distance can intrinsically measure the similarity between general topological surfaces. Our proposed algorithms are theoretically rigorous and practically efficient. It has the potential to be a powerful tool for 3D shape indexing research. We tested our algorithm with human face classification and Alzheimer's disease (AD) progression tracking studies. Experimental results demonstrated that our work may provide a succinct and effective shape index.
Article, 2020
Publication:IEEE Transactions on Pattern Analysis & Machine Intelligence, 42, 202006, 1362
Publisher:2020


Peer-reviewed
Squared quadratic Wasserstein distance: optimal couplings and Lions differentiability*
Authors:Aurélien AlfonsiBenjamin Jourdain
Summary:In this paper, we remark that any optimal coupling for the quadratic Wasserstein distance $W_2^2(\mu,\nu)$ between two probability measures $\mu$ and $\nu$ with finite second order moments on $\mathbb{R}^d$ is the composition of a martingale coupling with an optimal transport map $T$. We check the existence of an optimal coupling in which this map gives the unique optimal coupling between $\mu$ and $T\#\mu$. Next, we give a direct proof that $\sigma \mapsto W_2^2(\sigma,\nu)$ is differentiable at $\mu$ in the Lions (Cours au Collège de France, 2008) sense iff there is a unique optimal coupling between $\mu$ and $\nu$ and this coupling is given by a map. It was known by combining results of Ambrosio, Gigli and Savaré (Lectures in Mathematics ETH Zürich. Birkhäuser Verlag, Basel, 2005) and Ambrosio and Gangbo (Comm. Pure Appl. Math., 61:18–53, 2008) that, under the latter condition, geometric differentiability holds. Moreover, the two notions of differentiability are equivalent according to the recent paper of Gangbo and Tudorascu (J. Math. Pures Appl. 125:119–174, 2019). Besides, we give a self-contained probabilistic proof that mere Fréchet differentiability of a law invariant function $F$ on $L^2(\Omega, \mathbb{P}; \mathbb{R}^d)$ is enough for the Fréchet differential at $X$ to be a measurable function of $X$.
Article, 2020
Publication:ESAIM: Probability and Statistics, 24, 2020, 703
Publisher:2020

Peer-reviewed
Progressive Wasserstein Barycenters of Persistence Diagrams
Authors:Jules VidalJoseph BudinJulien Tierny
Summary:This paper presents an efficient algorithm for the progressive approximation of Wasserstein barycenters of persistence diagrams, with applications to the visual analysis of ensemble data. Given a set of scalar fields, our approach enables the computation of a persistence diagram which is representative of the set, and which visually conveys the number, data ranges and saliences of the main features of interest found in the set. Such representative diagrams are obtained by computing explicitly the discrete Wasserstein barycenter of the set of persistence diagrams, a notoriously computationally intensive task. In particular, we revisit efficient algorithms for Wasserstein distance approximation [12,51] to extend previous work on barycenter estimation [94]. We present a new fast algorithm, which progressively approximates the barycenter by iteratively increasing the computation accuracy as well as the number of persistent features in the output diagram. Such progressivity drastically improves convergence in practice and allows us to design an interruptible algorithm, capable of respecting computation time constraints. This enables the approximation of Wasserstein barycenters within interactive times. We present an application to ensemble clustering where we revisit the k-means algorithm to exploit our barycenters and compute, within execution time constraints, meaningful clusters of ensemble data along with their barycenter diagram. Extensive experiments on synthetic and real-life data sets report that our algorithm converges to barycenters that are qualitatively meaningful with regard to the applications, and quantitatively comparable to previous techniques, while offering an order of magnitude speedup when run until convergence (without time constraint). Our algorithm can be trivially parallelized to provide additional speedups in practice on standard workstations. We provide a lightweight C++ implementation of our approach that can be used to reproduce our results.
Article, 2020
Publication:IEEE Transactions on Visualization & Computer Graphics, 26, 202001, 151
Publisher:2020


2020

Peer-reviewed
Data-Driven Distributionally Robust Unit Commitment With Wasserstein Metric: Tractable Formulation and Efficient Solution Method
Authors:Xiaodong ZhengHaoyong Chen
Summary:In this letter, we propose a tractable formulation and an efficient solution method for the Wasserstein-metric-based distributionally robust unit commitment (DRUC-dW) problem. First, a distance-based data aggregation method is introduced to hedge against the dimensionality issue arising from a huge volume of data. Then, we propose a novel cutting plane algorithm to solve the DRUC-dW problem much more efficiently than the state of the art. The novel solution method is termed extremal distribution generation, which is an extension of the column-and-constraint generation method to the distributionally robust cases. The feasibility and cost efficiency of the model, and the efficiency of the solution method are numerically validated.
Article, 2020
Publication:IEEE Transactions on Power Systems, 35, 202011, 4940
Publisher:2020

Cited by 23 Related articles All 2 versions


Biosignal Oversampling Using Wasserstein Generative Adversarial Network
Authors:Munawara Saiyara MuniaMehrdad NouraniSammy Houari2020 IEEE International Conference on Healthcare Informatics (ICHI)
Summary:Oversampling plays a vital role in improving the minority-class classification accuracy for imbalanced biomedical datasets. In this work, we propose a single-channel biosignal data generation method by exploiting the advancements in well-established image-based Generative Adversarial Networks (GANs). We have implemented a Wasserstein GAN (WGAN) to generate synthetic electrocardiogram (ECG) signals, due to their stability in training as well as correlation of the loss function with the generated image quality. We first trained the WGAN with fixed-dimensional images of the signal and generated synthetic data with similar characteristics. Two evaluation methods were then used for evaluating the efficiency of the proposed technique in generating synthetic ECG data. We used the Fréchet Inception Distance score for measuring synthetic image quality. We then performed a binary classification of normal and abnormal (Anterior Myocardial Infarction) ECG using a Support Vector Machine to verify the performance of the proposed method as an oversampling technique.
Chapter, 2020
Publication:2020 IEEE International Conference on Healthcare Informatics (ICHI), 202011, 1
Publisher:2020
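The Fréchet Inception Distance used for evaluation in the entry above is the 2-Wasserstein (Fréchet) distance between two Gaussians fitted to feature activations of the real and synthetic data. A minimal NumPy/SciPy sketch of that computation, assuming the feature matrices feat_real and feat_fake (shape n_samples x n_features) have already been extracted by some embedding network:

import numpy as np
from scipy import linalg

def frechet_distance(feat_real, feat_fake):
    # Fit a Gaussian (mean, covariance) to each feature set.
    mu1, mu2 = feat_real.mean(axis=0), feat_fake.mean(axis=0)
    cov1 = np.cov(feat_real, rowvar=False)
    cov2 = np.cov(feat_fake, rowvar=False)
    # Squared Frechet / 2-Wasserstein distance between the two Gaussians.
    covmean = linalg.sqrtm(cov1 @ cov2)
    if np.iscomplexobj(covmean):
        covmean = covmean.real  # drop tiny imaginary parts from sqrtm
    diff = mu1 - mu2
    return float(diff @ diff + np.trace(cov1 + cov2 - 2.0 * covmean))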

Peer-reviewed
Aggregated Wasserstein Distance and State Registration for Hidden Markov Models
Authors:Jia LiJianbo YeYukun Chen
Summary:We propose a framework, named Aggregated Wasserstein, for computing a dissimilarity measure or distance between two Hidden Markov Models with state conditional distributions being Gaussian. For such HMMs, the marginal distribution at any time position follows a Gaussian mixture distribution, a fact exploited to softly match, aka register, the states in two HMMs. We refer to such HMMs as HMM. The registration of states is inspired by the intrinsic relationship of optimal transport and the Wasserstein metric between distributions. Specifically, the components of the marginal GMMs are matched by solving an optimal transport problem where the cost between components is the Wasserstein metric for Gaussian distributions. The solution of the optimization problem is a fast approximation to the Wasserstein metric between two GMMs. The new Aggregated Wasserstein distance is a semi-metric and can be computed without generating Monte Carlo samples. It is invariant to relabeling or permutation of states. The distance is defined meaningfully even for two HMMs that are estimated from data of different dimensionality, a situation that can arise due to missing variables. This distance quantifies the dissimilarity of HMMs by measuring both the difference between the two marginal GMMs and that between the two transition matrices. Our new distance is tested on tasks of retrieval, classification, and t-SNE visualization of time series. Experiments on both synthetic and real data have demonstrated its advantages in terms of accuracy as well as efficiency in comparison with existing distances based on the Kullback-Leibler divergence.
Article, 2020
Publication:IEEE Transactions on Pattern Analysis & Machine Intelligence, 42, 202009, 2133
Publisher:2020



An Ensemble Wasserstein Generative Adversarial Network Method for Road Extraction From High Resolution Remote Sensing Images in Rural Areas
Show more
Authors:Chuan YangZhenghong Wang
Summary:Road extraction from high resolution remote sensing (HR-RS) images is an important yet challenging computer vision task. In this study, we propose an ensemble Wasserstein Generative Adversarial Network with Gradient Penalty (WGAN-GP) method called E-WGAN-GP for road extraction from HR-RS images in rural areas. The WGAN-GP model modifies the standard GANs with Wasserstein distance and gradient penalty. We add a spatial penalty term in the loss function of the WGAN-GP model to solve the class imbalance problem typically in road extraction. Parameter experiments are undertaken to determine the best spatial penalty and the weight term in the new loss function based on the GaoFen-2 dataset. In addition, we execute an ensemble strategy in which we first train two WGAN-GP models using the U-Net and BiSeNet as generator respectively, and then intersect their inferred outputs to yield better road vectors. We train our new model with GaoFen-2 HR-RS images in rural areas from China and also the DeepGlobe Road Extraction dataset. Compared with the U-Net, BiSeNet, D-LinkNet and WGAN-GP methods without ensemble, our new method makes a good trade-off between precision and recall with F1-score = 0.85 and IoU = 0.73.
Publication:IEEE Access, 8, 2020, 174317
Publisher:2020
 


Hausdorff and Wasserstein metrics on graphs and other structured data
Author:Evan Patterson
Summary:Abstract: Optimal transport is widely used in pure and applied mathematics to find probabilistic solutions to hard combinatorial matching problems. We extend the Wasserstein metric and other elements of optimal transport from the matching of sets to the matching of graphs and other structured data. This structure-preserving form of optimal transport relaxes the usual notion of homomorphism between structures. It applies to graphs—directed and undirected, labeled and unlabeled—and to any other structure that can be realized as a $\textsf{C}$-set for some finitely presented category $\textsf{C}$. We construct both Hausdorff-style and Wasserstein-style metrics on $\textsf{C}$-sets, and we show that the latter are convex relaxations of the former. Like the classical Wasserstein metric, the Wasserstein metric on $\textsf{C}$-sets is the value of a linear program and is therefore efficiently computable.
Article, 2020
Publication:Information and Inference: A Journal of the IMA, 10, 20200930, 1209
Publisher:2020

<——2020——–2020—––3790—



Peer-reviewed
Learning to Align via Wasserstein for Person Re-Identification
Authors:Zhizhong ZhangYuan XieDing LiWensheng ZhangQi Tian
Summary:Existing successful person re-identification (Re-ID) models often employ the part-level representation to extract fine-grained information, but commonly use a loss that is particularly designed for global features, ignoring the relationship between semantic parts. In this paper, we present a novel triplet loss that emphasizes the salient parts and also takes alignment into consideration. This loss is based on the cross-bin matching metric that is also known as the Wasserstein Distance. It measures how much effort it would take to move the embeddings of local features to align two distributions, such that it is able to find an optimal transport matrix to re-weight the distance of different local parts. The distributions in support of local parts are produced via a new attention mechanism, which is calculated by the inner product between the high-level global feature and local features, representing the importance of different semantic parts w.r.t. identification. We show that the obtained optimal transport matrix can not only distinguish the relevant and misleading parts, and hence assign different weights to them, but also rectify the original distance according to the learned distributions, resulting in an elegant solution for the mis-alignment issue. Besides, the proposed method is easily implemented in most Re-ID learning systems with an end-to-end training style, and can obviously improve their performance. Extensive experiments and comparisons with recent Re-ID methods manifest the competitive performance of our method.
Article, 2020
Publication:IEEE Transactions on Image Processing, 29, 2020, 7104
Publisher:2020

Peer-reviewed
Modeling EEG Data Distribution With a Wasserstein Generative Adversarial Network to Predict RSVP Events
Authors:Sharaj PanwarPaul RadTzyy-Ping JungYufei Huang
Summary:Electroencephalography (EEG) data are difficult to obtain due to complex experimental setups and reduced comfort with prolonged wearing. This poses challenges to training powerful deep learning models with the limited EEG data. Being able to generate EEG data computationally could address this limitation. We propose a novel Wasserstein Generative Adversarial Network with gradient penalty (WGAN-GP) to synthesize EEG data. This network addresses several modeling challenges of simulating time-series EEG data including frequency artifacts and training instability. We further extended this network to a class-conditioned variant that also includes a classification branch to perform event-related classification. We trained the proposed networks to generate one- and 64-channel data resembling EEG signals routinely seen in a rapid serial visual presentation (RSVP) experiment and demonstrated the validity of the generated samples. We also tested intra-subject cross-session classification performance for classifying the RSVP target events and showed that class-conditioned WGAN-GP can achieve improved event-classification performance over EEGNet.
Article, 2020
Publication:IEEE Transactions on Neural Systems and Rehabilitation Engineering, 28, 202008, 1720
Publisher:2020
Cited by 27
Related articles All 6 versions
 
Peer-reviewed
A Linear Programming Approximation of Distributionally Robust Chance-Constrained Dispatch With Wasserstein Distance
Authors:Anping ZhouMing YangMingqiang WangYuming Zhang
Summary:This paper proposes a data-driven distributionally robust chance constrained real-time dispatch (DRCC-RTD) considering renewable generation forecasting errors. The proposed DRCC-RTD model minimizes the expected quadratic cost function and guarantees that the two-sided chance constraints are satisfied for any distribution in the ambiguity set. The Wasserstein-distance-based ambiguity set, which is a family of distributions centered at an empirical distribution, is employed to hedge against data perturbations. By applying the reformulation linearization technique (RLT) to relax the quadratic constraints of the worst-case costs and constructing linear reformulations of the DRCCs, the proposed DRCC-RTD model is cast into a deterministic linear programming (LP) problem, which can be solved efficiently by off-the-shelf solvers. Case studies are carried out on a 6-bus system and the IEEE 118-bus system to validate the effectiveness and efficiency of the proposed approach.
Article, 2020
Publication:IEEE Transactions on Power Systems, 35, 202009, 3366
Publisher:2020
Cited by 42
 Related articles All 3 versions

A Novel Data-to-Text Generation Model with Transformer Planning and a Wasserstein Auto-Encoder
Authors:Xiaohong XuTing HeHuazhen Wang2020 IEEE International Conference on Services Computing (SCC)
Summary:Existing methods for data-to-text generation have difficulty producing diverse texts with low duplication rates. In this paper, we propose a novel data-to-text generation model with Transformer planning and a Wasserstein auto-encoder, which can convert constructed data to coherent and diverse text. This model possesses the following features: Transformer is first used to generate the data planning sequence of the target text content (each sequence is a subset of the input items that can be covered by a sentence), and then the Wasserstein Auto-Encoder (WAE) and a deep neural network are employed to establish the global latent variable space of the model. Second, text generation is performed through a hierarchical structure that takes the data planning sequence, global latent variables, and context of the generated sentences as conditions. Furthermore, to achieve diversity of text expression, a decoder is developed that combines the neural network with the WAE. The experimental results show that this model can achieve higher evaluation scores than the existing baseline models in terms of the diversity metrics of text generation and the duplication rate.
Chapter, 2020
Publication:2020 IEEE International Conference on Services Computing (SCC), 202011, 337
Publisher:2020



Spatial-aware Network using Wasserstein Distance for Unsupervised Domain Adaptation
Authors:Liu LongLuo BinFan Jiang2020 Chinese Automation Congress (CAC)
Summary:In a general scenario, the purpose of Unsupervised Domain Adaptation (UDA) is to classify unlabeled target domain data as much as possible, while the source domain data have a large number of labels. To address this situation, this paper introduces optimal transport theory into transfer learning, and proposes a deep adaptation network based on the second-order Wasserstein distance, which can measure the discrepancy between the two distributions. In addition, in order to retain the spatial structure information of features, the network is combined with a convolutional auto-encoder. Experiments show that our method achieves good results.
Chapter, 2020
Publication:2020 Chinese Automation Congress (CAC), 20201106, 4591
Publisher:2020

2020


Semantics-assisted Wasserstein Learning for Topic and Word Embeddings
Authors:Changchun LiXiming LiJihong OuyangYiming Wang2020 IEEE International Conference on Data Mining (ICDM)
Summary:Wasserstein distance, defined as the cost (measured by word embeddings) of the optimal transport plan for moving between two histograms, has been proven effective in tasks of natural language processing. In this paper, we extend Nonnegative Matrix Factorization (NMF) to a novel Wasserstein topic model, namely Semantics-Assisted Wasserstein Learning (SAWL), with simultaneous learning of topics and word embeddings. In SAWL, we formulate an NMF-like unified objective that integrates the regularized Wasserstein distance loss with a factorization of word context information. Therefore, SAWL can refine the word embeddings for capturing corpus-specific semantics, enabling topics and word embeddings to boost each other. We analyze SAWL, and provide its dimensionality-dependent generalization bounds of reconstruction errors. Experimental results indicate that SAWL outperforms the state-of-the-art baseline models.
Chapter, 2020
Publication:2020 IEEE International Conference on Data Mining (ICDM), 202011, 292
Publisher:2020
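The regularized Wasserstein distance that topic models of this kind rely on is commonly evaluated with Sinkhorn iterations over a word-embedding cost matrix. The sketch below is a standard NumPy implementation of that generic computation, not the SAWL objective itself; the histograms a, b (strictly positive, summing to one) and the cost matrix C are placeholders.

import numpy as np

def sinkhorn_cost(a, b, C, reg=0.1, n_iter=200):
    # Entropic-regularized optimal transport between histograms a (length n)
    # and b (length m), with ground cost C of shape (n, m), e.g. distances
    # between word embeddings.
    K = np.exp(-C / reg)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]   # approximate transport plan
    return float(np.sum(P * C))       # transport cost under that plan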

Convergence rates of the blocked Gibbs sampler with random scan in the Wasserstein metric
Authors:Neng-Yi Gou2020 IEEE 32nd International Conference on Tools with Artificial Intelligence (ICTAI)
Summary:In this paper, we propose a feature selection method that characterizes the difference between two kinds of probability distributions. The key idea is to view the feature selection problem as a sparsest k-subgraph problem that considers the Wasserstein distance between the studied two probability distributions. Our method does not presume any specific parametric models on the data distribution and is non-parametric. It outperforms existing Kullback-Leibler divergence based approaches, since we do not require the two distributions to overlap. This relaxation makes our method work in many problems in which Kullback-Leibler divergence based methods fail. We also design a fast calculation algorithm using dynamic programming. Our experimental results show that our method outperforms the current method in both computation accuracy and speed.
Chapter, 2020
Publication:2020 IEEE 32nd International Conference on Tools with Artificial Intelligence (ICTAI), 202011, 982
Publisher:2020
Peer-reviewed
Transport and Interface: An Uncertainty Principle for the Wasserstein Distance
Authors:Amir SagivStefan Steinerberger
Summary:Let $f: (0,1)^d \rightarrow \mathbb{R}$ be a continuous function with zero mean and interpret $f_{+} = \max(f, 0)$ and $f_{-} = -\min(f, 0)$ as the densities of two measures. We prove that if the cost of transport from $f_{+}$ to $f_{-}$ is small, in terms of the Wasserstein distance $W_1 (f_+ , f_-)$, then the Hausdorff measure of the nodal set $\left\{x \in (0,1)^d: f(x) = 0 \right\}$ has to be large (``if it is always easy to buy milk, there must be many supermarkets''). More precisely, we show that the product of the $(d-1)$-dimensional volume of the zero set and the Wasserstein transport cost can be bounded from below in terms of the $L^p$ norms of $f$. We apply this ``uncertainty principle'' to the metric Sturm--Liouville theory in higher dimensions to show that a linear combination of eigenfunctions of an elliptic operator cannot have an arbitrarily small zero set.

 
Peer-reviewed
Drug–drug interaction prediction with Wasserstein Adversarial Autoencoder-based knowledge graph embeddings
Authors:Yuanfei DaiChenhao GuoWenzhong GuoCarsten Eickhoff
Summary:Abstract: An interaction between pharmacological agents can trigger unexpected adverse events. Capturing richer and more comprehensive information about drug–drug interactions (DDIs) is one of the key tasks in public health and drug development. Recently, several knowledge graph (KG) embedding approaches have received increasing attention in the DDI domain due to their capability of projecting drugs and interactions into a low-dimensional feature space for predicting links and classifying triplets. However, existing methods only apply a uniformly random mode to construct negative samples. As a consequence, these samples are often too simplistic to train an effective model. In this paper, we propose a new KG embedding framework by introducing adversarial autoencoders (AAEs) based on Wasserstein distances and Gumbel-Softmax relaxation for DDI tasks. In our framework, the autoencoder is employed to generate high-quality negative samples and the hidden vector of the autoencoder is regarded as a plausible drug candidate. Afterwards, the discriminator learns the embeddings of drugs and interactions based on both positive and negative triplets. Meanwhile, in order to solve vanishing gradient problems on the discrete representation—an inherent flaw in traditional generative models—we utilize the Gumbel-Softmax relaxation and the Wasserstein distance to train the embedding model steadily. We empirically evaluate our method on two tasks: link prediction and DDI classification. The experimental results show that our framework can attain significant improvements and noticeably outperform competitive baselines. Supplementary information: Supplementary data and code are available at https://github.com/dyf0631/AAE_FOR_KG
Article, 2020
Publication:Briefings in Bioinformatics, 22, 20201030
Publisher:2020

 
A Wasserstein-Type Distance in the Space of Gaussian Mixture Models
Authors:Julie DelonAgnès Desolneux
Summary:In this paper we introduce a Wasserstein-type distance on the set of Gaussian mixture models. This distance is defined by restricting the set of possible coupling measures in the optimal transport problem to Gaussian mixture models. We derive a very simple discrete formulation for this distance, which makes it suitable for high dimensional problems. We also study the corresponding multi-marginal and barycenter formulations. We show some properties of this Wasserstein-type distance, and we illustrate its practical use with some examples in image processing.
Downloadable Article
Publication:SIAM Journal on Imaging Sciences, 13, 2020, 936
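The "very simple discrete formulation" mentioned in this entry reduces, in practice, to a small optimal transport problem between mixture weights with the Gaussian 2-Wasserstein cost. The sketch below illustrates that idea with NumPy/SciPy only; it is not the authors' code, and the mixtures are assumed to be given as weights, means and symmetric positive semi-definite covariances.

import numpy as np
from scipy import linalg, optimize

def gaussian_w2_sq(m1, S1, m2, S2):
    # Closed-form squared 2-Wasserstein distance between two Gaussians.
    rS2 = linalg.sqrtm(S2)
    cross = linalg.sqrtm(rS2 @ S1 @ rS2)
    return float(np.sum((m1 - m2) ** 2) + np.trace(S1 + S2 - 2.0 * cross.real))

def mixture_distance_sq(w1, means1, covs1, w2, means2, covs2):
    # Discrete OT between the mixture weights, with Gaussian W2^2 as ground cost.
    K1, K2 = len(w1), len(w2)
    C = np.array([[gaussian_w2_sq(means1[i], covs1[i], means2[j], covs2[j])
                   for j in range(K2)] for i in range(K1)])
    # Linear program: minimize <C, P> subject to P 1 = w1, P^T 1 = w2, P >= 0.
    A_eq = np.vstack([np.kron(np.eye(K1), np.ones((1, K2))),
                      np.kron(np.ones((1, K1)), np.eye(K2))])
    b_eq = np.concatenate([w1, w2])
    res = optimize.linprog(C.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
    return float(res.fun)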

Learning Wasserstein Isometric Embedding for Point Clouds

Authors:Keisuke KawanoSatoshi KoideTakuro Kutsuna2020 International Conference on 3D Vision (3DV)
Summary:The Wasserstein distance has been employed for determining the distance between point clouds, which have variable numbers of points and invariance of point order. However, the high computational cost associated with the Wasserstein distance hinders its practical applications for large-scale datasets. We propose a new embedding method for point clouds, which aims to embed point clouds into a Euclidean space, isometric to the Wasserstein space defined on the point clouds. In numerical experiments, we demonstrate that the point clouds decoded from the Euclidean averages and the interpolations in the embedding space accurately mimic the Wasserstein barycenters and interpolations of the point clouds. Furthermore, we show that the embedding vectors can be utilized as inputs for machine learning models (e.g., principal component analysis and neural networks).

Chapter, 2020
Publication:2020 International Conference on 3D Vision (3DV), 202011, 473
Publisher:2020

<——2020——–2020—––3800—



Wasserstein Distributionally Robust Look-Ahead Economic Dispatch
Authors:Poolla, Bala Kameshwar (Creator), Hota, Ashish R. (Creator), Bolognani, Saverio (Creator), Callaway, Duncan S. (Creator), Cherukuri, Ashish (Creator)
Summary:We consider the problem of look-ahead economic dispatch (LAED) with uncertain renewable energy generation. The goal of this problem is to minimize the cost of conventional energy generation subject to uncertain operational constraints. The risk of violating these constraints must be below a given threshold for a family of probability distributions with characteristics similar to observed past data or predictions. We present two data-driven approaches based on two novel mathematical reformulations of this distributionally robust decision problem. The first one is a tractable convex program in which the uncertain constraints are defined via the distributionally robust conditional-value-at-risk. The second one is a scalable robust optimization program that yields an approximate distributionally robust chance-constrained LAED. Numerical experiments on the IEEE 39-bus system with real solar production data and forecasts illustrate the effectiveness of these approaches. We discuss how system operators should tune these techniques in order to seek the desired robustness-performance trade-off and we compare their computational scalability.
Downloadable Archival Material, 2020-03-10
Undefined
Publisher:2020-03-10
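The ambiguity sets used in entries like this one are Wasserstein balls around the empirical distribution; schematically, a Wasserstein distributionally robust program takes the form (generic form, not the paper's exact model):

\[ \min_{x \in X} \; \sup_{Q \,:\, W(Q, \hat{P}_N) \le \varepsilon} \; \mathbb{E}_{\xi \sim Q}\big[c(x, \xi)\big], \]

where $\hat{P}_N$ is the empirical distribution of the $N$ observed scenarios, $\varepsilon$ is the radius of the ambiguity set, and $c(x,\xi)$ is the operating cost (or a constraint function in the chance-constrained variants).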


Automatic coding of students' writing via Contrastive Representation Learning in the Wasserstein space
Authors:Jiang, Ruijie (Creator), Gouvea, Julia (Creator), Hammer, David (Creator), Miller, Eric (Creator), Aeron, Shuchin (Creator)
Summary:Qualitative analysis of verbal data is of central importance in the learning sciences. It is labor-intensive and time-consuming, however, which limits the amount of data researchers can include in studies. This work is a step towards building a statistical machine learning (ML) method for achieving an automated support for qualitative analyses of students' writing, here specifically scoring laboratory reports in introductory biology for sophistication of argumentation and reasoning. We start with a set of lab reports from an undergraduate biology course, scored by a four-level scheme that considers the complexity of argument structure, the scope of evidence, and the care and nuance of conclusions. Using this set of labeled data, we show that a popular natural language processing pipeline, namely vector representation of words, a.k.a. word embeddings, followed by a Long Short Term Memory (LSTM) model for capturing language generation as a state-space model, is able to quantitatively capture the scoring, with a high Quadratic Weighted Kappa (QWK) prediction score, when trained via a novel contrastive learning set-up. We show that the ML algorithm approached the inter-rater reliability of human analysis. Ultimately, we conclude that machine learning (ML) for natural language processing (NLP) holds promise for assisting learning sciences researchers in conducting qualitative studies at much larger scales than is currently possible.
Downloadable Archival Material, 2020-11-26
Undefined
Publisher:2020-11-26

Social-WaGDAT: Interaction-aware Trajectory Prediction via Wasserstein Graph Double-Attention Network
Authors:Li, Jiachen (Creator), Ma, Hengbo (Creator), Zhang, Zhihao (Creator), Tomizuka, Masayoshi (Creator)
Summary:Effective understanding of the environment and accurate trajectory prediction of surrounding dynamic obstacles are indispensable for intelligent mobile systems (like autonomous vehicles and social robots) to achieve safe and high-quality planning when they navigate in highly interactive and crowded scenarios. Due to the existence of frequent interactions and uncertainty in the scene evolution, it is desired for the prediction system to enable relational reasoning on different entities and provide a distribution of future trajectories for each agent. In this paper, we propose a generic generative neural system (called Social-WaGDAT) for multi-agent trajectory prediction, which makes a step forward to explicit interaction modeling by incorporating relational inductive biases with a dynamic graph representation and leverages both trajectory and scene context information. We also employ an efficient kinematic constraint layer applied to vehicle trajectory prediction which not only ensures physical feasibility but also enhances model performance. The proposed system is evaluated on three public benchmark datasets for trajectory prediction, where the agents cover pedestrians, cyclists and on-road vehicles. The experimental results demonstrate that our model achieves better performance than various baseline approaches in terms of prediction accuracy.
Downloadable Archival Material, 2020-02-14
Undefined
Publisher:2020-02-14


Generating Natural Adversarial Hyperspectral examples with a modified Wasserstein GAN
Authors:Burnel, Jean-Christophe (Creator), Fatras, Kilian (Creator), Courty, Nicolas (Creator)
Summary:Adversarial examples are a hot topic due to their ability to fool a classifier's prediction. There are two strategies to create such examples: one uses the attacked classifier's gradients, while the other only requires access to the classifier's prediction. This is particularly appealing when the classifier is not fully known (black box model). In this paper, we present a new method which is able to generate natural adversarial examples from the true data following the second paradigm. Based on Generative Adversarial Networks (GANs) [5], it reweights the true data empirical distribution to encourage the classifier to generate adversarial examples. We provide a proof of concept of our method by generating adversarial hyperspectral signatures on a remote sensing dataset.
Downloadable Archival Material, 2020-01-27
Undefined
Publisher:2020-01-27

Peer-reviewed
Adapted Wasserstein distances and stability in mathematical finance
Authors:Julio Backhoff-VeraguasDaniel BartlMathias BeiglböckManu Eder
Summary:Abstract: Assume that an agent models a financial asset through a measure $Q$ with the goal to price/hedge some derivative or optimise some expected utility. Even if the model $Q$ is chosen in the most skilful and sophisticated way, the agent is left with the possibility that $Q$ does not provide an exact description of reality. This leads us to the following question: will the hedge still be somewhat meaningful for models in the proximity of $Q$?
Article, 2020
Publication:Finance and Stochastics, 24, 20200604, 601
Publisher:2020
Peer-reviewed
Wasserstein Index Generation Model: Automatic generation of time-series index with application to Economic Policy Uncertainty
Author:Xie F.
Article, 2020
Publication:Economics Letters, 186, 2020 01 01

Data for: Wasserstein Index Generation Model: Automatic Generation of Time-series Index with Application to Economic Policy Uncertainty 

By: Xie, Fangzhou 

Mendeley Data

DOI: ‏ http://dx.doi.org.ezaccess.libraries.psu.edu/10.17632/P35TPDMG4D 

Document Type: Data set
Publisher:2020

2020


Knowledge-aware Attentive Wasserstein Adversarial Dialogue Response Generation
Authors:Yingying Zhang (Author), Quan Fang (Author), Shengsheng Qian (Author), Changsheng Xu (Author)
Summary:Natural language generation has become a fundamental task in dialogue systems. RNN-based natural response generation methods encode the dialogue context and decode it into a response. However, they tend to generate dull and simple responses. In this article, we propose a novel framework, called KAWA-DRG (Knowledge-aware Attentive Wasserstein Adversarial Dialogue Response Generation) to model conversation-specific external knowledge and the importance variances of dialogue context in a unified adversarial encoder-decoder learning framework. In KAWA-DRG, a co-attention mechanism attends to important parts within and among context utterances with word-utterance-level attention. Prior knowledge is integrated into the conditional Wasserstein auto-encoder for learning the latent variable space. The posterior and prior distribution of latent variables are generated and trained through adversarial learning. We evaluate our model on Switchboard, DailyDialog, In-Car Assistant, and Ubuntu Dialogue Corpus. Experimental results show that KAWA-DRG outperforms the existing methods.
Article, 2020
Publication:ACM Transactions on Intelligent Systems and Technology (TIST), 11, 20200528, 1
Publisher:2020


GraphWGAN: Graph Representation Learning with Wasserstein Generative Adversarial Networks

Authors:Rong YanHuawei ShenCao QiKeting CenLi Wang2020 IEEE International Conference on Big Data and Smart Computing (BigComp)
Summary:Graph representation learning aims to represent vertices as low-dimensional and real-valued vectors to facilitate subsequent downstream tasks, e.g., node classification and link prediction. Recently, some novel graph representation learning frameworks, which try to approximate the underlying true connectivity distribution of the vertices, show their superiority. These methods characterize the distance between the true connectivity distribution and the generated connectivity distribution by Kullback-Leibler or Jensen-Shannon divergence. However, since these divergences are not continuous with respect to the generator's parameters, such methods easily lead to unstable training and poor convergence. In contrast, Wasserstein distance is continuous and differentiable almost everywhere, which means it can produce more reliable gradients, allowing the training to be more stable and convergent. In this paper, we utilize Wasserstein distance to characterize the distance between the underlying true connectivity distribution and the generated distribution in graph representation learning. Experimental results show that the accuracy of our method exceeds existing baselines in tasks of both node classification and link prediction.
Chapter, 2020
Publication:2020 IEEE International Conference on Big Data and Smart Computing (BigComp), 202002, 315
Publisher:2020



Study of the aggregation procedure: patch fusion and generalized Wasserstein barycenters
Authors:Alexandre Saint-DizierJulie DelonCharles BouveyronErwan Le PennecNicolas CourtyNicolas PapadakisAgnès DesolneuxArthur LeclaireUniversité Paris Cité.École doctorale Sciences mathématiques de Paris centre (Paris / 2000-....).
Summary:This thesis deals with a particular class of image processing algorithms: patch-based methods. These methods require a step called aggregation, which consists of re-forming an image from a set of patches and from statistical models on those patches. The aggregation step is formalized here as an operation that fuses distributions living on different but non-disjoint spaces. We first propose a fusion method based on probabilistic considerations, directly applicable to the aggregation problem. It turns out that this operation can also be formulated, in a more general setting, as a generalization of a barycenter problem between distributions, which leads us, in a second step, to study it from the viewpoint of optimal transport.
Computer Program, 2020
English
Publisher:2020

 

Covariance Steering of Discrete-Time Stochastic Linear Systems Based on Wasserstein Distance Terminal Cost
Authors:Balci I.M.Bakolas E.
Article, 2020
Publication:IEEE Control Systems Letters, 2020
Publisher:2020
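For Gaussian state distributions, which is the natural setting of covariance steering, the squared 2-Wasserstein distance has the standard closed form quoted below for reference (the paper presumably uses this expression or a close variant as its terminal cost):

\[ W_2^2\big(\mathcal{N}(m_1,\Sigma_1), \mathcal{N}(m_2,\Sigma_2)\big) = \|m_1 - m_2\|^2 + \operatorname{Tr}\!\Big(\Sigma_1 + \Sigma_2 - 2\,\big(\Sigma_2^{1/2}\Sigma_1\Sigma_2^{1/2}\big)^{1/2}\Big). \]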



Finite-Horizon Control of Nonlinear Discrete-Time Systems with Terminal Cost of Wasserstein Distance
Authors:Kenta Hoshino2020 59th IEEE Conference on Decision and Control (CDC)
Summary:This study explores a finite-horizon optimal control problem of nonlinear discrete-time systems for steering a probability distribution of initial states as close as possible to a desired probability distribution of terminal states. The problem is formulated as an optimal control problem of the Mayer form, with the terminal cost given by the Wasserstein distance, which provides a metric on probability distributions. For this optimal control problem, this paper provides a necessary condition of the optimality as a variation of the minimum principle of standard optimal control problems. The motivation for exploring this optimal control problem was to provide a control-theoretic viewpoint of a machine-learning algorithm called "the normalizing flow". The obtained necessary condition is employed for developing a simple variation of the normalizing flow approach, and a gradient descent-type numerical algorithm is also provided.
Chapter, 2020
Publication:2020 59th IEEE Conference on Decision and Control (CDC), 20201214, 4268
Publisher:2020

<——2020——–2020—––3810—




Peer-reviewed
Author:Lorenzo Dello Schiavo
A Rademacher-type theorem on L2-Wasserstein spaces over closed Riemannian manifolds
Summary:Let
Article
Publication:Journal of Functional Analysis, 278, 2020-04-01

Peer-reviewed
Necessary Condition for Rectifiability Involving Wasserstein Distance W2
Author:Damian Dąbrowski
Summary:Abstract: A Radon measure $\mu$ is $n$-rectifiable if it is absolutely continuous with respect to $n$-dimensional Hausdorff measure and $\mu$-almost all of ${\operatorname{supp}}\mu$ can be covered by Lipschitz images of $\mathbb{R}^n$. In this paper, we give a necessary condition for rectifiability in terms of the so-called $\alpha_2$ numbers — coefficients quantifying flatness using Wasserstein distance $W_2$. In a recent article, we showed that the same condition is also sufficient for rectifiability, and so we get a new characterization of rectifiable measures.
Article, 2020
Publication:International Mathematics Research Notices, 2020, 20200525, 8936
Publisher:2020

  

2020 see 2019
Barycenters of Natural Images - Constrained Wasserstein Barycenters for Image Morphing
Authors:Aviad AberdamDror Simon2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)

Summary:Image interpolation, or image morphing, refers to a visual transition between two (or more) input images. For such a transition to look visually appealing, its desirable properties are (i) to be smooth; (ii) to apply the minimal required change in the image; and (iii) to seem "real", avoiding unnatural artifacts in each image in the transition. To obtain a smooth and straightforward transition, one may adopt the well-known Wasserstein Barycenter Problem (WBP). While this approach guarantees minimal changes under the Wasserstein metric, the resulting images might seem unnatural. In this work, we propose a novel approach for image morphing that possesses all three desired properties. To this end, we define a constrained variant of the WBP that enforces the intermediate images to satisfy an image prior. We describe an algorithm that solves this problem and demonstrate it using the sparse prior and generative adversarial networks
Chapter, 2020
Publication:2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 202006, 7907
Publisher:2020
 


A Class of Optimal Transport Regularized Formulations with Applications to Wasserstein GANs
Authors:Saied MahdianJose H. BlanchetPeter W. Glynn2020 Winter Simulation Conference (WSC)
Summary:Optimal transport costs (e.g. Wasserstein distances) are used for fitting high-dimensional distributions. For example, popular artificial intelligence algorithms such as Wasserstein Generative Adversarial Networks (WGANs) can be interpreted as fitting a black-box simulator of structured data with certain features (e.g. images) using the Wasserstein distance. We propose a regularization of optimal transport costs and study its computational and duality properties. We obtain running time improvements for fitting WGANs with no deterioration in testing performance, relative to current benchmarks. We also derive finite sample bounds for the empirical Wasserstein distance from our regularization.
Chapter, 2020
Publication:2020 Winter Simulation Conference (WSC), 20201214, 433
Publisher:2020
 


Gromov-Wasserstein Averaging in a Riemannian Framework

Authors:Samir ChowdhuryTom Needham2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)
Summary:We introduce a theoretical framework for performing statistical tasks—including, but not limited to, averaging and principal component analysis—on the space of (possibly asymmetric) matrices with arbitrary entries and sizes. This is carried out under the lens of the Gromov-Wasserstein (GW) distance, and our methods translate the Riemannian framework of GW distances developed by Sturm into practical, implementable tools for network data analysis. Our methods are illustrated on datasets of letter graphs, asymmetric stochastic blockmodel networks, and planar shapes viewed as metric spaces. On the theoretical front, we supplement the work of Sturm by producing additional results on the tangent structure of this "space of spaces", as well as on the gradient flow of the Fréchet functional on this space.
Chapter, 2020
Publication:2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 202006, 3676
Publisher:2020
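For context, the Gromov-Wasserstein distance averaged in this framework compares two metric-measure spaces without requiring them to live in a common ambient space; in its standard form (up to a conventional constant factor, and not specific to this paper),

\[ GW_p\big((X,d_X,\mu_X),(Y,d_Y,\mu_Y)\big) = \inf_{\pi \in \Pi(\mu_X,\mu_Y)} \Big( \iint \big| d_X(x,x') - d_Y(y,y') \big|^p \, d\pi(x,y)\, d\pi(x',y') \Big)^{1/p}, \]

where $\Pi(\mu_X,\mu_Y)$ is the set of couplings of the two reference measures.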

2020
 
Minimax Control of Ambiguous Linear Stochastic Systems Using the Wasserstein Metric
Authors:Kihyun KimInsoon Yang2020 59th IEEE Conference on Decision and Control (CDC)
Summary:In this paper, we propose a minimax linear-quadratic control method to address the issue of inaccurate distribution information in practical stochastic systems. To construct a control policy that is robust against errors in an empirical distribution of uncertainty, our method adopts an adversary, which selects the worst-case distribution. The opponent receives a penalty proportional to the amount (measured in the Wasserstein metric) of deviation from the empirical distribution. In the finite-horizon case, using a Riccati equation, we derive a closed-form expression of the unique optimal policy and the opponent's policy that generates the worst-case distribution. This result is then extended to the infinite-horizon setting by identifying conditions under which the Riccati recursion converges to the unique positive semi-definite solution to an associated algebraic Riccati equation (ARE). The resulting optimal policy is shown to stabilize the expected value of the system state under the worst-case distribution. We also discuss that our method can be interpreted as a distributional generalization of the H∞-method.
Chapter, 2020
Publication:2020 59th IEEE Conference on Decision and Control (CDC), 20201214, 1777
Publisher:2020
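Schematically, the penalty formulation described in this entry is a minimax problem of the following form (a generic statement of the idea, not the paper's exact formulation):

\[ \min_{\pi} \; \max_{Q} \; \Big\{ \mathbb{E}_{w \sim Q}\big[J(\pi, w)\big] \;-\; \lambda \, W_2^2\big(Q, \hat{P}_N\big) \Big\}, \]

where $\pi$ is the control policy, $J$ the linear-quadratic cost driven by disturbances $w$, $\hat{P}_N$ the empirical disturbance distribution, and $\lambda$ the weight of the Wasserstein penalty paid by the adversary.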


Peer-reviewed
Characterization of probability distribution convergence in Wasserstein distance by Lp-quantization error function
Authors:Liu Y.Pages G.
Article, 2020
Publication:Bernoulli, 26, 2020, 1171
Publisher:2020

Data Augmentation Method for Fault Diagnosis of Mechanical Equipment Based on Improved Wasserstein GAN
Authors:Wenbiao LiuLixiang DuanYu TangJialing Yang2020 11th International Conference on Prognostics and System Health Management (PHM-2020 Jinan)
Summary:Most of the time the mechanical equipment is in normal operation state, which results in high imbalance between fault data and normal data. In addition, traditional signal processing methods rely heavily on expert experience, making it difficult for classification or prediction algorithms to obtain accurate results. In view of the above problem, this paper proposed a method to augment failure data for mechanical equipment diagnosis based on Wasserstein generative adversarial networks with gradient penalty (WGAN-GP). First, the multi-dimensional sensor data are converted into two-dimensional gray images in order to avoid the interference of tedious parameters preset on the model and the dependence on the professional knowledge of signal preprocessing. On this foundation, the gray images of the minority sample are used as the input of WGAN-GP to carry out adversarial training until the network reaches the Nash equilibrium. Then the generated images are added to the original failure samples, reducing the imbalance of the original data samples. Finally, by calculating the structural similarity index between the generated images and the original images, the difficulty of quantitatively evaluating the data generated by WGAN-GP is resolved. Taking the accelerated bearing failure dataset as an example, the classification prediction effects of different classifiers are compared. The results of multiple experiments show that the proposed method can more effectively improve the prediction accuracy in the case of sparse fault samples.
Chapter, 2020
Publication:2020 11th International Conference on Prognostics and System Health Management (PHM-2020 Jinan), 202010, 103
Publisher:2020

Domain-attention Conditional Wasserstein Distance for Multi-source Domain Adaptation
Authors:Hanrui Wu (Author), Yuguang Yan (Author), Michael K. Ng (Author), Qingyao Wu (Author)
Summary:Multi-source domain adaptation has received considerable attention due to its effectiveness of leveraging the knowledge from multiple related sources with different distributions to enhance the learning performance. One of the fundamental challenges in multi-source domain adaptation is how to determine the amount of knowledge transferred from each source domain to the target domain. To address this issue, we propose a new algorithm, called Domain-attention Conditional Wasserstein Distance (DCWD), to learn transferred weights for evaluating the relatedness across the source and target domains. In DCWD, we design a new conditional Wasserstein distance objective function by taking the label information into consideration to measure the distance between a given source domain and the target domain. We also develop an attention scheme to compute the transferred weights of different source domains based on their conditional Wasserstein distances to the target domain. After that, the transferred weights can be used to reweight the source data to determine their importance in knowledge transfer. We conduct comprehensive experiments on several real-world data sets, and the results demonstrate the effectiveness and efficiency of the proposed method.
Article, 2020
Publication:ACM Transactions on Intelligent Systems and Technology (TIST), 11, 20200531, 1
Publisher:2020


An Improvement based on Wasserstein GAN for Alleviating Mode Collapsing
Authors:Yingying ChenXinwen Hou2020 International Joint Conference on Neural Networks (IJCNN)
Summary:In the past few years, Generative Adversarial Networks as a deep generative model has received more and more attention. Mode collapsing is one of the challenges in the study of Generative Adversarial Networks. In order to solve this problem, we deduce a new algorithm on the basis of Wasserstein GAN. We add a generated distribution entropy term to the objective function of generator net and maximize the entropy to increase the diversity of fake images. And then Stein Variational Gradient Descent algorithm is used for optimization. We named our method SW-GAN. In order to substantiate our theoretical analysis, we perform experiments on MNIST and CIFAR-10, and the results demonstrate superiority of our method.
Chapter, 2020
Publication:2020 International Joint Conference on Neural Networks (IJCNN), 202007, 1
Publisher:2020

<——2020——–2020—––3820—




 
Joint Transfer of Model Knowledge and Fairness Over Domains Using Wasserstein Distance
Authors:Taeho YoonJaewook LeeWoojin Lee
Summary:Owing to the increasing use of machine learning in our daily lives, the problem of fairness has recently become an important topic in machine learning societies. Recent studies regarding fairness in machine learning have been conducted to attempt to ensure statistical independence between individual model predictions and designated sensitive attributes. However, in reality, cases exist in which the sensitive variables of data used for learning models differ from the data upon which the model is applied. In this paper, we investigate a methodology for developing a fair classification model for data with limited or no labels, by transferring knowledge from another data domain where information is fully available. This is done by controlling the Wasserstein distances between relevant distributions. Subsequently, we obtain a fair model that could be successfully applied to two datasets with different sensitive attributes. We present theoretical results validating that our approach provably transfers both classification performance and fairness over domains. Experimental results show that our method does indeed promote fairness for the target domain, while retaining reasonable classification accuracy, and that it often outperforms comparative models in terms of joint fairness.
Article, 2020
Publication:IEEE Access, 8, 2020, 123783
Publisher:2020

Statistical data analysis in the Wasserstein space*
Author:Jérémie Bigot
Summary:This paper is concerned with statistical inference problems from a data set whose elements may be modeled as random probability measures such as multiple histograms or point clouds. We propose to review recent contributions in statistics on the use of Wasserstein distances and tools from optimal transport to analyse such data. In particular, we highlight the benefits of using the notions of barycenter and geodesic PCA in the Wasserstein space for the purpose of learning the principal modes of geometric variation in a dataset. In this setting, we discuss existing works and we present some research perspectives related to the emerging field of statistical optimal transport.
Article, 2020
Publication:ESAIM: Proceedings and Surveys, 68, 2020, 1
Publisher:2020


 Solutions of a Class of Degenerate Kinetic Equations Using Steepest Descent in Wasserstein Space
Authors:Aboubacar MarcosAmbroise Soglo
Summary:We use the steepest descent method in an Orlicz-Wasserstein space to study the existence of solutions for a very broad class of kinetic equations, which include the Boltzmann equation, the Vlasov-Poisson equation, the porous medium equation, and the parabolic p-Laplacian equation, among others. We combine a splitting technique along with an iterative variational scheme to build a discrete solution which converges to a weak solution of our problem.
Article, 2020
Publication:Journal of Mathematics, 2020, 20200609
Publisher:2020
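The "steepest descent in Wasserstein space" referred to above is classically realized by the minimizing-movement (JKO) scheme; written in its generic quadratic-Wasserstein form (the paper works in an Orlicz-Wasserstein setting, so its scheme is a variant of this),

\[ \rho_{k+1} \in \arg\min_{\rho} \Big\{ \frac{1}{2\tau} W_2^2(\rho, \rho_k) + \mathcal{F}(\rho) \Big\}, \]

where $\tau > 0$ is the time step and $\mathcal{F}$ is the driving energy functional; letting $\tau \to 0$ recovers the gradient flow of $\mathcal{F}$.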


Peer-reviewed
On the Wasserstein distance between classical sequences and the Lebesgue measure
Authors:Louis BrownStefan Steinerberger
Summary:We discuss the classical problem of measuring the regularity of distribution of sets of $N$ points in $\mathbb{T}^d$. A recent line of investigation is to study the cost ($=$ mass $\times$ distance) necessary to move Dirac measures placed on these points to the uniform distribution. We show that Kronecker sequences attain the optimal transport distance in $d \geq 2$ dimensions. This shows that for differentiable $f: \mathbb{T}^d \to \mathbb{R}$ and badly approximable vectors $\alpha \in \mathbb{R}^d$, we have
\[ \left| \int_{\mathbb{T}^d} f(x)\, dx - \frac{1}{N} \sum_{k=1}^{N} f(k\alpha) \right| \lesssim \frac{\|\nabla f\|_{L^{\infty}}^{(d-1)/d}\, \|\nabla f\|_{L^{2}}^{1/d}}{N^{1/d}}. \]
We note that the result is uniform in $N$ (it holds for a sequence instead of a set). Simultaneously, it refines the classical integration error for Lipschitz functions, $\lesssim \|\nabla f\|_{L^{\infty}} N^{-1/d}$. We obtain a similar improvement for numerical integration with respect to the regular grid. The main ingredient is an estimate involving Fourier coefficients of a measure; this allows for existing estimates to be conveniently `recycled'. We present several open problems.
Downloadable Article, 2020
Publication:Transactions of the American Mathematical Society, 373, December 1, 2020, 8943
Publisher:2020


2020


VinAI Research Seminar Series - Quantile Matrix Factorization

www.youtube.com › watch

 in Paris, in October 2018. ... transport and Wasserstein distances in the machine learning community.

YouTube · VinAI Research · 

Oct 26, 2020


2020

Research - Abacus.AI

abacus.ai › research

... of GANs including Wasserstein GANs and MMD GANs address some of these issues. ... Google Brain's scientists also explored attribution of predictions to ...

Abacus.AI - Effortlessly Embed Cutting Edge AI In Your ... · 

Jul 14, 2020


2020

Archived News | UvA-Bosch DELTA Lab - Informatics Institute

ivi.fnwi.uva.nl › uvaboschdeltalab › archived-news

He spent five years at Google Brain, where he focused on neural network ... Our proposed approach, pairing a Wasserstein GAN with a classification loss, ...

Informatics Institute · SPUI 25 · 

Oct 6, 2020


2020

David Berthelot (dberth@sigmoid.social) - Twitter

twitter.com › d_berthelot_ml0:06

Machine Learner, ex-Google Brain, now in Apple. ... Connections between Support Vector Machines, Wasserstein distance and gradient-penalty GANs New work by ...

Twitter · 


A convergent Lagrangian discretization for p-Wasserstein and flux-limited diffusion equations
Authors:Sollner B., Junge O.
Article, 2020
Publication:Communications on Pure and Applied Analysis, 19, 2020 06 01, 4227
Publisher:2020

<——2020——–2020—––3830—




Peer-reviewed
Hyperbolic Wasserstein Distance for Shape Indexing
Authors:Jie Shi, Yalin Wang
Article, 2020
Publication:IEEE transactions on pattern analysis and machine intelligence, 42, 2020, 1362
Publisher:2020

 
Peer-reviewed
A Riemannian submersion-based approach to the Wasserstein barycenter of positive definite matrices
Authors:Mingming Li, Huafei Sun, Didong Li
Article, 2020
Publication:Mathematical Methods in the Applied Sciences, 43, 15 May 2020, 4927
Publisher:2020

Peer-reviewed
De Novo Protein Design for Novel Folds Using Guided Conditional Wasserstein Generative Adversarial Networks
Authors:Karimi M., Zhu S., Cao Y., Shen Y.
Article, 2020
Publication:Journal of Chemical Information and Modeling, 60, 2020 12 28, 5667
Publisher:2020



Approximate Bayesian computation with the sliced-Wasserstein distance
Authors:Nadjahi K., Badeau R., Simsekli U., De Bortoli V., Durmus A. 2020 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2020
Article, 2020
Publication:ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings, 2020-May, 2020 05 01, 5470
Publisher:2020

Wasserstein loss based deep object detection

Authors:Han Y., Luo Z., Liu X., Han X., Liu R., Sheng Z., Ren Y. 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2020
Article, 2020
Publication:IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, 2020-June, 2020 06 01, 4299
Publisher:2020

2020

Multimarginal Wasserstein Barycenter for Stain Normalization and Augmentation
Authors:Nadeem S., Hollmann T., Tannenbaum A. 23rd International Conference on Medical Image Computing and Computer-Assisted Intervention, MICCAI 2020
Article, 2020
Publication:Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 12265 LNCS, 2020, 362
Publisher:2020

Semantics-assisted Wasserstein learning for topic and word embeddings
Authors:Li C., Li X., Ouyang J., Wang Y. 20th IEEE International Conference on Data Mining, ICDM 2020
Article, 2020
Publication:Proceedings - IEEE International Conference on Data Mining, ICDM, 2020-November, 2020 11 01, 292
Publisher:2020

Peer-reviewed
The quadratic Wasserstein metric for inverse data matching
Authors:Engquist B., Ren K., Yang Y.

Article, 2020
Publication:Inverse Problems, 36, 2020 05 01
Publisher:2020



Peer-reviewed
Sample generation based on a supervised Wasserstein Generative Adversarial Network for high-resolution remote-sensing scene classification
Authors:Han W., Wang L., Feng R., Gao L., Chen X., Deng Z., Chen J.
Article, 2020
Publication:Information Sciences, 539, 2020 10 01, 177
Publisher:2020

Statistical Learning in Wasserstein Space
Authors:Amirhossein Karimi, Luigia Ripani, Tryphon T. Georgiou
Article, 2020
Publication:IEEE control systems letters, 5, 2020, 899

<——2020——–2020—––3840—

 

A Wasserstein Graph Kernel based on Substructure Isomorphism Problem of Shortest Paths

Authors:JIANMING HUANG, ZHONGXI FANG, HIROYUKI KASAI
Article
Publication:映像情報メディア学会技術報告 = ITE technical report., 44, 2020-11, 25


Peer-reviewed
Wasserstein GANs for MR Imaging: From Paired to Unpaired Training
Authors:Ke Lei, Morteza Mardani, John M. Pauly, Shreyas S. Vasanawala
Article, 2020
Publication:IEEE transactions on medical imaging, 40, 2020, 105
Publisher:2020



Wasserstein-based Projections with Applications to Inverse Problems
    Authors:Heaton, Howard (Creator), Fung, Samy Wu (Creator), Lin, Alex Tong (Creator), Osher, Stanley (Creator), Yin, Wotao (Creator)
Summary:Inverse problems consist of recovering a signal from a collection of noisy measurements. These are typically cast as optimization problems, with classic approaches using a data fidelity term and an analytic regularizer that stabilizes recovery. Recent Plug-and-Play (PnP) works propose replacing the operator for analytic regularization in optimization methods by a data-driven denoiser. These schemes obtain state of the art results, but at the cost of limited theoretical guarantees. To bridge this gap, we present a new algorithm that takes samples from the manifold of true data as input and outputs an approximation of the projection operator onto this manifold. Under standard assumptions, we prove this algorithm generates a learned operator, called Wasserstein-based projection (WP), that approximates the true projection with high probability. Thus, WPs can be inserted into optimization methods in the same manner as PnP, but now with theoretical guarantees. Provided numerical examples show WPs obtain state of the art results for unsupervised PnP signal recovery.
Downloadable Archival Material, 2020-08-05
Undefined
Publisher:2020-08-05
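
A schematic of how such a learned projection would be dropped into an iterative solver: the learned_projection below is a hypothetical placeholder for the trained Wasserstein-based operator (here simply the identity), and the data-fidelity term is a toy least-squares problem.

# Projected-gradient recovery with a plug-in projection operator (sketch).
import numpy as np

def learned_projection(x):
    return x  # placeholder: a trained WP would map x toward the data manifold

def recover(A, b, step=1e-2, iters=500):
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)              # gradient of 0.5 * ||A x - b||^2
        x = learned_projection(x - step * grad)
    return x

A = np.random.default_rng(2).normal(size=(30, 20))
x_true = np.zeros(20); x_true[:3] = 1.0
b = A @ x_true
print("recovery error:", np.linalg.norm(recover(A, b) - x_true))

The point of the paper is that, unlike generic plug-and-play denoisers, the learned operator provably approximates the true projection, so the resulting iteration inherits theoretical guarantees.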

Hierarchical Gaussian Processes with Wasserstein-2 Kernels
 

    Authors:Popescu, Sebastian (Creator), Sharp, David (Creator), Cole, James (Creator), Glocker, Ben (Creator)
Summary:Stacking Gaussian Processes severely diminishes the model's ability to detect outliers, which when combined with non-zero mean functions, further extrapolates low non-parametric variance to low training data density regions. We propose a hybrid kernel inspired from Varifold theory, operating in both Euclidean and Wasserstein space. We posit that directly taking into account the variance in the computation of Wasserstein-2 distances is of key importance towards maintaining outlier status throughout the hierarchy. We show improved performance on medium and large scale datasets and enhanced out-of-distribution detection on both toy and real data.
Downloadable Archival Material, 2020-10-28
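
The Wasserstein-2 distance between Gaussians, which such kernels are built on, has a closed form; the sketch below checks it numerically on toy covariances (my own example, not code from the paper). A caveat worth keeping in mind is that exponentiated-W2 kernels are not positive definite in general, which is part of why hybrid constructions like the one proposed here need care.

# Closed-form W2 between Gaussians:
#   W2^2(N(m1, S1), N(m2, S2)) = ||m1 - m2||^2 + tr(S1 + S2 - 2 (S2^{1/2} S1 S2^{1/2})^{1/2})
import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2_squared(m1, S1, m2, S2):
    root_S2 = sqrtm(S2)
    cross = np.real(sqrtm(root_S2 @ S1 @ root_S2))
    return float(np.sum((m1 - m2) ** 2) + np.trace(S1 + S2 - 2.0 * cross))

m1, S1 = np.zeros(2), np.eye(2)
m2, S2 = np.ones(2), np.diag([2.0, 0.5])
print("W2^2 =", gaussian_w2_squared(m1, S1, m2, S2))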



 2020

node2coords: Graph representation learning with Wasserstein barycenters

E SimouD ThanouP Frossard - IEEE Transactions on Signal …, 2020 - ieeexplore.ieee.org

… work in unsupervised learning of graph representations. In Section III we … Wasserstein

barycenter representation method, which is later incorporated in our graph representation learning …

Save Cite Cited by 6 Related articles All 5 versions

Graph Representation Learning with Wasserstein Barycenters

https://signalprocessingsociety.org › ieee-transactions-si...


In order to perform network analysis tasks, representations that capture the most relevant information in the graph structure are needed.


2020



  

 2020

Wasserstein Loss - Coursera

www.coursera.org › lecture › wasserstein-loss-vy3To


Week 3: Wasserstein GANs with Gradient Penalty. Learn advanced techniques to reduce instances of GAN failure due to imbalances between the ...

Coursera · DeepLearning.AI · 

Sep 29, 2020


 

2020

Lénaïc Chizat (@LenaicChizat) / Twitter

twitter.com › LenaicChizat

Dec 3, 2022 ... continuous from the Wasserstein space to C^k functions - Wasserstein gradient flows ... msri.org. Mathematical Sciences Research Institute.

Twitter · 

Jul 17, 2020



 2020

Mode Collapse - Coursera

www.coursera.org › lecture › mode-collapse-Terkm


Week 3: Wasserstein GANs with Gradient Penalty. Learn advanced techniques to reduce instances of GAN failure due to imbalances between the ...

Coursera · DeepLearning.AI · 

Sep 29, 2020


2020

Transition for alternating mode collapse of Wasserstein GAN

www.youtube.com › watch


The orange and blue represents the underlying distribution and the generated distribution respectively. The right hand side graph, ...

YouTube · zaytamas · 

Aug 25, 2020


2020


<——2020——–2020—––3850—


[PDF] aclanthology.org

WAE RN: Integrating Wasserstein Autoencoder and Relational Network for Text Sequence

X Zhang, X Liu, G Yang, F Li, W Liu - Chinese Computational Linguistics …, 2020 - Springer

… In this paper, we propose to integrate the relational network(RN) into a Wasserstein 

autoencoder (WAE). Specifically, WAE and RN are used to better keep the semantic structure and …

 Related articles All 6 versions


2020 book

[PDF] Risk Measures Estimation Under Wasserstein Barycenter

https://www.semanticscholar.org › paper


Aug 13, 2020 — Model Risk Measurement Under Wasserstein Distance ... This book provides the most comprehensive treatment of the theoretical concepts and ...


 
 
Peer-reviewed
A Rademacher-type theorem on L²-Wasserstein spaces over closed Riemannian manifolds
Author:Dello Schiavo L.
Article, 2020
Publication:Journal of Functional Analysis, 278, 2020 04 01
Publisher:2020


Hyperspectral Image Classification Approach Based on Wasserstein Generative Adversarial Networks
Authors:Naigeng Chen, Chenming Li. 2020 International Conference on Machine Learning and Cybernetics (ICMLC)
Summary:Hyperspectral image classification is an important research direction in the application of remote sensing technology. In the process of labeling different types of objects based on spectral information and geometric spatial characteristics, noise interference often exists in continuous multi-band spectral information, which brings great difficulties to spectral feature extraction. Besides, an insufficient number of spectral samples will restrict the classification performance of the algorithm to some extent. In order to solve the problem of a small amount of original spectral sample data and noisy signals, Wasserstein generative adversarial networks (WGAN) are used to generate samples similar to the original spectrum, and spectral features are extracted from the samples. In the case of small samples, the original materials are provided for the classification of hyperspectral images, and a semi-supervised classification model WGAN-CNN for hyperspectral images based on a Wasserstein generative adversarial network is proposed in this paper. This model is combined with a CNN classifier and completes the classification of terrain objects according to the labels for the synthesized samples. The proposed method is compared with several classical hyperspectral image classification methods in classification accuracy. WGAN-CNN can achieve higher classification accuracy in the case of small sample size, which proves the effectiveness of the proposed method.
2020
Publication:2020 International Conference on Machine Learning and Cybernetics (ICMLC), 20201202, 53
Publisher:2020
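
For context on the WGAN component (a generic sketch under an assumed critic/generator pair, not the WGAN-CNN architecture of the paper): the critic estimates the Wasserstein-1 distance between real and generated batches and the generator tries to shrink it; the critic must additionally be kept approximately 1-Lipschitz (weight clipping or a gradient penalty), which is omitted here.

# Generic WGAN losses (PyTorch). `critic` is any nn.Module mapping a batch to scores.
import torch

def critic_loss(critic, real_batch, fake_batch):
    # Minimizing this maximizes E[critic(real)] - E[critic(fake)],
    # i.e. the critic's estimate of the Wasserstein-1 distance.
    return critic(fake_batch).mean() - critic(real_batch).mean()

def generator_loss(critic, fake_batch):
    # The generator tries to raise the critic's score on its samples.
    return -critic(fake_batch).mean()

if __name__ == "__main__":
    critic = torch.nn.Linear(8, 1)
    real, fake = torch.randn(16, 8), torch.randn(16, 8)
    print(critic_loss(critic, real, fake).item(), generator_loss(critic, fake).item())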


WAE-RN: Integrating Wasserstein Autoencoder and Relational Network for Text Sequence
Authors:Zhang X., Liu X., Yang G., Liu W., Li F. 19th China National Conference on Computational Linguistics, CCL 2020
Article, 2020
Publication:Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 12522 LNAI, 2020, 467
Publisher:2020


2020

Peer-reviewed
CVaR-Based Approximations of Wasserstein Distributionally Robust Chance Constraints with Application to Process Scheduling
Authors:Liu B., Yuan Z., Zhang Q., Ge X.
Article, 2020
Publication:Industrial and Engineering Chemistry Research, 59, 2020 05 20, 9562
Publisher:2020


2020 see 2019  Peer-reviewed
Necessary Condition for Rectifiability Involving Wasserstein Distance W₂
Author:Dabrowski D.
Article, 2020
Publication:International Mathematics Research Notices, 2020, 2020 11 01, 8936
Publisher:2020



Data Augmentation Method for Power Transformer Fault Diagnosis Based on Conditional Wasserstein Generative Adversarial Network
Authors:Liu Y., Xu Z., He J., Wang Q., Gao S., Zhao J.
Article, 2020
Publication:Dianwang Jishu/Power System Technology, 44, 2020 04 05, 1505
Publisher:2020


GP-WIRGAN: A Novel Image Recurrent Generative Adversarial Network Model Based on Wasserstein and Gradient Penalty
Authors:Feng Y., Zhang C.-P., Zhang Y.-Y., Shang J.-X., Qiang B.-H.
Article, 2020
Publication:Jisuanji Xuebao/Chinese Journal of Computers, 43, 2020 02 01, 190
Publisher:2020


Multiple Voltage Sag Events Homology Detection Based on Wasserstein Distance
Authors:Xiao X., Gui L., Li C., Zhang H., Li H., Wang Q.
Article, 2020
Publication:Dianwang Jishu/Power System Technology, 44, 2020 12 05, 4684
Publisher:2020

[CITATION] Multiple Voltage Sag Events Homology Detection Based on Wasserstein Distance

XY Xiao, LY Gui, CX Li, HY Zhang, HX Li, Q Wang - Power System Technology, 2020

Cited by 4 Related articles

<——2020——–2020—––3860—



Research of MRI Reconstruction Method by Using De-aliasing Wasserstein Generative Adversarial Networks with Gradient Penalty
Authors:Yuan Z.-H., Jiang M.-F., Li Y., Zhi M.-H., Zhu Z.-J.
Article, 2020
Publication:Tien Tzu Hsueh Pao/Acta Electronica Sinica, 48, 2020 10 01, 1883
Publisher:2020



Comparing Bottom-Up Energy Consumption Models Using The Wasserstein Distance Between Load Profile Histograms
Authors:Sanderson, Edward (Creator), Fragaki, Aikaterini (Creator), Simo, Jules (Creator), Matuszewski, Bogdan (Creator)

Summary:This paper presents a comparison of bottom up models that generate appliance load profiles. The comparison is based on their ability to accurately distribute load over time-of-day. This is a key feature of model performance if the model is used to assess the impact of low carbon technologies and practices on the network. No work has yet assessed models on this basis. In this work, the temporal characteristics of load are captured using histograms, and similarity between the histogram representations of measured and generated data is assessed using the Wasserstein distance. This is then applied to compare the results of three models, which were developed here by adopting approaches used in previous research. One is based on occupant presence, one on occupant activity, and one on empirical data. Typical statistical tests showed that the comparison method is robust and can be used for this purpose
Downloadable Archival Material, 2020-10-30
English
Publisher:Building Simulation and Optimization 2020, 2020-10-30
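
A minimal version of the comparison metric described above, with synthetic 24-hour load histograms standing in for measured and generated data (only scipy's 1D Wasserstein distance with histogram weights is a real API here):

# 1D Wasserstein distance between two daily load-profile histograms.
import numpy as np
from scipy.stats import wasserstein_distance

hours = np.arange(24) + 0.5                                # bin centres (hour of day)
rng = np.random.default_rng(3)
measured  = rng.poisson(lam=50, size=24).astype(float)     # stand-in for metered counts
generated = rng.poisson(lam=50, size=24).astype(float)     # stand-in for model output

dist = wasserstein_distance(hours, hours, u_weights=measured, v_weights=generated)
print(f"Wasserstein distance between load-profile histograms: {dist:.3f} hours")

Because the bin values are hours of day, the resulting distance is directly interpretable as an average temporal displacement of load.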



Trajectories from Distribution-valued Functional Curves: A Unified Wasserstein Framework
Authors:Anuja Sharma, Guido Gerig
Summary:Temporal changes in medical images are often evaluated along a parametrized function that represents a structure of interest (e.g. white matter tracts). By attributing samples along these functions with distributions of image properties in the local neighborhood, we create distribution-valued signatures for these functions. We propose a novel and comprehensive framework which models their temporal evolution trajectories. This is achieved under the unifying scheme of Wasserstein distance metric. The regression problem is formulated as a constrained optimization problem and solved using an alternating projection algorithm. The solution simultaneously preserves the functional characteristics of the curve, models the temporal change in distribution profiles and forces the estimated distributions to be valid. Hypothesis testing is applied in two ways using Wasserstein based test statistics. Validation is presented on synthetic data. Detection of delayed growth is shown on DTI tracts, for a pediatric subject with respect to a healthy population of infants
Article, 2020
Publication:Medical image computing and computer-assisted intervention : MICCAI ... International Conference on Medical Image Computing and Computer-Assisted Intervention, 12267, 202010, 343
Publisher:2020


Solutions of a Class of Degenerate Kinetic Equations Using Steepest Descent in Wasserstein Space
Authors:Aboubacar Marcos, Yongqiang Fu (Editor), Ambroise Soglo

Summary:We use the steepest descent method in an Orlicz–Wasserstein space to study the existence of solutions for a very broad class of kinetic equations, which include the Boltzmann equation, the Vlasov–Poisson equation, the porous medium equation, and the parabolic p-Laplacian equation, among others. We combine a splitting technique along with an iterative variational scheme to build a discrete solution which converges to a weak solution of our problem.
Downloadable Article, 2020
Publication:Journal of Mathematics., 2020, 1
Publisher:2020



Self-improvement of the Bakry-Émery criterion for Poincaré inequalities and Wasserstein contraction using variable curvature bounds
Authors:Cattiaux P., Fathi M., Guillin A.
Article, 2020
Publication:arXiv, 2020 02 21
Publisher:2020


2020


McKean-Vlasov SDEs with drifts discontinuous under Wasserstein distance
Authors:Huang X., Wang F.-Y.
Article, 2020
Publication:arXiv, 2020 02 17
Publisher:2020


A material decomposition method for dual-energy CT via dual interactive Wasserstein generative adversarial networks
Authors:Shi Z., Li H., Wang Z., Cheng M., Cao Q.
Article, 2020
Publication:arXiv, 2020 07 22
Publisher:2020


On Stein’s factors for Poisson approximation in Wasserstein distance with non-linear transportation costs
Authors:Liao Z.-W., Ma Y., Xia A.
Article, 2020
Publication:arXiv, 2020 03 31
Publisher:2020


Central limit theorems for Markov chains based on their convergence rates in Wasserstein distance
Authors:Rui Jin, Aixin Tan
Article, 2020
Publication:arXiv, 2020 02 21
Publisher:2020


Continuous regularized Wasserstein barycenters
Authors:Li L., Genevay A., Yurochkin M., Solomon J.
Article, 2020
Publication:arXiv, 2020 08 28
Publisher:2020

<——2020——–2020—––3870—



2020 see 2019
Donsker’s theorem in Wasserstein-1 distance
Authors:Coutin L., Decreusefond L.
Article, 2020
Publication:Electronic Communications in Probability, 25, 2020
Publisher:2020


Pattern-based music generation with Wasserstein autoencoders and PR^C descriptions
Authors:Borghuis V., Brusci L., Angioloni L., Frasconi P. 29th International Joint Conference on Artificial Intelligence, IJCAI 2020
Article, 2020
Publication:IJCAI International Joint Conference on Artificial Intelligence, 2021-January, 2020, 5225
Publisher:2020



Convergence in Wasserstein Distance for Empirical Measures of Dirichlet Diffusion Processes on Manifolds
Author:Wang F.-Y.
Article, 2020
Publication:arXiv, 2020 05 19
Publisher:2020


Convergence of Recursive Stochastic Algorithms using Wasserstein Divergence
Authors:Gupta A., Haskell W.B.
Article, 2020
Publication:arXiv, 2020 03 25
Publisher:2020

Exact rate of convergence of the mean Wasserstein distance between the empirical and true Gaussian distribution: A Preprint
Authors:Berthet P., Fort J.C.
Article, 2020
Publication:arXiv, 2020 01 27
Publisher:2020

2020


Optimality in weighted L2-Wasserstein goodness-of-fit statistics
Authors:de Wet T., Humble V.
Article, 2020
Publication:South African Statistical Journal, 54, 2020, 1
Publisher:2020


Synthetic images of longitudinal cracks in stainless steel slabs via Wasserstein generative adversarial networks used toward unsupervised classification
Authors:Andrade D., Simiand M., Barreriro A.J. AISTech 2020 Iron and Steel Technology Conference
Article, 2020
Publication:AISTech - Iron and Steel Technology Conference Proceedings, 3, 2020, 1985
Publisher:2020



Remote Sensing Image Segmentation based on Generative Adversarial Network with Wasserstein divergence
Authors:Xia Cao (Author), Chenggang Song (Author), Jian Zhang (Author), Chang Liu (Author)
Chapter, 2020
Publication:2020 3rd International Conference on Algorithms, Computing and Artificial Intelligence, 20201224, 1
Publisher:2020

  

DECWA: Density-Based Clustering using Wasserstein Distance
Authors:Nabil El Malki (Author), Robin Cugny (Author), Olivier Teste (Author), Franck Ravat (Author)
Summary:Clustering is a data analysis method for extracting knowledge by discovering groups of data called clusters. Among these methods, state-of-the-art density-based clustering methods have proven to be effective for arbitrary-shaped clusters. Despite their encouraging results, they struggle with low-density clusters, nearby clusters of similar density, and high-dimensional data. Our proposals are a new characterization of clusters and a new clustering algorithm based on spatial density and a probabilistic approach. First, sub-clusters are built using spatial density represented as a probability density function (p.d.f) of pairwise distances between points. A method is then proposed to agglomerate similar sub-clusters by using both their density (p.d.f) and their spatial distance. The key idea we propose is to use the Wasserstein metric, a powerful tool to measure the distance between the p.d.f of sub-clusters. We show that our approach outperforms other state-of-the-art density-based clustering methods on a wide variety of datasets.
Chapter, 2020
Publication:Proceedings of the 29th ACM International Conference on Information & Knowledge Management, 20201019, 2005
Publisher:2020
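
A rough sketch of the key idea stated in the abstract (not the published DECWA algorithm): represent each sub-cluster by the empirical distribution of its pairwise point distances and compare sub-clusters with a 1D Wasserstein distance, merging those that are close both spatially and in this sense.

# Illustration: Wasserstein comparison of sub-cluster pairwise-distance distributions.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import wasserstein_distance

def density_signature(points):
    # Pairwise distances act as a proxy for the sub-cluster's spatial density profile.
    return pdist(points)

rng = np.random.default_rng(4)
sub_a = rng.normal(0.0, 1.0, size=(100, 2))
sub_b = rng.normal(0.2, 1.0, size=(100, 2))   # similar density: candidates for merging
sub_c = rng.normal(5.0, 0.1, size=(100, 2))   # much tighter cluster

print("W(a, b) =", wasserstein_distance(density_signature(sub_a), density_signature(sub_b)))
print("W(a, c) =", wasserstein_distance(density_signature(sub_a), density_signature(sub_c)))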


[PDF] openreview.net

Functional priors for Bayesian neural networks through Wasserstein distance minimization to Gaussian processes

BH TranD MiliosS Rossi… - Third Symposium on …, 2020 - openreview.net

… We stress that a fixed Gaussian prior on the parameters is not … shallow Bayesian models,

such as Gaussian Processes (gps), where … We consider the Wasserstein distance between the …

Cited by 1 Related articles All 2 versions 

<——2020——–2020—––3880—



2020. [HTML] hindawi.com

[HTML] Solutions of a class of degenerate kinetic equations using steepest descent in Wasserstein space

A Marcos, A Soglo - Journal of Mathematics, 2020 - hindawi.com

… We use the steepest descent method in an Orlicz–Wasserstein space to study the existence

of solutions for a very broad class of kinetic equations, which include the Boltzmann equation…

Cited by 4 Related articles All 7 versions 

[PDF] academia.edu

[PDF] Research Article Solutions of a Class of Degenerate Kinetic Equations Using Steepest Descent in Wasserstein Space

A Marcos, A Soglo - 2020 - academia.edu

… We use the steepest descent method in an Orlicz–Wasserstein space to study the existence

of solutions for a very broad class of kinetic equations, which include the Boltzmann equation…

 Related articles 


[PDF] arxiv.org

Unadjusted Langevin algorithm with multiplicative noise: Total variation and Wasserstein bounds

F Panloup - arXiv preprint arXiv:2012.14310, 2020 - arxiv.org

… of an ergodic diffusion with a possibly multiplicative diffusion term (non-constant diffusion … 

Variation and L1-Wasserstein distances in both multiplicative and additive frameworks. …

Cited by 1 Related articles All 2 versions
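
For reference (my notation, not necessarily the paper's): the unadjusted Langevin algorithm discretizes a diffusion $dX_t = b(X_t)\,dt + \sigma(X_t)\,dB_t$ with an Euler step,

$$ X_{k+1} = X_k + \gamma\, b(X_k) + \sqrt{\gamma}\, \sigma(X_k)\, \xi_{k+1}, \qquad \xi_{k+1} \sim \mathcal{N}(0, I_d), $$

and the entry above concerns quantitative total-variation and L1-Wasserstein bounds for such schemes when the diffusion coefficient $\sigma$ is non-constant (multiplicative noise).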



2020 patent

Non-linear industrial process modeling method based on WGANs data enhancement

CN CN112966429A 褚菲 中国矿业大学

Priority 2020-08-11 • Filed 2020-08-11 • Published 2021-06-15

2. The WGANs data enhancement-based nonlinear industrial process modeling method of claim 1, wherein: the step B comprises the following steps: preprocessing the data set acquired in the step A, and specifically comprises the following steps: 1) the initial acquisition data comprises industrial …


2020 patent

Difference privacy greedy grouping method adopting Wasserstein distance

CN CN112307514A 杨悦 哈尔滨工程大学

Priority 2020-11-26 • Filed 2020-11-26 • Published 2021-02-02

1. A differential privacy greedy grouping method adopting Wasserstein distance is characterized by comprising the following steps: step 1: reading a data set D received at the ith time point i Step 2: will D i Data set D released from last time point i-1 Performing Wasserstein distance similarity …


2020 patent

Wi-Fi indoor positioning method based on signal distribution Wasserstein …

CN CN111741429B 周牧 重庆邮电大学

Priority 2020-06-23 • Filed 2020-06-23 • Granted 2022-05-03 • Published 2022-05-03

1. the Wi-Fi indoor positioning method based on signal distribution Wasserstein distance measurement is characterized by comprising the following steps of: step one, off-line stage, the Wi-Fi received signal from the mth AP at the nth Reference Point (RP) is strengthenedThe sequence of degrees ( …


2020


2020 patent

Wasserstein distance-based depth domain adaptive image classification method

CN CN111428803A 吴强 山东大学

Priority 2020-03-31 • Filed 2020-03-31 • Published 2020-07-17

The invention provides a Wasserstein distance-based depth domain adaptive image classification method and device and a computer-readable storage medium. First, features are extracted using a convolution structure. Secondly, the number of features is reduced by adopting layer-by-layer mapping of the …


2020 patent

Wasserstein distance-based image rapid enhancement method

CN CN111476721B 丰江帆 重庆邮电大学

Priority 2020-03-10 • Filed 2020-03-10 • Granted 2022-04-29 • Published 2022-04-29

5. The Wasserstein distance-based image rapid enhancement method according to claim 1, characterized in that: the up-samplin
 

2020-2023

… and device for generating countermeasure network model based on Wasserstein

CN112634390B 郑海荣 深圳先进技术研究院

Filed 2020-12-17 • Granted 2023-06-13 • Published 2023-06-13

the Wasserstein generation countermeasure network model is obtained through training of a preset generation countermeasure network model based on a low-energy image sample, a standard high-energy image and a preset loss function, and the Wasserstein generation countermeasure network model comprises …




 

<——2020——–2020—––3890—end 2020—

including 35 titles with WGAN, 2 titles with Wassersteina,

and 2 titles with Васерштейна.

 

    

                                                                                                                                                                                     


2019: 2906

2020: 3890

—————————

                                                                                                                                                                                     

  


2018-202 


   




 



 



A
 






   

 ———————————no year————no year——

no year 2017?

[CITATION] On Bolza problem in Wasserstein space

C Jimenez, A Marigonda, M Quincampoix - preprint

  Cited by 2 Related articles

Reference in 2018:

[30] Jimenez, C., Marigonda, A., Quincampoix, M. : On Bolza problem in Wasserstein space. Preprint.  

in

  Generalized Dynamic Programming Principle and Sparse ...  2018


 no year in arXiv?

[CITATION] Trotter's formula for Fokker-Planck equations in the Wasserstein space

P Clément, J Maas - preparation

  Cited by 1 Related articles

Reference in 2009:

37. Ph. Clément and J. Maas, Trotter’s formula for Fokker-Planck equations in the Wasserstein space, in preparation.

in

[PDF] Analysis of infinite dimensional diffusions

J Maas - 2009 - janmaas.org

Stochastic partial differential equations (SPDEs) are used to model a wide variety of

phenomena in physics, population biology, finance, and other fields of science.

Mathematically, SPDEs are often formulated as stochastic ordinary differential equations …

  Cited by 6 Related articles All 5 versions 


———————————end no year