My name in title since 2021

see this page on the web

 

titles with my name in 2019-2020

my name in title till 2018

my name in title; what is my name; totals by year

my name in title from Math Reviews

My publications

My name in titles of videos

last updated on Aug 6, 2023
 
   

 

 
 
start 2021  Wasserstein


MR4180150 Prelim Çelik, Türkü Özlüm; Jamneshan, Asgar; Montúfar, Guido; Sturmfels, Bernd; Venturello, Lorenzo; Wasserstein distance to independence models. J. Symbolic Comput. 104 (2021), 855–873.

Matsuda, Takeru; Strawderman, William E.

Predictive density estimation under the Wasserstein loss. (English) Zbl 1441.62138 

J. Stat. Plann. Inference 210, 53-63 (2021). 

MSC:  62H10 62A01 62C05 62C20 

2021 Cover Image  PEER-REVIEW

Wasserstein distance to independence models

by Çelik, Türkü Özlüm; Jamneshan, Asgar; Montúfar, Guido ; More...

Journal of symbolic computation, 05/2021, Volume 104

An independence model for discrete random variables is a Segre-Veronese variety in a probability simplex. Any metric on the set of joint states of the random...

Journal ArticleCitation Online

Zbl 07312503

 

Statistical Learning in Wasserstein Space 

By: Karimi, Amirhossein; Ripani, Luigia; Georgiou, Tryphon T. 

IEEE CONTROL SYSTEMS LETTERS  Volume: ‏ 5   Issue: ‏ 3   Pages: ‏ 899-904   Published: ‏ JUL 2021

High-Confidence Attack Detection via Wasserstein-Metric Computations 

By: Li, Dan; Martinez, Sonia 

IEEE CONTROL SYSTEMS LETTERS  Volume: ‏ 5   Issue: ‏ 2   Pages: ‏ 379-384   Published: ‏ APR 2021 


Predictive density estimation under the Wasserstein loss 

By: Matsuda, Takeru; Strawderman, William E. 

JOURNAL OF STATISTICAL PLANNING AND INFERENCE  Volume: ‏ 210   Pages: ‏ 53-63   Published: ‏ JAN 2021    Zbl 1441.62138 

Cover Image

Predictive density estimation under the Wasserstein loss

by Matsuda, Takeru; Strawderman, William E

Journal of statistical planning and inference, 01/2021, Volume 210

We investigate predictive density estimation under the L2 Wasserstein loss for location families and location-scale families. We show that plug-in densities...

Article PDF Download PDF BrowZine PDF Icon

Journal ArticleFull Text Online


2021


Wasserstein convergence rates for random bit approximations of continuous Markov processes 

by Ankirchner, Stefan; Kruse, Thomas; Urusov, Mikhail 

Journal of mathematical analysis and applications, 01/2021, Volume 493, Issue 2

We determine the convergence speed of a numerical scheme for approximating one-dimensional continuous strong Markov processes. The scheme is based on the...

Article PDF Download PDF 

Journal ArticleFull Text Online 

MR4144292 Prelim Ankirchner, Stefan; Kruse, Thomas; Urusov, Mikhail; Wasserstein convergence rates for random bit approximations of continuous Markov processes. J. Math. Anal. Appl. 493 (2021), no. 2, 124543. 60 (65)

Review PDF Clipboard Journal Article 

Ankirchner, Stefan; Kruse, Thomas; Urusov, Mikhail

Wasserstein convergence rates for random bit approximations of continuous Markov processes. (English) Zbl 07267868 

J. Math. Anal. Appl. 493, No. 2, Article ID 124543, 31 p. (2021). 

MSC:  60 62 

 Zbl 07267868

 MR4153238 Prelim Bonnet, Benoît; Frankowska, Hélène; Differential inclusions in Wasserstein spaces: The Cauchy-Lipschitz framework. J. Differential Equations 271 (2021), 594–637. 28B20 (34A60 34G20 49J21 49J45)

Zbl 07283594


 

MR4101489   Matsuda, Takeru; Strawderman, William E. Predictive density estimation under the Wasserstein loss. 

J. Statist. Plann. Inference 210 (2021), 53–63. 62A99 (62C05 62C99)

Review PDF Clipboard Journal Article 

[PDF] arxiv.org

Predictive density estimation under the Wasserstein loss

T Matsuda, WE Strawderman - Journal of Statistical Planning and Inference, 2021 - Elsevier

… Here, we consider L 2 Wasserstein distance and focus on … useful properties of the Wasserstein 

distance and also review … estimation under the L 2 Wasserstein loss for location families …

Cited by 3 Related articles All 5 versions   

 

 Statistical Learning in Wasserstein Space

By: Karimi, Amirhossein; Ripani, Luigia; Georgiou, Tryphon T.

IEEE CONTROL SYSTEMS LETTERS   Volume: ‏ 5   Issue: ‏ 3   Pages: ‏ 899-904   Published: ‏ JUL 2021

MR4211609 Prelim Karimi, Amirhossein; Ripani, Luigia; Georgiou, Tryphon T.; Statistical learning in Wasserstein space. IEEE Control Syst. Lett. 5 (2021), no. 3, 899–904. 49 (62)

Review PDF Clipboard Journal Article
Matsuda, Takeru
Strawderman, William E.

Predictive density estimation under the Wasserstein loss. (English) Zbl 1441.62138

J. Stat. Plann. Inference 210, 53-63 (2021).

MSC:  62H10 62A01 62C05 62C20



 Wasserstein convergence rates for random bit approximations of continuous Markov processes

By: Ankirchner, Stefan; Kruse, Thomas; Urusov, Mikhail

JOURNAL OF MATHEMATICAL ANALYSIS AND APPLICATIONS   Volume: ‏ 493   Issue: ‏ 2     Article Number: 124543   Published: ‏ JAN 15 2021

Cited by 4 Related articles All 4 versions

 High-Confidence Attack Detection via Wasserstein-Metric ...

ieeexplore.ieee.org › document

Jun 16, 2020 — High-Confidence Attack Detection via Wasserstein-Metric Computations ... enabled via a linear optimization to compute the detection measure ... Published in: IEEE Control Systems Letters ( Volume: 5 , Issue: 2 , April 2021 ).

High-Confidence Attack Detection via Wasserstein-Metric Computations

By: Li, Dan; Martinez, Sonia

IEEE CONTROL SYSTEMS LETTERS   Volume: ‏ 5   Issue: ‏ 2   Pages: ‏ 379-384   Published: ‏ APR 2021
<——2021————2021———10——


Attainability property for a probabilistic target in wasserstein ...

www.aimsciences.org › article › doi › dcds.2020300

Attainability property for a probabilistic target in Wasserstein spaces. Discrete & Continuous Dynamical Systems - A, 2021, 41 (2): 777-812. doi: 10.3934/dcds.

by G Cavagnari · 2020 · Related articles

ATTAINABILITY PROPERTY FOR A PROBABILISTIC TARGET IN WASSERSTEIN SPACES

By: Cavagnari, Giulia; Marigonda, Antonio

DISCRETE AND CONTINUOUS DYNAMICAL SYSTEMS   Volume: ‏ 41   Issue: ‏ 2   Pages: ‏ 777-812   Published: ‏ FEB 2021


(from Web of Science Core Collection)

2021 see 2022

arXiv:2101.01100 [pdf, ps, other] math.OC cs.CC cs.DS cs.LG

Wasserstein barycenters are NP-hard to compute
Authors: Jason M. Altschuler, Enric Boix-Adsera
Abstract: The problem of computing Wasserstein barycenters (a.k.a. Optimal Transport barycenters) has attracted considerable recent attention due to many applications in data science. While there exist polynomial-time algorithms in any fixed dimension, all known runtimes suffer exponentially in the dimension. It is an open question whether this exponential dependence is improvable to a polynomial dependence…  More
Submitted 4 January, 2021; originally announced January 2021.
Comments: 18 pages (9 pages main text)

 Cited by 5 Related articles All 2 versions 
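The hardness result above concerns barycenters in general dimension. As a point of contrast, in one dimension the Wasserstein-2 barycenter has a closed form obtained by averaging quantile functions; the NumPy sketch below (my own illustration, not code from the paper) computes it for empirical measures.

import numpy as np

def w2_barycenter_1d(samples_list, weights=None, n_quantiles=200):
    # In 1-D the Wasserstein-2 barycenter is obtained by averaging quantile
    # functions; no linear programming is needed (the NP-hardness above
    # concerns the high-dimensional problem).
    k = len(samples_list)
    if weights is None:
        weights = np.full(k, 1.0 / k)
    qs = np.linspace(0.0, 1.0, n_quantiles)
    quantiles = np.stack([np.quantile(s, qs) for s in samples_list])
    return np.asarray(weights) @ quantiles   # quantile function of the barycenter

rng = np.random.default_rng(0)
mu_samples = rng.normal(-2.0, 1.0, size=1000)
nu_samples = rng.normal(3.0, 0.5, size=1000)
bary_quantiles = w2_barycenter_1d([mu_samples, nu_samples])
print(bary_quantiles.mean())   # mean of the barycenter, roughly (-2 + 3) / 2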


Attention Residual Network for White Blood Cell Classification with WGAN Data Augmentation
by Zhao, Meng; Jin, Lingmin; Teng, Shenghua; More...
2021 11th International Conference on Information Technology in Medicine and Education (ITME), 11/2021
In medicine, white blood cell (WBC) classification plays an important role in clinical diagnosis and treatment. Due to the similarity between classes and lack...
Conference Proceeding  Full Text Online

year 2021  [PDF] researchgate.net

[PDF] THE α-z-BURES WASSERSTEIN DIVERGENCE

T Hoa Dinh, CT Le, BK Vo, TD Vuong - researchgate.net

$\Phi(A, B) = \mathrm{Tr}((1-\alpha)A + \alpha B) - \mathrm{Tr}(Q_{\alpha,z}(A, B))$, where $Q_{\alpha,z}(A, B) = \big(A^{\frac{1-\alpha}{2z}} B^{\frac{\alpha}{z}} A^{\frac{1-\alpha}{2z}}\big)^{z}$ is the matrix function in the α-z-Rényi relative entropy. We show that for $0 \le \alpha \le z \le 1$, the quantity $\Phi(A, B)$ is a quantum divergence and satisfies the Data Processing Inequality in …
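As a numerical companion to the formula quoted above, the sketch below evaluates Φ(A, B) for small positive definite matrices with SciPy's fractional_matrix_power; the matrices and parameter values are arbitrary examples of mine, not data from the paper.

import numpy as np
from scipy.linalg import fractional_matrix_power as mpow

def alpha_z_bures_divergence(A, B, alpha, z):
    # Phi(A, B) = Tr((1 - alpha) A + alpha B) - Tr(Q_{alpha,z}(A, B)), with
    # Q_{alpha,z}(A, B) = (A^{(1-alpha)/(2z)} B^{alpha/z} A^{(1-alpha)/(2z)})^z,
    # following the formula quoted above (0 <= alpha <= z <= 1).
    half = mpow(A, (1.0 - alpha) / (2.0 * z))
    Q = mpow(half @ mpow(B, alpha / z) @ half, z)
    val = np.trace((1.0 - alpha) * A + alpha * B) - np.trace(Q)
    return float(np.real(val))

A = np.array([[2.0, 0.3], [0.3, 1.0]])     # two small positive definite matrices
B = np.array([[1.5, -0.2], [-0.2, 2.5]])
print(alpha_z_bures_divergence(A, B, alpha=0.5, z=0.5))   # alpha = z = 1/2: Bures-Wasserstein case
print(alpha_z_bures_divergence(A, A, alpha=0.3, z=0.7))   # equals 0 when A == B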

  

arXiv:2101.01429 [pdf, other] stat.ML cs.LG
Convergence and finite sample approximations of entropic regularized Wasserstein distances in Gaussian and RKHS settings
Authors: Minh Ha Quang
Abstract: This work studies the convergence and finite sample approximations of entropic regularized Wasserstein distances in the Hilbert space setting. Our first main result is that for Gaussian measures on an infinite-dimensional Hilbert space, convergence in the 2-Sinkhorn divergence is {\it strictly weaker} than convergence in the exact 2-Wasserstein distance. Specifically, a sequence of centered Gaussi…  More
Submitted 5 January, 2021; originally announced January 2021.

Convergence and finite sample approximations of entropic regularized Wasserstein distances in Gaussian and
 Related articles All 3 versions 
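For comparison with the Sinkhorn-divergence convergence discussed in this entry, the exact (unregularized) 2-Wasserstein distance between two Gaussian measures has a well-known closed form; a small NumPy/SciPy sketch of that formula follows (my illustration, not the paper's estimator).

import numpy as np
from scipy.linalg import sqrtm

def w2_gaussian(m1, C1, m2, C2):
    # Exact 2-Wasserstein distance between N(m1, C1) and N(m2, C2):
    #   W2^2 = ||m1 - m2||^2 + Tr(C1 + C2 - 2 (C1^{1/2} C2 C1^{1/2})^{1/2}).
    r1 = sqrtm(C1)
    cross = sqrtm(r1 @ C2 @ r1)
    w2_sq = np.sum((m1 - m2) ** 2) + np.trace(C1 + C2 - 2.0 * np.real(cross))
    return float(np.sqrt(max(w2_sq, 0.0)))

m1, C1 = np.zeros(2), np.eye(2)
m2, C2 = np.array([1.0, -1.0]), np.array([[2.0, 0.5], [0.5, 1.0]])
print(w2_gaussian(m1, C1, m2, C2))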


2021 online Cover Image

Wasserstein convergence rates for random bit approximations of continuous Markov...

by Ankirchner, Stefan; Kruse, Thomas; Urusov, Mikhail

Journal of mathematical analysis and applications, 01/2021, Volume 493, Issue 2

We determine the convergence speed of a numerical scheme for approximating one-dimensional continuous strong Markov processes. The scheme is based on the...

Article PDF Download PDF BrowZine PDF Icon

Journal ArticleFull Text Online

Zbl 07267868

arXiv:2101.00838 [pdf, ps, other] math.OC
Distributionally robust second-order stochastic dominance constrained optimization with Wasserstein distance
Authors: Yu Mei, Jia Liu, Zhiping Chen
Abstract: We consider a distributionally robust second-order stochastic dominance constrained optimization problem, where the true distribution of the uncertain parameters is ambiguous. The ambiguity set contains all probability distributions close to the empirical distribution under the Wasserstein distance. We adopt the sample approximation technique to develop a linear programming formulation that provid…  More
Submitted 4 January, 2021; originally announced January 2021.
MSC Class: 90C59; 90C34
Cited by 1 Related articles

 Differential inclusions in Wasserstein spaces: The Cauchy-Lipschitz framework

By: Bonnet, Benoît; Frankowska, Hélène

JOURNAL OF DIFFERENTIAL EQUATIONS  Volume: ‏ 271   Pages: ‏ 594-637   Published: ‏ JAN 15 2021

   Cited by 5 Related articles All 6 versions

Cited by 6 Related articles All 6 versions

 Wasserstein GANs for MR Imaging: From Paired to Unpaired Training.

By: Lei, Ke; Mardani, Morteza; Pauly, John M; et al.

IEEE transactions on medical imaging  Volume: ‏ 40   Issue: ‏ 1   Pages: ‏ 105-115   Published: ‏ 2021-Jan (Epub 2020 Dec 29)


Method for training Wasserstein generative adversarial network, involves adapting discriminator parameter as function of loss function, and creating second input data on basis of first input data using method of virtual adversarial training

Patent Number: DE102019210270-A1 US2020372297-A1 CN111985638-A

Patent Assignee: BOSCH GMBH ROBERT

Inventor(s): TERJEK D.


2021 [PDF] arxiv.org

Predictive density estimation under the Wasserstein loss

T Matsuda, WE Strawderman - Journal of Statistical Planning and Inference, 2021 - Elsevier

We investigate predictive density estimation under the L 2 Wasserstein loss for location

families and location-scale families. We show that plug-in densities form a complete class

and that the Bayesian predictive density is given by the plug-in density with the posterior …

   Related articles All 5 versions

MR4101489 Reviewed Matsuda, Takeru; Strawderman, William E. Predictive density estimation under the Wasserstein loss. J. Statist. Plann. Inference 210 (2021), 53–63. 62A99 (62C05 62C99)

Review PDF Clipboard Journal Article

Cited by 2 Related articles All 5 versions

<——2021——2021————20——  

  

MR4191527 Prelim Cavagnari, Giulia; Marigonda, Antonio; Attainability property for a probabilistic target in Wasserstein spaces. Discrete Contin. Dyn. Syst. 41 (2021), no. 2, 777–812. 49K40 (34 49J20 49J52 49J53 49L20)

Review PDF Clipboard Journal Article

ATTAINABILITY PROPERTY FOR A PROBABILISTIC TARGET IN WASSERSTEIN SPACES

By: Cavagnari, Giulia; Marigonda, Antonio

DISCRETE AND CONTINUOUS DYNAMICAL SYSTEMS  Volume: ‏ 41   Issue: ‏ 2   Pages: ‏ 777-812   Published: ‏ FEB 2021


2021

Wasserstein GANs for MR Imaging: From Paired to Unpaired Training.

By: Lei, Ke; Mardani, Morteza; Pauly, John M; et al.

IEEE transactions on medical imaging  Volume: ‏ 40   Issue: ‏ 1   Pages: ‏ 105-115   Published: ‏ 2021-Jan (Epub 2020 Dec 29)



[PDF] iop.org

Graph Classification Method Based on Wasserstein Distance

W Wu, G Hu, F Yu - Journal of Physics: Conference Series, 2021 - iopscience.iop.org

… , and then calculates the Wasserstein distance between the … between two graphs, namely 

Wasserstein distance. However, … Thus, we can calculate the Wasserstein distance between …

Related articles All 3 versions
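The entry above compares graphs through a Wasserstein distance. A minimal illustration of that idea (not the authors' construction) is to compare the node-degree distributions of two graphs with the 1-Wasserstein distance, e.g. using networkx and scipy.

import networkx as nx
from scipy.stats import wasserstein_distance

def degree_w1(G1, G2):
    # 1-Wasserstein distance between the node-degree distributions of two
    # graphs -- a deliberately crude graph descriptor, used here only to
    # illustrate comparing graphs via a Wasserstein distance.
    d1 = [deg for _, deg in G1.degree()]
    d2 = [deg for _, deg in G2.degree()]
    return wasserstein_distance(d1, d2)

G_sparse = nx.gnp_random_graph(100, 0.05, seed=1)
G_dense = nx.gnp_random_graph(100, 0.20, seed=2)
print(degree_w1(G_sparse, G_dense))                                 # large gap
print(degree_w1(G_sparse, nx.gnp_random_graph(100, 0.05, seed=3)))  # small gap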

 

 

2021 see 2023

Publications | Khai Nguyen

https://khainb.com › publications

by K Nguyen · 2021 · Cited by 7 — 2023. Preprint. Markovian Sliced Wasserstein Distances: Beyond Independent ... 

Revisiting Sliced Wasserstein on Images: From Vectorization to Convolution.

arXiv:2101.04039 [pdf, ps, other] math.ST stat.ML
From Smooth Wasserstein Distance to Dual Sobolev Norm: Empirical Approximation and Statistical Applications
Authors: Sloan Nietert, Ziv Goldfeld, Kengo Kato
Submitted 11 January, 2021; originally announced January 2021.

  Cited by 2 Related articles All 2 versions 

2021



Considering anatomical prior information for low-dose ct image enhancement using attribute-augmented wasserstein generative adversarial networks

Z Huang, X Liu, R Wang, J Chen, P Lu, Q Zhang… - Neurocomputing, 2021 - Elsevier

… For the adversarial training process, we apply the Wasserstein distance with a gradient … regularization parameter used to balance the Wasserstein estimation and the gradient penalty …

Cited by 9 Related articles
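The abstract mentions balancing the Wasserstein estimate against a gradient penalty. For reference, here is a generic PyTorch sketch of the standard WGAN-GP penalty term; the critic network, image-shaped tensors and the weight lambda_gp are assumptions of this illustration, not details taken from the paper.

import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    # Standard WGAN-GP term: push the critic's gradient norm towards 1 on
    # random interpolates between real and generated batches (image-shaped
    # tensors of size N x C x H x W are assumed here).
    eps = torch.rand(real.size(0), 1, 1, 1, device=real.device)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads = torch.autograd.grad(outputs=scores, inputs=interp,
                                grad_outputs=torch.ones_like(scores),
                                create_graph=True)[0]
    grad_norm = grads.flatten(start_dim=1).norm(2, dim=1)
    return lambda_gp * ((grad_norm - 1.0) ** 2).mean()

# Critic loss for one batch (to be minimized):
#   critic(fake).mean() - critic(real).mean() + gradient_penalty(critic, real, fake)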

arXiv:2101.05756 [pdf, other] math.MG q-bio.PE

The ultrametric Gromov-Wasserstein distance

Authors: Facundo Mémoli, Axel Munk, Zhengchao Wan, Christoph Weitkamp

Abstract: In this paper, we investigate compact ultrametric measure spaces which form a subset $\mathcal{U}^w$ of the collection of all metric measure spaces $\mathcal{M}^w$. As for the ultrametric Gromov-Hausdorff distance on the collection of ultrametric spaces $\mathcal{U}$, we define ultrametric versions of two metrics on $\mathcal{U}^w$, namely of Sturm's distance of order p and of the Gromov… More

Submitted 14 January, 2021; originally announced January 2021.
Cited by 3
 Related articles All 3 versions 


ARTICLE

The Spectral-Domain \(\mathcal{W}_2\) Wasserstein Distance for Elliptical Processes and the Spectral-Domain Gelbrich Bound

Song Fang; Quanyan Zhu - arXiv.org, 2021

 

[PDF] openreview.net

[PDF] Functional Priors for Bayesian Neural Networks through Wasserstein Distance Minimization to Gaussian Processes

BH Tran, D Milios, S Rossi, M Filippone - openreview.net

The Bayesian treatment of neural networks dictates that a prior distribution is considered

over the weight and bias parameters of the network. The non-linear nature of the model

implies that any distribution of the parameters has an unpredictable effect on the distribution …

  

arXiv:2101.06936 [pdf, ps, other] math.PR math.ST
Wasserstein Convergence Rate for Empirical Measures of Markov Chains
Authors: Adrian Riekert
Abstract: We consider a Markov chain on $\mathbb{R}^d$ with invariant measure $\mu$. We are interested in the rate of convergence of the empirical measures towards the invariant measure with respect to the $1$-Wasserstein distance. The main result of this article is a new upper bound for the expected Wasserstein distance, which is proved by combining the Kantorovich dual formula with a Fourier expansion. In a… More
Submitted 18 January, 2021; originally announced January 2021.
Comments: 14 pages

Cited by 2 Related articles All 2 versions 
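A quick numerical illustration of the phenomenon studied in this entry: simulate a simple AR(1) chain, whose invariant law is Gaussian, and track the 1-Wasserstein distance between the empirical measure of the path and the invariant measure (approximated here by a large i.i.d. sample). The chain and its parameters are my own toy example, not taken from the paper.

import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
a, sigma = 0.8, 1.0                       # AR(1): X_{t+1} = a X_t + sigma * N(0, 1)
stat_std = sigma / np.sqrt(1.0 - a**2)    # invariant law is N(0, stat_std^2)
target = rng.normal(0.0, stat_std, size=200_000)   # stand-in for the invariant measure

def empirical_w1(n_steps):
    x, path = 0.0, []
    for _ in range(n_steps):
        x = a * x + sigma * rng.normal()
        path.append(x)
    return wasserstein_distance(path, target)

for n in (100, 1_000, 10_000):
    print(n, empirical_w1(n))    # the distance shrinks as the chain runs longer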

<——2021———2021———30——   


arXiv:2101.06572 [pdf, ps, other] math.OA cs.IT
Tracial smooth functions of non-commuting variables and the free Wasserstein manifold
Authors: David Jekel, Wuchen Li, Dimitri Shlyakhtenko
Abstract: We formulate a free probabilistic analog of the Wasserstein manifold on $\mathbb{R}^d$ (the formal Riemannian manifold of smooth probability densities on $\mathbb{R}^d$), and we use it to study smooth non-commutative transport of measure. The points of the free Wasserstein manifold $\mathcal{W}(\mathbb{R}^d)$ are smooth tracial non-commutative functions $V$ with quadratic growth at $\infty$, wh… More   01/2021
Submitted 16 January, 2021; originally announced January 2021.
Comments: 121 pages
MSC Class: 46L52; 46L54; 35Q49; 94A17

 Cited by 1 Related articles All 4 versions 

Tracial smooth functions of non-commuting variables and the free Wasserstein manifold
Jekel, David; Li, Wuchen; Shlyakhtenko, Dimitri. arXiv.org; Ithaca, Oct 25, 2021.


 online OPEN ACCESS

Tracial smooth functions of non-commuting variables and the free Wasserstein manifold

by Jekel, David; Li, Wuchen; Shlyakhtenko, Dimitri

01/2021

We formulate a free probabilistic analog of the Wasserstein manifold on $\mathbb{R}^d$ (the formal Riemannian manifold of smooth probability densities on...

Journal ArticleFull Text Online
Cited by 3
Related articles All 6 versions


arXiv:2101.08126 [pdf, ps, other] math.ST
A short proof on the rate of convergence of the empirical measure for the Wasserstein distance
Authors: Vincent Divol
Abstract: We provide a short proof that the Wasserstein distance between the empirical measure of a n-sample and the estimated measure is of order n^-(1/d), if the measure has a lower and upper bounded density on the d-dimensional flat torus.
Submitted 20 January, 2021; originally announced January 2021.

  All 4 versions 


arXiv:2101.07969 [pdf, ps, other] math.ST cs.IT cs.LG stat.ML
Robust W-GAN-Based Estimation Under Wasserstein Contamination
Authors: Zheng Liu, Po-Ling Loh
Abstract: Robust estimation is an important problem in statistics which aims at providing a reasonable estimator when the data-generating distribution lies within an appropriately defined ball around an uncontaminated distribution. Although minimax rates of estimation have been established in recent years, many existing robust estimators with provably optimal convergence rates are also computationally intra… 
More
Submitted 20 January, 2021; originally announced January 2021.

  Related articles All 2 versions 


arXiv:2101.07496 [pdf, other] cs.LG cs.AI
Disentangled Recurrent Wasserstein Autoencoder
Authors: Jun Han, Martin Renqiang Min, Ligong Han, Li Erran Li, Xuan Zhang
Abstract: Learning disentangled representations leads to interpretable models and facilitates data generation with style transfer, which has been extensively studied on static data such as images in an unsupervised learning framework. However, only a few works have explored unsupervised disentangled sequential representation learning due to challenges of generating sequential data. In this paper, we propose…  More
Submitted 19 January, 2021; originally announced January 2021.
Comments: ICLR 2021

Disentangled Recurrent Wasserstein Autoencoder

online  OPEN ACCESS

Disentangled Recurrent Wasserstein Autoencoder

by Han, Jun; Min, Martin Renqiang; Han, Ligong ; More...

01/2021

Learning disentangled representations leads to interpretable models and facilitates data generation with style transfer, which has been extensively studied on...

Journal ArticleFull Text Online
Cited by 16 Related articles All 4 versions

2021

Wasserstein metric-based Boltzmann entropy of a landscape ...

link.springer.com › article

Jan 12, 2021 — Wasserstein metric-based Boltzmann entropy of a landscape mosaic: a clarification, corr

 Wasserstein metric-based Boltzmann entropy of a landscape mosaic: a clarification, correction, and evaluation of thermodynamic consistency

P Gao, H Zhang, Z Wu - Landscape Ecology - Springer

Objectives The first objective is to provide a clarification of and a correction to the

Wasserstein metric-based method. The second is to evaluate the method in terms of

thermodynamic consistency using different implementations. Methods Two implementation …

2021

Linear and Deep Order-Preserving Wasserstein Discriminant Analysis

online

Cover Image

Linear and Deep Order-Preserving Wasserstein Discriminant Analysis

by Su, Bing; Zhou, Jiahuan; Wen, Ji-Rong ; More...

IEEE transactions on pattern analysis and machine intelligence, 01/2021, Volume PP

Supervised dimensionality reduction for sequence data learns a transformation that maps the observations in sequences onto a low-dimensional subspace by...

Article PDF Download PDF BrowZine PDF Icon

Journal ArticleFull Text Online

Related articles All 6 versions

Bridging Bayesian and Minimax Mean Square Error Estimation via Wasserstein Distributionally Robust Optimization
Nguyen, Viet Anh; Shafieezadeh-Abadeh, Soroosh; Kuhn, Daniel; Esfahani, Peyman Mohajerin. arXiv.org; Ithaca, Jan 27, 2021.

 Cited by 19 Related articles All 8 versions



Wasserstein Convergence Rate for Empirical Measures of Markov Chains

online  OPEN ACCESS

Wasserstein Convergence Rate for Empirical Measures of Markov Chains

by Riekert, Adrian

01/2021

We consider a Markov chain on $\mathbb{R}^d$ with invariant measure $\mu$. We are interested in the rate of convergence of the empirical measures towards the...

Journal ArticleFull Text Online
Cited by 2 Related articles All 2 versions


[PDF] arxiv.org

Wasserstein inequality and minimal Green energy on compact manifolds

S Steinerberger - Journal of Functional Analysis, 2021 - Elsevier

Let $M$ be a smooth, compact $d$-dimensional manifold, $d \ge 3$, without boundary and let $G: M \times M \to \mathbb{R} \cup \{\infty\}$ denote the Green's function of the Laplacian $-\Delta$ (normalized to have mean value 0). We prove a bound on the cost of transporting Dirac measures in $\{x_1, \dots, x_n\} \subset M$ to …

  Cited by 4 Related articles All 2 versions


MR4198574 Prelim Liu, Jialin; Yin, Wotao; Li, Wuchen; Chow, Yat Tin; Multilevel Optimal Transport: A Fast Approximation of Wasserstein-1 Distances. SIAM J. Sci. Comput. 43 (2021), no. 1, A193–A220. 49Q22 (49M25 49M29 90C90)

Review PDF Clipboard Journal Article

[PDF] arxiv.org

Multilevel optimal transport: a fast approximation of Wasserstein-1 distances

J Liu, W Yin, W Li, YT Chow - SIAM Journal on Scientific Computing, 2021 - SIAM

We propose a fast algorithm for the calculation of the Wasserstein-1 distance, which is a

particular type of optimal transport distance with transport cost homogeneous of degree one.

Our algorithm is built on multilevel primal-dual algorithms. Several numerical examples and …

 6 Related articles All 6 versions
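The paper above is about fast approximation of Wasserstein-1. For tiny discrete problems the exact value can still be obtained by solving the Kantorovich transport linear program directly, which gives a useful reference point; a SciPy sketch (not the authors' multilevel scheme) follows.

import numpy as np
from scipy.optimize import linprog

def w1_exact(mu, nu, cost):
    # Exact Wasserstein-1 between discrete measures mu, nu (each summing to 1)
    # by solving the Kantorovich LP: min <cost, P> s.t. P 1 = mu, P^T 1 = nu, P >= 0.
    m, n = len(mu), len(nu)
    A_eq = np.zeros((m + n, m * n))
    for i in range(m):                     # row sums: sum_j P[i, j] = mu[i]
        A_eq[i, i * n:(i + 1) * n] = 1.0
    for j in range(n):                     # column sums: sum_i P[i, j] = nu[j]
        A_eq[m + j, j::n] = 1.0
    res = linprog(cost.reshape(-1), A_eq=A_eq, b_eq=np.concatenate([mu, nu]),
                  bounds=(0, None))
    return res.fun

x = np.array([0.0, 1.0, 2.0, 3.0])
mu = np.array([0.4, 0.1, 0.1, 0.4])
nu = np.array([0.1, 0.4, 0.4, 0.1])
cost = np.abs(x[:, None] - x[None, :])    # |x_i - x_j|: cost homogeneous of degree one
print(w1_exact(mu, nu, cost))             # 0.6 for this example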

<——2021——2021——  40—

MR4197408 Prelim Becker, Simon; Li, Wuchen; Quantum Statistical Learning via Quantum Wasserstein Natural Gradient. J. Stat. Phys. 182 (2021), no. 1, 7. 81P45 (53B12 81P50)

Review PDF Clipboard Journal Article

[HTML] springer.com

[HTML] Quantum statistical learning via Quantum Wasserstein natural gradient

S Becker, W Li - Journal of Statistical Physics, 2021 - Springer

In this article, we introduce a new approach towards the statistical learning problem $\mathrm{argmin}_{\rho(\theta) \in \mathcal{P}_{\theta}} W_{Q}^{2}(\rho_{\star}, \rho(\theta))$ to approximate a target quantum state $\rho_{\star}$ by a set of …

 Cited by 2 Related articles All 9 versions


2021

MR4180150 Prelim Çelik, Türkü Özlüm; Jamneshan, Asgar; Montúfar, Guido; Sturmfels, Bernd; Venturello, Lorenzo; Wasserstein distance to independence models. J. Symbolic Comput. 104 (2021), 855–873. 62R01 (14Q15)

Review PDF Clipboard Journal Article

[PDF] arxiv.org

Wasserstein distance to independence models

TÖ Çelik, A Jamneshan, G Montúfar, B Sturmfels… - Journal of Symbolic …, 2021 - Elsevier

An independence model for discrete random variables is a Segre-Veronese variety in a

probability simplex. Any metric on the set of joint states of the random variables induces a

Wasserstein metric on the probability simplex. The unit ball of this polyhedral norm is dual to …

Cited by 8 Related articles All 4 versions

Linear and Deep Order-Preserving Wasserstein Discriminant Analysis.

By: Su, Bing; Zhou, Jiahuan; Wen, Ji-Rong; et al.

IEEE transactions on pattern analysis and machine intelligence  Volume: ‏ PP     Published: ‏ 2021-Jan-12 (Epub 2021 Jan 12)

  Related articles All 5 versions

 

Graph Representation Learning with Wasserstein Barycenters

ieeexplore.ieee.org › iel7

by E Simou · 2020 — Vol. 7, 2021, p. 17. node2coords: Graph Representation Learning with Wasserstein Barycenters. Effrosyni Simou, Dorina Thanou, and Pascal Frossard. Abstract—In ...

node2coords: Graph Representation Learning with Wasserstein Barycenters

By: Simou, Effrosyni; Thanou, Dorina; Frossard, Pascal

IEEE TRANSACTIONS ON SIGNAL AND INFORMATION PROCESSING OVER NETWORKS  Volume: ‏ 7   Pages: ‏ 17-29   Published: ‏ 2021



node2coords: Graph Representation Learning with Wasserstein Barycenters

Authors: Effrosyni Simou, Dorina Thanou, Pascal Frossard
Summary: In order to perform network analysis tasks, representations that capture the most relevant information in the graph structure are needed. However, existing methods learn representations that cannot be interpreted in a straightforward way and that are relatively unstable to perturbations of the graph structure. We address these two limitations by proposing node2coords, a representation learning algorithm for graphs, which learns simultaneously a low-dimensional space and coordinates for the nodes in that space. The patterns that span the low dimensional space reveal the graph's most important structural information. The coordinates of the nodes reveal the proximity of their local structure to the graph structural patterns. We measure this proximity with Wasserstein distances that permit to take into account the properties of the underlying graph. Therefore, we introduce an autoencoder that employs a linear layer in the encoder and a novel Wasserstein barycentric layer at the decoder. Node connectivity descriptors, which capture the local structure of the nodes, are passed through the encoder to learn a small set of graph structural patterns. In the decoder, the node connectivity descriptors are reconstructed as Wasserstein barycenters of the graph structural patterns. The optimal weights for the barycenter representation of a node's connectivity descriptor correspond to the coordinates of that node in the low-dimensional space. Experimental results demonstrate that the representations learned with node2coords are interpretable, lead to node embeddings that are stable to perturbations of the graph structure and achieve competitive or superior results compared to state-of-the-art unsupervised methods in node classification.
Article, 2021
Publication:IEEE Transactions on Signal and Information Processing over Networks, 7, 2021, 17
Publisher:2021

 

A New Data-Driven Distributionally Robust Portfolio Optimization Method Based on Wasserstein Ambiguity Set

By: Du, Ningning; Liu, Yankui; Liu, Ying

IEEE ACCESS  Volume: ‏ 9   Pages: ‏ 3174-3194   Published: ‏ 2021



[PDF] arxiv.org

The isometry group of Wasserstein spaces: the Hilbertian case

GP Gehér, T Titkos, D Virosztek - arXiv preprint arXiv:2102.02037, 2021 - arxiv.org

Motivated by Kloeckner's result on the isometry group of the quadratic Wasserstein space $\mathcal{W}_2(\mathbb{R}^n)$, we describe the isometry group $\mathrm{Isom}(\mathcal{W}_p(E))$ for all parameters $0 < p < \infty$ and for all separable …

  The isometry group of Wasserstein spaces: the Hilbertian case

G Pál Gehér, T Titkos, D Virosztek - arXiv e-prints, 2021 - ui.adsabs.harvard.edu


 


[HTML] mdpi.com

Panchromatic Image Super-Resolution Via Self Attention-Augmented Wasserstein Generative Adversarial Network

J Du, K Cheng, Y Yu, D Wang, H Zhou - Sensors, 2021 - mdpi.com

Panchromatic (PAN) images contain abundant spatial information that is useful for earth

observation, but always suffer from low-resolution (LR) due to the sensor limitation and large-

scale view field. The current super-resolution (SR) methods based on traditional attention …

    Related articles All 7 versions 


Fan, Xiequan; Ma, Xiaohui

On the Wasserstein distance for a martingale central limit theorem. (English) Zbl 07287580

Stat. Probab. Lett. 167, Article ID 108892, 6 p. (2020).

MSC:  60G42 60E15 60F25

Times Cited: 3


DISTRIBUTED COMPUTATION OF THE WASSERSTEIN BARYCENTER

DM Dvinskikh - soc-phys.ipu.ru

Quantitative models and methods in the study of complex networks … Dvinskikh D. M. (Moscow Institute of Physics and Technology, Moscow; Skolkovo Institute of Science and Technology, Moscow) … We define the entropy-regularized Wasserstein distance generated by …

  Related articles 

[Russian  Distributed computation of Vaserstein barycenters]


2021  [PDF] arxiv.org

A short proof on the rate of convergence of the empirical measure for the Wasserstein distance

V Divol - arXiv preprint arXiv:2101.08126, 2021 - arxiv.org

We provide a short proof that the Wasserstein distance between the empirical measure of a

n-sample and the estimated measure is of order n^-(1/d), if the measure has a lower and

upper bounded density on the d-dimensional flat torus. Subjects: Statistics Theory (math. ST) …

  All 4 versions 

<——2021——2021——  50—

[PDF] arxiv.org

Wasserstein Convergence Rate for Empirical Measures of Markov Chains

A Riekert - arXiv preprint arXiv:2101.06936, 2021 - arxiv.org

We consider a Markov chain on $\mathbb {R}^ d $ with invariant measure $\mu $. We are

interested in the rate of convergence of the empirical measures towards the invariant

measure with respect to the $1 $-Wasserstein distance. The main result of this article is a …

  All 2 versions 


[PDF] arxiv.org

Multilevel optimal transport: a fast approximation of Wasserstein-1 distances

J Liu, W Yin, W Li, YT Chow - SIAM Journal on Scientific Computing, 2021 - SIAM

We propose a fast algorithm for the calculation of the Wasserstein-1 distance, which is a

particular type of optimal transport distance with transport cost homogeneous of degree one.

Our algorithm is built on multilevel primal-dual algorithms. Several numerical examples and …

  4 Related articles All 6 versions


[PDF] iop.org

Optimization Research on Abnormal Diagnosis of Transformer Voiceprint Recognition based on Improved Wasserstein GAN

K Zhu, H Ma, J Wang, C Yu, C Guo… - Journal of Physics …, 2021 - iopscience.iop.org

… vol 23(1). p 280–287. [17] Gulrajani, Ishaan, et al. 2017. Improved Training of Wasserstein GANs. http://arxiv.org/abs/1704.00028 [18] Arjovsky, Martin, S. Chintala, and Bottou, Léon. 2017. Wasserstein GAN. https://arxiv.org/abs/1701.0

Transformer is an important infrastructure equipment of power system, and fault monitoring is of great significance to its operation and maintenance, which has received wide attention and much research. However, the existing methods at home and abroad are based on …

Cited by 1 Related articles All 2 versions

2021

Continual Learning of Generative Models with Limited Data: From Wasserstein-1 Barycenter to Adaptive Coalescence

M Dedeoglu, S Lin, Z Zhang, J Zhang - arXiv preprint arXiv:2101.09225, 2021 - arxiv.org

Learning generative models is challenging for a network edge node with limited data and

computing power. Since tasks in similar environments share model similarity, it is plausible

to leverage pre-trained generative models from the cloud or other edge nodes. Appealing to …

  All 2 versions 


OPTIMAL TRANSPORT ALGORITHMS AND WASSERSTEIN BARYCENTERS

OY Kovalenko - INTERNATIONAL PROGRAM COMMITTEE, 2021 - pdmu.univ.kiev.ua

The work considers the question of finding the optimal algorithm that will be used to solve the

problem of finding Wasserstein's distance. The relevance of the research topic is that today …

Related articles 

2021

Supervised Tree-Wasserstein Distance

by Takezawa, Yuki; Sato, Ryoma; Yamada, Makoto

01/2021

To measure the similarity of documents, the Wasserstein distance is a powerful tool, but it requires a high computational cost. Recently, for fast computation...

Journal Article  Full Text Online

arXiv:2101.11520 [pdf, other] cs.LG stat.ML
Supervised Tree-Wasserstein Distance
Authors: Yuki Takezawa, Ryoma Sato, Makoto Yamada
Abstract: To measure the similarity of documents, the Wasserstein distance is a powerful tool, but it requires a high computational cost. Recently, for fast computation of the Wasserstein distance, methods for approximating the Wasserstein distance using a tree metric have been proposed. These tree-based methods allow fast comparisons of a large number of documents; however, they are unsupervised and do not…  More
Submitted 27 January, 2021; originally announced January 2021.
istance, methods for approximating the Wasserstein distance using a tree metric have been …

Cited by 7 Related articles All 8 versions 
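For background on the tree-based approximation mentioned here: once a tree metric is fixed, the 1-Wasserstein distance between two distributions on the tree's nodes reduces to a weighted sum of mass differences over edges. The sketch below implements that closed form on a hand-built tree; the supervised learning of the tree in the paper is not reproduced.

def tree_wasserstein(parent, edge_weight, mu, nu):
    # W_T(mu, nu) = sum over edges e of w_e * |mu(subtree below e) - nu(subtree below e)|,
    # the closed form of the 1-Wasserstein distance for a tree metric.
    # parent[v] is the parent of node v (None for the root 0), edge_weight[v]
    # is the length of the edge (v, parent[v]); nodes are numbered so that
    # parent[v] < v, which lets us accumulate subtree masses bottom-up.
    sub_mu, sub_nu = list(mu), list(nu)
    total = 0.0
    for v in range(len(parent) - 1, 0, -1):    # children before parents; skip the root
        total += edge_weight[v] * abs(sub_mu[v] - sub_nu[v])
        sub_mu[parent[v]] += sub_mu[v]
        sub_nu[parent[v]] += sub_nu[v]
    return total

# Tree: node 0 is the root, nodes 1 and 2 hang off 0, nodes 3 and 4 hang off 1.
parent      = [None, 0, 0, 1, 1]
edge_weight = [0.0, 1.0, 2.0, 1.0, 1.0]
mu = [0.0, 0.0, 0.5, 0.5, 0.0]
nu = [0.2, 0.0, 0.0, 0.0, 0.8]
print(tree_wasserstein(parent, edge_weight, mu, nu))   # 2.6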

Automatic Visual Inspection of Rare Defects: A Framework based on GP-WGAN and Enhanced...
by Jalayer, Masoud; Jalayer, Reza; Kaboli, Amin; More...
2021 IEEE International Conference on Industry 4.0, Artificial Intelligence, and Communications Technology (IAICT), 07/2021
A current trend in industries such as semiconductors and foundry is to shift their visual inspection processes to Automatic Visual Inspection (AVI) systems, to...
Conference Proceeding  Full Text Online

 [PDF] mdpi.com

Unpaired Image Denoising via Wasserstein GAN in Low-Dose CT Image with Multi-Perceptual Loss and Fidelity Loss

Z Yin, K Xia, Z He, J Zhang, S Wang, B Zu - Symmetry, 2021 - mdpi.com

The use of low-dose computed tomography (LDCT) in medical practice can effectively reduce the radiation risk of patients, but it may increase noise and artefacts, which can compromise diagnostic information. The methods based on deep learning can effectively …

  Related articles 

Unpaired Image Denoising via Wasserstein GAN in Low-Dose CT Image with Multi-Perceptual Loss and Fidelity Loss. Symmetry 2021, 13, 126

Z Yin, K Xia, Z He, J Zhang, S Wang, B Zu - 2021 - search.proquest.com

The use of low-dose computed tomography (LDCT) in medical practice can effectively reduce the radiation risk of patients, but it may increase noise and artefacts, which can compromise diagnostic information. The methods based on deep learning can effectively …


arXiv:2101.09225 [pdf, other] cs.LG eess.IV
Continual Learning of Generative Models with Limited Data: From Wasserstein-1 Barycenter to Adaptive Coalescence
Authors: Mehmet Dedeoglu, Sen Lin, Zhaofeng Zhang, Junshan Zhang
Abstract: Learning generative models is challenging for a network edge node with limited data and computing power. Since tasks in similar environments share model similarity, it is plausible to leverage pre-trained generative models from the cloud or other edge nodes. Appealing to optimal transport theory tailored towards Wasserstein-1 generative adversarial networks (WGAN), this study aims to develop a fra… 
More
Submitted 22 January, 2021; originally announced January 2021.

 Cited by 1 Related articles All 3 versions 

2021  see 2020

Generalized Wasserstein Dice Score, Distributionally Robust Deep Learning, and Ranger for brain tumor segmentation: BraTS 2020 challenge
Lucas Fidon; Ourselin, Sebastien; Vercauteren, Tom. arXiv.org; Ithaca, Jan 25, 2021.


Cited by 33 Related articles All 6 versions

<——2021——–2021——–60—


MR4206077 Prelim Ren, Panpan; Wang, Feng-Yu; Exponential convergence in entropy and Wasserstein for McKean-Vlasov SDEs. Nonlinear Anal. 206 (2021), 112259. 60B05 (60B10)

Review PDF Clipboard Journal Article


Wasserstein metric-based Boltzmann entropy of a landscape mosaic: a clarification, correction, and evaluation of thermodynamic consistency

P Gao, H Zhang, Z Wu - Landscape Ecology, 2021 - Springer

Objectives The first objective is to provide a clarification of and a correction to the

Wasserstein metric-based method. The second is to evaluate the method in terms of

thermodynamic consistency using different implementations. Methods Two implementation …

Cited by 5 Related articles All 5 versions


2021 see 2022
Dynamic Topological Data Analysis for Brain Networks via Wasserstein...
by Chung, Moo K; Huang, Shih-Gu; Carroll, Ian C ; More...
12/2021
We present the novel Wasserstein graph clustering for dynamically changing graphs. The Wasserstein clustering penalizes the topological discrepancy between...
Journal Article  Full Text Online

2021 see 2020    [PDF] mdpi.com

Unpaired Image Denoising via Wasserstein GAN in Low-Dose CT Image with Multi-Perceptual Loss and Fidelity Loss

Z Yin, K Xia, Z He, J Zhang, S Wang, B Zu - Symmetry, 2021 - mdpi.com

The use of low-dose computed tomography (LDCT) in medical practice can effectively

reduce the radiation risk of patients, but it may increase noise and artefacts, which can

compromise diagnostic information. The methods based on deep learning can effectively …

  Related articles 

Unpaired Image Denoising via Wasserstein GAN in Low-Dose CT Image with Multi-Perceptual Loss and Fidelity Loss. Symmetry 2021, 13, 126

Z Yin, K Xia, Z He, J Zhang, S Wang, B Zu - 2021 - search.proquest.com

The use of low-dose computed tomography (LDCT) in medical practice can effectively

reduce the radiation risk of patients, but it may increase noise and artefacts, which can

compromise diagnostic information. The methods based on deep learning can effectively …


Low-Illumination Image Enhancement in the Space Environment Based on the DC-WGAN Algorithm

M Zhang, Y Zhang, Z Jiang, X Lv, C Guo - Sensors, 2021 - mdpi.com

Owing to insufficient illumination of the space station, the image information collected by the

intelligent robot will be degraded, and it will not be able to accurately identify the tools

required for the robot's on-orbit maintenance. This situation increases the difficulty of the …

  Related articles All 6 versions 

SENSORS  Volume: ‏ 21   Issue: ‏ 1     Article Number: 286   Published: ‏ JAN 2021

Cited by 4 Related articles All 10 versions

Distributionally robust chance-constrained programs with right-hand side uncertainty under Wasserstein ambiguity

N Ho-Nguyen, F Kılınç-Karzan, S Küçükyavuz… - Mathematical …, 2021 - Springer

We consider exact deterministic mixed-integer programming (MIP) reformulations of

distributionally robust chance-constrained programs (DR-CCP) with random right-hand

sides over Wasserstein ambiguity sets. The existing MIP formulations are known to have …

  Cited by 7 Related articles All 5 versions


Convergence in Wasserstein Distance for Empirical Measures of Semilinear SPDEs

by Wang, Feng-Yu
01/2021
The convergence rate in Wasserstein distance is estimated for the empirical measures of symmetric semilinear SPDEs. Unlike in the finite-dimensional case that...
Journal Article  Full Text Online

arXiv:2102.00361 [pdf, ps, other] math.PR
Convergence in Wasserstein Distance for Empirical Measures of Semilinear SPDEs
Authors: Feng-Yu Wang
Abstract: The convergence rate in Wasserstein distance is estimated for the empirical measures of symmetric semilinear SPDEs. Unlike in the finite-dimensional case that the convergence is of algebraic order in time, in the present situation the convergence is of log order with a power given by eigenvalues of the underlying linear operator.
Submitted 30 January, 2021; originally announced February 2021.
Comments: 16 pages

Cited by 4 Related articles All 2 versions 

arXiv:2102.00356 [pdf, ps, other] math.ST math.PR
Measuring association with Wasserstein distances
Authors: Johannes Wiesel
Abstract: Let $\pi \in \Pi(\mu, \nu)$ be a coupling between two probability measures $\mu$ and $\nu$ on a Polish space. In this article we propose and study a class of nonparametric measures of association between $\mu$ and $\nu$. The analysis is based on the Wasserstein distance between $\nu$ and the disintegration $\pi_{x_1}$ of $\pi$ with respect to the first coordinate. We also establish basic statistical properties of this ne… More

Submitted 30 January, 2021; originally announced February 2021.

[PDF] arxiv.org

Measuring association with Wasserstein distances

J Wiesel - arXiv preprint arXiv:2102.00356, 2021 - arxiv.org

Let $\pi\in\Pi (\mu,\nu) $ be a coupling between two probability measures $\mu $ and $\nu $

on a Polish space. In this article we propose and study a class of nonparametric measures of

association between $\mu $ and $\nu $. The analysis is based on the Wasserstein distance …

 Cited by 2 All 2 vers

 MR4474563

MR4206692 Prelim Petersen, Alexander; Liu, Xi; Divani, Afshin A.; Wasserstein F-tests and confidence bands for the Fréchet regression of density response curves. Ann. Statist. 49 (2021), no. 1, 590–611. 62J99 (62F03 62F05 62F12 62F25)

Review PDF Clipboard Journal Article  Zbl 07319879


  MR4213022 Prelim Steinerberger, Stefan; Wasserstein distance, Fourier series and applications. Monatsh. Math. 194 (2021), no. 2, 305–338. 11L03 (35B05 42A05 42A16 49Q20)

Review PDF Clipboard Journal Article

Wasserstein distance, Fourier series and applications

By: Steinerberger, Stefan

MONATSHEFTE FUR MATHEMATIK    

Early Access: JAN 2021

Cited by 33 Related articles All 3 versions

<——2021——2021——  70—


node2coords: Graph Representation Learning with Wasserstein Barycenters

By: Simou, Effrosyni; Thanou, Dorina; Frossard, Pascal

IEEE TRANSACTIONS ON SIGNAL AND INFORMATION PROCESSING OVER NETWORKS  Volume: ‏ 7   Pages: ‏ 17-29   Published: ‏ 2021

A New Data-Driven Distributionally Robust Portfolio Optimization Method Based on Wasserstein Ambiguity Set

By: Du, Ningning; Liu, Yankui; Liu, Ying

IEEE ACCESS  Volume: ‏ 9   Pages: ‏ 3174-3194   Published: ‏ 2021


[PDF] auburn.edu

Efficient and Robust Classification for Positive Definite Matrices with Wasserstein Metric

J Cui - 2021 - etd.auburn.edu

Riemannian geometry methods are widely used to classify SPD (Symmetric Positives-

Definite) matrices, such as covariances matrices of brain-computer interfaces. Common

Riemannian geometry classification methods are based on Riemannian distance to …

Related articles 

[PDF] arxiv.org

Wasserstein F-tests and confidence bands for the Fréchet regression of density response curves

A Petersen, X Liu, AA Divani - Annals of Statistics, 2021 - projecteuclid.org

Data consisting of samples of probability density functions are increasingly prevalent,

necessitating the development of methodologies for their analysis that respect the inherent

nonlinearities associated with densities. In many applications, density curves appear as …

 Cited by 8 Related articles All 4 versions

 [PDF] arxiv.org

Sufficient Condition for Rectifiability Involving Wasserstein Distance $W_2$

D Dąbrowski - The Journal of Geometric Analysis, 2021 - Springer

Abstract: A Radon measure $\mu$ is n-rectifiable if it is absolutely continuous with respect to $\mathcal{H}^n$ and $\mu$-almost all of $\mathrm{supp}\,\mu$ can be covered by Lipschitz images of $\mathbb{R}^n$. In this paper we give two sufficient conditions for …

  Cited by 4 Related articles All 3 versions


2021


[PDF] mdpi.com

WRGAN: Improvement of RelGAN with Wasserstein Loss for Text Generation

Z Jiao, F Ren - Electronics, 2021 - mdpi.com

Generative adversarial networks (GANs) were first proposed in 2014, and have been widely

used in computer vision, such as for image generation and other tasks. However, the GANs

used for text generation have made slow progress. One of the reasons is that the …

  

[PDF] uni-bielefeld.de

Exponential convergence in entropy and Wasserstein for McKean–Vlasov SDEs

P Ren, FY Wang - Nonlinear Analysis, 2021 - Elsevier

The following type of exponential convergence is proved for (non-degenerate or

degenerate) McKean–Vlasov SDEs: $W_2(\mu_t, \mu_\infty)^2 + \mathrm{Ent}(\mu_t|\mu_\infty) \le c e^{-\lambda t} \min\{W_2(\mu_0, \mu_\infty)^2, \mathrm{Ent}(\mu_0|\mu_\infty)\}$, $t \ge 1$, where $c, \lambda > 0$ are constants, $\mu_t$ is the distribution of the solution …

 Related articles All 6 versions

Journal ArticleFull Text Online

Cited by 6 Related articles All 6 versions

The measurement of relations on belief functions based on the Kantorovich problem and the Wasserstein metric

AG Bronevich, IN Rozenberg - International Journal of Approximate …, 2021 - Elsevier

In this paper, we show how the Kantorovich problem appears in many constructions in the

theory of belief functions. We demonstrate this on several relations on belief functions such

as inclusion, equality and intersection of belief functions. Using the Kantorovich problem we …

  All 2 versions

MR4209711 Prelim Bronevich, Andrey G.; Rozenberg, Igor N.; The measurement of relations on belief functions based on the Kantorovich problem and the Wasserstein metric. Internat. J. Approx. Reason. 131 (2021), 108–135. 68T37

Review PDF Clipboard Journal Article

2021 [PDF] googleapis.com

Object shape regression using wasserstein distance

J Sun, SKP Kumar, R Bala - US Patent 10,943,352, 2021 - Google Patents

One embodiment can provide a system for detecting outlines of objects in images. During

operation, the system receives an image that includes at least one object, generates a

random noise signal, and provides the received image and the random noise signal to a …

 Related articles All 4 versions 

  

[PDF] arxiv.org

Dimension-free Wasserstein contraction of nonlinear filters

N Whiteley - Stochastic Processes and their Applications, 2021 - Elsevier

For a class of partially observed diffusions, conditions are given for the map from the initial

condition of the signal to filtering distribution to be contractive with respect to Wasserstein

distances, with rate which does not necessarily depend on the dimension of the state-space …

   Related articles All 2 versions    All 6 versions

MR4222401 Prelim Whiteley, Nick; Dimension-free Wasserstein contraction of nonlinear filters. Stochastic Process. Appl. 135 (2021), 31–50.

Review PDF Clipboard Journal Article

2021  online

Dimension-free Wasserstein contraction of nonlinear filters
by Whiteley, Nick
Stochastic processes and their applications, 05/2021, Volume 135
For a class of partially observed diffusions, conditions are given for the map from the initial condition of the signal to filtering distribution to be...
Article PDF Download PDF 

Journal ArticleFull Text Online

TOCHASTIC PROCESSES AND THEIR APPLICATIONS  Volume: ‏ 135   Pages: ‏ 31-50   Published: ‏ MAY 2021

Cited by 1 Related articles All 5 versions

<——2021——2021——  80—


 

Artificial Neural Network with Histogram Data Time Series Forecasting: A Least Squares Approach Based on Wasserstein Distance

P Rakpho, W Yamaka, K Zhu - Behavioral Predictive Modeling in …, 2021 - Springer

This paper aims to predict the histogram time series, and we use the high-frequency data

with 5-min to construct the Histogram data for each day. In this paper, we apply the Artificial

Neural Network (ANN) to Autoregressive (AR) structure and introduce the AR—ANN model …

  Related articles All 4 versions 

Zbl 07442299
Related articles All 4 versions


[PDF] arxiv.org

From Smooth Wasserstein Distance to Dual Sobolev Norm: Empirical Approximation and Statistical Applications

S Nietert, Z Goldfeld, K Kato - arXiv preprint arXiv:2101.04039, 2021 - arxiv.org

Statistical distances, ie, discrepancy measures between probability distributions, are

ubiquitous in probability theory, statistics and machine learning. To combat the curse of

dimensionality when estimating these distances from data, recent work has proposed …

  Related articles All 2 versions 


[PDF] arxiv.org

Convergence and finite sample approximations of entropic regularized Wasserstein distances in Gaussian and RKHS settings

MH Quang - arXiv preprint arXiv:2101.01429, 2021 - arxiv.org

This work studies the convergence and finite sample approximations of entropic regularized

Wasserstein distances in the Hilbert space setting. Our first main result is that for Gaussian

measures on an infinite-dimensional Hilbert space, convergence in the 2-Sinkhorn …
Related articles
 All 3 versions 

  

arXiv:2102.02037 [pdf, ps, other] math.MG math-ph math.FA math.PR
The isometry group of Wasserstein spaces: the Hilbertian case
Authors: György Pál Gehér, Tamás Titkos, Dániel Virosztek
Abstract: Motivated by Kloeckner's result on the isometry group of the quadratic Wasserstein space $\mathcal{W}_2(\mathbb{R}^n)$, we describe the isometry group $\mathrm{Isom}(\mathcal{W}_p(E))$ for all parameters $0 < p < \infty$ and for all separable real Hilbert spaces $E$. In fact, the $0 < p < 1$ case is a consequen…
Submitted 3 February, 2021; originally announced February 2021.
Comments: 30 pages, 2 figures
MSC Class: Primary: 54E40; 46E27; Secondary: 60A10; 60B05

The isometry group of Wasserstein spaces: the Hilbertian case

G Pál Gehér, T Titkos, D Virosztek - arXiv e-prints, 2021 - ui.adsabs.harvard.edu

Motivated by Kloeckner's result on the isometry group of the quadratic Wasserstein space $\mathcal{W}_2(\mathbb{R}^n)$, we describe the isometry group $\mathrm{Isom}(\mathcal{W}_p(E))$ for all parameters $0 < p < \infty$ and for all separable …

Related articles 
[CITATION] The isometry group of Wasserstein spaces: the Hilbertian case, manuscript

GP Gehér, T Titkos, D Virosztek - arXiv preprint arXiv:2102.02037, 2021

 Cited by 1


2021 see 2022

An Embedding Carrier-Free Steganography Method Based on Wasserstein GAN

X Yu, J Cui, M Liu - … Conference on Algorithms and Architectures for …, 2021 - Springer

… In this paper, we proposed a carrier-free steganography method based on Wasserstein

GAN. We segmented the target information and input it into the trained Wasserstein GAN, and …

arXiv:2102.01752 [pdf, other] cs.LG
Continuous Wasserstein-2 Barycenter Estimation without Minimax Optimization
Authors: Alexander Korotin, Lingxiao Li, Justin Solomon, Evgeny Burnaev
Abstract: Wasserstein barycenters provide a geometric notion of the weighted average of probability measures based on optimal transport. In this paper, we present a scalable algorithm to compute Wasserstein-2 barycenters given sample access to the input measures, which are not restricted to being discrete. While past approaches rely on entropic or quadratic regularization, we employ input convex neural netw…  More
Submitted 2 February, 2021; originally announced February 2021.

Continuous Wasserstein-2 Barycenter Estimation without Minimax Optimization

A Korotin, L Li, J Solomon, E Burnaev - arXiv preprint arXiv:2102.01752, 2021 - arxiv.org

Wasserstein barycenters provide a geometric notion of the weighted average of probability

measures based on optimal transport. In this paper, we present a scalable algorithm to compute Wasserstein-2 barycenters given sample access to the input measures …

Cited by 23 Related articles All 5 versions

Cited by 29 Related articles All 5 versions


A Data Enhancement Method for Gene Expression Profile Based on Improved WGAN-GP

by Zhu, Shaojun; Han, Fei
Neural Computing for Advanced Applications, 08/2021
A large number of gene expression profile datasets mainly exist in the fields of biological information and gene microarrays. Traditional classification...
Book Chapter  Full Text Online

2021 [PDF] arxiv.org

Asymptotics of smoothed Wasserstein distances

HB Chen, J Niles-Weed - Potential Analysis, 2021 - Springer

We investigate contraction of the Wasserstein distances on $\mathbb{R}^d$ under

Gaussian smoothing. It is well known that the heat semigroup is exponentially contractive

with respect to the Wasserstein distances on manifolds of positive curvature; however, on flat …

 


2021  [PDF] hrbcu.edu.cn

Deep Wasserstein Graph Discriminant Learning for Graph Classification

T Zhang, Y Wang, Z Cui, C Zhou… - … of the AAAI …, 2021 - ojs-aaai-ex4-oa-ex0-www-webvpn …

Graph topological structures are crucial to distinguish different-class graphs. In this work, we

propose a deep Wasserstein graph discriminant learning (WGDL) framework to learn

discriminative embeddings of graphs in Wasserstein-metric (W-metric) matching space. In …


[PDF] arxiv.org

Supervised Tree-Wasserstein Distance

Y Takezawa, R Sato, M Yamada - arXiv preprint arXiv:2101.11520, 2021 - arxiv.org

To measure the similarity of documents, the Wasserstein distance is a powerful tool, but it

requires a high computational cost. Recently, for fast computation of the Wasserstein

distance, methods for approximating the Wasserstein distance using a tree metric have been …

Cited by 3 Related articles All 8 versions

[PDF] arxiv.org

Measuring association with Wasserstein distances

J Wiesel - arXiv preprint arXiv:2102.00356, 2021 - arxiv.org

Let $\pi\in\Pi (\mu,\nu) $ be a coupling between two probability measures $\mu $ and $\nu $

on a Polish space. In this article we propose and study a class of nonparametric measures of

association between $\mu $ and $\nu $. The analysis is based on the Wasserstein distance  …
<——2021——2021——  90—— 

 

2021 see 2020

Wasserstein metric-based Boltzmann entropy of a landscape mosaic: a clarification, correction, and evaluation of thermodynamic consistency

P Gao, H Zhang, Z Wu - Landscape Ecology, 2021 - Springer

Objectives The first objective is to provide a clarification of and a correction to the

Wasserstein metric-based method. The second is to evaluate the method in terms of

thermodynamic consistency using different implementations. Methods Two implementation …

Cited by 6 Related articles All 5 versions

[PDF] arxiv.org

The ultrametric Gromov-Wasserstein distance

F Mémoli, A Munk, Z Wan, C Weitkamp - arXiv preprint arXiv:2101.05756, 2021 - arxiv.org

In this paper, we investigate compact ultrametric measure spaces which form a subset

$\mathcal {U}^ w $ of the collection of all metric measure spaces $\mathcal {M}^ w $. Similar

as for the ultrametric Gromov-Hausdorff distance on the collection of ultrametric spaces …

Cited by 7 Related articles All 3 versions

2021

Illumination-Invariant Flotation Froth Color Measuring via Wasserstein Distance-Based CycleGAN With Structure-Preserving Constraint

By: Liu, Jinping; He, Jiezhou; Xie, Yongfang; et al.

IEEE TRANSACTIONS ON CYBERNETICS  Volume: ‏ 51   Issue: ‏ 2   Pages: ‏ 839-852   Published: ‏ FEB 2021


Distributionally Robust Resilient Operation of Integrated Energy Distribution Systems Using Moment and Wasserstein Metric for Contingencies

M Shahidehpour, Y Zhou, Z Wei… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org

Extreme weather events pose a serious threat to energy distribution systems. We propose a

distributionally robust optimization model for the resilient operation of the integrated

electricity and heat energy distribution systems in extreme weather events. We develop a …

 

2021

[PDF] arxiv.org

Wasserstein distance to independence models

TÖ Çelik, A Jamneshan, G Montúfar, B Sturmfels… - Journal of Symbolic …, 2021 - Elsevier

An independence model for discrete random variables is a Segre-Veronese variety in a

probability simplex. Any metric on the set of joint states of the random variables induces a

Wasserstein metric on the probability simplex. The unit ball of this polyhedral norm is dual to …

ited by 7 Related articles All 4 versions


[PDF] arxiv.org

Projected Statistical Methods for Distributional Data on the Real Line with the Wasserstein Metric

M Pegoraro, M Beraha - arXiv preprint arXiv:2101.09039, 2021 - arxiv.org

We present a novel class of projected methods, to perform statistical analysis on a data set

of probability distributions on the real line, with the 2-Wasserstein metric. We focus in

particular on Principal Component Analysis (PCA) and regression. To define these models …

  All 5 versions 

2021

Wasserstein Contraction Bounds on Closed Convex Domains with Applications to Stochastic Adaptive Control

Lekang, T and Lamperski, A

60th IEEE Conference on Decision and Control (CDC)

2021 | 2021 60TH IEEE CONFERENCE ON DECISION AND CONTROL (CDC) , pp.366-371

This paper is motivated by the problem of quantitatively bounding the convergence of adaptive control methods for stochastic systems to a stationary distribution. Such bounds are useful for analyzing statistics of trajectories and determining appropriate step sizes for simulations. To this end, we extend a methodology from (unconstrained) stochastic differential equations (SDEs) which provides

Show more

Free Submitted Article From Repository  Full Text at Publisher

27 References Related records

[PDF] arxiv.org

A short proof on the rate of convergence of the empirical measure for the Wasserstein distance

V Divol - arXiv preprint arXiv:2101.08126, 2021 - arxiv.org

We provide a short proof that the Wasserstein distance between the empirical measure of a

n-sample and the estimated measure is of order n^-(1/d), if the measure has a lower and

upper bounded density on the d-dimensional flat torus. Subjects: Statistics Theory (math. ST) …

  Cited by 2 Related articles All 4 versions 
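As a rough numerical companion to the theme above (how fast the empirical measure of an n-sample approaches the underlying measure in Wasserstein distance), the snippet below compares empirical samples of Uniform(0,1) against a large reference sample with scipy.stats.wasserstein_distance, which computes the 1-Wasserstein distance in one dimension. This is only a one-dimensional toy check, not the torus setting of the paper, and the sample sizes are arbitrary:

import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
reference = rng.uniform(size=200_000)      # stands in for the true measure
for n in [100, 1_000, 10_000]:
    sample = rng.uniform(size=n)           # empirical measure of an n-sample
    print(n, wasserstein_distance(sample, reference))   # distance shrinks as n grows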


[PDF] arxiv.org

Learning High Dimensional Wasserstein Geodesics

S Liu, S Ma, Y Chen, H Zha, H Zhou - arXiv preprint arXiv:2102.02992, 2021 - arxiv.org

We propose a new formulation and learning strategy for computing the Wasserstein geodesic between two probability distributions in high dimensions. By applying the method of Lagrange multipliers to the dynamic formulation of the optimal transport (OT) problem, we …
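For context on what such learned geodesics approximate: in one dimension, the Wasserstein-2 geodesic (displacement interpolation) between two Gaussians N(m0, s0^2) and N(m1, s1^2) is again Gaussian, with mean (1-t)m0 + t m1 and standard deviation (1-t)s0 + t s1. A closed-form sketch of this fact, useful as a sanity check and entirely separate from the paper's high-dimensional method:

import numpy as np

def gaussian_w2_geodesic(m0, s0, m1, s1, t):
    # displacement interpolation between 1-D Gaussians along the W2 geodesic
    return (1 - t) * m0 + t * m1, (1 - t) * s0 + t * s1

for t in np.linspace(0.0, 1.0, 5):
    mean_t, std_t = gaussian_w2_geodesic(0.0, 1.0, 4.0, 0.5, t)
    print(f"t={t:.2f}  N({mean_t:.2f}, {std_t:.2f}^2)")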

[PDF] arxiv.org

From Smooth Wasserstein Distance to Dual Sobolev Norm: Empirical Approximation and Statistical Applications

S Nietert, Z Goldfeld, K Kato - arXiv preprint arXiv:2101.04039, 2021 - arxiv.org

Statistical distances, ie, discrepancy measures between probability distributions, are

ubiquitous in probability theory, statistics and machine learning. To combat the curse of

dimensionality when estimating these distances from data, recent work has proposed …

  Related articles All 2 versions 

<——2021——2021——  100——


[PDF] mdpi.com

Unpaired Image Denoising via Wasserstein GAN in Low-Dose CT Image with Multi-Perceptual Loss and Fidelity Loss

Z Yin, K Xia, Z He, J Zhang, S Wang, B Zu - Symmetry, 2021 - mdpi.com

The use of low-dose computed tomography (LDCT) in medical practice can effectively

reduce the radiation risk of patients, but it may increase noise and artefacts, which can

compromise diagnostic information. The methods based on deep learning can effectively …

  Related articles 

Unpaired Image Denoising via Wasserstein GAN in Low-Dose CT Image with Multi-Perceptual Loss and Fidelity Loss. Symmetry 2021, 13, 126

Z Yin, K Xia, Z He, J Zhang, S Wang, B Zu - 2021 - search.proquest.com

The use of low-dose computed tomography (LDCT) in medical practice can effectively

reduce the radiation risk of patients, but it may increase noise and artefacts, which can

compromise diagnostic information. The methods based on deep learning can effectively …


[PDF] arxiv.org

Rate of convergence for particles approximation of PDEs in Wasserstein space

M Germain, H Pham, X Warin - arXiv preprint arXiv:2103.00837, 2021 - arxiv.org

We prove a rate of convergence of order 1/N for the N-particle approximation of a second-

order partial differential equation in the space of probability measures, like the Master

equation or Bellman equation of mean-field control problem under common noise. The proof …

  Cited by 2 All 16 versions 


[PDF] arxiv.org

Convergence in Wasserstein Distance for Empirical Measures of Semilinear SPDEs

FY Wang - arXiv preprint arXiv:2102.00361, 2021 - arxiv.org

The convergence rate in Wasserstein distance is estimated for the empirical measures of

symmetric semilinear SPDEs. Unlike in the finite-dimensional case that the convergence is

of algebraic order in time, in the present situation the convergence is of log order with a …

  

2021 modified. See 2017, 2019

Sharp asymptotic and finite-sample rates of convergence of empirical measures in Wasserstein distance 

Jonathan Weed 1 Francis Bach 2 

1 MIT - Massachusetts Institute of Technology 

2 SIERRA - Statistical Machine Learning and Parsimony

DI-ENS - Département d'informatique de l'École normale supérieure, CNRS - Centre National de la Recherche Scientifique, Inria de Paris

[PS] uc.pt

[PS] … applied sciences 22 (4), 301-316.[24] Weed, J., & Bach, F.(2019). Sharp asymptotic and finite-sample rates of convergence of empirical measures in Wasserstein …

PE OLIVEIRA, N PICADO - surfaces - mat.uc.pt

Let M be a compact manifold of Rd. The goal of this paper is to decide, based on a sample of

points, whether the interior of M is empty or not. We divide this work in two main parts. Firstly,

under a dependent sample which may or may not contain some noise within, we …

  Related articles  View as HTML 

 

2021   [PDF] uni-bielefeld.de

Exponential convergence in entropy and Wasserstein for McKean–Vlasov SDEs

P Ren, FY Wang - Nonlinear Analysis, 2021 - Elsevier

The following type of exponential convergence is proved for (non-degenerate or degenerate) McKean–Vlasov SDEs: $W_2(\mu_t,\mu_\infty)^2 + \mathrm{Ent}(\mu_t|\mu_\infty) \le c\,e^{-\lambda t}\min\{W_2(\mu_0,\mu_\infty)^2,\ \mathrm{Ent}(\mu_0|\mu_\infty)\}$, $t\ge 1$, where $c,\lambda>0$ are constants, $\mu_t$ is the distribution of the solution …

Cited by 13 Related articles All 5 versions




2021
[PDF] arxiv.org

Differential inclusions in Wasserstein spaces: The Cauchy-Lipschitz framework

B Bonnet, H Frankowska - Journal of Differential Equations, 2021 - Elsevier

In this article, we propose a general framework for the study of differential inclusions in the

Wasserstein space of probability measures. Based on earlier geometric insights on the

structure of continuity equations, we define solutions of differential inclusions as absolutely …

  Cited by 2 Related articles All 6 versions

[PDF] arxiv.org

Multilevel optimal transport: a fast approximation of Wasserstein-1 distances

J Liu, W Yin, W Li, YT Chow - SIAM Journal on Scientific Computing, 2021 - SIAM

We propose a fast algorithm for the calculation of the Wasserstein-1 distance, which is a 

particular type of optimal transport distance with transport cost homogeneous of degree one. 

Our algorithm is built on multilevel primal-dual algorithms. Several numerical examples and …

  Cited by 6 Related articles All 6 versions

 Zbl 07303444
Cited by 8
Related articles
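At small scale, the Wasserstein-1 distance that multilevel solvers like the one above accelerate can be computed exactly as the Kantorovich linear program: minimize sum_ij C_ij P_ij over couplings P with prescribed row sums mu and column sums nu, where C_ij = |x_i - y_j|. A minimal reference sketch with scipy.optimize.linprog; the point locations and weights are arbitrary toy data:

import numpy as np
from scipy.optimize import linprog

x = np.array([0.0, 1.0, 2.0]); mu = np.array([0.5, 0.2, 0.3])
y = np.array([0.5, 2.5]);      nu = np.array([0.6, 0.4])
C = np.abs(x[:, None] - y[None, :])        # |x_i - y_j| transport costs

m, n = C.shape
A_eq = []
for i in range(m):                          # row sums of the coupling equal mu
    row = np.zeros(m * n); row[i * n:(i + 1) * n] = 1; A_eq.append(row)
for j in range(n):                          # column sums of the coupling equal nu
    col = np.zeros(m * n); col[j::n] = 1; A_eq.append(col)

res = linprog(C.ravel(), A_eq=np.array(A_eq), b_eq=np.concatenate([mu, nu]),
              bounds=(0, None), method="highs")
print("W1 =", res.fun)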

[PDF] arxiv.org

Continual Learning of Generative Models with Limited Data: From Wasserstein-1 Barycenter to Adaptive Coalescence

M Dedeoglu, S Lin, Z Zhang, J Zhang - arXiv preprint arXiv:2101.09225, 2021 - arxiv.org

Learning generative models is challenging for a network edge node with limited data and 

computing power. Since tasks in similar environments share model similarity, it is plausible 

to leverage pre-trained generative models from the cloud or other edge nodes. Appealing to …

Cited by 1 Related articles All 3 versions


 

2021

Sufficient Condition for Rectifiability Involving Wasserstein Distance $W_2$

D Dąbrowski - The Journal of Geometric Analysis, 2021 - Springer

Abstract: A Radon measure $\mu$ is $n$-rectifiable if it is absolutely continuous with respect to $\mathcal{H}^n$ and $\mu$-almost all of $\operatorname{supp}\mu$ can be covered by Lipschitz images of $\mathbb{R}^n$. In this paper we give two sufficient conditions for …

Cited by 4 Related articles All 3 versions 


Sufficient Condition for Rectifiability Involving Wasserstein Distance W-2

By: Dabrowski, Damian

JOURNAL OF GEOMETRIC ANALYSIS    

Early Access: JAN 2021.

<——2021——2021——  110——


2021

www.researchgate.net › publication › 347866685_The_...

Jan 22, 2021 — Abstract. We propose the Wasserstein-Fourier (WF) distance to measure the (dis)similarity between time series by quantifying the displaceme

Cover Image

 OPEN ACCESS

The Wasserstein-Fourier Distance for Stationary Time Series

by Cazelles, Elsa; Robert, Arnaud; Tobar, Felipe

IEEE transactions on signal processing, 2021, Volume 69

We propose the Wasserstein-Fourier (WF) distance to measure the (dis)similarity between time series by quantifying the displacement of their energy across...

Article PDF Download PDF BrowZine PDF Icon

Journal ArticleFull Text Online

 Preview 

Electronics Newsweekly, 03/2021

 Cite this item Email this item Save this item More actions

Researchers at University of Chile Release New Data on Circuits and Signal Processing 

(The Wasserstein-fourier Distance for Stationary Time Series)". 

Electronics Newsweekly (1944-1630), p. 529.

NewsletterCitation Online
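Following the description above (the Wasserstein-Fourier idea of comparing time series through their normalized power spectra), here is a rough sketch of that idea only; the Welch parameters, quantile grid, and test signals are my own choices, not the authors' implementation:

import numpy as np
from scipy.signal import welch

def w2_1d(support, p, q, grid=1000):
    # 2-Wasserstein distance between two discrete distributions on a common 1-D
    # support, via the quantile formula W2^2 = int_0^1 |F_p^{-1}(u) - F_q^{-1}(u)|^2 du
    u = (np.arange(grid) + 0.5) / grid
    idx_p = np.minimum(np.searchsorted(np.cumsum(p), u), len(support) - 1)
    idx_q = np.minimum(np.searchsorted(np.cumsum(q), u), len(support) - 1)
    return np.sqrt(np.mean((support[idx_p] - support[idx_q]) ** 2))

rng = np.random.default_rng(1)
t = np.arange(4096) / 100.0
x = np.sin(2 * np.pi * 5 * t) + 0.1 * rng.standard_normal(t.size)   # 5 Hz tone + noise
y = np.sin(2 * np.pi * 8 * t) + 0.1 * rng.standard_normal(t.size)   # 8 Hz tone + noise

freqs, px = welch(x, fs=100.0)
_,     py = welch(y, fs=100.0)
px, py = px / px.sum(), py / py.sum()       # normalize the spectra to probability vectors
print("Wasserstein-Fourier-style distance:", w2_1d(freqs, px, py))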

2021

Differential inclusions in Wasserstein spaces: The Cauchy ...

www.sciencedirect.com › science › article › abs › pii

Jan 15, 2021 — Differential inclusions in Wasserstein spaces: The Cauchy-Lipschitz ... framework for the study of differential inclusions in the Wasserstein ...

Cover Image  OPEN ACCESS

Differential inclusions in Wasserstein spaces: The Cauchy-Lipschitz framework

by Bonnet, Benoît; Frankowska, Hélène

Journal of Differential Equations, 01/2021, Volume 271

In this article, we propose a general framework for the study of differential inclusions in the Wasserstein space of probability measures. Based on earlier...

Article PDF Download PDF BrowZine PDF Icon

Journal ArticleFull Text Online

 

 

Petersen , Liu , Divani : Wasserstein $F$-tests and confidence ...

projecteuclid.org › euclid.aos

by A Petersen · 2021 · Cited by 3 — Wasserstein F-tests and confidence bands for the Fréchet regression of density response curves. Alexander Petersen, Xi Liu, and Afshin A.

Cover Image

Wasserstein F-tests and confidence bands for the Fréchet regression of density response curves

by Alexander Petersen; Xi Liu; Afshin A Divani

The Annals of statistics, 02/2021, Volume 49, Issue 1

Data consisting of samples of probability density functions are increasingly prevalent, necessitating the development of methodologies for their analysis that...

Journal ArticleFull Text Online

Zbl 07319879

[PDF] openreview.net

[PDF] GROMOV WASSERSTEIN

K Nguyen, S Nguyen, N Ho, T Pham, H Bui - openreview.net

Relational regularized autoencoder (RAE) is a framework to learn the distribution of data by

minimizing a reconstruction loss together with a relational regularization on the latent space.

A recent attempt to reduce the inner discrepancy between the prior and aggregated …

 

[2102.00356] Measuring association with Wasserstein distances

arxiv.org › math

Jan 31, 2021 — Let \pi\in \Pi(\mu,\nu) be a coupling between two probability measures \mu and \nu on a Polish space. In this article we propose and study a class of nonparametric measures of association between \mu and \nu.

 OPEN ACCESS

Measuring association with Wasserstein distances

by Wiesel, Johannes

01/2021

Let $\pi\in \Pi(\mu,\nu)$ be a coupling between two probability measures $\mu$ and $\nu$ on a Polish space. In this article we propose and study a class of...

Journal ArticleFull Text Online

Cited by 3 Related articles All 3 versions

Wasserstein Autoencoders with Mixture of Gaussian Priors for ...

uwspace.uwaterloo.ca › handle

by A Ghabussi · 2021 — Wasserstein Autoencoders with Mixture of Gaussian Priors for Stylized Text Generation · View/ Open · Date · Author · Metadata · Statistics · Abstract.


Wasserstein Autoencoders with Mixture of Gaussian Priors for Stylized Text Generation

by Ghabussi, Amirpasha

2021

Probabilistic text generation is an important application of Natural Language Processing (NLP). Variational autoencoders and Wasserstein autoencoders are two...

Dissertation/ThesisCitation Online

Book Chapter   Full Text Online

 Related articles All 3 versions


[HTML] tandfonline.com

Full View

Dissimilarity measure of local structure in inorganic crystals using Wasserstein distance to search for novel phosphors

S Takemura, T Takeda, T Nakanishi… - … and Technology of …, 2021 - Taylor & Francis

To efficiently search for novel phosphors, we propose a dissimilarity measure of local

structure using the Wasserstein distance. This simple and versatile method provides the

quantitative dissimilarity of a local structure around a center ion. To calculate the …

  All 7 versions


The isometry group of Wasserstein spaces: the Hilbertian case

G Pál Gehér, T Titkos, D Virosztek - arXiv e-prints, 2021 - ui.adsabs.harvard.edu

Motivated by Kloeckner's result on the isometry group of the quadratic Wasserstein space $\mathcal{W}_2(\mathbb{R}^n)$, we describe the isometry group $\mathrm{Isom}(\mathcal{W}_p(E))$ for all parameters $0<p<\infty$ and for all separable real Hilbert spaces $E$. In fact, the $0<p<1$ case is a consequence of our more general result: we prove that $\mathcal{W}_1(X)$ is isometrically rigid if $X$ is a complete separable metric space that satisfies the strict triangle inequality. Furthermore, we show that …

 OPEN ACCESS

The isometry group of Wasserstein spaces: the Hilbertian case

by Gehér, György Pál; Titkos, Tamás; Virosztek, Dániel

02/2021

Motivated by Kloeckner's result on the isometry group of the quadratic Wasserstein space $\mathcal{W}_2\left(\mathbb{R}^n\right)$, we describe the isometry...

Journal ArticleFull Text Online

 

[2102.00361] Convergence in Wasserstein Distance for ...

arxiv.org › math

Jan 31, 2021 — Abstract: The convergence rate in Wasserstein distance is estimated for the empirical measures of symmetric semilinear SPDEs. Unlike in the ...

 OPEN ACCESS

Convergence in Wasserstein Distance for Empirical Measures of Semilinear SPDEs

by Wang, Feng-Yu

01/2021

The convergence rate in Wasserstein distance is estimated for the empirical measures of symmetric semilinear SPDEs. Unlike in the finite-dimensional case that...

Cited by 4 Related articles All 2 versions

<——2021—–—2021—— –130——  


2021 see 2022

Rethinking Rotated Object Detection with Gaussian ...

arxiv.org › cs

by X Yang · 2021 — Specifically, the rotated bounding box is converted to a 2-D Gaussian distribution, which enables to approximate the indifferentiable rotational IoU induced loss by the Gaussian Wasserstein distance (GWD) which can be learned efficiently by gradient back-propagation.

 OPEN ACCESS

Rethinking Rotated Object Detection with Gaussian Wasserstein Distance Loss

by Yang, Xue; Yan, Junchi; Ming, Qi ; More...

01/2021

Boundary discontinuity and its inconsistency to the final detection metric have been the bottleneck for rotating detection regression loss design. In this...

Journal ArticleFull Text Online
Cited by 83
Related articles All 9 versions
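The closed form behind the Gaussian Wasserstein distance idea: the squared 2-Wasserstein distance between Gaussians N(m1, S1) and N(m2, S2) is ||m1 - m2||^2 + Tr(S1 + S2 - 2 (S2^{1/2} S1 S2^{1/2})^{1/2}). A minimal sketch that maps a rotated box (cx, cy, w, h, angle) to a Gaussian with half-extents as standard deviations; this box-to-Gaussian convention is an assumption for illustration, not necessarily the paper's exact choice:

import numpy as np
from scipy.linalg import sqrtm

def box_to_gaussian(cx, cy, w, h, theta):
    # rotated box -> (mean, covariance) of a 2-D Gaussian (half-extents used as std devs)
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    S = np.diag([w / 2.0, h / 2.0])
    return np.array([cx, cy], dtype=float), R @ S @ S @ R.T

def gaussian_w2(m1, S1, m2, S2):
    rS2 = np.real(sqrtm(S2))
    cross = np.real(sqrtm(rS2 @ S1 @ rS2))
    val = np.sum((m1 - m2) ** 2) + np.trace(S1 + S2 - 2 * cross)
    return np.sqrt(max(val, 0.0))

m1, S1 = box_to_gaussian(0, 0, 4, 2, 0.0)
m2, S2 = box_to_gaussian(1, 0, 4, 2, np.pi / 6)
print("GWD-style distance:", gaussian_w2(m1, S1, m2, S2))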

 Rethinking Rotated Object Detection with Gaussian Wasserstein Distance Loss

slideslive.com › rethinking-rotated-object-detection-with-...

Rethinking Rotated Object Detection with Gaussian Wasserstein Distance Loss ... computational biology, speech recognition, and robotics.

SlidesLive · 

Jul 19, 2021


online  OPEN ACCESS

Necessary Optimality Conditions for Optimal Control Problems in Wasserstein Spaces

by Bonnet, Benoît; Frankowska, Hélène

01/2021

In this article, we derive first-order necessary optimality conditions for a constrained optimal control problem formulated in the Wasserstein space of...

Journal ArticleFull Text Online

Zbl 07498406


2021

Tighter expected generalization error bounds via Wasserstein ...

arxiv.org › stat

by B Rodríguez-Gálvez · 2021 — In this work, we introduce several expected generalization error bounds based on the Wasserstein distance. More precisely, we present full- ...

 OPEN ACCESS

Tighter expected generalization error bounds via Wasserstein distance

by Rodríguez-Gálvez, Borja; Bassi, Germán; Thobaben, Ragnar ; More...

01/2021

In this work, we introduce several expected generalization error bounds based on the Wasserstein distance. More precisely, we present full-dataset,...

Journal ArticleFull Text Online

  All 2 versions 
Tighter Expected Generalization Error Bounds via

Wasserstein Distance - SlidesLive

slideslive.com › tighter-expected-generalization-error-bou...

Moreover, when the loss function is bounded and the geometry of the space is ignored by the choice of the metric in the Wasserstein distance ...

SlidesLive · 

Dec 6, 2021

 

arXiv:2101.07969  [pdf, other]  math.ST cs.IT cs.LG stat.ML

Robust W-GAN-Based Estimation Under Wasserstein Contamination

Authors: Zheng Liu, Po-Ling Loh

Abstract: Robust estimation is an important problem in statistics which aims at providing a reasonable estimator when the data-generating distribution lies within an appropriately defined ball around an uncontaminated distribution. Although minimax rates of estimation have been established in recent years, many existing robust estimators with provably optimal convergence rates are also computationally intra…  More

Submitted 20 January, 2021; originally announced January 2021.


2021


arXiv:2102.03450  [pdf, other]  cs.LG stat.ML
Wasserstein diffusion on graphs with missing attributes
Authors: Zhixian Chen, Tengfei Ma, Yangqiu Song, Yang Wang
Abstract: Missing node attributes is a common problem in real-world graphs. Graph neural networks have been demonstrated powerful in graph representation learning, however, they rely heavily on the completeness of graph information. Few of them consider the incomplete node attributes, which can bring great damage to the performance in practice. In this paper, we propose an innovative node representation lea…  More
Submitted 5 February, 2021; originally announced February 2021.

arXiv:2102.03390  [pdf, other]  cs.LG stat.ML
Projection Robust Wasserstein Barycenter
Authors: Minhui Huang, Shiqian Ma, Lifeng Lai
Abstract: Collecting and aggregating information from several probability measures or histograms is a fundamental task in machine learning. One of the popular solution methods for this task is to compute the barycenter of the probability measures under the Wasserstein metric. However, approximating the Wasserstein barycenter is numerically challenging because of the curse of dimensionality. This paper propo…  More
Submitted 5 February, 2021; originally announced February 2021.

[PDF] arxiv.org

Exact Statistical Inference for the Wasserstein Distance by Selective Inference

VNL Duy, I Takeuchi - arXiv preprint arXiv:2109.14206, 2021 - arxiv.org

In this paper, we study statistical inference for the Wasserstein distance, which has attracted

much attention and has been applied to various machine learning tasks. Several studies …

Cited by 4 Related articles All 2 versions 
Submitted 7 February, 2021; v1 submitted 4 February, 2021; originally announced February 2021.
arXiv:2109.14206
  [pdf, other]  stat.ML cs.LG
Exact Statistical Inference for the Wasserstein Distance by Selective Inference
Authors: Vo Nguyen Le Duy, Ichiro Takeuchi
Abstract: In this paper, we study statistical inference for the Wasserstein distance, which has attracted much attention and has been applied to various machine learning tasks. Several studies have been proposed in the literature, but almost all of them are based on asymptotic approximation and do not have finite-sample validity. In this study, we propose an exact (non-asymptotic) inference method for the W… 
More
Submitted 29 September, 2021; originally announced September 2021.


Robust W-GAN-Based Estimation Under Wasserstein ...

arxiv.org › math

by Z Liu · 2021 — Robust W-GAN-Based Estimation Under Wasserstein Contamination. Robust estimation is an important problem in statistics which aims at providing a reasonable estimator when the data-generating distribution lies within an appropriately defined ball around an uncontaminated distribution.

OPEN ACCESS

Robust W-GAN-Based Estimation Under Wasserstein Contamination

by Liu, Zheng; Loh, Po-Ling

01/2021

Robust estimation is an important problem in statistics which aims at providing a reasonable estimator when the data-generating distribution lies within an...

Journal ArticleFull Text Online

 



OPEN ACCESS

Projected Statistical Methods for Distributional Data on the Real Line with the Wasserstein Metric

by Pegoraro, Matteo; Beraha, Mario

01/2021

We present a novel class of projected methods, to perform statistical analysis on a data set of probability distributions on the real line, with the...

Journal ArticleFull Text Online

All 7 versions 

<——2021——2021——  140——


 

A note on relative Vaserstein symbol

Authors: Kuntal Chakraborty

Abstract: In an unpublished work of Fasel-Rao-Swan the notion of the relative Witt group $W_E(R,I)$ is defined. In this article we will give the details of this construction. Then we studied the injectivity of the relative Vaserstein symbol … We established injectivity of this symbol if $R$ is an affine non-singular algebra of dimension $3$ over a perfect…  More

Submitted 7 February, 2021; originally announced February 2021.

Comments: 26 pages

 

arxiv.org › math

by V Divol · 2021 — We provide a short proof that the Wasserstein distance between the empirical measure of a n-sample and the estimated measure is of order n^-(1/d), if the measure has a lower and upper bounded density on the d-dimensional flat torus

 OPEN ACCESS

A short proof on the rate of convergence of the empirical measure for the Wasserstein distance

by Divol, Vincent

01/2021

We provide a short proof that the Wasserstein distance between the empirical measure of a n-sample and the estimated measure is of order n^-(1/d), if the...

Journal ArticleFull Text Online

Cited by 5 Related articles All 4 versions

 OPEN ACCESS

Reproducibility report for paper: "Cross-lingual Document Retrieval using Regularized Wasserstein...

by Knudsen, Gunnar Sjúrðarson; Alaman Requena, Guillermo; Maliakel, Paul Joe

01/2021

Report on the attempt to reproduce results from the paper titled "Cross-lingual Document Retrieval using Regularized Wasserstein Distance."

ReportCitation Online

 

 

 OPEN ACCESS

Experiment data in support of "Segmentation analysis and the recovery of queuing parameters via the Wass...

by Wilde, Henry; Knight, Vincent; Gillard, Jonathan

01/2021

This archive contains a ZIP archive, `data.zip`, that itself contains the data used in the final sections of the paper. The remainder of the paper's supporting...

Data Set Citation Online

This archive contains a ZIP archive, `data.zip`, that itself contains the data used in the final sections of the paper. The remainder of the paper's supporting files are available at github.com/daffidwilde/copd-paper/ The ZIP archive is structured as follows:

There is a directory, `wasserstein`, for the parameter sweep described in the model construction section of the paper. Its contents are: (i) a file, `main.csv`, describing each parameter and their maximal Wasserstein distance to the observed data, and (ii) three directories, `best`, `median` and `worst`, each containing the simulated queuing results (in `main.csv`) from that sweep with the best, median and worst found parameter sets, respectively (in `params.txt`).

The remaining three directories correspond to the experiments conducted in the final section of the paper. Each directory contains two files: (i) `system_times.csv`, which holds trial parameters and system time records for every patient to pass through the model in that experiment, and (ii) `utilisations.csv`, which holds trial parameters and utilisations for each server in the model for that experiment.

Subjects:




  • wasserstein, queuing, wales, hospital, dataset, copd

Recent Studies from Sorbonne University Add New Data to Differential Equations (Differential Inclusions In Wasserstein...

Mathematics Week, 01/2021

NewsletterCitation Online

(01/26/2021). "Recent Studies from Sorbonne University Add New Data to Differential Equations (Differential Inclusions In Wasserstein Spaces: the Cauchy-lipschitz Framework)". Mathematics Week (1944-2440), p. 569.


2021


Bonnet, Benoît
Frankowska, Hélène

Differential inclusions in Wasserstein spaces: the Cauchy-Lipschitz framework. (English) Zbl 07283594

J. Differ. Equations 271, 594-637 (2021).

Reviewer: Andrej V. Plotnikov (Odessa)

MSC:  49J45 49J21 28B20 34A60 34G20 49Q22 60B05

PDF BibTeX XML Cite

Full Text: DOI

Cited by 18 Related articles All 5 versions

New Mathematical Sciences Data Have Been Reported by Researchers at University of Birmingham (Wassers...

Mathematics Week, 01/2021

NewsletterCitation Online

 Preview 

 Cite this item Email this item Save this item More actions


 

Report Summarizes Mathematics Study Findings from University of Duisburg-Essen (Wasserstein...

Mathematics Week, 01/2021

NewsletterCitation Online

(01/19/2021). "Report Summarizes Mathematics Study Findings from University of Duisburg-Essen (Wasserstein Convergence Rates for Random Bit Approximations of Continuous Markov Processes)". Mathematics Week (1944-2440), p. 407.


 

2021 see 2020

Reports from China Three Gorges University Describe Recent Advances in Oil and Gas Research 

(First Arrival Picking of Microseismic Signals Based On Nested U-net and Wasserstein...

Energy Weekly News, 01/2021

NewsletterFull Text Online

Reports from China Three Gorges University Describe Recent Advances in Oil and Gas Research (First Arrival Picking of Microseismic Signals Based On Nested U-net and Wasserstein Generative Adversarial Network)


 

Research Conducted at Beijing University of Technology Has Updated Our Knowledge about Computer Science (Data Augmentation-based Conditional Wasserstein...

Computer Weekly News, 01/2021

NewsletterFull Text Online


Ankirchner, Stefan; Kruse, Thomas; Urusov, Mikhail

Wasserstein convergence rates for random bit approximations of continuous Markov processes. (English) Zbl 07267868

J. Math. Anal. Appl. 493, No. 2, Article ID 124543, 31 p. (2021).

MSC:  60 62

PDF BibTeX XML Cite

Full Text: DOI

Cited by 9 Related articles All 5 versions

<——2021——2021——  150——


DPIR-Net: Direct PET Image Reconstruction Based on the Wasserstein Generative Adversarial Network
Authors: Hu Z.; Xue H.; Zhang Q.; Gao J.; Zhang N.; Liu X.; Yang Y.; Liang D.; Zheng H.
Article, 2021
Publication:IEEE Transactions on Radiation and Plasma Medical Sciences, 5, 2021 01 01, 35
Publisher:2021


[PDF] arxiv.org

Robust W-GAN-Based Estimation Under Wasserstein Contamination

Z Liu, PL Loh - arXiv preprint arXiv:2101.07969, 2021 - arxiv.org

Robust estimation is an important problem in statistics which aims at providing a reasonable estimator when the data-generating distribution lies within an appropriately defined ball around an uncontaminated distribution. Although minimax rates of estimation have been …

  All 2 versions 


Considering anatomical prior information for low-dose CT image enhancement using attribute-augmented Wasserstein generative adversarial networks

Z Huang, X Liu, R Wang, J Chen, P Lu, Q Zhang… - Neurocomputing, 2021 - Elsevier

Currently, many deep learning (DL)-based low-dose CT image postprocessing technologies fail to consider the anatomical differences in training data among different human body sites, such as the cranium, lung and pelvis. In addition, we can observe evident anatomical …

  Cited by 2 Related articles


[PDF] arxiv.org

Continuous Wasserstein-2 Barycenter Estimation without Minimax Optimization

A Korotin, L Li, J Solomon, E Burnaev - arXiv preprint arXiv:2102.01752, 2021 - arxiv.org

Wasserstein barycenters provide a geometric notion of the weighted average of probability measures based on optimal transport. In this paper, we present a scalable algorithm to compute Wasserstein-2 barycenters given sample access to the input measures, which are …

  All 2 versions 


[PDF] arxiv.org

Predictive density estimation under the Wasserstein loss

T Matsuda, WE Strawderman - Journal of Statistical Planning and Inference, 2021 - Elsevier

We investigate predictive density estimation under the L 2 Wasserstein loss for location families and location-scale families. We show that plug-in densities form a complete class and that the Bayesian predictive density is given by the plug-in density with the posterior …

   Related articles All 5 versions


2021

[PDF] mdpi.com

Unpaired Image Denoising via Wasserstein GAN in Low-Dose CT Image with Multi-Perceptual Loss and Fidelity Loss

Z Yin, K Xia, Z He, J Zhang, S Wang, B Zu - Symmetry, 2021 - mdpi.com

The use of low-dose computed tomography (LDCT) in medical practice can effectively reduce the radiation risk of patients, but it may increase noise and artefacts, which can compromise diagnostic information. The methods based on deep learning can effectively …

  Related articles 

Unpaired Image Denoising via Wasserstein GAN in Low-Dose CT Image with Multi-Perceptual Loss and Fidelity Loss. Symmetry 2021, 13, 126

Z Yin, K Xia, Z He, J Zhang, S Wang, B Zu - 2021 - search.proquest.com

The use of low-dose computed tomography (LDCT) in medical practice can effectively reduce the radiation risk of patients, but it may increase noise and artefacts, which can compromise diagnostic information. The methods based on deep learning can effectively …

 Cited by 2 Related articles All 3 versions 


WRGAN: Improvement of RelGAN with Wasserstein Loss for Text Generation. Electronics 2021, 10, 275

Z Jiao, F Ren - 2021 - search.proquest.com

Generative adversarial networks (GANs) were first proposed in 2014, and have been widely used in computer vision, such as for image generation and other tasks. However, the GANs used for text generation have made slow progress. One of the reasons is that the …
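For reference on the "Wasserstein loss" that entries such as WRGAN build on, below is a minimal PyTorch sketch of the standard WGAN critic objective with gradient penalty (Gulrajani et al.); the tiny network and toy batches are placeholders, not the paper's text-generation architecture:

import torch
import torch.nn as nn

critic = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))

def critic_loss(real, fake, lam=10.0):
    # Wasserstein part: the critic maximizes E[D(real)] - E[D(fake)], so minimize the negation
    loss_w = critic(fake).mean() - critic(real).mean()
    # gradient penalty on random interpolates keeps the critic approximately 1-Lipschitz
    eps = torch.rand(real.size(0), 1)
    mix = (eps * real + (1 - eps) * fake).requires_grad_(True)
    grad = torch.autograd.grad(critic(mix).sum(), mix, create_graph=True)[0]
    gp = ((grad.norm(2, dim=1) - 1) ** 2).mean()
    return loss_w + lam * gp

real = torch.randn(32, 2) + 2.0     # toy "real" batch
fake = torch.randn(32, 2)           # toy "generated" batch
print(critic_loss(real, fake))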

 

2021

Unpaired Single-Image Depth Synthesis with cycle-consistent Wasserstein GANs

C Angermann, A Moravová, M Haltmeier… - arXiv preprint arXiv …, 2021 - arxiv.org

Real-time estimation of actual environment depth is an essential module for various

autonomous system tasks such as localization, obstacle detection and pose estimation.

During the last decade of machine learning, extensive deployment of deep learning …

  All 2 versions 


The measurement of relations on belief functions based on the Kantorovich problem and the Wasserstein metric

AG Bronevich, IN Rozenberg - International Journal of Approximate …, 2021 - Elsevier

In this paper, we show how the Kantorovich problem appears in many constructions in the theory of belief functions. We demonstrate this on several relations on belief functions such as inclusion, equality and intersection of belief functions. Using the Kantorovich problem we …

  All 2 versions

Multilevel optimal transport: a fast approximation of Wasserstein-1 distances

J Liu, W Yin, W Li, YT Chow - SIAM Journal on Scientific Computing, 2021 - SIAM

We propose a fast algorithm for the calculation of the Wasserstein-1 distance, which is a particular type of optimal transport distance with transport cost homogeneous of degree one. Our algorithm is built on multilevel primal-dual algorithms. Several numerical examples and …

  Cited by 4 Related articles All 6 versions

<——2021——2021——  160——


Distributionally Robust Resilient Operation of Integrated Energy Distribution Systems Using Moment and Wasserstein Metric for Contingencies

M Shahidehpour, Y Zhou, Z Wei… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org

Extreme weather events pose a serious threat to energy distribution systems. We propose a distributionally robust optimization model for the resilient operation of the integrated electricity and heat energy distribution systems in extreme weather events. We develop a …

 

[PDF] arxiv.org

Necessary Optimality Conditions for Optimal Control Problems in Wasserstein Spaces

B Bonnet, H Frankowska - arXiv preprint arXiv:2101.10668, 2021 - arxiv.org

In this article, we derive first-order necessary optimality conditions for a constrained optimal control problem formulated in the Wasserstein space of probability measures. To this end, we introduce a new notion of localised metric subdifferential for compactly supported …

  All 2 versions 

2021

An inexact PAM method for computing Wasserstein barycenter ...

link.springer.com › article

3 days ago — This paper focuses on the computation of Wasserstein barycenters under ... A fast globally linearly convergent algorithm for the computation of ...

[PDF] arxiv.org

Wasserstein convergence rates for random bit approximations of continuous markov processes

S Ankirchner, T Kruse, M Urusov - Journal of Mathematical Analysis and …, 2021 - Elsevier

We determine the convergence speed of a numerical scheme for approximating one-dimensional continuous strong Markov processes. The scheme is based on the construction of certain Markov chains whose laws can be embedded into the process with a sequence of …

  Cited by 3 Related articles All 4 versions


[PDF] arxiv.org

Wasserstein Convergence Rate for Empirical Measures of Markov Chains

A Riekert - arXiv preprint arXiv:2101.06936, 2021 - arxiv.org

We consider a Markov chain on $\mathbb {R}^ d $ with invariant measure $\mu $. We are interested in the rate of convergence of the empirical measures towards the invariant measure with respect to the $1 $-Wasserstein distance. The main result of this article is a …

  All 2 versions 


[PDF] uni-bielefeld.de

Exponential convergence in entropy and Wasserstein for McKean–Vlasov SDEs

P Ren, FY Wang - Nonlinear Analysis, 2021 - Elsevier

The following type of exponential convergence is proved for (non-degenerate or degenerate) McKean–Vlasov SDEs: $W_2(\mu_t,\mu_\infty)^2 + \mathrm{Ent}(\mu_t|\mu_\infty) \le c\,e^{-\lambda t}\min\{W_2(\mu_0,\mu_\infty)^2,\ \mathrm{Ent}(\mu_0|\mu_\infty)\}$, $t\ge 1$, where $c,\lambda>0$ are constants, $\mu_t$ is the distribution of the solution …

  Related articles All 2 versions


[PDF] arxiv.org

A short proof on the rate of convergence of the empirical measure for the Wasserstein distance

V Divol - arXiv preprint arXiv:2101.08126, 2021 - arxiv.org

We provide a short proof that the Wasserstein distance between the empirical measure of a n-sample and the estimated measure is of order n^-(1/d), if the measure has a lower and upper bounded density on the d-dimensional flat torus. Subjects: Statistics Theory (math. ST) …

  All 4 versions 


[PDF] arxiv.org

Convergence and finite sample approximations of entropic regularized Wasserstein distances in Gaussian and RKHS settings

MH Quang - arXiv preprint arXiv:2101.01429, 2021 - arxiv.org

This work studies the convergence and finite sample approximations of entropic regularized Wasserstein distances in the Hilbert space setting. Our first main result is that for Gaussian measures on an infinite-dimensional Hilbert space, convergence in the 2-Sinkhorn …

  Related articles All 3 versions 
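A minimal numpy sketch of the entropic-regularized (Sinkhorn) optimal transport cost that this line of work analyzes; the regularization strength and iteration count are arbitrary illustration choices and no claim is made about matching the paper's Gaussian/RKHS setting:

import numpy as np

def sinkhorn(mu, nu, C, eps=0.05, iters=500):
    K = np.exp(-C / eps)                  # Gibbs kernel
    u = np.ones_like(mu)
    for _ in range(iters):                # alternately match the two marginals
        v = nu / (K.T @ u)
        u = mu / (K @ v)
    P = u[:, None] * K * v[None, :]       # approximate optimal coupling
    return np.sum(P * C)

x = np.linspace(0, 1, 50); y = np.linspace(0, 1, 60)
mu = np.full(50, 1 / 50);  nu = np.full(60, 1 / 60)
C = (x[:, None] - y[None, :]) ** 2        # squared-distance cost, so the value is roughly W2^2
print("entropic OT cost:", sinkhorn(mu, nu, C))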


[PDF] arxiv.org

Convergence in Wasserstein Distance for Empirical Measures of Semilinear SPDEs

FY Wang - arXiv preprint arXiv:2102.00361, 2021 - arxiv.org

The convergence rate in Wasserstein distance is estimated for the empirical measures of symmetric semilinear SPDEs. Unlike in the finite-dimensional case that the convergence is of algebraic order in time, in the present situation the convergence is of log order with a …

  All 2 versions 


[PDF] mdpi.com

Power Electric Transformer Fault Diagnosis Based on Infrared Thermal Images Using Wasserstein Generative Adversarial Networks and Deep Learning Classifier

KH Fanchiang, YC Huang, CC Kuo - Electronics, 2021 - mdpi.com

The safety of electric power networks depends on the health of the transformer. However, once a variety of transformer failure occurs, it will not only reduce the reliability of the power system but also cause major accidents and huge economic losses. Until now, many …

  Cited by 3 Related articles All 3 versions 

<——2021——2021——  170——   


arXiv:2102.06449  [pdf, other]  math.ST
Two-sample Test with Kernel Projected Wasserstein Distance
Authors: Jie Wang, Rui Gao, Yao Xie
Abstract: We develop a kernel projected Wasserstein distance for the two-sample test, an essential building block in statistics and machine learning: given two sets of samples, to determine whether they are from the same distribution. This method operates by finding the nonlinear mapping in the data space which maximizes the distance between projected distributions. In contrast to existing works about proje…  More
Submitted 12 February, 2021; originally announced February 2021.
Comments: 34 pages, 3 figures
 Related articles All 3 versions 

Cited by 1 Related articles All 3 versions 

arXiv:2102.06350  [pdf, other]  cs.LG stat.ML
Projected Wasserstein gradient descent for high-dimensional Bayesian inference
Authors: Yifei Wang, Wuchen Li, Peng Chen
Abstract: We propose a projected Wasserstein gradient descent method (pWGD) for high-dimensional Bayesian inference problems. The underlying density function of a particle system of WGD is approximated by kernel density estimation (KDE), which faces the long-standing curse of dimensionality. We overcome this challenge by exploiting the intrinsic low-rank structure in the difference between the posterior and… 
More
Submitted 12 February, 2021; originally announced February 2021.
approximated by kernel density estimation (KDE), which faces the long-standing curse of …

   Related articles All 3 versions 


arXiv:2102.06278  [pdf, other]  stat.ML cs.LG
Unsupervised Ground Metric Learning using Wasserstein Eigenvectors
Authors: Geert-Jan Huizing, Laura Cantini, Gabriel Peyré
Abstract: Optimal Transport (OT) defines geometrically meaningful "Wasserstein" distances, used in machine learning applications to compare probability distributions. However, a key bottleneck is the design of a "ground" cost which should be adapted to the task under study. In most cases, supervised metric learning is not accessible, and one usually resorts to some ad-hoc approach. Unsupervised metric learn… 
More
Submitted 11 February, 2021; originally announced February 2021.

Cited by 1 Related articles All 3 versions 

2021  online  OPEN ACCESS

Learning High Dimensional Wasserstein Geodesics

by Liu, Shu; Ma, Shaojun; Chen, Yongxin ; More...

02/2021

We propose a new formulation and learning strategy for computing the Wasserstein geodesic between two probability distributions in high dimensions. By applying...

Journal ArticleFull Text Online

Cited by 4 Related articles All 3 versions 

2021 online  OPEN ACCESS

Wasserstein diffusion on graphs with missing attributes

by Chen, Zhixian; Ma, Tengfei; Song, Yangqiu ; More...

02/2021

Missing node attributes is a common problem in real-world graphs. Graph neural networks have been demonstrated powerful in graph representation learning,...

Journal ArticleFull Text Online

Cited by 3 Related articles All 2 versions 


 2021

2021 online  Cover Image

Illumination-Invariant Flotation Froth Color Measuring via Wasserstein Distance-Based CycleGAN with structure-preserving constraint

by Liu, Jinping; He, Jiezhou; Xie, Yongfang ; More...

IEEE transactions on cybernetics, 02/2021, Volume 51, Issue 2

Froth color can be referred to as a direct and instant indicator to the key flotation production index, for example, concentrate grade. However, it is...

Article PDF Download PDF BrowZine PDF Icon

Journal ArticleFull Text Online

  

[PDF] arxiv.org

Wasserstein Robust Classification with Fairness Constraints

Y Wang, VA Nguyen, GA Hanasusanto - arXiv preprint arXiv:2103.06828, 2021 - arxiv.org

robust support vector machine with a fairness constraint that encourages the classifier to 

be fair in view of the equality of opportunity criterion. We use a type-∞ Wasserstein ambiguity …

Related articles All 2 versions 


The measurement of relations on belief functions based on the Kantorovich problem and the Wasserstein metric

AG Bronevich, IN Rozenberg - International Journal of Approximate …, 2021 - Elsevier

In this paper, we show how the Kantorovich problem appears in many constructions in the theory of belief functions. We demonstrate this on several relations on belief functions such as inclusion, equality and intersection of belief functions. Using the Kantorovich problem we …

  All 2 versions


[PDF] arxiv.org

Wasserstein diffusion on graphs with missing attributes

Z Chen, T Ma, Y Song, Y Wang - arXiv preprint arXiv:2102.03450, 2021 - arxiv.org

Missing node attributes is a common problem in real-world graphs. Graph neural networks have been demonstrated powerful in graph representation learning, however, they rely heavily on the completeness of graph information. Few of them consider the incomplete …

  All 2 versions 


[PDF] arxiv.org

Projected Statistical Methods for Distributional Data on the Real Line with the Wasserstein Metric

M Pegoraro, M Beraha - arXiv preprint arXiv:2101.09039, 2021 - arxiv.org

We present a novel class of projected methods, to perform statistical analysis on a data set of probability distributions on the real line, with the 2-Wasserstein metric. We focus in particular on Principal Component Analysis (PCA) and regression. To define these models …

  All 6 versions 

 <——2021——2021——  180—— 


[PDF] arxiv.org

A short proof on the rate of convergence of the empirical measure for the Wasserstein distance

V Divol - arXiv preprint arXiv:2101.08126, 2021 - arxiv.org

We provide a short proof that the Wasserstein distance between the empirical measure of a n-sample and the estimated measure is of order n^-(1/d), if the measure has a lower and upper bounded density on the d-dimensional flat torus. Subjects: Statistics Theory (math. ST) …

  All 4 versions 


[PDF] iop.org

Optimization Research on Abnormal Diagnosis of Transformer Voiceprint Recognition based on Improved Wasserstein GAN

K Zhu, H Ma, J Wang, C Yu, C Guo… - Journal of Physics …, 2021 - iopscience.iop.org

Transformer is an important infrastructure equipment of power system, and fault monitoring is of great significance to its operation and maintenance, which has received wide attention and much research. However, the existing methods at home and abroad are based on  …


2021

[PDF] arxiv.org

A Wasserstein Minimax Framework for Mixed Linear Regression

T Diamandis, YC Eldar, A Fallah, F Farnia… - arXiv preprint arXiv …, 2021 - arxiv.org

Multi-modal distributions are commonly used to model clustered data in statistical learning

tasks. In this paper, we consider the Mixed Linear Regression (MLR) problem. We propose

an optimal transport-based framework for MLR problems, Wasserstein Mixed Linear …

  All 2 versions 

[PDF] arxiv.org

The ultrametric Gromov-Wasserstein distance

F Mémoli, A Munk, Z Wan, C Weitkamp - arXiv preprint arXiv:2101.05756, 2021 - arxiv.org

In this paper, we investigate compact ultrametric measure spaces which form a subset $\mathcal {U}^ w $ of the collection of all metric measure spaces $\mathcal {M}^ w $. Similar as for the ultrametric Gromov-Hausdorff distance on the collection of ultrametric spaces …

   Related articles 

 

[HTML] springer.com

[HTML] Quantum statistical learning via Quantum Wasserstein natural gradient

S Becker, W Li - Journal of Statistical Physics, 2021 - Springer

In this article, we introduce a new approach towards the statistical learning problem $\mathrm{argmin}_{\rho(\theta)\in\mathcal{P}_\theta} W_Q^2(\rho_\star,\rho(\theta))$ to approximate a target quantum state $\rho_\star$ by a set of …

   Related articles All 5 versions


 2021

[PDF] arxiv.org

From Smooth Wasserstein Distance to Dual Sobolev Norm: Empirical Approximation and Statistical Applications

S Nietert, Z Goldfeld, K Kato - arXiv preprint arXiv:2101.04039, 2021 - arxiv.org

Statistical distances, ie, discrepancy measures between probability distributions, are ubiquitous in probability theory, statistics and machine learning. To combat the curse of dimensionality when estimating these distances from data, recent work has proposed …

  Related articles All 2 versions 

[PDF] arxiv.org

Necessary Optimality Conditions for Optimal Control Problems in Wasserstein Spaces

B Bonnet, H Frankowska - arXiv preprint arXiv:2101.10668, 2021 - arxiv.org

In this article, we derive first-order necessary optimality conditions for a constrained optimal control problem formulated in the Wasserstein space of probability measures. To this end, we introduce a new notion of localised metric subdifferential for compactly supported …

  All 2 versions 


[PDF] arxiv.org

Projected Statistical Methods for Distributional Data on the Real Line with the Wasserstein Metric

M Pegoraro, M Beraha - arXiv preprint arXiv:2101.09039, 2021 - arxiv.org

We present a novel class of projected methods, to perform statistical analysis on a data set of probability distributions on the real line, with the 2-Wasserstein metric. We focus in particular on Principal Component Analysis (PCA) and regression. To define these models …

  All 6 versions 


2021

Primal dual methods for Wasserstein gradient flows

J Carrillo de la Plata, K Craig, L Wang… - Foundations of …, 2021 - ora.ox.ac.uk

Combining the classical theory of optimal transport with modern operator splitting techniques, we develop a new numerical method for nonlinear, nonlocal partial differential equations, arising in models of porous media, materials science, and biological swarming …

  


[PDF] arxiv.org

Multilevel optimal transport: a fast approximation of Wasserstein-1 distances

J Liu, W Yin, W Li, YT Chow - SIAM Journal on Scientific Computing, 2021 - SIAM

We propose a fast algorithm for the calculation of the Wasserstein-1 distance, which is a particular type of optimal transport distance with transport cost homogeneous of degree one. Our algorithm is built on multilevel primal-dual algorithms. Several numerical examples and …

  Cited by 4 Related articles All 6 versions

 <——2021——2021——–190——

2[PDF] jmlr.org

[PDF] A fast globally linearly convergent algorithm for the computation of Wasserstein barycenters

L Yang, J Li, D Sun, KC Toh - Journal of Machine Learning Research, 2021 - jmlr.org

We consider the problem of computing a Wasserstein barycenter for a set of discrete probability distributions with finite supports, which finds many applications in areas such as statistics, machine learning and image processing. When the support points of the …

  Cited by 8 Related articles All 6 versions 
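For one-dimensional inputs there is a closed form such barycenter solvers can be checked against: the quantile function of the Wasserstein-2 barycenter is the weighted average of the input quantile functions. A small numpy sketch of that fact (not the LP-based algorithm of the paper above):

import numpy as np

def barycenter_1d(samples, weights, grid=256):
    # returns `grid` quantile values of the W2 barycenter of the input samples
    u = (np.arange(grid) + 0.5) / grid
    quantiles = np.stack([np.quantile(s, u) for s in samples])
    return np.average(quantiles, axis=0, weights=weights)

rng = np.random.default_rng(0)
a = rng.normal(-2.0, 1.0, size=5000)
b = rng.normal( 3.0, 0.5, size=5000)
bary = barycenter_1d([a, b], weights=[0.5, 0.5])
print("barycenter mean ~", bary.mean())    # close to the average of the two means, 0.5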


[PDF] arxiv.org

Necessary Optimality Conditions for Optimal Control Problems in Wasserstein Spaces

B Bonnet, H Frankowska - arXiv preprint arXiv:2101.10668, 2021 - arxiv.org

In this article, we derive first-order necessary optimality conditions for a constrained optimal control problem formulated in the Wasserstein space of probability measures. To this end, we introduce a new notion of localised metric subdifferential for compactly supported …

  All 2 versions 

arXiv:2102.08725  [pdf, ps, other]  math.MG math.DG
Isometric Rigidity of compact Wasserstein spaces
Authors: Jaime Santos-Rodríguez
Abstract: Let $(X,d,m)$ be a metric measure space. The study of the Wasserstein space $(\mathcal{P}_p(X),W_p)$ associated to $X$ has proved useful in describing several geometrical properties of $X$. In this paper we focus on the study of isometries of $\mathcal{P}_p(X)$ for $p\in(1,\infty)$ under the assumption that there is some characterization of optimal maps between measures, the so…  More
Submitted 17 February, 2021; originally announced February 2021.
Comments: 16 pages, all comments are welcome
MSC Class: 53C23; 53C21
 Related articles All 3 versions 


arXiv:2102.06862  [pdf, other]  cs.LG cs.AI math.NA

 Wasserstein Proximal of GANs.

0 citations*

2021 ARXIV: LEARNING


 Distributionally Robust Inverse Covariance Estimation: The Wasserstein Shrinkage Estimator

19 citations* for all

0 citations*

2021 OPERATIONS RESEARCH

Viet Anh Nguyen 1,Daniel Kuhn 2,Peyman Mohajerin Esfahani 3

1 Stanford University ,2 École Polytechnique Fédérale de Lausanne ,3 Delft University of Technology

Shrinkage estimator

Estimation of covariance matrices


The optimal solutions of many decision problems such as the Markowitz portfolio allocation and the linear discriminant analysis depen...
Cited by 16
 Related articles All 7 versions
MR4424359 

Xie, Weijun

On distributionally robust chance constrained programs with Wasserstein distance. (English) Zbl 07310576

Math. Program. 186, No. 1-2 (A), 115-155 (2021).

MSC:  90C15 90C47 90C11

PDF BibTeX XML Cite

Full Text: DOI      Zbl 07310576

Cited by 97 Related articles All 9 versions


 2021

Steinerberger, Stefan

Wasserstein distance, Fourier series and applications. (English) Zbl 07308735

Monatsh. Math. 194, No. 2, 305-338 (2021).

MSC:  11L03 35B05 42A05 42A16 49Q20

PDF BibTeX XML Cite

MR4215207 Prelim Qian, Yitian; Pan, Shaohua; An inexact PAM method for computing Wasserstein barycenter with unknown supports. Comput. Appl. Math. 40 (2021), no. 2, 45. 90C26 (49J52 65K05)

Review PDF Clipboard Journal Article

An inexact PAM method for computing Wasserstein barycenter with unknown supports
Cited by 35
Related articles All 3 versions

MR4214478 Prelim Xie, Weijun; On distributionally robust chance constrained programs with Wasserstein distance. Math. Program. 186 (2021), no. 1-2, Ser. A, 115–155. 90C15 (90C11 90C47)

Review PDF Clipboard Journal Article

in ambiguity set, where the uncertain constraints should be satisfied with a …

 Cited by 73 Related articles All 9 versions

Cited by 93
 Related articles All 9 versions

CHICKN: extraction of peptide chromatographic elution profiles from large scale mass spectrometry data by means of Wasserstein compressive hierarchical cluster analysis.

By: Permiakova, Olga; Guibert, Romain; Kraut, Alexandra; et al.

BMC bioinformatics  Volume: ‏ 22   Issue: ‏ 1   Pages: ‏ 68   Published: ‏ 2021 Feb 12

 

2021  see 2020

Asymptotics of Smoothed Wasserstein Distances

By: Chen, Hong-Bin; Niles-Weed, Jonathan

POTENTIAL ANALYSIS    

Early Access: JAN 2021

 Cite this item Email this item Save this item More actions

Reports on Potential Analysis from New York University Provide New Insights (Asymptotics of Smoothed Wasserstein...

Mathematics Week, 03/2021

NewsletterCitation Online

  Cited by 2 Related articles All 2 versions

Asymptotics of smoothed Wasserstein distances

HB ChenJ Niles-Weed - Potential Analysis, 2021 - Springer

We investigate contraction of the Wasserstein distances on $\mathbb{R}^d$ under Gaussian smoothing. It is well known that the heat semigroup is exponentially contractive with respect to the Wasserstein distances on manifolds of positive curvature; however, on flat …

    Cited by 5 Related articles All 5 versions
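A toy numerical companion to the smoothing theme above: convolving both measures with the same Gaussian never increases their Wasserstein distance, so the smoothed empirical distance is typically smaller than the unsmoothed one. The distributions and noise levels below are arbitrary illustration choices, not the paper's setting:

import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
x = rng.exponential(size=20_000)            # samples from mu (mean 1, variance 1)
y = rng.normal(1.0, 1.0, size=20_000)       # samples from nu (same mean and variance)

for sigma in [0.0, 0.5, 2.0]:
    xs = x + sigma * rng.standard_normal(x.size)   # samples from mu smoothed by N(0, sigma^2)
    ys = y + sigma * rng.standard_normal(y.size)   # samples from nu smoothed by N(0, sigma^2)
    print(sigma, wasserstein_distance(xs, ys))     # typically decreases as sigma grows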

Unpaired Image Denoising via Wasserstein GAN in Low-Dose CT Image with Multi-Perceptual Loss and Fidelity Loss

By: Yin, Zhixian; Xia, Kewen; He, Ziping; et al.

SYMMETRY-BASEL  Volume: ‏ 13   Issue: ‏ 1     Article Number: 126   Published: ‏ JAN 2021

Cited by 10 Related articles All 3 versions 

   <——2021——2021——  200——

DPIR-Net: Direct PET Image Reconstruction Based on the Wasserstein Generative Adversarial Network

By: Hu, Zhanli; Xue, Hengzhi; Zhang, Qiyang; et al.

IEEE TRANSACTIONS ON RADIATION AND PLASMA MEDICAL SCIENCES  Volume: 5   Issue: 1   Pages: 35-43   Published: JAN 2021

Times Cited: 3


2021 [PDF] arxiv.org

A note on relative Vaserstein symbol

K Chakraborty - arXiv preprint arXiv:2102.03883, 2021 - arxiv.org

In an unpublished work of Fasel-Rao-Swan the notion of the relative Witt group $ W_E (R, I) $ is defined. In this article we will give the details of this construction. Then we studied the injectivity of the relative Vaserstein symbol $ V_ {R, I}: Um_3 (R, I)/E_3 (R, I)\rightarrow W_E …

  Related articles 


[PDF] arxiv.org

Tighter expected generalization error bounds via Wasserstein distance

B Rodríguez-GálvezG BassiR Thobaben… - arXiv preprint arXiv …, 2021 - arxiv.org

In this work, we introduce several expected generalization error bounds based on the Wasserstein distance. More precisely, we present full-dataset, single-letter, and random-subset bounds on both the standard setting and the randomized-subsample setting from …

  All 3 versions  


A Mathematical Principle of Deep Learning: Learn the Geodesic Curve in the Wasserstein...

by Kuo Gai; Zhang, Shihua
arXiv.org, 03/2021
Recent studies revealed the mathematical connection of deep neural network (DNN) and dynamic system. However, the fundamental principle of DNN has not been...
Paper  Full Text Online

arXiv:2102.09235  [pdf, other]  cs.LG stat.ML
A Mathematical Principle of Deep Learning: Learn the Geodesic Curve in the Wasserstein Space
Authors: Kuo Gai, Shihua Zhang
Abstract: Recent studies revealed the mathematical connection of deep neural network (DNN) and dynamic system. However, the fundamental principle of DNN has not been fully characterized with dynamic system in terms of optimization and generalization. To this end, we build the connection of DNN and continuity equation where the measure is conserved to model the forward propagation process of DNN which has no…  More
Submitted 18 February, 2021; originally announced February 2021.
Comments: 38 pages, 16 figures
  Related articles All 2 versions 


WGAN with an Infinitely Wide Generator Has No Spurious Stationary Points
Authors: Albert No, Taeho Yoon, Se-Hyeon Kwon, Ernest K. Ryu
Abstract: Generative adversarial networks (GAN) are a widely used class of deep generative models, but their minimax training dynamics are not understood very well. In this work, we show that GANs with a 2-layer infinite-width generator and a 2-layer finite-width discriminator trained with stochastic gradient ascent-descent have no spurious stationary points. We then show that when the width of the generato…  More
Submitted 15 February, 2021; originally announced February 2021.
  Related articles All 6 versions 


[PDF] mdpi.com

WRGAN: Improvement of RelGAN with Wasserstein Loss for Text Generation

Z Jiao, F Ren - Electronics, 2021 - mdpi.com

Generative adversarial networks (GANs) were first proposed in 2014, and have been widely used in computer vision, such as for image generation and other tasks. However, the GANs used for text generation have made slow progress. One of the reasons is that the …

WRGAN: Improvement of RelGAN with Wasserstein Loss for Text Generation. Electronics 2021, 10, 275

Z Jiao, F Ren - 2021 - search.proquest.com

Generative adversarial networks (GANs) were first proposed in 2014, and have been widely used in computer vision, such as for image generation and other tasks. However, the GANs used for text generation have made slow progress. One of the reasons is that the …


2021 [PDF] arxiv.org

Predictive density estimation under the Wasserstein loss

T Matsuda, WE Strawderman - Journal of Statistical Planning and Inference, 2021 - Elsevier

We investigate predictive density estimation under the L 2 Wasserstein loss for location families and location-scale families. We show that plug-in densities form a complete class and that the Bayesian predictive density is given by the plug-in density with the posterior …

Cited by 3 Related articles All 5 versions


Variational Autoencoders and Wasserstein Generative Adversarial Networks for Improving the Anti-Money Laundering Process

ZY Chen, W Soliman, A Nazir, M Shorfuzzaman - IEEE Access, 2021 - ieeexplore.ieee.org

There has been much recent work on fraud and Anti Money Laundering (AML) detection

using machine learning techniques. However, most algorithms are based on supervised

techniques. Studies show that supervised techniques often have the limitation of not …

 
Simulation of broadband ground motions with consistent long-period and short-period components using Wasserstein interpolation of acceleration envelopes

T Okazaki, H Hachiya, A Iwaki, T Maeda… - Geophysical Journal …, 2021 - academic.oup.com

Practical hybrid approaches for the simulation of broadband ground motions often combine

long-period and short-period waveforms synthesised by independent methods under

different assumptions for different period ranges, which at times can lead to incompatible …
journal article

  Related articles


Linear and Deep Order-Preserving Wasserstein Discriminant Analysis

B Su, J Zhou, JR Wen, Y Wu - IEEE Transactions on Pattern …, 2021 - ieeexplore.ieee.org

Supervised dimensionality reduction for sequence data learns a transformation that maps the observations in sequences onto a low-dimensional subspace by maximizing the separability of sequences in different classes. It is typically more challenging than …

 Related articles All 6 versions


Projected Wasserstein gradient descent for high-dimensional Bayesian inference

Y Wang, P Chen, W Li - arXiv preprint arXiv:2102.06350, 2021 - arxiv.org

We propose a projected Wasserstein gradient descent method (pWGD) for high-dimensional

Bayesian inference problems. The underlying density function of a particle system of WGD is

approximated by kernel density estimation (KDE), which faces the long-standing curse of …

  All 3 versions 


[HTML] springer.com

[HTML] Quantum statistical learning via Quantum Wasserstein natural gradient

S Becker, W Li - Journal of Statistical Physics, 2021 - Springer

In this article, we introduce a new approach towards the statistical learning problem $\mathrm{argmin}_{\rho(\theta)\in\mathcal{P}_{\theta}} W_{Q}^{2}(\rho_{\star},\rho(\theta))$ to approximate a target quantum state $\rho_{\star}$ by a set of …

   Related articles All 5 versions

 <——2021——2021——  210—— 


[PDF] jmlr.org

[PDF] A fast globally linearly convergent algorithm for the computation of Wasserstein barycenters

L Yang, J Li, D Sun, KC Toh - Journal of Machine Learning Research, 2021 - jmlr.org

We consider the problem of computing a Wasserstein barycenter for a set of discrete

probability distributions with finite supports, which finds many applications in areas such as

statistics, machine learning and image processing. When the support points of the

barycenter are pre-specified, this problem can be modeled as a linear programming (LP)

problem whose size can be extremely large. To handle this large-scale LP, we analyse the

structure of its dual problem, which is conceivably more tractable and can be reformulated …

  Cited by 8 Related articles All 6 versions 

Zbl 07370538
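
For readers who want to see the LP structure mentioned in the abstract above in concrete form, here is a minimal Python sketch (illustrative only, not the authors' algorithm; the names mu, nu and cost are placeholders) of the basic transport LP between two histograms, solved with SciPy's HiGHS solver. The barycenter LP analysed in the paper couples many such problems and is far larger.

import numpy as np
from scipy.optimize import linprog

def discrete_ot_cost(mu, nu, cost):
    """Optimal-transport cost between histograms mu (m,) and nu (n,) for a cost matrix of shape (m, n)."""
    m, n = cost.shape
    A_eq = np.zeros((m + n, m * n))
    for i in range(m):                  # row marginals: sum_j pi[i, j] = mu[i]
        A_eq[i, i * n:(i + 1) * n] = 1.0
    for j in range(n):                  # column marginals: sum_i pi[i, j] = nu[j]
        A_eq[m + j, j::n] = 1.0
    res = linprog(cost.ravel(), A_eq=A_eq, b_eq=np.concatenate([mu, nu]),
                  bounds=(0, None), method="highs")
    return res.fun

# Toy usage: histograms on 3 and 4 support points with squared-distance costs.
mu = np.array([0.5, 0.3, 0.2])
nu = np.array([0.25, 0.25, 0.25, 0.25])
x, y = np.arange(3.0), np.arange(4.0)
print(discrete_ot_cost(mu, nu, (x[:, None] - y[None, :]) ** 2))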

[PDF] arxiv.org

Projection Robust Wasserstein Barycenter

M Huang, S Ma, L Lai - arXiv preprint arXiv:2102.03390, 2021 - arxiv.org

Collecting and aggregating information from several probability measures or histograms is a

fundamental task in machine learning. One of the popular solution methods for this task is to

compute the barycenter of the probability measures under the Wasserstein metric. However …

  All 2 versions 


An inexact PAM method for computing Wasserstein barycenter with unknown supports

Y Qian, S Pan - Computational and Applied Mathematics, 2021 - Springer

Wasserstein barycenter is the centroid of a collection of discrete probability distributions which minimizes the average of the $\ell_2$-Wasserstein distance. This paper focuses on the computation of Wasserstein barycenters under the case where the support points are …

 Cited by 1 Related articles All 2 versions
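
As background for the barycenter entries collected here (standard formulation, not specific to any one of the cited papers): given probability distributions $\mu_1,\dots,\mu_N$, a 2-Wasserstein barycenter is any minimizer

\[ \bar{\mu} \in \operatorname*{argmin}_{\nu} \; \frac{1}{N}\sum_{i=1}^{N} W_2^2(\nu,\mu_i). \]

When the support of $\nu$ is fixed in advance this is a (very large) linear program; when the support points are also optimized, as in the "unknown supports" setting above, the problem becomes nonconvex, which is what alternating schemes such as PAM target.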


[PDF] arxiv.org

Continuous Wasserstein-2 Barycenter Estimation without Minimax Optimization

A Korotin, L Li, J Solomon, E Burnaev - arXiv preprint arXiv:2102.01752, 2021 - arxiv.org

Wasserstein barycenters provide a geometric notion of the weighted average of probability

measures based on optimal transport. In this paper, we present a scalable algorithm to

compute Wasserstein-2 barycenters given sample access to the input measures, which are …

  All 2 versions 


2021

Data augmentation for rolling bearing fault diagnosis using an enhanced few-shot Wasserstein auto-encoder with meta-learning

Z Pei, H Jiang, X Li, J Zhang, S Liu - Measurement Science and …, 2021 - iopscience.iop.org

Despite the advance of intelligent fault diagnosis for rolling bearings, in industries, data-

driven methods still suffer from data acquisition and imbalance. We propose an enhanced

few-shot Wasserstein auto-encoder (fs-WAE) to reverse the negative effect of imbalance …




Distributionally Robust Resilient Operation of Integrated Energy Distribution Systems Using Moment and Wasserstein Metric for Contingencies

M Shahidehpour, Y Zhou, Z Wei… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org

Extreme weather events pose a serious threat to energy distribution systems. We propose a

distributionally robust optimization model for the resilient operation of the integrated

electricity and heat energy distribution systems in extreme weather events. We develop a …

 

[PDF] arxiv.org

Dimension-free Wasserstein contraction of nonlinear filters

N Whiteley - Stochastic Processes and their Applications, 2021 - Elsevier

For a class of partially observed diffusions, conditions are given for the map from the initial

condition of the signal to filtering distribution to be contractive with respect to Wasserstein

distances, with rate which does not necessarily depend on the dimension of the state-space …

   Related articles All 2 versions


Accelerated WGAN Update Strategy With Loss Change Rate Balancing

X Ouyang, Y Chen, G Agam - Proceedings of the IEEE/CVF …, 2021 - openaccess.thecvf.com

Optimizing the discriminator in Generative Adversarial Networks (GANs) to completion in the

inner training loop is computationally prohibitive, and on finite datasets would result in

overfitting. To address this, a common update strategy is to alternate between k optimization …

  Related articles 

 <——2021——2021——  220—— 


2021 SEE 2020    [PDF] arxiv.org

TextureWGAN: texture preserving WGAN with MLE regularizer for inverse problems

M Ikuta, J Zhang - Medical Imaging 2021: Image Processing, 2021 - spiedigitallibrary.org

Many algorithms and methods have been proposed for inverse problems particularly with

the recent surge of interest in machine learning and deep learning methods. Among all

proposed methods, the most popular and effective method is the convolutional neural …

  Related articles All 5 versions

WGAN  성능개선을 위한 효과적인 정칙항 제안

한희일 - 멀티미디어학회논문지, 2021 - dbpia.co.kr

A Wasserstein GAN (WGAN), optimum in terms of minimizing Wasserstein distance, still

suffers from inconsistent convergence or unexpected output due to inherent learning

instability. It is widely known some kinds of restriction on the discriminative function should …

[Korean: Proposal of an effective regularization term for improving the performance of WGAN]
Related articles
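
For context on the many WGAN entries in this list (standard background, not drawn from any single paper above): the Wasserstein GAN critic estimates the Kantorovich-Rubinstein dual of the 1-Wasserstein distance,

\[ W_1(\mathbb{P}_r,\mathbb{P}_g) \;=\; \sup_{\|f\|_L \le 1} \; \mathbb{E}_{x\sim\mathbb{P}_r}[f(x)] \;-\; \mathbb{E}_{x\sim\mathbb{P}_g}[f(x)], \]

so the "restriction on the discriminative function" mentioned above is the 1-Lipschitz constraint on the critic $f$, enforced in practice by weight clipping, gradient penalties or spectral normalization.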


[PDF] arxiv.org

Distributionally robust chance-constrained programs with right-hand side uncertainty under Wasserstein ambiguity

N Ho-Nguyen, F Kılınç-Karzan, S Küçükyavuz… - Mathematical …, 2021 - Springer

We consider exact deterministic mixed-integer programming (MIP) reformulations of

distributionally robust chance-constrained programs (DR-CCP) with random right-hand

sides over Wasserstein ambiguity sets. The existing MIP formulations are known to have …

  Cited by 5 Related articles All 5 versions
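
The "Wasserstein ambiguity set" appearing in this and several later entries is, in its generic form, the ball

\[ \mathcal{B}_{\varepsilon}(\widehat{\mathbb{P}}_N) \;=\; \{\, \mathbb{Q} : W_p(\mathbb{Q},\widehat{\mathbb{P}}_N) \le \varepsilon \,\} \]

around the empirical distribution $\widehat{\mathbb{P}}_N$ of the $N$ observed samples; a distributionally robust program then optimizes against the worst-case $\mathbb{Q}$ in this ball. (Generic definition given for reference only; the cited papers differ in the order $p$, the choice of radius $\varepsilon$ and the resulting reformulations.)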




[PDF] arxiv.org

Robust W-GAN-Based Estimation Under Wasserstein Contamination

Z Liu, PL Loh - arXiv preprint arXiv:2101.07969, 2021 - arxiv.org

Robust estimation is an important problem in statistics which aims at providing a reasonable

estimator when the data-generating distribution lies within an appropriately defined ball

around an uncontaminated distribution. Although minimax rates of estimation have been …

  All 2 versions 


2021




2021

Differential inclusions in Wasserstein spaces: The Cauchy-Lipschitz framework

B Bonnet, H Frankowska - Journal of Differential Equations, 2021 - Elsevier

In this article, we propose a general framework for the study of differential inclusions in the

Wasserstein space of probability measures. Based on earlier geometric insights on the

structure of continuity equations, we define solutions of differential inclusions as absolutely …

  Cited by 2 Related articles All 6 versions


[PDF] arxiv.org

Necessary Optimality Conditions for Optimal Control Problems in Wasserstein Spaces

B Bonnet, H Frankowska - arXiv preprint arXiv:2101.10668, 2021 - arxiv.org

In this article, we derive first-order necessary optimality conditions for a constrained optimal

control problem formulated in the Wasserstein space of probability measures. To this end,

we introduce a new notion of localised metric subdifferential for compactly supported …

  All 2 versions 

APPLIED MATHEMATICS AND OPTIMIZATION    Early Access: MAY 2021

[PDF] arxiv.org

Isometric Rigidity of compact Wasserstein spaces

J Santos-Rodríguez - arXiv preprint arXiv:2102.08725, 2021 - arxiv.org

Let $(X,d,\mathfrak{m})$ be a metric measure space. The study of the Wasserstein space $(\mathbb{P}_p(X),\mathbb{W}_p)$ associated to $X$ has proved useful in describing several geometrical properties of $X$. In this paper we focus on the study of isometries of …

  All 2 versions 


[PDF] arxiv.org

The isometry group of Wasserstein spaces: the Hilbertian case

GP Gehér, T Titkos, D Virosztek - arXiv preprint arXiv:2102.02037, 2021 - arxiv.org

Motivated by Kloeckner's result on the isometry group of the quadratic Wasserstein space $\mathcal{W}_2(\mathbb{R}^n)$, we describe the isometry group $\mathrm{Isom}(\mathcal{W}_p(E))$ for all parameters $0<p<\infty$ and for all separable …

The isometry group of Wasserstein spaces: the Hilbertian case

G Pál Gehér, T Titkos, D Virosztek - arXiv e-prints, 2021 - ui.adsabs.harvard.edu

Motivated by Kloeckner's result on the isometry group of the quadratic Wasserstein space $\mathcal{W}_2(\mathbb{R}^n)$, we describe the isometry group $\mathrm{Isom}(\mathcal{W}_p(E))$ for all parameters $0<p<\infty$ and for all separable …
 <——2021——2021——  230——

 2021 see 2020  Cover Image

Classification of atomic environments via the Gromov–Wasserstein distance

by Kawano, Sakura; Mason, Jeremy K

Computational materials science, 02/2021, Volume 188

[Display omitted] •Molecular dynamics simulations need automated methods to classify atomic structure.•Existing methods are restricted to simple compositions...

Article PDF Download PDF BrowZine PDF Icon

Journal ArticleFull Text Online

Cited by 2 Related articles All 8 versions 

 

Cover Image+

Data augmentation for rolling bearing fault diagnosis using an enhanced few-shot Wasserstein...

by Pei, Zeyu; Jiang, Hongkai; Li, Xingqiu ; More...

Measurement science & technology, 02/2021

Article PDF Download PDF BrowZine PDF Icon

Journal ArticleFull Text Online
Data augmentation for rolling bearing fault diagnosis using an enhanced few-shot Wasserstein auto-encoder with meta-learning

Z Pei, H Jiang, X Li, J Zhang, S Liu - Measurement Science and …, 2021 - iopscience.iop.org

13 days ago - Despite the advance of intelligent fault diagnosis for rolling bearings, in

industries, data-driven methods still suffer from data acquisition and imbalance. We propose

an enhanced few-shot Wasserstein auto-encoder (fs-WAE) to reverse the negative effect of …

 Related articles

additional files below  24  30 …

onlineVCover Image  OPEN ACCESS

CHICKN: extraction of peptide chromatographic elution profiles from large scale mass spectrometry data by means of Wasserstein...

by Permiakova, Olga; Guibert, Romain; Kraut, Alexandra ; More...

BMC bioinformatics, 02/2021, Volume 22, Issue 1

The clustering of data produced by liquid chromatography coupled to mass spectrometry analyses (LC-MS data) has recently gained interest to extract meaningful...

Article PDF Download PDF BrowZine PDF Icon

Journal ArticleFull Text Online


Cover Image

Short‐term railway passenger demand forecast using improved Wasserstein generative adversarial nets...

by Feng, Fenling; Zhang, Jiaqi; Liu, Chengguang ; More...

IET intelligent transport systems, 02/2021

Article PDF Download PDF via Unpaywall BrowZine PDF Icon

Journal ArticleCitation Online


online Cover Image

Local Stability of Wasserstein GANs With Abstract Gradient Penalty

by Kim, Cheolhyeong; Park, Seungtae; Hwang, Hyung Ju

IEEE transaction on neural networks and learning systems, 02/2021, Volume PP

The convergence of generative adversarial networks (GANs) has been studied substantially in various aspects to achieve successful generative tasks. Ever since...

Article PDF Download PDF BrowZine PDF Icon

Journal ArticleFull Text Online

Cited by 2 Related articles All 3 versions
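
Since several entries here concern gradient penalties for WGANs, the following is a minimal, self-contained PyTorch sketch of the standard (Gulrajani-style) gradient penalty; it is illustrative only and is not the abstract gradient penalty analysed in the paper above. The critic D, the tensors real and fake, and the weight lambda_gp are placeholders.

import torch

def gradient_penalty(D, real, fake, lambda_gp=10.0):
    # Interpolate between real and generated samples.
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    x_hat = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    d_hat = D(x_hat)
    # Gradient of the critic output with respect to the interpolates.
    grads, = torch.autograd.grad(outputs=d_hat, inputs=x_hat,
                                 grad_outputs=torch.ones_like(d_hat),
                                 create_graph=True, retain_graph=True)
    grad_norm = grads.flatten(start_dim=1).norm(2, dim=1)
    # Penalize deviation of the gradient norm from 1 (the Lipschitz target).
    return lambda_gp * ((grad_norm - 1.0) ** 2).mean()

# Toy usage with a linear critic standing in for a real network.
D = torch.nn.Linear(8, 1)
real, fake = torch.randn(4, 8), torch.randn(4, 8)
print(gradient_penalty(D, real, fake))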

online OPEN ACCESS

Wasserstein Proximal of GANs

by Lin, Alex Tong; Li, Wuchen; Osher, Stanley ; More...

02/2021

We introduce a new method for training generative adversarial networks by applying the Wasserstein-2 metric proximal on the generators. The approach is based...

Journal ArticleFull Text Online

Cited by 23 Related articles All 8 versions

Zbl 07495252


online  OPEN ACCESS

Isometric Rigidity of compact Wasserstein spaces

by Santos-Rodríguez, Jaime

02/2021

Let $(X,d,\mathfrak{m})$ be a metric measure space. The study of the Wasserstein space $(\mathbb{P}_p(X),\mathbb{W}_p)$ associated to $X$ has proved useful in...

Journal ArticleFull Text Online

 Related articles All 3 versions

 

online OPEN ACCESS

Unsupervised Ground Metric Learning using Wasserstein Eigenvectors

by Huizing, Geert-Jan; Cantini, Laura; Peyré, Gabriel

02/2021

Optimal Transport (OT) defines geometrically meaningful "Wasserstein" distances, used in machine learning applications to compare probability distributions....

Journal ArticleFull Text Online

 

onlineB OPEN ACCESS

Two-sample Test with Kernel Projected Wasserstein Distance

by Wang, Jie; Gao, Rui; Xie, Yao

02/2021

We develop a kernel projected Wasserstein distance for the two-sample test, an essential building block in statistics and machine learning: given two sets of...

Journal ArticleFull Text Online

 

onlineV OPEN ACCESS

Projected Wasserstein gradient descent for high-dimensional Bayesian inference

by Wang, Yifei; Chen, Peng; Li, Wuchen

02/2021

We propose a projected Wasserstein gradient descent method (pWGD) for high-dimensional Bayesian inference problems. The underlying density function of a...

Journal ArticleFull Text Online

 Cited by 3 Related articles All 4 versions

<——2021——2021——  240——

 

onlineB OPEN ACCESS

A Mathematical Principle of Deep Learning: Learn the Geodesic Curve in the Wasserstein Space

by Gai, Kuo; Zhang, Shihua

02/2021

Recent studies revealed the mathematical connection of deep neural network (DNN) and dynamic system. However, the fundamental principle of DNN has not been...

Journal ArticleFull Text Online
Learn the Geodesic Curve in the Wasserstein Space - arXiv


OPEN ACCESS

Additional file 7 of CHICKN: extraction of peptide chromatographic elution profiles from large scale mass spectrometry data by means of Wasserstein...

by Permiakova, Olga; Guibert, Romain; Kraut, Alexandra ; More...

02/2021

Additional file 7: Influence of k on the execution time of CHICKN. Figure depicting CHICKN execution time as a function of k, the number of clusters at each...

ImageCitation Online


 OPEN ACCESS

Additional file 10 of CHICKN: extraction of peptide chromatographic elution profiles from large scale mass spectrometry data by means of Wasserstein...

by Permiakova, Olga; Guibert, Romain; Kraut, Alexandra ; More...

02/2021

Additional file 10: Differently charged ions of a same peptide tend to cluster together. A subset of clusters was manually inspected so as to label as many...

ImageCitation Online

 

 

 OPEN ACCESS

Additional file 11 of CHICKN: extraction of peptide chromatographic elution profiles from large scale mass spectrometry data by means of Wasserstein...

by Permiakova, Olga; Guibert, Romain; Kraut, Alexandra ; More...

02/2021

Additional file 11: Cluster size distribution. Histograms of the cluster size distribution resulting from the application of CHICKN on each of the three...

ImageCitation Online

 


A Recommender System Based on Model Regularization Wasserstein Generative Adversarial Network. Authors: Qingxian Wang; Qing Huang; Kangkang Ma; Xuerui Zhang. 2021 IEEE International Conference on Systems, Man, and Cybernetics (SMC)
Summary: A recommender system (RS) commonly adopts a high-dimensional and sparse (HiDS) matrix to describe user-item preferences. Collaborative Filtering (CF)-based models have been widely adopted to address such an HiDS matrix. However, a CF-based model is unable to learn the property distribution characteristic of users' preferences from an HiDS matrix, so its representation ability is limited. To address this issue, this paper proposes a Model Regularization Wasserstein GAN (MRWGAN) to extract the distribution of users' preferences. Its main ideas are two-fold: a) adopting an auto-encoder to implement the generator model of the GAN; b) proposing a model-regularized Wasserstein distance as an objective function for training the GAN model. Empirical studies on four HiDS matrices from industrial applications demonstrate that, compared with state-of-the-art models, the proposed model achieves higher prediction accuracy for the missing data of an HiDS matrix.
Chapter, 2021
Publication:2021 IEEE International Conference on Systems, Man, and Cybernetics (SMC), 20211017, 2043
Publisher:2021

2021

 OPEN ACCESS

Additional file 8 of CHICKN: extraction of peptide chromatographic elution profiles from large scale mass spectrometry data by means of Wasserstein...

by Permiakova, Olga; Guibert, Romain; Kraut, Alexandra ; More...

02/2021

Additional file 8: Influence of ktotal on the execution time of CHICKN. Figure depicting CHICKN execution time as a function of ktotal, the maximum number of...

ImageCitation Online


Covariance Steering of Discrete-Time Stochastic Linear Systems Based on Wasserstein Distance Terminal Cost
Authors: Isin M. Balci; Efstathios Bakolas
Article, 2021
Publication:IEEE control systems letters, 5, 2021, 2000
Publisher:2021

2021  see 2022
Stable Parallel Training of Wasserstein Conditional Generative Adversarial Neural...
by Pasini, Massimiliano Lupo; Yin, Junqi
2021 International Conference on Computational Science and Computational Intelligence (CSCI), 12/2021
We use a stable parallel approach to train Wasserstein Conditional Generative Adversarial Neural Networks (W-CGANs). The parallel training reduces the risk of...
Conference Proceeding  Full Text Online
   

A unified framework for non-negative matrix and tensor factorisations with a smoothed Wasserstein loss. Authors: Stephen Y. Zhang. 2021 IEEE/CVF International Conference on Computer Vision Workshops (ICCVW)
Summary: Non-negative matrix and tensor factorisations are a classical tool for finding low-dimensional representations of high-dimensional datasets. In applications such as imaging, datasets can be regarded as distributions supported on a space with metric structure. In such a setting, a loss function based on the Wasserstein distance of optimal transportation theory is a natural choice, since it incorporates the underlying geometry of the data. We introduce a general mathematical framework for computing non-negative factorisations of both matrices and tensors with respect to an optimal transport loss. We derive an efficient computational method for its solution using a convex dual formulation, and demonstrate the applicability of this approach with several numerical illustrations with both matrix- and tensor-valued data.
Chapter, 2021
Publication:2021 IEEE/CVF International Conference on Computer Vision Workshops (ICCVW), 202110, 4178
Publisher:2021


2021

[PDF] Wasserstein barycenters can be computed in polynomial time in fixed dimension

JM Altschuler, E Boix-Adsera - Journal of Machine Learning Research, 2021 - jmlr.org

Computing Wasserstein barycenters is a fundamental geometric problem with widespread

applications in machine learning, statistics, and computer graphics. However, it is unknown

whether Wasserstein barycenters can be computed in polynomial time, either exactly or to …

  All 14 versions 


2021 see 2022  [PDF] arxiv.org

Wasserstein barycenters are NP-hard to compute

JM Altschuler, E Boix-Adsera - arXiv preprint arXiv:2101.01100, 2021 - arxiv.org

The problem of computing Wasserstein barycenters (aka Optimal Transport barycenters) has

attracted considerable recent attention due to many applications in data science. While there

exist polynomial-time algorithms in any fixed dimension, all known runtimes suffer …

   Related articles All 2 versions 

2021  

arXiv:2102.12736  [pdf, other]  stat.ML  cs.LG
Time-Series Imputation with Wasserstein Interpolation for Optimal Look-Ahead-Bias and Variance Tradeoff
Authors: Jose Blanchet; Fernando Hernandez; Viet Anh Nguyen; Markus Pelger; Xuhui Zhang
Abstract: Missing time-series data is a prevalent practical problem. Imputation methods in time-series data often are applied to the full panel data with the purpose of training a model for a downstream out-of-sample task. For example, in finance, imputation of missing returns may be applied prior to training a portfolio optimization model. Unfortunately, this practice may result in a look-ahead-bias in the…  More
Submitted 25 February, 2021; originally announced February 2021.

Cited by 2 Related articles All 3 versions 

[PDF] mdpi.com

Unpaired Image Denoising via Wasserstein GAN in Low-Dose CT Image with Multi-Perceptual Loss and Fidelity Loss

Z Yin, K Xia, Z He, J Zhang, S Wang, B Zu - Symmetry, 2021 - mdpi.com

The use of low-dose computed tomography (LDCT) in medical practice can effectively

reduce the radiation risk of patients, but it may increase noise and artefacts, which can

compromise diagnostic information. The methods based on deep learning can effectively …

   Related articles 

Unpaired Image Denoising via Wasserstein GAN in Low-Dose CT Image with Multi-Perceptual Loss and Fidelity Loss. Symmetry 2021, 13, 126

Z Yin, K Xia, Z He, J Zhang, S Wang, B Zu - 2021 - search.proquest.com

The use of low-dose computed tomography (LDCT) in medical practice can effectively

reduce the radiation risk of patients, but it may increase noise and artefacts, which can

compromise diagnostic information. The methods based on deep learning can effectively …

 

arXiv:2102.12715  [pdf, other]  eess.SY  math.OC
Distributional robustness in minimax linear quadratic control with Wasserstein distance
Authors: Kihyun Kim; Insoon Yang
Abstract: To address the issue of inaccurate distributions in practical stochastic systems, a minimax linear-quadratic control method is proposed using the Wasserstein metric. Our method aims to construct a control policy that is robust against errors in an empirical distribution of underlying uncertainty, by adopting an adversary that selects the worst-case distribution. The opponent receives a Wasserstein…  More
Submitted 25 February, 2021; originally announced February 2021.
Comments: arXiv admin note: text overlap with arXiv:2003.13258
Cited by 4
 Related articles All 2 versions 

arXiv:2102.12178  [pdf, other]  cs.LG  stat.ML
Learning to Generate Wasserstein Barycenters
Authors: Julien Lacombe; Julie Digne; Nicolas Courty; Nicolas Bonneel
Abstract: Optimal transport is a notoriously difficult problem to solve numerically, with current approaches often remaining intractable for very large scale applications such as those encountered in machine learning. Wasserstein barycenters -- the problem of finding measures in-between given input measures in the optimal transport sense -- is even more computationally demanding as it requires to solve an o…  More
Submitted 24 February, 2021; originally announced February 2021.
Comments: 18 pages, 16 figures, submitted to the Machine Learning journal (Springer)
ACM Class: I.2.6; I.4.9; G.2.1; G.3; I.3.3
 Related articles
 All 6 versions 

 Learning to generate Wasserstein barycenters

0 citations*

2021 ARXIV: LEARNING

Julien Lacombe (1), Julie Digne (2), Nicolas Courty (3), Nicolas Bonneel (2)

(1) Institut national des sciences appliquées de Lyon, (2) Centre national de la recherche scientifique, (3) IRISA

Convolutional neural network

Constructive solid geometry


Optimal transport is a notoriously difficult problem to solve numerically, with current approaches often remaining intractable for very large scale applications such as those encountered in machine learning. Wasserstein barycenters -- the problem of finding measures in-between given input measures in the optimal transport sense …


arXiv:2102.11524  [pdf, other]  hep-ex  doi: 10.1007/s40042-021-00095-1
A Data-driven Event Generator for Hadron Colliders using Wasserstein Generative Adversarial Network
Authors: Suyong Choi; Jae Hoon Lim
Abstract: Highly reliable Monte-Carlo event generators and detector simulation programs are important for the precision measurement in the high energy physics. Huge amounts of computing resources are required to produce a sufficient number of simulated events. Moreover, simulation parameters have to be fine-tuned to reproduce situations in the high energy particle interactions which is not trivial in some p…  More
Submitted 23 February, 2021; originally announced February 2021.
Comments: To appear in Journal of the Korean Physical Society
   Related articles All 5 versions

A data-driven event generator for Hadron Colliders using Wasserstein Generative Adversarial Network
Choi Suyong; Lim, Jae Hoon. Journal of the Korean Physical Society; Heidelberg Vol. 78, Iss. 6,  (2021): 482-489.

Abstract/Details Get full textLink to external site, this link will open in a new window
Cited by 3 Related articles All 5 versions

arXiv:2102.10943  [pdf, ps, other]  math.PR
On Number of Particles in Coalescing-Fragmentating Wasserstein Dynamics
Authors: Vitalii Konarovskyi
Abstract: Because of the sticky-reflected interaction in coalescing-fragmentating Wasserstein dynamics, the model always consists of a finite number of distinct particles for almost all times. We show that the interacting particle system must admit an infinite number of distinct particles on a dense subset of the time interval if and only if the space generated by the interaction potential is infinite-dimen…  More
Submitted 22 February, 2021; originally announced February 2021.
MSC Class: 60K35; 60H05; 60H05; 60G44

Related articles All 4 versions 
<——2021——2021——  260——


 
2021 [PDF] openreview.net

[PDF] GROMOV WASSERSTEIN

K Nguyen, S Nguyen, N Ho, T Pham, H Bui - openreview.net

5 days ago - Relational regularized autoencoder (RAE) is a framework to learn the

distribution of data by minimizing a reconstruction loss together with a relational

regularization on the latent space. A recent attempt to reduce the inner discrepancy between …


Local Stability of Wasserstein GANs With Abstract Gradient Penalty.

C Kim, S Park, HJ Hwang - IEEE Transactions on Neural Networks …, 2021 - europepmc.org

7 days ago - The convergence of generative adversarial networks (GANs) has been studied

substantially in various aspects to achieve successful generative tasks. Ever since it is first

proposed, the idea has achieved many theoretical improvements by injecting an instance …

  Related articles All 3 versions

IEEE transactions on neural networks and learning systems  Volume: ‏ PP     Published: ‏ 2021-Feb-19 (Epub 2021 Feb 19)
Related articles
 All 3 versions

MR4476561 

NEWSLETTER ARTICLE

New Climate Modeling Study Findings Reported from University of Hamburg (Evaluating the Performance of Climate Models Based On Wasserstein Distance)

Global Warming Focus, 2021, p.148

New Climate Modeling Study Findings Reported from University of Hamburg (Evaluating the Performance of Climate Models Based On Wasserstein Distance)

Available Online 

arXiv:2106.01954  [pdf, other]  cs.LG
Do Neural Optimal Transport Solvers Work? A Continuous Wasserstein-2 Benchmark
Authors: Alexander Korotin; Lingxiao Li; Aude Genevay; Justin Solomon; Alexander Filippov; Evgeny Burnaev
Abstract: Despite the recent popularity of neural network-based solvers for optimal transport (OT), there is no standard quantitative way to evaluate their performance. In this paper, we address this issue for quadratic-cost transport -- specifically, computation of the Wasserstein-2 distance, a commonly-used formulation of optimal transport in machine learning. To overcome the challenge of computing ground…  More
Submitted 3 June, 2021; originally announced June 2021.

 

[PDF] wiley.com Full View

Short‐term railway passenger demand forecast using improved Wasserstein generative adversarial nets and web search terms

F Feng, J Zhang, C Liu, W Li… - IET Intelligent Transport …, 2021 - Wiley Online Library

14 days ago - Accurately predicting railway passenger demand is conducive for managers to

quickly adjust strategies. It is time‐consuming and expensive to collect large‐scale traffic

data. With the digitization of railway tickets, a large amount of user data has been …

 Cover Image  OPEN ACCESS

Short‐term railway passenger demand forecast using improved Wasserstein generative adversarial nets...

by Feng, Fenling; Zhang, Jiaqi; Liu, Chengguang ; More...

IET intelligent transport systems, 03/2021, Volume 15, Issue 3

Accurately predicting railway passenger demand is conducive for managers to quickly adjust strategies. It is time‐consuming and expensive to collect...

Article PDF Download PDF via Unpaywall BrowZine PDF Icon

Journal ArticleCitation Online

 [PDF] wiley.com  Full View

Short‐term railway passenger demand forecast using improved Wasserstein generative adversarial nets and web search terms

F Feng, J Zhang, C Liu, W Li… - IET Intelligent Transport …, 2021 - Wiley Online Library

Accurately predicting railway passenger demand is conducive for managers to quickly

adjust strategies. It is time‐consuming and expensive to collect large‐scale traffic data. With

the digitization of railway tickets, a large amount of user data has been accumulated. We …

Related articles All 2 versions 

[PDF] mlr.press

Exploring the Wasserstein metric for time-to-event analysis

T Sylvain, M Luck, J Cohen… - Survival Prediction …, 2021 - proceedings.mlr.press

… (2016) and Beckham and Pal (2017) apply a Wasserstein metric for … In the case of distributions

of discrete supports (histograms of class probabilities), this is computed by moving … of the structure

of the space of values considered, eg, the 1-dimensional real-valued time axis, so …

Wasserstein proximal of GANs

AT Lin, W Li, S Osher, G Montúfar - arXiv preprint arXiv:2102.06862, 2021 - arxiv.org

We introduce a new method for training generative adversarial networks by applying the

Wasserstein-2 metric proximal on the generators. The approach is based on Wasserstein

information geometry. It defines a parametrization invariant natural gradient by pulling back …

  Cited by 7 Related articles All 4 versions 


Local Stability of Wasserstein GANs With Abstract Gradient Penalty.

C Kim, S Park, HJ Hwang - IEEE Transactions on Neural Networks …, 2021 - europepmc.org

The convergence of generative adversarial networks (GANs) has been studied substantially

in various aspects to achieve successful generative tasks. Ever since it is first proposed, the

idea has achieved many theoretical improvements by injecting an instance noise, choosing …

  All 2 versions 


PDF] arxiv.org

Unsupervised Ground Metric Learning using Wasserstein Eigenvectors

GJ Huizing, L Cantini, G Peyré - arXiv preprint arXiv:2102.06278, 2021 - arxiv.org

Optimal Transport (OT) defines geometrically meaningful" Wasserstein" distances, used in

machine learning applications to compare probability distributions. However, a key

bottleneck is the design of a" ground" cost which should be adapted to the task under study …

 Cited by 1 Related articles All 3 versions 

[PDF] ras.ru

[PDF] Обзор современных инструментальных средств и методов для автоматизации процессов сбора и анализа геопространственных данных

АИ Рогачев, АИ Газов, АА Кузмицкий, ДВ Федосенко… - ditc.ras.ru

… Литература 1. Arjovsky, M., Chintala, S., and Bottou, L., “Wasserstein GAN”, arXiv e- prints. –

2017. 2. Goodfellow, IJ, Pouget-Abadie, J., Mirza, M., Xu, B., Warde-Farley, D., Ozair, S.,

Courville, AC, & Bengio, Y. Generative Adversarial Networks. – 2014 …

[Russian: A review of modern software tools and methods for automating the collection and analysis of geospatial data]

<——2021——2021——  270——

[PDF] lut.edu.cn

含分布式储能的主动配电网鲁棒优化经济调度方法

陆玉姣, 游青山 - 兰州理工大学学报 - journal.lut.edu.cn

文章编号: 1673-5196(2020)06-0112-07; 作者单位: 重庆工程职业技术学院 (重庆), 河海大学能源与电气学院 (江苏南京) …

Related articles 

[Chinese: Robust optimization economic dispatch method for active distribution networks with distributed energy storage — Lu Yujiao, You Qingshan, Journal of Lanzhou University of Technology (journal.lut.edu.cn)]


[PDF] arxiv.org

Gradient flow formulation of diffusion equations in the Wasserstein space over a metric graph

M Erbar, D Forkert, J Maas, D Mugnolo - arXiv preprint arXiv:2105.05677, 2021 - arxiv.org

… –Brenier formula for the Wasserstein distance, which … as gradient flow of the free energy

in the Wasserstein space of … -convexity of entropy functionals in the Wasserstein space. …

Cited by 3 Related articles All 3 versions 

[PDF] openreview.net

[PDF] Functional Priors for Bayesian Neural Networks through Wasserstein Distance Minimization to Gaussian Processes

BH Tran, D Milios, S Rossi, M Filippone - openreview.net

The Bayesian treatment of neural networks dictates that a prior distribution is considered

over the weight and bias parameters of the network. The non-linear nature of the model

implies that any distribution of the parameters has an unpredictable effect on the distribution …

  Related articles 


 MR4216550 Prelim Cheng, Li-juan; Thalmaier, Anton; Zhang, Shao-qin; Exponential contraction in Wasserstein distance on static and evolving manifolds. Rev. Roumaine Math. Pures Appl. 66 (2021), no. 1, 107–129. 60 (53E20 58J65)

Review PDF Clipboard Journal Article


 

MR4213934 Prelim Cherukuri, Ashish; Hota, Ashish R.; Consistency of distributionally robust risk- and chance-constrained optimization under Wasserstein ambiguity sets. IEEE Control Syst. Lett. 5 (2021), no. 5, 1729–1734. 90

Consistency of Distributionally Robust Risk- and Chance-Constrained Optimization Under Wasserstein Ambiguity Sets

Review PDF Clipboard Journal Article

By: Cherukuri, Ashish; Hota, Ashish R.

IEEE CONTROL SYSTEMS LETTERS  Volume: ‏ 5   Issue: ‏ 5   Pages: ‏ 1729-1734   Published: ‏ NOV 2021
2021 see 2020
Conference Paper  Citation/Abstract

Consistency of Distributionally Robust Risk- and Chance-Constrained Optimization under Wasserstein Ambiguity Sets
Cherukuri, Ashish; Hota, Ashish R.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).

Abstract/Details Get full textLink to external site, this link will open in a new window
Show Abstract 


2021

2021 patent

Apparatus for measuring distance between two signal distributions, has measurement network that receives two received signal distributions as input and outputs a measurement of the Wasserstein distance between the distributions in the receiver domain

Patent Number: WO2021022850-A1

Patent Assignee: HUAWEI TECHNOLOGIES CO LTD

Inventor(s): GE Y; SHI W; TONG W.


WSGeometry: Compute Wasserstein Barycenters, Geodesics, PCA and Distances

By: Heinemann, Florian

Comprehensive R Archive Network

Source URL: ‏ https://CRAN.R-project.org/package=WSGeometry

Document Type: Software

View Data  View Abstract


2021 see 2020

node2coords: Graph Representation Learning with Wasserstein Barycenters

By: Simou, Effrosyni; Thanou, Dorina; Frossard, Pascal

IEEE TRANSACTIONS ON SIGNAL AND INFORMATION PROCESSING OVER NETWORKS  Volume: ‏ 7   Pages: ‏ 17-29   Published: ‏ 2021

  View Abstract


Source code in support of "Segmentation analysis and the recovery of queuing parameters via the Wasserstein distance: a study of administrative data for patients with chronic obstructive pulmonary disease"

By: Wilde, Henry; Knight, Vincent; Gillard, Jonathan

Zenodo

DOI: ‏ http://dx.doi.org.ezaccess.libraries.psu.edu/10.5281/ZENODO.4457902

Document Type: Software

 View Abstract


[PDF] Wasserstein barycenters can be computed in polynomial time in fixed dimension

JM Altschuler, E Boix-Adsera - Journal of Machine Learning Research, 2021 - jmlr.org

Computing Wasserstein barycenters is a fundamental geometric problem with widespread

applications in machine learning, statistics, and computer graphics. However, it is unknown

whether Wasserstein barycenters can be computed in polynomial time, either exactly or to …

  All 14 versions 

MR4253737

<——2021——2021——  280—— 


[PDF] arxiv.org

Continual Learning of Generative Models with Limited Data: From Wasserstein-1 Barycenter to Adaptive Coalescence

M Dedeoglu, S Lin, Z Zhang, J Zhang - arXiv preprint arXiv:2101.09225, 2021 - arxiv.org

Learning generative models is challenging for a network edge node with limited data and

computing power. Since tasks in similar environments share model similarity, it is plausible

to leverage pre-trained generative models from the cloud or other edge nodes. Appealing to …

  All 2 versions 


[PDF] arxiv.org

Learning to Generate Wasserstein Barycenters

J Lacombe, J Digne, N Courty, N Bonneel - arXiv preprint arXiv …, 2021 - arxiv.org

Optimal transport is a notoriously difficult problem to solve numerically, with current approaches often remaining intractable for very large scale applications such as those encountered in machine learning. Wasserstein barycenters -- the problem of finding …

  All 2 versions 


[HTML] springer.com

[HTML] Quantum statistical learning via Quantum Wasserstein natural gradient

S Becker, W Li - Journal of Statistical Physics, 2021 - Springer

In this article, we introduce a new approach towards the statistical learning problem $\mathrm{argmin}_{\rho(\theta)\in\mathcal{P}_{\theta}} W_{Q}^{2}(\rho_{\star},\rho(\theta))$ to approximate a target quantum state $\rho_{\star}$ by a set of …

   Related articles All 5 versions


[PDF] arxiv.org

Learning High Dimensional Wasserstein Geodesics

S Liu, S Ma, Y Chen, H Zha, H Zhou - arXiv preprint arXiv:2102.02992, 2021 - arxiv.org

We propose a new formulation and learning strategy for computing the Wasserstein

geodesic between two probability distributions in high dimensions. By applying the method

of Lagrange multipliers to the dynamic formulation of the optimal transport (OT) problem, we …

  All 2 versions 

[PDF] arxiv.org

A Mathematical Principle of Deep Learning: Learn the Geodesic Curve in the Wasserstein Space

K Gai, S Zhang - arXiv preprint arXiv:2102.09235, 2021 - arxiv.org

Recent studies revealed the mathematical connection of deep neural network (DNN) and

dynamic system. However, the fundamental principle of DNN has not been fully

characterized with dynamic system in terms of optimization and generalization. To this end …

  All 2 versions 


2021



 2021 see 2022
Rethinking Rotated Object Detection with Gaussian Wasserstein Distance Loss
by Yang, XueYan, JunchiMing, Qi ; More...
01/2021
Boundary discontinuity and its inconsistency to the final detection metric have been the bottleneck for rotating detection regression loss design. In this...
Journal Article  Full Text Online


[PDF] arxiv.org

Unsupervised Ground Metric Learning using Wasserstein Eigenvectors

GJ Huizing, L Cantini, G Peyré - arXiv preprint arXiv:2102.06278, 2021 - arxiv.org

Optimal Transport (OT) defines geometrically meaningful" Wasserstein" distances, used in

machine learning applications to compare probability distributions. However, a key

bottleneck is the design of a" ground" cost which should be adapted to the task under study …

  All 2 versions 


Multivariate goodness-of-fit tests based on Wasserstein distance

M Hallin, G Mordant, J Segers - Electronic Journal of Statistics, 2021 - projecteuclid.org

Goodness-of-fit tests based on the empirical Wasserstein distance are proposed for simple

and composite null hypotheses involving general multivariate distributions. For group

families, the procedure is to be implemented after preliminary reduction of the data via …

 Cited by 7 Related articles All 15 versions

[PDF] iop.org

Optimization Research on Abnormal Diagnosis of Transformer Voiceprint Recognition based on Improved Wasserstein GAN

K Zhu, H Ma, J Wang, C Yu, C Guo… - Journal of Physics …, 2021 - iopscience.iop.org

Transformer is an important infrastructure equipment of power system, and fault monitoring

is of great significance to its operation and maintenance, which has received wide attention

and much research. However, the existing methods at home and abroad are based on …

  All 2 versions


A data-driven event generator for Hadron Colliders using Wasserstein Generative Adversarial Network

S Choi, JH Lim - Journal of the Korean Physical Society, 2021 - Springer

Abstract Highly reliable Monte-Carlo event generators and detector simulation programs are

important for the precision measurement in the high energy physics. Huge amounts of

computing resources are required to produce a sufficient number of simulated events …

  All 3 versions

<——2021————2021———290——   


Learning High Dimensional Wasserstein Geodesics

S Liu, S Ma, Y Chen, H Zha, H Zhou - arXiv preprint arXiv:2102.02992, 2021 - arxiv.org

We propose a new formulation and learning strategy for computing the Wasserstein

geodesic between two probability distributions in high dimensions. By applying the method

of Lagrange multipliers to the dynamic formulation of the optimal transport (OT) problem, we …

  All 2 versions 


online  Cover Image

Classification of atomic environments via the Gromov–Wasserstein distance

by Kawano, Sakura; Mason, Jeremy K

Computational materials science, 02/2021, Volume 188

[Display omitted] •Molecular dynamics simulations need automated methods to classify atomic structure.•Existing methods are restricted to simple compositions...

Article PDF Download PDF BrowZine PDF Icon

Journal ArticleFull Text Online

  Cited by 2 Related articles All 8 versions


Unpaired Single-Image Depth Synthesis with cycle-consistent Wasserstein GANs

C Angermann, A Moravová, M Haltmeier… - arXiv preprint arXiv …, 2021 - arxiv.org

Real-time estimation of actual environment depth is an essential module for various

autonomous system tasks such as localization, obstacle detection and pose estimation.

During the last decade of machine learning, extensive deployment of deep learning …

  All 2 versions 

 

online Cover Image

An inexact PAM method for computing Wasserstein barycenter with unknown supports

by Qian, Yitian; Pan, Shaohua

Computational & applied mathematics, 03/2021, Volume 40, Issue 2

Wasserstein barycenter is the centroid of a collection of discrete probability distributions which minimizes the average of the 2-Wasserstein distance. This...

Article PDF Download PDF BrowZine PDF Icon

Journal ArticleFull Text Online

 

2021  [PDF] archives-ouvertes.fr

Finite volume approximation of optimal transport and Wasserstein gradient flows

G Todeschi - 2021 - hal.archives-ouvertes.fr

… If the space has sufficient differential structure, as it is the case for the Wasserstein space, … 

initial condition ρ0, we can characterize a Wasserstein gradient flow with respect to E as the …

 Related articles All 14 versions


2021

 

online  OPEN ACCESS

Learning to Generate Wasserstein Barycenters

by Lacombe, Julien; Digne, Julie; Courty, Nicolas ; More...

02/2021

Optimal transport is a notoriously difficult problem to solve numerically, with current approaches often remaining intractable for very large scale...

Journal ArticleFull Text Online

 

 2021 online

Graph Diffusion Wasserstein Distances

by Barbe, Amélie; Sebban, Marc; Gonçalves, Paulo ; More...

Machine Learning and Knowledge Discovery in Databases, 02/2021

Optimal Transport (OT) for structured data has received much attention in the machine learning community, especially for addressing graph classification or...

Book ChapterFull Text Online

 

Peer-reviewed
Dimension-free Wasserstein contraction of nonlinear filters
Author:Nick Whiteley
Summary: For a class of partially observed diffusions, conditions are given for the map from the initial condition of the signal to the filtering distribution to be contractive with respect to Wasserstein distances, with a rate which does not necessarily depend on the dimension of the state-space. The main assumptions are that the signal has affine drift and constant diffusion coefficient and that the likelihood functions are log-concave. Ergodic and nonergodic signals are handled in a single framework. Examples include linear-Gaussian, stochastic volatility, neural spike-train and dynamic generalized linear models. For these examples filter stability can be established without any assumptions on the observations.
Article, 2021
Publication:Stochastic Processes and their Applications, 135, 202105, 31
Publisher:2021

online  OPEN ACCESS

Distributional robustness in minimax linear quadratic control with Wasserstein distance

by Kim, Kihyun; Yang, Insoon

02/2021

To address the issue of inaccurate distributions in practical stochastic systems, a minimax linear-quadratic control method is proposed using the Wasserstein...

Journal ArticleFull Text Online

 Cited by 4 Related articles All 2 versions
 

online OPEN ACCESS

On Number of Particles in Coalescing-Fragmentating Wasserstein Dynamics

by Konarovskyi, Vitalii

02/2021

Because of the sticky-reflected interaction in coalescing-fragmentating Wasserstein dynamics, the model always consists of a finite number of distinct...

Journal ArticleFull Text Online

 Related articles All 4 versions 

<——2021————2021———300——


online

Channel Pruning for Accelerating Convolutional Neural Networks via Wasserstein Metric

by Duan, Haoran; Li, Hui

Computer Vision – ACCV 2020, 02/2021

Channel pruning is an effective way to accelerate deep convolutional neural networks. However, it is still a challenge to reduce the computational complexity...

Book ChapterFull Text Online

 Related articles

 

online

Researchers at East China Normal University Have Reported New Data on Landscape Ecology (Wasse...

Ecology, Environment & Conservation, 02/2021

NewsletterFull Text Online

online  OPEN ACCESS

Time-Series Imputation with Wasserstein Interpolation for Optimal Look-Ahead-Bias and...

by Blanchet, Jose; Hernandez, Fernando; Nguyen, Viet Anh ; More...

02/2021

Missing time-series data is a prevalent practical problem. Imputation methods in time-series data often are applied to the full panel data with the purpose of...

Journal ArticleFull Text Online

  Cited by 2 Related articles All 3 versions

Findings from University of Cambridge Update Understanding of Statistical Physics (Quantum Statistical Learning Via Quantum Wasserstein...

Journal of Physics Research, 02/2021

NewsletterCitation Online
(02/23/2021). "Findings from University of Cambridge Update Understanding of Statistical Physics (Quantum Statistical Learning Via Quantum Wasserstein Natural Gradient)". Journal of Physics Research (1945-8193), p. 194.


Becker, Simon; Li, Wuchen

Quantum statistical learning via quantum Wasserstein natural gradient. (English) Zbl 07325971

J. Stat. Phys. 182, No. 1, Paper No. 7, 26 p. (2021).

MSC:  60K35 47D03

PDF BibTeX XML Cite Zbl 07325971

[HTML] springer.com

[HTML] Quantum statistical learning via Quantum Wasserstein natural gradient

S Becker, W Li - Journal of Statistical Physics, 2021 - Springer

… quantum systems, satisfying a detailed balance condition. In these articles, they showed that such open quantum systems also evolve according to the $L^2$-Wasserstein gradient flow. Moreover, they also showed that their …

Cited by 3 Related articles All 10 versions

Asymptotics of smoothed Wasserstein distances

HB Chen, J Niles-Weed - Potential Analysis, 2021 - Springer

We investigate contraction of the Wasserstein distances on $\mathbb{R}^d$ under Gaussian smoothing. It is well known that the heat semigroup is exponentially contractive with respect to the Wasserstein distances on manifolds of positive curvature; however, on flat …

Cited by 4 Related articles All 5 versions

Reports on Potential Analysis from New York University Provide New Insights 

(Asymptotics of Smoothed Wasserstein Distances)

Mathematics Week, 03/2021

NewsletterCitation Online

 

Studies from University Grenoble Alpes Reveal New Findings on Bioinformatics (CHICKN: extraction of peptide chromatographic elution profiles from large scale mass spectrometry data by means of Wasserstein...

Biotech Week, 03/2021


 

Wasserstein distance, Fourier series and applications

S Steinerberger - Monatshefte für Mathematik, 2021 - Springer

We study the Wasserstein metric $W_p$, a notion of distance between two probability distributions, from the perspective of Fourier Analysis and discuss applications. In particular, we bound the Earth Mover Distance $W_1$ between the distribution of quadratic residues in …

  5 Related articles All 3 versions
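
The one-dimensional Earth Mover Distance $W_1$ bounded in the paper above has a simple empirical counterpart; the short Python snippet below (a toy illustration, unrelated to the paper's number-theoretic application) computes it between two samples with SciPy.

import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=1000)   # sample from N(0, 1)
y = rng.normal(0.5, 1.2, size=1000)   # sample from N(0.5, 1.2^2)
# In 1-D, W_1 equals the L1 distance between the two quantile functions.
print(wasserstein_distance(x, y))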


[PDF] arxiv.org

Closed‐form expressions for maximum mean discrepancy with applications to Wasserstein auto‐encoders

RM Rustamov - Stat, 2021 - Wiley Online Library

The maximum mean discrepancy (MMD) has found numerous applications in statistics and

machine learning, among which is its use as a penalty in the Wasserstein auto‐encoder

(WAE). In this paper, we compute closed‐form expressions for estimating the Gaussian …

  Cited by 5 Related articles All 3 versions

[PS] uc.pt

[PS] … ), 301-316.[24] Weed, J., & Bach, F.(2019). Sharp asymptotic and finite-sample rates of convergence of empirical measures in Wasserstein distance. Bernoulli 25 …

PE OLIVEIRA, N PICADO - surfaces - mat.uc.pt

Let M be a compact manifold of Rd. The goal of this paper is to decide, based on a sample of

points, whether the interior of M is empty or not. We divide this work in two main parts. Firstly,

under a dependent sample which may or may not contain some noise within, we …

  Related articles View as HTML 

<——2021————2021———310—— 

 


Recycling Discriminator Towards Opinion-Unaware Image Quality Assessment Using Wasserstein GAN

Authors:Yunan Zhu (Author), Haichuan Ma (Author), Jialun Peng (Author), Dong Liu (Author), Zhiwei Xiong (Author)
Summary: Generative adversarial networks (GANs) have been extensively used for training networks that perform image generation. After training, the discriminator in a GAN is not used anymore. We propose to recycle the trained discriminator for another use: no-reference image quality assessment (NR-IQA). We are motivated by two facts. First, in the Wasserstein GAN (WGAN), the discriminator is designed to calculate the distance between the distribution of generated images and that of real images; thus, the trained discriminator may encode the distribution of real-world images. Second, NR-IQA often needs to leverage the distribution of real-world images for assessing image quality. We then conjecture that using the trained discriminator for NR-IQA may help get rid of any human-labeled quality opinion scores and lead to a new opinion-unaware (OU) method. To validate our conjecture, we start from a restricted NR-IQA problem, namely IQA for artificially super-resolved images. We train a super-resolution (SR) WGAN with two kinds of discriminators: one directly evaluates the entire image, and the other works on small patches. For the latter kind, we obtain patch-wise quality scores, and then have the flexibility to fuse the scores, e.g., by weighted average. Moreover, we directly extend the trained discriminators to authentically distorted images that have different kinds of distortions. Our experimental results demonstrate that the proposed method is comparable to the state-of-the-art OU NR-IQA methods on SR images and is even better than them on authentically distorted images. Our method provides a more interpretable approach to NR-IQA. Our code and models are available at https://github.com/YunanZhu/RecycleD
Chapter, 2021
Publication:Proceedings of the 29th ACM International Conference on Multimedia, 20211017, 116
Publisher:2021

2021

Huang, XingWang, Feng-Yu

McKean-Vlasov SDEs with drifts discontinuous under Wasserstein distance. (English) Zbl 07314927

Discrete Contin. Dyn. Syst. 41, No. 4, 1667-1679 (2021).

MSC:  60H10 60G44

PDF BibTeX XML Cite

Full Text: DOI


Cavagnari, Giulia; Marigonda, Antonio

Attainability property for a probabilistic target in Wasserstein spaces. (English) Zbl 07314364

Discrete Contin. Dyn. Syst. 41, No. 2, 777-812 (2021).

Reviewer: Xin Yang Lu (Thunder Bay)

MSC:  60B05 49Q22 49J15 49J53 90C56

PDF BibTeX XML Cite

Full Text: DOI


Rethinking Rotated Object Detection with Gaussian Wasserstein Distance Loss

X Yang, J Yan, Q Ming, W Wang, X Zhang… - arXiv preprint arXiv …, 2021 - arxiv.org

Boundary discontinuity and its inconsistency to the final detection metric have been the

bottleneck for rotating detection regression loss design. In this paper, we propose a novel

regression loss based on Gaussian Wasserstein distance as a fundamental approach to …

  All 3 versions 
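
What makes a Gaussian model convenient for a Wasserstein-based regression loss is the closed form of $W_2$ between Gaussians (a standard result, stated here for reference; the exact loss construction in the cited paper may differ):

\[ W_2^2\big(\mathcal{N}(m_1,\Sigma_1),\,\mathcal{N}(m_2,\Sigma_2)\big) \;=\; \|m_1-m_2\|_2^2 \;+\; \operatorname{Tr}\!\Big(\Sigma_1+\Sigma_2-2\big(\Sigma_2^{1/2}\Sigma_1\Sigma_2^{1/2}\big)^{1/2}\Big). \]

In the rotated-detection setting, each rotated box is modelled as a 2-D Gaussian so that this differentiable quantity can drive the regression loss.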


Small Object Detection from Remote Sensing Images with the Help of Object-Focused Super-Resolution Using Wasserstein GANs

L Courtrai, MT Pham, C Friguet… - IGARSS 2020-2020 IEEE … - ieeexplore.ieee.org

In this paper, we investigate and improve the use of a super-resolution approach to benefit

the detection of small objects from aerial and satellite remote sensing images. The main

idea is to focus the super-resolution on target objects within the training phase. Such a …

 

2021

Nonembeddability of persistence diagrams with p>2 Wasserstein metric
by Wagner, Alexander
Proceedings of the American Mathematical Society, 06/2021, Volume 149, Issue 6
Persistence diagrams do not admit an inner product structure compatible with any Wasserstein metric. Hence, when applying kernel methods to persistence...
Journal Article, Full Text Online


arXiv:2103.00899  [pdf, ps, other]  cs.LG
Computationally Efficient Wasserstein Loss for Structured Labels
Authors: Ayato Toyokuni, Sho Yokoi, Hisashi Kashima, Makoto Yamada
Abstract: The problem of estimating the probability distribution of labels has been widely studied as a label distribution learning (LDL) problem, whose applications include age estimation, emotion analysis, and semantic segmentation. We propose a tree-Wasserstein distance regularized LDL algorithm, focusing on hierarchical text classification tasks. We propose predicting the entire label hierarchy using ne… 
Submitted 1 March, 2021; originally announced March 2021.
 Related articles All 4 versions 

31 References  Related records
Computationally Efficient Wasserstein Loss for Structured Labels

Toyokuni, A; Yokoi, S; (...); Yamada, M

16th Conference of the European-Chapter-of-the-Association-for-Computational-Linguistics (EACL)

2021 | 

EACL 2021: THE 16TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: PROCEEDINGS OF THE STUDENT RESEARCH WORKSHOP

 , pp.1-7

Enriched Cited References

The problem of estimating the probability distribution of labels has been widely studied as a label distribution learning (LDL) problem, whose applications include age estimation, emotion analysis, and semantic segmentation. We propose a tree-Wasserstein distance regularized LDL algorithm, focusing on hierarchical text classification tasks. We propose predicting the entire label hierarchy using


31 References  Related records
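For reference, the tree-Wasserstein distance used in the Toyokuni et al. records above has a simple closed form once a tree metric is fixed: with edge weights w_e and Γ(e) the set of nodes below edge e, it reduces to a weighted ℓ1 difference of subtree masses (standard formula, stated here only for context):

```latex
\mathrm{TW}(\mu,\nu) \;=\; \sum_{e \in E} w_e \,\bigl|\, \mu(\Gamma(e)) - \nu(\Gamma(e)) \,\bigr| .
```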


arXiv:2103.00837  [pdf, ps, other]  math.OC  math.PR  q-fin.CP
Rate of convergence for particles approximation of PDEs in Wasserstein space
Authors: Maximilien Germain, Huyên Pham, Xavier Warin
Abstract: We prove a rate of convergence of order 1/N for the N-particle approximation of a second-order partial differential equation in the space of probability measures, like the Master equation or Bellman equation of mean-field control problem under common noise. The proof relies on backward stochastic differential equations techniques.
Submitted 1 March, 2021; originally announced March 2021.

 Cited by 2 Related articles All 16 versions 

online  OPEN ACCESS

Rate of convergence for particles approximation of PDEs in Wasserstein space

by Germain, Maximilien; Pham, Huyên; Warin, Xavier

03/2021

We prove a rate of convergence of order 1/N for the N-particle approximation of a second-order partial differential equation in the space of probability...

Journal ArticleFull Text Online

Cited by 6 Related articles All 23 versions 

Geometrical aspects of entropy production in stochastic thermodynamics based on Wasserstein...

by Nakazato, Muka; Ito, Sosuke

Physical review research, 11/2021, Volume 3, Issue 4

We study a relationship between optimal transport theory and stochastic thermodynamics for the Fokker-Planck equation. We

show that the lower bound on the...

Article PDF (via Unpaywall)

Journal Article  Full Text Online

arXiv:2103.00503  [pdf, other]  cond-mat.stat-mech
Geometrical aspects of entropy production in stochastic thermodynamics based on Wasserstein distance
Authors: Muka Nakazato, Sosuke Ito
Abstract: We study a relationship between optimal transport theory and stochastic thermodynamics for the Fokker-Planck equation. We show that the entropy production is bounded by the action measured by the path length of the L2-Wasserstein distance, which is a measure of optimal transport. By using this geometrical interpretation of the entropy production, we obtain a lower bound on the entropy production…
Submitted 4 March, 2021; v1 submitted 28 February, 2021; originally announced March 2021.
Comments: 15 pages, 3 figures
 

online  OPEN ACCESS

Geometrical aspects of entropy production in stochastic thermodynamics based on Wasserstein distance

by Nakazato, Muka; Ito, Sosuke

02/2021

We study a relationship between optimal transport theory and stochastic thermodynamics for the Fokker-Planck equation. We show that the lower bound on the...

Journal ArticleFull Text Online

Cited by 25 Related articles All 4 versions
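For orientation, the kind of lower bound referred to in the two Nakazato–Ito records above is usually stated, for overdamped Langevin dynamics with mobility μ and temperature T over a time window [0, τ], roughly as follows (conventions and constants differ between papers, so this is only a schematic form, not the paper's exact statement):

```latex
\Sigma_{\tau} \;\ge\; \frac{W_2(\rho_0,\rho_\tau)^2}{\mu\, T\, \tau},
```

where Σ_τ is the total entropy production and ρ_0, ρ_τ are the initial and final probability densities.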

[PDF] arxiv.org
Multilevel optimal transport: a fast approximation of Wasserstein-1 distances

J LiuW YinW Li, YT Chow - SIAM Journal on Scientific Computing, 2021 - SIAM

We propose a fast algorithm for the calculation of the Wasserstein-1 distance, which is a

particular type of optimal transport distance with transport cost homogeneous of degree one.

Our algorithm is built on multilevel primal-dual algorithms. Several numerical examples and …

  5 Related articles All 6 versions
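The Wasserstein-1 distance targeted by the multilevel primal-dual scheme above is special because of its Kantorovich–Rubinstein dual, which replaces transport plans by a single 1-Lipschitz potential (standard duality, stated for context):

```latex
W_1(\mu,\nu) \;=\; \sup_{\operatorname{Lip}(f)\le 1} \;\int f\,d\mu \;-\; \int f\,d\nu .
```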

<——2021————2021———320—— 

 

 

 [PDF] arxiv.org

Robust Graph Learning Under Wasserstein Uncertainty

X Zhang, Y Xu, Q Liu, Z Liu, J Lu, Q Wang - arXiv preprint arXiv …, 2021 - arxiv.org

Graphs are playing a crucial role in different fields since they are powerful tools to unveil

intrinsic relationships among signals. In many scenarios, an accurate graph structure

representing signals is not available at all and that motivates people to learn a reliable …

  All 2 versions 


 

2021

Illumination-Invariant Flotation Froth Color Measuring via Wasserstein Distance-Based CycleGAN With Structure-Preserving Constraint

By: Liu, Jinping; He, Jiezhou; Xie, Yongfang; et al.

IEEE TRANSACTIONS ON CYBERNETICS  Volume: ‏ 51   Issue: ‏ 2   Pages: ‏ 839-852   Published: ‏ FEB 2021

Times Cited: 4

Cover Image

Illumination-Invariant Flotation Froth Color Measuring via Wasserstein Distance-Based CycleGAN With...

by Liu, Jinping; He, Jiezhou; Xie, Yongfang ; More...

IEEE transactions on cybernetics, 02/2021, Volume 51, Issue 2

Froth color can be referred to as a direct and instant indicator to the key flotation production index, for example, concentrate grade. However, it is...


Journal ArticleFull Text Online

 

2021

Decision Making Under Model Uncertainty: Fréchet–Wasserstein Mean Preferences

EV Petracou, A Xepapadeas… - Management …, 2021 - pubsonline.informs.org

This paper contributes to the literature on decision making under multiple probability models

by studying a class of variational preferences. These preferences are defined in terms of

Fréchet mean utility functionals, which are based on the Wasserstein metric in the space of …

 


The Wasserstein-Fourier Distance for Stationary Time Series

By: Cazelles, Elsa; Robert, Arnaud; Tobar, Felipe

IEEE TRANSACTIONS ON SIGNAL PROCESSING  Volume: ‏ 69   Pages: ‏ 709-721   Published: ‏ 2021


Experiment data in support of "Segmentation analysis and the recovery of queuing parameters via the Wasserstein distance: a study of administrative data for patients with chronic obstructive pulmonary disease"

By: Wilde, Henry; Knight, Vincent; Gillard, Jonathan

Zenodo

DOI: 10.5281/ZENODO.4457808

Document Type: Data set


2021

Rate of convergence for particles approximation of PDEs in Wasserstein space

M GermainH PhamX Warin - arXiv preprint arXiv:2103.00837, 2021 - arxiv.org

We prove a rate of convergence of order 1/N for the N-particle approximation of a second-

order partial differential equation in the space of probability measures, like the Master

equation or Bellman equation of mean-field control problem under common noise. The proof …

   All 6 versions 


 2021 modified

Вассерштейн ГАН в Swift для TensorFlow

www.machinelearningmastery.ru › ...

Wasserstein generative adversarial network. The WGAN model contains a C as well as a G network, as discussed above. C contains several convolutional ... [translated from Russian]

[Russian: Wasserstein GAN in Swift for TensorFlow]


[PDF] Functional Priors for Bayesian Neural Networks through Wasserstein Distance Minimization to Gaussian Processes

BH TranD MiliosS RossiM Filippone - openreview.net

The Bayesian treatment of neural networks dictates that a prior distribution is considered

over the weight and bias parameters of the network. The non-linear nature of the model

implies that any distribution of the parameters has an unpredictable effect on the distribution …

 Related articles All 2 versions 


2021

Primal dual methods for Wasserstein gradient flows

J Carrillo de la Plata, K Craig, L Wang… - Foundations of …, 2021 - ora.ox.ac.uk

Combining the classical theory of optimal transport with modern operator splitting

techniques, we develop a new numerical method for nonlinear, nonlocal partial differential

equations, arising in models of porous media, materials science, and biological swarming …

  Cited by 24 Related articles All 8 versions


[PDF] uwaterloo.ca

Wasserstein Autoencoders with Mixture of Gaussian Priors for Stylized Text Generation

A Ghabussi - 2021 - uwspace.uwaterloo.ca

Probabilistic text generation is an important application of Natural Language Processing

(NLP). Variational autoencoders and Wasserstein autoencoders are two widely used

methods for text generation. New research efforts focus on improving the quality of the …

Related articles All 3 versions

  <——2021————2021———330——


[PDF] arxiv.org

From smooth wasserstein distance to dual sobolev norm: Empirical approximation and statistical applications

S Nietert, Z Goldfeld, K Kato - arXiv preprint arXiv:2101.04039, 2021 - arxiv.org

… In the unsupervised learning task of generative modeling, we obtain an iid sample X1,...,Xn …

We adopt the smooth Wasserstein distance as the figure of merit and use the empirical distri…

 Cited by 3 Related articles All 2 versions 
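The "smooth Wasserstein distance" in the Nietert–Goldfeld–Kato entries is the classical distance taken after Gaussian smoothing of both measures; writing N_σ for the isotropic Gaussian with covariance σ²I, the definition used in this line of work is

```latex
W_1^{(\sigma)}(\mu,\nu) \;=\; W_1\!\left(\mu * \mathcal{N}_\sigma,\; \nu * \mathcal{N}_\sigma\right).
```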

[PDF] jmlr.org

[PDF] A fast globally linearly convergent algorithm for the computation of Wasserstein barycenters

L Yang, J Li, D Sun, KC Toh - Journal of Machine Learning Research, 2021 - jmlr.org

We consider the problem of computing a Wasserstein barycenter for a set of discrete

probability distributions with finite supports, which finds many applications in areas such as

statistics, machine learning and image processing. When the support points of the …

  Cited by 8 Related articles All 6 versions 

Cited by 9 Related articles All 22 versions 
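The object computed by the Yang–Li–Sun–Toh algorithm above is the Wasserstein barycenter of discrete measures μ_1, …, μ_m with weights λ_i ≥ 0, Σ_i λ_i = 1, i.e. the minimizer of a weighted sum of squared W2 distances:

```latex
\bar{\nu} \;\in\; \arg\min_{\nu} \;\sum_{i=1}^{m} \lambda_i \, W_2^2(\nu, \mu_i).
```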


[PDF] arxiv.org

Closed‐form expressions for maximum mean discrepancy with applications to Wasserstein auto‐encoders

RM Rustamov - Stat, 2021 - Wiley Online Library

The maximum mean discrepancy (MMD) has found numerous applications in statistics and

machine learning, among which is its use as a penalty in the Wasserstein auto‐encoder

(WAE). In this paper, we compute closed‐form expressions for estimating the Gaussian …

  Cited by 5 Related articles All 3 versions
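The penalty whose closed-form expressions Rustamov's paper targets is the squared maximum mean discrepancy between the aggregate posterior and the prior in a WAE. For context, the generic unbiased sample estimator with kernel k and equal sample sizes (not the paper's specific closed form) is

```latex
\widehat{\mathrm{MMD}}^2(P,Q) =
\frac{1}{n(n-1)}\sum_{i\ne j} k(x_i,x_j)
+ \frac{1}{n(n-1)}\sum_{i\ne j} k(y_i,y_j)
- \frac{2}{n^2}\sum_{i,j} k(x_i,y_j).
```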


 

[PDF] ieee.org

Brain extraction from brain MRI images based on Wasserstein GAN and O-Net

S Jiang, L Guo, G Cheng, X Chen, C Zhang… - IEEE Access, 2021 - ieeexplore.ieee.org

… Based on the above observation and analysis, this paper … + O-Net, where WGAN (Wasserstein GAN) [23] performs adversarial … Based on the above analysis, we choose the Wasserstein …

 Cited by 2 Related articles

 

Breast Cancer Histopathology Image Super-Resolution Using Wide-Attention GAN With Improved Wasserstein Gradient Penalty and Perceptual Loss

F Shahidi - IEEE Access, 2021 - ieeexplore.ieee.org

In the realm of image processing, enhancing the quality of the images is known as a super-

resolution problem (SR). Among SR methods, a super-resolution generative adversarial

network, or SRGAN, has been introduced to generate SR images from low-resolution …

 

2021

 

Wasserstein inequality and minimal Green energy on compact manifolds

S Steinerberger - Journal of Functional Analysis, 2021 - Elsevier

Let M be a smooth, compact d-dimensional manifold, d ≥ 3, without boundary and let G: M × M → R ∪ {∞} denote the Green's function of the Laplacian −Δ (normalized to have mean value 0). We prove a bound on the cost of transporting Dirac measures in {x_1, …, x_n} ⊂ M to …

  Cited by 4 Related articles All 2 versions

[PDF] dtu.dk

[PDF] Enhanced Wasserstein Distributionally Robust OPF With Dependence Structure and Support Information

A Arrigo, J Kazempour, Z De Grève, JF Toubeau… - 14th IEEE …, 2021 - orbit.dtu.dk

This paper goes beyond the current state of the art related to Wasserstein distributionally

robust optimal power flow problems, by adding dependence structure (correlation) and …

Cited by 1 Related articles All 5 versions
2021 IEEE MADRID POWERTECH

21 References Related records


[HTML] PFP-WGAN: Protein function prediction by discovering Gene Ontology term correlations with generative adversarial networks

SF Seyyedsalehi, M Soleymani, HR Rabiee… - Plos one, 2021 - journals.plos.org

Understanding the functionality of proteins has emerged as a critical problem in recent years

due to significant roles of these macro-molecules in biological mechanisms. However, in-

laboratory techniques for protein function prediction are not as efficient as methods …

  All 6 versions 

 OPEN ACCESS

PFP-WGAN: Protein function prediction by discovering Gene Ontology term correlations with generative...

by Seyyedsalehi, Seyyede Fatemeh; Soleymani, Mahdieh; Rabiee, Hamid R ; More...

PloS one, 2021, Volume 16, Issue 2

Understanding the functionality of proteins has emerged as a critical problem in recent years due to significant roles of these macro-molecules in biological...


Journal ArticleFull Text Online
 Related articles All 8 versions 


ARTICLE

PFP-WGAN: Protein function prediction by discovering Gene Ontology term correlations with generative adversarial networks

Seyyedsalehi, Seyyede Fatemeh ; Soleymani, Mahdieh ; Rabiee, Hamid R ; Mofrad, Mohammad R K; Iosifidis, AlexandrosPloS one, 2021, Vol.16 (2), p.e0244430-e0244430

PEER REVIEWED

OPEN ACCESS

PFP-WGAN: Protein function prediction by discovering Gene Ontology term correlations with generative adversarial networks

Available Online 

 Cited by 6 Related articles All 12 versions

使用 WGAN-GP 合成基於智慧手錶的現實安全與不安全的駕駛行為

A Prasetio - 2021 - ir.lib.ncu.edu.tw

[Translated from Chinese:] Collecting driving-behavior data in a real environment is quite dangerous, and many precautions must be prepared to avoid accidents during data collection. Collecting unsafe driving behaviors such as swerving left and right is even harder to do in reality. Using a simulated environment to collect data is both safe and convenient, but a gap remains between simulation and reality …

[Chinese: Using WGAN-GP to synthesize realistic safe and unsafe smartwatch-based driving behaviors]


Rényi 차분 프라이버시를 적용한 WGAN 모델 연구

이수진, 박철희, 홍도원, 김재금 - 정보과학회논문지, 2021 - dbpia.co.kr

[Translated from Korean:] Personal information is collected as people use various services; administrators extract value from the collected data and analyze the results to provide personalized information to each individual. However, sensitive data such as medical data raise privacy-infringement concerns, so GANs are widely used as models for generating synthetic data …

[Korean: A study of a WGAN model with Rényi differential privacy]

Related articles

<——2021————2021———340——


host.jiangliu2u.com › fmu2 › Wasserstein-BiGAN › blob

Wasserstein BiGAN (Bidirectional GAN trained using Wasserstein distance) - fmu2/Wasserstein-BiGAN.

Wasserstein-BiGAN/README.md at master · fmu2 ...

2021 online

Wasserstein Embeddings for Nonnegative Matrix Factorization

by Febrissy, Mickael; Nadif, Mohamed

Machine Learning, Optimization, and Data Science, 01/2021

In the field of document clustering (or dictionary learning), the fitting error called the Wasserstein...

Book ChapterFull Text Online


arXiv:2103.06828  [pdf, other]  cs.LG  math.OC  stat.ML
Wasserstein Robust Support Vector Machines with Fairness Constraints
Authors: Yijie Wang, Viet Anh Nguyen, Grani A. Hanasusanto
Abstract: We propose a distributionally robust support vector machine with a fairness constraint that encourages the classifier to be fair in view of the equality of opportunity criterion. We use a type-∞ Wasserstein ambiguity set centered at the empirical distribution to model distributional uncertainty and derive an exact reformulation for worst-case unfairness measure. We establish that the model…
Submitted 11 March, 2021; originally announced March 2021.
Wasserstein Robust Support Vector Machines with Fairness Constraints

Y Wang, VA NguyenGA Hanasusanto - arXiv preprint arXiv:2103.06828, 2021 - arxiv.org

We propose a distributionally robust support vector machine with a fairness constraint that

encourages the classifier to be fair in view of the equality of opportunity criterion. We use a

type-$\infty $ Wasserstein ambiguity set centered at the empirical distribution to model …

 Related articles All 3 versions 

arXiv:2103.04790  [pdf, ps, other]  math.OC
Distributionally Robust Chance-Constrained Programmings for Non-Linear Uncertainties with Wasserstein Distance
Authors: Yining Gu, Yanjun Wang
Abstract: In this paper, we study a distributionally robust chance-constrained programming (DRCCP) under Wasserstein ambiguity set, where the uncertain constraints require to be jointly satisfied with a probability of at least a given risk level for all the probability distributions of the uncertain parameters within a chosen Wasserstein distance from an empirical distribution. Differently from the…
Submitted 8 March, 2021; originally announced March 2021.
  Related articles All 3 versions 


Set Representation Learning with Generalized Sliced-Wasserstein Embeddings
by Naderializadeh, Navid; Kolouri, Soheil; Comer, Joseph F; More...
03/2021
An increasing number of machine learning tasks deal with learning representations from set-structured data. Solutions to these problems involve the composition...
Journal Article  Full Text Online

arXiv:2103.03892  [pdf, other]  cs.LG
Set Representation Learning with Generalized Sliced-Wasserstein Embeddings
Authors: Navid Naderializadeh, Soheil Kolouri, Joseph F. Comer, Reed W. Andrews, Heiko Hoffmann
Abstract: An increasing number of machine learning tasks deal with learning representations from set-structured data. Solutions to these problems involve the composition of permutation-equivariant modules (e.g., self-attention, or individual processing via feed-forward neural networks) and permutation-invariant modules (e.g., global average pooling, or pooling by multi-head attention). In this paper, we pro…
Submitted 5 March, 2021; originally announced March 2021.

Cited by 3 Related articles All 4 versions 
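As background for the (generalized) sliced-Wasserstein embeddings above, the plain sliced-Wasserstein distance between two point clouds only needs random 1-D projections and sorting. A minimal NumPy sketch (uniform weights, equal sample sizes, W2 along each slice, Monte Carlo over directions; the function name and parameter choices are ours, purely illustrative):

```python
# Minimal sliced-Wasserstein distance between two equally sized point clouds (uniform weights).
import numpy as np

def sliced_wasserstein(x: np.ndarray, y: np.ndarray, n_proj: int = 128, seed: int = 0) -> float:
    """Monte Carlo estimate of SW_2 between samples x, y of shape (n, d)."""
    rng = np.random.default_rng(seed)
    d = x.shape[1]
    thetas = rng.normal(size=(n_proj, d))
    thetas /= np.linalg.norm(thetas, axis=1, keepdims=True)   # random unit directions
    px, py = x @ thetas.T, y @ thetas.T                       # 1-D projections, shape (n, n_proj)
    px.sort(axis=0); py.sort(axis=0)                          # 1-D OT = match sorted quantiles
    return float(np.sqrt(np.mean((px - py) ** 2)))            # average W_2^2 over slices, then sqrt

x = np.random.default_rng(1).normal(size=(256, 5))
y = np.random.default_rng(2).normal(loc=1.0, size=(256, 5))
print(sliced_wasserstein(x, y))
```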

2021

Borda, Bence

Berry-Esseen smoothing inequality for the Wasserstein metric on compact Lie groups. (English) Zbl 07321747

J. Fourier Anal. Appl. 27, No. 2, Paper No. 13, 24 p. (2021).

MSC:  43A77 60B15


MR4228512 Prelim Borgwardt, Steffen; Patterson, Stephan; On the computational complexity of finding a sparse Wasserstein barycenter. J. Comb. Optim. 41 (2021), no. 3, 736–761. 68Q17 (05C70 49Q22 68Q25 90B80)

Review PDF Clipboard Journal Article

On the computational complexity of finding a sparse Wasserstein barycenter

5 Related articles All 6 versions

MR4226510 Prelim Borda, Bence; Berry-Esseen Smoothing Inequality for the Wasserstein Metric on Compact Lie Groups. J. Fourier Anal. Appl. 27 (2021), no. 2, 13. 43A77 (60B15)

Review PDF Clipboard Journal Article


arXiv:2103.07598  [pdf, other]  cs.LG  cs.CR

Internal Wasserstein Distance for Adversarial Attack and Defense
Authors: Jincheng Li, Jiezhang Cao, Shuhai Zhang, Yanwu Xu, Jian Chen, Mingkui Tan
Abstract: Deep neural networks (DNNs) are vulnerable to adversarial examples that can trigger misclassification of DNNs but may be imperceptible to human perception. Adversarial attack has been an important way to evaluate the robustness of DNNs. Existing attack methods on the construction of adversarial examples use such ℓp distance as a similarity metric to perturb samples. However, this kind of met…
Submitted 12 March, 2021; originally announced March 2021.

 Cited by 2 Related articles All 4 versions 

Considering anatomical prior information for low-dose CT image enhancement using attribute-augmented Wasserstein generative adversarial networks

Z Huang, X Liu, R Wang, J Chen, P Lu, Q Zhang… - Neurocomputing, 2021 - Elsevier

Currently, many deep learning (DL)-based low-dose CT image postprocessing technologies

fail to consider the anatomical differences in training data among different human body sites,

such as the cranium, lung and pelvis. In addition, we can observe evident anatomical …

  Related articles

<——2021————2021———350——

[HTML] tandfonline.com

Full View

Dissimilarity measure of local structure in inorganic crystals using Wasserstein distance to search for novel phosphors

S Takemura, T Takeda, T Nakanishi… - … and Technology of …, 2021 - Taylor & Francis

To efficiently search for novel phosphors, we propose a dissimilarity measure of local

structure using the Wasserstein distance. This simple and versatile method provides the

quantitative dissimilarity of a local structure around a center ion. To calculate the …

  All 2 versions
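The dissimilarity measure in the Takemura et al. entry above reduces to a 1-D Wasserstein distance between descriptors of the local environment around a center ion. For two such 1-D distributions, SciPy already provides the distance; the radial grid and the two peaked "histograms" below are fabricated, purely for illustration:

```python
# Illustrative only: 1-D Wasserstein distance between two fabricated local-structure histograms.
import numpy as np
from scipy.stats import wasserstein_distance

r = np.linspace(0.0, 6.0, 61)                      # radial grid in angstroms (made-up)
site_a = np.exp(-(r - 2.0) ** 2 / 0.1)             # peak near 2.0 A
site_b = np.exp(-(r - 2.3) ** 2 / 0.1)             # peak shifted to 2.3 A

# Treat the histograms as weights on the support points r (SciPy normalizes the weights).
d = wasserstein_distance(r, r, u_weights=site_a, v_weights=site_b)
print(d)   # roughly the 0.3 A peak shift
```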


Distributionally Robust Resilient Operation of Integrated Energy Distribution Systems Using Moment and Wasserstein Metric for Contingencies

M Shahidehpour, Y Zhou, Z Wei… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org

Extreme weather events pose a serious threat to energy distribution systems. We propose a

distributionally robust optimization model for the resilient operation of the integrated

electricity and heat energy distribution systems in extreme weather events. We develop a …

 

Data augmentation for rolling bearing fault diagnosis using an enhanced few-shot Wasserstein auto-encoder with meta-learning

Z Pei, H Jiang, X Li, J Zhang, S Liu - Measurement Science and …, 2021 - iopscience.iop.org

Despite the advance of intelligent fault diagnosis for rolling bearings, in industries, data-

driven methods still suffer from data acquisition and imbalance. We propose an enhanced

few-shot Wasserstein auto-encoder (fs-WAE) to reverse the negative effect of imbalance …

 

[PDF] wiley.com

Full View

Short‐term railway passenger demand forecast using improved Wasserstein generative adversarial nets and web search terms

F Feng, J Zhang, C Liu, W Li… - IET Intelligent Transport …, 2021 - Wiley Online Library

Accurately predicting railway passenger demand is conducive for managers to quickly

adjust strategies. It is time‐consuming and expensive to collect large‐scale traffic data. With

the digitization of railway tickets, a large amount of user data has been accumulated. We …

 

[PDF] arxiv.org

Unsupervised Ground Metric Learning using Wasserstein Eigenvectors

GJ Huizing, L Cantini, G Peyré - arXiv preprint arXiv:2102.06278, 2021 - arxiv.org

Optimal Transport (OT) defines geometrically meaningful" Wasserstein" distances, used in

machine learning applications to compare probability distributions. However, a key

bottleneck is the design of a" ground" cost which should be adapted to the task under study …

  All 2 versions 


2021


[PDF] ieee.org

Breast Cancer Histopathology Image Super-Resolution Using Wide-Attention GAN With Improved Wasserstein Gradient Penalty and Perceptual Loss

F Shahidi - IEEE Access, 2021 - ieeexplore.ieee.org

In the realm of image processing, enhancing the quality of the images is known as a super-

resolution problem (SR). Among SR methods, a super-resolution generative adversarial

network, or SRGAN, has been introduced to generate SR images from low-resolution …

  All 2 versions


[PDF] arxiv.org

A data-driven event generator for Hadron Colliders using Wasserstein Generative Adversarial Network

S Choi, JH Lim - Journal of the Korean Physical Society, 2021 - Springer

Abstract Highly reliable Monte-Carlo event generators and detector simulation programs are

important for the precision measurement in the high energy physics. Huge amounts of

computing resources are required to produce a sufficient number of simulated events …

  All 4 versions

A short proof on the rate of convergence of the empirical measure for the Wasserstein distance

V Divol - arXiv preprint arXiv:2101.08126, 2021 - arxiv.org

We provide a short proof that the Wasserstein distance between the empirical measure of a

n-sample and the estimated measure is of order n^-(1/d), if the measure has a lower and

upper bounded density on the d-dimensional flat torus. Subjects: Statistics Theory (math. ST) …

   All 4 versions 
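The rate announced in Divol's note above is the classical empirical-measure rate: for a measure μ with a density bounded above and below on the d-dimensional flat torus (d ≥ 3), the empirical measure μ_n of an n-sample satisfies, up to constants,

```latex
\mathbb{E}\big[\,W_p(\mu_n,\mu)\,\big] \;\asymp\; n^{-1/d}.
```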


[PDF] arxiv.org

From Smooth Wasserstein Distance to Dual Sobolev Norm: Empirical Approximation and Statistical Applications

S NietertZ GoldfeldK Kato - arXiv preprint arXiv:2101.04039, 2021 - arxiv.org

Statistical distances, ie, discrepancy measures between probability distributions, are

ubiquitous in probability theory, statistics and machine learning. To combat the curse of

dimensionality when estimating these distances from data, recent work has proposed …

  Related articles All 2 versions 


[PDF] arxiv.org

Unsupervised Ground Metric Learning using Wasserstein Eigenvectors

GJ Huizing, L Cantini, G Peyré - arXiv preprint arXiv:2102.06278, 2021 - arxiv.org

Optimal Transport (OT) defines geometrically meaningful" Wasserstein" distances, used in

machine learning applications to compare probability distributions. However, a key

bottleneck is the design of a" ground" cost which should be adapted to the task under study …

  All 2 versions 

<——2021————2021———360——


2021

On the computational complexity of finding a sparse Wasserstein barycenter

S Borgwardt, S Patterson - Journal of Combinatorial Optimization, 2021 - Springer

The discrete Wasserstein barycenter problem is a minimum-cost mass transport problem for

a set of probability measures with finite support. In this paper, we show that finding a

barycenter of sparse support is hard, even in dimension 2 and for only 3 measures. We …

  All 4 versions


[PDF] arxiv.org

Wasserstein Convergence Rate for Empirical Measures of Markov Chains

A Riekert - arXiv preprint arXiv:2101.06936, 2021 - arxiv.org

We consider a Markov chain on $\mathbb {R}^ d $ with invariant measure $\mu $. We are

interested in the rate of convergence of the empirical measures towards the invariant

measure with respect to the $1 $-Wasserstein distance. The main result of this article is a …

  All 2 versions 


[PDF] arxiv.org

Convergence in Wasserstein Distance for Empirical Measures of Semilinear SPDEs

FY Wang - arXiv preprint arXiv:2102.00361, 2021 - arxiv.org

The convergence rate in Wasserstein distance is estimated for the empirical measures of

symmetric semilinear SPDEs. Unlike in the finite-dimensional case that the convergence is

of algebraic order in time, in the present situation the convergence is of log order with a …

  All 2 versions 


[PDF] arxiv.org

Wasserstein convergence rates for random bit approximations of continuous markov processes

S Ankirchner, T Kruse, M Urusov - Journal of Mathematical Analysis and …, 2021 - Elsevier

We determine the convergence speed of a numerical scheme for approximating one-

dimensional continuous strong Markov processes. The scheme is based on the construction

of certain Markov chains whose laws can be embedded into the process with a sequence of …

  Cited by 3 Related articles All 4 versions

Set Representation Learning with Generalized Sliced-Wasserstein Embeddings

N NaderializadehS Kolouri, JF Comer… - arXiv preprint arXiv …, 2021 - arxiv.org

An increasing number of machine learning tasks deal with learning representations from set-

structured data. Solutions to these problems involve the composition of permutation-

equivariant modules (eg, self-attention, or individual processing via feed-forward neural …

  All 3 versions 


2021


Small Object Detection from Remote Sensing Images with the Help of Object-Focused Super-Resolution Using Wasserstein GANs

L Courtrai, MT Pham, C Friguet… - … and Remote Sensing … - ieeexplore.ieee.org

In this paper, we investigate and improve the use of a super-resolution approach to benefit

the detection of small objects from aerial and satellite remote sensing images. The main

idea is to focus the super-resolution on target objects within the training phase. Such a …


Multilevel optimal transport: a fast approximation of Wasserstein-1 distances

J LiuW YinW Li, YT Chow - SIAM Journal on Scientific Computing, 2021 - SIAM

We propose a fast algorithm for the calculation of the Wasserstein-1 distance, which is a

particular type of optimal transport distance with transport cost homogeneous of degree one.

Our algorithm is built on multilevel primal-dual algorithms. Several numerical examples and …

  5 Related articles All 6 versions


[PDF] arxiv.org

Supervised Tree-Wasserstein Distance

Y TakezawaR SatoM Yamada - arXiv preprint arXiv:2101.11520, 2021 - arxiv.org

To measure the similarity of documents, the Wasserstein distance is a powerful tool, but it

requires a high computational cost. Recently, for fast computation of the Wasserstein

distance, methods for approximating the Wasserstein distance using a tree metric have been …

  All 3 versions 


Wasserstein metric-based Boltzmann entropy of a landscape mosaic: a clarification, correction, and evaluation of thermodynamic consistency

P Gao, H Zhang, Z Wu - Landscape Ecology, 2021 - Springer

Objectives The first objective is to provide a clarification of and a correction to the

Wasserstein metric-based method. The second is to evaluate the method in terms of

thermodynamic consistency using different implementations. Methods Two implementation …

  Related articles


The measurement of relations on belief functions based on the Kantorovich problem and the Wasserstein metric

AG Bronevich, IN Rozenberg - International Journal of Approximate …, 2021 - Elsevier

In this paper, we show how the Kantorovich problem appears in many constructions in the

theory of belief functions. We demonstrate this on several relations on belief functions such

as inclusion, equality and intersection of belief functions. Using the Kantorovich problem we …

  All 2 versions

<——2021———2021——370——


[PDF] arxiv.org

Robust W-GAN-Based Estimation Under Wasserstein Contamination

Z Liu, PL Loh - arXiv preprint arXiv:2101.07969, 2021 - arxiv.org

Robust estimation is an important problem in statistics which aims at providing a reasonable

estimator when the data-generating distribution lies within an appropriately defined ball

around an uncontaminated distribution. Although minimax rates of estimation have been …

  All 2 versions 

[PDF] arxiv.org

Busemann functions on the Wasserstein space

G Zhu, WL Li, X Cui - Calculus of Variations and Partial Differential …, 2021 - Springer

We study rays and co-rays in the Wasserstein space P_p(X) (p > 1) whose ambient space X is a complete, separable, non-compact, locally compact length space. We show that rays in the Wasserstein space can be represented as probability …

   Related articles All 3 versions
MR4249875 Prelim Zhu, Guomin; Li, Wen-Long; Cui, Xiaojun; Busemann functions on the Wasserstein space. Calc. Var. Partial Differential Equations 60 (2021), no. 3, 97. 58E10 (60B10 60H30)

Review PDF Clipboard Journal Article

Approximation for Probability Distributions by Wasserstein GAN
by Gao, Yihang; Ng, Michael K; Zhou, Mingjie
03/2021
In this paper, we study Wasserstein Generative Adversarial Networks (WGAN) using GroupSort neural networks as discriminators. We show that the error bound of...
Journal Article  Full Text Online

arXiv:2103.10060  [pdf, ps, other]  cs.LG  stat.ML

Approximation for Probability Distributions by Wasserstein GAN

Authors: Yihang Gao, Michael K. Ng

Abstract: In this paper, we show that the approximation for distributions by Wasserstein GAN depends on both the width/depth (capacity) of generators and discriminators, as well as the number of samples in training. A quantified generalization bound is developed for Wasserstein distance between the generated distribution and the target distribution. It implies that with sufficient training samples, for gene…  More

Submitted 18 March, 2021; originally announced March 2021.

Comments: 15 pages
  Related articles All 2 versions 
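The object analyzed in the Gao–Ng records above is the standard WGAN estimator, in which the generator G is trained against the Kantorovich–Rubinstein dual with the discriminator restricted to (an approximation of) the 1-Lipschitz ball, realized in that paper by GroupSort networks:

```latex
\min_{G} \; \max_{\|f\|_{\mathrm{Lip}} \le 1} \;
\mathbb{E}_{x \sim \mu}\, f(x) \;-\; \mathbb{E}_{z \sim \gamma}\, f\!\big(G(z)\big),
```

where μ is the target distribution and γ the latent (noise) distribution.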


 Considering anatomical prior information for low-dose CT image enhancement using attribute-augmented Wasserstein generative adversarial networks

Z Huang, X Liu, R Wang, J Chen, P Lu, Q Zhang… - Neurocomputing, 2021 - Elsevier

Currently, many deep learning (DL)-based low-dose CT image postprocessing technologies

fail to consider the anatomical differences in training data among different human body sites,

such as the cranium, lung and pelvis. In addition, we can observe evident anatomical …

 Cited by 6 Related articles


2021 patent news

March 09, 2021: Palo Alto Research Center Incorporated issued patent titled 

"Object shape regression using wasserstein distance"

Palo Alto Research Center Incorporated issued patent titled "Object shape regression using wasserstein...

News Bites - Private Companies, Mar 11, 2021

Newspaper ArticleCitation Online


Palo Alto Research Center Obtains Patent for Object Shape Regression Using Wasserstein Distance

Global IP News. Optics & Imaging Patent News; New Delhi [New Delhi]09 Mar 2021.

Palo Alto Research Center Obtains Patent for Object Shape Regression Using Wasserstein Distance

Global IP News. Optics & Imaging Patent News, Mar 9, 2021

Newspaper ArticleFull Text Online



2021


online OPEN ACCESS

Object shape regression using wasserstein distance

by Palo Alto Research Center Incorporated

03/2021

One embodiment can provide a system for detecting outlines of objects in images. During operation, the system receives an image that includes at least one...

PatentAvailable Online


 

(03/10/2021). "US Patent Issued to Palo Alto Research Center on March 9 for 'Object shape regression using wasserstein distance' (California, New York Inventors)". US Fed News Service, Including US State News

US Fed News Service, Including US State News, Mar 10, 2021

Newspaper Article, Citation Online

 

2021  [PDF] arxiv.org

Balanced Coarsening for Multilevel Hypergraph Partitioning via Wasserstein Discrepancy

Z Guo, J Zhao, L Jiao, X Liu - arXiv preprint arXiv:2106.07501, 2021 - arxiv.org

We propose a balanced coarsening scheme for multilevel hypergraph partitioning. In

addition, an initial partitioning algorithm is designed to improve the quality of k-way

hypergraph partitioning. By assigning vertex weights through the LPT algorithm, we …

  

[PDF] arxiv.org

Rethinking Rotated Object Detection with Gaussian Wasserstein Distance Loss

X Yang, J Yan, Q Ming, W Wang, X Zhang… - arXiv preprint arXiv …, 2021 - arxiv.org

Boundary discontinuity and its inconsistency to the final detection metric have been the

bottleneck for rotating detection regression loss design. In this paper, we propose a novel

regression loss based on Gaussian Wasserstein distance as a fundamental approach to …

  All 3 versions 


[PDF] arxiv.org

Berry–Esseen Smoothing Inequality for the Wasserstein Metric on Compact Lie Groups

B Borda - Journal of Fourier Analysis and Applications, 2021 - Springer

We prove a sharp general inequality estimating the distance of two probability measures on

a compact Lie group in the Wasserstein metric in terms of their Fourier transforms. We use a

generalized form of the Wasserstein metric, related by Kantorovich duality to the family of …

  Related articles All 2 versions

<——2021————2021———380——

 

Small Object Detection from Remote Sensing Images with the Help of Object-Focused Super-Resolution Using Wasserstein GANs

L Courtrai, MT Pham, C Friguet… - IGARSS 2020-2020 IEEE … - ieeexplore.ieee.org

In this paper, we investigate and improve the use of a super-resolution approach to benefit

the detection of small objects from aerial and satellite remote sensing images. The main

idea is to focus the super-resolution on target objects within the training phase. Such a … 


[PDF] arxiv.org

Wasserstein Robust Support Vector Machines with Fairness Constraints

Y Wang, VA NguyenGA Hanasusanto - arXiv preprint arXiv:2103.06828, 2021 - arxiv.org

We propose a distributionally robust support vector machine with a fairness constraint that

encourages the classifier to be fair in view of the equality of opportunity criterion. We use a

type-$\infty $ Wasserstein ambiguity set centered at the empirical distribution to model …

  All 4 versions 

 MR4231684 Prelim Ji, Ran; Lejeune, Miguel A.; Data-driven distributionally robust chance-constrained optimization with Wasserstein metric. J. Global Optim. 79 (2021), no. 4, 779–811. 90C11 (90C15)

Review PDF Clipboard Journal Article

Data-driven distributionally robust chance-constrained optimization
[PDF] optimization-online.org

Data-driven distributionally robust chance-constrained optimization with Wasserstein metric

R JiMA Lejeune - Journal of Global Optimization, 2021 - Springer

We study distributionally robust chance-constrained programming (DRCCP) optimization

problems with data-driven Wasserstein ambiguity sets. The proposed algorithmic and

reformulation framework applies to all types of distributionally robust chance-constrained …

  5 Related articles All 6 versions  Zbl 07340874
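Across the Ji–Lejeune and Gu–Wang entries above, the common modeling object is a Wasserstein ambiguity ball around the empirical distribution and a chance constraint enforced for every distribution in that ball; schematically,

```latex
\mathcal{B}_\varepsilon(\widehat{P}_N) = \{\, Q : W(Q, \widehat{P}_N) \le \varepsilon \,\},
\qquad
\inf_{Q \in \mathcal{B}_\varepsilon(\widehat{P}_N)} \; Q\big( g(x,\xi) \le 0 \big) \;\ge\; 1-\alpha .
```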


[PDF] arxiv.org

Distributionally Robust Chance-Constrained Programmings for Non-Linear Uncertainties with Wasserstein Distance

Y Gu, Y Wang - arXiv preprint arXiv:2103.04790, 2021 - arxiv.org

In this paper, we study a distributionally robust chance-constrained programming $(\text

{DRCCP}) $ under Wasserstein ambiguity set, where the uncertain constraints require to be

jointly satisfied with a probability of at least a given risk level for all the probability …

  All 2 versions 


2021 [PDF] amazonaws.com

[PDF] INFORMS Optimization Society 2020 Young Researchers Prize Expository Article: On Distributionally Robust Chance Constrained Programs with Wasserstein …

W Xie - higherlogicdownload.s3.amazonaws …

I am truly honored and grateful to be awarded the 2020 INFORMS Optimization Society Young

Researcher Prize for the work “On Distributionally Robust Chance Constrained Program with

Wasserstein Distance.” I would like to thank the committee members (Prof. Sam Burer, Prof. Hande …

  

2021


Set Representation Learning with Generalized Sliced-Wasserstein Embeddings

N NaderializadehS Kolouri, JF Comer… - arXiv preprint arXiv …, 2021 - arxiv.org

An increasing number of machine learning tasks deal with learning representations from set-

structured data. Solutions to these problems involve the composition of permutation-

equivariant modules (eg, self-attention, or individual processing via feed-forward neural …

Cited by 3 Related articles All 4 versions 

[PDF] mdpi.com

WRGAN: Improvement of RelGAN with Wasserstein Loss for Text Generation

Z Jiao, F Ren - Electronics, 2021 - mdpi.com

Generative adversarial networks (GANs) were first proposed in 2014, and have been widely

used in computer vision, such as for image generation and other tasks. However, the GANs

used for text generation have made slow progress. One of the reasons is that the …

  

Adversarial training with Wasserstein distance for learning cross-lingual word embeddings

Y Li, Y Zhang, K Yu, X Hu - Applied Intelligence, 2021 - Springer

Recent studies have managed to learn cross-lingual word embeddings in a completely

unsupervised manner through generative adversarial networks (GANs). These GANs-based

methods enable the alignment of two monolingual embedding spaces approximately, but …

 Adversarial training with Wasserstein distance for learning cross-lingual word embeddings

by Li, Yuling; Zhang, Yuhong; Yu, Kui; More...

Applied intelligence (Dordrecht, Netherlands), 03/2021

Article PDF Download PDF 

Journal ArticleFull Text Online

Related articles

 

Wasserstein GANs for Generation of Variated Image Dataset Synthesis

KDB Mudavathu, MVPCS Rao - Annals of the Romanian Society for …, 2021 - annalsofrscb.ro

Deep learning networks require a lot of training data to reach better accuracy. Given the limited amount of data for many problems, we understand the need to create image data from the existing sample space. For many years different techniques were used …

 Related articles All 3 versions 


arXiv:2103.11633  [pdf, other]  math.AP  math.CA  math.SP
A sharp Wasserstein uncertainty principle for Laplace eigenfunctions
Authors: Mayukh Mukherjee
Abstract: Consider an eigenfunction of the Laplace-Beltrami operator on a smooth compact Riemannian surface. We prove a conjectured lower bound on the Wasserstein distance between the measures defined by the positive and negative parts of the eigenfunction. Essentially, our estimate can be interpreted as an upper bound on the aggregated oscillatory behaviour of the eigenfunction. As a consequence, we are ab…  More
Submitted 22 March, 2021; originally announced March 2021.
Comments: 8 pages, comments most welcome!

 Cited by 3 Related articles All 2 versions 

<——2021————2021———390——


 

Closed‐form expressions for maximum mean discrepancy with applications to Wasserstein auto‐encoders

RM Rustamov - Stat, 2021 - Wiley Online Library

The maximum mean discrepancy (MMD) has found numerous applications in statistics and

machine learning, among which is its use as a penalty in the Wasserstein auto‐encoder

(WAE). In this paper, we compute closed‐form expressions for estimating the Gaussian …

  Cited by 5 Related articles All 3 versions


[PDF] arxiv.org

Wasserstein statistics in one-dimensional location scale models

S Amari, T Matsuda - Annals of the Institute of Statistical Mathematics, 2021 - Springer

Wasserstein geometry and information geometry are two important structures to be

introduced in a manifold of probability distributions. Wasserstein geometry is defined by

using the transportation cost between two distributions, so it reflects the metric of the base …

   Related articles All 2 versions


[PDF] arxiv.org

Continual Learning of Generative Models with Limited Data: From Wasserstein-1 Barycenter to Adaptive Coalescence

M DedeogluS Lin, Z Zhang, J Zhang - arXiv preprint arXiv:2101.09225, 2021 - arxiv.org

Learning generative models is challenging for a network edge node with limited data and

computing power. Since tasks in similar environments share model similarity, it is plausible

to leverage pre-trained generative models from the cloud or other edge nodes. Appealing to …

  All 2 versions 


2021

[PDF] arxiv.org

Wasserstein distance to independence models

TÖ Çelik, A Jamneshan, G Montúfar, B Sturmfels… - Journal of Symbolic …, 2021 - Elsevier

An independence model for discrete random variables is a Segre-Veronese variety in a

probability simplex. Any metric on the set of joint states of the random variables induces a

Wasserstein metric on the probability simplex. The unit ball of this polyhedral norm is dual to …

  Related articles All 3 versions


[PDF] arxiv.org

Second-order Conic Programming Approach for Wasserstein Distributionally Robust Two-stage Linear Programs

Z Wang, K You, S Song, Y Zhang - IEEE Transactions on …, 2021 - ieeexplore.ieee.org

This article proposes a second-order conic programming (SOCP) approach to solve

distributionally robust two-stage linear programs over 1-Wasserstein balls. We start from the

case with distribution uncertainty only in the objective function and then explore the case …

  Related articles All 3 versions


 2021

Precise limit in Wasserstein distance for conditional empirical measures of Dirichlet diffusion processes

FY Wang - Journal of Functional Analysis, 2021 - Elsevier

Let M be a d-dimensional connected compact Riemannian manifold with boundary ∂M, let V ∈ C²(M) be such that μ(dx) := e^{V(x)} dx is a probability measure, and let X_t be the diffusion process generated by L := Δ + ∇V with τ := inf{t ≥ 0 : X_t ∈ ∂M}. Consider the conditional …

  Cited by 3 Related articles All 3 versions


[PDF] uni-bielefeld.de

Exponential convergence in entropy and Wasserstein for McKean–Vlasov SDEs

P Ren, FY Wang - Nonlinear Analysis, 2021 - Elsevier

The following type of exponential convergence is proved for (non-degenerate or degenerate) McKean–Vlasov SDEs: W_2(μ_t, μ_∞)² + Ent(μ_t | μ_∞) ≤ c e^{−λt} min{W_2(μ_0, μ_∞)², Ent(μ_0 | μ_∞)}, t ≥ 1, where c, λ > 0 are constants and μ_t is the distribution of the solution …

  Related articles All 2 versions


[PDF] arxiv.org

Geometrical aspects of entropy production in stochastic thermodynamics based on Wasserstein distance

M Nakazato, S Ito - arXiv preprint arXiv:2103.00503, 2021 - arxiv.org

We study a relationship between optimal transport theory and stochastic thermodynamics for

the Fokker-Planck equation. We show that the entropy production is proportional to the

action measured by the path length of the L²-Wasserstein distance, which is a measure …

  All 2 versions 


[PDF] arxiv.org

From Smooth Wasserstein Distance to Dual Sobolev Norm: Empirical Approximation and Statistical Applications

S NietertZ GoldfeldK Kato - arXiv preprint arXiv:2101.04039, 2021 - arxiv.org

Statistical distances, ie, discrepancy measures between probability distributions, are

ubiquitous in probability theory, statistics and machine learning. To combat the curse of

dimensionality when estimating these distances from data, recent work has proposed …

  Related articles All 2 versions 


Convergence rate in Wasserstein distance and semiclassical limit for the defocusing logarithmic Schrödinger equation

G Ferriere - Analysis & PDE, 2021 - msp.org

We consider the dispersive logarithmic Schrödinger equation in a semiclassical scaling. We

extend the results of Carles and Gallagher (Duke Math. J. 167: 9 (2018), 1761–1801) about

the large-time behavior of the solution (dispersion faster than usual with an additional …

  All 2 versions

<—2021————2021———400—— 


arXiv:2103.13906  [pdf, ps, other]  cs.LG
About the regularity of the discriminator in conditional WGANs
Authors: Jörg Martin
Abstract: Training of conditional WGANs is usually done by averaging the underlying loss over the condition. Depending on the way this is motivated different constraints on the Lipschitz continuity of the discriminator arise. For the weaker requirement on the regularity there is however so far no mathematically complete justification for the used loss function. This short mathematical note intends to fill t…  More
Submitted 25 March, 2021; originally announced March 2021.
Comments: 5 pages
All 2 versions
 


arXiv:2103.13579  [pdf, other]  math.OC  cs.LG  eess.SY  math-ph
On the Convexity of Discrete Time Covariance Steering in Stochastic Linear Systems with Wasserstein Terminal Cost
Authors: Isin M. Balci, Abhishek Halder, Efstathios Bakolas
Abstract: In this work, we analyze the properties of the solution to the covariance steering problem for discrete time Gaussian linear systems with a squared Wasserstein distance terminal cost. In our previous work, we have shown that by utilizing the state feedback control policy parametrization, this stochastic optimal control problem can be associated with a difference of convex functions program. Here,…  More
Submitted 24 March, 2021; originally announced March 2021.

online  OPEN ACCESS

On the Convexity of Discrete Time Covariance Steering in Stochastic Linear Systems with Wasserstein terminal cost

by Balci, Isin M; Halder, Abhishek; Bakolas, Efstathios

03/2021

In this work, we analyze the properties of the solution to the covariance steering problem for discrete time Gaussian linear systems with a squared Wasserstein...

Journal ArticleFull Text Online
 Cited by 1 Related articles All 5 versions

 Precise limit in Wasserstein distance for conditional empirical measures of Dirichlet diffusion processes

FY Wang - Journal of Functional Analysis, 2021 - Elsevier

Let M be a d-dimensional connected compact Riemannian manifold with boundary ∂M, let V ∈ C²(M) be such that μ(dx) := e^{V(x)} dx is a probability measure, and let X_t be the diffusion process generated by L := Δ + ∇V with τ := inf{t ≥ 0 : X_t ∈ ∂M}. Consider the conditional …

 Cited by 10 Related articles All 6 versions

MR4232671 Prelim Wang, Feng-Yu; Precise limit in Wasserstein distance for conditional empirical measures of Dirichlet diffusion processes. J. Funct. Anal. 280 (2021), no. 11, 108998. 60D05 (58J65)

[PDF] arxiv.org

Wasserstein Proximal Algorithms for the Schrödinger Bridge Problem: Density Control with Nonlinear Drift

K Caluya, A Halder - IEEE Transactions on Automatic Control, 2021 - ieeexplore.ieee.org

We study the Schrödinger bridge problem (SBP) with nonlinear prior dynamics. In

control-theoretic language, this is a problem of minimum effort steering of a given joint state

probability density function (PDF) to another over a finite time horizon, subject to a controlled …

  Cited by 4 Related articles All 4 versions


[PDF] arxiv.org

Distributional robustness in minimax linear quadratic control with Wasserstein distance

K Kim, I Yang - arXiv preprint arXiv:2102.12715, 2021 - arxiv.org

To address the issue of inaccurate distributions in practical stochastic systems, a minimax

linear-quadratic control method is proposed using the Wasserstein metric. Our method aims

to construct a control policy that is robust against errors in an empirical distribution of …

  All 2 versions 


2021

[PDF] arxiv.org

Necessary Optimality Conditions for Optimal Control Problems in Wasserstein Spaces

B BonnetH Frankowska - arXiv preprint arXiv:2101.10668, 2021 - arxiv.org

In this article, we derive first-order necessary optimality conditions for a constrained optimal

control problem formulated in the Wasserstein space of probability measures. To this end,

we introduce a new notion of localised metric subdifferential for compactly supported …

  All 2 versions 


2021

On the computational complexity of finding a sparse Wasserstein barycenter

S Borgwardt, S Patterson - Journal of Combinatorial Optimization, 2021 - Springer

The discrete Wasserstein barycenter problem is a minimum-cost mass transport problem for

a set of probability measures with finite support. In this paper, we show that finding a

barycenter of sparse support is hard, even in dimension 2 and for only 3 measures. We …

  All 4 versions


 Mathematical Principle of Deep Learning: Learn the Geodesic Curve in the Wasserstein Space

K Gai, S Zhang - arXiv preprint arXiv:2102.09235, 2021 - arxiv.org

Recent studies revealed the mathematical connection of deep neural network (DNN) and

dynamic system. However, the fundamental principle of DNN has not been fully

characterized with dynamic system in terms of optimization and generalization. To this end …

  All 2 versions 


[PDF] arxiv.org

Continual Learning of Generative Models with Limited Data: From Wasserstein-1 Barycenter to Adaptive Coalescence

M DedeogluS Lin, Z Zhang, J Zhang - arXiv preprint arXiv:2101.09225, 2021 - arxiv.org

Learning generative models is challenging for a network edge node with limited data and

computing power. Since tasks in similar environments share model similarity, it is plausible

to leverage pre-trained generative models from the cloud or other edge nodes. Appealing to …

  All 2 versions 


[PDF] arxiv.org

A sharp Wasserstein uncertainty principle for Laplace eigenfunctions

M Mukherjee - arXiv preprint arXiv:2103.11633, 2021 - arxiv.org

Consider an eigenfunction of the Laplace-Beltrami operator on a smooth compact

Riemannian surface. We prove a conjectured lower bound on the Wasserstein distance

between the measures defined by the positive and negative parts of the eigenfunction …

<——2021————2021———410——  

 

Sample Out-of-Sample Inference Based on Wasserstein Distance

by Blanchet, JoseKang, Yang

Operations research, 03/2021

Financial institutions make decisions according to a model of uncertainty. At the same time, regulators often evaluate the risk exposure of these institutions...

Article PDF Download PDF 

Journal ArticleFull Text Online

Cited by 28 Related articles All 7 versions

2021

A material decomposition method for dual-energy CT via dual interactive Wasserstein generative adversarial networks

by Shi, Zaifeng; Li, Huilong; Cao, Qingjie; More...

Medical physics (Lancaster), 03/2021

Dual-energy computed tomography (DECT) is highly promising for material characterization and identification, whereas reconstructed material-specific images are...

Article PDF Download PDF 

Journal ArticleFull Text Online


2021

Panchromatic Image Super-Resolution Via Self Attention-Augmented Wasserstein Generative Adversarial Network

by Du, Juan; Cheng, Kuanhong; Yu, Yue; More...

Sensors (Basel, Switzerland), 03/2021, Volume 21, Issue 6

Panchromatic (PAN) images contain abundant spatial information that is useful for earth observation, but always suffer from low resolution (LR) due to … scale view field. The current super-resolution (SR) methods based on traditional attention …

Sensors (Basel, Switzerland), Volume 21, Issue 6, Published: 2021 Mar 19

Cited by 3 Related articles All 7 versions

   


2021  [PDF] arxiv.org

Learning disentangled representations with the wasserstein autoencoder

B Gaujac, I FeigeD Barber - Joint European Conference on Machine …, 2021 - Springer

… Disentangled representation learning has undoubtedly benefited from objective function

surgery. However, a delicate balancing act of … Building on previous successes of penalizing

the total correlation in the latent variables, we propose TCWAE (Total Correlation Wasserstein …

  Related articles All 3 versionsƒ


2021  [PDF] arxiv.org

Wasserstein Distances, Geodesics and Barycenters of Merge Trees

M Pont, J Vidal, J DelonJ Tierny - arXiv preprint arXiv:2107.07789, 2021 - arxiv.org

This paper presents a unified computational framework for the estimation of distances,

geodesics and barycenters of merge trees. We extend recent work on the edit distance [106]

and introduce a new metric, called the Wasserstein distance between merge trees, which is …

  All 8 versions 


2021


Reconstruction Method for Missing Measurement Data Based on Wasserstein Generative Adversarial Network

by Zhang, Changfan; Chen, Hongrun; He, Jing; More...

Journal of advanced computational intelligence and intelligent informatics, 03/2021, Volume 25, Issue 2

Focusing on the issue of missing measurement data caused by complex and changeable working conditions during the operation of high-speed trains, in this paper,...

Journal ArticleFull Text Online

Related articles All 4 versions

2021

Convergence rate in Wasserstein distance and semiclassical limit for the defocusing logarithmic Schrödinger equation

by Ferriere, Guillaume

Analysis & PDE, 03/2021, Volume 14, Issue 2

Article PDF Download PDF 

Journal ArticleFull Text Online

MR4241810 Prelim Ferriere, Guillaume; Convergence rate in Wasserstein distance and semiclassical limit for the defocusing logarithmic Schrödinger equation. Anal. PDE 14 (2021), no. 2, 617–666. 35 (81)

Review PDF Clipboard Journal Article

 Cited by 9 Related articles All 7 versions

2021

The back-and-forth method for Wasserstein gradient flows

by Jacobs, MattLee, WonjunLeger, Flavien

ESAIM. Control, optimisation and calculus of variations, 03/2021

We present a method to efficiently compute Wasserstein gradient flows.  Our approach is based on a generalization of the back-and-forth method (BFM) introduced...


Journal ArticleFull Text Online

MR4238775 Prelim Jacobs, Matt; Lee, Wonjun; Léger, Flavien; The back-and-forth method for Wasserstein gradient flows. ESAIM Control Optim. Calc. Var. 27 (2021), Paper No. 28, 35 pp. 65K10 (49N15 65N99 90C46)

[CITATION] The back-and-forth method for Wasserstein gradient flows

M Jacobs, W Lee, F Léger - ESAIM: Control, Optimisation and …, 2021 - esaim-cocv.org

… Related Articles. An unbalanced optimal transport splitting scheme for general advection-reaction-

diffusion problems ESAIM: COCV 25 (2019) 8. A tumor growth model of Hele-Shaw type as a

gradient flow ESAIM: COCV 26 (2020) 103. An augmented Lagrangian approach to Wasserstein …

Cited by 5 Related articles All 3 versions
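For background only (this is the classical time discretization that fast Wasserstein gradient-flow solvers such as the back-and-forth method are designed to accelerate, not a summary of the paper above): one step of the Jordan-Kinderlehrer-Otto (JKO) scheme with step size $\tau$ and driving energy $\mathcal{E}$ reads

$$\rho_{k+1} = \operatorname{arg\,min}_{\rho} \; \frac{1}{2\tau} W_2^2(\rho, \rho_k) + \mathcal{E}(\rho),$$

so every update is itself a small optimal transport problem, which is why efficient $W_2$ solvers matter here.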


 Wasserstein distance-based probabilistic linguistic TODIM method with application to the evaluation...

by Zhang, Shitao; Wu, Zhangjiao; Ma, Zhenzhen; More...

Ekonomska istraživanja, Volume ahead-of-print, Issue ahead-of-print

The evaluation of sustainable rural tourism potential is a key work in sustainable rural tourism development. Due to the complexity of the rural tourism...


<——2021————2021———420——


2021 see 2020

Selective Multi-source Transfer Learning with Wasserstein Domain Distance for Financial Fraud...

by Sun, Yifu; Lan, Lijun; Zhao, Xueyao ; More...

Intelligent Computing and Block Chain, 03/2021

As financial enterprises have moved their services to the internet, financial fraud detection has become an ever-growing problem causing severe economic losses...

Book ChapterCitation Online



Selective Multi-source Transfer Learning with Wasserstein Domain Distance for Financial...
by Sun, Yifu; Lan, Lijun; Zhao, Xueyao; More...
Intelligent Computing and Block Chain, 03/2021
As financial enterprises have moved their services to the internet, financial fraud detection has become an ever-growing problem causing severe economic losses...
Book Chapter  Full Text Online


Short-term railway passenger demand forecast using improved Wasserstein generative adversarial nets and web search terms

By: Feng, Fenling; Zhang, Jiaqi; Liu, Chengguang; et al.

IET INTELLIGENT TRANSPORT SYSTEMS  Volume: ‏ 15   Issue: ‏ 3   Pages: ‏ 432-445   Published: ‏ MAR 2021

Early Access: FEB 2021



Image analysis system for assessing accuracy of localization from single molecule localization microscopy dataset, comprises non-volatile computer-readable memory containing instructions of Wasserstein-induced flux component

Patent Number: US2021033536-A1

Patent Assignee: UNIV WASHINGTON; LEW M; NEHORAI A; et al.

Inventor(s): LEW M; NEHORAI A; MAZIDISHARFABADI H.

WASSERSTEIN F-TESTS AND CONFIDENCE BANDS FOR THE FRECHET REGRESSION OF DENSITY RESPONSE CURVES

By: Petersen, Alexander; Liu, Xi; Divani, Afshin A.

ANNALS OF STATISTICS  Volume: ‏ 49   Issue: ‏ 1   Pages: ‏ 590-611   Published: ‏ FEB 2021


2021 see 2020

Wasserstein autoencoders for collaborative filtering

By: Zhang, Xiaofeng; Zhong, Jingbin; Liu, Kai

NEURAL COMPUTING & APPLICATIONS  Volume: ‏ 33   Issue: ‏ 7   Pages: ‏ 2793-2802   Published: ‏ APR 2021

Cited by 28 Related articles All 6 versions

2021 [PDF] koreascience.or.kr

Proposing Effective Regularization Terms for Improvement of WGAN

HI Hahn - Journal of Korea Multimedia Society, 2021 - koreascience.or.kr

Abstract A Wasserstein GAN (WGAN), optimum in terms of minimizing Wasserstein distance,

still suffers from inconsistent convergence or unexpected output due to inherent learning

instability. It is widely known some kinds of restriction on the discriminative function should …

Related articles   All 2 versions 
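As a reminder of the kind of restriction on the critic that is being discussed, here is a minimal sketch of the widely used gradient-penalty regularizer (Gulrajani et al. style); it is illustrative background, not the regularization terms proposed in the paper above, and the toy critic, batch and weight lambda_gp are placeholder assumptions:

import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    # interpolate between real and fake samples
    eps = torch.rand(real.size(0), 1, device=real.device)
    x_hat = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(x_hat)
    # gradient of the critic output with respect to the interpolates
    grads = torch.autograd.grad(outputs=scores, inputs=x_hat,
                                grad_outputs=torch.ones_like(scores),
                                create_graph=True)[0]
    # penalize deviation of the gradient norm from 1 (soft Lipschitz constraint)
    return lambda_gp * ((grads.view(grads.size(0), -1).norm(2, dim=1) - 1.0) ** 2).mean()

# toy usage with a two-layer critic on 8-dimensional data
critic = torch.nn.Sequential(torch.nn.Linear(8, 64), torch.nn.ReLU(), torch.nn.Linear(64, 1))
real, fake = torch.randn(32, 8), torch.randn(32, 8)
critic_loss = critic(fake).mean() - critic(real).mean() + gradient_penalty(critic, real, fake)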


2021 see 2022

 When ot meets mom: Robust estimation of wasserstein distance

G Staerman, P Laforgue… - International …, 2021 - proceedings.mlr.press

Abstract Originated from Optimal Transport, the Wasserstein distance has gained

importance in Machine Learning due to its appealing geometrical properties and the

increasing availability of efficient approximations. It owes its recent ubiquity in generative …

Cited by 22 Related articles All 7 versions 

 

2021 [PDF] mlr.press

Improved complexity bounds in wasserstein barycenter problem

D Dvinskikh, D Tiapkin - International Conference on …, 2021 - proceedings.mlr.press

In this paper, we focus on computational aspects of the Wasserstein barycenter problem. We

propose two algorithms to compute Wasserstein barycenters of $ m $ discrete measures of

size $n$ with accuracy $\varepsilon$. The first algorithm, based on mirror prox with a specific norm …

  Cited by 3 Related articles All 2 versions 
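For reference, the Wasserstein barycenter problem discussed above is the minimization

$$\min_{\nu} \sum_{i=1}^{m} w_i \, W_p^p(\mu_i, \nu), \qquad w_i \ge 0, \ \sum_{i} w_i = 1,$$

over probability measures $\nu$; the complexity bounds concern solving this to accuracy $\varepsilon$ when the $m$ input measures are discrete with support size $n$ (the weights $w_i$, often taken uniform, are my notation rather than the paper's).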


2021 see 2020   [PDF] mlr.press

Fast and smooth interpolation on wasserstein space

S Chewi, J Clancy, T Le Gouic… - International …, 2021 - proceedings.mlr.press

We propose a new method for smoothly interpolating probability measures using the

geometry of optimal transport. To that end, we reduce this problem to the classical Euclidean

setting, allowing us to directly leverage the extensive toolbox of spline interpolation. Unlike …

Cited by 10 Related articles All 7 versions 

2021 see 2019   [PDF] nber.org

Using wasserstein generative adversarial networks for the design of monte carlo simulations

S Athey, GW Imbens, J Metzger, E Munro - Journal of Econometrics, 2021 - Elsevier

When researchers develop new econometric methods it is common practice to compare the

performance of the new methods to those of existing methods in Monte Carlo studies. The

credibility of such Monte Carlo studies is often limited because of the discretion the …

Cited by 82 Related articles All 14 versions 

 <——2021———2021———430——


2021  see 2020   [PDF] projecteuclid.org

Multivariate goodness-of-fit tests based on Wasserstein distance

M Hallin, G Mordant, J Segers - Electronic Journal of Statistics, 2021 - projecteuclid.org

Goodness-of-fit tests based on the empirical Wasserstein distance are proposed for simple

and composite null hypotheses involving general multivariate distributions. For group

families, the procedure is to be implemented after preliminary reduction of the data via …

Cited by 19 Related articles All 14 versions
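For orientation only: the paper treats general multivariate distributions, but the basic ingredient, an empirical Wasserstein distance between a sample and a reference, is easy to illustrate in the univariate case with SciPy (sample sizes and the normal null model below are arbitrary choices):

import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
sample = rng.normal(loc=0.0, scale=1.0, size=500)       # observed data
reference = rng.normal(loc=0.0, scale=1.0, size=5000)   # draws from the null model

# empirical 1-Wasserstein distance between the two samples; a goodness-of-fit
# test would compare this statistic against its distribution under the null
print(wasserstein_distance(sample, reference))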

2021 [PDF] mlr.press

Generalized spectral clustering via Gromov-Wasserstein learning

S Chowdhury, T Needham - International Conference on …, 2021 - proceedings.mlr.press

We establish a bridge between spectral clustering and Gromov-Wasserstein Learning

(GWL), a recent optimal transport-based approach to graph partitioning. This connection

both explains and improves upon the state-of-the-art performance of GWL. The Gromov …

Cited by 22 Related articles All 5 versions 

[PDF] arxiv.org

Wasserstein Proximal Algorithms for the Schrödinger Bridge Problem: Density Control with Nonlinear Drift

K Caluya, A Halder - IEEE Transactions on Automatic Control, 2021 - ieeexplore.ieee.org

We study the Schrödinger bridge problem (SBP) with nonlinear prior dynamics. In

control-theoretic language, this is a problem of minimum effort steering of a given joint state

probability density function (PDF) to another over a finite time horizon, subject to a controlled …

 Cited by 16 Related articles All 7 versions


 2021  [PDF] mlr.press

Wasserstein Random Forests and Applications in Heterogeneous Treatment Effects

Q Du, G Biau, F Petit, R Porcher - … Conference on Artificial …, 2021 - proceedings.mlr.press

We present new insights into causal inference in the context of Heterogeneous Treatment

Effects by proposing natural variants of Random Forests to estimate the key conditional

distributions. To achieve this, we recast Breiman's original splitting criterion in terms of …

  Cited by 2 Related articles All 6 versions 


[PDF] jmlr.org

[PDF] Some theoretical insights into Wasserstein GANs

G Biau, M Sangnier, U Tanielian - Journal of Machine Learning Research, 2021 - jmlr.org

Abstract Generative Adversarial Networks (GANs) have been successful in producing

outstanding results in areas as diverse as image, video, and text generation. Building on

these successes, a large number of empirical studies have validated the benefits of the …

Cited by 31 Related articles All 13 versions

 

[PDF] tandfonline.com

Full View

Wasserstein distance-based probabilistic linguistic TODIM method with application to the evaluation of sustainable rural tourism potential

S Zhang, Z Wu, Z Ma, X Liu, J Wu - Economic Research …, 2021 - Taylor & Francis

The evaluation of sustainable rural tourism potential is a key work in sustainable rural

tourism development. Due to the complexity of the rural tourism development situation and

the limited cognition of people, most of the assessment problems for sustainable rural …

   Related articles All 2 versions


[PDF] arxiv.org

2D Wasserstein Loss for Robust Facial Landmark Detection

YAN Yongzhe, S Duffner, P Phutane, A Berthelier… - Pattern Recognition, 2021 - Elsevier

The recent performance of facial landmark detection has been significantly improved by

using deep Convolutional Neural Networks (CNNs), especially the Heatmap Regression

Models (HRMs). Although their performance on common benchmark datasets has reached a …

Cited by 1 Related articles All 17 versions


[PDF] mlr.press

Exploring the Wasserstein metric for time-to-event analysis

T Sylvain, M Luck, J Cohen… - Survival Prediction …, 2021 - proceedings.mlr.press

Survival analysis is a type of semi-supervised task where the target output (the survival time) 

is often right-censored. Utilizing this information is a challenge because it is not obvious how 

to correctly incorporate these censored examples into a model. We study how three …

Cited by 1 Related articles All 2 versions

[PDF] arxiv.org

Berry–Esseen Smoothing Inequality for the Wasserstein Metric on Compact Lie Groups

B Borda - Journal of Fourier Analysis and Applications, 2021 - Springer

We prove a sharp general inequality estimating the distance of two probability measures on

a compact Lie group in the Wasserstein metric in terms of their Fourier transforms. We use a

generalized form of the Wasserstein metric, related by Kantorovich duality to the family of …


  Related articles All 5 versions


2021 [PDF] copernicus.org

Ensemble Riemannian Data Assimilation over the Wasserstein Space

SK Tamang, A Ebtehaj, PJ Van Leeuwen… - Nonlinear Processes …, 2021 - npg.copernicus.org

In this paper, we present an ensemble data assimilation paradigm over a Riemannian

manifold equipped with the Wasserstein metric. Unlike the Eulerian penalization of error in

the Euclidean space, the Wasserstein metric can capture translation and difference between …

Related articles All 7 versions 

 <——2021———2021———440—— 


Strong equivalence between metrics of Wasserstein type

E Bayraktar, G Guo - Electronic Communications in Probability, 2021 - projecteuclid.org

The sliced Wasserstein metric and, more recently, the max-sliced Wasserstein metric

have attracted abundant attention in data sciences and machine learning due to their

advantages to tackle the curse of dimensionality, see e.g. [15], [6]. A question of particular …

Cited by 13 Related articles All 6 versions
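For readers comparing the two metrics in the abstract, the standard definitions (written in my own notation) are

$$\mathrm{SW}_p(\mu,\nu) = \Big( \int_{S^{d-1}} W_p^p\big(\theta_{\#}\mu, \theta_{\#}\nu\big) \, d\sigma(\theta) \Big)^{1/p}, \qquad \max\text{-}\mathrm{SW}_p(\mu,\nu) = \sup_{\theta \in S^{d-1}} W_p\big(\theta_{\#}\mu, \theta_{\#}\nu\big),$$

where $\theta_{\#}\mu$ is the pushforward of $\mu$ under the projection $x \mapsto \langle \theta, x \rangle$ and $\sigma$ is the uniform measure on the unit sphere; the paper asks how these compare with the usual $W_p$.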

 2021 [PDF] arxiv.org

Second-order Conic Programming Approach for Wasserstein Distributionally Robust Two-stage Linear Programs

Z Wang, K You, S Song, Y Zhang - IEEE Transactions on …, 2021 - ieeexplore.ieee.org

This article proposes a second-order conic programming (SOCP) approach to solve

distributionally robust two-stage linear programs over 1-Wasserstein balls. We start from the

case with distribution uncertainty only in the objective function and then explore the case …

Cited by 2 Related articles All 5 versions

[PDF] aaai.org

[PDF] Towards Generalized Implementation of Wasserstein Distance in GANs

M Xu, G Lu, W Zhang, Y Yu - 2021 - aaai.org

Abstract Wasserstein GANs (WGANs), built upon the Kantorovich-Rubinstein (KR) duality of

Wasserstein distance, is one of the most theoretically sound GAN models. However, in

practice it does not always outperform other variants of GANs. This is mostly due to the …


 Cited by 5 Related articles All 6 versions 
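The Kantorovich-Rubinstein duality that the abstract refers to is the identity

$$W_1(\mu,\nu) = \sup_{\|f\|_{\mathrm{Lip}} \le 1} \ \mathbb{E}_{x \sim \mu}[f(x)] - \mathbb{E}_{y \sim \nu}[f(y)],$$

which the original WGAN turns into the minimax objective $\min_G \max_{\|D\|_{\mathrm{Lip}} \le 1} \mathbb{E}_{x \sim p_{\mathrm{data}}}[D(x)] - \mathbb{E}_{z \sim p_z}[D(G(z))]$; this is standard background rather than the generalized implementation proposed in the paper.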


WRGAN: Improvement of RelGAN with Wasserstein Loss for Text Generation. Electronics 2021, 10, 275

Z Jiao, F Ren - 2021 - search.proquest.com

Generative adversarial networks (GANs) were first proposed in 2014, and have been widely

used in computer vision, such as for image generation and other tasks. However, the GANs

used for text generation have made slow progress. One of the reasons is that the …

 Cited by 3 Related articles All 2 versions 

Distributional robustness in minimax linear quadratic control with Wasserstein distance

K Kim, I Yang - arXiv preprint arXiv:2102.12715, 2021 - arxiv.org

To address the issue of inaccurate distributions in practical stochastic systems, a minimax

linear-quadratic control method is proposed using the Wasserstein metric. Our method aims

to construct a control policy that is robust against errors in an empirical distribution of …

  All 2 versions 


2021

[PDF] mlr.press

Fast and smooth interpolation on wasserstein space

S Chewi, J Clancy, T Le Gouic… - International …, 2021 - proceedings.mlr.press

We propose a new method for smoothly interpolating probability measures using the

geometry of optimal transport. To that end, we reduce this problem to the classical Euclidean

setting, allowing us to directly leverage the extensive toolbox of spline interpolation. Unlike …

   Related articles All 2 versions 

 

[PDF] arxiv.org

From Smooth Wasserstein Distance to Dual Sobolev Norm: Empirical Approximation and Statistical Applications

S Nietert, Z Goldfeld, K Kato - arXiv preprint arXiv:2101.04039, 2021 - arxiv.org

Statistical distances, ie, discrepancy measures between probability distributions, are

ubiquitous in probability theory, statistics and machine learning. To combat the curse of

dimensionality when estimating these distances from data, recent work has proposed …

  Related articles All 2 versions 


[PDF] arxiv.org

Tracial smooth functions of non-commuting variables and the free Wasserstein manifold

D Jekel, W Li, D Shlyakhtenko - arXiv preprint arXiv:2101.06572, 2021 - arxiv.org

We formulate a free probabilistic analog of the Wasserstein manifold on $\mathbb {R}^ d

$(the formal Riemannian manifold of smooth probability densities on $\mathbb {R}^ d $),

and we use it to study smooth non-commutative transport of measure. The points of the free …

  All 4 versions 


[PDF] projecteuclid.org

Strong equivalence between metrics of Wasserstein type

E Bayraktar, G Guo - Electronic Communications in Probability, 2021 - projecteuclid.org

The sliced Wasserstein metric and, more recently, the max-sliced Wasserstein metric

have attracted abundant attention in data sciences and machine learning due to their

advantages to tackle the curse of dimensionality, see e.g. [15], [6]. A question of particular …

  All 2 versions

arXiv:2103.16938  [pdf, other]  cs.CV  cs.LG  eess.IV
Unpaired Single-Image Depth Synthesis with cycle-consistent Wasserstein GANs
Authors: Christoph Angermann, Adéla Moravová, Markus Haltmeier, Steinbjörn Jónsson, Christian Laubichler
Abstract: Real-time estimation of actual environment depth is an essential module for various autonomous system tasks such as localization, obstacle detection and pose estimation. During the last decade of machine learning, extensive deployment of deep learning methods to computer vision tasks yielded successful approaches for realistic depth synthesis out of a simple RGB modality. While most of these model…  More
Submitted 31 March, 2021; originally announced March 2021.
Comments: submitted to the International Conference on Computer Vision (ICCV) 2021

online  OPEN ACCESS

Unpaired Single-Image Depth Synthesis with cycle-consistent Wasserstein GANs

by Angermann, Christoph; Moravová, Adéla; Haltmeier, Markus ; More...

03/2021

Real-time estimation of actual environment depth is an essential module for various autonomous system tasks such as localization, obstacle detection and pose...

Journal ArticleFull Text Online

All 2 versions 
 
<——2021———2021———450——


Data-Driven Distributionally Robust Surgery Planning in Flexible Operating Rooms Over a Wasserstein...
by Shehadeh, Karmel S
03/2021
We study elective surgery planning in flexible operating rooms (ORs) where emergency patients are accommodated in the existing elective surgery schedule....
Journal Article  Full Text Online

arXiv:2103.15221  [pdf, other]  math.OC
Data-Driven Distributionally Robust Surgery Planning in Flexible Operating Rooms Over a Wasserstein Ambiguity
Authors: Karmel S. Shehadeh
Abstract: We study elective surgery planning in flexible operating rooms where emergency patients are accommodated in the existing elective surgery schedule. Probability distributions of surgery durations are unknown, and only a small set of historical realizations is available. To address distributional ambiguity, we first construct an ambiguity set that encompasses all possible distributions of surgery du…  More
Submitted 28 March, 2021; originally announced March 2021. 

Related articles All 3 versions 

MR4226638 Prelim Liu, Wei; Yang, Li; Yu, Bo; Wasserstein distributionally robust option pricing. J. Math. Res. Appl. 41 (2021), no. 1, 99–110. 91G20 (90C15 90C25 90C47)



MR4218933 Prelim Rustamov, Raif M.; Closed-form expressions for maximum mean discrepancy with applications to Wasserstein auto-encoders. Stat 10 (2021), e329, 12 pp. 62G07 (68T07)




2021

[HTML] CHICKN: extraction of peptide chromatographic elution profiles from large scale mass spectrometry data by means of Wasserstein compressive hierarchical cluster …

O Permiakova, R Guibert, A Kraut, T Fortin… - BMC …, 2021 - Springer

The clustering of data produced by liquid chromatography coupled to mass spectrometry

analyses (LC-MS data) has recently gained interest to extract meaningful chemical or

biological patterns. However, recent instrumental pipelines deliver data which size …

  All 12 versions


[PDF] arxiv.org

Projected Statistical Methods for Distributional Data on the Real Line with the Wasserstein Metric

M Pegoraro, M Beraha - arXiv preprint arXiv:2101.09039, 2021 - arxiv.org

We present a novel class of projected methods, to perform statistical analysis on a data set

of probability distributions on the real line, with the 2-Wasserstein metric. We focus in

particular on Principal Component Analysis (PCA) and regression. To define these models …

  All 7 versions 


2021


[PDF] arxiv.org

Wasserstein proximal of GANs

AT Lin, W Li, S Osher, G Montúfar - arXiv preprint arXiv:2102.06862, 2021 - arxiv.org

We introduce a new method for training generative adversarial networks by applying the

Wasserstein-2 metric proximal on the generators. The approach is based on Wasserstein

information geometry. It defines a parametrization invariant natural gradient by pulling back …

  Cited by 7 Related articles All 4 versions 


[PDF] arxiv.org

Wasserstein Proximal Algorithms for the Schrödinger Bridge Problem: Density Control with Nonlinear Drift

K Caluya, A Halder - IEEE Transactions on Automatic Control, 2021 - ieeexplore.ieee.org

We study the Schrödinger bridge problem (SBP) with nonlinear prior dynamics. In

control-theoretic language, this is a problem of minimum effort steering of a given joint state

probability density function (PDF) to another over a finite time horizon, subject to a controlled …

  Cited by 4 Related articles All 4 versions


[PDF] arxiv.org

2D Wasserstein Loss for Robust Facial Landmark Detection

YAN Yongzhe, S Duffner, P Phutane, A Berthelier… - Pattern Recognition, 2021 - Elsevier

The recent performance of facial landmark detection has been significantly improved by

using deep Convolutional Neural Networks (CNNs), especially the Heatmap Regression

Models (HRMs). Although their performance on common benchmark datasets has reached a …

  Related articles All 3 versions


2021

[PDF] projecteuclid.org

Strong equivalence between metrics of Wasserstein type

E Bayraktar, G Guo - Electronic Communications in Probability, 2021 - projecteuclid.org

The sliced Wasserstein metric and, more recently, the max-sliced Wasserstein metric

have attracted abundant attention in data sciences and machine learning due to their

advantages to tackle the curse of dimensionality, see e.g. [15], [6]. A question of particular …

  All 2 versions


2021   [PDF] ieee.org

Wasserstein Divergence GAN with Cross-Age Identity Expert and Attribute Retainer for Facial Age Transformation

GS Hsu, RC Xie, ZT Chen - IEEE Access, 2021 - ieeexplore.ieee.org

We propose the Wasserstein Divergence GAN with an identity expert and an attribute

retainer for facial age transformation. The Wasserstein Divergence GAN (WGAN-div) can

better stabilize the training and lead to better target image generation. The identity expert …

 <——2021———2021———460——   


2021

[PDF] arxiv.org

Central Limit Theorem for Semidiscrete Wasserstein Distances

E del Barrio, A González-Sanz, JM Loubes - arXiv preprint arXiv …, 2021 - arxiv.org

We address the problem of proving a Central Limit Theorem for the empirical optimal

transport cost, $\sqrt{n}\{\mathcal{T}_c(P_n, Q)-\mathcal{W}_c(P, Q)\}$, in the semi-

discrete case, i.e. when the distribution $P$ is finitely supported. We show that the …

  All 4 versions 


[PDF] arxiv.org

Closed‐form expressions for maximum mean discrepancy with applications to Wasserstein auto‐encoders

RM Rustamov - Stat, 2021 - Wiley Online Library

The maximum mean discrepancy (MMD) has found numerous applications in statistics and

machine learning, among which is its use as a penalty in the Wasserstein auto‐encoder

(WAE). In this paper, we compute closed-form expressions for estimating the Gaussian …

  Cited by 5 Related articles All 3 versions


[PDF] arxiv.org

Robust W-GAN-Based Estimation Under Wasserstein Contamination

Z Liu, PL Loh - arXiv preprint arXiv:2101.07969, 2021 - arxiv.org

Robust estimation is an important problem in statistics which aims at providing a reasonable

estimator when the data-generating distribution lies within an appropriately defined ball

around an uncontaminated distribution. Although minimax rates of estimation have been …

  All 2 versions 


[PDF] arxiv.org

Approximation for Probability Distributions by Wasserstein GAN

Y Gao, MK Ng - arXiv preprint arXiv:2103.10060, 2021 - arxiv.org

In this paper, we show that the approximation for distributions by Wasserstein GAN depends

on both the width/depth (capacity) of generators and discriminators, as well as the number of

samples in training. A quantified generalization bound is developed for Wasserstein  …

  All 3 versions 


[PDF] arxiv.org

Predictive density estimation under the Wasserstein loss

T Matsuda, WE Strawderman - Journal of Statistical Planning and Inference, 2021 - Elsevier

We investigate predictive density estimation under the L 2 Wasserstein loss for location

families and location-scale families. We show that plug-in densities form a complete class

and that the Bayesian predictive density is given by the plug-in density with the posterior …

   Related articles All 5 versions


2021


[PDF] arxiv.org

Data-Driven Distributionally Robust Surgery Planning in Flexible Operating Rooms Over a Wasserstein Ambiguity

KS Shehadeh - arXiv preprint arXiv:2103.15221, 2021 - arxiv.org

We study elective surgery planning in flexible operating rooms where emergency patients

are accommodated in the existing elective surgery schedule. Probability distributions of

surgery durations are unknown, and only a small set of historical realizations is available. To …

  All 2 versions 


Sample out-of-sample inference based on Wasserstein distance

J Blanchet, Y Kang - Operations Research, 2021 - pubsonline.informs.org

We present a novel inference approach that we call sample out-of-sample inference. The

approach can be used widely, ranging from semisupervised learning to stress testing, and it

is fundamental in the application of data-driven distributionally robust optimization. Our …

  Cited by 21 Related articles All 5 versions

[PDF] arxiv.org


Wasserstein F-tests and confidence bands for the Fréchet regression of density response curves

A Petersen, X Liu, AA Divani - The Annals of Statistics, 2021 - projecteuclid.org

Data consisting of samples of probability density functions are increasingly prevalent,

necessitating the development of methodologies for their analysis that respect the inherent

nonlinearities associated with densities. In many applications, density curves appear as …

  Cited by 3 Related articles All 2 versions


[PDF] arxiv.org

Wasserstein Proximal Algorithms for the Schrödinger Bridge Problem: Density Control with Nonlinear Drift

K Caluya, A Halder - IEEE Transactions on Automatic Control, 2021 - ieeexplore.ieee.org

We study the Schrödinger bridge problem (SBP) with nonlinear prior dynamics. In

control-theoretic language, this is a problem of minimum effort steering of a given joint state

probability density function (PDF) to another over a finite time horizon, subject to a controlled …

  Cited by 4 Related articles All 4 versions


Local Stability of Wasserstein GANs With Abstract Gradient Penalty

C Kim, S Park, HJ Hwang - IEEE Transactions on Neural …, 2021 - ieeexplore.ieee.org

The convergence of generative adversarial networks (GANs) has been studied substantially

in various aspects to achieve successful generative tasks. Ever since it is first proposed, the

idea has achieved many theoretical improvements by injecting an instance noise, choosing …

  All 3 versions

 <——2021———2021———470——


[HTML] tandfonline.com

Full View

Dissimilarity measure of local structure in inorganic crystals using Wasserstein distance to search for novel phosphors

S Takemura, T Takeda, T Nakanishi… - … and Technology of …, 2021 - Taylor & Francis

To efficiently search for novel phosphors, we propose a dissimilarity measure of local

structure using the Wasserstein distance. This simple and versatile method provides the

quantitative dissimilarity of a local structure around a center ion. To calculate the …

  All 2 versions


[PDF] arxiv.org

Predictive density estimation under the Wasserstein loss

T Matsuda, WE Strawderman - Journal of Statistical Planning and Inference, 2021 - Elsevier

We investigate predictive density estimation under the L 2 Wasserstein loss for location

families and location-scale families. We show that plug-in densities form a complete class

and that the Bayesian predictive density is given by the plug-in density with the posterior …

   Related articles All 5 versions


[PDF] arxiv.org

Tighter expected generalization error bounds via Wasserstein distance

B Rodríguez-Gálvez, G Bassi, R Thobaben… - arXiv preprint arXiv …, 2021 - arxiv.org

In this work, we introduce several expected generalization error bounds based on the

Wasserstein distance. More precisely, we present full-dataset, single-letter, and random-

subset bounds on both the standard setting and the randomized-subsample setting from …

  All 3 versions 


[PDF] arxiv.org

Convergence and finite sample approximations of entropic regularized Wasserstein distances in Gaussian and RKHS settings

MH Quang - arXiv preprint arXiv:2101.01429, 2021 - arxiv.org

This work studies the convergence and finite sample approximations of entropic regularized

Wasserstein distances in the Hilbert space setting. Our first main result is that for Gaussian

measures on an infinite-dimensional Hilbert space, convergence in the 2-Sinkhorn …

  Related articles All 3 versions 


[PDF] arxiv.org

Two-sample Test with Kernel Projected Wasserstein Distance

J Wang, R Gao, Y Xie - arXiv preprint arXiv:2102.06449, 2021 - arxiv.org

We develop a kernel projected Wasserstein distance for the two-sample test, an essential

building block in statistics and machine learning: given two sets of samples, to determine

whether they are from the same distribution. This method operates by finding the nonlinear …

  All 3 versions 
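As a rough illustration of the general recipe (project, compute a one-dimensional Wasserstein statistic, calibrate by permutation): the sketch below uses a random projection as a crude stand-in for the optimized kernel projection of the paper, and all names, sizes and the number of permutations are placeholder choices:

import numpy as np
from scipy.stats import wasserstein_distance

def projected_w1(x, y, theta):
    # 1-Wasserstein distance between the two samples projected onto direction theta
    return wasserstein_distance(x @ theta, y @ theta)

def permutation_test(x, y, n_perm=500, seed=0):
    rng = np.random.default_rng(seed)
    theta = rng.normal(size=x.shape[1])
    theta /= np.linalg.norm(theta)               # fixed (random) projection direction
    observed = projected_w1(x, y, theta)
    pooled = np.vstack([x, y])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                      # permute the group labels
        count += projected_w1(pooled[:len(x)], pooled[len(x):], theta) >= observed
    return (count + 1) / (n_perm + 1)            # permutation p-value

x = np.random.default_rng(1).normal(size=(200, 5))
y = np.random.default_rng(2).normal(loc=0.5, size=(200, 5))
print(permutation_test(x, y))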


2021


[PDF] researchgate.net

[PDF] Two-sample Test using Projected Wasserstein Distance

J Wang, R Gao, Y Xie - researchgate.net

We develop a projected Wasserstein distance for the two-sample test, a fundamental

problem in statistics and machine learning: given two sets of samples, to determine whether

they are from the same distribution. In particular, we aim to circumvent the curse of …

 <——2021———2021———480——


[PDF] arxiv.org

On the Convexity of Discrete Time Covariance Steering in Stochastic Linear Systems with Wasserstein Terminal Cost

IM Balci, A Halder, E Bakolas - arXiv preprint arXiv:2103.13579, 2021 - arxiv.org

… It is assumed that the initial state is a normal vector and in particular, $x_0 \sim N(\mu_0, S_0)$, where $\mu_0 \in \mathbb{R}^n$ and $S_0 \succ 0$, and in addition, the disturbance process is a … (13) It was shown in [1] that the

problem of discrete time covariance steering with Wasserstein terminal cost …

  All 2 versions 

[PDF] arxiv.org

Tighter expected generalization error bounds via Wasserstein distance

B Rodríguez-Gálvez, G Bassi, R Thobaben… - arXiv preprint arXiv …, 2021 - arxiv.org

In this work, we introduce several expected generalization error bounds based on the

Wasserstein distance. More precisely, we present full-dataset, single-letter, and random-

subset bounds on both the standard setting and the randomized-subsample setting from …

  All 3 versions 


Dimension-free Wasserstein contraction of nonlinear filters

N Whiteley - Stochastic Processes and their Applications, 2021 - Elsevier

For a class of partially observed diffusions, conditions are given for the map from the initial

condition of the signal to filtering distribution to be contractive with respect to Wasserstein

distances, with rate which does not necessarily depend on the dimension of the state-space …

  All 2 versions


2021 online Cover Image

Wasserstein autoencoders for collaborative filtering

by Zhang, Xiaofeng; Zhong, Jingbin; Liu, Kai

Neural computing & applications, 04/2021, Volume 33, Issue 7

The recommender systems have long been studied in the literature. The collaborative filtering is one of the most widely adopted recommendation techniques which...


Journal ArticleFull Text Online

Cited by 22 Related articles All 4 versions

online Cover Image OPEN ACCESS

Sample Out-of-Sample Inference Based on Wasserstein Distance

by Blanchet, Jose; Kang, Yang

Operations research, 03/2021

Financial institutions make decisions according to a model of uncertainty. At the same time, regulators often evaluate the risk exposure of these institutions...


Journal ArticleFull Text Online
Zbl 07375481 
 Zbl 07376388
Cited by 28
Related articles All 7 versions


2021

Exponential convergence in entropy and Wasserstein for McKean-Vlasov SDEs

By: Ren, Panpan; Wang, Feng-Yu

NONLINEAR ANALYSIS-THEORY METHODS & APPLICATIONS  Volume: ‏ 206     Article Number: 112259   Published: ‏ MAY 2021


[HTML] aclanthology.org

[HTML] Wasserstein Selective Transfer Learning for Cross-domain Text Mining

L Feng, M Qiu, Y Li, H Zheng… - Proceedings of the 2021 …, 2021 - aclanthology.org

… the estimated Wasserstein distance in an adversarial manner and provides domain invariant 

features for the reinforced selector. We adopt an evaluation metric based on the performance 

of the TL module as delayed reward and a Wasserstein-based metric as immediate …

Cited by 2 Related articles All 2 versions


On Riemannian Optimization over Positive Definite Matrices with the Bures-Wasserstein Geometry
Authors:Han, Andi (Creator), Mishra, Bamdev (Creator), Jawanpuria, Pratik (Creator), Gao, Junbin (Creator)
Summary: In this paper, we comparatively analyze the Bures-Wasserstein (BW) geometry with the popular Affine-Invariant (AI) geometry for Riemannian optimization on the symmetric positive definite (SPD) matrix manifold. Our study begins with an observation that the BW metric has a linear dependence on SPD matrices in contrast to the quadratic dependence of the AI metric. We build on this to show that the BW metric is a more suitable and robust choice for several Riemannian optimization problems over ill-conditioned SPD matrices. We show that the BW geometry has a non-negative curvature, which further improves convergence rates of algorithms over the non-positively curved AI geometry. Finally, we verify that several popular cost functions, which are known to be geodesic convex under the AI geometry, are also geodesic convex under the BW geometry. Extensive experiments on various applications support our findings.
Downloadable Archival Material, 2021-06-01
Undefined
Publisher:2021-06-01
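The Bures-Wasserstein distance behind this geometry coincides with the 2-Wasserstein distance between centered Gaussians and, for SPD matrices $A$ and $B$, has the closed form

$$d_{\mathrm{BW}}(A,B)^2 = \operatorname{tr}(A) + \operatorname{tr}(B) - 2\,\operatorname{tr}\big((A^{1/2} B A^{1/2})^{1/2}\big),$$

which is what makes it an attractive alternative to the affine-invariant metric for optimization over SPD matrices.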

 

2021 online Cover Image

On the computational complexity of finding a sparse Wasserstein barycenter

by Borgwardt, Steffen; Patterson, Stephan

Journal of combinatorial optimization, 04/2021, Volume 41, Issue 3

The discrete Wasserstein barycenter problem is a minimum-cost mass transport problem for a set of probability measures with finite support. In this paper, we...


Journal ArticleFull Text Online

Borgwardt, Steffen; Patterson, Stephan

On the computational complexity of finding a sparse Wasserstein barycenter. (English) Zbl 07347228

J. Comb. Optim. 41, No. 3, 736-761 (2021).

MSC:  68Q17 68Q25 90B06 90B80 05C70


Cited by 16 Related articles All 6 versions

online Cover Image OPEN ACCESS

Panchromatic Image Super-Resolution Via Self Attention-Augmented Wasserstein Generative Adversarial Network

by Du, Juan; Cheng, Kuanhong; Yu, Yue ; More...

Sensors (Basel, Switzerland), 03/2021, Volume 21, Issue 6

Panchromatic (PAN) images contain abundant spatial information that is useful for earth observation, but always suffer from low-resolution ( LR) due to the...


Journal ArticleFull Text Online 

Published: ‏ 2021 Mar 19

Cited by 5 Related articles All 8 versions

<——2021———2021———490——


online Cover Image

The measurement of relations on belief functions based on the Kantorovich problem and the Wasserstein metric

by Bronevich, Andrey G; Rozenberg, Igor N

International journal of approximate reasoning, 04/2021, Volume 131

In this paper, we show how the Kantorovich problem appears in many constructions in the theory of belief functions. We demonstrate this on several relations on...


Journal ArticleFull Text Online  Zbl 07414867

Related articles All 4 versions

online Cover Image

Adversarial training with Wasserstein distance for learning cross-lingual word embeddings

by Li, Yuling; Zhang, Yuhong; Yu, Kui ; More...

Applied intelligence (Dordrecht, Netherlands), 03/2021


Journal ArticleFull Text Online

Cited by 5 Related articles All 3 versions

Cover Image

On clustering uncertain and structured data with Wasserstein barycenters and a geodesic criterion for...

by Papayiannis, G. I; Domazakis, G. N; Drivaliaris, D ; More...

Journal of statistical computation and simulation, Volume ahead-of-print, Issue ahead-of-print

Clustering schemes for uncertain and structured data are considered relying on the notion of Wasserstein barycenters, accompanied by appropriate clustering...

Journal ArticleCitation Online


2021 online Cover Image

Convergence rate in Wasserstein distance and semiclassical limit for the defocusing logarithmic Schrödinger equation

by Ferriere, Guillaume

Analysis & PDE, 03/2021, Volume 14, Issue 2


Journal ArticleFull Text Online

 

online Cover Image

Reconstruction Method for Missing Measurement Data Based on Wasserstein Generative Adversarial Network

by Zhang, Changfan; Chen, Hongrun; He, Jing ; More...

Journal of advanced computational intelligence and intelligent informatics, 03/2021, Volume 25, Issue 2

Focusing on the issue of missing measurement data caused by complex and changeable working conditions during the operation of high-speed trains, in this paper,...

Journal ArticleFull Text Online



2021

[PDF] arxiv.org

The existence of minimizers for an isoperimetric problem with Wasserstein penalty term in unbounded domains

Q Xia, B Zhou - Advances in Calculus of Variations, 2021 - degruyter.com

In this article, we consider the (double) minimization problem $\min\{P(E;\Omega) + \lambda W_p(E,F) : E \subseteq \Omega,\ F \subseteq \mathbb{R}^d,\ |E \cap F| = 0,\ |E| = |F| = 1\}$, where $\lambda > 0$, $p \ge 1$, $\Omega$ is a (possibly unbounded)

domain in $\mathbb{R}^d$, $P(E;\Omega)$ denotes the relative perimeter of $E$ in $\Omega$ and $W_p$ denotes the $p$-…

  Related articles All 4 versions


onlinenCover Image

Nonembeddability of persistence diagrams with $p>2$ Wasserstein metric

by Wagner, Alexander

Proceedings of the American Mathematical Society, 03/2021


Journal ArticleFull Text Online

MR4246816 Prelim Wagner, Alexander; Nonembeddability of persistence diagrams with p>2 Wasserstein metric. Proc. Amer. Math. Soc. 149 (2021), no. 6, 2673–2677. 55N99 (46C05)

Zbl 07337079

2021 see 2022  online Cover Image

Wasserstein statistics in one-dimensional location scale models

by Amari, Shun-ichi; Matsuda, Takeru

Annals of the Institute of Statistical Mathematics, 03/2021


Journal ArticleFull Text Online   

 

onlinem OPEN ACCESS

Approximation for Probability Distributions by Wasserstein GAN

by Gao, Yihang; Ng, Michael K

03/2021

In this paper, we show that the approximation for distributions by Wasserstein GAN depends on both the width/depth (capacity) of generators and discriminators,...

Journal ArticleFull Text Online


Approximation Capabilities of Wasserstein Generative Adversarial Networks

Y Gao, M Zhou, MK Ng - arXiv preprint arXiv:2103.10060, 2021 - 128.84.4.34

In this paper, we study Wasserstein Generative Adversarial Networks (WGANs) using

GroupSort neural networks as discriminators. We show that the error bound for the

approximation of target distribution depends on both the width/depth (capacity) of generators …

Related articles All 2 versions 

 <——2021———2021———500——

 

online  OPEN ACCESS

A sharp Wasserstein uncertainty principle for Laplace eigenfunctions

by Mukherjee, Mayukh

03/2021

Consider an eigenfunction of the Laplace-Beltrami operator on a smooth compact Riemannian surface. We prove a conjectured lower bound on the Wasserstein...

Journal ArticleFull Text Online

Cited by 4 Related articles

online  OPEN ACCESS

Internal Wasserstein Distance for Adversarial Attack and Defense

by Li, Jincheng; Cao, Jiezhang; Zhang, Shuhai ; More...

03/2021

Deep neural networks (DNNs) are vulnerable to adversarial examples that can trigger misclassification of DNNs but may be imperceptible to human perception....

Journal ArticleFull Text Online

Cited by 3 Related articles All 4 versions

online   OPEN ACCESS

Wasserstein Robust Support Vector Machines with Fairness Constraints

by Wang, Yijie; Nguyen, Viet Anh; Hanasusanto, Grani A

03/2021

We propose a distributionally robust support vector machine with a fairness constraint that encourages the classifier to be fair in view of the equality of...

Journal ArticleFull Text Online

online  OPEN ACCESS

Data-Driven Distributionally Robust Surgery Planning in Flexible Operating Rooms Over a Wasserstein Ambiguity

by Shehadeh, Karmel S

03/2021

We study elective surgery planning in flexible operating rooms where emergency patients are accommodated in the existing elective surgery schedule. Probability...

Journal ArticleFull Text Online
[PDF] arxiv.org

Data-Driven Distributionally Robust Surgery Planning in Flexible Operating Rooms Over a Wasserstein Ambiguity

KS Shehadeh - arXiv preprint arXiv:2103.15221, 2021 - arxiv.org

We study elective surgery planning in flexible operating rooms where emergency patients

are accommodated in the existing elective surgery schedule. Probability distributions of

surgery durations are unknown, and only a small set of historical realizations is available. To …

  Related articles All 2 versions 

[PDF] amazonaws.com

[PDF] STOCHASTIC GRADIENT METHODS FOR L2-WASSERSTEIN LEAST SQUARES PROBLEM OF GAUSSIAN MEASURES

S YUN, X SUN, JIL CHOI… - J. Korean Soc …, 2021 - ksiam-editor.s3.amazonaws.com

This paper proposes stochastic methods to find an approximate solution for the L2-Wasserstein least squares problem of Gaussian measures. The variable for the problem is in a set of positive definite matrices. The first proposed stochastic method is a type of classical …
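For reference, the closed form that enters such least-squares problems is the 2-Wasserstein distance between Gaussian measures,

$$W_2^2\big(\mathcal{N}(m_1,\Sigma_1), \mathcal{N}(m_2,\Sigma_2)\big) = \|m_1 - m_2\|^2 + \operatorname{tr}\Big(\Sigma_1 + \Sigma_2 - 2\big(\Sigma_1^{1/2} \Sigma_2 \Sigma_1^{1/2}\big)^{1/2}\Big),$$

whose covariance part explains why the optimization variable lives in the set of positive definite matrices mentioned in the abstract.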

2021


online Cover Image  OPEN ACCESS

Wasserstein distance-based probabilistic linguistic TODIM method with application to the evaluation of sustainable rural tourism potential

by Zhang, Shitao; Wu, Zhangjiao; Ma, Zhenzhen ; More...

Ekonomska istraživanja, , Volume ahead-of-print, Issue ahead-of-print

The evaluation of sustainable rural tourism potential is a key work in sustainable rural tourism development. Due to the complexity of the rural tourism...


Journal ArticleFull Text Online

online

New Breast Cancer Findings from University of Technology Described (Breast Cancer Histopathology Image Super-resolution Using Wide-attention Gan With Improved Wasserstein Gradient Penalty and Perceptual Loss)


Women's health weekly, 04/2021

Journal ArticleFull Text Online


Patent Issued for Object Shape Regression Using Wasserstein Distance

Journal of Engineering, 03/2021

NewsletterCitation Online


2021 online  OPEN ACCESS

Generalized Wasserstein Dice Score, Distributionally Robust Deep Learning, and Ranger for Brain Tumor Segmentation: BraTS 2020 challenge

by Fidon, Lucas; Ourselin, Sébastien; Vercauteren, Tom

Brainlesion: Glioma, Multiple Sclerosis, Stroke and Traumatic Brain Injuries, 03/2021

Training a deep neural network is an optimization problem with four main ingredients: the design of the deep neural network, the per-sample loss function, the...

Book ChapterFull Text Online

Cited by 33 Related articles All 6 versions

Researchers from University of Nevada Report Details of New Studies and Findings in the Area of Statistics (Convergence Rate To Equilibrium In Wasserstein Distance for Reflected Jump-diffusions)

Mathematics Week, 03/2021

NewsletterCitation Online


 <——2021———2021———510—— 

Data from School of Traffic and Transportation Engineering Provide New Insights into Intelligent Systems (Short-term Railway Passenger Demand Forecast Using Improved Wasserstein Generative Adversarial Nets and Web Search Terms)

Robotics & Machine Learning, 03/2021

NewsletterCitation Online



 2021

TextureWGAN: texture preserving WGAN with MLE regularizer for inverse problems

by Ikuta, Masaki; Zhang, Jun

02/2021

Many algorithms and methods have been proposed for inverse problems particularly with the recent surge of interest in machine learning and deep learning...

Conference ProceedingCitation Online


 Cited by 2 Related articles All 4 versions


Research on WGAN models with Rényi Differential Privacy

by Lee, Sujin; Park, Cheolhee; Hong, Dowon ; More...

Chŏngbo Kwahakhoe nonmunji, 01/2021, Volume 48, Issue 1

Journal ArticleCitation Online

[PDF] arxiv.org

A Wasserstein index of dependence for random measures

M Catalano, H Lavenant, A Lijoi, I Prünster - arXiv preprint arXiv …, 2021 - arxiv.org

… in terms of Wasserstein distance from the maximally dependent scenario when $d=2$. 

By solving an intriguing max-min problem we are now able to define a Wasserstein index of …

 Cited by 2 Related articles All 3 versions
A Wasserstein index of dependence for random measures
Authors: Marta Catalano, Hugo Lavenant, Antonio Lijoi, Igor Prünster
eBook, 2021
English
Publisher:Fondazione Collegio Carlo Alberto, Torino, 2021


online  OPEN ACCESS

WGAN with an Infinitely Wide Generator Has No Spurious Stationary Points

by No, Albert; Yoon, Taeho; Kwon, Se-Hyeon ; More...

02/2021

Generative adversarial networks (GAN) are a widely used class of deep generative models, but their minimax training dynamics are not understood very well. In...

Journal ArticleFull Text Online

 WGAN with an Infinitely Wide Generator Has No Spurious ...

https://arxiv.org › cs

by A No · 2021 — We then show that when the width of the generator is finite but wide, there are no spurious stationary points within a ball whose radius becomes ...

 Cited by 2 Related articles All 7 versions

2021

onlinenOPEN ACCESS

About the regularity of the discriminator in conditional WGANs

by Martin, Jörg

03/2021

Training of conditional WGANs is usually done by averaging the underlying loss over the condition. Depending on the way this is motivated different constraints...

Journal ArticleFull Text Online

 (PDF) About the regularity of the discriminator in conditional ...

https://www.researchgate.net › ... › Discrimination

Mar 26, 2021 — PDF | Training of conditional WGANs is usually done by averaging the underlying loss over the condition. Depending on the way this is ...

[CITATION] About the regularity of the discriminator in conditional WGANs.

J Martin - CoRR, 2021

2021 see 2020  online

New Computers Data Have Been Reported by Investigators at Sichuan University 

(Spam Transaction Attack Detection Model Based On Gru and Wgan-div)

Computer Weekly News, 02/2021

NewsletterFull Text Online

 

Hebei University of Technology Researchers Update Understanding of Robotics 

(Low-Illumination Image Enhancement in the Space Environment Based on the DC-WGAN...

Journal of Engineering, 01/2021

NewsletterCitation Online

 Sensors (Basel, Switzerland), 01/2021, Volume 21, Issue 1
Owing to insufficient illumination of the space station, the image information collected by the intelligent robot will be degraded, and it will not be able to...
Journal Article  Full Text Online

Wasserstein F-tests and confidence bands for the Fréchet regression of density response curves

A Petersen, X Liu, AA Divani - The Annals of Statistics, 2021 - projecteuclid.org

Data consisting of samples of probability density functions are increasingly prevalent,

necessitating the development of methodologies for their analysis that respect the inherent

nonlinearities associated with densities. In many applications, density curves appear as …

  Cited by 3 Related articles All 2 versions


[PDF] arxiv.org

On the Convexity of Discrete Time Covariance Steering in Stochastic Linear Systems with Wasserstein Terminal Cost

IM BalciA HalderE Bakolas - arXiv preprint arXiv:2103.13579, 2021 - arxiv.org

In this work, we analyze the properties of the solution to the covariance steering problem for

discrete time Gaussian linear systems with a squared Wasserstein distance terminal cost. In

our previous work, we have shown that by utilizing the state feedback control policy …

  All 2 versions 

<——2021———2021———520——


[PDF] arxiv.org

A Mathematical Principle of Deep Learning: Learn the Geodesic Curve in the Wasserstein Space

K Gai, S Zhang - arXiv preprint arXiv:2102.09235, 2021 - arxiv.org

Recent studies revealed the mathematical connection of deep neural network (DNN) and

dynamic system. However, the fundamental principle of DNN has not been fully

characterized with dynamic system in terms of optimization and generalization. To this end …

  All 2 versions 


MCKEAN-VLASOV SDES WITH DRIFTS DISCONTINUOUS UNDER WASSERSTEIN DISTANCE

By: Huang, Xing; Wang, Feng-Yu

DISCRETE AND CONTINUOUS DYNAMICAL SYSTEMS  Volume: ‏ 41   Issue: ‏ 4   Pages: ‏ 1667-1679   Published: ‏ APR 2021



patent

Restoring image based on Wasserstein generative adversarial network comprises constructing shallow convolutional network structure by using cavity convolution to extract spatial characteristic of collected image

Patent Number: CN112488956-A

Patent Assignee: UNIV NANJING INFORMATION SCI & TECHNOLOG

Inventor(s): FANG W; GU E; WANG W. 



Berry-Esseen Smoothing Inequality for the Wasserstein Metric on Compact Lie Groups

By: Borda, Bence

JOURNAL OF FOURIER ANALYSIS AND APPLICATIONS  Volume: ‏ 27   Issue: ‏ 2     Article Number: 13   Published: ‏ MAR 3 2021


2021

 Lane Line Detection Network and Wasserstein GAN

By: Zhang, Youcheng; Lu, Zongqing; Ma, Dongdong; et al.

IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS  Volume: ‏ 22   Issue: ‏ 3   Pages: ‏ 1532-1542   Published: ‏ MAR 2021


2021


Classification of atomic environments via the Gromov-Wasserstein distance

By: Kawano, Sakura; Mason, Jeremy K.

COMPUTATIONAL MATERIALS SCIENCE  Volume: ‏ 188     Article Number: 110144   Published: ‏ FEB 15 2021


Wasserstein GAN for the Classification of Unbalanced THz Database

By: Zhu Rong-sheng; Shen Tao; Liu Ying-li; et al.

SPECTROSCOPY AND SPECTRAL ANALYSIS  Volume: ‏ 41   Issue: ‏ 2   Pages: ‏ 425-429   Published: ‏ FEB 2021

  All 2 versions 

[CITATION] Wasserstein GAN for the Classification of Unbalanced THz Database

Z Rong-sheng, S Tao, L Ying-li… - …, 2021 - OFFICE SPECTROSCOPY & …
All 2 versions
 

 

Wasserstein Exponential Kernels

By: De Plaen, Henri; Fanuel, Michael; Suykens, Johan A. K.

Conference: International Joint Conference on Neural Networks (IJCNN) held as part of the IEEE World Congress on Computational Intelligence (IEEE WCCI) Location: ‏ ELECTR NETWORK Date: ‏ JUL 19-24, 2020

Sponsor(s): ‏IEEE; IEEE Computat Intelligence Soc; Int Neural Network Soc

2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN)  Book Series: ‏ IEEE International Joint Conference on Neural Networks (IJCNN)   Published: ‏


2021

When ot meets mom: Robust estimation of wasserstein distance

G Staerman, P Laforgue… - International …, 2021 - proceedings.mlr.press

Abstract Originated from Optimal Transport, the Wasserstein distance has gained

importance in Machine Learning due to its appealing geometrical properties and the

increasing availability of efficient approximations. It owes its recent ubiquity in generative …

  Cited by 3 Related articles All 4 versions 


[PDF] arxiv.org

Classification of atomic environments via the Gromov–Wasserstein distance

S Kawano, JK Mason - Computational Materials Science, 2021 - Elsevier

Interpreting molecular dynamics simulations usually involves automated classification of

local atomic environments to identify regions of interest. Existing approaches are generally

limited to a small number of reference structures and only include limited information about …

  Cited by 2 Related articles All 4 versions

<——2021———2021———530——


[PDF] arxiv.org

Wasserstein proximal of GANs

AT Lin, W Li, S Osher, G Montúfar - arXiv preprint arXiv:2102.06862, 2021 - arxiv.org

We introduce a new method for training generative adversarial networks by applying the

Wasserstein-2 metric proximal on the generators. The approach is based on Wasserstein

information geometry. It defines a parametrization invariant natural gradient by pulling back …

  Cited by 7 Related articles All 4 versions 


Rate of convergence for particles approximation of PDEs in Wasserstein space

M Germain, H Pham, X Warin - arXiv preprint arXiv:2103.00837, 2021 - arxiv.org

We prove a rate of convergence of order 1/N for the N-particle approximation of a second-

order partial differential equation in the space of probability measures, like the Master

equation or Bellman equation of mean-field control problem under common noise. The proof …

   All 15 versions 


[PDF] arxiv.org

A short proof on the rate of convergence of the empirical measure for the Wasserstein distance

V Divol - arXiv preprint arXiv:2101.08126, 2021 - arxiv.org

We provide a short proof that the Wasserstein distance between the empirical measure of a

n-sample and the estimated measure is of order n^-(1/d), if the measure has a lower and

upper bounded density on the d-dimensional flat torus. Subjects: Statistics Theory (math. ST) …

   All 4 versions 


[PDF] arxiv.org

Tracial smooth functions of non-commuting variables and the free Wasserstein manifold

D Jekel, W Li, D Shlyakhtenko - arXiv preprint arXiv:2101.06572, 2021 - arxiv.org

We formulate a free probabilistic analog of the Wasserstein manifold on $\mathbb {R}^ d

$(the formal Riemannian manifold of smooth probability densities on $\mathbb {R}^ d $),

and we use it to study smooth non-commutative transport of measure. The points of the free …

   All 4 versions 


[PDF] mdpi.com

WRGAN: Improvement of RelGAN with Wasserstein Loss for Text Generation

Z Jiao, F Ren - Electronics, 2021 - mdpi.com

Generative adversarial networks (GANs) were first proposed in 2014, and have been widely

used in computer vision, such as for image generation and other tasks. However, the GANs

used for text generation have made slow progress. One of the reasons is that the …

 

2021


[HTML] biomedcentral.com

[HTML] Prediction and analysis of multiple protein lysine modified sites based on conditional wasserstein generative adversarial networks

Y Yang, H Wang, W Li, X Wang… - BMC …, 2021 - bmcbioinformatics.biomedcentral …

Protein post-translational modification (PTM) is a key issue to investigate the mechanism of

protein's function. With the rapid development of proteomics technology, a large amount of

protein sequence data has been generated, which highlights the importance of the in-depth …

  All 7 versions 


The measurement of relations on belief functions based on the Kantorovich problem and the Wasserstein metric

AG Bronevich, IN Rozenberg - International Journal of Approximate …, 2021 - Elsevier

In this paper, we show how the Kantorovich problem appears in many constructions in the

theory of belief functions. We demonstrate this on several relations on belief functions such

as inclusion, equality and intersection of belief functions. Using the Kantorovich problem we …

  All 2 versions


Local Stability of Wasserstein GANs With Abstract Gradient Penalty

C Kim, S Park, HJ Hwang - IEEE Transactions on Neural …, 2021 - ieeexplore.ieee.org

The convergence of generative adversarial networks (GANs) has been studied substantially

in various aspects to achieve successful generative tasks. Ever since it is first proposed, the

idea has achieved many theoretical improvements by injecting an instance noise, choosing …

  All 3 versions


[HTML] tandfonline.com

Full View

Dissimilarity measure of local structure in inorganic crystals using Wasserstein distance to search for novel phosphors

S Takemura, T Takeda, T Nakanishi… - … and Technology of …, 2021 - Taylor & Francis

To efficiently search for novel phosphors, we propose a dissimilarity measure of local

structure using the Wasserstein distance. This simple and versatile method provides the

quantitative dissimilarity of a local structure around a center ion. To calculate the …

  All 2 versions


A Mathematical Principle of Deep Learning: Learn the Geodesic Curve in the Wasserstein Space

K Gai, S Zhang - arXiv preprint arXiv:2102.09235, 2021 - arxiv.org

Recent studies revealed the mathematical connection of deep neural network (DNN) and

dynamic system. However, the fundamental principle of DNN has not been fully

characterized with dynamic system in terms of optimization and generalization. To this end …

  All 2 versions 

<——2021———2021———540——


Distributionally Robust Resilient Operation of Integrated Energy Distribution Systems Using Moment and Wasserstein Metric for Contingencies

M Shahidehpour, Y Zhou, Z Wei… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org

Extreme weather events pose a serious threat to energy distribution systems. We propose a

distributionally robust optimization model for the resilient operation of the integrated

electricity and heat energy distribution systems in extreme weather events. We develop a …


[HTML] springer.com

[HTML] … : extraction of peptide chromatographic elution profiles from large scale mass spectrometry data by means of Wasserstein compressive hierarchical cluster …

O Permiakova, R Guibert, A Kraut, T Fortin… - BMC …, 2021 - Springer

The clustering of data produced by liquid chromatography coupled to mass spectrometry

analyses (LC-MS data) has recently gained interest to extract meaningful chemical or

biological patterns. However, recent instrumental pipelines deliver data which size …

  All 12 versions


[PDF] arxiv.org

Geometrical aspects of entropy production in stochastic thermodynamics based on Wasserstein distance

M Nakazato, S Ito - arXiv preprint arXiv:2103.00503, 2021 - arxiv.org

We study a relationship between optimal transport theory and stochastic thermodynamics for

the Fokker-Planck equation. We show that the entropy production is proportional to the

action measured by the path length of the $ L^ 2$-Wasserstein distance, which is a measure …

  All 2 versions 


[PDF] arxiv.org

Continual Learning of Generative Models with Limited Data: From Wasserstein-1 Barycenter to Adaptive Coalescence

M Dedeoglu, S Lin, Z Zhang, J Zhang - arXiv preprint arXiv:2101.09225, 2021 - arxiv.org

Learning generative models is challenging for a network edge node with limited data and

computing power. Since tasks in similar environments share model similarity, it is plausible

to leverage pre-trained generative models from the cloud or other edge nodes. Appealing to …

  All 2 versions 


[PDF] arxiv.org

On Number of Particles in Coalescing-Fragmentating Wasserstein Dynamics

V Konarovskyi - arXiv preprint arXiv:2102.10943, 2021 - arxiv.org

Because of the sticky-reflected interaction in coalescing-fragmentating Wasserstein

dynamics, the model always consists of a finite number of distinct particles for almost all

times. We show that the interacting particle system must admit an infinite number of distinct …

  All 4 versions 


2021


[PDF] arxiv.org

Convergence and finite sample approximations of entropic regularized Wasserstein distances in Gaussian and RKHS settings

MH Quang - arXiv preprint arXiv:2101.01429, 2021 - arxiv.org

This work studies the convergence and finite sample approximations of entropic regularized

Wasserstein distances in the Hilbert space setting. Our first main result is that for Gaussian

measures on an infinite-dimensional Hilbert space, convergence in the 2-Sinkhorn …

  Related articles All 3 versions 
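
In practice, entropic regularized Wasserstein distances of the kind analysed above are computed with Sinkhorn iterations; a plain-NumPy sketch on two discretized 1-D densities (a finite-dimensional illustration only, not the paper's Hilbert-space setting):

import numpy as np

def sinkhorn_cost(mu, nu, C, reg=0.05, n_iter=500):
    K = np.exp(-C / reg)                  # Gibbs kernel of the cost matrix
    u = np.ones_like(mu)
    for _ in range(n_iter):               # alternate scalings to match both marginals
        v = nu / (K.T @ u)
        u = mu / (K @ v)
    P = u[:, None] * K * v[None, :]       # entropic transport plan
    return np.sum(P * C)                  # transport cost of the regularized plan

x = np.linspace(0.0, 1.0, 50)
mu = np.exp(-(x - 0.3) ** 2 / 0.01); mu /= mu.sum()
nu = np.exp(-(x - 0.7) ** 2 / 0.02); nu /= nu.sum()
C = (x[:, None] - x[None, :]) ** 2        # squared-distance ground cost
print(sinkhorn_cost(mu, nu, C))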


[PDF] arxiv.org

From the backward Kolmogorov PDE on the Wasserstein space to propagation of chaos for McKean-Vlasov SDEs

PEC de Raynal, N Frikha - Journal de Mathématiques Pures et Appliquées, 2021 - Elsevier

… In order to establish the existence and uniqueness of a fundamental solution of the Kolmogorov

PDE on the Wasserstein space as well as our quantitative estimates for the mean-field …

 Cited by 11 Related articles All 13 versions


 

On the computational complexity of finding a sparse Wasserstein barycenter

S Borgwardt, S Patterson - Journal of Combinatorial Optimization, 2021 - Springer

The discrete Wasserstein barycenter problem is a minimum-cost mass transport problem for

a set of probability measures with finite support. In this paper, we show that finding a

barycenter of sparse support is hard, even in dimension 2 and for only 3 measures. We …

  All 4 versions

 

Wasserstein Convergence Rate for Empirical Measures of Markov Chains

A Riekert - arXiv preprint arXiv:2101.06936, 2021 - arxiv.org

We consider a Markov chain on $\mathbb {R}^ d $ with invariant measure $\mu $. We are

interested in the rate of convergence of the empirical measures towards the invariant

measure with respect to the $1 $-Wasserstein distance. The main result of this article is a …

  All 2 versions 

2021  [PDF] projecteuclid.org

Strong equivalence between metrics of Wasserstein type

E Bayraktar, G Guo - Electronic Communications in Probability, 2021 - projecteuclid.org

The sliced Wasserstein metric and the more recently introduced max-sliced Wasserstein metric W_p

have attracted abundant attention in data sciences and machine learning due to their

advantages in tackling the curse of dimensionality, see e.g. [15], [6]. A question of particular …

  All 2 versions
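
The sliced Wasserstein metric appearing in the equivalence result above is typically estimated by averaging 1-D distances over random projection directions; a NumPy/SciPy Monte Carlo sketch with synthetic point clouds, for illustration only:

import numpy as np
from scipy.stats import wasserstein_distance

def sliced_w1(X, Y, n_proj=200, seed=0):
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=X.shape[1])
        theta /= np.linalg.norm(theta)                       # random unit direction
        total += wasserstein_distance(X @ theta, Y @ theta)  # 1-D W1 of the projections
    return total / n_proj

rng = np.random.default_rng(1)
X = rng.normal(0.0, 1.0, size=(300, 5))
Y = rng.normal(0.5, 1.0, size=(300, 5))
print(sliced_w1(X, Y))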

<——2021———2021———550——

[PDF] arxiv.org

Isometric Rigidity of compact Wasserstein spaces

J Santos-Rodríguez - arXiv preprint arXiv:2102.08725, 2021 - arxiv.org

Let $(X, d,\mathfrak {m}) $ be a metric measure space. The study of the Wasserstein space

$(\mathbb {P} _p (X),\mathbb {W} _p) $ associated to $ X $ has proved useful in describing

several geometrical properties of $ X. $ In this paper we focus on the study of isometries of  …

  All 3 versions 


[PDF] arxiv.org

On the Convexity of Discrete Time Covariance Steering in Stochastic Linear Systems with Wasserstein Terminal Cost

IM Balci, A Halder, E Bakolas - arXiv preprint arXiv:2103.13579, 2021 - arxiv.org

In this work, we analyze the properties of the solution to the covariance steering problem for

discrete time Gaussian linear systems with a squared Wasserstein distance terminal cost. In

our previous work, we have shown that by utilizing the state feedback control policy …

  All 2 versions 


[PDF] arxiv.org

The isometry group of Wasserstein spaces: the Hilbertian case

GP Gehér, T Titkos, D Virosztek - arXiv preprint arXiv:2102.02037, 2021 - arxiv.org

Motivated by Kloeckner's result on the isometry group of the quadratic Wasserstein space

$\mathcal {W} _2\left (\mathbb {R}^ n\right) $, we describe the isometry group $\mathrm

{Isom}\left (\mathcal {W} _p (E)\right) $ for all parameters $0< p<\infty $ and for all separable …

  All 2 versions 

The isometry group of Wasserstein spaces: the Hilbertian case

G Pál Gehér, T Titkos, D Virosztek - arXiv e-prints, 2021 - ui.adsabs.harvard.edu

Motivated by Kloeckner's result on the isometry group of the quadratic Wasserstein space

$\mathcal {W} _2\left (\mathbb {R}^ n\right) $, we describe the isometry group $\mathrm

{Isom}\left (\mathcal {W} _p (E)\right) $ for all parameters $0< p<\infty $ and for all separable …

[PDF] aaai.org

[PDF] Towards Generalized Implementation of Wasserstein Distance in GANs

M Xu, G Lu, W Zhang, Y Yu - 2021 - aaai.org

Abstract Wasserstein GANs (WGANs), built upon the Kantorovich-Rubinstein (KR) duality of

Wasserstein distance, is one of the most theoretically sound GAN models. However, in

practice it does not always outperform other variants of GANs. This is mostly due to the …

  All 2 versions 


WRGAN: Improvement of RelGAN with Wasserstein Loss for Text Generation. Electronics 2021, 10, 275

Z Jiao, F Ren - 2021 - search.proquest.com

Generative adversarial networks (GANs) were first proposed in 2014, and have been widely

used in computer vision, such as for image generation and other tasks. However, the GANs

used for text generation have made slow progress. One of the reasons is that the …

 

2021


[PDF] iop.org

Optimization Research on Abnormal Diagnosis of Transformer Voiceprint Recognition based on Improved Wasserstein GAN

K Zhu, H Ma, J Wang, C Yu, C Guo… - Journal of Physics …, 2021 - iopscience.iop.org

Transformer is an important infrastructure equipment of power system, and fault monitoring

is of great significance to its operation and maintenance, which has received wide attention

and much research. However, the existing methods at home and abroad are based on …

  All 2 versions


[PDF] ams.org

Nonembeddability of persistence diagrams with 𝑝> 2 Wasserstein metric

A Wagner - Proceedings of the American Mathematical Society, 2021 - ams.org

Persistence diagrams do not admit an inner product structure compatible with any

Wasserstein metric. Hence, when applying kernel methods to persistence diagrams, the

underlying feature map necessarily causes distortion. We prove that persistence diagrams …

 

[PDF] arxiv.org

Quantitative spectral gap estimate and Wasserstein contraction of simple slice sampling

V Natarovskii, D Rudolf, B Sprungk - The Annals of Applied …, 2021 - projecteuclid.org

We prove Wasserstein contraction of simple slice sampling for approximate sampling wrt

distributions with log-concave and rotational invariant Lebesgue densities. This yields, in

particular, an explicit quantitative lower bound of the spectral gap of simple slice sampling …

  Related articles All 4 versions


Wasserstein Autoencoders with Mixture of Gaussian Priors for Stylized Text Generation

A Ghabussi - 2021 - uwspace.uwaterloo.ca

Probabilistic text generation is an important application of Natural Language Processing

(NLP). Variational autoencoders and Wasserstein autoencoders are two widely used

methods for text generation. New research efforts focus on improving the quality of the …

  

[PDF] annalsofrscb.ro

Wasserstein GANs for Generation of Variated Image Dataset Synthesis

KDB Mudavathu, MVPCS Rao - Annals of the Romanian Society for …, 2021 - annalsofrscb.ro

Deep learning networks required a training lot of data to get to better accuracy. Given the

limited amount of data for many problems, we understand the requirement for creating the

image data with the existing sample space. For many years the different technique was used …

<——2021———2021———560——


Dimension-free Wasserstein contraction of nonlinear filters

N Whiteley - Stochastic Processes and their Applications, 2021 - Elsevier

For a class of partially observed diffusions, conditions are given for the map from the initial

condition of the signal to filtering distribution to be contractive with respect to Wasserstein

distances, with rate which does not necessarily depend on the dimension of the state-space …

  All 2 versions


Small Object Detection from Remote Sensing Images with the Help of Object-Focused Super-Resolution Using Wasserstein GANs

L Courtrai, MT Pham, C Friguet… - IGARSS 2020-2020 IEEE … - ieeexplore.ieee.org

In this paper, we investigate and improve the use of a super-resolution approach to benefit

the detection of small objects from aerial and satellite remote sensing images. The main

idea is to focus the super-resolution on target objects within the training phase. Such a …


2021  [PDF] mlr.press

Fast and smooth interpolation on wasserstein space

S Chewi, J Clancy, T Le Gouic… - … Intelligence and …, 2021 - proceedings.mlr.press

We propose a new method for smoothly interpolating probability measures using the 

geometry of optimal transport. To that end, we reduce this problem to the classical Euclidean 

setting, allowing us to directly leverage the extensive toolbox of spline interpolation. Unlike …

Cited by 10 Related articles All 7 versions
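
In one dimension the Wasserstein geodesic (displacement, or McCann, interpolation) that such interpolation schemes build on reduces to interpolating quantile functions; a NumPy sketch with synthetic samples, purely for intuition:

import numpy as np

rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, size=1000)
b = rng.exponential(1.0, size=1000)

qs = np.linspace(0.01, 0.99, 99)
qa, qb = np.quantile(a, qs), np.quantile(b, qs)

t = 0.5                             # time along the Wasserstein geodesic
q_mid = (1 - t) * qa + t * qb       # quantile function of the interpolant at time t
print(q_mid[:5])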

[PDF] arxiv.org

Wasserstein F-tests and confidence bands for the Fréchet regression of density response curves

A Petersen, X Liu, AA Divani - The Annals of Statistics, 2021 - projecteuclid.org

Data consisting of samples of probability density functions are increasingly prevalent, 

necessitating the development of methodologies for their analysis that respect the inherent 

nonlinearities associated with densities. In many applications, density curves appear as …

 Cited by 14 Related articles All 5 versions

2021  [PDF] mlr.press

Wasserstein Random Forests and Applications in Heterogeneous Treatment Effects

Q Du, G Biau, F Petit, R Porcher - … Artificial Intelligence and …, 2021 - proceedings.mlr.press

We present new insights into causal inference in the context of Heterogeneous Treatment 

Effects by proposing natural variants of Random Forests to estimate the key conditional 

distributions. To achieve this, we recast Breiman's original splitting criterion in terms of …

Cited by 2 Related articles All 7 versions


2021


[PDF] arxiv.org

Wasserstein perturbations of Markovian transition semigroups

S Fuhrmann, M Kupper, M Nendel - arXiv preprint arXiv:2105.05655, 2021 - arxiv.org

In this paper, we deal with a class of time-homogeneous continuous-time Markov processes

with transition probabilities bearing a nonparametric uncertainty. The uncertainty is modeled

by considering perturbations of the transition probabilities within a proximity in Wasserstein  …

  All 2 versions 


[PDF] arxiv.org

From Smooth Wasserstein Distance to Dual Sobolev Norm: Empirical Approximation and Statistical Applications

S Nietert, Z Goldfeld, K Kato - arXiv preprint arXiv:2101.04039, 2021 - arxiv.org

Statistical distances, ie, discrepancy measures between probability distributions, are 

ubiquitous in probability theory, statistics and machine learning. To combat the curse of 

dimensionality when estimating these distances from data, recent work has proposed …

Cited by 4 Related articles

2021  [PDF] mdpi.com

Unpaired Image Denoising via Wasserstein GAN in Low-Dose CT Image with Multi-Perceptual Loss and Fidelity Loss

Z Yin, K Xia, Z He, J Zhang, S Wang, B Zu - Symmetry, 2021 - mdpi.com

The use of low-dose computed tomography (LDCT) in medical practice can effectively 

reduce the radiation risk of patients, but it may increase noise and artefacts, which can 

compromise diagnostic information. The methods based on deep learning can effectively …

Related articles

Cited by 13 Related articles All 3 versions

2021  [PDF] arxiv.org

Quantitative spectral gap estimate and Wasserstein contraction of simple slice sampling

V Natarovskii, D Rudolf, B Sprungk - The Annals of Applied …, 2021 - projecteuclid.org

We prove Wasserstein contraction of simple slice sampling for approximate sampling wrt 

distributions with log-concave and rotational invariant Lebesgue densities. This yields, in 

particular, an explicit quantitative lower bound of the spectral gap of simple slice sampling …

Related articles All 4 versions  Zbl 07420501

[PDF] arxiv.org

Non-negative matrix and tensor factorisations with a smoothed Wasserstein loss

SY Zhang - arXiv preprint arXiv:2104.01708, 2021 - arxiv.org

Non-negative matrix and tensor factorisations are a classical tool in machine learning and 

data science for finding low-dimensional representations of high-dimensional datasets. In 

applications such as imaging, datasets can often be regarded as distributions in a space …

All 2 versions

<——2021———2021———570——


Enhanced Wasserstein Distributionally Robust OPF With  Dependence Structure and Support Information

Wasserstein distributionally robust optimal power flow problem, by adding dependence structure (correlation) and support information.

YouTube · PSMR UMONS · 

Aug 30, 2021

[PDF] researchgate.net

[PDF] Enhanced Wasserstein Distributionally Robust OPF With Dependence Structure and Support Information

A Arrigo, J Kazempour, Z De Grève… - 14th IEEE …, 2021 - researchgate.net

This paper goes beyond the current state of the art related to Wasserstein distributionally 

robust optimal power flow problems, by adding dependence structure (correlation) and 

support information. In view of the space-time dependencies pertaining to the stochastic …

Cited by 1 Related articles All 6 versions

[PDF] toronto.edu

[PDF] The Wasserstein 1 Distance-Constructing an Optimal Map and Applications to Generative Modelling

T Milne - math.toronto.edu

Recent advances in generative modelling have shown that machine learning algorithms are 

capable of generating high resolution images of fully synthetic scenes which some 

researchers call “dreams” or “hallucinations” of the algorithm. Poetic language aside, one …
Related articles 


The Research of MRI Super-Resolution with WGAN

LI Yuerong, WU Zhongke, W Xuesong… - 北京师范大学学报 (Journal of Beijing Normal University) …, 2021 - bnujournal.com

Magnetic resonance imaging (MRI) is a tomography method, which is widely used in clinical

medical testing and is suitable for the diagnosis of various diseases. However, due to the

immaturity of imaging technology, it is necessary to further enhance the resolution of MRI …

 

 Wasserstein Distance-Based Auto-Encoder Tracking

By: Xu, Long; Wei, Ying; Dong, Chenhe; et al.

NEURAL PROCESSING LETTERS    

Early Access: APR 2021

  View Abstract

Cited by 3 Related articles All 2 versions


SRWGANTV: Image Super-Resolution Through Wasserstein Generative Adversarial Networks with Total Variational Regularization

J Shao, L Chen, Y Wu - 2021 IEEE 13th International …, 2021 - ieeexplore.ieee.org

The study of generative adversarial networks (GAN) has enormously promoted the research

work on single image super-resolution (SISR) problem. SRGAN firstly apply GAN to SISR

reconstruction, which has achieved good results. However, SRGAN sacrifices the fidelity. At …

 

2021


https://www.springerprofessional.de › wasserstein-gener...

Wasserstein Generative Adversarial Networks for Realistic ...

Recently, Convolutional neural networks (CNN) with properly annotated training data and results will obtain the best traffic sign detection (TSD) and.

[CITATION] Wasserstein Generative Adversarial Networks for Realistic Traffic Sign Image Generation

C Dewi, RC Chen, YT Liu - … Thailand, April 7 …, 2021 - Springer International Publishing

2021

 Sampler for the Wasserstein Barycenter | MIT LIDS

https://lids.mit.edu › news-and-events › events › sample...

Apr 2, 2021 — https://mit.zoom.us/j/96227078621?pwd= ... In a nutshell a Wasserstein barycenter is a probability distribution that provides a compelling ... of Gradient flows over Wasserstein space together with convergence guarantees.


2021

Optimal Transport in Machine Learning and Computer Vision ...

https://www.soe.ucsc.edu › events › optimal-transport-...

Feb 22, 2021 — Optimal Transport in Machine Learning and Computer Vision. ‹ Previous ... Location: Via Zoom Link: http

2021

Scaling Wasserstein distances to high dimensions via ...

https://web.stanford.edu › talks › talks › ziv-goldfeld

Feb 12, 2021 — Until further notice, the IT Forum convenes exclusively via Zoom (on Fridays at 1pm PT) due to the ongoing pandemic. To avoid ...

Scaling Wasserstein distances to high dimensions via ...

web.stanford.edu › it-forum › talks › talks › ziv-goldfeld

Feb 12, 2021 — To avoid "Zoom-bombing", we ask attendees to input their email address here ... This talk will present a novel framework of smooth Wasserstein ...


arXiv:2104.06121
  [pdf, other]  math.OC  math.DS  math.FA  math.MG
Weak topology and Opial property in Wasserstein spaces, with applications to Gradient Flows and Proximal Point Algorithms of geodesically convex functionals
Authors: Emanuele Naldi, Giuseppe Savaré
Abstract: In this paper we discuss how to define an appropriate notion of weak topology in the Wasserstein space (P2(H), W2) of Borel probability measures with finite quadratic moment on a separable Hilbert space H. We will show that such a topology inherits many features of the usual weak topology in Hilbert spaces, in particular the weak closedness of geodesically convex closed sets and the…  More
Submitted 13 April, 2021; originally announced April 2021.
Comments: Dedicated to the memory of Claudio Baiocchi, outstanding mathematician and beloved mentor

Related articles All 3 versions 
<——2021———2021———580——


Quantized Gromov-Wasserstein
by Chowdhury, Samir; Miller, David; Needham, Tom
04/2021
The Gromov-Wasserstein (GW) framework adapts ideas from optimal transport to allow for the comparison of probability distributions defined on different metric...
Journal Article  Full Text Online

arXiv:2104.02013  [pdf, other]  cs.LG
Quantized Gromov-Wasserstein
Authors: Samir Chowdhury, David Miller, Tom Needham
Abstract: The Gromov-Wasserstein (GW) framework adapts ideas from optimal transport to allow for the comparison of probability distributions defined on different metric spaces. Scalable computation of GW distances and associated matchings on graphs and point clouds have recently been made possible by state-of-the-art algorithms such as S-GWL and MREC. Each of these algorithmic breakthroughs relies on decomp…  More
Submitted 5 April, 2021; originally announced April 2021.

Book ChapterFull Text Online
 Cited by 8 Related articles All 5 versions

A Mathematical Principle of Deep Learning: Learn the Geodesic Curve in the Wasserstein Space

K Gai, S Zhang - arXiv preprint arXiv:2102.09235, 2021 - arxiv.org

Recent studies revealed the mathematical connection of deep neural network (DNN) and

dynamic system. However, the fundamental principle of DNN has not been fully

characterized with dynamic system in terms of optimization and generalization. To this end …

  All 2 versions 


The measurement of relations on belief functions based on the Kantorovich problem and the Wasserstein metric

AG Bronevich, IN Rozenberg - International Journal of Approximate …, 2021 - Elsevier

In this paper, we show how the Kantorovich problem appears in many constructions in the

theory of belief functions. We demonstrate this on several relations on belief functions such

as inclusion, equality and intersection of belief functions. Using the Kantorovich problem we …

  All 2 versions

<——2021———2021———590——


[PDF] arxiv.org

Classification of atomic environments via the Gromov–Wasserstein distance

S Kawano, JK Mason - Computational Materials Science, 2021 - Elsevier

Interpreting molecular dynamics simulations usually involves automated classification of

local atomic environments to identify regions of interest. Existing approaches are generally

limited to a small number of reference structures and only include limited information about …

  Cited by 2 Related articles All 4 versions
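
A hedged sketch of a Gromov-Wasserstein comparison between two point clouds living in spaces of different dimension, assuming the POT (Python Optimal Transport) package; the calls below follow POT's documented API but exact signatures can differ between versions, and this is not the paper's code:

import numpy as np
import ot

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 3))      # synthetic "environment" A, 3-D coordinates
Y = rng.normal(size=(40, 2))      # synthetic "environment" B, 2-D coordinates

C1, C2 = ot.dist(X, X), ot.dist(Y, Y)     # intra-cloud distance matrices
p, q = ot.unif(len(X)), ot.unif(len(Y))   # uniform weights on the points

gw_cost = ot.gromov.gromov_wasserstein2(C1, C2, p, q, 'square_loss')
print(gw_cost)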


[PDF] arxiv.org

Differential inclusions in Wasserstein spaces: The Cauchy-Lipschitz framework

B Bonnet, H Frankowska - Journal of Differential Equations, 2021 - Elsevier

In this article, we propose a general framework for the study of differential inclusions in the

Wasserstein space of probability measures. Based on earlier geometric insights on the

structure of continuity equations, we define solutions of differential inclusions as absolutely …

  Cited by 2 Related articles All 6 versions

 

 [PDF] arxiv.org

Wasserstein Proximal Algorithms for the Schrödinger Bridge Problem: Density Control with Nonlinear Drift

K Caluya, A Halder - IEEE Transactions on Automatic Control, 2021 - ieeexplore.ieee.org

We study the Schrödinger bridge problem (SBP) with nonlinear prior dynamics. In

control-theoretic language, this is a problem of minimum effort steering of a given joint state

probability density function (PDF) to another over a finite time horizon, subject to a controlled …

  Cited by 4 Related articles All 4 versions



[PDF] arxiv.org

Wasserstein F-tests and confidence bands for the Fréchet regression of density response curves

A Petersen, X Liu, AA Divani - The Annals of Statistics, 2021 - projecteuclid.org

Data consisting of samples of probability density functions are increasingly prevalent,

necessitating the development of methodologies for their analysis that respect the inherent

nonlinearities associated with densities. In many applications, density curves appear as …

  Cited by 3 Related articles All 2 versions



[PDF] arxiv.org

A short proof on the rate of convergence of the empirical measure for the Wasserstein distance

V Divol - arXiv preprint arXiv:2101.08126, 2021 - arxiv.org

We provide a short proof that the Wasserstein distance between the empirical measure of an

n-sample and the estimated measure is of order n^(-1/d), if the measure has a lower and

upper bounded density on the d-dimensional flat torus. Subjects: Statistics Theory (math. ST) …

   All 4 versions 


2021


[PDF] arxiv.org

Tracial smooth functions of non-commuting variables and the free Wasserstein manifold

D Jekel, W Li, D Shlyakhtenko - arXiv preprint arXiv:2101.06572, 2021 - arxiv.org

We formulate a free probabilistic analog of the Wasserstein manifold on $\mathbb {R}^ d

$(the formal Riemannian manifold of smooth probability densities on $\mathbb {R}^ d $),

and we use it to study smooth non-commutative transport of measure. The points of the free …

   All 4 versions 


[PDF] arxiv.org

The ultrametric Gromov-Wasserstein distance

F Mémoli, A Munk, Z Wan, C Weitkamp - arXiv preprint arXiv:2101.05756, 2021 - arxiv.org

In this paper, we investigate compact ultrametric measure spaces which form a subset

$\mathcal {U}^ w $ of the collection of all metric measure spaces $\mathcal {M}^ w $. Similar

as for the ultrametric Gromov-Hausdorff distance on the collection of ultrametric spaces …

  Cited by 2 Related articles 


[PDF] arxiv.org

Predictive density estimation under the Wasserstein loss

T Matsuda, WE Strawderman - Journal of Statistical Planning and Inference, 2021 - Elsevier

We investigate predictive density estimation under the L 2 Wasserstein loss for location

families and location-scale families. We show that plug-in densities form a complete class

and that the Bayesian predictive density is given by the plug-in density with the posterior …

   Related articles All 5 versions


[PDF] arxiv.org

Projected Statistical Methods for Distributional Data on the Real Line with the Wasserstein Metric

M Pegoraro, M Beraha - arXiv preprint arXiv:2101.09039, 2021 - arxiv.org

We present a novel class of projected methods, to perform statistical analysis on a data set

of probability distributions on the real line, with the 2-Wasserstein metric. We focus in

particular on Principal Component Analysis (PCA) and regression. To define these models …

  All 7 versions 


[PDF] arxiv.org

Wasserstein GANs Work Because They Fail (to Approximate the Wasserstein Distance)

J Stanczuk, C Etmann, LM Kreusser… - arXiv preprint arXiv …, 2021 - arxiv.org

Wasserstein GANs are based on the idea of minimising the Wasserstein distance between a

real and a generated distribution. We provide an in-depth mathematical analysis of

differences between the theoretical setup and the reality of training Wasserstein GANs. In …

  All 4 versions 

<——2021———2021———600——



[PDF] copernicus.org

Ensemble Riemannian Data Assimilation over the Wasserstein Space

SK Tamang, A Ebtehaj, PJ Van Leeuwen… - Nonlinear Processes …, 2021 - npg.copernicus.org

In this paper, we present an ensemble data assimilation paradigm over a Riemannian

manifold equipped with the Wasserstein metric. Unlike the Eulerian penalization of error in

the Euclidean space, the Wasserstein metric can capture translation and difference between …

  Related articles All 4 versions 


[PDF] arxiv.org

Berry–Esseen Smoothing Inequality for the Wasserstein Metric on Compact Lie Groups

B Borda - Journal of Fourier Analysis and Applications, 2021 - Springer

We prove a sharp general inequality estimating the distance of two probability measures on

a compact Lie group in the Wasserstein metric in terms of their Fourier transforms. We use a

generalized form of the Wasserstein metric, related by Kantorovich duality to the family of …

  Related articles All 2 versions


[PDF] arxiv.org

Geometry on the Wasserstein space over a compact Riemannian manifold

H Ding, S Fang - arXiv preprint arXiv:2104.00910, 2021 - arxiv.org

We will revisit the intrinsic differential geometry of the Wasserstein space over a Riemannian

manifold, due to a series of papers by Otto, Villani, Lott, Ambrosio, Gigli, Savaré and so on.

Subjects: Mathematical Physics (math-ph); Probability (math. PR) Cite as: arXiv: 2104.00910 …

  All 6 versions 


  

On the computational complexity of finding a sparse Wasserstein barycenter

S Borgwardt, S Patterson - Journal of Combinatorial Optimization, 2021 - Springer

The discrete Wasserstein barycenter problem is a minimum-cost mass transport problem for

a set of probability measures with finite support. In this paper, we show that finding a

barycenter of sparse support is hard, even in dimension 2 and for only 3 measures. We …

  All 4 versions


The isometry group of Wasserstein spaces: the Hilbertian case

GP Gehér, T Titkos, D Virosztek - arXiv preprint arXiv:2102.02037, 2021 - arxiv.org

Motivated by Kloeckner's result on the isometry group of the quadratic Wasserstein space

$\mathcal {W} _2\left (\mathbb {R}^ n\right) $, we describe the isometry group $\mathrm

{Isom}\left (\mathcal {W} _p (E)\right) $ for all parameters $0< p<\infty $ and for all separable …

  All 2 versions 

2021


[PDF] arxiv.org

On the Convexity of Discrete Time Covariance Steering in Stochastic Linear Systems with Wasserstein Terminal Cost

IM Balci, A Halder, E Bakolas - arXiv preprint arXiv:2103.13579, 2021 - arxiv.org

In this work, we analyze the properties of the solution to the covariance steering problem for

discrete time Gaussian linear systems with a squared Wasserstein distance terminal cost. In

our previous work, we have shown that by utilizing the state feedback control policy …

  All 2 versions 


Convergence rate in Wasserstein distance and semiclassical limit for the defocusing logarithmic Schrödinger equation

G Ferriere - Analysis & PDE, 2021 - msp.org

We consider the dispersive logarithmic Schrödinger equation in a semiclassical scaling. We

extend the results of Carles and Gallagher (Duke Math. J. 167: 9 (2018), 1761–1801) about

the large-time behavior of the solution (dispersion faster than usual with an additional …

Cited by 9 Related articles All 7 versions

[PDF] toronto.edu

[PDF] The Wasserstein 1 Distance-Constructing an Optimal Map and Applications to Generative Modelling

T Milne - math.toronto.edu

Recent advances in generative modelling have shown that machine learning algorithms are

capable of generating high resolution images of fully synthetic scenes which some

researchers call “dreams” or “hallucinations” of the algorithm. Poetic language aside, one …

  

[PDF] julienmalka.me

[PDF] Gromov-Wasserstein Optimal Transport for Heterogeneous Domain Adaptation

J Malka, R Flamary, N Courty - julienmalka.me

Optimal Transport distances have shown great potential these last year in tackling the

homogeneous domain adaptation problem. This works present some adaptations of the

state of the art homogeneous domain adaptations methods to work on heterogeneous …

  Related articles 

Small Object Detection from Remote Sensing Images with the Help of Object-Focused Super-Resolution Using Wasserstein GANs

L Courtrai, MT Pham, C Friguet… - IGARSS 2020-2020 IEEE … - ieeexplore.ieee.org

In this paper, we investigate and improve the use of a super-resolution approach to benefit

the detection of small objects from aerial and satellite remote sensing images. The main

idea is to focus the super-resolution on target objects within the training phase. Such a …

<——2021———2021———610——


Closed‐form expressions for maximum mean discrepancy with applications to Wasserstein auto‐encoders

RM Rustamov - Stat, 2021 - Wiley Online Library

The maximum mean discrepancy (MMD) has found numerous applications in statistics and

machine learning, among which is its use as a penalty in the Wasserstein auto‐encoder

(WAE). In this paper, we compute closed‐form expressions for estimating the Gaussian …

  Cited by 5 Related articles All 3 versions
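
For context, the MMD penalty mentioned above has a simple plug-in estimator; a NumPy sketch of the squared Gaussian-kernel MMD between prior samples and encoded samples (synthetic stand-ins, not the closed-form expressions derived in the paper):

import numpy as np

def mmd2_gaussian(X, Y, sigma=1.0):
    def k(A, B):
        sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
        return np.exp(-sq / (2.0 * sigma ** 2))
    return k(X, X).mean() + k(Y, Y).mean() - 2.0 * k(X, Y).mean()

rng = np.random.default_rng(0)
z_prior = rng.normal(size=(256, 8))              # samples from the latent prior
z_enc = rng.normal(0.3, 1.0, size=(256, 8))      # stand-in for encoder outputs
print(mmd2_gaussian(z_prior, z_enc))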


[PDF] arxiv.org

Computationally Efficient Wasserstein Loss for Structured Labels

A Toyokuni, S Yokoi, H Kashima, M Yamada - arXiv preprint arXiv …, 2021 - arxiv.org

The problem of estimating the probability distribution of labels has been widely studied as a

label distribution learning (LDL) problem, whose applications include age estimation,

emotion analysis, and semantic segmentation. We propose a tree-Wasserstein distance …

  All 2 versions 
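
The tree-Wasserstein distance behind such structured-label losses has a closed form: the sum, over the edges of the label tree, of the edge length times the absolute difference of the subtree masses. A toy pure-Python sketch on a hypothetical five-node hierarchy (not the paper's implementation):

# Toy hierarchy: root 0 with children 1 and 2; node 1 with leaf children 3 and 4.
parent = {1: 0, 2: 0, 3: 1, 4: 1}
edge_len = {1: 1.0, 2: 1.0, 3: 0.5, 4: 0.5}     # length of the edge (node -> its parent)

def subtree_mass(mass, node):
    children = [c for c, p in parent.items() if p == node]
    return mass.get(node, 0.0) + sum(subtree_mass(mass, c) for c in children)

def tree_wasserstein(mu, nu):
    return sum(w * abs(subtree_mass(mu, v) - subtree_mass(nu, v))
               for v, w in edge_len.items())

mu = {3: 0.7, 4: 0.3}                 # predicted label distribution
nu = {3: 0.2, 4: 0.3, 2: 0.5}         # target label distribution
print(tree_wasserstein(mu, nu))       # 1.25 for this toy example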

 

Considering anatomical prior information for low-dose CT image enhancement using attribute-augmented Wasserstein generative adversarial networks

Z Huang, X Liu, R Wang, J Chen, P Lu, Q Zhang… - Neurocomputing, 2021 - Elsevier

Currently, many deep learning (DL)-based low-dose CT image postprocessing technologies

fail to consider the anatomical differences in training data among different human body sites,

such as the cranium, lung and pelvis. In addition, we can observe evident anatomical …

  Related articles


[PDF] arxiv.org

A new perspective on Wasserstein distances for kinetic problems

M Iacobelli - arXiv preprint arXiv:2104.00963, 2021 - arxiv.org

We introduce a new class of Wasserstein-type distances specifically designed to tackle

questions concerning stability and convergence to equilibria for kinetic equations. Thanks to

these new distances, we improve some classical estimates by Loeper and Dobrushin on …

  All 2 versions 


[PDF] arxiv.org

Approximation for Probability Distributions by Wasserstein GAN

Y Gao, MK Ng - arXiv preprint arXiv:2103.10060, 2021 - arxiv.org

In this paper, we show that the approximation for distributions by Wasserstein GAN depends

on both the width/depth (capacity) of generators and discriminators, as well as the number of

samples in training. A quantified generalization bound is developed for Wasserstein  …

  All 3 versions 


2021


Adversarial training with Wasserstein distance for learning cross-lingual word embeddings

Y Li, Y Zhang, K Yu, X Hu - Applied Intelligence, 2021 - Springer

Recent studies have managed to learn cross-lingual word embeddings in a completely

unsupervised manner through generative adversarial networks (GANs). These GANs-based

methods enable the alignment of two monolingual embedding spaces approximately, but …

 

Data augmentation for rolling bearing fault diagnosis using an enhanced few-shot Wasserstein auto-encoder with meta-learning

Z Pei, H Jiang, X Li, J Zhang, S Liu - Measurement Science and …, 2021 - iopscience.iop.org

Despite the advance of intelligent fault diagnosis for rolling bearings, in industries, data-

driven methods still suffer from data acquisition and imbalance. We propose an enhanced

few-shot Wasserstein auto-encoder (fs-WAE) to reverse the negative effect of imbalance …

 

[HTML] springer.com

[HTML] EEG data augmentation for emotion recognition with a multiple generator conditional Wasserstein GAN

A Zhang, L Su, Y Zhang, Y Fu, L Wu, S Liang - Complex & Intelligent …, 2021 - Springer

EEG-based emotion recognition has attracted substantial attention from researchers due to

its extensive application prospects, and substantial progress has been made in feature

extraction and classification modelling from EEG data. However, insufficient high-quality …

 

[PDF] arxiv.org

2D Wasserstein Loss for Robust Facial Landmark Detection

YAN Yongzhe, S Duffner, P Phutane, A Berthelier… - Pattern Recognition, 2021 - Elsevier

The recent performance of facial landmark detection has been significantly improved by

using deep Convolutional Neural Networks (CNNs), especially the Heatmap Regression

Models (HRMs). Although their performance on common benchmark datasets has reached a …

  Related articles All 3 versions


[PDF] arxiv.org

Projected Wasserstein gradient descent for high-dimensional Bayesian inference

Y Wang, P Chen, W Li - arXiv preprint arXiv:2102.06350, 2021 - arxiv.org

We propose a projected Wasserstein gradient descent method (pWGD) for high-dimensional

Bayesian inference problems. The underlying density function of a particle system of WGD is

approximated by kernel density estimation (KDE), which faces the long-standing curse of …

  All 4 versions 

<——2021———2021———620——



Necessary optimality conditions for optimal control problems in Wasserstein spaces

B Bonnet, H Frankowska - arXiv preprint arXiv:2101.10668, 2021 - arxiv.org

In this article, we derive first-order necessary optimality conditions for a constrained optimal

control problem formulated in the Wasserstein space of probability measures. To this end,

we introduce a new notion of localised metric subdifferential for compactly supported …

  All 2 versions 


An inexact PAM method for computing Wasserstein barycenter with unknown supports

Y Qian, S Pan - Computational and Applied Mathematics, 2021 - Springer

Wasserstein barycenter is the centroid of a collection of discrete probability distributions

which minimizes the average of the \(\ell_2\)-Wasserstein distance. This paper focuses on

the computation of Wasserstein barycenters under the case where the support points are …

 

[PDF] arxiv.org

Second-order Conic Programming Approach for Wasserstein Distributionally Robust Two-stage Linear Programs

Z Wang, K You, S Song, Y Zhang - IEEE Transactions on …, 2021 - ieeexplore.ieee.org

This article proposes a second-order conic programming (SOCP) approach to solve

distributionally robust two-stage linear programs over 1-Wasserstein balls. We start from the

case with distribution uncertainty only in the objective function and then explore the case …

  Related articles All 3 versions


[PDF] arxiv.org

A sharp Wasserstein uncertainty principle for Laplace eigenfunctions

M Mukherjee - arXiv preprint arXiv:2103.11633, 2021 - arxiv.org

Consider an eigenfunction of the Laplace-Beltrami operator on a smooth compact

Riemannian surface. We prove a conjectured lower bound on the Wasserstein distance

between the measures defined by the positive and negative parts of the eigenfunction …

  All 2 versions 


Reconstruction Method for Missing Measurement Data Based on Wasserstein Generative Adversarial Network

C Zhang, H Chen, J He, H Yang - Journal of Advanced …, 2021 - jstage.jst.go.jp

Focusing on the issue of missing measurement data caused by complex and changeable

working conditions during the operation of high-speed trains, in this paper, a framework for

the reconstruction of missing measurement data based on a generative adversarial network …

  All 3 versions


2021


[PDF] arxiv.org

Distributionally Robust Chance-Constrained Programmings for Non-Linear Uncertainties with Wasserstein Distance

Y Gu, Y Wang - arXiv preprint arXiv:2103.04790, 2021 - arxiv.org

In this paper, we study a distributionally robust chance-constrained programming (DRCCP)

under a Wasserstein ambiguity set, where the uncertain constraints are required to be

jointly satisfied with a probability of at least a given risk level for all the probability …

  All 2 versions 


[PDF] arxiv.org

A data-driven event generator for Hadron Colliders using Wasserstein Generative Adversarial Network

S Choi, JH Lim - Journal of the Korean Physical Society, 2021 - Springer

Abstract Highly reliable Monte-Carlo event generators and detector simulation programs are

important for the precision measurement in the high energy physics. Huge amounts of

computing resources are required to produce a sufficient number of simulated events …

  All 4 versions


[PDF] iop.org

A deep learning-based approach for direct PET attenuation correction using Wasserstein generative adversarial network

Y Li, W Wu - Journal of Physics: Conference Series, 2021 - iopscience.iop.org

Positron emission tomography (PET) in some clinical assistant diagnose demands

attenuation correction (AC) and scatter correction (SC) to obtain high-quality imaging,

leading to gaining more precise metabolic information in tissue or organs of patient …

 

[PDF] openreview.net

[PDF] Functional Priors for Bayesian Neural Networks through Wasserstein Distance Minimization to Gaussian Processes

BH Tran, D Milios, S Rossi, M Filippone - openreview.net

The Bayesian treatment of neural networks dictates that a prior distribution is considered

over the weight and bias parameters of the network. The non-linear nature of the model

implies that any distribution of the parameters has an unpredictable effect on the distribution …

  Related articles 


Wasserstein Generative Adversarial Networks for Realistic ...

https://www.springerprofessional.de › wasserstein-gener...

Recently, Convolutional neural networks (CNN) with properly annotated training ... Wasserstein Generative Adversarial Networks for Realistic Traffic Sign Image ...

[CITATION] Wasserstein Generative Adversarial Networks for Realistic Traffic Sign Image Generation

C Dewi, RC Chen, YT Liu - … Thailand, April 7 …, 2021 - Springer International Publishing

<——2021———2021———630——


Multilevel optimal transport: a fast approximation of Wasserstein-1 distances

J Liu, W Yin, W Li, YT Chow - SIAM Journal on Scientific Computing, 2021 - SIAM

We propose fast algorithm for the calculation of the Wasserstein-1 distance, which is a

particular type of optimal transport distance with transport cost homogeneous of degree one.

Our algorithm is built on multilevel primal-dual algorithms. Several numerical examples and …

  5 Related articles All 6 versions


[PDF] jmlr.org

[PDF] A fast globally linearly convergent algorithm for the computation of Wasserstein barycenters

L Yang, J Li, D Sun, KC Toh - Journal of Machine Learning Research, 2021 - jmlr.org

We consider the problem of computing a Wasserstein barycenter for set of discrete

probability distributions with finite supports, which finds many applications in areas such as

statistics, machine learning and image processing. When the support points of the …

  Cited by 8 Related articles All 6 versions 
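
As a sanity check for general barycenter solvers such as those above: for measures on the real line the 2-Wasserstein barycenter is obtained exactly by averaging quantile functions. A NumPy sketch with synthetic samples:

import numpy as np

rng = np.random.default_rng(0)
samples = [rng.normal(m, s, size=2000) for m, s in [(-2.0, 0.5), (0.0, 1.0), (3.0, 0.8)]]
weights = [0.2, 0.5, 0.3]

qs = np.linspace(0.005, 0.995, 199)
quantiles = np.array([np.quantile(s, qs) for s in samples])
bary_quantiles = np.average(quantiles, axis=0, weights=weights)  # quantiles of the barycenter
print(bary_quantiles[::50])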


[PDF] arxiv.org

A short proof on the rate of convergence of the empirical measure for the Wasserstein distance

V Divol - arXiv preprint arXiv:2101.08126, 2021 - arxiv.org

We provide a short proof that the Wasserstein distance between the empirical measure of an

n-sample and the estimated measure is of order n^(-1/d), if the measure has a lower and

upper bounded density on the d-dimensional flat torus. Subjects: Statistics Theory (math. ST) …

   All 4 versions 


[PDF] arxiv.org

A new perspective on Wasserstein distances for kinetic problems

M Iacobelli - arXiv preprint arXiv:2104.00963, 2021 - arxiv.org

We introduce a new class of Wasserstein-type distances specifically designed to tackle

questions concerning stability and convergence to equilibria for kinetic equations. Thanks to

these new distances, we improve some classical estimates by Loeper and Dobrushin on …

  All 2 versions 


[PDF] arxiv.org

A Mathematical Principle of Deep Learning: Learn the Geodesic Curve in the Wasserstein Space

K Gai, S Zhang - arXiv preprint arXiv:2102.09235, 2021 - arxiv.org

Recent studies revealed the mathematical connection of deep neural network (DNN) and

dynamic system. However, the fundamental principle of DNN has not been fully

characterized with dynamic system in terms of optimization and generalization. To this end …

  All 2 versions 


2021


[HTML] springer.com

[HTML] EEG data augmentation for emotion recognition with a multiple generator conditional Wasserstein GAN

A Zhang, L Su, Y Zhang, Y Fu, L Wu, S Liang - Complex & Intelligent …, 2021 - Springer

EEG-based emotion recognition has attracted substantial attention from researchers due to

its extensive application prospects, and substantial progress has been made in feature

extraction and classification modelling from EEG data. However, insufficient high-quality …

 

On clustering uncertain and structured data with Wasserstein barycenters and a geodesic criterion for the number of clusters

GI Papayiannis, GN Domazakis… - Journal of Statistical …, 2021 - Taylor & Francis

Clustering schemes for uncertain and structured data are considered relying on the notion of

Wasserstein barycenters, accompanied by appropriate clustering indices based on the

intrinsic geometry of the Wasserstein space. Such type of clustering approaches are highly …

 

[PDF] arxiv.org

Geometry on the Wasserstein space over a compact Riemannian manifold

H Ding, S Fang - arXiv preprint arXiv:2104.00910, 2021 - arxiv.org

We will revisit the intrinsic differential geometry of the Wasserstein space over a Riemannian

manifold, due to a series of papers by Otto, Villani, Lott, Ambrosio, Gigli, Savaré and so on.

Subjects: Mathematical Physics (math-ph); Probability (math. PR) Cite as: arXiv: 2104.00910 …

  All 7 versions 


On the computational complexity of finding a sparse Wasserstein barycenter

S Borgwardt, S Patterson - Journal of Combinatorial Optimization, 2021 - Springer

The discrete Wasserstein barycenter problem is a minimum-cost mass transport problem for

a set of probability measures with finite support. In this paper, we show that finding a

barycenter of sparse support is hard, even in dimension 2 and for only 3 measures. We …

  All 4 versions


[PDF] arxiv.org

A sharp Wasserstein uncertainty principle for Laplace eigenfunctions

M Mukherjee - arXiv preprint arXiv:2103.11633, 2021 - arxiv.org

Consider an eigenfunction of the Laplace-Beltrami operator on a smooth compact

Riemannian surface. We prove a conjectured lower bound on the Wasserstein distance

between the measures defined by the positive and negative parts of the eigenfunction …

  All 2 versions 

<——2021———2021———640—— 



Improved complexity bounds in wasserstein barycenter problem

D Dvinskikh, D Tiapkin - International Conference on …, 2021 - proceedings.mlr.press

In this paper, we focus on computational aspects of the Wasserstein barycenter problem. We

propose two algorithms to compute Wasserstein barycenters of $ m $ discrete measures of

size $n$ with accuracy $\varepsilon$. The first algorithm, based on mirror prox with a specific norm …

  Cited by 3 Related articles All 2 versions 


[PDF] arxiv.org

Wasserstein distance to independence models

TÖ Çelik, A Jamneshan, G Montúfar, B Sturmfels… - Journal of Symbolic …, 2021 - Elsevier

An independence model for discrete random variables is a Segre-Veronese variety in a

probability simplex. Any metric on the set of joint states of the random variables induces a

Wasserstein metric on the probability simplex. The unit ball of this polyhedral norm is dual to  …

  Cited by 2 Related articles All 3 versions

Wasserstein Distance to Independence Models

T Özlüm Çelik, A Jamneshan, G Montúfar… - arXiv e …, 2020 - ui.adsabs.harvard.edu

An independence model for discrete random variables is a Segre-Veronese variety in a

probability simplex. Any metric on the set of joint states of the random variables induces a

Wasserstein metric on the probability simplex. The unit ball of this polyhedral norm is dual to  …

 




[PDF] arxiv.org

Precise limit in Wasserstein distance for conditional empirical measures of Dirichlet diffusion processes

FY Wang - Journal of Functional Analysis, 2021 - Elsevier

Let M be a d-dimensional connected compact Riemannian manifold with boundary ∂M, let

V ∈ C^2(M) such that μ(dx) := e^{V(x)} dx is a probability measure, and let X_t be the diffusion

process generated by L := Δ + ∇V with τ := inf{t ≥ 0 : X_t ∈ ∂M}. Consider the conditional …

  Cited by 3 Related articles All 3 versions


[PDF] arxiv.org

Differential inclusions in Wasserstein spaces: The Cauchy-Lipschitz framework

B BonnetH Frankowska - Journal of Differential Equations, 2021 - Elsevier

In this article, we propose a general framework for the study of differential inclusions in the

Wasserstein space of probability measures. Based on earlier geometric insights on the

structure of continuity equations, we define solutions of differential inclusions as absolutely …

  Cited by 2 Related articles All 6 versions


[PDF] jmlr.org

[PDF] Wasserstein barycenters can be computed in polynomial time in fixed dimension

JM Altschuler, E Boix-Adsera - Journal of Machine Learning Research, 2021 - jmlr.org

Computing Wasserstein barycenters is a fundamental geometric problem with widespread

applications in machine learning, statistics, and computer graphics. However, it is unknown

whether Wasserstein barycenters can be computed in polynomial time, either exactly or to  …

  All 14 versions 


2021


[PDF] arxiv.org

Rate of convergence for particles approximation of PDEs in Wasserstein space

M Germain, H Pham, X Warin - arXiv preprint arXiv:2103.00837, 2021 - arxiv.org

We prove a rate of convergence of order 1/N for the N-particle approximation of a second-

order partial differential equation in the space of probability measures, like the Master

equation or Bellman equation of mean-field control problem under common noise. The proof …

   All 16 versions 


[PDF] arxiv.org

Closed‐form expressions for maximum mean discrepancy with applications to Wasserstein auto‐encoders

RM Rustamov - Stat, 2021 - Wiley Online Library

The maximum mean discrepancy (MMD) has found numerous applications in statistics and

machine learning, among which is its use as a penalty in the Wasserstein auto‐encoder

(WAE). In this paper, we compute closed‐form expressions for estimating the Gaussian …

  Cited by 5 Related articles All 3 versions


[PDF] arxiv.org

Tracial smooth functions of non-commuting variables and the free Wasserstein manifold

D Jekel, W Li, D Shlyakhtenko - arXiv preprint arXiv:2101.06572, 2021 - arxiv.org

We formulate a free probabilistic analog of the Wasserstein manifold on $\mathbb {R}^ d

$(the formal Riemannian manifold of smooth probability densities on $\mathbb {R}^ d $),

and we use it to study smooth non-commutative transport of measure. The points of the free …

   All 4 versions 


[PDF] mlr.press

Wasserstein Random Forests and Applications in Heterogeneous Treatment Effects

Q Du, G Biau, F Petit, R Porcher - … Conference on Artificial …, 2021 - proceedings.mlr.press

We present new insights into causal inference in the context of Heterogeneous Treatment

Effects by proposing natural variants of Random Forests to estimate the key conditional

distributions. To achieve this, we recast Breiman's original splitting criterion in terms of …

  Related articles All 4 versions 


[PDF] arxiv.org

Learning to Generate Wasserstein Barycenters

J Lacombe, J Digne, N Courty, N Bonneel - arXiv preprint arXiv …, 2021 - arxiv.org

Optimal transport is a notoriously difficult problem to solve numerically, with current

approaches often remaining intractable for very large scale applications such as those

encountered in machine learning. Wasserstein barycenters--the problem of finding …

  All 2 versions 

<——2021———2021———650—— 


Exponential convergence in entropy and Wasserstein for McKean–Vlasov SDEs

P Ren, FY Wang - Nonlinear Analysis, 2021 - Elsevier

The following type of exponential convergence is proved for (non-degenerate or

degenerate) McKean–Vlasov SDEs: W 2 (μ t, μ∞) 2+ Ent (μ t| μ∞)≤ ce− λ t min {W 2 (μ 0,

μ∞) 2, Ent (μ 0| μ∞)}, t≥ 1, where c, λ> 0 are constants, μ t is the distribution of the solution …

  Related articles All 2 versions


[PDF] tandfonline.com

Full View

Wasserstein distance-based probabilistic linguistic TODIM method with application to the evaluation of sustainable rural tourism potential

S Zhang, Z Wu, Z Ma, X Liu, J Wu - Economic Research …, 2021 - Taylor & Francis

The evaluation of sustainable rural tourism potential is a key work in sustainable rural

tourism development. Due to the complexity of the rural tourism development situation and

the limited cognition of people, most of the assessment problems for sustainable rural …

  All 2 versions


[PDF] arxiv.org

Convergence in Wasserstein distance for empirical measures of semilinear SPDEs

FY Wang - arXiv preprint arXiv:2102.00361, 2021 - arxiv.org

The convergence rate in Wasserstein distance is estimated for the empirical measures of

symmetric semilinear SPDEs. Unlike in the finite-dimensional case that the convergence is

of algebraic order in time, in the present situation the convergence is of log order with a …

   All 2 versions 


[HTML] tandfonline.com

Full View

Dissimilarity measure of local structure in inorganic crystals using Wasserstein distance to search for novel phosphors

S Takemura, T Takeda, T Nakanishi… - … and Technology of …, 2021 - Taylor & Francis

To efficiently search for novel phosphors, we propose a dissimilarity measure of local

structure using the Wasserstein distance. This simple and versatile method provides the

quantitative dissimilarity of a local structure around a center ion. To calculate the …

  All 2 versions


[PDF] arxiv.org

A Mathematical Principle of Deep Learning: Learn the Geodesic Curve in the Wasserstein Space

K Gai, S Zhang - arXiv preprint arXiv:2102.09235, 2021 - arxiv.org

Recent studies revealed the mathematical connection of deep neural network (DNN) and

dynamic system. However, the fundamental principle of DNN has not been fully

characterized with dynamic system in terms of optimization and generalization. To this end …

  All 2 versions 


2021


[PDF] mlr.press

FlexAE: Flexibly learning latent priors for wasserstein auto-encoders

AK Mondal, H Asnani, P Singla… - Uncertainty in Artificial …, 2021 - proceedings.mlr.press

… (KLD), Jensen–Shannon divergence (JSD), Wasserstein Distance and so on. In this work,

we propose to use Wasserstein distance and utilize the principle laid in [Arjovsky et al., 2017, …

Cited by 2 Related articles All 5 versions 

[PDF] arxiv.org

Continual Learning of Generative Models with Limited Data: From Wasserstein-1 Barycenter to Adaptive Coalescence

M Dedeoglu, S Lin, Z Zhang, J Zhang - arXiv preprint arXiv:2101.09225, 2021 - arxiv.org

Learning generative models is challenging for a network edge node with limited data and

computing power. Since tasks in similar environments share model similarity, it is plausible

to leverage pre-trained generative models from the cloud or other edge nodes. Appealing to  …

  All 2 versions 


[PDF] arxiv.org

From Smooth Wasserstein Distance to Dual Sobolev Norm: Empirical Approximation and Statistical Applications

S Nietert, Z Goldfeld, K Kato - arXiv preprint arXiv:2101.04039, 2021 - arxiv.org

Statistical distances, ie, discrepancy measures between probability distributions, are

ubiquitous in probability theory, statistics and machine learning. To combat the curse of

dimensionality when estimating these distances from data, recent work has proposed …

  Related articles All 2 versions 


[PDF] arxiv.org

Wasserstein statistics in one-dimensional location scale models

S Amari, T Matsuda - Annals of the Institute of Statistical Mathematics, 2021 - Springer

Wasserstein geometry and information geometry are two important structures to be

introduced in a manifold of probability distributions. Wasserstein geometry is defined by

using the transportation cost between two distributions, so it reflects the metric of the base …

   Related articles All 2 versions
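
In the one-dimensional location-scale setting of the entry above, the 2-Wasserstein distance between two Gaussians has an elementary closed form; a tiny Python sketch:

import math

def w2_gaussian_1d(m1, s1, m2, s2):
    # W2 between N(m1, s1^2) and N(m2, s2^2) on the real line
    return math.sqrt((m1 - m2) ** 2 + (s1 - s2) ** 2)

print(w2_gaussian_1d(0.0, 1.0, 1.0, 2.0))   # sqrt(2), approximately 1.414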

[PDF] arxiv.org

Geometrical aspects of entropy production in stochastic thermodynamics based on Wasserstein distance

M Nakazato, S Ito - arXiv preprint arXiv:2103.00503, 2021 - arxiv.org

We study a relationship between optimal transport theory and stochastic thermodynamics for

the Fokker-Planck equation. We show that the entropy production is proportional to the

action measured by the path length of the $ L^ 2$-Wasserstein distance, which is a measure …

  All 2 versions 

<——2021———2021———660—— 

Wasserstein GANs Work Because They Fail (to Approximate the Wasserstein Distance)

J Stanczuk, C Etmann, LM Kreusser… - arXiv preprint arXiv …, 2021 - arxiv.org

Wasserstein GANs are based on the idea of minimising the Wasserstein distance between a

real and a generated distribution. We provide an in-depth mathematical analysis of

differences between the theoretical setup and the reality of training Wasserstein GANs. In  …

  All 4 versions 


[PDF] mdpi.com

Unpaired Image Denoising via Wasserstein GAN in Low-Dose CT Image with Multi-Perceptual Loss and Fidelity Loss

Z Yin, K Xia, Z He, J Zhang, S Wang, B Zu - Symmetry, 2021 - mdpi.com

The use of low-dose computed tomography (LDCT) in medical practice can effectively

reduce the radiation risk of patients, but it may increase noise and artefacts, which can

compromise diagnostic information. The methods based on deep learning can effectively …

  Related articles 

Unpaired Image Denoising via Wasserstein GAN in Low-Dose CT Image with Multi-Perceptual Loss and Fidelity Loss. Symmetry 2021, 13, 126

Z Yin, K Xia, Z He, J Zhang, S Wang, B Zu - 2021 - search.proquest.com

The use of low-dose computed tomography (LDCT) in medical practice can effectively

reduce the radiation risk of patients, but it may increase noise and artefacts, which can

compromise diagnostic information. The methods based on deep learning can effectively …



[PDF] arxiv.org

Distributional robustness in minimax linear quadratic control with Wasserstein distance

K Kim, I Yang - arXiv preprint arXiv:2102.12715, 2021 - arxiv.org

To address the issue of inaccurate distributions in practical stochastic systems, a minimax

linear-quadratic control method is proposed using the Wasserstein metric. Our method aims

to construct a control policy that is robust against errors in an empirical distribution of …

  All 2 versions 


[PDF] arxiv.org

Necessary Optimality Conditions for Optimal Control Problems in Wasserstein Spaces

B Bonnet, H Frankowska - arXiv preprint arXiv:2101.10668, 2021 - arxiv.org

In this article, we derive first-order necessary optimality conditions for a constrained optimal

control problem formulated in the Wasserstein space of probability measures. To this end,

we introduce a new notion of localised metric subdifferential for compactly supported …

  All 2 versions 


[PDF] arxiv.org

On Number of Particles in Coalescing-Fragmentating Wasserstein Dynamics

V Konarovskyi - arXiv preprint arXiv:2102.10943, 2021 - arxiv.org

Because of the sticky-reflected interaction in coalescing-fragmentating Wasserstein

dynamics, the model always consists of a finite number of distinct particles for almost all

times. We show that the interacting particle system must admit an infinite number of distinct …

  All 4 versions 


2021


[PDF] arxiv.org

Convergence and finite sample approximations of entropic regularized Wasserstein distances in Gaussian and RKHS settings

MH Quang - arXiv preprint arXiv:2101.01429, 2021 - arxiv.org

This work studies the convergence and finite sample approximations of entropic regularized

Wasserstein distances in the Hilbert space setting. Our first main result is that for Gaussian

measures on an infinite-dimensional Hilbert space, convergence in the 2-Sinkhorn …

  Related articles All 3 versions 


[HTML] hindawi.com

[HTML] Wasserstein Metric-Based Location Spoofing Attack Detection in WiFi Positioning Systems

Y Tian, N Zheng, X Chen, L Gao - Security and Communication …, 2021 - hindawi.com

WiFi positioning systems (WPS) have been introduced as parts of 5G location services

(LCS) to provide fast positioning results of user devices in urban areas. However, they are

prominently threatened by location spoofing attacks. To end this, we present a Wasserstein  …

  All 2 versions 


[PDF] arxiv.org

On the Convexity of Discrete Time Covariance Steering in Stochastic Linear Systems with Wasserstein Terminal Cost

IM Balci, A Halder, E Bakolas - arXiv preprint arXiv:2103.13579, 2021 - arxiv.org

In this work, we analyze the properties of the solution to the covariance steering problem for

discrete time Gaussian linear systems with a squared Wasserstein distance terminal cost. In

our previous work, we have shown that by utilizing the state feedback control policy …

  All 2 versions 


[PDF] aaai.org

[PDF] Towards Generalized Implementation of Wasserstein Distance in GANs

M Xu, G Lu, W Zhang, Y Yu - 2021 - aaai.org

Abstract Wasserstein GANs (WGANs), built upon the Kantorovich-Rubinstein (KR) duality of

Wasserstein distance, is one of the most theoretically sound GAN models. However, in

practice it does not always outperform other variants of GANs. This is mostly due to the …

  All 2 versions 


[PDF] arxiv.org

Non-negative matrix and tensor factorisations with a smoothed Wasserstein loss

SY Zhang - arXiv preprint arXiv:2104.01708, 2021 - arxiv.org

Non-negative matrix and tensor factorisations are a classical tool in machine learning and

data science for finding low-dimensional representations of high-dimensional datasets. In

applications such as imaging, datasets can often be regarded as distributions in a space …

  All 3 versions 

<——2021———2021———670——





[PDF] arxiv.org

Geometry on the Wasserstein space over a compact Riemannian manifold

H Ding, S Fang - arXiv preprint arXiv:2104.00910, 2021 - arxiv.org

For the sake of simplicity, we will consider in this paper a connected compact Riemannian manifold

M of dimension m. We denote by d_M the Riemannian distance and dx the Riemannian measure

on M such that ∫_M dx = 1. Since the diameter of M is finite, any probability measure µ on M is …

  Related articles All 9 versions 


[PDF] arxiv.org

A data-driven event generator for Hadron Colliders using Wasserstein Generative Adversarial Network

S Choi, JH Lim - Journal of the Korean Physical Society, 2021 - Springer

Abstract Highly reliable Monte-Carlo event generators and detector simulation programs are

important for the precision measurement in the high energy physics. Huge amounts of

computing resources are required to produce sufficient number of simulated events …

  All 4 versions

[PDF] iop.org

A deep learning-based approach for direct PET attenuation correction using Wasserstein generative adversarial network

Y Li, W Wu - Journal of Physics: Conference Series, 2021 - iopscience.iop.org

Positron emission tomography (PET) in some clinical assistant diagnose demands

attenuation correction (AC) and scatter correction (SC) to obtain high-quality imaging,

leading to gaining more precise metabolic information in tissue or organs of patient …

Related articles All 3 versions

Weak topology and Opial property in Wasserstein spaces, with applications to Gradient Flows and Proximal Point Algorithms of geodesically convex functionals

E Naldi, G Savaré - arXiv preprint arXiv:2104.06121, 2021 - arxiv.org

In this paper we discuss how to define an appropriate notion of weak topology in the

Wasserstein space $(\mathcal{P}_2(H), W_2)$ of Borel probability measures with finite

quadratic moment on a separable Hilbert space $ H $. We will show that such a topology …

  All 2 versions 


2021


[PDF] arxiv.org

Distributionally Robust Chance-Constrained Programmings for Non-Linear Uncertainties with Wasserstein Distance

Y Gu, Y Wang - arXiv preprint arXiv:2103.04790, 2021 - arxiv.org

In this paper, we study a distributionally robust chance-constrained programming (DRCCP) under a Wasserstein ambiguity set, where the uncertain constraints are required to be

jointly satisfied with a probability of at least a given risk level for all the probability …

  All 2 versions 


[PDF] arxiv.org

Data-Driven Distributionally Robust Surgery Planning in Flexible Operating Rooms Over a Wasserstein Ambiguity

KS Shehadeh - arXiv preprint arXiv:2103.15221, 2021 - arxiv.org

We study elective surgery planning in flexible operating rooms where emergency patients

are accommodated in the existing elective surgery schedule. Probability distributions of

surgery durations are unknown, and only a small set of historical realizations is available. To  …

  All 2 versions 


Convergence rate in Wasserstein distance and semiclassical limit for the defocusing logarithmic Schrödinger equation

G Ferriere - Analysis & PDE, 2021 - msp.org

We consider the dispersive logarithmic Schrödinger equation in a semiclassical scaling. We

extend the results of Carles and Gallagher (Duke Math. J. 167: 9 (2018), 1761–1801) about

the large-time behavior of the solution (dispersion faster than usual with an additional …

  All 2 versions


[PDF] toronto.edu

[PDF] The Wasserstein 1 Distance-Constructing an Optimal Map and Applications to Generative Modelling

T Milne - math.toronto.edu

Recent advances in generative modelling have shown that machine learning algorithms are

capable of generating high resolution images of fully synthetic scenes which some

researchers call “dreams” or “hallucinations” of the algorithm. Poetic language aside, one …

Related articles 

  

2021

When ot meets mom: Robust estimation of wasserstein distance

G Staerman, P Laforgue… - International …, 2021 - proceedings.mlr.press

Abstract Originated from Optimal Transport, the Wasserstein distance has gained 

importance in Machine Learning due to its appealing geometrical properties and the 

increasing availability of efficient approximations. It owes its recent ubiquity in generative …

Cited by 18 Related articles All 9 versions

 <——2021———2021———680——


2021 [PDF] projecteuclid.org

Multivariate goodness-of-fit tests based on Wasserstein distance

M Hallin, G Mordant, J Segers - Electronic Journal of Statistics, 2021 - projecteuclid.org

Goodness-of-fit tests based on the empirical Wasserstein distance are proposed for simple 

and composite null hypotheses involving general multivariate distributions. For group 

families, the procedure is to be implemented after preliminary reduction of the data via …

MR4255302 Prelim Hallin, Marc; Mordant, Gilles; Segers, Johan; Multivariate goodness-of-fit tests based on Wasserstein distance. Electron. J. Stat. 15 (2021), no. 1, –.

Review PDF Clipboard Journal Article
Zbl 07379674

6 Related articles All 15 versions
Cited by 32
Related articles All 14 versions


[PDF] arxiv.org

Continuous Wasserstein-2 Barycenter Estimation without Minimax Optimization

A Korotin, L Li, J Solomon, E Burnaev - arXiv preprint arXiv:2102.01752, 2021 - arxiv.org

Wasserstein barycenters provide a geometric notion of the weighted average of probability 

measures based on optimal transport. In this paper, we present a scalable algorithm to 

Cited by 10 Related articles All 3 versions


[PDF] tandfonline.comFull View

Wasserstein distance-based probabilistic linguistic TODIM method with application to the evaluation of sustainable rural tourism potential

S Zhang, Z Wu, Z Ma, X Liu, J Wu - Economic Research …, 2021 - Taylor & Francis

The evaluation of sustainable rural tourism potential is a key work in sustainable rural 

tourism development. Due to the complexity of the rural tourism development situation and 

the limited cognition of people, most of the assessment problems for sustainable rural …

All 2 versions 


Sample out-of-sample inference based on Wasserstein distance

J Blanchet, Y Kang - Operations Research, 2021 - pubsonline.informs.org

We present a novel inference approach that we call sample out-of-sample inference. The

approach can be used widely, ranging from semisupervised learning to stress testing, and it

is fundamental in the application of data-driven distributionally robust optimization. Our …

  Cited by 21 Related articles All 5 versions


[HTML] mdpi.com

Panchromatic Image Super-Resolution Via Self Attention-Augmented Wasserstein Generative Adversarial Network

J Du, K Cheng, Y Yu, D Wang, H Zhou - Sensors, 2021 - mdpi.com

Panchromatic (PAN) images contain abundant spatial information that is useful for earth

observation, but always suffer from low-resolution (LR) due to the sensor limitation and large-

scale view field. The current super-resolution (SR) methods based on traditional attention  …

  All 6 versions 


2021


[PDF] arxiv.org

Wasserstein barycenters are NP-hard to compute

JM Altschuler, E Boix-Adsera - arXiv preprint arXiv:2101.01100, 2021 - arxiv.org

The problem of computing Wasserstein barycenters (aka Optimal Transport barycenters) has

attracted considerable recent attention due to many applications in data science. While there

exist polynomial-time algorithms in any fixed dimension, all known runtimes suffer …

   Related articles All 2 versions 


Data augmentation for rolling bearing fault diagnosis using an enhanced few-shot Wasserstein auto-encoder with meta-learning

Z Pei, H Jiang, X Li, J Zhang, S Liu - Measurement Science and …, 2021 - iopscience.iop.org

Despite the advance of intelligent fault diagnosis for rolling bearings, in industries, data-

driven methods still suffer from data acquisition and imbalance. We propose an enhanced

few-shot Wasserstein auto-encoder (fs-WAE) to reverse the negative effect of imbalance …

 

An inexact PAM method for computing Wasserstein barycenter with unknown supports

Y Qian, S Pan - Computational and Applied Mathematics, 2021 - Springer

Wasserstein barycenter is the centroid of a collection of discrete probability distributions

which minimizes the average of the \(\ell_2\)-Wasserstein distance. This paper focuses on

the computation of Wasserstein barycenters under the case where the support points are  …

 

[PDF] arxiv.org

The existence of minimizers for an isoperimetric problem with Wasserstein penalty term in unbounded domains

Q Xia, B Zhou - Advances in Calculus of Variations, 2021 - degruyter.com

In this article, we consider the (double) minimization problem $\min\{P(E;\Omega) + \lambda W_p(E,F) : E \subset \Omega,\ F \subset \mathbb{R}^d,\ |E \cap F| = 0,\ |E| = |F| = 1\}$,

where $\lambda > 0$, $p \ge 1$, $\Omega$ is a (possibly unbounded) domain in $\mathbb{R}^d$,

$P(E;\Omega)$ denotes the relative perimeter of $E$ in $\Omega$ and $W_p$ denotes the p-Wasserstein …

  Related articles All 4 versions


[PDF] toronto.edu

[PDF] The Wasserstein 1 Distance-Constructing an Optimal Map and Applications to Generative Modelling

T Milne - math.toronto.edu

Recent advances in generative modelling have shown that machine learning algorithms are

capable of generating high resolution images of fully synthetic scenes which some

researchers call “dreams” or “hallucinations” of the algorithm. Poetic language aside, one …

<——2021———2021———690——

[PDF] openreview.net

[PDF] Functional Priors for Bayesian Neural Networks through Wasserstein Distance Minimization to Gaussian Processes

BH Tran, D Milios, S Rossi, M Filippone - openreview.net

The Bayesian treatment of neural networks dictates that a prior distribution is considered

over the weight and bias parameters of the network. The non-linear nature of the model

implies that any distribution of the parameters has an unpredictable effect on the distribution …

  Related articles 


SRWGANTV: Image Super-Resolution Through Wasserstein Generative Adversarial Networks with Total Variational Regularization

J Shao, L Chen, Y Wu - 2021 IEEE 13th International …, 2021 - ieeexplore.ieee.org

The study of generative adversarial networks (GAN) has enormously promoted the research

work on single image super-resolution (SISR) problem. SRGAN firstly apply GAN to SISR

reconstruction, which has achieved good results. However, SRGAN sacrifices the fidelity. At …


CWGAN-DNN: 一种条件 Wasserstein 生成对抗网络入侵检测方法

[Chinese; English title: CWGAN-DNN: An Intrusion Detection Method Based on a Conditional Wasserstein Generative Adversarial Network]

贺佳星, 王晓丹, 宋亚飞, 来杰 - 空军工程大学学报 (Journal of Air Force Engineering University), 2021 - kjgcdx.cnjournals.com

… an intrusion detection method (CWGAN-DNN) based on a conditional Wasserstein generative adversarial network (CWGAN) and a deep neural network (DNN). CWGAN-DNN … decomposes the Gaussian mixture distribution of continuous features, then uses the CWGAN to learn the distribution of the preprocessed data and generate new …

All 2 versions

2021

arXiv:2104.07970 [pdf, other] math.PR
Gromov-Wasserstein Distances between Gaussian Distributions
Authors: Antoine Salmona, Julie Delon, Agnès Desolneux
Abstract: The Gromov-Wasserstein distances were proposed a few years ago to compare distributions which do not lie in the same space. In particular, they offer an interesting alternative to the Wasserstein distances for comparing probability measures living on Euclidean spaces of different dimensions. In this paper, we focus on the Gromov-Wasserstein distance with a ground cost defined as the squared Euclid…  More
Submitted 16 April, 2021; originally announced April 2021.
Cited by 4
 Related articles All 13 versions 

2021

arXiv:2104.07710 [pdf, other] cs.CG cs.DS
Approximation algorithms for 1-Wasserstein distance between persistence diagrams
Authors: Samantha Chen, Yusu Wang
Abstract: Recent years have witnessed a tremendous growth using topological summaries, especially the persistence diagrams (encoding the so-called persistent homology) for analyzing complex shapes. Intuitively, persistent homology maps a potentially complex input object (be it a graph, an image, or a point set and so on) to a unified type of feature summary, called the persistence diagrams. One can then car…  More
Submitted 15 April, 2021; originally announced April 2021.
Comments: To be published in LIPIcs, Volume 190, SEA 2021
ACM Class: I.3.6; E.1
Cited by 4
 Related articles All 8 versions 

 2021

Entropic Gromov-Wasserstein between Gaussian Distributions. Authors: Le, Khang (Creator), Le, Dung (Creator), Nguyen, Huy (Creator), Do, Dat (Creator), Pham, Tung (Creator), Ho, Nhat (Creator)
Summary:We study the entropic Gromov-Wasserstein and its unbalanced version between (unbalanced) Gaussian distributions with different dimensions. When the metric is the inner product, which we refer to as inner product Gromov-Wasserstein (IGW), we demonstrate that the optimal transportation plans of entropic IGW and its unbalanced variant are (unbalanced) Gaussian distributions. Via an application of von Neumann's trace inequality, we obtain closed-form expressions for the entropic IGW between these Gaussian distributions. Finally, we consider an entropic inner product Gromov-Wasserstein barycenter of multiple Gaussian distributions. We prove that the barycenter is a Gaussian distribution when the entropic regularization parameter is small. We further derive a closed-form expression for the covariance matrix of the barycenterShow more
Downloadable Archival Material, 2021-08-24
Undefined
Publisher:2021-08-24


2021

Primal Dual Methods for Wasserstein Gradient Flows

By: Carrillo, Jose A.; Craig, Katy; Wang, Li; et al.

FOUNDATIONS OF COMPUTATIONAL MATHEMATICS    

Early Access: MAR 2021

 Cited by 34 Related articles All 10 versions

2021

Ripple-GAN: Lane Line Detection With Ripple Lane Line Detection Network and Wasserstein GAN

By: Zhang, Youcheng; Lu, Zongqing; Ma, Dongdong; et al.

IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS  Volume: ‏ 22   Issue: ‏ 3   Pages: ‏ 1532-1542   Published: ‏ MAR 2021

online  

New Findings Reported from Tsinghua University Describe Advances in Machine Learning 

(Ripple-gan: Lane Line Detection With Ripple Lane Line Detection Network and Wasserstein Gan)

Robotics & Machine Learning, 05/2021

NewsletterFull Text Online
Ripple-GAN: Lane Line Detection With Ripple Lane Line Detection Network and Wasserstein GAN

Zhang, Youcheng; Lu, Zongqing; Ma, Dongdong; Jing-Hao, Xue; Liao, Qingmin. IEEE Transactions on Intelligent Transportation Systems; New York Vol. 22, Iss. 3,  (2021): 1532-1542.


2021

Entropy-Regularized Optimal Transport on Multivariate ... - MDPI

https://www.mdpi.com › pdf

by Q Tong · 2021 — Abstract: The distance and divergence of the probability measures play a central ... computing the Wasserstein distance is costly, entropy-regularized optimal ... focus on entropy-regularized optimal transport on multivariate normal ...

online  OPEN ACCESS

Weak topology and Opial property in Wasserstein spaces, with applications to Gradient Flows and Proximal Point Algorithms of geodesically convex functionals

by Naldi, Emanuele; Savaré, Giuseppe

04/2021

In this paper we discuss how to define an appropriate notion of weak topology in the Wasserstein space $(\mathcal{P}_2(H),W_2)$ of Borel probability measures...

Journal ArticleFull Text Online

arXiv 

Zbl 07490883


online  OPEN ACCESS

Geometry on the Wasserstein space over a compact Riemannian manifold

by Ding, Hao; Fang, Shizan

04/2021

We will revisit the intrinsic differential geometry of the Wasserstein space over a Riemannian manifold, due to a series of papers by Otto, Villani, Lott,...

Journal ArticleFull Text Online

Related articles All 9 versions 

Zbl 07569351

online

Anhui University of Technology Researchers Further Understanding of Sustainable Development 

(Wasserstein  distance-based probabilistic linguistic TODIM method with application to the evaluation of sustainable rural tourism potential)

Ecology, Environment & Conservation, 04/2021

NewsletterFull Text Online

 

Chinese Academy of Sciences Reports Findings in Bioinformatics 

(Prediction and analysis of multiple protein lysine modified sites based on conditional wasserstein generative adversarial networks)

Information Technology Newsweekly, 04/2021

NewsletterCitation Online

online Cover Image PEER-REVIEW OPEN ACCESS

Prediction and analysis of multiple protein lysine modified sites based on conditional wasserstein...

by Yang, Yingxi; Wang, Hui; Li, Wen ; More...

BMC bioinformatics, 03/2021, Volume 22, Issue 1

Protein post-translational modification (PTM) is a key issue to investigate the mechanism of protein's function. With the rapid development of proteomics...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 [HTML] biomedcentral.com

[HTML] Prediction and analysis of multiple protein lysine modified sites based on conditional wasserstein generative adversarial networks

Y Yang, H Wang, W Li, X Wang… - BMC …, 2021 - bmcbioinformatics.biomedcentral …

Protein post-translational modification (PTM) is a key issue to investigate the mechanism of 

protein's function. With the rapid development of proteomics technology, a large amount of 

protein sequence data has been generated, which highlights the importance of the in-depth …

Cited by 4 Related articles All 12 versions 


New Engineering Research from National Taiwan University of Science and Technology Described 

(Wasserstein Divergence GAN With Cross-Age Identity

Journal of Engineering, 04/2021

NewsletterCitation Online


Generalized spectral clustering via Gromov-Wasserstein learning

S Chowdhury, T Needham - International Conference on …, 2021 - proceedings.mlr.press

We establish a bridge between spectral clustering and Gromov-Wasserstein Learning

(GWL), a recent optimal transport-based approach to graph partitioning. This connection

both explains and improves upon the state-of-the-art performance of GWL. The Gromov …

  Cited by 3 Related articles All 2 versions 


2021


[PDF] arxiv.org

Classification of atomic environments via the Gromov–Wasserstein distance

S Kawano, JK Mason - Computational Materials Science, 2021 - Elsevier

Interpreting molecular dynamics simulations usually involves automated classification of

local atomic environments to identify regions of interest. Existing approaches are generally

limited to a small number of reference structures and only include limited information about …

  Cited by 2 Related articles All 4 versions


 

[HTML] springer.com

[HTML] Quantum statistical learning via Quantum Wasserstein natural gradient

S Becker, W Li - Journal of Statistical Physics, 2021 - Springer

In this article, we introduce a new approach towards the statistical learning

problem \(\mathrm{argmin}_{\rho(\theta)\in\mathcal{P}_{\theta}} W_{Q}^{2}(\rho_{\star},\rho(\theta))\)

to approximate a target quantum state \(\rho_{\star}\) by a set of …

   Related articles All 5 versions


[PDF] arxiv.org

Continuous wasserstein-2 barycenter estimation without minimax optimization

A Korotin, L Li, J Solomon, E Burnaev - arXiv preprint arXiv:2102.01752, 2021 - arxiv.org

Wasserstein barycenters provide a geometric notion of the weighted average of probability

measures based on optimal transport. In this paper, we present a scalable algorithm to …

Cited by 8 Related articles All 3 versions


[HTML] mdpi.com

Panchromatic Image Super-Resolution Via Self Attention-Augmented Wasserstein Generative Adversarial Network

J Du, K Cheng, Y Yu, D Wang, H Zhou - Sensors, 2021 - mdpi.com

Panchromatic (PAN) images contain abundant spatial information that is useful for earth

observation, but always suffer from low-resolution (LR) due to the sensor limitation and large-

scale view field. The current super-resolution (SR) methods based on traditional attention …

  All 6 versions 


[PDF] mdpi.com

Unpaired Image Denoising via Wasserstein GAN in Low-Dose CT Image with Multi-Perceptual Loss and Fidelity Loss

Z Yin, K Xia, Z He, J Zhang, S Wang, B Zu - Symmetry, 2021 - mdpi.com

The use of low-dose computed tomography (LDCT) in medical practice can effectively

reduce the radiation risk of patients, but it may increase noise and artefacts, which can

compromise diagnostic information. The methods based on deep learning can effectively …

  Related articles 

Unpaired Image Denoising via Wasserstein GAN in Low-Dose CT Image with Multi-Perceptual Loss and Fidelity Loss. Symmetry 2021, 13, 126

Z Yin, K Xia, Z He, J Zhang, S Wang, B Zu - 2021 - search.proquest.com

The use of low-dose computed tomography (LDCT) in medical practice can effectively

reduce the radiation risk of patients, but it may increase noise and artefacts, which can

compromise diagnostic information. The methods based on deep learning can effectively …

<——2021———2021———620——

An inexact PAM method for computing Wasserstein barycenter with unknown supports

Y Qian, S Pan - Computational and Applied Mathematics, 2021 - Springer

Wasserstein barycenter is the centroid of a collection of discrete probability distributions

which minimizes the average of the \(\ell_2\)-Wasserstein distance. This paper focuses on

the computation of Wasserstein barycenters under the case where the support points are …

 

[PDF] arxiv.org

Tighter expected generalization error bounds via Wasserstein distance

B Rodríguez-Gálvez, G Bassi, R Thobaben… - arXiv preprint arXiv …, 2021 - arxiv.org

In this work, we introduce several expected generalization error bounds based on the

Wasserstein distance. More precisely, we present full-dataset, single-letter, and random-

subset bounds on both the standard setting and the randomized-subsample setting from …

  All 3 versions 


[PDF] arxiv.org

Distributional robustness in minimax linear quadratic control with Wasserstein distance

K Kim, I Yang - arXiv preprint arXiv:2102.12715, 2021 - arxiv.org

To address the issue of inaccurate distributions in practical stochastic systems, a minimax

linear-quadratic control method is proposed using the Wasserstein metric. Our method aims

to construct a control policy that is robust against errors in an empirical distribution of …

  All 2 versions 


[PDF] arxiv.org

Two-sample Test with Kernel Projected Wasserstein Distance

J Wang, R Gao, Y Xie - arXiv preprint arXiv:2102.06449, 2021 - arxiv.org

We develop a kernel projected Wasserstein distance for the two-sample test, an essential

building block in statistics and machine learning: given two sets of samples, to determine

whether they are from the same distribution. This method operates by finding the nonlinear …

  All 3 versions 


[PDF] arxiv.org

The isometry group of Wasserstein spaces: the Hilbertian case

GP Gehér, T Titkos, D Virosztek - arXiv preprint arXiv:2102.02037, 2021 - arxiv.org

Motivated by Kloeckner's result on the isometry group of the quadratic Wasserstein space

$\mathcal{W}_2\left(\mathbb{R}^n\right)$, we describe the isometry group

$\mathrm{Isom}\left(\mathcal{W}_p(E)\right)$ for all parameters $0<p<\infty$ and for all separable …

  All 2 versions 

The isometry group of Wasserstein spaces: the Hilbertian case

G Pál Gehér, T Titkos, D Virosztek - arXiv e-prints, 2021 - ui.adsabs.harvard.edu

Motivated by Kloeckner's result on the isometry group of the quadratic Wasserstein space

$\mathcal{W}_2\left(\mathbb{R}^n\right)$, we describe the isometry group

$\mathrm{Isom}\left(\mathcal{W}_p(E)\right)$ for all parameters $0<p<\infty$ and for all separable …


2021


[PDF] arxiv.org

Projected Wasserstein gradient descent for high-dimensional Bayesian inference

Y Wang, P Chen, W Li - arXiv preprint arXiv:2102.06350, 2021 - arxiv.org

We propose a projected Wasserstein gradient descent method (pWGD) for high-dimensional

Bayesian inference problems. The underlying density function of a particle system of WGD is

approximated by kernel density estimation (KDE), which faces the long-standing curse of …

  All 4 versions 

The Wasserstein space of stochastic processes

by Bartl, Daniel; Beiglböck, Mathias; Pammer, Gudmund

arXiv.org, 04/2021

Wasserstein distance induces a natural Riemannian structure for the probabilities on the Euclidean space. This insight of classical

transport theory is...

Paper  Full Text Online

arXiv:2104.14245 [pdf, other] math.PR
The Wasserstein space of stochastic processes
Authors: Daniel Bartl, Mathias Beiglböck, Gudmund Pammer
Abstract: Wasserstein distance induces a natural Riemannian structure for the probabilities on the Euclidean space. This insight of classical transport theory is fundamental for tremendous applications in various fields of pure and applied mathematics. We believe that an appropriate probabilistic variant, the adapted Wasserstein distance AW, can play a similar role for the class FP of filtered processes,…  More
Submitted 29 April, 2021; originally announced April 2021.
Journal ArticleFull Text Online

Cited by 9 Related articles All 2 versions 

arXiv:2104.12384 [pdf, ps, other] stat.ML cs.LG math.NA math.PR

Wasserstein distance estimates for the distributions of numerical approximations to ergodic stochastic differential equations
Authors: J. M. Sanz-Serna, Konstantinos C. Zygalakis
Abstract: We present a framework that allows for the non-asymptotic study of the 2-Wasserstein distance between the invariant distribution of an ergodic stochastic differential equation and the distribution of its numerical approximation in the strongly log-concave case. This allows us to study in a unified way a number of different integrators proposed in the literature for the overdamped and underdamped…  More
Submitted 26 April, 2021; originally announced April 2021.
Comments: 29 pages, 2 figures
MSC Class: 65C40; 60H10; 60H35

Journal ArticleFull Text Online
Cited by 3
 Related articles All 22 versions 
Wasserstein distance estimates for the distributions of numerical approximations to ergodic stochastic differential equations

Sanz-Serna, JM and Zygalakis, KC

2021 | 

JOURNAL OF MACHINE LEARNING RESEARCH

 22

We present a framework that allows for the non-asymptotic study of the 2-Wasserstein distance between the invariant distribution of an ergodic stochastic differential equation and the distribution of its numerical approximation in the strongly log-concave case. This allows us to study in a unified way a number of different integrators proposed in the literature for the overdamped and underdamped …

1 Citation 52 References  Related records


arXiv:2104.12368 [pdf, other] stat.ML cs.LG
Finite sample approximations of exact and entropic Wasserstein distances between covariance operators and Gaussian processes
Authors: Minh Ha Quang
Abstract: This work studies finite sample approximations of the exact and entropic regularized Wasserstein distances between centered Gaussian processes and, more generally, covariance operators of functional random processes. We first show that these distances/divergences are fully represented by reproducing kernel Hilbert space (RKHS) covariance and cross-covariance operators associated with the correspon… 
More
Submitted 26 April, 2021; originally announced April 2021.
Comments: 30 pages

Journal ArticleFull Text Online

  All 3 versions 

arXiv:2104.12097 [pdf, ps, other] math.DG math.FA math.MG
Eigenfunctions and a lower bound on the Wasserstein distance
Authors: Nicolò De Ponti, Sara Farinelli
Abstract: We prove a conjectured lower bound on the p-Wasserstein distance between the positive and negative parts of a Laplace eigenfunction. Our result holds for general RCD(K,∞) spaces.
Submitted 25 April, 2021; originally announced April 2021.

Journal ArticleFull Text Online

Related articles All 4 versions 

<——2021———2021———630——


[PDF] intlpress.com

Zero-sum differential games on the Wasserstein space

T Başar, J Moon - Communications in Information and Systems, 2021 - intlpress.com

We consider two-player zero-sum differential games (ZSDGs), where the state process

(dynamical system) depends on the random initial condition and the state process's

distribution, and the objective functional includes the state process's distribution and the …

  Related articles 


2021

MR4248478 Prelim Dinh, Trung Hoa; Le, Cong Trinh; Vo, Bich Khue; Vuong, Trung Dung; The αz-Bures Wasserstein divergence. Linear Algebra Appl. 624 (2021), 267–280. 47A63 (47A56)

Review PDF Clipboard Journal Article

Carrillo, J. A.Vaes, U.

Wasserstein stability estimates for covariance-preconditioned Fokker-Planck equations. (English) Zbl 07339635

Nonlinearity 34, No. 4, 2275-2295 (2021).

MSC:  35Q70 35Q84 62F15 65C35

PDF BibTeX XML Cite

Full Text: DOI

Cited by 18 Related articles All 7 versions

Whiteley, Nick

Dimension-free Wasserstein contraction of nonlinear filters. (English) Zbl 07339593

Stochastic Processes Appl. 135, 31-50 (2021).

MSC:  60

PDF BibTeX XML Cite

Full Text: DOI


2021

Wang, Feng-Yu

Precise limit in Wasserstein distance for conditional empirical measures of Dirichlet diffusion processes. (English) Zbl 07336914

J. Funct. Anal. 280, No. 11, Article ID 108998, 23 p. (2021).

MSC:  60D05 58J65

PDF BibTeX XML Cite


2021


 2021 

Dimension-free Wasserstein contraction of nonlinear filters

N Whiteley - Stochastic Processes and their Applications, 2021 - Elsevier

For a class of partially observed diffusions, conditions are given for the map from the initial

condition of the signal to filtering distribution to be contractive with respect to Wasserstein

distances, with rate which does not necessarily depend on the dimension of the state-space …

  All 5 versions

Full Text: DOI


Precise limit in Wasserstein distance for conditional empirical measures of Dirichlet diffusion processes

FY Wang - Journal of Functional Analysis, 2021 - Elsevier

Let M be a d-dimensional connected compact Riemannian manifold with boundary ∂M, let

V ∈ C²(M) such that μ(dx) := e^{V(x)} dx is a probability measure, and let X_t be the diffusion

process generated by L := Δ + ∇V with τ := inf{t ≥ 0 : X_t ∈ ∂M}. Consider the conditional …

  Cited by 3 Related articles All 3 versions


[PDF] arxiv.org

Non-negative matrix and tensor factorisations with a smoothed Wasserstein loss

SY Zhang - arXiv preprint arXiv:2104.01708, 2021 - arxiv.org

Non-negative matrix and tensor factorisations are a classical tool in machine learning and

data science for finding low-dimensional representations of high-dimensional datasets. In

applications such as imaging, datasets can often be regarded as distributions in a space …

  All 3 versions 


[PDF] auburn.edu

Efficient and Robust Classification for Positive Definite Matrices with Wasserstein Metric

J Cui - 2021 - etd.auburn.edu

Riemannian geometry methods are widely used to classify SPD (Symmetric Positives-

Definite) matrices, such as covariances matrices of brain-computer interfaces. Common

Riemannian geometry classification methods are based on Riemannian distance to …


[PDF] arxiv.org

Computationally Efficient Wasserstein Loss for Structured Labels

A Toyokuni, S Yokoi, H Kashima, M Yamada - arXiv preprint arXiv …, 2021 - arxiv.org

The problem of estimating the probability distribution of labels has been widely studied as a

label distribution learning (LDL) problem, whose applications include age estimation,

emotion analysis, and semantic segmentation. We propose a tree-Wasserstein distance …

  All 3 versions 

<——2021———2021———640——


[PDF] arxiv.org

Geometry on the Wasserstein space over a compact Riemannian manifold

H Ding, S Fang - arXiv preprint arXiv:2104.00910, 2021 - arxiv.org

We will revisit the intrinsic differential geometry of the Wasserstein space over a Riemannian

manifold, due to a series of papers by Otto, Villani, Lott, Ambrosio, Gigli, Savaré and so on.

Subjects: Mathematical Physics (math-ph); Probability (math. PR) Cite as: arXiv: 2104.00910 …

  All 9 versions 


On clustering uncertain and structured data with Wasserstein barycenters and a geodesic criterion for the number of clusters

GI Papayiannis, GN Domazakis… - Journal of Statistical …, 2021 - Taylor & Francis

Clustering schemes for uncertain and structured data are considered relying on the notion of

Wasserstein barycenters, accompanied by appropriate clustering indices based on the

intrinsic geometry of the Wasserstein space. Such type of clustering approaches are highly …


 

DISSERTATION

Decentralized Algorithms for Wasserstein Barycenters

Dvinskikh, Darina ; 2021

Decentralized Algorithms for Wasserstein Barycenters

Online Access Available 

arXiv:2105.01587 [pdf, other] math.OC
Decentralized Algorithms for Wasserstein Barycenters
Authors: Darina Dvinskikh
Abstract: In this thesis, we consider the Wasserstein barycenter problem of discrete probability measures from computational and statistical sides in two scenarios: (I) the measures are given and we need to compute their Wasserstein barycenter, and (ii) the measures are generated from a probability distribution and we need to calculate the population barycenter of the distribution defined by the notion of F…  More
Submitted 4 May, 2021; originally announced May 2021.
Cited by 4 Related articles All 4 versions

arXiv:2105.00447 [pdf, other] cs.CV cs.LG
Automatic Visual Inspection of Rare Defects: A Framework based on GP-WGAN and Enhanced Faster R-CNN
Authors: Masoud Jalayer, Reza Jalayer, Amin Kaboli, Carlotta Orsenigo, Carlo Vercellis
Abstract: A current trend in industries such as semiconductors and foundry is to shift their visual inspection processes to Automatic Visual Inspection (AVI) systems, to reduce their costs, mistakes, and dependency on human experts. This paper proposes a two-staged fault diagnosis framework for AVI systems. In the first stage, a generation model is designed to synthesize new samples based on real samples. T…  More
Submitted 2 May, 2021; originally announced May 2021.
Comments: 13 pages, submitted for THE IEEE INTERNATIONAL CONFERENCE ON INDUSTRY 4.0, ARTIFICIAL INTELLIGENCE, AND COMMUNICATIONS TECHNOLOGY (IAICT2021)

 Cited by 2 Related articles All 4 versions

 patent
Method for synthesizing high-energy image based on Wasserstein generating countermeasure network model by using electronic device, involves utilizing arbiter network for judging high energy image synthesized by generator network

Patent Number: CN112634390-A

Patent Assignee: SHENZHEN INST ADVANCED TECHNOLOGY

Inventor(s): ZHENG H; HU Z; LIANG D; et al.


patent

Method for generating digital handwriting based on double arbiter weight generating countermeasure network, involves performing theoretical analysis to dual discriminator Wasserstein generative adversarial network model

Patent Number: CN112598125-A

Patent Assignee: UNIV XIAN SCI & TECHNOLOGY; SHAANXI ZHONGYI TIMES TECHNOLOGY CO LTD

Inventor(s): LIU B; GAO N; HUANG M; et al.


 patent

Wavelet transformation local characteristic scale decomposition Wasserstein distance based pipeline leakage signal de-noising method, involves reconstructing effective component, and expressing leakage signal after removing noise component

Patent Number: CN112539887-A

Patent Assignee: UNIV NORTHEAST PETROLEUM

Inventor(s): DONG H; ZHOU Y; LU J; et al.


Wasserstein generative adversarial network-based approach for real-time track irregularity estimation using vehicle dynamic responses

Z Yuan, J Luo, S Zhu, W Zhai - Vehicle System Dynamics, 2021 - Taylor & Francis

… based on dynamic responses of the in-service train. In this paper, a Wasserstein generative

adversarial network (WGAN)-based … The proposed WGAN is composed of a generator …

Related articles


2021 see 2020

Parameter-Transferred Wasserstein Generative Adversarial Network (PT-WGAN) for Low-Dose PET Image Denoising

By: Gong, YuShan, HongmingTeng, Yueyang; et al.

IEEE TRANSACTIONS ON RADIATION AND PLASMA MEDICAL SCIENCES  Volume: ‏ 5   Issue: ‏ 2   Pages: ‏ 213-223   Published: ‏ MAR 2021


Classification of atomic environments via the Gromov–Wasserstein distance

S Kawano, JK Mason - Computational Materials Science, 2021 - Elsevier

Interpreting molecular dynamics simulations usually involves automated classification of

local atomic environments to identify regions of interest. Existing approaches are generally

limited to a small number of reference structures and only include limited information about …

  Cited by 2 Related articles All 4 versions

<——2021———2021———650——


[PDF] arxiv.org

Sufficient Condition for Rectifiability Involving Wasserstein Distance W_2

D Dąbrowski - The Journal of Geometric Analysis, 2021 - Springer

Abstract A Radon measure \(\mu\) is n-rectifiable if it is absolutely continuous with respect

to \(\mathcal{H}^n\) and \(\mu\)-almost all of \(\operatorname{supp}\mu\) can be covered by

Lipschitz images of \(\mathbb{R}^n\). In this paper we give two sufficient conditions for …

  Cited by 4 Related articles All 3 versions


 Local Stability of Wasserstein GANs With Abstract Gradient Penalty

C Kim, S Park, HJ Hwang - IEEE Transactions on Neural …, 2021 - ieeexplore.ieee.org

The convergence of generative adversarial networks (GANs) has been studied substantially

in various aspects to achieve successful generative tasks. Ever since it is first proposed, the

idea has achieved many theoretical improvements by injecting an instance noise, choosing …

  All 3 versions


[PDF] arxiv.org

Necessary Optimality Conditions for Optimal Control Problems in Wasserstein Spaces

B Bonnet, H Frankowska - arXiv preprint arXiv:2101.10668, 2021 - arxiv.org

In this article, we derive first-order necessary optimality conditions for a constrained optimal

control problem formulated in the Wasserstein space of probability measures. To this end,

we introduce a new notion of localised metric subdifferential for compactly supported …

  All 2 versions 


[PDF] auburn.edu

Efficient and Robust Classification for Positive Definite Matrices with Wasserstein Metric

J Cui - 2021 - etd.auburn.edu

Riemannian geometry methods are widely used to classify SPD (Symmetric Positives-

Definite) matrices, such as covariances matrices of brain-computer interfaces. Common

Riemannian geometry classification methods are based on Riemannian distance to …

  

基于WGAN的不均衡太赫兹光谱识别 - 光谱学与光谱分析

[Chinese; English title: WGAN-Based Recognition of Unbalanced Terahertz Spectra - Spectroscopy and Spectral Analysis]

http://www.gpxygpfx.com › article

Wasserstein GAN for the Classification of Unbalanced THz Database [J]. ... Terahertz (THz) waves are electromagnetic waves with frequencies between 0.1 and 10 THz; in the electromagnetic spectrum they lie between the microwave ...

[CITATION] Wasserstein GAN for the Classification of Unbalanced THz Database

Z Rong-sheng, S Tao, L Ying-li… - …, 2021 - OFFICE SPECTROSCOPY & …


2021

MR4252812 Prelim Steinerberger, Stefan; A Wasserstein inequality and minimal Green energy on compact manifolds. J. Funct. Anal. 281 (2021), no. 5, 109076. 31B10 (35K05 49Q20)

Review PDF Clipboard Journal Article

  Cited by 4 Related articles All 4 versions

 A Wasserstein inequality and minimal Green energy on compact manifolds

2021 JOURNAL OF FUNCTIONAL ANALYSIS

Stefan Steinerberger

University of Washington

Abstract Let M be a smooth, compact d-dimensional manifold, d ≥ 3, without boundary and let G : M × M → ℝ ∪ {∞} denote the Green's function of the Laplacian −Δ (normalized to have mean value 0). We prove a bound on the cost of transporting Dirac measures in {x ...

Cited by 10 Related articles All 3 versions

MR4251253 Prelim Frohmader, Andrew; Volkmer, Hans; 1-Wasserstein distance on the standard simplex. Algebr. Stat. 12 (2021), no. 1, 43–56.

Review PDF Clipboard Journal Article


2021


2021 see 2019

1-Wasserstein distance on the standard simplex

A Frohmader, H Volkmer - Algebraic Statistics, 2021 - msp.org

Wasserstein distances provide a metric on a space of probability measures. We consider the

space Ω of all probability measures on the finite set χ={1,…, n}, where n is a positive integer.

The 1-Wasserstein distance, W_1(μ, ν), is a function from Ω × Ω to [0,∞). This paper derives …

 [PDF] arxiv.org

1-Wasserstein distance on the standard simplex

A Frohmader, H Volkmer - Algebraic Statistics, 2021 - msp.org

Wasserstein distances provide a metric on a space of probability measures. We consider the

space Ω of all probability measures on the finite set χ={1,…, n}, where n is a positive integer.

The 1-Wasserstein distance, W_1(μ, ν), is a function from Ω × Ω to [0,∞). This paper derives …

Cited by 4 Related articles All 4 versions

 

2021  onlineOPEN ACCESS

Wasserstein Distributionally Robust Look-Ahead Economic Dispatch
by Poolla, Bala Kameshwar; Hota, Ashish R; Bolognani, Saverio ; More...
IEEE transactions on power systems, 05/2021, Volume 36, Issue 3
We consider the problem of look-ahead economic dispatch (LAED) with uncertain renewable energy generation. The goal of this problem is to minimize the cost of...

Journal ArticleFull Text Online

   

2021 online
Wasserstein Generative Models for Patch-Based Texture Synthesis
by Houdard, Antoine; Leclaire, Arthur; Papadakis, Nicolas ; More...
Scale Space and Variational Methods in Computer Vision, 04/2021
This work addresses texture synthesis by relying on the local representation of images through their patch distributions. The main contribution is a framework...

Book ChapterFull Text Online

Cited by 3 Related articles All 15 versions

<——2021———2021———660——

New Findings from Graz University of Technology in the Area of Fourier Analysis Reported 

(Berry-esseen Smoothing Inequality for the Wasserstein Metric On Compact Lie Groups)". 

Mathematics Week (1944-2440), p. 370.   04/2021
NewsletterCitation Online


 Method for generating confrontation finger vein image based on texture constraint and Poisson fusion, involves judging rebuilt finger vein image by arbiter network, and obtaining sum of local WN-GP loss and global WGAN-GP arbiter loss

Patent Number: CN112488935-A

Patent Assignee: UNIV HANGZHOU DIANZI

Inventor(s): WANG Z; SHEN L; JIANG H.


2021 patent

Two-dimensional code image sharpening deblurring processing method, involves calculating loss function by WGAN-GP as critical function for balancing loss function specific gravity and marking data in training set with drop label picture

Patent Number: CN112258425-A

Patent Assignee: CHINA TELECOM WANWEI INFORMATION TECHNOL

Inventor(s): ZHAO W; WANG Z; HAO D; et al.


 Wasserstein distributionally robust chance-constrained optimization for energy and reserve dispatch: An exact and physically-bounded formulation

A Arrigo, C Ordoudis, J Kazempour, Z De Grève… - European Journal of …, 2021 - Elsevier

In the context of transition towards sustainable, cost-efficient and reliable energy systems,

the improvement of current energy and reserve dispatch models is crucial to properly cope

with the uncertainty of weather-dependent renewable power generation. In contrast to …

 Cited by 6 Related articles All 6 versions

MR4304237

 2021  [PDF] arxiv.org

Conditional Wasserstein GAN-based oversampling of tabular data for imbalanced learning

J Engelmann, S Lessmann - Expert Systems with Applications, 2021 - Elsevier

Class imbalance impedes the predictive performance of classification models. Popular

countermeasures include oversampling minority class cases by creating synthetic examples.

The paper examines the potential of Generative Adversarial Networks (GANs) for …

  Cited by 5 Related articles All 4 versions

Number: 114582   Published: ‏ JUL 15 2021

2021 patent
Patent Issued for Object Shape Regression Using Wasserstein Distance

Journal of Engineering, 03/2021
Newsletter  Full Text Online


A material decomposition method for dual‐energy CT via dual interactive Wasserstein generative...
by Shi, Zaifeng; Li, Huilong; Cao, Qingjie ; More...
Medical physics (Lancaster), 06/2021, Volume 48, Issue 6
Purpose Dual‐energy computed tomography (DECT) is highly promising for material characterization and identification, whereas reconstructed material‐specific...
ArticleView Article PDF
Journal Article  Full Text Online

[PDF] umons.ac.be

Wasserstein distributionally robust chance-constrained optimization for energy and reserve dispatch: An exact and physically-bounded formulation

A Arrigo, C Ordoudis, J Kazempour, Z De Grève… - European Journal of …, 2021 - Elsevier

In the context of transition towards sustainable, cost-efficient and reliable energy systems,

the improvement of current energy and reserve dispatch models is crucial to properly cope

with the uncertainty of weather-dependent renewable power generation. In contrast to …

  Cited by 3 All 6 versions


[PDF] optimization-online.org

Data-driven distributionally robust chance-constrained optimization with Wasserstein metric

R Ji, MA Lejeune - Journal of Global Optimization, 2021 - Springer

We study distributionally robust chance-constrained programming (DRCCP) optimization

problems with data-driven Wasserstein ambiguity sets. The proposed algorithmic and

reformulation framework applies to all types of distributionally robust chance-constrained  …

  3 Related articles All 3 versions


Considering anatomical prior information for low-dose CT image enhancement using attribute-augmented Wasserstein generative adversarial networks

Z Huang, X Liu, R Wang, J Chen, P Lu, Q Zhang… - Neurocomputing, 2021 - Elsevier

Currently, many deep learning (DL)-based low-dose CT image postprocessing technologies

fail to consider the anatomical differences in training data among different human body sites,

such as the cranium, lung and pelvis. In addition, we can observe evident anatomical …

   Related articles


On Stein’s Factors for Poisson Approximation in Wasserstein Distance with Nonlinear...
by Liao, Zhong-Wei; Ma, Yutao; Xia, Aihua
Journal of theoretical probability, 09/2021
Article Link Read Article
Journal Article  Full Text Online

<——2021———2021———670——


 

[HTML] biomedcentral.com

[HTML] Prediction and analysis of multiple protein lysine modified sites based on conditional wasserstein generative adversarial networks

Y Yang, H Wang, W Li, X Wang… - BMC …, 2021 - bmcbioinformatics.biomedcentral …

Protein post-translational modification (PTM) is a key issue to investigate the mechanism of

protein's function. With the rapid development of proteomics technology, a large amount of

protein sequence data has been generated, which highlights the importance of the in-depth …

  All 9 versions 

 

[PDF] arxiv.org

Distributionally Robust Chance-Constrained Programmings for Non-Linear Uncertainties with Wasserstein Distance

Y Gu, Y Wang - arXiv preprint arXiv:2103.04790, 2021 - arxiv.org

In this paper, we study a distributionally robust chance-constrained programming (DRCCP) under a Wasserstein ambiguity set, where the uncertain constraints are required to be

jointly satisfied with a probability of at least a given risk level for all the probability …

  All 2 versions 


2021

Wasserstein distance to independence models

TÖ Çelik, A Jamneshan, G Montúfar, B Sturmfels… - Journal of Symbolic …, 2021 - Elsevier

An independence model for discrete random variables is a Segre-Veronese variety in a

probability simplex. Any metric on the set of joint states of the random variables induces a

Wasserstein metric on the probability simplex. The unit ball of this polyhedral norm is dual to …

  Cited by 2 Related articles All 3 versions


[PDF] wiley.com

Closed‐form expressions for maximum mean discrepancy with applications to Wasserstein auto‐encoders

RM Rustamov - Stat, 2021 - Wiley Online Library

The maximum mean discrepancy (MMD) has found numerous applications in statistics and

machine learning, among which is its use as a penalty in the Wasserstein auto‐encoder

(WAE). In this paper, we compute closed‐form expressions for estimating the Gaussian …

  Cited by 5 Related articles All 3 versions


online Cover Image  PEER-REVIEW

Exponential convergence in entropy and Wasserstein for McKean–Vlasov SDEs

by Ren, Panpan; Wang, Feng-Yu

Nonlinear analysis, 05/2021, Volume 206

The following type of exponential convergence is proved for (non-degenerate or degenerate) McKean–Vlasov SDEs:...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

Cited by 17 Related articles All 5 versions

2021

online

Reports Outline Approximation Theory Study Findings from National Research University Higher School of Economics (The Measurement of Relations On Belief Functions Based On the Kantorovich Problem and the Wasserstein...

Robotics & Machine Learning, 05/2021

NewsletterFull Text Online

 

 

online Cover Image PEER-REVIEW

Fast Wasserstein-distance-based distributionally robust chance-constrained power dispatch for multi-zone...

by Chen, Ge; Zhang, Hongcai; Hui, Hongxun ; More...

IEEE transactions on smart grid, 04/2021

Heating, ventilation, and air-conditioning (HVAC) systems play an increasingly important role in the construction of smart cities because of their high energy...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon



Wasserstein型コストに基づくワンウェイ型カーシェアリングサービスの最適制御
Authors: 星野 健太; 櫻間 一徳
自動制御連合講演会講演論文集 64回自動制御連合講演会
Downloadable Article, 2021
Publication: 自動制御連合講演会講演論文集 64回自動制御連合講演会, 2021, 825
Publisher: 2021

[Japanese; English: Optimal Control of One-Way Car Sharing Service Based on a Wasserstein Cost. Authors: Kenta Hoshino, Kazunori Sakurama. Proceedings of the 64th Japan Joint Automatic Control Conference, 2021, 825]

 

online OPEN ACCESS

Decentralized Algorithms for Wasserstein Barycenters

by Dvinskikh, Darina

05/2021

In this thesis, we consider the Wasserstein barycenter problem of discrete probability measures from computational and statistical sides in two scenarios: (I)...

Journal ArticleFull Text Online

Decentralized Algorithms for Wasserstein Barycenters  

https://arxiv.org › pdf

by D Dvinskikh · 2021 — In this thesis, we consider the Wasserstein barycenter problem of ... skikh and Tiapkin, 2021) published in the proceedings of the 24th ...

Cited by 6 Related articles All 4 versions

 

online  OPEN ACCESS

The Wasserstein space of stochastic processes

by Bartl, Daniel; Beiglböck, Mathias; Pammer, Gudmund

04/2021

Wasserstein distance induces a natural Riemannian structure for the probabilities on the Euclidean space. This insight of classical transport theory is...

Journal ArticleFull Text Online

Cited by 2 Related articles All 2 versions

<——2021———2021———680——

online  OPEN ACCESS

Gromov-Wasserstein Distances between Gaussian Distributions

by Salmona, Antoine; Delon, Julie; Desolneux, Agnès

04/2021

The Gromov-Wasserstein distances were proposed a few years ago to compare distributions which do not lie in the same space. In particular, they offer an...

Journal ArticleFull Text Online

 Cited by 7 Related articles All 44 versions

 

online  OPEN ACCESS

Eigenfunctions and a lower bound on the Wasserstein distance

by De Ponti, Nicolò; Farinelli, Sara

04/2021

We prove a conjectured lower bound on the $p$-Wasserstein distance between the positive and negative parts of a Laplace eigenfunction. Our result holds for...

Journal ArticleFull Text Online

Related articles All 4 versions

online OPEN ACCESS

Wasserstein distance estimates for the distributions of numerical approximations to ergodic stochastic...

by Sanz-Serna, J. M; Zygalakis, Konstantinos C

04/2021

We present a framework that allows for the non-asymptotic study of the $2$-Wasserstein distance between the invariant distribution of an ergodic stochastic...

Journal ArticleFull Text Online



 [PDF] archives-ouvertes.fr

Measuring the Irregularity of Vector-Valued Morphological Operators Using Wasserstein Metric

ME Valle, S Francisco, MA Granero… - … Conference on Discrete …, 2021 - Springer

Mathematical morphology is a useful theory of nonlinear operators widely used for image

processing and analysis. Despite the successful application of morphological operators for

binary and gray-scale images, extending them to vector-valued images is not straightforward …

  All 2 versions

 

2021 online

Wasserstein Generative Models for Patch-Based Texture Synthesis

by Houdard, Antoine; Leclaire, Arthur; Papadakis, Nicolas ; More...

Scale Space and Variational Methods in Computer Vision, 04/2021

This work addresses texture synthesis by relying on the local representation of images through their patch distributions. The main contribution is a framework...

Book ChapterFull Text Online

2021

 

online

Report Summarizes Symbolic Computation Study Findings from Simon Fraser University (Wasserstein...

Journal of Technology & Science, 05/2021

NewsletterFull Text Online

online

Findings from National University of Singapore Has Provided New Data on Machine Learning (A Fast Globally Linearly Convergent Algorithm for the Computation of Wasserstein...

Robotics & Machine Learning, 05/2021

NewsletterFull Text Online

Cited by 18 Related articles All 12 versions

online

New Findings Reported from Tsinghua University Describe Advances in Machine Learning 

(Ripple-gan: Lane Line Detection With Ripple Lane Line Detection Network and Wasserstein...

Robotics & Machine Learning, 05/2021

NewsletterFull Text Online

online

New Machine Learning Study Findings Recently Were Reported by Researchers at Massachusetts Institute of Technology (Wasserstein...

Robotics & Machine Learning, 05/2021

NewsletterFull Text Online

 

online

Reports Outline Approximation Theory Study Findings from National Research University Higher School of Economics 

(The Measurement of Relations On Belief Functions Based On the Kantorovich Problem and the Wasserstein...

Robotics & Machine Learning, 05/2021

NewsletterFull Text Online

 All 3 versions 

<——2021———2021———690——

2021 online Cover Image PEER-REVIEW

A material decomposition method for dual-energy CT via dual interactive Wasserstein generative adversarial...

by Shi, Zaifeng; Li, Huilong; Cao, Qingjie ; More...

Medical physics (Lancaster), 03/2021

Dual-energy computed tomography (DECT) is highly promising for material characterization and identification, whereas reconstructed material-specific images are...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

News results for "(TitleCombined:(wasserstein))"

online

Palo Alto Research Center Incorporated issued patent titled "Object shape regression using wasserstein...

News Bites - Private Companies, Mar 11, 2021

Newspaper ArticleFull Text Online


Palo Alto Research Center Obtains Patent for Object Shape Regression Using Wasserstein Distance

Pedia Content Solutions Pvt. Ltd, Mar 9, 2021


MR4257806 Prelim Chambolle, Antonin; Laux, Tim; Mullins-Sekerka as the Wasserstein flow of the perimeter. Proc. Amer. Math. Soc. 149 (2021), no. 7, 2943–2956. 35A15 (35R35 35R37 49Q20 76D27 90B06)

Review PDF Clipboard Journal Article

PROCEEDINGS OF THE AMERICAN MATHEMATICAL SOCIETY  Volume: ‏ 149   Issue: ‏ 7   Pages: ‏ 2943-2956   Published: ‏ JUL 2021
  [PDF] ams.org

Mullins-Sekerka as the Wasserstein flow of the perimeter

A Chambolle, T Laux - Proceedings of the American Mathematical Society, 2021 - ams.org

We prove the convergence of an implicit time discretization for the one-phase Mullins-Sekerka

equation, possibly with additional non-local repulsion, proposed in [F. Otto, Arch. Rational …

Cited by 3 Related articles All 22 versions


2021

Wasserstein autoregressive models for density time series

By: Zhang, Chao; Kokoszka, Piotr; Petersen, Alexander

JOURNAL OF TIME SERIES ANALYSIS    

Early Access: MAY 2021

 Cited by 6 Related articles All 5 versions

online Cover Image  PEER-REVIEW

Convergence rate in Wasserstein distance and semiclassical limit for the defocusing logarithmic...

by Ferriere, Guillaume

Analysis & PDE, 03/2021, Volume 14, Issue 2

Article Link Read Article BrowZine Article Link Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

online Cover Image PEER-REVIEW

Reconstruction Method for Missing Measurement Data Based on Wasserstein Generative Adversarial Network

by Zhang, Changfan; Chen, Hongrun; He, Jing ; More...

Journal of advanced computational intelligence and intelligent informatics, 03/2021, Volume 25, Issue 2

Focusing on the issue of missing measurement data caused by complex and changeable working conditions during the operation of high-speed trains, in this paper,...

Article Link Read Article BrowZine Article Link Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

2021

2021 online Cover Image

Nonembeddability of persistence diagrams with p>2 Wasserstein metric

by Alexander Wagner

Proceedings of the American Mathematical Society, 06/2021, Volume 149, Issue 6

Persistence diagrams do not admit an inner product structure compatible with any Wasserstein metric. Hence, when applying kernel methods to persistence...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Ico


Stability of Gibbs Posteriors from the Wasserstein Loss for Bayesian Full Waveform Inversion

Matthew M. Dunlop and Yunan Yang

SIAM/ASA Journal on Uncertainty Quantification, Vol. 9, No. 4, pp. 1499–1526, 2021
PDF
  Abstract

Classification of atomic environments via the Gromov–Wasserstein distance

S Kawano, JK Mason - Computational Materials Science, 2021 - Elsevier

Interpreting molecular dynamics simulations usually involves automated classification of

local atomic environments to identify regions of interest. Existing approaches are generally

limited to a small number of reference structures and only include limited information about …

   Related articles All 4 versions 


online Cover Image

A deep learning-based approach for direct PET attenuation correction using Wasserstein generative...

by Yongchang Li; Wei Wu

Journal of physics. Conference series, 04/2021, Volume 1848, Issue 1

Positron emission tomography (PET) in some clinical assistant diagnose demands attenuation correction (AC) and scatter correction (SC) to obtain high-quality...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

  Related articles All 3 versions

online OPEN ACCESS

A new perspective on Wasserstein distances for kinetic problems

by Iacobelli, Mikaela

04/2021

We introduce a new class of Wasserstein-type distances specifically designed to tackle questions concerning stability and convergence to equilibria for kinetic...

Journal ArticleFull Text Online

Includes 1 ВАСЕРШТЕЙН and 2 titles with Vaserstein

Related articles All 4 versions

<——2021———2021———700——


online Cover Image  OPEN ACCESS

Wasserstein Metric-Based Location Spoofing Attack Detection in WiFi Positioning Systems

by Tian, Yinghua; Zheng, Nae; Chen, Xiang ; More...

Security and communication networks, 04/2021, Volume 2021

WiFi positioning systems (WPS) have been introduced as parts of 5G location services (LCS) to provide fast positioning results of user devices in urban areas....

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

  Cited by 1 Related articles All 7 versions


2021 online  OPEN ACCESS

Object shape regression using wasserstein distance

by Palo Alto Research Center Incorporated

03/2021

One embodiment can provide a system for detecting outlines of objects in images. During operation, the system receives an image that includes at least one...

PatentAvailable Online

 

online

Wasserstein Generative Adversarial Networks for Realistic Traffic Sign Image Generation

by Dewi, Christine; Chen, Rung-Ching; Liu, Yan-Ting

Intelligent Information and Database Systems, 04/2021

Recently, Convolutional neural networks (CNN) with properly annotated training data and results will obtain the best traffic sign detection (TSD) and traffic...

Book ChapterFull Text Online

 Cited by 5 Related articles All 2 versions


online

Findings from Korea University Has Provided New Data on Information Technology (A Data-driven Event Generator for Hadron Colliders Using Wasserstein...

Information Technology Newsweekly, 03/2021

NewsletterFull Text Online

 Cited by 6 Related articles All 4 versions

online

Researchers from University of Nevada Report Details of New Studies and Findings in the Area of Statistics (Convergence Rate To Equilibrium In Wasserstein...

Mathematics Week, 03/2021

NewsletterFull Text Online

2021


2021

[PDF] arxiv.org

Robust Graph Learning Under Wasserstein Uncertainty

X Zhang, Y Xu, Q Liu, Z Liu, J Lu, Q Wang - arXiv preprint arXiv …, 2021 - arxiv.org

Graphs are playing a crucial role in different fields since they are powerful tools to unveil

intrinsic relationships among signals. In many scenarios, an accurate graph structure

representing signals is not available at all and that motivates people to learn a reliable …

  All 2 versions 

 

online

Researchers at University of Chile Release New Data on Circuits and Signal Processing (The Wasserstein-f...

Electronics Newsweekly, 03/2021

NewsletterFull Text Online


online

Data from School of Traffic and Transportation Engineering Provide New Insights into Intelligent Systems (Short-term Railway Passenger Demand Forecast Using Improved Wasserstein...

Robotics & Machine Learning, 03/2021

NewsletterFull Text Online

online

Researchers from Stanford University Report New Studies and Findings in the Area of Medical Imaging...

Health & Medicine Week, 03/2021

NewsletterFull Text Online


online

US Patent Issued to Palo Alto Research Center on March 9 for "Object shape regression using wasserst...

US Fed News Service, Including US State News, Mar 10, 2021

Newspaper ArticleFull Text Online


<——2021———2021———710—

2021 patent news

2021 patent online Cover Image PEER-REVIEW

Wasserstein Proximal Algorithms for the Schrödinger Bridge Problem: Density Control with Nonlinear Drift

by Caluya, Kenneth; Halder, Abhishek

IEEE transactions on automatic control, 02/2021

We study the Schrödinger bridge problem (SBP) with nonlinear prior dynamics. In control-theoretic language, this is a problem of minimum effort steering of...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

News results for "(TitleCombined:(wasserstein))"

Palo Alto Research Center Obtains Patent for Object Shape Regression Using Wasserstein Distance

Global IP News: Optics & Imaging Patent News, Mar 9, 2021

Newspaper ArticleCitation Online


Palo Alto Research Center Obtains Patent for Object Shape Regression Using Wasserstein Distance

Pedia Content Solutions Pvt. Ltd, Mar 9, 2021

 Cited by 1 Related articles All 4 versions
 
 

Controlling Wasserstein distances by Kernel norms with application to Compressive Statistical Learning

T Vayer, R Gribonval - arXiv preprint arXiv:2112.00423, 2021 - arxiv.org

Comparing probability distributions is at the crux of many machine learning algorithms.

Maximum Mean Discrepancies (MMD) and Optimal Transport distances (OT) are two

classes of distances between probability measures that have attracted abundant attention in

past years. This paper establishes some conditions under which the Wasserstein distance

can be controlled by MMD norms. Our work is motivated by the compressive statistical

learning (CSL) theory, a general framework for resource-efficient large scale learning in …

 All 4 versions 

online  OPEN ACCESS

Controlling Wasserstein distances by Kernel norms with application to Compressive Statistical Learning

by Vayer, Titouan; Gribonval, Rémi

12/2021

Comparing probability distributions is at the crux of many machine learning algorithms. Maximum Mean Discrepancies (MMD) and Optimal Transport distances (OT)...

Journal ArticleFull Text Online

 

Wasserstein GANs Work Because They Fail (to Approximate the Wasserstein Distance)

Simons Institute, YouTube (www.youtube.com › watch)

Streamed live on Oct 27, 2021; 1,235 views

online OPEN ACCESS

Wasserstein GANs Work Because They Fail (to Approximate the Wasserstein Distance)

by Stanczuk, Jan; Etmann, Christian; Kreusser, Lisa Maria ; More...

03/2021

Wasserstein GANs are based on the idea of minimising the Wasserstein distance between a real and a generated distribution. We provide an in-depth mathematical...

Journal ArticleFull Text Online

Cited by 19 Related articles All 6 versions
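
For context on what these WGAN entries refer to, the training objective comes from the Kantorovich-Rubinstein dual of the 1-Wasserstein distance: a critic constrained to be (approximately) 1-Lipschitz is trained to separate real from generated samples. The PyTorch fragment below is a minimal generic sketch with a toy critic and random stand-in batches; it is not code from, and does not reproduce, the cited talk or preprint.

    import torch
    import torch.nn as nn

    critic = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))  # toy critic
    real = torch.randn(128, 2) + 2.0          # stand-in "real" batch
    fake = torch.randn(128, 2)                # stand-in "generated" batch

    # Dual form: the critic maximizes E[f(real)] - E[f(fake)], so it minimizes the negative.
    critic_loss = critic(fake).mean() - critic(real).mean()
    # The generator (not shown) minimizes -E[f(fake)], i.e. tries to raise its critic score.
    gen_loss = -critic(fake).mean()
    print(float(critic_loss), float(gen_loss))

The Lipschitz constraint is what ties the critic value to an actual Wasserstein estimate; the paper above argues that, in practice, the trained critic does not approximate that distance well, so the sketch should be read as the intended objective rather than what training achieves.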

2021

Computationally Efficient Wasserstein Loss for Structured Labels

A Toyokuni, S Yokoi, H Kashima, M Yamada - arXiv preprint arXiv …, 2021 - arxiv.org

The problem of estimating the probability distribution of labels has been widely studied as a

label distribution learning (LDL) problem, whose applications include age estimation,

emotion analysis, and semantic segmentation. We propose a tree-Wasserstein distance

regularized LDL algorithm, focusing on hierarchical text classification tasks. We propose

predicting the entire label hierarchy using neural networks, where the similarity between

predicted and true labels is measured using the tree-Wasserstein distance. Through …

  All 3 versions 

online OPEN ACCESS

Computationally Efficient Wasserstein Loss for Structured Labels

by Toyokuni, Ayato; Yokoi, Sho; Kashima, Hisashi ; More...

03/2021

The problem of estimating the probability distribution of labels has been widely studied as a label distribution learning (LDL) problem, whose applications...

Journal ArticleFull Text Online
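
The tree-Wasserstein distance mentioned in this entry has a well-known closed form: a weighted sum, over tree edges, of the absolute difference in probability mass carried by the subtree below each edge. The sketch below implements that generic formula on a tiny hypothetical label tree (nodes indexed so that children come after their parents); it is not the authors' label-distribution-learning code.

    import numpy as np

    def tree_wasserstein(parent, edge_len, mu, nu):
        """1-Wasserstein distance between distributions mu, nu on a rooted tree.

        Assumes node 0 is the root and every child has a larger index than its parent.
        """
        diff = np.asarray(mu, dtype=float) - np.asarray(nu, dtype=float)
        for v in range(len(parent) - 1, 0, -1):      # push subtree mass up toward the root
            diff[parent[v]] += diff[v]
        # after the sweep, diff[v] is the net mass that must cross edge (v, parent[v])
        return sum(edge_len[v] * abs(diff[v]) for v in range(1, len(parent)))

    parent   = [0, 0, 0, 1, 1]                       # toy hierarchy of 5 labels
    edge_len = [0.0, 1.0, 1.0, 0.5, 0.5]
    mu = [0.0, 0.0, 0.5, 0.25, 0.25]                 # predicted label distribution
    nu = [0.0, 0.5, 0.5, 0.0, 0.0]                   # true label distribution
    print(tree_wasserstein(parent, edge_len, mu, nu))   # 0.25 on this toy example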

online Cover Image

Second-Order Conic Programming Approach for Wasserstein Distributionally Robust Two-Stage...

by Wang, Zhuolin; You, Keyou; Song, Shiji ; More...

IEEE transactions on automation science and engineering, 02/2021

This article proposes a second-order conic programming (SOCP) approach to solve distributionally robust two-stage linear programs over 1-Wasserstein balls. We...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon
 Cited by 4 Related articles All 5 versions


2021



Face Image Generation for Illustration by WGAN-GP Using Landmark Information
Authors: Miho Takahashi, Hiroshi Watanabe. 2021 IEEE 10th Global Conference on Consumer Electronics (GCCE)
Summary: With the spread of social networking services, face images for illustration are being used in a variety of situations. Attempts have been made to create illustration face images using adversarial generative networks, but the quality of the images has not been sufficient. It would be much easier to generate face images for illustrations if they could be generated by simply specifying the shape and expression of the face. Also, if images can be generated using landmark information, which is the location of the eyes, nose, and mouth of a face, it will be possible to capture and learn the features of the face. Therefore, in this paper, we propose a method to generate face images for illustration using landmark information. Our method can learn the location of landmarks and produce high-quality images for the creation of illustration face images.
Chapter, 2021
Publication:2021 IEEE 10th Global Conference on Consumer Electronics (GCCE), 20211012, 936
Publisher:2021
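
Since WGAN-GP appears in several of the entries above, the fragment below sketches only the gradient-penalty term that gives the method its name: the critic's gradient norm is pushed toward 1 on random interpolates between real and generated samples. The toy critic and random tensors are placeholders for illustration, not the image networks used in the cited chapter.

    import torch
    import torch.nn as nn

    critic = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))  # toy critic
    real, fake = torch.randn(64, 2) + 1.0, torch.randn(64, 2)              # stand-in batches

    eps = torch.rand(real.size(0), 1)                                      # mixing coefficients
    x_hat = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    grad = torch.autograd.grad(critic(x_hat).sum(), x_hat, create_graph=True)[0]
    gradient_penalty = ((grad.norm(2, dim=1) - 1.0) ** 2).mean()           # push |grad| toward 1
    critic_loss = critic(fake).mean() - critic(real).mean() + 10.0 * gradient_penalty
    print(float(critic_loss))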

Conditional Wasserstein Generative Adversarial Networks for Fast Detector Simulation
Blue, John; Kronheim, Braden; Kuchera, Michelle; Ramanujan, Raghuram. EPJ Web of Conferences; Les Ulis,  Vol. 251, (2021).

Abstract/DetailsFull text - PDF (548 KB)‎

  All 3 versions


[PDF] arxiv.org

Distributionally robust second-order stochastic dominance constrained optimization with Wasserstein distance

Y Mei, J Liu, Z Chen - arXiv preprint arXiv:2101.00838, 2021 - arxiv.org

We consider a distributionally robust second-order stochastic dominance constrained optimization problem, where the true distribution of the uncertain parameters is ambiguous. The ambiguity set contains all probability distributions close to the empirical distribution …

  Related articles All 4 versions 

Unsupervised band selection for hyperspectral image classification using the Wasserstein metric-based configuration entropy

Z Hong, WU Zhiwei, W Jicheng… - Acta Geodaetica et … - xb.sinomaps.com

Band selection relies on the quantification of band information. Conventional measurements

such as Shannon entropy only consider the composition information (eg, types and ratios of

pixels) but ignore the configuration information (eg, the spatial distribution of pixels). The  …

  All 2 versions 

online OPEN ACCESS

Unsupervised band selection for hyperspectral image classification using the Wasserstein metric-based...

by ZHANG Hong; WU Zhiwei; WANG Jicheng ; More...

Ce hui xue bao, 03/2021, Volume 50, Issue 3

Band selection relies on the quantification of band information. Conventional measurements such as Shannon entropy only consider the composition information...

Journal ArticleFull Text Online

All 2 versions 
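
As a generic illustration of the kind of band-to-band comparison such work relies on (not the configuration-entropy measure proposed in the paper), SciPy's one-dimensional Wasserstein distance can be applied directly to the pixel values of two bands; the reflectance samples below are made up.

    import numpy as np
    from scipy.stats import wasserstein_distance

    rng = np.random.default_rng(1)
    band_a = rng.normal(0.30, 0.05, size=10_000)   # toy reflectance values, band A
    band_b = rng.normal(0.45, 0.08, size=10_000)   # toy reflectance values, band B
    print(wasserstein_distance(band_a, band_b))    # larger value = more dissimilar bands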



 OPEN ACCESS

Additional file 4 of CHICKN: extraction of peptide chromatographic elution profiles from large scale mass spectrometry data by means of Wasserstein...

by Permiakova, Olga; Guibert, Romain; Kraut, Alexandra ; More...

02/2021

Additional file 4: Sketch size influence on the clustering. Influence of the sketch size on performances clustering of the Ecoli-DIA dataset, in function of...

ImageCitation Online

 OPEN ACCESS

Additional file 7 of CHICKN: extraction of peptide chromatographic elution profiles from large scale mass spectrometry data by means of Wasserstein

compressive hierarchical cluster analysis

by Permiakova, Olga; Guibert, Romain; Kraut, Alexandra ; More...

02/2021

Additional file 7: Influence of k on the execution time of CHICKN. Figure depicting CHICKN execution time as a function of k, the number of clusters at each...

ImageCitation Online

 Related articles All 23 versions

<——2021———2021———720——


online

Reports on Potential Analysis from New York University Provide New Insights (Asymptotics of Smoothed Wasserstein...

Mathematics Week, 03/2021

NewsletterFull Text Online

online Cover Image  PEER-REVIEW

Wasserstein $F$-tests and confidence bands for the Fréchet regression of density response curves

by Petersen, Alexander; Liu, Xi; Divani, Afshin A

The Annals of statistics, 02/2021, Volume 49, Issue 1

Data consisting of samples of probability density functions are increasingly prevalent, necessitating the development of methodologies for their analysis that...

Article Link Read Article BrowZine Article Link Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 

Reconstruction Method for Missing Measurement Data Based on Wasserstein Generative Adversarial Network
Authors: Changfan Zhang, Hongrun Chen, Jing He, Haonan Yang
Article
Publication:Journal of advanced computational intelligence and intelligent informatics., 25, 2021-3, 195

 

2021 Cover Image PEER-REVIEW

Busemann functions on the Wasserstein space

by Zhu, Guomin; Li, Wen-Long; Cui, Xiaojun

Calculus of variations and partial differential equations, 06/2021, Volume 60, Issue 3

We study rays and co-rays in the Wasserstein space () whose ambient space is a complete, separable, non-compact, locally compact length space. We show that...

Journal ArticleCitation Online

Zhu, GuominLi, Wen-LongCui, Xiaojun

Busemann functions on the Wasserstein space. (English) Zbl 07344745

Calc. Var. Partial Differ. Equ. 60, No. 3, Paper No. 97, 16 p. (2021).

MSC:  58E10 60B10 60H30


Peer-reviewed
Busemann functions on the Wasserstein space
Authors: Guomin Zhu, Wen-Long Li, Xiaojun Cui
Summary: We study rays and co-rays in the Wasserstein space whose ambient space is a complete, separable, non-compact, locally compact length space. We show that rays in the Wasserstein space can be represented as probability measures concentrated on the set of rays in the ambient space. We show the existence of co-rays for any prescribed initial probability measure. We introduce Busemann functions on the Wasserstein space and show that co-rays are negative gradient lines in some sense.
Article, 2021
Publication:Calculus of Variations and Partial Differential Equations, 60, 20210427
Publisher:2021

2021 online Cover Image

 PEER-REVIEW OPEN ACCESS

2D Wasserstein loss for robust facial landmark detection

by Yan, Yongzhe; Duffner, Stefan; Phutane, Priyanka ; More...

Pattern recognition, 08/2021, Volume 116

•Rethink the problem of robust facial landmark detection between the reaserch and the practical use.•Novel method based on the Wasserstein loss to...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 2021

 

2021 see 2020  see 2018 arXiv online Cover Image

 PEER-REVIEW OPEN ACCESS

Wasserstein Distributionally Robust Stochastic Control: A Data-Driven Approach

by Yang, Insoon

IEEE transactions on automatic control, 08/2021, Volume 66, Issue 8

Standard stochastic control methods assume that the probability distribution of uncertain variables is available. Unfortunately, in practice, obtaining...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 

 

online Cover Image PEER-REVIEW OPEN ACCESS

WRGAN: Improvement of RelGAN with Wasserstein Loss for Text Generation

by Ziyun Jiao; Fuji Ren

Electronics (Basel), 01/2021, Volume 10, Issue 3

Generative adversarial networks (GANs) were first proposed in 2014, and have been widely used in computer vision, such as for image generation and other tasks....

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

Cited by 3 Related articles All 2 versions

online Cover Image  PEER-REVIEW OPEN ACCESS

Wasserstein Divergence GAN With Cross-Age Identity Expert and Attribute Retainer for Facial Age...

by Hsu, Gee-Sern; Xie, Rui-Cang; Chen, Zhi-Ting

IEEE access, 2021, Volume 9

We propose the Wasserstein Divergence GAN with an identity expert and an attribute retainer for facial age transformation. The Wasserstein Divergence GAN...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

Related articles All 2 versions

online

Cover Image  PEER-REVIEW OPEN ACCESS

A New Data-Driven Distributionally Robust Portfolio Optimization Method Based on Wasserstein...

by Du, Ningning; Liu, Yankui; Liu, Ying

IEEE access, 2021, Volume 9

Since optimal portfolio strategy depends heavily on the distribution of uncertain returns, this article proposes a new method for the portfolio optimization...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

online Cover Image PEER-REVIEW OPEN ACCESS

Breast Cancer Histopathology Image Super-Resolution Using Wide-Attention GAN With Improved Wasserstein...

by Shahidi, Faezehsadat

IEEE access, 2021, Volume 9

In the realm of image processing, enhancing the quality of the images is known as a super-resolution problem (SR). Among SR methods, a super-resolution...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

Cited by 10 Related articles All 2 versions

<——2021———2021———730——


online Cover Image

 PEER-REVIEW OPEN ACCESS

Unpaired Image Denoising via Wasserstein GAN in Low-Dose CT Image with Multi-Perceptual Loss...

by Zhixian Yin; Kewen Xia; Ziping He ; More...

Symmetry (Basel), 01/2021, Volume 13, Issue 1

The use of low-dose computed tomography (LDCT) in medical practice can effectively reduce the radiation risk of patients, but it may increase noise and...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 Cited by 13 Related articles All 3 versions


online  OPEN ACCESS

Projection Robust Wasserstein Barycenters

by Huang, Minhui; Ma, Shiqian; Lai, Lifeng

02/2021

Collecting and aggregating information from several probability measures or histograms is a fundamental task in machine learning. One of the popular solution...

Journal ArticleFull Text Online

Cited by 5 Related articles All 7 versions

online OPEN ACCESS

Learning High Dimensional Wasserstein Geodesics

by Liu, Shu; Ma, Shaojun; Chen, Yongxin ; More...

02/2021

We propose a new formulation and learning strategy for computing the Wasserstein geodesic between two probability distributions in high dimensions. By applying...

Journal ArticleFull Text Online

Cited by 4 Related articles All 3 versions
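
In one dimension the Wasserstein geodesic (displacement interpolation) between two measures can be written down directly by interpolating quantile functions, which makes a handy sanity check when reading papers like the one above; the preprint's high-dimensional neural parametrization is, of course, not reproduced here, and the sample data below are invented.

    import numpy as np

    def geodesic_quantiles(samples0, samples1, t, grid_size=200):
        """Quantile function of the time-t point on the 1-D W2 geodesic."""
        u = (np.arange(grid_size) + 0.5) / grid_size
        q0, q1 = np.quantile(samples0, u), np.quantile(samples1, u)
        return (1.0 - t) * q0 + t * q1            # linear interpolation of quantile functions

    rng = np.random.default_rng(3)
    mu0 = rng.normal(-3.0, 0.5, size=2000)
    mu1 = rng.normal(2.0, 1.5, size=2000)
    print(geodesic_quantiles(mu0, mu1, 0.5)[:5])  # midpoint of the geodesic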

 

online  OPEN ACCESS

Wasserstein diffusion on graphs with missing attributes

by Chen, Zhixian; Ma, Tengfei; Song, Yangqiu ; More...

02/2021

Missing node attributes is a common problem in real-world graphs. Graph neural networks have been demonstrated powerful in graph representation learning,...

Journal ArticleFull Text Online

 Cite Cited by 3 Related articles All 2 versions

2021


2021 see 2022  online OPEN ACCESS arXiv

Wasserstein barycenters are NP-hard to compute

by Altschuler, Jason M; Boix-Adsera, Enric

01/2021

The problem of computing Wasserstein barycenters (a.k.a. Optimal Transport barycenters) has attracted considerable recent attention due to many applications in...

Journal ArticleFull Text Online

Cited by 9 Related articles All 2 versions

[PDF] arxiv.org

Wasserstein barycenters are NP-hard to compute

JM Altschuler, E Boix-Adsera - arXiv preprint arXiv:2101.01100, 2021 - arxiv.org

The problem of computing Wasserstein barycenters (aka Optimal Transport barycenters) has

attracted considerable recent attention due to many applications in data science. While there

exist polynomial-time algorithms in any fixed dimension, all known runtimes suffer …

   Related articles All 2 versions 

 2021

arXiv:2102.03883  [pdf, ps, other]  math.KT
A note on relative Vaserstein symbol
Authors: Kuntal Chakraborty
Abstract: In an unpublished work of Fasel-Rao-Swan the notion of the relative Witt group W_E(R,I) is defined. In this article we will give the details of this construction. Then we studied the injectivity of the relative Vaserstein symbol …. We established injectivity of this symbol if R is an affine non-singular algebra of dimension 3 over a perfect…  More
Submitted 7 February, 2021; originally announced February 2021.
Comments: 26 pages

Related articles All 3 versions 

Selective Multi-source Transfer Learning with Wasserstein Domain Distance for Financial...
by Sun, YifuLan, LijunZhao, Xueyao ; More...
Intelligent Computing and Block Chain, 03/2021
As financial enterprises have moved their services to the internet, financial fraud detection has become an ever-growing problem causing severe economic losses...
Book Chapter  Full Text Online

 

online

SRWGANTV: Image Super-Resolution Through Wasserstein Generative Adversarial Networks with...

by Shao, Jun; Chen, Liang; Wu, Yi

2021 IEEE 13th International Conference on Computer Research and Development (ICCRD), 01/2021

The study of generative adversarial networks (GAN) has enormously promoted the research work on single image super-resolution (SISR) problem. SRGAN firstly...

Conference ProceedingFull Text Online

SRWGANTV: Image Super-Resolution Through Wasserstein Generative Adversarial Networks with Total Variational Regularization

J Shao, L Chen, Y Wu - 2021 IEEE 13th International …, 2021 - ieeexplore.ieee.org

The study of generative adversarial networks (GAN) has enormously promoted the research

work on single image super-resolution (SISR) problem. SRGAN firstly apply GAN to SISR

reconstruction, which has achieved good results. However, SRGAN sacrifices the fidelity. At …
Related articles


online OPEN ACCESS

Convergence and finite sample approximations of entropic regularized Wasserstein distances in...

by Quang, Minh Ha

01/2021

This work studies the convergence and finite sample approximations of entropic regularized Wasserstein distances in the Hilbert space setting. Our first main...

Journal ArticleFull Text Online

Cite  Related articles All 3 versions 
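
The entropic regularization referred to in this entry is usually computed with Sinkhorn iterations. The sketch below is a standard finite-dimensional Sinkhorn loop on two toy histograms with a squared-distance cost; the Hilbert-space analysis of the preprint is not reflected here.

    import numpy as np

    def sinkhorn_cost(a, b, C, reg=0.1, n_iter=500):
        """Transport cost of the entropic-regularized plan between histograms a and b."""
        K = np.exp(-C / reg)
        u = np.ones_like(a)
        for _ in range(n_iter):
            v = b / (K.T @ u)                      # alternate scaling updates
            u = a / (K @ v)
        P = u[:, None] * K * v[None, :]            # approximate optimal transport plan
        return float(np.sum(P * C))

    x = np.linspace(0.0, 1.0, 50)
    a = np.exp(-((x - 0.3) ** 2) / 0.01); a /= a.sum()
    b = np.exp(-((x - 0.7) ** 2) / 0.02); b /= b.sum()
    C = (x[:, None] - x[None, :]) ** 2             # squared-distance cost matrix
    print(sinkhorn_cost(a, b, C))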

<——2021———2021———740——e


 Distributionally robust second-order stochastic dominance constrained optimization with Wasserstein distance

Y Mei, J Liu, Z Chen - arXiv preprint arXiv:2101.00838, 2021 - arxiv.org

We consider a distributionally robust second-order stochastic dominance constrained

optimization problem, where the true distribution of the uncertain parameters is ambiguous.

The ambiguity set contains all probability distributions close to the empirical distribution …

  All 2 versions

online OPEN ACCESS

Distributionally robust second-order stochastic dominance constrained optimization with Wasserst...

by Mei, Yu; Liu, Jia; Chen, Zhiping

01/2021

We consider a distributionally robust second-order stochastic dominance constrained optimization problem, where the true distribution of the uncertain...

Journal ArticleFull Text Online

Related articles All 4 versions 

Distributionally robust second-order stochastic dominance constrained optimization with Wasserstein ball
Yu, Mei; Liu, Jia; Chen, Zhiping. arXiv.org; Ithaca, Oct 19, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window
Related articles
 All 4 versions 


online Cover Image

DPIR-Net: Direct PET image reconstruction based on the Wasserstein generative adversarial network

by Hu, Zhanli; Xue, Hengzhi; Zhang, Qiyang ; More...

IEEE transactions on radiation and plasma medical sciences, 05/2020, Volume 5, Issue 1

Positron emission tomography (PET) is an advanced medical imaging technique widely used in various clinical applications, such as tumor detection and...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 

online Cover Image  OPEN ACCESS

Wasserstein distance-based probabilistic linguistic TODIM method with application to the evaluation of...

by Zhang, Shitao; Wu, Zhangjiao; Ma, Zhenzhen ; More...

Ekonomska istraživanja, Volume ahead-of-print, Issue ahead-of-print

The evaluation of sustainable rural tourism potential is a key work in sustainable rural tourism development. Due to the complexity of the rural tourism...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

Cited by 2 Related articles All 2 versions

Cover Image

Reconstruction Method for Missing Measurement Data Based on Wasserstein Generative Adversarial...

by Zhang Changfan; Chen Hongrun; He Jing ; More...

Journal of Advanced Computational Intelligence and Intelligent Informatics, 2021, Volume 25, Issue 2

Focusing on the issue of missing measurement data caused by complex and changeable working conditions during the operation of high-speed trains, in this paper,...

Journal ArticleCitation Online

 Cited by 2 Related articles All 3 versions


 OPEN ACCESS

Experiment data in support of "Segmentation analysis and the recovery of queuing parameters via the Wasserstein...

by Wilde, Henry; Knight, Vincent; Gillard, Jonathan

01/2021

This archive contains a ZIP archive, `data.zip`, that itself contains the data used in the final sections of the paper. The remainder of the paper's supporting...

Data Set Citation Online

 

 2021

 

PDF) Evaluating the Performance of Climate Models Based ...

https://www.researchgate.net › ... › Climate Modeling

Mar 21, 2021 — Evaluating the Performance of Climate Models Based on Wasserstein Distance. October 2020; Geophysical Research Letters 47(21).

online

New Climate Modeling Study Findings Reported from University of Hamburg 

(Evaluating the Performance of Climate Models Based On Wasserstein Distance)

Global Warming Focus, 01/2021

NewsletterFull Text Online

 

online

New Mathematical Sciences Data Have Been Reported by Researchers at University of Birmingham (Wasserstein...

Mathematics Week, 01/2021

NewsletterFull Text Online

 

2021 see 2020  online

Studies from South China University of Technology Have Provided New Data on Information Technology 

(Data-driven Distributionally Robust Unit Commitment With Wasserstein...

Information Technology Newsweekly, 01/2021

Newsletter Full Text Online

 

 OPEN ACCESS

一种采用Wasserstein距离的差分隐私贪心分组方法

02/2021 ...

PatentCitation Online

[Chinese  A Differential Privacy Greedy Grouping Method Using Wasserstein Distance'

 OPEN ACCESS

一种基于Wasserstein距离的深度对抗迁移网络的故障诊断方法

02/2021 ...

PatentCitation Online

[Chinese  A Fault Diagnosis Method Based on Wasserstein Distance for Deep Confrontation Migration Network]

<——2021———2021———750——


 OPEN ACCESS

LEARNING METHOD AND LEARNING DEVICE FOR HIGH-DIMENSION UNSUPERVISED ANOMALY DETECTION USING 

KERNALIZED WASSERSTEIN...

by PAIK MYUNGHEE CHO; CHANG HYUN WOONG; KIM YOUNG GEUN

01/2021

크리스토펠 함수의 과다한 연산량을 커널화 와서스타인 오토인코더를 이용하여 개선한 고차원 비지도 이상 탐지 학습 방법이 개시된다. (a) 학습 장치가, 복수 개의 성분을 가진 적어도 하나의 학습 데이터 매트릭스가 획득되면, 적어도 하나의 와서스타인 인코딩 네트워크로 하여금, 각각의...

[Korean: A high-dimensional unsupervised anomaly-detection learning method is disclosed that uses a kernelized Wasserstein autoencoder to reduce the excessive computation of the Christoffel function. (a) When the learning device obtains at least one training data matrix having a plurality of components, it causes at least one Wasserstein encoding network to, for each...]

PatentCitation Online

 

online OPEN ACCESS

A note on relative Vaserstein symbol

by Chakraborty, Kuntal

02/2021

In an unpublished work of Fasel-Rao-Swan the notion of the relative Witt group $W_E(R,I)$ is defined. In this article we will give the details of this...

Journal ArticleFull Text Online

[PDF] arxiv.org

A note on relative Vaserstein symbol

K Chakraborty - arXiv preprint arXiv:2102.03883, 2021 - arxiv.org

… of Vaserstein symbol. We have considered two cases: One injectivity of the Vaserstein … The next is injectivity of the Vaserstein symbol V_{R[X],(X²−X)} where R is an affine algebra of …


2021 online Cover Image  PEER-REVIEW OPEN ACCESS

Low-Illumination Image Enhancement in the Space Environment Based on the DC-WGAN Algorithm

by Minglu Zhang; Yan Zhang; Zhihong Jiang ; More...

Sensors (Basel, Switzerland), 01/2021, Volume 21, Issue 1

Owing to insufficient illumination of the space station, the image information collected by the intelligent robot will be degraded, and it will not be able to...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 Related articles All 7 versions 

 

online Cover Image

IE-WGAN: A model with Identity Extracting for Face Frontal Synthesis

by Yanrui Zhu; Yonghong Chen; Yuxin Ren

Journal of physics. Conference series, 03/2021, Volume 1861, Issue 1

Face pose affects the effect of the frontal face synthesis, we propose a model used for frontal face synthesis based on WGAN-GP. This model includes identity...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 Related articles All 3 versions

online  OPEN ACCESS

Automatic Visual Inspection of Rare Defects: A Framework based on GP-WGAN and Enhanced...

by Jalayer, Masoud; Jalayer, Reza; Kaboli, Amin ; More...

05/2021

A current trend in industries such as semiconductors and foundry is to shift their visual inspection processes to Automatic Visual Inspection (AVI) systems, to...

Journal ArticleFull Text Online

  Cited by 4 Related articles All 5 versions

2021

 

online  OPEN ACCESS

Rényi 차분 프라이버시를 적용한 WGAN 모델 연구

by 이수진(Sujin Lee); 박철희(Cheolhee Park); 홍도원(Dowon Hong) ; More...

Chŏngbo Kwahakhoe nonmunji, 2021, Volume 48, Issue 1

Personal data is collected through various services and managers extract values from the collected data and provide individually customized services by...

Journal ArticleFull Text Online

[Korean  Rényi Differential Privacy Applied WGAN Model Study]

 

online OPEN ACCESS

WGAN 성능개선을 위한 효과적인 정칙항 제안

by 한희일(Hee Il Hahn)

멀티미디어학회논문지, 2021, Volume 24, Issue 1

A Wasserstein GAN(WGAN), optimum in terms of minimizing Wasserstein distance, still suffers from inconsistent convergence or unexpected output due to inherent...

Journal ArticleFull Text Online

[Korean  Proposal of effective regular terms for improving the performance of WGAN]


2021 patent

 OPEN ACCESS    patent 

一种基于WGAN网络进行图像修复的方法

03/2021 ...

PatentCitation Online

[Chinese  A method of image restoration based on WGAN network]

 OPEN ACCESS   patent

结合自编码器和WGAN的网络攻击流量数据增强方法及系统

04/2021 ...

PatentCitation Online

 [Chinese  Network attack flow data enhancement method and system combining autoencoder and WGAN]

Fault Diagnosis of Rotating Machinery Based on Wasserstein Distance and Feature Selection

F Ferracuti, A Freddi, A Monteriù… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org

This article presents a fault diagnosis algorithm for rotating machinery based on the

Wasserstein distance. Recently, the Wasserstein distance has been proposed as a new

research direction to find better distribution mapping when compared with other popular …
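
To make the general idea concrete (this is only a hedged sketch of distribution-shift monitoring, not the feature-selection pipeline of the cited article), a scalar vibration feature recorded under healthy operation can be compared against the same feature under current operation with the 1-D Wasserstein distance; the data and threshold below are invented.

    import numpy as np
    from scipy.stats import wasserstein_distance

    rng = np.random.default_rng(42)
    healthy = rng.normal(0.0, 1.0, size=5000)   # baseline feature distribution (toy)
    current = rng.normal(0.6, 1.3, size=5000)   # current feature distribution (toy)

    score = wasserstein_distance(healthy, current)
    print("drift score:", score, "-> fault suspected" if score > 0.5 else "-> normal")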

<——2021———2021———760——


2021 patent news    online

State Intellectual Property Office of China Releases Univ Jilin's Patent Application for Sketch-Photo Conversion Method Based on WGAN...

Global IP News. Information Technology Patent News, Jan 8, 2021

Newspaper ArticleFull Text Online


2021 PDF

Extreme quantile regression: a coupling approach and ...

http://doukhan.u-cergy.fr › seminary › Bobbia

Jan 27, 2021 — Extreme quantile regression: a coupling approach and. Wasserstein distance. Benjamin Bobbia. Joint work with C.Dombry and D.Varron.

online  OPEN ACCESS

Extreme quantile regression : a coupling approach and Wasserstein distance

by Bobbia, Benjamin

Université Bourgogne Franche-Comté, 2020

This work is related with the estimation of conditional extreme quantiles. More precisely, we estimate high quantiles of a real distribution conditionally to...

Dissertation/ThesisFull Text Online

 

2021

Régression quantile extrême : une approche par couplage et ...

https://www.researchgate.net › publication › 349758325_...

Mar 6, 2021 — Plus précisément, l'estimation de quantiles d'une distribution réelle en... | Find, read and ... Régression quantile extrême : une approche par couplage et distance de Wasserstein. December 2020. Authors: Benjamin Bobbia.


Data augmentation for rolling bearing fault diagnosis using an enhanced few-shot Wasserstein auto-encoder with meta-learning

Z Pei, H Jiang, X Li, J Zhang, S Liu - Measurement Science and …, 2021 - iopscience.iop.org

Despite the advance of intelligent fault diagnosis for rolling bearings, in industries, data-

driven methods still suffer from data acquisition and imbalance. We propose an enhanced

few-shot Wasserstein auto-encoder (fs-WAE) to reverse the negative effect of imbalance …

 

[PDF] ieee.org

Breast Cancer Histopathology Image Super-Resolution Using Wide-Attention GAN With Improved Wasserstein Gradient Penalty and Perceptual Loss

F Shahidi - IEEE Access, 2021 - ieeexplore.ieee.org

In the realm of image processing, enhancing the quality of the images is known as a super-

resolution problem (SR). Among SR methods, a super-resolution generative adversarial

network, or SRGAN, has been introduced to generate SR images from low-resolution …

  All 2 versions


2021

[PDF] iop.org

Optimization Research on Abnormal Diagnosis of Transformer Voiceprint Recognition based on Improved Wasserstein GAN

K Zhu, H Ma, J Wang, C Yu, C Guo… - Journal of Physics …, 2021 - iopscience.iop.org

Transformer is an important infrastructure equipment of power system, and fault monitoring

is of great significance to its operation and maintenance, which has received wide attention

and much research. However, the existing methods at home and abroad are based on …

  All 2 versions


Wasserstein distance feature alignment learning for 2D image-based 3D model retrieval

Y Zhou, Y Liu, H Zhou, W Li - Journal of Visual Communication and Image …, 2021 - Elsevier

Abstract 2D image-based 3D model retrieval has become a hotspot topic in recent years.

However, the current existing methods are limited by two aspects. Firstly, they are mostly

based on the supervised learning, which limits their application because of the high time …

  All 2 versions


[PDF] julienmalka.me

[PDF] Gromov-Wasserstein Optimal Transport for Heterogeneous Domain Adaptation

J Malka, R Flamary, N Courty - julienmalka.me

Optimal Transport distances have shown great potential these last year in tackling the

homogeneous domain adaptation problem. This works present some adaptations of the

state of the art homogeneous domain adaptations methods to work on heterogeneous …

  All 2 versions 


Unsupervised band selection for hyperspectral image classification using the Wasserstein metric-based configuration entropy

Z Hong, WU Zhiwei, W Jicheng… - Acta Geodaetica et … - xb.sinomaps.com

Band selection relies on the quantification of band information. Conventional measurements

such as Shannon entropy only consider the composition information (eg, types and ratios of

pixels) but ignore the configuration information (eg, the spatial distribution of pixels). The …

  All 2 versions 



<——2021———2021———770—

 

[HTML] springer.com

[HTML] EEG data augmentation for emotion recognition with a multiple generator conditional Wasserstein GAN

A Zhang, L Su, Y Zhang, Y Fu, L Wu, S Liang - Complex & Intelligent …, 2021 - Springer

EEG-based emotion recognition has attracted substantial attention from researchers due to

its extensive application prospects, and substantial progress has been made in feature

extraction and classification modelling from EEG data. However, insufficient high-quality …

Cited by 2 Related articles All 3 versions
[CITATION] Eeg data augmentation for emotion recognition with a multiple generator conditional Wasserstein GAN. Complex Intell Syst

A Zhang, L Su, Y Zhang, Y Fu, L Wu, S Liang - 2021

Cited by 5 Related articles All 3 versions
Journal Article  Full Text Online


[PDF] arxiv.org

A data-driven event generator for Hadron Colliders using Wasserstein Generative Adversarial Network

S Choi, JH Lim - Journal of the Korean Physical Society, 2021 - Springer

Abstract Highly reliable Monte-Carlo event generators and detector simulation programs are

important for the precision measurement in the high energy physics. Huge amounts of

computing resources are required to produce a sufficient number of simulated events …

  All 5 versions


[HTML] mdpi.com

Panchromatic Image Super-Resolution Via Self Attention-Augmented Wasserstein Generative Adversarial Network

J Du, K Cheng, Y Yu, D Wang, H Zhou - Sensors, 2021 - mdpi.com

Panchromatic (PAN) images contain abundant spatial information that is useful for earth

observation, but always suffer from low-resolution (LR) due to the sensor limitation and large-

scale view field. The current super-resolution (SR) methods based on traditional attention …

  All 6 versions 





year 2021 [PDF] toronto.edu

[PDF] The Wasserstein 1 Distance-Constructing an Optimal Map and Applications to Generative Modelling

T Milne - math.toronto.edu

Recent advances in generative modelling have shown that machine learning algorithms are

capable of generating high resolution images of fully synthetic scenes which some

researchers call “dreams” or “hallucinations” of the algorithm. Poetic language aside, one …


arXiv:2105.05677  [pdf, other]  math.AP  math.MG  math.PR
Gradient flow formulation of diffusion equations in the Wasserstein space over a metric graph
Authors: Matthias Erbar, Dominik Forkert, Jan Maas, Delio Mugnolo
Abstract: This paper contains two contributions in the study of optimal transport on metric graphs. Firstly, we prove a Benamou-Brenier formula for the Wasserstein distance, which establishes the equivalence of static and dynamical optimal transport. Secondly, in the spirit of Jordan-Kinderlehrer-Otto, we show that McKean-Vlasov equations can be formulated as gradient flow of the free energy in the Wasserst… 
More
Submitted 12 May, 2021; originally announced May 2021.
Comments: 27 pages
Cited by 3
 Related articles All 3 versions 


arXiv:2105.05655  [pdf, other]  math.PR  math.AP
Wasserstein perturbations of Markovian transition semigroups
Authors: Sven Fuhrmann, Michael Kupper, Max Nendel
Abstract: In this paper, we deal with a class of time-homogeneous continuous-time Markov processes with transition probabilities bearing a nonparametric uncertainty. The uncertainty is modeled by considering perturbations of the transition probabilities within a proximity in Wasserstein distance. As a limit over progressively finer time periods, on which the level of uncertainty scales proportionally, we ob… 
More
Submitted 12 May, 2021; originally announced May 2021.
MSC Class: Primary 60J35; 47H20; Secondary 60G65; 90C31; 62G35

Cited by 1 Related articles All 9 versions 

arXiv:2105.04402  [pdf, other]  cs.LG  cs.CV
AWCD: An Efficient Point Cloud Processing Approach via Wasserstein Curvature
Authors: Yihao Luo, Ailing Yang, Fupeng Sun, Huafei Sun
Abstract: In this paper, we introduce the adaptive Wasserstein curvature denoising (AWCD), an original processing approach for point cloud data. By collecting curvatures information from Wasserstein distance, AWCD consider more precise structures of data and preserves stability and effectiveness even for data with noise in high density. This paper contains some theoretical analysis about the Wasserstein cur… 
More
Submitted 11 May, 2021; v1 submitted 26 April, 2021; originally announced May 2021.
Comments: 13 pages, 5 figures
 Related articles All 2 versions 


arXiv:2105.04210  [pdf, other]  cs.LG  eess.SP
Robust Graph Learning Under Wasserstein Uncertainty
Authors: Xiang Zhang, Yinfei Xu, Qinghe Liu, Zhicheng Liu, Jian Lu, Qiao Wang
Abstract: Graphs are playing a crucial role in different fields since they are powerful tools to unveil intrinsic relationships among signals. In many scenarios, an accurate graph structure representing signals is not available at all and that motivates people to learn a reliable graph structure directly from observed signals. However, in real life, it is inevitable that there exists uncertainty in the obse… 
More
Submitted 12 May, 2021; v1 submitted 10 May, 2021; originally announced May 2021.
Comments: 21 pages,9 figures

  Related articles All 3 versions View as HTML 

<——2021———2021———780——


arXiv:2105.02495  [pdf, ps, other]  math.AP  math.PR
On absolutely continuous curves in the Wasserstein space over R and their representation by an optimal Markov process
Authors: Charles Boubel, Nicolas Juillet
Abstract: Let μ = (μ_t)_{t∈R} be a 1-parameter family of probability measures on R. In [13] we introduced its "Markov-quantile" process: a process X = (X_t)_{t∈R} that resembles at most the quantile process attached to μ, among the Markov processes attached to μ, i.e. whose family of marginal laws is μ. In this article we look at the case where μ is absolutely continuous in the Wasserstein spa… More
Submitted 6 May, 2021; originally announced May 2021.
Comments: arXiv admin note: substantial text overlap with arXiv:1804.10514
  Related articles All 3 versions 


arXiv:2105.01706  [pdf, other]  cs.LG  stat.ML
Sampling From the Wasserstein Barycenter
Authors: Chiheb Daaloul, Thibaut Le Gouic, Jacques Liandrat, Magali Tournus
Abstract: This work presents an algorithm to sample from the Wasserstein barycenter of absolutely continuous measures. Our method is based on the gradient flow of the multimarginal formulation of the Wasserstein barycenter, with an additive penalization to account for the marginal constraints. We prove that the minimum of this penalized multimarginal formulation is achieved for a coupling that is close to t… 
More
Submitted 4 May, 2021; originally announced May 2021.
 Cited by 2
 Related articles All 4 versions 

Transfer learning method for bearing fault diagnosis based on fully convolutional conditional Wasserstein adversarial Networks

By: Liu, Yong ZhiShi, Ke MingLi, Zhi Xuan; et al.

MEASUREMENT  Volume: ‏ 180     Article Number: 109553   Published: ‏ AUG 2021

Related articles All 2 versions



[PDF] archives-ouvertes.fr

On absolutely continuous curves in the Wasserstein space over R and their representation by an optimal Markov process

C Boubel, N Juillet - 2021 - hal.archives-ouvertes.fr

Let µ=(µt) t R be a 1-parameter family of probability measures on R. In [13] we introduced

its" Markov-quantile" process: a process X=(Xt) t R that resembles at most the quantile

process attached to µ, among the Markov processes attached to µ, ie whose family of …

  All 3 versions 


2021

SRWGANTV: Image Super-Resolution Through Wasserstein Generative Adversarial Networks with Total Variational Regularization

J Shao, L Chen, Y Wu - 2021 IEEE 13th International …, 2021 - ieeexplore.ieee.org

The study of generative adversarial networks (GAN) has enormously promoted the research

work on single image super-resolution (SISR) problem. SRGAN firstly apply GAN to SISR

reconstruction, which has achieved good results. However, SRGAN sacrifices the fidelity. At …

 Related articles


2021  see 2020

Wasserstein Autoregressive Models for Density Time Series ...

https://onlinelibrary.wiley.com › doi › abs › jtsa

by C Zhang · Cited by 4 — Data consisting of time‐indexed distributions of cross‐sectional or intraday returns have been extensively studied in finance, and provide one ... Wasserstein Autoregressive Models for Density Time Series ... First published: 17 April 2021.

2021 online Cover Imag  PEER-REVIEW

Wasserstein autoregressive models for density time series

by Zhang, Chao; Kokoszka, Piotr; Petersen, Alexander

Journal of time series analysis, 05/2021

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 

De-aliased seismic data interpolation using conditional ...

https://www.x-mol.com › paper

· Translate this page

De-aliased seismic data interpolation using conditional Wasserstein generative adversarial networks · Computers & Geosciences ( IF 2.991 ) Pub Date : 2021-05-07 , DOI: 10.1016/j.cageo.2021.104801. Qing Wei, Xiangyang Li, Mingpeng ...


online Cover Image PEER-REVIEW

De-aliased seismic data interpolation using conditional Wasserstein generative adversarial networks

by Wei, Qing; Li, Xiangyang; Song, Mingpeng

Computers & geosciences, 05/2021

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 Cited by 2 Related articles All 2 versions


Reproducibility of radiomic features using network analysis ...

https://www.spiedigitallibrary.org › 1.JMI.8.3.031904.full

by JH Oh · 2021 — J. of Medical Imaging, 8(3), 031904 (2021). https://doi.org/10.1117/1. ... Further analysis using the network-based Wasserstein k-means algorithm on ... reproducible radiomic features and use of the selected set of features can ...

online Cover Image PEER-REVIEW

Reproducibility of radiomic features using network analysis and its application in Wasserstein k-means...

Journal of medical imaging (Bellingham, Wash.), 05/2021, Volume 8, Issue 3

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 online OPEN ACCESS

Sampling From the Wasserstein Barycenter

by Daaloul, Chiheb; Gouic, Thibaut Le; Liandrat, Jacques ; More...

05/2021

This work presents an algorithm to sample from the Wasserstein barycenter of absolutely continuous measures. Our method is based on the gradient flow of the...

Journal ArticleFull Text Online

Related articles All 4 versions

<——2021———2021———790——

online  OPEN ACCESS

Robust Graph Learning Under Wasserstein Uncertainty

by Zhang, Xiang; Xu, Yinfei; Liu, Qinghe ; More...

05/2021

Graphs are playing a crucial role in different fields since they are powerful tools to unveil intrinsic relationships among signals. In many scenarios, an...

Journal ArticleFull Text Online

  

online OPEN ACCESS

On absolutely continuous curves in the Wasserstein space over R and their representation by an optimal Markov process

by Boubel, Charles; Juillet, Nicolas

05/2021

Let $\mu$ = ($\mu$t) t$\in$R be a 1-parameter family of probability measures on R. In [13] we introduced its "Markov-quantile" process: a process X = (Xt)...

Journal ArticleFull Text Online

 Cited by 1 Related articles All 4 versions


online

Report Summarizes Symbolic Computation Study Findings from Simon Fraser University 

(Wasserstein Distance To Independence Models)

Journal of Technology & Science, 05/2021

NewsletterFull Text Online

Cited by 11 Related articles All 4 versions

2021 online

New Machine Learning Study Findings Recently Were Reported by Researchers at Massachusetts Institute of Technology (Wasserstein Barycenters Can Be Computed In Polynomial Time In Fixed Dimension)

Robotics & Machine Learning, 05/2021

NewsletterFull Text Online

Wasserstein Barycenters can be Computed in Polynomial Time in Fixed Dimension

By: Altschuler, Jason M.; Boix-Adsera, Enric

JOURNAL OF MACHINE LEARNING RESEARCH  Volume: ‏ 22   Published: ‏ 2021

Zbl 07370561



[PDF] arxiv.org

Tighter expected generalization error bounds via Wasserstein distance

B Rodríguez-Gálvez, G Bassi, R Thobaben… - arXiv preprint arXiv …, 2021 - arxiv.org

In this work, we introduce several expected generalization error bounds based on the

Wasserstein distance. More precisely, we present full-dataset, single-letter, and random-

subset bounds on both the standard setting and the randomized-subsample setting from

Steinke and Zakynthinou [2020]. Moreover, we show that, when the loss function is

bounded, these bounds recover from below (and thus are tighter than) current bounds

based on the relative entropy and, for the standard setting, generate new, non-vacuous …

Cited by 14 Related articles All 7 versions


[CITATION] Tighter expected generalization error bounds via Wasserstein distance.

BR Gálvez, G Bassi, R Thobaben, M Skoglund - CoRR, 2021
Cited by 20
Related articles All 7 versions

2021


[PDF] arxiv.org

A short proof on the rate of convergence of the empirical measure for the Wasserstein distance

V Divol - arXiv preprint arXiv:2101.08126, 2021 - arxiv.org

We provide a short proof that the Wasserstein distance between the empirical measure of an $n$-sample and the estimated measure is of order $n^{-1/d}$, if the measure has a lower- and upper-bounded density on the $d$-dimensional flat torus … For $1 \le p < \infty$, let $W_p$ be the $p$-Wasserstein …

  Cited by 2 All 4 versions 
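As background for the convergence-rate entries above, a small SciPy sketch (an illustration, not Divol's proof): it computes the 1-Wasserstein distance between an empirical measure and a large reference sample from the same law, to be compared with the classical n^{-1/2} rate that holds in dimension d = 1.

import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(1)
reference = rng.uniform(size=200_000)      # stand-in for the true uniform law on [0, 1]
for n in (100, 1_000, 10_000):
    sample = rng.uniform(size=n)
    w1 = wasserstein_distance(sample, reference)
    print(f"n={n:>6}  W1 ~ {w1:.4f}   n**(-1/2) = {n ** -0.5:.4f}")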


2021 

Image Super-Resolution Algorithm Based on RRDB Model

https://ieeexplore.ieee.org › iel7


by H Li · 2021 — Super-resolution reconstruction of a single image is a tech- ... dense network, RDN), through the mutual connection and.

[CITATION] Image super-resolution reconstruction model based on RDN and WGAN

L Yida, M Xiaoxuan - Modern Electronics Technique, 2021

 Cited by 2 Related articles

MR4254496 Prelim Natarovskii, Viacheslav; Rudolf, Daniel; Sprungk, Björn; Quantitative spectral gap estimate and Wasserstein contraction of simple slice sampling. Ann. Appl. Probab. 31 (2021), no. 2, 806–825.
1 April 2021

Quantitative spectral gap estimate and Wasserstein contraction of simple slice sampling

Viacheslav Natarovskii, Daniel Rudolf, Björn Sprungk

The Annals of Applied Probability Vol. 31, Issue 2 (Apr 2021), pg(s) 806-825

KEYWORDS: slice sampling, spectral gap, Wasserstein contraction


Cited by 2 Related articles All 8 versions

arXiv:2105.09755  [pdfother]  math.PR  math.FA
Generalized Wasserstein barycenters between probability measures living on different subspaces
Authors: Julie Delon, Nathaël Gozlan, Alexandre Saint-Dizier
Abstract: In this paper, we introduce a generalization of the Wasserstein barycenter, to a case where the initial probability measures live on different subspaces of R^d. We study the existence and uniqueness of this barycenter, we show how it is related to a larger multi-marginal optimal transport problem, and we propose a dual formulation. Finally, we explain how to compute numerically this generalized ba… 
More
Submitted 20 May, 2021; originally announced May 2021.
Related articles
 All 10 versions 


arXiv:2105.09502  [pdfother]  math.NA
A continuation multiple shooting method for Wasserstein geodesic equation
Authors: Jianbo Cui, Luca Dieci, Haomin Zhou
Abstract: In this paper, we propose a numerical method to solve the classic $L^2$-optimal transport problem. Our algorithm is based on use of multiple shooting, in combination with a continuation procedure, to solve the boundary value problem associated to the transport problem. We exploit the viewpoint of Wasserstein Hamiltonian flow with initial and target densities, and our method is designed to retain t…
More
Submitted 20 May, 2021; originally announced May 2021.
Comments: 25 pages
MSC Class: 49Q22; 49M25; 65L09; 34A55

  Related articles All 2 versions 

<——2021———2021———800——


arXiv:2105.08715  [pdfother]  cs.CV  cs.AI
Human Motion Prediction Using Manifold-Aware Wasserstein GAN
Authors: Baptiste Chopin, Naima Otberdout, Mohamed Daoudi, Angela Bartolo
Abstract: Human motion prediction aims to forecast future human poses given a prior pose sequence. The discontinuity of the predicted motion and the performance deterioration in long-term horizons are still the main challenges encountered in current literature. In this work, we tackle these issues by using a compact manifold-valued representation of human motion. Specifically, we model the temporal evolutio… 
More
Submitted 18 May, 2021; originally announced May 2021.

Cited by 2 Related articles All 6 versions

Human Motion Prediction Using Manifold-Aware Wasserstein GAN

Chopin, B; Otberdout, N; (...); Bartolo, A

16th IEEE International Conference on Automatic Face and Gesture Recognition (FG) 

2021 16TH IEEE INTERNATIONAL CONFERENCE ON AUTOMATIC FACE AND GESTURE RECOGNITION (FG 2021)

Human motion prediction aims to forecast future human poses given a prior pose sequence. The discontinuity of the predicted motion and the performance deterioration in long-term horizons are still the main challenges encountered in current literature. In this work, we tackle these issues by using a compact manifold-valued representation of human motion. Specifically, we model the temporal evolu


25 References  Related records


arXiv:2105.08414  [pdfother]  math.OC   eess.SY
Data-driven distributionally robust MPC using the Wasserstein metric
Authors: Zhengang Zhong, Ehecatl Antonio del Rio-Chanona, Panagiotis Petsagkourakis
Abstract: A data-driven MPC scheme is proposed to safely control constrained stochastic linear systems using distributionally robust optimization. Distributionally robust constraints based on the Wasserstein metric are imposed to bound the state constraint violations in the presence of process disturbance. A feedback control law is solved to guarantee that the predicted states comply with constraints. The s…  More
Submitted 18 May, 2021; originally announced May 2021. 

Cited by 1 Related articles All 3 versions 

[HTML] Prediction and analysis of multiple protein lysine modified sites based on conditional wasserstein generative adversarial networks

Y Yang, H Wang, W Li, X Wang… - BMC …, 2021 - bmcbioinformatics.biomedcentral …

Protein post-translational modification (PTM) is a key issue to investigate the mechanism of

protein's function. With the rapid development of proteomics technology, a large amount of

protein sequence data has been generated, which highlights the importance of the in-depth …

 All 10 versions 


Artificial Neural Network with Histogram Data Time Series Forecasting: A Least Squares Approach Based on Wasserstein Distance

P Rakpho, W Yamaka, K Zhu - Behavioral Predictive Modeling in …, 2021 - Springer

… AR model, this model is capable of learning from the data and can process good results … Arroyo,

J.: Time series modeling of histogram-valued data: the daily histogram time series of …

MathSciNetCrossRefGoogle Scholar. 8. Irpino, A., Verde, R.: A new Wasserstein based distance …

  Related articles All 3 versions


 2021 [PDF] arxiv.org

Busemann functions on the Wasserstein space

G Zhu, WL Li, X Cui - Calculus of Variations and Partial Differential …, 2021 - Springer

We study rays and co-rays in the Wasserstein space $P_p(\mathcal{X})$ ($p>1$) whose ambient space $\mathcal{X}$ is a complete, separable, non-compact, locally compact length space. We show that rays in the Wasserstein space can be represented as probability …

   Related articles All 2 versions


2021


2021 [PDF] ams.org

Nonembeddability of persistence diagrams with p > 2 Wasserstein metric

A Wagner - Proceedings of the American Mathematical Society, 2021 - ams.org

Persistence diagrams do not admit an inner product structure compatible with any

Wasserstein metric. Hence, when applying kernel methods to persistence diagrams, the

underlying feature map necessarily causes distortion. We prove that persistence diagrams  …

 

2021 [PDF] iop.org

Wasserstein stability estimates for covariance-preconditioned Fokker–Planck equations

JA CarrilloU Vaes - Nonlinearity, 2021 - iopscience.iop.org

We study the convergence to equilibrium of the mean field PDE associated with the

derivative-free methodologies for solving inverse problems that are presented by Garbuno-

Inigo et al (2020 SIAM J. Appl. Dyn. Syst. 19 412–41), Herty and Visconti (2018 arXiv …

  Cited by 8 Related articles All 4 versions

 

 2021

The measurement of relations on belief functions based on the Kantorovich problem and the Wasserstein metric

AG Bronevich, IN Rozenberg - International Journal of Approximate …, 2021 - Elsevier

In this paper, we show how the Kantorovich problem appears in many constructions in the

theory of belief functions. We demonstrate this on several relations on belief functions such

as inclusion, equality and intersection of belief functions. Using the Kantorovich problem we …

   All 2 versions


2021

[PDF] arxiv.org

Tracial smooth functions of non-commuting variables and the free Wasserstein manifold

D Jekel, W LiD Shlyakhtenko - arXiv preprint arXiv:2101.06572, 2021 - arxiv.org

We formulate a free probabilistic analog of the Wasserstein manifold on $\mathbb{R}^d$ (the formal Riemannian manifold of smooth probability densities on $\mathbb{R}^d$), and we use it to study smooth non-commutative transport of measure. The points of the free …

   All 3 versions 



<——2021———2021———810——


2021  

Local Stability of Wasserstein GANs With Abstract Gradient Penalty

C Kim, S Park, HJ Hwang - IEEE Transactions on Neural …, 2021 - ieeexplore.ieee.org

The convergence of generative adversarial networks (GANs) has been studied substantially

in various aspects to achieve successful generative tasks. Ever since it is first proposed, the

idea has achieved many theoretical improvements by injecting an instance noise, choosing …

  All 3 versions
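Several entries on this page rely on gradient penalties for Wasserstein GAN critics. As generic background only (this is the standard WGAN-GP penalty of Gulrajani et al., not the "abstract gradient penalty" of the paper above), a hedged PyTorch sketch; it assumes flattened (batch, features) tensors and an arbitrary critic module, and the helper name is illustrative.

import torch

def gradient_penalty(critic, real, fake):
    # Penalize deviation of the critic's gradient norm from 1 on random interpolates.
    eps = torch.rand(real.size(0), 1, device=real.device)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads, = torch.autograd.grad(outputs=scores.sum(), inputs=interp, create_graph=True)
    return ((grads.norm(2, dim=1) - 1.0) ** 2).mean()

# Critic loss to minimize: critic(fake).mean() - critic(real).mean() + lam * gradient_penalty(critic, real, fake)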

2021  [PDF] mlr.press

Generalized spectral clustering via Gromov-Wasserstein learning

S ChowdhuryT Needham - International Conference on …, 2021 - proceedings.mlr.press

We establish a bridge between spectral clustering and Gromov-Wasserstein Learning

(GWL), a recent optimal transport-based approach to graph partitioning. This connection

both explains and improves upon the state-of-the-art performance of GWL. The Gromov …

  Cited by 4 Related articles All 2 versions 
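To make the Gromov-Wasserstein entries more concrete, a short sketch assuming the POT (Python Optimal Transport) package: it computes a GW coupling that registers two point clouds living in incomparable spaces. This is a generic illustration, not the spectral-clustering method of the paper above.

import numpy as np
import ot  # POT: Python Optimal Transport

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 2))                  # dataset 1 (in R^2)
Y = rng.normal(size=(40, 3))                  # dataset 2 (in R^3, a different ambient space)
C1, C2 = ot.dist(X, X), ot.dist(Y, Y)         # intra-dataset cost matrices
C1, C2 = C1 / C1.max(), C2 / C2.max()
p, q = ot.unif(len(X)), ot.unif(len(Y))       # uniform weights on the points
coupling = ot.gromov.gromov_wasserstein(C1, C2, p, q, loss_fun='square_loss')
print(coupling.shape)                         # (30, 40) soft point-to-point registration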



2021  [PDF] nber.org

Using wasserstein generative adversarial networks for the design of monte carlo simulations

S AtheyGW Imbens, J Metzger, E Munro - Journal of Econometrics, 2021 - Elsevier

When researchers develop new econometric methods it is common practice to compare the

performance of the new methods to those of existing methods in Monte Carlo studies. The

credibility of such Monte Carlo studies is often limited because of the discretion the …

  4 Related articles All 8 versions 



2021 [PDF] arxiv.org

Quantitative spectral gap estimate and Wasserstein contraction of simple slice sampling

V Natarovskii, D RudolfB Sprungk - The Annals of Applied …, 2021 - projecteuclid.org

We prove Wasserstein contraction of simple slice sampling for approximate sampling wrt

distributions with log-concave and rotational invariant Lebesgue densities. This yields, in

particular, an explicit quantitative lower bound of the spectral gap of simple slice sampling …

   Related articles All 4 versions


2021 May

[CITATION] A Wasserstein Minimax Framework for Mixed Linear Regression

T Diamandis, Y EldarA Fallah… - 38th International … - weizmann.esploro.exlibrisgroup.com


2021


Generalized Wasserstein barycenters between probability ...

https://arxiv.org › math

by J Delon · 2021 — ... introduce a generalization of the Wasserstein barycenter, to a case where the initial probability measures live on different subspaces of R^d.

online  OPEN ACCESS

Generalized Wasserstein barycenters between probability measures living on different subspaces

by Delon, Julie; Gozlan, Nathaël; Saint-Dizier, Alexandre

05/2021

In this paper, we introduce a generalization of the Wasserstein barycenter, to a case where the initial probability measures live on different subspaces of...

Journal ArticleFull Text Online

 

A continuation multiple shooting method for Wasserstein ...

https://arxiv.org › math

by J Cui · 2021 — Title:A continuation multiple shooting method for Wasserstein geodesic equation ... Our algorithm is based on use of multiple shooting, in combination with a ... We exploit the viewpoint of Wasserstein Hamiltonian flow with initial and target ... Several numerical examples are presented to illustrate the ...

online  OPEN ACCESS

A continuation multiple shooting method for Wasserstein geodesic equation

by Cui, Jianbo; Dieci, Luca; Zhou, Haomin

05/2021

In this paper, we propose a numerical method to solve the classic $L^2$-optimal transport problem. Our algorithm is based on use of multiple shooting, in...

Journal ArticleFull Text Online

 Zbl 07595091


Data-driven distributionally robust MPC using the Wasserstein metric

Z Zhong, EA del Rio-Chanona… - arXiv preprint arXiv …, 2021 - arxiv.org

A data-driven MPC scheme is proposed to safely control constrained stochastic linear

systems using distributionally robust optimization. Distributionally robust constraints based

on the Wasserstein metric are imposed to bound the state constraint violations in the

presence of process disturbance. A feedback control law is solved to guarantee that the

predicted states comply with constraints. The stochastic constraints are satisfied with regard

to the worst-case distribution within the Wasserstein ball centered at their discrete empirical …

  All 2 versions 

online OPEN ACCESS

Data-driven distributionally robust MPC using the Wasserstein metric

by Zhong, Zhengang; del Rio-Chanona, Ehecatl Antonio; Petsagkourakis, Panagiotis

05/2021

A data-driven MPC scheme is proposed to safely control constrained stochastic linear systems using distributionally robust optimization. Distributionally...

Journal ArticleFull Text Online

Cited by 8 Related articles All 3 versions

Human Motion Prediction Using Manifold-Aware Wasserstein GAN

B Chopin, N OtberdoutM Daoudi, A Bartolo - arXiv preprint arXiv …, 2021 - arxiv.org

Human motion prediction aims to forecast future human poses given a prior pose sequence.

The discontinuity of the predicted motion and the performance deterioration in long-term

horizons are still the main challenges encountered in current literature. In this work, we

tackle these issues by using a compact manifold-valued representation of human motion.

Specifically, we model the temporal evolution of the 3D human poses as trajectory, what

allows us to map human motions to single points on a sphere manifold. To learn these non …

Related articles All 2 versions 

online  OPEN ACCESS

Human Motion Prediction Using Manifold-Aware Wasserstein GAN

by Chopin, Baptiste; Otberdout, Naima; Daoudi, Mohamed ; More...

05/2021

Human motion prediction aims to forecast future human poses given a prior pose sequence. The discontinuity of the predicted motion and the performance...

Journal ArticleFull Text Online

 Cited by 4 Related articles All 8 versions


Measuring the Irregularity of Vector-Valued Morphological ...

https://link.springer.com › content › pdf

by ME Valle · 2021 — We illustrate by examples how to quantify the irregularity of vector-valued morphological operators using the Wasserstein metric. Keywords: Mathematical ...

online

Measuring the Irregularity of Vector-Valued Morphological Operators Using Wasserstein Metric

by Valle, Marcos Eduardo; Francisco, Samuel; Granero, Marco Aurélio ; More...

Discrete Geometry and Mathematical Morphology, 05/2021

Mathematical morphology is a useful theory of nonlinear operators widely used for image processing and analysis. Despite the successful application of...

Book Chapter Full Text Online

<——2021———2021———820——


The measurement of relations on belief functions based on the Kantorovich problem and the Wasserstein metric

AG Bronevich, IN Rozenberg - International Journal of Approximate …, 2021 - Elsevier

In this paper, we show how the Kantorovich problem appears in many constructions in the

theory of belief functions. We demonstrate this on several relations on belief functions such

as inclusion, equality and intersection of belief functions. Using the Kantorovich problem we …

   All 2 versions


2021

[PDF] arxiv.org

Wasserstein Robust Support Vector Machines with Fairness Constraints

Y Wang, VA NguyenGA Hanasusanto - arXiv preprint arXiv:2103.06828, 2021 - arxiv.org

We propose a distributionally robust support vector machine with a fairness constraint that

encourages the classifier to be fair in view of the equality of opportunity criterion. We use a

type-$\infty $ Wasserstein ambiguity set centered at the empirical distribution to model …

  All 4 versions 


2021

[PDF] arxiv.org

Quantitative spectral gap estimate and Wasserstein contraction of simple slice sampling

V Natarovskii, D RudolfB Sprungk - The Annals of Applied …, 2021 - projecteuclid.org

We prove Wasserstein contraction of simple slice sampling for approximate sampling wrt

distributions with log-concave and rotational invariant Lebesgue densities. This yields, in

particular, an explicit quantitative lower bound of the spectral gap of simple slice sampling …

   Related articles All 4 versions


2021

[PDF] arxiv.org

Data-Driven Distributionally Robust Surgery Planning in Flexible Operating Rooms Over a Wasserstein Ambiguity

KS Shehadeh - arXiv preprint arXiv:2103.15221, 2021 - arxiv.org

We study elective surgery planning in flexible operating rooms where emergency patients

are accommodated in the existing elective surgery schedule. Probability distributions of

surgery durations are unknown, and only a small set of historical realizations is available. To …

  All 2 versions 

2021

Unpaired Single-Image Depth Synthesis with cycle-consistent Wasserstein GANs

C Angermann, A Moravová, M Haltmeier… - arXiv preprint arXiv …, 2021 - arxiv.org

Real-time estimation of actual environment depth is an essential module for various

autonomous system tasks such as localiza


2021

MR4258740 Prelim Wu, Hongguang; Cui, Xiaojun; Peacock geodesics in Wasserstein space. Differential Geom. Appl. 77 (2021), 101764. 60B05 (54E70 58E10 60B10 91G80)


Zbl 07370636
Related articles
 All 2 versions

 

arXiv:2105.15000  [pdf, other]  stat.ME
Intrinsic Wasserstein Correlation Analysis
Authors: Hang Zhou, Zhenhua Lin, Fang Yao
Abstract: We develop a framework of canonical correlation analysis for distribution-valued functional data within the geometry of Wasserstein spaces. Specifically, we formulate an intrinsic concept of correlation between random distributions, propose estimation methods based on functional principal component analysis (FPCA) and Tikhonov regularization, respectively, for the correlation and its corresponding…  More
Submitted 31 May, 2021; originally announced May 2021.
 Cited by 2 Related articles All 2 versions 

arXiv:2105.14348  [pdf, other]  math.ST  stat.ME
Robust Hypothesis Testing with Wasserstein Uncertainty Sets
Authors: Liyan Xie, Rui Gao, Yao Xie
Abstract: We consider a data-driven robust hypothesis test where the optimal test will minimize the worst-case performance regarding distributions that are close to the empirical distributions with respect to the Wasserstein distance. This leads to a new non-parametric hypothesis testing framework based on distributionally robust optimization, which is more robust when there are limited samples for one or b…  More
Submitted 29 May, 2021; originally announced May 2021.
 Cited by 3 Related articles All 2 versions 

arXiv:2105.11721  [pdf, other]  math.PR
A Central Limit Theorem for Semidiscrete Wasserstein Distances
Authors: Eustasio del Barrio, Alberto González-Sanz, Jean-Michel Loubes
Abstract: We address the problem of proving a Central Limit Theorem for the empirical optimal transport cost, $\sqrt{n}\{\mathcal{T}_c(P_n,Q)-\mathcal{W}_c(P,Q)\}$, in the semidiscrete case, i.e. when the distribution $P$ is finitely supported. We show that the asymptotic distribution is the supremum of a centered Gaussian process, which is Gaussian under some additional…
Related articles
 All 7 versions

online OPEN ACCESS

A Central Limit Theorem for Semidiscrete Wasserstein Distances

by del Barrio, Eustasio; González-Sanz, Alberto; Loubes, Jean-Michel

05/2021

We address the problem of proving a Central Limit Theorem for the empirical optimal transport cost, $\sqrt{n}\{\mathcal{T}_c(P_n,Q)-\mathcal{W}_c(P,Q)\}$, in...

Journal ArticleFull Text Online


[PDF] arxiv.org

2D Wasserstein loss for robust facial landmark detection

Y Yan, S Duffner, P Phutane, A Berthelier, C Blanc… - Pattern Recognition, 2021 - Elsevier

The recent performance of facial landmark detection has been significantly improved by

using deep Convolutional Neural Networks (CNNs), especially the Heatmap Regression

Models (HRMs). Although their performance on common benchmark datasets has reached a …

  Related articles All 3 versions

<——2021———2021———830——


Adversarial training with Wasserstein distance for learning cross-lingual word embeddings

Y Li, Y Zhang, K Yu, X Hu - Applied Intelligence, 2021 - Springer

Recent studies have managed to learn cross-lingual word embeddings in a completely

unsupervised manner through generative adversarial networks (GANs). These GANs-based

methods enable the alignment of two monolingual embedding spaces approximately, but …

 

[PDF] ieee.org

Wasserstein Divergence GAN With Cross-Age Identity Expert and Attribute Retainer for Facial Age Transformation

GS Hsu, RC Xie, ZT Chen - IEEE Access, 2021 - ieeexplore.ieee.org

We propose the Wasserstein Divergence GAN with an identity expert and an attribute

retainer for facial age transformation. The Wasserstein Divergence GAN (WGAN-div) can

better stabilize the training and lead to better target image generation. The identity expert …


arXiv:2106.01128  [pdfother]  cs.LG  stat.ML
Linear-Time Gromov Wasserstein Distances using Low Rank Couplings and Costs
Authors: Meyer Scetbon, Gabriel Peyré, Marco Cuturi
Abstract: The ability to compare and align related datasets living in heterogeneous spaces plays an increasingly important role in machine learning. The Gromov-Wasserstein (GW) formalism can help tackle this problem. Its main goal is to seek an assignment (more generally a coupling matrix) that can register points across otherwise incomparable datasets. As a non-convex and quadratic generalization of optima… 
More
Submitted 2 June, 2021; originally announced June 2021.

arXiv:2106.00886  [pdfother]  cs.LG  stat.ML
Partial Wasserstein Covering
Authors: Keisuke Kawano, Satoshi Koide, Keisuke Otaki
Abstract: We consider a general task called partial Wasserstein covering with the goal of emulating a large dataset (e.g., application dataset) using a small dataset (e.g., development dataset) in terms of the empirical distribution by selecting a small subset from a candidate dataset and adding it to the small dataset. We model this task as a discrete optimization problem with partial Wasserstein divergenc…  More
Submitted 1 June, 2021; originally announced June 2021.
online OPEN ACCESS

Partial Wasserstein Covering

by Kawano, Keisuke; Koide, Satoshi; Otaki, Keisuke

06/2021

We consider a general task called partial Wasserstein covering with the goal of emulating a large dataset (e.g., application dataset) using a small dataset...

Journal ArticleFull Text Online

 Partial Wasserstein Covering

https://arxiv.org › cs

by K Kawano · 2021 — Computer Science > Machine Learning. arXiv:2106.00886 (cs). [Submitted on 2 Jun 2021]. Title:Partial Wasserstein Covering. Authors:Keisuke 

 R Cited by 1 Related articles All 2 versions 

arXiv:2106.00736  [pdfother]  cs.LG
Large-Scale Wasserstein Gradient Flows
Authors: Petr Mokrov, Alexander Korotin, Lingxiao Li, Aude Genevay, Justin Solomon, Evgeny Burnaev
Abstract: Wasserstein gradient flows provide a powerful means of understanding and solving many diffusion equations. Specifically, Fokker-Planck equations, which model the diffusion of probability measures, can be understood as gradient descent over entropy functionals in Wasserstein space. This equivalence, introduced by Jordan, Kinderlehrer and Otto, inspired the so-called JKO scheme to approximate these…  More
Submitted 1 June, 2021; originally announced June 2021.
Cited by 36
 Related articles All 10 versions 

Distributionally robust tail bounds based on Wasserstein distance and $f$-divergence

by Birghila, Corina; Aigner, Maximilian; Engelke, Sebastian

arXiv.org, 06/2021

In this work, we provide robust bounds on the tail probabilities and the tail index of heavy-tailed distributions in the context of model misspecification....

Paper  Full Text Online


[PDF] acm.org

P2E-WGAN: ECG waveform synthesis from PPG with conditional wasserstein generative adversarial networks

K VoEK NaeiniA Naderi, D Jilani… - Proceedings of the 36th …, 2021 - dl.acm.org

Electrocardiogram (ECG) is routinely used to identify key cardiac events such as changes in

ECG intervals (PR, ST, QT, etc.), as well as capture critical vital signs such as heart rate (HR)

and heart rate variability (HRV). The gold standard ECG requires clinical measurement …

Cited by 5 Related articles All 2 versions

Generating Adversarial Patches Using Data-Driven MultiD-WGAN

W Wang, Y Chai, Z Wu, L Ge, X Han… - … on Circuits and …, 2021 - ieeexplore.ieee.org

In recent years, machine learning algorithms and training data are faced many security

threats, which affect the security of practical applications based on machine learning. At

present, generating adversarial patches based on Generative Adversarial Nets (GANs) has …

 Related articles

  Generating Adversarial Patches Using Data-Driven MultiD-WGAN

by Wang, Wei; Chai, Yimeng; Wu, Ziwen ; More...

2021 IEEE International Symposium on Circuits and Systems (ISCAS), 05/2021

In recent years, machine learning algorithms and training data are faced many security threats, which affect the security of practical applications based on...

Conference Proceeding Full Text Online


Unbalanced terahertz spectrum recognition based on WGAN

朱荣盛, 沈韬, 刘英莉, 朱艳, 崔向伟 - Spectroscopy and Spectral Analysis (光谱学与光谱分析), 2021 - opticsjournal.net

Abstract: The terahertz spectrum of a substance is unique. At present, research on terahertz spectrum recognition based on large-scale spectral databases, combined with advanced machine learning methods, has become a key focus in the field of terahertz applications. Owing to the influence of experimental conditions and equipment, it is difficult to collect balanced spectral data for multiple substances, yet such data are the basis for classifying terahertz spectra. To address this problem …

  [Translated from Chinese; original title: 基于 WGAN 的不均衡太赫兹光谱识别]

  All 2 versions 

[PDF] arxiv.org

Approximation for Probability Distributions by Wasserstein GAN

Y Gao, MK Ng - arXiv preprint arXiv:2103.10060, 2021 - arxiv.org

In this paper, we show that the approximation for distributions by Wasserstein GAN depends

on both the width/depth (capacity) of generators and discriminators, as well as the number of

samples in training. A quantified generalization bound is developed for Wasserstein  …

  All 2 versions 

<——2021———2021———840——


Approximation Capabilities of Wasserstein Generative Adversarial Networks

Y Gao, M Zhou, MK Ng - arXiv preprint arXiv:2103.10060, 2021 - 128.84.4.34

In this paper, we study Wasserstein Generative Adversarial Networks (WGANs) using

GroupSort neural networks as discriminators. We show that the error bound for the

approximation of target distribution depends on both the width/depth (capacity) of generators …

  

[PDF] arxiv.org

Approximation algorithms for 1-Wasserstein distance between persistence diagrams

S Chen, Y Wang - arXiv preprint arXiv:2104.07710, 2021 - arxiv.org

Recent years have witnessed a tremendous growth using topological summaries, especially

the persistence diagrams (encoding the so-called persistent homology) for analyzing

complex shapes. Intuitively, persistent homology maps a potentially complex input object (be …

  All 4 versions 


[PDF] arxiv.org

Intrinsic Wasserstein Correlation Analysis

H Zhou, Z Lin, F Yao - arXiv preprint arXiv:2105.15000, 2021 - arxiv.org

We develop a framework of canonical correlation analysis for distribution-valued functional

data within the geometry of Wasserstein spaces. Specifically, we formulate an intrinsic

concept of correlation between random distributions, propose estimation methods based on …


2021 [PDF] arxiv.org

Wasserstein barycenters are NP-hard to compute

JM Altschuler, E Boix-Adsera - arXiv preprint arXiv:2101.01100, 2021 - arxiv.org

The problem of computing Wasserstein barycenters (aka Optimal Transport barycenters) has

attracted considerable recent attention due to many applications in data science. While there

exist polynomial-time algorithms in any fixed dimension, all known runtimes suffer …

  Cited by 2 Related articles All 2 versions 


2021 see 2022
arXiv:2107.07789  [pdf, other]  cs.GR  cs.CG  cs.CV  eess.IV
Wasserstein Distances, Geodesics and Barycenters of Merge Trees
Authors: Mathieu Pont, Jules Vidal, Julie Delon, Julien Tierny
Abstract: This paper presents a unified computational framework for the estimation of distances, geodesics and barycenters of merge trees. We extend recent work on the edit distance [106] and introduce a new metric, called the Wasserstein distance between merge trees, which is purposely designed to enable efficient computations of geodesics and barycenters. Specifically, our new distance is strictly equival…  More
Submitted 16 July, 2021; originally announced July 2021.
Cited by 5
 Related articles All 50 versions

2021

[HTML] springer.com

[HTML] … : extraction of peptide chromatographic elution profiles from large scale mass spectrometry data by means of Wasserstein compressive hierarchical cluster …

O Permiakova, R Guibert, A Kraut, T Fortin… - BMC …, 2021 - Springer

The clustering of data produced by liquid chromatography coupled to mass spectrometry

analyses (LC-MS data) has recently gained interest to extract meaningful chemical or

biological patterns. However, recent instrumental pipelines deliver data which size …

  All 12 versions


Data augmentation for rolling bearing fault diagnosis using an enhanced few-shot Wasserstein auto-encoder with meta-learning

Z Pei, H Jiang, X Li, J Zhang, S Liu - Measurement Science and …, 2021 - iopscience.iop.org

Despite the advance of intelligent fault diagnosis for rolling bearings, in industries, data-

driven methods still suffer from data acquisition and imbalance. We propose an enhanced

few-shot Wasserstein auto-encoder (fs-WAE) to reverse the negative effect of imbalance …


 2021 PDF

Some Theoretical Insights into Wasserstein GANs - Journal of ...

https://jmlr.csail.mit.edu › papers › volume22

by G Biau · 2021 · Cited by 5 — Some Theoretical Insights into Wasserstein GANs. Gérard Biau gerard.biau@ sorbonne-universite.fr. Laboratoire de Probabilités, Statistique et Modélisation.

2021 online PEER-REVIEW OPEN ACCESS

Some Theoretical Insights into Wasserstein GANs

by Biau, Gérard; Sangnier, Maxime; Tanielian, Ugo

Journal of machine learning research, 05/2021

Generative Adversarial Networks (GANs) have been successful in producing outstanding results in areas as diverse as image, video, and text generation. Building...

Journal ArticleFull Text Online

 

 

Wasserstein Distance-Based Auto-Encoder Tracking

L Xu, Y Wei, C Dong, C Xu, Z Diao - Neural Processing Letters, 2021 - Springer

Most of the existing visual object trackers are based on deep convolutional feature maps, but

there have fewer works about finding new features for tracking. This paper proposes a novel

tracking framework based on a full convolutional auto-encoder appearance model, which is

trained by using Wasserstein distance and maximum mean discrepancy. Compared with

previous works, the proposed framework has better performance in three aspects, including

appearance model, update scheme, and state estimation. To address the issues of the …

online Cover Image PEER-REVIEW

Wasserstein Distance-Based Auto-Encoder Tracking

by Xu, Long; Wei, Ying; Dong, Chenhe ; More...

Neural processing letters, 06/2021, Volume 53, Issue 3

Most of the existing visual object trackers are based on deep convolutional feature maps, but there have fewer works about finding new features for tracking....


Journal ArticleFull Text Online


  Related articles

 


online OPEN ACCESS

Intrinsic Wasserstein Correlation Analysis

by Zhou, Hang; Lin, Zhenhua; Yao, Fang

05/2021

We develop a framework of canonical correlation analysis for distribution-valued functional data within the geometry of Wasserstein spaces. Specifically, we...

Journal ArticleFull Text Online

Intrinsic Wasserstein Correlation Analysis

H Zhou, Z Lin, F Yao - arXiv preprint arXiv:2105.15000, 2021 - arxiv.org

We develop a framework of canonical correlation analysis for distribution-valued functional

data within the geometry of Wasserstein spaces. Specifically, we formulate an intrinsic

concept of correlation between random distributions, propose estimation methods based on

functional principal component analysis (FPCA) and Tikhonov regularization, respectively,

for the correlation and its corresponding weight functions, and establish the minimax

convergence rates of the estimators. The key idea is to extend the framework of tensor …

 Cited by 2 Related articles All 2 versions

<——2021———2021———850——


Large-Scale Wasserstein Gradient Flows

P Mokrov, A KorotinL LiA Genevay… - arXiv preprint arXiv …, 2021 - arxiv.org

Wasserstein gradient flows provide a powerful means of understanding and solving many

diffusion equations. Specifically, Fokker-Planck equations, which model the diffusion of

probability measures, can be understood as gradient descent over entropy functionals in …

  All 2 versions 

online OPEN ACCESS

Large-Scale Wasserstein Gradient Flows

by Mokrov, Petr; Korotin, Alexander; Li, Lingxiao ; More...

06/2021

Wasserstein gradient flows provide a powerful means of understanding and solving many diffusion equations. Specifically, Fokker-Planck equations, which model...

Journal ArticleFull Text Online

Cited by 19 Related articles All 9 versions

Unbalanced optimal total variation transport problems and generalized Wasserstein barycenters

NP Chung, TS Trinh - Proceedings of the Royal Society of Edinburgh … - cambridge.org

In this paper, we establish a Kantorovich duality for unbalanced optimal total variation

transport problems. As consequences, we recover a version of duality formula for partial

optimal transports established by Caffarelli and McCann; and we also get another proof of

Kantorovich–Rubinstein theorem for generalized Wasserstein distance proved before by

Piccoli and Rossi. Then we apply our duality formula to study generalized Wasserstein

barycenters. We show the existence of these barycenters for measures with compact …

online Cover Image

Unbalanced optimal total variation transport problems and generalized Wasserstein barycenters

by Chung, Nhan-Phu; Trinh, Thanh-Son

Proceedings of the Royal Society of Edinburgh. Section A. Mathematics, 06/2021

In this paper, we establish a Kantorovich duality for unbalanced optimal total variation transport problems. As consequences, we recover a version of duality...


Journal ArticleFull Text Online


 Zbl 07535678
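The entry above is theoretical, but the unbalanced (mass-varying) transport setting it studies can be illustrated numerically. The sketch below assumes the POT package provides ot.unbalanced.sinkhorn_unbalanced with the signature shown; it demonstrates the unbalanced setting only and not the paper's duality results.

import numpy as np
import ot

x = np.linspace(0.0, 1.0, 50).reshape(-1, 1)
a = np.exp(-((x - 0.2) ** 2) / 0.01).ravel()          # source mass (not normalized)
b = 0.5 * np.exp(-((x - 0.7) ** 2) / 0.02).ravel()    # target with a different total mass
M = ot.dist(x, x)
M /= M.max()
# Entropic unbalanced OT: marginal constraints are relaxed through a KL penalty (reg_m).
plan = ot.unbalanced.sinkhorn_unbalanced(a, b, M, reg=1e-2, reg_m=1.0)
print(plan.sum(), a.sum(), b.sum())                   # transported mass vs. input masses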


Robust Hypothesis Testing with Wasserstein Uncertainty Sets

L Xie, R Gao, Y Xie - arXiv preprint arXiv:2105.14348, 2021 - arxiv.org

We consider a data-driven robust hypothesis test where the optimal test will minimize the

worst-case performance regarding distributions that are close to the empirical distributions

with respect to the Wasserstein distance. This leads to a new non-parametric hypothesis

testing framework based on distributionally robust optimization, which is more robust when

there are limited samples for one or both hypotheses. Such a scenario often arises from

applications such as health care, online change-point detection, and anomaly detection. We …

  Cited by 2 Related articles All 2 versions

online OPEN ACCESS

Robust Hypothesis Testing with Wasserstein Uncertainty Sets

by Xie, Liyan; Gao, Rui; Xie, Yao

05/2021

We consider a data-driven robust hypothesis test where the optimal test will minimize the worst-case performance regarding distributions that are close to the...

Journal ArticleFull Text Online

 

 

Linear-Time Gromov Wasserstein Distances using Low Rank Couplings and Costs

M ScetbonG PeyréM Cuturi - arXiv preprint arXiv:2106.01128, 2021 - arxiv.org

The ability to compare and align related datasets living in heterogeneous spaces plays an

increasingly important role in machine learning. The Gromov-Wasserstein (GW) formalism

can help tackle this problem. Its main goal is to seek an assignment (more generally a

coupling matrix) that can register points across otherwise incomparable datasets. As a non-

convex and quadratic generalization of optimal transport (OT), GW is NP-hard. Yet,

heuristics are known to work reasonably well in practice, the state of the art approach being …

 Cited by 5 Related articles All 3 versions

online  OPEN ACCESS

Linear-Time Gromov Wasserstein Distances using Low Rank Couplings and Costs

by Scetbon, Meyer; Peyré, Gabriel; Cuturi, Marco

06/2021

The ability to compare and align related datasets living in heterogeneous spaces plays an increasingly important role in machine learning. The...

Journal ArticleFull Text Online

 Cited by 5 Related articles All 3 versions


On Riemannian Optimization over Positive Definite Matrices with the Bures-Wasserstein Geometry

A HanB MishraP JawanpuriaJ Gao - arXiv preprint arXiv:2106.00286, 2021 - arxiv.org

In this paper, we comparatively analyze the Bures-Wasserstein (BW) geometry with the

popular Affine-Invariant (AI) geometry for Riemannian optimization on the symmetric positive

definite (SPD) matrix manifold. Our study begins with an observation that the BW metric has

a linear dependence on SPD matrices in contrast to the quadratic dependence of the AI

metric. We build on this to show that the BW metric is a more suitable and robust choice for

several Riemannian optimization problems over ill-conditioned SPD matrices. We show that …

 Cited by 2 Related articles All 5 versions

online OPEN ACCESS

On Riemannian Optimization over Positive Definite Matrices with the Bures-Wasserstein Geometry

by Han, Andi; Mishra, Bamdev; Jawanpuria, Pratik ; More...

06/2021

In this paper, we comparatively analyze the Bures-Wasserstein (BW) geometry with the popular Affine-Invariant (AI) geometry for Riemannian optimization on the...

Journal ArticleFull Text Online
arXiv:2106.00286  [pdf, other]  math.OC  cs.LG
On Riemannian Optimization over Positive Definite Matrices with the Bures-Wasserstein Geometry
Authors: Andi Han, Bamdev Mishra, Pratik Jawanpuria, Junbin Gao
Abstract: In this paper, we comparatively analyze the Bures-Wasserstein (BW) geometry with the popular Affine-Invariant (AI) geometry for Riemannian optimization on the symmetric positive definite (SPD) matrix manifold. Our study begins with an observation that the BW metric has a linear dependence on SPD matrices in contrast to the quadratic dependence of the AI metric. We build on this to show that the BW…  More
Submitted 1 June, 2021; originally announced June 2021.

Cited by 17 Related articles All 6 versions 

2021


year 2021

Unbalanced optimal total variation transport problems and generalized Wasserstein barycenters

NP Chung, TS Trinh - Proceedings of the Royal Society of Edinburgh … - cambridge.org

In this paper, we establish a Kantorovich duality for unbalanced optimal total variation

transport problems. As consequences, we recover a version of duality formula for partial

optimal transports established by Caffarelli and McCann; and we also get another proof of …

 

2021

 [CITATION] Multivariate Stein Factors from Wasserstein Decay

MA Erdogdu, L Mackey, O Shamir - 2019 - preparation

  Cited by 2 Related articles

[CITATION] Wasserstein GAN for the Classification of Unbalanced THz Database

Z Rong-sheng, S Tao, L Ying-li… - …, 2021 - OFFICE SPECTROSCOPY & …


2021

[PDF] arxiv.org

Robust Hypothesis Testing with Wasserstein Uncertainty Sets

L Xie, R Gao, Y Xie - arXiv preprint arXiv:2105.14348, 2021 - arxiv.org

We consider a data-driven robust hypothesis test where the optimal test will minimize the

worst-case performance regarding distributions that are close to the empirical distributions

with respect to the Wasserstein distance. This leads to a new non-parametric hypothesis  …

  All 2 versions 


2021

Wasserstein Distance-Based Auto-Encoder Tracking

L Xu, Y Wei, C Dong, C Xu, Z Diao - Neural Processing Letters, 2021 - Springer

Most of the existing visual object trackers are based on deep convolutional feature maps, but

there have fewer works about finding new features for tracking. This paper proposes a novel

tracking framework based on a full convolutional auto-encoder appearance model, which is …

 


2021

[PDF] jmlr.org

[PDF] Some theoretical insights into Wasserstein GANs

G BiauM SangnierU Tanielian - Journal of Machine Learning Research, 2021 - jmlr.org

Abstract Generative Adversarial Networks (GANs) have been successful in producing

outstanding results in areas as diverse as image, video, and text generation. Building on

these successes, a large number of empirical studies have validated the benefits of the …

  Cited by 5 Related articles All 5 versions 
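For readers new to the WGAN entries on this page, the Kantorovich-Rubinstein duality underlying all of them is worth recalling (a standard identity, not a result of the paper above); the critic plays the role of the 1-Lipschitz test function $f$:

W_1(\mu,\nu) \;=\; \sup_{\|f\|_{\mathrm{Lip}} \le 1} \; \mathbb{E}_{x\sim\mu}[f(x)] - \mathbb{E}_{y\sim\nu}[f(y)].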

<——2021———2021———860——


The alpha-z-Bures Wasserstein divergence

By: Trung Hoa Dinh; Cong Trinh Le; Bich Khue Vo; et al.

LINEAR ALGEBRA AND ITS APPLICATIONS  Volume: ‏ 624   Pages: ‏ 267-280   Published: ‏ SEP 1 2021


Intrinsic Dimension Estimation Using Wasserstein Distances

by Block, Adam; Jia, Zeyu; Polyanskiy, Yury ; More...

06/2021

It has long been thought that high-dimensional data encountered in many practical machine learning tasks have low-dimensional

structure, i.e., the manifold...

Journal Article  Full Text Online


2021 patent

Network attack flow data enhancement method combined with self-encoder and Wasserstein generative adversarial network useful for e.g. field of network safety, involves constructing unbalanced data set based on network attack flow data

Patent Number: CN112688928-A

Patent Assignee: CAS INFORMATION ENG INST

Inventor(s): YAO Y; HAO X; WANG Q; et al.


2021

Wasserstein Metric-Based Location Spoofing Attack Detection in WiFi Positioning Systems

By: Tian, Yinghua; Zheng, Nae; Chen, Xiang; et al.

SECURITY AND COMMUNICATION NETWORKS  Volume: ‏ 2021     Article Number: 8817569   Published: ‏ APR 7  

Cited by 1 Related articles All 7 versions 

2021 [PDF] ams.org

Mullins-Sekerka as the Wasserstein flow of the perimeter

A Chambolle, T Laux - Proceedings of the American Mathematical Society, 2021 - ams.org

We prove the convergence of an implicit time discretization for the one-phase Mullins-

Sekerka equation, possibly with additional non-local repulsion, proposed in [F. Otto, Arch.

Rational Mech. Anal. 141 (1998), pp. 63–103]. Our simple argument shows that the limit …

   Related articles All 4 versions


 2021  [PDF] arxiv.org

Wasserstein statistics in one-dimensional location scale models

S AmariT Matsuda - Annals of the Institute of Statistical Mathematics, 2021 - Springer

Wasserstein geometry and information geometry are two important structures to be

introduced in a manifold of probability distributions. Wasserstein geometry is defined by

using the transportation cost between two distributions, so it reflects the metric of the base …

   Related articles All 2 versions
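A worked special case relevant to the location-scale entries above (a standard closed form, not taken from the paper): for one-dimensional Gaussians,

W_2^2\big(\mathcal{N}(\mu_1,\sigma_1^2),\,\mathcal{N}(\mu_2,\sigma_2^2)\big) \;=\; (\mu_1-\mu_2)^2 + (\sigma_1-\sigma_2)^2,

so the transport cost separates into a location term and a scale term.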


2021  [HTML] hindawi.com

[HTML] Wasserstein Metric-Based Location Spoofing Attack Detection in WiFi Positioning Systems

Y Tian, N Zheng, X Chen, L Gao - Security and Communication …, 2021 - hindawi.com

WiFi positioning systems (WPS) have been introduced as parts of 5G location services

(LCS) to provide fast positioning results of user devices in urban areas. However, they are

prominently threatened by location spoofing attacks. To end this, we present a Wasserstein  …

  All 3 versions 


2021
Dinh, Trung Hoa
Le, Cong Trinh; Vo, Bich Khue; Vuong, Trung Dung

The α-z-Bures Wasserstein divergence. (English) Zbl 07355223

Linear Algebra Appl. 624, 267-280 (2021).

MSC:  47A63 47A56



2021

Chambolle, Antonin; Laux, Tim

Mullins-Sekerka as the Wasserstein flow of the perimeter. (English) Zbl 07352294

Proc. Am. Math. Soc. 149, No. 7, 2943-2956 (2021).

MSC:  35A15 65M12 49Q20 76D27 90B06 35R35


Cited by 7 Related articles All 25 versions

arXiv:2106.05724  [pdf, other]  math.OC  cs.LG
Distributionally Robust Prescriptive Analytics with Wasserstein Distance
Authors: Tianyu Wang, Ningyuan Chen, Chun Wang
Abstract: In prescriptive analytics, the decision-maker observes historical samples of $(X, Y)$, where $Y$ is the uncertain problem parameter and $X$ is the concurrent covariate, without knowing the joint distribution. Given an additional covariate observation $x$, the goal is to choose a decision $z$ conditional on this observation to minimize the cost $E[c(z,Y)\mid X=x]$. This paper proposes a new di…  More
Submitted 10 June, 2021; originally announced June 2021.

Related articles All 2 versions 
<——2021———2021———870——

2023 see 2021

arXiv:2106.04923  [pdf, other]  stat.ML  cs.LG
Learning Domain Invariant Representations by Joint Wasserstein Distance Minimization
Authors: Léo Andéol, Yusei Kawakami, Yuichiro Wada, Takafumi Kanamori, Klaus-Robert Müller, Grégoire Montavon
Abstract: Domain shifts in the training data are common in practical applications of machine learning, they occur for instance when the data is coming from different sources. Ideally, a ML model should work well independently of these shifts, for example, by learning a domain-invariant representation. Moreover, privacy concerns regarding the source also require a domain-invariant representation. In this wor…  More
Submitted 9 June, 2021; originally announced June 2021.
Comments: 20 pages including appendix. Under Review
online OPEN ACCESS

Learning Domain Invariant Representations by Joint Wasserstein Distance Minimization

by Andéol, Léo; Kawakami, Yusei; Wada, Yuichiro ; More...

06/2021

Domain shifts in the training data are common in practical applications of machine learning, they occur for instance when the data is coming from different...

Journal ArticleFull Text Online

 Related articles All 4 versions 



arXiv:2106.03226  [pdf, other]  math.OC  math.PR
Minimum cross-entropy distributions on Wasserstein balls and their applications
Authors: Luis Felipe Vargas, Mauricio Velasco
Abstract: Given a prior probability density $p$ on a compact set $K$ we characterize the probability distribution $q_{\delta}^*$ on $K$ contained in a Wasserstein ball $B_{\delta}(\mu)$ centered in a given discrete measure $\mu$ for which the relative-entropy $H(q,p)$ achieves its minimum. This characterization gives us an algorithm for computing such distributions efficiently…
Submitted 6 June, 2021; originally announced June 2021.
  Related articles All 2 versions 


arXiv:2106.02968  [pdfother]  cs.LG  math.OC
Low Budget Active Learning via Wasserstein Distance: An Integer Programming Approach
Authors: Rafid Mahmood, Sanja Fidler, Marc T. Law
Abstract: Given restrictions on the availability of data, active learning is the process of training a model with limited labeled data by selecting a core subset of an unlabeled data pool to label. Although selecting the most useful points for training is an optimization problem, the scale of deep learning data sets forces most selection strategies to employ efficient heuristics. Instead, we propose a new i…  More
Submitted 5 June, 2021; originally announced June 2021.

Related articles All 2 versions 

 [HTML] springer.com

[HTML] Primal dual methods for Wasserstein gradient flows

JA CarrilloK CraigL WangC Wei - Foundations of Computational …, 2021 - Springer

… Next, we use the Benamou–Brenier dynamical characterization of the Wasserstein distance

to reduce computing the solution of the discrete … We conclude with simulations of nonlinear

PDEs and Wasserstein geodesics in one and two dimensions that illustrate the key properties …

 Cited by 29 Related articles All 8 versions

[PDF] jst.go.jp

Learning deep Markov models using the Wasserstein distance

福田紘平, 星野健太 - Proceedings of the 64th Japan Joint Automatic Control Conference (自動制御連合講演会講演論文集), 2021 - jstage.jst.go.jp

… Abstract: This study discusses the modeling of time series using deep neural networks and

Wasserstein distance. The network provides generative models called Deep Markov Models,

which allows us to obtain generative data of time series. We explore the loss function based on …

All 2 versions

[Translated from Japanese; original title: 深層マルコフモデルの Wasserstein 距離を用いた学習]


2021


[PDF] udl.cat

[PDF] Private computation of the Wasserstein (Earth Mover's) distance

A Blanco-Justicia, J Domingo-Ferrer - recsi2020.udl.cat

The Wasserstein distance, better known in English as the Earth Mover's Distance (EMD), is a distance measure between two probability distributions. The EMD is widely used in the comparison of images and documents, and forms part of models for …

[Translated from Spanish; original title: Cálculo privado de la distancia de Wasserstein (Earth Mover)]

Related articles 

  

Wasserstein configurational entropy based unsupervised band selection method for hyperspectral image classification

张红, 吴智伟, 王继成, 高培超 - Acta Geodaetica et Cartographica Sinica (测绘学报) - xb.sinomaps.com

Band selection for hyperspectral images must take band information into account. The traditional Shannon entropy index only considers the compositional information of an image (the types and proportions of pixels) and ignores its spatial configuration information (the spatial distribution of pixels); the latter can be characterized by Boltzmann entropy. Among such measures, the Wasserstein configurational entropy removes the redundant information of contiguous pixels but is restricted to the four-neighbourhood; this paper extends the Wasserstein …

  All 2 versions

[Translated from Chinese; original title: 高光谱图像分类的 Wasserstein 配置熵非监督波段选择方法]


2021  see 2022
Indeterminacy estimates, eigenfunctions and lower bounds on Wasserstein distances
by De Ponti, Nicolò; Farinelli, Sara
04/2021
In the paper we prove two inequalities in the setting of ${\sf RCD}(K,\infty)$ spaces using similar techniques. The first one is an indeterminacy estimate...
Journal Article  Full Text Online

 Wasserstein Generative Adversarial Networks for Realistic ...

https://link.springer.com › chapter

Apr 5, 2021 — ... Adversarial Networks for Realistic Traffic Sign Image Generation ... ( Wasserstein GAN, WGAN) is used to generate complicated images in ...


[PDF] zhankunluo.com

[PDF] Image Generation using Wasserstein Generative Adversarial Network

Z Luo, A Jara, W Ou - zhankunluo.com

GAN shows the capability to generate fake authentic images by evaluating and learning

from real and fake samples. This paper introduces an alternative algorithm to the traditional

DCGAN, named Wasserstein GAN (WGAN). It introduces the Wasserstein distance for the …

Related articles 

[PDF] iop.org

Optimization Research on Abnormal Diagnosis of Transformer Voiceprint Recognition based on Improved Wasserstein GAN

K Zhu, H Ma, J Wang, C Yu, C Guo… - Journal of Physics …, 2021 - iopscience.iop.org

Transformer is an important infrastructure equipment of power system, and fault monitoring

is of great significance to its operation and maintenance, which has received wide attention

and much research. However, the existing methods at home and abroad are based on …

  All 2 versions

<——2021———2021———880——

 

[PDF] uwaterloo.ca

Wasserstein Autoencoders with Mixture of Gaussian Priors for Stylized Text Generation

A Ghabussi - 2021 - uwspace.uwaterloo.ca

Probabilistic text generation is an important application of Natural Language Processing

(NLP). Variational autoencoders and Wasserstein autoencoders are two widely used

methods for text generation. New research efforts focus on improving the quality of the …


[PDF] ams.org

Mullins-Sekerka as the Wasserstein flow of the perimeter

A Chambolle, T Laux - Proceedings of the American Mathematical Society, 2021 - ams.org

We prove the convergence of an implicit time discretization for the one-phase Mullins-

Sekerka equation, possibly with additional non-local repulsion, proposed in [F. Otto, Arch.

Rational Mech. Anal. 141 (1998), pp. 63–103]. Our simple argument shows that the limit …

   Related articles All 4 versions


Intensity-Based Wasserstein Distance As A Loss Measure For Unsupervised Deformable Deep Registration

R Shams, W Le, A Weihs… - 2021 IEEE 18th …, 2021 - ieeexplore.ieee.org

Traditional pairwise medical image registration techniques are based on computationally

intensive frameworks due to numerical optimization procedures. While there is increasing

adoption of deep neural networks to improve deformable image registration, achieving a …



Considering anatomical prior information for low-dose CT image enhancement using attribute-augmented Wasserstein generative adversarial networks

Z Huang, X Liu, R Wang, J Chen, P Lu, Q Zhang… - Neurocomputing, 2021 - Elsevier

Currently, many deep learning (DL)-based low-dose CT image postprocessing technologies

fail to consider the anatomical differences in training data among different human body sites,

such as the cranium, lung and pelvis. In addition, we can observe evident anatomical …

  Cited by 2 Related articles


2021

Fault Diagnosis of Rotating Machinery Based on Wasserstein Distance and Feature Selection

F FerracutiA FreddiA Monteriù… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org

This article presents a fault diagnosis algorithm for rotating machinery based on the

Wasserstein distance. Recently, the Wasserstein distance has been proposed as a new

research direction to find better distribution mapping when compared with other popular …


2021


arXiv:2106.06266  [pdf, other]  math.ST
Distributionally robust tail bounds based on Wasserstein distance and f-divergence

  All 2 versions 


2021
Authors: Corina Birghila, Maximilian Aigner, Sebastian Engelke
Abstract: In this work, we provide robust bounds on the tail probabilities and the tail index of heavy-tailed distributions in the context of model misspecification. They are defined as the optimal value when computing the worst-case tail behavior over all models within some neighborhood of the reference model. The choice of the discrepancy between the models used to build this neighborhood plays a crucial…  More
Submitted 11 June, 2021; originally announced June 2021.


online Cover Image PEER-REVIEW

Pixel-wise Wasserstein Autoencoder for Highly Generative Dehazing

by Kim, Guisik; Park, Sung Woo; Kwon, Junseok

IEEE transactions on image processing, 06/2021, Volume PP

We propose a highly generative dehazing method based on pixel-wise Wasserstein autoencoders. In contrast to existing dehazing methods based on generative...


Journal ArticleFull Text Online

 Cited by 7 Related articles All 5 versions

 

 2021   Cover Image   PEER-REVIEW

Tropical optimal transport and Wasserstein distances

by Lee, Wonjun; Li, Wuchen; Lin, Bo ; More...

Information Geometry, 06/2021

AbstractWe study the problem of optimal transport in tropical geometry and define the Wasserstein-p distances in the continuous metric measure space setting of...


Journal ArticleCitation Online

  Cited by 2 Related articles All 3 versions

 

online Cover Image  PEER-REVIEW OPEN ACCESS

Variational Autoencoders and Wasserstein Generative Adversarial Networks for Improving the Anti-Money...

by Chen, ZhiYuan; Soliman, Waleed; Nazir, Amril ; More...

IEEE access, 06/2021

There has been much recent work on fraud and Anti Money Laundering (AML) detection using machine learning techniques. However, most algorithms are based on...


Journal ArticleFull Text Online


 

 

online OPEN ACCESS

Minimum cross-entropy distributions on Wasserstein balls and their applications

by Vargas, Luis Felipe; Velasco, Mauricio

06/2021

Given a prior probability density $p$ on a compact set $K$ we characterize the probability distribution $q_{\delta}^*$ on $K$ contained in a Wasserstein ball...

Journal ArticleFull Text Online

Minimum cross-entropy distributions on Wasserstein balls and ...

https://arxiv.org › math

by LF Vargas · 2021 — ... distributions on Wasserstein balls and their applications ... measure \mu for which the relative-entropy H(q,p) achieves its minimum.

<——2021———2021———890——


online OPEN ACCESS

Low Budget Active Learning via Wasserstein Distance: An Integer Programming Approach

by Mahmood, Rafid; Fidler, Sanja; Law, Marc T

06/2021

Given restrictions on the availability of data, active learning is the process of training a model with limited labeled data by selecting a core subset of an...

Journal ArticleFull Text Online

Low Budget Active Learning via Wasserstein Distance: An Integer ...

https://www.reddit.com › comments › nwdhzj › low_bu...

Jun 9, 2021 — Low Budget Active Learning via Wasserstein Distance: An Integer Programming Approach by Rafid Mahmood et al. Close.

 



[PDF] mdpi.com

 Pixel-wise Wasserstein Autoencoder for Highly Generative Dehazing

G Kim, SW Park, J Kwon - IEEE Transactions on Image …, 2021 - ieeexplore.ieee.org

We propose a highly generative dehazing method based on pixel-wise Wasserstein

autoencoders. In contrast to existing dehazing methods based on generative adversarial

networks, our method can produce a variety of dehazed images with different styles. It …

  All 3 versions


2021

Wasserstein k-means with sparse simplex projection

T Fukunaga, H Kasai - 2020 25th International Conference on …, 2021 - ieeexplore.ieee.org

This paper presents a proposal of a faster Wasserstein k-means algorithm for histogram data by reducing Wasserstein distance computations and exploiting sparse simplex projection. We shrink data samples, centroids, and the ground cost matrix, which leads to …

Cited by 13 Related articles All 5 versions
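A minimal hedged sketch (my own illustration, not the authors' algorithm): the repeated histogram-to-centroid Wasserstein distance computations that this paper aims to reduce, written for 1-D histograms with SciPy; the bin locations, shapes, and helper name are assumptions.

```python
# Assignment step of a Wasserstein k-means over 1-D histograms.
import numpy as np
from scipy.stats import wasserstein_distance

def assign_to_centroids(histograms, centroids, bins):
    """histograms: (m, n) rows on the simplex; centroids: (k, n); bins: (n,) support points."""
    labels = np.empty(len(histograms), dtype=int)
    for i, h in enumerate(histograms):
        dists = [wasserstein_distance(bins, bins, u_weights=h, v_weights=c)
                 for c in centroids]
        labels[i] = int(np.argmin(dists))      # nearest centroid in W_1
    return labels
```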

arXiv:2106.08812  [pdf, other]  cs.LG  cs.AI  cs.CY  stat.ML
Costs and Benefits of Wasserstein Fair Regression
Authors: Han Zhao
Abstract: Real-world applications of machine learning tools in high-stakes domains are often regulated to be fair, in the sense that the predicted target should satisfy some quantitative notion of parity with respect to a protected attribute. However, the exact tradeoff between fairness and accuracy with a real-valued target is not clear. In this paper, we characterize the inherent tradeoff between statisti… 
More
Submitted 16 June, 2021; originally announced June 2021.
Related articles All 2 versions 


Averaging on the Bures-Wasserstein manifold: dimension-free convergence of gradient...

by Altschuler, Jason M; Chewi, Sinho; Gerber, Patrik ; More...
06/2021
We study first-order optimization algorithms for computing the barycenter of Gaussian distributions with respect to the optimal transport metric. Although the...
Journal Article  Full Text Online

arXiv:2106.08502  [pdf, other]  math.OC  cs.LG
Averaging on the Bures-Wasserstein manifold: dimension-free convergence of gradient descent
Authors: Jason M. Altschuler, Sinho Chewi, Patrik Gerber, Austin J. Stromme
Abstract: We study first-order optimization algorithms for computing the barycenter of Gaussian distributions with respect to the optimal transport metric. Although the objective is geodesically non-convex, Riemannian GD empirically converges rapidly, in fact faster than off-the-shelf methods such as Euclidean GD and SDP solvers. This stands in stark contrast to the best-known theoretical results for Rieman… 
More
Submitted 15 June, 2021; originally announced June 2021.
Comments: 48 pages, 8 figures

 Cited by 8 Related articles All 7 versions 
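For context (a standard closed-form fact, not a claim of the paper above): the optimal transport metric between Gaussians that underlies the Bures-Wasserstein geometry is

$$ W_2^2\big(\mathcal{N}(m_1,\Sigma_1),\,\mathcal{N}(m_2,\Sigma_2)\big) \;=\; \|m_1-m_2\|^2 + \operatorname{tr}\!\Big(\Sigma_1+\Sigma_2-2\big(\Sigma_1^{1/2}\Sigma_2\,\Sigma_1^{1/2}\big)^{1/2}\Big), $$

so the barycenter problem studied in this entry can be written entirely in terms of means and covariance matrices.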

2021


Non-asymptotic convergence bounds for Wasserstein approximation using point clouds

by Merigot, Quentin; Santambrogio, Filippo; Sarrazin, Clément
06/2021
Several issues in machine learning and inverse problems require to generate discrete data, as if sampled from a model probability distribution. A common way to...
Journal Article  Full Text Online

arXiv:2106.07911  [pdf, other]  math.OC  stat.ML
Non-asymptotic convergence bounds for Wasserstein approximation using point clouds
Authors: Quentin Merigot, Filippo Santambrogio, Clément Sarrazin
Abstract: Several issues in machine learning and inverse problems require to generate discrete data, as if sampled from a model probability distribution. A common way to do so relies on the construction of a uniform probability distribution over a set of N points which minimizes the Wasserstein distance to the model distribution. This minimization problem, where the unknowns are the positions of the atoms…  More
Submitted 15 June, 2021; originally announced June 2021.
Cited by 5 Related articles All 7 versions 


arXiv:2106.07537  [pdf, other]  stat.ML  cs.LG  math.OC
Wasserstein Minimax Framework for Mixed Linear Regression
Authors: Theo Diamandis, Yonina C. Eldar, Alireza Fallah, Farzan Farnia, Asuman Ozdaglar
Abstract: Multi-modal distributions are commonly used to model clustered data in statistical learning tasks. In this paper, we consider the Mixed Linear Regression (MLR) problem. We propose an optimal transport-based framework for MLR problems, Wasserstein Mixed Linear Regression (WMLR), which minimizes the Wasserstein distance between the learned and target mixture regression models. Through a model-based…  More
Submitted 16 June, 2021; v1 submitted 14 June, 2021; originally announced June 2021.
Comments: To appear in 38th International Conference on Machine Learning (ICML 2021)

[CITATION] A Wasserstein Minimax Framework for Mixed Linear Regression

T Diamandis, Y Eldar, A Fallah… - 38th International … - weizmann.esploro.exlibrisgroup.com

 Cited by 1 Related articles All 6 versions 

Balanced Coarsening for Multilevel Hypergraph Partitioning via Wasserstein Discrepancy

by Guo, Zhicheng; Zhao, Jiaxuan; Jiao, Licheng ; More...

06/2021

We propose a balanced coarsening scheme for multilevel hypergraph partitioning. In addition, an initial partitioning algorithm is designed to improve the...

Journal Article  Full Text Online

arXiv:2106.07501  [pdf, ps, other]  cs.LG
Balanced Coarsening for Multilevel Hypergraph Partitioning via Wasserstein Discrepancy
Authors: Zhicheng Guo, Jiaxuan Zhao, Licheng Jiao, Xu Liu
Abstract: We propose a balanced coarsening scheme for multilevel hypergraph partitioning. In addition, an initial partitioning algorithm is designed to improve the quality of k-way hypergraph partitioning. By assigning vertex weights through the LPT algorithm, we generate a prior hypergraph under a relaxed balance constraint. With the prior hypergraph, we have defined the Wasserstein discrepancy to coordina…  More
Submitted 14 June, 2021; originally announced June 2021.

Related articles All 3 versions 

2021 [PDF] arxiv.org

Tracial smooth functions of non-commuting variables and the free Wasserstein manifold

D Jekel, W Li, D Shlyakhtenko - arXiv preprint arXiv:2101.06572, 2021 - arxiv.org

We formulate a free probabilistic analog of the Wasserstein manifold on $\mathbb{R}^d$ (the formal Riemannian manifold of smooth probability densities on $\mathbb{R}^d$), and we use it to study smooth non-commutative transport of measure. The points of the free …

   All 3 versions 


2021

Dimension-free Wasserstein contraction of nonlinear filters

N Whiteley - Stochastic Processes and their Applications, 2021 - Elsevier

For a class of partially observed diffusions, conditions are given for the map from the initial

condition of the signal to filtering distribution to be contractive with respect to Wasserstein

distances, with rate which does not necessarily depend on the dimension of the state-space …

  All 6 versions

<——2021———2021———900——


[PDF] archives-ouvertes.fr

Measuring the Irregularity of Vector-Valued Morphological Operators using Wasserstein Metric

ME Valle, S Francisco, MA Granero… - … Conference on Discrete …, 2021 - Springer

Mathematical morphology is a useful theory of nonlinear operators widely used for image

processing and analysis. Despite the successful application of morphological operators for

binary and gray-scale images, extending them to vector-valued images is not straightforward …

  All 5 versions


Hybrid Machine Learning Model for Rainfall Forecasting

H Abdel-Kader, M Abd-El Salam… - Journal of Intelligent …, 2021 - americaspg.com

… Hybrid Machine Learning Model for Rainfall Forecasting … This paper presents a vigorous hybrid technique applied to forecast rainfall by combining Particle Swarm … Syam Prasad Reddy,

K.Vagdhan Kumar, B. Musala Reddy, and N. Raja, Nayak, “ANN Approach for Weather …

  1 Related articles


Full View

Short‐term railway passenger demand forecast using improved Wasserstein generative adversarial nets and web search terms

F Feng, J Zhang, C Liu, W Li… - IET Intelligent Transport …, 2021 - Wiley Online Library

Accurately predicting railway passenger demand is conducive for managers to quickly

adjust strategies. It is time‐consuming and expensive to collect large‐scale traffic data. With

the digitization of railway tickets, a large amount of user data has been accumulated. We …


2021 [PDF] thecvf.com

Self-Supervised Wasserstein Pseudo-Labeling for Semi-Supervised Image Classification

F Taherkhani, A Dabouei… - Proceedings of the …, 2021 - openaccess.thecvf.com

The goal is to use Wasserstein metric to provide pseudo labels for the unlabeled images to

train a Convolutional Neural Networks (CNN) in a Semi-Supervised Learning (SSL) manner

for the classification task. The basic premise in our method is that the discrepancy between …


arXiv:2106.13024  [pdf, other]  cs.LG  cs.AI  cs.CV
Symmetric Wasserstein Autoencoders
Authors: Sun Sun, Hongyu Guo
Abstract: Leveraging the framework of Optimal Transport, we introduce a new family of generative autoencoders with a learnable prior, called Symmetric Wasserstein Autoencoders (SWAEs). We propose to symmetrically match the joint distributions of the observed data and the latent representation induced by the encoder and the decoder. The resulting algorithm jointly optimizes the modelling losses in both the d… 
More
Submitted 24 June, 2021; originally announced June 2021.
Comments: Accepted by UAI2021

Symmetric Wasserstein Autoencoders

by Sun, Sun; Guo, Hongyu

06/2021

Leveraging the framework of Optimal Transport, we introduce a new family of generative autoencoders with a learnable prior, called Symmetric Wasserstein...

Journal ArticleFull Text Online
Related articles
 All 4 versions 

arXiv:2106.12893  [pdf, other]  cs.LG  stat.ML
Partial Wasserstein and Maximum Mean Discrepancy distances for bridging the gap between outlier detection and drift detection
Authors: Thomas Viehmann
Abstract: With the rise of machine learning and deep learning based applications in practice, monitoring, i.e. verifying that these operate within specification, has become an important practical problem. An important aspect of this monitoring is to check whether the inputs (or intermediates) have strayed from the distribution they were validated for, which can void the performance assurances obtained durin…  More
Submitted 9 June, 2021; originally announced June 2021.

Partial Wasserstein and Maximum Mean Discrepancy distances for bridging the gap...

by Viehmann, Thomas

06/2021

With the rise of machine learning and deep learning based applications in practice, monitoring, i.e. verifying that these operate within specification, has...

Journal Article  Full Text Online

All 2 versions 


Gromov-Wasserstein Distances between Gaussian Distributions

A Salmona, J Delon, A Desolneux - arXiv preprint arXiv:2104.07970, 2021 - arxiv.org

The Gromov-Wasserstein distances were proposed a few years ago to compare distributions

which do not lie in the same space. In particular, they offer an interesting alternative to the

Wasserstein distances for comparing probability measures living on Euclidean spaces of …

  All 12 versions 
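For context (the standard definition, with normalization conventions varying across papers): for metric measure spaces $(X,d_X,\mu)$ and $(Y,d_Y,\nu)$, the Gromov-Wasserstein distance of order 2 is

$$ GW_2^2(\mu,\nu) \;=\; \inf_{\pi\in\Pi(\mu,\nu)} \iint \big|d_X(x,x') - d_Y(y,y')\big|^2 \, d\pi(x,y)\, d\pi(x',y'), $$

which compares distributions living on different spaces by matching pairwise distances rather than points.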

 

Wasserstein distance based Asymmetric Adversarial Domain Adaptation in intelligent bearing fault diagnosis

Y Ying, Z Jun, T Tang, W Jingwei, C Ming… - Measurement …, 2021 - iopscience.iop.org

Addressing the phenomenon of data sparsity in hostile working conditions, which leads to

performance degradation in traditional machine learning based fault diagnosis methods, a

novel Wasserstein distance based Asymmetric Adversarial Domain Adaptation (WAADA) is …

 

[PDF] arxiv.org

Generalized Wasserstein barycenters between probability measures living on different subspaces

J Delon, N Gozlan, A Saint-Dizier - arXiv preprint arXiv:2105.09755, 2021 - arxiv.org

In this paper, we introduce a generalization of the Wasserstein barycenter, to a case where

the initial probability measures live on different subspaces of R^ d. We study the existence

and uniqueness of this barycenter, we show how it is related to a larger multi-marginal …

  All 8 versions 


2021

[PDF] projecteuclid.org

Strong equivalence between metrics of Wasserstein type

E Bayraktar, G Guo - Electronic Communications in Probability, 2021 - projecteuclid.org

The sliced Wasserstein metric $\widetilde{W}_p$ and, more recently, the max-sliced Wasserstein metric $\overline{W}_p$ have attracted abundant attention in data sciences and machine learning due to their advantages in tackling the curse of dimensionality, see e.g. [15], [6]. A question of particular …

  All 2 versions

<——2021———2021———910——


2021  see 2020

MR4276978 Prelim Fort, Jean-Claude; Klein, Thierry; Lagnoux, Agnès; Global Sensitivity Analysis and Wasserstein Spaces. SIAM/ASA J. Uncertain. Quantif. 9 (2021), no. 2, 880–921. 62G05 (62E17 62G20 62G30 65C60)

[PDF] arxiv.org

Global sensitivity analysis and Wasserstein spaces

JC Fort, T Klein, A Lagnoux - SIAM/ASA Journal on Uncertainty Quantification, 2021 - SIAM

… two indices: the first one is based on Wasserstein Fréchet means, while the second one is based on the Hoeffding decomposition of the indicators of Wasserstein balls. Further, when …

Cited by 12 Related articles All 16 versions


Frohmader, Andrew; Volkmer, Hans

1-Wasserstein distance on the standard simplex. (English) Zbl 07359878

Algebr. Stat. 12, No. 1, 43-56 (2021).

MSC:  28A33 60B05


Cited by 5 Related articles All 4 versions

[PDF] jmlr.org

[PDF] A Fast Globally Linearly Convergent Algorithm for the Computation of Wasserstein Barycenters.

L Yang, J Li, D Sun, KC Toh - J. Mach. Learn. Res., 2021 - jmlr.org

We consider the problem of computing a Wasserstein barycenter for a set of discrete

probability distributions with finite supports, which finds many applications in areas such as

statistics, machine learning and image processing. When the support points of the …

  Cited by 9 Related articles All 6 versions 
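For context (the standard formulation, not the specific algorithm of the paper above): a Wasserstein barycenter of discrete distributions $\nu_1,\dots,\nu_m$ with weights $\lambda_i \ge 0$, $\sum_i \lambda_i = 1$, is any minimizer of

$$ \min_{\mu} \; \sum_{i=1}^{m} \lambda_i\, W_2^2(\mu,\nu_i), $$

which, when the support of $\mu$ is fixed, becomes a large but structured linear program that such algorithms exploit.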

 

Wasserstein k-means with sparse simplex projection

T Fukunaga, H Kasai - 2020 25th International Conference on …, 2021 - ieeexplore.ieee.org

This paper presents a proposal of a faster Wasserstein k-means algorithm for histogram data by reducing Wasserstein distance computations and exploiting sparse simplex projection. We shrink data samples, centroids, and the ground cost matrix, which leads to …

   Cited by 2 Related articles All 4 versions

2021  see 2019

1-Wasserstein distance on the standard simplex

A Frohmader, H Volkmer - Algebraic Statistics, 2021 - msp.org

Wasserstein distances provide a metric on a space of probability measures. We consider the space Ω of all probability measures on the finite set χ = {1, …, n}, where n is a positive integer. The 1-Wasserstein distance, $W_1(\mu, \nu)$, is a function from Ω × Ω to [0, ∞). This paper derives …
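A minimal numerical sketch (my own assumption, not code from the paper): when the ground metric on χ = {1, …, n} is the usual |i − j|, the 1-Wasserstein distance between two points of the simplex reduces to the L1 distance between their cumulative sums.

```python
# W_1 on the probability simplex over {1, ..., n} with ground metric |i - j|.
import numpy as np

def w1_simplex(mu, nu):
    mu, nu = np.asarray(mu, dtype=float), np.asarray(nu, dtype=float)
    # W_1(mu, nu) = sum_k |F_mu(k) - F_nu(k)| for distributions on an ordered finite set.
    return float(np.abs(np.cumsum(mu - nu)).sum())

# Example: point masses at states 1 and 4 are at distance 3.
print(w1_simplex([1, 0, 0, 0], [0, 0, 0, 1]))  # 3.0
```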

2021


[PDF] arxiv.org

Robust Hypothesis Testing with Wasserstein Uncertainty Sets

L Xie, R Gao, Y Xie - arXiv preprint arXiv:2105.14348, 2021 - arxiv.org

We consider a data-driven robust hypothesis test where the optimal test will minimize the

worst-case performance regarding distributions that are close to the empirical distributions

with respect to the Wasserstein distance. This leads to a new non-parametric hypothesis  …

  All 2 versions 


2021 Full View

Short‐term railway passenger demand forecast using improved Wasserstein generative adversarial nets and web search terms

F Feng, J Zhang, C Liu, W Li… - IET Intelligent Transport …, 2021 - Wiley Online Library

Accurately predicting railway passenger demand is conducive for managers to quickly

adjust strategies. It is time‐consuming and expensive to collect large‐scale traffic data. With

the digitization of railway tickets, a large amount of user data has been accumulated. We …


2021

Wasserstein perturbations of Markovian transition semigroups

S Fuhrmann, M Kupper, M Nendel - arXiv preprint arXiv:2105.05655, 2021 - arxiv.org

In this paper, we deal with a class of time-homogeneous continuous-time Markov processes

with transition probabilities bearing a nonparametric uncertainty. The uncertainty is modeled

by considering perturbations of the transition probabilities within a proximity in Wasserstein  …

  All 8 versions 


year 2021

Unbalanced optimal total variation transport problems and generalized Wasserstein barycenters

NP Chung, TS Trinh - Proceedings of the Royal Society of Edinburgh … - cambridge.org

In this paper, we establish a Kantorovich duality for unbalanced optimal total variation

transport problems. As consequences, we recover a version of duality formula for partial

optimal transports established by Caffarelli and McCann; and we also get another proof of …

 

online Cover Image

Corrigendum to: An enhanced uncertainty principle for the Vaserstein distance: Bull. Lond. Math. Soc. 52 (2020) 1158–1173

by Carroll, Tom; Massaneda, Xavier; Ortega‐Cerdà, Joaquim

The Bulletin of the London Mathematical Society, 05/2021

Journal ArticleFull Text Online

  Corrigendum to: An enhanced uncertainty principle for the ...

https://londmathsoc.onlinelibrary.wiley.com › full › blms

by T Carroll — Corrigendum to: An enhanced uncertainty principle for the Vaserstein distance. (Bull. Lond. Math. Soc. 52 (2020) 1158–1173) ..

<——2021———2021———920——

 

Synthetic Ride-Requests Generation using WGAN with Location Embeddings

by Nookala, Usha; Ding, Sihao; Alareqi, Ebrahim ; More...

2021 Smart City Symposium Prague (SCSP), 05/2021

Ride-hailing services have gained tremendous importance in social life today, and the amount of resources involved have been hiking up. Ride-request data has...

Conference ProceedingCitation Online

Synthetic Ride-Requests Generation using WGAN with ...

https://ieeexplore.ieee.org › abstract › document

by U Nookala · 2021 — Synthetic Ride-Requests Generation using WGAN with Location Embeddings. Abstract: R

2021 online Cover Image PEER-REVIEW

On linear optimization over Wasserstein balls

by Yue, Man-Chung; Kuhn, Daniel; Wiesemann, Wolfram

Mathematical programming, 06/2021

Journal ArticleFull Text Online

  On linear optimization over Wasserstein balls

MC Yue, D Kuhn, W Wiesemann - Mathematical Programming, 2021 - Springer

Wasserstein balls, which contain all probability measures within a pre-specified Wasserstein

distance to a reference measure, have recently enjoyed wide popularity in the

distributionally robust optimization and machine learning communities to formulate and

solve data-driven optimization problems with rigorous statistical guarantees. In this technical

note we prove that the Wasserstein ball is weakly compact under mild conditions, and we

offer necessary and sufficient conditions for the existence of optimal solutions. We also …

 Cited by 19 Related articles All 9 versions
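For context (the generic formulation these works build on, not a result specific to the paper above): a data-driven distributionally robust problem over a Wasserstein ball of radius $\varepsilon$ around the empirical measure $\widehat{\mathbb{P}}_N$ evaluates the worst-case expected loss

$$ \sup_{\mathbb{Q}\,:\,W_p(\mathbb{Q},\widehat{\mathbb{P}}_N)\le\varepsilon} \; \mathbb{E}_{\mathbb{Q}}\big[\ell(x,\xi)\big], $$

and the technical note above is concerned with when this inner supremum over the ball is attained.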


online Cover Image PEER-REVIEW

Wasserstein distance based Asymmetric Adversarial Domain Adaptation in intelligent bearing fault diagnosis

by Ying, Yu; Jun, Zhao; Tang, Tang ; More...

Measurement science & technology, 06/2021

Journal ArticleFull Text Online

  Wasserstein distance based Asymmetric Adversarial Domain Adaptation in intelligent bearing fault diagnosis

Y Ying, Z Jun, T Tang, W Jingwei, C Ming… - Measurement …, 2021 - iopscience.iop.org

Addressing the phenomenon of data sparsity in hostile working conditions, which leads to

performance degradation in traditional machine learning based fault diagnosis methods, a

novel Wasserstein distance based Asymmetric Adversarial Domain Adaptation (WAADA) is

proposed for unsupervised domain adaptation in bearing fault diagnosis. A GAN-based loss

and asymmetric mapping are integrated to alleviate the difficulty of the training process in

adversarial transfer learning, especially when the domain shift is serious. Moreover …

 

online Cover Image  PEER-REVIEW

Precise limit in Wasserstein distance for conditional empirical measures of Dirichlet diffusion processes

by Wang, Feng-Yu

Journal of functional analysis, 06/2021, Volume 280, Issue 11

Let M be a d-dimensional connected compact Riemannian manifold with boundary ∂M, let $V \in C^2(M)$ be such that $\mu(dx) := e^{V(x)}\,dx$ is a probability measure, and let $X_t$ be...

Journal ArticleFull Text Online

Precise Limit in Wasserstein Distance for Conditional ...

https://arxiv.org › math

by F Wang · 2020 · Cited by 3 — ... Wasserstein Distance for Conditional Empirical Measures of Dirichlet ... V\in C^ 2(M) such that \mu(dx):=e^{V(x)} d x is a probability measure, ...

Cited by 11 Related articles All 6 versions

Cover Image PEER-REVIEW OPEN ACCESS

Wasserstein distance-based distributionally robust optimal scheduling in rural microgrid considering the coordinated interaction among...

by Chen, Changming; Xing, Jianxu; Li, Qinchao ; More...

Energy reports, 06/2021

The microgrid (MG) is an effective way to alleviate the impact of the large-scale penetration of distributed generations. Due to the seasonal characteristics...

Journal ArticleCitation Online

 [HTML] Wasserstein distance-based distributionally robust optimal scheduling in rural microgrid considering the coordinated interaction among source-grid-load-storage

C Chen, J Xing, Q Li, S Liu, J Ma, J Chen, L Han, W Qiu… - Energy Reports, 2021 - Elsevier

The microgrid (MG) is an effective way to alleviate the impact of the large-scale penetration

of distributed generations. Due to the seasonal characteristics of rural areas, the load curve

of the rural MG is different from the urban MG. Besides, the economy and stability of MG's

scheduling may be impacted due to the uncertainty of the distributed generations' output. To

adapt the seasonal characteristics of the rural microgrid, a Wasserstein distance-based

distributionally robust optimal scheduling model of rural microgrid considering the …

  All 3 versions


2021


online OPEN ACCESS

Costs and Benefits of Wasserstein Fair Regression

by Zhao, Han

06/2021

Real-world applications of machine learning tools in high-stakes domains are often regulated to be fair, in the sense that the predicted target should satisfy...

Journal ArticleFull Text Online

 Costs and Benefits of Wasserstein Fair Regression

H Zhao - arXiv preprint arXiv:2106.08812, 2021 - arxiv.org

Real-world applications of machine learning tools in high-stakes domains are often

regulated to be fair, in the sense that the predicted target should satisfy some quantitative

notion of parity with respect to a protected attribute. However, the exact tradeoff between

fairness and accuracy with a real-valued target is not clear. In this paper, we characterize the

inherent tradeoff between statistical parity and accuracy in the regression setting by

providing a lower bound on the error of any fair regressor. Our lower bound is sharp …

  All 2 versions 


online OPEN ACCESS

Distributionally Robust Prescriptive Analytics with Wasserstein Distance

by Wang, Tianyu; Chen, Ningyuan; Wang, Chun

06/2021

In prescriptive analytics, the decision-maker observes historical samples of $(X, Y)$, where $Y$ is the uncertain problem parameter and $X$ is the concurrent...

Journal ArticleFull Text Online

Distributionally Robust Prescriptive Analytics with Wasserstein ...

https://www.catalyzex.com › paper › arxiv:2106

Jun 10, 2021 — Distributionally Robust Prescriptive Analytics with Wasserstein Distance. Click To Get Model/Code. In prescriptive analytics, the decision-maker ...

 

online OPEN ACCESS

Non-asymptotic convergence bounds for Wasserstein approximation using point clouds

by Merigot, Quentin; Santambrogio, Filippo; Sarrazin, Clément

06/2021

Several issues in machine learning and inverse problems require to generate discrete data, as if sampled from a model probability distribution. A common way to...

Journal ArticleFull Text Online

 Non-asymptotic convergence bounds for Wasserstein approximation using point clouds

Q Merigot, F Santambrogio, C Sarrazin - arXiv preprint arXiv:2106.07911, 2021 - arxiv.org

Several issues in machine learning and inverse problems require to generate discrete data,

as if sampled from a model probability distribution. A common way to do so relies on the

construction of a uniform probability distribution over a set of $ N $ points which minimizes

the Wasserstein distance to the model distribution. This minimization problem, where the

unknowns are the positions of the atoms, is non-convex. Yet, in most cases, a suitably

Cited by 9 Related articles All 7 versions
Non-asymptotic convergence bounds for Wasserstein ...

slideslive.com › nonasymptotic-convergence-bounds-for-...

Non-asymptotic convergence bounds for Wasserstein approximation using point clouds. Dec 6, 2021 ... the Wasserstein distance to the model distribution.

SlidesLive · 

Dec 6, 2021



online OPEN ACCESS

Balanced Coarsening for Multilevel Hypergraph Partitioning via Wasserstein Discrepancy

by Guo, Zhicheng; Zhao, Jiaxuan; Jiao, Licheng ; More...

06/2021

We propose a balanced coarsening scheme for multilevel hypergraph partitioning. In addition, an initial partitioning algorithm is designed to improve the...

Journal ArticleFull Text Online

Balanced Coarsening for Multilevel Hypergraph Partitioning ...

https://deepai.org › publication › balanced-coarsening-f...

Balanced Coarsening for Multilevel Hypergraph Partitioning via Wasserstein Discrepancy. 06/14/2021 by Zhicheng Guo, et al. 0 share. We propose a  ...

online  OPEN ACCESS

A Wasserstein Minimax Framework for…

by Diamandis, Theo; Eldar, Yonina C; Fallah, Alireza ; More...

06/2021

Multi-modal distributions are commonly used to model clustered data in statistical learning tasks. In this paper, we consider the Mixed Linear Regression (MLR)...

Journal ArticleFull Text Online

 A Wasserstein Minimax Framework for Mixed Linear Regression

T Diamandis, YC Eldar, A Fallah, F Farnia… - arXiv preprint arXiv …, 2021 - arxiv.org

Multi-modal distributions are commonly used to model clustered data in statistical learning

tasks. In this paper, we consider the Mixed Linear Regression (MLR) problem. We propose

an optimal transport-based framework for MLR problems, Wasserstein Mixed Linear

Regression (WMLR), which minimizes the Wasserstein distance between the learned and

target mixture regression models. Through a model-based duality analysis, WMLR reduces

the underlying MLR task to a nonconvex-concave minimax optimization problem, which can …

Cited by 1 Related articles All 6 versions

<——2021———2021———930——


online  OPEN ACCESS

Averaging on the Bures-Wasserstein manifold: dimension-free convergence of gradient descent

by Altschuler, Jason M; Chewi, Sinho; Gerber, Patrik ; More...

06/2021

We study first-order optimization algorithms for computing the barycenter of Gaussian distributions with respect to the optimal transport metric. Although the...

Journal ArticleFull Text Online

Averaging on the Bures-Wasserstein manifold: dimension-free ...

https://arxiv.org › math

by JM Altschuler · 2021 — In this work, we prove new geodesic convexity results which provide stronger control of the iterates, yielding a dimension-free convergence rate ...
Cited by 8
Related articles All 7 versions
Averaging on the Bures-Wasserstein manifold: dimension-free ...

slideslive.com › averaging-on-the-bureswasserstein-manif...

... barycenter of Gaussian distributions with respect to the optimal transport metric. Although the objective is geodesically non-convex,...

SlidesLive · 

Dec 6, 2021


online OPEN ACCESS

Distributionally robust tail bounds based on Wasserstein distance and $f$-divergence

by Birghila, Corina; Aigner, Maximilian; Engelke, Sebastian

06/2021

In this work, we provide robust bounds on the tail probabilities and the tail index of heavy-tailed distributions in the context of model misspecification....

Journal ArticleFull Text Online

Distributionally robust tail bounds based on Wasserstein distance and $f$-divergence

C Birghila, M Aigner, S Engelke - arXiv preprint arXiv:2106.06266, 2021 - arxiv.org

In this work, we provide robust bounds on the tail probabilities and the tail index of heavy-

tailed distributions in the context of model misspecification. They are defined as the optimal

value when computing the worst-case tail behavior over all models within some

neighborhood of the reference model. The choice of the discrepancy between the models

used to build this neighborhood plays a crucial role in assessing the size of the asymptotic

bounds. We evaluate the robust tail behavior in ambiguity sets based on the Wasserstein …

  All 2 versions 


2021  [PDF] arxiv.org

Costs and Benefits of Wasserstein Fair Regression

H Zhao - arXiv preprint arXiv:2106.08812, 2021 - arxiv.org

Real-world applications of machine learning tools in high-stakes domains are often

regulated to be fair, in the sense that the predicted target should satisfy some quantitative

notion of parity with respect to a protected attribute. However, the exact tradeoff between …

  All 2 versions 


2021  [PDF] arxiv.org

Averaging on the Bures-Wasserstein manifold: dimension-free convergence of gradient descent

JM Altschuler, S Chewi, P Gerber… - arXiv preprint arXiv …, 2021 - arxiv.org

We study first-order optimization algorithms for computing the barycenter of Gaussian

distributions with respect to the optimal transport metric. Although the objective is

geodesically non-convex, Riemannian GD empirically converges rapidly, in fact faster than …

  All 2 versions 


2021  [PDF] arxiv.org

Balanced Coarsening for Multilevel Hypergraph Partitioning via Wasserstein Discrepancy

Z Guo, J Zhao, L Jiao, X Liu - arXiv preprint arXiv:2106.07501, 2021 - arxiv.org

We propose a balanced coarsening scheme for multilevel hypergraph partitioning. In

addition, an initial partitioning algorithm is designed to improve the quality of k-way

hypergraph partitioning. By assigning vertex weights through the LPT algorithm, we …

  All 2 versions 


2021


[HTML] sciencedirect.com

[HTML] Wasserstein distance-based distributionally robust optimal scheduling in rural microgrid considering the coordinated interaction among source-grid-load …

C Chen, J Xing, Q Li, S Liu, J Ma, J Chen, L Han, W Qiu… - Energy Reports, 2021 - Elsevier

The microgrid (MG) is an effective way to alleviate the impact of the large-scale penetration

of distributed generations. Due to the seasonal characteristics of rural areas, the load curve

of the rural MG is different from the urban MG. Besides, the economy and stability of MG's …

  All 3 versions


 2021  [PDF] arxiv.org

Non-asymptotic convergence bounds for Wasserstein approximation using point clouds

Q Merigot, F Santambrogio, C Sarrazin - arXiv preprint arXiv:2106.07911, 2021 - arxiv.org

Several issues in machine learning and inverse problems require to generate discrete data,

as if sampled from a model probability distribution. A common way to do so relies on the

construction of a uniform probability distribution over a set of $ N $ points which minimizes …

  All 4 versions 


2021

Wasserstein distance based Asymmetric Adversarial Domain Adaptation in intelligent bearing fault diagnosis

Y Ying, Z Jun, T Tang, W Jingwei, C Ming… - Measurement …, 2021 - iopscience.iop.org

Addressing the phenomenon of data sparsity in hostile working conditions, which leads to

performance degradation in traditional machine learning based fault diagnosis methods, a

novel Wasserstein distance based Asymmetric Adversarial Domain Adaptation (WAADA) is …

2021  [PDF] arxiv.org

Distributionally Robust Prescriptive Analytics with Wasserstein Distance

T Wang, N Chen, C Wang - arXiv preprint arXiv:2106.05724, 2021 - arxiv.org

In prescriptive analytics, the decision-maker observes historical samples of $(X, Y) $, where

$ Y $ is the uncertain problem parameter and $ X $ is the concurrent covariate, without

knowing the joint distribution. Given an additional covariate observation $ x $, the goal is to …

  All 2 versions 


2021

Transfer learning method for bearing fault diagnosis based on fully convolutional conditional Wasserstein adversarial Networks

YZ Liu, KM Shi, ZX Li, GF Ding, YS Zou - Measurement, 2021 - Elsevier

The diagnostic accuracy of existing transfer learning-based bearing fault diagnosis methods

is high in the source condition, but accuracy in the target condition is not guaranteed. These

methods mainly focus on the whole distribution of bearing source domain data and target …

  All 2 versions

<——2021———2021———940——


2021

Data augmentation for rolling bearing fault diagnosis using an enhanced few-shot Wasserstein auto-encoder with meta-learning

Z Pei, H Jiang, X Li, J Zhang, S Liu - Measurement Science and …, 2021 - iopscience.iop.org

Despite the advance of intelligent fault diagnosis for rolling bearings, in industries, data-

driven methods still suffer from data acquisition and imbalance. We propose an enhanced

few-shot Wasserstein auto-encoder (fs-WAE) to reverse the negative effect of imbalance …

Related articles

[PDF] github.io

[PDF] IFT 6756-Lecture 11 (Wasserstein Generative Adversarial Nets)

G Gidel - gauthiergidel.github.io

… Whereas, Wasserstein distance captures how close θ is to 0 and we get useful gradients

almost everywhere (except when θ = 0) as Wasserstein measure cannot saturate and …

 Related articles All 2 versions 


year 2021

[PDF] aconf.org

[PDF] LBWGAN: Label Based Shape Synthesis From Text With WGANs

B Li, Y Yu, Y Li - file.aconf.org

In this work, we purpose a novel method of voxel-based shape synthesis, which can build a

connection between the natural language text and the color shapes. The state-of-the-art

method use Generative Adversarial Networks (GANs) to achieve this task and some …


Wasserstein GANs Work Because They Fail (to Approximate the Wasserstein Distance)

J Stanczuk, C Etmann, LM Kreusser… - arXiv preprint arXiv …, 2021 - arxiv.org

Wasserstein GANs are based on the idea of minimising the Wasserstein distance between a

real and a generated distribution. We provide an in-depth mathematical analysis of

differences between the theoretical setup and the reality of training Wasserstein GANs. In …

  Cited by 2 Related articles All 3 versions 
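For context (the classical Kantorovich-Rubinstein duality on which the WGAN objective is built, not a claim of the paper above):

$$ W_1(\mathbb{P}_r,\mathbb{P}_g) \;=\; \sup_{\|f\|_{\mathrm{Lip}}\le 1} \; \mathbb{E}_{x\sim\mathbb{P}_r}[f(x)] - \mathbb{E}_{x\sim\mathbb{P}_g}[f(x)], $$

where the critic plays the role of $f$ and, in training, is only approximately 1-Lipschitz; that gap between the theoretical setup and practice is what this paper analyzes.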


arXiv:2106.15427  [pdf, other]  stat.ML  cs.LG
Fast Approximation of the Sliced-Wasserstein Distance Using Concentration of Random Projections
Authors: Kimia Nadjahi, Alain Durmus, Pierre E. Jacob, Roland Badeau, Umut Şimşekli
Abstract: The Sliced-Wasserstein distance (SW) is being increasingly used in machine learning applications as an alternative to the Wasserstein distance and offers significant computational and statistical benefits. Since it is defined as an expectation over random projections, SW is commonly approximated by Monte Carlo. We adopt a new perspective to approximate SW by making use of the concentration of meas… 
More
Submitted 29 June, 2021; originally announced June 2021.
Cited by 10
 Related articles All 18 versions 
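A hedged sketch of the plain Monte Carlo estimator that this entry improves upon (the function name and the use of scipy.stats.wasserstein_distance are my assumptions, not the authors' code):

```python
# Monte Carlo sliced-Wasserstein: average 1-D Wasserstein distances over random projections.
import numpy as np
from scipy.stats import wasserstein_distance

def sliced_wasserstein_mc(X, Y, n_projections=100, seed=None):
    """X: (n, d) and Y: (m, d) sample arrays; returns an SW_1 estimate."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_projections):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)          # uniform direction on the unit sphere
        total += wasserstein_distance(X @ theta, Y @ theta)
    return total / n_projections
```

The paper adopts a different perspective, using concentration of measure instead of relying solely on this kind of random-projection averaging.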


arXiv:2106.15341  [pdf, other]  cs.CV  cs.LG  eess.IV

Image Inpainting Using Wasserstein Generative Adversarial Imputation Network
Authors: Daniel Vašata, Tomáš Halama, Magda Friedjungová
Abstract: Image inpainting is one of the important tasks in computer vision which focuses on the reconstruction of missing regions in an image. The aim of this paper is to introduce an image inpainting model based on Wasserstein Generative Adversarial Imputation Network. The generator network of the model uses building blocks of convolutional layers with different dilation rates, together with skip connecti… 
More
Submitted 23 June, 2021; originally announced June 2021.
Comments: To be publ

 All 5 versions

Image Inpainting Using Wasserstein Generative Adversarial Imputation Network

Vasata, D; Halama, T and Friedjungova, M

30th International Conference on Artificial Neural Networks (ICANN)

2021 | 

ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2021, PT II

 12892 , pp.575-586

Enriched Cited References

Image inpainting is one of the important tasks in computer vision which focuses on the reconstruction of missing regions in an image. The aim of this paper is to introduce an image inpainting model based on Wasserstein Generative Adversarial Imputation Network. The generator network of the model uses building blocks of convolutional layers with different dilation rates, together with skip conne


2 Citations  26 References  Related records


arXiv:2107.06008  [pdf, other]  stat.ML  cs.LG
Wasserstein GAN: Deep Generation applied on Bitcoins financial time series
Authors: Rikli Samuel, Bigler Daniel Nico, Pfenninger Moritz, Osterrieder Joerg
Abstract: Modeling financial time series is challenging due to their high volatility and unexpected happenings on the market. Most financial models and algorithms trying to fill the lack of historical financial time series struggle to perform and are highly vulnerable to overfitting. As an alternative, we introduce in this paper a deep neural network called the WGAN-GP, a data-driven model that focuses on s… 
More
Submitted 13 July, 2021; originally announced July 2021.
All 2 versions 


arXiv:2107.05766  [pdf, other]  math.ST  stat.ME  stat.ML

Likelihood estimation of sparse topic distributions in topic models and its applications to Wasserstein document distance calculations
Authors: Xin Bing, Florentina Bunea, Seth Strimas-Mackey, Marten Wegkamp
Abstract: This paper studies the estimation of high-dimensional, discrete, possibly sparse, mixture models in topic models. The data consists of observed multinomial counts of p words across n independent documents. In topic models, the p×n expected word frequency matrix is assumed to be factorized as a p×K word-topic matrix A and a K×n topic-document matrix T. Since columns…
More
Submitted 12 July, 2021; originally announced July 2021.
Cited by 2
Related articles All 2 versions

arXiv:2107.05680  [pdf, other]  cs.LG  cs.CV  eess.IV  math.OC  stat.ML
Hidden Convexity of Wasserstein GANs: Interpretable Generative Models with Closed-Form Solutions
Authors: Arda Sahiner, Tolga Ergen, Batu Ozturkler, Burak Bartan, John Pauly, Morteza Mardani, Mert Pilanci
Abstract: Generative Adversarial Networks (GANs) are commonly used for modeling complex distributions of data. Both the generators and discriminators of GANs are often modeled by neural networks, posing a non-transparent optimization problem which is non-convex and non-concave over the generator and discriminator, respectively. Such networks are often heuristically optimized with gradient descent-ascent (GD…  More
Submitted 12 July, 2021; originally announced July 2021.
Comments: First two authors contributed equally to this work; 30 pages, 11 figures

Cited by 12 Related articles All 3 versions 
<——2021———2021———950——


arXiv:2107.02555  [pdf, other]  eess.IV  cs.CV
A Theory of the Distortion-Perception Tradeoff in Wasserstein Space
Authors: Dror Freirich, Tomer Michaeli, Ron Meir
Abstract: The lower the distortion of an estimator, the more the distribution of its outputs generally deviates from the distribution of the signals it attempts to estimate. This phenomenon, known as the perception-distortion tradeoff, has captured significant attention in image restoration, where it implies that fidelity to ground truth images comes at the expense of perceptual quality (deviation from stat… 
More
Submitted 6 July, 2021; originally announced July 2021.
online  OPEN ACCESS

A Theory of the Distortion-Perception Tradeoff in Wasserstein Space

by Freirich, Dror; Michaeli, Tomer; Meir, Ron

07/2021

The lower the distortion of an estimator, the more the distribution of its outputs generally deviates from the distribution of the signals it attempts to...

Journal ArticleFull Text Online

Cited by 3 Related articles All 4 versions 

arXiv:2107.01848  [pdf, other]  cs.LG  stat.ML
Differentially Private Sliced Wasserstein Distance
Authors: Alain Rakotomamonjy, Liva Ralaivola
Abstract: Developing machine learning methods that are privacy preserving is today a central topic of research, with huge practical impacts. Among the numerous ways to address privacy-preserving learning, we here take the perspective of computing the divergences between distributions under the Differential Privacy (DP) framework -- being able to compute divergences between distributions is pivotal for many… 
More
Submitted 5 July, 2021; originally announced July 2021.
Journal ref: International Conference of Machine Learning, Jul 2021, Virtual, France

Differentially Private Sliced Wasserstein Distance
Rakotomamonjy, Alain; Ralaivola, Liva. arXiv.org; Ithaca, Jul 5, 2021.


Cited by 5 Related articles All 25 versions 

arXiv:2107.01323  [pdf, other]  stat.ML  cs.LG
Minimum Wasserstein Distance Estimator under Finite Location-scale Mixtures
Authors: Qiong Zhang, Jiahua Chen
Abstract: When a population exhibits heterogeneity, we often model it via a finite mixture: decompose it into several different but homogeneous subpopulations. Contemporary practice favors learning the mixtures by maximizing the likelihood for statistical efficiency and the convenient EM-algorithm for numerical computation. Yet the maximum likelihood estimate (MLE) is not well defined for the most widely us…  More
Submitted 2 July, 2021; originally announced July 2021.

All 2 versions 


Biau, Gérard; Sangnier, Maxime; Tanielian, Ugo

Some theoretical insights into Wasserstein GANs. (English) Zbl 07370636

J. Mach. Learn. Res. 22, Paper No. 119, 45 p. (2021).

MSC:  68T05


Zbl 07370636

 Cited by 15 Related articles All 27 versions


Jacobs, MattLee, WonjunLéger, Flavien

The back-and-forth method for Wasserstein gradient flows. (English) Zbl 07369242

ESAIM, Control Optim. Calc. Var. 27, Paper No. 28, 35 p. (2021).

MSC:  65K10 65N99 49N15 90C46


steps in an appropriately weighted Sobolev space. The Sobolev control allows us to …
Cited by 9 Related articles All 3 versions

2021

Liu, Wei; Yang, Li; Yu, Bo

Wasserstein distributionally robust option pricing. (English) Zbl 07366215

J. Math. Res. Appl. 41, No. 1, 99-110 (2021).

MSC:  91G20 90C17



  MR4278784 Prelim Kroshnin, Alexey; Spokoiny, Vladimir; Suvorikova, Alexandra; 

Statistical inference for Bures–Wasserstein barycenters. Ann. Appl. Probab. 31 (2021), no. 3, 1264–1298.


Cited by 27 Related articles All 8 versions

2021  see 2020

MR4254137 Prelim Carrillo, Jose A.; Matthes, Daniel; Wolfram, Marie-Therese; 

Lagrangian schemes for Wasserstein gradient flows. Geometric partial differential equations. Part II, 271–312, Handb. Numer. Anal., 22, Elsevier/North-Holland, Amsterdam, [2021], ©2021. 65M60 (35Q84 49Q20)

Cited by 5
 Related articles All 3 versions

Cited by 9 Related articles All 7 versions

Coverless Information Hiding Based on WGAN-GP Model

X Duan, B Li, D Guo, K Jia, E Zhang… - International Journal of …, 2021 - igi-global.com

Steganalysis technology judges whether there is secret information in the carrier by

monitoring the abnormality of the carrier data, so the traditional information hiding

technology has reached the bottleneck. Therefore, this paper proposed the coverless

information hiding based on the improved training of Wasserstein GANs (WGAN-GP) model.

The sender trains the WGAN-GP with a natural image and a secret image. The generated

image and secret image are visually identical, and the parameters of generator are saved to …

  Related articles All 2 versions

PEER-REVIEW

Coverless Information Hiding Based on WGAN-GP Model

by Duan, Xintao; Li, Baoxia; Guo, Daidou ; More...

International journal of digital crime and forensics, 07/2021, Volume 13, Issue 4

Steganalysis technology judges whether there is secret information in the carrier by monitoring the abnormality of the carrier data, so the traditional...


Journal ArticleCitation Online
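A hedged PyTorch sketch of the WGAN-GP critic loss that the entry above builds on (the standard formulation, not the authors' implementation; the tensor shapes and penalty weight are assumptions):

```python
# WGAN-GP: Wasserstein critic loss with a gradient penalty enforcing an approximately 1-Lipschitz critic.
import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    # Random interpolation between real and generated samples.
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    interp = (eps * real + (1 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads = torch.autograd.grad(outputs=scores, inputs=interp,
                                grad_outputs=torch.ones_like(scores),
                                create_graph=True)[0]
    # Penalize deviation of the gradient norm from 1 along the interpolated samples.
    return lambda_gp * ((grads.view(grads.size(0), -1).norm(2, dim=1) - 1) ** 2).mean()

def critic_loss(critic, real, fake):
    # Wasserstein critic objective: E[D(fake)] - E[D(real)], plus the penalty term.
    return critic(fake).mean() - critic(real).mean() + gradient_penalty(critic, real, fake)
```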

 

Distributionally Robust Resilient Operation of Integrated Energy Distribution Systems Using Moment and Wasserstein Metric for Contingencies

M Shahidehpour, Y Zhou, Z Wei… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org

Extreme weather events pose a serious threat to energy distribution systems. We propose a

distributionally robust optimization model for the resilient operation of the integrated

electricity and heat energy distribution systems in extreme weather events. We develop a

strengthened ambiguity set that incorporates both moment and Wasserstein metric

information of uncertain contingencies, which provides a more accurate characterization of

the true probability distribution. We first recast the proposed model into an equivalent …

  Related articles

online Cover Image

Distributionally Robust Resilient Operation of Integrated Energy Systems Using Moment and Wasserstein Metric for Contingencies

by Zhou, Yizhou; Wei, Zhinong; Shahidehpour, Mohammad ; More...

IEEE transactions on power systems, 07/2021, Volume 36, Issue 4

Extreme weather events pose a serious threat to energy distribution systems. We propose a distributionally robust optimization model for the resilient...

Journal ArticleFull Text Online

<——2021———2021———960——


[HTML] Ensemble Riemannian data assimilation over the Wasserstein space

SK Tamang, A Ebtehaj, PJ Van Leeuwen… - Nonlinear Processes …, 2021 - npg.copernicus.org

In this paper, we present an ensemble data assimilation paradigm over a Riemannian

manifold equipped with the Wasserstein metric. Unlike the Euclidean distance used in

classic data assimilation methodologies, the Wasserstein metric can capture the translation

and difference between the shapes of square-integrable probability distributions of the

background state and observations. This enables us to formally penalize geophysical biases

in state space with non-Gaussian distributions. The new approach is applied to dissipative …

  Related articles All 7 versions 

2021 online Cover Image  PEER-REVIEW

Ensemble Riemannian data assimilation over the Wasserstein space

by Tamang, Sagar K; Ebtehaj, Ardeshir; van Leeuwen, Peter J ; More...

Nonlinear processes in geophysics, 07/2021, Volume 28, Issue 3

Journal ArticleFull Text Online

 

2021

Conditional Wasserstein GAN-based oversampling of tabular data for imbalanced learning

J Engelmann, S Lessmann - Expert Systems with Applications, 2021 - Elsevier

Class imbalance impedes the predictive performance of classification models. Popular

countermeasures include oversampling minority class cases by creating synthetic examples.

The paper examines the potential of Generative Adversarial Networks (GANs) for

oversampling. A few prior studies have used GANs for this purpose but do not reflect recent

methodological advancements for generating tabular data using GANs. The paper proposes

an approach based on a conditional Wasserstein GAN that can effectively model tabular …

Cited by 65 Related articles All 7 versions

2021 online Cover Image  PEER-REVIEW

Conditional Wasserstein GAN-based oversampling of tabular data for imbalanced learning

by Engelmann, Justin; Lessmann, Stefan

Expert systems with applications, 07/2021, Volume 174

•We design a tabular data GAN for oversampling that can handle categorical variables.•We assess our GAN in a credit scoring setting using multiple real-world...

Journal ArticleFull Text Online

  

Low-Dose CT Denoising Using A Progressive Wasserstein Generative Adversarial Network

G Wang, X Hu - Computers in Biology and Medicine, 2021 - Elsevier

Low-dose computed tomography (LDCT) imaging can greatly reduce the radiation dose

imposed on the patient. However, image noise and visual artifacts are inevitable when the

radiation dose is low, which has serious impact on the clinical medical diagnosis. Hence, it

is important to address the problem of LDCT denoising. Image denoising technology based

on Generative Adversarial Network (GAN) has shown promising results in LDCT denoising.

Unfortunately, the structures and the corresponding learning algorithms are becoming more …

  All 3 versions

online Cover Image  PEER-REVIEW

Low-Dose CT Denoising Using A Progressive Wasserstein Generative Adversarial Network

by Wang, Guan; Hu, Xueli

Computers in biology and medicine, 07/2021

Low-dose computed tomography (LDCT) imaging can greatly reduce the radiation dose imposed on the patient. However, image noise and visual artifacts are...

Journal ArticleFull Text Online

 Cited by 5 Related articles All 5 versions


Simulation of broad-band ground motions with consistent long ...

https://academic.oup.com › gji › article

by T Okazaki · 2021 — Fourier analysis, Time-series analysis, Earthquake ground motions, ... It solves wave equations by modelling the earthquake rupture ... BB: broad-band; LP: long period; SP: short period; NN: neural network. ... These studies applied the Wasserstein distance to the oscillating time history of the waveforms.

online Cover Image  PEER-REVIEW

Simulation of broad-band ground motions with consistent long-period and short-period components using the Wasserstein interpolation of...

by Okazaki, Tomohisa; Hachiya, Hirotaka; Iwaki, Asako ; More...

Geophysical journal international, 07/2021, Volume 227, Issue 1

SUMMARY Practical hybrid approaches for the simulation of broad-band ground motions often combine long-period and short-period waveforms synthesized by...

Journal ArticleFull Text Online

 

2021

Wasserstein Adversarial Regularization for learning with label ...

https://ieeexplore.ieee.org › document

by K Fatras · 2021 — Noisy labels often occur in vision datasets, especially when they are obtained from crowdsourcing or Web scraping. We propose a new ...

online Cover Image

Wasserstein Adversarial Regularization for learning with label noise

IEEE transactions on pattern analysis and machine intelligence, 07/2021, Volume PP

Journal ArticleFull Text Online

Cited by 8 Related articles All 10 versions

2021

Differentially Private Sliced Wasserstein Distance

A Rakotomamonjy, R Liva - International Conference on …, 2021 - proceedings.mlr.press

Developing machine learning methods that are privacy preserving is today a central topic of

research, with huge practical impacts. Among the numerous ways to address privacy-

preserving learning, we here take the perspective of computing the divergences between

distributions under the Differential Privacy (DP) framework—being able to compute

divergences between distributions is pivotal for many machine learning problems, such as

learning generative models or domain adaptation problems. Instead of resorting to the …

  All 9 versions 

online  OPEN ACCESS

Differentially Private Sliced Wasserstein Distance

by Rakotomamonjy, Alain; Ralaivola, Liva

07/2021

International Conference of Machine Learning, Jul 2021, Virtual, France Developing machine learning methods that are privacy preserving is today a central...

Journal ArticleFull Text Online

 Cited by 2 Related articles All 25 versions

  Differentially Private Sliced Wasserstein Distance · SlidesLive

slideslive.com › differentially-private-sliced-wasserstein-di...

... speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world.

SlidesLive · 

Jul 19, 2021


Minimum Wasserstein Distance Estimator under Finite Location-scale Mixtures

Q ZhangJ Chen - arXiv preprint arXiv:2107.01323, 2021 - arxiv.org

When a population exhibits heterogeneity, we often model it via a finite mixture: decompose

it into several different but homogeneous subpopulations. Contemporary practice favors

learning the mixtures by maximizing the likelihood for statistical efficiency and the

convenient EM-algorithm for numerical computation. Yet the maximum likelihood estimate

(MLE) is not well defined for the most widely used finite normal mixture in particular and for

finite location-scale mixture in general. We hence investigate feasible alternatives to MLE …

  All 2 versions 

online  OPEN ACCESS

Minimum Wasserstein Distance Estimator under Finite Location-scale Mixtures

by Zhang, Qiong; Chen, Jiahua

07/2021

When a population exhibits heterogeneity, we often model it via a finite mixture: decompose it into several different but homogeneous subpopulations....

Journal ArticleFull Text Online


2021

Conditional Wasserstein GAN-based oversampling of tabular data for imbalanced learning

J Engelmann, S Lessmann - Expert Systems with Applications, 2021 - Elsevier

Class imbalance impedes the predictive performance of classification models. Popular

countermeasures include oversampling minority class cases by creating synthetic examples.

The paper examines the potential of Generative Adversarial Networks (GANs) for …

  Cited by 2 Related articles All 2 versions

2021  [PDF] mdpi.com

Unpaired image denoising via wasserstein GAN in low-dose CT image with multi-perceptual loss and fidelity loss

Z Yin, K Xia, Z He, J Zhang, S Wang, B Zu - Symmetry, 2021 - mdpi.com

The use of low-dose computed tomography (LDCT) in medical practice can effectively

reduce the radiation risk of patients, but it may increase noise and artefacts, which can

compromise diagnostic information. The methods based on deep learning can effectively …

  Cited by 2 Related articles All 3 versions 

[PDF] archive.org

Unpaired Image Denoising via Wasserstein GAN in Low-Dose CT Image with Multi-Perceptual Loss and Fidelity Loss. Symmetry 2021, 13, 126

Z Yin, K Xia, Z He, J Zhang, S Wang, B Zu - 2021 - search.proquest.com

The use of low-dose computed tomography (LDCT) in medical practice can effectively

reduce the radiation risk of patients, but it may increase noise and artefacts, which can

compromise diagnostic information. The methods based on deep learning can effectively …

  Related articles


2021

Low-Dose CT Denoising Using A Progressive Wasserstein Generative Adversarial Network

G Wang, X Hu - Computers in Biology and Medicine, 2021 - Elsevier

Low-dose computed tomography (LDCT) imaging can greatly reduce the radiation dose

imposed on the patient. However, image noise and visual artifacts are inevitable when the

radiation dose is low, which has serious impact on the clinical medical diagnosis. Hence, it …

  All 3 versions

<——2021———2021———970——

 2021

Peacock geodesics in Wasserstein space

H Wu, X Cui - Differential Geometry and its Applications, 2021 - Elsevier

Martingale optimal transport has attracted much attention due to its application in pricing and

hedging in mathematical finance. The essential notion which makes martingale optimal

transport different from optimal transport is peacock. A peacock is a sequence of measures …

  Related articles All 2 versions


2021  [HTML] copernicus.org

[HTML] Ensemble Riemannian data assimilation over the Wasserstein space

SK Tamang, A Ebtehaj, PJ Van Leeuwen… - Nonlinear Processes …, 2021 - npg.copernicus.org

In this paper, we present an ensemble data assimilation paradigm over a Riemannian

manifold equipped with the Wasserstein metric. Unlike the Euclidean distance used in

classic data assimilation methodologies, the Wasserstein metric can capture the translation …

  Related articles All 7 versions 


2021  [PDF] arxiv.org

Wasserstein distance, Fourier series and applications

S Steinerberger - Monatshefte für Mathematik, 2021 - Springer

We study the Wasserstein metric\(W_p\), a notion of distance between two probability

distributions, from the perspective of Fourier Analysis and discuss applications. In particular,

we bound the Earth Mover Distance $W_1$ between the distribution of quadratic residues in …

  9 Related articles All 3 versions
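
Several entries in this list work with the one-dimensional distances W_p discussed in the paper above. As a reference point, a minimal sketch of estimating W_1 (the Earth Mover Distance) between two empirical samples, assuming only NumPy and SciPy; the samples are illustrative placeholders.

import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=1000)       # sample from N(0, 1)
y = rng.normal(0.5, 1.2, size=1000)       # sample from N(0.5, 1.2^2)
print(wasserstein_distance(x, y))         # approximates W_1 between the two laws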

2021  [PDF] jmlr.org

[PDF] Some theoretical insights into Wasserstein GANs

G Biau, M Sangnier, U Tanielian - Journal of Machine Learning Research, 2021 - jmlr.org

Abstract Generative Adversarial Networks (GANs) have been successful in producing

outstanding results in areas as diverse as image, video, and text generation. Building on

these successes, a large number of empirical studies have validated the benefits of the …

  Cited by 5 Related articles All 23 versions 

2021  [PDF] arxiv.org

Wasserstein autoregressive models for density time series

C Zhang, P Kokoszka… - Journal of Time Series …, 2021 - Wiley Online Library

Data consisting of time‐indexed distributions of cross‐sectional or intraday returns have

been extensively studied in finance, and provide one example in which the data atoms

consist of serially dependent probability distributions. Motivated by such data, we propose …

  Cited by 5 Related articles All 5 versions

2021

[PDF] mlr.press

Differentially Private Sliced Wasserstein Distance

A Rakotomamonjy, R Liva - International Conference on …, 2021 - proceedings.mlr.press

Developing machine learning methods that are privacy preserving is today a central topic of

research, with huge practical impacts. Among the numerous ways to address privacy-

preserving learning, we here take the perspective of computing the divergences between …

  All 9 versions 
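
As a companion to the entry above, a minimal sketch of a (non-private) sliced Wasserstein distance, which averages one-dimensional Wasserstein distances of the data projected on random directions; NumPy and SciPy are assumed, the W_1-based variant is used, and the number of projections is an illustrative choice rather than the paper's mechanism.

import numpy as np
from scipy.stats import wasserstein_distance

def sliced_wasserstein(X, Y, n_projections=100, seed=0):
    """Average 1-D Wasserstein distances of X and Y projected on random unit directions."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_projections):
        theta = rng.standard_normal(d)
        theta /= np.linalg.norm(theta)              # random unit direction
        total += wasserstein_distance(X @ theta, Y @ theta)
    return total / n_projections

rng = np.random.default_rng(1)
X = rng.normal(0.0, 1.0, size=(500, 5))
Y = rng.normal(0.5, 1.0, size=(500, 5))
print(sliced_wasserstein(X, Y))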

2021

[PDF] arxiv.org

Time-Series Imputation with Wasserstein Interpolation for Optimal Look-Ahead-Bias and Variance Tradeoff

J Blanchet, F Hernandez, VA Nguyen, M Pelger… - arXiv preprint arXiv …, 2021 - arxiv.org

Missing time-series data is a prevalent practical problem. Imputation methods in time-series

data often are applied to the full panel data with the purpose of training a model for a

downstream out-of-sample task. For example, in finance, imputation of missing returns may …

  Related articles All 2 versions 


2021

Distributionally Robust Resilient Operation of Integrated Energy Distribution Systems Using Moment and Wasserstein Metric for Contingencies

M Shahidehpour, Y Zhou, Z Wei… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org

Extreme weather events pose a serious threat to energy distribution systems. We propose a

distributionally robust optimization model for the resilient operation of the integrated

electricity and heat energy distribution systems in extreme weather events. We develop a …

  Related articles

2021 [PDF] oup.com

Simulation of broadband ground motions with consistent long-period and short-period components using Wasserstein interpolation of acceleration envelopes

T Okazaki, H Hachiya, A Iwaki, T Maeda… - Geophysical Journal …, 2021 - academic.oup.com

Practical hybrid approaches for the simulation of broadband ground motions often combine

long-period and short-period waveforms synthesised by independent methods under

different assumptions for different period ranges, which at times can lead to incompatible …

  Related articles


EMS - European Mathematical Society Publishing House

https://www.ems-ph.org › books › book

An Invitation to Optimal Transport, Wasserstein Distances, and Gradient Flows. August 2021, 144 pages, hardcover, 16.5 x 23.5 cm. This book provides a self-contained introduction to optimal transport, and it is intended as a starting point for any researcher who wants to enter into this beautiful subject.

[CITATION] An Invitation to Optimal Transport, Wasserstein Distances, and Gradient Flows

A Figalli, F Glaudo - 2021 - ems-ph.org

The presentation focuses on the essential topics of the theory: Kantorovich duality, existence

and uniqueness of optimal transport maps, Wasserstein distances, the JKO scheme, Otto's

calculus, and Wasserstein gradient flows. At the end, a presentation of some selected …

  Cited by 2 Related articles

Zbl 07375481 book

Book ReviewFull Text Online

<——2021———2021———980——

 
(PDF) Bayesian inverse problems for functions and ...

https://www.researchgate.net › publication › 231007840_...

Jul 13, 2021 — We show that the abstract theory applies to some concrete applications of interest by studying problems arising from data assimilation in fluid ...

[CITATION] Bayesian inverse problems in the Wasserstein distance and application to conservation laws

S Mishra, D Ochsner, AM Ruf, F Weber - 2021 - preparation

  Cited by 3 Related articles


Applications of Gromov-Wasserstein distance to network science

https://meetings.ams.org › math › meetingapp.cgi › Paper

by T Needham · 2021 — A rich mathematical theory underpins this work: optimal node correspondences realize the Gromov-Wasserstein (GW) distance between networks.

[CITATION] Applications of Gromov-Wasserstein distance to network science

T Needham, S Chowdhury - 2021 Joint Mathematics Meetings …, 2021 - meetings.ams.org


Master Bellman equation in the Wasserstein space: Uniqueness of viscosity solutions
by Cosso, Andrea; Gozzi, Fausto; Kharroubi, Idris; More...
07/2021
We study the Bellman equation in the Wasserstein space arising in the study of mean field control problems, namely stochastic optimal control problems for...
Journal Article  Full Text Online

  arXiv:2107.10535  [pdf, ps, other]  math.AP  math.OC  math.PR
Master Bellman equation in the Wasserstein space: Uniqueness of viscosity solutions
Authors: Andrea Cosso, Fausto Gozzi, Idris Kharroubi, Huyên Pham, Mauro Rosestolato
Abstract: We study the Bellman equation in the Wasserstein space arising in the study of mean field control problems, namely stochastic optimal control problems for McKean-Vlasov diffusion processes. Using the standard notion of viscosity solution à la Crandall-Lions extended to our Wasserstein setting, we prove a comparison result under general conditions, which coupled with the dynamic programming princ…  More
Submitted 22 July, 2021; originally announced July 2021.
 All 18 versions 


Conditional Wasserstein Barycenters and Interpolation/Extrapolation of Distributions
by Fan, Jianing; Müller, Hans-Georg
07/2021
Increasingly complex data analysis tasks motivate the study of the dependency of distributions of multivariate continuous random variables on scalar or vector...
Journal Article  Full Text Online

arXiv:2107.09218  [pdf, other]  stat.ME
Conditional Wasserstein Barycenters and Interpolation/Extrapolation of Distributions
Authors: Jianing Fan, Hans-Georg Müller
Abstract: Increasingly complex data analysis tasks motivate the study of the dependency of distributions of multivariate continuous random variables on scalar or vector predictors. Statistical regression models for distributional responses so far have primarily been investigated for the case of one-dimensional response distributions. We investigate here the case of multivariate response distributions while…  More
Submitted 19 July, 2021; originally announced July 2021.
Comments: 42 pages, 15 figures
 All 2 versions 


2021 see 2022

Wasserstein Adversarial Regularization for learning with label noise.

By: Fatras, Kilian; Damodaran, Bharath Bhushan; Lobry, Sylvain; et al.

IEEE transactions on pattern analysis and machine intelligence  Volume: ‏ PP     Published: ‏ 2021-Jul-07 (Epub 2021 Jul 07)

Cited by 15 Related articles All


2021

 WDA: An Improved Wasserstein Distance-Based Transfer Learning Fault Diagnosis Method

By: Zhu, Zhiyu; Wang, Lanzhi; Peng, Gaoliang; et al.

SENSORS  Volume: ‏ 21   Issue: ‏ 13     Article Number: 4394   Published: ‏ JUL 2021
Cited by 4
 Related articles All 11 versions 

Computer-based method for generating tailored medical recipes for mental health disorders, involves transmitting that recipes, generative adversarial neural network (GAN) comprises wasserstein GAN (WGAN) and self-Attention GAN (SAGAN)

Patent Number: US11049605-B1

Patent Assignee: CORTERY AB

Inventor(s): PETERS F L.

26

Computer-based method for generating tailored medical recipes for mental health disorders, involves transmitting that recipes, generative adversarial neural network (GAN) comprises wasserstein GAN (WGAN) and self-Attention GAN (SAGAN)

US11049605-B1

Inventor(s) PETERS F L

Assignee(s) CORTERY AB

Derwent Primary Accession Number 

2021-72556R

2021 see 2020

STATISTICAL INFERENCE FOR BURES-WASSERSTEIN BARYCENTERS

By: Kroshnin, Alexey; Spokoiny, Vladimir; Suvorikova, Alexandra

ANNALS OF APPLIED PROBABILITY  Volume: ‏ 31   Issue: ‏ 3   Pages: ‏ 1264-1298   Published: ‏ JUN 2021
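
For reference alongside the Bures-Wasserstein entries, the 2-Wasserstein distance between two Gaussian laws N(m1, C1) and N(m2, C2) has a closed form, W2^2 = |m1 - m2|^2 + Tr(C1 + C2 - 2 (C1^{1/2} C2 C1^{1/2})^{1/2}). A minimal sketch assuming NumPy and SciPy; the means and covariances below are placeholders, not data from the paper above.

import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2(m1, C1, m2, C2):
    """Closed-form 2-Wasserstein (Bures-Wasserstein) distance between two Gaussians."""
    root_C1 = sqrtm(C1)
    cross = sqrtm(root_C1 @ C2 @ root_C1)           # (C1^{1/2} C2 C1^{1/2})^{1/2}
    bures = np.trace(C1 + C2 - 2.0 * np.real(cross))
    return np.sqrt(np.sum((m1 - m2) ** 2) + max(bures, 0.0))

m1, C1 = np.zeros(2), np.eye(2)
m2, C2 = np.array([1.0, 0.0]), np.diag([2.0, 0.5])
print(gaussian_w2(m1, C1, m2, C2))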


2021

Symmetric Skip Connection Wasserstein GAN for High-resolution Facial Image Inpainting

By: Jam, Jireh; Kendrick, Connah; Drouard, Vincent; et al.

Conference: 16th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP) / 16th International Conference on Computer Vision Theory and Applications (VISAPP) Location: ‏ ELECTR NETWORK Date: ‏ FEB 08-10, 2021

VISAPP: PROCEEDINGS OF THE 16TH INTERNATIONAL JOINT CONFERENCE ON COMPUTER VISION, IMAGING AND COMPUTER GRAPHICS THEORY AND APPLICATIONS - VOL. 4: VISAPP  Pages: ‏ 35-44   Published: ‏ 2021


A New Data-Driven Distributionally Robust Portfolio Optimization Method Based on Wasserstein Ambiguity Set

By: Du, Ningning; Liu, Yankui; Liu, Ying

IEEE ACCESS  Volume: ‏ 9   Pages: ‏ 3174-3194   Published: ‏ 2021

<——2021———2021———990——



CONFERENCE PROCEEDING

Accelerated WGAN update strategy with loss change rate balancing

Ouyang, Xu ; Chen, Ying ; Agam, Gady2021 IEEE Winter Conference on Applications of Computer Vision (WACV), 2021, p.2545-2554

OPEN ACCESS

Accelerated WGAN update strategy with loss change rate balancing

Available Online 

  Accelerated WGAN update strategy with loss change rate balancing

X Ouyang, Y Chen, G Agam - Proceedings of the IEEE/CVF …, 2021 - openaccess.thecvf.com

Optimizing the discriminator in Generative Adversarial Networks (GANs) to completion in the

inner training loop is computationally prohibitive, and on finite datasets would result in

overfitting. To address this, a common update strategy is to alternate between k optimization …

  Cited by 2 Related articles All 4 versions 

Cited by 3 Related articles All 5 versions 

 [PDF] ieee.org

Multi-Frame Super-Resolution Algorithm Based on a WGAN

K Ning, Z Zhang, K Han, S Han, X Zhang - IEEE Access, 2021 - ieeexplore.ieee.org

Image super-resolution reconstruction has been widely used in remote sensing, medicine

and other fields. In recent years, due to the rise of deep learning research and the successful

application of convolutional neural networks in the image field, the super-resolution …

  Related articles

IEEE access, 2021, Volume 9

Image super-resolution reconstruction has been widely used in remote sensing, medicine and other fields. In recent years, due to the rise

of deep learning...

ArticleView Article PDF

Journal Article  Full Text Online

[PDF] igi-global.com

Coverless Information Hiding Based on WGAN-GP Model

X Duan, B Li, D Guo, K Jia, E Zhang… - International Journal of …, 2021 - igi-global.com

Steganalysis technology judges whether there is secret information in the carrier by

monitoring the abnormality of the carrier data, so the traditional information hiding

technology has reached the bottleneck. Therefore, this paper proposed the coverless …

  Related articles All 2 versions


Synthetic Ride-Requests Generation using WGAN with Location Embeddings

U Nookala, S Ding, E Alareqi… - 2021 Smart City …, 2021 - ieeexplore.ieee.org

Ride-hailing services have gained tremendous importance in social life today, and the

amount of resources involved have been hiking up. Ride-request data has been crucial in

the research of improving ride-hailing efficiency and minimizing the cost. This work aims to …

  Related articles


2021

[PDF] researchsquare.com

[PDF] Inverse Airfoil Design Method for Generating Varieties of Smooth Airfoils Using Conditional WGAN-GP

K Yonekura, N Miyamoto, K Suzuki - 2021 - researchsquare.com

Machine learning models are recently utilized for airfoil shape generation methods. It is

desired to obtain airfoil shapes that satisfies required lift coefficient. Generative adversarial

networks (GAN) output reasonable airfoil shapes. However, shapes obtained from ordinal …

  arXiv:2110.00212  [pdf, other]  cs.LG  cs.CE
Inverse airfoil design method for generating varieties of smooth airfoils using conditional WGAN-gp
Authors: Kazuo Yonekura, Nozomu Miyamoto, Katsuyuki Suzuki
Abstract: Machine learning models are recently utilized for airfoil shape generation methods. It is desired to obtain airfoil shapes that satisfies required lift coefficient. Generative adversarial networks (GAN) output reasonable airfoil shapes. However, shapes obtained from ordinal GAN models are not smooth, and they need smoothing before flow analysis. Therefore, the models need to be coupled with Bezier…  More
Submitted 1 October, 2021; originally announced October 2021.
Cited by 5 Related articles All 7 versions
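
Since WGAN-GP recurs throughout these entries (airfoil design, tabular oversampling, low-dose CT denoising), a minimal sketch of the gradient-penalty term is given below, assuming PyTorch; the toy critic, 2-D data and penalty weight are illustrative stand-ins, not the setup of any particular paper listed here.

import torch
import torch.nn as nn

def gradient_penalty(critic, real, fake):
    """Penalize deviations of the critic's gradient norm from 1 on interpolated samples."""
    batch_size = real.size(0)
    eps = torch.rand(batch_size, 1).expand_as(real)          # per-sample mixing weights
    interpolated = (eps * real + (1 - eps) * fake).requires_grad_(True)
    scores = critic(interpolated)
    grads = torch.autograd.grad(
        outputs=scores, inputs=interpolated,
        grad_outputs=torch.ones_like(scores),
        create_graph=True, retain_graph=True)[0]
    grad_norm = grads.reshape(batch_size, -1).norm(2, dim=1)
    return ((grad_norm - 1) ** 2).mean()

# Illustrative usage with a toy critic on 2-D data
critic = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))
real = torch.randn(32, 2)            # stand-in for real samples
fake = torch.randn(32, 2)            # stand-in for generator output
lambda_gp = 10.0                     # penalty weight commonly used with WGAN-GP
critic_loss = critic(fake).mean() - critic(real).mean() \
    + lambda_gp * gradient_penalty(critic, real, fake)
critic_loss.backward()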


使用 WGAN-GP 合成基於智慧手錶的現實安全與不安全的駕駛行為

A Prasetio - 2021 - ir.lib.ncu.edu.tw

Abstract: Collecting driving-behavior data in a real environment is quite dangerous, and many precautions must be taken to avoid accidents during collection. Unsafe behaviors such as side-to-side swerving are even harder to gather in reality. A simulated environment makes data collection safe and convenient, but a gap remains between simulation and reality …

  All 2 versions 

[Chinese; English title: Using WGAN-GP to synthesize realistic safe and unsafe smartwatch-based driving behaviors]


 Wasserstein GAN: Deep Generation applied on Bitcoins financial time series

R Samuel, BD Nico, P Moritz, O Joerg - arXiv preprint arXiv:2107.06008, 2021 - arxiv.org

Modeling financial time series is challenging due to their high volatility and unexpected

happenings on the market. Most financial models and algorithms trying to fill the lack of

historical financial time series struggle to perform and are highly vulnerable to overfitting. As

an alternative, we introduce in this paper a deep neural network called the WGAN-GP, a

data-driven model that focuses on sample generation. The WGAN-GP consists of a

generator and discriminator function which utilize an LSTM architecture. The WGAN-GP is …

  All 2 versions 

online  OPEN ACCESS

Wasserstein GAN: Deep Generation applied on Bitcoins financial time series

by Samuel, Rikli; Nico, Bigler Daniel; Moritz, Pfenninger ; More...

07/2021

Modeling financial time series is challenging due to their high volatility and unexpected happenings on the market. Most financial models and algorithms trying...

Journal ArticleFull Text Online

 

Wasserstein statistics in one-dimensional location-scale models

http://www.stat.t.u-tokyo.ac.jp › slide › w_est

PDF

by S Amari ·  — Wasserstein statistics in one-dimensional location-scale models. Shun-ichi Amari, Takeru Matsuda. RIKEN Center for Brain Science. GSI 2021. GSI 2021. 1 

online

Wasserstein Statistics in One-Dimensional Location-Scale Models

by Amari, Shun-ichi; Matsuda, Takeru

Geometric Science of Information, 07/2021

In this study, we analyze statistical inference based on the Wasserstein geometry in the case that the base space is one-dimensional. By using the...

Book ChapterFull Text Online

Zbl 07495249


<——2021———2021———1000——


2021 [PDF] mlr.press

Fast and smooth interpolation on Wasserstein space

S Chewi, J Clancy, T Le Gouic… - International …, 2021 - proceedings.mlr.press

We propose a new method for smoothly interpolating probability measures using the

geometry of optimal transport. To that end, we reduce this problem to the classical Euclidean

setting, allowing us to directly leverage the extensive toolbox of spline interpolation. Unlike …

  Cited by 2 Related articles All 4 versions 
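
For the interpolation entries above, the Wasserstein geodesic between two measures on the real line reduces to interpolating their quantile functions (displacement interpolation). A minimal NumPy-only sketch; the sample sizes and quantile grid are illustrative choices, not the method of the paper above.

import numpy as np

def wasserstein_interpolate_1d(x, y, t, grid_size=200):
    """Return a quantile-grid discretization of the W2 geodesic mu_t between the
    empirical measures of samples x and y, for t in [0, 1]."""
    qs = (np.arange(grid_size) + 0.5) / grid_size        # quantile levels
    qx = np.quantile(x, qs)                               # quantile function of x-sample
    qy = np.quantile(y, qs)                               # quantile function of y-sample
    return (1 - t) * qx + t * qy                          # pointwise quantile interpolation

rng = np.random.default_rng(0)
x = rng.normal(-2.0, 0.5, size=1000)
y = rng.normal(3.0, 1.5, size=1000)
midpoint = wasserstein_interpolate_1d(x, y, t=0.5)        # halfway point of the geodesic
print(midpoint.mean(), midpoint.std())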


 2021

De-aliased seismic data interpolation using conditional Wasserstein generative adversarial networks

Q Wei, X Li, M Song - Computers & Geosciences, 2021 - Elsevier

When sampling at offset is too coarse during seismic acquisition, spatial aliasing will appear,

affecting the accuracy of subsequent processing. The receiver spacing can be reduced by

interpolating one or more traces between every two traces to remove the spatial aliasing …

  Related articles All 2 versions


2021 [PDF] arxiv.org

Tighter expected generalization error bounds via Wasserstein distance

B Rodríguez-Gálvez, G Bassi, R Thobaben… - arXiv preprint arXiv …, 2021 - arxiv.org

In this work, we introduce several expected generalization error bounds based on the

Wasserstein distance. More precisely, we present full-dataset, single-letter, and random-

subset bounds on both the standard setting and the randomized-subsample setting from …

  Related articles All 3 versions 

[CITATION] Tighter expected generalization error bounds via Wasserstein distance.

BR Gálvez, G Bassi, R Thobaben, M Skoglund - CoRR, 202

WRI: Wasserstein Regression Inference

By: Liu, Xi

Comprehensive R Archive Network

Source URL: ‏ https://CRAN.R-project.org/package=WRI

Document Type: Software

View Data  View Abstract

Related articles All 3 versions 

2021   see 2020  [PDF] arxiv.org

Wasserstein regression

Y Chen, Z Lin, HG Müller - Journal of the American Statistical …, 2021 - Taylor & Francis

The analysis of samples of random objects that do not lie in a vector space is gaining

increasing attention in statistics. An important class of such object data is univariate

probability measures defined on the real line. Adopting the Wasserstein metric, we develop …

Cited by 35 Related articles All 4 versions

  2021 see 2020  [PDF] mlr.press

Improved complexity bounds in wasserstein barycenter problem

D Dvinskikh, D Tiapkin - International Conference on …, 2021 - proceedings.mlr.press

In this paper, we focus on computational aspects of the Wasserstein barycenter problem. We

propose two algorithms to compute Wasserstein barycenters of $ m $ discrete measures of

size $n$ with accuracy $\varepsilon$. The first algorithm, based on mirror prox with a specific norm …


Cited by 20 Related articles All 4 versions 
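
In one dimension the Wasserstein barycenter discussed in the entry above can be computed exactly by averaging quantile functions. A minimal NumPy sketch under that one-dimensional assumption; the toy Gaussian samples are illustrative only.

import numpy as np

def wasserstein_barycenter_1d(samples, weights=None, grid_size=200):
    """Quantile-grid discretization of the W2 barycenter of 1-D empirical measures."""
    qs = (np.arange(grid_size) + 0.5) / grid_size
    quantiles = np.stack([np.quantile(s, qs) for s in samples])   # one row per measure
    if weights is None:
        weights = np.full(len(samples), 1.0 / len(samples))
    return weights @ quantiles                                     # weighted quantile average

rng = np.random.default_rng(1)
measures = [rng.normal(m, 1.0, size=500) for m in (-3.0, 0.0, 4.0)]
bary = wasserstein_barycenter_1d(measures)
print(bary.mean())   # close to the average of the three means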


2021

[PDF] Deep Wasserstein Graph Discriminant Learning for Graph Classification

T Zhang, Y Wang, Z Cui, C Zhou, B Cui… - Proceedings of the AAAI …, 2021 - aaai.org

… and dictionary learning. We … deep Wasserstein graph discriminant learning framework for

graph discriminant analysis, where graph convolution learning and graph distance learning …

 Cited by 1 Related articles All 2 versions 

Self-Supervised Wasserstein Pseudo-Labeling for Semi-Supervised Image Classification

F Taherkhani, A Dabouei… - Proceedings of the …, 2021 - openaccess.thecvf.com

The goal is to use Wasserstein metric to provide pseudo labels for the unlabeled images to

train a Convolutional Neural Networks (CNN) in a Semi-Supervised Learning (SSL) manner

for the classification task. The basic premise in our method is that the discrepancy between …

  Related articles 


[PDF] hrbcu.edu.cn

Multi-Proxy Wasserstein Classifier for Image Classification

B Liu, Y Rao, J Lu, J Zhou… - … of the AAAI …, 2021 - ojs-aaai-ex4-oa-ex0-www-webvpn …

Most widely-used convolutional neural networks (CNNs) end up with a global average

pooling layer and a fully-connected layer. In this pipeline, a certain class is represented by

one template vector preserved in the feature banks of fully-connected layer. Yet, a class may …

  

Wasserstein distance feature alignment learning for 2D image-based 3D model retrieval

Y Zhou, Y Liu, H Zhou, W Li - Journal of Visual Communication and Image …, 2021 - Elsevier

Abstract 2D image-based 3D model retrieval has become a hotspot topic in recent years.

However, the current existing methods are limited by two aspects. Firstly, they are mostly

based on the supervised learning, which limits their application because of the high time …

  All 2 versions


[HTML] mdpi.com

Panchromatic Image super-resolution via self attention-augmented wasserstein generative adversarial network

J Du, K Cheng, Y Yu, D Wang, H Zhou - Sensors, 2021 - mdpi.com

Panchromatic (PAN) images contain abundant spatial information that is useful for earth

observation, but always suffer from low-resolution (LR) due to the sensor limitation and large-

scale view field. The current super-resolution (SR) methods based on traditional attention …

   Related articles All 7 versions 

<——2021———2021———1010——


[PDF] arxiv.org

Image Inpainting Using Wasserstein Generative Adversarial Imputation Network

D Vašata, T Halama, M Friedjungová - arXiv preprint arXiv:2106.15341, 2021 - arxiv.org

Image inpainting is one of the important tasks in computer vision which focuses on the

reconstruction of missing regions in an image. The aim of this paper is to introduce an image

inpainting model based on Wasserstein Generative Adversarial Imputation Network. The …

  All 2 versions 


[PDF] ieee.org

Breast cancer histopathology image super-resolution using wide-attention gan with improved wasserstein gradient penalty and perceptual loss

F Shahidi - IEEE Access, 2021 - ieeexplore.ieee.org

In the realm of image processing, enhancing the quality of the images is known as a super-

resolution problem (SR). Among SR methods, a super-resolution generative adversarial

network, or SRGAN, has been introduced to generate SR images from low-resolution …

  Cited by 2 Related articles All 2 versions


Unpaired Single-Image Depth Synthesis with cycle-consistent Wasserstein GANs

C Angermann, A Moravová, M Haltmeier… - arXiv preprint arXiv …, 2021 - arxiv.org

Real-time estimation of actual environment depth is an essential module for various

autonomous system tasks such as localization, obstacle detection and pose estimation.

During the last decade of machine learning, extensive deployment of deep learning …

  All 2 versions 


[PDF] annalsofrscb.ro

Wasserstein GANs for Generation of Variated Image Dataset Synthesis

KDB Mudavathu, MVPCS Rao - Annals of the Romanian Society for …, 2021 - annalsofrscb.ro

Deep learning networks required a training lot of data to get to better accuracy. Given the

limited amount of data for many problems, we understand the requirement for creating the

image data with the existing sample space. For many years the different technique was used …

  Related articles All 2 versions 


[PDF] zhankunluo.com

[PDF] Image Generation using Wasserstein Generative Adversarial Network

Z Luo, A Jara, W Ou - zhankunluo.com

GAN shows the capability to generate fake authentic images by evaluating and learning

from real and fake samples. This paper introduces an alternative algorithm to the traditional

DCGAN, named Wasserstein GAN (WGAN). It introduces the Wasserstein distance for the …

  Related articles 
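
Complementing the WGAN-GP sketch earlier, the original WGAN recipe referenced in the entry above enforces the Lipschitz constraint on the critic by weight clipping. A minimal PyTorch sketch with toy 2-D data standing in for images; the clip value and learning rate follow commonly quoted defaults, not this particular paper.

import torch
import torch.nn as nn

critic = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.RMSprop(critic.parameters(), lr=5e-5)
clip_value = 0.01                       # keeps the critic approximately Lipschitz

real = torch.randn(64, 2)               # stand-in for a batch of real samples
fake = torch.randn(64, 2)               # stand-in for generator output

# The critic maximizes E[f(real)] - E[f(fake)], i.e. minimizes the negated objective
loss = critic(fake).mean() - critic(real).mean()
opt.zero_grad()
loss.backward()
opt.step()
for p in critic.parameters():           # weight clipping after each critic update
    p.data.clamp_(-clip_value, clip_value)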


2021


Unsupervised band selection for hyperspectral image classification using the Wasserstein metric-based configuration entropy

Z Hong, WU Zhiwei, W Jicheng… - Acta Geodaetica et … - xb.sinomaps.com

Band selection relies on the quantification of band information. Conventional measurements

such as Shannon entropy only consider the composition information (eg, types and ratios of

pixels) but ignore the configuration information (eg, the spatial distribution of pixels). The …

  All 2 versions 


SRWGANTV: Image Super-Resolution Through Wasserstein Generative Adversarial Networks with Total Variational Regularization

J Shao, L Chen, Y Wu - 2021 IEEE 13th International …, 2021 - ieeexplore.ieee.org

The study of generative adversarial networks (GAN) has enormously promoted the research

work on single image super-resolution (SISR) problem. SRGAN firstly apply GAN to SISR

reconstruction, which has achieved good results. However, SRGAN sacrifices the fidelity. At …

  Related articles


Wasserstein Generative Adversarial Networks for Realistic ...

https://link.springer.com › chapter

Apr 5, 2021 — Original Taiwan traffic sign image. Generating images has recently obtained impressive results in the Generative Adversarial Networks (GAN) [6, ..

[CITATION] Wasserstein Generative Adversarial Networks for Realistic Traffic Sign Image Generation

C Dewi, RC Chen, YT Liu - … Thailand, April 7 …, 2021 - Springer International Publishing

 Related articles All 2 versions


2021 PDF

 CONLON: A Pseudo-Song Generator Based on a New Pianoroll, Wasserstein Autoencoders, and Optimal Interpolations

https://bnaic.liacs.leidenuniv.nl › 15_Borghuis

by LAT Borghuis ·  — CONLON: A Pseudo-Song Generator Based on a New Pianoroll, Wasserstein Autoencoders, and Optimal Interpolations. Luca Angioloni1. Tijn Borghuis2,3.


2021  [PDF] arxiv.org

Master Bellman equation in the Wasserstein space: Uniqueness of viscosity solutions

A Cosso, F Gozzi, I Kharroubi, H Pham… - arXiv preprint arXiv …, 2021 - arxiv.org

We study the Bellman equation in the Wasserstein space arising in the study of mean field

control problems, namely stochastic optimal control problems for McKean-Vlasov diffusion

processes. Using the standard notion of viscosity solution à la Crandall-Lions extended to …

  All 8 versions 

<——2021———2021———1020—— 


2021 [PDF] thecvf.com

DeepACG: Co-Saliency Detection via Semantic-Aware Contrast Gromov-Wasserstein Distance

K Zhang, M Dong, B Liu, XT Yuan… - Proceedings of the …, 2021 - openaccess.thecvf.com

The objective of co-saliency detection is to segment the co-occurring salient objects in a

group of images. To address this task, we introduce a new deep network architecture via

semantic-aware contrast Gromov-Wasserstein distance (DeepACG). We first adopt the …

  Related articles 

2021 [HTML] sciencedirect.com

[HTML] Wasserstein distance based Multiobjective Evolutionary algorithm for the Risk Aware Optimization of Sensor Placement

A Ponti, A Candelieri, F Archetti - Intelligent Systems with Applications, 2021 - Elsevier

In this paper we propose a new algorithm for the identification of optimal “sensing spots”,

within a network, for monitoring the spread of “effects” triggered by “events”. This problem is

referred to as “Optimal Sensor Placement” and many real-world problems fit into this general …

 

2021 [PDF] arxiv.org

Weak topology and Opial property in Wasserstein spaces, with applications to Gradient Flows and Proximal Point Algorithms of geodesically convex functionals

E Naldi, G Savaré - arXiv preprint arXiv:2104.06121, 2021 - arxiv.org

In this paper we discuss how to define an appropriate notion of weak topology in the

Wasserstein space $(\mathcal {P} _2 (H), W_2) $ of Borel probability measures with finite

quadratic moment on a separable Hilbert space $ H $. We will show that such a topology …

  Related articles All 3 versions 


2021 [PDF] oup.com

Simulation of broadband ground motions with consistent long-period and short-period components using Wasserstein interpolation of acceleration envelopes

T Okazaki, H Hachiya, A Iwaki, T Maeda… - Geophysical Journal …, 2021 - academic.oup.com

Practical hybrid approaches for the simulation of broadband ground motions often combine

long-period and short-period waveforms synthesised by independent methods under

different assumptions for different period ranges, which at times can lead to incompatible …

  Related articles



2021  [PDF] arxiv.org

Wasserstein Distances, Geodesics and Barycenters of Merge Trees

M Pont, J Vidal, J Delon, J Tierny - arXiv preprint arXiv:2107.07789, 2021 - arxiv.org

This paper presents a unified computational framework for the estimation of distances,

geodesics and barycenters of merge trees. We extend recent work on the edit distance [106]

and introduce a new metric, called the Wasserstein distance between merge trees, which is …

 All 22 versions 


2021


Artificial Neural Network with Histogram Data Time Series Forecasting: A Least Squares Approach Based on Wasserstein Distance

P Rakpho, W Yamaka, K Zhu - Behavioral Predictive Modeling in …, 2021 - Springer

This paper aims to predict the histogram time series, and we use the high-frequency data

with 5-min to construct the Histogram data for each day. In this paper, we apply the Artificial

Neural Network (ANN) to Autoregressive (AR) structure and introduce the AR—ANN model …

  Related articles All 4 versions

2021  [PDF] mdpi.com

Power Electric Transformer Fault Diagnosis Based on Infrared Thermal Images Using Wasserstein Generative Adversarial Networks and Deep Learning Classifier

KH Fanchiang, YC Huang, CC Kuo - Electronics, 2021 - mdpi.com

The safety of electric power networks depends on the health of the transformer. However,

once a variety of transformer failure occurs, it will not only reduce the reliability of the power

system but also cause major accidents and huge economic losses. Until now, many …

  Related articles All 3 versions 


arXiv:2107.13494  [pdf, ps, other]  math.ST  math.PR  stat.ML
Limit Distribution Theory for the Smooth 1-Wasserstein Distance with Applications
Authors: Ritwik Sadhu, Ziv Goldfeld, Kengo Kato
Abstract: The smooth 1-Wasserstein distance (SWD) $W_1^{\sigma}$ was recently proposed as a means to mitigate the curse of dimensionality in empirical approximation while preserving the Wasserstein structure. Indeed, SWD exhibits parametric convergence rates and inherits the metric and topological structure of the classic Wasserstein distance. Motivated by the above, this work conducts a thorough statistical stu…  More
Submitted 28 July, 2021; originally announced July 2021.
MSC Class: 62E17; 60F05; 60F17; 62G10; 62F12; 62F40
 Cited by 2 Related articles All 3 versions 
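
A minimal sketch of the smooth 1-Wasserstein distance $W_1^{\sigma}$ from the entry above, which is the classic $W_1$ between the two distributions convolved with an isotropic Gaussian of width sigma. In one dimension the convolution can be approximated by adding Gaussian noise to both samples before calling SciPy; the noise replication count is an illustrative Monte Carlo choice, not part of the paper's theory.

import numpy as np
from scipy.stats import wasserstein_distance

def smooth_w1(x, y, sigma, n_noise=20, seed=0):
    """Approximate W_1 between x*N(0,sigma^2) and y*N(0,sigma^2) for 1-D samples x, y."""
    rng = np.random.default_rng(seed)
    xs = np.concatenate([x + sigma * rng.standard_normal(x.shape) for _ in range(n_noise)])
    ys = np.concatenate([y + sigma * rng.standard_normal(y.shape) for _ in range(n_noise)])
    return wasserstein_distance(xs, ys)

rng = np.random.default_rng(0)
x, y = rng.normal(0, 1, 500), rng.normal(0.5, 1, 500)
print(smooth_w1(x, y, sigma=0.3))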


Wasserstein-Splitting Gaussian Process Regression for Heterogeneous Online...

by Kepler, Michael E; Koppel, Alec; Bedi, Amrit Singh; More...
07/2021
Gaussian processes (GPs) are a well-known nonparametric Bayesian inference technique, but they suffer from scalability problems for large sample sizes, and...
Journal Article  Full Text Online

arXiv:2107.12797  [pdf, other]  stat.ML  cs.LG
Wasserstein-Splitting Gaussian Process Regression for Heterogeneous Online Bayesian Inference
Authors: Michael E. Kepler, Alec Koppel, Amrit Singh Bedi, Daniel J. Stilwell
Abstract: Gaussian processes (GPs) are a well-known nonparametric Bayesian inference technique, but they suffer from scalability problems for large sample sizes, and their performance can degrade for non-stationary or spatially heterogeneous data. In this work, we seek to overcome these issues through (i) employing variational free energy approximations of GPs operating in tandem with online expectation pro…  More
Submitted 26 July, 2021; originally announced July 2021.
Cited by 2 Related articles All 4 versions

arXiv:2107.11568  [pdf, ps, other]  math.PR
Wasserstein Convergence for Empirical Measures of Subordinated Diffusions on Riemannian Manifolds
Authors: Feng-Yu Wang, Bingyao Wu
Abstract: Let $M$ be a connected compact Riemannian manifold, possibly with a boundary, let $V\in C^2(M)$ be such that $\mu(\mathrm{d}x):=\mathrm{e}^{V(x)}\,\mathrm{d}x$ is a probability measure, where $\mathrm{d}x$ is the volume measure, and let $L=\Delta+\nabla V$. The exact convergence rate in Wasserstein distance is derived for empirical measures of subordinations for the (reflecting) diffusion process generated by $L$.
Submitted 24 July, 2021; originally announced July 2021.
Comments: 26 pages

Cited by 2 Related articles All 4 versions

<——2021———2021———1030——


Conditional Wasserstein Barycenters and Interpolation ...

https://arxiv.org › stat

by J Fan · 2021 — Conditional Wasserstein barycenters and distribution extrapolation are illustrated with applications in climate science and studies of aging ...

online  OPEN ACCESS

Conditional Wasserstein Barycenters and Interpolation/Extrapolation of Distributions

by Fan, Jianing; Müller, Hans-Georg

07/2021

Increasingly complex data analysis tasks motivate the study of the dependency of distributions of multivariate continuous random variables on scalar or vector...

Journal ArticleFull Text Online

Related articles All 2 versions

 

 Wasserstein Distances, Geodesics and Barycenters of Merge Trees

M Pont, J Vidal, J Delon, J Tierny - arXiv preprint arXiv:2107.07789, 2021 - arxiv.org

This paper presents a unified computational framework for the estimation of distances,

geodesics and barycenters of merge trees. We extend recent work on the edit distance [106]

and introduce a new metric, called the Wasserstein distance between merge trees, which is

purposely designed to enable efficient computations of geodesics and barycenters.

Specifically, our new distance is strictly equivalent to the L2-Wasserstein distance between

extremum persistence diagrams, but it is restricted to a smaller solution space, namely, the …

  Cited by 11 Related articles All 46 versions

online OPEN ACCESS

Wasserstein Distances, Geodesics and Barycenters of Merge Trees

by Pont, Mathieu; Vidal, Jules; Delon, Julie ; More...

07/2021

This paper presents a unified computational framework for the estimation of distances, geodesics and barycenters of merge trees. We extend recent work on the...

Journal ArticleFull Text Online

 IEEE transactions on visualization and computer graphics, 09/2021, Volume PP


Master Bellman equation in the Wasserstein space ...

https://arxiv.org › math

by A Cosso · 2021 — Using the standard notion of viscosity solution {à} la Crandall-Lions ... is the unique viscosity solution of the Master Bellman equation.

online  OPEN ACCESS

Master Bellman equation in the Wasserstein space: Uniqueness of viscosity solutions

by Cosso, Andrea; Gozzi, Fausto; Kharroubi, Idris ; More...

07/2021

We study the Bellman equation in the Wasserstein space arising in the study of mean field control problems, namely stochastic optimal control problems for...

Journal ArticleFull Text Online

Cited by 6 Related articles All 18 versions

Master Bellman equation in the Wasserstein space: Uniqueness of viscosity solutions

Library



Lecture 10: Wasserstein Geodesics, Nonbranching and ...

https://link.springer.com › 978-3-030-72162-6_10

by L Ambrosio · 2021 — Lecture 10: Wasserstein Geodesics, Nonbranching and Curvature. Authors; Authors and affiliations. Luigi Ambrosio; Elia Brué; Daniele Semola.

Lecture 10: Wasserstein Geodesics, Nonbranching and Curvature

by Ambrosio, Luigi; Brué, Elia; Semola, Daniele

Lectures on Optimal Transport, 07/2021

Let us now come to the proof of the lower semicontinuity of the action, defined as in (9.8). The proof could be achieved with more elementary tools, but we...

Book ChapterCitation Online

Lecture 15: Semicontinuity and Convexity of Energies in the ...

https://link.springer.com › 978-3-030-72162-6_15

by L Ambrosio · 2021 — Definition 15.1 (Relative Entropy). Let (X, τ) be a Polish space and choose a reference measure \mu \in \mathcal {P}(X ...

Lecture 15: Semicontinuity and Convexity of Energies in the Wasserstein Space

by Ambrosio, Luigi; Brué, Elia; Semola, Daniele

Lectures on Optimal Transport, 07/2021

We are now interested in the semicontinuity and convexity properties of the following three types of energy functionals defined on measures: internal energy,...

Book ChapterCitation Online

2021

A Wasserstein inequality and minimal Green energy on ...

https://www.sciencedirect.com › science › article › pii

by S Steinerberger · 2021 · Cited by 4 — Let M be a smooth, compact d-dimensional manifold, d ≥ 3, without boundary and let G : M × M → ℝ ∪ {∞} denote the Green's function of the Laplacian ...

A Wasserstein inequality and minimal Green energy on compact manifolds

Steinerberger, S

Sep 1 2021 | JOURNAL OF FUNCTIONAL ANALYSIS

Let M be a smooth, compact d-dimensional manifold, d ≥ 3, without boundary and let G : M × M → ℝ ∪ {∞} denote the Green's function of the Laplacian −Δ (normalized to have mean value 0). We prove a bound on the cost of transporting Dirac measures in {x_1, …, x_n} ⊂ M to the normalized volume measure dx in terms of the Green's function of the Laplacian.

Showing the best result for this search. See al

Peacock geodesics in Wasserstein space

Wu, HG and Cui, XJ

Aug 2021 | DIFFERENTIAL GEOMETRY AND ITS APPLICATIONS

Martingale optimal transport has attracted much attention due to its application in pricing and hedging in mathematical finance. The essential notion which makes martingale optimal transport different from optimal transport is peacock. A peacock is a sequence of measures which satisfies convex order property. In this paper we study peacock geodesics in Wasserstain space, and we hope this paper can provide some geometrical points of view to look at martingale optimal transport. (c) 2021 Elsevier B.V. All rights reserved.

Zbl 07371915

[PDF] uniandes.edu.co

Versión paramétrica del fenómeno de cutoff para el proceso de Ornstein-Uhlenbeck bajo la distancia de Wasserstein
[Spanish; English title: Parametric version of the cutoff phenomenon for the Ornstein-Uhlenbeck process under the Wasserstein distance]

LF Estrada Plata - 2021 - repositorio.uniandes.edu.co

This work takes as its case study the solution of the linear stochastic differential equation $dX_t^\epsilon(x)=-\mathcal{Q} X_t^\epsilon\,dt+\epsilon\,dB(t),\; X_0^\epsilon …$

Fast Wasserstein-distance-based distributionally robust chance-constrained power dispatch for multi-zone HVAC systems

G Chen, H Zhang, H Hui, Y Song - IEEE Transactions on Smart …, 2021 - ieeexplore.ieee.org

… are explicitly described based on the delicate indoor thermal model. Wasserstein distance is … To overcome the computational intractability of the Wasserstein-distance-based method, we first …

Cited by 13 Related articles All 2 versions

2021 see 2020  2022  [PDF] arxiv.org

The quantum Wasserstein distance of order 1

G De Palma, M Marvian, D Trevisan… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org

We propose a generalization of the Wasserstein distance of order 1 to the quantum states of

n qudits. The proposal recovers the Hamming distance for the vectors of the canonical basis,

and more generally the classical Wasserstein distance for quantum states diagonal in the …

  Cited by 6 Related articles All 8 versions

2021 [PDF] arxiv.org

The quantum Wasserstein distance of order 1

G De Palma, M Marvian, D Trevisan… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org

We propose a generalization of the Wasserstein distance of order 1 to the quantum states of

n qudits. The proposal recovers the Hamming distance for the vectors of the canonical basis,

and more generally the classical Wasserstein distance for quantum states diagonal in the …

Cited by 29 Related articles All 12 versions

<——2021———2021———1040——


2021 see 2020  [PDF] thecvf.com

Wasserstein contrastive representation distillation

L Chen, D Wang, Z Gan, J Liu… - Proceedings of the …, 2021 - openaccess.thecvf.com

The primary goal of knowledge distillation (KD) is to encapsulate the information of a model

learned from a teacher network into a student network, with the latter being more compact

than the former. Existing work, eg, using Kullback-Leibler divergence for distillation, may fail …

 Cited by 19 Related articles All 10 versions

Scholarly Journal  Citation/Abstract

The Quantum Wasserstein Distance of Order 1

De Palma, Giacomo; Marvian, Milad; Trevisan, Dario; Lloyd, Seth.IEEE Transactions on Information Theory; New York Vol. 67, Iss. 10,  (2021): 6627-6643.
Abstract/Details
Cited by 33
 Related articles All 12 versions

2021 see 2020  [PDF] arxiv.org

On linear optimization over Wasserstein balls

MC Yue, D Kuhn, W Wiesemann - Mathematical Programming, 2021 - Springer

Wasserstein balls, which contain all probability measures within a pre-specified Wasserstein

distance to a reference measure, have recently enjoyed wide popularity in the

distributionally robust optimization and machine learning communities to formulate and …

 Cited by 11 Related articles All 9 versions

[PDF] mlr.press

Smooth  p-Wasserstein Distance: Structure, Empirical Approximation, and Statistical Applications

S Nietert, Z Goldfeld, K Kato - International Conference on …, 2021 - proceedings.mlr.press

Discrepancy measures between probability distributions, often termed statistical distances,

are ubiquitous in probability theory, statistics and machine learning. To combat the curse of

dimensionality when estimating these distances from data, recent work has proposed …

  Cited by 2 All 2 versions 

 



[PDF] arxiv.org

Relaxed Wasserstein with applications to GANs

X Guo, J Hong, T Lin, N Yang - ICASSP 2021-2021 IEEE …, 2021 - ieeexplore.ieee.org

Wasserstein Generative Adversarial Networks (WGANs) provide a versatile class of models,

which have attracted great attention in various applications. However, this framework has

two main drawbacks:(i) Wasserstein-1 (or Earth-Mover) distance is restrictive such that …

  Cited by 27 Related articles All 3 versions

 Cited by 28 Related articles All 4 versions


[PDF] auai.org

[PDF] FlexAE: Flexibly learning latent priors for wasserstein auto-encoders

AK Mondal, H Asnani, P Singla, A Prathosh - Proc. of UAI, 2021 - auai.org

Auto-Encoder (AE) based neural generative frameworks model the joint-distribution

between the data and the latent space using an Encoder-Decoder pair, with regularization

imposed in terms of a prior over the latent space. Despite their advantages, such as stability …

Cited by 4 Related articles All 5 versions 

2021


[HTML] mdpi.com

WDA: an improved wasserstein distance-based transfer learning fault diagnosis method

Z Zhu, L Wang, G Peng, S Li - Sensors, 2021 - mdpi.com

… Thus, we introduced an improved Wasserstein distance-based … In WDA, Wasserstein distance 

with cosine similarity is … –Munkres algorithm to calculate the Wasserstein distance. In order …

Cited by 9 Related articles All 11 versions

[PDF] jmlr.org

[PDF] Wasserstein barycenters can be computed in polynomial time in fixed dimension.

JM Altschuler, E Boix-Adsera - J. Mach. Learn. Res., 2021 - jmlr.org

Computing Wasserstein barycenters is a fundamental geometric problem with widespread

applications in machine learning, statistics, and computer graphics. However, it is unknown

whether Wasserstein barycenters can be computed in polynomial time, either exactly or to …

Cited by 18 Related articles All 7 versions 

2021 see 2020[PDF] mlr.press

A Riemannian block coordinate descent method for computing the projection robust Wasserstein distance

M Huang, S Ma, L Lai - International Conference on …, 2021 - proceedings.mlr.press

The Wasserstein distance has become increasingly important in machine learning and deep

learning. Despite its popularity, the Wasserstein distance is hard to approximate because of

the curse of dimensionality. A recently proposed approach to alleviate the curse of …

Cited by 15 Related articles All 9 versions 

[PDF] iop.org

Wasserstein stability estimates for covariance-preconditioned Fokker–Planck equations

JA CarrilloU Vaes - Nonlinearity, 2021 - iopscience.iop.org

We study the convergence to equilibrium of the mean field PDE associated with the

derivative-free methodologies for solving inverse problems that are presented by Garbuno-

Inigo et al (2020 SIAM J. Appl. Dyn. Syst. 19 412–41), Herty and Visconti (2018 arXiv …

  1 Related articles All 6 versions

     Wasserstein stability estimates for covariance-preconditioned Fokker–Planck equations

10 citations* for all

3 citations*

2021 NONLINEARITY

J A Carrillo, U Vaes


We study the convergence to equilibrium of the mean field PDE associated with the derivative-free methodologies for solving inverse problems. We show stability estimates in the euclidean Wasserstein distance for the mean field PDE by using optimal transport arguments. As a consequence, this recovers... View Full Abstract 

Cited by 18 Related articles All 7 versions

[PDF] thecvf.com

Self-Supervised Wasserstein Pseudo-Labeling for Semi-Supervised Image Classification

F Taherkhani, A Dabouei… - Proceedings of the …, 2021 - openaccess.thecvf.com

The goal is to use Wasserstein metric to provide pseudo labels for the unlabeled images to

train a Convolutional Neural Networks (CNN) in a Semi-Supervised Learning (SSL) manner

for the classification task. The basic premise in our method is that the discrepancy between …

Cited by 4 Related articles All 5 versions 

<——2021———2021———1050—


Conference Paper  Citation/Abstract

Temporal conditional Wasserstein GANs for audio-visual affect-related ties

Athanasiadis, Christos; Hortal, Enrique; Asteriadis, Stelios. The Institute of Electrical and

Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).
Abstract/Details   Show Abstract 

Related articles All 2 versions

 

2021 see 2020  [PDF] mlr.press

First-Order Methods for Wasserstein Distributionally Robust MDP

JG Clement, C Kroer - International Conference on Machine …, 2021 - proceedings.mlr.press

Markov decision processes (MDPs) are known to be sensitive to parameter specification.

Distributionally robust MDPs alleviate this issue by allowing for\textit {ambiguity sets} which

give a set of possible distributions over parameter sets. The goal is to find an optimal policy …


[PDF] Unsupervised Graph Alignment with Wasserstein Distance Discriminator

J Gao, X Huang, J Li - … on Knowledge Discovery and Data Mining, 2021 - cs.virginia.edu

Graph alignment aims to identify the node correspondence across multiple graphs and is

essential to reveal insightful graph patterns that are otherwise inaccessible with a single

graph. With roots in graph theory, the graph alignment problem has significant implications …

  Related articles All 3 versions

Cited by 21 Related articles All 3 versions


[PDF] mlr.press

Exploring the Wasserstein metric for time-to-event analysis

T Sylvain, M Luck, J Cohen… - Survival Prediction …, 2021 - proceedings.mlr.press

Survival analysis is a type of semi-supervised task where the target output (the survival time)

is often right-censored. Utilizing this information is a challenge because it is not obvious how

to correctly incorporate these censored examples into a model. We study how three …

Cited by 1 Related articles All 2 versions 

Wasserstein Distributional Normalization For Robust Distributional Certification of Noisy Labeled Data

SW Park, J Kwon - International Conference on Machine …, 2021 - proceedings.mlr.press

We propose a novel Wasserstein distributional normalization method that can classify noisy

labeled data accurately. Recently, noisy labels have been successfully handled based on

small-loss criteria, but have not been clearly understood from the theoretical point of view. In …

Related articles All 2 versions 

  

2021

[PDF] hrbcu.edu.cn

Multi-Proxy Wasserstein Classifier for Image Classification

B Liu, Y Rao, J Lu, J Zhou… - … of the AAAI …, 2021 - ojs-aaai-ex4-oa-ex0-www-webvpn …

Most widely-used convolutional neural networks (CNNs) end up with a global average

pooling layer and a fully-connected layer. In this pipeline, a certain class is represented by

one template vector preserved in the feature banks of fully-connected layer. Yet, a class may …

 

2021 see 2020  [PDF] researchgate.net

The α-z-Bures Wasserstein divergence

TH DinhCT Le, BK Vo, TD Vuong - Linear Algebra and its Applications, 2021 - Elsevier

In this paper, we introduce the α-z-Bures Wasserstein divergence for positive semidefinite

matrices A and B as Φ (A, B)= T r ((1− α) A+ α B)− T r (Q α, z (A, B)), where Q α, z (A, B)=(A

1− α 2 z B α z A 1− α 2 z) z is the matrix function in the α-z-Renyi relative entropy. We show …

Cited by 6 Related articles All 4 versions

Fault Diagnosis of Rotating Machinery Based on Wasserstein Distance and Feature Selection

F Ferracuti, A Freddi, A Monteriù… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org

This article presents a fault diagnosis algorithm for rotating machinery based on the

Wasserstein distance. Recently, the Wasserstein distance has been proposed as a new

research direction to find better distribution mapping when compared with other popular …

Cited by 4 Related articles

Decision Making Under Model Uncertainty: Fréchet–Wasserstein Mean Preferences

EV Petracou, A Xepapadeas… - Management …, 2021 - pubsonline.informs.org

This paper contributes to the literature on decision making under multiple probability models

by studying a class of variational preferences. These preferences are defined in terms of

Fréchet mean utility functionals, which are based on the Wasserstein metric in the space of …

 

Establishment and extension of digital aggregate database using auxiliary classifier Wasserstein GAN with gradient penalty

C Wang, F Li, Q Liu, H Wang, P Benmoussa… - … and Building Materials, 2021 - Elsevier

For road construction, the morphological characteristics of coarse aggregates such as

angularity and sphericity have a considerable influence on asphalt pavement performance.

In traditional aggregate simulation processes, images of real coarse grains are captured …

  All 2 versions

<——2021———2021———1060——


[PDF] hrbcu.edu.cn

Deep Wasserstein Graph Discriminant Learning for Graph Classification

T Zhang, Y Wang, Z Cui, C Zhou… - … of the AAAI …, 2021 - ojs-aaai-ex4-oa-ex0-www-webvpn …

Graph topological structures are crucial to distinguish different-class graphs. In this work, we

propose a deep Wasserstein graph discriminant learning (WGDL) framework to learn

discriminative embeddings of graphs in Wasserstein-metric (W-metric) matching space. In …


[PDF] carloalberto.org

[BOOK] Measuring dependence in the Wasserstein distance for Bayesian nonparametric models

M Catalano, A Lijoi, I Prünster - 2021 - carloalberto.org

The proposal and study of dependent Bayesian nonparametric models has been one of the

most active research lines in the last two decades, with random vectors of measures

representing a natural and popular tool to define them. Nonetheless a principled approach …

  Cited by 5 Related articles All 6 versions 

[PDF] ieee.org

Variational Autoencoders and Wasserstein Generative Adversarial Networks for Improving the Anti-Money Laundering Process

ZY Chen, W Soliman, A Nazir, M Shorfuzzaman - IEEE Access, 2021 - ieeexplore.ieee.org

There has been much recent work on fraud and Anti Money Laundering (AML) detection

using machine learning techniques. However, most algorithms are based on supervised

techniques. Studies show that supervised techniques often have the limitation of not …

  Related articles All 2 versions


[HTML] Prediction and analysis of multiple protein lysine modified sites based on conditional wasserstein generative adversarial networks

Y Yang, H Wang, W Li, X Wang… - BMC …, 2021 - bmcbioinformatics.biomedcentral …

Protein post-translational modification (PTM) is a key issue to investigate the mechanism of

protein's function. With the rapid development of proteomics technology, a large amount of

protein sequence data has been generated, which highlights the importance of the in-depth …

  Related articles All 10 versions 


[PDF] arxiv.org

Distributionally robust tail bounds based on Wasserstein distance and -divergence

C Birghila, M Aigner, S Engelke - arXiv preprint arXiv:2106.06266, 2021 - arxiv.org

In this work, we provide robust bounds on the tail probabilities and the tail index of heavy-

tailed distributions in the context of model misspecification. They are defined as the optimal

value when computing the worst-case tail behavior over all models within some …

Cited by 1 Related articles All 2 versions 

2021


[PDF] ams.org

Nonembeddability of persistence diagrams with p > 2 Wasserstein metric

A Wagner - Proceedings of the American Mathematical Society, 2021 - ams.org

Persistence diagrams do not admit an inner product structure compatible with any

Wasserstein metric. Hence, when applying kernel methods to persistence diagrams, the

underlying feature map necessarily causes distortion. We prove that persistence diagrams …

  Cited by 5 Related articles All 4 versions


[HTML] sciencedirect.com

[HTML] Wasserstein distance based Multiobjective Evolutionary algorithm for the Risk Aware Optimization of Sensor Placement

A Ponti, A Candelieri, F Archetti - Intelligent Systems with Applications, 2021 - Elsevier

In this paper we propose a new algorithm for the identification of optimal “sensing spots”,

within a network, for monitoring the spread of “effects” triggered by “events”. This problem is

referred to as “Optimal Sensor Placement” and many real-world problems fit into this general …

 Cited by 4

2021

Considering anatomical prior information for low-dose CT image enhancement using attribute-augmented Wasserstein generative adversarial networks

Z Huang, X Liu, R Wang, J Chen, P Lu, Q Zhang… - Neurocomputing, 2021 - Elsevier

Currently, many deep learning (DL)-based low-dose CT image postprocessing technologies

fail to consider the anatomical differences in training data among different human body sites,

such as the cranium, lung and pelvis. In addition, we can observe evident anatomical …

 Cited by 3 Related articles

 Linear-Time Gromov Wasserstein Distances using Low Rank Couplings and Costs

M Scetbon, G Peyré, M Cuturi - arXiv preprint arXiv:2106.01128, 2021 - arxiv.org

The ability to compare and align related datasets living in heterogeneous spaces plays an

increasingly important role in machine learning. The Gromov-Wasserstein (GW) formalism

can help tackle this problem. Its main goal is to seek an assignment (more generally a …

Cited by 5 Related articles All 3 versions 

 

[HTML] springer.com

[HTML] … : extraction of peptide chromatographic elution profiles from large scale mass spectrometry data by means of Wasserstein compressive hierarchical cluster …

O Permiakova, R Guibert, A Kraut, T Fortin… - BMC …, 2021 - Springer

The clustering of data produced by liquid chromatography coupled to mass spectrometry

analyses (LC-MS data) has recently gained interest to extract meaningful chemical or

biological patterns. However, recent instrumental pipelines deliver data which size …

  Related articles All 12 versions

<——2021———2021———1070——


2021 see 2019  [HTML] springer.com

[HTML] Tropical optimal transport and Wasserstein distances

W Lee, W LiB LinA Monod - Information Geometry, 2021 - Springer

We study the problem of optimal transport in tropical geometry and define the Wasserstein-p

distances in the continuous metric measure space setting of the tropical projective torus. We

specify the tropical metric—a combinatorial metric that has been used to study of the tropical …

  Cited by 2 Related articles All 4 versions


[PDF] arxiv.org

A new perspective on Wasserstein distances for kinetic problems

M Iacobelli - arXiv preprint arXiv:2104.00963, 2021 - arxiv.org

We introduce a new class of Wasserstein-type distances specifically designed to tackle

questions concerning stability and convergence to equilibria for kinetic equations. Thanks to

these new distances, we improve some classical estimates by Loeper and Dobrushin on …

  Related articles All 3 versions 


Wasserstein distance based Asymmetric Adversarial Domain Adaptation in intelligent bearing fault diagnosis

Y Ying, Z Jun, T Tang, W Jingwei, C Ming… - Measurement …, 2021 - iopscience.iop.org

Addressing the phenomenon of data sparsity in hostile working conditions, which leads to

performance degradation in traditional machine learning based fault diagnosis methods, a

novel Wasserstein distance based Asymmetric Adversarial Domain Adaptation (WAADA) is …

 Related articles

P2E-WGAN: ECG waveform synthesis from PPG with conditional wasserstein generative adversarial networks

K Vo, EK Naeini, A Naderi, D Jilani… - Proceedings of the 36th …, 2021 - dl.acm.org

Electrocardiogram (ECG) is routinely used to identify key cardiac events such as changes in

ECG intervals (PR, ST, QT, etc.), as well as capture critical vital signs such as heart rate (HR)

and heart rate variability (HRV). The gold standard ECG requires clinical measurement …

Cited by 8 Related articles All 2 versions

[PDF] spiedigitallibrary.org

Reproducibility of radiomic features using network analysis and its application in Wasserstein k-means clustering

JH Oh, AP Apte, E Katsoulakis, N Riaz… - Journal of Medical …, 2021 - spiedigitallibrary.org

Purpose: The goal of this study is to develop innovative methods for identifying radiomic

features that are reproducible over varying image acquisition settings. Approach: We

propose a regularized partial correlation network to identify reliable and reproducible …

  Related articles All 5 versions


2021


Wasserstein Barycenter Transport for Acoustic Adaptation

EF Montesuma, FMN Mboula - ICASSP 2021-2021 IEEE …, 2021 - ieeexplore.ieee.org

The recognition of music genre and the discrimination between music and speech are

important components of modern digital music systems. Depending on the acquisition

conditions, such as background environment, these signals may come from different …

  Related articles


[PDF] iop.org

Graph Classification Method Based on Wasserstein Distance

W Wu, G Hu, F Yu - Journal of Physics: Conference Series, 2021 - iopscience.iop.org

Graph classification is a challenging problem, which attracts more and more attention. The

key to solving this problem is based on what metric to compare graphs, that is, how to define

graph similarity. Common graph classification methods include graph kernel, graph editing …

 All 3 versions


Intensity-Based Wasserstein Distance As A Loss Measure For Unsupervised Deformable Deep Registration

R Shams, W Le, A Weihs… - 2021 IEEE 18th …, 2021 - ieeexplore.ieee.org

Traditional pairwise medical image registration techniques are based on computationally

intensive frameworks due to numerical optimization procedures. While there is increasing

adoption of deep neural networks to improve deformable image registration, achieving a …

  Related articles


[PDF] ieee.org

Wasserstein Divergence GAN With Cross-Age Identity Expert and Attribute Retainer for Facial Age Transformation

GS Hsu, RC Xie, ZT Chen - IEEE Access, 2021 - ieeexplore.ieee.org

We propose the Wasserstein Divergence GAN with an identity expert and an attribute

retainer for facial age transformation. The Wasserstein Divergence GAN (WGAN-div) can

better stabilize the training and lead to better target image generation. The identity expert …

Related articles All 2 versions


 [PDF] arxiv.org

Projection Robust Wasserstein Barycenters

M Huang, S Ma, L Lai - arXiv preprint arXiv:2102.03390, 2021 - arxiv.org

Collecting and aggregating information from several probability measures or histograms is a

fundamental task in machine learning. One of the popular solution methods for this task is to

compute the barycenter of the probability measures under the Wasserstein metric. However …

Cited by 5 Related articles All 7 versions 

<——2021———2021———1080——


[PDF] mdpi.com

Power Electric Transformer Fault Diagnosis Based on Infrared Thermal Images Using Wasserstein Generative Adversarial Networks and Deep Learning Classifier

KH Fanchiang, YC Huang, CC Kuo - Electronics, 2021 - mdpi.com

The safety of electric power networks depends on the health of the transformer. However,

once a variety of transformer failure occurs, it will not only reduce the reliability of the power

system but also cause major accidents and huge economic losses. Until now, many …

Cited by 6 Related articles All 3 versions

 Lecture 10: Wasserstein Geodesics, Nonbranching and Curvature

L Ambrosio, E Brué, D Semola - Lectures on Optimal Transport, 2021 - Springer

Let us now come to the proof of the lower semicontinuity of the action, defined as in ( 9.8). The

proof could be achieved with more elementary tools, but we prefer to use a general lemma that

will play a role also in the sequel … Let us now come to the proof of the lower semicontinuity …

 Related articles


On clustering uncertain and structured data with Wasserstein barycenters and a geodesic criterion for the number of clusters

GI Papayiannis, GN Domazakis… - Journal of Statistical …, 2021 - Taylor & Francis

Clustering schemes for uncertain and structured data are considered relying on the notion of

Wasserstein barycenters, accompanied by appropriate clustering indices based on the

intrinsic geometry of the Wasserstein space. Such type of clustering approaches are highly …

  Related articles

  

[PDF] archives-ouvertes.fr

Measuring the Irregularity of Vector-Valued Morphological Operators using Wasserstein Metric

ME Valle, S Francisco, MA Granero… - … Conference on Discrete …, 2021 - Springer

Mathematical morphology is a useful theory of nonlinear operators widely used for image

processing and analysis. Despite the successful application of morphological operators for

binary and gray-scale images, extending them to vector-valued images is not straightforward …

 Related articles All 17 versions


 [PDF] archives-ouvertes.fr

Sampled Gromov Wasserstein

T Kerdoncuff, R Emonet, M Sebban - 2021 - hal.archives-ouvertes.fr

Optimal Transport (OT) has proven to be a powerful tool to compare probability distributions

in machine learning, but dealing with probability measures lying in different spaces remains

an open problem. To address this issue, the Gromov Wasserstein distance (GW) only …

   Cited by 5 Related articles 


2021


Wasserstein GAN: Deep Generation Applied on Financial Time Series

M Pfenninger, S Rikli, DN Bigler - Available at SSRN 3877960, 2021 - papers.ssrn.com

Modeling financial time series is challenging due to their high volatility and unexpected

happenings on the market. Most financial models and algorithms trying to fill the lack of

historical financial time series struggle to perform and are highly vulnerable to overfitting. As …
Cited by 2 Related articles All 2 versions
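Several WGAN entries in this list (including the one above) rest on the same construction: a critic network approximates the Kantorovich potential of the 1-Wasserstein distance, with a gradient penalty softly enforcing the 1-Lipschitz constraint. For reference only, a minimal PyTorch sketch of that critic objective (WGAN-GP style) follows; it is not taken from any cited paper, and critic, real, fake and gp_weight are hypothetical placeholders.

import torch

def critic_loss_wgan_gp(critic, real, fake, gp_weight=10.0):
    """Generic WGAN-GP critic loss for batches real, fake of shape (batch, ...):
    maximize E[critic(real)] - E[critic(fake)] while a gradient penalty keeps
    the critic approximately 1-Lipschitz."""
    fake = fake.detach()  # critic update only; the generator is trained separately
    # Random interpolates between real and fake samples for the penalty term.
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    grad, = torch.autograd.grad(critic(interp).sum(), interp, create_graph=True)
    penalty = ((grad.flatten(1).norm(2, dim=1) - 1.0) ** 2).mean()
    # Sign flipped because optimizers minimize.
    return critic(fake).mean() - critic(real).mean() + gp_weight * penalty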

[PDF] arxiv.org

Geometry on the Wasserstein space over a compact Riemannian manifold

H Ding, S Fang - arXiv preprint arXiv:2104.00910, 2021 - arxiv.org

For the sake of simplicity, we will consider in this paper a connected compact Riemannian manifold

M of dimension m. We denote by dM the Riemannian distance and dx the Riemannian measure

on M such that ∫M dx = 1. Since the diameter of M is finite, any probability measure µ on M is …

Related articles All 9 versions 

MR4335934 

[PDF] thecvf.com

A Sliced Wasserstein Loss for Neural Texture Synthesis

E Heitz, K Vanhoey, T Chambon… - Proceedings of the …, 2021 - openaccess.thecvf.com

We address the problem of computing a textural loss based on the statistics extracted from

the feature activations of a convolutional neural network optimized for object recognition (eg

VGG-19). The underlying mathematical problem is the measure of the distance between two …

Conference Paper  Citation/Abstract
Cited by 10 Related articles All 7 versions 
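The sliced Wasserstein loss used in the texture-synthesis entry above reduces a high-dimensional comparison to many one-dimensional ones: project both feature sets onto random directions, sort the projections, and average the resulting 1-D transport costs. A minimal NumPy sketch of this Monte Carlo estimator follows; it is a generic illustration with hypothetical inputs, not the authors' implementation.

import numpy as np

def sliced_wasserstein2(x, y, n_projections=128, seed=0):
    """Monte Carlo estimate of the squared sliced 2-Wasserstein distance between
    two point clouds x, y of shape (n, d) with the same number of points n:
    sorting the 1-D projections gives the optimal coupling on each slice."""
    rng = np.random.default_rng(seed)
    theta = rng.normal(size=(n_projections, x.shape[1]))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)  # unit directions
    px = np.sort(x @ theta.T, axis=0)  # shape (n, n_projections)
    py = np.sort(y @ theta.T, axis=0)
    return float(np.mean((px - py) ** 2))

# Example: two Gaussian clouds in R^8 separated by a mean shift.
rng = np.random.default_rng(1)
print(sliced_wasserstein2(rng.normal(size=(500, 8)), rng.normal(0.5, 1.0, (500, 8))))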

[PDF] yuxinirisye.com

[PDF] Wasserstein Learning of Generative Models

Y Ye - yuxinirisye.com

This project presents a variant of generative adversarial nets minimizing a Wasserstein

metric measuring the distance between the generator distribution and the data distribution.

The Wasserstein Generative Adversarial Network, or Wasserstein GAN, is an extension to …

   

[PDF] aaai.org

[PDF] Fast PCA in 1-D Wasserstein Spaces via B-splines Representation and Metric Projection

M Pegoraro, M Beraha - Proceedings of the AAAI Conference on Artificial …, 2021 - aaai.org

We address the problem of performing Principal Component Analysis over a family of

probability measures on the real line, using the Wasserstein geometry. We present a novel

representation of the 2-Wasserstein space, based on a well known isometric bijection and a …

  Related articles All 3 versions 
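The starting point of the 1-D Wasserstein PCA entry above is the classical isometry between the 2-Wasserstein space on the real line and a subset of L2(0,1): each measure is represented by its quantile function, and W2 becomes the L2 distance between quantile functions, so linear tools such as PCA can operate on those representations. Below is a generic NumPy illustration of the embedding on empirical samples with evenly spaced quantile levels; it is not the paper's B-spline construction.

import numpy as np

def quantile_embedding(sample, n_grid=200):
    """Represent an empirical measure on R by its quantile function evaluated
    on an even grid of levels in (0, 1)."""
    levels = (np.arange(n_grid) + 0.5) / n_grid
    return np.quantile(sample, levels)

def w2_from_embeddings(qx, qy):
    """2-Wasserstein distance = L2 distance between quantile functions."""
    return float(np.sqrt(np.mean((qx - qy) ** 2)))

# Embedded samples can be stacked into a matrix and fed to ordinary PCA,
# which is the spirit of the quantile (here grid-based, not B-spline) representation.
rng = np.random.default_rng(0)
Q = np.stack([quantile_embedding(rng.normal(loc=m, size=2000)) for m in (0.0, 1.0, 2.0)])
print(w2_from_embeddings(Q[0], Q[2]))  # close to the mean shift of 2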




  [PDF] mlr.press

[PDF] Supplementary Material for Wasserstein Distributional Normalization

SW Park, J Kwon - proceedings.mlr.press

Our collaboration model with co-teaching achieved the most accurate performance for the

CIFAR-100 dataset with asymmetric noise, which verifies that our WDN can be integrated

into existing methods to improve their performance significantly, especially when the density …

Related articles 

<——2021———2021———1090——


2021 see 2020

Drug–drug interaction prediction with Wasserstein Adversarial Autoencoder-based knowledge graph embeddings

Y Dai, C Guo, W Guo, C Eickhoff - Briefings in Bioinformatics, 2021 - academic.oup.com

An interaction between pharmacological agents can trigger unexpected adverse events.

Capturing richer and more comprehensive information about drug–drug interactions (DDIs)

is one of the key tasks in public health and drug development. Recently, several knowledge …

Cited by 15 Related articles All 9 versions

2021 see 2020 [PDF] projecteuclid.org

Posterior asymptotics in Wasserstein metrics on the real line

M Chae, P De Blasi, SG Walker - Electronic Journal of Statistics, 2021 - projecteuclid.org

In this paper, we use the class of Wasserstein metrics to study asymptotic properties of

posterior distributions. Our first goal is to provide sufficient conditions for posterior

consistency. In addition to the well-known Schwartz's Kullback–Leibler condition on the …

  Related articles All 2 versions

MR4298980 Prelim Chae, Minwoo; De Blasi, Pierpaolo; Walker, Stephen G.; 

Posterior asymptotics in Wasserstein metrics on the real line. Electron. J. Stat. 15 (2021), no. 2, 3635–3677.

Review PDF Clipboard Journal Article

Cited by 2 Related articles All 10 versions

Channel Pruning for Accelerating Convolutional Neural Networks via Wasserstein Metric

by Duan, Haran; Li, Hui

Computer Vision – ACCV 2020, 02/2021

Channel pruning is an effective way to accelerate deep convolutional neural networks. However, it is still a challenge to reduce the

computational complexity...

Book Chapter  Full Text Online


2021 see 2020  [PDF] sinica.edu.tw

[PDF] Improving Perceptual Quality by Phone-Fortified Perceptual Loss using Wasserstein Distance for Speech Enhancement

TA Hsieh, C Yu, SW Fu, X Lu, Y Tsao - citi.sinica.edu.tw

Speech enhancement (SE) aims to improve speech quality and intelligibility, which are both

related to a smooth transition in speech segments that may carry linguistic information, eg

phones and syllables. In this study, we propose a novel phonefortified perceptual loss …

Cited by 6 Related articles

OPTIMAL TRANSPORT ALGORITHMS AND WASSERSTEIN BARYCENTERS

OY Kovalenko - INTERNATIONAL PROGRAM COMMITTEE, 2021 - pdmu.univ.kiev.ua

The work considers the question of finding the optimal algorithm that will be used to solve

the problem of finding Wasserstein's distance. The relevance of the research topic is that

today these algorithms are among the most common ways to use optimal transport and are …

  

2021 see 2020 [PDF] arxiv.org

The existence of minimizers for an isoperimetric problem with Wasserstein penalty term in unbounded domains

Q Xia, B Zhou - Advances in Calculus of Variations, 2021 - degruyter.com

In this article, we consider the (double) minimization problem \(\min\{P(E;\Omega)+\lambda W_p(E,F) : E\subseteq\Omega,\ F\subseteq\mathbb{R}^d,\ |E\cap F|=0,\ |E|=|F|=1\}\),

where \(\lambda\ge 0\), \(p\ge 1\), \(\Omega\) is a (possibly unbounded) domain in \(\mathbb{R}^d\), \(P(E;\Omega)\) denotes the relative perimeter of \(E\) in \(\Omega\) and \(W_p\) denotes the \(p\)-…

  Cited by 4 Related articles All 5 versions


2021  [PDF] mdpi.com

Unpaired image denoising via wasserstein GAN in low-dose CT image with multi-perceptual loss and fidelity loss

Z Yin, K Xia, Z He, J Zhang, S Wang, B Zu - Symmetry, 2021 - mdpi.com

The use of low-dose computed tomography (LDCT) in medical practice can effectively

reduce the radiation risk of patients, but it may increase noise and artefacts, which can

compromise diagnostic information. The methods based on deep learning can effectively …

  Cited by 2 Related articles All 3 versions 

Unpaired Image Denoising via Wasserstein GAN in Low-Dose CT Image with Multi-Perceptual Loss and Fidelity Loss. Symmetry 2021, 13, 126

Z Yin, K Xia, Z He, J Zhang, S Wang, B Zu - 2021 - search.proquest.com

The use of low-dose computed tomography (LDCT) in medical practice can effectively

reduce the radiation risk of patients, but it may increase noise and artefacts, which can

compromise diagnostic information. The methods based on deep learning can effectively …

  Related articles


2021  [PDF] thecvf.com

A Sliced Wasserstein Loss for Neural Texture Synthesis

E Heitz, K Vanhoey, T Chambon… - Proceedings of the …, 2021 - openaccess.thecvf.com

We address the problem of computing a textural loss based on the statistics extracted from

the feature activations of a convolutional neural network optimized for object recognition (eg

VGG-19). The underlying mathematical problem is the measure of the distance between two …

  Related articles All 4 versions 


2021 [PDF] acm.org

P2E-WGAN: ECG waveform synthesis from PPG with conditional wasserstein generative adversarial networks

K Vo, EK Naeini, A Naderi, D Jilani… - Proceedings of the 36th …, 2021 - dl.acm.org

Electrocardiogram (ECG) is routinely used to identify key cardiac events such as changes in

ECG intervals (PR, ST, QT, etc.), as well as capture critical vital signs such as heart rate (HR)

and heart rate variability (HRV). The gold standard ECG requires clinical measurement …

  Related articles All 2 versions


2021

Geometry on the Wasserstein space over a compact Riemannian manifold

H Ding, S Fang - arXiv preprint arXiv:2104.00910, 2021 - arxiv.org

For the sake of simplicity, we will consider in this paper a connected compact Riemannian manifold

M of dimension m. We denote by dM the Riemannian distance and dx the Riemannian measure

on M such that ∫M dx = 1. Since the diameter of M is finite, any probability measure µ on M is …

  Related articles All 9 versions 

<——2021———2021———1100—— 


2021  [PDF] arxiv.org

Isometric Rigidity of compact Wasserstein spaces

J Santos-Rodríguez - arXiv preprint arXiv:2102.08725, 2021 - arxiv.org

Let $(X, d,\mathfrak {m}) $ be a metric measure space. The study of the Wasserstein space

$(\mathbb {P} _p (X),\mathbb {W} _p) $ associated to $ X $ has proved useful in describing

several geometrical properties of $ X. $ In this paper we focus on the study of isometries of …

  Related articles All 3 versions 


Unpaired Single-Image Depth Synthesis with cycle-consistent Wasserstein GANs

C Angermann, A Moravová, M Haltmeier… - arXiv preprint arXiv …, 2021 - arxiv.org

Real-time estimation of actual environment depth is an essential module for various

autonomous system tasks such as localization, obstacle detection and pose estimation.

During the last decade of machine learning, extensive deployment of deep learning …

  All 2 versions 


2021

Linear and Deep Order-Preserving Wasserstein Discriminant Analysis

B Su, J Zhou, JR Wen, Y Wu - IEEE Transactions on Pattern …, 2021 - ieeexplore.ieee.org

Supervised dimensionality reduction for sequence data learns a transformation that maps

the observations in sequences onto a low-dimensional subspace by maximizing the

separability of sequences in different classes. It is typically more challenging than …

  Related articles All 5 versions


[PDF] mlr.press

Exploring the Wasserstein metric for time-to-event analysis

T Sylvain, M Luck, J Cohen… - Survival Prediction …, 2021 - proceedings.mlr.press

Survival analysis is a type of semi-supervised task where the target output (the survival time)

is often right-censored. Utilizing this information is a challenge because it is not obvious how

to correctly incorporate these censored examples into a model. We study how three …


2021 see 2020

The α-z-Bures Wasserstein divergence

TH Dinh, CT Le, BK Vo, TD Vuong - Linear Algebra and its Applications, 2021 - Elsevier

In this paper, we introduce the α-z-Bures Wasserstein divergence for positive semidefinite matrices A and B as \(\Phi(A,B)=\mathrm{Tr}((1-\alpha)A+\alpha B)-\mathrm{Tr}(Q_{\alpha,z}(A,B))\),

where \(Q_{\alpha,z}(A,B)=\big(A^{\frac{1-\alpha}{2z}}B^{\frac{\alpha}{z}}A^{\frac{1-\alpha}{2z}}\big)^{z}\) is the matrix function in the α-z-Rényi relative entropy. We show …

  Related articles All 3 versions
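Since the divergence in the entry above is given in closed form, it can be evaluated directly. The sketch below is only a transcription of the displayed formula for positive definite A and B, using SciPy's fractional matrix powers; it is not code from the paper, and the check at the end simply verifies that the divergence of a matrix with itself vanishes.

import numpy as np
from scipy.linalg import fractional_matrix_power as fmp

def alpha_z_bures_wasserstein(A, B, alpha, z):
    """Phi(A, B) = Tr((1-alpha)A + alpha*B) - Tr(Q_{alpha,z}(A, B)), with
    Q_{alpha,z}(A, B) = (A^{(1-alpha)/(2z)} B^{alpha/z} A^{(1-alpha)/(2z)})^z."""
    A_out = fmp(A, (1.0 - alpha) / (2.0 * z))
    Q = fmp(A_out @ fmp(B, alpha / z) @ A_out, z)
    # .real strips the tiny imaginary round-off that matrix powers can introduce.
    return float((np.trace((1.0 - alpha) * A + alpha * B) - np.trace(Q)).real)

A = np.array([[2.0, 0.3], [0.3, 1.0]])
print(abs(alpha_z_bures_wasserstein(A, A, alpha=0.5, z=1.0)) < 1e-8)  # True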


2021  [PDF] arxiv.org

A data-driven event generator for Hadron Colliders using Wasserstein Generative Adversarial Network

S Choi, JH Lim - Journal of the Korean Physical Society, 2021 - Springer

Abstract Highly reliable Monte-Carlo event generators and detector simulation programs are

important for the precision measurement in the high energy physics. Huge amounts of

computing resources are required to produce a sufficient number of simulated events …

   Related articles All 5 versions


2021  [PDF] arxiv.org

Robust W-GAN-Based Estimation Under Wasserstein Contamination

Z Liu, PL Loh - arXiv preprint arXiv:2101.07969, 2021 - arxiv.org

Robust estimation is an important problem in statistics which aims at providing a reasonable

estimator when the data-generating distribution lies within an appropriately defined ball

around an uncontaminated distribution. Although minimax rates of estimation have been …

  Related articles All 2 versions 


arXiv:2107.14184  [pdf, ps, other]  math.ST  math.OC
Wasserstein Conditional Independence Testing
Authors: Andrew Warren
Abstract: We introduce a test for the conditional independence of random variables X and Y given a random variable Z, specifically by sampling from the joint distribution (X,Y,Z), binning the support of the distribution of Z, and conducting multiple p-Wasserstein two-sample tests. Under a p-Wasserstein Lipschitz assumption on the conditional distributions L_{X|Z}, …  More
Submitted 29 July, 2021; originally announced July 2021.
Comments: 32 pages
MSC Class: 62G10 (Primary); 49Q22 (Secondary)

2 versions 
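The abstract above only sketches the recipe (bin Z, then run p-Wasserstein two-sample tests), so the following is a toy illustration of that general idea, not the paper's procedure: within each Z-bin, split the sample at the within-bin median of Y and compare the two conditional samples of X with a 1-Wasserstein statistic, calibrated by permuting Y inside the bins. All function and variable names are hypothetical.

import numpy as np
from scipy.stats import wasserstein_distance

def binned_w1_statistic(x, y, z, n_bins=5):
    """Average over Z-bins of W1 between {x_i : y_i above the bin median of y}
    and {x_i : y_i at or below it}. Under X independent of Y given Z this is small.
    Assumes each bin contains enough points to be split."""
    edges = np.quantile(z, np.linspace(0.0, 1.0, n_bins + 1))
    stats = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (z >= lo) & (z <= hi)
        xb, yb = x[mask], y[mask]
        upper = yb > np.median(yb)
        if 0 < upper.sum() < upper.size:
            stats.append(wasserstein_distance(xb[upper], xb[~upper]))
    return float(np.mean(stats))

def permutation_pvalue(x, y, z, n_bins=5, n_perm=200, seed=0):
    """Calibrate the statistic by shuffling y within Z-bins, which mimics the null."""
    rng = np.random.default_rng(seed)
    obs = binned_w1_statistic(x, y, z, n_bins)
    edges = np.quantile(z, np.linspace(0.0, 1.0, n_bins + 1))
    bin_id = np.clip(np.searchsorted(edges, z, side="right") - 1, 0, n_bins - 1)
    null = []
    for _ in range(n_perm):
        y_perm = y.copy()
        for b in range(n_bins):
            idx = np.where(bin_id == b)[0]
            y_perm[idx] = y[rng.permutation(idx)]
        null.append(binned_w1_statistic(x, y_perm, z, n_bins))
    return (1 + sum(s >= obs for s in null)) / (1 + n_perm)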


Altschuler, Jason M.; Boix-Adsera, Enric

Wasserstein barycenters can be computed in polynomial time in fixed dimension. (English) Zbl 07370561

J. Mach. Learn. Res. 22, Paper No. 44, 19 p. (2021).
[PDF] jmlr.org

[PDF] Wasserstein barycenters can be computed in polynomial time in fixed dimension.

JM Altschuler, E Boix-Adsera - J. Mach. Learn. Res., 2021 - jmlr.org

… We give the first algorithm that, in any fixed dimension d, solves the Wasserstein barycenter … 

accuracy ε > 0, computes an ε-additively approximate Wasserstein barycenter in poly(n, k,log…

 Cited by 23 Related articles All 7 versions 


Reconstruction Method for Missing Measurement Data Based on Wasserstein Generative Adversarial Network

C Zhang, H Chen, J He, H Yang - … Intelligence and Intelligent …, 2021 - jstage.jst.go.jp

Focusing on the issue of missing measurement data caused by complex and changeable

working conditions during the operation of high-speed trains, in this paper, a framework for

the reconstruction of missing measurement data based on a generative adversarial network …

  Related articles All 4 versions

MSC:  68T05

<——2021———2021———1110——


Wasserstein contrastive representation distillation

L Chen, D Wang, Z Gan, J Liu… - Proceedings of the …, 2021 - openaccess.thecvf.com

The primary goal of knowledge distillation (KD) is to encapsulate the information of a model

learned from a teacher network into a student network, with the latter being more compact

than the former. Existing work, eg, using Kullback-Leibler divergence for distillation, may fail …

  Cited by 2 Related articles All 4 versions 


year 2021  [PDF] thecvf.com

[PDF] Wasserstein Contrastive Representation Distillation: Supplementary Material

L Chen, D Wang, Z Gan, J Liu, R Henao, L Carin - openaccess.thecvf.com

• Wide Residual Network (WRN)[20]: WRN-dw represents wide ResNet with depth d and

width factor w.• resnet [3]: We use ResNet-d to represent CIFAR-style resnet with 3 groups of

basic blocks, each with 16, 32, and 64 channels, respectively. In our experiments, resnet8x4 …

  Related articles 


2021  [PDF] arxiv.org

Wasserstein diffusion on graphs with missing attributes

Z Chen, T Ma, Y Song, Y Wang - arXiv preprint arXiv:2102.03450, 2021 - arxiv.org

Missing node attributes is a common problem in real-world graphs. Graph neural networks

have been demonstrated powerful in graph representation learning, however, they rely

heavily on the completeness of graph information. Few of them consider the incomplete …

  Related articles All 3 versions 


2021  [PDF] iop.org

A deep learning-based approach for direct PET attenuation correction using Wasserstein generative adversarial network

Y Li, W Wu - Journal of Physics: Conference Series, 2021 - iopscience.iop.org

Positron emission tomography (PET) in some clinical assistant diagnose demands

attenuation correction (AC) and scatter correction (SC) to obtain high-quality imaging,

leading to gaining more precise metabolic information in tissue or organs of patient …

  Related articles All 3 versions


2021  [PDF] arxiv.org

AWCD: An Efficient Point Cloud Processing Approach via Wasserstein Curvature

Y Luo, A Yang, F Sun, H Sun - arXiv preprint arXiv:2105.04402, 2021 - arxiv.org

In this paper, we introduce the adaptive Wasserstein curvature denoising (AWCD), an

original processing approach for point cloud data. By collecting curvatures information from

Wasserstein distance, AWCD consider more precise structures of data and preserves …

  Related articles All 2 versions 

2021


Fault Diagnosis of Rotating Machinery Based on Wasserstein Distance and Feature Selection

F Ferracuti, A Freddi, A Monteriù… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org

This article presents a fault diagnosis algorithm for rotating machinery based on the

Wasserstein distance. Recently, the Wasserstein distance has been proposed as a new

research direction to find better distribution mapping when compared with other popular …

  Related articles


2021

 Transfer learning method for bearing fault diagnosis based on fully convolutional conditional Wasserstein adversarial Networks

YZ Liu, KM Shi, ZX Li, GF Ding, YS Zou - Measurement, 2021 - Elsevier

The diagnostic accuracy of existing transfer learning-based bearing fault diagnosis methods

is high in the source condition, but accuracy in the target condition is not guaranteed. These

methods mainly focus on the whole distribution of bearing source domain data and target …

  Related articles All 2 versions


2021

Data augmentation for rolling bearing fault diagnosis using an enhanced few-shot Wasserstein auto-encoder with meta-learning

Z Pei, H Jiang, X Li, J Zhang, S Liu - Measurement Science and …, 2021 - iopscience.iop.org

Despite the advance of intelligent fault diagnosis for rolling bearings, in industries, data-

driven methods still suffer from data acquisition and imbalance. We propose an enhanced

few-shot Wasserstein auto-encoder (fs-WAE) to reverse the negative effect of imbalance …

  Related articles



2021  [PDF] mdpi.com

Power Electric Transformer Fault Diagnosis Based on Infrared Thermal Images Using Wasserstein Generative Adversarial Networks and Deep Learning Classifier

KH Fanchiang, YC Huang, CC Kuo - Electronics, 2021 - mdpi.com

The safety of electric power networks depends on the health of the transformer. However,

once a variety of transformer failure occurs, it will not only reduce the reliability of the power

system but also cause major accidents and huge economic losses. Until now, many …

  Related articles All 3 versions 


2021  [PDF] arxiv.org

Berry–Esseen Smoothing Inequality for the Wasserstein Metric on Compact Lie Groups

B Borda - Journal of Fourier Analysis and Applications, 2021 - Springer

We prove a sharp general inequality estimating the distance of two probability measures on

a compact Lie group in the Wasserstein metric in terms of their Fourier transforms. We use a

generalized form of the Wasserstein metric, related by Kantorovich duality to the family of …

  Related articles All 5 versions

<——2021———2021———1120——


2021  [PDF] arxiv.org

1-Wasserstein distance on the standard simplex

A Frohmader, H Volkmer - Algebraic Statistics, 2021 - msp.org

Wasserstein distances provide a metric on a space of probability measures. We consider the

space Ω of all probability measures on the finite set χ={1,…, n}, where n is a positive integer.

The 1-Wasserstein distance, W 1 (μ, ν), is a function from Ω× Ω to [0,∞). This paper derives …

   Related articles All 3 versions
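For the setting of the entry above (probability vectors on χ = {1, …, n}), the 1-Wasserstein distance has a simple closed form when the ground metric is |i − j|: it equals the L1 distance between the cumulative sums. The NumPy check below compares this against SciPy; the choice of ground metric is an assumption made for the illustration, not necessarily the one analyzed in the paper.

import numpy as np
from scipy.stats import wasserstein_distance

def w1_on_line(mu, nu):
    """W1 between probability vectors mu, nu supported on {1, ..., n} with
    ground metric |i - j|: the L1 distance between their CDFs."""
    return float(np.sum(np.abs(np.cumsum(mu) - np.cumsum(nu))))

mu = np.array([0.2, 0.5, 0.3])
nu = np.array([0.4, 0.1, 0.5])
points = np.arange(1, 4)
print(w1_on_line(mu, nu))                            # 0.4
print(wasserstein_distance(points, points, mu, nu))  # same value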



[PDF] mlr.press

Improved complexity bounds in wasserstein barycenter problem

D Dvinskikh, D Tiapkin - International Conference on …, 2021 - proceedings.mlr.press

In this paper, we focus on computational aspects of the Wasserstein barycenter problem. We

propose two algorithms to compute Wasserstein barycenters of $ m $ discrete measures of

size $ n $ with accuracy $\e $. The first algorithm, based on mirror prox with a specific norm …

  Cited by 6 Related articles All 3 versions 
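A convenient sanity check for barycenter algorithms such as those in the entry above is the one-dimensional case, where the 2-Wasserstein barycenter is known in closed form: its quantile function is the weighted average of the input quantile functions. The NumPy sketch below uses this textbook identity; it is not one of the algorithms proposed in the cited papers.

import numpy as np

def w2_barycenter_1d(samples, weights=None, n_grid=512):
    """Closed-form 2-Wasserstein barycenter of one-dimensional empirical measures,
    returned as its quantile function evaluated on an even grid of levels."""
    k = len(samples)
    weights = np.full(k, 1.0 / k) if weights is None else np.asarray(weights, float)
    levels = (np.arange(n_grid) + 0.5) / n_grid
    quantiles = np.stack([np.quantile(s, levels) for s in samples])
    return weights @ quantiles

# The barycenter of samples from N(-2, 1) and N(2, 1) is close to N(0, 1).
rng = np.random.default_rng(0)
bary_q = w2_barycenter_1d([rng.normal(-2, 1, 5000), rng.normal(2, 1, 5000)])
print(bary_q.mean())  # approximately 0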


Low-Dose CT Denoising Using A Progressive Wasserstein Generative Adversarial Network

G Wang, X Hu - Computers in Biology and Medicine, 2021 - Elsevier

Low-dose computed tomography (LDCT) imaging can greatly reduce the radiation dose

imposed on the patient. However, image noise and visual artifacts are inevitable when the

radiation dose is low, which has serious impact on the clinical medical diagnosis. Hence, it …

  All 4 versions


[HTML] mdpi.com

WDA: An Improved Wasserstein Distance-Based Transfer Learning Fault Diagnosis Method

Z Zhu, L Wang, G Peng, S Li - Sensors, 2021 - mdpi.com

With the growth of computing power, deep learning methods have recently been widely

used in machine fault diagnosis. In order to realize highly efficient diagnosis accuracy,

people need to know the detailed health condition of collected signals from equipment …

  All 10 versions 


Full View

Short‐term railway passenger demand forecast using improved Wasserstein generative adversarial nets and web search terms

F Feng, J Zhang, C Liu, W Li… - IET Intelligent Transport …, 2021 - Wiley Online Library

Accurately predicting railway passenger demand is conducive for managers to quickly

adjust strategies. It is time‐consuming and expensive to collect large‐scale traffic data. With

the digitization of railway tickets, a large amount of user data has been accumulated. We …

  Related articles 

 2021

 

[PDF] thecvf.com

A Sliced Wasserstein Loss for Neural Texture Synthesis

E Heitz, K Vanhoey, T Chambon… - Proceedings of the …, 2021 - openaccess.thecvf.com

We address the problem of computing a textural loss based on the statistics extracted from

the feature activations of a convolutional neural network optimized for object recognition (eg

VGG-19). The underlying mathematical problem is the measure of the distance between two …

  Related articles All 4 versions 


[PDF] jmlr.org

[PDF] A Fast Globally Linearly Convergent Algorithm for the Computation of Wasserstein Barycenters.

L Yang, J Li, D Sun, KC Toh - J. Mach. Learn. Res., 2021 - jmlr.org

We consider the problem of computing a Wasserstein barycenter for a set of discrete

probability distributions with finite supports, which finds many applications in areas such as

statistics, machine learning and image processing. When the support points of the …

  Cited by 9 Related articles All 22 versions 

[CITATION] A Fast Globally Linearly Convergent Algorithm for the Computation of Wasserstein Barycenters. eprint

L Yang, J Li, D Sun, KC Toh - arXiv preprint arXiv:1809.04249, 2019

  Cited by 4 Related articles


  

[PDF] aaai.org

[PDF] Fast PCA in 1-D Wasserstein Spaces via B-splines Representation and Metric Projection

M Pegoraro, M Beraha - Proceedings of the AAAI Conference on Artificial …, 2021 - aaai.org

We address the problem of performing Principal Component Analysis over a family of

probability measures on the real line, using the Wasserstein geometry. We present a novel

representation of the 2-Wasserstein space, based on a well known isometric bijection and a …

  Related articles All 2 versions 


 

[PDF] arxiv.org

Isometric Rigidity of compact Wasserstein spaces

J Santos-Rodríguez - arXiv preprint arXiv:2102.08725, 2021 - arxiv.org

Let $(X, d,\mathfrak {m}) $ be a metric measure space. The study of the Wasserstein space

$(\mathbb {P} _p (X),\mathbb {W} _p) $ associated to $ X $ has proved useful in describing

several geometrical properties of $ X. $ In this paper we focus on the study of isometries of …

  Related articles All 3 versions 


[PDF] arxiv.org

Wasserstein Proximal Algorithms for the Schrödinger Bridge Problem: Density Control with Nonlinear Drift

K Caluya, A Halder - IEEE Transactions on Automatic Control, 2021 - ieeexplore.ieee.org

We study the Schrödinger bridge problem (SBP) with nonlinear prior dynamics. In

control-theoretic language, this is a problem of minimum effort steering of a given joint state

probability density function (PDF) to another over a finite time horizon, subject to a controlled …

  Cited by 8 Related articles All 6 versions

<——2021———2021———1130——



[PDF] arxiv.org

Decentralized Algorithms for Wasserstein Barycenters

D Dvinskikh - arXiv preprint arXiv:2105.01587, 2021 - arxiv.org

In this thesis, we consider the Wasserstein barycenter problem of discrete probability

measures from computational and statistical sides in two scenarios:(I) the measures are

given and we need to compute their Wasserstein barycenter, and (ii) the measures are …

  Related articles All 2 versions 

 

[PDF] iop.org

A deep learning-based approach for direct PET attenuation correction using Wasserstein generative adversarial network

Y Li, W Wu - Journal of Physics: Conference Series, 2021 - iopscience.iop.org

Positron emission tomography (PET) in some clinical assistant diagnose demands

attenuation correction (AC) and scatter correction (SC) to obtain high-quality imaging,

leading to gaining more precise metabolic information in tissue or organs of patient …

  Related articles All 3 versions



OPTIMAL TRANSPORT ALGORITHMS AND WASSERSTEIN BARYCENTERS

OY Kovalenko - INTERNATIONAL PROGRAM COMMITTEE, 2021 - pdmu.univ.kiev.ua

The work considers the question of finding the optimal algorithm that will be used to solve

the problem of finding Wasserstein's distance. The relevance of the research topic is that

today these algorithms are among the most common ways to use optimal transport and are …

  

[PDF] arxiv.org

Approximation algorithms for 1-Wasserstein distance between persistence diagrams

S Chen, Y Wang - arXiv preprint arXiv:2104.07710, 2021 - arxiv.org

Recent years have witnessed a tremendous growth using topological summaries, especially

the persistence diagrams (encoding the so-called persistent homology) for analyzing

complex shapes. Intuitively, persistent homology maps a potentially complex input object (be …

   Related articles All 4 versions 


MR4293940 Prelim Dąbrowski, Damian; Sufficient Condition for Rectifiability Involving Wasserstein Distance \(W_2\). J. Geom. Anal. 31 (2021), no. 8, 8539–8606. 28A75 (28A78)

Sufficient condition for rectifiability involving Wasserstein distance \(W_2\)

by Dąbrowski, Damian

arXiv.org, 09/2020

A Radon measure \(\mu\) is \(n\)-rectifiable if it is absolutely continuous with respect to \(\mathcal{H}^n\) and \(\mu\)-almost all of \(\text{supp}\,\mu\)...

Paper  Full Text Online


2021


A TextCNN and WGAN-gp based deep learning frame for ...

https://www.researchgate.net › ... › Multimedia

Request PDF | A TextCNN and WGAN-gp based deep learning frame for unpaired text style transfer in multimedia services | With 

2021 see 2020 onlineCover Image PEER-REVIEW

A TextCNN and WGAN-gp based deep learning frame for unpaired text style transfer in multimedia services

by Hu, Mingxuan; He, Min; Su, Wei ; More...

Multimedia systems, 08/2021, Volume 27, Issue 4

Keywords: Big multimedia data; TextCNN; WGAN-gp; Unpaired text style transfer; Multimedia services With the rapid growth of big multimedia data, multimedia...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

  Cited by 3 Related articles All 4 versions

Publication:Multimedia Systems, 27, 20201123, 723
Publisher:2020

Sufficient Condition for Rectifiability Involving Wasserstein ...

https://www.researchgate.net › Home › Alpha

Apr 2, 2021 — Request PDF | Sufficient Condition for Rectifiability Involving Wasserstein Distance $$W_2 | A Radon measure \(\mu \) is n-rectifiable if it ...

Cover Image PEER-REVIEW

Sufficient Condition for Rectifiability Involving Wasserstein Distance W2

by Dąbrowski, Damian

The Journal of geometric analysis, 08/2021, Volume 31, Issue 8

Keywords: Rectifiability; Rectifiable measures; numbers; numbers A Radon measure is n-rectifiable if it is absolutely continuous with respect to and -almost...

Journal ArticleCitation Online

 

Data augmentation for rolling bearing fault diagnosis using an ...

https://iopscience.iop.org › article › meta

by Z Pei  2021 — Data augmentation for rolling bearing fault diagnosis using an enhanced few-shot Wasserstein auto-e

online Cover Image PEER-REVIEW

Data augmentation for rolling bearing fault diagnosis using an enhanced few-shot Wasserstein auto-encoder with meta-learning

by Pei, Zeyu; Jiang, Hongkai; Li, Xingqiu ; More...

Measurement science & technology, 08/2021, Volume 32, Issue 8

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

 Zbl 07388797


Wasserstein distance feature alignment learning for 2D image ...

https://www.sciencedirect.com › science › article › pii

by Y Zhou · 2021 — 2D image-based 3D model retrieval has become a hotspot topic in recent years. However, the current existing methods are limited by two aspects.

online Cover Image PEER-REVIEW

Wasserstein distance feature alignment learning for 2D image-based 3D model retrieval

by Zhou, Yaqian; Liu, Yu; Zhou, Heyu ; More...

Journal of visual communication and image representation, 08/2021, Volume 79

2D image-based 3D model retrieval has become a hotspot topic in recent years. However, the current existing methods are limited by two aspects. Firstly, they...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 Cited by 2 Related articles All 3 versions

Transfer learning method for bearing fault diagnosis based on ...

https://www.sciencedirect.com › science › article › pii

by YZ Liu · 2021 — The proposed model is a deep Fully Convolutional Conditional Wasserstein Adversarial Network (FCWAN). •. A difference classifier improves the diagnosis accuracy ...

online Cover Image PEER-REVIEW

Transfer learning method for bearing fault diagnosis based on fully convolutional conditional Wasserstein adversarial Networks

by Liu, Yong Zhi; Shi, Ke Ming; Li, Zhi Xuan ; More...

Measurement : journal of the International Measurement Confederation, 08/2021, Volume 180

•A transfer learning fault diagnosis model for bearing under different working conditions is proposed.•The proposed model is a deep Fully Convolutional...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

Cited by 16 Related articles

<——2021———2021———1140——


Drug–drug interaction prediction with Wasserstein Adversarial ...

https://academic.oup.com › bib › article-abstract › bbaa256

by Y Dai · 2021 — Capturing richer and more comprehensive information about drug–dr. ... prediction with Wasserstein Adversarial Autoencoder-based knowledge graph embeddings.

online Cover Image PEER-REVIEW

Drug-drug interaction prediction with Wasserstein Adversarial Autoencoder-based knowledge graph embeddings

Briefings in bioinformatics, 07/2021, Volume 22, Issue 4

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon
Cited by 28
Related articles All 9 versions


Wasserstein Convergence for Empirical Measures of ...

https://arxiv.org › math

by FYWang · 2021 — [Submitted on 24 Jul 2021]. Title:Wasserstein Convergence for Empirical Measures of Subordinated Diffusions on Riemannian Manifolds. Authors:Feng-Yu Wang, ...

online OPEN ACCESS

Wasserstein Convergence for Empirical Measures of Subordinated Diffusions on Riemannian Manifolds

by Wang, Feng-Yu; Wu, Bingyao

07/2021

Let $M$ be a connected compact Riemannian manifold possibly with a boundary, let $V\in C^2(M)$ such that $\mu(\d x):=\e^{V(x)}\d x$ is a probability measure,...

Journal ArticleFull Text Online

Related articles All 2 versions

Wasserstein Conditional Independence Testing

https://arxiv.org › math

by A Warren · 2021 — Mathematics > Statistics Theory. arXiv:2107.14184 (math). [Submitted on 29 Jul 2021]. Title:Wasserstein Conditional Independence Testing.

online  OPEN ACCESS

Wasserstein Conditional Independence Testing

by Warren, Andrew

07/2021

We introduce a test for the conditional independence of random variables $X$ and $Y$ given a random variable $Z$, specifically by sampling from the joint...

Journal ArticleFull Text Online

 

Two-sample goodness-of-fit tests on the flat torus based on ...

https://arxiv.org › pdf

by J González-Delgado · 2021 — Two-sample goodness-of-fit tests on the flat torus based on Wasserstein distance and their relevance to structural biology.

online  OPEN ACCESS

Two-sample goodness-of-fit tests on the flat torus based on Wasserstein distance and their relevance to structural biology

by González-Delgado, Javier; González-Sanz, Alberto; Cortés, Juan ; More...

PDF  07/2021

This work is motivated by the study of local protein structure, which is defined by two variable dihedral angles that take values from probability...

Journal ArticleFull Text Online

Cited by 4 Related articles All 40 versions 

Two-sample goodness-of-fit tests on the flat torus based on Wasserstein distance and their relevance to structural biology
González-Delgado, Javier; González-Sanz, Alberto; Cortés, Juan; Neuvial, Pierre. arXiv.org; Ithaca, Jul 31, 2021.


 Related articles All 2 versions


[PDF] arxiv.org

On Riemannian Optimization over Positive Definite Matrices with the Bures-Wasserstein Geometry

A Han, B Mishra, P Jawanpuria, J Gao - arXiv preprint arXiv:2106.00286, 2021 - arxiv.org

In this paper, we comparatively analyze the Bures-Wasserstein (BW) geometry with the

popular Affine-Invariant (AI) geometry for Riemannian optimization on the symmetric positive

definite (SPD) matrix manifold. Our study begins with an observation that the BW metric has …

  Related articles All 2 versions 
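The Bures-Wasserstein geometry compared in the entry above is the one induced by the 2-Wasserstein distance between centered Gaussians, which on SPD matrices has the closed form d(A,B)^2 = Tr A + Tr B - 2 Tr((A^{1/2} B A^{1/2})^{1/2}). A small SciPy sketch of this standard formula (not code from the paper) follows.

import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein(A, B):
    """Bures-Wasserstein distance between SPD matrices A and B, i.e. the
    2-Wasserstein distance between the Gaussians N(0, A) and N(0, B)."""
    A_sqrt = sqrtm(A)
    cross = sqrtm(A_sqrt @ B @ A_sqrt)
    d2 = float((np.trace(A) + np.trace(B) - 2.0 * np.trace(cross)).real)
    return float(np.sqrt(max(d2, 0.0)))  # clip tiny negative rounding errors

A = np.array([[2.0, 0.5], [0.5, 1.0]])
print(bures_wasserstein(A, np.eye(2)))
print(bures_wasserstein(A, A))  # 0 up to rounding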


2021


[PDF] arxiv.org

On absolutely continuous curves in the Wasserstein space over R and their representation by an optimal Markov process

C Boubel, N Juillet - arXiv preprint arXiv:2105.02495, 2021 - arxiv.org

Let \(\mu=(\mu_t)_{t\in\mathbb{R}}\) be a 1-parameter family of probability measures on \(\mathbb{R}\). In [13]

we introduced its "Markov-quantile" process: a process \(X=(X_t)_{t\in\mathbb{R}}\) that resembles at

most the quantile process attached to \(\mu\), among the Markov processes attached to …

  Related articles All 3 versions 


[PDF] arxiv.org

Second-order Conic Programming Approach for Wasserstein Distributionally Robust Two-stage Linear Programs

Z Wang, K You, S Song, Y Zhang - IEEE Transactions on …, 2021 - ieeexplore.ieee.org

This article proposes a second-order conic programming (SOCP) approach to solve

distributionally robust two-stage linear programs over 1-Wasserstein balls. We start from the

case with distribution uncertainty only in the objective function and then explore the case …

   Related articles All 4 versions


year 2021 modified  [PDF] psu.edu

[PDF] Wasserstein space over the Wiener space. Shizan Fang, Jinghai Shao, Karl-Theodor Sturm (IMB, BP 47870, Université de Bourgogne, 21078 Dijon …)

S FANG, J SHAO, KT STURM - Citeseer

The goal of this paper is to study optimal transportation problems and gradient flows of

probability measures on the Wiener space, based on and extending fundamental results of

Feyel-Ustünel. Carrying out the program of Ambrosio-Gigli-Savaré, we present a complete …

  Related articles All 4 versions 


2021

online Cover Image  OPEN ACCESS

Parameter-Transferred Wasserstein Generative Adversarial Network (PT-WGAN) for Low-Dose PET...

by Gong, Yu; Shan, Hongming; Teng, Yueyang ; More...

IEEE transactions on radiation and plasma medical sciences, 03/2021, Volume 5, Issue 2

Due to the widespread of positron emission tomography (PET) in clinical practice, the potential risk of PET-associated radiation dose to patients needs to be...

Article PDF Download PDF BrowZine PDF Icon

Journal ArticleFull Text Online

 Parameter-Transferred Wasserstein Generative Adversarial Network (PT-WGAN) for Low-Dose PET Image Denoising

2 citations* for all

2 citations*

2021 IEEE TRANSACTIONS ON RADIATION AND PLASMA MEDICAL SCIENCES

Yu Gong 1,Hongming Shan 2,Yueyang Teng 1,Ning Tu 3,Ming Li 4 see all 8 authors

1 Northeastern University (China) ,2 Fudan University ,3 Wuhan University ,4 MI Research and Development Division, Neusoft Medical Systems Company, Ltd., Shenyang, China

Image noise

Noise reduction

View More (9+) 

Due to the widespread of positron emission tomography (PET) in clinical practice, the potential risk of PET-associated radiation dose to patients needs to be minimized. However, with the reduction in the radiation dose, the resultant images may suffer from noise and artifacts that compromise diagnos... View Full Abstract 


arXiv:2108.03815  [pdf, other]  cs.CV  cs.AI
P-WAE: Generalized Patch-Wasserstein Autoencoder for Anomaly Screening
Authors: Yurong Chen
Abstract: To mitigate the inspector's workload and improve the quality of the product, computer vision-based anomaly detection (AD) techniques are gradually deployed in real-world industrial scenarios. Recent anomaly analysis benchmarks progress to generative models. The aim is to model the defect-free distribution so that anomalies can be classified as out-of-distribution samples. Nevertheless, there are t… 
More
Submitted 9 August, 2021; originally announced August 2021.

All 2 versions 
<——2021———2021———1150——


arXiv:2108.02120  [pdf, other]  math.ST  cs.LG  math.OC  stat.ML
Statistical Analysis of Wasserstein Distributionally Robust Estimators
Authors: Jose Blanchet, Karthyek Murthy, Viet Anh Nguyen
Abstract: We consider statistical methods which invoke a min-max distributionally robust formulation to extract good out-of-sample performance in data-driven optimization and learning problems. Acknowledging the distributional uncertainty in learning from limited samples, the min-max formulations introduce an adversarial inner player to explore unseen covariate data. The resulting Distributionally Robust Op…  More
Submitted 4 August, 2021; originally announced August 2021.

Cited by 9 Related articles All 4 versions

 2021 see 2020

Graph Diffusion Wasserstein Distances

Barbe, A; Sebban, M; (...); Gribonval, R

European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD)

2021 | MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2020, PT II 12458 , pp.577-592

Optimal Transport (OT) for structured data has received much attention in the machine learning community, especially for addressing graph classification or graph transfer learning tasks. In this paper, we present the Diffusion Wasserstein (DW) distance, as a generalization of the standard Wasserstein distance to undirected and connected graphs where nodes are described by feature vectors. DW is based on the Laplacian exponential kernel and benefits from the heat diffusion to catch both structural and feature information from the graphs. We further derive lower/upper bounds on DW and show that it can be directly plugged into the Fused GromovWasserstein (FGW) distance that has been recently proposed, leading - for free - to a DifFused Gromov Wasserstein distance (DFGW) that allows a significant performance boost when solving graph domain adaptation tasks.

Show more   Free Submitted Article From RepositoryFull Text at Publisher

References   Related records

 

 Distributionally Robust Inverse Covariance Estimation: The Wasserstein Shrinkage Estimator

19 citations*

2018 ARXIV: OPTIMIZATION AND CONTROL

View More 

 Distributionally Robust Inverse Covariance Estimation: The Wasserstein Shrinkage Estimator

0 citations*

2018 RESEARCH PAPERS IN ECONOMICS

View More 

 A Wasserstein Minimax Framework for Mixed Linear Regression

0 citations* for all

0 citations*

2021 INTERNATIONAL CONFERENCE ON MACHINE LEARNING

Theo Diamandis 1,Yonina Eldar 2,Alireza Fallah 1,Farzan Farnia 1,Asuman Ozdaglar 1

1 Massachusetts Institute of Technology ,2 Weizmann Institute of Science

Mixture model

Minimax

View More (8+) 

Multi-modal distributions are commonly used to model clustered data in statistical learning tasks. In this paper, we consider the Mixed Linear Regression (MLR) problem. We propose an optimal transport-based framework for MLR problems, Wasserstein Mixed Linear Regression (WMLR), which minimizes the W... View Full Abstract 

Cited by 1 Related articles All 6 versions 

 Wasserstein F-tests and confidence bands for the Fréchet regression of density response curves
Petersen, Alexander; Liu, Xi; Divani, Afshin A. Annals of Statistics; Hayward Vol. 49, Iss. 1,  (Feb 2021): 590.


   2021

Statistical Analysis of Wasserstein Distributionally Robust Estimators

J Blanchet, K Murthy, VA Nguyen - arXiv e-prints, 2021 - ui.adsabs.harvard.edu

We consider statistical methods which invoke a min-max distributionally robust formulation

to extract good out-of-sample performance in data-driven optimization and learning

problems. Acknowledging the distributional uncertainty in learning from limited samples, the …

Cited by 11 Related articles All 4 versions

2021

 

 Confidence Regions in Wasserstein Distributionally Robust Estimation

7 citations*

2019 ARXIV: STATISTICS THEORY

   

 Deep Wasserstein Graph Discriminant Learning for Graph Classification.

0 citations*

2021 NATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE

Tong Zhang 1,Yun Wang ,Zhen Cui 2,Chuanwei Zhou 2,Baoliang Cui 3 see all 7 authors

1 South China University of Technology ,2 Nanjing University of Science and Technology ,3 Alibaba Group

Graph (abstract data type)

Discriminant

View More (3+) 

Cited by 8 Related articles All 3 versions 


 Fast Approximation of the Sliced-Wasserstein Distance Using Concentration of Random Projections.

0 citations*

2021 ARXIV: MACHINE LEARNING

Kimia Nadjahi ,Alain Durmus 1,Pierre E. Jacob 2,Roland Badeau ,Umut Simsekli 3

1 École Normale Supérieure ,2 Harvard University ,3 French Institute for Research in Computer Science and Automation

Approximation error

Multivariate random variable

View More (8+) 

The Sliced-Wasserstein distance (SW) is being increasingly used in machine learning applications as an alternative to the Wasserstein distance and offers significant computational and statistical benefits. Since it is defined as an expectation over random projections, SW is commonly approximated by ... View Full Abstract 

 Cited by 8 Related articles All 18 versions 

 The NYCKidSeq project: study protocol for a randomized controlled trial incorporating genomics into the clinical care of diverse New York City children.

5 citations* for all

4 citations*

2021 TRIALS

Jacqueline A. Odgis 1,Katie M. Gallagher 2,Sabrina A. Suckiel 1,Katherine E. Donohue 1,Michelle A. Ramos 1 see all 46 authors

1 Icahn School of Medicine at Mount Sinai ,2 Albert Einstein College of Medicine

Genetic testing

Return of results

View More (8+) 

Increasingly, genomics is informing clinical practice, but challenges remain for medical professionals lacking genetics expertise, and in access to and clinical utility of genomic testing for minority and underrepresented populations. The latter is a particularly pernicious problem due to the histor... View Full Abstract 

 Cited by 9 Related articles All 13 versions


 Relaxed Wasserstein with Applications to GANs

20 citations*

2017 ARXIV: MACHINE LEARNING

View More 

 Primal Dual Methods for Wasserstein Gradient Flows

24 citations* for all

4 citations*

2021 FOUNDATIONS OF COMPUTATIONAL MATHEMATICS

José A. Carrillo 1,Katy Craig 2,Li Wang 3,Chaozhen Wei 4

1 Imperial College London ,2 University of California, Santa Barbara ,3 University of Minnesota ,4 Hong Kong University of Science and Technology

Discretization

Discrete time and continuous time

View More (8+) 

Combining the classical theory of optimal transport with modern operator splitting techniques, we develop a new numerical method for nonlinear, nonlocal partial differential equations, arising in models of porous media, materials science, and biological swarming. Our method proceeds as follows: firs... View Full Abstract 

Cited by 28 Related articles All 4 versions
Cited by 29 Related articles All 6 versions

 Primal dual methods for Wasserstein gradient flows

20 citations*

2019 ARXIV: NUMERICAL ANALYSIS

 Efficient Wasserstein Natural Gradients for Reinforcement Learning

2 citations* for all

2 citations*

2021 INTERNATIONAL CONFERENCE ON LEARNING REPRESENTATIONS

Ted Moskovitz 1,Michael Arbel 2,Ferenc Huszar 3,Arthur Gretton 2

1 Gatsby Computational Neuroscience Unit,2 University College London ,3 University of Cambridge

Reinforcement learning

Divergence (statistics)

View More (4+) 

A novel optimization approach is proposed for application to policy gradient methods and evolution strategies for reinforcement learning (RL). The procedure uses a computationally efficient \emph{Wasserstein natural gradient} (WNG) descent that takes advantage of the geometry induced by a Wasserstei... View Full Abstract 

Cited by 34 Related articles All 10 versions

 

 Efficient Wasserstein Natural Gradients for Reinforcement Learning

0 citations*

2020 ARXIV: LEARNING

 <——2021———2021———1160——


 

  Covariance Steering of Discrete-Time Stochastic Linear Systems Based on Wasserstein Distance Terminal Cost

2 citations* for all

2 citations*

2021 IEEE CONTROL SYSTEMS LETTERS

Isin M. Balci ,Efstathios Bakolas

University of Texas at Austin

Stochastic control

Probability distribution

View More (8+) 

We consider a class of stochastic optimal control problems for discrete-time linear systems whose objective is the characterization of control policies that will steer the probability distribution of the terminal state of the system close to a desired Gaussian distribution. In our problem formulatio... View Full Abstract 

Covariance Steering of Discrete-Time Stochastic Linear Systems Based on Wasserstein Distance Terminal Cost. IEEE CONTROL SYSTEMS LETTERS  Volume: 5   Issue: 6   Pages: 2000-2005   Published: DEC 2021

  View Abstract


 2021 see 2020

 Covariance Steering of Discrete-Time Stochastic Linear Systems Based on Wasserstein Distance Terminal Cost.

0 citations*

2021 ADVANCES IN COMPUTING AND COMMUNICATIONS

View More 

 High-Confidence Attack Detection via Wasserstein-Metric Computations

5 citations* for all

5 citations*

2021 IEEE CONTROL SYSTEMS LETTERS

Dan Li ,Sonia Martinez

University of California, San Diego

Wasserstein metric

Fault detection and isolation

View More (8+) 

This letter considers a sensor attack and fault detection problem for linear cyber-physical systems, which are subject to system noise that can obey an unknown light-tailed distribution. We propose a new threshold-based detection mechanism that employs the Wasserstein metric, and which guarantees sy... View Full Abstract 

 MR4451854 

 

2021 see 2020 

 Distributional Sliced-Wasserstein and Applications to Generative Modeling

17 citations* for all

14 citations*

2021 INTERNATIONAL CONFERENCE ON LEARNING REPRESENTATIONS

Khai Nguyen 1,Nhat Ho 2,Tung Pham 3,Hung Bui 4

1 VinAI Research, Vietnam,2 University of Texas at Austin ,3 Vietnam National University, Hanoi ,4 Google

Projection (set theory)

Probability measure

View More (8+) 

Sliced-Wasserstein distance (SW) and its variant, Max Sliced-Wasserstein distance (Max-SW), have been used widely in the recent years due to their fast computation and scalability even when the probability measures lie in a very high dimensional space. However, SW requires many unnecessary projectio... View Full Abstract 

EXCERPTS (44)

 

Sample Out-of-Sample Inference Based on Wasserstein ...

https://pubsonline.informs.org › abs › opre.2020.2028

J Blanchet · 2021 · Cited by 22 — The methodology is inspired by empirical likelihood (EL), but we optimize the empirical Wasserstein distance (instead of the empirical ..

 

 Quantum statistical learning via Quantum Wasserstein natural gradient

0 citations*

2020 ARXIV: MATHEMATICAL PHYSICS

View More 

 Tropical optimal transport and Wasserstein distances

2 citations* for all

0 citations*

2021 INFORMATION GEOMETRY

Wonjun Lee 1,Wuchen Li 2,Bo Lin 3,Anthea Monod 4

1 University of California, Los Angeles ,2 University of South Carolina ,3 Georgia Institute of Technology ,4 Imperial College London

Tropical geometry

Metric (mathematics)

View More (8+) 

We study the problem of optimal transport in tropical geometry and define the Wasserstein-p distances in the continuous metric measure space setting of the tropical projective torus. We specify the tropical metric—a combinatorial metric that has been used to s…

Cited by 2 Related articles All 10 versions

2021

  

2021 see 2020b  

 Distributionally Robust Chance-Constrained Programs with Right-Hand Side Uncertainty under Wasserstein Ambiguity

7 citations*

2020 ARXIV: OPTIMIZATION AND CONTROL

View More 

 Ensemble Riemannian data assimilation over the Wasserstein space

0 citations* for all

0 citations*

2021 NONLINEAR PROCESSES IN GEOPHYSICS

Sagar K. Tamang 1,Ardeshir Ebtehaj 1,Peter J. van Leeuwen 2,Dongmian Zou 3,Gilad Lerman 1

1 University of Minnesota ,2 Colorado State University ,3 Duke University

Wasserstein metric

Euclidean distance

View More (8+) 

Abstract. In this paper, we present an ensemble data assimilation paradigm over a Riemannian manifold equipped with the Wasserstein metric. Unlike the Euclidean distance used in classic data assimilation methodologies, the Wasserstein metric can capture the translation and difference between the sha... View Full Abstract 

3 Related articles All 7 versions

 Reports on Programming from Institute of Basic Science Provide New Insights (

Distributionally Robust Chance-constrained Programs With Right-hand Side Uncertainty Under Wasserstein...
Mathematics Week, 03/2021
Newsletter  Full Text Online

 

 Projected Wasserstein gradient descent for high-dimensional Bayesian inference.

0 citations*

2021 ARXIV: LEARNING

Yifei Wang 1,Peng Chen 2,Wuchen Li 3

1 Stanford University ,2 University of Texas at Austin ,3 University of South Carolina

Gradient descent

Kernel density estimation

View More (7+) 

We propose a projected Wasserstein gradient descent method (pWGD) for high-dimensional Bayesian inference problems. The underlying density function of a particle system of WGD is approximated by kernel density estimation (KDE), which faces the long-standing curse of dimensionality. We overcome this ... View Full Abstract 

Cited by 1 Related articles All 4 versions 


 
1 January 2021  see 2019

Strong equivalence between metrics of Wasserstein type

Erhan Bayraktar, Gaoyue Guo

Electronic Communications in Probability Vol. 26, Issue none (Jan 2021), pg(s) 1-13

KEYWORDS: Duality; max-sliced Wasserstein metric; Optimal transport; Wasserstein metric

Read Abstract +


20 March 2021

Convergence rate in Wasserstein distance and semiclassical limit for the defocusing logarithmic Schrödinger equation

Guillaume Ferriere

Analysis & PDE Vol. 14, Issue 2 (Mar 2021), pg(s) 617-666

KEYWORDS: harmonic Fokker–Planck operator; kinetic isothermal Euler system; large-time behavior; logarithmic Schrödinger equation; semiclassical limit; Wasserstein distances; Wigner measures

Read Abstract +

DOWNLOAD PAPER+

Ferriere, Guillaume

Convergence rate in Wasserstein distance and semiclassical limit for the defocusing logarithmic Schrödinger equation. (English) Zbl 07403061

Anal. PDE 14, No. 2, 617-666 (2021).

MSC:  35Q55 35Q83 81Q20 35B25 35B35 35B40 35Q35 35Q84

PDF BibTeX XML Cite

Full Text: DOI

Cited by 8 Related articles All 7 versions

 

1 June 2021

Statistical inference for Bures–Wasserstein barycenters

Alexey Kroshnin, Vladimir Spokoiny, Alexandra Suvorikova

The Annals of Applied Probability Vol. 31, Issue 3 (Jun 2021), pg(s) 1264-1298

KEYWORDS: Bures–Wasserstein barycenter; central limit theorem; Concentration; Hermitian operators; Wasserstein barycenter

Cited by 27 Related articles All 8 versions

 <——2021———2021———1170—— 


 

Low-dose CT denoising using a Progressive Wasserstein generative adversarial network

G Wang, X Hu - Computers in Biology and Medicine, 2021 - Elsevier

Low-dose computed tomography (LDCT) imaging can greatly reduce the radiation dose

imposed on the patient. However, image noise and visual artifacts are inevitable when the

radiation dose is low, which has serious impact on the clinical medical diagnosis. Hence, it …

  All 4 versions


2021  [PDF] arxiv.org

Learning Domain Invariant Representations by Joint Wasserstein Distance Minimization

L Andéol, Y Kawakami, Y Wada, T Kanamori… - arXiv preprint arXiv …, 2021 - arxiv.org

Domain shifts in the training data are common in practical applications of machine learning,

they occur for instance when the data is coming from different sources. Ideally, a ML model

should work well independently of these shifts, for example, by learning a domain-invariant  …

  Related articles All 4 versions 


2021

De-aliased seismic data interpolation using conditional Wasserstein generative adversarial networks

Q Wei, X Li, M Song - Computers & Geosciences, 2021 - Elsevier

When sampling at offset is too coarse during seismic acquisition, spatial aliasing will appear,

affecting the accuracy of subsequent processing. The receiver spacing can be reduced by

interpolating one or more traces between every two traces to remove the spatial aliasing …

  Related articles All 2 versions


2021  [PDF] virginia.edu

[PDF] Unsupervised Graph Alignment with Wasserstein Distance Discriminator

J Gao, X Huang, J Li - … on Knowledge Discovery and Data Mining, 2021 - cs.virginia.edu

Graph alignment aims to identify the node correspondence across multiple graphs and is

essential to reveal insightful graph patterns that are otherwise inaccessible with a single

graph. With roots in graph theory, the graph alignment problem has significant implications …


2021

Wasserstein distance feature alignment learning for 2D image-based 3D model retrieval

Y Zhou, Y Liu, H Zhou, W Li - Journal of Visual Communication and Image …, 2021 - Elsevier

Abstract 2D image-based 3D model retrieval has become a hotspot topic in recent years.

However, the current existing methods are limited by two aspects. Firstly, they are mostly

based on the supervised learning, which limits their application because of the high time …

  All 2 versions


2021


 2021  [PDF] arxiv.org

An inexact PAM method for computing Wasserstein barycenter with unknown supports

Y Qian, S Pan - Computational and Applied Mathematics, 2021 - Springer

Wasserstein barycenter is the centroid of a collection of discrete probability distributions

which minimizes the average of the\(\ell _2\)-Wasserstein distance. This paper focuses on

the computation of Wasserstein barycenters under the case where the support points are …

  Related articles All 2 versions



2021  [PDF] thecvf.com

Wasserstein Barycenter for Multi-Source Domain Adaptation

EF Montesuma, FMN Mboula - Proceedings of the IEEE …, 2021 - openaccess.thecvf.com

Multi-source domain adaptation is a key technique that allows a model to be trained on data

coming from various probability distribution. To overcome the challenges posed by this

learning scenario, we propose a method for constructing an intermediate domain between …

  Related articles 



2021  [PDF] arxiv.org

Limit Distribution Theory for the Smooth 1-Wasserstein Distance with Applications

R Sadhu, Z Goldfeld, K Kato - arXiv preprint arXiv:2107.13494, 2021 - arxiv.org

The smooth 1-Wasserstein distance (SWD) $ W_1^\sigma $ was recently proposed as a

means to mitigate the curse of dimensionality in empirical approximation while preserving

the Wasserstein structure. Indeed, SWD exhibits parametric convergence rates and inherits …

  All 3 versions 
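As a reading aid only — a rough Monte Carlo sketch of the smooth 1-Wasserstein idea in one dimension: smooth both empirical measures with the same Gaussian N(0, σ²) by adding independent noise to the samples, then compute W1 with scipy. This is an assumption-laden approximation, not the estimator analyzed in the paper.

```python
import numpy as np
from scipy.stats import wasserstein_distance

def smooth_w1_1d(x, y, sigma, n_noise=100, seed=0):
    """Monte Carlo approximation of W1(mu_n * N(0, sigma^2), nu_n * N(0, sigma^2))
    for 1-D samples x, y: each empirical measure is smoothed by adding
    independent Gaussian noise to its sample points."""
    rng = np.random.default_rng(seed)
    x, y = np.asarray(x, float), np.asarray(y, float)
    xs = (x[:, None] + sigma * rng.standard_normal((x.size, n_noise))).ravel()
    ys = (y[:, None] + sigma * rng.standard_normal((y.size, n_noise))).ravel()
    return wasserstein_distance(xs, ys)

x = np.random.default_rng(1).normal(0.0, 1.0, 500)
y = np.random.default_rng(2).normal(0.5, 1.0, 500)
print(smooth_w1_1d(x, y, sigma=0.3))   # roughly the mean shift of 0.5
```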



2021  [PDF] arxiv.org

On Riemannian Optimization over Positive Definite Matrices with the Bures-Wasserstein Geometry

A Han, B Mishra, P Jawanpuria, J Gao - arXiv preprint arXiv:2106.00286, 2021 - arxiv.org

In this paper, we comparatively analyze the Bures-Wasserstein (BW) geometry with the

popular Affine-Invariant (AI) geometry for Riemannian optimization on the symmetric positive

definite (SPD) matrix manifold. Our study begins with an observation that the BW metric has …

  Related articles All 2 versions 
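For reference, the Bures-Wasserstein distance this line of work optimizes over is the 2-Wasserstein distance between centered Gaussians, d(A,B)² = tr A + tr B − 2 tr((A^{1/2} B A^{1/2})^{1/2}). A minimal numpy/scipy sketch of that formula (not the paper's Riemannian optimization code):

```python
import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein(A, B):
    """Bures-Wasserstein distance between SPD matrices A and B:
    d(A,B)^2 = tr(A) + tr(B) - 2 tr((A^{1/2} B A^{1/2})^{1/2})."""
    rA = sqrtm(A)
    cross = sqrtm(rA @ B @ rA)
    d2 = np.trace(A) + np.trace(B) - 2.0 * np.trace(cross)
    return float(np.sqrt(max(d2.real, 0.0)))

A = np.array([[2.0, 0.3], [0.3, 1.0]])
B = np.array([[1.0, 0.0], [0.0, 1.5]])
print(bures_wasserstein(A, B))
```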



 

2021  [PDF] arxiv.org

Approximation algorithms for 1-Wasserstein distance between persistence diagrams

S Chen, Y Wang - arXiv preprint arXiv:2104.07710, 2021 - arxiv.org

Recent years have witnessed a tremendous growth using topological summaries, especially

the persistence diagrams (encoding the so-called persistent homology) for analyzing

complex shapes. Intuitively, persistent homology maps a potentially complex input object (be …

   Related articles All 4 versions 


<——2021———2021———1180—— 



2021  see 2020  [PDF] arxiv.org

The quantum Wasserstein distance of order 1

G De Palma, M Marvian, D Trevisan… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org

We propose a generalization of the Wasserstein distance of order to the quantum states of

n qudits. The proposal recovers the Hamming distance for the vectors of the canonical basis,

and more generally the classical Wasserstein distance for quantum states diagonal in the …

  Cited by 6 Related articles All 8 versions


 year 2021

[PDF] toronto.edu

[PDF] The Wasserstein 1 Distance-Constructing an Optimal Map and Applications to Generative Modelling

T Milne - math.toronto.edu

Recent advances in generative modelling have shown that machine learning algorithms are

capable of generating high resolution images of fully synthetic scenes which some

researchers call “dreams” or “hallucinations” of the algorithm. Poetic language aside, one …

  Related articles 



2021  [PDF] auburn.edu

Efficient and Robust Classification for Positive Definite Matrices with Wasserstein Metric

J Cui - 2021 - etd.auburn.edu

Riemannian geometry methods are widely used to classify SPD (Symmetric Positives-

Definite) matrices, such as covariances matrices of brain-computer interfaces. Common

Riemannian geometry classification methods are based on Riemannian distance to …

  Related articles 



2021  [PDF] aaai.org

[PDF] Fast PCA in 1-D Wasserstein Spaces via B-splines Representation and Metric Projection

M Pegoraro, M Beraha - Proceedings of the AAAI Conference on Artificial …, 2021 - aaai.org

We address the problem of performing Principal Component Analysis over a family of

probability measures on the real line, using the Wasserstein geometry. We present a novel

representation of the 2-Wasserstein space, based on a well known isometric bijection and a …

  Related articles All 2 versions 
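The isometry exploited by such 1-D methods is that W2 on the real line equals the L2 distance between quantile functions. A minimal empirical sketch with equal sample sizes and uniform weights (the paper's B-spline representation and metric projection are not reproduced here):

```python
import numpy as np

def w2_1d(x, y):
    """Empirical 2-Wasserstein distance on the real line: with equal sample
    sizes and uniform weights it is the L2 distance between sorted samples
    (i.e. between empirical quantile functions)."""
    xs, ys = np.sort(np.asarray(x, float)), np.sort(np.asarray(y, float))
    assert xs.shape == ys.shape
    return float(np.sqrt(np.mean((xs - ys) ** 2)))

rng = np.random.default_rng(0)
print(w2_1d(rng.normal(0, 1, 1000), rng.normal(1, 1, 1000)))  # about 1.0
```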


2021  [PDF] 8ecm.si

[PDF] Maps on positive definite cones of C-algebras preserving the Wasserstein mean

L Molnár - 2021 - 8ecm.si

… Math. 37 (2019), 165-191. R. Bhatia, T. Jain and Y. Lim, Inequalities for the Wasserstein mean

of positive definite matrices, Linear Algebra Appl. 576 (2019), 108-123. F. Chabbabi, M. Mbekhta

and L. Molnár, Characterizations of Jordan *-isomorphisms of C-algebras by weighted geometric …


2021


2021  [PDF] arxiv.org

Distributionally robust inverse covariance estimation: The Wasserstein shrinkage estimator

VA Nguyen, D Kuhn… - Operations …, 2021 - pubsonline.informs.org

We introduce a distributionally robust maximum likelihood estimation model with a

Wasserstein ambiguity set to infer the inverse covariance matrix of ap-dimensional Gaussian

random vector from n independent samples. The proposed model minimizes the worst case …

  Cited by 29 Related articles All 8 versions


2021

[CITATION] Bayesian inverse problems in the Wasserstein distance and application to conservation laws

S Mishra, D Ochsner, AM Ruf, F Weber - 2021 - preparation

  Cited by 3 Related articles


Least Wasserstein distance between disjoint shapes with perimeter regularization
by Novack, Michael; Topaloglu, Ihsan; Venkatraman, Raghavendra
08/2021
We prove the existence of global minimizers to the double minimization problem \[ \inf\Big\{ P(E) + \lambda W_p(\mathcal{L}^n \lfloor \, E,\mathcal{L}^n...
Journal Article  Full Text Online

 arXiv:2108.04390  [pdfpsother math.AP
Least Wasserstein distance between disjoint shapes with perimeter regularization
Authors: Michael Novack, Ihsan Topaloglu, Raghavendra Venkatraman
Abstract: We prove the existence of global minimizers to the double minimization problem
\[ \inf\Big\{ P(E) + \lambda W_p(\mathcal{L}^n \lfloor\, E, \mathcal{L}^n \lfloor\, F) \colon |E\cap F| = 0,\, |E| = |F| = 1 \Big\}, \]
where P(E) denotes the perimeter of the set E, W_p is the p-Wasserstein distance between Borel probability measures, and λ > 0 is arbitrary. The result holds in all space dimension…  More
Submitted 9 August, 2021; originally announced August 2021.

Cited by 1 Related articles All 6 versions 

[CITATION] Least Wasserstein distance between disjoint shapes with perimeter regularization

I Topaloglu - 2021 Fall Western Sectional Meeting, 2021 - meetings.ams.org

 

Başar, Tamer; Moon, Jun

Zero-sum differential games on the Wasserstein space. (English) Zbl 07379482

Commun. Inf. Syst. 21, No. 2, 219-251 (2021).

MSC:  60-XX 91-XX

PDF BibTeX XML Cite


2021

MR4297876 Prelim Gamboa, Carlos Andrés; Valladão, Davi Michel; Street, Alexandre; Homem-de-Mello, Tito; Decomposition methods for Wasserstein-based data-driven distributionally robust problems. Oper. Res. Lett. 49 (2021), no. 5, 696–702. 90C15

Review PDF Clipboard Journal Article

Full Text: DOI

Decomposition methods for Wasserstein-based data-driven distributionally robust problems

CA Gamboa, DM Valladão, A Street… - Operations Research …, 2021 - Elsevier

We study decomposition methods for two-stage data-driven Wasserstein-based DROs with

right-hand-sided uncertainty and rectangular support. We propose a novel finite

reformulation that explores the rectangular uncertainty support to develop and test five new …

  All 2 versions


2021

 Computationally Efficient Wasserstein Loss for Structured Labels

0 citations* for all

0 citations*

2021 CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS

Ayato Toyokuni ,Sho Yokoi 1,Hisashi Kashima 2,Makoto Yamada 2

1 Tohoku University ,2 Kyoto University

Artificial neural network

Probability distribution

View More (9+) 

The problem of estimating the probability distribution of labels has been widely studied as a label distribution learning (LDL) problem, whose applications include age estimation, emotion analysis, and semantic segmentation. We propose a tree-Wasserstein distance regularized LDL algorithm, focusing ... View Full Abstract 
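For orientation, the tree-Wasserstein distance underlying this kind of regularizer has the standard closed form Σ_e w_e |μ(subtree below e) − ν(subtree below e)|. A small sketch under the assumption that the tree is given by parent pointers with parent index smaller than child index; the toy tree and names are illustrative, not the paper's label hierarchy.

```python
import numpy as np

def tree_wasserstein(parent, edge_w, mu, nu):
    """Closed-form tree-Wasserstein distance: sum over edges e of
    w_e * |mu(subtree below e) - nu(subtree below e)|.
    parent[i] is the parent of node i (root has parent -1); edge_w[i] is the
    weight of the edge (i, parent[i]); mu, nu are distributions over nodes."""
    n = len(parent)
    diff = np.asarray(mu, float) - np.asarray(nu, float)   # signed mass per node
    # Accumulate subtree mass differences, children before parents;
    # assumes nodes are ordered so that parent[i] < i (root first).
    sub = diff.copy()
    for i in range(n - 1, 0, -1):
        sub[parent[i]] += sub[i]
    return float(sum(edge_w[i] * abs(sub[i]) for i in range(1, n)))

# Toy tree: node 0 is the root, nodes 1 and 2 are its children.
parent = [-1, 0, 0]
edge_w = [0.0, 1.0, 1.0]
mu = [0.0, 1.0, 0.0]
nu = [0.0, 0.0, 1.0]
print(tree_wasserstein(parent, edge_w, mu, nu))  # 2.0: unit mass moved across two edges
```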


2021

 Predictive density estimation under the Wasserstein loss

4 citations* for all

4 citations*

2021 JOURNAL OF STATISTICAL PLANNING AND INFERENCE

Takeru Matsuda 1,William E. Strawderman 2

1 University of Tokyo ,2 Rutgers University

Density estimation

Bayesian probability

View More (5+) 

Abstract We investigate predictive density estimation under the L 2 Wasserstein loss for location families and location-scale families. We show that plug-in densities form a complete class and that the Bayesian predictive density is given by the plug-in density with the posterior mean... View Full Abstract 

Cited by 2 Related articles All 5 versions

 

2021

 Projection Robust Wasserstein Barycenters

0 citations* for all

0 citations*

2021 INTERNATIONAL CONFERENCE ON MACHINE LEARNING

Minhui Huang ,Shiqian Ma ,Lifeng Lai

University of California, Davis

Wasserstein metric

Cluster analysis

View More (8+) 

Collecting and aggregating information from several probability measures or histograms is a fundamental task in machine learning. One of the popular solution methods for this task is to compute the barycenter of the probability measures under the Wasserstein metric. However, approximating the Wasser... View Full Abstract 


Least Wasserstein distance between disjoint shapes with ...

https://arxiv.org › math

by M Novack · 2021 — where P(E) denotes the perimeter of the set E, W_p is the p-Wasserstein distance between Borel probability measures, and \lambda > 0 is ...

online OPEN ACCESS

 

P-WAE: Generalized Patch-Wasserstein Autoencoder ... - arXiv

https://arxiv.org › cs

by Y Chen · 2021 — ... Generalized Patch-Wasserstein Autoencoder for Anomaly Screening ... we propose a novel Patch-wise Wasserstein AutoEncoder (P-WAE) ...

online OPEN ACCESS

P-WAE: Generalized Patch-Wasserstein Autoencoder for Anomaly Screening

by Chen, Yurong

08/2021

To mitigate the inspector's workload and improve the quality of the product, computer vision-based anomaly detection (AD) techniques are gradually deployed in...

Journal ArticleFull Text Online

Related articles All 2 versions

2021


 The Quantum Wasserstein Distance of Order 1

4 citations* for all

1 citations*

2021 IEEE TRANSACTIONS ON INFORMATION THEORY

Giacomo De Palma 1,Milad Marvian 1,Dario Trevisan 2,Seth Lloyd 1

1 Massachusetts Institute of Technology ,2 [Mathematics Department, Università degli Studi di Pisa, 56127 Pisa PI, Italy.]

Trace distance

Quantum state

View More (8+) 

We propose a generalization of the Wasserstein distance of order 1 to the quantum states of n qudits. The proposal recovers the Hamming distance for the vectors of the canonical basis, and more generally the classical Wasserstein distance for quantum states diagonal in the canonical basis. The propo... View Full Abstract 

3 Related articles All 8 versions

2021

Least Wasserstein distance between disjoint shapes with perimeter regularization

M Novack, I Topaloglu, R Venkatraman - arXiv preprint arXiv:2108.04390, 2021 - arxiv.org

We prove the existence of global minimizers to the double minimization problem \[ \inf\Big\{ P(E) + \lambda W_p(\mathcal{L}^n\lfloor\, E, \mathcal{L}^n\lfloor\, F) \colon |E\cap F| = 0,\, |E| = |F| = 1 \Big\}, \] where $P(E)$ denotes the perimeter of the set $E$, $W_p$ is the $p$ …

  All 2 versions 



2021  [PDF] ams.org

Mullins-Sekerka as the Wasserstein flow of the perimeter

A Chambolle, T Laux - Proceedings of the American Mathematical Society, 2021 - ams.org

We prove the convergence of an implicit time discretization for the one-phase Mullins-

Sekerka equation, possibly with additional non-local repulsion, proposed in [F. Otto, Arch.

Rational Mech. Anal. 141 (1998), pp. 63–103]. Our simple argument shows that the limit …

   Related articles All 6 versions



2021  [PDF] arxiv.org

 Isometric Rigidity of compact Wasserstein spaces

J Santos-Rodríguez - arXiv preprint arXiv:2102.08725, 2021 - arxiv.org

Let $(X, d,\mathfrak {m}) $ be a metric measure space. The study of the Wasserstein space

$(\mathbb {P} _p (X),\mathbb {W} _p) $ associated to $ X $ has proved useful in describing

several geometrical properties of $ X. $ In this paper we focus on the study of isometries of …

  Related articles All 3 versions 


2021  [PDF] arxiv.org

Quantitative spectral gap estimate and Wasserstein contraction of simple slice sampling

V Natarovskii, D Rudolf, B Sprungk - The Annals of Applied …, 2021 - projecteuclid.org

We prove Wasserstein contraction of simple slice sampling for approximate sampling wrt

distributions with log-concave and rotational invariant Lebesgue densities. This yields, in

particular, an explicit quantitative lower bound of the spectral gap of simple slice sampling …

   Related articles All 8 versions

<——2021———2021———1200——


Reconstruction Method for Missing Measurement Data Based on Wasserstein Generative Adversarial Network

C Zhang, H Chen, J He, H Yang - Journal of Advanced …, 2021 - jstage.jst.go.jp

Focusing on the issue of missing measurement data caused by complex and changeable

working conditions during the operation of high-speed trains, in this paper, a framework for

the reconstruction of missing measurement data based on a generative adversarial network …

  Related articles All 4 versions


2021  [PDF] arxiv.org

An inexact PAM method for computing Wasserstein barycenter with unknown supports

Y Qian, S Pan - Computational and Applied Mathematics, 2021 - Springer

Wasserstein barycenter is the centroid of a collection of discrete probability distributions

which minimizes the average of the\(\ell _2\)-Wasserstein distance. This paper focuses on

the computation of Wasserstein barycenters under the case where the support points are …

  Related articles All 2 versions

2021  [PDF] intlpress.com

Zero-sum differential games on the Wasserstein space

T Başar, J Moon - Communications in Information and Systems, 2021 - intlpress.com

We consider two-player zero-sum differential games (ZSDGs), where the state process

(dynamical system) depends on the random initial condition and the state process's

distribution, and the objective functional includes the state process's distribution and the …

  Related articles 


2021

 Differential inclusions in Wasserstein spaces: The Cauchy-Lipschitz framework

8 citations* for all

8 citations*

2021 JOURNAL OF DIFFERENTIAL EQUATIONS

Benoît Bonnet ,Hélène Frankowska

University of Paris

Differential inclusion

Lipschitz continuity

View More (8+) 

Abstract In this article, we propose a general framework for the study of differential inclusions in the Wasserstein space of probability measures. Based on earlier geometric insights on the structure of continuity equations, we define solutions of differential inclusions as absolutely continuous ... View Full Abstract 

Cited by 23 Related articles All 5 versions

 2021 

 Wasserstein Statistics in One-Dimensional Location-Scale Models.

0 citations*

2021 INTERNATIONAL CONFERENCE ON GEOMETRIC SCIENCE OF INFORMATION

View More 

 Projection Robust Wasserstein Barycenters

0 citations* for all

0 citations*

2021 INTERNATIONAL CONFERENCE ON MACHINE LEARNING

Minhui Huang ,Shiqian Ma ,Lifeng Lai

University of California, Davis

Wasserstein metric

Cluster analysis

View More (8+) 

Collecting and aggregating information from several probability measures or histograms is a fundamental task in machine learning. One of the popular solution methods for this task is to compute the barycenter of the probability measures under the Wasserstein metric. However, approximating the Wasser... View Full Abstract 

Cited by 5 Related articles All 7 versions 
MR4424356
 


 2021 


 Do Neural Optimal Transport Solvers Work? A Continuous Wasserstein-2 Benchmark.

0 citations*

2021 ARXIV: LEARNING

Alexander Korotin 1,Lingxiao Li 2,Aude Genevay ,Justin Solomon 2,Alexander Filippov 3 see all 6 authors

1 Skolkovo Institute of Science and Technology ,2 Massachusetts Institute of Technology ,3 Huawei

Benchmark (computing)

Artificial neural network

View More (7+) 

Despite the recent popularity of neural network-based solvers for optimal transport (OT), there is no standard quantitative way to evaluate their performance. In this paper, we address this issue for quadratic-cost transport -- specifically, computation of the Wasserstein-2 distance, a commonly-used... View Full Abstract 
Cited by 1
 Related articles All 6 versions 


 2021 

 Sampling From the Wasserstein Barycenter.

1 citations*

2021 ARXIV: LEARNING

Chiheb Daaloul 1,Thibaut Le Gouic 2,Jacques Liandrat 1,Magali Tournus 1

1 Aix-Marseille University ,2 Massachusetts Institute of Technology

Absolute continuity

Sampling (statistics)

View More (4+) 

This work presents an algorithm to sample from the Wasserstein barycenter of absolutely continuous measures. Our method is based on the gradient flow of the multimarginal formulation of the Wasserstein barycenter, with an additive penalization to account for the marginal constraints. We prove that t... View Full Abstract 
Cited by 2
 Related articles All 4 versions 


2021  see 2020  [PDF] arxiv.org

Wasserstein distributionally robust motion control for collision avoidance using conditional value-at-risk

A Hakobyan, I Yang - IEEE Transactions on Robotics, 2021 - ieeexplore.ieee.org

In this article, a risk-aware motion control scheme is considered for mobile robots to avoid

randomly moving obstacles when the true probability distribution of uncertainty is unknown.

We propose a novel model-predictive control (MPC) method for limiting the risk of unsafety …

 Cited by 15 Related articles All 3 versions

   

 2021 

 Multi-Proxy Wasserstein Classifier for Image Classification.

0 citations*

2021 NATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE

Benlin Liu 1,Yongming Rao 2,Jiwen Lu 2,Jie Zhou 2,Cho-Jui Hsieh 1

1 University of California, Los Angeles ,2 Tsinghua University

Classifier (UML)

Contextual image classification

View More (4+) 

Cited by 2 Related articles All 3 versions 
  

 [HTML] hindawi.com

[HTML] Oversampling Imbalanced Data Based on Convergent WGAN for Network Threat Detection

Y Xu, X Zhang, Z Qiu, X Zhang, J Qiu… - Security and …, 2021 - hindawi.com

Class imbalance is a common problem in network threat detection. Oversampling the

minority class is regarded as a popular countermeasure by generating enough new minority

samples. Generative adversarial network (GAN) is a typical generative model that can …

Related articles All 4 versions 

<——2021———2021———1210—— 


arXiv:2108.11102  [pdfpsother math.AP
Existence and stability results for an isoperimetric problem with a non-local interaction of Wasserstein type
Authors: Jules Candau-Tilh, Michael Goldman
Abstract: The aim of this paper is to prove the existence of minimizers for a variational problem involving the minimization under volume constraint of the sum of the perimeter and a non-local energy of Wasserstein type. This extends previous partial results to the full range of parameters. We also show that in the regime where the perimeter is dominant, the energy is uniquely minimized by balls.
Submitted 25 August, 2021; originally announced August 2021.
Cited by 2
 Related articles All 28 versions 


[PDF] openreview.net

Efficient Wasserstein and Sinkhorn Policy Optimization

J Song, C Zhao, N He - 2021 - openreview.net

Trust-region methods based on Kullback-Leibler divergence are pervasively used to

stabilize policy optimization in reinforcement learning. In this paper, we examine two natural …


arXiv:2108.08351  [pdfpsother math.PR
The cutoff phenomenon in Wasserstein distance for nonlinear stable Langevin systems with small Lévy noise
Authors: Gerardo Barrera, Michael A. Högele, Juan Carlos Pardo
Abstract: This article establishes the cutoff phenomenon in the Wasserstein distance for systems of nonlinear ordinary differential equations with a unique coercive stable fixed point subject to general additive Markovian noise in the limit of small noise intensity. This result generalizes the results shown in Barrera, Högele, Pardo (EJP2021) in a more restrictive setting of Blumenthal-Getoor index α > 3/2 …  More
Submitted 18 August, 2021; originally announced August 2021.
MSC Class: 60H10; 37A25; 60G51; 15A16

arXiv:2108.06729  [pdfpsother math.OC  math.DS  math.FA
Dissipative probability vector fields and generation of evolution semigroups in Wasserstein spaces
Authors: Giulia Cavagnari, Giuseppe Savaré, Giacomo Enrico Sodini
Abstract: We introduce and investigate a notion of multivalued λ-dissipative probability vector field (MPVF) in the Wasserstein space P_2(X) of Borel probability measures on a Hilbert space X. Taking inspiration from the theory of dissipative operators in Hilbert spaces and of Wasserstein gradient flows of geodesically convex functionals, we study local and global well posedn…  More
Submitted 15 August, 2021; originally announced August 2021.
Comments: 63 pages

All 2 versions 


2021 see 2020

Fort, Jean-Claude; Klein, Thierry; Lagnoux, Agnès

Global sensitivity analysis and Wasserstein spaces. (English) Zbl 07384776

SIAM/ASA J. Uncertain. Quantif. 9, 880-921 (2021).

MSC:  62G05 62G20 62G30 65C60 62E17

PDF BibTeX XML Cite

Full Text: DOI    Zbl 1468.62267
Cited by 13 Related articles All 16 versions

MR4303897 Prelim Borda, Bence; Equidistribution of random walks on compact groups II. The Wasserstein metric. Bernoulli 27 (2021), no. 4, 2598–2623.

Review PDF Clipboard Journal Article

Cited by 3 Related articles All 6 versions

MR4303265 Prelim Papayiannis, G. I.; Domazakis, G. N.; Drivaliaris, D.; Koukoulas, S.; Tsekrekos, A. E.; Yannacopoulos, A. N.; On clustering uncertain and structured data with Wasserstein barycenters and a geodesic criterion for the number of clusters. J. Stat. Comput. Simul. 91 (2021), no. 13, 2569–2594.

Review PDF Clipboard Journal Article

Related articles

MR4301115 Prelim Kerdoncuff, Tanguy; Emonet, Rémi; Sebban, Marc; 

Sampled Gromov WassersteinMach. Learn. 110 (2021), no. 8, 2151–2186.

Review PDF Clipboard Journal Article

PDF] archives-ouvertes.fr

Sampled Gromov Wasserstein

T Kerdoncuff, R Emonet, M Sebban - Machine Learning, 2021 - Springer

… In this section, we introduce the Optimal Transport (OT) problem with its associated

Wasserstein distance, and the Gromov Wasserstein distance that allows the comparison of …

Cited by 7 Related articles All 3 versions

MR4300314 Prelim Kim, Sejong; Lee, Hosoo; 

Wasserstein barycenters of compactly supported measures. Anal. Math. Phys. 11 (2021), no. 4, Paper No. 153.

Review PDF Clipboard Journal Article

All 2 versions

Image Recognition Based on Super-Resolution Wasserstein Generative Adversarial Nets with Gradient Penalty

J Liu, L Yun, X Jin, C Zhang - 3D Imaging Technologies—Multi …, 2021 - Springer

Generative adversarial networks (GAN) are currently a hotly debated research topic in the

field of machine vision; however, they possess various shortcomings that cannot be

overlooked, such as unstable generated samples, collapsed modes, and slow convergence …

 All 3 versions

<——2021———2021———1220——


2021  [PDF] arxiv.org

Rethinking rotated object detection with gaussian wasserstein distance loss

X Yang, J Yan, Q Ming, W Wang, X Zhang… - arXiv preprint arXiv …, 2021 - arxiv.org

Boundary discontinuity and its inconsistency to the final detection metric have been the

bottleneck for rotating detection regression loss design. In this paper, we propose a novel

regression loss based on Gaussian Wasserstein distance as a fundamental approach to …

  Cited by 2 Related articles All 3 versions 
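This line of work builds on the closed-form 2-Wasserstein distance between Gaussians, W2² = ||m1 − m2||² + tr(S1 + S2 − 2(S1^{1/2} S2 S1^{1/2})^{1/2}). A minimal sketch of that closed form only; the box-to-Gaussian conversion and the loss transform used in the paper are omitted.

```python
import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2(m1, S1, m2, S2):
    """Closed-form 2-Wasserstein distance between N(m1, S1) and N(m2, S2)."""
    r1 = sqrtm(S1)
    cross = sqrtm(r1 @ S2 @ r1)
    d2 = np.sum((np.asarray(m1, float) - np.asarray(m2, float)) ** 2) \
         + np.trace(S1 + S2 - 2.0 * cross)
    return float(np.sqrt(max(d2.real, 0.0)))

# Two axis-aligned "boxes" encoded as Gaussians (mean = center, cov = shape).
print(gaussian_w2([0, 0], np.diag([4.0, 1.0]), [1, 0], np.diag([1.0, 4.0])))
```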

2021  [PDF] researchgate.net

Establishment and extension of digital aggregate database using auxiliary classifier Wasserstein GAN with gradient penalty

C Wang, F Li, Q Liu, H Wang, P Benmoussa… - … and Building Materials, 2021 - Elsevier

For road construction, the morphological characteristics of coarse aggregates such as

angularity and sphericity have a considerable influence on asphalt pavement performance.

In traditional aggregate simulation processes, images of real coarse grains are captured …

  All 3 versions

2021  [HTML] springer.com

[HTML] EEG data augmentation for emotion recognition with a multiple generator conditional Wasserstein GAN

A Zhang, L Su, Y Zhang, Y Fu, L Wu, S Liang - Complex & Intelligent …, 2021 - Springer

EEG-based emotion recognition has attracted substantial attention from researchers due to

its extensive application prospects, and substantial progress has been made in feature

extraction and classification modelling from EEG data. However, insufficient high-quality …

  Related articles

2021  [PDF] arxiv.org

Hidden Convexity of Wasserstein GANs: Interpretable Generative Models with Closed-Form Solutions

A Sahiner, T Ergen, B Ozturkler, B Bartan… - arXiv preprint arXiv …, 2021 - arxiv.org

Generative Adversarial Networks (GANs) are commonly used for modeling complex

distributions of data. Both the generators and discriminators of GANs are often modeled by

neural networks, posing a non-transparent optimization problem which is non-convex and …

  All 2 versions 


2021 Wasserstein Barycenter Transport for Acoustic Adaptation

EF Montesuma, FMN Mboula - ICASSP 2021-2021 IEEE …, 2021 - ieeexplore.ieee.org

The recognition of music genre and the discrimination between music and speech are

important components of modern digital music systems. Depending on the acquisition

conditions, such as background environment, these signals may come from different …

  Related articles


2021


2021   [PDF] arxiv.org

Dissipative probability vector fields and generation of evolution semigroups in Wasserstein spaces

G Cavagnari, G Savaré, GE Sodini - arXiv preprint arXiv:2108.06729, 2021 - arxiv.org

We introduce and investigate a notion of multivalued $\lambda $-dissipative probability

vector field (MPVF) in the Wasserstein space $\mathcal {P} _2 (\mathsf X) $ of Borel

probability measures on a Hilbert space $\mathsf X $. Taking inspiration from the theory of …

  All 2 versions 


2021  [PDF] arxiv.org

The cutoff phenomenon in Wasserstein distance for nonlinear stable Langevin systems with small Lévy noise

G Barrera, MA Högele, JC Pardo - arXiv preprint arXiv:2108.08351, 2021 - arxiv.org

This article establishes the cutoff phenomenon in the Wasserstein distance for systems of

nonlinear ordinary differential equations with a unique coercive stable fixed point subject to

general additive Markovian noise in the limit of small noise intensity. This result generalizes …

  All 3 versions 


2021

A Distributed Robust Optimization Method with MC Dropout and Wasserstein Ambiguity Set Applied in Day-head Dispatch of Microgird

X Zhou, S Sun, S Yang, K Gong… - 2021 IEEE 4th …, 2021 - ieeexplore.ieee.org

Day-ahead load forecasting is the key part in day-ahead scheduling of power system.

Considering the uncertainty of load forecasting can improve the robustness of the system

and reduce the risk cost. This paper proposes a distributed robust optimization (DRO) …

Related articles

2021  [PDF] arxiv.org

Wasserstein-Splitting Gaussian Process Regression for Heterogeneous Online Bayesian Inference

ME Kepler, A Koppel, AS Bedi, DJ Stilwell - arXiv preprint arXiv …, 2021 - arxiv.org

Gaussian processes (GPs) are a well-known nonparametric Bayesian inference technique,

but they suffer from scalability problems for large sample sizes, and their performance can

degrade for non-stationary or spatially heterogeneous data. In this work, we seek to …

  All 2 version

 

Infrared Image Super-Resolution via Heterogeneous Convolutional WGAN
by Huang, Yongsong; Jiang, Zetao; Wang, Qingzhong ; More...
PRICAI 2021: Trends in Artificial Intelligence, 11/2021
Image super-resolution is important in many fields, such as surveillance and remote sensing. However, infrared (IR) images normally have low resolution since...
Book Chapter  Full Text Online
More Options 

arXiv:2109.00960  [pdfother]  cs.CV  cs.AI
Infrared Image Super-Resolution via Heterogeneous Convolutional WGAN
Authors: Yongsong Huang, Zetao Jiang, Qingzhong Wang, Qi Jiang, Guoming Pang
Abstract: Image super-resolution is important in many fields, such as surveillance and remote sensing. However, infrared (IR) images normally have low resolution since the optical equipment is relatively expensive. Recently, deep learning methods have dominated image super-resolution and achieved remarkable performance on visible images; however, IR images have received less attention. IR images have fewer… 
More
Submitted 2 September, 2021; originally announced September 2021.
Comments: To be published in the 18th Pacific Rim International Conference on Artificial Intelligence (PRICAI-2021)

 Related articles All 6 versions
<——2021———2021———1230——  

arXiv:2109.00528  [pdfpsother]  cs.LG  math.OC  stat.ML
Wasserstein GANs with Gradient Penalty Compute Congested Transport
Authors: Tristan Milne, Adrian Nachman
Abstract: Wasserstein GANs with Gradient Penalty (WGAN-GP) are an extremely popular method for training generative models to produce high quality synthetic data. While WGAN-GP were initially developed to calculate the Wasserstein 1 distance between generated and real data, recent works (e.g. Stanczuk et al. (2021)) have provided empirical evidence that this does not occur, and have argued that WGAN-GP perfo… 
More
Submitted 1 September, 2021; originally announced September 2021.
Comments: 27 pages
Cited by 1 Related articles All 3 versions 
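For context, a minimal PyTorch-style sketch of the Gulrajani-et-al. gradient penalty that this line of work refers to: penalize the deviation of the critic's gradient norm from 1 at points interpolated between real and generated samples. Here `critic` is assumed to be any torch.nn.Module returning one scalar per sample, and the penalty weight 10 is the conventional default, not a value taken from this paper.

```python
import torch

def gradient_penalty(critic, real, fake, lam=10.0):
    """WGAN-GP penalty: lam * E[(||grad_x critic(x)||_2 - 1)^2], evaluated at
    random interpolates x between real and generated samples."""
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    x = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    grads, = torch.autograd.grad(critic(x).sum(), x, create_graph=True)
    grad_norm = grads.reshape(real.size(0), -1).norm(2, dim=1)
    return lam * ((grad_norm - 1.0) ** 2).mean()

# Typical critic loss for one batch (Wasserstein terms plus the penalty):
# loss_D = critic(fake).mean() - critic(real).mean() + gradient_penalty(critic, real, fake)
```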


arXiv:2108.13054  [pdfother]  math.NA  cs.LG
Wasserstein Generative Adversarial Uncertainty Quantification in Physics-Informed Neural Networks
Authors: Yihang GaoMichael K. Ng
Abstract: In this paper, we study a physics-informed algorithm for Wasserstein Generative Adversarial Networks (WGANs) for uncertainty quantification in solutions of partial differential equations. By using groupsort activation functions in adversarial network discriminators, network generators are utilized to learn the uncertainty in solutions of partial differential equations observed from the initial/bou… 
More
Submitted 30 August, 2021; originally announced August 2021.
 Cited by 2
 Related articles All 2 versions

MR4421736

arXiv:2108.12755  [pdfpsother math.DG
Some inequalities on Riemannian manifolds linking Entropy,Fisher information, Stein discrepancy and Wasserstein distance
Authors: Li-Juan Cheng, Feng-Yu Wang, Anton Thalmaier
Abstract: For a complete connected Riemannian manifold M, let V ∈ C^2(M) be such that μ(dx) = e^{−V(x)} vol(dx) is a probability measure on M. Taking μ as reference measure, we derive inequalities for probability measures on M linking relative entropy, Fisher information, Stein discrepancy and Wasserstein distance. These inequalities strengthen in particular the famous log-Sobolev a…  More
Submitted 29 August, 2021; originally announced August 2021.
MSC Class: 60E15; 35K08; 46E35

arXiv:2108.12463  [pdfother cs.CL  cs.AI
Automatic Text Evaluation through the Lens of Wasserstein Barycenters
Authors: Pierre Colombo, Guillaume Staerman, Chloe Clavel, Pablo Piantanida
Abstract: A new metric \texttt{BaryScore} to evaluate text generation based on deep contextualized embeddings (\textit{e.g.}, BERT, Roberta, ELMo) is introduced. This metric is motivated by a new framework relying on optimal transport tools, \textit{i.e.}, Wasserstein distance and barycenter. By modelling the layer output of deep contextualized embeddings as a probability distribution rather than by a vecto…  More
Submitted 27 August, 2021; originally announced August 2021.
Journal ref: EMNLP 2021

Cited by 19 Related articles All 8 versions 

A Data Enhancement Method for Gene Expression Profile Based on Improved WGAN-GP

S Zhu, F Han - International Conference on Neural Computing for …, 2021 - Springer

A Data Enhancement Method for Gene Expression Profile Based on Improved WGAN-GP

by Zhu, Shaojun; Han, Fei

Neural Computing for Advanced Applications, 08/2021

A large number of gene expression profile datasets mainly exist in the fields of biological information and gene microarrays. Traditional classification...

Book ChapterCitation Online

 

2021

Sampled Gromov Wasserstein

T Kerdoncuff, R Emonet, M Sebban - 2021 - hal.archives-ouvertes.fr

Optimal Transport (OT) has proven to be a powerful tool to compare probability distributions

in machine learning, but dealing with probability measures lying in different spaces remains

an open problem. To address this issue, the Gromov Wasserstein distance (GW) only …

  Related articles 

online Cover Image PEER-REVIEW OPEN ACCESS

Sampled Gromov Wasserstein

by Kerdoncuff, Tanguy; Emonet, Rémi; Sebban, Marc

Machine learning, 08/2021, Volume 110, Issue 8

Keywords: Optimal transport; Gromov Wasserstein; Convergence guarantees Optimal Transport (OT) has proven to be a powerful tool to compare probability...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon
Cited by 7
Related articles All 3 versions

Zbl 07465668

The α-z-Bures Wasserstein divergence

TH Dinh, CT Le, BK Vo, TD Vuong - Linear Algebra and its Applications, 2021 - Elsevier

In this paper, we introduce the α-z-Bures Wasserstein divergence for positive semidefinite matrices A and B as Φ(A, B) = Tr((1 − α)A + αB) − Tr(Q_{α,z}(A, B)), where Q_{α,z}(A, B) = (A^{(1−α)/(2z)} B^{α/z} A^{(1−α)/(2z)})^z is the matrix function in the α-z-Rényi relative entropy. We show that for 0 ≤ α ≤ z ≤ 1, the quantity Φ(A, B) is a quantum divergence and satisfies the Data Processing Inequality in quantum information. We also solve the least squares problem with respect to the new divergence. In addition, we show that the matrix power mean μ(t, A …

  Related articles All 3 versions
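A small numpy sketch of Φ(A, B) exactly as defined in the abstract, via fractional matrix powers; the function name and test matrices are illustrative. If the exponents are read as above, α = z = 1/2 appears to recover half the squared Bures-Wasserstein distance.

```python
import numpy as np
from scipy.linalg import fractional_matrix_power as mpow

def alpha_z_bures_divergence(A, B, alpha, z):
    """Phi(A,B) = Tr((1-alpha)A + alpha B) - Tr(Q_{alpha,z}(A,B)) with
    Q_{alpha,z}(A,B) = (A^{(1-alpha)/(2z)} B^{alpha/z} A^{(1-alpha)/(2z)})^z."""
    Ae = mpow(A, (1.0 - alpha) / (2.0 * z))
    Q = mpow(Ae @ mpow(B, alpha / z) @ Ae, z)
    return float(np.trace((1.0 - alpha) * A + alpha * B).real - np.trace(Q).real)

A = np.array([[2.0, 0.2], [0.2, 1.0]])
B = np.array([[1.0, 0.0], [0.0, 3.0]])
print(alpha_z_bures_divergence(A, B, alpha=0.5, z=0.5))
```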

2021 see 2020 online Cover Image PEER-REVIEW

The α-z-Bures Wasserstein divergence

by Dinh, Trung Hoa; Le, Cong Trinh; Vo, Bich Khue ; More...

Linear algebra and its applications, 09/2021, Volume 624

In this paper, we introduce the α-z-Bures Wasserstein divergence for positive semidefinite matrices A and B as Φ(A,B) = Tr((1−α)A + αB) − Tr(Q_{α,z}(A,B)), where...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon


 

Decomposition methods for Wasserstein-based data-driven distributionally robust problems

CA Gamboa, DM ValladãoA Street… - Operations Research …, 2021 - Elsevier

We study decomposition methods for two-stage data-driven Wasserstein-based DROs with

right-hand-sided uncertainty and rectangular support. We propose a novel finite

reformulation that explores the rectangular uncertainty support to develop and test five new

different decomposition schemes: Column-Constraint Generation, Single-cut and Multi-cut

Benders, as well as Regularized Single-cut and Multi-cut Benders. We compare the

efficiency of the proposed methods for a unit commitment problem with 14 and 54 thermal …

Cited by 7 Related articles All 3 versions

online Cover Image PEER-REVIEW

Decomposition methods for Wasserstein-based data-driven distributionally robust problems

by Gamboa, Carlos Andrés; Valladão, Davi Michel; Street, Alexandre ; More...

Operations research letters, 09/2021, Volume 49, Issue 5

We study decomposition methods for two-stage data-driven Wasserstein-based DROs with right-hand-sided uncertainty and rectangular support. We propose a novel...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

  Zbl 07442913

 All 2 versions


[PDF] researchgate.net

Establishment and extension of digital aggregate database using auxiliary classifier Wasserstein GAN with gradient penalty

C Wang, F Li, Q Liu, H Wang, P Benmoussa… - … and Building Materials, 2021 - Elsevier

For road construction, the morphological characteristics of coarse aggregates such as

angularity and sphericity have a considerable influence on asphalt pavement performance.

In traditional aggregate simulation processes, images of real coarse grains are captured,

and their parameters are extracted manually for reproducing them in a numerical simulation

such as Discrete Element Modeling (DEM). Generative Adversarial Networks can generate

aggregate images, which can be stored in the Aggregate DEM Database directly. In this …

 Related articles All 3 versions+

online Cover Image  PEER-REVIEW

Establishment and extension of digital aggregate database using auxiliary classifier Wasserstein GAN with gradient penalty

by Wang, Chonghui; Li, Feifei; Liu, Quan ; More...

Construction & building materials, 09/2021, Volume 300

•Digital aggregate database was established by using deep learning methods.•The generated aggregates via ACWGAN-gp have a very close distribution of angularity...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 Cited by 2 Related articles All 2 versions

 

Improving Non-invasive Aspiration Detection ... - IEEE Xplore

https://ieeexplore.ieee.org › iel7

by K Shu · 2021 — Improving Non-invasive Aspiration Detection with Auxiliary Classifier Wasserstein Generative Adversarial Networks. Kechen Shu, Shitong Mao, James L. Coyle, ...

online Cover Image

Improving Non-invasive Aspiration Detection with Auxiliary Classifier Wasserstein Generative Adversarial Networks

by Shu, Kechen; Mao, Shitong; Coyle, James L ; More...

IEEE journal of biomedical and health informatics, 08/2021, Volume PP

Aspiration is a serious complication of swallowing disorders. Adequate detection of aspiration is essential in dysphagia management and treatment....

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

Improving Non-invasive Aspiration Detection with Auxiliary Classifier Wasserstein Generative Adversarial Networks

K Shu, S Mao, JL Coyle, E Sejdic - IEEE Journal of Biomedical …, 2021 - ieeexplore.ieee.org

… using an auxiliary classifier Wasserstein GAN (AC-WGAN) under the … a WGAN with auxiliary

classifier (ACWGAN) was proposed to improve the noninvasive aspiration detection on

imbalanced HRCA dataset. The AC-WGAN is trained by optimizing a combination of Wasserstein …

Cited by 2 Related articles All 3 versions
<——2021———2021———1240—

 

online  OPEN ACCESS

Entropic Gromov-Wasserstein between Gaussian Distributions

by Le, Khang; Le, Dung; Nguyen, Huy ; More...

08/2021

We study the entropic Gromov-Wasserstein and its unbalanced version between (unbalanced) Gaussian distributions with different dimensions. When the metric is...

Journal ArticleFull Text Online

 Cited by 2 Related articles All 3 versions


Wasserstein Generative Adversarial Uncertainty Quantification in Physics-Informed Neural Networks

Y Gao, MK Ng - arXiv preprint arXiv:2108.13054, 2021 - arxiv.org

In this paper, we study a physics-informed algorithm for Wasserstein Generative Adversarial

Networks (WGANs) for uncertainty quantification in solutions of partial differential equations.

By using groupsort activation functions in adversarial network discriminators, network

generators are utilized to learn the uncertainty in solutions of partial differential equations

observed from the initial/boundary data. Under mild assumptions, we show that the

generalization error of the computed generator converges to the approximation error of the …

  All 2 versions 

online  OPEN ACCESS

Wasserstein Generative Adversarial Uncertainty Quantification in Physics-Informed Neural Networks

by Gao, Yihang; Ng, Michael K

08/2021

In this paper, we study a physics-informed algorithm for Wasserstein Generative Adversarial Networks (WGANs) for uncertainty quantification in solutions of...

Journal ArticleFull Text Online

 

Automatic Text Evaluation through the Lens of Wasserstein Barycenters

P Colombo, G Staerman, C Clavel… - arXiv preprint arXiv …, 2021 - arxiv.org

A new metric\texttt {BaryScore} to evaluate text generation based on deep contextualized

embeddings (\textit {eg}, BERT, Roberta, ELMo) is introduced. This metric is motivated by a

new framework relying on optimal transport tools,\textit {ie}, Wasserstein distance and

barycenter. By modelling the layer output of deep contextualized embeddings as a

probability distribution rather than by a vector embedding; this framework provides a natural

way to aggregate the different outputs through the Wasserstein space topology. In addition, it …

Cited by 7 Related articles All 7 versions

online  OPEN ACCESS

Automatic Text Evaluation through the Lens of Wasserstein Barycenters

by Colombo, Pierre; Staerman, Guillaume; Clavel, Chloe ; More...

08/2021

EMNLP 2021 A new metric \texttt{BaryScore} to evaluate text generation based on deep contextualized embeddings (\textit{e.g.}, BERT, Roberta, ELMo) is...

Journal ArticleFull Text Online

 

Existence and stability results for an ... - Archive ouverte HAL

https://hal.archives-ouvertes.fr › document

PDF

by J Candau-Tilh · 2021 — publics ou privés. Existence and stability results for an isoperimetric problem with a non-local interaction of Wasserstein type.

online OPEN ACCESS

Existence and stability results for an isoperimetric problem with a non-local interaction of Wasserstein type

by Candau-Tilh, Jules; Goldman, Michael

08/2021

The aim of this paper is to prove the existence of minimizers for a variational problem involving the minimization under volume constraint of the sum of the...

Journal ArticleFull Text Online

 Cited by 2 All 28 versions



2021 see 2022

The cutoff phenomenon in Wasserstein distance for nonlinear ...

https://arxiv.org › math

by G Barrera · 2021 — ... for nonlinear stable Langevin systems with small Lévy noise ... the cutoff phenomenon in the Wasserstein distance for systems of ...


online OPEN ACCESS

The cutoff phenomenon in Wasserstein distance for nonlinear stable Langevin systems with small Lévy noise

by Barrera, Gerardo; Högele, Michael A; Pardo, Juan Carlos

08/2021

This article establishes the cutoff phenomenon in the Wasserstein distance for systems of nonlinear ordinary differential equations with a unique coercive...

Journal ArticleFull Text Online

  All 5 versions


2021


Dissipative probability vector fields and generation of ... - arXiv

https://arxiv.org › math

by G Cavagnari · 2021 — ... and generation of evolution semigroups in Wasserstein spaces ... multivalued \lambda-dissipative probability vector field (MPVF) in the ...

online OPEN ACCESS

Dissipative probability vector fields and generation of evolution semigroups in Wasserstein spaces

by Cavagnari, Giulia; Savaré, Giuseppe; Sodini, Giacomo Enrico

08/2021

We introduce and investigate a notion of multivalued $\lambda$-dissipative probability vector field (MPVF) in the Wasserstein space $\mathcal{P}_2(\mathsf X)$...

Journal ArticleFull Text Online

 Related articles All 2 versions 

MR4307706 Prelim Barrera, G.; Högele, M. A.; Pardo, J. C.; Cutoff Thermalization for Ornstein–Uhlenbeck Systems with Small Lévy Noise in the Wasserstein Distance. J. Stat. Phys. 184 (2021), no. 3, Paper No. 27.

Review PDF Clipboard Journal Article


MR4306876 Prelim Chen, Yaqing; Müller, Hans-Georg; Wasserstein gradients for the temporal evolution of probability distributions. Electron. J. Stat. 15 (2021), no. 2, 4061–4084.

Review PDF Clipboard Journal Article

[PDF] projecteuclid.org

Wasserstein gradients for the temporal evolution of probability distributions

Y Chen, HG Müller - Electronic Journal of Statistics, 2021 - projecteuclid.org

Many studies have been conducted on flows of probability measures, often in terms of

gradient flows. We utilize a generalized notion of derivatives with respect to time to model

the instantaneous evolution of empirically observed one-dimensional distributions that vary …

 Cited by 1 Related articles All 4 versions

2021 

 Classification of atomic environments via the Gromov-Wasserstein distance

1 citations*

2020 ARXIV: MATERIALS SCIENCE

View More 

 Wasserstein Embedding for Graph Learning

7 citations* for all

4 citations*

2021 INTERNATIONAL CONFERENCE ON LEARNING REPRESENTATIONS

Soheil Kolouri 1,Navid Naderializadeh 1,Gustavo K. Rohde 2,Heiko Hoffmann 1

1 HRL Laboratories ,2 University of Virginia

Graph embedding

Embedding

View More (8+) 

We present Wasserstein Embedding for Graph Learning (WEGL), a novel and fast framework for embedding entire graphs in a vector space, in which various machine learning models are applicable for graph-level prediction tasks. We leverage new insights on defining similarity between graphs as a function... View Full Abstract 


 2021 

 Wasserstein Embedding for Graph Learning

3 citations*

2020 ARXIV: LEARNING

View More 

 Learning Domain Invariant Representations by Joint Wasserstein Distance Minimization.

0 citations*

2021 ARXIV: MACHINE LEARNING

Léo Andéol ,Yusei Kawakami ,Yuichiro Wada ,Takafumi Kanamori ,Klaus-Robert Müller see all 6 authors

Invariant (mathematics)

Domain (software engineering)

View More (8+) 

Domain shifts in the training data are common in practical applications of machine learning, they occur for instance when the data is coming from different sources. Ideally, a ML model should work well independently of these shifts, for example, by learning a domain-invariant representation. Moreove... View Full Abstract 

ite Related articles All 4 versions 

<——2021———2021———1250——  



2021 see 2020

 Symmetric Skip Connection Wasserstein GAN for High-resolution Facial Image Inpainting.

7 citations* for all

0 citations*

2021 INTERNATIONAL JOINT CONFERENCE ON COMPUTER VISION, IMAGING AND COMPUTER GRAPHICS THEORY AND APPLICATIONS

Jireh Jam 1,Connah Kendrick 1,Vincent Drouard 2,Kevin Walker 2,Gee-Sern Hsu 3 see all 6 authors

1 Manchester Metropolitan University ,2 Image Metrics Ltd, Manchester, U.K, --- Select a Country ---,3 National Taiwan University

Inpainting

Feature (computer vision)

View More (9+) 

The state-of-the-art facial image inpainting methods achieved promising results but face realism preservation remains a challenge. This is due to limitations such as; failures in preserving edges and blurry artefacts. To overcome these limitations, we propose a Symmetric Skip Connection Wasserstein ... View Full Abstract 


 2021 

 A Mathematical Principle of Deep Learning: Learn the Geodesic Curve in the Wasserstein Space.

0 citations*

2021 ARXIV: LEARNING

Kuo Gai ,Shihua Zhang

Chinese Academy of Sciences

Geodesic

Artificial neural network

View More (10+) 

Recent studies revealed the mathematical connection of deep neural network (DNN) and dynamic system. However, the fundamental principle of DNN has not been fully characterized with dynamic system in terms of optimization and generalization. To this end, we build the connection of DNN and continuity ... View Full Abstract 

Related articles All 2 versions 

2021  see 2020, 2018

 Wasserstein Distributionally Robust Stochastic Control: A Data-Driven Approach

26 citations* for all

5 citations*

2021 IEEE TRANSACTIONS ON AUTOMATIC CONTROL

Insoon Yang

Systems Research Institute

Stochastic control

Robust optimization

View More (9+) 

Standard stochastic control methods assume that the probability distribution of uncertain variables is available. Unfortunately, in practice, obtaining accurate distribution information is a challenging task. To resolve this issue, in this article we investigate the problem of designing a control po... View Full Abstract 


[PDF] amazonaws.com

[PDF] STOCHASTIC GRADIENT METHODS FOR L2-WASSERSTEIN LEAST SQUARES PROBLEM OF GAUSSIAN MEASURES

S YUN, X SUN, JIL CHOI… - J. Korean Soc …, 2021 - ksiam-editor.s3.amazonaws.com

This paper proposes stochastic methods to find an approximate solution for the L2-

Wasserstein least squares problem of Gaussian measures. The variable for the problem is in

a set of positive definite matrices. The first proposed stochastic method is a type of classical …

Related articles All 4 versions 

Zbl 07569351
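Not from this paper — as a point of comparison for its stochastic methods, the classical fixed-point iteration for the covariance of the 2-Wasserstein barycenter of centered Gaussians (in the style of Álvarez-Esteban et al.), sketched with numpy/scipy; the initialization and iteration count are arbitrary choices.

```python
import numpy as np
from scipy.linalg import sqrtm, inv

def bures_barycenter(covs, weights, n_iter=50):
    """Fixed-point iteration for the covariance S of the 2-Wasserstein
    barycenter of centered Gaussians N(0, covs[k]):
    S <- S^{-1/2} (sum_k w_k (S^{1/2} covs[k] S^{1/2})^{1/2})^2 S^{-1/2}."""
    S = np.mean(covs, axis=0)               # any SPD starting point works
    for _ in range(n_iter):
        rS = sqrtm(S)
        irS = inv(rS)
        M = sum(w * sqrtm(rS @ C @ rS) for w, C in zip(weights, covs))
        S = np.real(irS @ M @ M @ irS)
    return S

covs = [np.diag([1.0, 2.0]), np.diag([2.0, 1.0])]
print(bures_barycenter(covs, [0.5, 0.5]))
```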

 2021 

 Wasserstein distance feature alignment learning for 2D image-based 3D model retrieval

0 citations*

2021 JOURNAL OF VISUAL COMMUNICATION AND IMAGE REPRESENTATION

Yaqian Zhou 1,Yu Liu 1,Heyu Zhou 1,Wenhui Li 1,2

1 Tianjin University ,2 Chinese Academy of Sciences

Feature (computer vision)

Euclidean distance

View More (9+) 

Abstract 2D image-based 3D model retrieval has become a hotspot topic in recent years. However, the current existing methods are limited by two aspects. Firstly, they are mostly based on the supervised learning, which limits their application because of the high time and cost consuming of manual a... View Full Abstract 

All 2 versions

2021


 Distributionally Robust Prescriptive Analytics with Wasserstein Distance.

0 citations*

2021 ARXIV: OPTIMIZATION AND CONTROL

Tianyu Wang ,Ningyuan Chen 1,Chun Wang 2

1 University of Toronto ,2 Tsinghua University

Conditional probability distribution

Joint probability distribution

View More (8+) 

In prescriptive analytics, the decision-maker observes historical samples of $(X, Y)$, where $Y$ is the uncertain problem parameter and $X$ is the concurrent covariate, without knowing the joint distribution. Given an additional covariate observation $x$, the goal is to choose a decision $z$ conditi... View Full Abstract 

Related articles All 2 versions 

  2021 

 Linear and Deep Order-Preserving Wasserstein Discriminant Analysis.

0 citations*

2021 IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE

Bing Su 1,Jiahuan Zhou 2,Ji Rong Wen 1,Ying Wu

1 Renmin University of China ,2 Northwestern University

Linear discriminant analysis

Dimensionality reduction

View More (9+) 

Supervised dimensionality reduction for sequence data learns a transformation that maps the observations in sequences onto a low-dimensional subspace by maximizing the separability of sequences in different classes. It is typically more challenging than conventional dimensionality reduction for stat... View Full Abstract 

Related articles All 6 versions

 Wasserstein Distributionally Robust Optimization: A Three-Player Game Framework

0 citations*

2021

Zhuozhuo Tu 1,Shan You 2,Tao Huang 3,Dacheng Tao 1

1 University of Sydney ,2 Tsinghua University ,3 SenseTime

Minimax

Robustness (computer science)

View More (8+) 

Wasserstein distributionally robust optimization (DRO) has recently received significant attention in machine learning due to its connection to generalization, robustness and regularization. Existing methods only consider a limited class of loss functions or apply to small values of robustness. In t... View Full Abstract 


 AN: Lane Line Detection With Ripple Lane Line Detection Network and Wasserstein GAN

3 citations*

2021 IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS

Youcheng Zhang 1,Zongqing Lu 1,Dongdong Ma 1,Jing-Hao Xue 2,Qingmin Liao 1

1 Tsinghua University ,2 University College London

Line (geometry)

Image segmentation

View More (9+) 

With artificial intelligence technology being advanced by leaps and bounds, intelligent driving has attracted a huge amount of attention recently in research and development. In intelligent driving, lane line detection is a fundamental but challenging task particularly under complex road conditions.... View Full Abstract 

 2021 see 2020

 Second-Order Conic Programming Approach for Wasserstein Distributionally Robust Two-Stage Linear Programs

1 citations* for all

0 citations*

2021 IEEE TRANSACTIONS ON AUTOMATION SCIENCE AND ENGINEERING

Zhuolin Wang 1,Keyou You 1,Shiji Song 1,Yuli Zhang 2

1 Tsinghua University ,2 Beijing Institute of Technology

Computational complexity theory

Maximization

View More (8+) 

This article proposes a second-order conic programming (SOCP) approach to solve distributionally robust two-stage linear programs over 1-Wasserstein balls. We start from the case with distribution uncertainty only in the objective function and then explore the case with distribution uncertainty only... View Full Abstract 

Cited by 2 Related articles All 5 versions

<——2021———2021———1260——


 First-Order Methods for Wasserstein Distributionally Robust MDP

1 citations* for all

0 citations*

2021 INTERNATIONAL CONFERENCE ON MACHINE LEARNING

Julien Grand-Clement ,Christian Kroer

Columbia University

Markov decision process

Rate of convergence

View More (8+) 

Markov Decision Processes (MDPs) are known to be sensitive to parameter specification. Distributionally robust MDPs alleviate this issue by allowing for ambiguity sets which give a set of possible distributions over parameter sets. The goal is to find an optimal policy with respect to the worst-case... View Full Abstract 

All 2 versions 

2021  [PDF] arxiv.org

Mathematical Principle of Deep Learning: Learn the Geodesic Curve in the Wasserstein Space

K Gai, S Zhang - arXiv preprint arXiv:2102.09235, 2021 - arxiv.org

Recent studies revealed the mathematical connection of deep neural network (DNN) and

dynamic system. However, the fundamental principle of DNN has not been fully

characterized with dynamic system in terms of optimization and generalization. To this end …

  Related articles All 2 versions 


2021 see 2020

[PDF] mlr.press

First-Order Methods for Wasserstein Distributionally Robust MDP

JG Clement, C Kroer - International Conference on Machine …, 2021 - proceedings.mlr.press

Markov decision processes (MDPs) are known to be sensitive to parameter specification.

Distributionally robust MDPs alleviate this issue by allowing for\textit {ambiguity sets} which

give a set of possible distributions over parameter sets. The goal is to find an optimal policy  …

  All 2 versions 


2021  [HTML] copernicus.org

[HTML] Ensemble Riemannian data assimilation over the Wasserstein space

SK Tamang, A Ebtehaj, PJ Van Leeuwen… - Nonlinear Processes …, 2021 - npg.copernicus.org

In this paper, we present an ensemble data assimilation paradigm over a Riemannian

manifold equipped with the Wasserstein metric. Unlike the Euclidean distance used in

classic data assimilation methodologies, the Wasserstein metric can capture the translation …

  Related articles All 7 versions 


2021  [PDF] projecteuclid.org

Strong equivalence between metrics of Wasserstein type

E Bayraktar, G Guo - Electronic Communications in Probability, 2021 - projecteuclid.org

The sliced Wasserstein metric Wp and more recently the max-sliced Wasserstein metric W̄p

have attracted abundant attention in data sciences and machine learning due to their

advantages to tackle the curse of dimensionality, see eg [15],[6]. A question of particular …

  Cited by 3 Related articles All 4 versions


2021
 

2021 see 2022

Inferential Wasserstein Generative Adversarial Networks

by Chen, Yao; Gao, Qingyi; Wang, Xiao

09/2021

Generative Adversarial Networks (GANs) have been impactful on many problems and applications but suffer from unstable training. The Wasserstein GAN (WGAN)...

Journal Article  Full Text Online

arXiv:2109.06646  [pdfpsother]  math.ST  math.PR
A Wasserstein index of dependence for random measures
Authors: Marta Catalano, Hugo Lavenant, Antonio Lijoi, Igor Prünster
Abstract: Nonparametric latent structure models provide flexible inference on distinct, yet related, groups of observations. Each component of a vector of d ≥ 2 random measures models the distribution of a group of exchangeable observations, while their dependence structure regulates the borrowing of information across different groups. Recent work has quantified the dependence between random measures i…
More
Submitted 14 September, 2021; originally announced September 2021.

All 3 versions

2021 see 2022

Inferential Wasserstein Generative Adversarial Networks

by Chen, Yao; Gao, Qingyi; Wang, Xiao

09/2021

Generative Adversarial Networks (GANs) have been impactful on many problems and applications but suffer from unstable training. The Wasserstein GAN (WGAN)...

Journal Article  Full Text Online

arXiv:2109.05652  [pdfother]  stat.ML  cs.LG
Inferential Wasserstein Generative Adversarial Networks
Authors: Yao Chen, Qingyi Gao, Xiao Wang
Abstract: Generative Adversarial Networks (GANs) have been impactful on many problems and applications but suffer from unstable training. The Wasserstein GAN (WGAN) leverages the Wasserstein distance to avoid the caveats in the minmax two-player training of GANs but has other defects such as mode collapse and lack of metric to detect the convergence. We introduce a novel inferential Wasserstein GAN (iWGAN)…  More
Submitted 12 September, 2021; originally announced September 2021.

 Cited by 3 Related articles All 5 versions 

arXiv:2109.04301  [pdfother cs.LG
On the use of Wasserstein metric in topological clustering of distributional data
Authors: Guénaël Cabanes, Younès Bennani, Rosanna Verde, Antonio Irpino
Abstract: This paper deals with a clustering algorithm for histogram data based on a Self-Organizing Map (SOM) learning. It combines a dimension reduction by SOM and the clustering of the data in a reduced space. Related to the kind of data, a suitable dissimilarity measure between distributions is introduced: the L2 Wasserstein distance. Moreover, the number of clusters is not fixed in advance but it is…  More
Submitted 9 September, 2021; originally announced September 2021.
Cited by 2
 Related articles All 2 versions 

Class-conditioned Domain Generalization via Wasserstein Distributional Robust Optimization

by Wang, Jingge; Li, Yang; Xie, Liyan; More...

09/2021

Given multiple source domains, domain generalization aims at learning a universal model that performs well on any unseen

but related target domain. In this...

Journal Article  Full Text Online

arXiv:2109.03676  [pdfother cs.LG
Class-conditioned Domain Generalization via Wasserstein Distributional Robust Optimization
Authors: Jingge Wang, Yang Li, Liyan Xie, Yao Xie
Abstract: Given multiple source domains, domain generalization aims at learning a universal model that performs well on any unseen but related target domain. In this work, we focus on the domain generalization scenario where domain shifts occur among class-conditional distributions of different domains. Existing approaches are not sufficiently robust when the variation of conditional distributions given the…  More
Submitted 8 September, 2021; originally announced September 2021.
Comments: presented as a RobustML workshop paper at ICLR 2021
Cited by 1 Related articles All 2 versions 

 

arXiv:2109.03431  [pdfother cs.AI  cs.LG
Fixed Support Tree-Sliced Wasserstein Barycenter
Authors: Yuki Takezawa, Ryoma Sato, Zornitsa Kozareva, Sujith Ravi, Makoto Yamada
Abstract: The Wasserstein barycenter has been widely studied in various fields, including natural language processing, and computer vision. However, it requires a high computational cost to solve the Wasserstein barycenter problem because the computation of the Wasserstein distance requires a quadratic time with respect to the number of supports. By contrast, the Wasserstein distance on a tree, called the t…  More
Submitted 8 September, 2021; originally announced September 2021.

Cited by 8 Related articles All 4 versions 

<——2021———2021———1270——  

 

arXiv:2110.02115  [pdfpsother math.MG
Wasserstein distance and metric trees
Authors: Maxime Mathey-Prevot, Alain Valette
Abstract: We study the Wasserstein (or earthmover) metric on the space P(X) of probability measures on a metric space X. We show that, if a finite metric space X embeds stochastically with distortion D in a family of finite metric trees, then P(X) embeds bi-Lipschitz into ℓ1 with distortion D. Next, we re-visit the closed formula for the Wasserstein metric on finite metric trees due to Eva…  More
Submitted 5 October, 2021; originally announced October 2021.
Comments: 17 pages
MSC Class: 05C05; 05C12; 46B85; 68R12

Related articles All 5 versions 
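As an aside, the closed formula alluded to above (W1 on a tree is a sum, over edges, of the edge length times the absolute difference of the subtree masses) is easy to code. The sketch below is a generic illustration with a parent-array tree representation of my own choosing, not code from the paper.

```python
import numpy as np

def tree_wasserstein_1(parent, edge_len, mu, nu):
    """W1 between two probability vectors mu, nu on the nodes of a tree.

    parent[i]   : index of the parent of node i (root has parent -1)
    edge_len[i] : length of the edge joining node i to its parent
    Uses the closed formula W1 = sum_e w(e) * |mu(subtree_e) - nu(subtree_e)|.
    """
    n = len(parent)
    diff = np.asarray(mu, float) - np.asarray(nu, float)
    depth = np.zeros(n, dtype=int)
    for i in range(n):                      # depth of each node (O(n^2), fine for a sketch)
        j = i
        while parent[j] != -1:
            depth[i] += 1
            j = parent[j]
    subtree = diff.copy()                   # net mass difference carried across each edge
    total = 0.0
    for i in np.argsort(-depth):            # children strictly before their parents
        if parent[i] != -1:
            total += edge_len[i] * abs(subtree[i])
            subtree[parent[i]] += subtree[i]
    return total

# path 0-1-2 with unit edges behaves like the points 0, 1, 2 on a line
print(tree_wasserstein_1([-1, 0, 1], [0.0, 1.0, 1.0], [1, 0, 0], [0, 0, 1]))  # -> 2.0
```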


arXiv:2110.01141  [pdfother cond-mat.stat-mech
Minimum entropy production, detailed balance and Wasserstein distance for continuous-time Markov processes
Authors: Andreas Dechant
Abstract: We investigate the problem of minimizing the entropy production for a physical process that can be described in terms of a Markov jump dynamics. We show that, without any further constraints, a given time-evolution may be realized at arbitrarily small entropy production, yet at the expense of diverging activity. For a fixed activity, we find that the dynamics that minimizes the entropy production…  More
Submitted 3 October, 2021; originally announced October 2021.
Comments: 23 pages, 6 figures

arXiv:2110.00295  [pdfpsother math.PR
Empirical measures and random walks on compact spaces in the quadratic Wasserstein metric
Authors: Bence Borda
Abstract: Estimating the rate of convergence of the empirical measure of an i.i.d. sample to the reference measure is a classical problem in probability theory. Extending recent results of Ambrosio, Stra and Trevisan on 2-dimensional manifolds, in this paper we prove sharp asymptotic and nonasymptotic upper bounds for the mean rate in the quadratic Wasserstein metric W2 on a d-dimensional compact Riema…  More
Submitted 1 October, 2021; originally announced October 2021.
Comments: 27 pages
MSC Class: 60B05; 60B15; 60G10; 49Q22
Journal ArticleFull Text Online

 Cited by 2 Related articles All 3 versions 


Multi WGAN-GP loss for pathological stain transformation using GAN

AZ Moghadam, H Azarnoush… - 2021 29th Iranian …, 2021 - ieeexplore.ieee.org

In this paper, we proposed a new loss function to train the conditional generative adversarial

network (CGAN). CGANs use a condition to generate images. Adding a class condition to

the discriminator helps improve the training process of GANs and has been widely used for

Related articles
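For context on the WGAN-GP entries collected here, this is a minimal PyTorch sketch of the standard single-critic gradient penalty of Gulrajani et al. (2017), not the multi-loss variant proposed in the paper above; `critic` is assumed to map image batches of shape (N, C, H, W) to scalar scores.

```python
import torch

def gradient_penalty(critic, real, fake):
    """Standard WGAN-GP term: penalize the critic's gradient norm on random
    interpolates between real and generated samples."""
    n = real.size(0)
    eps = torch.rand(n, 1, 1, 1, device=real.device)               # per-sample mixing weight
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads = torch.autograd.grad(outputs=scores, inputs=interp,
                                grad_outputs=torch.ones_like(scores),
                                create_graph=True, retain_graph=True)[0]
    grad_norm = grads.reshape(n, -1).norm(2, dim=1)
    return ((grad_norm - 1.0) ** 2).mean()

# critic step:  loss_D = critic(fake).mean() - critic(real).mean() + 10.0 * gradient_penalty(critic, real, fake)
```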


Optimal control of the Fokker-Planck equation under state constraints in the Wasserstein...

by Daudin, Samuel

09/2021

We analyze a problem of optimal control of the Fokker-Planck equation with state constraints in the Wasserstein space of 

probability measures. Our main result...

Journal Article  Full Text Online

arXiv:2109.14978  [pdfpsother math.OC
Optimal control of the Fokker-Planck equation under state constraints in the Wasserstein space
Authors: Samuel Daudin
Abstract: We analyze a problem of optimal control of the Fokker-Planck equation with state constraints in the Wasserstein space of probability measures. Our main result is to derive necessary conditions for optimality in the form of a Mean Field Game system of partial differential equations completed with an exclusion condition. As a by-product we obtain optimal (feedback) controls that are proved to be Lip…  More
Submitted 14 October, 2021; v1 submitted 30 September, 2021; originally announced September 2021.
Journal Article  Full Text Online

Cited by 4 Related articles All 7 versions 

2021

 

Towards Better Data Augmentation using Wasserstein Distance in Variational...
by Chen, Zichuan; Liu, Peng
09/2021
VAE, or variational auto-encoder, compresses data into latent attributes, and generates new data of different varieties. VAE based on KL divergence has been...
Journal Article  Full Text Online

arXiv:2109.14795  [pdf cs.LG
Towards Better Data Augmentation using Wasserstein Distance in Variational Auto-encoder
Authors: Zichuan Chen, Peng Liu
Abstract: VAE, or variational auto-encoder, compresses data into latent attributes, and generates new data of different varieties. VAE based on KL divergence has been considered as an effective technique for data augmentation. In this paper, we propose the use of Wasserstein distance as a measure of distributional similarity for the latent attributes, and show its superior theoretical lower bound (ELBO) com…  More
Submitted 29 September, 2021; originally announced September 2021.

Related articles All 4 versions
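To make the latent-space idea concrete, here is a tiny illustration that compares encoder outputs with a coordinate-wise 1-D Wasserstein distance (scipy has a closed-form 1-D routine). This surrogate is my own simplification, not the objective or bound derived in the paper.

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
z_enc = rng.normal(0.3, 1.2, size=(1000, 8))     # stand-in for encoded latent codes
z_prior = rng.normal(0.0, 1.0, size=(1000, 8))   # samples from the latent prior

# average the exact 1-D W1 over latent coordinates as a cheap similarity score
per_dim = [wasserstein_distance(z_enc[:, d], z_prior[:, d]) for d in range(z_enc.shape[1])]
print(float(np.mean(per_dim)))
```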

[PDF] openreview.net

A Distributional Robustness Perspective on Adversarial Training with the ∞-Wasserstein Distance

C Regniez, G Gidel - 2021 - openreview.net

… problem corresponds to an ∞-Wasserstein DRO problem with the l∞ underlying geometry.

… -∞-Wasserstein distance and add entropic regularization. 2-∞Wasserstein DRO has already …


arXiv:2109.12880  [pdfother]  cs.CV  math.NA
Wasserstein Patch Prior for Image Superresolution
Authors: Johannes Hertrich, Antoine Houdard, Claudia Redenbach
Abstract: In this paper, we introduce a Wasserstein patch prior for superresolution of two- and three-dimensional images. Here, we assume that we have given (additionally to the low resolution observation) a reference image which has a similar patch distribution as the ground truth of the reconstruction. This assumption is e.g. fulfilled when working with texture images or material data. Then, the proposed…  More
Submitted 27 September, 2021; originally announced September 2021.
Journal ArticleFull Text Online

 Cited by 1 Related articles All 4 versions 

arXiv:2109.12198  [pdfother math.OC
Wasserstein Contraction Bounds on Closed Convex Domains with Applications to Stochastic Adaptive Control
Authors: Tyler Lekang, Andrew Lamperski
Abstract: This paper is motivated by the problem of quantitatively bounding the convergence of adaptive control methods for stochastic systems to a stationary distribution. Such bounds are useful for analyzing statistics of trajectories and determining appropriate step sizes for simulations. To this end, we extend a methodology from (unconstrained) stochastic differential equations (SDEs) which provides con…  More
Submitted 15 October, 2021; v1 submitted 24 September, 2021; originally announced September 2021.
All 2 versions 


arXiv:2109.09182  [pdfother math.OC
Application of Wasserstein Attraction Flows for Optimal Transport in Network Systems
Authors: Ferran Arqué, César A. Uribe, Carlos Ocampo-Martinez
Abstract: This paper presents a Wasserstein attraction approach for solving dynamic mass transport problems over networks. In the transport problem over networks, we start with a distribution over the set of nodes that needs to be "transported" to a target distribution accounting for the network topology. We exploit the specific structure of the problem, characterized by the computation of implicit gradient…  More
Submitted 19 September, 2021; originally announced September 2021.

Related articles All 2 versions


Application of Wasserstein Attraction Flows for Optimal Transport in Network Systems

Arque, F; Uribe, CA and Ocampo-Martinez, C

60th IEEE Conference on Decision and Control (CDC)

2021 | 

2021 60TH IEEE CONFERENCE ON DECISION AND CONTROL (CDC)

 , pp.4058-4063

This paper presents a Wasserstein attraction approach for solving dynamic mass transport problems over networks. In the transport problem over networks, we start with a distribution over the set of nodes that needs to be "transported" to a target distribution accounting for the network topology. We exploit the specific structure of the problem, characterized by the computation of implicit gradi


Free Submitted Article From Repository  Full Text at Publisher

21 References  Related records

<——2021———2021———1280——   

 

arXiv:2110.07940  [pdf, other]  cs.LG
Wasserstein Unsupervised Reinforcement Learning
Authors: Shuncheng He, Yuhang Jiang, Hongchang Zhang, Jianzhun Shao, Xiangyang Ji
Abstract: Unsupervised reinforcement learning aims to train agents to learn a handful of policies or skills in environments without external reward. These pre-trained policies can accelerate learning when endowed with external reward, and can also be used as primitive options in hierarchical reinforcement learning. Conventional approaches of unsupervised skill discovery feed a latent variable to the agent a…  More
Submitted 15 October, 2021; originally announced October 2021.

 Variance Minimization in the Wasserstein Space for Invariant Causal Prediction
by Martinet, Guillaume; Strzalkowski, Alexander; Engelhardt, Barbara E
10/2021
Selecting powerful predictors for an outcome is a cornerstone task for machine learning. However, some types of questions can only be answered by identifying...
Journal Article  Full Text Online

arXiv:2110.07064  [pdfother cs.LG 
  Variance Minimization in the Wasserstein Space for Invariant Causal Prediction
Authors: Guillaume Martinet, Alexander Strzalkowski, Barbara E. Engelhardt
Abstract: Selecting powerful predictors for an outcome is a cornerstone task for machine learning. However, some types of questions can only be answered by identifying the predictors that causally affect the outcome. A recent approach to this causal inference problem leverages the invariance property of a causal mechanism across differing experimental environments (Peters et al., 2016; Heinze-Deml et al., 2…  More
Submitted 13 October, 2021; originally announced October 2021.
All 2 versions

 

A Framework for Verification of Wasserstein Adversarial Robustness
by Wegel, Tobias; Assion, Felix; Mickisch, David; More...
10/2021
Machine learning image classifiers are susceptible to adversarial and corruption perturbations. Adding imperceptible noise to images can lead to severe...
Journal Article  Full Text Online

arXiv:2110.06816  [pdfother cs.LG   cs.CV
A Framework for Verification of Wasserstein Adversarial Robustness
Authors: Tobias Wegel, Felix Assion, David Mickisch, Florens Greßner
Abstract: Machine learning image classifiers are susceptible to adversarial and corruption perturbations. Adding imperceptible noise to images can lead to severe misclassifications of the machine learning model. Using Lp-norms for measuring the size of the noise fails to capture human similarity perception, which is why optimal transport based distance measures like the Wasserstein metric are increasingl…  More
Submitted 13 October, 2021; originally announced October 2021.
Comments: 10 pages, 4 figures
 All 4 versions 

 

Dynamical Wasserstein Barycenters for Time-series Modeling
by Cheng, Kevin C; Aeron, Shuchin; Hughes, Michael C; More...
10/2021
Many time series can be modeled as a sequence of segments representing high-level discrete states, such as running and walking in a human activity application....
Journal Article  Full Text Online

arXiv:2110.06741  [pdfother]  cs.LG   stat.ML
Dynamical Wasserstein Barycenters for Time-series Modeling
Authors: Kevin C. Cheng, Shuchin Aeron, Michael C. Hughes, Eric L. Miller
Abstract: Many time series can be modeled as a sequence of segments representing high-level discrete states, such as running and walking in a human activity application. Flexible models should describe the system state and observations in stationary "pure-state" periods as well as transition periods between adjacent segments, such as a gradual slowdown between running and walking. However, most prior work a… 
More
Submitted 29 October, 2021; v1 submitted 13 October, 2021; originally announced October 2021.
Comments: To appear at Neurips 2021
Cited by 1
 Related articles All 5 versions 

 

arXiv:2110.06591  [pdfpsother]  math.CT   cs.LO   math.MG   math.PR
Lifting couplings in Wasserstein spaces
Authors: Paolo Perrone
Abstract: This paper makes mathematically precise the idea that conditional probabilities are analogous to path liftings in geometry. The idea of lifting is modelled in terms of the category-theoretic concept of a lens, which can be interpreted as a consistent choice of arrow liftings. The category we study is the one of probability measures over a given standard Borel space, with morphisms given by the c… 
More
Submitted 13 October, 2021; originally announced October 2021.
Comments: 27 pages
MSC Class: 18D20; 51F99; 49Q22
All 2 versions 


2021

 

Tangent Space and Dimension Estimation with the Wasserstein Distance
by Lim, Uzu; Oberhauser, Harald; Nanda, Vidit
10/2021
Consider a set of points sampled independently near a smooth compact submanifold of Euclidean space. We provide mathematically rigorous bounds on the number of...
Journal Article  Full Text Online

arXiv:2110.06357  [pdfother]  math.ST   cs.LG
Tangent Space and Dimension Estimation with the Wasserstein Distance
Authors: Uzu Lim, Vidit Nanda, Harald Oberhauser
Abstract: We provide explicit bounds on the number of sample points required to estimate tangent spaces and intrinsic dimensions of (smooth, compact) Euclidean submanifolds via local principal component analysis. Our approach directly estimates covariance matrices locally, which simultaneously allows estimating both the tangent spaces and the intrinsic dimension of a manifold. The key arguments involve a ma…  More
Submitted 12 October, 2021; originally announced October 2021.
 All 4 versions
 


Backward and Forward Wasserstein Projections in Stochastic Order
by Kim, Young-Heon; Ruan, Yuan Long
arXiv.org, 10/2021
We study metric projections onto cones in the Wasserstein space of probability measures, defined by stochastic orders. Dualities for backward and forward...
Paper  Full Text Online

arXiv:2110.04822  [pdfother math.PR
Backward and Forward Wasserstein Projections in Stochastic Order
Authors: Young-Heon Kim, Yuan Long Ruan
Abstract: We study metric projections onto cones in the Wasserstein space of probability measures, defined by stochastic orders. Dualities for backward and forward projections are established under general conditions. Dual optimal solutions and their characterizations require study on a case-by-case basis. Particular attention is given to convex order and subharmonic order. While backward and forward cones…  More
Submitted 10 October, 2021; originally announced October 2021.
MSC Class: Primary 49; 60; secondary 52
All 2 versions
 


arXiv:2110.03995  [pdfpsother]  stat.ML   cs.LG
Statistical Regeneration Guarantees of the Wasserstein Autoencoder with Latent Space Consistency
Authors: Anish Chakrabarty, Swagatam Das
Abstract: The introduction of Variational Autoencoders (VAE) has been marked as a breakthrough in the history of representation learning models. Besides having several accolades of its own, VAE has successfully flagged off a series of inventions in the form of its immediate successors. Wasserstein Autoencoder (WAE), being an heir to that realm carries with it all of the goodness and heightened generative pr… 
More
Submitted 8 October, 2021; originally announced October 2021.
Comments: Accepted for Spotlight Presentation at NeurIPS 2021
Related articles All 5 versions

arXiv:2110.02753  [pdfother]  cs.LG
Semi-relaxed Gromov Wasserstein divergence with applications on graphs
Authors: Cédric Vincent-Cuaz, Rémi Flamary, Marco Corneli, Titouan Vayer, Nicolas Courty
Abstract: Comparing structured objects such as graphs is a fundamental operation involved in many learning tasks. To this end, the Gromov-Wasserstein (GW) distance, based on Optimal Transport (OT), has proven to be successful in handling the specific nature of the associated objects. More specifically, through the nodes connectivity relations, GW operates on graphs, seen as probability measures over specifi… 
More
Submitted 6 October, 2021; originally announced October 2021.
Comments: preprint under review
Cited by 5
 Related articles All 9 versions 


A Regularized Wasserstein Framework for Graph Kernels
by Wijesinghe, Asiri; Wang, Qing; Gould, Stephen
2021 IEEE International Conference on Data Mining (ICDM), 12/2021
We propose a learning framework for graph kernels, which is theoretically grounded on regularizing optimal transport. This framework provides a novel optimal...
Conference Proceeding  Full Text Online

arXiv:2110.02554  [pdfother]  cs.LG   stat.ML
A Regularized Wasserstein Framework for Graph Kernels
Authors: Asiri Wijesinghe, Qing Wang, Stephen Gould
Abstract: We propose a learning framework for graph kernels, which is theoretically grounded on regularizing optimal transport. This framework provides a novel optimal transport distance metric, namely Regularized Wasserstein (RW) discrepancy, which can preserve both features and structure of graphs via Wasserstein distances on features and their local variations, local barycenters and global connectivity.…  More
Submitted 8 October, 2021; v1 submitted 6 October, 2021; originally announced October 2021.
Comments: 21st IEEE International Conference on Data Mining (ICDM 2021)

 Related articles All 5 versions
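For orientation, entropic regularization solved by Sinkhorn iterations is the standard workhorse behind many regularized optimal-transport discrepancies; the RW discrepancy of this paper is a different construction, so the sketch below is only generic background.

```python
import numpy as np

def sinkhorn(a, b, M, reg=0.1, n_iter=1000):
    """Entropic-regularized OT plan between histograms a, b with cost matrix M.
    The transport cost <plan, M> approaches the exact Wasserstein cost as reg -> 0
    (too small a reg can underflow; this is a sketch, not production code)."""
    K = np.exp(-M / reg)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]

x = np.arange(5, dtype=float)                 # support points 0..4
M = (x[:, None] - x[None, :]) ** 2            # squared-Euclidean ground cost
a = np.array([0.5, 0.5, 0.0, 0.0, 0.0])
b = np.array([0.0, 0.0, 0.0, 0.5, 0.5])
plan = sinkhorn(a, b, M)
print(np.sum(plan * M))                       # close to the exact squared cost, 9.0
```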

<——2021———2021———1290——  


 

arXiv:2110.14150  [pdfother]  cs.LG   cs.CV   math.NA
Training Wasserstein GANs without gradient penalties
Authors: Dohyun Kwon, Yeoneung Kim, Guido Montúfar, Insoon Yang
Abstract: We propose a stable method to train Wasserstein generative adversarial networks. In order to enhance stability, we consider two objective functions using the c-transform based on Kantorovich duality which arises in the theory of optimal transport. We experimentally show that this algorithm can effectively enforce the Lipschitz constraint on the discriminator while other standard methods fail to…
More
Submitted 26 October, 2021; originally announced October 2021.
All 3 versions
 

 

Uncertainty quantification in a mechanical submodel driven by a Wasserstein-GAN
by Boukraichi, Hamza; Akkari, Nissrine; Casenave, Fabien; More...
10/2021
The analysis of parametric and non-parametric uncertainties of very large dynamical systems requires the construction of a stochastic model of said system....
Journal Article  Full Text Online

arXiv:2110.13680  [pdf, other]  stat.ML  cs.LG
Uncertainty quantification in a mechanical submodel driven by a Wasserstein-GAN
Authors: Hamza Boukraichi, Nissrine Akkari, Fabien Casenave, David Ryckelynck
Abstract: The analysis of parametric and non-parametric uncertainties of very large dynamical systems requires the construction of a stochastic model of said system. Linear approaches relying on random matrix theory and principal componant analysis can be used when systems undergo low-frequency vibrations. In the case of fast dynamics and wave propagation, we investigate a random generator of boundary condi…  More
Submitted 26 October, 2021; originally announced October 2021.
MSC Class: 68T07 (Primary) 35L05 (Secondary) ACM Class: G.3; G.1.8
Journal ArticleFull Text Online

All 3 versions 

 

arXiv:2110.13389  [pdfother cs.CV
A Normalized Gaussian Wasserstein Distance for Tiny Object Detection
Authors: Jinwang Wang, Chang Xu, Wen Yang, Lei Yu
Abstract: Detecting tiny objects is a very challenging problem since a tiny object only contains a few pixels in size. We demonstrate that state-of-the-art detectors do not produce satisfactory results on tiny objects due to the lack of appearance information. Our key observation is that Intersection over Union (IoU) based metrics such as IoU itself and its extensions are very sensitive to the location devi…  More
Submitted 25 October, 2021; originally announced October 2021.
Cited by 33 Related articles All 2 versions 
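The abstract's idea can be sketched directly: model each box as a 2-D Gaussian and turn the 2-Wasserstein distance into a bounded similarity with an exponential. The closed form below follows from the Gaussian-to-Gaussian W2 formula for diagonal covariances; the normalizing constant C is a hypothetical dataset-dependent scale, not a value from the paper.

```python
import numpy as np

def normalized_wasserstein(box_a, box_b, C=12.0):
    """Boxes are (cx, cy, w, h), modelled as N([cx, cy], diag(w^2/4, h^2/4)).
    For such Gaussians, W2^2 reduces to a plain squared Euclidean distance
    between (cx, cy, w/2, h/2) vectors; the result is mapped to (0, 1]."""
    cxa, cya, wa, ha = box_a
    cxb, cyb, wb, hb = box_b
    w2_sq = ((cxa - cxb) ** 2 + (cya - cyb) ** 2
             + ((wa - wb) / 2.0) ** 2 + ((ha - hb) / 2.0) ** 2)
    return float(np.exp(-np.sqrt(w2_sq) / C))

print(normalized_wasserstein((10, 10, 4, 4), (12, 11, 4, 6)))   # smooth similarity even for tiny boxes
```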


Variational Wasserstein Barycenters with c-Cyclical Monotonicity
by Chi, Jinjin; Yang, Zhiyao; Ouyang, Jihong; More...
10/2021
Wasserstein barycenter, built on the theory of optimal transport, provides a powerful framework to aggregate probability distributions, and it has increasingly...
Journal Article  Full Text Online

arXiv:2110.11707  [pdfother]  cs.LG   stat.ML
Variational Wasserstein Barycenters with c-Cyclical Monotonicity
Authors: Jinjin Chi, Zhiyao Yang, Jihong Ouyang, Ximing Li
Abstract: Wasserstein barycenter, built on the theory of optimal transport, provides a powerful framework to aggregate probability distributions, and it has increasingly attracted great attention within the machine learning community. However, it suffers from severe computational burden, especially for high dimensional and continuous settings. To this end, we develop a novel continuous approximation method… 
More
Submitted 22 October, 2021; originally announced October 2021.
All 3 versions
 

2021

Sliced-Wasserstein Gradient Flows

arXiv:2110.10972  [pdfother]  cs.LG  math.OC  stat.ML
Sliced-Wasserstein Gradient Flows
Authors: Clément Bonet, Nicolas Courty, François Septier, Lucas Drumetz
Abstract: Minimizing functionals in the space of probability distributions can be done with Wasserstein gradient flows. To solve them numerically, a possible approach is to rely on the Jordan-Kinderlehrer-Otto (JKO) scheme which is analogous to the proximal scheme in Euclidean spaces. However, this bilevel optimization problem is known for its computational challenges, especially in high dimension. To allev… 
More
Submitted 21 October, 2021; originally announced October 2021.
Cited by 2
 Related articles All 16 versions 
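As a reminder of the basic quantity behind these sliced-Wasserstein entries, the distance itself is just an average of 1-D Wasserstein distances over random projection directions; the Monte-Carlo sketch below illustrates that and says nothing about the JKO gradient-flow scheme of the paper.

```python
import numpy as np
from scipy.stats import wasserstein_distance

def sliced_wasserstein(X, Y, n_proj=200, seed=0):
    """Monte-Carlo sliced 1-Wasserstein distance between two point clouds in R^d."""
    rng = np.random.default_rng(seed)
    theta = rng.normal(size=(n_proj, X.shape[1]))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)      # random unit directions
    return float(np.mean([wasserstein_distance(X @ t, Y @ t) for t in theta]))

rng = np.random.default_rng(1)
X = rng.normal(0.0, 1.0, size=(500, 2))
Y = rng.normal(3.0, 1.0, size=(500, 2))        # same shape, shifted cloud
print(sliced_wasserstein(X, Y))
```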

 

arXiv:2110.10932  [pdfother]  cs.LG  math.OC  stat.ML
Subspace Detours Meet Gromov-Wasserstein
Authors: Clément Bonet, Nicolas Courty, François Septier, Lucas Drumetz
Abstract: In the context of optimal transport methods, the subspace detour approach was recently presented by Muzellec and Cuturi (2019). It consists in building a nearly optimal transport plan in the measures space from an optimal transport plan in a wisely chosen subspace, onto which the original measures are projected. The contribution of this paper is to extend this category of methods to the Gromov-Was… 
More
Submitted 21 October, 2021; originally announced October 2021.
Journal ArticleFull Text Online

arXiv:2110.10464  [pdfother]  math.FA  math.DG  math.OC  math.ST  stat.ML
Generalized Bures-Wasserstein Geometry for Positive Definite Matrices
Authors: Andi Han, Bamdev Mishra, Pratik Jawanpuria, Junbin Gao
Abstract: This paper proposes a generalized Bures-Wasserstein (BW) Riemannian geometry for the manifold of symmetric positive definite matrices. We explore the generalization of the BW geometry in three different ways: 1) by generalizing the Lyapunov operator in the metric, 2) by generalizing the orthogonal Procrustes distance, and 3) by generalizing the Wasserstein distance between the Gaussians. We show t… 
More
Cited by 1
 Related articles All 2 versions 

 

arXiv:2110.10363  [pdfpsother]  math.CO   math.PR
On the Wasserstein Distance Between k-Step Probability Measures on Finite Graphs
Authors: Sophia Benjamin, Arushi Mantri, Quinn Perian
Abstract: We consider random walks X, Y on a finite graph G with respective lazinesses α, β ∈ [0,1]. Let μk and νk be the k-step transition probability measures of X and Y. In this paper, we study the Wasserstein distance between μk and νk for general k. We consider the sequence formed by the Wasserstein distance at odd values of k and the sequence formed by the Wasserstein dista…  More
Submitted 19 October, 2021; originally announced October 2021.
Comments: 31 pages, 0 figures
MSC Class: 05C81 (Primary) 05C12; 49Q22; 05C21 (Secondary)
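A toy numerical version of this setup, restricted to a path graph so that the graph metric is one-dimensional and scipy's weighted 1-D Wasserstein distance applies; the lazinesses and k below are arbitrary choices, not values from the paper.

```python
import numpy as np
from scipy.stats import wasserstein_distance

def lazy_walk(n, laziness):
    """Transition matrix of a lazy simple random walk on the path 0-1-...-(n-1)."""
    P = np.zeros((n, n))
    for i in range(n):
        P[i, i] = laziness
        nbrs = [j for j in (i - 1, i + 1) if 0 <= j < n]
        for j in nbrs:
            P[i, j] = (1.0 - laziness) / len(nbrs)
    return P

n, k, alpha, beta = 10, 6, 0.2, 0.5
start = np.zeros(n); start[0] = 1.0                          # both walks start at node 0
mu_k = start @ np.linalg.matrix_power(lazy_walk(n, alpha), k)
nu_k = start @ np.linalg.matrix_power(lazy_walk(n, beta), k)

nodes = np.arange(n, dtype=float)
print(wasserstein_distance(nodes, nodes, u_weights=mu_k, v_weights=nu_k))
```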

 

arXiv:2110.08991  [pdfother]  cs.DS   cs.LG   math.PR
Dimensionality Reduction for Wasserstein Barycenter
Authors: Zachary Izzo, Sandeep Silwal, Samson Zhou
Abstract: The Wasserstein barycenter is a geometric construct which captures the notion of centrality among probability distributions, and which has found many applications in machine learning. However, most algorithms for finding even an approximate barycenter suffer an exponential dependence on the dimension d of the underlying space of the distributions. In order to cope with this "curse of dimensional…  More
Submitted 18 October, 2021; v1 submitted 17 October, 2021; originally announced October 2021.
Comments: Published as a conference paper in NeurIPS 2021


Cited by 12 Related articles All 5 versions 
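For intuition only: in one dimension the W2 barycenter is explicit (its quantile function is the weighted average of the input quantile functions), so with equal sample sizes one can simply average sorted samples. This is the easy special case, not the high-dimensional problem the entry above is about.

```python
import numpy as np

def barycenter_1d(samples, weights):
    """W2 barycenter of 1-D empirical measures given as equal-size sample arrays:
    average the sorted samples (i.e. the empirical quantile functions)."""
    sorted_samples = np.stack([np.sort(np.asarray(s, float)) for s in samples])
    w = np.asarray(weights, float)[:, None]
    return np.sum(w * sorted_samples, axis=0)

rng = np.random.default_rng(0)
groups = [rng.normal(m, 1.0, size=1000) for m in (-2.0, 0.0, 5.0)]
bary = barycenter_1d(groups, weights=[1/3, 1/3, 1/3])
print(bary.mean())          # close to the average of the three means, i.e. 1.0
```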

<——2021———2021———1300——  



 ALLWAS: Active Learning on Language models in WASserstein space
by Bastos, Anson; Kaul, Manohar
09/2021
Active learning has emerged as a standard paradigm in areas with scarcity of labeled training data, such as in the medical domain. Language models have emerged...
Journal Article  Full Text Online

arXiv:2109.01691  [pdfother cs.CL  cs.LG
ALLWAS: Active Learning on Language models in WASserstein space
Authors: Anson Bastos, Manohar Kaul
Abstract: Active learning has emerged as a standard paradigm in areas with scarcity of labeled training data, such as in the medical domain. Language models have emerged as the prevalent choice of several natural language tasks due to the performance boost offered by these models. However, in several domains, such as medicine, the scarcity of labeled training data is a common issue. Also, these models may n…  More
Submitted 3 September, 2021; originally announced September 2021.

<——2021———2021———1310——  


arXiv:2109.02625  [pdfother cs.CV
ERA: Entity Relationship Aware Video Summarization with Wasserstein GAN
Authors: Guande Wu, Jianzhe Lin, Claudio T. Silva
Abstract: Video summarization aims to simplify large scale video browsing by generating concise, short summaries that diver from but well represent the original video. Due to the scarcity of video annotations, recent progress for video summarization concentrates on unsupervised methods, among which the GAN based methods are most prevalent. This type of methods includes a summarizer and a discriminator. The…  More
Submitted 6 September, 2021; originally announced September 2021.
Comments: 8 pages, 3 figures
Journal ArticleFull Text Online

Cited by 1 Related articles All 3 versions 

2021 see 2020

Chae, Minwoo; De Blasi, Pierpaolo; Walker, Stephen G.

Posterior asymptotics in Wasserstein metrics on the real line. (English) Zbl 07408164

Electron. J. Stat. 15, No. 2, 3635-3677 (2021).

MSC:  62F15 62G20 62G07

PDF BibTeX XML Cite 

Cited by 4 Related articles All 9 versions
Posterior asymptotics in Wasserstein metrics on the real line

Authors: Minwoo Chae, Pierpaolo De Blasi, Stephen G. Walker
eBook, 2021
English
Publisher:CCA, Fondazione Collegio Carlo Alberto, Torino, 2021


[HTML] springer.com

[HTML] Entropy-regularized 2-Wasserstein distance between Gaussian measures

A Mallasto, A Gerolin, HQ Minh - Information Geometry, 2021 - Springer

… 3, we compute explicit solutions to the entropy-relaxed 2-Wasserstein distance between 

Gaussians, … We derive fixed-point expressions for the entropic 2-Wasserstein distance and the 2-…

3 Related articles All 6 versions

Cited by 23 Related articles All 6 versions
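For comparison with the entropic quantity studied in this paper, the exact (unregularized) 2-Wasserstein distance between Gaussian measures has the well-known closed form coded below; the paper's fixed-point expressions for the entropy-relaxed version are not reproduced here.

```python
import numpy as np
from scipy.linalg import sqrtm

def w2_gaussians(m1, S1, m2, S2):
    """W2 between N(m1, S1) and N(m2, S2):
    W2^2 = |m1 - m2|^2 + tr(S1 + S2 - 2 * (S2^{1/2} S1 S2^{1/2})^{1/2})."""
    root = sqrtm(S2)
    cross = sqrtm(root @ S1 @ root)
    bures_sq = np.trace(S1 + S2 - 2.0 * np.real(cross))
    return float(np.sqrt(np.sum((np.asarray(m1) - np.asarray(m2)) ** 2) + bures_sq))

m1, S1 = np.zeros(2), np.eye(2)
m2, S2 = np.array([3.0, 0.0]), np.diag([4.0, 1.0])
print(w2_gaussians(m1, S1, m2, S2))     # sqrt(9 + (2 - 1)^2) = sqrt(10)
```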

Barrera, G.; Högele, M. A.; Pardo, J. C.

Cutoff thermalization for Ornstein-Uhlenbeck systems with small Lévy noise in the Wasserstein distance. (English) Zbl 07402093

J. Stat. Phys. 184, No. 3, Paper No. 27, 54 p. (2021).

MSC:  60G15 60G55 60J65

PDF BibTeX XML Cite

Full Text: DOI

2021

Bishop, Adrian N.; Doucet, Arnaud

Network consensus in the Wasserstein metric space of probability measures. (English) Zbl 07398751

SIAM J. Control Optim. 59, No. 5, 3261-3277 (2021).

MSC:  60B05 90C08 93C83 68T45

PDF BibTeX XML Cite  Full Text: DOI

Cited by 2 Related articles All 8 versions

Qian, Yitian; Pan, Shaohua

An inexact PAM method for computing Wasserstein barycenter with unknown supports. (English) Zbl 07394327

Comput. Appl. Math. 40, No. 2, Paper No. 45, 29 p. (2021).

MSC:  90C26 49J52 65K05

PDF BibTeX XML Cite Full Text: DOI
 Related articles All 2 versions


Borda, Bence

Equidistribution of random walks on compact groups. II: The Wasserstein metric. (English) Zbl 07394102

Bernoulli 27, No. 4, 2598-2623 (2021).

MSC:  60G50 60Fxx 11Kxx 60Bxx

PDF BibTeX XML Cite

Full Text: DOI


Yang, Insoon

Wasserstein distributionally robust stochastic control: a data-driven approach. (English) Zbl 07393119

IEEE Trans. Autom. Control 66, No. 8, 3863-3870 (2021).

MSC:  93E20 93B35 90C39 91A10 91A05 91A80

PDF BibTeX XML Cite

Full Text: DOI


Dąbrowski, Damian

Sufficient condition for rectifiability involving Wasserstein distance W2. (English) Zbl 07388797

J. Geom. Anal. 31, No. 8, 8539-8606 (2021).

MSC:  28A75 28A78

PDF BibTeX XML Cite

Full Text: DOI  [PDF] arxiv.org

Cited by 7 Related articles All 4 versions

<——2021———2021———1320—— 


A Liver Segmentation Method Based on the Fusion ... - PubMed

https://pubmed.ncbi.nlm.nih.gov › ...

by J Ma · 2021 — Accurate segmentation of liver images is an essential step in liver disease diagnosis, treatment planning, and prognosis.

online Cover Image PEER-REVIEW OPEN ACCESS

A Liver Segmentation Method Based on the Fusion of VNet and WGAN

by Ma, Jinlin; Deng, Yuanyuan; Ma, Ziping ; More...

Computational and mathematical methods in medicine, 10/2021, Volume 2021

Accurate segmentation of liver images is an essential step in liver disease diagnosis, treatment planning, and prognosis. In recent years, although liver...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 Cited by 3 Related articles All 7 versions


Prediction of Aquatic Ecosystem Health Indices through ...

https://www.mdpi.com › ...

by S Lee · 2021 — Prediction of Aquatic Ecosystem Health Indices through Machine Learning Models Using the WGAN-Based Data Augmentation Method. by. Seoro Lee

online Cover Image PEER-REVIEW OPEN ACCESS

Prediction of Aquatic Ecosystem Health Indices through Machine Learning Models Using the WGAN-Based Data Augmentation Method

by Lee, Seoro; Kim, Jonggun; Lee, Gwanjae ; More...

Sustainability (Basel, Switzerland), 09/2021, Volume 13, Issue 18

Changes in hydrological characteristics and increases in various pollutant loadings due to rapid climate change and urbanization have a significant impact on...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 

(PDF) Infrared Image Super-Resolution via Heterogeneous ...

https://www.researchgate.net › publication › 354328932_...

Sep 6, 2021 — In this paper, we present a framework that employs heterogeneous convolution and adversarial training, namely, heterogeneous kernel-based super- ...

online OPEN ACCESS

Infrared Image Super-Resolution via Heterogeneous Convolutional WGAN

by Huang, Yongsong; Jiang, Zetao; Wang, Qingzhong ; More...

09/2021

Image super-resolution is important in many fields, such as surveillance and remote sensing. However, infrared (IR) images normally have low resolution since...

Journal ArticleFull Text Online

 

PRICAI 2021: Trends in Artificial Intelligence: 18th Pacific ...

https://books.google.com › books

Duc Nghia Pham

A low-resolution image ILR is input to a generator network to generate the ... GAN with Infrared Image Super-Resolution via Heterogeneous Convolutional WGAN ...

online

Infrared Image Super-Resolution via Heterogeneous Convolutional WGAN

by Huang, Yongsong; Jiang, Zetao; Wang, Qingzhong ; More...

PRICAI 2021: Trends in Artificial Intelligence, 11/2021

Image super-resolution is important in many fields, such as surveillance and remote sensing. However, infrared (IR) images normally have low resolution since...

Book ChapterFull Text Online

 

Inverse airfoil design method for generating varieties of ... - arXiv

https://arxiv.org › cs

by K Yonekura · 2021 — In this study, we employed conditional Wasserstein GAN with gradient penalty (CWGAN-GP) to generate airfoil shapes, and the obtained shapes ...

online  OPEN ACCESS

Inverse airfoil design method for generating varieties of smooth airfoils using conditional WGAN-gp

by Yonekura, Kazuo; Miyamoto, Nozomu; Suzuki, Katsuyuki

10/2021

Machine learning models are recently utilized for airfoil shape generation methods. It is desired to obtain airfoil shapes that satisfies required lift...

Journal ArticleFull Text Online

 

2021

2021 see 2020

The Quantum Wasserstein Distance of Order 1 - IEEE Xplore

https://ieeexplore.ieee.org › iel7

by G De Palma · 2021 · 0 — Our main result is a continuity bound for the von Neumann entropy with respect to the proposed distance, which significantly strengthens the ...

17 pages

online Cover Image PEER-REVIEW OPEN ACCESS

The Quantum Wasserstein Distance of Order 1

by De Palma, Giacomo; Marvian, Milad; Trevisan, Dario ; More...

IEEE transactions on information theory, 10/2021, Volume 67, Issue 10

We propose a generalization of the Wasserstein distance of order 1 to the quantum states of n...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

Cited by 36 Related articles All 12 versions

2021 see 2020

(PDF) The α-z-Bures Wasserstein divergence - ResearchGate
https://www.researchgate.net › ... › Quantum
Jun 17, 2021 — TRUNG HOA DINH, CONG TRINH LE, BICH KHUE VO AND TRUNG DUNG VUONG. Abstract. In this paper, we introduce the α-z-Bures Wasserstein divergence.
online

Cover Image  PEER-REVIEW

The [alpha]-z-Bures Wasserstein divergence

by Dinh, Trung Hoa; Le, Cong Trinh; Vo, Bich Khue ; More...

Linear algebra and its applications, 09/2021, Volume 624

Keywords Quantum divergence; [alpha]-z Bures distance; Least squares problem; Karcher mean; Matrix power mean; In-betweenness property; Data processing...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

Cited by 6 Related articles All 4 versions

  2021 see 2020

[PDF] Wasserstein Regression - Researchain

https://researchain.net › archives › Wasserstein-Regressi...

Adopting the Wasserstein metric, we develop a class of regression models for such data, where random distributions serve as predictor

online Cover Image

Wasserstein Regression

by Chen, Yaqing; Lin, Zhenhua; Müller, Hans-Georg

Journal of the American Statistical Association, 10/2021

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 8 Related articles All 2 versions

  2021 see 2020

Wasserstein distributionally robust motion control for collision avoidance using conditional value-at-risk

A HakobyanI Yang - IEEE Transactions on Robotics, 2021 - ieeexplore.ieee.org

In this article, a risk-aware motion control scheme is considered for mobile robots to avoid

randomly moving obstacles when the true probability distribution of uncertainty is unknown.

We propose a novel model-predictive control (MPC) method for limiting the risk of unsafety

even when the true distribution of the obstacles' movements deviates, within an ambiguity

set, from the empirical distribution obtained using a limited amount of sample data. By

choosing the ambiguity set as a statistical ball with its radius measured by the Wasserstein …

 2 Related articles All 3 versions

online Cover Image  PEER-REVIEW

Wasserstein Distributionally Robust Motion Control for Collision Avoidance Using Conditional Value-at-Risk

by Hakobyan, Astghik; Yang, Insoon

IEEE transactions on robotics, 09/2021

In this article, a risk-aware motion control scheme is considered for mobile robots to avoid randomly moving obstacles when the true probability distribution...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

   Cited by 15 Related articles All 3 versions

 

DerainGAN: Single image deraining using wasserstein GAN

https://link.springer.com › article

by S Yadav · 2021 — In this paper, we design a simple yet effective 'DerainGAN' framework to achieve improved deraining performance over the existing state-of-the- ...

Abstract · ‎Introduction · ‎Background and related work · ‎Methodology of proposed...

online Cover Image PEER-REVIEW

DerainGAN: Single image deraining using wasserstein GAN

by Yadav, Sahil; Mehra, Aryan; Rohmetra, Honnesh ; More...

Multimedia tools and applications, 09/2021

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

Cited by 1 Related articles All 2 versions

<——2021———2021———1330——

liced Wasserstein Based Canonical Correlation Analysis for ...

https://www.sciencedirect.com › science › article › pii

by Z Zhao · 2021 — In this paper, we propose a joint learning cross-domain recommendation model that can extract domain-specific and common features simultaneously ...



online Cover Image PEER-REVIEW

Sliced Wasserstein based Canonical Correlation Analysis for Cross-Domain Recommendation

by Zhao, Zian; Nie, Jie; Wang, Chenglong ; More...

Pattern recognition letters, 10/2021, Volume 150

•A cross-domain recommendation model based on Sliced Wasserstein autoencoder is proposed.•An improved cross-domain transformation loss of orthogonal...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

Related articles All 3 versions

 

2021 see 2020

Cutoff Thermalization for Ornstein–Uhlenbeck ... - Springer LINK

https://link.springer.com › article

by G Barrera · 2021 · Cited by 3 — This article establishes cutoff thermalization (also known as the cutoff phenomenon) for a class of generalized OrnsteinUhlenbeck systems.

online Cover Image PEER-REVIEW OPEN ACCESS

Cutoff Thermalization for Ornstein–Uhlenbeck Systems with Small Lévy Noise in the Wasserstein Distance

by Barrera, G; Högele, M. A; Pardo, J. C

Journal of statistical physics, 08/2021, Volume 184, Issue 3

This article establishes cutoff thermalization (also known as the cutoff phenomenon ) for a class of generalized Ornstein–Uhlenbeck systems ( X t ε ( x ) ) t ...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 Cited by 4 Related articles All 9 versions

Deep transfer Wasserstein adversarial network for wafer map defect recognition

J Yu, S Li, Z Shen, S Wang, C Liu, Q Li - Computers & Industrial …, 2021 - Elsevier

Deep neural networks (DNNs) are capable of extracting effective features from data by using

deep structure and multiple non-linear processing units. However, they dependent on large

datasets from the same distribution. It is difficult to collect wafer maps with various defect

patterns in semiconductor manufacturing process. A new deep transfer learning model,

deep transfer Wasserstein adversarial network (DTWAN) is proposed to recognize wafer

map defect. An adaptive transfer learning framework based on adversarial training is …

 Cite All 2 versions

online Cover Image  PEER-REVIEW

Deep transfer Wasserstein adversarial network for wafer map defect recognition

by Yu, Jianbo; Li, Shijin; Shen, Zongli ; More...

Computers & industrial engineering, 11/2021, Volume 161

•A new transfer learning model is proposed to recognize wafer map defect.•Wasserstein distance and MMD are integrated in domain adversarial training...

Journal ArticleFull Text Online

  

 2021 see 2019

A Wasserstein inequality and minimal Green energy on ...

https://www.researchgate.net › ... › Green Energy

May 2, 2021 — A Wasserstein inequality and minimal Green energy on compact manifolds. September 2021; Journal of Func

online  Cover Image   PEER-REVIEW

A Wasserstein inequality and minimal Green energy on compact manifolds

by Steinerberger, Stefan

Journal of functional analysis, 09/2021, Volume 281, Issue 5

Let M be a smooth, compact d−dimensional manifold, d≥3, without boundary and let G: M×M → R ∪ {∞} denote the Green's function of the Laplacian −Δ (normalized to...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

Cited by 11 Related articles All 3 versions
Zbl 07456696


 2021

2021 see 2020

Stochastic approximation versus sample average approximation for Wasserstein barycenters

D Dvinskikh - Optimization Methods and Software, 2021 - Taylor & Francis

In the machine learning and optimization community, there are two main approaches for the

convex risk minimization problem, namely the Stochastic Approximation (SA) and the

Sample Average Approximation (SAA). In terms of the oracle complexity (required number of

stochastic gradient evaluations), both approaches are considered equivalent on average

(up to a logarithmic factor). The total complexity depends on a specific problem, however,

starting from the work [A. Nemirovski, A. Juditsky, G. Lan, and A. Shapiro, Robust stochastic …

 Cite All 2 versions

online  Cover Image   PEER-REVIEW OPEN ACCESS

Stochastic approximation versus sample average approximation for Wasserstein barycenters

by Dvinskikh, Darina

Optimization methods & software, , Volume ahead-of-print, Issue ahead-of-print

In the machine learning and optimization community, there are two main approaches for the convex risk minimization problem, namely the Stochastic Approximation...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 Cite this item Email this item Save this item More actions

Conditional Wasserstein Generative Adversarial Networks for ...

https://www.jstage.jst.go.jp › -char

· Translate this page

by YH LI · 2021 — In an iris dataset, for instance, the minority class samples include images of eyes with glasses, oversized or undersized pupils, misaligned iris locations, and ...

online Cover Image PEER-REVIEW OPEN ACCESS

Conditional Wasserstein Generative Adversarial Networks for Rebalancing Iris Image Datasets

by LI, Yung-Hui; ASLAM, Muhammad Saqlain; HARFIYA, Latifa Nabila ; More...

IEICE transactions on information and systems, 09/2021, Volume E104.D, Issue 9

The recent development of deep learning-based generative models has sharply intensified the interest in data synthesis and its applications. Data synthesis...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

 

 

Wasserstein distance based Asymmetric Adversarial Domain Adaptation in intelligent bearing fault diagnosis

Y Ying, Z Jun, T Tang, W Jingwei, C Ming… - Measurement …, 2021 - iopscience.iop.org

Addressing the phenomenon of data sparsity in hostile working conditions, which leads to

performance degradation in traditional machine learning based fault diagnosis methods, a

novel Wasserstein distance based Asymmetric Adversarial Domain Adaptation (WAADA) is

proposed for unsupervised domain adaptation in bearing fault diagnosis. A GAN-based loss

and asymmetric mapping are integrated to alleviate the difficulty of the training process in

adversarial transfer learning, especially when the domain shift is serious. Moreover …

Cited by 5 Related articles All 2 versions

online Cover Image PEER-REVIEW

Wasserstein distance-based asymmetric adversarial domain adaptation in intelligent bearing fault diagnosis

by Yu, Ying; Zhao, Jun; Tang, Tang ; More...

Measurement science & technology, 11/2021, Volume 32, Issue 11

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 

Fast Wasserstein-Distance-Based Distributionally Robust ...

https://ieeexplore.ieee.org › document

by G Chen · 2021 — This paper addresses this challenge by proposing a fast power dispatch model for multi-zone HVAC systems. A distributionally robust chance- ...

online Cover Image  PEER-REVIEW

Fast Wasserstein-Distance-Based Distributionally Robust Chance-Constrained Power Dispatch for Multi-Zone HVAC Systems

by Chen, Ge; Zhang, Hongcai; Hui, Hongxun ; More...

IEEE transactions on smart grid, 09/2021, Volume 12, Issue 5

Heating, ventilation, and air-conditioning (HVAC) systems play an increasingly important role in the construction of smart cities because of their high energy...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 

 

Differential semblance optimisation based on the adaptive ...

https://academic.oup.com › jge › article

by Z Y · 2021 — Adaptive quadratic Wasserstein distance between two probability distributions. The OT theory has been studied for a long time in the ...

<——2021———2021———1340——  


Differential semblance optimisation based on the adaptive  quadratic Wasserstein distance 

by Z Yu · 2021 — Adaptive quadratic Wasserstein distance between two probability distributions. The OT theory has been studied for a long time in the ...

online Cover Image PEER-REVIEW OPEN ACCESS

Differential semblance optimisation based on the adaptive quadratic Wasserstein distance

by Yu, Zhennan; Liu, Yang

Journal of geophysics and engineering, 08/2021, Volume 18, Issue 5

Abstract As the robustness for the wave equation-based inversion methods, wave equation migration velocity analysis (WEMVA) is stable for overcoming the...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 Related articles All 2 versions
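A rough sketch of the underlying trace-wise quadratic Wasserstein misfit: make two 1-D signals nonnegative and unit-mass, then compare their quantile functions. The shift-to-positive normalization and the Ricker-wavelet test signal are generic illustrations; the adaptive transform of the cited work is not reproduced.

```python
import numpy as np

def w2_trace(f, g, t, n_q=1000):
    """Quadratic Wasserstein misfit between two 1-D signals on the time grid t:
    W2^2 = int_0^1 (F^{-1}(q) - G^{-1}(q))^2 dq, with a simple positivity shift."""
    shift = min(f.min(), g.min())
    pf = f - shift + 1e-8
    pg = g - shift + 1e-8
    Ff = np.cumsum(pf / pf.sum())            # discrete CDFs
    Fg = np.cumsum(pg / pg.sum())
    q = np.linspace(1e-6, 1.0 - 1e-6, n_q)
    inv_f = np.interp(q, Ff, t)              # quantile (inverse CDF) functions
    inv_g = np.interp(q, Fg, t)
    return float(np.mean((inv_f - inv_g) ** 2))

t = np.linspace(0.0, 1.0, 500)
def ricker(t0, f0=15.0):                     # toy Ricker wavelet used as a test trace
    a = (np.pi * f0 * (t - t0)) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

print(w2_trace(ricker(0.40), ricker(0.45), t))   # grows smoothly with the time shift
```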

 

2021 see 2020

The Wasserstein Impact Measure (WIM) - Statistics

https://www.researchgate.net › ... › Bayesian Statistics

Request PDF | The Wasserstein Impact Measure (WIM): a generally applicable, practical tool for quantifying prior impa

online Cover Image  PEER-REVIEW

The Wasserstein Impact Measure (WIM): A practical tool for quantifying prior impact in Bayesian statistics

by Ghaderinezhad, Fatemeh; Ley, Christophe; Serrien, Ben

Computational statistics & data analysis, 10/2021

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 All 2 versions

 

2021 see 2020

Equidistribution of random walks on compact groups II. The ...

https://projecteuclid.org › bernoulli › issue-4 › 21-BEJ1324

by B orda · 2021 · Cited by 2 — The proof uses a new Berry–Esseen type inequality for the p-Wasserstein metric on the torus, and the simultaneous Diophantine approximation ...

online Cover Image  PEER-REVIEW

Equidistribution of random walks on compact groups II. The Wasserstein metric

by Borda, Bence

Bernoulli : official journal of the Bernoulli Society for Mathematical Statistics and Probability, 11/2021, Volume 27, Issue 4

Article Link Read Article BrowZine Article Link Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

MR4303897 

Cited by 6 Related articles All 7 versions


MR4335738 Prelim Sun, Yue; Qiu, Ruozhen; Sun, Minghe; 

Optimizing decisions for a dual-channel retailer with service level requirements and demand uncertainties: A Wasserstein metric-based distributionally robust optimization approach. Comput. Oper. Res. 138 (2022), Paper No. 105589. 90B06

Review PDF Clipboard Journal Article

All 2 versions

MR4331435 Prelim Figalli, Alessio; Glaudo, Federico; 

An invitation to optimal transport, Wasserstein distances, and gradient flows. EMS Textbooks in Mathematics. EMS Press, Berlin, [2021], ©2021. vi + 136 pp. ISBN: 978-3-98547-010-5 49-01 (28A33 35A15 49N15 49Q22 60B05)

Review PDF Clipboard Series Book

CITATION] An Invitation to Optimal Transport, Wasserstein Distances, and Gradient Flows

A FigalliF Glaudo - 2021 - ems-ph.org

The presentation focuses on the essential topics of the theory: Kantorovich duality, existence

and uniqueness of optimal transport maps, Wasserstein distances, the JKO scheme, Otto's

calculus, and Wasserstein gradient flows. At the end, a presentation of some selected …

 Cited by 3 Related articles

2021


MR4330846 Prelim Marx, Victor; 

Infinite-dimensional regularization of McKean–Vlasov equation with a Wasserstein diffusion. Ann. Inst. Henri Poincaré Probab. Stat. 57 (2021), no. 4, 2315–2353. 60H10 (35Q83 60H15 60J60 60K35)

Review PDF Clipboard Journal Article


MR4328512 Prelim Gupta, Abhishek; Haskell, William B.; Convergence of Recursive Stochastic Algorithms Using Wasserstein Divergence. SIAM J. Math. Data Sci. 3 (2021), no. 4, 1141–1167. 90 (47J25 60J20 68Q32 93E35)

Review PDF Clipboard Journal Article


2021 see 2020

MR4324123 Prelim Carlier, Guillaume; Eichinger, Katharina; Kroshnin, Alexey; 

Entropic-Wasserstein Barycenters: PDE Characterization, Regularity, and CLT. SIAM J. Math. Anal. 53 (2021), no. 5, 5880–5914. 49Q22 (35J96 49Q15 60B12)

Review PDF Clipboard Journal Article 1 Citation

 Cited by 4 Related articles All 16 versions


MR4320448 Prelim Luo, Yihao; Zhang, Shiqiang; Cao, Yueqi; Sun, Huafei; 

Geometric Characteristics of the Wasserstein Metric on SPD(n) and Its Applications on Data Processing. Entropy 23 (2021), no. 9, Paper No. 1214. 15B48 (53Z50)

Review PDF Clipboard Journal Article


MR4318501 Prelim Huynh, Viet; Ho, Nhat; Dam, Nhan; Nguyen, XuanLong; Yurochkin, Mikhail; Bui, Hung; Phung, Dinh; 

On efficient multilevel clustering via Wasserstein distances. J. Mach. Learn. Res. 22 (2021), Paper No. 145, 43 pp. 62H30 (28A33 60B10)

Review PDF Clipboard Journal Article

 Cited by 3 Related articles All 20 versions 

<——2021———2021———1350——


2021 see 2020

MR4316832 Prelim Mei, Yu; Chen, Zhi-Ping; Ji, Bing-Bing; Xu, Zhu-Jia; Liu, Jia; 

Data-driven Stochastic Programming with Distributionally Robust Constraints Under Wasserstein Distance: Asymptotic Properties. J. Oper. Res. Soc. China 9 (2021), no. 3, 525–542. 90C15 (90C47)

Review PDF Clipboard Journal Article 1 Citation
Cited by 4
 Related articles


MR4315475 Pending Bishop, Adrian N.; Doucet, Arnaud

Network consensus in the Wasserstein metric space of probability measures. SIAM J. Control Optim. 59 (2021), no. 5, 3261–3277. 60B10 (68W15 90B10 90C08 90C48 93D50)

Review PDF Clipboard Journal Article


2021 see 2020

MR4309269 Prelim Wang, Shulei; Cai, T. Tony; Li, Hongzhe; 

Optimal Estimation of Wasserstein Distance on a Tree With an Application to Microbiome Studies. J. Amer. Statist. Assoc. 116 (2021), no. 535, 1237–1253. 62H12 (62P10 62R20)

Review PDF Clipboard Journal Article

Cited by 4 Related articles All 7 versions
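
The Wang-Cai-Li paper above concerns estimating the Wasserstein distance when the ground metric comes from a (phylogenetic) tree. As background for readers, not as the authors' estimator, the 1-Wasserstein distance between two distributions on the nodes of a weighted tree has a closed form: the sum over edges of the edge length times the absolute difference in subtree mass. A minimal sketch, using a hypothetical parent-array encoding of the tree:

# Sketch (background only, not the paper's estimator): closed-form W1 on a weighted tree.
# Hypothetical encoding: parent[i] is the parent of node i (root has parent -1),
# length[i] is the length of the edge (i, parent[i]); mu and nu are node masses summing to 1.

def tree_wasserstein(parent, length, mu, nu):
    n = len(parent)
    children = [[] for _ in range(n)]
    root = 0
    for i, p in enumerate(parent):
        if p < 0:
            root = i
        else:
            children[p].append(i)
    total = 0.0
    def subtree_diff(v):              # post-order accumulation of subtree mass differences
        nonlocal total
        d = mu[v] - nu[v]
        for c in children[v]:
            d += subtree_diff(c)
        if parent[v] >= 0:
            total += length[v] * abs(d)
        return d
    subtree_diff(root)
    return total

# tiny check on a path 0-1-2 with unit edges: mass at node 0 vs mass at node 2 gives W1 = 2
print(tree_wasserstein([-1, 0, 1], [0.0, 1.0, 1.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]))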

MR4307706 Pending Barrera, G.; Högele, M. A.; Pardo, J. C.

Cutoff thermalization for Ornstein-Uhlenbeck systems with small Lévy noise in the Wasserstein distance. J. Stat. Phys. 184 (2021), no. 3, Paper No. 27, 54 pp. 60J60 (60B10 60G51)

Review PDF Clipboard Journal Article 1 Citation


WDIBS: Wasserstein deterministic information bottleneck for ...

https://link.springer.com › article

by X Zhu · 2021 — Deterministic Information Bottleneck for State abstraction (DIBS) ... DIBS fails to balance the compression degree and decision performance.

online Cover Image PEER-REVIEW

WDIBS: Wasserstein deterministic information bottleneck for state abstraction to balance state-compression and performance

by Zhu, Xianchao; Huang, Tianyi; Zhang, Ruiyuan ; More...

Applied intelligence (Dordrecht, Netherlands), 09/2021

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 

2021


Simulation of broad-band ground motions with consistent long ...

https://academic.oup.com › gji › article

by T Okazaki · 2021 — This study explores an approach that generates consistent broad-band waveforms using past observation records, under the assumption that long- ...


[HTML] oup.com

Simulation of broad-band ground motions with consistent long-period and short-period components using the Wasserstein interpolation of acceleration envelopes

T Okazaki, H Hachiya, A Iwaki, T Maeda… - Geophysical Journal …, 2021 - academic.oup.com

… enables the introduction of a metric known as the Wasserstein distance, and (2) embed pairs of … as well as the advantage of the Wasserstein distance as a measure of dissimilarity of the …

Related articles All 5 versions

Simulation of broad-band ground motions ... - Oxford Academic
https://academic.oup.com › gji › article-abstract
by T Okazaki · 2021 — This study explores an approach that generates consistent broad-band waveforms using past observation records, under the assumption that long- ...
online Cover Image  PEER-REVIEW OPEN ACCESS

Simulation of broad-band ground motions with consistent long-period and short-period components using the Wasserstein interpolation of...

by Okazaki, Tomohisa; Hachiya, Hirotaka; Iwaki, Asako ; More...

Geophysical journal international, 07/2021, Volume 227, Issue 1

SUMMARY Practical hybrid approaches for the simulation of broad-band ground motions often combine long-period and short-period waveforms synthesized by...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

Related articles All 5 versions


Geometric Characteristics of the Wasserstein Metric on SPD(n ...

https://www.mdpi.com › ...

by Y Luo · 2021 — In this paper, by involving the Wasserstein metric on SPD(n), we obtain computationally feasible expressions for some geometric quantities, ...

online Cover Image  PEER-REVIEW OPEN ACCESS

Geometric Characteristics of the Wasserstein Metric on SPD(n) and Its Applications on Data Processing

by Luo, Yihao; Zhang, Shiqiang; Cao, Yueqi ; More...

Entropy (Basel, Switzerland), 09/2021, Volume 23, Issue 9

The Wasserstein distance, especially among symmetric positive-definite matrices, has broad and deep influences on the development of artificial intelligence...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online
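
The Wasserstein metric on SPD(n) studied by Luo et al. is, after identifying an SPD matrix with the covariance of a centered Gaussian, the Bures-Wasserstein distance d(A,B)^2 = tr(A) + tr(B) - 2 tr((A^{1/2} B A^{1/2})^{1/2}). A minimal numerical sketch of that formula (not code from the paper), assuming SciPy is available:

# Sketch: Bures-Wasserstein distance between SPD matrices A and B,
# i.e. the 2-Wasserstein distance between N(0, A) and N(0, B).
import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein(A, B):
    rA = sqrtm(A)                          # principal matrix square root
    cross = sqrtm(rA @ B @ rA)             # (A^{1/2} B A^{1/2})^{1/2}
    d2 = np.trace(A) + np.trace(B) - 2.0 * np.trace(cross)
    return np.sqrt(max(np.real(d2), 0.0))  # clip tiny negative round-off

A = np.array([[2.0, 0.3], [0.3, 1.0]])
B = np.array([[1.0, 0.0], [0.0, 1.5]])
print(bures_wasserstein(A, B))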

 

Correction to: Necessary Optimality Conditions ... - SpringerLink

https://link.springer.com › article

by B Bonnet · 2021 · Cited by 6 — Bonnet, B., Frankowska, H. Correction to: Necessary Optimality Conditions for Optimal Control Problems in Wasserstein Spaces.

online Cover Image  PEER-REVIEW OPEN ACCESS

Correction to: Necessary Optimality Conditions for Optimal Control Problems in Wasserstein Spaces

by Bonnet, Benoît; Frankowska, Hélène

Applied mathematics & optimization, 09/2021

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 

A Novel Intelligent Fault Diagnosis Method for Rolling ...

https://pubmed.ncbi.nlm.nih.gov › ...

by H Tang · 2021 — An intelligent fault diagnosis strategy for rolling bearings based on grayscale image transformation, a generative adversarial network, and a ...

online Cover Image PEER-REVIEW OPEN ACCESS

A Novel Intelligent Fault Diagnosis Method for Rolling Bearings Based on Wasserstein Generative Adversarial Network and Convolutional...

by Tang, Hongtao; Gao, Shengbo; Wang, Lei ; More...

Sensors (Basel, Switzerland), 10/2021, Volume 21, Issue 20

Rolling bearings are widely used in industrial manufacturing, and ensuring their stable and effective fault detection is a core requirement in the...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Tex

2021 see 2019

From the backward Kolmogorov PDE on the Wasserstein ...

https://www.sciencedirect.com › pii


by PEC de Raynal · 2021 · Cited by 8 — This article is a continuation of our first work [6]. We here establish some new quantitative estimates for propagation of chaos of ...

online Cover Image  PEER-REVIEW

From the backward Kolmogorov PDE on the Wasserstein space to propagation of chaos for McKean-Vlasov SDEs

by de Raynal, Paul-Eric Chaudru; Frikha, Noufel

Journal de mathématiques pures et appliquées, 10/2021

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 MR4338452

Cited by 13 Related articles All 13 versions

Spacecraft Intelligent Fault Diagnosis under Variable Working Conditions via Wasserstein Distance-Based Deep Adversarial Transfer Learning

by G Xiang — Spacecraft Intelligent Fault Diagnosis under Variable Working Conditions via Wasserstein Distance-Based Deep Adversarial Transfer Learning ...

online Cover Image  PEER-REVIEW OPEN ACCESS

Spacecraft Intelligent Fault Diagnosis under Variable Working Conditions via Wasserstein Distance-Based Deep Adversarial...

by Xiang, Gang; Tian, Kun

International journal of aerospace engineering, 10/2021, Volume 2021

In recent years, deep learning methods which promote the accuracy and efficiency of fault diagnosis task without any extra requirement of artificial feature...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 All 4 versions


2021 see 2020

Learning Disentangled Representations with ... - SpringerLink

https://link.springer.com › chapter

by B Gaujac · 2021 — Disentangled representation learning has undoubtedly benefited from ... we propose TCWAE (Total Correlation Wasserstein Autoencoder).

online PEER-REVIEW

Learning Disentangled Representations with the Wasserstein Autoencoder

by Gaujac, Benoit; Feige, Ilya; Barber, David

Machine Learning and Knowledge Discovery in Databases. Research Track, 09/2021

Disentangled representation learning has undoubtedly benefited from objective function surgery. However, a delicate balancing act of tuning is still required...

Book ChapterFull Text Online

 Cited by 3 Related articles All 7 versions

 

Wasserstein Bounds in the CLT of the MLE for the Drift ... - MDPI

https://www.mdpi.com › pdf

PDF

by K Es-Sebaiy · 2021 — Abstract: In this paper, we are interested in the rate of convergence for the central limit theorem of the maximum likelihood estimator of ...

online Cover Image PEER-REVIEW

Wasserstein Bounds in the CLT of the MLE for the Drift Coefficient of a Stochastic Partial Differential Equation

by Es-Sebaiy, Khalifa; Al-Foraih, Mishari; Alazemi, Fares

Fractal and Fractional, 10/2021, Volume 5, Issue 4

In this paper, we are interested in the rate of convergence for the central limit theorem of the maximum likelihood estimator of the drift coefficient for a...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

 

Image Inpainting Using Wasserstein Generative Adversarial ...

https://arxiv.org › cs

by D Vašata · 2021 ·  — The aim of this paper is to introduce an image inpainting model based on Wasserstein Generative Adversarial Imputation Network.

Cite as: arXiv:2106.15341

online  PEER-REVIEW OPEN ACCESS

Image Inpainting Using Wasserstein Generative Adversarial Imputation Network

by Vašata, Daniel; Halama, Tomáš; Friedjungová, Magda

Artificial Neural Networks and Machine Learning – ICANN 2021, 09/2021

Image inpainting is one of the important tasks in computer vision which focuses on the reconstruction of missing regions in an image. The aim of this paper is...

Book ChapterFull Text Online

  All 5 versions


2021

Sliced-Wasserstein Gradient Flows

C Bonet, N Courty, F Septier, L Drumetz - arXiv preprint arXiv:2110.10972, 2021 - arxiv.org

Minimizing functionals in the space of probability distributions can be done with Wasserstein gradient flows. To solve them numerically, a possible approach is to rely on the Jordan …

online  OPEN ACCESS

Sliced-Wasserstein Gradient Flows

by Bonet, Clément; Courty, Nicolas; Septier, François ; More...

10/2021

Minimizing functionals in the space of probability distributions can be done with Wasserstein gradient flows. To solve them numerically, a possible approach is...

Journal ArticleFull Text Online

 Cited by 8 Related articles All 12 versions
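
The Bonet et al. preprint builds on the sliced Wasserstein distance, which averages one-dimensional Wasserstein distances over random projection directions; each 1D distance is solved exactly by sorting the projected points. A minimal Monte Carlo sketch of the distance itself (not the authors' gradient-flow scheme), for equally weighted point clouds of the same size:

# Sketch: sliced 2-Wasserstein distance between point clouds X, Y in R^d
# (same number of points, uniform weights), estimated with random projections.
import numpy as np

def sliced_wasserstein2(X, Y, n_proj=200, seed=None):
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    theta = rng.normal(size=(n_proj, d))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)  # random directions on the sphere
    px = np.sort(X @ theta.T, axis=0)                      # sorted 1D projections
    py = np.sort(Y @ theta.T, axis=0)
    return np.sqrt(np.mean((px - py) ** 2))                # average squared 1D transport cost

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
Y = rng.normal(size=(500, 3)) + 1.0
print(sliced_wasserstein2(X, Y, seed=0))   # close to 1 for this shift by (1, 1, 1) in 3D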

Wasserstein Unsupervised Reinforcement Learning

S He, Y Jiang, H Zhang, J Shao, X Ji - arXiv preprint arXiv:2110.07940, 2021 - arxiv.org

Unsupervised reinforcement learning aims to train agents to learn a handful of policies or skills in environments without external reward. These pre-trained policies can accelerate learning when endowed with external reward, and can also be used as primitive options in hierarchical reinforcement learning. Conventional approaches of unsupervised skill discovery feed a latent variable to the agent and shed its empowerment on agent's behavior by mutual information (MI) maximization. However, the policies learned by MI-based …

Related articles All 3 versions

online OPEN ACCESS

Wasserstein Unsupervised Reinforcement Learning

by He, Shuncheng; Jiang, Yuhang; Zhang, Hongchang ; More...

10/2021

Unsupervised reinforcement learning aims to train agents to learn a handful of policies or skills in environments without external reward. These pre-trained...

Journal ArticleFull Text Online

 

Wasserstein Distance Maximizing Intrinsic Control

I Durugkar, S Hansen, S Spencer, V Mnih - arXiv preprint arXiv …, 2021 - arxiv.org

This paper deals with the problem of learning a skill-conditioned policy that acts meaningfully in the absence of a reward signal. Mutual information based objectives have shown some success in learning skills that reach a diverse set of states in this setting. These objectives include a KL-divergence term, which is maximized by visiting distinct states even if those states are not far apart in the MDP. This paper presents an approach that rewards the agent for learning skills that maximize the Wasserstein distance of their state visitation …

 Cite All 2 versions 

online OPEN ACCESS

Wasserstein Distance Maximizing Intrinsic Control

by Durugkar, Ishan; Hansen, Steven; Spencer, Stephen ; More...

10/2021

This paper deals with the problem of learning a skill-conditioned policy that acts meaningfully in the absence of a reward signal. Mutual information based...

Journal ArticleFull Text Online

Related articles All 4 versions

Wasserstein distance maximizing Intrinsic Control · SlidesLive

slideslive.com › wasserstein-distance-maximizing-intrinsic...


... Deep Reinforcement Learning; Wasserstein distance maximizing Intrinsic Control ... Wasserstein distance maximizing Intrinsic Control. Dec 6, 2021 ...

SlidesLive · 

Dec 6, 2021

Dimensionality Reduction for Wasserstein Barycenter

Z Izzo, S Silwal, S Zhou - arXiv preprint arXiv:2110.08991, 2021 - arxiv.org

The Wasserstein barycenter is a geometric construct which captures the notion of centrality among probability distributions, and which has found many applications in machine learning. However, most algorithms for finding even an approximate barycenter suffer an exponential dependence on the dimension $ d $ of the underlying space of the distributions. In order to cope with this "curse of dimensionality," we study dimensionality reduction …

Cited by 4 Related articles All 6 versions

online OPEN ACCESS

Dimensionality Reduction for Wasserstein Barycenter

by Izzo, Zachary; Silwal, Sandeep; Zhou, Samson

10/2021

The Wasserstein barycenter is a geometric construct which captures the notion of centrality among probability distributions, and which has found many...

Journal ArticleFull Text Online

 Cited by 6 Related articles All 6 versions
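
As a sanity check for the barycenter papers listed here: in one dimension the 2-Wasserstein barycenter has a closed form through quantile functions, and for empirical measures with the same number of equally weighted points it reduces to averaging order statistics. A minimal sketch (illustrative only, not taken from any of the cited papers):

# Sketch: W2 barycenter of 1D empirical measures with equal sample sizes.
# The barycenter's quantile function is the weighted average of the input
# quantile functions, so averaging sorted samples is enough.
import numpy as np

def barycenter_1d(samples, weights=None):
    sorted_samples = [np.sort(np.asarray(s)) for s in samples]
    n = len(sorted_samples[0])
    assert all(len(s) == n for s in sorted_samples), "equal sample sizes assumed"
    if weights is None:
        weights = np.full(len(sorted_samples), 1.0 / len(sorted_samples))
    return np.sum([w * s for w, s in zip(weights, sorted_samples)], axis=0)

rng = np.random.default_rng(1)
a = rng.normal(-2.0, 1.0, size=1000)
b = rng.normal(+2.0, 2.0, size=1000)
bary = barycenter_1d([a, b])
print(bary.mean(), bary.std())   # roughly 0 and 1.5: means and standard deviations average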


Lifting couplings in Wasserstein spaces

P Perrone - arXiv preprint arXiv:2110.06591, 2021 - arxiv.org

This paper makes mathematically precise the idea that conditional probabilities are analogous to path liftings in geometry. The idea of lifting is modelled in terms of the category-theoretic concept of a lens, which can be interpreted as a consistent choice of arrow liftings. The category we study is the one of probability measures over a given standard Borel space, with morphisms given by the couplings, or transport plans. The geometrical picture is even more apparent once we equip the arrows of the category with weights, which one can …

Cited by 1 Related articles All 2 versions

online OPEN ACCESS

Lifting couplings in Wasserstein spaces

by Perrone, Paolo

10/2021

This paper makes mathematically precise the idea that conditional probabilities are analogous to path liftings in geometry. The idea of lifting is modelled in...

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

<——2021———2021———1370——


[2110.02115] Wasserstein distance and metric trees - arXiv

https://arxiv.org › math

by M Mathey-Prevot · 2021 — Abstract: We study the Wasserstein (or earthmover) metric on the space P(X) of probability measures on a metric space X. We show that, ...

online OPEN ACCESS

Wasserstein distance and metric trees

by Mathey-Prevot, Maxime; Valette, Alain

10/2021

We study the Wasserstein (or earthmover) metric on the space $P(X)$ of probability measures on a metric space $X$. We show that, if a finite metric space $X$...

Journal ArticleFull Text Online

 Related articles All 5 versions

 

Inferential Wasserstein Generative Adversarial Networks

Y Chen, Q Gao, X Wang - arXiv preprint arXiv:2109.05652, 2021 - arxiv.org

Generative Adversarial Networks (GANs) have been impactful on many problems and applications but suffer from unstable training. The Wasserstein GAN (WGAN) leverages the Wasserstein distance to avoid the caveats in the minmax two-player training of GANs but has other defects such as mode collapse and lack of metric to detect the convergence. We introduce a novel inferential Wasserstein GAN (iWGAN) model, which is a principled framework to fuse auto-encoders and WGANs. The iWGAN model jointly learns an encoder …

 Cite All 2 versions 

online OPEN ACCESS

Inferential Wasserstein Generative Adversarial Networks

by Chen, Yao; Gao, Qingyi; Wang, Xiao

09/2021

Generative Adversarial Networks (GANs) have been impactful on many problems and applications but suffer from unstable training. The Wasserstein GAN (WGAN)...

Journal ArticleFull Text Online

 Cited by 4 Related articles All 5 versions

 

2021 see 2020

Sig-Wasserstein GANs for Time Series Generation

H Ni, L Szpruch, M Sabate-Vidales, B Xiao… - arXiv preprint arXiv …, 2021 - arxiv.org

Synthetic data is an emerging technology that can significantly accelerate the development and deployment of AI machine learning pipelines. In this work, we develop high-fidelity time-series generators, the SigWGAN, by combining continuous-time stochastic models with the newly proposed signature $ W_1 $ metric. The former are the Logsig-RNN models based on the stochastic differential equations, whereas the latter originates from the universal and principled mathematical features to characterize the measure induced by time series …

 Cite All 2 versions 

online OPEN ACCESS

Sig-Wasserstein GANs for Time Series Generation

by Ni, Hao; Szpruch, Lukasz; Sabate-Vidales, Marc ; More...

11/2021

Synthetic data is an emerging technology that can significantly accelerate the development and deployment of AI machine learning pipelines. In this work, we...

Journal ArticleFull Text Online

 Cited by 8 Related articles All 4 versions

Training Wasserstein GANs without gradient penalties - arXiv

https://arxiv.org › cs

by D Kwon · 2021 — Our method requires no gradient penalties nor corresponding hyperparameter tuning and is computationally more …

online OPEN ACCESS

Training Wasserstein GANs without gradient penalties

by Kwon, Dohyun; Kim, Yeoneung; Montúfar, Guido ; More...

10/2021 see videos

We propose a stable method to train Wasserstein generative adversarial networks. In order to enhance stability, we consider two objective functions using the...

Journal ArticleFull Text Online

Cited by 12 Related articles All 4 versions


Large-scale wasserstein gradient flows

https://nips.cc › Conferences › ScheduleMultitrack

We introduce a scalable method to approximate Wasserstein gradient flows, targeted to machine learning applications. Our approach relies on input-convex neural ...

Large-Scale Wasserstein Gradient Flows [in Russian] - YouTube

Cited by 23 Related articles All 9 versions

www.youtube.com › watch

Slides: https://bayesgroup.github.io/bmml_sem... Speaker: Petr Mokrov, SkolTech Wasserstein gradient flows provide a powerful means of ...

YouTube · BayesGroup.ru · 

Oct 17, 2021

 

Clustering Market Regimes using the Wasserstein Distance

B Horvath, Z Issa, A Muguruza - Available at SSRN 3947905, 2021 - papers.ssrn.com

The problem of rapid and automated detection of distinct market regimes is a topic of great interest to financial mathematicians and practitioners alike. In this paper, we outline an unsupervised learning algorithm for clustering financial time-series into a suitable number of temporal segments (market regimes).

 Cite All 5 versions

online  OPEN ACCESS

Clustering Market Regimes using the Wasserstein Distance

by Horvath, Blanka; Issa, Zacharia; Muguruza, Aitor

10/2021

The problem of rapid and automated detection of distinct market regimes is a topic of great interest to financial mathematicians and practitioners alike. In...

Journal ArticleFull Text Online

 All 5 versions


2021

Variational Wasserstein Barycenters with c-Cyclical Monotonicity

https://arxiv.org › cs

by J Chi · 2021 — The basic idea is to introduce a variational distribution as the approximation of the true continuous barycenter, so as to frame the barycenters ...

online OPEN ACCESS

Variational Wasserstein Barycenters with c-Cyclical Monotonicity

by Chi, Jinjin; Yang, Zhiyao; Ouyang, Jihong ; More...

10/2021

Wasserstein barycenter, built on the theory of optimal transport, provides a powerful framework to aggregate probability distributions, and it has increasingly...

Journal ArticleFull Text Online

 Related articles All 3 versions

Dynamical Wasserstein Barycenters for Time Series Modeling

https://pythonrepo.com › repo › kevin-c-cheng-dynami...

Oct 28, 2021 — This is the code related for the Dynamical Wasserstein Barycenter model published in Neurips 2021. To run the code and replicate the results ...

online OPEN ACCESS

Dynamical Wasserstein Barycenters for Time-series Modeling

by Cheng, Kevin C; Aeron, Shuchin; Hughes, Michael C ; More...

10/2021

Many time series can be modeled as a sequence of segments representing high-level discrete states, such as running and walking in a human activity application....

Journal ArticleFull Text Online

Cited by 2 Related articles All 5 versions

Dynamical Wasserstein Barycenters for Time-Series Modeling (video, 5:58)

Many time series can be modeled as a sequence of segments representing high-level discrete states, such as running and walking in a human ...

SlidesLive · Dec 6, 2021

A Regularized Wasserstein Framework for Graph Kernels

A Wijesinghe, Q Wang, S Gould - arXiv preprint arXiv:2110.02554, 2021 - arxiv.org

We propose a learning framework for graph kernels, which is theoretically grounded on regularizing optimal transport. This framework provides a novel optimal transport distance metric, namely Regularized Wasserstein (RW) discrepancy, which can preserve both features and structure of graphs via Wasserstein distances on features and their local variations, local barycenters and global connectivity. Two strongly convex regularization terms are introduced to improve the learning ability. One is to relax an optimal alignment …

 Related articles All 5 versions

online OPEN ACCESS

A Regularized Wasserstein Framework for Graph Kernels

by Wijesinghe, Asiri; Wang, Qing; Gould, Stephen

10/2021

We propose a learning framework for graph kernels, which is theoretically grounded on regularizing optimal transport. This framework provides a novel optimal...

Journal ArticleFull Text Online

 Related articles All 5 versions

Wasserstein Patch Prior for Image Superresolution

J Hertrich, A Houdard, C Redenbach - arXiv preprint arXiv:2109.12880, 2021 - arxiv.org

In this paper, we introduce a Wasserstein patch prior for superresolution of two- and three-dimensional images. Here, we assume that we have given (additionally to the low resolution observation) a reference image which has a similar patch distribution as the ground truth of the reconstruction. This assumption is eg fulfilled when working with texture images or material data. Then, the proposed regularizer penalizes the $ W_2 $-distance of the patch distribution of the reconstruction to the patch distribution of some reference image at different …

 Cite All 4 versions 

online  OPEN ACCESS

Wasserstein Patch Prior for Image Superresolution

by Hertrich, Johannes; Houdard, Antoine; Redenbach, Claudia

09/2021

In this paper, we introduce a Wasserstein patch prior for superresolution of two- and three-dimensional images. Here, we assume that we have given...

Journal ArticleFull Text Online

 Cited by 1 Related articles All 4 versions


https://dblp.org › rec › journals › corr › abs-2109-03431

Sep 20, 2021 — Bibliographic details on Fixed Support Tree-Sliced Wasserstein Barycenter.

online OPEN ACCESS

Fixed Support Tree-Sliced Wasserstein Barycenter

by Takezawa, Yuki; Sato, Ryoma; Kozareva, Zornitsa ; More...

09/2021

The Wasserstein barycenter has been widely studied in various fields, including natural language processing, and computer vision. However, it requires a high...

Journal ArticleFull Text Online

<——2021———2021———1380—— 



On Label Shift in Domain Adaptation via Wasserstein Distance

T Le, D Do, T Nguyen, H Nguyen, H Bui, N Ho… - arXiv preprint arXiv …, 2021 - arxiv.org

We study the label shift problem between the source and target domains in general domain adaptation (DA) settings. We consider transformations transporting the target to source domains, which enable us to align the source and target examples. Through those transformations, we define the label shift between two domains via optimal transport and develop theory to investigate the properties of DA under various DA settings (eg, closed-set, partial-set, open-set, and universal settings). Inspired from the developed theory, we …

 Related articles All 6 versions

online OPEN ACCESS

On Label Shift in Domain Adaptation via Wasserstein Distance

by Le, Trung; Do, Dat; Nguyen, Tuan ; More...

10/2021

We study the label shift problem between the source and target domains in general domain adaptation (DA) settings. We consider transformations transporting the...

Journal ArticleFull Text Online

 Related articles All 9 versions

 

A Normalized Gaussian Wasserstein Distance for Tiny Object ...

https://arxiv.org › cs

by J Wang · 2021 — We demonstrate that state-of-the-art detectors do not produce satisfactory results on tiny objects due to the lack of appearance information.

online OPEN ACCESS

A Normalized Gaussian Wasserstein Distance for Tiny Object Detection

by Wang, Jinwang; Xu, Chang; Yang, Wen ; More...

10/2021

Detecting tiny objects is a very challenging problem since a tiny object only contains a few pixels in size. We demonstrate that state-of-the-art detectors do...

Journal ArticleFull Text Online

Cited by 14 Related articles All 2 versions
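
The normalized Gaussian Wasserstein distance of Wang et al. models each bounding box as a 2D Gaussian and compares boxes through the closed-form Gaussian W2 distance, turned into a similarity by exp(-W2/C). A minimal sketch under the commonly used parametrization (center, width, height) mapped to N((cx, cy), diag((w/2)^2, (h/2)^2)); the parametrization and the constant C below are assumptions for illustration, not values taken from the paper:

# Sketch: normalized Gaussian Wasserstein distance between two axis-aligned boxes.
# For diagonal covariances the Gaussian W2 distance reduces to
# ||m1 - m2||^2 + sum_i (sigma1_i - sigma2_i)^2.
import math

def nwd(box1, box2, C=12.8):          # C is a dataset-dependent hyperparameter (illustrative)
    cx1, cy1, w1, h1 = box1
    cx2, cy2, w2, h2 = box2
    dist2 = ((cx1 - cx2) ** 2 + (cy1 - cy2) ** 2
             + ((w1 - w2) / 2.0) ** 2 + ((h1 - h2) / 2.0) ** 2)
    return math.exp(-math.sqrt(dist2) / C)

print(nwd((10, 10, 4, 4), (12, 10, 4, 4)))   # nearby tiny boxes give a similarity close to 1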

 

Generalized Bures-Wasserstein Geometry for Positive Definite ...

https://arxiv.org › math

by A Han · 2021 — This paper proposes a generalized Bures-Wasserstein (BW) Riemannian geometry for the manifold of symmetric positive definite matrices. We ...

online OPEN ACCESS

Generalized Bures-Wasserstein Geometry for Positive Definite Matrices

by Han, Andi; Mishra, Bamdev; Jawanpuria, Pratik ; More...

10/2021

This paper proposes a generalized Bures-Wasserstein (BW) Riemannian geometry for the manifold of symmetric positive definite matrices. We explore the...

Journal ArticleFull Text Online

  Related articles All 2 versions

 

A Framework for Verification of Wasserstein Adversarial Robustness

T Wegel, F Assion, D Mickisch, F Greßner - arXiv preprint arXiv …, 2021 - arxiv.org

Machine learning image classifiers are susceptible to adversarial and corruption perturbations. Adding imperceptible noise to images can lead to severe misclassifications of the machine learning model. Using $ L_p $-norms for measuring the size of the noise fails to capture human similarity perception, which is why optimal transport based distance measures like the Wasserstein metric are increasingly being used in the field of adversarial robustness. Verifying the robustness of classifiers using the Wasserstein metric can be …

 Cite All 2 versions 

online  OPEN ACCESS

A Framework for Verification of Wasserstein Adversarial Robustness

by Wegel, Tobias; Assion, Felix; Mickisch, David ; More...

10/2021

Machine learning image classifiers are susceptible to adversarial and corruption perturbations. Adding imperceptible noise to images can lead to severe...

Journal ArticleFull Text Online

 All 4 versions

 

Variance Minimization in the Wasserstein Space for Invariant ...

http://arxiv.org › cs

by G Martinet · 2021 — This method, invariant causal prediction (ICP), has a substantial computational defect -- the runtime scales exponentially with the number ...

online  OPEN ACCESS

Variance Minimization in the Wasserstein Space for Invariant Causal Prediction

by Martinet, Guillaume; Strzalkowski, Alexander; Engelhardt, Barbara E

10/2021

Selecting powerful predictors for an outcome is a cornerstone task for machine learning. However, some types of questions can only be answered by identifying...

Journal ArticleFull Text Online

 Princeton Insights (@pu_insights) / Twitter

twitter.com › pu_insights


Guillaume Martinet presents "Variance Minimization in the Wasserstein Space for Invariant Causal Prediction" with an oral at #AISTATS2022! We address the ...

Twitter · Jul 6, 2022


2021


Tangent Space and Dimension Estimation with the ... - arXiv

https://arxiv.org › math

by U Lim · 2021 — Our approach directly estimates covariance matrices locally, which simultaneously allows estimating both the tangent spaces and the intrinsic ...

online OPEN ACCESS

Tangent Space and Dimension Estimation with the Wasserstein Distance

by Lim, Uzu; Nanda, Vidit; Oberhauser, Harald

10/2021

We provide explicit bounds on the number of sample points required to estimate tangent spaces and intrinsic dimensions of (smooth, compact) Euclidean...

Journal ArticleFull Text Online

 All 4 versions

 

Backward and Forward Wasserstein Projections in Stochastic ...

https://arxiv.org › math

by YH Kim · 2021 — Abstract: We study metric projections onto cones in the Wasserstein space of probability measures, defined by stochastic

online OPEN ACCESS

Backward and Forward Wasserstein Projections in Stochastic Order

by Kim, Young-Heon; Ruan, Yuan Long

10/2021

We study metric projections onto cones in the Wasserstein space of probability measures, defined by stochastic orders. Dualities for backward and forward...

Journal ArticleFull Text Online

  All 2 versions

[2110.02753] Semi-relaxed Gromov Wasserstein divergence ...

https://arxiv.org › cs

by C Vincent-Cuaz · 2021 — We argue in this paper that this property can be detrimental for tasks such as graph dictionary or partition learning, and we relax it by ...

online OPEN ACCESS

Semi-relaxed Gromov Wasserstein divergence with applications on graphs

by Vincent-Cuaz, Cédric; Flamary, Rémi; Corneli, Marco ; More...

10/2021

Comparing structured objects such as graphs is a fundamental operation involved in many learning tasks. To this end, the Gromov-Wasserstein (GW) distance,...

Journal ArticleFull Text Online

 Cited by 7 Related articles All 9 versions

   

Exact Statistical Inference for the Wasserstein Distance ... - arXiv

https://arxiv.org › stat

by VNL Duy · 2021 ·  — In this study, we propose an exact (non-asymptotic) inference method for the Wasserstein distance inspired by the concept of conditional ...

online  OPEN ACCESS

Exact Statistical Inference for the Wasserstein Distance by Selective Inference

by Duy, Vo Nguyen Le; Takeuchi, Ichiro

09/2021

In this paper, we study statistical inference for the Wasserstein distance, which has attracted much attention and has been applied to various machine learning...

Journal ArticleFull Text Online

Cited by 4 Related articles All 2 versions
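
Several of the inference entries above, including the selective-inference paper, work with the empirical one-dimensional Wasserstein distance between two samples. For readers who only need the statistic itself (not the inference procedures of these papers), SciPy provides it directly; a minimal usage sketch:

# Sketch: empirical 1-Wasserstein distance between two univariate samples.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=2000)
y = rng.normal(0.5, 1.0, size=2000)
print(wasserstein_distance(x, y))   # close to the mean shift of 0.5 in this example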

 

A Wasserstein index of dependence for random measures

https://arxiv.org › math

by M Catalano · 2021 — Abstract: Nonparametric latent structure models provide flexible inference on distinct, yet related, groups of observations …

online OPEN ACCESS

A Wasserstein index of dependence for random measures

by Catalano, Marta; Lavenant, Hugo; Lijoi, Antonio ; More...

09/2021

Nonparametric latent structure models provide flexible inference on distinct, yet related, groups of observations. Each component of a vector of $d \ge 2$...

Journal ArticleFull Text Online

Cited by 2 Related articles All 3 versions

<——2021———2021———1390—— 


Class-conditioned Domain Generalization via Wasserstein Distributional Robust Optimization

J Wang, Y Li, L Xie, Y Xie - arXiv preprint arXiv:2109.03676, 2021 - arxiv.org

Given multiple source domains, domain generalization aims at learning a universal model that performs well on any unseen but related target domain. In this work, we focus on the domain generalization scenario where domain shifts occur among class-conditional distributions of different domains. Existing approaches are not sufficiently robust when the variation of conditional distributions given the same class is large. In this work, we extend the concept of distributional robust optimization to solve the class-conditional domain …

 Cite All 2 versions 

online  OPEN ACCESS

Class-conditioned Domain Generalization via Wasserstein Distributional Robust Optimization

by Wang, Jingge; Li, Yang; Xie, Liyan ; More...

09/2021

Given multiple source domains, domain generalization aims at learning a universal model that performs well on any unseen but related target domain. In this...

Journal ArticleFull Text Online

 

  

Active Learning on Language models in WASserstein space

https://arxiv.org › cs

by A Bastos · 2021 — Abstract: Active learning has emerged as a standard paradigm in areas with scarcity of labeled training data, such as in the medical domain.

online  OPEN ACCESS

ALLWAS: Active Learning on Language models in WASserstein space

by Bastos, Anson; Kaul, Manohar

09/2021

Active learning has emerged as a standard paradigm in areas with scarcity of labeled training data, such as in the medical domain. Language models have emerged...

Journal ArticleFull Text Online

 Related articles All 2 versions

 

Wasserstein GANs with Gradient Penalty Compute Congested ...

https://arxiv.org › cs

by T Milne · 2021 — In this paper we show for the first time that WGAN-GP compute the minimum of a different optimal transport problem, the so-called congested ...

online OPEN ACCESS

Wasserstein GANs with Gradient Penalty Compute Congested Transport

by Milne, Tristan; Nachman, Adrian

09/2021

Wasserstein GANs with Gradient Penalty (WGAN-GP) are an extremely popular method for training generative models to produce high quality synthetic data. While...

Journal ArticleFull Text Online

Cited by 1 Related articles All 3 versions
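
Several WGAN entries in this block (the Milne-Nachman paper and the "without gradient penalties" preprint above) revolve around the gradient-penalty term of WGAN-GP, whose critic loss is E[D(fake)] - E[D(real)] + lambda * E[(||grad D(x_hat)|| - 1)^2] with x_hat sampled on segments between real and fake points. A generic PyTorch sketch of that penalty, not tied to any of the cited papers:

# Sketch: standard WGAN-GP gradient penalty for a critic D.
import torch

def gradient_penalty(D, real, fake, lam=10.0):
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    x_hat = (eps * real + (1.0 - eps) * fake).requires_grad_(True)  # interpolate real/fake
    grads = torch.autograd.grad(outputs=D(x_hat).sum(), inputs=x_hat,
                                create_graph=True)[0]
    grad_norm = grads.flatten(start_dim=1).norm(2, dim=1)
    return lam * ((grad_norm - 1.0) ** 2).mean()

# critic step (schematic):
# loss_D = D(fake).mean() - D(real).mean() + gradient_penalty(D, real, fake)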

 

Physics-Driven Learning of Wasserstein GAN for Density ...

https://arxiv.org › eess

by Z Huang · 2021 — Physics-Driven Learning of Wasserstein GAN for Density Reconstruction in Dynamic Tomography. Authors:Zhishen Huang, Marc Klasky, Trevor Wilcox, ...

online  OPEN ACCESS

Physics-Driven Learning of Wasserstein GAN for Density Reconstruction in Dynamic Tomography

by Huang, Zhishen; Klasky, Marc; Wilcox, Trevor ; More...

10/2021

Object density reconstruction from projections containing scattered radiation and noise is of critical importance in many applications. Existing scatter...

Journal ArticleFull Text Online

 

 

[2110.10363] On the Wasserstein Distance Between $k - arXiv

https://arxiv.org › math

by S Benjamin · 2021 — On the Wasserstein Distance Between k-Step Probability Measures on Finite Graphs. Authors:Sophia Benjamin, Arushi Mantri, Quinn Perian.

online OPEN ACCESS

On the Wasserstein Distance Between $k$-Step Probability Measures on Finite Graphs

by Benjamin, Sophia; Mantri, Arushi; Perian, Quinn

10/2021

We consider random walks $X,Y$ on a finite graph $G$ with respective lazinesses $\alpha, \beta \in [0,1]$. Let $\mu_k$ and $\nu_k$ be the $k$-step transition...

Journal ArticleFull Text Online

 

2021

Statistical Regeneration Guarantees of the Wasserstein ... - arXiv

https://arxiv.org › stat

by A Chakrabarty · 2021 — Firstly, we provide statistical guarantees that WAE achieves the target distribution in the latent space, utilizing the Vapnik Chervonenkis ...

online  OPEN ACCESS

Statistical Regeneration Guarantees of the Wasserstein Autoencoder with Latent Space Consistency

by Chakrabarty, Anish; Das, Swagatam

10/2021

The introduction of Variational Autoencoders (VAE) has been marked as a breakthrough in the history of representation learning models. Besides having several...

Journal ArticleFull Text Online

Cited by 1 Related articles All 5 versions


online

CDC-Wasserstein generated adversarial network for locally occluded face image recognition

by Zhang, Kun; Zhang, Wenlong; Yan, Shihan ; More...

10/2021

In the practical application of wisdom education classroom teaching, students' faces may be blocked due to various factors (such as clothing, environment,...

Conference ProceedingFull Text Online

 All 3 versions

 

[2110.01141] Minimum entropy production, detailed balance ...

https://arxiv.org › cond-mat

by A Dechant · 2021 — Minimum entropy production, detailed balance and Wasserstein distance for continuous-time Markov processes

online  OPEN ACCESS

Minimum entropy production, detailed balance and Wasserstein distance for continuous-time Markov processes

by Dechant, Andreas

10/2021

We investigate the problem of minimizing the entropy production for a physical process that can be described in terms of a Markov jump dynamics. We show that,...

Journal ArticleFull Text Online

 Cited by 2 All 2 versions

 

Wasserstein Contraction Bounds on Closed Convex Domains ...

https://arxiv.org › math

by T Lekang · 2021 — This theory focuses on unconstrained SDEs with fairly restrictive assumptions on the drift terms. Typical adaptive control schemes place ...

online OPEN ACCESS

Wasserstein Contraction Bounds on Closed Convex Domains with Applications to Stochastic Adaptive Control

by Lekang, Tyler; Lamperski, Andrew

09/2021

This paper is motivated by the problem of quantitatively bounding the convergence of adaptive control methods for stochastic systems to a stationary...

Journal ArticleFull Text Online

 Cited by 1 Related articles All 5 versions

Application of Wasserstein Attraction Flows for Optimal Transport in Network Systems

F Arqué, CA Uribe, C Ocampo-Martinez - arXiv preprint arXiv:2109.09182, 2021 - arxiv.org

This paper presents a Wasserstein attraction approach for solving dynamic mass transport problems over networks. In the transport problem over networks, we start with a distribution over the set of nodes that needs to be "transported" to a target distribution accounting for the network topology. We exploit the specific structure of the problem, characterized by the computation of implicit gradient steps, and formulate an approach based on discretized flows. As a result, our proposed algorithm relies on the iterative computation of constrained …

  Related articles All 6 versions

online OPEN ACCESS

Application of Wasserstein Attraction Flows for Optimal Transport in Network Systems

by Arqué, Ferran; Uribe, César A; Ocampo-Martinez, Carlos

09/2021

This paper presents a Wasserstein attraction approach for solving dynamic mass transport problems over networks. In the transport problem over networks, we...

Journal ArticleFull Text Online

Date Added to IEEE Xplore: 01 February 2022.

<——2021———2021———1400——

[2109.04301] On the use of Wasserstein metric in topological ...

https://arxiv.org › cs

by G Cabanes · 2021 — Abstract: This paper deals with a clustering algorithm for histogram data based on a Self-Organizing Map (SOM) learning.

online  OPEN ACCESS

On the use of Wasserstein metric in topological clustering of distributional data

by Cabanes, Guénaël; Bennani, Younès; Verde, Rosanna ; More...

09/2021

This paper deals with a clustering algorithm for histogram data based on a Self-Organizing Map (SOM) learning. It combines a dimension reduction by SOM and the...

Journal ArticleFull Text Online

Cited by 2 Related articles All 2 versions

 

DeepParticle: learning invariant measure by a deep neural network minimizing Wasserstein distance on data generated from an interacting particle method

 Z Wang · 2021 — DeepParticle: learning invariant measure by a deep neural network minimizing Wasserstein distance on data generated from an interacting particle ...

online  OPEN ACCESS

DeepParticle: learning invariant measure by a deep neural network minimizing Wasserstein distance on data generated from an...

by Wang, Zhongjian; Xin, Jack; Zhang, Zhiwen

11/2021

We introduce the so called DeepParticle method to learn and generate invariant measures of stochastic dynamical systems with physical parameters based on data...

Journal ArticleFull Text Online

Related articles All 4 versions

Geometric Characteristics of the Wasserstein Metric on SPD(n) and Its Applications on Data Processing - MDPI

https://www.mdpi.com › pdf

by Y Luo · 2021 — In this paper, we derive more computationally feasible expressions in a concrete case.

online

Beijing Institute of Technology Researchers Further Understanding of Data Processing [Geometric Characteristics of the Wasserstein...

Information Technology Newsweekly, 10/2021

NewsletterFull Text Online



Speech Emotion Recognition on Small Sample Learning by ...

https://www.worldscientific.com › doi

Oct 18, 2021 — The speech emotion recognition based on the deep networks on small ... Recognition on Small Sample Learning by Hybrid WGAN-LSTM Networks.

Speech Emotion Recognition on Small Sample Learning by Hybrid WGAN-LSTM Networks

by Sun, Cunwei; Ji, Luping; Zhong, Hailing

Journal of circuits, systems, and computers, 10/2021

The speech emotion recognition based on the deep networks on small samples is often a very challenging problem in natural language processing. The massive...

Journal ArticleCitation Online


 2021 see 2020

Conditional Wasserstein GAN-based oversampling of tabular data for imbalanced learning

J Engelmann, S Lessmann - Expert Systems with Applications, 2021 - Elsevier

Class imbalance impedes the predictive performance of classification models. Popular countermeasures include oversampling minority class cases by creating synthetic examples …

 Cite Cited by 4 Related articles All 2 versions


 2021


DerainGAN: Single image deraining using wasserstein GAN

S Yadav, A Mehra, H Rohmetra, R Ratnakumar… - Multimedia Tools and …, 2021 - Springer

Rainy weather greatly affects the visibility of salient objects and scenes in the captured images and videos. The object/scene visibility varies with the type of raindrops, ie adherent …

 

Recycling Discriminator: Towards Opinion-Unaware Image Quality Assessment Using Wasserstein GAN

Y Zhu, H Ma, J Peng, D Liu, Z Xiong - Proceedings of the 29th ACM …, 2021 - dl.acm.org

Generative adversarial networks (GANs) have been extensively used for training networks that perform image generation. After training, the discriminator in GAN was not used …

 

[PDF] ieee.org

Brain Extraction From Brain MRI Images Based on Wasserstein GAN and O-Net

S Jiang, L Guo, G Cheng, X Chen, C Zhang… - IEEE Access, 2021 - ieeexplore.ieee.org

Brain extraction is an essential pre-processing step for neuroimaging analysis. It is difficult to achieve high-precision extraction from low-quality brain MRI images with artifacts and gray …

 

[PDF] arxiv.org

Physics-Driven Learning of Wasserstein GAN for Density Reconstruction in Dynamic Tomography

Z Huang, M Klasky, T Wilcox, S Ravishankar - arXiv preprint arXiv …, 2021 - arxiv.org

Object density reconstruction from projections containing scattered radiation and noise is of critical importance in many applications. Existing scatter correction and density …

 Cite All 2 versions 


Wasserstein GAN: Deep Generation Applied on Financial Time Series

M Pfenninger, S Rikli, DN Bigler - Available at SSRN 3877960, 2021 - papers.ssrn.com

Modeling financial time series is challenging due to their high volatility and unexpected happenings on the market. Most financial models and algorithms trying to fill the lack of …

<——2021———2021———1410——


Human Motion Generation using Wasserstein GAN

A Shiobara, M Murakami - 2021 5th International Conference on Digital …, 2021 - dl.acm.org

Human motion control, edit, and synthesis are important tasks to create 3D computer graphics video games or movies, because some characters act like humans in most of them …


An unsupervised unimodal registration method based on Wasserstein Gan

Y Chen, H Wan, M Zou - Nan Fang yi ke da xue xue bao= Journal of …, 2021 - europepmc.org

This paper proposes an unsupervised unimodal registration method based on Wasserstein Gan. Unlike existing deep-learning-based unimodal registration methods, the proposed method requires neither ground truth nor a preset similarity metric for training …

 Cite All 3 versions 


[HTML] nih.gov

[HTML] Unsupervised unimodal registration method based on Wasserstein Gan [in Chinese]

Chen Yu, Wan Huifan, Zou Maoyang - Journal of Southern Medical University, 2021 - ncbi.nlm.nih.gov

This paper proposes an unsupervised unimodal registration method based on Wasserstein Gan. Unlike existing deep-learning-based unimodal registration methods, the proposed method requires neither ground truth nor a preset similarity metric for training …

 [Chinese  Unsupervised single-mode registration method based on Wasserstein Gan]


[PDF] ieee.org

Wasserstein Divergence GAN With Cross-Age Identity Expert and Attribute Retainer for Facial Age Transformation

GS Hsu, RC Xie, ZT Chen - IEEE Access, 2021 - ieeexplore.ieee.org

We propose the Wasserstein Divergence GAN with an identity expert and an attribute retainer for facial age transformation. The Wasserstein Divergence GAN (WGAN-div) can …

 Cite Related articles


 http://www.opticsjournal.net › FullText

WGAN-based recognition of unbalanced terahertz spectra [in Chinese] - Chinese Optics Journals

by Zhu Rongsheng · 2021 — Wasserstein GAN for the Classification of Unbalanced THz Database ... Terahertz spectral data are real-valued; when a GAN is trained on these data, the model suffers from gradient instability and insufficient diversity.

[CITATION] Wasserstein GAN for the Classification of Unbalanced THz Database

Z Rong-sheng, S Tao, L Ying-li… - …, 2021 - OFFICE SPECTROSCOPY & …


2021


[PDF] arxiv.org

Wasserstein GANs Work Because They Fail (to Approximate the Wasserstein Distance)

J Stanczuk, C Etmann, LM Kreusser… - arXiv preprint arXiv …, 2021 - arxiv.org

Wasserstein GANs are based on the idea of minimising the Wasserstein distance between a real and a generated distribution. We provide an in-depth mathematical analysis of …

 Cite Cited by 8 Related articles All 3 versions 


[PDF] arxiv.org

Hidden Convexity of Wasserstein GANs: Interpretable Generative Models with Closed-Form Solutions

A Sahiner, T Ergen, B Ozturkler, B Bartan… - arXiv preprint arXiv …, 2021 - arxiv.org

Generative Adversarial Networks (GANs) are commonly used for modeling complex distributions of data. Both the generators and discriminators of GANs are often modeled by …

 Cite  All 2 versions 


 

[PDF] arxiv.org

Training Wasserstein GANs without gradient penalties

D Kwon, Y Kim, G Montúfar, I Yang - arXiv preprint arXiv:2110.14150, 2021 - arxiv.org

We propose a stable method to train Wasserstein generative adversarial networks. In order to enhance stability, we consider two objective functions using the $ c $-transform based on …

 Cite All 2 versions 


2021 see 2020

[PDF] arxiv.org

Sig-Wasserstein GANs for Time Series Generation

H Ni, L Szpruch, M Sabate-Vidales, B Xiao… - arXiv preprint arXiv …, 2021 - arxiv.org

Synthetic data is an emerging technology that can significantly accelerate the development and deployment of AI machine learning pipelines. In this work, we develop high-fidelity time …

 Cite All 2 versions 


[PDF] arxiv.org

Wasserstein GANs with Gradient Penalty Compute Congested Transport

T Milne, A Nachman - arXiv preprint arXiv:2109.00528, 2021 - arxiv.org

Wasserstein GANs with Gradient Penalty (WGAN-GP) are an extremely popular method for training generative models to produce high quality synthetic data. While WGAN-GP were …

 Cite All 3 versions 

<——2021———2021———1420——



[PDF] arxiv.org

Relaxed Wasserstein with applications to GANs

X Guo, J Hong, T Lin, N Yang - ICASSP 2021-2021 IEEE …, 2021 - ieeexplore.ieee.org

Wasserstein Generative Adversarial Networks (WGANs) provide a versatile class of models, which have attracted great attention in various applications. However, this framework has …

 Cite Cited by 28 Related articles All 4 versions


[PDF] annalsofrscb.ro

Wasserstein GANs for Generation of Variated Image Dataset Synthesis

KDB Mudavathu, MVPCS Rao - Annals of the Romanian Society for …, 2021 - annalsofrscb.ro

Deep learning networks required a training lot of data to get to better accuracy. Given the limited amount of data for many problems, we understand the requirement for creating the …

 Cite Related articles All 2 versions 


Minimizing Wasserstein-1 Distance by Quantile Regression for GANs Model

Y Chen, X Hou, Y Liu - Chinese Conference on Pattern Recognition and …, 2021 - Springer

Abstract: In recent years, Generative Adversarial Nets (GANs) as a kind of deep generative model has become a research focus. As a representative work of GANs model, Wasserstein …

 


arXiv:2111.06846
[pdf, ps, other] math.ST
Wasserstein convergence in Bayesian deconvolution models
Authors: Judith Rousseau, Catia Scricciolo
Abstract: We study the renowned deconvolution problem of recovering a distribution function from independent replicates (signal) additively contaminated with random errors (noise), whose distribution is known. We investigate whether a Bayesian nonparametric approach for modelling the latent distribution of the signal can yield inferences with asymptotic frequentist validity under the L1-Wasserstein metric…  More
Submitted 12 November, 2021; originally announced November 2021.
MSC Class: G3 ACM Class: G.3
All 2 versions
 


arXiv:2111.04981  [pdf, other] cs.LG

Wasserstein Adversarially Regularized Graph Autoencoder
Authors: Huidong Liang, Junbin Gao
Abstract: This paper introduces Wasserstein Adversarially Regularized Graph Autoencoder (WARGA), an implicit generative algorithm that directly regularizes the latent distribution of node embedding to a target distribution via the Wasserstein metric. The proposed method has been validated in tasks of link prediction and node clustering on real-world graphs, in which WARGA generally outperforms state-of-the-…  More
Submitted 9 November, 2021; originally announced November 2021.
Comments: 8 pages. 2021 NeurIPS OTML Workshop
Related articles All 2 versions


2021

arXiv:2111.03595  [pdf, other] math.PR
The Wasserstein distance to the Circular Law
Authors: Jonas Jalowy
Abstract: We investigate the Wasserstein distance between the empirical spectral distribution of non-Hermitian random matrices and the Circular Law. For general entry distributions, we obtain a nearly optimal rate of convergence in 1-Wasserstein distance of order n^{-1/2+ε} and we prove that the optimal rate n^{-1/2} is attained by Ginibre matrices. This shows that the expected transport cost of complex…  More
Submitted 5 November, 2021; originally announced November 2021.
Comments: 26p, 2 Figures, comments welcome!
MSC Class: 60B20 (Primary); 41A25; 49Q22; 60G55 (Secondary)
All 3 versions
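
The Jalowy preprint quantifies how fast the empirical spectral distribution of a Ginibre matrix approaches the circular law in 1-Wasserstein distance. For equal-size empirical measures in the plane, W1 is exactly a minimum-cost matching, which gives a quick (if O(n^3)) numerical illustration; a sketch under that matching formulation, not the paper's analysis:

# Sketch: empirical W1 between Ginibre eigenvalues and uniform samples on the unit disk,
# computed exactly as a minimum-cost matching (equal sample sizes, uniform weights).
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
n = 200
G = (rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))) / np.sqrt(2 * n)
eig = np.linalg.eigvals(G)                    # approximately uniform on the unit disk
r = np.sqrt(rng.uniform(size=n))
phi = rng.uniform(0.0, 2.0 * np.pi, size=n)
disk = r * np.exp(1j * phi)                   # exact uniform samples on the disk

cost = np.abs(eig[:, None] - disk[None, :])   # Euclidean ground metric in the complex plane
row, col = linear_sum_assignment(cost)
print(cost[row, col].mean())                  # empirical W1; decays roughly like n^(-1/2)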
 


arXiv:2111.03570  [pdf, ps, other] math.ST math.PR
Why the 1-Wasserstein distance is the area between the two marginal CDFs
Authors: Marco De Angelis, Ander Gray
Abstract: We elucidate why the 1-Wasserstein distance W1 coincides with the area between the two marginal cumulative distribution functions (CDFs). We first describe the Wasserstein distance in terms of copulas, and then show that W1 with the Euclidean distance is attained with the M copula. Two random variables whose dependence is given by the M copula manifest perfect (positive) dependence. If w…  More
Submitted 5 November, 2021; originally announced November 2021.
Comments: 6 pages, 1 figure, a pedagogical note
Cited by 1
 Related articles All 2 versions 
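
The identity discussed by De Angelis and Gray, W1 = integral of |F(x) - G(x)| dx over the real line, is easy to verify numerically: for two samples of equal size it must agree with the average distance between matched order statistics. A minimal sketch (illustrative, not code from the note):

# Sketch: check that W1 equals the area between the two empirical CDFs.
import numpy as np

rng = np.random.default_rng(0)
x = np.sort(rng.normal(0.0, 1.0, size=1000))
y = np.sort(rng.normal(1.0, 2.0, size=1000))

w1_sorted = np.mean(np.abs(x - y))            # order-statistics formula, equal sample sizes

grid = np.linspace(min(x[0], y[0]), max(x[-1], y[-1]), 20000)
Fx = np.searchsorted(x, grid, side="right") / x.size   # empirical CDFs on a common grid
Fy = np.searchsorted(y, grid, side="right") / y.size
w1_area = np.trapz(np.abs(Fx - Fy), grid)     # area between the CDFs

print(w1_sorted, w1_area)   # the two values agree up to the grid discretization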

arXiv:2111.02486  [pdf, ps, other] math.OC
Convex Chance-Constrained Programs with Wasserstein Ambiguity
Authors: Haoming Shen, Ruiwei Jiang
Abstract: Chance constraints yield non-convex feasible regions in general. In particular, when the uncertain parameters are modeled by a Wasserstein ball, arXiv:1806.07418 and arXiv:1809.00210 showed that the distributionally robust (pessimistic) chance constraint admits a mixed-integer conic representation. This paper identifies sufficient conditions that lead to convex feasible regions of chance constrain…  More
Submitted 3 November, 2021; originally announced November 2021.
Cited by 2 Related articles All 2 versions 


2021 [PDF] arxiv.org

A Regularized Wasserstein Framework for Graph Kernels

A Wijesinghe, Q Wang, S Gould - arXiv preprint arXiv:2110.02554, 2021 - arxiv.org

We propose a learning framework for graph kernels, which is theoretically grounded on regularizing optimal transport. This framework provides a novel optimal transport distance …

 Cite All 2 versions 

2021 [PDF] arxiv.org

On Label Shift in Domain Adaptation via Wasserstein Distance

T Le, D Do, T Nguyen, H Nguyen, H Bui, N Ho… - arXiv preprint arXiv …, 2021 - arxiv.org

We study the label shift problem between the source and target domains in general domain adaptation (DA) settings. We consider transformations transporting the target to source …

 Cite All 3 versions 

<——2021———2021———1430——


2021 [PDF] arxiv.org

Training Wasserstein GANs without gradient penalties

D Kwon, Y Kim, G Montúfar, I Yang - arXiv preprint arXiv:2110.14150, 2021 - arxiv.org

We propose a stable method to train Wasserstein generative adversarial networks. In order to enhance stability, we consider two objective functions using the $ c $-transform based on …

 Cite All 2 versions 


2021 [PDF] arxiv.org

Wasserstein Adversarially Regularized Graph Autoencoder

H Liang, J Gao - arXiv preprint arXiv:2111.04981, 2021 - arxiv.org

This paper introduces Wasserstein Adversarially Regularized Graph Autoencoder (WARGA), an implicit generative algorithm that directly regularizes the latent distribution of …

 

2021 [PDF] mdpi.com

Wasserstein Bounds in the CLT of the MLE for the Drift Coefficient of a Stochastic Partial Differential Equation

K Es-Sebaiy, M Al-Foraih, F Alazemi - Fractal and Fractional, 2021 - mdpi.com

In this paper, we are interested in the rate of convergence for the central limit theorem of the maximum likelihood estimator of the drift coefficient for a stochastic partial differential …

 Cite All 3 versions 


2021

CDC-Wasserstein generated adversarial network for locally occluded face image recognition

K Zhang, W Zhang, S Yan, J Jiang… - … Conference on Computer …, 2021 - spiedigitallibrary.org

In the practical application of wisdom education classroom teaching, students' faces may be blocked due to various factors (such as clothing, environment, lighting), resulting in low …

 Cite All 2 versions


2021  [PDF] arxiv.org

Why the 1-Wasserstein distance is the area between the two marginal CDFs

M De Angelis, A Gray - arXiv preprint arXiv:2111.03570, 2021 - arxiv.org

We elucidate why the 1-Wasserstein distance $ W_1 $ coincides with the area between the two marginal cumulative distribution functions (CDFs). We first describe the Wasserstein …

 Cite All 2 versions 


2021


Big gaps seismic data interpolation using conditional Wasserstein generative adversarial networks with gradient penalty

Wei, Q and Li, XY

Oct 2021 (Early Access) | EXPLORATION GEOPHYSICS

Regular sampled seismic data is important for seismic data processing. However, seismic data is often missing due to natural or economic reasons. Especially, when encountering big obstacles, the seismic data will be missing in big gaps, which is more difficult to be reconstructed. Conditional generative adversarial networks (cGANs) are deep-learning models learning the characteristics of the seismic data to reconstruct the missing data. In this paper, we use a conditional Wasserstein generative adversarial network (cWGAN) to interpolate the missing seismic data in big gaps. We use the Wasserstein loss function to train the network and use a gradient penalty in the WGAN (cWGAN-GP) to enforce the Lipschitz constraint. We use a pre-stack seismic dataset to assess the performance. The interpolated results and the calculated recovered signal-to-noise ratios indicate that the cWGAN-GP can recover the missing seismic traces in portions or the entire regions, and the cWGAN-GP based interpolation …

All 2 versions

 

Wasserstein and multivariate linear affine based distributionally robust optimization for CCHP-P2G scheduling considering multiple uncertainties

Wang, YW; Yang, YJ; (...); Jia, MY

Jan 15 2022 | APPLIED ENERGY 306

Power-to-gas is an emerging energy conversion technology. When integrating power-to-gas into the combined cooling, heating and power system, renewable generations can be further accommodated to synthesize natural gas, and additional revenues can be obtained by reutilizing and selling the synthesized gas. Therefore, it is necessary to address the optimal operation issue of the integrated system (Combined cooling, heating and power - Power-to-gas) for bringing the potential benefits, and thus promoting energy transition. This paper proposes a Wasserstein and multivariate linear affine based distributionally robust optimization model for the above issue considering multiple uncertainties. Specifically, the uncertain distribution of wind power and electric, thermal, cooling loads is modeled as an ambiguity set by applying the Wasserstein metric. Then, based on the ambiguity set, the proposed model with two-stage structure is established. In the first stage, system operation cost (involving the energy exchange and carbon emission costs, etc.) is minimized under the forecast information. In the second stage, for resisting the interference of multiple uncertainties, the multivariate linear affine policy models are constructed for operation rescheduling under the worst-case distribution within the ambiguity set, which is capable of adjusting flexible resources according to various random factors simultaneously. Simulations are implemented and verify that: 1) both the economic and environmental benefits of system operation are improved by integrating power-to-gas; 2) the proposed model keeps both the conservativeness and computational complexity at low levels, and its solutions enable the effective system operation in terms of cost saving, emission reduction, uncertainty resistance and renewable energy accommodation.


A Novel Intelligent Fault Diagnosis Method for Rolling Bearings Based on Wasserstein Generative Adversarial Network and Convolutional Neural Network under Unbalanced Dataset

Tang, HT; Gao, SB; (...); Pang, SB

Oct 2021 | SENSORS 21 (20)

Enriched Cited References

Rolling bearings are widely used in industrial manufacturing, and ensuring their stable and effective fault detection is a core requirement in the manufacturing process. However, it is a great challenge to achieve a highly accurate rolling bearing fault diagnosis because of the severe imbalance and distribution differences in fault data due to weak early fault features and interference from environmental noise. An intelligent fault diagnosis strategy for rolling bearings based on grayscale image transformation, a generative adversarial network, and a convolutional neural network was proposed to solve this problem. First, the original vibration signal is converted into a grayscale image. Then more training samples are generated using GANs to solve severe imbalance and distribution differences in fault data. Finally, the rolling bearing condition detection and fault identification are carried out by using SECNN. The availability of the method is substantiated by experiments on datasets with different data imbalance ratios. In addition, the superiority of this diagnosis strategy is verified by comparing it with other mainstream intelligent diagnosis techniques. The experimental result demonstrates that this strategy can reach more than 99.6% recognition accuracy even under substantial environmental noise interference or changing working conditions and has good stability in the presence of a severe imbalance in fault data.

All 8 versions 

2021 see 2020

Wasserstein contrastive representation distillation

L Chen, D Wang, Z Gan, J Liu… - Proceedings of the …, 2021 - openaccess.thecvf.com

The primary goal of knowledge distillation (KD) is to encapsulate the information of a model

learned from a teacher network into a student network, with the latter being more compact …

Cited by 6 Related articles All 4 versions 

[PDF] thecvf.com

[PDF] Wasserstein Contrastive Representation Distillation: Supplementary Material

L Chen, D Wang, Z Gan, J Liu, R Henao, L Carin - openaccess.thecvf.com

• Wide Residual Network (WRN)[20]: WRN-dw represents wide ResNet with depth d and

width factor w.• resnet [3]: We use ResNet-d to represent CIFAR-style resnet with 3 groups of

basic blocks, each with 16, 32, and 64 channels, respectively. In our experiments, resnet8x4 …


 Tracial smooth functions of non-commuting variables and the ...

https://arxiv.org › math

by D Jekel · 2021 ·  — We formulate a free probabilistic analog of the Wasserstein manifold on $\mathbb{R}^d$ (the formal Riemannian manifold of smooth probability ...

[CITATION] Tracial non-commutative smooth functions and the free Wasserstein manifold

D Jekel, W Li, D Shlyakhtenko - arXiv preprint arXiv:2101.06572, 2021

 Cited by 2

<——2021———2021———1440——


The Wasserstein Impact Measure (WIM): A practical tool for quantifying prior impact in Bayesian statistics

F Ghaderinezhad, C Ley, B Serrien - Computational Statistics & Data …, 2021 - Elsevier

The prior distribution is a crucial building block in Bayesian analysis, and its choice will

impact the subsequent inference. It is therefore important to have a convenient way to …

 All 2 versions


Considering anatomical prior information for low-dose CT image enhancement using attribute-augmented Wasserstein generative adversarial networks

Z Huang, X Liu, R Wang, J Chen, P Lu, Q Zhang… - Neurocomputing, 2021 - Elsevier

Currently, many deep learning (DL)-based low-dose CT image postprocessing technologies

fail to consider the anatomical differences in training data among different human body sites …

 Cited by 3 Related articles


2021 [PDF] arxiv.org

Minimum entropy production, detailed balance and Wasserstein distance for continuous-time Markov processes

A Dechant - arXiv preprint arXiv:2110.01141, 2021 - arxiv.org

We investigate the problem of minimizing the entropy production for a physical process that

can be described in terms of a Markov jump dynamics. We show that, without any further …

  All 2 versions 


2021  [PDF] arxiv.org

Variational Wasserstein Barycenters with c-Cyclical Monotonicity

J Chi, Z Yang, J Ouyang, X Li - arXiv preprint arXiv:2110.11707, 2021 - arxiv.org

Wasserstein barycenter, built on the theory of optimal transport, provides a powerful

framework to aggregate probability distributions, and it has increasingly attracted great …

 All 2 versions 


2021 [PDF] ieee.org

Variational Autoencoders and Wasserstein Generative Adversarial Networks for Improving the Anti-Money Laundering Process

ZY Chen, W Soliman, A Nazir, M Shorfuzzaman - IEEE Access, 2021 - ieeexplore.ieee.org

There has been much recent work on fraud and Anti Money Laundering (AML) detection

using machine learning techniques. However, most algorithms are based on supervised …

 Related articles All 2 versions


2021


2021  [PDF] arxiv.org

Geometrical aspects of entropy production in stochastic thermodynamics based on Wasserstein distance

M Nakazato, S Ito - arXiv preprint arXiv:2103.00503, 2021 - arxiv.org

We study a relationship between optimal transport theory and stochastic thermodynamics for

the Fokker-Planck equation. We show that the lower bound on the entropy production is the …

 Cited by 3 Related articles All 2 versions 


2021 [PDF] arxiv.org

Wasserstein Patch Prior for Image Superresolution

J Hertrich, A Houdard, C Redenbach - arXiv preprint arXiv:2109.12880, 2021 - arxiv.org

In this paper, we introduce a Wasserstein patch prior for superresolution of two-and three-

dimensional images. Here, we assume that we have given (additionally to the low resolution …

 All 4 versions 


2021  [PDF] arxiv.org

Physics-Driven Learning of Wasserstein GAN for Density Reconstruction in Dynamic Tomography

Z Huang, M Klasky, T Wilcox, S Ravishankar - arXiv preprint arXiv …, 2021 - arxiv.org

Object density reconstruction from projections containing scattered radiation and noise is of

critical importance in many applications. Existing scatter correction and density …

 All 2 versions 


2021

Wasserstein contrastive representation distillation

L Chen, D Wang, Z Gan, J Liu… - Proceedings of the …, 2021 - openaccess.thecvf.com

The primary goal of knowledge distillation (KD) is to encapsulate the information of a model

learned from a teacher network into a student network, with the latter being more compact …

 Cited by 3 Related articles All 4 versions 


2021

[PDF] thecvf.com

[PDF] Wasserstein Contrastive Representation Distillation: Supplementary Material

L Chen, D Wang, Z Gan, J Liu, R Henao, L Carin - openaccess.thecvf.com

• Wide Residual Network (WRN)[20]: WRN-dw represents wide ResNet with depth d and

width factor w.• resnet [3]: We use ResNet-d to represent CIFAR-style resnet with 3 groups of …

 Related articles 

<——2021———2021———1450——


2021  [PDF] arxiv.org

Tracial smooth functions of non-commuting variables and the free Wasserstein manifold

D Jekel, W Li, D Shlyakhtenko - arXiv preprint arXiv:2101.06572, 2021 - arxiv.org

We formulate a free probabilistic analog of the Wasserstein manifold on $\mathbb{R}^d$ (the formal Riemannian manifold of smooth probability densities on $\mathbb{R}^d$) …

  Related articles All 3 versions 


 

[PDF] arxiv.org

Variance Minimization in the Wasserstein Space for Invariant Causal Prediction

G Martinet, A Strzalkowski, BE Engelhardt - arXiv preprint arXiv …, 2021 - arxiv.org

Selecting powerful predictors for an outcome is a cornerstone task for machine learning.

However, some types of questions can only be answered by identifying the predictors that …

 All 2 versions 


 2021  [PDF] arxiv.org

Approximating 1-Wasserstein Distance between Persistence Diagrams by Graph Sparsification

TK Dey, S Zhang - arXiv preprint arXiv:2110.14734, 2021 - arxiv.org

Persistence diagrams (PD) s play a central role in topological data analysis. This analysis

requires computing distances among such diagrams such as the 1-Wasserstein distance …

 All 3 versions 


Speech Emotion Recognition on Small Sample Learning by Hybrid WGAN-LSTM Networks

C Sun, L Ji, H Zhong - Journal of Circuits, Systems and Computers, 2021 - World Scientific

The speech emotion recognition based on the deep networks on small samples is often a

very challenging problem in natural language processing. The massive parameters of a

deep network are much difficult to be trained reliably on small-quantity speech samples …

 Related articles

Wasserstein Coupled Graph Learning for Cross-Modal Retrieval

Y Wang, T Zhang, X Zhang, Z Cui… - Proceedings of the …, 2021 - openaccess.thecvf.com

Graphs play an important role in cross-modal image-text understanding as they characterize

the intrinsic structure which is robust and crucial for the measurement of cross-modal …

  Wasserstein Coupled Graph Learning for Cross-Modal Retrieval

Y Wang, T Zhang, X Zhang, Z Cui… - … Conference on …, 2021 - ieeexplore.ieee.org

… Then, a Wasserstein coupled dictionary, containing multiple … measurement through a

Wasserstein Graph Embedding (… , we specifically define a Wasserstein discriminant loss on the …

Wasserstein Coupled Graph Learning for Cross-Modal Retrieval
by Wang, Yun; Zhang, Tong; Zhang, Xueya ; More...
2021 IEEE/CVF International Conference on Computer Vision (ICCV), 10/2021
Graphs play an important role in cross-modal image-text understanding as they characterize the intrinsic structure which is robust and crucial for the...
Conference Proceeding  Full Text Online


2021

2021 see 2020

Infinite-dimensional regularization of McKean-Vlasov equation ...

by V Marx · 2021 · Cited by 2 — Keywords: Wasserstein diffusion; McKean–Vlasov equation; Fokker–Planck equation; ... That diffusion, which is an infinite-dimensional analogue of a Brownian ...

Infinite-dimensional regularization of McKean-Vlasov equation with a Wasserstein diffusion

By: Marx, Victor

ANNALES DE L INSTITUT HENRI POINCARE-PROBABILITES ET STATISTIQUES  Volume: ‏ 57   Issue: ‏ 4   Pages: ‏ 2315-2353   Published: ‏ NOV 2021

Get It Penn State  View Abstract

Zbl 07481287

Big gaps seismic data interpolation using conditional ... by Q Wei · 2021 — In this paper, we use a conditional Wasserstein generative adversarial network (cWGAN) to interpolate the missing seismic data in big gaps.

Big gaps seismic data interpolation using conditional Wasserstein generative adversarial networks with gradient penalty

By: Wei, Qing; Li, Xiangyang

EXPLORATION GEOPHYSICS    

Early Access: OCT 2021

Get It Penn State  View Abstract

 Cited by 1 Related articles All 2 versions

A Bismut–Elworthy inequality for a Wasserstein diffusion on ...

https://link.springer.com › article

by V Marx · 2021 ·  — We introduce in this paper a strategy to prove gradient estimates for some infinite-dimensional diffusions on L_2-Wasserstein spaces.

2021 see 2020

A Bismut-Elworthy inequality for a Wasserstein diffusion on the circle

By: Marx, Victor

STOCHASTICS AND PARTIAL DIFFERENTIAL EQUATIONS-ANALYSIS AND COMPUTATIONS    

Early Access: OCT 2021

Get It Penn State OA Green Full Text  View Abstract


Fast Wasserstein-Distance-Based Distributionally Robust ...

by G Chen · 2021 — This paper addresses this challenge by proposing a fast power dispatch model for multi-zone HVAC systems. A distributionally robust chance- ...

DOI: 10.1109/TSG.2021.30

Fast Wasserstein-Distance-Based Distributionally Robust Chance-Constrained Power Dispatch for Multi-Zone HVAC Systems

By: Chen, Ge; Zhang, Hongcai; Hui, Hongxun; et al.

IEEE TRANSACTIONS ON SMART GRID  Volume: ‏ 12   Issue: ‏ 5   Pages: ‏ 4016-4028   Published: ‏ SEP 2021

Get It Penn State  View Abstract

 Cited by 10 Related articles All 2 versions

Distributionally robust inverse covariance estimation: The Wasserstein shrinkage estimator

VA Nguyen, D Kuhn… - Operations …, 2021 - pubsonline.informs.org

We introduce a distributionally robust maximum likelihood estimation model with a

Wasserstein ambiguity set to infer the inverse covariance matrix of a p-dimensional Gaussian

random vector from n independent samples. The proposed model minimizes the worst case

(maximum) of Stein's loss across all normal reference distributions within a prescribed

Wasserstein distance from the normal distribution characterized by the sample mean and

the sample covariance matrix. We prove that this estimation problem is equivalent to a …

 Cited by 32 Related articles

Distributionally Robust Inverse Covariance Estimation: The Wasserstein Shrinkage Estimator

By: Nguyen, Viet Anh; Kuhn, Daniel; Esfahani, Peyman Mohajerin

OPERATIONS RESEARCH    

Early Access: JUL 2021

<——2021———2021———1460—

PDF) Distributionally Robust Resilient Operation of Integrated ...

https://www.researchgate.net › publication › 348300885_...

https://www.researchgate.net › publication › 348300885_...

Jan 11, 2021 — We develop a strengthened ambiguity set that incorporates both moment and Wasserstein metric information of uncertain contingencies, which ...

Distributionally Robust Resilient Operation of Integrated Energy Systems Using Moment and Wasserstein Metric for Contingencies

Associated Data

By: Zhou, Yizhou; Wei, Zhinong; Shahidehpour, Mohammad; et al.

IEEE TRANSACTIONS ON POWER SYSTEMS  Volume: ‏ 36   Issue: ‏ 4   Pages: ‏ 3574-3584   Published: ‏ JUL 2021

Get It Penn State  View Abstract

 

Reproducibility of radiomic features using network analysis and its application in Wasserstein k-means clustering

JH Oh, AP Apte, E Katsoulakis, N Riaz… - Journal of Medical …, 2021 - spiedigitallibrary.org

Purpose: The goal of this study is to develop innovative methods for identifying radiomic 

features that are reproducible over varying image acquisition settings. Approach: We 

propose a regularized partial correlation network to identify reliable and reproducible 

radiomic features. This approach was tested on two radiomic feature sets generated using 

two different reconstruction methods on computed tomography (CT) scans from a cohort of 

47 lung cancer patients. The largest common network component between the two networks …

 Related articles All 5 versions

Reproducibility of radiomic features using network analysis and its application in Wasserstein k-means clustering

By: Oh, Jung Hun; Apte, Aditya P.; Katsoulakis, Evangelia; et al.

JOURNAL OF MEDICAL IMAGING  Volume: ‏ 8   Issue: ‏ 3     Article Number: 031904   Published: ‏ MAY 2021

Get It Penn State OA Green Full Text  View Abstract
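A rough sketch of Wasserstein k-means on one-dimensional empirical feature distributions, using SciPy's 1-D Wasserstein distance. This is a generic illustration (pooling member samples is a crude stand-in for a true Wasserstein barycenter), not the authors' implementation, and the regularized partial-correlation network step is omitted.

    import numpy as np
    from scipy.stats import wasserstein_distance

    def wasserstein_kmeans(samples, k, n_iter=20, seed=0):
        # samples: list of 1-D arrays, one empirical distribution per case
        rng = np.random.default_rng(seed)
        centroids = [samples[i] for i in rng.choice(len(samples), size=k, replace=False)]
        labels = [0] * len(samples)
        for _ in range(n_iter):
            # assignment step: nearest centroid in the W1 distance
            labels = [int(np.argmin([wasserstein_distance(s, c) for c in centroids]))
                      for s in samples]
            # update step: pool member samples (crude surrogate for a W-barycenter)
            for j in range(k):
                members = [s for s, l in zip(samples, labels) if l == j]
                if members:
                    centroids[j] = np.concatenate(members)
        return labels, centroids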


 

ERA: Entity Relationship Aware Video Summarization  with Wasserstein GAN

by G Wu · 2021 — The GAN training problem is solved by introducing the Wasserstein GAN and two newly proposed video patch/score sum losses. In addition, the ...

ERA: Entity-Relationship Aware Video Summarization with Wasserstein GAN

By: Lin, Jianzhe; Wu, Guan-De; Silva, Claudio

Zenodo

DOI: ‏ http://dx.doi.org.ezaccess.libraries.psu.edu/10.5281/ZENODO.5081260

Document Type: Software

  

 

Peer-reviewed
Automatic Image Annotation Using Improved Wasserstein Generative Adversarial Networks

Authors: Jian Liu, Weisheng Wu
Article, 2021
Publication:IAENG international journal of computer science, 48, 2021, 507
Publisher:2021

 

 2021 see 2020

Two-sample Test with Kernel Projected Wasserstein Distance
https://arxiv.org › math
by J Wang · 2021 ·  — This method operates by finding the nonlinear mapping in the data space which maximizes the distance between projected distributions. In ...

Two-sample Test using Projected Wasserstein Distance

By: Wang, Jie; Gao, Rui; Xie, Yao

Conference: 2021 IEEE International Symposium on Information Theory (ISIT)

Sponsor(s): ‏Inst Elect & Elect Engineers; IEEE Informat Theory Soc

2021 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT)  Book Series: ‏ IEEE International Symposium on Information Theory   Pages: ‏ 3320-3325   Published: ‏ 2021

 Cited by 11 Related articles All 7 versions
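A minimal sketch of the underlying idea: project both samples onto a direction and compare the projected empirical distributions with a 1-D Wasserstein distance. Here the optimization over (kernel) projections studied in the paper is replaced by a crude maximum over random directions, purely for illustration.

    import numpy as np
    from scipy.stats import wasserstein_distance

    def projected_wasserstein_stat(X, Y, n_dirs=200, seed=0):
        # X, Y: (n, d) and (m, d) samples; returns the max over random unit
        # directions of the 1-D W1 distance between the projections.
        rng = np.random.default_rng(seed)
        d = X.shape[1]
        best = 0.0
        for _ in range(n_dirs):
            u = rng.normal(size=d)
            u /= np.linalg.norm(u)
            best = max(best, wasserstein_distance(X @ u, Y @ u))
        return best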

2021

 

Wasserstein distance estimates for the distributions of ... - arXiv

https://arxiv.org › stat

https://arxiv.org › stat

by JM Sanz-Serna · 2021 — Wasserstein distance estimates for the distributions of numerical approximations to ergodic stochastic differential equations. Authors:J.M. Sanz ...

Wasserstein distance estimates for the distributions of numerical approximations to ergodic stochastic differential equations

By: Sanz-Serna, Jesus Maria; Zygalakis, Konstantinos C.

JOURNAL OF MACHINE LEARNING RESEARCH  Volume: ‏ 22   Published: ‏ 2021

Get It Penn State  View Abstract

Cited by 2 Related articles All 22 versions  

Zbl 07626757

 

Variational Autoencoders and Wasserstein Generative Adversarial Networks for Improving the Anti-Money Laundering Process

ZY Chen, W Soliman, A Nazir, M Shorfuzzaman - IEEE Access, 2021 - ieeexplore.ieee.org

There has been much recent work on fraud and Anti Money Laundering (AML) detection 

using machine learning techniques. However, most algorithms are based on supervised 

techniques. Studies show that supervised techniques often have the limitation of not 

adapting well to new irregular fraud patterns when the dataset is highly imbalanced. Instead, 

unsupervised learning can have a better capability to find anomalous and irregular patterns 

in new transaction. Despite this, unsupervised techniques also have the disadvantage of not …

 Related articles All 2 versions

Variational Autoencoders and Wasserstein Generative Adversarial Networks for Improving the Anti-Money Laundering Process

By: Chen, Zhiyuan; Soliman, Waleed Mahmoud; Nazir, Amril; et al.

IEEE ACCESS  Volume: ‏ 9   Pages: ‏ 83762-83785   Published: ‏ 2021

Get It Penn State Free Full Text from Publisher View Abstract

Cited by 10 Related articles All 2 versions

 

Multi-Frame Super-Resolution Algorithm Based on a WGAN

K Ning, Z Zhang, K Han, S Han, X Zhang - IEEE Access, 2021 - ieeexplore.ieee.org

Image super-resolution reconstruction has been widely used in remote sensing, medicine 

and other fields. In recent years, due to the rise of deep learning research and the successful 

application of convolutional neural networks in the image field, the super-resolution 

reconstruction technology based on deep learning has also achieved great development. 

However, there are still some problems that need to be solved. For example, the current 

mainstream image super-resolution algorithms based on single or multiple frames pursue …

 Cited by 1 Related articles

Multi-Frame Super-Resolution Algorithm Based on a WGAN

By: Ning, Keqing; Zhang, Zhihao; Han, Kai; et al.

IEEE ACCESS  Volume: ‏ 9   Pages: ‏ 85839-85851   Published: ‏ 2021

Get It Penn State Free Full Text from Publisher View Abstract

 

Brain Extraction From Brain MRI Images Based on ...

https://ieeexplore.ieee.org › abstract › document

https://ieeexplore.ieee.org › abstract › document

by S Jiang · 2021 — To more accurately identify brain boundary, we designed a new GAN based brain extraction method, which used above O-Net as the segmentation ...

Date of Publication: 16 September 2021

Brain Extraction From Brain MRI Images Based on Wasserstein GAN and O-Net

By: Jiang, Shaofeng; Guo, Lanting; Cheng, Guangbin; et al.

IEEE ACCESS  Volume: ‏ 9   Pages: ‏ 136762-136774   Published: ‏ 2021

Get It Penn State Free Full Text from Publisher View Abstract

Cited by 2 Related articles

On Stein's Factors for Poisson Approximation in Wasserstein ...
https://link.springer.com › content › pdf

by ZW Liao · 2021 — We establish various bounds on the solutions to a Stein equation for Poisson approximation in

the Wasserstein distance with nonlinear transportation costs.


2021 see 2020 2022

On Stein's Factors for Poisson Approximation ... - Springer LINK

https://link.springer.com › article

by ZW Liao · 2021 — We establish various bounds on the solutions to a Stein equation for Poisson approximation in the Wasserstein distance with nonlinear ...

online Cover Image PEER-REVIEW

On Stein’s Factors for Poisson Approximation in Wasserstein Distance with Nonlinear Transportation Costs

by Liao, Zhong-Wei; Ma, Yutao; Xia, Aihua

Journal of theoretical probability, 09/2021

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

Zbl 07621015


Peer-reviewed
On Stein’s Factors for Poisson Approximation in Wasserstein Distance with Nonlinear Transportation Costs
Authors: Zhong-Wei Liao, Yutao Ma, Aihua Xia
Summary: Abstract: We establish various bounds on the solutions to a Stein equation for Poisson approximation in the Wasserstein distance with nonlinear transportation costs. The proofs are a refinement of those in Barbour and Xia (Bernoulli 12:943–954, 2006) using the results in Liu and Ma (Ann Inst H Poincaré Probab Stat 45:58–69, 2009). As a corollary, we obtain an estimate of Poisson approximation error measured in the L2-Wasserstein distance.
Article, 2021
Publication:Journal of Theoretical Probability, 35, 20210928, 2383
Publisher:2021

<——2021———2021———1470——

  2021 see 2020

Numeric Data Augmentation using Structural ... - ResearchGate

https://www.researchgate.net › ... › Numerics

Sep 2, 2021 — In this paper, we present an analysis on optimization and risk management in Communication Networks (CNs). The model is proposed for offline ...

Numeric Data Augmentation Using Structural Constraint Wasserstein Generative Adversarial Networks

By: Wang, Wei; Wang, Chuang; Cui, Tao; et al.

Conference: IEEE International Symposium on Circuits and Systems (ISCAS) Location: ‏ ELECTR NETWORK Date: ‏ OCT 10-21, 2020

Sponsor(s): ‏IEEE

2020 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS)  Book Series: ‏ IEEE International Symposium on Circuits and Systems   Published: ‏ 2020

Get It Penn State  View Abstract

MR4342999 Prelim Badreddine, Zeinab; Frankowska, Hélène; 

Solutions to Hamilton–Jacobi equation on a Wasserstein space. Calc. Var. Partial Differential Equations 61 (2022), no. 1, Paper No. 9. 49


2021  [PDF] arxiv.org

Lifting couplings in Wasserstein spaces

P Perrone - arXiv preprint arXiv:2110.06591, 2021 - arxiv.org

This paper makes mathematically precise the idea that conditional probabilities are

analogous to path liftings in geometry. The idea of lifting is modelled in terms of the category …

 All 2 versions 


Some inequalities on Riemannian manifolds linking Entropy, Fisher information, Stein discrepancy and Wasserstein...
by Cheng, Li-Juan; Wang, Feng-Yu; Thalmaier, Anton
08/2021
For a complete connected Riemannian manifold $M$ let $V\in C^2(M)$ be such that $\mu(d x)={\rm e}^{-V(x)} \mbox{vol}(d x)$ is a probability measure on $M$....
Journal Article  Full Text Online

arXiv:2111.08505  [pdf, ps, other]  math.ST
On Adaptive Confidence Sets for the Wasserstein Distances
Authors: Neil Deo, Thibault Randrianarisoa
Abstract: In the density estimation model, we investigate the problem of constructing adaptive honest confidence sets with radius measured in Wasserstein distance $W_p$, $p\geq 1$, and for densities with unknown regularity measured on a Besov scale. As sampling domains, we focus on the $d$-dimensional torus $\mathbb{T}^d$, in which case $1\leq p\leq 2$, and $\mathbb{R}^d$, for which $p=1$. We identify necess…  More
Submitted 17 November, 2021; v1 submitted 16 November, 2021; originally announced November 2021.
Comments: 37 pages, 3 appendices
MSC Class: 62G ACM Class: G.3

All 3 versions 

2021


2021 see 2020

 Permutation invariant networks to learn Wasserstein metrics

http://128.84.4.18 › pdf

Wasserstein distance is one of the fundamental questions in mathematical analysis. The Wasserstein metric has received


КАК С НУЛЯ РАЗРАБОТАТЬ ГЕНЕРИРУЮЩУЮ СОСТЯЗАТЕЛЬНУЮ СЕТЬ ВАССЕРШТЕЙНА (WGAN)

The Wasserstein generative adversarial network, or Wasserstein GAN, is an extension of the generative adversarial network, ... Updated Jan 2021.

[Russian: How to develop a Wasserstein generative adversarial network (WGAN) from scratch]

 

Что такое Вассерштейн Ган? - определение из техопедии

- audio - 2021 · Definition - What does Wasserstein GAN mean (

[Russian: What is Wasserstein GAN? - definition from Techopedia]


SRWGANTV: Image Super-Resolution Through Wasserstein ...

https://ieeexplore.ieee.org › document

by J Shao · 2021 — Abstract: The study of generative adversarial networks (GAN) has enormously promoted the research work on single image super-resolution (SISR) problem.

Date Added to IEEE Xplore: 29 March 2021

Date of Conference: 5-7 Jan. 2021

DOI: 10.1109/ICCRD51685.2021.9386518

SRWGANTV: Image Super-Resolution Through Wasserstein Generative Adversarial Networks with Total Variational Regularization

Shao, J; Chen, L and Wu, Y

IEEE 13th International Conference on Computer Research and Development (ICCRD)

2021 | 2021 IEEE 13TH INTERNATIONAL CONFERENCE ON COMPUTER RESEARCH AND DEVELOPMENT (ICCRD 2021) , pp.21-26

The study of generative adversarial networks (GAN) has enormously promoted the research work on the single image super-resolution (SISR) problem. SRGAN first applied GAN to SISR reconstruction, which has achieved good results. However, SRGAN sacrifices the fidelity. At the same time, it is well known that the GANs are difficult to train and the improper training fails the SISR results easily. Recently, the Wasserstein Generative Adversarial Network with gradient penalty (WGAN-GP) has been proposed to alleviate these issues at the expense of performance of the model with a relatively simple training process. However, we find that applying WGAN-GP to SISR still suffers from training instability, leading to failure to obtain a good SR result. To address this problem, we present an image super-resolution framework based on an enhanced WGAN (SRWGAN-TV). We introduce the total variational (TV) regularization term into the loss function of WGAN. The total variational (TV) regularization term can stabilize the network training and improve the quality of generated images. Experimental results on public datasets show that the proposed method achieves superior performance in both quantitative and qualitative measurements.

 Cited by 2 Related articles
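The total variation regularizer added to the WGAN objective in SRWGAN-TV is the usual isotropic image TV term; schematically (the weight $\lambda_{\mathrm{TV}}$ and the exact composition of the loss are assumptions here, not taken from the paper),

$$\mathrm{TV}(x)=\sum_{i,j}\sqrt{(x_{i+1,j}-x_{i,j})^{2}+(x_{i,j+1}-x_{i,j})^{2}},\qquad \mathcal{L}_{G}=\mathcal{L}_{\mathrm{WGAN}}+\lambda_{\mathrm{TV}}\,\mathrm{TV}\big(G(x_{\mathrm{LR}})\big),$$

which discourages spurious high-frequency artifacts in the generated high-resolution image $G(x_{\mathrm{LR}})$.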

Combining the WGAN and ResNeXt Networks to Achieve ...

https://www.spectroscopyonline.com › view › combinin.

by Y Zhao — In this method, the data are first preprocessed using convolution, the FT-IR spectral data are augmented by WGAN, and the data are finally ...



Combining the WGAN and ResNeXt Networks to Achieve Data Augmentation and Classification of the FT-IR Spectra of Strawberries

Zhao, YA; Tian, SW; (...); Xing, Y

Apr 2021 | SPECTROSCOPY 36 (4) , pp.28-40

It is essential to use deep learning algorithms for big data to implement a new generation of artificial intelligence. The effective use of deep learning methods depends largely on the number of samples. This work proposes a method combining the Wasserstein generative adversarial network (WGAN) with the specific deep learning model (ResNeXt) network to achieve data enhancement and classification of the Fourier transform infrared (FT-IR) spectra of strawberries. In this method, the data are first preprocessed using convolution, the FT-IR spectral data are augmented by WGAN, and the data are finally classified using the ResNeXt network. For the experimental investigation, 10 types of dimensionality-reduction algorithms combined with nine types of classification algorithms were used for comparing and arranging the 90 groups. The results obtained from these experiments prove that our method of using a combination of WGAN and ResNeXt is highly suitable for the classification of the IR spectra of strawberries and provides a data augmentation idea as a foundation for future research.

<——2021———2021———1480—— 


2021 see 2020

A New Data-Driven Distributionally Robust Portfolio Optimization Method Based on Wasserstein Ambiguity Set

Du, NN; Liu, YK and Liu, Y

2021 | IEEE ACCESS 9 , pp.3174-3194

Since optimal portfolio strategy depends heavily on the distribution of uncertain returns, this article proposes a new method for the portfolio optimization problem with respect to distribution uncertainty. When the distributional information of the uncertain return rate is only observable through a finite sample dataset, we model the portfolio selection problem with a robust optimization method from the data-driven perspective. We first develop an ambiguous mean-CVaR portfolio optimization model, where the ambiguous distribution set employed in the distributionally robust model is a Wasserstein ball centered within the empirical distribution. In addition, the computationally tractable equivalent model of the worst-case expectation under the uncertainty set of a cone is derived, and some theoretical conclusions of the box, budget and ellipsoid uncertainty set are obtained. Finally, to demonstrate the effectiveness of our mean-CVaR portfolio optimization method, two practical examples concerning the Chinese stock market and United States stock market are considered. Furthermore, some numerical experiments are carried out under different uncertainty sets. The proposed data-driven distributionally robust portfolio optimization method offers some advantages over the ambiguity-free stochastic optimization method. The numerical experiments illustrate that the new method is effective.


2021

A data-driven scheduling model of virtual power plant using Wasserstein distributionally robust optimization

H Liu, J Qiu, J Zhao - International Journal of Electrical Power & Energy …, 2021 - Elsevier

Distributed energy resources (DER) can be efficiently aggregated by aggregators to sell

excessive electricity to spot market in the form of Virtual Power Plant (VPP). The aggregator

schedules DER within VPP to participate in day-ahead market for maximizing its profits while …

 All 2 versions

  

A Riemannian block coordinate descent method for computing the projection robust Wasserstein distance

M Huang, S Ma, L Lai - International Conference on …, 2021 - proceedings.mlr.press

The Wasserstein distance has become increasingly important in machine learning and deep

learning. Despite its popularity, the Wasserstein distance is hard to approximate because of

the curse of dimensionality. A recently proposed approach to alleviate the curse of

dimensionality is to project the sampled data from the high dimensional probability

distribution onto a lower-dimensional subspace, and then compute the Wasserstein distance

between the projected data. However, this approach requires to solve a max-min problem …

 3 Related articles All 9 versions

A Riemannian Block Coordinate Descent Method for Computing the Projection Robust Wasserstein Distance

Huang, MH; Ma, SQ and Lai, LF

International Conference on Machine Learning (ICML)

2021 | INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139 139

The Wasserstein distance has become increasingly important in machine learning and deep learning. Despite its popularity, the Wasserstein distance is hard to approximate because of the curse of dimensionality. A recently proposed approach to alleviate the curse of dimensionality is to project the sampled data from the high dimensional probability distribution onto a lower-dimensional subspace, and then compute the Wasserstein distance between the projected data. However, this approach requires to solve a max-min problem over the Stiefel manifold, which is very challenging in practice. In this paper, we propose a Riemannian block coordinate descent (RBCD) method to solve this problem, which is based on a novel reformulation of the regularized max-min problem over the Stiefel manifold. We show that the complexity of arithmetic operations for RBCD to obtain an $\epsilon$-stationary point is $O(\epsilon^{-3})$, which is significantly better than the complexity of existing methods. Numerical results on both synthetic and real datasets demonstrate that our method is more efficient than existing methods, especially when the number of sampled data is very large.
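For context, the projection robust Wasserstein distance referred to in this entry projects both distributions onto a k-dimensional subspace before computing $W_2$ and takes the worst case over subspaces; with $U$ ranging over the Stiefel manifold $\mathrm{St}(d,k)$ of orthonormal $d\times k$ matrices (generic notation),

$$\mathcal{P}_{k}(\mu,\nu)=\max_{U\in\mathrm{St}(d,k)}\;W_{2}\big(U^{\top}_{\#}\mu,\;U^{\top}_{\#}\nu\big),$$

and, per the abstract above, the RBCD method solves a regularized reformulation of the resulting max-min problem over the Stiefel manifold.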


Wasserstein Divergence GAN With Cross-Age ... - IEEE Xplore
https://ieeexplore.ieee.org › iel7
  GS Hsu · 2021 — ABSTRACT We propose the Wasserstein Divergence GAN with an identity expert and an attribute

retainer for facial age transformation.

Wasserstein Divergence GAN With Cross-Age Identity Expert and Attribute Retainer for Facial Age Transformation

Hsu, GS; Xie, RC and Chen, ZT

2021 | IEEE ACCESS 9 , pp.39695-39706

We propose the Wasserstein Divergence GAN with an identity expert and an attribute retainer for facial age transformation. The Wasserstein Divergence GAN (WGAN-div) can better stabilize the training and lead to better target image generation. The identity expert aims to preserve the input identity at output, and the attribute retainer aims to preserve the input attribute at output. Unlike the previous works which take a specific model for identity and attribute preservation without giving a reason, both the identity expert and the attribute retainer in our proposed model are determined from a comprehensive comparison study on the state-of-the-art pretrained models. The candidate networks considered for identity preservation include the VGG-Face, VGG-Face2, LightCNN and ArcFace. The candidate backbones for making the attribute retainer are the VGG-Face, VGG-Object and DEX networks. This study offers a guidebook for choosing the appropriate modules for identity and attribute preservation. The interactions between the identity expert and attribute retainer are also extensively studied and experimented. To further enhance the performance, we augment the data by the 3DMM and explore the advantages of the additional training on cross-age datasets. The additional cross-age training is validated to make the identity expert capable of handling cross-age face recognition. The performance of our approach is justified by the desired age transformation with identity well preserved. Experiments on benchmark databases show that the proposed approach is highly competitive to state-of-the-art methods.


Wasserstein F-tests and confidence bands for the Frechet ...

https://projecteuclid.org › journals › issue-1 › 20-AOS1971

by A Petersen · 2021 — In this paper, we study a regression model with density functions as response variables under the Wasserstein geometry, with predictors being Euclidean vectors.

22 pages

WASSERSTEIN F-TESTS AND CONFIDENCE BANDS FOR THE FRECHET REGRESSION OF DENSITY RESPONSE CURVES

Petersen, A; Liu, X and Divani, AA

Feb 2021 | ANNALS OF STATISTICS 49 (1) , pp.590-611

Data consisting of samples of probability density functions are increasingly prevalent, necessitating the development of methodologies for their analysis that respect the inherent nonlinearities associated with densities. In many applications, density curves appear as functional response objects in a regression model with vector predictors. For such models, inference is key to understand the importance of density-predictor relationships, and the uncertainty associated with the estimated conditional mean densities, defined as conditional Frechet means under a suitable metric. Using the Wasserstein geometry of optimal transport, we consider the Frechet regression of density curve responses and develop tests for global and partial effects, as well as simultaneous confidence bands for estimated conditional mean densities. The asymptotic behavior of these objects is based on underlying functional central limit theorems within Wasserstein space, and we demonstrate that they are asymptotically of the correct size and coverage, with uniformly strong consistency of the proposed tests under sequences of contiguous alternatives. The accuracy of these methods, including nominal size, power and coverage, is assessed through simulations, and their utility is illustrated through a regression analysis of post-intracerebral hemorrhage hematoma densities and their associations with a set of clinical and radiological covariates.
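In generic notation, the conditional Fréchet mean density targeted by these tests and confidence bands is the minimizer of the expected squared Wasserstein distance given the covariates,

$$m_{\oplus}(x)=\operatorname*{arg\,min}_{\omega\in\mathcal{W}}\ \mathbb{E}\big[\,d_{W}^{2}(Y,\omega)\mid X=x\,\big],$$

where $\mathcal{W}$ is the space of densities equipped with the 2-Wasserstein distance $d_W$ and $Y$ is the random density response.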

 

 2021

On Stein's factors for Poisson approximation in Wasserstein ...

On Stein's Factors for Poisson Approximation in Wasserstein Distance with Nonlinear Transportation Costs

Liao, ZW; Ma, YT and Xia, AH

Sep 2021 (Early Access) | JOURNAL OF THEORETICAL PROBABILITY

Enriched Cited References

We establish various bounds on the solutions to a Stein equation for Poisson approximation in the Wasserstein distance with nonlinear transportation costs. The proofs are a refinement of those in Barbour and Xia (Bernoulli 12:943-954, 2006) using the results in Liu and Ma (Ann Inst H Poincare Probab Stat 45:58-69, 2009). As a corollary, we obtain an estimate of Poisson approximation error measured in the L-2-Wasserstein distance.


2021 see 2020

Cutoff Thermalization for Ornstein–Uhlenbeck Systems with ...

https://link.springer.com › article

by G Barrera · 2021 · Cited by 4 — This article establishes cutoff thermalization (also known as the cutoff phenomenon) for a class of generalized Ornstein–Uhlenbeck systems.

Cutoff Thermalization for Ornstein-Uhlenbeck Systems with Small Levy Noise in the Wasserstein Distance

Barrera, G; Hogele, MA and Pardo, JC

Sep 2021 | JOURNAL OF STATISTICAL PHYSICS 184 (3)

This article establishes cutoff thermalization (also known as the cutoff phenomenon) for a class of generalized Ornstein-Uhlenbeck systems $(X_t^{\epsilon}(x))_{t \ge 0}$ with $\epsilon$-small additive Levy noise and initial value $x$. The driving noise processes include Brownian motion, $\alpha$-stable Levy flights, finite intensity compound Poisson processes, and red noises, and may be highly degenerate. Window cutoff thermalization is shown under mild generic assumptions; that is, we see an asymptotically sharp infinity/0-collapse of the renormalized Wasserstein distance from the current state to the equilibrium measure $\mu^{\epsilon}$ along a time window centered on a precise $\epsilon$-dependent time scale $t_{\epsilon}$. In many interesting situations such as reversible (Levy) diffusions it is possible to prove the existence of an explicit, universal, deterministic cutoff thermalization profile. That is, for generic initial data $x$ we obtain the stronger result $W_p(X_{t_{\epsilon}+r}^{\epsilon}(x), \mu^{\epsilon}) \cdot \epsilon^{-1} \to K\, e^{-qr}$ for any $r \in \mathbb{R}$ as $\epsilon \to 0$, for some spectral constants $K, q > 0$ and any $p \ge 1$ whenever the distance is finite. The existence of this limit is characterized by the absence of nonnormal growth patterns in terms of an orthogonality condition on a computable family of generalized eigenvectors of $Q$. Precise error bounds are given. Using these results, this article provides a complete discussion of the cutoff phenomenon for the classical linear oscillator with friction subject to $\epsilon$-small Brownian motion or $\alpha$-stable Levy flights. Furthermore, we cover the highly degenerate case of a linear chain of oscillators in a generalized heat bath at low temperature.


  

2021 see 2020

https://ui.adsabs.harvard.edu › abs › abstract

by Z Shi · 2021 — In this study, a data-driven approach using dual interactive Wasserstein generative adversarial networks (DIWGAN) is developed ...

A material decomposition method for dual-energy CT via dual interactive Wasserstein generative adversarial networks

Shi, ZF; Li, HL; (...); Cheng, M

Jun 2021 | May 2021 (Early Access) | MEDICAL PHYSICS 48 (6) , pp.2891-2905


Enriched Cited References

Purpose: Dual-energy computed tomography (DECT) is highly promising for material characterization and identification, whereas reconstructed material-specific images are affected by magnified noise and beam-hardening artifacts. Although various DECT material decomposition methods have been proposed to solve this problem, the quality of the decomposed images is still unsatisfactory, particularly in the image edges. In this study, a data-driven approach using dual interactive Wasserstein generative adversarial networks (DIWGAN) is developed to improve DECT decomposition accuracy and perform edge-preserving images.


Methods: In proposed DIWGAN, two interactive generators are used to synthesize decomposed images of two basis materials by modeling the spatial and spectral correlations from input DECT reconstructed images, and the corresponding discriminators are employed to distinguish the difference between the generated images and labels. The DECT images reconstructed from high- and low-energy bins are sent to two generators separately, and each generator synthesizes one material-specific image, thereby ensuring the specificity of the network modeling. In addition, the information from different energy bins is exploited through the feature sharing of two generators. During decomposition model training, a hybrid loss function including L-1 loss, edge loss, and adversarial loss is incorporated to preserve the texture and edges in the generated images. Additionally, a selector is employed to define the generator that should be trained in each iteration, which can ensure the modeling ability of two different generators and improve the material decomposition accuracy. The performance of the proposed method is evaluated using digital phantom, XCAT phantom, and real data from a mouse.


Results: On the digital phantom, the regions of bone and soft tissue are strictly and accurately separated using the trained decomposition model. The material densities in different bone and soft-tissue regions are near the ground truth, and the error of material densities is lower than 3 mg/ml. The results from the XCAT phantom show that the material-specific images generated by direct matrix inversion and iterative decomposition methods have severe noise and artifacts. Regarding the learning-based methods, the decomposed images of the fully convolutional network (FCN) and butterfly network (Butterfly-Net) still contain varying degrees of artifacts, while the proposed DIWGAN can yield high quality images. Compared to Butterfly-Net, the root-mean-square error (RMSE) of soft-tissue images generated by the DIWGAN decreased by 0.01 g/ml, whereas the peak-signal-to-noise ratio (PSNR) and structural similarity (SSIM) of the soft-tissue images reached 31.43 dB and 0.9987, respectively. The mass densities of the decomposed materials are nearest to the ground truth when using the DIWGAN method. The noise standard deviation of the decomposition images reduced by 69%, 60%, 33%, and 21% compared with direct matrix inversion, iterative decomposition, FCN, and Butterfly-Net, respectively. Furthermore, the performance on the mouse data indicates the potential of the proposed material decomposition method on real scanned data.

Conclusions: A DECT material decomposition method based on deep learning is proposed, and the relationship between reconstructed and material-specific images is mapped by training the DIWGAN model. Results from both the simulation phantoms and real data demonstrate the advantages of this method in suppressing noise and beam-hardening artifacts. (C) 2021 American Association of Physicists in Medicine


Geometric Characteristics of the Wasserstein Metric on ... - MDPI

https://www.mdpi.com › pdf

PDF

by Y Luo · 2021 — In this paper, by involving the Wasserstein metric on SPD(n), we obtain computationally feasible expressions for some geometric quantities, ...


Geometric Characteristics of the Wasserstein Metric on SPD(n) and Its Applications on Data Processing

Luo, YH; Zhang, SQ; (...); Sun, HF

Sep 2021 | ENTROPY 23 (9)


Enriched Cited References

The Wasserstein distance, especially among symmetric positive-definite matrices, has broad and deep influences on the development of artificial intelligence (AI) and other branches of computer science. In this paper, by involving the Wasserstein metric on SPD(n), we obtain computationally feasible expressions for some geometric quantities, including geodesics, exponential maps, the Riemannian connection, Jacobi fields and curvatures, particularly the scalar curvature. Furthermore, we discuss the behavior of geodesics and prove that the manifold is globally geodesic convex. Finally, we design algorithms for point cloud denoising and edge detecting of a polluted image based on the Wasserstein curvature on SPD(n). The experimental results show the efficiency and robustness of our curvature-based methods.
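For context, the Wasserstein (Bures-Wasserstein) metric on SPD(n) used in this line of work coincides with the $L^2$-Wasserstein distance between centered Gaussians whose covariance matrices are the two points:

$$W_{2}^{2}(A,B)=\operatorname{tr}(A)+\operatorname{tr}(B)-2\operatorname{tr}\!\big[(A^{1/2}BA^{1/2})^{1/2}\big],\qquad A,B\in\mathrm{SPD}(n).$$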


 2021 see 2022

 Wasserstein based two-stage distributionally robust ...

Wasserstein and multivariate linear affine based distributionally robust optimization for CCHP-P2G scheduling considering multiple uncertainties

Wang, YW; Yang, YJ; (...); Jia, MY

Jan 15 2022 | APPLIED ENERGY 306

Power-to-gas is an emerging energy conversion technology. When integrating power-to-gas into the combined cooling, heating and power system, renewable generations can be further accommodated to synthesize natural gas, and additional revenues can be obtained by reutilizing and selling the synthesized gas. Therefore, it is necessary to address the optimal operation issue of the integrated system (combined cooling, heating and power with power-to-gas) for bringing the potential benefits, and thus promoting energy transition. This paper proposes a Wasserstein and multivariate linear affine based distributionally robust optimization model for the above issue considering multiple uncertainties. Specifically, the uncertain distribution of wind power and electric, thermal, cooling loads is modeled as an ambiguity set by applying the Wasserstein metric. Then, based on the ambiguity set, the proposed model with two-stage structure is established. In the first stage, system operation cost (involving the energy exchange and carbon emission costs, etc.) is minimized under the forecast information. In the second stage, for resisting the interference of multiple uncertainties, the multivariate linear affine policy models are constructed for operation rescheduling under the worst-case distribution within the ambiguity set, which is capable of adjusting flexible resources according to various random factors simultaneously. Simulations are implemented and verify that: 1) both the economic and environmental benefits of system operation are improved by integrating power-to-gas; 2) the proposed model keeps both the conservativeness and computational complexity at low levels, and its solutions enable the effective system operation in terms of cost saving, emission reduction, uncertainty resistance and renewable energy accommodation.

<——2021———2021———1490——


2021 see 2020

Illumination-Invariant Flotation Froth Color Measuring via Wasserstein Distance-

Illumination-Invariant Flotation Froth Color Measuring via Wasserstein Distance-Based CycleGAN With Structure-Preserving Constraint

Liu, JP; He, JZ; (...); Niyoyita, JP

Feb 2021 | IEEE TRANSACTIONS ON CYBERNETICS 51 (2) , pp.839-852

Froth color can be referred to as a direct and instant indicator to the key flotation production index, for example, concentrate grade. However, it is intractable to measure the froth color robustly due to the adverse interference of time-varying and uncontrollable multisource illuminations in the flotation process monitoring. In this article, we proposed an illumination-invariant froth color measuring method by solving a structure-preserved image-to-image color translation task via an introduced Wasserstein distance-based structure-preserving CycleGAN, called WDSPCGAN. WDSPCGAN is comprised of two generative adversarial networks (GANs), which have their own discriminators but share two generators, using an improved U-net-like full convolution network to conduct the spatial structure-preserved color translation. By an adversarial game training of the two GANs, WDSPCGAN can map the color domain of froth images under any illumination to that of the referencing illumination, while maintaining the structure and texture invariance. The proposed method is validated on two public benchmark color constancy datasets and applied to an industrial bauxite flotation process. The experimental results show that WDSPCGAN can achieve illumination-invariant color features of froth images under various unknown lighting conditions while keeping their structures and textures unchanged. In addition, WDSPCGAN can be updated online to ensure its adaptability to any operational conditions. Hence, it has the potential for being popularized to the online monitoring of the flotation concentrate grade.


2021

Dissimilarity measure of local structure in inorganic crystals ...

Dissimilarity measure of local structure in inorganic crystals using Wasserstein distance to search for novel phosphors

Takemura, S; Takeda, T; (...); Hirosaki, N

Apr 21 2021 | SCIENCE AND TECHNOLOGY OF ADVANCED MATERIALS 22 (1) , pp.185-193

Enriched Cited References

To efficiently search for novel phosphors, we propose a dissimilarity measure of local structure using the Wasserstein distance. This simple and versatile method provides the quantitative dissimilarity of a local structure around a center ion. To calculate the Wasserstein distance, the local structures in crystals are numerically represented as a bag of interatomic distances. The Wasserstein distance is calculated for various ideal structures and local structures in known phosphors. The variation of the Wasserstein distance corresponds to the structural variation of the local structures, and the Wasserstein distance can quantitatively explain the dissimilarity of the local structures. The correlation between the Wasserstein distance and the full width at half maximum suggests that candidates for novel narrow-band phosphors can be identified by crystal structures that include local structures with small Wasserstein distances to local structures of known narrow-band phosphors. The quantitative dissimilarity using the Wasserstein distance is useful in the search of novel phosphors and expected to be applied in materials searches in other fields in which local structures play an important role.

Dissimilarity measure of local structure in inorganic crystals using Wasserstein distance to search for novel phosphors

by Takemura, Shota; Takeda, Takashi; Nakanishi, Takayuki ; More...

Science and technology of advanced materials, 03/2021

To efficiently search for novel phosphors, we propose a dissimilarity measure of local structure using the Wasserstein distance. This simple and versatile...

Article PDF Download PDF 

Journal ArticleFull Text Online

Cited by 5 Related articles All 7 versions
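A minimal sketch of the dissimilarity measure described above, representing each local environment as a bag of interatomic distances from the center ion and comparing two bags with the 1-D Wasserstein distance. The cutoff radius and the absence of weighting are assumptions for illustration, not the authors' exact protocol.

    import numpy as np
    from scipy.stats import wasserstein_distance

    def distance_bag(coords, center_idx, cutoff=5.0):
        # Bag of interatomic distances from the center ion to neighbors within `cutoff`.
        d = np.linalg.norm(coords - coords[center_idx], axis=1)
        return np.sort(d[(d > 0.0) & (d <= cutoff)])

    def local_dissimilarity(coords_a, idx_a, coords_b, idx_b, cutoff=5.0):
        # W1 distance between the two bags of interatomic distances.
        return wasserstein_distance(distance_bag(coords_a, idx_a, cutoff),
                                    distance_bag(coords_b, idx_b, cutoff))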

Necessary Optimality Conditions for Optimal Control Problems ...

Necessary Optimality Conditions for Optimal Control Problems in Wasserstein Spaces (Sept, 10.1007/s00245-021-09772-w, 2021)

Bonnet, B and Frankowska, H

Sep 2021 (Early Access) | APPLIED MATHEMATICS AND OPTIMIZATION

Get It Penn StateFree Full Text From Publisher

 Cited by 8 Related articles All 12 versions

 Zbl 07498421


Correction to: Necessary Optimality Conditions ... - SpringerLink

https://link.springer.com › 10.1007 › s00245-021-09811-6

by B Bonnet · 2021 · Cited by 6 — Bonnet, B., Frankowska, H. Correction to: Necessary Optimality Conditions for Optimal Control Problems in Wasserstein Spaces.

Zbl 07498406

https://www.tandfonline.com › ... › Volume 91, Issue 13

by GI Papayiannis · 2021 — On clustering uncertain and structured data with Wasserstein barycenters and a geodesic criterion for the number of clusters.


On clustering uncertain and structured data ... - ResearchGate

https://www.researchgate.net › publication › 350506517_...

Oct 1, 2021 — Request PDF | On clustering uncertain and structured data with Wasserstein barycenters and a geodesic criterion for the number of clusters ...

On clustering uncertain and structured data with Wasserstein barycenters and a geodesic criterion for the number of clusters

Papayiannis, GI; Domazakis, GN; (...); Yannacopoulos, AN

Sep 2 2021 | Mar 2021 (Early Access) | JOURNAL OF STATISTICAL COMPUTATION AND SIMULATION 91 (13) , pp.2569-2594

Enriched Cited References

Clustering schemes for uncertain and structured data are considered relying on the notion of Wasserstein barycenters, accompanied by appropriate clustering indices based on the intrinsic geometry of the Wasserstein space. Such type of clustering approaches are highly appreciated in many fields where the observational/experimental error is significant or the data nature is more complex and the traditional learning algorithms are not applicable or effective to treat. Under this perspective, each observation is identified by an appropriate probability measure and the proposed clustering schemes rely on discrimination criteria that utilize the geometric structure of the space of probability measures through core techniques from the optimal transport theory. The advantages and capabilities of the proposed approach and the geodesic criterion performance are illustrated through a simulation study and the implementation in two different applications: (a) clustering eurozone countries' bond yield curves and (b) classifying satellite images to certain land uses categories.

Cite Related articles All 3 versions
Zbl 07497103


2021

arXiv:2111.09721  [pdf, ps, other]  math.ST
Bounds in $L^1$ Wasserstein distance on the normal approximation of general M-estimators
Authors: François Bachoc, Max Fathi
Abstract: We derive quantitative bounds on the rate of convergence in $L^1$ Wasserstein distance of general M-estimators, with an almost sharp (up to a logarithmic term) behavior in the number of observations. We focus on situations where the estimator does not have an explicit expression as a function of the data. The general method may be applied even in situations where the observations are not independe…  More
Submitted 18 November, 2021; originally announced November 2021.


Model-agnostic bias mitigation methods with regressor distribution control for Wasserstein-...
by Miroshnikov, Alexey; Kotsiopoulos, Konstandinos; Franks, Ryan ; More...
11/2021
This article is a companion paper to our earlier work Miroshnikov et al. (2021) on fairness interpretability, which introduces bias explanations. In the...
Journal Article  Full Text Online
arXiv:2111.11259  [pdf, other]  cs.LG  math.PR
Model-agnostic bias mitigation methods with regressor distribution control for Wasserstein-based fairness metrics
Authors: Alexey Miroshnikov, Konstandinos Kotsiopoulos, Ryan Franks, Arjun Ravi Kannan
Abstract: This article is a companion paper to our earlier work Miroshnikov et al. (2021) on fairness interpretability, which introduces bias explanations. In the current work, we propose a bias mitigation methodology based upon the construction of post-processed models with fairer regressor distributions for Wasserstein-based fairness metrics. By identifying the list of predictors contributing the most to…  More
Submitted 19 November, 2021; originally announced November 2021.
Comments: 29 pages, 32 figures
MSC Class: 49Q22; 91A12; 68T01
Related articles All 2 versions 


arXiv:2111.10406  [pdf, other]  math.ST
Convergence rates for Metropolis-Hastings algorithms in the Wasserstein distance
Authors: Austin Brown, Galin L. Jones
Abstract: We develop necessary conditions for geometrically fast convergence in the Wasserstein distance for Metropolis-Hastings algorithms on $\mathbb{R}^d$ when the metric used is a norm. This is accomplished through a lower bound which is of independent interest. We show exact convergence expressions in more general Wasserstein distances (e.g. total variation) can be achieved for a large class of distrib…  More
Submitted 19 November, 2021; originally announced November 2021.

All 2 versions 

Oversampling Imbalanced Data Based on Convergent WGAN ...  

online Cover Image  PEER-REVIEW OPEN ACCESS

Oversampling Imbalanced Data Based on Convergent WGAN for Network Threat Detection

by Xu, Yanping; Zhang, Xiaoyu; Qiu, Zhenliang ; More...

Security and communication networks, 11/2021, Volume 2021

Class imbalance is a common problem in network threat detection. Oversampling the minority class is regarded as a popular countermeasure by generating enough...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 

     

Wasserstein convergence in Bayesian deconvolution models

J Rousseau, C Scricciolo - arXiv preprint arXiv:2111.06846, 2021 - arxiv.org

We study the reknown deconvolution problem of recovering a distribution function from

independent replicates (signal) additively contaminated with random errors (noise), whose

distribution is known. We investigate whether a Bayesian nonparametric approach for

modelling the latent distribution of the signal can yield inferences with asymptotic frequentist

validity under the $L^1$-Wasserstein metric. When the error density is ordinary smooth, we

develop two inversion inequalities relating either the $L^1$ or the $L^1$-Wasserstein …

Related articles All 5 versions

online  OPEN ACCESS

Wasserstein convergence in Bayesian deconvolution models

by Rousseau, Judith; Scricciolo, Catia

11/2021

We study the renowned deconvolution problem of recovering a distribution function from independent replicates (signal) additively contaminated with random...

Journal ArticleFull Text Online

Related articles All 5 versions

<——2021———2021———1500—— 


[2111.03595] The Wasserstein distance to the Circular Law

https://arxiv.org › math

by J Jalowy · 2021 — We investigate the Wasserstein distance between the empirical spectral distribution of non-Hermitian random matrices …

online OPEN ACCESS

The Wasserstein distance to the Circular Law

by Jalowy, Jonas

11/2021

We investigate the Wasserstein distance between the empirical spectral distribution of non-Hermitian random matrices and the Circular Law. For general entry...

Journal ArticleFull Text Online


Convex Chance-Constrained Programs with Wasserstein Ambiguity

H Shen, R Jiang - arXiv preprint arXiv:2111.02486, 2021 - arxiv.org

Chance constraints yield non-convex feasible regions in general. In particular, when the uncertain parameters are modeled by a Wasserstein ball, arXiv:1806.07418 and arXiv:1809.00210 showed that the distributionally robust (pessimistic) chance constraint admits a mixed-integer conic representation. This paper identifies sufficient conditions that lead to convex feasible regions of chance constraints with Wasserstein ambiguity. First, when uncertainty arises from the left-hand side of a pessimistic individual chance constraint, we …

 All 2 versions 

online   OPEN ACCESS

Convex Chance-Constrained Programs with Wasserstein Ambiguity

by Shen, Haoming; Jiang, Ruiwei

11/2021

Chance constraints yield non-convex feasible regions in general. In particular, when the uncertain parameters are modeled by a Wasserstein ball,...

Journal ArticleFull Text Online

 All 2 versions


2021 see 2020

Infinite-dimensional regularization of McKean ... - Project Euclid

https://projecteuclid.org › issue-4 › 20-AIHP1136

by V Marx · 2021 · Cited by 2 — Keywords: Wasserstein diffusion; McKean–Vlasov equation; Fokker–Planck equation; ... That diffusion, which is an infinite-dimensional analogue of a Brownian ...

39 pages

Cover Image

Infinite-dimensional regularization of McKean–Vlasov equation with a Wasserstein diffusion

by Marx, Victor

Annales de l'I.H.P. Probabilités et statistiques, 11/2021, Volume 57, Issue 4

Journal ArticleCitation Online

Cited by 2 Related articles All 9 versions

2021

On Adaptive Confidence Sets for the Wasserstein Distances

N Deo, T Randrianarisoa - arXiv preprint arXiv:2111.08505, 2021 - arxiv.org

In the density estimation model, we investigate the problem of constructing adaptive honest confidence sets with radius measured in Wasserstein distance $W_p$, $p\geq 1$, and for densities with unknown regularity measured on a Besov scale. As sampling domains, we focus on the $d$-dimensional torus $\mathbb{T}^d$, in which case $1\leq p\leq 2$, and $\mathbb{R}^d$, for which $p=1$. We identify necessary and sufficient conditions for the existence of adaptive confidence sets with diameters of the order of the regularity-dependent …

 All 2 versions 



online  OPEN ACCESS

On Adaptive Confidence Sets for the Wasserstein Distances

by Deo, Neil; Randrianarisoa, Thibault

11/2021

In the density estimation model, we investigate the problem of constructing adaptive honest confidence sets with radius measured in Wasserstein distance $W_p$,...

Journal ArticleFull Text Online
Related articles All 3 versions


[HTML] nih.gov

[HTML] 基于 Wasserstein Gan 的无监督单模配准方法

陈宇, 万辉帆, 邹茂扬 - Journal of Southern Medical University, 2021 - ncbi.nlm.nih.gov

This paper proposes an unsupervised mono-modal image registration method based on Wasserstein GAN. Unlike existing deep-learning-based mono-modal registration methods, training the proposed method requires neither ground truth nor a preset similarity metric …

[Chinese  Unsupervised single-mode registration method based on Wasserstein Gan]


[PDF] mlr.press

Smooth p-Wasserstein Distance: Structure, Empirical Approximation, and Statistical Applications

S Nietert, Z Goldfeld, K Kato - International Conference on …, 2021 - proceedings.mlr.press

Discrepancy measures between probability distributions, often termed statistical distances, are ubiquitous in probability theory, statistics and machine learning. To combat the curse of …

 Cited by 12 Related articles All 2 versions 

Multiplier bootstrap for Bures-Wasserstein barycenters

Kroshnin, Alexey; Spokoiny, Vladimir; Suvorikova, Alexandra. arXiv.org; Ithaca, Nov 24, 2021.

Abstract/DetailsGet full textLink to external site, this link will open in a new window

Show Abstract 

 Cited by 1 Related articles All 4 versions 

<——2021———2021———1510—— 

 

DeepParticle: learning invariant measure by a deep neural network minimizing Wasserstein distance on data generated from an interacting particle method

Wang, Zhongjian; Xin, Jack; Zhang, Zhiwen. arXiv.org; Ithaca, Nov 21, 2021.

Related articles All 4 versions

[PDF] arxiv.org

DeepParticle: learning invariant measure by a deep neural network minimizing Wasserstein distance on data generated from an interacting particle method

Z Wang, J Xin, Z Zhang - arXiv preprint arXiv:2111.01356, 2021 - arxiv.org

… to minimize a discrete Wasserstein distance between the input and target samples. To reduce computational cost, we propose an iterative divide-and-conquer (a mini-batch interior point) algorithm, to find the optimal transition matrix in the Wasserstein distance. We present …

All 4 versions
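The sub-problem the snippet refers to, finding an optimal transition (transport) matrix between two discrete samples, can be written as a small linear program. A minimal sketch, not the authors' mini-batch interior-point scheme, using scipy.optimize.linprog; sizes and data are illustrative assumptions.

# Minimal sketch (assumed toy setup, not the paper's algorithm): the discrete
# optimal transport plan between two weighted 1-D point clouds via a generic LP.
import numpy as np
from scipy.optimize import linprog

def discrete_ot(x, y, a, b):
    """Return (W1 cost, transport plan) between sum_i a_i delta_{x_i} and sum_j b_j delta_{y_j}."""
    n, m = len(x), len(y)
    cost = np.abs(x[:, None] - y[None, :])           # ground cost |x_i - y_j|
    A_eq = np.zeros((n + m, n * m))
    for i in range(n):
        A_eq[i, i * m:(i + 1) * m] = 1.0             # row sums: sum_j P_ij = a_i
    for j in range(m):
        A_eq[n + j, j::m] = 1.0                      # column sums: sum_i P_ij = b_j
    res = linprog(cost.ravel(), A_eq=A_eq, b_eq=np.concatenate([a, b]),
                  bounds=(0, None), method="highs")
    return res.fun, res.x.reshape(n, m)

rng = np.random.default_rng(1)
x, y = rng.normal(size=6), rng.normal(loc=1.0, size=5)
a, b = np.full(6, 1 / 6), np.full(5, 1 / 5)
w1, plan = discrete_ot(x, y, a, b)
print("W1 estimate:", round(w1, 3))

For large samples this LP becomes expensive, which is exactly why the cited work resorts to an iterative divide-and-conquer scheme.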

2021 see 2020

Wasserstein-based fairness interpretability framework for machine learning models

Miroshnikov, Alexey; Kotsiopoulos, Konstandinos; Franks, Ryan; Kannan, Arjun Ravi. arXiv.org; Ithaca, Nov 19, 2021.

Abstract/DetailsGet full text
 

2021 patent news

United States Patent for System and Method for Unsupervised Domain Adaptation Via Sliced-Wasserstein Distance Issued to HRL Laboratories

Global IP News. Information Technology Patent News; New Delhi [New Delhi]. 17 Nov 2021.  

Rate of convergence for particle approximation of PDEs in Wasserstein space

Germain, Maximilien; Pham, Huyên; Warin, Xavier. arXiv.org; Ithaca, Nov 16, 2021.

Abstract/DetailsGet full text
 
Rate of convergence for particle approximation of PDEs in Wasserstein space

by Germain, Maximilien; Pham, Huyên; Warin, Xavier
03/2021
We prove a rate of convergence for the $N$-particle approximation of a second-order partial differential equation in the space of probability measures, like...
Journal Article  Full Text Online


Using wasserstein generative adversarial networks to create network traffic samples
Sychugov, A A; Grekov, M M. AIP Conference Proceedings; Melville, Vol. 2402, Iss. 1, (Nov 15, 2021).

Abstract/Details Get full textLink to external site, this link will open in a new window
Related articles
 All 3 versions

2021

Sensitivity analysis of Wasserstein distributionally robust optimization problems

Bartl, Daniel; Drapeau, Samuel; Obloj, Jan; Wiesel, Johannes. arXiv.org; Ithaca, Nov 12, 2021.

Abstract/DetailsGet full text

Cited by 1 Related articles All 3 versions
 MR4366493


Tackling Algorithmic Bias in Neural-Network Classifiers using Wasserstein-2 Regularization
Risser, Laurent; Alberto Gonzalez Sanz; Vincenot, Quentin; Jean-Michel Loubes. arXiv.org; Ithaca, Nov 12, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

Wasserstein Adversarially Regularized Graph Autoencoder
Liang, Huidong; Gao, Junbin. arXiv.org; Ithaca, Nov 9, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

All 2 versions 

2021 see 2020

Sig-Wasserstein GANs for Time Series Generation
Ni, Hao; Szpruch, Lukasz; Sabate-Vidales, Marc; Xiao, Baoren; Wiese, Magnus; et al. arXiv.org; Ithaca, Nov 1, 20

Abstract/DetailsGet full text
Link to external site, this link will open in a new window
Cited by 25
 Related articles All 4 versions

On Label Shift in Domain Adaptation via Wasserstein Distance
Le, Trung; Do, Dat; Nguyen, Tuan; Nguyen, Huy; Bui, Hung; et al. arXiv.org; Ithaca, Oct 29, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

Related articles All 6 versions 

<——2021———2021———1520—— 


 2021  see 2020    

LCS Graph Kernel Based on Wasserstein Distance in Longest Common Subsequence Metric Space
Huang, Jianming; Fang, Zhongxi; Kasai, Hiroyuki. arXiv.org; Ithaca, Oct 29, 2021.

Abstract/Details Get full textLink to external site, this link will open in a new window
Cited by 4
 Related articles All 7 versions


Wasserstein Distance Maximizing Intrinsic Control
Durugkar, Ishan; Hansen, Steven; Spencer, Stephen; Mnih, Volodymyr. arXiv.org; Ithaca, Oct 28, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window
 All 4 versions 

Approximating 1-Wasserstein Distance between Persistence Diagrams by Graph Sparsification
Dey, Tamal K; Zhang, Simon. arXiv.org; Ithaca, Oct 27, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

Universality of persistence diagrams and the bottleneck and Wasserstein distances
Bubenik, Peter; Elchesen, Alex. arXiv.org; Ithaca, Oct 27, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window
All 3 versions
 

 Journal Article  Full Text Online

[PDF] arxiv.org

Two-sample test with kernel projected Wasserstein distance

J Wang, R Gao, Y Xie - arXiv preprint arXiv:2102.06449, 2021 - arxiv.org

… We develop a kernel projected Wasserstein distance for the two-sample test, … distance between projected distributions. In contrast to existing works about projected Wasserstein distance …

Cited by 1 Related articles All 3 versions 


2021


2021  see 2020

Safe Wasserstein Constrained Deep Q-Learning
Kandel, Aaron; Moura, Scott J. arXiv.org; Ithaca, Oct 25, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window


[PDF] latinxinai.org

[PDF] Population Dynamics for Discrete Wasserstein Gradient Flows over Networks

G Diaz-Garcia, CA Uribe, N Quijano - research.latinxinai.org

We study the problem of minimizing a convex function over probability measures supported in a graph. We build upon the recent formulation of optimal transport over discrete domains to propose a method that generates a sequence that provably converges to a minimum of …

Related articles 

 2021  see 2020

Distributed Wasserstein Barycenters via Displacement Interpolation
Cisneros-Velarde, Pedro; Bullo, Francesco. arXiv.org; Ithaca, Oct 19, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

Wasserstein Unsupervised Reinforcement Learning
He, Shuncheng; Jiang, Yuhang; Zhang, Hongchang; Shao, Jianzhun; Ji, Xiangyang. arXiv.org; Ithaca, Oct 15, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

Related articles
 All 2 versio


Learning with minibatch Wasserstein : asymptotic and gradient properties
Kilian Fatras; Zine, Younes; Flamary, Rémi; Gribonval, Rémi; Courty, Nicolas. arXiv.org; Ithaca, Oct 13, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

<——2021———2021———1530—— 


Stochastic Approximation versus Sample Average Approximation for population Wasserstein barycenters
Dvinskikh, Darina. arXiv.org; Ithaca, Oct 25, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window
Cited by 2
Related articles All 3 versions


Heterogeneous Wasserstein Discrepancy for Incomparable Distributions
Alaya, Mokhtar Z; Gasso, Gilles; Berar, Maxime; Rakotomamonjy, Alain. arXiv.org; Ithaca, Oct 12, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window
Related articles
 All 15 versions 


2021  see 2020

Augmented Sliced Wasserstein Distances
Chen, Xiongjie; Yang, Yongxin; Li, Yunpeng. arXiv.org; Ithaca, Oct 11, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window


Wasserstein GANs Work Because They Fail (to Approximate the Wasserstein Distance)
by Stanczuk, Jan; Etmann, Christian; Kreusser, Lisa Maria; More...
03/2021
Wasserstein GANs are based on the idea of minimising the Wasserstein distance between a real and a generated distribution. We provide an in-depth mathematical...
Journal Article  Full Text Online
arXiv:2103.01678  [pdf, other]  stat.ML  cs.LG
Wasserstein GANs Work Because They Fail (to Approximate the Wasserstein Distance)
Authors: Jan Stanczuk, Christian Etmann, Lisa Maria Kreusser, Carola-Bibiane Schonlieb
Abstract: Wasserstein GANs are based on the idea of minimising the Wasserstein distance between a real and a generated distribution. We provide an in-depth mathematical analysis of differences between the theoretical setup and the reality of training Wasserstein GANs. In this work, we gather both theoretical and empirical evidence that the WGAN loss is not a meaningful approximation of the Wasserstein dista… 
More
Submitted 3 March, 2021; v1 submitted 2 March, 2021; originally announced March 2021.
 Cited by 3 Related articles All 3 versions 

Wasserstein GANs Work Because They Fail (to Approximate the Wasserstein Distance)
Stanczuk, Jan; Etmann, Christian; Kreusser, Lisa Maria; Schönlieb, Carola-Bibiane. arXiv.org; Ithaca, Oct 5, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

Cited by 16
 Related articles All 6 versions 
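The loss these entries analyze is the Kantorovich-Rubinstein dual objective of the Wasserstein distance, optimized over an approximately 1-Lipschitz critic. A minimal PyTorch sketch of a WGAN critic loss with gradient penalty follows; the network size, batch data, and penalty weight are illustrative assumptions, not taken from any of the cited papers.

# Minimal sketch of the WGAN-GP critic objective (illustrative, not a specific
# paper's implementation): maximize E[f(real)] - E[f(fake)] over an
# approximately 1-Lipschitz critic f, enforced via a gradient penalty.
import torch
import torch.nn as nn

critic = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))

def critic_loss(real, fake, lambda_gp=10.0):
    """Negative Kantorovich-Rubinstein objective plus gradient penalty."""
    w_term = critic(fake).mean() - critic(real).mean()
    # gradient penalty on random interpolates between real and fake samples
    eps = torch.rand(real.size(0), 1)
    interp = (eps * real + (1 - eps) * fake).requires_grad_(True)
    grad = torch.autograd.grad(critic(interp).sum(), interp, create_graph=True)[0]
    gp = ((grad.norm(2, dim=1) - 1) ** 2).mean()
    return w_term + lambda_gp * gp

real = torch.randn(32, 2)                # stand-in for data samples
fake = torch.randn(32, 2) + 2.0          # stand-in for generator output
loss = critic_loss(real, fake)
loss.backward()                          # gradients for one critic update step
print(float(loss))

The papers above argue about how well the value of this trained objective actually approximates the Wasserstein distance; the sketch only shows the quantity being trained.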

Minimum entropy production, detailed balance and Wasserstein distance for continuous-time Markov processes
Dechant, Andreas. arXiv.org; Ithaca, Oct 4, 2021.

2021

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

Sliced Wasserstein based Canonical Correlation Analysis for Cross-Domain Recommendation
Zhao, Zian; Nie, Jie; Wang, Chenglong; Huang, Lei. Pattern Recognition Letters; Amsterdam Vol. 150,  (Oct 2021): 33.

Abstract/Details 

All 3 versions

Mass non-concentration at the nodal set and a sharp Wasserstein uncertainty principle
Mukherjee, Mayukh. arXiv.org; Ithaca, Sep 30, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window
Cited by 1
 Related articles 

  

Gaussian approximation for penalized Wasserstein barycenters
Nazar Buzun. arXiv.org; Ithaca, Sep 19, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window


De-aliased seismic data interpolation using conditional Wasserstein generative adversarial networks
Wei, Qing; Li, Xiangyang; Song, Mingpeng. Computers & geosciences Vol. 154,  (Sep 2021).

Abstract/Details 

Cited by 3 Related articles All 3 versions

<——2021—————2021———1540——  


2021  see 2020

Safe Zero-Shot Model-Based Learning and Control: A Wasserstein Distributionally Robust Approach
Kandel, Aaron; Moura, Scott J. arXiv.org; Ithaca, Aug 30, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

 

Automatic Image Annotation Using Improved Wasserstein Generative Adversarial Networks
Liu, Jian; Wu, Weisheng. IAENG International Journal of Computer Science; Hong Kong Vol. 48, Iss. 3,  (Aug 27, 2021): 507.

Abstract/Details

All 2 versions 

From the backward Kolmogorov PDE on the Wasserstein space to propagation of chaos for McKean-Vlasov SDEs
Frikha, Noufel; Paul-Eric Chaudru de Raynal. arXiv.org; Ithaca, Aug 25, 2021.

 Abstract     Translate

This article is a continuation of our first work \cite{chaudruraynal:frikha}. We here establish some new quantitative estimates for propagation of chaos of non-linear stochastic differential equations in the sense of McKean-Vlasov. We obtain explicit error estimates, at the level of the trajectories, at the level of the semi-group and at the level of the densities, for the mean-field approximation by systems of interacting particles under mild regularity assumptions on the coefficients. A first order expansion for the difference between the densities of one particle and its mean-field limit is also established. Our analysis relies on the well-posedness of classical solutions to the backward Kolmogorov partial differential equations defined on the strip …

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

2021  [HTML] tandfonline.com

Full View

Dissimilarity measure of local structure in inorganic crystals using Wasserstein distance to search for novel phosphors

S Takemura, T Takeda, T Nakanishi… - … and Technology of …, 2021 - Taylor & Francis

To efficiently search for novel phosphors, we propose a dissimilarity measure of local structure using the Wasserstein distance. This simple and versatile method provides the …

  Related articles All 7 versions


2021


2021  see 2020

Projection Robust Wasserstein Distance and Riemannian Optimization
Lin, Tianyi; Fan, Chenyou; Ho, Nhat; Cuturi, Marco; Jordan, Michael I. arXiv.org; Ithaca, Jul 17, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

Local well-posedness in the Wasserstein space for a chemotaxis model coupled to Navier-Stokes equations
Kang, Kyungkeun; Kim, Haw Kil. arXiv.org; Ithaca, Aug 6, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

Low-dose CT denoising using a Progressive Wasserstein generative adversarial network
Wang, Guan; Hu, Xueli. Computers in Biology and Medicine; Oxford Vol. 135,  (Aug 2021).

Abstract/DetailsFull textFull text - PDF (17 MB)‎

All 4 versions


[PDF] mdpi.com

A novel intelligent fault diagnosis method for rolling bearings based on Wasserstein generative adversarial network and Convolutional Neural Network under …

H Tang, S Gao, L Wang, X Li, B Li, S Pang - Sensors, 2021 - mdpi.com

… The Wasserstein generative … Wasserstein generative adversarial net (WGAN) evaluates the difference between the real and generated sample distributions by using the Wasserstein …

 Cited by 6 Related articles All 9 versions 

2021  see 2020

A Bismut-Elworthy inequality for a Wasserstein diffusion on the circle
Marx, Victor. arXiv.org; Ithaca, Jul 22, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

<——2021———2021———1550——


Drug-drug interaction prediction with Wasserstein Adversarial Autoencoder-based knowledge graph embeddings.
Dai, Yuanfei; Guo, Chenhao; Guo, Wenzhong; Eickhoff, Carsten; National Library of Medicine. Briefings in bioinformatics Vol. 22, Iss. 4,  (July 20, 2021).

Abstract/Details Get full textLink to external site, this link will open in a new window


A unified framework for non-negative matrix and tensor factorisations with a smoothed Wasserstein loss
Zhang, Stephen Y. arXiv.org; Ithaca, Jul 15, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

Cited by 2
 Related articles All 5 versions 


Wasserstein GAN: Deep Generation applied on Bitcoins financial time series
Rikli Samuel; Bigler, Daniel Nico; Pfenninger Moritz; Osterrieder Joerg. arXiv.org; Ithaca, Jul 13, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window
Cited by 3
 Related articles All 2 versions 
 

Wasserstein Robust Classification with Fairness Constraints
Wang, Yijie; Nguyen, Viet Anh; Hanasusanto, Grani A. arXiv.org; Ithaca, Jul 12, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

Related articles
 All 2 versions 


Fast convergence of empirical barycenters in Alexandrov spaces and the Wasserstein space
Thibaut Le Gouic; Paris, Quentin; Rigollet, Philippe; Stromme, Austin J. arXiv.org; Ithaca, Jul 12, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window


2021


Virtual persistence diagrams, signed measures, Wasserstein distances, and Banach spaces
Bubenik, Peter; Elchesen, Alex. arXiv.org; Ithaca, Jul 7, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

 

Semi-relaxed Gromov-Wasserstein divergence with applications on graphs
by Vincent-Cuaz, Cédric; Flamary, Rémi; Corneli, Marco; More...
10/2021
Comparing structured objects such as graphs is a fundamental operation involved in many learning tasks. To this end, the Gromov-Wasserstein (GW) distance,...
Journal Article  Full Text Online


2021 see 2020

Wasserstein Convergence Rate for Empirical Measures on Noncompact Manifolds
Feng-Yu, Wang. arXiv.org; Ithaca, Jul 3, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window
MR4347493


(PDF) Distributionally Robust Resilient Operation of Integrated ...

https://www.researchgate.net › publication › 348300885_...

Jan 11, 2021 — Distributionally Robust Resilient Operation of Integrated Energy Distribution Systems Using Moment and Wasserstein Metric for Contingencies.

[CITATION] Distributionally Robust Resilient Operation of Integrated Energy Systems Using Moment and Wasserstein Metric for Contingencies

Y Zhou, Z Wei, M Shahidehpour… - IEEE Transactions on …, 2021 - ui.adsabs.harvard.edu


Distributionally Robust Resilient Operation of Integrated Energy Systems Using Moment and Wasserstein Metric for Contingencies
Zhou, Yizhou; Wei, Zhinong; Shahidehpour, Mohammad; Chen, Sheng. IEEE Transactions on Power Systems; New York Vol. 36, Iss. 4,  (2021): 3574-3584.

Cited by 11 Related articles All 2 versions

Wasserstein Adversarial Regularization (WAR) on label noise
Kilian Fatras; Damodaran, Bharath Bhushan; Lobry, Sylvain; Flamary, Rémi; Tuia, Devis; et al. arXiv.org; Ithaca, Jun 29, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

<——2021———2021———1560—— 


 

2021  [PDF] researchgate.net

[PDF] Two-sample Test using Projected Wasserstein Distance

J Wang, R Gao, Y Xie - researchgate.net

We develop a projected Wasserstein distance for the two-sample test, a fundamental problem in statistics and machine learning: given two sets of samples, to determine whether they are from the same distribution. In particular, we aim to circumvent the curse of …

  Related articles 
2021 see 2020 
Conference Paper

Two-sample Test using Projected Wasserstein Distance: Breaking the Curse of Dimensionality
Wang, Jie; Gao, Rui; Xie, Yao. arXiv.org; Ithaca, Jun 15, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window
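The basic ingredient behind these projected Wasserstein tests is to push both samples onto one-dimensional directions and compare the projected empirical distributions with a 1-D Wasserstein distance. A rough numpy sketch follows; it uses a crude maximum over random unit directions as a stand-in for the optimized (or kernelized) projection the papers actually study, and all sizes are illustrative assumptions.

# Rough sketch of a projection-based Wasserstein two-sample statistic
# (illustrative stand-in, not the authors' optimized projection).
import numpy as np

def w1_1d(u, v):
    """1-D W1 between empirical measures with equal sample sizes."""
    return np.abs(np.sort(u) - np.sort(v)).mean()

def projected_w1(X, Y, n_dirs=200, seed=0):
    rng = np.random.default_rng(seed)
    dirs = rng.normal(size=(n_dirs, X.shape[1]))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)    # unit directions
    return max(w1_1d(X @ u, Y @ u) for u in dirs)

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
Y = rng.normal(size=(500, 10))
Y[:, 0] += 1.0                      # shift one coordinate: distributions differ
print("projected W1 statistic:", round(projected_w1(X, Y), 3))

Working with one-dimensional projections is what lets these tests sidestep the curse of dimensionality mentioned in the abstract.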

About exchanging expectation and supremum for conditional Wasserstein GANs
Martin, Jörg. arXiv.org; Ithaca, Jun 14, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

 
2021 see 2020

Universal consistency of Wasserstein  k-NN classifier
Ponnoprat, Donlapark. arXiv.org; Ithaca, Jun 14, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window
 Related articles 


2021 see 2020

The Unbalanced Gromov Wasserstein Distance: Conic Formulation and Relaxation
Thibault Séjourné; Vialard, François-Xavier; Peyré, Gabriel. arXiv.org; Ithaca, Jun 8, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window
Cited by 36
 Related articles All 7 versions 

Gradient Flows for Frame Potentials on the Wasserstein Space
Wickman, Clare; Okoudjou, Kasso. arXiv.org; Ithaca, Jun 8, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

2021


2021`  [PDF] arxiv.org

Two-sample goodness-of-fit tests on the flat torus based on Wasserstein distance and their relevance to structural biology

J González-Delgado, A González-Sanz… - arXiv preprint arXiv …, 2021 - arxiv.org

This work is motivated by the study of local protein structure, which is defined by two variable dihedral angles that take values from probability distributions on the flat torus. Our goal is to …

 All 19 versions 


2021 

Wasserstein GAN: Deep Generation Applied on Financial Time Series

M Pfenninger, S Rikli, DN Bigler - Available at SSRN 3877960, 2021 - papers.ssrn.com

Modeling financial time series is challenging due to their high volatility and unexpected

happenings on the market. Most financial models and algorithms trying to fill the lack of …

 


<——2021———2021———1570—— 


 
2021

One-shot style transfer using Wasserstein Autoencoder

H NakadaH Asoh - 2021 Asian Conference on Innovation in …, 2021 - ieeexplore.ieee.org

We propose an image style transfer method based on disentangled representation obtained with Wasserstein Autoencoder. Style transfer is an area of image generation technique that …



2021

Distributionally Safe Path Planning: Wasserstein Safe RRT

P Lathrop, B Boardman… - IEEE Robotics and …, 2021 - ieeexplore.ieee.org

In this paper, we propose a Wasserstein metric-based random path planning algorithm. Wasserstein Safe RRT (W-Safe RRT) provides finite-sample probabilistic guarantees on the …





2021  [PDF] tandfonline.com

Stochastic approximation versus sample average approximation for Wasserstein barycenters

D Dvinskikh - Optimization Methods and Software, 2021 - Taylor & Francis

In the machine learning and optimization community, there are two main approaches for the convex risk minimization problem, namely the Stochastic Approximation (SA) and the …

 All 3 versions
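The barycenter literature collected above works with general measures; in one dimension the 2-Wasserstein barycenter of equally weighted empirical measures reduces to averaging quantile functions, which gives a tiny sanity-check sketch (this is only the 1-D special case, not the SA/SAA algorithms discussed in the entry).

# Tiny sanity check: the 1-D W2 barycenter of equally weighted empirical
# measures is the average of their quantile functions.
import numpy as np

def barycenter_1d(samples, grid_size=100):
    """W2 barycenter of 1-D empirical measures, represented by its quantile curve."""
    qs = np.linspace(0.0, 1.0, grid_size)
    quantile_curves = [np.quantile(s, qs) for s in samples]
    return np.mean(quantile_curves, axis=0)    # average quantile function

rng = np.random.default_rng(0)
measures = [rng.normal(loc=m, scale=1.0, size=1000) for m in (-2.0, 0.0, 2.0)]
bary_quantiles = barycenter_1d(measures)
print("barycenter mean ~", round(bary_quantiles.mean(), 2))   # close to 0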


2021  [PDF] arxiv.org

Convergence rates for Metropolis-Hastings algorithms in the Wasserstein distance

A Brown, GL Jones - arXiv preprint arXiv:2111.10406, 2021 - arxiv.org

We develop necessary conditions for geometrically fast convergence in the Wasserstein distance for Metropolis-Hastings algorithms on $\mathbb{R}^d$ when the metric used is a …

 All 2 versions 


2021


WDIBS: Wasserstein deterministic information bottleneck for state abstraction to balance state-compression and performance

X Zhu, T Huang, R Zhang, W Zhu - Applied Intelligence, 2021 - Springer

As an important branch of reinforcement learning, Apprenticeship learning studies how an agent learns good behavioral decisions by observing an expert policy from the environment …



2021  arXiv

Bounds in $L^1$ Wasserstein distance on the normal approximation of general M-estimators

François Bachoc (IMT), Max Fathi (LPSM, LJLL)

[Submitted on 18 Nov 2021]

We derive quantitative bounds on the rate of convergence in $L^1$ Wasserstein distance of general M-estimators, with an almost sharp (up to a logarithmic term) behavior in the number of observations. We focus on situations where the estimator does not have an explicit expression as a function of the data. The general method may be applied even in situations where the observations are not independent. Our main application is a rate of convergence for cross validation estimation of covariance parameters of Gaussian processes.

Subjects:

Statistics Theory (math.ST)



2021  see 2020

Entropic regularization of Wasserstein distance between infinite-dimensional Gaussian measures and Gaussian processes
Minh Ha Quang. arXiv.org; Ithaca, Apr 23, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

Well-Posedness for Some Non-Linear Diffusion Processes and Related PDE on the Wasserstein Space
Paul-Eric Chaudru de Raynal; Frikha, Noufel. arXiv.org; Ithaca, Apr 22, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

2021 see 2020

Stability of Gibbs Posteriors from the Wasserstein Loss for Bayesian Full Waveform Inversion
Dunlop, Matthew M; Yang, Yunan. arXiv.org; Ithaca, Apr 16, 2021.

Cited by 5 Related articles All 4 versions

  <——2021———2021———1580——


Abstract/DetailsGet full text
Link to external site, this link will open in a new window

Wasserstein-based Projections with Applications to Inverse Problems
Heaton, Howard; Samy Wu Fung; Alex Tong Lin; Osher, Stanley; Yin, Wotao. arXiv.org; Ithaca, Apr 14, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

Application of an unbalanced optimal transport distance and a mixed L1/Wasserstein distance to full waveform inversion
Li, Da; Lamoureux, Michael P; Liao, Wenyuan. arXiv.org; Ithaca, Apr 3, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

2021  see 2020

Interpretable Model Summaries Using the Wasserstein Distance
Dunipace, Eric; Trippa, Lorenzo. arXiv.org; Ithaca, Apr 2, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

A deep learning-based approach for direct PET attenuation correction using Wasserstein generative adversarial network
Li, Yongchang; Wu, Wei. Journal of Physics: Conference Series; Bristol Vol. 1848, Iss. 1,  (Apr 2021).

Abstract/DetailsFull text - PDF (703 KB)‎

Cited by 1 Related articles All 3 versions

2021


2021  see 2020

Wasserstein Distances for Stereo Disparity Estimation
Garg, Divyansh; Wang, Yan; Hariharan, Bharath; More...

Abstract/DetailsGet full text
Link to external site, this link will open in a new window
Wasserstein Distances for Stereo Disparity Estimation

by Divyansh Garg, Yan Wang, Bharath Hariharan; More...
arXiv.org, 03/2021
Existing approaches to depth or disparity estimation output a distribution over a set of pre-defined discrete values. This leads to inaccurate results when the...
Paper  Full Text Online


2021  see 2020

Primal Wasserstein Imitation Learning
Dadashi, Robert; Léonard Hussenot; Geist, Matthieu; Pietquin, Olivier. arXiv.org; Ithaca, Mar 17, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window
 

Wasserstein Proximal Algorithms for the Schrödinger Bridge Problem: Density Control with Nonlinear Drift
Caluya, Kenneth F; Halder, Abhishek. arXiv.org; Ithaca, Mar 15, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

2021 patent

Object shape regression using wasserstein distance
by Palo Alto Research Center Incorporated
03/2021
One embodiment can provide a system for detecting outlines of objects in images. During operation, the system receives an image that includes at least one...
Patent  Available Online

Palo Alto Research Center Obtains Patent for Object Shape Regression Using Wasserstein Distance
Global IP News. Optics & Imaging Patent News; New Delhi [New Delhi]. 09 Mar 2021. 
 
2021  see 2020

Wasserstein Stability for Persistence Diagrams
Skraba, Primoz; Turner, Katharine. arXiv.org; Ithaca, Mar 4, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

 <——2021———2021———1590——



2021 see 2020

Tessellated Wasserstein Auto-Encoders
Kuo Gai; Zhang, Shihua. arXiv.org; Ithaca, Mar 4, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window



高光谱图像分类的Wasserstein配置熵非监督波段选择方法
Alternate title: Unsupervised band selection for hyperspectral image classification using the Wasserstein metric-based configuration entropy

张红吴智伟王继成高培超. Cehui Xuebao; Beijing Vol. 50, Iss. 3,  (Mar 2021): 405-415.
[Chinese  Wasserstein configuration entropy unsupervised band selection method for hyperspectral image classification]


2021 see 2020

Intrinsic Sliced Wasserstein Distances for Comparing Collections of Probability Distributions on Manifolds and Graphs
Rustamov, Raif M; Majumdar, Subhabrata. arXiv.org; Ithaca, Mar 1, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window
Intrinsic Sliced Wasserstein Distances for Comparing Collections of Probability...

by Raif M Rustamov, Subhabrata Majumdar
arXiv.org, 03/2021
Collections of probability distributions arise in a variety of statistical applications ranging from user activity pattern analysis to brain connectomics. In...
Paper  Full Text Online


2021  see 2020

Permutation invariant networks to learn Wasserstein metrics
Sehanobish, Arijit; Neal, Ravindra; David van Dijk. arXiv.org; Ithaca, Feb 26, 2021

Abstract/DetailsGet full text
Link to external site, this link will open in a new window


2021  see 2020

Scalable Computations of Wasserstein Barycenter via Input Convex Neural Networks
Fan, Jiaojiao; Taghvaei, Amirhossein; Chen, Yongxin. arXiv.org; Ithaca, Feb 23, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

2021


2021  see 2020

The Wasserstein Proximal Gradient Algorithm
Salim, Adil; Korba, Anna; Luise, Giulia. arXiv.org; Ithaca, Feb 21, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

2021 see 2020

Convergence and finite sample approximations of entropic regularized Wasserstein distances in Gaussian and RKHS settings
Minh Ha Quang. arXiv.org; Ithaca, Feb 15, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

Cited by 2
 Related articles All 5 versions 
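For finite samples, the entropic-regularized Wasserstein distances studied in this entry are usually computed with Sinkhorn iterations. A minimal numpy sketch follows; the regularization strength and iteration count are arbitrary illustrative choices, and this is not the paper's Gaussian/RKHS closed-form route.

# Minimal Sinkhorn sketch for entropic-regularized OT between two empirical
# measures (illustrative; the cited paper derives analytic Gaussian/RKHS forms).
import numpy as np

def sinkhorn(a, b, cost, eps=0.5, n_iter=500):
    """Return the entropic OT plan between weights a and b for a given cost matrix."""
    K = np.exp(-cost / eps)                 # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)                   # scale columns to match b
        u = a / (K @ v)                     # scale rows to match a
    return u[:, None] * K * v[None, :]      # plan = diag(u) K diag(v)

rng = np.random.default_rng(0)
x, y = rng.normal(size=50), rng.normal(loc=1.0, size=60)
a, b = np.full(50, 1 / 50), np.full(60, 1 / 60)
cost = (x[:, None] - y[None, :]) ** 2       # squared ground cost
plan = sinkhorn(a, b, cost)
print("regularized W2^2 estimate:", round(float((plan * cost).sum()), 3))

Smaller eps gives a tighter approximation of the unregularized distance at the price of slower, less stable iterations, which is the trade-off the convergence results above quantify.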


Geometric convergence bounds for Markov chains in Wasserstein distance based on generalized drift and contraction conditions
Qin, Qian; Hobert, James P. arXiv.org; Ithaca, Feb 15, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

MR4421611

Weak optimal total variation transport problems and generalized Wasserstein barycenters
Chung, Nhan-Phu; Thanh-Son Trinh. arXiv.org; Ithaca, Jan 18, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

 

NEWSLETTER ARTICLE

Studies from South China University of Technology Have Provided New Data on Information Technology (Data-driven Distributionally Robust Unit Commitment With Wasserstein Metric: Tractable Formulation and Efficient Solution Method)

Information Technology Newsweekly, 2021, p.887
<——2021———2021———1600——


2021  see 2020

From Smooth Wasserstein Distance to Dual Sobolev Norm: Empirical Approximation and Statistical Applications
Sloan Nietert; Goldfeld, Ziv; Kato, Kengo. arXiv.org; Ithaca, Jan 14, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window
Cited by 3
 Related articles All 2 versions 

Strong Formulations for Distributionally Robust Chance-Constrained Programs with Left-Hand Side Uncertainty under Wasserstein Ambiguity
Ho-Nguyen, Nam; Kılınç-Karzan, Fatma; Küçükyavuz, Simge; Lee, Dabeen. arXiv.org; Ithaca, Jan 13, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window


2021  see 2020

Stochastic Saddle-Point Optimization for Wasserstein Barycenters
Tiapkin, Daniil; Gasnikov, Alexander; Dvurechensky, Pavel. arXiv.org; Ithaca, Jan 11, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window


2021  see 2020

Exponential Convergence in Entropy and Wasserstein Distance for McKean-Vlasov SDEs
Ren, Panpan; Feng-Yu, Wang. arXiv.org; Ithaca, Jan 5, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

 2021


2021  see 2020

Convergence of Recursive Stochastic Algorithms using Wasserstein Divergence
Gupta, Abhishek; Haskell, William B. arXiv.org; Ithaca, Jan 5, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window
Cited by 5
 Related articles All 7 versions


Dissertation or Thesis

Wasserstein Adversarial Transformer for Cloud Workload Prediction
Arbat, Shivani Gajanan. University of Georgia, ProQuest Dissertations Publishing, 2021. 28643528.

Abstract/DetailsPreview - PDF (470 KB)‎

Related articles All 2 versions

  Scholarly Journal  Full Text

Spacecraft Intelligent Fault Diagnosis under Variable Working Conditions via Wasserstein Distance-Based Deep Adversarial Transfer Learning
Xiang, Gang; Tian, Kun. International Journal of Aerospace Engineering; New York Vol. 2021,  (2021).

Abstract/DetailsFull textFull text - PDF (4 MB)‎

[HTML] hindawi.com

[HTML] Spacecraft Intelligent Fault Diagnosis under Variable Working Conditions via Wasserstein Distance-Based Deep Adversarial Transfer Learning

G Xiang, K Tian - International Journal of Aerospace Engineering, 2021 - hindawi.com

In recent years, deep learning methods which promote the accuracy and efficiency of fault diagnosis task without any extra requirement of artificial feature extraction have elicited the attention of researchers in the field of manufacturing industry as well as aerospace …

 All 3 versions 

2021

SRWGANTV: Image Super-Resolution Through Wasserstein Generative Adversarial Networks with Total Variational Regularization

Chen, Liang; Wu, Yi. The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference 

Proceedings; Piscataway, (2021).

Conference Paper  Citation/Abstract

SRWGANTV: Image Super-Resolution Through Wasserstein Generative Adversarial Networks with Total Variational Regularization
Chen, Liang; Wu, Yi.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).


Speech Enhancement Approach Based on Relativistic Wasserstein Generation Adversarial Networks
Li, Zhi. The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).

Related articles All 2 versions

<——2021———2021———1610——  


2021  see 2019

Data-driven Wasserstein distributionally robust optimization for refinery planning under uncertainty
Zhao, Liang; He, Wangli. The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).
 Related articles


 2021  see 2020

PLG-IN: Pluggable Geometric Consistency Loss with Wasserstein Distance in Monocular Depth Estimation
Koide, Satoshi; Kawano, Keisuke; Kondo, Ruho. The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).

Abstract/Details Get full textLink to external site, this link will open in a new window

Wasserstein Based EmoGANs+
Khine, Win Shwe Sin; Siritanawan, Prarinya; Kotani, Kazunori. The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).
Related articles


Inverse Domain Adaptation for Remote Sensing Images Using Wasserstein Distance
Li, Ziyao; Wang, Rui; Pun, Man-On; Wang, Zhiguo. The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).

Related articles


Conference Paper

Solving Wasserstein Robust Two-stage Stochastic Linear Programs via Second-order Conic Programming
Wang, Zhuolin; You, Keyou; Song, Shiji. The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).
Cited by 1
 Related articles


2021

  Conference Paper

Image Denoising Using an Improved Generative Adversarial Network with Wasserstein Distance
Liu, Han; Xie, Guo; Zhang, Youmin. The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).


Conference Paper

One-shot style transfer using Wasserstein Autoencoder
Nakada, Hidemoto; Asoh, Hideki. The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).
Related articles


Conference Paper

Joint Distribution Adaptation via Wasserstein Adversarial Training
Wang, Xiaolu; Zhang, Wenyong; Shen, Xin; Liu, Huikang. The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).

Conference Paper

Wasserstein Distance-Based Domain Adaptation and Its Application to Road Segmentation
Kono, Seita; Ueda, Takaya; Arriaga-Varela, Enrique; Nishikawa, Ikuko. The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).
Related articles


Conference Paper

Fault injection in optical path - detection quality degradation analysis with Wasserstein distance
Kowalczyk, Pawel; Bugiel, Paulina; Szelest, Marcin; Izydorczyk, Jacek. The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).

<——2021———2021———1620——


Conditional Wasserstein Generative Adversarial Networks for Rebalancing Iris Image Datasets
Yung-Hui, LI; ASLAM, Muhammad Saqlain; Latifa Nabila HARFIYA; Ching-Chun, CHANG. IEICE Transactions on Information and Systems; Tokyo Vol. E104D, Iss. 9,  (2021): 1450-1458.

Abstract/Details Get full textLink to external site, this link will open in a new window 

All 6 versions

Critical Sample Generation Method for Static Voltage Stability Based on Transfer Learning and Wasserstein Generative Adversarial Network
Liao, Yifan; Wu, Zhigang. Dianwang Jishu = Power System Technology; Beijing Iss. 9,  (2021): 3722.
[CITATION] Critical Sample Generation Method for Static Voltage Stability Based on Transfer Learning and Wasserstein Generative Adversarial Network

Y Liao - Power System Technology, 2021

Cited by 2

Fault Diagnosis of Wind Turbine Drivetrain Based on Wasserstein Generative Adversarial Network-Gradient Penalty
Teng, Wei; Ding, Xian; Shi, Bingshuai; Xu, Jin. Dianli Xitong Zidonghua = Automation of Electric Power Systems; Nanjing Vol. 45, Iss. 22,  (2021): 167.

  
2021  see 2020

De Novo Protein Design for Novel Folds Using Guided Conditional Wasserstein Generative Adversarial Networks
Karimi, Mostafa; Zhu, Shaowen; Cao, Yue; Shen, Yang. Journal of Chemical Information and Modeling; Washington Vol. 60, Iss. 12,  (Dec 28, 2020): 5667.

Abstract/Details Get full textLink to external site, this link will open in a new window


Solutions to Hamilton–Jacobi equation on a Wasserstein space
Badreddine Zeinab; Frankowska Hélène. Calculus of Variations and Partial Differential Equations; Heidelberg Vol. 61, Iss. 1,  (2022).

Abstract/Details 

 

2021 


Working Paper arXiv
W-entropy formulas and Langevin deformation of flows on Wasserstein space over Riemannian manifolds
Li, Songzi; Xiang-Dong, Li. arXiv.org; Ithaca, Nov 29, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

Schema matching using Gaussian mixture models with Wasserstein distance
Przyborowski, Mateusz; Pabiś, Mateusz; Janusz, Andrzej; Ślęzak, Dominik. arXiv.org; Ithaca, Nov 28, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window
All 3 versions
 
 
2021  see 2020

Wasserstein-based fairness interpretability framework for machine learning models
Miroshnikov, Alexey; Kotsiopoulos, Konstandinos; Franks, Ryan; Kannan, Arjun Ravi. arXiv.org; Ithaca, Nov 19, 2021.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window


Stochastic Wasserstein Hamiltonian Flows
by Cui, Jianbo; Liu, Shu; Zhou, Haomin
11/2021
In this paper, we study the stochastic Hamiltonian flow in Wasserstein manifold, the probability density space equipped with $L^2$-Wasserstein metric tensor,...
Journal Article  Full Text Online
arXiv:2111.15163  [pdf, ps, other]  math.PR  math.DS
Stochastic Wasserstein Hamiltonian Flows
Authors: Jianbo Cui, Shu Liu, Haomin Zhou
Abstract: In this paper, we study the stochastic Hamiltonian flow in Wasserstein manifold, the probability density space equipped with L2-Wasserstein metric tensor, via the Wong--Zakai approximation. We begin our investigation by showing that the stochastic Euler-Lagrange equation, regardless it is deduced from either variational principle or particle dynamics, can be interpreted as the stochastic kineti… 
More
Submitted 30 November, 2021; originally announced November 2021.
Comments: 34 pages
…Wasserstein manifold. We further propose a novel variational formulation …

Cited by 1 Related articles All 2 versions 

Trust the Critics: Generatorless and Multipurpose WGANs with Initial Convergence Guarantees
by Milne, Tristan; Bilocq, Étienne; Nachman, Adrian
11/2021
Inspired by ideas from optimal transport theory we present Trust the Critics (TTC), a new algorithm for generative modelling. This algorithm eliminates the...
Journal Article  Full Text Online

arXiv:2111.15099  [pdf, other]  cs.LG  cs.CV  cs.NE  math.OC
Trust the Critics: Generatorless and Multipurpose WGANs with Initial Convergence Guarantees
Authors: Tristan Milne, Étienne Bilocq, Adrian Nachman
Abstract: Inspired by ideas from optimal transport theory we present Trust the Critics (TTC), a new algorithm for generative modelling. This algorithm eliminates the trainable generator from a Wasserstein GAN; instead, it iteratively modifies the source data using gradient descent on a sequence of trained critic networks. This is motivated in part by the misalignment which we observed between the optimal tr… 
More
Submitted 29 November, 2021; originally announced November 2021.
Comments: 20 pages, 8 figures
MSC Class: 49Q22 ACM Class: I.3.3; I.4.4; I.4.3

Related articles All 2 versions 
<——2021———2021———1630——

arXiv:2111.15057  [pdf, other]  math.OC  eess.SY
Multi-period facility location and capacity planning under ∞-Wasserstein joint chance constraints in humanitarian logistics
Authors: Zhuolin Wang, Keyou You, Zhengli Wang, Kanglin Liu
Abstract: The key of the post-disaster humanitarian logistics (PD-HL) is to build a good facility location and capacity planning (FLCP) model for delivering relief supplies to affected areas in time. To fully exploit the historical PD data, this paper adopts the data-driven distributionally robust (DR) approach and proposes a novel multi-period FLCP model under the ∞-Wasserstein joint chance constrai…  More
Submitted 29 November, 2021; originally announced November 2021. 


2021.

https://www.worldscientific.com › doi › abs

The Wasserstein geometry of nonlinear σ models and the ...

by M Carfora · 2017 · 5 — Nonlinear sigma models are quantum field theories describing, in the large deviation sense, random fluctuations of harmonic maps between a Riemann surface ...


SVAE-WGAN based Soft Sensor Data Supplement Method for ...

https://ieeexplore.ieee.org › document

Nov 16, 2021 — Aimed at this problem, a SVAE-WGAN based soft sensor data supplement method is proposed for proce

online Cover Image PEER-REVIEW

SVAE-WGAN based Soft Sensor Data Supplement Method for Process Industry

by Gao, Shiwei; Qiu, Sulong; Ma, Zhongyu ; More...

IEEE sensors journal, 11/2021

Challenges of process industry, which is characterized as hugeness of process variables in complexity of industrial environment, can be tackled effectively by...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 

 

Solutions to Hamilton–Jacobi equation on a Wasserstein space

https://link.springer.com › content › pdf

Mar 13, 2021 — The considered Hamilton–Jacobi equations are stated on a Wasserstein space and are associated to a Calculus of Variation problem. Under some ...

Cover Image PEER-REVIEW

Solutions to Hamilton–Jacobi equation on a Wasserstein space

by Badreddine, Zeinab; Frankowska, Hélène

Calculus of variations and partial differential equations, 11/2021, Volume 61, Issue 1

We consider a Hamilton–Jacobi equation associated to the Mayer optimal control problem in the Wasserstein space $\mathcal{P}_2(\mathbb{R}^d)$ and define its solutions in terms...

Journal ArticleCitation Online

  

 

Wasserstein barycenters of compactly supported measures
https://link.springer.com › content › pdf

by S Kim · 2021 — We consider in this paper probability measures with compact support on the open convex cone of positive definite Hermitian matrices. We define ...

Cover Image PEER-REVIEW

Wasserstein barycenters of compactly supported measures

by Kim, Sejong; Lee, Hosoo

Analysis and mathematical physics, 08/2021, Volume 11, Issue 4

We consider in this paper probability measures with compact support on the open convex cone of positive definite Hermitian matrices. We define the least...

Journal ArticleCitation Online

 All 2 versions
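The metric underlying barycenters on the cone of positive definite matrices is the Bures-Wasserstein distance, which has the closed form d(A,B)^2 = tr(A) + tr(B) - 2 tr((A^{1/2} B A^{1/2})^{1/2}). A quick numpy/scipy check of that formula follows; it is only the distance, not the paper's barycenter construction.

# Quick check of the Bures-Wasserstein distance on positive definite matrices
# (the metric behind barycenters on this cone; not the paper's construction).
import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein(A, B):
    rA = sqrtm(A)
    cross = sqrtm(rA @ B @ rA)
    d2 = np.trace(A) + np.trace(B) - 2.0 * np.trace(cross)
    return float(np.sqrt(max(d2.real, 0.0)))    # clip tiny negative round-off

A = np.array([[2.0, 0.5], [0.5, 1.0]])
B = np.array([[1.0, 0.0], [0.0, 3.0]])
print("d_BW(A, B) =", round(bures_wasserstein(A, B), 4))
print("d_BW(A, A) =", bures_wasserstein(A, A))   # ~0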


2021

2021 patent

[PDF] scitation.org

Using wasserstein generative adversarial networks to create network traffic samples

AA Sychugov, MM Grekov - AIP Conference Proceedings, 2021 - aip.scitation.org

Modern information security systems are not always able to withstand constantly evolving

computer attacks. Using machine learning, attackers can carry out complex and unknown

attacks. Intrusion detection systems based on the search for anomalies allow us to detect

unknown attacks, but give a high percentage of false results. Small classes of attacks are

worse detected by classifiers when the training data sets are not balanced. In this paper, we

propose to use generative adversarial networks (GAN) to generate anomalous samples …

 Cited by 1 Related articles All 3 versions

Using wasserstein generative adversarial networks to create network traffic samples

by Sychugov, A. A; Grekov, M. M

AIP conference proceedings, 11/2021, Volume 2402, Issue 1

Modern information security systems are not always able to withstand constantly evolving computer attacks. Using machine learning, attackers can carry out...

Article PDF Download Now (via Unpaywall) BrowZine PDF Icon

Journal ArticleFull Text Online

News results for "(TitleCombined:(wasserstein OR wgan))"

online

HRL Laboratories LLC issued patent titled "System and method for unsupervised domain adaptation via sliced-wasserstein...

News Bites - Private Companies, Nov 23, 2021

Newspaper ArticleFull Text Online

 Cite this item Email this item Save this item More actions

United States Patent for System and Method for Unsupervised Domain Adaptation Via Sliced-Wasserstein Distance Issued to...

Pedia Content Solutions Pvt. Ltd, Nov 17, 2021

 Cite this item Email this item Save this item More actions



2021  see 2020

Learning Graphons via Structured Gromov-Wasserstein ...

https://ojs.aaai.org › AAAI › article › view

https://ojs.aaai.org › AAAI › article › viewPDF

by H Xu · 2021 · Cited by 3 — Abstract. We propose a novel and principled method to learn a non- parametric graph model called graphon, which is defined in an infinite-dimensional space ...

Distributionally Safe Path Planning: Wasserstein Safe RRT

P Lathrop, B Boardman… - IEEE Robotics and …, 2021 - ieeexplore.ieee.org

In this paper, we propose a Wasserstein metric-based random path planning algorithm. Wasserstein Safe RRT (W-Safe RRT) provides finite-sample probabilistic guarantees on the safety of a returned path in an uncertain obstacle environment. Vehicle and obstacle states are modeled as distributions based upon state and model observations. We define limits on distributional sampling error so the Wasserstein distance between a vehicle state distribution and obstacle distributions can be bounded. This enables the algorithm to return …

Cited by 3 Related articles


Multi WGAN-GP loss for pathological stain transformation using GAN

AZ Moghadam, H Azarnoush… - 2021 29th Iranian …, 2021 - ieeexplore.ieee.org

… ACGAN-WGAN is nearly the same, with ACGAN-WGAN outperforming others. The best results of our losses and ACGAN-WGAN compared to ACGAN loss can be attributed to WGAN-…

 Cited by 1 Related articles



online,Cover Image  PEER-REVIEW

Distributionally Safe Path Planning: Wasserstein Safe RRT

by Lathrop, Paul; Boardman, Beth; Martinez, Sonia

IEEE robotics and automation letters, 11/2021

In this paper, we propose a Wasserstein metric- based random path planning algorithm. Wasserstein Safe RRT (W-Safe RRT) provides finite-sample probabilistic...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 


LCS graph kernel based on Wasserstein distance in longest common subsequence metric space

J Huang, Z Fang, H Kasai - Signal Processing, 2021 - Elsevier

For graph learning tasks, many existing methods utilize a message-passing mechanism where vertex features are updated iteratively by aggregation of neighbor information. This strategy provides an efficient means for graph features extraction, but obtained features after many iterations might contain too much information from other vertices, and tend to be similar to each other. This makes their representations less expressive. Learning graphs using paths, on the other hand, can be less adversely affected by this problem because it …

Cited by 9 Related articles All 2 versions

2021 see 2020 online Cover Image  PEER-REVIEW OPEN ACCESS

LCS graph kernel based on Wasserstein distance in longest common subsequence metric space

by Huang, Jianming; Fang, Zhongxi; Kasai, Hiroyuki

Signal processing, 12/2021, Volume 189

•Graph classification using Wasserstein graph kernel. •Path sequences comparing over longest common subsequence metric space. •Adjacent point merging...

Article View Article PDF BrowZine PDF Icon

Journal ArticleFull Text Online

View Complete Issue Browse Now BrowZine Book Icon

 

A Wasserstein generative adversarial network-based approach for real-time track irregularity estimation using vehicle dynamic responses

Z Yuan, J Luo, S Zhu, W Zhai - Vehicle System Dynamics, 2021 - Taylor & Francis

Accurate and timely estimation of track irregularities is the foundation for predictive maintenance and high-fidelity dynamics simulation of the railway system. Therefore, it's of great interest to devise a real-time track irregularity estimation method based on dynamic responses of the in-service train. In this paper, a Wasserstein generative adversarial network (WGAN)-based framework is developed to estimate the track irregularities using the vehicle's axle box acceleration (ABA) signal. The proposed WGAN is composed of a …

Cover Image  PEER-REVIEW

A Wasserstein generative adversarial network-based approach for real-time track irregularity...

by Yuan, Zhandong; Luo, Jun; Zhu, Shengyang ; More...

Vehicle system dynamics, , Volume ahead-of-print, Issue ahead-of-print

Accurate and timely estimation of track irregularities is the foundation for predictive maintenance and high-fidelity dynamics simulation of the railway...

Journal ArticleCitation Online

<——2021———2021———1640—— 


 

[PDF] arxiv.org

Multiplier bootstrap for Bures-Wasserstein barycenters

A Kroshnin, V Spokoiny, A Suvorikova - arXiv preprint arXiv:2111.12612, 2021 - arxiv.org

Bures-Wasserstein barycenter is a popular and promising tool in analysis of complex data like graphs, images etc. In many applications the input data are random with an unknown distribution, and uncertainty quantification becomes a crucial issue. This paper offers an approach based on multiplier bootstrap to quantify the error of approximating the true Bures-Wasserstein barycenter $Q_*$ by its empirical counterpart $Q_n$. The main results state the bootstrap validity under general assumptions on the data generating distribution $P$ …

online OPEN ACCESS

Multiplier bootstrap for Bures-Wasserstein barycenters

by Kroshnin, Alexey; Spokoiny, Vladimir; Suvorikova, Alexandra

11/2021

Bures-Wasserstein barycenter is a popular and promising tool in analysis of complex data like graphs, images etc. In many applications the input data are...

Journal ArticleFull Text Online

  All 2 versions

 

Convergence rates for Metropolis-Hastings algorithms in the Wasserstein distance

A Brown, GL Jones - arXiv preprint arXiv:2111.10406, 2021 - arxiv.org

We develop necessary conditions for geometrically fast convergence in the Wasserstein distance for Metropolis-Hastings algorithms on $\mathbb{R}^d$ when the metric used is a norm. This is accomplished through a lower bound which is of independent interest. We show exact convergence expressions in more general Wasserstein distances (e.g. total variation) can be achieved for a large class of distributions by centering an independent Gaussian proposal, that is, matching the optimal points of the proposal and target densities …

 All 2 versions 

online OPEN ACCESS

Convergence rates for Metropolis-Hastings algorithms in the Wasserstein distance

by Brown, Austin; Jones, Galin L

11/2021

We develop necessary conditions for geometrically fast convergence in the Wasserstein distance for Metropolis-Hastings algorithms on $\mathbb{R}^d$ when the...

Journal ArticleFull Text Online

 

 

Model-agnostic bias mitigation methods with regressor ... - arXiv

https://arxiv.org › cs

by A Miroshnikov · 2021 — The post-processing methodology involves reshaping the predictor distributions by balancing the positive and negative bias explanations and ...


online  OPEN ACCESS

Model-agnostic bias mitigation methods with regressor distribution control for Wasserstein-ba...

by Miroshnikov, Alexey; Kotsiopoulos, Konstandinos; Franks, Ryan ; More...

11/2021

This article is a companion paper to our earlier work Miroshnikov et al. (2021) on fairness interpretability, which introduces bias explanations. In the...

Journal ArticleFull Text Online

 

 


online OPEN ACCESS

Bounds in $L^1$ Wasserstein distance on the normal approximation of general M-estimators

by Bachoc, François; Fathi, Max

11/2021

We derive quantitative bounds on the rate of convergence in $L^1$ Wasserstein distance of general M-estimators, with an almost sharp (up to a logarithmic term)...

Journal ArticleFull Text Online

online, OPEN ACCESS

System and method for unsupervised domain adaptation via sliced-wasserstein distance

by HRL Laboratories, LLC

11/2021

Described is a system for unsupervised domain adaptation in an autonomous learning agent. The system adapts a learned model with a set of unlabeled data from a...

PatentAvailable Online

online

US Patent Issued to HRL Laboratories on Nov. 16 for "System and method for unsupervised domain adaptation via sliced-wasserstein...

US Fed News Service, Including US State News, Nov 17, 2021

Newspaper ArticleFull Text Online


arXiv:2112.00423  [pdf, other]  stat.ML  cs.LG
Controlling Wasserstein distances by Kernel norms with application to Compressive Statistical Learning
Authors: Titouan Vayer, Rémi Gribonval
Abstract: Comparing probability distributions is at the crux of many machine learning algorithms. Maximum Mean Discrepancies (MMD) and Optimal Transport distances (OT) are two classes of distances between probability measures that have attracted abundant attention in past years. This paper establishes some conditions under which the Wasserstein distance can be controlled by MMD norms. Our work is motivated…  More
Submitted 1 December, 2021; originally announced December 2021.
All 8 versions 

2021

arXiv:2112.00101  [pdf, other]  cs.LG
Fast Topological Clustering with Wasserstein Distance
Authors: Tananun Songdechakraiwut, Bryan M. Krause, Matthew I. Banks, Kirill V. Nourski, Barry D. Van Veen
Abstract: The topological patterns exhibited by many real-world networks motivate the development of topology-based methods for assessing the similarity of networks. However, extracting topological structure is difficult, especially for large and dense networks whose node degrees range over multiple orders of magnitude. In this paper, we propose a novel and computationally practical topological clustering m…  More
Submitted 30 November, 2021; originally announced December 2021.
All 3 versions
 


-Wasserstein metric tensor, via the Wong--Zakai approximation. We begin our investigation by showing that the stochastic Euler-Lagrange equation, regardless it is deduced from either variational principle or particle dynamics, can be interpreted as the stochastic kineti…  More
Submitted 30 November, 2021; originally announced November 2021.
Comments: 34 pages


2021  see 2019

From the backward Kolmogorov PDE on the Wasserstein space to propagation of chaos for McKean-Vlasov SDEs

de Raynal, PEC and Frikha, N

Dec 2021 | JOURNAL DE MATHEMATIQUES PURES ET APPLIQUEES 156 , pp.1-124

This article is a continuation of our first work [6]. We here establish some new quantitative estimates for propagation of chaos of non-linear stochastic differential equations in the sense of McKean-Vlasov. We obtain explicit error estimates, at the level of the trajectories, at the level of the semi-group and at the level of the densities, for the mean-field approximation by systems of interacting particles under mild regularity assumptions on the coefficients. A first order expansion for the difference between the densities of one particle and its mean-field limit is also established. Our analysis relies on the well-posedness of classical solutions to the backward Kolmogorov partial differential equations defined on the strip [0, T] x R^d x P_2(R^d), P_2(R^d) being the Wasserstein space, that is, the space of probability measures on R^d with a finite second-order moment, and also on the existence and uniqueness of a fundamental solution for the related parabolic linear operator here stated on [0, T] x P_2(R^d). (C) 2021 Elsevier Masson SAS. All rights reserved.

Show more   Free Submitted Article From RepositoryFull Text at Publisher

References  Related record

 Cited by 11 Related articles All 13 versions

2021

Wasserstein generative adversarial network-based approach for real-time track irregularity estimation using vehicle dynamic responses

Yuan, ZD; Luo, J; (...); Zhai, WM

Nov 2021 (Early Access) | VEHICLE SYSTEM DYNAMICS

Enriched Cited References

Accurate and timely estimation of track irregularities is the foundation for predictive maintenance and high-fidelity dynamics simulation of the railway system. Therefore, it's of great interest to devise a real-time track irregularity estimation method based on dynamic responses of the in-service train. In this paper, a Wasserstein generative adversarial network (WGAN)-based framework is developed to estimate the track irregularities using the vehicle's axle box acceleration (ABA) signal. The proposed WGAN is composed of a generator architected by an encoder-decoder structure and a spectral normalised (SN) critic network. The generator is supposed to capture the correlation between ABA signal and track irregularities, and then estimate the irregularities with the measured ABA signal as input; while the critic is supposed to instruct the generator's training by optimising the calculated Wasserstein distance. We combine supervised learning and adversarial learning in the network training process, where the estimation loss and adversarial loss are jointly optimised. Optimising the estimation loss is anticipated to estimate the long-wave track irregularities while optimising the adversarial loss accounts for the short-wave track irregularities. Two numerical cases, namely vertical and spatial vehicle-track coupled dynamics simulation, are implemented to validate the accuracy and reliability of the proposed method.

Show more   View full text   References  Related records 

[PDF] ieee.org

B Wasserstein generative adversarial network-based approach for real-time track irregularity estimation using vehicle dynamic responses

Z Yuan, J Luo, S Zhu, W Zhai - Vehicle System Dynamics, 2021 - Taylor & Francis

… In this paper, a Wasserstein generative … ’s training by optimising the calculated Wasserstein

distance. We combine supervised learning and adversarial learning in the network training …

Related articles
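As background for the adversarial part of this framework, the generic WGAN objective (stated here for orientation, not as the authors' exact loss) has the critic f and generator G solve

\min_G \;\max_{\|f\|_L \le 1}\; \mathbb E_{x\sim P_r}\big[f(x)\big] - \mathbb E_{z\sim p_z}\big[f(G(z))\big],

whose inner maximum equals the Wasserstein-1 distance between the real and generated distributions by Kantorovich-Rubinstein duality; the paper above then adds a supervised estimation loss on top of this adversarial term.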


2021 see 2020

Learning Disentangled Representations with the Wasserstein Autoencoder

Gaujac, B; Feige, I and Barber, D

European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD)

2021 | MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2021: RESEARCH TRACK, PT III 12977 , pp.69-84

Disentangled representation learning has undoubtedly benefited from objective function surgery. However, a delicate balancing act of tuning is still required in order to trade off reconstruction fidelity versus disentanglement. Building on previous successes of penalizing the total correlation in the latent variables, we propose TCWAE (Total Correlation Wasserstein Autoencoder). Working in the WAE paradigm naturally enables the separation of the total-correlation term, thus providing disentanglement control over the learned representation, while offering more flexibility in the choice of reconstruction cost. We propose two variants using different KL estimators and analyse in turn the impact of having different ground cost functions and latent regularization terms. Extensive quantitative comparisons on data sets with known generative factors shows that our methods present competitive results relative to state-of-the-art techniques. We further study the trade off between disentanglement and reconstruction on more-difficult data sets with unknown generative factors, where the flexibility of the WAE paradigm leads to improved reconstructions.

Show more  Free Submitted Article From RepositoryFull Text at Publisher

References  Related articles All 3 versions

Cited by 2 Related articles All 5 versions
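For context, the WAE paradigm referred to here minimizes a reconstruction cost plus a latent divergence; a generic statement of the objective (following Tolstikhin et al., with the weight λ and divergence D_Z as design choices, and TCWAE further separating out a total-correlation term) is

\min_{Q(Z|X),\,G}\; \mathbb E_{P_X}\,\mathbb E_{Q(Z|X)}\big[c\big(X, G(Z)\big)\big] \;+\; \lambda\, D_Z\big(Q_Z,\,P_Z\big).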

<——2021———2021———1650——


2021 see 2019

 MR4346718 Prelim Minh, Hà Quang; Alpha Procrustes metrics between positive definite operators: A unifying formulation for the Bures-Wasserstein and Log-Euclidean/Log-Hilbert-Schmidt metrics. Linear Algebra Appl. 636 (2022), 25–68. 15B48 (47B32 47B65)

Review PDF Clipboard Journal Article


2021

Spoken Keyword Detection Based on ... - ResearchGate

https://www.researchgate.net › publication › 351574657_...

Oct 31, 2021 — We analyze here a particular kind of linguistic network where vertices represent words and edges stand for syntactic relationships between words.


[PDF] neurips.cc

Generalization Bounds for (Wasserstein) Robust Optimization

Y An, R Gao - Advances in Neural Information Processing …, 2021 - proceedings.neurips.cc

Abstract (Distributionally) robust optimization has gained momentum in machine learning

community recently, due to its promising applications in developing generalizable learning

paradigms. In this paper, we derive generalization bounds for robust optimization and …

Cited by 5 Related articles All 2 versions
Generalization Bounds for (Wasserstein) Robust Optimization

Cited by 5 Related articles All 2 versions

slideslive.com › generalization-bounds-for-wasserstein-ro...

Generalization Bounds for (Wasserstein) Robust Optimization. Dec 6, 2021 ... of robust optimization and Wasserstein distributionally robust optimization.

SlidesLive · 

Dec 6, 2021


[HTML] opticsjournal.net

[HTML] 基于 WGAN 的不均衡太赫兹光谱识别

朱荣盛, 沈韬, 刘英莉, 朱艳, 崔向伟 - 光谱学与光谱分析, 2021 - opticsjournal.net

Abstract: The terahertz spectrum of a substance is unique. At present, combining advanced machine learning methods to study terahertz spectrum recognition based on large-scale spectral databases has become a focus of terahertz application research. Owing to the influence of experimental conditions and equipment, it is difficult to collect balanced spectral data for multiple substances, which is, however, the basis for classifying terahertz spectral data. To address this problem …

[Chinese: Unbalanced terahertz spectrum recognition based on WGAN]

[HTML] opticsjournal.net

[HTML] 基于 WGAN 的不均衡太赫兹光谱识别

朱荣盛, 沈韬, 刘英莉, 朱艳, 崔向伟 - 光谱学与光谱分析, 2021 - opticsjournal.net

… terahertz spectrum recognition method based on WGAN (Wasserstein Generative Adversarial

Networks). As a new method of generating data, WGAN uses the generated data under …

 Related articles All 2 versions 

  

基于 WGAN-GP 的风电机组传动链故障诊断

滕伟, 丁显, 史秉帅, 徐进, 袁帅 - 电力系统自动化, 2021 - aeps-info.com

The drivetrain transfers the energy of the wind turbine rotor to the generator; if any drivetrain component, such as a gear or a bearing, develops a fault, the turbine faces a serious safety hazard. Most existing deep-learning-based wind turbine fault diagnosis methods require manually selected target variables, so the identified faults are strongly tied to the chosen variables and the methods generalize poorly. The gradient-penalty Wasserstein …

All 2 versions

[Chinese: Fault diagnosis of wind turbine transmission chain based on WGAN-GP]


2021


面向核磁共振影像超分辨的 WGAN 方法研究

黎玥嵘, 武仲科, 王学松, 申佳丽… - … 师范大学学报 (自然科学版), 2021 - bnujournal.com

For the task of super-resolution reconstruction of magnetic resonance imaging (MRI), a super-resolution WGAN is proposed, with an appropriate network model and loss function. Based on a residual U-net WGAN super-resolution model with back-end upsampling, perceptual, texture and adversarial losses are designed to restore low-resolution MRI …

[Chinese: Research on a WGAN method for super-resolution of magnetic resonance images]


基于电子舌和 WGAN-CNN 模型的小麦贮存年限快速检测

张鑫, 缪楠, 高继勇, 李庆盛, 王志强, 孙霞… - 电子测量与仪器 …, 2021 - cnki.com.cn

To achieve rapid detection of aged wheat with different storage years, a pattern recognition model is proposed that combines a voltammetric electronic tongue with a convolutional neural network (CNN) and a Wasserstein-distance-based generative adversarial network (WGAN). Using the voltammetric electronic tongue, six …

[Chinese: Fast detection of wheat storage age based on an electronic tongue and a WGAN-CNN model]


基于 BMFnet-WGAN 的中药饮片智能甄别

陈雁, 邹立思 - 中国实验方剂学杂志, 2021 - cqvip.com

Objective: To meet the demands of modern decoction piece identification and overcome the strong subjectivity and low efficiency of traditional manual methods, exploring the feasibility of machine vision and deep learning for the intelligent identification of traditional Chinese medicine decoction pieces is of considerable research interest. Methods: An image set of 11,125 decoction-piece images covering 60 categories was built, and a network architecture learning high- and low-frequency features, namely a parallel convolutional network, was designed …

[Chinese: Intelligent identification of traditional Chinese medicine decoction pieces based on BMFnet-WGAN]

 Inferential Wasserstein Generative Adversarial Networks - arXiv

https://arxiv.org › stat


by Y Chen · 2021 — We introduce a novel inferential Wasserstein GAN (iWGAN) model, which is a principled framework to fuse auto-encoders and WGANs. The iWGAN model ...

 All 2 versions

 


Wasserstein GAN with Gradient Penalty(WGAN-GP) - Towards ...

https://towardsdatascience.com › demystified-wasserstei...


In this post we will look into Wasserstein GANs with Gradient Penalty. While the original Wasserstein GAN[2] improves training stability, there still are ..
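The gradient penalty discussed in that post is easy to sketch; the following is a minimal PyTorch-style illustration (the critic module and the real/fake batches are placeholders, and the usual penalty weight is 10), not code taken from the article:

import torch

def gradient_penalty(critic, real, fake):
    # WGAN-GP penalty term; critic is any torch.nn.Module scoring a batch of samples
    # Random interpolation points between real and fake batches (same shape)
    alpha = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    interp = (alpha * real + (1 - alpha) * fake.detach()).requires_grad_(True)
    scores = critic(interp)
    # Gradient of the critic output with respect to the interpolated inputs
    grads = torch.autograd.grad(scores.sum(), interp, create_graph=True)[0]
    grad_norm = grads.flatten(1).norm(2, dim=1)
    # Penalize deviation of the gradient norm from 1; add to the critic loss with weight 10
    return ((grad_norm - 1.0) ** 2).mean()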


Human Motion Generation using Wasserstein GAN - ACM ...

https://dl.acm.org › doi › fullHtml


by A Shiobara · 2021 — 2021. Human Motion Generation using Wasserstein GAN. In 2021 5th International Conference on Digital Signal Processing (ICDSP 2021), February 26-28, 2021, ...

 Code accompanying the paper "Wasserstein GAN"

https://pythonrepo.com › repo › martinarjovsky-Wasser...

Nov 27, 2021 — Dear @martinarjovsky, I am currently working on a project with MRI data. I was using WGAN -GP loss on 2D implementation, with hyperparameters ...


An implementation of the [Hierarchical (Sig-Wasserstein) GAN ...

https://pythonrepo.com › repo › FernandoDeMeer-Hier...


An implementation of the [Hierarchical (Sig-Wasserstein) GAN] algorithm for large dimensional Time Series Generation. Last update: Oct 28, 2021 ...


Wasserstein Distance Using C# and Python - Visual Studio ...

https://visualstudiomagazine.com › articles › 2021/08/16

By James McCaffrey; 08/16/2021 ... This article shows you how to compute the Wasserstein distance and explains why it is often preferable to alternative ...
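In Python, the one-dimensional computation that article describes is available directly in SciPy; a small sanity check of my own (not the article's code):

import numpy as np
from scipy.stats import wasserstein_distance

# Two small empirical samples of equal size
u = np.array([0.0, 1.0, 3.0])
v = np.array([5.0, 6.0, 8.0])
print(wasserstein_distance(u, v))  # 5.0: each unit of mass moves 5 units to the right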

2021

Why the 1-Wasserstein distance is the area between the two ...

https://arxiv.org › math

by M De Angelis · 2021 — [Submitted on 5 Nov 2021] ... We first describe the Wasserstein distance in terms of copulas, and then show that W_1 with the Euclidean distance is attained ...

 All 2 versions

2021  [PDF] arxiv.org

Why the 1-Wasserstein distance is the area between the two marginal CDFs

M De Angelis, A Gray - arXiv preprint arXiv:2111.03570, 2021 - arxiv.org

… Abstract We elucidate why the 1-Wasserstein distance W1 coincides with the area between 

the two marginal cumulative distribution functions (CDFs). We first describe the Wasserstein 

distance in terms of copulas, and then show that W1 with the Euclidean distance is attained …

Cited by 2 Related articles All 2 versions
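The identity the paper elucidates can be stated compactly in the standard one-dimensional form, with F_\mu, F_\nu the marginal CDFs and F^{-1} their quantile functions:

W_1(\mu,\nu) \;=\; \int_{\mathbb R} \big|F_\mu(t)-F_\nu(t)\big|\,dt \;=\; \int_0^1 \big|F_\mu^{-1}(u)-F_\nu^{-1}(u)\big|\,du .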

Smooth p-Wasserstein Distance: Structure, Empirical ...

http://proceedings.mlr.press › ...

by S Nietert · 2021 · Cited by 3 — Proceedings of the 38th International Conference on Machine Learning, PMLR 139:8172-8183, 2021. Abstract. Discrepancy measures between probability distributions ...

Smooth p-Wasserstein Distance: Structure, Empirical ...

slideslive.com › smooth-pwasserstein-distance-structure-e...


SlidesLive · 

Jul 19, 2021


 Variance Minimization in the Wasserstein Space for Invariant ...

https://arxiv.org › cs


by G Martinet · 2021 — This method, invariant causal prediction (ICP), has a substantial computational defect -- the runtime scales exponentially with the number of ...


Wasserstein Fellowship Applicant Information

https://hls.harvard.edu › wasserstein-fellows-program


The deadline for the 2021-2022 application cycle is April 19, 2021. Please note, we are not hosting a Fellow-in-Residence for the 2021-2022 application cycle.


Wasserstein Barycenter for Multi-Source Domain Adaptation

https://openaccess.thecvf.com › CVPR2021 › html › M...


by EF Montesuma · 2021 — These CVPR 2021 papers are the Open Access versions, provided by the Computer Vision ... This method relies on the barycenter on Wasserstein spaces for ...

Cited by 7 Related articles All 4 versions

<——2021———2021———1670—— 


2021  see 2020

Quantum Statistical Learning via Quantum Wasserstein ...

https://link.springer.com › article


by S Becker · 2021 · Cited by 2 — In this article, we introduce a new approach towards the statistical learning problem.

Quantum semi-supervised generative adversarial network for ...

https://www.nature.com › scientific reports › articles


by K Nakaji · 2021 · 1 — Information Fusion (2021). 50. Dallaire-Demers, P.-L. & Killoran, N. Quantum generative adversarial networks. Phys. Rev.


arXiv:2112.04763  [pdf, ps, other]  math.MG math.OC
Obstructions to extension of Wasserstein distances for variable masses
Authors: Luca Lombardini, Francesco Rossi
Abstract: We study the possibility of defining a distance on the whole space of measures, with the property that the distance between two measures having the same mass is the Wasserstein distance, up to a scaling factor. We prove that, under very weak and natural conditions, if the base space is unbounded, then the scaling factor must be constant, independently of the mass. Moreover, no such distance can ex…
More
Submitted 9 December, 2021; originally announced December 2021.
MSC Class: 28A33; 49Q22 


arXiv:2112.03152  [pdf, other]  stat.CO cs.LG stat.ME
Bounding Wasserstein distance with couplings
Authors: Niloy Biswas, Lester Mackey
Abstract: Markov chain Monte Carlo (MCMC) provides asymptotically consistent estimates of intractable posterior expectations as the number of iterations tends to infinity. However, in large data applications, MCMC can be computationally expensive per iteration. This has catalyzed interest in sampling methods such as approximate MCMC, which trade off asymptotic consistency for improved computational speed. I…
More
Submitted 6 December, 2021; originally announced December 2021.
Comments: 53 pages, 10 figures 

Related articles All 4 versions
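The basic observation behind such coupling-based bounds (stated generically, not as the paper's specific estimator): since the Wasserstein distance is an infimum over couplings, any explicitly constructed coupling (X, Y) of P and Q gives a computable upper bound,

W_p(P,Q) \;=\; \inf_{\gamma\in\Pi(P,Q)} \Big(\mathbb E_{(X,Y)\sim\gamma}\, d(X,Y)^p\Big)^{1/p} \;\le\; \big(\mathbb E\, d(X,Y)^p\big)^{1/p}.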

2021

arXiv:2112.02424  [pdf, other]  cs.LG
Variational Wasserstein gradient flow
Authors: Jiaojiao Fan, Amirhossein Taghvaei, Yongxin Chen
Abstract: The gradient flow of a function over the space of probability densities with respect to the Wasserstein metric often exhibits nice properties and has been utilized in several machine learning applications. The standard approach to compute the Wasserstein gradient flow is the finite difference which discretizes the underlying space over a grid, and is not scalable. In this work, we propose a scalab…
More
Submitted 4 December, 2021; originally announced December 2021. 

online OPEN ACCESS

Variational Wasserstein gradient flow

by Fan, Jiaojiao; Taghvaei, Amirhossein; Chen, Yongxin

12/2021

The gradient flow of a function over the space of probability densities with respect to the Wasserstein metric often exhibits nice properties and has been...

Journal ArticleFull Text Online
Cited by 10
Related articles All 3 versions
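For orientation, the Wasserstein gradient flow of a functional F is usually discretized by the JKO scheme (standard background, independent of the variational solver proposed above): with step size τ,

\rho_{k+1} \;=\; \operatorname*{arg\,min}_{\rho}\; F(\rho) \;+\; \frac{1}{2\tau}\, W_2^2(\rho,\rho_k),

and the grid-based finite-difference approach mentioned in the abstract solves this minimization over a spatial discretization, which is what limits its scalability.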

arXiv:2112.00423  [pdf, other]  stat.ML cs.LG
Controlling Wasserstein distances by Kernel norms with application to Compressive Statistical Learning
Authors: Titouan Vayer, Rémi Gribonval
Abstract: Comparing probability distributions is at the crux of many machine learning algorithms. Maximum Mean Discrepancies (MMD) and Optimal Transport distances (OT) are two classes of distances between probability measures that have attracted abundant attention in past years. This paper establishes some conditions under which the Wasserstein distance can be controlled by MMD norms. Our work is motivated…
More
Submitted 1 December, 2021; originally announced December 2021.
online  OPEN ACCESS

Controlling Wasserstein distances by Kernel norms with application to Compressive Statistical Learning

by Vayer, Titouan; Gribonval, Rémi

12/2021

Comparing probability distributions is at the crux of many machine learning algorithms. Maximum Mean Discrepancies (MMD) and Optimal Transport distances (OT)...

Journal ArticleFull Text Online

Related articles All 8 versions

MR4347335 Prelim He, Ruiqiang; Feng, Xiangchu; Zhu, Xiaolong; Huang, Hua; Wei, Bingzhe; RWRM: Residual Wasserstein regularization model for image restoration. Inverse Probl. Imaging 15 (2021), no. 6, 1307–.

Review PDF Clipboard Journal Article

[HTML] aimsciences.org

[HTML] RWRM: Residual Wasserstein regularization model for image restoration

R He, X Feng, X Zhu, H Huang… - Inverse Problems & …, 2021 - aimsciences.org

Existing image restoration methods mostly make full use of various image prior information.

However, they rarely exploit the potential of residual histograms, especially their role as

ensemble regularization constraint. In this paper, we propose a residual Wasserstein  …

 Related articles All 2 versions 


Wasserstein Generative Adversarial Networks for Realistic Traffic Sign Image Generation

C Dewi, RC Chen, YT Liu - Asian Conference on Intelligent Information …, 2021 - Springer

Recently, Convolutional neural networks (CNN) with properly annotated training data and

results will obtain the best traffic sign detection (TSD) and traffic sign recognition (TSR). The

efficiency of the whole system depends on the data collection, based on neural networks …

Cited by 2 Related articles All 2 versions

Wasserstein Generative Adversarial Networks for Realistic Traffic Sign Image Generation


2021  see 2020  online Cover Image

Wasserstein convergence rate for empirical ... - Science Direct

https://www.sciencedirect.com › science › article › pii

by FY Wang · 2021 · Cited by 3 — Let X t be the (reflecting) diffusion process generated by L Δ + V on a complete connected Riemannian manifold M possibly with a ...

Wasserstein convergence rate for empirical measures on noncompact manifolds

by Wang, Feng-Yu

Stochastic processes and their applications, 02/2022, Volume 144

Let Xt be the (reflecting) diffusion process generated by L = Δ + V on a complete connected Riemannian manifold M, possibly with a boundary ∂M, where V ∈ C1(M), such...

Article PDF Download PDF BrowZine PDF Icon

Journal ArticleFull Text Online

<——2021———2021———1680—— 


Catalano, Marta; Lijoi, Antonio; Prünster, Igor

Measuring dependence in the Wasserstein distance for Bayesian nonparametric models. (English) Zbl 07438274

Ann. Stat. 49, No. 5, 2916-2947 (2021).

MSC:  62C10 60G09 60G57

Cited by 5 Related articles All 6 versions

[PDF] carloalberto.org

[BOOK] Measuring dependence in the Wasserstein distance for Bayesian nonparametric models

M Catalano, A Lijoi, I Prünster - 2021 - carloalberto.org

The proposal and study of dependent Bayesian nonparametric models has been one of the 

most active research lines in the last two decades, with random vectors of measures 

representing a natural and popular tool to define them. Nonetheless a principled approach to …

Cited by 6 Related articles All 6 versions

Anti-confrontational Domain Data Generation Based on Improved WGAN

H Luo, X Chen, J Dong - 2021 International Symposium on …, 2021 - ieeexplore.ieee.org

The Domain Generate Algorithm (DGA) is used by a large number of botnets to evade

detection. At present, the mainstream machine learning detection technology not only lacks

the training data with evolutionary value, but also has the security problem that the model …

 All 2 versions

Conference Paper  Citation/Abstract

Anti-confrontational Domain Data Generation Based on Improved WGAN
Luo, Haibo; Chen, Xingchi; Dong, Jianhu.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).
Abstract/Details
  Show Abstract 


2021  see 2020   [PDF] mlr.press

First-Order Methods for Wasserstein Distributionally Robust MDP

JG Clement, C Kroer - International Conference on Machine …, 2021 - proceedings.mlr.press

Markov decision processes (MDPs) are known to be sensitive to parameter specification.

Distributionally robust MDPs alleviate this issue by allowing for ambiguity sets which

give a set of possible distributions over parameter sets. The goal is to find an optimal policy …

 All 2 versions 


[PDF] arxiv.org

Wasserstein distributionally robust motion control for collision avoidance using conditional value-at-risk

A Hakobyan, I Yang - IEEE Transactions on Robotics, 2021 - ieeexplore.ieee.org

In this article, a risk-aware motion control scheme is considered for mobile robots to avoid

randomly moving obstacles when the true probability distribution of uncertainty is unknown.

We propose a novel model-predictive control (MPC) method for limiting the risk of unsafety …

 Cited by 8 Related articles All 2 versions


[PDF] arxiv.org

Statistical Analysis of Wasserstein Distributionally Robust Estimators

J Blanchet, K Murthy… - Tutorials in Operations …, 2021 - pubsonline.informs.org

We consider statistical methods that invoke a min-max distributionally robust formulation to

extract good out-of-sample performance in data-driven optimization and learning problems.

Acknowledging the distributional uncertainty in learning from limited samples, the min-max …

  All 3 versions
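The min-max formulation these works analyze has the generic form (a standard Wasserstein DRO template, with \hat P_N the empirical distribution of the N samples and ε the ball radius):

\min_{\theta} \;\sup_{Q\,:\,W_p(Q,\hat P_N)\le \varepsilon}\; \mathbb E_{\xi\sim Q}\big[\ell(\theta,\xi)\big].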


2021


[PDF] arxiv.org

Distributionally robust inverse covariance estimation: The Wasserstein shrinkage estimator

VA Nguyen, D Kuhn… - Operations …, 2021 - pubsonline.informs.org

We introduce a distributionally robust maximum likelihood estimation model with a

Wasserstein ambiguity set to infer the inverse covariance matrix of a p-dimensional Gaussian

random vector from n independent samples. The proposed model minimizes the worst case …

 Cited by 32 Related articles All 8 versions


A data-driven scheduling model of virtual power plant using Wasserstein distributionally robust optimization

H Liu, J Qiu, J Zhao - International Journal of Electrical Power & Energy …, 2021 - Elsevier

Distributed energy resources (DER) can be efficiently aggregated by aggregators to sell

excessive electricity to spot market in the form of Virtual Power Plant (VPP). The aggregator

schedules DER within VPP to participate in day-ahead market for maximizing its profits while …

 All 2 versions


[PDF] arxiv.org

Distributionally robust chance-constrained programs with right-hand side uncertainty under Wasserstein ambiguity

N Ho-Nguyen, F Kılınç-Karzan, S Küçükyavuz… - Mathematical …, 2021 - Springer

We consider exact deterministic mixed-integer programming (MIP) reformulations of

distributionally robust chance-constrained programs (DR-CCP) with random right-hand

sides over Wasserstein ambiguity sets. The existing MIP formulations are known to have …

 3 Related articles All 7 versions


[PDF] arxiv.org

Second-order Conic Programming Approach for Wasserstein Distributionally Robust Two-stage Linear Programs

Z Wang, K You, S Song, Y Zhang - IEEE Transactions on …, 2021 - ieeexplore.ieee.org

This article proposes a second-order conic programming (SOCP) approach to solve

distributionally robust two-stage linear programs over 1-Wasserstein balls. We start from the

case with distribution uncertainty only in the objective function and then explore the case …

 Cited by 2 Related articles All 4 versions


[PDF] optimization-online.org

Data-driven distributionally robust chance-constrained optimization with Wasserstein metric

R Ji, MA Lejeune - Journal of Global Optimization, 2021 - Springer

We study distributionally robust chance-constrained programming (DRCCP) optimization

problems with data-driven Wasserstein ambiguity sets. The proposed algorithmic and

reformulation framework applies to all types of distributionally robust chance-constrained …

 Cited by 22 Related articles All 6 versions
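In the chance-constrained variants collected here, the Wasserstein ball enters the constraint rather than the objective; schematically (a generic DRCCP form, not any one paper's exact model):

\inf_{Q\,:\,W_p(Q,\hat P_N)\le\varepsilon}\; Q\big(\xi:\; g(x,\xi)\le 0\big) \;\ge\; 1-\alpha .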

<——2021———2021———1690—— 



[PDF] umons.ac.be

[PDF] Enhanced Wasserstein Distributionally Robust OPF With Dependence Structure and Support Information

A Arrigo, J Kazempour, Z De Grève… - IEEE PowerTech …, 2021 - applications.umons.ac.be

This paper goes beyond the current state of the art related to Wasserstein distributionally

robust optimal power flow problems, by adding dependence structure (correlation) and

support information. In view of the space-time dependencies pertaining to the stochastic …

  Related articles All 4 versions 


[PDF] snu.ac.kr

A data-driven distributionally robust newsvendor model with a Wasserstein ambiguity set

S Lee, H Kim, I Moon - Journal of the Operational Research Society, 2021 - Taylor & Francis

In this paper, we derive a closed-form solution and an explicit characterization of the worst-

case distribution for the data-driven distributionally robust newsvendor model with an

ambiguity set based on the Wasserstein distance of order p ∈ [1, ∞). We also consider the

 Cited by 7 Related articles All 2 versions


Data-driven Wasserstein distributionally robust optimization for refinery planning under uncertainty

J Zhao, L Zhao, W He - … 2021–47th Annual Conference of the …, 2021 - ieeexplore.ieee.org

This paper addresses the issue of refinery production planning under uncertainty. A data-

driven Wasserstein distributionally robust optimization approach is proposed to optimize

refinery planning operations. The uncertainties of product prices are modeled as an …

Data-driven Wasserstein distributionally robust optimization for biomass with agricultural waste-to-energy network design under uncertainty

C Ning, F You - Applied Energy, 2019 - Elsevier

This paper addresses the problem of biomass with agricultural waste-to-energy network

design under uncertainty. We propose a novel data-driven Wasserstein distributionally

robust optimization model for hedging against uncertainty in the optimal network design …

 Cited by 27 Related articles All 6 versions



Distributionally Robust Resilient Operation of Integrated Energy Systems Using Moment and Wasserstein Metric for Contingencies

Y Zhou, Z Wei, M Shahidehpour… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org

Extreme weather events pose a serious threat to energy distribution systems. We propose a

distributionally robust optimization model for the resilient operation of the integrated

electricity and heat energy distribution systems in extreme weather events. We develop a …

 Related articles

[CITATION] Distributionally Robust Resilient Operation of Integrated Energy Systems Using Moment and Wasserstein Metric for Contingencies

Y Zhou, Z Wei, M Shahidehpour… - IEEE Transactions on …, 2021 - ui.adsabs.harvard.edu

Distributionally Robust Resilient Operation of Integrated Energy Systems Using Moment and

Wasserstein Metric for Contingencies - NASA/ADS … Distributionally Robust Resilient Operation

of Integrated Energy Systems Using Moment and Wasserstein Metric for Contingencies …

 



… for a Dual-channel Retailer with Service Level Requirements and Demand Uncertainties: A Wasserstein Metric-based Distributionally Robust Optimization Approach

Y Sun, R Qiu, M Sun - Computers & Operations Research, 2021 - Elsevier

This study explores a dual-channel management problem of a retailer selling multiple

products to customers through a traditional retail channel and an online channel to

maximize expected profit. The prices and order quantities of both the online and the retail …

 All 2 versions


2021


Fast Wasserstein-distance-based distributionally robust chance-constrained power dispatch for multi-zone HVAC systems

G Chen, H Zhang, H Hui, Y Song - IEEE Transactions on Smart …, 2021 - ieeexplore.ieee.org

Heating, ventilation, and air-conditioning (HVAC) systems play an increasingly important

role in the construction of smart cities because of their high energy consumption and

available operational flexibility for power systems. To enhance energy efficiency and utilize …

 Related articles


Distributionally Safe Path Planning: Wasserstein Safe RRT

P Lathrop, B Boardman… - IEEE Robotics and …, 2021 - ieeexplore.ieee.org

In this paper, we propose a Wasserstein metric-based random path planning algorithm.

Wasserstein Safe RRT (W-Safe RRT) provides finite-sample probabilistic guarantees on the

safety of a returned path in an uncertain obstacle environment. Vehicle and obstacle states …

 

2021  see 2019

[HTML] sciencedirect.com

[HTML] Wasserstein distance-based distributionally robust optimal scheduling in rural microgrid considering the coordinated interaction among source-grid-load …

C Chen, J Xing, Q Li, S Liu, J Ma, J Chen, L Han, W Qiu… - Energy Reports, 2021 - Elsevier

The microgrid (MG) is an effective way to alleviate the impact of the large-scale penetration

of distributed generations. Due to the seasonal characteristics of rural areas, the load curve

of the rural MG is different from the urban MG. Besides, the economy and stability of MG's …

 Related articles All 3 versions



[PDF] arxiv.org

Data-driven distributionally robust MPC using the Wasserstein metric

Z Zhong, EA del Rio-Chanona… - arXiv preprint arXiv …, 2021 - arxiv.org

A data-driven MPC scheme is proposed to safely control constrained stochastic linear

systems using distributionally robust optimization. Distributionally robust constraints based

on the Wasserstein metric are imposed to bound the state constraint violations in the …

  Related articles All 2 versions 

<——2021———2021———1700——



[PDF] arxiv.org

Distributionally Robust Prescriptive Analytics with Wasserstein Distance

T Wang, N Chen, C Wang - arXiv preprint arXiv:2106.05724, 2021 - arxiv.org

In prescriptive analytics, the decision-maker observes historical samples of $(X, Y) $, where

$ Y $ is the uncertain problem parameter and $ X $ is the concurrent covariate, without

knowing the joint distribution. Given an additional covariate observation $ x $, the goal is to …

 Related articles All 2 versions 


[PDF] arxiv.org

Distributionally robust tail bounds based on Wasserstein distance and f-divergence

C Birghila, M Aigner, S Engelke - arXiv preprint arXiv:2106.06266, 2021 - arxiv.org

In this work, we provide robust bounds on the tail probabilities and the tail index of heavy-

tailed distributions in the context of model misspecification. They are defined as the optimal

value when computing the worst-case tail behavior over all models within some …

  Related articles All 2 versions 


Sensitivity analysis of Wasserstein distributionally robust optimization problems

D Bartl, S Drapeau, J Obloj, J Wiesel - Proceedings of the Royal …, 2021 - ora.ox.ac.uk

We consider sensitivity of a generic stochastic optimization problem to model uncertainty.

We take a non-parametric approach and capture model uncertainty using Wasserstein balls

around the postulated model. We provide explicit formulae for the first order correction to …

 


[PDF] arxiv.org

Distributionally robust second-order stochastic dominance constrained optimization with Wasserstein distance

Y Mei, J Liu, Z Chen - arXiv preprint arXiv:2101.00838, 2021 - arxiv.org

We consider a distributionally robust second-order stochastic dominance constrained

optimization problem, where the true distribution of the uncertain parameters is ambiguous.

The ambiguity set contains all probability distributions close to the empirical distribution …

 Related articles All 4 versions 


[PDF] arxiv.org

Distributionally Robust Chance-Constrained Programmings for Non-Linear Uncertainties with Wasserstein Distance

Y Gu, Y Wang - arXiv preprint arXiv:2103.04790, 2021 - arxiv.org

In this paper, we develop an exact reformulation and a deterministic approximation for

distributionally robust chance-constrained programmings (DRCCPs) with convex

non-linear uncertain constraints under data-driven Wasserstein ambiguity sets. It is shown …

 Related articles All 2 versions 


2021


[PDF] arxiv.org

Data-Driven Distributionally Robust Surgery Planning in Flexible Operating Rooms Over a Wasserstein Ambiguity

KS Shehadeh - arXiv preprint arXiv:2103.15221, 2021 - arxiv.org

We study elective surgery planning in flexible operating rooms where emergency patients

are accommodated in the existing elective surgery schedule. Probability distributions of

surgery durations are unknown, and only a small set of historical realizations is available. To …

 Related articles All 2 versions 


Relaxed Wasserstein with applications to GANs

X Guo, J Hong, T Lin, N Yang - ICASSP 2021-2021 IEEE …, 2021 - ieeexplore.ieee.org

Wasserstein Generative Adversarial Networks (WGANs) provide a versatile class of models,

which have attracted great attention in various applications. However, this framework has

two main drawbacks:(i) Wasserstein-1 (or Earth-Mover) distance is restrictive such that

WGANs cannot always fit data geometry well;(ii) It is difficult to achieve fast training of

WGANs. In this paper, we propose a new class of Relaxed Wasserstein (RW) distances by

generalizing Wasserstein-1 distance with Bregman cost functions. We show that RW …

Cited by 28 Related articles All 6 versions


2021  [PDF] arxiv.org

Measuring association with Wasserstein distances

J Wiesel - arXiv preprint arXiv:2102.00356, 2021 - arxiv.org

Let $\pi\in\Pi (\mu,\nu) $ be a coupling between two probability measures $\mu $ and $\nu $

on a Polish space. In this article we propose and study a class of nonparametric measures of

association between $\mu $ and $\nu $. The analysis is based on the Wasserstein distance …

 Cited by 3 Related articles All 2 versions 


2021  [PDF] arxiv.org

Rate of convergence for particles approximation of PDEs in Wasserstein space

M GermainH PhamX Warin - arXiv preprint arXiv:2103.00837, 2021 - arxiv.org

We prove a rate of convergence of order 1/N for the N-particle approximation of a second-

order partial differential equation in the space of probability measures, like the Master

equation or Bellman equation of mean-field control problem under common noise. The proof …

 Cited by 2 Related articles All 16 versions 


2021

Establishment and extension of digital aggregate database using auxiliary classifier Wasserstein GAN with gradient penalty

C Wang, F Li, Q Liu, H Wang, P Benmoussa… - … and Building Materials, 2021 - Elsevier

For road construction, the morphological characteristics of coarse aggregates such as

angularity and sphericity have a considerable influence on asphalt pavement performance.

In traditional aggregate simulation processes, images of real coarse grains are captured …

 All 3 versions

<——2021———2021———1710——


2021  [PDF] arxiv.org

Obstructions to extension of Wasserstein distances for variable masses

L Lombardini, F Rossi - arXiv preprint arXiv:2112.04763, 2021 - arxiv.org

We study the possibility of defining a distance on the whole space of measures, with the

property that the distance between two measures having the same mass is the Wasserstein

distance, up to a scaling factor. We prove that, under very weak and natural conditions, if the …

 


2021  [PDF] arxiv.org

On Number of Particles in Coalescing-Fragmentating Wasserstein Dynamics

V Konarovskyi - arXiv preprint arXiv:2102.10943, 2021 - arxiv.org

We consider the system of sticky-reflected Brownian particles on the real line proposed in

[arXiv: 1711.03011]. The model is a modification of the Howitt-Warren flow but now the

diffusion rate of particles is inversely proportional to the mass which they transfer. It is known …

 Related articles All 4 versions 


2021  [PDF] archives-ouvertes.fr

Measuring the Irregularity of Vector-Valued Morphological Operators using Wasserstein Metric

ME Valle, S Francisco, MA Granero… - … Conference on Discrete …, 2021 - Springer

Mathematical morphology is a useful theory of nonlinear operators widely used for image

processing and analysis. Despite the successful application of morphological operators for

binary and gray-scale images, extending them to vector-valued images is not straightforward …

 Related articles All 6 versions


[HTML] Prediction and analysis of multiple protein lysine modified sites based on conditional wasserstein generative adversarial networks

Y Yang, H Wang, W Li, X Wang… - BMC …, 2021 - bmcbioinformatics.biomedcentral …

Protein post-translational modification (PTM) is a key issue to investigate the mechanism of

protein's function. With the rapid development of proteomics technology, a large amount of

protein sequence data has been generated, which highlights the importance of the in-depth

study and analysis of PTMs in proteins. We proposed a new multi-classification machine

learning pipeline MultiLyGAN to identity seven types of lysine modified sites. Using eight

different sequential and five structural construction methods, 1497 valid features were …

Cited by 2 Related articles All 10 versions 

online Cover Image

 PEER-REVIEW OPEN ACCESS

Prediction and analysis of multiple protein lysine modified sites based on conditional wasserstein generative...

by Yang, Yingxi; Wang, Hui; Li, Wen ; More...

BMC bioinformatics, 03/2021, Volume 22, Issue 1

Protein post-translational modification (PTM) is a key issue to investigate the mechanism of protein's function. With the rapid development of proteomics...

View NowPDF

Journal ArticleFull Text Online

View in Context Browse Journal 
Cited by 6
 Related articles All 14 versions 

2021


[v2] Wed, 5 May 2021 22:21:22 UTC (58 KB)

Approximation rate in Wasserstein distance of probability measures on the real line by deterministic empirical...

by Bencheikh, O; Jourdain, B

Journal of approximation theory, 12/2021

We are interested in the approximation in Wasserstein distance with index ρ≥1 of a probability measure μ on the real line with finite moment of order ρ by the...

View NowPDF

Journal ArticleFull Text Online

View in Context Browse Journal

Cited by 3 Related articles All 12 versions 

Closed-form Expressions for Maximum Mean ... - ResearchGate

https://www.researchgate.net › ... › Training

Oct 14, 2021 — Download Citation | Closed-form Expressions for Maximum Mean Discrepancy with Applications to Wasserstein Auto-Encoders | The Maximum Mean ...

Cover Image

Closed‐form expressions for maximum mean discrepancy with applications to Wasserstein auto‐encoders

by Rustamov, Raif M

Stat (International Statistical Institute), 12/2021, Volume 10, Issue 1

Journal ArticleCitation Online

  Related articles

 

Wasserstein Bounds in the CLT of the MLE for the Drift ... - MDPI

https://www.mdpi.com › pdf

PDF

by K Es-Sebaiy · 2021 — Article. Wasserstein Bounds in the CLT of the MLE for the Drift. Coefficient of a Stochastic Partial Differential Equation.

online  Cover Image  PEER-REVIEW

Wasserstein Bounds in the CLT of the MLE for the Drift Coefficient of a Stochastic Partial Differential Equation

by Es-Sebaiy, Khalifa; Al-Foraih, Mishari; Alazemi, Fares

Fractal and Fractional, 10/2021, Volume 5, Issue 4

In this paper, we are interested in the rate of convergence for the central limit theorem of the maximum likelihood estimator of the drift coefficient for a...

View NowPDF

Journal ArticleFull Text Online

View in Context Browse Journal

Cited by 1 Related articles All 5 versions

arXiv:2112.06384  [pdf, other]  cs.LG  stat.ML
WOOD: Wasserstein-based Out-of-Distribution Detection
Authors: Yinan Wang, Wenbo Sun, Jionghua "Judy" Jin, Zhenyu "James" Kong, Xiaowei Yue
Abstract: The training and test data for deep-neural-network-based classifiers are usually assumed to be sampled from the same distribution. When part of the test samples are drawn from a distribution that is sufficiently far away from that of the training samples (a.k.a. out-of-distribution (OOD) samples), the trained neural network has a tendency to make high confidence predictions for these OOD samples.… 
More
Submitted 12 December, 2021; originally announced December 2021.

WOOD: Wasserstein-based Out-of-Distribution Detection

by Wang, Yinan; Sun, Wenbo; Jin, Jionghua "Judy" ; More...

12/2021

The training and test data for deep-neural-network-based classifiers are usually assumed to be sampled from the same distribution. When part of the test...

Journal Article Full Text Online

All 2 versions 

arXiv:2112.06292  [pdf]  math.OC  cs.AI  cs.LG 

Gamifying optimization: a Wasserstein distance-based analysis of human search
Authors: Antonio Candelieri, Andrea Ponti, Francesco Archetti
Abstract: The main objective of this paper is to outline a theoretical framework to characterise humans' decision-making strategies under uncertainty, in particular active learning in a black-box optimization task and trading-off between information gathering (exploration) and reward seeking (exploitation). Humans' decisions making according to these two objectives can be modelled in terms of Pareto rationa…  More
Submitted 12 December, 2021; originally announced December 2021.
Comments: 49 pages, 39 figures. arXiv admin note: substantial text overlap with arXiv:2102.07647
MSC Class: 62F15; ACM Class: G.3; I.2.0

All 2 versions 

<——2021———2021———1720——  



arXiv:2112.05872  [pdf, other]  cs.LG  cs.CV
SLOSH: Set LOcality Sensitive Hashing via Sliced-Wasserstein Embeddings
Authors: Yuzhe Lu, Xinran Liu, Andrea Soltoggio, Soheil Kolouri
Abstract: Learning from set-structured data is an essential problem with many applications in machine learning and computer vision. This paper focuses on non-parametric and data-independent learning from set-structured data using approximate nearest neighbor (ANN) solutions, particularly locality-sensitive hashing. We consider the problem of set retrieval from an input set query. Such retrieval problem requ… 
More
Submitted 10 December, 2021; originally announced December 2021.
12/2021

Learning from set-structured data is an essential problem with many applications in machine learning and computer vision. This paper focuses on non-parametric...

Journal Article Full Text Online

All 2 versions
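The sliced-Wasserstein embedding used by SLOSH builds on the sliced distance itself, which is cheap to approximate by Monte Carlo over random projections; a small NumPy sketch of that distance for equal-size point clouds (my own illustration, not the authors' code):

import numpy as np

def sliced_w1(X, Y, n_projections=200, seed=0):
    # X, Y: arrays of shape (n, d) with the same n; returns a Monte Carlo estimate of SW1(X, Y)
    rng = np.random.default_rng(seed)
    thetas = rng.normal(size=(n_projections, X.shape[1]))
    thetas /= np.linalg.norm(thetas, axis=1, keepdims=True)  # unit projection directions
    xs = np.sort(X @ thetas.T, axis=0)   # projected and sorted, shape (n, n_projections)
    ys = np.sort(Y @ thetas.T, axis=0)
    return np.abs(xs - ys).mean()        # average 1-D W1 over the random directions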

arXiv:2112.04763  [pdf, ps, other]  math.MG  math.OC
Obstructions to extension of Wasserstein distances for variable masses
Authors: Luca Lombardini, Francesco Rossi
Abstract: We study the possibility of defining a distance on the whole space of measures, with the property that the distance between two measures having the same mass is the Wasserstein distance, up to a scaling factor. We prove that, under very weak and natural conditions, if the base space is unbounded, then the scaling factor must be constant, independently of the mass. Moreover, no such distance can ex…  More
Submitted 9 December, 2021; originally announced December 2021.
MSC Class: 28A33; 49Q22
All 3 versions
 

 

arXiv:2112.03152  [pdf, other]  stat.CO  cs.LG  stat.ME
Bounding Wasserstein distance with couplings
Authors: Niloy Biswas, Lester Mackey
Abstract: Markov chain Monte Carlo (MCMC) provides asymptotically consistent estimates of intractable posterior expectations as the number of iterations tends to infinity. However, in large data applications, MCMC can be computationally expensive per iteration. This has catalyzed interest in sampling methods such as approximate MCMC, which trade off asymptotic consistency for improved computational speed. I…  More
Submitted 6 December, 2021; originally announced December 2021.
Comments: 53 pages, 10 figures
All 4 versions 


2021 see 2020
Wasserstein Dropout
by Sicking, Joachim; Akila, Maram; Pintz, Maximilian ; More...
arXiv.org, 12/2021
Despite of its importance for safe machine learning, uncertainty quantification for neural networks is far from being solved. State-of-the-art approaches to...
Paper  Full Text Online

2021

A new data generation approach with modified Wasserstein auto-encoder for rotating machinery fault diagnosis with limited fault data

K Zhao, H Jiang, C Liu, Y Wang, K Zhu - Knowledge-Based Systems, 2021 - Elsevier

Limited fault data restrict deep learning methods in solving fault diagnosis problems in

rotating machinery. Using limited fault data to generate massive data with similar

distributions is an important premise in applying deep learning methods to solve these …


2021


[HTML] biomedcentral.com

[HTML] Prediction and analysis of multiple protein lysine modified sites based on conditional wasserstein generative adversarial networks

Y Yang, H Wang, W Li, X Wang… - BMC …, 2021 - bmcbioinformatics.biomedcentral …

Protein post-translational modification (PTM) is a key issue to investigate the mechanism of

protein's function. With the rapid development of proteomics technology, a large amount of

protein sequence data has been generated, which highlights the importance of the in-depth …

 Cited by 2 Related articles All 10 versions 


2021  [PDF] arxiv.org

Human Motion Prediction Using Manifold-Aware Wasserstein GAN

B Chopin, N Otberdout, M Daoudi, A Bartolo - arXiv preprint arXiv …, 2021 - arxiv.org

Human motion prediction aims to forecast future human poses given a prior pose sequence.

The discontinuity of the predicted motion and the performance deterioration in long-term

horizons are still the main challenges encountered in current literature. In this work, we …

  Related articles All 2 versions 


2021  [PDF] arxiv.org

A continuation multiple shooting method for Wasserstein geodesic equation

J Cui, L Dieci, H Zhou - arXiv preprint arXiv:2105.09502, 2021 - arxiv.org

In this paper, we propose a numerical method to solve the classic $ L^ 2$-optimal transport

problem. Our algorithm is based on use of multiple shooting, in combination with a

continuation procedure, to solve the boundary value problem associated to the transport …

 Related articles All 2 versions 

2021

Human Motion Generation using Wasserstein GAN

A Shiobara, M Murakami - 2021 5th International Conference on Digital …, 2021 - dl.acm.org

Human motion control, edit, and synthesis are important tasks to create 3D computer

graphics video games or movies, because some characters act like humans in most of them.

Our aim is to construct a system which can generate various natural character motions. We …



2021  [PDF] arxiv.org

Continuous wasserstein-barycenter estimation without minimax optimization

A Korotin, L Li, J Solomon, E Burnaev - arXiv preprint arXiv:2102.01752, 2021 - arxiv.org

Wasserstein barycenters provide a geometric notion of the weighted average of probability

measures based on optimal transport. In this paper, we present a scalable algorithm to

compute Wasserstein-barycenters given sample access to the input measures, which are …

 Cited by 9 Related articles All 3 versions 
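The object being estimated is the weighted Wasserstein barycenter, i.e. (standard definition; the paper's contribution is computing it from samples without minimax optimization)

\bar\mu \;=\; \operatorname*{arg\,min}_{\mu}\; \sum_{i=1}^{n} \lambda_i\, W_2^2(\mu,\mu_i), \qquad \lambda_i\ge 0,\ \ \sum_i\lambda_i=1 .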

 

<——2021———2021———1730——


2021  [PDF] arxiv.org

Measuring association with Wasserstein distances

J Wiesel - arXiv preprint arXiv:2102.00356, 2021 - arxiv.org

Let $\pi\in\Pi (\mu,\nu) $ be a coupling between two probability measures $\mu $ and $\nu $

on a Polish space. In this article we propose and study a class of nonparametric measures of

association between $\mu $ and $\nu $. The analysis is based on the Wasserstein distance …

 Cited by 3 Related articles All 2 versions 


2021  [PDF] ams.org

Nonembeddability of persistence diagrams with p>2 Wasserstein metric

A Wagner - Proceedings of the American Mathematical Society, 2021 - ams.org

Persistence diagrams do not admit an inner product structure compatible with any

Wasserstein metric. Hence, when applying kernel methods to persistence diagrams, the

underlying feature map necessarily causes distortion. We prove that persistence diagrams …

 Cited by 8 Related articles All 4 versions


2021  [PDF] arxiv.org

Do Neural Optimal Transport Solvers Work? A Continuous Wasserstein-Benchmark

A Korotin, L Li, A Genevay, J Solomon… - arXiv preprint arXiv …, 2021 - arxiv.org

Despite the recent popularity of neural network-based solvers for optimal transport (OT),

there is no standard quantitative way to evaluate their performance. In this paper, we

address this issue for quadratic-cost transport--specifically, computation of the Wasserstein …

 Related articles All 2 versions 


2021  [PDF] arxiv.org

On Number of Particles in Coalescing-Fragmentating Wasserstein Dynamics

V Konarovskyi - arXiv preprint arXiv:2102.10943, 2021 - arxiv.org

We consider the system of sticky-reflected Brownian particles on the real line proposed in

[arXiv: 1711.03011]. The model is a modification of the Howitt-Warren flow but now the

diffusion rate of particles is inversely proportional to the mass which they transfer. It is known …

 Related articles All 4 versions 


2021  [PDF] arxiv.org

Multi-period facility location and capacity planning under ∞-Wasserstein joint chance constraints in humanitarian logistics

Z Wang, K You, Z Wang, K Liu - arXiv preprint arXiv:2111.15057, 2021 - arxiv.org

The key of the post-disaster humanitarian logistics (PD-HL) is to build a good facility location

and capacity planning (FLCP) model for delivering relief supplies to affected areas in time.

To fully exploit the historical PD data, this paper adopts the data-driven distributionally …

 All 3 versions 


2021


 

[PDF] Multi-Proxy Wasserstein Classifier for Image Classification

B Liu, Y Rao, J Lu, J Zhou… - Proceedings of the AAAI …, 2021 - raoyongming.github.io

Most widely-used convolutional neural networks (CNNs) end up with a global average 

pooling layer and a fullyconnected layer. In this pipeline, a certain class is represented by 

one template vector preserved in the feature banks of fully-connected layer. Yet, a class may …

Cited by 2 Related articles All 3 versions 

2021

[2106.01954] Do Neural Optimal Transport Solvers Work? A ...
Do Neural Optimal Transport Solvers Work? A Continuous Wasserstein-2 Benchmark

 ited by 1 Related articles All 6 versions

Cited by 10 Related articles All 6 versions

arXiv:2112.11243  [pdfother cs.CV
Projected Sliced Wasserstein Autoencoder-based Hyperspectral Images Anomaly Detection
Authors: Yurong ChenHui ZhangYaonan WangQ. M. Jonathan WuYimin Yang
Abstract: Anomaly detection refers to identifying the observation that deviates from the normal pattern, which has been an active research area in various domains. Recently, the increasing data scale, complexity, and dimension turns the traditional representation and statistical-based outlier detection method into challenging. In this paper, we leverage the generative model in hyperspectral images anomaly d…  More
Submitted 20 December, 2021; originally announced December 2021.

All 2 versions 

arXiv:2112.10039  [pdfother cs.LG   math.ST
Wasserstein Generative Learning of Conditional Distribution
Authors: Shiao LiuXingyu ZhouYuling JiaoJian Huang
Abstract: Conditional distribution is a fundamental quantity for describing the relationship between a response and a predictor. We propose a Wasserstein generative approach to learning a conditional distribution. The proposed approach uses a conditional generator to transform a known distribution to the target conditional distribution. The conditional generator is estimated by matching a joint distribution…  More
Submitted 18 December, 2021; originally announced December 2021.
Comments: 34 pages, 8 figures
MSC Class: 62G05; 68T07
Cited by 1
 Related articles All 3 versions ƒ

FlowPool: Pooling Graph Representations with Wasserstein...
by Simou, Effrosyni
12/2021
In several machine learning tasks for graph structured data, the graphs under consideration may be composed of a varying number of nodes. Therefore, it is...
Journal Article  Full Text Online

arXiv:2112.09990  [pdf, other]  cs.LG
FlowPool: Pooling Graph Representations with Wasserstein Gradient Flows
Authors: Effrosyni Simou
Abstract: In several machine learning tasks for graph structured data, the graphs under consideration may be composed of a varying number of nodes. Therefore, it is necessary to design pooling methods that aggregate the graph representations of varying size to representations of fixed size which can be used in downstream tasks, such as graph classification. Existing graph pooling methods offer no guarantee…  More
Submitted 18 December, 2021; originally announced December 2021.
Comments: The content of this article corresponds to a chapter included in the PhD thesis submitted on September 10th 2021 and successfully defended on October 15th 2021. The thesis manuscript will be published online by EPFL the day after its presentation at the PhD graduation ceremony

All 3 versions 
<——2021———2021———1740——


Gamifying optimization: a Wasserstein distance-based analysis of human search

A Candelieri, A Ponti, F Archetti - arXiv preprint arXiv:2112.06292, 2021 - arxiv.org

The main objective of this paper is to outline a theoretical framework to characterise humans'

decision-making strategies under uncertainty, in particular active learning in a black-box

optimization task and trading-off between information gathering (exploration) and reward

seeking (exploitation). Humans' decisions making according to these two objectives can be

modelled in terms of Pareto rationality. If a decision set contains a Pareto efficient strategy, a

rational decision maker should always select the dominant strategy over its dominated …

Gamifying optimization: a Wasserstein distance-based analysis of human search

by Candelieri, Antonio; Ponti, Andrea; Archetti, Francesco

12/2021

The main objective of this paper is to outline a theoretical framework to characterise humans' decision-making strategies under uncertainty, in particular...

Journal Article Full Text Online

 Related articles All 2 versions

 

Mei, Yu; Chen, Zhi-Ping; Ji, Bing-Bing; Xu, Zhu-Jia; Liu, Jia

Data-driven stochastic programming with distributionally robust constraints under Wasserstein distance: asymptotic properties. (English) Zbl 07443744

J. Oper. Res. Soc. China 9, No. 3, 525-542 (2021).

MSC:  90C59 90C34

PDF BibTeX XML Cite

Cited by 4 Related articles

Bridging Bayesian and Minimax Mean Square Error Estimation via Wasserstein Distributionally Robust...
by Nguyen, Viet Anh; Shafieezadeh-Abadeh, Soroosh; Kuhn, Daniel ; More...
Mathematics of operations research, 12/2021
We introduce a distributionally robust minimum mean square error estimation model with a Wasserstein ambiguity set to recover an unknown signal from a noisy...
View NowPDF
Journal Article Full Text Online


 2021
 Ma, Yanlong; Ma, Hongbin; Wang, Yingli

An image recognition method based on CD-WGAN. (Chinese. English summary) Zbl 07448529

J. Nat. Sci. Heilongjiang Univ. 38, No. 3, 348-354 (2021).

MSC:  68T10 68T07

PDF BibTeX XML Cite

Full Text: DOI


[PDF] neurips.cc

Solving Soft Clustering Ensemble via -Sparse Discrete Wasserstein Barycenter

R Qin, M Li, H Ding - Advances in Neural Information …, 2021 - proceedings.neurips.cc

Clustering ensemble is one of the most important problems in ensemble learning. Though it

has been extensively studied in the past decades, the existing methods often suffer from the

issues like high computational complexity and the difficulty on understanding the consensus …

 All 2 versions 


2021


Panchromatic Image super-resolution via self attention-augmented wasserstein generative adversarial network

J Du, K Cheng, Y Yu, D Wang, H Zhou - Sensors, 2021 - mdpi.com

… The proposed method of SAA-WGAN for PAN image super resolution is described in … (Bicubic,

SRCNN, SCN, VDSR, LapSRN, EDSR, RDN, RCAN) on some most used SR dataset, eg, …

 Cited by 2 Related articles All 7 versions 

 

 Combining the WGAN and ResNeXt Networks to Achieve ...

https://www.spectroscopyonline.com › view › combinin...

by Y Zhao — This work proposes a method combining the Wasserstein generative adversarial network (WGAN) with the specific deep learning model (ResNeXt) ...

[CITATION] Combining the WGAN and ResNeXt Networks to Achieve Data Augmentation and Classification of the FT-IR Spectra of Strawberries

Y Zhao, S Tian, L Yu, Y Xing - …, 2021 - … INC 131 W 1ST STREET, DULUTH …


arXiv:2112.12532  [pdf, ps, other]  math.OA  math-ph   math.OC
Wasserstein distance between noncommutative dynamical systems
Authors: Rocco Duvenhage
Abstract: We study a class of quadratic Wasserstein distances on spaces consisting of generalized dynamical systems on a von Neumann algebra. We emphasize how symmetry of such a Wasserstein distance arises, but also study the asymmetric case. This setup is illustrated in the context of reduced dynamics, and a number of simple examples are also presented.
Submitted 22 December, 2021; originally announced December 2021.
Comments: 30 pages
MSC Class: Primary: 49Q22. Secondary: 46L55; 81S22
Cited by 3
 Related articles All 4 versions

arXiv:2112.11964  [pdf, other]  math.NA  math.OC
On a linear Gromov-Wasserstein distance
Authors: Florian Beier, Robert Beinert, Gabriele Steidl
Abstract: Gromov-Wasserstein distances are generalizations of Wasserstein distances, which are invariant under certain distance preserving transformations. Although a simplified version of optimal transport in Wasserstein spaces, called linear optimal transport (LOT), was successfully used in certain applications, there does not exist a notion of linear Gromov-Wasserstein distances so far. In this paper, w… 
More
Submitted 22 December, 2021; originally announced December 2021.
MSC Class: 65K10; 28A33; 28A50; 68W25

Cited by 3 Related articles All 4 versions 


arXiv:2112.13530  [pdf, ps, other]  cs.LG   math.OC   stat.ML
Wasserstein Flow Meets Replicator Dynamics: A Mean-Field Analysis of Representation Learning in Actor-Critic
Authors: Yufeng Zhang, Siyu Chen, Zhuoran Yang, Michael I. Jordan, Zhaoran Wang
Abstract: Actor-critic (AC) algorithms, empowered by neural networks, have had significant empirical success in recent years. However, most of the existing theoretical support for AC algorithms focuses on the case of linear function approximations, or linearized neural networks, where the feature representation is fixed throughout training. Such a limitation fails to capture the key aspect of representation… 
More
Submitted 27 December, 2021; originally announced December 2021.
Comments: 41 pages, accepted to NeurIPS 2021

Related articles All 4 versions 
<——2021———2021———1750——


arXiv:2112.13054  [pdf, other]  eess.IV   cs.CV
Generalized Wasserstein Dice Loss, Test-time Augmentation, and Transformers for the BraTS 2021 challenge
Authors: Lucas Fidon, Suprosanna Shit, Ivan Ezhov, Johannes C. Paetzold, Sébastien Ourselin, Tom Vercauteren
Abstract: Brain tumor segmentation from multiple Magnetic Resonance Imaging (MRI) modalities is a challenging task in medical image computation. The main challenges lie in the generalizability to a variety of scanners and imaging protocols. In this paper, we explore strategies to increase model robustness without increasing inference time. Towards this aim, we explore finding a robust ensemble from models t…  More
Submitted 24 December, 2021; originally announced December 2021.

All 2 versions 

  

ВЛОЖЕНИЯ СНОУФЛЕЙКОВ В ПРОСТРАНСТВА ВАССЕРШТЕЙНА И МАРКОВСКИЙ ТИП

В Золотов - math-cs.spbu.ru

A. Andoni, A. Naor and O. Neiman [1] showed that for any finite metric space X and p > 1, the snowflake X^{1/p} admits an embedding into the p-Wasserstein (Kantorovich–Rubinstein) space over R^3 with bi-Lipschitz distortion …

[Russian: Embeddings of snowflakes into Wasserstein spaces and Markov type]

  Related articles All 2 versions 


2021   see 2020
Corrigendum: An enhanced uncertainty principle for the Vaserstein distance

T Carroll, FX Massaneda Clares… - Bulletin of the London …, 2021 - diposit.ub.edu

CORRIGENDUM TO AN ENHANCED UNCERTAINTY PRINCIPLE FOR THE VASERSTEIN

DISTANCE One of the main results in the paper mentioned in t … Here W1(f+,f−) indicates

the Vaserstein distance between the positive and negative parts of f. … By the precise …

 [PDF] wiley.com

[CITATION] Corrigendum to: An enhanced uncertainty principle for the Vaserstein distance: (Bull. Lond. Math. Soc. 52 (2020) 1158–1173)

T Carroll, X Massaneda, J Ortega‐Cerdà - Bulletin of the London … - Wiley Online Library

Corrigendum to: An enhanced uncertainty principle for the Vaserstein distance …

Corrigendum An enhanced uncertainty principle for the Vaserstein distance … Here W1(f+, f−) indicates the Vaserstein distance between the positive and negative parts …

 Related articles

[CITATION] An enhanced uncertainty principle for the Vaserstein distance (vol 52, pg 1158, 2020)

T Carroll, X Massaneda… - BULLETIN OF …, 2021 - … ST, HOBOKEN 07030-5774, NJ USA


Gupta, Abhishek; Haskell, William B.

Convergence of recursive stochastic algorithms using Wasserstein divergence. (English) Zbl 07450692

SIAM J. Math. Data Sci. 3, No. 4, 1141-1167 (2021).

MSC:  93E35 60J20 68Q32

Zbl 07450692
Cited by 5
Related articles All 7 versions


 Lecture 15: Semicontinuity and Convexity of Energies in the Wasserstein Space

L Ambrosio, E Brué, D Semola - Lectures on Optimal Transport, 2021 - Springer

… McCann, who was the first to notice in [89] that this condition is the right one in order to get

good convexity properties of internal energies along Wasserstein geodesics. … This inequality …

  Related articles
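
For reference (a standard statement added here for context, not text from the lecture notes): for an internal energy \(\mathcal F(\mu)=\int_{\mathbb R^d} F(\rho(x))\,dx\) with \(\mu=\rho\,dx\), McCann's condition guaranteeing convexity along Wasserstein geodesics (displacement convexity) reads

\[
F(0)=0, \qquad s \;\longmapsto\; s^{d}\,F\!\big(s^{-d}\big) \ \text{ is convex and non-increasing on } (0,\infty),
\]

which covers, for instance, the entropy F(ρ) = ρ log ρ and the powers F(ρ) = ρ^m/(m−1) for m ≥ 1 − 1/d.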

2021

MR4356911 Prelim Bonnet, Benoît; Frankowska, Hélène; Correction to: Necessary Optimality Conditions for Optimal Control Problems in Wasserstein Spaces. Appl. Math. Optim. 84 (2021), suppl. 2, 1819–1819.

Review PDF Clipboard Journal Article

Scholarly Journal  Citation

Correction to: Necessary Optimality Conditions for Optimal Control Problems in Wasserstein Spaces
Bonnet Benoît; Frankowska Hélène.Applied Mathematics and Optimization, suppl. 2; New York Vol. 84,  (Dec 2021): 1819-1819.
Details Get full textLink to external site, this link will open in a new window
Show More     Select result item

Cited by 4 Related articles All 4 versions


arXiv:2101.10668  [pdf, ps, other]  math.OC
Necessary Optimality Conditions for Optimal Control Problems in Wasserstein Spaces
Authors: Benoît Bonnet, Hélène Frankowska
Abstract: In this article, we derive first-order necessary optimality conditions for a constrained optimal control problem formulated in the Wasserstein space of probability measures. To this end, we introduce a new notion of localised metric subdifferential for compactly supported probability measures, and investigate the intrinsic linearised Cauchy problems associated to non-local continuity equations. In…  More
Submitted 26 January, 2021; originally announced January 2021.
Comments: 34 pages
MSC Class: 30L99; 34K09; 49J53; 49K21; 49Q22; 58E25

Cited by 2 All 3 versions

MR4356896 Prelim Bonnet, Benoît; Frankowska, Hélène; Necessary Optimality Conditions for Optimal Control Problems in Wasserstein Spaces. Appl. Math. Optim. 84 (2021), suppl. 2, 1281–1330. 30L99 (34K09 49J53 49K21 49Q22 58E25)

Review PDF Clipboard Journal Article

Scholarly Journal  Citation/Abstract
Necessary Optimality Conditions for Optimal Control Problems in Wasserstein Spaces

Bonnet Benoît; Frankowska Hélène.Applied Mathematics and Optimization, suppl. 2; New York Vol. 84,  (Dec 2021): 1281-1330.
Abstract/Details
 Get full textLink to external site, this link will open in a new window
Show Abstract 

Cited by 9 Related articles All 3 versions

MR4355697 Prelim Anastasiou, Andreas; Gaunt, Robert E.; Wasserstein distance error bounds for the multivariate normal approximation of the maximum likelihood estimator. Electron. J. Stat. 15 (2021), no. 2, 5758–5810. 62F10 (60F05 62E17)

Review PDF Clipboard Journal Article

Cited by 4 Related articles All 9 versions

MR4353126 Prelim Patterson, Evan; Hausdorff and Wasserstein metrics on graphs and other structured data. Inf. Inference 10 (2021), no. 4, 1209–1249. 05C70 (49Q22 90B10)

Review PDF Clipboard Journal Article


[PDF] arxiv.org

Hausdorff and Wasserstein metrics on graphs and other structured data

E Patterson - Information and Inference: A Journal of the IMA, 2021 - academic.oup.com

… We extend the Wasserstein metric and other elements of optimal transport from the matching

of sets to the … and Wasserstein-style metrics on C-sets, and we show that the latter

are convex relaxations of the former. Like the classical Wasserstein metric, the Wasserstein …

  Cited by 5 Related articles All 3 versions

Solutions to Hamilton–Jacobi equation on a Wasserstein space

https://link.springer.com › ...

Nov 20, 2021 — We consider a Hamilton–Jacobi equation associated to the Mayer optimal control problem in the Wasserstein space.

 Solutions to Hamilton–Jacobi equation on a Wasserstein space

by Badreddine, Zeinab; Frankowska, Hélène

Calculus of variations and partial differential equations, 11/2021, Volume 61, Issue 1

We consider a Hamilton–Jacobi equation associated to the Mayer optimal control problem in the Wasserstein space P 2 ( R d ) and define its solutions in terms...

Article PDFPDF

Journal Article 

Full Text Online

<——2021———2021———1760——


Subspace Detours Meet Gromov–Wasserstein

C Bonet, T Vayer, N Courty, F Septier, L Drumetz - Algorithms, 2021 - mdpi.com

In the context of optimal transport (OT) methods, the subspace detour approach was recently

proposed by Muzellec and Cuturi. It consists of first finding an optimal plan between the

measures projected on a wisely chosen subspace and then completing it in a nearly optimal

transport plan on the whole space. The contribution of this paper is to extend this category of

methods to the Gromov–Wasserstein problem, which is a particular type of OT distance

involving the specific geometry of each distribution. After deriving the associated formalism …

 All 21 versions 

Cover Image

Subspace Detours Meet Gromov–Wasserstein

by Bonet, Clément; Vayer, Titouan; Courty, Nicolas ; More...

Algorithms, 12/2021, Volume 14, Issue 12

In the context of optimal transport (OT) methods, the subspace detour approach was recently proposed by Muzellec and Cuturi. It consists of first finding an...

Article PDFPDF

Journal Article Full Text Online

  Cited by 1 Related articles All 19 versions


State Intellectual Property Office of China Releases Univ Nanjing Information Science & Tech's Patent Application for Image Restoration Method Based on WGAN...

Global IP News. Optics & Imaging Patent News, Dec 23, 2021

Newspaper Article Full Text Online

 

 

[PDF] arxiv.org

Bridging Bayesian and minimax mean square error estimation via Wasserstein distributionally robust optimization

VA Nguyen, S Shafieezadeh-Abadeh… - Mathematics of …, 2021 - pubsonline.informs.org

We introduce a distributionally robust minimum mean square error estimation model with a

Wasserstein ambiguity set to recover an unknown signal from a noisy observation. The

proposed model can be viewed as a zero-sum game between a statistician choosing an

estimator—that is, a measurable function of the observation—and a fictitious adversary

choosing a prior—that is, a pair of signal and noise distributions ranging over independent

Wasserstein balls—with the goal to minimize and maximize the expected squared …

Cited by 25 Related articles All 8 versions

Cover Image

Bridging Bayesian and Minimax Mean Square Error Estimation via Wasserstein Distributionally Robust Optimization

by Nguyen, Viet Anh; Shafieezadeh-Abadeh, Soroosh; Kuhn, Daniel ; More...

Mathematics of operations research, 12/2021

We introduce a distributionally robust minimum mean square error estimation model with a Wasserstein ambiguity set to recover an unknown signal from a noisy...

Article PDFPDF

Journal Article Full Text Online

 

    

 2021

Sensitivity analysis of Wasserstein distributionally robust ...

https://royalsocietypublishing.org › abs › rspa.2021.0176

by D Bartl · 2021 — We consider sensitivity of a generic stochastic optimization problem to model uncertainty. We take a non-parametric approach and capture ...

Sensitivity analysis of Wasserstein distributionally robust optimization problems

by Bartl, Daniel; Drapeau, Samuel; Obłój, Jan ; More...

Proceedings of the Royal Society. A, Mathematical, physical, and engineering sciences, 12/2021, Volume 477, Issue 2256

We consider sensitivity of a generic stochastic optimization problem to model uncertainty. We take a non-parametric approach and capture model uncertainty...

Article PDF (via Unpaywall)PDF

Journal Article Full Text Online

Cited by 3 Related articles All 6 versions

[2112.11964] On a linear Gromov-Wasserstein distance - arXiv

https://arxiv.org › math

Dec 22, 2021 — In this paper, we propose a definition of linear Gromov-Wasserstein distances. We motivate our approach by a generalized LOT model, ...

by Beier, Florian; Beinert, Robert; Steidl, Gabriele

12/2021

Gromov-Wasserstein distances are generalization of Wasserstein distances, which are invariant under certain distance preserving transformations. Although a...

Journal Article Full Text Online

Cited by 3 Related articles All 4 versions

2021

Wasserstein distance between noncommutative dynamical ...

https://arxiv.org › math

by R Duvenhage · 2021 — We study a class of quadratic Wasserstein distances on spaces consisting of generalized dynamical systems on a von Neumann algebra. We emphasize ...

Wasserstein distance between noncommutative dynamical systems

by Duvenhage, Rocco

12/2021

We study a class of quadratic Wasserstein distances on spaces consisting of generalized dynamical systems on a von Neumann algebra. We emphasize how symmetry...

Journal Article Full Text Online

 Related articles All 3 versions

 

Wasserstein Generative Learning of Conditional Distribution

by Liu, Shiao; Zhou, Xingyu; Jiao, Yuling ; More...

12/2021

Conditional distribution is a fundamental quantity for describing the relationship between a response and a predictor. We propose a Wasserstein generative...

Journal Article Full Text Online

 Related articles All 3 versions

Projected Sliced Wasserstein Autoencoder-based ... - arXiv

https://arxiv.org › cs

by Y Chen · 2021 — In this paper, we leverage the generative model in hyperspectral images anomaly detection. The gist is to model the distribution of the normal ...

Projected Sliced Wasserstein Autoencoder-based Hyperspectral Images Anomaly Detection

by Chen, Yurong; Zhang, Hui; Wang, Yaonan ; More...

12/2021

Anomaly detection (AD) has been an active research area in various domains. Yet, the increasing data scale, complexity, and dimension turn the traditional...

Journal Article Full Text Online

 

 

FlowPool: Pooling Graph Representations with Wasserstein ...

https://arxiv.org › cs

Dec 18, 2021 — This implementation relies on the computation of the gradient of the Wasserstein distance with recently proposed implicit differentiation ...

FlowPool: Pooling Graph Representations with Wasserstein Gradient Flows

by Simou, Effrosyni

12/2021

In several machine learning tasks for graph structured data, the graphs under consideration may be composed of a varying number of nodes. Therefore, it is...

Journal Article Full Text Online

 Related articles All 3 versions 

 

https://nips.cc › Conferences › ScheduleMultitrack

Wasserstein Flow Meets Replicator Dynamics - NeurIPS 2021

Wasserstein Flow Meets Replicator Dynamics: A Mean-Field Analysis of Representation Learning in Actor-Critic. Yufeng Zh


Wasserstein Flow Meets Replicator Dynamics: A Mean-Field Analysis of Representation Learning in Actor-Critic

by Zhang, Yufeng; Chen, Siyu; Yang, Zhuoran ; More...

12/2021

Actor-critic (AC) algorithms, empowered by neural networks, have had significant empirical success in recent years. However, most of the existing theoretical...

Journal Article Full Text Online

Related articles All 4 versions 

 <——2021———2021———1770——

    

Generalized Wasserstein Dice Loss, Test-time Augmentation ...

https://arxiv.org › eess

Dec 24, 2021 — Generalized Wasserstein Dice Loss, Test-time Augmentation, and Transformers for the BraTS 2021 challenge. Authors:Lucas Fidon, Suprosanna Shit, ...

Generalized Wasserstein Dice Loss, Test-time Augmentation, and Transformers for the BraTS 2021 challenge

by Fidon, Lucas; Shit, Suprosanna; Ezhov, Ivan ; More...

12/2021

Brain tumor segmentation from multiple Magnetic Resonance Imaging (MRI) modalities is a challenging task in medical image computation. The main challenges lie...

Journal Article Full Text Online

2021  [PDF] openreview.net

Combining Wasserstein GAN and Spatial Transformer Network for Medical Image Segmentation

Z Zhang, J Wang, Y Wang, S Li - 2021 - openreview.net

In previous method based on convolutional neural network, various data enhancement 

measures are applied to the input image in order to strengthen the generalization ability of 

the model or the segmentation ability of the target region. Common measures include …

 

2021  [PDF] aaai.org

[PDF] Deep Wasserstein Graph Discriminant Learning for Graph Classification

T Zhang, Y Wang, Z Cui, C Zhou, B Cui… - Proceedings of the AAAI …, 2021 - aaai.org

… In the constraint of a maximum Wasserstein discriminant loss (WD-Loss), ie ratio of inter-class … 

- We propose a novel deep Wasserstein graph discriminant learning framework for graph … 

- We define a Wasserstein graph transformer by using the optimal transport mechanism, …

Cited by 2 Related articles All 2 versions


2021 

Wasserstein Adversarial Transformer for Cloud Workload Prediction

SG Arbat - 2021 - search.proquest.com

Resource provisioning is essential to optimize cloud operating costs and the performance of 

cloud applications. Understanding job arrival rates is critical for predicting future workloads 

to determine the proper amount of resources for provisioning. However, due to the dynamic …

 All 2 versions


2021  [PDF] neurips.cc

Pooling by Sliced-Wasserstein Embedding

N Naderializadeh, J Comer… - Advances in …, 2021 - proceedings.neurips.cc

… (PMA), an important building block in the Set Transformer and Perceiver architectures [14, 

31], … sets is equal to the sliced-Wasserstein distance between their empirical distributions. Our 

… In short, [12] proposes an approximate Euclidean embedding for the Wasserstein distance, …

Cited by 2 Related articles All 3 versions
Pooling by Sliced-Wasserstein Embedding - SlidesLive

slideslive.com › pooling-by-slicedwasserstein-embedding

Pooling by Sliced-Wasserstein Embedding. Dec 6, 2021 ... distribution and propose an end-to-end trainable Euclidean embedding for Sliced-Wasserstein distan…

SlidesLive · 

Dec 6, 2021


2021


Intensity-Based Wasserstein Distance As A Loss Measure For Unsupervised Deformable Deep Registration

R Shams, W Le, A Weihs… - 2021 IEEE 18th …, 2021 - ieeexplore.ieee.org

… transformer networks (STN) for deformable registration, we propose three implementation 

variants to compare the model’s performance: the standard sliced Wasserstein, … the Learn2Reg 

open challenge demonstrate the Wasserstein methods converge faster than the baseline …

Related articles
All 2 versions


2021 see 2019

[HTML] Primal dual methods for Wasserstein gradient flows

JA Carrillo, K Craig, L Wang, C Wei - Foundations of Computational …, 2021 - Springer

Combining the classical theory of optimal transport with modern operator splitting

techniques, we develop a new numerical method for nonlinear, nonlocal partial differential

equations, arising in models of porous media, materials science, and biological swarming …
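
For context (the standard variational time discretization behind such Wasserstein gradient-flow solvers, stated generically rather than quoted from the paper): given a driving energy E and a time step τ > 0, the JKO scheme iterates

\[
\rho^{k+1} \in \operatorname*{arg\,min}_{\rho}\ \Big\{ \tfrac{1}{2\tau}\, W_2^2\big(\rho,\rho^{k}\big) + E(\rho) \Big\},
\]

so each step trades off the decrease of E against the squared 2-Wasserstein distance to the previous iterate; taking E to be the entropy ∫ ρ log ρ recovers the heat equation in the limit τ → 0.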

 Cited by 29 Related articles All 8 versions


2021 see 2020  2019  [PDF] mlr.press

First-Order Methods for Wasserstein Distributionally Robust MDP

JG Clement, C Kroer - International Conference on Machine …, 2021 - proceedings.mlr.press

Markov decision processes (MDPs) are known to be sensitive to parameter specification.

Distributionally robust MDPs alleviate this issue by allowing for\textit {ambiguity sets} which

give a set of possible distributions over parameter sets. The goal is to find an optimal policy …

  All 2 versions 


2021

Decomposition methods for Wasserstein-based data-driven distributionally robust problems

CA Gamboa, DM Valladão, A Street… - Operations Research …, 2021 - Elsevier

We study decomposition methods for two-stage data-driven Wasserstein-based DROs with

right-hand-sided uncertainty and rectangular support. We propose a novel finite

reformulation that explores the rectangular uncertainty support to develop and test five new …

  All 2 versions


2021  [PDF] arxiv.org

Projected Statistical Methods for Distributional Data on the Real Line with the Wasserstein Metric

M Pegoraro, M Beraha - arXiv preprint arXiv:2101.09039, 2021 - arxiv.org

We present a novel class of projected methods, to perform statistical analysis on a data set

of probability distributions on the real line, with the 2-Wasserstein metric. We focus in

particular on Principal Component Analysis (PCA) and regression. To define these models …

  Related articles All 6 versions 

Projected Statistical Methods for Distributional Data on the Real Line with the Wasserstein Metric (preprint)

M Pegoraro, M Beraha - 2021 - pesquisa.bvsalud.org

We present a novel class of projected methods, to perform statistical analysis on a data set

of probability distributions on the real line, with the 2-Wasserstein metric. We focus in

particular on Principal Component Analysis (PCA) and regression. To define these models …

 

<——2021———2021———1780——



2021

Distributionally Safe Path Planning: Wasserstein Safe RRT

P Lathrop, B Boardman… - IEEE Robotics and …, 2021 - ieeexplore.ieee.org

In this paper, we propose a Wasserstein metric-based random path planning algorithm.

Wasserstein Safe RRT (W-Safe RRT) provides finite-sample probabilistic guarantees on the

safety of a returned path in an uncertain obstacle environment. Vehicle and obstacle states …



2021  [PDF] arxiv.org

Model-agnostic bias mitigation methods with regressor distribution control for Wasserstein-based fairness metrics

A Miroshnikov, K Kotsiopoulos, R Franks… - arXiv preprint arXiv …, 2021 - arxiv.org

This article is a companion paper to our earlier work Miroshnikov et al.(2021) on fairness

interpretability, which introduces bias explanations. In the current work, we propose a bias

mitigation methodology based upon the construction of post-processed models with fairer …

 All 2 versions 


2021

[PDF] Two-Sided Wasserstein Procrustes Analysis - Semantic ...

https://www.semanticscholar.org › paper › Two-Sided-Wa...

Two-Sided Wasserstein Procrustes Analysis · Kun Jin, Chaoyue Liu, Cathy Xia · Published in IJCAI 2021 · Computer Science.

[PDF] ijcai.org

[PDF] Two-sided Wasserstein Procrustes Analysis

K Jin, C Liu, C Xia - ijcai.org

Learning correspondence between sets of objects is a key component in many machine

learning tasks. Recently, optimal Transport (OT) has been successfully applied to such

correspondence problems and it is appealing as a fully unsupervised approach. However …

 

2021  [PDF] amazonaws.com

[PDF] STOCHASTIC GRADIENT METHODS FOR L2-WASSERSTEIN LEAST SQUARES PROBLEM OF GAUSSIAN MEASURES

S YUN, X SUN, JIL CHOI… - J. Korean Soc …, 2021 - ksiam-editor.s3.amazonaws.com

This paper proposes stochastic methods to find an approximate solution for the L2-

Wasserstein least squares problem of Gaussian measures. The variable for the problem is in

a set of positive definite matrices. The first proposed stochastic method is a type of classical …
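
For background (a classical closed-form identity, not a formula taken from the cited paper): between Gaussian measures the 2-Wasserstein distance is explicit, which is what turns this least squares problem into an optimization over positive definite matrices:

\[
W_2^2\big(\mathcal N(m_1,\Sigma_1),\ \mathcal N(m_2,\Sigma_2)\big)
= \|m_1-m_2\|^2
+ \operatorname{tr}\!\Big(\Sigma_1+\Sigma_2-2\big(\Sigma_2^{1/2}\Sigma_1\Sigma_2^{1/2}\big)^{1/2}\Big).
\]

For centered Gaussians the trace term is the squared Bures–Wasserstein distance between the covariance matrices.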

 


year 2021

[PDF] github.io

[PDF] Inference in Synthetic Control Methods using the Robust Wasserstein Profile function

IM Lopez - isaacmeza.github.io

A popular method in comparative case studies is the synthetic control method (SCM). A

problem in this methodology is how to conduct formal inference. This work contributes by

using a novel approach similar to Empirical Likelihood (EL), to recover confidence regions …

 
2021


[PDF] mlr.press

When ot meets mom: Robust estimation of wasserstein distance

G Staerman, P Laforgue… - International …, 2021 - proceedings.mlr.press

Abstract Originated from Optimal Transport, the Wasserstein distance has gained importance in Machine Learning due to its appealing geometrical properties and the increasing availability of efficient approximations. It owes its recent ubiquity in generative …

  Cited by 9 Related articles All 5 versions 


[PDF] neurips.cc

Pooling by Sliced-Wasserstein Embedding

N Naderializadeh, J Comer… - Advances in …, 2021 - proceedings.neurips.cc

Learning representations from sets has become increasingly important with many applications in point cloud processing, graph learning, image/video recognition, and object detection. We introduce a geometrically-interpretable and generic pooling mechanism for …
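
As background for the sliced-Wasserstein machinery used here, the following NumPy sketch computes a Monte Carlo sliced p-Wasserstein distance between two equally sized point clouds (an illustration of the generic construction, not the authors' pooling code; the function name, number of projections and equal-size assumption are my own simplifications):

import numpy as np

def sliced_wasserstein(X, Y, n_proj=200, p=2, seed=0):
    # X, Y: (n, d) arrays viewed as empirical measures with equal weights
    rng = np.random.default_rng(seed)
    n, d = X.shape
    assert Y.shape == (n, d), "this simple sketch assumes equally sized clouds"
    theta = rng.normal(size=(n_proj, d))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)   # random unit directions
    Xp, Yp = X @ theta.T, Y @ theta.T                       # 1D projections, shape (n, n_proj)
    Xp.sort(axis=0)
    Yp.sort(axis=0)                                         # 1D optimal transport = match sorted samples
    return float(np.mean(np.abs(Xp - Yp) ** p) ** (1.0 / p))

# toy usage: two Gaussian clouds in R^5
X = np.random.default_rng(1).normal(size=(512, 5))
Y = np.random.default_rng(2).normal(loc=0.5, size=(512, 5))
print(sliced_wasserstein(X, Y))

Averaging one-dimensional optimal transport costs over random directions is what makes such pooled representations cheap to compute and differentiable.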

All 3 versions 


[PDF] siam.org

Convergence of recursive stochastic algorithms using Wasserstein divergence

A Gupta, WB Haskell - SIAM Journal on Mathematics of Data Science, 2021 - SIAM

This paper develops a unified framework, based on iterated random operator theory, to analyze the convergence of constant stepsize recursive stochastic algorithms (RSAs). RSAs use randomization to efficiently compute expectations, and so their iterates form a stochastic  …

 Cited by 4 Related articles All 4 versions


[PDF] arxiv.org

The Wasserstein space of stochastic processes

D Bartl, M Beiglböck, G Pammer - arXiv preprint arXiv:2104.14245, 2021 - arxiv.org

Wasserstein distance induces a natural Riemannian structure for the probabilities on the Euclidean space. This insight of classical transport theory is fundamental for tremendous applications in various fields of pure and applied mathematics. We believe that an …

  Cited by 2 Related articles All 2 versions 


[PDF] arxiv.org

Stochastic Wasserstein Hamiltonian Flows

J Cui, S Liu, H Zhou - arXiv preprint arXiv:2111.15163, 2021 - arxiv.org

In this paper, we study the stochastic Hamiltonian flow in Wasserstein manifold, the probability density space equipped with $L^2$-Wasserstein metric tensor, via the Wong--Zakai approximation. We begin our investigation by showing that the stochastic Euler …

 All 2 versions 

<——2021———2021———1790——



2021 see 2020[PDF] arxiv.org

FlowPool: Pooling Graph Representations with Wasserstein Gradient Flows

E Simou - arXiv preprint arXiv:2112.09990, 2021 - arxiv.org

In several machine learning tasks for graph structured data, the graphs under consideration may be composed of a varying number of nodes. Therefore, it is necessary to design pooling methods that aggregate the graph representations of varying size to representations of fixed …

  All 2 versions 


[PDF] tandfonline.com

Stochastic approximation versus sample average approximation for Wasserstein barycenters

D Dvinskikh - Optimization Methods and Software, 2021 - Taylor & Francis

In the machine learning and optimization community, there are two main approaches for the convex risk minimization problem, namely the Stochastic Approximation (SA) and the Sample Average Approximation (SAA). In terms of the oracle complexity …

All 3 versions


Data-driven stochastic programming with distributionally robust constraints under Wasserstein distance: asymptotic properties

Y Mei, ZP Chen, BB Ji, ZJ Xu, J Liu - … of the Operations Research Society of …, 2021 - Springer

Distributionally robust optimization is a dominant paradigm for decision-making problems where the distribution of random variables is unknown. We investigate a distributionally robust optimization problem with ambiguities in the objective function and countably infinite …

  Cited by 2 Related articles


[PDF] arxiv.org

Backward and Forward Wasserstein Projections in Stochastic Order

YH Kim, YL Ruan - arXiv preprint arXiv:2110.04822, 2021 - arxiv.org

We study metric projections onto cones in the Wasserstein space of probability measures, defined by stochastic orders. Dualities for backward and forward projections are established under general conditions. Dual optimal solutions and their characterizations require study …

  All 2 versions 


[HTML] mdpi.com

Wasserstein Bounds in the CLT of the MLE for the Drift Coefficient of a Stochastic Partial Differential Equation

K Es-Sebaiy, M Al-Foraih, F Alazemi - Fractal and Fractional, 2021 - mdpi.com

In this paper, we are interested in the rate of convergence for the central limit theorem of the maximum likelihood estimator of the drift coefficient for a stochastic partial differential equation based on continuous time observations of the Fourier coefficients ui (t), i= 1,…, N of …

  All 5 versions 


 2021


Solving Wasserstein Robust Two-stage Stochastic Linear Programs via Second-order Conic Programming

Z Wang, K You, S Song, Y Zhang - 2021 40th Chinese Control …, 2021 - ieeexplore.ieee.org

This paper proposes a novel data-driven distributionally robust (DR) two-stage linear program over the 1-Wasserstein ball to handle the stochastic uncertainty with unknown distribution. We study the case with distribution uncertainty only in the objective function. In …

 

[PDF] arxiv.org

Statistical inference for Bures–Wasserstein barycenters

A Kroshnin, V Spokoiny… - The Annals of Applied …, 2021 - projecteuclid.org

In this work we introduce the concept of Bures–Wasserstein barycenter Q, that is essentially a Fréchet mean of some distribution P supported on a subspace of positive semi-definite d-dimensional Hermitian operators H+ (d). We allow a barycenter to be constrained …

  Cite Cited by 21 Related articles All 4 versions


[PDF] arxiv.org

Sample out-of-sample inference based on Wasserstein distance

J Blanchet, Y Kang - Operations Research, 2021 - pubsonline.informs.org

We present a novel inference approach that we call sample out-of-sample inference. The approach can be used widely, ranging from semisupervised learning to stress testing, and it is fundamental in the application of data-driven distributionally robust optimization. Our …

 Cite Cited by 25 Related articles All 5 versions


[PDF] neurips.cc

Dynamical Wasserstein Barycenters for Time-series Modeling

K Cheng, S Aeron, MC Hughes… - Advances in Neural …, 2021 - proceedings.neurips.cc

Many time series can be modeled as a sequence of segments representing high-level discrete states, such as running and walking in a human activity application. Flexible models should describe the system state and observations in stationary``pure-state''periods as well …

  All 5 versions 

 

[PDF] researchgate.net

[PDF] Two-sample Test using Projected Wasserstein Distance

J Wang, R Gao, Y Xie - Proc. ISIT, 2021 - researchgate.net

We develop a projected Wasserstein distance for the two-sample test, a fundamental problem in statistics and machine learning: given two sets of samples, to determine whether they are from the same distribution. In particular, we aim to circumvent the curse of …

 Cite  Related articles 
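
To make the idea concrete, here is a hedged sketch of a permutation two-sample test built on a single random one-dimensional projection and SciPy's 1D Wasserstein distance; it is a simplified stand-in for the projected Wasserstein statistic of the paper (which optimizes over projections), and the function name, projection choice and permutation count are arbitrary:

import numpy as np
from scipy.stats import wasserstein_distance

def projection_w1_test(X, Y, n_perm=500, seed=0):
    # X: (n, d), Y: (m, d); returns (test statistic, permutation p-value)
    rng = np.random.default_rng(seed)
    theta = rng.normal(size=X.shape[1])
    theta /= np.linalg.norm(theta)                     # fixed random direction
    x, y = X @ theta, Y @ theta
    stat = wasserstein_distance(x, y)
    pooled, n = np.concatenate([x, y]), len(x)
    count = 0
    for _ in range(n_perm):                            # build the permutation null
        perm = rng.permutation(len(pooled))
        count += wasserstein_distance(pooled[perm[:n]], pooled[perm[n:]]) >= stat
    return stat, (1 + count) / (1 + n_perm)

X = np.random.default_rng(1).normal(size=(200, 10))
Y = np.random.default_rng(2).normal(loc=0.3, size=(200, 10))
print(projection_w1_test(X, Y))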

<——2021———2021———1800—— 



[PDF] arxiv.org

Exact Statistical Inference for the Wasserstein Distance by Selective Inference

VNL Duy, I Takeuchi - arXiv preprint arXiv:2109.14206, 2021 - arxiv.org

In this paper, we study statistical inference for the Wasserstein distance, which has attracted much attention and has been applied to various machine learning tasks. Several studies have been proposed in the literature, but almost all of them are based on asymptotic …

  Cite  All 3 versions 

Exact Statistical Inference for the Wasserstein Distance by Selective Inference

V Nguyen Le Duy, I Takeuchi - arXiv e-prints, 2021 - ui.adsabs.harvard.edu

In this paper, we study statistical inference for the Wasserstein distance, which has attracted much attention and has been applied to various machine learning tasks. Several studies have been proposed in the literature, but almost all of them are based on asymptotic …

 

[PDF] arxiv.org

Projected Wasserstein gradient descent for high-dimensional Bayesian inference

Y Wang, P Chen, W Li - arXiv preprint arXiv:2102.06350, 2021 - arxiv.org

We propose a projected Wasserstein gradient descent method (pWGD) for high-dimensional Bayesian inference problems. The underlying density function of a particle system of WGD is approximated by kernel density estimation (KDE), which faces the long-standing curse of …

   Related articles All 3 versions 


[PDF] arxiv.org

Projected Statistical Methods for Distributional Data on the Real Line with the Wasserstein Metric

M Pegoraro, M Beraha - arXiv preprint arXiv:2101.09039, 2021 - arxiv.org

We present a novel class of projected methods, to perform statistical analysis on a data set of probability distributions on the real line, with the 2-Wasserstein metric. We focus in particular on Principal Component Analysis (PCA) and regression. To define these mod 

 Related articles All 6 versions 

Projected Statistical Methods for Distributional Data on the Real Line with the Wasserstein Metric (preprint)

M Pegoraro, M Beraha - 2021 - pesquisa.bvsalud.org

We present a novel class of projected methods, to perform statistical analysis on a data set of probability distributions on the real line, with the 2-Wasserstein metric. We focus in particular on Principal Component Analysis (PCA) and regression. To define these models …

 

[PDF] arxiv.org

Two-sample Test with Kernel Projected Wasserstein Distance

J Wang, R Gao, Y Xie - arXiv preprint arXiv:2102.06449, 2021 - arxiv.org

We develop a kernel projected Wasserstein distance for the two-sample test, an essential building block in statistics and machine learning: given two sets of samples, to determine whether they are from the same distribution. This method operates by finding the nonlinear …

   Related articles All 3 versions 


[PDF] arxiv.org

Wasserstein distance between noncommutative dynamical systems

R Duvenhage - arXiv preprint arXiv:2112.12532, 2021 - arxiv.org

We study a class of quadratic Wasserstein distances on spaces consisting of generalized dynamical systems on a von Neumann algebra. We emphasize how symmetry of such a Wasserstein distance arises, but also study the asymmetric case. This setup is illustrated in …


2021



[PDF] arxiv.org

Projected Sliced Wasserstein Autoencoder-based Hyperspectral Images Anomaly Detection

Y Chen, H Zhang, Y Wang, QM Wu, Y Yang - arXiv preprint arXiv …, 2021 - arxiv.org

Anomaly detection refers to identifying the observation that deviates from the normal pattern, which has been an active research area in various domains. Recently, the increasing data scale, complexity, and dimension turns the traditional representation and statistical-based …

  Cite All 2 versions 



[PDF] openreview.net

Combining Wasserstein GAN and Spatial Transformer Network for Medical Image Segmentation

Z Zhang, J Wang, Y Wang, S Li - 2021 - openreview.net

In previous method based on convolutional neural network, various data enhancement measures are applied to the input image in order to strengthen the generalization ability of the model or the segmentation ability of the target region. Common measures include …

 


Wasserstein Adversarial Transformer for Cloud Workload Prediction

SG Arbat - 2021 - search.proquest.com

Resource provisioning is essential to optimize cloud operating costs and the performance of cloud applications. Understanding job arrival rates is critical for predicting future workloads to determine the proper amount of resources for provisioning. However, due to the dynamic …

 All 2 versions


[PDF] github.io

[PDF] Inference in Synthetic Control Methods using the Robust Wasserstein Profile function

IM Lopez - isaacmeza.github.io

A popular method in comparative case studies is the synthetic control method (SCM). A problem in this methodology is how to conduct formal inference. This work contributes by using a novel approach similar to Empirical Likelihood (EL), to recover confidence regions …


[PDF] koreascience.or.kr

Assurance of HIT (head impulse test, Saccade based Vestibular Anomaly Detection) using Confidence Interval of Optical Flow Comparison on Wasserstein Metric

M Ji, TH Kim, SW Kim - Proceedings of the Korea Information …, 2021 - koreascience.or.kr

Recent machine learning (deep learning) methods have the advantage of higher efficiency and accuracy than traditional statistical analysis methods, but their processing is a black box, which makes it difficult to identify the key causes or supporting factors behind a result. Recent XAI (eXplainable AI) research to address this …

Related articles All 3 versions 

<——2021———2021———1810——  


An Intrusion Detection Method Based on WGAN and Deep Learning

L Han, X Fang, Y Liu, W Wu… - 2021 IEEE International …, 2021 - ieeexplore.ieee.org

Using WGAN and deep learning methods, a multiclass network intrusion detection model is proposed. The model uses the WGAN network to generate fake samples of rare attacks to achieve effective expansion of the original dataset and evaluates the samples through a two …


[PDF] latinxinai.org

[PDF] Computation of Discrete Flows Over Networks via Constrained Wasserstein Barycenters

F Arqué, CA Uribe, C Ocampo-Martinez - research.latinxinai.org

We study a Wasserstein attraction approach for solving dynamic mass transport problems over networks. In the transport problem over networks, we start with a distribution over the set of nodes that needs to be “transported” to a target distribution accounting for the network …


Network Malicious Traffic Identification Method Based On WGAN Category Balancing

A Wang, Y Ding - 2021 IEEE International Conference on …, 2021 - ieeexplore.ieee.org

Aiming at the problem of data imbalance when in using deep learning model for traffic recognition tasks, a method of using Wasserstein Generative Adversarial Network (WGAN) to generate minority samples based on the image of the original traffic data packets is …

Conference Paper  Citation/Abstract

Network Malicious Traffic Identification Method Based On WGAN Category Balancing

Wang, Anzhou; Ding, Yaojun.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).



Military Target Recognition Technology based on WGAN-GP and XGBoost

K Zhao, B Dong, C Yang - 2021 4th International Conference on …, 2021 - dl.acm.org

This paper proposes a military target recognition method based on WGAN-GP and XGBoost, which expands and improves the quality of military target samples by constructing WGAN-GP, then sampling iteratively based on heuristic learning to construct an effective sample …
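
For readers unfamiliar with the WGAN-GP objective that several entries in this block rely on, here is a minimal PyTorch sketch of the standard gradient penalty of Gulrajani et al. (the generic penalty only, not code from the cited paper; critic, real, fake and lambda_gp are placeholder names, and image-shaped inputs (B, C, H, W) are assumed):

import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    # push the critic's gradient norm towards 1 on random interpolates of real and fake samples
    batch = real.size(0)
    eps = torch.rand(batch, 1, 1, 1, device=real.device)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads = torch.autograd.grad(outputs=scores, inputs=interp,
                                grad_outputs=torch.ones_like(scores),
                                create_graph=True, retain_graph=True)[0]
    grad_norm = grads.view(batch, -1).norm(2, dim=1)
    return lambda_gp * ((grad_norm - 1.0) ** 2).mean()

# critic loss then reads: critic(fake).mean() - critic(real).mean() + gradient_penalty(critic, real, fake)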


  [PDF] waseda.ac.jp

Face Image Generation for Illustration by WGAN-GP Using Landmark Information

M Takahashi, H Watanabe - 2021 IEEE 10th Global …, 2021 - ieeexplore.ieee.org

With the spread of social networking services, face images for illustration are being used in a variety of situations. Attempts have been made to create illustration face images using adversarial generation networks, but the quality of the images has not been sufficient. It …

Conference Paper  Citation/Abstract

Face Image Generation for Illustration by WGAN-GP Using Landmark Information

Watanabe, Hiroshi. The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021). Abstract/Details   Show Abstract 

Related articles All 2 versions


2021


[PDF] researchgate.net

[PDF] MULTIPLIER BOOTSTRAP FOR BURES–WASSERSTEIN BARYCENTERS BY ALEXEY KROSHNIN, VLADIMIR SPOKOINY 2 AND ALEXANDRA …

A KROSHNIN - researchgate.net

Bures–Wasserstein barycenter is a popular and promising tool in analysis of complex data like graphs, images etc. In many applications the input data are random with an unknown distribution, and uncertainty quantification becomes a crucial issue. This paper offers an …

Related articles 

[PDF] github.io

[PDF] Inference in Synthetic Control Methods using the Robust Wasserstein Profile function

IM Lopez - isaacmeza.github.io

A popular method in comparative case studies is the synthetic control method (SCM). A problem in this methodology is how to conduct formal inference. This work contributes by using a novel approach similar to Empirical Likelihood (EL), to recover confidence regions …


  Exact Statistical Inference for the Wasserstein Distance by Selective Inference

V Nguyen Le Duy, I Takeuchi - arXiv e-prints, 2021 - ui.adsabs.harvard.edu

In this paper, we study statistical inference for the Wasserstein distance, which has attracted much attention and has been applied to various machine learning tasks. Several studies have been proposed in the literature, but almost all of them are based on asymptotic …

  All 3 versions 


CONFERENCE PROCEEDING

WGAN-GP-Based Synthetic Radar Spectrogram Augmentation in Human Activity Recognition

Qu, Lele ; Wang, Yutong ; Yang, Tianhong ; Zhang, Lili ; Sun, Yanpeng. 2021 IEEE International Geoscience and Remote Sensing Symposium IGARSS, 2021, p.2532-2535

WGAN-GP-Based Synthetic Radar Spectrogram Augmentation in Human Activity Recognition

No Online Access 

WGAN-GP-Based Synthetic Radar Spectrogram Augmentation in Human Activity Recognition

L Qu, Y Wang, T Yang, L Zhang… - 2021 IEEE International …, 2021 - ieeexplore.ieee.org

Despite deep convolutional neural networks (DCNNs) having been used extensively in radar-based human activity recognition in recent years, their performance could not be fully implemented because of the lack of radar dataset. However, radar data acquisition is difficult

 Related articles


[PDF] tue.nl

[PDF] Wasserstein Generative Adversarial Privacy

K Mulder, J Goseling - and Signal Processing in the Benelux, May 20-21, TU … - pure.tue.nl

We consider the problem of sharing data without revealing sensitive information, quantified through mutual information. We do so in a generative adversial setting, ie, training a GAN. In particular we consider training under a Wasserstein distance. Our main contribution is to …

Related articles All 4 versions 

 <——2021———2021———1820——


[PDF] unipd.it

[PDF] Optimal Transport and Wasserstein Gradient Flows

F Santambrogio - Doctoral Program in Mathematical Sciences Catalogue … - math.unipd.it

Aim: With the first part of the course students will learn the main features of the theory of optimal transport; the second part will allow them to master more specialized tools from this theory in their applications to some evolution PDEs with a gradient flow structure Course …

All 2 versions 



[PDF] 8ecm.si

[PDF] Maps on positive definite cones of C-algebras preserving the Wasserstein mean

L Molnár - 2021 - 8ecm.si

… We see that the midpoint of this curve is just the Wasserstein mean Aσw B of A and B. … The definition of the Bures-Wasserstein metric has recently been extended by Farenick and Rahaman to the setting of C-algebras with a faithful finite trace. In 2018 we determined the …

Cited by 1 Related articles All 2 versions 



Implementation of a WGAN-GP for Human Pose Transfer using a 3-channel pose representation

T Das, S Sutradhar, M Das… - … on Innovation and …, 2021 - ieeexplore.ieee.org

The computational problem of Human Pose Transfer (HPT) is addressed in this paper. HPT in recent days have become an emerging research topic which can be used in fields like fashion design, media production, animation, virtual reality. Given the image of a human …

 


[PDF] ijcai.org

[PDF] Two-sided Wasserstein Procrustes Analysis

K Jin, C Liu, C Xia - ijcai.org

Learning correspondence between sets of objects is a key component in many machine learning tasks. Recently, optimal Transport (OT) has been successfully applied to such correspondence problems and it is appealing as a fully unsupervised approach. However …

Related articles All 2 versions 



2021 see 2020   [PDF] thecvf.com

Wasserstein contrastive representation distillation

L Chen, D Wang, Z Gan, J Liu… - Proceedings of the …, 2021 - openaccess.thecvf.com

… For better generalization, we also use the primal form of WD to indirectly bound generalization 

error via regularizing the Wasserstein … (i) We present a novel Wasserstein learning 

framework for representation distillation, utilizing the dual and primal forms of the Wasserstein …

Cited by 23 Related articles All 7 versions


 
 2021


Wasserstein distance for categorical data? Relationship to TVD?

https://stats.stackexchange.com › questions › wasserstei...

Jan 10, 2021 — Is the Wasserstein distance applicable for categorical data? e.g. if we have the distribution of different coloured balls in two bags
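
A minimal sketch of the point behind this question (my illustration, not part of the thread): total variation only compares probabilities category by category, whereas a Wasserstein distance additionally needs a ground metric on the categories, e.g. an assumed ordinal encoding of the colours:

import numpy as np
from scipy.stats import wasserstein_distance

# bag compositions over three colours, with an ASSUMED order red < green < blue
p = np.array([1.0, 0.0, 0.0])   # all red
q = np.array([0.0, 0.0, 1.0])   # all blue
r = np.array([0.0, 1.0, 0.0])   # all green

tvd = lambda a, b: 0.5 * np.abs(a - b).sum()
positions = np.array([0.0, 1.0, 2.0])   # ordinal ground metric on the categories
w1 = lambda a, b: wasserstein_distance(positions, positions, a, b)

print(tvd(p, q), tvd(p, r))   # 1.0 and 1.0: TVD cannot tell the two pairs apart
print(w1(p, q), w1(p, r))     # 2.0 and 1.0: W1 reflects the assumed ordering

For purely nominal categories with the discrete 0/1 ground metric, the 1-Wasserstein distance reduces to total variation, which is essentially the relationship asked about.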


2021

Military Target Recognition Technology based on WGAN-GP and XGBoost

K Zhao, B Dong, C Yang - 2021 4th International Conference on …, 2021 - dl.acm.org

This paper proposes a military target recognition method based on WGAN-GP and XGBoost, 

which expands and improves the quality of military target samples by constructing WGAN-GP, 

then sampling iteratively based on heuristic learning to construct an effective sample …


WDIBS: Wasserstein deterministic information bottleneck for state abstraction to balance state-compression and performance

X Zhu, T Huang, R Zhang, W Zhu - Applied Intelligence, 2021 - Springer

As an important branch of reinforcement learning, Apprenticeship learning studies how an 

agent learns good behavioral decisions by observing an expert policy from the environment. 

It has made many encouraging breakthroughs in real-world applications. State abstraction is …

 

[PDF] arxiv.org

Some inequalities on Riemannian manifolds linking Entropy, Fisher information, Stein discrepancy and Wasserstein distance

LJ Cheng, FY Wang, A Thalmaier - arXiv preprint arXiv:2108.12755, 2021 - arxiv.org

For a complete connected Riemannian manifold $M$ let $V\in C^2(M)$ be such that $\mu(dx)=\mathrm{e}^{-V(x)}\,\mathrm{vol}(dx)$ is a probability measure on $M$. Taking $\mu$ as reference measure, we derive inequalities for probability measures on $M$ linking …

 All 4 versions


[PDF] researchgate.net

[PDF] MULTIPLIER BOOTSTRAP FOR BURES–WASSERSTEIN BARYCENTERS BY ALEXEY KROSHNIN, VLADIMIR SPOKOINY 2 AND ALEXANDRA …

A KROSHNIN - researchgate.net

Bures–Wasserstein barycenter is a popular and promising tool in analysis of complex data 

like graphs, images etc. In many applications the input data are random with an unknown 

distribution, and uncertainty quantification becomes a crucial issue. This paper offers an …

<——2021———2021———1830—— 


[HTML] aclanthology.org

[HTML] Wasserstein Selective Transfer Learning for Cross-domain Text Mining

L Feng, M Qiu, Y Li, H Zheng… - Proceedings of the 2021 …, 2021 - aclanthology.org

Transfer learning (TL) seeks to improve the learning of a data-scarce target domain by using

information from source domains. However, the source and target domains usually have

different data distributions, which may lead to negative transfer. To alleviate this issue, we …

 

Linear and Deep Order-Preserving Wasserstein Discriminant Analysis

B Su, J Zhou, JR Wen, Y Wu - IEEE Transactions on Pattern …, 2021 - ieeexplore.ieee.org

Supervised dimensionality reduction for sequence data learns a transformation that maps

the observations in sequences onto a low-dimensional subspace by maximizing the

separability of sequences in different classes. It is typically more challenging than …

Related articles All 5 versions


Differentially Privacy-Preserving Federated Learning Using Wasserstein Generative Adversarial Network

Y Wan, Y Qu, L Gao, Y Xiang - 2021 IEEE Symposium on …, 2021 - ieeexplore.ieee.org

Artificial intelligence (AI) requires a large amount of data to train high-quality machine

learning (ML) models. However, due to privacy issues, individuals or organizations are not

willing to share data with others, which results in “data islands”. This motivates the …

All 2 versions


[PDF] 8ecm.si

[PDF] Maps on positive definite cones of C-algebras preserving the Wasserstein mean

L Molnár - 2021 - 8ecm.si

… We see that the midpoint of this curve is just the Wasserstein mean Aσw B of A and B. …

The definition of the Bures-Wasserstein metric has recently been extended by Farenick and

Rahaman to the setting of C-algebras with a faithful finite trace. In 2018 we determined the …

All 2 versions 


CWGAN-DNN: 一种条件 Wasserstein 生成对抗网络入侵检测方法

贺佳星, 王晓丹, 宋亚飞, 来杰 - 空军工程大学学报, 2021 - kjgcdx.cnjournals.com

To address the low detection accuracy of existing machine-learning-based intrusion detection systems on class-imbalanced data, an intrusion detection method (CWGAN-DNN) based on a conditional Wasserstein generative adversarial network (CWGAN) and a deep neural network (DNN) is proposed. CWGAN-DNN alleviates the class imbalance of the dataset by generating samples …

All 2 versions 

[Chinese: CWGAN-DNN: a conditional Wasserstein generative adversarial network intrusion detection method]


2021

2021 book

 An Invitation to Optimal Transport, Wasserstein Distances, and ...

https://www.amazon.com › Invitation-Transport-Wasser...

The presentation focuses on the essential topics of the theory: Kantorovich duality, existence and uniqueness of optimal transport maps, Wasserstein distances, ...

[CITATION] An invitation to Optimal Transport, Wasserstein Distances and Gradient Flows. EMS Textbooks in Mathematics

A Figalli, F Glaudo - 2021 - EMS Press (accepted, 2020)

Cited by 2

[CITATION] An Invitation to Optimal Transport, Wasserstein Distances, and Gradient Flows

A FigalliF Glaudo - 2021 - ems-ph.org

… maps, Wasserstein distances, the JKO scheme, Otto's calculus, and Wasserstein gradient flows. At the end, a presentation of some selected applications of optimal transport is given. …

 Related articles All 2 versions


Scenario Reduction Network Based on Wasserstein Distance with Regularization

Y Sun, X Dong, SM Malik - 2021 - techrxiv.org

Power systems with high penetration of renewable energy contain various uncertainties.

Scenario-based optimization problems need a large number of discrete scenarios to obtain

a reliable approximation for the probabilistic model. It is important to choose typical …

 Related articles All 2 versions


CDC-Wasserstein generated adversarial network for locally occluded face image recognition

K Zhang, W Zhang, S Yan, J Jiang… - … Conference on Computer …, 2021 - spiedigitallibrary.org

In the practical application of wisdom education classroom teaching, students' faces may be

blocked due to various factors (such as clothing, environment, lighting), resulting in low

accuracy and low robustness of face recognition. To solve this problem, we introduce a new …

All 3 versionsƒ


[PDF] projecteuclid.org

Wasserstein gradients for the temporal evolution of probability distributions

Y Chen, HG Müller - Electronic Journal of Statistics, 2021 - projecteuclid.org

Many studies have been conducted on flows of probability measures, often in terms of 

gradient flows. We utilize a generalized notion of derivatives with respect to time to model 

tCited by 1 Related articles All 6 versions

Lecture 10: Wasserstein Geodesics, Nonbranching and Curvature

L Ambrosio, E Brué, D Semola - Lectures on Optimal Transport, 2021 - Springer

Lecture 10: Wasserstein Geodesics, Nonbranching and Curvature | SpringerLink … Lecture 

10: Wasserstein Geodesics, Nonbranching and Curvature … Characterization of absolutely 

continuous curves in Wasserstein spaces. Calc. Var. …

<——2021———2021———1840——


[PDF] github.io

[PDF] IFT 6756-Lecture 11 (Wasserstein Generative Adversarial Nets)

G Gidel - gauthiergidel.github.io

… Whereas, Wasserstein distance captures how close θ is to 0 and we get useful gradients 

almost everywhere (except when θ = 0) as Wasserstein measure cannot saturate and 

converges to a linear function. … If we compute the Wasserstein distance between the real …
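
The worked example usually given at this point (restated here from the original WGAN paper's motivating example, not from these slides): for point masses δ_0 and δ_θ on the real line,

\[
W_1(\delta_0,\delta_\theta)=|\theta|, \qquad
\mathrm{JS}(\delta_0,\delta_\theta)=\log 2 \ \text{ for } \theta\neq 0 \quad (\text{and } 0 \text{ for } \theta=0),
\]

so the Jensen–Shannon divergence is flat in θ away from the optimum and gives no gradient signal, while the Wasserstein distance is linear in θ and provides useful gradients almost everywhere.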


Lecture 15: Semicontinuity and Convexity of Energies in the Wasserstein Space

L Ambrosio, E Brué, D Semola - Lectures on Optimal Transport, 2021 - Springer

… McCann, who was the first to notice in [89] that this condition is the right one in order to get 

good convexity properties of internal energies along Wasserstein geodesics. … This inequality 

provides an estimate of the Wasserstein distance between the standard Gaussian measure …


[PDF] arxiv.org

Bounds in Wasserstein distance on the normal approximation of general M-estimators

F Bachoc, M Fathi - arXiv preprint arXiv:2111.09721, 2021 - arxiv.org

We derive quantitative bounds on the rate of convergence in $L^1$ Wasserstein distance 

of general M-estimators, with an almost sharp (up to a logarithmic term) behavior in the 

number of observations. We focus on situations where the estimator does not have an …

  Related articles All 24 versions
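
A hedged Monte-Carlo companion to the Bachoc–Fathi entry above (an illustration only, not the paper's method; assumes NumPy and SciPy): for the simplest M-estimator, the sample mean of Exp(1) data, the empirical L1-Wasserstein distance to a standard normal sample shrinks as the number of observations grows.

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
reps = 5_000
z = rng.standard_normal(reps)                      # reference N(0, 1) sample
for n in [10, 100, 1000]:
    means = rng.exponential(1.0, size=(reps, n)).mean(axis=1)
    standardized = np.sqrt(n) * (means - 1.0)      # Exp(1) has mean 1 and variance 1
    # distance decreases with n (Monte-Carlo error sets a floor on what is visible)
    print(n, round(wasserstein_distance(standardized, z), 4))
```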

  [PDF] openreview.net

Combining Wasserstein GAN and Spatial Transformer Network for Medical Image Segmentation

Z Zhang, J Wang, Y Wang, S Li - 2021 - openreview.net

In previous method based on convolutional neural network, various data enhancement

measures are applied to the input image in order to strengthen the generalization ability of

the model or the segmentation ability of the target region. Common measures include …

 Related articles 


[PDF] chinaxiv.org

[PDF] Wasserstein metric between a discrete probability measure and a continuous one

W Yang, X Wang - 2021 - biotech.chinaxiv.org

The paper considers Wasserstein metric between the empirical probability measure of n

discrete random variables and a continuous uniform one on the d-dimensional ball and give

the asymptotic estimation of their expectation as n → ∞. Further, we consider the above …

Related articles All 5 versions 
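
A toy one-dimensional version of the quantity studied in the entry above (the paper itself works on the d-dimensional ball; this sketch only assumes NumPy): the W1 distance between the empirical measure of n uniform points on [0,1] and the continuous Uniform(0,1) law, computed from the identity W1 = ∫_0^1 |F_n(x) − x| dx.

```python
import numpy as np

rng = np.random.default_rng(1)
grid = np.linspace(0.0, 1.0, 100_001)                  # fine grid on [0, 1]
for n in [10, 100, 1000, 10_000]:
    x = np.sort(rng.uniform(size=n))
    F_n = np.searchsorted(x, grid, side="right") / n   # empirical CDF on the grid
    w1 = np.mean(np.abs(F_n - grid))                   # Riemann sum of |F_n(x) - x| over [0, 1]
    print(n, round(w1, 5))                             # decays roughly like n**-0.5 in one dimension
```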

2021


CONFERENCE PROCEEDING

Towards a Camera-Based Road Damage Assessment and Detection for Autonomous Vehicles: Applying Scaled-YOLO and CVAE-WGAN

Fassmeyer, Pascal ; Kortmann, Felix ; Drews, Paul ; Funk, Burkhardt. 2021 IEEE 94th Vehicular Technology Conference (VTC2021-Fall), 2021, p.1-7

Towards a Camera-Based Road Damage Assessment and Detection for Autonomous Vehicles: Applying Scaled-YOLO and CVAE-WGAN

No Online Access 

Towards a Camera-Based Road Damage Assessment and Detection for Autonomous Vehicles: Applying Scaled-YOLO and CVAE-WGAN

P Fassmeyer, F Kortmann, P Drews… - 2021 IEEE 94th …, 2021 - ieeexplore.ieee.org

Initiatives such as the 2020 IEEE Global Road Damage Detection Challenge prompted

extensive research in camera-based road damage detection with Deep Learning, primarily

focused on improving the efficiency of road management. However, road damage detection …

Related articles


2021 Cover Image

Distributionally Robust Mean-Variance Portfolio Selection with Wasserstein Distances

by Blanchet, Jose; Chen, Lin; Zhou, Xun Yu

Management science, 12/2021

We revisit Markowitz’s mean-variance portfolio selection model by considering a distributionally robust version, in which the region of distributional...

Article PDFPDF

Journal Article Full Text Online

View in Context Browse Journal

Cited by 48 Related articles All 6 versions

Data Augmentation of Wrist Pulse Signal for Traditional Chinese Medicine Using Wasserstein GAN

J Chang, F Hu, H Xu, X Mao, Y Zhao… - Proceedings of the 2nd …, 2021 - dl.acm.org

Pulse diagnosis has been widely used in traditional Chinese medicine (TCM) for thousands

of years. Recently, with the availability and improvement of advanced and portable sensor

technology, computational pulse diagnosis has been obtaining more and more attentions. In …

 

Domain Adaptive Rolling Bearing Fault Diagnosis based on Wasserstein Distance

C Yang, X Wang, J Bao, Z Li - 2021 33rd Chinese Control and …, 2021 - ieeexplore.ieee.org

The rolling bearing usually runs at different speeds and loads, which leads to a

corresponding change in the distribution of data. The cross-domain problem caused by

different data distributions can degrade the performance of deep learning-based fault …

 Cited by 1 Related articles

 

[PDF] mdpi.com

Polymorphic Adversarial Cyberattacks Using WGAN

R Chauhan, U Sabeel, A Izaddoost… - Journal of Cybersecurity …, 2021 - mdpi.com

Intrusion Detection Systems (IDS) are essential components in preventing malicious traffic

from penetrating networks and systems. Recently, these systems have been enhancing their

detection ability using machine learning algorithms. This development also forces attackers …

 Related articles 

<——2021———2021———1850——


2021  see 2020  [PDF] googleapis.com  patent

System and method for unsupervised domain adaptation via sliced-wasserstein distance

AJ Gabourie, M Rostami, S Kolouri, K Kim - US Patent 11,176,477, 2021 - Google Patents

Described is a system for unsupervised domain adaptation in an autonomous learning

agent. The system adapts a learned model with a set of unlabeled data from a target

domain, resulting in an adapted model. The learned model was previously trained to …

Cited by 2 Related articles All 4 versions 

United States Patent for System and Method for Unsupervised Domain Adaptation Via Sliced-Wasserstein...

Global IP News. Information Technology Patent News, Nov 17, 2021

Newspaper Article Full Text Online 
 Cited by 3 Related articles All 4 versions


[PDF] metu.edu.tr

[PDF] Wasserstein generative adversarial active learning for anomaly detection with gradient penalty

HA Duran - 2021 - open.metu.edu.tr

Anomaly detection has become a very important topic with the advancing machine learning

techniques and is used in many different application areas. In this study, we approach

differently than the anomaly detection methods performed on standard generative models …
Related articles 


[HTML] hindawi.com

[HTML] A Liver Segmentation Method Based on the Fusion of VNet and WGAN

J Ma, Y Deng, Z Ma, K Mao, Y Chen - Computational and Mathematical …, 2021 - hindawi.com

Accurate segmentation of liver images is an essential step in liver disease diagnosis,

treatment planning, and prognosis. In recent years, although liver segmentation methods

based on 2D convolutional neural networks have achieved good results, there is still a lack …

All 7 versions 

 

[PDF] preprints.org

On the Distributional Characterization of Graph Models of Water Distribution Networks in Wasserstein Spaces

A Candelieri, A Ponti, F Archetti - 2021 - preprints.org

This paper is focused on two topics very relevant in water distribution networks (WDNs):

vulnerability assessment and the optimal placement of water quality sensors. The main

novelty element of this paper is to represent the data of the problem, in this case all objects …

All 2 versions 

Synthesis of Adversarial DDoS Attacks Using Wasserstein Generative Adversarial Networks with Gradient Penalty

CS Shieh, TT Nguyen, WW Lin… - 2021 6th …, 2021 - ieeexplore.ieee.org

DDoS (Distributed Denial of Service) has become a pressing and challenging threat to the

security and integrity of computer networks and information systems. The detection of DDoS

attacks is essential before any mitigation approaches can be taken. AI (Artificial Intelligence) …

All 2 versions
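
Several of the WGAN-with-gradient-penalty entries above rely on the same critic objective; the following is a generic PyTorch sketch of it (assumed library and toy data, not any of the cited papers' code): the penalty drives the critic's input-gradient norm toward 1 on points interpolated between real and generated samples, approximating the 1-Lipschitz constraint in the Kantorovich–Rubinstein formulation of W1.

```python
import torch

def critic_loss_wgan_gp(critic, real, fake, lambda_gp=10.0):
    """WGAN critic loss plus the standard gradient penalty (illustrative sketch)."""
    eps = torch.rand(real.size(0), 1, device=real.device)           # per-sample mixing weight
    mixed = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    grad, = torch.autograd.grad(critic(mixed).sum(), mixed, create_graph=True)
    penalty = ((grad.norm(2, dim=1) - 1.0) ** 2).mean()             # push gradient norm toward 1
    return critic(fake).mean() - critic(real).mean() + lambda_gp * penalty

# Smoke test on 2-D toy data with a linear critic.
critic = torch.nn.Linear(2, 1)
real = torch.randn(64, 2)
fake = torch.randn(64, 2) + 3.0
loss = critic_loss_wgan_gp(critic, real, fake)
loss.backward()
print(float(loss))
```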


2021

[PDF] unifi.it

[PDF] Pattern-based music generation with wasserstein autoencoders and PRC descriptions

V Borghuis, L Angioloni, L Brusci… - Proceedings of the Twenty …, 2021 - flore.unifi.it

We present a pattern-based MIDI music generation system with a generation strategy based

on Wasserstein autoencoders and a novel variant of pianoroll descriptions of patterns which

employs separate channels for note velocities and note durations and can be fed into classic …

Related articles All 7 versions 

Distributionally robust tail bounds based on Wasserstein distance and $f$-divergence

by Birghila, Corina; Aigner, Maximilian; Engelke, Sebastian

06/2021

In this work, we provide robust bounds on the tail probabilities and the tail index of heavy-tailed distributions in the context of

model misspecification....

Journal Article  Full Text Online


[PDF] archives-ouvertes.fr

Finite volume approximation of optimal transport and Wasserstein gradient flows

G Todeschi - 2021 - hal.archives-ouvertes.fr

This thesis is devoted to the design of locally conservative and structure preserving schemes

for Wasserstein gradient flows, ie steepest descent curves in the Wasserstein space. The

time discretization is based on variational approaches that mimic at the discrete in time level …

Related articles All 14 versions 


Speech Bandwidth Extension Based on Wasserstein Generative Adversarial Network

X Chen, J Yang - 2021 IEEE 21st International Conference on …, 2021 - ieeexplore.ieee.org

Artificial bandwidth extension (ABE) algorithms have been developed to improve the quality

of narrowband calls before devices are upgraded to wideband calls. Most methods use

deep neural networks (DNN) to establish a nonlinear relationship between narrowband …

Conference Paper  Citation/Abstract

Speech Bandwidth Extension Based on Wasserstein Generative Adversarial Network

Chen, Xikun; Yang, Junmei.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings;

Piscataway, (2021). Abstract/Details   Show Abstract

SVAE-WGAN based Soft Sensor Data Supplement Method for Process Industry

S Gao, S Qiu, Z Ma, R Tian, Y Liu - IEEE Sensors Journal, 2021 - ieeexplore.ieee.org

Challenges of process industry, which is characterized as hugeness of process variables in

complexity of industrial environment, can be tackled effectively by the use of soft sensor

technology. However, how to supplement the dataset with effective data supplement method …

<——2021———2021———1860——  


 Human Motion Generation using Wasserstein GAN

A Shiobara, M Murakami - 2021 5th International Conference on Digital …, 2021 - dl.acm.org

Human motion control, edit, and synthesis are important tasks to create 3D computer

graphics video games or movies, because some characters act like humans in most of them.

Our aim is to construct a system which can generate various natural character motions. We …

 Cited by 1 Related articles


Small Object Detection in Complex Large Scale Spatial Image by Concatenating SRGAN and Multi-Task WGAN

Y Fu, C Zheng, L Yuan, H Chen… - 2021 7th International …, 2021 - ieeexplore.ieee.org

With rapid development of IOT, especially in the field of geological exploration, areal images

obtained through optical sensors with large spatial coverage are becoming an effective

material for earth understanding. As such that research on object detection among large …

Small Object Detection in Complex Large Scale Spatial Image by Concatenating SRGAN and Multi-Task WG...

by Fu, Yu; Zheng, Chengyu; Yuan, Liyuan ; More...

2021 7th International Conference on Big Data Computing and Communications (BigCom), 08/2021

With rapid development of IOT, especially in the field of geological exploration, areal images obtained through optical sensors with large spatial coverage are...

Conference Proceeding 

Full Text Online

 Cited by 1 Related articles All 3 versions


 Fault injection in optical path-detection quality degradation analysis with Wasserstein distance

P Kowalczyk, P Bugiel, M Szelest… - … on Methods and …, 2021 - ieeexplore.ieee.org

The goal of this paper is to present results of analysis of artificially generated disturbances

imitating real defects of camera that occurs in the process of testing autonomous vehicles

both during rides and later, in vehicle software simulation and their impact on quality of …

Related articles

Minimizing Wasserstein-1 Distance by Quantile Regression for GANs Model

Y Chen, X Hou, Y Liu - Chinese Conference on Pattern Recognition and …, 2021 - Springer

Abstract In recent years, Generative Adversarial Nets (GANs) as a kind of deep generative

model has become a research focus. As a representative work of GANs model, Wasserstein

GAN involves earth moving distance which can be applied to un-overlapped probability …

15 References  Related records

 Related articles All 2 versions
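
The quantile view used in the entry above rests on the one-dimensional identity W1(μ, ν) = ∫_0^1 |F_μ^{-1}(q) − F_ν^{-1}(q)| dq; the short numerical check below (a sketch assuming NumPy/SciPy, unrelated to the paper's GAN model) shows the quantile-based estimate agreeing with SciPy's sample-based one.

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, 5000)
b = rng.normal(1.5, 2.0, 5000)

q = (np.arange(2000) + 0.5) / 2000                           # midpoint grid on (0, 1)
w1_quantiles = np.mean(np.abs(np.quantile(a, q) - np.quantile(b, q)))
print(w1_quantiles, wasserstein_distance(a, b))              # the two estimates nearly agree
```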

 [PDF] openreview.net

An Improved Composite Functional Gradient Learning by Wasserstein Regularization for Generative adversarial networks

C Wan, Y Fu, K Fan, J Zeng, M Zhong, R Jia, ML Li… - 2021 - openreview.net

Generative adversarial networks (GANs) are usually trained by a minimax game which is

notoriously and empirically known to be unstable. Recently, a totally new methodology

called Composite Functional Gradient Learning (CFG) provides an alternative theoretical …

 Cite Related articles 


2021

   

 2021  see 2020

A TextCNN and WGAN-gp based deep learning frame for unpaired text style transfer in multimedia services

M Hu, M He, W Su, A Chehri - Multimedia Systems, 2021 - Springer

With the rapid growth of big multimedia data, multimedia processing techniques are facing

some challenges, such as knowledge understanding, semantic modeling, feature

representation, etc. Hence, based on TextCNN and WGAN-gp (improved training of …

Cited by 3 Related articles All 3 versions


 

Data balancing for thermal comfort datasets using conditional wasserstein GAN with a weighted loss function

H Yoshikawa, A Uchiyama, T Higashino - Proceedings of the 8th ACM …, 2021 - dl.acm.org

The development of various machine learning methods helps to improve the performance of

the thermal comfort estimation. However, thermal comfort datasets are usually unbalanced

because hot/cold environments rarely appear in an air-conditioned environment …

 

[PDF] aimsciences.org

Distributionally robust chance constrained SVM model with l2-Wasserstein distance

Q Ma, Y Wang - Journal of Industrial & Management Optimization, 2021 - aimsciences.org

In this paper, we propose a distributionally robust chance-constrained SVM model with l2-

Wasserstein ambiguity. We present equivalent formulations of distributionally robust chance

constraints based on l2-Wasserstein ambiguity. In terms of this method, the distributionally …

 All 2 versions 


 [PDF] arxiv.org

On the Wasserstein Distance Between  k-Step Probability Measures on Finite Graphs

S Benjamin, A Mantri, Q Perian - arXiv preprint arXiv:2110.10363, 2021 - arxiv.org

We consider random walks $ X, Y $ on a finite graph $ G $ with respective lazinesses

$\alpha,\beta\in [0, 1] $. Let $\mu_k $ and $\nu_k $ be the $ k $-step transition probability

measures of $ X $ and $ Y $. In this paper, we study the Wasserstein distance between …

 All 2 versions 
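
For intuition on the object studied above, the Wasserstein-1 distance between two probability measures on a finite graph (with shortest-path ground metric) is just a small linear program; the sketch below (illustrative only, assuming NetworkX and SciPy, not the paper's machinery) solves it for a 6-cycle, where the uniform distribution sits at average shortest-path distance 1.5 from a point mass at node 0.

```python
import numpy as np
import networkx as nx
from scipy.optimize import linprog

G = nx.cycle_graph(6)                                        # toy graph
D = np.asarray(nx.floyd_warshall_numpy(G), dtype=float)      # shortest-path ground metric
n = D.shape[0]

mu = np.full(n, 1.0 / n)                                     # e.g. a k-step distribution
nu = np.zeros(n)
nu[0] = 1.0                                                  # e.g. the walk's starting law

# Transport plan P >= 0 with row sums mu and column sums nu, minimizing <D, P>.
A_eq = np.zeros((2 * n, n * n))
for i in range(n):
    A_eq[i, i * n:(i + 1) * n] = 1.0                         # row-sum constraint for mu[i]
    A_eq[n + i, i::n] = 1.0                                  # column-sum constraint for nu[i]
res = linprog(D.ravel(), A_eq=A_eq, b_eq=np.concatenate([mu, nu]), bounds=(0, None))
print("W1 =", res.fun)                                       # 1.5 for this toy example
```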

<——2021———2021———1870——  


 2021  see 2020  [PDF] arxiv.org

Approximation rate in Wasserstein distance of probability measures on the real line by deterministic empirical measures

O Bencheikh, B Jourdain - Journal of Approximation Theory, 2021 - Elsevier

We are interested in the approximation in Wasserstein distance with index ρ≥ 1 of a

probability measure μ on the real line with finite moment of order ρ by the empirical measure

of N deterministic points. The minimal error converges to 0 as N → +∞ and we try to …


R-WGAN-based Multi-timescale Enhancement Method for Predicting f-CaO Cement Clinker

X Hao, L Liu, G Huang, Y Zhang… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org

To address the problem that the high dimensionality, time-series, coupling and multiple time

scales of production data in the process industry lead to the low accuracy of traditional

prediction models, we propose a multi-time scale data enhancement and cement clinker f …

 Cited by 1 Related articles

[PDF] openreview.net

Probabilistic human-like gesture synthesis from speech using GRU-based WGAN

B Wu, C Liu, CT Ishi, H Ishiguro - Companion Publication of the 2021 …, 2021 - dl.acm.org

Gestures are crucial for increasing the human-likeness of agents and robots to achieve

smoother interactions with humans. The realization of an effective system to model human

gestures, which are matched with the speech utterances, is necessary to be embedded in …

Cited by 3 Related articles All 2 versions

MR4345084 Prelim Palma, Giacomo De; Marvian, Milad; Trevisan, Dario; Lloyd, Seth; 

The quantum Wasserstein distance of order 1.

 IEEE Trans. Inform. Theory 67 (2021), no. 10, 6627–6643. 81 (94A17)


2021

He, Ruiqiang; Feng, Xiangchu; Zhu, Xiaolong; Huang, Hua; Wei, Bingzhe

RWRM: residual Wasserstein regularization model for image restoration. (English) Zbl 07454686

Inverse Probl. Imaging 15, No. 6, 1307-1332 (2021).

MSC:  68U10

PDF BibTeX XML 

 

2021


Bayraktar, Erhan; Guo, Gaoyue

Strong equivalence between metrics of Wasserstein type. (English) Zbl 07453035

Electron. Commun. Probab. 26, Paper No. 13, 13 p. (2021).

MSC:  90C25  PDF BibTeX XML Cite

Full Text: DOI

Cited by 15 Related articles All 6 versions

2021  see 2020  Working Paper  Full Text

Encoded Prior Sliced Wasserstein AutoEncoder for learning latent manifold representations

Krishnagopal, Sanjukta; Bedrossian, Jacob.arXiv.org; Ithaca, Dec 10, 2021.

Abstract/DetailsGet full textLink to external site, this link will open in a new window


Considering anatomical prior information for low-dose CT image enhancement using attribute-augmented Wasserstein...
by Huang, Zhenxing; Liu, Xinfeng; Wang, Rongpin ; More...
Neurocomputing (Amsterdam), 03/2021, Volume 428
•Anatomical prior information is introduced to estimate high-resolution CT images. •A united framework is applied instead of designing...
ArticleView Article PDF
Journal Article  Full Text Online
Reports on Neural Computation Findings from Huazhong University of Science and Technology Provide New Insights (Considering Anatomical Prior Information for Low-dose Ct Image Enhancement Using Attribute-augmented Wasserstein...
Robotics & Machine Learning, 03/2021
Newsletter  Full Text Online
 Cited by 8
 Related articles


Distributionally Robust Joint Chance-Constrained Programming with Wasserstein...

by Gu, Yining; Wang, Yanjun
03/2021
In this paper, we develop an exact reformulation and a deterministic approximation for distributionally robust joint chance-constrained programmings (DRCCPs)...
Journal Article  Full Text Online

A Recommender System Based on Model Regularization Wasserstein Generative Adversarial Network*
Wang, Qingxian; Huang, Qing; Ma, Kangkang.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).
Abstract/Details   Show Abstract 

Related articles

<——2021———2021———1880—— 


 
Conference Paper  Citation/Abstract

Optimization of the Diffusion Time in Graph Diffused-Wasserstein Distances: Application to Domain Adaptation
Goncalves, Paulo; Sebban, Marc; Borgnat, Pierre; Gribonval, Remi; Vayer, Titouan.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).

Abstract/Details Get full textLink to external site, this link will open in a new window

Show Abstract 
Cited by 1
 Related articles All 4 versions


Conference Paper  Citation/Abstract

Differentially Privacy-Preserving Federated Learning Using Wasserstein Generative Adversarial Network
Wan, Yichen; Qu, Youyang; Gao, Longxiang; Xiang, Yong.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).

Abstract/Details   Show Abstract 

Related articles All 2 versions
 
Wasserstein Distributionally Robust Inverse Multiobjective Optimization

Dong, CS and Zeng, B

35th AAAI Conference on Artificial Intelligence / 33rd Conference on Innovative Applications of Artificial Intelligence / 11th Symposium on Educational Advances in Artificial Intelligence

2021 | 

THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE

 35 , pp.5914-5921

Inverse multiobjective optimization provides a general framework for the unsupervised learning task of inferring parameters of a multiobjective decision making problem (DMP), based on a set of observed decisions from the human expert. However, the performance of this framework relies critically on the availability of an accurate DMP, sufficient decisions of high quality, and a parameter space t

Show more

24 References  Related records

2021

Decision Making Under Model Uncertainty: Fréchet–Wasserstein Mean Preferences

Petracou, EV; Xepapadeas, A and Yannacopoulos, AN

Jul 2021 (Early Access) | MANAGEMENT SCIENCE

This paper contributes to the literature on decision making under multiple probability models by studying a class of variational preferences. These preferences are defined in terms of Frechet mean utility functionals, which are based on the Wasserstein metric in the space of probability models. In order to produce a measur…

Full Text at Publisher

40 References  Related records


Conference Paper  Citation/Abstract

Wasserstein Barycenter for Multi-Source Domain Adaptation
Mboula, Fred Maurice Ngole.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).
Abstract/Details Show Abstract 
Cited by 3 Related articles 

[PDF] thecvf.com

Wasserstein Barycenter for Multi-Source Domain Adaptation

EF Montesuma, FMN Mboula - Proceedings of the IEEE …, 2021 - openaccess.thecvf.com

Multi-source domain adaptation is a key technique that allows a model to be trained on data

coming from various probability distribution. To overcome the challenges posed by this

learning scenario, we propose a method for constructing an intermediate domain between …

 Cited by 2 Related articles All 4 versions 


 2021


Conference Paper  Citation/Abstract
Self-Supervised Wasserstein Pseudo-Labeling for Semi-Supervised Image Classification
Dabouei, Ali; Soleymani, Sobhan; Dawson, Jeremy; Nasrabadi, Nasser M.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).

Abstract/Details   Show Abstract 
Cited by 11
 Related articles All 6 versions 


Conference Paper  Citation/Abstract

Fault injection in optical path - detection quality degradation analysis with Wasserstein distance
Kowalczyk, Pawel; Bugiel, Paulina; Szelest, Marcin; Izydorczyk, Jacek.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).
Abstract/Details   Show Abstract 


Conference Paper  Citation/Abstract

Two-sample Test using Projected Wasserstein Distance
Wang, Jie; Gao, Rui; Xie, Yao.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).

Abstract/Details   Show Abstract 
 Cited by 10 Related articles All 7 versions


Fast Wasserstein-distance-based distributionally robust chance-constrained power dispatch for multi-zone HVAC systems

G Chen, H Zhang, H Hui, Y Song - IEEE Transactions on Smart …, 2021 - ieeexplore.ieee.org

Heating, ventilation, and air-conditioning (HVAC) systems play an increasingly important

role in the construction of smart cities because of their high energy consumption and

available operational flexibility for power systems. To enhance energy efficiency and utilize …
Scholarly Journal  Citation/Abstract

Fast Wasserstein-Distance-Based Distributionally Robust Chance-Constrained Power Dispatch for Multi-Zone HVAC Systems
Chen, Ge; Zhang, Hongcai; Hui, Hongxun; Song, Yonghua.IEEE Transactions on Smart Grid; Piscataway Vol. 12, Iss. 5,  (2021): 4016-4028.
Abstract/Details   Show Abstract 

Cited by 4 Related articles All 2 versions

Cited by 19 Related articles All 3 versions


Scholarly Journal  Citation/Abstract

The Wasserstein-Fourier Distance for Stationary Time Series
Cazelles, Elsa; Arnaud, Robert; Tobar, Felipe.IEEE Transactions on Signal Processing; New York Vol. 69,  (2021): 709-721.
Abstract/Details Get full textLink to external site, this link will open in a new window
Show Abstract 
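
Rough sketch in the spirit of the Wasserstein–Fourier idea referenced above (the paper works with the 2-Wasserstein distance between normalized power spectral densities; as a simplifying assumption, this toy version uses SciPy's 1-Wasserstein on normalized periodograms).

```python
import numpy as np
from scipy.signal import periodogram
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
t = np.arange(4096)
x = np.sin(2 * np.pi * 0.05 * t) + 0.3 * rng.standard_normal(t.size)   # tone near 0.05 cycles/sample
y = np.sin(2 * np.pi * 0.12 * t) + 0.3 * rng.standard_normal(t.size)   # tone near 0.12 cycles/sample

fx, px = periodogram(x)                                                 # fs = 1 by default
fy, py = periodogram(y)
d = wasserstein_distance(fx, fy, u_weights=px / px.sum(), v_weights=py / py.sum())
print(d)                                                                # reflects the shift between dominant frequencies
```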

<——2021———2021———1890——  



Scholarly Journal  Citation/Abstract

node2coords: Graph Representation Learning with Wasserstein Barycenters
Simou, Effrosyni; Thanou, Dorina; Frossard, Pascal.IEEE Transactions on Signal and Information Processing over Networks; Piscataway Vol. 7,  (2021): 17-29.
Abstract/Details Get full textLink to external site, this link will open in a new window
 Show Abstract 


Scholarly Journal Citation/Abstract
Parameter-Transferred Wasserstein Generative Adversarial Network (PT-WGAN) for Low-Dose PET Image Denoising
Gong, Yu; Shan, Hongming; Teng, Yueyang; Tu, Ning; Li, Ming; et al.IEEE Transactions on Radiation and Plasma Medical Sciences; Piscataway Vol. 5, Iss. 2,  (2021): 213-223.
Abstract/Details Get full textLink to external site, this link will open in a new window
Show Abstract 
 


Scholarly Journal  Citation/Abstract

DPIR-Net: Direct PET Image Reconstruction Based on the Wasserstein Generative Adversarial Network
Hu, Zhanli; Xue, Hengzhi; Zhang, Qiyang; Gao, Juan; Zhang, Na; et al.IEEE Transactions on Radiation and Plasma Medical Sciences; Piscataway Vol. 5, Iss. 1,  (2021): 35-43.
Abstract/Details
 
Show Abstract 


 
Scholarly Journal  Citation/Abstract

Wasserstein GANs for MR Imaging: From Paired to Unpaired Training
Ke Lei; Mardani, Morteza; Pauly, John M; Vasanawala, Shreyas S.IEEE Transactions on Medical Imaging; New York Vol. 40, Iss. 1,  (2021): 105-115.
Abstract/Details Get full textLink to external site, this link will open in a new window
Show Abstract 

[PDF] arxiv.org

Berry–Esseen smoothing inequality for the Wasserstein metric on compact Lie groups

B Borda - Journal of Fourier Analysis and Applications, 2021 - Springer

… Wasserstein metric in terms of their Fourier transforms. We use a generalized form of the

Wasserstein metric, … walks on semisimple groups in the Wasserstein metric is necessarily almost …

  Cited by 1 Related articles All 5 versions

2021


Scholarly Journal  Citation/Abstract

Pixel-Wise Wasserstein Autoencoder for Highly Generative Dehazing
Kim, Guisik; Sung Woo Park; Kwon, Junseok.IEEE Transactions on Image Processing; New York Vol. 30,  (2021): 5452-5462.
Abstract/Details  Show Abstract 

Cited by 8 Related articles All 5 versions


Scholarly Journal  Citation/Abstract

Wasserstein Distributionally Robust Stochastic Control: A Data-Driven Approach
Yang, Insoon.IEEE Transactions on Automatic Control; New York Vol. 66, Iss. 8,  (2021): 3863-3870.
Abstract/Details Get full textLink to external site, this link will open in a new window
Show Abstract 


Scholarly Journal  Citation/Abstract

Brain Extraction From Brain MRI Images Based on Wasserstein GAN and O-Net
Jiang, Shaofeng; Guo, Lanting; Cheng, Guangbin; Chen, Xingyan; Zhang, Congxuan; et al.IEEE Access; Piscataway Vol. 9,  (2021): 136762-136774.
Abstract/Details
 Get full textLink to external site, this link will open in a new window
Show Abstract 


Scholarly Journal  Full Text

CHICKN: extraction of peptide chromatographic elution profiles from large scale mass spectrometry data by means of Wasserstein compressive hierarchical cluster analysis
Permiakova, Olga; Guibert, Romain; Kraut, Alexandra; Fortin, Thomas; Hesse, Anne-Marie; et al.BMC Bioinformatics; London Vol. 22,  (2021): 1-30.

Abstract/DetailsFull text - PDF (5 MB)‎
Show Abstract 

Cited by 2 Related articles All 23 versions

Scholarly Journal  Full Text

Wasserstein Bounds in the CLT of the MLE for the Drift Coefficient of a Stochastic Partial Differential Equation
Khalifa Es-Sebaiy; Al-Foraih, Mishari; Alazemi, Fares.Fractal and Fractional; Basel Vol. 5, Iss. 4,  (2021): 187.

Abstract/DetailsFull textFull text - PDF (335 KB)‎

Show Abstract 

 All 5 versions 

<——2021———2021———1900——


Scholarly Journal  Full Text

Prediction of Aquatic Ecosystem Health Indices through Machine Learning Models Using the WGAN-Based Data Augmentation Method

Lee, Seoro; Kim, Jonggun; Lee, Gwanjae; Hong, Jiyeong; Bae, Joo Hyun; et al.Sustainability; Basel Vol. 13, Iss. 18,  (2021): 10435.

Abstract/DetailsFull textFull text - PDF (3 MB)‎

Cited by 6 Related articles All 6 versions

 

Scholarly Journal  Full Text

Low-Illumination Image Enhancement in the Space Environment Based on the DC-WGAN Algorithm

Sensors; Basel Vol. 21, Iss. 1,  (2021): 286.

Cited by 2 Related articles All 8 versions 

Conference Paper  Citation/Abstract

Domain Adaptive Rolling Bearing Fault Diagnosis based on Wasserstein Distance

Yang, Chunliu; Bao, Jun; Li, Zhuorui.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference

Proceedings; Piscataway, (2021).

Abstract/Details   Show Abstract 

 The Wasserstein distance to the Circular Law - ResearchGate

https://www.researchgate.net › ... › Circularity

Jonas Jalowy

Nov 30, 2021 — We investigate the Wasserstein distance between the empirical spectral distribution of non-Hermitian random matrices and the Circular Law.

 Cited by 3 Related articles All 3 versions

Efficient and Robust Classification for Positive Definite Matrices with Wasserstein Metric

by J Cui · Thesis Apr1 5.pdf (1.148Mb) ... 04-16-2023 ... The results obtained in this paper include that Bures-Wasserstein simple projection mean ...

Master

2021

ARTICLE

Low-Dose CT Image Denoising with Improving WGAN and Hybrid Loss Function

Li, Zhihua ; Shi, Weili ; Xing, Qiwei ; Miao, Yu ; He, Wei ; Yang, Huamin ; Jiang, Zhengang; Khosrowabadi, Reza. Computational and mathematical methods in medicine, 2021, Vol.2021, p.2973108-14

PEER REVIEWED

OPEN ACCESS

[HTML] hindawi.com

[HTML] Low-dose CT image denoising with improving WGAN and hybrid loss function

Z Li, W Shi, Q Xing, Y Miao, W He, H Yang… - … Methods in Medicine, 2021 - hindawi.com

The X-ray radiation from computed tomography (CT) brought us the potential risk. Simply

decreasing the dose makes the CT images noisy and diagnostic performance compromised.

Here, we develop a novel denoising low-dose CT image method. Our framework is based …

Cited by 2 Related articles All 6 versions 


[PDF] jst.go.jp

[Japanese] Optimal control of one-way car-sharing services based on Wasserstein-type costs

星野健太, 櫻間一徳 - Proceedings of the 64th Japan Joint Automatic Control Conference …, 2021 - jstage.jst.go.jp

… This study discusses an optimal control problem of Markov chains with a cost given by

distances in the space of probability distributions, called Wasserstein distances. The control

problem is formulated as a problem of finding the optimal control that designates the probability …

  All 2 versions


[PDF] arxiv.org

Multi-period facility location and capacity planning under -Wasserstein joint chance constraints in humanitarian logistics

Z Wang, K You, Z Wang, K Liu - arXiv preprint arXiv:2111.15057, 2021 - arxiv.org

The key of the post-disaster humanitarian logistics (PD-HL) is to build a good facility location

and capacity planning (FLCP) model for delivering relief supplies to affected areas in time.

To fully exploit the historical PD data, this paper adopts the data-driven distributionally …

 All 3 versions 


CNN-based Continuous Authentication on Smartphones with Conditional Wasserstein Generative Adversarial Network

Y Li, J Luo, S Deng, G Zhou - IEEE Internet of Things Journal, 2021 - ieeexplore.ieee.org

With the widespread usage of mobile devices, the authentication mechanisms are urgently

needed to identify users for information leakage prevention. In this paper, we present CAGANet,

a CNN-based continuous authentication on smartphones using a conditional Wasserstein …

Related articles

 2021 see 2022

Distributionally Safe Path Planning: Wasserstein Safe RRT

P Lathrop, B Boardman… - IEEE Robotics and …, 2021 - ieeexplore.ieee.org

… Abstract—In this paper, we propose a Wasserstein metricbased random path planning

algorithm. Wasserstein Safe RRT (W-Safe RRT) provides finite-… We define limits on distributional

sampling error so the Wasserstein distance between a vehicle state distribution and obstacle …

Related articles

 <——2021———2021———1910—— 


[HTML] springer.com

[HTML] A Bismut–Elworthy inequality for a Wasserstein diffusion on the circle

V Marx - Stochastics and Partial Differential Equations: Analysis …, 2021 - Springer

We introduce in this paper a strategy to prove gradient estimates for some infinite-dimensional

diffusions on $$L_2$$ L 2 -Wasserstein spaces. For a specific example of a diffusion on …

  Related articles All 15 versions


Run-Sort-ReRun: Escaping Batch Size Limitations in Sliced Wasserstein Generative Models

J Lezama, W Chen, Q Qiu - International Conference on …, 2021 - proceedings.mlr.press

… In this paper, we build upon recent progress in sliced Wasserstein distances, a family of

differentiable metrics for distribution discrepancy based on the Optimal Transport paradigm.

We introduce a procedure to train these distances with virtually any batch size, allowing the …

Cited by 2 All 2 versions 

Conditional Wasserstein generative adversarial networks applied to acoustic metamaterial design

P Lai, F Amirkulova, P Gerstoft - … Journal of the Acoustical Society of …, 2021 - asa.scitation.org

This work presents a method for the reduction of the total scattering cross section (TSCS) for

a planar configuration of cylinders by means of generative modeling and deep learning.

Currently, the minimization of TSCS requires repeated forward modelling at considerable …

Cited by 1 Related articles All 5 versions

Adversarial training with Wasserstein distance for learning cross-lingual word embeddings

Y Li, Y Zhang, K Yu, X Hu - Applied Intelligence, 2021 - Springer

… In this study, we employ the 1-Wasserstein distance to measure the discrepancy of two

embedding distributions. For simplicity, the … of the Wasserstein distance is an optimization

problem, we design a Wasserstein critic network C to implement the Wasserstein distance. Based on …

Cited by 5 Related articles All 3 versions

[PDF] arxiv.org

Infinite-dimensional regularization of McKean–Vlasov equation with a Wasserstein diffusion

V Marx - Annales de l'Institut Henri Poincaré, Probabilités et …, 2021 - projecteuclid.org

Much effort has been spent in recent years on restoring uniqueness of McKean–Vlasov

SDEs with non-smooth coefficients. As a typical instance, the velocity field b is assumed to be

bounded and measurable in its space variable and Lipschitz-continuous with respect to the …

Cited by 3 Related articles All 11 versions


2021


Data augmentation for rolling bearing fault diagnosis using an enhanced few-shot Wasserstein auto-encoder with meta-learning

Z Pei, H Jiang, X Li, J Zhang, S Liu - Measurement Science and …, 2021 - iopscience.iop.org

Despite the advance of intelligent fault diagnosis for rolling bearings, in industries, data-driven

methods still suffer from data acquisition and imbalance. We propose an enhanced few-shot

Wasserstein auto-encoder (fs-WAE) to reverse the negative effect of imbalance. Firstly, an …

Cited by 8 Related articles All 2 versions

[PDF] arxiv.org

The cutoff phenomenon in Wasserstein distance for nonlinear stable Langevin systems with small Lévy noise

G Barrera, MA Högele, JC Pardo - arXiv preprint arXiv:2108.08351, 2021 - arxiv.org

… This article establishes the cutoff phenomenon in the Wasserstein distance for systems of …

index α > 3/2 to the formulation in Wasserstein distance, which allows to cover the case of …

the nonstandard shift linearity property of the Wasserstein distance, which is established by …

  All 5 versions 


[PDF] arxiv.org

A material decomposition method for dual‐energy CT via dual interactive Wasserstein generative adversarial networks

Z Shi, H Li, Q Cao, Z Wang, M Cheng - Medical Physics, 2021 - Wiley Online Library

… In this study, a data-driven approach using dual interactive Wasserstein generative

adversarial networks (DIWGAN) is developed to … The data distributions of ground truth (P r )

and the generated image (P g ) were compared by the Wasserstein distance instead of the JS …

Cited by 9 Related articles All 6 versions

[PDF] iaeng.org

[PDF] Automatic Image Annotation Using Improved Wasserstein Generative Adversarial Networks.

J Liu, W Wu - IAENG International Journal of Computer Science, 2021 - iaeng.org

… To solve this problem, in this study a new annotation model combining the improved

Wasserstein generative adversarial network (GAN) and word2vec was proposed. First, the

tagged vocabulary was mapped to a fixed multidimensional word vector by word2vec. Second, a …

Cited by 8 Related articles All 3 versions 

A data-driven scheduling model of virtual power plant using Wasserstein distributionally robust optimization

H Liu, J Qiu, J Zhao - International Journal of Electrical Power & Energy …, 2021 - Elsevier

Distributed energy resources (DER) can be efficiently aggregated by aggregators to sell

excessive electricity to spot market in the form of Virtual Power Plant (VPP). The aggregator

schedules DER within VPP to participate in day-ahead market for maximizing its profits while …

 All 2 versions

<——2021———2021———1920——

[HTML] copernicus.org

[HTML] Ensemble Riemannian data assimilation over the Wasserstein space

SK Tamang, A Ebtehaj, PJ Van Leeuwen… - Nonlinear Processes …, 2021 - npg.copernicus.org

In this paper, we present an ensemble data assimilation paradigm over a Riemannian

manifold equipped with the Wasserstein metric. Unlike the Euclidean distance used in classic

data assimilation methodologies, the Wasserstein metric can capture the translation and …

  Related articles All 7 versions 

An Improved Mixture Density Network via Wasserstein Distance Based Adversarial Learning for Probabilistic Wind Speed Predictions

L Yang, Z Zheng, Z Zhang - IEEE Transactions on Sustainable …, 2021 - ieeexplore.ieee.org

This paper develops a novel mixture density network via Wasserstein distance based adversarial

learning (WA-IMDN) for achieving more accurate probabilistic wind speed predictions

(PWSP). The proposed method utilizes historical supervisory control and data acquisition (…

Cited by 4 Related articles All 18 versions 

[PDF] tandfonline.com

Stochastic approximation versus sample average approximation for Wasserstein barycenters

D Dvinskikh - Optimization Methods and Software, 2021 - Taylor & Francis

… We show that for the Wasserstein barycenter problem, this superiority can be inverted. We

provide a detailed comparison by stating the … a general convex optimization problem given

by the expectation to have other applications besides the Wasserstein barycenter problem. …

 All 3 versions


Sliced Wasserstein Distance for Neural Style Transfer

J Li, D Xu, S Yao - Computers & Graphics, 2021 - Elsevier

… In this paper, we propose a new style loss based on Sliced Wasserstein Distance (SWD),

which has a theoretical approximation guarantee. Besides, an adaptive sampling algorithm

is also proposed to further improve the style transfer results. Experiment results show that the …

 All 2 versions
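
A minimal NumPy sketch of the sliced Wasserstein-2 distance used as a style loss in the entry above (illustrative only, not the authors' implementation): random one-dimensional projections reduce each comparison to sorting, which is what keeps the loss cheap to evaluate.

```python
import numpy as np

def sliced_w2(x, y, n_proj=256, seed=0):
    """Monte-Carlo sliced W2 between two equal-size point clouds x, y of shape (n, d)."""
    rng = np.random.default_rng(seed)
    theta = rng.standard_normal((n_proj, x.shape[1]))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)     # random unit directions
    xp = np.sort(x @ theta.T, axis=0)                         # sorted 1-D projections
    yp = np.sort(y @ theta.T, axis=0)
    return np.sqrt(np.mean((xp - yp) ** 2))                   # average per-slice W2^2, then sqrt

x = np.random.default_rng(1).normal(size=(1024, 64))
y = np.random.default_rng(2).normal(loc=0.5, size=(1024, 64))
print(sliced_w2(x, y))
```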


  2021


[HTML] mdpi.com

Geometric Characteristics of the Wasserstein Metric on SPD (n) and Its Applications on Data Processing

Y Luo, S Zhang, Y Cao, H Sun - Entropy, 2021 - mdpi.com

… In this paper, by involving the Wasserstein metric on S P D ( n ) , we obtain computationally

… Laplacian, we present the connection between Wasserstein sectional curvature and edges.

… In Section 3, we describe the Wasserstein geometry of S P D ( n ) , including the geodesic, …

 All 9 versions 
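
The SPD(n) geometry in the entry above is the Bures–Wasserstein geometry, whose distance has a closed form; a small sketch (assuming SciPy's sqrtm, not the authors' code) evaluates W2 between two Gaussians via W2² = ‖m1 − m2‖² + tr(S1 + S2 − 2(S1^{1/2} S2 S1^{1/2})^{1/2}).

```python
import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein(m1, S1, m2, S2):
    """Closed-form W2 distance between Gaussians N(m1, S1) and N(m2, S2)."""
    rt = np.real(sqrtm(S1))
    cross = np.real(sqrtm(rt @ S2 @ rt))            # (S1^1/2 S2 S1^1/2)^1/2
    bures2 = np.trace(S1 + S2 - 2.0 * cross)        # squared Bures distance between covariances
    return np.sqrt(np.sum((m1 - m2) ** 2) + bures2)

m1, S1 = np.zeros(2), np.eye(2)
m2, S2 = np.ones(2), np.diag([2.0, 0.5])
print(bures_wasserstein(m1, S1, m2, S2))            # about 1.50 for this toy pair
```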


The Wasserstein Impact Measure (WIM): A practical tool for quantifying prior impact in Bayesian statistics

F Ghaderinezhad, C Ley, B Serrien - Computational Statistics & Data …, 2021 - Elsevier

… sharp lower and upper bounds on the Wasserstein distance and their approach relies on a

… For practical purposes, the power of the Wasserstein distance idea has not been exploited

… More concretely, we will provide in Section 2 the Wasserstein Impact Measure, abbreviated …

 All 2 versions

 

[HTML] oup.com

Differential semblance optimisation based on the adaptive quadratic Wasserstein distance

Z Yu, Y Liu - Journal of Geophysics and Engineering, 2021 - academic.oup.com

As the robustness for the wave equation-based inversion methods, wave equation

migration velocity analysis (WEMVA) is stable for overcoming the multipathing problem and

has become popular in recent years. As a rapidly developed method, differential semblance …


2021

Temporal conditional Wasserstein GANs for audio-visual affect-related ties

C Athanasiadis, E Hortal… - 2021 9th International …, 2021 - ieeexplore.ieee.org

… tion within emotion expressivity contexts by introducing an approach called temporal

conditional Wasserstein GANs (tcwGANs). The focus of this work is placed on the following two

research questions: Firstly, whether Wasserstein loss can help in improving the performance of …

 All 2 versions


2021  [PDF] projecteuclid.org

Wasserstein gradients for the temporal evolution of probability distributions

Y Chen, HG Müller - Electronic Journal of Statistics, 2021 - projecteuclid.org

… Hence, we utilize temporal derivatives of log maps, the Wasserstein temporal gradients, to

model the instantaneous temporal evolution of distributions… and then estimate the Wasserstein

temporal gradients by difference quotients based on the local Fréchet regression estimates. …

  Related articles All 2 versions

<——2021———2021———1930—— 



2021  [PDF] openreview.net

Combining Wasserstein GAN and Spatial Transformer Network for Medical Image Segmentation

Z Zhang, J Wang, Y Wang, S Li - 2021 - openreview.net

… process of Wasserstein GAN with gradient penalty(WGAN-GP), the wasserstein distance may

not … We try to make the Wasserstein distance better reflect the current training situation by

adding … of the model according to the wasserstein distance. Keywords: Image segmentation, …



2021  [PDF] mlr.press

Differentially private sliced wasserstein distance

A Rakotomamonjy, R Liva - International Conference on …, 2021 - proceedings.mlr.press

… mechanism of the Sliced Wasserstein Distance, and we establish the sensitivity of the

resulting differentially private mechanism. One of … the Sliced Wasserstein distance into another

distance, that we call the Smoothed Sliced Wasserstein Distance. This new differentially private …

  All 25 versions 


2021

Sliced Gromov-Wasserstein - ACM Digital Library

https://dl.acm.org › doi


Jun 15, 2021 — Recently used in various machine learning contexts, the Gromov-Wasserstein distance (GW) allows for comparing distributions whose supports ...

 

 2021

Differentially Privacy-Preserving Federated Learning Using Wasserstein Generative Adversarial Network

Y Wan, Y Qu, L Gao, Y Xiang - 2021 IEEE Symposium on …, 2021 - ieeexplore.ieee.org

… To address this issue, we propose to integrate Wasserstein Generative Adversarial

Network (WGAN) and differential privacy (DP) to protect the model parameters. WGAN is used

to generate controllable random noise, which is then injected into model parameters. The new …

 All 2 versions


 2021

Lecture 15: Semicontinuity and Convexity of Energies in the Wasserstein Space

L Ambrosio, E Brué, D Semola - Lectures on Optimal Transport, 2021 - Springer

… We move now our attention to the study of geodesic convexity of internal energies on \(\mathcal

{P}_2(\mathbb {R}^n)\). Assume that we are given an energy density U : [0, ∞) → (−∞, ∞]

convex, lower semicontinuous and with U(0) = 0 (this is a natural assumption motivated by the …

 

Fault Diagnosis of Rotating Machinery Based on Wasserstein Distance and Feature Selection

F Ferracuti, A Freddi, A Monteriù… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org

… In this work, first, frequency- and time-based features are extracted by vibration signals, and 

second, the Wasserstein distance … Wasserstein distance is considered in the learning phase 

to discriminate the different machine operating conditions. Specifically, the 1-D Wasserstein

Cited by 4 Related articles


[PDF] openreview.net

Sliced Wasserstein Variational Inference

M Yi, S Liu - Fourth Symposium on Advances in Approximate …, 2021 - openreview.net

… inference method by minimizing sliced Wasserstein distance. This sliced Wasserstein distance 

can be … Our approximation also does not require a tractable density function of variational …

Cited by 2 Related articles

[PDF] arxiv.org

Unsupervised Ground Metric Learning using Wasserstein Eigenvectors

GJ Huizing, L Cantini, G Peyré - arXiv preprint arXiv:2102.06278, 2021 - arxiv.org

… While this is not the focus of this paper, one can extend Wasserstein eigenvectors to a 

possibly infinite collection of probability measures A … In this paper we defined Wasserstein 

eigenvectors and provided results on their existence and uniqueness. We then extended these …

 Related articles All 2 versions


Decision Making Under Model Uncertainty: Fréchet–Wasserstein Mean Preferences

EV Petracou, A Xepapadeas… - Management …, 2021 - pubsonline.informs.org

This paper contributes to the literature on decision making under multiple probability models 

by studying a class of variational preferences. These preferences are defined in terms of 

Fréchet mean utility functionals, which are based on the Wasserstein metric in the space of …

 

[PDF] arxiv.org

Least Wasserstein distance between disjoint shapes with perimeter regularization

M Novack, I Topaloglu, R Venkatraman - arXiv preprint arXiv:2108.04390, 2021 - arxiv.org

… p-Wasserstein distances, which metrize the weak convergence of probability measures on 

compact spaces [San15]. Indeed, length-minimizing Wasserstein … In this note we investigate 

the role of perimeter regularization in variational problems involving the Wasserstein distance …

 All 6 versions

[CITATION] Least Wasserstein distance between disjoint shapes with perimeter regularization

Related articles 

<——2021———2021———1940——


[PDF] thecvf.com

A unified framework for non-negative matrix and tensor factorisations with a smoothed Wasserstein loss

SY Zhang - Proceedings of the IEEE/CVF International …, 2021 - openaccess.thecvf.com

… Our work presents a unified framework for Wasserstein factorisation problems, since it 

handles the fully general case of finding a Tucker decomposition and includes nonnegative CP 

decompositions … We thus define the Wasserstein distance between tensors X, Y P(X) to be …

All 4 versions

Wasserstein Coupled Graph Learning for Cross-Modal Retrieval

Y Wang, T Zhang, X Zhang, Z Cui… - Proceedings of the …, 2021 - openaccess.thecvf.com

… Then, a Wasserstein coupled dictionary, containing multiple pairs of counterpart graph keys 

… to facilitate the similarity measurement through a Wasserstein Graph Embedding (WGE) 

process. … To further achieve discriminant graph learning, we specifically define a Wasserstein

 Cited by 2 Related articles All 3 versions


[PDF] arxiv.org

Hidden Convexity of Wasserstein GANs: Interpretable Generative Models with Closed-Form Solutions

A Sahiner, T Ergen, B Ozturkler, B Bartan… - arXiv preprint arXiv …, 2021 - arxiv.org

… In this work, we analyze the training of Wasserstein GANs with two-layer neural network 

discriminators through the lens of convex duality, and for a variety of generators expose the 

conditions under which Wasserstein GANs can be solved exactly with convex optimization …

Cited by 6 Related articles All 4 versions

A new data generation approach with modified Wasserstein auto-encoder for rotating machinery fault diagnosis with limited fault data

K Zhao, H Jiang, C Liu, Y Wang, K Zhu - Knowledge-Based Systems, 2021 - Elsevier

… a modified Wasserstein auto-encoder (MWAE) to generate data that are highly similar to 

the known data. The sliced Wasserstein distance is … The sliced Wasserstein distance with a 

gradient penalty is designed as the regularisation term to minimise the difference between the …

All 2 versions


[PDF] mlr.press

Wasserstein Distributional Normalization For Robust Distributional Certification of Noisy Labeled Data

SW Park, J Kwon - International Conference on Machine …, 2021 - proceedings.mlr.press

… We propose a novel Wasserstein distributional normalization method that can classify 

noisy labeled data accurately. Recently, noisy … introducing the non-parametric Ornstein-Ulenbeck 

type of Wasserstein gradient flows called Wasserstein distributional normalization, which is …

Related articles All 3 versions

2021


Wasserstein distance for categorical data? Relationship to TVD?

https://stats.stackexchange.com › questions › wasserstei...


Jan 10, 2021 — From my understanding, TVD makes more sense here as TVD works on non-metric spaces while Wasserstein is not (e.g. according to this paper).


Polymorphic Adversarial Cyberattacks Using WGAN - MDPI

https://www.mdpi.com › ...


by R Chauhan · 2021 — In this paper, we propose a model to generate adversarial attacks using Wasserstein GAN (WGAN). The attack data synthesized using the proposed model can be ...

 

[HTML] aclanthology.org

[HTML] Wasserstein Selective Transfer Learning for Cross-domain Text Mining

L Feng, M Qiu, Y Li, H Zheng… - Proceedings of the 2021 …, 2021 - aclanthology.org

… a Wasserstein-based discriminator to maximize the empirical distance between the selected

source data and target data. The TL module is then trained to minimize the estimated Wasserstein

… We further use a Wasserstein-based discriminator to maximize the empirical distance …
Cited by 2
 Related articles All 2 versions 

[PDF] jmlr.org

[PDF] A Fast Globally Linearly Convergent Algorithm for the Computation of Wasserstein Barycenters.

L Yang, J Li, D Sun, KC Toh - J. Mach. Learn. Res., 2021 - jmlr.org

We consider the problem of computing a Wasserstein barycenter for a set of discrete probability

distributions with finite supports, which finds many applications in areas such as statistics,

machine learning and image processing. When the support points of the barycenter are pre…

 2 Related articles All 22 versions 
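
For contrast with the general algorithm in the entry above, the one-dimensional case of the Wasserstein (W2) barycenter has a simple description obtained by averaging quantile functions; the sketch below (a toy illustration on samples, assuming NumPy, not the paper's method) computes it for three Gaussian samples.

```python
import numpy as np

rng = np.random.default_rng(0)
samples = [rng.normal(-2.0, 0.5, 4000),
           rng.normal(1.0, 1.0, 4000),
           rng.normal(3.0, 2.0, 4000)]

q = np.linspace(0.005, 0.995, 200)                           # common quantile levels
quantile_curves = np.stack([np.quantile(s, q) for s in samples])
barycenter = quantile_curves.mean(axis=0)                    # quantile function of the 1-D W2 barycenter
print(barycenter[:5])
```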


[PDF] arxiv.org

Plg-in: Pluggable geometric consistency loss with wasserstein distance in monocular depth estimation

N Hirose, S Koide, K Kawano… - 2021 IEEE International …, 2021 - ieeexplore.ieee.org

… Our objective is designed using the Wasserstein distance between two point clouds,

estimated from images with different camera poses. The Wasserstein distance can impose a

soft and symmetric coupling between two point clouds, which suitably maintains geometric …

Cited by 4 Related articles All 4 versions

 <——2021———2021———1950——


WDIBS: Wasserstein deterministic information bottleneck for state abstraction to balance state-compression and performance

X Zhu, T Huang, R Zhang, W Zhu - Applied Intelligence, 2021 - Springer

… In this section, we introduce the related notion of the Wasserstein distance to measure

decision performance after state compression. Based on the Wasserstein distance, we propose

the WDIBS algorithm and WSIBS algorithm. Our algorithms provide a better balance between …

 

 



2021  [PDF] arxiv.org

Disentangled Recurrent Wasserstein Autoencoder

J Han, MR Min, L Han, LE Li, X Zhang - arXiv preprint arXiv:2101.07496, 2021 - arxiv.org

… In this paper, we propose recurrent Wasserstein Autoencoder (R-WAE), a new framework for

… Wasserstein distance between model distribution and sequential data distribution, and

simultaneously maximizes the mutual information between input data and different disentangled …

 Cited by 6 Related articles All 4 versions 


2021  [HTML] mdpi.com

Geometric Characteristics of the Wasserstein Metric on SPD (n) and Its Applications on Data Processing

Y Luo, S Zhang, Y Cao, H Sun - Entropy, 2021 - mdpi.com

… In this paper, by involving the Wasserstein metric on S P D ( n ) , we obtain computationally

feasible expressions for some geometric … the Wasserstein curvature on S P D ( n ) . The

experimental results show the efficiency and robustness of our curvature-based methods. …

 All 9 versions 


2021  [PDF] oup.com

Simulation of broadband ground motions with consistent long-period and short-period components using Wasserstein interpolation of acceleration envelopes

…, A Iwaki, T Maeda, H Fujiwara, Ueda - Geophysical Journal …, 2021 - academic.oup.com

… of probability distributions, which enables the introduction of a metric known as the Wasserstein

distance, and (2) embed pairs of long-period … advantage of the Wasserstein distance as a

measure of dissimilarity of the envelopes. This method serves as a novel machine learning …

 Related articles


2021


2021  [PDF] arxiv.org

Lifting couplings in Wasserstein spaces

P Perrone - arXiv preprint arXiv:2110.06591, 2021 - arxiv.org

… Several metric spaces appearing in mathematics can be obtained in this way, including

Wasserstein spaces, as we show in Section 4. We … As we said above, the main theme of this

work is the idea of lifting, which is common to both category theory and geometry, and therefore …

 All 2 versions 



2021  [PDF] mdpi.com

Novel Intelligent Fault Diagnosis Method for Rolling Bearings Based on Wasserstein Generative Adversarial Network and Convolutional Neural Network under …

H Tang, S Gao, L Wang, X Li, B Li, S Pang - Sensors, 2021 - mdpi.com

… In 2017, Gulrajani and Ahmed designed a new generative adversarial network approach

called the Wasserstein generative … Wasserstein generative adversarial net (WGAN) evaluates

the difference between the real and generated sample distributions by using the Wasserstein …

 All 8 versions 



2021  [PDF] arxiv.org

Learning Domain Invariant Representations by Joint Wasserstein Distance Minimization

L Andéol, Y Kawakami, Y Wada, T Kanamori… - arXiv preprint arXiv …, 2021 - arxiv.org

… Wasserstein distance [43, 55] in the present work, as it characterizes the weak convergence

of measures and displays several advantages as discussed in [2]. We contribute several bounds

relating the Wasserstein … mechanistically lowers the Wasserstein distance between the …

 Related articles All 4 versions 



2021  [PDF] arxiv.org

Learning disentangled representations with the wasserstein autoencoder

B Gaujac, I Feige, D Barber - Joint European Conference on Machine …, 2021 - Springer

… Building on previous successes of penalizing the total correlation in the latent variables, we

propose TCWAE (Total Correlation Wasserstein … We propose two variants using different KL

estimators and analyse in turn the impact of having different ground cost functions and latent …

  Related articles All 3 versions




<——2021———2021———1960——


[PDF] arxiv.org

FlowPool: Pooling Graph Representations with Wasserstein Gradient Flows

E Simou - arXiv preprint arXiv:2112.09990, 2021 - arxiv.org

… , we propose FlowPool as the minimization of the Wasserstein distance between graph

representations in Section 5. In Section 6 we explain our versatile implementation based on

implicit automatic differentiation that can be used with any ground cost. In Section 7 we show that …

 All 4 versions 

[PDF] oup.com

Simulation of broadband ground motions with consistent long-period and short-period components using Wasserstein interpolation of acceleration envelopes

…, A Iwaki, T Maeda, H Fujiwara, Ueda - Geophysical Journal …, 2021 - academic.oup.com

… Practical hybrid approaches for the simulation of broadband ground motions often combine

… enables the introduction of a metric known as the Wasserstein distance, and (2) embed pairs

of … as well as the advantage of the Wasserstein distance as a measure of dissimilarity of the …

 Related articles


Well-posedness for some non-linear SDEs and related PDE on the Wasserstein space

PEC de Raynal, N Frikha - Journal de Mathématiques Pures et Appliquées, 2021 - Elsevier

In this paper, we investigate the well-posedness of the martingale problem associated to

non-linear stochastic differential equations (SDEs) in the sense of McKean-Vlasov under mild

assumptions on the coefficients as well as classical solutions for a class of associated linear …

   All 2 versions


[PDF] snu.ac.kr

A data-driven distributionally robust newsvendor model with a Wasserstein ambiguity set

S Lee, H Kim, I Moon - Journal of the Operational Research Society, 2021 - Taylor & Francis

… In this section, we introduce the definition of the Wasserstein distance and discuss properties

of the Wasserstein ambiguity set in the optimization perspective. We adopt the strong duality

result of data-driven DRO with the Wasserstein distance and related definitions from the …

  Cited by 8 Related articles All 2 versions
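
The newsvendor entry above describes the Wasserstein ambiguity set as a ball of distributions centred at the empirical distribution. For orientation while skimming the DRO entries in this list, the standard data-driven formulation (generic textbook form, not the cited paper's exact notation) is:

```latex
% Generic data-driven Wasserstein DRO problem (illustrative notation):
% \widehat{P}_N is the empirical distribution of N samples, \varepsilon the ball radius,
% h(x,\xi) the cost of decision x under realization \xi.
\min_{x \in X} \;
\sup_{Q \,:\, W_p(Q,\widehat{P}_N) \le \varepsilon} \;
\mathbb{E}_{\xi \sim Q}\!\left[ h(x,\xi) \right]
```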

 

[PDF] arxiv.org

Portfolio optimisation within a Wasserstein ball

SM Pesenti, S Jaimungal - Available at SSRN 3744994, 2021 - papers.ssrn.com

We study the problem of active portfolio management where an investor aims to outperform

a benchmark strategy's risk profile while not deviating too far from it. Specifically, an investor

considers alternative strategies whose terminal wealth lie within a Wasserstein ball …

 Cited by 3 Related articles All 7 versions


[PDF] neurips.cc

Pooling by Sliced-Wasserstein Embedding

N Naderializadeh, J Comer… - Advances in …, 2021 - proceedings.neurips.cc

… In this paper, we mainly use the 2-Wasserstein distance and hereafter, for brevity, we refer to it

as the Wasserstein distance. … sliced-Wasserstein distance is equivalent to the sliced-Wasserstein

distance. Equation (5) is the expected value of the Wasserstein distances between …

Cited by 14 Related articles All 5 versions 
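
The pooling entry above, like several later items in this list, is built on the sliced Wasserstein distance. A minimal Monte Carlo sketch of that quantity for two point clouds is given below, assuming NumPy; X, Y and the quantile grid are illustrative choices, not the cited paper's embedding.

```python
# Hedged sketch of a Monte Carlo sliced Wasserstein distance between point clouds.
import numpy as np

def sliced_wasserstein(X, Y, n_projections=100, p=2, seed=None):
    rng = np.random.default_rng(seed)
    theta = rng.normal(size=(n_projections, X.shape[1]))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)  # random unit directions
    q = np.linspace(0.0, 1.0, 200)
    total = 0.0
    for t in theta:
        # In 1-D the optimal transport map is the monotone rearrangement,
        # so the distance reduces to comparing quantiles of the projections.
        xq = np.quantile(X @ t, q)
        yq = np.quantile(Y @ t, q)
        total += np.mean(np.abs(xq - yq) ** p)
    return (total / n_projections) ** (1.0 / p)
```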

2021


[PDF] arxiv.org

Distributionally robust mean-variance portfolio selection with Wasserstein distances

J Blanchet, L Chen, XY Zhou - Management Science, 2021 - pubsonline.informs.org

… (2016), as we discuss, is that we focus on the order-two Wasserstein distance. This is

important because, as a result of the quadratic nature of the variance objective that we consider,

applying an uncertainty set based on Wasserstein of order one could potentially lead to …

 Cited by 37 Related articles All 4 versions


2021

R-WGAN-based Multi-timescale Enhancement ... - IEEE Xplore
https://ieeexplore.ieee.org › iel7
by X Hao · 2021 — enhancement and cement clinker f-CaO prediction method based on Regression-Wasserstein Generative Adversarial Nets (R-WGAN) model.

 2021

Data Augmentation in Fault Diagnosis Based on the Wasserstein Generative Adversarial Network with Gradient Penalty

https://www.researchgate.net › publication › 332644271_...


Sep 30, 2021 — Therefore, in this paper, Wasserstein generative adversarial network with gradient penalty (WGAN-GP)based data augmentation approaches are ...


2021

Stock price prediction using Generative Adversarial Networks

https://thescipub.com › jcssp.2021.188.196.pdf


Feb 24, 2021 — Network (GAN) with Gated Recurrent Units (GRU) used as a generator that ... and GRU with the basic GAN and Wasserstein GAN.


A data-driven distributionally robust newsvendor model with a Wasserstein ambiguity set

S Lee, H Kim, I Moon - Journal of the Operational Research Society, 2021 - Taylor & Francis

… In this section, we introduce the definition of the Wasserstein distance and discuss properties 

of the Wasserstein ambiguity set in the optimization perspective. We adopt the strong duality 

result of data-driven DRO with the Wasserstein distance and related definitions from the …

Cited by 13 Related articles All 6 versions

<——2021———2021———1970——


[PDF] arxiv.org

Limit Distribution Theory for the Smooth 1-Wasserstein Distance with Applications

R Sadhu, Z Goldfeld, K Kato - arXiv preprint arXiv:2107.13494, 2021 - arxiv.org

… 2-Wasserstein distance with Q = P is known only when d = 1 [44]. The key observation there 

is that when d = 1, the empirical 2-Wasserstein … Empirical convergence under sliced distances 

follows their rates when d = 1 [47], which amounts to n−1/2 for sliced W1 [48, 49]. A sliced …

Cited by 5  All 3 versions

 

A Recommender System Based on Model Regularization Wasserstein Generative Adversarial Network*

Q Wang, Q Huang, K Ma… - 2021 IEEE International …, 2021 - ieeexplore.ieee.org

… proposes a Model Regularization Wasserstein GAN(MRWGAN) to extract the distribution 

of user’s preferences. Its main ideas are two-fold: a) adopting an auto-encoder to implement 

the generator model of GAN; b) proposing a model-regularized Wasserstein distance as an …

 Related articles All 2 versions

[PDF] arxiv.org

Approximation algorithms for 1-Wasserstein distance between persistence diagrams

S Chen, Y Wang - arXiv preprint arXiv:2104.07710, 2021 - arxiv.org

… Specifically: In Section 3, we show how to modify the algorithms of [15] and [1] to approximate 

the 1-Wasserstein distances between persistence diagrams within the same approximation 

factor (Theorems 7 and 10). Note that in the literature (eg, [18]), it is known that d per …

 Cited by 5 Related articles All 8 versions


[HTML] aimsciences.org

[HTML] RWRM: Residual Wasserstein regularization model for image restoration

R He, X Feng, X Zhu, H Huang… - Inverse Problems & …, 2021 - aimsciences.org

Wasserstein regularization model (RWRM), in which a residual histogram constraint is subtly 

embedded into a type of variational minimization problems. Specifically, utilizing the Wasserstein 

… Furthermore, the RWRM unifies the residual Wasserstein regularization and image …

 Related articles All 2 versions

Zbl 07454686

Minimizing Wasserstein-1 Distance by Quantile Regression for GANs Model

Y Chen, X Hou, Y Liu - Chinese Conference on Pattern Recognition and …, 2021 - Springer

… This paper starts from the problem of training instability caused by incomplete optimization 

of Wasserstein-1 distance in Wasserstein GAN model. Then we find a new way to minimize 

the Wasserstein-1 distance in the GANs model by extending the Quantile Regression …

 Related articles All 2 versions
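
The quantile-regression entry above rests on the 1-D identity that the Wasserstein-1 distance equals the integrated absolute difference of quantile functions (equivalently, of CDFs). A small numerical illustration of that identity, assuming SciPy is available (scipy.stats.wasserstein_distance computes the 1-D W1):

```python
# Hedged illustration: W1 in 1-D via quantile functions, checked against SciPy.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
a = rng.normal(loc=0.0, scale=1.0, size=5000)
b = rng.normal(loc=1.0, scale=1.0, size=5000)

q = np.linspace(0.0, 1.0, 1001)
w1_quantile = np.mean(np.abs(np.quantile(a, q) - np.quantile(b, q)))

# Both estimates are close to the true value 1.0 (a pure shift by 1).
print(wasserstein_distance(a, b), w1_quantile)
```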


2021

 

[PDF] aaai.org

[PDF] Fast PCA in 1-D Wasserstein Spaces via B-splines Representation and Metric Projection

M Pegoraro, M Beraha - Proceedings of the AAAI Conference on Artificial …, 2021 - aaai.org

… on the real line, using the Wasserstein geometry. We present a novel representation of 

the 2-Wasserstein space, based on a well known … We propose a novel definition of Principal 

Component Analysis in the Wasserstein space that, when used in combination with the B-spline …

Cited by 1 Related articles All 3 versions

A Distributed Robust Optimization Method with MC Dropout and Wasserstein Ambiguity Set Applied in Day-head Dispatch of Microgird

X Zhou, S Sun, S Yang, K Gong… - 2021 IEEE 4th …, 2021 - ieeexplore.ieee.org

… In this paper, the method based on MC dropout and Wasserstein distance is used to 

construct the ambiguity set, which is defined as a ball in the probability distribution space. It 

contains all distributions close to the true distribution or the most likely distribution in terms of …

A Distributed Robust Optimization Method with MC Dropout and Wasserstein Ambiguity Set Applied in Day-head Dispatch of Microgird

X Zhou, S Sun, S Yang, K Gong… - 2021 IEEE 4th …, 2021 - ieeexplore.ieee.org

Day-ahead load forecasting is the key part in day-ahead scheduling of power system.

Considering the uncertainty of load forecasting can improve the robustness of the system and

reduce the risk cost. This paper proposes a distributed robust optimization (DRO) method for …


[An unsupervised unimodal registration method based on Wasserstein Gan].

By: Chen, Y; Wan, H; Zou, M

Nan fang yi ke da xue xue bao = Journal of Southern Medical University  Volume: ‏ 41   Issue: ‏ 9   Pages: ‏ 1366-1373   Article Number: 1673-4254(2021)41:9<1366:JYWGDW>2.0.TX;2-1   Published: ‏ 2021-Aug-31

Related articles All 3 versions 

2021 patent

Process industry soft measurement data supplementing method based on self-supervised variational auto-encoder Wasserstein generative adversarial network (SVAE-WGAN) in industrial field, involves calculating output distribution of encoder network

Patent Number: CN113505477-A

Patent Assignee: UNIV NORTHWEST NORMAL

Inventor(s): XU J; ZHANG Q; LIU Y; et al.

CN10725654 29 Jun 2021   


2021 patent

Wind power output power prediction method and wasserstein generative adversarial networks (WGAN) network, has filling missing value of wind power data and abnormal value, normalizing data set as input data of prediction model, and outputting prediction result after training test of prediction mode

Patent Number: CN113298297-A

Patent Assignee: UNIV INNER MONGOLIA TECHNOLOGY

Inventor(s): WANG Y; WU Y; XU H; et al.

CN10503783 10 May 2021 

<——2021———2021———1980—— 


2021 patent

Robot motion planning method based on graph Walssetein auto-encoding network involves constructing graph Wasserstein auto-encoding network, performing non-uniform sampling distribution characterization learning process, and planning motion

Patent Number: CN113276119-A

Patent Assignee: TSINGHUA SHENZHEN INT GRADUATE SCHOOL

Inventor(s): XIA C; LIANG B; WANG X; et al.

CN10571993 25 May 2021 

Robot motion planning method based on graph Walssetein auto-encoding network involves constructing graph Wasserstein auto-encoding network, performing non-uniform sampling distribution characterization learning process, and planning motion

CN113276119-A

Inventor(s) XIA CLIANG B; (...); MAI S

Assignee(s) TSINGHUA SHENZHEN INT GRADUATE SCHOOL

Derwent Primary Accession Number 

Web of Sci all databases


2021 patent

Method for training low-dose computed tomography (CT) image denoising model, involves minimizing loss of generator in wasserstein divergence generative adversarial network (WGAN)-div generation countermeasure network initial model, and maximizing loss of discriminator

Patent Number: CN113205461-A

Patent Assignee: SHANGHAI HUIHU INFORMATION TECHNOLOGY CO LTD

Inventor(s): LI S; LI Y; PAN J.

CN10360991 02 Apr 2021 

Group target track starting method, involves determining wave gate of sub-group according to equivalent measurement, calculating Wasserstein distance between sub-group and backup sub-group, and taking candidate sub-group as track start

Patent Number: CN113126082-A

Patent Assignee: UNIV SUN YAT-SEN

Inventor(s): XU S; LEI X; ZHU N; et al.

CN10347426 31 Mar 2021 


2021

CDC-Wasserstein generated adversarial network ... - NASA/ADS

https://ui.adsabs.harvard.edu › abs › abstract


An Optimal Transport Analysis on Generalization in Deep ...

https://ieeexplore.ieee.org › document


by J Zhang · 2021 — ... obscure in deep learning--why DNN models can generalize well, ... cost: the expected Wasserstein distance between the output hypothesis ...


2021


WGAN-DNN:一种条件Wasserstein生成对抗网络入侵检测方法

http://kjgcdx.cnjournals.com › reader


by He Jiaxing (贺佳星) · 2021 — To address the low detection accuracy of existing machine-learning-based intrusion detection systems on class-imbalanced data, an intrusion detection method (CWGAN-DNN) based on a conditional Wasserstein generative adversarial network (CWGAN) and a deep neural network (DNN) is proposed.

[Chinese: WGAN-DNN: A Conditional Wasserstein Generative Adversarial Network Intrusion Detection Method]


 2021

Wasserstein Dependency Measure for Representation Learning
https://arxiv.org › cs
by S Ozair · 2019 · Cited by 51 — In this work, we empirically demonstrate that mutual information-based representation learning approaches do fail to learn complete ...

2021

Domain adaptation for robust workload level ... - PubMed

https://pubmed.ncbi.nlm.nih.gov › ...


by B Lyu · 2021 · Cited by 4 — Domain adaptation has potential for session-by-session and subject-by-subject alignment of mental workload by using fNIRS data.



Application of Wasserstein attraction flows for optimal transport in network systems
Authors: Universitat Politècnica de Catalunya Departament d'Enginyeria de Sistemes, Automàtica i Informàtica Industrial (Contributor), Universitat Politècnica de Catalunya SAC - Sistemes Avançats de Control (Contributor), Arqué Pérez, Ferran (Creator), Uribe, César (Creator), Ocampo-Martínez, Carlos (Creator)
Summary: © 2021 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works
Downloadable Archival Material, 2021
English
Publisher:Institute of Electrical and Electronics Engineers (IEEE), 2021

2021  May  7

NITheCS on Twitter: "CoE MaSS Colloquium Prof Rocco ..."
https://twitter.com › nithecs › status
NITheCS @NITheCS: CoE MaSS Colloquium Prof Rocco Duvenhage (UP) "Noncommutative Wasserstein metrics" 7 May 10h30 ...

<——2021———2021———1990—— 


[PDF] virginia.edu

Unsupervised Graph Alignment with Wasserstein Distance Discriminator

J Gao, X Huang, J Li - Proceedings of the 27th ACM SIGKDD Conference …, 2021 - dl.acm.org

… Then we prove that in the embedding space, obtaining optimal alignment results is 

equivalent to minimizing the Wasserstein distance between embeddings of nodes from different 

graphs. Towards this, we propose a novel Wasserstein distance discriminator to identify …

 Cited by 5 Related articles All 3 versions


A data-driven scheduling model of virtual power plant using Wasserstein distributionally robust optimization

H Liu, J Qiu, J Zhao - International Journal of Electrical Power & Energy …, 2021 - Elsevier

Distributed energy resources (DER) can be efficiently aggregated by aggregators to sell 

excessive electricity to spot market in the form of Virtual Power Plant (VPP). The aggregator 

schedules DER within VPP to participate in day-ahead market for maximizing its profits while …

All 2 versions

2021

Network Malicious Traffic Identification Method Based On WGAN Category Balancing

A Wang, Y Ding - 2021 IEEE International Conference on …, 2021 - ieeexplore.ieee.org

… in using deep learning model for traffic recognition tasks, a method of using Wasserstein 

Generative Adversarial Network (WGAN) to generate minority samples based on the image 

of the original traffic data packets is proposed to achieve a small number of data categories …

All 3 versions


2021  

Differentially Privacy-Preserving Federated Learning Using Wasserstein Generative Adversarial Network

Y Wan, Y Qu, L Gao, Y Xiang - 2021 IEEE Symposium on …, 2021 - ieeexplore.ieee.org

… Unfortunately, the private data may be reconstructed by malicious participants by exploiting 

the context of model parameters in FL. This poses further challenges to privacy protection. 

To address this issue, we propose to integrate Wasserstein Generative Adversarial Network (…

Cited by 4 Related articles All 3 versions

2021  

Synthesis of Adversarial DDoS Attacks Using Wasserstein Generative Adversarial Networks with Gradient Penalty

CS Shieh, TT Nguyen, WW Lin… - 2021 6th …, 2021 - ieeexplore.ieee.org

… We believe that GAN is also capable of the generation of malicious but legitimate-looking 

traffic and therefore confuses the DDoS detection … Wasserstein GAN (WGAN) [15] introduces 

the Wasserstein distance to solve the original GAN's gradient vanishing problem. The WGAN is …

All 2 versions


2021 see 2020 symposium, arXiv  

Wasserstein stability for persistence diagrams - Mathematical ...

https://mathinstitutes.org › videos

May 10, 2021 — In this talk I will discuss new stability results with respect to the p-Wasserstein distance between persistence diagrams. The main result is ...

[CITATION] Wasserstein stability for persistence diagrams. CoRR

P Skraba, K Turner - arXiv preprint arXiv:2006.16824, 2021

 Cited by 2 Related articles
MSRI-Clay I workshop

2021

WDA: An Improved Wasserstein Distance-Based Transfer ...

https://www.mdpi.com › ...


by Z Zhu · 2021 · Cited by 2 — Author to whom correspondence should be addressed. Academic Editors: Kim Phuc Tran, Athanasios Rakitzis and Khanh T. P. Nguyen. Sensors 2021, 21(13), 4394; ...

Cited by 4 Related articles All 10 versions

2021

Wasserstein joint chance constraints in humanitarian logistics

https://arxiv.org › pdf

by Z Wang · 2021 — The key of the post-disaster humanitarian logistics (PD-HL) is to build a good facility location and capacity …

  

2021

Run-Sort-ReRun: Escaping Batch Size Limitations in Sliced Wasserstein Generative Models

https://proceedings.mlr.press › ...

by J Lezama · 2021 — In this paper, we build upon recent progress in sliced Wasserstein distances, a family of differentiable metrics for distribution discrepancy based on the ...


2021 video

Run-Sort-ReRun: Escaping Batch Size Limitations in Sliced Wasserstein Generative Models

CrossMind.ai ·

Cited by 2 All 2 versions
<——2021———2021———2000——


 

Minimum Wasserstein Distance Estimator under Finite ... - arXiv
https://arxiv.org › stat
by Q Zhang · 2021 — We hence investigate feasible alternatives to MLE such as minimum distance estimators. Recently, the Wasserstein distance has drawn ...

 
2021

Wasserstein distance error bounds for the ... - Project Euclid

https://projecteuclid.org › journals › issue-2 › 21-EJS1920


by A Anastasiou · 2021 · Cited by 4 — We obtain explicit p-Wasserstein distance error bounds between the distribution of the multi-parameter MLE and the multivariate normal distribution.


2021

Wasserstein distance error bounds for the ... - Research Explorer

https://www.research.manchester.ac.uk › ... › Publications ... › Publications

by A Anastasiou · 2021 · Cited by 4 — We obtain explicit p-Wasserstein distance error bounds between the distribution of the multi-parameter MLE and the multivariate normal ...


2021

An Improved Mixture Density Network via Wasserstein ...

https://ieeexplore.ieee.org › document

by L Yang · 2021 — To address drawbacks of the traditional maximum likelihood estimation (MLE) on training the mixture density network, a Wasserstein distance ...


 2021

Closed form of wasserstein distance between deterministic ...

https://stats.stackexchange.com › questions › closed-for...


Feb 16, 2021 — Let $U_t = k e^{-gt}$, where $k, g$ are factors. Let $L_t \sim N\!\left(k e^{-gt},\, a^2 g\,(1-e^{-2gt})\right)$. Now I want to calculate the Wasserstein distance $W(U_t, L_t)$.
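
The question above has a short closed-form answer that is worth recording next to the entry (a standard fact about couplings with a point mass, not taken from the thread itself):

```latex
% For a point mass \delta_c and any law \mu with mean m and variance \sigma^2, the only
% coupling of \delta_c and \mu is the product coupling, so
%     W_p(\delta_c,\mu)^p = \int |x-c|^p \, d\mu(x),
% and in particular W_2(\delta_c,\mu) = \sqrt{(c-m)^2 + \sigma^2}.
% Here U_t is the point mass at c = k e^{-gt}, which is also the mean of L_t, hence
W_2(U_t,L_t) \;=\; \sqrt{\operatorname{Var}(L_t)}
% i.e. simply the standard deviation of L_t, whatever the intended variance expression is.
```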


2021


Pooling by Sliced-Wasserstein Embedding - NeurIPS 2021

https://neurips.cc › 2021 › ScheduleMultitrack

Hoffmann received his PhD in Robotics and Machine Learning in 2004 for work carried out at the Max Planck Institute for Human Cognitive and Brain Sciences in ...

Pooling by Sliced-Wasserstein Embedding | OpenReview

https://openreview.net › forum


by N Naderializadeh · 2021 — This paper is very well written and illustrated. The idea of using the 1D Monge map as an embedding output to preserve the GSW distances in the output layer is ...
Cited by 4 Related articles All 3 versions

2021 see 2022

Wasserstein Convergence Rate for Empirical Measures on ...

http://cam.tju.edu.cn › research › downAchiev


Nov 24, 2021 — Measures on Noncompact Manifolds ... Keywords: Eempirical measure, diffusion process, Wasserstein distance, Riemannian mani-.

Wasserstein Convergence Rate for Empirical Measures on ...

http://cam.tju.edu.cn › research › downAchiev


Nov 24, 2021 — Abstract. Let Xt be the (reflecting) diffusion process generated by L := Δ + ∇V on a complete connected Riemannian manifold M possibly with ...


2021

Trajectories from Distribution-Valued Functional Curves

https://www.researchgate.net › ... › Framework

Request PDF | Trajectories from Distribution-Valued Functional Curves: A Unified Wasserstein Framework | Temporal changes in medical images are often ...


2021 see 2022

https://www.sciencedirect.com › science › article › abs › pii

2D Wasserstein loss for robust facial landmark detection

by Y Yan · 2021 ·  — In this paper, we propose a novel method for robust facial landmark detection, using a loss function based on the 2D Wasserstein distance combined with a new ...

 Related articles All 17 versions

2021

A Unified Wasserstein Distributional Robustness Framework for Adversarial Training

A Unified Formulation for the Bures-Wasserstein and Log ...

https://link.springer.com › content › pdf


by HQ Minh · 2019 · Cited by 9 — The Alpha Procrustes distances provide a unified formulation encompassing both the Bures-Wasserstein and Log-. Euclidean distances between SPD matrices. This ...

9 pages

<——2021———2021———2010——


2021

KL divergence and Wasserstein distance - Cross Validated

https://stats.stackexchange.com › questions › kl-diverge...


Feb 2, 2021 — ... property of being completely agnostic to the metric of the underlying data distribution, and invariant to any invertible transformation.


2021

ON THE GENERALIZATION OF WASSERSTEIN ROBUST ...

https://openreview.net › forum


Nov 22, 2021 — To address this, we propose a Wasserstein distributionally robust optimization ... Generalizing the concepts of Agnostic Federated Learning, ...



drift ghost xl怎么连接手机 - CSDN [Chinese: how to connect a Drift Ghost XL to a phone - CSDN]

https://www.csdn.net › tags

CLN2INV: Learning Loop Invariants with Continuous Logic Networks ... minimization of the entropy regularized Wasserstein distance between representations.



On asymptotics for Vaserstein coupling of ... - ResearchGate

https://www.researchgate.net › ... › Markov Chains


Oct 7, 2021 — We prove that strong ergodicity of a Markov process is linked with a spectral radius of a certain “associated” semigroup operator, although, ..

 

 

Sinkhorn Collaborative Filtering - Xiucheng Notes

https://xiucheng.org › pdfs › www21-sinkhorncf


by X Li · 2021 — … recommendation methods, the latent factor collaborative filtering models ... Mover distance or more generally 1-Wasserstein distance W1(·, ·)
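
The Sinkhorn collaborative-filtering entry above relies on entropy-regularized optimal transport. For orientation, the textbook Sinkhorn iteration for discrete histograms is sketched below (illustrative names only; this is not that paper's implementation):

```python
# Hedged sketch of entropy-regularized OT via Sinkhorn iterations.
import numpy as np

def sinkhorn(a, b, C, reg=0.01, n_iters=500):
    """a, b: histograms summing to 1; C: cost matrix; returns the transport plan."""
    K = np.exp(-C / reg)            # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)           # rescale to match the column marginal b
        u = a / (K @ v)             # rescale to match the row marginal a
    return u[:, None] * K * v[None, :]

# Example: two histograms on a 1-D grid with squared-distance cost.
x = np.linspace(0.0, 1.0, 50)
C = (x[:, None] - x[None, :]) ** 2
a = np.exp(-(x - 0.2) ** 2 / 0.01); a /= a.sum()
b = np.exp(-(x - 0.7) ** 2 / 0.02); b /= b.sum()
P = sinkhorn(a, b, C)
print((P * C).sum())                # regularized transport cost, roughly (0.7 - 0.2)**2
```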

Bismut-Elworthy Inequality for Wasserstein Diffusion on Circles - X-MOL


   

2021


圆上Wasserstein 扩散的Bismut-Elworthy 不等式 - X-MOL

https://www.x-mol.com › paper


Oct 13, 2021 — We introduce in this paper a strategy to prove gradient estimates for some infinite-dimensional diffusions on \(L_2\)-Wasserstein spaces.

[Chinese  Bismut-Elworthy Inequality for Wasserstein Diffusion on Circles - X-MOL]


2021

Wasserstein Space Latest Research Papers | ScienceGate

https://www.sciencegate.app › keywords


A Bismut–Elworthy inequality for a Wasserstein diffusion on the circle · Stochastic Partial Differential Equations Analysis and Computations 


[HTML] springer.com

[HTML] A Bismut–Elworthy inequality for a Wasserstein diffusion on the circle

V Marx - Stochastics and Partial Differential Equations: Analysis …, 2021 - Springer

… We introduce in this paper a strategy to prove gradient estimates for some infinite-dimensional 

diffusions on \(L_2\)-Wasserstein spaces. For a specific example of a diffusion on the \(L_2\)-Wasserstein 

space of the torus, we get a Bismut-Elworthy-Li formula up to a remainder …

  Related articles All 9 versions


Yingyun Sun (0000-0002-7516-753X) - ORCID

https://orcid.org › ...


Sep 7, 2021 — An Optimal Scenario Reduction Method Based on Wasserstein Distance and Validity Index,一种基于Wasserstein距离及有效性指标的最优场景约简方法.

计及不确定性的综合能源系统容量规划方法 [Chinese: Capacity planning method for integrated energy systems considering uncertainty]

GMT-WGAN:一种用于地面移动目标分类的对抗样本扩展方法

https://www.x-mol.com › paper › adv


Dec 28, 2021 — In the field of target classification, detecting a ground moving target that ... GMT-WGAN: An Adversarial Sample Expansion Method for Ground ...

2021  Cover Image

GMT-WGAN: An Adversarial Sample Expansion Method for Ground Moving...

by Yao, Xin; Shi, Xiaoran; Li, Yaxin ; More...

Remote sensing (Basel, Switzerland), 12/2021, Volume 14, Issue 1

In the field of target classification, detecting a ground moving target that is easily covered in clutter has been a challenge. In addition, traditional...

ArticleView Article PDF

Journal Article Full Text Online

View Complete Issue Browse Now

Cited by 1 Related articles All 6 versions

 <——2021———2021———2020—

    

2021 see 2022

Dynamic Topological Data Analysis for Brain Networks via Wasserstein Graph...

by Chung, Moo K; Huang, Shih-Gu; Carroll, Ian C ; More...

12/2021

We present the novel Wasserstein graph clustering for dynamically changing graphs. The Wasserstein clustering penalizes the topological discrepancy between...

Journal Article Full Text Online


WATCH: Wasserstein Change Point Detection for High-Dimensional Time Series Data

K Faber, R Corizzo, B Sniezynski… - 2021 IEEE …, 2021 - ieeexplore.ieee.org

… In this section we describe WATCH, our novel change point detection approach based on

the Wasserstein distance. We divide the discussion in two stages: in the first subsection we

introduce the Wasserstein distance discussing its potential for the task at hand; in the second …

All 4 versions
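
The WATCH entry above applies the Wasserstein distance to change point detection in time series. A generic sliding-window illustration of that idea (not the WATCH algorithm itself), assuming SciPy's 1-D Wasserstein distance:

```python
# Hedged sketch: score each time index by the Wasserstein distance between the
# empirical distributions of the two adjacent windows; peaks suggest change points.
import numpy as np
from scipy.stats import wasserstein_distance

def change_scores(series, window=100):
    scores = []
    for t in range(window, len(series) - window):
        scores.append(wasserstein_distance(series[t - window:t], series[t:t + window]))
    return np.array(scores)

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 1, 500), rng.normal(3, 1, 500)])  # shift at index 500
s = change_scores(x)
print(100 + int(np.argmax(s)))  # peaks near the true change point (index 500)
```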


[PDF] arxiv.org

On the Wasserstein Distance Between k-Step Probability Measures on Finite Graphs

S Benjamin, A Mantri, Q Perian - arXiv preprint arXiv:2110.10363, 2021 - arxiv.org

… the k-step transition probability measures of X and Y . In this paper, we study the Wasserstein 

distance between µk and νk for general k. We consider the sequence formed by the Wasserstein 

distance at odd values of k and the sequence formed by the Wasserstein distance at …

All 2 versions


 2021 youtube

Scaling Wasserstein Distances to High Dimensions via Smoothing

854 views

Feb 21, 2021


2021 PDF

Accelerated WGAN Update Strategy With ... - CVF Open Access

https://openaccess.thecvf.com › content › papers

by X Ouyang · 2021 · Cited by 3 — The strategy we propose for balancing the training of the generator and discriminator is based on the discriminator and generator loss change ratios (rd and ...

10 pages

Accelerated WGAN update strategy with loss change rate balancing

Ouyang, X; Chen, Y and Agam, G

IEEE Winter Conference on Applications of Computer Vision (WACV)

2021 | 2021 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION WACV 2021 , pp.2545-2554

Optimizing the discriminator in Generative Adversarial Networks (GANs) to completion in the inner training loop is computationally prohibitive, and on finite datasets would result in overfitting. To address this, a common update strategy is to alternate between k optimization steps for the discriminator D and one optimization step for the generator G. This strategy is repeated in various GAN algorithms where k is selected empirically. In this paper, we show that this update strategy is not optimal in terms of accuracy and convergence speed, and propose a new update strategy for networks with Wasserstein GAN (WGAN) group related loss functions (e.g. WGAN, WGAN-GP, Deblur GAN, and Super resolution GAN). The proposed update strategy is based on a loss change ratio comparison of G and D. We demonstrate that the proposed strategy improves both convergence speed and accuracy.

28 References

 Cited by 4 Related articles All 6 versions 
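
The abstract above bases its alternation rule on loss change ratios of the generator and discriminator. A hedged sketch of such a selection rule is given below; the exact ratio definitions and decision rule in the paper may differ, so treat this only as an illustration of the general idea.

```python
# Hedged sketch: decide whether to step the discriminator or the generator next,
# based on how fast each loss has been changing (illustrative rule, not the paper's).
def pick_update(prev_d_loss, d_loss, prev_g_loss, g_loss, eps=1e-8):
    r_d = abs(d_loss - prev_d_loss) / (abs(prev_d_loss) + eps)  # relative change of D loss
    r_g = abs(g_loss - prev_g_loss) / (abs(prev_g_loss) + eps)  # relative change of G loss
    # Step the network whose loss is currently changing more slowly,
    # so neither player runs too far ahead of the other.
    return "discriminator" if r_d < r_g else "generator"
```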


  2021


2021  PDF

Wasserstein Distributionally Robust Inverse Multiobjective ...

https://ojs.aaai.org › AAAI › article › view

by C Dong · 2021 · Cited by 3 — we investigate in this paper the distributionally robust approach for inverse multiobjective optimization. Specifically, we leverage the Wasserstein ...

Wasserstein Distributionally Robust Inverse Multiobjective Optimization

Dong, CS and Zeng, B

35th AAAI Conference on Artificial Intelligence / 33rd Conference on Innovative Applications of Artificial Intelligence / 11th Symposium on Educational Advances in Artificial Intelligence

2021 | THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE 35 , pp.5914-5921

Inverse multiobjective optimization provides a general framework for the unsupervised learning task of inferring parameters of a multiobjective decision making problem (DMP), based on a set of observed decisions from the human expert. However, the performance of this framework relies critically on the availability of an accurate DMP, sufficient decisions of high quality, and a parameter space that contains enough information about the DMP. To hedge against the uncertainties in the hypothetical DMP, the data, and the parameter space, we investigate in this paper the distributionally robust approach for inverse multiobjective optimization. Specifically, we leverage the Wasserstein metric to construct a ball centered at the empirical distribution of these decisions. We then formulate a Wasserstein distributionally robust inverse multiobjective optimization problem (WRO-IMOP) that minimizes a worst-case expected loss function, where the worst case is taken over all distributions in the Wasserstein ball. We show that the excess risk of the WRO-IMOP estimator has a sub-linear convergence rate. Furthermore, we propose the semi-infinite reformulations of the WRO-IMOP and develop a cutting-plane algorithm that converges to an approximate solution in finite iterations. Finally, we demonstrate the effectiveness of our method on both a synthetic multiobjective quadratic program and a real world portfolio optimization problem.

24 References
Cited by 2 Related articles All 8 versions


Generalized spectral clustering via Gromov-Wasserstein learning

S Chowdhury, T Needham - International Conference on …, 2021 - proceedings.mlr.press

We establish a bridge between spectral clustering and Gromov-Wasserstein Learning

(GWL), a recent optimal transport-based approach to graph partitioning. This connection

both explains and improves upon the state-of-the-art performance of GWL. The Gromov-

Wasserstein framework provides probabilistic correspondences between nodes of source

and target graphs via a quadratic programming relaxation of the node matching problem.

Our results utilize and connect the observations that the GW geometric structure remains …

0 Related articles All 3 versions 

Generalized Spectral Clustering via Gromov-Wasserstein Learning

Chowdhury, S and Needham, T

24th International Conference on Artificial Intelligence and Statistics (AISTATS)

2021 | 24TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS (AISTATS) 130 , pp.712-+

We establish a bridge between spectral clustering and Gromov-Wasserstein Learning (GWL), a recent optimal transport-based approach to graph partitioning. This connection both explains and improves upon the state-of-the-art performance of GWL. The Gromov-Wasserstein framework provides probabilistic correspondences between nodes of source and target graphs via a quadratic programming relaxation of the node matching problem. Our results utilize and connect the observations that the GW geometric structure remains valid for any rank-2 tensor, in particular the adjacency, distance, and various kernel matrices on graphs, and that the heat kernel outperforms the adjacency matrix in producing stable and informative node correspondences. Using the heat kernel in the GWL framework provides new multiscale graph comparisons without compromising theoretical guarantees, while immediately yielding improved empirical results. A key insight of the GWL framework toward graph partitioning was to compute GW correspondences from a source graph to a template graph with isolated, self-connected nodes. We show that when comparing against a two-node template graph using the heat kernel at the infinite time limit, the resulting partition agrees with the partition produced by the Fiedler vector. This in turn yields a new insight into the k-cut graph partitioning problem through the lens of optimal transport. Our experiments on a range of real-world networks achieve comparable results to, and in many cases outperform, the state-of-the-art achieved by GWL.


Cited by 19 Related articles All 5 versions
Generalized Spectral Clustering via Gromov-Wasserstein ...

slideslive.com › generalized-spectral-clustering-via-gromo...

… Gromov-Wasserstein Learning (GWL), a recent optimal transport-based approach to graph ...

SlidesLive · 


Wasserstein Barycenter Transport for Acoustic Adaptation

EF Montesuma, FMN Mboula - ICASSP 2021-2021 IEEE …, 2021 - ieeexplore.ieee.org

The recognition of music genre and the discrimination between music and speech are

important components of modern digital music systems. Depending on the acquisition

conditions, such as background environment, these signals may come from different

probability distributions, making the learning problem complicated. In this context, domain

adaptation is a key theory to improve performance. Considering data coming from various

background conditions, the adaptation scenario is called multi-source. This paper proposes …

Related articles

WASSERSTEIN BARYCENTER TRANSPORT FOR ACOUSTIC ADAPTATION

Montesuma, EF and Mboula, FMN

IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)

2021 | 2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021) , pp.3405-3409

The recognition of music genre and the discrimination between music and speech are important components of modern digital music systems. Depending on the acquisition conditions, such as background environment, these signals may come from different probability distributions, making the learning problem complicated. In this context, domain adaptation is a key theory to improve performance. Considering data coming from various background conditions, the adaptation scenario is called multi-source. This paper proposes a multi-source domain adaptation algorithm called Wasserstein Barycenter Transport, which transports the source domains to a target domain by creating an intermediate domain using the Wasserstein barycenter. Our method outperforms other state-of-the-art algorithms, and performs better than classifiers trained with target-only data.


2021 PDF

Learning Graphons via Structured Gromov-Wasserstein ...

https://ojs.aaai.org › AAAI › article › view

by H Xu · 2021 · Cited by 3 — Abstract. We propose a novel and principled method to learn a non-parametric graph model called graphon, which is defined in an infinite-dimensional space ...

Learning Graphons via Structured Gromov-Wasserstein Barycenters

Xu, HT; Luo, DX; (...); Zha, HY

35th AAAI Conference on Artificial Intelligence / 33rd Conference on Innovative Applications of Artificial Intelligence / 11th Symposium on Educational Advances in Artificial Intelligence

2021 | THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE 35 , pp.10505-10513

We propose a novel and principled method to learn a non-parametric graph model called graphon, which is defined in an infinite-dimensional space and represents arbitrary-size graphs. Based on the weak regularity lemma from the theory of graphons, we leverage a step function to approximate a graphon. We show that the cut distance of graphons can be relaxed to the Gromov-Wasserstein distance of their step functions. Accordingly, given a set of graphs generated by an underlying graphon, we learn the corresponding step function as the Gromov-Wasserstein barycenter of the given graphs. Furthermore, we develop several enhancements and extensions of the basic algorithm, e.g., the smoothed Gromov-Wasserstein barycenter for guaranteeing the continuity of the learned graphons and the mixed Gromov-Wasserstein barycenters for learning multiple structured graphons. The proposed approach overcomes drawbacks of prior state-of-the-art methods, and outperforms them on both synthetic and real-world data. The code is available at https://github.com/HongtengXu/SGWB-Graphon.

Cited by 7 Related articles All 6 versions

[PDF] mlr.press

Improved complexity bounds in wasserstein barycenter problem

D Dvinskikh, D Tiapkin - International Conference on …, 2021 - proceedings.mlr.press

In this paper, we focus on computational aspects of the Wasserstein barycenter problem. We

propose two algorithms to compute Wasserstein barycenters of m discrete measures of

size n with accuracy ε. The first algorithm, based on mirror prox with a specific norm,

meets the complexity of celebrated accelerated iterative Bregman projections (IBP), namely

Õ(mn²√n/ε), however, with no limitations in contrast to the (accelerated)

IBP, which is numerically unstable under small regularization parameter. The second …

  Cited by 16 Related articles All 4 versions

Improved Complexity Bounds in Wasserstein Barycenter Problem

Dvinskikh, D and Tiapkin, D

24th International Conference on Artificial Intelligence and Statistics (AISTATS)

2021 | 24TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS (AISTATS) 130

In this paper, we focus on computational aspects of the Wasserstein barycenter problem. We propose two algorithms to compute Wasserstein barycenters of m discrete measures of size n with accuracy ε. The first algorithm, based on mirror prox with a specific norm, meets the complexity of celebrated accelerated iterative Bregman projections (IBP), namely Õ(mn²√n/ε), however, with no limitations in contrast to the (accelerated) IBP, which is numerically unstable under small regularization parameter. The second algorithm, based on area-convexity and dual extrapolation, improves the previously best-known convergence rates for the Wasserstein barycenter problem enjoying Õ(mn²/ε) complexity.

Improved Complexity Bounds in Wasserstein Barycenter ...

Improved Complexity Bounds in Wasserstein Barycenter Problem. Apr 14, 2021. Darina Dvinskikh.

CrossMind.ai · 

Apr 14, 2021

<——2021———2021———2030—


Joint Distribution Adaptation via Wasserstein Adversarial ...

https://ieeexplore.ieee.org › document

by X Wang · 2021 — In this paper, we propose a representation learning approach for domain adaptation, w…

Joint Distribution Adaptation via Wasserstein Adversarial Training

Wang, XL; Zhang, WY; (...); Liu, HK

International Joint Conference on Neural Networks (IJCNN)

2021 | 2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN)

This paper considers the unsupervised domain adaptation problem, in which we want to find a good prediction function on the unlabeled target domain, by utilizing the information provided in the labeled source domain. A common approach to the domain adaptation problem is to learn a representation space where the distributional discrepancy of the source and target domains is small. Existing methods generally tend to match the marginal distributions of the two domains, while the label information in the source domain is not fully exploited. In this paper, we propose a representation learning approach for domain adaptation, which is addressed as JODAWAT. We aim to adapt the joint distributions of the feature-label pairs in the shared representation space for both domains. In particular, we minimize the Wasserstein distance between the source and target domains, while the prediction performance on the source domain is also guaranteed. The proposed approach results in a minimax adversarial training procedure that incorporates a novel split gradient penalty term. A generalization bound on the target domain is provided to reveal the efficacy of representation learning for joint distribution adaptation. We conduct extensive evaluations on JODAWAT, and test its classification accuracy on multiple synthetic and real datasets. The experimental results justify that our proposed method is able to achieve superior performance compared with various domain adaptation methods.


 

Wasserstein k-means with sparse simplex projection

T Fukunaga, H Kasai - 2020 25th International Conference on …, 2021 - ieeexplore.ieee.org

This paper presents a proposal of a faster Wasserstein k-means algorithm for histogram

data by reducing Wasserstein distance computations and exploiting sparse simplex

projection. We shrink data samples, centroids, and the ground cost matrix, which leads to

considerable reduction of the computations used to solve optimal transport problems without

loss of clustering quality. Furthermore, we dynamically reduced the computational

complexity by removing lower-valued data samples and harnessing sparse simplex …

Cited by 10 Related articles All 5 versions

Wasserstein k-means with sparse simplex projection

Fukunaga, T and Kasai, H

25th International Conference on Pattern Recognition (ICPR)

2021 | 2020 25TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR) , pp.1627-1634

This paper presents a proposal of a faster Wasserstein k-means algorithm for histogram data by reducing Wasserstein distance computations and exploiting sparse simplex projection. We shrink data samples, centroids, and the ground cost matrix, which leads to considerable reduction of the computations used to solve optimal transport problems without loss of clustering quality. Furthermore, we dynamically reduced the computational complexity by removing lower-valued data samples and harnessing sparse simplex projection while keeping the degradation of clustering quality lower. We designate this proposed algorithm as sparse simplex projection based Wasserstein k-means, or SSPW k-means. Numerical evaluations conducted with comparison to results obtained using Wasserstein k-means algorithm demonstrate the effectiveness of the proposed SSPW k-means for real-world datasets.
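
For contrast with the accelerated method above, a naive Wasserstein k-means loop for 1-D histograms supported on a common grid can be sketched as follows. This is only an illustrative baseline under simplifying assumptions (hists as an array of histogram weights, a Euclidean centroid update instead of a true barycenter), not the paper's sparse-simplex-projection algorithm.

```python
# Hedged sketch of a naive Wasserstein k-means for 1-D histograms on a shared grid.
import numpy as np
from scipy.stats import wasserstein_distance

def wasserstein_kmeans(hists, grid, k=3, n_iters=20, seed=None):
    """hists: (N, len(grid)) array of nonnegative weights; grid: support points."""
    rng = np.random.default_rng(seed)
    centroids = hists[rng.choice(len(hists), size=k, replace=False)]
    for _ in range(n_iters):
        # Assignment step: nearest centroid in 1-D Wasserstein distance.
        d = np.array([[wasserstein_distance(grid, grid, h, c) for c in centroids]
                      for h in hists])
        labels = d.argmin(axis=1)
        # Update step: plain average of assigned histograms (a simplification;
        # a true Wasserstein barycenter would average quantile functions instead).
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = hists[labels == j].mean(axis=0)
    return labels, centroids
```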


SWIFT: Scalable Wasserstein Factorization for Sparse ...

https://ojs.aaai.org › index.php › AAAI › article › view

by A Afshar · 2021 · Cited by 5 — We introduce SWIFT, which minimizes the Wasserstein distance that measures the distance between the input tensor and that of the reconstruction.

SWIFT: Scalable Wasserstein Factorization for Sparse Nonnegative Tensors

Afshar, A; Yin, KJ; (...); Sun, JM

35th AAAI Conference on Artificial Intelligence / 33rd Conference on Innovative Applications of Artificial Intelligence / 11th Symposium on Educational Advances in Artificial Intelligence

2021 | THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE 35 , pp.6548-6556

Existing tensor factorization methods assume that the input tensor follows some specific distribution (i.e. Poisson, Bernoulli, and Gaussian), and solve the factorization by minimizing some empirical loss functions defined based on the corresponding distribution. However, it suffers from several drawbacks: 1) In reality, the underlying distributions are complicated and unknown, making it infeasible to be approximated by a simple distribution. 2) The correlation across dimensions of the input tensor is not well utilized, leading to sub-optimal performance. Although heuristics were proposed to incorporate such correlation as side information under Gaussian distribution, they can not easily be generalized to other distributions. Thus, a more principled way of utilizing the correlation in tensor factorization models is still an open challenge. Without assuming any explicit distribution, we formulate the tensor factorization as an optimal transport problem with Wasserstein distance, which can handle non-negative inputs.

We introduce SWIFT, which minimizes the Wasserstein distance that measures the distance between the input tensor and that of the reconstruction. In particular, we define the N-th order tensor Wasserstein loss for the widely used tensor CP factorization and derive the optimization algorithm that minimizes it. By leveraging sparsity structure and different equivalent formulations for optimizing computational efficiency, SWIFT is as scalable as other well-known CP algorithms. Using the factor matrices as features, SWIFT achieves up to 9.65% and 11.31% relative improvement over baselines for downstream prediction tasks. Under the noisy conditions, SWIFT achieves up to 15% and 17% relative improvements over the best competitors for the prediction tasks.

Cited by 7 Related articles All 12 versions

  

First-Order Methods for Wasserstein Distributionally Robust MDP

JG Clement, C Kroer - International Conference on Machine …, 2021 - proceedings.mlr.press

Markov decision processes (MDPs) are known to be sensitive to parameter specification.

Distributionally robust MDPs alleviate this issue by allowing for ambiguity sets which

give a set of possible distributions over parameter sets. The goal is to find an optimal policy

with respect to the worst-case parameter distribution. We propose a framework for solving

Distributionally robust MDPs via first-order methods, and instantiate it for several types of

Wasserstein ambiguity sets. By developing efficient proximal updates, our algorithms …

 Cited by 11 Related articles All 6 versions

First-Order Methods for Wasserstein Distributionally Robust MDPs

Grand-Clement, J and Kroer, C

International Conference on Machine Learning (ICML)

2021 | INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139 139

Markov decision processes (MDPs) are known to be sensitive to parameter specification. Distributionally robust MDPs alleviate this issue by allowing for ambiguity sets which give a set of possible distributions over parameter sets. The goal is to find an optimal policy with respect to the worst-case parameter distribution. We propose a framework for solving Distributionally robust MDPs via first-order methods, and instantiate it for several types of Wasserstein ambiguity sets. By developing efficient proximal updates, our algorithms achieve a convergence rate of O(NA^2.5 S^3.5 log(S) log(ε^-1) ε^-1.5) for the number of kernels N in the support of the nominal distribution, states S, and actions A; this rate varies slightly based on the Wasserstein setup. Our dependence on N, A and S is significantly better than existing methods, which have a complexity of O(N^3.5 A^3.5 S^4.5 log^2(ε^-1)). Numerical experiments show that our algorithm is significantly more scalable than state-of-the-art approaches across several domains.

2021

First-Order Methods for Wasserstein Distributionally Robust ...

slideslive.com › firstorder-methods-for-wasserstein-distrib...

5:18

... speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world.

SlidesLive · 

Jul 19, 2021

[PDF] aaai.org

[PDF] Towards Generalized Implementation of Wasserstein Distance in GANs

M Xu, Z Zhou, G Lu, J Tang, W Zhang, Y Yu - Proceedings of the AAAI …, 2021 - aaai.org

Abstract Wasserstein GANs (WGANs), built upon the Kantorovich-Rubinstein (KR) duality of

Wasserstein distance, is one of the most theoretically sound GAN models. However, in

practice it does not always outperform other variants of GANs. This is mostly due to the

imperfect implementation of the Lipschitz condition required by the KR duality. Extensive

work has been done in the community with different implementations of the Lipschitz

constraint, which, however, is still hard to satisfy the restriction perfectly in practice. In this …

Cited by 2 Related articles All 4 versions 


Towards Generalized Implementation of Wasserstein Distance in GANs

Xu, MK

35th AAAI Conference on Artificial Intelligence / 33rd Conference on Innovative Applications of Artificial Intelligence / 11th Symposium on Educational Advances in Artificial Intelligence

2021 | THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE 35 , pp.10514-10522

Wasserstein GANs (WGANs), built upon the Kantorovich-Rubinstein (KR) duality of Wasserstein distance, is one of the most theoretically sound GAN models. However, in practice it does not always outperform other variants of GANs. This is mostly due to the imperfect implementation of the Lipschitz condition required by the KR duality. Extensive work has been done in the community with different implementations of the Lipschitz constraint, which, however, is still hard to satisfy the restriction perfectly in practice. In this paper, we argue that the strong Lipschitz constraint might be unnecessary for optimization. Instead, we take a step back and try to relax the Lipschitz constraint. Theoretically, we first demonstrate a more general dual form of the Wasserstein distance called the Sobolev duality, which relaxes the Lipschitz constraint but still maintains the favorable gradient property of the Wasserstein distance. Moreover, we show that the KR duality is actually a special case of the Sobolev duality. Based on the relaxed duality, we further propose a generalized WGAN training scheme named Sobolev Wasserstein GAN, and empirically demonstrate the improvement over existing methods with extensive experiments.

Cited by 5 Related articles All 6 versions

2021

Visual Transfer For Reinforcement Learning Via Wasserstein ...

https://ojs.aaai.org › index.php › AAAI › article › view

by J Roy · 2021 · Cited by 5 — Roy, J., & Konidaris, G. D. (2021). Visual Transfer For Reinforcement Learning Via Wasserstein Domain Confusion. Proceedings of the AAAI ...

Visual Transfer for Reinforcement Learning via Wasserstein Domain Confusion

Roy, J and Konidaris, G

35th AAAI Conference on Artificial Intelligence / 33rd Conference on Innovative Applications of Artificial Intelligence / 11th Symposium on Educational Advances in Artificial Intelligence

2021 | THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE 35 , pp.9454-9462

We introduce Wasserstein Adversarial Proximal Policy Optimization (WAPPO), a novel algorithm for visual transfer in Reinforcement Learning that explicitly learns to align the distributions of extracted features between a source and target task. WAPPO approximates and minimizes the Wasserstein-1 distance between the distributions of features from source and target domains via a novel Wasserstein Confusion objective. WAPPO outperforms the prior state-of-the-art in visual transfer and successfully transfers policies across Visual Cartpole and both the easy and hard settings of o

Cited by 8 Related articles All 9 versions

2021 PDF

AC-WGAN-GP: Augmenting ECG and GSR Signals using ...

https://ir.cwi.nl › pub

by A Furdui · 2021 — We compare the recognition performance between real and synthetic signals as training data in the task of binary arousal classification.

AC-WGAN-GP: Augmenting ECG and GSR Signals using Conditional Generative Models for Arousal Classification

Furdui, A; Zhang, TY; (...); El Ali, A

ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp) / ACM International Symposium on Wearable Computers (ISWC)

2021 | UBICOMP/ISWC '21 ADJUNCT: PROCEEDINGS OF THE 2021 ACM INTERNATIONAL JOINT CONFERENCE ON PERVASIVE AND UBIQUITOUS COMPUTING AND PROCEEDINGS OF THE 2021 ACM INTERNATIONAL SYMPOSIUM ON WEARABLE COMPUTERS , pp.21-22

Computational recognition of human emotion using Deep Learning techniques requires learning from large collections of data. However, the complex processes involved in collecting and annotating physiological data lead to datasets with small sample sizes. Models trained on such limited data often do not generalize well to real-world settings. To address the problem of data scarcity, we use an Auxiliary Conditioned Wasserstein Generative Adversarial Network with Gradient Penalty (AC-WGAN-GP) to generate synthetic data. We compare the recognition performance between real and synthetic signals as training data in the task of binary arousal classification. Experiments on GSR and ECG signals show that generative data augmentation significantly improves model performance (avg. 16.5%) for binary arousal classification in a subject-independent setting.

Cited by 4 Related articles All 6 versions

2021 PDF

Scalable Computations of Wasserstein Barycenter ... - NSF PAR

https://par.nsf.gov › servlets › purl

by J Fan · 2021 · Cited by 2 — Our method is based on a Kantorovich-type dual characterization of the Wasserstein barycenter, which involves optimization over convex functions, and the ...

Scalable Computations of Wasserstein Barycenter via Input Convex Neural Networks

Fan, JJ; Taghvaei, A and Chen, YX

International Conference on Machine Learning (ICML)

2021 | INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139 139

Wasserstein Barycenter is a principled approach to represent the weighted mean of a given set of probability distributions, utilizing the geometry induced by optimal transport. In this work, we present a novel scalable algorithm to approximate the Wasserstein Barycenters aiming at high-dimensional applications in machine learning. Our proposed algorithm is based on the Kantorovich dual formulation of the Wasserstein-2 distance as well as a recent neural network architecture, input convex neural network, that is known to parametrize convex functions. The distinguishing features of our method are: i) it only requires samples from the marginal distributions; ii) unlike the existing approaches, it represents the Barycenter with a generative model and can thus generate infinite samples from the barycenter without querying the marginal distributions; iii) it works similar to Generative Adversarial Model in one marginal case. We demonstrate the efficacy of our algorithm by comparing it with the state-of-art methods in multiple experiments.(1)


Wasserstein Distance-Based Domain Adaptation and Its Application to Road Segmentation

S Kono, T Ueda, E Arriaga-Varela… - … Joint Conference on …, 2021 - ieeexplore.ieee.org

Domain adaptation is used in applying a classifier acquired in one data domain to another

data domain. A classifier obtained by supervised training with labeled data in an original

source domain can also be used for classification in a target domain in which the labeled

data are difficult to collect with the help of domain adaptation. The most recently proposed

domain adaptation methods focus on data distribution in the feature space of a classifier and

bring the data distribution of both domains closer through learning. The present work is …

Wasserstein Distance-Based Domain Adaptation and Its Application to Road Segmentation

Kono, S; Ueda, T; (...); Nishikawa, I

International Joint Conference on Neural Networks (IJCNN)

2021 | 2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN)

Domain adaptation is used in applying a classifier acquired in one data domain to another data domain. A classifier obtained by supervised training with labeled data in an original source domain can also be used for classification in a target domain in which the labeled data are difficult to collect with the help of domain adaptation. The most recently proposed domain adaptation methods focus on data distribution in the feature space of a classifier and bring the data distribution of both domains closer through learning. The present work is based on an existing unsupervised domain adaptation method, in which both distributions become closer through adversarial training between a target data encoder to the feature space and a domain discriminator. We propose to use the Wasserstein distance to measure the distance between two distributions, rather than the well-known Jensen-Shannon divergence. Wasserstein distance, or earth mover's distance, measures the length of the shortest path among all possible pairs between a corresponding pair of variables in two distributions. Therefore, minimization of the distance leads to overlap of the corresponding data pair in source and target domain. Thus, the classifier trained in the source domain becomes also effective in the target domain. The proposed method using Wasserstein distance shows higher accuracies in the target domains compared with an original distance in computer experiments on semantic segmentation of map images.

Cited by 1 Related articles
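
The contrast this entry draws between the Jensen-Shannon divergence and the Wasserstein distance can be checked with a small, self-contained sketch (a SciPy-based illustration, not code from the paper; the two feature histograms are invented stand-ins for source- and target-domain features):

import numpy as np
from scipy.stats import wasserstein_distance
from scipy.spatial.distance import jensenshannon

# Two point masses on a 1-D feature axis with disjoint supports.
support = np.arange(10.0)
p = np.zeros(10); p[2] = 1.0   # "source" features concentrated at x = 2
q = np.zeros(10); q[8] = 1.0   # "target" features concentrated at x = 8

# Jensen-Shannon saturates at sqrt(ln 2) ~= 0.833 as soon as the supports are
# disjoint, so it cannot distinguish "slightly apart" from "very far apart".
print(jensenshannon(p, q))

# The Wasserstein (earth mover's) distance keeps growing with the separation,
# which is the property the adversarial domain-adaptation training exploits.
print(wasserstein_distance(support, support, u_weights=p, v_weights=q))  # 6.0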

2021 see 2020

Symmetric Skip Connection Wasserstein GAN for High-resolution Facial Image Inpainting

Jam, J; Kendrick, C; (...); Yap, M

16th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP) / 16th International Conference on Computer Vision Theory and Applications (VISAPP)

2021 | VISAPP: PROCEEDINGS OF THE 16TH INTERNATIONAL JOINT CONFERENCE ON COMPUTER VISION, IMAGING AND COMPUTER GRAPHICS THEORY AND APPLICATIONS - VOL. 4: VISAPP , pp.35-44

The state-of-the-art facial image inpainting methods achieved promising results but face realism preservation remains a challenge. This is due to limitations such as; failures in preserving edges and blurry artefacts. To overcome these limitations, we propose a Symmetric Skip Connection Wasserstein Generative Adversarial Network (S-WGAN) for high-resolution facial image inpainting. The architecture is an encoder-decoder with convolutional blocks, linked by skip connections. The encoder is a feature extractor that captures data abstractions of an input image to learn an end-to-end mapping from an input (binary masked image) to the ground-truth. The decoder uses learned abstractions to reconstruct the image. With skip connections, S-WGAN transfers image details to the decoder. Additionally, we propose a Wasserstein-Perceptual loss function to preserve colour and maintain realism on a reconstructed image. We evaluate our method and the state-of-the-art methods on CelebA-HQ dataset. Our results show S-WGAN produces sharper and more realistic images when visually compared with other methods. The quantitative measures show our proposed S-WGAN achieves the best Structure Similarity Index Measure (SSIM) of 0.94.

<——2021———2021———20440——


[HTML] mdpi.com

Geometric Characteristics of the Wasserstein Metric on SPD (n) and Its Applications on Data Processing

Y Luo, S Zhang, Y Cao, H Sun - Entropy, 2021 - mdpi.com

… In this paper, by involving the Wasserstein metric on SPD(n), we obtain computationally …

… Laplacian, we present the connection between Wasserstein sectional curvature and edges. …

… In Section 3, we describe the Wasserstein geometry of SPD(n), including the geodesic … In particular, we prove the geodesic …

Cited by 1 Related articles All 9 versions

2021  [PDF] arxiv.org

Uncertainty quantification in a mechanical submodel driven by a Wasserstein-GAN

H Boukraichi, N Akkari, F Casenave… - arXiv preprint arXiv …, 2021 - arxiv.org

… Generative Adversarial Networks (GANs) are suited for such applications, where the 

Wasserstein-GAN with gradient penalty variant offers improved convergence results for our 

problem. The objective of our approach is to train a GAN on data from a finite element method code (…

Related articles All 3 versions


[PDF] arxiv.org

The existence of minimizers for an isoperimetric problem with Wasserstein penalty term in unbounded domains

Q Xia, B Zhou - Advances in Calculus of Variations, 2021 - degruyter.com

… where P ⁢ ( E ; Ω ) denotes the relative perimeter of 𝐸 in Ω, W p denotes the 𝑝-Wasserstein 

problem turns to be an isoperimetric problem with the Wasserstein penalty term. For instance… 

can be regarded as a gradient flow under the Wasserstein metric (see the review paper [20]). …

Cited by 4 Related articles All 5 versions


 

Implementation of a WGAN-GP for Human Pose Transfer using a 3-channel pose representation

T Das, S Sutradhar, M Das… - … on Innovation and …, 2021 - ieeexplore.ieee.org

… loss used in WGANs is derived from the Wasserstein metric, … the WGAN and Ir be the real

image (ground truth) from the dataset. Then, if C denote the critic of the WGAN, the Wasserstein …

 Related articles

 

An Intrusion Detection Method Based on WGAN and Deep Learning

by Han, Linfeng; Fang, Xu; Liu, Yongguang ; More...

2021 IEEE International Conference on Signal Processing, Communications and Computing (ICSPCC), 08/2021

Using WGAN and deep learning methods, a multiclass network intrusion detection model is proposed. The model uses the WGAN network to generate fake samples of...

Conference Proceeding 

Full Text Online

 Cited by 1 Related articles

    

2021


Anti-confrontational Domain Data Generation Based on Improved WGAN

by Luo, Haibo; Chen, Xingchi; Dong, Jianhu

2021 International Symposium on Computer Technology and Information Science (ISCTIS), 06/2021

The Domain Generate Algorithm (DGA) is used by a large number of botnets to evade detection. At present, the mainstream machine learning detection technology...

Conference Proceeding 

Full Text Online

  Related articles All 2 versions


Face Image Generation for Illustration by WGAN-GP Using Landmark Information

by Takahashi, Miho; Watanabe, Hiroshi

2021 IEEE 10th Global Conference on Consumer Electronics (GCCE), 10/2021

With the spread of social networking services, face images for illustration are being used in a variety of situations. Attempts have been made to create...

Conference Proceeding 

Full Text Online

Related articles All 2 versions

Implementation of a WGAN-GP for Human Pose Transfer using a 3-channel pose representation

by Das, Tamal; Sutradhar, Saurav; Das, Mrinmoy ; More...

2021 International Conference on Innovation and Intelligence for Informatics, Computing, and Technologies (3ICT), 09/2021

The computational problem of Human Pose Transfer (HPT) is addressed in this paper. HPT in recent days have become an emerging research topic which can be used...

Conference Proceeding 

Full Text Online

 

 

Conference Paper Citation/Abstract

Underwater Object Detection of an UVMS Based on WGAN

Chen, Wei.

The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).

     

Image Denoising Using an Improved Generative Adversarial Network with Wasserstein Distance

by Wang, Qian; Liu, Han; Xie, Guo ; More...

2021 40th Chinese Control Conference (CCC), 07/2021

The image denoising discriminant model has received extensive attention in recent years due to its good denoising performance. In order to solve the problems...

Conference Proceeding Full Text Online

[PDF] researchgate.net

Image Denoising Using an Improved Generative Adversarial Network with Wasserstein Distance

Q Wang, H Liu, G Xie, Y Zhang - 2021 40th Chinese Control …, 2021 - ieeexplore.ieee.org

The image denoising discriminant model has received extensive attention in recent years

due to its good denoising performance. In order to solve the problems of denoising of

traditional generative adversarial networks, which are difficult to train and easy to collapse …

All 3 versions

<——2021———2021———2050——


Solving Wasserstein Robust Two-stage Stochastic Linear Programs via Second-order Conic...

by Wang, Zhuolin; You, Keyou; Song, Shiji ; More...

2021 40th Chinese Control Conference (CCC), 07/2021

This paper proposes a novel data-driven distributionally robust (DR) two-stage linear program over the 1-Wasserstein ball to handle the stochastic uncertainty...

  Cited by 1 Related articles All 2 versions

    

Wasserstein Based EmoGANs

by Khine, Win Shwe Sin; Siritanawan, Prarinya; Kotani, Kazunori

2021 Joint 10th International Conference on Informatics, Electronics & Vision (ICIEV) and 2021 5th International Conference on Imaging, Vision & Pattern Recognition (icIVPR), 08/2021

Nowadays, numerous generative models are powerful and becoming popular for image synthesis because their generated images are more and more similar to the...

Conference Proceeding 

Full Text Online

 

    

One-shot style transfer using Wasserstein Autoencoder

by Nakada, Hidemoto; Asoh, Hideki

2021 Asian Conference on Innovation in Technology (ASIANCON), 08/2021

We propose an image style transfer method based on disentangled representation obtained with Wasser-stein Autoencoder. Style transfer is an area of image...

Conference Proceeding 

Full Text Online

 

          

Fault injection in optical path - detection quality degradation analysis with Wasserstein distance

by Kowalczyk, Pawel; Bugiel, Paulina; Szelest, Marcin ; More...

2021 25th International Conference on Methods and Models in Automation and Robotics (MMAR), 08/2021

The goal of this paper is to present results of analysis of artificially generated disturbances imitating real defects of camera that occurs in the process of...

Conference Proceeding 

Full Text Online

 

    

Speech Bandwidth Extension Based on Wasserstein Generative Adversarial Network

by Chen, Xikun; Yang, Junmei

2021 IEEE 21st International Conference on Communication Technology (ICCT), 10/2021

Artificial bandwidth extension (ABE) algorithms have been developed to improve the quality of narrowband calls before devices are upgraded to wideband calls....

Conference Proceeding Full Text Online

 Related articles


2021

    

Speech Enhancement Approach Based on Relativistic Wasserstein Generation Adversarial Networks

by Huang, Jing; Li, Zhi

2021 International Conference on Wireless Communications and Smart Grid (ICWCSG), 08/2021

As a pre-processing technology in other speech applications, speech enhancement technology is one of the kernel technologies in the field of information...

Conference Proceeding 

Full Text Online

 Related articles All 2 versions

    

AWCD: An Efficient Point Cloud Processing Approach via Wasserstein Curvature

by Luo, Yihao; Yang, Ailing; Sun, Fupeng ; More...

2021 IEEE International Conference on Artificial Intelligence and Computer Applications (ICAICA), 06/2021

In this paper, we introduce the adaptive Wasserstein curvature denoising (AWCD), an original processing approach for point cloud data. By collecting curvatures...

Conference Proceeding 

Full Text Online

Related articles All 4 versions

  

Domain Adaptive Rolling Bearing Fault Diagnosis based on Wasserstein Distance

by Yang, Chunliu; Wang, Xiaodong; Bao, Jun ; More...

2021 33rd Chinese Control and Decision Conference (CCDC), 05/2021

The rolling bearing usually runs at different speeds and loads, which leads to a corresponding change in the distribution of data. The cross-domain problem...

Conference Proceeding Full Text Online

 

WATCH: Wasserstein Change Point Detection for High-Dimensional Time Series Data

by Faber, Kamil; Corizzo, Roberto; Sniezynski, Bartlomiej ; More...

2021 IEEE International Conference on Big Data (Big Data), 12/2021

Detecting relevant changes in dynamic time series data in a timely manner is crucially important for many data analysis tasks in real-world settings. Change...

Conference Proceeding 

Full Text Online

Cited by 5 Related articles All 5 versions

     

High Impedance Fault Diagnosis Method Based on Conditional Wasserstein Generative Adversarial...

by Liu, Wen-Li; Guo, Mou-Fa; Gao, Jian-Hong

2021 IEEE 2nd China International Youth Conference on Electrical Engineering (CIYCEE), 12/2021

Data-driven fault diagnosis of high impedance fault (HIF) has received increasing attention and achieved fruitful results. However, HIF data is difficult to...

Conference Proceeding 

Full Text Online

Related articles

<——2021———2021———2060——


Wasserstein-Distance-Based Multi-Source Adversarial Domain Adaptation for Emotion...

by Luo, Yun; Lu, Bao-Liang

2021 IEEE International Conference on Bioinformatics and Biomedicine (BIBM), 12/2021

To build a subject-independent affective model based on electroencephalography (EEG) is a challenging task due to the domain shift problem caused by individual...

Conference Proceeding 

Full Text Online


一种基于Wasserstein空间的可视化降维方法

05/2021

Patent Available Online Open Access Chinese

 Visual dimension reduction method based on Wasserstein space  


2021

[PDF] arxiv.org

Stability of Gibbs posteriors from the Wasserstein loss for Bayesian full waveform inversion

MM Dunlop, Y Yang - SIAM/ASA Journal on Uncertainty Quantification, 2021 - SIAM

Wasserstein metric and the negative Sobolev seminorm. The conclusion in this paper is 

consistent with the analysis of using the quadratic Wasserstein … We first briefly review 

necessary background knowledge on optimal transport, the quadratic Wasserstein metric, and the …

Cited by 3 Related articles All 3 versions

Cited by 6 Related articles All 5 versions

202[  PDF] arxiv.org

Multi-period facility location and capacity planning under -Wasserstein joint chance constraints in humanitarian logistics

Z Wang, K You, Z Wang, K Liu - arXiv preprint arXiv:2111.15057, 2021 - arxiv.org

… In this work, the ambiguity set is a data-driven ∞-Wasserstein ball (Kantorovich and … 

To sum up, we propose a novel MFLCP model with ∞-Wasserstein joint chance constraints (MFLCP-W) … 

To the best of our knowledge, we are the first to adopt the ∞-Wasserstein joint chance …

All 3 versions


2021

An unsupervised unimodal registration method based on Wasserstein Gan

Y Chen, H Wan, M Zou - Nan Fang yi ke da xue xue bao= Journal of …, 2021 - europepmc.org

For each positive (real) input image, the discriminator network is expected to output a large Wasserstein score, and for each negative (generated) image a small one; the discriminator is optimized by maximizing the difference between the Wasserstein scores of positive and negative images [i.e., minimizing the value of Equation (2)]. Meanwhile, to prevent vanishing or exploding gradients during training, … is added.

Related articles All 3 versions
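
In symbols, the discriminator objective described in the translated snippet above is the standard WGAN critic objective (a generic sketch of the published WGAN formulation; the paper's own Equation (2) is only referenced, not reproduced, in the snippet):

\[
\max_{D}\; \mathbb{E}_{x\sim P_{\mathrm{real}}}\!\left[D(x)\right] \;-\; \mathbb{E}_{\tilde{x}\sim P_{\mathrm{fake}}}\!\left[D(\tilde{x})\right],
\]

maximized over (approximately) 1-Lipschitz discriminators \(D\), which is what makes this difference an estimate of the Wasserstein-1 distance between the two image distributions.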


2021


summary.wasp: Posterior summaries for the Wasserstein

barycenter ...https://rdrr.io › CRAN › waspr

https://rdrr.io › CRAN › waspr

Posterior summary statistics (mean, mode, sd, 95% credible interval) of the Wasserstein barycenter of subset posteriors of all parameters in the model.

2021

[2104.14245] The Wasserstein space of stochastic processeshttps://arxiv.org › math
https://arxiv.org › math
by D Bartl · 2021 · Cited by 2 — Wasserstein distance induces a natural Riemannian structure for the probabilities on the Euclidean space. This insight of classical transport ...


2021 patent

Method for detecting industrial abnormality based on wasserstein generative adversarial network using computer device involves using convolutional neural network module image of work-piece to perform data filter using wasserstein generative adversarial network abnormal detection model

CN113554645-ACN113554645-B

Inventor(s) HOU DGUO J and HANG T

Assignee(s) CHANGZHOU MICRO-INTELLIGENCE TECHNOLOGY

Derwent Primary Accession Number 

2021-C2916J


2023 patent

Restoring image based on Wasserstein generative adversarial network comprises constructing shallow convolutional network structure by using dilated convolution to extract spatial characteristics of the collected image

CN112488956-A

Inventor(s) FANG WGU E and WANG W

Assignee(s) UNIV NANJING INFORMATION SCI & TECHNOLOG

Derwent Primary Accession Number 

2021-30425


2021 patent

Method for generating anti-disturbance image based on Wasserstein generative adversarial network-gradient penalty, involves generating corresponding counter disturbance image for any input counter target

CN113537467-A

Inventor(s) TIAN PSUN J; (...); JIANG L

Assignee(s) UNIV NANJING POST & TELECOM

Derwent Primary Accession Number 

2021-C5057X

<——2021———2021———2070——


 

Wasserstein Generative Adversarial Network based small sample ground slow motion target data classification method, involves classifying ground slow motion target in test set by using classification model

CN113569632-A

Inventor(s) LI YBAI X; (...); ZHOU F

Assignee(s) PLA NO 32203 TROOPS and UNIV XIDIAN

Derwent Primary Accession Number 

2021-C64797


 2021 patent

Wasserstein generative adversarial network scene simulation and time sequence production simulation based energy capacity configuration method, involves establishing new energy planning model, obtaining new energy planning scheme

CN112994115-A

Inventor(s) MA YFU Y; (...); ZHAO S

Assignee(s) UNIV NORTH CHINA ELECTRIC POWER

Derwent Primary Accession Number 

2021-721792

 

2021 patent

Non-linear industrial process modeling method based on Wasserstein generative adversarial network data enhancement, involves normalizing initial data set, and adding multiple groups of generated samples into mixed sample

CN112966429-A

Inventor(s) CHU FDING P; (...); MA X

Assignee(s) UNIV CHINA MINING & TECHNOLOGY BEIJING

Derwent Primary Accession Number 

2021-68724L

 

2021 patent

Representation similar countermeasure network based on Wasserstein distance for electroencephalogram emotion classification and deep migration learning, comprises the EEG signal is sampled at a sampling rate of 200hz

CN113673347-A

Inventor(s) YOU YHE G; (...); ZHU L

Assignee(s) UNIV HANGZHOU DIANZI

Derwent Primary Accession Number 

2021-D7113L

 

2021 patent

Method for generating peptide-based vaccine, involves training Wasserstein Generative Adversarial Network (WGAN) only on positive binding peptide sequences while updating generator to minimize kernel Maximum Mean Discrepancy (MMD) loss

US2021319847-A1WO2021211233-A1

Inventor(s) DURDANOVIC IGRAF H P; (...); MIN R

Assignee(s) NEC LAB AMERICA INC

Derwent Primary Accession Number 

2021-B70701


 2021


2021 patent

Method for adapting machine learning/prediction systems from one domain to another domain, involves training classifier networks as discriminator by maximizing Wasserstein distance-based discrepancy between class probability predictions

US11188795-B1

Inventor(s) ULBRICHT D and LEE C

Assignee(s) APPLE INC

Derwent Primary Accession Number 

2021-D7162W

 

2021 patent

Wasserstein generative adversarial network-gradient penalty based radar high-resolution range profile database constructing method, involves combining target sample group and small sample group for finishing construction of database

CN112946600-A

Inventor(s) WANG PLIU H; (...); JIU B

Assignee(s) UNIV XIDIAN

Derwent Primary Accession Number 

2021-70971U


2021 patent

Method for synthesizing high-energy image based on Wasserstein generating countermeasure network model by using electronic device, involves utilizing arbiter network for judging high energy image synthesized by generator network

CN112634390-A

Inventor(s) ZHENG HHU Z; (...); ZHOU H

Assignee(s) SHENZHEN INST ADVANCED TECHNOLOGY

Derwent Primary Accession Number 

2021-41197G


2021 patent

Method for unsupervised multi-view three-dimensional point cloud combined registration based on wasserstein generative adversarial network (WGAN) involves setting number of generator and discriminator training to be M times

CN112837356-A

Inventor(s) WANG YPENG W; (...); WU H

Assignee(s) UNIV HUNAN

Derwent Primary Accession Number 

2021-60602R

patent

Method for performing seismic record inversion, involves optimizing dual Wasserstein generative adversarial network model, and inputting large sample unlabeled seismic record into optimized inversion generator to generate corresponding large sample wave impedance

CN113722893-A

Inventor(s) WANG Z

Assignee(s) UNIV CHINA PETROLEUM BEIJING

Derwent Primary Accession Number 

2021-E61157

<——2021———2021———2080——


2021 patent

Computer-based method for generating tailored medical recipes for mental health disorders, involves transmitting that recipes, generative adversarial neural network (GAN) comprises wasserstein GAN (WGAN) and self-Attention GAN (SAGAN)

US11049605-B1

Inventor(s) PETERS F L

Assignee(s) CORTERY AB

Derwent Primary Accession Number 

2021-72556R


 2021 patent

Cross-domain recommendation method based on double-flow-based wasserstein self-encoder, involves obtaining input data to obtain user-item-score data of A data domain and user item score data of B domain

CN113536116-A

Inventor(s) XIE HZUO Z; (...); NIE J

Assignee(s) UNIV CHINA OCEAN

Derwent Primary Accession Number 

2021-C4660T


2021 patent

Rolling bearing enhanced diagnostic method based on personal digital assistant-wasserstein generative adversarial networks gradient penalty, involves evaluating quality of generated sample to output fault model with high quality

CN113486931-A

Inventor(s) GE HZHANG Q; (...); CHEN J

Assignee(s) UNIV NANJING AERONAUTICS & ASTRONAUTICS

Derwent Primary Accession Number 

2021-B8909U


2021 patent

Method for predicting bearing remaining life based on improved residual network and wasserstein-generative adversarial network used in mechanical transmission comprises sending common feature space to full-connected neural network structure bearing remaining life prediction model

CN113536697-A

Inventor(s) ZHAO ZXU J and SHEN Y

Assignee(s) UNIV JIANGNAN

Derwent Primary Accession Number 

2021-C4734E


2021 patent

Method for analyzing unbalanced data set based on WGAN training convergence, involves changing training time of discriminator, and outputting corresponding prediction tag to finish classification of network data safety

CN113537313-A

Inventor(s) CHEN ZZHANG L; (...); XU Y

Assignee(s) UNIV HANGZHOU DIANZI

Derwent Primary Accession Number 

2021-C40982

2021


2021 patent

Process industry soft measurement data supplementing method based on self-supervised variational auto-encoder wasserstein generative adversarial network (SVAE-WGAN) in idustrial field, involves calculating output distribution of encoder network

CN113505477-A

Inventor(s) XU JZHANG Q; (...); GAO S

Assignee(s) UNIV NORTHWEST NORMAL

Derwent Primary Accession Number 

2021-C05218



 2021 patent

Method for generating biological Raman spectrum data based on WGAN resistance generating network comprises performing iterative training on generating network and judging network to obtain generated data of false true Raman spectrum

CN112712857-A

Inventor(s) ZHU LDING J; (...); ZHUANG W

Assignee(s) UNIV BEIJING INFORMATION SCI & TECHNOLOG

Derwent Primary Accession Number 

2021-51255C



 2021 patent

Improved private aggregation-of-teacher series based WGAN-GP privacy protection method, involves generating synthesized data by optimizing teacher classifier cluster for training machine learning models according to original sensitive training data protection process

CN113553624-A

Inventor(s) NIE PHAN Z; (...); YANG Z

Assignee(s) UNIV TIANJIN

Derwent Primary Accession Number 

2021-C5045Q

<——2021———2021———2090——  


Wind power output power prediction method and wasserstein generative adversarial networks (WGAN) network, has filling missing value of wind power data and abnormal value, normalizing data set as input data of prediction model, and outputting prediction result after training test of prediction mode

CN113298297-A

Inventor(s) WANG YWU Y; (...); LIU G

Assignee(s) UNIV INNER MONGOLIA TECHNOLOGY

Derwent Primary Accession Number

Related articles


[HTML] sciencedirect.com

[HTML] Wasserstein distance based multiobjective evolutionary algorithm for the risk aware optimization of sensor placement

A Ponti, A Candelieri, F Archetti - Intelligent Systems with Applications, 2021 - Elsevier

In this paper we propose a new algorithm for the identification of optimal “sensing spots”,

within a network, for monitoring the spread of “effects” triggered by “events”. This problem is

referred to as “Optimal Sensor Placement” and many real-world problems fit into this general …

 Cited by 4


Minimizing Wasserstein-1 Distance by Quantile Regression for GANs Model

Y Chen, X Hou, Y Liu - Chinese Conference on Pattern Recognition and …, 2021 - Springer

… So we propose minimizing the Wasserstein-1 distance by Quantile Regression algorithm which

works well on minimizing the Wasserstein-1 … In short, we provide a new idea for minimizing

Wasserstein-1 distance in GANs model. Comparative experiments on MNIST, CIFAR-10, …
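
The link between quantile regression and the Wasserstein-1 distance that this entry relies on comes from the one-dimensional closed form (a standard identity, stated here for reference rather than quoted from the paper):

\[
W_1(\mu,\nu) \;=\; \int_0^1 \bigl|F_\mu^{-1}(t) - F_\nu^{-1}(t)\bigr|\,dt \;=\; \int_{\mathbb{R}} \bigl|F_\mu(x) - F_\nu(x)\bigr|\,dx,
\]

so estimating quantile functions \(F^{-1}\) directly controls the 1-Wasserstein distance in one dimension.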

 

2021  [PDF] arxiv.org

Wasserstein GAN: Deep Generation applied on Bitcoins financial time series

R Samuel, BD Nico, P Moritz, O Joerg - arXiv preprint arXiv:2107.06008, 2021 - arxiv.org

… Because GANs often suffer from mode collapse during training, we introduce the improved

GAN called Wasserstein GAN to improve learning stability. The papers [28–30] focus on

implementing a Wasserstein GAN and show differences to the original GAN [57] and optimize …

   All 2 versions 


A data-driven scheduling model of virtual power plant using Wasserstein distributionally robust optimization

H Liu, J Qiu, J Zhao - International Journal of Electrical Power & Energy …, 2021 - Elsevier

… sell excessive electricity to spot market in the form of Virtual Power Plant (VPP). The aggregator

schedules DER … Wasserstein ambiguity set is constructed under uncertainties of market

price and wind forecast errors. A set of data-driven linearization power constraints are applied …

All 2 versions


2021


Conditional Wasserstein generative adversarial networks applied to acoustic metamaterial design

P Lai, F Amirkulova, P Gerstoft - … Journal of the Acoustical Society of …, 2021 - asa.scitation.org

… Wasserstein generative adversarial networks (cWGANs) model is proposed for minimization

of TSCS in two dimensions by combining Wasserstein … To achieve this, we develop and

train a conditional Wasserstein generative adversarial network (cWGAN) to propose images of …

 All 5 versions


Run-Sort-ReRun: Escaping Batch Size Limitations in Sliced Wasserstein Generative Models

J Lezama, W Chen, Q Qiu - International Conference on …, 2021 - proceedings.mlr.press

… In this paper, we build upon recent progress in sliced Wasserstein distances, a family of

differentiable metrics for distribution discrepancy based on the Optimal Transport paradigm.

We introduce a procedure to train these distances with virtually any batch size, allowing the …

  All 2 versions 

 

Wasserstein GAN: Deep Generation Applied on Financial Time Series

M Pfenninger, DN Bigler, S Rikli… - Available at SSRN …, 2021 - papers.ssrn.com

… Because GANs often suffer from mode collapse during training, we introduce the improved

GAN called Wasserstein GAN to improve learning stability. The papers [28–30] focus on

implementing a Wasserstein GAN and show differences to the original GAN [57] and optimize …

  All 2 versions


Synthesis of Adversarial DDoS Attacks Using Wasserstein Generative Adversarial Networks with Gradient Penalty

CS Shieh, TT Nguyen, WW Lin… - 2021 6th …, 2021 - ieeexplore.ieee.org

… However, new types of attacks emerge as the technology for DDoS attacks keep evolving.

This study investigates the impact of a new sort of DDoS attack – adversarial DDoS attack.

We synthesize attacking traffic using Wasserstein Generative Adversarial Networks with …

  All 2 versions


Demystified: Wasserstein GANs (WGAN) | by Aadhithya Sankar

https://towardsdatascience.com › demystified-wasserstei.....

Sep 17, 2021 — In this article we will read about Wasserstein GANs. ... 1 we see clearly that the optimal GAN discriminator saturates and results in ... 
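
For readers following the blog post above, a minimal sketch of the WGAN critic update it discusses might look as follows (a hedged PyTorch illustration; the two-layer critic and the 2-D toy batches are placeholders, not anything taken from the post):

import torch
import torch.nn as nn

# Toy critic; a real WGAN critic matches the data modality (e.g. a CNN for images).
critic = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))
optimizer = torch.optim.RMSprop(critic.parameters(), lr=5e-5)

real = torch.randn(128, 2)          # placeholder real batch
fake = torch.randn(128, 2) + 2.0    # placeholder generator output

# Kantorovich-Rubinstein dual: the critic maximizes E[D(real)] - E[D(fake)],
# i.e. minimizes the negated difference. Unlike the JS-based GAN loss, this
# objective does not saturate when the two distributions are far apart.
loss = critic(fake).mean() - critic(real).mean()
optimizer.zero_grad()
loss.backward()
optimizer.step()

# The original WGAN enforces the 1-Lipschitz constraint by weight clipping;
# WGAN-GP replaces this step with a gradient penalty.
with torch.no_grad():
    for p in critic.parameters():
        p.clamp_(-0.01, 0.01)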

<——2021———2021———2100—


A Theory of the Distortion-Perception Tradeoff in Wasserstein Space
A Theory of the Distortion-Perception Tradeoff in Wasserstein ...

https://openreview.net › forum

by D Freirich · 2021 · Cited by 2 — Abstract: The lower the distortion of an estimator, the more the distribution of its outputs …
Cited by 5 Related articles All 4 versions
A Theory of the Distortion-Perception Tradeoff in Wasserstein ...

slideslive.com › a-theory-of-the-distortionperception-trad...

A Theory of the Distortion-Perception Tradeoff in Wasserstein Space. Dec 6, 2021. Speakers. About. The lower the distortion of an estimator, the more the ...

SlidesLive · 

Dec 6, 2021

Cited by 6 Related articles All 4 versions

  2021

On efficient multilevel Clustering via Wasserstein distances

https://jmlr.org › beta › papers

by V Huynh · 2021 ·  — Our method involves a joint optimization formulation over several spaces of discrete probability measures, which are endowed with Wasserstein distance ...



2021

Linear and Deep Order-Preserving Wasserstein ... - Northwestern Scholars

https://www.scholars.northwestern.edu › publications

by B Su · 2021 — It is typically more challenging than conventional dimensionality reduction for static data, because measuring the separability of sequences involves ...



2021
Wasserstein statistics in one-dimensional location scale models

by Amari, Shun-ichi; Matsuda, Takeru
Annals of the Institute of Statistical Mathematics, 03/2021, Volume 74, Issue 1
Wasserstein geometry and information geometry are two important structures to be introduced in a manifold of probability distributions. Wasserstein geometry is...
Journal Article  Full Text Online
 Wasserstein statistics in one-dimensional location scale models

By: Amari, Shun-ichi; Matsuda, Takeru

ANNALS OF THE INSTITUTE OF STATISTICAL MATHEMATICS    

Early Access: MAR 2021

  Related articles All 4 versions


2021


2021 see 2022
WATCH: Wasserstein Change Point Detection for High-Dimensional Time Series Data
by Faber, Kamil; Corizzo, Roberto; Sniezynski, Bartlomiej ; More...
2021 IEEE International Conference on Big Data (Big Data), 12/2021
Detecting relevant changes in dynamic time series data in a timely manner is crucially important for many data analysis tasks in real-world settings. Change...

Conference Proc 

Related articles All 4 versions

2021

Master's Thesis Presentation • Machine Learning - Cheriton ...

https://cs.uwaterloo.ca › events › masters-thesis-presenta...

Jan 21, 2021 — We present a semi-supervised approach using Wasserstein autoencoders and a mixture of Gaussian priors for topic-aware sentence generation. Our ...


2021

Necessary Optimality Conditions for Optimal Control Problems in Wasserstein Spaces (Sept, 10.1007/s00245-021-09772-w, 2021)

Bonnet, B and Frankowska, H

Dec 2021 | Sep 2021 (Early Access) | APPLIED MATHEMATICS AND OPTIMIZATION 84 (SUPPL 2) , pp.1819-1819

Free Full Text From Publisher

1 Reference Related records

Cited by 8 Related articles All 12 versions

[CITATION] Necessary Optimality Conditions for Optimal Control Problems in Wasserstein Spaces (Sept, 10.1007/s00245-021-09772-w, 2021)

B Bonnet, H Frankowska
Bonnet, Benoît
Frankowska, Hélène

Correction to: “Necessary optimality conditions for optimal control problems in Wasserstein spaces”. (English) Zbl 07498421

Appl. Math. Optim. 84, Suppl. 2, 1819 (2021).

MSC:  30L99 34K09 49J53 49K21 49Q22 58E25

PDF BibTeX XML Cite

Full Text: DOI 



 Cited by 21 Related articles All 15 versions


[PDF] wisc.edu

[PDF] Wasserstein Graph Clustering in Determining the Genetic Contribution of State Changes in rs-fMRI

MK Chung, SG Huang, IC Carroll, VD Calhoun, H Hill - pages.stat.wisc.edu

… novel Wasserstein graph clustering method for networks (Anand, 2021). The Wasserstein …

The Wasserstein clustering outperforms the widely used k-means clustering. We applied the …
All 2 versions
 


2021 patent

Method for estimating time delay distance, involves calculating time delay for each unique pair of multiple sensors by minimizing Wasserstein distance between two cumulative distribution transforms corresponding to unique pair

US2021281361-A1

Inventor(s) CRANCH G AMENKART N; (...); HUTCHINSON M N

Assignee(s) US SEC OF NAVY

Derwent Primary Accession Number 

2021-A3597D

<——2021———2021———2110——


Wasserstein Regression - Taylor & Francis Online

https://www.tandfonline.com › ... › Latest Articles

https://www.tandfonline.com › ... › Latest Articles

by Y Chen · 2021 · Cited by 8 — Adopting the Wasserstein metric, we develop a class of regression models for such data, where random distributions serve as predictors and the responses are ...

2021

Intuition on Wasserstein Distance - Cross Validated

https://stats.stackexchange.com › questions › intuition-o...

https://stats.stackexchange.com › questions › intuition-o...

Apr 29, 2021 — if you from scipy.stats import wasserstein_distance and calculate the distance between a vector like [6,1,1,1,1] and any permutation of it where the 6 "moves ...

1 answer

 Intuition on Wasserstein Distance - Cross Validated

https://stats.stackexchange.com › questions › intuition-o...

https://stats.stackexchange.com › questions › intuition-o...

Apr 29, 2021 · 1 answer

Here is the documentation: Parameters u_values, v_values array_like Values observed in the (empirical) distribution. Note that wasserstein_distance expects ...
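
The behaviour described in the answer above can be verified directly (a small SciPy sketch; the numbers reuse the example vector from the question):

from scipy.stats import wasserstein_distance

u = [6, 1, 1, 1, 1]
v = [1, 1, 6, 1, 1]   # a permutation of u

# wasserstein_distance treats its arguments as samples from two 1-D empirical
# distributions, so any permutation of the same values is at distance 0.
print(wasserstein_distance(u, v))                   # 0.0

# Shifting every sample by 1 moves each unit of mass by 1, giving distance 1.
print(wasserstein_distance(u, [x + 1 for x in u]))  # 1.0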

2021

Wasserstein distance and Kolmogorov-Smirnov statistic as ...

https://stats.stackexchange.com › questions › wasserstei...

https://stats.stackexchange.com › questions › wasserstei...

Oct 24, 2021 — I thought that maybe the Wasserstein distance or the Kolmogorov-Smirnov statistic can be good measures of the effect size between the two distributions.

1 answer


2021

Multi-marginal wasserstein GAN - ACM Digital Library

https://dl.acm.org › doi

https://dl.acm.org › doi

Jun 15, 2021 — Wasserstein generative adversarial networks. In Proceedings of the International Conference on Machine Learning, pages 214-223, 2017.


2021

An Information-Theoretic View of Generalization via ...

https://scholar.harvard.edu › hao › publications › infor...

https://scholar.harvard.edu › hao › publications › infor...

Oct 26, 2021 — 2019. “An Information-Theoretic View of Generalization via Wasserstein Distance.” In IEEE International Symposium on Information Theory (ISIT).


2021


Carroll, Tom; Massaneda, Xavier; Ortega-Cerdà, Joaquim

Corrigendum to: “An enhanced uncertainty principle for the Vaserstein distance”. (English) Zbl 07456956

Bull. Lond. Math. Soc. 53, No. 5, 1520-1522 (2021).

MSC:  28A75

PDF BibTeX XML Cite. Zbl 1486.28002



2021

Wasserstein space-based visual dimension reduction method

CN CN112765426A 秦红星 重庆邮电大学

Priority 2021-01-18 • Filed 2021-01-18 • Published 2021-05-07

6. The visualization dimension reduction method based on Wasserstein space according to claim 5, wherein S5 specifically includes: denote by P_i the i-th row of the matrix P, and by Q_i similarly, each regarded as a column vector; W denotes the 1-Wasserstein distance, and the dual form of the loss function …
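
The "dual form" referred to in this claim is presumably the Kantorovich-Rubinstein duality for the 1-Wasserstein distance, reproduced here for reference (the standard identity, not the patent's own formula):

\[
W_1(\mu,\nu) \;=\; \sup_{\|f\|_{\mathrm{Lip}}\le 1}\; \mathbb{E}_{x\sim\mu}\!\left[f(x)\right] - \mathbb{E}_{y\sim\nu}\!\left[f(y)\right],
\]

where the supremum runs over all 1-Lipschitz functions \(f\).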



 

 

2021

… for high-dimension unsupervised anomaly detection using kernalized wasserstein

KR KR102202842B1 백명희조 서울대학교산학협력단

Priority 2019-08-13 • Filed 2019-08-13 • Granted 2021-01-14 • Published 2021-01-14

The present invention relates to a learning method and a learning apparatus for high-dimension unsupervised abnormality detection using a kernalized Wasserstein autoencoder to decrease excessive computations of a Christoffel function, and a test method and a test apparatus using the same.


  

2021

Industrial anomaly detection method and device based on WGAN

CN CN113554645A 杭天欣 常州微亿智造科技有限公司

Priority 2021-09-17 • Filed 2021-09-17 • Published 2021-10-26

constructing an original WGAN model, wherein the original WGAN model comprises a generator and a discriminator; inputting the first data set and the second data set into the original WGAN model to train the original WGAN model to obtain the WGAN anomaly detection model, wherein the first data set …

 

 Using Wasserstein Generative Adversarial Networks for the design of Monte Carlo simulations

 Using Wasserstein Generative Adversarial ... - Science Direct

by S Athey · 2021 · Cited by 43 — In this section we discuss the application of WGANs for Monte Carlo studies based on the Lalonde–Dehejia–Wahba (LDW) data. 3.1. Simulation ...

Cited by 70 Related articles All 14 versions

<——2021———2021———2120——


Method for generating biological Raman spectrum data based on WGAN (WGAN) …

CN CN112712857A 祝连庆 北京信息科技大学

Priority 2020-12-08 • Filed 2020-12-08 • Published 2021-04-27

1. A method of generating bio-Raman spectral data based on a WGAN adversarial generative network, the method comprising the steps of: a, extracting part of the Raman spectrum data from a Raman spectrum database to serve as a real sample, and preprocessing the Raman spectrum data; b, creating a normal …

2021 

Method for image restoration based on WGAN network

CN CN112488956A 方巍 南京信息工程大学

Priority 2020-12-14 • Filed 2020-12-14 • Published 2021-03-12

3. The method for image inpainting based on WGAN network of claim 1, wherein in the step (1.3), through optimizing parameters and function algorithm: wherein, the activation function is specifically described as follows: 4. the method for image restoration based on WGAN network of claim 1, wherein …

2021  patent

WGAN-based fuzzy aerial image processing method

CN CN113538266A 李业东 南京国电南自电网自动化有限公司

Priority 2021-07-07 • Filed 2021-07-07 • Published 2021-10-22

the fuzzy image processing model takes a WGAN network as a basic network and comprises a generator network and a discriminator network, wherein the generator network comprises a down-sampling network block and an up-sampling network block which are sequentially arranged, the discriminator network …

 

2021   

Anti-disturbance image generation method based on WGAN-GP

CN CN113537467A 蒋凌云 南京邮电大学

Priority 2021-07-15 • Filed 2021-07-15 • Published 2021-10-22

2. The WGAN-GP-based disturbance rejection image generation method according to claim 1, wherein the target loss function L_WGAN-GP is computed as given in formula (2), where D(x) denotes the discriminator's judgment of whether the class label of x belongs to the class information in …
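
Formula (2) itself is not reproduced in the abstracted claim above; for orientation, the WGAN-GP critic loss as published in the literature (Gulrajani et al., 2017) reads

\[
L \;=\; \mathbb{E}_{\tilde{x}\sim P_g}\!\left[D(\tilde{x})\right] \;-\; \mathbb{E}_{x\sim P_r}\!\left[D(x)\right] \;+\; \lambda\,\mathbb{E}_{\hat{x}}\!\Bigl[\bigl(\|\nabla_{\hat{x}} D(\hat{x})\|_2 - 1\bigr)^2\Bigr],
\]

with \(\hat{x}\) drawn uniformly along straight lines between real and generated samples; whether the patent uses exactly this form cannot be read off the claim.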

2021 

New energy capacity configuration method based on WGAN scene simulation and …

CN CN112994115A 马燕峰 华北电力大学(保定)

Priority 2019-12-18 • Filed 2019-12-18 • Published 2021-06-18

A new energy capacity configuration method based on Wasserstein generative adversarial network (WGAN) scene simulation and time sequence production simulation is characterized by mainly comprising the following specific steps: step 1, simulating a large number of wind and light resource …

2021 

Road texture picture enhancement method coupling traditional method and WGAN-GP

CN113850855A 徐子金 北京工业大学

Filed 2021-08-27 • Published 2021-12-28

1. A road texture picture enhancement method coupling a traditional method and WGAN-GP is characterized in that a new high-quality texture picture is generated by utilizing a road surface macro texture picture obtained by a commercial handheld three-dimensional laser scanner through a traditional …

2021  

Wind power output power prediction method based on isolated forest and WGAN …

CN CN113298297A 王永生 内蒙古工业大学

Priority 2021-05-10 • Filed 2021-05-10 • Published 2021-08-24

7. The isolated forest and WGAN network based wind power output power prediction method of claim 6, wherein the interpolation operation comprises the following steps: step 2.1, inputting the random noise vector z into a generator G to obtain a generated time sequence G (z), wherein G (z) is a …


2021

Research article
Deep transfer Wasserstein adversarial network for wafer map defect recognition
Computers & Industrial Engineering, 13 September 2021...

Jianbo Yu; Shijin Li; Qingfeng Li

2021  Research article  Open access
Sliding window neural network based sensing of bacteria in wastewater treatment plants
Journal of Process Control, 24 December 2021...

Mohammed Alharbi; Pei-Ying Hong; Taous-Meriem Laleg-Kirati

Download PDF

2021  Research article
Wasserstein distributionally robust shortest path problem
European Journal of Operational Research, 13 January 2020...

Zhuolin Wang; Keyou You; Yuli Zhang

Related articles All 2 versions

<——2021———2021———2130——


[PDF] researchgate.net

[PDF] Improving Perceptual Quality by Phone-Fortified Perceptual Loss using Wasserstein Distance for Speech Enhancement

TA Hsieh, C Yu, SW Fu, X Lu, Y Tsao - arXiv, 2021 - researchgate.net

… In the followings, we will describe the PFPL, which is a perceptual loss incorporated with 

Wasserstein distance in detail. … Based on this concept, we decide to replace the Lp distance 

and use the Wasserstein distance as the distance measure to compute the perceptual loss for …

 Cited by 2 Related articles All 4 versions


2021  video and text

 Flexibly Learning Latent Priors for Wasserstein Auto-Encoders

 July 28, 2021   4 speakers

[PDF] mlr.press

FlexAE: Flexibly learning latent priors for wasserstein auto-encoders

AK Mondal, H Asnani, P Singla… - Uncertainty in Artificial …, 2021 - proceedings.mlr.press

… DZ, in principle can be chosen to be any distributional divergence such as Kullback-Leibler 

divergence (KLD), Jensen–Shannon divergence (JSD), Wasserstein Distance and so on. In 

this work, we propose to use Wasserstein distance and utilize the principle laid in [Arjovsky et …

Cited by 4 Related articles All 5 versions


[1104.4631] Comparison between $W_2$ distance and $\dot{H}

https://arxiv.org › math

https://arxiv.org › math

by R Peyre · 2011 · Cited by 47 — Comparison between W_2 distance and \dot{H}^{-1} norm, and localisation of Wasserstein distance. Authors:Rémi Peyre · Download PDF. Abstract: It ...

 


2021 patent
One-dimensional time series data augmentation method based on WGAN involves using Wasserstein generative adversarial network (WGAN) generator corresponding to each subclass of clustered time series data, to generate artificial samples with same characteristics as original data of subclass

Patent Number: CN113627594-A

Patent Assignee: UNIV BEIHANG

Inventor(s): QIAN C; YANG D; REN Y; et al.

One-dimensional time series data augmentation method based on WGAN involves using Wasserstein generative adversarial network (WGAN) generator corresponding to each subclass of clustered time series data, to generate artificial samples with same characteristics as original data of subclass

CN113627594-A

Inventor(s) QIAN CYANG D; (...); SUN B

Assignee(s) UNIV BEIHANG

Derwent Primary Accession Number 

2021-D12817

2021


Wasserstein Distribution Correction for Improved Robustness ...

elsevierpure.com

https://graz.elsevierpure.com › publications › wasserst...

by A Fuchs · 2021 — Wasserstein Distribution Correction for Improved Robustness in Deep Neural Networks. Alexander Fuchs, Christian Knoll, Franz Pernkopf.

[CITATION] Wasserstein Distribution Correction for Improved Robustness in Deep Neural Networks

A Fuchs, C Knoll, F Pernkopf - NeurIPS Workshop DistShift, 2021 - graz.pure.elsevier.com

Wasserstein Distribution Correction for Improved Robustness in Deep Neural Networks — 

Graz University of Technology … Wasserstein Distribution Correction for Improved …

Related articles


2021  


Wasserstein Contrastive Representation Distillation

By: Chen, Liqun; Wang, Dong; Gan, Zhe; et al.

Conference: IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Location: ‏ ELECTR NETWORK Date: ‏ JUN 19-25, 2021

Sponsor(s): ‏IEEE; IEEE Comp Soc; CVF

2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2021  Book Series: ‏ IEEE Conference on Computer Vision and Pattern Recognition   Pages: ‏ 16291-16300   Published: ‏ 2021

Cited by 30 Related articles All 7 versions 

2021

Model summary for the WGAN model.

By: Castelli, Mauro; Manzoni, Luca; Espindola, Tatiane; et al.

Figshare

DOI: ‏ https://doi-org.ezaccess.libraries.psu.edu/10.1371/journal.pone.0260308.t003

Document Type: Data set

 View Abstract

Model summary for the WGAN model.

Castelli, Mauro; Manzoni, Luca; (...); De Lorenzo, Andrea

2021 | Figshare | Data set

Model summary for the WGAN model. Copyright: CC BY 4.0

View data

Conference Paper  Citation/Abstract

Multi-source Cross Project Defect Prediction with Joint Wasserstein Distance and Ensemble Learning

Zou, Quanyi; Yang, Zhanyu; Xu, Hao.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).

All 2 versions

Conference Paper  Citation/Abstract

Weighted Wasserstein Distance-based Improved Serial Principal Component Analysis for Incipient Fault Detection of Complex Industrial Process

Deng, Xiaogang; Wang, Xiaohui.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).


Abstract/Details  Show Abstract 

Conference Paper  Citation/Abstract

Probability Distribution Control of Finite-State Markov Chains with Wasserstein Costs and Application to Operation of Car-Sharing Services

Sakurama, Kazunori.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021). Abstract/Details   Show Abstract 

<——2021———2021———2140——


Conference Paper  Citation/Abstract

Fair Graph Auto-Encoder for Unbiased Graph Representations with Wasserstein Distance

Liu, Kunpeng; Xie, Rui; Liu, Hao; Xiong, Hui; Fu, Yanjie.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).

Abstract/Details   Show Abstract 

Related articles All 2 versions

 

Conference Paper  Citation/Abstract

High Impedance Fault Diagnosis Method Based on Conditional Wasserstein Generative Adversarial Network

Guo, Mou-Fa; Gao, Jian-Hong.The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).

Abstract/Details   Show Abstract 


[HTML] sinomaps.com

[HTML] Unsupervised band selection method for hyperspectral image classification based on Wasserstein configuration entropy (高光谱图像分类的 Wasserstein 配置熵非监督波段选择方法)

张红, 吴智伟, 王继成, 高培超 - 2021 - xb.sinomaps.com

… Here, the Wasserstein configuration entropy removes the redundant information of contiguous pixels, but is limited to the four-neighborhood; this paper extends the Wasserstein configuration entropy to the eight-neighborhood. Taking the Indian Pines test site and the University of Pavia (Italy) hyperspectral images as examples, the Wasserstein configuration entropy is used …

Related articles All 2 versions 

Unsupervised band selection for hyperspectral image classification using the Wasserstein metric-based configuration entropy

 


[PDF] arxiv.org

On Stein's Factors for Poisson Approximation in Wasserstein Distance with Nonlinear Transportation Costs

ZW Liao, Y Ma, A Xia - Journal of Theoretical Probability, 2021 - Springer

We establish various bounds on the solutions to a Stein equation for Poisson approximation

in the Wasserstein distance with nonlinear transportation costs. The proofs are a refinement …

Related articles All 3 versions

Projected Statistical Methods for Distributional Data on the Real Line with the Wasserstein Metric (preprint)

M Pegoraro, M Beraha - 2021 - pesquisa.bvsalud.org

… -Wasserstein metric. We focus in particular on Principal Component Analysis (PCA) and

regression. To define these models, we exploit a representation of the Wasserstein … Wasserstein …

 Projected Statistical Methods for Distributional Data on the Real Line with the Wasserstein Metric (preprint)

M Pegoraro, M Beraha - 2021 - pesquisa.bvsalud.org

… To define these models, we exploit a representation of the Wasserstein space closely … space

and using a metric projection operator to constrain the results in the Wasserstein space. By … 


2021

weighted wasserstein distance-based improved serial principal Component Analysis for Incipient Fault Detection of Complex Industrial Process

J Dai, X Deng, X Wang - 2021 CAA Symposium on Fault …, 2021 - ieeexplore.ieee.org

Wasserstein distance-based improved serial principal component analysis (WWDSPCA) 

method to detect the incipient fault of complex industrial … Then, Wasserstein distance (WD), …

 

[PDF] archive.org

[PDF] Supplement to “Wasserstein Regression”

Y Chen, Z Lin, HG Müller - scholar.archive.org

Thus, for the proofs of Theorems 1 and 2, we will derive the asymptotic order of the right hand

sides in (S. 1). To this end, we need to study the asymptotic properties of the estimators of …

 Cited by 22 Related articles All 4 versions+

 

2021 thesis  PDF  

Sliced-Wasserstein Distance for Large-Scale Machine Learning

https://www.theses.fr › ...

by K Nadjahi · 2021 — This thesis further explores the use of the Sliced-Wasserstein distance in modern statistical and …
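
As background for this thesis entry, the sliced-Wasserstein distance it studies is usually defined by averaging one-dimensional Wasserstein distances over projection directions:

\[
\mathrm{SW}_p^p(\mu,\nu) \;=\; \int_{\mathbb{S}^{d-1}} W_p^p\bigl(\theta_{\#}\mu,\;\theta_{\#}\nu\bigr)\,d\sigma(\theta),
\]

where \(\theta_{\#}\mu\) denotes the pushforward of \(\mu\) under the projection \(x \mapsto \langle\theta, x\rangle\) and \(\sigma\) is the uniform measure on the unit sphere.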


Entropy-regularized 2-Wasserstein distance between ...

https://link.springer.com › article

by A Mallasto · 2021 · Cited by 1 — Optimal transport (OT) [82] studies the geometry of probability measures through the lifting of a cost function between samples. This is carried ...

Cited by 18 Related articles All 6 versions
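
For comparison with the entropy-regularized quantity studied in this entry, the unregularized 2-Wasserstein distance between Gaussian measures has the well-known closed (Bures-Wasserstein) form

\[
W_2^2\bigl(\mathcal{N}(m_1,\Sigma_1),\,\mathcal{N}(m_2,\Sigma_2)\bigr) \;=\; \|m_1 - m_2\|^2 \;+\; \operatorname{tr}\!\Bigl(\Sigma_1 + \Sigma_2 - 2\bigl(\Sigma_1^{1/2}\Sigma_2\Sigma_1^{1/2}\bigr)^{1/2}\Bigr).
\]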


[PDF] wisc.edu

[PDF] Wasserstein Graph Clustering in Determining the Genetic Contribution of State Changes in rs-fMRI

MK Chung, SG Huang, IC Carroll, VD Calhoun, H Hill - pages.stat.wisc.edu

… novel Wasserstein graph clustering method for networks (Anand, 2021). The Wasserstein 

The Wasserstein clustering outperforms the widely used k-means clustering. We

Related articles All 2 versions 

<—–2021———2021———2150——


[PDF] mlr.press

Smooth -Wasserstein Distance: Structure, Empirical Approximation, and Statistical Applications

S Nietert, Z Goldfeld, K Kato - International Conference on …, 2021 - proceedings.mlr.press

… by the scalability of this framework to high dimensions, we investigate the structural and

statistical behavior of the Gaussian-smoothed p-Wasserstein distance W (σ) p , for arbitrary p ≥ 1…

Cited by 5 Related articles All 2 versions 
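
For reference, the smoothed distance in the title of this entry is usually defined as the Wasserstein distance between Gaussian-convolved measures,

\[
W_p^{(\sigma)}(\mu,\nu) \;=\; W_p\bigl(\mu * \mathcal{N}(0,\sigma^2 I_d),\;\nu * \mathcal{N}(0,\sigma^2 I_d)\bigr),
\]

which recovers the ordinary \(p\)-Wasserstein distance as \(\sigma \to 0\).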



[PDF] mlr.press

When ot meets mom: Robust estimation of wasserstein distance

…, Laforgue, Mozharovskyi… - … Conference on …, 2021 - proceedings.mlr.press

… Medians of Means (MoM) approach to provide robust estimates. Exploiting the dual Kantorovitch

formulation of the Wasserstein distance, we introduce and discuss novel MoM-based ro…

S 0 Related articles All 8 versions 

 

[PDF] github.io

Measuring dependence in the Wasserstein distance for Bayesian nonparametric models

M Catalano, A Lijoi, I Prünster - The Annals of Statistics, 2021 - projecteuclid.org

… We conclude this section by recalling some properties of the Wasserstein distance to be

used in the sequel. Let X and Y be two random elements in R2. A coupling (ZX,ZY ) C(X,…

Cited by 5 Related articles All 5 versions 



Multivariate goodness-of-fit tests based on Wasserstein distance

M Hallin, G Mordant, J Segers - Electronic Journal of Statistics, 2021 - projecteuclid.org

… -sample performance of the test statistic based on the p-Wasserstein distance for p  {1, 2}

… the best of our knowledge, an implementation of the SAG method is not yet available in R (R …

S 8 Related articles All 14 versions


[PDF] arxiv.org

On distributionally robust chance constrained programs with Wasserstein distance

W Xie - Mathematical Programming, 2021 - Springer

… chance constrained program (DRCCP) with Wasserstein ambiguity set, where the uncertain

… distributions of the uncertain parameters within a chosen Wasserstein distance from an …

Cited by 83 Related articles All 9 versions
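
The Wasserstein ambiguity set underlying DRCCP formulations such as this one is typically the ball of distributions within a chosen radius of the empirical measure (a generic definition, not notation taken from the paper):

\[
\mathcal{P}_{\varepsilon} \;=\; \bigl\{\, Q \;:\; W_p\bigl(Q,\widehat{P}_N\bigr) \le \varepsilon \,\bigr\},
\]

where \(\widehat{P}_N\) is the empirical distribution of the \(N\) observed samples and \(\varepsilon\) is the ball radius; the chance constraint is then required to hold for every \(Q\) in \(\mathcal{P}_{\varepsilon}\).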


2021


[PDF] neurips.cc

The unbalanced Gromov Wasserstein distance: Conic formulation and relaxation

Séjourné, FX Vialard, G Peyré - Advances in Neural …, 2021 - proceedings.neurips.cc

… -Wasserstein formulations: a distance and a more tractable upper-bounding relaxation. They

both allow the comparison of … experiments on synthetic examples and domain adaptation …

S 3 Related articles All 7 versions 


Fault diagnosis of rotating machinery based on wasserstein distance and feature selection

F Ferracuti, A Freddi, A Monteriù… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org

… for fault classification and degradation prediction in the last years [20]–[24], whereas, in this

context, Wasserstein distance-based solutions for fault diagnosis are still at the beginning of …

Cited by 2 Related articles



[PDF] arxiv.org

The Wasserstein space of stochastic processes

D Bartl, M Beiglböck, G Pammer - arXiv preprint arXiv:2104.14245, 2021 - arxiv.org

… Wasserstein distance, whereas in Subsection 5.4 we establish two topological properties of

martingales as a subset of … In Subsection 5.6 we prove that FPp is a geodesic space for 1 < p …

 Cited by 3 Related articles All 2 versions 


[PDF] arxiv.org

Plg-in: Pluggable geometric consistency loss with wasserstein distance in monocular depth estimation

…, S KoideKawanoKondo - … Conference on Robotics …, 2021 - ieeexplore.ieee.org

… of monocular camera images. Our objective is designed using the Wasserstein distance …

The Wasserstein distance can impose a soft and symmetric coupling between two point …

  Cited by 3 Related articles All 4 versions


R-WGAN-based Multi-timescale Enhancement Method for Predicting f-CaO Cement Clinker

…, L Liu, G Huang, Zhang, Zhang… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org

… In this paper, a prediction method based on R-WGAN model is … The method combines

WGAN with a regression prediction … The main work of this paper is to: 1) Based on the …

  Related articles

<——2021———2021———2160


[PDF] aaai.org

[PDF] Towards Generalized Implementation of Wasserstein Distance in GANs

M Xu, Z Zhou, G Lu, J Tang, W Zhang, Y Yu - Proceedings of the AAAI …, 2021 - aaai.org

… still keeps the gradient property of the Wasserstein distance … Based on this relaxed duality,

we propose a generalized WGAN model called Sobolev Wasserstein GAN. To the best of our …

 Cited by 4 Related articles All 5 versions 

 

SVAE-WGAN-Based Soft Sensor Data Supplement Method for Process Industry

S Gao, S Qiu, Z Ma, Tian, Liu - IEEE Sensors Journal, 2021 - ieeexplore.ieee.org

… WGAN under the consideration of the accuracy and diversity for generated data. The

SVAE-WGAN based … and use their encoders as generators of WGAN, and a deep generative …

 Related articles All 2 versions


 

[PDF] mdpi.com

Low-Illumination Image Enhancement in the Space Environment Based on the DC-WGAN Algorithm

M Zhang, Zhang, Z Jiang, X Lv, C Guo - Sensors, 2021 - mdpi.com

… mainly includes the GAN, DCGAN, and Wasserstein GAN (WGAN). Section 3 explains the

network model proposed in this paper based on the WGAN loss function, and the loss function …

 Cited by 2 Related articles All 8 versions 


[PDF] arxiv.org

Automatic Visual Inspection of Rare Defects: A Framework based on GP-WGAN and Enhanced Faster R-CNN

M JalayerJalayer, A Kaboli… - … on Industry 4.0 …, 2021 - ieeexplore.ieee.org

… We propose a heuristic data augmentation model based on the state-of-the-art GP-WGAN

network, which generates small and medium-sized synthetic defects which are heuristically …

Cited by 2 Related articles All 4 versions


 

A TextCNN and WGAN-gp based deep learning frame for unpaired text style transfer in multimedia services

M Hu, M He, W Su, A Chehri - Multimedia Systems, 2021 - Springer

… to learn to extract the content features and WGAN-gp is introduced to preserve the original …

WGAN-gp provides a more stably adversarial training because it utilizes Wasserstein distance …

Cited by 2 Related articles All 3 versions


2021


[HTML] hindawi.com

[HTML] Oversampling Imbalanced Data Based on Convergent WGAN for Network Threat Detection

Xu, X Zhang, Z Qiu, X Zhang, J Qiu… - Security and …, 2021 - hindawi.com

… apply WGAN as an oversampling method to generate the new minority samples to solve the

imbalanced problem. WGAN … stability of WGAN, we propose a convergent WGAN-based over…

Related articles All 4 versions 


[HTML] hindawi.com

[HTML] A Liver Segmentation Method Based on the Fusion of VNet and WGAN

J Ma, Deng, Z Ma, Mao, Y Chen - Computational and Mathematical …, 2021 - hindawi.com

… As the WGAN model entirely solves the problem of training instability of GAN, therefore, in …

Because of this excellent characteristic of WGAN, we employed WGAN as the basic structure …

Related articles All 7 versions 


2021  [PDF] arxiv.org

Lagrangian schemes for Wasserstein gradient flows

JA Carrillo, D Matthes, MT Wolfram - Handbook of Numerical Analysis, 2021 - Elsevier

… This chapter reviews different numerical methods for specific examples of Wasserstein 

Cited by 11 Related articles All 6 versions
see papers in 2020, 2021


[HTML] springer.com

[HTML] Quantum statistical learning via Quantum Wasserstein natural gradient

S Becker, W Li - Journal of Statistical Physics, 2021 - Springer

… For finite-dimensional quantum states, our aim is then to establish low-dimensional 

parameterized quantum Wasserstein gradient flows based on quantum Wasserstein distances. This …

Cited by 3 Related articles All 12 versions

[PDF] arxiv.org

Gradient flow formulation of diffusion equations in the Wasserstein space over a metric graph

M Erbar, D Forkert, J Maas, D Mugnolo - arXiv preprint arXiv:2105.05677, 2021 - arxiv.org

… In analogy with the classical Euclidean setting we will show below that this PDE is the 

Wasserstein gradient flow equation of the free energy F. Though the setting of metric graphs is one…

  Cited by 2 Related articles All 3 versions

<——2021———2021———2170——


2021  [PDF] arxiv.org

Wasserstein Proximal Algorithms for the Schrödinger Bridge Problem: Density Control with Nonlinear Drift

K Caluya, A Halder - IEEE Transactions on Automatic Control, 2021 - ieeexplore.ieee.org

… The Wasserstein proximal recursions we employ, solve the … framework is that the Wasserstein 

proximal recursions originate … The use of Wasserstein gradient flows to solve the SBP with …

 5 Related articles All 7 versions


[PDF] thecvf.com

A sliced wasserstein loss for neural texture synthesis

E Heitz, K Vanhoey, T Chambon… - Proceedings of the …, 2021 - openaccess.thecvf.com

… can be measured with the Wasserstein Distance and the … The Sliced Wasserstein Distance 

allows for fast gradient de… for texture synthesis by gradient descent using wavelets as a …

 Cited by 15 Related articles All 7 versions
A Sliced Wasserstein Loss for Neural Texture Synthesis - CVPR 2021 (video)

Jun 4, 2021 · Uploaded by Machine Learning


 sliced wasserstein loss for neural texture synthesis

E Heitz, K Vanhoey, T Chambon… - Proceedings of the …, 2021 - openaccess.thecvf.com

… of computing a textural loss based on the … loss is the ubiquitous approximation for this

problem but it is subject to several shortcomings. Our goal is to promote the Sliced Wasserstein …

 Cited by 5 Related articles All 7 versions 
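
A minimal NumPy sketch of the sliced Wasserstein distance used in the texture-synthesis entries above: project onto random directions, sort, and average the resulting 1D distances. This is the generic Monte Carlo estimator under equal sample sizes, not the authors' implementation; names are illustrative.

import numpy as np

def sliced_wasserstein2(X, Y, n_projections=128, seed=None):
    rng = np.random.default_rng(seed)
    theta = rng.normal(size=(n_projections, X.shape[1]))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)   # directions on the unit sphere
    xp = np.sort(X @ theta.T, axis=0)                       # sorted 1D projections
    yp = np.sort(Y @ theta.T, axis=0)
    return np.sqrt(np.mean((xp - yp) ** 2))                 # sqrt of the averaged 1D W2^2

X = np.random.default_rng(0).normal(size=(500, 5))
Y = np.random.default_rng(1).normal(loc=1.0, size=(500, 5))
print(sliced_wasserstein2(X, Y, seed=2))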


[PDF] arxiv.org

On linear optimization over Wasserstein balls

MC Yue, D Kuhn, W Wiesemann - Mathematical Programming, 2021 - Springer

… within a pre-specified Wasserstein distance to a reference … In this technical note we prove

that the Wasserstein ball is … the sparsity of solutions if the Wasserstein ball is centred at a …

Cited by 12 Related articles All 9 versions

[PDF] neurips.cc

Pooling by Sliced-Wasserstein Embedding

N Naderializadeh, J Comer… - Advances in …, 2021 - proceedings.neurips.cc

… sets is equal to the sliced-Wasserstein distance between their … Euclidean embedding for the

Wasserstein distance, similar to … the (generalized) sliced-Wasserstein distance. Interestingly, …

  Related articles All 3 versions 


2021


[PDF] mlr.press

Differentially private sliced wasserstein distance

A Rakotomamonjy, L Ralaivola - International Conference on …, 2021 - proceedings.mlr.press

… intrinsic randomized mechanism of the Sliced Wasserstein Distance, and we establish …

Sliced Wasserstein distance into another distance, that we call the Smoothed Sliced Wasserstein …

  Related articles All 25 versions 


[PDF] neurips.cc

Fast Approximation of the Sliced-Wasserstein Distance Using Concentration of Random Projections

K Nadjahi, A Durmus, PE Jacob… - Advances in …, 2021 - proceedings.neurips.cc

… We develop a novel method to approximate the Sliced-Wasserstein distance of order 2,

by extending the bound in (6) and deriving novel properties for SW. We then derive …

 Cited by 2 Related articles All 5 versions 


[PDF] thecvf.com

A unified framework for non-negative matrix and tensor factorisations with a smoothed Wasserstein loss

SY Zhang - Proceedings of the IEEE/CVF International …, 2021 - openaccess.thecvf.com

… NMF by employing a Wasserstein loss that accounts for the … factorisations with a Wasserstein

loss has remained untouched until … Our work presents a unified framework for Wasserstein …

Cited by 2 Related articles All 5 versions 


[PDF] arxiv.org

Stability of Gibbs posteriors from the Wasserstein loss for Bayesian full waveform inversion

MM Dunlop, Y Yang - SIAM/ASA Journal on Uncertainty Quantification, 2021 - SIAM

… loss function. In this subsection, we consider an unnormalized multiplicative noise loss function

and a Wasserstein loss … state-dependent multiplicative noise loss in the small noise limit. …

 Cited by 5 Related articles All 4 versions


[PDF] arxiv.org

Set representation learning with generalized sliced-wasserstein embeddings

N Naderializadeh, S Kolouri, JF Comer… - arXiv preprint arXiv …, 2021 - arxiv.org

… In particular, we treat elements of a set as samples from a probability measure and propose

an exact Euclidean embedding for Generalized Sliced Wasserstein (GSW) distances to learn …

Cited by 3  Related articles

<——2021———2021———2180——


[PDF] arxiv.org

Computationally Efficient Wasserstein Loss for Structured Labels

A Toyokuni, S Yokoi, H Kashima, M Yamada - arXiv preprint arXiv …, 2021 - arxiv.org

… loss and used only Wasserstein loss, but [11] used a linear combination of KL divergence

and Wasserstein distance as the loss. … of Wasserstein loss and multi-class KL loss as a strong …

 Related articles All 4 versions 


Sliced Wasserstein Distance for Neural Style Transfer

J Li, D Xu, S Yao - Computers & Graphics, 2021 - Elsevier

… In this paper, we propose a new style loss based on Sliced Wasserstein Distance (SWD),

which has a theoretical approximation guarantee. Besides, an adaptive sampling algorithm is …

 Related articles All 2 versions


Sliced Wasserstein based Canonical Correlation Analysis for Cross-Domain Recommendation

Z Zhao, J Nie, C Wang, L Huang - Pattern Recognition Letters, 2021 - Elsevier

… Therefore we proposed to use sliced Wasserstein distance to restrict the discrepancy

between different latent variables and we get the following enhanced transformation loss:(7) L …

 Related articles All 3 versions


[PDF] arxiv.org

Fixed Support Tree-Sliced Wasserstein Barycenter

Y Takezawa, R Sato, Z Kozareva, S Ravi… - arXiv preprint arXiv …, 2021 - arxiv.org

… By contrast, the Wasserstein distance on a tree, called the tree-Wasserstein distance, can

be … the tree-Wasserstein distance, called the fixed support tree-Wasserstein barycenter (FS-…

  Related articles All 2 versions 


 [PDF] arxiv.org

2D Wasserstein loss for robust facial landmark detection

Y Yan, S Duffner, P Phutane, A Berthelier, C Blanc… - Pattern Recognition, 2021 - Elsevier

… a new loss function based on the 2D Wasserstein distance (loss). The Wasserstein distance,

aka … We propose a novel method based on the Wasserstein loss to significantly improve the …

 Related articles All 17 versions


2021


Run-Sort-ReRun: Escaping Batch Size Limitations in Sliced Wasserstein Generative Models

J Lezama, W Chen, Q Qiu - International Conference on …, 2021 - proceedings.mlr.press

… In this paper, we build upon recent progress in sliced Wasserstein distances, a family of

differentiable metrics for distribution discrepancy based on the Optimal Transport paradigm. We …

All 2 versions 


[PDF] arxiv.org

SLOSH: Set LOcality Sensitive Hashing via Sliced-Wasserstein Embeddings

Y Lu, X Liu, A Soltoggio, S Kolouri - arXiv preprint arXiv:2112.05872, 2021 - arxiv.org

… We propose Sliced-Wasserstein Embedding as a … Treating sets as empirical distributions,

Sliced-Wasserstein … is equal to the Sliced-Wasserstein distance between their corresponding …

Related articles All 2 versions 


[PDF] arxiv.org

Generalized Wasserstein Dice Loss, Test-time Augmentation, and Transformers for the BraTS 2021 challenge

L Fidon, S Shit, I Ezhov, JC Paetzold, S Ourselin… - arXiv preprint arXiv …, 2021 - arxiv.org

… The generalized Wasserstein Dice loss [15] is a generalization of the Dice Loss for multi-…

to predict it correctly, the generalized Wasserstein Dice loss and our matrix M are designed to …

Related articles All 2 versions 


[PDF] arxiv.org

Minimum cross-entropy distributions on Wasserstein balls and their applications

LF Vargas, M Velasco - arXiv preprint arXiv:2106.03226, 2021 - arxiv.org

… Our first result shows that membership in such Wasserstein balls can be recast as a collection

of moment inequalities. This fact justifies cross-entropy minimization as a (in fact the only) …

Related articles All 2 versions 


[PDF] openreview.net

Sliced Wasserstein Variational Inference

M Yi, S Liu - Fourth Symposium on Advances in Approximate …, 2021 - openreview.net

… In this paper, we extend sliced Wasserstein distance to variational inference tasks. … sliced

function yielded by Eq(5) is univariate. Leveraging this property, we define sliced Wasserstein …

<——2021———2021———2190——

Intensity-Based Wasserstein Distance As A Loss Measure For Unsupervised Deformable Deep Registration

R Shams, W Le, A Weihs… - 2021 IEEE 18th …, 2021 - ieeexplore.ieee.org

… loss function for training a deep generative model. In this work, we implement the Wasserstein

loss … three implementations to the standard registration loss of MSE on brain MRI datasets. …

Related articles


[PDF] arxiv.org

Projected Sliced Wasserstein Autoencoder-based Hyperspectral Images Anomaly Detection

Y Chen, H Zhang, Y Wang, QM Wu, Y Yang - arXiv preprint arXiv …, 2021 - arxiv.org

… Secondly, this paper introduces sliced Wasserstein distance, which is a weaker distribution

… In the end, we propose a projected sliced Wasserstein (PSW) autoencoder-based anomaly …

Related articles All 2 versions 


[PDF] tokushima-u.ac.jp

[PDF] WRGAN: Improvement of RelGAN with Wasserstein Loss for Text Generation. Electronics 2021, 10, 275

Z Jiao, F Ren - 2021 - repo.lib.tokushima-u.ac.jp

… with Wasserstein distance in experiments. In this paper, we propose an improved neural

network based on RelGAN and Wasserstein loss … Correspondingly, we also changed the loss …

Cited by 3 Related articles All 2 versions 

[PDF] archives-ouvertes.fr

Sliced-Wasserstein distance for large-scale machine learning: theory, methodology and extensions

K Nadjahi - 2021 - tel.archives-ouvertes.fr

… This thesis further explores the use of the Sliced-Wasserstein … introduce the Generalized

Sliced-Wasserstein distances (GSW)… We illustrate the Sliced-Wasserstein distance in Figure 1.3 …

 Cited by 1 Related articles All 15 versions 

 [PDF] googleapis.com

System and method for unsupervised domain adaptation via sliced-wasserstein distance

AJ Gabourie, M Rostami, S Kolouri, K Kim - US Patent 11,176,477, 2021 - Google Patents

… In another aspect, sliced-Wasserstein (SW) distance is used as a … Wasserstein distances, on

the other hand, have been shown … Sliced-Wasserstein distances were utilized as a metric for …

  Cited by 2 Related articles All 4 versions 


2021

[PDF] mlr.press

S e smooth Sobolev IPM. The …

Cited by 11 Related articles All 2 versions

[PDF] neurips.cc

The unbalanced Gromov Wasserstein distance: Conic formulation and relaxation

T Séjourné, FX Vialard, G Peyré - Advances in Neural …, 2021 - proceedings.neurips.cc

distance between such metric measure spaces is the Gromov-Wasserstein (GW) distance, … 

two Unbalanced Gromov-Wasserstein formulations: a distance and a more tractable upper-…

Cited by 22 Related articles All 7 versions
Unbalanced Gromov Wasserstein Distance: Conic ...

slideslive.com › unbalanced-gromov-wasserstein-distance-...

Comparing metric measure spaces (i.e. a metric space endowed with a probability distribution) is at the heart of many machine learning … SlidesLive · 

Dec 6, 2021

Cited by 33 Related articles All 7 versions


Fast Approximation of the Sliced-Wasserstein Distance Using ...

slideslive.com › fast-approximation-of-the-slicedwasserste...

The Sliced-Wasserstein distance (SW) is being increasingly used in machine learning applications as an alternative to the Wasserstein ... Dec 6, 2021 ...

SlidesLive · 

[PDF] neurips.cc

Fast Approximation of the Sliced-Wasserstein Distance Using Concentration of Random Projections

K Nadjahi, A Durmus, PE Jacob… - Advances in …, 2021 - proceedings.neurips.cc

… We develop a novel method to approximate the Sliced-Wasserstein distance of order 2, 

by extending the bound in (6) and deriving novel properties for SW. We then derive …

Cited by 11 Related articles All 18 versions

 [PDF] arxiv.org

Plg-in: Pluggable geometric consistency loss with wasserstein distance in monocular depth estimation

N Hirose, S Koide, K Kawano… - 2021 IEEE International …, 2021 - ieeexplore.ieee.org

… Our objective is designed using the Wasserstein distance between two point clouds, 

estimated from images with different camera poses. The Wasserstein distance can impose a soft …

Cited by 6 Related articles All 4 versions

2021 see 2022  [PDF] projecteuclid.org

Wasserstein distance error bounds for the multivariate normal approximation of the maximum likelihood estimator

A Anastasiou, RE Gaunt - Electronic Journal of Statistics, 2021 - projecteuclid.org

… We apply our general bounds to derive Wasserstein distance error … Wasserstein distance 

when the MLE is implicitly defined. … We provide p-Wasserstein distance analogues of these …

 Cited by 4 Related articles All 9 versions

 <——2021———2021———2200——


[PDF] jmlr.org

[PDF] On efficient multilevel Clustering via Wasserstein distances

V Huynh, N Ho, N Dam, XL Nguyen… - Journal of Machine …, 2021 - jmlr.org

Wasserstein distance is expensive. Therefore, we use the entropic regularized second order 

Wasserstein ˆW2 to approximate the Wasserstein distance (… order Wasserstein distance is …

 Cited by 3 Related articles All 20 versions
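
The entropic regularization mentioned in this entry is typically computed with Sinkhorn iterations; a minimal NumPy sketch of the standard scheme (not this paper's own solver), for histograms a, b and cost matrix C:

import numpy as np

def sinkhorn(a, b, C, reg=0.1, n_iter=500):
    K = np.exp(-C / reg)              # Gibbs kernel
    u, v = np.ones_like(a), np.ones_like(b)
    for _ in range(n_iter):           # alternating marginal scaling
        u = a / (K @ v)
        v = b / (K.T @ u)
    P = u[:, None] * K * v[None, :]   # approximate optimal coupling
    return np.sum(P * C)              # regularized transport cost

a = np.array([0.1, 0.4, 0.4, 0.1])
b = np.full(4, 0.25)
x = np.arange(4.0)
print(sinkhorn(a, b, (x[:, None] - x[None, :]) ** 2))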


[PDF] arxiv.org

The Wasserstein space of stochastic processes

D Bartl, M Beiglböck, G Pammer - arXiv preprint arXiv:2104.14245, 2021 - arxiv.org

Wasserstein distance induces a natural Riemannian structure for the probabilities on the … 

We believe that an appropriate probabilistic variant, the adapted Wasserstein distance AW, can …

Cited by 7 Related articles All 2 versions

[PDF] upenn.edu

Optimal estimation of Wasserstein distance on a tree with an application to microbiome studies

S Wang, TT Cai, H Li - Journal of the American Statistical …, 2021 - Taylor & Francis

… The weighted UniFrac distance, a plug-in estimator of the Wasserstein distance of read … 

Motivated by this finding, we study the problem of optimal estimation of the Wasserstein distance

 Related articles All 7 versions


An Improved Mixture Density Network via Wasserstein Distance Based Adversarial Learning for Probabilistic Wind Speed Predictions

L Yang, Z Zheng, Z Zhang - IEEE Transactions on Sustainable …, 2021 - ieeexplore.ieee.org

This paper develops a novel mixture density network via Wasserstein distance based adversarial 

learning (WA-IMDN) for achieving more accurate probabilistic wind speed predictions (…

  Cited by 2 Related articles All 4 versions

[PDF] arxiv.org

Low Budget Active Learning via Wasserstein Distance: An Integer Programming Approach

R Mahmood, S Fidler, MT Law - arXiv preprint arXiv:2106.02968, 2021 - arxiv.org

… compute the discrete Wasserstein distance by a … Wasserstein distance induces a new 

bound on the expected risk in training. Specifically, we show below that the Wasserstein distance

Cited by 8 Related articles All 4 versions

2021


Fair Graph Auto-Encoder for Unbiased Graph Representations with Wasserstein Distance

W Fan, K Liu, R Xie, H Liu, H Xiong… - 2021 IEEE International …, 2021 - ieeexplore.ieee.org

… Then, to achieve multi-level fairness, we design a Wasserstein distance based regularizer 

… up Sinkhorn divergence as the approximations of Wasserstein cost for computation. Finally, …

Cited by 4 Related articles All 4 versions

Fast Topological Clustering with Wasserstein Distance

T Songdechakraiwut, BM Krause, MI Banks… - arXiv preprint arXiv …, 2021 - arxiv.org

… The notions of topological proximity and centroid are characterized using a novel and 

efficient approach to computation of the Wasserstein distance and barycenter for persistence …

Cited by 2 Related articles All 4 versions

Multi-source Cross Project Defect Prediction with Joint Wasserstein Distance and Ensemble Learning

Q Zou, L Lu, Z Yang, H Xu - 2021 IEEE 32nd International …, 2021 - ieeexplore.ieee.org

… • We propose a new joint Wasserstein distance, which takes the global and local information 

Wasserstein distance by combining the marginal and conditional Wasserstein distances. …

All 2 versions


Distributionally robust chance constrained svm model with l2-Wasserstein distance

Q Ma, Y Wang - Journal of Industrial and Management …, 2021 - aimsciences.org

… l2-Wasserstein ambiguity. We present equivalent formulations of distributionally robust chance

constraints based on l2-Wasserstein … problem when the l2-Wasserstein distance is discrete …

Related articles All 3 versions

[PDF] mlr.press

Approximating Lipschitz continuous functions with GroupSort neural networks

U Tanielian, G Biau - International Conference on Artificial …, 2021 - proceedings.mlr.press

… , we compute an approximation of the 1-Wasserstein distance and calculate the corresponding 

neural distance. Figure 4 depicts the best parabolic fit between 1-Wasserstein and neural …

Cited by 9 Related articles

<——2021———2021———2210——


[PDF] arxiv.org

Heterogeneous Wasserstein Discrepancy for Incomparable Distributions

MZ Alaya, G Gasso, M Berar… - arXiv preprint arXiv …, 2021 - arxiv.org

… We start by introducing Wasserstein and Gromov-Wasserstein distances with their sliced 

versions SW … In what follows, we propose an algorithm for computing an approximation of this …

 Related articles All 15 versions


[PDF] mlr.press

Analysis of stochastic Lanczos quadrature for spectrum approximation

T Chen, T Trogdon, S Ubaru - International Conference on …, 2021 - proceedings.mlr.press

… We show that SLQ obtains an approximation to the CESM within a Wasserstein distance 

of t |λmax[A] − λmin[A]| with probability at least 1 − η, by applying the Lanczos algorithm for 12t…

 Cited by 4 Related articles All 4 versions

[PDF] openreview.net

Efficient Wasserstein and Sinkhorn Policy Optimization

J Song, C Zhao, N He - 2021 - openreview.net

Wasserstein-like distance to measure proximity of policies instead of states. Unlike ours, these 

work apply Wasserstein … different strategies to approximate the Wasserstein distance. The …

 


[PDF]  [HTML] springer.com

[HTML] Cutoff Thermalization for Ornstein–Uhlenbeck Systems with Small Lévy Noise in the Wasserstein Distance

G Barrera, MA Högele, JC Pardo - Journal of Statistical Physics, 2021 - Springer

… (Wasserstein approximation of the total variation distance) Let U_1 and U_2 be two

random variables taking values on R^d. Assume that there exists p ∈ (0,1) small …

 Cited by 4 Related articles All 9 versions


[PDF] archives-ouvertes.fr

Inside and around Wasserstein barycenters

A Kroshnin - 2021 - tel.archives-ouvertes.fr

… of the well-known Sinkhorn algorithm, which allows us to find an approximate solution

to the optimal transport problem as well as to the Wasserstein barycenter problem. …

 Related articles All 6 versions


2021

[PDF] thecvf.com

Self-Supervised Wasserstein Pseudo-Labeling for Semi-Supervised Image Classification

F Taherkhani, A Dabouei… - Proceedings of the …, 2021 - openaccess.thecvf.com

… by the Wasserstein barycenter of the unlabeled data. Then, we leverage the Wasserstein 

metric to … Therefore, we can approximate the Wasserstein distance by optimizing the following …

Cited by 16 Related articles All 6 versions

[PDF] arxiv.org

Partial Wasserstein and Maximum Mean Discrepancy distances for bridging the gap between outlier detection and drift detection

T Viehmann - arXiv preprint arXiv:2106.12893, 2021 - arxiv.org

… involves a computationally very expensive quadratic programming problem, we use optimal 

transport problem underlying the partial Wasserstein distance to give an approximation. …

Cited by 2 Related articles All 2 versions


Learning to simulate sequentially generated data via neural networks and wasserstein training

T Zhu, Z Zheng - 2021 Winter Simulation Conference (WSC), 2021 - ieeexplore.ieee.org

… The supreme f over all 1-Lipschitz functions is still intractable, but we can use a neural 

network fψ to approximate f, and search over all such approximations parameterized by NN …

  All 2 versions


Scenario Reduction Network Based on Wasserstein Distance with Regularization

Y Sun, X Dong, SM Malik - 2021 - techrxiv.org

… number of discrete scenarios to obtain a reliable approximation for the probabilistic model. It 

is … This paper presents a scenario reduction network model based on Wasserstein distance. …

 


[HTML] sciencedirect.com

[HTML] Wasserstein distance-based distributionally robust optimal scheduling in rural microgrid considering the coordinated interaction among source-grid-load …

Cited by 6 Related articles All 2 versions

<——2021———2021———2220——


[PDF] github.io

[PDF] SiGANtures: Generating Times Series Using Wasserstein-Generative Adversarial Nets and the Signature Transform

L BLEISTEIN - linusbleistein.github.io

… We will only state those needed to properly define the Wasserstein distance and refer the 

curious … corresponds to an approximation of the Wasserstein distance and is less interpretable. …

[PDF] mlr.press

[PDF] Supplementary Material for Wasserstein Distributional Normalization

SW Park, J Kwon - proceedings.mlr.press

… Thus, ˆε can be considered as an approximation of the theoretical upper bound ε suggested 

in Proposition 1. Subsequently, we investigate the effects of Wasserstein normalization …

 Related articles


A travers et autour des barycentres de Wasserstein

A Kroshnin - 2021 - theses.fr

… central limit [theorem] for entropy-penalized Wasserstein barycenters; • the … approximate [solution]

of the optimal transport problem as well as of the Wasserstein barycenter problem

 

[PDF] openreview.net

Wasserstein Weisfeiler-Lehman Subtree Distance for Graph-Structured Data

Z Fang, J Huang, H Kasai - 2021 - openreview.net

approximate solver (Garofalakis & Kumar, 2005) to solve it. As a condition for using this 

approximate … Furthermore, we define the graph Wasserstein distance by considering the distance …

 Related articles


Patterson, Evan

Hausdorff and Wasserstein metrics on graphs and other structured data. (English) Zbl 07478762

Inf. Inference 10, No. 4, 1209-1249 (2021).

MSC:  05C70 05C12 05C82 62C05 90C05 68R10 68P05

PDF BibTeX XML 

 Cited by 5 Related articles All 4 versions

2021


[PDF] arxiv.org

Clustering Market Regimes using the Wasserstein Distance

B Horvath, Z Issa, A Muguruza - Available at SSRN 3947905, 2021 - papers.ssrn.com

… th Wasserstein distance, and we aggregate nearest neighbours using the associated

Wasserstein barycenter. We motivate why the Wasserstein distance is the natural choice for this …

Related articles All 5 versions


 Projection robust Wasserstein barycenters

M Huang, S Ma, L Lai - International Conference on …, 2021 - proceedings.mlr.press

… under the Wasserstein metric. However, approximating the Wasserstein barycenter is … This

paper proposes the projection robust Wasserstein barycenter (PRWB) that has the potential to …

Cited by 5 Related articles All 7 versions 

 

[PDF] jmlr.org

[PDF] Wasserstein barycenters can be computed in polynomial time in fixed dimension.

JM Altschuler, E Boix-Adsera - J. Mach. Learn. Res., 2021 - jmlr.org

… Our starting point is a well-known LP reformulation of the Wasserstein barycenter problem

as a Multimarginal Optimal Transport (MOT) problem, recalled in the preliminaries section. …

3 Related articles All 21 versions 


[PDF] mlr.press

Improved complexity bounds in wasserstein barycenter problem

D Dvinskikh, D Tiapkin - International Conference on …, 2021 - proceedings.mlr.press

… In this paper, we focus on computational aspects of the Wasserstein barycenter problem. We

propose two algorithms to compute Wasserstein barycenters of m discrete measures of size …

5 Related articles All 4 versions 
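
For reference, the Wasserstein barycenter problem treated in the entries above is the Fréchet mean under the 2-Wasserstein metric:

\[ \bar\nu \;\in\; \arg\min_{\nu} \sum_{i=1}^{m} w_i\, W_2^2(\mu_i,\nu), \qquad w_i\ge 0,\quad \sum_{i=1}^{m} w_i = 1. \]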


[PDF] arxiv.org

On the computational complexity of finding a sparse Wasserstein barycenter

S Borgwardt, S Patterson - Journal of Combinatorial Optimization, 2021 - Springer

… The discrete Wasserstein barycenter problem is a minimum-… In this paper, we show that

finding a barycenter of sparse … nature of the discrete barycenter problem. Containment of SCMP …

6 Related articles All 6 versions

New Findings on Combinatorics from Louisiana State University Summarized 

(On the Computational Complexity of Finding a Sparse Wasserstein...
Mathematics Week, 04/2021
Newsletter  Full Text Online

<——2021———2021———2230——


[PDF] neurips.cc

Dynamical Wasserstein Barycenters for Time-series Modeling

K Cheng, S Aeron, MC Hughes… - Advances in Neural …, 2021 - proceedings.neurips.cc

… dynamic Wasserstein barycenter (DWB) time series model. For each window of data, indexed

by t, our model forms an emission distribution ρBt that is the Wasserstein barycenter given …

 Related articles All 5 versions 


[PDF] arxiv.org

Entropic-Wasserstein barycenters: PDE characterization, regularity, and CLT

G Carlier, K Eichinger, A Kroshnin - SIAM Journal on Mathematical Analysis, 2021 - SIAM

… In section 2, we introduce the setting and prove existence and uniqueness of the entropic

Wasserstein barycenter. The entropic barycenter is then characterized by a system of Monge--…

Cited by 10 Related articles All 14 versions

[PDF] arxiv.org

Automatic text evaluation through the lens of wasserstein barycenters

P Colombo, G Staerman, C Clavel… - arXiv preprint arXiv …, 2021 - arxiv.org

… By comparing the best performance achieved by BaryScore compared to MoverScore,

we hypothesize that Wasserstein barycenter preserves more geometric properties of the …

Cited by 19 Related articles All 8 versions 

[PDF] neurips.cc

Dimensionality reduction for wasserstein barycenter

Z Izzo, S Silwal, S Zhou - Advances in Neural Information …, 2021 - proceedings.neurips.cc

… reduction techniques for the Wasserstein barycenter problem. When the barycenter is restricted

to … We also provide a coreset construction for the Wasserstein barycenter problem that …

Cited by 3 Related articles All 6 versions 


[PDF] thecvf.com

Wasserstein Barycenter for Multi-Source Domain Adaptation

EF Montesuma, FMN Mboula - Proceedings of the IEEE …, 2021 - openaccess.thecvf.com

… domain through the Wasserstein barycenter, then transport the barycenter into the target

domain. The usage of the intermediate domain built by the Wasserstein barycenter has shown …

Cited by 2 Related articles All 4 versions 


[PDF] tandfonline.com

Stochastic approximation versus sample average approximation for Wasserstein barycenters

D Dvinskikh - Optimization Methods and Software, 2021 - Taylor & Francis

… We show that for the Wasserstein barycenter problem, this … and SAA implementations

calculating barycenters defined with … confidence intervals for the barycenter defined with respect …

 Cited by 2 Related articles All 3 versions


[PDF] mlr.press

When ot meets mom: Robust estimation of wasserstein distance

G Staerman, P Laforgue… - International …, 2021 - proceedings.mlr.press

… In this work, we consider the problem of estimating the Wasserstein distance between two

… to provide robust estimates. Exploiting the dual Kantorovitch formulation of the Wasserstein …

 0 Related articles All 8 versions 


[PDF] arxiv.org

Decentralized Algorithms for Wasserstein Barycenters

D Dvinskikh - 2021 - search.proquest.com

… In this thesis, we consider the Wasserstein barycenter problem of discrete probability

measures as well as the population Wasserstein barycenter problem given by a Fréchet …

 Cited by 4 Related articles All 4 versions


[PDF] arxiv.org

On distributionally robust chance constrained programs with Wasserstein distance

W Xie - Mathematical Programming, 2021 - Springer

… a distributionally robust chance constrained program (DRCCP) with Wasserstein ambiguity

… the uncertain parameters within a chosen Wasserstein distance from an empirical distribution…

 Cited by 85 Related articles All 9 versions
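
For orientation, the generic Wasserstein distributionally robust problem behind these entries hedges against every distribution within a radius-ε Wasserstein ball around the empirical distribution P̂_N; chance-constrained variants replace the expected loss with a probabilistic constraint:

\[ \min_{x\in X}\ \sup_{Q:\ W_p(Q,\widehat{P}_N)\le\varepsilon}\ \mathbb{E}_{\xi\sim Q}\big[\ell(x,\xi)\big]. \]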


[PDF] neurips.cc

Generalization Bounds for (Wasserstein) Robust Optimization

Y An, R Gao - Advances in Neural Information Processing …, 2021 - proceedings.neurips.cc

… (Distributionally) robust optimization has gained momentum in machine learning community

… generalization bounds for robust optimization and Wasserstein robust optimization for …

 Related articles All 2 versions 

<——2021———2021———2240——


[PDF] arxiv.org

Sampling from the wasserstein barycenter

C Daaloul, TL Gouic, J Liandrat, M Tournus - arXiv preprint arXiv …, 2021 - arxiv.org

… on generating samples distributed according to the barycenter of known measures. Given

the broad applicability of the Wasserstein barycenter and of sampling techniques in general, …

  Related articles All 4 versions 


[PDF] mlr.press

A riemannian block coordinate descent method for computing the projection robust wasserstein distance

M Huang, S Ma, L Lai - International Conference on …, 2021 - proceedings.mlr.press

… The approach is called Wasserstein projection pursuit (WPP), and the largest Wasserstein

… the projection robust Wasserstein distance (PRW). As proved in (NilesWeed & Rigollet…

Cited by 23 Related articles All 11 versions

A Riemannian Block Coordinate Descent Method for Computing the Projection Robust Wasserstein Distance

slideslive.com › a-riemannian-block-coordinate-descent-...

... Descent Method for Computing the Projection Robust Wasserstein Distance ... computational biology, speech recognition, and robotics.

SlidesLive · 


[PDF] neurips.cc

Solving Soft Clustering Ensemble via k-Sparse Discrete Wasserstein Barycenter

R Qin, M Li, H Ding - Advances in Neural Information …, 2021 - proceedings.neurips.cc

… In Section 3, we discuss the relation between the SCE problem and discrete Wasserstein

barycenter. In Section 4, we present our approximation algorithms based on random sampling. …

 Related articles All 2 versions 


[PDF] arxiv.org

Distributionally robust mean-variance portfolio selection with Wasserstein distances

J Blanchet, L Chen, XY Zhou - Management Science, 2021 - pubsonline.informs.org

… Wasserstein distance. This is important because, as a result of the quadratic nature of the

variance objective that we consider, applying an uncertainty set based on Wasserstein … robust …

Cited by 42 Related articles All 6 versions


Data-driven Wasserstein distributionally robust optimization for refinery planning under uncertainty

J Zhao, L Zhao, W He - … 2021–47th Annual Conference of the …, 2021 - ieeexplore.ieee.org

This paper addresses the issue of refinery production planning under uncertainty. A data-driven 

Wasserstein distributionally robust optimization approach is proposed to optimize …

Cited by 3 Related articles


2021


[HTML] mdpi.com

WDA: An Improved Wasserstein Distance-Based Transfer Learning Fault Diagnosis Method

Z Zhu, L Wang, G Peng, S Li - Sensors, 2021 - mdpi.com

… In Section 2, we introduce the basic conception of transfer learning, Wasserstein distance,

and the corresponding Kuhn-Munkres algorithm. Following that, the proposed method and …

Cited by 3 Related articles All 10 versions 

 

Wasserstein Adversarial Regularization for learning with label noise

K Fatras, BB Damodaran, S Lobry… - … on Pattern Analysis …, 2021 - ieeexplore.ieee.org

… , which enables learning robust classifiers in presence of noisy data. To achieve this goal,

we propose a new adversarial regularization scheme based on the Wasserstein distance. …

 Cited by 2 Related articles All 9 versions


 2021  [PDF] archives-ouvertes.fr

Diffusion-Wasserstein Distances for Attributed Graphs

D Barbe - 2021 - tel.archives-ouvertes.fr

… Finally, the Fused Gromov-Wasserstein distance [15] works by merging two transport 

distances on graphs and vector-valued data; because the information it provides is much richer …

All 9 versions


2021  [PDF] arxiv.org

ERA: Entity Relationship Aware Video Summarization with Wasserstein GAN

G Wu, J Lin, CT Silva - arXiv preprint arXiv:2109.02625, 2021 - arxiv.org

… For the discriminator, we employ Wasserstein GAN and propose a patch mechanism to 

deal with the varying video length. The effectiveness of the proposed ERA is verified on the …

 Cited by 1 Related articles All 3 versions


2021 patent see 2022

Method for generating biological Raman spectrum data based on WGAN (WGAN) …

CN CN112712857A 祝连庆 (Zhu Lianqing), 北京信息科技大学 (Beijing Information Science and Technology University)

Priority 2020-12-08 • Filed 2020-12-08 • Published 2021-04-27

The invention provides a method for generating biological Raman spectrum data based on a WGAN antagonistic generation network, which comprises the following steps: step a, extracting partial Raman spectrum data from a Raman spectrum database to 

<——2021———2021———2250——


[PDF] mlr.press

Exploring the Wasserstein metric for time-to-event analysis

T Sylvain, M Luck, J Cohen… - Survival Prediction …, 2021 - proceedings.mlr.press

… EXPLORING THE WASSERSTEIN METRIC FOR SURVIVAL ANALYSIS In this study, we

propose to use the Wasserstein metric (WM) to learn the probability distribution of the event time…

 Cited by 1 Related articles All 2 versions 


[PDF] optimization-online.org

Data-driven distributionally robust chance-constrained optimization with Wasserstein metric

R Ji, MA Lejeune - Journal of Global Optimization, 2021 - Springer

… We give particular attention to studies based on the Wasserstein metric. … in the construction

of Wasserstein ambiguity sets. The Wasserstein metric between two probability distributions …

 Cited by 45 Related articles All 6 versions

[PDF] projecteuclid.org

Strong equivalence between metrics of Wasserstein type

E Bayraktar, G Guo - Electronic Communications in Probability, 2021 - projecteuclid.org

… also holds for p = 1, while the sliced Wasserstein metric does not share this nice property. …

sliced Wasserstein metric using the recent results of [1], hence promoting the max-sliced metric …

 Cited by 7 Related articles All 5 versions


Distributionally robust resilient operation of integrated energy systems using moment and Wasserstein metric for contingencies

Y Zhou, Z Wei, M Shahidehpour… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org

… We develop a strengthened ambiguity set that incorporates both moment and

Wasserstein metric information of uncertain contingencies, which provides a more accurate …

Cited by 17 Related articles All 2 versions

[PDF] arxiv.org

Hausdorff and Wasserstein metrics on graphs and other structured data

E Patterson - Information and Inference: A Journal of the IMA, 2021 - academic.oup.com

… We extend the Wasserstein metric and other elements of optimal … both Hausdorff-style and

Wasserstein-style metrics on … Like the classical Wasserstein metric, the Wasserstein metric …

 Cited by 5 Related articles All 4 versions


2021

[PDF] arxiv.org

Data-driven distributionally robust MPC using the Wasserstein metric

Z Zhong, EA del Rio-Chanona… - arXiv preprint arXiv …, 2021 - arxiv.org

… Distributionally robust constraints based on the Wasserstein metric are imposed to bound …

to the worst-case distribution within the Wasserstein ball centered at their discrete empirical …

Cited by 2 Related articles All 3 versions 

[PDF] arxiv.org

Equidistribution of random walks on compact groups II. The Wasserstein metric

B Borda - Bernoulli, 2021 - projecteuclid.org

… -Wasserstein metric, we will use a Berry–Esseen type inequality. The corresponding result

for the uniform metric … The original Berry–Esseen inequality concerns the uniform metric on R […

Cited by 3 Related articles All 6 versions


[PDF] arxiv.org

On the use of Wasserstein metric in topological clustering of distributional data

G Cabanes, Y Bennani, R Verde, A Irpino - arXiv preprint arXiv …, 2021 - arxiv.org

… Wasserstein metric in Topological Clustering … Wasserstein metric [22] (also named Mallow’s

distance [23]). In the case of distributions defined on R [25], this metric is defined as follows: …

Cited by 2 Related articles All 2 versions 


[PDF] ams.org

Nonembeddability of persistence diagrams with p > 2 Wasserstein metric

A Wagner - Proceedings of the American Mathematical Society, 2021 - ams.org

… inner product structure compatible with any Wasserstein metric. Hence, when applying kernel

… We prove that persistence diagrams with the p-Wasserstein metric do not admit a coarse …

Cited by 9 Related articles All 5 versions


[HTML] mdpi.com

Geometric Characteristics of the Wasserstein Metric on SPD (n) and Its Applications on Data Processing

Y Luo, S Zhang, Y Cao, H Sun - Entropy, 2021 - mdpi.com

… In this paper, by involving the Wasserstein metric on S P D ( n ) , we obtain computationally

… and edge detecting of a polluted image based on the Wasserstein curvature on S P D ( n ) . …

Related articles All 9 versions 

<——2021———2021———2260—— 



  [PDF] arxiv.org

Empirical measures and random walks on compact spaces in the quadratic Wasserstein metric

B Borda - arXiv preprint arXiv:2110.00295, 2021 - arxiv.org

… Here d > 0 is the “dimension”, in the sense that for all small R > 0, the metric space can be

… We shall only use the quadratic Wasserstein metric W2, defined in terms of the geodesic …

Cited by 1 Related articles All 3 versions 


[PDF] arxiv.org

LCS graph kernel based on Wasserstein distance in longest common subsequence metric space

J Huang, Z Fang, H Kasai - Signal Processing, 2021 - Elsevier

… Wasserstein distance with a new ground metric based on the proposed LCS similarity. The

main motivation is that the Wasserstein … As a consequence, the estimated similarity metric is …

Cited by 4 Related articles All 7 versions


The measurement of relations on belief functions based on the Kantorovich problem and the Wasserstein metric

AG Bronevich, IN Rozenberg - International Journal of Approximate …, 2021 - Elsevier

… and show how the Wasserstein metric is defined based on … In Section 4, we define the

Wasserstein metrics on random … on probability measures with the Wasserstein metric. Section 5 is …

Cited by 1 Related articles All 4 versions

 

[PDF] arxiv.org

Gradient flow formulation of diffusion equations in the Wasserstein space over a metric graph

M Erbar, D Forkert, J Maas, D Mugnolo - arXiv preprint arXiv:2105.05677, 2021 - arxiv.org

… on metric graphs. Firstly, we prove a Benamou–Brenier formula for the Wasserstein distance,

… flow of the free energy in the Wasserstein space of probability measures. The proofs of …

Cited by 3 Related articles All 3 versions 
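
The Benamou–Brenier formula referred to in this entry expresses the squared 2-Wasserstein distance dynamically, as the minimal kinetic energy among curves of measures satisfying the continuity equation (stated here in the classical Euclidean setting):

\[ W_2^2(\mu_0,\mu_1) \;=\; \inf_{(\rho_t,v_t)} \int_0^1\!\!\int |v_t(x)|^2\, d\rho_t(x)\, dt \quad \text{subject to}\quad \partial_t\rho_t + \nabla\cdot(\rho_t v_t)=0,\ \ \rho_0=\mu_0,\ \rho_1=\mu_1. \]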


[PDF] arxiv.org

Schema matching using Gaussian mixture models with Wasserstein distance

M Przyborowski, M Pabiś, A Janusz… - arXiv preprint arXiv …, 2021 - arxiv.org

… From the viewpoint of optimal transport theory, the Wasserstein distance is an important …

In this paper we derive one of possible approximations of Wasserstein distances computed …

 Related articles All 3 versions 


2021

[PDF] annalsofrscb.ro

Wasserstein GANs for Generation of Variated Image Dataset Synthesis

KDB Mudavathu, MVPCS Rao - Annals of the Romanian Society for …, 2021 - annalsofrscb.ro

Deep learning networks required a training lot of data to get to better accuracy. Given the 

limited amount of data for many problems, we understand the requirement for creating the …

Related articles All 3 versions

Learning to simulate sequentially generated data via neural networks and wasserstein training

T Zhu, Z Zheng - 2021 Winter Simulation Conference (WSC), 2021 - ieeexplore.ieee.org

… Next, we introduce the Wasserstein distance, which is used to quantify the distance between

two given distributions. The Wasserstein distance of the generated distribution ˆπS and the …

All 2 versions


 2021 see 2022

Wasserstein Graph Auto-Encoder

Y Chu, H Li, H Ning, Q Zhao - … on Algorithms and Architectures for Parallel …, 2021 - Springer

… the Wasserstein distance and graph neural network model to minimize the penalty in the

form of Wasserstein … We use 1-Wasserstein distance, referred to as Wasserstein distance for …

All 2 versions

[PDF] arxiv.org

Minimum Wasserstein Distance Estimator under Finite Location-scale Mixtures

Q Zhang, J Chen - arXiv preprint arXiv:2107.01323, 2021 - arxiv.org

… the explicit form of the Wasserstein distance between two measures on R for the numerical

solution to the MWDE. The strategy works for any p-Wasserstein distance but we only provide …

Cited by 1 Related articles All 2 versions 
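
The explicit one-dimensional form referred to in this entry reduces, for equal-sized samples, to matching order statistics; a quick NumPy/SciPy check of the generic formula (not the paper's estimator):

import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(42)
x = rng.normal(size=1000)
y = rng.normal(loc=0.5, size=1000)

w1_sorted = np.mean(np.abs(np.sort(x) - np.sort(y)))  # 1D W1 via sorted order statistics
w1_scipy = wasserstein_distance(x, y)                 # SciPy's 1D implementation
print(w1_sorted, w1_scipy)                            # the two values agree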


[PDF] arxiv.org

Conditional Wasserstein GAN-based oversampling of tabular data for imbalanced learning

J Engelmann, S Lessmann - Expert Systems with Applications, 2021 - Elsevier

… As the magnitude of the Wasserstein loss fluctuates during training, we scale the AC loss by

… ) ) to ensure that minimising the Wasserstein loss is the primary objective of the generator…

 Cited by 34 Related articles All 7 versions

<——2021———2021———2270——

 

[PDF] arxiv.org

Set representation learning with generalized sliced-wasserstein embeddings

N Naderializadeh, S Kolouri, JF Comer… - arXiv preprint arXiv …, 2021 - arxiv.org

… sliced-Wasserstein distance. In addition, we develop a unique unsupervised learning scheme

… a set of …

 Cited by 3 Related articles All 4 versions 

 

[PDF] neurips.cc

Wasserstein Flow Meets Replicator Dynamics: A Mean-Field Analysis of Representation Learning in Actor-Critic

Y Zhang, S Chen, Z Yang… - Advances in Neural …, 2021 - proceedings.neurips.cc

… the Wasserstein … learning, we show that the critic induces a data-dependent feature

 representation within an O(1/α) neighborhood of the initial representation in terms of the Wasserstein …

Related articles All 5 versions 

[HTML] aclanthology.org

[HTML] Wasserstein Selective Transfer Learning for Cross-domain Text Mining

L Feng, M Qiu, Y Li, H Zheng… - Proceedings of the 2021 …, 2021 - aclanthology.org

… To alleviate this issue, we propose a Wasserstein Selective Transfer Learning (WSTL) … to

select helpful data for transfer learning. We further use a Wasserstein-based discriminator to …

 Related articles All 2 versions 


[PDF] arxiv.org

Wasserstein GAN: Deep Generation applied on Bitcoins financial time series

R Samuel, BD Nico, P Moritz, O Joerg - arXiv preprint arXiv:2107.06008, 2021 - arxiv.org

… collapse during training, we introduce the improved GAN called Wasserstein GAN to improve

learning stability. The papers [28–30] focus on implementing a Wasserstein GAN and show …

Cited by 1 Related articles All 2 versions 


Deep transfer Wasserstein adversarial network for wafer map defect recognition

J Yu, S Li, Z Shen, S Wang, C Liu, Q Li - Computers & Industrial …, 2021 - Elsevier

… This study proposes a new transfer learning model, ie, deep transfer Wasserstein … and

learning the features of wafer maps. The contributions of this study are as follows: (1) Wasserstein …

Related articles All 2 versions


2021



 

Transfer learning method for bearing fault diagnosis based on fully convolutional conditional Wasserstein adversarial Networks

YZ Liu, KM Shi, ZX Li, GF Ding, YS Zou - Measurement, 2021 - Elsevier

… learning fault diagnosis model based on a deep Fully Convolutional Conditional Wasserstein

… domain of adversarial learning to strengthen the supervision of the learning process and …

 Cited by 3 Related articles All 2 versions



Wasserstein Coupled Graph Learning for Cross-Modal Retrieval

Y Wang, T Zhang, X Zhang, Z Cui… - 2021 IEEE/CVF …, 2021 - ieeexplore.ieee.org

… is constructed for further feature learning. Based on this dictionary… measurement through a

Wasserstein Graph Embedding (WGE) … graph learning, we specifically define a Wasserstein …

 

[PDF] arxiv.org

Learning Domain Invariant Representations by Joint Wasserstein Distance Minimization

L Andéol, Y Kawakami, Y Wada, T Kanamori… - arXiv preprint arXiv …, 2021 - arxiv.org

… invariance, we consider the Wasserstein distance [43, 55] in the … We contribute several

bounds relating the Wasserstein … invariant training mechanistically lowers the Wasserstein …

 Related articles All 4 versions 


[PDF] arxiv.org

Wasserstein Unsupervised Reinforcement Learning

S He, Y Jiang, H Zhang, J Shao, X Ji - arXiv preprint arXiv:2110.07940, 2021 - arxiv.org

… adopting Wasserstein distance as discrepancy measure for unsupervised reinforcement

learning. This framework is well-designed to be compatible with various Wasserstein distance …

 Related articles All 2 versions 


[PDF] arxiv.org

Learning disentangled representations with the wasserstein autoencoder

B Gaujac, I Feige, D Barber - … European Conference on Machine Learning …, 2021 - Springer

… Disentangled representation learning has undoubtedly benefited from objective function …

, we propose TCWAE (Total Correlation Wasserstein Autoencoder). Working in the WAE …

  Cited by 2 Related articles All 5 versions

<——2021———2021———2280——



[PDF] arxiv.org

Wasserstein Generative Learning of Conditional Distribution

S Liu, X Zhou, Y Jiao, J Huang - arXiv preprint arXiv:2112.10039, 2021 - arxiv.org

… We propose a Wasserstein generative approach to learning a conditional distribution. The

… the target joint distribution, using the Wasserstein distance as the discrepancy measure for …

Related articles All 3 versions 

 

An Improved Mixture Density Network via Wasserstein Distance Based Adversarial Learning for Probabilistic Wind Speed Predictions

L Yang, Z Zheng, Z Zhang - IEEE Transactions on Sustainable …, 2021 - ieeexplore.ieee.org

… density network via Wasserstein distance-based adversarial learning (WA-IMDN) for … on

training the mixture density network, a Wasserstein distance (WD)-based adversarial learning is …

Related articles All 3 versions

 

Adversarial training with Wasserstein distance for learning cross-lingual word embeddings

Y Li, Y Zhang, K Yu, X Hu - Applied Intelligence, 2021 - Springer

… distance is an optimization problem, we design a Wasserstein critic network C to implement

the Wasserstein distance. Based on (2), the Wasserstein distance d w between G(x) and G(…

Cited by 1 Related articles


[PDF] arxiv.org

Low Budget Active Learning via Wasserstein Distance: An Integer Programming Approach

R Mahmood, S Fidler, MT Law - arXiv preprint arXiv:2106.02968, 2021 - arxiv.org

… in training loss between using the full data set versus a subset via the discrete Wasserstein

… (2) We propose active learning by minimizing the Wasserstein distance. We develop a …

Cited by 2 Related articles All 2 versions 


 [PDF] arxiv.org

A Mathematical Principle of Deep Learning: Learn the Geodesic Curve in the Wasserstein Space

K Gai, S Zhang - arXiv preprint arXiv:2102.09235, 2021 - arxiv.org

… attempts to learn the geodesic curve in the Wasserstein space, which is induced by the …

of deep learning is to learn the geodesic curve in the Wasserstein space; and deep learning …

Related articles All 3 versions 


2021


[PDF] arxiv.org

DeepParticle: learning invariant measure by a deep neural network minimizing Wasserstein distance on data generated from an interacting particle method

Z Wang, J Xin, Z Zhang - arXiv preprint arXiv:2111.01356, 2021 - arxiv.org

… In training, we update the network weights to minimize a discrete Wasserstein distance … ,

to find the optimal transition matrix in the Wasserstein distance. We present numerical results to …

Related articles All 4 versions 


[PDF] arxiv.org

Controlling Wasserstein distances by Kernel norms with application to Compressive Statistical Learning

T Vayer, R Gribonval - arXiv preprint arXiv:2112.00423, 2021 - arxiv.org

… learning. Based on the relations between the MMD and the Wasserstein distance, we provide

guarantees for compressive statistical learning by introducing and studying the concept of …

Cited by 1 Related articles All 8 versions 


Wasserstein distance feature alignment learning for 2D image-based 3D model retrieval

Y Zhou, Y Liu, H Zhou, W Li - Journal of Visual Communication and Image …, 2021 - Elsevier

… Firstly, they are mostly based on the supervised learning, … propose a Wasserstein distance

feature alignment learning (… critic network based on the Wasserstein distance to narrow the …

Related articles All 2 versions


 

[PDF] arxiv.org

Continual Learning of Generative Models with Limited Data: From Wasserstein-1 Barycenter to Adaptive Coalescence

M Dedeoglu, S Lin, Z Zhang, J Zhang - arXiv preprint arXiv:2101.09225, 2021 - arxiv.org

… Wasserstein-1 generative adversarial networks (WGAN), this study aims to develop a framework

which systematically optimizes continual learning … other nodes as Wasserstein balls cen…

Cited by 1 Related articles All 3 versions 


Learning to simulate sequentially generated data via neural networks and wasserstein training

T Zhu, Z Zheng - 2021 Winter Simulation Conference (WSC), 2021 - ieeexplore.ieee.org

… new framework assisted by neural networks and Wasserstein training to address this need.

Our … distribution assumption, our framework applies Wasserstein training to meet the goal that …

All 2 versions

<——2021———2021———2290——

 


[PDF] arxiv.org

Training Wasserstein GANs without gradient penalties

D Kwon, Y Kim, G Montúfar, I Yang - arXiv preprint arXiv:2110.14150, 2021 - arxiv.org

… We propose a stable method to train Wasserstein generative adversarial networks. In order

… optimal discriminator and also for the Wasserstein distance between the true distribution and …

  Related articles All 3 versions 


[PDF] arxiv.org

Robust Graph Learning Under Wasserstein Uncertainty

X Zhang, Y Xu, Q Liu, Z Liu, J Lu, Q Wang - arXiv preprint arXiv …, 2021 - arxiv.org

… To this end, we propose a graph learning framework using Wasserstein distributionally

robust optimization (WDRO) which handles uncertainty in data by defining an uncertainty set on …

 Related articles All 3 versions 


Multi-source Cross Project Defect Prediction with Joint Wasserstein Distance and Ensemble Learning

Q Zou, L Lu, Z Yang, H Xu - 2021 IEEE 32nd International …, 2021 - ieeexplore.ieee.org

… Secondly, we design a joint Wasserstein distance to measure the similarity between each

… Wasserstein distance by combining the marginal and conditional Wasserstein distances. …

  All 2 versions


 WDIBS: Wasserstein deterministic information bottleneck for state abstraction to balance state-compression and performance

X Zhu, T Huang, R Zhang, W Zhu - Applied Intelligence, 2021 - Springer

… In the VAE setting, we focus on learning a compact underlying data representation z, which

captures high-dimensional data space x using two parameterized functions \({{q}_{\psi }}\left (…

Cited by 1 Related articles


Wasserstein distance-based asymmetric adversarial domain adaptation in intelligent bearing fault diagnosis

Y Yu, J Zhao, T Tang, J Wang, M Chen… - Measurement …, 2021 - iopscience.iop.org

… be applicable in small data scenarios during adversarial training. In this paper, we propose

a novel adversarial TL method: Wasserstein distance-based asymmetric adversarial domain …

Cited by 3 Related articles All 2 versions

2021

[HTML] springer.com

[HTML] Entropy-regularized 2-Wasserstein distance between Gaussian measures

A Mallasto, A Gerolin, HQ Minh - Information Geometry, 2021 - Springer

… 3, we compute explicit solutions to the entropy-relaxed 2-Wasserstein distance between

Gaussians, … We derive fixed-point expressions for the entropic 2-Wasserstein distance and the 2-…

Cited by 15 Related articles All 6 versions
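
For comparison, the classical (unregularized) closed form that the entropic version generalizes is the Bures–Wasserstein formula between Gaussians:

\[ W_2^2\big(\mathcal N(m_1,\Sigma_1),\mathcal N(m_2,\Sigma_2)\big) \;=\; \|m_1-m_2\|^2 + \operatorname{tr}\!\Big(\Sigma_1+\Sigma_2-2\big(\Sigma_1^{1/2}\Sigma_2\Sigma_1^{1/2}\big)^{1/2}\Big). \]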


Intensity-Based Wasserstein Distance As A Loss Measure For Unsupervised Deformable Deep Registration

R Shams, W Le, A Weihs… - 2021 IEEE 18th …, 2021 - ieeexplore.ieee.org

… Recent work with Wasserstein GAN [4] provided a method to estimate the measure, … training

deep generative model. In this work, we implement the Wasserstein loss as part of a deep …

Related articles


[PDF] arxiv.org

ALLWAS: Active Learning on Language models in WASserstein space

A Bastos, M Kaul - arXiv preprint arXiv:2109.01691, 2021 - arxiv.org

… optimization and optimal transport for active learning in language models, dubbed ALLWAS.

… learning from few samples, we propose a novel strategy for sampling from the Wasserstein …

Related articles All 2 versions 

 

Wasserstein GAN: Deep Generation Applied on Financial Time Series

M Pfenninger, DN Bigler, S Rikli… - Available at SSRN …, 2021 - papers.ssrn.com

… collapse during training, we introduce the improved GAN called Wasserstein GAN to improve

learning stability. The papers [28–30] focus on implementing a Wasserstein GAN and show …

Related articles All 2 versions


[PDF] arxiv.org

Physics-Driven Learning of Wasserstein GAN for Density Reconstruction in Dynamic Tomography

Z Huang, M Klasky, T Wilcox, S Ravishankar - arXiv preprint arXiv …, 2021 - arxiv.org

… Wasserstein distance is a measure of the distance between two probability distributions. In

… the Wasserstein-1 distance. For probability distributions 𝑝real and 𝑝fake, the Wasserstein-1 …

Related articles All 2 versions 

<——2021———2021———2300——


[PDF] aaai.org

[PDF] Swift: Scalable wasserstein factorization for sparse nonnegative tensors

A Afshar, K Yin, S Yan, C Qian, JC Ho, H Park… - Proceedings of the AAAI …, 2021 - aaai.org

… We introduce SWIFT, which minimizes the Wasserstein distance … In particular, we define

the N-th order tensor Wasserstein loss for … Wasserstein Dictionary Learning. There are several …

Cited by 10 Related articles All 13 versions 

DerainGAN: Single image deraining using wasserstein GAN

S Yadav, A Mehra, H Rohmetra, R Ratnakumar… - Multimedia Tools and …, 2021 - Springer

… In order to mitigate this issue, we use Wasserstein loss [4] for training the generator. The

loss function encourages the discriminator (also known as ’critic’ for Wasserstein GAN) to …

  Related articles

 

Differentially Privacy-Preserving Federated Learning Using Wasserstein Generative Adversarial Network

Y Wan, Y Qu, L Gao, Y Xiang - 2021 IEEE Symposium on …, 2021 - ieeexplore.ieee.org

… to train high-quality machine learning (ML) models. However, … This motivates the emergence

of Federated Learning (FL), a … To address this issue, we propose to integrate Wasserstein …

 Related articles All 2 versions


2021 see 2022  [PDF] mdpi.com

GMT-WGAN: An Adversarial Sample Expansion Method for Ground Moving Targets Classification

X Yao, X Shi, Y Li, L Wang, H Wang, S Ren, F Zhou - Remote Sensing, 2021 - mdpi.com

… a Wasserstein generative adversarial network (WGAN) sample enhancement method for

ground moving target classification (GMT-WGAN). … Next, a WGAN is constructed to generate …

 Related articles All 4 versions 


 

Unbalanced optimal total variation transport problems and generalized Wasserstein barycenters

NP Chung, TS Trinh - Proceedings of the Royal Society of Edinburgh …, 2021 - cambridge.org

… get another proof of Kantorovich–Rubinstein theorem for generalized Wasserstein distance

… Then we apply our duality formula to study generalized Wasserstein barycenters. We show …

 Related articles All 2 versions


2021


[PDF] openreview.net

A Distributional Robustness Perspective on Adversarial Training with the ∞-Wasserstein Distance

C Regniez, G Gidel - 2021 - openreview.net

… problem corresponds to an ∞-Wasserstein DRO problem with the l∞ underlying geometry.

… ∞-Wasserstein …


[PDF] arxiv.org

On the Wasserstein Distance Between k-Step Probability Measures on Finite Graphs

S Benjamin, A Mantri, Q Perian - arXiv preprint arXiv:2110.10363, 2021 - arxiv.org

… Wasserstein distance between µk and νk for general k. We consider the sequence formed by

the Wasserstein … either the Wasserstein distance converges or the Wasserstein distance at …

 Related articles All 2 versions 


[PDF] sjtu.edu.cn

Wasserstein-Distance-Based Multi-Source Adversarial Domain Adaptation for Emotion Recognition and Vigilance Estimation

Y Luo, BL Lu - 2021 IEEE International Conference on …, 2021 - ieeexplore.ieee.org

… generalization bound based on Wasserstein distance for multi-source classification and

regression problems. Based on our bound, we propose two novel Wasserstein-distance-based …

 Related articles All 2 versions


[PDF] arxiv.org

Minimum cross-entropy distributions on Wasserstein balls and their applications

LF Vargas, M Velasco - arXiv preprint arXiv:2106.03226, 2021 - arxiv.org

… Another motivation for using the Wasserstein distance to define our ambiguity sets is the

fact that there are well-known estimates of the distance between the true distribution p and the …

 Related articles All 2 versions 


2021 thesis

[PDF] archives-ouvertes.fr

Inside and around Wasserstein barycenters

A Kroshnin - 2021 - tel.archives-ouvertes.fr

… On est principalement motivé par le problème du barycentre de Wasserstein introduit par M.

… We are mainly motivated by the Wasserstein barycenter problem introduced by M. Agueh …

Related articles All 6 versions 

<——2021———2021———2310——


[PDF] archives-ouvertes.fr

Diffusion-Wasserstein Distances for Attributed Graphs

D Barbe - 2021 - tel.archives-ouvertes.fr

This thesis is about the definition and study of the Diffusion-Wasserstein distances between

attributed graphs.An attributed graph is a collection of points with individual descriptions (…

All 21 versions 

2021 see 2022

SVAE-WGAN-Based Soft Sensor Data Supplement Method for Process Industry

S Gao, S Qiu, Z Ma, R Tian, Y Liu - IEEE Sensors Journal, 2021 - ieeexplore.ieee.org

… WGAN under the consideration of the accuracy and diversity for generated data. The

SVAE-WGAN … and use their encoders as generators of WGAN, and a deep generative model with …

Cited by 4 Related articles All 2 versions

 

[PDF] openreview.net

ON THE GENERALIZATION OF WASSERSTEIN ROBUST FEDERATED LEARNING

LT Le, J Nguyen, CT Dinh, NH Tran - 2021 - openreview.net

… To address this, we propose a Wasserstein distributionally robust optimization scheme … the

Wasserstein ball (ambiguity set). Since the center location and radius of the Wasserstein ball …


[PDF] openreview.net

Wasserstein Weisfeiler-Lehman Subtree Distance for Graph-Structured Data

Z Fang, J Huang, H Kasai - 2021 - openreview.net

Defining a valid graph distance is a challenging task in graph machine learning because we

need to consider the theoretical validity of the distance, its computational complexity, and …

Related articles View as HTML 


[PDF] arxiv.org

A Normalized Gaussian Wasserstein Distance for Tiny Object Detection

J Wang, C Xu, W Yang, L Yu - arXiv preprint arXiv:2110.13389, 2021 - arxiv.org

… sets, we design a better metric for tiny objects based on Wasserstein Distance since it can

consistently reflect the distance between distributions even if they have no overlap. Therefore, …

Cited by 1 Related articles All 2 versions 
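For context, a standard fact this line of work builds on (not a result of the paper above): the 2-Wasserstein distance between two Gaussian distributions has the closed form

    W_2^2\bigl(\mathcal{N}(m_1,\Sigma_1), \mathcal{N}(m_2,\Sigma_2)\bigr) = \|m_1 - m_2\|^2 + \operatorname{Tr}\bigl(\Sigma_1 + \Sigma_2 - 2\,(\Sigma_2^{1/2}\Sigma_1\Sigma_2^{1/2})^{1/2}\bigr),

which stays well defined and informative even when the two distributions (e.g. Gaussians fitted to non-overlapping bounding boxes) have essentially disjoint support.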


2021


[PDF] arxiv.org

On Adaptive Confidence Sets for the Wasserstein Distances

N Deo, T Randrianarisoa - arXiv preprint arXiv:2111.08505, 2021 - arxiv.org

… sets with radius measured in Wasserstein distance Wp, p ≥ 1, … Wasserstein distances. Our

analysis and methods extend more globally to weak losses such as Sobolev norm distances …

Related articles All 3 versions 


[PDF] arxiv.org

Obstructions to extension of Wasserstein distances for variable masses

L Lombardini, F Rossi - arXiv preprint arXiv:2112.04763, 2021 - arxiv.org

… a distance between measures of different masses, that coincides with the Wasserstein distance

in … We show that it is not possible to extend the p-Wasserstein distance Wp to a distance …

Related articles All 3 versions 


[PDF] arxiv.org

1-Wasserstein distance on the standard simplex

A Frohmader, H Volkmer - Algebraic Statistics, 2021 - msp.org

… Wasserstein distances provide a metric on a space of … Wasserstein distances provide a

natural metric on a space of … The p-Wasserstein distance between µ and ν is defined by …

Cited by 4 Related articles All 4 versions
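For reference, the p-Wasserstein distance invoked throughout these entries is the usual optimal-transport quantity (standard definition, not specific to the paper above):

    W_p(\mu,\nu) = \Bigl( \inf_{\pi \in \Pi(\mu,\nu)} \int d(x,y)^p \, \mathrm{d}\pi(x,y) \Bigr)^{1/p},

where \Pi(\mu,\nu) denotes the set of couplings of \mu and \nu.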


[PDF] arxiv.org

Distributionally Robust Prescriptive Analytics with Wasserstein Distance

T Wang, N Chen, C Wang - arXiv preprint arXiv:2106.05724, 2021 - arxiv.org

… We consider p = 1 in the Wasserstein distance in this section for the ambiguity set. For the

numerical studies in Section 5, the objective functions may be piece-wise linear such as the …

Related articles All 2 versions 


[PDF] inria.fr

Optimization of the Diffusion Time in Graph Diffused-Wasserstein Distances: Application to Domain Adaptation

A Barbe, P Gonçalves, M Sebban… - 2021 IEEE 33rd …, 2021 - ieeexplore.ieee.org

… of optimizing the diffusion time used in these distances. Inspired from the notion of triplet-…

-Wasserstein distances outperforms the Gromov and Fused-Gromov Wasserstein distances …

Cited by 1 Related articles All 4 versions

<——2021———2021———2320——


Intensity-Based Wasserstein Distance As A Loss Measure For Unsupervised Deformable Deep Registration

R Shams, W Le, A Weihs… - 2021 IEEE 18th …, 2021 - ieeexplore.ieee.org

… distance functions to assess image similarity. Recent works have explored the Wasserstein

distance … a fast approximation variant — the sliced Wasserstein distance — for deep image …

Related articles


[PDF] arxiv.org

Schema matching using Gaussian mixture models with Wasserstein distance

M Przyborowski, M Pabiś, A Janusz… - arXiv preprint arXiv …, 2021 - arxiv.org

… From the viewpoint of optimal transport theory, the Wasserstein distance is an important …

In this paper we derive one of possible approximations of Wasserstein distances computed …

Related articles All 3 versions 


[PDF] arxiv.org

Wasserstein distance and metric trees

M Mathey-Prevot, A Valette - arXiv preprint arXiv:2110.02115, 2021 - arxiv.org

… In this paper we will be concerned with embeddings of Wasserstein spaces, that we now …

explicit we get a closed formula for the Wasserstein distance on closed subsets of real trees. …

Related articles All 5 versions 


[PDF] arxiv.org

Tangent Space and Dimension Estimation with the Wasserstein Distance

U Lim, V Nanda, H Oberhauser - arXiv preprint arXiv:2110.06357, 2021 - arxiv.org

… ball, then its Wasserstein distance to the uniform measure … Furthermore, the Wasserstein

distance has not been used … our approach uses the Wasserstein distance rather than the …

Related articles All 4 versions 

 

[HTML] hindawi.com

[HTML] Spacecraft Intelligent Fault Diagnosis under Variable Working Conditions via Wasserstein Distance-Based Deep Adversarial Transfer Learning

G Xiang, K Tian - International Journal of Aerospace Engineering, 2021 - hindawi.com

… proposed Wasserstein GAN (WGAN), which introduced a more sensible Wasserstein distance

… The motivation of this paper is to explore how the Wasserstein-based adversarial learning …

Related articles All 4 versions 

2021


[PDF] iop.org

Graph Classification Method Based on Wasserstein Distance

W Wu, G Hu, F Yu - Journal of Physics: Conference Series, 2021 - iopscience.iop.org

… the Wasserstein distance between the … Distance Between Graphs Inspired by literature

[3], we define the optimal transport distance between two graphs, namely Wasserstein distance

Related articles All 3 versions


[PDF] arxiv.org

Minimum Wasserstein Distance Estimator under Finite Location-scale Mixtures

Q Zhang, J Chen - arXiv preprint arXiv:2107.01323, 2021 - arxiv.org

… p-Wasserstein distance between η and ν also the distance … form of the Wasserstein distance

between two measures on R … The strategy works for any p-Wasserstein distance but we only …

Cited by 1 Related articles All 2 versions 
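The closed form on the real line that the snippet alludes to is the standard quantile representation (a textbook identity, not a contribution of the paper):

    W_p^p(\mu,\nu) = \int_0^1 \bigl| F_\mu^{-1}(t) - F_\nu^{-1}(t) \bigr|^p \, \mathrm{d}t,

with F_\mu^{-1}, F_\nu^{-1} the quantile functions; this is what makes univariate Wasserstein computations reduce to sorting.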

Multi-source Cross Project Defect Prediction with Joint Wasserstein Distance and Ensemble Learning

Q Zou, L Lu, Z Yang, H Xu - 2021 IEEE 32nd International …, 2021 - ieeexplore.ieee.org

… • We propose a new joint Wasserstein distance, which takes the global and local information

… Wasserstein distance by combining the marginal and conditional Wasserstein distances. …

All 2 versions


Multilevel optimal transport: a fast approximation of wasserstein-1 distances

J Liu, W Yin, W Li, YT Chow - SIAM Journal on Scientific Computing, 2021 - SIAM

… We propose a fast algorithm for the calculation of the Wasserstein-1 distance, which is a

particular type of optimal transport distance with transport cost homogeneous of degree one. …

Cited by 5 Related articles
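As background for the Wasserstein-1 algorithms above, recall the Kantorovich–Rubinstein duality (standard, stated here only for orientation):

    W_1(\mu,\nu) = \sup_{\mathrm{Lip}(f) \le 1} \int f \, \mathrm{d}\mu - \int f \, \mathrm{d}\nu,

i.e. W_1 is an integral probability metric over 1-Lipschitz test functions; this is also the formulation that WGAN critics approximate.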


[PDF] arxiv.org

Wasserstein distance between noncommutative dynamical systems

R Duvenhage - arXiv preprint arXiv:2112.12532, 2021 - arxiv.org

… of quadratic Wasserstein distances on spaces consisting of generalized dynamical systems

on a von Neumann algebra. We emphasize how symmetry of such a Wasserstein distance …

Related articles All 3 versions 

<——2021———2021———2330——

[PDF] arxiv.org

Two-sample goodness-of-fit tests on the flat torus based on Wasserstein distance and their relevance to structural biology

J González-Delgado, A González-Sanz… - arXiv preprint arXiv …, 2021 - arxiv.org

… , we focus on the Wasserstein distance as a metric between distributions. … distance ignores

the underlying geometry of the space. Here, we propose to use the Wasserstein distance, …

Cited by 1 Related articles All 23 versions 

 

 

[PDF] arxiv.org

On the Wasserstein Distance Between k-Step Probability Measures on Finite Graphs

S Benjamin, A Mantri, Q Perian - arXiv preprint arXiv:2110.10363, 2021 - arxiv.org

… Wasserstein distance between µk and νk for general k. We consider the sequence formed by

the Wasserstein distance … the Wasserstein distance converges or the Wasserstein distance …

Related articles All 2 versions 


[PDF] openreview.net

A Distributional Robustness Perspective on Adversarial Training with the ∞-Wasserstein Distance

C Regniez, G Gidel - 2021 - openreview.net

… -∞-Wasserstein distance and add entropic regularization. 2-∞Wasserstein DRO has already

… Nevertheless, the use of the l∞ within the ∞-Wasserstein distance allows both to consider …

[PDF] arxiv.org

Towards Better Data Augmentation using Wasserstein Distance in Variational Auto-encoder

Z Chen, P Liu - arXiv preprint arXiv:2109.14795, 2021 - arxiv.org

… In this paper, we propose the use of Wasserstein distance as a measure of distributional

similarity for the latent attributes, and show its superior theoretical lower bound (ELBO) …

 Related articles All 4 versions 


[PDF] aimsciences.org

Distributionally robust chance constrained svm model with l2-Wasserstein distance

Q Ma, Y Wang - Journal of Industrial & Management Optimization, 2021 - aimsciences.org

… form representation with the Wasserstein distance; [22] used the l1-Wasserstein distance in

… case with the Wasserstein distance. The form of l2-Wasserstein distance can be expressed …

 Related articles All 2 versions 
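The Wasserstein ambiguity sets appearing in these distributionally robust entries share one template (generic formulation, not the notation of any single paper above):

    \min_{\theta} \; \sup_{Q \,:\, W_p(Q, \widehat{P}_n) \le \varepsilon} \; \mathbb{E}_Q[\ell(\theta;\xi)],

where \widehat{P}_n is the empirical distribution of the data and \varepsilon is the radius of the Wasserstein ball.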

2021


 [PDF] archives-ouvertes.fr

Diffusion-Wasserstein Distances for Attributed Graphs

D Barbe - 2021 - tel.archives-ouvertes.fr

… In this chapter, we introduce the Diffusion-Wasserstein distance (DW). We provide insight …

Our goal with the Diffusion-Wasserstein distance was to provide an alternative to FGW that …

 All 14 versions 

 

[PDF] archives-ouvertes.fr

Sliced-Wasserstein distance for large-scale machine learning: theory, methodology and extensions

K Nadjahi - 2021 - tel.archives-ouvertes.fr

… of the Wasserstein distance between these univariate representations. We illustrate the

Sliced-Wasserstein distance in Figure 1.3 and provide its definition in the caption of that figure. …

 Cited by 1 Related articles All 15 versions 

 

[PDF] arxiv.org

Bounds in L1 Wasserstein distance on the normal approximation of general M-estimators

F Bachoc, M Fathi - arXiv preprint arXiv:2111.09721, 2021 - arxiv.org

… L1 Wasserstein distance between … Wasserstein distance as a supremum of expectation

differences, over Lipschitz functions. This enables to decompose the target Wasserstein distance …

 Related articles All 23 versions 


[PDF] arxiv.org

Distributionally Robust Chance-Constrained Programmings for Non-Linear Uncertainties with Wasserstein Distance

Y Gu, Y Wang - arXiv preprint arXiv:2103.04790, 2021 - arxiv.org

… -constrained programming (DRCCP) under Wasserstein ambiguity set, where the uncertain

… uncertain parameters within a chosen Wasserstein distance from an empirical distribution. …

 Related articles All 3 versions 


Fault injection in optical path-detection quality degradation analysis with Wasserstein distance

P Kowalczyk, P Bugiel, M Szelest… - … on Methods and …, 2021 - ieeexplore.ieee.org

… distributions represented by Wasserstein distance which proved its … Since Wasserstein optimal

transport is a distance function we … Watching the results of Wasserstein metric analysis for …

 Related articles

<——2021———2021———2340——

[PDF] researchgate.net

Image Denoising Using an Improved Generative Adversarial Network with Wasserstein Distance

Q Wang, H Liu, G Xie, Y Zhang - 2021 40th Chinese Control …, 2021 - ieeexplore.ieee.org

… Wasserstein distance and Lipschitz continuity conditions are used to effectively improve

the … Among them, the first two items are the Wasserstein distance, and the last item is the …

 Related articles All 3 versions




[PDF] arxiv.org

On Stein's Factors for Poisson Approximation in Wasserstein Distance with Nonlinear Transportation Costs

ZW Liao, Y Ma, A Xia - Journal of Theoretical Probability, 2021 - Springer

… approximation errors measured in the \(L^2\)-Wasserstein distance and this is one of the

motivations why we are interested in the Wasserstein distances with nonlinear cost functions. …

S Related articles All 3 versions






An Improved Mixture Density Network via Wasserstein Distance Based Adversarial Learning for Probabilistic Wind Speed Predictions

L Yang, Z Zheng, Z Zhang - IEEE Transactions on Sustainable …, 2021 - ieeexplore.ieee.org

This paper develops a novel mixture density network via Wasserstein distance based adversarial

learning (WA-IMDN) for achieving more accurate probabilistic wind speed predictions (…

Related articles All 3 versions


Recycling Discriminator: Towards Opinion-Unaware Image Quality Assessment Using Wasserstein GAN

Y Zhu, H Ma, J Peng, D Liu, Z Xiong - Proceedings of the 29th ACM …, 2021 - dl.acm.org

… First, Wasserstein GAN (WGAN) has the valuable property that its discriminator loss is an

accurate estimate of the Wasserstein distance [2, 12]. Second, the perceptual quality index (…

Cited by 3 Related articles

2021

 

Sliced Wasserstein based Canonical Correlation Analysis for Cross-Domain Recommendation

Z Zhao, J Nie, C Wang, L Huang - Pattern Recognition Letters, 2021 - Elsevier

… , instead we use wasserstein distance to learn the common … inference but the Wasserstein 

distance for the variational … bound (ELBO) of Wasserstein autoencoder about two domains …

Cited by 3 Related articles All 3 versions

Inverse Domain Adaptation for Remote Sensing Images Using Wasserstein Distance

Z Li, R Wang, MO Pun, Z Wang… - 2021 IEEE International …, 2021 - ieeexplore.ieee.org

In this work, an inverse domain adaptation (IDA) method is proposed to cope with the 

distributional mismatch between the training images in the source domain and the test images in …

Related articles


The Wasserstein Impact Measure (WIM): A practical tool for quantifying prior impact in Bayesian statistics

F Ghaderinezhad, C Ley, B Serrien - Computational Statistics & Data …, 2021 - Elsevier

distance explicitly, the authors have provided sharp lower and upper bounds on the Wasserstein 

distance … the Wasserstein distance and not other distances such as the Total Variation …

Related articles All 2 versions


[PDF] arxiv.org

Mass non-concentration at the nodal set and a sharp Wasserstein uncertainty principle

M Mukherjee - arXiv preprint arXiv:2103.11633, 2021 - arxiv.org

… We prove a conjectured lower bound on the Wasserstein distance between the measures 

defined by the positive and negative parts of the eigenfunction. Essentially, our estimate can …

Cited by 1 Related articles


[PDF] birs.ca

[PDF] Convergence of Smoothed Empirical Measures under Wasserstein Distance

Y Polyanskiy - 2021 - birs.ca

… p-Wasserstein Distance: For two distributions P and Q on Rd and p ≥ 1 …
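The smoothed distance in the title is the Wasserstein distance between Gaussian-smoothed measures, in the sense used by Goldfeld and co-authors (stated here for orientation; see the slides for the precise setting):

    W_p^{(\sigma)}(P,Q) = W_p\bigl( P * \mathcal{N}(0,\sigma^2 I_d), \; Q * \mathcal{N}(0,\sigma^2 I_d) \bigr).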

<——2021———2021———2350——


2021 see 2022

Wasserstein Graph Auto-Encoder

Y Chu, H Li, H Ning, Q Zhao - … on Algorithms and Architectures for Parallel …, 2021 - Springer

Wasserstein distance and graph neural network model to minimize the penalty in the form of 

Wasserstein distance … We use 1-Wasserstein distance, referred to as Wasserstein distance

 Related articles All 2 versions


Wasserstein Based EmoGANs+

WSS Khine, P Siritanawan… - 2021 Joint 10th …, 2021 - ieeexplore.ieee.org

… Therefore, in our EmoGANs+, the Wasserstein distance was used as loss for the training 

objective. We also discovered that our generated images include aliasing covering the faces …

Related articles

 

A Sliced Wasserstein Loss for Neural Texture Synthesis
Authors: Eric Heitz, Kenneth Vanhoey, Thomas Chambon, Laurent Belcour
2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
Summary: We address the problem of computing a textural loss based on the statistics extracted from the feature activations of a convolutional neural network optimized for object recognition (e.g. VGG-19). The underlying mathematical problem is the measure of the distance between two distributions in feature space. The Gram-matrix loss is the ubiquitous approximation for this problem but it is subject to several shortcomings. Our goal is to promote the Sliced Wasserstein Distance as a replacement for it. It is theoretically proven, practical, simple to implement, and achieves results that are visually superior for texture synthesis by optimization or training generative neural networks.
Chapter, 2021
Publication: 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 202106, 9407
Publisher: 2021
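A minimal NumPy sketch of the sort-based 1-D computation that a sliced Wasserstein loss builds on (illustration only, assuming equal-size point clouds; this is not the authors' VGG-feature implementation and the function name is mine):

    import numpy as np

    def sliced_wasserstein_sq(x, y, n_projections=64, seed=None):
        # Monte-Carlo estimate of the squared sliced 2-Wasserstein distance
        # between two equally weighted point clouds x, y of shape (n, d).
        rng = np.random.default_rng(seed)
        n, d = x.shape
        assert y.shape == (n, d), "sketch assumes samples of equal size"
        total = 0.0
        for _ in range(n_projections):
            theta = rng.normal(size=d)
            theta /= np.linalg.norm(theta)       # random direction on the sphere
            px, py = x @ theta, y @ theta        # project both clouds to 1-D
            total += np.mean((np.sort(px) - np.sort(py)) ** 2)   # sorted matching = 1-D OT
        return total / n_projections

Averaging many random projections approximates the integral over the sphere; in a texture-synthesis setting x and y would be feature activations of the generated and target images.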


[PDF] openreview.net

An Improved Composite Functional Gradient Learning by Wasserstein Regularization for Generative adversarial networks

C Wan, Y Fu, K Fan, J Zeng, M Zhong, R Jia, ML Li… - 2021 - openreview.net

… are to introduce the Wasserstein distance regularization into the … This motivates us to repurpose 

the Wasserstein distance from … Gradient by enforcing the Wasserstein distance (ICFGW). …

Related articles


[PDF] arxiv.org

SLOSH: Set LOcality Sensitive Hashing via Sliced-Wasserstein Embeddings

Y Lu, X Liu, A Soltoggio, S Kolouri - arXiv preprint arXiv:2112.05872, 2021 - arxiv.org

Wasserstein-based learning: Wasserstein distances are … between embedded sets is equal 

to the SW-distance between … define the Wasserstein and Sliced-Wasserstein distances. Let µi …

Related articles All 2 versions


2021

 

[HTML] springer.com

[HTML] EEG data augmentation for emotion recognition with a multiple generator conditional Wasserstein GAN

A Zhang, L Su, Y Zhang, Y Fu, L Wu, S Liang - Complex & Intelligent …, 2021 - Springer

… two distributions, the Wasserstein distance can still provide … Wasserstein distance definition 

(formula (2)) cannot be solved directly. One strategy for calculating the Wasserstein distance

Cited by 3 Related articles All 3 versions

[PDF] spiedigitallibrary.org

Reproducibility of radiomic features using network analysis and its application in Wasserstein k-means clustering

JH Oh, AP Apte, E Katsoulakis, N Riaz… - Journal of Medical …, 2021 - spiedigitallibrary.org

… We employ the W 1 -Wasserstein distance (also known as Earth Mover’s distance: EMD) 

as a quantitative metric to assess the reproducibility of radiomic features. To investigate …

Cited by 2 Related articles All 6 versions

 

[HTML] copernicus.org

[HTML] Ensemble Riemannian data assimilation over the Wasserstein space

SK Tamang, A Ebtehaj, PJ Van Leeuwen… - Nonlinear Processes …, 2021 - npg.copernicus.org

… manifold equipped with the Wasserstein metric. Unlike the Euclidean distance used in 

classic data assimilation methodologies, the Wasserstein metric can capture the translation and …

 Cited by 4 Related articles All 18 versions

 

[PDF] arxiv.org

Stochastic Wasserstein Hamiltonian Flows

J Cui, S Liu, H Zhou - arXiv preprint arXiv:2111.15163, 2021 - arxiv.org

… The density space equipped with L2-Wasserstein metric forms an infinite dimensional Riemannain 

manifold, often called Wasserstein manifold or density manifold in literature (see eg […

Cited by 2 Related articles All 2 versions

Naldi, Emanuele; Savaré, Giuseppe

Weak topology and Opial property in Wasserstein spaces, with applications to gradient flows and proximal point algorithms of geodesically convex functionals. (English) Zbl 07490883

Atti Accad. Naz. Lincei, Cl. Sci. Fis. Mat. Nat., IX. Ser., Rend. Lincei, Mat. Appl. 32, No. 4, 725-750 (2021).

MSC:  60B05 49Q22 49J45 65K10

PDF BibTeX XML Cite

Full Text: DOI 

Zbl 07490883

Cited by 3 Related articles All 10 versions 

<——2021———2021———2360——


Scenario Reduction Network Based on Wasserstein Distance with Regularization

Y Sun, X Dong, SM Malik - 2021 - techrxiv.org

… This paper presents a scenario reduction network model based on Wasserstein distance.

Entropy regularization is used to transform the scenario reduction problem into an …
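For orientation, the entropy-regularized optimal-transport objective that this kind of regularization refers to is (generic form, not the paper's exact notation)

    W_\varepsilon(\mu,\nu) = \min_{\pi \in \Pi(\mu,\nu)} \int c(x,y)\, \mathrm{d}\pi(x,y) + \varepsilon \, \mathrm{KL}(\pi \,\|\, \mu \otimes \nu),

which is smooth in its inputs and can be solved by Sinkhorn iterations (a NumPy sketch appears further below, after the Kantorovich-problem entry).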

[PDF] openreview.net

Wasserstein Weisfeiler-Lehman Subtree Distance for Graph-Structured Data

Z Fang, J Huang, H Kasai - 2021 - openreview.net

… a node distance between WL subtrees with tree edit distance … node distance to define a

graph Wasserstein distance on tree … the graph Wasserstein distance by considering the distance …

Related articles 

Exact Statistical Inference for the Wasserstein Distance by Selective Inference

V Nguyen Le Duy, I Takeuchi - arXiv e-prints, 2021 - ui.adsabs.harvard.edu

… statistical inference for the Wasserstein distance, which has … inference method for the

Wasserstein distance inspired by the … (CI) for the Wasserstein distance with finite-sample coverage 

 

[PDF] googleapis.com

Object shape regression using wasserstein distance

J Sun, SKP Kumar, R Bala - US Patent 10,943,352, 2021 - Google Patents

… that computes a Wasserstein distance between probability … Wasserstein distance computed

by the discriminator module. … a way that the computed Wasserstein distance is reduced. …

Related articles All 4 versions 


2021

  
 Wasserstein Embeddings for Nonnegative Matrix Factorization
by Febrissy, Mickael; Nadif, Mohamed
Machine Learning, Optimization, and Data Science, 01/2021
In the field of document clustering (or dictionary learning), the fitting error called the Wasserstein (In this paper, we use “Wasserstein”, “Earth Mover’s”,...


[PDF] openreview.net

An Improved Composite Functional Gradient Learning by Wasserstein Regularization for Generative adversarial networks

C Wan, Y Fu, K Fan, J Zeng, M Zhong, R Jia, ML Li… - 2021 - openreview.net

… that the Wasserstein regularization improves the efficacy of ICFG; and the ICFG with Wasserstein

… • For the first time, We introduce the Wasserstein regularization to the CFG framework, …

Related articles 


[PDF] archives-ouvertes.fr

Sliced-Wasserstein distance for large-scale machine learning: theory, methodology and extensions

K Nadjahi - 2021 - tel.archives-ouvertes.fr

… proposed, including the Sliced-Wasserstein distance (SW), a … in modern statistical and

machine learning problems, with a … the Generalized Sliced-Wasserstein distances, and illustrate …

Cited by 1 Related articles All 15 versions 


2021 see 2022[PDF] arxiv.org

Entropic Gromov-Wasserstein between Gaussian Distributions

K Le, D Le, H Nguyen, D Do, T Pham, N Ho - arXiv preprint arXiv …, 2021 - arxiv.org

… Gromov-Wasserstein, named unbalanced Gromov-Wasserstein, via the idea of unbalanced

… The entropic unbalanced GromovWasserstein has been used in robust machine learning …

Related articles All 3 versions 

 

[PDF] metu.edu.tr

[PDF] Wasserstein generative adversarial active learning for anomaly detection with gradient penalty

HA Duran - 2021 - open.metu.edu.tr

… ] [26] Wasserstein GAN and Wasserstein GAN with Gradient Penalty. In this study, unlike the

standard GAN, the Wasserstein … In addition, by using Wasserstein distance calculation, the …

Related articles 
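A hedged PyTorch sketch of the critic objective that "Wasserstein GAN with gradient penalty" refers to (the standard WGAN-GP recipe; function and variable names are mine, not taken from the thesis):

    import torch

    def critic_loss_wgan_gp(critic, real, fake, lambda_gp=10.0):
        # Wasserstein term: the critic should score real samples above fake ones.
        w_term = critic(fake).mean() - critic(real).mean()
        # Gradient penalty evaluated on random interpolates of real and fake samples.
        eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
        interp = (eps * real + (1 - eps) * fake).requires_grad_(True)
        grad = torch.autograd.grad(critic(interp).sum(), interp, create_graph=True)[0]
        gp = ((grad.flatten(1).norm(2, dim=1) - 1.0) ** 2).mean()
        return w_term + lambda_gp * gp

Minimizing this loss trains the critic toward the 1-Lipschitz dual potential of the W1 distance; the generator is then updated to raise the critic's scores on generated samples.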

<——2021———2021———2370——


[PDF] openreview.net

ON THE GENERALIZATION OF WASSERSTEIN ROBUST FEDERATED LEARNING

LT Le, J Nguyen, CT Dinh, NH Tran - 2021 - openreview.net

… To address this, we propose a Wasserstein distributionally robust optimization scheme … the

Wasserstein ball (ambiguity set). Since the center location and radius of the Wasserstein ball …


MR4394493
Prelim Valle, Marcos Eduardo; Francisco, Samuel; Granero, Marco Aurélio; Velasco-Forero, Santiago; Measuring the irregularity of vector-valued morphological operators using wasserstein metric. Discrete geometry and mathematical morphology, 512–524, Lecture Notes in Comput. Sci., 12708, Springer, Cham, [2021], ©2021. (06B05)

[PDF] archives-ouvertes.fr

Measuring the Irregularity of Vector-Valued Morphological Operators using Wasserstein Metric

ME Valle, S Francisco, MA Granero… - … Conference on Discrete …, 2021 - Springer

… a framework based on the Wasserstein metric to score this … between the topologies induced

by the metric and the total ordering, … to measure the irregularity using the Wasserstein metric. …

 Related articles All 17 versions

 

[PDF] arxiv.org

Network Consensus in the Wasserstein Metric Space of Probability Measures

AN Bishop, A Doucet - SIAM Journal on Control and Optimization, 2021 - SIAM

… metric known as the Wasserstein distance which allows us to consider an important set of

probability measures as a metric … the weighted sum of its Wasserstein distances to the agent's …

Cited by 2 Related articles All 4 versions


Probability Distribution Control of Finite-State Markov Chains with Wasserstein Costs and Application to Operation of Car-Sharing Services

K Hoshino, K Sakurama - 2021 60th IEEE Conference on …, 2021 - ieeexplore.ieee.org

This study investigates an optimal control problem of discrete-time finite-state Markov chains 

with application in the operation of car-sharing services. The optimal control of probability …

Cited by 2 Related articles All 2 versions

Conference Paper  Citation/Abstract

Oversampling based on WGAN for Network Threat Detection

Xu, Yanping; Qiu, Zhenliang; Zhang, Jieyin; Zhang, Xia; Qiu, Jian; et al.

The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2021).

 2021


Working Paper  Full Text

GS-WGAN: A Gradient-Sanitized Approach for Learning Differentially Private Generators

Chen, Dingfan; Orekondy, Tribhuvanesh; Fritz, Mario.

arXiv.org; Ithaca, Mar 15, 2021.

GS-WGAN: A Gradient-Sanitized Approach for Learning Differentially Private Generators

Chen, Dingfan; Orekondy, Tribhuvanesh; Fritz, Mario - arXiv.org, 2021


patent Wire Feed  Full Text

State Intellectual Property Office of China Releases Univ Jilin's Patent Application for Sketch-Photo Conversion Method Based on WGAN-GP and U-NET

Global IP News. Information Technology Patent News; New Delhi [New Delhi]. 08 Jan 2021. 



[PDF] arxiv.org

Wasserstein convergence rate for empirical measures of Markov chains

A Riekert - arXiv preprint arXiv:2101.06936, 2021 - arxiv.org

… In addition to estimating the expectation of W1(µ, µn), it is also of interest how well the

Wasserstein distance concentrates around its expected value. In this section the Markov chain can …

Cited by 2 Related articles All 2 versions 


[PDF] arxiv.org

Wasserstein convergence rates for random bit approximations of continuous Markov processes

S Ankirchner, T Kruse, M Urusov - Journal of Mathematical Analysis and …, 2021 - Elsevier

… The scheme is based on the construction of certain Markov chains whose laws can be …

Markov chains converge at fixed times at the rate of 1/4 with respect to every p-th Wasserstein …

Cited by 5 Related articles All 4 versions


[PDF] arxiv.org

On absolutely continuous curves in the Wasserstein space over R and their representation by an optimal Markov process

C Boubel, N Juillet - arXiv preprint arXiv:2105.02495, 2021 - arxiv.org

… case where µ is absolutely continuous in the Wasserstein space P2(R). Then, X … Markov

Lagrangian probabilistic representation of the continuity equation, moreover the unique Markov …

Cited by 1 Related articles All 4 versions 

<——2021———2021———2380——


Probability Distribution Control of Finite-State Markov Chains with Wasserstein Costs and Application to Operation of Car-Sharing Services

K Hoshino, K Sakurama - 2021 60th IEEE Conference on …, 2021 - ieeexplore.ieee.org

This study investigates an optimal control problem of discrete-time finite-state Markov chains

with application in the operation of car-sharing services. The optimal control of probability …

 

[PDF] arxiv.org

Some inequalities on Riemannian manifolds linking Entropy, Fisher information, Stein discrepancy and Wasserstein distance

LJ Cheng, FY Wang, A Thalmaier - arXiv preprint arXiv:2108.12755, 2021 - arxiv.org

… Taking µ as reference measure, we derive inequalities for probability measures on M linking

relative entropy, Fisher information, Stein discrepancy and Wasserstein distance. These …

Related articles All 4 versions 

[PDF] arxiv.org

The quantum Wasserstein distance of order 1

G De Palma, M Marvian, D Trevisan… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org

… of the Wasserstein distance of order 1 to the quantum states … the classical Wasserstein

distance for quantum states … the Lipschitz constant to quantum observables. The notion of …

 Cited by 17 Related articles All 10 versions


[HTML] springer.com

[HTML] Quantum statistical learning via Quantum Wasserstein natural gradient

S Becker, W Li - Journal of Statistical Physics, 2021 - Springer

… pull back the quantum Wasserstein metric such that the parameter space becomes a

Riemannian manifold with quantum Wasserstein information matrix. Using a quantum analogue of …

 Cited by 2 Related articles All 10 versions


[PDF] arxiv.org

Towards optimal transport for quantum densities

E Caglioti, F Golse, T Paul - arXiv preprint arXiv:2101.03256, 2021 - arxiv.org

… theories in the context of quantum mechanics. The present work … this quantum variant of the

Monge-Kantorovich or Wasserstein distance, and discusses the structure of optimal quantum …

 Cited by 10 Related articles All 44 versions 


2021


[2102.08725] Isometric Rigidity of compact Wasserstein spaces


by J Santos-Rodríguez · 2021 — Title:Isometric Rigidity of compact Wasserstein spaces ... Abstract: Let (X,d,\mathfrak{m}) be a metric measure space. The study of the ...


[CITATION] Isometric rigidity of compact Wasserstein spaces, manuscript

J Santos-Rodríguez - arXiv preprint arXiv:2102.08725, 2021
Cited by 2 Related articles All 3 versions


面向磁共振影像超分辨的 WGAN 方法研究 [Chinese: Research on a WGAN method for magnetic resonance image super-resolution]

黎玥嵘, 武仲科, 王学松, 申佳丽… - … 师范大学学报 (自然科学版) [Journal of … Normal University (Natural Science Edition)], 2021 - bnujournal.com

… for the super-resolution reconstruction task, a Wasserstein generative adversarial network (WGAN) is proposed and a suitable network model and loss function are constructed; a back-end upsampling super-resolution model based on a residual U-net WGAN …


基于 RDN 和 WGAN 的图像超分辨率重建模型 [Chinese: Image Super-Resolution Reconstruction Model Based on RDN and WGAN]

李易达, 马晓轩 - 现代电子技术 [Modern Electronics Technique], 2021 - cnki.com.cn

To address the unstable training of existing GAN-based single-image super-resolution reconstruction models and the unsatisfactory visual detail of the reconstructed images, an image super-resolution reconstruction model based on a residual dense network (RDN) and WGAN is proposed. The model …

[CITATION] Clustering of test scenes by use of wasserstein metric and analysis of tracking quality in optically degraded videos

P Kowalczyk, P Bugiel, J Izydorczyk, M Szelest - Wydawnictwo Politechniki Śląskiej …, 2021

Cited by 2

[PDF] aaai.org

[PDF] Fast PCA in 1-D Wasserstein Spaces via B-splines Representation and Metric Projection

M Pegoraro, M Beraha - 35th AAAI Conference on Artificial Intelligence …, 2021 - aaai.org

… the Wasserstein geometry. We present a novel representation of the 2-Wasserstein space, …

We propose a novel definition of Principal Component Analysis in the Wasserstein space …

Related articles All 3 versions 


[PDF] mlr.press

Rethinking rotated object detection with gaussian wasserstein distance loss

X Yang, J Yan, Q Ming, W Wang… - International …, 2021 - proceedings.mlr.press

… Comparison between different solutions for inconsistency between metric and loss (IML),

boundary discontinuity (BD) and square-like problem (SLP) on DOTA dataset. The …

Cited by 31 Related articles All 9 versions 

<——2021———2021———2390——


2021 

Wasserstein metric between a discrete probability measure and a continuous one

[CITATION] Wasserstein metric between a discrete probability measure and a continuous one

W Yang, X Wang - 2021

Related articles All 5 versions


[HTML] oup.com

Simulation of broad-band ground motions with consistent long-period and short-period components using the Wasserstein interpolation of acceleration envelopes

T Okazaki, H Hachiya, A Iwaki, T Maeda… - Geophysical Journal …, 2021 - academic.oup.com

… metric known as the Wasserstein distance, and (2) embed pairs of long-period and short-period

envelopes into a common latent space to improve the consistency of the entire waveform…

Cited by 1 Related articles All 5 versions

 Data Enhancement Method for Gene Expression Profile Based on Improved WGAN-...
by Zhu, Shaojun; Han, Fei
Neural Computing for Advanced Applications, 08/2021
A large number of gene expression profile datasets mainly exist in the fields of biological information and gene microarrays. Traditional classification...
Book Chapter  Full Text Online


 

A Data Augmentation Method for Distributed Photovoltaic Electricity Theft Using Wasserstein Generative Adversarial Network

J Li, W Liao, R Yang, Z Chen - 2021 IEEE 5th Conference on …, 2021 - ieeexplore.ieee.org

Because of the concealment of distributed photovoltaic (PV) electricity theft, the number of

electricity theft samples held by the power sector is insufficient, which results in low accuracy …

Related articles All 2 versions

AWGAN: Unsupervised Spectrum Anomaly Detection with Wasserstein Generative Adversarial Network along with Random Reverse Mapping

W Huang, B Li, W Wang, M Zhang… - … on Mobility, Sensing …, 2021 - ieeexplore.ieee.org

… In order to effectively detect anomalies, we propose AWGAN, a novel anomaly detection

method based on Wasserstein generative adversarial network. AWGAN can not only learn the …

All 2 versions


2021


Data-driven Wasserstein distributionally robust optimization for refinery planning under uncertainty

J Zhao, L Zhao, W He - … 2021–47th Annual Conference of the …, 2021 - ieeexplore.ieee.org

This paper addresses the issue of refinery production planning under uncertainty. A data-driven

Wasserstein distributionally robust optimization approach is proposed to optimize …

Related articles

[PDF] arxiv.org

Class-conditioned Domain Generalization via Wasserstein Distributional Robust Optimization

J Wang, Y Li, L Xie, Y Xie - arXiv preprint arXiv:2109.03676, 2021 - arxiv.org

… Our method uses Wasserstein barycenter as the reference … Our method uses Wasserstein

barycenter as the reference … domains is the 2-Wasserstein barycenter since it better capture the …

Cited by 1 Related articles All 2 versions 

 

[PDF] openreview.net

ON THE GENERALIZATION OF WASSERSTEIN ROBUST FEDERATED LEARNING

LT Le, J Nguyen, CT Dinh, NH Tran - 2021 - openreview.net

… propose a Wasserstein distributionally robust … is robust to all adversarial distributions inside

the Wasserstein ball (ambiguity set). Since the center location and radius of the Wasserstein …

 

[PDF] arxiv.org

Learning to Generate Wasserstein Barycenters

J Lacombe, J Digne, N Courty, N Bonneel - arXiv preprint arXiv …, 2021 - arxiv.org

… on Wasserstein barycenters of pairs of measures, generalizes well to the problem of finding

Wasserstein barycenters of … a method to compute Wasserstein barycenters in milliseconds. It …

Related articles All 6 versions 
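The object being learned here is the Wasserstein barycenter in the sense of Agueh and Carlier (standard definition):

    \bar{\mu} = \arg\min_{\mu} \sum_{i=1}^{N} \lambda_i \, W_2^2(\mu, \mu_i), \qquad \lambda_i \ge 0, \;\; \sum_{i=1}^{N} \lambda_i = 1.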


[PDF] aimsciences.org

Distributionally robust chance constrained svm model with l2-Wasserstein distance

Q Ma, Y Wang - Journal of Industrial & Management Optimization, 2021 - aimsciences.org

… robust chanceconstrained SVM model with l2-Wasserstein ambiguity. … robust chance

constraints based on l2Wasserstein ambiguity. In terms of this method, the distributionally robust …

 Related articles All 2 versions 

<——2021———2021———2400——

 

[PDF] arxiv.org

Data-Driven Distributionally Robust Surgery Planning in Flexible Operating Rooms Over a Wasserstein Ambiguity

KS Shehadeh - arXiv preprint arXiv:2103.15221, 2021 - arxiv.org

… In this paper, we construct a 1-Wasserstein distance-based ambiguity set of all probability

… robust surgery assignment (DSA) problem as a two-stage DRO model using a Wasserstein …

 Related articles All 3 versions 

Smooth $p$-Wasserstein Distance: Structure, Empirical Approximation, and Statistical...
by Nietert, Sloan; Goldfeld, Ziv; Kato, Kengo
01/2021
Discrepancy measures between probability distributions, often termed statistical distances, are ubiquitous in probability theory, statistics and machine...
Journal Article  Full Text Online

 

Papayiannis, G. I.; Domazakis, G. N.; Drivaliaris, D.; Koukoulas, S.; Tsekrekos, A. E.; Yannacopoulos, A. N.

On clustering uncertain and structured data with Wasserstein barycenters and a geodesic criterion for the number of clusters. (English) Zbl 07497103

J. Stat. Comput. Simulation 91, No. 13, 2569-2594 (2021).

MSC:  62-XX

PDF BibTeX XML Cite

Full Text: DOI 



[PDF] arxiv.org

Distributionally Robust Chance-Constrained Programmings for Non-Linear Uncertainties with Wasserstein Distance

Y Gu, Y Wang - arXiv preprint arXiv:2103.04790, 2021 - arxiv.org

… a distributionally robust chance-constrained programming (DRCCP) under Wasserstein …

a distributionally robust chance-constrained programming under Wasserstein ambiguity set …

Cited by 1 Related articles All 3 versions 


[PDF] arxiv.org

Distributionally robust second-order stochastic dominance constrained optimization with Wasserstein ball

Y Mei, J Liu, Z Chen - arXiv preprint arXiv:2101.00838, 2021 - arxiv.org

… with robust SD constrained optimization problems in [4,18,25,42], we study Wasserstein

ball … of the distributionally robust SSD constrained optimization with Wasserstein ball by the …

Related articles All 2 versions 


2021


[PDF] researchgate.net

[PDF] Distributionally robust second-order stochastic dominance constrained optimization with Wasserstein distance

Y Mei, J Liu, Z Chen - arXiv preprint arXiv:2101.00838, 2021 - researchgate.net

… Specifically, we extend the distributionally robust optimization with Wasserstein … robust SD

constrained optimization problems in [16,23,37], we study the ambiguity set with Wasserstein …

Cited by 1 Related articles All 2 versions 


2022 see 2021  [PDF] arxiv.org

A Regularized Wasserstein Framework for Graph Kernels

A Wijesinghe, Q Wang, S Gould - 2021 IEEE International …, 2021 - ieeexplore.ieee.org

… preserve both features and structure of graphs via Wasserstein distances on features and their

… Theoretically, our framework is robust and can guarantee the convergence and numerical …

Related articles All 5 versions


[PDF] researchgate.net

[PDF] MULTIPLIER BOOTSTRAP FOR BURES–WASSERSTEIN BARYCENTERS BY ALEXEY KROSHNIN, VLADIMIR SPOKOINY AND ALEXANDRA …

A KROSHNIN - researchgate.net

… We note that unlike the Frobenius mean, the Bures–Wasserstein barycenter is not a linear

… We note that the Bures-Wasserstein barycenter is an M-estimators. In this regard, it is worth …

Related articles 


arXiv:2112.02424 [pdf, other] cs.LG
Variational Wasserstein gradient flow
Authors: Jiaojiao Fan, Amirhossein Taghvaei, Yongxin Chen
Abstract: The gradient flow of a function over the space of probability densities with respect to the Wasserstein metric often exhibits nice properties and has been utilized in several machine learning applications. The standard approach to compute the Wasserstein gradient flow is the finite difference which discretizes the underlying space over a grid, and is not scalable. In this work, we propose a scalab…  More
Submitted 4 December, 2021; originally announced December 2021.

[PDF] arxiv.org

Variational Wasserstein gradient flow

J Fan, A Taghvaei, Y Chen - arXiv preprint arXiv:2112.02424, 2021 - arxiv.org

… Wasserstein gradient flow models the gradient dynamics over the space of probability densities

with respect to the Wasserstein … equation is in fact the Wasserstein gradient flow of the …

Cited by 20 Related articles All 7 versions 


Efficient and Robust Classification for Positive Definite Matrices with Wasserstein Metric

J Cui - 2021 - etd.auburn.edu

… The purpose of this paper is to propose different algorithms based on Bures-Wasserstein …

that Bures-Wasserstein simple projection mean algorithm has a better efficient and robust …

 Related articles 

<——2021———2021———2410——


[PDF] thecvf.com

A sliced wasserstein loss for neural texture synthesis

E Heitz, K Vanhoey, T Chambon… - Proceedings of the …, 2021 - openaccess.thecvf.com

… Our Sliced Wasserstein loss also computes 1D losses but with an optimal transport formulation

(implemented by a sort) rather than a binning scheme and with arbitrary rather than axis-…

Cited by 10 Related articles All 7 versions 


[PDF] uwaterloo.ca

[PDF] Wasserstein Distance and Entropic Regularization in Kantorovich Problem

W Zhuo - student.cs.uwaterloo.ca

… The notion of Wasserstein distance provides a theoretical … , entropic regularization of the

Wasserstein distance leads to a … framework and compare Wasserstein distance against its en…


[PDF] yuxinirisye.com

[PDF] Wasserstein Learning of Generative Models

Y Ye - yuxinirisye.com

… The estimated 1-Wasserstein distance trackings are … -Wasserstein distance during

training, which corresponds to the fact that we can train the WGAN till optimality with the 1-Wasserstein …

Related articles 

 

Efficient and Provable Algorithms for Wasserstein Distributionally Robust Optimization in Machine Learning

LI Jiajin - 2021 - search.proquest.com

… robust learning models with Wasserstein distance-based ambiguity sets in this thesis. As a

matter of fact, Wasserstein … with the same support, Wasserstein distance does not have such a …

[PDF] arxiv.org

Learning to Generate Wasserstein Barycenters

J Lacombe, J Digne, N Courty, N Bonneel - arXiv preprint arXiv …, 2021 - arxiv.org

… Contributions This paper introduces a method to compute Wasserstein barycenters in

milliseconds. It shows that this can be done by learning Wasserstein barycenters of only two …


2021

Cheng, Li-Juan; Thalmaier, Anton; Zhang, Shao-Qin

Exponential contraction in Wasserstein distance on static and evolving manifolds. (English) Zbl 07523889

Rev. Roum. Math. Pures Appl. 66, No. 1, 107-129 (2021).

MSC:  60J60 58J65 53Exx

PDF BibTeX XML 

Zbl 07523889

[PDF] arxiv.org

Variational Wasserstein gradient flow

J Fan, A Taghvaei, Y Chen - arXiv preprint arXiv:2112.02424, 2021 - arxiv.org

… This is essentially a backward Euler discretization or a proximal point method with respect to the Wasserstein metric. The solution to (4) converges to the continuous-time Wasserstein …

 Cited by 6 Related articles All 3 versions 
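The "proximal point method with respect to the Wasserstein metric" in the snippet is the JKO scheme; one step reads (standard formulation, with a generic energy F and step size \tau):

    \rho_{k+1} = \arg\min_{\rho} \Bigl\{ F(\rho) + \frac{1}{2\tau} \, W_2^2(\rho, \rho_k) \Bigr\}.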


[PDF] arxiv.org

Tracial smooth functions of non-commuting variables and the free Wasserstein manifold

D Jekel, W Li, D Shlyakhtenko - arXiv preprint arXiv:2101.06572, 2021 - arxiv.org

… stating versions of the heat equation, Wasserstein geodesic equation, incompressible Euler equation, and inviscid Burgers’ equation in our tracial non-commutative framework. The …

Cited by 1 Related articles All 4 versions 


[PDF] arxiv.org

Approximation Capabilities of Wasserstein Generative Adversarial Networks

Y Gao, M Zhou, MK Ng - arXiv preprint arXiv:2103.10060, 2021 - arxiv.org

… that the approximation for distributions by Wasserstein GAN … bound is developed for 

Wasserstein distance between the … , the learned Wasserstein GAN can approximate distributions …

  Related articles All 2 versions


[PDF] arxiv.org

Sampling from the wasserstein barycenter

C Daaloul, TL Gouic, J Liandrat, M Tournus - arXiv preprint arXiv …, 2021 - arxiv.org

… In order to implement it, we use a kernel to approximate the Wasserstein gradient of the 

penalization term in Fα as is done in Liu and Wang (2016) and Chewi, Le Gouic, Lu, Maunu, and …

 Cited by 2 Related articles All 4 versions

<——2021———2021———2420——


 Deep transfer Wasserstein adversarial network for wafer map defect recognition

J Yu, S Li, Z Shen, S Wang, C Liu, Q Li - Computers & Industrial …, 2021 - Elsevier

… This study proposes a new transfer learning model, ie, deep transfer Wasserstein adversarial 

… The contributions of this study are as follows: (1) Wasserstein distance and MMD are …

Cited by 2 Related articles All 2 versions

Internal wasserstein distance for adversarial attack and defense

J Li, J Cao, S Zhang, Y Xu, J Chen, M Tan - arXiv preprint arXiv …, 2021 - arxiv.org

… and defense to DNNs on the manifold. To address this, we propose an internal Wasserstein … Correspondingly, we develop a defense method (called IWDD) to defend against …


A travers et autour des barycentres de Wasserstein [French: Through and around Wasserstein barycenters]

A Kroshnin - 2021 - theses.fr

… thesis, we consider some variational problems involving optimal transport. We are mainly motivated by the Wasserstein … In this thesis, we deal with the following problems: • barycenters …

 

 [PDF] arxiv.org

Wasserstein convergence rate for empirical measures of Markov chains

A Riekert - arXiv preprint arXiv:2101.06936, 2021 - arxiv.org

… measure with respect to the $1$-Wasserstein distance. The main result of this article is a new upper bound for the expected Wasserstein distance, which is proved by combining the …

 Cited by 3 Related articles All 2 versions 


[PDF] github.io

[PDF] SiGANtures: Generating Times Series Using Wasserstein-Generative Adversarial Nets and the Signature Transform

L BLEISTEIN - linusbleistein.github.io

… We focuse on properties used in this thesis, and present recent theoretical results as far as … We will only state those needed to properly define the Wasserstein distance and refer the …


2021

  Wasserstein distance-based auto-encoder tracking

L Xu, Y Wei, C Dong, C Xu, Z Diao - Neural Processing Letters, 2021 - Springer

… [23] introduced the Wasserstein distance and MMD for training to address the issues. In … 

to the code distribution. The experiment results demonstrated that the improved Wasserstein

Cited by 2 Related articles All 2 versions

[PDF] aaai.org

Visual transfer for reinforcement learning via wasserstein domain confusion

J RoyGD Konidaris - Proceedings of the AAAI Conference on Artificial …, 2021 - ojs.aaai.org

… While the Wasserstein GAN loss term seems to align ex- actly … Thus, the Wasserstein distance

is defined as W(Ps,Pt) = Ex … be correct in the proof of Theorem 3 of Arjovsky, Chintala, and …

Cited by 8 Related articles All 9 versions 


[PDF] ijcai.org

[PDF] Two-Sided Wasserstein Procrustes Analysis.

K JinC LiuC Xia - IJCAI, 2021 - ijcai.org

… find that initializing the transformation matrix using Wasserstein GAN [Arjovsky et al … [Grave et

al., 2019] proposes Wasserstein Procrustes by combining Wasserstein distance and Pro …

Related articles All 2 versions 


[PDF] mlr.press

Smooth -Wasserstein Distance: Structure, Empirical Approximation, and Statistical Applications

S Nietert, Z Goldfeld, K Kato - International Conference on …, 2021 - proceedings.mlr.press

… We now examine basic properties of smooth Wasserstein distances, including a useful 

 connection to the smooth Sobolev IPM. The case of W (σ) 1 has been well-studied in (Goldfeld & …

Cited by 13 Related articles All 4 versions


[PDF] ams.org

Nonembeddability of persistence diagrams with 𝑝> 2 Wasserstein metric

A Wagner - Proceedings of the American Mathematical Society, 2021 - ams.org

… product structure compatible with any Wasserstein metric. Hence, … We prove that persistence 

diagrams with the p-Wasserstein … This implies by Theorem 3.4 of Nowak [4] that lp coarsely …

Cited by 15 Related articles All 5 versions

<——2021———2021———2430——


2021 arXiv v2

Local well-posedness in the Wasserstein space for a chemotaxis model coupled to Navier-Stokes equations

K Kang · 2019 — Subjects: Analysis of PDEs (math.AP)


2021 see 2020 2022

An embedding carrier-free steganography method based on wasserstein gan

X Yu, J Cui, M Liu - … Conference on Algorithms and Architectures for …, 2021 - Springer

… In this paper, we proposed a carrier-free steganography method based on Wasserstein 

GAN. We segmented the target information and input it into the trained Wasserstein GAN, and …

Cited by 1 Related articles All 2 versions


Wasserstein Graph Neural Networks for Graphs with Missing Attributes

Z Chen, T Ma, Y Song, Y Wang - arXiv e-prints, 2021 - ui.adsabs.harvard.edu

… Missing node attributes is a common problem in real-world … learning framework, Wasserstein 

Graph Neural Network (… information from neighbors in the Wasserstein space. We test …

 All 2 versions


[PDF] arxiv.org

Trust the Critics: Generatorless and Multipurpose WGANs with Initial Convergence Guarantees

T Milne, É Bilocq, A Nachman - arXiv preprint arXiv:2111.15099, 2021 - arxiv.org

Inspired by ideas from optimal transport theory we present Trust the Critics (TTC), a new 

algorithm for generative modelling. This algorithm eliminates the trainable generator from a …

 Cited by 1 Related articles All 2 versions


[HTML] sinomaps.com

[HTML] 高光谱图像分类的 Wasserstein 配置熵非监督波段选择方法 [Chinese: Wasserstein Configuration Entropy Unsupervised Band Selection Method for Hyperspectral Image Classification]

张红, 吴智伟, 王继成, 高培超 - 2021 - xb.sinomaps.com

Among these, the Wasserstein configuration entropy removes the redundant information of contiguous pixels but is limited to the four-neighborhood; this paper extends the Wasserstein configuration entropy to the eight-neighborhood. Taking the Indian Pines and University of Pavia hyperspectral images as examples, the Wasserstein configuration entropy is used …

Related articles All 3 versions

2021


Towards Energy Efficient Smart Grids: Data Augmentation Through BiWGAN, Feature Extraction and Classification Using Hybrid 2DCNN and BiLSTM

M Asif, B Kabir, A Ullah, S Munawar, N Javaid - … Conference on Innovative …, 2021 - Springer

In this paper, a novel hybrid deep learning approach is proposed to detect the nontechnical

losses (NTLs) that occur in smart grids due to illegal use of electricity, faulty meters, meter …

Cited by 1 Related articles All 2 versions


2021

MR4385577 Prelim Naldi, Emanuele; Savaré, Giuseppe; Weak topology and Opial property in Wasserstein spaces, with applications to gradient flows and proximal point algorithms of geodesically convex functionals. Atti Accad. Naz. Lincei Rend. Lincei Mat. Appl. 32 (2021), no. 4, 725–750. 49Q22 (49J45 65K10)

Review PDF Clipboard Journal Article


Some Theoretical Insights into Wasserstein GANs


First, we properly define the architecture of WGANs in the context of integral probability metrics parameterized by neural networks and highlight some of their ...

Year: 2021, Volume: 22, Issue: 119, Pages: 1−45


2021  [PDF] nozdr.ru

Functional inequalities for the Wasserstein Dirichlet form

W Stannat - Seminar on Stochastic Analysis, Random Fields and …, 2011 - Springer

… We give an alternative representation of the Wasserstein Dirichlet form that was … Wasserstein

Dirichlet form in [3]. A simple two-dimensional generalization of the Wasserstein Dirichlet …

Cited by 3 Related articles All 8 versions


[PDF] arxiv.org

Towards Better Data Augmentation using Wasserstein Distance in Variational Auto-encoder

Z Chen, P Liu - arXiv preprint arXiv:2109.14795, 2021 - arxiv.org

… In this paper, we propose the use of Wasserstein distance as a measure of distributional 

similarity for the latent attributes, and show its superior theoretical lower bound (ELBO) …

  Related articles All 4 versions

<——2021———2021———2440-


Data balancing for thermal comfort datasets using conditional wasserstein GAN with a weighted loss function

H Yoshikawa, A Uchiyama, T Higashino - Proceedings of the 8th ACM …, 2021 - dl.acm.org

… The loss function of the original comfortGAN is based on the idea of Wasserstein GAN-gradient 

penalty (WGAN-GP). WGAN-GP was proposed to facilitate training convergence and …

Cited by 2 Related articles

Wasserstein distance to independence models - ScienceDirect


by TÖ Çelik · 2021 · Cited by 10 — Abstract. An independence model for discrete random variables is a Segre-Veronese variety in a probability simplex. Any metric on the set of joint states of ...


2021 see 2022  [PDF] arxiv.org

Wasserstein proximal algorithms for the Schrödinger bridge problem: Density control with nonlinear drift

KF Caluya, A Halder - IEEE Transactions on Automatic Control, 2021 - ieeexplore.ieee.org

… is that the Wasserstein proximal recursions … -Wasserstein metric W, that will play an 

important role in the development that follows. Definition 1 (2-Wasserstein metric) The 2-Wasserstein

 Cited by 22 Related articles All 7 versions


2021 see 2022  [PDF] osti.gov

Stable Parallel Training of Wasserstein Conditional Generative Adversarial Neural Networks:* Full/Regular Research Paper submission for the symposium CSCI-ISAI …

ML Pasini, J Yin - 2021 International Conference on …, 2021 - ieeexplore.ieee.org

We use a stable parallel approach to train Wasserstein Conditional Generative Adversarial 

Neural Networks (W-CGANs). The parallel training reduces the risk of mode collapse and …

… metric for CGANs, we define the following input …

Related articles All 4 versions



 

2021


Attention Residual Network for White Blood Cell Classification with WGAN Data Augmentation

M Zhao, L Jin, S Teng, Z Li - 2021 11th International …, 2021 - ieeexplore.ieee.org

… Data augmentation based on WGAN To expand the training dataset, we adopt data … 

This paper uses WGAN to augment the WBC images, and _ its parameters are shown in …

Related articles All 2 versions


Underwater Object Detection of an UVMS Based on WGAN

Q Wei, W Chen - 2021 China Automation Congress (CAC), 2021 - ieeexplore.ieee.org

… underwater target detection method based on WGAN is proposed. Firstly, the classic data 

expansion method is used to expand the data set. Then, WGAN based method UVMS is used …

 Related articles


[PDF] unifi.it

[PDF] Pattern-based music generation with wasserstein autoencoders and PRCdescriptions

V Borghuis, L Angioloni, L Brusci… - Proceedings of the Twenty …, 2021 - flore.unifi.it

We present a pattern-based MIDI music generation system with a generation strategy based 

on Wasserstein autoencoders and a novel variant of pianoroll descriptions of patterns which …

 Related articles All 7 versions



Вложения сноуфлейков в пространства Вассерштейна и марковский тип [Russian: Embeddings of snowflakes into Wasserstein spaces and Markov type]

V Zolotov (В Золотов) - math-cs.spbu.ru

… into the p-Wasserstein (Kantorovich–Rubinstein) space … the Markov type of Wasserstein spaces, and how … snowflakes into Wasserstein spaces. (This second …

Related articles All 2 versions


Short communication, Open access
Wasserstein distance-based distributionally robust optimal scheduling in rural microgrid considering the coordinated interaction among source-grid-load-storage
Energy Reports, 10 June 2021...

Changming Chen, Jianxu Xing, Li Yang

Download PDF

<——2021———2021———2450——


Research article
Chapter 4: Lagrangian schemes for Wasserstein gradient flows
Handbook of Numerical Analysis, 11 November 2020...

Jose A. Carrillo, Daniel Matthes, Marie-Therese Wolfram

Cited by 10 Related articles All 6 versions

2021 patent

Wgan-based unsupervised multi-view three-dimensional point cloud joint …

WO CN WO2022165876A1 王耀南 湖南大学

Priority 2021-02-06 • Filed 2021-02-25 • Published 2022-08-11

A kind of unsupervised multi-view three-dimensional point cloud joint registration method based on WGAN according to claim 8, is characterized in that, described step S51 is specifically: The WGAN network trains the discriminator network f ω with the parameter ω and the last layer is not a …


2021 patent

… life prediction method based on improved residual error network and WGAN

CN CN113536697A 沈艳霞 江南大学

Priority 2021-08-24 • Filed 2021-08-24 • Published 2021-10-22

8. The improved residual network and WGAN based bearing remaining life prediction method of claim 1, wherein: in step S3, the WGAN model uses the Wasserstein distance to measure the distribution difference between two feature sets, and optimizes the generator under the domain discriminator to …


2021 patent

Process industry soft measurement data supplementing method based on SVAE-WGAN

CN CN113505477B 高世伟 西北师范大学

Priority 2021-06-29 • Filed 2021-06-29 • Granted 2022-05-20 • Published 2022-05-20

1. A SVAE-WGAN-based process industry soft measurement data supplementing method in the industrial field is characterized by comprising the following steps: step 1: determining input and output of a model according to an industrial background, selecting a proper training data set, inputting time …


2021 patent

Cement clinker free calcium sample data enhancement and prediction method based on R-WGAN comprises performing convolution operation on input data and realizing regression prediction network in R-WGAN of cement clinker free calcium oxide

Patent Number: CN112906976-A

Patent Assignee: UNIV YANSHAN

Inventor(s): HAO X; LIU L; HUANG G; et al.


2021 patent

Method for constructing radar HRRP database based on WGAN-GP

CN CN112946600A 王鹏辉 西安电子科技大学

Priority 2021-03-17 • Filed 2021-03-17 • Published 2021-06-11

The invention discloses a WGAN-GP-based HRRP database construction method, which comprises the following steps: (1) generating a training set; (2) constructing a WGAN-GP network; (3) generating a sample set; (4) training the WGAN-GP network; (5) and completing the construction of the HRRP database.


2021


2021 patent

Mechanical pump small sample fault diagnosis method based on WGAN-GP-C and …

CN CN114037001A 王雪仁 中国人民解放军92578部队

Priority 2021-10-11 • Filed 2021-10-11 • Published 2022-02-11

(2.3) when the WGAN-GP-C which is trained is used for generating a sample, screening the generated sample data; (2.4) the quality of the generated sample data is evaluated by adopting the maximum mean difference MMD and the model is adjusted according to the maximum mean difference MMD. 5. The WGAN …


2021 patent

sEMG data enhancement method based on BiLSTM and WGAN-GP networks

CN CN114372490A 方银锋 杭州电子科技大学

Priority 2021-12-29 • Filed 2021-12-29 • Published 2022-04-19

A sEMG data enhancement method based on a BilSTM and WGAN-GP network comprises the following specific steps: s1, collecting surface electromyogram signals and preprocessing the signals; step S2, standardizing the preprocessed real electromyographic data, and dividing the standardized real …


2021 patent

… local surface slow-speed moving object classification method based on WGAN

CN CN113569632A 周峰 西安电子科技大学

Priority 2021-06-16 • Filed 2021-06-16 • Published 2021-10-29

5. The method for classifying a small sample local area slow moving target according to claim 4, wherein in the substep 2.2, the WGAN introduces Wasserstein distance to measure the difference between the generated data distribution and the real data distribution when performing network training;


2021 patent

Unbalanced data set analysis method based on WGAN training convergence

CN CN113537313A 许艳萍 杭州电子科技大学

Priority 2021-06-30 • Filed 2021-06-30 • Published 2021-10-22

1. An imbalance data set analysis method based on WGAN training convergence is characterized in that: the method specifically comprises the following steps: step one, data acquisition and pretreatment Collecting network security data, dividing the network security data into a multi-class data …

<——2021———2021———2460—— 



2021 patent

WGAN dynamic punishment-based network security unbalance data set analysis …

CN CN114301667A 许艳萍 杭州电子科技大学

Priority 2021-12-27 • Filed 2021-12-27 • Published 2022-04-08

2. The method of claim 1 wherein the method of analyzing an imbalance data set based on WGAN training convergence comprises: the imbalance IR and oversampling ratio R between different classes of data are defined as: wherein N is + And N - Respectively the quantity of the multi-class data and the small …


2021 patent

Data depth enhancement method based on WGAN-GP data generation and Poisson …

CN CN114219778A 侯越 北京工业大学

Priority 2021-12-07 • Filed 2021-12-07 • Published 2022-03-22

The invention discloses a data depth enhancement method based on WGAN-GP data generation and Poisson fusion, wherein WGAN-GP is a generation confrontation network with gradient punishment and is a generation model based on a game idea, and the generation model comprises two networks, namely a …


2021 patent

Clothing attribute editing method based on improved WGAN

CN CN113793397A 张建明 浙江大学

Priority 2021-07-30 • Filed 2021-07-30 • Published 2021-12-14

1. A garment attribute editing method based on improved WGAN comprises a training stage and a testing stage, wherein the training stage is optimized in a supervised learning mode; in the testing stage, the converged network is adopted to generate clothing attributes; the method is characterized by …


2021 patent

Micro-seismic record denoising method based on improved WGAN network and CBDNet

CN CN114218982A 盛冠群 三峡大学

Priority 2021-11-26 • Filed 2021-11-26 • Published 2022-03-22

1. The microseism record denoising method based on the improved WGAN network and the CBDNet is characterized by comprising the following steps of: the method comprises the following steps: collecting micro-seismic data; step two: generating forward simulation signals under different dominant …


2021 patent

Rapid identification method for ship attachment based on WGAN-GP and YOLO

CN CN113792785A 陈琦 上海理工大学

Priority 2021-09-14 • Filed 2021-09-14 • Published 2021-12-14

wherein L is the objective function of WGAN-GP; is the loss function of WGAN at Wasserstein distance; is a gradient penalty that is applied independently for each sample on a WGAN basis. 5. The WGAN-GP and YOLO based ship body attachment rapid identification method as claimed in claim 1, wherein …


2021


2021 patent

One-dimensional time sequence data amplification method based on WGAN

CN CN113627594A 孙博 北京航空航天大学

Priority 2021-08-05 • Filed 2021-08-05 • Published 2021-11-09

4. The method of claim 1, wherein the WGAN-based one-dimensional time series data augmentation method comprises: in the "network model constructed by training" described in the third step, Gaussian noise z ∼ N(μ, σ) having a mean value of 0 and a standard deviation of 1 is used, μ is 0, and σ is …


2021 patent

Power system harmonic law calculation method based on WGAN

CN CN114217132A 梅文波 江苏弈赫能源科技有限公司

Priority 2021-11-11 • Filed 2021-11-11 • Published 2022-03-22

2. The WGAN-based electric power system harmonic law calculation method according to claim 1, wherein a sampling frequency of the current data in the first step is two times or more of a highest frequency in the signal. 3. The WGAN-based power system harmonic law calculation method according to …


2021 patent

Road texture picture enhancement method coupling traditional method and WGAN-GP

CN CN113850855A 徐子金 北京工业大学

Priority 2021-08-27 • Filed 2021-08-27 • Published 2021-12-28

1. A road texture picture enhancement method coupling a traditional method and WGAN-GP is characterized in that a new high-quality texture picture is generated by utilizing a road surface macro texture picture obtained by a commercial handheld three-dimensional laser scanner through a traditional …


2021 patent

Anti-disturbance image generation method based on WGAN-GP

CN CN113537467A 蒋凌云 南京邮电大学

Priority 2021-07-15 • Filed 2021-07-15 • Published 2021-10-22

2. The WGAN-GP-based disturbance rejection image generation method according to claim 1, wherein: the target loss function L_WGAN-GP is calculated by the expression in formula (2), in which d(x) represents that the discriminator determines whether the x class label belongs to the class information in …


2021 patent

WGAN-GP privacy protection system and method based on improved PATE

CN CN113553624A 杨张妍 天津大学

Priority 2021-07-30 • Filed 2021-07-30 • Published 2021-10-26

6. The WGAN-GP privacy protection system based on an improved PATE as claimed in claim 2, wherein in the student discriminator module: the student arbiter generates a sample through analysis and a prediction label output by the conditional differential privacy aggregator corresponding to the sample …

<——2021———2021———2470—— 



2021 patent

Wind power output power prediction method based on isolated forest and WGAN …

CN CN113298297A 王永生 内蒙古工业大学

Priority 2021-05-10 • Filed 2021-05-10 • Published 2021-08-24

7. The isolated forest and WGAN network based wind power output power prediction method of claim 6, wherein the interpolation operation comprises the following steps: step 2.1, inputting the random noise vector z into a generator G to obtain a generated time sequence G (z), wherein G (z) is a …


2021 patent

Micro-seismic record denoising method based on improved WGAN network and CBDNet

CN CN114218982A 盛冠群 三峡大学

Priority 2021-11-26 • Filed 2021-11-26 • Published 2022-03-22

Micro-seismic record denoising method based on improved WGAN network and CBDNet Technical Field The invention relates to a microseism monitoring technology, in particular to a microseism record denoising method based on an improved WGAN network and CBDNet. Background The micro-seismic monitoring …


 2021 patent 

Rapid identification method for ship attachment based on WGAN-GP and YOLO

CN CN113792785A 陈琦 上海理工大学

Priority 2021-09-14 • Filed 2021-09-14 • Published 2021-12-14

Rapid identification method for ship attachment based on WGAN-GP and YOLO Technical Field The invention relates to the technical field of ship body attachment cleaning, in particular to a method for quickly identifying ship body attachments based on WGAN-GP and YOLO. Background The ocean occupies …

CN CN113378959B 潘杰 中国矿业大学

Priority 2021-06-24 • Filed 2021-06-24 • Granted 2022-03-15 • Published 2022-03-15

wherein L_WGAN represents the loss of the generative adversarial network, D(v, s) represents the result of the visual features v and the original semantic features s being fed to the discriminator network D, … indicates that a visual feature is to be synthesized, and the result of the original …

  


  2021 patent

Early fault detection method based on Wasserstein distance

CN CN114722888A 曾九孙 中国计量大学

Priority 2021-10-27 • Filed 2021-10-27 • Published 2022-07-08

3. The early fault detection method based on Wasserstein distance as claimed in claim 2, characterized in that, by constructing dual form of model, the model is solved by Riemann block coordinate descent method, comprising the following steps: s2.1, adding two Lagrange multipliers to construct a …


2021 patent

Wasserstein distance-based motor imagery electroencephalogram migration …

CN CN113010013A 罗浩远 华南理工大学

Priority 2021-03-11 • Filed 2021-03-11 • Published 2021-06-22

6. The Wasserstein distance-based motor imagery electroencephalogram migration learning method according to claim 1, wherein: in step 4), the Wasserstein distance training deep migration learning model is used, and the method comprises the following steps: 4.1) inputting the source domain data and …


2021


2021 patent

… method for generating countermeasure network based on conditional Wasserstein

CN CN114154405A 陈乾坤 东风汽车集团股份有限公司

Priority 2021-11-19 • Filed 2021-11-19 • Published 2022-03-08

3. The motor data enhancement method based on the conditional Wasserstein generation countermeasure network as claimed in claim 1, wherein the conditional Wasserstein generation countermeasure network is a combination of the conditional Wasserstein generation countermeasure network and the …


2021 patent

Wasserstein distance-based object envelope multi-view reconstruction and …

CN CN113034695A 何力 广东工业大学

Priority 2021-04-16 • Filed 2021-04-16 • Published 2021-06-25

7. The Wasserstein distance-based object envelope multi-view reconstruction and optimization method according to claim 5, wherein the specific process of step S4-3 is as follows: embedding the Wasserstein-based distance cost function into three-dimensional reconstruction, including: in equation (3) …


2021 patent

Characteristic similarity countermeasure network based on Wasserstein distance

CN CN113673347A 祝磊 杭州电子科技大学

Priority 2021-07-20 • Filed 2021-07-20 • Published 2021-11-19

6. The Wasserstein distance-based characterized similar countermeasure network of claim 1, wherein in S5: obtaining the round-trip probability between the source domain and the destination domain comprises multiplying the resulting P_st and P_ts; the formula is as follows: P_sts = P_st · P_ts, where P_sts …


2021 patent

Domain self-adaptive rolling bearing fault diagnosis method based on Wasserstein

CN CN113239610A 王晓东 昆明理工大学

Priority 2021-01-19 • Filed 2021-01-19 • Published 2021-08-10

3. The implementation principle of the Wasserstein distance-based domain-adaptive rolling bearing fault diagnosis method according to claim 2 is characterized in that: Step A, extracting features through a convolutional neural network, the convolutional layer containing a filter w and a bias b; let X_n …


2021 patent

… robust optimization scheduling method based on improved Wasserstein measure

CN CN113962612A 刘鸿鹏 东北电力大学

Priority 2021-11-25 • Filed 2021-11-25 • Published 2022-01-21

2. The electric-heat combined system distribution robust optimization scheduling method based on improved Wasserstein measure of claim 1, characterized in that a power error data set is predicted according to historical wind power, building an empirical distribution, where N is the total number of …

<——2021———2021———2480——



2021 patent

Power system bad data identification method based on improved Wasserstein GAN

CN CN114330486A 臧海祥 河海大学

Priority 2021-11-18 • Filed 2021-11-18 • Published 2022-04-12

5. The improved Wasserstein GAN-based power system bad data identification method as claimed in claim 1, wherein the training process of the C4.5 decision tree model in the step (5) comprises: firstly, a calculation formula for determining the sample information entropy in the C4.5 decision tree …


2021 patent

Robot motion planning method and system based on graph Wasserstein self-coding …

CN CN113276119A 夏崇坤 清华大学深圳国际研究生院

Priority 2021-05-25 • Filed 2021-05-25 • Published 2021-08-20

8. The method for robot motion planning based on Wasserstein self-encoded network as claimed in claim 1, wherein step S2 comprises the following steps: S2-1, initializing the encoder network parameters Q_ψ and the decoder network parameters, and initializing a latent discriminator D_τ; …


2021 patent

Cross-domain recommendation method based on double-current sliced wasserstein

CN CN113536116A 聂婕 中国海洋大学

Priority 2021-06-29 • Filed 2021-06-29 • Published 2021-10-22

The invention belongs to the technical field of cross-domain recommendation, and discloses a cross-domain recommendation method based on a double-current sliced Wasserstein self-encoder.


2021 patent

Topic modeling method based on Wasserstein self-encoder and Gaussian mixture …

CN CN114417852A 刘洪涛 重庆邮电大学

Priority 2021-12-06 • Filed 2021-12-06 • Published 2022-04-29

5. The method for modeling a topic based on a Wasserstein self-encoder and a Gaussian mixture distribution as a prior as claimed in any one of claims 1 to 4, wherein step S4 specifically comprises: S41: decoding the topic distribution θ obtained in step S2 to obtain …, representing the …


2021 patent

Wasserstein space-based visual dimension reduction method

CN CN112765426A 秦红星 重庆邮电大学

Priority 2021-01-18 • Filed 2021-01-18 • Published 2021-05-07

5. The visualization dimension reduction method based on Wasserstein space according to claim 4, characterized in that S4 specifically includes: recording the projection point as …; unless the user specifies an initial value, initializing all projection points on [0, 0.001] with a uniform …


2021


2021 patent

Distribution robust optimization method based on Wasserstein measurement and …

CN CN114243683A 侯文庭 周口师范学院

Priority 2021-11-23 • Filed 2021-11-23 • Published 2022-03-25

6. The distributed robust optimization method based on Wasserstein measurement and kernel density estimation as claimed in claim 5, wherein the objective function of the distributed robust block combination model in step S4 is: in the formula, SU_i and SD_i are respectively the start-up and shut-down costs of …


https://www.mimuw.edu.pl › seminaria

zastosowaniach metryki Wassersteina dla zrozumienia ...

Jun 9, 2021 — Przedstawię zastosowania wyników w statystyce, uczeniu maszynowym, matematyce finansowej i innych dziedzinach.

[Polish: On applications of the Wasserstein metric for understanding …; "I will present applications of the results in statistics, machine learning, financial mathematics and other fields."]



Attention Residual Network for White Blood Cell Classification with WGAN Data...
by Zhao, Meng; Jin, Lingmin; Teng, Shenghua; More...
2021 11th International Conference on Information Technology in Medicine and Education (ITME), 11/2021
In medicine, white blood cell (WBC) classification plays an important role in clinical diagnosis and treatment. Due to the similarity between classes and lack...
Conference Proceeding  Full Text Online


Hebei University of Technology Researchers Update Understanding of Robotics (Low-Illumination Image Enhancement in the Space Environment Based on the DC-WGAN...
Journal of Engineering, 01/2021
Newsletter  Full Text Online



2021

Full Attention Wasserstein GAN With Gradient Normalization for Fault Diagnosis Under Imbalanced Data

Fan, JG; Yuan, XF; (...); Zhou, FY

2022 | 

IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT

 71

The fault diagnosis of rolling bearings is vital for the safe and reliable operation of mechanical equipment. However, the imbalanced data collected from the real engineering scenario bring great challenges to the deep learning-based diagnosis methods. For this purpose, this article proposes a methodology called full attention Wasserstein generative adversarial network (WGAN) with gradient norm

Full Text at Publisher · 44 References · Related records

<——2021———2021———2490——


 

Wasserstein Generative Adversarial Networks for Online Test Generation for Cyber Physical Systems

Peltomaki, J; Spencer, F and Porres, I

15th Search-Based Software Testing Workshop (SBST)

2022 | 

15TH SEARCH-BASED SOFTWARE TESTING WORKSHOP (SBST 2022)

 , pp.1-5

We propose a novel online test generation algorithm WOGAN based on Wasserstein Generative Adversarial Networks. WOGAN is a general-purpose black-box test generator applicable to any system under test having a fitness function for determining failing tests. As a proof of concept, we evaluate WOGAN by generating roads such that a lane assistance system of a car fails to stay on the designated lane …

Full Text at Publisher

10 References  Related records


2021

A Target SAR Image Expansion Method Based on Conditional Wasserstein Deep Convolutional GAN for Automatic Target Recognition

Qin, JK; Liu, Z; (...); Guo, ZK

2022 | 

IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING

 15, pp.7153-7170 · Enriched Cited References

For the automatic target recognition (ATR) based on synthetic aperture radar (SAR) images, enough training data are required to effectively characterize target features and obtain good recognition performance. However, in practical applications, it is difficult to collect sufficient training data. To tackle the limitation, a novel end-to-end expansion method, called conditional Wasserstein deep

View full text

52 References  Related records


[PDF] mdpi.com

Polymorphic Adversarial Cyberattacks Using WGAN

R Chauhan, U Sabeel, A Izaddoost… - Journal of Cybersecurity …, 2021 - mdpi.com

… In this paper, we propose a model to generate adversarial attacks using Wasserstein GAN 

(WGAN). The attack data synthesized using the proposed model can be used to train an IDS. …

 Cited by 2 Related articles All 3 versions



Peer-reviewed
Speech Emotion Recognition on Small Sample Learning by Hybrid WGAN-LSTM Networks
Authors: Cunwei Sun, Luping Ji, Hailing Zhong
Summary: The speech emotion recognition based on the deep networks on small samples is often a very challenging problem in natural language processing. The massive parameters of a deep network are much difficult to be trained reliably on small-quantity speech samples. Aiming at this problem, we propose a new method through the systematical cooperation of Generative Adversarial Network (GAN) and Long Short Term Memory (LSTM). In this method, it utilizes the adversarial training of GAN’s generator and discriminator on speech spectrogram images to implement sufficient sample augmentation. A six-layer convolution neural network (CNN), followed in series by a two-layer LSTM, is designed to extract features from speech spectrograms. For accelerating the training of networks, the parameters of discriminator are transferred to our feature extractor. By the sample augmentation, a well-trained feature extraction network and an efficient classifier could be achieved. The tests and comparisons on two publicly available datasets, i.e., EMO-DB and IEMOCAP, show that our new method is effective, and it is often superior to some state-of-the-art methods.
Article, 2021
Publication: Journal of Circuits, Systems and Computers, 31, 18 October 2021
Publisher: 2021
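As a rough illustration of the feature extractor described in this summary (a six-layer CNN feeding a two-layer LSTM over spectrogram frames), here is a minimal PyTorch sketch; the channel widths, kernel sizes, input shape, and hidden size are illustrative assumptions, not the configuration reported in the paper.

import torch
import torch.nn as nn

class CnnLstmExtractor(nn.Module):
    # Six conv layers over the spectrogram, then a two-layer LSTM over the time axis.
    def __init__(self, n_mels=64, hidden=128):
        super().__init__()
        chans = [1, 16, 32, 64, 64, 128, 128]   # six conv layers (assumed widths)
        blocks = []
        for c_in, c_out in zip(chans[:-1], chans[1:]):
            blocks += [nn.Conv2d(c_in, c_out, kernel_size=3, padding=1), nn.ReLU()]
        self.cnn = nn.Sequential(*blocks)
        self.lstm = nn.LSTM(input_size=chans[-1] * n_mels, hidden_size=hidden,
                            num_layers=2, batch_first=True)

    def forward(self, spec):                     # spec: (batch, 1, n_mels, time)
        h = self.cnn(spec)                       # (batch, 128, n_mels, time)
        h = h.permute(0, 3, 1, 2).flatten(2)     # (batch, time, 128 * n_mels)
        out, _ = self.lstm(h)
        return out[:, -1]                        # feature vector from the last frame

x = torch.randn(2, 1, 64, 100)                   # toy spectrogram batch
print(CnnLstmExtractor()(x).shape)               # torch.Size([2, 128])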



Conditional Wasserstein generative adversarial networks applied to acoustic metamaterial design

P Lai, F Amirkulova, P Gerstoft - … Journal of the Acoustical Society …, 2021 - asa.scitation.org

This work presents a method for the reduction of the total scattering cross section (TSCS) for 

a planar configuration of cylinders by means of generative modeling and deep learning. …

 Cited by 3 Related articles All 4 versions


2021


 PUBLICATION

Wasserstein Contrastive Representation Distillation 

Liqun Chen, Dong Wang, Zhe Gan, Jingjing Liu, Ricardo Henao, Lawrence Carin

CVPR 2021 | June 2021

Cited by 29 Related articles All

[PDF] neurips.cc

A theory of the distortion-perception tradeoff in wasserstein space

D Freirich, T Michaeli, R Meir - Advances in Neural …, 2021 - proceedings.neurips.cc

… interpolation in pixel space [31] or in some latent space [26], … MSE–Wasserstein-2 tradeoff,

linear interpolation in pixel space … on the DP curve form a geodesic in Wasserstein space. …

 Cited by 5 Related articles All 4 versions 


[PDF] arxiv.org

The Wasserstein space of stochastic processes

D Bartl, M Beiglböck, G Pammer - arXiv preprint arXiv:2104.14245, 2021 - arxiv.org

… We also show that (FP, AW) is a geodesic space, isometric to a classical Wasserstein space

adapted Wasserstein distance. In Section 3 we formally discuss the Wasserstein space of …

 Cited by 9 Related articles All 2 versions 


[PDF] mlr.press

Fast and smooth interpolation on wasserstein space

S Chewi, J Clancy, T Le Gouic… - International …, 2021 - proceedings.mlr.press

… (2019)) we equip this space with the 2-Wasserstein metric W2 and seek an interpolation …

over the Wasserstein space of probability measures. Splines in Wasserstein space were …

 Cited by 10 Related articles All 7 versions 

 

[PDF] arxiv.org

Variational Wasserstein gradient flow

J Fan, A Taghvaei, Y Chen - arXiv preprint arXiv:2112.02424, 2021 - arxiv.org

… Wasserstein gradient flow models the gradient dynamics over the space of probability densities

with respect to the Wasserstein … is in fact the Wasserstein gradient flow of the free energy, …

Cited by 5 Related articles All 3 versions 

<——2021———2021———2500—— 


[PDF] arxiv.org

Master Bellman equation in the Wasserstein space: Uniqueness of viscosity solutions

A Cosso, F Gozzi, I Kharroubi, H Pham… - arXiv preprint arXiv …, 2021 - arxiv.org

… We study the Bellman equation in the Wasserstein space … -Lions extended to our Wasserstein

setting, we prove a … nature of the underlying Wasserstein space. The adopted strategy is …

 Cited by 4 Related articles All 18 versions 


[PDF] neurips.cc

Wasserstein Flow Meets Replicator Dynamics: A Mean-Field Analysis of Representation Learning in Actor-Critic

Y Zhang, S Chen, Z Yang… - Advances in Neural …, 2021 - proceedings.neurips.cc

… a semigradient flow in the Wasserstein space [59]. Moreover, the semigradient flow runs at a

… Thus, in the mean-field regime, our critic is given by a Wasserstein semigradient flow, which …

Related articles All 4 versions 


Recycling Discriminator: Towards Opinion-Unaware Image Quality Assessment Using Wasserstein GAN

Y Zhu, H Ma, J Peng, D Liu, Z Xiong - Proceedings of the 29th ACM …, 2021 - dl.acm.org

… First, Wasserstein GAN (WGAN) has the valuable property that its discriminator loss is an 

accurate estimate of the Wasserstein distance [2, 12]. Second, the perceptual quality index (…

Cited by 3 Related articles
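The first property mentioned in this snippet, namely that a 1-Lipschitz WGAN critic's loss estimates the Wasserstein-1 distance through the Kantorovich-Rubinstein dual, can be sketched in a few lines of PyTorch; the toy critic and data below are placeholders of my own, not the paper's model.

import torch
import torch.nn as nn

def w1_estimate(critic, real, fake):
    # Kantorovich-Rubinstein dual: W1(P_r, P_g) is approximated by E[f(real)] - E[f(fake)]
    # for a 1-Lipschitz critic f (Lipschitz-ness enforced elsewhere, e.g. weight clipping
    # or a gradient penalty).
    return critic(real).mean() - critic(fake).mean()

critic = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))  # placeholder critic
real = torch.randn(256, 2) + 2.0   # toy "real" samples
fake = torch.randn(256, 2)         # toy "generated" samples
print(float(w1_estimate(critic, real, fake)))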

Fault Diagnosis Method Based on CWGAN-GP-1DCNN

H Yin, Y Gao, C Liu, S Liu - 2021 IEEE 24th International …, 2021 - ieeexplore.ieee.org

… CWGAN-GP To address the problem of failure sample, conditional WGAN-GP (CWGAN-GP) …

This paper proposes a fault diagnosis model combining 1D-CNN and CWGAN-GP for the …

Related articles All 2 versions


[PDF] arxiv.org

Efficient wasserstein natural gradients for reinforcement learning

T Moskovitz, M Arbel, F Huszar, A Gretton - arXiv preprint arXiv …, 2020 - arxiv.org

… The procedure uses a computationally efficient Wasserstein natural gradient (WNG) 

descent that takes advantage of the geometry induced by a Wasserstein penalty to speed …

Cited by 4 Related articles All 6 versions

[v3] Wed, 17 Mar 2021 15:02:06 UTC (11,855 KB)

[v4] Thu, 18 Mar 2021 10:41:34 UTC (11,858 KB)


2021


DISSERTATION

基于条件Wasserstein生成对抗网络的跨模态光伏出力数据生成方法

康明与2021

[Chinese: Cross-modal photovoltaic output data generation method based on conditional Wasserstein generative adversarial network, by Kang Ming, 2021]


DISSERTATION

Efficient and Provable Algorithms for Wasserstein Distributionally Robust Optimization in Machine Learning

LI, Jiajin, 2021

Efficient and Provable Algorithms for Wasserstein Distributionally Robust Optimization in Machine Learning

No Online Access 

[CITATION] Efficient and Provable Algorithms for Wasserstein Distributionally Robust Optimization in Machine Learning

J LI - 2021 - The Chinese University of Hong …

Existence and stability results for an isoperimetric problem with a non-local interaction of Wasserstein...
by Candau-Tilh, Jules; Goldman, Michael
08/2021
The aim of this paper is to prove the existence of minimizers for a variational problem involving the minimization under volume constraint of the sum of the...
Journal Article  Full Text Online

Wasserstein Announces Exclusive Accessory Line for New Google Nest Devices
Plus Company Updates, Aug 13, 2021
Newspaper Article  Full Text Online


The α-z-Bures Wasserstein divergence
by Dinh, Trung Hoa; Le, Cong Trinh; Vo, Bich Khue; More...
Linear algebra and its applications, 09/2021, Volume 624
Keywords: Quantum divergence; α-z Bures distance; Least squares problem; Karcher mean; Matrix power mean; In-betweenness property; Data processing...
ArticleView Article PDF
Journal Article  Full Text Online

<——2021———2021———2510——



 2021 see 2022
Approximate Wasserstein Attraction Flows for Dynamic Mass Transport over Networks
by Arqué, Ferran; Uribe, César A; Ocampo-Martinez, Carlos
09/2021
This paper presents a Wasserstein attraction approach for solving dynamic mass transport problems over networks. In the transport problem over networks, we...
Journal Article  Full Text Online

Wasserstein index of dependence for random measures
by Catalano, Marta; Lavenant, Hugo; Lijoi, Antonio; More...
09/2021
Nonparametric latent structure models provide flexible inference on distinct, yet related, groups of observations. Each component of a vector of $d \ge 2$...
Journal Article  Full Text Online


Face Image Generation for Illustration by WGAN-GP Using Landmark Information

by Takahashi, Miho; Watanabe, Hiroshi

2021 IEEE 10th Global Conference on Consumer Electronics (GCCE), 10/2021

With the spread of social networking services, face images for illustration are being used in a variety of situations. Attempts have been made to create...

Conference Proceeding  Full Text Online



On the Wasserstein Distance Between $k$-Step Probability Measures on Finite Graphs
by Benjamin, Sophia; Mantri, Arushi; Perian, Quinn
10/2021
We consider random walks $X,Y$ on a finite graph $G$ with respective lazinesses $\alpha, \beta \in [0,1]$. Let $\mu_k$ and $\nu_k$ be the $k$-step transition...
Journal Article  Full Text Online

Empirical measures and random walks on compact spaces in the quadratic Wasserstein...
by Borda, Bence
10/2021
Estimating the rate of convergence of the empirical measure of an i.i.d. sample to the reference measure is a classical problem in probability theory....
Journal Article  Full Text Online
  

2021


On the Wasserstein Distance Between \(k\)-Step Probability Measures on Finite Graphs
by Benjamin, Sophia; Mantri, Arushi; Quinn Perian
arXiv.org, 10/2021
We consider random walks \(X,Y\) on a finite graph \(G\) with respective lazinesses \(\alpha, \beta \in [0,1]\). Let \(\mu_k\) and \(\nu_k\) be the \(k\)-step...
Paper  Full Text Online



Sliced-Wasserstein distance for large-scale machine learning : theory, methodology and...
by Nadjahi, Kimia
Institut Polytechnique de Paris, 2021
Many methods for statistical inference and generative modeling rely on a probability divergence to effectively compare two probability distributions. The...
Dissertation/Thesis  Full Text Online

Multi-period facility location and capacity planning under $\infty$-Wasserstein joint...
by Wang, Zhuolin; You, Keyou; Wang, Zhengli; More...
11/2021
The key of the post-disaster humanitarian logistics (PD-HL) is to build a good facility location and capacity planning (FLCP) model for delivering relief...
Journal Article  Full Text Online

 2021 see 2022
Exact Convergence Analysis for Metropolis-Hastings Independence Samplers in Wasserst...
by Brown, Austin; Jones, Galin L
11/2021
Under mild assumptions, we show the sharp convergence rate in total variation is also sharp in weaker Wasserstein distances for the Metropolis-Hastings...
Journal Article  Full Text Online

Bounds in $L^1$ Wasserstein distance on the normal approximation of general M-estimators
by Bachoc, François; Fathi, Max
11/2021
We derive quantitative bounds on the rate of convergence in $L^1$ Wasserstein distance of general M-estimators, with an almost sharp (up to a logarithmic term)...
Journal Article  Full Text Online

<——2021———2021———2520——



 2021 see 2022
Adversarial Classification via Distributional Robustness with Wasserstein Ambiguity
by Ho-Nguyen, Nam; Wright, Stephen J
arXiv.org, 11/2021
We study a model for adversarial classification based on distributionally robust chance constraints. We show that under Wasserstein ambiguity, the model aims...
Paper  Full Text Online

Multi-period facility location and capacity planning under \(\infty\)-Wasserstein joint...
by Wang, Zhuolin; You, Keyou; Wang, Zhengli; More...
arXiv.org, 11/2021
The key of the post-disaster humanitarian logistics (PD-HL) is to build a good facility location and capacity planning (FLCP) model for delivering relief...
Paper  Full Text Online

Related articles All 2 versions


\(W\)-entropy formulas and Langevin deformation of flows on Wasserstein space...
by Songzi Li; Xiang-Dong Li
arXiv.org,11/2021
We introduce Perelman's \(W\)-entropy and prove the \(W\)-entropy formula along the geodesic flow on the \(L^2\)-Wasserstein space over compact Riemannian...
Paper  Full Text Online

Bounds in \(L^1\) Wasserstein distance on the normal approximation of general M-estimators
by François Bachoc; Max Fathi
arXiv.org, 11/2021
We derive quantitative bounds on the rate of convergence in \(L^1\) Wasserstein distance of general M-estimators, with an almost sharp (up to a logarithmic...
Paper  Full Text Online


HRL Laboratories LLC issued patent titled "System and method for unsupervised domain adaptation via sliced-wasserstein...
News Bites - Private Companies, Nov 23, 2021
Newspaper Article  Full Text Online


2021


The isometry group of Wasserstein spaces: the Hilbertian case
by György Pál Gehér; Tamás Titkos; Dániel Virosztek
arXiv.org, 12/2021
Motivated by Kloeckner's result on the isometry group of the quadratic Wasserstein space \(\mathcal{W}_2\left(\mathbb{R}^n\right)\), we describe the isometry...
Paper  Full Text Online


Smooth \(p\)-Wasserstein Distance: Structure, Empirical Approximation, and Statistical...
by Sloan Nietert; Goldfeld, Ziv; Kato, Kengo
arXiv.org, 12/2021
Discrepancy measures between probability distributions, often termed statistical distances, are ubiquitous in probability theory, statistics and machine...
Paper  Full Text Online
 
Gamifying optimization: a Wasserstein distance-based...
by Candelieri, Antonio; Ponti, Andrea; Archetti, Francesco
12/2021
The main objective of this paper is to outline a theoretical framework to characterise humans' decision-making strategies under uncertainty, in particular...
Journal Article  Full Text Online


2021 patent  2021-2022
Industrial anomaly detection method and device based on WGAN

CN CN113554645B 杭天欣 常州微亿智造科技有限公司

Priority 2021-09-17 • Filed 2021-09-17 • Granted 2022-01-11 • Published 2022-01-11

performing anomaly detection on the screened image to be detected by adopting the WGAN anomaly detection model, specifically, performing data preprocessing on the screened image to be detected to obtain a secondary image to be detected, inputting the secondary image to be detected into the WGAN


2021  Research article
Alpha Procrustes metrics between positive definite operators: A unifying formulation for the Bures-Wasserstein and Log-Euclidean/Log-Hilbert-Schmidt metrics
Linear Algebra and its Applications, 22 November 2021...

Hà Quang Minh

<——2021———2021———2530—



2021 see 2022  Research article
Sliced Wasserstein Distance for Neural Style Transfer
Computers & Graphics, 16 December 2021...

Jie Li, Dan Xu, Shaowen Yao


2021 see 2022  Research article
Well-posedness for some non-linear SDEs and related PDE on the Wasserstein space
Journal de Mathématiques Pures et Appliquées, 23 December 2021...

Paul-Eric Chaudru de Raynal, Noufel Frikha


2021 see 2022  Research article
Optimizing decisions for a dual-channel retailer with service level requirements and demand uncertainties: A Wasserstein metric-based distributionally robust optimization approach
Computers & Operations Research, 22 October 2021...

Yue Sun, Ruozhen Qiu, Minghe Sun


2021 see 2022  Research article
Wasserstein distributionally robust chance-constrained optimization for energy and reserve dispatch: An exact and physically-bounded formulation
European Journal of Operational Research, 19 April 2021...

Adriano Arrigo, Christos Ordoudis, François Vallée


2021 see 2022  Research article
Approximation rate in Wasserstein distance of probability measures on the real line by deterministic empirical measures
Journal of Approximation Theory, 10 December 2021...

O. Bencheikh, B. Jourdain



[PDF] arxiv.org

On the estimation of the Wasserstein distance in generative models

T Pinetz, D Soukup, T Pock - … , September 10–13, 2019, Proceedings 41, 2019 - Springer

… from a set of noise vectors to a set of images using the Wasserstein distance for a batch of 

size 4000. The resulting images are shown in the supplementary material for all algorithms. To …

Cited by 8 Related articles All 5 versions


2021 PDF

Wasserstein Distance and Entropic Regularization in Kantorovich Problem 

https://student.cs.uwaterloo.ca › PMATH451 › P...

Jan 3, 2021 — The process of solving the Kantorovich dual, a formulation of the optimal transport prob- lem, has gained popularity in synthetic image ...

16 pages
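As background to this entry, the entropic regularization of the Kantorovich problem is usually solved with Sinkhorn iterations; a minimal numpy sketch on toy uniform marginals follows (the regularization strength eps and the iteration count are arbitrary choices of mine).

import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iter=200):
    # Entropy-regularized optimal transport: scale the Gibbs kernel K = exp(-C/eps)
    # until the transport plan P = diag(u) K diag(v) has marginals a and b.
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]
    return P, np.sum(P * C)          # plan and the transport cost of that plan

# toy example: two small point clouds with uniform weights
x = np.random.randn(5, 2); y = np.random.randn(6, 2)
C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)   # squared-distance cost matrix
a = np.full(5, 1 / 5); b = np.full(6, 1 / 6)
P, cost = sinkhorn(a, b, C)
print(cost)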


2021 PDF

On a linear Gromov–Wasserstein distance - arXiv

https://arxiv.org › pdf

by F Beier · 2021 · Cited by 3 — Abstract—Gromov–Wasserstein distances are generalization of Wasserstein distances, which are invariant under distance preserving transformations.



Wasserstein Dictionary Learning: Optimal Transport ... - CNRS

by MA Schmitz · Cited by 115 — Abstract. This paper introduces a new nonlinear dictionary learning method for histograms in the probability simplex. The method leverages optimal transport ...

36 pages

see

SIAM J. IMAGING SCIENCES c© 2018 Morgan A. Schmitz

Vol. 11, No. 1, pp. 643–678 (2018)



VIDEO 1

Ultrametric Gromov-Hausdorff and Gromov-Wasserstein Distances

Mémoli, Facundo (The Ohio State University)2021

OPEN ACCESS

Ultrametric Gromov-Hausdorff and Gromov-Wasserstein Distances

No Online Access 

Mémoli, Facundo (The Ohio State University)

 

VIDEO 2

Wasserstein Control of Mirror Langevin Monte Carlo

Shuangjian Zhang, Kelvin (École Normale Supérieure)2020

OPEN ACCESS

Wasserstein Control of Mirror Langevin Monte Carlo

No Online Access 

Shuangjian Zhang, Kelvin (École Normale Supérieure)

<——2021———2021———2540—— 


An Invitation to Optimal Transport, Wasserstein Distances, ...

books.google.com › books

Alessio Figalli, ‎Federico Glaudo · 2021 · ‎No preview

"This book provides a self-contained introduction to optimal transport, and it is intended as a starting point for any researcher who wants to enter into this beautiful subject.

2021  [PDF] ub.edu

Corrigendum: An enhanced uncertainty principle for the Vaserstein distance

T Carroll, FX Massaneda Clares… - Bulletin of the London …, 2021 - diposit.ub.edu

CORRIGENDUM TO AN ENHANCED UNCERTAINTY PRINCIPLE FOR THE VASERSTEIN 

DISTANCE One of the main results in the paper mentioned in t … Here W1(f+,f−) indicates …

Related articles

[PDF] wiley.com

[CITATION] Corrigendum: An enhanced uncertainty principle for the Vaserstein distance: (Bull. Lond. Math. Soc. 52 (2020) 1158–1173)

T Carroll, X Massaneda… - Bulletin of the London …, 2021 - Wiley Online Library

Theorem 1. Let Q_0 = [0, 1]^d be the unit cube in R^d and let f : Q_0 → R be a continuous function with zero mean. Let Z(f) be the nodal set Z(f) = {x ∈ Q_0 : f(x) = 0}. Let H^{d−1}(Z(f)) denote the …

Related articles


2021  thesis

Flexibility properties and homology of Gromov-Vaserstein fibres

G De Vito, F Kutzschebauch - scholar.archive.org

The aim of this thesis consists in the study of a very concrete class of affine algebraic varieties, 

ie the fibres of the so-called Gromov-Vaserstein fibration, which are of importance in the …

 Related articles All 3 versions

Flexibility properties and homology of Gromov-Vaserstein fibres 

Inaugural dissertation of the Faculty of Science, University of Bern 

presented by Giorgio De Vito from Italy 



2021 video

Training Wasserstein GANs without gradient penalties · 15 views

Nov 8, 2021


Wasserstein Embeddings in the Deep Learning Era - YouTube

... Soheil Kolouri on 24th November in the one world seminar on the mathematics of machine learning on the topic "Wasserstein Embeddings in.

YouTube · One world theoretical machine learning · 

Nov 27, 2021

Talk by Prof. Soheil Kolouri, Vanderbilt University - YouTube

www.youtube.com › watch

Title: Wasserstein Embeddings in the Deep Learning Era, hosted by High-dimensional Statistical Modeling Team (PI: Makoto Yamada). For more ...

YouTube · AIP RIKEN · 

Dec 19, 2021

2021


Dimensionality Reduction for Wasserstein Barycenter

The Wasserstein barycenter is a geometric construct which captures the notion of centrality among probability distributions, and which has ...

SlidesLive · 

Dec 6, 2021


Barycenters of Natural Images Constrained Wasserstein ...

papertalk.org › papertalks

Barycenters of Natural Images Constrained Wasserstein Barycenters for Image Morphing. Dror Simon, Aviad Aberdam. Keywords: image morphing, image ...

Papertalk · ComputerVisionFoundation Videos · 

Dec 6, 2021


Wasserstein gradient flows for machine learning by Anna Korba

www.youtube.com › watch

Minisymposia: Wasserstein gradient flows for machine learning. Anna Korba (ENSAE, Paris). An important problem in machine learning and ...

YouTube · Red Estratégica en Matemáticas · 

Dec 23, 2021



CNN-Based Continuous Authentication on Smartphones With Conditional Wasserstein Generative Adversarial Network

Y Li, J Luo, S Deng, G Zhou - IEEE Internet of Things Journal, 2021 - ieeexplore.ieee.org

… Wasserstein generative adversarial network (CWGAN) for sensor data augmentation and specially design a CNN-based … In this article, we present CAGANet, a CNN-based continuous …

Cited by 3 Related articles


[PDF] arxiv.org

 Wasserstein distance-based asymmetric adversarial domain adaptation in intelligent bearing fault diagnosis

Y Yu, J Zhao, T Tang, J Wang, M Chen… - Measurement …, 2021 - iopscience.iop.org

… In this paper, we propose a novel adversarial TL method, Wasserstein distance-based asymmetric adversarial domain adaptation (WAADA). Our method is inspired by ADDA, which …

 Cited by 5 Related articles All 2 versions

 <——2021———2021———2550——

[PDF] arxiv.org

Distributionally robust tail bounds based on Wasserstein distance and f-divergence

C Birghila, M Aigner, S Engelke - arXiv preprint arXiv:2106.06266, 2021 - arxiv.org

… We evaluate the robust tail behavior in ambiguity sets based on the Wasserstein distance and Csiszár f-divergence and obtain explicit expressions for the corresponding asymptotic …

Cited by 3 Related articles All 2 versions 

[PDF] aaai.org

Learning graphons via structured gromov-wasserstein barycenters

H Xu, D Luo, L Carin, H Zha - Proceedings of the AAAI Conference on …, 2021 - ojs.aaai.org

… Besides the basic GWB method, we design 1) a smoothed GWB method to enhance the continuity of learned graphons, and 2) a mixture model of GWBs to learn multiple graphons from …

Cited by 8 Related articles All 6 versions

Cited by 13 Related articles All 5 versions


[HTML] sciencedirect.com

[HTML] Wasserstein distance based multiobjective evolutionary algorithm for the risk aware optimization of sensor placement

A PontiA CandelieriF Archetti - Intelligent Systems with Applications, 2021 - Elsevier

… The formulation of Wasserstein enabled, combination operators, a new method to assess the quality of the Pareto set and of a new method to choose among its solutions. …

Cited by 8 Related articles


LCS graph kernel based on Wasserstein distance in longest common subsequence metric space

J Huang, Z Fang, H Kasai - Signal Processing, 2021 - Elsevier

… is based on the length of their longest common subsequence. As the final step, we use the 1… Wasserstein distance of two distributions as our kernel value. In addition to our basic method

Cited by 10 Related articles All 2 versions


[PDF] arxiv.org

A material decomposition method for dual‐energy CT via dual interactive Wasserstein generative adversarial networks

Z Shi, H Li, Q Cao, Z Wang, M Cheng - Medical Physics, 2021 - Wiley Online Library

… In this study, a data‐driven approach using dual interactive Wasserstein generative adversarial networks (DIWGAN) is developed to improve DECT decomposition accuracy and …

Cited by 7 Related articles All 7 versions


2021


[PDF] mdpi.com

Power electric transformer fault diagnosis based on infrared thermal images using wasserstein generative adversarial networks and deep learning classifier

KH Fanchiang, YC Huang, CC Kuo - Electronics, 2021 - mdpi.com

… WAR model with the discriminator is based on the concept of Wasserstein GAN [35] and GANomaly [… in Equation (21) is exactly like the Wasse

Cited by 6 Related articles All 3 versions 


[PDF] archives-ouvertes.fr

Wasserstein generative models for patch-based texture synthesis

A HoudardA LeclaireN Papadakis… - … Conference on Scale …, 2021 - Springer

… -based optimization method is proposed, based on discrete optimal transport. We show that it generalizes a well-known texture optimization method … properties of Wasserstein distances …

 Cited by 7 Related articles All 16 versions



Gromov-Wasserstein based optimal transport to align... - Pinar Demetci - MLCSB - ISMB 2020 Posters · 68 views

Jan 13, 2021 

Gromov-Wasserstein based optimal transport to align... - MLCSB

www.youtube.com › watch · 6:51

Gromov-Wasserstein based optimal transport to align single-cell multi-omics data ... Estimating the Wasserstein Metric - Jonathan Niles-Weed.

YouTube · ISCB · 

Jan 13, 2021

Alex Elchesen (1/14/21): Universality of Persistence Diagrams ...

www.youtube.com › watch

Alex Elchesen (1/14/21): Universality of Persistence Diagrams and Bottleneck & Wasserstein Distances. 345 views. Jan 14, 2021.

YouTube · Applied Algebraic Topology Network · 

Jan 14, 2021

Julius Lohmann: On the Wasserstein distance with respect to ...

www.uni-muenster.de › show_article


Jan 20, 2021 — Wednesday, 03.02.2021 11:00 per ZOOM: Link to Zoom info. Mathematik und Informatik. In this talk I use the idea of branched transport to ...

Jan 20, 2021

Wasserstein Distance to Independence Models - YouTube

Wasserstein Distance to Independence Models. 98 views 1 year ago ... Numerical Calabi-Yau Metrics & Machine Learning.

YouTube · Sanya AG&ML Workshop 2021 · 

Jan 28, 2021

<——2021———2021———2560——



Accelerated WGAN Update Strategy With Loss Change Rate ...

https://openaccess.thecvf.com › content › papers

by X Ouyang · 2021 · Cited by 4 — The proposed update strategy is based on a loss change ratio comparison of G and D.

We demonstrate ... The WGAN update strategy …

1223 - Accelerated WGAN update strategy with loss change rate balancing · 29 views

Feb 3, 2021
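The snippet above says only that the update strategy compares the loss change rates of G and D; the following is a purely schematic Python sketch of that general idea, where the ratio definition, the comparison rule, and the epsilon guard are my assumptions rather than the paper's exact algorithm.

def choose_update(prev_g, cur_g, prev_d, cur_d, eps=1e-8):
    # Relative loss-change ratios of generator and discriminator over the last step.
    r_g = abs(cur_g - prev_g) / (abs(prev_g) + eps)
    r_d = abs(cur_d - prev_d) / (abs(prev_d) + eps)
    # Update the player whose loss is currently changing more slowly,
    # so neither network runs far ahead of the other (assumed rule).
    return "G" if r_g < r_d else "D"

print(choose_update(1.00, 0.99, 0.50, 0.30))   # prints "G"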

Statistics Seminar Speaker: Marta Catalano 02/04/2021 ...

stat.cornell.edu › events › statistics-seminar-speaker-ma...

Feb 4, 2021 — Talk: Measuring dependence in the Wasserstein distance for Bayesian nonparametric models. A link to this Zoom talk will be sent to the Stats ...

QIP2021 | The quantum Wasserstein distance of order 1 (Giacomo De Palma)

230 views•

Feb 4, 2021

QIP2021 | The quantum Wasserstein distance of order 1 ...


Feb 4, 2021 · Uploaded by Munich Ce

The quantum Wasserstein distance of order 1 (Giacomo De ...

www.youtube.com › watch

www.youtube.com › watch

Our main result is a continuity bound for the von Neumann entropy with respect to the proposed distance, which significantly strengthens the ...

YouTube · Munich Center for Quantum Science & Technology · 

Feb 4, 2021


The quantum Wasserstein distance of order 1 » NSF ...

Abstract. Giacomo De Palma - Massachusetts Institute of Technology, Research Laboratory of Electronics ...

Feb 10, 2021


Tariq Syed | The generalized Vaserstein symbol - YouTube

Seminar on A1-topology, motives and K-theory, February 18, 2021Tariq Syed (University of Duisburg-Essen ...

Feb 18, 2021 · Uploaded by EIMI, PDMI RAS and Chebyshev Laboratory

Giacomo De Palma: "The quantum Wasserstein distance of …

Entropy Inequalities, Quantum Information and Quantum Physics 2021"The quantum Wasserstein distance of ...

Feb 19, 2021 · Uploaded by Institute for Pure & Applied Mathematics (IPAM)


2021

Kas yra Wasserstein gan (wgan)? - apibrėžimas iš ...

Apibrėžimas - ką reiškia Wasserstein GAN (WGAN)?. Wassersteino GAN (WGAN) yra algoritmas ...

[Lithuanian: What is a Wasserstein GAN (WGAN)? Definition: what does Wasserstein GAN (WGAN) mean? The Wasserstein GAN (WGAN) is an algorithm ...]

Feb 25, 2021

Imperial REDS Lab (@ImperialREDSLab) / Twitter

mobile.twitter.com › imperialredslab

Robotic manipulation: Engineering, Design, and Science Laboratory (REDS Lab) at ... a Highly Data-Efficient Controller Based on the Wasserstein Distance'.

Twitter · 

Feb 28, 2021

 

Thibaut Le Gouic: Sampler for the Wasserstein barycenter

www.youtub

Thibaut Le Gouic: Sampler for the Wasserstein barycenter. 167 views. Apr 12, 2021.

YouTube · FODSI · 

Apr 12, 2021
 

Frank Nielsen on Twitter: "Why is there so many "distances" in ...

twitter.com › frnknlsn › status

I'm looking for a metric to quantify the distance between two probability ... But why is Wasserstein / EMD alone at the bottom?

Twitter · 

Apr 17, 2021


Algebraic Wasserstein distance between persistence modules

Katharine Turner, Australian National University

iSMi
Monday, 

April 26, 2021

<——2021———2021———2570——



Wasserstein-2 Generative Networks - SlidesLive 

We propose a novel end-to-end non-minimax algorithm for training optimal transport mappings for the quadratic cost (Wasserstein-2 distance).

SlidesLive · 

May 3, 2021

Continuous Wasserstein-2 Barycenter Estimation without ...

slideslive.com › continuous-wasserstein2-barycenter-estim...

Wasserstein barycenters provide a geometric notion of the weighted average of probability measures based on optimal transport.

SlidesLive · 

May 3, 2021


Wasserstein GAN (Continued) | Lecture 68 (Part 1) - YouTube

www.youtube.com › watch

Wasserstein GAN. Course Materials: https://github.com/maziarraissi/Applied-Deep-Learning.

YouTube · Maziar Raissi · 

May 7, 2021

Wasserstein GAN | Lecture 67 (Part 4) | Applied Deep Learning

www.youtube.com › watch

Wasserstein GAN. Course Materials: https://github.com/maziarraissi/Applied-Deep-Learning.

YouTube · Maziar Raissi · 

May 7, 2021


Adam Oberman (@oberman_adam) / Twitter

twitter.com › oberman_adam

Wasserstein barycenters are useful in stats/ML but typical algorithms discretize the domain, ... 's Misha Yurochkin we construct a continuous barycenter!

Twitter · 

May 8, 2021


Wasserstein Learning of Generative Models - Yuxin (Iris) Ye

The Wasserstein Generative Adversarial Network, or Wasserstein GAN, ... from scratch and validate the tractable loss function of WGAN-GP.

Yuxin (Iris) Ye · Yuxin Ye · 

May 20, 2021

Wasserstein Learning of Generative Models - Yuxin (Iris) Ye

http://yuxinirisye.com › ... › Machine Learning

May 20, 2021 — Wasserstein Learning of Generative Models – Yuxin (Iris) Ye ...


2021

Entropic Optimal Transport - Prof. Marcel Nutz - YouTube

www.youtube.com › watch 

A workshop to commemorate the centenary of publication of Frank Knight's "Risk, Uncertainty, and Profit" and John Maynard Keynes' “A ...

YouTube · The Alan Turing Institute · 

May 21, 2021


Coding a Basic WGAN in PyTorch · 272 views

May 22, 2021

Linear Wasserstein GAN (or disco ball) - YouTube 

Linear Wasserstein GAN (or disco ball) ... Comedy • 2011 • English audio.

YouTube · Alper Ahmetoglu · 

Jun 8, 2021

A Sliced Wasserstein Loss for Neural Texture Synthesis 

Our goal is to promote the Sliced Wasserstein Distance as a ... texture synthesis by optimization or training generative neural networks.

GitHub.io · Kenneth V · 

Jan 25, 2021
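For context on this entry, the sliced Wasserstein distance averages one-dimensional Wasserstein distances of random projections, which for two equal-size samples reduces to sorting the projected values; a minimal numpy sketch follows (the number of projections and the toy data are arbitrary choices of mine).

import numpy as np

def sliced_wasserstein2(X, Y, n_proj=100, seed=0):
    # X, Y: (n, d) samples of equal size n. Project onto random unit directions,
    # sort the 1-D projections, and average the mean squared differences,
    # i.e. the 1-D W2^2 between the projected empirical measures.
    rng = np.random.default_rng(seed)
    theta = rng.normal(size=(n_proj, X.shape[1]))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)
    px = np.sort(X @ theta.T, axis=0)    # shape (n, n_proj)
    py = np.sort(Y @ theta.T, axis=0)
    return np.mean((px - py) ** 2)

X = np.random.randn(128, 3)
Y = np.random.randn(128, 3) + 1.0
print(sliced_wasserstein2(X, Y))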

The quantum Wasserstein distance of order 1 - TQC 2021 ..

The quantum Wasserstein distance of order 1 - Giacomo De Palma, Milad Marvian, Dario Trevisan and Seth ...

June 29, 2021 · Uploaded by TQC 20

Abstract:Wasserstein metrics are of central importance in optimal transport and its applications. The seminar will sketch how such metrics ...

YouTube · CoE-MaSS · 

Jul 7, 2021

<——2021———2021———2580——



Rocco Duvenhage: Noncommutative Wasserstein metrics

Scalable Computations of Wasserstein Barycenter via Input ...

Wasserstein Barycenter is a principled approach to represent the weighted mean of a given set of probability distributions, utilizing the ...

CrossMind.ai · 

Jul 19, 2021

Projection Robust Wasserstein Barycenters · SlidesLive

slideslive.com › projection-robust-wasserstein-barycenters

Projection Robust Wasserstein Barycenters. Jul 19, 2021 ... Simultaneous Similarity-based Self-Distillation for Deep Metric Learning.

SlidesLive · 

Jul 19, 2021


Optimal Transport and PDE: Gradient Flows in the Wasserstein Metric · 2,049 views · Streamed live on

Simons Institute

41.2K subscribers

Sep 2, 2021

Gradient Flows in the Wasserstein Metric - YouTube 

Katy Craig (UC Santa Barbara) · https://simons.berkeley.edu/talks/tbd-335 · Geometric Methods in Optimization and Sampling Boot Camp.

YouTube · Simons Institute · 

Sep 3, 2021


Optimal Transport and PDE: Gradient Flows in the ... - YouTube

www.youtube.com › watch

Optimal Transport and PDE: Gradient Flows in the Wasserstein Metric (continued). 1,182 views. Streamed live on Sep 2, 2021.

YouTube · Simons Institute · 

Sep 3, 2021


2021

WGAN with Gradient Penalty and Attention - YouTube

www.youtube.com › watch

Introduction to the Wasserstein distance: ... GAN — Wasserstein GAN & WGAN-GP: ... Improved Training of Wasserstein GANs: ...

YouTube · Moran Reznik · 

Sep 7, 2021
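Many of the WGAN-GP items listed in this section rely on the gradient penalty from "Improved Training of Wasserstein GANs" cited in the entry above; a minimal PyTorch sketch of that penalty term is given below. The toy critic and the assumption of flat (batch, features) inputs are mine; image inputs would need differently shaped interpolation weights.

import torch
import torch.nn as nn

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    # WGAN-GP: penalize deviations of the critic's gradient norm from 1 on random
    # interpolates between real and generated samples (Gulrajani et al., 2017).
    eps = torch.rand(real.size(0), 1)                      # assumes (batch, features) inputs
    x_hat = (eps * real + (1 - eps) * fake).requires_grad_(True)
    grads = torch.autograd.grad(critic(x_hat).sum(), x_hat, create_graph=True)[0]
    return lambda_gp * ((grads.norm(2, dim=1) - 1) ** 2).mean()

critic = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))  # placeholder critic
real, fake = torch.randn(32, 2), torch.randn(32, 2)
print(float(gradient_penalty(critic, real, fake)))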
 

"Wasserstein barycenters from the computational perspective ...

www.youtube.com › watch

Wasserstein barycenter allows to generalize the notion of average object to the space of probability measures and has a number of ...

YouTube · Combgeo Lab · 

Sep 15, 2021


Po-Ling Loh - Robust W-GAN-Based Estimation Under ...

www.youtube.com › watch

... an appropriately defined ball around an uncontaminated distribution. ... Specifically, we analyze properties of Wasserstein GAN-based ...

YouTube · One world theoretical machine learning · 

Sep 23, 2021

Лекция 7. Расстояние Вассерштейна к инвариантному распределению. Дороговцев А. А. · 43 views
Лекция 8. Скорость сходимости к инвариантному распределению в метрике Вассерштейна. Дороговцев А. А. · 31 views

Sep 29, 2021

[Russian: Lecture 7. The Wasserstein distance to the invariant distribution. Lecture 8. Rate of convergence to the invariant distribution in the Wasserstein metric. A. A. Dorogovtsev]



Brownian motion on Wasserstein space and Dean-Kawasaki models. Max von Renesse

85 views  Oct 6, 2021  The session of the seminar "Malliavin Calculus and its Applications"  5th of October, 2021

Oct 5, 2021

Oct 6, 2021

<——2021———2021———2590——



Brownian motion on Wasserstein space and Dean-Kawasaki ...

https://events.imath.kiev.ua › event

Translate this page

Oct 5, 2021 — Brownian motion on Wasserstein space and Dean-Kawasaki models. by Max von Renesse (University of Leipzig).

Brownian motion on Wasserstein space and Dean Kawasaki ...

www.youtube.com › watch

We will survey some results on the construction of candidate models for Brownian motion on Wasserstein space and the connection to the ...

YouTube · Max Planck Science · 

Oct 9, 2021


The quantum Wasserstein distance of order 1

G De Palma, M Marvian, D Trevisan… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org

… Wasserstein distance of order 1 to the quantum states of n qudits. The proposal recovers the Hamming distance for … the classical Wasserstein distance for quantum states diagonal in the …

Cited by 33 Related articles All 12 versions


On quantum Wasserstein distances

31 views  Max Planck Science  YouTube

Oct 10, 2021

On quantum Wasserstein distances - YouTube

www.youtube.com › watch

In this talk, I will describe several approaches to introduce a quantum analogue of the optimal transport problem and related Wasserstein ...

YouTube · Max Planck Science · 

Oct 10, 2021


Training Wasserstein Generative Adversarial Networks Without Gradient Penalties · 706 views · Streamed live on

Training Wasserstein Generative Adversarial Networks .

Dohyun Kwon (University of Wisconsin, Madison)Training Wasserstein Generative Adversarial Networks Without Gradient PenaltiesDynamics and ...

YouTube · Simons Institute · 

Oct 26, 2021

 

Sampling From Wasserstein Barycenter

Simons Institute

(Ecole Centrale de Marseille)

22 views  Streamed live on 

Oct 28, 2021

Sampling From Wasserstein Barycenter - YouTube

www.youtube.com › watch

simons.berkeley.edu/talks/sampling-wasserstein-barycenter · Dynamics and Discretization: ...

YouTube · Simons Institute · 

Oct 29, 2021


2021


Extending the JKO Scheme Beyond ... - Simons Institute

https://simons.berkeley.edu › talks › extending-jko-sche...

Oct 28, 2021 — The JKO scheme has many favorable properties that make it attractive for both theory and numerical simulation. In this talk, I will discuss ...
 Extending the JKO Scheme Beyond Wasserstein-2 Gradient ...

www.youtube.com › watch


Matthew Jacobs (UCLA). https://simons.berkeley.edu/talks/extending-jko-scheme-beyond-wasserstein-2-gradient-flows. Dynamics and Discretization: ...

YouTube · Simons Institute · 

Oct 29, 2021


Arnaud Fickinger (@arnaudfickinger) / Twitter

twitter.com › arnaudfickinger

Find out more about Gromov-Wasserstein imitation learning at our poster Wed ... RL Fine-Tuning searches over the neural parameter space at inference time by ...

Twitter · 

Dec 8, 2021


2021

Arnaud Fickinger on Twitter: "We provably achieve cross ...

twitter.com › arnaudfickinger › status


... by minimizing the Gromov-Wasserstein distance with deep RL. ... of the state-action space, and propose Gromov-Wasserstein Imitation ...

Twitter · 

Oct 11, 2021

 

2021

Clément Canonne on Twitter: "Ninth #AcademicSpotlight ...

twitter.com › ccanonne_ › status


of the k-NN classifier on spaces of proba measures under p-Wasserstein distance. From studying geometric properties of Wasserstein spaces, ...

Twitter · 

Aug 21, 2021


2021

Wasserstein Embedding for Graph Learning (WEGL)

slideslive.com › wasserstein-embedding-for-graph-learnin...


We present Wasserstein Embedding for Graph Learning (WEGL), ... speech recognition, text understanding, gaming, and robotics.

SlidesLive · 

May 3, 2021

<——2021———2021———2600——


2021

Primal Wasserstein Imitation Learning - SlidesLive

slideslive.com › primal-wasserstein-imitation-learning

... important application areas such as machine vision, computational biology, speech recognition, text understanding, gaming, and 

May 3, 2021


Nonlocal Wasserstein Distance and the Associated Gradient ...

www.youtube.com › watch 

... Mellon University). https://simons.berkeley.edu/talks/nonlocal-wasserstein-distance-and-associated-gradient-flows. Dynamics and Discretiz...

YouTube · Simons Institute · 

Oct 26, 2021·  


[PDF] aaai.org

Visual transfer for reinforcement learning via wasserstein domain confusion

J Roy, GD Konidaris - Proceedings of the AAAI Conference on Artificial …, 2021 - ojs.aaai.org

… We have introduced a new method, WAPPO, that does not. Instead, it uses a novel Wasserstein Confusion objective term to force the RL agent to learn a mapping from visually distinct …

Cited by 9 Related articles All 9 versions



2021  PATENT

Ship body attachment rapid identification method based on WGAN-GP and YOLO

ZHU DAQI ; CHU ZHENZHONG ; CHEN QI ; REN CHENHUI, 2021

OPEN ACCESS

Ship body attachment rapid identification method based on WGAN-GP and YOLO

No Online Access 


2021 see 2022  ARTICLE

Trust the Critics: Generatorless and Multipurpose WGANs with Initial Convergence Guarantees

Milne, Tristan ; Bilocq, Étienne ; Nachman, Adrian. arXiv.org, 2021

OPEN ACCESS

Trust the Critics: Generatorless and Multipurpose WGANs with Initial Convergence Guarantees

Available Online 


2021

  

2021 PATENT

WGAN-based one-dimensional time series data augmentation method

FENG QIANG ; WU ZEYU ; YANG DEZHEN ; REN YI ; SUN BO ; WANG ZILI ; QIAN CHENG, 2021

OPEN ACCESS

WGAN-based one-dimensional time series data augmentation method

No Online Access 

 

PATENT

Bearing residual life prediction method based on improved residual network and WGAN

SHEN YANXIA ; XU JIAJIE ; ZHAO ZHIPU, 2021

OPEN ACCESS

Bearing residual life prediction method based on improved residual network and WGAN

No Online Access 

 

PATENT

Process industrial soft measurement data supplement method based on SVAE-WGAN

TIAN RAN ; GAO SHIWEI ; QIU SULONG ; ZHANG QINGSONG ; MA ZHONGYU ; LIU YANXING ; XU JINPENG, 2021

OPEN ACCESS

Process industrial soft measurement data supplement method based on SVAE-WGAN

No Online Access 

 

2021 √  DATASET

Infrared Image Super-Resolution via Heterogeneous Convolutional WGAN

Huang, Yongsong ; Huang, Yongsong, 2021

OPEN ACCESS

 

PATENT

Wind power output power prediction method based on isolated forest and WGAN network

XU ZHIWEI ; LIU GUANGWEN ; WANG YONGSHENG ; XU HAO ; WU YUHAO ; SU XIAOMING, 2021

OPEN ACCESS

Wind power output power prediction method based on isolated forest and WGAN network

No Online Access 

<——2021———2021———2610——



PATENT

Radar HRRP database construction method based on WGAN-GP

MA PEIWEN ; DING JUN ; JIU BO ; WANG PENGHUI ; LIU HONGWEI ; CHEN BO, 2021

OPEN ACCESS

Radar HRRP database construction method based on WGAN-GP

No Online Access 


 

ARTICLE

WGAN with an Infinitely Wide Generator Has No Spurious Stationary Points

Albert No ; Yoon, TaeHo ; Kwon, Sehyun ; Ryu, Ernest K. arXiv.org, 2021

OPEN ACCESS

WGAN with an Infinitely Wide Generator Has No Spurious Stationary Points

Available Online 


 

2021 PATENT

Unsupervised multi-view three-dimensional point cloud joint registration method based on WGAN

LIU MIN ; JIANG YIMING ; WU HAOTIAN ; MAO JIANXU ; PENG WEIXING ; ZHANG HUI ; ZHU QING ; ZHAO JIAWEN ; WANG YAONAN, 2021

OPEN ACCESS

Unsupervised multi-view three-dimensional point cloud joint registration method based on WGAN

No Online Access



NEWSLETTER ARTICLE

New Computers Data Have Been Reported by Investigators at Sichuan University (Spam Transaction Attack Detection Model Based On Gru and Wgan-div)

Computer Weekly News, 2021, p.490

New Computers Data Have Been Reported by Investigators at Sichuan University (Spam Transaction Attack Detection Model Based On Gru and Wgan-div)

Available Online  


ARTICLE

Research on WGAN models with Rényi Differential Privacy

Lee, Sujin ; Park, Cheolhee ; Hong, Dowon ; Kim, Jae-kum. Chŏngbo Kwahakhoe nonmunji, 2021, Vol.48 (1), p.128-140

Research on WGAN models with Rényi Differential Privacy


NEWSLETTER ARTICLE

Hebei University of Technology Researchers Update Understanding of Robotics (Low-Illumination Image Enhancement in the Space Environment Based on the DC-WGAN Algorithm)

Journal of Engineering, 2021, p.737

Hebei University of Technology Researchers Update Understanding of Robotics (Low-Illumination Image Enhancement in the Space Environment Based on the DC-WGAN Algorithm)

Available Online 


2021


Jalal Kazempour (@JalalKazempour) / Twitter

twitter.com › jalalkazempour


How to make Wasserstein distributionally robust optimal power flow aware of ... grids” has won the 2022 Best IEEE-Transaction on Power System award, ...

Twitter · 

Aug 24, 2021

BOOK CHAPTER

Lagrangian schemes for Wasserstein Gradient Flows

Carrillo, Jose ; Matthes, Daniel ; Wolfram, Marie-Therese, 2021, p.271-311

Lagrangian schemes for Wasserstein Gradient Flows

No Online Access 



BOOK CHAPTER

Selective Multi-source Transfer Learning with Wasserstein Domain Distance for Financial Fraud Detection

Sun, Yifu ; Lan, Lijun ; Zhao, Xueyao ; Fan, Mengdi ; Guo, Qingyu ; Li, Chao. Intelligent Computing and Block Chain, 2021, p.489-505

PEER REVIEWED

Selective Multi-source Transfer Learning with Wasserstein Domain Distance for Financial Fraud Detection

Available Online 


 

BOOK CHAPTER

La métrique de Wasserstein [The Wasserstein metric]

HOANG, Lê Nguyên. La Formule du Savoir, 2021


 2021
Intensity-Based Wasserstein Distance As A Loss Measure...
by Shams, Roozbeh; Le, William; Weihs, Adrien ; More...
2021 IEEE 18th International Symposium on Biomedical Imaging (ISBI), 04/2021
Traditional pairwise medical image registration techniques are based on computationally intensive frameworks due to numerical optimization procedures. While...
Conference proceeding  Full Text Online

<——2021———2021———2620——


2021  thesis
Wasserstein adversarial transformer for cloud workload prediction. Authors: Shivani Gajanan Arbat (Author), In Kee Kim (Degree supervisor), University of Georgia (Degree granting institution)
Summary:Resource provisioning is essential to optimize cloud operating costs and the performance of cloud applications. Understanding job arrival rates is critical for predicting future workloads to determine the proper amount of resources for provisioning. However, due to the dynamic patterns of cloud workloads, developing a model to accurately forecast job arrival rates is a challenging task. Previously, various prediction models, including Long-Short-Term-Memory (LSTM), have been employed to address the cloud workload prediction problem. Unfortunately, the current state-of-the-art LSTM model leverages recurrences to make a prediction, resulting in increased complexity and degraded computational efficiency as input sequences grow longer. To achieve both higher prediction accuracy and better computational efficiency, this work presents a novel time-series forecasting model for cloud resource provisioning, called WGAN-gp (Wasserstein Generative Adversarial Network with gradient penalty) Transformer. WGAN-gp Transformer is inspired by Transformer network and improved WGAN (Wasserstein Generative Adversarial Networks). Our proposed method adopts a Transformer network as the generator and a multi-layer perceptron network as a critic to improve the overall forecasting performance. WGAN-gp also employs MADGRAD (Momentumized, Adaptive, Dual Averaged Gradient Method for Stochastic Optimization) as the model's optimizer due its ability to converge faster and generalize better. Extensive experiments on the various real-world cloud workload datasets show improved performance and efficiency of our method. In particular, WGAN-gp Transformer shows 5x faster inference time with up to 5.1% higher prediction accuracy than the state-of-the-art workload prediction technique. Such faster inference time and higher prediction accuracy can be effectively used by cloud resource provisioning and autoscaling mechanisms. We then apply our model to cloud autoscaling and evaluate it on Google Cloud Platform with Facebook and Google cluster traces. We discuss the evaluation results showcasing that WGAN-gp Transformer-based autoscaling mechanism outperforms autoscaling with LSTM by reducing virtual machine over-provisioningShow more
Thesis, Dissertation, 2021
English
Publication:Masters Abstracts International
Publisher:ProQuest Dissertations & Theses, Ann Arbor, 2021



2021 6

Wasserstein Perturbations of Markovian Transition Semigroups

books.google.com › books

Sven Fuhrmann, Michael Kupper, Max Nendel · 2021 · No preview

In this paper, we deal with a class of time-homogeneous continuous-time Markov processes with transition probabilities bearing a nonparametric uncertainty.


2021 7
Trust the Critics: Generatorless and Multipurpose WGANs with Initial Convergence GuaranteesAuthors:Milne, Tristan (Creator), Bilocq, Étienne (Creator), Nachman, Adrian (Creator)
Summary:Inspired by ideas from optimal transport theory we present Trust the Critics (TTC), a new algorithm for generative modelling. This algorithm eliminates the trainable generator from a Wasserstein GAN; instead, it iteratively modifies the source data using gradient descent on a sequence of trained critic networks. This is motivated in part by the misalignment which we observed between the optimal transport directions provided by the gradients of the critic and the directions in which data points actually move when parametrized by a trainable generator. Previous work has arrived at similar ideas from different viewpoints, but our basis in optimal transport theory motivates the choice of an adaptive step size which greatly accelerates convergence compared to a constant step size. Using this step size rule, we prove an initial geometric convergence rate in the case of source distributions with densities. These convergence rates cease to apply only when a non-negligible set of generated data is essentially indistinguishable from real data. Resolving the misalignment issue improves performance, which we demonstrate in experiments that show that given a fixed number of training epochs, TTC produces higher quality images than a comparable WGAN, albeit at increased memory requirements. In addition, TTC provides an iterative formula for the transformed density, which traditional WGANs do not. Finally, TTC can be applied to map any source distribution onto any target; we demonstrate through experiments that TTC can obtain competitive performance in image generation, translation, and denoising without dedicated algorithmsShow more
Downloadable Archival Material, 2021-11-29
Undefined
Publisher:2021-11-29
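To make the TTC procedure described above concrete, here is a minimal toy sketch in PyTorch. It is my own reconstruction, not the authors' code: at each stage a fresh critic is trained with a WGAN-GP-style objective to separate the current particles from the target sample, and the particles are then pushed along the critic's gradient with a step proportional to the estimated Wasserstein-1 distance (the adaptive step size the abstract emphasizes). The names Critic, ttc_stage and theta, and the 2-D Gaussian toy data, are illustrative assumptions.

import torch
import torch.nn as nn

class Critic(nn.Module):
    def __init__(self, dim=2):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 64), nn.ReLU(),
                                 nn.Linear(64, 64), nn.ReLU(),
                                 nn.Linear(64, 1))
    def forward(self, x):
        return self.net(x).squeeze(-1)

def gradient_penalty(critic, real, fake):
    # penalty toward unit gradient norm at random interpolates (WGAN-GP style)
    eps = torch.rand(real.size(0), 1)
    mix = (eps * real + (1 - eps) * fake).requires_grad_(True)
    grad = torch.autograd.grad(critic(mix).sum(), mix, create_graph=True)[0]
    return ((grad.norm(dim=1) - 1) ** 2).mean()

def ttc_stage(particles, target, critic_steps=200, lr=1e-3, theta=0.5):
    critic = Critic(particles.size(1))
    opt = torch.optim.Adam(critic.parameters(), lr=lr)
    for _ in range(critic_steps):
        opt.zero_grad()
        loss = (critic(particles).mean() - critic(target).mean()
                + 10.0 * gradient_penalty(critic, target, particles))
        loss.backward()
        opt.step()
    with torch.no_grad():
        w1_est = critic(target).mean() - critic(particles).mean()  # rough W1 estimate
    # move each particle a fraction theta of the estimated distance uphill along the critic
    x = particles.clone().requires_grad_(True)
    g = torch.autograd.grad(critic(x).sum(), x)[0]
    step = theta * w1_est / (g.norm(dim=1, keepdim=True) + 1e-12)
    return (x + step * g).detach()

target = torch.randn(512, 2) + torch.tensor([3.0, 0.0])  # toy target distribution
particles = torch.randn(512, 2)                          # toy source distribution
for _ in range(5):                                       # a few TTC stages
    particles = ttc_stage(particles, target)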



2021 see 2022

EEG Generation of Virtual Channels Using an Improved Wasserstein Generative Adversarial Networks

Li, LL; Cao, GZ; (...); Zhang, YP

15th International Conference on Intelligent Robotics and Applications (ICIRA ) - Smart Robotics for Society

INTELLIGENT ROBOTICS AND APPLICATIONS (ICIRA 2022), PT IV

13458 , pp.386-39

Enriched Cited References

Aiming at enhancing classification performance and improving user experience of a brain-computer interface (BCI) system, this paper proposes an improved Wasserstein generative adversarial networks (WGAN) method to generate EEG samples in virtual channels. The feature extractor and the proposed WGAN model with a novel designed feature loss are trained. Then artificial EEG of virtual channels are

Show more

Full Text at Publisher

25 References  Related records



2021 see 2022 2023

Wasserstein asymptotics for the empirical measure of fractional Brownian motion on a flat torus

Huesmann, M; Mattesini, F and Trevisan, D

Jan 2023 | 

STOCHASTIC PROCESSES AND THEIR APPLICATIONS

 155 , pp.1-26

We establish asymptotic upper and lower bounds for the Wasserstein distance of any order p >= 1 between the empirical measure of a fractional Brownian motion on a flat torus and the uniform Lebesgue measure. Our inequalities reveal an interesting interaction between the Hurst index H and the dimension d of the state space, with a "phase-transition" in the rates when d = 2+1/H, akin to the Ajtai

Show more

Free Submitted Article From Repository · View full text

36 References  Related records 


2021


2021 see 2022

Improved Wasserstein Generative Adversarial Networks Defense Method Against Data Integrity Attack on Smart Grid

Li, YC; Wang, X and Zeng, J

2022 | 

RECENT ADVANCES IN ELECTRICAL & ELECTRONIC ENGINEERING

 15 (3) , pp.243-254

Enriched Cited References

Background: As the integration of communication networks with power systems is getting closer, the number of malicious attacks against the cyber-physical power system is increasing substantially. The data integrity attack can tamper with the measurement information collected by Supervisory Control and Data Acquisition (SCADA), which can affect important decisions of the power grid and pose a si

Show more

View full text

31 References  Related records


2021

Optimal continuous-singular control of stochastic McKean-Vlasov system in Wasserstein space of probability measures

Boukaf, S; Guenane, L and Hafayed, M

INTERNATIONAL JOURNAL OF DYNAMICAL SYSTEMS AND DIFFERENTIAL EQUATIONS

 12 (4) , pp.301-315

In this paper, we study the local form of maximum principle for optimal stochastic continuous-singular control of nonlinear Ito stochastic differential equation of McKean-Vlasov type, with incomplete information. The coefficients of the system are nonlinear and depend on the state process as well as its probability law. The control variable is allowed to enter into both drift and diffusion coefficients. The action space is assumed to be convex. The proof of our local maximum principle is based on the differentiability with respect to the probability law in Wasserstein space of probability measures with some appropriate estimates.

Show less

View full text



2021  Peer-reviewed
GMT-WGAN: An Adversarial Sample Expansion Method for Ground Moving Targets Classification
Authors: Xin Yao, Xiaoran Shi, Yaxin Li, Li Wang, Han Wang, Shijie Ren, Feng Zhou
Summary:In the field of target classification, detecting a ground moving target that is easily covered in clutter has been a challenge. In addition, traditional feature extraction techniques and classification methods usually rely on strong subjective factors and prior knowledge, which affect their generalization capacity. Most existing deep-learning-based methods suffer from insufficient feature learning due to the lack of data samples, which makes it difficult for the training process to converge to a steady-state. To overcome these limitations, this paper proposes a Wasserstein generative adversarial network (WGAN) sample enhancement method for ground moving target classification (GMT-WGAN). First, the micro-Doppler characteristics of ground moving targets are analyzed. Next, a WGAN is constructed to generate effective time-frequency images of ground moving targets and thereby enrich the sample database used to train the classification network. Then, image quality evaluation indexes are introduced to evaluate the generated spectrogram samples, with an aim to verify the distribution similarity of generated and real samples. Afterward, by feeding augmented samples to the deep convolutional neural networks with good generalization capacity, the classification performance of the GMT-WGAN is improved. Finally, experiments conducted on different datasets validate the effectiveness and robustness of the proposed methodShow mor
Downloadable Article, 2021
Publication:14, 20211201, 123
Publisher:2021


2021
Fault Diagnosis Method Based on CWGAN-GP-1DCNN
Authors: Shuangyin Liu, Chuanyun Liu, Yacui Gao, Hang Yin. 2021 IEEE 24th International Conference on Computational Science and Engineering (CSE)
Summary:In the actual industrial process, the fault data collection is difficult, and the fault sample is insufficient. The Imbalanced datasets is the main problem that is faced at present. However, the fault diagnosis method based on model optimization has over-fitting phenomenon in the training process. Therefore, using data enhancement methods to provide effective and sufficient fault samples for fault detection and diagnosis is a research hotspot to deal the data imbalance problem. To solve this problem, in this paper, a Conditional Wasserstein Generative Adversarial Network (CWGAN-GP1DCNN) with gradient penalty based on one dimensional Convolutional Neural Network is proposed to enhance the data of real fault samples to detect all kinds of bearing faults. Experimental results show that the proposed method can effectively enhance the sample data, improve the diagnosis accuracy under the condition of unbalanced fault samples, and has good robustness and effectivenessShow more
Chapter, 2021
Publication:2021 IEEE 24th International Conference on Computational Science and Engineering (CSE), 202110, 20
Publisher:2021


使用WGAN-GP對臉部馬賽克進行眼睛補圖 = Eye In-painting Using WGAN-GP for Face Images with Mosaic [Shi yong WGAN-GP dui lian bu ma sai ke jin xing yan jing bu tu]
Authors: 吳承軒 (Cheng Hsuan Wu); 張賢宗 (H. T. Chang / Xianzong Zhang)
Thesis, Dissertation, 2019 [Minguo 108]
Chinese, first edition (chu ban)
Publisher: 長庚大學 (Chang Gung University), Taoyuan City, 2019 [Minguo 108]

<——2021———2021———2630——


conference
Learning embeddings into entropic Wasserstein spaces. Authors: Frogner, C (Creator), Solomon, J (Creator), Mirzazadeh, F (Creator)
Summary:© 7th International Conference on Learning Representations, ICLR 2019. All Rights Reserved. Euclidean embeddings of data are fundamentally limited in their ability to capture latent semantic structures, which need not conform to Euclidean spatial assumptions. Here we consider an alternative, which embeds data as discrete probability distributions in a Wasserstein space, endowed with an optimal transport metric. Wasserstein spaces are much larger and more flexible than Euclidean spaces, in that they can successfully embed a wider variety of metric structures. We exploit this flexibility by learning an embedding that captures semantic information in the Wasserstein distance between embedded distributions. We examine empirically the representational capacity of our learned Wasserstein embeddings, showing that they can embed a wide variety of metric structures with smaller distortion than an equivalent Euclidean embedding. We also investigate an application to word embedding, demonstrating a unique advantage of Wasserstein embeddings: We can visualize the high-dimensional embedding directly, since it is a probability distribution on a low-dimensional space. This obviates the need for dimensionality reduction techniques like t-SNE for visualizationShow more
Downloadable Archival Material, 2021-11-08T17:43:35Z
English
Publisher:2021-11-08T17:43:35Z
Access Free


2021
Parallel Streaming Wasserstein Barycenters
Authors:Massachusetts Institute of Technology Computer Science and Artificial Intelligence Laboratory (Contributor), Staib, Matthew (Creator), Claici, Sebastian (Creator), Solomon, Justin (Creator), Jegelka, Stefanie (Creator)Show more
Summary:© 2017 Neural information processing systems foundation. All rights reserved. Efficiently aggregating data from different sources is a challenging problem, particularly when samples from each source are distributed differently. These differences can be inherent to the inference task or present for other reasons: sensors in a sensor network may be placed far apart, affecting their individual measurements. Conversely, it is computationally advantageous to split Bayesian inference tasks across subsets of data, but data need not be identically distributed across subsets. One principled way to fuse probability distributions is via the lens of optimal transport: the Wasserstein barycenter is a single distribution that summarizes a collection of input measures while respecting their geometry. However, computing the barycenter scales poorly and requires discretization of all input distributions and the barycenter itself. Improving on this situation, we present a scalable, communication-efficient, parallel algorithm for computing the Wasserstein barycenter of arbitrary distributions. Our algorithm can operate directly on continuous input distributions and is optimized for streaming data. Our method is even robust to nonstationary input distributions and produces a barycenter estimate that tracks the input measures over time. The algorithm is semi-discrete, needing to discretize only the barycenter estimate. To the best of our knowledge, we also provide the first bounds on the quality of the approximate barycenter as the discretization becomes finer. Finally, we demonstrate the practical effectiveness of our method, both in tracking moving distributions on a sphere, as well as in a large-scale Bayesian inference taskShow more
Downloadable Archival Material, 2021-11-08T21:18:59Z
English
Publisher:2021-11-08T21:18:59Z
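As a minimal illustration of the object this record is about (not of the authors' streaming algorithm): in one dimension the 2-Wasserstein barycenter of empirical measures is obtained by averaging their quantile functions, which for equal-size samples reduces to averaging the sorted samples. The function name and the toy Gaussians below are my own choices.

import numpy as np

def barycenter_1d(samples, weights=None):
    # samples: list of equal-length 1-D arrays; returns the barycenter's support points
    sorted_samples = np.stack([np.sort(s) for s in samples])  # each row = empirical quantiles
    if weights is None:
        weights = np.full(len(samples), 1.0 / len(samples))
    return weights @ sorted_samples                           # weighted quantile average

rng = np.random.default_rng(0)
mu = rng.normal(-2.0, 1.0, size=1000)   # N(-2, 1)
nu = rng.normal(+2.0, 0.5, size=1000)   # N(+2, 0.25)
bary = barycenter_1d([mu, nu])
print(bary.mean(), bary.std())          # close to the Gaussian barycenter N(0, 0.75^2)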

2021 eBook, Book
An invitation to optimal transport, Wasserstein distances, and gradient flows
Authors:Alessio Figalli (Author), Federico Glaudo (Author)
eBook, 2021
English
Publisher:European Mathematical Society, [Zürich, Switzerland], 2021
Also available asPrint Book
View AllFormats & Editions

PaperView: Generalized Wasserstein Dice Score for ...

web.cs.ucla.edu › ~shayan › video › generalized-wasserst...


Soft generalisations of the Dice score allow it to be used as a loss function for training convolutional neural networks (CNN). Although CNNs ...

UCLA Computer Science · 



Wasserstein-2 Generative Networks [in Russian] - YouTube

www.youtube.com › watch


Bayesian Methods in Machine Learning, Fall 2020. Wasserstein-2 Generative Networks [in Russian]. 305 views 1 year ago. BayesGroup.ru.

YouTube · BayesGroup.ru · 

Feb 21, 2021


2021


Efficient Wasserstein Natural Gradients for Reinforcement ...

slideslive.com › efficient-wasserstein-natural-gradients-for...


A novel optimization approach is proposed for application to policy gradient methods and evolution strategies for reinforcement learning ...

SlidesLive · 

May 3, 2021


 Nicolas Courty (@nicolas_courty) / Twitter

twitter.com › nicolas_courty


Efficient Gradient Flows in Sliced-Wasserstein Space Clément Bonet, ... will present 'Optimal transport for graph signal processing & learning' & 2 recent ...

Twitter · 

May 19, 2021



De Novo Protein Design for Novel Folds with Guided  Wasserstein GAN

www.youtube.com › watch

Conditional Wasserstein GAN - Yang Shen - 3DSIG - ISMB 2020.

YouTube · ISCB · 

Jan 13, 2021


2021

Poster A-06: Intrinsic Sliced Wasserstein Distances ... - YouTube

www.youtube.com › watch

Collections of probability distributions arise in a variety of statistical applications ranging from user activity pattern analysis to brain ...

YouTube · Bay Learn · 

Oct 28, 2021


2021

slideslive.com › lineartime-gromov-wasserstein-distances-...

Linear-Time Gromov Wasserstein Distances using Low Rank Couplings and Costs. Dec 6, 2021. Speakers.

SlidesLive · 

Dec 6, 2021

<——2021———2021———2640——



2021

OWOS: Bernd Sturmfels - "Wasserstein Distance to ... - YouTube

www.youtube.com › watch

The fifteenth talk in the third season of the One World Optimization Seminar given on April 26th, 2021, by Bernd Sturmfels (MPI Leipzig & UC ...

YouTube · One World Optimization Seminar · 

Apr 26, 2021

 

Niloy Biswas - Bounding Wasserstein distance with couplings

www.youtube.com › watch

This video was contributed to the workshop https://bayescomp-isba.github.io/measuringquality.html.

YouTube · ISBA - International Society of Bayesian Analysis · 

Oct 5, 2021


Geometry of covariance matrices, covariance operators ... - VinAI

www.vinai.io › seminar-posts › geometry-of-covariance-...

He has also held positions at the University of Vienna, Austria, ... Regularization of the 2-Wasserstein distance (from Optimal Transport).

VinAI · VinAI Research · 


2021 see 2022

Schema matching using Gaussian mixture models with Wasserstein distance

M Przyborowski, M Pabiś, A Janusz… - arXiv preprint arXiv …, 2021 - arxiv.org

… From the viewpoint of optimal transport theory, the Wasserstein distance is an important … 

In this paper we derive one of possible approximations of Wasserstein distances computed …
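A hedged aside on the building block that Gaussian-mixture approximations of this kind usually rest on: the 2-Wasserstein distance between two Gaussian components has a closed form (squared mean difference plus the Bures distance between covariances). The sketch below states that standard formula, not the paper's particular approximation; the helper name w2_gaussian is mine.

import numpy as np
from scipy.linalg import sqrtm

def w2_gaussian(m1, C1, m2, C2):
    # 2-Wasserstein distance between N(m1, C1) and N(m2, C2)
    s2 = sqrtm(C2)
    cross = np.real(sqrtm(s2 @ C1 @ s2))   # numerical noise can add a tiny imaginary part
    bures = np.trace(C1 + C2 - 2.0 * cross)
    return np.sqrt(np.sum((m1 - m2) ** 2) + max(bures, 0.0))

m1, C1 = np.zeros(2), np.eye(2)
m2, C2 = np.array([3.0, 0.0]), 2.0 * np.eye(2)
print(w2_gaussian(m1, C1, m2, C2))         # sqrt(9 + 2*(3 - 2*sqrt(2))) ≈ 3.06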


2021

On a Linear Gromov-Wasserstein Distance.

Beier, Florian; Beinert, Robert and Steidl, Gabriele

IEEE transactions on image processing : a publication of the IEEE Signal Processing Society

 31 , pp.7292-7305

Gromov-Wasserstein distances are generalization of Wasserstein distances, which are invariant under distance preserving transformations. Although a simplified version of optimal transport in Wasserstein spaces, called linear optimal transport (LOT), was successfully used in practice, there does not exist a notion of linear Gromov-Wasserstein distances so far. In this paper, we propose a definit

Show more

Free Submitted Article From Repository · Full Text at Publisher


2021


FAST SINKHORN I: AN O(N) ALGORITHM FOR THE WASSERSTEIN-1 METRIC

Liao, QC; Chen, J; (...); Wu, H

COMMUNICATIONS IN MATHEMATICAL SCIENCES

 20 (7) , pp.2053-2067

The Wasserstein metric is broadly used in optimal transport for comparing two probabilistic distributions, with successful applications in various fields such as machine learning, signal processing, seismic inversion, etc. Nevertheless, the high computational complexity is an obstacle for its practical applications. The Sinkhorn algorithm, one of the main methods in computing the Wasserstein
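For orientation, the standard Sinkhorn iteration that this paper accelerates can be written in a few lines of NumPy. This is a generic entropic-OT sketch under assumed toy inputs, not the paper's O(N) algorithm; eps and the histograms are arbitrary choices.

import numpy as np

def sinkhorn(a, b, C, eps=0.05, iters=500):
    # a, b: histograms summing to 1; C: cost matrix; returns the transport plan
    K = np.exp(-C / eps)              # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(iters):
        v = b / (K.T @ u)             # alternate row/column scaling
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]

n = 50
x = np.linspace(0, 1, n)
C = np.abs(x[:, None] - x[None, :])   # |x - y| cost, so the OT value approximates W1
a = np.full(n, 1.0 / n)
b = np.exp(-(x - 0.7) ** 2 / 0.01); b /= b.sum()
P = sinkhorn(a, b, C)
print((P * C).sum())                  # entropic approximation of W1(a, b)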


Google Nest Cam (battery) Anti-Theft Mount - Wasserstein

wasserstein-home.com › Collections › All


Easy to install and looks totally integrated. Simple device that adds peace of mind. Recommends this product. Yes.

Wasserstein · Wasserstein Home · 

Dec 8, 2021


2021  [PDF] julienmalka.me

[PDF] Gromov-Wasserstein Optimal Transport for Heterogeneous Domain Adaptation

J Malka, R Flamary, N Courty - julienmalka.me

… We’ll notably present some new methods using the Gromov-Wasserstein optimal transport problems. … The Gromov-Wasserstein formulation of the optimal transport is then written as: …

 Related articles 


 2021

Krishnaswamy Lab (@KrishnaswamyLab) / Twitter

twitter.com › krishnaswamylab


(2/n) The Optimal Transport (OT) problem gives rise to the Wasserstein distance. ... 2022, we're introducing a novel scalar curvature for high dimensional ...

Twitter · 

Feb 8, 2021



<——2021———2021——2650——


 

 
WGAN with an Infinitely Wide Generator Has No Spurious Stationary Points
Authors:No, Albert (Creator), Yoon, TaeHo (Creator), Kwon, Sehyun (Creator), Ryu, Ernest K. (Creator)
Summary:Generative adversarial networks (GAN) are a widely used class of deep generative models, but their minimax training dynamics are not understood very well. In this work, we show that GANs with a 2-layer infinite-width generator and a 2-layer finite-width discriminator trained with stochastic gradient ascent-descent have no spurious stationary points. We then show that when the width of the generator is finite but wide, there are no spurious stationary points within a ball whose radius becomes arbitrarily large (to cover the entire parameter space) as the width goes to infinityShow more
Downloadable Archival Material, 2021-02-15
Undefined
Publisher:2021-02-15

2021  ebook  book
MultiMedia Modeling : 27th International Conference, MMM 2021, Prague, Czech Republic, June 22-24, 2021 : proceedings. Part I
Authors: International Conference on Multi-Media Modeling; Jakub Lokoč (Editor), Tomas Skopal (Editor), Klaus Schoeffmann (Editor), Vasileios Mezaris (Editor), Xirong Li (Editor), Stefanos Vrochidis (Editor), Ioannis Patras (Editor)
2021  Summary:The two-volume set LNCS 12572 and 1273 constitutes the thoroughly refereed proceedings of the 27th International Conference on MultiMedia Modeling, MMM 2021, held in Prague, Czech Republic, in June2021. Of the 211 submitted regular papers, 40 papers were selected for oral presentation and 33 for poster presentation; 16 special session papers were accepted as well as 2 papers for a demo presentation and 17 papers for participation at the Video Browser Showdown 2021. The papers cover topics such as: multimedia indexing; multimedia mining; multimedia abstraction and summarization; multimedia annotation, tagging and recommendation; multimodal analysis for retrieval applications; semantic analysis of multimedia and contextual data; multimedia fusion methods; multimedia hyperlinking; media content browsing and retrieval tools; media representation and algorithms; audio, image, video processing, coding and compression; multimedia sensors and interaction modes; multimedia privacy, security and content protection; multimedia standards and related issues; advances in multimedia networking and streaming; multimedia databases, content delivery and transport; wireless and mobile multimedia networking; multi-camera and multi-view systems; augmented and virtual reality, virtual environments; real-time and interactive multimedia applications; mobile multimedia applications; multimedia web applications; multimedia authoring and personalization; interactive multimedia and interfaces; sensor networks; social and educational multimedia applications; and emerging trendsShow more
eBook, 2021
English
Publisher:Springer, Cham, 2021
Also available asPrint Book
View AllFormats & Editions



2021  Peer-reviewed
A Liver Segmentation Method Based on the Fusion of VNet and WGAN
Authors: Jinlin Ma, Yuanyuan Deng, Ziping Ma, Kaiji Mao, Yong Chen
Summary:Accurate segmentation of liver images is an essential step in liver disease diagnosis, treatment planning, and prognosis. In recent years, although liver segmentation methods based on 2D convolutional neural networks have achieved good results, there is still a lack of interlayer information that causes severe loss of segmentation accuracy to a certain extent. Meanwhile, making the best of high-level and low-level features more effectively in a 2D segmentation network is a challenging problem. Therefore, we designed and implemented a 2.5-dimensional convolutional neural network, VNet_WGAN, to improve the accuracy of liver segmentation. First, we chose three adjacent layers of a liver model as the input of our network and adopted two convolution kernels in series connection, which can integrate cross-sectional spatial information and interlayer information of liver models. Second, a chain residual pooling module is added to fuse multilevel feature information to optimize the skip connection. Finally, the boundary loss function in the generator is employed to compensate for the lack of marginal pixel accuracy in the Dice loss function. The effectiveness of the proposed method is verified on two datasets, LiTS and CHAOS. The Dice coefficients are 92% and 90%, respectively, which are better than those of the compared segmentation networks. In addition, the experimental results also show that the proposed method can reduce computational consumption while retaining higher segmentation accuracy, which is significant for liver segmentation in practice and provides a favorable reference for clinicians in liver segmentationShow more
Article, 2021
Publication:Computational and Mathematical Methods in Medicine, 2021, 20211008
Publisher:2021



2021 thesis
Automatic Generation of Floorplans using Generative Adversarial Networks
Authors:Sneha Suhitha Galiveeti (Author), Chitaranjan Das (Thesis advisor)
Summary:In the present day, demand for construction of houses is increasing rapidly. But creating and designing a floorplan requires a lot of creativity, technical knowledge and mathematical skills. The number of architects with these skills are not adequate to meet the requirements of the rapidly growing demand. We can use Machine Learning to solve this problem of floorplan generation. This project explores the idea of generation of multiple floorplans using deep learning models especially Generative Adversarial Networks(GANs). This work concentrates on the generation of rasterized of floorplans. The main approach is to let GAN treat floorplans as raster images and learn their distribution to produce new floorplans. This work explored multiple models of GANs like simple DCGAN, LSGAN, WGAN, StyleGAN etc. and studied the pros and cons of each model over two major datasets Structure3D and Graph2Plan. This work also explored the conditional generation of floorplans i.e., controlling the layout of generated floorplans by giving input condition to the models in terms of types of rooms presentShow more
Thesis, Dissertation, 2021
English
Publisher:Pennsylvania State University, [University Park, Pennsylvania], 2021

2021
Automatic Visual Inspection of Rare Defects: A Framework based on GP-WGAN and Enhanced Faster R-CNN
Authors:Jalayer, Masoud (Creator), Jalayer, Reza (Creator), Kaboli, Amin (Creator), Orsenigo, Carlotta (Creator), Vercellis, Carlo (Creator)
Summary:A current trend in industries such as semiconductors and foundry is to shift their visual inspection processes to Automatic Visual Inspection (AVI) systems, to reduce their costs, mistakes, and dependency on human experts. This paper proposes a two-staged fault diagnosis framework for AVI systems. In the first stage, a generation model is designed to synthesize new samples based on real samples. The proposed augmentation algorithm extracts objects from the real samples and blends them randomly, to generate new samples and enhance the performance of the image processor. In the second stage, an improved deep learning architecture based on Faster R-CNN, Feature Pyramid Network (FPN), and a Residual Network is proposed to perform object detection on the enhanced dataset. The performance of the algorithm is validated and evaluated on two multi-class datasets. The experimental results performed over a range of imbalance severities demonstrate the superiority of the proposed framework compared to other solutionsShow more
Downloadable Archival Material, 2021-05-02
Undefined
Publisher:2021-05-02

 2021

 

2021 Peer-reviewed
Low-illumination image enhancement in the space environment based on the DC-WGAN algorithm
Authors: Zhang M., Zhang Y., Lv X., Guo C., Jiang Z.
Article, 2021
Publication:Sensors (Switzerland), 21, 2021 01 01, 1
Publisher:2021
 

 

 2021
Multi-Frame Super-Resolution Algorithm Based on a WGAN
Authors: Keqing Ning, Zhihao Zhang, Kai Han, Siyu Han, Xiqing Zhang
Article, 2021
Publication:IEEE access, 9, 2021, 85839
Publisher:2021

 

2021
The use of Generative Adversarial Networks to characterise new physics in multi-lepton final states at the LHC
Authors:Lebese, Thabang (Creator), Ruan, Xifeng (Creator)
Summary:Semi-supervision in Machine Learning can be used in searches for new physics where the signal plus background regions are not labelled. This strongly reduces model dependency in the search for signals Beyond the Standard Model. This approach displays the drawback in that over-fitting can give rise to fake signals. Tossing toy Monte Carlo (MC) events can be used to estimate the corresponding trials factor through a frequentist inference. However, MC events that are based on full detector simulations are resource intensive. Generative Adversarial Networks (GANs) can be used to mimic MC generators. GANs are powerful generative models, but often suffer from training instability. We henceforth show a review of GANs. We advocate the use of Wasserstein GAN (WGAN) with weight clipping and WGAN with gradient penalty (WGAN-GP) where the norm of gradient of the critic is penalized with respect to its input. Following the emergence of multi-lepton anomalies, we apply GANs for the generation of di-leptons final states in association with $b$-quarks at the LHC. A good agreement between the MC and the WGAN-GP generated events is found for the observables selected in the studyShow more
Downloadable Archival Material, 2021-05-31
Undefined
Publisher:2021-05-31


2021  Peer-reviewed
Low-Dose CT Image Denoising with Improving WGAN and Hybrid Loss Function
Authors: Zhihua Li, Weili Shi, Qiwei Xing, Yu Miao, Wei He, Huamin Yang, Zhengang Jiang
Summary:The X-ray radiation from computed tomography (CT) brought us the potential risk. Simply decreasing the dose makes the CT images noisy and diagnostic performance compromised. Here, we develop a novel denoising low-dose CT image method. Our framework is based on an improved generative adversarial network coupling with the hybrid loss function, including the adversarial loss, perceptual loss, sharpness loss, and structural similarity loss. Among the loss function terms, perceptual loss and structural similarity loss are made use of to preserve textural details, and sharpness loss can make reconstruction images clear. The adversarial loss can sharp the boundary regions. The results of experiments show the proposed method can effectively remove noise and artifacts better than the state-of-the-art methods in the aspects of the visual effect, the quantitative measurements, and the texture detailsShow more
Article, 2021
Publication:Computational and Mathematical Methods in Medicine, 2021, 20210827
Publisher:2021

[HTML] hindawi.com

[HTML] Low-dose CT image denoising with improving WGAN and hybrid loss function

Z Li, W Shi, Q Xing, Y Miao, W He, H Yang… - … Methods in Medicine, 2021 - hindawi.com

… ated method for training network (WGAN-GP) [39]. It was important that WGAN-VGG [40] was … WGAN-VGG could overcome the problem of image overblur. Also, SMGAN [42] combined …

 Cited by 7 Related articles All 7 versions


2021
Approximation for Probability Distributions by Wasserstein GAN
Authors:Gao, Yihang (Creator), Ng, Michael K. (Creator), Zhou, Mingjie (Creator)
Summary:In this paper, we study Wasserstein Generative Adversarial Networks (WGAN) using GroupSort neural networks as discriminators. We show that the error bound of the approximation for the target distribution depends on both the width/depth (capacity) of generators and discriminators, as well as the number of samples in training. A quantified generalization bound is established for Wasserstein distance between the generated distribution and the target distribution. According to our theoretical results, WGAN has higher requirement for the capacity of discriminators than that of generators, which is consistent with some existing theories. More importantly, overly deep and wide (high capacity) generators may cause worse results than low capacity generators if discriminators are not strong enough. Moreover, we further develop a generalization error bound, which is free from curse of dimensionality w.r.t. numbers of training data. It reduces from $\widetilde{\mathcal{O}}(m^{-1/r} + n^{-1/d})$ to $\widetilde{\mathcal{O}}(\text{Pdim}(\mathcal{F}\circ \mathcal{G}) \cdot m^{-1/2} + \text{Pdim}(\mathcal{F}) \cdot n^{-1/2})$. However, the latter seems to contradict to numerical observations. Compared with existing results, our results are suitable for general GroupSort WGAN without special architectures. Finally, numerical results on swiss roll and MNIST data sets confirm our theoretical resultsShow more
Downloadable Archival Material, 2021-03-18
Undefined
Publisher:2021-03-18
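The GroupSort discriminators mentioned in this record use an activation that sorts values within fixed-size groups; it is 1-Lipschitz and gradient-norm preserving, which is why it pairs well with norm-constrained weights in Wasserstein critics. A minimal sketch, assuming group size 2 and a PyTorch module name of my own:

import torch
import torch.nn as nn

class GroupSort(nn.Module):
    def __init__(self, group_size=2):
        super().__init__()
        self.group_size = group_size
    def forward(self, x):
        n, d = x.shape
        assert d % self.group_size == 0
        x = x.view(n, d // self.group_size, self.group_size)
        return torch.sort(x, dim=-1).values.view(n, d)

x = torch.tensor([[3.0, -1.0, 0.5, 2.0]])
print(GroupSort(2)(x))   # tensor([[-1.0, 3.0, 0.5, 2.0]])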

<——2021———2021——2660——



Wasserstein GANs with Gradient Penalty Compute Congested TransportAuthors:Milne, Tristan (Creator), Nachman, Adrian (Creator)
Summary:Wasserstein GANs with Gradient Penalty (WGAN-GP) are an extremely popular method for training generative models to produce high quality synthetic data. While WGAN-GP were initially developed to calculate the Wasserstein 1 distance between generated and real data, recent works (e.g. Stanczuk et al. (2021)) have provided empirical evidence that this does not occur, and have argued that WGAN-GP perform well not in spite of this issue, but because of it. In this paper we show for the first time that WGAN-GP compute the minimum of a different optimal transport problem, the so-called congested transport (Carlier et al. (2008)). Congested transport determines the cost of moving one distribution to another under a transport model that penalizes congestion. For WGAN-GP, we find that the congestion penalty has a spatially varying component determined by the sampling strategy used in Gulrajani et al. (2017) which acts like a local speed limit, making congestion cost less in some regions than others. This aspect of the congested transport problem is new in that the congestion penalty turns out to be unbounded and depend on the distributions to be transported, and so we provide the necessary mathematical proofs for this setting. We use our discovery to show that the gradients of solutions to the optimization problem in WGAN-GP determine the time averaged momentum of optimal mass flow. This is in contrast to the gradients of Kantorovich potentials for the Wasserstein 1 distance, which only determine the normalized direction of flow. This may explain, in support of Stanczuk et al. (2021), the success of WGAN-GP, since the training of the generator is based on these gradientsShow more
Downloadable Archival Material, 2021-09-01
Undefined
Publisher:2021-09-01
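For reference, the critic objective this analysis concerns is the usual WGAN-GP loss, in which the gradient penalty is evaluated at random points on segments joining real and generated samples (standard formulation written in my notation, not the paper's):

\mathcal{L}(f) = \mathbb{E}_{y \sim P_g}[f(y)] - \mathbb{E}_{x \sim P_r}[f(x)]
  + \lambda\, \mathbb{E}_{\hat{x} \sim \rho}\big[(\|\nabla_{\hat{x}} f(\hat{x})\|_2 - 1)^2\big],
\qquad \hat{x} = t\,x + (1-t)\,y, \quad t \sim U[0,1], \ x \sim P_r, \ y \sim P_g.

The distribution \rho of these interpolates is the sampling strategy that, according to the abstract above, makes the effective congestion penalty spatially varying.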
 

2021
Wasserstein GAN: Deep Generation applied on Bitcoins financial time series
Authors:Samuel, Rikli (Creator), Nico, Bigler Daniel (Creator), Moritz, Pfenninger (Creator), Joerg, Osterrieder (Creator)
Summary:Modeling financial time series is challenging due to their high volatility and unexpected happenings on the market. Most financial models and algorithms trying to fill the lack of historical financial time series struggle to perform and are highly vulnerable to overfitting. As an alternative, we introduce in this paper a deep neural network called the WGAN-GP, a data-driven model that focuses on sample generation. The WGAN-GP consists of a generator and discriminator function which utilize an LSTM architecture. The WGAN-GP is supposed to learn the underlying structure of the input data, which in our case, is the Bitcoin. Bitcoin is unique in its behavior; the prices fluctuate what makes guessing the price trend hardly impossible. Through adversarial training, the WGAN-GP should learn the underlying structure of the bitcoin and generate very similar samples of the bitcoin distribution. The generated synthetic time series are visually indistinguishable from the real data. But the numerical results show that the generated data were close to the real data distribution but distinguishable. The model mainly shows a stable learning behavior. However, the model has space for optimization, which could be achieved by adjusting the hyperparametersShow more
Downloadable Archival Material, 2021-07-13
Undefined
Publisher:2021-07-13

Cited by 5 Related articles All 2 versions

2021
Physics-Driven Learning of Wasserstein GAN for Density Reconstruction in Dynamic Tomography
Authors:Huang, Zhishen (Creator), Klasky, Marc (Creator), Wilcox, Trevor (Creator), Ravishankar, Saiprasad (Creator)
Summary:Object density reconstruction from projections containing scattered radiation and noise is of critical importance in many applications. Existing scatter correction and density reconstruction methods may not provide the high accuracy needed in many applications and can break down in the presence of unmodeled or anomalous scatter and other experimental artifacts. Incorporating machine-learned models could prove beneficial for accurate density reconstruction particularly in dynamic imaging, where the time-evolution of the density fields could be captured by partial differential equations or by learning from hydrodynamics simulations. In this work, we demonstrate the ability of learned deep neural networks to perform artifact removal in noisy density reconstructions, where the noise is imperfectly characterized. We use a Wasserstein generative adversarial network (WGAN), where the generator serves as a denoiser that removes artifacts in densities obtained from traditional reconstruction algorithms. We train the networks from large density time-series datasets, with noise simulated according to parametric random distributions that may mimic noise in experiments. The WGAN is trained with noisy density frames as generator inputs, to match the generator outputs to the distribution of clean densities (time-series) from simulations. A supervised loss is also included in the training, which leads to improved density restoration performance. In addition, we employ physics-based constraints such as mass conservation during network training and application to further enable highly accurate density reconstructions. Our preliminary numerical results show that the models trained in our frameworks can remove significant portions of unknown noise in density time-series dataShow more
Downloadable Archival Material, 2021-10-28
Undefined
Publisher:2021-10-28

2021
Synthetic Periocular Iris PAI from a Small Set of Near-Infrared-Images
Authors:Maureira, Jose (Creator), Tapia, Juan (Creator), Arellano, Claudia (Creator), Busch, Christoph (Creator)
Summary:Biometric has been increasing in relevance these days since it can be used for several applications such as access control for instance. Unfortunately, with the increased deployment of biometric applications, we observe an increase of attacks. Therefore, algorithms to detect such attacks (Presentation Attack Detection (PAD)) have been increasing in relevance. The LivDet-2020 competition which focuses on Presentation Attacks Detection (PAD) algorithms have shown still open problems, specially for unknown attacks scenarios. In order to improve the robustness of biometric systems, it is crucial to improve PAD methods. This can be achieved by augmenting the number of presentation attack instruments (PAI) and bona fide images that are used to train such algorithms. Unfortunately, the capture and creation of presentation attack instruments and even the capture of bona fide images is sometimes complex to achieve. This paper proposes a novel PAI synthetically created (SPI-PAI) using four state-of-the-art GAN algorithms (cGAN, WGAN, WGAN-GP, and StyleGAN2) and a small set of periocular NIR images. A benchmark between GAN algorithms is performed using the Frechet Inception Distance (FID) between the generated images and the original images used for training. The best PAD algorithm reported by the LivDet-2020 competition was tested for us using the synthetic PAI which was obtained with the StyleGAN2 algorithm. Surprisingly, The PAD algorithm was not able to detect the synthetic images as a Presentation Attack, categorizing all of them as bona fide. Such results demonstrated the feasibility of synthetic images to fool presentation attacks detection algorithms and the need for such algorithms to be constantly updated and trained with a larger number of images and PAI scenariosShow more
Downloadable Archival Material, 2021-07-26
Undefined


2021
A Data-driven Event Generator for Hadron Colliders using Wasserstein Generative Adversarial Network
Authors:Choi, Suyong (Creator), Lim, Jae Hoon (Creator)
Summary:Highly reliable Monte-Carlo event generators and detector simulation programs are important for the precision measurement in the high energy physics. Huge amounts of computing resources are required to produce a sufficient number of simulated events. Moreover, simulation parameters have to be fine-tuned to reproduce situations in the high energy particle interactions which is not trivial in some phase spaces in physics interests. In this paper, we suggest a new method based on the Wasserstein Generative Adversarial Network (WGAN) that can learn the probability distribution of the real data. Our method is capable of event generation at a very short computing time compared to the traditional MC generators. The trained WGAN is able to reproduce the shape of the real data with high fidelityShow more
Downloadable Archival Material, 2021-02-23
Undefined
Publisher:2021-02-23

online Cover Image OPEN ACCESS

A data-driven event generator for Hadron Colliders using Wasserstein Generative Adversarial Network

by Choi, Suyong; Lim, Jae Hoon

Journal of the Korean Physical Society, 02/2021

Highly reliable Monte-Carlo event generators and detector simulation programs are important for the precision measurement in the high energy physics. Huge...

Article PDF Download PDF BrowZine PDF Icon

Journal ArticleFull Text Online

 Preview 

 Cite this item Email this item Save this item More actions

 Findings from Korea University Has Provided New Data on Information Technology 

(A Data-driven Event Generator for Hadron Colliders Using Wasserstein...

Information Technology Newsweekly, 03/2021

NewsletterCitation Online

Cited by 5 Related articles All 4 versions

2021



2021
Inferential Wasserstein Generative Adversarial Networks
Authors:Chen, Yao (Creator), Gao, Qingyi (Creator), Wang, Xiao (Creator)
Summary:Generative Adversarial Networks (GANs) have been impactful on many problems and applications but suffer from unstable training. The Wasserstein GAN (WGAN) leverages the Wasserstein distance to avoid the caveats in the minmax two-player training of GANs but has other defects such as mode collapse and lack of metric to detect the convergence. We introduce a novel inferential Wasserstein GAN (iWGAN) model, which is a principled framework to fuse auto-encoders and WGANs. The iWGAN model jointly learns an encoder network and a generator network motivated by the iterative primal dual optimization process. The encoder network maps the observed samples to the latent space and the generator network maps the samples from the latent space to the data space. We establish the generalization error bound of the iWGAN to theoretically justify its performance. We further provide a rigorous probabilistic interpretation of our model under the framework of maximum likelihood estimation. The iWGAN, with a clear stopping criteria, has many advantages over other autoencoder GANs. The empirical experiments show that the iWGAN greatly mitigates the symptom of mode collapse, speeds up the convergence, and is able to provide a measurement of quality check for each individual sample. We illustrate the ability of the iWGAN by obtaining competitive and stable performances for benchmark datasetsShow more
Downloadable Archival Material, 2021-09-12
Undefined
Publisher:2021-09-12


2021
Wasserstein GANs Work Because They Fail (to Approximate the Wasserstein Distance)
Authors:Stanczuk, Jan (Creator), Etmann, Christian (Creator), Kreusser, Lisa Maria (Creator), Schönlieb, Carola-Bibiane (Creator)
Summary:Wasserstein GANs are based on the idea of minimising the Wasserstein distance between a real and a generated distribution. We provide an in-depth mathematical analysis of differences between the theoretical setup and the reality of training Wasserstein GANs. In this work, we gather both theoretical and empirical evidence that the WGAN loss is not a meaningful approximation of the Wasserstein distance. Moreover, we argue that the Wasserstein distance is not even a desirable loss function for deep generative models, and conclude that the success of Wasserstein GANs can in truth be attributed to a failure to approximate the Wasserstein distanceShow more
Downloadable Archival Material, 2021-03-02
Undefined
Publisher:2021-03-02



ENTROPIC-WASSERSTEIN BARYCENTERS: PDE CHARACTERIZATION, REGULARITY, AND CLT

Carlier, G; Eichinger, K and Kroshnin, A

2021 | 

SIAM JOURNAL ON MATHEMATICAL ANALYSIS

 53 (5) , pp.5880-5914

In this paper, we investigate properties of entropy-penalized Wasserstein barycenters introduced in [J. Bigot, E. Cazelles, and N. Papadakis, SIAM J. Math. Anal., 51 (2019), pp. 2261-2285] as a regularization of Wasserstein barycenters [M. Agueh and G. Carlier, SIAM J. Math. Anal., 43 (2011), pp. 904-924]. After characterizing these barycenters in terms of a system of Monge-Ampere equations, we

Show more

Free Submitted Article From Repository · Full Text at Publisher

1 Citation  40 References  Related records


First-Order Methods for Wasserstein Distributionally Robust MDPs

Grand-Clement, J and Kroer, C

International Conference on Machine Learning (ICML)

2021 | 

INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139

 139

Markov decision processes (MDPs) are known to be sensitive to parameter specification. Distributionally robust MDPs alleviate this issue by allowing for ambiguity sets which give a set of possible distributions over parameter sets. The goal is to find an optimal policy with respect to the worst-case parameter distribution. We propose a framework for solving Distributionally robust MDPs via firs

Show more


49 References  Related records



2021 see 2022

Stable Parallel Training of Wasserstein Conditional Generative Adversarial Neural Networks

Pasini, ML and Yin, JQ

International Conference on Computational Science and Computational Intelligence (CSCI)

2021 | 

2021 INTERNATIONAL CONFERENCE ON COMPUTATIONAL SCIENCE AND COMPUTATIONAL INTELLIGENCE (CSCI 2021)

 , pp.1-7

We use a stable parallel approach to train Wasserstein Conditional Generative Adversarial Neural Networks (W-CGANs). The parallel training reduces the risk of mode collapse and enhances scalability by using multiple generators that are concurrently trained, each one of them focusing on a single data label. The use of the Wasserstein metric reduces the risk of cycling by stabilizing the training


24 References  Related records

<——2021———2021——2670— 



Smooth p-Wasserstein Distance: Structure, Empirical Approximation, and Statistical Applications

Nietert, S; Goldfeld, Z and Kato, K

International Conference on Machine Learning (ICML)

2021 | 

INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139

 139

Discrepancy measures between probability distributions, often termed statistical distances, are ubiquitous in probability theory, statistics and machine learning. To combat the curse of dimensionality when estimating these distances from data, recent work has proposed smoothing out local irregularities in the measured distributions via convolution with a Gaussian kernel. Motivated by the scalab


72 References  Related records 
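A rough illustration of the smoothing idea, not the authors' estimator: both samples are convolved with the same Gaussian (by adding independent noise) before the empirical distance is computed; sigma is an arbitrary choice.

# Plain vs Gaussian-smoothed empirical W1 (sketch).
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(1)
x = rng.standard_normal(2000)
y = rng.standard_normal(2000) + 0.2

sigma = 0.5
x_smooth = x + sigma * rng.standard_normal(x.shape)   # samples from mu convolved with N(0, sigma^2)
y_smooth = y + sigma * rng.standard_normal(y.shape)   # samples from nu convolved with N(0, sigma^2)

print("plain    W1:", wasserstein_distance(x, y))
print("smoothed W1:", wasserstein_distance(x_smooth, y_smooth))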


NETWORK CONSENSUS IN THE WASSERSTEIN METRIC SPACE OF PROBABILITY MEASURES

Bishop, AN and Doucet, A

2021 | 

SIAM JOURNAL ON CONTROL AND OPTIMIZATION

 59 (5) , pp.3261-3277

Distributed consensus in the Wasserstein metric space of probability measures on the real line is introduced in this work. Convergence of each agent's measure to a common measure is proven under a weak network connectivity condition. The common measure reached at each agent is one minimizing a weighted sum of its Wasserstein distance to all initial agent measures. This measure is known as the W


63 References  Related records 
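As a toy picture of the consensus measure, assuming one-dimensional agent measures (an assumption, not the paper's network dynamics), the weighted W2 barycenter can be computed by averaging quantile functions:

# Weighted 1D W2 barycenter of empirical agent measures via quantile averaging (sketch).
import numpy as np

rng = np.random.default_rng(2)
agents = [rng.normal(m, 1.0, size=1000) for m in (-2.0, 0.0, 3.0)]  # initial agent measures
weights = np.array([0.2, 0.3, 0.5])                                  # consensus weights (arbitrary)

qs = np.linspace(0.01, 0.99, 99)
quantiles = np.stack([np.quantile(a, qs) for a in agents])   # each row: one agent's quantiles
consensus_quantiles = weights @ quantiles                    # weighted quantile average

print("consensus mean ~", consensus_quantiles.mean())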


EXPONENTIAL CONTRACTION IN WASSERSTEIN DISTANCE ON STATIC AND EVOLVING MANIFOLDS

Cheng, LJ; Thalmaier, A and Zhang, SQ

2021 | 

REVUE ROUMAINE DE MATHEMATIQUES PURES ET APPLIQUEES

 66 (1) , pp.107-129

In this article, exponential contraction in Wasserstein distance for heat semigroups of diffusion processes on Riemannian manifolds is established under curvature conditions where Ricci curvature is not necessarily required to be non-negative. Compared to the results of Wang (2016), we focus on explicit estimates for the exponential contraction rate. Moreover, we show that our results extend to


1 Citation  17 References  Related records


Hausdorff and Wasserstein metrics on graphs and other ...

https://academic.oup.com › imaiai › article-abstract


by E Patterson · 2021 · Cited by 7 — Hausdorff and Wasserstein metrics on graphs and other structured data ... Information and Inference: A Journal of the IMA, Volume 10, Issue 4, December 2021 ...


2021

Lecture 20 | Wasserstein Generative Adversarial Networks

m.youtube.com › watch · Lecture 20 | Wasserstein Generative Adversarial Networks. 280 views · 1 year ago.

YouTube · Foundation for Armenian Science and Technology (FAST) · 

Jul 22, 2021


2021 


Robert Dadashi (@robdadashi) / Twitter

twitter.com › robdadashi

We provably achieve cross-domain transfer in non-trivial continuous control domain by minimizing the Gromov-Wasserstein distance with deep RL.

Twitter · 

Dec 15, 2021


Pablo Samuel Castro ☠️ (@psc@sigmoid.social) on Twitter ...

twitter.com › pcastr › status


... paper) gets around this but both still rely on the expensive Kantorovich/Wasserstein metric 3/ https://t.co/KoCztxNwDV" / Twitter ...

Twitter · 

Oct 22, 2021


Peer-reviewed
IE-WGAN: A model with Identity Extracting for Face Frontal Synthesis
Authors:Yanrui Zhu; Yonghong Chen; Yuxin Ren
Summary:Face pose affects the quality of frontal face synthesis, so we propose a model for frontal face synthesis based on WGAN-GP. This model includes an identity extracting module, which is used to supervise the training of the face generation module. On the one hand, the model improves the quality of synthetic face images; on the other hand, it can accelerate the convergence speed of network training. We conduct verification experiments on the CelebA dataset, and the results show that this model improves the graphic quality of frontal synthesis.
Article, 2021
Publication:1861, March 2021, 012062
Publisher:2021



WGAN-GP-Based Synthetic Radar Spectrogram Augmentation in Human Activity Recognition
Authors:Lele Qu; Yutong Wang; Tianhong Yang; Lili Zhang; Yanpeng Sun; IGARSS 2021 - 2021 IEEE International Geoscience and Remote Sensing Symposium
Summary:Despite deep convolutional neural networks (DCNNs) having been used extensively in radar-based human activity recognition in recent years, their performance could not be fully realized because of the lack of radar datasets. However, radar data acquisition is difficult to achieve due to the high cost of its measurement. Generative adversarial networks (GANs) can be utilized to generate a large number of similar micro-Doppler signatures with which to increase the training data set. For the training of DCNNs, the quality and diversity of the data set generated by GANs is particularly important. In this paper, we propose using a more stable and effective Wasserstein generative adversarial network with gradient penalty (WGAN-GP) to augment the training data set. The classification results from the experimental data have shown that the proposed method can improve the classification accuracy of human activity.
Chapter, 2021
Publication:2021 IEEE International Geoscience and Remote Sensing Symposium IGARSS, 20210711, 2532
Publisher:2021
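Several of the WGAN-GP entries above rely on the gradient penalty; a minimal PyTorch sketch of that penalty (my own illustration, not any of these papers' code) is:

# WGAN-GP gradient penalty on random interpolates between real and fake batches.
import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    """Penalize deviations of the critic's gradient norm from 1 on interpolates."""
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads = torch.autograd.grad(outputs=scores.sum(), inputs=interp,
                                create_graph=True)[0]
    grad_norm = grads.flatten(1).norm(2, dim=1)
    return lambda_gp * ((grad_norm - 1.0) ** 2).mean()

# Usage inside the critic step (hypothetical `critic`, `real`, `fake` tensors):
# d_loss = critic(fake).mean() - critic(real).mean() + gradient_penalty(critic, real, fake)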


Attention Residual Network for White Blood Cell Classification with WGAN Data Augmentation
Authors:Meng Zhao; Lingmin Jin; Shenghua Teng; Zuoyong Li; 2021 11th International Conference on Information Technology in Medicine and Education (ITME)
Summary:In medicine, white blood cell (WBC) classification plays an important role in clinical diagnosis and treatment. Due to the similarity between classes and lack of training data, the precise classification of WBC is still challenging. To alleviate this problem, we propose an attention residual network for WBC image classification on the basis of data augmentation. Specifically, the attention residual network is composed of multiple attention residual blocks, an adaptive average pooling layer, and a full connection layer. The channel attention mechanism is introduced in each residual block to use the feature maps of WBC learned by a high layer to generate the attention map for a low layer. Each attention residual block also introduces depth separable convolution to extract the feature of WBC and decrease the training costs. The Wasserstein Generative adversarial network (WGAN) is used to create synthetic instances to enhance the size of training data. Experiments on two image datasets show the superiority of the proposed method over several state-of-the-art methods.
Chapter, 2021
Publication:2021 11th International Conference on Information Technology in Medicine and Education (ITME), 202111, 336
Publisher:2021

<——2021———2021——2680—


2021
Coverless Information Hiding Based on WGAN-GP Model
Authors:Xintao Duan; Baoxia Li; Daidou Guo; Kai Jia; En Zhang; Chuan Qin
Summary:Steganalysis technology judges whether there is secret information in the carrier by monitoring the abnormality of the carrier data, so the traditional information hiding technology has reached the bottleneck. Therefore, this paper proposed the coverless information hiding based on the improved training of Wasserstein GANs (WGAN-GP) model. The sender trains the WGAN-GP with a natural image and a secret image. The generated image and secret image are visually identical, and the parameters of generator are saved to form the codebook. The sender uploads the natural image (disguise image) to the cloud disk. The receiver downloads the camouflage image from the cloud disk and obtains the corresponding generator parameter in the codebook and inputs it to the generator. The generator outputs the same image for the secret image, which realized the same results as sending the secret image. The experimental results indicate that the scheme produces high image quality and good security.
Article, 2021
Publication:International Journal of Digital Crime and Forensics (IJDCF), 13, 20210701, 57
Publisher:2021

2021

Infrared Image Super-Resolution via Heterogeneous Convolutional WGAN
Authors:Huang, Yongsong (Creator), Jiang, Zetao (Creator), Wang, Qingzhong (Creator), Jiang, Qi (Creator), Pang, Guoming (Creator)
Summary:Image super-resolution is important in many fields, such as surveillance and remote sensing. However, infrared (IR) images normally have low resolution since the optical equipment is relatively expensive. Recently, deep learning methods have dominated image super-resolution and achieved remarkable performance on visible images; however, IR images have received less attention. IR images have fewer patterns, and hence, it is difficult for deep neural networks (DNNs) to learn diverse features from IR images. In this paper, we present a framework that employs heterogeneous convolution and adversarial training, namely, heterogeneous kernel-based super-resolution Wasserstein GAN (HetSRWGAN), for IR image super-resolution. The HetSRWGAN algorithm is a lightweight GAN architecture that applies a plug-and-play heterogeneous kernel-based residual block. Moreover, a novel loss function that employs image gradients is adopted, which can be applied to an arbitrary model. The proposed HetSRWGAN achieves consistently better performance in both qualitative and quantitative evaluations. According to the experimental results, the whole training process is more stable.
Downloadable Archival Material, 2021-09-02
Undefined
Publisher:2021-09-02



Underwater Object Detection of an UVMS Based on WGAN
Authors:Qingyu Wei; Wei Chen; 2021 China Automation Congress (CAC)
Summary:To solve the problem that the underwater image quality is not high, which leads to the inaccuracy of UVMS target detection based on convolutional neural network, an underwater target detection method based on WGAN is proposed. Firstly, the classic data expansion method is used to expand the data set. Then, WGAN based method UVMS is used to synthesize data enhancement to improve the performance of detection network in underwater target detection. RetinaNet is used as a target detection network, and sea cucumbers are used as a typical research target for experiments. The experimental results show that the detection accuracy of UVMS is improved by 17% in underwater target detection. The proposed method provides a good technical support for autonomous fishing of UVMS.
Chapter, 2021
Publication:2021 China Automation Congress (CAC), 20211022, 702
Publisher:2021



TextureWGAN: texture preserving WGAN with MLE regularizer for inverse problems
Authors:Masaki Ikuta; Ivana Išgum; Bennett A. Landman; Jun Zhang; Medical Imaging 33; SPIE Medical Imaging 2570695, 2021-02-15 | 2021-02-20, Online Only, California, United States; Medical Imaging 2021: Image Processing 11596; Adversarial Learning 7
Summary:Many algorithms and methods have been proposed for inverse problems, particularly with the recent surge of interest in machine learning and deep learning methods. Among all proposed methods, the most popular and effective method is the convolutional neural network (CNN) with mean square error (MSE). This method has been proven effective in super-resolution, image de-noising, and image reconstruction. However, this method is known to over-smooth images due to the nature of MSE. MSE based methods minimize Euclidean distance for all pixels between a baseline image and a generated image by CNN and ignore the spatial information of the pixels such as image texture. In this paper, we proposed a new method based on Wasserstein GAN (WGAN) for inverse problems. We showed that the WGAN-based method was effective to preserve image texture. It also used a maximum likelihood estimation (MLE) regularizer to preserve pixel fidelity. Maintaining image texture and pixel fidelity is the most important requirement for medical imaging. We used Peak Signal to Noise Ratio (PSNR) and Structure Similarity (SSIM) to evaluate the proposed method quantitatively. We also conducted the first-order and the second-order statistical image texture analysis to assess image texture.
Chapter, 2021
Publication:11596, 20210215, 1159618
Publisher:2021


Anti-confrontational Domain Data Generation Based on Improved WGAN
Authors:Jianhu Dong; Xingchi Chen; Haibo Luo; 2021 International Symposium on Computer Technology and Information Science (ISCTIS)
Summary:The Domain Generation Algorithm (DGA) is used by a large number of botnets to evade detection. At present, mainstream machine learning detection technology not only lacks training data with evolutionary value, but also has the security problem that the model input samples can be attacked. The Generative Adversarial Network (GAN) suggested by Goodfellow offers the possibility of solving the above problems, and WGAN is a variant of the GAN model implementation [1]. In this paper, an improved method for generating adversarial domain names by an improved WGAN character domain name generator is proposed to improve model detection capability and expand the effective training set. Experimental results show that this method produces adversarial domain names that are more consistent with human naming than traditional GAN models; adding these training sets with adversarial factors improves the discriminant hit ratio of the model on unknown domain names.
Chapter, 2021
Publication:2021 International Symposium on Computer Technology and Information Science (ISCTIS), 202106, 203
Publisher:2021


2021


Multi WGAN-GP loss for pathological stain transformation using GAN
Authors:Atefeh Ziaei Moghadam; Hamed Azarnoush; Seyyed Ali Seyyedsalehi; 2021 29th Iranian Conference on Electrical Engineering (ICEE)
Summary:In this paper, we proposed a new loss function to train the conditional generative adversarial network (CGAN). CGANs use a condition to generate images. Adding a class condition to the discriminator helps improve the training process of GANs and has been widely used for CGAN. Therefore, many loss functions have been proposed for the discriminator to add class conditions to it. Most of them have the problem of adjusting weights. This paper presents a simple yet new loss function that uses class labels, but no adjusting is required. This loss function is based on WGAN-GP loss, and the discriminator has outputs of the same order (the reason for no adjusting). More specifically, the discriminator has K (the number of classes) outputs, and each of them is used to compute the distance between fake and real samples of one class. Another loss to enable the discriminator to classify is also proposed by applying SoftMax to the outputs and adding cross-entropy to our first loss. The proposed loss functions are applied to a CGAN for image-to-image translation (here stain transformation for pathological images). The performances of proposed losses with some state-of-the-art losses are compared using Histogram Intersection Score between generated images and target images. The accuracy of a classifier is also computed to measure the effect of stain transformation. Our first loss performs almost similar to the loss that achieved the best results.
Chapter, 2021
Publication:2021 29th Iranian Conference on Electrical Engineering (ICEE), 20210518, 927
Publisher:2021

An Intrusion Detection Method Based on WGAN and Deep Learning
Authors:Linfeng Han; Xu Fang; Yongguang Liu; Wenping Wu; Huinan Wang; 2021 IEEE International Conference on Signal Processing, Communications and Computing (ICSPCC)
Summary:Using WGAN and deep learning methods, a multiclass network intrusion detection model is proposed. The model uses the WGAN network to generate fake samples of rare attacks to achieve effective expansion of the original dataset and evaluates the samples through a two-classification method to ensure the effectiveness of the generated data. Through the CNN-LSTM network, the dimensionality-reduced data is multiclassified and predicted. The network structure and parameters are effectively designed and trained to realize the identification and classification of network attacks. Experiments have proved that the model has improved the accuracy and recall of network attack detection and classification compared with traditional methods. The proposed data generation method can improve the overall detection effect of the predictive model on rare attack types, improve the accuracy rate, and reduce error reports.
Chapter, 2021
Publication:2021 IEEE International Conference on Signal Processing, Communications and Computing (ICSPCC), 20210817, 1
Publisher:2021


Accelerated WGAN update strategy with loss change rate balancing
Authors:Gady Agam; Ying Chen; Xu Ouyang; 2021 IEEE Winter Conference on Applications of Computer Vision (WACV)
Summary:Optimizing the discriminator in Generative Adversarial Networks (GANs) to completion in the inner training loop is computationally prohibitive, and on finite datasets would result in overfitting. To address this, a common update strategy is to alternate between k optimization steps for the discriminator D and one optimization step for the generator G. This strategy is repeated in various GAN algorithms where k is selected empirically. In this paper, we show that this update strategy is not optimal in terms of accuracy and convergence speed, and propose a new update strategy for networks with Wasserstein GAN (WGAN) group related loss functions (e.g. WGAN, WGAN-GP, Deblur GAN, and Super resolution GAN). The proposed update strategy is based on a loss change ratio comparison of G and D. We demonstrate that the proposed strategy improves both convergence speed and accuracy.
Chapter, 2021
Publication:2021 IEEE Winter Conference on Applications of Computer Vision (WACV), 202101, 2545
Publisher:2021
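One possible reading of the loss-change-rate idea, sketched under the assumption that a single scalar loss per network is available at each step (this is my interpretation, not the authors' exact rule):

# Choose whether to update D or G based on relative loss change (sketch).
prev_d_loss, prev_g_loss = None, None

def pick_network_to_update(d_loss, g_loss):
    """Return 'D' or 'G' depending on whose loss has changed proportionally less."""
    global prev_d_loss, prev_g_loss
    if prev_d_loss is None:                      # first call: no history yet
        prev_d_loss, prev_g_loss = d_loss, g_loss
        return "D"
    r_d = abs(d_loss - prev_d_loss) / (abs(prev_d_loss) + 1e-12)
    r_g = abs(g_loss - prev_g_loss) / (abs(prev_g_loss) + 1e-12)
    prev_d_loss, prev_g_loss = d_loss, g_loss
    return "D" if r_d < r_g else "G"             # update the slower-changing network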


Attention Residual Network for White Blood Cell Classification with WGAN Data Augmentation
Authors:Zuoyong Li; Shenghua Teng; Lingmin Jin; Meng Zhao; 2021 11th International Conference on Information Technology in Medicine and Education (ITME)
Summary:In medicine, white blood cell (WBC) classification plays an important role in clinical diagnosis and treatment. Due to the similarity between classes and lack of training data, the precise classification of WBC is still challenging. To alleviate this problem, we propose an attention residual network for WBC image classification on the basis of data augmentation. Specifically, the attention residual network is composed of multiple attention residual blocks, an adaptive average pooling layer, and a full connection layer. The channel attention mechanism is introduced in each residual block to use the feature maps of WBC learned by a high layer to generate the attention map for a low layer. Each attention residual block also introduces depth separable convolution to extract the feature of WBC and decrease the training costs. The Wasserstein Generative adversarial network (WGAN) is used to create synthetic instances to enhance the size of training data. Experiments on two image datasets show the superiority of the proposed method over several state-of-the-art methods.
Chapter, 2021
Publication:2021 11th International Conference on Information Technology in Medicine and Education (ITME), 202111, 336
Publisher:2021


Oversampling based on WGAN for Network Threat Detection
Authors:Fangzhou Guo; Jian Qiu; Xia Zhang; Jieyin Zhang; Zhenliang Qiu; Yanping Xu; 2021 IEEE Intl Conf on Dependable, Autonomic and Secure Computing, Intl Conf on Pervasive Intelligence and Computing, Intl Conf on Cloud and Big Data Computing, Intl Conf on Cyber Science and Technology Congress (DASC/PiCom/CBDCom/CyberSciTech)
Summary:Class imbalance is a common problem in network threat detection. The data of network threats usually are few and viewed as the minority class. Wasserstein GAN (WGAN) as a generative method can solve the imbalanced problem through oversampling. In this work, we use shallow learning and deep learning methods to build a network threat detection model on the imbalanced data. First, the imbalanced data are divided into the training data set and testing data set. Second, WGAN is used to generate new minority samples for the training data. Then the generated data are fused with the original training data to form a balanced training data set. Third, the balanced training data set is input to the shallow learning methods to train the network threat detection model. Next, the imbalanced testing data set is input to the trained model to distinguish the network threat. The experimental results show that our network threat detection model based on WGAN for oversampling achieves a good performance for network threat detection.
Chapter, 2021
Publication:2021 IEEE Intl Conf on Dependable, Autonomic and Secure Computing, Intl Conf on Pervasive Intelligence and Computing, Intl Conf on Cloud and Big Data Computing, Intl Conf on Cyber Science and Technology Congress (DASC/PiCom/CBDCom/CyberSciTech), 202110, 413
Publisher:2021

<——2021———2021——2690—



P2E-WGAN: ECG waveform synthesis from PPG with conditional Wasserstein generative adversarial networks
Authors:Khuong Vo (Author), Emad Kasaeyan Naeini (Author), Amir Naderi (Author), Daniel Jilani (Author), Amir M. Rahmani (Author), Nikil Dutt (Author), Hung Cao (Author)Show more
Summary:Electrocardiogram (ECG) is routinely used to identify key cardiac events such as changes in ECG intervals (PR, ST, QT, etc.), as well as capture critical vital signs such as heart rate (HR) and heart rate variability (HRV). The gold standard ECG requires clinical measurement, limiting the ability to capture ECG in everyday settings. Photoplethysmography (PPG) offers an out-of-clinic alternative for non-invasive, low-cost optical capture of cardiac physiological measurement in everyday settings, and is increasingly used for health monitoring in many clinical and commercial wearable devices. Although ECG and PPG are highly correlated, PPG does not provide much information for clinical diagnosis. Recent work has applied machine learning algorithms to generate ECG signals from PPG, but requires expert domain knowledge and heavy feature crafting to achieve good accuracy. We propose P2E-WGAN: a pure end-to-end, generalizable deep learning model using a conditional Wasserstein generative adversarial network to synthesize ECG waveforms from PPG. Our generative model is capable of augmenting the training data to alleviate the data-hungry problem of machine learning methods. Our model trained in the subject independent mode can achieve the average root mean square error of 0.162, Fréchet distance of 0.375, and Pearson's correlation of 0.835 on a normalized real-world dataset, demonstrating the effectiveness of our approach.
Chapter, 2021
Publication:Proceedings of the 36th Annual ACM Symposium on Applied Computing, 20210322, 1030
Publisher:2021


Accelerated WGAN update strategy with loss change rate balancing
Authors:Xu Ouyang; Ying Chen; Gady Agam; 2021 IEEE Winter Conference on Applications of Computer Vision (WACV)
Summary:Optimizing the discriminator in Generative Adversarial Networks (GANs) to completion in the inner training loop is computationally prohibitive, and on finite datasets would result in overfitting. To address this, a common update strategy is to alternate between k optimization steps for the discriminator D and one optimization step for the generator G. This strategy is repeated in various GAN algorithms where k is selected empirically. In this paper, we show that this update strategy is not optimal in terms of accuracy and convergence speed, and propose a new update strategy for networks with Wasserstein GAN (WGAN) group related loss functions (e.g. WGAN, WGAN-GP, Deblur GAN, and Super resolution GAN). The proposed update strategy is based on a loss change ratio comparison of G and D. We demonstrate that the proposed strategy improves both convergence speed and accuracy.
Chapter, 2021
Publication:2021 IEEE Winter Conference on Applications of Computer Vision (WACV), 202101, 2545
Publisher:2021

AC-WGAN-GP: Augmenting ECG and GSR Signals using Conditional Generative Models for Arousal Classification
Authors:Andrei Furdui (Author), Tianyi Zhang (Author), Marcel Worring (Author), Pablo Cesar (Author), Abdallah El Ali (Author)
Summary:Computational recognition of human emotion using Deep Learning techniques requires learning from large collections of data. However, the complex processes involved in collecting and annotating physiological data lead to datasets with small sample sizes. Models trained on such limited data often do not generalize well to real-world settings. To address the problem of data scarcity, we use an Auxiliary Conditioned Wasserstein Generative Adversarial Network with Gradient Penalty (AC-WGAN-GP) to generate synthetic data. We compare the recognition performance between real and synthetic signals as training data in the task of binary arousal classification. Experiments on GSR and ECG signals show that generative data augmentation significantly improves model performance (avg. 16.5%) for binary arousal classification in a subject-independent setting.
Chapter, 2021
Publication:Adjunct Proceedings of the 2021 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2021 ACM International Symposium on Wearable Computers, 20210921, 21
Publisher:2021


Military Target Recognition Technology based on WGAN-GP and XGBoost
Authors:Kainan Zhao (Author), Baoliang Dong (Author), Cheng Yang (Author)
Summary:This paper proposes a military target recognition method based on WGAN-GP and XGBoost, which expands and improves the quality of military target samples by constructing WGAN-GP, then sampling iteratively based on heuristic learning to construct an effective sample training set. On the basis of the quality training set, XGBoost is used for supervised learning, and finally a classification network model for military target recognition is obtained. Compared with methods such as CNN, KNN, and SVM, the method proposed in the article has a 1.01% to 58.84% higher overall accuracy of target sample recognition, and the overall accuracy is 27.36% to 57.46% higher under the conditions of different small-scale samples.
Chapter, 2021
Publication:2021 4th International Conference on Computer Science and Software Engineering (CSSE 2021), 20211022, 216
Publisher:2021


Probabilistic Human-like Gesture Synthesis from Speech using GRU-based WGAN
Authors:Bowen Wu (Author), Chaoran Liu (Author), Carlos T. Ishi (Author), Hiroshi Ishiguro (Author)
Summary:Gestures are crucial for increasing the human-likeness of agents and robots to achieve smoother interactions with humans. The realization of an effective system to model human gestures, which are matched with the speech utterances, is necessary to be embedded in these agents. In this work, we propose a GRU-based autoregressive generation model for gesture generation, which is trained with a CNN-based discriminator in an adversarial manner using a WGAN-based learning algorithm. The model is trained to output the rotation angles of the joints in the upper body, and implemented to animate a CG avatar. The motions synthesized by the proposed system are evaluated via an objective measure and a subjective experiment, showing that the proposed model outperforms a baseline model which is trained by a state-of-the-art GAN-based algorithm, using the same dataset. This result reveals that it is essential to develop a stable and robust learning algorithm for training gesture generation models. Our code can be found in https://github.com/wubowen416/gesture-generation
Chapter, 2021
Publication:Companion Publication of the 2021 International Conference on Multimodal Interaction, 20211018, 194
Publisher:2021

2021


Network Malicious Traffic Identification Method Based On WGAN Category Balancing
Authors:Anzhou Wang; Yaojun Ding; 2021 IEEE International Conference on Signal Processing, Communications and Computing (ICSPCC)
Summary:Aiming at the problem of data imbalance when using deep learning models for traffic recognition tasks, a method of using a Wasserstein Generative Adversarial Network (WGAN) to generate minority samples based on images of the original traffic data packets is proposed, to expand the minority data categories and solve the problem of data imbalance. Firstly, through data preprocessing, the original traffic PCAP data in the dataset is segmented, filled, and mapped into grayscale pictures according to the flow unit. Then, the balance of the dataset is achieved by using traditional random oversampling, WGAN adversarial network generation technology, ordinary GAN generation technology and the synthetic minority oversampling technique. Finally, the public datasets USTC-TFC2016 and CICIDS2017 are adopted to classify the unbalanced dataset and the balanced dataset on the classic deep model CNN, and three evaluation indicators of precision, recall, and F1 are applied to evaluate the classification effect. Experimental results show that the dataset balanced by the WGAN model is better than the ordinary GAN generation method, the traditional oversampling method and the synthetic minority oversampling technique under the same classification model.
Chapter, 2021
Publication:2021 IEEE International Conference on Signal Processing, Communications and Computing (ICSPCC), 20210817, 1
Publisher:2021

Generating Adversarial Patches Using Data-Driven MultiD-WGAN
Authors:Wei Wang; Yimeng Chai; Ziwen Wu; Litong Ge; Xuechun Han; Baohua Zhang; Chuang Wang; Yue Li; 2021 IEEE International Symposium on Circuits and Systems (ISCAS)
Summary:In recent years, machine learning algorithms and training data have faced many security threats, which affect the security of practical applications based on machine learning. At present, generating adversarial patches based on Generative Adversarial Nets (GANs) has been an emerging study. However, existing attack strategies are still far from producing local adversarial patches with strong attack power, ignoring the attacked network's perceived sensitivity to the adversarial patches. This paper studies the security threat of adversarial patches to classifiers; adding an adversarial patch to the data can mislead the classifier into incorrect results. Considering the attention to aggression and reality, we propose the data-driven MultiD-WGAN, which can simultaneously enhance adversarial patches' attack power and authenticity through multi-discriminators. The experiments confirm that our data-driven MultiD-WGAN dramatically reduces the recall of seven classifiers attacked on four datasets. The attack of data-driven MultiD-WGAN on 25/28 groups of experiments leads to a decreased recall rate, which is better than conventional GANs. Finally, we have proved a positive correlation between attack intensity and attack ability, both theoretically and experimentally.
Chapter, 2021
Publication:2021 IEEE International Symposium on Circuits and Systems (ISCAS), 202105, 1
Publisher:2021

Implementation of a WGAN-GP for Human Pose Transfer using a 3-channel pose representation
Authors:Tamal Das; Saurav Sutradhar; Mrinmoy Das; Simantini Chakraborty; Suman Deb; 2021 International Conference on Innovation and Intelligence for Informatics, Computing, and Technologies (3ICT)
Summary:The computational problem of Human Pose Transfer (HPT) is addressed in this paper. HPT has recently become an emerging research topic which can be used in fields like fashion design, media production, animation, and virtual reality. Given the image of a human subject and a target pose, the goal of HPT is to generate a new image of the human subject with the novel pose. That is, the pose of the target pose is transferred to the human subject. HPT has been carried out in two stages. In stage 1, a rough estimate is generated and in stage 2, the rough estimate is refined with a generative adversarial network. The novelty of this work is the way pose information is represented. Earlier methods used computationally expensive pose representations like 3D DensePose and 18-channel pose heatmaps. This work uses a 3-channel colour image of a stick figure to represent human pose. Different body parts are encoded with different colours. The convolutional neural networks will now have to recognize colours only, and since these colours encode body parts, eventually the network will also learn about the position of the body parts.
Chapter, 2021
Publication:2021 International Conference on Innovation and Intelligence for Informatics, Computing, and Technologies (3ICT), 20210929, 698
Publisher:2021

An Effective Regularization Term for Improving WGAN Performance (WGAN 성능개선을 위한 효과적인 정칙항 제안)
Authors:Hee Il Hahn (한희일)
Summary:A Wasserstein GAN (WGAN), optimal in terms of minimizing the Wasserstein distance, still suffers from inconsistent convergence or unexpected output due to inherent learning instability. It is widely known that some kind of restriction on the discriminative function should be considered to solve such problems, which implies the importance of Lipschitz continuity. Unfortunately, there are few known methods to satisfactorily maintain the Lipschitz continuity of the discriminative function. In this paper we propose techniques to stably maintain the Lipschitz continuity of the discriminative function by adding effective regularization terms to the objective function, which limit the magnitude of the gradient vectors of the discriminator to one or less. Extensive experiments are conducted to evaluate the performance of the proposed techniques, which show that the single-sided penalty improves convergence compared with the gradient penalty in the early learning process, while the proposed additional penalty increases inception scores by 0.18 after 100,000 learning iterations.
Downloadable Article, 2021
Publication:멀티미디어학회논문지 = Journal of Korea Multimedia Society, 24, 2021, 13
Publisher:2021
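A sketch of a one-sided gradient penalty of the kind discussed above, written in PyTorch as an illustration rather than the paper's exact regularization term:

# One-sided penalty: only penalize critic gradients whose norm exceeds 1 (sketch).
import torch

def one_sided_penalty(critic, samples, weight=10.0):
    samples = samples.clone().requires_grad_(True)
    scores = critic(samples)
    grads = torch.autograd.grad(scores.sum(), samples, create_graph=True)[0]
    grad_norm = grads.flatten(1).norm(2, dim=1)
    return weight * torch.clamp(grad_norm - 1.0, min=0.0).pow(2).mean()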


Peer-reviewed
Wasserstein distance, Fourier series and applications
Author:Stefan Steinerberger
Summary:Abstract: We study the Wasserstein metric, a notion of distance between two probability distributions, from the perspective of Fourier analysis and discuss applications. In particular, we bound the Earth Mover Distance between the distribution of quadratic residues in a finite field and the uniform distribution, and compare this with the bound implied by the Polya–Vinogradov inequality. We also prove an estimate for continuous functions with mean value 0. Moreover, we prove a bound for Laplacian eigenfunctions on a compact Riemannian manifold which is at most a factor away from sharp. Several other problems are discussed.
Article, 2021
Publication:Monatshefte für Mathematik, 194, 20210107, 305
Publisher:2021

<——2021———2021——2700—



Relaxed Wasserstein with Applications to GANs
Authors:Xin Guo; Johnny Hong; Tianyi Lin; Nan Yang; ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Summary:Wasserstein Generative Adversarial Networks (WGANs) provide a versatile class of models, which have attracted great attention in various applications. However, this framework has two main drawbacks: (i) Wasserstein-1 (or Earth-Mover) distance is restrictive such that WGANs cannot always fit data geometry well; (ii) It is difficult to achieve fast training of WGANs. In this paper, we propose a new class of Relaxed Wasserstein (RW) distances by generalizing Wasserstein-1 distance with Bregman cost functions. We show that RW distances achieve nice statistical properties while not sacrificing the computational tractability. Combined with the GANs framework, we develop Relaxed WGANs (RWGANs) which are not only statistically flexible but can be approximated efficiently using heuristic approaches. Experiments on real images demonstrate that the RWGAN with Kullback-Leibler (KL) cost function outperforms other competing approaches, e.g., WGANs, even with gradient penalty.
Chapter, 2021
Publication:ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 20210606, 3325
Publisher:2021

Cited by 30 Related articles All 5 versions


Peer-reviewed
Wasserstein distance-based asymmetric adversarial domain adaptation in intelligent bearing fault diagnosis
Authors:Ying Yu; Jun Zhao; Tang Tang; Jingwei Wang; Ming Chen; Jie Wu; Liang Wang
Summary:Addressing the phenomenon of data sparsity in hostile working conditions, which leads to performance degradation in traditional machine learning-based fault diagnosis methods, a novel Wasserstein distance-based asymmetric adversarial domain adaptation is proposed for unsupervised domain adaptation in bearing fault diagnosis. A generative adversarial network-based loss and asymmetric mapping are integrated to alleviate the difficulty of the training process in adversarial transfer learning, especially when the domain shift is serious. Moreover, a simplified lightweight architecture is introduced to enhance the generalization and representation capability and reduce the computational cost. Experimental results show that our method not only achieves outstanding performance with sufficient data, but also outperforms these prominent adversarial methods with limited data (both source and target domain), which provides a promising approach to real industrial bearing fault diagnosis.
Article, 2021
Publication:Measurement Science and Technology, 32, 202111
Publisher:2021

Peer-reviewed
Asymptotics of Smoothed Wasserstein Distances
Authors:Hong-Bin Chen; Jonathan Niles-Weed
Summary:Abstract: We investigate contraction of the Wasserstein distances on Euclidean space under Gaussian smoothing. It is well known that the heat semigroup is exponentially contractive with respect to the Wasserstein distances on manifolds of positive curvature; however, on flat Euclidean space—where the heat semigroup corresponds to smoothing the measures by Gaussian convolution—the situation is more subtle. We prove precise asymptotics for the 2-Wasserstein distance under the action of the Euclidean heat semigroup, and show that, in contrast to the positively curved case, the contraction rate is always polynomial, with exponent depending on the moment sequences of the measures. We establish similar results for the p-Wasserstein distances for p≠2 as well as the χ2 divergence, relative entropy, and total variation distance. Together, these results establish the central role of moment matching arguments in the analysis of measures smoothed by Gaussian convolution.
Article, 2021
Publication:Potential Analysis : An International Journal Devoted to the Interactions between Potential Theory, Probability Theory, Geometry and Functional Analysis, 56, 20210119, 571
Publisher:2021

Peer-reviewed
Quantum Statistical Learning via Quantum Wasserstein Natural Gradient
Authors:Simon Becker; Wuchen Li
Summary:Abstract: In this article, we introduce a new approach towards the statistical learning problem of approximating a target quantum state by a set of parametrized quantum states in a quantum Wasserstein metric. We solve this estimation problem by considering Wasserstein natural gradient flows for density operators on finite-dimensional algebras. For continuous parametric models of density operators, we pull back the quantum Wasserstein metric such that the parameter space becomes a Riemannian manifold with quantum Wasserstein information matrix. Using a quantum analogue of the Benamou–Brenier formula, we derive a natural gradient flow on the parameter space. We also discuss certain continuous-variable quantum states by studying the transport of the associated Wigner probability distributions.
Article, 2021
Publication:Journal of Statistical Physics, 182, 20210107
Publisher:2021

Peer-reviewed
Wasserstein Distance-Based Auto-Encoder Tracking
Authors:Long Xu; Ying Wei; Chenhe Dong; Chuaqiao Xu; Zhaofu Diao
Summary:Abstract: Most of the existing visual object trackers are based on deep convolutional feature maps, but there are fewer works on finding new features for tracking. This paper proposes a novel tracking framework based on a full convolutional auto-encoder appearance model, which is trained by using Wasserstein distance and maximum mean discrepancy. Compared with previous works, the proposed framework has better performance in three aspects, including appearance model, update scheme, and state estimation. To address the issues of the original update scheme, including poor discriminant performance under limited supervisory information, sample pollution caused by long-term object occlusion, and sample importance unbalance, a novel latent space importance weighting algorithm, a novel sample space management algorithm, and a novel IOU-based label smoothing algorithm are proposed respectively in this paper. Besides, an improved weighted loss function is adopted to address the sample imbalance issue. Finally, to improve the state estimation accuracy, the combination of Kullback-Leibler divergence and generalized intersection over union is introduced. Extensive experiments are performed on the three widely used benchmarks, and the results demonstrate the state-of-the-art performance of the proposed method. Code and models are available at https://github.com/wahahamyt/CAT.git


2021



Peer-reviewed
Cutoff Thermalization for Ornstein–Uhlenbeck Systems with Small Lévy Noise in the Wasserstein Distance
Authors:G. Barrera; M. A. Högele; J. C. Pardo
Summary:Abstract: This article establishes cutoff thermalization (also known as the cutoff phenomenon) for a class of generalized Ornstein–Uhlenbeck systems with ε-small additive Lévy noise and initial value x. The driving noise processes include Brownian motion, α-stable Lévy flights, finite intensity compound Poisson processes, and red noises, and may be highly degenerate. Window cutoff thermalization is shown under mild generic assumptions; that is, we see an asymptotically sharp collapse of the renormalized Wasserstein distance from the current state to the equilibrium measure along a time window centered on a precise ε-dependent time scale. In many interesting situations such as reversible (Lévy) diffusions it is possible to prove the existence of an explicit, universal, deterministic cutoff thermalization profile. That is, for generic initial data x we obtain a stronger limit statement, involving spectral constants, whenever the respective distance is finite. The existence of this limit is characterized by the absence of non-normal growth patterns in terms of an orthogonality condition on a computable family of generalized eigenvectors. Precise error bounds are given. Using these results, this article provides a complete discussion of the cutoff phenomenon for the classical linear oscillator with friction subject to ε-small Brownian motion or α-stable Lévy flights. Furthermore, we cover the highly degenerate case of a linear chain of oscillators in a generalized heat bath at low temperature.
Article, 2021
Publication:Journal of Statistical Physics, 184, 20210830
Publisher:2021
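A purely illustrative sketch of cutoff-type behaviour for a one-dimensional Ornstein–Uhlenbeck process with small noise, using the closed-form W2 distance between Gaussians (the parameter names x0, theta, and eps are hypothetical, not taken from the paper):

# W2 distance between law(X_t) and the stationary law for a 1D OU process (sketch).
import numpy as np

def w2_ou_to_equilibrium(t, x0=5.0, theta=1.0, eps=0.01):
    mean_t = x0 * np.exp(-theta * t)                              # mean of X_t
    var_t = eps ** 2 / (2 * theta) * (1 - np.exp(-2 * theta * t))
    var_inf = eps ** 2 / (2 * theta)                              # stationary variance
    # W2 between two 1D Gaussians: sqrt((m1 - m2)^2 + (s1 - s2)^2)
    return np.sqrt(mean_t ** 2 + (np.sqrt(var_t) - np.sqrt(var_inf)) ** 2)

for t in (1.0, 5.0, 6.0, 7.0, 10.0):
    print(t, w2_ou_to_equilibrium(t))
# The distance tracks |x0| * exp(-theta * t) and drops to the noise scale eps over a
# comparatively short time window, the cutoff-type behaviour discussed above.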

A Wasserstein distance based multiobjective evolutionary algorithm for the risk aware optimization of sensor placement
Authors:Andrea Ponti; Antonio Candelieri; Francesco Archetti
Summary:In this paper we propose a new algorithm for the identification of optimal "sensing spots", within a network, for monitoring the spread of "effects" triggered by "events". This problem is referred to as "Optimal Sensor Placement" and many real-world problems fit into this general framework. In this paper sensor placement (SP) (i.e., location of sensors at some nodes) for the early detection of contaminants in water distribution networks (WDNs) will be used as a running example. Usually, we have to manage a trade-off between different objective functions, so that we are faced with a multi-objective optimization problem (MOP). The best trade-off between the objectives can be defined in terms of Pareto optimality. In this paper we model the sensor placement problem as a multi-objective optimization problem with boolean decision variables and propose a Multi Objective Evolutionary Algorithm (MOEA) for approximating and analyzing the Pareto set.
Article
Publication:Intelligent Systems with Applications, 10-11, July-September 2021
 
Peer-reviewed
DerainGAN: Single image deraining using wasserstein GAN
Authors:Sahil Yadav; Aryan Mehra; Honnesh Rohmetra; Rahul Ratnakumar; Pratik Narang
Summary:Abstract: Rainy weather greatly affects the visibility of salient objects and scenes in the captured images and videos. The object/scene visibility varies with the type of raindrops, i.e. adherent rain droplets, streaks, rain, mist, etc. Moreover, they pose multifaceted challenges to detect and remove the raindrops to reconstruct the rain-free image for higher-level tasks like object detection, road segmentation etc. Recently, both Convolutional Neural Networks (CNN) and Generative Adversarial Network (GAN) based models have been designed to remove rain droplets from a single image by dealing with it as an image to image mapping problem. However, most of them fail to capture the complexities of the task, create blurry output, or are not time efficient. GANs are a prime candidate for solving this problem as they are extremely effective in learning image maps without harsh overfitting. In this paper, we design a simple yet effective 'DerainGAN' framework to achieve improved deraining performance over the existing state-of-the-art methods. The learning is based on a Wasserstein GAN and perceptual loss incorporated into the architecture. We empirically analyze the effect of different parameter choices to train the model for better optimization. We also identify the strengths and limitations of various components for single image deraining by performing multiple ablation studies on our model. The robustness of the proposed method is evaluated over two synthetic and one real-world rainy image datasets using Peak Signal to Noise Ratio (PSNR) and Structural Similarity Index Measure (SSIM) values. The proposed DerainGAN significantly outperforms almost all state-of-the-art models in Rain100L and Rain700 datasets, both in semantic and visual appearance, achieving SSIM of 0.8201 and PSNR 24.15 in Rain700 and SSIM of 0.8701 and PSNR of 28.30 in Rain100L. This accounts for an average improvement of 10 percent in PSNR and 20 percent in SSIM over benchmarked methods. Moreover, the DerainGAN is one of the fastest methods in terms of time taken to process the image, giving it over 0.1 to 150 seconds of advantage in some cases.
Article, 2021
Publication:Multimedia Tools and Applications : An International Journal, 80, 20210907, 36491
Publisher:2021

Peer-reviewed
2D Wasserstein loss for robust facial landmark detection
Authors:Yongzhe Yan; Stefan Duffner; Priyanka Phutane; Anthony Berthelier; Christophe Blanc; Christophe Garcia; Thierry Chateau
Summary:The recent performance of facial landmark detection has been significantly improved by using deep Convolutional Neural Networks (CNNs), especially the Heatmap Regression Models (HRMs). Although their performance on common benchmark datasets has reached a high level, the robustness of these models still remains a challenging problem in the practical use under noisy conditions of realistic environments. Contrary to most existing work focusing on the design of new models, we argue that improving the robustness requires rethinking many other aspects, including the use of datasets, the format of landmark annotation, the evaluation metric as well as the training and detection algorithm itself. In this paper, we propose a novel method for robust facial landmark detection, using a loss function based on the 2D Wasserstein distance combined with a new landmark coordinate sampling relying on the barycenter of the individual probability distributions. Our method can be plugged-and-play on most state-of-the-art HRMs with neither additional complexity nor structural modifications of the models. Further, with the large performance increase, we found that current evaluation metrics can no longer fully reflect the robustness of these models. Therefore, we propose several improvements to the standard evaluation protocol. Extensive experimental results on both traditional evaluation metrics and our evaluation metrics demonstrate that our approach significantly improves the robustness of state-of-the-art facial landmark detection models.
Article
Publication:Pattern Recognition, 116, August 2021
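To make the loss concrete, the following sketch (using POT; an assumption, not the paper's differentiable implementation) evaluates an exact optimal-transport cost between a predicted heatmap and a Gaussian target heatmap on a small grid:

# Optimal-transport cost between two normalized heatmaps on a pixel grid (POT sketch).
import numpy as np
import ot

H = W = 32
ys, xs = np.mgrid[0:H, 0:W]
coords = np.stack([ys.ravel(), xs.ravel()], axis=1).astype(float)
M = ot.dist(coords, coords, metric='euclidean')        # ground cost between pixels

def gaussian_heatmap(cy, cx, sigma=2.0):
    h = np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2 * sigma ** 2))
    return (h / h.sum()).ravel()

pred = gaussian_heatmap(14.0, 15.0)                     # predicted landmark heatmap
target = gaussian_heatmap(16.0, 18.0)                   # ground-truth landmark heatmap

w_cost = ot.emd2(pred, target, M)                       # exact OT cost between the heatmaps
print("2D Wasserstein-type loss ~", w_cost)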

Peer-reviewed
A data-driven distributionally robust newsvendor model with a Wasserstein ambiguity set
Authors:Sangyoon Lee; Hyunwoo Kim; Ilkyeong Moon
Summary:In this paper, we derive a closed-form solution and an explicit characterization of the worst-case distribution for the data-driven distributionally robust newsvendor model with an ambiguity set based on the Wasserstein distance of order p. We also consider the risk-averse decision with the Conditional Value-at-Risk (CVaR) objective. For the risk-averse model, we derive a closed-form solution for the p=1 case, and propose a tractable formulation to obtain an optimal order quantity for the p>1 case. We conduct numerical experiments to compare out-of-sample performance and convergence results of the proposed solutions against the solutions with other distributionally robust models. We also analyze the risk-averse solutions compared to the risk-neutral solutions.
Article
Publication:Journal of the Operational Research Society, 72, 20210803, 1879

<——2021———2021——27010—



 
Peer-reviewed
De-aliased seismic data interpolation using conditional Wasserstein generative adversarial networks
Authors:Qing Wei; Xiangyang Li; Mingpeng Song
Summary:When sampling at offset is too coarse during seismic acquisition, spatial aliasing will appear, affecting the accuracy of subsequent processing. The receiver spacing can be reduced by interpolating one or more traces between every two traces to remove the spatial aliasing. And the seismic data with spatial aliasing can be seen as regular missing data. Deep learning is an efficient method for seismic data interpolation. We propose to interpolate the regular missing seismic data to remove the spatial aliasing by using conditional generative adversarial networks (cGAN). Wasserstein distance, which can avoid gradient vanishing and mode collapse, is used in training cGAN (cWGAN) to improve the quality of the interpolated data. One velocity model is designed to simulate the training dataset. Test results on different seismic datasets show that the cWGAN with Wasserstein distance is an accurate way for de-aliased seismic data interpolation. Unlike the traditional interpolation methods, cWGAN can avoid the assumptions of low-rank, sparsity, or linearity of seismic data. Besides, once the neural network is trained, we do not need to test different parameters for the best interpolation result, which will improve efficiency.
Article
Publication:Computers and Geosciences, 154, September 2021
 

Peer-reviewed
Wasserstein statistics in one-dimensional location scale models
Authors:Shun-ichi Amari; Takeru Matsuda
Summary:Abstract: Wasserstein geometry and information geometry are two important structures to be introduced in a manifold of probability distributions. Wasserstein geometry is defined by using the transportation cost between two distributions, so it reflects the metric of the base manifold on which the distributions are defined. Information geometry is defined to be invariant under reversible transformations of the base space. Both have their own merits for applications. In this study, we analyze statistical inference based on the Wasserstein geometry in the case that the base space is one-dimensional. By using the location-scale model, we further derive the W-estimator that explicitly minimizes the transportation cost from the empirical distribution to a statistical model and study its asymptotic behaviors. We show that the W-estimator is consistent and explicitly give its asymptotic distribution by using the functional delta method. The W-estimator is Fisher efficient in the Gaussian case.
Article, 2021
Publication:Annals of the Institute of Statistical Mathematics, 74, 20210315, 33
Publisher:2021
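A numerical sketch of a W-estimator for a Gaussian location-scale family, relying only on the standard one-dimensional identity that squared W2 is the integrated squared difference of quantile functions (the closed-form analysis of the paper is not reproduced here):

# Fit a Gaussian location-scale model by minimizing the empirical 1D W2^2 (sketch).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(3)
data = rng.normal(loc=1.5, scale=2.0, size=2000)
data_sorted = np.sort(data)
u = (np.arange(len(data)) + 0.5) / len(data)            # plotting positions in (0, 1)

def w2_squared(params):
    mu, log_sigma = params
    model_q = norm.ppf(u, loc=mu, scale=np.exp(log_sigma))
    return np.mean((data_sorted - model_q) ** 2)        # discretized 1D W2^2

res = minimize(w2_squared, x0=np.array([0.0, 0.0]), method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"W-estimates: mu ~ {mu_hat:.3f}, sigma ~ {sigma_hat:.3f}")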



Peer-reviewed
Classification of atomic environments via the Gromov-Wasserstein distance
Authors:Sakura Kawano; Jeremy K. Mason
Summary:Interpreting molecular dynamics simulations usually involves automated classification of local atomic environments to identify regions of interest. Existing approaches are generally limited to a small number of reference structures and only include limited information about the local chemical composition. This work proposes to use a variant of the Gromov-Wasserstein (GW) distance to quantify the difference between a local atomic environment and a set of arbitrary reference environments in a way that is sensitive to atomic displacements, missing atoms, and differences in chemical composition. This involves describing a local atomic environment as a finite metric measure space, which has the additional advantages of not requiring the local environment to be centered on an atom and of not making any assumptions about the material class. Numerical examples illustrate the efficacy and versatility of the algorithm.
Article
Publication:Computational Materials Science, 188, 2021-02-15
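A toy comparison of two local environments via the plain Gromov-Wasserstein discrepancy in POT (an assumption; the paper's variant additionally accounts for chemical composition and missing atoms):

# Gromov-Wasserstein discrepancy between two point clouds' distance matrices (POT sketch).
import numpy as np
import ot

rng = np.random.default_rng(4)
env_a = rng.normal(size=(12, 3))                 # 12 atom positions (toy data)
env_b = env_a + 0.05 * rng.normal(size=(12, 3))  # a slightly perturbed environment

C1 = ot.dist(env_a, env_a, metric='euclidean')   # pairwise distances within env_a
C2 = ot.dist(env_b, env_b, metric='euclidean')
p = ot.unif(len(env_a))                          # uniform weights on atoms
q = ot.unif(len(env_b))

gw = ot.gromov.gromov_wasserstein2(C1, C2, p, q, loss_fun='square_loss')
print("GW discrepancy ~", gw)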

2021


Peer-reviewed
Precise limit in Wasserstein distance for conditional empirical measures of Dirichlet diffusion processes
Author:Feng-Yu Wang
Summary:Let $M$ be a $d$-dimensional connected compact Riemannian manifold with boundary $\partial M$, let $V\in C^2(M)$ be such that $\mu(dx):=e^{V(x)}dx$ is a probability measure, and let $X_t$ be the diffusion process generated by $L:=\Delta+\nabla V$ with $\tau:=\inf\{t\ge 0: X_t\in\partial M\}$. Consider the conditional empirical measure $\mu_t^{\nu}:=\mathbb{E}^{\nu}\big(\tfrac{1}{t}\int_0^t \delta_{X_s}\,ds\,\big|\,t<\tau\big)$ for the diffusion process with initial distribution $\nu$ such that $\nu(\partial M)<1$. Then
$$\lim_{t\to\infty}\big\{t\,W_2(\mu_t^{\nu},\mu_0)\big\}^2=\frac{1}{\{\mu(\phi_0)\nu(\phi_0)\}^2}\sum_{m=1}^{\infty}\frac{\{\nu(\phi_0)\mu(\phi_m)+\mu(\phi_0)\nu(\phi_m)\}^2}{(\lambda_m-\lambda_0)^3},$$
where $\nu(f):=\int_M f\,d\nu$ for a measure $\nu$ and $f\in L^1(\nu)$, $\mu_0:=\phi_0^2\mu$, $\{\phi_m\}_{m\ge 0}$ is the eigenbasis of $-L$ in $L^2(\mu)$ with the Dirichlet boundary condition, $\{\lambda_m\}_{m\ge 0}$ are the corresponding Dirichlet eigenvalues, and $W_2$ is the $L^2$-Wasserstein distance induced by the Riemannian metric.
Article, 2021
Publication:Journal of Functional Analysis, 280, 20210601
Publisher:2021

Peer-reviewed
On the computational complexity of finding a sparse Wasserstein barycenter
Authors: Steffen Borgwardt; Stephan Patterson
Summary: The discrete Wasserstein barycenter problem is a minimum-cost mass transport problem for a set of probability measures with finite support. In this paper, we show that finding a barycenter of sparse support is hard, even in dimension 2 and for only 3 measures. We prove this claim by showing that a special case of an intimately related decision problem, SCMP (does there exist a measure with a non-mass-splitting transport cost and support size below prescribed bounds?), is NP-hard for all rational data. Our proof is based on a reduction from planar 3-dimensional matching and follows a strategy laid out by Spieksma and Woeginger (Eur J Oper Res 91:611–618, 1996) for a reduction to planar, minimum circumference 3-dimensional matching. While we closely mirror the actual steps of their proof, the arguments themselves differ fundamentally due to the complex nature of the discrete barycenter problem. Containment of SCMP in NP remains open. We prove that, for a given measure, sparsity and cost of an optimal transport to a set of measures can be verified in polynomial time in the size of a bit encoding of the measure. However, the encoding size of a barycenter may be exponential in the encoding size of the underlying measures.
Article, 2021
Publication:Journal of Combinatorial Optimization, 41, 20210303, 736
Publisher:2021
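
For contrast with the hardness result above, the one-dimensional case is easy: the 2-Wasserstein barycenter of measures on the real line is obtained by averaging quantile functions. A small sketch (my own illustration, not from the paper):

# In contrast to the hardness result in dimension >= 2, a 2-Wasserstein barycenter of
# one-dimensional measures is easy: its quantile function is the (weighted) average of the
# input quantile functions. Illustration with three empirical measures.
import numpy as np

rng = np.random.default_rng(1)
samples = [rng.normal(-2, 0.5, 300), rng.normal(0, 1.0, 300), rng.normal(3, 2.0, 300)]
weights = [1/3, 1/3, 1/3]

u = np.linspace(0.005, 0.995, 200)                      # common quantile grid
quantiles = [np.quantile(s, u) for s in samples]
bary_quantile = sum(w * q for w, q in zip(weights, quantiles))

# bary_quantile is the quantile function of the barycenter; sampling from it is just
# inverse-transform sampling.
bary_sample = np.interp(rng.uniform(size=1000), u, bary_quantile)
print(bary_sample.mean())   # approximately the weighted average of the input means (about 1/3 here)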

  

Wasserstein barycenters of compactly supported measures
Authors: Sejong Kim; Hosoo Lee
Summary:Abstract: We consider in this paper probability measures with compact support on the open convex cone of positive definite Hermitian matrices. We define the least squares barycenter for the Bures–Wasserstein distance, called the Wasserstein barycenter, as a unique minimizer of the integral of squared Bures–Wasserstein distances. Furthermore, interesting properties of the Wasserstein barycenter including the iteration approach via optimal transport maps, the boundedness and extended Lie–Trotter–Kato formula, and several inequalities with power means have been established in the setting of compactly supported measuresShow more
Article, 2021
Publication:Analysis and Mathematical Physics, 11, 20210814
Publisher:2021
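
The Bures-Wasserstein distance used in this line of work has a closed form on positive definite matrices; a small sketch (my own illustration, assuming scipy.linalg.sqrtm for the matrix square roots):

# Closed-form Bures-Wasserstein distance between two positive definite matrices
# (equivalently, between centered Gaussians with these covariances). Illustrative sketch.
import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein(A, B):
    # d_BW(A, B)^2 = tr(A) + tr(B) - 2 tr( (A^{1/2} B A^{1/2})^{1/2} )
    rA = sqrtm(A)
    cross = sqrtm(rA @ B @ rA)
    return np.sqrt(max(np.trace(A) + np.trace(B) - 2 * np.trace(cross).real, 0.0))

A = np.array([[2.0, 0.3], [0.3, 1.0]])
B = np.array([[1.0, -0.2], [-0.2, 0.5]])
print(bures_wasserstein(A, B))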

Peer-reviewed
Wasserstein distance to independence models
Authors: Türkü Özlüm Çelik; Asgar Jamneshan; Guido Montúfar; Bernd Sturmfels; Lorenzo Venturello
Summary:An independence model for discrete random variables is a Segre-Veronese variety in a probability simplex. Any metric on the set of joint states of the random variables induces a Wasserstein metric on the probability simplex. The unit ball of this polyhedral norm is dual to the Lipschitz polytope. Given any data distribution, we seek to minimize its Wasserstein distance to a fixed independence model. The solution to this optimization problem is a piecewise algebraic function of the data. We compute this function explicitly in small instances, we study its combinatorial structure and algebraic degrees in general, and we present some experimental case studiesShow more
Article
Publication:Journal of Symbolic Computation, 104, May-June 2021, 855
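
For discrete distributions, the Wasserstein distance being minimized here is itself a linear program; a small self-contained sketch with SciPy (my own illustration; the cost matrix below is an arbitrary toy metric on three joint states, not one of the paper's instances):

# The Wasserstein distance between two discrete distributions is a linear program
# (minimize <C, P> over couplings P with prescribed marginals). Tiny sketch with SciPy.
import numpy as np
from scipy.optimize import linprog

def wasserstein_lp(p, q, C):
    m, n = C.shape
    # Equality constraints: row sums of P equal p, column sums of P equal q.
    A_eq = np.zeros((m + n, m * n))
    for i in range(m):
        A_eq[i, i * n:(i + 1) * n] = 1.0
    for j in range(n):
        A_eq[m + j, j::n] = 1.0
    b_eq = np.concatenate([p, q])
    res = linprog(C.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
    return res.fun

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.2, 0.6])
C = np.abs(np.subtract.outer(np.arange(3), np.arange(3))).astype(float)
print(wasserstein_lp(p, q, C))      # W_1 on the path metric over states 0-1-2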


An inexact PAM method for computing Wasserstein barycenter with unknown supports
Authors: Yitian Qian; Shaohua Pan
Summary: The Wasserstein barycenter is the centroid of a collection of discrete probability distributions that minimizes the average 2-Wasserstein distance. This paper focuses on the computation of Wasserstein barycenters in the case where the support points are free, which is known to be a severe bottleneck in D2-clustering due to the large scale and nonconvexity. We develop an inexact proximal alternating minimization (iPAM) method for computing an approximate Wasserstein barycenter, and provide its global convergence analysis. This method can achieve good accuracy with a reduced computational cost when the unknown support points of the barycenter have low cardinality. Numerical comparisons with the 3-block B-ADMM in Ye et al. (IEEE Trans Signal Process 65:2317–2332, 2017) and an alternating minimization method involving LP subproblems, on synthetic and real data, show that the proposed iPAM can yield comparable or even slightly better objective values in less CPU time, so the computed barycenter plays a better role in D2-clustering.
Article, 2021
Publication:Computational and Applied Mathematics, 40, 20210211
Publisher:2021

<——2021———2021——2720—


Peer-reviewed
Sliced Wasserstein based Canonical Correlation Analysis for Cross-Domain Recommendation
Authors: Zian Zhao; Jie Nie; Chenglong Wang; Lei Huang
Summary: To address data sparsity and the cold-start problem, cross-domain recommendation is a promising research direction in recommender systems. The goal of cross-domain recommendation is to transfer learned knowledge from the source domain to the target domain, by various means, to improve recommendation performance, but most approaches suffer from distribution misalignment. In this paper, we propose a joint-learning cross-domain recommendation model that extracts domain-specific and common features simultaneously and uses only users' implicit feedback data, without additional auxiliary information. To the best of our knowledge, this is the first attempt to combine the sliced Wasserstein distance and canonical correlation analysis for the cross-domain recommendation scenario. One intuition is to reduce the reconstruction error of the variational-inference-based autoencoder model using optimal transportation theory; another is to improve the correlation between domains by incorporating the idea of canonical correlation analysis. With rigorous experiments, we empirically demonstrate that our model achieves better performance than state-of-the-art methods.
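
A minimal sketch of the sliced Wasserstein distance used here (my own illustration, not the authors' implementation): project both samples onto random directions and average the one-dimensional distances, which reduce to comparisons of order statistics.

# Monte Carlo sliced 2-Wasserstein distance between two point clouds of equal size.
import numpy as np

def sliced_wasserstein(X, Y, n_projections=200, seed=0):
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    theta = rng.normal(size=(n_projections, d))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)    # directions on the unit sphere
    sw2 = 0.0
    for t in theta:
        x_proj = np.sort(X @ t)
        y_proj = np.sort(Y @ t)
        sw2 += np.mean((x_proj - y_proj) ** 2)               # 1-D W_2^2 via order statistics
    return np.sqrt(sw2 / n_projections)

X = np.random.default_rng(1).normal(size=(500, 8))
Y = np.random.default_rng(2).normal(loc=0.5, size=(500, 8))
print(sliced_wasserstein(X, Y))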



Wasserstein $F$-tests and confidence bands for the Fréchet regression of density response curves
Authors: Alexander Petersen; Xi Liu; Afshin A. Divani
Summary:Data consisting of samples of probability density functions are increasingly prevalent, necessitating the development of methodologies for their analysis that respect the inherent nonlinearities associated with densities. In many applications, density curves appear as functional response objects in a regression model with vector predictors. For such models, inference is key to understand the importance of density-predictor relationships, and the uncertainty associated with the estimated conditional mean densities, defined as conditional Fréchet means under a suitable metric. Using the Wasserstein geometry of optimal transport, we consider the Fréchet regression of density curve responses and develop tests for global and partial effects, as well as simultaneous confidence bands for estimated conditional mean densities. The asymptotic behavior of these objects is based on underlying functional central limit theorems within Wasserstein space, and we demonstrate that they are asymptotically of the correct size and coverage, with uniformly strong consistency of the proposed tests under sequences of contiguous alternatives. The accuracy of these methods, including nominal size, power and coverage, is assessed through simulations, and their utility is illustrated through a regression analysis of post-intracerebral hemorrhage hematoma densities and their associations with a set of clinical and radiological covariatesShow more
Downloadable Article
Publication: https://projecteuclid.org/euclid.aos/1611889241 ; Ann. Statist., 49, 2021-02, 590



Peer-reviewed
Sampled Gromov Wasserstein
Authors: Tanguy Kerdoncuff; Rémi Emonet; Marc Sebban
Summary: Optimal Transport (OT) has proven to be a powerful tool to compare probability distributions in machine learning, but dealing with probability measures lying in different spaces remains an open problem. To address this issue, the Gromov-Wasserstein distance (GW) only considers intra-distribution pairwise (dis)similarities. However, for two (discrete) distributions with $N$ points, state-of-the-art solvers have an iterative $O(N^4)$ complexity when using an arbitrary loss function, making most real-world problems intractable. In this paper, we introduce a new iterative way to approximate GW, called Sampled Gromov Wasserstein, which uses the current estimate of the transport plan to guide the sampling of cost matrices. This simple idea, supported by theoretical convergence guarantees, comes with an $O(N^2)$ solver. A special case of Sampled Gromov Wasserstein, which can be seen as the natural extension of the well-known Sliced Wasserstein to distributions lying in different spaces, reduces the complexity even further to $O(N \log N)$. Our contributions are supported by experiments on synthetic and real datasets.
Article, 2021
Publication:Machine Learning, 110, 20210726, 2151
Publisher:2021
 
Peer-reviewed
Predictive density estimation under the Wasserstein loss
Authors: Takeru Matsuda; William E. Strawderman
Summary: We investigate predictive density estimation under the $L^2$-Wasserstein loss for location families and location-scale families. We show that plug-in densities form a complete class and that the Bayesian predictive density is given by the plug-in density with the posterior mean of the location and scale parameters. We provide Bayesian predictive densities that dominate the best equivariant one in normal models. Simulation results are also presented.
Article, 2021
Publication:Journal of Statistical Planning and Inference, 210, 202101, 53
Publisher:2021
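
A quick illustration of why the Wasserstein loss is convenient for location-scale families (my own sketch, not the paper's code): between univariate normals the 2-Wasserstein distance has a closed form, which the snippet checks against the quantile-coupling definition.

# For univariate normals, W_2(N(m1, s1^2), N(m2, s2^2))^2 = (m1 - m2)^2 + (s1 - s2)^2.
# Quick numerical check against the quantile-based definition.
import numpy as np
from scipy.stats import norm

def w2_gaussians(m1, s1, m2, s2):
    return np.sqrt((m1 - m2) ** 2 + (s1 - s2) ** 2)

u = (np.arange(20000) + 0.5) / 20000
q1, q2 = norm.ppf(u, loc=1.0, scale=2.0), norm.ppf(u, loc=-0.5, scale=0.7)
print(w2_gaussians(1.0, 2.0, -0.5, 0.7))            # closed form
print(np.sqrt(np.mean((q1 - q2) ** 2)))             # quantile-coupling approximation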

 
Peer-reviewed
Necessary Optimality Conditions for Optimal Control Problems in Wasserstein Spaces
Authors: Benoît Bonnet; Hélène Frankowska
Summary:Abstract: In this article, we derive first-order necessary optimality conditions for a constrained optimal control problem formulated in the Wasserstein space of probability measures. To this end, we introduce a new notion of localised metric subdifferential for compactly supported probability measures, and investigate the intrinsic linearised Cauchy problems associated to non-local continuity equations. In particular, we show that when the velocity perturbations belong to the tangent cone to the convexification of the set of admissible velocities, the solutions of these linearised problems are tangent to the solution set of the corresponding continuity inclusion. We then make use of these novel concepts to provide a synthetic and geometric proof of the celebrated Pontryagin Maximum Principle for an optimal control problem with inequality final-point constraints. In addition, we propose sufficient conditions ensuring the normality of the maximum principleShow more
Article, 2021
Publication:Applied Mathematics & Optimization, 84, 20210519, 1281
Publisher:2021

2021


Lagrangian schemes for Wasserstein gradient flows
Authors: Jose A. Carrillo; Daniel Matthes; Marie-Therese Wolfram
Summary:This chapter reviews different numerical methods for specific examples of Wasserstein gradient flows: we focus on nonlinear Fokker-Planck equations, but also discuss discretizations of the parabolic-elliptic Keller-Segel model and of the fourth order thin film equation. The methods under review are of Lagrangian nature, that is, the numerical approximations trace the characteristics of the underlying transport equation rather than solving the evolution equation for the mass density directly. The two main approaches are based on integrating the equation for the Lagrangian maps on the one hand, and on solution of coupled ODEs for individual mass particles on the other handShow more
Article, 2021
Publication:Geometric Partial Differential Equations - Part II, 22, 2021, 271
Publisher:2021
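
A toy example of the Lagrangian (particle) viewpoint reviewed in this chapter (my own sketch, not one of the chapter's schemes): for an aggregation-drift equation, equal-mass particles follow coupled ODEs obtained from the Wasserstein gradient flow structure, here discretized with an explicit Euler step and quadratic potentials chosen purely for illustration.

# Minimal Lagrangian particle scheme for the aggregation-drift equation
#   d/dt rho = div( rho * grad(V + W * rho) ),
# where equal-mass particles follow  dX_i/dt = -V'(X_i) - (1/N) sum_j W'(X_i - X_j).
import numpy as np

N, dt, steps = 200, 0.01, 500
V_prime = lambda x: x                      # confining potential V(x) = x^2 / 2
W_prime = lambda r: r                      # interaction kernel  W(r) = r^2 / 2

X = np.random.default_rng(0).uniform(-3, 3, N)              # initial particle positions
for _ in range(steps):
    interaction = (X[:, None] - X[None, :]).mean(axis=1)    # (1/N) * sum_j W'(X_i - X_j)
    X = X - dt * (V_prime(X) + interaction)                 # explicit Euler step along the flow

print(X.mean(), X.std())   # particles concentrate near the minimizer of V (here x = 0)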


A Bismut–Elworthy inequality for a Wasserstein diffusion on the circle
Author:Victor Marx
Summary: We introduce in this paper a strategy to prove gradient estimates for some infinite-dimensional diffusions on Wasserstein spaces. For a specific example of a diffusion on the Wasserstein space of the torus, we obtain a Bismut-Elworthy-Li formula up to a remainder term and deduce a gradient estimate with an explicit rate of blow-up.
Article, 2021
Publication:Stochastics and Partial Differential Equations: Analysis and Computations, 10, 20211012, 1559
Publisher:2021

Lifting couplings in Wasserstein spaces
Author:Perrone, Paolo (Creator)
Summary:This paper makes mathematically precise the idea that conditional probabilities are analogous to path liftings in geometry. The idea of lifting is modelled in terms of the category-theoretic concept of a lens, which can be interpreted as a consistent choice of arrow liftings. The category we study is the one of probability measures over a given standard Borel space, with morphisms given by the couplings, or transport plans. The geometrical picture is even more apparent once we equip the arrows of the category with weights, which one can interpret as "lengths" or "costs", forming a so-called weighted category, which unifies several concepts of category theory and metric geometry. Indeed, we show that the weighted version of a lens is tightly connected to the notion of submetry in geometry. Every weighted category gives rise to a pseudo-quasimetric space via optimization over the arrows. In particular, Wasserstein spaces can be obtained from the weighted categories of probability measures and their couplings, with the weight of a coupling given by its cost. In this case, conditionals allow one to form weighted lenses, which one can interpret as "lifting transport plans, while preserving their cost"Show more
Downloadable Archival Material, 2021-10-13
Undefined
Publisher:2021-10-13



Schema matching using Gaussian mixture models with Wasserstein distance
Authors:Przyborowski, Mateusz (Creator), Pabiś, Mateusz (Creator), Janusz, Andrzej (Creator), Ślęzak, Dominik (Creator)
Summary:Gaussian mixture models find their place as a powerful tool, mostly in the clustering problem, but with proper preparation also in feature extraction, pattern recognition, image segmentation and in general machine learning. When faced with the problem of schema matching, different mixture models computed on different pieces of data can maintain crucial information about the structure of the dataset. In order to measure or compare results from mixture models, the Wasserstein distance can be very useful, however it is not easy to calculate for mixture distributions. In this paper we derive one of possible approximations for the Wasserstein distance between Gaussian mixture models and reduce it to linear problem. Furthermore, application examples concerning real world data are shownShow more

Downloadable Archival Material, 2021-11-28
Undefined
Publisher:2021-11-28
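
One common way to make the Wasserstein distance between Gaussian mixtures tractable, in the spirit of the reduction to a linear problem described above (my own sketch, not necessarily the authors' approximation): restrict the coupling to be a mixture coupling, which leaves a small discrete optimal transport problem whose ground cost is the pairwise squared 2-Wasserstein distance between Gaussian components; the snippet assumes the POT package for the discrete step.

# Mixture-level approximation of the Wasserstein distance between two Gaussian mixtures:
# discrete OT over component weights with the closed-form Gaussian W_2^2 as ground cost.
import numpy as np
from scipy.linalg import sqrtm
import ot

def gaussian_w2_sq(m1, S1, m2, S2):
    r1 = sqrtm(S1)
    bures = np.trace(S1) + np.trace(S2) - 2 * np.trace(sqrtm(r1 @ S2 @ r1)).real
    return float(np.sum((m1 - m2) ** 2) + max(bures, 0.0))

def mixture_w2(weights1, means1, covs1, weights2, means2, covs2):
    D = np.array([[gaussian_w2_sq(m1, S1, m2, S2)
                   for m2, S2 in zip(means2, covs2)]
                  for m1, S1 in zip(means1, covs1)])
    return np.sqrt(ot.emd2(np.asarray(weights1, float), np.asarray(weights2, float), D))

w1, mu1, C1 = [0.6, 0.4], [np.zeros(2), np.array([3.0, 0.0])], [np.eye(2), 0.5 * np.eye(2)]
w2, mu2, C2 = [0.5, 0.5], [np.array([0.5, 0.0]), np.array([3.0, 1.0])], [np.eye(2), np.eye(2)]
print(mixture_w2(w1, mu1, C1, w2, mu2, C2))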



Wasserstein Distance Maximizing Intrinsic Control
Authors:Durugkar, Ishan (Creator), Hansen, Steven (Creator), Spencer, Stephen (Creator), Mnih, Volodymyr (Creator)
Summary:This paper deals with the problem of learning a skill-conditioned policy that acts meaningfully in the absence of a reward signal. Mutual information based objectives have shown some success in learning skills that reach a diverse set of states in this setting. These objectives include a KL-divergence term, which is maximized by visiting distinct states even if those states are not far apart in the MDP. This paper presents an approach that rewards the agent for learning skills that maximize the Wasserstein distance of their state visitation from the start state of the skill. It shows that such an objective leads to a policy that covers more distance in the MDP than diversity based objectives, and validates the results on a variety of Atari environmentsShow more
Downloadable Archival Material, 2021-10-28
Undefined
Publisher:2021-10-28 

<——2021———2021——2730—



Gamifying optimization: a Wasserstein distance-based analysis of human search
Authors:Candelieri, Antonio (Creator), Ponti, Andrea (Creator), Archetti, Francesco (Creator)
Summary:The main objective of this paper is to outline a theoretical framework to characterise humans' decision-making strategies under uncertainty, in particular active learning in a black-box optimization task and trading-off between information gathering (exploration) and reward seeking (exploitation). Humans' decisions making according to these two objectives can be modelled in terms of Pareto rationality. If a decision set contains a Pareto efficient strategy, a rational decision maker should always select the dominant strategy over its dominated alternatives. A distance from the Pareto frontier determines whether a choice is Pareto rational. To collect data about humans' strategies we have used a gaming application that shows the game field, with previous decisions and observations, as well as the score obtained. The key element in this paper is the representation of behavioural patterns of human learners as a discrete probability distribution. This maps the problem of the characterization of humans' behaviour into a space whose elements are probability distributions structured by a distance between histograms, namely the Wasserstein distance (WST). The distributional analysis gives new insights about human search strategies and their deviations from Pareto rationality. Since the uncertainty is one of the two objectives defining the Pareto frontier, the analysis has been performed for three different uncertainty quantification measures to identify which better explains the Pareto compliant behavioural patterns. Beside the analysis of individual patterns WST has also enabled a global analysis computing the barycenters and WST k-means clustering. A further analysis has been performed by a decision tree to relate non-Paretian behaviour, characterized by exasperated exploitation, to the dynamics of the evolution of the reward seeking processShow more
Downloadable Archival Material, 2021-12-12
Undefined
Publisher:2021-12-12 



Image Inpainting Using Wasserstein Generative Adversarial Imputation Network
Authors:Vašata, Daniel (Creator), Halama, Tomáš (Creator), Friedjungová, Magda (Creator)
Summary:Image inpainting is one of the important tasks in computer vision which focuses on the reconstruction of missing regions in an image. The aim of this paper is to introduce an image inpainting model based on Wasserstein Generative Adversarial Imputation Network. The generator network of the model uses building blocks of convolutional layers with different dilation rates, together with skip connections that help the model reproduce fine details of the output. This combination yields a universal imputation model that is able to handle various scenarios of missingness with sufficient quality. To show this experimentally, the model is simultaneously trained to deal with three scenarios given by missing pixels at random, missing various smaller square regions, and one missing square placed in the center of the image. It turns out that our model achieves high-quality inpainting results on all scenarios. Performance is evaluated using peak signal-to-noise ratio and structural similarity index on two real-world benchmark datasets, CelebA faces and Paris StreetView. The results of our model are compared to biharmonic imputation and to some of the other state-of-the-art image inpainting methodsShow more
Downloadable Archival Material, 2021-06-23
Undefined
Publisher:2021-06-23 


Sampling From the Wasserstein Barycenter
Authors:Daaloul, Chiheb (Creator), Gouic, Thibaut Le (Creator), Liandrat, Jacques (Creator), Tournus, Magali (Creator)
Summary:This work presents an algorithm to sample from the Wasserstein barycenter of absolutely continuous measures. Our method is based on the gradient flow of the multimarginal formulation of the Wasserstein barycenter, with an additive penalization to account for the marginal constraints. We prove that the minimum of this penalized multimarginal formulation is achieved for a coupling that is close to the Wasserstein barycenter. The performances of the algorithm are showcased in several settingsShow more
Downloadable Archival Material, 2021-05-04
Undefined
Publisher:2021-05-04 



About exchanging expectation and supremum for conditional Wasserstein GANs
Author:Martin, Jörg (Creator)
Summary:In cases where a Wasserstein GAN depends on a condition the latter is usually handled via an expectation within the loss function. Depending on the way this is motivated, the discriminator is either required to be Lipschitz-1 in both or in only one of its arguments. For the weaker requirement to become usable one needs to exchange a supremum and an expectation. This is a mathematically perilous operation, which is, so far, only partially justified in the literature. This short mathematical note intends to fill this gap and provides the mathematical rationale for discriminators that are only partially Lipschitz-1 for cases where this approach is more appropriate or successfulShow more
Downloadable Archival Material, 2021-03-25
Undefined
Publisher:2021-03-25 


Large-Scale Wasserstein Gradient Flows
Authors:Mokrov, Petr (Creator), Korotin, Alexander (Creator), Li, Lingxiao (Creator), Genevay, Aude (Creator), Solomon, Justin (Creator), Burnaev, Evgeny (Creator)
Summary:Wasserstein gradient flows provide a powerful means of understanding and solving many diffusion equations. Specifically, Fokker-Planck equations, which model the diffusion of probability measures, can be understood as gradient descent over entropy functionals in Wasserstein space. This equivalence, introduced by Jordan, Kinderlehrer and Otto, inspired the so-called JKO scheme to approximate these diffusion processes via an implicit discretization of the gradient flow in Wasserstein space. Solving the optimization problem associated to each JKO step, however, presents serious computational challenges. We introduce a scalable method to approximate Wasserstein gradient flows, targeted to machine learning applications. Our approach relies on input-convex neural networks (ICNNs) to discretize the JKO steps, which can be optimized by stochastic gradient descent. Unlike previous work, our method does not require domain discretization or particle simulation. As a result, we can sample from the measure at each time step of the diffusion and compute its probability density. We demonstrate our algorithm's performance by computing diffusions following the Fokker-Planck equation and apply it to unnormalized density sampling as well as nonlinear filteringShow more
Downloadable Archival Material, 2021-06-01
Undefined
Publisher:2021-06-01 


2021


Low Budget Active Learning via Wasserstein Distance: An Integer Programming Approach
Authors:Mahmood, Rafid (Creator), Fidler, Sanja (Creator), Law, Marc T. (Creator)
Summary:Active learning is the process of training a model with limited labeled data by selecting a core subset of an unlabeled data pool to label. The large scale of data sets used in deep learning forces most sample selection strategies to employ efficient heuristics. This paper introduces an integer optimization problem for selecting a core set that minimizes the discrete Wasserstein distance from the unlabeled pool. We demonstrate that this problem can be tractably solved with a Generalized Benders Decomposition algorithm. Our strategy uses high-quality latent features that can be obtained by unsupervised learning on the unlabeled pool. Numerical results on several data sets show that our optimization approach is competitive with baselines and particularly outperforms them in the low budget regime where less than one percent of the data set is labeledShow more
Downloadable Archival Material, 2021-06-05
Undefined
Publisher:2021-06-05

Unpaired Single-Image Depth Synthesis with cycle-consistent Wasserstein GANs
Authors:Angermann, Christoph (Creator), Moravová, Adéla (Creator), Haltmeier, Markus (Creator), Jónsson, Steinbjörn (Creator), Laubichler, Christian (Creator)
Summary: Real-time estimation of actual environment depth is an essential module for various autonomous system tasks such as localization, obstacle detection and pose estimation. During the last decade of machine learning, extensive deployment of deep learning methods to computer vision tasks has yielded successful approaches for realistic depth synthesis out of a simple RGB modality. While most of these models rest on paired depth data or the availability of video sequences and stereo images, there is a lack of methods addressing single-image depth synthesis in an unsupervised manner. Therefore, in this study, the latest advancements in the field of generative neural networks are leveraged for fully unsupervised single-image depth synthesis. To be more exact, two cycle-consistent generators for RGB-to-depth and depth-to-RGB transfer are implemented and simultaneously optimized using the Wasserstein-1 distance. To ensure plausibility of the proposed method, we apply the models to a self-acquired industrial data set as well as to the well-known NYU Depth v2 data set, which allows comparison with existing approaches. The observed success in this study suggests high potential for unpaired single-image depth estimation in real-world applications.
Downloadable Archival Material, 2021-03-31
Undefined
Publisher:2021-03-31



Projected Sliced Wasserstein Autoencoder-based Hyperspectral Images Anomaly Detection
Authors:Chen, Yurong (Creator), Zhang, Hui (Creator), Wang, Yaonan (Creator), Wu, Q. M. Jonathan (Creator), Yang, Yimin (Creator)
Summary: Anomaly detection (AD) has been an active research area in various domains. Yet the increasing scale, complexity, and dimension of data make traditional methods challenging to apply. Recently, deep generative models, such as the variational autoencoder (VAE), have sparked renewed interest in the AD problem. However, the probability-distribution divergence used as the regularization is too strong, which prevents the model from capturing the manifold of the true data. In this paper, we propose the Projected Sliced Wasserstein (PSW) autoencoder-based anomaly detection method. Rooted in optimal transportation, the PSW distance is a weaker distribution measure compared with the $f$-divergence. In particular, a computation-friendly eigen-decomposition method is leveraged to find the principal components for slicing the high-dimensional data. In this case, the Wasserstein distance can be calculated in closed form, even when the prior distribution is not Gaussian. Comprehensive experiments conducted on various real-world hyperspectral anomaly detection benchmarks demonstrate the superior performance of the proposed method.
Downloadable Archival Material, 2021-12-20
Undefined
Publisher:2021-12-20 


 


Hidden Convexity of Wasserstein GANs: Interpretable Generative Models with Closed-Form Solutions
Authors:Sahiner, Arda (Creator), Ergen, Tolga (Creator), Ozturkler, Batu (Creator), Bartan, Burak (Creator), Pauly, John (Creator), Mardani, Morteza (Creator), Pilanci, Mert (Creator)
Summary:Generative Adversarial Networks (GANs) are commonly used for modeling complex distributions of data. Both the generators and discriminators of GANs are often modeled by neural networks, posing a non-transparent optimization problem which is non-convex and non-concave over the generator and discriminator, respectively. Such networks are often heuristically optimized with gradient descent-ascent (GDA), but it is unclear whether the optimization problem contains any saddle points, or whether heuristic methods can find them in practice. In this work, we analyze the training of Wasserstein GANs with two-layer neural network discriminators through the lens of convex duality, and for a variety of generators expose the conditions under which Wasserstein GANs can be solved exactly with convex optimization approaches, or can be represented as convex-concave games. Using this convex duality interpretation, we further demonstrate the impact of different activation functions of the discriminator. Our observations are verified with numerical results demonstrating the power of the convex interpretation, with applications in progressive training of convex architectures corresponding to linear generators and quadratic-activation discriminators for CelebA image generation. The code for our experiments is available at https://github.com/ardasahiner/ProCoGANShow more
Downloadable Archival Material, 2021-07-12
Undefined
Publisher:2021-07-12 



A unified framework for non-negative matrix and tensor factorisations with a smoothed Wasserstein loss
Author:Zhang, Stephen Y. (Creator)
Summary:Non-negative matrix and tensor factorisations are a classical tool for finding low-dimensional representations of high-dimensional datasets. In applications such as imaging, datasets can be regarded as distributions supported on a space with metric structure. In such a setting, a loss function based on the Wasserstein distance of optimal transportation theory is a natural choice since it incorporates the underlying geometry of the data. We introduce a general mathematical framework for computing non-negative factorisations of both matrices and tensors with respect to an optimal transport loss. We derive an efficient computational method for its solution using a convex dual formulation, and demonstrate the applicability of this approach with several numerical illustrations with both matrix and tensor-valued dataShow more
Downloadable Archival Material, 2021-04-04
Undefined
Publisher:2021-04-04

<——2021———2021——2740—



Wasserstein Distances, Geodesics and Barycenters of Merge Trees
Authors:Pont, Mathieu (Creator), Vidal, Jules (Creator), Delon, Julie (Creator), Tierny, Julien (Creator)
Summary:This paper presents a unified computational framework for the estimation of distances, geodesics and barycenters of merge trees. We extend recent work on the edit distance [106] and introduce a new metric, called the Wasserstein distance between merge trees, which is purposely designed to enable efficient computations of geodesics and barycenters. Specifically, our new distance is strictly equivalent to the L2-Wasserstein distance between extremum persistence diagrams, but it is restricted to a smaller solution space, namely, the space of rooted partial isomorphisms between branch decomposition trees. This enables a simple extension of existing optimization frameworks [112] for geodesics and barycenters from persistence diagrams to merge trees. We introduce a task-based algorithm which can be generically applied to distance, geodesic, barycenter or cluster computation. The task-based nature of our approach enables further accelerations with shared-memory parallelism. Extensive experiments on public ensembles and SciVis contest benchmarks demonstrate the efficiency of our approach -- with barycenter computations in the orders of minutes for the largest examples -- as well as its qualitative ability to generate representative barycenter merge trees, visually summarizing the features of interest found in the ensemble. We show the utility of our contributions with dedicated visualization applications: feature tracking, temporal reduction and ensemble clustering. We provide a lightweight C++ implementation that can be used to reproduce our resultsShow more
Downloadable Archival Material, 2021-07-16
Undefined
Publisher:2021-07-16


Variational Wasserstein Barycenters with c-Cyclical Monotonicity
Authors:Chi, Jinjin (Creator), Yang, Zhiyao (Creator), Ouyang, Jihong (Creator), Li, Ximing (Creator)
Summary:Wasserstein barycenter, built on the theory of optimal transport, provides a powerful framework to aggregate probability distributions, and it has increasingly attracted great attention within the machine learning community. However, it suffers from severe computational burden, especially for high dimensional and continuous settings. To this end, we develop a novel continuous approximation method for the Wasserstein barycenters problem given sample access to the input distributions. The basic idea is to introduce a variational distribution as the approximation of the true continuous barycenter, so as to frame the barycenters computation problem as an optimization problem, where parameters of the variational distribution adjust the proxy distribution to be similar to the barycenter. Leveraging the variational distribution, we construct a tractable dual formulation for the regularized Wasserstein barycenter problem with c-cyclical monotonicity, which can be efficiently solved by stochastic optimization. We provide theoretical analysis on convergence and demonstrate the practical effectiveness of our method on real applications of subset posterior aggregation and synthetic dataShow more
Downloadable Archival Material, 2021-10-22
Undefined
Publisher:2021-10-22

Wasserstein Graph Neural Networks for Graphs with Missing Attributes
Authors:Chen, Zhixian (Creator), Ma, Tengfei (Creator), Song, Yangqiu (Creator), Wang, Yang (Creator)
Summary:Missing node attributes is a common problem in real-world graphs. Graph neural networks have been demonstrated power in graph representation learning while their performance is affected by the completeness of graph information. Most of them are not specified for missing-attribute graphs and fail to leverage incomplete attribute information effectively. In this paper, we propose an innovative node representation learning framework, Wasserstein Graph Neural Network (WGNN), to mitigate the problem. To make the most of limited observed attribute information and capture the uncertainty caused by missing values, we express nodes as low-dimensional distributions derived from the decomposition of the attribute matrix. Furthermore, we strengthen the expressiveness of representations by developing a novel message passing schema that aggregates distributional information from neighbors in the Wasserstein space. We test WGNN in node classification tasks under two missing-attribute cases on both synthetic and real-world datasets. In addition, we find WGNN suitable to recover missing values and adapt them to tackle matrix completion problems with graphs of users and items. Experimental results on both tasks demonstrate the superiority of our methodShow more
Downloadable Archival Material, 2021-02-05
Undefined
Publisher:2021-02-05


A Wasserstein Minimax Framework for Mixed Linear Regression
Authors:Diamandis, Theo (Creator), Eldar, Yonina C. (Creator), Fallah, Alireza (Creator), Farnia, Farzan (Creator), Ozdaglar, Asuman (Creator)
Summary:Multi-modal distributions are commonly used to model clustered data in statistical learning tasks. In this paper, we consider the Mixed Linear Regression (MLR) problem. We propose an optimal transport-based framework for MLR problems, Wasserstein Mixed Linear Regression (WMLR), which minimizes the Wasserstein distance between the learned and target mixture regression models. Through a model-based duality analysis, WMLR reduces the underlying MLR task to a nonconvex-concave minimax optimization problem, which can be provably solved to find a minimax stationary point by the Gradient Descent Ascent (GDA) algorithm. In the special case of mixtures of two linear regression models, we show that WMLR enjoys global convergence and generalization guarantees. We prove that WMLR's sample complexity grows linearly with the dimension of data. Finally, we discuss the application of WMLR to the federated learning task where the training samples are collected by multiple agents in a network. Unlike the Expectation Maximization algorithm, WMLR directly extends to the distributed, federated learning setting. We support our theoretical results through several numerical experiments, which highlight our framework's ability to handle the federated learning setting with mixture modelsShow more
Downloadable Archival Material, 2021-06-14
Undefined
Publisher:2021-06-14


Wasserstein Generative Adversarial Uncertainty Quantification in Physics-Informed Neural Networks
Authors:Gao, Yihang (Creator), Ng, Michael K. (Creator)
Summary: In this paper, we study a physics-informed algorithm for Wasserstein Generative Adversarial Networks (WGANs) for uncertainty quantification in solutions of partial differential equations. By using groupsort activation functions in the adversarial network discriminators, network generators are utilized to learn the uncertainty in solutions of partial differential equations observed from the initial/boundary data. Under mild assumptions, we show that the generalization error of the computed generator converges to the approximation error of the network with high probability when sufficiently many samples are taken. According to our established error bound, we also find that our physics-informed WGANs place a higher requirement on the capacity of the discriminators than on that of the generators. Numerical results on synthetic examples of partial differential equations are reported to validate our theoretical results and demonstrate how uncertainty quantification can be obtained for solutions of partial differential equations and the distributions of initial/boundary data.
Downloadable Archival Material, 2021-08-30
Undefined
Publisher:2021-08-30


2021



ALLWAS: Active Learning on Language models in WASserstein space
Authors:Bastos, Anson (Creator), Kaul, Manohar (Creator)
Summary:Active learning has emerged as a standard paradigm in areas with scarcity of labeled training data, such as in the medical domain. Language models have emerged as the prevalent choice of several natural language tasks due to the performance boost offered by these models. However, in several domains, such as medicine, the scarcity of labeled training data is a common issue. Also, these models may not work well in cases where class imbalance is prevalent. Active learning may prove helpful in these cases to boost the performance with a limited label budget. To this end, we propose a novel method using sampling techniques based on submodular optimization and optimal transport for active learning in language models, dubbed ALLWAS. We construct a sampling strategy based on submodular optimization of the designed objective in the gradient domain. Furthermore, to enable learning from few samples, we propose a novel strategy for sampling from the Wasserstein barycenters. Our empirical evaluations on standard benchmark datasets for text classification show that our methods perform significantly better (>20% relative increase in some cases) than existing approaches for active learning on language modelsShow more
Downloadable Archival Material, 2021-09-03
Undefined
Publisher:2021-09-03

Wasserstein barycenters are NP-hard to compute
Authors:Altschuler, Jason M. (Creator), Boix-Adsera, Enric (Creator)
Summary:Computing Wasserstein barycenters (a.k.a. Optimal Transport barycenters) is a fundamental problem in geometry which has recently attracted considerable attention due to many applications in data science. While there exist polynomial-time algorithms in any fixed dimension, all known running times suffer exponentially in the dimension. It is an open question whether this exponential dependence is improvable to a polynomial dependence. This paper proves that unless P=NP, the answer is no. This uncovers a "curse of dimensionality" for Wasserstein barycenter computation which does not occur for Optimal Transport computation. Moreover, our hardness results for computing Wasserstein barycenters extend to approximate computation, to seemingly simple cases of the problem, and to averaging probability distributions in other Optimal Transport metricsShow more
Downloadable Archival Material, 2021-01-04
Undefined
Publisher:2021-01-04


FlowPool: Pooling Graph Representations with Wasserstein Gradient Flows
Author:Simou, Effrosyni (Creator)
Summary:In several machine learning tasks for graph structured data, the graphs under consideration may be composed of a varying number of nodes. Therefore, it is necessary to design pooling methods that aggregate the graph representations of varying size to representations of fixed size which can be used in downstream tasks, such as graph classification. Existing graph pooling methods offer no guarantee with regards to the similarity of a graph representation and its pooled version. In this work, we address this limitation by proposing FlowPool, a pooling method that optimally preserves the statistics of a graph representation to its pooled counterpart by minimising their Wasserstein distance. This is achieved by performing a Wasserstein gradient flow with respect to the pooled graph representation. Our method relies on a versatile implementation which can take into account the geometry of the representation space through any ground cost and computes the gradient of the Wasserstein distance with automatic differentiation. We propose the differentiation of the Wasserstein flow layer using an implicit differentiation scheme. Therefore, our pooling method is amenable to automatic differentiation and can be integrated in end-to-end deep learning architectures. Further, FlowPool is invariant to permutations and can therefore be combined with permutation equivariant feature extraction layers in GNNs in order to obtain predictions that are independent of the ordering of the nodes. Experimental results demonstrate that our method leads to an increase in performance compared to existing pooling methods when evaluated on graph classificationShow more
Downloadable Archival Material, 2021-12-18
Undefined
Publisher:2021-12-18


Robust Graph Learning Under Wasserstein Uncertainty
Authors:Zhang, Xiang (Creator), Xu, Yinfei (Creator), Liu, Qinghe (Creator), Liu, Zhicheng (Creator), Lu, Jian (Creator), Wang, Qiao (Creator)
Summary:Graphs are playing a crucial role in different fields since they are powerful tools to unveil intrinsic relationships among signals. In many scenarios, an accurate graph structure representing signals is not available at all and that motivates people to learn a reliable graph structure directly from observed signals. However, in real life, it is inevitable that there exists uncertainty in the observed signals due to noise measurements or limited observability, which causes a reduction in reliability of the learned graph. To this end, we propose a graph learning framework using Wasserstein distributionally robust optimization (WDRO) which handles uncertainty in data by defining an uncertainty set on distributions of the observed data. Specifically, two models are developed, one of which assumes all distributions in uncertainty set are Gaussian distributions and the other one has no prior distributional assumption. Instead of using interior point method directly, we propose two algorithms to solve the corresponding models and show that our algorithms are more time-saving. In addition, we also reformulate both two models into Semi-Definite Programming (SDP), and illustrate that they are intractable in the scenario of large-scale graph. Experiments on both synthetic and real world data are carried out to validate the proposed framework, which show that our scheme can learn a reliable graph in the context of uncertaintyShow more
Downloadable Archival Material, 2021-05-10
Undefined

Publisher:2021-05-1

<——2021———2021——2750—


Unsupervised Ground Metric Learning using Wasserstein Singular Vectors
Authors:Huizing, Geert-Jan (Creator), Cantini, Laura (Creator), Peyré, Gabriel (Creator)
Summary:Defining meaningful distances between samples, which are columns in a data matrix, is a fundamental problem in machine learning. Optimal Transport (OT) defines geometrically meaningful "Wasserstein" distances between probability distributions. However, a key bottleneck is the design of a "ground" cost which should be adapted to the task under study. OT is parametrized by a distance between the features (the rows of the data matrix): the "ground cost". However, there is usually no straightforward choice of distance on the features, and supervised metric learning is not possible either, leaving only ad-hoc approaches. Unsupervised metric learning is thus a fundamental problem to enable data-driven applications of OT. In this paper, we propose for the first time a canonical answer by simultaneously computing an OT distance between the rows and between the columns of a data matrix. These distance matrices emerge naturally as positive singular vectors of the function mapping ground costs to pairwise OT distances. We provide criteria to ensure the existence and uniqueness of these singular vectors. We then introduce scalable computational methods to approximate them in high-dimensional settings, using entropic regularization and stochastic approximation. First, we extend the definition using entropic regularization, and show that in the large regularization limit it operates a principal component analysis dimensionality reduction. Next, we propose a stochastic approximation scheme and study its convergence. Finally, we showcase Wasserstein Singular Vectors in the context of computational biology on a high-dimensional single-cell RNA-sequencing datasetShow more
Downloadable Archival Material, 2021-02-11
Undefined
Publisher:2021-02-11


Projection Robust Wasserstein Barycenters
Authors:Huang, Minhui (Creator), Ma, Shiqian (Creator), Lai, Lifeng (Creator)
Summary:Collecting and aggregating information from several probability measures or histograms is a fundamental task in machine learning. One of the popular solution methods for this task is to compute the barycenter of the probability measures under the Wasserstein metric. However, approximating the Wasserstein barycenter is numerically challenging because of the curse of dimensionality. This paper proposes the projection robust Wasserstein barycenter (PRWB) that has the potential to mitigate the curse of dimensionality. Since PRWB is numerically very challenging to solve, we further propose a relaxed PRWB (RPRWB) model, which is more tractable. The RPRWB projects the probability measures onto a lower-dimensional subspace that maximizes the Wasserstein barycenter objective. The resulting problem is a max-min problem over the Stiefel manifold. By combining the iterative Bregman projection algorithm and Riemannian optimization, we propose two new algorithms for computing the RPRWB. The complexity of arithmetic operations of the proposed algorithms for obtaining an $\epsilon$-stationary solution is analyzed. We incorporate the RPRWB into a discrete distribution clustering algorithm, and the numerical results on real text datasets confirm that our RPRWB model helps improve the clustering performance significantlyShow more
Downloadable Archival Material, 2021-02-05
Undefined
Publisher:2021-02-05

Controlling Wasserstein distances by Kernel norms with application to Compressive Statistical Learning
Authors:Vayer, Titouan (Creator), Gribonval, Rémi (Creator)
Summary:Comparing probability distributions is at the crux of many machine learning algorithms. Maximum Mean Discrepancies (MMD) and Optimal Transport distances (OT) are two classes of distances between probability measures that have attracted abundant attention in past years. This paper establishes some conditions under which the Wasserstein distance can be controlled by MMD norms. Our work is motivated by the compressive statistical learning (CSL) theory, a general framework for resource-efficient large scale learning in which the training data is summarized in a single vector (called sketch) that captures the information relevant to the considered learning task. Inspired by existing results in CSL, we introduce the H\"older Lower Restricted Isometric Property (H\"older LRIP) and show that this property comes with interesting guarantees for compressive statistical learning. Based on the relations between the MMD and the Wasserstein distance, we provide guarantees for compressive statistical learning by introducing and studying the concept of Wasserstein learnability of the learning task, that is when some task-specific metric between probability distributions can be bounded by a Wasserstein distanceShow more
Downloadable Archival Material, 2021-12-01
Undefined
Publisher:2021-12-01
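
The kernel norms in question are maximum mean discrepancies; a small numerical sketch computing both quantities on the same pair of samples (my own illustration of the two objects being related, not a reproduction of the paper's bounds; the RBF bandwidth is an arbitrary choice):

# Two ways to compare the same pair of 1-D samples: an unbiased RBF-kernel MMD estimate
# and the Wasserstein-1 distance from SciPy.
import numpy as np
from scipy.stats import wasserstein_distance

def mmd2_unbiased(x, y, bandwidth=1.0):
    k = lambda a, b: np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * bandwidth ** 2))
    Kxx, Kyy, Kxy = k(x, x), k(y, y), k(x, y)
    n, m = len(x), len(y)
    np.fill_diagonal(Kxx, 0.0); np.fill_diagonal(Kyy, 0.0)    # drop diagonals for unbiasedness
    return Kxx.sum() / (n * (n - 1)) + Kyy.sum() / (m * (m - 1)) - 2 * Kxy.mean()

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 1000)
y = rng.normal(0.8, 1.0, 1000)
print("MMD^2 :", mmd2_unbiased(x, y))
print("W_1   :", wasserstein_distance(x, y))      # close to the mean shift 0.8 here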


Tracial smooth functions of non-commuting variables and the free Wasserstein manifold
Authors:Jekel, David (Creator), Li, Wuchen (Creator), Shlyakhtenko, Dimitri (Creator)
Summary:We formulate a free probabilistic analog of the Wasserstein manifold on $\mathbb{R}^d$ (the formal Riemannian manifold of smooth probability densities on $\mathbb{R}^d$), and we use it to study smooth non-commutative transport of measure. The points of the free Wasserstein manifold $\mathscr{W}(\mathbb{R}^{*d})$ are smooth tracial non-commutative functions $V$ with quadratic growth at $\infty$, which correspond to minus the log-density in the classical setting. The space of smooth tracial non-commutative functions used here is a new one whose definition and basic properties we develop in the paper; they are scalar-valued functions of self-adjoint $d$-tuples from arbitrary tracial von Neumann algebras that can be approximated by trace polynomials. The space of non-commutative diffeomorphisms $\mathscr{D}(\mathbb{R}^{*d})$ acts on $\mathscr{W}(\mathbb{R}^{*d})$ by transport, and the basic relationship between tangent vectors for $\mathscr{D}(\mathbb{R}^{*d})$ and tangent vectors for $\mathscr{W}(\mathbb{R}^{*d})$ is described using the Laplacian $L_V$ associated to $V$ and its pseudo-inverse $\Psi_V$ (when defined). Following similar arguments to arXiv:1204.2182, arXiv:1701.00132, and arXiv:1906.10051 in the new setting, we give a rigorous proof for the existence of smooth transport along any path $t \mapsto V_t$ when $V$ is sufficiently close $(1/2) \sum_j \operatorname{tr}(x_j^2)$, as well as smooth triangular transportShow more
Downloadable Archival Material, 2021-01-16
Undefined
Publisher:2021-01-16

 
Learning to Generate Wasserstein Barycenters
Authors:Lacombe, Julien (Creator), Digne, Julie (Creator), Courty, Nicolas (Creator), Bonneel, Nicolas (Creator)
Summary:Optimal transport is a notoriously difficult problem to solve numerically, with current approaches often remaining intractable for very large scale applications such as those encountered in machine learning. Wasserstein barycenters -- the problem of finding measures in-between given input measures in the optimal transport sense -- is even more computationally demanding as it requires to solve an optimization problem involving optimal transport distances. By training a deep convolutional neural network, we improve by a factor of 60 the computational speed of Wasserstein barycenters over the fastest state-of-the-art approach on the GPU, resulting in milliseconds computational times on $512\times512$ regular grids. We show that our network, trained on Wasserstein barycenters of pairs of measures, generalizes well to the problem of finding Wasserstein barycenters of more than two measures. We demonstrate the efficiency of our approach for computing barycenters of sketches and transferring colors between multiple imagesShow more
Downloadable Archival Material, 2021-02-24
Undefined
Publisher:2021-02-24

2021

Minimum Wasserstein Distance Estimator under Finite Location-scale Mixtures
Authors:Zhang, Qiong (Creator), Chen, Jiahua (Creator)
Summary:When a population exhibits heterogeneity, we often model it via a finite mixture: decompose it into several different but homogeneous subpopulations. Contemporary practice favors learning the mixtures by maximizing the likelihood for statistical efficiency and the convenient EM-algorithm for numerical computation. Yet the maximum likelihood estimate (MLE) is not well defined for the most widely used finite normal mixture in particular and for finite location-scale mixture in general. We hence investigate feasible alternatives to MLE such as minimum distance estimators. Recently, the Wasserstein distance has drawn increased attention in the machine learning community. It has intuitive geometric interpretation and is successfully employed in many new applications. Do we gain anything by learning finite location-scale mixtures via a minimum Wasserstein distance estimator (MWDE)? This paper investigates this possibility in several respects. We find that the MWDE is consistent and derive a numerical solution under finite location-scale mixtures. We study its robustness against outliers and mild model mis-specifications. Our moderate scaled simulation study shows the MWDE suffers some efficiency loss against a penalized version of MLE in general without noticeable gain in robustness. We reaffirm the general superiority of the likelihood based learning strategies even for the non-regular finite location-scale mixturesShow more
Downloadable Archival Material, 2021-07-02
Undefined
Publisher:2021-07-02


Wasserstein Proximal of GANs
Authors:Lin, Alex Tong (Creator), Li, Wuchen (Creator), Osher, Stanley (Creator), Montufar, Guido (Creator)
Summary:We introduce a new method for training generative adversarial networks by applying the Wasserstein-2 metric proximal on the generators. The approach is based on Wasserstein information geometry. It defines a parametrization invariant natural gradient by pulling back optimal transport structures from probability space to parameter space. We obtain easy-to-implement iterative regularizers for the parameter updates of implicit deep generative models. Our experiments demonstrate that this method improves the speed and stability of training in terms of wall-clock time and Fr\'echet Inception DistanceShow more
Downloadable Archival Material, 2021-02-13
Undefined
Publisher:2021-02-13


Wasserstein Unsupervised Reinforcement Learning
Authors:He, Shuncheng (Creator), Jiang, Yuhang (Creator), Zhang, Hongchang (Creator), Shao, Jianzhun (Creator), Ji, Xiangyang (Creator)
Summary:Unsupervised reinforcement learning aims to train agents to learn a handful of policies or skills in environments without external reward. These pre-trained policies can accelerate learning when endowed with external reward, and can also be used as primitive options in hierarchical reinforcement learning. Conventional approaches of unsupervised skill discovery feed a latent variable to the agent and shed its empowerment on agent's behavior by mutual information (MI) maximization. However, the policies learned by MI-based methods cannot sufficiently explore the state space, despite they can be successfully identified from each other. Therefore we propose a new framework Wasserstein unsupervised reinforcement learning (WURL) where we directly maximize the distance of state distributions induced by different policies. Additionally, we overcome difficulties in simultaneously training N(N >2) policies, and amortizing the overall reward to each step. Experiments show policies learned by our approach outperform MI-based methods on the metric of Wasserstein distance while keeping high discriminability. Furthermore, the agents trained by WURL can sufficiently explore the state space in mazes and MuJoCo tasks and the pre-trained policies can be applied to downstream tasks by hierarchical learningShow more
Downloadable Archival Material, 2021-10-15
Undefined
Publisher:2021-10-15


A Theory of the Distortion-Perception Tradeoff in Wasserstein Space
Authors:Freirich, Dror (Creator), Michaeli, Tomer (Creator), Meir, Ron (Creator)
Summary:The lower the distortion of an estimator, the more the distribution of its outputs generally deviates from the distribution of the signals it attempts to estimate. This phenomenon, known as the perception-distortion tradeoff, has captured significant attention in image restoration, where it implies that fidelity to ground truth images comes at the expense of perceptual quality (deviation from statistics of natural images). However, despite the increasing popularity of performing comparisons on the perception-distortion plane, there remains an important open question: what is the minimal distortion that can be achieved under a given perception constraint? In this paper, we derive a closed form expression for this distortion-perception (DP) function for the mean squared-error (MSE) distortion and the Wasserstein-2 perception index. We prove that the DP function is always quadratic, regardless of the underlying distribution. This stems from the fact that estimators on the DP curve form a geodesic in Wasserstein space. In the Gaussian setting, we further provide a closed form expression for such estimators. For general distributions, we show how these estimators can be constructed from the estimators at the two extremes of the tradeoff: The global MSE minimizer, and a minimizer of the MSE under a perfect perceptual quality constraint. The latter can be obtained as a stochastic transformation of the formerShow more
Downloadable Archival Material, 2021-07-06
Undefined
Publisher:2021-07-06
Cited by 6

Tighter expected generalization error bounds via Wasserstein distance
Authors:Rodríguez-Gálvez, Borja (Creator), Bassi, Germán (Creator), Thobaben, Ragnar (Creator), Skoglund, Mikael (Creator)
Summary:This work presents several expected generalization error bounds based on the Wasserstein distance. More specifically, it introduces full-dataset, single-letter, and random-subset bounds, and their analogues in the randomized subsample setting from Steinke and Zakynthinou [1]. Moreover, when the loss function is bounded and the geometry of the space is ignored by the choice of the metric in the Wasserstein distance, these bounds recover from below (and thus, are tighter than) current bounds based on the relative entropy. In particular, they generate new, non-vacuous bounds based on the relative entropy. Therefore, these results can be seen as a bridge between works that account for the geometry of the hypothesis space and those based on the relative entropy, which is agnostic to such geometry. Furthermore, it is shown how to produce various new bounds based on different information measures (e.g., the lautum information or several $f$-divergences) based on these bounds and how to derive similar bounds with respect to the backward channel using the presented proof techniquesShow more
Downloadable Archival Material, 2021-01-22
Undefined

Publisher:2021-01-22
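For orientation (standard definition, not quoted from the entry): the order-$p$ Wasserstein distance between distributions $P$ and $Q$ on a metric space $(\mathcal{X},d)$ is
    $W_p(P,Q) = \big(\inf_{\pi\in\Pi(P,Q)} \mathbb{E}_{(X,Y)\sim\pi}[d(X,Y)^p]\big)^{1/p}$,
where $\Pi(P,Q)$ is the set of couplings with marginals $P$ and $Q$. Taking the discrete metric $d(x,y)=\mathbf{1}\{x\neq y\}$ makes $W_1$ coincide with total variation, which is the sense in which the geometry of the space can be 'ignored' so that the entry's bounds become comparable with relative-entropy-based bounds.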

<——2021———2021——2760—



Automatic Text Evaluation through the Lens of Wasserstein Barycenters
Authors:Colombo, Pierre (Creator), Staerman, Guillaume (Creator), Clavel, Chloe (Creator), Piantanida, Pablo (Creator)
Summary:A new metric \texttt{BaryScore} to evaluate text generation based on deep contextualized embeddings (e.g., BERT, Roberta, ELMo) is introduced. This metric is motivated by a new framework relying on optimal transport tools, i.e., Wasserstein distance and barycenter. By modelling the layer output of deep contextualized embeddings as a probability distribution rather than by a vector embedding, this framework provides a natural way to aggregate the different outputs through the Wasserstein space topology. In addition, it provides theoretical grounds to our metric and offers an alternative to available solutions (e.g., MoverScore and BertScore). Numerical evaluation is performed on four different tasks: machine translation, summarization, data2text generation and image captioning. Our results show that \texttt{BaryScore} outperforms other BERT based metrics and exhibits more consistent behaviour in particular for text summarizationShow more
Downloadable Archival Material, 2021-08-27
Undefined
Publisher:2021-08-27
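Background (standard definition, notation mine): given distributions $\mu_1,\dots,\mu_N$ and weights $\lambda_i\ge 0$ with $\sum_i\lambda_i=1$, a Wasserstein barycenter is any minimizer
    $\bar{\mu} \in \arg\min_{\nu} \sum_{i=1}^{N} \lambda_i\, W_2^2(\nu,\mu_i)$.
As summarized above, BaryScore treats each encoder layer's output as a probability distribution and aggregates the layers through such barycenters before scoring candidate against reference text.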

Variance Minimization in the Wasserstein Space for Invariant Causal Prediction
Authors:Martinet, Guillaume (Creator), Strzalkowski, Alexander (Creator), Engelhardt, Barbara E. (Creator)
Summary:Selecting powerful predictors for an outcome is a cornerstone task for machine learning. However, some types of questions can only be answered by identifying the predictors that causally affect the outcome. A recent approach to this causal inference problem leverages the invariance property of a causal mechanism across differing experimental environments (Peters et al., 2016; Heinze-Deml et al., 2018). This method, invariant causal prediction (ICP), has a substantial computational defect -- the runtime scales exponentially with the number of possible causal variables. In this work, we show that the approach taken in ICP may be reformulated as a series of nonparametric tests that scales linearly in the number of predictors. Each of these tests relies on the minimization of a novel loss function -- the Wasserstein variance -- that is derived from tools in optimal transport theory and is used to quantify distributional variability across environments. We prove under mild assumptions that our method is able to recover the set of identifiable direct causes, and we demonstrate in our experiments that it is competitive with other benchmark causal discovery algorithmsShow more
 
Downloadable Archival Material, 2021-10-13
Undefined
Publisher:2021-10-13
Cited by 1
Related articles


Learning Domain Invariant Representations by Joint Wasserstein Distance Minimization
Authors:Andéol, Léo (Creator), Kawakami, Yusei (Creator), Wada, Yuichiro (Creator), Kanamori, Takafumi (Creator), Müller, Klaus-Robert (Creator), Montavon, Grégoire (Creator)Show more
Summary:Domain shifts in the training data are common in practical applications of machine learning, they occur for instance when the data is coming from different sources. Ideally, a ML model should work well independently of these shifts, for example, by learning a domain-invariant representation. Moreover, privacy concerns regarding the source also require a domain-invariant representation. In this work, we provide theoretical results that link domain invariant representations -- measured by the Wasserstein distance on the joint distributions -- to a practical semi-supervised learning objective based on a cross-entropy classifier and a novel domain critic. Quantitative experiments demonstrate that the proposed approach is indeed able to practically learn such an invariant representation (between two domains), and the latter also supports models with higher predictive accuracy on both domains, comparing favorably to existing techniquesShow more
Downloadable Archival Material, 2021-06-09
Undefined
Publisher:2021-06-09



Human Motion Prediction Using Manifold-Aware Wasserstein GAN
Authors:Chopin, Baptiste (Creator), Otberdout, Naima (Creator), Daoudi, Mohamed (Creator), Bartolo, Angela (Creator)
Summary:Human motion prediction aims to forecast future human poses given a prior pose sequence. The discontinuity of the predicted motion and the performance deterioration in long-term horizons are still the main challenges encountered in current literature. In this work, we tackle these issues by using a compact manifold-valued representation of human motion. Specifically, we model the temporal evolution of the 3D human poses as trajectory, what allows us to map human motions to single points on a sphere manifold. To learn these non-Euclidean representations, we build a manifold-aware Wasserstein generative adversarial model that captures the temporal and spatial dependencies of human motion through different losses. Extensive experiments show that our approach outperforms the state-of-the-art on CMU MoCap and Human 3.6M datasets. Our qualitative results show the smoothness of the predicted motionsShow more
Downloadable Archival Material, 2021-05-18
Undefined
Publisher:2021-05-18

Balanced Coarsening for Multilevel Hypergraph Partitioning via Wasserstein Discrepancy
Authors:Guo, Zhicheng (Creator), Zhao, Jiaxuan (Creator), Jiao, Licheng (Creator), Liu, Xu (Creator)
Summary:We propose a balanced coarsening scheme for multilevel hypergraph partitioning. In addition, an initial partitioning algorithm is designed to improve the quality of k-way hypergraph partitioning. By assigning vertex weights through the LPT algorithm, we generate a prior hypergraph under a relaxed balance constraint. With the prior hypergraph, we have defined the Wasserstein discrepancy to coordinate the optimal transport of coarsening process. And the optimal transport matrix is solved by Sinkhorn algorithm. Our coarsening scheme fully takes into account the minimization of connectivity metric (objective function). For the initial partitioning stage, we define a normalized cut function induced by Fiedler vector, which is theoretically proved to be a concave function. Thereby, a three-point algorithm is designed to find the best cut under the balance constraintShow more
Downloadable Archival Material, 2021-06-14
Undefined
Publisher:2021-06-14

2021


ERA: Entity Relationship Aware Video Summarization with Wasserstein GAN
Authors:Wu, Guande (Creator), Lin, Jianzhe (Creator), Silva, Claudio T. (Creator)
Summary:Video summarization aims to simplify large scale video browsing by generating concise, short summaries that diver from but well represent the original video. Due to the scarcity of video annotations, recent progress for video summarization concentrates on unsupervised methods, among which the GAN based methods are most prevalent. This type of methods includes a summarizer and a discriminator. The summarized video from the summarizer will be assumed as the final output, only if the video reconstructed from this summary cannot be discriminated from the original one by the discriminator. The primary problems of this GAN based methods are two folds. First, the summarized video in this way is a subset of original video with low redundancy and contains high priority events/entities. This summarization criterion is not enough. Second, the training of the GAN framework is not stable. This paper proposes a novel Entity relationship Aware video summarization method (ERA) to address the above problems. To be more specific, we introduce an Adversarial Spatio Temporal network to construct the relationship among entities, which we think should also be given high priority in the summarization. The GAN training problem is solved by introducing the Wasserstein GAN and two newly proposed video patch/score sum losses. In addition, the score sum loss can also relieve the model sensitivity to the varying video lengths, which is an inherent problem for most current video analysis tasks. Our method substantially lifts the performance on the target benchmark datasets and exceeds the current leaderboard Rank 1 state of the art CSNet (2.1% F1 score increase on TVSum and 3.1% F1 score increase on SumMe). We hope our straightforward yet effective approach will shed some light on the future research of unsupervised video summarizationShow mor
Downloadable Archival Material, 2021-09-06
Undefined
Publisher:2021-09-06

Dynamical Wasserstein Barycenters for Time-series Modeling
Authors:Cheng, Kevin C. (Creator), Aeron, Shuchin (Creator), Hughes, Michael C. (Creator), Miller, Eric L. (Creator)
Summary:Many time series can be modeled as a sequence of segments representing high-level discrete states, such as running and walking in a human activity application. Flexible models should describe the system state and observations in stationary "pure-state" periods as well as transition periods between adjacent segments, such as a gradual slowdown between running and walking. However, most prior work assumes instantaneous transitions between pure discrete states. We propose a dynamical Wasserstein barycentric (DWB) model that estimates the system state over time as well as the data-generating distributions of pure states in an unsupervised manner. Our model assumes each pure state generates data from a multivariate normal distribution, and characterizes transitions between states via displacement-interpolation specified by the Wasserstein barycenter. The system state is represented by a barycentric weight vector which evolves over time via a random walk on the simplex. Parameter learning leverages the natural Riemannian geometry of Gaussian distributions under the Wasserstein distance, which leads to improved convergence speeds. Experiments on several human activity datasets show that our proposed DWB model accurately learns the generating distribution of pure states while improving state estimation for transition periods compared to the commonly used linear interpolation mixture modelsShow more
Downloadable Archival Material, 2021-10-13
Undefined
Publisher:2021-10-13

Fast Topological Clustering with Wasserstein Distance
Authors:Songdechakraiwut, Tananun (Creator), Krause, Bryan M. (Creator), Banks, Matthew I. (Creator), Nourski, Kirill V. (Creator), Van Veen, Barry D. (Creator)Show more
Summary:The topological patterns exhibited by many real-world networks motivate the development of topology-based methods for assessing the similarity of networks. However, extracting topological structure is difficult, especially for large and dense networks whose node degrees range over multiple orders of magnitude. In this paper, we propose a novel and computationally practical topological clustering method that clusters complex networks with intricate topology using principled theory from persistent homology and optimal transport. Such networks are aggregated into clusters through a centroid-based clustering strategy based on both their topological and geometric structure, preserving correspondence between nodes in different networks. The notions of topological proximity and centroid are characterized using a novel and efficient approach to computation of the Wasserstein distance and barycenter for persistence barcodes associated with connected components and cycles. The proposed method is demonstrated to be effective using both simulated networks and measured functional brain networksShow more
Downloadable Archival Material, 2021-11-30
Undefined
Publisher:2021-11-30

Wasserstein Robust Classification with Fairness Constraints
Authors:Wang, Yijie (Creator), Nguyen, Viet Anh (Creator), Hanasusanto, Grani A. (Creator)
Summary:We propose a distributionally robust classification model with a fairness constraint that encourages the classifier to be fair in view of the equality of opportunity criterion. We use a type-$\infty$ Wasserstein ambiguity set centered at the empirical distribution to model distributional uncertainty and derive a conservative reformulation for the worst-case equal opportunity unfairness measure. We establish that the model is equivalent to a mixed binary optimization problem, which can be solved by standard off-the-shelf solvers. To improve scalability, we further propose a convex, hinge-loss-based model for large problem instances whose reformulation does not incur any binary variables. Moreover, we also consider the distributionally robust learning problem with a generic ground transportation cost to hedge against the uncertainties in the label and sensitive attribute. Finally, we numerically demonstrate that our proposed approaches improve fairness with negligible loss of predictive accuracyShow more
Downloadable Archival Material, 2021-03-11
Undefined
Publisher:2021-03-11

Statistical Analysis of Wasserstein Distributionally Robust Estimators
Authors:Blanchet, Jose (Creator), Murthy, Karthyek (Creator), Nguyen, Viet Anh (Creator)
Summary:We consider statistical methods which invoke a min-max distributionally robust formulation to extract good out-of-sample performance in data-driven optimization and learning problems. Acknowledging the distributional uncertainty in learning from limited samples, the min-max formulations introduce an adversarial inner player to explore unseen covariate data. The resulting Distributionally Robust Optimization (DRO) formulations, which include Wasserstein DRO formulations (our main focus), are specified using optimal transportation phenomena. Upon describing how these infinite-dimensional min-max problems can be approached via a finite-dimensional dual reformulation, the tutorial moves into its main component, namely, explaining a generic recipe for optimally selecting the size of the adversary's budget. This is achieved by studying the limit behavior of an optimal transport projection formulation arising from an inquiry on the smallest confidence region that includes the unknown population risk minimizer. Incidentally, this systematic prescription coincides with those in specific examples in high-dimensional statistics and results in error bounds that are free from the curse of dimensions. Equipped with this prescription, we present a central limit theorem for the DRO estimator and provide a recipe for constructing compatible confidence regions that are useful for uncertainty quantification. The rest of the tutorial is devoted to insights into the nature of the optimizers selected by the min-max formulations and additional applications of optimal transport projectionsShow more
Downloadable Archival Material, 2021-08-04
Undefined
Publisher:2021-08-04
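The min-max formulations surveyed in this tutorial have the generic shape (standard notation, not quoted from the paper)
    $\min_{\theta}\ \sup_{Q:\ W_c(Q,\hat{P}_n)\le\delta}\ \mathbb{E}_{Q}[\ell(\theta;X)]$,
where $\hat{P}_n$ is the empirical distribution of the data, $W_c$ an optimal-transport cost, and $\delta$ the adversary's budget whose optimal scaling is the tutorial's central topic.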

<——2021———2021——2770—


Tangent Space and Dimension Estimation with the Wasserstein Distance
Authors:Lim, Uzu (Creator), Oberhauser, Harald (Creator), Nanda, Vidit (Creator)
Summary:We provide explicit bounds on the number of sample points required to estimate tangent spaces and intrinsic dimensions of (smooth, compact) Euclidean submanifolds via local principal component analysis. Our approach directly estimates covariance matrices locally, which simultaneously allows estimating both the tangent spaces and the intrinsic dimension of a manifold. The key arguments involve a matrix concentration inequality, a Wasserstein bound for flattening a manifold, and a Lipschitz relation for the covariance matrix with respect to the Wasserstein distanceShow more
Downloadable Archival Material, 2021-10-12
Undefined
Publisher:2021-10-12



Rethinking Rotated Object Detection with Gaussian Wasserstein Distance Loss
Authors:Yang, Xue (Creator), Yan, Junchi (Creator), Ming, Qi (Creator), Wang, Wentao (Creator), Zhang, Xiaopeng (Creator), Tian, Qi (Creator)
Summary:Boundary discontinuity and its inconsistency to the final detection metric have been the bottleneck for rotating detection regression loss design. In this paper, we propose a novel regression loss based on Gaussian Wasserstein distance as a fundamental approach to solve the problem. Specifically, the rotated bounding box is converted to a 2-D Gaussian distribution, which enables to approximate the indifferentiable rotational IoU induced loss by the Gaussian Wasserstein distance (GWD) which can be learned efficiently by gradient back-propagation. GWD can still be informative for learning even there is no overlapping between two rotating bounding boxes which is often the case for small object detection. Thanks to its three unique properties, GWD can also elegantly solve the boundary discontinuity and square-like problem regardless how the bounding box is defined. Experiments on five datasets using different detectors show the effectiveness of our approach. Codes are available at https://github.com/yangxue0827/RotationDetection and https://github.com/open-mmlab/mmrotateShow more
Downloadable Archival Material, 2021-01-28
Undefined
Publisher:2021-01-28
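Background for the entry above: the appeal of the Gaussian reformulation is that the 2-Wasserstein distance between Gaussians has a closed form (a standard result, not specific to this paper),
    $W_2^2(\mathcal{N}(m_1,\Sigma_1),\mathcal{N}(m_2,\Sigma_2)) = \|m_1-m_2\|_2^2 + \mathrm{Tr}\big(\Sigma_1+\Sigma_2-2(\Sigma_2^{1/2}\Sigma_1\Sigma_2^{1/2})^{1/2}\big)$,
which stays well defined and differentiable even when two rotated boxes, mapped to $(m,\Sigma)$, do not overlap, unlike an IoU-based loss.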

2021 see 2022
Finite sample approximations of exact and entropic Wasserstein distances between covariance operators and Gaussian processes
Author:Quang, Minh Ha (Creator)
Summary:This work studies finite sample approximations of the exact and entropic regularized Wasserstein distances between centered Gaussian processes and, more generally, covariance operators of functional random processes. We first show that these distances/divergences are fully represented by reproducing kernel Hilbert space (RKHS) covariance and cross-covariance operators associated with the corresponding covariance functions. Using this representation, we show that the Sinkhorn divergence between two centered Gaussian processes can be consistently and efficiently estimated from the divergence between their corresponding normalized finite-dimensional covariance matrices, or alternatively, their sample covariance operators. Consequently, this leads to a consistent and efficient algorithm for estimating the Sinkhorn divergence from finite samples generated by the two processes. For a fixed regularization parameter, the convergence rates are {\it dimension-independent} and of the same order as those for the Hilbert-Schmidt distance. If at least one of the RKHS is finite-dimensional, we obtain a {\it dimension-dependent} sample complexity for the exact Wasserstein distance between the Gaussian processesShow more
Downloadable Archival Material, 2021-04-26
Undefined
Publisher:2021-04-26
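Background on the divergences estimated in the entry above (one standard set of definitions, notation mine): the entropic optimal transport cost and the Sinkhorn divergence are
    $\mathrm{OT}_\varepsilon(\mu,\nu) = \min_{\pi\in\Pi(\mu,\nu)} \int c\, d\pi + \varepsilon\,\mathrm{KL}(\pi\,\|\,\mu\otimes\nu)$ and $S_\varepsilon(\mu,\nu) = \mathrm{OT}_\varepsilon(\mu,\nu) - \tfrac{1}{2}\big(\mathrm{OT}_\varepsilon(\mu,\mu)+\mathrm{OT}_\varepsilon(\nu,\nu)\big)$.
The paper's point is that for centered Gaussian processes both quantities can be estimated consistently from finite-dimensional covariance matrices built from samples.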

Linear-Time Gromov Wasserstein Distances using Low Rank Couplings and Costs
Authors:Scetbon, Meyer (Creator), Peyré, Gabriel (Creator), Cuturi, Marco (Creator)
Summary:The ability to compare and align related datasets living in heterogeneous spaces plays an increasingly important role in machine learning. The Gromov-Wasserstein (GW) formalism can help tackle this problem. Its main goal is to seek an assignment (more generally a coupling matrix) that can register points across otherwise incomparable datasets. As a non-convex and quadratic generalization of optimal transport (OT), GW is NP-hard. Yet, heuristics are known to work reasonably well in practice, the state of the art approach being to solve a sequence of nested regularized OT problems. While popular, that heuristic remains too costly to scale, with cubic complexity in the number of samples $n$. We show in this paper how a recent variant of the Sinkhorn algorithm can substantially speed up the resolution of GW. That variant restricts the set of admissible couplings to those admitting a low rank factorization as the product of two sub-couplings. By updating alternatively each sub-coupling, our algorithm computes a stationary point of the problem in quadratic time with respect to the number of samples. When cost matrices have themselves low rank, our algorithm has time complexity $\mathcal{O}(n)$. We demonstrate the efficiency of our method on simulated and real dataShow more
Downloadable Archival Material, 2021-06-02
Undefined
Publisher:2021-06-02

Wasserstein distance estimates for the distributions of numerical approximations to ergodic stochastic differential equations
Authors:Sanz-Serna, J. M. (Creator), Zygalakis, Konstantinos C. (Creator)
Summary:We present a framework that allows for the non-asymptotic study of the $2$-Wasserstein distance between the invariant distribution of an ergodic stochastic differential equation and the distribution of its numerical approximation in the strongly log-concave case. This allows us to study in a unified way a number of different integrators proposed in the literature for the overdamped and underdamped Langevin dynamics. In addition, we analyse a novel splitting method for the underdamped Langevin dynamics which only requires one gradient evaluation per time step. Under an additional smoothness assumption on a $d$--dimensional strongly log-concave distribution with condition number $\kappa$, the algorithm is shown to produce with an $\mathcal{O}\big(\kappa^{5/4} d^{1/4}\epsilon^{-1/2} \big)$ complexity samples from a distribution that, in Wasserstein distance, is at most $\epsilon>0$ away from the target distributionShow more
Downloadable Archival Material, 2021-04-26
Undefined
Publisher:2021-04-26
Zbl 07626757


2021


Distributionally Robust Prescriptive Analytics with Wasserstein Distance
Authors:Wang, Tianyu (Creator), Chen, Ningyuan (Creator), Wang, Chun (Creator)
Summary:In prescriptive analytics, the decision-maker observes historical samples of $(X, Y)$, where $Y$ is the uncertain problem parameter and $X$ is the concurrent covariate, without knowing the joint distribution. Given an additional covariate observation $x$, the goal is to choose a decision $z$ conditional on this observation to minimize the cost $\mathbb{E}[c(z,Y)|X=x]$. This paper proposes a new distributionally robust approach under Wasserstein ambiguity sets, in which the nominal distribution of $Y|X=x$ is constructed based on the Nadaraya-Watson kernel estimator concerning the historical data. We show that the nominal distribution converges to the actual conditional distribution under the Wasserstein distance. We establish the out-of-sample guarantees and the computational tractability of the framework. Through synthetic and empirical experiments about the newsvendor problem and portfolio optimization, we demonstrate the strong performance and practical value of the proposed frameworkShow more
Downloadable Archival Material, 2021-06-10
Undefined
Publisher:2021-06-10


Internal Wasserstein Distance for Adversarial Attack and Defense
Authors:Tan, Mingkui (Creator), Zhang, Shuhai (Creator), Cao, Jiezhang (Creator), Li, Jincheng (Creator), Xu, Yanwu (Creator)
Summary:Deep neural networks (DNNs) are known to be vulnerable to adversarial attacks that would trigger misclassification of DNNs but may be imperceptible to human perception. Adversarial defense has been important ways to improve the robustness of DNNs. Existing attack methods often construct adversarial examples relying on some metrics like the $\ell_p$ distance to perturb samples. However, these metrics can be insufficient to conduct adversarial attacks due to their limited perturbations. In this paper, we propose a new internal Wasserstein distance (IWD) to capture the semantic similarity of two samples, and thus it helps to obtain larger perturbations than currently used metrics such as the $\ell_p$ distance We then apply the internal Wasserstein distance to perform adversarial attack and defense. In particular, we develop a novel attack method relying on IWD to calculate the similarities between an image and its adversarial examples. In this way, we can generate diverse and semantically similar adversarial examples that are more difficult to defend by existing defense methods. Moreover, we devise a new defense method relying on IWD to learn robust models against unseen adversarial examples. We provide both thorough theoretical and empirical evidence to support our methodsShow more
Downloadable Archival Material, 2021-03-12
Undefined
Publisher:2021-03-12

Computationally Efficient Wasserstein Loss for Structured Labels
Authors:Toyokuni, Ayato (Creator), Yokoi, Sho (Creator), Kashima, Hisashi (Creator), Yamada, Makoto (Creator)
Summary:The problem of estimating the probability distribution of labels has been widely studied as a label distribution learning (LDL) problem, whose applications include age estimation, emotion analysis, and semantic segmentation. We propose a tree-Wasserstein distance regularized LDL algorithm, focusing on hierarchical text classification tasks. We propose predicting the entire label hierarchy using neural networks, where the similarity between predicted and true labels is measured using the tree-Wasserstein distance. Through experiments using synthetic and real-world datasets, we demonstrate that the proposed method successfully considers the structure of labels during training, and it compares favorably with the Sinkhorn algorithm in terms of computation time and memory usageShow more
Downloadable Archival Material, 2021-03-01
Undefined
Publisher:2021-03-01
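The computational advantage mentioned in the summary comes from the closed form of the Wasserstein-1 distance under a tree metric (a standard result; the notation here is mine, not the paper's):
    $TW(\mu,\nu) = \sum_{e\in E} w_e\,|\mu(\Gamma_e)-\nu(\Gamma_e)|$,
where $E$ is the edge set of the label tree, $w_e$ the edge lengths, and $\Gamma_e$ the set of labels in the subtree below edge $e$. Evaluating it is linear in the number of edges, which is why it compares favorably in time and memory with an iterative Sinkhorn solver.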

Towards Better Data Augmentation using Wasserstein Distance in Variational Auto-encoder
Authors:Chen, Zichuan (Creator), Liu, Peng (Creator)
Summary:VAE, or variational auto-encoder, compresses data into latent attributes, and generates new data of different varieties. VAE based on KL divergence has been considered as an effective technique for data augmentation. In this paper, we propose the use of Wasserstein distance as a measure of distributional similarity for the latent attributes, and show its superior theoretical lower bound (ELBO) compared with that of KL divergence under mild conditions. Using multiple experiments, we demonstrate that the new loss function exhibits better convergence property and generates artificial images that could better aid the image classification tasksShow more
Downloadable Archival Material, 2021-09-29
Undefined
Publisher:2021-09-29


Exact Statistical Inference for the Wasserstein Distance by Selective Inference
Authors:Duy, Vo Nguyen Le (Creator), Takeuchi, Ichiro (Creator)
Summary:In this paper, we study statistical inference for the Wasserstein distance, which has attracted much attention and has been applied to various machine learning tasks. Several studies have been proposed in the literature, but almost all of them are based on asymptotic approximation and do not have finite-sample validity. In this study, we propose an exact (non-asymptotic) inference method for the Wasserstein distance inspired by the concept of conditional Selective Inference (SI). To our knowledge, this is the first method that can provide a valid confidence interval (CI) for the Wasserstein distance with finite-sample coverage guarantee, which can be applied not only to one-dimensional problems but also to multi-dimensional problems. We evaluate the performance of the proposed method on both synthetic and real-world datasetsShow more
Downloadable Archival Material, 2021-09-29
Undefined
Publisher:2021-09-29
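Not the paper's selective-inference procedure, only a minimal sketch of the quantity its confidence intervals are built around: the empirical one-dimensional Wasserstein distance between two samples. The sketch below uses standard NumPy/SciPy calls on made-up data.

    import numpy as np
    from scipy.stats import wasserstein_distance

    rng = np.random.default_rng(0)
    x = rng.normal(0.0, 1.0, size=500)  # sample from the first distribution
    y = rng.normal(0.5, 1.0, size=500)  # sample from the second distribution

    # With equal sample sizes, the empirical W1 is the mean absolute difference
    # of the order statistics: the optimal coupling matches sorted points.
    w1_sorted = np.mean(np.abs(np.sort(x) - np.sort(y)))

    # SciPy computes the same quantity directly from the two samples.
    w1_scipy = wasserstein_distance(x, y)

    print(w1_sorted, w1_scipy)  # the two values agree up to floating-point error

As the summary notes, intervals built from asymptotic approximations of this statistic lack finite-sample validity, which is the gap the conditional selective-inference approach is meant to close.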


Learning High Dimensional Wasserstein Geodesics
Authors:Liu, Shu (Creator), Ma, Shaojun (Creator), Chen, Yongxin (Creator), Zha, Hongyuan (Creator), Zhou, Haomin (Creator)
Summary:We propose a new formulation and learning strategy for computing the Wasserstein geodesic between two probability distributions in high dimensions. By applying the method of Lagrange multipliers to the dynamic formulation of the optimal transport (OT) problem, we derive a minimax problem whose saddle point is the Wasserstein geodesic. We then parametrize the functions by deep neural networks and design a sample based bidirectional learning algorithm for training. The trained networks enable sampling from the Wasserstein geodesic. As by-products, the algorithm also computes the Wasserstein distance and OT map between the marginal distributions. We demonstrate the performance of our algorithms through a series of experiments with both synthetic and realistic dataShow more
Downloadable Archival Material, 2021-02-04
Undefined
Publisher:2021-02-04
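The dynamic formulation referred to in the summary is the Benamou-Brenier problem (standard background, not copied from the paper):
    $W_2^2(\mu_0,\mu_1) = \min_{(\rho,v)} \int_0^1\!\!\int \rho_t(x)\,\|v_t(x)\|^2\,dx\,dt$ subject to $\partial_t\rho_t + \nabla\cdot(\rho_t v_t) = 0$, $\rho_0=\mu_0$, $\rho_1=\mu_1$.
The minimizing curve $t\mapsto\rho_t$ is the Wasserstein geodesic that the paper's networks are trained to parametrize and sample from.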

<——2021———2021——2780—



A Normalized Gaussian Wasserstein Distance for Tiny Object Detection
Authors:Wang, Jinwang (Creator), Xu, Chang (Creator), Yang, Wen (Creator), Yu, Lei (Creator)
Summary:Detecting tiny objects is a very challenging problem since a tiny object only contains a few pixels in size. We demonstrate that state-of-the-art detectors do not produce satisfactory results on tiny objects due to the lack of appearance information. Our key observation is that Intersection over Union (IoU) based metrics such as IoU itself and its extensions are very sensitive to the location deviation of the tiny objects, and drastically deteriorate the detection performance when used in anchor-based detectors. To alleviate this, we propose a new evaluation metric using Wasserstein distance for tiny object detection. Specifically, we first model the bounding boxes as 2D Gaussian distributions and then propose a new metric dubbed Normalized Wasserstein Distance (NWD) to compute the similarity between them by their corresponding Gaussian distributions. The proposed NWD metric can be easily embedded into the assignment, non-maximum suppression, and loss function of any anchor-based detector to replace the commonly used IoU metric. We evaluate our metric on a new dataset for tiny object detection (AI-TOD) in which the average object size is much smaller than existing object detection datasets. Extensive experiments show that, when equipped with NWD metric, our approach yields performance that is 6.7 AP points higher than a standard fine-tuning baseline, and 6.0 AP points higher than state-of-the-art competitors. Codes are available at: https://github.com/jwwangchn/NWDShow more
Downloadable Archival Material, 2021-10-25
Undefined
Publisher:2021-10-25
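As the summary indicates, the metric is built from the Gaussian 2-Wasserstein distance between the boxes' Gaussian embeddings; to the best of my reading, the normalization is an exponential map of roughly the form
    $\mathrm{NWD}(\mathcal{N}_a,\mathcal{N}_b) = \exp\!\big(-\sqrt{W_2^2(\mathcal{N}_a,\mathcal{N}_b)}\,/\,C\big)$,
with $C$ a constant tied to the dataset's typical object size; treat this as a hedged paraphrase rather than a quotation, and see the paper for the definition actually used.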


Variational Wasserstein gradient flow
Authors:Fan, Jiaojiao (Creator), Zhang, Qinsheng (Creator), Taghvaei, Amirhossein (Creator), Chen, Yongxin (Creator)
Summary:Wasserstein gradient flow has emerged as a promising approach to solve optimization problems over the space of probability distributions. A recent trend is to use the well-known JKO scheme in combination with input convex neural networks to numerically implement the proximal step. The most challenging step, in this setup, is to evaluate functions involving density explicitly, such as entropy, in terms of samples. This paper builds on the recent works with a slight but crucial difference: we propose to utilize a variational formulation of the objective function formulated as maximization over a parametric class of functions. Theoretically, the proposed variational formulation allows the construction of gradient flows directly for empirical distributions with a well-defined and meaningful objective function. Computationally, this approach replaces the computationally expensive step in existing methods, to handle objective functions involving density, with inner loop updates that only require a small batch of samples and scale well with the dimension. The performance and scalability of the proposed method are illustrated with the aid of several numerical experiments involving high-dimensional synthetic and real datasetsShow more
Downloadable Archival Material, 2021-12-04
Undefined
Publisher:2021-12-04



Iron-making Process Monitoring based on Wasserstein Dynamic Stationary Subspace Analysis
Authors:Junjie Nan; Yating Lyu; Qingqing Liu; Haojie Bai; Hanwen Zhang; Jianxun Zhang
Summary:The detection of early abnormalities in blast furnaces is important for the safety of iron-making processes, the reduction of energy consumption, and the improvement of economic benefits. Due to the fluctuation of raw material composition, production schedule adjustment, equipment degradation, etc., the iron-making process data often present non-stationary characteristics, resulting in the time-varying statistical characteristics of the data even under normal conditions. To solve this problem, a dynamic stationary subspace analysis method based on Wasserstein distance to detect anomalies of the iron-making process is proposed. Firstly, the switching time of hot blast stoves is identified to reduce the false alarms caused by the hot blast stove switching; Secondly, the differences in data distribution of different time intervals are measured by the Wasserstein distance, and the stationary features are obtained by minimizing the Wasserstein distance between intervals; Finally, the abnormal furnace condition monitoring index based on Mahalanobis distance is constructed by using the stationary features. The simulation results based on an actual iron-making process dataset show that the method proposed in this paper can effectively monitor the abnormal furnace conditions
Chapter, 2022
Publication:2022 37th Youth Academic Annual Conference of Chinese Association of Automation (YAC), 20221119, 640
Publisher:2022

Data-driven distributionally robust MPC using the Wasserstein metric
Authors:Zhong, Zhengang (Creator), del Rio-Chanona, Ehecatl Antonio (Creator), Petsagkourakis, Panagiotis (Creator)
Summary:A data-driven MPC scheme is proposed to safely control constrained stochastic linear systems using distributionally robust optimization. Distributionally robust constraints based on the Wasserstein metric are imposed to bound the state constraint violations in the presence of process disturbance. A feedback control law is solved to guarantee that the predicted states comply with constraints. The stochastic constraints are satisfied with regard to the worst-case distribution within the Wasserstein ball centered at their discrete empirical probability distribution. The resulting distributionally robust MPC framework is computationally tractable and efficient, as well as recursively feasible. The innovation of this approach is that all the information about the uncertainty can be determined empirically from the data. The effectiveness of the proposed scheme is demonstrated through numerical case studiesShow more
Downloadable Archival Material, 2021-05-18
Undefined
Publisher:2021-05-18

A continuation multiple shooting method for Wasserstein geodesic equation
Authors:Cui, Jianbo (Creator), Dieci, Luca (Creator), Zhou, Haomin (Creator)
Summary:In this paper, we propose a numerical method to solve the classic $L^2$-optimal transport problem. Our algorithm is based on use of multiple shooting, in combination with a continuation procedure, to solve the boundary value problem associated to the transport problem. We exploit the viewpoint of Wasserstein Hamiltonian flow with initial and target densities, and our method is designed to retain the underlying Hamiltonian structure. Several numerical examples are presented to illustrate the performance of the methodShow more
Downloadable Archival Material, 2021-05-20
Undefined
Publisher:2021-05-20

2021


Wasserstein Adversarially Regularized Graph Autoencoder
Authors:Liang, Huidong (Creator), Gao, Junbin (Creator)
Summary:This paper introduces Wasserstein Adversarially Regularized Graph Autoencoder (WARGA), an implicit generative algorithm that directly regularizes the latent distribution of node embedding to a target distribution via the Wasserstein metric. The proposed method has been validated in tasks of link prediction and node clustering on real-world graphs, in which WARGA generally outperforms state-of-the-art models based on Kullback-Leibler (KL) divergence and typical adversarial frameworkShow more
Downloadable Archival Material, 2021-11-09
Undefined
Publisher:2021-11-09


On the use of Wasserstein metric in topological clustering of distributional data
Authors:Cabanes, Guénaël (Creator), Bennani, Younès (Creator), Verde, Rosanna (Creator), Irpino, Antonio (Creator)
Summary:This paper deals with a clustering algorithm for histogram data based on a Self-Organizing Map (SOM) learning. It combines a dimension reduction by SOM and the clustering of the data in a reduced space. Related to the kind of data, a suitable dissimilarity measure between distributions is introduced: the $L_2$ Wasserstein distance. Moreover, the number of clusters is not fixed in advance but it is automatically found according to a local data density estimation in the original space. Applications on synthetic and real data sets corroborate the proposed strategyShow more
Downloadable Archival Material, 2021-09-09
Undefined
Publisher:2021-09-09
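For one-dimensional distributions such as histograms, the $L_2$ Wasserstein distance used here reduces to a distance between quantile functions (standard fact, not from the paper):
    $W_2^2(\mu,\nu) = \int_0^1 \big(F_\mu^{-1}(t) - F_\nu^{-1}(t)\big)^2\,dt$,
where $F_\mu^{-1}$ and $F_\nu^{-1}$ are the quantile functions of the two distributions; this closed form is what keeps the dissimilarity cheap enough to drive the SOM-based clustering of histogram data.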

STOCHASTIC GRADIENT METHODS FOR L²-WASSERSTEIN LEAST SQUARES PROBLEM OF GAUSSIAN MEASURES
Authors:SANGWOON YUN; XIANG SUN; JUNG-IL CHOI
Summary:This paper proposes stochastic methods to find an approximate solution for the L²-Wasserstein least squares problem of Gaussian measures. The variable for the problem is in a set of positive definite matrices. The first proposed stochastic method is a type of classical stochastic gradient methods combined with projection and the second one is a type of variance reduced methods with projection. Their global convergence are analyzed by using the framework of proximal stochastic gradient methods. The convergence of the classical stochastic gradient method combined with projection is established by using diminishing learning rate rule in which the learning rate decreases as the epoch increases but that of the variance reduced method with projection can be established by using constant learning rate. The numerical results show that the present algorithms with a proper learning rate outperforms a gradient projection methodShow more
Downloadable Article, 2021
Publication:Journal of the Korean Society for Industrial and Applied Mathematics, 25, 2021, 162
Publisher:2021



Disentangled Recurrent Wasserstein Autoencoder
Authors:Han, Jun (Creator), Min, Martin Renqiang (Creator), Han, Ligong (Creator), Li, Li Erran (Creator), Zhang, Xuan (Creator)
Summary:Learning disentangled representations leads to interpretable models and facilitates data generation with style transfer, which has been extensively studied on static data such as images in an unsupervised learning framework. However, only a few works have explored unsupervised disentangled sequential representation learning due to challenges of generating sequential data. In this paper, we propose recurrent Wasserstein Autoencoder (R-WAE), a new framework for generative modeling of sequential data. R-WAE disentangles the representation of an input sequence into static and dynamic factors (i.e., time-invariant and time-varying parts). Our theoretical analysis shows that, R-WAE minimizes an upper bound of a penalized form of the Wasserstein distance between model distribution and sequential data distribution, and simultaneously maximizes the mutual information between input data and different disentangled latent factors, respectively. This is superior to (recurrent) VAE which does not explicitly enforce mutual information maximization between input data and disentangled latent representations. When the number of actions in sequential data is available as weak supervision information, R-WAE is extended to learn a categorical latent representation of actions to improve its disentanglement. Experiments on a variety of datasets show that our models outperform other baselines with the same settings in terms of disentanglement and unconditional video generation both quantitatively and qualitativelyShow more
Downloadable Archival Material, 2021-01-19
Undefined
Publisher:2021-01-19
Cited by 22
Related articles All 4 versions
 
Partial Wasserstein Covering
Authors:Kawano, Keisuke (Creator), Koide, Satoshi (Creator), Otaki, Keisuke (Creator)
Summary:We consider a general task called partial Wasserstein covering with the goal of providing information on what patterns are not being taken into account in a dataset (e.g., dataset used during development) compared with another dataset(e.g., dataset obtained from actual applications). We model this task as a discrete optimization problem with partial Wasserstein divergence as an objective function. Although this problem is NP-hard, we prove that it satisfies the submodular property, allowing us to use a greedy algorithm with a 0.63 approximation. However, the greedy algorithm is still inefficient because it requires solving linear programming for each objective function evaluation. To overcome this inefficiency, we propose quasi-greedy algorithms that consist of a series of acceleration techniques, such as sensitivity analysis based on strong duality and the so-called C-transform in the optimal transport field. Experimentally, we demonstrate that we can efficiently fill in the gaps between the two datasets and find missing scene in real driving scenes datasetsShow more
Downloadable Archival Material, 2021-06-01
Undefined
Publisher:2021-06-01

<——2021———2021——2790—



Blockchains for Network Security: Principles, Technologies and Applications
Authors:Haojun Huang; Lizhe Wang; Yulei Wu; Kim-Kwang Raymond Choo
Summary:Presenting a comprehensive view of blockchain technologies for network security from principles to core technologies and applications, this book offers unprecedented insights into recent advances and developments in these areas, and how they can make blockchain technologies associated with networks more secure and fit-for-purposeShow more
eBook, 2021
English
Publisher:Institution of Engineering & Technology, Stevenage, 2021
Also available asPrint Book
View AllFormats & Editions


Wasserstein Patch Prior for Image Superresolution
Authors:Hertrich, Johannes (Creator), Houdard, Antoine (Creator), Redenbach, Claudia (Creator)
Summary:In this paper, we introduce a Wasserstein patch prior for superresolution of two- and three-dimensional images. Here, we assume that we have given (additionally to the low resolution observation) a reference image which has a similar patch distribution as the ground truth of the reconstruction. This assumption is e.g. fulfilled when working with texture images or material data. Then, the proposed regularizer penalizes the $W_2$-distance of the patch distribution of the reconstruction to the patch distribution of some reference image at different scales. We demonstrate the performance of the proposed regularizer by two- and three-dimensional numerical examplesShow more
Downloadable Archival Material, 2021-09-27
Undefined
Publisher:2021-09-27

Stochastic Wasserstein Barycenters
Authors:Massachusetts Institute of Technology Computer Science and Artificial Intelligence Laboratory (Contributor), Solomon, Justin (Creator), Chien, Edward (Creator), Claici, Sebastian (Creator)
Summary:© 2018 35th International Conference on Machine Learning, ICML 2018. All rights reserved. We present a stochastic algorithm to compute the barycenter of a set of probability distributions under the Wasserstein metric from optimal transport. Unlike previous approaches, our method extends to continuous input distributions and allows the support of the barycenter to be adjusted in each iteration. We tackle the problem without regularization, allowing us to recover a much sharper output; we give examples where our algorithm recovers a more meaningful barycenter than previous work. Our method is versatile and can be extended to applications such as generating super samples from a given distribution and recovering blue noise approximationsShow more
Downloadable Archival Material, 2021-11-09T18:35:57Z
English
Publisher:2021-11-09T18:35:57Z

Invitation to Statistics in Wasserstein Space
eBook, 2021
English
Publisher:Springer Nature, 2021
Also available asPrint Book
View AllFormats & Editions


Distributional robustness in minimax linear quadratic control with Wasserstein distance
Authors:Kim, Kihyun (Creator), Yang, Insoon (Creator)
Summary:To address the issue of inaccurate distributions in practical stochastic systems, a minimax linear-quadratic control method is proposed using the Wasserstein metric. Our method aims to construct a control policy that is robust against errors in an empirical distribution of underlying uncertainty, by adopting an adversary that selects the worst-case distribution. The opponent receives a Wasserstein penalty proportional to the amount of deviation from the empirical distribution. A closed-form expression of the finite-horizon optimal policy pair is derived using a Riccati equation. The result is then extended to the infinite-horizon average cost setting by identifying conditions under which the Riccati recursion converges to the unique positive semi-definite solution to an algebraic Riccati equation. Our method is shown to possess several salient features including closed-loop stability, and an out-of-sample performance guarantee. We also discuss how to optimize the penalty parameter for enhancing the distributional robustness of our control policy. Last but not least, a theoretical connection to the classical $H_\infty$-method is identified from the perspective of distributional robustnessShow more
Downloadable Archival Material, 2021-02-25
Undefined
Publisher:2021-02-25

2021


Dynamic Topological Data Analysis for Brain Networks via Wasserstein Graph Clustering
Authors:Chung, Moo K. (Creator), Huang, Shih-Gu (Creator), Carroll, Ian C. (Creator), Calhoun, Vince D. (Creator), Goldsmith, H. Hill (Creator)
Summary:We present the novel Wasserstein graph clustering for dynamically changing graphs. The Wasserstein clustering penalizes the topological discrepancy between graphs. The Wasserstein clustering is shown to outperform the widely used k-means clustering. The method applied in more accurate determination of the state spaces of dynamically changing functional brain networksShow more
Downloadable Archival Material, 2021-12-31
Undefined
Publisher:2021-12-31



Training Wasserstein GANs without gradient penalties
Authors:Kwon, Dohyun (Creator), Kim, Yeoneung (Creator), Montúfar, Guido (Creator), Yang, Insoon (Creator)
Summary:We propose a stable method to train Wasserstein generative adversarial networks. In order to enhance stability, we consider two objective functions using the $c$-transform based on Kantorovich duality which arises in the theory of optimal transport. We experimentally show that this algorithm can effectively enforce the Lipschitz constraint on the discriminator while other standard methods fail to do so. As a consequence, our method yields an accurate estimation for the optimal discriminator and also for the Wasserstein distance between the true distribution and the generated one. Our method requires no gradient penalties nor corresponding hyperparameter tuning and is computationally more efficient than other methods. At the same time, it yields competitive generators of synthetic images based on the MNIST, F-MNIST, and CIFAR-10 datasetsShow more
Downloadable Archival Material, 2021-10-26
Undefined
Publisher:2021-10-26

Peer-reviewed
Decomposition methods for Wasserstein-based data-driven distributionally robust problems
Authors:Carlos Andrés Gamboa; Davi Michel Valladão; Alexandre Street; Tito Homem-de-Mello
Summary:We study decomposition methods for two-stage data-driven Wasserstein-based DROs with right-hand-sided uncertainty and rectangular support. We propose a novel finite reformulation that explores the rectangular uncertainty support to develop and test five new different decomposition schemes: Column-Constraint Generation, Single-cut and Multi-cut Benders, as well as Regularized Single-cut and Multi-cut Benders. We compare the efficiency of the proposed methods for a unit commitment problem with 14 and 54 thermal generators whose uncertainty vector differs from a 24 to 240-dimensional arrayShow more
Article
Publication:Operations Research Letters, 49, September 2021, 696

Peer-reviewed
Conditional Wasserstein GAN-based oversampling of tabular data for imbalanced learning
Authors:Justin Engelmann; Stefan Lessmann
Summary:Class imbalance impedes the predictive performance of classification models. Popular countermeasures include oversampling minority class cases by creating synthetic examples. The paper examines the potential of Generative Adversarial Networks (GANs) for oversampling. A few prior studies have used GANs for this purpose but do not reflect recent methodological advancements for generating tabular data using GANs. The paper proposes an approach based on a conditional Wasserstein GAN that can effectively model tabular datasets with numerical and categorical variables and pays special attention to the down-stream classification task through an auxiliary classifier loss. We focus on a credit scoring context in which binary classifiers predict the default risk of loan applications. Empirical comparisons in this context evidence the competitiveness of GAN-based oversampling compared to several standard oversampling regimes. We also clarify the conditions under which oversampling in general and the proposed GAN-based approach in particular raise predictive performance. In sum, our findings suggest that GAN architectures for tabular data and our extensions deserve a place in data scientists’ modelling toolboxShow more
Article
Publication:Expert Systems With Applications, 174, 2021-07-15

Peer-reviewed
Adversarial training with Wasserstein distance for learning cross-lingual word embeddings
Authors:Yuling Li; Yuhong Zhang; Kui Yu; Xuegang Hu
Summary:Abstract: Recent studies have managed to learn cross-lingual word embeddings in a completely unsupervised manner through generative adversarial networks (GANs). These GANs-based methods enable the alignment of two monolingual embedding spaces approximately, but the performance on the embeddings of low-frequency words (LFEs) is still unsatisfactory. The existing solution is to set up the low sampling rates for the embeddings of LFEs based on word-frequency information. However, such a solution has two shortcomings. First, this solution relies on the word-frequency information that is not always available in real scenarios. Second, the uneven sampling may cause the models to overlook the distribution information of LFEs, thereby negatively affecting their performance. In this study, we propose a novel unsupervised GANs-based method that effectively improves the quality of LFEs, circumventing the above two issues. Our method is based on the observation that LFEs tend to be densely clustered in the embedding space. In these dense embedding points, obtaining fine-grained alignment through adversarial training is difficult. We use this idea to introduce a noise function that can disperse the dense embedding points to a certain extent. In addition, we train a Wasserstein critic network to encourage the noise-adding embeddings and the original embeddings to have similar semantics. We test our approach on two common evaluation tasks, namely, bilingual lexicon induction and cross-lingual word similarity. Experimental results show that the proposed model has stronger or competitive performance compared with the supervised and unsupervised baselinesShow more

Article, 2021
Publication:Applied Intelligence : The International Journal of Research on Intelligent Systems for Real Life Complex Problems, 51, 20210315, 7666
Publisher:2021

<——2021———2021——2800—



Peer-reviewed
Considering anatomical prior information for low-dose CT image enhancement using attribute-augmented Wasserstein generative adversarial networks
Authors:Zhenxing Huang; Xinfeng Liu; Rongpin Wang; Jincai Chen; Ping Lu; Qiyang Zhang; Changhui Jiang; Yongfeng Yang; Xin Liu; Hairong Zheng
Summary:Currently, many deep learning (DL)-based low-dose CT image postprocessing technologies fail to consider the anatomical differences in training data among different human body sites, such as the cranium, lung and pelvis. In addition, we can observe evident anatomical similarities at the same site among individuals. However, these anatomical differences and similarities are ignored in the current DL-based methods during the network training process. In this paper, we propose a deep network trained by introducing anatomical site labels, termed attributes for training data. Then, the network can adaptively learn to obtain the optimal weight for each anatomical site. By doing so, the proposed network can take full advantage of anatomical prior information to estimate high-resolution CT images. Furthermore, we employ a Wasserstein generative adversarial network (WGAN) augmented with attributes to preserve more structural details. Compared with the traditional networks that do not consider the anatomical prior and whose weights are consequently the same for each anatomical site, the proposed network achieves better performance by adaptively adjusting to the anatomical prior informationShow more
Article
Publication:Neurocomputing, 428, 2021-03-07, 104


Peer-reviewed
Wasserstein distance feature alignment learning for 2D image-based 3D model retrieval
Authors:Yaqian Zhou; Yu Liu; Heyu Zhou; Wenhui Li
Summary:2D image-based 3D model retrieval has become a hotspot topic in recent years. However, the current existing methods are limited by two aspects. Firstly, they are mostly based on the supervised learning, which limits their application because of the high time and cost consuming of manual annotation. Secondly, the mainstream methods narrow the discrepancy between 2D and 3D domains mainly by the image-level alignment, which may bring the additional noise during the image transformation and influence cross-domain effect. Consequently, we propose a Wasserstein distance feature alignment learning (WDFAL) for this retrieval task. First of all, we describe 3D models through a series of virtual views and use CNNs to extract features. Secondly, we design a domain critic network based on the Wasserstein distance to narrow the discrepancy between two domains. Compared to the image-level alignment, we reduce the domain gap by the feature-level distribution alignment to avoid introducing additional noise. Finally, we extract the visual features from 2D and 3D domains, and calculate their similarity by utilizing Euclidean distance. The extensive experiments can validate the superiority of the WDFAL methodShow more
Article, 2021
Publication:Journal of Visual Communication and Image Representation, 79, 202108
Publisher:2021


Peer-reviewed
Nonembeddability of persistence diagrams with $p>2$ Wasserstein metric
Author:Alexander Wagner
Summary:Persistence diagrams do not admit an inner product structure compatible with any Wasserstein metric. Hence, when applying kernel methods to persistence diagrams, the underlying feature map necessarily causes distortion. We prove that persistence diagrams with the $p$-Wasserstein metric do not admit a coarse embedding into a Hilbert space when $p > 2$.
Downloadable Article, 2021
Publication:Proceedings of the American Mathematical Society, 149, June 1, 2021, 2673
Publisher:2021



Peer-reviewed
A deep learning-based approach for direct PET attenuation correction using Wasserstein generative adversarial network
Authors:Yongchang Li; Wei Wu
Summary:Positron emission tomography (PET) in some clinical diagnostic settings demands attenuation correction (AC) and scatter correction (SC) to obtain high-quality imaging and thus more precise metabolic information about the patient's tissues or organs. However, there are still some inevitable issues, such as subtle mismatches between PET and CT imaging, or substantial ionizing radiation exposure in many follow-up inspections. To cope with these issues, we introduce a deep learning-based technique to achieve direct attenuation correction for PET imaging in this article. Moreover, Wasserstein generative adversarial networks and a hybrid loss, including an adversarial loss, L loss and gradient difference loss, are utilized to encourage the deep network to synthesize PET images with much richer detail. A comprehensive study was designed and carried out on a total of forty-five sets of PET images of lymphoma patients for the model training and test stages. The final performance analysis, based on our experimental outcomes, demonstrates that the proposed algorithm improves the quality of PET imaging according to both qualitative and quantitative evaluation.
Article, 2021
Publication:Journal of Physics: Conference Series, 1848, 20210401
Publisher:2021


Peer-reviewed
Differential inclusions in Wasserstein spaces: The Cauchy-Lipschitz framework
Authors:Benoît Bonnet; Hélène Frankowska
Summary:In this article, we propose a general framework for the study of differential inclusions in the Wasserstein space of probability measures. Based on earlier geometric insights on the structure of continuity equations, we define solutions of differential inclusions as absolutely continuous curves whose driving velocity fields are measurable selections of a multifunction taking its values in the space of vector fields. In this general setting, we prove three of the founding results of the theory of differential inclusions: Filippov's theorem, the Relaxation theorem, and the compactness of the solution sets. These contributions - which are based on novel estimates on solutions of continuity equations - are then applied to derive a new existence result for fully non-linear mean-field optimal control problems with closed-loop controls.
Article
Publication:Journal of Differential Equations, 271, 2021-01-15, 594

2021
  
Peer-reviewed
Low-dose CT denoising using a Progressive Wasserstein generative adversarial network
Authors:Guan Wang; Xueli Hu
Summary:Low-dose computed tomography (LDCT) imaging can greatly reduce the radiation dose imposed on the patient. However, image noise and visual artifacts are inevitable when the radiation dose is low, which has a serious impact on clinical medical diagnosis. Hence, it is important to address the problem of LDCT denoising. Image denoising technology based on the Generative Adversarial Network (GAN) has shown promising results in LDCT denoising. Unfortunately, the structures and the corresponding learning algorithms are becoming more and more complex and diverse, making it tricky to analyze the contributions of various network modules when developing new networks. In this paper, we propose a progressive Wasserstein generative adversarial network to remove the noise of LDCT images, providing a more feasible and effective way for CT denoising. Specifically, a recursive computation is designed to reduce the network parameters. Moreover, we introduce a novel hybrid loss function for achieving improved results; it aims to reduce artifacts while better retaining the details in the denoising results. Therefore, we propose a novel LDCT denoising model called progressive Wasserstein generative adversarial network with a weighted structurally-sensitive hybrid loss function (PWGAN-WSHL), which provides a better and simpler baseline by considering network architecture and loss functions together. Extensive experiments on a publicly available database show that our proposal achieves better performance than the state-of-the-art methods.
Article
Publication:Computers in Biology and Medicine, 135, August 2021


Peer-reviewed
Using Wasserstein Generative Adversarial Networks for the design of Monte Carlo simulations
Authors:Susan Athey; Guido W. Imbens; Jonas Metzger; Evan Munro
Summary:When researchers develop new econometric methods it is common practice to compare the performance of the new methods to those of existing methods in Monte Carlo studies. The credibility of such Monte Carlo studies is often limited because of the discretion the researcher has in choosing the Monte Carlo designs reported. To improve the credibility we propose using a class of generative models that has recently been developed in the machine learning literature, termed Generative Adversarial Networks (GANs), which can be used to systematically generate artificial data that closely mimic existing datasets. Thus, in combination with existing real data sets, GANs can be used to limit the degrees of freedom in Monte Carlo study designs for the researcher, making any comparisons more convincing. In addition, if an applied researcher is concerned with the performance of a particular statistical method on a specific data set (beyond its theoretical properties in large samples), she can use such GANs to assess the performance of the proposed method, e.g. the coverage rate of confidence intervals or the bias of the estimator, using simulated data which closely resembles the exact setting of interest. To illustrate these methods we apply Wasserstein GANs (WGANs) to the estimation of average treatment effects. In this example, we find that …
Article
Publication:Journal of Econometrics


Peer-reviewed
The α-z-Bures Wasserstein divergence
Authors:Trung Hoa Dinh; Cong Trinh Le; Bich Khue Vo; Trung Dung Vuong
Summary:In this paper, we introduce the α-z-Bures Wasserstein divergence for positive semidefinite matrices A and B as $\Phi(A,B) = \operatorname{Tr}((1-\alpha)A + \alpha B) - \operatorname{Tr}(Q_{\alpha,z}(A,B))$, where $Q_{\alpha,z}(A,B) = \big(A^{\frac{1-\alpha}{2z}} B^{\frac{\alpha}{z}} A^{\frac{1-\alpha}{2z}}\big)^{z}$ is the matrix function in the α-z-Rényi relative entropy. We show that for $0 \le \alpha \le z \le 1$, the quantity $\Phi(A,B)$ is a quantum divergence and satisfies the Data Processing Inequality in quantum information. We also solve the least squares problem with respect to the new divergence. In addition, we show that the matrix power mean $\mu(t,A,B) = ((1-t)A^{p} + tB^{p})^{1/p}$ satisfies the in-betweenness property with respect to the α-z-Bures Wasserstein divergence.
Article, 2021
Publication:Linear Algebra and Its Applications, 624, 20210901, 267
Publisher:2021
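Since the divergence above is given in closed form, it is straightforward to evaluate numerically. The sketch below (NumPy, assuming symmetric positive semidefinite inputs) is an illustration of the stated formula, not code from the paper:

import numpy as np

def _psd_power(M, p):
    # Fractional power of a symmetric PSD matrix via eigendecomposition.
    w, V = np.linalg.eigh(M)
    w = np.clip(w, 0.0, None)
    return (V * w**p) @ V.T

def alpha_z_bures_divergence(A, B, alpha, z):
    # Phi(A, B) = Tr((1 - alpha) A + alpha B) - Tr(Q_{alpha,z}(A, B)).
    A_half = _psd_power(A, (1 - alpha) / (2 * z))
    Q = _psd_power(A_half @ _psd_power(B, alpha / z) @ A_half, z)
    return np.trace((1 - alpha) * A + alpha * B) - np.trace(Q)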



Peer-reviewed
Berry–Esseen Smoothing Inequality for the Wasserstein Metric on Compact Lie Groups
Author:Bence Borda
Summary:We prove a sharp general inequality estimating the distance of two probability measures on a compact Lie group in the Wasserstein metric in terms of their Fourier transforms. We use a generalized form of the Wasserstein metric, related by Kantorovich duality to the family of functions with an arbitrarily prescribed modulus of continuity. The proof is based on smoothing with a suitable kernel, and a Fourier decay estimate for continuous functions. As a corollary, we show that the rate of convergence of random walks on semisimple groups in the Wasserstein metric is necessarily almost exponential, even without assuming a spectral gap. Applications to equidistribution and empirical measures are also given.
Article, 2021
Publication:Journal of Fourier Analysis and Applications, 27, 20210303
Publisher:2021

Peer-reviewed
Transfer learning method for bearing fault diagnosis based on fully convolutional conditional Wasserstein adversarial Networks
Authors:Yong Zhi Liu; Ke Ming Shi; Zhi Xuan Li; Guo Fu Ding; Yi Sheng Zou
Summary:The diagnostic accuracy of existing transfer learning-based bearing fault diagnosis methods is high in the source condition, but accuracy in the target condition is not guaranteed. These methods mainly focus on the whole distribution of bearing source-domain data and target-condition data, ignoring the transfer learning of each kind of bearing fault data, which may lead to lower diagnostic accuracy. To overcome these limitations, we propose a transfer learning fault diagnosis model based on a deep Fully Convolutional Conditional Wasserstein Adversarial Network (FCWAN). The proposed model addresses the described problems separately: (1) a random-sampling map classification and difference classifier are used to handle the first limitation; (2) a label is introduced into the domain adversarial learning to strengthen the supervision of the learning process and the effect of category field alignment, thus overcoming the second limitation. Experimental results demonstrate the superiority of this method over existing methods.
Article
Publication:Measurement, 180, August 2021

Cited by 27

<——2021———2021——2810—


Peer-reviewed
Primal Dual Methods for Wasserstein Gradient Flows
Authors:José A. Carrillo; Katy Craig; Li Wang; Chaozhen Wei
Summary:Combining the classical theory of optimal transport with modern operator splitting techniques, we develop a new numerical method for nonlinear, nonlocal partial differential equations arising in models of porous media, materials science, and biological swarming. Our method proceeds as follows: first, we discretize in time, either via the classical JKO scheme or via a novel Crank–Nicolson-type method we introduce. Next, we use the Benamou–Brenier dynamical characterization of the Wasserstein distance to reduce computing the solution of the discrete time equations to solving fully discrete minimization problems, with strictly convex objective functions and linear constraints. Third, we compute the minimizers by applying a recently introduced, provably convergent primal dual splitting scheme for three operators (Yan in J Sci Comput 1–20, 2018). By leveraging the PDEs' underlying variational structure, our method overcomes stability issues present in previous numerical work built on explicit time discretizations, which suffer due to the equations' strong nonlinearities and degeneracies. Our method is also naturally positivity and mass preserving and, in the case of the JKO scheme, energy decreasing. We prove that minimizers of the fully discrete problem converge to minimizers of the spatially continuous, discrete time problem as the spatial discretization is refined. We conclude with simulations of nonlinear PDEs and Wasserstein geodesics in one and two dimensions that illustrate the key properties of our approach, including the higher-order convergence of our novel Crank–Nicolson-type method when compared to the classical JKO method.
Downloadable Article, 2021
Publication:Foundations of Computational Mathematics, 20210331, 1
Publisher:2021
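For orientation, the JKO time discretization mentioned in this entry (and in several other gradient-flow entries in this section) advances a density by one step of size τ through a Wasserstein-penalized minimization. A standard textbook statement of the scheme, not specific to this paper, is

$\rho^{k+1} \in \arg\min_{\rho} \; \frac{1}{2\tau}\, W_2^2(\rho, \rho^{k}) + E(\rho)$,

where E denotes the driving energy (internal, potential and interaction terms).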

Peer-reviewed
Sufficient Condition for Rectifiability Involving Wasserstein Distance
Author:Damian Dąbrowski
Summary:A Radon measure μ is n-rectifiable if it is absolutely continuous with respect to the n-dimensional Hausdorff measure and μ-almost all of its support can be covered by Lipschitz images of R^n. In this paper we give two sufficient conditions for rectifiability, both in terms of square functions of flatness-quantifying coefficients. The first condition involves the so-called α and β numbers. The second one involves coefficients quantifying flatness via the Wasserstein distance W₂. Both conditions are necessary for rectifiability, too - the first one was shown to be necessary by Tolsa, while the necessity of the W₂-based condition is established in our recent paper. Thus, we get two new characterizations of rectifiability.
Article, 2021
Publication:The Journal of Geometric Analysis, 31, 20210119, 8539
Publisher:2021

Peer-reviewed
On linear optimization over Wasserstein balls
Authors:Man-Chung Yue; Daniel Kuhn; Wolfram Wiesemann
Summary:Wasserstein balls, which contain all probability measures within a pre-specified Wasserstein distance to a reference measure, have recently enjoyed wide popularity in the distributionally robust optimization and machine learning communities to formulate and solve data-driven optimization problems with rigorous statistical guarantees. In this technical note we prove that the Wasserstein ball is weakly compact under mild conditions, and we offer necessary and sufficient conditions for the existence of optimal solutions. We also characterize the sparsity of solutions if the Wasserstein ball is centred at a discrete reference measure. In comparison with the existing literature, which has proved similar results under different conditions, our proofs are self-contained and shorter, yet mathematically rigorous, and our necessary and sufficient conditions for the existence of optimal solutions are easily verifiable in practice.
Article, 2021
Publication:Mathematical Programming : A Publication of the Mathematical Optimization Society, 195, 20210617, 1107
Publisher:2021
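For reference, the Wasserstein ball used throughout the distributionally robust optimization entries in this section is the standard object

$\mathbb{B}_{\varepsilon}(\hat{P}) = \{\, Q \in \mathcal{P}(\Xi) : W_p(Q, \hat{P}) \le \varepsilon \,\}$,

i.e. all probability measures on the support set Ξ within Wasserstein distance ε of a reference (typically empirical) measure; a distributionally robust program then minimizes the worst-case expected loss over this ball. This is the generic textbook definition rather than anything specific to the paper above.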


Peer-reviewed
Optimal Estimation of Wasserstein Distance on a Tree With an Application to Microbiome Studies
Authors:Shulei Wang; T. Tony Cai; Hongzhe Li
Summary:The weighted UniFrac distance, a plug-in estimator of the Wasserstein distance of read counts on a tree, has been widely used to measure the microbial community difference in microbiome studies. Our investigation however shows that such a plug-in estimator, although intuitive and commonly used in practice, suffers from potential bias. Motivated by this finding, we study the problem of optimal estimation of the Wasserstein distance between two distributions on a tree from the sampled data in the high-dimensional setting. The minimax rate of convergence is established. To overcome the bias problem, we introduce a new estimator, referred to as the moment-screening estimator on a tree (MET), by using implicit best polynomial approximation that incorporates the tree structure. The new estimator is computationally efficient and is shown to be minimax rate-optimal. Numerical studies using both simulated and real biological datasets demonstrate the practical merits of MET, including reduced biases and statistically more significant differences in microbiome between the inactive Crohn's disease patients and the normal controls. Supplementary materials for this article are available online.
Article
Publication:Journal of the American Statistical Association, 116, 20210703, 1237
Zbl 1510.62459


Peer-reviewed
Solutions to Hamilton–Jacobi equation on a Wasserstein space
Authors:Zeinab Badreddine; Hélène Frankowska
Summary:We consider a Hamilton–Jacobi equation associated to the Mayer optimal control problem in the Wasserstein space and define its solutions in terms of the Hadamard generalized differentials. Continuous solutions are unique whenever we focus our attention on solutions defined on explicitly described time dependent compact valued tubes of probability measures. We also prove some viability and invariance theorems in the Wasserstein space and discuss a new notion of proximal normal.
Article, 2021
Publication:Calculus of Variations and Partial Differential Equations, 61, 20211120
Publisher:2021

Semi-relaxed Gromov-Wasserstein divergence with applications on graphs
Authors:Vincent-Cuaz, Cédric; Flamary, Rémi; Corneli, Marco; Vayer, Titouan; Courty, Nicolas
Summary:Comparing structured objects such as graphs is a fundamental operation involved in many learning tasks. To this end, the Gromov-Wasserstein (GW) distance, based on Optimal Transport (OT), has proven to be successful in handling the specific nature of the associated objects. More specifically, through the nodes connectivity relations, GW operates on graphs, seen as probability measures over specific spaces. At the core of OT is the idea of conservation of mass, which imposes a coupling between all the nodes from the two considered graphs. We argue in this paper that this property can be detrimental for tasks such as graph dictionary or partition learning, and we relax it by proposing a new semi-relaxed Gromov-Wasserstein divergence. Aside from immediate computational benefits, we discuss its properties, and show that it can lead to an efficient graph dictionary learning algorithm. We empirically demonstrate its relevance for complex tasks on graphs such as partitioning, clustering and completion.
Downloadable Archival Material, 2021-10-06
Publisher:2021-10-06
Semi-relaxed Gromov-Wasserstein divergence ... - OpenReview

Oct 12, 2021


2021
 
EEG data augmentation for emotion recognition with a multiple generator conditional Wasserstein GAN
Authors:Aiming Zhang; Lei Su; Yin Zhang; Yunfa Fu; Liping Wu; Shengjin Liang
Summary:EEG-based emotion recognition has attracted substantial attention from researchers due to its extensive application prospects, and substantial progress has been made in feature extraction and classification modelling from EEG data. However, insufficient high-quality training data are available for building EEG-based emotion recognition models via machine learning or deep learning methods. The artificial generation of high-quality data is an effective approach for overcoming this problem. In this paper, a multi-generator conditional Wasserstein GAN method is proposed for the generation of high-quality artificial data that covers a more comprehensive distribution of real data through the use of various generators. Experimental results demonstrate that the artificial data generated by the proposed model can effectively improve the performance of emotion classification models that are based on EEG.
Article, 2021
Publication:Complex & Intelligent Systems, 8, 20210403, 3059
Publisher:2021

Peer-reviewed
LCS graph kernel based on Wasserstein distance in longest common subsequence metric space
Authors:Jianming Huang; Zhongxi Fang; Hiroyuki Kasai
Summary:• Graph classification using a Wasserstein graph kernel. • Path sequence comparison over the longest common subsequence metric space. • Adjacent point merging strategy in the metric space for computation reduction.

For graph learning tasks, many existing methods utilize a message-passing mechanism where vertex features are updated iteratively by aggregation of neighbor information. This strategy provides an efficient means of graph feature extraction, but the features obtained after many iterations might contain too much information from other vertices and tend to be similar to each other, which makes their representations less expressive. Learning graphs using paths, on the other hand, can be less adversely affected by this problem because it does not involve all vertex neighbors. However, most path-based methods can only compare paths with the same length, which might engender information loss. To resolve this difficulty, we propose a new graph kernel based on a Longest Common Subsequence (LCS) similarity. Moreover, we found that the widely used R-convolution framework is unsuitable for path-based graph kernels because a huge number of comparisons between dissimilar paths might degrade the graph distance calculation. Therefore, we propose a novel metric space by exploiting the proposed LCS-based similarity, and compute a new Wasserstein-based graph distance in this metric space, which emphasizes the comparison between similar paths. Furthermore, to reduce the computational cost, we propose an adjacent point merging operation to sparsify point clouds in the metric space.

Article, 2021
Publication:Signal Processing, 189, 202112
Publisher:2021



Consistency of Distributionally Robust Risk- and Chance-Constrained Optimization under Wasserstein Ambiguity Sets
Authors:Ashish Cherukuri; Ashish R. Hota
Summary:We study stochastic optimization problems with chance and risk constraints, where in the latter, risk is quantified in terms of the conditional value-at-risk (CVaR). We consider the distributionally robust versions of these problems, where the constraints are required to hold for a family of distributions constructed from the observed realizations of the uncertainty via the Wasserstein distance. Our main results establish that if the samples are drawn independently from an underlying distribution and the problems satisfy suitable technical assumptions, then the optimal value and optimizers of the distributionally robust versions of these problems converge to the respective quantities of the original problems, as the sample size increases.
Article
Publication:IEEE Control Systems Letters, 5, 2021, 1729


 

Geometry on the Wasserstein Space Over a Compact Riemannian Manifold
Authors:Hao Ding; Shizan Fang
Summary:We revisit the intrinsic differential geometry of the Wasserstein space over a Riemannian manifold, due to a series of papers by Otto, Otto-Villani, Lott, Ambrosio-Gigli-Savaré, etc.
Article, 2021
Publication:Acta Mathematica Scientia, 41, 20211105, 1959
Publisher:2021

Zbl 1513.58004

<——2021———2021——2820—



Peer-reviewed
Entropic-Wasserstein Barycenters: PDE Characterization, Regularity, and CLT
Authors:Guillaume Carlier; Katharina Eichinger; Alexey Kroshnin
Summary:In this paper, we investigate properties of entropy-penalized Wasserstein barycenters introduced in [J. Bigot, E. Cazelles, and N. Papadakis, SIAM J. Math. Anal., 51 (2019), pp. 2261–2285] as a regularization of Wasserstein barycenters [M. Agueh and G. Carlier, SIAM J. Math. Anal., 43 (2011), pp. 904–924]. After characterizing these barycenters in terms of a system of Monge–Ampère equations, we prove some global moment and Sobolev bounds as well as higher regularity properties. We finally establish a central limit theorem for entropic-Wasserstein barycenters.
Downloadable Article
Publication:SIAM Journal on Mathematical Analysis, 53, 2021, 5880
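Entropy-regularized Wasserstein barycenters of the kind analyzed above can be computed with off-the-shelf tools. A minimal sketch using the POT library (assumed installed; the grid, histograms and regularization strength are placeholder choices) for two 1-D histograms on a shared grid:

import numpy as np
import ot  # Python Optimal Transport (POT)

n = 100
x = np.linspace(0.0, 1.0, n)
a = np.exp(-((x - 0.3) ** 2) / 0.01); a /= a.sum()   # histogram centered at 0.3
b = np.exp(-((x - 0.7) ** 2) / 0.01); b /= b.sum()   # histogram centered at 0.7
M = ot.dist(x.reshape(-1, 1), x.reshape(-1, 1))      # squared Euclidean ground cost
M /= M.max()

A = np.vstack([a, b]).T                 # histograms as columns
weights = np.array([0.5, 0.5])          # equal barycenter weights
bary = ot.bregman.barycenter(A, M, reg=1e-2, weights=weights)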

Peer-reviewed
Pixel-Wise Wasserstein Autoencoder for Highly Generative Dehazing
Authors:Guisik Kim; Sung Woo Park; Junseok Kwon
Summary:We propose a highly generative dehazing method based on pixel-wise Wasserstein autoencoders. In contrast to existing dehazing methods based on generative adversarial networks, our method can produce a variety of dehazed images with different styles. It significantly improves the dehazing accuracy via pixel-wise matching from hazy to dehazed images through 2-dimensional latent tensors of the Wasserstein autoencoder. In addition, we present an advanced feature fusion technique to deliver rich information to the latent space. For style transfer, we introduce a mapping function that transforms existing latent spaces to new ones. Thus, our method can produce highly generative haze-free images with various tones, illuminations, and moods, which induces several interesting applications, including low-light enhancement, daytime dehazing, nighttime dehazing, and underwater image enhancement. Experimental results demonstrate that our method quantitatively outperforms existing state-of-the-art methods for synthetic and real-world datasets, and simultaneously generates highly generative haze-free images, which are qualitatively diverse.
Article, 2021
Publication:IEEE Transactions on Image Processing, 30, 2021, 5452
Publisher:2021


Solving Wasserstein Robust Two-stage Stochastic Linear Programs via Second-order Conic Programming
Authors:Zhuolin Wang; Keyou You; Shiji Song; Yuli Zhang
2021 40th Chinese Control Conference (CCC)
Summary:This paper proposes a novel data-driven distributionally robust (DR) two-stage linear program over the 1-Wasserstein ball to handle stochastic uncertainty with unknown distribution. We study the case with distribution uncertainty only in the objective function. In sharp contrast to the existing literature, our model can be equivalently reformulated as a solvable second-order cone programming (SOCP) problem. Moreover, the distribution achieving the worst-case cost is given as an "empirical" distribution obtained by simply perturbing each sample, and the asymptotic convergence of the proposed model is also proved. Finally, experiments illustrate the advantages of our model in terms of out-of-sample performance and computational complexity.
Chapter, 2021
Publication:2021 40th Chinese Control Conference (CCC), 20210726, 1875
Publisher:2021



Multi-source Cross Project Defect Prediction with Joint Wasserstein Distance and Ensemble Learning
Authors:Quanyi Zou; Lu Lu; Zhanyu Yang; Hao Xu
2021 IEEE 32nd International Symposium on Software Reliability Engineering (ISSRE)
Summary:Cross-Project Defect Prediction (CPDP) refers to transferring knowledge from source software projects to a target software project. Previous research has shown that the impacts of knowledge transferred from different source projects differ on the target task. Therefore, one of the fundamental challenges in CPDP is how to measure the amount of knowledge transferred from each source project to the target task. This article proposes a novel CPDP method called Multi-source defect prediction with Joint Wasserstein Distance and Ensemble Learning (MJWDEL) to learn transferred weights for evaluating the importance of each source project to the target task. In particular, the TCA technique and Logistic Regression (LR) are first applied to train a sub-model for each source project and the target project. Moreover, the article designs a joint Wasserstein distance to understand the source-target relationship and then uses this as a basis to compute the transferred weights of the different sub-models. After that, the transferred weights can be used to reweight these sub-models to determine their importance in knowledge transfer to the target task. We conducted experiments on 19 software projects from the PROMISE, NASA and AEEEM datasets. Compared with several state-of-the-art CPDP methods, the proposed method substantially improves CPDP performance in terms of four evaluation indicators (i.e., F-measure, Balance, G-measure and MMC).
Chapter, 2021
Publication:2021 IEEE 32nd International Symposium on Software Reliability Engineering (ISSRE), 202110, 57
Publisher:2021

Differential semblance optimisation based on the adaptive quadratic Wasserstein distance
Authors:Zhennan Yu; Yang Liu
Summary:Owing to the robustness of wave equation-based inversion methods, wave equation migration velocity analysis (WEMVA) is stable in overcoming the multipathing problem and has become popular in recent years. As a rapidly developed method, differential semblance optimisation (DSO) is convenient to implement and can automatically detect the moveout existing in common image gathers (CIGs). However, by operating in the image domain with the target of minimising moveouts and improving the coherence of the CIGs, the DSO method often suffers from imaging artefacts caused by uneven illumination and irregular observation geometry, which may produce poor velocity updates with artefact contamination. To deal with this issue, in this paper, by introducing Wiener-like filters, we modify the conventional image matching-based objective function to a new one based on the quadratic Wasserstein metric. The new misfit function measures the distance between two distributions obtained from the convolutional filters and target functions. With the new misfit function, the adjoint sources and the corresponding gradients are improved. We apply the new method to two numerical examples and one field dataset. The corresponding results indicate that the new method is robust in compensating the low frequency components of velocity models.
Article
Publication:Journal of Geophysics and Engineering, 18, 20210820, 60
Publisher:2021

2021


Peer-reviewed
Multilevel Optimal Transport: A Fast Approximation of Wasserstein-1 Distances
Authors:Jialin Liu; Wotao Yin; Wuchen Li; Yat Tin Chow
Summary:We propose a fast algorithm for the calculation of the Wasserstein-1 distance, which is a particular type of optimal transport distance with transport cost homogeneous of degree one. Our algorithm is built on multilevel primal-dual algorithms. Several numerical examples and a complexity analysis are provided to demonstrate its computational speed. On some commonly used image examples of size $512\times512$, the proposed algorithm gives solutions within $0.2\sim 1.5$ seconds on a single CPU, which is much faster than the state-of-the-art algorithms.
Downloadable Article
Publication:SIAM Journal on Scientific Computing, 43, 2021, A193
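For small one-dimensional problems (unlike the large image grids targeted above), the Wasserstein-1 distance can be computed directly; the snippet below uses SciPy's built-in routine on two empirical samples and is purely illustrative, not the multilevel algorithm of the paper.

import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
x = rng.normal(loc=0.0, scale=1.0, size=5000)   # samples from N(0, 1)
y = rng.normal(loc=0.5, scale=1.0, size=5000)   # samples from N(0.5, 1)

# 1-D Wasserstein-1 distance between the two empirical distributions;
# for these equal-variance Gaussians the true value is |0.5 - 0.0| = 0.5.
print(wasserstein_distance(x, y))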


Peer-reviewed
Wasserstein GANs for MR Imaging: From Paired to Unpaired Training
Authors:Ke Lei; Morteza Mardani; John M. Pauly; Shreyas S. Vasanawala
Summary:Lack of ground-truth MR images impedes the common supervised training of neural networks for image reconstruction. To cope with this challenge, this article leverages unpaired adversarial training for reconstruction networks, where the inputs are undersampled k-space and naively reconstructed images from one dataset, and the labels are high-quality images from another dataset. The reconstruction networks consist of a generator which suppresses the input image artifacts, and a discriminator using a pool of (unpaired) labels to adjust the reconstruction quality. The generator is an unrolled neural network – a cascade of convolutional and data consistency layers. The discriminator is also a multilayer CNN that plays the role of a critic scoring the quality of reconstructed images based on the Wasserstein distance. Our experiments with knee MRI datasets demonstrate that the proposed unpaired training enables diagnostic-quality reconstruction when high-quality image labels are not available for the input types of interest, or when the amount of labels is small. In addition, our adversarial training scheme can achieve better image quality (as rated by expert radiologists) compared with the paired training schemes with pixel-wise loss.
Article, 2021
Publication:IEEE Transactions on Medical Imaging, 40, 202101, 105
Publisher:2021


DPIR-Net: Direct PET Image Reconstruction Based on the Wasserstein Generative Adversarial Network
Authors:Zhanli Hu; Hengzhi Xue; Qiyang Zhang; Juan Gao; Na Zhang; Sijuan Zou; Yueyang Teng; Xin Liu; Yongfeng Yang; Dong Liang
Summary:Positron emission tomography (PET) is an advanced medical imaging technique widely used in various clinical applications, such as tumor detection and neurologic disorders. Reducing the radiotracer dose is desirable in PET imaging because it decreases the patient's radiation exposure. However, reducing the dose can also increase noise, affecting the image quality. Therefore, an advanced image reconstruction algorithm based on low-dose PET data is needed to improve the quality of the reconstructed image. In this article, we propose the use of a direct PET image reconstruction network (DPIR-Net) using an improved Wasserstein generative adversarial network (WGAN) framework to enhance image quality. This article provides two main findings. First, our network uses sinogram data as input and outputs high-quality PET images directly, resulting in shorter reconstruction times compared with traditional model-based reconstruction networks. Second, we combine perceptual loss, mean square error, and the Wasserstein distance as the loss function, which effectively solves the problem of excessive smoothness and loss of detailed information in traditional network image reconstruction. We performed a comparative study using maximum-likelihood expectation maximization (MLEM) with a post-Gaussian filter, total variation (TV)-norm regularization, a nonlocal means (NLM) denoising method, a neural network denoising method, a traditional deep learning PET reconstruction network, and our proposed DPIR-Net method, and evaluated the proposed method using both human and mouse data. The mouse data were obtained from a small animal PET prototype system developed by our laboratory. The quantitative and qualitative results show that our proposed method outperformed the conventional methods.
Article, 2021
Publication:IEEE Transactions on Radiation and Plasma Medical Sciences, 5, 202101, 35
Publisher:2021


Joint Distribution Adaptation via Wasserstein Adversarial Training
Authors:Xiaolu Wang; Wenyong Zhang; Xin Shen; Huikang Liu
2021 International Joint Conference on Neural Networks (IJCNN)
Summary:This paper considers the unsupervised domain adaptation problem, in which we want to find a good prediction function on the unlabeled target domain by utilizing the information provided in the labeled source domain. A common approach to the domain adaptation problem is to learn a representation space where the distributional discrepancy of the source and target domains is small. Existing methods generally tend to match the marginal distributions of the two domains, while the label information in the source domain is not fully exploited. In this paper, we propose a representation learning approach for domain adaptation, referred to as JODAWAT. We aim to adapt the joint distributions of the feature-label pairs in the shared representation space for both domains. In particular, we minimize the Wasserstein distance between the source and target domains, while the prediction performance on the source domain is also guaranteed. The proposed approach results in a minimax adversarial training procedure that incorporates a novel split gradient penalty term. A generalization bound on the target domain is provided to reveal the efficacy of representation learning for joint distribution adaptation. We conduct extensive evaluations on JODAWAT, and test its classification accuracy on multiple synthetic and real datasets. The experimental results justify that our proposed method is able to achieve superior performance compared with various domain adaptation methods.
Chapter, 2021
Publication:2021 International Joint Conference on Neural Networks (IJCNN), 20210718, 1
Publisher:2021

 
Peer-reviewed
The Quantum Wasserstein Distance of Order 1

Authors:Giacomo De Palma; Milad Marvian; Dario Trevisan; Seth Lloyd
Summary:We propose a generalization of the Wasserstein distance of order 1 to the quantum states of n qudits. The proposal recovers the Hamming distance for the vectors of the canonical basis, and more generally the classical Wasserstein distance for quantum states diagonal in the canonical basis. The proposed distance is invariant with respect to permutations of the qudits and unitary operations acting on one qudit and is additive with respect to the tensor product. Our main result is a continuity bound for the von Neumann entropy with respect to the proposed distance, which significantly strengthens the best continuity bound with respect to the trace distance. We also propose a generalization of the Lipschitz constant to quantum observables. The notion of quantum Lipschitz constant allows us to compute the proposed distance with a semidefinite program. We prove a quantum version of Marton's transportation inequality and a quantum Gaussian concentration inequality for the spectrum of quantum Lipschitz observables. Moreover, we derive bounds on the contraction coefficients of shallow quantum circuits and of the tensor product of one-qudit quantum channels with respect to the proposed distance. We discuss other possible applications in quantum machine learning, quantum Shannon theory, and quantum many-body systems.
Article, 2021
Publication:IEEE Transactions on Information Theory, 67, 202110, 6627
Publisher:2021

<——2021———2021——2830—


Peer-reviewed
Sample Out-of-Sample Inference Based on Wasserstein Distance
Authors:Jose Blanchet; Yang Kang
Summary:Financial institutions make decisions according to a model of uncertainty. At the same time, regulators often evaluate the risk exposure of these institutions using a model of uncertainty, which is often different from the one used by the institutions. How can one incorporate both views into a single framework? This paper provides such a framework. It quantifies the impact of the misspecification inherent to the financial institution's data-driven model via the introduction of an adversarial player. The adversary replaces the institution's generated scenarios by the regulator's scenarios subject to a budget constraint and a cost that measures the distance between the two sets of scenarios (using what in statistics is known as the Wasserstein distance). This paper also harnesses statistical theory to make inference about the size of the estimated error when the sample sizes (both of the institution and the regulator) are large. The framework is explained more broadly in the context of distributionally robust optimization (a class of perfect information games, in which decisions are taken against an adversary that perturbs a baseline distribution).
Downloadable Article, 2021
Publication:Operations Research, 69, 202105, 985
Publisher:2021

 

Peer-reviewed
Confidence regions in Wasserstein distributionally robust estimation
Authors:Jose Blanchet; Karthyek Murthy; Nian Si
Summary:Estimators based on Wasserstein distributionally robust optimization are obtained as solutions of min-max problems in which the statistician selects a parameter minimizing the worst-case loss among all probability models within a certain distance from the underlying empirical measure in a Wasserstein sense. While motivated by the need to identify optimal model parameters or decision choices that are robust to model misspecification, these distributionally robust estimators recover a wide range of regularized estimators, including square-root lasso and support vector machines, among others. This paper studies the asymptotic normality of these distributionally robust estimators as well as the properties of an optimal confidence region induced by the Wasserstein distributionally robust optimization formulation. In addition, key properties of min-max distributionally robust optimization problems are also studied; for example, we show that distributionally robust estimators regularize the loss based on its derivative, and we also derive general sufficient conditions which show the equivalence between the min-max distributionally robust optimization problem and the corresponding max-min formulation.
Article, 2021
Publication:Biometrika, 109, 20210420, 295
Publisher:2021


Wasserstein Distance-Based Domain Adaptation and Its Application to Road Segmentation
Authors:Seita Kono; Takaya Ueda; Enrique Arriaga-Varela; Ikuko Nishikawa
2021 International Joint Conference on Neural Networks (IJCNN)
Summary:Domain adaptation is used in applying a classifier acquired in one data domain to another data domain. With the help of domain adaptation, a classifier obtained by supervised training with labeled data in an original source domain can also be used for classification in a target domain in which labeled data are difficult to collect. The most recently proposed domain adaptation methods focus on the data distribution in the feature space of a classifier and bring the data distributions of both domains closer through learning. The present work is based on an existing unsupervised domain adaptation method, in which both distributions become closer through adversarial training between a target data encoder to the feature space and a domain discriminator. We propose to use the Wasserstein distance to measure the distance between the two distributions, rather than the well-known Jensen-Shannon divergence. The Wasserstein distance, or earth mover's distance, measures the length of the shortest path among all possible pairings between corresponding variables in two distributions. Therefore, minimization of the distance leads to overlap of the corresponding data pairs in the source and target domains. Thus, the classifier trained in the source domain becomes also effective in the target domain. The proposed method using the Wasserstein distance shows higher accuracies in the target domains compared with the original distance in computer experiments on semantic segmentation of map images.
Chapter, 2021
Publication:2021 International Joint Conference on Neural Networks (IJCNN), 20210718, 1
Publisher:2021
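As a toy illustration of measuring the source-target discrepancy described above (not the adversarial training itself), the sketch below estimates the empirical Wasserstein-1 distance between two feature clouds with the POT library; the feature arrays are hypothetical placeholders.

import numpy as np
import ot  # Python Optimal Transport (POT), assumed installed

def empirical_w1(source_feats, target_feats):
    # Exact Wasserstein-1 distance between two empirical point clouds
    # with uniform weights and Euclidean ground cost.
    ns, nt = len(source_feats), len(target_feats)
    cost = ot.dist(source_feats, target_feats, metric='euclidean')
    return ot.emd2(np.full(ns, 1.0 / ns), np.full(nt, 1.0 / nt), cost)

rng = np.random.default_rng(0)
src = rng.normal(0.0, 1.0, size=(200, 16))   # placeholder source features
tgt = rng.normal(0.3, 1.0, size=(200, 16))   # placeholder target features
print(empirical_w1(src, tgt))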


Convergence of Recursive Stochastic Algorithms Using Wasserstein Divergence
Authors:Abhishek Gupta; William B. Haskell
Summary:This paper develops a unified framework, based on iterated random operator theory, to analyze the convergence of constant stepsize recursive stochastic algorithms (RSAs). RSAs use randomization to efficiently compute expectations, and so their iterates form a stochastic process. The key idea of our analysis is to lift the RSA into an appropriate higher-dimensional space and then express it as an equivalent Markov chain. Instead of determining the convergence of this Markov chain (which may not converge under constant stepsize), we study the convergence of the distribution of this Markov chain. To study this, we define a new notion of Wasserstein divergence. We show that if the distribution of the iterates in the Markov chain satisfies a contraction property with respect to the Wasserstein divergence, then the Markov chain admits an invariant distribution. We show that convergence of a large family of constant stepsize RSAs can be understood using this framework, and we provide several detailed examples.
Downloadable Article
Publication:SIAM Journal on Mathematics of Data Science, 3, 2021, 1141


Authors:Ningning Du; Yankui Liu; Ying Liu
Summary:Since the optimal portfolio strategy depends heavily on the distribution of uncertain returns, this article proposes a new method for the portfolio optimization problem with respect to distribution uncertainty. When the distributional information of the uncertain return rate is only observable through a finite sample dataset, we model the portfolio selection problem with a robust optimization method from the data-driven perspective. We first develop an ambiguous mean-CVaR portfolio optimization model, where the ambiguous distribution set employed in the distributionally robust model is a Wasserstein ball centered at the empirical distribution. In addition, the computationally tractable equivalent model of the worst-case expectation under the uncertainty set of a cone is derived, and some theoretical conclusions for the box, budget and ellipsoid uncertainty sets are obtained. Finally, to demonstrate the effectiveness of our mean-CVaR portfolio optimization method, two practical examples concerning the Chinese stock market and the United States stock market are considered. Furthermore, some numerical experiments are carried out under different uncertainty sets. The proposed data-driven distributionally robust portfolio optimization method offers some advantages over the ambiguity-free stochastic optimization method. The numerical experiments illustrate that the new method is effective.
Article, 2021
Publication:IEEE Access, 9, 2021, 3174
Publisher:2021

2021

Statistical Learning in Wasserstein Space
Authors:Amirhossein Karimi; Luigia Ripani; Tryphon T. Georgiou
Summary:We seek a generalization of regression and principal component analysis (PCA) in a metric space where data points are distributions metrized by the Wasserstein metric. We recast these analyses as multimarginal optimal transport problems. The particular formulation allows efficient computation, ensures existence of optimal solutions, and admits a probabilistic interpretation over the space of paths (line segments). Application of the theory to the interpolation of empirical distributions, images, power spectra, as well as assessing uncertainty in experimental designs, is envisioned.
Article, 2021
Publication:IEEE Control Systems Letters, 5, 202107, 899
Publisher:2021


A Regularized Wasserstein Framework for Graph Kernels
Authors:Asiri Wijesinghe; Qing Wang; Stephen Gould
2021 IEEE International Conference on Data Mining (ICDM)
Summary:We propose a learning framework for graph kernels, which is theoretically grounded on regularizing optimal transport. This framework provides a novel optimal transport distance metric, namely the Regularized Wasserstein (RW) discrepancy, which can preserve both features and structure of graphs via Wasserstein distances on features and their local variations, local barycenters and global connectivity. Two strongly convex regularization terms are introduced to improve the learning ability. One is to relax an optimal alignment between graphs to be a cluster-to-cluster mapping between their locally connected vertices, thereby preserving the local clustering structure of graphs. The other is to take into account node degree distributions in order to better preserve the global structure of graphs. We also design an efficient algorithm to enable a fast approximation for solving the optimization problem. Theoretically, our framework is robust and can guarantee the convergence and numerical stability in optimization. We have empirically validated our method using 12 datasets against 16 state-of-the-art baselines. The experimental results show that our method consistently outperforms all state-of-the-art methods on all benchmark databases for both graphs with discrete attributes and graphs with continuous attributes.
Chapter, 2021
Publication:2021 IEEE International Conference on Data Mining (ICDM), 202112, 739
Publisher:2021

Peer-reviewed
The back-and-forth method for Wasserstein gradient flows
Authors:Matt Jacobs; Wonjun Lee; Flavien Léger
Summary:We present a method to efficiently compute Wasserstein gradient flows. Our approach is based on a generalization of the back-and-forth method (BFM) introduced in Jacobs and Léger [Numer. Math. 146 (2020), 513–544] to solve optimal transport problems. We evolve the gradient flow by solving the dual problem to the JKO scheme. In general, the dual problem is much better behaved than the primal problem. This allows us to efficiently run large scale gradient flow simulations for a large class of internal energies including singular and non-convex energies.
Article, 2021
Publication:ESAIM: Control, Optimisation and Calculus of Variations, 27, 2021
Publisher:2021


Time-Series Imputation with Wasserstein Interpolation for Optimal Look-Ahead-Bias and Variance Tradeoff
Authors:Blanchet, Jose; Hernandez, Fernando; Nguyen, Viet Anh; Pelger, Markus; Zhang, Xuhui
Summary:Missing time-series data is a prevalent practical problem. Imputation methods in time-series data often are applied to the full panel data with the purpose of training a model for a downstream out-of-sample task. For example, in finance, imputation of missing returns may be applied prior to training a portfolio optimization model. Unfortunately, this practice may result in a look-ahead-bias in the future performance on the downstream task. There is an inherent trade-off between the look-ahead-bias of using the full data set for imputation and the larger variance in the imputation from using only the training data. By connecting layers of information revealed in time, we propose a Bayesian posterior consensus distribution which optimally controls the variance and look-ahead-bias trade-off in the imputation. We demonstrate the benefit of our methodology both in synthetic and real financial data.
Downloadable Archival Material, 2021-02-25
Publisher:2021-02-25



Global Sensitivity Analysis and Wasserstein Spaces
Authors:Jean-Claude Fort; Thierry Klein; Agnès Lagnoux
Summary:Sensitivity indices are commonly used to quantify the relative influence of any specific group of input variables on the output of a computer code. In this paper, we focus both on computer codes the output of which is a cumulative distribution function, and on stochastic computer codes. We propose a way to perform a global sensitivity analysis for these kinds of computer codes. In the first setting, we define two indices: the first one is based on Wasserstein Fréchet means, while the second one is based on the Hoeffding decomposition of the indicators of Wasserstein balls. Further, when dealing with the stochastic computer codes, we define an "ideal version" of the stochastic computer code that fits into the framework of the first setting. Finally, we deduce a procedure to realize a second level global sensitivity analysis, namely, when one is interested in the sensitivity related to the input distributions rather than in the sensitivity related to the inputs themselves. Several numerical studies are proposed as illustrations of the different settings.
Downloadable Article
Publication:SIAM/ASA Journal on Uncertainty Quantification, 9, 2021, 880

<——2021———2021——2840—



Stability of Gibbs Posteriors from the Wasserstein Loss for Bayesian Full Waveform Inversion
Authors:Matthew M. Dunlop; Yunan Yang
Summary:Recently, the Wasserstein loss function has been proven to be effective when applied to deterministic full-waveform inversion (FWI) problems. We consider the application of this loss function in Bayesian FWI so that the uncertainty can be captured in the solution. Other loss functions that are commonly used in practice are also considered for comparison. Existence and stability of the resulting Gibbs posteriors are shown on function space under weak assumptions on the prior and model. In particular, the distribution arising from the Wasserstein loss is shown to be quite stable with respect to high-frequency noise in the data. We then illustrate the difference between the resulting distributions numerically, using Laplace approximations to estimate the unknown velocity field and uncertainty associated with the estimates.
Downloadable Article
Publication:SIAM/ASA Journal on Uncertainty Quantification, 9, 2021, 1499

 

 

Bounding Wasserstein distance with couplings
Authors:Biswas, Niloy; Mackey, Lester
Summary:Markov chain Monte Carlo (MCMC) provides asymptotically consistent estimates of intractable posterior expectations as the number of iterations tends to infinity. However, in large data applications, MCMC can be computationally expensive per iteration. This has catalyzed interest in sampling methods such as approximate MCMC, which trade off asymptotic consistency for improved computational speed. In this article, we propose estimators based on couplings of Markov chains to assess the quality of such asymptotically biased sampling methods. The estimators give empirical upper bounds on the Wasserstein distance between the limiting distribution of the asymptotically biased sampling method and the original target distribution of interest. We establish theoretical guarantees for our upper bounds and show that our estimators can remain effective in high dimensions. We apply our quality measures to stochastic gradient MCMC, variational Bayes, and Laplace approximations for tall data and to approximate MCMC for Bayesian logistic regression in 4500 dimensions and Bayesian linear regression in 50000 dimensions.
Downloadable Archival Material, 2021-12-06
Publisher:2021-12-06

On Label Shift in Domain Adaptation via Wasserstein Distance
Authors:Le, Trung; Do, Dat; Nguyen, Tuan; Nguyen, Huy; Bui, Hung; Ho, Nhat; Phung, Dinh
Summary:We study the label shift problem between the source and target domains in general domain adaptation (DA) settings. We consider transformations transporting the target to source domains, which enable us to align the source and target examples. Through those transformations, we define the label shift between two domains via optimal transport and develop theory to investigate the properties of DA under various DA settings (e.g., closed-set, partial-set, open-set, and universal settings). Inspired from the developed theory, we propose Label and Data Shift Reduction via Optimal Transport (LDROT) which can mitigate the data and label shifts simultaneously. Finally, we conduct comprehensive experiments to verify our theoretical findings and compare LDROT with state-of-the-art baselines.
Downloadable Archival Material, 2021-10-28
Publisher:2021-10-28

Peer-reviewed
Graph Classification Method Based on Wasserstein Distance
Authors:Wei Wu; Guangmin Hu; Fucai Yu
Summary:Graph classification is a challenging problem which attracts more and more attention. The key to solving this problem is deciding on what metric to compare graphs, that is, how to define graph similarity. Common graph classification methods include graph kernels, graph edit distance, graph embedding and so on. We introduce a new graph similarity metric, namely GRD (Geometric gRaph Distance). Our model GRD is composed of three sub-modules, which capture the differences between the graph structures from different aspects. Finally, the graph distances defined by the three modules are fused to define the similarity between graphs. Experiments show that GRD is superior to the baseline methods on the benchmark datasets.
Article, 2021
Publication:Journal of Physics: Conference Series, 1952, 202106
Publisher:2021
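A minimal example of Wasserstein-based graph comparison in the spirit of this entry (explicitly not the authors' GRD model): compare two graphs by the 1-D Wasserstein distance between their degree distributions, assuming networkx and SciPy are available.

import networkx as nx
from scipy.stats import wasserstein_distance

def degree_wasserstein(g1, g2):
    # Wasserstein-1 distance between the degree distributions of two graphs;
    # a crude but cheap structural (dis)similarity measure.
    d1 = [deg for _, deg in g1.degree()]
    d2 = [deg for _, deg in g2.degree()]
    return wasserstein_distance(d1, d2)

print(degree_wasserstein(nx.cycle_graph(30), nx.star_graph(29)))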


Peer-reviewed
Wasserstein autoencoders for collaborative filtering
Authors:Xiaofeng Zhang; Jingbin Zhong; Kai Liu

Downloadable Article, 2021
Publication:Neural Computing and Applications, 33, 202104, 2793
Publisher:2021
 

2021


Intrinsic Dimension Estimation Using Wasserstein Distances
Authors:Block, Adam; Jia, Zeyu; Polyanskiy, Yury; Rakhlin, Alexander
Summary:It has long been thought that high-dimensional data encountered in many practical machine learning tasks have low-dimensional structure, i.e., the manifold hypothesis holds. A natural question, thus, is to estimate the intrinsic dimension of a given population distribution from a finite sample. We introduce a new estimator of the intrinsic dimension and provide finite sample, non-asymptotic guarantees. We then apply our techniques to get new sample complexity bounds for Generative Adversarial Networks (GANs) depending only on the intrinsic dimension of the data.
Downloadable Archival Material, 2021-06-07
Publisher:2021-06-07



Peer-reviewed
A Wasserstein inequality and minimal Green energy on compact manifolds
Author:Stefan Steinerberger
Summary:Let M be a smooth, compact d-dimensional manifold, d ≥ 3, without boundary and let $G: M \times M \to \mathbb{R} \cup \{\infty\}$ denote the Green's function of the Laplacian −Δ (normalized to have mean value 0). We prove a bound on the cost of transporting Dirac measures in $\{x_1,\dots,x_n\} \subset M$ to the normalized volume measure dx in terms of the Green's function of the Laplacian: $W_2\big(\frac{1}{n}\sum_{k=1}^{n}\delta_{x_k},\,dx\big) \lesssim_M \frac{1}{n^{1/d}} + \frac{1}{n}\big|\sum_{k \neq \ell} G(x_k,x_\ell)\big|^{1/2}$. We obtain the same result for the Coulomb kernel $G(x,y) = 1/\|x-y\|^{d-2}$ on the sphere $\mathbb{S}^d$, for d ≥ 3, where we show that $W_2\big(\frac{1}{n}\sum_{k=1}^{n}\delta_{x_k},\,dx\big) \lesssim \frac{1}{n^{1/d}} + \frac{1}{n}\big|\sum_{k \neq \ell}\big(\frac{1}{\|x_k-x_\ell\|^{d-2}} - c_d\big)\big|^{1/2}$, where $c_d$ is the constant that normalizes the Coulomb kernel to have mean value 0. We use this to show that minimizers of the discrete Green energy on compact manifolds have the optimal rate of convergence $W_2\big(\frac{1}{n}\sum_{k=1}^{n}\delta_{x_k}, dx\big) \lesssim n^{-1/d}$. The second inequality implies the same result for minimizers of the Coulomb energy on $\mathbb{S}^d$, which was recently proven by Marzo & Mas.
Article, 2021
Publication:Journal of Functional Analysis, 281, 20210901
Publisher:2021

 
Peer-reviewed
The Wasserstein-Fourier Distance for Stationary Time Series
Authors:Elsa Cazelles; Arnaud Robert; Felipe Tobar
Summary:We propose the Wasserstein-Fourier (WF) distance to measure the (dis)similarity between time series by quantifying the displacement of their energy across frequencies. The WF distance operates by calculating the Wasserstein distance between the (normalised) power spectral densities (NPSD) of time series. Although this rationale has been considered in the past, we fill a gap in the open literature by providing a formal introduction of this distance, together with its main properties from the joint perspective of Fourier analysis and optimal transport. As the main aim of this work is to validate WF as a general-purpose metric for time series, we illustrate its applicability in three broad contexts. First, we rely on WF to implement a PCA-like dimensionality reduction for NPSDs which allows for meaningful visualisation and pattern recognition applications. Second, we show that the geometry induced by WF on the space of NPSDs admits a geodesic interpolant between time series, thus enabling data augmentation on the spectral domain by averaging the dynamic content of two signals. Third, we implement WF for time series classification using parametric/non-parametric classifiers and compare it to other classical metrics. Supported by theoretical results, as well as synthetic illustrations and experiments on real-world data, this work establishes WF as a meaningful and capable resource pertinent to general distance-based applications of time series.
Article, 2021
Publication:IEEE Transactions on Signal Processing, 69, 2021, 709
Publisher:2021
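
As a rough illustration of the construction described in this entry, the sketch below compares two signals through a Wasserstein distance between their normalised power spectral densities. This is a minimal sketch and not the authors' code: it assumes numpy/scipy, estimates the PSDs with Welch periodograms, and uses the order-1 distance provided by scipy.stats.wasserstein_distance, whereas the paper works with the order-2 distance.

```python
# Minimal Wasserstein-Fourier style comparison of two signals (illustrative only).
import numpy as np
from scipy.signal import welch
from scipy.stats import wasserstein_distance

def wf_distance(x, y, fs=1.0):
    """Wasserstein distance between the normalised PSDs of signals x and y."""
    fx, px = welch(x, fs=fs)                 # frequencies and PSD estimate of x
    fy, py = welch(y, fs=fs)                 # frequencies and PSD estimate of y
    px, py = px / px.sum(), py / py.sum()    # normalise PSDs into probability vectors
    # 1-D Wasserstein distance between distributions supported on the frequency axis
    return wasserstein_distance(fx, fy, u_weights=px, v_weights=py)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(0, 10, 0.01)
    x = np.sin(2 * np.pi * 5 * t) + 0.1 * rng.standard_normal(t.size)   # 5 Hz tone
    y = np.sin(2 * np.pi * 8 * t) + 0.1 * rng.standard_normal(t.size)   # 8 Hz tone
    print(wf_distance(x, y, fs=100.0))  # grows as the spectral content moves apart
```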

Consistency of Distributionally Robust Risk- and Chance-Constrained Optimization under Wasserstein Ambiguity Sets
Authors: Ashish Cherukuri; Ashish R. Hota. 2021 American Control Conference (ACC)
Summary:We study stochastic optimization problems with chance and risk constraints, where in the latter, risk is quantified in terms of the conditional value-at-risk (CVaR). We consider the distributionally robust versions of these problems, where the constraints are required to hold for a family of distributions constructed from the observed realizations of the uncertainty via the Wasserstein distance. Our main results establish that if the samples are drawn independently from an underlying distribution and the problems satisfy suitable technical assumptions, then the optimal value and optimizers of the distributionally robust versions of these problems converge to the respective quantities of the original problems, as the sample size increasesShow more
Chapter, 2021
Publication:2021 American Control Conference (ACC), 20210525, 3818
Publisher:2021

2021 see 2020

Consistency of Distributionally Robust Risk- and Chance-Constrained Optimization under Wasserstein Ambiguity Sets

By: Cherukuri, Ashish; Hota, Ashish R.

Conference: American Control Conference (ACC) Location: ‏ ELECTR NETWORK Date: ‏ MAY 25-28, 2021

Sponsor(s): ‏Amer Automat Control Council; Mitsubishi Elect Res Lab; Halliburton; MathWorks; Wiley; GE Global Res; Soc Ind Appl Math; dSPACE; Tangibles That Teach; Elsevier; GM

2021 AMERICAN CONTROL CONFERENCE (ACC)  Book Series: ‏ Proceedings of the American Control Conference   Pages: ‏ 3818-3823   Published: ‏ 2021


Peer-reviewed
Mullins-Sekerka as the Wasserstein flow of the perimeter
Authors: Antonin Chambolle; Tim Laux
Summary:We prove the convergence of an implicit time discretization for the one-phase Mullins-Sekerka equation, possibly with additional non-local repulsion, proposed in [F. Otto, Arch. Rational Mech.Anal. 141 (1998), pp. 63-103]. Our simple argument shows that the limit satisfies the equation in a distributional sense as well as an optimal energy-dissipation relation. The proof combines arguments from optimal transport, gradient flows & minimizing movements, and basic geometric measure theoryShow more
Downloadable Article, 2021
Publication:Proceedings of the American Mathematical Society, 149, July 1, 2021, 2943
Publisher:2021

<——2021———2021——2850- 



Wasserstein Contraction Bounds on Closed Convex Domains with Applications to Stochastic Adaptive Control
Authors: Tyler Lekang; Andrew Lamperski. 2021 60th IEEE Conference on Decision and Control (CDC)
Summary:This paper is motivated by the problem of quantitatively bounding the convergence of adaptive control methods for stochastic systems to a stationary distribution. Such bounds are useful for analyzing statistics of trajectories and determining appropriate step sizes for simulations. To this end, we extend a methodology from (unconstrained) stochastic differential equations (SDEs) which provides contractions in a specially chosen Wasserstein distance. This theory focuses on unconstrained SDEs with fairly restrictive assumptions on the drift terms. Typical adaptive control schemes place constraints on the learned parameters and their update rules violate the drift conditions. To this end, we extend the contraction theory to the case of constrained systems represented by reflected stochastic differential equations and generalize the allowable drifts. We show how the general theory can be used to derive quantitative contraction bounds on a nonlinear stochastic adaptive regulation problemShow more
Chapter, 2021
Publication:2021 60th IEEE Conference on Decision and Control (CDC), 20211214, 366
Publisher:2021


Optimization of the Diffusion Time in Graph Diffused-Wasserstein Distances: Application to Domain Adaptation
Authors: Amelie Barbe; Paulo Goncalves; Marc Sebban; Pierre Borgnat; Remi Gribonval; Titouan Vayer. 2021 IEEE 33rd International Conference on Tools with Artificial Intelligence (ICTAI)
Summary:The use of the heat kernel on graphs has recently given rise to a family of so-called Diffusion-Wasserstein distances which resort to Optimal Transport theory for comparing attributed graphs. In this paper, we address the open problem of optimizing the diffusion time used in these distances. Inspired from the notion of triplet-based constraints, we design a loss function that aims at bringing two graphs closer together while keeping an impostor away. After a thorough analysis of the properties of this function, we show on synthetic data that the resulting Diffusion-Wasserstein distances outperforms the Gromov and Fused-Gromov Wasserstein distances on unsupervised graph domain adaptation tasksShow more
Chapter, 2021
Publication:2021 IEEE 33rd International Conference on Tools with Artificial Intelligence (ICTAI), 202111, 786
Publisher:2021



Peer-reviewed
Peacock geodesics in Wasserstein space
Authors: Hongguang Wu; Xiaojun Cui
Summary:Martingale optimal transport has attracted much attention due to its application in pricing and hedging in mathematical finance. The essential notion which makes martingale optimal transport different from optimal transport is peacock. A peacock is a sequence of measures which satisfies convex order property. In this paper we study peacock geodesics in Wasserstain space, and we hope this paper can provide some geometrical points of view to look at martingale optimal transportShow more
Article
Publication:Differential Geometry and its Applications, 77, August 2021
Application of Wasserstein Attraction Flows for Optimal Transport in Network Systems
Authors: Ferran Arque; Cesar A. Uribe; Carlos Ocampo-Martinez. 2021 60th IEEE Conference on Decision and Control (CDC)
Summary:This paper presents a Wasserstein attraction approach for solving dynamic mass transport problems over networks. In the transport problem over networks, we start with a distribution over the set of nodes that needs to be "transported" to a target distribution accounting for the network topology. We exploit the specific structure of the problem, characterized by the computation of implicit gradient steps, and formulate an approach based on discretized flows. As a result, our proposed algorithm relies on the iterative computation of constrained Wasserstein barycenters. We show how the proposed method finds approximate solutions to the network transport problem, taking into account the topology of the network, the capacity of the communication channels, and the capacity of the individual nodesShow more
Chapter, 2021
Publication:2021 60th IEEE Conference on Decision and Control (CDC), 20211214, 4058
Publisher:2021


Domain Adaptive Rolling Bearing Fault Diagnosis based on Wasserstein Distance
Authors: Chunliu Yang; Xiaodong Wang; Jun Bao; Zhuorui Li. 2021 33rd Chinese Control and Decision Conference (CCDC)
Summary:The rolling bearing usually runs at different speeds and loads, which leads to a corresponding change in the distribution of data. The cross-domain problem caused by different data distributions can degrade the performance of deep learning-based fault diagnosis models. To address this problem, this paper proposes a multilayer domain adaptive method based on Wasserstein distance for fault diagnosis of rolling bearings operating under different operating conditions. First, the basic framework uses deep Convolutional Neural Networks (CNN) to extract domain invariant features and Then an adaptation learning procedure is used for optimizing the basic CNN to adapt cross different domains. According to the experimental results, the network model has excellent fault diagnosis capability and adaptive domain capability and is able to obtain a high fault diagnosis rate under different working conditions. Finally, for investigating the adaptability in this method, we use t-SNE to reduce the high dimension feature for better visualizationShow more
Chapter, 2021
Publication:2021 33rd Chinese Control and Decision Conference (CCDC), 20210522, 77
Publisher:2021
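
The domain-adaptation idea summarised above (and in several fault-diagnosis entries elsewhere in this section) amounts to penalising the discrepancy between source-domain and target-domain feature distributions. The sketch below is a simplified, hedged illustration of such a penalty, averaging per-dimension 1-D Wasserstein distances over feature activations; it is not the multilayer loss or network of the cited paper.

```python
# Illustrative domain-discrepancy penalty: mean 1-D Wasserstein distance between the
# per-dimension distributions of source and target feature activations.
import numpy as np
from scipy.stats import wasserstein_distance

def wasserstein_domain_penalty(feat_src, feat_tgt):
    """feat_src, feat_tgt: arrays of shape (n_samples, n_features)."""
    n_features = feat_src.shape[1]
    return float(np.mean([
        wasserstein_distance(feat_src[:, j], feat_tgt[:, j]) for j in range(n_features)
    ]))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    src = rng.normal(0.0, 1.0, size=(256, 32))   # features under one operating condition
    tgt = rng.normal(0.5, 1.3, size=(256, 32))   # shifted features under another condition
    print(wasserstein_domain_penalty(src, tgt))  # would be added to the task loss in training
```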
 
Learning to Simulate Sequentially Generated Data via Neural Networks and Wasserstein Training
Authors: Tingyu Zhu; Zeyu Zheng. 2021 Winter Simulation Conference (WSC)
Summary:We propose a new framework of a neural network-assisted sequential structured simulator to model, estimate, and simulate a wide class of sequentially generated data. Neural networks are integrated into the sequentially structured simulators in order to capture potential nonlinear and complicated sequential structures. Given representative real data, the neural network parameters in the simulator are estimated through a Wasserstein training process, without restrictive distributional assumptions. Moreover, the simulator can flexibly incorporate various kinds of elementary randomness and generate distributions with certain properties such as heavy-tail. Regarding statistical properties, we provide results on consistency and convergence rate for estimation of the simulator. We then present numerical experiments with synthetic and real data sets to illustrate the performance of our estimation methodShow more
Chapter, 2021
Publication:2021 Winter Simulation Conference (WSC), 20211212, 1
Publisher:2021
Cited by 1


2021

Wasserstein-Splitting Gaussian Process Regression for Heterogeneous Online Bayesian Inference
Authors: Michael E. Kepler; Alec Koppel; Amrit Singh Bedi; Daniel J. Stilwell. 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Summary:Gaussian processes (GPs) are a well-known nonparametric Bayesian inference technique, but they suffer from scalability problems for large sample sizes, and their performance can degrade for non-stationary or spatially heterogeneous data. In this work, we seek to overcome these issues through (i) employing variational free energy approximations of GPs operating in tandem with online expectation propagation steps; and (ii) introducing a local splitting step which instantiates a new GP whenever the posterior distribution changes significantly as quantified by the Wasserstein metric over posterior distributions. Over time, then, this yields an ensemble of sparse GPs which may be updated incrementally, and adapts to locality, heterogeneity, and non-stationarity in training data. We provide a 1-dimensional example to illustrate the motivation behind our approach, and compare the performance of our approach to other Gaussian process methods across various data sets, which often achieves competitive, if not superior predictive performance, relative to other locality-based GP regression methods in which hyperparameters are learned in an online mannerShow more
Chapter, 2021
Publication:2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 20210927, 9833
Publisher:2021


Wasserstein Divergence GAN With Cross-Age Identity Expert and Attribute Retainer for Facial Age Transformation
Authors: Gee-Sern Hsu; Rui-Cang Xie; Zhi-Ting Chen
Summary:We propose the Wasserstein Divergence GAN with an identity expert and an attribute retainer for facial age transformation. The Wasserstein Divergence GAN (WGAN-div) can better stabilize the training and lead to better target image generation. The identity expert aims to preserve the input identity at output, and the attribute retainer aims to preserve the input attribute at output. Unlike the previous works which take a specific model for identity and attribute preservation without giving a reason, both the identity expert and the attribute retainer in our proposed model are determined from a comprehensive comparison study on the state-of-the-art pretrained models. The candidate networks considered for identity preservation include the VGG-Face, VGG-Face2, LightCNN and ArcFace. The candidate backbones for making the attribute retainer are the VGG-Face, VGG-Object and DEX networks. This study offers a guidebook for choosing the appropriate modules for identity and attribute preservation. The interactions between the identity expert and attribute retainer are also extensively studied and experimented. To further enhance the performance, we augment the data by the 3DMM and explore the advantages of the additional training on cross-age datasets. The additional cross-age training is validated to make the identity expert capable of handling cross-age face recognition. The performance of our approach is justified by the desired age transformation with identity well preserved. Experiments on benchmark databases show that the proposed approach is highly competitive to state-of-the-art methodsShow more
Article, 2021
Publication:IEEE Access, 9, 2021, 39695
Publisher:2021
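
Several GAN entries in this section (the WGAN-div work above and the WGAN-based denoising and imaging entries elsewhere) rely on a Wasserstein critic to stabilise training. As a hedged point of reference, the sketch below shows the widely used gradient-penalty form of the Wasserstein critic loss (WGAN-GP); the WGAN-div objective of the cited paper replaces this penalty with a Wasserstein-divergence term, which is not reproduced here. The names critic, real and fake are generic placeholders.

```python
# Standard WGAN-GP critic loss (illustrative; not the WGAN-div loss of the cited paper).
import torch

def critic_loss_wgan_gp(critic, real, fake, gp_weight=10.0):
    """Wasserstein critic loss with a gradient penalty on interpolated samples (NCHW images assumed)."""
    batch = real.size(0)
    eps = torch.rand(batch, 1, 1, 1, device=real.device)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    grad = torch.autograd.grad(
        outputs=critic(interp).sum(), inputs=interp, create_graph=True
    )[0]
    penalty = ((grad.flatten(1).norm(2, dim=1) - 1.0) ** 2).mean()
    # The critic maximises E[D(real)] - E[D(fake)]; we minimise the negative plus the penalty.
    return critic(fake).mean() - critic(real).mean() + gp_weight * penalty
```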



Peer-reviewed
Wasserstein Distributionally Robust Stochastic Control: A Data-Driven Approach
Author:Insoon Yang
Summary:Standard stochastic control methods assume that the probability distribution of uncertain variables is available. Unfortunately, in practice, obtaining accurate distribution information is a challenging task. To resolve this issue, in this article we investigate the problem of designing a control policy that is robust against errors in the empirical distribution obtained from data. This problem can be formulated as a two-player zero-sum dynamic game problem, where the action space of the adversarial player is a Wasserstein ball centered at the empirical distribution. A dynamic programming solution is provided exploiting the reformulation techniques for Wasserstein distributionally robust optimization. We show that the contraction property of associated Bellman operators extends a single-stage out-of-sample performance guarantee , obtained using a measure concentration inequality, to the corresponding multistage guarantee without any degradation in the confidence level. Furthermore, we characterize an explicit form of the optimal control policy and the worst-case distribution policy for linear-quadratic problems with Wasserstein penaltyShow more
Article, 2021
Publication:IEEE Transactions on Automatic Control, 66, 202108, 3863
Publisher:2021
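
For readers unfamiliar with the construction, the "Wasserstein ball centered at the empirical distribution" mentioned in this and the neighbouring distributionally robust entries is usually written as below; the notation (radius epsilon, empirical measure built from N samples) is standard rather than taken from the paper.

```latex
% Wasserstein ambiguity set of radius \varepsilon around the empirical distribution
\mathcal{P}_\varepsilon = \Bigl\{ \mathbb{Q} \in \mathcal{P}(\Xi) :
    W_p\bigl(\mathbb{Q}, \widehat{\mathbb{P}}_N\bigr) \le \varepsilon \Bigr\},
\qquad
\widehat{\mathbb{P}}_N = \frac{1}{N} \sum_{i=1}^{N} \delta_{\hat{\xi}_i},
% and the distributionally robust control problem then takes the generic form
\min_{\pi} \; \max_{\mathbb{Q} \in \mathcal{P}_\varepsilon} \;
    \mathbb{E}_{\mathbb{Q}}\bigl[\, \mathrm{cost}(\pi, \xi) \,\bigr].
```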


Peer-reviewed
Data-driven distributionally robust chance-constrained optimization with Wasserstein metric
Authors: Ran Ji; Miguel A. Lejeune
Downloadable Article, 2021
Publication:Journal of Global Optimization : An International Journal Dealing with Theoretical and Computational Aspects of Seeking Global Optima and Their Applications in Science, Management and Engineering, 79, 202104, 779
Publisher:2021

High-Confidence Attack Detection via Wasserstein-Metric Computations

By: Li, Dan; Martinez, Sonia

IEEE CONTROL SYSTEMS LETTERS  Volume: ‏ 5   Issue: ‏ 2   Pages: ‏ 379-384   Published: ‏ APR 2021
High-Confidence Attack Detection via Wasserstein-Metric Computations
Authors: Dan Li; Sonia Martinez
Summary:This letter considers a sensor attack and fault detection problem for linear cyber-physical systems, which are subject to system noise that can obey an unknown light-tailed distribution. We propose a new threshold-based detection mechanism that employs the Wasserstein metric, and which guarantees system performance with high confidence with a finite number of measurements. The proposed detector may generate false alarms with a rate \Delta in normal operation, where \Delta can be tuned to be arbitrarily small by means of a benchmark distribution . Thus, the proposed detector is sensitive to sensor attacks and faults which have a statistical behavior that is different from that of the system noise. We quantify the impact of stealthy attacks on open-loop stable systems—which perturb the system operation while producing false alarms consistent with the natural system noise—via a probabilistic reachable set. Tractable implementation is enabled via a linear optimization to compute the detection measure and a semidefinite program to bound the reachable setShow more
Article, 2021
Publication:IEEE Control Systems Letters, 5, 202104, 379
Publisher:2021

<——2021———2021——2860—




Probability Distribution Control of Finite-State Markov Chains with Wasserstein Costs and Application to Operation of Car-Sharing Services
Authors: Kenta Hoshino; Kazunori Sakurama. 2021 60th IEEE Conference on Decision and Control (CDC)
Summary:This study investigates an optimal control problem of discrete-time finite-state Markov chains with application in the operation of car-sharing services. The optimal control of probability distributions is the object of focus to ensure that the controlled distributions are as close as possible to the given ones. The problem is formulated using Wasserstein distances, which allows us to measure the difference among probability distributions and is suitable for the objective of this study. For the control problem, we provide a necessary condition for optimality in the control inputs and develop an algorithm for obtaining optimal control inputs. The developed algorithm is then applied to the probability distribution control of a one-way car-sharing service, which provides a rebalancing strategy to resolve car unevennessShow more
Chapter, 2021
Publication:2021 60th IEEE Conference on Decision and Control (CDC), 20211214, 6634
Publisher:2021
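
On a finite state space, the Wasserstein distance appearing in this formulation is simply the optimal value of a small linear program (discrete optimal transport). The sketch below solves it with scipy's LP solver; the three-state example, cost matrix and distributions are made up for illustration and are not data from the paper.

```python
# Discrete Wasserstein distance on a finite state space, as a linear program:
# minimise <C, P> subject to P @ 1 = mu, P.T @ 1 = nu, P >= 0.
import numpy as np
from scipy.optimize import linprog

def discrete_wasserstein(mu, nu, C):
    n, m = C.shape
    A_eq = np.zeros((n + m, n * m))
    for i in range(n):
        A_eq[i, i * m:(i + 1) * m] = 1.0          # sum_j P[i, j] = mu[i]
    for j in range(m):
        A_eq[n + j, j::m] = 1.0                   # sum_i P[i, j] = nu[j]
    b_eq = np.concatenate([mu, nu])
    res = linprog(C.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
    return res.fun

if __name__ == "__main__":
    mu = np.array([0.5, 0.3, 0.2])     # current distribution over three stations
    nu = np.array([0.2, 0.3, 0.5])     # target distribution
    C = np.abs(np.subtract.outer(np.arange(3), np.arange(3))).astype(float)  # |i - j| cost
    print(discrete_wasserstein(mu, nu, C))
```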

Wasserstein Coupled Graph Learning for Cross-Modal Retrieval
Authors: Yun Wang; Tong Zhang; Xueya Zhang; Zhen Cui; Yuge Huang; Pengcheng Shen; Shaoxin Li; Jian Yang. 2021 IEEE/CVF International Conference on Computer Vision (ICCV)
Summary:Graphs play an important role in cross-modal image-text understanding as they characterize the intrinsic structure which is robust and crucial for the measurement of crossmodal similarity. In this work, we propose a Wasserstein Coupled Graph Learning (WCGL) method to deal with the cross-modal retrieval task. First, graphs are constructed according to two input cross-modal samples separately, and passed through the corresponding graph encoders to extract robust features. Then, a Wasserstein coupled dictionary, containing multiple pairs of counterpart graph keys with each key corresponding to one modality, is constructed for further feature learning. Based on this dictionary, the input graphs can be transformed into the dictionary space to facilitate the similarity measurement through a Wasserstein Graph Embedding (WGE) process. The WGE could capture the graph correlation between the input and each corresponding key through optimal transport, and hence well characterize the inter-graph structural relationship. To further achieve discriminant graph learning, we specifically define a Wasserstein discriminant loss on the coupled graph keys to make the intra-class (counterpart) keys more compact and inter-class (non-counterpart) keys more dispersed, which further promotes the final cross-modal retrieval task. Experimental results demonstrate the effectiveness and state-of-the-art performanceShow more
Chapter, 2021
Publication:2021 IEEE/CVF International Conference on Computer Vision (ICCV), 202110, 1793
Publisher:2021

Covariance Steering of Discrete-Time Stochastic Linear Systems Based on Wasserstein Distance Terminal Cost

By: Balci, Isin M.; Bakolas, Efstathios

IEEE CONTROL SYSTEMS LETTERS  Volume: ‏ 5   Issue: ‏ 6   Pages: ‏ 2000-2005   Published: ‏ DEC 2021
Covariance Steering of Discrete-Time Stochastic Linear Systems Based on Wasserstein Distance Terminal Cost
Authors: Isin M. Balci; Efstathios Bakolas
Summary:We consider a class of stochastic optimal control problems for discrete-time linear systems whose objective is the characterization of control policies that will steer the probability distribution of the terminal state of the system close to a desired Gaussian distribution. In our problem formulation, the closeness between the terminal state distribution and the desired (goal) distribution is measured in terms of the squared Wasserstein distance which is associated with a corresponding terminal cost term. We recast the stochastic optimal control problem as a finite-dimensional nonlinear program whose performance index can be expressed as the difference of two convex functions. This representation of the performance index allows us to find local minimizers of the original nonlinear program via the so-called convex-concave procedure [1] . Finally, we present non-trivial numerical simulations to demonstrate the efficacy of the proposed technique by comparing it with sequential quadratic programming methods in terms of computation timeShow more
Article, 2021
Publication:IEEE Control Systems Letters, 5, 202112, 2000
Publisher:2021
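
The squared Wasserstein distance between Gaussians used as the terminal cost in this entry has a well-known closed form, sketched below with scipy; the means and covariances are generic placeholders, not values from the paper.

```python
# Closed-form squared 2-Wasserstein (Bures-Wasserstein) distance between Gaussians:
# W2^2 = ||m1 - m2||^2 + tr(S1 + S2 - 2 * (S1^{1/2} S2 S1^{1/2})^{1/2})
import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2_squared(m1, S1, m2, S2):
    root = np.real(sqrtm(S1))                      # principal square root of S1
    cross = np.real(sqrtm(root @ S2 @ root))       # (S1^{1/2} S2 S1^{1/2})^{1/2}
    return float(np.sum((m1 - m2) ** 2) + np.trace(S1 + S2 - 2.0 * cross))

if __name__ == "__main__":
    m1, S1 = np.zeros(2), np.eye(2)
    m2, S2 = np.array([1.0, 0.0]), np.diag([2.0, 0.5])
    print(gaussian_w2_squared(m1, S1, m2, S2))
```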

Ripple-GAN: Lane Line Detection With Ripple Lane Line Detection Network and Wasserstein GAN
Authors: Youcheng Zhang; Zongqing Lu; Dongdong Ma; Jing-Hao Xue; Qingmin Liao
Summary:With artificial intelligence technology being advanced by leaps and bounds, intelligent driving has attracted a huge amount of attention recently in research and development. In intelligent driving, lane line detection is a fundamental but challenging task particularly under complex road conditions. In this paper, we propose a simple yet appealing network called Ripple Lane Line Detection Network (RiLLD-Net), to exploit quick connections and gradient maps for effective learning of lane line features. RiLLD-Net can handle most common scenes of lane line detection. Then, in order to address challenging scenarios such as occluded or complex lane lines, we propose a more powerful network called Ripple-GAN, by integrating RiLLD-Net, confrontation training of Wasserstein generative adversarial networks, and multi-target semantic segmentation. Experiments show that, especially for complex or obscured lane lines, Ripple-GAN can produce a superior detection performance to other state-of-the-art methodsShow more
Article, 2021
Publication:IEEE Transactions on Intelligent Transportation Systems, 22, 202103, 1532
Publisher:2021



Inverse Domain Adaptation for Remote Sensing Images Using Wasserstein Distance
Authors: Ziyao Li; Rui Wang; Man-On Pun; Zhiguo Wang; Huiliang Yu. IGARSS 2021 - 2021 IEEE International Geoscience and Remote Sensing Symposium
Summary:In this work, an inverse domain adaptation (IDA) method is proposed to cope with the distributional mismatch between the training images in the source domain and the test images in the target domain in remote sensing. More specifically, a cycleGAN structure using the Wasserstein distance is developed to learn the distribution of the remote sensing images in the source domain before the images in the target domain are transformed into similar distribution while preserving the image details and semantic consistency of the target images via style transfer. Extensive experiments using the GF1 data are performed to confirm the effectiveness of the proposed IDA methodShow more
Chapter, 2021
Publication:2021 IEEE International Geoscience and Remote Sensing Symposium IGARSS, 20210711, 2345
Publisher:2021

2021


Fair Graph Auto-Encoder for Unbiased Graph Representations with Wasserstein Distance
Authors: Wei Fan; Kunpeng Liu; Rui Xie; Hao Liu; Hui Xiong; Yanjie Fu. 2021 IEEE International Conference on Data Mining (ICDM)
Summary:The fairness issue is very important in deploying machine learning models as algorithms widely used in human society can be easily in discrimination. Researchers have studied disparity on tabular data a lot and proposed many methods to relieve bias. However, studies towards unfairness in graph are still at early stage while graph data that often represent connections among people in real-world applications can easily give rise to fairness issues and thus should be attached to great importance. Fair representation learning is one of the most effective methods to relieve bias, which aims to generate hidden representations of input data while obfuscating sensitive information. In graph setting, learning fair representations of graph (also called fair graph embeddings) is effective to solve graph unfairness problems. However, most existing works of fair graph embeddings only study fairness in a coarse granularity (i.e., group fairness), but overlook individual fairness. In this paper, we study fair graph representations from different levels. Specifically, we consider both group fairness and individual fairness on graph. To debias graph embeddings, we propose FairGAE, a fair graph auto-encoder model, to derive unbiased graph embeddings based on the tailor-designed fair Graph Convolution Network (GCN) layers. Then, to achieve multi-level fairness, we design a Wasserstein distance based regularizer to learn the optimal transport for fairer embeddings. To overcome the efficiency concern, we further bring up Sinkhorn divergence as the approximations of Wasserstein cost for computation. Finally, we apply the learned unbiased embeddings into the node classification task and conduct extensive experiments on two real-world graph datasets to demonstrate the improved performances of our approachShow more
Chapter, 2021
Publication:2021 IEEE International Conference on Data Mining (ICDM), 202112, 1054
Publisher:2021
 


Closed-form expressions for maximum mean discrepancy with applications to Wasserstein auto-encoders
Author:Raif M. Rustamov
Article, 2021
Publication:Stat, 10, 2021, N
Publisher:2021

Peer-reviewed
Author:Dabrowski D.
Sufficient Condition for Rectifiability Involving Wasserstein Distance W2
Article, 2021
Publication:Journal of Geometric Analysis, 2021
Publisher:2021

Fast Wasserstein-Distance-Based Distributionally Robust Chance-Constrained Power Dispatch for Multi-Zone HVAC Systems
Authors: Ge Chen; Hongcai Zhang; Hongxun Hui; Yonghua Song
Article, 2021
Publication:IEEE transactions on smart grid, 12, 2021, 4016
Publisher:2021
 

Peer-reviewed
Zero-sum differential games on the Wasserstein space
Authors: Tamer Başar; Jun Moon
Article, 2021
Publication:Communications in information and systems, 21, 2021, 219
Publisher:2021

<——2021———2021——2870—-


Peer-reviewed
CHICKN: extraction of peptide chromatographic elution profiles from large scale mass spectrometry data by means of Wasserstein compressive hierarchical cluster analysis
Show more
Authors: Olga Permiakova; Romain Guibert; Alexandra Kraut; Thomas Fortin; Anne-Marie Hesse; Thomas Burger
Summary:The clustering of data produced by liquid chromatography coupled to mass spectrometry analyses (LC-MS data) has recently gained interest to extract meaningful chemical or biological patterns. However, recent instrumental pipelines deliver data which size, dimensionality and expected number of clusters are too large to be processed by classical machine learning algorithms, so that most of the state-of-the-art relies on single pass linkage-based algorithms. We propose a clustering algorithm that solves the powerful but computationally demanding kernel k-means objective function in a scalable way. As a result, it can process LC-MS data in an acceptable time on a multicore machine. To do so, we combine three essential features: a compressive data representation, Nyström approximation and a hierarchical strategy. In addition, we propose new kernels based on optimal transport, which interprets as intuitive similarity measures between chromatographic elution profiles. Our method, referred to as CHICKN, is evaluated on proteomics data produced in our lab, as well as on benchmark data coming from the literature. From a computational viewpoint, it is particularly efficient on raw LC-MS data. From a data analysis viewpoint, it provides clusters which differ from those resulting from state-of-the-art methods, while achieving similar performances. This highlights the complementarity of differently principle algorithms to extract the best from complex LC-MS dataShow more
Downloadable Article, 2021
Publication:BMC Bioinformatics, 22, 20210212, 1
Publisher:2021


Peer-reviewed
Short-term railway passenger demand forecast using improved Wasserstein generative adversarial nets and web search terms
Show more
Authors: Fenling Feng; Jiaqi Zhang; Chengguang Liu; Wan Li; Qiwei Jiang

Article, 2021
Publication:IET Intelligent Transport Systems, 15, March 2021, 432
Publisher:2021

Peer-reviewed
Equidistribution of random walks on compact groups II. The Wasserstein metric
Author:B. Borda

Article, 2021
Publication:Bernoulli, 27, 2021, 2598
Publisher:2021

2021 see 2022
Distributionally Safe Path Planning: Wasserstein Safe RRT
Authors: Paul Lathrop; Beth Boardman; Sonia Martinez
Article, 2021
Publication:IEEE robotics and automation letters, 7, 2021, 430
Publisher:2021
Cited by 7
 
Peer-reviewed
Correction to: Necessary Optimality Conditions for Optimal Control Problems in Wasserstein Spaces
Authors: Benoît Bonnet; Hélène Frankowska
Downloadable Article, 2021
Publication:Applied Mathematics & Optimization, 20210914, 1
Publisher:2021

2021

Brain Extraction From Brain MRI Images Based on Wasserstein GAN and O-Net
Authors: Shaofeng Jiang; Lanting Guo; Guangbin Cheng; Xingyan Chen; Congxuan Zhang; Zhen Chen
Article, 2021
Publication:IEEE access, 9, 2021, 136762
Publisher:2021

Cited by 3

2021 see 2022
Wasserstein Loss With Alternative Reinforcement Learning for Severity-Aware Semantic Segmentation
Authors: Xiaofeng Liu; Yunhong Lu; Xiongchang Liu; Song Bai; Site Li; Jane You
Article, 2021
Publication:Intelligent transportation systems, IEEE transactions on, 23, 2021, 587
Publisher:2021

 
Peer-reviewed
Network Consensus in the Wasserstein Metric Space of Probability Measures
Authors: Adrian N. Bishop; Arnaud Doucet

Article, 2021
Publication:SIAM journal on control and optimization, 59, 2021, 3261
Publisher:2021


Mckean-vlasov sdes with drifts discontinuous under wasserstein distance
Authors: Huang X.; Wang F.-Y.
Article, 2021
Publication:Discrete and Continuous Dynamical Systems- Series A, 41, 2021 04 01, 1667
Publisher:2021


Peer-reviewed
Conditional Wasserstein generative adversarial networks applied to acoustic metamaterial design
Authors: Peter Lai; Feruza Amirkulova; Peter Gerstoft
Article, 2021
Publication:Journal of the Acoustical Society of America, 150, 2021, 4362
Publisher:2021

<——2021———2021——2880—


2021 see 2022
Peer-reviewed
Wasserstein Adversarial Regularization for Learning With Label Noise
Authors: Kilian Fatras; Bharath Bhushan Damodaran; Sylvain Lobry; Remi Flamary; Devis Tuia; Nicolas Courty

Article, 2021
Publication:IEEE transactions on pattern analysis and machine intelligence, 44, 2021, 7296
Publisher:2021

Illumination-Invariant Flotation Froth Color Measuring via Wasserstein Distance-Based CycleGAN with Structure-Preserving Constraint
Authors: Liu J.; He J.; Ma T.; Xie Y.; Gui W.
Article, 2021
Publication:IEEE Transactions on Cybernetics, 51, 2021 02 01, 839
Publisher:2021

Peer-reviewed
Pixel-Wise Wasserstein Autoencoder for Highly Generative Dehazing
Authors: Guisik Kim; Sung Woo Park; Junseok Kwon
Article, 2021
Publication:IEEE transactions on image processing, 30, 2021, 5452
Publisher:2021
Cited by 13

Peer-reviewed
Statistical inference for Bures—Wasserstein barycenters
Authors: Alexey Kroshnin; Vladimir Spokoiny; Alexandra Suvorikova
Article, 2021
Publication:Annals of applied probability, 31, 2021, 1264
Publisher:2021

Group Invited Talks - OATML

oatml.cs.ox.ac.uk › talks

Been is a staff research scientist at Google Brain. ... upper bounded by the Wasserstein distance, which then allows us to tap into existing efficient ...

OATML · OATML research group · 

Mar 1, 2021

2021+


On the geometry of Stein variational gradient descent and ...

www.youtube.com › watch

Optimal Transport and PDE: Gradient Flows in the Wasserstein Metric ... Reliable Deep Anomaly Detection - Jie Ren, Google AI Brain Team.

YouTube · UCL Centre for Artificial Intelligence · 

Mar 18, 2021



Peer-reviewed
Quantitative spectral gap estimate and Wasserstein contraction of simple slice sampling
Authors: Viacheslav Natarovskii; Daniel Rudolf; Björn Sprungk
Article, 2021
Publication:Annals of applied probability, 31, 2021, 806
Publisher:2021

Set Representation Learning with Generalized Sliced-Wasserstein Embeddings
Authors:Naderializadeh, Navid (Creator), Kolouri, Soheil (Creator), Comer, Joseph F. (Creator), Andrews, Reed W. (Creator), Hoffmann, Heiko (Creator)
Summary:An increasing number of machine learning tasks deal with learning representations from set-structured data. Solutions to these problems involve the composition of permutation-equivariant modules (e.g., self-attention, or individual processing via feed-forward neural networks) and permutation-invariant modules (e.g., global average pooling, or pooling by multi-head attention). In this paper, we propose a geometrically-interpretable framework for learning representations from set-structured data, which is rooted in the optimal mass transportation problem. In particular, we treat elements of a set as samples from a probability measure and propose an exact Euclidean embedding for Generalized Sliced Wasserstein (GSW) distances to learn from set-structured data effectively. We evaluate our proposed framework on multiple supervised and unsupervised set learning tasks and demonstrate its superiority over state-of-the-art set representation learning approachesShow more
Downloadable Archival Material, 2021-03-05
Undefined
Publisher:2021-03-05

2021 see 2022
Peer-reviewed
Wasserstein Distances, Geodesics and Barycenters of Merge Trees
Authors: Mathieu Pont; Jules Vidal; Julie Delon; Julien Tierny
Article, 2021
Publication:IEEE transactions on visualization and computer graphics, 28, 2021, 291
Publisher:2021


Peer-reviewed
Nonembeddability of persistence diagrams with p > 2 Wasserstein metric
Author:Alexander Wagner

Article, 2021
Publication:Proceedings of the American Mathematical Society, 149, 2021, 2673
Publisher:2021

<——2021———2021——2890—



Peer-reviewed
Second-Order Conic Programming Approach for Wasserstein Distributionally Robust Two-Stage Linear Programs
Authors: Zhuolin Wang; Keyou You; Shiji Song; Yuli Zhang
Article, 2021
Publication:IEEE transactions on automation science and engineering, 19, 2021, 946
Publisher:2021
 
Peer-reviewed
Linear and Deep Order-Preserving Wasserstein Discriminant Analysis
Authors: Su B.; Zhou J.; Wen J.; Wu Y.
Article, 2021
Publication:IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021
Publisher:2021

Approximating 1-Wasserstein Distance between Persistence Diagrams by Graph Sparsification
Authors:Dey, Tamal K. (Creator), Zhang, Simon (Creator)
Summary:Persistence diagrams (PD)s play a central role in topological data analysis. This analysis requires computing distances among such diagrams such as the 1-Wasserstein distance. Accurate computation of these PD distances for large data sets that render large diagrams may not scale appropriately with the existing methods. The main source of difficulty ensues from the size of the bipartite graph on which a matching needs to be computed for determining these PD distances. We address this problem by making several algorithmic and computational observations in order to obtain an approximation. First, taking advantage of the proximity of PD points, we condense them thereby decreasing the number of nodes in the graph for computation. The increase in point multiplicities is addressed by reducing the matching problem to a min-cost flow problem on a transshipment network. Second, we use Well Separated Pair Decomposition to sparsify the graph to a size that is linear in the number of points. Both node and arc sparsifications contribute to the approximation factor where we leverage a lower bound given by the Relaxed Word Mover's distance. Third, we eliminate bottlenecks during the sparsification procedure by introducing parallelism. Fourth, we develop an open source software called PDoptFlow based on our algorithm, exploiting parallelism by GPU and multicore. We perform extensive experiments and show that the actual empirical error is very low. We also show that we can achieve high performance at low guaranteed relative errors, improving upon the state of the artsShow more
Downloadable Archival Material, 2021-10-27
Undefined
Publisher:2021-10-27

 
2021 see 2022
Peer-reviewed
Fault Diagnosis of Rotating Machinery Based on Wasserstein Distance and Feature Selection

Authors: Francesco Ferracuti; Alessandro Freddi; Andrea Monteriu; Luca Romeo
Article, 2021
Publication:IEEE transactions on automation science and engineering, 19, 2021, 1997
Publisher:2021

Measuring dependence in the Wasserstein distance for Bayesian nonparametric models
Authors: Marta Catalano; Antonio Lijoi; Igor Prünster
Article, 2021
Publication:Annals of statistics, 49, 2021, 2916
Publisher:2021


2021



SLOSH: Set LOcality Sensitive Hashing via Sliced-Wasserstein Embeddings
Authors:Lu, Yuzhe (Creator), Liu, Xinran (Creator), Soltoggio, Andrea (Creator), Kolouri, Soheil (Creator)
Summary:Learning from set-structured data is an essential problem with many applications in machine learning and computer vision. This paper focuses on non-parametric and data-independent learning from set-structured data using approximate nearest neighbor (ANN) solutions, particularly locality-sensitive hashing. We consider the problem of set retrieval from an input set query. Such retrieval problem requires: 1) an efficient mechanism to calculate the distances/dissimilarities between sets, and 2) an appropriate data structure for fast nearest neighbor search. To that end, we propose Sliced-Wasserstein set embedding as a computationally efficient "set-2-vector" mechanism that enables downstream ANN, with theoretical guarantees. The set elements are treated as samples from an unknown underlying distribution, and the Sliced-Wasserstein distance is used to compare sets. We demonstrate the effectiveness of our algorithm, denoted as Set-LOcality Sensitive Hashing (SLOSH), on various set retrieval datasets and compare our proposed embedding with standard set embedding approaches, including Generalized Mean (GeM) embedding/pooling, Featurewise Sort Pooling (FSPool), and Covariance Pooling and show consistent improvement in retrieval results. The code for replicating our results is available here: \href{https://github.com/mint-vu/SLOSH}{https://github.com/mint-vu/SLOSH}Show more
Downloadable Archival Material, 2021-12-10
Undefined
Publisher:2021-12-10

  
Peer-reviewed
Wasserstein GANs for MR Imaging: From Paired to Unpaired Training
Authors: Lei K.; Mardani M.; Pauly J.M.; Vasanawala S.S.
Article, 2021
Publication:IEEE Transactions on Medical Imaging, 40, 2021 01 01, 105
Publisher:2021
 
Attainability property for a probabilistic target in Wasserstein spaces
Authors: Giulia Cavagnari; Antonio Marigonda
Article, 2021
Publication:Discrete and continuous dynamical systems, 41, 2021, 777
Publisher:2021

On a linear Gromov-Wasserstein distance
Authors:Beier, Florian (Creator), Beinert, Robert (Creator), Steidl, Gabriele (Creator)
Summary:Gromov-Wasserstein distances are generalization of Wasserstein distances, which are invariant under certain distance preserving transformations. Although a simplified version of optimal transport in Wasserstein spaces, called linear optimal transport (LOT), was successfully used in certain applications, there does not exist a notation of linear Gromov-Wasserstein distances so far. In this paper, we propose a definition of linear Gromov-Wasserstein distances. We motivate our approach by a generalized LOT model, which is based on barycentric projection maps of transport plans. Numerical examples illustrate that the linear Gromov-Wasserstein distances, similarly as LOT, can replace the expensive computation of pairwise Gromov-Wasserstein distances in certain applicationsShow more
Downloadable Archival Material, 2021-12-22
Undefined
Publisher:2021-12-22



P-WAE: Generalized Patch-Wasserstein Autoencoder for Anomaly Screening
Author:Chen, Yurong (Creator)
Summary:Anomaly detection plays a pivotal role in numerous real-world scenarios, such as industrial automation and manufacturing intelligence. Recently, variational inference-based anomaly analysis has attracted researchers' and developers' attention. It aims to model the defect-free distribution so that anomalies can be classified as out-of-distribution samples. Nevertheless, there are two disturbing factors that need us to prioritize: (i) the simplistic prior latent distribution inducing limited expressive capability; (ii) the strong probability distance notion results in collapsed features. In this paper, we propose a novel Patch-wise Wasserstein AutoEncoder (P-WAE) architecture to alleviate those challenges. In particular, a patch-wise variational inference model coupled with solving the jigsaw puzzle is designed, which is a simple yet effective way to increase the expressiveness of the latent manifold. This makes using the model on high-dimensional practical data possible. In addition, we leverage a weaker measure, sliced-Wasserstein distance, to achieve the equilibrium between the reconstruction fidelity and generalized representations. Comprehensive experiments, conducted on the MVTec AD dataset, demonstrate the superior performance of our proposed methodShow more
Downloadable Archival Material, 2021-08-09
Undefined
Publisher:2021-08-09

<——2021———2021——2900—-



Supervised Tree-Wasserstein Distance
Authors:Takezawa, Yuki (Creator), Sato, Ryoma (Creator), Yamada, Makoto (Creator)
Summary:To measure the similarity of documents, the Wasserstein distance is a powerful tool, but it requires a high computational cost. Recently, for fast computation of the Wasserstein distance, methods for approximating the Wasserstein distance using a tree metric have been proposed. These tree-based methods allow fast comparisons of a large number of documents; however, they are unsupervised and do not learn task-specific distances. In this work, we propose the Supervised Tree-Wasserstein (STW) distance, a fast, supervised metric learning method based on the tree metric. Specifically, we rewrite the Wasserstein distance on the tree metric by the parent-child relationships of a tree and formulate it as a continuous optimization problem using a contrastive loss. Experimentally, we show that the STW distance can be computed fast, and improves the accuracy of document classification tasks. Furthermore, the STW distance is formulated by matrix multiplications, runs on a GPU, and is suitable for batch processing. Therefore, we show that the STW distance is extremely efficient when comparing a large number of documentsShow more
Downloadable Archival Material, 2021-01-27
Undefined
Publisher:2021-01-27
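
The reason the tree metric is attractive here is that the 1-Wasserstein distance on a tree has a closed form: a weighted sum, over the edges, of the absolute difference of the two measures' masses in the subtree below each edge. The sketch below evaluates that (unsupervised) closed form on a toy tree; the supervised learning of the tree structure described in the paper is not reproduced.

```python
# Tree-Wasserstein (W1 on a tree metric): sum over edges of
# edge_weight * |mu(subtree below the edge) - nu(subtree below the edge)|.
import numpy as np

def tree_wasserstein(parent, edge_weight, mu, nu):
    """Assumes nodes are numbered so that parent[v] < v, node 0 is the root, and
    edge_weight[v] is the weight of the edge joining v to parent[v]."""
    n = len(parent)
    sub_mu, sub_nu = np.array(mu, dtype=float), np.array(nu, dtype=float)
    dist = 0.0
    for v in range(n - 1, 0, -1):          # children are always visited before their parents
        dist += edge_weight[v] * abs(sub_mu[v] - sub_nu[v])
        sub_mu[parent[v]] += sub_mu[v]     # push the subtree mass up to the parent
        sub_nu[parent[v]] += sub_nu[v]
    return dist

if __name__ == "__main__":
    parent = [-1, 0, 0, 1, 1]                       # a small 5-node tree rooted at node 0
    edge_weight = [0.0, 1.0, 1.0, 0.5, 0.5]
    mu = [0.0, 0.0, 0.5, 0.25, 0.25]
    nu = [0.0, 0.0, 0.0, 0.5, 0.5]
    print(tree_wasserstein(parent, edge_weight, mu, nu))
```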


Approximation algorithms for 1-Wasserstein distance between persistence diagrams
Authors:Chen, Samantha (Creator), Wang, Yusu (Creator)
Summary:Recent years have witnessed a tremendous growth using topological summaries, especially the persistence diagrams (encoding the so-called persistent homology) for analyzing complex shapes. Intuitively, persistent homology maps a potentially complex input object (be it a graph, an image, or a point set and so on) to a unified type of feature summary, called the persistence diagrams. One can then carry out downstream data analysis tasks using such persistence diagram representations. A key problem is to compute the distance between two persistence diagrams efficiently. In particular, a persistence diagram is essentially a multiset of points in the plane, and one popular distance is the so-called 1-Wasserstein distance between persistence diagrams. In this paper, we present two algorithms to approximate the 1-Wasserstein distance for persistence diagrams in near-linear time. These algorithms primarily follow the same ideas as two existing algorithms to approximate optimal transport between two finite point-sets in Euclidean spaces via randomly shifted quadtrees. We show how these algorithms can be effectively adapted for the case of persistence diagrams. Our algorithms are much more efficient than previous exact and approximate algorithms, both in theory and in practice, and we demonstrate its efficiency via extensive experiments. They are conceptually simple and easy to implement, and the code is publicly available in githubShow more
Downloadable Archival Material, 2021-04-15
Undefined
Publisher:2021-04-15

Zbl 07700596
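
For context, the exact quantity that these approximation papers target can be computed, for small diagrams, by a single bipartite matching in which each diagram is padded with "diagonal" slots for the other diagram's points. The hedged sketch below uses the Hungarian algorithm and assumes the common L-infinity ground metric (so a point's distance to the diagonal is half its persistence); it is a brute-force baseline, not the algorithms of the cited works.

```python
# Exact 1-Wasserstein distance between two small persistence diagrams via bipartite matching.
import numpy as np
from scipy.optimize import linear_sum_assignment

def pd_wasserstein1(X, Y):
    """X, Y: arrays of shape (n, 2) and (m, 2) of (birth, death) pairs."""
    n, m = len(X), len(Y)
    C = np.zeros((n + m, n + m))
    if n and m:
        C[:n, :m] = np.max(np.abs(X[:, None, :] - Y[None, :, :]), axis=2)  # L-inf point costs
    C[:n, m:] = ((X[:, 1] - X[:, 0]) / 2.0)[:, None]   # matching an X point to the diagonal
    C[n:, :m] = ((Y[:, 1] - Y[:, 0]) / 2.0)[None, :]   # matching a Y point to the diagonal
    rows, cols = linear_sum_assignment(C)              # diagonal-to-diagonal entries cost 0
    return float(C[rows, cols].sum())

if __name__ == "__main__":
    X = np.array([[0.0, 1.0], [0.2, 0.5]])
    Y = np.array([[0.1, 1.1]])
    print(pd_wasserstein1(X, Y))
```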


Quantized Gromov-Wasserstein
Authors:Chowdhury, Samir (Creator), Miller, David (Creator), Needham, Tom (Creator)
Summary:The Gromov-Wasserstein (GW) framework adapts ideas from optimal transport to allow for the comparison of probability distributions defined on different metric spaces. Scalable computation of GW distances and associated matchings on graphs and point clouds have recently been made possible by state-of-the-art algorithms such as S-GWL and MREC. Each of these algorithmic breakthroughs relies on decomposing the underlying spaces into parts and performing matchings on these parts, adding recursion as needed. While very successful in practice, theoretical guarantees on such methods are limited. Inspired by recent advances in the theory of quantization for metric measure spaces, we define Quantized Gromov Wasserstein (qGW): a metric that treats parts as fundamental objects and fits into a hierarchy of theoretical upper bounds for the GW problem. This formulation motivates a new algorithm for approximating optimal GW matchings which yields algorithmic speedups and reductions in memory complexity. Consequently, we are able to go beyond outperforming state-of-the-art and apply GW matching at scales that are an order of magnitude larger than in the existing literature, including datasets containing over 1M pointsShow more
Downloadable Archival Material, 2021-04-05
Undefined
Publisher:2021-04-05

Sig-Wasserstein GANs for Time Series Generation
Authors:Ni, Hao (Creator), Szpruch, Lukasz (Creator), Sabate-Vidales, Marc (Creator), Xiao, Baoren (Creator), Wiese, Magnus (Creator), Liao, Shujian (Creator)
Summary:Synthetic data is an emerging technology that can significantly accelerate the development and deployment of AI machine learning pipelines. In this work, we develop high-fidelity time-series generators, the SigWGAN, by combining continuous-time stochastic models with the newly proposed signature $W_1$ metric. The former are the Logsig-RNN models based on the stochastic differential equations, whereas the latter originates from the universal and principled mathematical features to characterize the measure induced by time series. SigWGAN allows turning computationally challenging GAN min-max problem into supervised learning while generating high fidelity samples. We validate the proposed model on both synthetic data generated by popular quantitative risk models and empirical financial data. Codes are available at https://github.com/SigCGANs/Sig-Wasserstein-GANs.gitShow more
Downloadable Archival Material, 2021-11-01
Undefined
Publisher:2021-11-01 


Peer-reviewed
On distributionally robust chance constrained programs with Wasserstein distance
Author:Weijun Xie
Downloadable Article, 2021
Publication:Mathematical Programming : A Publication of the Mathematical Optimization Society, 186, 202103, 115
Publisher:2021 

 

2021


Do Neural Optimal Transport Solvers Work? A Continuous Wasserstein-2 Benchmark
Authors:Korotin, Alexander (Creator), Li, Lingxiao (Creator), Genevay, Aude (Creator), Solomon, Justin (Creator), Filippov, Alexander (Creator), Burnaev, Evgeny (Creator)
Summary:Despite the recent popularity of neural network-based solvers for optimal transport (OT), there is no standard quantitative way to evaluate their performance. In this paper, we address this issue for quadratic-cost transport -- specifically, computation of the Wasserstein-2 distance, a commonly-used formulation of optimal transport in machine learning. To overcome the challenge of computing ground truth transport maps between continuous measures needed to assess these solvers, we use input-convex neural networks (ICNN) to construct pairs of measures whose ground truth OT maps can be obtained analytically. This strategy yields pairs of continuous benchmark measures in high-dimensional spaces such as spaces of images. We thoroughly evaluate existing optimal transport solvers using these benchmark measures. Even though these solvers perform well in downstream tasks, many do not faithfully recover optimal transport maps. To investigate the cause of this discrepancy, we further test the solvers in a setting of image generation. Our study reveals crucial limitations of existing solvers and shows that increased OT accuracy does not necessarily correlate to better results downstreamShow mor
Downloadable Archival Material, 2021-06-03
Undefined
Publisher:2021-06-03
Cited by 28


Fast Approximation of the Sliced-Wasserstein Distance Using Concentration of Random Projections
Authors:Nadjahi, Kimia (Creator), Durmus, Alain (Creator), Jacob, Pierre E. (Creator), Badeau, Roland (Creator), Şimşekli, Umut (Creator)
Summary:The Sliced-Wasserstein distance (SW) is being increasingly used in machine learning applications as an alternative to the Wasserstein distance and offers significant computational and statistical benefits. Since it is defined as an expectation over random projections, SW is commonly approximated by Monte Carlo. We adopt a new perspective to approximate SW by making use of the concentration of measure phenomenon: under mild assumptions, one-dimensional projections of a high-dimensional random vector are approximately Gaussian. Based on this observation, we develop a simple deterministic approximation for SW. Our method does not require sampling a number of random projections, and is therefore both accurate and easy to use compared to the usual Monte Carlo approximation. We derive nonasymptotical guarantees for our approach, and show that the approximation error goes to zero as the dimension increases, under a weak dependence condition on the data distribution. We validate our theoretical findings on synthetic datasets, and illustrate the proposed approximation on a generative modeling problemShow more
Downloadable Archival Material, 2021-06-29
Undefined
Publisher:2021-06-29
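
The baseline that this entry (and the other sliced-Wasserstein entries in this section) builds on is the plain Monte Carlo estimator: project both samples onto random directions and average the resulting one-dimensional Wasserstein distances. A minimal numpy/scipy sketch of that estimator is below; it uses the order-1 distance for simplicity and is not the deterministic approximation proposed in the paper.

```python
# Monte Carlo estimator of the sliced-Wasserstein distance between two point clouds.
import numpy as np
from scipy.stats import wasserstein_distance

def sliced_wasserstein(X, Y, n_projections=200, seed=0):
    rng = np.random.default_rng(seed)
    thetas = rng.standard_normal((n_projections, X.shape[1]))
    thetas /= np.linalg.norm(thetas, axis=1, keepdims=True)   # uniform directions on the sphere
    # Average the 1-D Wasserstein distance over the random one-dimensional projections.
    return float(np.mean([wasserstein_distance(X @ t, Y @ t) for t in thetas]))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(0.0, 1.0, size=(500, 10))
    Y = rng.normal(0.3, 1.0, size=(500, 10))
    print(sliced_wasserstein(X, Y))
```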
 
Model-agnostic bias mitigation methods with regressor distribution control for Wasserstein-based fairness metrics
Authors:Miroshnikov, Alexey (Creator), Kotsiopoulos, Konstandinos (Creator), Franks, Ryan (Creator), Kannan, Arjun Ravi (Creator)
Summary:This article is a companion paper to our earlier work Miroshnikov et al. (2021) on fairness interpretability, which introduces bias explanations. In the current work, we propose a bias mitigation methodology based upon the construction of post-processed models with fairer regressor distributions for Wasserstein-based fairness metrics. By identifying the list of predictors contributing the most to the bias, we reduce the dimensionality of the problem by mitigating the bias originating from those predictors. The post-processing methodology involves reshaping the predictor distributions by balancing the positive and negative bias explanations and allows for the regressor bias to decrease. We design an algorithm that uses Bayesian optimization to construct the bias-performance efficient frontier over the family of post-processed models, from which an optimal model is selected. Our novel methodology performs optimization in low-dimensional spaces and avoids expensive model retrainingShow more
Downloadable Archival Material, 2021-11-19
Undefined
Publisher:2021-11-19
Uncertainty quantification in a mechanical submodel driven by a Wasserstein-GAN
Authors:Boukraichi, Hamza (Creator), Akkari, Nissrine (Creator), Casenave, Fabien (Creator), Ryckelynck, David (Creator)
Summary:The analysis of parametric and non-parametric uncertainties of very large dynamical systems requires the construction of a stochastic model of said system. Linear approaches relying on random matrix theory and principal componant analysis can be used when systems undergo low-frequency vibrations. In the case of fast dynamics and wave propagation, we investigate a random generator of boundary conditions for fast submodels by using machine learning. We show that the use of non-linear techniques in machine learning and data-driven methods is highly relevant. Physics-informed neural networks is a possible choice for a data-driven method to replace linear modal analysis. An architecture that support a random component is necessary for the construction of the stochastic model of the physical system for non-parametric uncertainties, since the goal is to learn the underlying probabilistic distribution of uncertainty in the data. Generative Adversarial Networks (GANs) are suited for such applications, where the Wasserstein-GAN with gradient penalty variant offers improved convergence results for our problem. The objective of our approach is to train a GAN on data from a finite element method code (Fenics) so as to extract stochastic boundary conditions for faster finite element predictions on a submodel. The submodel and the training data have both the same geometrical support. It is a zone of interest for uncertainty quantification and relevant to engineering purposes. In the exploitation phase, the framework can be viewed as a randomized and parametrized simulation generator on the submodel, which can be used as a Monte Carlo estimatorShow more
Downloadable Archival Material, 2021-10-26
Undefined
Publisher:2021-10-26


Sliced-Wasserstein Gradient Flows
Authors:Bonet, Clément (Creator), Courty, Nicolas (Creator), Septier, François (Creator), Drumetz, Lucas (Creator)
Summary:Minimizing functionals in the space of probability distributions can be done with Wasserstein gradient flows. To solve them numerically, a possible approach is to rely on the Jordan-Kinderlehrer-Otto (JKO) scheme which is analogous to the proximal scheme in Euclidean spaces. However, it requires solving a nested optimization problem at each iteration, and is known for its computational challenges, especially in high dimension. To alleviate it, very recent works propose to approximate the JKO scheme leveraging Brenier's theorem, and using gradients of Input Convex Neural Networks to parameterize the density (JKO-ICNN). However, this method comes with a high computational cost and stability issues. Instead, this work proposes to use gradient flows in the space of probability measures endowed with the sliced-Wasserstein (SW) distance. We argue that this method is more flexible than JKO-ICNN, since SW enjoys a closed-form differentiable approximation. Thus, the density at each step can be parameterized by any generative model which alleviates the computational burden and makes it tractable in higher dimensionsShow more
Downloadable Archival Material, 2021-10-21
Undefined
Publisher:2021-10-21

<——2021———2021——2910—- 




Wasserstein distributionally robust option pricing
Authors: Wei Liu; Li Yang; Bo Yu
Article, 2021
Publication:Journal of mathematical research with applications, 41, 2021, 99
Publisher:2021

Geometric Characteristics of the Wasserstein Metric on SPD(n) and Its Applications on Data Processing
Authors: Yihao Luo; Shiqiang Zhang; Yueqi Cao; Huafei Sun; Carlos M. Travieso-González
Article, 2021
Publication:Entropy, 23, 20210914
Publisher:2021


WDA: An Improved Wasserstein Distance-Based Transfer Learning Fault Diagnosis Method
Authors: Zhiyu Zhu; Lanzhi Wang; Gaoliang Peng; Sijue Li; Kim Phuc Tran; Athanasios Rakitzis; Khanh T. P. Nguyen
Article, 2021
Publication:Sensors (Basel, Switzerland), 21, 20210626
Publisher:2021


Reproducibility of radiomic features using network analysis and its application in Wasserstein k -means clustering
Authors: Aditya P. Apte; Joseph O. Deasy; Vaios Hatzoglou; Aditi Iyer; Evangelia Katsoulakis; Nancy Y. Lee; Usman Mahmood; Jung Hun Oh; Maryam Pouryahya; Nadeem Riaz; …
Article, 2021
Publication:Journal of Medical Imaging, 8, 20210430, 031904
Publisher:2021
 
Image Denoising Using an Improved Generative Adversarial Network with Wasserstein Distance
Authors: Qian Wang; Han Liu; Guo Xie; Youmin Zhang. 2021 40th Chinese Control Conference (CCC)
Summary:The image denoising discriminant model has received extensive attention in recent years due to its good denoising performance. In order to solve the problems of denoising of traditional generative adversarial networks, which are difficult to train and easy to collapse, traditional denoising methods will damage the visibility of important structural details after the image is denoised, this paper proposes an improved generative adversarial network algorithm. The novel algorithm obtains more image features by adding multi-level convolution of the generative network, and adds multiple residual blocks and global residuals to extract and learn the features of the input noisy image to avoid the loss of features. The network uses the weighted sum of feature loss and perceptual loss, which can effectively retain the detailed information of the image while removing noise. In the experiments, we selected PSNR and SSIM as the indicators, and the experimental results show that the novel algorithm can effectively remove image noise and improve the visual perception of the imageShow more
Chapter, 2021
Publication:2021 40th Chinese Control Conference (CCC), 20210726, 7027
Publisher:2021


2021


Peer-reviewed
From the backward Kolmogorov PDE on the Wasserstein space to propagation of chaos for McKean-Vlasov SDEs
Authors:Paul-Eric Chaudru de Raynal; Noufel Frikha
Article, 2021
Publication:JOURNAL DE MATHEMATIQUES PURES ET APPLIQUEES, 156, 2021, 1
Publisher:2021

Peer-reviewed
Wasserstein Distributionally Robust Look-Ahead Economic Dispatch
Authors:Bala Kameshwar Poolla; Ashish R. Hota; Saverio Bolognani; Duncan S. Callaway; Ashish Cherukuri
Summary:We consider the problem of look-ahead economic dispatch (LAED) with uncertain renewable energy generation. The goal of this problem is to minimize the cost of conventional energy generation subject to uncertain operational constraints. The risk of violating these constraints must be below a given threshold for a family of probability distributions with characteristics similar to observed past data or predictions. We present two data-driven approaches based on two novel mathematical reformulations of this distributionally robust decision problem. The first one is a tractable convex program in which the uncertain constraints are defined via the distributionally robust conditional-value-at-risk. The second one is a scalable robust optimization program that yields an approximate distributionally robust chance-constrained LAED. Numerical experiments on the IEEE 39-bus system with real solar production data and forecasts illustrate the effectiveness of these approaches. We discuss how system operators should tune these techniques in order to seek the desired robustness-performance trade-off and we compare their computational scalabilityShow more
Article, 2021
Publication:IEEE Transactions on Power Systems, 36, 202105, 2010
Publisher:2021
2021 online Cover Image OPEN ACCESS

Wasserstein Distributionally Robust Look-Ahead Economic Dispatch

by Poolla, Bala Kameshwar; Hota, Ashish R; Bolognani, Saverio ; More...

IEEE transactions on power systems, 05/2021, Volume 36, Issue 3

We consider the problem of look-ahead economic dispatch (LAED) with uncertain renewable energy generation. The goal of this problem is to minimize the cost of...


Journal ArticleFull Text Online




Averaging on the Bures-Wasserstein manifold: dimension-free convergence of gradient descent
Authors:Altschuler, Jason M. (Creator), Chewi, Sinho (Creator), Gerber, Patrik (Creator), Stromme, Austin J. (Creator)
Summary:We study first-order optimization algorithms for computing the barycenter of Gaussian distributions with respect to the optimal transport metric. Although the objective is geodesically non-convex, Riemannian GD empirically converges rapidly, in fact faster than off-the-shelf methods such as Euclidean GD and SDP solvers. This stands in stark contrast to the best-known theoretical results for Riemannian GD, which depend exponentially on the dimension. In this work, we prove new geodesic convexity results which provide stronger control of the iterates, yielding a dimension-free convergence rate. Our techniques also enable the analysis of two related notions of averaging, the entropically-regularized barycenter and the geometric median, providing the first convergence guarantees for Riemannian GD for these problems.
Downloadable Archival Material, 2021-06-15
Undefined
Publisher:2021-06-15
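
For context on the object optimized above, here is a hedged numpy/scipy sketch of the classical fixed-point iteration for the Bures-Wasserstein barycenter of centered Gaussians. It is a standard baseline rather than the Riemannian gradient descent analyzed in the paper; the function name, initialization, and tolerances are illustrative.

```python
import numpy as np
from scipy.linalg import sqrtm, fractional_matrix_power

def bw_barycenter(covs, weights, n_iter=100, tol=1e-10):
    """Fixed-point iteration for the 2-Wasserstein barycenter of N(0, Sigma_i)."""
    S = np.mean(covs, axis=0)                      # any SPD initialization works
    for _ in range(n_iter):
        root = sqrtm(S)
        inv_root = fractional_matrix_power(S, -0.5)
        # M = sum_i w_i (S^{1/2} Sigma_i S^{1/2})^{1/2}
        M = sum(w * sqrtm(root @ C @ root) for w, C in zip(weights, covs))
        S_new = np.real(inv_root @ M @ M @ inv_root)
        if np.linalg.norm(S_new - S) < tol:
            return S_new
        S = S_new
    return S

# toy usage with two diagonal covariances and equal weights
covs = [np.diag([1.0, 2.0]), np.diag([3.0, 0.5])]
print(bw_barycenter(covs, [0.5, 0.5]))
```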


Wasserstein GAN for the Classification of Unbalanced THz Database
Authors:Zhu R.-S.; Shen T.; Liu Y.-L.; Zhu Y.; Cui X.-W.
Article, 2021
Publication:Guang Pu Xue Yu Guang Pu Fen Xi/Spectroscopy and Spectral Analysis, 41, 2021 02 01, 425
Publisher:2021

Peer-reviewed
Infinite-dimensional regularization of McKean—Vlasov equation with a Wasserstein diffusion
Author:V. Marx
Article, 2021
Publication:Annales de l'I.H.P.Probabilités et statistiques, 57, 2021, 2315
Publisher:2021

<——2021———2021——2920—


2021 eBook
Wasserstein perturbations of Markovian transition semigroups
Authors:Sven Fuhrmann; Michael Kupper; Max Nendel
Summary:In this paper, we deal with a class of time-homogeneous continuous-time Markov processes with transition probabilities bearing a nonparametric uncertainty. The uncertainty is modelled by considering perturbations of the transition probabilities within a proximity in Wasserstein distance. As a limit over progressively finer time periods, on which the level of uncertainty scales proportionally, we obtain a convex semigroup satisfying a nonlinear PDE in a viscosity sense. A remarkable observation is that, in standard situations, the nonlinear transition operators arising from nonparametric uncertainty coincide with the ones related to parametric drift uncertainty. On the level of the generator, the uncertainty is reflected as an additive perturbation in terms of a convex functional of first order derivatives. We additionally provide sensitivity bounds for the convex semigroup relative to the reference model. The results are illustrated with Wasserstein perturbations of Levy processes, infinite-dimensional Ornstein-Uhlenbeck processes, geometric Brownian motions, and Koopman semigroupsShow more
eBook, 2021
English
Publisher:Center for Mathematical Economics (IMW), Bielefeld University, Bielefeld, Germany, 2021


Wasserstein distance to independence models
Authors:Celik, Turku Ozluem (Creator), Jamneshan, Asgar (Creator), Montufar, Guido (Creator), Sturmfels, Bernd (Creator), Venturello, Lorenzo (Creator)
Summary:An independence model for discrete random variables is a Segre-Veronese variety in a probability simplex. Any metric on the set of joint states of the random variables induces a Wasserstein metric on the probability simplex. The unit ball of this polyhedral norm is dual to the Lipschitz polytope. Given any data distribution, we seek to minimize its Wasserstein distance to a fixed independence model. The solution to this optimization problem is a piecewise algebraic function of the data. We compute this function explicitly in small instances, we study its combinatorial structure and algebraic degrees in general, and we present some experimental case studies.

Downloadable Archival Material, 2021
English
Publisher:KTH, Matematik (Avd.); Simon Fraser Univ, 8888 Univ Dr, Burnaby, BC, Canada; Univ Calif Los Angeles, 520 Portola Plaza, Los Angeles, CA USA; MPI MiS Leipzig, Inselstr 22, Leipzig, Germany; Univ Calif Berkeley, 970 Evans Hall, Berkeley, CA USA. Elsevier BV, 2021



Multivariate Goodness-of-Fit Tests Based on Wasserstein Distance
Authors:UCL - SSH/LIDAM/ISBA - Institut de Statistique, Biostatistique et Sciences Actuarielles (Contributor), Hallin, Marc (Creator), Mordant, Gilles (Creator), Segers, Johan (Creator)
Summary:Goodness-of-fit tests based on the empirical Wasserstein distance are proposed for simple and composite null hypotheses involving general multivariate distributions. For group families, the procedure is to be implemented after preliminary reduction of the data via invariance. This property allows for calculation of exact critical values and p-values at finite sample sizes. Applications include testing for location–scale families and testing for families arising from affine transformations, such as elliptical distributions with given standard radial density and unspecified location vector and scatter matrix. A novel test for multivariate normality with unspecified mean vector and covariance matrix arises as a special case. For more general parametric families, we propose a parametric bootstrap procedure to calculate critical values. The lack of asymptotic distribution theory for the empirical Wasserstein distance means that the validity of the parametric bootstrap under the null hypothesis remains a conjecture. Nevertheless, we show that the test is consistent against fixed alternatives. To this end, we prove a uniform law of large numbers for the empirical distribution in Wasserstein distance, where the uniformity is over any class of underlying distributions satisfying a uniform integrability condition but no additional moment assumptions. The calculation of test statistics boils down to solving the well-studied semi-discrete optimal transport problem. Extensive numerical experiments demonstrate the practical feasibility and the excellent performance of the proposed tests for the Wasserstein distance of order p = 1 and p = 2 and for dimensions at least up to d = 5. The simulations also lend support to the conjecture of the asymptotic validity of the parametric bootstrapShow more
Downloadable Archival Material, 2021
English
Publisher:Institute of Mathematical Statistics, 2021

  

2021  [HTML] sciencedirect.com

[HTML] A Wasserstein distance based multiobjective evolutionary algorithm for the risk aware optimization of sensor placement

A Ponti, A Candelieri, F Archetti - Intelligent Systems with Applications, 2021 - Elsevier

… In this paper sensor placement (SP) (ie, location of sensors at some nodes) for the early 

detection … In this paper we model the sensor placement problem as a multi objective optimization 

… the Wasserstein distance between the histograms corresponding to the sensor placement F …

Cited by 5 Related articles
A Wasserstein distance based multiobjective evolutionary algorithm for the risk aware optimization of sensor placement
Authors:Ponti, A (Contributor), Candelieri, A (Contributor), Archetti, F (Contributor), Ponti A. (Creator), Candelieri A. (Creator), Archetti F. (Creator)
Summary:In this paper we propose a new algorithm for the identification of optimal “sensing spots”, within a network, for monitoring the spread of “effects” triggered by “events”. This problem is referred to as “Optimal Sensor Placement” and many real-world problems fit into this general framework. In this paper sensor placement (SP) (i.e., location of sensors at some nodes) for the early detection of contaminants in water distribution networks (WDNs) will be used as a running example. Usually, we have to manage a trade-off between different objective functions, so that we are faced with a multi objective optimization problem. (MOP). The best trade-off between the objectives can be defined in terms of Pareto optimality. In this paper we model the sensor placement problem as a multi objective optimization problem with boolean decision variables and propose a Multi Objective Evolutionary Algorithm (MOEA) for approximating and analyzing the Pareto set. The evaluation of the objective functions requires the execution of a simulation model: to organize the simulation results in a computationally efficient way we propose a data structure collecting simulation outcomes for every SP which is particularly suitable for visualization of the dynamics of contaminant concentration and evolutionary optimization. This data structure enables the definition of information spaces, in which a candidate placement can be represented as a matrix or, in probabilistic terms as a histogram. The introduction of a distance between histograms, namely the Wasserstein (WST) distance, enables to derive new genetic operators, indicators of the quality of the Pareto set and criteria to choose among the Pareto solutions. The new algorithm MOEA/WST has been tested on two benchmark water distribution networks and a real world network. Preliminary results are compared with NSGA-II and show a better performance, in terms of hypervolume and coverage, in particular for relatively large networks and low generationShow more
Downloadable Archival Material, 2021
English
Publisher:Elsevier Ltd. country:GB, 
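
The Wasserstein distance between histograms that MOEA/WST uses as a ground metric reduces, in one dimension, to an integral of the gap between cumulative distributions. The sketch below is a generic illustration with made-up data and names, not the authors' implementation; it places each histogram's mass at the bin centers.

```python
import numpy as np

def w1_histograms(p, q, bin_edges):
    """1-Wasserstein distance between two histograms on a common 1D grid."""
    p = np.asarray(p, float); p = p / p.sum()
    q = np.asarray(q, float); q = q / q.sum()
    edges = np.asarray(bin_edges, float)
    centers = 0.5 * (edges[:-1] + edges[1:])
    # W1 = sum over consecutive atoms of |F_p - F_q| times the spacing
    cdf_gap = np.abs(np.cumsum(p) - np.cumsum(q))
    return float(np.sum(cdf_gap[:-1] * np.diff(centers)))

edges = np.linspace(0.0, 1.0, 11)
print(w1_histograms([1, 2, 3, 4, 0, 0, 0, 0, 0, 0],
                    [0, 0, 0, 0, 0, 1, 2, 3, 4, 0], edges))   # 0.5
```

The same value can be obtained with scipy.stats.wasserstein_distance(centers, centers, p, q), which accepts weighted samples.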

<——2021———2021——2930—-




Multivariate Goodness-of-Fit Tests Based on Wasserstein Distance
Authors:UCL - SSH/LIDAM/ISBA - Institut de Statistique, Biostatistique et Sciences Actuarielles (Contributor), Hallin, Marc (Creator), Mordant, Gilles (Creator), Segers, Johan (Creator)Show more
Summary:Goodness-of-fit tests based on the empirical Wasserstein distance are proposed for simple and composite null hypotheses involving general multivariate distributions. This includes the important problem of testing for multivariate normality with unspecified mean vector and covariance matrix and, more generally, testing for elliptical symmetry with given standard radial density and unspecified location and scatter parameters. The calculation of test statistics boils down to solving the well-studied semi-discrete optimal transport problem. Exact critical values can be computed for some important particular cases, such as null hypotheses of ellipticity with given standard radial density and unspecified location and scatter; else, approximate critical values are obtained via parametric bootstrap. Consistency is established, based on a result on the convergence to zero, uniformly over certain families of distributions, of the empirical Wasserstein distance, a novel result of independent interest. A simulation study establishes the practical feasibility and excellent performance of the proposed tests.
Downloadable Archival Material, 2021
English

Exponential contraction in Wasserstein distance on static and evolving manifolds
Authors:Cheng, Li Juan (Creator), Thalmaier, Anton (Creator), Zhang, Shao-Qin (Creator)
Abstract:In this article, exponential contraction in Wasserstein distance for heat semigroups of diffusion processes on Riemannian manifolds is established under curvature conditions where Ricci curvature is not necessarily required to be non-negative. Compared to the results of Wang (2016), we focus on explicit estimates for the exponential contraction rate. Moreover, we show that our results extend to manifolds evolving under a geometric flow. As application, for the time-inhomogeneous semigroups, we obtain a gradient estimate with an exponential contraction rate under weak curvature conditions, as well as uniqueness of the corresponding evolution system of measuresShow more
Downloadable Archival Material, 2021
English
Publisher:2021


Enhanced Wasserstein Distributionally Robust OPF With Dependence Structure and Support Information
Authors:Arrigo, Adriano (Creator), Kazempour, Jalal (Creator), De Greve, Zacharie (Creator), Toubeau, Jean-François (Creator), Vallée, Francois (Creator)
Summary:This paper goes beyond the current state of the art related to Wasserstein distributionally robust optimal power flow problems, by adding dependence structure (correlation) and support information. In view of the space-time dependencies pertaining to the stochastic renewable power generation uncertainty, we apply a moment-metric-based distributionally robust optimization, which includes a constraint on the second-order moment of uncertainty. Aiming at further excluding unrealistic probability distributions from our proposed decision-making model, we enhance it by adding support information. We reformulate our proposed model, resulting in a semi-definite program, and show its satisfactory performance in terms of the operational results achieved and the computational time.
Downloadable Archival Material, 2021
English
Publisher:IEEE, 2021


Classification of atomic environments via the Gromov–Wasserstein distance
Authors:Kawano, S (Creator), Mason, JK (Creator)
Summary:Interpreting molecular dynamics simulations usually involves automated classification of local atomic environments to identify regions of interest. Existing approaches are generally limited to a small number of reference structures and only include limited information about the local chemical composition. This work proposes to use a variant of the Gromov–Wasserstein (GW) distance to quantify the difference between a local atomic environment and a set of arbitrary reference environments in a way that is sensitive to atomic displacements, missing atoms, and differences in chemical composition. This involves describing a local atomic environment as a finite metric measure space, which has the additional advantages of not requiring the local environment to be centered on an atom and of not making any assumptions about the material class. Numerical examples illustrate the efficacy and versatility of the algorithmShow more

Downloadable Archival Material, 2021-02-15
Undefined
Publisher:eScholarship, University of California, 2021-02-15




Network consensus in the wasserstein metric space of probability measures
Authors:BISHOP, AN (Creator), DOUCET, A (Creator)
Abstract:Distributed consensus in the Wasserstein metric space of probability measures on the real line is introduced in this work. Convergence of each agent's measure to a common measure is proven under a weak network connectivity condition. The common measure reached at each agent is one minimizing a weighted sum of its Wasserstein distance to all initial agent measures. This measure is known as the Wasserstein barycenter. Special cases involving Gaussian measures, empirical measures, and time-invariant network topologies are considered, where convergence rates and average-consensus results are given. This work has possible applicability in computer vision, machine learning, clustering, and estimationShow more
Downloadable Archival Material, 2021-01-01
Undefined
Publisher:Society for Industrial & Applied Mathematics (SIAM), 2021-01-01
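
In one dimension, the Wasserstein barycenter that such consensus schemes converge to can be computed directly by averaging order statistics. The toy numpy sketch below assumes equal sample sizes and serves only as an illustration of the barycentric consensus value, not of the paper's distributed protocol.

```python
import numpy as np

def w2_barycenter_1d(samples, weights=None):
    """2-Wasserstein barycenter of equal-size 1D empirical measures (quantile averaging)."""
    samples = [np.sort(np.asarray(s, float)) for s in samples]
    n = len(samples[0])
    assert all(len(s) == n for s in samples), "equal sample sizes assumed"
    if weights is None:
        weights = np.full(len(samples), 1.0 / len(samples))
    # the k-th atom of the barycenter is the weighted average of the k-th order statistics
    return np.average(np.stack(samples), axis=0, weights=weights)

rng = np.random.default_rng(0)
agents = [rng.normal(loc=m, size=200) for m in (0.0, 2.0, 4.0)]
print(w2_barycenter_1d(agents).mean())   # close to 2.0
```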

2021


Peer-reviewed
Wasserstein convergence rates for random bit approximations of continuous Markov processes
Authors:Stefan Ankirchner; Thomas Kruse; Mikhail Urusov
Summary:We determine the convergence speed of a numerical scheme for approximating one-dimensional continuous strong Markov processes. The scheme is based on the construction of certain Markov chains whose laws can be embedded into the process with a sequence of stopping times. Under a mild condition on the process' speed measure we prove that the approximating Markov chains converge at fixed times at the rate of 1/4 with respect to every p-th Wasserstein distance. For the convergence of paths, we prove any rate strictly smaller than 1/4. Our results apply, in particular, to processes with irregular behavior such as solutions of SDEs with irregular coefficients and processes with sticky pointsShow more
Article, 2021
Publication:Journal of Mathematical Analysis and Applications, 493, 20210115
Publisher:2021


Peer-reviewed
Distributionally robust chance-constrained programs with right-hand side uncertainty under Wasserstein ambiguity
Authors:Nam Ho-Nguyen; Fatma Kılınç-Karzan; Simge Küçükyavuz; Dabeen Lee
Summary:Abstract: We consider exact deterministic mixed-integer programming (MIP) reformulations of distributionally robust chance-constrained programs (DR-CCP) with random right-hand sides over Wasserstein ambiguity sets. The existing MIP formulations are known to have weak continuous relaxation bounds, and, consequently, for hard instances with small radius, or with large problem sizes, the branch-and-bound based solution processes suffer from large optimality gaps even after hours of computation time. This significantly hinders the practical application of the DR-CCP paradigm. Motivated by these challenges, we conduct a polyhedral study to strengthen these formulations. We reveal several hidden connections between DR-CCP and its nominal counterpart (the sample average approximation), mixing sets, and robust 0–1 programming. By exploiting these connections in combination, we provide an improved formulation and two classes of valid inequalities for DR-CCP. We test the impact of our results on a stochastic transportation problem numerically. Our experiments demonstrate the effectiveness of our approach; in particular our improved formulation and proposed valid inequalities reduce the overall solution times remarkably. Moreover, this allows us to significantly scale up the problem sizes that can be handled in such DR-CCP formulations by reducing the solution times from hours to secondsShow more
Article, 2021
Publication:Mathematical Programming : A Publication of the Mathematical Optimization Society, 196, 20210204, 641
Publisher:2021


Temporal conditional Wasserstein GANs for audio-visual affect-related ties
Authors:Athanasiadis, Christos (Creator), Hortal, Enrique (Creator), Asteriadis, Stelios (Creator)
Downloadable Archival Material, 2021
English
Publisher:2021

 

Approximation Algorithms for 1-Wasserstein Distance Between Persistence Diagrams
Authors:Chen, Samantha (Creator), Wang, Yusu (Creator)
Summary:Recent years have witnessed a tremendous growth using topological summaries, especially the persistence diagrams (encoding the so-called persistent homology) for analyzing complex shapes. Intuitively, persistent homology maps a potentially complex input object (be it a graph, an image, or a point set and so on) to a unified type of feature summary, called the persistence diagrams. One can then carry out downstream data analysis tasks using such persistence diagram representations. A key problem is to compute the distance between two persistence diagrams efficiently. In particular, a persistence diagram is essentially a multiset of points in the plane, and one popular distance is the so-called 1-Wasserstein distance between persistence diagrams. In this paper, we present two algorithms to approximate the 1-Wasserstein distance for persistence diagrams in near-linear time. These algorithms primarily follow the same ideas as two existing algorithms to approximate optimal transport between two finite point-sets in Euclidean spaces via randomly shifted quadtrees. We show how these algorithms can be effectively adapted for the case of persistence diagrams. Our algorithms are much more efficient than previous exact and approximate algorithms, both in theory and in practice, and we demonstrate its efficiency via extensive experiments. They are conceptually simple and easy to implement, and the code is publicly available in githubShow more
Downloadable Archival Material, 2021
English
Publisher:LIPIcs - Leibniz International Proceedings in Informatics. 19th International Symposium on Experimental Algorithms (SEA 2021), 2021


A travers et autour des barycentres de Wasserstein
Authors:Aleksei Kroshnin; Filippo Santambrogio; Andrei Sobolevski; Ivan Gentil; Julie Delon; Luigi De Pascale; Elsa Cazelles; Alexander Kolesnikov; Alexandra Suvorikova; Université de Lyon (2015-....).
Summary:The optimal transport problem, originally introduced by G. Monge in 1781 and rediscovered by L. Kantorovich in 1942, consists in transforming one mass distribution into another with the least amount of work. In this thesis, we consider several variational problems involving optimal transport. We are mainly motivated by the Wasserstein barycenter problem introduced by M. Agueh and G. Carlier in 2011. We treat the following problems: barycenters with respect to a general transport cost, their existence and stability; concentration and a central limit theorem for empirical Wasserstein barycenters of Gaussian measures; characterization, properties and a central limit theorem for entropy-penalized Wasserstein barycenters; the optimal transport problem penalized by the Dirichlet energy of a transport plan. Another part of the thesis is devoted to the complexity analysis of the iterative Bregman projections algorithm. This is a generalization of the well-known Sinkhorn algorithm, which allows us to find an approximate solution of the optimal transport problem as well as of the Wasserstein barycenter problem
Computer Program, 2021
English
Publisher:2021

<——2021———2021——2940—- 



CDC-Wasserstein generated adversarial network for locally occluded face image recognition
Authors:Junrui Jiang; Yuanyuan Li; Shihan Yan; Kun Zhang; Wenlong Zhang. 2nd International Conference on Computer Vision, Image and Deep Learning, 2600864, 2021-06-25 to 2021-06-28, Liuzhou, China; Volume 11911: Image Processing Technology and Intelligent Recognition and Detection
Summary:In the practical application of wisdom education classroom teaching, students' faces may be blocked due to various factors (such as clothing, environment, lighting), resulting in low accuracy and low robustness of face recognition. To solve this problem, we introduce a new image restoration and recognition method is based on WGAN (Wasserstein Generated Adversarial Networks). When using the deep convolution generates adversarial networks for unsupervised training, we add the conditional category label c to guide the generator to generate sample data. At the same time, a double discriminant mechanism is introduced to enhance the feature extraction ability of the model. The local discriminant can better repair the details of the occlusion area, and the global discriminant is responsible for judging the authenticity and overall visual coherence of the restored image. Part of the convolution layer of the global discriminator is used to construct a VGG-like structure network as the feature extractor, which is composed of the full connection layer and the sigmoid layer. It can accelerate the convergence speed of the network and improve the robustness of the method. In order to improve the training stability and reduce overfitting, L2 regularization is added on the basis of context loss to enhance the continuity of local and whole images, and improve the quality of restoration and recognition accuracy. We used the peak signal-to-noise ratio (PSNR), structural similarity (SSIM) and recognition accuracy as evaluation indexes, and achieved good results on CelebA and CelebA-HQ datasetsShow more
Chapter, 2021
Publication:11911, 20211005, 1191112
Publisher:2021
 


Stable Parallel Training of Wasserstein Conditional Generative Adversarial Neural Networks: *Full/Regular Research Paper submission for the symposium CSCI-ISAI: Artificial Intelligence
Authors:Massimiliano Lupo Pasini; Junqi Yin. Oak Ridge National Lab (ORNL), Oak Ridge, TN (United States). The 2021 International Conference on Computational Science and Computational Intelligence (CSCI'21), Las Vegas, Nevada, United States of America, 12/15/2021-12/17/2021
Summary:We use a stable parallel approach to train Wasserstein Conditional Generative Adversarial Neural Networks (W-CGANs). The parallel training reduces the risk of mode collapse and enhances scalability by using multiple generators that are concurrently trained, each one of them focusing on a single data label. The use of the Wasserstein metric reduces the risk of cycling by stabilizing the training of each generator. We apply the approach on the CIFAR10 and the CIFAR100 datasets, two standard benchmark datasets with images of the same resolution, but different number of classes. Performance is assessed using the inception score, the Fréchet inception distance, and image quality. An improvement in inception score and Fréchet inception distance is shown in comparison to previous results obtained by performing the parallel approach on deep convolutional conditional generative adversarial neural networks (DC-CGANs). Weak scaling is attained on both datasets using up to 100 NVIDIA V100 GPUs on the OLCF supercomputer SummitShow more
Book, 2021
Publication:20211201
Publisher:2021

Peer-reviewed
Exponential convergence in entropy and Wasserstein for McKean-Vlasov SDEs
Authors:Panpan Ren; Feng-Yu Wang
Summary:The following type of exponential convergence is proved for (non-degenerate or degenerate) McKean-Vlasov SDEs: $W_2(\mu_t,\mu_\infty)^2 + \mathrm{Ent}(\mu_t|\mu_\infty) \le c\,e^{-\lambda t}\min\{W_2(\mu_0,\mu_\infty)^2,\ \mathrm{Ent}(\mu_0|\mu_\infty)\}$, $t \ge 1$, where $c,\lambda>0$ are constants, $\mu_t$ is the distribution of the solution at time $t$, $\mu_\infty$ is the unique invariant probability measure, $\mathrm{Ent}$ is the relative entropy and $W_2$ is the $L^2$-Wasserstein distance. In particular, this type of exponential convergence holds for some (non-degenerate or degenerate) granular media type equations generalizing those studied in Carrillo et al. (2003) and Guillin et al. (0000) on the exponential convergence in a mean field entropy
Article, 2021
Publication:Nonlinear Analysis, 206, 202105
Publisher:2021


Tropical optimal transport and Wasserstein distances
Authors:Wonjun Lee; Wuchen Li; Bo Lin; Anthea Monod
Summary:We study the problem of optimal transport in tropical geometry and define the Wasserstein-p distances in the continuous metric measure space setting of the tropical projective torus. We specify the tropical metric, a combinatorial metric that has been used to study the tropical geometric space of phylogenetic trees, as the ground metric and study the cases of $p=1,2$ in detail. The case of $p=1$ gives an efficient computation of the infinitely-many geodesics on the tropical projective torus, while the case of $p=2$ gives a form for Fréchet means and a general inner product structure. Our results also provide theoretical foundations for geometric insight and a statistical framework in a tropical geometric setting. We construct explicit algorithms for the computation of the tropical Wasserstein-1 and 2 distances and prove their convergence. Our results provide the first study of the Wasserstein distances and optimal transport in tropical geometry. Several numerical examples are provided.
Downloadable Article, 2021
Publication:Information Geometry, 20210607, 1
Publisher:2021

2021 see 2022
Entropy-regularized 2-Wasserstein distance between Gaussian measures
Authors:Anton Mallasto; Augusto Gerolin; Hà Quang Minh
Summary:Gaussian distributions are plentiful in applications dealing in uncertainty quantification and diffusivity. They furthermore stand as important special cases for frameworks providing geometries for probability measures, as the resulting geometry on Gaussians is often expressible in closed-form under the frameworks. In this work, we study the Gaussian geometry under the entropy-regularized 2-Wasserstein distance, by providing closed-form solutions for the distance and interpolations between elements. Furthermore, we provide a fixed-point characterization of a population barycenter when restricted to the manifold of Gaussians, which allows computations through the fixed-point iteration algorithm. As a consequence, the results yield closed-form expressions for the 2-Sinkhorn divergence. As the geometries change by varying the regularization magnitude, we study the limiting cases of vanishing and infinite magnitudes, reconfirming well-known results on the limits of the Sinkhorn divergence. Finally, we illustrate the resulting geometries with a numerical studyShow more
Downloadable Article, 2021
Publication:Information Geometry, 20210816, 1
Publisher:2021
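
For reference, the unregularized counterpart of the distance studied above admits a well-known closed form between Gaussians; the entropic regularization analyzed in the paper modifies this expression and is not reproduced here.

```latex
% Closed-form squared 2-Wasserstein distance between Gaussian measures
% N(m_1, \Sigma_1) and N(m_2, \Sigma_2) (the unregularized case).
\[
W_2^2\bigl(\mathcal{N}(m_1,\Sigma_1),\mathcal{N}(m_2,\Sigma_2)\bigr)
  = \lVert m_1 - m_2 \rVert^2
  + \operatorname{tr}\!\Bigl(\Sigma_1 + \Sigma_2
  - 2\bigl(\Sigma_1^{1/2}\Sigma_2\,\Sigma_1^{1/2}\bigr)^{1/2}\Bigr).
\]
```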


2021



Data-driven Wasserstein distributionally robust optimization for refinery planning under uncertainty
Authors:Jinmin Zhao; Liang Zhao; Wangli He. IECON 2021 - 47th Annual Conference of the IEEE Industrial Electronics Society
Summary:This paper addresses the issue of refinery production planning under uncertainty. A data-driven Wasserstein distributionally robust optimization approach is proposed to optimize refinery planning operations. The uncertainties of product prices are modeled as an ambiguity set based on the Wasserstein metric, which contains a family of possible probability distributions of uncertain parameters. Then, a tractable Wasserstein distributionally robust counterpart is derived by using dual operation. Finally, a case study from the real-world refinery is performed to demonstrate the effectiveness of the proposed approach. The results show that compared with the traditional stochastic programming method, the data-driven Wasserstein distributionally robust optimization approach is less sensitive to variations of product prices, and provides optimal solutions with better out-of-sample performanceShow more
Chapter, 2021
Publication:IECON 2021 – 47th Annual Conference of the IEEE Industrial Electronics Society, 20211013, 1
Publisher:2021


Human Motion Prediction Using Manifold-Aware Wasserstein GAN
Authors:Angela Bartolo; Mohamed Daoudi; Naima Otberdout; Baptiste Chopin. 2021 16th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2021)
Summary:Human motion prediction aims to forecast future human poses given a prior pose sequence. The discontinuity of the predicted motion and the performance deterioration in long-term horizons are still the main challenges encountered in current literature. In this work, we tackle these issues by using a compact manifold-valued representation of human motion. Specifically, we model the temporal evolution of the 3D human poses as trajectory, what allows us to map human motions to single points on a sphere manifold. To learn these non-Euclidean representations, we build a manifold-aware Wasserstein generative adversarial model that captures the temporal and spatial dependencies of human motion through different losses. Extensive experiments show that our approach outperforms the state-of-the-art on CMU MoCap and Human 3.6M datasets. Our qualitative results show the smoothness of the predicted motionsShow more
Chapter, 2021
Publication:2021 16th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2021), 202112, 1
Publisher:2021

A Sliced Wasserstein Loss for Neural Texture Synthesis
Authors:Laurent Belcour; Thomas Chambon; Kenneth Vanhoey; Eric Heitz. 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
Summary:We address the problem of computing a textural loss based on the statistics extracted from the feature activations of a convolutional neural network optimized for object recognition (e.g. VGG-19). The underlying mathematical problem is the measure of the distance between two distributions in feature space. The Gram-matrix loss is the ubiquitous approximation for this problem but it is subject to several shortcomings. Our goal is to promote the Sliced Wasserstein Distance as a replacement for it. It is theoretically proven, practical, simple to implement, and achieves results that are visually superior for texture synthesis by optimization or training generative neural networksShow more
Chapter, 2021
Publication:2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 202106, 9407
Publisher:2021
Cited by 29
Related articles All 6 versions


Fault injection in optical path - detection quality degradation analysis with Wasserstein distance
Authors:Pawel Kowalczyk; Paulina Bugiel; Marcin Szelest; Jacek Izydorczyk. 2021 25th International Conference on Methods and Models in Automation and Robotics (MMAR)
Summary:The goal of this paper is to present results of analysis of artificially generated disturbances imitating real defects of camera that occurs in the process of testing autonomous vehicles both during rides and later, in vehicle software simulation and their impact on quality of object detection. We focus on one perception module responsible for detection of other moving vehicles on the road. Injected faults are obliteration by particles and pulsating blinding. At the same time, we want to propose an examination approach scheme that will provide detailed information about distribution of quality in this comparative experiment. The method can be reused for different perception modules, faults, scene sets and also in order to compare new releases of main recognition software. To do so we combine statistical methods (Welch's ANOVA) and topological analysis (Clusterization over space of distributions, Wasserstein metric). Work provides a summary of the experiment for all data used and described by mentioned tools and examples of certain cases that illustrate general conclusionsShow more
Chapter, 2021
Publication:2021 25th International Conference on Methods and Models in Automation and Robotics (MMAR), 20210823, 121
Publisher:2021


Breast Cancer Histopathology Image Super-Resolution Using Wide-Attention GAN with Improved Wasserstein Gradient Penalty and Perceptual Loss
Author:Shahidi F.
Article, 2021
Publication:IEEE Access, 2021
Publisher:2021

 Cited by 14 Related articles All 2 versions

<——2021———2021——2950—- 


AWCD: An Efficient Point Cloud Processing Approach via Wasserstein Curvature
Authors:Yihao Luo; Ailing Yang; Fupeng Sun; Huafei Sun. 2021 IEEE International Conference on Artificial Intelligence and Computer Applications (ICAICA)
Summary:In this paper, we introduce the adaptive Wasserstein curvature denoising (AWCD), an original processing approach for point cloud data. By collecting curvatures information from Wasserstein distance, AWCD consider more precise structures of data and preserves stability and effectiveness even for data with noise in high density. This paper contains some theoretical analysis about the Wasserstein curvature and the complete algorithm of AWCD. In addition, we design digital experiments to show the denoising effect of AWCD. According to comparison results, we present the advantages of AWCD against traditional algorithmsShow more
Chapter, 2021
Publication:2021 IEEE International Conference on Artificial Intelligence and Computer Applications (ICAICA), 20210628, 847
Publisher:2021

Unsupervised Graph Alignment with Wasserstein Distance Discriminator
Authors:Ji Gao (Author), Xiao Huang (Author), Jundong Li (Author)
Summary:Graph alignment aims to identify node correspondence across multiple graphs, with significant implications in various domains. As supervision information is often not available, unsupervised methods have attracted a surge of research interest recently. Most of existing unsupervised methods assume that corresponding nodes should have similar local structure, which, however, often does not hold. Meanwhile, rich node attributes are often available and have shown to be effective in alleviating the above local topology inconsistency issue. Motivated by the success of graph convolution networks (GCNs) in fusing network and node attributes for various learning tasks, we aim to tackle the graph alignment problem on the basis of GCNs. However, directly grafting GCNs to graph alignment is often infeasible due to multi-faceted challenges. To bridge the gap, we propose a novel unsupervised graph alignment framework WAlign. We first develop a lightweight GCN architecture to capture both local and global graph patterns and their inherent correlations with node attributes. Then we prove that in the embedding space, obtaining optimal alignment results is equivalent to minimizing the Wasserstein distance between embeddings of nodes from different graphs. Towards this, we propose a novel Wasserstein distance discriminator to identify candidate node correspondence pairs for updating node embeddings. The whole process acts like a two-player game, and in the end, we obtain discriminative embeddings that are suitable for the alignment task. Extensive experiments on both synthetic and real-world datasets validate the effectiveness and efficiency of the proposed framework WAlignShow more
Chapter, 2021
Publication:Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining, 20210814, 426
Publisher:2021


2021
DeepACG: Co-Saliency Detection via Semantic-aware Contrast Gromov-Wasserstein Distance
Authors:Qingshan Liu; Xiao-Tong Yuan; Bo Liu; Mingliang Dong; Kaihua Zhang. 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
Summary:The objective of co-saliency detection is to segment the co-occurring salient objects in a group of images. To address this task, we introduce a new deep network architecture via semantic-aware contrast Gromov-Wasserstein distance (DeepACG). We first adopt the Gromov-Wasserstein (GW) distance to build dense 4D correlation volumes for all pairs of image pixels within the image group. These dense correlation volumes enable the network to accurately discover the structured pair-wise pixel similarities among the common salient objects. Second, we develop a semantic-aware co-attention module (SCAM) to enhance the foreground co-saliency through predicted categorical information. Specifically, SCAM recognizes the semantic class of the foreground co-objects, and this information is then modulated to the deep representations to localize the related pixels. Third, we design a contrast edge-enhanced module (EEM) to capture richer contexts and preserve fine-grained spatial information. We validate the effectiveness of our model using three largest and most challenging benchmark datasets (Cosal2015, CoCA, and CoSOD3k). Extensive experiments have demonstrated the substantial practical merit of each module. Compared with the existing works, DeepACG shows significant improvements and achieves state-of-the-art performanceShow more
Chapter, 2021
Publication:2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 202106, 13698
Publisher:2021

DeepACG: Co-Saliency Detection via Semantic-aware Contrast Gromov-Wasserstein Distance

Zhang, KH; Dong, ML; (...); Liu, QS

IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)

2021 | 

2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2021

 , pp.13698-13707

The objective of co-saliency detection is to segment the co-occurring salient objects in a group of images. To address this task, we introduce a new deep network architecture via semantic-aware contrast Gromov-Wasserstein distance (DeepACG). We first adopt the Gromov-Wasserstein (GW) distance to build dense 4D correlation volumes for all pairs of image pixels within the image group. These dense


Full Text at Publisher

61 References  Related records
 
WATCH: Wasserstein Change Point Detection for High-Dimensional Time Series Data
Authors:Nathalie Japkowicz; Michael Baron; Bartlomiej Sniezynski; Roberto Corizzo; Kamil Faber. 2021 IEEE International Conference on Big Data (Big Data)
Summary:Detecting relevant changes in dynamic time series data in a timely manner is crucially important for many data analysis tasks in real-world settings. Change point detection methods have the ability to discover changes in an unsupervised fashion, which represents a desirable property in the analysis of unbounded and unlabeled data streams. However, one limitation of most of the existing approaches is represented by their limited ability to handle multivariate and high-dimensional data, which is frequently observed in modern applications such as traffic flow prediction, human activity recognition, and smart grids monitoring. In this paper, we attempt to fill this gap by proposing WATCH, a novel Wasserstein distance-based change point detection approach that models an initial distribution and monitors its behavior while processing new data points, providing accurate and robust detection of change points in dynamic high-dimensional data. An extensive experimental evaluation involving a large number of benchmark datasets shows that WATCH is capable of accurately identifying change points and outperforming state-of-the-art methodsShow more
Chapter, 2021
Publication:2021 IEEE International Conference on Big Data (Big Data), 202112, 4450
Publisher:2021
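
A toy univariate illustration of distance-based change point detection in the spirit of the abstract above, using scipy's 1D 1-Wasserstein distance between a reference window and the current window; this is not the WATCH algorithm itself (which targets high-dimensional streams), and the window size and threshold are arbitrary.

```python
import numpy as np
from scipy.stats import wasserstein_distance

def detect_changes(stream, window=100, threshold=0.5):
    """Flag indices where the current window drifts away from the reference window."""
    reference = stream[:window]
    alarms = []
    for start in range(window, len(stream) - window, window):
        current = stream[start:start + window]
        if wasserstein_distance(reference, current) > threshold:
            alarms.append(start)
            reference = current          # re-anchor after a detected change
    return alarms

rng = np.random.default_rng(0)
stream = np.concatenate([rng.normal(0, 1, 500), rng.normal(2, 1, 500)])
print(detect_changes(stream))            # expect an alarm near index 500
```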



Sensitivity analysis of Wasserstein distributionally robust optimization problems
Authors:Daniel Bartl; Samuel Drapeau; Jan Obłój; Johannes Wiesel
Summary:We consider sensitivity of a generic stochastic optimization problem to model uncertainty. We take a non-parametric approach and capture model uncertainty using Wasserstein balls around the postulated model. We provide explicit formulae for the first-order correction to both the value function and the optimizer and further extend our results to optimization under linear constraints. We present applications to statistics, machine learning, mathematical finance and uncertainty quantification. In particular, we provide an explicit first-order approximation for square-root LASSO regression coefficients and deduce coefficient shrinkage compared to the ordinary least-squares regression. We consider robustness of call option pricing and deduce a new Black-Scholes sensitivity, a non-parametric version of the so-called Vega. We also compute sensitivities of optimized certainty equivalents in finance and propose measures to quantify robustness of neural networks to adversarial examplesShow more
Article, 2021
Publication:Proceedings of the Royal Society A, 477, 20211222
Publisher:2021


2021


Speech Enhancement Approach Based on Relativistic Wasserstein Generation Adversarial Networks
Authors:Zhi Li; Jing Huang. 2021 International Conference on Wireless Communications and Smart Grid (ICWCSG)
Summary:As a pre-processing technology in other speech applications, speech enhancement technology is one of the kernel technologies in the field of information science. Recent research has found that generative adversarial networks have achieved a lot of research results in the field of speech enhancement. But generative adversarial networks are more difficult to train and stabilize in its application to speech enhancement. In this paper, we proposed a speech enhancement method based on relativistic Wasserstein generative adversarial networks. The experimental results show that it provides more stable training results, speeds up the convergence of the network, and produces a better generative model to improve speech enhancement without increasing the computational effort. The model provides better performance in many methods of comparing assessment metricsShow more
Chapter, 2021
Publication:2021 International Conference on Wireless Communications and Smart Grid (ICWCSG), 202108, 313
Publisher:2021


A Regularized Wasserstein Framework for Graph Kernels
Authors:Stephen Gould; Qing Wang; Asiri Wijesinghe. 2021 IEEE International Conference on Data Mining (ICDM)
Summary:We propose a learning framework for graph kernels, which is theoretically grounded on regularizing optimal transport. This framework provides a novel optimal transport distance metric, namely Regularized Wasserstein (RW) discrepancy, which can preserve both features and structure of graphs via Wasserstein distances on features and their local variations, local barycenters and global connectivity. Two strongly convex regularization terms are introduced to improve the learning ability. One is to relax an optimal alignment between graphs to be a cluster-to-cluster mapping between their locally connected vertices, thereby preserving the local clustering structure of graphs. The other is to take into account node degree distributions in order to better preserve the global structure of graphs. We also design an efficient algorithm to enable a fast approximation for solving the optimization problem. Theoretically, our framework is robust and can guarantee the convergence and numerical stability in optimization. We have empirically validated our method using 12 datasets against 16 state-of-the-art baselines. The experimental results show that our method consistently outperforms all state-of-the-art methods on all benchmark databases for both graphs with discrete attributes and graphs with continuous attributesShow more
Chapter, 2021
Publication:2021 IEEE International Conference on Data Mining (ICDM), 202112, 739
Publisher:2021


Wasserstein Barycenter for Multi-Source Domain Adaptation
Authors:Fred Maurice Ngole Mboula; Eduardo Fernandes Montesuma. 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
Summary:Multi-source domain adaptation is a key technique that allows a model to be trained on data coming from various probability distribution. To overcome the challenges posed by this learning scenario, we propose a method for constructing an intermediate domain between sources and target domain, the Wasserstein Barycenter Transport (WBT). This method relies on the barycenter on Wasserstein spaces for aggregating the source probability distributions. Once the sources have been aggregated, they are transported to the target domain using standard Optimal Transport for Domain Adaptation framework. Additionally, we revisit previous single-source domain adaptation tasks in the context of multi-source scenario. In particular, we apply our algorithm to object and face recognition datasets. Moreover, to diversify the range of applications, we also examine the tasks of music genre recognition and music-speech discrimination. The experiments show that our method has similar performance with the existing state-of-the-artShow more
Chapter, 2021
Publication:2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 202106, 16780
Publisher:2021

Temporal conditional Wasserstein GANs for audio-visual affect-related ties
Authors:Stelios Asteriadis; Enrique Hortal; Christos Athanasiadis. 2021 9th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW)
Summary:Emotion recognition through audio is a rather challenging task that entails proper feature extraction and classification. Meanwhile, state-of-the-art classification strategies are usually based on deep learning architectures. Training complex deep learning networks normally requires very large audiovisual corpora with available emotion annotations. However, such availability is not always guaranteed since harvesting and annotating such datasets is a time-consuming task. In this work, temporal conditional Wasserstein Generative Adversarial Networks (tc-wGANs) are introduced to generate robust audio data by leveraging information from a face modality. Having as input temporal facial features extracted using a dynamic deep learning architecture (based on 3dCNN, LSTM and Transformer networks) and, additionally, conditional information related to annotations, our system manages to generate realistic spectrograms that represent audio clips corresponding to specific emotional context. As proof of their validity, apart from three quality metrics (Frechet Inception Distance, Inception Score and Structural Similarity index), we verified the generated samples applying an audio-based emotion recognition schema. When the generated samples are fused with the initial real ones, an improvement between 3.5 to 5.5% was achieved in audio emotion recognition performance for two state-of-the-art datasetsShow more
Chapter, 2021
Publication:2021 9th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW), 202109, 1
Publisher:2021


Multi-source Cross Project Defect Prediction with Joint Wasserstein Distance and Ensemble Learning
Authors:Hao Xu; Zhanyu Yang; Lu Lu; Quanyi Zou. 2021 IEEE 32nd International Symposium on Software Reliability Engineering (ISSRE)
Summary:Cross-Project Defect Prediction (CPDP) refers to transferring knowledge from source software projects to a target software project. Previous research has shown that the impacts of knowledge transferred from different source projects differ on the target task. Therefore, one of the fundamental challenges in CPDP is how to measure the amount of knowledge transferred from each source project to the target task. This article proposed a novel CPDP method called Multi-source defect prediction with Joint Wasserstein Distance and Ensemble Learning (MJWDEL) to learn transferred weights for evaluating the importance of each source project to the target task. In particular, first of all, applying the TCA technique and Logistic Regression (LR) train a sub-model for each source project and the target project. Moreover, the article designs joint Wassertein distance to understand the source-target relationship and then uses this as a basis to compute the transferred weights of different sub-models. After that, the transferred weights can be used to reweight these sub-models to determine their importance in knowledge transfer to the target task. We conducted experiments on 19 software projects from PROMISE, NASA and AEEEM datasets. Compared with several state-of-the-art CPDP methods, the proposed method substantially improves CPDP performance in terms of four evaluation indicators (i.e., F-measure, Balance, G-measure and MMC)Show more
Chapter, 2021
Publication:2021 IEEE 32nd International Symposium on Software Reliability Engineering (ISSRE), 202110, 57
Publisher:2021

<——2021———2021——2960—- 



SRWGANTV: Image Super-Resolution Through Wasserstein Generative Adversarial Networks with Total Variational Regularization
Authors:Jun Shao; Liang Chen; Yi Wu. 2021 IEEE 13th International Conference on Computer Research and Development (ICCRD)
Summary:The study of generative adversarial networks (GAN) has enormously promoted the research work on single image super-resolution (SISR) problem. SRGAN firstly apply GAN to SISR reconstruction, which has achieved good results. However, SRGAN sacrifices the fidelity. At the same time, it is well known that the GANs are difficult to train and the improper training fails the SISR results easily. Recently, Wasserstein Generative Adversarial Network with gradient penalty (WGAN-GP) has been proposed to alleviate these issues at the expense of performance of the model with a relatively simple training process. However, we find that applying WGAN-GP to SISR still suffers from training instability, leading to failure to obtain a good SR result. To address this problem, we present an image super resolution framework base on enhanced WGAN (SRWGAN-TV). We introduce the total variational (TV) regularization term into the loss function of WGAN. The total variational (TV) regularization term can stabilize the network training and improve the quality of generated images. Experimental results on public datasets show that the proposed method achieves superior performance in both quantitative and qualitative measurementsShow more
Chapter, 2021
Publication:2021 IEEE 13th International Conference on Computer Research and Development (ICCRD), 20210105, 21
Publisher:2021

Cited by 3 Related articles


Fair Graph Auto-Encoder for Unbiased Graph Representations with Wasserstein Distance
Authors:Yanjie Fu; Hui Xiong; Hao Liu; Rui Xie; Kunpeng Liu; Wei Fan. 2021 IEEE International Conference on Data Mining (ICDM)
Summary:The fairness issue is very important in deploying machine learning models as algorithms widely used in human society can be easily in discrimination. Researchers have studied disparity on tabular data a lot and proposed many methods to relieve bias. However, studies towards unfairness in graph are still at early stage while graph data that often represent connections among people in real-world applications can easily give rise to fairness issues and thus should be attached to great importance. Fair representation learning is one of the most effective methods to relieve bias, which aims to generate hidden representations of input data while obfuscating sensitive information. In graph setting, learning fair representations of graph (also called fair graph embeddings) is effective to solve graph unfairness problems. However, most existing works of fair graph embeddings only study fairness in a coarse granularity (i.e., group fairness), but overlook individual fairness. In this paper, we study fair graph representations from different levels. Specifically, we consider both group fairness and individual fairness on graph. To debias graph embeddings, we propose FairGAE, a fair graph auto-encoder model, to derive unbiased graph embeddings based on the tailor-designed fair Graph Convolution Network (GCN) layers. Then, to achieve multi-level fairness, we design a Wasserstein distance based regularizer to learn the optimal transport for fairer embeddings. To overcome the efficiency concern, we further bring up Sinkhorn divergence as the approximations of Wasserstein cost for computation. Finally, we apply the learned unbiased embeddings into the node classification task and conduct extensive experiments on two real-world graph datasets to demonstrate the improved performances of our approachShow more
Chapter, 2021
Publication:2021 IEEE International Conference on Data Mining (ICDM), 202112, 1054
Publisher:2021

Wasserstein Contrastive Representation Distillation
Authors:Lawrence Carin; Ricardo Henao; Jingjing Liu; Zhe Gan; Dong Wang; Liqun Chen. 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
Summary:The primary goal of knowledge distillation (KD) is to encapsulate the information of a model learned from a teacher network into a student network, with the latter being more compact than the former. Existing work, e.g., using Kullback-Leibler divergence for distillation, may fail to capture important structural knowledge in the teacher network and often lacks the ability for feature generalization, particularly in situations when teacher and student are built to address different classification tasks. We propose Wasserstein Contrastive Representation Distillation (WCoRD), which leverages both primal and dual forms of Wasserstein distance for KD. The dual form is used for global knowledge transfer, yielding a contrastive learning objective that maximizes the lower bound of mutual information between the teacher and the student networks. The primal form is used for local contrastive knowledge transfer within a mini-batch, effectively matching the distributions of features between the teacher and the student networks. Experiments demonstrate that the proposed WCoRD method outperforms state-of-the-art approaches on privileged information distillation, model compression and cross-modal transferShow more
Chapter, 2021
Publication:2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 202106, 16291
Publisher:2021
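For reference, the "dual form" of the 1-Wasserstein distance invoked here is the standard Kantorovich-Rubinstein duality (textbook background, not a formula quoted from the paper):

    W_1(\mu, \nu) \;=\; \sup_{\|f\|_{\mathrm{Lip}} \le 1} \; \mathbb{E}_{x \sim \mu}[f(x)] \;-\; \mathbb{E}_{y \sim \nu}[f(y)]

In WGAN-style training the supremum is approximated by a neural critic constrained to be (approximately) 1-Lipschitz, which is what makes a contrastive, critic-based objective possible for global knowledge transfer.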


Wasserstein Coupled Graph Learning for Cross-Modal Retrieval
Authors:Jian Yang; Shaoxin Li; Pengcheng Shen; Yuge Huang; Zhen Cui; Xueya Zhang; Tong Zhang; Yun Wang. 2021 IEEE/CVF International Conference on Computer Vision (ICCV)
Summary:Graphs play an important role in cross-modal image-text understanding as they characterize the intrinsic structure which is robust and crucial for the measurement of crossmodal similarity. In this work, we propose a Wasserstein Coupled Graph Learning (WCGL) method to deal with the cross-modal retrieval task. First, graphs are constructed according to two input cross-modal samples separately, and passed through the corresponding graph encoders to extract robust features. Then, a Wasserstein coupled dictionary, containing multiple pairs of counterpart graph keys with each key corresponding to one modality, is constructed for further feature learning. Based on this dictionary, the input graphs can be transformed into the dictionary space to facilitate the similarity measurement through a Wasserstein Graph Embedding (WGE) process. The WGE could capture the graph correlation between the input and each corresponding key through optimal transport, and hence well characterize the inter-graph structural relationship. To further achieve discriminant graph learning, we specifically define a Wasserstein discriminant loss on the coupled graph keys to make the intra-class (counterpart) keys more compact and inter-class (non-counterpart) keys more dispersed, which further promotes the final cross-modal retrieval task. Experimental results demonstrate the effectiveness and state-of-the-art performance.
Chapter, 2021
Publication:2021 IEEE/CVF International Conference on Computer Vision (ICCV), 202110, 1793
Publisher:2021

Wasserstein Barycenter Transport for Acoustic Adaptation
Authors:Eduardo F. Montesuma; Fred-Maurice Ngole Mboula. ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Summary:The recognition of music genre and the discrimination between music and speech are important components of modern digital music systems. Depending on the acquisition conditions, such as background environment, these signals may come from different probability distributions, making the learning problem complicated. In this context, domain adaptation is a key theory to improve performance. Considering data coming from various background conditions, the adaptation scenario is called multi-source. This paper proposes a multi-source domain adaptation algorithm called Wasserstein Barycenter Transport, which transports the source domains to a target domain by creating an intermediate domain using the Wasserstein barycenter. Our method outperforms other state-of-the-art algorithms, and performs better than classifiers trained with target-only data.
Chapter, 2021
Publication:ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 20210606, 3405
Publisher:2021
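As background for the barycenter construction (a generic one-dimensional illustration, not the paper's multi-source acoustic pipeline), the 2-Wasserstein barycenter of one-dimensional empirical distributions can be computed by averaging quantile functions:

    import numpy as np

    def wasserstein_barycenter_1d(samples, weights=None, n_quantiles=100):
        # samples: list of 1-D arrays (one per source domain)
        # In 1-D, the W2 barycenter is obtained by averaging the quantile functions.
        if weights is None:
            weights = np.full(len(samples), 1.0 / len(samples))
        qs = np.linspace(0.0, 1.0, n_quantiles)
        quantiles = np.stack([np.quantile(s, qs) for s in samples])  # shape (k, n_quantiles)
        return weights @ quantiles  # quantile function of the barycenter

For the multivariate features used in acoustic adaptation the barycenter has no closed form and is computed iteratively; the 1-D case only conveys the idea of an "intermediate" distribution sitting between the source domains.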


2021



A Data Augmentation Method for Distributed Photovoltaic Electricity Theft Using Wasserstein Generative Adversarial Network
Authors:Jingge Li; Wenlong Liao; Ruiqi Yang; Zechun Chen. 2021 IEEE 5th Conference on Energy Internet and Energy System Integration (EI2)
Summary:Because of the concealment of distributed photovoltaic (PV) electricity theft, the number of electricity theft samples held by the power sector is insufficient, which results in low accuracy of electricity theft detection. Therefore, this paper proposes a data augmentation method about electricity theft samples of distributed PV using Wasserstein generative adversarial network (WGAN). First, through the confrontation training of WGAN's generator network and discriminator network, the neural network can learn the time correlation of PV electricity theft data sequence that is difficult to explicitly model, and generate new electricity theft samples with similar distributions to the real electricity theft samples. Then, three typical photovoltaic power stealing models are proposed, and a convolutional neural network (CNN) is constructed based on the data features of the electricity theft samples. Finally, a practical example verifies the effectiveness and adaptability of the method. The experimental results show that WGAN can consider the shape and distribution of the samples, and the generated electricity theft samples can significantly improve the performance of different classifiers.
Chapter, 2021
Publication:2021 IEEE 5th Conference on Energy Internet and Energy System Integration (EI2), 20211022, 3132
Publisher:2021
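The adversarial training summarized above follows the usual WGAN recipe. The sketch below shows only the generic Wasserstein critic/generator losses with weight clipping; the network sizes, the 96-dimensional load-profile length, and all hyperparameters are placeholders of mine, not values from the paper:

    import torch
    import torch.nn as nn

    critic = nn.Sequential(nn.Linear(96, 128), nn.LeakyReLU(0.2), nn.Linear(128, 1))
    generator = nn.Sequential(nn.Linear(32, 128), nn.ReLU(), nn.Linear(128, 96))
    opt_c = torch.optim.RMSprop(critic.parameters(), lr=5e-5)
    opt_g = torch.optim.RMSprop(generator.parameters(), lr=5e-5)

    def critic_step(real):
        z = torch.randn(real.size(0), 32)
        fake = generator(z).detach()
        loss = critic(fake).mean() - critic(real).mean()   # negative of the Wasserstein estimate
        opt_c.zero_grad(); loss.backward(); opt_c.step()
        for p in critic.parameters():                       # crude Lipschitz control via clipping
            p.data.clamp_(-0.01, 0.01)
        return loss.item()

    def generator_step(batch_size=64):
        z = torch.randn(batch_size, 32)
        loss = -critic(generator(z)).mean()
        opt_g.zero_grad(); loss.backward(); opt_g.step()
        return loss.item()

Replacing weight clipping with a gradient penalty (WGAN-GP) is the more common variant seen in later entries of this list.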


Wasserstein Embeddings for Nonnegative Matrix Factorization
Authors:Mickael Febrissy; Mohamed Nadif. International Conference on Machine Learning, Optimization, and Data Science
Summary:In the field of document clustering (or dictionary learning), the fitting error called the Wasserstein (In this paper, we use “Wasserstein”, “Earth Mover’s”, “Kantorovich–Rubinstein” interchangeably) distance showed some advantages for measuring the approximation of the original data. Further, it is able to capture redundant information, for instance synonyms in bag-of-words, which in practice cannot be retrieved using classical metrics. However, despite the use of smoothed approximation allowing faster computations, this distance suffers from its high computational cost and remains uneasy to handle with a substantial amount of data. To circumvent this issue, we propose a different scheme of NMF relying on the Kullback-Leibler divergence for the term approximating the original data and a regularization term consisting in the approximation of the Wasserstein embeddings in order to leverage more semantic relations. With experiments on benchmark datasets, the results show that our proposal achieves good clustering and support for visualizing the clusters.
Chapter, 2021
Publication:Machine Learning, Optimization, and Data Science: 6th International Conference, LOD 2020, Siena, Italy, July 19–23, 2020, Revised Selected Papers, Part I, 20210108, 309
Publisher:2021


Human Motion Generation using Wasserstein GAN
Authors:Ayumi Shiobara (Author), Makoto Murakami (Author)
Summary:Human motion control, edit, and synthesis are important tasks to create 3D computer graphics video games or movies, because some characters act like humans in most of them. Our aim is to construct a system which can generate various natural character motions. We assume that the process of human motion generation is complex and nonlinear, and it can be modeled by deep neural networks. However, this process cannot be observed, and it needs to be estimated by learning from observable human motion data. On the other hand, the process of discrimination which is opposite to the generation is also modeled by deep neural networks. And the generator and discriminator are trained using human motion data. In this paper we constructed a human motion generative model using Wasserstein GAN. As a result, our model can generate various human motions from a 512-dimensional latent space.
Chapter, 2021
Publication:2021 5th International Conference on Digital Signal Processing, 20210226, 278
Publisher:2021


DeepACG: Co-Saliency Detection via Semantic-aware Contrast Gromov-Wasserstein Distance
Authors:Kaihua Zhang; Mingliang Dong; Bo Liu; Xiao-Tong Yuan; Qingshan Liu. 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
Summary:The objective of co-saliency detection is to segment the co-occurring salient objects in a group of images. To address this task, we introduce a new deep network architecture via semantic-aware contrast Gromov-Wasserstein distance (DeepACG). We first adopt the Gromov-Wasserstein (GW) distance to build dense 4D correlation volumes for all pairs of image pixels within the image group. These dense correlation volumes enable the network to accurately discover the structured pair-wise pixel similarities among the common salient objects. Second, we develop a semantic-aware co-attention module (SCAM) to enhance the foreground co-saliency through predicted categorical information. Specifically, SCAM recognizes the semantic class of the foreground co-objects, and this information is then modulated to the deep representations to localize the related pixels. Third, we design a contrast edge-enhanced module (EEM) to capture richer contexts and preserve fine-grained spatial information. We validate the effectiveness of our model using three largest and most challenging benchmark datasets (Cosal2015, CoCA, and CoSOD3k). Extensive experiments have demonstrated the substantial practical merit of each module. Compared with the existing works, DeepACG shows significant improvements and achieves state-of-the-art performance.
[PDF] thecvf.com

Deepacg: Co-saliency detection via semantic-aware contrast gromov-wasserstein distance

K Zhang, M Dong, B Liu, XT Yuan… - Proceedings of the …, 2021 - openaccess.thecvf.com

… To address this task, we introduce a new deep network architecture via semantic-aware 

contrast Gromov-Wasserstein distance (DeepACG). We first adopt the Gromov-Wasserstein (GW…

Cited by 10 Related articles All 4 versions
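As a generic illustration of the Gromov-Wasserstein distance that DeepACG builds on, the following sketch uses the Python Optimal Transport (POT) package on toy point clouds rather than the paper's 4D pixel correlation volumes; exact POT signatures may differ between versions, so treat this as an assumption-laden example:

    import numpy as np
    import ot  # Python Optimal Transport (POT)

    # Two point clouds living in different spaces, compared only through their
    # intra-cloud distance structure, which is what Gromov-Wasserstein measures.
    rng = np.random.default_rng(0)
    X, Y = rng.normal(size=(30, 2)), rng.normal(size=(40, 3))
    C1, C2 = ot.dist(X, X), ot.dist(Y, Y)     # pairwise cost matrices within each cloud
    C1 /= C1.max(); C2 /= C2.max()
    p, q = ot.unif(len(X)), ot.unif(len(Y))   # uniform weights on the points
    T = ot.gromov.gromov_wasserstein(C1, C2, p, q, loss_fun='square_loss')
    print(T.shape)                            # (30, 40) soft correspondence between the clouds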


Conditional Wasserstein Generative Adversarial Networks for Fast Detector Simulation
Authors:John Blue; Braden Kronheim; Michelle Kuchera; Raghuram Ramanujan
Summary:Detector simulation in high energy physics experiments is a key yet computationally expensive step in the event simulation process. There has been much recent interest in using deep generative models as a faster alternative to the full Monte Carlo simulation process in situations in which the utmost accuracy is not necessary. In this work we investigate the use of conditional Wasserstein Generative Adversarial Networks to simulate both hadronization and the detector response to jets. Our model takes the 4-momenta of jets formed from partons post-showering and pre-hadronization as inputs and predicts the 4-momenta of the corresponding reconstructed jet. Our model is trained on fully simulated tt events using the publicly available GEANT-based simulation of the CMS Collaboration. We demonstrate that the model produces accurate conditional reconstructed jet transverse momentum (pT) distributions over a wide range of pT for the input parton jet. Our model takes only a fraction of the time necessary for conventional detector simulation methods, running on a CPU in less than a millisecond per event.
Article, 2021
Publication:EPJ Web of Conferences, 251, 2021

 <——2021———2021——2970—-



Data Augmentation of Wrist Pulse Signal for Traditional Chinese Medicine Using Wasserstein GAN
Authors:Jiaxing Chang (Author), Fei Hu (Author), Huaxing Xu (Author), Xiaobo Mao (Author), Yuping Zhao (Author), Luqi Huang (Author)
Summary:Pulse diagnosis has been widely used in traditional Chinese medicine (TCM) for thousands of years. Recently, with the availability and improvement of advanced and portable sensor technology, computational pulse diagnosis has been obtaining more and more attention. In this field, pulse diagnosis based on deep learning shows promising performance. However, the availability of labeled data is limited, due to lengthy experiments or data privacy. In this paper, for the first time, we propose a novel one-dimensional Wasserstein generative adversarial network (WGAN) model, which can learn the statistical characteristics of the wrist pulse signal and augment its dataset size. Visual inspection and experimental evaluations with two quantitative metrics demonstrated that the generated data has good fidelity. We hope this research opens up opportunities for researchers in TCM to further improve the performance of pulse diagnosis algorithms, further facilitating the modernization of TCM.
Chapter, 2021
Publication:Proceedings of the 2nd International Symposium on Artificial Intelligence for Medicine Sciences, 20211029, 426
Publisher:2021

Cited by 2 Related articles


Wasserstein Contrastive Representation Distillation
Authors:Liqun Chen; Dong Wang; Zhe Gan; Jingjing Liu; Ricardo Henao; Lawrence Carin. 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
Summary:The primary goal of knowledge distillation (KD) is to encapsulate the information of a model learned from a teacher network into a student network, with the latter being more compact than the former. Existing work, e.g., using Kullback-Leibler divergence for distillation, may fail to capture important structural knowledge in the teacher network and often lacks the ability for feature generalization, particularly in situations when teacher and student are built to address different classification tasks. We propose Wasserstein Contrastive Representation Distillation (WCoRD), which leverages both primal and dual forms of Wasserstein distance for KD. The dual form is used for global knowledge transfer, yielding a contrastive learning objective that maximizes the lower bound of mutual information between the teacher and the student networks. The primal form is used for local contrastive knowledge transfer within a mini-batch, effectively matching the distributions of features between the teacher and the student networks. Experiments demonstrate that the proposed WCoRD method outperforms state-of-the-art approaches on privileged information distillation, model compression and cross-modal transfer.
Chapter, 2021
Publication:2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 202106, 16291
Publisher:2021

Self-Supervised Wasserstein Pseudo-Labeling for Semi-Supervised Image Classification
Authors:Fariborz Taherkhani; Ali Dabouei; Sobhan Soleymani; Jeremy Dawson; Nasser M. Nasrabadi. 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
Summary:The goal is to use Wasserstein metric to provide pseudo labels for the unlabeled images to train a Convolutional Neural Networks (CNN) in a Semi-Supervised Learning (SSL) manner for the classification task. The basic premise in our method is that the discrepancy between two discrete empirical measures (e.g., clusters) which come from the same or similar distribution is expected to be less than the case where these measures come from completely two different distributions. In our proposed method, we first pre-train our CNN using a self-supervised learning method to make a cluster assumption on the unlabeled images. Next, inspired by the Wasserstein metric which considers the geometry of the metric space to provide a natural notion of similarity between discrete empirical measures, we leverage it to cluster the unlabeled images and then match the clusters to their similar class of labeled images to provide a pseudo label for the data within each cluster. We have evaluated and compared our method with state-of-the-art SSL methods on the standard datasets to demonstrate its effectiveness.
Chapter, 2021
Publication:2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 202106, 12262
Publisher:2021
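A toy version of the cluster-to-class matching idea is sketched below. It is not the authors' actual procedure, which works on CNN feature distributions; here each cluster and each labeled class is summarized by a 1-D feature sample and matched with a minimum-cost assignment:

    import numpy as np
    from scipy.optimize import linear_sum_assignment
    from scipy.stats import wasserstein_distance

    def match_clusters_to_classes(cluster_feats, class_feats):
        # cluster_feats, class_feats: lists of 1-D arrays of (projected) features
        cost = np.zeros((len(cluster_feats), len(class_feats)))
        for i, cf in enumerate(cluster_feats):
            for j, lf in enumerate(class_feats):
                cost[i, j] = wasserstein_distance(cf, lf)   # distributional discrepancy
        rows, cols = linear_sum_assignment(cost)            # minimum-cost one-to-one matching
        return dict(zip(rows, cols))                        # cluster index -> pseudo-label class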

Cited by 17 Related articles All 6 versions


High Impedance Fault Diagnosis Method Based on Conditional Wasserstein Generative Adversarial Network
Authors:Wen-Li Liu; Mou-Fa Guo; Jian-Hong Gao. 2021 IEEE 2nd China International Youth Conference on Electrical Engineering (CIYCEE)
Summary:Data-driven fault diagnosis of high impedance fault (HIF) has received increasing attention and achieved fruitful results. However, HIF data is difficult to obtain in engineering. Furthermore, there exists an imbalance between the fault data and non-fault data, making data-driven methods hard to detect HIFs reliably under the small imbalanced sample condition. To solve this problem, this paper proposes a novel HIF diagnosis method based on conditional Wasserstein generative adversarial network (WCGAN). By adversarial training, WCGAN can generate sufficient labeled zero-sequence current signals, which can expand the limited training set and achieve the balanced distribution of the samples. In addition, the Wasserstein distance was introduced to improve the loss function. Experimental results indicate that the proposed method can generate high-quality samples and achieve a high accuracy rate of fault detection in the case of small imbalanced samples.
Chapter, 2021
Publication:2021 IEEE 2nd China International Youth Conference on Electrical Engineering (CIYCEE), 20211215, 1
Publisher:2021


Two-sample Test using Projected Wasserstein Distance
Authors:Jie Wang; Rui Gao; Yao Xie. 2021 IEEE International Symposium on Information Theory (ISIT)
Summary:We develop a projected Wasserstein distance for the two-sample test, a fundamental problem in statistics and machine learning: given two sets of samples, to determine whether they are from the same distribution. In particular, we aim to circumvent the curse of dimensionality in Wasserstein distance: when the dimension is high, it has diminishing testing power, which is inherently due to the slow concentration property of Wasserstein metrics in the high dimension space. A key contribution is to couple optimal projection to find the low dimensional linear mapping to maximize the Wasserstein distance between projected probability distributions. We characterize theoretical properties of the two-sample convergence rate on IPMs and this new distance. Numerical examples validate our theoretical results.
Chapter, 2021
Publication:2021 IEEE International Symposium on Information Theory (ISIT), 20210712, 3320
Publisher:2021
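A crude stand-in for the test statistic described above: the paper optimizes the projection, while the sketch below simply searches random unit directions, which already conveys how projecting to 1-D sidesteps the curse of dimensionality. Names and parameters are my own choices:

    import numpy as np
    from scipy.stats import wasserstein_distance

    def projected_w1(X, Y, n_directions=200, seed=0):
        # X, Y: (n, d) sample matrices; returns the largest 1-D Wasserstein
        # distance found over random unit projection directions.
        rng = np.random.default_rng(seed)
        best = 0.0
        for _ in range(n_directions):
            u = rng.normal(size=X.shape[1])
            u /= np.linalg.norm(u)
            best = max(best, wasserstein_distance(X @ u, Y @ u))
        return best

In practice a permutation test over projected_w1(X, Y) would supply the p-value for the two-sample decision.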




Statistical Learning in Wasserstein Space
Authors:Karimi, A (Creator); Ripani, L (Creator); Georgiou, TT (Creator)
Summary:We seek a generalization of regression and principle component analysis (PCA) in a metric space where data points are distributions metrized by the Wasserstein metric. We recast these analyses as multimarginal optimal transport problems. The particular formulation allows efficient computation, ensures existence of optimal solutions, and admits a probabilistic interpretation over the space of paths (line segments). Application of the theory to the interpolation of empirical distributions, images, power spectra, as well as assessing uncertainty in experimental designs, is envisioned.
Downloadable Article, 2021-07-01
Undefined
Publication:IEEE Control Systems Letters
Publisher:eScholarship, University of California, 2021-07-01

2021


Ёж Латте и Водный Камень : приключение первое [Hedgehog Latte and the Water Stone: the first adventure]
 / Ëzh Latte i Vodnyĭ Kamenʹ : prikluchenie pervoe
Authors:Себастьян Любек (Sebastian Lybeck) (Author); Даниэль Напп (Daniel Napp) (Illustrator); Яна Садовникова (Yana Sadovnikova) (Translator) [illustrations by Daniel Napp; translated from the German by Yana Sadovnikova]
Summary:"На лес надвигается смертельная опасность: засуха! Ужасное горе для всех обитателей и, конечно, для ежа Латте. Единственный способ спасти родные земли -- найти Водный Камень. А чтобы достать его, нужно отправиться в Северный лес, обойти тысячу стражей и обхитрить самого короля медведей. Такая задача под силу только самому смелому и умному герою. Такому, как ёж Латте"--Troykaonline.com [A deadly danger is approaching the forest: drought! A terrible misfortune for all its inhabitants and, of course, for Latte the hedgehog. The only way to save the native lands is to find the Water Stone. And to get it, one must travel to the Northern Forest, get past a thousand guards and outwit the bear king himself. Such a task is only within the power of the bravest and cleverest hero. One like Latte the hedgehog.]
Print Book, 2021
Russian
Publisher:Эксмодетство, Moskva, 2021

Peer-reviewed
Exponential convergence in entropy and Wasserstein for McKean-Vlasov SDEs
Authors:Panpan Ren; Feng-Yu Wang
Article, 2021
Publication:Nonlinear analysis, 206, 2021
Publisher:2021


Statistical Learning in Wasserstein Space - NSF PAR

https://par.nsf.gov › servlets › purl

by A Karimi · 2021 · Cited by 9 — Amirhossein Karimi and Tryphon T. Georgiou are with the Department of Mechanical and Aerospace Engineering, University of California at Irvine, CA 92617 USA (e ...
Statistical Learning in Wasserstein Space
Authors:Karimi A.; Georgiou T.T.; Ripani L.


Temporal conditional Wasserstein GANs for audio-visual affect-related ties
Authors:Christos Athanasiadis; Enrique Hortal; Stelios Asteriadis
Book
Publication:2021

Related articles All 5 versions


Conditional Wasserstein Generative Adversarial Networks for Rebalancing Iris Image Datasets
Authors:Yung-Hui LI; Muhammad Saqlain ASLAM; Latifa Nabila HARFIYA; Ching-Chun CHANG
Downloadable Article, 2021
Publication:IEICE Transactions on Information and Systems, 104.D, 20210901, 1450
Publisher:2021
Node2coords: Graph Representation Learning with Wasserstein Barycenters
Authors:Effrosyni Simou; Dorina Thanou; Pascal Frossard
Article, 2021
Publication:IEEE transactions on signal and information processing over networks, 7, 2021, 17
Publisher:2021

<——2021———2021——2980—


深層マルコフモデルのWasserstein距離を用いた学習
Authors:福田 紘平; 星野 健太. 自動制御連合講演会講演論文集 64回自動制御連合講演会
Downloadable Article, 2021
Publication:自動制御連合講演会講演論文集 64回自動制御連合講演会, 2021, 152
Publisher:2021
[Japanese: Learning using the Wasserstein distance of deep Markov models. Authors: Kohei Fukuda, Kenta Hoshino. Proceedings of the 64th Joint Conference on Automatic Control.]


2021

Wasserstein Distances, Geodesics and Barycenters of Merge ...

https://ieeexplore.ieee.org › document

by M Pont · 2021 · Cited by 17 — PubMed ID: 34596544 ; INSPEC Accession Number: 21506059 ; DOI: 10.1109/TVCG.2021.3114839 ; Persistent Link: https://xplorestaging.ieee.org/servlet/ ...

DOI: 10.1109/TVCG.2021.3114839



[PDF] neurips.cc

Quantum wasserstein generative adversarial networks

S Chakrabarti, H Yiming, T Li… - Advances in Neural …, 2019 - proceedings.neurips.cc

… Specifically, we propose a definition of the Wasserstein semimetric between quantum data, 

… to turn the quantum Wasserstein semimetric into a concrete design of quantum WGANs that …

Save Cite Cited by 49 Related articles All 7 versions



[PDF] iop.org

Full View

Wasserstein metric for improved quantum machine learning with adjacency matrix representations

O Çaylak, OA von Lilienfeld… - … Learning: Science and …, 2020 - iopscience.iop.org

… We study the Wasserstein metric to measure distances between molecules represented by 

… Resulting machine learning models of quantum properties, aka quantum machine learning …

Cited by 13 Related articles All 7 versions


 

[PDF] iop.org

Wasserstein space as state space of quantum mechanics and optimal transport

MF Rosyid, K Wahyuningsih - Journal of Physics: Conference …, 2019 - iopscience.iop.org

… space P2(Σ(A)) which is called Wasserstein space. Let B be any other observable. It can be 

… We will investigate the Wasserstein spaces over the spectrums of a quantum observables, …

Related articles All 3 versions


2021


arXiv:2201.07125  [pdf, other]  cs.AI  doi: 10.1109/BigData52589.2021.9671962
WATCH: Wasserstein Change Point Detection for High-Dimensional Time Series Data
Authors: Kamil Faber; Roberto Corizzo; Bartlomiej Sniezynski; Michael Baron; Nathalie Japkowicz
Abstract: Detecting relevant changes in dynamic time series data in a timely manner is crucially important for many data analysis tasks in real-world settings. Change point detection methods have the ability to discover changes in an unsupervised fashion, which represents a desirable property in the analysis of unbounded and unlabeled data streams. However, one limitation of most of the existing approaches… 
More
Submitted 18 January, 2022; originally announced January 2022.
Journal ref: 2021 IEEE International Conference on Big Data (Big Data)

Related articles All 4 versions


The cutoff phenomenon in Wasserstein distance for nonlinear stable Langevin systems with small Lévy noise
Barrera, Gerardo; Högele, Michael A; Pardo, Juan Carlos.arXiv.org; Ithaca, Jan 13, 2022.

The Cutoff Phenomenon in Wasserstein Distance for Nonlinear Stable Langevin Systems with Small Levy Noise

Barrera, G; Högele, MA and Pardo, JC

Feb 2022 (Early Access) | JOURNAL OF DYNAMICS AND DIFFERENTIAL EQUATIONS

This article establishes the cutoff phenomenon in the Wasserstein distance for systems of nonlinear ordinary differential equations with a dissipative stable fixed point subject to small additive Markovian noise. This result generalizes the results shown in Barrera, Hogele, Pardo (EJP2021) in a more restrictive setting of Blum…

 Free Full Text From Publisher

22 References  Related records

Cited by 3 Related articles All 10 versions



Optimal Transport and PDE: Gradient Flows in the ... - YouTube

www.youtube.com › watch

Optimal Transport and PDE: Gradient Flows in the Wasserstein Metric (continued).

YouTube · Simons Institute ·

 Sep 3, 2021

 

APDE Seminar -- Talks in 2020

www.maths.usyd.edu.au › Asia-Pacific-APDESeminar › T...

Chao Xia, Recent progress on a sharp lower bound for Steklov ... leading to the notion of their Wasserstein distance, is a problem in the ...

School of Mathematics and Statistics, University of Sydney · Asia-Pacific Analysis and PDE Seminar · 

Mar 22, 2021


2021

[PDF] mlr.press

FlexAE: Flexibly learning latent priors for wasserstein auto-encoders

AK Mondal, H Asnani, P Singla… - Uncertainty in Artificial …, 2021 - proceedings.mlr.press

… (KLD), Jensen–Shannon divergence (JSD), Wasserstein Distance and so on. In this work,

we propose to use Wasserstein distance and utilize the principle laid in [Arjovsky et al., 2017, …

Cited by 4 Related articles All 5 versions 

<——2021———2021——2990—- 


  

Data augmentation for rolling bearing fault diagnosis using an enhanced few-shot Wasserstein auto-encoder with meta-learning

Z Pei, H Jiang, X Li, J Zhang, S Liu - Measurement Science and …, 2021 - iopscience.iop.org

… We propose an enhanced few-shot Wasserstein auto-encoder (fs-WAE) to reverse the

negative effect of imbalance. Firstly, an enhanced WAE is proposed for data augmentation, in …

 Cited by 15 Related articles All 3 versions


 Lidar with Velocity: Motion Distortion Correction of Point ...

DeepAI

https://deepai.org › publication › lidar-with-velocity-...

Nov 18, 2021 — Lidar Upsampling with Sliced Wasserstein Distance. Lidar became an important component of the perception systems in autonom... Artem Savkin ...


Crash Course on Optimal Transport - Simons Institute

simons.berkeley.edu › talks › crash-course-optimal-transp...

... the Wasserstein metric, the Benamou-Brenier theorem, and Wasserstein gradient flows. ... 2013–2023 Simons Institute for the Theory of Computing.

Simons Institute · Simons Institute · 

Sep 1, 2021


 2021 see 2022

The quantum Wasserstein distance of order 1 - YouTube

www.youtube.com › watch · Quantum Physics 2021: "The quantum Wasserstein distance of order 1", Giacomo De Palma ...

YouTube · Institute for Pure & Applied Mathematics (IPAM) · 

Feb 19, 2021 in this video



Rethinking rotated object detection with gaussian wasserstein distance loss

X Yang, J Yan, Q Ming, W Wang… - International …, 2021 - proceedings.mlr.press

… loss based on Gaussian Wasserstein distance as a fun… Gaussian distribution, which enables

to approximate the indifferentiable rotational IoU induced loss by the Gaussian Wasserstein …

Cited by 147 Related articles All 10 versions 

[CITATION] Rethinking rotated object detection with Gaussian wasserstein distance loss. arXiv 2021

X Yang, J Yan, Q Ming, W Wang, X Zhang, Q Tian - arXiv preprint arXiv:2101.11952, 2021

Cited by 8 Related articles


 2021


[PDF] arxiv.org

A normalized Gaussian Wasserstein distance for tiny object detection

J Wang, C Xu, W Yang, L Yu - arXiv preprint arXiv:2110.13389, 2021 - arxiv.org

… to measure the similarity of bounding boxes by Wasserstein distance to replace standard IoU.

… 2-D Gaussian distributions, and then use our proposed Normalized Wasserstein Distance (…

 Cited by 26 Related articles All 2 versions 
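Both the Gaussian Wasserstein distance loss above and its normalized variant for tiny objects rely on the closed-form 2-Wasserstein distance between Gaussians fitted to bounding boxes. A minimal numpy version of that formula (without the papers' normalization or nonlinear rescaling; function and variable names are mine):

    import numpy as np
    from scipy.linalg import sqrtm

    def gaussian_w2(m1, S1, m2, S2):
        # Closed-form 2-Wasserstein distance between N(m1, S1) and N(m2, S2):
        # W2^2 = ||m1 - m2||^2 + tr(S1 + S2 - 2 (S2^{1/2} S1 S2^{1/2})^{1/2})
        cross = sqrtm(sqrtm(S2) @ S1 @ sqrtm(S2))
        w2_sq = np.sum((m1 - m2) ** 2) + np.trace(S1 + S2 - 2.0 * np.real(cross))
        return np.sqrt(max(w2_sq, 0.0))

The tiny-object paper then turns this distance into a bounded, IoU-like similarity, roughly an exponential rescaling of the form exp(-W2 / C).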



[PDF] arxiv.org

Gromov-Wasserstein distances between Gaussian distributions

A Salmona, J Delon, A Desolneux - arXiv preprint arXiv:2104.07970, 2021 - arxiv.org

… In this paper, we focus on the Gromov-Wasserstein distance with a ground cost defined as …

between Gaussian distributions. We show that when the optimal plan is restricted to Gaussian …

Cited by 14 Related articles All 35 versions 


[PDF] arxiv.org

Wasserstein-splitting Gaussian process regression for heterogeneous online Bayesian inference

ME Kepler, A Koppel, AS Bedi… - 2021 IEEE/RSJ …, 2021 - ieeexplore.ieee.org

… To address these issues, we propose Wassersteinsplitting Gaussian Process Regression

(… of which have a joint Gaussian distribution [41]. We use a Gaussian process to model the …

Cited by 2 Related articles All 4 versions


[PDF] arxiv.org

Convergence and finite sample approximations of entropic regularized Wasserstein distances in Gaussian and RKHS settings

MH Quang - arXiv preprint arXiv:2101.01429, 2021 - arxiv.org

… Our first main result is that for Gaussian measures on an infinite-dimensional Hilbert …

Wasserstein distance and kernel Gaussian-Sinkhorn divergence between mixtures of Gaussian …

 Cited by 4 Related articles All 5 versions 


[PDF] arxiv.org

Schema matching using Gaussian mixture models with Wasserstein distance

M Przyborowski, M Pabiś, A Janusz… - arXiv preprint arXiv …, 2021 - arxiv.org

… Gaussian mixture models find their place as a powerful tool, … from mixture models, the

Wasserstein distance can be very useful, … for the Wasserstein distance between Gaussian mixture …

Related articles All 3 versions 

<——2021———2021——3000—



[PDF] uwaterloo.ca

Wasserstein Autoencoders with Mixture of Gaussian Priors for Stylized Text Generation

A Ghabussi, L Mou, O Vechtomova - Text, Speech, and Dialogue: 24th …, 2021 - Springer

… We present a Wasserstein autoencoder with a Gaussian mixture prior for style-aware sentence

generation. Our model is trained on a multi-class dataset and generates sentences in the …

Related articles All 3 versions


Efficient and Robust Classification for Positive Definite Matrices with Wasserstein Metric

J Cui - 2021 - etd.auburn.edu

… The purpose of this paper is to propose different algorithms based on Bures-Wasserstein …

The results obtained in this paper include that Bures-Wasserstein simple projection mean …

 Related articles 


Multiperiod Facility Location and Capacity Planning Under∞-Wasserstein Joint Chance Constraints in Humanitarian Logistics

Z Wang, K You, Z Wang, K Liu - Available at SSRN 4192966, 2021 - papers.ssrn.com

… -hand-side uncertainty over the ∞-Wasserstein metric and equivalently reformulate it as a … 

-Wasserstein set in the MFLCP model. The most relevant paper that involves the Wasserstein

Related articles 


[PDF] mdpi.com

… intelligent fault diagnosis method for rolling bearings based on Wasserstein generative adversarial network and Convolutional Neural Network under Unbalanced …

H Tang, S Gao, L Wang, X Li, B Li, S Pang - Sensors, 2021 - mdpi.com

… The Wasserstein generative … Wasserstein generative adversarial net (WGAN) evaluates 

the difference between the real and generated sample distributions by using the Wasserstein

Cited by 13 Related articles All 9 versions

 

[PDF] ucl.ac.uk

[PDF] Towards Stochastic Neural Networks via Inductive Wasserstein Embeddings

H Yang, Y Yang, D Li, Y Zhou, T Hospedales - gatsby.ucl.ac.uk

… In this work, we present inductive Wasserstein embeddings, which models the activations 

and weights using implicit density, ie, we do not assign a specific form for the distribution such …

 Related articles View as HTML 


2021



[PDF] neurips.cc

Do neural optimal transport solvers work? a continuous wasserstein-2 benchmark

A Korotin, L Li, A Genevay… - … in Neural …, 2021 - proceedings.neurips.cc

Despite the recent popularity of neural network-based solvers for optimal transport (OT),

there is no standard quantitative way to evaluate their performance. In this paper, we address …

Cited by 30 Related articles All 6 versions 



[PDF] thecvf.com

A sliced wasserstein loss for neural texture synthesis

E Heitz, K Vanhoey, T Chambon… - Proceedings of the …, 2021 - openaccess.thecvf.com

… activations of a convolutional neural network optimized for object … Our goal is to promote

the Sliced Wasserstein Distance as a … by optimization or training generative neural networks. …

Cited by 29 Related articles All 6 versions 
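For reference, a minimal numpy version of a sliced Wasserstein distance between two equal-size point sets; the paper applies the same idea to VGG feature activations of the textures rather than to raw vectors, so this is a generic sketch, not the paper's loss:

    import numpy as np

    def sliced_w2(X, Y, n_projections=64, seed=0):
        # Monte-Carlo sliced 2-Wasserstein distance between equal-size point sets:
        # project onto random directions, sort, and compare the 1-D projections.
        rng = np.random.default_rng(seed)
        total = 0.0
        for _ in range(n_projections):
            u = rng.normal(size=X.shape[1])
            u /= np.linalg.norm(u)
            px, py = np.sort(X @ u), np.sort(Y @ u)
            total += np.mean((px - py) ** 2)
        return np.sqrt(total / n_projections)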



[PDF] mdpi.com

… intelligent fault diagnosis method for rolling bearings based on Wasserstein generative adversarial network and Convolutional Neural Network under Unbalanced …

H Tang, S Gao, L Wang, X Li, B Li, S Pang - Sensors, 2021 - mdpi.com

… Wasserstein generative adversarial net (WGAN) evaluates the difference between the real

and generated sample distributions by using the Wasserstein … of convolutional neural network …

Cited by 13 Related articles All 9 versions 


[PDF] arxiv.org

Wasserstein Graph Neural Networks for Graphs with Missing Attributes

Z Chen, T Ma, Y Song, Y Wang - arXiv preprint arXiv:2102.03450, 2021 - arxiv.org

… Graph neural networks have been demonstrated powerful in … representation learning

framework, Wasserstein graph diffusion (… in general graph neural networks to a Wasserstein space …

Related articles All 2 versions 


    2021

 Exact Statistical Inference for the Wasserstein Distance  

by VNL Duy · 2021 · Cited by 6 — In this study, we propose an exact (non-asymptotic) inference method for the Wasserstein distance inspired by the concept of conditional  ..
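The quantity whose exact inference is studied above is, in the simplest empirical 1-D case, just the following (a generic scipy illustration, unrelated to the paper's selective-inference machinery):

    import numpy as np
    from scipy.stats import wasserstein_distance

    rng = np.random.default_rng(1)
    x = rng.normal(0.0, 1.0, 500)
    y = rng.normal(0.3, 1.0, 500)
    print(wasserstein_distance(x, y))  # empirical order-1 Wasserstein distance between samples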

<——2021———2021——3010—-


2021

 Learning with symmetric positive definite matrices via generalized Bures-Wasserstein geometry

by Han, Andi; Mishra, Bamdev; Jawanpuria, Pratik ; More...

10/2021.


2021

Telecommunication customer loss prediction method and system for improving multi-layer perceptron

by KE LIJIA; LIANG HAIFANG; XIA GUO'EN

05/2021

The invention provides a telecommunication customer loss prediction method and system for improving a multi-layer perceptron, and the method comprises the steps...

Patent Available Online


2021 patent

sEMG data enhancement method based on BiLSTM and WGAN-GP networks

CN CN114372490A 方银锋 杭州电子科技大学

Priority 2021-12-29 • Filed 2021-12-29 • Published 2022-04-19

A sEMG data enhancement method based on a BilSTM and WGAN-GP network comprises the following specific steps: s1, collecting surface electromyogram signals and preprocessing the signals; step 



2021 patent

… robust optimization scheduling method based on improved Wasserstein measure

CN CN113962612A 刘鸿鹏 东北电力大学

Priority 2021-11-25 • Filed 2021-11-25 • Published 2022-01-21

An electric heating combined system distribution robust optimization scheduling method based on improved Wasserstein measure relates to the technical field of renewable energy scheduling in an electric heating combined system. The method aims to solve the problem that an uncertainty set …


… method for generating countermeasure network based on conditional Wasserstein

CN CN114154405A 陈乾坤 东风汽车集团股份有限公司

Priority 2021-11-19 • Filed 2021-11-19 • Published 2022-03-08

3. The motor data enhancement method based on the conditional Wasserstein generation countermeasure network as claimed in claim 1, wherein the conditional Wasserstein generation countermeasure network is a combination of the conditional warerstein generation countermeasure network and the …


2021

2021 patent  

Early fault detection method based on Wasserstein distance

\\\CN CN114722888A 曾九孙 中国计量大学

Priority 2021-10-27 • Filed 2021-10-27 • Published 2022-07-08

s3, analyzing data statistical characteristics of Wasserstein distance in the principal component space and the residual error space, and establishing monitoring statistics based on hypothesis test in the principal component space and the residual error space for judging whether a fault occurs or …


… and apparatus for conditional data genration using conditional wasserstein …

KR KR20230023464A 조명희 서울대학교산학협력단

Priority 2021-08-10 • Filed 2021-08-10 • Published 2023-02-17

Wherein the learned conditional Wasserstein generator outputs a future video frame as a response when the past video frame is input as condition data. According to claim 7, The learned conditional Wasserstein generator is learned by setting a past video frame and a future video frame as condition …


Characteristic similarity countermeasure network based on Wasserstein distance

CN CN113673347A 祝磊 杭州电子科技大学

Priority 2021-07-20 • Filed 2021-07-20 • Published 2021-11-19

6. The Wasserstein distance-based characterized similar countermeasure network of claim 1, wherein in S5: obtaining the round-trip probability of the destination domain of the source domain comprises multiplying the resulting P_st and P_ts. The formula is as follows: P_sts = P_st · P_ts, where P_sts …


Cross-domain recommendation method based on double-current sliced wasserstein …

CN CN113536116A 聂婕 中国海洋大学

Priority 2021-06-29 • Filed 2021-06-29 • Published 2021-10-22

The invention belongs to the technical field of cross-domain recommendation, and discloses a cross-domain recommendation method based on a double-current slotted Wasserstein self-encoder.


Robot motion planning method and system based on graph Wasserstein self-coding …

CN CN113276119B 夏崇坤 清华大学深圳国际研究生院

Priority 2021-05-25 • Filed 2021-05-25 • Granted 2022-06-28 • Published 2022-06-28

1. A robot motion planning method based on a graph Wasserstein self-coding network is characterized by comprising the following steps: s1, constructing a graph Wasserstein self-coding network GraphWAE; the GraphWAE represents the non-obstacle area of the configuration space in a pre-training mode …

<——2021———2021——3020—



Domain self-adaptive rolling bearing fault diagnosis method based on Wasserstein …

CN CN113239610A 王晓东 昆明理工大学

Priority 2021-01-19 • Filed 2021-01-19 • Published 2021-08-10

3. The implementation principle of the Wasserstein distance-based domain-adaptive rolling bearing fault diagnosis method according to claim 2 is characterized in that: StepA, extracting features through convolutional neural network, convolutional layer containing a filter w and an offset b, let X n …


 

System and Method for Generaring Highly Dense 3D Point Clouds using Wasserstein …

KR KR102456682B1 권준석 중앙대학교 산학협력단

Priority 2020-12-18 • Filed 2020-12-18 • Granted 2022-10-19 • Published 2022-10-19

The present invention generates a high-resolution 3D point cloud using a Wasserstein distribution to generate a set of several 3D points by generating several input vectors from a prior distribution and expressing it as a Wasserstein distribution A prior distribution input unit for inputting a …


Wasserstein-based high-energy image synthesis method and device for generating …

CN112634390A 郑海荣 深圳先进技术研究院

Priority 2020-12-17 • Filed 2020-12-17 • Published 2021-04-09

updating the preset generation countermeasure network model based on the first loss value and the first judgment result until the preset generation countermeasure network model converges, and determining the converged preset generation countermeasure network model as the Wasserstein generation …



High-energy image synthesis method and device based on wasserstein generative …

WO WO2022126480A1 郑海荣 深圳先进技术研究院

Priority 2020-12-17 • Filed 2020-12-17 • Published 2022-06-23

The preset generative adversarial network model is updated based on the first loss value and the first discrimination result until the preset generative adversarial network model converges, and the converged preset generative adversarial network model is determined as the Wasserstein generative …


2021 patent

Wasserstein space based visual dimensionality reduction method, involves performing normalization pre-processing on high-dimensional data and updating position of projection point until reaching stop condition to obtain projection result

CN112765426-A

Inventor(s) QIN H and CHEN L

Assignee(s) UNIV CHONGQING POSTS & TELECOM

Derwent Primary Accession Number 

2021-525281


2021


2021 patent

Wassen distance based object envelope body multi-view reconstruction and optimization method, involves estimating Wasserstein distance to measure distribution similarity and converting measurement into cost function of optimization problem

CN113034695-ACN113034695-B

Inventor(s) HE L; LIN X; (...); ZHANG H

Assignee(s) UNIV GUANGDONG TECHNOLOGY

Derwent Primary Accession Number 

2021-76322T


2021 patent

Wasserstein Generative Adversarial Network based small sample ground slow motion target data classification method, involves classifying ground slow motion target in test set by using classification model

CN113569632-A

Inventor(s) LI Y; BAI X; (...); ZHOU F

Assignee(s) PLA NO 32203 TROOPS and UNIV XIDIAN

Derwent Primary Accession Number 

2021-C64797


2021 patent

Method for generating confrontation finger vein image based on texture constraint and Poisson fusion, involves judging rebuilt finger vein image by arbiter network, and obtaining sum of local WN-GP loss and global WGAN-GP arbiter loss

CN112488935-A

Inventor(s) WANG Z; SHEN L and JIANG H

Assignee(s) UNIV HANGZHOU DIANZI

Derwent Primary Accession Number 

2021-30795B


2021 patent

Method for training low-dose computed tomography (CT) image denoising model, involves minimizing loss of generator in wasserstein divergence generative adversarial network (WGAN)-div generation countermeasure network initial model, and maximizing loss of discriminator

CN113205461-A

Inventor(s) LI S; LI Y and PAN J

Assignee(s) SHANGHAI HUIHU INFORMATION TECHNOLOGY CO LTD

Derwent Primary Accession Number 

2021-95262X


2021 patent

Wind power output power prediction method and wasserstein generative adversarial networks (WGAN) network, has filling missing value of wind power data and abnormal value, normalizing data set as input data of prediction model, and outputting prediction result after training test of prediction mode

CN113298297-ACN113298297-B

Inventor(s) WANG Y; WU Y; (...); LIU G

Assignee(s) UNIV INNER MONGOLIA TECHNOLOGY

Derwent Primary Accession Number 

2021-A0301C

<——2021———2022——3030—


2021 patent

One-dimensional time series data augmentation method based on WGAN involves using Wasserstein generating countermeasure network (WGAN) generator corresponding to each subclass of clustered time series data, to generate artificial samples with same characteristics as original data of subclass

CN113627594-A

Inventor(s) QIAN C; YANG D; (...); SUN B

Assignee(s) UNIV BEIHANG

Derwent Primary Accession Number

2021-D12817


2021 data

alpha-davidson/falcon-cWGAN: Release for CHEP Submission

Blue, John

<——2021———2021——3032— end 2021. e21

including 3 titles with ВАССЕРШТЕЙН, 1 title with Вассерштейна, and 1 title with Wasserstein


Yun, Sangwoon; Sun, Xiang; Choi, Jung-Il

Stochastic gradient methods for L2-Wasserstein least squares problem of Gaussian measures. (English) Zbl 1492.90112

J. Korean Soc. Ind. Appl. Math. 25, No. 4, 162-172 (2021).

MSC:  90C15 90C25 90C30

PDFBibTeX XMLCite

Full Text: DOI 


2021

Carroll, Tom; Massaneda, Xavier; Ortega-Cerdà, Joaquim

An enhanced uncertainty principle for the vaserstein distance. (English) 

Zbl 1460.28003

Bull. Lond. Math. Soc. 52, No. 6, 1158-1173 (2020); corrigendum ibid. 53, No. 5, 1520-1522 (2021).

MSC:  28A75 49Q22 58C40
<——2021———2021——3034—end 2021


 

start 2022  Wasserstein

A New Method of Image Restoration Technology Based on WGAN

Fang, W; Gu, EM; (...); Sheng, VS

2022 | COMPUTER SYSTEMS SCIENCE AND ENGINEERING 41 (2) , pp.689-698

With the development of image restoration technology based on deep learning, more complex problems are being solved, especially in image semantic inpainting based on context. Nowadays, image semantic inpainting techniques are becoming more mature. However, due to the limitations of memory, the instability of training, and the lack of sample diversity, the results of image restoration are still encountering difficult problems, such as repairing the content of glitches which cannot be well integrated with the original image. Therefore, we propose an image inpainting network based on Wasserstein generative adversarial network (WGAN) distance. With the corresponding technology having been adjusted and improved, we attempted to use the Adam algorithm to replace the traditional stochastic gradient descent, and another algorithm to optimize the training used in recent years. We evaluated our algorithm on the ImageNet dataset. We obtained high-quality restoration results, indicating that our algorithm improves the clarity and consistency of the image.

Related articles All 2 versions 


Wasserstein and multivariate linear affine based distributionally robust optimization for CCHP-P2G scheduling considering multiple uncertainties

https://www.sciencedirect.com › science › article › pii

by Y Wang · 2022 — Wasserstein and multivariate linear affine based distributionally robust optimization for CCHP-P2G scheduling considering multiple uncertainties.

Wasserstein and multivariate linear affine based distributionally robust optimization for CCHP-P2G scheduling considering multiple uncertainties

By: Wang, Yuwei; Yang, Yuanjuan; Fei, Haoran; et al.

APPLIED ENERGY  Volume: ‏ 306     Article Number: 118034   Part: ‏ A   Published: ‏ JAN 15 2022

Get It Penn State  View Abstract


2022

Solutions to Hamilton-Jacobi equation on a Wasserstein space

Badreddine, Z and Frankowska, H

Feb 2022 | CALCULUS OF VARIATIONS AND PARTIAL DIFFERENTIAL EQUATIONS 61 (1)

Enriched Cited References

We consider a Hamilton-Jacobi equation associated to the Mayer optimal control problem in the Wasserstein space P-2(R-d) and define its solutions in terms of the Hadamard generalized differentials. Continuous solutions are unique whenever we focus our attention on solutions defined on explicitly described time dependent compact valued tubes of probability measures. We also prove some viability and invariance theorems in the Wasserstein space and discuss a new notion of proximal normal.

Full Text at Publisher   References    Related records


 

2022 see 2021

Wasserstein and multivariate linear affine based distributionally robust optimization for CCHP-P2G scheduling considering multiple uncertainties

Wang, YW; Yang, YJ; (...); Jia, MY

Jan 15 2022 | APPLIED ENERGY 306

Power-to-gas is an emerging energy conversion technology. When integrating power-to-gas into the combined cooling, heating and power system, renewable generations can be further accommodated to synthesize natural gas, and additional revenues can be obtained by reutilizing and selling the synthesized gas. Therefore, it is necessary to address the optimal operation issue of the integrated system (Combined cooling, heating and powerPower-to-gas) for bringing the potential benefits, and thus promoting energy transition. This paper proposes a Wasserstein and multivariate linear affine based distributionally robust optimization model for the above issue considering multiple uncertainties. Specifically, the uncertain distribution of wind power and electric, thermal, cooling loads is modeled as an ambiguity set by applying the Wasserstein metric. Then, based on the ambiguity set, the proposed model with two-stage structure is established. In the first-stage, system operation cost (involving the energy exchange and carbon emission costs, etc.) is minimized under the forecast information. In the second stage, for resisting the interference of multiple uncertainties, the multivariate linear affine policy models are constructed for operation rescheduling under the worst-case distribution within the ambiguity set, which is capable of adjusting flexible resources according to various random factors simultaneously. Simulations are implemented and verify that: 1) both the economic and environmental benefits of system operation are improved by integrating power-to-gas; 2) the proposed model keeps both the conservativeness and computa-tional complexity at low levels, and its solutions enable the effective system operation in terms of cost saving, emission reduction, uncertainty resistance and renewable energy accommodation.

Full Text at Publisher   References   Related records

 All 4 versions


Badreddine, Zeinab
Frankowska, Hélène

Solutions to Hamilton-Jacobi equation on a Wasserstein space. (English) Zbl 07432485

Calc. Var. Partial Differ. Equ. 61, No. 1, Paper No. 9, 41 p. (2022).

MSC:  49J21 35F21 60-XX 49J53 49L25 49Q22

PDF BibTeX XML Cite Zbl 07432485

Zbl 07432485

Arrigo, AdrianoOrdoudis, ChristosKazempour, JalalDe Grève, ZacharieToubeau, Jean-FrançoisVallée, François

Wasserstein distributionally robust chance-constrained optimization for energy and reserve dispatch: an exact and physically-bounded formulation. (English) Zbl 07421375

Eur. J. Oper. Res. 296, No. 1, 304-322 (2022).

MSC:  90Bxx

PDF BibTeX XML 

 Zbl 07421375

2022  see 2021

Rate of convergence for particles approximation of PDEs in ...

online Cover Image

 OPEN ACCESS

Rate of convergence for particle approximation of PDEs in Wasserstein space

by Germain, Maximilien; Pham, Huyên; Warin, Xavier

Journal of applied probability, 2022, Volume 59, Issue 4

We prove a rate of convergence for the $N$-particle approximation of a second-order partial differential equation in the space of probability measures, like...

Journal ArticleFull Text Online

Zbl 07616198

[PDF] umons.ac.be

Wasserstein distributionally robust chance-constrained optimization for energy and reserve dispatch: An exact and physically-bounded formulation

A Arrigo, C Ordoudis, J Kazempour, Z De Grève… - European Journal of …, 2022 - Elsevier

In the context of transition towards sustainable, cost-efficient and reliable energy systems,

the improvement of current energy and reserve dispatch models is crucial to properly cope

with the uncertainty of weather-dependent renewable power generation. In contrast to …

 Cited by 5 Related articles All 6 versions  
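For orientation (standard background, not a formula quoted from the paper), Wasserstein distributionally robust formulations of this kind hedge against every distribution in a Wasserstein ball of radius ε around the empirical distribution of the uncertainty:

    \mathcal{B}_\varepsilon(\hat{P}_N) = \{\, Q : W_p(Q, \hat{P}_N) \le \varepsilon \,\}, \qquad
    \min_{x \in X} \; \sup_{Q \in \mathcal{B}_\varepsilon(\hat{P}_N)} \; \mathbb{E}_{\xi \sim Q}\big[ h(x, \xi) \big],

with the chance constraints additionally required to hold for every Q in the ball; the radius ε controls how conservative the energy and reserve dispatch becomes.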


2022  see 2021

Wasserstein distributionally robust chance-constrained optimization for energy and reserve dispatch: An exact and physically-bounded formulation

A Arrigo, C Ordoudis, J Kazempour, Z De Grève… - European Journal of …, 2022 - Elsevier

In the context of transition towards sustainable, cost-efficient and reliable energy systems,

the improvement of current energy and reserve dispatch models is crucial to properly cope

with the uncertainty of weather-dependent renewable power generation. In contrast to …

 Cited by 5 Related articles All 6 versions


 2022 see 2021

Optimizing decisions for a dual-channel retailer with service level requirements and demand uncertainties: A Wasserstein metric-based distributionally robust …

Y Sun, R Qiu, M Sun - Computers & Operations Research, 2022 - Elsevier

This study explores a dual-channel management problem of a retailer selling multiple

products to customers through a traditional retail channel and an online channel to

maximize expected profit. The prices and order quantities of both the online and the retail …

  Cite  All 3 versions

Zbl 07486378

<——2022———2022———10——



 Wasserstein convergence rate for empirical measures on noncompact manifolds

FY Wang - Stochastic Processes and their Applications, 2022 - Elsevier

Let X_t be the (reflecting) diffusion process generated by L = Δ + ∇V on a complete connected Riemannian manifold M, possibly with a boundary ∂M, where V ∈ C^1(M) is such that μ(dx) := e^{V(x)} dx is a probability measure. We estimate the convergence rate for the …

 Cited by 3 Related articles All 2 versions

  |   Zbl 07461257
MR4347493
 Prelim Wang, Feng-Yu; Wasserstein convergence rate for empirical measures on noncompact manifolds. Stochastic Process. Appl. 144 (2022), 271–287. 60D05 (58J65)

Review PDF Clipboard Journal Article

Cited by 5 Related articles AllCited by 6 Related articles All 6 versions

 

2022  see 2021  [PDF] arxiv.org

Alpha Procrustes metrics between positive definite operators: a unifying formulation for the Bures-Wasserstein and Log-Euclidean/Log-Hilbert-Schmidt metrics

HQ Minh - Linear Algebra and its Applications, 2022 - Elsevier

This work presents a parametrized family of distances, namely the Alpha Procrustes

distances, on the set of symmetric, positive definite (SPD) matrices. The Alpha Procrustes

Cited by 10 Related articles All 3 versions

MR4346718 | Zbl 07461257

Numerical Methods for Wasserstein Natural Gradient Descent ...

https://meetings.ams.org › math › meetingapp.cgi › Paper

https://meetings.ams.org › math › meetingapp.cgi › Paper

We propose a few efficient numerical schemes for scenarios where the derivative of the state variable with respect to the parameter is either known or unknown.

[CITATION] Numerical Methods for Wasserstein Natural Gradient Descent in Inverse Problems

W Lei, L Nurbekyan, Y Yang - 2022 Joint Mathematics Meetings …, 2022 - meetings.ams.org
Poster #111: Numerical Methods for Wasserstein Natural Gradient Descent in Inverse Problems
Wanzhou Lei*, NYU
Levon Nurbekyan, Department of Mathematics, UCLA
Yunan Yang, Cornell University

Ultralimits of Wasserstein spaces and metric measure ...

https://meetings.ams.org › math › meetingapp.cgi › Paper

by Aarren · 2022 — Ultralimits of Wasserstein spaces and metric measure spaces with Ricci ... In other words, the synthetic notion of lower Ricci curvature bounds due to ...

[CITATION] Ultralimits of Wasserstein spaces and metric measure spaces with Ricci curvature bounded from below

A Warren - 2022 Joint Mathematics Meetings (JMM 2022), 2022 - meetings.ams.org

[CITATION] Ultralimits of Wasserstein spaces and metric measure spaces with Ricci curvature bounded from below

A Warren - 2022 Virtual Joint Mathematics Meetings (JMM 2022), 2022 - meetings.ams.org

Joint Mathematics Meetings 2022

https://www.jointmathematicsmeetings.org › 2268_progfull

Society for Industrial and Applied Mathematics Special Session on SIAM ... 

Ultralimits of Wasserstein spaces and metric measure spaces with Ricci curvature ...


QB Perian, A Mantri, SR Benjamin - 2022 Joint Mathematics …, 2022 - meetings.ams.org

Poster #4: On the Wasserstein Distance Between k-Step Probability Measures on Finite Graphs

2022    see 2021 JMM April 6-9

[CITATION] On the Wasserstein Distance Between step probability measures

QB Perian, A Mantri, SR Benjamin - 2022 Joint Mathematics …, 2022 - meetings.ams.org

Poster #4: On the Wasserstein Distance Between k-Step Probability Measures on Finite Graphs

Sophia Rai Benjamin, North Carolina School of Science and Mathematics
Arushi Mantri*, Jesuit High School Portland
Quinn B Perian, Stanford Online High School
[CITATION] On the Wasserstein Distance Between k-Step Probability Measures

QB Perian, A Mantri, SR Benjamin - 2022 Virtual Joint Mathematics Meetings (JMM 2022), 2022 - meetings.ams.org


2022


2022 see 2021  [PDF] arxiv.org

Wasserstein Convergence for Empirical Measures of Subordinated Diffusions on Riemannian Manifolds

FY Wang, B Wu - Potential Analysis, 2022 - Springer

… t )t>0 (α (0,1)) be the empirical measures of the Markov … Recently, sharp convergence rate

in the Wasserstein distance … random variables and discrete time Markov chains. In this paper…

 Related articles All 2 versions


2022 see 2021

Solutions to Hamilton–Jacobi equation on a Wasserstein spacehttps://link.springer.com › content › pdf

by Z Badreddine · 2022 — The considered Hamilton–Jacobi equations are stated on a Wasserstein space and are associated to a Calculus of

Variation problem. Under some ...

Cited by 1 Related articles All 2 versions

 2022 see 2021

Wasserstein autoregressive models for density time series

https://econpapers.repec.org › article › blajtsera › v_3a...

by C Zhang · 2022 · Cited by 8 — Wasserstein autoregressive models for density time series ... Journal of Time Series Analysis, 2022, vol. 43, issue 1, 30-52.

\\2022 see 2021  Cover Image

Wasserstein autoregressive models for density time series

by Zhang, Chao; Kokoszka, Piotr; Petersen, Alexander

Journal of time series analysis, 01/2022, Volume 43, Issue 1

Data consisting of time‐indexed distributions of cross‐sectional or intraday returns have been extensively studied in finance, and provide one example in which...


Journal Article 

Full Text Online


Wasserstein Distances, Geodesics and Barycenters of Merge ...

https://www.computer.org › csdl › journal › 2022/01

by M Pont · 2022 ·  — This paper presents a unified computational framework for the estimation of distances, geodesics and barycenters of merge trees.

2022 see 2021

Wasserstein Distances, Geodesics and Barycenters of Merge Trees

by Pont, Mathieu; Vidal, Jules; Delon, Julie ; More...

IEEE transactions on visualization and computer graphics, 01/2022, Volume 28, Issue 1

This paper presents a unified computational framework for the estimation of distances, geodesics and barycenters of merge trees. We extend recent work on the...

Journal Article Full Text Online

Cited by 2 Related articles All 50 versions    

 

Wasserstein and multivariate linear affine based ...

https://www.sciencedirect.com › science › article › abs › pii

by Y Wang · 2022 — Wasserstein and multivariate linear affine based distributionally robust optimization for CCHP-P2G scheduling considering multiple uncertainties.

2022

Wasserstein and multivariate linear affine based distributionally robust optimization for CCHP-P2G scheduling considering...

by Wang, Yuwei; Yang, Yuanjuan; Fei, Haoran ; More...

Applied energy, 01/2022, Volume 306

•Power-to-gas facility is integrated into the trigeneration energy system.•Optimal operation of this integrated system is studied under...

Journal Article Full Text Online

A data-driven scheduling model of virtual power plant using  

Wasserstein and multivariate linear affine based distributionally robust optimization for CCHP-P2G scheduling considering multiple uncertainties

Y Wang, Y Yang, H Fei, M Song, M Jia - Applied Energy, 2022 - Elsevier

… This paper proposes a Wasserstein and multivariate linear affine based distributionally robust … In this paper, Wasserstein metric is adopted for establishing the ambiguity set of ξ~. The … such as excellent out-of-sample performance and tractable …

Cited by 20 Related articles All 4 versions

<——2022———2022———20—— 


2022 see 2021

Distributionally Safe Path Planning: Wasserstein Safe RRT

by Lathrop, Paul; Boardman, Beth; Martinez, Sonia

IEEE robotics and automation letters, 01/2022, Volume 7, Issue 1

In this paper, we propose a Wasserstein metric-based random path planning algorithm. Wasserstein Safe RRT (W-Safe RRT) provides finite-sample probabilistic...

Journal Article Full Text Online

 

Wasserstein distributionally robust chance-constrained optimization for energy and reserve dispatch: An exact and physically-bounded formulation

A Arrigo, C Ordoudis, J Kazempour, Z De Grève… - European Journal of …, 2022 - Elsevier

In the context of transition towards sustainable, cost-efficient and reliable energy systems,

the improvement of current energy and reserve dispatch models is crucial to properly cope

with the uncertainty of weather-dependent renewable power generation. In contrast to

traditional approaches, distributionally robust optimization offers a risk-aware framework that

provides performance guarantees when the distribution of uncertain parameters is not

perfectly known. In this paper, we develop a distributionally robust chance-constrained …

Cited by 14 Related articles All 8 versions

2022

Wasserstein distributionally robust chance-constrained optimization for energy and reserve dispatch: An exact and...

by Arrigo, Adriano; Ordoudis, Christos; Kazempour, Jalal ; More...

European journal of operational research, 01/2022, Volume 296, Issue 1

•A distributionally robust energy and reserve dispatch problem is developed.•An exact reformulation for distributionally robust chance constraints is...

Journal Article Full Text Online
Zbl 07421375

Cited by 23 Related articles All 8 versions

2022  [PDF] arxiv.org

Alpha Procrustes metrics between positive definite operators: a unifying formulation for the Bures-Wasserstein and Log-Euclidean/Log-Hilbert-Schmidt metrics

HQ Minh - Linear Algebra and its Applications, 2022 - Elsevier

This work presents a parametrized family of distances, namely the Alpha Procrustes

distances, on the set of symmetric, positive definite (SPD) matrices. The Alpha Procrustes

distances provide a unified formulation encompassing both the Bures-Wasserstein and Log …

 Cited by 5 Related articles All 2 versions

 

 2022  JMM

Numerical Methods for Wasserstein Natural Gradient Descent ...

https://meetings.ams.org › math › meetingapp.cgi › Paper

We propose a few efficient numerical schemes for scenarios where the derivative of the state variable with respect to the parameter is either known or unknown.

[CITATION] Numerical Methods for Wasserstein Natural Gradient Descent in Inverse Problems

W Lei, L Nurbekyan, Y Yang - 2022 Joint Mathematics Meetings …, 2022 - meetings.ams.org
Poster #111: Numerical Methods for Wasserstein Natural Gradient Descent in Inverse Problems
Wanzhou Lei*, NYU
Levon Nurbekyan, Department of Mathematics, UCLA
Yunan Yang, Cornell University
(1174-65-8594)

10:30 a.m.
[CITATION]
Numerical Methods for Wasserstein Natural Gradient Descent in Inverse Problems

W Lei, L Nurbekyan, Y Yang - 2022 Virtual Joint Mathematics …, 20


2022

Solutions to Hamilton–Jacobi equation on a Wasserstein space

Z Badreddine, H Frankowska - Calculus of Variations and Partial …, 2022 - Springer

Abstract We consider a Hamilton–Jacobi equation associated to the Mayer optimal control problem in the Wasserstein space \(\mathscr{P}_2(\mathbb{R}^{d})\) and define its solutions in terms of the Hadamard generalized differentials. Continuous solutions are …

2022

Wasserstein and multivariate linear affine based distributionally robust optimization for CCHP-P2G scheduling considering multiple uncertainties

Y Wang, Y Yang, H Fei, M Song, M Jia - Applied Energy, 2022 - Elsevier

Power-to-gas is an emerging energy conversion technology. When integrating power-to-gas

into the combined cooling, heating and power system, renewable generations can be further

accommodated to synthesize natural gas, and additional revenues can be obtained by …

 All 4 versions


 arXiv:2201.01085  [pdf]  physics.flu-dyn
Dynamical Mode Recognition of Triple Flickering Buoyant Diffusion Flames: from Physical Space to Phase Space and to Wasserstein Space
Authors: Yicheng Chi, Tao Yang, Peng Zhang
Abstract: Triple flickering buoyant diffusion flames in an isosceles triangle arrangement were experimentally studied as a nonlinear dynamical system of coupled oscillators. The objective of the study is two-fold: to establish a well-controlled gas-fuel diffusion flame experiment that can remedy the deficiencies of previous candle-flame experiments, and to develop an objective methodology for dynamical mode… 
More
Submitted 4 January, 2022; originally announced January 2022.
Comments: 16 pages, 5 figures, 1 table, research article

Dynamical Mode Recognition of Triple Flickering Buoyant Diffusion Flames: from Physical Space to Phase Space and to Wasserstein Space

Y Chi, T Yang, P Zhang - arXiv preprint arXiv:2201.01085, 2022 - arxiv.org

Triple flickering buoyant diffusion flames in an isosceles triangle arrangement were

experimentally studied as a nonlinear dynamical system of coupled oscillators. The

objective of the study is two-fold: to establish a well-controlled gas-fuel diffusion flame …

Related articles All 2 versions 

arXiv:2201.01076  [pdf, ps, other]  math.CA  math.MG
Isometric rigidity of Wasserstein spaces: the graph metric case
Authors: Gergely Kiss, Tamás Titkos
Abstract: The aim of this paper is to prove that the p-Wasserstein space Wp(X) is isometrically rigid for all p≥1 whenever X is a countable graph metric space. As a consequence, we obtain that for every countable group H and any p≥1 there exists a p-Wasserstein space whose isometry group is isomorphic to H.
Submitted 4 January, 2022; originally announced January 2022.
Comments: 14 pages with 3 figures
MSC Class: 54E40; 46E27 (Primary); 54E70; 05C12 (Secondary)

Cited by 2 Related articles All 9 versions
MR4446253


arXiv:2201.00422  [pdf, ps, other]  math.PR  math-ph
Quantitative control of Wasserstein distance between Brownian motion and the Goldstein-Kac telegraph process
Authors: Gerardo Barrera, Jani Lukkarinen
Abstract: In this manuscript, we provide a non-asymptotic process level control between the telegraph process and the Brownian motion with suitable diffusivity constant via a Wasserstein distance with quadratic average cost. In addition, we derive non-asymptotic estimates for the corresponding time average p-th moments. The proof relies on coupling techniques such as coin-flip coupling, synchronous coupli… 
More
Submitted 2 January, 2022; originally announced January 2022.
Comments: 59 pp
MSC Class: 60G50; 60J65; 60J99; 60K35; 35L99; 60K37; 60K40

WEB RESOURCE

Dynamic Persistent Homology for Brain Networks via Wasserstein Graph Clustering

Chung, Moo K; Huang, Shih-Gu; Carroll, Ian C; Calhoun, Vince D; Goldsmith, H. Hill, 2022

No Online Access 

arXiv:2201.00087  [pdf, other]  math.AT   cs.LG  q-bio.NC
Dynamic Persistent Homology for Brain Networks via Wasserstein Graph Clustering
Authors: Moo K. Chung, Shih-Gu Huang, Ian C. Carroll, Vince D. Calhoun, H. Hill Goldsmith
Abstract: We present the novel Wasserstein graph clustering for dynamically changing graphs. The Wasserstein clustering penalizes the topological discrepancy between graphs. The Wasserstein clustering is shown to outperform the widely used k-means clustering. The method is applied to more accurately determine the state spaces of dynamically changing functional brain networks.
Submitted 31 December, 2021; originally announced January 2022.

All 3 versions 

<—–2022———2022———30—

2022 see 2021

Solutions to Hamilton–Jacobi equation on a Wasserstein space

Z Badreddine, H Frankowska - Calculus of Variations and Partial …, 2022 - Springer

Abstract We consider a Hamilton–Jacobi equation associated to the Mayer optimal control problem in the Wasserstein space P_2(R^d) and define its solutions in terms of the Hadamard generalized differentials. Continuous solutions are unique whenever we …

Cited by 1 Related articles All 2 versions

GMT-WGAN: An Adversarial Sample Expansion Method for Ground Moving Targets Classification

X Yao, X Shi, Y Li, L Wang, H Wang, S Ren, F Zhou - Remote Sensing, 2022 - mdpi.com

In the field of target classification, detecting a ground moving target that is easily covered in

clutter has been a challenge. In addition, traditional feature extraction techniques and

classification methods usually rely on strong subjective factors and prior knowledge, which …

 All 4 versions 


2022 Duke see 2021

SVAE-WGAN based Soft Sensor Data Supplement Method for Process Industry

by Gao, Shiwei; Qiu, Sulong; Ma, Zhongyu ; More...

IEEE sensors journal, 11/2021, Volume 22, Issue 1

Challenges of process industry, which is characterized as hugeness of process variables in complexity of industrial environment, can be tackled effectively by...

Journal Article Full Text Online

    

2022 Duke see 2020

Wasserstein Loss With Alternative Reinforcement Learning for...

by Liu, Xiaofeng; Lu, Yunhong; Liu, Xiongchang ; More...

IEEE transactions on intelligent transportation systems, 09/2020, Volume 23, Issue 1

Semantic segmentation is important for many real-world systems, e.g., autonomous vehicles, which predict the class of each pixel. Recently, deep networks...

Journal Article Full Text Online

  

[PDF] arxiv.org

Central limit theorem for the Sliced 1-Wasserstein distance and the max-Sliced 1-Wasserstein distance

X Xu, Z Huang - arXiv preprint arXiv:2205.14624, 2022 - arxiv.org

… The sliced Wasserstein distance has been proposed as an alternative to the original Wasserstein distance via averaging the Wasserstein … And its variant max-Sliced Wasserstein distance which is …

Cited by 2 Related articles All 2 versions
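
Note: for quick reference, a minimal sketch (not code from the cited paper) of the sliced 1-Wasserstein distance that such results concern: it averages exact one-dimensional Wasserstein distances of random projections. The function and variable names (sliced_wasserstein_1, X, Y, n_projections) are illustrative only.

import numpy as np
from scipy.stats import wasserstein_distance  # exact 1D 1-Wasserstein between samples

def sliced_wasserstein_1(X, Y, n_projections=100, seed=0):
    # Monte Carlo estimate: average the 1D W1 distance of X and Y projected
    # onto random unit directions (illustrative sketch, not the paper's code).
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_projections):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)  # uniform direction on the unit sphere
        total += wasserstein_distance(X @ theta, Y @ theta)
    return total / n_projections

# example: two point clouds in R^5
X = np.random.default_rng(1).normal(size=(500, 5))
Y = np.random.default_rng(2).normal(loc=1.0, size=(500, 5))
print(sliced_wasserstein_1(X, Y, n_projections=200))

Replacing the average over directions by a maximum gives the max-sliced variant mentioned in the title.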

2022

 

Wasserstein distance based multi-scale adversarial domain adaptation method for remaining useful life prediction

Shi, HT; Huang, CZ; (...); Li, SH

Jun 2022 (Early Access) | APPLIED INTELLIGENCE

Enriched Cited References

Accurate remaining useful life (RUL) prediction can formulate timely maintenance strategies for mechanical equipment and reduce the costs of industrial production and maintenance. Although data-driven methods represented by deep learning have been successfully implemented for mechanical equipment RUL prediction, existing methods generally require test data to have a similar distribution to the


2022 Duke see 2020

Wasserstein distance estimates for stochastic integrals by forward-backward...

by Breton, Jean-Christophe; Privault, Nicolas

Potential analysis, 2020

Journal Article

arXiv:2201.02824  [pdf, other]  stat.ML  cs.LG  math.ST
Optimal 1-Wasserstein Distance for WGANs
Authors: Arthur Stéphanovitch, Ugo Tanielian, Benoît Cadre, Nicolas Klutchnikoff, Gérard Biau
Abstract: The mathematical forces at work behind Generative Adversarial Networks raise challenging theoretical issues. Motivated by the important question of characterizing the geometrical properties of the generated distributions, we provide a thorough analysis of Wasserstein GANs (WGANs) in both the finite sample and asymptotic regimes. We study the specific case where the latent space is univariate and d…  More
Submitted 8 January, 2022; originally announced January 2022.


arXiv:2201.04232  [pdf, ps, other]  math.OC   math.PR
Stochastic Gradient Descent in Wasserstein Space
Authors: Julio Backhoff-Veraguas, Joaquin Fontbona, Gonzalo Rios, Felipe Tobar
Abstract: We present and study a novel algorithm for the computation of 2-Wasserstein population barycenters. The proposed method admits, at least, two relevant interpretations: it can be seen as a stochastic gradient descent procedure in the 2-Wasserstein space, and also as a manifestation of a Law of Large Numbers in the same space. Building on stochastic gradient descent, the proposed algorithm is online… 
More
Submitted 11 January, 2022; originally announced January 2022.
Cited by 4
 Related articles All 2 versions 
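
Note: for reference (standard definition, not specific to this preprint), the 2-Wasserstein population barycenter targeted by such algorithms is

$$ \bar\mu \;\in\; \operatorname*{arg\,min}_{\nu\in\mathcal{P}_2(\mathbb{R}^d)} \; \mathbb{E}_{\mu\sim\Pi}\big[W_2^2(\mu,\nu)\big], $$

which for finitely many measures $\mu_1,\dots,\mu_n$ with weights $\lambda_i$ reduces to $\arg\min_\nu \sum_i \lambda_i W_2^2(\mu_i,\nu)$.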


arXiv:2201.03732  [pdf, other]  math-ph   quant-ph
Right mean for the α−z Bures-Wasserstein quantum divergence
Authors: Miran Jeong, Jinmi Hwang, Sejong Kim
Abstract: A new quantum divergence induced from the α−z Renyi relative entropy, called the α−z Bures-Wasserstein quantum divergence, has been recently introduced. We investigate in this paper properties of the right mean, which is a unique minimizer of the weighted sum of α−z Bures-Wasserstein quantum divergences to each points. Many interesting operator inequalities of the right mean with the matrix…  More
Submitted 10 January, 2022; originally announced January 2022.
MSC Class: 81P17; 15B48
<——2022———2022———40—


  MR4361616 Prelim Minh, Hà Quang; 

Finite Sample Approximations of Exact and Entropic Wasserstein Distances Between Covariance Operators and Gaussian Processes. SIAM/ASA J. Uncertain. Quantif. 10 (2022), no. 1, 96–124. 60G15


Cited by 3 Related articles All 4 versions
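
Note: for context (a classical closed form, not a result of this paper), the exact 2-Wasserstein distance between Gaussians N(m_1, Σ_1) and N(m_2, Σ_2) is

$$ W_2^2 \;=\; \|m_1-m_2\|^2 + \operatorname{tr}\!\Big(\Sigma_1+\Sigma_2-2\big(\Sigma_1^{1/2}\Sigma_2\Sigma_1^{1/2}\big)^{1/2}\Big), $$

whose covariance part is the squared Bures-Wasserstein distance; the covariance-operator and Gaussian-process setting of the paper extends this to infinite dimensions, and the entropic version adds an entropy regularization term.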

2022  see 2021

MR4358162 Prelim Amari, Shun-ichi; Matsuda, Takeru; 

Wasserstein statistics in one-dimensional location scale models. Ann. Inst. Statist. Math. 74 (2022), no. 1, 33–47. 62B11

Cited by 2
 Related articles All 8 versions


MR4357729 Prelim Breton, Jean-Christophe; Privault, Nicolas; 

Wasserstein Distance Estimates for Stochastic Integrals by Forward-Backward Stochastic Calculus. Potential Anal. 56 (2022), no. 1, 1–20. 60


MR4355909 Prelim Bencheikh, O.; Jourdain, B.; 

Approximation rate in Wasserstein distance of probability measures on the real line by deterministic empirical measures. J. Approx. Theory 274 (2022), Paper No. 105684. 60B10 (28A33 41A50 49Q22)

Related articles
 All 3 versions

Cited by 2 Related articles All 6 versions

[PDF] arxiv.org

Right mean for the α−z Bures-Wasserstein quantum divergence

M Jeong, J Hwang, S Kim - arXiv preprint arXiv:2201.03732, 2022 - arxiv.org

… -Wasserstein quantum divergences to each points. Many interesting operator inequalities of

the right mean with the matrix power mean including the Cartan mean are presented. Moreover,

we verify the trace inequality with the Wasserstein mean … with the Wasserstein mean and …

 All 2 versions 

 2022


[PDF] arxiv.org

Optimal 1-Wasserstein Distance for WGANs

A Stéphanovitch, U Tanielian, B Cadre… - arXiv preprint arXiv …, 2022 - arxiv.org

… of the generated distributions, we provide a thorough analysis of Wasserstein GANs (WGANs)

in both the finite sample and asymptotic regimes. … We also highlight the fact that WGANs are

able to approach (for the 1-Wasserstein distance) the target distribution as the sample size …

All 2 versions 


Pesticide detection combining the Wasserstein generative adversarial network and the residual neural network based on terahertz spectroscopy

R Yang, Y Li, B Qin, D Zhao, Y Gan, J Zheng - RSC Advances, 2022 - pubs.rsc.org

… In this study, we proposed a WGAN-ResNet method, which combines two deep learning

networks, the Wasserstein generative adversarial network (WGAN) and the residual neural

network (ResNet), to detect carbendazim based on terahertz spectroscopy. The Wasserstein …

 

Improving Text Classifiers Through Controlled Text Generation Using Transformer Wasserstein Autoencoder

C Harikrishnan, NM Dhanya - Inventive Communication and …, 2022 - Springer

… This paper proposes a technique for generating controlled text using the transformer-based

Wasserstein autoencoder which helps in improving the classifiers. The paper compares the

results with classifiers trained on data generated by other synthetic data generators. …

 Cited by 1 Related articles All 3 versions

[PDF] siam.org

Approximating 1-Wasserstein Distance between Persistence Diagrams by Graph Sparsification

TK Dey, S Zhang - 2022 Proceedings of the Symposium on Algorithm …, 2022 - SIAM

Persistence diagrams (PD)s play a central role in topological data analysis. This analysis

requires computing distances among such diagrams such as the 1-Wasserstein distance.

Accurate computation of these PD distances for large data sets that render large diagrams may …

Cited by 1 Related articles All 5 versions
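
Note: for reference (the standard definition used in topological data analysis, not new to this paper), the 1-Wasserstein distance between persistence diagrams D_1 and D_2 is

$$ W_1(D_1,D_2) \;=\; \inf_{\eta:D_1\to D_2} \; \sum_{p\in D_1} \|p-\eta(p)\|_\infty, $$

where $\eta$ ranges over bijections between the diagrams, each augmented with the points of the diagonal so that unmatched features can be matched to their nearest diagonal point.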

 [PDF] researchgate.net

[PDF] Gradient Penalty Approach for Wasserstein Generative Adversarial Networks

Y Ti - researchgate.net

… networks (GANs) in Wasserstein GAN (WGANs). We will understand the working of these

WGAN generators and discriminator structures as … While we have discussed the concept of

DCGANs in some of our previous articles, in this blog, we will focus on the WGAN networks for …

 Related articles 
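
Note: for context, the gradient-penalty formulation usually meant by WGAN-GP (a standard formula due to Gulrajani et al., not taken from the cited note) softly enforces the critic's 1-Lipschitz constraint:

$$ L \;=\; \mathbb{E}_{\tilde x\sim p_g}[D(\tilde x)] - \mathbb{E}_{x\sim p_{\mathrm{data}}}[D(x)] + \lambda\,\mathbb{E}_{\hat x}\big[(\|\nabla_{\hat x}D(\hat x)\|_2-1)^2\big], $$

with $\hat x$ sampled uniformly along segments between real and generated points and $\lambda$ a penalty weight (commonly 10).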

<——2022———2022———50—  


 2022  see 2021

SVAE-WGAN-Based Soft Sensor Data Supplement Method for ...

https://ui.adsabs.harvard.edu › abs › abstract

SVAE-WGAN-Based Soft Sensor Data Supplement Method for Process Industry ... Pub Date: January 2022; DOI: 10.1109/JSEN.2021.3128562; Bibcode: 2022ISenJ..22..

SVAE-WGAN-Based Soft Sensor Data Supplement Method for Process Industry
by Gao, Shiwei; Qiu, Sulong; Ma, Zhongyu ; More...
IEEE sensors journal, 01/2022, Volume 22, Issue 1
Challenges of process industry, which is characterized as hugeness of process variables in complexity of industrial environment, can be tackled effectively by...
Journal Article Full Text Online

 
2022  see 2021

Wasserstein autoregressive models for density time ... - EconPapers

https://econpapers.repec.org › article › blajtsera › v_3a...

Wasserstein autoregressive models for density time series. Chao Zhang, Piotr Kokoszka and Alexander Petersen. Journal of Time Series Analysis, 2022, vol.
Wasserstein autoregressive models for density time series
by Zhang, Chao; Kokoszka, Piotr; Petersen, Alexander
Journal of time series analysis, 01/2022, Volume 43, Issue 1
Data consisting of time‐indexed distributions of cross‐sectional or intraday returns have been extensively studied in finance, and provide one example in which...

Cited by 16 Related articles All 7 versions
Zbl 07476226
  MR4400283


2022 see 2021

Wasserstein Distances, Geodesics and Barycenters of Merge ...

https://www.computer.org › csdl › journal › 2022/01

by M Pont · 2022 · Cited by 2 — This paper presents a unified computational framework for the estimation of distances, geodesics and barycenters of merge trees.
Wasserstein Distances, Geodesics and Barycenters of Merge Trees
by Pont, Mathieu; Vidal, Jules; Delon, Julie ; More...
IEEE transactions on visualization and computer graphics, 01/2022, Volume 28, Issue 1
This paper presents a unified computational framework for the estimation of distances, geodesics and barycenters of merge trees. We extend recent work on the...
Journal Article Full Text Online

Cited by 2 All 50 versions

Sliced Wasserstein Distance for Neural Style Transfer

by Li, Jie; Xu, Dan; Yao, Shaowen

Computers & graphics, 02/2022, Volume 102

Neural Style Transfer (NST) aims to render a content image with the style of another image in the feature space of a Convolution Neural Network (CNN). A...

Article PDF Journal Article

Related articles All 2 versions


[HTML] springer.com

[HTML] Proxying credit curves via Wasserstein distances

M Michielon, A Khedher, P Spreij - Annals of Operations Research, 2022 - Springer

… investigate whether the Wasserstein square distance can be … In particular, we show how 

using the Wasserstein distance … the essential concepts on Wasserstein distances relevant for the …

Cited by 1 Related articles All 5 versions

 2022

 

[HTML] mdpi.com

Simulating Multi-Asset Classes Prices Using Wasserstein Generative Adversarial Network: A Study of Stocks, Futures and Cryptocurrency

F Han, X Ma, J Zhang - Journal of Risk and Financial Management, 2022 - mdpi.com

… WGAN-GP to generate richer data with noise to discover hidden characteristics. To

ensure the quality of generated real and fake data discriminated by WGAN… Inspired by the

generator and discriminator ideas, we implement a Wasserstein GAN with Gradient Penalty (WGAN-GP) …

Related articles All 5 versions 


Finger vein image inpainting using neighbor binary-wasserstein generative adversarial networks (NB-WGAN)

H Jiang, L Shen, H Wang, Y Yao, G Zhao - Applied Intelligence, 2022 - Springer

Traditional inpainting methods obtain poor performance for finger vein images with blurred

texture. In this paper, a finger vein image inpainting method using Neighbor Binary-Wasserstein

Generative Adversarial Networks (NB-WGAN) is proposed. Firstly, the proposed algorithm …

 

[PDF] ams.org

Maps on positive definite cones of 𝐶*-algebras preserving the Wasserstein mean

L Molnár - Proceedings of the American Mathematical Society, 2022 - ams.org

… In this paper we consider a new type of means called Wasserstein mean, recently introduced

by Bhatia, … the Wasserstein mean AσwB of A and B; see (2) above. Further results on σw

were obtained in [2], [4], also see [13], [16]. We note that the definition of the Bures-Wasserstein …

 Zbl 07469054

Cited by 2 Related articles All 3 versions


 2022  see 2021

Rate of convergence for particle approximation of PDEs in ...

https://ideas.repec.org › hal › journl › hal-03154021

by M Germain · 2022 · Cited by 2 — We prove a rate of convergence for the $N$-particle approximation of a second-order partial differential equation in the space of probability measures, ...
Rate of convergence for particle approximation of PDEs in Wasserstein space
by Germain, Maximilien; Pham, Huyên; Warin, Xavier
Journal of applied probability, 2022, Volume 59, Issue 4
We prove a rate of convergence for the $N$-particle approximation of a second-order partial differential equation in the space of probability measures, like...
Journal Article Full Text Online

MR4507678


Dynamical Mode Recognition of Triple Flickering Buoyant Diffusion Flames: from Physical Space to Phase Space and to Wasserstein...

by Chi, Yicheng; Yang, Tao; Zhang, Peng

01/2022

Triple flickering buoyant diffusion flames in an isosceles triangle arrangement were experimentally studied as a nonlinear dynamical system of coupled...

All 3 versions 

<——2022———2022———60— 



Distributionally Safe Path Planning: Wasserstein Safe RRT
by Lathrop, Paul; Boardman, Beth; Martinez, Sonia
IEEE robotics and automation letters, 01/2022, Volume 7, Issue 1
In this paper, we propose a Wasserstein metric-based random path planning algorithm. Wasserstein Safe RRT (W-Safe RRT) provides finite-sample probabilistic...
 Journal Article Full Text Online


Wasserstein Loss With Alternative Reinforcement Learning for Severity-Aware Semantic Segmentation
by Liu, Xiaofeng; Lu, Yunhong; Liu, Xiongchang ; More...
IEEE transactions on intelligent transportation systems, 01/2022, Volume 23, Issue 1
Semantic segmentation is important for many real-world systems, e.g., autonomous vehicles, which predict the class of each pixel. Recently, deep networks...
Journal Article Full Text Online



Bounding Kolmogorov distances through Wasserstein and related integral probability...
by Gaunt, Robert E; Li, Siqi
01/2022
We establish general upper bounds on the Kolmogorov distance between two probability distributions in terms of the distance between these distributions as...
Journal Article  Full Text Online
Bounding Kolmogorov distances through Wasserstein and related integral probability...
by Gaunt, Robert E; Li, Siqi
arXiv.org, 01/2022
We establish general upper bounds on the Kolmogorov distance between two probability distributions in terms of the distance between these distributions as...
Paper  Full Text Online


2022



[HTML] rsc.org

[HTML] Pesticide detection combining the Wasserstein generative adversarial network and the residual neural network based on terahertz spectroscopy

R Yang, Y Li, B Qin, D Zhao, Y Gan, J Zheng - RSC Advances, 2022 - pubs.rsc.org

… Wasserstein generative adversarial network (WGAN) and the residual neural network (ResNet), 

to detect carbendazim based on terahertz spectroscopy. The Wasserstein … The Wasserstein 

generative adversarial network was used for generating more new learning samples. At …

Cited by 4 Related articles All 6 versions

2022


[PDF] siam.org

Approximating 1-Wasserstein Distance between Persistence Diagrams by Graph Sparsification

TK Dey, S Zhang - 2022 Proceedings of the Symposium on Algorithm …, 2022 - SIAM

… This analysis requires computing distances among such diagrams such as the 1-Wasserstein … The 1-Wasserstein (W1) distance is a common distance to compare persistence diagrams; … the 1-Wasserstein distance called here the W1-distance that improves the state-of-the-art. …

Cited by 1 Related articles All 5 versions

arXiv:2201.07523  [pdf, ps, other]  math.PR   math.FA
Wasserstein contraction and Poincaré inequalities for elliptic diffusions at high temperature
Authors: Pierre Monmarché
Abstract: We consider elliptic diffusion processes on R^d. Assuming that the drift contracts distances outside a compact set, we prove that, at a sufficiently high temperature, the Markov semi-group associated to the process is a contraction of the W2 Wasserstein distance, which implies a Poincaré inequality for its invariant measure. The result doesn't require neither reversibility no… 
More
Submitted 19 January, 2022; originally announced January 2022.
MSC Class: 60J60

[PDF] arxiv.org

A new method for determining Wasserstein 1 optimal transport maps from Kantorovich potentials, with deep learning applications

T Milne, É Bilocq, A Nachman - arXiv preprint arXiv:2211.00820, 2022 - arxiv.org

… We apply TTC to restore images that have been corrupted with Gaussian noise. Specifically, 

we follow the experimental framework of [Lunz et al., 2018], where the target distribution ν …

Related articles All 2 versions

arXiv:2201.08157  [pdf, other]  cs.CV   cs.LG   eess.IV
WPPNets: Unsupervised CNN Training with Wasserstein Patch Priors for Image Superresolution
Authors: Fabian Altekrüger, Johannes Hertrich
Abstract: We introduce WPPNets, which are CNNs trained by a new unsupervised loss function for image superresolution of materials microstructures. Instead of requiring access to a large database of registered high- and low-resolution images, we only assume to know a large database of low resolution images, the forward operator and one high-resolution reference image. Then, we propose a loss function based o…  More
Submitted 20 January, 2022; originally announced January 2022.

 Cited by 1 Related articles All 2 versions 

<——2022———2022———70— 


2022 see 2021

Wang, Feng-Yu

Wasserstein convergence rate for empirical measures on noncompact manifolds. (English) Zbl 07457774

Stochastic Processes Appl. 144, 271-287 (2022).

MSC:  58J65 60J60


Cited by 6 Related articles All 6 versions

2022

 MR4365941 Prelim Ma, Xiaohui; El Machkouri, Mohamed; Fan, Xiequan; 

On Wasserstein-1 distance in the central limit theorem for elephant random walk. J. Math. Phys. 63 (2022), no. 1, Paper No. 013301, 13 pp. 60


MR4365941

[PDF] arxiv.org

On Wasserstein-1 distance in the central limit theorem for elephant random walk

Cited by 1 Related articles All 5 versions

Zbl 07629197

2022 see 2021

Wasserstein convergence rate for empirical ... - Science Direct
https://www.sciencedirect.com › science › article › abs › pii


2022 see 2021

Wasserstein convergence rate for empirical measures on ...

https://econpapers.repec.org › article › eeespapps

by FY Wang · 2022 · Cited by 5 — Wasserstein convergence rate for empirical measures on noncompact manifolds ... Abstract: Let Xt be the (reflecting) diffusion process generated ...

Zbl 07457774

Cited by 12 Related articles All 5 versions

A novel multi-speakers Urdu singing voices synthesizer using Wasserstein Generative Adversarial Network

A Saeed, MF Hayat, T Habib, DA Ghaffar… - Speech …, 2022 - Elsevier

In this paper, the first-ever Urdu language singing voices corpus is developed using linguistic

(phonetic) and vocoder (F0 contours) features. Singer identity feature vector along with the

Urdu singing voices corpus is used in the synthesis of multi speakers Urdu singing voices …

 All 2 versions

2022

2022 see 2021  [PDF] umons.ac.be

Wasserstein distributionally robust chance-constrained optimization for energy and reserve dispatch: An exact and physically-bounded formulation

A Arrigo, C Ordoudis, J Kazempour, Z De Grève… - European Journal of …, 2022 - Elsevier

… In this paper, we develop a distributionally robust chance-constrained optimization with a

Wasserstein ambiguity set for energy and reserve … robust chance-constrained energy and

reserve dispatch optimization problem with Wasserstein distance, which is the focus of this paper. …

Cited by 19 Related articles All 8 versions
Wasserstein distributionally robust chance-constrained optimization for energy and reserve...
by Arrigo, Adriano; Ordoudis, Christos; Kazempour, Jalal ; More...
European journal of operational research, 01/2022, Volume 296, Issue 1
•A distributionally robust energy and reserve dispatch problem is developed.•An exact reformulation for distributionally robust chance constraints is...
 Journal Article Full Text Online


A new data generation approach with modified Wasserstein auto-encoder for rotating machinery fault diagnosis with limited fault data

K Zhao, H Jiang, C Liu, Y Wang, K Zhu - Knowledge-Based Systems, 2022 - Elsevier

… a modified Wasserstein auto-encoder (MWAE) to generate data that are highly similar to

the known data. The sliced Wasserstein distance is … The sliced Wasserstein distance with a

gradient penalty is designed as the regularization term to minimize the difference between the …

 All 2 versions

2022  see 2021  [PDF] arxiv.org

Minimum entropy production, detailed balance and Wasserstein distance for continuous-time Markov processes

A Dechant - Journal of Physics A: Mathematical and Theoretical, 2022 - iopscience.iop.org

We investigate the problem of minimizing the entropy production for a physical process that

can be described in terms of a Markov jump dynamics. We show that, without any further

constraints, a given time-evolution may be realized at arbitrarily small entropy production, yet at …

Cited by 5 Related articles All 3 versions

MR4378886 

[PDF] ieee.org

Wasserstein Generative Adversarial Network for Depth Completion with Anisotropic Diffusion Depth Enhancement

TM Nguyen, M Yoo - IEEE Access, 2022 - ieeexplore.ieee.org

… We used an adapted Wasserstein Generative Adversarial Network architecture instead of

applying the traditional autoencoder approach and post-processing process to preserve valid

depth measurements received from the input and further enhance the depth value precision …
Cited by 4
 Related articles All 2 versions


[PDF] arxiv.org

Wasserstein contraction and Poincar\'e inequalities for elliptic diffusions at high temperature

P Monmarché - arXiv preprint arXiv:2201.07523, 2022 - arxiv.org

We consider elliptic diffusion processes on $\mathbb R^d$. Assuming that the drift contracts

distances outside a compact set, we prove that, at a sufficiently high temperature, the Markov

semi-group associated to the process is a contraction of the $\mathcal W_2$ Wasserstein …

Related articles All 4 versions 

<——2022———2022———80— 


[HTML] springer.com

[HTML] Short-term prediction of wind power based on BiLSTM–CNN–WGAN-GP

L Huang, L Li, X Wei, D Zhang - Soft Computing, 2022 - Springer

A short-term wind power prediction model based on BiLSTM–CNN–WGAN-GP (LCWGAN-GP)

is proposed in this paper, aiming at the problems of instability and low prediction accuracy

of short-term wind power prediction. Firstly, the original wind energy data are decomposed …

 Cited by 3 Related articles


2022 see 2021

Zhaohui Tang (0000-0003-4132-4987) - ORCID

https://orcid.org › ...

Jan 11, 2022 — Illumination-Invariant Flotation Froth Color Measuring via Wasserstein Distance-Based CycleGAN With Structure-Preserving Constraint.

 Cited by 34 Related articles All 3 versions


2022

GMT-WGAN: An Adversarial Sample Expansion Method for ...

https://www.mdpi.com › ...

by X Yao · 2022 — ... this paper proposes a Wasserstein generative adversarial network (WGAN) sample enhanc


2022

Dynamic Persistent Homology for Brain Networks via Wasserstein Graph Clustering

MK Chung, SG Huang, IC Carroll, VD Calhoun… - arXiv preprint arXiv …, 2022 - arxiv.org

… Wasserstein graph clustering for dynamically changing graphs. The Wasserstein clustering

penalizes the topological discrepancy between graphs… The final statistical analysis results

change depending on the choice of threshold or parameter [13, 32]. There is a need to develop …

All 3 versions 


2022  Cover Image

Wasserstein Loss With Alternative Reinforcement Learning for Severity-Aware...

by Liu, Xiaofeng; Lu, Yunhong; Liu, Xiongchang ; More...

IEEE transactions on intelligent transportation systems, 01/2022, Volume 23, Issue 1

Semantic segmentation is important for many real-world systems, e.g., autonomous vehicles, which predict the class of each pixel. Recently, deep networks...

Journal Article Full Text Online

 

2022    


2022 see 2021  [PDF] arxiv.org

Wasserstein patch prior for image superresolution

J Hertrich, A Houdard… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org

… To overcome this problem, we introduce a Wasserstein patch prior for unsupervised … 

The propo

Cited by 2 Related articles All 5 versions

 

2022

Stochastic Gradient Descent in Wasserstein Space

J Backhoff-Veraguas, J Fontbona, G Rios… - arXiv preprint arXiv …, 2022 - arxiv.org

We present and study a novel algorithm for the computation of 2-Wasserstein population

barycenters. The proposed method admits, at least, two relevant interpretations: it can be

seen as a stochastic gradient descent procedure in the 2-Wasserstein space, and also as a

manifestation of a Law of Large Numbers in the same space. Building on stochastic gradient

descent, the proposed algorithm is online and computationally inexpensive. Furthermore,

we show that the noise in the method can be attenuated via the use of mini-batches, and …

Cited by 4 Related articles All 2 versions

Stochastic Gradient Descent in Wasserstein Space

by Backhoff-Veraguas, Julio; Fontbona, Joaquin; Rios, Gonzalo ; More...

01/2022

We present and study a novel algorithm for the computation of 2-Wasserstein population barycenters. The proposed method admits, at least, two relevant...

Journal Article Full Text Online

Cited by 4 Related articles All 2 versions 

2022  https://arxiv.org › math-ph

Right mean for the α-z Bures-Wasserstein quantum divergence

by M Jeong · 2022 — We investigate in this paper properties of the right mean, which is a unique minimizer of the weighted sum of \alpha-z Bures-Wasserstein quantum ...

Right mean for the $\alpha-z$ Bures-Wasserstein quantum divergence

by Jeong, Miran; Hwang, Jinmi; Kim, Sejong

01/2022

A new quantum divergence induced from the $\alpha-z$ Renyi relative entropy, called the $\alpha-z$ Bures-Wasserstein quantum divergence, has been recently...

Journal Article Full Text Online

 

 

2022

Isometric rigidity of Wasserstein spaces: the graph metric case

https://arxiv.org › math

by G Kiss · 2022 — The aim of this paper is to prove that the p-Wasserstein space \mathcal{W}_p(X) is isometrically rigid for all p\geq 1 whenever X is a countable ...

Isometric rigidity of Wasserstein spaces: the graph metric case

by Kiss, Gergely; Titkos, Tamás

01/2022

The aim of this paper is to prove that the $p$-Wasserstein space $\mathcal{W}_p(X)$ is isometrically rigid for all $p\geq 1$ whenever $X$ is a countable graph...

Journal Article Full Text Online

Cited by 2 Related articles All 9 versions

 Zbl 07554413


2022

Unsupervised CNN Training with Wasserstein Patch Priors for ...

https://arxiv.org › cs

by F Altekrüger · 2022 — We introduce WPPNets, which are CNNs trained by a new unsupervised loss function for image superresolution of materials microstructures. Instead ...

WPPNets: Unsupervised CNN Training with Wasserstein Patch Priors for Image...

by Altekrüger, Fabian; Hertrich, Johannes

01/2022

We introduce WPPNets, which are CNNs trained by a new unsupervised loss function for image superresolution of materials microstructures. Instead of requiring...

Journal Article Full Text Online

 Related articles All 2 versions

<——2022———2022———90— 


2022 see 2021

WATCH: Wasserstein Change Point Detection for ... - X-MOL

https://www.x-mol.com › paper

· Translate this page

Jan 18, 2022 — Detecting relevant changes in dynamic time series data in a timely manner is crucially important for many data analysis tasks in real-world ...

WATCH: Wasserstein Change Point Detection for High-Dimensional Time...

by Faber, Kamil; Corizzo, Roberto; Sniezynski, Bartlomiej ; More...

2021 IEEE International Conference on Big Data (Big Data), 12/2021

Detecting relevant changes in dynamic time series data in a timely manner is crucially important for many data analysis tasks in real-world settings. Change...

Conference Proceeding 

Full Text Online

 

2022  [PDF] arxiv.org

Invariance encoding in sliced-Wasserstein space for image classification with limited training data

Y Zhuang, S Li, AHM Rubaiyat, X Yin… - arXiv preprint arXiv …, 2022 - arxiv.org

Deep convolutional neural networks (CNNs) are broadly considered to be state-of-the-art

generic end-to-end image classification systems. However, they are known to underperform

when training data are limited and thus require data augmentation strategies that render the

method computationally expensive and not always effective. Rather than using a data

augmentation strategy to encode invariances as typically done in machine learning, here we

propose to mathematically augment a nearest subspace classification model in sliced …

 Related articles All 2 versions

Invariance encoding in sliced-Wasserstein space for image classification with limited...

by Shifat-E-Rabbi, Mohammad; Zhuang, Yan; Li, Shiying ; More...

01/2022

Deep convolutional neural networks (CNNs) are broadly considered to be state-of-the-art generic end-to-end image classification systems. However, they are...

Journal Article Full Text Online

 Related articles All 2 versions


2022

Quantitative control of Wasserstein distance between ... - arXiv

https://arxiv.org › math

by G Barrera · 2022 — In this manuscript, we provide a non-asymptotic process level control between the telegraph process and the Brownian motion with suitable ...

Quantitative control of Wasserstein distance between Brownian motion and...

by Barrera, Gerardo; Lukkarinen, Jani

01/2022

In this manuscript, we provide a non-asymptotic process level control between the telegraph process and the Brownian motion with suitable diffusivity constant...

Journal Article Full Text Online

Related articles All 3 versions

2022

Finger vein image inpainting using neighbor binary ...

https://link.springer.com › article

by H Jiang · 2022 — In this paper, a finger vein image inpainting method using Neighbor Binary-Wasserstein Generative Adversarial Networks (NB-WGAN) is proposed ...

Finger vein image inpainting using neighbor binary-wasserstein generative adversarial networks (NB-WGAN)

Jiang, HQ; Shen, L; (...); Zhao, GD

Jan 2022 (Early Access) | APPLIED INTELLIGENCE

Traditional inpainting methods obtain poor performance for finger vein images with blurred texture. In this paper, a finger vein image inpainting method using Neighbor Binary-Wasserstein Generative Adversarial Networks (NB-WGAN) is proposed. Firstly, the proposed algorithm uses texture loss, reconstruction loss, and adversarial loss to constrain the network, which protects the texture in the inpainting process. Secondly, the proposed NB-WGAN is designed with a coarse-to-precise generator network and a discriminator network composed of two Wasserstein Generative Adversarial Networks with Gradient Penalty (WGAN-GP). The cascade of a coarse generator network and a precise generator network based on Poisson fusion can obtain richer information and get natural boundary connection. The discriminator consists of a global WGAN-GP and a local WGAN-GP, which enforces consistency between the entire image and the repaired area. Thirdly, a training dataset is designed by analyzing the locations and sizes of the damaged finger vein images in practical applications (i.e., physical oil dirt, physical finger molting, etc). Experimental results show that the performance of the proposed algorithm is better than traditional inpainting methods including Curvature Driven Diffusions algorithm without texture constraints, a traditional inpainting algorithm with Gabor texture constraints, and a WGAN inpainting algorithm based on attention mechanism without texture constraints.

Related articles

https://scholars.ttu.edu › publications › a-new-method-o...

A new method of image restoration technology based on WGAN

by W Fang · 2022 — With the development of image restoration technology based on deep learning, more complex problems are being solved, especially in image semantic inpainting ...

A New Method of Image Restoration Technology Based on WGAN

Fang, W; Gu, EM; (...); Sheng, VS

2022 | COMPUTER SYSTEMS SCIENCE AND ENGINEERING 41 (2) , pp.689-698

With the development of image restoration technology based on deep learning, more complex problems are being solved, especially in image semantic inpainting based on context. Nowadays, image semantic inpainting techniques are becoming more mature. However, due to the limitations of memory, the instability of training, and the lack of sample diversity, the results of image restoration are still encountering difficult problems, such as repairing the content of glitches which cannot be well integrated with the original image. Therefore, we propose an image inpainting network based on Wasserstein generative adversarial network (WGAN) distance. With the corresponding technology having been adjusted and improved, we attempted to use the Adam algorithm to replace the traditional stochastic gradient descent, and another algorithm to optimize the training used in recent years. We evaluated our algorithm on the ImageNet dataset. We obtained high-quality restoration results, indicating that our algorithm improves the clarity and consistency of the image.

 Related articles All 2 versions


 2022


Dynamic Facial Expression Generation on Hilbert ...

https://www.computer.org › csdl › journal › 2022/02

by N Otberdout · 2022 · Cited by 7 — We propose to exploit the face geometry by modeling the facial landmarks motion as curves encoded as points on a hypersphere. By proposing a conditional version ...

Dynamic Facial Expression Generation on Hilbert Hypersphere With Conditional Wasserstein Generative Adversarial Nets

Otberdout, N; Daoudi, M; (...); Berretti, S

Feb 1 2022 | IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE 44 (2) , pp.848-863

In this work, we propose a novel approach for generating videos of the six basic facial expressions given a neutral face image. We propose to exploit the face geometry by modeling the facial landmarks motion as curves encoded as points on a hypersphere. By proposing a conditional version of manifold-valued Wasserstein generative adversarial network (GAN) for motion generation on the hypersphere, we learn the distribution of facial expression dynamics of different classes, from which we synthesize new facial expression motions. The resulting motions can be transformed to sequences of landmarks and then to images sequences by editing the texture information using another conditional Generative Adversarial Network. To the best of our knowledge, this is the first work that explores manifold-valued representations with GAN to address the problem of dynamic facial expression generation. We evaluate our proposed approach both quantitatively and qualitatively on two public datasets; Oulu-CASIA and MUG Facial Expression. Our experimental results demonstrate the effectiveness of our approach in generating realistic videos with continuous motion, realistic appearance and identity preservation. We also show the efficiency of our framework for dynamic facial expressions generation, dynamic facial expression transfer and data augmentation for training improved emotion recognition models.

 5  Citations  62 References

 

 

2022  see 2021

Approximation rate in Wasserstein distance of ... - Science Direct

https://www.sciencedirect.com › science › article › abs › pii

by O Bencheikh · 2022 — We are interested in the approximation in Wasserstein distance with index ρ ≥ 1 of a probability measure μ on the real line with finite ...

Approximation rate in Wasserstein distance of probability measures on the real line by deterministic empirical measures

Bencheikh, O and Jourdain, B

Feb 2022 | JOURNAL OF APPROXIMATION THEORY 274

We are interested in the approximation in Wasserstein distance with index rho >= 1 of a probability measure mu on the real line with finite moment of order rho by the empirical measure of N deterministic points. The minimal error converges to 0 as N -> +infinity and we try to characterize the order associated with this convergence. In Xu and Berger (2019), Xu and Berger show that, apart when mu is a Dirac mass and the error vanishes, the order is not larger than 1 and give a sufficient condition for the order to be equal to this threshold 1 in terms of the density of the absolutely continuous with respect to the Lebesgue measure part of mu. They also prove that the order is not smaller than 1/rho when the support of mu is bounded and not larger when the support is not an interval. We complement these results by checking that for the order to lie in the interval (1/rho,1), the support has to be bounded and by stating a necessary and sufficient condition in terms of the tails of mu for the order to be equal to some given value in the interval (0,1/rho), thus precising the sufficient condition in terms of moments given in Xu and Berger (2019). We also give a necessary condition for the order to be equal to the boundary value 1/rho. In view of practical application, we emphasize that in the proof of each result about the order of convergence of the minimal error, we exhibit a choice of points explicit in terms of the quantile function of mu which exhibits the same order of convergence. (c) 2021 Elsevier Inc. All rights reserved.

17 References

Zbl 07466633

Cited by 4 Related articles All 6 versions
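
Note: for reference (a classical identity on the real line, not a result of this paper), W_ρ admits the quantile representation

$$ W_\rho^\rho(\mu,\nu) \;=\; \int_0^1 \big|F_\mu^{-1}(u)-F_\nu^{-1}(u)\big|^\rho\,du, $$

with $F_\mu^{-1}$ the quantile function of $\mu$; this is why the optimal deterministic point choices above can be written explicitly in terms of the quantile function of $\mu$.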


2022 see 2021  [PDF] arxiv.org

Minimum entropy production, detailed balance and Wasserstein distance for continuous-time Markov processes

A Dechant - Journal of Physics A: Mathematical and Theoretical, 2022 - iopscience.iop.org

detailed process instead of just the initial and final configurations is specified [39, 40]. Here, 

the Wasserstein path length, which is the sum over infinitesimal Wasserstein … The connection 

between minimum entropy production and Wasserstein distance is useful because it allows …

Cited by 21 Related articles All 3 versions

Zbl 07615836


arXiv:2201.11305  [pdf, other]  math.NA
The Quadratic Wasserstein Metric With Squaring Scaling For Seismic Velocity Inversion
Authors: Zhengyang Li, Yijia Tang, Jing Chen, Hao Wu
Abstract: The quadratic Wasserstein metric has shown its power in measuring the difference between probability densities, which benefits optimization objective function with better convexity and is insensitive to data noise. Nevertheless, it is always an important question to make the seismic signals suitable for comparison using the quadratic Wasserstein metric. The squaring scaling is worth exploring sinc…  More
Submitted 26 January, 2022; originally announced January 2022.
Comments: 21 pages, 6 figures
MSC Class: 49N45; 65K10; 86-08; 86A15

Cited by 2 Related articles All 2 versions 

2022

Wasserstein contraction and Poincaré inequalities for elliptic ...
https://arxiv.org › math
by P Monmarché · 2022 — Abstract: We consider elliptic diffusion processes on \mathbb R^d. Assuming that the drift contracts distances outside a compact set, ...

<——2022———2022———100—  


BWGAN-GP: An EEG data generation method for class imbalance problem in RSVP tasks

M Xu, Y Chen, Y Wang, D Wang, Z Liu… - IEEE transactions on … - pubmed.ncbi.nlm.nih.gov

… generative adversarial network with gradient penalty (BWGAN-GP) to generate RSVP minority 

… The average AUC obtained with BWGAN-GP on EEGNet was 94.43%, an increase of 3.7% 

… These results show that the BWGAN-GP could effectively alleviate CIPs in the RSVP task …

 Cited by 3 Related articles All 3 versions

Breton, Jean-ChristophePrivault, Nicolas

Wasserstein distance estimates for stochastic integrals by forward-backward stochastic calculus. (English) Zbl 07464232

Potential Anal. 56, No. 1, 1-20 (2022).

MSC:  60H05 60H10 60G57 60G44 60J60 60J75


Zbl 07464232


arXiv:2202.00954  [pdf, other]  math.NA  cs.LG
Approximative Algorithms for Multi-Marginal Optimal Transport and Free-Support Wasserstein Barycenters
Authors: Johannes von Lindheim
Abstract: Computationally solving multi-marginal optimal transport (MOT) with squared Euclidean costs for N discrete probability measures has recently attracted considerable attention, in part because of the correspondence of its solutions with Wasserstein-2 barycenters, which have many applications in data science. In general, this problem is NP-hard, calling for practical approximative algorithms. Whi… 
More
Submitted 2 February, 2022; originally announced February 2022.

Approximative Algorithms for Multi-Marginal Optimal Transport and Free-Support Wass...
by von Lindheim, Johannes
02/2022
Computationally solving multi-marginal optimal transport (MOT) with squared Euclidean costs for $N$ discrete probability measures has recently attracted...
Journal Article  Full Text Online

All 2 versions


arXiv:2202.00808  [pdf, other]  cs.LG   cs.CR
Gromov-Wasserstein Discrepancy with Local Differential Privacy for Distributed Structural Graphs
Authors: Hongwei Jin, Xun Chen
Abstract: Learning the similarity between structured data, especially the graphs, is one of the essential problems. Besides the approach like graph kernels, Gromov-Wasserstein (GW) distance recently draws big attention due to its flexibility to capture both topological and feature characteristics, as well as handling the permutation invariance. However, structured data are widely distributed for different d… 
More
Submitted 1 February, 2022; originally announced February 2022.
Working Paper  Full Text

Gromov-Wasserstein Discrepancy with Local Differential Privacy for Distributed Structural Graphs
Jin, Hongwei; Chen, Xun.arXiv.org; Ithaca, Feb 1, 2022.

Cited by 2
 Related articles All 2 versions 

arXiv:2201.13386  [pdf, other]  math.NA   math.CA   math.PR
On a linearization of quadratic Wasserstein distance
Authors: Philip Greengard, Jeremy G. Hoskins, Nicholas F. Marshall, Amit Singer
Abstract: This paper studies the problem of computing a linear approximation of quadratic Wasserstein distance W2. In particular, we compute an approximation of the negative homogeneous weighted Sobolev norm whose connection to Wasserstein distance follows from a classic linearization of a general Monge-Ampère equation. Our contribution is threefold. First, we provide expository material on this classic…  More
Submitted 31 January, 2022; originally announced January 2022.
Comments: 24 pages, 6 figures
Working Paper  Full Text

On a linearization of quadratic Wasserstein distance
Greengard, Philip; Hoskins, Jeremy G; Marshall, Nicholas F; Singer, Amit.arXiv.org; Ithaca, Jan 31, 2022.


Related articles All 5 versions 


2022

arXiv:2201.12797  [pdf, ps, other]  math.PR
Wasserstein Convergence Rates for Empirical Measures of Subordinated Processes on Noncompact Manifolds
Authors: Huaiqian Li, Bingyao Wu
Abstract: The asymptotic behaviour of empirical measures has been studied extensively. In this paper, we consider empirical measures of given subordinated processes on complete (not necessarily compact) and connected Riemannian manifolds with possibly nonempty boundary. We obtain rates of convergence for empirical measures to the invariant measure of the subordinated process under the Wasserstein distance.…  More
Submitted 30 January, 2022; originally announced January 2022.
Comments: Comments welcome!
All 2 versions
 

arXiv:2201.12324  [pdf, other]  cs.LG  stat.ML
Optimal Transport Tools (OTT): A JAX Toolbox for all things Wasserstein
Authors: Marco Cuturi, Laetitia Meng-Papaxanthos, Yingtao Tian, Charlotte Bunne, Geoff Davis, Olivier Teboul
Abstract: Optimal transport tools (OTT-JAX) is a Python toolbox that can solve optimal transport problems between point clouds and histograms. The toolbox builds on various JAX features, such as automatic and custom reverse mode differentiation, vectorization, just-in-time compilation and accelerators support. The toolbox covers elementary computations, such as the resolution of the regularized OT problem,… 
More
Submitted 28 January, 2022; originally announced January 2022.
Comments: 4 pages

Cited by 4 All 2 versions 
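
The OTT abstract above mentions the resolution of the regularized OT problem as an elementary computation. As a rough illustration of what such a computation involves, here is a minimal NumPy sketch of entropic optimal transport solved with Sinkhorn iterations; it is generic background, not OTT-JAX's actual API, and the function name sinkhorn, the regularization eps and the iteration count are illustrative choices.

import numpy as np

def sinkhorn(a, b, C, eps=0.05, n_iters=500):
    """Approximate entropic OT between histograms a and b with cost matrix C."""
    K = np.exp(-C / eps)              # Gibbs kernel
    u, v = np.ones_like(a), np.ones_like(b)
    for _ in range(n_iters):
        u = a / (K @ v)               # scale rows to match marginal a
        v = b / (K.T @ u)             # scale columns to match marginal b
    P = u[:, None] * K * v[None, :]   # transport plan
    return np.sum(P * C), P

# toy usage: two small point clouds with uniform weights
rng = np.random.default_rng(0)
x, y = rng.normal(size=(5, 2)), rng.normal(size=(7, 2)) + 1.0
C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)   # squared Euclidean costs
C /= C.max()                                         # normalize for numerical stability
cost, plan = sinkhorn(np.full(5, 0.2), np.full(7, 1 / 7), C)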

arXiv:2201.12245  [pdf, other]  cs.LG  stat.ML
Wasserstein Iterative Networks for Barycenter Estimation
Authors: Alexander Korotin, Vage Egiazarian, Lingxiao Li, Evgeny Burnaev
Abstract: Wasserstein barycenters have become popular due to their ability to represent the average of probability measures in a geometrically meaningful way. In this paper, we present an algorithm to approximate the Wasserstein-2 barycenters of continuous measures via a generative model. Previous approaches rely on regularization (entropic/quadratic) which introduces bias or on input convex neural networks… 
More
Submitted 28 January, 2022; originally announced January 2022.
Cited by 3 Related articles All 3 versions 
ARTICLE

Wasserstein Iterative Networks for Barycenter Estimation

Korotin, Alexander; Egiazarian, Vage; Li, Lingxiao; Burnaev, Evgeny. arXiv.org, 2022

Cited by 11 Related articles All 4 versions 

arXiv:2201.12225  [pdf, ps, other]  math.ST  math.PR
Wasserstein posterior contraction rates in non-dominated Bayesian nonparametric models
Authors: Federico Camerlenghi, Emanuele Dolera, Stefano Favaro, Edoardo Mainini
Abstract: Posterior contractions rates (PCRs) strengthen the notion of Bayesian consistency, quantifying the speed at which the posterior distribution concentrates on arbitrarily small neighborhoods of the true model, with probability tending to 1 or almost surely, as the sample size goes to infinity. Under the Bayesian nonparametric framework, a common assumption in the study of PCRs is that the model is d…  More
Submitted 28 January, 2022; originally announced January 2022.
Comments: arXiv admin note: text overlap with arXiv:2011.14425
Cited by 2
 Related articles All 2 versions 


arXiv:2201.12087  [pdf, ps, other]  math.PR
Bounding Kolmogorov distances through Wasserstein and related integral probability metrics
Authors: Robert E. Gaunt, Siqi Li
Abstract: We establish general upper bounds on the Kolmogorov distance between two probability distributions in terms of the distance between these distributions as measured with respect to the Wasserstein or smooth Wasserstein metrics. These bounds provide a significant generalisation of existing results from the literature. To illustrate the broad applicability of our general bounds, we apply them to extr…  More
Submitted 28 January, 2022; originally announced January 2022.
Comments: 26 pages
MSC Class: Primary 60E15; 60F05; Secondary 41A10

Cited by 1 Related articles All 3 versions 
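
A classical bound of this type, recalled here from the Stein's method literature as context (the paper's own bounds are more general), states that if Y has a Lebesgue density bounded by a constant C, then

\[
d_K(X,Y) \;\le\; \sqrt{2\, C\, d_{W_1}(X,Y)},
\]

so a Wasserstein rate of convergence transfers to a Kolmogorov rate with a square-root loss.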

<——2022————2022———110—  



Dynamic Facial Expression Generation on Hilbert Hypersphere With Conditional Wasserstein...
by Otberdout, Naima; Daoudi, Mohamed; Kacem, Anis ; More...
IEEE transactions on pattern analysis and machine intelligence, 02/2022, Volume 44, Issue 2
In this work, we propose a novel approach for generating videos of the six basic facial expressions given a neutral face image. We propose to exploit the face...

Journal Article  Full Text Online  Open Access

 
Remaining useful life estimation of bearings under different working conditions via Wasserstein distance-based weighted domain adaptation

T Hu, Y Guo, L Gu, Y Zhou, Z Zhang, Z Zhou - Reliability Engineering & …, 2022 - Elsevier

… This article proposes a Wasserstein distance-based weighted domain adversarial neural …

adaptation loss based on KL and JS divergence should be substituted. Wasserstein distance (…
Related articles
 All 2 versions

Journal Article  Full Text Online

Cited by 6 Related articles All 4 versions


A Wasserstein GAN for Joint Learning of Inpainting and its Spatial Optimisation
by Peter, Pascal
02/2022
Classic image inpainting is a restoration method that reconstructs missing image parts. However, a carefully selected mask of known pixels that yield a high...
Journal Article  Full Text Online

All 3 versions 


2022
Wasserstein and multivariate linear affine based distributionally robust optimization for CCHP-P2G...
by Wang, Yuwei; Yang, Yuanjuan; Fei, Haoran ; More...
Applied energy, 01/2022, Volume 306
[Display omitted] •Power-to-gas facility is integrated into the trigeneration energy system.•Optimal operation of this integrated system is studied under...
Journal Article  Full Text Online

Cited by 3 Related articles All 5 versions

2022

The Quadratic Wasserstein Metric With Squaring Scaling For ...

https://arxiv.org › math

Submitted 1/27/2022  — In this work, we will present a more in-depth study on the combination of squaring scaling technique and the qu

The Quadratic Wasserstein Metric With Squaring Scaling For Seismic Velocity...

by Li, Zhengyang; Tang, Yijia; Chen, Jing ; More...

01/2022

The quadratic Wasserstein metric has shown its power in measuring the difference between probability densities, which benefits optimization objective function...

Journal Article  Full Text Online
Cited by 3
Related articles All 2 versions

2022

[PDF] mdpi.com

Fault Feature Extraction Method of a Permanent Magnet Synchronous Motor Based on VAE-WGAN

L Zhan, X Xu, X Qiao, F Qian, Q Luo - Processes, 2022 - mdpi.com

This paper focuses on the difficulties that appear when the number of fault samples collected

by a permanent magnet synchronous motor is too low and seriously unbalanced compared

with the normal data. In order to effectively extract the fault characteristics of the motor and …

Related articles All 4 versions

[PDF] iop.org

A Strabismus Surgery Parameter Design Model with WGAN-GP Data Enhancement Method

R Tang, W Wang, Q Meng, S Liang… - Journal of Physics …, 2022 - iopscience.iop.org

… This paper enhanced the data set through a WGAN-GP model to improve the performance of

the LightGBM algorithm. The … them, WGAN-GP adds the Wasserstein distance and also adds

gradient constraints to meet Lipschitz continuity. Therefore, this paper chose the WGAN-GP …

All 3 versions
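
As background for the WGAN-GP variants appearing in the records above, the standard gradient-penalty critic objective from the WGAN-GP literature (recalled here as general context, not as the exact loss used in these papers) is

\[
L_D = \mathbb{E}_{\tilde x \sim \mathbb{P}_g}[D(\tilde x)] - \mathbb{E}_{x \sim \mathbb{P}_r}[D(x)]
+ \lambda\, \mathbb{E}_{\hat x}\big[ ( \|\nabla_{\hat x} D(\hat x)\|_2 - 1 )^2 \big],
\]

where \hat x is sampled on line segments between real and generated samples; the penalty term enforces the (approximate) 1-Lipschitz constraint required by the Wasserstein-1 dual.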

[PDF] iop.org

Improving Word Alignment by Adding Gromov-Wasserstein into Attention Neural Network

Y Huang, T Zhang, H Zhu - Journal of Physics: Conference …, 2022 - iopscience.iop.org

Statistical machine translation systems usually break the translation task into two or more

subtasks and an important one is finding word alignments over a parallel sentence bilingual

corpus. We address the problem of introducing word alignment for language pairs by …

 Related articles All 3 versions

2022  see 2021  [PDF] unipd.it

[PDF] Optimal Transport and Wasserstein Gradient Flows

F Santambrogio - Doctoral Program in Mathematical Sciences Catalogue … - math.unipd.it

… Wasserstein distances and their properties. Curves in the Wasserstein spaces and relation

with the continuity equation. Characterization of AC curves in the Wasserstein spaces …

Related articles All 2 versions 


2022 patent

WGAN based clothing attribute editing method, involves collecting different styles of garment construction data set in training stage, and inputting training data with attribute label into the generative adversarial network to train

CN113793397-A

Inventor(s) WANG Z; WANG W and ZHANG J

Assignee(s) UNIV YUYAO ZHEJIANG ROBOTICS RES CENT and UNIV ZHEJIANG

Derwent Primary Accession Number 

2022-00125T

<——2022———2022———120—-


Wasserstein Distances, Geodesics and Barycenters of Merge Trees

Pont, M; Vidal, J; (...); Tierny, J

Jan 2022 | IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS 28 (1) , pp.291-301

This paper presents a unified computational framework for the estimation of distances, geodesics and barycenters of merge trees. We extend recent work on the edit distance [104] and introduce a new metric, called the Wasserstein distance between merge trees, which is purposely designed to enable efficient computations of geodesics and barycenters. Specifically, our new distance is strictly equivalent to the $L$2-Wasserstein distance between extremum persistence diagrams, but it is restricted to a smaller solution space, namely, the space of rooted partial isomorphisms between branch decomposition trees. This enables a simple extension of existing optimization frameworks [110] for geodesics and barycenters from persistence diagrams to merge trees. We introduce a task-based algorithm which can be generically applied to distance, geodesic, barycenter or cluster computation. The task-based nature of our approach enables further accelerations with shared-memory parallelism. Extensive experiments on public ensembles and SciVis contest benchmarks demonstrate the efficiency of our approach - with barycenter computations in the orders of minutes for the largest examples - as well as its qualitative ability to generate representative barycenter merge trees, visually summarizing the features of interest found in the ensemble. We show the utility of our contributions with dedicated visualization applications: feature tracking, temporal reduction and ensemble clustering. We provide a lightweight C++ implementation that can be used to reproduce our results.


Free Submitted Article From RepositoryView full text

119 References

Cited by 3 Related articles All 50 versions
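
For readers less familiar with the distance referenced in this abstract, the p-Wasserstein distance between two persistence diagrams D_1 and D_2 is commonly defined as (standard definition from topological data analysis, stated here as background)

\[
W_p(D_1,D_2) = \Big( \inf_{\gamma\,:\,D_1 \to D_2} \sum_{x \in D_1} \| x - \gamma(x) \|_\infty^p \Big)^{1/p},
\]

where \gamma ranges over bijections that may also match points to the diagonal; the merge-tree distance above is reported to be equivalent to the L^2 case restricted to a smaller matching space.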

2022

Dynamic Facial Expression Generation on Hilbert Hypersphere With Conditional Wasserstein Generative Adversarial Nets

Otberdout, N; Daoudi, M; (...); Berretti, S

Feb 1 2022 | IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE 44 (2) , pp.848-863

In this work, we propose a novel approach for generating videos of the six basic facial expressions given a neutral face image. We propose to exploit the face geometry by modeling the facial landmarks motion as curves encoded as points on a hypersphere. By proposing a conditional version of manifold-valued Wasserstein generative adversarial network (GAN) for motion generation on the hypersphere, we learn the distribution of facial expression dynamics of different classes, from which we synthesize new facial expression motions. The resulting motions can be transformed to sequences of landmarks and then to images sequences by editing the texture information using another conditional Generative Adversarial Network. To the best of our knowledge, this is the first work that explores manifold-valued representations with GAN to address the problem of dynamic facial expression generation. We evaluate our proposed approach both quantitatively and qualitatively on two public datasets; Oulu-CASIA and MUG Facial Expression. Our experimental results demonstrate the effectiveness of our approach in generating realistic videos with continuous motion, realistic appearance and identity preservation. We also show the efficiency of our framework for dynamic facial expressions generation, dynamic facial expression transfer and data augmentation for training improved emotion recognition models.

Free Submitted Article From RepositoryView full text

Citations 62

Related records 


2022

Unsupervised domain adaptation of bearing fault diagnosis based on Join Sliced Wasserstein Distance.

Chen, Pengfei; Zhao, Rongzhen; (...); Yang, Qidong

2022-Jan-05 | ISA transactions

Deep neural networks have been successfully utilized in the mechanical fault diagnosis, however, a large number of them have been based on the same assumption that training and test datasets followed the same distributions. Unfortunately, the mechanical systems are easily affected by environment noise interference, speed or load change. Consequently, the trained networks have poor generalization under various working conditions. Recently, unsupervised domain adaptation has been concentrated on more and more attention since it can handle different but related data. Sliced Wasserstein Distance has been successfully utilized in unsupervised domain adaptation and obtained excellent performances. However, most of the approaches have ignored the class conditional distribution. In this paper, a novel approach named Join Sliced Wasserstein Distance (JSWD) has been proposed to address the above issue. Four bearing datasets have been selected to validate the practicability and effectiveness of the JSWD framework. The experimental results have demonstrated that about 5% accuracy is improved by JSWD with consideration of the conditional probability than no the conditional probability, in addition, the other experimental results have indicated that JSWD could effectively capture the distinguishable and domain-invariant representations and have a has superior data distribution matching than the previous methods under various application scenarios.


Unsupervised domain adaptation of bearing fault diagnosis based on Join Sliced Wasserstein Distance

P Chen, R Zhao, T He, K Wei, Y Qidong - ISA Transactions, 2022 - Elsevier

… In recently, the Sliced Wasserstein distance (SWD) as a promising measurement distance

… with the SWD and proposed a model named Sliced Wasserstein Auto-encoders (SWAE). …

[26], we attempt to investigate and propose a joint Slice Wasserstein Distance (JSWD) method …

Cited by 4 Related articles All 3 versions
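
The sliced Wasserstein distance used in the two records above is typically estimated by Monte-Carlo averaging of one-dimensional Wasserstein distances over random projection directions; in 1D the optimal coupling is just the sorted matching. The sketch below is a generic NumPy illustration of that idea (equal sample sizes assumed), not the JSWD variant proposed in the paper.

import numpy as np

def sliced_wasserstein(x, y, n_proj=128, p=2, seed=0):
    """Monte-Carlo sliced Wasserstein distance between equal-size samples x, y."""
    rng = np.random.default_rng(seed)
    theta = rng.normal(size=(n_proj, x.shape[1]))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)   # unit projection directions
    total = 0.0
    for t in theta:
        px, py = np.sort(x @ t), np.sort(y @ t)              # 1D projections
        total += np.mean(np.abs(px - py) ** p)               # sorted matching = 1D OT
    return (total / n_proj) ** (1.0 / p)

# toy usage on two shifted Gaussian samples
x = np.random.default_rng(1).normal(size=(256, 8))
y = np.random.default_rng(2).normal(loc=0.5, size=(256, 8))
print(sliced_wasserstein(x, y))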

2022 see 2021 2020
The Quantum Wasserstein Distance of Order 1

De Palma, Giacomo; Marvian, Milad; Trevisan, Dario; Lloyd, Seth.arXiv.org; Ithaca, Jan 13, 2022.

2022  [PDF] ieee.org

Wasserstein Generative Adversarial Network for Depth Completion with Anisotropic Diffusion Depth Enhancement

TM Nguyen, M Yoo - IEEE Access, 2022 - ieeexplore.ieee.org

… We used an adapted Wasserstein Generative Adversarial Network architecture instead of 

applying the traditional autoencoder approach and post-processing process to preserve valid 

depth measurements received from the input and further enhance the depth value precision …

Cited by 1 Related articles All 2 versions
Cover Image

Wasserstein Generative Adversarial Network for Depth Completion with...

by Nguyen, Tri Minh; Yoo, Myungsik

IEEE access, 01/2022

The objective of depth completion is to generate a dense depth map by upsampling a sparse one. However, irregular sparse patterns or the lack of groundtruth...

ArticleView Article PDF

Journal Article Full Text Online

View Complete Issue Browse Now

Cited by 4 Related articles All 2 versions

2022


Working Paper  Full Text

Hierarchical Gaussian Processes with Wasserstein-2 Kernels
Popescu, Sebastian; Sharp, David; Cole, James; Glocker, Ben.arXiv.org; Ithaca, Feb 1, 2022.



CONFERENCE PROCEEDING

Towards Instant Calibration in Myoelectric Prosthetic Hands: A Highly Data-Efficient Controller Based on the Wasserstein Distance

Yang, Zeyu; Son, Honn Wee; Bello, Fernando; Kormushev, Petar; Rojas, Nicolas. IEEE International Conference on Rehabilitation Robotics, 2022, Vol.2022, p.1-6

Towards Instant Calibration in Myoelectric Prosthetic Hands: A Highly Data-Efficient Controller Based on the Wasserstein Distance

Available Online 

 Towards Instant Calibration in Myoelectric Prosthetic Hands: A Highly Data-Efficient Controller Based on the Wasserstein Distance

Related articles All 5 versions

2022 see 2021   Working Paper  Full Text

Sliced-Wasserstein Gradient Flows
Bonet, Clément; Courty, Nicolas; Septier, François; Lucas Drumetz.arXiv.org; Ithaca, Jan 28, 2022.


Cited by 5 Related articles All 12 versions 


Working Paper  Full Text

Quantitative control of Wasserstein distance between Brownian motion and the Goldstein--Kac telegraph process
Barrera, Gerardo; Lukkarinen, Jani.arXiv.org; Ithaca, Jan 21, 2022.


Related articles All 4 versions 


 A new data generation approach with modified Wasserstein auto-encoder for rotating machinery fault diagnosis with limited fault data

K Zhao, H Jiang, C Liu, Y Wang, K Zhu - Knowledge-Based Systems, 2022 - Elsevier

… Wasserstein auto-encoder (MWAE) to generate data that are highly similar to the known

data. The sliced Wasserstein … The sliced Wasserstein distance with a gradient penalty is …

Cited by 5 Related articles All 3 version

<——2022———2022——130—


2022 see 2021  Working Paper  Full Text

Exact Statistical Inference for the Wasserstein Distance by Selective Inference
Vo Nguyen Le Duy; Takeuchi, Ichiro.arXiv.org; Ithaca, Jan 20, 2022.



ARTICLE

Exact weights, path metrics, and algebraic Wasserstein distances

Peter Bubenik; Jonathan Scott; Donald Stanley. arXiv.org, 2022

OPEN ACCESS

Exact weights, path metrics, and algebraic Wasserstein distances

Available Online 
Working Paper  Full Text

Exact weights, path metrics, and algebraic Wasserstein distances
Bubenik, Peter; Scott, Jonathan; Stanley, Donald.arXiv.org; Ithaca, Jan 19, 2022.


Working Paper  Full Text

Dynamic Topological Data Analysis for Brain Networks via Wasserstein Graph Clustering
Chung, Moo K; Shih-Gu, Huang; Carroll, Ian C; Calhoun, Vince D; H Hill Goldsmith.arXiv.org; Ithaca, Jan 11, 2022.


2022 see 2021  ARTICLE

FlowPool: Pooling Graph Representations with Wasserstein Gradient Flows

Simou, Effrosyni. arXiv.org, 2022

OPEN ACCESS

FlowPool: Pooling Graph Representations with Wasserstein Gradient Flows

Available Online 

ARTICLE

Semi-relaxed Gromov-Wasserstein divergence with applications on graphs

Cédric Vincent-Cuaz; Rémi Flamary; Marco Corneli; Titouan Vayer; Nicolas Courty. arXiv.org, 2022

OPEN ACCESS

Semi-relaxed Gromov-Wasserstein divergence with applications on graphs

Available Online 
Working Paper  Full Text

Semi-relaxed Gromov-Wasserstein divergence with applications on graphs
Vincent-Cuaz, Cédric; Flamary, Rémi; Corneli, Marco; Vayer, Titouan; Courty, Nicolas.arXiv.org; Ithaca, Jan 4, 2022.



2022


[PDF] github.io

[PDF] Learning Robust Neural Networks using Wasserstein Adversarial GAN

S Goel, P Shah - goel-shashank.github.io

… between corresponding pixels of the two images. However, it has … Notwithstanding, it has 

been shown that images can be … perceptual distance D between two images. One can trivially …

Related articles


2022

Estimating empirically the Wasserstein distance - Cross ...

https://stats.stackexchange.com › questions › estimating...


Jan 14, 2022 — I have a dataset of the form {x_i, y_i, y'_i}, where y_i ~ p(·|x_i) and y'_i ~ q(·|x_i), while x_i itself has a distribution d(·).


arXiv:2202.01275  [pdf, other]  cs.LG
Topological Classification in a Wasserstein Distance Based Vector Space
Authors: Tananun Songdechakraiwut, Bryan M. Krause, Matthew I. Banks, Kirill V. Nourski, Barry D. Van Veen
Abstract: Classification of large and dense networks based on topology is very difficult due to the computational challenges of extracting meaningful topological features from real-world networks. In this paper we present a computationally tractable approach to topological classification of networks by using principled theory from persistent homology and optimal transport to define a novel vector representa…  More
Submitted 2 February, 2022; originally announced February 2022.

Cited by 1 Related articles All 2 versions 

Proximal Optimal Transport Modeling of Population Dynamics

C Bunne, L Papaxanthos, A Krause… - … Conference on …, 2022 - proceedings.mlr.press

… Given T discrete measures µ0,...,µT describing the time … a neural network with parameters 

ξ, and fit ξ so that the JKO flow … a proximal gradient descent step in the Wasserstein space, …

Save Cite Cited by 9 Related articles A

2022

Optimizing decisions for a dual-channel retailer with service

level requirements and demand uncertainties: A Wasserstein metric-based distribuionaly

https://dl.acm.org › doi › abs › j.cor.2021.105589


by Y Sun · 2022 ·  — Highlights • Study prices, order quantities and delivery times of a dual-channel retailing problem. • Construct Wasserstein uncertainty set ...

<——2022———2022———140—


 2022

Wasserstein soft label propagation on hypergraphs - ACM ...

https://dl.acm.org › doi


Wasserstein soft label propagation on hypergraphs: algorithm and generalization error bounds. Share on ... Online:05 January 2022Publication History.

  

Fault data expansion method of permanent magnet synchronous motor based on Wasserstein-generative adversarial network

Zhan, L; Xu, XW; (...); Luo, Q

May 2022 (Early Access) | PROCEEDINGS OF THE INSTITUTION OF MECHANICAL ENGINEERS PART C-JOURNAL OF MECHANICAL ENGINEERING SCIENCE

Enriched Cited References

Aiming at the characteristics of non-smooth, non-linear, multi-source heterogeneity, low density of value and unevenness of fault data collected by the online monitoring equipment of permanent magnet synchronous motor (PMSM), and the difficulty of fault mechanism analysis, this paper proposes a method of PMSM data expansion based on the improved generative adversarial network. First, use the re


Full Text at Publisher 23 References Related records

 Related articles

2022

Wasserstein distributionally robust control and optimization for ...

https://mae.ucsd.edu › seminar › 2022 › wasserstein-dis...


Seminar Series: Dynamic Systems & Controls. Seminar date/time: February 11, 2022, 3:00-4:00 pm.


On a prior based on the Wasserstein information matrix

by Li, W; Rubio, FJ
11/2022
We introduce a prior for the parameters of univariate continuous distributions, based on the Wasserstein information matrix, which is invariant under...
Journal ArticleCitation Online

arXiv:2202.03217  [pdf, other]  math.ST  stat.ME
On a prior based on the Wasserstein information matrix
Authors: W. Li, F. J. Rubio
Abstract: We introduce a prior for the parameters of univariate continuous distributions, based on the Wasserstein information matrix, which is invariant under reparameterisations. We briefly discuss the links between the proposed prior with information geometry. We present several examples where we can either obtain this prior in closed-form, or propose a numerically tractable approximation for cases where…  More
Submitted 7 February, 2022; originally announced February 2022.
Related articles
 All 5 versions 

MR4467246 

arXiv:2202.02495  [pdf, other]  cs.LG  math.MG
Weisfeiler-Lehman meets Gromov-Wasserstein
Authors: Samantha Chen, Sunhyuk Lim, Facundo Mémoli, Zhengchao Wan, Yusu Wang
Abstract: The Weisfeiler-Lehman (WL) test is a classical procedure for graph isomorphism testing. The WL test has also been widely used both for designing graph kernels and for analyzing graph neural networks. In this paper, we propose the Weisfeiler-Lehman (WL) distance, a notion of distance between labeled measure Markov chains (LMMCs), of which labeled graphs are special cases. The WL distance is polynom…  More
Submitted 5 February, 2022; originally announced February 2022.

2022

[PS] uc.pt

[PS] … ), 301-316.[24] Weed, J., & Bach, F.(2019). Sharp asymptotic and finite-sample rates of convergence of empirical measures in Wasserstein distance. Bernoulli 25 …

PE OLIVEIRA, N PICADO - surfaces - mat.uc.pt

Let M be a compact manifold of Rd. The goal of this paper is to decide, based on a sample

of points, whether the interior of M is empty or not. We divide this work in two main parts.

Firstly, under a dependent sample which may or may not contain some noise within, we …

Related articles 
 

2022 Research articleOpen access
A brief survey on Computational Gromov-Wasserstein distance
Procedia Computer Science, 3 February 2022...

Lei Zheng, Yang Xiao, Lingfeng Niu

Download PDF



2022 Research article
Fault diagnosis of rotating machinery based on combination of Wasserstein generative adversarial networks and long short term memory fully convolutional network

Measurement, 3 February 2022...

Yibing Li, Weiteng Zou, Li Jiang



2022  Review article
A novel multi-speakers Urdu singing voices synthesizer using Wasserstein Generative Adversarial Network
Speech Communication, 15 January 2022...

Ayesha Saeed, Muhammad Faisal Hayat, Muhammad Ali Qureshi

Related articles All 2 versions

arXiv:2202.03928  [pdf, ps, other]  math.PR
Stein's method for steady-state diffusion approximation in Wasserstein distance
Authors: Thomas Bonis
Abstract: We provide a general steady-state diffusion approximation result which bounds the Wasserstein distance between the reversible measure $μ$ of a diffusion process and the measure $ν$ of an approximating Markov chain. Our result is obtained thanks to a generalization of a new approach to Stein's method which may be of independent interest. As an application, we study the invariant measure of a random…  More
Submitted 8 February, 2022; originally announced February 2022.

All 3 versions 
<——2022———2022———150—  


arXiv:2202.03926  [pdf, other]  stat.ML  cs.LG
Distribution Regression with Sliced Wasserstein Kernels
Authors: Dimitri Meunier, Massimiliano Pontil, Carlo Ciliberto
Abstract: The problem of learning functions over spaces of probabilities - or distribution regression - is gaining significant interest in the machine learning community. A key challenge behind this problem is to identify a suitable representation capturing all relevant properties of the underlying functional mapping. A principled approach to distribution regression is provided by kernel mean embeddings, wh… 
More
Submitted 8 February, 2022; originally announced February 2022.
Cited by 4
 Related articles All 4 versions 
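
A natural way to turn the sliced Wasserstein distance into a kernel for distribution regression, mentioned here only as an illustrative construction and not necessarily the one adopted in the paper, is distance substitution in a Gaussian form:

\[
K(P, Q) = \exp\!\Big( -\tfrac{1}{2\sigma^2}\, SW_2^2(P, Q) \Big),
\]

where SW_2 denotes the sliced 2-Wasserstein distance and \sigma > 0 is a bandwidth parameter.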

ARTICLE

Learning to Predict Graphs with Fused Gromov-Wasserstein Barycenters

Luc Brogat-Motte; Rémi Flamary; Céline Brouard; Juho Rousu; Florence d'Alché-Buc. arXiv.org, 2022

OPEN ACCESS

Learning to Predict Graphs with Fused Gromov-Wasserstein Barycenters

Available Online 

arXiv:2202.03813  [pdf, other]  stat.ML  cs.LG
Learning to Predict Graphs with Fused Gromov-Wasserstein Barycenters
Authors: Luc Brogat-Motte, Rémi Flamary, Céline Brouard, Juho Rousu, Florence d'Alché-Buc
Abstract: This paper introduces a novel and generic framework to solve the flagship task of supervised labeled graph prediction by leveraging Optimal Transport tools. We formulate the problem as regression with the Fused Gromov-Wasserstein (FGW) loss and propose a predictive model relying on a FGW barycenter whose weights depend on inputs. First we introduce a non-parametric estimator based on kernel ridge… 
More
Submitted 8 February, 2022; originally announced February 2022.


[PDF] lpsm.paris

[PDF] Optimal 1-Wasserstein distance for WGANs, by Arthur Stéphanovitch, Ugo Tanielian, Benoît Cadre, Nicolas Klutchnikoff …

A STÉPHANOVITCH - lpsm.paris

… of the generated distributions, we provide a thorough analysis of Wasserstein GANs (WGANs)

in both the finite sample and asymptotic regimes. … We also highlight the fact that WGANs are

able to approach (for the 1-Wasserstein distance) the target distribution as the sample size …

 Cited by 1 Related articles All 2 versions 
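
The 1-Wasserstein distance that WGANs target is usually handled through its Kantorovich-Rubinstein dual, recalled here as standard background rather than as a claim about this particular preprint:

\[
W_1(\mu,\nu) = \sup_{\mathrm{Lip}(f) \le 1}\ \mathbb{E}_{x\sim\mu}[f(x)] - \mathbb{E}_{y\sim\nu}[f(y)],
\]

which is what motivates training a 1-Lipschitz critic in the WGAN objective.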

Anastasiou, Andreas; Gaunt, Robert E.

Wasserstein distance error bounds for the multivariate normal approximation of the maximum likelihood estimator. (English) Zbl 07471517

Electron. J. Stat. 15, No. 2, 5758-5810 (2021).

MSC:  60F05 62E17 62F10 62F12

PDF BibTeX XML


 HackGAN: Harmonious Cross-Network Mapping Using CycleGAN With Wasserstein-Procrustes Learning for Unsupervised Network Alignment

By: Yang, Linyao; Wang, Xiao; Zhang, Jun; et al.

IEEE TRANSACTIONS ON COMPUTATIONAL SOCIAL SYSTEMS    

Early Access: FEB 2022

Cited by 3 Related articles

2022


[PDF] arxiv.org

Time discretizations of Wasserstein–Hamiltonian flows

J Cui, L Dieci, H Zhou - Mathematics of Computation, 2022 - ams.org

… $L^2$-Wasserstein metric, also known as Wasserstein manifold, and several authors have 

… field in phase space is a Hamiltonian flow on the Wasserstein manifold. To be more precise, …

Cited by 8 Related articles All 8 versions

2022 see 2021  Working Paper  Full Text

Master Bellman equation in the Wasserstein space: Uniqueness of viscosity solutions

Cosso, Andrea; Gozzi, Fausto; Kharroubi, Idris; Pham, Huyên; Rosestolato, Mauro.arXiv.org; Ithaca, Feb 9, 2022.


2022  see 2021  Working Paper  Full Text

SLOSH: Set LOcality Sensitive Hashing via Sliced-Wasserstein Embeddings

Lu, Yuzhe; Liu, Xinran; Soltoggio, Andrea; Kolouri, Soheil.arXiv.org; Ithaca, Feb 8, 2022.



2022  see 2021  Working Paper  Full Text

On the Computational Complexity of Finding a Sparse Wasserstein Barycenter

Borgwardt, Steffen; Patterson, Stephan.arXiv.org; Ithaca, Feb 8, 2022.



Working Paper  Full Text

Indeterminacy estimates, eigenfunctions and lower bounds on Wasserstein distances

De Ponti, Nicolò; Farinelli, Sara.arXiv.org; Ithaca, Feb 7, 2022.

All 2 versions 

Journal Article  Full Text Online

<——2022———2022———160—  


[PDF] arxiv.org

On Wasserstein-1 distance in the central limit theorem for elephant random walk

X Ma, M El Machkouri, X Fan - Journal of Mathematical Physics, 2022 - aip.scitation.org

… For 3/4 < p ≤ 1, he also showed that the elephant random walk … From (1), we get ΔM_k = a_k ε_k = a_k (S_k − γ_{k−1} S_{k−1}). … For a random variable X and a constant a > 0, it holds that …

Cited by 1 Related articles All 3 versions

MR4377993 Prelim Chaudru de Raynal, Paul-Eric; Frikha, Noufel; Well-posedness for some non-linear SDEs and related PDE on the Wasserstein space. J. Math. Pures Appl. (9) 159 (2022), 1–167. 60H10 (35K40 60H30 93E03)

Review PDF Clipboard Journal Article

MR4375715 Prelim Molnár, Lajos; Maps on positive definite cones of C*-algebras preserving the Wasserstein mean. Proc. Amer. Math. Soc. 150 (2022), no. 3, 1209–1221. 47A64 (46L40 47B49)

Review PDF Clipboard Journal Article

Cited by 12 Related articles All 8 versions

Zbl 07469054

arXiv:2202.06782  [pdf, other]  quant-ph q-fin.CP q-fin.PM
Wasserstein Solution Quality and the Quantum Approximate Optimization Algorithm: A Portfolio Optimization Case Study
Authors: Jack S. Baker, Santosh Kumar Radha
Abstract: Optimizing of a portfolio of financial assets is a critical industrial problem which can be approximately solved using algorithms suitable for quantum processing units (QPUs). We benchmark the success of this approach using the Quantum Approximate Optimization Algorithm (QAOA); an algorithm targeting gate-model QPUs. Our focus is on the quality of solutions achieved as determined by the Normalized…
More
Submitted 14 February, 2022; originally announced February 2022.
Comments: 21 pages and 11 Figures in main article, 8 pages, 5 Figures and 3 tables in Supplemental Material 

arXiv:2202.06380  [pdf, other]  math.PR math.ST
Central Limit Theorems for Semidiscrete Wasserstein Distances
Authors: Eustasio del Barrio, Alberto González-Sanz, Jean-Michel Loubes
Abstract: We prove a Central Limit Theorem for the empirical optimal transport cost, $\sqrt{\frac{nm}{n+m}}\{\mathcal{T}_c(P_n,Q_m)-\mathcal{T}_c(P,Q)\}$, in the semi…
Cited by 1 All 5 versions

arXiv:2207.04216  [pdf, other]  cs.LG  cs.AI
Wasserstein Graph Distance based on L1-Approximated Tree Edit Distance between Weisfeiler-Lehman Subtrees
Authors: Zhongxi Fang, Jianming Huang, Xun Su, Hiroyuki Kasai
Abstract: The Weisfeiler-Lehman (WL) test has been widely applied to graph kernels, metrics, and neural networks. However, it considers only the graph consistency, resulting in the weak descriptive power of structural information. Thus, it limits the performance improvement of applied methods. In addition, the similarity and distance between graphs defined by the WL test are in coarse measurements. To the b…  More
Submitted 9 July, 2022; originally announced July 2022.


2022


arXiv:2202.05642  [pdf, ps, other]  math.PR
Wasserstein-type distances of two-type continuous-state branching processes in Lévy random environments
Authors: Shukai Chen, Rongjuan Fang, Xiangqi Zheng
Abstract: Under natural conditions, we proved the exponential ergodicity in Wasserstein distance of two-type continuous-state branching processes in Lévy random environments with immigration. Furthermore, we expressed accurately the parameters of the exponent. The coupling method and the conditioned branching property play an important role in the approach. Using the tool of superprocesses, the ergodicity i…
More
Submitted 11 February, 2022; originally announced February 2022.
MSC Class: 60J25; 60J68; 60J80; 60J76 

[PDF] arxiv.org

Wasserstein-type distances of two-type continuous-state branching processes in Lévy random environments

S Chen, R Fang, X Zheng - arXiv preprint arXiv:2202.05642, 2022 - arxiv.org

… And we write Qξ t (x, ·) for the special case of r = 0. Furthermore, for … Then by the relation

(22), we can verify that Qt(x, y, dσ1, dσ2) is the coupling measure of Qt(x,dσ1) and Qt(y,dσ2). For …

 All 2 versions 

arXiv:2202.05623  [pdf, other]  eess.IV  cs.CV  cs.LG
A Wasserstein GAN for Joint Learning of Inpainting and its Spatial Optimisation
Authors: Pascal Peter
Abstract: Classic image inpainting is a restoration method that reconstructs missing image parts. However, a carefully selected mask of known pixels that yield a high quality inpainting can also act as a sparse image representation. This challenging spatial optimisation problem is essential for practical applications such as compression. So far, it has been almost exclusively addressed by model-based approa…
More
Submitted 11 February, 2022; originally announced February 2022. 

All 3 versions

arXiv:2202.05495  [pdf, other]  stat.ME
Inference for Projection-Based Wasserstein Distances on Finite Spaces
Authors: Ryo Okano, Masaaki Imaizumi
Abstract: The Wasserstein distance is a distance between two probability distributions and has recently gained increasing popularity in statistics and machine learning, owing to its attractive properties. One important approach to extending this distance is using low-dimensional projections of distributions to avoid a high computational cost and the curse of dimensionality in empirical estimation, such as t…
More
Submitted 11 February, 2022; originally announced February 2022. 

 Cited by 2 Related articles All 2 versions



WAD-CMSN: Wasserstein Distance based Cross ... - arXiv
https://arxiv.org › pdf [PDF]
by G Xu · 2022 — arXiv:2202.05465v1 [cs.CV] 11 Feb 2022. WAD-CMSN: Wasserstein Distance based Cross-Modal Semantic Network for Zero-Shot Sketch-Based Image ...


(PDF) WAD-CMSN: Wasserstein Distance based Cross-Modal ...

https://www.researchgate.net › publication › 358579684_...



arXiv:2202.05465v1 [cs.CV] 11 Feb 2022. WAD-CMSN: Wasserstein Distance based Cross-Modal Semantic Network for Zero-Shot Sketch-Based Image Retrieval.


 
 


arXiv:2202.03928  [pdf, ps, other]  math.PR
Stein's method for steady-state diffusion approximation in Wasserstein distance
Authors: Thomas Bonis
Abstract: We provide a general steady-state diffusion approximation result which bounds the Wasserstein distance between the reversible measure μ of a diffusion process and the measure ν of an approximating Markov chain. Our result is obtained thanks to a generalization of a new approach to Stein's method which may be of independent interest. As an application, we study the invariant measure of a random…
More
Submitted 8 February, 2022; originally announced February 2022. 

Related articles All 3 versions

<——2022———2022———170—  


arXiv:2202.03926  [pdf, other]  stat.ML cs.LG
Distribution Regression with Sliced Wasserstein Kernels
Authors: Dimitri Meunier, Massimiliano Pontil, Carlo Ciliberto
Abstract: The problem of learning functions over spaces of probabilities - or distribution regression - is gaining significant interest in the machine learning community. A key challenge behind this problem is to identify a suitable representation capturing all relevant properties of the underlying functional mapping. A principled approach to distribution regression is provided by kernel mean embeddings, wh…
More
Submitted 8 February, 2022; originally announced February 2022. 

All 2 versions

arXiv:2202.03813  [pdf, other]  stat.ML cs.LG
Learning to Predict Graphs with Fused Gromov-Wasserstein Barycenters
Authors: Luc Brogat-Motte, Rémi Flamary, Céline Brouard, Juho Rousu, Florence d'Alché-Buc
Abstract: This paper introduces a novel and generic framework to solve the flagship task of supervised labeled graph prediction by leveraging Optimal Transport tools. We formulate the problem as regression with the Fused Gromov-Wasserstein (FGW) loss and propose a predictive model relying on a FGW barycenter whose weights depend on inputs. First we introduce a non-parametric estimator based on kernel ridge…
More
Submitted 8 February, 2022; originally announced February 2022. 

Cited by 3 Related articles All 5 versions


arXiv:2202.03217  [pdf, other]  math.ST stat.ME
On a prior based on the Wasserstein information matrix
Authors: W. Li, F. J. Rubio
Abstract: We introduce a prior for the parameters of univariate continuous distributions, based on the Wasserstein information matrix, which is invariant under reparameterisations. We briefly discuss the links between the proposed prior with information geometry. We present several examples where we can either obtain this prior in closed-form, or propose a numerically tractable approximation for cases where…
More
Submitted 7 February, 2022; originally announced February 2022.
All 5 versions


arXiv:2202.02495  [pdf, other]  cs.LG math.MG

Weisfeiler-Lehman meets Gromov-Wasserstein
Authors: Samantha Chen, Sunhyuk Lim, Facundo Mémoli, Zhengchao Wan, Yusu Wang
Abstract: The Weisfeiler-Lehman (WL) test is a classical procedure for graph isomorphism testing. The WL test has also been widely used both for designing graph kernels and for analyzing graph neural networks. In this paper, we propose the Weisfeiler-Lehman (WL) distance, a notion of distance between labeled measure Markov chains (LMMCs), of which labeled graphs are special cases. The WL distance is polynom…

Submitted 5 February, 2022; originally announced February 2022. 

Related articles All 2 versions 

arXiv:2202.01275  [pdf, other]  cs.LG
Topological Classification in a Wasserstein Distance Based Vector Space
Authors: Tananun Songdechakraiwut, Bryan M. Krause, Matthew I. Banks, Kirill V. Nourski, Barry D. Van Veen
Abstract: Classification of large and dense networks based on topology is very difficult due to the computational challenges of extracting meaningful topological features from real-world networks. In this paper we present a computationally tractable approach to topological classification of networks by using principled theory from persistent homology and optimal transport to define a novel vector representa…
More
Submitted 2 February, 2022; originally announced February 2022. 

Cited by 1 Related articles All 2 versions

2022

arXiv:2202.00808  [pdf, other]  cs.LG cs.CR
Gromov-Wasserstein Discrepancy with Local Differential Privacy for Distributed Structural Graphs
Authors: Hongwei Jin, Xun Chen
Abstract: Learning the similarity between structured data, especially the graphs, is one of the essential problems. Besides the approach like graph kernels, Gromov-Wasserstein (GW) distance recently draws big attention due to its flexibility to capture both topological and feature characteristics, as well as handling the permutation invariance. However, structured data are widely distributed for different d…
More
Submitted 1 February, 2022; originally announced February 2022. 

 Cited by 2 Related articles All 2 versions

arXiv:2201.13386  [pdf, other]  math.NA  math.CA  math.PR
On a linearization of quadratic Wasserstein distance
Authors: Philip Greengard, Jeremy G. Hoskins, Nicholas F. Marshall, Amit Singer
Abstract: This paper studies the problem of computing a linear approximation of quadratic Wasserstein distance W2. In particular, we compute an approximation of the negative homogeneous weighted Sobolev norm whose connection to Wasserstein distance follows from a classic linearization of a general Monge-Ampére equation. Our contribution is threefold. First, we provide expository material on this classic…
More
Submitted 31 January, 2022; originally announced January 2022.
Comments: 24 pages, 6 figures
All 5 versions
 

arXiv:2201.12797  [pdf, ps, other]  math.PR
Wasserstein Convergence Rates for Empirical Measures of Subordinated Processes on Noncompact Manifolds
Authors: Huaiqian Li, Bingyao Wu
Abstract: The asymptotic behaviour of empirical measures has been studied extensively. In this paper, we consider empirical measures of given subordinated processes on complete (not necessarily compact) and connected Riemannian manifolds with possibly nonempty boundary. We obtain rates of convergence for empirical measures to the invariant measure of the subordinated process under the Wasserstein distance.…
More
Submitted 30 January, 2022; originally announced January 2022.
Comments: Comments welcome!
All 2 versions

arXiv:2201.12324  [pdf, other]  cs.LG stat.ML
Optimal Transport Tools (OTT): A JAX Toolbox for all things Wasserstein
Authors: Marco Cuturi, Laetitia Meng-Papaxanthos, Yingtao Tian, Charlotte Bunne, Geoff Davis, Olivier Teboul
Abstract: Optimal transport tools (OTT-JAX) is a Python toolbox that can solve optimal transport problems between point clouds and histograms. The toolbox builds on various JAX features, such as automatic and custom reverse mode differentiation, vectorization, just-in-time compilation and accelerators support. The toolbox covers elementary computations, such as the resolution of the regularized OT problem,…
More
Submitted 28 January, 2022; originally announced January 2022.
Comments: 4 pages 

Cited by 13 Related articles All 2 versions

arXiv:2201.12245  [pdf, other]  cs.LG stat.ML
Wasserstein Iterative Networks for Barycenter Estimation
Authors:
Alexander Korotin, Vage Egiazarian, Lingxiao Li, Evgeny Burnaev
Abstract: Wasserstein barycenters have become popular due to their ability to represent the average of probability measures in a geometrically meaningful way. In this paper, we present an algorithm to approximate the Wasserstein-2 barycenters of continuous measures via a generative model. Previous approaches rely on regularization (entropic/quadratic) which introduces bias or on input convex neural networks…
More
Submitted 28 January, 2022; originally announced January 2022. 

Cited by 3 Related articles All 3 versions

Wasserstein Iterative Networks for Barycenter Estimation

https://nips.cc › 2022 › ScheduleMultitrack

Cited by 5 Related articles All 4 versions

<——2022———2022———180—


[PDF] arxiv.org

Structure preservation via the Wasserstein distance

D Bartl, S Mendelson - arXiv preprint arXiv:2209.07058, 2022 - arxiv.org

… The proof follows from the optimal estimate on the worst Wasserstein distance between a … 

We begin with the useful characterization of the Wasserstein distance between measures …

Cited by 1 All 2 versions

arXiv:2201.12087  [pdf, ps, other]  math.PR
Bounding Kolmogorov distances through Wasserstein and related integral probability metrics
Authors: Robert E. Gaunt, Siqi Li
Abstract: We establish general upper bounds on the Kolmogorov distance between two probability distributions in terms of the distance between these distributions as measured with respect to the Wasserstein or smooth Wasserstein metrics. These bounds provide a significant generalisation of existing results from the literature. To illustrate the broad applicability of our general bounds, we apply them to extr…
More
Submitted 28 January, 2022; originally announced January 2022.
Comments: 26 pages
MSC Class: Primary 60E15; 60F05; Secondary 41A10

Cited by 2 Related articles All 3 versions

2022  [HTML] mdpi.com

Simulating Multi-Asset Classes Prices Using Wasserstein Generative Adversarial Network: A Study of Stocks, Futures and Cryptocurrency

F Han, X Ma, J Zhang - Journal of Risk and Financial Management, 2022 - mdpi.com

… Inspired by the generator and discriminator ideas, we implement a Wasserstein GAN with 

Gradient Penalty (WGAN-GP) framework to learn … We explore a universal WGAN-GP model 

Related articles All 6 versions 

Stein's method for steady-state diffusion approximation in Wasserstein...
by Bonis, Thomas

02/2022
We provide a general steady-state diffusion approximation result which bounds the Wasserstein distance between the reversible measure $\mu$ of a diffusion...
Journal Article Full Text Online

All 3 versions 


 Central Limit Theorems for Semidiscrete Wasserstein Distances

by del Barrio, Eustasio; González-Sanz, Alberto; Loubes, Jean-Michel
02/2022
We prove a Central Limit Theorem for the empirical optimal transport cost, $\sqrt{\frac{nm}{n+m}}\{\mathcal{T}_c(P_n,Q_m)-\mathcal{T}_c(P,Q)\}$, in the semi...
Cited by 11
Related articles All 7 versions


2022

Inference for Projection-Based Wasserstein Distances on Finite Spaces
by Okano, Ryo; Imaizumi, Masaaki
02/2022
The Wasserstein distance is a distance between two probability distributions and has recently gained increasing popularity in statistics and machine learning,...
Journal Article  Full Text Online

 All 2 versions 
 
Wasserstein Solution Quality and the Quantum Approximate Optimization...

by Baker, Jack S; Radha, Santosh Kumar
02/2022
Optimizing of a portfolio of financial assets is a critical industrial problem which can be approximately solved using algorithms suitable for quantum...
Journal Article  Full Text Online


 
WAD-CMSN: Wasserstein Distance based Cross-Modal Semantic Network for...
by Xu, Guanglong; Hu, Zhensheng; Cai, Jia
02/2022
Zero-shot sketch-based image retrieval (ZSSBIR), as a popular studied branch of computer vision, attracts wide attention recently. Unlike sketch-based image...
Journal Article  Full Text Online


2022 see 2021    ARTICLE

Unpaired Single-Image Depth Synthesis with cycle-consistent Wasserstein GANs

Christoph Angermann; Adéla Moravová; Markus Haltmeier; Steinbjörn Jónsson; Christian Laubichler. arXiv.org, 2022

OPEN ACCESS

Unpaired Single-Image Depth Synthesis with cycle-consistent Wasserstein GANs

Available Online 

Working Paper  Full Text

Unpaired Single-Image Depth Synthesis with cycle-consistent Wasserstein GANs

Angermann, Christoph; Moravová, Adéla; Haltmeier, Markus; Jónsson, Steinbjörn; Laubichler, Christian.arXiv.org; Ithaca, Feb 16, 2022.



ARTICLE

Unsupervised Ground Metric Learning using Wasserstein Singular Vectors

Huizing, Geert-Jan; Cantini, Laura; Peyré, Gabriel. arXiv.org, 2022

OPEN ACCESS

Unsupervised Ground Metric Learning using Wasserstein Singular Vectors

Available Online 

Working Paper  Full Text

Unsupervised Ground Metric Learning using Wasserstein Singular Vectors

Huizing, Geert-Jan; Cantini, Laura; Peyré, Gabriel.arXiv.org; Ithaca, Feb 16, 2022.


Cited by 2 Related articles All 5 versions 

<——2022———2022———190— 


Working Paper  Full Text

Wasserstein Graph Neural Networks for Graphs with Missing Attributes

Chen, Zhixian; Ma, Tengfei; Song, Yangqiu; Wang, Yang.arXiv.org; Ithaca, Feb 16, 2022.



Working Paper  Full Text

Bayesian Learning with Wasserstein Barycenters

Backhoff-Veraguas, Julio; Fontbona, Joaquin; Rios, Gonzalo; Tobar, Felipe.arXiv.org; Ithaca, Feb 16, 2022.


 MR4519227

Bayesian learning with Wasserstein barycenters*

Backhoff-Veraguas, J; Fontbona, J; (...); Tobar, F

Dec 8 2022 | ESAIM-PROBABILITY AND STATISTICS 26, pp.436-472

We introduce and study a novel model-selection strategy for Bayesian learning, based on optimal transport, along with its associated predictive posterior law: the Wasserstein population barycenter of the posterior law over models. We first show how this estimator, termed Bayesian Wasserstein barycenter (BWB), arises naturally in a general, parameter-free Bayesian model-selection framework, when

Free Full Text From Publisher

References

Related records

 

Working Paper  Full Text

Hypothesis Test and Confidence Analysis with Wasserstein Distance on General Dimension

Imaizumi, Masaaki; Ota, Hirofumi; Hamaguchi, Takuo.arXiv.org; Ithaca, Feb 15, 2022.


Cover Image

Hypothesis Test and Confidence Analysis With Wasserstein Distance on General Dimension

by Imaizumi, Masaaki; Ota, Hirofumi; Hamaguchi, Takuo

Neural computation, 05/2022, Volume 34, Issue 6

We develop a general framework for statistical inference with the 1-Wasserstein distance. Recently, the Wasserstein distance has attracted considerable...

Journal Article  Full Text Online

View in Context Browse Journal

 Preview

Working Paper  Full Text

Wasserstein Solution Quality and the Quantum Approximate Optimization Algorithm: A Portfolio Optimization Case Study

Baker, Jack S; Radha, Santosh Kumar.arXiv.org; Ithaca, Feb 14, 2022.


Cited by 1 All 5 versions 

Working Paper  Full Text

Central Limit Theorems for Semidiscrete Wasserstein Distances

Eustasio del Barrio; González-Sanz, Alberto; Jean-Michel Loubes.arXiv.org; Ithaca, Feb 13, 2022.


Cited by 9 Related articles All 11 versions 
ARTICLE

Central Limit Theorems for Semidiscrete Wasserstein Distances

Eustasio del Barrio; González-Sanz, Alberto; Jean-Michel Loubes. arXiv.org, 2022

Cited by 12 Related articles All 7 versions 

2022

 

Working Paper  Full Text

WAD-CMSN: Wasserstein Distance based Cross-Modal Semantic Network for Zero-Shot Sketch-Based Image Retrieval

Xu, Guanglong; Hu, Zhensheng; Cai, Jia.arXiv.org; Ithaca, Feb 11, 2022.


 Related articles All 2 versions 

arXiv:2207.04913  [pdf, other]  cs.LG  cs.CV
Generalizing to Unseen Domains with Wasserstein Distributional Robustness under Limited Source Knowledge
Authors: Jingge Wang, Liyan Xie, Yao Xie, Shao-Lun Huang, Yang Li
Abstract: Domain generalization aims at learning a universal model that performs well on unseen target domains, incorporating knowledge from multiple source domains. In this research, we consider the scenario where different domain shifts occur among conditional distributions of different classes across domains. When labeled samples in the source domains are limited, existing approaches are not sufficiently… 
More
Submitted 11 July, 2022; originally announced July 2022.
All 2 versions 


2022  see 2021Working Paper  Full Text

Fixed Support Tree-Sliced Wasserstein Barycenter

Takezawa, Yuki; Sato, Ryoma; Kozareva, Zornitsa; Sujith Ravi; Yamada, Makoto.arXiv.org; Ithaca, Feb 11, 2022.


ARTICLE

Fixed Support Tree-Sliced Wasserstein Barycenter

Takezawa, Yuki; Sato, Ryoma; Kozareva, Zornitsa; Sujith Ravi; Yamada, Makoto. arXiv.org, 2022


2022. patent

WGAN-GP and YOLO based quick hull attachment identification method, involves performing density evaluation to hull attachment in identification process, and calculating attachment area ratio

CN113792785-A

Inventor(s) CHU Z; REN C; (...); CHEN Q

Assignee(s) UNIV SHANGHAI SCI & TECHNOLOGY

Derwent Primary Accession Number 

2022-00216M

 

2022 see 2021

A New Perspective on Wasserstein Distances for Kinetic Problems

Iacobelli, M

Feb 2022 (Early Access) | ARCHIVE FOR RATIONAL MECHANICS AND ANALYSIS

We introduce a new class of Wasserstein-type distances specifically designed to tackle questions concerning stability and convergence to equilibria for kinetic equations. Thanks to these new distances, we improve some classical estimates by Loeper (J Math Pures Appl (9) 86(1):68-79, 2006) and Dobrushin (Funktsional Anal i Pril…

Free Full Text From Publisher

62  References

Related records

<——2022———2022———200—

 

2022  see 2021

Distributionally Robust Inverse Covariance Estimation: The Wasserstein Shrinkage Estimator

Nguyen, VAKuhn, D and Esfahani, PM

Jan-feb 2022 | Jul 2021 (Early Access) | OPERATIONS RESEARCH 70 (1) , pp.490-515

We introduce a distributionally robust maximum likelihood estimation model with a Wasserstein ambiguity set to infer the inverse covariance matrix of a p-dimensional Gaussian random vector from n independent samples. The proposed model minimizes the worst case (maximum) of Stein's loss across all normal referen…

Free Submitted Article From RepositoryFull Text at Publisher

Citation 62       14 References

Related records  


arXiv:2202.10042  [pdf, other]  math.OC
Fast Sinkhorn I: An O(N) algorithm for the Wasserstein-1 metric
Authors: Qichen Liao, Jing Chen, Zihao Wang, Bo Bai, Shi Jin, Hao Wu
Abstract: The Wasserstein metric is broadly used in optimal transport for comparing two probabilistic distributions, with successful applications in various fields such as machine learning, signal processing, seismic inversion, etc. Nevertheless, the high computational complexity is an obstacle for its practical applications. The Sinkhorn algorithm, one of the main methods in computing the Wasserstein metri…  More
Submitted 21 February, 2022; originally announced February 2022.
Comments: 15 pages, 4 figures
MSC Class: 49M25; 49M30; 65K10

arXiv:2202.09025  [pdf, other]  cs.LG  cs.SI
Graph Auto-Encoder Via Neighborhood Wasserstein Reconstruction
Authors: Mingyue Tang, Carl Yang, Pan Li
Abstract: Graph neural networks (GNNs) have drawn significant research attention recently, mostly under the setting of semi-supervised learning. When task-agnostic representations are preferred or supervision is simply unavailable, the auto-encoder framework comes in handy with a natural graph reconstruction objective for unsupervised GNN training. However, existing graph auto-encoders are designed to recon…  More
Submitted 18 February, 2022; originally announced February 2022.
Comments: ICLR 2022; Code available at https://github.com/mtang724/NWR-GAE

Cited by 4 Related articles All 5 versions 

2022 see 2021

Nguyen, Viet Anh; Kuhn, Daniel; Esfahani, Peyman Mohajerin

Distributionally robust inverse covariance estimation: the Wasserstein shrinkage estimator. (English) Zbl 07476289

Oper. Res. 70, No. 1, 490-515 (2022).

MSC:  90Cxx

PDF BibTeX XML Cite

Full Text: DOI

Zbl 07476289 

Cited by 37 Related articles All 12 versions


2022 see 2020

Amari, Shun-ichi; Matsuda, Takeru

Wasserstein statistics in one-dimensional location scale models. (English) Zbl 07473253

Ann. Inst. Stat. Math. 74, No. 1, 33-47 (2022).

Cited by 2 Related articles All 4 versions

Zbl 07473253

2022

  Unsupervised domain adaptation of bearing fault diagnosis based on Join Sliced Wasserstein Distance

P Chen, R Zhao, T He, K Wei, Y Qidong - ISA transactions, 2022 - Elsevier

… has been successfully utilized in unsupervised domain adaptation and obtained … Join Sliced

Wasserstein Distance (JSWD) has been proposed to address the above issue. Four bearing …

 Related articles All 4 versions

 

[PDF] arxiv.org

Invariance encoding in sliced-Wasserstein space for image classification with limited training data

Y Zhuang, S Li, AHM Rubaiyat, X Yin… - arXiv preprint arXiv …, 2022 - arxiv.org

… from R to R and from R2 to R2 are denoted as T and TI, … connects the R-CDT and sliced-Wasserstein

distances for … by the same magnitudes in both x and y directions of the image). We …

 Related articles All 2 versions 

 

Stochastic saddle-point optimization for the Wasserstein barycenter problem

D Tiapkin, A Gasnikov, P Dvurechensky - Optimization Letters, 2022 - Springer

… ,c)\) between two probability measures r, c, which in this paper we … As a prox-structure on

\(\mathcal {Y}\) we choose the … We fix \((\mathcal {H},\mathcal {K})\) is a RKHS of functions …

 

[PDF] arxiv.org

Distribution Regression with Sliced Wasserstein Kernels

D MeunierM PontilC Ciliberto - arXiv preprint arXiv:2202.03926, 2022 - arxiv.org

… f : P_n → y for a new patient identified by P. Other examples of … regression for any distance

substitution kernel K(d) on X with … for different configurations of T,n,C and r. A validation set of …

 All 2 versions 


[PDF] arxiv.org

Topological Classification in a Wasserstein Distance Based Vector Space

Songdechakraiwut, BM Krause, MI Banks… - arXiv preprint arXiv …, 2022 - arxiv.org

… of the Wasserstein distance between barcode descriptors of networks. Let X and Y be 2…

The Prop method has the closest accuracy to TopVS when r = 0.55, but has a higher p-value…

 All 2 versions 

<——2022———2022———210—

[PDF] arxiv.org

WPPNets: Unsupervised CNN Training with Wasserstein Patch Priors for Image Superresolution

F Altekrüger, J Hertrich - arXiv preprint arXiv:2201.08157, 2022 - arxiv.org

… x ∈ R^{mx×my} and y ∈ R^{nx×ny} related by y ≈ S(k ∗ x + b), where the blur kernel k ∈ R^{15×15}

and the bias b ∈ R … In the following, we aim to reconstruct k and b from x and y. Here, we use …

 Related articles All 2 versions 

[PDF] sciencedirect.com

A brief survey on Computational Gromov-Wasserstein distance

L Zheng, Xiao, L Niu - Procedia Computer Science, 2022 - Elsevier

… Let (X, dX,µX) and (Y, dY,µY) be two metric measure spaces, where (X, dX) is a compact …

This minimization problem can be treated as OT where cost matrix is L(C, C)T(k), so T is …

Related articles
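As background for the survey above, the discrete Gromov-Wasserstein objective it reviews is usually written as

    GW(C, C', \mu, \nu) = \min_{T \in \Pi(\mu,\nu)} \sum_{i,j,k,l} L\bigl(C_{ik}, C'_{jl}\bigr)\, T_{ij}\, T_{kl},

where C and C' are the intra-space (dis)similarity matrices of (X, d_X, \mu_X) and (Y, d_Y, \mu_Y), \Pi(\mu,\nu) is the set of couplings with marginals \mu and \nu, and L is a pointwise loss such as the squared difference; this is the standard formulation, quoted here only for orientation.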

A Continuation Multiple Shooting Method for Wasserstein ...https://epubs.siam.org › doi › abs
by J Cui · 2022 · Cited by 4 — In this paper, we propose a numerical method to solve the classic $L^2$-optimal transport problem. Our algorithm is based on the use of multiple shooting, ...

A new data generation approach with modified Wasserstein auto-encoder for rotating machinery fault diagnosis with limited fault data

K Zhao, H Jiang, C Liu, Y Wang, K Zhu - Knowledge-Based Systems, 2022 - Elsevier

… Y are random variables that obey the distributions P_X and P_… proposed method is expressed

as (15) L_total = L_w + λL_d = … j = 1, 2, …, n, j ≠ k, where r_j and r_k are the average of the …

 Related articles All 2 versions

A new data generation approach with modified Wasserstein auto-encoder for rotating machinery...
by Zhao, Ke; Jiang, Hongkai; Liu, Chaoqiang ; More...
Knowledge-based systems, 02/2022, Volume 238
Limited fault data restrict deep learning methods in solving fault diagnosis problems in rotating machinery. Using limited fault data to generate massive data...
Journal Article  Full Text Online

Cited by 17 Related articles All 2 versions

Distributed Kalman Filter With Faulty/Reliable Sensors Based on Wasserstein Average Consensus

DJ Xin, LF Shi, X Yu - … Transactions on Circuits and Systems II …, 2022 - ieeexplore.ieee.org

… t|t and Pk t|t). Different from the traditional consensus based fusion strategies, this brief employs

Wasserstein … /clustering methods are K-means and learning based methods. However, …

 Related articles


2022

[PDF] arxiv.org

Quantitative control of Wasserstein distance between Brownian motion and the Goldstein-Kac telegraph process

G Barrera, J Lukkarinen - arXiv preprint arXiv:2201.00422, 2022 - arxiv.org

… In the rest of the manuscript we always assume that v0 ∈ R \ {0}… variable K with Poisson

distribution with parameter T = λT … the jumps for the process (Y (t; n, s):0 ≤ t ≤ T) so that a good …

Related articles All 3 versions


[PDF] arxiv.org

WAD-CMSN: Wasserstein Distance based Cross-Modal Semantic Network for Zero-Shot Sketch-Based Image Retrieval

G Xu, Z Hu, J Cai - arXiv preprint arXiv:2202.05465, 2022 - arxiv.org

… this drawback, we propose a Wasserstein distance based cross-modal semantic network (…

joint distributions γ(x, y). In [4], it has been proved that Wasserstein distance is more suitable to …

All 2 versions


Convergence diagnostics for Monte Carlo fission source distributions using the Wasserstein distance measure

X Guo, Z Li, S Huang, K Wang - Nuclear Engineering and Design, 2022 - Elsevier

… However, it was shown that k_eff converges faster than … Let γ(x, y) be a joint distribution

with P_n and P_r as its … follows: (12) slowdown = (T_on − T_off) / T_off, where T_on is the time …

  All 2 versions
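A mesh-free convergence indicator of the kind described above can be prototyped directly from samples with SciPy's one-dimensional Wasserstein distance; the toy data below merely stand in for fission-source coordinates from two successive cycles.

    import numpy as np
    from scipy.stats import wasserstein_distance

    rng = np.random.default_rng(0)
    prev_cycle = rng.normal(loc=0.00, scale=1.0, size=10_000)  # stand-in source samples
    curr_cycle = rng.normal(loc=0.05, scale=1.0, size=10_000)

    # Wasserstein-1 distance between the two empirical distributions;
    # unlike an entropy-based indicator, no spatial meshing scheme is needed.
    w1 = wasserstein_distance(prev_cycle, curr_cycle)
    print(f"W1 between successive source distributions: {w1:.4f}")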


[PDF] arxiv.org

On a prior based on the Wasserstein information matrix

W Li, FJ Rubio - arXiv preprint arXiv:2202.03217, 2022 - arxiv.org

… Wasserstein prior t−approximation … where r(x) = φ(x) … σ2 as a Gamma distribution, and

using that X has full column rank and that y is not in the column space of X, we obtain for n>p + 1 …

All 4 versions 


 Finger vein image inpainting using neighbor binary-wasserstein generative adversarial networks (NB-WGAN)

H Jiang, L Shen, H Wang, Yao, G Zhao - Applied Intelligence, 2022 - Springer

… ], a WGAN inpainting algorithm based on attention mechanism without texture constraints

(AT-WGAN) [18], a WGAN inpainting algorithm based … , without texture constraints (PF-WGAN). …

 Related articles

<——2022———2022———220—

[PDF] mdpi.com

Fault Feature Extraction Method of a Permanent Magnet Synchronous Motor Based on VAE-WGAN

L Zhan, X Xu, X Qiao, F Qian, Q Luo - Processes, 2022 - mdpi.com

… fault mechanism analysis difficult, this paper proposed a fault feature extraction method based

on VAE-WGAN … VAE-WGAN studied in this paper has good fault feature extraction effects. …

 Related articles All 2 versions 


Fault Diagnosis of Rotating Machinery Based on Combination of Wasserstein Generative Adversarial Networks and Long Short Term Memory Fully Convolutional …

Li, W Zou, L Jiang - Measurement, 2022 - Elsevier

… signal fault diagnosis. The method first utilizes WGAN to expand the imbalanced fault samples.

… architecture in WGAN and the loss function based on Wasserstein distance ensure the …

 All 2 versions


[HTML] springer.com

[HTML] Short-term prediction of wind power based on BiLSTM–CNN–WGAN-GP

L Huang, L Li, X Wei, D Zhang - Soft Computing, 2022 - Springer

… This paper presents a hybrid prediction model based on VMD, BiLSTM, CNN and WGAN-GP

for Short-term prediction of wind power. The overall prediction framework of the proposed …

 Related articles



 

2022

 

Wasserstein contraction and Poincaré inequalities for elliptic diffusions at high...

by Monmarché, Pierre

01/2022

We consider elliptic diffusion processes on $\mathbb R^d$. Assuming that the drift contracts distances outside a compact set, we prove that, at a sufficiently...

Journal Article Full Text Online

[PDF] arxiv.org

Local Sliced-Wasserstein Feature Sets for Illumination-invariant Face Recognition

Y Zhuang, S Li, X Yin, AHM Rubaiyat… - arXiv preprint arXiv …, 2022 - arxiv.org

We present a new method for face recognition from digital images acquired under varying

illumination conditions. The method is based on mathematical modeling of local gradient …

 

 2022

 2022 see 2021

Sliced Wasserstein Distance for Neural Style Transfer

J Li, D Xu, S Yao - Computers & Graphics, 2022 - Elsevier

… In this paper, we propose a new style loss based on Sliced Wasserstein Distance (SWD),

which has a theoretical approximation guarantee. Besides, an adaptive sampling algorithm is …

 Save  Related articles All 2 versions


[PDF] arxiv.org

Finite sample approximations of exact and entropic Wasserstein distances between covariance operators and Gaussian processes

HQ Minh - SIAM/ASA Journal on Uncertainty Quantification, 2022 - SIAM

… We show that the Wasserstein distance/Sinkhorn divergence between centered Gaussian 

processes are fully represented by RKHS covariance and cross-covariance operators …

Cited by 3 Related articles All 5 versions

Zbl 07487266

Sliced Wasserstein Distance for Neural Style Transfer

J Li, D Xu, S Yao - Computers & Graphics, 2022 - Elsevier

… Conceptually, Wasserstein Distance (WD) is ideal for measuring the distance between … 

In this paper, we propose a new style loss based on Sliced Wasserstein Distance (SWD), …

Related articles All 2 versions
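A minimal NumPy sketch of the sliced Wasserstein distance underlying the style-transfer entries above: project onto random directions and match sorted 1-D projections. The uniform direction sampling here is illustrative and is not the adaptive sampling algorithm proposed in the paper.

    import numpy as np

    def sliced_wasserstein(X, Y, n_proj=128, p=2, seed=0):
        # Monte Carlo estimate of SW_p between point clouds X (n, d) and Y (m, d).
        rng = np.random.default_rng(seed)
        d = X.shape[1]
        total = 0.0
        for _ in range(n_proj):
            theta = rng.normal(size=d)
            theta /= np.linalg.norm(theta)          # random direction on the unit sphere
            xp, yp = np.sort(X @ theta), np.sort(Y @ theta)
            if len(xp) != len(yp):                  # unequal sizes: compare on a quantile grid
                q = np.linspace(0.0, 1.0, 200)
                xp, yp = np.quantile(xp, q), np.quantile(yp, q)
            total += np.mean(np.abs(xp - yp) ** p)  # 1-D W_p^p reduces to sorted matching
        return (total / n_proj) ** (1.0 / p)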


 2022 see 2021

A new data generation approach with modified Wasserstein auto-encoder for rotating machinery fault diagnosis with limited fault data

K Zhao, H Jiang, C Liu, Y Wang, K Zhu - Knowledge-Based Systems, 2022 - Elsevier

… The sliced Wasserstein distance is introduced to measure the distribution difference. A … The 

sliced Wasserstein distance with a gradient penalty is designed as the regularization term to …

Cited by 8 Related articles All 3 versions


[PDF] arxiv.org

Graph Auto-Encoder Via Neighborhood Wasserstein Reconstruction

M Tang, C Yang, P Li - arXiv preprint arXiv:2202.09025, 2022 - arxiv.org

… We adopt Wasserstein distance to characterize the … A further challenge is that the 

Wasserstein distance between … Therefore, we adopt the empirical Wasserstein distance that can …

Cited by 11 Related articles All 5 versions

<——2022———2022———230— 


On the 2‐Wasserstein distance for self‐similar measures on the unit interval

E Brawley, M Doyle… - Mathematische …, 2022 - Wiley Online Library

… that the 𝑟-Wasserstein distance between such measures can be computed using the 

cumulative distribution functions and we provide the formulation of 𝑟-Wasserstein distance and …

 All 4 versions

… decisions for a dual-channel retailer with service level requirements and demand uncertainties: A Wasserstein metric-based distributionally robust optimization …

Y Sun, R Qiu, M Sun - Computers & Operations Research, 2022 - Elsevier

Wasserstein uncertainty sets using the Wasserstein metric … is developed based on the 

data-driven Wasserstein uncertainty sets… in the Wasserstein metric for constructing the …

Cited by 6 Related articles All 2 versions

 [PDF] iop.org

Improving Word Alignment by Adding Gromov-Wasserstein into Attention Neural Network

Y Huang, T Zhang, H Zhu - Journal of Physics: Conference …, 2022 - iopscience.iop.org

… In this paper, we cast the correspondence problem directly as an optimal distance problem. 

We use the Gromov-Wasserstein distance to calculated how similarities between word pairs …

Related articles

Improving Word Alignment by Adding Gromov-Wasserstein into Attention Neural Network
Related articles
All 3 versions


2022 see 2021  [PDF] arxiv.org

Wasserstein barycenters are NP-hard to compute

JM Altschuler, E Boix-Adserà - SIAM Journal on Mathematics of Data Science, 2022 - SIAM

… This continuous setting has several additional computational challenges, such as how to 

even represent \mu i and \nu concisely and how to compute the Wasserstein distance between …

Zbl 07493842
Cited by 14 Related articles All 3 versions

[PDF] arxiv.org

Fast Sinkhorn I: An O (N) algorithm for the Wasserstein-1 metric

Q Liao, J Chen, Z Wang, B Bai, S Jin, H Wu - arXiv preprint arXiv …, 2022 - arxiv.org

Wasserstein metric, solves an entropy regularized minimizing problem, which allows arbitrary 

Cited by 3 Related articles All 3 versions

2022

 

[PDF] arxiv.org

Wasserstein contraction and Poincar\'e inequalities for elliptic diffusions at high temperature

P Monmarché - arXiv preprint arXiv:2201.07523, 2022 - arxiv.org

… This issue leads to the second main question of this work, which is to prove that Pt is a 

contraction of the W2 Wasserstein distance for t large enough. Indeed, from classical arguments (…

  Related articles All 4 versions

[PDF] researchgate.net

[PDF] INDETERMINACY ESTIMATES, EIGENFUNCTIONS AND LOWER BOUNDS ON WASSERSTEIN DISTANCES

N DE PONTI, S FARINELLI - researchgate.net

… The first one is an indeterminacy estimate involving the pWasserstein distance between the 

… The second one is a conjectured lower bound on the p-Wasserstein distance between the …

Cited by 1 Related articles All 6 versions

Zbl 07523712   MR4417396 


 [PDF] arxiv.org

Graded persistence diagrams and persistence landscapes

L Betthauser, P Bubenik, PB Edwards - Discrete & Computational …, 2022 - Springer

… 1-Wasserstein distance between kth graded persistence diagrams is bounded by twice the 

1-Wasserstein distance … We prove the following stability theorem: The 1-Wasserstein distance

Cited by 8 Related articles All 6 versions

[PDF] arxiv.org

On the capacity of deep generative networks for approximating distributions

Y Yang, Z Li, Y Wang - Neural Networks, 2022 - Elsevier

… Furthermore, we also show that the approximation orders in Wasserstein distances only 

depend on the intrinsic dimension of the target distribution, which is the first theoretical result of …

 Cited by 8 Related articles All 9 versions

Stochastic saddle-point optimization for the Wasserstein barycenter problem

D Tiapkin, A Gasnikov, P Dvurechensky - Optimization Letters, 2022 - Springer

… out to be available for the Wasserstein distance. … Wasserstein distance or its (sub)gradients? 

To propose such an online approach that does not require calculating Wasserstein distance

 Zbl 07569590

 <——2022———2022———240— 



The α-dependence of the invariant measure of stochastic real Ginzburg-Landau equation driven by α-stable Lévy processes

X Liu - Journal of Differential Equations, 2022 - Elsevier

… Brownian motions under the Wasserstein distance, as α … prove that a type of Wasserstein 

distance is contracting for the dual … forced by Brownian motions under the Wasserstein distance. …

Cited by 2 Related articles All 2 versions


[PDF] lpsm.paris

[PDF] OPTIMAL 1-WASSERSTEIN DISTANCE FOR WGANS, by Arthur Stéphanovitch, Ugo Tanielian, Benoît Cadre, Nicolas Klutchnikoff …

A STÉPHANOVITCH - lpsm.paris

… a thorough analysis of Wasserstein GANs (WGANs) in … distances between the sample 

points. We also highlight the fact that WGANs are able to approach (for the 1-Wasserstein distance…

 Related articles All 2 versions

 

T-copula and Wasserstein distance-based stochastic neighbor embedding

Y Huang, K Guo, X Yi, J Yu, Z Shen, T Li - Knowledge-Based Systems, 2022 - Elsevier

… method by integrating the Wasserstein distance and t-copula … equipped with the Wasserstein 

distance to describe the … the Wasserstein distance to measure the distance between …

All 2 versions


2022  SEE 2021

Inferential Wasserstein generative adversarial networks

Y Chen, Q Gao, X Wang - Journal of the Royal Statistical Society …, 2022 - ideas.repec.org

… We introduce a novel inferential Wasserstein GAN (iWGAN) model, which is a principled 

framework to fuse autoencoders and WGANs. The iWGAN model jointly learns an encoder …

 

[PDF] arxiv.org

Right mean for the Bures-Wasserstein quantum divergence

M Jeong, J Hwang, S Kim - arXiv preprint arXiv:2201.03732, 2022 - arxiv.org

… which coincides with the L2-Wasserstein distance of two Gaussian measures with mean 

zero and covariance matrices A and B [3]. On the other hand, it does not give us the non-…

Related articles All 2 versions
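The quantity mentioned in the snippet, the L2-Wasserstein (Bures-Wasserstein) distance between centered Gaussians with covariances A and B, has the closed form d(A,B)^2 = tr A + tr B − 2 tr((A^{1/2} B A^{1/2})^{1/2}); a small SciPy check with illustrative inputs is sketched below.

    import numpy as np
    from scipy.linalg import sqrtm

    def bures_wasserstein(A, B):
        # d(A, B)^2 = tr(A) + tr(B) - 2 tr((A^{1/2} B A^{1/2})^{1/2})
        root_A = sqrtm(A)
        cross = sqrtm(root_A @ B @ root_A)
        d2 = np.trace(A) + np.trace(B) - 2.0 * np.trace(cross)
        return np.sqrt(max(float(np.real(d2)), 0.0))   # clip tiny negative round-off

    rng = np.random.default_rng(1)
    M1, M2 = rng.normal(size=(3, 3)), rng.normal(size=(3, 3))
    A, B = M1 @ M1.T + np.eye(3), M2 @ M2.T + np.eye(3)   # two random SPD matrices
    print(bures_wasserstein(A, B))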


2022


[PDF] github.io

[PDF] Wasserstein Distributionally Robust Optimization via Wasserstein Barycenters

TTK Lau, H Liu - 2022 - timlautk.github.io

… certain Wasserstein distance from this Wasserstein barycenter. Such an ambiguity set is called 

the Wasserstein … In particular, we consider the p-Wasserstein distance in this paper. The p-…

 

 

2022 see 2021  [PDF] jmlr.org

[PDF] Projected statistical methods for distributional data on the real line with the wasserstein metric

M Pegoraro, M Beraha - Journal of Machine Learning Research, 2022 - jmlr.org

… The Wasserstein distance provides a powerful tool to compare distributions, as it requires 

very … We start by recalling the definition of the 2-Wasserstein distance between two probability …

 Cited by 5 Related articles All 12 versions



Local Sliced-Wasserstein Feature Sets for Illumination-invariant Face Recognition
Authors: Yan Zhuang, Shiying Li, Mohammad Shifat-E-Rabbi, Xuwang Yin, Abu Hasnat Mohammad Rubaiyat, Gustavo K. Rohde
Abstract: We present a new method for face recognition from digital images acquired under varying illumination conditions. The method is based on mathematical modeling of local gradient distributions using the Radon Cumulative Distribution Transform (R-CDT). We demonstrate that lighting variations cause certain types of deformations of local image gradient distributions which, when expressed in R-CDT domain…  More
Submitted 21 February, 2022; originally announced February 2022.

All 2 versions 

[PDF] arxiv.org

Local Sliced-Wasserstein Feature Sets for Illumination-invariant Face Recognition

Y Zhuang, S Li, X Yin, AHM Rubaiyat… - arXiv preprint arXiv …, 2022 - arxiv.org

… space with the Wasserstein metric to the transform space with the Euclidean distance (see eg, 

[… PZ(1) and PZ(2) in P(R), we have that the Wasserstein distance between them is given by …

 

2022 see 2021

[PDF] unipd.it Updated February 22nd, 2022

[PDF] Optimal Transport and Wasserstein Gradient Flows

F Santambrogio - Doctoral Program in Mathematical Sciences Catalogue … - math.unipd.it

… Optimal transport for the distance cost. Wasserstein distances and their properties. Curves 

in the Wasserstein spaces and relation with the continuity equation. Characterization of AC …

Related articles All 2 versions

<——2022———2022———250— 


  Distributed Kalman Filter With Faulty/Reliable Sensors Based on Wasserstein Average Consensus

DJ Xin, LF Shi, X Yu - … Transactions on Circuits and Systems II …, 2022 - ieeexplore.ieee.org

… Before proceeding on, we use W(p(xi),p(xj)) to describe the Wasserstein distance between 

two probabilities p(xi) and p(xj) of vectors xi and xj, respectively [22]. The iterative clustering …

Cited by 4 Related articles

[PS] uc.pt

[PS] … -316.[24] Weed, J., & Bach, F.(2019). Sharp asymptotic and finite-sample rates of convergence of empirical measures in Wasserstein distance. Bernoulli 25 (4A) …

PE OLIVEIRA, N PICADO - surfaces - mat.uc.pt

… In order to do that, we will need some notions of distance between measures and distance 

to a measure. For the first we will use the classical Wasserstein distance (see Villani [22], for a …

Related articles


PDF] arxiv.org

On the potential benefits of entropic regularization for smoothing Wasserstein estimators

J Bigot, P Freulon, BP Hejblum, A Leclaire - arXiv preprint arXiv …, 2022 - arxiv.org

… properties of regularized Wasserstein estimators. Our main … of un-regularized Wasserstein 

estimators in statistical learning … the convergence of regularized Wasserstein estimators. We …

All 2 versions

2022 see 2021

Wasserstein Graph Auto-Encoder

by Chu, Yan; Li, Haozhuang; Ning, Hui ; More...

Algorithms and Architectures for Parallel Processing, 02/2022

Conventional Graph Auto-Encoder-based(GAE) minimizes the variational lower bound by Kullback-Leibler divergence in link prediction task and forces a...

Book Chapter   Full Text Online

 

Local Sliced-Wasserstein Feature Sets for Illumination-invariant Face Recognition

by Zhuang, Yan; Li, Shiying; Shifat-E-Rabbi, Mohammad ; More...

02/2022

We present a new method for face recognition from digital images acquired under varying illumination conditions. The method is based on mathematical modeling...

Journal Article  Full Text Online  

 All 2 versions


2022


2022 see 2021

An Embedding Carrier-Free Steganography Method Based on Wasserstein GAN

by Yu, Xi; Cui, Jianming; Liu, Ming

Algorithms and Architectures for Parallel Processing, 02/2022

Image has been widely studied as an effective carrier of information steganography, however, low steganographic capacity is a technical problem that has not...

Book Chapter   Full Text Online

 

2022 patent news

Shenzhen Inst Adv Tech Seeks Patent for High-Energy Image Synthesis Method and Device Based on Wasserstein...

Global IP News. Optics & Imaging Patent News, Feb 22, 2022

Newspaper Article  Full Text Online  


Approximative Algorithms for Multi-Marginal Optimal Transport and Free-Support Wasserstein...

by von Lindheim, Johannes

02/2022

Computationally solving multi-marginal optimal transport (MOT) with squared Euclidean costs for $N$ discrete probability measures has recently attracted...

Journal Article  Full Text Online  

 Preview   Open Access


 [PDF] arxiv.org

Approximative Algorithms for Multi-Marginal Optimal Transport and Free-Support Wasserstein Barycenters

J von Lindheim - arXiv preprint arXiv:2202.00954, 2022 - arxiv.org

approximating MOT directly can overcome this, but a simple yet effective algorithm is still 

missing. Thus, we present two approximative … enjoy theoretical approximation guarantees, have …

Cited by 1 Related articles All 3 versions 

A Wasserstein distributionally robust chance constrained programming approach for emergency medical system planning problem

Y Yuan, Q Song, B Zhou - International Journal of Systems …, 2022 - Taylor & Francis

… We will introduce outer approximation and inner approximation based upon the exact 

reformulation in previous section. By derivation of outer approximation we aim for a lower bound of …

Related articles All 5 versions

Zbl 07579623


Representing Graphs via Gromov-Wasserstein Factorization

H Xu, J Liu, D Luo, L Carin - IEEE Transactions on Pattern …, 2022 - ieeexplore.ieee.org

… After M iterations, we derive the Mstep approximation of the optimal transport matrix. Accordingly, 

we show the scheme of the PPA-based method in Algorithm 1. Besides the PPA-based …

Cited by 5 Related articles All 6 versions

<——2022———2022———260—


[PDF] archives-ouvertes.fr

Lagrangian discretization of variationnal problems in Wasserstein spaces

C Sarrazin - 2022 - tel.archives-ouvertes.fr

… to approximate (in the sense of the Wasserstein distance) a probability density by a …

We focus on the approximation obtained by minimizing the Wasserstein distance …

  All 3 versions


[PDF] researchgate.net

[PDF] Gradient Penalty Approach for Wasserstein Generative Adversarial Networks

Y Ti - researchgate.net

… Hence, we make use of the Wasserstein distance to fix such recurring issues. The … faithfully 

approximate W-div by optimization. CramerGAN [20] argues that the Wasserstein distance …

 Related articles
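Since many entries in this block are WGAN-GP variants, a minimal PyTorch sketch of the gradient-penalty term may be useful: interpolate between real and fake batches and push the critic's gradient norm toward 1, following the standard Gulrajani et al. recipe rather than any specific paper listed here (shapes and names are illustrative).

    import torch

    def gradient_penalty(critic, real, fake, device="cpu"):
        # E[(||grad_x critic(x_hat)||_2 - 1)^2] at random interpolates x_hat.
        batch_size = real.size(0)
        eps = torch.rand([batch_size] + [1] * (real.dim() - 1), device=device)
        x_hat = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
        scores = critic(x_hat)
        grads = torch.autograd.grad(
            outputs=scores, inputs=x_hat,
            grad_outputs=torch.ones_like(scores),
            create_graph=True, retain_graph=True,
        )[0].reshape(batch_size, -1)
        return ((grads.norm(2, dim=1) - 1.0) ** 2).mean()

    # illustrative critic loss with penalty weight lambda_gp:
    # loss_D = critic(fake).mean() - critic(real).mean() + lambda_gp * gradient_penalty(critic, real, fake)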


2022 see 2021  [HTML] springer.com

[HTML] The Cutoff Phenomenon in Wasserstein Distance for Nonlinear Stable Langevin Systems with Small Lévy Noise

G Barrera, MA Högele, JC Pardo - Journal of Dynamics and Differential …, 2022 - Springer

… cutoff phenomenon in the Wasserstein distance for systems of … >3/2\) to the formulation in 

Wasserstein distance, which allows to … We define the Freidlin-Wentzell first order approximation
Cited by 3
Related articles All 10 versions


arXiv:2203.05856  [pdf, ps, other] math.PR
Exponential convergence in Wasserstein metric for distribution dependent SDEs
Authors: Shao-Qin Zhang
Abstract: The existence and uniqueness of stationary distributions and the exponential convergence in L^p-Wasserstein distance are derived for distribution dependent SDEs from associated decoupled equations. To establish the exponential convergence, we introduce a twinned Talagrand inequality of the original SDE and the associated decoupled equation, and explicit convergence rate is obtained. Our results…  More
Submitted 11 March, 2022; originally announced March 2022.
MSC Class: 60H10
All 2 versions
 


ARTICLE

Simple Approximative Algorithms for Free-Support Wasserstein Barycenters

Johannes von LindheimarXiv.org, 2022

OPEN ACCESS

Simple Approximative Algorithms for Free-Support Wasserstein Barycenters

Available Online 

arXiv:2203.05267  [pdf, other]  math.NA
Simple Approximative Algorithms for Free-Support Wasserstein Barycenters
Authors: Johannes von Lindheim
Abstract: Computing Wasserstein barycenters of discrete measures has recently attracted considerable attention due to its wide variety of applications in data science. In general, this problem is NP-hard, calling for practical approximative algorithms. In this paper, we analyze two straightforward algorithms for approximating barycenters, which produce sparse support solutions and show promising numerical r…  More
Submitted 10 March, 2022; originally announced March 2022.
All 2 versions
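As a sanity check for the barycenter entries above: in one dimension the Wasserstein-2 barycenter has a closed form obtained by averaging quantile functions, which the sketch below implements on empirical samples (this is the classical 1-D shortcut, not the approximative algorithms of the paper).

    import numpy as np

    def barycenter_1d(samples_list, weights=None, n_support=200):
        # Quantile function of the W2 barycenter = weighted average of quantile functions.
        k = len(samples_list)
        w = np.full(k, 1.0 / k) if weights is None else np.asarray(weights, dtype=float)
        q = np.linspace(0.0, 1.0, n_support)
        quantiles = np.stack([np.quantile(s, q) for s in samples_list])  # (k, n_support)
        return w @ quantiles   # support points of the (discretized) barycenter

    rng = np.random.default_rng(0)
    bary = barycenter_1d([rng.normal(-2.0, 1.0, 5000), rng.normal(3.0, 0.5, 5000)])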
 


2022


arXiv:2203.04851  [pdf, ps, other]  math.FA  math.MG  math.PR
Quasi α-Firmly Nonexpansive Mappings in Wasserstein Spaces
Authors: Arian Bërdëllima, Gabriele Steidl
Abstract: This paper aims to introduce the concept of quasi α-firmly nonexpansive mappings in Wasserstein-2 spaces over $\R^d$ and to analyze properties of these mappings. We prove that for quasi α-firmly nonexpansive mappings satisfying a certain quadratic growth condition, the fixed point iterations converge in the narrow topology. As a byproduct, we will get the known convergence of the counterpart o… 
More
More
Submitted 9 March, 2022; originally announced March 2022.
MSC Class: 46T99; 47H10; 47J26; 28A33

arXiv:2203.04711  [pdf, other]  cs.LG
On a linear fused Gromov-Wasserstein distance for graph structured data
Authors: Dai Hai Nguyen, Koji Tsuda
Abstract: We present a framework for embedding graph structured data into a vector space, taking into account node features and topology of a graph into the optimal transport (OT) problem. Then we propose a novel distance between two graphs, named linearFGW, defined as the Euclidean distance between their embeddings. The advantages of the proposed distance are twofold: 1) it can take into account node featu… 
More
Submitted 9 March, 2022; originally announced March 2022.
All 3 versions
 


arXiv:2203.04054  [pdf, ps, other]  math.MG  math-ph  math.FA  math.PR
Isometric rigidity of Wasserstein tori and spheres
Authors: György Pál Gehér, Tamás Titkos, Dániel Virosztek
Abstract: We prove isometric rigidity for p-Wasserstein spaces over finite-dimensional tori and spheres for all p. We present a unified approach to proving rigidity that relies on the robust method of recovering measures from their Wasserstein potentials.
Submitted 8 March, 2022; originally announced March 2022.
MSC Class: Primary: 54E40 Secondary: 46E27; 54E70

ARTICLE

Wasserstein Distance-based Spectral Clustering with Application to Transaction Data

Yingqiu Zhu ; Danyang Huang ; Yingying Li ; Bo Zhang, arXiv.org, 2022

OPEN ACCESS

Wasserstein Distance-based Spectral Clustering with Application to Transaction Data

Available Online 

arXiv:2203.02709  [pdf, other]  stat.ME
Wasserstein Distance-based Spectral Clustering with Application to Transaction Data
Authors: Yingqiu Zhu, Danyang Huang, Yingying Li, Bo Zhang
Abstract: With the rapid development of online payment platforms, it is now possible to record massive transaction data. The economic behaviors are embedded in the transaction data for merchants using these platforms. Therefore, clustering on transaction data significantly contributes to analyzing merchants' behavior patterns. This may help the platforms provide differentiated services or implement risk man… 
More
Submitted 5 March, 2022; originally announced March 2022.

All 2 versions 
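A rough end-to-end illustration of the idea in the entry above: compute pairwise 1-D Wasserstein distances between per-merchant transaction-amount samples, convert them to an affinity, and run spectral clustering. The toy data, bandwidth, and cluster count are arbitrary choices for illustration, not those of the paper.

    import numpy as np
    from scipy.stats import wasserstein_distance
    from sklearn.cluster import SpectralClustering

    rng = np.random.default_rng(0)
    # toy transaction-amount samples for 30 merchants drawn from two behaviour types
    merchants = [rng.lognormal(mean=m, sigma=0.4, size=300)
                 for m in [1.0] * 15 + [2.0] * 15]

    n = len(merchants)
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            D[i, j] = D[j, i] = wasserstein_distance(merchants[i], merchants[j])

    affinity = np.exp(-(D / D.mean()) ** 2)          # RBF affinity built from W1 distances
    labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                                random_state=0).fit_predict(affinity)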

ARTICLE

Partial Wasserstein Adversarial Network for Non-rigid Point Set Registration

Zi-Ming, Wang ; Xue, Nan ; Ling, Lei ; Gui-Song, Xia, arXiv.org, 2022

OPEN ACCESS

Partial Wasserstein Adversarial Network for Non-rigid Point Set Registration

Available Online 

arXiv:2203.02227  [pdf, other]  cs.CV
Partial Wasserstein Adversarial Network for Non-rigid Point Set Registration
Authors: Zi-Ming Wang, Nan Xue, Ling Lei, Gui-Song Xia
Abstract: Given two point sets, the problem of registration is to recover a transformation that matches one set to the other. This task is challenging due to the presence of the large number of outliers, the unknown non-rigid deformations and the large sizes of point sets. To obtain strong robustness against outliers, we formulate the registration problem as a partial distribution matching (PDM) problem, wh…  More
Submitted 4 March, 2022; originally announced March 2022.

Journal ref: ICLR 2022

Related articles All 3 versions 

<——2022———2022———270— 


2022  [PDF] arxiv.org

Accelerated Bregman Primal-Dual methods applied to Optimal Transport and Wasserstein Barycenter problems

A Chambolle, JP Contreras - arXiv preprint arXiv:2203.00802, 2022 - arxiv.org

This paper discusses the efficiency of Hybrid Primal-Dual (HDP) type algorithms to 

approximate solve discrete Optimal Transport (OT) and Wasserstein Barycenter (WB) problems …

 All 2 versions

ARTICLE

Accelerated Bregman Primal-Dual methods applied to Optimal Transport and Wasserstein Barycenter problems

Chambolle, Antonin ; Juan Pablo Contreras, arXiv.org, 2022

OPEN ACCESS

Accelerated Bregman Primal-Dual methods applied to Optimal Transport and Wasserstein Barycenter problems

Available Online 

arXiv:2203.00802  [pdf, other] math.OC
Accelerated Bregman Primal-Dual methods applied to Optimal Transport and Wasserstein Barycenter problems
Authors: Antonin Chambolle, Juan Pablo Contreras
Abstract: This paper discusses the efficiency of Hybrid Primal-Dual (HDP) type algorithms to approximate solve discrete Optimal Transport (OT) and Wasserstein Barycenter (WB) problems without smoothing. Our first contribution is an analysis showing that these methods yield state-of-the-art convergence rates, both theoretically and practically. Next, we extend the HPD algorithm with line-search proposed by M…  More
Submitted 1 March, 2022; originally announced March 2022.
All 2 versions
 


arXiv:2203.00159  [pdf, ps, other] math.PR  math.ST
Limit distribution theory for smooth p-Wasserstein distances
Authors: Ziv Goldfeld, Kengo Kato, Sloan Nietert, Gabriel Rioux
Abstract: The Wasserstein distance is a metric on a space of probability measures that has seen a surge of applications in statistics, machine learning, and applied mathematics. However, statistical aspects of Wasserstein distances are bottlenecked by the curse of dimensionality, whereby the number of data points needed to accurately estimate them grows exponentially with dimension. Gaussian smoothing was r…  More
Submitted 28 February, 2022; originally announced March 2022.
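The smooth distance in the entry above compares the two measures after convolving both with an isotropic Gaussian N(0, σ²I); on samples this amounts to adding Gaussian noise, as in the rough one-dimensional Monte Carlo stand-in below (an illustration only, not the paper's estimator or limit theory).

    import numpy as np
    from scipy.stats import wasserstein_distance

    def smooth_w1_1d(x, y, sigma, n_rep=20, seed=0):
        # Monte Carlo estimate of W1(P * N(0, sigma^2), Q * N(0, sigma^2)) from 1-D samples.
        rng = np.random.default_rng(seed)
        vals = [wasserstein_distance(x + sigma * rng.standard_normal(len(x)),
                                     y + sigma * rng.standard_normal(len(y)))
                for _ in range(n_rep)]
        return float(np.mean(vals))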

ARTICLE

A Unified Wasserstein Distributional Robustness Framework for Adversarial Training

Bui, Tuan Anh ; Le, Trung ; Tran, Quan ; Zhao, He ; Dinh Phung, arXiv.org, 2022

OPEN ACCESS

A Unified Wasserstein Distributional Robustness Framework for Adversarial Training

Available Online 

2022 see 2021 Working Paper  Full Text

A Unified Wasserstein Distributional Robustness Framework for Adversarial Training
Bui, Tuan Anh; Le, Trung; Tran, Quan; Zhao, He; Dinh Phung.
arXiv.org; Ithaca, Feb 27, 2022.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

arXiv:2202.13437  [pdf, other] cs.LG  cs.CV
A Unified Wasserstein Distributional Robustness Framework for Adversarial Training
Authors: Tuan Anh Bui, Trung Le, Quan Tran, He Zhao, Dinh Phung
Abstract: It is well-known that deep neural networks (DNNs) are susceptible to adversarial attacks, exposing a severe fragility of deep learning systems. As the result, adversarial training (AT) method, by incorporating adversarial examples during training, represents a natural and effective approach to strengthen the robustness of a DNN-based classifier. However, most AT-based methods, notably PGD-AT and T…  More
Submitted 27 February, 2022; originally announced February 2022.
All 3 versions
 


[PDF] cnjournals.com

[PDF] 基于 WGAN-GP 的微多普勒雷达人体动作识别

屈乐乐, 王禹桐 - 雷达科学与技术 (Radar Science and Technology), 2022 - radarst.cnjournals.com

A Wasserstein generative adversarial network with gradient penalty (WGAN-GP) is used to augment the radar data, and a deep convolutional … WGAN-GP augments the time-frequency spectrogram images; the generated images are then used to train the DCNN. Experimental results show that using …

 Related articles All 3 versions

 [Chinese: Human Action Recognition with Micro-Doppler Radar Based on WGAN-GP]

 

Cover Image

A Strabismus Surgery Parameter Design Model with WGAN-GP Data Enhancement Method

by Tang, Renhao; Wang, Wensi; Meng, Qingyu ; More...

Journal of physics. Conference series, 01/2022, Volume 2179, Issue 1

The purpose of this paper is a machine learning model that could predict the strabismus surgery parameter through the data of patients as accurately as...

Article PDFPDF  Journal Article 

 

 

2022

        

Causal Discovery on Discrete Data via Weighted Normalized Wasserstein Distance

Y Wei, X Li, L Lin, D Zhu, Q Li - IEEE transactions on neural networks … - ieeexplore.ieee.org

… Then, we propose a weighted normalized Wasserstein distance to measure the dissimilarity 

… by comparing two computed weighted normalized Wasserstein distances. An empirical …

 All 2 versions


2022 see 2021
Wasserstein Proximal Algorithms for the Schrödinger Bridge Problem: Density Control With Nonlinear Drift
by Caluya, Kenneth F; Halder, Abhishek
IEEE transactions on automatic control, 03/2022, Volume 67, Issue 3
In this article, we study the Schrödinger bridge problem (SBP) with nonlinear prior dynamics. In control-theoretic language, this is a problem of minimum...
Article PDFPDF
Journal Article  Full Text Online


Graph Wasserstein Autoencoder-Based Asymptotically Optimal Motion Planning With...
by Xia, Chongkun; Zhang, Yunzhou; Coleman, Sonya A ; More...
IEEE transactions on automation science and engineering, 02/2022
This paper presents a learning based motion planning method for robotic manipulation, aiming to solve the asymptotically-optimal motion planning problem with...
Article PDFPDF

Journal Article  Full Text Online


2022 see 2021
Inferential Wasserstein generative adversarial networks
by Chen, Yao; Gao, Qingyi; Wang, Xiao
Journal of the Royal Statistical Society. Series B, Statistical methodology, 02/2022, Volume 84, Issue 1
Generative adversarial networks (GANs) have been impactful on many problems and applications but suffer from unstable training. The Wasserstein GAN (WGAN)...
Article PDFPDF

Journal Article  Full Text Online
 MR4400391
 
On the 2‐Wasserstein distance for self‐similar measures on the unit interval
by Brawley, Easton; Doyle, Mason; Niedzialomski, Robert
Mathematische Nachrichten, 03/2022, Volume 295, Issue 3
We obtain a lower and an upper bound for the 2‐Wasserstein distance between self‐similar measures associated to two increasing non‐overlapping linear...
Article PDFPDF

Journal Article  Full Text Online

All 4 versions
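The computation "using the cumulative distribution functions" alluded to in the abstract rests on the standard one-dimensional identity

    W_p^p(\mu,\nu) = \int_0^1 \bigl| F_\mu^{-1}(t) - F_\nu^{-1}(t) \bigr|^p \, dt,
    \qquad\text{and for } p=1,\quad W_1(\mu,\nu) = \int_{\mathbb{R}} \bigl| F_\mu(x) - F_\nu(x) \bigr|\, dx,

with F_\mu, F_\nu the CDFs of \mu and \nu; it is quoted here only for orientation.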

<——2022———2022———280—



Maps on positive definite cones of C^*-algebras preserving the Wasserstein mean
by Lajos Molnár
Proceedings of the American Mathematical Society, 03/2022, Volume 150, Issue 3
The primary aim of this paper is to present the complete description of the isomorphisms between positive definite cones of C^*-algebras with respect to the...
Article PDFPDF

Journal Article  Full Text Online


Distributed Kalman Filter With Faulty/Reliable Sensors Based on Wasserstein Average Consensus
by Xin, Dong-Jin; Shi, Ling-Feng; Yu, Xingkai
IEEE transactions on circuits and systems. II, Express briefs, 01/2022
This brief considers distributed Kalman filtering problem for systems with sensor faults. A trust-based classification fusion strategy is proposed to resist...
Article PDFPDF

Journal Article  Full Text Online

Related articles


Wasserstein Graph Auto-Encoder
by Chu, Yan; Li, Haozhuang; Ning, Hui ; More...
Algorithms and Architectures for Parallel Processing, 02/2022
Conventional Graph Auto-Encoder-based(GAE) minimizes the variational lower bound by Kullback-Leibler divergence in link prediction task and forces a...
Book Chapter  Full Text Online
 
Limit distribution theory for smooth $p$-Wasserstein distances
by Goldfeld, Ziv; Kato, Kengo; Nietert, Sloan ; More...
02/2022
The Wasserstein distance is a metric on a space of probability measures that has seen a surge of applications in statistics, machine learning, and applied...
Journal Article  Full Text Online


An Embedding Carrier-Free Steganography Method Based on Wasserstein GAN
by Yu, Xi; Cui, Jianming; Liu, Ming
Algorithms and Architectures for Parallel Processing, 02/2022
Image has been widely studied as an effective carrier of information steganography, however, low steganographic capacity is a technical problem that has not...
Book Chapter  Full Text Online


2022


 Representing Graphs via Gromov-Wasserstein Factorization
by Xu, Hongteng; Liu, Jiachang; Luo, Dixin ; More...
IEEE transactions on pattern analysis and machine intelligence, 02/2022, Volume PP
We propose a new nonlinear factorization model for graphs that have topological structures, and optionally, node attributes. This model is based on a...
ArticleView Article PDF
Journal Article  Full Text Online


 
2022 see 2021
Improving Non-invasive Aspiration Detection with Auxiliary Classifier Wasserstein...
by Shu, Kechen; Mao, Shitong; Coyle, James L ; More...
IEEE journal of biomedical and health informatics, 08/2021, Volume 26, Issue 3
Aspiration is a serious complication of swallowing disorders. Adequate detection of aspiration is essential in dysphagia management and treatment....
ArticleView Article PDF

Journal Article  Full Text Online
   

[HTML] Proxying credit curves via Wasserstein distances

M Michielon, A Khedher, P Spreij - Annals of Operations Research, 2022 - Springer

… methodology for credit curves, in this article we investigate whether it is possible to construct

proxy credit curves from CDS quotes by means of (weighted) Wasserstein barycenters. We …

All 3 versions
University of Amsterdam Reports Findings in Operations Science (Proxying credit curves via Wasserstein...
Investment Weekly News, 03/2022
 Newsletter  Full Text Online


2022 patent news
Shenzhen Inst Adv Tech Seeks Patent for High-Energy Image
Synthesis Method and Device Based on Wasserstein Generative Adversarial Network Model

Global IP News. Optics & Imaging Patent News; New Delhi [New Delhi]. 22 Feb 2022. 
Shenzhen Inst Adv Tech Seeks Patent for High-Energy Image Synthesis Method and Device Based on Wasserstein...
Global IP News. Optics & Imaging Patent News, Feb 22, 2022
Newspaper Article  Full Text Online  Patent


Chaudru De Raynal, Paul-Eric; Frikha, Noufel

Well-posedness for some non-linear SDEs and related PDE on the Wasserstein space. (English) Zbl 07485589

J. Math. Pures Appl. (9) 159, 1-167 (2022).

MSC:  60H10 93E03 60H30 35K40

PDF BibTeX XML Cite  Full Text: DOI

Zbl 07485589

 Cited by 12 Related articles All 8 versions

<——2022———2022———290— 


Zhang, Chao; Kokoszka, Piotr; Petersen, Alexander

Wasserstein autoregressive models for density time series. (English) Zbl 07476226

J. Time Ser. Anal. 43, No. 1, 30-52 (2022).

MSC:  62G05 62G20 62M10

Cited by 18 Related articles All 7 versions

Zbl 07476226

MR4388921 Prelim Jourdain, Benjamin; Margheriti, William; Martingale Wasserstein inequality for probability measures in the convex order. Bernoulli 28 (2022), no. 2, 830–858. 60E15 (49Q22 60B10 60G42)

Review PDF Clipboard Journal Article

Related articles All 8 versions

MR4388654 Prelim Caluya, Kenneth F.; Halder, Abhishek; Wasserstein proximal algorithms for the Schrödinger bridge problem: density control with nonlinear drift. IEEE Trans. Automat. Control 67 (2022), no. 3, 1163–1178. 93E20 (49Q22 60)

Review PDF Clipboard Journal Article


MR4386535 Prelim Backhoff, Julio; Bartl, Daniel; Beiglböck, Mathias; Wiesel, Johannes; 

Estimating processes in adapted Wasserstein distance. 

Ann. Appl. Probab. 32 (2022), no. 1, 529–550. 60G42 (49Q22 58E30)

Review PDF Clipboard Journal Article

[PDF] arxiv.org

Estimating processes in adapted Wasserstein distance

J Backhoff, D Bartl, M Beiglböck… - The Annals of Applied …, 2022 - projecteuclid.org

… Wasserstein distance, as established in the seminal works of Pflug–Pichler. Specifically, the

adapted Wasserstein … measure with respect to adapted Wasserstein distance. Surprisingly, …

Cited by 21 Related articles All 9 versions
 

MR4386523 Prelim Qin, Qian; Hobert, James P.; Wasserstein-based methods for convergence complexity analysis of MCMC with applications. Ann. Appl. Probab. 32 (2022), no. 1, 124–166. 60J05

Review PDF Clipboard Journal Article

Zbl 07493818

[PDF] arxiv.org

Wasserstein-based methods for convergence complexity analysis of MCMC with applications

Q Qin, JP Hobert - The Annals of Applied Probability, 2022 - projecteuclid.org

… We consider a discrete-time time-homogeneous Markov chain on X with Markov

transition function (Mtf) K : X × B → [0,1]. For an integer m ≥ 0, let K^m : X × B → [0,1] be the …

 Cited by 8 Related articles All 2 versions

2022


MR4386483 Prelim Heinemann, Florian; Munk, Axel; Zemel, Yoav; Randomized Wasserstein Barycenter Computation: Resampling with Statistical Guarantees. SIAM J. Math. Data Sci. 4 (2022), no. 1, 229–259. 62G99 (49Q22 62P10 68T09 90C08)

Review PDF Clipboard Journal Article

[PDF] arxiv.org

Randomized Wasserstein Barycenter Computation: Resampling with Statistical Guarantees

F Heinemann, A Munk, Y Zemel - SIAM Journal on Mathematics of Data …, 2022 - SIAM

… We propose a hybrid resampling method to approximate finitely supported Wasserstein …

which are out of reach for state-of-the-art algorithms for computing Wasserstein barycenters. …
 Cited by 9 Related articles All 4 versions

MR4379594 Prelim Nguyen, Viet Anh; Kuhn, Daniel; Mohajerin Esfahani, Peyman; Distributionally robust inverse covariance estimation: the Wasserstein shrinkage estimator. Oper. Res. 70 (2022), no. 1, 490–515. 62J07 (90C15 90C90)

Review PDF Clipboard Journal Article

[PDF] arxiv.org

Distributionally robust inverse covariance estimation: The Wasserstein shrinkage estimator

VA Nguyen, D Kuhn… - Operations …, 2022 - pubsonline.informs.org

… in machine learning. Indeed, many classical regularization schemes of supervised learning

such as the Lasso method can be explained by a Wasserstein distributionally robust model. …

Cited by 37 Related articles All 8 versions
  

2022 see 2021

MR4378594 Prelim Altschuler, Jason M.; Boix-Adserà, Enric; Wasserstein barycenters are NP-hard to compute. SIAM J. Math. Data Sci. 4 (2022), no. 1, 179–203. 68Q17 (49Q22)

Review PDF Clipboard Journal Article

[PDF] arxiv.org

Wasserstein barycenters are NP-hard to compute

JM Altschuler, E Boix-Adserà - SIAM Journal on Mathematics of Data Science, 2022 - SIAM

… Wasserstein barycenters have received considerable … applications include improving Bayesian

learning by averaging posterior … 30], and unsupervised representation learning in natural …

Cited by 14 Related articles All 3 versions

MR4466738 Prelim De Ponti, Nicolò; Muratori, Matteo; Orrieri, Carlo; 

Wasserstein stability of porous medium-type equations on manifolds with Ricci curvature bounded below. J. Funct. Anal. 283 (2022), no. 9, Paper No. 109661.

Review PDF Clipboard Journal Article

Wasserstein stability of porous medium-type equations on manifolds with Ricci curvature bounded below
De Ponti, N; Muratori, M and Orrieri, C

Nov 1 2022 | 

JOURNAL OF FUNCTIONAL ANALYSIS

 283 (9)

Given a complete, connected Riemannian manifold M-n with Ricci curvature bounded from below, we discuss the stability of the solutions of a porous medium-type equation with respect to the 2-Wasserstein distance. We produce (sharp) stability estimates under negative curvature bounds, which to some extent generalize well-known results by Sturm [35] and Otto-Westdickenberg [32]. The strategy of th

Show more

Free Submitted Article From Repository  Full Text at Publisher

41 References  Related records


[PDF] arxiv.org

Weak topology and Opial property in Wasserstein spaces, with applications to Gradient Flows and Proximal Point Algorithms of geodesically convex functionals

E Naldi, G Savaré - Rendiconti Lincei, 2022 - ems.press

… A first approach, adopted in [1], is to work with the Wasserstein distance induced by a

weaker metric on H, which metrizes the weak topology on bounded sets. This provides a …

Cited by 1 Related articles All 4 versions

<——2022———2022———300— 


[PDF] archives-ouvertes.fr

Lagrangian discretization of variationnal problems in Wasserstein spaces

C Sarrazin - 2022 - tel.archives-ouvertes.fr

… This is only partially an impediment, as the discretization used is very robust, and even for …

as a Moreau envelope, using the 2-Wasserstein distance. These expressions introduce a non-…

All 4 versions 


[PDF] kaust.edu.sa

Digital Rock Reconstruction Using Wasserstein GANs with Gradient Penalty

Y Li, X He, W Zhu, M AlSinan, H Kwak… - International Petroleum …, 2022 - onepetro.org

… From the perspective of engineering applications, the proposed workflow is simple and robust

to generate synthetic rock images under a given rock property within any range of interest. …

Cited by 1 Related articles All 3 versions

Wassersplines for Stylized Neural Animation

P Zhang, D Smirnov, J Solomon - arXiv preprint arXiv:2201.11940, 2022 - arxiv.org

… We solve an additional Wasserstein barycenter interpolation problem to guarantee strict

adherence to keyframes. Our tool can stylize trajectories through a variety of PDE-based …

All 2 versions 


Health indicator construction method of bearings based on Wasserstein dual-domain adversarial networks under normal data only

J Li, Y Zi, Y Wang, Y Yang - IEEE Transactions on Industrial …, 2022 - ieeexplore.ieee.org

… indicator construction method (HICM) -- Wasserstein dual-domain adversarial networks (WD…

normal samples, making the constructed HI more robust and accurate. Moreover, to balance …

 1 Citation  36  References  Related records

Cited by 3 Related articles

[PDF] arxiv.org

Fast Sinkhorn I: An O (N) algorithm for the Wasserstein-1 metric

Q Liao, J Chen, Z Wang, B Bai, S Jin, H Wu - arXiv preprint arXiv …, 2022 - arxiv.org

… Wasserstein metric, solves an entropy regularized minimizing problem, which allows arbitrary

approximations to the Wasserstein … algorithm to calculate the Wasserstein-1 metric with O(N…
Cited by 3
 Related articles All 3 versions 


2022

2022  see 2021  [PDF] jmlr.org

[PDF] Projected statistical methods for distributional data on the real line with the wasserstein metric

M Pegoraro, M Beraha - Journal of Machine Learning Research, 2022 - jmlr.org

… The discussion above implies that considering the tangent space at the Wasserstein barycenter

x … Moreover, centering the analysis in the barycenter presents a drawback when studying …

 Related articles All 8 versions 


 Wasserstein distributionally robust chance constrained programming approach for emergency medical system planning problem

Y Yuan, Q Song, B Zhou - International Journal of Systems …, 2022 - Taylor & Francis

… This paper proposes a distributionally robust chance constrained … The Wasserstein-metric

is employed to construct the … to reformulate distributionally robust chance constrained … 


Convergence diagnostics for Monte Carlo fission source distributions using the Wasserstein distance measure

X Guo, Z Li, S Huang, K Wang - Nuclear Engineering and Design, 2022 - Elsevier

… A new indicator based on the Wasserstein distance (WD) measure is proposed; it is intuitive

and easy to implement in MC codes. In addition, the usage of the WD-based indicator is …

 Convergence diagnostics for Monte Carlo fission source distributions using the Wasserstein distance measure

Guo, XY; Li, ZG; (...); Wang, K

Apr 1 2022 | NUCLEAR ENGINEERING AND DESIGN 389

The power iteration technique is commonly used in Monte Carlo (MC) criticality simulations to obtain converged neutron source distributions. Entropy is a typical indicator used to examine source distribution convergence. However, spatial meshing is required to calculate entropy, and the performance of a convergence diagnostic is sensitive to the chosen meshing scheme. A new indicator based on t

Show more

Full Text at Publisher

30 References

Related records


arXiv:2203.09358  [pdf, other]  math.NA   math.PR
Low-rank Wasserstein polynomial chaos expansions in the framework of optimal transport
Authors: Robert Gruhlke, Martin Eigel
Abstract: An unsupervised learning approach for the computation of an explicit functional representation of a random vector Y is presented, which only relies on a finite set of samples with unknown distribution. Motivated by recent advances with computational optimal transport for estimating Wasserstein distances, we develop a new \textit{Wasserstein multi-element polynomial chaos expansion} (WPCE). It rel… 
More
Submitted 17 March, 2022; originally announced March 2022.
MSC Class: 15A69; 35R13; 65N12; 65N22; 65J10; 65J10
All 3 versions
 

arXiv:2203.09347  [pdf, other]  stat.ML  cs.LG  math.NA  math.ST
Dimensionality Reduction and Wasserstein Stability for Kernel Regression
Authors: Stephan Eckstein, Armin Iske, Mathias Trabs
Abstract: In a high-dimensional regression framework, we study consequences of the naive two-step procedure where first the dimension of the input variables is reduced and second, the reduced input variables are used to predict the output variable. More specifically we combine principal component analysis (PCA) with kernel regression. In order to analyze the resulting regression errors, a novel stability re… 
More
Submitted 17 March, 2022; originally announced March 2022.

Related articles All 3 versions 
<——2022———2022———310— 


2022 see 2021  arXiv:2203.06501  [pdf, other]  cs.LG  cs.AI  cs.DC  cs.PF
Wasserstein Adversarial Transformer for Cloud Workload Prediction
Authors: Shivani Arbat, Vinodh Kumaran Jayakumar, Jaewoo Lee, Wei Wang, In Kee Kim
Abstract: Predictive Virtual Machine (VM) auto-scaling is a promising technique to optimize cloud applications operating costs and performance. Understanding the job arrival rate is crucial for accurately predicting future changes in cloud workloads and proactively provisioning and de-provisioning VMs for hosting the applications. However, developing a model that accurately predicts cloud workload changes i…  More
Submitted 12 March, 2022; originally announced March 2022.
Comments: The Thirty-Fourth Annual Conference on Innovative Applications of Artificial Intelligence (IAAI-22) (presented at AAAI-2022)

Cited by 4 Related articles All 7 versions 

[PDF] arxiv.org

A Unified Wasserstein Distributional Robustness Framework for Adversarial Training

TA Bui, T Le, Q Tran, H Zhao, D Phung - arXiv preprint arXiv:2202.13437, 2022 - arxiv.org

… (AT) method, by incorporating adversarial examples during training, … Wasserstein distributional

robustness with current state-of-the-art AT methods. We introduce a new Wasserstein cost …

 All 3 versions 


[PDF] arxiv.org

Learning to Predict Graphs with Fused Gromov-Wasserstein Barycenters

L Brogat-Motte, R Flamary, C Brouard, J Rousu… - arXiv preprint arXiv …, 2022 - arxiv.org

This paper introduces a novel and generic framework to solve the flagship task of supervised

labeled graph prediction by leveraging Optimal Transport tools. We formulate the problem …

 All 4 versions 


[PDF] arxiv.org

Wasserstein GAN for Joint Learning of Inpainting and its Spatial Optimisation

P Peter - arXiv preprint arXiv:2202.05623, 2022 - arxiv.org

… After a review of Wasserstein GANs in Section 2 we introduce our deep spatial optimisation

approach in Section 3 and evaluate it in Section 4. The paper concludes with a discussion …

 All 2 versions 


[PDF] arxiv.org

Invariance encoding in sliced-Wasserstein space for image classification with limited training data

Y Zhuang, S Li, AHM Rubaiyat, X Yin… - arXiv preprint arXiv …, 2022 - arxiv.org

… learning, here we propose to mathematically augment a nearest subspace classification

model in sliced-Wasserstein … We demonstrate that for a particular type of learning problem, our …

 Related articles All 2 versions 


2022

HackGAN: Harmonious Cross-Network Mapping Using CycleGAN With Wasserstein-Procrustes Learning for Unsupervised Network Alignment

L Yang, X Wang, J Zhang, J Yang, Y Xu… - IEEE Transactions …, 2022 - ieeexplore.ieee.org

… by iteratively solving the Wasserstein problem and the … structural information by learning

linear mapping functions to … the Wasserstein and Procrustes problems to prompt the learning of …

 Cited by 3 Related articles

[PDF] arxiv.org

WPPNets: Unsupervised CNN Training with Wasserstein Patch Priors for Image Superresolution

F Altekrüger, J Hertrich - arXiv preprint arXiv:2201.08157, 2022 - arxiv.org

… Then, we propose a loss function based on the Wasserstein patch prior which measures the

Wasserstein-2 distance between the patch distributions of the predictions and the reference …

 Related articles All 2 versions 


A Novel Bearing Imbalance Fault-Diagnosis Method Based on a Wasserstein Conditional Generative Adversarial Network

Y Peng, Y Wang, Y Shao - Measurement, 2022 - Elsevier

… The overall minimization of training loss will force the … framework named the Wasserstein

conditional generative … network training, this paper introduced the Wasserstein distance …

Cited by 1 All 2 versions

[PDF] wiley.com

Semi‐supervised Surface Wave Tomography with Wasserstein Cycle‐consistent GAN: Method and Application to Southern California Plate Boundary Region

A CaiH QiuF Niu - Journal of Geophysical Research: Solid …, 2022 - Wiley Online Library

… In this study, we propose a new machine learning based inversion scheme, termed

Wasserstein cycleconsistent Generative Adversarial Networks (Wcycle-GAN), to overcome the …

Cited by 4 Related articles All 8 versions

Stochastic saddle-point optimization for the Wasserstein barycenter problem

D Tiapkin, A Gasnikov, P Dvurechensky - Optimization Letters, 2022 - Springer

… We consider the population Wasserstein barycenter problem for random probability measures

supported on a finite set of points and generated by an online stream of data. This leads to …

Cited by 1 Related articles

MR4462865 

<——2022———2022———320— 


Partitive and Hierarchical Clustering of Distributional Data using the Wasserstein Distance

R Verde, A Irpino - Analysis of Distributional Data, 2022 - taylorfrancis.com

… a dispersion measure of a dataset, using the L2 Wasserstein distance for comparing

distributions, we prove (see Appendix of this chapter) that a classical results of decomposition of the …

All 2 versions 

 

[PDF] github.io

[PDF] Wasserstein Distributionally Robust Optimization via Wasserstein Barycenters

TTK Lau, H Liu - 2022 - timlautk.github.io

… called the Wasserstein barycentric ambiguity set. We hence introduce Wasserstein Barycentric

… We consider an approximation of the Wasserstein ambiguity set by characterizing it using …


2022 see 2021  [PDF] jmlr.org

[PDF] Projected statistical methods for distributional data on the real line with the wasserstein metric

M Pegoraro, M Beraha - Journal of Machine Learning Research, 2022 - jmlr.org

… Second, by exploiting a geometric characterization of Wasserstein space closely related

Cited by 4 Related articles All 12 versions
MR4420762    |   
Zbl 07625190


[PDF] archives-ouvertes.fr

Projection de mesures de probabilité sous contraintes de quantile par distance de Wasserstein et approximation monotone polynomiale [Projection of probability measures under quantile constraints via the Wasserstein distance and monotone polynomial approximation]

MI Idrissi, N Bousquet, B Iooss, F Gamboa, JM Loubes - 2022 - hal.archives-ouvertes.fr

Motivated by applications in robustness analysis for uncertainty quantification and statistical learning, we are interested in the problem of projecting a probability measure …

 Related articles 

[PDF] archives-ouvertes.fr

[PDF] On the complexity of the data-driven Wasserstein distributionally robust binary problem

H Kim, D Watel, A Faye… - … et d'Aide à la Décision, 2022 - hal.archives-ouvertes.fr

… In this paper, we use a data-driven Wasserstein … Wasserstein metric that gives a distance

value between two distributions. This particular case of DRO is called data-driven Wasserstein …

Related articles All 7 versions 

2022


Improving SSH Detection Model using IPA Time and WGAN-GP

J Lee, H Lee - Computers & Security, 2022 - Elsevier

… , WGAN-GP generates various samples and synthesizes an enhanced training dataset. Since

the generated samples with the WGAN-… and synthetic dataset using WGAN-GP as follows. …

Cited by 1 Related articles All 3 versions

[HTML] sagepub.com

Intelligent data expansion approach of vibration gray texture images of rolling bearing based on improved WGAN-GP

H Fan, J Ma, X Zhang, C Xue… - Advances in …, 2022 - journals.sagepub.com

… on WGAN (Wasserstein … WGAN improves the loss function of GAN, it realizes the constraint

mainly by forcing weight clipping. To solve the above problems, this paper studies a WGAN-…

a hot topic in the field of bearing fault diagnosis. However, it is …

Cited by 2 Related articles
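
For context (standard background, not specific to any single entry above): the WGAN-GP critic loss referred to in these records replaces WGAN's weight clipping with a gradient penalty,
\[
L_{\mathrm{critic}} \;=\; \mathbb{E}_{\tilde{x}\sim \mathbb{P}_g}\big[D(\tilde{x})\big] \;-\; \mathbb{E}_{x\sim \mathbb{P}_r}\big[D(x)\big] \;+\; \lambda\,\mathbb{E}_{\hat{x}}\Big[\big(\lVert \nabla_{\hat{x}} D(\hat{x})\rVert_2 - 1\big)^2\Big],
\]
where \(\hat{x} = \epsilon x + (1-\epsilon)\tilde{x}\), with \(\epsilon\) uniform on \([0,1]\), interpolates between real and generated samples, and \(\lambda\) is the penalty weight (commonly set to 10).
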

Fused WGAN 이용한 2 단계 오버 샘플링 [Korean: Two-stage oversampling using Fused WGAN]

최인재 - 2022 - repository.hanyang.ac.kr

… WGAN, an imbalanced data oversampling technique using GAN based deep learning to settle

these problems. I n Fused WGAN, we … the preliminary sampling model, 1st WGAN-GP. The …

[PDF] sns.it

[PDF] Sharp Wasserstein estimates for integral sampling and Lorentz summability of transport densities

F Santambrogio - 2022 - cvgmt.sns.it

We prove some Lorentz-type estimates for the average in time of suitable geodesic

interpolations of probability measures, obtaining as a by product a new estimate for transport …

  

2022 see 2021  [PDF] openreview.net

[PDF] Combining Wasserstein GAN and Spatial Transformer Network for Medical Image Segmentation

Z Zhang, J Wang, Y Wang, S Li - 2022 - openreview.net

… We try to make the Wasserstein distance better reflect the … , we are the first to introduce STN

into WGAN-GP, because the … to the wasserstein distance. Keywords: Image segmentation, …

 Related articles


<——2022———2022———330— 


2022 patent

Inst Inf Eng, CAS Files Chinese Patent Application for Network Attack Traffic Data Enhancement Method and System

Combining Auto-Encoder and WGAN

Global IP News. Broadband and Wireless Network News; New Delhi [New Delhi]. 12 Mar 2022. 

DetailsFull text

Scholarly Journal Citation/Abstract

Speech Emotion Recognition on Small Sample Learning by Hybrid WGAN-LSTM Networks

Sun, Cunwei; Ji, Luping; Zhong, Hailing.

Journal of Circuits, Systems and Computers; Singapore Vol. 31, Iss. 4,  (Mar 15, 2022).

Abstract/Details   Show Abstract 

2022 see 2021   Working Paper Full Text

Augmented Sliced Wasserstein Distances

Chen, Xiongjie; Yang, Yongxin; Li, Yunpeng.

arXiv.org; Ithaca, Mar 17, 2022.


Abstract/DetailsGet full text
Link to external site, this link will open in a new window

Show Abstract 

2022 see 2021   Working PaperFull Text

P-WAE: Generalized Patch-Wasserstein Autoencoder for Anomaly Screening
Chen, Yurong.
arXiv.org; Ithaca, Mar 9, 2022.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

Show Abstract 

2022 


2022 see 2021  Working Paper  Full Text

Entropic Gromov-Wasserstein between Gaussian Distributions
Le, Khang; Le, Dung; Nguyen, Huy; Do, Dat; Pham, Tung; et al.
arXiv.org; Ithaca, Feb 24, 2022.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

Show Abstract 
arXiv:2108.10961
  [pdf, ps, other]  math.ST  cs.IT  stat.ML
Entropic Gromov-Wasserstein between Gaussian Distributions
Authors: Khang Le, Dung Le, Huy Nguyen, Dat Do, Tung Pham, Nhat Ho
Abstract: We study the entropic Gromov-Wasserstein and its unbalanced version between (unbalanced) Gaussian distributions with different dimensions. When the metric is the inner product, which we refer to as inner product Gromov-Wasserstein (IGW), we demonstrate that the optimal transportation plans of entropic IGW and its unbalanced variant are (unbalanced) Gaussian distributions. Via an application of von… 
More
Submitted 24 August, 2021; originally announced August 2021.
Comments: 25 pages. Khang Le, Dung Le, and Huy Nguyen contributed equally to this work
Cited by 4
 Related articles All 7 versions 
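
For context (standard definitions, not results of the paper above): for metric measure spaces \((X,d_X,\mu)\) and \((Y,d_Y,\nu)\), the quadratic Gromov-Wasserstein discrepancy is
\[
\mathrm{GW}^2(\mu,\nu) \;=\; \min_{\pi \in \Pi(\mu,\nu)} \iint \big(d_X(x,x') - d_Y(y,y')\big)^2 \, d\pi(x,y)\, d\pi(x',y'),
\]
and its entropic version adds \(\varepsilon\,\mathrm{KL}(\pi \,\|\, \mu\otimes\nu)\) to the objective. Per the abstract, the inner product Gromov-Wasserstein (IGW) variant studied in the paper replaces the distances \(d_X, d_Y\) by inner products.
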


2022 see 2021  Working Paper Full Text

Internal Wasserstein Distance for Adversarial Attack and Defense
Tan, Mingkui; Zhang, Shuhai; Cao, Jiezhang; Li, Jincheng; Xu, Yanwu.
arXiv.org; Ithaca, Feb 19, 2022.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

Show Abstract 
Cited by 1
 Related articles All 4 versions 
 
Working Paper Full Text

Wasserstein sensitivity of Risk and Uncertainty Propagation
Ernst, Oliver G; Pichler, Alois; Sprungk, Björn.
arXiv.org; Ithaca, Feb 28, 2022.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

Show Abstract

2022 see 2021  Working Paper Full Text

Distributed Wasserstein Barycenters via Displacement Interpolation
Cisneros-Velarde, Pedro; Bullo, Francesco.
arXiv.org; Ithaca, Feb 25, 2022.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

Show Abstract 

MR4490312

ARTICLE

Limit distribution theory for smooth \(p\)-Wasserstein distances

Ziv Goldfeld ; Kengo Kato ; Sloan Nietert ; Gabriel Rioux. arXiv.org, 2022

OPEN ACCESS

Limit distribution theory for smooth \(p\)-Wasserstein distances

Available Online 


Show Abstract 

Cited by 1 All 3 versions 

<——2022———2022———340— 


2022 see 2021  Working Paper  Full Text

Low Budget Active Learning via Wasserstein Distance: An Integer Programming Approach
Mahmood, Rafid; Fidler, Sanja; Law, Marc T.
arXiv.org; Ithaca, Mar 5, 2022.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

Show Abstract 


Working Paper Full Text
Subexponential upper and lower bounds in Wasserstein distance for Markov processes
Arapostathis, Ari; Pang, Guodong; Sandrić, Nikola.
arXiv.org; Ithaca, Feb 25, 2022.

MR4429313

2022 see 2021  Working Paper Full Text

Fast Topological Clustering with Wasserstein Distance

Songdechakraiwut, Tananun; Krause, Bryan M; Banks, Matthew I; Nourski, Kirill V; Van Veen, Barry D.

arXiv.org; Ithaca, Mar 14, 2022.

Abstract/DetailsGet full textLink to external site, this link will open in a new window

Show Abstract 


2022 see 2021 Working Paper Full Text

Two-sample Test with Kernel Projected Wasserstein Distance

Wang, Jie; Gao, Rui; Xie, Yao.

arXiv.org; Ithaca, Feb 23, 2022.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window
Show Abstract 


2022


2022 see 2021 Working Paper Full Text

Wasserstein-based fairness interpretability framework for machine learning models

Miroshnikov, Alexey; Kotsiopoulos, Konstandinos; Franks, Ryan; Kannan, Arjun Ravi.

arXiv.org; Ithaca, Mar 8, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window

Show Abstract 

Cited by 3 Related articles All 3 versions
MR4483540
  Zbl 07624274


2022 see 2021 Working Paper Full Text

On Label Shift in Domain Adaptation via Wasserstein Distance

Le, Trung; Do, Dat; Nguyen, Tuan; Nguyen, Huy; Bui, Hung; et al.

arXiv.org; Ithaca, Mar 2, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window

Show Abstract 
ARTICLE

On Label Shift in Domain Adaptation via Wasserstein Distance

Trung Le ; Dat Do ; Tuan Nguyen ; Huy Nguyen ; Hung Bui ; Nhat Ho ; Dinh Phung. arXiv.org, 2022. OPEN ACCESS

On Label Shift in Domain Adaptation via Wasserstein Distance

Available Online 


2022 see 2021 Working Paper Full Text

Variance Minimization in the Wasserstein Space for Invariant Causal Prediction

Martinet, Guillaume; Strzalkowski, Alexander; Engelhardt, Barbara E.

arXiv.org; Ithaca, Feb 28, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window

Show Abstract 

Cited by 3 Related articles All 3 versions 
ARTICLE

Variance Minimization in the Wasserstein Space for Invariant Causal Prediction

Martinet, Guillaume ; Strzalkowski, Alexander ; Engelhardt, Barbara E. arXiv.org, 2022

OPEN ACCESS

Variance Minimization in the Wasserstein Space for Invariant Causal Prediction

Available Online 


2022 see 2021 Working Paper Full Text

When OT meets MoM: Robust estimation of Wasserstein Distance

Staerman, Guillaume; Laforgue, Pierre; Mozharovskyi, Pavlo; d'Alché-Buc, Florence.

arXiv.org; Ithaca, Feb 18, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window

Show Abstract 


2022 see 2021  ARTICLE

Fast Approximation of the Sliced-Wasserstein Distance Using Concentration of Random Projections

Kimia Nadjahi ; Alain Durmus ; Pierre E Jacob ; Roland Badeau ; Umut Şimşekli. arXiv.org, 2022

OPEN ACCESS

Fast Approximation of the Sliced-Wasserstein Distance Using Concentration of Random Projections

Available Online 

<——2022———2022———350—


2022 see 2021 Working Paper  Full Text

Entropic regularization of Wasserstein distance between infinite-dimensional Gaussian measures and Gaussian processes

Minh Ha Quang.

arXiv.org; Ithaca, Mar 14, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new windo


Scholarly Journal Citation/Abstract

R-WGAN-Based Multitimescale Enhancement Method for Predicting f-CaO Cement Clinker

Hao, Xiaochen; Liu, Lin; Huang, Gaolu; Zhang, Yuxuan; Zhang, Yifu; et al.

IEEE Transactions on Instrumentation and Measurement; New York Vol. 71,  (2022): 1-10.

Abstract/Details   Show Abstract 

 Cited by 1 Related articles

Right mean for the \(\alpha-z\) Bures-Wasserstein quantum divergence

by Jeong, Miran; Hwang, Jinmi; Kim, Sejong

arXiv.org, 01/2022

A new quantum divergence induced from the \(\alpha-z\) Renyi relative entropy, called the \(\alpha-z\) Bures-Wasserstein

quantum divergence, has been recently...

Paper  Full Text Online
 

arXiv:2203.10754  [pdf, ps, other]  math.ST  stat.ML
Strong posterior contraction rates via Wasserstein dynamics
Authors: Emanuele Dolera, Stefano Favaro, Edoardo Mainini
Abstract: In this paper, we develop a novel approach to posterior contractions rates (PCRs), for both finite-dimensional (parametric) and infinite-dimensional (nonparametric) Bayesian models. Critical to our approach is the combination of an assumption of local Lipschitz-continuity for the posterior distribution with a dynamic formulation of the Wasserstein distance, here referred to as Wasserstein dynamics…  More
Submitted 21 March, 2022; originally announced March 2022.
Comments: 42 pages, text overlap with arXiv:2011.14425

All 2 versions 

2022

Partitive and Hierarchical Clustering of Distributional Data using the Wasserstein Distance

R Verde, A Irpino - Analysis of Distributional Data, 2022 - taylorfrancis.com

… the L2 Wasserstein distance (that is a Euclidean distance between quantile functions). We 

briefly recall the L2 Wasserstein … The splitting or the merging of clusters is done using greedy …  All 2 versions

Zbl 07677380
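
The identity used in this chapter, that the L2 Wasserstein distance between univariate distributions equals the L2 distance between their quantile functions, i.e. \(W_2^2(F,G) = \int_0^1 \big(F^{-1}(t) - G^{-1}(t)\big)^2\,dt\), suggests a simple numerical sketch (illustrative only, not the authors' code; the function name and grid size are arbitrary choices):

    # Illustrative sketch: L2 Wasserstein distance between two univariate samples,
    # approximated as the L2 distance between their empirical quantile functions.
    import numpy as np

    def w2_quantile(x, y, n_grid=1000):
        t = (np.arange(n_grid) + 0.5) / n_grid     # grid of probability levels in (0, 1)
        qx = np.quantile(x, t)                     # empirical quantile functions
        qy = np.quantile(y, t)
        return np.sqrt(np.mean((qx - qy) ** 2))    # L2 distance between quantile functions

    rng = np.random.default_rng(1)
    print(w2_quantile(rng.normal(0, 1, 500), rng.normal(2, 1, 500)))  # close to 2
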


2022


2022  [PDF] arxiv.org

Randomized Wasserstein Barycenter Computation: Resampling with Statistical Guarantees

F Heinemann, A Munk, Y Zemel - SIAM Journal on Mathematics of Data …, 2022 - SIAM

… We propose a hybrid resampling method to approximate finitely supported Wasserstein 

Cited by 11 Related articles All 4 versions
Zbl 07493844


Short-term prediction of wind power based on BiLSTM-CNN-WGAN-GP

Huang, L; Li, LX; (...); Zhang, DS

Jan 2022 (Early Access) | SOFT COMPUTING

A short-term wind power prediction model based on BiLSTM-CNN-WGAN-GP (LCWGAN-GP) is proposed in this paper, aiming at the problems of instability and low prediction accuracy of short-term wind power prediction. Firstly, the original wind energy data are decomposed into subsequences of natural mode functions with diff…

Free Full Text From Publisher

34 References  Related records

Proxying credit curves via Wasserstein distances

Michielon, M; Khedher, A and Spreij, P

Feb 2022 (Early Access) | ANNALS OF OPERATIONS RESEARCH

Credit risk plays a key role in financial modeling, and financial institutions are required to incorporate it in their pricing, as well as in capital requirement calculations. A common manner to extract credit worthiness information for existing and potential counterparties is based on the Credit Default Swap (CDS) market. Non…

Free Full Text From Publisher

32 References Related records

 Related articles All 3 versions

Representing Graphs via Gromov-Wasserstein Factorization.

Xu, Hongteng; Liu, Jiachang; (...); Carin, Lawrence

2022-Feb-23 | IEEE transactions on pattern analysis and machine intelligence PP

We propose a new nonlinear factorization model for graphs that have topological structures, and optionally, node attributes. This model is based on a pseudo-metric called Gromov-Wasserstein (GW) discrepancy, which compares graphs in a relational way. It estimates observed graphs as the GW barycenters constructed by a se…

Free Published Article From RepositoryView full text

Cited by 7 Related articles All 6 versions

Improving Non-Invasive Aspiration Detection With Auxiliary Classifier Wasserstein Generative Adversarial Networks.

Shu, Kechen; Mao, Shitong; (...); Sejdic, Ervin

2022-03 | IEEE journal of biomedical and health informatics 26 (3) , pp.1263-1272

Aspiration is a serious complication of swallowing disorders. Adequate detection of aspiration is essential in dysphagia management and treatment. High-resolution cervical auscultation has been increasingly considered as a promising noninvasive swallowing screening tool and has inspired automatic diagnosis wit…

View full text

Related articles All 3 versions

<——2022———2022———360— 


Wasserstein Proximal Algorithms for the Schrodinger Bridge Problem: Density Control With Nonlinear Drift

Caluya, KF and Halder, A

Mar 2022 | IEEE TRANSACTIONS ON AUTOMATIC CONTROL 67 (3) , pp.1163-1178

In this article, we study the Schrodinger bridge problem (SBP) with nonlinear prior dynamics. In control-theoretic language, this is a problem of minimum effort steering of a given joint state probability density function (PDF) to another over a finite-time horizon, subject to a controlled stochastic differential evolution …

Free Submitted Article From RepositoryView full text

1 Citation 75 References Related records

On the 2-Wasserstein distance for self-similar measures on the unit interval

Brawley, E; Doyle, M and Niedzialomski, R

Feb 2022 (Early Access) | MATHEMATISCHE NACHRICHTEN

We obtain a lower and an upper bound for the 2-Wasserstein distance between self-similar measures associated to two increasing non-overlapping linear contractions of the unit interval. We use a method of approximation of the measures via iterations of the Hutchinson operator on a delta Dirac measure. This allo…

 Full Text at Publisher

6 References  Related records

Wasserstein distributionally robust chance constrained programming approach for emergency medical system planning problem

Yuan, YF; Song, QK and Zhou, B

Feb 2022 (Early Access) | INTERNATIONAL JOURNAL OF SYSTEMS SCIENCE

This paper proposes a distributionally robust chance constrained programming model for an emergency medical system location problem with uncertain demands. By minimising the total expected cost, the location of emergency medical stations, the allocation of the ambulances and demand assignments of system are op…

 Full Text at Publisher

25 References Related records

 All 2 versions

MR4452550


Graph Wasserstein Autoencoder-Based Asymptotically Optimal Motion Planning With Kinematic Constraints for Robotic Manipulation

Xia, CK; Zhang, YZ; (...); Chen, IM

Feb 2022 (Early Access) | IEEE TRANSACTIONS ON AUTOMATION SCIENCE AND ENGINEERING

This paper presents a learning based motion planning method for robotic manipulation, aiming to solve the asymptotically-optimal motion planning problem with nonlinear kinematics in a complex environment. The core of the proposed method is based on a novel neural network model, i.e., graph wasserstein autoencoder…

Full Text at Publisher

33 References  Related records
Related articles
 All 2 versions


2022 see 2021

Invariance encoding in sliced-Wasserstein space for image classification with limited training data

MSE Rabbi, Y Zhuang, S Li… - arXiv preprint …, 2022 - arxiv-export-lb.library.cornell.edu

… learning, here we propose to mathematically augment a nearest subspace classification

model in sliced-Wasserstein … We demonstrate that for a particular type of learning problem, our …


2022


Calculus of Variations - Weak topology and Opial property in Wasserstein spaces, with applications to gradient flows and proximal point algorithms of geodesically convex functionals, by EMANUELE NALDI and GIUSEPPE SAVARE, communicated on 12 November 2021.

Naldi, E and Savare, G

2021 | RENDICONTI LINCEI-MATEMATICA E APPLICAZIONI 32 (4) , pp.725-750

In this paper we discuss how to define an appropriate notion of weak topology in the Wasserstein space (P-2(H), W-2) of Borel probability measures with finite quadratic moment on a separable Hilbert space H.

We will show that such a topology inherits many features of the usual weak topology in Hilbert spaces, in pa…

Free Submitted Article From RepositoryFull Text at Publisher

 Cited by 2 Related articles All 8 versions

2022  patent

Road texture image enhancement method coupling traditional method and WGAN-GP involves pre-processing original image, and converting three-dimensional macroscopic texture data into two-dimensional image

CN113850855-A

Inventor(s) HOU Y; WANG Y; (...); XU Z

Assignee(s) UNIV BEIJING TECHNOLOGY

Derwent Primary Accession Number 

2022-35384T


2022 patent

Method for sorting blood leukocyte based on attention residual network, involves introducing depth separable convolution to extract characteristic of white blood cell, and using Wasserstein to generate anti-network creating synthesis

CN113887672-A

Inventor(s) CAO X; ZHAO M; (...); LI Z

Assignee(s) UNIV MINJIANG

Derwent Primary Accession Number 

2022-11243X


 2022 patent

Method for simulating artificial seismic wave based on generative confrontation network algorithm, involves verifying judgment result, and judging network training effect of artificial seismic data by using loss function defined by Wasserstein norm

CN113935240-A

Inventor(s) YANG C; XIANG T and YANG M

Assignee(s) UNIV XIHUA

Derwent Primary Accession Number 


ARTICLE

Bounds on Wasserstein distances between continuous distributions using independent samples

Papp, Tamás ; Sherlock, Chris. arXiv.org, 2022

OPEN ACCESS

Bounds on Wasserstein distances between continuous distributions using independent samples

Available Online 

arXiv:2203.11627  [pdf, other]  stat.ML  stat.CO  stat.ME
Bounds on Wasserstein distances between continuous distributions using independent samples
Authors: Tamás Papp, Chris Sherlock
Abstract: The plug-in estimator of the Wasserstein distance is known to be conservative, however its usefulness is severely limited when the distributions are similar as its bias does not decay to zero with the true Wasserstein distance. We propose a linear combination of plug-in estimators for the squared 2-Wasserstein distance with a reduced bias that decays to zero with the true distance. The new estimat…  More
Submitted 22 March, 2022; originally announced March 2022.
Comments: 61 pages, 13 figures
All 2 versions
 

<——2022———2022———370—


Fault diagnosis of rotating machinery based on combination of Wasserstein generative adversarial...

by Li, Yibing; Zou, Weiteng; Jiang, Li

Measurement : journal of the International Measurement Confederation, 03/2022, Volume 191

•A WGAN model suitable for original vibration signal generation is proposed to provide a solution to the problem of

data imbalance.•LSTM-FCN is a...

Article PDFPDF

Journal Article  Full Text Online

View in Context Browse Journal

Research article
Fault diagnosis of rotating machinery based on combination of Wasserstein generative adversarial networks and long short term memory fully convolutional network
Measurement3 February 2022...

Yibing LiWeiteng ZouLi Jiang

Cited by 7 Related articles


 

Dimensionality Reduction and Wasserstein Stability for Kernel Regression

by Eckstein, Stephan; Iske, Armin; Trabs, Mathias

03/2022

In a high-dimensional regression framework, we study consequences of the naive two-step procedure where first the dimension of

the input variables is reduced...

Journal Article  Full Text Online

 Preview  Open Access

Related articles All 3 versions

    

Low-rank Wasserstein polynomial chaos expansions in the framework of optimal transport

by Gruhlke, Robert; Eigel, Martin

03/2022

A unsupervised learning approach for the computation of an explicit functional representation of a random vector $Y$ is presented, which only relies on a...

Journal Article  Full Text Online

All 3 versions

2022 patent news

State Intellectual Property Office of China Receives Univ Beijing Inf Sci & Tech's Patent Application for Method for Generating Biological Raman Spectrum Data Based on WGAN...

Global IP News. Biotechnology Patent News, Mar 17, 2022

Newspaper Article

[PDF] arxiv.org

Limit distribution theory for smooth p-Wasserstein distances

Z Goldfeld, K Kato, S Nietert, G Rioux - arXiv preprint arXiv:2203.00159, 2022 - arxiv.org

… serving the Wasserstein metric and topological … Wasserstein distance. The limit distribution

results leverage the functional delta method after embedding the domain of the Wasserstein …

 Cited by 3 All 2 versions 

[PDF] arxiv.org

Limit distribution theory for smooth p-Wasserstein distances

Z Goldfeld, K Kato, S Nietert, G Rioux - arXiv preprint arXiv:2203.00159, 2022 - arxiv.org

Wasserstein distance. The limit distribution results leverage the functional delta method after 

embedding the domain of the Wasserstein … with the smooth Wasserstein distance, showing …

Cited by 3 Related articles All 2 versions 

2022

2022 see 2021

Decision Making Under Model Uncertainty: Fréchet–Wasserstein Mean Preferences

EV Petracou, A Xepapadeas… - Management …, 2022 - pubsonline.informs.org

… In this paper, from Section 2.5 onward, we commit to the 2-Wasserstein metric (also denoted

by W2) and set d W2. For general definitions and properties of the Wasserstein metric, see …

 Cited by 2 Related articles

 

[PDF] arxiv.org

Isometric rigidity of Wasserstein spaces: the graph metric case

G Kiss, T Titkos - arXiv preprint arXiv:2201.01076, 2022 - arxiv.org

… that p-Wasserstein spaces over graph metric spaces are all … -Wasserstein space whose

isometry group is isomorphic to G. … impression that although the p-Wasserstein space Wp(X) is …

 Cited by 1 Related articles All 5 versions 


[PDF] arxiv.org

Quasi α-Firmly Nonexpansive Mappings in Wasserstein Spaces

A Bërdëllima, G Steidl - arXiv preprint arXiv:2203.04851, 2022 - arxiv.org

… Wasserstein gradient flow methods. Besides known facts, we are in particular interested in

Wasserstein … As in convex analysis these mappings coincide with the metric projections onto …

 All 2 versions 


 2022 see 2021

A data-driven scheduling model of virtual power plant using Wasserstein distributionally robust optimization

H Liu, J Qiu, J Zhao - International Journal of Electrical Power & Energy …, 2022 - Elsevier

… Wasserstein metric has been proposed in day-head unit commitment in [15], which minimizes

generation cost for the worst-case distribution over Wasserstein … of Wasserstein ambiguity …

 Related articles All 2 versions


arXiv:2203.12796  [pdf, ps, other]  math.PR
Poisson equation on Wasserstein space and diffusion approximations for McKean-Vlasov equation
Authors: Yun Li, Fuke Wu, Longjie Xie
Abstract: We consider the fully-coupled McKean-Vlasov equation with two-time-scales potentials, and all the coefficients depend on the distributions of both the slow component and the fast motion. By studying the smoothness of the solution of the non-linear Poisson equation on Wasserstein space, we derive the asymptotic limit as well as the optimal rate of convergence for the slow process. Extra homogenized…  More
Submitted 23 March, 2022; originally announced March 2022.

Cited by 2 Related articles All 2 versions 

<——2022———2022———380— 


[HTML] springer.com

[HTML] Deep distributional sequence embeddings based on a wasserstein loss

A Abdelwahab, N Landwehr - Neural Processing Letters, 2022 - Springer

… on Wasserstein distances between the distributions and a corresponding loss function for … 

aggregation and metric learning, but does not employ our Wasserstein-based loss function. …

Cited by 8 Related articles All 3 versions


arXiv:2203.15728  [pdf, other
math.OC
Wasserstein-Fisher-Rao Splines
Authors:
Julien Clancy, Felipe Suarez
Abstract: In this work study interpolating splines on the Wasserstein-Fisher-Rao (WFR) space of measures with differing total masses. To achieve this, we derive the covariant derivative and the curvature of an absolutely continuous curve in the WFR space. We prove that this geometric notion of curvature is equivalent to a Lagrangian notion of curvature in terms of particles on the cone. Finally, we propose…
More
Submitted 29 March, 2022; originally announced March 2022.
Comments: 20 pages, 2 figures 

All 2 versions 

arXiv:2203.15333  [pdf, other]  eess.SY
On Affine Policies for
Wasserstein Distributionally Robust Unit Commitment
Authors:
Youngchae Cho, Insoon Yang
Abstract: This paper proposes a unit commitment (UC) model based on data-driven Wasserstein distributionally robust optimization (WDRO) for power systems under uncertainty of renewable generation as well as its tractable exact reformulation. The proposed model is formulated as a WDRO problem relying on an affine policy, which nests an infinite-dimensional worst-case expectation problem and satisfies the non…
More
Submitted 29 March, 2022; originally announced March 2022. 

All 2 versions

arXiv:2203.13417  [pdf, other]  stat.ML cs.LG
Amortized Projection Optimization for Sliced Wasserstein Generative Models
Authors: Khai Nguyen, Nhat Ho
Abstract: Seeking informative projecting directions has been an important task in utilizing sliced Wasserstein distance in applications. However, finding these directions usually requires an iterative optimization procedure over the space of projecting directions, which is computationally expensive. Moreover, the computational issue is even more severe in deep learning applications, where computing the dist…
More
Submitted 24 March, 2022; originally announced March 2022.
Comments: 24 pages, 6 figures 

All 3 versions
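
For context (a plain Monte Carlo baseline, not the amortized optimization proposed in the paper): the sliced Wasserstein distance averages one-dimensional Wasserstein distances of projected samples over directions on the unit sphere. A minimal sketch for the order-1 case, with hypothetical function names and sample data:

    # Illustrative sketch: Monte Carlo estimate of the sliced Wasserstein-1 distance
    # between two point clouds, using random unit directions and the closed-form
    # one-dimensional Wasserstein distance for each projection.
    import numpy as np
    from scipy.stats import wasserstein_distance   # 1D Wasserstein-1 between samples

    def sliced_w1(x, y, n_projections=200, seed=0):
        rng = np.random.default_rng(seed)
        d = x.shape[1]
        theta = rng.normal(size=(n_projections, d))
        theta /= np.linalg.norm(theta, axis=1, keepdims=True)   # directions on the sphere
        return np.mean([wasserstein_distance(x @ t, y @ t) for t in theta])

    rng = np.random.default_rng(2)
    print(sliced_w1(rng.normal(size=(300, 5)), rng.normal(0.5, 1.0, size=(300, 5))))
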

arXiv:2203.12796  [pdf, ps, other]  math.PR
Poisson equation on Wasserstein space and diffusion approximations for McKean-Vlasov equation
Authors: Yun Li, Fuke Wu, Longjie Xie
Abstract: We consider the fully-coupled McKean-Vlasov equation with two-time-scales potentials, and all the coefficients depend on the distributions of both the slow component and the fast motion. By studying the smoothness of the solution of the non-linear Poisson equation on Wasserstein space, we derive the asymptotic limit as well as the optimal rate of convergence for the slow process. Extra homogenized…
More
Submitted 23 March, 2022; originally announced March 2022. 

Cited by 3 Related articles All 2 versions

2022


arXiv:2203.12136  [pdf, other]  stat.ML cs.LG math.OC
Wasserstein Distributionally Robust Optimization via Wasserstein Barycenters
Authors:
Tim Tsz-Kit Lau, Han Liu
Abstract: In many applications in statistics and machine learning, the availability of data samples from multiple sources has become increasingly prevalent. On the other hand, in distributionally robust optimization, we seek data-driven decisions which perform well under the most adverse distribution from a nominal distribution constructed from data samples within a certain distance of probability distribut…
More
Submitted 22 March, 2022; originally announced March 2022. 

All 3 versions

arXiv:2203.11627  [pdf, other]  stat.ML stat.CO stat.ME
Bounds on Wasserstein distances between continuous distributions using independent samples
Authors: Tamás Papp, Chris Sherlock
Abstract: The plug-in estimator of the Wasserstein distance is known to be conservative, however its usefulness is severely limited when the distributions are similar as its bias does not decay to zero with the true Wasserstein distance. We propose a linear combination of plug-in estimators for the squared 2-Wasserstein distance with a reduced bias that decays to zero with the true distance. The new estimat…
More
Submitted 22 March, 2022; originally announced March 2022.
Comments: 61 pages, 13 figures

All 2 versions

arXiv:2203.10754  [pdf, ps, other]  math.ST stat.ML
Strong posterior contraction rates via Wasserstein dynamics
Authors:
Emanuele Dolera, Stefano Favaro, Edoardo Mainini
Abstract: In this paper, we develop a novel approach to posterior contractions rates (PCRs), for both finite-dimensional (parametric) and infinite-dimensional (nonparametric) Bayesian models. Critical to our approach is the combination of an assumption of local Lipschitz-continuity for the posterior distribution with a dynamic formulation of the Wasserstein distance, here referred to as Wasserstein dynamics…
More
Submitted 21 March, 2022; originally announced March 2022.
Comments: 42 pages, text overlap with arXiv:2011.14425
All 2 versions


Chen, Hong-Bin; Niles-Weed, Jonathan

Asymptotics of smoothed Wasserstein distances. (English) Zbl 07496366

Potential Anal. 56, No. 4, 571-595 (2022).

MSC:  58J65 60J60

PDF BibTeX XML Cite Full Text: DOI 

Cited by 6 Related articles All 5 versions
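
For context (an illustrative sketch, not the paper's construction): the Gaussian-smoothed Wasserstein distance compares \(\mu * \mathcal{N}(0,\sigma^2 I)\) with \(\nu * \mathcal{N}(0,\sigma^2 I)\). In one dimension it can be estimated by perturbing each sample with independent Gaussian noise before comparing the empirical measures; the order-1 distance and the function name below are arbitrary choices for simplicity.

    # Illustrative sketch (1D, order 1): estimate the sigma-smoothed Wasserstein
    # distance W1(mu * N(0, sigma^2), nu * N(0, sigma^2)) by adding independent
    # Gaussian noise to each sample and comparing the resulting empirical measures.
    import numpy as np
    from scipy.stats import wasserstein_distance

    def smoothed_w1(x, y, sigma, seed=0):
        rng = np.random.default_rng(seed)
        x_s = x + sigma * rng.standard_normal(x.shape)   # samples from mu * N(0, sigma^2)
        y_s = y + sigma * rng.standard_normal(y.shape)   # samples from nu * N(0, sigma^2)
        return wasserstein_distance(x_s, y_s)

    rng = np.random.default_rng(3)
    x, y = rng.normal(0, 1, 2000), rng.normal(0.3, 1, 2000)
    for sigma in (0.0, 1.0, 5.0):
        print(sigma, smoothed_w1(x, y, sigma))
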

Estimating processes in adapted Wasserstein distance. (English) Zbl 07493830

Ann. Appl. Probab. 32, No. 1, 529-550 (2022).

MSC:  60G42 90C46 58E30

PDF BibTeX XML Cite

Full Text: DOI 

Cited by 31 Related articles All 9 versions

 <——2022———2022———390— 


Sun, Yue; Qiu, Ruozhen; Sun, Minghe

Optimizing decisions for a dual-channel retailer with service level requirements and demand uncertainties: a Wasserstein metric-based distributionally robust optimization approach. (English) Zbl 07486378

Comput. Oper. Res. 138, Article ID 105589, 21 p. (2022).

MSC:  90Bxx

PDF BibTeX XML Cite

Cited by 3 Related articles All 2 versions


Molnár, Lajos

Maps on positive definite cones of C*-algebras preserving the Wasserstein mean. (English) Zbl 07469054

Proc. Am. Math. Soc. 150, No. 3, 1209-1221 (2022).

Summary: The primary aim of this paper is to present the complete description of the isomorphisms between positive definite cones of

MR4394092 Prelim Brawley, Easton; Doyle, Mason; Niedzialomski, Robert; On the 2-Wasserstein distance for self-similar measures on the unit interval. Math. Nachr. 295 (2022), no. 3, 468–486. 28A20 (60B10)

Review PDF Clipboard Journal Article


MR4393384 Prelim Iacobelli, Mikaela; A New Perspective on Wasserstein Distances for Kinetic Problems. Arch. Ration. Mech. Anal. 244 (2022), no. 1, 27–50. 35J96 (82B40 82D10)

Review PDF Clipboard Journal Article

[HTML] springer.com

[HTML] A new perspective on Wasserstein distances for kinetic problems

M Iacobelli - Archive for Rational Mechanics and Analysis, 2022 - Springer

… of Wasserstein distances in kinetic theory, as is beautifully described in the

bibliographical notes of [60, Chapter 6]… The first celebrated result relying on Monge–Kantorovich–Wasserstein …

Cited by 2 Related articles All 6 versions

ARTICLE

Viscosity solutions for obstacle problems on Wasserstein space

Talbi, Mehdi ; Touzi, Nizar ; Zhang, Jianfeng. arXiv.org, 2022

OPEN ACCESS

Viscosity solutions for obstacle problems on Wasserstein space

Available Online 

arXiv:2203.17162  [pdf, ps, other]  math.PR
Viscosity solutions for obstacle problems on Wasserstein space
Authors: Mehdi Talbi, Nizar Touzi, Jianfeng Zhang
Abstract: This paper is a continuation of our accompanying paper [Talbi, Touzi and Zhang (2021)], where we characterized the mean field optimal stopping problem by an obstacle equation on the Wasserstein space of probability measures, provided that the value function is smooth. Our purpose here is to establish this characterization under weaker regularity requirements. We shall define a notion of viscosity… 
More
Submitted 31 March, 2022; originally announced March 2022.
Comments: 25 pages
MSC Class: 60G40; 35Q89; 49N80; 49L25; 60H30
Cited by 5
 Related articles All 5 versions 


ARTICLE

Wasserstein Distributionally Robust Control of Partially Observable Linear Systems: Tractable Approximation and Performance Guarantee

Hakobyan, Astghik ; Yang, Insoon. arXiv.org, 2022

OPEN ACCESS

Wasserstein Distributionally Robust Control of Partially Observable Linear Systems: Tractable Approximation and Performance Guarantee

Available Online 

arXiv:2203.17045  [pdf, other]  eess.SY
Wasserstein Distributionally Robust Control of Partially Observable Linear Systems: Tractable Approximation and Performance Guarantee
Authors: Astghik Hakobyan, Insoon Yang
Abstract: Wasserstein distributionally robust control (WDRC) is an effective method for addressing inaccurate distribution information about disturbances in stochastic systems. It provides various salient features, such as an out-of-sample performance guarantee, while most of existing methods use full-state observations. In this paper, we develop a computationally tractable WDRC method for discrete-time par…  More
Submitted 31 March, 2022; originally announced March 2022.

All 2 versions 
MR4604196
 


[PDF] arxiv.org

WPPNets: Unsupervised CNN Training with Wasserstein Patch Priors for Image Superresolution

F Altekrüger, J Hertrich - arXiv preprint arXiv:2201.08157, 2022 - arxiv.org

… unsupervised loss function for image superresolution of materials microstructures. Instead of 

… function based on the Wasserstein patch prior which measures the Wasserstein-2 distance …

Cited by 1 Related articles All 2 versions

 

 A novel multi-speakers Urdu singing voices synthesizer using Wasserstein Generative Adversarial Network

A Saeed, MF Hayat, T Habib, DA Ghaffar… - Speech …, 2022 - Elsevier

In this paper, the first-ever Urdu language singing voices corpus is developed using linguistic 

(phoneti 

Related articles All 2 versions

Cited by 1 Related articles All 2 versions

arXiv:2204.00387  [pdf]  cs.LG  stat.ML  doi: 10.5121/csit.2022.120611
DAG-WGAN: Causal Structure Learning With Wasserstein Generative Adversarial Networks
Authors: Hristo Petkov, Colin Hanley, Feng Dong
Abstract: The combinatorial search space presents a significant challenge to learning causality from data. Recently, the problem has been formulated into a continuous optimization framework with an acyclicity constraint, allowing for the exploration of deep generative models to better capture data sample distributions and support the discovery of Directed Acyclic Graphs (DAGs) that faithfully represent the… 
More
Submitted 1 April, 2022; originally announced April 2022.
Comments: 8th International Conference on Artificial Intelligence and Applications (AIFU 2022)

arXiv:2204.00263  [pdf, ps, other]  math.DS  math.PR
Wasserstein convergence rate in the invariance principle for deterministic dynamical systems
Authors: Zhenxin Liu, Zhe Wang
Abstract: In this paper, we consider the convergence rate with respect to Wasserstein distance in the invariance principle for deterministic nonuniformly hyperbolic systems. Our results apply to uniformly hyperbolic systems and large classes of nonuniformly hyperbolic systems including intermittent maps, Viana maps, unimodal maps and others. Furthermore, as a nontrivial application to homogenization problem… 
More
Submitted 1 April, 2022; originally announced April 2022.
Comments: 22 pages

All 2 versions 
<——2022———2022———400—


arXiv:2204.00191  [pdf, other]  math.OC  eess.SY
Wasserstein Two-Sided Chance Constraints with An Application to Optimal Power Flow
Authors: Haoming Shen, Ruiwei Jiang
Abstract: As a natural approach to modeling system safety conditions, chance constraint (CC) seeks to satisfy a set of uncertain inequalities individually or jointly with high probability. Although a joint CC offers stronger reliability certificate, it is oftentimes much more challenging to compute than individual CCs. Motivated by the application of optimal power flow, we study a special joint CC, named tw…  More
Submitted 31 March, 2022; originally announced April 2022.

Wasserstein Two-Sided Chance Constraints with An Application to Optimal Power Flow
by Shen, Haoming; Jiang, Ruiwei
03/2022
As a natural approach to modeling system safety conditions, chance constraint (CC) seeks to satisfy a set of uncertain inequalities individually or jointly... Journal Article  Full Text Online

 Cited by 2 Related articles All 5 versions

[PDF] arxiv.org

Minimum entropy production, detailed balance and Wasserstein distance for continuous-time Markov processes

A Dechant - Journal of Physics A: Mathematical and Theoretical, 2022 - iopscience.iop.org

… the Wasserstein distance between the two probability densities. The Wasserstein distance

is a … Since the Wasserstein distance characterizes the minimum entropy production, it also …

 Cited by 5 Related articles All 3 versions

Cited by 11 Related articles All 5 versions


[PDF] rice.edu

[PDF] Subexponential upper and lower bounds in Wasserstein distance for Markov processes

N Sandric, A Arapostathis, G Pang - caam.rice.edu

… aperiodic Markov processes. We further discuss these results in the context of Markov Lévy-…

, we obtain exponential ergodicity in the Lp-Wasserstein distance for a class of Itô processes …

Related articles All 4 versions

Measuring Phase-Amplitude Coupling between Neural Oscillations of Different Frequencies via the Wasserstein Distance

T Ohki - Journal of Neuroscience Methods, 2022 - Elsevier

… mathematical framework of the Wasserstein distance to enhance the intuitive comprehension

of the Wasserstein Modulation Index (wMI). The Wasserstein distance is an optimization …

  All 3 versions


2022 patent news     Wire Feed  Full Text

Inst Inf Eng, CAS Files Chinese Patent Application for Network Attack Traffic Data Enhancement Method and System Combining Auto-Encoder and WGAN

Global IP News. Broadband and Wireless Network News; New Delhi [New Delhi]. 12 Mar 2022. 

DetailsFull text Select result item

NEWSPAPER ARTICLE

Inst Inf Eng, CAS Files Chinese Patent Application for Network Attack Traffic Data Enhancement Method and System Combining Auto-Encoder and WGAN

Global IP News. Broadband and Wireless Network News, 2022

Inst Inf Eng, CAS Files Chinese Patent Application for Network Attack Traffic Data Enhancement Method and System Combining Auto-Encoder and WGAN

No Online Access 


2022 see 2021  Working Paper Full Text

Tighter expected generalization error bounds via Wasserstein distance

Rodríguez-Gálvez, Borja; Bassi, Germán; Thobaben, Ragnar; Skoglund, Mikael.

arXiv.org; Ithaca, Mar 25, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window

Cited by 9 Related articles All 6 versions 
ARTICLE

Tighter expected generalization error bounds via Wasserstein distance

Rodríguez-Gálvez, Borja ; Bassi, Germán ; Thobaben, Ragnar ; Skoglund, Mikael. arXiv.org, 2022

OPEN ACCESS

Tighter expected generalization error bounds via Wasserstein distance

Available Online 


 2022


2022 see 2021 Working Paper  Full Text

Data-Driven Distributionally Robust Surgery Planning in Flexible Operating Rooms Over a Wasserstein Ambiguity

Shehadeh, Karmel S.

arXiv.org; Ithaca, Mar 7, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window

Show Abstract 

Working Paper Full Text

Data-Driven Distributionally Robust Surgery Planning in Flexible Operating Rooms Over a Wasserstein Ambiguity

Shehadeh, Karmel S.

arXiv.org; Ithaca, Mar 7, 2022.

Related articles All 3 versions

MR4450414 

2022 see 2021 Working Paper Full Text

Projected Sliced Wasserstein Autoencoder-based Hyperspectral Images Anomaly Detection

Chen, Yurong; Zhang, Hui; Wang, Yaonan; Wu, Q M Jonathan; Yang, Yimin.

arXiv.org; Ithaca, Mar 20, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window
ARTICLE

Projected Sliced Wasserstein Autoencoder-based Hyperspectral Images Anomaly Detection

Yurong Chen ; Hui Zhang ; Yaonan Wang ; Q M Jonathan Wu ; Yimin Yang. arXiv.org, 2022

OPEN ACCESS

Projected Sliced Wasserstein Autoencoder-based Hyperspectral Images Anomaly Detection

Available Online 

Related articles All 2 versions

2022 see 2021 Working Paper Full Text

Schema matching using Gaussian mixture models with Wasserstein distance

Przyborowski, Mateusz; Pabiś, Mateusz; Janusz, Andrzej; Ślęzak, Dominik.

arXiv.org; Ithaca, Mar 31, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window

Related articles All 3 versions 
ARTICLE

Schema matching using Gaussian mixture models with Wasserstein distance

Przyborowski, Mateusz ; Pabiś, Mateusz ; Janusz, Andrzej ; Ślęzak, Dominik. arXiv.org, 2022

OPEN ACCESS

Schema matching using Gaussian mixture models with Wasserstein distance

Available Online 
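
For context (standard background often used when comparing Gaussian mixture components under optimal transport, not this paper's code): the 2-Wasserstein distance between Gaussians has the closed form \(W_2^2(\mathcal{N}(m_1,\Sigma_1), \mathcal{N}(m_2,\Sigma_2)) = \lVert m_1 - m_2\rVert^2 + \mathrm{tr}\big(\Sigma_1 + \Sigma_2 - 2(\Sigma_2^{1/2}\Sigma_1\Sigma_2^{1/2})^{1/2}\big)\). A small sketch, with hypothetical names and toy inputs:

    # Illustrative sketch: closed-form 2-Wasserstein distance between two Gaussians.
    import numpy as np
    from scipy.linalg import sqrtm

    def gaussian_w2(m1, S1, m2, S2):
        root_S2 = sqrtm(S2)
        cross = sqrtm(root_S2 @ S1 @ root_S2)          # (S2^{1/2} S1 S2^{1/2})^{1/2}
        return np.sqrt(np.sum((m1 - m2) ** 2)
                       + np.trace(S1 + S2 - 2 * np.real(cross)))

    m1, S1 = np.zeros(2), np.eye(2)
    m2, S2 = np.array([3.0, 0.0]), 2.0 * np.eye(2)
    print(gaussian_w2(m1, S1, m2, S2))   # sqrt(9 + 2*(3 - 2*sqrt(2)))
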


Scholarly Journal  Citation/Abstract

A Deep Transfer Learning Fault Diagnosis Method Based on WGAN and Minimum Singular Value for Non-Homologous Bearing

He, Jun; Ouyang, Ming; Chen, Zhiwen; Chen, Danfeng; Liu, Shiya.

IEEE Transactions on Instrumentation and Measurement; New York Vol. 71,  (2022): 1-9.

Abstract/Details 
A Deep Transfer Learning Fault Diagnosis Method Based on WGAN and Minimum Singular Value for Non-Homologous Bearing

J He, M Ouyang, Z Chen, D Chen… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org

… based on Wasserstein generative … The Wasserstein is the difference measurement between

SD and TD with compact space (H, p), in which p is the exponents of the Wasserstein metric …

Cited by 4 Related articles

Conference Paper  Citation/Abstract

The Wasserstein Distance Using QAOA: A Quantum Augmented Approach to Topological Data Analysis

Gopikrishnan, Mannathu.

The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2022).

Abstract/Details 


<——2022———2022———410—  


2022 see 2021  Scholarly Journal  Citation/Abstract

CNN-Based Continuous Authentication on Smartphones With Conditional Wasserstein Generative Adversarial Network

Li, Yantao; Luo, Jiaxing; Deng, Shaojiang; Zhou, Gang.

IEEE Internet of Things Journal; Piscataway Vol. 9, Iss. 7,  (2022): 5447-5460.

Abstract/Details 

2022 see 2021  Scholarly Journal  Citation/Abstract

An Improved Mixture Density Network Via Wasserstein Distance Based Adversarial Learning for Probabilistic Wind Speed Predictions

Yang, Luoxiao; Zheng, Zhong; Zhang, Zijun.

IEEE Transactions on Sustainable Energy; Piscataway Vol. 13, Iss. 2,  (2022): 755-766.

Abstract/Details 

Scholarly Journal  Citation/Abstract

Wasserstein Proximal Algorithms for the Schrödinger Bridge Problem: Density Control With Nonlinear Drift
Caluya, Kenneth F; Halder, Abhishek.
IEEE Transactions on Automatic Control; New York Vol. 67, Iss. 3,  (2022): 1163-1178.
Abstract/Details Get full textLink to external site, this link will open in a new window


Scholarly Journal  Full Text

A Data Augmentation Method for Prohibited Item X-Ray Pseudocolor Images in X-Ray Security Inspection Based on Wasserstein Generative Adversarial Network and Spatial-and-Channel Attention Block
Liu, Dongming; Liu, Jianchang; Yuan, Peixin; Yu, Feng.
Computational Intelligence and Neuroscience : CIN; New York Vol. 2022,  (2022).
Abstract/DetailsFull textFull text - PD


[PDF] researchgate.net

[PDF] Gradient Penalty Approach for Wasserstein Generative Adversarial Networks

Y Ti - researchgate.net

… Hence, we make use of the Wasserstein distance to fix such recurring issues. The representation

for the mathematical formula is as shown below. Refer to the following research paper …

Related articles 


2022


[PDF] arxiv.org

Wasserstein Solution Quality and the Quantum Approximate Optimization Algorithm: A Portfolio Optimization Case Study

JS Baker, SK Radha - arXiv preprint arXiv:2202.06782, 2022 - arxiv.org

… quantum processing units (QPUs). We benchmark the success of this approach using the

Quantum … determined by the Normalized and Complementary Wasserstein Distance, η, which …

 Cited by 1 All 5 versions 


[PDF] arxiv.org

Right mean for the α-z Bures-Wasserstein quantum divergence

M Jeong, J Hwang, S Kim - arXiv preprint arXiv:2201.03732, 2022 - arxiv.org

… It has been shown that the quantum divergence Φα,z is … Also, the right mean for the α −

z Bures-Wasserstein quantum … trace inequality with the Wasserstein mean and bounds for …

 Related articles All 2 versions 
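
For context (standard background on the underlying metric, not a result of the entries above): the Bures-Wasserstein distance between positive definite matrices \(A\) and \(B\), which the \(\alpha\)-\(z\) divergence above generalizes, is
\[
d_{\mathrm{BW}}(A,B) \;=\; \Big(\mathrm{tr}\,A + \mathrm{tr}\,B - 2\,\mathrm{tr}\big(A^{1/2} B A^{1/2}\big)^{1/2}\Big)^{1/2},
\]
and coincides with the 2-Wasserstein distance between the centered Gaussians \(\mathcal{N}(0,A)\) and \(\mathcal{N}(0,B)\).
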


The Wasserstein Distance Using QAOA: A Quantum Augmented Approach to Topological Data Analysis

M Saravanan, M Gopikrishnan - 2022 International Conference …, 2022 - ieeexplore.ieee.org

… is a metric called the Wasserstein Distance, which measures … Wasserstein Distance by

applying the Quantum Approximate Optimization Algorithm (QAOA) using gate-based quantum …

 

2022 see 2021

Asymptotics of Smoothed Wasserstein Distances

Chen, HB and Niles-Weed, J

Apr 2022 | Jan 2021 (Early Access) | POTENTIAL ANALYSIS 56 (4) , pp.571-595

We investigate contraction of the Wasserstein distances on Double-struck capital R-d under Gaussian smoothing. It is well known that the heat semigroup is exponentially contractive with respect to the Wasserstein distances on manifolds of positive curvature; however, on flat Euclidean space-where the heat semigroup corresponds to smoothing the measures by Gaussian convolution-the situation is m

Show more

Free Submitted Article From RepositoryFull Text at Publisher

39  References

Cited by 5 Related articles All 5 versions


Deep Distributional Sequence Embeddings Based on a Wasserstein Loss

Abdelwahab, A and Landwehr, N

Mar 2022 (Early Access) | NEURAL PROCESSING LETTERS

Deep metric learning employs deep neural networks to embed instances into a metric space such that distances between instances of the same class are small and distances between instances from different classes are large. In most existing deep metric learning techniques, the embedding of an instance is given by a feature vector produced by a deep neural network and Euclidean distance or cosine s

Free Full Text From Publisher

36 References  Related records

Cited by 8 Related articles All 3 versions

<——2022———2022———420—


MAPS ON POSITIVE DEFINITE CONES OF C*-ALGEBRAS PRESERVING THE WASSERSTEIN MEAN

Molnar, L

Mar 2022 | PROCEEDINGS OF THE AMERICAN MATHEMATICAL SOCIETY 150 (3) , pp.1209-1221

The primary aim of this paper is to present the complete description of the isomorphisms between positive definite cones of C*-algebras with respect to the recently introduced Wasserstein mean and to show the nonexistence of nonconstant such morphisms into the positive reals in the case of von Neumann algebras without type I-2, I-1 direct summands. A comment on the algebraic properties of the W

Fre Full Text From Publisher

29 References

Related records


An LP-based, strongly-polynomial 2-approximation algorithm for sparse Wasserstein barycenters

Borgwardt, S

Apr 2022 | Aug 2020 (Early Access) | OPERATIONAL RESEARCH 22 (2) , pp.1511-1551

Discrete Wasserstein barycenters correspond to optimal solutions of transportation problems for a set of probability measures with finite support. Discrete barycenters are measures with finite support themselves and exhibit two favorable properties: there always exists one with a provably sparse support, and any optimal transport to the input measures is non-mass splitting. It is open whether a

Show more

Free Submitted Article From RepositoryFull Text at Publisher

2 Citations

52 References

Related records

  

37

Measuring phase-amplitude coupling between neural oscillations of different frequencies via the Wasserstein distance.

Ohki, Takefumi

2022-Mar-23 | Journal of neuroscience methods 374 , pp.109578

BACKGROUND: Phase-amplitude coupling (PAC) is a key neuronal mechanism. Here, a novel method for quantifying PAC via the Wasserstein distance is presented.

NEW METHOD: The Wasserstein distance is an optimization algorithm for minimizing transportation cost and distance. For the first time, the author has applied this distance function to quantify PAC and named the Wasserstein Modulation I

 View full text

 Cited by 1 All 4 versions

 

2022 see 2021

An Improved Mixture Density Network Via Wasserstein Distance Based Adversarial Learning for Probabilistic Wind Speed Predictions

Yang, LX; Zheng, Z and Zhang, ZJ

Apr 2022 | IEEE TRANSACTIONS ON SUSTAINABLE ENERGY 13 (2) , pp.755-766

This paper develops a novel improved mixture density network via Wasserstein distance-based adversarial learning (WA-IMDN) for achieving more accurate probabilistic wind speed predictions (PWSP). The proposed method utilizes the historical supervisory control and data acquisition (SCADA) system data collected from multiple wind turbines (WTs) in different wind farms to predict the wind speed pr

Full Text at Publisher

42 References

Related records

Демистификация: сети Wasserstein GAN (WGAN) - ICHI.PRO

https://ichi.pro › demistifikacia-seti-w...

What is the Wasserstein distance? What is the intuition behind using the distance ... The 2022 season of the series "Artificial Intelligence, Fairness and ...

[Russian Demystification of Wasserstein network GAN (WGAN)]


2022


arXiv:2204.01188  [pdf, other]  cs.CV  cs.LG  stat.ML
Revisiting Sliced Wasserstein on Images: From Vectorization to Convolution
Authors: Khai Nguyen, Nhat Ho
Abstract: The conventional sliced Wasserstein is defined between two probability measures that have realizations as vectors. When comparing two probability measures over images, practitioners first need to vectorize images and then project them to one-dimensional space by using matrix multiplication between the sample matrix and the projection matrix. After that, the sliced Wasserstein is evaluated by avera… 
More
Submitted 3 April, 2022; originally announced April 2022.
Comments: 34 pages, 12 figures, 10 tables
Cited by 1
 Related articles All 3 versions 
ARTICLE

Revisiting Sliced Wasserstein on Images: From Vectorization to Convolution

Nguyen, Khai ; Ho, Nhat. arXiv.org, 2022

OPEN ACCESS

Revisiting Sliced Wasserstein on Images: From Vectorization to Convolution

Available Online 


ARTICLE

Wasserstein Hamiltonian flow with common noise on graph

Cui, Jianbo ; Liu, Shu ; Zhou, Haomin. arXiv.org, 2022


OPEN ACCESS

Wasserstein Hamiltonian flow with common noise on graph

Available Online 

arXiv:2204.01185  [pdf, ps, other]  math.OC  math.DS  math.PR
Wasserstein Hamiltonian flow with common noise on graph
Authors: Jianbo Cui, Shu Liu, Haomin Zhou
Abstract: We study the Wasserstein Hamiltonian flow with a common noise on the density manifold of a finite graph. Under the framework of stochastic variational principle, we first develop the formulation of stochastic Wasserstein Hamiltonian flow and show the local existence of a unique solution. We also establish a sufficient condition for the global existence of the solution. Consequently, we obtain the…  More
Submitted 3 April, 2022; originally announced April 2022.

AAll 3 versions 


Minimax confidence intervals for the Sliced Wasserstein distance

T ManoleS Balakrishnan… - Electronic Journal of …, 2022 - projecteuclid.org

… for estimating the Wasserstein distance and estimating under the Wasserstein distance,

the minimax risks we obtain for the Sliced Wasserstein distance when d > 1 are dimension-free. …

 Cited by 7 Related articles All 7 versions

MR4402565 


[PDF] sbc.org.br

Simulando padrões de acesso a memória com Wasserstein-GAN
[Portuguese: Simulating memory access patterns with Wasserstein-GAN]

ABV dos Santos, FB Moreira… - Anais da XXII Escola …, 2022 - sol.sbc.org.br

In this work we explore the possibility of simulating memory access pattern traces

using the adversarial neural network mechanism known as Wasserstein-GAN (…

Related articles All 2 versions 

T-copula and Wasserstein distance-based stochastic neighbor embedding

Y Huang, K Guo, X Yi, J Yu, Z ShenT Li - Knowledge-Based Systems, 2022 - Elsevier

… Moreover, we use several metrics to evaluate the classification and clustering performances

after dimension reduction. Accuracy (ACC) describes the proportion of correctly classified …

Cited by 1 Related articles All 2 versions

<——2022———2022———430— 


A novel virtual sample generation method based on a modified conditional Wasserstein GAN to address the small sample size problem in soft sensing

YL HeXY Li, JH Ma, S Lu, QX Zhu - Journal of Process Control, 2022 - Elsevier

… Aiming at handling the issue of small sample size, a novel virtual sample generation method

embedding a deep neural network as a regressor into conditional Wasserstein generative …

Related articles All 2 versions

[PDF] ijicic.org

[PDF] ACWGAN: AN AUXILIARY CLASSIFIER WASSERSTEIN GAN-BASED OVERSAMPLING APPROACH FOR MULTI-CLASS IMBALANCED LEARNING

C Liao, M Dong - ijicic.org

Learning from multi-class imbalance data is a common but challenging task in machine

learning community. Oversampling method based on Generative Adversarial Networks …


[PDF] arxiv.org

Model-agnostic bias mitigation methods with regressor distribution control for Wasserstein-based fairness metrics

A Miroshnikov, K Kotsiopoulos, R Franks… - arXiv preprint arXiv …, 2021 - arxiv.org

… to assess regressor fairness using Wasserstein-based metrics. These metrics, which arise in

… In addition, the metric picks up changes in the geometry of the regressor distribution, unlike …

 Related articles All 2 versions 


 

[PDF] arxiv.org

Wasserstein Distributionally Robust Gaussian Process Regression and Linear Inverse Problems

X ZhangJ BlanchetY MarzoukVA Nguyen… - arXiv preprint arXiv …, 2022 - arxiv.org

… In simple words, our results imply that Gaussian … Wasserstein distance is designed to

explore the impact of distributions with potentially rougher (and, more importantly, non-Gaussian) …

2022  PDF
Convergence diagnostics for Monte Carlo fission source distributions using the Wasserstein distance...
by Guo, Xiaoyu; Li, Zeguang; Huang, Shanfang ; More...
Nuclear engineering and design, 04/2022, Volume 389
•A new diagnostic method for Monte Carlo (MC) source convergence is proposed based on the Wasserstein distance (WD) metric.•The usage of the WD-based indicator...

Journal Article  Full Text Online
View in Context Browse Journal
 Preview 

2022


An LP-based, strongly-polynomial 2-approximation algorithm for sparse Wasserstein barycenters
by Borgwardt, Steffen
Operational research, 08/2020, Volume 22, Issue 2
Discrete Wasserstein barycenters correspond to optimal solutions of transportation problems for a set of probability measures with finite support. Discrete...
Article PDFPDF Journal Article  Full Text Online
More Options 
View in Context Browse Journal
 Preview   Open Access

Reports Summarize Operational Research Study Results from University of Colorado Denver (An Lp-based, Strongly-polynomial 2-approximation Algorithm for Sparse Wasserstein...

Investment Weekly News, 04/2022

Newsletter  Full Text Online
    1


ARTICLE

Wasserstein-type distances of two-type continuous-state branching processes in Lévy random environments

Chen, Shukai ; Fang, Rongjuan ; Zheng, Xiangqi. 2022

OPEN ACCESS

Wasserstein-type distances of two-type continuous-state branching processes in Lévy random environments

Available Online 

Related articles All 2 versions 


 patent
State Intellectual Property Office of China Receives Univ Chongqing Posts & Telecom's Patent Application for Visual Dimension Reduction Method Based on Wasserstein...
Global IP News. Information Technology Patent News, Apr 5, 2022

Newspaper Article  Full Text Online

Wire Feed  Full Text

State Intellectual Property Office of China Receives Univ Chongqing Posts & Telecom's Patent Application for Visual Dimension Reduction Method Based on Wasserstein Space

Global IP News. Information Technology Patent News; New Delhi [New Delhi]. 05 Apr 2022. 


Fused WGAN 이용한 2 단계 오버 샘플링

최인재 - 2022 - repository.hanyang.ac.kr

WGAN, an imbalanced data oversampling technique using GAN based deep learning to settle

these problems. In Fused WGAN, we … the preliminary sampling model, 1st WGAN-GP. The …

 [Korean  2-Step Oversampling with Fused WGAN]

<——2022———2022———440— 

 


 АНАЛІЗ ГЕНЕРАТИВНИХ МОДЕЛЕЙ ГЛИБОКОГО НАВЧАННЯ ТА ОСОБЛИВОСТЕЙ ЇХ РЕАЛІЗАЦІЇ НА ПРИКЛАДІ WGAN

ЯО Ісаєнков, ОБ Мокін - Вісник Вінницького політехнічного …, 2022 - visnyk.vntu.edu.ua

The structure, training, and application areas of deep-learning generative models are presented.

The main tasks of such models include the generation of …

[Ukrainian  ANALYSIS OF GENERATIVE MODELS OF DEEP LEARNING AND FEATURES OF THEIR IMPLEMENTATION ON THE EXAMPLE OF WGAN]

Related articles 


2022  ARTICLE

Wassmap: Wasserstein Isometric Mapping for Image Manifold Learning

Keaton Hamm ; Nick Henscheid ; Shujie Kang. arXiv.org, 2022

OPEN ACCESS

Wassmap: Wasserstein Isometric Mapping for Image Manifold Learning

Available Online 
 
arXiv:2204.06645  [pdf, other]  cs.LG  cs.CV  stat.ML
Wassmap: Wasserstein Isometric Mapping for Image Manifold Learning
Authors: Keaton Hamm, Nick Henscheid, Shujie Kang
Abstract: In this paper, we propose Wasserstein Isometric Mapping (Wassmap), a parameter-free nonlinear dimensionality reduction technique that provides solutions to some drawbacks in existing global nonlinear dimensionality reduction algorithms in imaging applications. Wassmap represents images via probability measures in Wasserstein space, then uses pairwise quadratic Wasserstein distances between the ass…  More
Submitted 13 April, 2022; originally announced April 2022.
MSC Class: 68T10; 49Q22

Cited by 1 Related articles All 2 versions 
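A minimal sketch of the Wassmap recipe described in the abstract above: pairwise squared 2-Wasserstein distances between images treated as probability measures on the pixel grid, followed by classical MDS. It assumes the POT (Python Optimal Transport) library; the function name and details are my own illustration, not the authors' code, and it is only practical for small images.

    # Sketch of the Wassmap idea: images as measures on the pixel grid,
    # pairwise squared W2 distances via exact OT, then classical MDS.
    import numpy as np
    import ot  # Python Optimal Transport

    def wassmap_embedding(images, dim=2):
        n, h, w = images.shape
        xs, ys = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
        grid = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)
        C = ot.dist(grid, grid)                      # squared Euclidean ground cost
        masses = [img.ravel() / img.sum() for img in images]

        D2 = np.zeros((n, n))                        # squared W2 distances
        for i in range(n):
            for j in range(i + 1, n):
                D2[i, j] = D2[j, i] = ot.emd2(masses[i], masses[j], C)

        # Classical MDS on the squared-distance matrix.
        J = np.eye(n) - np.ones((n, n)) / n
        B = -0.5 * J @ D2 @ J
        vals, vecs = np.linalg.eigh(B)
        idx = np.argsort(vals)[::-1][:dim]
        return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))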


Cui, Jianbo; Dieci, Luca; Zhou, Haomin

Time discretizations of Wasserstein-Hamiltonian flows. (English) Zbl 07506843

Math. Comput. 91, No. 335, 1019-1075 (2022).

MSC:  65P10 35R02 58B20 65M12

PDF BibTeX XML Cite

Full Text: DOI 

 Zbl 07506843


2022

Wasserstein Distributionally Robust Optimization: Theory and ...

https://www.anl.gov › event › wasserstein-distributional...

Wasserstein distributionally robust optimization seeks data-driven decisions that perform well under the most adverse distribution within a certain ...
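For reference, the generic formulation behind these Wasserstein distributionally robust optimization entries is usually written as follows (standard textbook form, not taken from the talk itself):

\[
\min_{\theta}\ \sup_{Q:\ W_p(Q,\widehat{P}_n)\le \varepsilon}\ \mathbb{E}_{\xi\sim Q}\big[\ell(\theta,\xi)\big],
\]

where \(\widehat{P}_n\) is the empirical distribution of the data, \(W_p\) the order-\(p\) Wasserstein distance, and \(\varepsilon\) the radius of the ambiguity ball.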

 

Chen, Hong-Bin; Niles-Weed, Jonathan

Asymptotics of smoothed Wasserstein distances. (English) Zbl 07496366

Potential Anal. 56, No. 4, 571-595 (2022).

MSC:  58J65 60J60

PDF BibTeX XML Cite

Zbl 07496366
 Cited by 9 Related articles All 5 versions
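For context, the Gaussian-smoothed Wasserstein distance studied in this line of work is typically defined by convolving both measures with an isotropic Gaussian before comparing them (standard definition, stated here for reference):

\[
W_p^{(\sigma)}(P,Q) \;=\; W_p\big(P * \mathcal{N}(0,\sigma^2 I_d),\; Q * \mathcal{N}(0,\sigma^2 I_d)\big).
\]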


2022


 DAG-WGAN: Causal Structure Learning With Wasserstein Generative Adversarial Networks

by Petkov, Hristo; Hanley, Colin; Dong, Feng

04/2022

The combinatorial search space presents a significant challenge to learning causality from data. Recently, the problem has been formulated into a continuous...

Article PDF (via Unpaywall)PDF

Journal Article  Full Text Online

 Preview 

Open Access

    

2022 see 2021  Cover Image

Wasserstein Distributionally Robust Motion Control for Collision Avoidance Using Conditional Value-at-Risk

by Hakobyan, Astghik; Yang, Insoon

IEEE transactions on robotics, 04/2022, Volume 38, Issue 2

In this article, a risk-aware motion control scheme is considered for mobile robots to avoid randomly moving obstacles when the true probability distribution...

Article PDFPDF

Journal Article  Full Text Online

View in Context Browse Journal

 Preview 

Open Access

    

Cover Image

A Wasserstein GAN Autoencoder for SCMA Networks

by Miuccio, Luciano; Panno, Daniela; Riolo, Salvatore

IEEE wireless communications letters, 04/2022

In the view of the exponential growth of mobile applications, the Sparse Code Multiple Access (SCMA) is one promising code-domain Non-Orthogonal Multiple...

Article PDFPDF

Journal Article  Full Text Online

View in Context Browse Journal

Cited by 3 Related articles

    

Cover Image

Adversarial classification via distributional robustness with Wasserstein ambiguity

by Ho-Nguyen, Nam; Wright, Stephen J

Mathematical programming, 04/2022

Abstract We study a model for adversarial classification based on distributionally robust chance constraints. We show that under Wasserstein ambiguity, the...

Journal Article 

 Full Text Online

Cited by 7 Related articles All 5 versions

    

Cover Image

Convergence diagnostics for Monte Carlo fission source distributions using the Wasserstein distance...

by Guo, Xiaoyu; Li, Zeguang; Huang, Shanfang ; More...

Nuclear engineering and design, 04/2022, Volume 389

•A new diagnostic method for Monte Carlo (MC) source convergence is proposed based on the Wasserstein distance (WD) metric.•The usage of the WD-based indicator...

Article PDFPDF

Journal Article  Full Text Online

View in Context Browse Journal

 Preview 

<——2022———2022———450—


Cover Image

Two-Variable Wasserstein Means of Positive Definite Operators

by Hwang, Jinmi; Kim, Sejong

Mediterranean journal of mathematics, 04/2022, Volume 19, Issue 3

We investigate the two-variable Wasserstein mean of positive definite operators, as a unique positive solution of the nonlinear equation obtained from the...

Article PDFPDF

Journal Article 

 Full Text Online

More Options 

View in Context Browse Journal

 Preview 
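A sketch of the standard fixed-point iteration for the weighted Wasserstein (Bures) mean of positive definite matrices; with two matrices and weights (1−t, t) it specializes to the two-variable mean studied above. This is the well-known iteration from the Gaussian-barycenter literature, not necessarily the authors' algorithm, and it assumes SciPy.

    # Sketch: fixed-point iteration for the weighted Wasserstein (Bures) mean
    # of positive definite matrices. Illustrative only.
    import numpy as np
    from scipy.linalg import sqrtm, inv

    def wasserstein_mean(mats, weights, iters=50):
        X = sum(w * A for w, A in zip(weights, mats))   # arithmetic-mean start
        for _ in range(iters):
            R = sqrtm(X).real                            # X^{1/2}
            R_inv = inv(R)
            T = sum(w * sqrtm(R @ A @ R).real for w, A in zip(weights, mats))
            X = R_inv @ T @ T @ R_inv                    # X <- X^{-1/2} T^2 X^{-1/2}
        return X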

    

2022 see 2021  Cover Image

An Improved Mixture Density Network Via Wasserstein Distance Based Adversarial Learning for Probabilistic...

by Yang, Luoxiao; Zheng, Zhong; Zhang, Zijun

IEEE transactions on sustainable energy, 04/2022, Volume 13, Issue 2

This paper develops a novel improved mixture density network via Wasserstein distance-based adversarial learning (WA-IMDN) for achieving more accurate...

Article PDFPDF

Journal Article  Full Text Online

View in Context Browse Journal

 Preview 

    

2022 see 2021  Cover Image

Second-Order Conic Programming Approach for Wasserstein Distributionally Robust Two-Stage...

by Wang, Zhuolin; You, Keyou; Song, Shiji ; More...

IEEE transactions on automation science and engineering, 04/2022, Volume 19, Issue 2

This article proposes a second-order conic programming (SOCP) approach to solve distributionally robust two-stage linear programs over 1-Wasserstein balls. We...

Article PDFPDF

Journal Article  Full Text Online

View in Context Browse Journal

 Preview   Open Access

    

      

CNN-Based Continuous Authentication on Smartphones With Conditional Wasserstein...

by Li, Yantao; Luo, Jiaxing; Deng, Shaojiang ; More...

IEEE internet of things journal, 04/2022, Volume 9, Issue 7

With the widespread usage of mobile devices, the authentication mechanisms are urgently needed to identify users for information leakage prevention. In this...

Journal Article  Full Text Online

 Preview 

Cited by 2 Related articles

        

Wasserstein Hamiltonian flow with common noise on graph

by Cui, Jianbo; Liu, Shu; Zhou, Haomin

04/2022

We study the Wasserstein Hamiltonian flow with a common noise on the density manifold of a finite graph. Under the framework of stochastic variational...

Journal Article  Full Text Online

 Preview   Open Access

  Cited by 1 Related articles All 2 versions


2022


Arbor Scientific on Twitter: "There is so much #physics going ...

mobile.twitter.com › ArborSci › status

More Tweets. NASA and 3 others ... a probability measure over images to one dimension is better for the sliced Wasserstein than doing vectorization ...

Twitter · 3 weeks ago

Revisiting Sliced Wasserstein on Images: From Vectorization to Convolution

by Nguyen, Khai; Ho, Nhat

04/2022

The conventional sliced Wasserstein is defined between two probability measures that have realizations as vectors. When comparing two probability measures over...

Journal Article  Full Text Online

 Preview   Open Access
Cited by 4
Related articles All 3 versions
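For reference, the "conventional" sliced Wasserstein distance that the paper above revisits can be sketched as random one-dimensional projections combined with closed-form 1D optimal transport (sorting). The sketch below is illustrative, assumes equal sample sizes, and uses NumPy only.

    # Sketch: Monte Carlo estimate of the sliced Wasserstein distance between
    # two point clouds X, Y of equal size. Illustrative only.
    import numpy as np

    def sliced_wasserstein(X, Y, n_projections=100, p=2, seed=None):
        rng = np.random.default_rng(seed)
        d = X.shape[1]
        total = 0.0
        for _ in range(n_projections):
            theta = rng.normal(size=d)
            theta /= np.linalg.norm(theta)
            x_proj = np.sort(X @ theta)              # 1D OT = sort and match
            y_proj = np.sort(Y @ theta)
            total += np.mean(np.abs(x_proj - y_proj) ** p)
        return (total / n_projections) ** (1.0 / p)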

 

Viscosity solutions for obstacle problems on Wasserstein space

by Talbi, Mehdi; Touzi, Nizar; Zhang, Jianfeng

03/2022

This paper is a continuation of our accompanying paper [Talbi, Touzi and Zhang (2021)], where we characterized the mean field optimal stopping problem by an...

Journal Article  Full Text Online

 Preview  Open Access

Related articles All 3 versions
Zbl 07704040


Cover Image

An LP-based, strongly-polynomial 2-approximation algorithm for sparse Wasserstein barycenters

by Borgwardt, Steffen

Operational research, 08/2020, Volume 22, Issue 2

Discrete Wasserstein barycenters correspond to optimal solutions of transportation problems for a set of probability measures with finite support. Discrete...

Article PDFPDF

Journal Article 

 Full Text Online

More Options 

View in Context Browse Journal

 Preview   Open Access

    

 

Wasserstein convergence rate in the invariance principle for deterministic dynamical systems

by Liu, Zhenxin; Wang, Zhe

04/2022

In this paper, we consider the convergence rate with respect to Wasserstein distance in the invariance principle for deterministic nonuniformly hyperbolic...

Journal Article  Full Text Online

 Preview  Open Access

    All 2 versions


Wasserstein Two-Sided Chance Constraints with An Application to Optimal Power Flow

by Shen, Haoming; Jiang, Ruiwei

03/2022

As a natural approach to modeling system safety conditions, chance constraint (CC) seeks to satisfy a set of uncertain inequalities individually or jointly...

Journal Article  Full Text Online

 Preview 

Open Access

 Related articles All 2 versions 

<——2022———2022———460—


Wasserstein Distributionally Robust Control of Partially Observable Linear Systems: Tractable...

by Hakobyan, Astghik; Yang, Insoon

03/2022

Wasserstein distributionally robust control (WDRC) is an effective method for addressing inaccurate distribution information about disturbances in stochastic...

Journal Article  Full Text Online

 Preview   Open Access

All 2 versions 

    

2022 see 2021  Study Results from University of Toronto in the Area of Biomedical and Health Informatics Reported (Improving Non-invasive Aspiration Detection With Auxiliary Classifier Wasserstein...

Health & Medicine Week, 04/2022

Newsletter  Full Text Online

 Preview

 DAG-WGAN: Causal Structure Learning With Wasserstein Generative Adversarial...

by Petkov, Hristo; Hanley, Colin; Dong, Feng

04/2022

The combinatorial search space presents a significant challenge to learning causality from data. Recently, the problem has been formulated into a continuous...

Article PDFDownload Now (via Unpaywall)

Journal Article  Full Text Online

 Preview Open Access

  Related articles All 4 versions 


Cover Image

An Improved Mixture Density Network Via Wasserstein Distance Based Adversarial...

by Yang, Luoxiao; Zheng, Zhong; Zhang, Zijun

IEEE transactions on sustainable energy, 04/2022, Volume 13, Issue 2

This paper develops a novel improved mixture density network via Wasserstein distance-based adversarial learning (WA-IMDN) for achieving more accurate...

ArticleView Article PDF

Journal Article  Full Text Online

View Complete Issue Browse Now

   Preview 

Peer-Reviewed

    

Oversampling based on WGAN for Network Threat Detection

Y Xu, Z Qiu, J Zhang, X Zhang, J Qiu… - 2021 IEEE Intl Conf on …, 2021 - ieeexplore.ieee.org

Wasserstein GAN (WGAN) as a generative method can solve the imbalanced problem … 

Another advantage of WGAN is that it can deal with the discrete data. Therefore, we apply WGAN

 Related articles All 2 versions


2022 

  

2022 see 2021

MR4407747 Prelim Carrillo, José A.; Craig, Katy; Wang, Li; Wei, Chaozhen; 

Primal Dual Methods for Wasserstein Gradient Flows. Found. Comput. Math. 22 (2022), no. 2, 389–443. 35A15 (47J25 47J35 49M29 65K10 82B21)

Review PDF Clipboard Journal Article

MR4407747 

MR4406866 Prelim Hwang, Jinmi; Kim, Sejong; 

Two-Variable Wasserstein Means of Positive Definite Operators. Mediterr. J. Math. 19 (2022), no. 3, Paper No. 110.

Review PDF Clipboard Journal Article

 Related articles

MR4405488 Prelim Cui, Jianbo; Dieci, Luca; Zhou, Haomin; 

Time discretizations of Wasserstein-Hamiltonian flows. Math. Comp. 91 (2022), no. 335, 1019–1075. 65P10 (35R02 58)

Review PDF Clipboard Journal Article

ARTICLE

On Affine Policies for Wasserstein Distributionally Robust Unit Commitment

Cho, Youngchae ; Yang, Insoon. arXiv.org, 2022

OPEN ACCESS

On Affine Policies for Wasserstein Distributionally Robust Unit Commitment

Available Online 

[PDF] arxiv.org

On Affine Policies for Wasserstein Distributionally Robust Unit Commitment

Y Cho, I Yang - arXiv preprint arXiv:2203.15333, 2022 - arxiv.org

… uses Wasserstein ambiguity sets as they offer tractable formulations and out-of-sample

performance guarantees [9]–[11]. Furthermore, Wasserstein … ac-opf with wasserstein metric,” IEEE …

Cited by 1 Related articles All 3 versions

[HTML] rsc.org

[HTML] Pesticide detection combining the Wasserstein generative adversarial network and the residual neural network based on terahertz spectroscopy

R Yang, Y Li, B Qin, D Zhao, Y Gan, J Zheng - RSC Advances, 2022 - pubs.rsc.org

… Wasserstein generative adversarial network (WGAN) and the residual neural network (ResNet),

to detect carbendazim based on terahertz spectroscopy. The Wasserstein … Wasserstein …

Related articles  

<——2022———2022———470— 


[PDF] arxiv.org

Chance-constrained set covering with Wasserstein ambiguity

H Shen, R Jiang - Mathematical Programming, 2022 - Springer

… We remark that the Wasserstein distance is equivalent to … In this paper, we study joint DR-CCP

with LHS Wasserstein … structures of set covering and Wasserstein ambiguity to derive two …

 Cited by 8 Related articles All 4 versions

ARTICLE

Amortized Projection Optimization for Sliced Wasserstein Generative Models

Nguyen, Khai ; Ho, Nhat. arXiv.org, 2022

OPEN ACCESS

Amortized Projection Optimization for Sliced Wasserstein Generative Models

Available Online 

[PDF] arxiv.org

Amortized Projection Optimization for Sliced Wasserstein Generative Models

K Nguyen, N Ho - arXiv preprint arXiv:2203.13417, 2022 - arxiv.org

… of the Wasserstein distance, the sliced Wasserstein distance, and the max-sliced Wasserstein

… We then formulate generative models based on the max-sliced Wasserstein distances and …

Cited by 4 Related articles All 3 versions

 2022 research project

Hamilton-Jacobi Equations in the Wasserstein Space - The ...

https://www.uakron.edu › math › research › hamilton-j...

This project aims to study a class of dynamical systems on the Wasserstein space of probability measures corresponding to some fundamental systems of partial ...

arXiv:2204.09928  [pdf, ps, other]  math.DG  math.MG
Bures-Wasserstein minimizing geodesics between covariance matrices of different ranks
Authors: Yann Thanwerdas, Xavier Pennec
Abstract: The set of covariance matrices equipped with the Bures-Wasserstein distance is the orbit space of the smooth, proper and isometric action of the orthogonal group on the Euclidean space of square matrices. This construction induces a natural orbit stratification on covariance matrices, which is exactly the stratification by the rank. Thus, the strata are the manifolds of symmetric positive semi-def…  More
Submitted 21 April, 2022; originally announced April 2022.
Fault Feature Recovery with Wasserstein Generative Adversarial Imputation Network With Gradient Penalty...

by Hu, Wenyang; Wang, Tianyang; Chu, Fulei

IEEE transactions on instrumentation and measurement, 04/2022

Rotating machine health monitoring systems sometimes suffer from large segments of continuous missing data in practical applications, which may lead to...

Article PDFPDF

 

ARTICLE

Bures-Wasserstein minimizing geodesics between covariance matrices of different ranks

Thanwerdas, Yann ; Pennec, Xavier. arXiv.org, 2022

OPEN ACCESS

Bures-Wasserstein minimizing geodesics between covariance matrices of different ranks

Available Online    

Bures-Wasserstein minimizing geodesics between covariance matrices of different ranks

by Thanwerdas, Yann; Pennec, Xavier

04/2022

The set of covariance matrices equipped with the Bures-Wasserstein distance is the orbit space of the smooth, proper and isometric action of the orthogonal...

Journal Article  Full Text Online
 Related articles All 21 versions 
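A sketch of the Bures-Wasserstein geodesic between two full-rank covariance matrices, via the closed-form optimal transport map between centred Gaussians; the rank-deficient case analysed in the paper requires extra care and is not handled here. Assumes SciPy; names are illustrative.

    # Sketch: Bures-Wasserstein geodesic between full-rank covariance matrices.
    import numpy as np
    from scipy.linalg import sqrtm, inv

    def bures_wasserstein_geodesic(A, B, t):
        A_half = sqrtm(A).real
        A_half_inv = inv(A_half)
        # Optimal transport map from N(0, A) to N(0, B).
        T = A_half_inv @ sqrtm(A_half @ B @ A_half).real @ A_half_inv
        M = (1.0 - t) * np.eye(A.shape[0]) + t * T
        return M @ A @ M.T                      # M is symmetric, so M A M^T = M A M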


arXiv:2204.07405  [pdf, other]  quant-ph  math-ph
Monotonicity of the quantum 2-Wasserstein distance
Authors: Rafał Bistroń, Michał Eckstein, Karol Życzkowski
Abstract: We study a quantum analogue of the 2-Wasserstein distance as a measure of proximity on the set Ω_N of density matrices of dimension N. We show that such (semi-)distances do not induce Riemannian metrics on the tangent bundle of Ω_N and are typically not unitary invariant. Nevertheless, we prove that for N=2 dimensional Hilbert space the quantum 2-Wasserstein distance (unique up to rescalin…  More
Submitted 15 April, 2022; originally announced April 2022.
Comments: 21 pages, 5 figures

All 2 versions 
[CITATION] Monotonicity of the quantum 2-Wasserstein distance

R Bistroń, M Eckstein, K Życzkowski - arXiv preprint arXiv:2204.07405, 2022


2022


[PDF] infocomm-journal.com

基于 VAE-WGAN 的多维时间序列异常检测方法

段雪源, 付钰, 王坤 - 通信学报 (Journal on Communications), 2022

[Chinese: A multi-dimensional time series anomaly detection method based on VAE-WGAN]



[PDF] thecvf.com

[PDF] Supplement Material for One Loss for Quantization: Deep Hashing with Discrete Wasserstein Distributional Matching

KD Doan, P Yang, P Li - openaccess.thecvf.com

This document provides additional details and experimental results to support the main submission. We begin by providing a more detailed discussion on the existing quantization …

 Related articles 


[PDF] arxiv.org

Invariance encoding in sliced-Wasserstein space for image classification with limited training data

Y Zhuang, S Li, AHM Rubaiyat, X Yin… - arXiv preprint arXiv …, 2022 - arxiv.org

… The R-CDT NS method classifies a test image by finding the nearest set to the test sample

in the sliced Wasserstein distance sense. Each set in this case, corresponds to a particular …

  Related articles All 2 versions 

2022 see 2021  [PDF] arxiv.org

A Unified Wasserstein Distributional Robustness Framework for Adversarial Training

TA Bui, T Le, Q Tran, H Zhao, D Phung - arXiv preprint arXiv:2202.13437, 2022 - arxiv.org

… Arguably, there are unexplored benefits in considering such adversarial … Wasserstein 

distributional robustness with current state-of-the-art AT methods. We introduce a new Wasserstein

 Cited by 3 Related articles All 3 versions


[PDF] arxiv.org

Approximating 1-Wasserstein Distance with Trees

M Yamada, Y Takezawa, R Sato, H Bao… - arXiv preprint arXiv …, 2022 - arxiv.org

… the 1-Wasserstein distance by the tree-Wasserstein distance (TWD), where TWD is a 1-Wasserstein …

To this end, we first show that the 1-Wasserstein approximation problem can be …

Cited by 1 Related articles All 5 versions
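A minimal sketch of the tree-Wasserstein distance (TWD) that approximations like the one above build on: for measures supported on the nodes of a weighted rooted tree, TWD is the weighted sum over edges of the absolute difference of subtree masses. Data structures and names below are illustrative.

    # Sketch: tree-Wasserstein distance on a rooted tree.
    # children: node -> list of children; edge_weight: node -> weight of the
    # edge to its parent; mu, nu: node -> mass (each summing to 1).
    def tree_wasserstein(children, edge_weight, mu, nu, root):
        def subtree_mass(node, measure):
            return measure.get(node, 0.0) + sum(subtree_mass(c, measure)
                                                for c in children.get(node, []))
        total = 0.0
        for node, w in edge_weight.items():
            if node == root:
                continue
            total += w * abs(subtree_mass(node, mu) - subtree_mass(node, nu))
        return total

    # Tiny example on a path a - b - c rooted at a: W1(delta_a, delta_c) = 2.
    children = {"a": ["b"], "b": ["c"]}
    edge_weight = {"b": 1.0, "c": 1.0}
    print(tree_wasserstein(children, edge_weight, {"a": 1.0}, {"c": 1.0}, root="a"))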

<——2022———2022———480— 

 

T-copula and Wasserstein distance-based stochastic neighbor embedding

Y Huang, K Guo, X Yi, J Yu, Z Shen, T Li - Knowledge-Based Systems, 2022 - Elsevier

Wasserstein distance and t-copula function into the stochastic neighbor embedding model. 

We first employ the Gaussian distribution equipped with the Wasserstein … use the Wasserstein

Cited by 2 Related articles All 3 versions

2022  [HTML] hindawi.com

[HTML] … Item X-Ray Pseudocolor Images in X-Ray Security Inspection Based on Wasserstein Generative Adversarial Network and Spatial-and-Channel Attention …

D Liu, J Liu, P Yuan, F Yu - Computational Intelligence and …, 2022 - hindawi.com

… Secondly, in the framework, we design a spatial-and-channel attention block and a new

base block to compose our X-ray Wasserstein generative adversarial network model with …

  All 10 versions



Intrusion Detection Method Based on Wasserstein Generative Adversarial Network
Authors: Wenbo Guan, Qing Zou. 2022 2nd International Conference on Frontiers of Electronics, Information and Computation Technologies (ICFEICT)
Summary: An intrusion detection method based on a Wasserstein generative adversarial network and a deep neural network is proposed to address the low accuracy and high false positive rate caused by data imbalance in traditional machine-learning-based intrusion detection systems. By learning the distribution of the original data set and generating new samples, the method reduces the class imbalance of the data set, so that the intrusion detection model can be trained more effectively. First, the training set is balanced by a Wasserstein GAN combined with Dropout regularization to generate new data. Then the balanced data set is used to train a DNN, and the trained DNN model is used for intrusion detection. Experimental results on the NSL-KDD data set show that this method is more effective than the traditional method.
Chapter, 2022
Publication:2022 2nd International Conference on Frontiers of Electronics, Information and Computation Technologies (ICFEICT), 202208, 599
Publisher:2022

 

2022  see 2021  [PDF] openreview.net

[PDF] Combining Wasserstein GAN and Spatial Transformer Network for Medical Image Segmentation

Z Zhang, J Wang, Y Wang, S Li - 2022 - openreview.net

… and Spatial Transformer Network (STN) (Jaderberg et al., 2015) for image segmentation. We

try to make the Wasserstein … to the Wasserstein distance. Keywords: Image segmentation, …

 Related articles 

[CITATION] Combining Wasserstein GAN and Spatial Transformer Network for Medical Image Segmentation

Z Zhang, J Wang, Y Wang, S Li - 2022

[PDF] arxiv.org

Quantitative control of Wasserstein distance between Brownian motion and the Goldstein-Kac telegraph process

G Barrera, J Lukkarinen - arXiv preprint arXiv:2201.00422, 2022 - arxiv.org

… coupling, synchronous coupling … Wasserstein metric on an infinite dimensional space is 

very difficult. For basic definitions, properties and notions related to couplings and Wasserstein

Related articles All 4 versions

2022


2022 see 2021 Cover Image

Decision Making Under Model Uncertainty: Fréchet–Wasserstein Mean Preferences

by Petracou, Electra V.; Xepapadeas, Anastasios; Yannacopoulos, Athanasios N.

Management science, 02/2022, Volume 68, Issue 2

This paper contributes to the literature on decision making under multiple probability models by studying a class of variational preferences. These preferences...

Article PDFPDF

Journal Article  Full Text Online

View in Context Browse Journal

[PDF] arxiv.org

Decision Making Under Model Uncertainty: Fréchet–Wasserstein Mean Preferences

EV Petracou, A Xepapadeas… - Management …, 2022 - pubsonline.informs.org

… utility functionals, which are based on the Wasserstein metric in the space of probability models. 

… We derive explicit expressions for the Fréchet–Wasserstein mean utility functionals and …

Cited by 6 Related articles All 2 version

 

[PDF] kaust.edu.sa

Digital Rock Reconstruction Using Wasserstein GANs with Gradient Penalty

Y Li, X He, W Zhu, M AlSinan, H Kwak… - International Petroleum …, 2022 - onepetro.org

Accelerating flash calculations in unconventional reservoirs considering capillary pressure 

Cited by 7 Related articles All 4 versions

Multiview Wasserstein Generative Adversarial Network for imbalanced pearl classification

S Gao, Y Dai, Y Li, K Liu, K Chen… - … Science and Technology, 2022 - iopscience.iop.org

… [15] introduced a Wasserstein gradient-penalty GAN with … [20] proposed the conditional 

mixture Wasserstein GAN (WGAN… insufficiencies, a new multiview Wasserstein GAN (MVWGAN) …

 

  

[PDF] researchgate.net

Graph Wasserstein Autoencoder-Based Asymptotically Optimal Motion Planning With Kinematic Constraints for Robotic Manipulation

C Xia, Y Zhang, SA Coleman, CY Weng… - IEEE Transactions …, 2022 - ieeexplore.ieee.org

… Inspired by the graph neural network [20], [21], we propose a novel graph wasserstein 

as the input of the developed wasserstein autoencoder. The configuration samples generated …

All 2 versions


Fault Feature Extraction Method of a Permanent Magnet Synchronous Motor Based on VAE-WGAN

L Zhan, X Xu, X Qiao, F Qian, Q Luo - Processes, 2022 - mdpi.com

… Therefore, this paper proposes an improved GAN using Wasserstein distance instead of

JS divergence which fundamentally improves the stability of model training and avoids model …

 Related articles All 2 versions 

<——2022———2022———490— 


[PDF] inria.fr

A convolutional Wasserstein distance for tractography evaluation: complementary study to state-of-the-art measures

T Durantel, J Coloigner, O Commowick - ISBI 2022-IEEE International …, 2022 - hal.inria.fr

… on the computation of the Wasserstein distance, derived from op… The 2-Wasserstein distance,

simply called Wasserstein dis… in development, our new Wasserstein measure can be used …

All 11 versions


A CWGAN-GP-based multi-task learning model for consumer ...https://www.sciencedirect.com › science › article › pii

by Y Kang · 2022 · Cited by 1 — We employ the CWGAN-GP model to learn about the distribution of borrower population and

adjust the data distribution between good and bad borrowers through ...


Conditional Wasserstein Generative Adversarial Nets for Fault ...

https://www.researchgate.net › publication › 338722375_...

Jul 5, 2022 — CWGAN: Conditional Wasserstein Generative Adversarial Nets for Fault Data Generation · 20+ million members · 135+ million publications · 700k+ ...


2022 see 2021  Working Paper  Full Text

Wasserstein perturbations of Markovian transition semigroups

Fuhrmann, Sven; Kupper, Michael; Nendel, Max.

arXiv.org; Ithaca, Apr 8, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window

2022 see 2021  Working Paper Full Text

Rethinking Rotated Object Detection with Gaussian Wasserstein Distance Loss

Yang, Xue; Junchi Yan; Qi Ming; Wang, Wentao; Zhang, Xiaopeng; et al.

arXiv.org; Ithaca, Apr 18, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window

ARTICLE

Rethinking Rotated Object Detection with Gaussian Wasserstein Distance Loss

Yang, Xue ; Junchi Yan ; Qi Ming ; Wang, Wentao ; Zhang, Xiaopeng ; Tian, Qi. arXiv.org, 2022

OPEN ACCESS

Rethinking Rotated Object Detection with Gaussian Wasserstein Distance Loss

Available Online 
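A sketch of the idea behind the Gaussian Wasserstein distance loss for rotated boxes: each box (cx, cy, w, h, angle) is modelled as a 2D Gaussian, and the closed-form 2-Wasserstein distance between Gaussians is evaluated; the paper further transforms this distance into a detection loss, which is omitted here. Assumes NumPy/SciPy; names are illustrative.

    # Sketch: rotated box -> 2D Gaussian, then closed-form W2 between Gaussians.
    import numpy as np
    from scipy.linalg import sqrtm

    def box_to_gaussian(cx, cy, w, h, angle):
        R = np.array([[np.cos(angle), -np.sin(angle)],
                      [np.sin(angle),  np.cos(angle)]])
        S = np.diag([w / 2.0, h / 2.0])
        return np.array([cx, cy]), R @ S @ S @ R.T       # mean, covariance

    def gaussian_w2_squared(m1, S1, m2, S2):
        cross = sqrtm(sqrtm(S1) @ S2 @ sqrtm(S1)).real
        return float(np.sum((m1 - m2) ** 2) + np.trace(S1 + S2 - 2.0 * cross))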

2022


2022 see 2021  Working Paper  Full Text

On a linear Gromov-Wasserstein distance

Beier, Florian; Beinert, Robert; Steidl, Gabriele.

arXiv.org; Ithaca, Mar 31, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window

2022 see 2021  Scholarly Journal  Citation/Abstract

Physics-driven learning of Wasserstein GAN for density reconstruction in dynamic tomography

Huang, Zhishen; Klasky, Marc; Wilcox, Trevor; Ravishankar, Saiprasad.

Applied Optics; Washington Vol. 61, Iss. 10,  (Apr 1, 2022): 2805.

Abstract/Details 
Related articles
 All 8 versions 

62 References  Related records


Working Paper  Full Text

Convergence in Wasserstein Distance for Empirical Measures of Dirichlet Diffusion Processes on Manifolds

Feng-Yu, Wang.

arXiv.org; Ithaca, Apr 8, 2022.


Working Paper  Full Text

Stein factors for variance-gamma approximation in the Wasserstein and Kolmogorov distances

Gaunt, Robert E.

arXiv.org; Ithaca, Apr 19, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window
MR4413319
 

Free Submitted Article From RepositoryFull Text at Publisher

62 References  Related records


2022 see 2021Scholarly Journal  Citation/Abstract

WDIBS: Wasserstein deterministic information bottleneck for state abstraction to balance state-compression and performance

Zhu Xianchao; Huang, Tianyi; Zhang Ruiyuan; Zhu, William.

Applied Intelligence; Boston Vol. 52, Iss. 6,  (Apr 2022): 6316-6329.

Abstract/Details 

Cited by 1 Related articles All 2 versions

<——2022———2022———500—


 

2022 see 2021  Scholarly Journal  Citation/Abstract

Wasserstein Distributionally Robust Motion Control for Collision Avoidance Using Conditional Value-at-Risk

Hakobyan, Astghik; Yang, Insoon.

IEEE Transactions on Robotics; New York Vol. 38, Iss. 2,  (2022): 939-957.

Abstract/Details Get full textLink to external site, this link will open in a new windo

2022 see 2021  Working Paper  Full Text

Two-sample goodness-of-fit tests on the flat torus based on Wasserstein distance and their relevance to structural biology

González-Delgado, Javier; González-Sanz, Alberto; Cortés, Juan; Neuvial, Pierre.

arXiv.org; Ithaca, Apr 13, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window


[HTML] springer.com

[HTML] Adversarial classification via distributional robustness with wasserstein ambiguity

N Ho-NguyenSJ Wright - Mathematical Programming, 2022 - Springer

… We show that under Wasserstein ambiguity, the model aims to minimize the conditional

value-at-risk of the distance to misclassification, and we explore links to adversarial classification …

Cited by 4 Related articles All 3 versions

ARTICLE

Quantum Wasserstein isometries on the qubit state space

György Pál Gehér ; Pitrik, József ; Titkos, Tamás ; Virosztek, Dániel. arXiv.org, 2022

OPEN ACCESS

Quantum Wasserstein isometries on the qubit state space

Available Online 

arXiv:2204.14134  [pdf, other]  math-ph  math.FA  math.MG  quant-ph
Quantum Wasserstein isometries on the qubit state space
Authors: György Pál Gehér, József Pitrik, Tamás Titkos, Dániel Virosztek
Abstract: We describe Wasserstein isometries of the quantum bit state space with respect to distinguished cost operators. We derive a Wigner-type result for the cost operator involving all the Pauli matrices: in this case, the isometry group consists of unitary or anti-unitary conjugations. In the Bloch sphere model, this means that the isometry group coincides with the classical symmetry group…  More
Submitted 29 April, 2022; originally announced April 2022.
Comments: 17 pages
MSC Class: Primary: 49Q22; 81Q99. Secondary: 54E40
Cited by 1
 Related articles All 3 versions 


2022

arXiv:2204.13559  [pdf, ps, other]  math.PR
Wasserstein Convergence for Conditional Empirical Measures of Subordinated Dirichlet Diffusions on Riemannian Manifolds
Authors: Huaiqian Li, Bingyao Wu
Abstract: The asymptotic behaviour of empirical measures has plenty of studies. However, the research on conditional empirical measures is limited. Being the development of Wang \cite{eW1}, under the quadratic Wasserstein distance, we investigate the rate of convergence of conditional empirical measures associated to subordinated Dirichlet diffusion processes on a connected compact Riemannian manifold with…  More
Submitted 28 April, 2022; originally announced April 2022.
Comments: Comments welcome!
Cited by 1
 Related articles All 2 versions 


ARTICLE

Minimax Robust Quickest Change Detection using Wasserstein Ambiguity Sets

Liyan Xie. arXiv.org, 2022

OPEN ACCESS

Minimax Robust Quickest Change Detection using Wasserstein Ambiguity Sets

Available Online 

arXiv:2204.13034  [pdf, other]  math.ST  stat.ME
Minimax Robust Quickest Change Detection using Wasserstein Ambiguity Sets
Authors: Liyan Xie
Abstract: We study the robust quickest change detection under unknown pre- and post-change distributions. To deal with uncertainties in the data-generating distributions, we formulate two data-driven ambiguity sets based on the Wasserstein distance, without any parametric assumptions. The minimax robust test is constructed as the CUSUM test under least favorable distributions, a representative pair of distr…  More
Submitted 27 April, 2022; originally announced April 2022.
Comments: The 2022 IEEE International Symposium on Information Theory (ISIT)
Cited by 1
 Related articles All 4 versions
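The minimax robust test referred to above reduces to the standard CUSUM recursion run under the least favorable pre-/post-change densities. A generic sketch follows; f0 and f1 are placeholders for those densities, and the threshold is left to the user.

    # Generic CUSUM recursion (illustration only): with the least favorable
    # densities f0, f1 plugged in, this is the robust test described above.
    import numpy as np

    def cusum_alarm_time(samples, f0, f1, threshold):
        stat = 0.0
        for t, x in enumerate(samples, start=1):
            stat = max(0.0, stat + np.log(f1(x) / f0(x)))   # log-likelihood ratio
            if stat > threshold:
                return t                                     # first alarm time
        return None                                          # no alarm raised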

arXiv:2204.12527  [pdf, other]  cs.IR  cs.LG
Application of WGAN-GP in recommendation and Questioning the relevance of GAN-based approaches
Authors: Hichem Ammar Khodja, Oussama Boudjeniba
Abstract: Many neural-based recommender systems were proposed in recent years and part of them used Generative Adversarial Networks (GAN) to model user-item interactions. However, the exploration of Wasserstein GAN with Gradient Penalty (WGAN-GP) on recommendation has received relatively less scrutiny. In this paper, we focus on two questions: 1- Can we successfully apply WGAN-GP on recommendation and does…  More
Submitted 28 April, 2022; v1 submitted 26 April, 2022; originally announced April 2022.
Comments: 8 pages, 2 figures, Accepted at ICMLT 2022 (but not published)
Related articles
 


 Improving SSH detection model using IPA time and WGAN-GP

by Lee, Junwon; Lee, Heejo

Computers & security, 05/2022, Volume 116

In the machine learning-based detection model, the detection accuracy tends to be proportional to the quantity and quality of the training dataset. The machine...

Article PDFPD

Journal Article  Full Text Online

View in Context Browse Journal

Cited by 2 Related articles All 3 versions    

 

Two-Variable Wasserstein Means of Positive ... - Springer LINKhttps://link.springer.com › content › pdf

by J Hwang · 2022 — Abstract. We investigate the two-variable Wasserstein mean of positive definite operators, as

a unique positive solution of the nonlinear equation.

[CITATION]  Two-variable Wasserstein mean of positive operators

S Kim - 2023 Joint Mathematics Meetings (JMM 2023), 2023 - meetings.ams.org


Small sample reliability assessment with online time-series data based on a worm WGAN learning method

by Sun, Bo; Wu, Zeyu; Feng, Qiang ; More...

IEEE transactions on industrial informatics, 04/2022

The scarcity of time-series data constrains the accuracy of online reliability assessment.

[PDF] archives-ouvertes.fr

Small sample reliability assessment with online time-series data based on a worm WGAN learning method

B Sun, Z Wu, Q Feng, Z Wang, Y Ren… - IEEE Transactions …, 2022 - ieeexplore.ieee.org

The scarcity of time-series data constrains the accuracy of online reliability assessment. Data

expansion is the most intuitive way to address this problem. However, conventional, small-…

Cited by 1 Related articles

<——2022———2022———510—

Time discretizations of Wasserstein--Hamiltonian flows

by Jianbo Cui; Luca Dieci; Haomin Zhou

Mathematics of computation, 05/2022, Volume 91, Issue 335

We study discretizations of Hamiltonian systems on the probability density manifold equipped with the L^2-Wasserstein metric. Based on discrete optimal...

Article PDFPDF

     

2022 see 2021  ARTICLE

Physics-driven learning of Wasserstein GAN for density reconstruction in dynamic tomography

Huang, Zhishen ; Klasky, Marc ; Wilcox, Trevor ; Ravishankar, Saiprasad. Applied optics (2004), 2022, Vol.61 (10), p.2805-2817

PEER REVIEWED

OPEN ACCESS

Physics-driven learning of Wasserstein GAN for density reconstruction in dynamic tomography

Available Online 


2022 see 2021  ARTICLE

Full Attention Wasserstein GAN With Gradient Normalization for Fault Diagnosis Under Imbalanced Data

Fan, Jigang ; Yuan, Xianfeng ; Miao, Zhaoming ; Sun, Zihao ; Mei, Xiaoxue ; Zhou, Fengyu. IEEE transactions on instrumentation and measurement, 2022, Vol.71, p.1-16

Cited by 5 Related articles All 3 versions

PEER REVIEWED

 Download PDF 

Full Attention Wasserstein GAN With Gradient Normalization for Fault Diagnosis Under Imbalanced Data

Available Online 

 View Issue Contents 


ARTICLE

A new approach to posterior contraction rates via Wasserstein dynamics

Dolera, Emanuele ; Favaro, Stefano ; Mainini, Edoardo. arXiv.org, 2022

OPEN ACCESS

A new approach to posterior contraction rates via Wasserstein dynamics

Available Online 


2022


 

VGAN: Generalizing MSE GAN and WGAN-GP for robot fault diagnosis

by Pu, Ziqiang; Cabrera, Diego; Li, Chuan ; More...

IEEE intelligent systems, 04/2022

Generative adversarial networks (GANs) have shown their potential for data generation. However, this type of generative model often suffers from oscillating...

Article PDFPDF

All 2 versions

A WGAN-Based Method for Generating Malicious Domain Training Data

K Zhang, B Huang, Y Wu, C Chai, J Zhang… - … Conference on Artificial …, 2022 - Springer

… The generated confrontation network model is WGAN (Wasserstein GAN). WGAN mainly 

improves GAN from the perspective of loss function. After the loss function is improved, WGAN

 All 2 versions

    

Time discretizations of Wasserstein--Hamiltonian flows

by Jianbo Cui; Luca Dieci; Haomin Zhou

Mathematics of computation, 05/2022, Volume 91, Issue 335

We study discretizations of Hamiltonian systems on the probability density manifold equipped with the L^2-Wasserstein metric. Based on discrete optimal...

ArticleView Article PDF


Martingale Wasserstein inequality for probability measures in the convex order

by Jourdain, Benjamin; Margheriti, William

Bernoulli : official journal of the Bernoulli Society for Mathematical Statistics and Probability, 2022

It is known since [24] that two one-dimensional probability measures in the convex order admit a martingale coupling with respect to which the integral of...

Journal Article 

 Full Text Online

Fault Feature Recovery with Wasserstein Generative Adversarial Imputation Network...

by Hu, Wenyang; Wang, Tianyang; Chu, Fulei

IEEE transactions on instrumentation and measurement, 04/2022

Rotating machine health monitoring systems sometimes suffer from large segments of continuous missing data in practical applications, which may lead to...

ArticleView Article PDF

Journal Article 

Cited by 1 Related articles All 11 versions

<——2022————2022———520—  



 

 

Wgan code framework

P König - 2022

Cited by 1 Related articles

[CITATION] Wgan code framework

P König - 2022


2 022 patent news

Wire Feed  Full Text

Univ Hunan Files Chinese Patent Application for Unsupervised Multi-View Three-Dimensional Point Cloud Joint Registration Method Based on WGAN

Global IP News. Software Patent News; New Delhi [New Delhi]. 28 Apr 2022. 

DetailsFull text
NEWSPAPER ARTICLE

Univ Hunan Files Chinese Patent Application for Unsupervised Multi-View Three-Dimensional Point Cloud Joint Registration Method Based on WGAN

Global IP News. Software Patent News, 2022

Univ Hunan Files Chinese Patent Application for Unsupervised Multi-View Three-Dimensional Point Cloud Joint Registration Method Based on WGAN

No Online Access 


2022 see 2021  ARTICLE

Local Stability of Wasserstein GANs With Abstract Gradient Penalty

Kim, Cheolhyeong ; Park, Seungtae ; Hwang, Hyung Ju. IEEE transactions on neural networks and learning systems, 2022, Vol.33 (9), p.4527-4537

 Download PDF 

Local Stability of Wasserstein GANs With Abstract Gradient Penalty

Available Online 

 View Issue Contents 

Wassertrain: An Adversarial Training Framework Against Wasserstein Adversarial Attacks

Q Zhao, X Chen, Z Zhao, E Tang… - ICASSP 2022-2022 IEEE …, 2022 - ieeexplore.ieee.org

… Our WasserAttack directly finds the worst point within the Wasserstein ball, in which the

Lagrangian relaxation and the change of variables technique are introduced to handle the 


2022


Topological Continual Learning with Wasserstein Distance and Barycenter
by Songdechakraiwut, Tananun; Yin, Xiaoshuang; Van Veen, Barry D
10/2022
Continual learning in neural networks suffers from a phenomenon called catastrophic forgetting, in which a network quickly forgets what was learned in a...
Journal Article  Full Text Online

arXiv:2210.02661  [pdf, other]  cs.LG
Topological Continual Learning with Wasserstein Distance and Barycenter
Authors: Tananun Songdechakraiwut, Xiaoshuang Yin, Barry D. Van Veen
Abstract: Continual learning in neural networks suffers from a phenomenon called catastrophic forgetting, in which a network quickly forgets what was learned in a previous task. The human brain, however, is able to continually learn new tasks and accumulate knowledge throughout life. Neuroscience findings suggest that continual learning success in the human brain is potentially associated with its modular s…  More


ARTICLE

A Simple Duality Proof for Wasserstein Distributionally Robust Optimization

Zhang, Luhao ; Yang, Jincheng ; Gao, Rui. arXiv.org, 2022

OPEN ACCESS

A Simple Duality Proof for Wasserstein Distributionally Robust Optimization

Available Online 

[PDF] arxiv.org

A Simple Duality Proof for Wasserstein Distributionally Robust Optimization

L Zhang, J Yang, R Gao - arXiv preprint arXiv:2205.00362, 2022 - arxiv.org

… a new duality proof for Wasserstein distributionally robust optimization, which is based on

applying Legendre transform twice to the worst-case loss as a function of Wasserstein radius. …

Related articles All 2 versions 
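For reference, the strong-duality identity that such proofs establish is usually stated in the following generic form (standard in the Wasserstein DRO literature; notation mine, not the paper's):

\[
\sup_{Q:\ W_c(Q,P)\le \varepsilon} \mathbb{E}_{\xi\sim Q}\big[f(\xi)\big]
\;=\;
\inf_{\lambda\ge 0}\Big\{ \lambda\varepsilon + \mathbb{E}_{\xi\sim P}\Big[\sup_{\zeta}\big(f(\zeta)-\lambda\, c(\zeta,\xi)\big)\Big]\Big\},
\]

where \(c\) is the transport cost defining the optimal transport distance \(W_c\).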

Multisource Wasserstein Adaptation Coding Network for EEG emotion recognition

L Zhu, W Ding, J Zhu, P Xu, Y Liu, M Yan… - … Signal Processing and …, 2022 - Elsevier

… It also uses Wasserstein distance and Association Reinforcement to adapt marginal … In

order to reduce Wasserstein distance, we can maximize the Domain discriminator loss. …

Related articles

Full Text at Publisher

45 References Related records

[PDF] arxiv.org

Partial Wasserstein Adversarial Network for Non-rigid Point Set Registration

ZM Wang, N Xue, L Lei, GS Xia - arXiv preprint arXiv:2203.02227, 2022 - arxiv.org

… a scalable PDM algorithm by utilizing the efficient partial Wasserstein-1 (PW) discrepancy. … 

Based on these results, we propose a partial Wasserstein adversarial network (PWAN), …

All 3 versions 

 

Wasserstein Cross-Lingual Alignment For Named Entity Recognition

R Wang, R Henao - ICASSP 2022-2022 IEEE International …, 2022 - ieeexplore.ieee.org

… Specifically, we propose to align by minimizing the Wasserstein distance between the 

contextualized token embeddings from source and target languages. Experimental results show …

<——2022—–—2022———530— 


[PDF] arxiv.org

Chance-constrained set covering with Wasserstein ambiguity

H Shen, R Jiang - Mathematical Programming, 2022 - Springer

… We remark that the Wasserstein distance is equivalent to … In this paper, we study joint DR-CCP 

with LHS Wasserstein … structures of set covering and Wasserstein ambiguity to derive two …

 Cited by 8 Related articles All 4 versions

2022 see 2021

Learning to Generate Wasserstein Barycenters

by Lacombe, Julien; Digne, Julie; Courty, Nicolas ; More...

Journal of mathematical imaging and vision, 2022

Optimal transport is a notoriously difficult problem to solve numerically, with current approaches often remaining intractable for very large scale...

Journal Article  Full Text Online

Related articles All 6 versions

[PDF] gatech.edu

[PDF] Distributionally Robust Disaster Relief Planning under the Wasserstein Set

M El Tonbari, G Nemhauser, A Toriello - sites.gatech.edu

… · To the best of our knowledge, this is the first work to consider a Wasserstein ball in a two-stage 

DRO formulation with binary variables in the second stage for disaster relief operations, …

Related articles All 2 versions 

[PDF] arxiv.org

Tackling Algorithmic Bias in Neural-Network Classifiers using Wasserstein-2 Regularization

L Risser, AG Sanz, Q Vincenot, JM Loubes - Journal of Mathematical …, 2022 - Springer

… It indeed only overloads the loss function with a Wasserstein-2-based regularization term for 

… This model is algorithmically reasonable and makes it possible to use our regularized loss

Cited by 1 Related articles All 6 versions

Cited by 3 Related articles All 7 versions

 Zbl 07630565

 2022 see 2021

UP-WGAN: Upscaling Ambisonic Sound Scenes Using Wasserstein Generative Adversarial Networks

Y Wang, X Wu, T Qu - Audio Engineering Society Convention 151, 2022 - aes.org

… In this work, a deep-learning-based method for upscaling is proposed. Specifically, the …

improves the upscaling results compared with the previous deep-learning-based method. …

Related articles All 2 versions

 Zbl 07630565

 

2022

 

Wasserstein Cross-Lingual Alignment For Named Entity Recognition

R Wang, R Henao - ICASSP 2022-2022 IEEE International …, 2022 - ieeexplore.ieee.org

… For the unlabeled paralleled corpus 1Xs,Xtl, we ignore the NER labels in training dataset of

the … We use a training batch size of 32. Our model is trained with the learning rate of 5e-5 for …

 

EVGAN: Optimization of Generative Adversarial Networks Using Wasserstein Distance and Neuroevolution

VK Nair, C Shunmuga Velayutham - Evolutionary Computing and Mobile …, 2022 - Springer

… of the training problems of GANs came to light with the addition of a new loss function called

the Wasserstein … The corresponding model was good at getting a stable training phase and …

 All 3 versions


Fault Feature Recovery with Wasserstein Generative Adversarial Imputation Network With Gradient Penalty for Rotating Machine Health Monitoring under Signal Loss …

W Hu, T Wang, F Chu - IEEE Transactions on Instrumentation …, 2022 - ieeexplore.ieee.org

… that optimizes training by using the Wasserstein distance … two distributions, the Wasserstein

distance can still describe … The first Wasserstein distance between two distributions p1 …

Cited by 7 Related articles All 2 versions

Wassertrain: An Adversarial Training Framework Against Wasserstein Adversarial Attacks

Q Zhao, X Chen, Z Zhao, E Tang… - ICASSP 2022-2022 IEEE …, 2022 - ieeexplore.ieee.org

… For adversarial attacks, the PGD-based attack method develops an approximate projection 

operator onto the Wasserstein ball called projected Sinkhorn to find adversarial examples. …

Cited by 1 Related articles

ARTICLE

DVGAN: Stabilize Wasserstein GAN training for time-domain Gravitational Wave physics

Dooney, Tom ; Bromuri, Stefano ; Curier, Lyana. arXiv.org, 2022

OPEN ACCESS

DVGAN: Stabilize Wasserstein GAN training for time-domain Gravitational Wave physics

Available Online 

DVGAN: Stabilize Wasserstein GAN training for time-domain Gravitational Wave physics

by Dooney, Tom; Bromuri, Stefano; Curier, Lyana

09/2022

Simulating time-domain observations of gravitational wave (GW) detector environments will allow for a better understanding of GW sources, augment datasets for...

Journal Article  Full Text Online

DVGAN: Stabilize Wasserstein GAN training for time-domain Gravitational Wave physics

<——2022———2022———540-


[PDF] sns.it

[PDF] Sharp Wasserstein estimates for integral sampling and Lorentz summability of transport densities

F Santambrogio - 2022 - cvgmt.sns.it

… ie when approximate an integral … approximation. Another important point in choosing the 

precise form of the inequality that we would like to prove concerns the choice of the Wasserstein

Multiview Wasserstein Generative Adversarial Network for imbalanced pearl classification

S Gao, Y Dai, Y Li, K Liu, K Chen… - … Science and Technology, 2022 - iopscience.iop.org

… [15] introduced a Wasserstein gradient-penalty GAN with … mixture Wasserstein GAN (WGAN) 

approximate true feature … insufficiencies, a new multiview Wasserstein GAN (MVWGAN) with …

 All 2 versions


  [PDF] projecteuclid.org

Minimax confidence intervals for the Sliced Wasserstein distance

T Manole, S Balakrishnan… - Electronic Journal of …, 2022 - projecteuclid.org

… In this setting, contrasting popular approximate Bayesian computation methods, we 

develop uncertainty quantification methods with rigorous frequentist coverage guarantees. …

Cited by 10 Related articles All 8 versions
Zbl 07524974

[CITATION] Minimax confidence intervals for the Sliced Wasserstein distance. Electron

T MANOLE, S BALAKRISHNAN, LA WASSERMAN - J. Stat, 2022

Multisource single‐cell data integration by MAW barycenter for Gaussian mixture models

L Lin, W Shi, J Ye, J Li - Biometrics, 2022 - Wiley Online Library

… Minimized Aggregated Wasserstein (MAW) distance to approximate the Wasserstein metric 

  Related articles All 4 versions

 

Image Outpainting using Wasserstein Generative Adversarial Network with Gradient Penalty

A Nair, J Deshmukh, A Sonare… - 2022 6th International …, 2022 - ieeexplore.ieee.org

… an approximation of the Earth-Mover distance (EM) rather than the Jensen-Shannon 

divergence as in the original GAN formulation in the Wasserstein GAN. • In “Improved Training of …

Related articles 

2022


2022  see 2021     [PDF] mlr.press

Variance minimization in the Wasserstein space for invariant causal prediction

GG Martinet, A Strzalkowski… - … Conference on Artificial …, 2022 - proceedings.mlr.press

… Each of these tests relies on the minimization of a novel loss function–the Wasserstein 

variance–that is derived from tools in optimal transport theory and is used to quantify distributional …

 Cited by 3 Related articles All 2 versions


 

Optimisation robuste en distribution: incertitude de Wasserstein, régularisation et applications

J Malick - indico.math.cnrs.fr

… In distributionally robust optimization [2], for example, …

[French: Distributionally robust optimization: Wasserstein uncertainty, regularization and applications]


[PDF] arxiv.org

Stein factors for variance-gamma approximation in the Wasserstein and Kolmogorov distances

RE Gaunt - Journal of Mathematical Analysis and Applications, 2022 - Elsevier

Wasserstein and Kolmogorov distance error bounds in a six moment theorem for VG

approxim… Cited by 7 Related articles All 3 versions

Cited by 10 Related articles All 6 versions
Zbl 07540692


2022 see 2021  [PDF] arxiv.org

Wasserstein generative adversarial uncertainty quantification in physics-informed neural networks

Y Gao, MK Ng - Journal of Computational Physics, 2022 - Elsevier

… [50] designed some special discriminators with restricted approximability to let the trained 

generator approximate the target distribution in Wasserstein distance. However, their designs …

Cited by 4 Related articles All 4 versions

 Zbl 07536770

2022 see 2021  [PDF] arxiv.org

Wasserstein-Based Projections with Applications to Inverse Problems

H Heaton, SW Fung, AT Lin, S Osher, W Yin - SIAM Journal on Mathematics of …, 2022 - SIAM

… to the approximation as a Wasserstein-based projection (WP). Once this approximation of 

1 Citation  101 References  Related records

Cited by 5 Related articles All 4 versions

Zbl 07524173  MR4417000 

<——2022———2022———550— 


The Wasserstein Distance Using QAOA: A Quantum Augmented Approach to Topological Data Analysis

M Saravanan, M Gopikrishnan - 2022 International Conference …, 2022 - ieeexplore.ieee.org

… Of crucial importance to this method is a metric called the Wasserstein Distance, which … 

finding the Wasserstein Distance by applying the Quantum Approximate Optimization Algorithm (…

 

 

[PDF] arxiv.org

On a linear fused Gromov-Wasserstein distance for graph structured data

DH Nguyen, K Tsuda - arXiv preprint arXiv:2203.04711, 2022 - arxiv.org

… the concept of linear Wasserstein embedding for learning … 2-Wasserstein distance to the 

Fused Gromov-Wasserstein distance (… distance, we propose to approximate it by a linear optimal …

All 3 versions


[HTML] springer.com

[HTML] Primal dual methods for Wasserstein gradient flows

JA Carrillo, K Craig, L Wang, C Wei - Foundations of Computational …, 2022 - Springer

… Finally, in Algorithm 3, we describe how Algorithm 2 can be iterated to approximate the 

Cited by 55 Related articles All 14 versions

Zbl 07533993

[PDF] arxiv.org

A Simple Duality Proof for Wasserstein Distributionally Robust Optimization

L Zhang, J Yang, R Gao - arXiv preprint arXiv:2205.00362, 2022 - arxiv.org

We present a short and elementary proof of the duality for Wasserstein distributionally robust 

optimization, which holds for any arbitrary Kantorovich transport distance, any arbitrary …

 All 2 versions

    

[PDF] springer.com

On the use of Wasserstein distance in the distributional analysis of human decision making under uncertainty

A Candelieri, A Ponti, I Giordani, F Archetti - Annals of Mathematics and …, 2022 - Springer

… optimal transport-based Wasserstein distance. The distributional … Wasserstein has also 

enabled a global analysis computing the WST barycenters and performing k-means Wasserstein

Cited by 3 Related articles

2022

2022 see 2021  [PDF] arxiv.org

A continuation multiple shooting method for Wasserstein geodesic equation

J Cui, L Dieci, H Zhou - SIAM Journal on Scientific Computing, 2022 - SIAM

… concerned with approximating solutions of OT problems, and many of them are focused on 

the continuous problem considered in this work, that is, on computation of the Wasserstein

Cited by 4 Related articles All 3 versions

[PDF] arxiv.org

Limit distribution theory for smooth p-Wasserstein distances

Z Goldfeld, K Kato, S Nietert, G Rioux - arXiv preprint arXiv:2203.00159, 2022 - arxiv.org

Wasserstein distance. The limit distribution results leverage the functional delta method after 

embedding the domain of the Wasserstein … with the smooth Wasserstein distance, showing …

Cited by 4 Related articles All 2 versions

[PDF] arxiv.org

Wasserstein Asymptotics for the Empirical Measure of Fractional Brownian Motion on a Flat Torus

M Huesmann, F Mattesini, D Trevisan - arXiv preprint arXiv:2205.01025, 2022 - arxiv.org

… We establish asymptotic upper and lower bounds for the Wasserstein distance of any order 

p ≥ 1 between the empirical measure of a fractional Brownian motion on a flat torus and the …

Cited by 1 Related articles All 2 versions

2022 see 2021

WDIBS: Wasserstein deterministic information bottleneck for state abstraction to balance state-compression and performance

X Zhu, T Huang, R Zhang, W Zhu - Applied Intelligence, 2022 - Springer

… In this section, we introduce the related notion of the Wasserstein distance to measure 

decision performance after state compression. Based on the Wasserstein distance, we propose …

Cited by 1 Related articles


2022 see 2021  [PDF] arxiv.org

Application of an unbalanced optimal transport distance and a mixed L1/Wasserstein distance to full waveform inversion

D Li, MP Lamoureux, W Liao - Geophysical Journal International, 2022 - academic.oup.com

Full waveform inversion (FWI) is an important and popular technique in subsurface Earth

property estimation. In this paper, several improvements to the FWI methodology are developed …

Cited by 1 Related articles All 5 versions

<——2022———2022———560—


Optimisation robuste en distribution: incertitude de Wasserstein, régularisation et applications

J Malick - indico.math.cnrs.fr

Optimal transport has made a resounding entrance into machine learning [1] but also into other

applications that manipulate data. In distributionally robust optimization (…

[French: Distributionally robust optimization: Wasserstein uncertainty, regularization and applications]


[PDF] cnrs.fr

[PDF] Une nouvelle preuve du Théorème de Représentation du gradient MFG dans l'espace de Wasserstein

C Jimenez, A Marigonda, M Quincampoix - indico.math.cnrs.fr

[French: A new proof of the MFG gradient Representation Theorem in Wasserstein space]
Wasserstein … Une nouvelle preuve du Théor`eme de Représentation du gradient MFG …


Imbalanced Cell-Cycle Classification Using Wgan-Div and Mixup

P Rana, A Sowmya, E Meijering… - 2022 IEEE 19th …, 2022 - ieeexplore.ieee.org

… discarded majority samples and used Wasserstein GAN-gradient penalty (WGAN-GP) [13] …

, we propose a framework that utilises Wasserstein divergence GAN (WGAN-div) [14] and …

Cited by 2 Related articles


2022 see 2021  ARTICLE

Approximation for Probability Distributions by Wasserstein GAN

Yihang Gao ; Michael K Ng ; Mingjie Zhou. arXiv.org, 2022

OPEN ACCESS

Approximation for Probability Distributions by Wasserstein GAN

Available Online 


ARTICLE

Rate of convergence of the smoothed empirical Wasserstein distance

Block, Adam ; Jia, Zeyu ; Polyanskiy, Yury ; Rakhlin, Alexander, arXiv.org, 2022

OPEN ACCESS

Rate of convergence of the smoothed empirical Wasserstein distance

Available Online 

arXiv:2205.02128  [pdf, ps, other]  math.PR  cs.IT  math.ST
Rate of convergence of the smoothed empirical Wasserstein distance
Authors: Adam Block, Zeyu Jia, Yury Polyanskiy, Alexander Rakhlin
Abstract: Consider an empirical measure P_n induced by n iid samples from a d-dimensional K-subgaussian distribution P, and let γ = … be the isotropic Gaussian measure. We study the speed of convergence of the smoothed Wasserstein distance W…, with ∗ being the convolution of measures. For K < σ and in any dim…  More
Submitted 4 May, 2022; originally announced May 2022.
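
A minimal one-dimensional Monte Carlo sketch of the quantity this entry studies, the Wasserstein distance between the Gaussian-smoothed empirical and population measures (assuming NumPy/SciPy; W_1 and the standard-normal P below are illustrative choices, not the paper's exact setting):

# Illustrative estimate of the smoothed empirical Wasserstein distance
# between P_n * N(0, sigma^2) and P * N(0, sigma^2) in one dimension.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
n, sigma, m = 500, 0.5, 20000            # sample size, smoothing level, Monte Carlo size

x = rng.normal(0.0, 1.0, size=n)         # n iid draws from P (here: standard normal)

# Draws from the smoothed empirical measure: resample the data, add Gaussian noise.
smoothed_empirical = rng.choice(x, size=m) + sigma * rng.standard_normal(m)
# Draws from the smoothed population measure: fresh draws from P, plus Gaussian noise.
smoothed_population = rng.normal(0.0, 1.0, size=m) + sigma * rng.standard_normal(m)

print(wasserstein_distance(smoothed_empirical, smoothed_population))

Repeating this for several n gives a rough picture of how the smoothed distance decays with the sample size, which is the rate the paper quantifies.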

2022


[PDF] arxiv.org

Wasserstein Adversarial Learning based Temporal Knowledge Graph Embedding

Y Dai, W Guo, C Eickhoff - arXiv preprint arXiv:2205.01873, 2022 - arxiv.org

… Meanwhile, we also apply a Gumbel-Softmax relaxation and the Wasserstein distance to 

prevent vanishing gradient problems on discrete data; an inherent flaw in traditional generative …

All 2 versions
ARTICLE

Wasserstein Adversarial Learning based Temporal Knowledge Graph Embedding

Dai, Yuanfei ; Guo, Wenzhong ; Eickhoff, Carsten, arXiv.org, 2022

OPEN ACCESS

Wasserstein Adversarial Learning based Temporal Knowledge Graph Embedding

Available Online 

arXiv:2205.01873  [pdf, other]  cs.IR
Wasserstein Adversarial Learning based Temporal Knowledge Graph Embedding
Authors: Yuanfei Dai, Wenzhong Guo, Carsten Eickhoff
Abstract: Research on knowledge graph embedding (KGE) has emerged as an active field in which most existing KGE approaches mainly focus on static structural data and ignore the influence of temporal variation involved in time-aware triples. In order to deal with this issue, several temporal knowledge graph embedding (TKGE) approaches have been proposed to integrate temporal and structural information in rec…  More
Submitted 3 May, 2022; originally announced May 2022.
All 2 versions
 


arXiv:2205.01025  [pdf, ps, other]  math.PR  math.AP
Wasserstein Asymptotics for the Empirical Measure of Fractional Brownian Motion on a Flat Torus
Authors: Martin Huesmann, Francesco Mattesini, Dario Trevisan
Abstract: We establish asymptotic upper and lower bounds for the Wasserstein distance of any order p ≥ 1 between the empirical measure of a fractional Brownian motion on a flat torus and the uniform Lebesgue measure. Our inequalities reveal an interesting interaction between the Hurst index H and the dimension d of the state space, with a "phase-transition" in the rates when d = 2 + 1/H, akin to the Aj…  More
Submitted 2 May, 2022; originally announced May 2022.
Comments: Comments very welcome

All 2 versions

MR4414504 Prelim Mei, Yu; Liu, Jia; Chen, Zhiping; 

Distributionally Robust Second-Order Stochastic Dominance Constrained Optimization with Wasserstein Ball. SIAM J. Optim. 32 (2022), no. 2, 715–738. 90C15 (90-08 90C31 91B70 91G10)

Review PDF Clipboard Journal Article

Related articles All 2 versions

2022 see 2021  [PDF] arxiv.org

Distributionally Robust Second-Order Stochastic Dominance Constrained Optimization with Wasserstein Ball

Y Mei, J Liu, Z Chen - SIAM Journal on Optimization, 2022 - SIAM

… ambiguity sets, the Wasserstein distance contains an … Wasserstein ball. Thanks to the rapid

development recently on the strong duality theory of DRO problems with the Wasserstein ball […

Related articles All 2 versions

Adversarial classification via distributional robustness with Wasserstein ambiguity

Nam, HN and Wright, SJ

Apr 2022 (Early Access) | MATHEMATICAL PROGRAMMING  Free Full Text From Publisher

Enriched Cited References

We study a model for adversarial classification based on distributionally robust chance constraints. We show that under Wasserstein ambiguity, the model aims to minimize the conditional value-at-risk of the distance to misclassification, and we explore links to adversarial classification models proposed earlier and to maximum-margin classifiers. We also provide a reformulation of the distributi… Show more

57 References  Related records


ARTICLE

Image Reconstruction for Electrical Impedance Tomography (EIT) with Improved Wasserstein Generative Adversarial Network (WGAN)

Zhang, Hanyu ; Wang, Qi ; Zhang, Ronghua ; Li, Xiuyan ; Duan, Xiaojie ; Sun, Yukuan ; Wang, Jianming ; Jia, JiabinIEEE sensors journal, 2022, p.1-1

PEER REVIEWED

 Download PDF 

Image Reconstruction for Electrical Impedance Tomography (EIT) with Improved Wasserstein Generative Adversarial Network (WGAN)

Available Online 

 View Issue Contents

Image Reconstruction for Electrical Impedance Tomography (EIT) with Improved Wasserstein Generative Adversarial Network (WGAN)

H Zhang, Q Wang, R Zhang, X Li, X Duan… - IEEE Sensors …, 2022 - ieeexplore.ieee.org

Wasserstein generative adversarial network (WGAN) overcomes the … , WGAN is proposed 

Cited by 1 Related articles

<——2022———2022———570—

Cover Image

Martingale Wasserstein inequality for probability measures in the convex order

by Jourdain, Benjamin; Margheriti, William

Bernoulli : official journal of the Bernoulli Society for Mathematical Statistics and Probability, 2022

It is known since [24] that two one-dimensional probability measures in the convex order admit a martingale coupling with respect to which the integral of...

Journal Article  Full Text Online

 Preview   Open Access

Save this item   Permanent Link   Cite this item   Email this item   More actions

Zbl 07526567
Cited by 1 Related articles All 11 versions

 

2022 see 2021  Cover Image

A data-driven scheduling model of virtual power plant using Wasserstein

distributionally robust optimization

by Liu, Huichuan; Qiu, Jing; Zhao, Junhua

International journal of electrical power & energy systems, 05/2022, Volume 137

•A data-driven Wasserstein distributionally robust optimization model is proposed.•The day-head scheduling decision of VPP can be solved by off-the-shell...

Article PDFPDF

Journal Article  Full Text Online

View in Context Browse Journal

 Preview 

Save this item   Permanent Link   Cite this item   Email this item   More actions

Related articles All 2 versions


 

Rate of convergence of the smoothed empirical Wasserstein distance

by Block, Adam; Jia, Zeyu; Polyanskiy, Yury ; More...

05/2022

Consider an empirical measure $\mathbb{P}_n$ induced by $n$ iid samples from a $d$-dimensional $K$-subgaussian distribution $\mathbb{P}$ and let $\gamma =...

Journal Article  Full Text Online

 Preview 

Open Access

Save this item   Permanent Link   Cite this item   Email this item   More actions

Consequently, the requirement K < σ is necessary for validity of the log-Sobolev inequality (…

Cited by 1 All 3 versions

ERP-WGAN: A Data Augmentation Method for EEG Single ...

https://pubmed.ncbi.nlm.nih.gov › ...

https://pubmed.ncbi.nlm.nih.gov › ...

All 3 versions

To alleviate the bottleneck problem of scarce EEG sample, we propose a data augmentation method based on generative adversarial network to ...

ERP-WGAN: A Data Augmentation Method for EEG Single-trial Detection

Journal of neuroscience methods, 05/2022

ArticleView Article PDF

Journal Article  Full Text Online

View Complete Issue Browse Now

  

arXiv:2207.11324  [pdf, other]  cs.AI
Exploring Wasserstein Distance across Concept Embeddings for Ontology Matching
Authors: Yuan An, Alex Kalinowski, Jane Greenberg
Abstract: Measuring the distance between ontological elements is a fundamental component for any matching solutions. String-based distance metrics relying on discrete symbol operations are notorious for shallow syntactic matching. In this study, we explore Wasserstein distance metric across ontology concept embeddings. Wasserstein distance metric targets continuous space that can incorporate linguistic, str…  More

 Journal Article  Full Text Online 
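
A minimal sketch of the general idea in this entry, comparing two sets of concept embeddings by an optimal-transport cost (assuming NumPy/SciPy; with equal set sizes and uniform weights the transport problem reduces to an assignment problem; all names below are illustrative, not the authors' code):

# Toy Wasserstein-style distance between two sets of concept embeddings.
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def embedding_set_distance(X, Y):
    # Average optimal matching cost between the rows of X and Y (both k x d).
    cost = cdist(X, Y)                        # pairwise Euclidean costs
    rows, cols = linear_sum_assignment(cost)  # optimal one-to-one matching
    return cost[rows, cols].mean()

rng = np.random.default_rng(1)
ontology_a = rng.normal(size=(10, 32))        # 10 concept embeddings of dimension 32
ontology_b = rng.normal(size=(10, 32))
print(embedding_set_distance(ontology_a, ontology_b))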

ARTICLE

Exploring Wasserstein Distance across Concept Embeddings for Ontology Matching

Yuan An ; Alex Kalinowski ; Jane Greenberg, arXiv.org, 2022

OPEN ACCESS

Exploring Wasserstein Distance across Concept Embeddings for Ontology Matching

Available Online 

2022

   

Time Discretizations of Wasserstein-Hamiltonian Flows

https://www.researchgate.net › publication › 342230141_...

https://www.researchgate.net › publication › 342230141_...

We study discretizations of Hamiltonian systems on the probability density manifold equipped with the $L^2$-Wasserstein metric. Based on discrete optimal ...

Cover Image

Time discretizations of Wasserstein--Hamiltonian flows

by Jianbo Cui; Luca Dieci; Haomin Zhou

Mathematics of computation, 05/2022, Volume 91, Issue 335

We study discretizations of Hamiltonian systems on the probability density manifold equipped with the L^2-Wasserstein metric. Based on discrete optimal...

ArticleView Article PDF

Journal Article  Full Text Online

View Complete Issue Browse Now

Cited by 8 Related articles All 8 versions

 The Impact of Edge Displacement Vaserstein Distance on UD ...

https://direct.mit.edu › coli › article › doi › coli_a_00440

https://direct.mit.edu › coli › article › doi › coli_a_00440

Apr 7, 2022 — We hypothesize that this measurement will be related to differences observed in parsing performance across treebanks. We motivate this by ...

Related articles All 3 versions

2022 see 2021

Physics-driven learning of Wasserstein GAN for density reconstruction in dynamic tomography

Huang, ZS; Klasky, M; (...); Ravishankar, S

Apr 1 2022 | APPLIED OPTICS 61 (10) , pp.2805-2817

Object density reconstruction from projections containing scattered radiation and noise is of critical importance in many applications. Existing scatter correction and density reconstruction methods may not provide the high accuracy needed in many applications and can break down in the presence of unmodeled or anomalous scatter and other experimental artifacts. Incorporating machine-learning mo

Show more

Get It Penn StateFree Submitted Article From RepositoryFull Text at Publisher

30 References

Related articles All 8 versions

Convergence diagnostics for Monte Carlo fission source distributions using the Wasserstein distance measure

Guo, XY; Li, ZG; (...); Wang, K

Apr 1 2022 | NUCLEAR ENGINEERING AND DESIGN 389

The power iteration technique is commonly used in Monte Carlo (MC) criticality simulations to obtain converged neutron source distributions. Entropy is a typical indicator used to examine source distribution convergence. However, spatial meshing is required to calculate entropy, and the performance of a convergence diagnostic is sensitive to the chosen meshing scheme. A new indicator based on t

Show more

Get It Penn StateFull Text at Publisher

30 References  Related records

Related articles

<——2022———2022———580—



An Improved Mixture Density Network Via Wasserstein Distance Based Adversarial Learning for Probabilistic Wind Speed Predictions

Yang, LX; Zheng, Z and Zhang, ZJ

Apr 2022 | IEEE TRANSACTIONS ON SUSTAINABLE ENERGY 13 (2) , pp.755-766

Enriched Cited References

This paper develops a novel improved mixture density network via Wasserstein distance-based adversarial learning (WA-IMDN) for achieving more accurate probabilistic wind speed predictions (PWSP). The proposed method utilizes the historical supervisory control and data acquisition (SCADA) system data collected from multiple wind turbines (WTs) in different wind farms to predict the wind speed pr

Show more

Get It Penn StateFull Text at Publisher


Conference Paper  Citation/Abstract

Image Outpainting using Wasserstein Generative Adversarial Network with Gradient Penalty

Deshmukh, Jay; Sonare, Akash; Mishra, Tarun; Joseph, Richard.

The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2022).

Abstract/Details 
Related articles

 

Wasserstein stability of porous medium-type equations on...

by De Ponti, Nicolò; Muratori, Matteo; Orrieri, Carlo

Journal of functional analysis, 11/2022, Volume 283, Issue 9

Given a complete, connected Riemannian manifold Mn with Ricci curvature bounded from below, we discuss the stability of the

solutions of a porous medium-type...

View NowPDF

Journal article  Full Text Online

View in Context Browse Journal


Conference Paper  Citation/Abstract

A Convolutional Wasserstein Distance for Tractography Evaluation: Complementarity Study to State-of-the-Art Measures

Coloigner, Julie; Commowick, Olivier.

The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2022).

Abstract/Detail

All 11 versions


2022 see 2021  Scholarly Journal  Citation

Second-Order Conic Programming Approach for Wasserstein Distributionally Robust Two-Stage Linear Programs

Wang, Zhuolin; You, Keyou; Song, Shiji; Zhang, Yuli.

IEEE Transactions on Automation Science and Engineering; New York Vol. 19, Iss. 2,  (2022): 946-958.

Details Get full textLink to external site, this link will open in a new window

2022 Working Paper  Full Text

WPPNets and WPPFlows: The Power of Wasserstein Patch Priors for Superresolution

Altekrüger, Fabian; Hertrich, Johannes.

arXiv.org; Ithaca, May 5, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window

ARTICLE

Orthogonal Gromov-Wasserstein Discrepancy with Efficient Lower Bound

Jin, Hongwei ; Yu, Zishun ; Zhang, Xinhua, arXiv.org, 2022

OPEN ACCESS

Orthogonal Gromov-Wasserstein Discrepancy with Efficient Lower Bound

Available Online 

Working Paper  Full Text

Orthogonal Gromov-Wasserstein Discrepancy with Efficient Lower Bound

Jin, Hongwei; Yu, Zishun; Zhang, Xinhua.

arXiv.org; Ithaca, May 12, 2022.

Abstract/DetailsGet full text

\Link to external site, this link will open in a new window
Cited by 1
 Related articles All 8 versions 

NEWSLETTER ARTICLE

University of Amsterdam Reports Findings in Operations Science (Proxying credit curves via Wasserstein distances)

Investment Weekly News, 2022, p.935

University of Amsterdam Reports Findings in Operations Science (Proxying credit curves via Wasserstein distances)

Available Online 

ARTICLE

A Wasserstein distance approach for concentration of empirical risk estimates

Prashanth L A ; Sanjay P Bhat, arXiv.org, 2022

OPEN ACCESS

A Wasserstein distance approach for concentration of empirical risk estimates

Available Online 

Working Paper  Full Text

A Wasserstein distance approach for concentration of empirical risk estimates

Prashanth, L A; Bhat, Sanjay P.

arXiv.org; Ithaca, May 10, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window

Cited by 4 Related articles All 2 versions

MR4577677 


[PDF] marouaneilidrissi.com

[PDF] Robustness assessment using quantile-constrained Wasserstein projections

MI Idrissi - marouaneilidrissi.com

… These perturbation schemes lead to the class of “perturbed-law indices”(PLI), aiming at 

assessing input importance through the study of sensitivity of the model output with respect to the …

 All 2 versions

 <——2022———2022———590— 


[HTML] hindawi.com

[HTML] A data augmentation method for prohibited item X-ray pseudocolor images in X-ray security inspection based on wasserstein generative adversarial network …

D Liu, J Liu, P Yuan, F Yu - Computational Intelligence and …, 2022 - hindawi.com

… spatial-and-channel attention block (SCAB) and a new base block to compose our X-ray 

Wasserstein generative adversarial network model (SCAB-XWGAN-GP). The model directly …

Cited by 2 Related articles All 10 versions

Towards Efficient Variational Auto-Encoder Using Wasserstein ...

https://ieeexplore.ieee.org › document

https://ieeexplore.ieee.org › document

by Z Chen · 2022 — In this paper, we propose using Wasserstein distance as a measure of ... Published in: 2022 IEEE International Conference on Image Processing (ICIP).

Acoustic metamaterial design using Conditional Wasserstein Generative Adversarial Networks

P Lai, F Amirkulova - The Journal of the Acoustical Society of …, 2022 - asa.scitation.org

… to improve the model’s spatial recognition of cylinder configurations. The cWGAN model [1] … “Conditional Wasserstein generative adversarial networks applied to acoustic metamaterial …


Multiview Wasserstein Generative Adversarial Network for imbalanced pearl classification

S Gao, Y Dai, Y Li, K Liu, K Chen… - … Science and Technology, 2022 - iopscience.iop.org

… [1] studied the pearl shape recognition method based on computer vision. Through the search and comparison of image features from multiple views, the pearl morphology recognition …

All 2 versions

 [PDF] arxiv.org

Fast and Provably Convergent Algorithms for Gromov-Wasserstein in Graph Learning

J Li, J Tang, L Kong, H Liu, J Li, AMC So… - arXiv preprint arXiv …, 2022 - arxiv.org

… In this paper, we study the design and analysis of a class of efficient algorithms for computing the Gromov-Wasserstein (GW) distance tailored to large-scale graph learning tasks. Armed …

Fast and Provably Convergent Algorithms for Gromov-Wasserstein in Graph Learning

by Li, Jiajin; Tang, Jianheng; Kong, Lemin ; More...

05/2022

In this paper, we study the design and analysis of a class of efficient algorithms for computing the Gromov-Wasserstein (GW) distance tailored to large-scale...

Journal Article  Full Text Online

 Preview   Open Access

2022


arXiv:2205.09006  [pdf, ps, other]  math.NA  math.OC
On Assignment Problems Related to Gromov-Wasserstein Distances on the Real Line
Authors: Robert Beinert, Cosmas Heiss, Gabriele Steidl
Abstract: Let x_1 < … < x_n and y_1 < … < y_n, n ∈ N, be real numbers. We show by an example that the assignment problem max… is in general neither solved by the identical permutation (id) nor the anti-identical permutation (a-id) if n > 2+2… Indeed the above maximum can be, d…  More
Submitted 18 May, 2022; originally announced May 2022.
On Assignment Problems Related to Gromov-Wasserstein Distances on the Real Line

by Beinert, Robert; Heiss, Cosmas; Steidl, Gabriele

05/2022

Let $x_1 < \dots < x_n$ and $y_1 < \dots < y_n$, $n \in \mathbb N$, be real numbers. We show by an example that the assignment problem $$ \max_{\sigma \in S_n}...

Journal Article  Full Text Online
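
A brute-force check of the kind of statement made in this entry, for small n: compare the identical and anti-identical permutations against all permutations for a Gromov-Wasserstein-type assignment objective on the real line (a sketch assuming the quadratic product cost |x_i−x_k|^2·|y_σ(i)−y_σ(k)|^2; the paper's exact objective is elided above and may differ):

# Compare id and anti-id against the best permutation for a GW-type objective.
import itertools
import numpy as np

def objective(x, y, sigma):
    dx = np.abs(x[:, None] - x[None, :]) ** 2
    dy = np.abs(y[sigma][:, None] - y[sigma][None, :]) ** 2
    return float((dx * dy).sum())

x = np.array([0.0, 1.0, 2.5, 4.0])
y = np.array([0.0, 0.5, 2.0, 5.0])
n = len(x)

best = max(itertools.permutations(range(n)),
           key=lambda s: objective(x, y, np.array(s)))
identity, anti_identity = np.arange(n), np.arange(n)[::-1]
print("max:", objective(x, y, np.array(best)),
      "id:", objective(x, y, identity),
      "anti-id:", objective(x, y, anti_identity))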


arXiv:2205.08826  [pdf, ps, other]  math.OC
Regularization for Wasserstein Distributionally Robust Optimization
Authors: Waïss Azizian, Franck Iutzeler, Jérôme Malick
Abstract: Optimal transport has recently proved to be a useful tool in various machine learning applications needing comparisons of probability measures. Among these, applications of distributionally robust optimization naturally involve Wasserstein distances in their models of uncertainty, capturing data shifts or worst-case scenarios. Inspired by the success of the regularization of Wasserstein distances… 
More
Submitted 18 May, 2022; originally announced May 2022.
Cited by 1
 Related articles All 2 versions 

arXiv:2205.08748  [pdf, ps, other]  math.AP
Gradient flows of modified Wasserstein distances and porous medium equations with nonlocal pressure
Authors: Nhan-Phu Chung, Quoc-Hung Nguyen
Abstract: We study families of porous medium equations with nonlocal pressure. We construct their weak solutions via JKO schemes for modified Wasserstein distances. We also establish the regularization effect and decay estimates for the L… norms.
Submitted 18 May, 2022; originally announced May 2022.
Comments: 24 pages. Dedicated to Professor Duong Minh Duc on the occasion of his 70th birthday. Comments welcome

arXiv:2205.07531  [pdf, other]  cs.LG  cs.HC  stat.ML
Wasserstein t-SNE
Authors: Fynn Bachmann, Philipp Hennig, Dmitry Kobak
Abstract: Scientific datasets often have hierarchical structure: for example, in surveys, individual participants (samples) might be grouped at a higher level (units) such as their geographical region. In these settings, the interest is often in exploring the structure on the unit level rather than on the sample level. Units can be compared based on the distance between their means, however this ignores the…  More
Submitted 16 May, 2022; originally announced May 2022.
All 2 versions 


arXiv:2205.06725  [pdf, other]  math.OC  math.NA
Multi-Marginal Gromov-Wasserstein Transport and Barycenters
Authors: Florian Beier, Robert Beinert, Gabriele Steidl
Abstract: Gromov-Wasserstein (GW) distances are generalizations of Gromov-Haussdorff and Wasserstein distances. Due to their invariance under certain distance-preserving transformations they are well suited for many practical applications. In this paper, we introduce a concept of multi-marginal GW transport as well as its regularized and unbalanced versions. Then we generalize a bi-convex relaxation of the…  More
Submitted 13 May, 2022; originally announced May 2022.
MSC Class: 65K10; 49M20; 28A35; 28A33

Multi-Marginal Gromov-Wasserstein Transport and Barycenters

by Beier, Florian; Beinert, Robert; Steidl, Gabriele

05/2022

Gromov-Wasserstein (GW) distances are generalizations of Gromov-Haussdorff and Wasserstein distances. Due to their invariance under certain distance-preserving transformations they are well suited for many practical...

Journal Article  Full Text Online

 Preview  Open Access

Related articles All 3 versions 

 <——2022———2022———600—  


[PDF] arxiv.org

Dynamical Mode Recognition of Triple Flickering Buoyant Diffusion Flames: from Physical Space to Phase Space and to Wasserstein Space

Y Chi, T Yang, P Zhang - arXiv preprint arXiv:2201.01085, 2022 - arxiv.org

… The 1-Wasserstein distance apparently satisfies the three axioms for a metric: 1) 𝑊1(𝜇,𝜈) … 

Wasserstein distance can be found in [28] and the detailed descriptions about the Matlab code …

Cited by 1 Related articles All 2 versions 
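
For completeness, the three metric axioms the truncated snippet refers to, stated for the 1-Wasserstein distance between probability measures μ, ν with finite first moments (a standard formulation, not quoted from the paper):

\[
W_1(\mu,\nu)=\inf_{\pi\in\Pi(\mu,\nu)}\int d(u,v)\,\mathrm{d}\pi(u,v),
\]
\begin{align*}
&\text{(1) } W_1(\mu,\nu)\ge 0,\ \text{with } W_1(\mu,\nu)=0 \text{ iff } \mu=\nu;\\
&\text{(2) } W_1(\mu,\nu)=W_1(\nu,\mu);\\
&\text{(3) } W_1(\mu,\nu)\le W_1(\mu,\rho)+W_1(\rho,\nu).
\end{align*}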

 ARTICLE

Dynamical Mode Recognition of Triple Flickering Buoyant Diffusion Flames: from Physical Space to Phase Space and to Wasserstein Space

Chi, Yicheng ; Yang, Tao ; Zhang, Peng, 2022

OPEN ACCESS

Dynamical Mode Recognition of Triple Flickering Buoyant Diffusion Flames: from Physical Space to Phase Space and to Wasserstein Space

Cited by 2 Related articles All 2 versions

[PDF] researchgate.net

Measuring phase-amplitude coupling between neural oscillations of different frequencies via the Wasserstein distance

T Ohki - Journal of Neuroscience Methods, 2022 - Elsevier

… The Wasserstein distance is an optimization algorithm for minimizing transportation cost … 

distance function to quantify PAC and named the Wasserstein Modulation Index (wMI). As the …

Cited by 1 All 5 versions
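
A hypothetical sketch of the general idea described in the snippet, scoring phase-amplitude coupling by the 1-Wasserstein distance between a phase-binned amplitude distribution and a uniform reference (assuming NumPy/SciPy; the function, binning and normalization below are illustrative and are not the paper's wMI definition):

# Toy Wasserstein-based phase-amplitude coupling score.
import numpy as np
from scipy.stats import wasserstein_distance

def wasserstein_pac(phase, amplitude, n_bins=18):
    bins = np.linspace(-np.pi, np.pi, n_bins + 1)
    idx = np.clip(np.digitize(phase, bins) - 1, 0, n_bins - 1)
    mean_amp = np.array([amplitude[idx == b].mean() if np.any(idx == b) else 0.0
                         for b in range(n_bins)])
    p = mean_amp / mean_amp.sum()          # amplitude distribution over phase bins
    u = np.full(n_bins, 1.0 / n_bins)      # uniform reference (no coupling)
    centers = 0.5 * (bins[:-1] + bins[1:])
    return wasserstein_distance(centers, centers, p, u)

rng = np.random.default_rng(2)
phase = rng.uniform(-np.pi, np.pi, 5000)
amplitude = 1.0 + 0.5 * np.cos(phase) + 0.1 * rng.standard_normal(5000)  # coupled signal
print(wasserstein_pac(phase, amplitude))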

 

[PDF] arxiv.org

The Wasserstein distance of order 1 for quantum spin systems on infinite lattices

G De Palma, D Trevisan - arXiv preprint arXiv:2210.11446, 2022 - arxiv.org

We propose a generalization of the Wasserstein distance of order $1$ to quantum spin 

systems on the lattice $\mathbb{Z}^d$, which we call specific quantum $W_1$ distance. The …

 All 2 versions


[PDF] aaai.org

[PDF] Semi-Supervised Conditional Density Estimation with Wasserstein Laplacian Regularisation

O Graffeuille, YS Koh, J Wicker, M Lehmann - 2022 - aaai.org

… but labelled data is scarce, we propose Wasserstein Laplacian Regularisation, a semi-… of 

the underlying data, as measured by Wasserstein distance. When applying our framework to …


[PDF] arxiv.org

Regularization for Wasserstein Distributionally Robust Optimization

W Azizian, F Iutzeler, J Malick - arXiv preprint arXiv:2205.08826, 2022 - arxiv.org

… of Wasserstein distances in optimal transport, we study in this paper the regularization of 

Wasserstein … First, we derive a general strong duality result of regularized Wasserstein

All 2 versions

2022

[PDF] arxiv.org

Universality of persistence diagrams and the bottleneck and Wasserstein distances

P Bubenik, A Elchesen - Computational Geometry, 2022 - Elsevier

… In contrast, we show that the Wasserstein distances are universal … Wasserstein distance on 

probability measures. Among other things, this allows for a version of Kantorovich-Rubinstein

Cited by 10 Related articles All 8 versions

Zbl 07541319
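
For readability, the p-Wasserstein distance between persistence diagrams that this entry shows to be universal is usually defined as an optimal matching cost (a standard formulation, not quoted from the paper):

\[
W_p(D_1,D_2)=\Bigl(\inf_{\gamma}\sum_{x\in D_1}\lVert x-\gamma(x)\rVert_\infty^{\,p}\Bigr)^{1/p},
\]
where \gamma ranges over bijections between D_1 and D_2, with points allowed to be matched to the diagonal.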

[PDF] rice.edu

Subexponential Upper and Lower Bounds in Wasserstein Distance for Markov Processes

N Sandrić, A Arapostathis, G Pang - Applied Mathematics & Optimization, 2022 - Springer

Lyapunov drift conditions, we establish subexponential upper and lower bounds on the rate 

of convergence in the \(\text {L}^p\)-Wasserstein … L}^p\)-Wasserstein distance for a class of Itô …

 All 2 versions

 Zbl 07584030


[PDF] arxiv.org

Quantum Wasserstein isometries on the qubit state space

GP Gehér, J Pitrik, T Titkos, D Virosztek - arXiv preprint arXiv:2204.14134, 2022 - arxiv.org

… We describe Wasserstein isometries of the quantum bit state space with respect to … 

This phenomenon mirrors certain surprising properties of the quantum Wasserstein distance…

Cited by 2 Related articles All 3 versions
Quantum Wasserstein isometries on the qubit...
by Gehér, György Pál; Pitrik, József; Titkos, Tamás ; More...
Journal of mathematical analysis and applications, 12/2022
Journal Article 

 Related articles

[PDF] arxiv.org

Monotonicity of the quantum 2-Wasserstein distance

R Bistroń, M Eckstein, K Życzkowski - arXiv preprint arXiv:2204.07405, 2022 - arxiv.org

… In the set of all such distances the one generated by the Fischer–Rao metric is distinguished 

as the unique continuous distance monotone under classical stochastic maps (Cencov …

Cited by 1 Related articles All 3 versions

2022 see 2021  [PDF] arxiv.org

Weak topology and Opial property in Wasserstein spaces, with applications to Gradient Flows and Proximal Point Algorithms of geodesically convex functionals

E Naldi, G Savaré - Rendiconti Lincei, 2022 - ems.press

… We will collect in Section 2 the main facts concerning optimal transport and Kantorovich-Rubinstein-Wasserstein 

distances; we adopt a general topological framework, in order to …

Cited by 2 Related articles All 8 versions

<——2022———2022———610— 


[PDF] arxiv.org

Improving Human Image Synthesis with Residual Fast Fourier Transformation and Wasserstein Distance

J Wu, S Si, J Wang, J Xiao - arXiv preprint arXiv:2205.12022, 2022 - arxiv.org

… Using Wasserstein distance can solve the problem of gradient disappearance, and using … 

to exceed the Lipschitz constant k, which makes the discriminator satisfy Lipschitz continuity. …
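
The Lipschitz device alluded to in the snippet is usually enforced with a gradient penalty on the Wasserstein critic. A minimal PyTorch-style sketch of that generic penalty (assuming PyTorch; this is the standard WGAN-GP term, not necessarily the exact variant used in the paper, and the tiny critic below is only for illustration):

# Generic WGAN gradient penalty: push the critic's gradient norm toward 1
# on random interpolates between real and fake samples.
import torch

def gradient_penalty(critic, real, fake):
    eps = torch.rand(real.size(0), 1, device=real.device)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads, = torch.autograd.grad(outputs=scores.sum(), inputs=interp, create_graph=True)
    return ((grads.norm(2, dim=1) - 1.0) ** 2).mean()

critic = torch.nn.Sequential(torch.nn.Linear(16, 64), torch.nn.ReLU(),
                             torch.nn.Linear(64, 1))
real, fake = torch.randn(8, 16), torch.randn(8, 16)
critic_loss = (critic(fake).mean() - critic(real).mean()
               + 10.0 * gradient_penalty(critic, real, fake))
critic_loss.backward()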

ARTICLE

Improving Human Image Synthesis with Residual Fast Fourier Transformation and Wasserstein Distance

Wu, Jianhan ; Si, Shijing ; Wang, Jianzong ; Xiao, Jing, 2022

OPEN ACCESS

Improving Human Image Synthesis with Residual Fast Fourier Transformation and Wasserstein Distance

Available Online 

Related articles All 5 versions

[HTML] nature.com

[HTML] Generalizing predictions to unseen sequencing profiles via deep generative models

M Oh, L Zhang - Scientific reports, 2022 - nature.com

… visual patterns based on conditional Wasserstein GAN, is proposed … Mover) formulated by 

Kantorovich-Rubinstein duality is used … function to enforce the Lipschitz constraint, alleviating …

All 5 versions

[PDF] arxiv.org

Multi-Marginal Gromov-Wasserstein Transport and Barycenters

F Beier, R Beinert, G Steidl - arXiv preprint arXiv:2205.06725, 2022 - arxiv.org

Gromov-Wasserstein (GW) distances are generalizations of Gromov-Haussdorff and Wasserstein 

distances. Due to their invariance under certain distance-preserving transformations …

Related articles All 3 versions

Bures–Wasserstein geometry for positive-definite Hermitian matrices and their trace-one subset
Author: Jesse van Oostrum
Summary: In his classical argument, Rao derives the Riemannian distance corresponding to the Fisher metric using a mapping between the space of positive measures and Euclidean space. He obtains the Hellinger distance on the full space of measures and the Fisher distance on the subset of probability measures. In order to highlight the interplay between Fisher theory and quantum information theory, we extend this construction to the space of positive-definite Hermitian matrices using Riemannian submersions and quotient manifolds. The analog of the Hellinger distance turns out to be the Bures–Wasserstein (BW) distance, a distance measure appearing in optimal transport, quantum information, and optimisation theory. First we present an existing derivation of the Riemannian metric and geodesics associated with this distance. Subsequently, we present a novel derivation of the Riemannian distance and geodesics for this metric on the subset of trace-one matrices, analogous to the Fisher distance for probability measures.
Article, 2022
Publication: Information Geometry, 5, 20220922, 405
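
For readability, the Bures–Wasserstein distance discussed in this record is commonly written, for positive-definite Hermitian matrices A and B, as (standard formula, not quoted from the article):

\[
d_{\mathrm{BW}}(A,B)=\Bigl(\operatorname{tr}A+\operatorname{tr}B-2\,\operatorname{tr}\bigl(A^{1/2}BA^{1/2}\bigr)^{1/2}\Bigr)^{1/2},
\]
which reduces to the 2-Wasserstein distance between centered Gaussians when A and B are covariance matrices.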

arXiv:2205.13501  [pdf, ps, other]  math.OC
Wasserstein Logistic Regression with Mixed Features
Authors: Aras Selvi, Mohammad Reza Belbasi, Martin B Haugh, Wolfram Wiesemann
Abstract: Recent work has leveraged the popular distributionally robust optimization paradigm to combat overfitting in classical logistic regression. While the resulting classification scheme displays a promising performance in numerical experiments, it is inherently limited to numerical features. In this paper, we show that distributionally robust logistic regression with mixed (i.e., numerical and categor…  More
Submitted 26 May, 2022; originally announced May 2022.
Comments: 22 pages (11 main + 11 appendix). Preprint
Related articles
 All 2 versions 


2022

arXiv:2205.13307  [pdf, ps, other]  math.PR
From p-Wasserstein Bounds to Moderate Deviations
Authors: Xiao Fang, Yuta Koike
Abstract: We use a new method via p-Wasserstein bounds to prove Cramér-type moderate deviations in (multivariate) normal approximations. In the classical setting that W is a standardized sum of n independent and identically distributed (i.i.d.) random variables with sub-exponential tails, our method recovers the optimal range of 0 ≤ x = o(n^{1/6}) and the near optimal error rate…  More
Submitted 26 May, 2022; originally announced May 2022.
Comments: 58 pages
MSC Class: 60F05; 60F10; 62E17

ARTICLE

Wasserstein Distributionally Robust Gaussian Process Regression and Linear Inverse Problems

Zhang, Xuhui ; Blanchet, Jose ; Marzouk, Youssef ; Nguyen, Viet Anh ; Wang, Sven, arXiv.org, 2022

OPEN ACCESS

Wasserstein Distributionally Robust Gaussian Process Regression and Linear Inverse Problems

Available Online 

arXiv:2205.13111  [pdf, other]  math.OC  math.PR
Wasserstein Distributionally Robust Gaussian Process Regression and Linear Inverse Problems
Authors: Xuhui Zhang, Jose Blanchet, Youssef Marzouk, Viet Anh Nguyen, Sven Wang
Abstract: We study a distributionally robust optimization formulation (i.e., a min-max game) for problems of nonparametric estimation: Gaussian process regression and, more generally, linear inverse problems. We choose the best mean-squared error predictor on an infinite-dimensional space against an adversary who chooses the worst-case model in a Wasserstein ball around an infinite-dimensional Gaussian mode… 
More
Submitted 25 May, 2022; originally announced May 2022.
Related articles
 All 2 versions 


arXiv:2205.13098  [pdf, other]  cs.LG  math.OC  stat.ML
Optimal Neural Network Approximation of Wasserstein Gradient Direction via Convex Optimization
Authors: Yifei Wang, Peng Chen, Mert Pilanci, Wuchen Li
Abstract: The computation of Wasserstein gradient direction is essential for posterior sampling problems and scientific computing. The approximation of the Wasserstein gradient with finite samples requires solving a variational problem. We study the variational problem in the family of two-layer networks with squared-ReLU activations, towards which we derive a semi-definite programming (SDP) relaxation. Thi… 
More
Submitted 25 May, 2022; originally announced May 2022.
Related articles
 All 3 versions 

ARTICLE

Universal consistency of Wasserstein \(k\)-NN classifier: Negative and Positive Results

Donlapark Ponnoprat, arXiv.org, 2022

OPEN ACCESS

Universal consistency of Wasserstein \(k\)-NN classifier: Negative and Positive Results

arXiv:2205.10740  [pdf, other]  math.OC  eess.SY
Exact SDP Formulation for Discrete-Time Covariance Steering with Wasserstein Terminal Cost
Authors: Isin M. Balci, Efstathios Bakolas
Abstract: In this paper, we present new results on the covariance steering problem with Wasserstein distance terminal cost. We show that the state history feedback control policy parametrization, which has been used before to solve this class of problems, requires an unnecessarily large number of variables and can be replaced by a randomized state feedback policy which leads to more tractable problem formul…  More
Submitted 22 May, 2022; originally announced May 2022.
<——2022———2022———620— 


2
Peer-reviewed
Fault Feature Recovery With Wasserstein Generative Adversarial Imputation Network With Gradient Penalty for Rotating Machine Health Monitoring Under Signal Loss Condition
Show more
Authors: Wenyang Hu, Tianyang Wang, Fulei Chu
Article, 2022
Publication:IEEE transactions on instrumentation and measurement, 71, 2022, 1
Publisher:2022


    

Cover Image

Measuring phase-amplitude coupling between neural oscillations of different frequencies via the Wasserstein...

by Ohki, Takefumi

Journal of neuroscience methods, 05/2022, Volume 374

Phase-amplitude coupling (PAC) is a key neuronal mechanism. Here, a novel method for quantifying PAC via the Wasserstein distance is presented. The Wasserstein...

Article PDFPDF

Journal Article  Full Text Online

View in Context Browse Journal

 Preview 

    

Cover Image

Wasserstein Uncertainty Estimation for Adversarial Domain Matching

by Rui Wang; Ruiyi Zhang; Ricardo Henao

Frontiers in big data, 05/2022, Volume 5

Domain adaptation aims at reducing the domain shift between a labeled source domain and an unlabeled target domain, so that the source model can be generalized...

Article PDFPDF

Journal Article  Full Text Online

View in Context Browse Journal

 Preview   Open Access

 Related articles All 4 versions

   

Cover Image

Distributionally robust mean-absolute deviation portfolio optimization using wasserstein metric

Journal of global optimization, 05/2022

Article PDFPDF

Journal Article  Full Text Online

View in Context Browse Journal

 All 3 versions

   

Cover Image

A data-driven scheduling model of virtual power plant using Wasserstein distributionally robust...

by Liu, Huichuan; Qiu, Jing; Zhao, Junhua

International journal of electrical power & energy systems, 05/2022, Volume 137

•A data-driven Wasserstein distributionally robust optimization model is proposed.•The day-head scheduling decision of VPP can be solved by off-the-shell...

Article PDFPDF

Journal Article  Full Text Online

View in Context Browse Journal

 Cited by 3 Related articles

    

2022


An Efficient Content Popularity Prediction of Privacy Preserving Based on Federated Learning and Wa...


by Wang, Kailun; Deng, Na; Li, Xuanheng

IEEE internet of things journal, 05/2022

To relieve the high backhaul load and long transmission time caused by the huge mobile data traffic, caching devices are deployed at the edge of mobile...

Journal Article  Full Text Online

Related articles

    

ARTICLE

From \(p\)-Wasserstein Bounds to Moderate Deviations

Xiao Fang ; Yuta Koike, arXiv.org, 2022

OPEN ACCESS

From \(p\)-Wasserstein Bounds to Moderate Deviations

Available Online 

    

Distributionally Robust Policy Learning with Wasserstein Distance

by Jin, Hongwei; Yu, Zishun; Zhang, Xinhua

05/2022

Comparing structured data from possibly different metric-measure spaces is a fundamental task in machine learning, with applications in, e.g., graph...

Journal Article  Full Text Online

 Preview  Open Access

 Related articles All 4 versions 

arXiv:2205.04637  [pdf, other]  econ.EM
Distributionally Robust Policy Learning with Wasserstein Distance
Authors: Daido Kido
Abstract: The effect of treatments is often heterogeneous, depending on the observable characteristics, and it is necessary to exploit such heterogeneity to devise individualized treatment rules. Existing estimation methods of such individualized treatment rules assume that the available experimental or observational data derive from the target population in which the estimated policy is implemented. Howeve…  More
Submitted 9 May, 2022; originally announced May 2022.
ARTICLE

Distributionally Robust Policy Learning with Wasserstein Distance

Kido, Daido, arXiv.org, 2022

OPEN ACCESS

Distributionally Robust Policy Learning with Wasserstein Distance

Available Online 



arXiv:2205.11060  [pdf, other]  cs.LG  cs.SE
Wasserstein Generative Adversarial Networks for Online Test Generation for Cyber Physical Systems
Authors: Jarkko Peltomäki, Frankie Spencer, Ivan Porres
Abstract: We propose a novel online test generation algorithm WOGAN based on Wasserstein Generative Adversarial Networks. WOGAN is a general-purpose black-box test generator applicable to any system under test having a fitness function for determining failing tests. As a proof of concept, we evaluate WOGAN by generating roads such that a lane assistance system of a car fails to stay on the designated lane.… 
More
Submitted 23 May, 2022; originally announced May 2022.
Comments: 5 pages, 3 figures
   

Wasserstein Generative Adversarial Networks for Online Test Generation for Cyber Physical Systems

by Peltomäki, Jarkko; Spencer, Frankie; Porres, Ivan

05/2022

We propose a novel online test generation algorithm WOGAN based on Wasserstein Generative Adversarial Networks. WOGAN is a general-purpose black-box test...

Journal Article  Full Text Online

 Preview  Open Access

Cited by 3 Related articles All 3 versions 

    

Exact SDP Formulation for Discrete-Time Covariance Steering with Wasserstein Terminal...

by Balci, Isin M; Bakolas, Efstathios

05/2022

In this paper, we present new results on the covariance steering problem with Wasserstein distance terminal cost. We show that the state history feedback...

Journal Article  Full Text Online

Related articles All 2 versions 

<——2022———2022——630—


2022 see 2021  ARTICLE

Hidden Convexity of Wasserstein GANs: Interpretable Generative Models with Closed-Form Solutions

Arda Sahiner ; Tolga Ergen ; Batu Ozturkler ; Burak Bartan ; John Pauly ; Morteza Mardani ; Mert Pilanci, arXiv.org, 2022

OPEN ACCESS

Hidden Convexity of Wasserstein GANs: Interpretable Generative Models with Closed-Form Solutions

Available Online

  

Gradient flows of modified Wasserstein distances and porous medium equations with nonlocal...

by Chung, Nhan-Phu; Nguyen, Quoc-Hung

05/2022

We study families of porous medium equation with nonlocal pressure. We construct their weak solutions via JKO schemes for modified Wasserstein distances. We...

Journal Article  Full Text Online

 Preview 

Open Access

    

NEWSLETTER ARTICLE

New Findings from Swiss Federal Institute of Technology Lausanne (EPFL) Describe Advances in Signal and Information Processing (Wasserstein-based Graph Alignment)

Network Business Weekly, 2022, p.273

New Findings from Swiss Federal Institute of Technology Lausanne (EPFL) Describe Advances in Signal and Information Processing (Wasserstein-based Graph Alignment)

No Online Access 

 

[PDF] arxiv.org

 WDIBS: Wasserstein deterministic information bottleneck for state abstraction to balance state-compression and performance

X Zhu, T Huang, R Zhang, W Zhu - Applied Intelligence, 2022 - Springer

Wasserstein deterministic information bottleneck for state abstractions … of the Wasserstein 

distance to measure decision performance after state compression. Based on the Wasserstein

Cited by 1 Related articles All 2 versions

    

Wasserstein Image Local Analysis: Histogram of Orientations, Smoothing and Edge Detection

by Zhu, Jiening; Veeraraghavan, Harini; Norton, Larry ; More...

05/2022

The Histogram of Oriented Gradient is a widely used image feature, which describes local image directionality based on numerical differentiation. Due to its...

Journal Article  Full Text Online

Related articles All 2 versions 
Working Paper  Full Text

Wasserstein Image Local Analysis: Histogram of Orientations, Smoothing and Edge Detection

Zhu, Jiening; Veeraraghavan, Harini; Norton, Larry; Deasy, Joseph O; Tannenbaum, Allen.

arXiv.org; Ithaca, May 11, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window


2022

On Assignment Problems Related to Gromov-Wasserstein ...

https://arxiv.org › math

https://arxiv.org › math

by R Beinert · 2022 — [Submitted on 18 May 2022]. Title:On Assignment Problems Related to Gromov-Wasserstein Distances on the Real Line. Authors:Robert Beinert, Cosmas Heiss, .....

by Beinert, Robert; Heiss, Cosmas; Steidl, Gabriele

05/2022

Let $x_1 < \dots < x_n$ and $y_1 < \dots < y_n$, $n \in \mathbb N$, be real numbers. We show by an example that the assignment problem $$ \max_{\sigma \in S_n}...

Journal Article  Full Text Online

 Preview  Open Access

Related articles All 3 versions

Gradient flows of modified Wasserstein distances and porous ...

https://arxiv.org › math

https://arxiv.org › math

May 18, 2022 — We study families of porous medium equation with nonlocal pressure. We construct their weak solutions via JKO schemes for modified Wasserstein ...


[PDF] arxiv.org

Wasserstein Image Local Analysis: Histogram of Orientations, Smoothing and Edge Detection

J Zhu, H Veeraraghavan, L Norton, JO Deasy… - arXiv preprint arXiv …, 2022 - arxiv.org

The Histogram of Oriented Gradient is a widely used image feature, which describes local 

image directionality based on numerical differentiation. Due to its ill-posed nature, small 

noise may lead to large errors. Conventional HOG may fail to produce meaningful 

directionality results in the presence of noise, which is common in medical radiographic 

imaging. We approach the directionality problem from a novel perspective by the use of the 

optimal transport map of a local image patch to a uni-color patch of its mean. We decompose …

Save Cite All 2 versions 

Wasserstein Image Local Analysis: Histogram of Orientations, Smoothing...

by Zhu, Jiening; Veeraraghavan, Harini; Norton, Larry ; More...

05/2022

The Histogram of Oriented Gradient is a widely used image feature, which describes local image directionality based on numerical differentiation. Due to its...

Journal Article  Full Text Online

 Preview  Open Access

Related articles All 2 versions

 2022 see 2019    

Hypothesis Test and Confidence Analysis With Wasserstein ...Distance on General Dimension

https://direct.mit.edu › neco › article › Hypothesis-Test-an...

May 19, 2022 — We develop a general framework for statistical inference with the 1-Wasserstein distance. Recently, the Wasserstein distance has attracted ...

Cover Image

Hypothesis Test and Confidence Analysis With Wasserstein Distance on General...

by Imaizumi, Masaaki; Ota, Hirofumi; Hamaguchi, Takuo

Neural computation, 05/2022, Volume 34, Issue 6

We develop a general framework for statistical inference with the 1-Wasserstein distance. Recently, the Wasserstein distance has attracted considerable...

ArticleView Article PDF

Journal Article  Full Text Online

View Complete Issue Browse Now

 Preview  Peer-Reviewed

    

 

Wasserstein t-SNE

F Bachmann, P Hennig, D Kobak - arXiv preprint arXiv:2205.07531, 2022 - arxiv.org

Scientific datasets often have hierarchical structure: for example, in surveys, individual

participants (samples) might be grouped at a higher level (units) such as their geographical …

Wasserstein t-SNE

by Bachmann, Fynn; Hennig, Philipp; Kobak, Dmitry

05/2022

Scientific datasets often have hierarchical structure: for example, in surveys, individual participants (samples) might be grouped at a higher level (units)...

Journal Article  Full Text Online

 Preview  Open Access

[PDF] arxiv.org

Wasserstein t-SNE

F Bachmann, P Hennig, D Kobak - arXiv preprint arXiv:2205.07531, 2022 - arxiv.org

… use the Wasserstein metric [8] to compute pairwise distances between units. The Wasserstein 

… The analysis code reproducing all figures in this paper can be found on GitHub at fsvbach/…

Related articles All 2 versions 
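
A minimal sketch of the pipeline described in the snippet: compute pairwise Wasserstein distances between units (groups of samples) and feed the precomputed distance matrix to t-SNE (assuming NumPy, SciPy and scikit-learn; the one-dimensional synthetic units below are illustrative, not the paper's data):

# Toy "Wasserstein t-SNE": pairwise 1-D Wasserstein distances between units,
# then t-SNE on the precomputed distance matrix.
import numpy as np
from scipy.stats import wasserstein_distance
from sklearn.manifold import TSNE

rng = np.random.default_rng(3)
units = [rng.normal(loc=rng.uniform(-3, 3), scale=1.0, size=200) for _ in range(30)]

n = len(units)
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        D[i, j] = D[j, i] = wasserstein_distance(units[i], units[j])

embedding = TSNE(metric="precomputed", init="random", perplexity=10,
                 random_state=0).fit_transform(D)
print(embedding.shape)  # (30, 2)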

<——2022———2022———640— 


[PDF] arxiv.org

Distributionally Robust Policy Learning with Wasserstein Distance

D Kido - arXiv preprint arXiv:2205.04637, 2022 - arxiv.org

The effect of treatments is often heterogeneous, depending on the observable 

characteristics, and it is necessary to exploit such heterogeneity to devise individualized 

treatment rules. Existing estimation methods of such individualized treatment rules assume 

that the available experimental or observational data derive from the target population in 

which the estimated policy is implemented. However, this assumption often fails in practice 

because useful data are limited. In this case, social planners must rely on the data …


All 4 versions 

Distributionally Robust Policy Learning with Wasserstein Distance

by Kido, Daido

05/2022

The effect of treatments is often heterogeneous, depending on the observable characteristics, and it is necessary to exploit such heterogeneity to devise...

Journal Article  Full Text Online

 Preview  Open Access

Related articles All 4 versions

    

Orthogonal Gromov-Wasserstein Discrepancy with Efficient Lower Bound

H Jin, Z Yu, X Zhang - arXiv preprint arXiv:2205.05838, 2022 - arxiv.org

Comparing structured data from possibly different metric-measure spaces is a fundamental 

task in machine learning, with applications in, eg, graph classification. The Gromov-

Wasserstein (GW) discrepancy formulates a coupling between the structured data based on 

optimal transportation, tackling the incomparability between different structures by aligning 

the intra-relational geometries. Although efficient local solvers such as conditional gradient 

and Sinkhorn are available, the inherent non-convexity still prevents a tractable evaluation …

Cited by 1 Related articles All 8 versions

Orthogonal Gromov-Wasserstein Discrepancy with Efficient Lower Bound

by Jin, Hongwei; Yu, Zishun; Zhang, Xinhua

05/2022

Comparing structured data from possibly different metric-measure spaces is a fundamental task in machine learning, with applications in, e.g., graph...

Journal Article  Full Text Online

 Preview  Open Access

    

2022 see 2017

Wasserstein Generative Adversarial Networks for Online Test Generation for Cyber Physical Systems

J Peltomäki, F Spencer, I Porres - arXiv preprint arXiv:2205.11060, 2022 - arxiv.org

We propose a novel online test generation algorithm WOGAN based on Wasserstein 

Generative Adversarial Networks. WOGAN is a general-purpose black-box test generator 

applicable to any system under test having a fitness function for determining failing tests. As 

a proof of concept, we evaluate WOGAN by generating roads such that a lane assistance 

system of a car fails to stay on the designated lane. We find that our algorithm has a 

competitive performance respect to previously published algorithms.

Cited by 3 Related articles All 3 versions

Wasserstein Generative Adversarial Networks for Online Test Generation for...

by Peltomäki, Jarkko; Spencer, Frankie; Porres, Ivan

05/2022

We propose a novel online test generation algorithm WOGAN based on Wasserstein Generative Adversarial Networks. WOGAN is a general-purpose black-box test...

Journal Article  Full Text Online

 Preview  Open Access

2022 5/11

Wasserstein Generative Adversarial Networks for Online Test ...

This is the video presentation for the paper "Wasserstein Generative Adversarial Networks for Online Test Generation for Cyber Physical ...

YouTube · Search-Based Software Testing Workshop (SBST) · 

May 11, 2022

ICSE 2022 Program - Conferences

https://conf.researchr.org › icse-2022 › virtual-program

Welcom to the website of the ICSE 2022 conference in Pittsburgh! ... 

Wasserstein Generative Adversarial Networks for Online Test Generation for Cyber ...

Cited by 3 Related articles All 6 versions

        

[PDF] arxiv.org

Fast and Provably Convergent Algorithms for Gromov-Wasserstein in Graph Learning

J Li, J Tang, L Kong, H Liu, J Li, AMC So… - arXiv preprint arXiv …, 2022 - arxiv.org

In this paper, we study the design and analysis of a class of efficient algorithms for 

computing the Gromov-Wasserstein (GW) distance tailored to large-scale graph learning 

tasks. Armed with the Luo-Tseng error bound condition~\cite {luo1992error}, two proposed 

algorithms, called Bregman Alternating Projected Gradient (BAPG) and hybrid Bregman 

Proximal Gradient (hBPG) are proven to be (linearly) convergent. Upon task-specific 

properties, our analysis further provides novel theoretical insights to guide how to select the …

Related articles All 4 versions 

Fast and Provably Convergent Algorithms for Gromov-Wasserstein in Graph...

by Li, Jiajin; Tang, Jianheng; Kong, Lemin ; More...

05/2022

In this paper, we study the design and analysis of a class of efficient algorithms for computing the Gromov-Wasserstein (GW) distance tailored to large-scale...


2022 see 2021

MR4428792 Prelim Wang, Zhongjian; Xin, Jack; Zhang, Zhiwen; DeepParticle: 

Learning invariant measure by a deep neural network minimizing Wasserstein distance on data generated from an interacting particle method. J. Comput. Phys. 464 (2022), Paper No. 111309.

Review PDF Clipboard Journal Article


2022

   

Wasserstein Image Local Analysis: Histogram of Orientations, Smoothing and Edge Detection

by Zhu, Jiening; Veeraraghavan, Harini; Norton, Larry ; More...

05/2022

The Histogram of Oriented Gradient is a widely used image feature, which describes local image directionality based on numerical differentiation. Due to its...

Journal Article  Full Text Online

 Preview   Open Access

[PDF] arxiv.org

Wasserstein-based graph alignment

HP Maretic, M El Gheche, G Chierchia… - … on Signal and …, 2022 - ieeexplore.ieee.org

… , where we consider the Wasserstein distance to measure the … Wasserstein distance 

combined with the one-to-many graph assignment permits to outperform both Gromov-Wasserstein

Cited by 10 Related articles All 3 versions

MR4422567 Prelim Maretic, Hermina Petric; Gheche, Mireille El; Minder, Matthias; Chierchia, Giovanni; Frossard, Pascal; 

Wasserstein-based graph alignment. IEEE Trans. Signal Inform. Process. Netw. 8 (2022), 353–363. 94C15

Review PDF Clipboard Journal Article

Wasserstein-Based Graph Alignment - IEEE Xplore

https://ieeexplore.ieee.org › document

by HP Maretic · 2022 · Cited by 10 — A novel method for comparing non-aligned graphs of various sizes is proposed, based on the Wasserstein distance between graph signal ...
Cited by 12
 Related articles All 6 versions
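
A generic ingredient behind distribution-based graph comparisons like the one in this record is the closed-form 2-Wasserstein distance between zero-mean Gaussians determined by two covariance matrices. A minimal sketch (assuming NumPy/SciPy; the random covariances below are illustrative stand-ins, not the paper's graph-derived ones):

# W2 between N(0, C1) and N(0, C2):
# W2^2 = tr(C1) + tr(C2) - 2 tr((C1^{1/2} C2 C1^{1/2})^{1/2}).
import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2(C1, C2):
    s1 = sqrtm(C1)
    cross = sqrtm(s1 @ C2 @ s1)
    w2_sq = np.trace(C1) + np.trace(C2) - 2.0 * np.trace(cross.real)
    return float(np.sqrt(max(w2_sq, 0.0)))

rng = np.random.default_rng(4)
A = rng.normal(size=(5, 5)); C1 = A @ A.T + np.eye(5)
B = rng.normal(size=(5, 5)); C2 = B @ B.T + np.eye(5)
print(gaussian_w2(C1, C2))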

2022

MR4421628 Prelim Reygner, Julien; Touboul, Adrien; 

Reweighting samples under covariate shift using a Wasserstein distance criterion. Electron. J. Stat. 16 (2022), no. 1, 3278–3314. 62E17 (62E20)

Review PDF Clipboard Journal Article

Journal Article  Full Text Online
Free Full Text from Publisher

42 References  Related records


[PDF] inria.fr

Wasserstein model reduction approach for parametrized flow problems in porous media

B Battisti, T Blickhan, G Enchery, V Ehrlacher… - 2022 - hal.inria.fr

The aim of this work is to build a reduced model for parametrized flow problems in porous

media. The main difficulty of this type of problem is that the …

Cited by 3 Related articles All 15 versions

[HTML] nih.gov

[HTML] Wasserstein Uncertainty Estimation for Adversarial Domain Matching

R Wang, R Zhang, R Henao - Frontiers in Big Data, 2022 - ncbi.nlm.nih.gov

Abstract Domain adaptation aims at reducing the domain shift between a labeled source

domain and an unlabeled target domain, so that the source model can be generalized to …

 All 4 versions


A 3D reconstruction method of porous media based on improved WGAN-GP

T Zhang, Q Liu, X Wang, X Ji, Y Du - Computers & Geosciences, 2022 - Elsevier

The reconstruction of porous media is important to the development of petroleum industry,

but the accurate characterization of the internal structures of porous media is difficult since …

<——2022———2022———650—


Research on Face Image Restoration Based on Improved WGAN

F Liu, R Chen, S Duan, M Hao, Y Guo - International Conference on …, 2022 - Springer

This article focuses on the face recognition model in real life scenarios, because the

possible occlusion affects the recognition effect of the model, resulting in a decline in the …

Related articles

Wasserstein Logistic Regression with Mixed Features - arXiv

https://arxiv.org › math

May 26, 2022  — In this paper, we show that distributionally robust logistic regression with mixed (i.e., numerical and categorical) features, ...

Wasserstein Logistic Regression with Mixed Features

Working Paper Full Text

Wasserstein Logistic Regression with Mixed Features

Aras Selvi; Belbasi, Mohammad Reza; Haugh, Martin B; Wiesemann, Wolfram.

arXiv.org; Ithaca, May 26, 2022.

Abstract/Details

Get full text

Link to external site, this link will open in a new window

Select result item

All 2 versions

2022 see 2021

Approximation for Probability Distributions by Wasserstein GAN

Working Paper

Full Text

Approximation for Probability Distributions by Wasserstein GAN

Gao, Yihang; Ng, Michael K; Zhou, Mingjie.

arXiv.org; Ithaca, May 23, 2022.

Abstract/Details

Get full text

Link to external site, this link will open in a new window

Select result item

Efficient Approximation of Gromov-Wasserstein Distance using Importance Sparsification

Working Paper

Full Text

Efficient Approximation of Gromov-Wasserstein Distance using Importance Sparsification

Li, Mengyu; Yu, Jun; Xu, Hongteng; Cheng, Meng.

arXiv.org; Ithaca, May 26, 2022.

Abstract/Details

Get full text

Link to external site, this link will open in a new window

Select result item

Related articles All 2 versions

Wasserstein Distributionally Robust Gaussian Process Regression and Linear Inverse Problems

X Zhang, J Blanchet, Y Marzouk, VA Nguyen… - arXiv preprint arXiv …, 2022 - arxiv.org

We study a distributionally robust optimization formulation (ie, a min-max game) for

problems of nonparametric estimation: Gaussian process regression and, more generally,

linear inverse problems. We choose the best mean-squared error predictor on an infinite-

dimensional space against an adversary who chooses the worst-case model in a

Wasserstein ball around an infinite-dimensional Gaussian model. The Wasserstein cost

function is chosen to control features such as the degree of roughness of the sample paths …

Wasserstein Distributionally Robust Gaussian Process Regression and Linear Inverse Problems

Working Paper

Full Text

Wasserstein Distributionally Robust Gaussian Process Regression and Linear Inverse Problems

Zhang, Xuhui; Blanchet, Jose; Marzouk, Youssef; Nguyen, Viet Anh; Wang, Sven.

arXiv.org; Ithaca, May 26, 2022.

Abstract/Details

Get full text

Link to external site, this link will open in a new window

Select result item

 

2022


Optimal Neural Network Approximation of Wasserstein Gradient Direction via Convex Optimization

Y Wang, P Chen, M Pilanci, W Li - arXiv preprint arXiv:2205.13098, 2022 - arxiv.org

The computation of Wasserstein gradient direction is essential for posterior sampling

problems and scientific computing. The approximation of the Wasserstein gradient with finite

samples requires solving a variational problem. We study the variational problem in the

family of two-layer networks with squared-ReLU activations, towards which we derive a semi-

definite programming (SDP) relaxation. This SDP can be viewed as an approximation of the

Wasserstein gradient in a broader function family including two-layer networks. By solving …

Optimal Neural Network Approximation of Wasserstein Gradient Direction via Convex Optimization

Working Paper

Full Text

Optimal Neural Network Approximation of Wasserstein Gradient Direction via Convex Optimization

Wang, Yifei; Chen, Peng; Pilanci, Mert; Li, Wuchen.

arXiv.org; Ithaca, May 26, 2022.


All 3 versions

 

2022 see 2021 

FlowPool: Pooling Graph Representations with Wasserstein Gradient Flows

Working Paper

Full Text

FlowPool: Pooling Graph Representations with Wasserstein Gradient Flows

Simou, Effrosyni.

arXiv.org; Ithaca, May 19, 2022.

 

From p-Wasserstein Bounds to Moderate Deviations

Xiao Fang, Yuta Koike

We use a new method via $p$-Wasserstein bounds to prove Cramér-type moderate deviations in (multivariate) normal approximations. In the classical setting that $W$ is a standardized sum of $n$ independent and identically distributed (i.i.d.) random variables with sub-exponential tails, our method recovers the optimal range of $0\le x=o(n^{1/6})$ and the near-optimal error rate $O(1)(1+x)(\log n+x^{2})/\sqrt{n}$ for $P(W>x)/(1-\Phi(x))\to 1$, where $\Phi$ is the standard normal distribution function. Our method also works for dependent random variables (vectors) and we give applications to the combinatorial central limit theorem, Wiener chaos, homogeneous sums and local dependence. The key step of our method is to show that the $p$-Wasserstein distance between the distribution of the random variable (vector) of interest and a normal distribution grows like $O(\alpha^{p}\Delta)$, $1\le p\le p_{0}$, for some constants $\alpha$, $\Delta$ and $p_{0}$. In the above i.i.d. setting, $\alpha=1$, $\Delta=$ … For this purpose, we obtain general $p$-Wasserstein bounds in (multivariate) normal approximations using Stein's method.
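As a reminder of the quantity used throughout the entry above, the $p$-Wasserstein distance between probability measures \(\mu\) and \(\nu\) on \(\mathbb{R}^d\) is

\[
W_p(\mu,\nu)\;=\;\Bigl(\inf_{\pi\in\Pi(\mu,\nu)}\int \|x-y\|^{p}\,\mathrm{d}\pi(x,y)\Bigr)^{1/p},\qquad 1\le p<\infty,
\]

where \(\Pi(\mu,\nu)\) is the set of couplings with marginals \(\mu\) and \(\nu\); this is the standard definition and does not depend on the specific constants in the abstract.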

Comments:

58 pages

Subjects:

Probability (math.PR)

MSC classes:

60F05, 60F10, 62E17

Cite as:

arXiv:2205.13307 [math.PR]

 

(or arXiv:2205.13307v1 [math.PR] for this version)

 

https://doi.org/10.48550/arXiv.2205.13307


Submission history

From: Xiao Fang [view email]

[v1] Thu, 26 May 2022 12:35:15 UTC (46 KB)

From p-Wasserstein Bounds to Moderate Deviations

Working Paper Full Text

From  p -Wasserstein Bounds to Moderate Deviations

Fang, Xiao; Koike, Yuta.

arXiv.org; Ithaca, May 26, 2022.

Abstract/Details


2022 see 2021

Wasserstein convergence rate for empirical measures on noncompact manifolds

Wang, FY

Feb 2022 | STOCHASTIC PROCESSES AND THEIR APPLICATIONS 144 , pp.271-287

Let $X_t$ be the (reflecting) diffusion process generated by $L:=\Delta+\nabla V$ on a complete connected Riemannian manifold $M$, possibly with a boundary $\partial M$, where $V\in C^1(M)$ is such that $\mu(dx):=e^{V(x)}\,dx$ is a probability measure. We estimate the convergence rate for the empirical measure $\mu_t:=\frac{1}{t}\int_0^t\delta_{X_s}\,ds$ under the Wasserstein distance. As a


Free Submitted Article From RepositoryFull Text at Publisher

Citation 18 References Related records

Cited by 10 Related articles All 5 versions

Tackling Algorithmic Bias in Neural-Network Classifiers using Wasserstein-2 Regularization

Risser, L; Sanz, AG; (...); Loubes, JM

Apr 2022 (Early Access) | JOURNAL OF MATHEMATICAL IMAGING AND VISION. Enriched Cited References

The increasingly common use of neural network classifiers in industrial and social applications of image analysis has allowed impressive progress these last years. Such methods are, however, sensitive to algorithmic bias, i.e., to an under- or an over-representation of positive predictions or to higher prediction errors in specific subgroups of images. We then introduce in this paper a new meth

View full text

5References  Related records

MR4458660 

Cited by 3 Related articles All 7 versions

<——2022———2022———660— 


VISUAL TRANSFER FOR REINFORCEMENT LEARNING VIA GRADIENT PENALTY BASED WASSERSTEIN DOMAIN CONFUSION

Zhu, XC; Zhang, RY; (...); Wang, XT

2022 | JOURNAL OF NONLINEAR AND VARIATIONAL ANALYSIS 6 (3) , pp.227-238

Enriched Cited References

It is pretty challenging to transfer learned policies among different visual environments. The recently proposed Wasserstein Adversarial Proximal Policy Optimization (WAPPO) attempts to over -come this difficulty by definitely learning a representation, which is sufficient to express the originating and the target domains simultaneously. Specifically, WAPPO uses the Wasserstein Confusion target


Free Full Text From Publisher 39

References Related records

[CITATION] VISUAL TRANSFER FOR REINFORCEMENT LEARNING VIA GRADIENT PENALTY BASED WASSERSTEIN DOMAIN CONFUSION

X Zhu, R Zhang, T Huang… - JOURNAL OF …, 2022 - BIEMDAS ACAD PUBLISHERS INC …


2022 see 2021  Cover Image

A data-driven scheduling model of virtual power plant using Wasserstein distributionally robust optimization

by Liu, Huichuan; Qiu, Jing; Zhao, Junhua

International journal of electrical power & energy systems, 05/2022, Volume 137

•A data-driven Wasserstein distributionally robust optimization model is proposed. •The day-ahead scheduling decision of VPP can be solved by off-the-shelf...
Related articles
 All 2 versions

A data-driven scheduling model of virtual power plant using Wasserstein distributionally robust optimization

Liu, HC; Qiu, J and Zhao, JH

May 2022 | INTERNATIONAL JOURNAL OF ELECTRICAL POWER & ENERGY SYSTEMS 137

Distributed energy resources (DER) can be efficiently aggregated by aggregators to sell excessive electricity to spot market in the form of Virtual Power Plant (VPP). The aggregator schedules DER within VPP to participate in day-ahead market for maximizing its profits while keeping the static operating envelope provided by distribution system operator (DSO) in real-time operation. Aggregator, h


Full Text at Publisher  27  References  Related records

 Cited by 3 Related articles


Measuring phase-amplitude coupling between neural oscillations of different frequencies via the Wasserstein distance

Ohki, T

May 15 2022 | JOURNAL OF NEUROSCIENCE METHODS 374

Background: Phase-amplitude coupling (PAC) is a key neuronal mechanism. Here, a novel method for quantifying PAC via the Wasserstein distance is presented.New method: The Wasserstein distance is an optimization algorithm for minimizing transportation cost and distance. For the first time, the author has applied this distance function to quantify PAC and named the Wasserstein Modulation Index (w


View full text View Associated Data 57 References Related records

Research article
Measuring phase-amplitude coupling between neural oscillations of different frequencies via the Wasserstein distance
Journal of Neuroscience Methods, 23 March 2022...

Takefumi Ohki

Cited by 3 Related articles All 3 versions
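Several of the entries above (e.g., the Wasserstein Modulation Index) rest on the one-dimensional Wasserstein distance between empirical distributions, which reduces to a quantile matching. A minimal sketch in Python, assuming equal sample sizes; the function name wasserstein_1d is mine, not the paper's:

import numpy as np

def wasserstein_1d(x, y, p=1):
    # For sorted 1-D samples of equal length, the optimal coupling is the
    # monotone (quantile) matching, so W_p reduces to a mean of |x_(i)-y_(i)|^p.
    x = np.sort(np.asarray(x, dtype=float))
    y = np.sort(np.asarray(y, dtype=float))
    assert x.shape == y.shape, "this sketch assumes equal sample sizes"
    return np.mean(np.abs(x - y) ** p) ** (1.0 / p)

# toy usage: distance between two empirical amplitude distributions
rng = np.random.default_rng(0)
print(wasserstein_1d(rng.normal(0.0, 1.0, 1000), rng.normal(0.5, 1.2, 1000)))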


 
2022 see 2021

Dynamic Topological Data Analysis for Brain Networks via Wasserstein Graph Clustering

MK Chung, SG Huang… - arXiv preprint …, 2022 - arxiv-export-lb.library.cornell.edu

… the novel Wasserstein graph clustering for dynamically changing graphs. The Wasserstein 

clustering penalizes the topological discrepancy between graphs. The Wasserstein clustering …


Research article

A 3D reconstruction method of porous media based on improved WGAN-GP

Computers & Geosciences, 22 May 2022...

Ting Zhang, Qingyang Liu, Yi Du

Related articles
Scholarly Journal  Citation/Abstract

A 3D reconstruction method of porous media based on improved WGAN-GP
Zhang, Ting; Liu, Qingyang; Wang, Xianwu; Ji, Xin; Du, Yi.
Computers & geosciences Vol. 165,  (Aug 2022).

Abstract/Details 

Cited by 1 Related articles

2022
  

[PDF] arxiv.org

Wasserstein Steepest Descent Flows of Discrepancies with Riesz Kernels

J Hertrich, M Gräf, R Beinert, G Steidl - arXiv preprint arXiv:2211.01804, 2022 - arxiv.org

Wasserstein tangent space, we first introduce Wasserstein steepest descent flows. These 

are locally absolutely continuous curves in the WassersteinWasserstein spaces as geodesic

Cited by 3 Related articles All 2 versions

Research article
A novel virtual sample generation method based on a modified conditional Wasserstein GAN to address the small sample size problem in soft sensing
Journal of Process Control, 1 April 2022...

Yan-Lin He, Xing-Yuan Li, Qun-Xiong Zh

[PDF] arxiv.org

Optimal 1-Wasserstein Distance for WGANs

A Stéphanovitch, U Tanielian, B Cadre… - arXiv preprint arXiv …, 2022 - arxiv.org

The mathematical forces at work behind Generative Adversarial Networks raise challenging

theoretical issues. Motivated by the important question of characterizing the geometrical …

 Related articles All 2 versions 


[PDF] lpsm.paris

[PDF] OPTIMAL 1-WASSERSTEIN DISTANCE FOR WGANS BY ARTHUR STÉPHANOVITCH, UGO TANIELIAN, BENOÎT CADRE, NICOLAS KLUTCHNIKOFF …

A STÉPHANOVITCH - perso.lpsm.paris

The mathematical forces at work behind Generative Adversarial Networks raise challenging

theoretical issues. Motivated by the important question of characterizing the geometrical …


ARTICLE

Shedding a PAC-Bayesian Light on Adaptive Sliced-Wasserstein Distances

Ohana, Ruben ; Nadjahi, Kimia ; Rakotomamonjy, Alain ; Ralaivola, Liva. arXiv.org, 2022

OPEN ACCESS

Shedding a PAC-Bayesian Light on Adaptive Sliced-Wasserstein Distances

Available Online 

arXiv:2206.03230  [pdf, other]  stat.ML  cs.LG
Shedding a PAC-Bayesian Light on Adaptive Sliced-Wasserstein Distances
Authors: Ruben Ohana, Kimia Nadjahi, Alain Rakotomamonjy, Liva Ralaivola
Abstract: The Sliced-Wasserstein distance (SW) is a computationally efficient and theoretically grounded alternative to the Wasserstein distance. Yet, the literature on its statistical properties with respect to the distribution of slices, beyond the uniform measure, is scarce. To bring new contributions to this line of research, we leverage the PAC-Bayesian theory and the central observation that SW actual…  More
Submitted 7 June, 2022; originally announced June 2022.

arXiv:2206.01984  [pdf, other]  cs.LG  eess.SP  math.GT
Geodesic Properties of a Generalized Wasserstein Embedding for Time Series Analysis
Authors: Shiying Li, Abu Hasnat Mohammad Rubaiyat, Gustavo K. Rohde
Abstract: Transport-based metrics and related embeddings (transforms) have recently been used to model signal classes where nonlinear structures or variations are present. In this paper, we study the geodesic properties of time series data with a generalized Wasserstein metric and the geometry related to their signed cumulative distribution transforms in the embedding space. Moreover, we show how understand…  More
Submitted 4 June, 2022; originally announced June 2022.

Related articles All 2 versions 

[PDF] huan-zhang.com

DATA AUGMENTATION VIA WASSERSTEIN GEODESIC PERTURBATION FOR ROBUST ELECTROCARDIOGRAM PREDICTION

J Zhu, J Qiu, Z Yang, M Rosenberg, E Liu, B Li, D Zhao - download.huan-zhang.com

… We perturb the data distribution towards other classes along the geodesic in a

Wasserstein space. Also, the ground metric of this Wasserstein space is computed via a set of …

<——2022———2022———670— 


arXiv:2206.01778  [pdf, other]  math.PR  math.AP
A probabilistic approach to vanishing viscosity for PDEs on the Wasserstein space
Authors: Ludovic Tangpi
Abstract: In this work we prove an analogue, for partial differential equations on the space of probability measures, of the classical vanishing viscosity result known for equations on the Euclidean space. Our result allows in particular to show that the value function arising in various problems of classical mechanics and games can be obtained as the limiting case of second order PDEs. The method of proof… 
More
Submitted 3 June, 2022; originally announced June 2022.

arXiv:2206.01496  [pdf]  cs.LG  cs.AI  stat.ME  doi: 10.5121/ijaia.2022.13301
Causality Learning With Wasserstein Generative Adversarial Networks
Authors: Hristo Petkov, Colin Hanley, Feng Dong
Abstract: Conventional methods for causal structure learning from data face significant challenges due to combinatorial search space. Recently, the problem has been formulated into a continuous optimization framework with an acyclicity constraint to learn Directed Acyclic Graphs (DAGs). Such a framework allows the utilization of deep generative models for causal structure learning to better capture the rela… 
More
Submitted 3 June, 2022; originally announced June 2022.
Comments: arXiv admin note: substantial text overlap with arXiv:2204.00387
Related articles All 3 versions 


2022 see 2021  arXiv:2206.01432  [pdf, other]  cs.LG  cs.DC
On the Generalization of Wasserstein Robust Federated Learning
Authors: Tung-Anh Nguyen, Tuan Dung Nguyen, Long Tan Le, Canh T. Dinh, Nguyen H. Tran
Abstract: In federated learning, participating clients typically possess non-i.i.d. data, posing a significant challenge to generalization to unseen distributions. To address this, we propose a Wasserstein distributionally robust optimization scheme called WAFL. Leveraging its duality, we frame WAFL as an empirical surrogate risk minimization problem, and solve it using a local SGD-based algorithm with conv… 
More
Submitted 3 June, 2022; originally announced June 2022.
All 2 versions 

ARTICLE

On the Generalization of Wasserstein Robust Federated Learning

Tung-Anh Nguyen ; Nguyen, Tuan Dung ; Long Tan Le ; Dinh, Canh T ; Tran, Nguyen H. arXiv.org, 2022

OPEN ACCESS

On the Generalization of Wasserstein Robust Federated Learning

Available Online 

Cited by 1 Related articles All 2 versions 


ARTICLE

Data-Driven Chance Constrained Programs over Wasserstein Balls

Zhi Chen ; Daniel Kuhn ; Wolfram Wiesemann. arXiv.org, 2022

OPEN ACCESS

Data-Driven Chance Constrained Programs over Wasserstein Balls

Available Online 

Working Paper  Full Text

Data-Driven Chance Constrained Programs over Wasserstein Balls

Chen, Zhi; Kuhn, Daniel; Wiesemann, Wolfram.arXiv.org; Ithaca, Jan 6, 2022.


arXiv:2206.00231  [pdf, ps, other]  math.OC
On Approximations of Data-Driven Chance Constrained Programs over Wasserstein Balls
Authors: Zhi Chen, Daniel Kuhn, Wolfram Wiesemann
Abstract: Distributionally robust chance constrained programs minimize a deterministic cost function subject to the satisfaction of one or more safety conditions with high probability, given that the probability distribution of the uncertain problem parameters affecting the safety condition(s) is only known to belong to some ambiguity set. We study two popular approximation schemes for distributionally robu… 
More
Submitted 1 June, 2022; originally announced June 2022.
Comments: arXiv admin note: substantial text overlap with arXiv:1809.00210
Cited by 101 Related articles All 7 versions
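For reference, the distributionally robust chance constrained template that this line of work studies can be sketched as follows (a generic formulation under my notation, not the paper's exact model):

\[
\mathbb{B}_{\varepsilon}(\widehat{\mathbb{P}}_N)=\bigl\{\mathbb{Q}:\,W(\mathbb{Q},\widehat{\mathbb{P}}_N)\le\varepsilon\bigr\},
\qquad
\min_{x\in\mathcal{X}}\ c^{\top}x
\quad\text{s.t.}\quad
\inf_{\mathbb{Q}\in\mathbb{B}_{\varepsilon}(\widehat{\mathbb{P}}_N)}\ \mathbb{Q}\bigl[g(x,\xi)\le 0\bigr]\ \ge\ 1-\alpha,
\]

i.e., the safety condition \(g(x,\xi)\le 0\) must hold with probability at least \(1-\alpha\) under every distribution in the Wasserstein ball around the empirical distribution.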


arXiv:2206.00156  [pdf, other]  math.PR  math.ST
Distributional Convergence of the Sliced Wasserstein Process
Authors: Jiaqi Xi, Jonathan Niles-Weed
Abstract: Motivated by the statistical and computational challenges of computing Wasserstein distances in high-dimensional contexts, machine learning researchers have defined modified Wasserstein distances based on computing distances between one-dimensional projections of the measures. Different choices of how to aggregate these projected distances (averaging, random sampling, maximizing) give rise to diff… 
More
Submitted 31 May, 2022; originally announced June 2022.

2022


arXiv:2205.15902  [pdf, other]  stat.ML  cs.LG  math.ST

Variational inference via Wasserstein gradient flows
Authors: Marc Lambert, Sinho Chewi, Francis Bach, Silvère Bonnabel, Philippe Rigollet
Abstract: Along with Markov chain Monte Carlo (MCMC) methods, variational inference (VI) has emerged as a central computational approach to large-scale Bayesian inference. Rather than sampling from the true posterior $\pi$, VI aims at producing a simple but effective approximation $\hat\pi$ for which summary statistics are easy to compute. However, unlike the well-studied MCMC methodology, VI is still p…  More
Submitted 31 May, 2022; originally announced May 2022.
Comments: 52 pages, 15 figures
Cited by 1
 Related articles All 2 versions 
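The Wasserstein gradient flow machinery referenced above is usually introduced through the JKO time discretization; a minimal sketch for a KL objective (the paper's specific parameterization, e.g. a restriction to Gaussian families, is not reproduced here):

\[
\rho_{k+1}\in\operatorname*{arg\,min}_{\rho\in\mathcal{P}_2(\mathbb{R}^d)}\ \Bigl\{\mathrm{KL}(\rho\,\|\,\pi)+\tfrac{1}{2\tau}\,W_2^2(\rho,\rho_k)\Bigr\},
\]

whose limit as the step size \(\tau\to 0\) recovers the Wasserstein gradient flow of \(\mathrm{KL}(\cdot\,\|\,\pi)\).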


arXiv:2205.15721  [pdf, other]  cs.CV  cs.IR  cs.LG
One Loss for Quantization: Deep Hashing with Discrete Wasserstein Distributional Matching
Authors: Khoa D. Doan, Peng Yang, Ping Li
Abstract: Image hashing is a principled approximate nearest neighbor approach to find similar items to a query in a large collection of images. Hashing aims to learn a binary-output function that maps an image to a binary vector. For optimal retrieval performance, producing balanced hash codes with low-quantization error to bridge the gap between the learning stage's continuous relaxation and the inference…  More
Submitted 31 May, 2022; originally announced May 2022.
Comments: CVPR 2022

WEB RESOURCE

Decentralized Computation of Wasserstein Barycenter over Time-Varying Networks

Yufereva, Olga ; Persiianov, Michael ; Dvurechensky, Pavel ; Gasnikov, Alexander ; Kovalev, Dmitry. 2022

Decentralized Computation of Wasserstein Barycenter over Time-Varying Networks

No Online Access 

arXiv:2205.15669  [pdf, other]  math.OC
Decentralized Computation of Wasserstein Barycenter over Time-Varying Networks
Authors: Olga Yufereva, Michael Persiianov, Pavel Dvurechensky, Alexander Gasnikov, Dmitry Kovalev
Abstract: Inspired by recent advances in distributed algorithms for approximating Wasserstein barycenters, we propose a novel distributed algorithm for this problem. The main novelty is that we consider time-varying computational networks, which are motivated by examples when only a subset of sensors can make an observation at each time step, and yet, the goal is to average signals (e.g., satellite pictures… 
More
Submitted 31 May, 2022; originally announced May 2022.
 Related articles All 3 versions 


arXiv:2205.14624  [pdf, ps, other]  math.ST
Central limit theorem for the Sliced 1-Wasserstein distance and the max-Sliced 1-Wasserstein distance
Authors: Xianliang Xu, Zhongyi Huang
Abstract: The Wasserstein distance has been an attractive tool in many fields. But due to its high computational complexity and the phenomenon of the curse of dimensionality in empirical estimation, various extensions of the Wasserstein distance have been proposed to overcome the shortcomings such as the Sliced Wasserstein distance. It enjoys a low computational cost and dimension-free sample complexity, bu… 
More
Submitted 29 May, 2022; originally announced May 2022.
Comments: 36 pages
Cited by 1
 Related articles All 2 versions 
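Since several entries above concern the sliced Wasserstein distance, here is a minimal Monte Carlo sketch in Python (function name and defaults are illustrative, not taken from any of the cited papers):

import numpy as np

def sliced_wasserstein(X, Y, n_projections=100, p=1, seed=0):
    # Monte Carlo estimate of the sliced p-Wasserstein distance between two
    # point clouds in R^d; each random direction gives a 1-D projection whose
    # p-Wasserstein distance has a closed form via sorted projections.
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_projections):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)
        xp, yp = np.sort(X @ theta), np.sort(Y @ theta)
        total += np.mean(np.abs(xp - yp) ** p)   # assumes equal sample sizes
    return (total / n_projections) ** (1.0 / p)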


arXiv:2205.13573  [pdf, other]  cs.LG  stat.ME  stat.ML
Efficient Approximation of Gromov-Wasserstein Distance using Importance Sparsification
Authors: Mengyu Li, Jun Yu, Hongteng Xu, Cheng Meng
Abstract: As a valid metric of metric-measure spaces, Gromov-Wasserstein (GW) distance has shown the potential for the matching problems of structured data like point clouds and graphs. However, its application in practice is limited due to its high computational complexity. To overcome this challenge, we propose a novel importance sparsification method, called Spar-GW, to approximate GW distance efficientl…  More
Submitted 26 May, 2022; originally announced May 2022.
Comments: 24 pages, 7 figures

Related articles All 2 versions 
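For context, the (2-)Gromov-Wasserstein distance that Spar-GW approximates compares two metric-measure spaces through couplings of their measures:

\[
\mathrm{GW}_2^2\bigl((X,d_X,\mu),(Y,d_Y,\nu)\bigr)
=\min_{\pi\in\Pi(\mu,\nu)}\iint\bigl|d_X(x,x')-d_Y(y,y')\bigr|^{2}\,\mathrm{d}\pi(x,y)\,\mathrm{d}\pi(x',y'),
\]

a quadratic program in the coupling \(\pi\), which is the source of the high computational cost mentioned in the abstract.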

ARTICLE

Efficient Approximation of Gromov-Wasserstein Distance using Importance Sparsification

Li, Mengyu ; Yu, Jun ; Xu, Hongteng ; Cheng, Meng. arXiv.org, 2022

OPEN ACCESS

Efficient Approximation of Gromov-Wasserstein Distance using Importance Sparsification

Available Online 

Cited by 1 Related articles All 4 versions

<——2022———2022———680—

Mourrat, Jean-Christophe

The Parisi formula is a Hamilton-Jacobi equation in Wasserstein space. (English) Zbl 07535211

Can. J. Math. 74, No. 3, 607-629 (2022).

MSC:  82B44 82D30

PDF BibTeX XML Cite

Full Text: DOI 

  OpenURL 

Zbl 07535211

Mei, YuLiu, JiaChen, Zhiping

Distributionally robust second-order stochastic dominance constrained optimization with Wasserstein ball. (English) Zbl 07534670

SIAM J. Optim. 32, No. 2, 715-738 (2022).

MSC:  90C15 91B70 90C31 90-08

PDF BibTeX XML Cite

Full Text: DOI     OpenURL 
Cited by 1 Related articles All 4 versions


Working Paper  Full Text

Bures-Wasserstein geometry for positive-definite Hermitian matrices and their trace-one subset
Jesse van Oostrum.
arXiv.org; Ithaca, Sep 24, 2022.
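The Bures-Wasserstein geometry above is tied to the closed-form 2-Wasserstein distance between Gaussian measures. A minimal Python sketch of that closed form (the helper name is mine; the paper itself works with Hermitian matrices and their trace-one subset, which this sketch does not cover):

import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2(mean1, cov1, mean2, cov2):
    # Closed-form 2-Wasserstein distance between N(mean1, cov1) and N(mean2, cov2);
    # the covariance term is the squared Bures distance between cov1 and cov2.
    s1 = sqrtm(cov1)
    cross = sqrtm(s1 @ cov2 @ s1)
    bures2 = np.trace(cov1 + cov2 - 2.0 * np.real(cross))
    mean_term = np.sum((np.asarray(mean1) - np.asarray(mean2)) ** 2)
    return float(np.sqrt(mean_term + max(float(np.real(bures2)), 0.0)))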



2022 see 2021 Cover Image

Linear and Deep Order-Preserving Wasserstein Discriminant Analysis

by Su, Bing; Zhou, Jiahuan; Wen, Ji-Rong ; More...

IEEE transactions on pattern analysis and machine intelligence, 06/2022, Volume 44, Issue 6

Supervised dimensionality reduction for sequence data learns a transformation that maps the observations in sequences onto a low-dimensional subspace by...

Article PDFPDF

Journal Article  Full Text Online

View in Context Browse Journal

 Preview 

    

Cover Image

The Parisi formula is a Hamilton–Jacobi equation in Wasserstein space

by Mourrat, Jean-Christophe

Canadian journal of mathematics, 06/2022, Volume 74, Issue 3

The Parisi formula is a self-contained description of the infinite-volume limit of the free energy of mean-field spin glass models. We showthat this quantity...

Article PDFPDF

Journal Article  Full Text Online

View in Context Browse Journal

Cited by 15 Related articles All 7 versions

2022


2022 see 2021 Cover Image

Unbalanced optimal total variation transport problems and generalized Wasserstein barycenters

by Chung, Nhan-Phu; Trinh, Thanh-Son

Proceedings of the Royal Society of Edinburgh. Section A. Mathematics, 06/2022, Volume 152, Issue 3

In this paper, we establish a Kantorovich duality for unbalanced optimal total variation transport problems. As consequences, we recover a version of duality...

Journal Article  Full Text Online

View in Context Browse Journal   Preview 

  Related articles All 2 versions


Variational inference via Wasserstein gradient flows

by Lambert, Marc; Chewi, Sinho; Bach, Francis ; More...

05/2022

Along with Markov chain Monte Carlo (MCMC) methods, variational inference (VI) has emerged as a central computational approach to large-scale Bayesian...

Journal Article  Full Text Online

 Preview  Open Access

Cited by 3 Related articles All 2 versions

 

Distributional Convergence of the Sliced Wasserstein Process

by Xi, Jiaqi; Niles-Weed, Jonathan

05/2022

Motivated by the statistical and computational challenges of computing Wasserstein distances in high-dimensional contexts, machine learning researchers have...

Journal Article  Full Text Online

 Cited by 1 Related articles All 2 versions

 

Decentralized Computation of Wasserstein Barycenter over Time-Varying Networks

by Yufereva, Olga; Persiianov, Michael; Dvurechensky, Pavel ; More...

05/2022

Inspired by recent advances in distributed algorithms for approximating Wasserstein barycenters, we propose a novel distributed algorithm for this problem. The...

Journal Article  Full Text Online

 Related articles All 3 versions

 

On Approximations of Data-Driven Chance Constrained Programs over Wasserstein Balls

by Chen, Zhi; Kuhn, Daniel; Wiesemann, Wolfram

06/2022

Distributionally robust chance constrained programs minimize a deterministic cost function subject to the satisfaction of one or more safety conditions with...

Journal Article  Full Text Online

 Preview  arXiv

<——2022———2022———690 — 


One Loss for Quantization: Deep Hashing with Discrete Wasserstein Distributional Matching

by Doan, Khoa D; Yang, Peng; Li, Ping

05/2022

Image hashing is a principled approximate nearest neighbor approach to find similar items to a query in a large collection of images. Hashing aims to learn a...

Journal Article

Cited by 1 Related articles All 4 versions

Cover Image

Subexponential Upper and Lower Bounds in Wasserstein Distance for Markov...

by Sandrić, Nikola; Arapostathis, Ari; Pang, Guodong

Applied mathematics & optimization, 05/2022, Volume 85, Issue 3

In this article, relying on Foster–Lyapunov drift conditions, we establish subexponential upper and lower bounds on the rate of convergence in the L p...

ArticleView Article PDF

Journal Article 

 Full Text Online

More Options 

View Complete Issue Browse Now

 Preview 

Peer-Reviewed

 

Cover Image

Unbalanced optimal total variation transport problems and generalized Was...

by Chung, Nhan-Phu; Trinh, Thanh-Son

Proceedings of the Royal Society of Edinburgh. Section A. Mathematics, 06/2022, Volume 152, Issue 3

In this paper, we establish a Kantorovich duality for unbalanced optimal total variation transport problems. As consequences, we recover a version of duality...

Article Link Read Article

Journal Article  Full Text Online

View Complete Issue Browse Now

 Preview 

 Zbl 07535678

    

Decentralized Computation of Wasserstein Barycenter over...

by Yufereva, Olga; Persiianov, Michael; Dvurechensky, Pavel ; More...

05/2022

Inspired by recent advances in distributed algorithms for approximating Wasserstein barycenters, we propose a novel distributed algorithm for this problem. The...

Journal Article  Full Text Online

 Preview   Open Access

   

On Approximations of Data-Driven Chance Constrained Programs over Wa...

by Chen, Zhi; Kuhn, Daniel; Wiesemann, Wolfram

06/2022

Distributionally robust chance constrained programs minimize a deterministic cost function subject to the satisfaction of one or more safety conditions with...

Journal Article  Full Text Online

 Preview Open Access

Cited by 1 Related articles All 3 versions 

2022

   

One Loss for Quantization: Deep Hashing with Discrete Wasserstein...

by Doan, Khoa D; Yang, Peng; Li, Ping

05/2022

Image hashing is a principled approximate nearest neighbor approach to find similar items to a query in a large collection of images. Hashing aims to learn a...

Journal Article  Full Text Online

 Preview 

Cited by 6 Related articles All 4 versions

2022   ARTICLE

Exploring Predictive States via Cantor Embeddings and Wasserstein Distance

Loomis, Samuel P ; Crutchfield, James P. arXiv.org, 2022

OPEN ACCESS

Exploring Predictive States via Cantor Embeddings and Wasserstein Distance

Available Online 

Working Paper  Full Text

Exploring Predictive States via Cantor Embeddings and Wasserstein Distance

Loomis, Samuel P; Crutchfield, James P.

arXiv.org; Ithaca, Jun 9, 2022.

Abstract/DetailsGet full text
Related articles
 

MR4517244 


Working Paper  Full Text
Wasserstein Convergence for Empirical Measures of Subordinated Dirichlet Diffusions on Riemannian Manifolds
Li, Huaiqian; Wu, Bingyao.

arXiv.org; Ithaca, Jun 8, 2022.
Abstract/DetailsGet full text

Related articles All 3 versions

 2022 patent news
Univ Xidian Submits Chinese Patent Application for Radar HRRP Database Construction Method Based on WGAN...
Global IP News. Software Patent News, Jun 6, 2022
Newspaper Article  Full Text Online

Wire Feed  Full Text

Univ Xidian Submits Chinese Patent Application for Radar HRRP Database Construction Method Based on WGAN-GP
Global IP News. Software Patent News; New Delhi [New Delhi]. 06 June 2022. 

DetailsFull text


2022 see 2021  ARTICLE

Fixed-Support Wasserstein Barycenters: Computational Hardness and Fast Algorithm

Lin, Tianyi ; Ho, Nhat ; Chen, Xi ; Cuturi, Marco ; Jordan, Michael I. arXiv.org, 2022

OPEN ACCESS

Fixed-Support Wasserstein Barycenters: Computational Hardness and Fast Algorithm

Available Online 

Working Paper Full Text

Optimal control of the Fokker-Planck equation under state constraints in the Wasserstein space
Daudin, Samuel.

arXiv.org; Ithaca, Jun 6, 2022.

Abstract/DetailsGet full text
Cited by 42 Related articles All 9 versions


Working Paper  Full Text

Fixed-Support Wasserstein Barycenters: Computational Hardness and Fast Algorithm

Lin, Tianyi; Ho, Nhat; Chen, Xi; Cuturi, Marco; Jordan, Michael I.

arXiv.org; Ithaca, Jun 4, 2022.

Abstract/DetailsGet full text

<——2022———2022———700 — 



Scholarly Journal
Citation/Abstract

Unbalanced optimal total variation transport problems and generalized Wasserstein barycenters
Chung, Nhan-Phu; Trinh, Thanh-Son.
Proceedings. Section A, Mathematics - The Royal Society of Edinburgh; Cambridge Vol. 152, Iss. 3,  (Jun 2022): 674-700.

Abstract/Details 

Scholarly Journal  Citation/Abstract

The Parisi formula is a Hamilton–Jacobi equation in Wasserstein space
Jean-Christophe Mourrat.

Canadian Journal of Mathematics. Journal Canadien de Mathématiques; Toronto Vol. 74, Iss. 3,  (Jun 2022): 607-629.

Abstract/Details 

Cited by 15 Related articles All 7 versions
MR4430924


Working Paper  Full Text

Intrinsic Dimension Estimation Using Wasserstein Distances
Block, Adam; Jia, Zeyu; Polyanskiy, Yury; Rakhlin, Alexander.

arXiv.org; Ithaca, May 31, 2022.

Abstract/DetailsGet full text

Working Paper  Full Text

Wasserstein Distributionally Robust Optimization with Wasserstein Barycenters

Lau, Tim Tsz-Kit; Liu, Han.

arXiv.org; Ithaca, May 30, 2022.

Abstract/Deta

2022 see 2021

Linear and Deep Order-Preserving Wasserstein Discriminant Analysis

Su, B; Zhou, JH; (...); Wu, Y

Jun 1 2022 | IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE 44 (6) , pp.3123-3138

Enriched Cited References

Supervised dimensionality reduction for sequence data learns a transformation that maps the observations in sequences onto a low-dimensional subspace by maximizing the separability of sequences in different classes. It is typically more challenging than conventional dimensionality reduction for static data, because measuring the separability of sequences involves non-linear procedures to manipu


View full text 86  References  Related records


2022

The Parisi formula is a Hamilton-Jacobi equation in Wasserstein space

Mourrat, JC

Jun 2022 | CANADIAN JOURNAL OF MATHEMATICS-JOURNAL CANADIEN DE MATHEMATIQUES 74 (3) , pp.607-629

The Parisi formula is a self-contained description of the infinite-volume limit of the free energy of mean-field spin glass models. We showthat this quantity can be recast as the solution of a Hamilton-Jacobi equation in the Wasserstein space of probability measures on the positive half-line.

View full text 34 References Related records


[PDF] arxiv.org

  algorithm to calculate the Wasserstein-1 metric with O(N…

 Cited by 1 All 3 versions


Wasserstein distance based multi-scale adversarial domain adaptation method for remaining useful life prediction

H Shi, C Huang, X Zhang, J Zhao, S Li - Applied Intelligence, 2022 - Springer

… paper, the Wasserstein distance is used as a metric of distribution distance. The Wasserstein distance was … For two distributions $P_S$ and $P_T$, the Wasserstein-1 distance is defined as: …

Related articles
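The definition cut off in the snippet above is the standard Wasserstein-1 distance; for two distributions \(P_S\) and \(P_T\),

\[
W_1(P_S,P_T)=\inf_{\pi\in\Pi(P_S,P_T)}\mathbb{E}_{(x_s,x_t)\sim\pi}\bigl[\|x_s-x_t\|\bigr]
=\sup_{\|f\|_{\mathrm{Lip}}\le 1}\ \mathbb{E}_{P_S}[f(x)]-\mathbb{E}_{P_T}[f(x)],
\]

where the second expression is the Kantorovich-Rubinstein dual used in adversarial domain adaptation and WGAN-style critics.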

[PDF] arxiv.org

Fast Approximation of the Generalized Sliced-Wasserstein Distance

D Le, H Nguyen, K Nguyen, T Nguyen, N Ho - arXiv preprint arXiv …, 2022 - arxiv.org

… Wasserstein distance, sliced Wasserstein … Wasserstein distance and the conditional central limit theorem for Gaussian projections. We then present background on sliced Wasserstein …

 All 2 versions


Interval-valued functional clustering based on the Wasserstein distance with application to stock data

L Sun, L Zhu, W Li, C Zhang, T Balezentis - Information Sciences, 2022 - Elsevier

… Based on the Wasserstein distance, this study proposes an … Wasserstein distance is 

transformed into an interval-valued function, and the calculation of the interval function Wasserstein

Cited by 5 Related articles All 2 versions

<——2022———2022———710 — 


2022 see 2021 [PDF] arxiv.org

DeepParticle: learning invariant measure by a deep neural network minimizing Wasserstein distance on data generated from an interacting particle method

Z Wang, J Xin, Z Zhang - Journal of Computational Physics, 2022 - Elsevier

We introduce DeepParticle, a method to learn and generate invariant measures of stochastic 

dynamical systems with physical parameters based on data computed from an interacting …

Cited by 1 Related articles All 4 versions

Zbl 07540355  MR4428792


[PDF] arxiv.org

Exponential convergence in Wasserstein metric for distribution dependent SDEs

SQ Zhang - arXiv preprint arXiv:2203.05856, 2022 - arxiv.org

The existence and uniqueness of stationary distributions and the exponential convergence 

in $L^p$-Wasserstein distance are derived for distribution dependent SDEs from associated …

 All 2 versions


[PDF] arxiv.org

On the Generalization of Wasserstein Robust Federated Learning

TA Nguyen, TD Nguyen, LT Le, CT Dinh… - arXiv preprint arXiv …, 2022 - arxiv.org

… To address this, we propose a Wasserstein distributionally robust optimization scheme … the 

Wasserstein ball (ambiguity set). Since the center location and radius of the Wasserstein ball …

  Cited by 1 Related articles All 2 versions

2022 see 2023

[PDF] arxiv.org

Shedding a PAC-Bayesian Light on Adaptive Sliced-Wasserstein Distances

R Ohana, K Nadjahi, A Rakotomamonjy… - arXiv preprint arXiv …, 2022 - arxiv.org

… This work addresses the question of the generalization properties of adaptive Sliced-Wasserstein 

distances, ie Sliced-Wasserstein distances whose slice distribution may be different …

Related articles All 2 versions

[PDF] arxiv.org

From p-Wasserstein Bounds to Moderate Deviations

X Fang, Y Koike - arXiv preprint arXiv:2205.13307, 2022 - arxiv.org

… Abstract: We use a new method via p-Wasserstein bounds … -Wasserstein distance between

the distribution of the random variable (vector) of interest and a normal distribution grows like …

Related articles All 2 versions 


2022


Data-driven Wasserstein distributionally robust mitigation and recovery against random supply chain disruption

Y Cao, X Zhu, H Yan - Transportation Research Part E: Logistics and …, 2022 - Elsevier

… The Wasserstein ambiguity set is good for dealing with the scarcity of data. Thus, this paper

introduces the DRO with Wasserstein ambiguity set to the joint optimum of robust supply …

Data-driven Wasserstein distributionally robust mitigation and recovery against random supply chain disruption

Cao, YZ; Zhu, XY and Yan, HM

Jul 2022 | TRANSPORTATION RESEARCH PART E-LOGISTICS AND TRANSPORTATION REVIEW 163

This paper studies joint robust network design and recovery investment management in a pro-duction supply chain, considering limited historical data about disruptions and their possibilities. The supply chain is subject to uncertain disruptions that reduce production capacity at plants, and the cascading failures propagate along the supply chain network. A data-driven two-stage dis-tributionall

Full Text at Publisher

50 References Related records

[PDF] archives-ouvertes.fr

Measuring 3D-reconstruction quality in probabilistic volumetric maps with the Wasserstein Distance

S Aravecchia, A Richard, M Clausel, C Pradalier - 2022 - hal.archives-ouvertes.fr

… quality based directly on the voxels’ occupancy likelihood: the Wasserstein Distance. Finally,

we evaluate this Wasserstein Distance metric in simulation, under different level of noise in …


Computing Wasserstein-$p$ Distance Between Images with ...

https://ieeexplore.ieee.org › document

by Y Chen · 2022 — Computing Wasserstein-p Distance Between Images with Linear Cost; Date of Conference: 18-24 June 2022; Date Added to IEEE Xplore: 27 September 2022.

[PDF] thecvf.com

Computing Wasserstein-p Distance Between Images With Linear Cost

Y Chen, C Li, Z Lu - … of the IEEE/CVF Conference on …, 2022 - openaccess.thecvf.com

… 2 (right) compares the memory usage of different algorithms for computing the Wasserstein-1 distance, namely, we compute the Earth Mover Distance (EMD). The M3S algorithm shows …

Related articles


ARTICLE

Asymptotics of smoothed Wasserstein distances in the small noise regime

Ding, Yunzi ; Niles-Weed, JonathanarXiv.org, 2022

OPEN ACCESS

Asymptotics of smoothed Wasserstein distances in the small noise regime

Available Online 

arXiv:2206.06452  [pdf, other]  math.ST  math.PR
Asymptotics of smoothed Wasserstein distances in the small noise regime
Authors: Yunzi Ding, Jonathan Niles-Weed
Abstract: We study the behavior of the Wasserstein-2 distance between discrete measures $\mu$ and $\nu$ in $\mathbb{R}^d$ when both measures are smoothed by small amounts of Gaussian noise. This procedure, known as Gaussian-smoothed optimal transport, has recently attracted attention as a statistically attractive alternative to the unregularized Wasserstein distance. We give precise bounds on the approximatio…  More
Submitted 13 June, 2022; originally announced June 2022.
Comments: 26 pages, 2 figures
Asymptotics of smoothed Wasserstein distances in the small noise regime

by Ding, Yunzi; Niles-Weed, Jonathan

06/2022

We study the behavior of the Wasserstein-$2$ distance between discrete measures $\mu$ and $\nu$ in $\mathbb{R}^d$ when both measures are smoothed by small...
Cited by 10
 Related articles All 5 versions
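For reference, the Gaussian-smoothed optimal transport quantity studied above is obtained by convolving both measures with an isotropic Gaussian before measuring the Wasserstein distance:

\[
W_2^{(\sigma)}(\mu,\nu)\;:=\;W_2\bigl(\mu * \mathcal{N}(0,\sigma^2 I_d),\ \nu * \mathcal{N}(0,\sigma^2 I_d)\bigr),
\]

so the paper's question is how this quantity behaves as the noise level \(\sigma\) becomes small.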

arXiv:2206.05479  [pdf, ps, other]  math.PR
Ornstein-Uhlenbeck Type Processes on Wasserstein Space
Authors: Panpan Ren, Feng-Yu Wang
Abstract: The Wasserstein space $\mathcal{P}_2$ consists of square integrable probability measures on $\mathbb{R}^d$ and is equipped with the intrinsic Riemannian structure. By using stochastic analysis on the tangent space, we construct the Ornstein-Uhlenbeck (O-U) process on $\mathcal{P}_2$ whose generator is formulated as the intrinsic Laplacian with a drift. This process satisfies the log-Sobolev inequality and h…  More
Submitted 11 June, 2022; originally announced June 2022.

Ornstein-Uhlenbeck Type Processes on Wasserstein Space

All 2 versions 

<——2022———2022———720— 


Imaizumi, MasaakiOta, HirofumiHamaguchi, Takuo

Hypothesis test and confidence analysis with Wasserstein distance on general dimension. (English) Zbl 07541175

Neural Comput. 34, No. 6, 1448-1487 (2022).

MSC:  62-XX

PDF BibTeX XML Cite

Full Text: DOI 

Cited by 2 Related articles All 8 versions

Zbl 07541175

 

[HTML] sciencedirect.com

[HTML] ERP-WGAN: A data augmentation method for EEG single-trial detection

R Zhang, Y Zeng, L Tong, J Shu, R Lu, K Yang… - Journal of Neuroscience …, 2022 - Elsevier

Brain computer interaction based on EEG presents great potential and becomes the

research hotspots. However, the insufficient scale of EEG database limits the BCI system …

 Related articles All 3 versions
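Several of the surrounding entries build on WGAN-GP. A minimal Python/PyTorch sketch of the gradient penalty term (variable names are illustrative; this is the generic Gulrajani-style penalty, not any one cited paper's training code):

import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    # Penalize deviation of the critic's gradient norm from 1 on random
    # interpolates between real and generated samples (WGAN-GP).
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads, = torch.autograd.grad(scores.sum(), interp, create_graph=True)
    return lambda_gp * ((grads.flatten(1).norm(2, dim=1) - 1.0) ** 2).mean()

# critic loss for one batch (sketch):
# loss_D = critic(fake).mean() - critic(real).mean() + gradient_penalty(critic, real, fake)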

Speech Emotion Recognition on Small Sample Learning by Hybrid WGAN-LSTM Networks

C Sun, L Ji, H Zhong - Journal of Circuits, Systems and Computers, 2022 - World Scientific

The speech emotion recognition based on the deep networks on small samples is often a

very challenging problem in natural language processing. The massive parameters of a …

Cited by 1 Related articles


[PDF] cnjournals.com

[PDF] 基于 WGAN-GP 的微多普勒雷达人体动作识别

屈乐乐, 王禹桐 - 雷达科学与技术, 2022 - radarst.cnjournals.com

To address the limited amount of micro-Doppler radar data for human action recognition, this paper proposes a Wasserstein generative adversarial network with gradient penalty (WGAN-GP) for radar data augmentation, so that a deep convolutional neural network (DCNN) can, even with a small number of samples, …

 Related articles All 3 versions 

[Chinese  Human Action Recognition with Micro-Doppler Radar Based on WGAN-GP]

Cited by 1  Related articles All 3 versions 

2022 see 2021

The Wasserstein-Fourier Distance for Stationary Time Series
https://ieeexplore.ieee.org › document
by E Cazelles · 2020 · Cited by 7 — We propose the Wasserstein-Fourier (WF) distance to measure the (dis)similarity between time series by quantifying the displacement of their ...

Zbl 07591375

2022


arXiv:2206.07767  [pdf, other]  cs.LG
Kantorovich Strikes Back! Wasserstein GANs are not Optimal Transport?
Authors: Alexander Korotin, Alexander Kolesov, Evgeny Burnaev
Abstract: Wasserstein Generative Adversarial Networks (WGANs) are the popular generative models built on the theory of Optimal Transport (OT) and the Kantorovich duality. Despite the success of WGANs, it is still unclear how well the underlying OT dual solvers approximate the OT cost (Wasserstein-1 distance, $W_1$) and the OT gradient needed to update the generator. In this paper, we address thes…  More
Submitted 15 June, 2022; originally announced June 2022.

All 4 versions

Mean-Field Langevin Dynamics and application to regularized Wasserstein barycenters

Jun 9, 2022 — In this talk, motivated by the analysis of noisy gradient descent to compute grid-free regularized Wasserstein barycenters, we consider the « ...

Lenaïc Chizat (École polytechnique fédérale de Lausanne)


2022 see 2021
Robust W-GAN-Based Estimation Under Wasserstein ...

https://www.esi.ac.at › events

Po-Ling Loh (U Cambridge)

May 30, 2022 — Robust W-GAN-Based Estimation Under Wasserstein Contamination ... 2022, 14:00 — 14:45 ... Fabio Nobile (EPFL Lausanne)


Citation/Abstract

A Wasserstein GAN Autoencoder for SCMA Networks
Miuccio, Luciano; Panno, Daniela; Riolo, Salvatore.
IEEE Wireless Communications Letters; Piscataway Vol. 11, Iss. 6,  (2022): 1298-1302.
Abstract/Details
 
Cited by 3 Related articles


Scholarly Journal  Full Text

Distributionally Robust Multi-Energy Dynamic Optimal Power Flow Considering Water Spillage with Wasserstein Metric
Song, Gengli; Hua, Wei.

Energies; Basel Vol. 15, Iss. 11,  (2022): 3886.

Abstract/DetailsFull textFull text - PDF (881 KB)‎Related articles All 4 versions 
 Related articles All 4 versions 

<——2022———2022———730—


2022 see 2021  ARTICLE

A Normalized Gaussian Wasserstein Distance for Tiny Object Detection

Jinwang Wang ; Chang Xu ; Wen Yang ; Lei Yu. arXiv.org, 2022

OPEN ACCESS

A Normalized Gaussian Wasserstein Distance for Tiny Object Detection

Available Online 

Working Paper Full Text

A Normalized Gaussian Wasserstein Distance for Tiny Object Detection

Wang, Jinwang; Xu, Chang; Yang, Wen; Yu, Lei.

arXiv.org; Ithaca, Jun 14, 2022.

Abstract/DetailsGet full text


2022 see 2021

Wasserstein generative adversarial uncertainty quantification in physics-informed neural networks

Gao, YH and Ng, MK

Aug 15 2022 | JOURNAL OF COMPUTATIONAL PHYSICS 463

In this paper, we study a physics-informed algorithm for Wasserstein Generative Adversarial Networks (WGANs) for uncertainty quantification in solutions of partial differential equations. By using groupsort activation functions in adversarial network discriminators, network generators are utilized to learn the uncertainty in solutions of partial differential equations observed from the initial/


Free Submitted Article From RepositoryFull Text at Publisher

67 References  Related records 


Computed tomography image generation from magnetic resonance imaging using Wasserstein metric for MR-only radiation therapy

Joseph, J; Hemanth, C; (...); Puzhakkal, N

Jun 2022 (Early Access) | INTERNATIONAL JOURNAL OF IMAGING SYSTEMS AND TECHNOLOGY. Enriched Cited References

Magnetic resonance imaging (MRI) and computed tomography (CT) are the prevalent imaging techniques used in treatment planning in radiation therapy. Since MR-only radiation therapy planning (RTP) is needed in the future for new technologies like MR-LINAC (medical linear accelerator), MR to CT synthesis model benefits in CT synthesis from MR images generated via MR-LINAC. A Wasserstein generative


Full Text at Publisher

33 References  Related records


2022 see 2011

Geometric convergence bounds for Markov chains in Wasserstein distance based on generalized drift and contraction conditions

Qin, Q and Hobert, JP

May 2022 | ANNALES DE L INSTITUT HENRI POINCARE-PROBABILITES ET STATISTIQUES 58 (2) , pp.872-889

Let $(X_n)_{n=0}^{\infty}$ denote a Markov chain on a Polish space that has a stationary distribution $\pi$. This article concerns upper bounds on the Wasserstein distance between the distribution of $X_n$ and $\pi$. In particular, an explicit geometric bound on the distance to stationarity is derived using generalized drift and contraction conditions whose parameters vary across the state space. These n


Free Submitted Article From RepositoryFull Text at Publisher

21 References  Related records


   MR4441130 Prelim Niles-Weed, Jonathan; Berthet, Quentin; 

Minimax estimation of smooth densities in Wasserstein distance. Ann. Statist. 50 (2022), no. 3, 1519–1540.

Review PDF Clipboard Journal Article

Cited by 3 All 4 versions

2022


2022 see 2021

MR4438713 Prelim Candau-Tilh, Jules; Goldman, Michael; 

Existence and stability results for an isoperimetric problem with a non-local interaction of Wasserstein type. ESAIM Control Optim. Calc. Var. 28 (2022), Paper No. 37, 20 pp. 49Q20 (49Q05 49Q22)

Review PDF Clipboard Journal Article


2022 see 2021

MR4437647 Prelim Ghaderinezhad, Fatemeh; Ley, Christophe; Serrien, Ben; 

The Wasserstein Impact Measure (WIM): A practical tool for quantifying prior impact in Bayesian statistics. Comput. Statist. Data Anal. 174 (2022), 107352.

Review PDF Clipboard Journal Article

 Citations  26 References  Related records

2022 see 2021

MR4430959 Prelim Blanchet, Jose; Murthy, Karthyek; Si, Nian; 

Confidence regions in Wasserstein distributionally robust estimation. Biometrika 109 (2022), no. 2, 295–315.

Review PDF Clipboard Journal Article

MR4430947 Prelim Chung, Nhan-Phu; Trinh, Thanh-Son; Unbalanced optimal total variation transport problems and generalized Wasserstein barycenters. Proc. Roy. Soc. Edinburgh Sect. A 152 (2022), no. 3, 674–700. 49Q22 (49N15 58E30)

Review PDF Clipboard Journal Article

Related articles All 2 versions

MR4441177 Prelim Arqué, Ferran; Uribe, César A.; Ocampo-Martinez, Carlos; 

Approximate Wasserstein attraction flows for dynamic mass transport over networks. Automatica J. IFAC 143 (2022), Paper No. 110432.

Review PDF Clipboard Journ

Cited by 1 All 2 versions

Working Paper  Full Text

Approximate Wasserstein Attraction Flows for Dynamic Mass Transport over Networks

Arqué, Ferran; Uribe, César A; Ocampo-Martinez, Carlos.

arXiv.org; Ithaca, Apr 26, 2022.

Abstract/DetailsGet full text


 <——2022———2022———740— 


ARTICLE

Aligning individual brains with Fused Unbalanced Gromov-Wasserstein

Thual, Alexis ; Tran, Huy ; Zemskova, Tatiana ; Courty, Nicolas ; Flamary, Rémi ; Dehaene, Stanislas ; Thirion, Bertrand. arXiv.org, 2022

OPEN ACCESS

Aligning individual brains with Fused Unbalanced Gromov-Wasserstein

Available Online 

arXiv:2206.09398  [pdf, other]  q-bio.NC  stat.ML
Aligning individual brains with Fused Unbalanced Gromov-Wasserstein
Authors: Alexis Thual, Huy Tran, Tatiana Zemskova, Nicolas Courty, Rémi Flamary, Stanislas Dehaene, Bertrand Thirion
Abstract: Individual brains vary in both anatomy and functional organization, even within a given species. Inter-individual variability is a major impediment when trying to draw generalizable conclusions from neuroimaging data collected on groups of subjects. Current co-registration procedures rely on limited data, and thus lead to very coarse inter-subject alignments. In this work, we present a novel metho… 
More
Submitted 19 June, 2022; originally announced June 2022.
Cited by 1
Related articles All 4 versions

arXiv:2206.08780  [pdf, other]  stat.ML  cs.LG
Spherical Sliced-Wasserstein
Authors: Clément Bonet, Paul Berg, Nicolas Courty, François Septier, Lucas Drumetz, Minh-Tan Pham
Abstract: Many variants of the Wasserstein distance have been introduced to reduce its original computational burden. In particular the Sliced-Wasserstein distance (SW), which leverages one-dimensional projections for which a closed-form solution of the Wasserstein distance is available, has received a lot of interest. Yet, it is restricted to data living in Euclidean spaces, while the Wasserstein distance…  More
Submitted 17 June, 2022; originally announced June 2022.
Cited by 2 All 2 versions

[PDF] acm.org

Neural Subgraph Counting with Wasserstein Estimator

H Wang, R Hu, Y Zhang, L Qin, W Wang… - Proceedings of the 2022 …, 2022 - dl.acm.org

… Furthermore, we design a novel Wasserstein discriminator in WEst to minimize the Wasserstein 

distance between query and data graphs by updating the parameters in network with the …

 Cited by 1


Detecting tiny objects in aerial images: A normalized Wasserstein distance and a new benchmark

C Xu, J Wang, W Yang, H Yu, L Yu, GS Xia - ISPRS Journal of …, 2022 - Elsevier

… To tackle this problem, we propose a new evaluation metric dubbed Normalized Wasserstein 

Distance (NWD) and a new RanKing-based Assigning (RKA) strategy for tiny object …

 Cited by 2 All 5 versions

A novel bearing imbalance Fault-diagnosis method based on a Wasserstein conditional generative adversarial network

Y Peng, Y Wang, Y Shao - Measurement, 2022 - Elsevier

… fault diagnosis framework, the Wasserstein conditional generation adversarial network, based 

… The proposed framework unites Wasserstein loss and hierarchical feature matching loss, …

Cited by 4 Related articles All 2 versions

 

2022

[HTML] mdpi.com

Hyperspectral Anomaly Detection Based on Wasserstein Distance and Spatial Filtering

X Cheng, M Wen, C Gao, Y Wang - Remote Sensing, 2022 - mdpi.com

… This article proposes a hyperspectral AD method based on Wasserstein distance (WD) 

and spatial filtering (called AD-WDSF). Based on the assumption that both background and …

 Cited by 2 Related articles All 6 versions

[PDF] researchgate.net

[PDF] LIMITATIONS OF THE WASSERSTEIN MDE FOR UNIVARIATE DATA

YG Yatracos - 2022 - researchgate.net

… as tools the Kantorovich-Wasserstein distance, $W_p$, and the empirical distribution, $\hat{\mu}_n$, of the data; $n$ is the sample size, $p\in[1,\infty)$. The Wasserstein distance has been used extensively …

All 2 versions 

Zbl 07630671

Wasserstein distributionally robust planning model for renewable sources and energy storage systems under multiple uncertainties

J Li, Z Xu, H Liu, C Wang, L Wang… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org

… In this paper, a two-stage Wasserstein distributionally robust optimization (WDRO) model is

… the Wasserstein metric and historical data. Meanwhile, both 1-norm and ∞-norm Wasserstein …

Cited by 1 Related articles All 2 versions

  

2022 see 2021  [HTML] springer.com

[HTML] Primal dual methods for Wasserstein gradient flows

JA CarrilloK CraigWangC Wei - Foundations of Computational …, 2022 - Springer

… Next, we use the Benamou–Brenier dynamical characterization of the Wasserstein distance

to … We conclude with simulations of nonlinear PDEs and Wasserstein geodesics in one and …

Cited by 39 Related articles All 14 versions

Interval-valued functional clustering based on the Wasserstein distance with application to stock data

L Sun, L Zhu, W Li, C ZhangT Balezentis - Information Sciences, 2022 - Elsevier

… Based on the Wasserstein distance, this study proposes an … Wasserstein distance is

transformed into an interval-valued function, and the calculation of the interval function Wasserstein …

 Cited by 45 Related articles All 14 versions

Computing Wasserstein-p Distance Between Images With Linear Cost

Y Chen, C Li, Z Lu - … of the IEEE/CVF Conference on …, 2022 - openaccess.thecvf.com

… discrete measures, computing Wasserstein-p distance between … a novel algorithm to compute

the Wasserstein-p distance be… We compute Wasserstein-p distance, estimate the transport …

Related articles 

<——2022———2022———750—


New Findings from Swiss Federal Institute of Technology Lausanne (EPFL) Describe Advances in Signal and Information Processing (Wasserstein...

Network Business Weekly, 06/2022

Newsletter  Full Text Online


[PDF] acm.org

Neural Subgraph Counting with Wasserstein Estimator

H Wang, R Hu, Y Zhang, L Qin, W Wang… - Proceedings of the 2022 …, 2022 - dl.acm.org

… Furthermore, we design a novel Wasserstein discriminator in WEst to minimize the Wasserstein

distance between query and data graphs by updating the parameters in network with the …


Application of WGAN-GP in recommendation and Questioning the relevance of GAN-based approaches

H Ammar Khodja, O Boudjeniba - arXiv e-prints, 2022 - ui.adsabs.harvard.edu

… - Can we successfully apply WGAN-GP on recommendation … a recommender system based 

on WGAN-GP called CFWGAN… of significant advantage of using WGAN-GP instead of the …

Cited by 2 Related articles


2022 Working Paper  Full Text

Robust Q-learning Algorithm for Markov Decision Processes under Wasserstein Uncertainty
Neufeld, Ariel; Sester, Julian.
arXiv.org; Ithaca, Sep 30, 2022.
Abstract/DetailsGet full text
Link to external site, this link will open in a new window

All 4 versions 

Distributed particle filters via barycenters in 2-Wasserstein space

S Sheng - 2022 - dr.ntu.edu.sg

We develop a new version of distributed particle filters that exploits the novel theory of 2-Wasserstein barycenters. We consider a wireless sensor network deployed over a vast …

2022

[PDF] iop.org

A Strabismus Surgery Parameter Design Model with WGAN-GP Data Enhancement Method

R Tang, W Wang, Q Meng, S Liang… - Journal of Physics …, 2022 - iopscience.iop.org

Wasserstein distance [7] significantly improves the training convergence speed and training 

stability. Among them, WGAN-GP adds the Wasserstein … , this paper chose the WGAN-GP as …

 Related articles All 3 versions

[CITATION] Confidence regions in Wasserstein distributionally robust estimation (vol 109, pg 295, 2022)

W Li - BIOMETRIKA, 2022 - OXFORD UNIV PRESS GREAT …

 Related articles All 3 versions

[PDF] arxiv.org

Confidence regions in Wasserstein distributionally robust estimation

J Blanchet, K Murthy, N Si - Biometrika, 2022 - academic.oup.com

… the Wasserstein distance of order 2 by setting $W(P,Q) = \left\{D_c(P,Q)\right\}^{1/2}$. … In

 Cited by 24 Related articles All 7 versions

[PDF] infocomm-journal.com

基于 VAE-WGAN 的多维时间序列异常检测方法

段雪源, 付钰, 王坤 - 通信学报, 2022 - infocomm-journal.com

To solve the above problems, this paper proposes a semi-supervised multidimensional time-series anomaly detection method based on a VAE-WGAN architecture. A VAE is used as the generator, the discrimination results of the WGAN are used to adjust the distribution parameters of the VAE, and the Wasserstein distance is used as …

Related articles All 2 versions

[Chinese  Anomaly detection method for multidimensional time series based on VAE-WGAN]


[PDF] cnjournals.com

[PDF] 基于 WGAN-GP 的微多普勒雷达人体动作识别

屈乐乐, 王禹桐 - 雷达科学与技术, 2022 - radarst.cnjournals.com

… the Wasserstein generative adversarial network with gradient penalty (WGAN-GP) is used for radar data augmentation, so that a deep convolutional … GP is used to augment the time-frequency spectrogram image data, and finally the generated images are used to train the DCNN. Experimental results show that using …

Related articles All 3 versions

[Chinese  Human Action Recognition with Micro-Doppler Radar Based on WGAN-GP]


E 波段 3.5 W GaN 宽带高功率放大器 MMIC

戈勤, 陶洪琪, 王维波, 商德春, 刘仁福… - 固体电子学研究与进展, 2022 - cnki.com.cn

Based on an in-house 100 nm GaN high-electron-mobility transistor (HEMT) process, a broadband high-power amplifier chip covering the E band (60-90 GHz) was developed. The amplifier adopts common-source transistors with a dense via structure to reduce parasitic …

[Chinese  E-Band 3.5 W GaN Broadband High Power Amplifier MMIC]


Research on Face Image Restoration Based on Improved WGAN

F Liu, R Chen, S Duan, M Hao, Y Guo - International Conference on …, 2022 - Springer

This article focuses on the face recognition model in real life scenarios, because the possible 

occlusion affects the recognition effect of the model, resulting in a decline in the accuracy …

Related articles

<——2022———2022———760— 


VGAN: Generalizing MSE GAN and WGAN-GP for robot fault diagnosis

Z Pu, D Cabrera, C Li, JV de Oliveira - IEEE Intelligent Systems, 2022 - ieeexplore.ieee.org

… and WGAN-GP, referred to as VGAN. Within the framework of conditional Wasserstein GAN 

… including vanilla GAN, conditional WGAN with and without conventional regularization, and …

Related articles All 2 versions

[PDF] muni.cz

Entropic regularization of Wasserstein distance between infinite-dimensional Gaussian measures and Gaussian processes

HQ Minh - Journal of Theoretical Probability, 2022 - Springer

… the entropic regularization formulation of the 2-Wasserstein … generalization of the maximum 

entropy property of Gaussian … optimal entropic transport plan, entropic 2-Wasserstein distance…

Cited by 4 Related articles All 4 versions

 Zbl 07686371
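For context, the entropic regularization discussed above replaces the exact optimal transport problem by a strictly convex one; for the quadratic cost,

\[
\mathrm{OT}_{\varepsilon}(\mu,\nu)=\min_{\pi\in\Pi(\mu,\nu)}\int\|x-y\|^{2}\,\mathrm{d}\pi(x,y)\;+\;\varepsilon\,\mathrm{KL}\bigl(\pi\,\|\,\mu\otimes\nu\bigr),
\]

and the entry above concerns this entropic 2-Wasserstein quantity (and the optimal entropic transport plan) for infinite-dimensional Gaussian measures and Gaussian processes.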


Approximate Wasserstein attraction flows for dynamic mass transport over networks

F Arqué, CA Uribe, C Ocampo-Martinez - Automatica, 2022 - Elsevier

… Initially, we present the entropy regularized discrete JKO flow for the WA problem following 

the ideas introduced in Peyré (2015). The main contribution in Peyré (2015) is to replace the …

Cited by 1 All 2 versions

Zbl 07563637
Tweets with replies by César A. Uribe (@CesarAUribe) / Twitter

mobile.twitter.com › cesarauribe › with_replies

Editor-in-Chief Andrew Teel - 'Approximate Wasserstein attraction flows for dynamic mass transport over ...

Twitter · 

Jun 3, 2022


Representing graphs via Gromov-Wasserstein factorization

H XuJ LiuD LuoL Carin - IEEE Transactions on Pattern …, 2022 - ieeexplore.ieee.org

… To solve this problem efficiently, we unroll the loopy … we aim at designing an interpretable and

effective method to … p = 2, we obtain the proposed Gromov-Wasserstein factorization model …

 Cited by 1 All 6 versions



arXiv:2206.13996  [pdf, other]  cs.CV  doi: 10.1016/j.isprsjprs.2022.06.002
Detecting tiny objects in aerial images: A normalized Wasserstein distance and a new benchmark
Authors: Chang Xu, Jinwang Wang, Wen Yang, Huai Yu, Lei Yu, Gui-Song Xia
Abstract: Tiny object detection (TOD) in aerial images is challenging since a tiny object only contains a few pixels. State-of-the-art object detectors do not provide satisfactory results on tiny objects due to the lack of supervision from discriminative features. Our key observation is that the Intersection over Union (IoU) metric and its extensions are very sensitive to the location deviation of the tiny… 
More
Submitted 28 June, 2022; originally announced June 2022.
Comments: Accepted by ISPRS Journal of Photogrammetry and Remote Sensing
Journal ref: ISPRS Journal of Photogrammetry and Remote Sensing (2022) 190:79-93
Journal Article  Full Text Online

Cited by 1 All 5 versions

2022


arXiv:2206.13269  [pdf, ps, other]  stat.ML  cs.IT  cs.LG  math.OC
The Performance of Wasserstein Distributionally Robust M-Estimators in High Dimensions
Authors: Liviu Aolaritei, Soroosh Shafieezadeh-Abadeh, Florian Dörfler
Abstract: Wasserstein distributionally robust optimization has recently emerged as a powerful framework for robust estimation, enjoying good out-of-sample performance guarantees, well-understood regularization effects, and computationally tractable dual reformulations. In such framework, the estimator is obtained by minimizing the worst-case expected loss over all probability distributions which are close,… 
More
Submitted 27 June, 2022; originally announced June 2022.
 All 2 versions 
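
Note: the Wasserstein distributionally robust estimation problem described in this abstract is commonly written as the min-max program below (a generic formulation for orientation; the loss $\ell$, the radius $\varepsilon$ and the order of the Wasserstein distance are placeholders, not details from the paper):

$$ \min_{\theta}\; \sup_{Q:\,W(Q,\widehat{P}_n)\le \varepsilon}\; \mathbb{E}_{\xi\sim Q}\big[\ell(\theta;\xi)\big], $$

where $\widehat{P}_n$ is the empirical distribution of the observed data.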


arXiv:2206.12768  [pdf, other]  math.ST  stat.ML
The Sketched Wasserstein Distance for mixture distributions
Authors: Xin Bing, Florentina Bunea, Jonathan Niles-Weed
Abstract: The Sketched Wasserstein Distance ($W_S$) is a new probability distance specifically tailored to finite mixture distributions. Given any metric $d$ defined on a set $\mathcal{A}$ of probability distributions, $W_S$ is defined to be the most discriminative convex extension of this metric to the space $\mathcal{S} = \mathrm{conv}(\mathcal{A})$ of mixtures of elements of $\mathcal{A}$. Our representa…
More
Submitted 25 June, 2022; originally announced June 2022.
All 3 versions
 


arXiv:2206.12690  [pdf, other]  cs.CE
ECG Classification based on Wasserstein Scalar Curvature
Authors: Fupeng Sun, Yin Ni, Yihao Luo, Huafei Sun
Abstract: Electrocardiograms (ECG) analysis is one of the most important ways to diagnose heart disease. This paper proposes an efficient ECG classification method based on Wasserstein scalar curvature to comprehend the connection between heart disease and mathematical characteristics of ECG. The newly proposed method converts an ECG into a point cloud on the family of Gaussian distribution, where the patho… 
More
Submitted 25 June, 2022; originally announced June 2022.
 All 2 versions 


arXiv:2206.12116  [pdf, other]  stat.ML  cs.AI  cs.LG
Approximating 1-Wasserstein Distance with Trees
Authors: Makoto Yamada, Yuki Takezawa, Ryoma Sato, Han Bao, Zornitsa Kozareva, Sujith Ravi
Abstract: Wasserstein distance, which measures the discrepancy between distributions, shows efficacy in various types of natural language processing (NLP) and computer vision (CV) applications. One of the challenges in estimating Wasserstein distance is that it is computationally expensive and does not scale well for many distribution comparison tasks. In this paper, we aim to approximate the 1-Wasserstein…  More
Submitted 24 June, 2022; originally announced June 2022.
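
Note: tree-based approximations of the 1-Wasserstein distance, as in the entry above, usually rely on the closed form of optimal transport on a tree metric (standard tree-Wasserstein formula, stated here for reference; the notation is generic and not taken from the paper):

$$ W_{d_T}(\mu,\nu) \;=\; \sum_{e\in E} w_e\,\big|\mu(\Gamma(v_e)) - \nu(\Gamma(v_e))\big|, $$

where $w_e$ is the length of edge $e$ and $\Gamma(v_e)$ is the set of points lying in the subtree rooted at the child node $v_e$.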

Niles-Weed, Jonathan; Berthet, Quentin

Minimax estimation of smooth densities in Wasserstein distance. (English) Zbl 07547940

Ann. Stat. 50, No. 3, 1519-1540 (2022).

MSC:  62F99 62H99

PDF BibTeX XML Cite

Zbl 07547940

Cited by 4 Related articles All 4 versions

<——2022———2022———770— 


Candau-Tilh, Jules; Goldman, Michael

Existence and stability results for an isoperimetric problem with a non-local interaction of Wasserstein type. (English) Zbl 07547807

ESAIM, Control Optim. Calc. Var. 28, Paper No. 37, 20 p. (2022).

MSC:  49Q05 49Q20 49Q22

PDF BibTeX XML Cite

Full Text: DOI  


Blanchet, Jose; Murthy, Karthyek; Si, Nian

Confidence regions in Wasserstein distributionally robust estimation. (English) Zbl 07543325

Biometrika 109, No. 2, 295-315 (2022); erratum ibid. 109, No. 2, 567 (2022).

MSC:  62-XX

PDF BibTeX XML Cite

Full Text: DOI 

Cited by 28 Related articles All 8 versions
[CITATION] Confidence regions in Wasserstein distributionally robust estimation (vol 109, pg 295, 2022)

W Li - BIOMETRIKA, 2022 - OXFORD UNIV PRESS GREAT …

Cited by 32 Related articles All 8 versions

2022 see 2021

 Local well-posedness in the Wasserstein space for a chemotaxis model coupled to incompressible...
by Kang, Kyungkeun; Kim, Hwa Kil
Zeitschrift für angewandte Mathematik und Physik, 06/2022, Volume 73, Issue 4
We consider a coupled system of Keller–Segel-type equations and the incompressible Navier–Stokes equations in spatial dimension two and three. In the previous...
Journal Article  Full Text Online

MR4444516

Rectified Wasserstein Generative Adversarial Networks for Perceptual Image Restoration
by Ma, Haichuan; Liu, Dong; Wu, Feng
IEEE transactions on pattern analysis and machine intelligence, 06/2022, Volume PP
Article PDFPDF

Journal Article  Full Text Online

 

[PDF] mdpi.com

The “Unreasonable” Effectiveness of the Wasserstein Distance in Analyzing Key Performance Indicators of a Network of Stores

A Ponti, I Giordani, M Mistri, A Candelieri… - Big Data and Cognitive …, 2022 - mdpi.com

… balls, and in [27], which proposed a distributionally robust two-stage Wasserstein … Wasserstein

distance in management science. A management topic where the Wasserstein distance

Cited by 1 Related articles All 2 versions


2022


2022 patent

North China Electric Power Univ Baoding Submits Patent Application for New Energy Capacity Configuration Method Based on WGAN...

Global IP News. Energy Patent News, Jun 23, 2022

2022 see ResearchGate see 2021 arXiv

Local well-posedness in the Wasserstein space for a chemotaxis model coupled...
by Kang, Kyungkeun; Kim, Hwa Kil
Zeitschrift für angewandte Mathematik und Physik, 06/2022, Volume 73, Issue 4
We consider a coupled system of Keller–Segel-type equations and the incompressible Navier–Stokes equations in spatial dimension two and three. In the previous...
Journal Article  Full Text Online
More Options 

Rectified Wasserstein Generative Adversarial ... - IEEE Xplore

https://ieeexplore.ieee.org › iel7

by H Ma · 2022 — Abstract—Wasserstein generative adversarial network (WGAN) has attracted great attention due to its solid mathematical background,.
Rectified Wasserstein Generative Adversarial Networks for Perceptual...
by Ma, Haichuan; Liu, Dong; Wu, Feng
IEEE transactions on pattern analysis and machine intelligence, 06/2022, Volume PP
ArticleView Article PDF
2022 see 2021
Portfolio Optimisation within a Wasserstein Ball
by Silvana Pesenti; Sebastian Jaimungal
arXiv.org, 06/2022
We study the problem of active portfolio management where an investor aims to outperform a benchmark strategy's risk profile while not deviating too far from...
Paper  Full Text Online
More Options 


 
Wasserstein Distributionally Robust Gaussian Process ... - arXiv

https://arxiv.org › math

by X Zhang · 2022 — We study a distributionally robust optimization formulation (i.e., a min-max game) for problems of nonparametric estimation: Gaussian process ...
Wasserstein Distributionally Robust Gaussian Process Regression and Linear...
by Xuhui Zhang; Jose Blanchet; Youssef Marzouk ; More...
arXiv.org, 06/2022
We study a distributionally robust optimization formulation (i.e., a min-max game) for problems of nonparametric estimation: Gaussian process regression and,...
Paper  Full Text Online
Related articles All 2 versions 

<——2022———2022———780—


2022 see 2021 Working Paper Full Text

Likelihood estimation of sparse topic distributions in topic models and its applications to Wasserstein document distance calculations

Xin Bing; Bunea, Florentina; Strimas-Mackey, Seth; Wegkamp, Marten.

arXiv.org; Ithaca, Jun 27, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window

 Cited by 3 Related articles All 7 versions

 

2022 see 2021  Working Paper  Full Text

Universal consistency of Wasserstein k-NN classifier: Negative and Positive Results

Ponnoprat, Donlapark.

arXiv.org; Ithaca, Jun 26, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window

2022 see 2021  Working Paper  Full Text

DeepParticle: learning invariant measure by a deep neural network minimizing Wasserstein distance on data generated from an interacting particle method

Wang, Zhongjian; Xin, Jack; Zhang, Zhiwen.

arXiv.org; Ithaca, Jun 19, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window

Cited by 2 Related articles All 5 versions

Wasserstein Uncertainty Estimation for Adversarial Domain Matching

Wang, R; Zhang, RY and Henao, R

May 10 2022 | FRONTIERS IN BIG DATA 5 | Enriched Cited References

Domain adaptation aims at reducing the domain shift between a labeled source domain and an unlabeled target domain, so that the source model can be generalized to target domains without fine tuning. In this paper, we propose to evaluate the cross-domain transferability between source and target samples by domain prediction uncertainty, which is quantified via Wasserstein gradient flows. Further

Show more  Free Full Text from Publisher

36 ReferencesRelated records  

Related articles All 4 versions

 

Wasserstein coupled particle filter for multilevel estimation

Ballesio, M; Jasra, A; (...); Tempone, R

May 2022 (Early Access) | STOCHASTIC ANALYSIS AND APPLICATIONS

In this article, we consider the filtering problem for partially observed diffusions, which are regularly observed at discrete times. We are concerned with the case when one must resort to time-discretization of the diffusion process if the transition density is not available in an appropriate form. In such cases, one must resort to advanced numerical algorithms such as particle filters to cons

Show more

Free Submitted Article From RepositoryFull Text at Publisher  


2022


2022 see 2021

Dissipative probability vector fields and generation of evolution semigroups in Wasserstein spaces

Cavagnari, G; Savare, G and Sodini, GE

Jun 2022 (Early Access) | PROBABILITY THEORY AND RELATED FIELDS | Enriched Cited References

We introduce and investigate a notion of multivalued lambda-dissipative probability vector field (MPVF) in the Wasserstein space P-2(X) of Borel probability measures on a Hilbert space X. Taking inspiration from the theories of dissipative operators in Hilbert spaces and of Wasserstein gradient flows for geodesically convex functionals, we study local and global well posedness of evolution equa

Show more Free Full Text From Publisher

36 References Related records

Related articles All 2 versions

Confidence regions in Wasserstein distributionally robust estimation (vol 109, pg 295, 2022)

Li, W

May 25 2022 | BIOMETRIKA 109 (2) , pp.567-567

Free Full Text From Publisher

1 ReferenceRelated records 

[CITATION] Confidence regions in Wasserstein distributionally robust estimation (vol 109, pg 295, 2022)

W Li - BIOMETRIKA, 2022 - OXFORD UNIV PRESS GREAT …
Cited by 28
 Related articles All 8 versions


ARTICLE

Partial Discharge Data Augmentation Based on Improved Wasserstein Generative Adversarial Network With Gradient Penalty

Zhu, Guangya ; Zhou, Kai ; Lu, Lu ; Fu, Yao ; Liu, Zhaogui ; Yang, Xiaomin. IEEE transactions on industrial informatics, 2022, p.1-11

 Download PDF 

Partial Discharge Data Augmentation Based on Improved Wasserstein Generative Adversarial Network With Gradient Penalty

Available Online 

 View Issue Contents 

Partial Discharge Data Augmentation Based on Improved Wasserstein Generative Adversarial Network With Gradient Penalty

G Zhu, K Zhou, L Lu, Y Fu, Z Liu… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org

WGAN-GP model is developed for PD classification of electric power equipment with enhanced 

accuracy. The improved WGAN-… To address the problem, the improved WGAN-GP model …
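
Note: the improved WGAN-GP model in the entry above relies on a gradient-penalty term; a minimal PyTorch sketch of that term is given below (illustrative only, assuming a generic `critic` network and 2-D (batch, features) tensors; none of this code is from the cited paper).

```python
import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    """WGAN-GP penalty on interpolates between real and fake batches (shape (N, d))."""
    alpha = torch.rand(real.size(0), 1, device=real.device).expand_as(real)
    interp = (alpha * real + (1.0 - alpha) * fake).requires_grad_(True)
    score = critic(interp)
    # Gradient of the critic score with respect to the interpolated input.
    grads, = torch.autograd.grad(outputs=score.sum(), inputs=interp, create_graph=True)
    # Penalize deviations of the gradient norm from 1 (soft Lipschitz constraint).
    return lambda_gp * ((grads.norm(2, dim=1) - 1.0) ** 2).mean()
```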



ACWGAN: AN AUXILIARY CLASSIFIER WASSERSTEIN GAN-BASED OVERSAMPLING APPROACH FOR MULTI-CLASS IMBALANCED LEARNING

Liao, C and Dong, MG

Jun 2022 | INTERNATIONAL JOURNAL OF INNOVATIVE COMPUTING INFORMATION AND CONTROL 18 (3) , pp.703-721 | Enriched Cited References

Learning from multi-class imbalance data is a common but challenging task in machine learning community. Oversampling method based on Generative Adversarial Networks (GAN) is an effective countermeasure. However, due to the scarce number of trainable minority samples, existing methods may produce noise or low-quality minority samples; besides, they may suffer from mode collapse. To address the

Show more

View full text

36 References  Related records 


2022 see 2021

GRADIENT FLOW FORMULATION OF DIFFUSION EQUATIONS IN THE WASSERSTEIN SPACE OVER A METRIC GRAPH

Erbar, M; Forkert, D; (...); Mugnolo, D

Jun 2022 (Early Access) | NETWORKS AND HETEROGENEOUS MEDIA

This paper contains two contributions in the study of optimal transport on metric graphs. Firstly, we prove a Benamou-Brenier formula for the Wasserstein distance, which establishes the equivalence of static and dynamical optimal transport. Secondly, in the spirit of Jordan-Kinderlehrer- Otto, we show that McKean-Vlasov equations can be formulated as gradient flow of the free energy in the Wass

Show more

Free Full Text from Publisher

28 References Related records   

MR4459624   

<——2022———2022———790— 


[HTML] springer.com

[HTML] A Wasserstein-based measure of conditional dependence

J Etesami, K Zhang, N Kiyavash - Behaviormetrika, 2022 - Springer

… In this work, we use Wasserstein distance and discuss the advantage of using such metric 

… 2003), we obtain an alternative approach for computing the Wasserstein metric as follows: …

Cited by 1 


[PDF] arxiv.org

A Wasserstein coupled particle filter for multilevel estimation

M Ballesio, A Jasra, E von Schwerin… - Stochastic Analysis and …, 2022 - Taylor & Francis

… squared Wasserstein distance with L 2 as the metric (we call this the “Wasserstein coupling”)… 

resampling step corresponds to sampling the optimal Wasserstein coupling of the filters. We …

 Cited by 10 Related articles All 6 versions

[HTML] springer.com

Data-driven Wasserstein distributionally robust mitigation and recovery against random supply chain disruption

Y Cao, X Zhu, H Yan - Transportation Research Part E: Logistics and …, 2022 - Elsevier

… The Wasserstein ambiguity set is good for dealing with the scarcity of data. Thus, this paper 

introduces the DRO with Wasserstein ambiguity set to the joint optimum of robust supply …

 Related articles


A Wasserstein distributionally robust planning model for renewable sources and energy storage systems under multiple uncertainties

J Li, Z Xu, H Liu, C Wang, L Wang… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org

… Both 1-norm Wasserstein metric and ∞-norm Wasserstein metric are considered in this 

paper. The former one is related to the general difference between probability distributions and …


Cited by 4 Related articles All 2 versions

[PDF] arxiv.org

The Sketched Wasserstein Distance for mixture distributions

X Bing, F Bunea, J Niles-Weed - arXiv preprint arXiv:2206.12768, 2022 - arxiv.org

Wasserstein space over X = (A,d). This result establishes a universality property for the 

Wasserstein … on the risk of estimating the Wasserstein distance between distributions on a K-point …

 All 2 versions


2022


[HTML] mdpi.com

Detection and Isolation of Incipiently Developing Fault Using Wasserstein Distance

C Lu, J Zeng, S Luo, J Cai - Processes, 2022 - mdpi.com

… based on the Wasserstein distance [28] in … Wasserstein distance with sliding window is 

developed to detect incipient faults, the limiting distribution and sensitivity analysis of Wasserstein

 Related articles All 2 versions


Rectified Wasserstein Generative Adversarial Networks for Perceptual Image Restoration

H Ma, D Liu, F Wu - IEEE Transactions on Pattern Analysis and …, 2022 - ieeexplore.ieee.org

… This modification gives out our proposed rectified Wasserstein generative adversarial network 

(ReWaGAN). Note that we have shifted the action from the critic side (namely finding out …

All 4 versions


Learning Brain Representation using Recurrent Wasserstein Generative Adversarial Net

N Qiang, Q Dong, H Liang, J Li, S Zhang… - Computer Methods and …, 2022 - Elsevier

… Therefore, the Wasserstein distance is applied in this work to improve the stability of GAN 

training [60]. The value function of the Wasserstein distance-based GAN (WGAN) is as follows:(…

Cited by 2 Related articles All 4 versions

[PDF] arxiv.org

The Performance of Wasserstein Distributionally Robust M-Estimators in High Dimensions

L Aolaritei, S Shafieezadeh-Abadeh… - arXiv preprint arXiv …, 2022 - arxiv.org

… a Wasserstein sense, to the empirical distribution. In this paper, we propose a Wasserstein 

… work to study this problem in the context of Wasserstein distributionally robust M-estimation. …


[PDF] arxiv.org

Virtual persistence diagrams, signed measures, Wasserstein distances, and Banach spaces

P Bubenik, A Elchesen - Journal of Applied and Computational Topology, 2022 - Springer

Wasserstein distance from optimal transport theory. Following this work, we define compatible 

Wasserstein … We show that the 1-Wasserstein distance extends to virtual persistence …

Cited by 5 Related articles All 4 versions

<——2022———2022———800— 


Local well-posedness in the Wasserstein space for a chemotaxis model coupled to incompressible fluid flows

K Kang, HK Kim - Zeitschrift für angewandte Mathematik und Physik, 2022 - Springer

Wasserstein space. In this work, we refine the result on the existence of a weak solution of a 

Fokker–Planck equation in the Wasserstein … In this subsection, we introduce the Wasserstein

 All 2 versions

Zbl 07550872

[PDF] arxiv.org

A probabilistic approach to vanishing viscosity for PDEs on the Wasserstein space

L Tangpi - arXiv preprint arXiv:2206.01778, 2022 - arxiv.org

… where Pp(Rm) is the space of probability measures on Rm with finite p-th moment, which 

we equip with the Wasserstein metric Wp, and ∂µV denotes the Wasserstein gradient of V. …

Related articles All 2 versions


[PDF] arxiv.org

Asymptotics of smoothed Wasserstein distances in the small noise regime

Y Ding, J Niles-Weed - arXiv preprint arXiv:2206.06452, 2022 - arxiv.org

… We study the behavior of the Wasserstein-2 distance … alternative to the unregularized 

Wasserstein distance. We give precise … small, the smoothed Wasserstein distance approximates …

Cited by 12 Related articles All 5 versions

[PDF] arxiv.org

Geodesic Properties of a Generalized Wasserstein Embedding for Time Series Analysis

S Li, AHM Rubaiyat, GK Rohde - arXiv preprint arXiv:2206.01984, 2022 - arxiv.org

… In this paper, we study the geodesic properties of time series data with a generalized 

Wasserstein metric and the geometry related to their signed cumulative distribution transforms in …

Related articles All 2 versions 


[PDF] arxiv.org

Causality Learning With Wasserstein Generative Adversarial Networks

H Petkov, C Hanley, F Dong - arXiv preprint arXiv:2206.01496, 2022 - arxiv.org

… of Wasserstein distance in the context of causal structure learning. Our model named DAGWGAN 

combines the Wasserstein-… We conclude that the involvement of the Wasserstein metric …

 Related articles All 3 versions


2022


2022 see 2021 2023   [PDF] arxiv.org

Exact statistical inference for the wasserstein distance by selective inference

VNL Duy, I Takeuchi - Annals of the Institute of Statistical Mathematics, 2022 - Springer

… statistical inference for the Wasserstein distance, which has … inference method for the 

Wasserstein distance inspired by the … interval (CI) for the Wasserstein distance with finite-sample …

 Cited by 4 Related articles All 2 versions

arXiv:2207.02096  [pdf, ps, other]  math.PR
A short proof of the characterisation of convex order using the 2-Wasserstein distance
Authors: Beatrice Acciaio, Gudmund Pammer
Abstract: We provide a short proof of the intriguing characterisation of the convex order given by Wiesel and Zhang.
Submitted 5 July, 2022; originally announced July 2022.

arXiv:2207.01235  [pdf, other]  math.PR  q-fin.MF
A characterisation of convex order using the 2-Wasserstein distance
Authors: Johannes Wiesel, Erica Zhang
Abstract: We give a new characterisation of convex order using the 2-Wasserstein distance $W_2$: we show that two probability measures $\mu$ and $\nu$ on $\mathbb{R}^d$ with finite second moments are in convex order (i.e. $\mu \preceq_c \nu$) … … holds for all probability measures $\rho$ on…  More
Submitted 4 July, 2022; originally announced July 2022.
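
Note: the convex order referred to in the two arXiv entries above is the standard one; for reference (definition only, the new $W_2$-based characterisations are in the papers themselves):

$$ \mu \preceq_c \nu \quad\Longleftrightarrow\quad \int f\,\mathrm{d}\mu \;\le\; \int f\,\mathrm{d}\nu \;\;\text{ for every convex } f:\mathbb{R}^d\to\mathbb{R} \text{ for which both integrals exist.} $$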

arXiv:2206.15131  [pdf, other]  astro-ph.IM  astro-ph.GA
Radio Galaxy Classification with wGAN-Supported Augmentation
Authors: Janis Kummer, Lennart Rustige, Florian Griese, Kerstin Borras, Marcus Brüggen, Patrick L. S. Connor, Frank Gaede, Gregor Kasieczka, Peter Schleper
Abstract: Novel techniques are indispensable to process the flood of data from the new generation of radio telescopes. In particular, the classification of astronomical sources in images is challenging. Morphological classification of radio galaxies could be automated with deep learning models that require large sets of labelled training data. Here, we demonstrate the use of generative models, specifically…  More
Submitted 30 June, 2022; originally announced June 2022.
Comments: 10 pages, 6 figures; accepted to ml.astro


Working Paper  Full Text

Radio Galaxy Classification with wGAN-Supported Augmentation

Kummer, Janis; Rustige, Lennart; Griese, Florian; Borras, Kerstin; Brüggen, Marcus; et al. 

arXiv.org; Ithaca, Jun 30, 2022.

All 2 versions 


arXiv:2206.14897  [pdf, other]  cs.LG
Discrete Langevin Sampler via Wasserstein Gradient Flow
Authors: Haoran Sun, Hanjun Dai, Bo Dai, Haomin Zhou, Dale Schuurmans
Abstract: Recently, a family of locally balanced (LB) samplers has demonstrated excellent performance at sampling and learning energy-based models (EBMs) in discrete spaces. However, the theoretical understanding of this success is limited. In this work, we show how LB functions give rise to LB dynamics corresponding to Wasserstein gradient flow in a discrete space. From first principles, previous LB sample…  More
Submitted 29 June, 2022; originally announced June 2022.

All 2 versions 
<–—2022———2022———810— 



Finger vein image inpainting using Neighbor Binary-Wasserstein Generative Adversarial Networks (NB-WGAN)

by H Jiang · 2022 — In this paper, a finger vein image inpainting method using Neighbor Binary-Wasserstein Generative Adversarial Networks (NB-WGAN) is proposed ...

 Finger vein image inpainting using neighbor binary-wasserstein generative adversarial networks (NB-WGAN...

by Jiang, Hanqiong; Shen, Lei; Wang, Huaxia ; More...

Applied intelligence (Dordrecht, Netherlands), 01/2022, Volume 52, Issue 9

Traditional inpainting methods obtain poor performance for finger vein images with blurred texture. In this paper, a finger vein image inpainting method using...

ArticleView Article PDF

Journal Article 

 Full Text Online

More Options 

View Complete Issue Browse Now

 Preview   Peer-Reviewed

 Related articles   

 

[2206.15131] Radio Galaxy Classification with wGAN ... - arXiv

https://arxiv.org › astro-ph

by J Kummer · 2022 — Morphological classification of radio galaxies could be automated with deep learning models that require large sets of labelled training data.

Radio Galaxy Classification with wGAN-Supported Augmentation

by Kummer, Janis; Rustige, Lennart; Griese, Florian ; More...

06/2022

Novel techniques are indispensable to process the flood of data from the new generation of radio telescopes. In particular, the classification of astronomical...

Journal Article  Full Text Online

 Preview   Open Access

 All 4 versions 

 

2022 book

Radio Galaxy Classification with wGAN-Supported Augmentation

Authors: Janis Kummer, Lennart Rustige, Florian Griese, Kerstin Borras, Marcus Brüggen, Patrick L S Connor, Frank Gaede, Gregor Kasieczka, Peter Schleper

Summary:Novel techniques are indispensable to process the flood of data from the new generation of radio telescopes. In particular, the classification of astronomical sources in images is challenging. Morphological classification of radio galaxies could be automated with deep learning models that require large sets of labelled training data. Here, we demonstrate the use of generative models, specifically Wasserstein GANs (wGAN), to generate artificial data for different classes of radio galaxies. Subsequently, we augment the training data with images from our wGAN. We find that a simple fully-connected neural network for classification can be improved significantly by including generated images into the training set

Show more

Book, Oct 7, 2022

Publication:arXiv.org, Oct 7, 2022, n/a

Publisher:Oct 7, 2022


Multisource Wasserstein Adaptation Coding Network for EEG emotion recognition

L Zhu, W Ding, J Zhu, P Xu, Y Liu, M Yan… - … Signal Processing and …, 2022 - Elsevier

… ) is a reliable method in emotion recognition and is widely studied. … a new emotion recognition 

method called Multisource Wasserstein … It also uses Wasserstein distance and Association …

Multisource Wasserstein Adaptation Coding Network for EEG ...

https://www.sciencedirect.com › science › article › abs › pii

by L Zhu · 2022 — In order to solve this problem, we propose a new emotion recognition method called Multisource Wasserstein Adaptation Coding Network (MWACN).

Cover Image

Multisource Wasserstein Adaptation Coding Network for EEG emotion...

by Zhu, Lei; Ding, Wangpan; Zhu, Jieping ; More...

Biomedical signal processing and control, 07/2022, Volume 76

•A novel model is proposed to adapt multisource domain distribution.•The proposed model obtains better results than existing methods.•The model shows good...

ArticleView Article PDF

Journal Article  Full Text Online

View Complete Issue Browse Now

 Preview   Peer-Reviewed

Cited by 1 Related articles

Discrete Langevin Sampler via Wasserstein Gradient Flow

H Sun, H Dai, B Dai, H Zhou, D Schuurmans - arXiv e-prints, 2022 - ui.adsabs.harvard.edu

Recently, a family of locally balanced (LB) samplers has demonstrated excellent

performance at sampling and learning energy-based models (EBMs) in discrete spaces.

However, the theoretical understanding of this success is limited. In this work, we show how

LB functions give rise to LB dynamics corresponding to Wasserstein gradient flow in a

discrete space. From first principles, previous LB samplers can then be seen as

discretizations of the LB dynamics with respect to Hamming distance. Based on this …

Discrete Langevin Sampler via Wasserstein Gradient Flow

by Haoran Sun; Hanjun Dai; Bo Dai ; More...

arXiv.org, 06/2022

Recently, a family of locally balanced (LB) samplers has demonstrated excellent performance at sampling and learning energy-based models (EBMs) in discrete...

Paper   Full Text Online

More Options   Preview   Open Access

All 2 versions

    

2022 see 2021  

Wasserstein GANs with Gradient Penalty Compute Congested ...

https://proceedings.mlr.press › ...

by T Milne · 2022 · Cited by 2 — Proceedings of Thirty Fifth Conference on Learning Theory, PMLR 178:103

Wasserstein GANs with Gradient Penalty Compute Congested Transport

by Tristan Milne; Adrian Nachman

arXiv.org, 06/2022

Wasserstein GANs with Gradient Penalty (WGAN-GP) are a very popular method for training generative models to produce high quality synthetic data. While WGAN-GP...

Paper 

 Full Text Online  More Options    Preview 

Cited by 2 Related articles All 3 versions

2022


Working Paper  Full Text

A short proof of the characterisation of convex order using the 2-Wasserstein distance

Acciaio, Beatrice; Pammer, Gudmund. 

arXiv.org; Ithaca, Jul 5, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window


patent Wire Feed  Full Text

State Intellectual Property Office of China Publishes Univ South China Tech's Patent Application for Motor Imagery Electroencephalogram Transfer Learning Method Based on Wasserstein Distance

Global IP News. Electrical Patent News; New Delhi [New Delhi]. 04 July 2022. 

DetailsFull text

Working Paper  Full Text

A characterisation of convex order using the 2-Wasserstein distance

Wiesel, Johannes; Zhang, Erica. 

arXiv.org; Ithaca, Jul 4, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window

[PDF] mdpi.com

AC−WGAN−GP: Generating Labeled Samples for Improving Hyperspectral Image Classification with Small−Samples

C Sun, X Zhang, H Meng, X Cao, J Zhang - Remote Sensing, 2022 - mdpi.com

… named AC−WGAN−GP based on AC−GAN and WGAN−GP. … We construct a new generative

network named AC−WGAN−… mechanism makes AC−WGAN−GP periodically keep the …


Abstract/DetailsGet full text

Link to external site, this link will open in a new window

ARTICLE

Wasserstein GANs with Gradient Penalty Compute Congested Transport

Milne, Tristan ; Nachman, Adrian. arXiv.org, 2022

OPEN ACCESS

Wasserstein GANs with Gradient Penalty Compute Congested Transport

Available Online 

Working Paper  Full Text

Wasserstein GANs with Gradient Penalty Compute Congested Transport

Milne, Tristan; Nachman, Adrian. 

arXiv.org; Ithaca, Jun 30, 2022.

Abstract/Details

 Cited by 4 Related articles All 4 versions

<–—2022———2022———820— 


ARTICLE

Sliced-Wasserstein normalizing flows: beyond maximum likelihood training

Florentin Coeurdoux ; Dobigeon, Nicolas ; Chainais, Pierre. arXiv.org, 2022

OPEN ACCESS

Sliced-Wasserstein normalizing flows: beyond maximum likelihood training

Available Online 

arXiv:2207.05468  [pdf, other]  stat.ML  cs.AI  cs.LG
Sliced-Wasserstein normalizing flows: beyond maximum likelihood training
Authors: Florentin Coeurdoux, Nicolas Dobigeon, Pierre Chainais
Abstract: Despite their advantages, normalizing flows generally suffer from several shortcomings including their tendency to generate unrealistic data (e.g., images) and their failing to detect out-of-distribution data. One reason for these deficiencies lies in the training strategy which traditionally exploits a maximum likelihood principle only. This paper proposes a new training paradigm based on a hybri… 
More
Submitted 12 July, 2022; originally announced July 2022.
All 31 versions 
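
Note: a minimal NumPy sketch of the Monte Carlo sliced-Wasserstein distance used by sliced-Wasserstein methods such as the one above (illustrative only; the projection count, sample shapes and equal sample sizes are assumptions, not details from the paper).

```python
import numpy as np

def sliced_wasserstein(X, Y, n_proj=100, seed=0):
    """Monte Carlo sliced 2-Wasserstein distance between point clouds
    X, Y of shape (n, d) with the same number of points n."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)            # random unit direction
        x_proj = np.sort(X @ theta)               # 1-D projections, sorted
        y_proj = np.sort(Y @ theta)
        total += np.mean((x_proj - y_proj) ** 2)  # 1-D W2^2 via quantile matching
    return np.sqrt(total / n_proj)
```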

arXiv:2207.05442  [pdf, other]  stat.ML  cs.LG
Wasserstein multivariate auto-regressive models for modeling distributional time series and its application in graph learning
Authors: Yiye Jiang
Abstract: We propose a new auto-regressive model for the statistical analysis of multivariate distributional time series. The data of interest consist of a collection of multiple series of probability measures supported over a bounded interval of the real line, and that are indexed by distinct time instants. The probability measures are modelled as random objects in the Wasserstein space. We establish the a…  More
Submitted 12 July, 2022; originally announced July 2022.

Cited by 2 Related articles All 3 versions 

2022 see 2021  [PDF] aaai.org

Partial Wasserstein Covering

K Kawano, S Koide, K Otaki - Proceedings of the AAAI Conference on …, 2022 - ojs.aaai.org

… that allows us to estimate how much the partial Wasserstein divergence varies when adding 

candidate data, instead of performing the actual computation of divergence. Specifically, if …

Cited by 2 Related articles All 3 versions 


2022 patent

Rotating machine state monitoring method based on Wasserstein depth digital twin model

CN CN114662712A 胡文扬 (Hu Wenyang), 清华大学 (Tsinghua University)

Filed 2022-02-22 • Published 2022-06-24

and if the similarity of the twin sample and the physical health sample reaches a set standard, obtaining the trained WGAN-GP network based on the Wasserstein deep digital twin model through consistency test. 3. The method for monitoring the condition of a rotating machine based on the Wasserstein


2022 see 2021

MR4452151 Prelim Mallasto, Anton; Gerolin, Augusto; Minh, Hà Quang; 

Entropy-regularized 2-Wasserstein distance between Gaussian measures. Inf. Geom. 5 (2022), no. 1, 289–323. 62R99 (49 90)

Review PDF Clipboard Journal Article

 Cited by 22 Related articles All 6 versions

2022


2022 see 2021

MR4452150 Prelim Lee, Wonjun; Li, Wuchen; Lin, Bo; Monod, Anthea; 

Tropical optimal transport and Wasserstein distances. Inf. Geom. 5 (2022), no. 1, 247–287. 49 (53 90)

Review PDF Clipboard Journal Article

ited by 9 Related articles All 4 versions

2022 see 2021 thesis

MR4451910 Prelim Jekel, David; Li, Wuchen; Shlyakhtenko, Dimitri; 

Tracial smooth functions of non-commuting variables and the free Wasserstein manifold. Dissertationes Math. 580 (2022), 1–150. 46L54 (35Q49 46L52 58D99 94A17)

Review PDF Clipboard Journal Article

Jekel, David; Li, Wuchen; Shlyakhtenko, Dimitri

Tracial smooth functions of non-commuting variables and the free Wasserstein manifold. (English) Zbl 07560209

Diss. Math. 580, 1-150 (2022).

MSC:  46L54 46L52 35Q49 94A17 58D99

PDF BibTeX XML Cite

Full Text: DOI 

WGAN-Based Image Denoising Algorithm

Zou, XF; Zhu, DJ; (...); Lian, ZT

2022 | JOURNAL OF GLOBAL INFORMATION MANAGEMENT 30 (9)

Traditional image denoising algorithms are generally based on spatial domains or transform domains to denoise and smooth the image. The denoised images are not exhaustive, and the depth-of-learning algorithm has better denoising effect and performs well while retaining the original image texture details such as edge characters. In order to enhance denoising capability of images by the restorati

Show more  Free Full Text from Publisher

31 References  Related records


Exact statistical inference for the Wasserstein distance by selective inference

Le Duy, VN and Takeuchi, I

Jun 2022 (Early Access) | ANNALS OF THE INSTITUTE OF STATISTICAL MATHEMATICS

Cited by 4 Related articles All 2 versions

Enriched Cited References
In this paper, we study statistical inference for the Wasserstein distance, which has attracted much attention and has been applied to various machine learning tasks. Several studies have been proposed in the literature, but almost all of them are based on asymptotic approximation and do not have finite-sample validity. In this study, we propose an exact (non-asymptotic) inference method for the … Show more
Free Submitted Article From Repository  Full Text at Publisher

30 References Related records


Detection and Isolation of Incipiently Developing Fault Using Wasserstein Distance

Lu, C; Zeng, JS; (...); Cai, JH

Jun 2022 | PROCESSES 10 (6) | Enriched Cited References

This paper develops an incipient fault detection and isolation method using the Wasserstein distance, which measures the difference between the probability distributions of normal and faulty data sets from the aspect of optimal transport. For fault detection, a moving window based approach is introduced, resulting in two monitoring statistics that are constructed based on the Wasserstein distan

Show more

Free Full Text from Publisher

45 References  Related records
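
Note: a minimal sketch of a moving-window Wasserstein monitoring statistic in the spirit of the fault-detection entry above (illustrative only; SciPy's scipy.stats.wasserstein_distance computes the 1-Wasserstein distance between 1-D samples, whereas the paper constructs its own statistics and limiting distributions).

```python
import numpy as np
from scipy.stats import wasserstein_distance

def moving_window_w1(signal, reference, window=200):
    """1-Wasserstein distance between a reference (normal) sample and
    each sliding window of a 1-D measurement signal."""
    return np.array([
        wasserstein_distance(reference, signal[i:i + window])
        for i in range(len(signal) - window + 1)
    ])

# Toy example: a mean shift after sample 500 raises the statistic.
rng = np.random.default_rng(0)
reference = rng.normal(0.0, 1.0, size=1000)
signal = np.concatenate([rng.normal(0.0, 1.0, 500), rng.normal(0.8, 1.0, 500)])
print(moving_window_w1(signal, reference)[::200])
```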

<–—2022———2022———830— 


Hyperspectral Anomaly Detection Based on Wasserstein Distance and Spatial Filtering

Cheng, XY; Wen, MX; (...); Wang, YM

Jun 2022 | REMOTE SENSING 14 (12) | Enriched Cited References

Since anomaly targets in hyperspectral images (HSIs) with high spatial resolution appear as connected areas instead of single pixels or subpixels, both spatial and spectral information of HSIs can be exploited for a hyperspectal anomaly detection (AD) task. This article proposes a hyperspectral AD method based on Wasserstein distance (WD) and spatial filtering (called AD-WDSF). Based on the ass

Show more

Free Full Text from Publisher

37 References  Related records 

Related articles All 2 versions 

Local well-posedness in the Wasserstein space for a chemotaxis model coupled to incompressible fluid flows

Kang, K and Kim, HK

Aug 2022 | ZEITSCHRIFT FUR ANGEWANDTE MATHEMATIK UND PHYSIK 73 (4)

Enriched Cited References

We consider a coupled system of Keller-Segel-type equations and the incompressible Navier-Stokes equations in spatial dimension two and three. In the previous work [17], we established the existence of a weak solution of a Fokker-Planck equation in the Wasserstein space using the optimal transportation technique. Exploiting this result, we constructed solutions of Keller-Segel-Navier-Stokes equ

Show more  Full Text at Publisher

31 References  Related records 

Journal Article  Full Text Online
All 3 versions


Imbalanced cell-cycle classification using WGAN-div and mixup

P Rana, A Sowmya, E Meijering… - 2022 IEEE 19th …, 2022 - ieeexplore.ieee.org

… discarded majority samples and used Wasserstein GAN-gradient penalty (WGAN-GP) [13] … 

, we propose a framework that utilises Wasserstein divergence GAN (WGAN-div) [14] and …

Cited by 3 Related articles


[PDF] researchsquare.com

[PDF] Learned Pseudo-Random Number Generator: WGAN-GP for Generating Statistically Robust Random Numbers

K Okada, K Endo, K Yasuoka, S Kurabayashi - 2022 - researchsquare.com

… In this paper, we propose a Wasserstein distance-based generative adversarial network (WGAN) … 

We remove the dropout layers from the conventional WGAN network to learn random …

All 5 versions


Wasserstein Graph Distance based on $L_1$-Approximated Tree Edit Distance between...

by Fang, Zhongxi; Huang, Jianming; Su, Xun ; More...
07/2022

The Weisfeiler-Lehman (WL) test has been widely applied to graph kernels, metrics, and neural networks. However, it considers only the graph consistency,...

Journal Article 

2022


Generalized Wasserstein Dice Loss, Test-Time Augmentation, and Transformers...

by Fidon, LucasShit, SuprosannaEzhov, Ivan ; More...

Brainlesion: Glioma, Multiple Sclerosis, Stroke and Traumatic Brain Injuries, 07/2022

Brain tumor segmentation from multiple Magnetic Resonance Imaging (MRI) modalities is a challenging task in medical image computation. The main challenges lie...

Book Chapter   Full Text Online


[2207.05468] Sliced-Wasserstein normalizing flows - arXiv

https://arxiv.org › stat

by F Coeurdoux · 2022 — Title: Sliced-Wasserstein normalizing flows: beyond maximum likelihood training ... Abstract: Despite their advantages, normalizing flows generally ...

Sliced-Wasserstein normalizing flows: beyond maximum likelihood training

by Coeurdoux, Florentin; Dobigeon, Nicolas; Chainais, Pierre

07/2022

Despite their advantages, normalizing flows generally suffer from several shortcomings including their tendency to generate unrealistic data (e.g., images) and...

Journal Article  Full Text Online

Related articles All 33 versions

Wasserstein multivariate auto-regressive models for modeling distributional time series and its application in graph learning

Y Jiang - arXiv e-prints, 2022 - ui.adsabs.harvard.edu

We propose a new auto-regressive model for the statistical analysis of multivariate

distributional time series. The data of interest consist of a collection of multiple series of

probability measures supported over a bounded interval of the real line, and that are

indexed by distinct time instants. The probability measures are modelled as random objects

in the Wasserstein space. We establish the auto-regressive model in the tangent space at

the Lebesgue measure by first centering all the raw measures so that their Fréchet means …

Wasserstein multivariate auto-regressive models for modeling distributional time...

by Jiang, Yiye

07/2022

We propose a new auto-regressive model for the statistical analysis of multivariate distributional time series. The data of interest consist of a collection of...

Journal Article  Full Text Online

Cite All 4 versions

Generalizing to Unseen Domains with Wasserstein Distributional Robustness under Limited Source Knowledge

J Wang, L Xie, Y Xie, SL Huang, Y Li - arXiv preprint arXiv:2207.04913, 2022 - arxiv.org

Domain generalization aims at learning a universal model that performs well on unseen

target domains, incorporating knowledge from multiple source domains. In this research, we

consider the scenario where different domain shifts occur among conditional distributions of

different classes across domains. When labeled samples in the source domains are limited,

existing approaches are not sufficiently robust. To address this problem, we propose a novel

domain generalization framework called Wasserstein Distributionally Robust Domain …

Generalizing to Unseen Domains with Wasserstein Distributional Robustness...

by Wang, Jingge; Xie, Liyan; Xie, Yao ; More...

07/2022

Domain generalization aims at learning a universal model that performs well on unseen target domains, incorporating knowledge from multiple source domains. In...

Journal Article  Full Text Online

 Preview   Open Access

All 2 versions

    

[2207.04216] Wasserstein Graph Distance based on $L_1

https://arxiv.org › cs

by Z Fang · 2022 — Then we define a new graph embedding space based on L_1-approximated tree edit distance (L_1-TED): the L_1 norm of the difference between ...


Wasserstein Graph Distance based on $L_1$-Approximated Tree Edit Distance...

by Fang, Zhongxi; Huang, Jianming; Su, Xun ; More...

07/2022

The Weisfeiler-Lehman (WL) test has been widely applied to graph kernels, metrics, and neural networks. However, it considers only the graph consistency,...

Journal Article  Full Text Online

 Preview   Open Access

<–—2022———2022———840—


2022 see 2021

Generalized Wasserstein Dice Loss, Test-time Augmentation ...

https://www.researchgate.net › ... › Transformers

Jul 5, 2022 — Generalized Wasserstein Dice Loss, Test-time Augmentation, and Transformers for the BraTS 2021 challenge ...

Generalized Wasserstein Dice Loss, Test-Time Augmentation, and Transformers...

by Fidon, Lucas; Shit, Suprosanna; Ezhov, Ivan ; More...

Brainlesion: Glioma, Multiple Sclerosis, Stroke and Traumatic Brain Injuries, 07/2022

Brain tumor segmentation from multiple Magnetic Resonance Imaging (MRI) modalities is a challenging task in medical image computation. The main challenges lie...

Book Chapter    Full Text Online
Related articles
All 3 versions


Conference Paper  Citation/Abstract

WGAN-GP and LSTM based Prediction Model for Aircraft 4-D Trajectory

Zhang, Lei; Chen, Huiping; Jia, Peiyan; Tian, Zhihong; Du, Xiaojiang. 

The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2022).

Abstract/Details 


Conference Paper  Citation/Abstract

Topic Embedded Representation Enhanced Variational Wasserstein Autoencoder for Text Modeling

Liu, Xiaoming; Yang, Guan; Liu, Yang. 

The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2022).

Abstract/Details 

Scholarly Journal  Citation/Abstract

Accuracy Improvement of Cable Termination Partial Discharging Recognition Based on Improved WGAN Algorithm

Fu, Yao; Zhou, Kai; Zhu, Guangya; Wang, Zijian; Wang, Guodong; et al. 

Dianwang Jishu = Power System Technology; Beijing Iss. 5,  (2022): 2000.

Abstract/Details 

[CITATION] Accuracy improvement of cable termination partial discharging recognition based on improved WGAN algorithm

Y Fu, K Zhou, G Zhu, Z Wang, G Wang, Z WANG - Power System Technology, 2022

 Cited by 2

2022 see 2021  [PDF] mlr.press

Linear-time gromov wasserstein distances using low rank couplings and costs

M Scetbon, G Peyré, M Cuturi - International Conference on …, 2022 - proceedings.mlr.press

The ability to align points across two related yet incomparable point clouds (eg living in 

different spaces) plays an important role in machine learning. The Gromov-Wasserstein (GW) …

 Cited by 15 Related articles All 4 versions

2022

[HTML] sciencedirect.com

[HTML] A GPM-based algorithm for solving regularized Wasserstein barycenter problems in some spaces of probability measures

S Kum, MH Duong, Y Lim, S Yun - Journal of Computational and Applied …, 2022 - Elsevier

In this paper, we focus on the analysis of the regularized Wasserstein barycenter problem. 

We provide uniqueness and a characterization of the barycenter for two important classes of …

 All 6 versions

Zbl 07567573
ARTICLE

A GPM-based algorithm for solving regularized Wasserstein barycenter problems in some spaces of probability measures

Kum, S ; Duong, M.H ; Lim, Y ; Yun, S. Journal of computational and applied mathematics, 2022, Vol.416

PEER REVIEWED

OPEN ACCESS

A GPM-based algorithm for solving regularized Wasserstein barycenter problems in some spaces of probability measures

Available Online 
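
Note: the regularized problem analysed in the entry above builds on the standard Wasserstein barycenter formulation (reference formulation only; the weights and the specific regularizer are generic placeholders):

$$ \bar\mu \in \arg\min_{\mu} \sum_{i=1}^{N} \lambda_i\, W_2^2(\mu,\mu_i), \qquad \lambda_i\ge 0,\;\; \sum_{i=1}^{N}\lambda_i = 1, $$

with the regularized variant adding a penalty term (for example an entropic one) to this objective.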


2022 see 2021  [PDF] mlr.press

Entropic gromov-wasserstein between gaussian distributions

K Le, DQ Le, H Nguyen, D Do… - … on Machine Learning, 2022 - proceedings.mlr.press

… We study the entropic Gromov-Wasserstein and its … to as inner product Gromov-Wasserstein 

(IGW), we demonstrate … entropic inner product Gromov-Wasserstein barycenter of multiple …

Cited by 2 Related articles All 3 versions

 

Wasserstein Approximate Bayesian Computation for Visual Tracking

J Park, J Kwon - Pattern Recognition, 2022 - Elsevier

In this study, we present novel visual tracking methods based on the Wasserstein approximate 

Bayesian computation (ABC). For visual tracking, the proposed Wasserstein ABC (WABC) …

 

[HTML] sciencedirect.com

[HTML] Energy data generation with Wasserstein Deep Convolutional Generative Adversarial Networks

J Li, Z Chen, L Cheng, X Liu - Energy, 2022 - Elsevier

Residential energy consumption data and related sociodemographic information are critical 

for energy demand management, including providing personalized services, ensuring …

Cited by 14 All 6 versions

 2022 see 2021

The Wasserstein Impact Measure (WIM): A practical tool for quantifying prior impact in Bayesian statistics

F Ghaderinezhad, C Ley, B Serrien - Computational Statistics & Data …, 2022 - Elsevier

… and upper bounds on the Wasserstein distance and their approach … For practical purposes, 

the power of the Wasserstein … More concretely, we will provide in Section 2 the Wasserstein

Cited by 2 Related articles All 2 versions

Zbl 07561870

<–—2022———2022———850— 


[PDF] kit.edu

[PDF] Dirac Mixture Reduction Using Wasserstein Distances on Projected Cumulative Distributions

D Prossel, UD Hanebeck - … of the 25th International Conference on …, 2022 - isas.iar.kit.edu

… In this paper, the Wasserstein distance is established as a … Therefore, the well-known 

sliced Wasserstein distance is … to minimize the sliced Wasserstein distance between the given …

Cited by 2 All 2 versions


[PDF] arxiv.org

Wasserstein Convergence for Conditional Empirical Measures of Subordinated Dirichlet Diffusions on Riemannian Manifolds

H Li, B Wu - arXiv preprint arXiv:2204.13559, 2022 - arxiv.org

… Being the development of Wang \cite{eW1}, under the quadratic Wasserstein distance, 

we investigate the rate of convergence of conditional empirical measures associated to …

 Cited by 1 Related articles All 2 versions


2022 modified tutorial

WGAN: Wasserstein Generative Adversarial Networks

By Bharath K 

https://blog.paperspace.com › wgans

Understanding WGANs: ... The idea for the working of Generative Adversarial Networks (GANs) is to utilize two primary probability distributions. One of the main ...

Introduction · Learning the details for the... · Construct a project with WGANs

16 minutes read
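
Note: to accompany the tutorial entry above, a minimal PyTorch sketch of the original WGAN critic update with weight clipping (illustrative toy code on random 2-D data; the network sizes and hyperparameters are arbitrary and not taken from the tutorial).

```python
import torch
import torch.nn as nn

# Toy critic and generator for 2-D data (illustrative sizes).
critic = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))
generator = nn.Sequential(nn.Linear(8, 64), nn.ReLU(), nn.Linear(64, 2))
opt_c = torch.optim.RMSprop(critic.parameters(), lr=5e-5)

real = torch.randn(128, 2)                      # stand-in for a batch of real data
fake = generator(torch.randn(128, 8)).detach()  # generator output, detached for the critic step

# WGAN critic objective: maximize E[D(real)] - E[D(fake)], i.e. minimize the negative.
loss_c = critic(fake).mean() - critic(real).mean()
opt_c.zero_grad()
loss_c.backward()
opt_c.step()

# Weight clipping crudely enforces the 1-Lipschitz constraint on the critic.
for p in critic.parameters():
    p.data.clamp_(-0.01, 0.01)
```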

2022 see 2021 2023

Partial Wasserstein Covering

K Kawano, S Koide, K Otaki - Proceedings of the AAAI Conference on …, 2022 - ojs.aaai.org

We consider a general task called partial Wasserstein covering with the goal of providing

information on what patterns are not being taken into account in a dataset (eg, dataset used …

 Cited by 2 Related articles All 3 versions 

… Wasserstein divergence as an … ϵ denotes the partial Wasserstein divergence, using au…

Cited by 3 Related articles All 7 versions 

First-order Conditions for Optimization in the Wasserstein...

by Lanzetti, Nicolas; Bolognani, Saverio; Dörfler, Florian

arXiv.org, 09/2022

We study first-order optimality conditions for constrained optimization in the Wasserstein space, whereby one seeks to minimize a

real-valued function over the...

Paper  Full Text Online


2022


2022 see 2021  [PDF] aaai.org

Wasserstein Unsupervised Reinforcement Learning

S He, Y Jiang, H Zhang, J Shao, X Ji - Proceedings of the AAAI …, 2022 - ojs.aaai.org

… Therefore, we choose Wasserstein distance, a well-studied … By maximizing Wasserstein

distance, the agents equipped … First, we propose a novel framework adopting Wasserstein …

Related articles All 3 versions 



DF] arxiv.org

On Combinatorial Properties of Greedy Wasserstein Minimization

S Steinerberger - arXiv preprint arXiv:2207.08043, 2022 - arxiv.org

We discuss a phenomenon where Optimal Transport leads to a remarkable amount of

combinatorial regularity. Consider infinite sequences $(x_k)_{k=1}^{\infty}$ in $[0,1]$ constructed …

All 2 versions 



ARTICLE

Mean field Variational Inference via Wasserstein Gradient Flow

Yao, Rentian ; Yang, Yun. arXiv.org, 2022

OPEN ACCESS

Mean field Variational Inference via Wasserstein Gradient Flow

Available Online 

[PDF] arxiv.org

Mean field Variational Inference via Wasserstein Gradient Flow

R Yao, Y Yang - arXiv preprint arXiv:2207.08074, 2022 - arxiv.org

… Wasserstein gradient flow, called the one-step minimization movement or the JKO scheme,

with an explicit contraction rate; lastly, we discuss the connection between Wasserstein …
All 2 versions
 


2022 see 2021  [PDF] muni.cz

Entropic regularization of Wasserstein distance between infinite-dimensional Gaussian measures and Gaussian processes

HQ Minh - Journal of Theoretical Probability, 2022 - Springer

… formulation of the 2-Wasserstein distance on an infinite-… plan, entropic 2-Wasserstein

distance, and Sinkhorn divergence … , both the entropic 2-Wasserstein distance and Sinkhorn …

Cited by 5 Related articles All 4 versions
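
Note: in the discrete case, the entropic-regularized transport plans behind entropic Wasserstein distances such as the one above can be computed by Sinkhorn iterations; a minimal NumPy sketch follows (illustrative only; the infinite-dimensional Gaussian-process setting of the paper is not reproduced here, and the cost, regularization and iteration count are arbitrary).

```python
import numpy as np

def sinkhorn_plan(a, b, C, eps=0.1, n_iter=500):
    """Entropic optimal transport plan between discrete distributions a, b
    with cost matrix C and regularization strength eps."""
    K = np.exp(-C / eps)                  # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)                 # alternating scaling updates
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]       # transport plan
    return P, float(np.sum(P * C))        # plan and associated transport cost

# Example: entropic approximation of W2^2 between two small 1-D point sets.
x = np.linspace(0.0, 1.0, 5)[:, None]
y = np.linspace(0.2, 1.2, 5)[:, None]
C = (x - y.T) ** 2                        # squared-distance cost matrix
a = b = np.full(5, 0.2)
P, cost = sinkhorn_plan(a, b, C)
print(cost)
```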


A General Wasserstein Framework for Data-driven Distributionally Robust Optimization: Tractability and Applications

JYM Li, T Mao - arXiv preprint arXiv:2207.09403, 2022 - arxiv.org

… Wasserstein DRO, distributionally robust optimization using the coherent Wasserstein metrics,

termed generalized Wasserstein … to design novel Wasserstein DRO models that can be …

arXiv

<–—2022———2022———860—


ARTICLE

Wasserstein convergence rates of increasingly concentrating probability measures

Hasenpflug, Mareike ; Rudolf, Daniel ; Sprungk, Björn. arXiv.org, 2022

OPEN ACCESS

Wasserstein convergence rates of increasingly concentrating probability measures

Available Online 

arXiv:2207.08551  [pdf, other]  math.PR
Wasserstein convergence rates of increasingly concentrating probability measures
Authors: Mareike Hasenpflug, Daniel Rudolf, Björn Sprungk
Abstract: For $\Phi\colon \mathbb{R}^d \to [0,\infty)$ we consider the sequence of probability measures $(\mu_n)_{n\in\mathbb{N}}$, where $\mu_n$ is determined by a density that is proportional to $\exp(-n\Phi)$. We allow for infinitely many global minimal points of $\Phi$, as long as they form a finite union of compact manifolds. In this scenario, we show estimates for the $p$-Wasserstein converge…  More
Submitted 18 July, 2022; originally announced July 2022.
Comments: 36 pages, 1 Figure
MSC Class: 60B10; 58C99
All 2 versions
 

 


Peer-reviewed
Wasserstein Proximal Algorithms for the Schrödinger Bridge Problem: Density Control With Nonlinear Drift
Authors: Kenneth F. Caluya, Abhishek Halder
Summary:In this article, we study the Schrödinger bridge problem (SBP) with nonlinear prior dynamics. In control-theoretic language, this is a problem of minimum effort steering of a given joint state probability density function (PDF) to another over a finite-time horizon, subject to a controlled stochastic differential evolution of the state vector. As such, it can be seen as a stochastic optimal control problem in continuous time with endpoint density constraints—A topic that originated in the physics literature in 1930s, and in the recent years, has garnered burgeoning interest in the systems-control community. For generic nonlinear drift, we reduce the SBP to solving a system of forward and backward Kolmogorov partial differential equations (PDEs) that are coupled through the boundary conditions, with unknowns being the “Schrödinger factors”—so named since their product at any time yields the optimal controlled joint state PDF at that time. We show that if the drift is a gradient vector field, or is of mixed conservative–dissipative nature, then it is possible to transform these PDEs into a pair of initial value problems (IVPs) involving the same forward Kolmogorov operator. Combined with a recently proposed fixed point recursion that is contractive in the Hilbert metric, this opens up the possibility to numerically solve the SBPs in these cases by computing the Schrödinger factors via a single IVP solver for the corresponding (uncontrolled) forward Kolmogorov PDE. The flows generated by such forward Kolmogorov PDEs, for the two aforementioned types of drift, in turn, enjoy gradient descent structures on the manifold of joint PDFs with respect to suitable distance functionals. We employ a proximal algorithm developed in our prior work that exploits this geometric viewpoint, to solve these IVPs and compute the Schrödinger factors via weighted scattered point cloud evolution in the state space. We provide the algorithmic details and illustrate the proposed framework of solving the SBPs with nonlinear prior dynamics by numerical examplesShow more

The Spectral-Domain $\mathcal{W}_2$ Wasserstein Distance ...

https://www.researchgate.net › ... › Stochastic Processes

Jul 5, 2022 — In this short note, we introduce the spectral-domain $\mathcal{W}_2$ Wasserstein distance for elliptical stochastic processes in terms of ...

 

2022 see 2021

Mallasto, Anton; Gerolin, Augusto; Minh, Hà Quang

Entropy-regularized 2-Wasserstein distance between Gaussian measures. (English) Zbl 07560183

Inf. Geom. 5, No. 1, 289-323 (2022).

MSC:  49Q22 53Cxx 60Exx

PDF BibTeX XML Cite

Full Text: DOI 

 Cited by 22 Related articles All 6 versions

 

 2022 see 2021

Lee, Wonjun; Li, Wuchen; Lin, Bo; Monod, Anthea

Tropical optimal transport and Wasserstein distances. (English) Zbl 07560182

Inf. Geom. 5, No. 1, 247-287 (2022).

MSC:  92Dxx 14Txx 49Qxx

PDF BibTeX XML Cite

Full Text: DOI  

Cited by 5 Related articles All 6 versions

2022   


Qin, Qian; Hobert, James P.

Geometric convergence bounds for Markov chains in Wasserstein distance based on generalized drift and contraction conditions. (English. French summary) Zbl 07557525

Ann. Inst. Henri Poincaré, Probab. Stat. 58, No. 2, 872-889 (2022).

MSC:  60J05

PDF BibTeX XML Cite

Full Text: DOI

Reygner, Julien; Touboul, Adrien

Reweighting samples under covariate shift using a Wasserstein distance criterion. (English) Zbl 07556932

Electron. J. Stat. 16, No. 1, 3278-3314 (2022).

MSC:  62-XX

PDF BibTeX XML Cite

Full Text: DOI

Cited by 3 Related articles All 29 versions

Zhu, Xianchao; Zhang, Ruiyuan; Huang, Tianyi; Wang, Xiaoting

Visual transfer for reinforcement learning via gradient penalty based Wasserstein domain confusion. (English) Zbl 07556354

J. Nonlinear Var. Anal. 6, No. 3, 227-238 (2022).

MSC:  47-XX 46-XX

PDF BibTeX XML Cite

Full Text: DOI


 

Cheramin, Meysam; Cheng, Jianqiang; Jiang, Ruiwei; Pan, Kai

Computationally efficient approximations for distributionally robust optimization under moment and Wasserstein ambiguity. (English) Zbl 07552234

INFORMS J. Comput. 34, No. 3, 1768-1794 (2022).

MSC:  90-XX

PDF BibTeX XML 

Computationally Efficient Approximations for Distributionally Robust Optimization Under Moment and Wasserstein Ambiguity

By: Cheramin, Meysam; Cheng, Jianqiang; Jiang, Ruiwei; et al.

INFORMS JOURNAL ON COMPUTING    

Early Access: JAN 2022 

Related articles
Zbl 07552234
  MR4445886 

VISUAL TRANSFER FOR REINFORCEMENT LEARNING VIA GRADIENT PENALTY BASED WASSERSTEIN DOMAIN CONFUSION

X Zhu, R Zhang, T Huang… - JOURNAL OF …, 2022 - BIEMDAS ACAD PUBLISHERS INC …

Cite Related articles

Research articleOpen access
A GPM-based algorithm for solving regularized Wasserstein barycenter problems in some spaces of probability measures
Journal of Computational and Applied Mathematics, 12 July 2022...

S. Kum, M. H. Duong, S. Yun

Download PDF

All 5 versions

MR4456457
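
For reference, the unregularized Wasserstein barycenter problem that the GPM-based algorithm above regularizes has the standard form

$\bar\nu \in \arg\min_{\nu} \sum_{i=1}^{N} \lambda_i\, W_2^2(\mu_i,\nu)$, with weights $\lambda_i\ge 0$, $\sum_i\lambda_i = 1$;

the specific regularization used in the paper is not reproduced here.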

<–—2022———2022———870— 


Research articleFull text access
Interval-valued functional clustering based on the Wasserstein distance with application to stock data
Information Sciences30 May 2022...

Lirong Sun; Lijun Zhu; Tomas Balezentis

Download PDF

Journal Article  Full Text Online

Research article
A Wasserstein metric-based distributionally robust optimization approach for reliable-economic equilibrium operation of hydro-wind-solar energy systems
Renewable Energy1 July 2022...

Xiaoyu Jin; Benxi Liu; Zhiyu Yan

Full Text at Publisher

82 References  Related records

Research article
Wasserstein approximate bayesian computation for visual tracking
Pattern Recognition16 July 2022...

Jinhee ParkJunseok Kwon

Wasserstein approximate bayesian computation for visual tracking

by Park, Jinhee; Kwon, Junseok

Pattern recognition, 11/2022, Volume 131

•Our WABC is the first to use the Wasserstein distance to approximate the likelihood in visual tracking.•Our TWABC encodes the

temporal interdependence between...

Journal ArticleCitation Online


Research articleFull text access
Learning brain representation using recurrent Wasserstein generative adversarial net
Computer Methods and Programs in Biomedicine27 June 2022...

Ning Qiang; Qinglin Dong; Shijie Zhao

Download PDF

View full text

67 References  Related records


Learning brain representation using recurrent Wasserstein...

by Qiang, Ning; Dong, Qinglin; Liang, Hongtao ; More...

Computer methods and programs in biomedicine, 08/2022, Volume 223

Keywords fMRI; Functional Brain Network; Generative Adversarial Net; Deep Learning; Unsupervised Learning Highlights * Propose a novel Recurrent Wasserstein...

Article PDFPDF

Journal Article  Full Text Online


Scholarly Journal

Energy data generation with Wasserstein Deep Convolutional Generative Adversarial Networks

Li, Jianbin; Chen, Zhiqiang; Cheng, Long; Liu, Xiufeng.

Energy Vol. 257,  (Oct 15, 2022)


Research articleOpen access
Energy data generation with Wasserstein Deep Convolutional Generative Adversarial Networks
Energy7 July 2022...

Jianbin Li; Zhiqiang Chen; Xiufeng Liu

Download PDF

 Cited by 12 All 4 versions


2022

arXiv:2207.12315  [pdfother]  cs.AI  cs.CV cs.DC cs.LG  cs.MA
Stable Parallel Training of Wasserstein Conditional Generative Adversarial Neural Networks
Authors: Massimiliano Lupo Pasini; Junqi Yin
Abstract: We propose a stable, parallel approach to train Wasserstein Conditional Generative Adversarial Neural Networks (W-CGANs) under the constraint of a fixed computational budget. Differently from previous distributed GANs training techniques, our approach avoids inter-process communications, reduces the risk of mode collapse and enhances scalability by using multiple generators, each one of them concu… 
More
Submitted 25 July, 2022; originally announced July 2022.
Comments: 22 pages; 9 figures
MSC Class: 68T01; 68T10; 68M14; 65Y05; 65Y10 ACM Class: I.2.0; I.2.11; C.1.4; C.2.4
Journal Article  Full Text Online

Free Submitted Article From RepositoryFull Text at Publisher

31 References  Related records

 All 2 versions 

arXiv:2207.12279  [pdfother]  stat.ML   cs.LG  math.OC
Orthogonalization of data via Gromov-Wasserstein type feedback for clustering and visualization
Authors: Martin Ryner; Johan Karlsson
Abstract: In this paper we propose an adaptive approach for clustering and visualization of data by an orthogonalization process. Starting with the data points being represented by a Markov process using the diffusion map framework, the method adaptively increase the orthogonality of the clusters by applying a feedback mechanism inspired by the Gromov-Wasserstein distance. This mechanism iteratively increas…  More
Submitted 25 July, 2022; originally announced July 2022.
Comments: 19 pages, 3 figures
Journal Article  Full Text Online

All 3 versions 


[PDF] arxiv.org

Mean field Variational Inference via Wasserstein Gradient Flow

R Yao, Y Yang - arXiv preprint arXiv:2207.08074, 2022 - arxiv.org

… In this work, we develop a general computational framework for implementing MF-VI via 

Wasserstein gradient flow (WGF), a gradient flow over the space of probability measures. When …

All 4 versions 


 arXiv:2207.14727  [pdfother]  stat.ML   cs.LG  econ.EM  math.ST
Tangential Wasserstein Projections
Authors: Florian Gunsilius; Meng Hsuan Hsieh; Myung Jin Lee
Abstract: We develop a notion of projections between sets of probability measures using the geometric properties of the 2-Wasserstein space. It is designed for general multivariate probability measures, is computationally efficient to implement, and provides a unique solution in regular settings. The idea is to work on regular tangent cones of the Wasserstein space using generalized geodesics. Its structure…  More
Submitted 29 July, 2022; originally announced July 2022.
Journal Article  Full Text Online

All 5 versions 

ARTICLE

Wasserstein interpolation with constraints and application to a parking problem

Buttazzo, Giuseppe ; Carlier, Guillaume ; Eichinger, KatharinaarXiv.org, 2022

OPEN ACCESS

Wasserstein interpolation with constraints and application to a parking problem

Available Online 

arXiv:2207.14261  [pdfother]  math.OC
Wasserstein interpolation with constraints and application to a parking problem
Authors: Giuseppe Buttazzo; Guillaume Carlier; Katharina Eichinger
Abstract: We consider optimal transport problems where the cost for transporting a given probability measure $\mu_0$ to another one $\mu_1$ consists of two parts: the first one measures the transportation from $\mu_0$ to an intermediate (pivot) measure $\mu$ to be determined (and subject to various constraints), and the second one measures the transportation from $\mu$ to $\mu_1$. This leads to Wasserstein interpolatio…  More
Submitted 28 July, 2022; originally announced July 2022.
Comments: 30 pages, 7 figures
MSC Class: 49Q22; 49J45; 49M29; 49K99

Journal Article  Full Text Online

All 3 versions 

<–—2022———2022———880—


2022 see 2021

arXiv:2207.13177  [pdfother]  stat.ML   cs.LG
Sliced Wasserstein Variational Inference
Authors: Mingxuan Yi; Song Liu
Abstract: Variational Inference approximates an unnormalized distribution via the minimization of Kullback-Leibler (KL) divergence. Although this divergence is efficient for computation and has been widely used in applications, it suffers from some unreasonable properties. For example, it is not a proper metric, i.e., it is non-symmetric and does not preserve the triangle inequality. On the other hand, opti… 
More
Submitted 26 July, 2022; originally announced July 2022.

Journal Article  Full Text Online

Cited by 2 Related articles 
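
For context, the sliced Wasserstein distance that replaces the KL divergence in this entry is usually defined by averaging one-dimensional Wasserstein distances over projection directions (standard definition, stated here for reference):

$SW_p^p(\mu,\nu) = \int_{\mathbb{S}^{d-1}} W_p^p(\theta_\#\mu,\theta_\#\nu)\, d\sigma(\theta)$,

where $\theta_\#\mu$ is the pushforward of $\mu$ under $x\mapsto\langle\theta,x\rangle$ and $\sigma$ is the uniform measure on the unit sphere.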

ARTICLE

Sensitivity of multi-period optimization problems in adapted Wasserstein distance

Bartl, Daniel ; Wiesel, JohannesIDEAS Working Paper Series from RePEc, 2022

OPEN ACCESS

Sensitivity of multi-period optimization problems in adapted Wasserstein distance

No Online Access 

arXiv:2208.05656  [pdfpsother]  math.OC   math.PR   q-fin.MF
Sensitivity of multi-period optimization problems in adapted Wasserstein distance
Authors: Daniel Bartl; Johannes Wiesel
Abstract: We analyze the effect of small changes in the underlying probabilistic model on the value of multi-period stochastic optimization problems and optimal stopping problems. We work in finite discrete time and measure these changes with the adapted Wasserstein distance. We prove explicit first-order approximations for both problems. Expected utility maximization is discussed as a special case.
Submitted 11 August, 2022; originally announced August 2022.

arXiv:2208.03323  [pdfother]  eess.IV  cs.CV doi10.1145/3503161.3548193
DeepWSD: Projecting Degradations in Perceptual Space to Wasserstein Distance in Deep Feature Space
Authors: Xigran Liao; Baoliang Chen; Hanwei Zhu; Shiqi Wang; Mingliang Zhou; Sam Kwong
Abstract: Existing deep learning-based full-reference IQA (FR-IQA) models usually predict the image quality in a deterministic way by explicitly comparing the features, gauging how severely distorted an image is by how far the corresponding feature lies from the space of the reference images. Herein, we look at this problem from a different viewpoint and propose to model the quality degradation in perceptua… 
More
Submitted 4 August, 2022; originally announced August 2022.
Comments: ACM Multimedia 2022 accepted thesis
Journal Article  Full Text Online


ARTICLE

GeoECG: Data Augmentation via Wasserstein Geodesic Perturbation for Robust Electrocardiogram Prediction

Jiacheng Zhu ; Jielin Qiu ; Zhuolin Yang ; Douglas Weber ; Michael A Rosenberg ; Emerson Liu ; Bo Li ; Ding Zhao; arXiv.org, 2022

OPEN ACCESS

GeoECG: Data Augmentation via Wasserstein Geodesic Perturbation for Robust Electrocardiogram Prediction

Available Online 

arXiv:2208.01220  [pdfother]  stat.ML  cs.LG eess.SP 

GeoECG: Data Augmentation via Wasserstein Geodesic Perturbation for Robust Electrocardiogram Prediction
Authors: Jiacheng Zhu; Jielin Qiu; Zhuolin Yang; Douglas Weber; Michael A. Rosenberg; Emerson Liu; Bo Li; Ding Zhao
Abstract: There has been an increased interest in applying deep neural networks to automatically interpret and analyze the 12-lead electrocardiogram (ECG). The current paradigms with machine learning methods are often limited by the amount of labeled data. This phenomenon is particularly problematic for clinically-relevant data, where labeling at scale can be time-consuming and costly in terms of the specia…  More
Submitted 10 August, 2022; v1 submitted 1 August, 2022; originally announced August 2022.
Comments: 26 pages, Figure 13, Machine Learning for Healthcare 2022
Journal ref: Machine Learning for Healthcare 2022, JMLR Volume 182

Working Paper  Full Text

GeoECG: Data Augmentation via Wasserstein Geodesic Perturbation for Robust Electrocardiogram Prediction
Zhu, Jiacheng; Qiu, Jielin; Yang, Zhuolin; Weber, Douglas; Rosenberg, Michael A; et al.
arXiv.org; Ithaca, Aug 10, 2022.

Cited by 5 All 4 versions


 
A 3D reconstruction method of porous media based on improved WGAN...
by Zhang, Ting; Liu, Qingyang; Wang, Xianwu ; More...
Computers & geosciences, 08/2022, Volume 165
The reconstruction of porous media is important to the development of petroleum industry, but the accurate characterization of the internal structures of...
Article PDFPDF
Journal Article  Full Text Online
 Related articles


 2022



Electrocardiograph Based Emotion Recognition via WGAN...
by Hu, Jiayuan; Li, Yong
Intelligent Robotics and Applications, 08/2022
Emotion recognition is one of the key technologies for the further development of human-computer interaction, and is gradually becoming a hot spot in current...
Book Chapter  Full Text Online


Multiview Wasserstein generative adversarial...
by Gao, Shuang; Dai, Yun; Li, Yingjie ; More...
Measurement science & technology, 08/2022, Volume 33, Issue 8
Abstract This work described in this paper aims to enhance the level of automation of industrial pearl classification through deep learning methods. To better...
Article PDFPDF
Journal Article  Full Text Online


2022 see 2021
EEG data augmentation for emotion recognition with a multiple generator conditional Wasserstein...
by Zhang, Aiming; Su, Lei; Zhang, Yin ; More...
Complex & intelligent systems, 04/2021, Volume 8, Issue 4
EEG-based emotion recognition has attracted substantial attention from researchers due to its extensive application prospects, and substantial progress has...
Article PDFPDF
Journal Article  Full Text Online


2022 see 2021 
Universality of persistence diagrams and the bottleneck and Wasserstein...
by Bubenik, Peter; Elchesen, Alex
arXiv.org, 10/2021
We prove that persistence diagrams with the p-Wasserstein distance form the universal p-subadditive commutative monoid on an underlying metric space with a...
Paper  Full Text Online
More Options 

 

[PDF] arxiv.org

Topological Continual Learning with Wasserstein Distance and Barycenter

T Songdechakraiwut, X Yin, BD Van Veen - arXiv preprint arXiv …, 2022 - arxiv.org

… The human brain, however, is able to continually learn new … success in the human brain is 

potentially associated with its … form expressions of the Wasserstein distance and barycenter …

Related articles All 3 versions

Part of the Lecture Notes in Computer Science book series (LNAI,volume 13458)

Abstract

Aiming at enhancing classification performance and improving user experience of a brain-computer interface (BCI) system, this paper proposes an improved Wasserstein generative adversarial network (WGAN) method to generate EEG samples in virtual channels. The feature extractor and the proposed WGAN model with a newly designed feature loss are trained. Artificial EEG of virtual channels is then generated by the improved WGAN with EEG of multiple physical channels as the input. Motor imagery (MI) classification with a CNN-based classifier is performed on two EEG datasets. The experimental results show that the generated EEG of virtual channels is valid: it is similar to the ground truth and has learned important EEG features of the other channels. The classification performance of the classifier with low-channel EEG is significantly improved with the help of the generated EEG of virtual channels. Meanwhile, user experience of the BCI application is also improved by replacing multi-channel EEG with low-channel EEG. The feasibility and effectiveness of the proposed method are verified.
EEG Generation of Virtual Channels Using an Improved Wasserstein GAN
by Li, Ling-Long; Cao, Guang-Zhong; Liang, Hong-Jie ; More...
Intelligent Robotics and Applications, 08/2022
Aiming at enhancing classification performance and improving user experience of a brain-computer interface (BCI) system, this paper proposes an improved...
Book Chapter  Full Text Online
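
Many of the WGAN entries in this year's list rely on the same gradient-penalty critic objective. Below is a minimal PyTorch sketch of the standard WGAN-GP critic loss (Gulrajani et al.); it is not the feature-loss variant described in the abstract above, and critic, real, fake and the penalty weight lam are placeholder assumptions.

import torch

def gradient_penalty(critic, real, fake):
    # WGAN-GP penalty: push the critic's gradient norm toward 1 on random
    # interpolates between real and generated samples.
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads = torch.autograd.grad(
        outputs=scores, inputs=interp,
        grad_outputs=torch.ones_like(scores),
        create_graph=True, retain_graph=True,
    )[0].reshape(real.size(0), -1)
    return ((grads.norm(2, dim=1) - 1.0) ** 2).mean()

def critic_loss(critic, real, fake, lam=10.0):
    # Quantity the critic minimizes: negative Wasserstein estimate plus penalty.
    # `fake` is assumed to be detached from the generator's graph.
    return critic(fake).mean() - critic(real).mean() + lam * gradient_penalty(critic, real, fake)

The generator is then updated to minimize -critic(G(z)).mean(), exactly as in the unpenalized WGAN.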

<–—2022———2022———890—



2022 Duke
Michigan State University secures contract for Nonlocal Reaction-Diffusion Equations And Wasserstein...
Pivotal Sources, Jul 26, 2022
Newspaper Article


 Working Paper  Full Text

Sensitivity of multi-period optimization problems in adapted Wasserstein distance
Bartl, Daniel; Wiesel, Johannes.
arXiv.org; Ithaca, Aug 11, 2022.
All 4 versions


 [PDF] arxiv.org

Gromov-Wasserstein Autoencoders

N Nakagawa, R Togo, T Ogawa… - arXiv preprint arXiv …, 2022 - arxiv.org

… representation learning method, Gromov-Wasserstein Autoencoders (GWAE), … -based 

objective, GWAE models have a trainable prior optimized by minimizing the GromovWasserstein (…

Cited by 1 Related articles All 2 versions

2022 thesis

DeepWSD: Projecting Degradations in Perceptual Space to Wasserstein Distance in Deep Feature Space

https://arxiv.org › eess

by X Liao · 2022 — The deep Wasserstein distance (DeepWSD) performed on features from neural networks enjoys ... Comments: ACM Multimedia 2022 accepted thesis.

Working Paper  Full Text

DeepWSD: Projecting Degradations in Perceptual Space to Wasserstein Distance in Deep Feature Space

Liao, Xigran; Chen, Baoliang; Zhu, Hanwei; Wang, Shiqi; Zhou, Mingliang; et al.

arXiv.org; Ithaca, Aug 5, 2022.
Cited by 6
 Related articles All 3 versions


Working Paper Full Text

Tangential Wasserstein Projections
Gunsilius, Florian; Meng Hsuan Hsieh; Myung Jin Lee.
arXiv.org; Ithaca, Aug 2, 2022.


Related articles All 5 versions

Cited by 1 Related articles All 3 versions


2022


 Scholarly Journal Citation/Abstract

Remaining useful life estimation of bearings under different working conditions via Wasserstein distance-based weighted domain adaptation
Hu, Tao; Guo, Yiming; Gu, Liudong; Zhou, Yifan; Zhang, Zhisheng; et al.
Reliability Engineering & System Safety; Barking Vol. 224,  (Aug 2022): 1.

Abstract/Details 
Cited by 2 Related articles All 2 versions


2022 see 2021 Scholarly Journal Full Text

EEG data augmentation for emotion recognition with a multiple generator conditional Wasserstein GAN

Zhang, Aiming; Su, Lei; Zhang, Yin; Fu, Yunfa; Wu, Liping; et al.
Complex & Intelligent Systems; Heidelberg Vol. 8, Iss. 4,  (Aug 2022): 3059-3071.
Abstract/DetailsFull text - PDF
 (2 MB)‎

Cited by 9 Related articles All 3 versions


Working Paper Full Text

Wasserstein interpolation with constraints and application to a parking problem

Buttazzo, Giuseppe; Carlier, Guillaume; Eichinger, Katharina.
arXiv.org; Ithaca, Jul 28, 2022.


Working Paper Full Text

Wasserstein stability of porous medium-type equations on manifolds with Ricci curvature bounded below
De Ponti, Nicolò; Muratori, Matteo; Orrieri, Carlo.
arXiv.org; Ithaca, Jul 28, 2022.
Zbl 07573814

Cited by 1 Related articles All 3 versions


2022 see 2021  Working Paper Full Text

Sliced Wasserstein Variational Inference
Yi, Mingxuan; Liu, Song.
arXiv.org; Ithaca, Jul 26, 2022.
Cited by 5 Related articles



<–—2022———2022———900— 



Working Paper Full Text

A characterisation of convex order using the 2-Wasserstein distance
Wiesel, Johannes; Zhang, Erica.
arXiv.org; Ithaca, Jul 26, 2022.


Working Paper Full Text

Orthogonalization of data via Gromov-Wasserstein type feedback for clustering and visualization

Ryner, Martin; Karlsson, Johan.
arXiv.org; Ithaca, Jul 25, 2022.


[PDF] arxiv.org

Wasserstein Distributional Learning

C Tang, N Lenssen, Y Wei, T Zheng - arXiv preprint arXiv:2209.04991, 2022 - arxiv.org

Wasserstein loss both from a theoretical and a computational perspective. We show that under 

the Wasserstein … The proposed combination of SCGMM and Wasserstein loss is therefore …

All 2 versions


Working Paper Full Text

On Combinatorial Properties of Greedy Wasserstein Minimization
Steinerberger, Stefan.
arXiv.org; Ithaca, Jul 25, 2022.


 
Working Paper Full Text

Online Stochastic Optimization with Wasserstein Based Non-stationarity
Jiang, Jiashuo; Li, Xiaocheng; Zhang, Jiawei.
arXiv.org; Ithaca, Jul 25, 2022.

All 3 versions


2022


2022 see 2021  Working Paper Full Text

Variational Wasserstein gradient flow
Fan, Jiaojiao; Zhang, Qinsheng; Taghvaei, Amirhossein; Chen, Yongxin.
arXiv.org; Ithaca, Jul 24, 2022.


we explore Wasserstein distance metric across ontology concept embeddings. Wasserstein 

… ated in the continuous spaces of ontology embeddings, differing from string-based distance …

All 2 versions View as HTML 


 
Working Paper Full Text

Exploring Wasserstein Distance across Concept Embeddings for Ontology Matching
Yuan, An; Kalinowski, Alex; Greenberg, Jane.
arXiv.org; Ithaca, Jul 22, 2022

between the string-based distances and localWDs (string-context-… on the values related to 

the string-context-distance metric. … Clark et al. in [11] developed a Wasserstein distance-based …

All 3 versions View as HTML 


Working Paper Full Text

Ornstein-Uhlenbeck Type Processes on Wasserstein Space
Ren, Panpan; Feng-Yu, Wang.
arXiv.org; Ithaca, Jul 22, 2022.

Cited by 1
All 2 versions

  Paper Full Text

A General Wasserstein Framework for Data-driven Distributionally Robust Optimization: Tractability and Applications
Jonathan Yu-Meng Li; Mao, Tiantian.
arXiv.org; Ithaca, Jul 19, 2022.


aper Full Text

Unsupervised Ground Metric Learning using Wasserstein Singular Vectors
Huizing, Geert-Jan; Cantini, Laura; Peyré, Gabriel.
arXiv.org; Ithaca, Jul 19, 2022.


Cited by 1 Related articles All 5 versions

<–—2022———2022———910— 


Working Paper Full Text

Wasserstein convergence rates of increasingly concentrating probability measures
Hasenpflug, Mareike; Rudolf, Daniel; Sprungk, Björn.
arXiv.org; Ithaca, Jul 18, 2022.



2022 see 2021

[PDF] arxiv.org

Generalized Wasserstein Dice Loss, Test-Time Augmentation, and Transformers for the BraTS 2021 Challenge

L Fidon, S Shit, I Ezhov, JC Paetzold, S Ourselin… - International MICCAI …, 2022 - Springer

Brain tumor segmentation from multiple Magnetic Resonance Imaging (MRI) modalities is a

challenging task in medical image computation. The main challenges lie in the …

 Related articles All 3 versions

BOOK CHAPTER

Generalized Wasserstein Dice Loss, Test-Time Augmentation, and Transformers for the BraTS 2021 Challenge

Fidon, Lucas ; Shit, Suprosanna ; Ezhov, Ivan ; Paetzold, Johannes C ; Ourselin, Sébastien ; Vercauteren, TomBrainlesion: Glioma, Multiple Sclerosis, Stroke and Traumatic Brain Injuries, 2022, p.187-196

[PDF] arxiv.org

Wasserstein Graph Distance based on L1-Approximated Tree Edit Distance between Weisfeiler-Lehman Subtrees

Z Fang, J Huang, X Su, H Kasai - arXiv preprint arXiv:2207.04216, 2022 - arxiv.org

… first time and defines a metric we call the Wasserstein WL subtree (WWLS) distance. We

introduce … Finally, we use the Wasserstein distance to reflect the L1–TED to the graph level. The …

 All 2 versions 


A novel sEMG data augmentation based on WGAN-GP

Coelho, F; Pinto, MF; (...); Marcato, ALM

Jul 2022 (Early Access) | COMPUTER METHODS IN BIOMECHANICS AND BIOMEDICAL ENGINEERING

Enriched Cited References. The classification of sEMG signals is fundamental in applications that use mechanical prostheses, making it necessary to work with generalist databases that improve the accuracy of those classifications. Therefore, synthetic signal generation can be beneficial in enriching a database to make it more generalist. This work proposes using a variant of generative adversarial networks to produce syn…

 All 3 versions


Online Try-On: GANs and TPS

Han, QC and Zhao, MB

2021 | AATCC JOURNAL OF RESEARCH 8 , pp.117-127

Style transfer between images has been a research direction gaining considerable attention in the field of image generation. CycleGAN is widely used because it does not require paired image data to train, which greatly reduces the cost of collecting data. In 2018, based on CycleGAN, a new model structure, InstaGAN, was proposed and then applied in the style transfer algorithm in the special par


 

2022


2022 see 2021

Wasserstein Distributionally Robust Look-Ahead Economic Dispatch

Poolla, BK; Hota, A; (...); Cherukuri, A

IEEE-Power-and-Energy-Society General Meeting (PESGM)

2021 | 2021 IEEE POWER & ENERGY SOCIETY GENERAL MEETING (PESGM)


2022 see 2021

Robust W-GAN-based estimation under Wasserstein contamination

Liu, Z and Loh, PL

Aug 2022 (Early Access) | INFORMATION AND INFERENCE-A JOURNAL OF THE IMA

Robust estimation is an important problem in statistics which aims at providing a reasonable estimator when the data-generating distribution lies within an appropriately defined ball around an uncontaminated distribution. Although minimax rates of estimation have been established in recent years, many existing robust estimators with provably optimal convergence rates are also computationally in


Free Submitted Article From RepositoryFull Text at Publisher

45 References

Related records   


Class-rebalanced wasserstein distance for multi-source domain adaptation

Wang, Q; Wang, SS and Wang, BL

Jul 2022 (Early Access) | APPLIED INTELLIGENCE

Enriched Cited References. In the study of machine learning, multi-source domain adaptation (MSDA) handles multiple datasets which are collected from different distributions by using domain-invariant knowledge extraction. However, the current studies mainly employ features and raw labels on the joint space to perform domain alignment, neglecting the intrinsic structure of the label distribution, which can harm the performance…

OBSTRUCTIONS TO EXTENSION OF WASSERSTEIN DISTANCES FOR VARIABLE MASSES

Lombardini, L and Rossi, F

Jul 2022 (Early Access) | PROCEEDINGS OF THE AMERICAN MATHEMATICAL SOCIETY

We study the possibility of defining a distance on the whole space of measures, with the property that the distance between two measures having the same mass is the Wasserstein distance, up to a scaling factor. We prove that, under very weak and natural conditions, if the base space is unbounded, then the scaling factor must be constant, independently of the mass. Moreover, no such distance can

Free Submitted Article From RepositoryView full text

18References  Related records


Related articles All 3 versions
MR4489320
 


2022 see 2021

Domain Adaptive Rolling Bearing Fault Diagnosis based on Wasserstein Distance

Yang, CL; Wang, XD; (...); Li, ZR

33rd Chinese Control and Decision Conference (CCDC)

2021 | PROCEEDINGS OF THE 33RD CHINESE CONTROL AND DECISION CONFERENCE (CCDC 2021) , pp.77-83

The rolling bearing usually runs at different speeds and loads, which leads to a corresponding change in the distribution of data. The cross-domain problem caused by different data distributions can degrade the performance of deep learning-based fault diagnosis models. To address this problem, this paper proposes a multilayer domain adaptive method based on Wasserstein distance for fault diagno

Full Text at Publisher

12 References  Related records

<–—2022———2022———920— 



Informer-WGAN: High Missing Rate Time Series Imputation Based on Adversarial Training and a Self-Attention Mechanism

Qian, YF; Tian, LM; (...); Wu, R

Jul 2022 | ALGORITHMS 15 (7)

Enriched Cited References. Missing observations in time series will distort the data characteristics, change the dataset expectations, high-order distances, and other statistics, and increase the difficulty of data analysis. Therefore, data imputation needs to be performed first. Generally, data imputation methods include statistical imputation, regression imputation, multiple imputation, and imputation based on machine…

Free Full Text from Publisher  21 References  Related records

 All 4 versions 


Improving reproducibility and performance of radiomics in low-dose CT using cycle GANs

Chen, JH; Wee, L; (...); Bermejo, I

Jul 2022 (Early Access) | JOURNAL OF APPLIED CLINICAL MEDICAL PHYSICS

Enriched Cited References. Background: As a means to extract biomarkers from medical imaging, radiomics has attracted increased attention from researchers. However, reproducibility and performance of radiomics in low-dose CT scans are still poor, mostly due to noise. Deep learning generative models can be used to denoise these images and in turn improve radiomics' reproducibility and performance.

Free Submitted Article From RepositoryFull Text at Publisher View Associated Data

66 References  Related records


Finite-Sample Guarantees for Wasserstein Distributionally Robust Optimization: Breaking the Curse of Dimensionality

Gao, R

Jul 2022 (Early Access) | OPERATIONS RESEARCH

Wasserstein distributionally robust optimization (DRO) aims to find robust and generalizable solutions by hedging against data perturbations in Wasserstein distance. Despite its recent empirical success in operations research and machine learning, existing performance guarantees for generic loss functions are either overly conservative because of the curse of dimensionality or plausible only i

 Free Submitted Article From RepositoryFull Text at Publisher

102 References  Related records

Cited by 20 Related articles All 3 versions
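
The Wasserstein distributionally robust problems in this and several nearby entries share the generic formulation (notation assumed, not taken from the paper): given an empirical distribution $\hat P_n$ and a radius $\rho>0$,

$\min_{\theta}\ \sup_{Q:\ W_p(Q,\hat P_n)\le\rho}\ \mathbb{E}_Q[\ell(\theta,\xi)]$,

i.e. the loss is hedged against every distribution in a $p$-Wasserstein ball around the data.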

Convergence rates for empirical measures of Markov chains in dual and Wasserstein distances?

Riekert, A

Oct 2022 | STATISTICS & PROBABILITY LETTERS 189

We consider a Markov chain on Rd with invariant measure mu. We are interested in the rate of convergence of the empirical measures towards the invariant measure with respect to various dual distances, including in particular the 1-Wasserstein distance. The main result of this article is a new upper bound for the expected distance, which is proved by combining a Fourier expansion with a truncati


Full Text at Publisher

19 References  Related records 

MR4452774 


On the use of Wasserstein distance in the distributional analysis of human decision making under uncertainty

Candelieri, A; Ponti, A; (...); Archetti, F

Jul 2022 (Early Access) | ANNALS OF MATHEMATICS AND ARTIFICIAL INTELLIGENCE

Enriched Cited References. The key contribution of this paper is a theoretical framework to analyse humans' decision-making strategies under uncertainty, and more specifically how human subjects manage the trade-off between information gathering (exploration) and reward seeking (exploitation), in particular in active learning on a black-box optimization task. Humans' decision making according to these two objectives ca…

Free Full Text From Publisher

Cited by 3 Related articles

2022


[PDF] elte.hu

[PDF] Optimális transzport és Wasserstein-terek [Hungarian: Optimal transport and Wasserstein spaces]

T Tamás, RAM Kutatóintézet - math.elte.hu

The theory of optimal transport and Wasserstein spaces is a very widely applied area of mathematics. Among other things, it is used in logistics, economics, …

Related articles 


[PDF] mdpi.com

Neurocartographer: CC-WGAN Based SSVEP Data Generation to Produce a Model toward Symmetrical Behaviour to the Human Brain

SE Karabulut, MM Khorasani, A Pantanowitz - Symmetry, 2022 - mdpi.com

… This work proposes a class-conditioned Wasserstein generative adversarial network with a

gradient penalty loss for electroencephalogram data generation. Electroencephalogram data …

 28 References  Related records

All 4 versions

Dual-WGAN-c: A GAN-based acoustic impedance inversion method

Z Wang, S Wang, C Zhou, W Cheng - Geophysics, 2022 - library.seg.org

… an acoustic impedance inversion method based on Dual Wasserstein Generative Adversarial

Network condition (Dual-WGAN-c). Dual-WGAN-c can perform seismic inversion and …


Electrocardiograph Based Emotion Recognition via WGAN-GP Data Enhancement and Improved CNN

J Hu, Y Li - International Conference on Intelligent Robotics and …, 2022 - Springer

… However, WGAN uses weight cropping to restrict the absolute value … The loss function of WGAN-GP consists of two parts: the loss term and the gradient penalty term of the native WGAN

All 2 versions

 

[PDF] researchsquare.com

[PDF] Learned Pseudo-Random Number Generator: WGAN-GP for Generating Statistically Robust Random Numbers

K Okada, K Endo, K Yasuoka, S Kurabayashi - 2022 - researchsquare.com

… In this paper, we propose a Wasserstein distance-based generative adversarial network (WGAN) …

We remove the dropout layers from the conventional WGAN network to learn random …

 All 5 versions 

<–—2022———2022———930—


WGAN-GP and LSTM based Prediction Model for Aircraft 4-D Trajectory

L Zhang, H Chen, P Jia, Z Tian… - … and Mobile Computing …, 2022 - ieeexplore.ieee.org

… The data generation module is performed by the WGAN-GP … of the data generated by the

WGAN-GP network, an LSTM … The architecture of the WGAN-GP model used in this paper …


WGAN-Based Method for Generating Malicious Domain Training Data

K Zhang, B Huang, Y Wu, C Chai, J Zhang… - … Conference on Artificial …, 2022 - Springer

… The generated confrontation network model is WGAN (Wasserstein GAN). WGAN mainly

improves GAN from the perspective of loss function. After the loss function is improved, WGAN …

 All 2 versions


[AVM 最優秀賞記念講演] WL-部分木間の L1 近似木編集距離に基づく新たなグラフ Wasserstein 距離の提案

Z Fang - 研究報告オーディオビジュアル複合情報処理 (AVM), 2022 - ipsj.ixsq.nii.ac.jp


[Japanese: [AVM Grand Prize Commemorative Lecture] Proposal of a new graph Wasserstein distance based on an L1-approximated tree edit distance between WL-subtrees]


Automated segmentation of endometriosis using transfer ...

https://f1000research.com › articles › pdf

https://f1000research.com › articles › pdfPDF

by S Visalaxi · 2022 — F1000Research 2022, 11:360 Last updated: 23 JUN 2022 ... and Sudalaimuthu T. Automated segmentation of endometriosis using transfer learning.

[CITATION] Automated Segmentation of Adnexal Ovarian Metastases Using Joint Distribution Wasserstein Distance Loss Metric

A Nazib, K Boehm, V Paroder… - MEDICAL …, 2022 - … ST, HOBOKEN 07030-5774, NJ USA


ARTICLE

A two-step approach to Wasserstein distributionally robust chance- and security-constrained dispatch

Maghami, Amin ; Ursavas, Evrim ; Cherukuri, Ashish; arXiv.org, 2022

OPEN ACCESS

A two-step approach to Wasserstein distributionally robust chance- and security-constrained dispatch

Available Online 

arXiv:2208.07642  [pdfother]  math.OC  eess.SY
A two-step approach to Wasserstein distributionally robust chance- and security-constrained dispatch
Authors: Amin Maghami; Evrim Ursavas; Ashish Cherukuri
Abstract: This paper considers a security constrained dispatch problem involving generation and line contingencies in the presence of the renewable generation. The uncertainty due to renewables is modeled using joint chance-constraint and the mismatch caused by contingencies and renewables are handled using reserves. We consider a distributionally robust approach to solve the chance-constrained program. We…  More
Submitted 16 August, 2022; originally announced August 2022.
Comments: 10 pages, 5 figures

2022


arXiv:2208.06306  [pdfother]  quant-ph  cs.CC  math-ph
Wasserstein Complexity of Quantum Circuits
Authors: Lu Li; Kaifeng Bu; Dax Enshan Koh; Arthur Jaffe; Seth Lloyd
Abstract: Given a unitary transformation, what is the size of the smallest quantum circuit that implements it? This quantity, known as the quantum circuit complexity, is a fundamental property of quantum evolutions that has widespread applications in many fields, including quantum computation, quantum field theory, and black hole physics. In this letter, we obtain a new lower bound for the quantum circuit c…  More
Submitted 12 August, 2022; originally announced August 2022.
Comments: 7+7 pages

Cited by 1 All 2 versions 

Kantorovich Strikes Back! Wasserstein GANs are not Optimal ...

https://arxiv.org › cs

by A Korotin · 2022 — [Submitted on 15 Jun 2022] ... Despite the success of WGANs, it is still unclear how well the underlying OT dual solvers approximate the OT cost ...

All 4 versions


[PDF] researchsquare.com

[PDF] A Novel Framework for Nonparametric Rainfall Generator Based on Deep Convolutional Wasserstein Generative Networks (DC-WGANs)

M Mirzaei, A Dehghani, A Dehghani - 2022 - researchsquare.com

A novel rainfall generator based on Deep convolutional Wasserstein Generative Adversarial 

Networks (DC-WGANs) is implemented to generate spatial-temporal hourly rainfall for the …

All 2 versions 


2022 book review  see 2021

MR4468328 Prelim Santambrogio, Filippo; 

Lectures on optimal transport [book review of MR4294651]; An invitation to optimal transport, Wasserstein distances, and gradient flows [book review of MR4331435]. Eur. Math. Soc. Mag. No. 124 (2022), 60–63. 00A17

Review PDF Clipboard Journal Ar

… Ambrosio, Elia Brué and Daniele Semola, and “An Invitation to Optimal Transport, Wasserstein Distances, and Gradient Flows” by Alessio Figalli and Federico Glaudo

F Santambrogio - European Mathematical Society Magazine, 2022 - ems.press

… (8–10) on the Wasserstein distances and Wasserstein spaces follows. Here the authors do

… metric space (X,d) are inherited by the corresponding Wasserstein space (𝒫(X),W2) (we see …

Related articles

[HTML] hindawi.com

[HTML] … Augmentation Method for Prohibited Item X-Ray Pseudocolor Images in X-Ray Security Inspection Based on Wasserstein Generative Adversarial Network …

D Liu, J Liu, P Yuan, F Yu - Computational Intelligence and …, 2022 - hindawi.com

… After augmenting the dataset, we trained the YOLOV4-tiny model on the training dataset

and the augmented training dataset, respectively. e stochastic gradient descent with momentum …

 All 5 versions 

<–—2022———2022———940—


 

2022 see 2021

Gromov-Wasserstein distances between Gaussian distributions

Delon, J; Desolneux, A and Salmona, A

Aug 2022 (Early Access) | JOURNAL OF APPLIED PROBABILITY

Enriched Cited References

Gromov-Wasserstein distances were proposed a few years ago to compare distributions which do not lie in the same space. In particular, they offer an interesting alternative to the Wasserstein distances for comparing probability measures living on Euclidean spaces of different dimensions. We focus on the Gromov-Wasserstein distance with a ground cost defined as the squared Euclidean distance, an


Free Submitted Article From RepositoryView full text

27  References  Related records

 MR4507687
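
For context, a common definition of the Gromov-Wasserstein distance between metric measure spaces $(X,d_X,\mu)$ and $(Y,d_Y,\nu)$ is, up to a conventional constant,

$GW_p(\mu,\nu) = \inf_{\pi\in\Pi(\mu,\nu)} \Big(\iint |d_X(x,x')-d_Y(y,y')|^p \, d\pi(x,y)\, d\pi(x',y')\Big)^{1/p}$;

the entry above studies the variant whose ground cost is the squared Euclidean distance.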


2022 cited by many. See 2023

Distributionally Robust Stochastic Optimization with Wasserstein Distance

Gao, R and Kleywegt, A

Aug 2022 (Early Access) | MATHEMATICS OF OPERATIONS RESEARCH , pp.1-53

Distributionally robust stochastic optimization (DRSO) is an approach to optimization under uncertainty in which, instead of assuming that there is a known true underlying probability distribution, one hedges against a chosen set of distributions. In this paper, we first point out that the set of distributions should be chosen to be appropriate for the application at hand and some of the choice


Free Submitted Article From RepositoryFull Text at Publisher

58 References  Related records

Cited by 544 Related articles All 5 versions


Accelerating the discovery of anticancer peptides targeting lung and breast cancers with the Wasserstein autoencoder model and PSO algorithm

Yang, LJ; Yang, GH; (...); Yang, L

Aug 2022 (Early Access) | BRIEFINGS IN BIOINFORMATICS. Enriched Cited References

In the development of targeted drugs, anticancer peptides (ACPs) have attracted great attention because of their high selectivity, low toxicity and minimal non-specificity. In this work, we report a framework of ACPs generation, which combines Wasserstein autoencoder (WAE) generative model and Particle Swarm Optimization (PSO) forward search algorithm guided by attribute predictive model to gen

Show more Full Text at Publisher
All 4 versions


[PDF] machinelearning.ru

[PDF] Wasserstein gradient flows: modeling and applications

P Mokrov - 2022 - machinelearning.ru

… Wasserstein gradient flows provide a powerful means of … over entropy functionals in

Wasserstein space. This equivalence, … We introduce a scalable method to approximate …

All 2 versions 

 

[PDF] researchsquare.com

[PDF] Reliability Metrics of Explainable CNN based on Wasserstein Distance for Cardiac Evaluation

Y Omae, Y Kakimoto, Y Saito, D Fukamachi… - 2022 - researchsquare.com

In recent works, convolutional neural networks (CNN) have been used in the non-invasive

examination of the cardiac region for estimating pulmonary artery wedge pressure (PAWP) …

All 3 versions 


2022

Finite-sample guarantees for Wasserstein distributionally robust optimization: Breaking the curse of dimensionality

R Gao - Operations Research, 2022 - pubsonline.informs.org

… distributions whose p-Wasserstein distance Wp to the empiricalWasserstein DRO and its 

associated variation regularization and demonstrate the bias-variation tradeoff in Wasserstein

Cited by 23 Related articles All 3 versions

[PDF] openreview.net

Orthogonal Gromov Wasserstein Distance with Efficient Lower Bound

H Jin, Z Yu, X Zhang - The 38th Conference on Uncertainty in …, 2022 - openreview.net

… The Gromov-Wasserstein (GW) discrepancy formulates a … the orthogonal Gromov-Wasserstein

(OGW) discrepancy that … It also directly extends to the fused Gromov-Wasserstein (FGW) …

 Related articles 


BOOK CHAPTER

Deep Convolutional Embedded Fuzzy Clustering with Wasserstein Loss

Chen, Tianzhen ; Sun, Wei; Artificial Intelligence in Data and Big Data Processing, 2022, p.163-174

Deep Convolutional Embedded Fuzzy Clustering with Wasserstein Loss

No Online Access 

Deep Convolutional Embedded Fuzzy Clustering with Wasserstein Loss

T Chen, W Sun - International Conference on Artificial Intelligence and …, 2022 - Springer

… To avoid KL failure, this paper proposes to use Wasserstein distance as the loss function of

… convolution embedded fuzzy clustering method with Wasserstein loss (DCEFC). Through …

 Related articles All 3 versions

 

2022 thesis

Parallel translations, Newton flows and Q-Wiener processes on the W...
by Ding, Hao
2022
- We will extend Lott's definition of the Levi-Civita connection to the Wasserstein space of probability measures having density and divergence. A...
Dissertation/Thesis  Full Text Online

[PDF] archives-ouvertes.fr

Parallel translations, Newton flows and Q-Wiener processes on the Wasserstein space

H Ding - 2022 - tel.archives-ouvertes.fr

… We introduce Newton flows on the Wasserstein space and … Levi-Civita connection to the

Wasserstein space of probability … stochastic calculus on the Wasserstein space throughout three …

All 4 versions 


[PDF] openreview.net

Meta-Learning without Data via Wasserstein Distributionally-Robust Model Fusion

Z Wang, X Wang, L Shen, Q Suo, K Song… - The 38th Conference …, 2022 - openreview.net

… in various ways, including KL-divergence, Wasserstein ball, etc. DRO has been applied to

many … This paper adopts the Wasserstein ball to characterize the task embedding uncertainty …

Cited by 6 Related articles All 3 versions 

<–—2022———2022———950— 


 

[PDF] archives-ouvertes.fr

Finite elements for Wasserstein  gradient flows

C Cancès, D Matthes, F Nabet, EM Rott - 2022 - hal.archives-ouvertes.fr

Convergence of a finite element discretization of a degenerate parabolic equation of $q$-Laplace

type with an additional external potential is considered. The main novelty of our …

 All 19 versions 

[PDF] arxiv.org

 e formulate the CS problem with Wasserstein distance …

Related articles All 2 versions 


DISSERTATION

Data-driven approximation of transfer operators: DMD, Perron–Frobenius, and statistical learning in Wasserstein space

Karimi, Amirhossein ; 2022

OPEN ACCESS

Data-driven approximation of transfer operators: DMD, Perron–Frobenius, and statistical learning in Wasserstein space

Online Access Available 

[PDF] escholarship.org

Data-driven approximation of transfer operators: DMD, Perron–Frobenius, and statistical learning in Wasserstein space

A Karimi - 2022 - escholarship.org

The Perron–Frobenius and Koopman operators provide natural dual settings to investigate

the dynamics of complex systems. In this thesis we focus on certain pertinent concepts and …

All 2 versions


2022 see 2021  [PDF] arxiv.org

Right mean for the  Bures-Wasserstein quantum divergence

M Jeong, J Hwang, S Kim - arXiv preprint arXiv:2201.03732, 2022 - arxiv.org

… of α − z Bures-Wasserstein quantum divergences to each points… Moreover, we verify the trace

inequality with the Wasserstein … trace inequality with the Wasserstein mean and bounds for …

 Related articles All 2 versions 


[PDF] imstat.org

Estimation of Wasserstein distances in the Spiked Transport Model

J Niles-Weed, P Rigollet - Bernoulli, 2022 - projecteuclid.org

… and subgaussian concentration properties of the Wasserstein distance. In Section 6 we

propose and analyze an estimator for the Wasserstein distance under the spiked transport model…

99 References  Related records

Cited by 1 All 4 versions

MR4474558 

2022

From Dirichlet Forms to Wasserstein Geometry HCM Conference

https://www.hcm.uni-bonn.de › eventpages › 2022 › fr...

2022 · HCM Conference: From Dirichlet Forms to Wasserstein Geometry ... Zoom-Link for online participation (Meeting-ID: 695 4765 1056, Code: 090535).


Venue: Wegelerstr. 10, Bonn

New Trends in Dirichlet Forms and Optimal Transport - HCM

https://www.hcm.uni-bonn.de › eventpages › 2022 › ne...

HCM Conference: From Dirichlet Forms to Wasserstein Geometry · Participants · Schedule ... Dates: August 29 - September 2, 2022. Venue: Wegelerstr. 10, Bonn.

 2022 MIT B.S. thesis

Non-parametric threshold for smoothed empirical Wasserstein distance

https://dspace.mit.edu › bitstream › handle › Jia-zyji...

https://dspace.mit.edu › bitstream › handle › Jia-zyji...PDF

by Z Jia · 2022 — c Massachusetts Institute of Technology 2022. ... proper way, which benefits my not only in the write-up of this thesis, and also among.


ARTICLE

$L_1$-distortion of Wasserstein metrics: a tale of two dimensions

Baudier, Florent P ; Gartland, Chris ; Schlumprecht, Thomas; arXiv.org, 2022

OPEN ACCESS

$L_1$-distortion of Wasserstein metrics: a tale of two dimensions

Available Online 

$L_1$-distortion of Wasserstein metrics: a tale of two dimensions
by Baudier, Florent P; Gartland, Chris; Schlumprecht, Thomas
08/2022
By discretizing an argument of Kislyakov, Naor and Schechtman proved that the 1-Wasserstein metric over the planar grid $\{0,1,\dots n\}^2$ has...
Journal Article  Full Text Online
Open Access  arXiv

arXiv:2208.13879  [pdfother]  math.MG  math.FA
$L_1$-distortion of Wasserstein metrics: a tale of two dimensions
Authors: Florent P. Baudier; Chris Gartland; Thomas Schlumprecht
Abstract: By discretizing an argument of Kislyakov, Naor and Schechtman proved that the 1-Wasserstein metric over the planar grid $\{0,1,\dots,n\}^2$ has $L_1$-distortion bounded below by a constant multiple of $\sqrt{\log n}$. We provide a new "dimensionality" interpretation of Kislyakov's argument, showing that, if $\{G_n\}$ is a sequence of graphs whose isoperimetric dimension and Lipschitz-…
More
Submitted 29 August, 2022; originally announced August 2022.
Comments: 35 pages
MSC Class: 46B85; 68R12; 46B20; 51F30; 05C63; 46B99

arXiv:2208.12145  [pdfother]  cs.LG  \math.PR
A deep learning framework for geodesics under spherical Wasserstein-Fisher-Rao metric and its application for weighted sample generation
Authors: Yang Jing; Jiaheng Chen; Lei Li; Jianfeng Lu
Abstract: Wasserstein-Fisher-Rao (WFR) distance is a family of metrics to gauge the discrepancy of two Radon measures, which takes into account both transportation and weight change. Spherical WFR distance is a projected version of WFR distance for probability measures so that the space of Radon measures equipped with WFR can be viewed as metric cone over the space of probability measures with spherical WFR… 
More
Submitted 25 August, 2022; originally announced August 2022.

arXiv:2208.11726  [pdfother]  cs.LG
Wasserstein Task Embedding for Measuring Task Similarities
Authors: Xinran Liu; Yikun Bai; Yuzhe Lu; Andrea Soltoggio; Soheil Kolouri
Abstract: Measuring similarities between different tasks is critical in a broad spectrum of machine learning problems, including transfer, multi-task, continual, and meta-learning. Most current approaches to measuring task similarities are architecture-dependent: 1) relying on pre-trained models, or 2) training networks on tasks and using forward transfer as a proxy for task similarity. In this paper, we le… 
More
Submitted 24 August, 2022; originally announced August 2022.

All 2 versions 

<–—2022———2022———960— 


ARTICLE

Lipschitz continuity of the Wasserstein projections in the convex order on the line

Jourdain, Benjamin ; Margheriti, William ; Pammer, Gudmund; arXiv.org, 2022

OPEN ACCESS

Lipschitz continuity of the Wasserstein projections in the convex order on the line

Available Online 

arXiv:2208.10635  [pdfpsother]  math.PR
Lipschitz continuity of the Wasserstein projections in the convex order on the line
Authors: Benjamin Jourdain; William Margheriti; Gudmund Pammer
Abstract: Wasserstein projections in the convex order were first considered in the framework of weak optimal transport, and found application in various problems such as concentration inequalities and martingale optimal transport. In dimension one, it is well-known that the set of probability measures with a given mean is a lattice w.r.t. the convex order. Our main result is that, contrary to the minimum an…  More
Submitted 22 August, 2022; originally announced August 2022.


Topic Embedded Representation Enhanced Variational Wasserstein Autoencoder for Text Modeling

Z Xiang, X Liu, G Yang, Y Liu - 2022 IEEE 5th International …, 2022 - ieeexplore.ieee.org

Variational Autoencoder (VAE) is now popular in text modeling and language generation tasks, 

which need to pay attention to the diversity of generation results. The existing models are …

Related articles

[PDF] arxiv.org

Quasi α-Firmly Nonexpansive Mappings in Wasserstein Spaces

A Bërdëllima, G Steidl - arXiv preprint arXiv:2203.04851, 2022 - arxiv.org

… α-firmly nonexpansive mappings in Wasserstein-2 spaces over Rd and to analyze properties 

of these mappings. We prove that for quasi α-firmly … point algorithm in Wasserstein spaces. …

 Related articles All 2 versions


[PDF] arxiv.org Operations Research

Data-driven chance constrained programs over Wasserstein balls

Z Chen, D Kuhn, W Wiesemann - Operations Research, 2022 - pubsonline.informs.org

… of the selected ground metric for the Wasserstein ball, which opens up possibilities to 

incorporate other cost functions in our definition of the Wasserstein distance. Since the initial …

Cited by 106 Related articles All 7 versions


[PDF] arxiv.org cited by many

 

2022



Wasserstein Embedding for Capsule Learning

by Shamsolmoali, Pourya; Zareapoor, Masoumeh; Das, Swagatam ; More...
09/2022
Capsule networks (CapsNets) aim to parse images into a hierarchical component structure that consists of objects, parts, and their relations. Despite their...
Journal Article  Full Text Online
Open Access  arXiv
 
Fair learning with Wasserstein barycenters...
by Gaucher, Solenne; Schreuder, Nicolas; Chzhen, Evgenii
09/2022
This work provides several fundamental characterizations of the optimal classification function under the demographic parity constraint. In the awareness...
Journal Article  Full Text Online
Open Access

$L_1$-distortion of Wasserstein metrics: a tale...
by Baudier, Florent P; Gartland, Chris; Schlumprecht, Thomas
08/2022
By discretizing an argument of Kislyakov, Naor and Schechtman proved that the 1-Wasserstein metric over the planar grid $\{0,1,\dots n\}^2$ has...
Journal Article  Full Text Online

Open Access

Wasserstein Announces Its Official Made for Fitbit...
PR newswire, Aug 29, 2022
Newspaper Article 


Fair learning with Wasserstein barycenters for non-decomposable performance...
by Gaucher, Solenne; Schreuder, Nicolas; Chzhen, Evgenii
09/2022
This work provides several fundamental characterizations of the optimal classification function under the demographic parity constraint. In the awareness...
Journal Article  Full Text Online
Open Access  arXiv

<–—2022———2022———970—


ARTICLE

Online Stochastic Optimization with Wasserstein Based Non-stationarity

Jiang, Jiashuo ; Li, Xiaocheng ; Zhang, Jiawei; arXiv.org, 2022

OPEN ACCESS

Online Stochastic Optimization with Wasserstein Based Non-stationarity


A deep learning framework for geodesics under spherical Wasserstein-Fisher-Rao...
by Jing, Yang; Chen, Jiaheng; Li, Lei ; More...
08/2022
Wasserstein-Fisher-Rao (WFR) distance is a family of metrics to gauge the discrepancy of two Radon measures, which takes into account both transportation and...
Journal Article  Full Text Online  arXiv
All 3 versions
 


Wasserstein Announces Its Official Made for Fitbit Product Line for the New Fitbit...
PR newswire, Aug 29, 2022
Newspaper Article  Full Text Online


ARTICLE

Density of subalgebras of Lipschitz functions in metric Sobolev spaces and applications to Wasserstein Sobolev spaces

Fornasier, Massimo ; Savaré, Giuseppe ; Sodini, Giacomo Enrico; arXiv.org, 2022

OPEN ACCESS

Density of subalgebras of Lipschitz functions in metric Sobolev spaces and applications to Wasserstein Sobolev spaces

Available Online 
arXiv:2209.00974  [pdfpsother]  math.FA  math.MG
Density of subalgebras of Lipschitz functions in metric Sobolev spaces and applications to Wasserstein Sobolev spaces
Authors: Massimo Fornasier; Giuseppe Savaré; Giacomo Enrico Sodini
Abstract: We prove a general criterion for the density in energy of suitable subalgebras of Lipschitz functions in the metric-Sobolev space $H^{1,p}(X,d,m)$ associated with a positive and finite Borel measure $m$ in a separable and complete metric space $(X,d)$. We then provide a relevant application to the case of the algebra of cylinder functions in the Wasserstein…
More
Submitted 2 September, 2022; originally announced September 2022.
Comments: 51 pages
MSC Class: 46E36 31C25 49Q20 28A33 35F21 58J65

arXiv:2209.00923  [pdfpsother]  math.PR  math.ST
Convergence of the empirical measure in expected Wasserstein distance: non asymptotic explicit bounds in $\mathbb{R}^d$
Authors: Nicolas Fournier
Abstract: We provide some non asymptotic bounds, with explicit constants, that measure the rate of convergence, in expected Wasserstein distance, of the empirical measure associated to an i.i.d. $N$-sample of a given probability distribution on $\mathbb{R}^d$. We consider the cases where $\mathbb{R}^d$ is endowed with the maximum and Euclidean norms.
Submitted 2 September, 2022; originally announced September 2022.
MSC Class: 60F25; 65C05
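
The expected Wasserstein distance in this entry is built on the usual definition on $\mathbb{R}^d$ (stated here for reference):

$W_p^p(\mu,\nu) = \inf_{\pi\in\Pi(\mu,\nu)} \int_{\mathbb{R}^d\times\mathbb{R}^d} \|x-y\|^p\, d\pi(x,y)$,

with $\Pi(\mu,\nu)$ the set of couplings of $\mu$ and $\nu$, applied here to the empirical measure $\hat\mu_N = \frac{1}{N}\sum_{i=1}^N \delta_{X_i}$ of the i.i.d. sample.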


2022

LIFEWATCH: Lifelong Wasserstein Change Point Detection
Authors: Kamil Faber; Roberto Corizzo; Bartlomiej Sniezynski; Michael Baron; Nathalie Japkowicz
2022 International Joint Conference on Neural Networks (IJCNN)
Summary: Change point detection methods offer a crucial capability in modern data analysis tasks characterized by evolving time series data in the form of data streams. Recent interest in lifelong learning has shown the importance of acquiring knowledge and identifying newly occurring tasks in a continually evolving environment. Although this setting could benefit from a timely identification of changes, existing change point detection methods are unable to recognize recurring tasks, which is a necessary condition in lifelong learning. In this paper, we attempt to fill this gap by proposing LIFEWATCH, a novel Wasserstein-based change point detection approach with memory, capable of modeling multiple data distributions in a fully unsupervised manner. Our method not only detects changes but also discriminates between changes characterized by the appearance of a new task and changes that rather describe a recurring or previously seen task. An extensive experimental evaluation involving a large number of benchmark datasets shows that LIFEWATCH outperforms state-of-the-art methods for change detection while exploiting the characterization of detected changes to correctly identify tasks occurring in complex scenarios characterized by recurrence in lifelong consolidation settings.
Chapter, 2022
Publication:2022 International Joint Conference on Neural Networks (IJCNN), 20220718, 1
Publisher:2022
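
A toy sketch of the general idea behind Wasserstein-based change point detection (this is not the LIFEWATCH algorithm; the window size, threshold and simulated stream are arbitrary assumptions): flag a change when the 1-D Wasserstein distance between two adjacent sliding windows exceeds a threshold.

import numpy as np
from scipy.stats import wasserstein_distance

def detect_changes(stream, window=100, threshold=0.5):
    # Compare the empirical distributions of the windows just before and just after t.
    change_points = []
    for t in range(window, len(stream) - window):
        if wasserstein_distance(stream[t - window:t], stream[t:t + window]) > threshold:
            change_points.append(t)
    return change_points

rng = np.random.default_rng(0)
stream = np.concatenate([rng.normal(0, 1, 500), rng.normal(3, 1, 500)])
print(detect_changes(stream)[:5])   # detections appear near the simulated shift at t = 500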



Auto-weighted Sequential Wasserstein Distance and Application to Sequence Matching
Authors: Mitsuhiko Horie; Hiroyuki Kasai
2022 30th European Signal Processing Conference (EUSIPCO)
Summary: Sequence matching problems have been central to the field of data analysis for decades. Such problems arise in widely diverse areas including computer vision, speech processing, bioinformatics, and natural language processing. However, solving such problems efficiently is difficult because one must consider temporal consistency, neighborhood structure similarity, robustness to noise and outliers, and flexibility on start-end matching points. This paper proposes a shape-aware Wasserstein distance between sequences built upon the optimal transport (OT) framework. The proposed distance considers similarity measures of the elements, their neighborhood structures, and their temporal positions. We incorporate these similarity measures into three ground cost matrices of the OT formulation. The noteworthy contribution is that we formulate these measures as independent OT distances with a single shared optimal transport matrix, and adjust their weights automatically according to their effects on the total OT distance. Numerical evaluations suggest that the sequence matching method using our proposed Wasserstein distance robustly outperforms state-of-the-art methods across different real-world datasets.
Chapter, 2022
Publication:2022 30th European Signal Processing Conference (EUSIPCO), 20220829, 1472
Publisher:2022
Peer-reviewed
On the 2-Wasserstein distance for self-similar measures on the unit interval
Authors: Easton Brawley; Mason Doyle; Robert Niedzialomski
Article, 2022
Publication:Mathematische Nachrichten, 295, March 2022, 468
Publisher:2022
Zbl 1529.28005
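For context, the distance studied in this entry admits, in one dimension, the classical quantile representation W2(μ, ν)^2 = ∫_0^1 |F_μ^{-1}(t) − F_ν^{-1}(t)|^2 dt. The sketch below evaluates that general formula from samples; it is not the paper's closed-form analysis for self-similar measures, and the helper name and grid size are arbitrary.

```python
# Numerical 2-Wasserstein distance between two measures on [0, 1],
# via discretised quantile functions.
import numpy as np

def w2_from_samples(x, y, grid_size=10_000):
    t = (np.arange(grid_size) + 0.5) / grid_size
    return np.sqrt(np.mean((np.quantile(x, t) - np.quantile(y, t)) ** 2))

rng = np.random.default_rng(2)
x = rng.uniform(0, 1, 50_000)   # uniform measure on the unit interval
y = rng.beta(2, 2, 50_000)      # another measure on the unit interval
print(round(w2_from_samples(x, y), 4))
```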


[PDF] arxiv.org

Quantum Wasserstein distance of order 1 between channels

R Duvenhage, M Mapaya - arXiv preprint arXiv:2210.03483, 2022 - arxiv.org

… The paper then proceeds to the behaviour of the Wasserstein distance of … the Wasserstein 

distance is additive over tensor products of channels between such subsystems, with stability

 Cited by 1 All 2 versions

Full Attention Wasserstein GAN With Gradient Normalization for Fault Diagnosis Under Imbalanced Data

J Fan, X Yuan, Z Miao, Z Sun, X Mei… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org

Wasserstein GAN (WGAN) method can make the optimizing process more stable. Because 

the discriminator in a WGAN … improved WGAN variant, named full attention Wasserstein GAN …


Detecting Incipient Fault Using Wasserstein Distance

C Lu, J Zeng, S Luo, U Kruger - 2022 IEEE 11th Data Driven …, 2022 - ieeexplore.ieee.org

This article develops a novel process monitoring method based on the Wasserstein distance 

for incipient fault detection. The core idea is to measure the difference between the normal …

<–—2022———2022———980—


 

Fault Feature Recovery With Wasserstein Generative Adversarial Imputation Network With Gradient Penalty for Rotating Machine Health Monitoring Under Signal …

W Hu, T Wang, F Chu - IEEE Transactions on Instrumentation …, 2022 - ieeexplore.ieee.org

… In this study, a fault feature recovery strategy called the Wasserstein generative adversarial 

… Finally, the introduction of the Wasserstein distance loss function and the gradient penalty …

Cited by 2 Related articles


A Novel Physical Layer Key Generation Method Based on WGAN-GP Adversarial Autoencoder

J Han, Y Zhou, G Liu, T Liu… - 2022 4th International …, 2022 - ieeexplore.ieee.org

… The WGAN-GP Adversarial Autoencoder’s Structure In this paper, the WGAN-GP adversarial 

… To further optimize the network structure, the Wasserstein distance and gradient penalty (GP…


The Wasserstein Distance Using QAOA: A Quantum Augmented Approach to Topological Data Analysis

M Saravanan, M Gopikrishnan - 2022 International Conference …, 2022 - ieeexplore.ieee.org

This paper examines the implementation of Topological Data Analysis methods based on 

Persistent Homology to meet the requirements of the telecommunication industry. Persistent …

Related articles


[PDF] inria.fr

A Convolutional Wasserstein Distance for Tractography Evaluation: Complementarity Study to State-of-the-Art Measures

T Durantel, J Coloigner… - 2022 IEEE 19th …, 2022 - ieeexplore.ieee.org

… on the computation of the Wasserstein distance, derived from op… The 2-Wasserstein distance, 

simply called Wasserstein dis… in development, our new Wasserstein measure can be used …

Related articles All 9 versions


Energy-constrained Crystals Wasserstein GAN for the inverse design of crystal structures

P Hu, B Ge, Y Liu, W Huang - Proceedings of the 8th International …, 2022 - dl.acm.org

… In this work, we develop a WGAN-gp-based inverse design framework, energy-constrained 

crystals Wasserstein GAN (ECCWGAN), to generate crystal structures with target properties (…

 

2022


 2022 see 2021  [PDF] ams.org

Obstructions to extension of Wasserstein distances for variable masses

L Lombardini, F Rossi - Proceedings of the American Mathematical Society, 2022 - ams.org

We study the possibility of defining a distance on the whole space of measures, with the 

property that the distance between two measures having the same mass is the Wasserstein

 Related articles All 3 versions

Zbl 07594318


arXiv:2209.03318  [pdf, other]  stat.ME  stat.CO
On the Wasserstein median of probability measures
Authors: Kisung You; Dennis Shung
Abstract: Measures of central tendency such as the mean and the median are a primary way to summarize a given collection of random objects. In the field of optimal transport, the Wasserstein barycenter corresponds to the Fréchet or geometric mean of a set of probability measures, which is defined as a minimizer of the sum of its squared distances to each element of the set when the order is 2. We present th…  More
Submitted 7 September, 2022; originally announced September 2022.
Comments: 25 pages, 9 figures
MSC Class: 49Q22

All 2 versions 
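A one-dimensional illustration of the median-versus-barycenter distinction this entry is about (the paper treats general measures; restricting to the real line is an assumption made here for simplicity). On the line, W1 equals the L^1 distance between quantile functions, so the pointwise median of the quantile functions minimises the sum of W1 distances, while the pointwise mean is the order-2 Wasserstein barycenter.

```python
import numpy as np

rng = np.random.default_rng(3)
samples = [rng.normal(loc=m, scale=1.0, size=20_000) for m in (0.0, 1.0, 5.0)]

t = (np.arange(2_000) + 0.5) / 2_000
quantiles = np.stack([np.quantile(s, t) for s in samples])

barycenter_q = quantiles.mean(axis=0)    # Frechet mean under W2 (quantile average)
median_q = np.median(quantiles, axis=0)  # minimiser of the sum of W1 distances

print(barycenter_q[1000], median_q[1000])  # central quantile of each summary
```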

arXiv:2209.03243  [pdf, ps, other]  math.PR
Adapted Wasserstein distance between the laws of SDEs
Authors: Julio Backhoff-Veraguas; Sigrid Källblad; Benjamin A. Robinson
Abstract: We study an adapted optimal transport problem between the laws of Markovian stochastic differential equations (SDE) and establish the optimality of the synchronous coupling between these laws. The proof of this result is based on time-discretisation and reveals an interesting connection between the synchronous coupling and the celebrated discrete-time Knothe–Rosenblatt rearrangemen…
More
Submitted 7 September, 2022; originally announced September 2022.
Comments: 29 pages, 1 figure
MSC Class: 60H10; 49Q22 (Primary) 60H35 (Secondary)
ARTICLE

Adapted Wasserstein distance between the laws of SDEs

Backhoff-Veraguas, Julio ; Källblad, Sigrid ; Robinson, Benjamin A. arXiv.org, 2022

OPEN ACCESS

Adapted Wasserstein distance between the laws of SDEs

Available Online 

All 3 versions

ARTICLE

A Data-dependent Approach for High Dimensional (Robust) Wasserstein Alignment

Hu, Ding ; Liu, Wenjie ; Ye, Mingquan. arXiv.org, 2022

OPEN ACCESS

A Data-dependent Approach for High Dimensional (Robust) Wasserstein Alignment

Available Online 

arXiv:2209.02905  [pdfother]  cs.CV  cs.LG
A Data-dependent Approach for High Dimensional (Robust) Wasserstein Alignment
Authors: Hu DingWenjie LiuMingquan Ye
Abstract: Many real-world problems can be formulated as the alignment between two geometric patterns. Previously, a great amount of research focus on the alignment of 2D or 3D patterns in the field of computer vision. Recently, the alignment problem in high dimensions finds several novel applications in practice. However, the research is still rather limited in the algorithmic aspect. To the best of our kno… 
More
Submitted 6 September, 2022; originally announced September 2022.
Comments: arXiv admin note: substantial text overlap with arXiv:1811.07455
All 2 versions 


ARTICLE

Entropy-regularized Wasserstein distributionally robust shape and topology optimization

Dapogny, Charles ; Iutzeler, Franck ; Meda, Andrea ; Thibert, Boris. arXiv.org, 2022

OPEN ACCESS

Entropy-regularized Wasserstein distributionally robust shape and topology optimization

Available Online 


arXiv:2209.01500  [pdfother]  math.OC   math.NA
Entropy-regularized Wasserstein distributionally robust shape and topology optimization
Authors: Charles DapognyFranck IutzelerAndrea MedaBoris Thibert
Abstract: This brief note aims to introduce the recent paradigm of distributional robustness in the field of shape and topology optimization. Acknowledging that the probability law of uncertain physical data is rarely known beyond a rough approximation constructed from observed samples, we optimize the worst-case value of the expected cost of a design when the probability law of the uncertainty is "close" t…  More
Submitted 3 September, 2022; originally announced September 2022.

<–—2022———2022———990—


[HTML] opticsjournal.net

[HTML] 结合双通道 WGAN-GP 的多角度人脸表情识别算法研究

邓源, 施一萍, 刘婕, 江悦莹, 朱亚梅… - Laser & Optoelectronics …, 2022 - opticsjournal.net

[Translated from Chinese] To address the poor performance of traditional algorithms on multi-angle facial expression recognition and the low quality of the frontalized face images generated under deflection angles, a multi-angle facial expression recognition algorithm combining a dual-channel WGAN-GP is proposed. Traditional models only use profile-face features for multi…

All 4 versions 

[Chinese  ] opticsjournal.net

[Research on multi-angle facial expression recognition algorithm combined with dual-channel WGAN-GP]

Cited by 2

Improved Training of Wasserstein GANs | Request PDF

https://www.researchgate.net › ... › Training


Jul 5, 2022 — Generative Adversarial Networks (GANs) are powerful generative models, but suffer from training instability.

[CITATION] Improved training of wasserstein GANs. 2017

I Gulrajani, F Ahmed, M Arjovsky, V Dumoulin… - URL http://arxiv.org/abs …, 2022

  Cited by 2
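Because the WGAN-GP gradient penalty of Gulrajani et al. recurs in many of the entries collected above, a short PyTorch-style sketch is included here for reference; `critic` is a placeholder module and the batch is assumed to consist of flattened feature vectors.

```python
# Penalise the critic when its gradient norm at points interpolated between
# real and generated samples deviates from 1 (lambda_gp = 10 is the value
# usually quoted with this method).
import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    eps = torch.rand(real.size(0), 1, device=real.device)      # one weight per sample
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads, = torch.autograd.grad(outputs=scores.sum(), inputs=interp, create_graph=True)
    return lambda_gp * ((grads.norm(2, dim=1) - 1.0) ** 2).mean()
```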


MR4469075 Prelim Santos-Rodríguez, Jaime; On isometries of compact L^p-Wasserstein spaces. Adv. Math. 409 (2022), Paper No. 108632. 53C23 (53C21)

Review PDF Clipboard Journal Article


 Measuring association with Wasserstein distances

Wiesel, JCW

Nov 2022 | 

BERNOULLI

 28 (4) , pp.2816-2832

Let π ∈ Π(μ, ν) be a coupling between two probability measures μ and ν on a Polish space. In this article we propose and study a class of nonparametric measures of association between μ and ν, which we call Wasserstein correlation coefficients. These coefficients are based on the Wasserstein distance between ν and the disintegration π_{x_1} of π with respect to the first coordinate. We also es…

Show more

Free Submitted Article From RepositoryFull Text at Publishermore_horiz

40 References  Related records
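A crude, binning-based illustration of the idea behind Wasserstein correlation coefficients (this is not Wiesel's estimator; the function name, number of bins and use of W1 are assumptions): average, over bins of X, the distance between the conditional law of Y given the bin and the marginal law of Y.

```python
import numpy as np
from scipy.stats import wasserstein_distance

def wasserstein_association(x, y, n_bins=10):
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1))[1:-1]
    idx = np.digitize(x, edges)
    dists = [wasserstein_distance(y[idx == b], y)
             for b in range(n_bins) if np.any(idx == b)]
    return float(np.mean(dists))

rng = np.random.default_rng(4)
x = rng.normal(size=5_000)
print(wasserstein_association(x, 2 * x + rng.normal(size=5_000)))  # dependent: larger
print(wasserstein_association(x, rng.normal(size=5_000)))          # independent: near 0
```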


Wasserstein generative adversarial networks for modeling marked events

Dizaji, SHSPashazadeh, S and Niya, JM

Aug 2022 (Early Access) | 

JOURNAL OF SUPERCOMPUTING

Enriched Cited References
Marked temporal events are ubiquitous in several areas, where the events' times and marks (types) are usually interrelated. Point processes and their non-functional variations using recurrent neural networks (RNN) model temporal events using intensity functions. However, since they usually utilize the likelihood maximization approach, they might fail. Moreover, their high simulation complexi…

Full Text at Publishermore_horiz

49 References  Related records

2022

 

ARTICLE

Distributionally Robust Joint Chance-Constrained Programming with Wasserstein Metric

Gu, Yining ; Wang, Yanjun. arXiv.org, 2022

OPEN ACCESS

Distributionally Robust Joint Chance-Constrained Programming with Wasserstein Metric

Available Online 

Working Paper  Full Text

Distributionally Robust Joint Chance-Constrained Programming with Wasserstein Metric

Gu, Yining; Wang, Yanjun.

arXiv.org; Ithaca, Sep 5, 2022.

 Abstract/DetailsGet full text
Link to external site, this link will open in a new window

ARTICLE

Fair learning with Wasserstein barycenters for non-decomposable performance measures

Solenne Gaucher ; Nicolas Schreuder ; Evgenii Chzhen. arXiv.org, 2022

OPEN ACCESS

Fair learning with Wasserstein barycenters for non-decomposable performance measures

Available Online 
  Working Paper  Full Text

Fair learning with Wasserstein barycenters for non-decomposable performance measures

Gaucher, Solenne; Schreuder, Nicolas; Chzhen, Evgenii.

arXiv.org; Ithaca, Sep 1, 2022.

 Abstract/DetailsGet full text
Link to external site, this link will open in a new window

Cite Cited by 1 All 4 versions

Working Paper  Full Text

Wasserstein Embedding for Capsule Learning

Shamsolmoali, Pourya; Zareapoor, Masoumeh; Das, Swagatam; Granger, Eric; Garcia, Salvador.

arXiv.org; Ithaca, Sep 1, 2022.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

Wire Feed  Full Text

Wasserstein Announces Its Official Made for Fitbit Product Line for the New Fitbit Versa 4 and Sense 2

Newswire; New York [New York]. 29 Aug 2022. 

DetailsFull text

NEWSPAPER ARTICLE

Wasserstein Announces Its Official Made for Fitbit Product Line for the New Fitbit Versa 4 and Sense 2

Plus Company Updates, 2022

Wasserstein Announces Its Official Made for Fitbit Product Line for the New Fitbit Versa 4 and Sense 2

No Online Access 

  
Newspaper  Full Text
Check out photos of the world's first hydrogen-powered passenger train that's up and running in Germany as Europe tries to wean itself off of Russian oil
Delouya, Samantha.

Business Insider, US edition; New York [New York]. 25 Aug 2022. 
DetailsFull text



Newspaper  Full Text
Even as the West tries to wean itself off Russian oil, Moscow has found itself yet another buyer: Myanmar

Tan, Huileng.

Business Insider, US edition; New York [New York]. 19 Aug 2022. 

DetailsFull text

<–—2022———2022———1000—



Working Paper  Full Text

Wasserstein Generative Adversarial Uncertainty Quantification in Physics-Informed Neural Networks
Gao, Yihang; Ng, Michael K. arXiv.org; Ithaca, Aug 9, 2022.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

 Get full textLink to external site, this link will open in a new window 

Cited by 2 Related articles All 4 versions

Research article
On isometries of compact Lp–Wasserstein spaces
Advances in Mathematics, 11 August 2022...

Jaime Santos-Rodríguez

 Zbl 07597093

Research article
Optimal visual tracking using Wasserstein transport proposals
Expert Systems with Applications, 30 July 2022...

Jin Hong; Junseok Kwon
Optimal visual tracking using Wasserstein transport proposals

Hong, Jin ; Kwon, Junseok. Expert systems with applications, 2022, Vol.209

Cited by 4 Related articles All 3 versions

2022 see 2021  ARTICLE

Wasserstein Patch Prior for Image Superresolution

Hertrich, Johannes ; Houdard, Antoine ; Redenbach, Claudia. IEEE transactions on computational imaging, 2022, Vol.8, p.693-704

OPEN ACCESS

 Download PDF 

Wasserstein Patch Prior for Image Superresolution

Available Online 

 View Issue Contents 

Cited by 6 Related articles All 5 versions


2022 see 2021  Research article
Universality of persistence diagrams and the bottleneck and Wasserstein distances

Computational Geometry April 2022...

Peter Bubenik; Alex Elchesen

Cited by 6 Related articles All 5 versions

2022


Research article
Wasserstein metric-based two-stage distributionally robust optimization model for optimal daily peak shaving dispatch of cascade hydroplants under renewable energy uncertainties
Energy, 13 August 2022...

Xiaoyu Jin; Benxi Liu; Jia Lu

57 References  Related records 


Health Indicator Construction Method of Bearings Based on Wasserstein Dual-Domain Adversarial Networks Under Normal Data Only

J Li, Y Zi, Y Wang, Y Yang - IEEE Transactions on Industrial …, 2022 - ieeexplore.ieee.org

… where fw(x) is the Wasserstein discriminator, ˆx is a uniform sampling from Xand G(z) , and 

λ … uses Wasserstein distance to quantify the network results, which means that Wasserstein

Cited by 2 Related articles

Research article
Optimizing decisions for a dual-channel retailer with service level requirements and demand uncertainties: A Wasserstein metric-based distributionally robust optimization approach
Computers & Operations Research, 22 October 2021...

Yue Sun; Ruozhen Qiu; Minghe Sun

Cited by 6 Related articles All 2 versions

Research article
Unbalanced network attack traffic detection based on feature extraction and GFDA-WGAN
Computer Networks, 17 August 2022...

Kehong Li; Wengang Ma; Ruiqi Liu

Unbalanced network attack traffic detection based on feature extraction and GFDA-WGAN
by Li, Kehong; Ma, Wengang; Duan, Huawei ; More...
Computer networks (Amsterdam, Netherlands : 1999), 10/2022, Volume 216
Detecting various types of attack traffic is critical to computer network security. The current detection methods require massive amounts of data to detect...
Article PDFPDF
Journal Article  Full Text Online
View in Context Browse Journal

Research articleFull text access
Lung image segmentation based on DRD U-Net and combined WGAN with Deep Neural Network
Computer Methods and Programs in Biomedicine, 30 August 2022...

Luoyu Lian; Xin Luo; Zhendong Xu

Download PDF

All 5 versions

<–—2022———2022———1010— 


A two-step approach to Wasserstein distributionally robust ...

https://arxiv.org › math


by A Maghami · 2022 — We consider a distributionally robust approach to solve the chance-constrained program. We assume that samples of the uncertainty are available.



 2022 patent

Wasserstein distance-based battery SOH estimation method and device

CN CN114839552A 林名强 泉州装备制造研究所

Priority 2022-04-08 • Filed 2022-04-08 • Published 2022-08-02

3. The wasserstein distance-based battery SOH estimation method according to claim 1, wherein: in S1, the aging data of the pouch batteries is specifically aging data of eight nominal 740Ma · h pouch batteries recorded in advance. 4. A wasserstein distance-based battery SOH estimation method …


 2022 patent

Application of evidence Wasserstein distance algorithm in component …

CN CN114818957A 肖富元 肖富元

Priority 2022-05-10 • Filed 2022-05-10 • Published 2022-07-29

1. The application of the evidence Wasserstein distance algorithm in component identification is characterized in that: the Wasserstein distance is EWD, and the EWD is verified by the following method: 1): let m1 and m2 be the quality function of the multi-intersection element set Θ, where γ i, j …


 2022 patent

CAN bus fuzzy test case generation method based on WGAN-GP and fuzzy test system

CN114936149A 黄柯霖 华中科技大学

Filed 2022-04-27 • Published 2022-08-23

the model generation module is used for building and training a WGAN-GP model based on a neural network through the training data set; the test case generation module is used for configuring a noise vector for the trained WGAN-GP model, so that the WGAN-GP model generates a plurality of virtual CAN …



2022 see 2021  Academic Journal

Source: International Journal of Systems Science

Database: Business Source Complete

By: Yuan, Yuefei, Song, Qiankun, Zhou, Bo,

Published: 15-07-2022

Fault Diagnosis of Rotating Machinery Based on Wasserstein Distance and Feature Selection.


2022


Academic Journal

Source: Journal of the Royal Statistical Society: Series B (Statistical Methodology)

Database: Business Source Complete

By: Chen, Yao, Gao, Qingyi, Wang, Xiao,

Published: 01-02-2022

A Wasserstein metric-based distributionally robust optimization approach for reliable-economic equilibrium operation of hydro-wind-solar energy systems.

Cited by 1 All 5 versions

arXiv:2209.04268  [pdf, ps, other]  math.MG  math.AP  math.FA
Absolutely continuous and BV-curves in 1-Wasserstein spaces
Authors: Ehsan Abedi; Zhenhao Li; Timo Schultz
Abstract: We extend the result of [Lisini, S. Calc. Var. 28, 85-120 (2007)] on the superposition principle for absolutely continuous curves in p-Wasserstein spaces to the special case of p=1. In contrast to the case of p>1, it is not always possible to have lifts on absolutely continuous curves. Therefore, one needs to relax the notion of a lift by considering curves of bounded variation, or shortly B…  More
Submitted 9 September, 2022; originally announced September 2022.
Comments: 37 pages, 3 figures
MSC Class: 49Q22; 49J27; 26A45
Cited by 1
 Related articles All 5 versions


DVGAN: Stabilize Wasserstein GAN training for time-domain...
by Dooney, Tom; Bromuri, Stefano; Curier, Lyana
09/2022
Simulating time-domain observations of gravitational wave (GW) detector environments will allow for a better understanding of GW sources, augment datasets for...
Journal Article  Full Text Online

2022 thesis
Computational Inversion with Wasserstein Distances and ...

https://academiccommons.columbia.edu › dhnq-j497


by W Ding · 2022 — This thesis presents a systematic computational investigation of loss functions in solving inverse problems of partial differential equations.


2022 thesis

Non-parametric threshold for smoothed empirical Wasserstein ...

https://dspace.mit.edu › bitstream › handle › Jia-zyji...PDF

by Z Jia · 2022 — c Massachusetts Institute of Technology 2022. ... proper way, which benefits my not only in the write-up of this thesis, and also among.

<–—2022———2022———1020—


ARTICLE

Modeling of Political Systems using Wasserstein Gradient Flows

Lanzetti, Nicolas ; Joudi Hajar ; Dörfler, Florian. arXiv.org, 2022

OPEN ACCESS

Modeling of Political Systems using Wasserstein Gradient Flows

Available Online 

arXiv:2209.05382  [pdf, ps, other]  eess.SY
Modeling of Political Systems using Wasserstein Gradient Flows
Authors: Nicolas Lanzetti; Joudi Hajar; Florian Dörfler
Abstract: The study of complex political phenomena such as parties' polarization calls for mathematical models of political systems. In this paper, we aim at modeling the time evolution of a political system whereby various parties selfishly interact to maximize their political success (e.g., number of votes). More specifically, we identify the ideology of a party as a probability distribution over a one-di… 
More
Submitted 12 September, 2022; originally announced September 2022.
Comments: Accepted for presentation at, and publication in the proceedings of, the 61st IEEE Conference on Decision and Control
All 2 versions 


arXiv:2209.04991  [pdf, other]  stat.ME  stat.ML

Wasserstein Distributional Learning
Authors: Chengliang Tang; Nathan Lenssen; Ying Wei; Tian Zheng

Learning conditional densities and identifying factors that influence the entire distribution are vital tasks in data-driven applications. Conventional approaches work mostly with summary statistics, and are hence inadequate for a comprehensive investigation. Recently, there have been developments on functional regression methods to model density curves as functional outcomes. A major challenge for developing such models lies in the inherent constraint of non-negativity and unit integral for the functional space of density outcomes. To overcome this fundamental issue, we propose Wasserstein Distributional Learning (WDL), a flexible density-on-scalar regression modeling framework that starts with the Wasserstein distance W_2 as a proper metric for the space of density outcomes. We then introduce a heterogeneous and flexible class of Semi-parametric Conditional Gaussian Mixture Models (SCGMM) as the model class 𝔉⊗. The resulting metric space (𝔉⊗, W_2) satisfies the required constraints and offers a dense and closed functional subspace. For fitting the proposed model, we further develop an efficient algorithm based on Majorization-Minimization optimization with boosted trees. Compared with methods in the previous literature, WDL better characterizes and uncovers the nonlinear dependence of the conditional densities, and their derived summary statistics. We demonstrate the effectiveness of the WDL framework through simulations and real-world applications.


2022 see 2021

Erbar, MatthiasForkert, DominikMaas, JanMugnolo, Delio

Gradient flow formulation of diffusion equations in the Wasserstein space over a metric graph. (English) Zbl 07579703

Netw. Heterog. Media 17, No. 5, 687-717 (2022).

MSC:  35R02 49Q22 60B05

PDF BibTeX XML Cite

Full Text: DOI 

 

Ernst, Oliver G.Pichler, AloisSprungk, Björn

Wasserstein sensitivity of risk and uncertainty propagation. (English) Zbl 07579692

SIAM/ASA J. Uncertain. Quantif. 10, 915-948 (2022).

MSC:  91G70 35R60 60G15 60G60 62P35

PDF BibTeX XML Cite

Full Text: DOI 

2022 5/12

Wasserstein Sensitivity of Risk and Uncertainty Propagation

May 12, 2022 ... in both total variation and Wasserstein distance. ... respect to the Wasserstein distance of perturbed input distributions.

YouTube · Erwin Schrödinger International Institute for

May 12, 2022

   2022 5/9

Wasserstein Sensitivity of Risk and Uncertainty Propagation

This talk was part of the Workshop on "Approximation of high-dimensional parametric PDEs in forward UQ" held at the ESI 

YouTube · Erwin Schrödinger International Institute for Mathem

May 9 to 13, 2022.


2
Paper  Full Text Online
Unadjusted Langevin algorithm with multiplicative noise: Total variation and Wasserstein...
by Gilles Pages; Fabien Panloup
arXiv.org, 09/2022
In this paper, we focus on non-asymptotic bounds related to the Euler scheme of an ergodic diffusion with a possibly multiplicative diffusion term...

2022

[PDF] eurasip.org

[PDF] Auto-weighted Sequential Wasserstein Distance and Application to Sequence Matching

M Horie, H Kasai - eurasip.org

… This paper presents a proposal of a shapeaware Wasserstein distance between sequences

… matching method using our proposed Wasserstein distance robustly outperforms stateof-the-…


arXiv:2209.07139  [pdf, other]  cs.CL  doi:10.1162/coli_a_00440
The Impact of Edge Displacement Vaserstein Distance on UD Parsing Performance
Authors: Mark Anderson; Carlos Gómez-Rodríguez
Abstract: We contribute to the discussion on parsing performance in NLP by introducing a measurement that evaluates the differences between the distributions of edge displacement (the directed distance of edges) seen in training and test data. We hypothesize that this measurement will be related to differences observed in parsing performance across treebanks. We motivate this by building upon previous work…  More
Submitted 15 September, 2022; originally announced September 2022.
Comments: This is the final peer-reviewed manuscript accepted for publication in Computational Linguistics. The journal version with the final editorial and typesetting changes is available open-access at https://doi.org/10.1162/coli_a_00440
MSC Class: 68T50 ACM Class: I.2.7
Journal ref: Computational Linguistics, 48(3):517-554, 2022

arXiv:2209.07058  [pdf, ps, other]  math.ST  math.FA  math.PR
Structure preservation via the Wasserstein distance
Authors: Daniel Bartl; Shahar Mendelson
Abstract: We show that under minimal assumptions on a random vector X ∈ R^d and with high probability, given m independent copies of X, the coordinate distribution of each vector (⟨X_i, θ⟩)_{i=1}^m is dictated by the distribution of the true marginal ⟨X, θ⟩. Formally, we show that with high probability, \[\sup_{θ\in S^{d-1}} \left( \frac{1}{m}\sum_{i=1}^m \left|\l…  More
Submitted 15 September, 2022; originally announced September 2022.

arXiv:2209.07007  [pdf, other]  cs.LG  cs.CV
Gromov-Wasserstein Autoencoders
Authors: Nao Nakagawa; Ren Togo; Takahiro Ogawa; Miki Haseyama
Abstract: Learning concise data representations without supervisory signals is a fundamental challenge in machine learning. A prominent approach to this goal is likelihood-based models such as variational autoencoders (VAE) to learn latent representations based on a meta-prior, which is a general premise assumed beneficial for downstream tasks (e.g., disentanglement). However, such approaches often deviate…  More
Submitted 14 September, 2022; originally announced September 2022.
Comments: 34 pages, 11 figures

2022 see 2021

arXiv:2209.06975  [pdf, other]  stat.ML  cs.LG
Wasserstein K-means for clustering probability distributions
Authors: Yubo Zhuang; Xiaohui Chen; Yun Yang
Abstract: Clustering is an important exploratory data analysis technique to group objects based on their similarity. The widely used K-means clustering method relies on some notion of distance to partition data into a fewer number of groups. In the Euclidean space, centroid-based and distance-based formulations of the K-means are equivalent. In modern machine learning applications, data often arise as p…  More
Submitted 14 September, 2022; originally announced September 2022.
Comments: Accepted to NeurIPS 202
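A sketch restricted to the one-dimensional case (an assumption made for brevity; the paper studies the general problem): since W2 on the real line is the L^2 distance between quantile functions, Wasserstein K-means can be run as ordinary K-means on quantile vectors, here with scikit-learn.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)
samples = np.vstack([rng.normal(0, 1, (30, 5_000)),   # 30 distributions centred near 0
                     rng.normal(4, 1, (30, 5_000))])  # 30 distributions centred near 4

t = (np.arange(200) + 0.5) / 200
quantile_vectors = np.quantile(samples, t, axis=1).T  # one row per distribution

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(quantile_vectors)
print(labels)
```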

<–—2022———2022———1030—


Preprint ARTICLE | doi:10.20944/preprints202112.0506.v1

On the Distributional Characterization of Graph Models of Water Distribution Networks in Wasserstein Spaces

Antonio Candelieri; Andrea Ponti; Francesco Archetti

Subject: Mathematics & Computer Science, Numerical Analysis & Optimization. Keywords: multi-objective; evolutionary algorithms; Pareto optimality; Wasserstein distance; network vulnerability; resilience; sensor placement.

Online: 31 December 2021 (11:01:51 CET)

Show abstractShare 



Wasserstein dropout

Sicking, J; Akila, M; (...); Fischer, A

Sep 2022 (Early Access) | Enriched Cited References

Despite of its importance for safe machine learning, uncertainty quantification for neural networks is far from being solved. State-of-the-art approaches to estimate neural uncertainties are often hybrid, combining parametric models with explicit or implicit (dropout-based) ensembling. We take another pathway and propose a novel approach to uncertainty quantification for regression tasks, Wasse

Show more

Free Full Text From Publisher


Wasserstein Patch Prior for Image Superresolution

Hertrich, J; Houdard, A and Redenbach, C

2022 | 

IEEE TRANSACTIONS ON COMPUTATIONAL IMAGING

 8 , pp.693-704Enriched Cited References

Many recent superresolution methods are based on supervised learning. That means, that they require a large database of pairs of high- and low-resolution images as training data. However, for many applications, acquiring registered pairs of high and low resolution data or even imaging a large area with a high resolution is unrealistic. To overcome this problem, we introduce a Wasserstein patch

Show more

Free Submitted Article From RepositoryFull Text at Publishermore_horiz

63 References  Related records 

Global Wasserstein Margin maximization for boosting generalization in adversarial training

Yu, TY; Wang, S and Yu, XZ

Sep 2022 (Early Access) | 

APPLIED INTELLIGENCE

Enriched Cited References

In recent researches on adversarial robustness boosting, the trade-off between standard and robust generalization has been widely concerned, in which margin, the average distance from samples to the decision boundary, has become the bridge between the two ends. In this paper, the problems of the existing methods to improve the adversarial robustness by maximizing the margin are discussed and an

Show more

Full Text at Publishermore_horiz

38 References  Related records


Generalized displacement convexity for nonlinear mobility continuity equation and entropy power concavity on Wasserstein space over Riemannian manifolds

Wang, YZ; Li, SJ and Zhang, XX

Sep 2022 (Early Access) | 

MANUSCRIPTA MATHEMATICA

In this paper, we prove the generalized displacement convexity for nonlinear mobility continuity equation with p-Laplacian on Wasserstein space over Riemannian manifolds under the generalized McCann condition GMC(m, n). Moreover, we obtain some variational formulae along the Langevin deformation of flows on the generalized Wasserstein space, which is the interpolation between the gradient flow

Show more

Full Text at Publishermore_horiz

32 References  Related records


2022

TO VINTITLE

MR4479897 Prelim Cui, Jianbo; Dieci, Luca; Zhou, Haomin; A Continuation Multiple Shooting Method for Wasserstein Geodesic Equation. SIAM J. Sci. Comput. 44 (2022), no. 5, A2918–A2943. 65K10 (34A55 49M25 49Q22 65L09 65L10 65M99 65P10)

Review PDF Clipboard Journal Article


MR4474563 Prelim Wiesel, Johannes C. W.; Measuring association with Wasserstein distances. Bernoulli 28 (2022), no. 4, 2816–2832. 62G05 (49Q22 62G20 62H20)

Review PDF Clipboard


2022 see 2021  Working Paper  Full Text

On clustering uncertain and structured data with Wasserstein barycenters and a geodesic criterion for the number of clusters

Papayiannis, G I; Domazakis, G N; Drivaliaris, D; Koukoulas, S; Tsekrekos, A E; et al.

arXiv.org; Ithaca, Sep 13, 2022.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window

 Get full textLink to external site, this link will open in a new window

      

Working Paper  Full Text

Wasserstein Embedding for Capsule Learning
Shamsolmoali, Pourya; Zareapoor, Masoumeh; Das, Swagatam; Granger, Eric; Garcia, Salvador.
arXiv.org; Ithaca, Sep 1, 2022.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window


[PDF] aclanthology.org

Quantized Wasserstein Procrustes Alignment of Word Embedding Spaces

PO Aboagye, Y Zheng, M Yeh, J Wang… - Proceedings of the …, 2022 - aclanthology.org

… We also defined what the 2-Wasserstein distance is and looked in detail at how the

Wasserstein-Procrustes problem under the unsupervised CLWE model is solved in practice. We …

 All 2 versions 

<–—2022———2022———1040—


[PDF] aaai.org

Wasserstein Unsupervised Reinforcement Learning

S He, Y Jiang, H Zhang, J Shao, X Ji - Proceedings of the AAAI …, 2022 - ojs.aaai.org

… Therefore, we choose Wasserstein distance, a well-studied … By maximizing Wasserstein

distance, the agents equipped … First, we propose a novel framework adopting Wasserstein …

Cited by 6 Related articles All 5 versions 

2022 see 2021  [PDF] mlr.press

Linear-time gromov wasserstein distances using low rank couplings and costs

M Scetbon, G Peyré, M Cuturi - International Conference on …, 2022 - proceedings.mlr.press

… The Gromov-Wasserstein (GW) framework provides an increasingly popular answer to such

problems, by seeking a low-distortion, geometrypreserving assignment between these points…

 Cited by 11 Related articles All 3 versions 


2022 see 2021  [PDF] arxiv.org

A continuation multiple shooting method for Wasserstein geodesic equation

J Cui, L Dieci, H Zhou - SIAM Journal on Scientific Computing, 2022 - SIAM

… that is, on computation of the Wasserstein distance gW and the … to the solution of the

Wasserstein geodesic equation, a two-… once, one can recover the Wasserstein distance, the OT …

Cited by 4 Related articles All 3 versions

MR4479897 

MR4495278 Thesis

  THESIS/DISSERTATION

Computational Inversion with Wasserstein Distances and Neural Network Induced Loss Functions

Ding, Wen2022

Read Online

Computational Inversion with Wasserstein Distances and Neural Network Induced Loss Functions

Available Online 

[PDF] columbia.edu

Computational Inversion with Wasserstein Distances and Neural Network Induced Loss Functions

W Ding - 2022 - academiccommons.columbia.edu

… The scientific contributions of the thesis can be summarized in two directions. In the first

part of this thesis, we investigate the general impacts of different Wasserstein metrics and the …


Electrocardiograph Based Emotion Recognition via WGAN-GP Data Enhancement and Improved CNN

J Hu, Y Li - International Conference on Intelligent Robotics and …, 2022 - Springer

… However, WGAN uses weight cropping to restrict the absolute value … WGAN The loss function 

of -GP consists of two parts: the loss term and the gradient penalty term of the native WGAN

 All 2 versions


2022


A novel sEMG data augmentation based on WGAN-GP

F Coelho, MF Pinto, AG Melo, GS Ramos… - Computer Methods in …, 2022 - Taylor & Francis

WGAN-GP focus is to obtain stable models during the training phase. However, to the best of 

our knowledge, no works in the literature used WGAN… network called WGAN with a gradient …

All 3 versions


Wasserstein generative adversarial networks for modeling marked events

SHS Dizaji, S Pashazadeh, JM Niya - The Journal of Supercomputing, 2022 - Springer

… The WGAN for time generation is the same as the WGANTPP model introduced in [7]. In this 

research, another conditional WGAN is … WGAN model for marked events, the original WGAN


Gromov-Wasserstein distances between Gaussian distributions
Authors: Julie Delon; Agnes Desolneux; Antoine Salmona
Summary: Gromov-Wasserstein distances were proposed a few years ago to compare distributions which do not lie in the same space. In particular, they offer an interesting alternative to the Wasserstein distances for comparing probability measures living on Euclidean spaces of different dimensions. We focus on the Gromov-Wasserstein distance with a ground cost defined as the squared Euclidean distance, and we study the form of the optimal plan between Gaussian distributions. We show that when the optimal plan is restricted to Gaussian distributions, the problem has a very simple linear solution, which is also a solution of the linear Gromov-Monge problem. We also study the problem without restriction on the optimal plan, and provide lower and upper bounds for the value of the Gromov-Wasserstein distance between Gaussian distributions.
Article
Publication:Journal of Applied Probability, 59, 20221218, 1178


UP-WGAN: Upscaling Ambisonic Sound Scenes Using Wasserstein Generative Adversarial Networks

Y Wang, X Wu, T Qu - Audio Engineering Society Convention 151, 2022 - aes.org

Sound field reconstruction using spherical harmonics (SH) has been widely used. However, 

order-limited summation leads to an inaccurate reconstruction of sound pressure when the …

Related articles All 2 versions


Wasserstein Distance-Based Nonlinear Dimensionality Reduction for Depth-of-Interaction Decoding in Monolithic Crystal PET Detector

S Bae, JS Lee - 2022 - Soc Nuclear Med

… the Wasserstein distance-based LLE (W-LLE) for DOI decoding. The Wasserstein distance 

… using Euclidean distance, and the Wasserstein distance between 1D distributions is a …

<–—2022———2022———1050—


Acoustic metamaterial design using Conditional Wasserstein Generative Adversarial Networks

P Lai, F Amirkulova - The Journal of the Acoustical Society of …, 2022 - asa.scitation.org

This talk presents a method for generating planar configurations of scatterers with a reduced 

total scattering cross section (TSCS) by means of generative modeling and deep learning. …

Related articles All 2 versions


[PDF] arxiv.org

Wasserstein Task Embedding for Measuring Task Similarities

X Liu, Y Bai, Y Lu, A Soltoggio, S Kolouri - arXiv preprint arXiv:2208.11726, 2022 - arxiv.org

Wasserstein distance between their updated samples. Lastly, we leverage the 2-Wasserstein 

points approximates the proposed 2-Wasserstein distance between tasks. We show that …

Cited by 1 Related articles All 2 versions

[PDF] optimization-online.org

[PDF] Wasserstein Logistic Regression with Mixed Features

A Selvi, MR Belbasi, MB Haugh, W Wiesemann - optimization-online.org

… We note that in practice, the radius ϵ of the Wasserstein ball will be chosen via cross-validation 

(cf. Section 4), in which case our mixed-feature model reliably outperforms the classical …

[PDF] arxiv.org

Wasserstein Logistic Regression with Mixed Features

A Selvi, MR Belbasi, MB Haugh… - arXiv preprint arXiv …, 2022 - arxiv.org

… Missing values (NaNs) were encoded as a new category of the corresponding feature; the 

exceptions are the data sets agaricus-lepiota and breast-cancer, where rows with missing …

 Related articles All 2 versions


[PDF] aclanthology.org

Quantized Wasserstein Procrustes Alignment of Word Embedding Spaces

PO Aboagye, Y Zheng, M Yeh, J Wang… - Proceedings of the …, 2022 - aclanthology.org

… Under unsupervised CLWE models that solve the Wasserstein-Procrustes problem, we aim 

to … (4) is equivalent to minimizing the 2-Wasserstein distance between XW and Y to solve for …

All 2 versions


 Class-rebalanced wasserstein distance for multi-source domain adaptation

Q Wang, S Wang, B Wang - Applied Intelligence, 2022 - Springer

… a rebalancing scheme, class-rebalanced Wasserstein distance (CRWD), for unsupervised 

… biased label structure by rectifying the Wasserstein mapping from source to target space. …


2022


A Novel Physical Layer Key Generation Method Based on WGAN-GP Adversarial Autoencoder

J Han, Y Zhou, G Liu, T Liu… - 2022 4th International …, 2022 - ieeexplore.ieee.org

… The WGAN-GP Adversarial Autoencoder’s Structure In this paper, the WGAN-GP adversarial

… To further optimize the network structure, the Wasserstein distance and gradient penalty (GP…


2022 see 2021

A new method of image restoration technology based on WGAN

W Fang, E Gu, W Yi, W Wang… - … Systems Science and …, 2022 - scholars.ttu.edu

… Therefore, we propose an image inpainting network based on Wasserstein generative

adversarial network (WGAN) distance. With the corresponding technology having been adjusted …

Related articles All 2 versions 

[HTML] opticsjournal.net

[HTML] 结合双通道 WGAN-GP 的多角度人脸表情识别算法研究

Cited by 2 Related articles All 2 versions 


[PDF] iop.org

 A new method of image restoration technology based on WGAN

W Fang, E Gu, W Yi, W Wang… - … Systems Science and …, 2022 - scholars.ttu.edu

… Therefore, we propose an image inpainting network based on Wasserstein generative

adversarial network (WGAN) distance. With the corresponding technology having been adjusted …

 Cited by 1 Related articles All 2 versions 

一种基于改进的 WGAN 模型的电缆终端局部放电识别准确率提升方法

傅尧, 周凯, 朱光亚, 王子健, 王国栋, 王子康 - 电网技术, 2022 - cnki.com.cn

[Translated from Chinese] An improved Wasserstein generative adversarial network (WGAN) model: first, an improved WGAN model with conditional generation capability and a stable training process is trained and used to generate new samples; then the new samples are used to …

[Chinese  A method for improving the accuracy of partial discharge identification of cable terminals based on the improved WGAN model]


[2208.06306]  Wasserstein Complexity of Quantum Circuits

https://arxiv.org › quant-ph

by L Li · 2022 — This quantity, known as the quantum circuit complexity, is a fundamental property of quantum evolutions that has widespread applications in ...


 Comment: Testing for Weibull scale families as a test case for

 Wasserstein correlation tests

Jul 5, 2022 — In this paper we construct a bivariate gamma mixture distribution by allowing the scale parameters of the two marginals to have a generalized ...

<–—2022———2022———1060—


[PDF] thecvf.com

Computing Wasserstein-p Distance Between Images With Linear Cost

Y Chen, C Li, Z Lu - … of the IEEE/CVF Conference on …, 2022 - openaccess.thecvf.com

… discrete measures, computing Wasserstein-p distance between … a novel algorithm to compute

the Wasserstein-p distance be… We compute Wasserstein-p distance, estimate the transport …

Related articles  


  Study Findings from University of the Witwatersrand Update Knowledge in Information Technology 

((Neurocartographer: CC-WGAN Based SSVEP Data Generation to Produce a Model toward Symmetrical Behaviour to the Human Brain).

Health & Medicine Week, 09/2022

Newsletter

Single Image Super-Resolution Using Wasserstein Generative Adversarial Network with Gradient Penalty

Y Tang, C Liu, X Zhang - Pattern Recognition Letters, 2022 - Elsevier

… In this paper, a new SISR method is proposed based on Wasserstein GAN, which is a

training more stable GAN with Wasserstein metric. To further increase the SR performance and …

 Cited by 1 Related articles All 3 versions


 2022 see 2021  RTICLE

EEG data augmentation for emotion recognition with a multiple generator conditional Wasserstein GAN

Zhang, Aiming ; Su, Lei ; Zhang, Yin ; Fu, Yunfa ; Wu, Liping ; Liang, Shengjin. Complex & Intelligent Systems, 2021, Vol.8 (4), p.3059-3071

PEER REVIEWED

OPEN ACCESS

 Download PDF 

EEG data augmentation for emotion recognition with a multiple generator conditional Wasserstein GAN

Available Online 

  [HTML] springer.com

[HTML] EEG data augmentation for emotion recognition with a multiple generator conditional Wasserstein GAN

A Zhang, L Su, Y Zhang, Y Fu, L Wu, S Liang - Complex & Intelligent …, 2022 - Springer

… In this paper, a multi-generator conditional Wasserstein GAN method is proposed for the

generation of high-quality artificial that covers a more comprehensive distribution of real data …

Cited by 10 Related articles All 3 versions

 

[HTML] springer.com

[HTML] Wasserstein-based measure of conditional dependence

J Etesami, K Zhang, N Kiyavash - Behaviormetrika, 2022 - Springer

… In this work, we use Wasserstein distance and discuss the advantage of using such metric

… 2003), we obtain an alternative approach for computing the Wasserstein metric as follows: …

Cited by 1


2022


[PDF] ijcai.org

[PDF] Weakly-supervised Text Classification with Wasserstein Barycenters Regularization

J Ouyang, Y Wang, X Li, C Li - ijcai.org

… a Wasserstein barycenter regularization with the weakly-supervised targets on the deep

feature space. The intuition is that the texts tend to be close to the corresponding Wasserstein …

Cited by 2 Related articles All 2 versions 


Working Paper  Full Text

Mandarin Singing Voice Synthesis with Denoising Diffusion Probabilistic Wasserstein GAN

Yin-Ping, Cho; Tsao, Yu; Wang, Hsin-Min; Yi-Wen, Liu.

arXiv.org; Ithaca, Sep 21, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window

 ARTICLE

Mandarin Singing Voice Synthesis with Denoising Diffusion Probabilistic Wasserstein GAN

Yin-Ping Cho ; Yu Tsao ; Hsin-Min Wang ; Yi-Wen Liu. arXiv.org, 2022

OPEN ACCESS

Mandarin Singing Voice Synthesis with Denoising Diffusion Probabilistic Wasserstein GAN

Available Online


Working Paper  Full Text

Quantitative Stability of Barycenters in the Wasserstein Space

Carlier, Guillaume; Delalande, Alex; Merigot, Quentin.

arXiv.org; Ithaca, Sep 21, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window

arXiv

ARTICLE

Quantitative Stability of Barycenters in the Wasserstein Space

Guillaume Carlier ; Alex Delalande ; Quentin Merigot. arXiv.org, 2022

OPEN ACCESS

Quantitative Stability of Barycenters in the Wasserstein Space

Available Online 

Cited by 1 All 6 versions


ARTICLE

Quantum Wasserstein distance based on an optimization over separable states

Tóth, Géza ; Pitrik, József. arXiv.org, 2022

OPEN ACCESS

Quantum Wasserstein distance based on an optimization over separable states

Available Online 

Working Paper  Full Text

Quantum Wasserstein distance based on an optimization over separable states

Tóth, Géza; Pitrik, József.

arXiv.org; Ithaca, Sep 20, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window

arXiv   All 2 versions 
 All 2 versions


ARTICLE

Reversible Coalescing-Fragmentating Wasserstein Dynamics on the Real Line

Konarovskyi, Vitalii ; Max von Renesse. arXiv.org, 2022

OPEN ACCESS

Reversible Coalescing-Fragmentating Wasserstein Dynamics on the Real Line

Available Online

Working Paper  Full Text

Reversible Coalescing-Fragmentating Wasserstein Dynamics on the Real Line

Konarovskyi, Vitalii; Max von Renesse.

arXiv.org; Ithaca, Sep 20, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window

<–—2022———2022———1070— 


ARTICLE

Wasserstein-p Bounds in the Central Limit Theorem under Weak Dependence

Liu, Tianle ; Austern, Morgane. arXiv.org, 2022

OPEN ACCESS

Wasserstein-p Bounds in the Central Limit Theorem under Weak Dependence

Available Online 

Working Paper  Full Text

Wasserstein-p Bounds in the Central Limit Theorem under Weak Dependence

Liu, Tianle; Austern, Morgane.

arXiv.org; Ithaca, Sep 19, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window

Related articles All 2 versions

Working Paper  Full Text

The GenCol algorithm for high-dimensional optimal transport: general formulation and application to barycenters and Wasserstein splines

Friesecke, Gero; Penka, Maximilian.

arXiv.org; Ithaca, Sep 19, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window

Wire Feed  Full Text

Univ Northeast Electric Power Submits Chinese Patent Application for Electric Heating Combined System Distribution Robust Optimization Scheduling Method Based on Improved Wasserstein Measurement


Global IP News. Electrical Patent News; New Delhi [New Delhi]. 17 Sep 2022. 

DetailsFull text

NEWSPAPER ARTICLE

Univ Northeast Electric Power Submits Chinese Patent Application for Electric Heating Combined System Distribution Robust Optimization Scheduling Method Based on Improved Wasserstein Measurement

Global IP News. Electrical Patent News, 2022

Univ Northeast Electric Power Submits Chinese Patent Application for Electric Heating Combined System Distribution Robust Optimization Scheduling Method Based on Improved Wasserstein Measurement

No Online Access 


Working Paper  Full Text

Nonlocal Wasserstein Distance: Metric and Asymptotic Properties

Slepčev, Dejan; Warren, Andrew.

arXiv.org; Ithaca, Sep 17, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window


ARTICLE

Solving Fredholm Integral Equations of the First Kind via Wasserstein Gradient Flows

Francesca R Crucinio ; Valentin De Bortoli ; Arnaud Doucet ; Adam M Johansen. arXiv.org, 2022

OPEN ACCESS

Solving Fredholm Integral Equations of the First Kind via Wasserstein Gradient Flows

Available Online 

Working Paper  Full Text

Solving Fredholm Integral Equations of the First Kind via Wasserstein Gradient Flows

Crucinio, Francesca R; De Bortoli, Valentin; Doucet, Arnaud; Johansen, Adam M.

arXiv.org; Ithaca, Sep 16, 2022.

All 2 versions


2022





Cattiaux, Patrick; Fathi, Max; Guillin, Arnaud

Self-improvement of the Bakry-Emery criterion for Poincaré inequalities and Wasserstein contraction using variable curvature bounds. (English. French summary) Zbl 07589404

J. Math. Pures Appl. (9) 166, 1-29 (2022).

MSC:  26D10 47D07 60G10 60J60

PDF BibTeX XML Cite  OpenURL 

Cited by 2 Related articles All 13 versions

Zbl 07589404 |     MR4488132

ARTICLE

Self-improvement of the Bakry-Emery criterion for Poincaré inequalities and Wasserstein contraction using variable curvature bounds

Cattiaux, Patrick ; Fathi, Max ; Guillin, Arnaud. Journal de mathématiques pures et appliquées, 2022, Vol.166, p.1

PEER REVIEWED

Self-improvement of the Bakry-Emery criterion for Poincaré inequalities and Wasserstein contraction using variable curvature bounds

Available Online 

<–—2022———2022———1080—


Yang, Chaoran; Chang, Guangping

A bootstrap method of testing normality based on L2 Wasserstein distance. (English) Zbl 07588254

Chin. J. Appl. Probab. Stat. 38, No. 2, 179-194 (2022).

MSC:  62F40

PDF BibTeX XML Cite

Full Text: Link 
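A sketch of the general idea only (a parametric bootstrap with an L2-Wasserstein statistic between the empirical law and the fitted normal); the paper's exact statistic and resampling scheme may differ, and the function names, grid size and bootstrap size are arbitrary.

```python
import numpy as np
from scipy.stats import norm

def w2_to_fitted_normal(x, grid_size=2_000):
    t = (np.arange(grid_size) + 0.5) / grid_size
    q_emp = np.quantile(x, t)
    q_fit = norm.ppf(t, loc=x.mean(), scale=x.std(ddof=1))
    return np.sqrt(np.mean((q_emp - q_fit) ** 2))

def bootstrap_pvalue(x, n_boot=500, seed=0):
    rng = np.random.default_rng(seed)
    stat = w2_to_fitted_normal(x)
    boot = [w2_to_fitted_normal(rng.normal(x.mean(), x.std(ddof=1), x.size))
            for _ in range(n_boot)]
    return float(np.mean(np.asarray(boot) >= stat))

rng = np.random.default_rng(6)
print(bootstrap_pvalue(rng.normal(size=300)))       # normal data: large p-value
print(bootstrap_pvalue(rng.exponential(size=300)))  # non-normal data: small p-value
```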

Optimization in a traffic flow model as an inverse problem in the Wasserstein space

R Chertovskih, FL Pereira, N Pogodaev, M Staritsyn - IFAC-PapersOnLine, 2022 - Elsevier

We address an inverse problem for a dynamical system in the space of probability measures,

namely, the problem of restoration of the time-evolution of a probability distribution from …

[HTML] springer.com

[HTML] Dissipative probability vector fields and generation of evolution semigroups in Wasserstein spaces

G Cavagnari, G Savaré, GE Sodini - Probability Theory and Related Fields, 2022 - Springer

… operators in Hilbert spaces and of Wasserstein gradient flows for geodesically convex …

By using the properties of the Wasserstein distance, we will first compute the right derivative …

Related articles All 2 versions

arXiv:2209.11703  [pdf, other]  cs.CV
Multivariate Wasserstein Functional Connectivity for Autism Screening
Authors: Oleg Kachan; Alexander Bernstein
Abstract: Most approaches to the estimation of brain functional connectivity from the functional magnetic resonance imaging (fMRI) data rely on computing some measure of statistical dependence, or more generally, a distance between univariate representative time series of regions of interest (ROIs) consisting of multiple voxels. However, summarizing a ROI's multiple time series with its mean or the first pr…  More
Submitted 23 September, 2022; originally announced September 2022.

ARTICLE

Quantile-constrained Wasserstein projections for robust interpretability of numerical and machine learning models

Marouane Il Idrissi ; Bousquet, Nicolas ; Gamboa, Fabrice ; Iooss, Bertrand ; Jean-Michel Loubes. arXiv.org, 2022

OPEN ACCESS

Quantile-constrained Wasserstein projections for robust interpretability of numerical and machine learning models

Available Online 

arXiv:2209.11539  [pdf, other]  math.OC  math.PR  math.ST  stat.ML
Quantile-constrained Wasserstein projections for robust interpretability of numerical and machine learning models
Authors: Marouane Il Idrissi; Nicolas Bousquet; Fabrice Gamboa; Bertrand Iooss; Jean-Michel Loubes
Abstract: Robustness studies of black-box models is recognized as a necessary task for numerical models based on structural equations and predictive models learned from data. These studies must assess the model's robustness to possible misspecification of regarding its inputs (e.g., covariate shift). The study of black-box models, through the prism of uncertainty quantification (UQ), is often based on sensi…  More
Submitted 23 September, 2022; originally announced September 2022.


2022


[PDF] sns.it

[PDF] DYNAMICAL SYSTEMS AND HAMILTON-JACOBI-BELLMAN EQUATIONS ON THE WASSERSTEIN SPACE AND THEIR L2 REPRESENTATIONS

C JIMENEZ, A MARIGONDA, M QUINCAMPOIX - 2022 - cvgmt.sns.it

… control problems, both stated in the Wasserstein space of probability measures. Since … the 

Wasserstein space and to investigate the relations between dynamical systems in Wasserstein



arXiv:2209.13570  [pdf, other]  stat.ML  cs.LG
Hierarchical Sliced Wasserstein Distance
Authors: Khai Nguyen, Tongzheng Ren, Huy Nguyen, Litu Rout, Tan Nguyen, Nhat Ho
Abstract: Sliced Wasserstein (SW) distance has been widely used in different application scenarios since it can be scaled to a large number of supports without suffering from the curse of dimensionality. The value of sliced Wasserstein distance is the average of transportation cost between one-dimensional representations (projections) of original measures that are obtained by Radon Transform (RT). Despite i…
More
Submitted 27 September, 2022; originally announced September 2022.
Comments: 30 pages, 7 figures, 6 tables. arXiv admin note: text overlap with arXiv:2204.01188 
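For reference, the plain (non-hierarchical) sliced Wasserstein distance described in this abstract can be estimated by Monte Carlo over random projection directions; the sketch below uses the order-1 distance on each slice and does not implement the paper's hierarchical scheme.

```python
import numpy as np
from scipy.stats import wasserstein_distance

def sliced_wasserstein(X, Y, n_projections=200, seed=0):
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_projections):
        theta = rng.normal(size=X.shape[1])
        theta /= np.linalg.norm(theta)                 # uniform direction on the sphere
        total += wasserstein_distance(X @ theta, Y @ theta)
    return total / n_projections

rng = np.random.default_rng(7)
X = rng.normal(0, 1, (1_000, 10))
Y = rng.normal(1, 1, (1_000, 10))
print(round(sliced_wasserstein(X, Y), 3))
```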


arXiv:2209.12197  [pdf, ps, other]  math.OC
First-order Conditions for Optimization in the Wasserstein Space
Authors: Nicolas Lanzetti, Saverio Bolognani, Florian Dörfler
Abstract: We study first-order optimality conditions for constrained optimization in the Wasserstein space, whereby one seeks to minimize a real-valued function over the space of probability measures endowed with the Wasserstein distance. Our analysis combines recent insights on the geometry and the differential structure of the Wasserstein space with more classical calculus of variations. We show that simp…
More
Submitted 25 September, 2022; originally announced September 2022. 

arXiv:2209.11703  [pdf, other]  cs.CV
Multivariate Wasserstein Functional Connectivity for Autism Screening
Authors: Oleg Kachan, Alexander Bernstein
Abstract: Most approaches to the estimation of brain functional connectivity from the functional magnetic resonance imaging (fMRI) data rely on computing some measure of statistical dependence, or more generally, a distance between univariate representative time series of regions of interest (ROIs) consisting of multiple voxels. However, summarizing a ROI's multiple time series with its mean or the first pr…
More
Submitted 23 September, 2022; originally announced September 2022. 

All 3 versions

arXiv:2209.11539  [pdf, other]  math.OC math.PR math.ST stat.ML
Quantile-constrained Wasserstein projections for robust interpretability of numerical and machine learning models
Authors: Marouane Il Idrissi, Nicolas Bousquet, Fabrice Gamboa, Bertrand Iooss, Jean-Michel Loubes
Abstract: Robustness studies of black-box models is recognized as a necessary task for numerical models based on structural equations and predictive models learned from data. These studies must assess the model's robustness to possible misspecification of regarding its inputs (e.g., covariate shift). The study of black-box models, through the prism of uncertainty quantification (UQ), is often based on sensi…
More
Submitted 23 September, 2022; originally announced September 2022. 

arXiv:2209.10446  [pdf, ps, other]  eess.AS  cs.SD  eess.SP
Mandarin Singing Voice Synthesis with Denoising Diffusion Probabilistic Wasserstein GAN
Authors: Yin-Ping Cho, Yu Tsao, Hsin-Min Wang, Yi-Wen Liu
Abstract: Singing voice synthesis (SVS) is the computer production of a human-like singing voice from given musical scores. To accomplish end-to-end SVS effectively and efficiently, this work adopts the acoustic model-neural vocoder architecture established for high-quality speech and singing voice synthesis. Specifically, this work aims to pursue a higher level of expressiveness in synthesized voices by co…
More
Submitted 21 September, 2022; originally announced September 2022.

All 7 versions

Chen, Yao; Gao, Qingyi; Wang, Xiao

Inferential Wasserstein generative adversarial networks. (English) Zbl 07593405

J. R. Stat. Soc., Ser. B, Stat. Methodol. 84, No. 1, 83-113 (2022).

MSC:  62-XX

PDF BibTeX XML Cite

Full Text: DOI 

Zbl 07593405

<–—2022———2022———1090— 


OpenURL

Santambrogio, Filippo

Book review of: “Lectures on optimal transport” by Luigi Ambrosio, Elia Brué and Daniele Semola, and “An invitation to optimal transport, Wasserstein distances, and gradient flows” by Alessio Figalli and Federico Glaudo. (English) Zbl 07593313

Eur. Math. Soc. Mag. 124, 60-63 (2022).

MSC:  00A17 49-01 49-02 49Q22 60B05 28A33 35A15 35Q35 49N15 28A50 49Jxx

PDF BibTeX XML Cite

Full Text: DOI 

OpenURL


2022 see 2021

Shehadeh, Karmel S.

Data-driven distributionally robust surgery planning in flexible operating rooms over a Wasserstein ambiguity. (English) Zbl 07593191

Comput. Oper. Res. 146, Article ID 105927, 16 p. (2022).

MSC:  90Bxx

PDF BibTeX XML Cite

Full Text: DOI 

Cited by 4 Related articles All 5 versions


[PDF] mlr.press

Variance minimization in the Wasserstein space for invariant causal prediction

GG Martinet, A Strzalkowski… - … Conference on Artificial …, 2022 - proceedings.mlr.press

… Wasserstein variance means that the residuals’ distributions differ substantially across

environments; conversely, a Wasserstein … space (P2,W2) is also called the Wasserstein space. …

 Cited by 3 Related articles All 3 versions 


Solutions to Hamilton–Jacobi equation on a Wasserstein space

Z Badreddine, H Frankowska - Calculus of Variations and Partial …, 2022 - Springer

… optimal control problem in the Wasserstein space \(\mathscr {… invariance theorems in

the Wasserstein space and discuss a … and invariance theorems in the Wasserstein space \(\mathscr …

 Cited by 4 Related articles All 2 versions


[PDF] arxiv.org

Quantitative Stability of Barycenters in the Wasserstein Space

G Carlier, A Delalande, Q Merigot - arXiv preprint arXiv:2209.10217, 2022 - arxiv.org

… Wasserstein barycenters define averages of probability measures in a geometrically … We

show that Wasserstein barycenters depend in a Hölder-continuous way on their marginals …

All 4 versions 


2022


[PDF] arxiv.org

Poisson equation on Wasserstein space and diffusion approximations for McKean-Vlasov equation

Y Li, F Wu, L Xie - arXiv preprint arXiv:2203.12796, 2022 - arxiv.org

… By studying the smoothness of the solution of the nonlinear Poisson equation on Wasserstein

space, we derive the asymptotic limit as well as the optimal rate of convergence for the …

 Cited by 3 Related articles All 2 versions 



[PDF] arxiv.org

Invariance encoding in sliced-Wasserstein space for image classification with limited training data

Y Zhuang, S Li, AHM Rubaiyat, X Yin… - arXiv preprint arXiv …, 2022 - arxiv.org

… strategy to encode invariances as typically done in machine learning, here we propose to

mathematically augment a nearest subspace classification model in sliced-Wasserstein space …

 Related articles All 2 versions 

Invariance encoding in sliced-Wasserstein space for image classification with limited training data

MSE Rabbi, Y Zhuang, S Li… - arXiv preprint …, 2022 - arxiv-export-lb.library.cornell.edu

… strategy to encode invariances as typically done in machine learning, here we propose to

mathematically augment a nearest subspace classification model in sliced-Wasserstein space …

 Cited by 1 Related articles 

 

[PDF] mlr.press

Wasserstein gans with gradient penalty compute congested transport

T Milne, AI Nachman - Conference on Learning Theory, 2022 - proceedings.mlr.press

… developed to calculate the Wasserstein 1 distance between … For WGAN-GP, we find that the

congestion penalty has a … new, in that the congestion penalty turns out to be unbounded and …

Cited by 2 Related articles All 3 versions 
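Since WGAN-GP recurs in many entries on this page, a minimal PyTorch sketch of the standard gradient-penalty term analyzed in the Milne–Nachman paper may help; the toy critic, random batch, and penalty coefficient lam=10 are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn

# Toy critic standing in for the WGAN-GP discriminator.
critic = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))

def gradient_penalty(critic, real, fake, lam=10.0):
    """Standard WGAN-GP term: lam * mean((||grad_x D(x_hat)||_2 - 1)^2),
    evaluated at random interpolates x_hat of real and fake samples."""
    eps = torch.rand(real.size(0), 1)          # per-sample mixing weights
    x_hat = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    d_hat = critic(x_hat)
    grads = torch.autograd.grad(outputs=d_hat, inputs=x_hat,
                                grad_outputs=torch.ones_like(d_hat),
                                create_graph=True)[0]
    return lam * ((grads.norm(2, dim=1) - 1.0) ** 2).mean()

real, fake = torch.randn(32, 2), torch.randn(32, 2)
critic_loss = critic(fake).mean() - critic(real).mean() + gradient_penalty(critic, real, fake)
critic_loss.backward()
```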

 

On isometries of compact Lp–Wasserstein spaces

J Santos-Rodríguez - Advances in Mathematics, 2022 - Elsevier

… This question of determining the structure of the group of isometries of the L p –Wasserstein

space … of the L 2 –Wasserstein space come from isometries of the base space we assume an …

All 2 versions


[PDF] arxiv.org

Viscosity solutions for obstacle problems on Wasserstein space

M Talbi, N Touzi, J Zhang - arXiv preprint arXiv:2203.17162, 2022 - arxiv.org

… on the Wasserstein space, that we call obstacle equation on Wasserstein space by analogy

… the unique solution of the obstacle equation on the Wasserstein space, provided it has C1,2 …

Related articles All 3 versions 

MR4604196

<–—2022———2022———1100— 


[PDF] arxiv.org

Wasserstein Distributional Learning

C Tang, N Lenssen, Y Wei, T Zheng - arXiv preprint arXiv:2209.04991, 2022 - arxiv.org

… is dense in the Wasserstein space so that it well approximates any regular conditional

distributions; In the second one, we prove the optimizer from the Wasserstein regression, f̂τ(x), is …

All 2 versions 


Optimization in a traffic flow model as an inverse problem in the Wasserstein space

R Chertovskih, FL Pereira, N Pogodaev, M Staritsyn - IFAC-PapersOnLine, 2022 - Elsevier

… system in the space of probability … in the Wasserstein space of probability measures. For

the simplest version of this problem, associated with a toy one-dimensional model of traffic flow, …


[PDF] arxiv.org

Wasserstein Hamiltonian flow with common noise on graph

J Cui, S Liu, H Zhou - arXiv preprint arXiv:2204.01185, 2022 - arxiv.org

… We study the Wasserstein Hamiltonian flow with a common … formulation of stochastic

Wasserstein Hamiltonian flow and show … stochastic Wasserstein Hamiltonian flow on graph as …

Cited by 1 Related articles All 2 versions 


[PDF] arxiv.org

Quantum Wasserstein isometries on the qubit state space

GP Gehér, J Pitrik, T Titkos, D Virosztek - arXiv preprint arXiv:2204.14134, 2022 - arxiv.org

… We describe Wasserstein isometries of the quantum bit state space with respect to …

This phenomenon mirrors certain surprising properties of the quantum Wasserstein distance…

Cited by 2 Related articles All 3 versions 


Wasserstein Metric Attack on Person Re-identification

A Verma, AV Subramanyam… - 2022 IEEE 5th …, 2022 - ieeexplore.ieee.org

… After projecting the perturbed image to Wasserstein space, we perform clamping to ensure

that the adversarial sample is a valid image with pixels in the range [0,1]. In Figure 2, we …

Related articles All 2 versions

2022


 SWGAN-GP: Improved Wasserstein Generative Adversarial Network with Gradient Penalty

C Yun-xiang, W Wei, N Juan, C Yi-dan… - Computer and …, 2022 - cam.org.cn

… , this paper proposes an improved Wasserstein generative adversarial network with …

penalty (PSWGAN-GP) method. Based on the Wasserstein distance loss and gradient penalty of …

 Related articles All 2 versions 


 WRI: Wasserstein Regression and Inference

Jul 8, 2022 — Title Wasserstein Regression and Inference. Version 0.2.0 ... tion, prediction, 

and inference of the Wasserstein autoregressive models.



ARTICLE

From geodesic extrapolation to a variational BDF2 scheme for Wasserstein gradient flows

Natale, Andrea ; Todeschi, Gabriele ; Gallouët, Thomas. arXiv.org, 2022

OPEN ACCESS

From geodesic extrapolation to a variational BDF2 scheme for Wasserstein gradient flows

Available Online 

arXiv:2209.14622  [pdf, other]  math.AP math.NA
From geodesic extrapolation to a variational BDF2 scheme for Wasserstein gradient flows
Authors: Andrea Natale, Gabriele Todeschi, Thomas Gallouët
Abstract: We introduce a time discretization for Wasserstein gradient flows based on the classical Backward Differentiation Formula of order two. The main building block of the scheme is the notion of geodesic extrapolation in the Wasserstein space, which in general is not uniquely defined. We propose several possible definitions for such an operation, and we prove convergence of the resulting scheme to the…  More
Submitted 29 September, 2022; originally announced September 2022.
All 6 versions
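For context on the entry above: the first-order scheme that the BDF2 discretization generalizes is the classical minimizing-movement (JKO) step. A hedged LaTeX sketch of that standard one-step scheme, with time step tau and driving energy F, is shown below; it is background, not the paper's BDF2 construction.

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% One implicit (JKO / minimizing-movement) step of a Wasserstein gradient
% flow of an energy F with time step \tau; background for the BDF2 variant.
\[
  \rho^{k+1} \in \operatorname*{arg\,min}_{\rho \in \mathcal{P}_2(\mathbb{R}^d)}
  \left\{ \frac{1}{2\tau}\, W_2^2\bigl(\rho, \rho^{k}\bigr) + F(\rho) \right\}
\]
\end{document}
```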


arXiv:2209.14440  [pdf, other]  cs.LG cs.AI cs.CV stat.ML
GeONet: a neural operator for learning the Wasserstein geodesic
Authors: Andrew Gracyk, Xiaohui Chen
Abstract: Optimal transport (OT) offers a versatile framework to compare complex data distributions in a geometrically meaningful way. Traditional methods for computing the Wasserstein distance and geodesic between probability measures require mesh-dependent domain discretization and suffer from the curse-of-dimensionality. We present GeONet, a mesh-invariant deep neural operator network that learns the non…  More
Submitted 28 September, 2022; originally announced September 2022.
ARTICLE

GeONet: a neural operator for learning the Wasserstein geodesic

Gracyk, Andrew ; Chen, Xiaohui. arXiv.org, 2022

OPEN ACCESS

GeONet: a neural operator for learning the Wasserstein geodesic

Available Online 

Related articles All 3 versions

arXiv:2209.13592  [pdf, other]  astro-ph.IM cs.LG gr-qc physics.ins-det
DVGAN: Stabilize Wasserstein GAN training for time-domain Gravitational Wave physics
Authors: Tom Dooney, Stefano Bromuri, Lyana Curier
Abstract: Simulating time-domain observations of gravitational wave (GW) detector environments will allow for a better understanding of GW sources, augment datasets for GW signal detection and help in characterizing the noise of the detectors, leading to better physics. This paper presents a novel approach to simulating fixed-length time-domain signals using a three-player Wasserstein Generative Adversarial…  More
Submitted 29 September, 2022; v1 submitted 26 September, 2022; originally announced September 2022.
Comments: 10 pages, 6 figures, 3 tables

<–—2022———2022———1110—



arXiv:2209.13570  [pdf, other]  stat.ML cs.LG
Hierarchical Sliced Wasserstein Distance
Authors: Khai Nguyen, Tongzheng Ren, Huy Nguyen, Litu Rout, Tan Nguyen, Nhat Ho
Abstract: Sliced Wasserstein (SW) distance has been widely used in different application scenarios since it can be scaled to a large number of supports without suffering from the curse of dimensionality. The value of sliced Wasserstein distance is the average of transportation cost between one-dimensional representations (projections) of original measures that are obtained by Radon Transform (RT). Despite i…  More
Submitted 28 September, 2022; v1 submitted 27 September, 2022; originally announced September 2022.
Comments: 30 pages, 7 figures, 6 tables. arXiv admin note: text overlap with arXiv:2204.01188
Cited by 1
 All 3 versions 


arXiv:2209.12197  [pdf, ps, other]  math.OC
First-order Conditions for Optimization in the Wasserstein Space
Authors: Nicolas Lanzetti, Saverio Bolognani, Florian Dörfler
Abstract: We study first-order optimality conditions for constrained optimization in the Wasserstein space, whereby one seeks to minimize a real-valued function over the space of probability measures endowed with the Wasserstein distance. Our analysis combines recent insights on the geometry and the differential structure of the Wasserstein space with more classical calculus of variations. We show that simp…  More
Submitted 25 September, 2022; originally announced September 2022.

ARTICLE

First-order Conditions for Optimization in the Wasserstein Space

Lanzetti, Nicolas ; Bolognani, Saverio ; Dörfler, Florian. arXiv.org, 2022

OPEN ACCESS

First-order Conditions for Optimization in the Wasserstein Space

Available Online 
Cited by 2
All 2 versions


2022 see 2021

Wiesel, Johannes C. W.

Measuring association with Wasserstein distances. (English) Zbl 07594079

Bernoulli 28, No. 4, 2816-2832 (2022).

MSC:  62Gxx 62Hxx 90Cxx

PDF BibTeX XML Cite

Full Text: DOI 



2022 see 2021

Niles-Weed, Jonathan; Rigollet, Philippe

Estimation of Wasserstein distances in the spiked transport model. (English) Zbl 07594074

Bernoulli 28, No. 4, 2663-2688 (2022).

MSC:  62Gxx 60Fxx 62Hxx

PDF BibTeX XML Cite

Full Text: DOI   

Zbl 07594074
Cited by 2 All 4 versions

2022 see 2021

Wasserstein Adversarial Regularization for Learning With Label Noise

Fatras, K; Damodaran, BB; (...); Courty, N

Oct 1 2022 | 

IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE

 44 (10) , pp.7296-7306

Enriched Cited References

Noisy labels often occur in vision datasets, especially when they are obtained from crowdsourcing or Web scraping. We propose a new regularization method,

which enables learning robust classifiers in presence of noisy data. To achieve this goal, we propose a new adversarial regularization scheme based on the Wasserstein distance.

Using this distance allows taking into account specific relations… Show more

View full text

1 Citation  67 References  Related records

 

Self-improvement of the Bakry-Emery criterion for Poincare inequalities and Wasserstein contraction using variable curvature bounds

Cattiaux, P; Fathi, M and Guillin, A

Oct 2022 | 

JOURNAL DE MATHEMATIQUES PURES ET APPLIQUEES

 166 , pp.1-29

We study Poincare inequalities and long-time behavior for diffusion processes on Rn under a variable curvature lower bound, in the sense of Bakry-Emery. We derive various estimates on the rate of convergence to equilibrium in L1 optimal transport distance, as well as bounds on the constant in the Poincare inequality in several situations of interest, including some where curvature may be negative

Show more


2022


EEG Generation of Virtual Channels Using an Improved Wasserstein Generative Adversarial Networks

Ling-Long Li, Guang-Zhong Cao, Hong-Jie Liang, Jiang-Cheng Chen & Yue-Peng Zhang

Conference paper

First Online: 10 August 2022

37 Accesses

BOOK CHAPTER

EEG Generation of Virtual Channels Using an Improved Wasserstein Generative Adversarial Networks

Li, Ling-Long ; Cao, Guang-Zhong ; Liang, Hong-Jie ; Chen, Jiang-Cheng ; Zhang, Yue-Peng. Intelligent Robotics and Applications, 2022, p.386-39

EEG Generation of Virtual Channels Using an Improved Wasserstein Generative Adversarial Networks

Available Online 

All 2 versions

Wasserstein Metric Attack on Person Re-identification

https://ieeexplore.ieee.org › document

by A Verma · 2022 — In our work, we propose the Wasserstein metric to perform adversarial attack on ReID system by projecting adversarial samples in the Wasserstein ...


2022 see 2021  ARTICLE

WELL-POSEDNESS FOR SOME NON-LINEAR DIFFUSION PROCESSES AND RELATED PDE ON THE WASSERSTEIN SPACE

Chaudru de Raynal, Paul-Eric ; Frikha, Noufel. Journal de mathématiques pures et appliquées, 2022

PEER REVIEWED

OPEN ACCESS

WELL-POSEDNESS FOR SOME NON-LINEAR DIFFUSION PROCESSES AND RELATED PDE ON THE WASSERSTEIN SPACE

Available Online 



ARTICLE

Right mean for the $\alpha-z$ Bures-Wasserstein quantum divergence

Jeong, Miran ; Hwang, Jinmi ; Kim, Sejong. 2022

OPEN ACCESS

Right mean for the $\alpha-z$ Bures-Wasserstein quantum divergence

Available Online 

 

ARTICLE

A Wasserstein Autoencoder with SMU Activation Function for Anomaly Detection

Wu Guo ; Jaeil Kim. 한국통신학회 학술대회논문집, 2022, Vol.2022 (2), p.27-28

A Wasserstein Autoencoder with SMU Activation Function for Anomaly Detection

Available Online

[PDF] researchgate.net

Wasserstein Autoencoder with SMU Activation Function for Anomaly Detection

W Guo, J Kim - 한국통신학회 학술대회논문집, 2022 - dbpia.co.kr

Wasserstein Autoencoder with SMU Activation Function for Anomaly Detection - 한국통신학회 학술대회논문집 - 한국통신학회 : 논문 - DBpia … A Wasserstein Autoencoder with SMU …

[Chinese  Anomaly detection method for multidimensional time series based on VAE-WGAN]

Related articles

<–—2022———2022———1120—


CONFERENCE PROCEEDING

Dirac Mixture Reduction Using Wasserstein Distances on Projected Cumulative Distributions

Dominik Prossel ; Uwe D. Hanebeck. The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings, 2022

Dirac Mixture Reduction Using Wasserstein Distances on Projected Cumulative Distributions

No Online Access 

CONFERENCE PROCEEDING

Dirac Mixture Reduction Using Wasserstein Distances on Projected Cumulative Distributions

Prossel, Dominik ; Hanebeck, Uwe D. 2022 25th International Conference on Information Fusion (FUSION), 2022, p.1-8

Dirac Mixture Reduction Using Wasserstein Distances on Projected Cumulative Distributions

No Online Access 
 Cited by 2 Related articles All 2 versions


ARTICLE

Wasserstein Complexity of Quantum Circuits

Lu Li ; Kaifeng Bu ; Dax Enshan Koh ; Arthur Jaffe ; Seth Lloyd. arXiv.org, 2022

OPEN ACCESS

Wasserstein Complexity of Quantum Circuits

Available Online 

Cited by 3 All 3 versions

ARTICLE

Bures-Wasserstein geometry for positive-definite Hermitian matrices and their trace-one subset

van Oostrum, Jesse. arXiv.org, 2022

OPEN ACCESS

 Download PDF (via Unpaywall) 

Bures-Wasserstein geometry for positive-definite Hermitian matrices and their trace-one subset

Available Online 

Zbl 07625245
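As a quick numerical companion to the entry above: between positive-definite matrices A and B the Bures-Wasserstein distance satisfies d(A,B)^2 = tr(A) + tr(B) - 2 tr((A^{1/2} B A^{1/2})^{1/2}). The NumPy/SciPy sketch below illustrates that formula only, not the paper's geometric results; matrix sizes and names are arbitrary.

```python
import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein(A, B):
    """d(A,B)^2 = tr(A) + tr(B) - 2 tr((A^{1/2} B A^{1/2})^{1/2})
    for positive-definite matrices A, B."""
    sA = sqrtm(A)
    d2 = np.trace(A) + np.trace(B) - 2.0 * np.trace(sqrtm(sA @ B @ sA))
    return np.sqrt(max(np.real(d2), 0.0))  # clip tiny negative round-off

rng = np.random.default_rng(0)
M1, M2 = rng.normal(size=(3, 3)), rng.normal(size=(3, 3))
A, B = M1 @ M1.T + np.eye(3), M2 @ M2.T + np.eye(3)   # random SPD matrices
print(bures_wasserstein(A, B))
```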

 

ARTICLE

Viability and Exponentially Stable Trajectories for Differential Inclusions in Wasserstein Spaces

Bonnet, Benoît ; Frankowska, Hélène. 2022

OPEN ACCESS

arXiv:2209.03640  [pdf, ps, other]  math.OC
Viability and Exponentially Stable Trajectories for Differential Inclusions in Wasserstein Spaces
Authors: Benoît Bonnet-Weill, Hélène Frankowska
Abstract: In this article, we prove a general viability theorem for continuity inclusions in Wasserstein spaces, and provide an application thereof to the existence of exponentially stable trajectories obtained via the second method of Lyapunov.
Submitted 8 September, 2022; originally announced September 2022.

Cited by 1 All 6 versions

ARTICLE

Quasi $\alpha$-Firmly Nonexpansive Mappings in Wasserstein Spaces

Bërdëllima, Arian ; Steidl, Gabriele. 2022

OPEN ACCESS

Quasi $\alpha$-Firmly Nonexpansive Mappings in Wasserstein Spaces

Available Online 


2022


ARTICLE

Wasserstein Graph Distance based on $L_1$-Approximated Tree Edit Distance between Weisfeiler-Lehman Subtrees

Fang, Zhongxi ; Huang, Jianming ; Su, Xun ; Kasai, Hiroyuki. 2022

OPEN ACCESS


Wasserstein GAN based Chest X-Ray Dataset Augmentation for Deep Learning Models: COVID-19 Detection Use-Case

BZ Hussain, I Andleeb, MS Ansari… - 2022 44th Annual …, 2022 - ieeexplore.ieee.org

… Here, we aim to present a WGAN-based technique which aims to … a WGAN based generative 

model which exhibits favorable performance gain. 2) Image generation using WGAN: It is …

All 3 versions

CONFERENCE PROCEEDING

Wasserstein GAN based Chest X-Ray Dataset Augmentation for Deep Learning Models: COVID-19 Detection Use-Case

Hussain, B Zahid ; Andleeb, Ifrah ; Ansari, Mohammad Samar ; Joshi, Amit Mahesh ; Kanwal, Nadia. Annual International Conference of the IEEE Engineering in Medicine and Biology Society. IEEE Engineering in Medicine and Biology Society. Annual International Conference, 2022, Vol.2022, p.2058-2061

Wasserstein GAN based Chest X-Ray Dataset Augmentation for Deep Learning Models: COVID-19 Detection Use-Case

No Online Access 

Cited by 1 Related articles All 7 versions

ARTICLE

Stable Parallel Training of Wasserstein Conditional Generative Adversarial Neural Networks

Pasini, Massimiliano Lupo ; Yin, Junqi. arXiv.org, 2022

OPEN ACCESS

 Download PDF 

Stable Parallel Training of Wasserstein Conditional Generative Adversarial Neural Networks

Available Online 

 View Issue Contents 

Working Paper Full Text

Stable Parallel Training of Wasserstein Conditional Generative Adversarial Neural Networks
Massimiliano Lupo Pasini; Yin, Junqi.
arXiv.org; Ithaca, Jul 25, 2022.

Abstract/DetailsGet full text
Link to external site, this link will open in a new window
Cited by 1
 Related articles All 3 versions 

CONFERENCE PROCEEDING

Generalized Zero-Shot Learning Using Conditional Wasserstein Autoencoder

Kim, Junhan ; Shim, Byonghyo. ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2022, p.3413-3417

Generalized Zero-Shot Learning Using Conditional Wasserstein Autoencoder

Available Online 

Generalized Zero-Shot Learning Using Conditional Wasserstein Autoencoder

J Kim, B Shim - … 2022-2022 IEEE International Conference on …, 2022 - ieeexplore.ieee.org

… , called conditional Wasserstein autoencoder (CWAE), minimizes the Wasserstein distance 

… In measuring the distance between the two distributions, we use Wasserstein distance1 …

Related articles

2022 see 2021  ARTICLE

Big gaps seismic data interpolation using conditional Wasserstein generative adversarial networks with gradient penalty

Wei, Qing ; Li, Xiangyang. Exploration geophysics (Melbourne), 2022, Vol.53 (5), p.477-48

 Download PDF 

Big gaps seismic data interpolation using conditional Wasserstein generative adversarial networks with gradient penalty

Available Online 

 View Issue Contents 
 Cited by 4 Related articles All 4 versions

<–—2022———2022———1130—


2022 see 2021

[HTML] springer.com

[HTML] EEG data augmentation for emotion recognition with a multiple generator conditional Wasserstein GAN

A Zhang, L Su, Y Zhang, Y Fu, L Wu, S Liang - Complex & Intelligent …, 2022 - Springer

… In this paper, a multi-generator conditional Wasserstein GAN method is proposed for the 

generation of high-quality artificial data that covers a more comprehensive distribution of real data …

Cited by 10 Related articles All 3 versions

2022 see 2021  ARTICLE

Fast convergence of empirical barycenters in Alexandrov spaces and the Wasserstein space

Le Gouic, Thibaut ; Paris, Quentin ; Rigollet, Philippe ; Stromme, Austin J. Journal of the European Mathematical Society : JEMS,

 Download PDF 

Fast convergence of empirical barycenters in Alexandrov spaces and the Wasserstein space

Available Online 

Cited by 26 Related articles All 7 versions

2022 see 2021  [PDF] arxiv.org

Fast convergence of empirical barycenters in Alexandrov spaces and the Wasserstein space

T Le Gouic, Q Paris, P Rigollet… - Journal of the European …, 2022 - ems.press

… In particular, our results apply to infinite-dimensional spaces such as the 2-Wasserstein 

space, where bi-extendibility of geodesics translates into regularity of Kantorovich potentials. …

Cited by 24 Related articles All 7 versions

2022 see 2021  ARTICLE

Inverse airfoil design method for generating varieties of smooth airfoils using conditional WGAN-gp

Yonekura, Kazuo ; Miyamoto, Nozomu ; Suzuki, Katsuyuki. Structural and multidisciplinary optimization, 2022, Vol.65 (6)

PEER REVIEWED

 Download PDF 

Inverse airfoil design method for generating varieties of smooth airfoils using conditional WGAN-gp

Available Online 

 View Issue Contents 

  Inverse airfoil design method for generating varieties of smooth airfoils using conditional WGAN-gp

by Yonekura, Kazuo; Miyamoto, Nozomu; Suzuki, Katsuyuki

Structural and multidisciplinary optimization, 06/2022, Volume 65, Issue 6

Machine learning models are recently adopted to generate airfoil shapes. A typical task is to obtain airfoil shapes that satisfy the required lift coefficient....

Article PDFPDF

Journal Article   Full Text Online  More Options 

View in Context Browse Journal

 Cited by 5 Related articles All 7 versions

 MR4434249

2022 see 2021  ARTICLE

Virtual persistence diagrams, signed measures, Wasserstein distances, and Banach spaces

Bubenik, Peter ; Elchesen, Alex. Journal of Applied and Computational Topology, 2022

PEER REVIEWED

OPEN ACCESS

Virtual persistence diagrams, signed measures, Wasserstein distances, and Banach spaces

Available Online 

Cited by 5 Related articles All 4 versions
MR4496687  | Zbl 07619345


ARTICLE

Parisi's formula is a Hamilton-Jacobi equation in Wasserstein space

Mourrat, Jean-Christophe. Canadian journal of mathematics, 2022

PEER REVIEWED

OPEN ACCESS

Parisi's formula is a Hamilton-Jacobi equation in Wasserstein space

Available Online 


2022

ARTICLE

An Efficient Content Popularity Prediction of Privacy Preserving Based on Federated Learning and Wasserstein GAN

Wang, Kailun ; Deng, Na ; Li, Xuanheng. IEEE internet of things journal, 2022, p.1-1

An Efficient Content Popularity Prediction of Privacy Preserving Based on Federated Learning and Wasserstein GAN

Available Online 

An Efficient Content Popularity Prediction of Privacy Preserving Based on Federated Learning and Wasserstein GAN

2022 see 2021  ARTICLE

A Target SAR Image Expansion Method Based on Conditional Wasserstein Deep Convolutional GAN for Automatic Target Recognition

Qin, Jikai ; Liu, Zheng ; Ran, Lei ; Xie, Rong ; Tang, Junkui ; Guo, Zekun. IEEE journal of selected topics in applied earth observations and remote sensing, 2022, Vol.15, p.1-18

PEER REVIEWED

OPEN ACCESS

 Download PDF 

A Target SAR Image Expansion Method Based on Conditional Wasserstein Deep Convolutional GAN for Automatic Target Recognition

Available Online 

 View Issue Contents 

All 2 versions

[PDF] ieee.org

A Target SAR Image Expansion Method Based on Conditional Wasserstein Deep Convolutional GAN for Automatic Target Recognition

J Qin, Z Liu, L Ran, R Xie, J Tang… - IEEE Journal of Selected …, 2022 - ieeexplore.ieee.org

… [33] exploited the gradient penalty (GP) on the WGAN to … scheme called conditional 

Wasserstein DCGAN with a gradient … Meanwhile, the Wasserstein distance and gradient penalty …

2022 see 2021  ARTICLE

Wasserstein distance-based probabilistic linguistic TODIM method with application to the evaluation of sustainable rural tourism potential

Zhang, Shitao ; Wu, Zhangjiao ; Ma, Zhenzhen ; Liu, Xiaodi ; Wu, Jian. Ekonomska istraživanja, 2022, Vol.35 (1), p.409-437

OPEN ACCESS

 Download PDF 

Wasserstein distance-based probabilistic linguistic TODIM method with application to the evaluation of sustainable rural tourism potential

Available Online 

 View Issue Contents 

Cited by 3 Related articles All 2 versions

ARTICLE

Improved Wasserstein Generative Adversarial Networks Defense Method against Data Integrity Attack on Smart Grid

Li, Yuancheng ; Wang, Xiao ; Zeng, Jing. Recent Advances in Electrical & Electronic Engineering (Formerly Recent Patents on Electrical & Electronic Engineering), 2022, Vol.15

PEER REVIEWED

Improved Wasserstein Generative Adversarial Networks Defense Method against Data Integrity Attack on Smart Grid

No Online Access 

 Improved Wasserstein Generative Adversarial Networks Defense Method Against Data Integrity Attack on Smart Grid


ARTICLE

Tracial smooth functions of non-commuting variables and the free Wasserstein manifold

Jekel, David ; Li, Wuchen ; Shlyakhtenko, Dimitri. Dissertationes Mathematicae, 2022

PEER REVIEWED

 Download PDF (via Unpaywall) 

Tracial smooth functions of non-commuting variables and the free Wasserstein manifold

No Online Access 

<–—2022———2022———1140—


[PDF] arxiv.org

Wasserstein K-means for clustering probability distributions

Y Zhuang, X Chen, Y Yang - arXiv preprint arXiv:2209.06975, 2022 - arxiv.org

… The peculiar behaviors of Wasserstein barycenters may make the … Wasserstein $K$-means 

can achieve exact recovery given the clusters are well-separated under the $2$-Wasserstein

Cited by 4 Related articles All 3 versions
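The paper treats general distributions; as a toy illustration of the idea, the one-dimensional case is easy to sketch, since for equal-size sorted samples the 2-Wasserstein distance is the RMS difference of order statistics and the barycenter averages quantiles. The function below is a hypothetical minimal version, not the authors' algorithm; names and parameters are illustrative.

```python
import numpy as np

def wasserstein_kmeans_1d(samples, k, n_iter=50, seed=0):
    """Toy Lloyd-style K-means over 1D empirical distributions under W2.
    Each distribution is encoded by its sorted sample (equal sizes assumed);
    cluster centers are coordinate-wise (quantile) means."""
    rng = np.random.default_rng(seed)
    Q = np.sort(np.asarray(samples, dtype=float), axis=1)   # (m, n)
    centers = Q[rng.choice(len(Q), size=k, replace=False)]
    for _ in range(n_iter):
        d2 = ((Q[:, None, :] - centers[None, :, :]) ** 2).mean(axis=2)
        labels = d2.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = Q[labels == j].mean(axis=0)
    return labels, centers

rng = np.random.default_rng(1)
data = [rng.normal(loc, 1.0, size=200) for loc in (0, 0, 5, 5, 10)]
print(wasserstein_kmeans_1d(data, k=3)[0])
```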

DISSERTATION

Lagrangian discretization of variational problems in Wasserstein spaces

Sarrazin, Clément. 2022

OPEN ACCESS

Lagrangian discretization of variational problems in Wasserstein spaces

No Online Access 

 

University of Medellin Researchers Yield New Data on Risk Management (Multi-Variate Risk Measures under Wasserstein...
Medical Letter on the CDC & FDA, 10/2022
NewsletterCitation Online

DISSERTATION

Wasserstein 행렬 평균과 작용소로의 확장

황진미, 2022

Wasserstein 행렬 평균과 작용소로의 확장

No Online Access 

[Korean  Wasserstein Matrix Means and Extensions to Operators]

 

ARTICLE

Wasserstein $K$-means for clustering probability distributions

Zhuang, Yubo ; Chen, Xiaohui ; Yang, Yun. 2022

OPEN ACCESS

Wasserstein $K$-means for clustering probability distributions

Available Online 


2022


CONFERENCE PROCEEDING

Detecting Incipient Fault Using Wasserstein Distance

Lu, Cheng ; Zeng, Jiusun ; Luo, Shihua ; Kruger, Uwe. 2022 IEEE 11th Data Driven Control and Learning Systems Conference (DDCLS), 2022, p.1044-1049

Detecting Incipient Fault Using Wasserstein Distance

No Online Access 
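Only the title of this conference paper is listed above; for readers who want to experiment with the general idea, SciPy ships a 1-Wasserstein distance between empirical samples, and comparing a healthy reference window to a monitored window is a natural, purely illustrative use of it. The thresholding logic sketched here is hypothetical, not the paper's method.

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
reference = rng.normal(0.0, 1.0, size=2000)     # healthy-condition samples
monitored = rng.normal(0.15, 1.05, size=2000)   # samples with a small drift

# 1-Wasserstein distance between empirical distributions; a persistent
# rise above a baseline threshold would flag an incipient fault.
print(wasserstein_distance(reference, reference[:1000]))  # near zero
print(wasserstein_distance(reference, monitored))         # noticeably larger
```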


2022 see 2021  ARTICLE

Wasserstein GAN: Deep Generation Applied on Financial Time Series

Pfenninger, Moritz. SSRN Electronic Journal, 2022

Wasserstein GAN: Deep Generation Applied on Financial Time Series

No Online Access 


ARTICLE

From $p$-Wasserstein Bounds to Moderate Deviations

Fang, Xiao ; Koike, Yuta. 2022

OPEN ACCESS

From $p$-Wasserstein Bounds to Moderate Deviations

Available Online 



2022 see 2021  ARTICLE

Variational Wasserstein gradient flow

Jiaojiao Fan ; Qinsheng Zhang ; Amirhossein Taghvaei ; Yongxin Chen. arXiv.org, 2022

OPEN ACCESS

Variational Wasserstein gradient flow

Available Online 


ARTICLE

A Pipe Ultrasonic Guided Wave Signal Generation Network Suitable for Data Enhancement in Deep Learning: US-WGAN

Lisha Peng ; Shisong Li ; Hongyu Sun ; Songling Huang. Energies (Basel), 2022, Vol.15 (18), p.6695

PEER REVIEWED

OPEN ACCESS

A Pipe Ultrasonic Guided Wave Signal Generation Network Suitable for Data Enhancement in Deep Learning: US-WGAN

Available Online 

34 References  Related records

<–—2022———2022———1150—


ARTICLE

A Method of Integrated Energy Metering Simulation Data Generation Algorithm Based on Variational Autoencoder WGAN

Zhang, Penghe ; Xue, Yang ; Song, Runan ; Yang, Yining ; Wang, Cong ; Yang, Liu. Journal of physics. Conference series, 2022, Vol.2195 (1), p.12031

OPEN ACCESS

A Method of Integrated Energy Metering Simulation Data Generation Algorithm Based on Variational Autoencoder WGAN

Available Online 

 Related articles All 3 versions

ARTICLE

DAG-WGAN: Causal Structure Learning With Wasserstein Generative Adversarial Networks

Petkov, Hristo ; Hanley, Colin ; Dong, Feng. arXiv.org, 2022

[PDF] arxiv.org

DAG-WGAN: Causal Structure Learning With Wasserstein Generative Adversarial Networks

H Petkov, C Hanley, F Dong - arXiv preprint arXiv:2204.00387, 2022 - arxiv.org

Wasserstein loss to causal structure learning by making a direct comparison with DAG-GNN 

Cited by 1 Related articles All 4 versions
DAG-WGAN: Causal Structure Learning With Wasserstein Generative Adversarial Networks


ARTICLE

Optimal 1-Wasserstein Distance for WGANs

Arthur Stéphanovitch ; Ugo Tanielian ; Benoît Cadre ; Nicolas Klutchnikoff ; Gérard Biau. arXiv.org, 2022

OPEN ACCESS

Optimal 1-Wasserstein Distance for WGANs

Available Online 

Cited by 1 Related articles All 2 versions

ARTICLE

Analysis of Generative Deep Learning Models and Features of Their Implementation on the Example of WGAN

Isaienkov, Ya.O. ; Mokin, O.B. Visnyk of Vinnytsia Politechnical Institute, 2022, Vol.160 (1), p.82-94

Analysis of Generative Deep Learning Models and Features of Their Implementation on the Example of WGAN

No Online Access 


NEWSLETTER ARTICLE

Study Findings from University of the Witwatersrand Update Knowledge in Information Technology (Neurocartographer: CC-WGAN Based SSVEP Data Generation to Produce a Model toward Symmetrical Behaviour to the Human Brain)

Health & Medicine Week, 2022, p.7193

Study Findings from University of the Witwatersrand Update Knowledge in Information Technology (Neurocartographer: CC-WGAN Based SSVEP Data Generation to Produce a Model toward Symmetrical Behaviour to the Human Brain)

Available Online 

All 5 versions

2022

NEWSPAPER ARTICLE

INTERNATIONAL PATENT: HUNAN UNIVERSITY FILES APPLICATION FOR "WGAN-BASED UNSUPERVISED MULTI-VIEW THREE-DIMENSIONAL POINT CLOUD JOINT REGISTRATION METHOD"

US Fed News Service, Including US State News, 2022

INTERNATIONAL PATENT: HUNAN UNIVERSITY FILES APPLICATION FOR "WGAN-BASED UNSUPERVISED MULTI-VIEW THREE-DIMENSIONAL POINT CLOUD JOINT REGISTRATION METHOD"

No Online Access 


2022 patent news   NEWSPAPER ARTICLE

Univ China Mining Applies for Patent on Nonlinear Industrial Process Modeling Method Based on Wgans Data Enhancement

Global IP News: Industrial Patent News, 2022

Univ China Mining Applies for Patent on Nonlinear Industrial Process Modeling Method Based on Wgans Data Enhancement

No Online Access 


2022 patent news   NEWSPAPER ARTICLE  

North China Electric Power Univ Baoding Submits Patent Application for New Energy Capacity Configuration Method Based on WGAN Scene Simulation and Time Sequence Production Simulation

Global IP News. Energy Patent News, 2022

North China Electric Power Univ Baoding Submits Patent Application for New Energy Capacity Configuration Method Based on WGAN Scene Simulation and Time Sequence Production Simulation

No Online Access 


NEWSPAPER ARTICLE

Univ Yanshan Submits Chinese Patent Application for Cement Clinker Free Calcium Sample Data Enhancement and Prediction Method Based on R-WGAN

Global IP News: Construction Patent News, 2022

Univ Yanshan Submits Chinese Patent Application for Cement Clinker Free Calcium Sample Data Enhancement and Prediction Method Based on R-WGAN

No Online Access 


2022 patent news    NEWSPAPER ARTICLE

Univ Yanshan Submits Chinese Patent Application for Cement Clinker Free Calcium Sample Data Enhancement and Prediction Method Based on R-WGAN

Global IP News. Construction Patent News, 2022

Univ Yanshan Submits Chinese Patent Application for Cement Clinker Free Calcium Sample Data Enhancement and Prediction Method Based on R-WGAN

No Online Access 

<–—2022———2022———1160—


2022 patent news   NEWSPAPER ARTICLE

State Intellectual Property Office of China Receives Univ Beijing Inf Sci & Tech's Patent Application for Method for Generating Biological Raman Spectrum Data Based on WGAN Generative Adversarial Network

Global IP News: Biotechnology Patent News, 2022

State Intellectual Property Office of China Receives Univ Beijing Inf Sci & Tech's Patent Application for Method for Generating Biological Raman Spectrum Data Based on WGAN Generative Adversarial Network

No Online Access 


2022 see 2021  ARTICLE

A Note on Relative Vaserstein Symbol

Chakraborty, Kuntal. Journal of algebra and its applications, 2022



ARTICLE

On isometries of compact Lp–Wasserstein spaces

Santos-Rodríguez, Jaime. Advances in mathematics (New York. 1965), 2022, Vol.409

PEER REVIEWED

On isometries of compact Lp–Wasserstein spaces

Available Online 


ARTICLE

Wasserstein distributional harvesting for highly dense 3D point clouds

Shu, Dong Wook ; Park, Sung Woo ; Kwon, Junseok. Pattern recognition, 2022, Vol.132

PEER REVIEWED

Wasserstein distributional harvesting for highly dense 3D point clouds

Available Online 

 Cited by 1 Related articles All 4 versions

 Working Paper  Full Text

Exact Convergence Analysis for Metropolis-Hastings Independence Samplers in Wasserstein Distances

Brown, Austin; Jones, Galin L.

arXiv.org; Ithaca, Jun 27, 2022.

Abstract/DetailsGet full text

Link to external site, this link will open in a new window


2022


2022 see 2021  ARTICLE

Distributionally Robust Mean-Variance Portfolio Selection with Wasserstein Distances

Blanchet, Jose ; Chen, Lin ; Zhou, Xun Yu. Management science, 2022, Vol.68 (9), p.6382-6410

PEER REVIEWED

 Download PDF 

Distributionally Robust Mean-Variance Portfolio Selection with Wasserstein Distances

Available Online 

 View Issue Contents 

Cited by 55 Related articles All 6 versions

ARTICLE

Multi-Variate Risk Measures under Wasserstein Barycenter

M Andrea Arias-Serna ; Jean Michel Loubes ; Francisco J Caro-Lopera. Risks (Basel), 2022, Vol.10 (9), p.180

PEER REVIEWED

OPEN ACCESS

 Download PDF 

Multi-Variate Risk Measures under Wasserstein Barycenter

Available Online 

 View Issue Contents 

38 References  Related records
All 7 versions
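Related background for the barycenter-based entries above: for one-dimensional distributions and the 2-Wasserstein metric, the barycenter has a closed form, since its quantile function is the weighted average of the input quantile functions. The sketch below illustrates only that 1D fact, not the multivariate risk-measure construction of the paper; function names and grid size are arbitrary.

```python
import numpy as np

def wasserstein_barycenter_1d(samples, weights=None, n_quantiles=200):
    """W2 barycenter of 1D empirical distributions via quantile averaging."""
    m = len(samples)
    w = np.full(m, 1.0 / m) if weights is None else np.asarray(weights, float)
    levels = (np.arange(n_quantiles) + 0.5) / n_quantiles
    Q = np.stack([np.quantile(s, levels) for s in samples])  # (m, n_quantiles)
    return w @ Q   # quantile function of the barycenter on the level grid

rng = np.random.default_rng(0)
samples = [rng.normal(-2, 1, 500), rng.normal(3, 2, 500)]
print(wasserstein_barycenter_1d(samples)[:5])
```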


NEWSLETTER ARTICLE

Reports Summarize Operational Research Study Results from University of Colorado Denver (An Lp-based, Strongly-polynomial 2-approximation Algorithm for Sparse Wasserstein Barycenters)

Investment Weekly News, 2022, p.845

Reports Summarize Operational Research Study Results from University of Colorado Denver (An Lp-based, Strongly-polynomial 2-approximation Algorithm for Sparse Wasserstein Barycenters)

Available Online

NEWSLETTER ARTICLE


Findings from Hangzhou Dianzi University Has Provided New Data on Applied Intelligence [Finger Vein Image Inpainting Using Neighbor Binary-wasserstein Generative Adversarial Networks ]

Robotics & Machine Learning, 2022, p.67

Findings from Hangzhou Dianzi University Has Provided New Data on Applied Intelligence [Finger Vein Image Inpainting Using Neighbor Binary-wasserstein Generative Adversarial Networks ]

No Online Access 

Cited by 1 Related articles

NEWSLETTER ARTICLE

Research Results from Hong Kong University of Science and Technology Update Understanding of Risk and Financial Management (Simulating Multi-Asset Classes Prices Using Wasserstein Generative Adversarial Network: A Study of Stocks, Futures and ...)

Investment Weekly News, 2022, p.683

Research Results from Hong Kong University of Science and Technology Update Understanding of Risk and Financial Management (Simulating Multi-Asset Classes Prices Using Wasserstein Generative Adversarial Network: A Study of Stocks, Futures and ...)

Available Online 

<–—2022———2022———1170—

NEWSLETTER ARTICLE

Study Results from University of Toronto in the Area of Biomedical and Health Informatics Reported (Improving Non-invasive Aspiration Detection With Auxiliary Classifier Wasserstein Generative Adversarial Networks)

Health & Medicine Week, 2022, p.1253

Study Results from University of Toronto in the Area of Biomedical and Health Informatics Reported (Improving Non-invasive Aspiration Detection With Auxiliary Classifier Wasserstein Generative Adversarial Networks)

Available Online 

Improving Non-Invasive Aspiration Detection With Auxiliary Classifier Wasserstein Generative Adversarial Networks
 

NEWSPAPER ARTICLE

Michigan State University secures contract for Nonlocal Reaction-Diffusion Equations And Wasserstein Gradient Flows

Pivotal Sources, 2022

Michigan State University secures contract for Nonlocal Reaction-Diffusion Equations And Wasserstein Gradient Flows

No Online Access 

[PDF] researchgate.net

Sparse-view CT reconstruction using wasserstein GANs

F Thaler, K Hammernik, C Payer, M Urschler… - Machine Learning for …, 2018 - Springer

… a limited number of projection images using Wasserstein generative adversarial networks 

(wGAN)… In contrast to the blurrier looking images generated by the CNNs trained on \(L_1\), the …

Cited by 10 Related articles All 3 versions


2022 patent news  NEWSPAPER ARTICLE

State Intellectual Property Office of China Releases Univ Guangdong Technology's Patent Application for Wasserstein Distance-Based Object Envelope Multi-View Reconstruction and Optimization Method

Global IP News: Packaging & Containers Patent News, 2022

State Intellectual Property Office of China Releases Univ Guangdong Technology's Patent Application for Wasserstein Distance-Based Object Envelope Multi-View Reconstruction and Optimization Method

No Online Access 

NEWSPAPER ARTICLE

INTERNATIONAL PATENT: SHENZHEN INSTITUTES OF ADVANCED TECHNOLOGY FILES APPLICATION FOR "HIGH-ENERGY IMAGE SYNTHESIS METHOD AND DEVICE BASED ON WASSERSTEIN GENERATIVE ADVERSARIAL NETWORK MODEL"

US Fed News Service, Including US State News, 2022

INTERNATIONAL PATENT: SHENZHEN INSTITUTES OF ADVANCED TECHNOLOGY FILES APPLICATION FOR "HIGH-ENERGY IMAGE SYNTHESIS METHOD AND DEVICE BASED ON WASSERSTEIN GENERATIVE ADVERSARIAL NETWORK MODEL"

No Online Access 


2022


 

NEWSPAPER ARTICLE

State Intellectual Property Office of China Receives Univ Chongqing Posts & Telecom's Patent Application for Visual Dimension Reduction Method Based on Wasserstein Space

Global IP News. Information Technology Patent News, 2022

State Intellectual Property Office of China Receives Univ Chongqing Posts & Telecom's Patent Application for Visual Dimension Reduction Method Based on Wasserstein Space

No Online Access 


NEWSPAPER ARTICLE

Shenzhen Inst Adv Tech Seeks Patent for High-Energy Image Synthesis Method and Device Based on Wasserstein Generative Adversarial Network Model

Global IP News. Optics & Imaging Patent News, 2022

Shenzhen Inst Adv Tech Seeks Patent for High-Energy Image Synthesis Method and Device Based on Wasserstein Generative Adversarial Network Model

No Online Access 


REVIEW

Book review: “Lectures on Optimal Transport” by Luigi Ambrosio, Elia Brué and Daniele Semola, and “An Invitation to Optimal Transport, Wasserstein Distances, and Gradient Flows” by Alessio Figalli and Federico Glaudo

Santambrogio, FilippoEuropean Mathematical Society Magazine, 2022 (124), p.60-63

Book review: “Lectures on Optimal Transport” by Luigi Ambrosio, Elia Brué and Daniele Semola, and “An Invitation to Optimal Transport, Wasserstein Distances, and Gradient Flows” by Alessio Figalli and Federico Glaudo

No Online Access 



DISSERTATION

On Adversarial Regularization of Tabular Wasserstein Generative Adversarial Networks

Eiring, Sverre Roalsø. 2022

OPEN ACCESS

On Adversarial Regularization of Tabular Wasserstein Generative Adversarial Networks

No Online Access 


Non-Parametric and Regularized Dynamical Wasserstein Barycenters for Time-Series...
by Cheng, Kevin C; Aeron, Shuchin; Hughes, Michael C ; More...
10/2022
We consider probabilistic time-series models for systems that gradually transition among a finite number of states. We are particularly motivated by...
Journal Article  Full Text Online

arXiv:2210.01918  [pdf, other]  cs.LG eess.SP
Non-Parametric and Regularized Dynamical Wasserstein Barycenters for Time-Series Analysis
Authors: Kevin C. Cheng, Shuchin Aeron, Michael C. Hughes, Eric L. Miller
Abstract: We consider probabilistic time-series models for systems that gradually transition among a finite number of states, in contrast to the more commonly considered case where such transitions are abrupt or instantaneous. We are particularly motivated by applications such as human activity analysis where the observed time-series contains segments representing distinct activities such as running or walk…  More
Submitted 4 October, 2022; originally announced October 2022.

<–—2022———2022———1180—


Multi-marginal Approximation of the Linear Gromov-Wasserstein...

by Beier, Florian; Beinert, Robert
10/2022

Recently, two concepts from optimal transport theory have successfully been brought to the Gromov--Wasserstein (GW) setting.

This introduces a linear version...

Journal Article  Full Text Online

arXiv:2210.01596  [pdf, ps, other]  math.NA math.OC
Multi-marginal Approximation of the Linear Gromov-Wasserstein Distance
Authors: Florian Beier, Robert Beinert
Abstract: Recently, two concepts from optimal transport theory have successfully been brought to the Gromov--Wasserstein (GW) setting. This introduces a linear version of the GW distance and multi-marginal GW transport. The former can reduce the computational complexity when computing all GW distances of a large set of inputs. The latter allows for a simultaneous matching of more than two marginals, which c… 
More
Submitted 4 October, 2022; originally announced October 2022.
MSC Class: 28A33; 28A35
All 2 versions


arXiv:2210.00898  [pdf, ps, other]  cs.LG cs.AI math.OC math.PR stat.ML
Robust Q-learning Algorithm for Markov Decision Processes under Wasserstein Uncertainty
Authors: Ariel Neufeld, Julian Sester
Abstract: We present a novel Q-learning algorithm to solve distributionally robust Markov decision problems, where the corresponding ambiguity set of transition probabilities for the underlying Markov decision process is a Wasserstein ball around a (possibly estimated) reference measure. We prove convergence of the presented algorithm and provide several examples also using real data to illustrate both th…  More
Submitted 30 September, 2022; originally announced October 2022.

arXiv:2209.15028  [pdf, ps, other]  math.OC math.PR
A smooth variational principle on Wasserstein space
Authors: Erhan Bayraktar, Ibrahim Ekren, Xin Zhang
Abstract: In this note, we provide a smooth variational principle on Wasserstein space by constructing a smooth gauge-type function using the sliced Wasserstein distance. This function is a crucial tool for optimization problems and in viscosity theory of PDEs on Wasserstein space.
Submitted 29 September, 2022; originally announced September 2022.
Comments: Keywords: Smooth variational principle, sliced Wasserstein distance, optimal transport
MSC Class: 58E30; 90C05

Zbl 07702414

Li, W.; Rubio, F. J.

On a prior based on the Wasserstein information matrix. (English) Zbl 07595495

Stat. Probab. Lett. 190, Article ID 109645, 7 p. (2022).

MSC:  60-XX 62-XX

PDF BibTeX XML Cite

Full Text: DOI 

Related articles All 5 versions
Zbl 07595495


Wasserstein \(K\)-means for clustering probability distributions
by Yubo Zhuang; Xiaohui Chen; Yun Yang
arXiv.org, 09/2022
Clustering is an important exploratory data analysis technique to group objects based on their similarity. The widely used \(K\)-means clustering method relies...
Paper  Full Text Online

2022


2022 see 2021
Tangent Space and Dimension Estimation with the Wasserstein...
by Uzu Lim; Harald Oberhauser; Vidit Nanda
arXiv.org, 09/2022
Consider a set of points sampled independently near a smooth compact submanifold of Euclidean space. We provide mathematically rigorous bounds on the number of...
Paper  Full Text Online

 

2022 see 2021
Empirical measures and random walks on compact spaces in the quadratic Wasserstein...
by Borda, Bence
arXiv.org, 09/2022
Estimating the rate of convergence of the empirical measure of an i.i.d. sample to the reference measure is a classical problem in probability theory....

Working Paper  Full Text

Decentralized Convex Optimization on Time-Varying Networks with Application to Wasserstein Barycenters
Yufereva, Olga; Persiianov, Michael; Dvurechensky, Pavel; Gasnikov, Alexander; Kovalev, Dmitry.
arXiv.org; Ithaca, Oct 2, 2022.
Abstract/DetailsGet full text
Link to external site, this link will open in a new window
 
 
Working Paper  Full Text

A Simple and General Duality Proof for Wasserstein Distributionally Robust Optimization
Zhang, Luhao; Yang, Jincheng; Gao, Rui.
arXiv.org; Ithaca, Oct 2, 2022.
Abstract/DetailsGet full text
Link to external site, this link will open in a new window
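For reference, the strong-duality identity at the center of Wasserstein DRO papers such as the one above, stated here informally under standard assumptions on the loss f and transport cost c (see the cited works for precise conditions), is:

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Worst-case expectation over a Wasserstein ball of radius \rho around the
% reference measure P, and its one-dimensional dual (informal statement).
\[
  \sup_{Q:\, W_c(Q,P) \le \rho} \mathbb{E}_{Q}[f(X)]
  \;=\;
  \inf_{\lambda \ge 0} \Bigl\{ \lambda \rho
    + \mathbb{E}_{P}\bigl[\, \sup_{z} \bigl( f(z) - \lambda\, c(z, X) \bigr) \bigr] \Bigr\}
\]
\end{document}
```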


Entropy-Based Wasserstein GAN for Imbalanced Learning

https://www.researchgate.net › ... › Psychology › Learning

Jul 5, 2022 — In this paper, we propose a novel oversampling strategy dubbed Entropy-based Wasserstein Generative Adversarial Network (EWGAN) to generate ...

<–—2022———2022———1190—


A CWGAN-GP-based multi-task learning model for consumer ...

https://www.sciencedirect.com › science › article › abs › pii

by Y Kang · 2022 · Cited by 1 — First, the CWGAN-GP model is employed to learn about the distribution of the borrower population given both accepted and rejected data. Then, the data ...



 CWGAN-GP-based multi-task learning model for consumer credit scoring

Y Kang, L Chen, N Jia, W Wei, J Deng… - Expert Systems with …, 2022 - Elsevier

… (MTL) model (CWGAN-GP-MTL) for consumer credit scoring. First, the CWGAN-GP model is

… through augmenting synthetic bad data generated by CWGAN-GP. Next, we design an MTL …

Cited by 1 Related articles


Gene-CWGAN: a data enhancement method for gene expression profile based on improved CWGAN-GP

F Han, S Zhu, Q Ling, H Han, H Li, X Guo… - Neural Computing and …, 2022 - Springer

… data based on CWGAN-GP (Gene-CWGAN) is proposed in … is adopted in Gene-CWGAN

to make the distribution of … a Gene-CWGAN based on a proxy model (Gene-CWGAN-PS) …

Cited by 2 Related articles


Multi-classification of arrhythmias using ResNet with CBAM on CWGAN-GP augmented ECG Gramian Angular Summation Field

K Ma, AZ Chang'an, F Yang - Biomedical Signal Processing and Control, 2022 - Elsevier

… generative adversarial network with gradient penalty (CWGAN-GP) model to augment the …

features for arrhythmia classification, and that CWGAN-GP based data augmentation provides …

Cited by 2 Related articles

[CITATION] Corrigendum to “Multi-classification of arrhythmias using ResNet with CBAM on CWGAN-GP augmented ECG Angular Summation Field”[Biomed. Signal …

K Ma, AZ Chang'an, F Yang - Biomedical Signal Processing and Control, 2022 - Elsevier


[HTML] hindawi.com

[HTML] Bearing Remaining Useful Life Prediction Based on AdCNN and CWGAN under Few Samples

J Man, M Zheng, Y Liu, Y Shen, Q Li - Shock and Vibration, 2022 - hindawi.com

At present, deep learning is widely used to predict the remaining useful life (RUL) of rotation

machinery in failure prediction and health management (PHM). However, in the actual …

All 4 versions 


2022


Based on CWGAN Deep Learning Architecture to Predict Chronic Wound Depth Image

CL Chin, TY Sun, JC Lin, CY Li… - 2022 IEEE …, 2022 - ieeexplore.ieee.org

… In this paper, the wound depth image predictions using the CWGAN have been studied. The

experimental results show that the wound depth image quality generated by the CWGAN is …


U-Net  cWGAN  이용한 탄성파 탐사 자료 보간 성능 평가

유지윤, 윤대웅 - 지구물리와 물리탐사, 2022 - papersearch.net

… , and conditional Wasserstein GAN (cWGAN) were used as seismic data … cWGAN showed

better prediction performance than U-Net with higher PSNR and SSIM. However, cWGAN …

[Korean  Evaluation of interpolation performance of seismic survey data using U-Net and cWGAN]

ARTICLE

Wasserstein Convergence Rates for Empirical Measures of Subordinated Processes on Noncompact Manifolds

Li, Huaiqian ; Wu, Bingyao. 2022

OPEN ACCESS

Wasserstein Convergence Rates for Empirical Measures of Subordinated Processes on Noncompact Manifolds

Available Online 

Cited by 1 Related articles All 3 versions
MR4591873


CONFERENCE PROCEEDING

Wasserstein-Based Feature Map Knowledge Transfer to Improve the Performance of Small Deep Neural Networks

Gondra, Iker. The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings, 2022

Wasserstein-Based Feature Map Knowledge Transfer to Improve the Performance of Small Deep Neural Networks

No Online Access 

Wasserstein-Based Feature Map Knowledge Transfer to Improve the Performance of Small Deep Neural Networks

CONFERENCE PROCEEDING

A typical scenario generation method for active distribution network based on Wasserstein distance

Liu, Zhijie ; Zhu, Shouzhen ; Lv, Gongxiang ; Zhang, Peng. The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings, 2022

A typical scenario generation method for active distribution network based on Wasserstein distance

No Online Access 

<–—2022———2022———1200—


[HTML] springer.com

[HTML] Indeterminacy estimates, eigenfunctions and lower bounds on Wasserstein distances

N De Ponti, S Farinelli - Calculus of Variations and Partial Differential …, 2022 - Springer

In the paper we prove two inequalities in the setting of\(\mathsf {RCD}(K,\infty)\) spaces

using similar techniques. The first one is an indeterminacy estimate involving the p-Wasserstein …

Cited by 1 Related articles All 6 versions

2022 see 2021  ARTICLE

Interpretable Model Summaries Using the Wasserstein Distance

Dunipace, Eric ; Trippa, Lorenzo. arXiv.org, 2022

OPEN ACCESS

Interpretable Model Summaries Using the Wasserstein Distance

Available Online 


ARTICLE

Coalescing-fragmentating Wasserstein dynamics: particle approach

Konarovskyi, Vitalii. arXiv.org, 2022

OPEN ACCESS

Coalescing-fragmentating Wasserstein dynamics: particle approach

Available Online 


ARTICLE

Semi-Supervised Surface Wave Tomography With Wasserstein Cycle-Consistent GAN: Method and Application to Southern California Plate Boundary Region

Cai, A ; Qiu, H ; Niu, F. 2022

OPEN ACCESS

Semi-Supervised Surface Wave Tomography With Wasserstein Cycle-Consistent GAN: Method and Application to Southern California Plate Boundary Region

Available Online 


2022 see 2021  ARTICLE

Tangent Space and Dimension Estimation with the Wasserstein Distance

Lim, Uzu ; Oberhauser, Harald ; Nanda, Vidit. arXiv.org, 2022

OPEN ACCESS

Tangent Space and Dimension Estimation with the Wasserstein Distance

Available Online 

 

2022


ARTICLE

Quasi \(\alpha\)-Firmly Nonexpansive Mappings in Wasserstein Spaces

Arian Bërdëllima ; Gabriele Steidl. arXiv.org, 2022

OPEN ACCESS

Quasi \(\alpha\)-Firmly Nonexpansive Mappings in Wasserstein Spaces

Available Online 


2022 see 2021  ARTICLE

Empirical measures and random walks on compact spaces in the quadratic Wasserstein metric

Borda, Bence. arXiv.org, 2022

OPEN ACCESS

Empirical measures and random walks on compact spaces in the quadratic Wasserstein metric

Available Online 


ARTICLE

Unadjusted Langevin algorithm with multiplicative noise: Total variation and Wasserstein bounds

Pages, Gilles ; Panloup, Fabien. arXiv.org, 2022

OPEN ACCESS

Unadjusted Langevin algorithm with multiplicative noise: Total variation and Wasserstein bounds

Available Online 


arXiv:2210.04260  [pdf, ps, other]  cs.LG
Coresets for Wasserstein Distributionally Robust Optimization Problems
Authors: Ruomin Huang, Jiawei Huang, Wenjie Liu, Hu Ding
Abstract: Wasserstein distributionally robust optimization (\textsf{WDRO}) is a popular model to enhance the robustness of machine learning with ambiguous data. However, the complexity of \textsf{WDRO} can be prohibitive in practice since solving its ``minimax'' formulation requires a great amount of computation. Recently, several fast \textsf{WDRO} training algorithms for some specific machine learning tas…  More
Submitted 9 October, 2022; originally announced October 2022.

Quantum Wasserstein distance of order 1 between channels
by Duvenhage, Rocco; Mapaya, Mathumo
10/2022
We set up a general theory for a quantum Wasserstein distance of order 1 in an operator algebraic framework, extending recent work in finite dimensions. In...
Journal Article  Full Text Online

arXiv:2210.03483  [pdf, ps, other]  quant-ph math-ph
Quantum Wasserstein distance of order 1 between channels
Authors: Rocco Duvenhage, Mathumo Mapaya
Abstract: We set up a general theory for a quantum Wasserstein distance of order 1 in an operator algebraic framework, extending recent work in finite dimensions. This theory applies not only to states, but also to channels, giving a metric on the set of channels from one composite system to another. The additivity and stability properties of this metric are studied.
Submitted 7 October, 2022; originally announced October 2022.
Comments: 34 pages

Cited by 1 Related articles All 2 versions 
<–—2022———2022———1210—


Adversarial network training using higher-order moments in a modified Wasserstein distance
by Serang, Oliver
10/2022
Generative-adversarial networks (GANs) have been used to produce data closely resembling example data in a compressed, latent space that is close to sufficient...
Journal Article  Full Text Online

arXiv:2210.03354  [pdf, other]  stat.ML cs.LG
Adversarial network training using higher-order moments in a modified Wasserstein distance
Authors: Oliver Serang
Abstract: Generative-adversarial networks (GANs) have been used to produce data closely resembling example data in a compressed, latent space that is close to sufficient for reconstruction in the original vector space. The Wasserstein metric has been used as an alternative to binary cross-entropy, producing more numerically stable GANs with greater mode covering behavior. Here, a generalization of the Wasse…  More
Submitted 7 October, 2022; originally announced October 2022.
ACM Class: G.3; G.1.6


 Short-term prediction of wind power based on BiLSTM–CNN–WGAN-GP
by Huang, Ling; Li, Linxia; Wei, Xiaoyuan ; More...
Soft computing (Berlin, Germany), 01/2022, Volume 26, Issue 20
A short-term wind power prediction model based on BiLSTM–CNN–WGAN-GP (LCWGAN-GP) is proposed in this paper, aiming at the problems of instability and low...
Article PDFPDF
Journal Article  Full Text Online
View in Context Browse Journal



CAN bus fuzzy test case generation method based on WGAN-GP and fuzzy test system

CN CN114936149A 黄柯霖 华中科技大学

Priority 2022-04-27 • Filed 2022-04-27 • Published 2022-08-23

the model generation module is used for building and training a WGAN-GP model based on a neural network through the training data set; the test case generation module is used for configuring a noise vector for the trained WGAN-GP model, so that the WGAN-GP model generates a plurality of virtual CAN …


2022 patent
Rotating machine state monitoring method based on Wasserstein depth digital …

CN CN114662712A 胡文扬 清华大学

Priority 2022-02-22 • Filed 2022-02-22 • Published 2022-06-24

The invention relates to the technical field of artificial intelligence, and discloses a method for monitoring the state of a rotating machine based on a Wasserstein depth digital twin model, which comprises the steps of acquiring operation and maintenance data of the rotating machine in a healthy …


2022 patent

Application of evidence Wasserstein distance algorithm in component …

CN CN114818957A 肖富元 肖富元

Priority 2022-05-10 • Filed 2022-05-10 • Published 2022-07-29

1. The application of the evidence Wasserstein distance algorithm in component identification is characterized in that: the Wasserstein distance is EWD, and the EWD is verified by the following method: 1): let m1 and m2 be the quality function of the multi-intersection element set Θ, where γ i, j …


2022


2022 patent

Wasserstein distance-based battery SOH estimation method and device

CN CN114839552A 林名强 泉州装备制造研究所

Priority 2022-04-08 • Filed 2022-04-08 • Published 2022-08-02

3. The wasserstein distance-based battery SOH estimation method according to claim 1, wherein: in S1, the aging data of the pouch batteries is specifically aging data of eight nominal 740 mA·h pouch batteries recorded in advance. 4. A wasserstein distance-based battery SOH estimation method …



On isometries of compact Lp-Wasserstein spaces

Santos-Rodriguez, J

Nov 19 2022 | 

ADVANCES IN MATHEMATICS

 409

Let (X, d, m) be a compact non-branching metric measure space equipped with a qualitatively non-degenerate measure m. The study of properties of the Lp-Wasserstein space (P_p(X), W_p) associated to X has proved useful in describing several geometrical properties of X. In this paper we focus on the study of isometries of P_p(X) for p ∈ (1, ∞) under the assumption that there

Show more

Full Text at Publisher

18 References  Related records 


A Mayer optimal control problem on Wasserstein spaces over Riemannian manifolds

Jean, F; Jerhaoui, O and Zidani, H

18th IFAC Workshop on Control Applications of Optimization (CAO)

2022 | 

IFAC PAPERSONLINE

 55 (16) , pp.44-49

This paper concerns an optimal control problem on the space of probability measures over a compact Riemannian manifold. The motivation behind it is to model certain situations where the central planner of a deterministic controlled system has only a probabilistic knowledge of the initial condition. The lack of information here is very specific. In particular, we show that the value function ver

Show more

Free Full Text from Publisher


Linear optimal transport embedding: provable Wasserstein classification for certain rigid transformations and perturbations

Moosmuller, C and Cloninger, A

Sep 2022 (Early Access) | 

INFORMATION AND INFERENCE-A JOURNAL OF THE IMA

Discriminating between distributions is an important problem in a number of scientific fields. This motivated the introduction of Linear Optimal Transportation (LOT), which embeds the space of distributions into an L-2 -space. The transform is defined by computing the optimal transport of each distribution to a fixed reference distribution and has a number of benefits when it comes to speed of

Show more

Free Submitted Article From Repository. View full text

35 References  Related records


Network intrusion detection based on conditional wasserstein variational autoencoder with generative adversarial network and one-dimensional convolutional neural networks

He, JX; Wang, XD; (...); Chen, C

Sep 2022 (Early Access) | 

APPLIED INTELLIGENCE

Enriched Cited References

There is a class-imbalance problem that the number of minority class samples is significantly lower than that of majority class samples in common network traffic datasets. Class-imbalance phenomenon will affect the performance of the classifier and reduce the robustness of the classifier to detect unknown anomaly detection. And the distribution of the continuous features in the dataset does not

Show more

Free Full Text From Publisher

<–—2022———2022———1220— 


The Chinese Peoples Liberation Army No.92578 Troops Applies for Patent on Mechanical Pump Small Sample Fault Diagnosis Method Based on WGAN...

Global IP News. Information Technology Patent News, Oct 7, 2022
Newspaper Article  Full Text Online

Wire Feed

The Chinese Peoples Liberation Army No.92578 Troops Applies for Patent on Mechanical Pump Small Sample Fault Diagnosis Method Based on WGAN-GP-C and Metric Learning

Global IP News. Information Technology Patent News; New Delhi [New Delhi]. 07 Oct 2022. 

Full Text


2022 

Hierarchical sliced Wasserstein distance - Nhat Ho

https://nhatptnk8912.github.io › Hierarchical_SW

Tongzheng Ren, Huy Nguyen, Litu Rout, Tan Nguyen, Nhat Ho — University of Texas

by K Nguyen · 2022 · Cited by 1 — September 28, 2022. Abstract. Sliced Wasserstein (SW) distance has been widely used in different application scenarios.

Hierarchical Sliced Wasserstein Distance - OpenReview

https://openreview.net › forum

The paper proposes hierarchical sliced Wasserstein distance which is faster than the ... 22 Sept 2022 (modified: 12 Nov 2022)ICLR 2023 Conference Blind ...

On the use of Wasserstein distance in the distributional ...

https://link.springer.com › article


2022 5/18

Effect of Dependence on the Convergence of Empirical Wasserstein Distance

Wednesday, May 18, 2022. Abstract: The Wasserstein distance is a powerful tool in modern machine learning to metrize the space of probability distributions ...

Institute for Mathematical and Statistical Innovation · May 18, 2022

The Quantum Wasserstein Distance of Order 1 - YouTube

54 views Oct 4, 2022 Speaker: Giacomo De Palma, University of Bologna Title: The Quantum Wasserstein Distance of Order 1 … ...more ...more.

YouTube · Mathematical Picture Language · 1 

see Feb 15,  2022

arXiv:2210.06934  [pdf, other]  stat.ML  stat.AP  stat.ME
On the potential benefits of entropic regularization for smoothing Wasserstein estimators
Authors: Jérémie Bigot, Paul Freulon, Boris P. Hejblum, Arthur Leclaire
Abstract: This paper is focused on the study of entropic regularization in optimal transport as a smoothing method for Wasserstein estimators, through the prism of the classical tradeoff between approximation and estimation errors in statistics. Wasserstein estimators are defined as solutions of variational problems whose objective function involves the use of an optimal transport cost between probability m…  More
Submitted 13 October, 2022; originally announced October 2022.
Comments: 54 pages, 12 figures
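
Note added for context (standard background on entropic regularization, not text from the paper above): the smoothed estimators referred to here are built on the entropic optimal transport cost

  W_eps(mu, nu) = min over pi in Pi(mu, nu) of  ∫ c(x, y) dpi(x, y) + eps · KL(pi | mu ⊗ nu),

where Pi(mu, nu) is the set of couplings of mu and nu and eps > 0 is the smoothing parameter; eps → 0 recovers the unregularized Wasserstein cost, while larger eps trades approximation error for a statistically easier, smoother estimator.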

Wasserstein Barycenter-based Model Fusion and Linear Mode...
by Akash, Aditya Kumar; Li, Sixu; Trillos, Nicolás García
10/2022
Based on the concepts of Wasserstein barycenter (WB) and Gromov-Wasserstein barycenter (GWB), we propose a unified mathematical framework for neural network...
Journal Article  Full Text Online

arXiv:2210.06671  [pdf, other]  cs.LG
Wasserstein Barycenter-based Model Fusion and Linear Mode Connectivity of Neural Networks
Authors: Aditya Kumar Akash, Sixu Li, Nicolás García Trillos
Abstract: Based on the concepts of Wasserstein barycenter (WB) and Gromov-Wasserstein barycenter (GWB), we propose a unified mathematical framework for neural network (NN) model fusion and utilize it to reveal new insights about the linear mode connectivity of SGD solutions. In our framework, the fusion occurs in a layer-wise manner and builds on an interpretation of a node in a network as a function of the…  More
Submitted 12 October, 2022; originally announced October 2022.

Cited by 1 Related articles All 2 versions 
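
Note added for context (standard definition, not quoted from the paper): given probability measures nu_1, …, nu_m and weights lambda_1, …, lambda_m ≥ 0 with sum_i lambda_i = 1, a 2-Wasserstein barycenter is any minimizer

  mu* ∈ argmin over mu of  sum_{i=1}^m lambda_i · W_2^2(mu, nu_i),

and the Gromov-Wasserstein barycenter (GWB) is defined the same way with W_2 replaced by the Gromov-Wasserstein distance, which compares metric-measure spaces that need not sit in a common ambient space.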


[PDF] ijcai.org

[PDF] Weakly-supervised Text Classification with Wasserstein Barycenters Regularization

J Ouyang, Y Wang, X Li, C Li - ijcai.org

… a Wasserstein barycenter regularization with the weakly-supervised targets on the deep 

feature space. The intuition is that the texts tend to be close to the corresponding Wasserstein

Cited by 1 Related articles All 2 versions

[PDF] ieee.org

Distributed Wasserstein Barycenters via Displacement Interpolation

P Cisneros-Velarde, F Bullo - IEEE Transactions on Control of …, 2022 - ieeexplore.ieee.org

Wasserstein space. We characterize the evolution of this algorithm and prove it computes the 

Wasserstein … One version of the algorithm computes a standard Wasserstein barycenter, ie, …

 Related articles All 4 versions


[PDF] arxiv.org

Wasserstein Embedding for Capsule Learning

P Shamsolmoali, M Zareapoor, S Das… - arXiv preprint arXiv …, 2022 - arxiv.org

… Subsequently, we present the Wasserstein Embedding Module that first measures the … Our 

experimental results indicate that Wasserstein Embedding Capsules (WECapsules) perform …

All 2 versions

<–—2022———2022———1230—


[PDF] arxiv.org

Robust Q-learning Algorithm for Markov Decision Processes under Wasserstein Uncertainty

A Neufeld, J Sester - arXiv preprint arXiv:2210.00898, 2022 - arxiv.org

… set of transition probabilities a Wasserstein ball of radius ε … worst-case expectations with 

respect to a Wasserstein ball (see [4], … Relying on the above introduced Wasserstein-distance we …

 All 4 versions 



Unsupervised Learning Model of Sparse Filtering Enhanced Using Wasserstein Distance for Intelligent Fault Diagnosis

G Vashishtha, R Kumar - Journal of Vibration Engineering & Technologies, 2022 - Springer

… features is done by Wasserstein distance with MMD … Wasserstein distance with MMD has 

been proposed. The GNSF is obtained by normalizing the feature matrix whereas Wasserstein


2022 see 2021  [PDF] arxiv.org

Convergence in Wasserstein distance for empirical measures of Dirichlet diffusion processes on manifolds

FY Wang - Journal of the European Mathematical Society, 2022 - ems.press

… 1, where W2 is the L2-Wasserstein distance induced by the Riemannian metric. In general,

…, this implies that the law of {X_t^T : t ∈ [0, t_0]} coincides with the conditional law of {X_t : t ∈ …

Cited by 6 Related articles All 3 versions

 

[PDF] arxiv.org

Non-Parametric and Regularized Dynamical Wasserstein Barycenters for Time-Series Analysis

KC Cheng, S Aeron, MC Hughes, EL Miller - arXiv preprint arXiv …, 2022 - arxiv.org

Wasserstein barycenter. This is in contrast to methods that model these transitions with a 

mixture of the pure state distributions. Here, focusing on the univariate case where Wasserstein

All 2 versions


Novel Robust Minimum Error Entropy Wasserstein Distribution Kalman Filter under Model Uncertainty and Non-Gaussian Noise

Z Feng, G Wang, B Peng, J He, K Zhang - Signal Processing, 2022 - Elsevier

… In this study, we consider a new Wasserstein-distribution … Gaussian distribution as a 

Wasserstein ambiguity set allowable … the minimax problem based on Wasserstein ambiguity sets, …
ARTICLE

Convergence of the empirical measure in expected Wasserstein distance: non asymptotic explicit bounds in $\mathbb{R}^d$

Fournier, Nicolas2022

OPEN ACCESS

[PDF] arxiv.org

Convergence of the empirical measure in expected Wasserstein distance: non asymptotic explicit bounds in R^d

N Fournier - arXiv preprint arXiv:2209.00923, 2022 - arxiv.org

… , and it seems that measuring this convergence in Wasserstein distance is nowadays a widely 

adopted choice. … Choosing for µ the uniform law on B_i(0, 1/2), we find that for i ∈ {2, ∞}, …

 All 3 versions

CONVERGENCE OF THE EMPIRICAL MEASURE IN EXPECTED WASSERSTEIN DISTANCE: NON ASYMPTOTIC EXPLICIT BOUNDS IN Rd

N FOURNIER - perso.lpsm.paris

… , and it seems that measuring this convergence in Wasserstein distance is nowadays a widely 

adopted choice. … Choosing for µ the uniform law on B_i(0, 1/2), we find that for i ∈ {2, ∞}, …

 

[PDF] arxiv.org

On the Wasserstein median of probability measures

K You, D Shung - arXiv preprint arXiv:2209.03318, 2022 - arxiv.org

… the Wasserstein median, an equivalent of Fréchet median under the 2-Wasserstein metric… 

use of any established routine for the Wasserstein barycenter in an iterative manner and …

 All 4 versions


[PDF] arxiv.org

From p-Wasserstein Bounds to Moderate Deviations

X Fang, Y Koike - arXiv preprint arXiv:2205.13307, 2022 - arxiv.org

… Abstract: We use a new method via p-Wasserstein bounds to … The key step of our method is 

to show that the p-Wasserstein … For this purpose, we obtain general p-Wasserstein bounds in …

Save Cite Cited by 2 Related articles All 2 versions 


[PDF] mdpi.com

ECG Classification Based on Wasserstein Scalar Curvature

F Sun, Y Ni, Y Luo, H Sun - Entropy, 2022 - mdpi.com

… method based on Wasserstein scalar curvature to … the Wasserstein geometric structure of 

the statistical manifold. Technically, this paper defines the histogram dispersion of Wasserstein

All 4 versions


[PDF] arxiv.org

DeepWSD: Projecting Degradations in Perceptual Space to Wasserstein Distance in Deep Feature Space

X Liao, B Chen, H Zhu, S Wang, M Zhou… - Proceedings of the 30th …, 2022 - dl.acm.org

… the deep feature domain Wasserstein distance (DeepWSD) … based upon the concept of the 

Wasserstein distance (WSD) [… , is characterized by the Wasserstein distance here. Moreover, …

Cited by 6 Related articles All 3 versions

<–—2022———2022———1240—


Distributionally Robust Portfolio Optimization with Second-Order Stochastic Dominance Based on Wasserstein Metric

Z Hosseini-Nodeh, R Khanjani-Shiraz, PM Pardalos - Information Sciences, 2022 - Elsevier

… The Wasserstein moment ambiguity set was presented and a DRPO model was formulated 

… both the Wasserstein moment and Wasserstein ambiguity sets. Based on the Wasserstein


 

https://www.springerprofessional.de › a-self-attention-b...

A Self-Attention Based Wasserstein ... - Springer Professional

Sep 1, 2022 — A Self-Attention Based Wasserstein Generative Adversarial Networks for Single Image Inpainting. Authors: Yuanxin Mao, Tianzhuang Zhang, Bo Fu, ...

A Self-Attention Based Wasserstein Generative Adversarial Networks for Single Image Inpainting

Y Mao, T Zhang, B Fu, DNH Thanh - Pattern Recognition and Image …, 2022 - Springer

… Second, the proposed model uses the Wasserstein distance (… Wasserstein distance can 

still reflect the distance between the two distributions, and it is conducive to ensuring the stability

 All 2 versions

2022 see 2021

arXiv:2210.11446  [pdf, ps, other]  math-ph  cond-mat.stat-mech  math.PR  quant-ph
The Wasserstein distance of order 1 for quantum spin systems on infinite lattices
Authors: Giacomo De Palma, Dario Trevisan
Abstract: We propose a generalization of the Wasserstein distance of order 1 to quantum spin systems on the lattice Z^d, which we call specific quantum W1 distance. The proposal is based on the W1 distance for qudits of [De Palma et al., IEEE Trans. Inf. Theory 67, 6627 (2021)] and recovers Ornstein's d̄-distance for the quantum states whose marginal states on any finite number of…  More
Submitted 20 October, 2022; originally announced October 2022.


arXiv:2210.10535  [pdf, other]  cs.CG  cs.LG  stat.OT
Stability of Entropic Wasserstein Barycenters and application to random geometric graphs
Authors: Marc Theveneau, Nicolas Keriven
Abstract: As interest in graph data has grown in recent years, the computation of various geometric tools has become essential. In some areas such as mesh processing, they often rely on the computation of geodesics and shortest paths in discretized manifolds. A recent example of such a tool is the computation of Wasserstein barycenters (WB), a very general notion of barycenters derived from the theory of Opt…  More
Submitted 19 October, 2022; originally announced October 2022.
twitter.com › n_keriven › status

New small preprint "Stability of Entropic Wasserstein Barycenters and application to random geometric graphs" We show that WB computed on ...

Twitter · 1 month ago

Oct 14, 2022


arXiv:2210.10268  [pdf, other]  stat.ML  cs.LG
Fast Approximation of the Generalized Sliced-Wasserstein Distance
Authors: Dung Le, Huy Nguyen, Khai Nguyen, Trang Nguyen, Nhat Ho
Abstract: Generalized sliced Wasserstein distance is a variant of sliced Wasserstein distance that exploits the power of non-linear projection through a given defining function to better capture the complex structures of the probability distributions. Similar to sliced Wasserstein distance, generalized sliced Wasserstein is defined as an expectation over random projections which can be approximated by the M…  More
Submitted 18 October, 2022; originally announced October 2022.
Comments: 22 pages, 2 figures. Dung Le, Huy Nguyen and Khai Nguyen contributed equally to this work
Related articles
 All 2 versions 


2022

arXiv:2210.09160  [pdf, other]  stat.ML  cs.LG
Statistical, Robustness, and Computational Guarantees for Sliced Wasserstein Distances
Authors: Sloan Nietert, Ritwik Sadhu, Ziv Goldfeld, Kengo Kato
Abstract: Sliced Wasserstein distances preserve properties of classic Wasserstein distances while being more scalable for computation and estimation in high dimensions. The goal of this work is to quantify this scalability from three key aspects: (i) empirical convergence rates; (ii) robustness to data contamination; and (iii) efficient computational methods. For empirical convergence, we derive fast rates…  More
Submitted 17 October, 2022; originally announced October 2022.
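
For readers new to the sliced distances in the last two entries, the following short Python sketch (my own illustration, assuming equal-size empirical samples with uniform weights; it is not code from either paper) shows the standard Monte Carlo estimator: project both samples onto random directions and average the one-dimensional Wasserstein costs, which reduce to sorting.

import numpy as np

def sliced_wasserstein(X, Y, n_proj=100, p=2, seed=0):
    # Monte Carlo estimate of the sliced p-Wasserstein distance between two
    # empirical measures given by samples X, Y of equal size n (shape (n, d)).
    # Each random direction reduces the problem to 1-D optimal transport,
    # which for sorted projections is a coordinate-wise comparison.
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)          # uniform direction on the sphere
        x_proj = np.sort(X @ theta)
        y_proj = np.sort(Y @ theta)
        total += np.mean(np.abs(x_proj - y_proj) ** p)
    return (total / n_proj) ** (1.0 / p)

# toy usage: two Gaussian clouds in R^5 with shifted means
X = np.random.default_rng(1).normal(size=(500, 5))
Y = np.random.default_rng(2).normal(loc=1.0, size=(500, 5))
print(sliced_wasserstein(X, Y))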


Tool wear state recognition under imbalanced data based on WGAN-...
by Hu, Wen; Guo, Hong; Yan, Bingnan ; More...
Journal of mechanical science and technology, 2022, Volume 36, Issue 10
The tool is an important part of machining, and its condition determines the operational safety of the equipment and the quality of the workpiece. Therefore,...
Article PDF
Journal Article  Full Text Online

 
AC-WGAN-GP: Generating Labeled Samples for Improving...
by Caihao Sun; Xiaohua Zhang; Hongyun Meng ; More...
Remote sensing (Basel, Switzerland), 10/2022, Volume 14, Issue 4910
The lack of labeled samples severely restricts the classification performance of deep learning on hyperspectral image classification. To solve this problem,...
Article PDF
Journal Article  Full Text Online


Tool wear state recognition under imbalanced data based on WGAN-GP and lightweight...
by Hou, Wen; Guo, Hong; Yan, Bingnan; More...
Journal of mechanical science and technology, 2022, Volume 36, Issue 10
The tool is an important part of machining, and its condition determines the operational safety of the equipment and the quality of the workpiece. Therefore,...
ArticleView Article PDF
Journal Article  Full Text Online

AC-WGAN-GP: Generating Labeled Samples for Improving Hyperspectral Image Classification...
by Caihao Sun; Xiaohua Zhang; Hongyun Meng; More...
Remote sensing (Basel, Switzerland), 10/2022, Volume 14, Issue 4910
The lack of labeled samples severely restricts the classification performance of deep learning on hyperspectral image classification. To solve this problem,...
ArticleView Article PDF
Journal Article  Full Text Online 

All 3 versions 

<–—2022———2022———1250—


Tool wear state recognition under imbalanced data based on WGAN-GP and lightweight neural network ShuffleNet

Hou, W; Guo, H; (...); Mao, Y

Oct 2022 | Oct 2022 (Early Access) | 

JOURNAL OF MECHANICAL SCIENCE AND TECHNOLOGY

36 (10), pp. 4993-5009. Enriched Cited References

The tool is an important part of machining, and its condition determines the operational safety of the equipment and the quality of the workpiece. Therefore, tool condition monitoring (TCM) is of great significance. To address the imbalance of the tool monitoring signal and achieve a lightweight model, a TCM method based on WGAN-GP and ShuffleNet is proposed in this paper. The tool monitoring d

Show more

Full Text at Publisher

46 References  Related records 


Unsupervised Learning Model of Sparse Filtering Enhanced Using Wasserstein Distance for Intelligent Fault Diagnosis

Vashishtha, G and Kumar, R

Oct 2022 (Early Access) | 

JOURNAL OF VIBRATION ENGINEERING & TECHNOLOGIES. Enriched Cited References

Background Deep learning-based fault diagnosis techniques are promising approaches that can eliminate the need for advanced skills in signal processing and diagnostic expertise.

Purpose The sparse filtering method is an unsupervised learning method whose parameters play a vital role in obtaining more accurate and reliable results. Thus, their appropriate selection is necessary to obtain m

Show more

Full Text at Publisher


 2022 see 2021

Uncertainty quantification in a mechanical submodel driven by a Wasserstein-GAN

H BOUKRAICHI, N AKKARI, F CASENAVE… - IFAC-PapersOnLine, 2022 - Elsevier

The analysis of parametric and non-parametric uncertainties of very large dynamical systems

requires the construction of a stochastic model of said system. Linear approaches relying …



2022 SIAM  wasserstein



SIAM Journal on Mathematical Analysis 54, 986-1021, 2022. arxiv. ... Transport and interface: an uncertainty principle for the Wasserstein distance, SIAM ...

Wuchen Li - University of South Carolina

https://people.math.sc.edu › wuchen

Our paper "Projected Wasserstein gradient descent for high-dimensional Bayesian inference" is accepted in SIAM/ASA Journal on Uncertainty Quantification, 2022.

MR4512980 

2022

Jianbo Cui's Homepage - PolyU

https://www.polyu.edu.hk › ama › profile › cuijb

04/19/2022, the work on the continuation multiple shooting method for Wasserstein geodesic equation has been accepted by SIAM J. Sci. Comput.


Style Transfer Using Optimal Transport Via Wasserstein ...

https://ieeexplore.ieee.org › document

by O Ryu · 2022 — We propose a module that combines these two methods to apply subtle style transfer even to high-resolution images. Published in: 2022 IEEE International ...

IEEE ICIP 2022 || Bordeaux, France || 16-19 October 2022

https://cmsworkshops.com › ICIP2022 › view_paper › alt

Oct 11, 2022 — STYLE TRANSFER USING OPTIMAL TRANSPORT VIA WASSERSTEIN DISTANCE. OSeok Ryu, Bowon Lee, Inha University, Korea, Republic of ...


Limit distribution theory for smooth p-Wasserstein distances
https://arxiv.org › math
by Z Goldfeld · 2022 · Cited by 3 — Abstract: The Wasserstein distance is a metric on a space of probability measures that has seen a surge of applications in statistics, ...

2022 see 2021

(PDF) Limit Distribution Theory for the Smooth 1-Wasserstein ...

https://www.researchgate.net › publication › 353544627_...

Jul 5, 2022 — As applications of the limit distribution theory, we study two-sample testing and minimum distance estimation (MDE) under $W_1^\sigma$.


PSWGAN-GP: Improved Wasserstein Generative Adversarial Network with Gradient Penalty

C Yun-xiang, W Wei, N Juan, C Yi-dan… - Computer and …, 2022 - cam.org.cn

… In order to solve the detail quality problems of images generated … Wasserstein generative 

adversarial network with gradient penalty (PSWGAN-GP) method. Based on the Wasserstein

Related articles

2022 see 2021

Wasserstein Adversarial Regularization for Learning With ...

https://pubmed.ncbi.nlm.nih.gov › ...

by K Fatras · Cited by 6 — IEEE Trans Pattern Anal Mach Intell. 2022 Oct;44(10):7296-7306. doi: 10.1109/TPAMI.2021.3094662. Epub 2022 Sep 14 ...


Gromov-Wasserstein Multi-modal Alignment and Clustering

https://dl.acm.org › doi › abs

by F Gong · 2022 — We propose a novel Gromov-Wasserstein multi-modal alignment and clustering method based on kernel-fusion strategy and Gromov-Wasserstein ...

Cited by 2 Related articles All 3 versions


2022  RESEARCH-ARTICLE
FREE
October 2022

Gromov-Wasserstein Multi-modal Alignment and Clustering

Fengjiao Gong,Yuzhou Nie,Hongteng Xu

CIKM '22: Proceedings of the 31st ACM International Conference on Information & Knowledge Management, October 2022, pp 603–613. https://doi.org/10.1145/3511808.3557339
Multi-modal clustering aims at finding a clustering structure shared by the data of different modalities in an unsupervised way. Currently, solving this problem often relies on two assumptions: i) the multi-modal data own the same latent distribution, ...


2022  RESEARCH-ARTICLE
OPEN ACCESS
October 2022
Gromov-Wasserstein Guided Representation Learning for Cross-Domain Recommendation

Xinhang Li, Zhaopeng Qiu, Xiangyu Zhao, Zihao Wang, Yong Zhang, + 2

CIKM '22: Proceedings of the 31st ACM International Conference on Information & Knowledge Management, October 2022, pp 1199–1208. https://doi.org/10.1145/3511808.3557338
Cross-Domain Recommendation (CDR) has attracted increasing attention in recent years as a solution to the data sparsity issue. The fundamental paradigm of prior efforts is to train a mapping function based on the overlapping users/items and then apply ...

2022  RESEARCH-ARTICLE
March 2022
Energy-constrained Crystals Wasserstein GAN for the inverse design of crystal structures
Peiyao Hu, Binjing Ge, Yirong Liu, Wei Huang

ICCAI '22: Proceedings of the 8th International Conference on Computing and Artificial Intelligence, March 2022, pp 24–31. https://doi.org/10.1145/3532213.3532218


2022


Randomized Wasserstein Barycenter Computation: Resampling with Statistical Guarantees

Florian Heinemann, Axel Munk, and Yoav Zemel

SIAM Journal on Mathematics of Data Science, Vol. 4, No. 1, pp. 229–259, 2022

2022  RESEARCH-ARTICLE

FREE

October 2022

Weakly-Supervised Temporal Action Alignment Driven by Unbalanced Spectral Fused Gromov-Wasserstein Distance

All 5 versions

Dixin Luo, Yutong Wang, Angxiao Yue, Hongteng Xu

MM '22: Proceedings of the 30th ACM International Conference on Multimedia, October 2022, pp 728–739. https://doi.org/10.1145/3503161.3548067
Temporal action alignment aims at segmenting videos into clips and tagging each clip with a textual description, which is an important task of video semantic analysis. Most existing methods, however, rely on supervised learning to train their alignment ...

2022  Research articleFull text access

Lung image segmentation based on DRD U-Net and combined WGAN with Deep Neural Network

Computer Methods and Programs in Biomedicine, 30 August 2022...

Luoyu Lian, Xin Luo, Zhendong Xu

Download PDF

All 5 versions


2022 Research article

Wasserstein generative adversarial uncertainty quantification in physics-informed neural networks

Journal of Computational Physics, 6 May 2022...

Yihang Gao, Michael K. Ng


2022 Research article

Single image super-resolution using Wasserstein generative adversarial network with gradient penalty

Pattern Recognition Letters, 17 September 2022...

Yinggan Tang, Chenglu Liu, Xuguang Zhang

All 2 versions

<–—2022———2022———1270—


2022  Research article

Enhanced detection of imbalanced malicious network traffic with regularized Generative Adversarial Networks

Journal of Network and Computer Applications, 23 March 2022...

Radhika Chapaneri, Seema Shah


2022 see 2021  Research articleFull text access

Uncertainty quantification in a mechanical submodel driven by a Wasserstein-GAN

IFAC-PapersOnLine, 23 September 2022...

Hamza BOUKRAICHI, Nissrine AKKARI, David RYCKELYNCK

Download PDF .

.

2022 Oct  Research articleOpen access
Wasserstein-based texture analysis in radiomic studies
Computerized Medical Imaging and Graphics, Available online 19 October 2022...

Zehor Belkhatir, Raúl San José Estépar, Allen Tannenbaum

Download PDF

Wasserstein-based texture analysis in...
by Belkhatir, Zehor; Estépar, Raúl San José; Tannenbaum, Allen R.
Computerized medical imaging and graphics, 12/2022, Volume 102
The emerging field of radiomics that transforms standard-of-care images to quantifiable scalar statistics endeavors to reveal the information hidden in these...
Article PDF PDF
Journal Article 


 2022 see 2021   Research article
Least Wasserstein distance between disjoint shapes with perimeter regularization
Journal of Functional Analysis, 4 October 2022...

Michael Novack, Ihsan Topaloglu, Raghavendra Venkatraman


2022  Research article
On isometries of compact Lp–Wasserstein spaces
Advances in Mathematics, 11 August 2022...

Jaime Santos-Rodríguez

Zbl 07597093


2022


2022  Research article
Optimal visual tracking using Wasserstein transport proposals
Expert Systems with Applications, 30 July 2022...

Jin Hong, Junseok Kwon

Optimal visual tracking using Wasserstein transport proposals

J Hong, J Kwon - Expert Systems with Applications, 2022 - Elsevier

… We propose a novel visual tracking method based on the Wasserstein transport proposal (… 

For this objective, we adopt the optimal transport theory in the Wasserstein space and present …

Cited by 1 All 2 versions

2022  Short communication
Convergence rates for empirical measures of Markov chains in dual and Wasserstein distances
Statistics & Probability Letters, 7 July 2022...

Adrian Riekert


2022 Research article
Wasserstein metric-based two-stage distributionally robust optimization model for optimal daily peak shaving dispatch of cascade hydroplants under renewable energy uncertainties
Energy, 13 August 2022...

Xiaoyu Jin, Benxi Liu, Jia Lu


2022  Research article
Unsupervised domain adaptation of bearing fault diagnosis based on Join Sliced Wasserstein Distance
ISA Transactions, 5 January 2022...

Pengfei Chen, Rongzhen Zhao, Qidong Yang

Cited by 4 Related articles All 3 versions

 

2022  Research articleFull text access
Optimization in a traffic flow model as an inverse problem in the Wasserstein space
IFAC-PapersOnLine, 19 September 2022...

Roman Chertovskih, Fernando Lobo Pereira, Maxim Staritsyn

Download PDF

<–—2022———2022———1280—


2022  Research article
A CWGAN-GP-based multi-task learning model for consumer credit scoring
Expert Systems with Applications, 6 June 2022...

Yanzhe Kang, Liao Chen, Haizhang Qian


2022 patent  

  Rotating machine state monitoring method based on Wasserstein depth digital …

CN CN114662712A 胡文扬 清华大学

Priority 2022-02-22 • Filed 2022-02-22 • Published 2022-06-24

The invention relates to the technical field of artificial intelligence, and discloses a method for monitoring the state of a rotating machine based on a Wasserstein depth digital twin model, which comprises the steps of acquiring operation and maintenance data of the rotating machine in a healthy …

arXiv:2210.12135  [pdf, other]  cs.LG  cs.CV  eess.SP  math.OC  math.PR  math.ST
Geometric Sparse Coding in Wasserstein Space
Authors: Marshall Mueller, Shuchin Aeron, James M. Murphy, Abiy Tasissa
Abstract: Wasserstein dictionary learning is an unsupervised approach to learning a collection of probability distributions that generate observed distributions as Wasserstein barycentric combinations. Existing methods for Wasserstein dictionary learning optimize an objective that seeks a dictionary with sufficient representation capacity via barycentric interpolation to approximate the observed training da…  More
Submitted 21 October, 2022; originally announced October 2022.
Comments: 24 pages
Geometric Sparse Coding in Wasserstein Space

by Mueller, Marshall; Aeron, Shuchin; Murphy, James M ; More...
10/2022
Wasserstein dictionary learning is an unsupervised approach to learning a collection of probability distributions that generate observed distributions as...
Journal Article  Full Text Online


arXiv:2210.11945  [pdf, other]  math.OC
On The Existence Of Monge Maps For The Gromov-Wasserstein Distance
Authors: Theo Dumont, Théo Lacombe, François-Xavier Vialard
Abstract: For the L2-Gromov-Wasserstein distance, we study the structure of minimizers in Euclidean spaces for two different costs. The first cost is the scalar product for which we prove that it is always possible to find optimizers as Monge maps and we detail the structure of such optimal maps. The second cost is the squared Euclidean distance for which we show that the worst case scenario is the existenc…  More
Submitted 19 October, 2022; originally announced October 2022.
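
Note added for orientation (standard definition; the notation is mine, not the paper's): for metric-measure spaces (X, c_X, mu) and (Y, c_Y, nu), the L2-Gromov-Wasserstein distance referred to above is

  GW(mu, nu) = inf over pi in Pi(mu, nu) of  ( ∫∫ | c_X(x, x') − c_Y(y, y') |^2 dpi(x, y) dpi(x', y') )^{1/2},

and the two costs studied in the paper correspond to taking c as the scalar product or as the squared Euclidean distance on each space.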

2022 video

Wasserstein GANs with Gradient Penalty Compute Congested ...

slideslive.com › wasserstein-gans-with-gradient-penalty-c...

Wasserstein GANs with Gradient Penalty (WGAN-GP) are a very popular method for training generative models to produce high quality synthetic ...

SlidesLive ·

Jul 2, 2022


2022 video

 The Quantum Wasserstein Distance of Order 1 - YouTube

54 views Speaker: Giacomo De Palma, University of Bologna Title: The Quantum Wasserstein Distance of Order 1 … ...more ...more.

YouTube · Mathematical Picture Language · 1 

Oct 4, 2022   see Feb 15 and Feb 20 videos


[PDF] wiley.com

Semi‐Supervised Surface Wave Tomography With Wasserstein Cycle‐Consistent GAN: Method and Application to Southern California Plate Boundary Region

A Cai, H Qiu, F Niu - Journal of Geophysical Research: Solid …, 2022 - Wiley Online Library

… is termed Wasserstein cycle‐consistent generative adversarial networks (Wasserstein Cycle‐… 

The cycle‐consistency and Wasserstein metric significantly improve the training stability of …

 Cited by 8 Related articles All 12 versions

 

[PDF] hacettepe.edu.tr

MIWGAN-GP: Missing Data Imputation using Wasserstein Generative Adversarial Nets with Gradient Penalty

E UÇGUN ERGÜN - 2022 - openaccess.hacettepe.edu.tr

… the heart’s electrical activity [24]. The minute electrical changes on the skin caused by the 

heart … , commonly known as optical heart rate detection to measure heart-rate. Blood absorbs …

 

 

[PDF] researchsquare.com

[PDF] Reliability Metrics of Explainable CNN based on Wasserstein Distance for Cardiac Evaluation

Y Omae, Y Kakimoto, Y Saito, D Fukamachi… - 2022 - researchsquare.com

… In particular, we build the probability distributions for the cardiac region and the regression 

activation map (RAM) and measure the similarity between these distributions by Wasserstein

All 3 versions


Lagrangian schemes for Wasserstein gradient flows

https://www.researchgate.net › ... › Gradient

Jul 5, 2022 — This paper reviews different numerical methods for specific examples of Wasserstein gradient flows: we focus on nonlinear Fokker-Planck ...

<–—2022———2022———1290—


[PDF] arxiv.org

Martingale Wasserstein inequality for probability measures in the convex order

B Jourdain, W Margheriti - Bernoulli, 2022 - projecteuclid.org

… −y| is smaller than twice their W1-distance (Wasserstein distance with index 1). We showed

that … Then we study the generalisation of this new martingale Wasserstein inequality to higher …

 Cited by 1 Related articles All 11 versions


2022 see 2021

Wasserstein Unsupervised Reinforcement Learning

https://www.aaai.org › AAAI-4760.HeS.pdf  (PDF)

by S He · 2022 — fore we propose a new framework Wasserstein unsupervised reinforcement learning (WURL) where we ... The AAAI Digital Library will contain the published.

9 pages

Cited by 5 Related articles All 5 versions

2022

Partial Wasserstein Covering | Proceedings of the AAAI ...

https://ojs.aaai.org › index.php › AAAI › article › view

by K Kawano · 2022 · Cited by 2 — We consider a general task called partial Wasserstein covering with the goal of providing information on what patterns are not being taken ...

Cited by 3 Related articles All 7 versions

2022 see 2021

Wasserstein Adversarial Transformer for Cloud Workload ...

https://ojs.aaai.org › index.php › AAAI › article › view

by S Arbat · 2022 · Cited by 2 — To develop a cloud workload prediction model with high accuracy and low inference overhead, this work presents a novel time-series forecasting ...

Cited by 2 Related articles All 5 versions

2022

Semi-supervised Conditional Density Estimation with ...

https://ojs.aaai.org › index.php › AAAI › article › view

by O Graffeuille · 2022 — Semi-supervised Conditional Density Estimation with Wasserstein Laplacian Regularisation. Proceedings of the AAAI Conference on Artificial ...

Semi-supervised Conditional Density Estimation with Wasserstein Laplacian Regularisation

Related articles All 4 versions 

2022 

Wasserstein Distance Guided Representation Learning ... - dblp

https://dblp.org › rec › conf › aaai › ShenQZY18

Mar 8, 2022 — Jian Shen, Yanru Qu, Weinan Zhang, Yong Yu: Wasserstein Distance Guided Representation Learning for Domain Adaptation. AAAI 2018: 4058-4065 ...


2022

WGAN-Based Image Denoising Algorithm - IGI Global

https://www.igi-global.com › viewtitle

by XF Zou · 2000 — Therefore, the discriminator in WGAN needs to remove the sigmoid activation function. ... Proceedings of the AAAI conference on artificial intelligence.

training energy-based models with bidirectional bounds

https://proceedings.neurips.cc › paper › file  (PDF)

by C Geng · 2021 · Cited by 2 — The original WGAN paper (Arjovsky et al., 2017) noted optimization instabilities ... In Proceedings of the AAAI Conference on Artificial Intelligence,.

Related articles All 2 versions

 2022

[D] Is WGAN-GP gradient penalty applicable to the generator?

https://www.reddit.com › MachineLearning › comments

May 8, 2022 — The study describes an architecture based on WGAN-GP, a modification of WGAN which aims to enforce 1-Lipschitz ... [D] AAAI 2023 Reviews.
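
Note for reference (this is the standard WGAN-GP objective of Gulrajani et al. (2017), summarized because several entries above and below cite it; it is not taken from the Reddit thread): the critic D is trained to minimize

  L = E_{x~ ∼ P_g}[ D(x~) ] − E_{x ∼ P_r}[ D(x) ] + lambda · E_{x^}[ ( || grad_{x^} D(x^) ||_2 − 1 )^2 ],

where x^ is sampled uniformly along straight lines between real and generated samples; the gradient penalty softly enforces the 1-Lipschitz constraint required by the Kantorovich-Rubinstein dual form of W_1, replacing the weight clipping of the original WGAN.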


Learning to Generate Wasserstein Barycenters:...
by Lacombe, Julien; Digne, Julie; Bonneel, Nicolas ; More...
10/2022
Datasets used in the paper "Learning to Generate Wasserstein barycenters" published in the JMIV (https://link.springer.com/article/10.1007/s10851-022-01121-y)...
Data SetCitation Online
 Learning to Generate Wasserstein Barycenters: datasets
by Lacombe, Julien; Digne, Julie; Bonneel, Nicolas; More...
10/2022
Datasets used in the paper "Learning to Generate Wasserstein barycenters" published in the JMIV (https://link.springer.com/article/10.1007/s10851-022-01121-y)...
Data SetCitation Online

 
A Reusable Methodology for Player Clustering Using Wasserstein Autoencoders
by Tan, Jonathan; Katchabaw, Mike
Entertainment Computing – ICEC 2022, 10/2022
Identifying groups of player behavior is a crucial step in understanding the player base of a game. In this work, we use a recurrent autoencoder to create...
Book Chapter  Full Text Online

<–—2022———2022———1300—


arXiv:2210.15179  [pdf, other]  math.OC  stat.ML
Mean-field neural networks: learning mappings on Wasserstein space
Authors: Huyên Pham, Xavier Warin
Abstract: We study the machine learning task for models with operators mapping between the Wasserstein space of probability measures and a space of functions, like e.g. in mean-field games/control problems. Two classes of neural networks, based on bin density and on cylindrical approximation, are proposed to learn these so-called mean-field functions, and are theoretically supported by universal approximati…  More
Submitted 27 October, 2022; originally announced October 2022.
Comments: 25 pages, 14 figures
MSC Class: 60G99

2022 see 2023

arXiv:2210.14671  [pdf, other]  math.OC  math.ST
Bures-Wasserstein Barycenters and Low-Rank Matrix Recovery
Authors: Tyler Maunu, Thibaut Le Gouic, Philippe Rigollet
Abstract: We revisit the problem of recovering a low-rank positive semidefinite matrix from rank-one projections using tools from optimal transport. More specifically, we show that a variational formulation of this problem is equivalent to computing a Wasserstein barycenter. In turn, this new perspective enables the development of new geometric first-order methods with strong convergence guarantees in Bures…  More
Submitted 26 October, 2022; originally announced October 2022.
Comments: 31 pages, 8 figures

arXiv:2210.14340  [pdf, other]  q-fin.RM  math.PR  q-fin.MF
A parametric approach to the estimation of convex risk functionals based on Wasserstein distance
Authors: Max Nendel, Alessandro Sgarabottolo
Abstract: In this paper, we explore a static setting for the assessment of risk in the context of mathematical finance and actuarial science that takes into account model uncertainty in the distribution of a possibly infinite-dimensional risk factor. We allow for perturbations around a baseline model, measured via Wasserstein distance, and we investigate to which extent this form of probabilistic imprecisio…  More
Submitted 25 October, 2022; originally announced October 2022.
MSC Class: Primary 62G05; 90C31; Secondary 41A60; 68T07; 91G70
Cited by 1 All 4 versions


arXiv:2210.14298  [pdf, other]  stat.ML  cs.LG  math.OC  math.PR
Wasserstein Archetypal Analysis
Authors: Katy Craig, Braxton Osting, Dong Wang, Yiming Xu
Abstract: Archetypal analysis is an unsupervised machine learning method that summarizes data using a convex polytope. In its original formulation, for fixed k, the method finds a convex polytope with k vertices, called archetype points, such that the polytope is contained in the convex hull of the data and the mean squared Euclidean distance between the data and the polytope is minimal. In the present wo…  More
Submitted 25 October, 2022; originally announced October 2022.
MSC Class: 62H12; 62H30; 65K10; 49Q22


MR4499525 Prelim Yatracos, Yannis G.; 

Limitations of the Wasserstein MDE for univariate data. Stat. Comput. 32 (2022), no. 6, 95.

Review PDF Clipboard Journal Article


2022


MR4499079 Prelim Yue, Man-Chung; Kuhn, Daniel; 

Wiesemann, Wolfram; On linear optimization over Wasserstein balls. Math. Program. 195 (2022), no. 1-2, Ser. A, 1107–1122.

Review PDF Clipboard Journal Article


Yue, Man-Chung; Kuhn, Daniel; Wiesemann, Wolfram

On linear optimization over Wasserstein balls. (English) Zbl 07606038

Math. Program. 195, No. 1-2 (A), 1107-1122 (2022).

MSC:  90C25 90C05 90C17

PDF BibTeX XML Cite 

Zbl 07606038

2022 2/3

Julien Tierny (2/3/22): Wasserstein Distances, Geodesics and ...

In this talk, I will present a unified computational framework for the estimation of distances, geodesics and barycenters of merge trees.

YouTube · Applied Algebraic Topology Network · 

Feb 3, 2022


2022 see 2021

Computation of Discrete Flows Over Networks via Constrained ...

www.youtube.com › watch

Computation of Discrete Flows Over Networks via Constrained Wasserstein Barycenters. 8 views8 views. Feb 10, 2022.

YouTube · LatinX in AI · 

Feb 10, 2022


2022 2/26

Wasserstein Distance: Metric Proof - YouTube

We prove that W_p is a metric. Can be found in Villani's ...

YouTube · Tyler Masthay ·

Feb 26, 2022
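
Note added as a reminder alongside the video above (standard definitions, not a transcript): for probability measures mu, nu on a metric space (M, d) with finite p-th moments,

  W_p(mu, nu) = ( inf over pi in Pi(mu, nu) of  ∫ d(x, y)^p dpi(x, y) )^{1/p},

where Pi(mu, nu) is the set of couplings; symmetry and W_p(mu, nu) = 0 ⇔ mu = nu are direct, and the triangle inequality is the nontrivial part, usually proved via the gluing lemma (the approach in Villani's book).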

<–—2022———2022———1310—



2022 3/17

Wasserstein gradient flows for machine learning | mathtube.org

Date: Thu, Mar 17, 2022 ... In particular, one can leverage the geometry of Optimal transport and consider Wasserstein gradient flows for the loss ...

Mathtube · 

Mar 17, 2022


2022 3/18 see 2021

Statistical Analysis of Wasserstein Distributionally Robust ...

We consider statistical methods which invoke a min-max distributionally robust formulation to extract good out-of-sample performance in ...

YouTube · INFORMS · 

Mar 18, 2022


2022 3/27

#ComputerVision #DeepLearning #Pytorch

62 - Wasserstein GAN (WGAN) Architecture Understanding | Deep Learning | Neural Network | 299 views

Mar 27, 2022


2022 3/27c

[ZY4] Training Wasserstein GANs without gradient penalties

[e-TEC Talks] @ SNU Winter 2022. [Presenter] Dr. Yeoneung Kim, Seoul National Univ. [Topic] "Training Wasserstein GANs without gradient ...

YouTube · SNU ECE BK21 · 

Mar 27, 2022


2022 3/28

Fixed Support Tree-Sliced Wasserstein Barycenter - SlidesLive

slideslive.com › fixed-support-treesliced-wasserstein-bary...

slideslive.com › fixed-support-treesliced-wasserstein-bary...

The Wasserstein barycenter has been widely studied in various fields, ... Mar 28, 2022 ... By contrast, the Wasserstein distance on a tree, ...

SlidesLive · 

Mar 28, 2022


2022


2022 3/31

Wasserstein distributionally robust decision problems

Jan Obloj, University of Oxford
Thursday, IMSI

March 31, 2022

 

 2022 4/13

Gerrardo Vargas: Quantitative control of Wasserstein distance between Brownian motion and the Gold..

7 views

Apr 13, 2022

2022 5/20
Sampling with kernelized Wasserstein gradient flows • IMSI

www.imsi.institute › Videos

www.imsi.institute › Videos

... the dissimilarity to the target distribution), and its Wasserstein gradient flow is approximated by an interacting particle system.

Institute for Mathematical and Statistical Innovation · 

May 20, 2022


2022 6/1
Po-Ling Loh - Robust W-GAN-Based Estimation ... - YouTube

Po-Ling Loh - Robust W-GAN-Based Estimation Under Wasserstein Contamination. 74 views Jun 1, 2022. Erwin Schrödinger International Institute ...

YouTube · Erwin Schrödinger International Institute for Mathematics and Physics (ESI) · Jun

June 1, 2022


2022 6/8
Wasserstein GAN (Q&A) | Lecture 64 (Part 5) - YouTube

29 views

Maziar Raissi

Jun 8, 2022

<–—2022———2022———1320—


2022 6/13

Help with understanding Wasserstein distance : r/math - Reddit

www.reddit.com › math › comments › help_with_underst...

For probability and statistics, the Wasserstein metric defines a distance between probability distributions. You are really looking at distances ...

Reddit · Quanta Magazine · 

Jun 13, 2022

CoRL2022-WASABI - Google Sites

sites.google.com › view › corl2022-wasabi

Learning agile skills is one of the main challenges in robotics. ... imitation learning method named Wasserstein Adversarial Behavior Imitation (WASABI).

Google Sites · Chenhao Li · 

Jul 1, 2022


Raghav Somani (@SomaniRaghav) / Twitter

twitter.com › somaniraghav

There are analogs of the Wasserstein metrics over probability measures. ... We showed previously that graphons carry a geodesic metric structure with them.

Twitter · 

Jul 20, 2022

2022 8/26

Breast Cancer Histopathology Image Super-Resolution Using ... Wasserstein gradient penalty…

ieeeaccess.ieee.org › featured-articles › breastcancer_supe...

Moreover, we have applied improved Wasserstein with a Gradient penalty to ... Moreover, several evaluation metrics, such as PSNR, MSE, SSIM, MS-SSIM, ...

IEEE Access · IEEE Access · 

Aug 26, 2022


2022 see 2021

The Back-And-Forth Method For Wasserstein Gradient Flows

www.youtube.com › watch

https://cse.umn.edu/ima/events/back-and-forth-method-wasserstein-gradient-flows.

YouTube · IMA UMN · 1 week ago

Oct 21, 2022


2022


On the Efficiency of Entropic Regularized Algorithms for Optimal Transport

T Lin, N Ho, MI Jordan - Journal of Machine Learning Research, 2022 - jmlr.org

We present several new complexity results for the entropic regularized algorithms that approximately solve the optimal transport (OT) problem between two discrete probability measures with at most n atoms. First, we improve the complexity bound of a greedy variant of Sinkhorn, known as Greenkhorn, from O (n2ε− 3) to O (n2ε− 2). Notably, our result can match the best known complexity bound of Sinkhorn and help clarify why Greenkhorn significantly outperforms Sinkhorn in practice in terms of row/column updates as observed …

Cited by 3 Related articles All 4 versions 
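
As a reference point for the entropic/Sinkhorn entries here, a minimal Sinkhorn sketch in Python (my own illustration of the classical scaling iterations on discrete histograms; it is not the Greenkhorn variant analyzed in the paper, and the parameter names are mine):

import numpy as np

def sinkhorn(a, b, C, eps=0.05, n_iter=1000):
    # Entropic-regularized OT between histograms a, b (nonnegative, summing to 1)
    # with cost matrix C: alternately rescale the Gibbs kernel K = exp(-C / eps)
    # so that the coupling P = diag(u) K diag(v) has marginals a and b.
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]
    return np.sum(P * C), P        # regularized transport cost and coupling

# toy usage: two bumps on a 1-D grid
x = np.linspace(0.0, 1.0, 50)
a = np.exp(-(x - 0.2) ** 2 / 0.01); a /= a.sum()
b = np.exp(-(x - 0.7) ** 2 / 0.01); b /= b.sum()
C = (x[:, None] - x[None, :]) ** 2
cost, P = sinkhorn(a, b, C)
print(cost)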

Nhat Ho (@nhatptnk8912) / Twitter

mobile.twitter.com › nhatptnk8912 

"On the Efficiency of Entropic Regularized Algorithms for Optimal Transport", ... My paper, “Entropic Gromov-Wasserstein between Gaussian Distributions” ...

Twitter · 


Dual Wasserstein generative adversarial network condition: A generative adversarial network-based acoustic impedance inversion method

Z Wang, S Wang, C Zhou, W Cheng - Geophysics, 2022 - library.seg.org

… GAN, Wasserstein distance is used instead of cross entropy as the loss function, so that … 

loss function based on Wasserstein distance with the conditional information as the constraint. …

All 3 versions

May 7, 2022


PDF] arxiv.org

Statistical, robustness, and computational guarantees for sliced wasserstein distances

S Nietert, R Sadhu, Z Goldfeld, K Kato - arXiv preprint arXiv:2210.09160, 2022 - arxiv.org

… Sliced Wasserstein distances preserve properties of classic Wasserstein distances while … 

an equivalence between robust sliced 1-Wasserstein estimation and robust mean estimation. …
Cited by 5
Related articles All 3 versions

LDoS attack traffic detection based on feature optimization extraction and DPSA-WGAN

Ma, WG; Liu, RQ and Guo, J

Oct 2022 (Early Access) | 

APPLIED INTELLIGENCE

Enriched Cited References

Low-rate Denial of Service (LDoS) attacks cause severe destructiveness to network security. Moreover, they are more difficult to detect because they are more hidden and lack distinguishing features. Consequently, packets belonging to legitimate users can be misplaced. The performance of a transport system can be degraded by frequently sending short bursts of packets. An attack program generates

Show more

Full Text at Publisher

46 References  Related records 4


  Two-Sample Test with Kernel Projected Wasserstein Distance

Wang, J; Gao, R and Xie, Y

International Conference on Artificial Intelligence and Statistics

2022 |

INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 151

 151

We develop a kernel projected Wasserstein distance for the two-sample test, an essential building block in statistics and machine learning: given two sets of samples, to determine whether they are from the same distribution. This method operates by finding the nonlinear mapping in the data space which maximizes the distance between projected distributions. In contrast to existing works about pr

Show more


63 References  Related records 

<–—2022———2022———1330—



AC-WGAN-GP: Generating Labeled Samples for Improving Hyperspectral Image Classification with Small-Samples

Sun, CH; Zhang, XH; (...); Zhang, JH

Oct 2022 | 

REMOTE SENSING

14 (19). Enriched Cited References

The lack of labeled samples severely restricts the classification performance of deep learning on hyperspectral image classification. To solve this problem, Generative Adversarial Networks (GAN) are usually used for data augmentation. However, GAN have several problems with this task, such as the poor quality of the generated samples and an unstable training process. Thereby, knowing how to con

Show more

Free Full Text from Publisher

49 References  Related records

 

A Self-Attention Based Wasserstein Generative Adversarial Networks for Single Image Inpainting

Mao, YX; Zhang, TZ; (...); Thanh, DNH

Sep 2022 | 

PATTERN RECOGNITION AND IMAGE ANALYSIS

 32 (3) , pp.591-599

With the popularization of portable devices such as mobile phones and cameras, digital images have been widely disseminated in human life. However, due to factors such as photoaging, shooting environment, etc., images will encounter some defects. To restore these defective images quickly and realistically, image inpainting technology emerges as the times require, and digital image processing te

Show more

View full text

33 References  Related records

 

Optimized complex object classification model: reconstructing the ISAR image of a hypersonic vehicle covered with a plasma sheath using a U-WGAN-GP framework

Li, JT; Bian, Z and Guo, LX

Jul 18 2022 | 

INTERNATIONAL JOURNAL OF REMOTE SENSING

43 (14), pp. 5306-5323. Enriched Cited References

This study aims to apply generative adversarial networks (GANs) to the effective classification of high/low resolution (HR/LR) image pairs obtained via inverse synthetic aperture radar (ISAR) for hypersonic objects covered with a plasma sheath. We propose a classification training model based on a Wasserstein GAN with a gradient penalty (U-WGAN-GP) framework, wherein a U-Net with an excellent j

Scholarly Journal

Computed tomography image generation from magnetic resonance imaging using Wasserstein metric for MR‐only radiation therapy

Jiffy, Joseph; Challa Hemanth; Narayanan, Pournami Pulinthanathu; Balakrishnan, Jayaraj Pottekkattuvalappil; Puzhakkal, Niyas.

International Journal of Imaging Systems and Technology; New York Vol. 32, Iss. 6,  (Nov 2022): 2080-2093.


Citation/Abstract

Abstract/Details 


2022 see 2021

Working Paper

Dynamical Wasserstein Barycenters for Time-series Modeling

Cheng, Kevin C; Shuchin Aeron; Hughes, Michael C; Miller, Eric L.

arXiv.org; Ithaca, Nov 1, 2022.

Full Text

Abstract/Details  Get full text

2022


2022 see 2021  Working Paper

The Wasserstein distance to the Circular Law

Jalowy, Jonas.

arXiv.org; Ithaca, Oct 28, 2022.

Full Text

Abstract/Details  Get full text

Working Paper

The Wasserstein distance of order 1 for quantum spin systems on infinite lattices

De Palma, Giacomo; Trevisan, Dario.

arXiv.org; Ithaca, Oct 20, 2022.

Full Text

Abstract/Details  Get full text


2022 see 2021  Working Paper

Sliced Gromov-Wasserstein

Vayer, Titouan; Flamary, Rémi; Tavenard, Romain; Chapel, Laetitia; Courty, Nicolas.

arXiv.org; Ithaca, Oct 20, 2022.

Full Text

Abstract/Details  Get full text


2022 see 2021  Working Paper

Some inequalities on Riemannian manifolds linking Entropy, Fisher information, Stein discrepancy and Wasserstein distance

Li-Juan, Cheng; Feng-Yu, Wang; Thalmaier, Anton.

arXiv.org; Ithaca, Oct 18, 2022.

Full Text

Abstract/Details  Get full text


Working Paper

Wasserstein $K$-means for clustering probability distributions

Zhuang, Yubo; Chen, Xiaohui; Yang, Yun.

arXiv.org; Ithaca, Oct 12, 2022

Full Text

Abstract/Details  Get full text

Wasserstein $K$-means for clustering probability distributions

Oct 31, 2022

<–—2022———2022———1340—


2022 see 2021  Working Paper

Least Wasserstein distance between disjoint shapes with perimeter regularization

Novack, Michael; Topaloglu, Ihsan; Venkatraman, Raghavendra.

arXiv.org; Ithaca, Oct 5, 2022.

Full Text

Abstract/Details  Get full text


arXiv:2211.01804  [pdf, other]  math.OC  math.NA  math.PR
Wasserstein Steepest Descent Flows of Discrepancies with Riesz Kernels
Authors: Johannes Hertrich, Manuel Gräf, Robert Beinert, Gabriele Steidl
Abstract: The aim of this paper is twofold. Based on the geometric Wasserstein tangent space, we first introduce Wasserstein steepest descent flows. These are locally absolutely continuous curves in the Wasserstein space whose tangent vectors point into a steepest descent direction of a given functional. This allows the use of Euler forward schemes instead of the minimizing movement scheme (MMS) introduced… 
More
Submitted 2 November, 2022; originally announced November 2022.
Wasserstein Steepest Descent Flows of Discrepancies with...
by Hertrich, Johannes; Gräf, Manuel; Beinert, Robert ; More...
11/2022
The aim of this paper is twofold. Based on the geometric Wasserstein tangent space, we first introduce Wasserstein steepest descent flows. These are locally...
Journal Article  Full Text Online
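
Background note on the minimizing movement scheme mentioned in the abstract above (the formula is the standard JKO scheme, not a quotation from the paper): given a functional F on Wasserstein space and a step size tau > 0, MMS generates

  mu_{k+1} ∈ argmin over mu of  F(mu) + (1 / (2 tau)) W_2^2(mu, mu_k),

i.e. an implicit (backward) Euler step with respect to the Wasserstein metric; the steepest descent flows of the paper are built instead so that explicit Euler forward steps along Wasserstein tangent directions can be used.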


arXiv:2211.01528  [pdf, other]  cs.LG  cs.AI  cs.CY  stat.ML
Fair and Optimal Classification via Transports to Wasserstein-Barycenter
Authors: Ruicheng Xian, Lang Yin, Han Zhao
Abstract: Fairness in automated decision-making systems has gained increasing attention as their applications expand to real-world high-stakes domains. To facilitate the design of fair ML systems, it is essential to understand the potential trade-offs between fairness and predictive power, and the construction of the optimal predictor under a given fairness constraint. In this paper, for general classificat… 
More
Submitted 2 November, 2022; originally announced November 2022.
Comments: Code is at https://github.com/rxian/fair-classification
Fair and Optimal Classification via Transports to Wasserstein-...
by Xian, Ruicheng; Yin, Lang; Zhao, Han
11/2022
Fairness in automated decision-making systems has gained increasing attention as their applications expand to real-world high-stakes domains. To facilitate the...
Journal Article  Full Text Online


arXiv:2211.00820  [pdfother]  math.OC   cs.CV  cs.LG  cs.NE
A new method for determining Wasserstein 1 optimal transport maps from Kantorovich potentials, with deep learning applications
Authors: Tristan Milne, Étienne Bilocq, Adrian Nachman
Abstract: Wasserstein 1 optimal transport maps provide a natural correspondence between points from two probability distributions, μ and ν, which is useful in many applications. Available algorithms for computing these maps do not appear to scale well to high dimensions. In deep learning applications, efficient algorithms have been developed for approximating solutions of the dual problem, known as Kant…  More
Submitted 1 November, 2022; originally announced November 2022.
Comments: 25 pages, 12 figures. The TTC algorithm detailed here is a simplified and improved version of that of arXiv:2111.15099
MSC Class: 49Q22 ACM Class: I.3.3; I.4.4; I.4.3
A new method for determining Wasserstein 1 optimal...
by Milne, Tristan; Bilocq, Étienne; Nachman, Adrian
11/2022
Wasserstein 1 optimal transport maps provide a natural correspondence between points from two probability distributions, $\mu$ and $\nu$, which is useful in...
Journal Article  Full Text Online
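For context (standard background, not a claim about the paper's algorithm): the Kantorovich potentials in the title are maximizers f of the Kantorovich–Rubinstein dual of the 1-Wasserstein distance,

\[ W_1(\mu, \nu) \;=\; \sup_{\mathrm{Lip}(f) \le 1} \Big( \int f \, d\mu - \int f \, d\nu \Big), \]

and the paper proposes a way to recover an optimal transport map between μ and ν from such a potential.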


arXiv:2211.00719  [pdf, ps, other]  math.PR
A finite-dimensional approximation for partial differential equations on Wasserstein space
Authors: Mehdi Talbi
Abstract: This paper presents a finite-dimensional approximation for a class of partial differential equations on the space of probability measures. These equations are satisfied in the sense of viscosity solutions. The main result states the convergence of the viscosity solutions of the finite-dimensional PDE to the viscosity solutions of the PDE on Wasserstein space, provided that uniqueness holds for the…  More
Submitted 1 November, 2022; originally announced November 2022.
MSC Class: 35D40; 35R15; 60H30; 49L25

by Talbi, Mehdi

11/2022

This paper presents a finite-dimensional approximation for a class of partial differential equations on the space of probability measures. These equations are...

Journal Article  Full Text Online
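Schematically (a standard identification, not necessarily the paper's exact construction), a function V on the Wasserstein space is matched with its finite-dimensional projections

\[ v_N(x_1, \dots, x_N) \;:=\; V\Big( \tfrac{1}{N} \sum_{i=1}^{N} \delta_{x_i} \Big), \qquad (x_1, \dots, x_N) \in (\mathbb{R}^d)^N, \]

so that a PDE on the space of probability measures corresponds to a sequence of PDEs on $(\mathbb{R}^d)^N$; the convergence statement in the abstract concerns viscosity solutions of such approximating equations.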

2022


[PDF] arxiv.org

Stability of Entropic Wasserstein Barycenters and application to random geometric graphs

M Theveneau, N Keriven - arXiv preprint arXiv:2210.10535, 2022 - arxiv.org

Wasserstein barycenters. In Sec. 3, we give a generic stability results of Wasserstein 

barycenters to deformation cost, before presenting an application on random geometric graphs in …

All 2 versions   


Tweets with replies by Brandon Amos ... - Twitter

mobile.twitter.com › brandondamos › with_replies

For the L2-Gromov-Wasserstein distance, we study the structure of minimizers in Euclidean spaces for two different costs. The first cost is the scalar ...

Twitter · 

Nov 25, 2022


Tool wear state recognition under imbalanced data based on WGAN-GP and lightweight neural network ShuffleNet

W Hou, H Guo, B Yan, Z Xu, C Yuan, Y Mao - Journal of Mechanical …, 2022 - Springer

… model, a TCM method based on WGAN-GP and ShuffleNet is proposed in this paper. The 

tool monitoring data are enhanced and balanced using WGAN-GP, and the 1D signal data are …


[PDF] archives-ouvertes.fr

[PDF] On the complexity of the data-driven Wasserstein distributionally robust binary problem

H Kim, D Watel, A Faye… - … et d'Aide à la Décision, 2022 - hal.archives-ouvertes.fr

… In this paper, we use a data-driven Wasserstein metric that gives a distance

value between two distributions. This particular case of DRO is called data-driven Wasserstein

Related articles All 7 versions


Multiscale Carbonate Rock Reconstruction Using a Hybrid WGAN-GP and Super-Resolution

Z Zhang, Y Li, M AlSinan, X He, H Kwak… - SPE Annual Technical …, 2022 - onepetro.org

… In addition, we adopt the Wasserstein GAN with gradient penalty to stabilize the training 

process. Benefiting on this technology, the proposed network successfully captures detailed …

All 2 versions

<–—2022———2022———1350—



[HTML] sciencedirect.com

[HTML] ERP-WGAN: A data augmentation method for EEG single-trial detection

R Zhang, Y Zeng, L Tong, J Shu, R Lu, K Yang… - Journal of Neuroscience …, 2022 - Elsevier

… in wasserstein generative adversarial networks (WGAN), the … , the proposed ERP-WGAN 

framework significantly improve the … ERP-WGAN can reduced at least 73% of the real subject …

Related articles All 3 versions


Linear-FM fuze signal enhancement based on WGAN

Z Jing, X Jing - 2nd International Conference on Signal Image …, 2022 - spiedigitallibrary.org

… 2.1 WGAN The basic structure of GAN used in this paper is WGAN (Wasserstein GAN)[9]. 

WGAN uses Wasserstein distance, which is different from the KL divergence and JS …

All 2 versions


[PDF] openreview.net

Bridging the Gap Between Coulomb GAN and Gradient-regularized WGAN

S Asokan, CS Seelamantula - The Symbiosis of Deep Learning and … - openreview.net

Wasserstein GAN (WGAN) cost. Subsequently, we show that, within the regularized WGAN

setting… As an alternative to training a discriminator in either WGAN or Coulomb GAN, we …

All 3 versions 

[PDF] mdpi.com

Arrhythmia Detection Based on WGAN-GP and SE-ResNet1D

J Qin, F Gao, Z Wang, L Liu, C Ji - Electronics, 2022 - mdpi.com

WGAN-GP is widely used for generating time-series data [21,… In this paper, we propose 

to use the improved WGAN-GP to … We improve the structure of WGAN-GP. A Bi-GRU layer is …

All 2 versions 


 [PDF] syntaxliterate.co.id

Comparative Analysis Of DCGAN And WGAN

SH Al Furuqi, H Santoso - Syntax Literate; Jurnal Ilmiah …, 2022 - jurnal.syntaxliterate.co.id

Wasserstein GAN (WGAN) algorithms. This study analyzes the comparison among DCGAN 

and WGAN … time as WGAN can remedy those shortcomings however the process is slower. …

All 2 versions


2022


2022 see 2018 2   thesis MJT
Learning and inference with Wasserstein metrics
Authors:Tomaso Poggio (Contributor), Massachusetts Institute of Technology Department of Brain and Cognitive Sciences (Contributor), Frogner, Charles (Charles Albert) (Creator)
Summary:Thesis: Ph. D., Massachusetts Institute of Technology, Department of Brain and Cognitive Sciences, 2018
Downloadable Archival Material, 2019-03-01T19:52:20Z
English
Publisher:Massachusetts Institute of Technology, 2019-03-01T19:52:20Z
Access Free

Cite Cited by 1


EVGAN: Optimization of Generative Adversarial Networks Using Wasserstein Distance and Neuroevolution

VK Nair, C Shunmuga Velayutham - Evolutionary Computing and Mobile …, 2022 - Springer

Generative Adversarial Networks (or called GANs) is a generative type of model which can 

be used to generate new data points from the given initial dataset. In this paper, the training …

 Related articles All 3 versions



[PDF] arxiv.org

Wasserstein distributionally robust optimization and variation regularization

R Gao, X Chen, AJ Kleywegt - Operations Research, 2022 - pubsonline.informs.org

… The connection between Wasserstein DRO and … variation regularization effect of the 

Wasserstein DRO—a new form … -variation tradeoff intrinsic in the Wasserstein DRO, which …

 Cited by 22 Related articles All 3 versions


2022 see 2021  [PDF] arxiv.org

Strong formulations for distributionally robust chance-constrained programs with left-hand side uncertainty under Wasserstein ambiguity

N Ho-Nguyen, F Kilinç-Karzan… - INFORMS Journal …, 2022 - pubsonline.informs.org

… ambiguity set, which is defined by the Wasserstein distance ball of radius θ around the … 

We consider Wasserstein ambiguity sets F N ( θ ) defined as the θ-radius Wasserstein ball of …

 Cited by 12 Related articles All 3 versions
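For reference across the distributionally robust entries above (the generic form only, with the order of the Wasserstein distance and the loss ℓ varying from paper to paper): with $\hat{P}_N$ the empirical distribution of N samples, the Wasserstein ambiguity set of radius θ and the associated DRO problem read

\[ \mathcal{F}_N(\theta) = \big\{ Q : W(Q, \hat{P}_N) \le \theta \big\}, \qquad \min_{x} \; \sup_{Q \in \mathcal{F}_N(\theta)} \; \mathbb{E}_{\xi \sim Q}\big[ \ell(x, \xi) \big]. \]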


<–—2022———2022———1360—


An Improved WGAN-Based Fault Diagnosis of Rolling Bearings

C Zhao, L Zhang, M Zhong - 2022 IEEE International …, 2022 - ieeexplore.ieee.org

… In [8], a selfattention mechanism and WGAN were used to ameliorate the ability of fitting … of WGAN. In [10], the gradient penalty unit was appended to the loss function of WGAN for …


2022

Robot Intelligence Lab (@RobotIntelliLab) / Twitter

twitter.com › robotintellilab


The Robot Intelligence Lab, Imperial College London. ... 

Myoelectric Prosthetic Hands: a Highly Data-Efficient Controller Based on the Wasserstein Distance'.

Twitter · 

May 12, 2022



arXiv:2211.02990  [pdf, other]  stat.CO  q-fin.ST
Efficient Convex PCA with applications to Wasserstein geodesic PCA and ranked data
Authors: Steven Campbell, Ting-Kam Leonard Wong
Abstract: Convex PCA, which was introduced by Bigot et al., is a dimension reduction methodology for data with values in a convex subset of a Hilbert space. This setting arises naturally in many applications, including distributional data in the Wasserstein space of an interval, and ranked compositional data under the Aitchison geometry. Our contribution in this paper is threefold. First, we present several…  More
Submitted 5 November, 2022; originally announced November 2022.
Comments: 40 pages, 9 figures

 Efficient Convex PCA with applications to Wasserstein...
by Campbell, Steven; Wong, Ting-Kam Leonard
11/2022
Convex PCA, which was introduced by Bigot et al., is a dimension reduction methodology for data with values in a convex subset of a Hilbert space. This setting...
Journal Article  Full Text Online


2022      

Gabriel Peyré on Twitter: "Did I mention that this is the ...

twitter.com › gabrielpeyre › status


Gradient descent on particles' positions (Lagrangian) is equivalent to ... Did I mention that this is the Wasserstein gradient flow of the ...

Twitter · 

Apr 25, 2022


[PDF] uclouvain.be

[PDF] Improving weight clipping in Wasserstein GANs

E Massart - perso.uclouvain.be

… the critic under control, in Wasserstein GAN training. After each … strategy for weight clipping 

in Wasserstein GANs. Instead of … on Wasserstein GANs with simple feedforward architectures. …


2022


[HTML] springer.com

[HTML] Network intrusion detection based on conditional wasserstein variational autoencoder with generative adversarial network and one-dimensional …

J He, X Wang, Y Song, Q Xiang, C Chen - Applied Intelligence, 2022 - Springer

… We propose Conditional Wasserstein Variational Autoencoders with Generative Adversarial 

Network (CWVAEGAN) to solve the class-imbalance phenomenon, CWVAEGAN transform …

 Related articles


[PDF] eartharxiv.org

Comparing detrital age spectra, and other geological distributions, using the Wasserstein distance

AG Lipp, P Vermeesch - 2022 - eartharxiv.org

… In the following sections, we first introduce the Wasserstein … We then proceed to compare 

the Wasserstein distance to the … be accessed using an (online) graphical user interface, at2 …

 Cited by 1 Related articles All 3 versions

[PDF] arxiv.org

Efficient Convex PCA with applications to Wasserstein geodesic PCA and ranked data

S Campbell, TKL Wong - arXiv preprint arXiv:2211.02990, 2022 - arxiv.org

… In Section 5.3, we apply Wasserstein geodesic PCA to distributions of US stock returns 

ranked by size, and show that the first two convex principal components can be interpreted in …

All 4 versions


[PDF] ams.org

Maps on positive definite cones of 𝐶*-algebras preserving the Wasserstein mean

L Molnár - Proceedings of the American Mathematical Society, 2022 - ams.org

… -Wasserstein metric (actually, it is called Bures metric in quantum information theory and 

Wasserstein … We note that the definition of the Bures-Wasserstein metric was recently extended …

Cited by 2 Related articles All 3 versions


[PDF] arxiv.org

Causality Learning With Wasserstein Generative Adversarial Networks

H Petkov, C Hanley, F Dong - arXiv preprint arXiv:2206.01496, 2022 - arxiv.org

… of Wasserstein distance in the context of causal structure learning. Our model named DAGWGAN 

combines the Wasserstein-… We conclude that the involvement of the Wasserstein metric …

Related articles All 3 versions

<–—2022———2022———1370—


 [PDF] arxiv.org

Coresets for Wasserstein Distributionally Robust Optimization Problems

R Huang, J Huang, W Liu, H Ding - arXiv preprint arXiv:2210.04260, 2022 - arxiv.org

… the Wasserstein metric, especially for the applications in machine learning [55; 46; 6; 18]. The 

Wasserstein ball captures much richer information … complexity if the Wasserstein ball has a …

All 2 versions


[PDF] amazonaws.com

[PDF] Generative Data Augmentation via Wasserstein Autoencoder for Text Classification

K Jin, J Lee, J Choi, S Jang, Y Kim - journal-home.s3.ap-northeast-2 …

… In this paper, we propose a simple text augmentation method using a Wasserstein autoencoder 

(WAE, [9]) to mitigate posterior collapse during training. The WAE can prevent posterior …

Related articles All 3 versions
 

2022 see 2021  Iacobelli, Mikaela

A new perspective on Wasserstein distances for kinetic problems. (English) Zbl 07505277

Arch. Ration. Mech. Anal. 244, No. 1, 27-50 (2022).

MSC:  35Qxx 35Axx 82Cxx

PDF BibTeX XML Cite  Full Text: DOI 

Zbl 07505277

Cited by 1 Related articles All 6 versions

[HTML] springer.com

[HTML] A new perspective on Wasserstein distances for kinetic problems

M Iacobelli - Archive for Rational Mechanics and Analysis, 2022 - Springer

… We introduce a new class of Wasserstein-type distances specifically designed to tackle 

questions concerning stability and convergence to equilibria for kinetic equations. Thanks to …

Cited by 4 Related articles All 5 versions

[HTML] sciencedirect.com

[HTML] Ensemble data assimilation using optimal control in the Wasserstein metric

X Liu, J Frank - Journal of Computational Science, 2022 - Elsevier

… Also, in the Wasserstein metric, the geodesic path between two distributions is the optimal 

transport path, along which the deformation of a density is minimal. Consequently, in the …

Ensemble data assimilation using optimal control in the Wasserstein metric


[HTML] euromathsoc.org

… :“Lectures on Optimal Transport” by Luigi Ambrosio, Elia Brué and Daniele Semola, and “An Invitation to Optimal Transport, Wasserstein Distances, and Gradient …

F Santambrogio - European Mathematical Society Magazine, 2022 - ems.press

… title: Wasserstein distances and gradient flows. Chapter 3 is more metric in nature: it introduces 

the Wasserstein … structure of the Wasserstein space, discussing geodesic curves and the …

 Related articles


2022


Information Geometry (@SN_INGE) / Twitter

twitter.com › sn_inge


The journal Information Geometry has received its first #CiteScore of 🌟3.3🌟! ... "On a prior based on the Wasserstein information matrix" ...

Oct 13, 2022


Conditional Wasserstein Generator

Young-geun Kim;

Kyungbok Lee;

Myunghee Cho Paik

IEEE Transactions on Pattern Analysis and Machine Intelligence

Year: 2022 | Early Access Article | Publisher: IEEE


Wasserstein Generative Adversarial Networks with Meta Learning for Fault Diagnosis of Few-shot Bearing

Ouyang Chengda;

Noramalina Abdullah

2022 IEEE International Conference on Artificial Intelligence in Engineering and Technology (IICAIET)

Year: 2022 | Conference Paper | Publisher: IEEE

Abstract      HTML

Distributed robust optimal scheduling of multi-energy complementary based on Wasserstein distance

Penalty

Jiafeng Wang;

Ming Liu;

Xiaokang Yin;

Yuhao Zhao;

Shengli Liu

2022 IEEE 6th Advanced Information Technology, Electronic and Automation Control Conference (IAEAC )

Year: 2022 | Conference Paper | Publisher: IEEE

Abstract  HTML
 

Distributed robust optimal scheduling of multi-energy complementary based on Wasserstein distance

X. Wang;

F. Xu;

Y. Wang;

J. Shi;

H. Wen;

C. Guo

2022 Tsinghua-IET Electrical Engineering Academic Forum

Year: 2022 | Volume: 2022 | Conference Paper | Publisher: IET

Abstract

<–—2022———2022———1380—



Optimal HVAC Scheduling under Temperature Uncertainty using the Wasserstein Metric

Guanyu Tian;

Qun Zhou Sun

2022 IEEE Power & Energy Society General Meeting (PESGM)

Year: 2022 | Conference Paper | Publisher: IEEE

Abstract  HTML

Data-Driven PMU Noise Emulation Framework using Gradient-Penalty-Based Wasserstein GAN

Austin R Lassetter;

Kaveri Mahapatra;

David J. Sebastian-Cardenas;

Sri Nikhil Gupta Gourisetti;

James G. O'Brien;

James P. Ogle

2022 IEEE Power & Energy Society General Meeting (PESGM)

Year: 2022 | Conference Paper | Publisher: IEEE

AbstractHTML


Causal Discovery on Discrete Data via Weighted Normalized Wasserstein Distance

Yi Wei;

Xiaofei Li;

Lihui Lin;

Dengming Zhu;

Qingyong Li

IEEE Transactions on Neural Networks and Learning Systems

Year: 2022 | Early Access Article | Publisher: IEEE

Abstract


 Wasserstein Generative Adversarial Network to Address the Imbalanced Data Problem in Real-Time Crash Risk Prediction

Cheuk Ki Man;

Mohammed Quddus;

Athanasios Theofilatos;

Rongjie Yu;

Marianna Imprialou

IEEE Transactions on Intelligent Transportation Systems

Year: 2022 | Early Access Article | Publisher: IEEE

Abstract


Towards Efficient Variational Auto-Encoder Using Wasserstein Distance

Zichuan Chen;

Peng Liu

2022 IEEE International Conference on Image Processing (ICIP)

Year: 2022 | Conference Paper | Publisher: IEEE

Abstract  HTML


2022


Style Transfer Using Optimal Transport Via Wasserstein Distance

Oseok Ryu;

Bowon Lee

2022 IEEE International Conference on Image Processing (ICIP)

Year: 2022 | Conference Paper | Publisher: IEEE

Abstract  HTML


Lidar Upsampling With Sliced Wasserstein Distance

Institute of Electrical and Electronics Engineers

https://ieeexplore.ieee.org › document

by A Savkin · 2022 — In this letter, we address the problem of lidar upsampling. Learning on lidar point clouds is rather a challenging task due to their irregular ...

IEEE Robotics and Automation Letters

Year: 2022 | Early Access Article | Publisher: IEEE

Abstract

Rolling Bearing Fault Diagnosis Based on Deep Adversarial Networks with Convolutional Layer and Wasserstein Distance

Xinyu Gao;

Rui Yang;

Eng Gee Lim

2022 27th International Conference on Automation and Computing (ICAC)

Year: 2022 | Conference Paper | Publisher: IEEE

Abstract  HTML


Electromagnetic Full Waveform Inversion Based on Quadratic Wasserstein Metric

Jian Deng; Peimin Zhu; Wlodek Kofman; Jinpeng Jiang; Yuefeng Yuan; Alain Herique

IEEE Transactions on Antennas and Propagation

Year: 2022 | Early Access Article | Publisher: IEEE

Abstract

Conditional Wasserstein GAN for Energy Load Forecasting in Large Buildings

George-Silviu Năstăsescu;

Dumitru-Clementin Cercel

2022 International Joint Conference on Neural Networks (IJCNN)

Year: 2022 | Conference Paper | Publisher: IEEE

Abstract  HTML

<–—2022———2022———1390—


LIFEWATCH: Lifelong Wasserstein Change Point Detection

Kamil Faber;

Roberto Corizzo;

Bartlomiej Sniezynski;

Michael Baron;

Nathalie Japkowicz

2022 International Joint Conference on Neural Networks (IJCNN)

Year: 2022 | Conference Paper | Publisher: IEEE

Abstract  HTML
Cited by 3
 Related articles All 2 versions

Distributed Wasserstein Barycenters via Displacement Interpolation

Pedro Cisneros-Velarde;

Francesco Bullo

IEEE Transactions on Control of Network Systems

Year: 2022 | Early Access Article | Publisher: IEEE

Abstract



Wasserstein Generative Adversarial Networks with Meta Learning for Fault Diagnosis of Few-shot Bearing

Ouyang Chengda;

Noramalina Abdullah

2022 IEEE International Conference on Artificial Intelligence in Engineering and Technology (IICAIET)

Year: 2022 | Conference Paper | Publisher: IEEE

Abstract  HTML
Wasserstein Generative Adversarial Networks with Meta Learning for Fault Diagnosis of Few-shot Bearing


Semi-supervised Malicious Traffic Detection with Improved Wasserstein Generative Adversarial Network with Gradient Penalty

Jiafeng Wang;

Ming Liu;

Xiaokang Yin;

Yuhao Zhao;

Shengli Liu

2022 IEEE 6th Advanced Information Technology, Electronic and Automation Control Conference (IAEAC )

Year: 2022 | Conference Paper | Publisher: IEEE

Abstract  HTML


Data-Driven PMU Noise Emulation Framework using Gradient-Penalty-Based Wasserstein GAN

Austin R Lassetter;

Kaveri Mahapatra;

David J. Sebastian-Cardenas;

Sri Nikhil Gupta Gourisetti;

James G. O'Brien;

James P. Ogle

2022 IEEE Power & Energy Society General Meeting (PESGM)

Year: 2022 | Conference Paper | Publisher: IEEE

Abstract  HTML


2022


Wasserstein Generative Adversarial Network to Address the Imbalanced Data Problem in Real-Time Crash Risk Prediction

Cheuk Ki Man;

Mohammed Quddus;

Athanasios Theofilatos;

Rongjie Yu;

Marianna Imprialou

IEEE Transactions on Intelligent Transportation Systems

Year: 2022 | Early Access Article | Publisher: IEEE

Abstract
Related articles
All 2 versions


SSP-WGAN-Based Data Enhancement And Prediction Method for Cement Clinker f-CaO

Xiaochen Hao;

Hui Dang;

Yuxuan Zhang;

Lin Liu;

Gaolu Huang;

Yifu Zhang;

Jinbo Liu

IEEE Sensors Journal

Year: 2022 | Early Access Article | Publisher: IEEE


arXiv:2211.05903  [pdf, other]  math.OC
Two-Stage Distributionally Robust Conic Linear Programming over 1-Wasserstein Balls
Authors: Geunyeong Byeon, Kibaek Kim
Abstract: This paper studies two-stage distributionally robust conic linear programming under constraint uncertainty over type-1 Wasserstein balls. We present optimality conditions for the dual of the worst-case expectation problem, which characterizes worst-case uncertain parameters for its inner maximization problem. The proposed optimality conditions suggest binary representations of uncertain parameters…  More
Submitted 10 November, 2022; originally announced November 2022.


2022 see 2021

Marx, Victor

A Bismut-Elworthy inequality for a Wasserstein diffusion on the circle. (English) Zbl 07613008

Stoch. Partial Differ. Equ., Anal. Comput. 10, No. 4, 1559-1618 (2022).

MSC:  60H10 60H07 60H30 60K35 60J60

PDF BibTeX XML Cite

Full Text: DOI


Handwriting Recognition Using Wasserstein Metric in...

by Jangpangi, Monica; Kumar, Sudhanshu; Bhardwaj, Diwakar ; More...

SN computer science, 11/2022, Volume 4, Issue 1

Deep intelligence provides a great way to deal with understanding the complex handwriting of the user. Handwriting is challenging due to its irregular shapes,...

Journal ArticleCitation Online

<–—2022———2022———1400—



Contrastive Prototypical Network with Wasserstein...
by Wang, Haoqing; Deng, Zhi-Hong
Computer Vision – ECCV 2022, 11/2022
Unsupervised few-shot learning aims to learn the inductive bias from unlabeled dataset for solving the novel few-shot tasks. The existing unsupervised few-shot...
Book Chapter  Full Text Online


 
Studies from Beijing Institute of Technology Update Current Data on Heart Disease (ECG Classification Based on Wasserstein..
Heart Disease Weekly, 11/2022
NewsletterCitation Online

 2022 patent news
State Intellectual Property Office of China Publishes Dongfeng Automobile Group Stock Ltd and Dongfeng Pleasure Science and Tech Limited's Patent Application for Motor Fault Data Enhancement Method Based on Conditional Wasserstein...
Global IP News: Automobile Patent News, Nov 4, 2022
Newspaper ArticleCitation Online



Lung image segmentation based on DRD U-Net and combined WGAN...
by Lian, Luoyu; Luo, Xin; Pan, Canyu ; More...
Computer methods and programs in biomedicine, 08/2022, Volume 226
PURPOSE: COVID-19 is a hot issue right now, and it's causing a huge number of infections in people, posing a grave threat to human life. Deep learning-based...
ArticleView Article PDF
Journal Article  Full Text Online
View Complete Issue Browse Now

 
Handwriting Recognition Using Wasserstein Metric in Adversarial...
by Jangpangi, Monica; Kumar, Sudhanshu; Bhardwaj, Diwakar ; More...
SN computer science, 11/2022, Volume 4, Issue 1
Deep intelligence provides a great way to deal with understanding the complex handwriting of the user. Handwriting is challenging due to its irregular shapes,...
ArticleView Article PDF
Journal Article  Full Text Online

 2022


Contrastive Prototypical Network with Wasserstein Confidence...
by Wang, Haoqing; Deng, Zhi-Hong
Computer Vision – ECCV 2022, 11/2022
Unsupervised few-shot learning aims to learn the inductive bias from unlabeled dataset for solving the novel few-shot tasks. The existing unsupervised few-shot...
Book Chapter  Full Text Online


2022 see 2021

MR4506815 Prelim Ho-Nguyen, Nam; Kılınç-Karzan, Fatma; Küçükyavuz, Simge; Lee, Dabeen; 

Distributionally robust chance-constrained programs with right-hand side uncertainty under Wasserstein ambiguity. Math. Program. 196 (2022), no. 1-2, Ser. B, 641–672. 90-08 (90-10 90C10 90C11 90C15 90C17 90C27 90C57)

Review PDF Clipboard Journal Article

 Zbl 07616319


2022 see 2021

MR4503174 Prelim Marx, Victor; 

A Bismut-Elworthy inequality for a Wasserstein diffusion on the circle. Stoch. Partial Differ. Equ. Anal. Comput. 10 (2022), no. 4, 1559–1618. 60H10 (60H07 60H30 60J60 60K35)

Review PDF Clipboard Journal Article

<–—2022———2022———1410— 


ARTICLE

Unbalanced network attack traffic detection based on feature extraction and GFDA-WGAN

Li, Kehong ; Ma, Wengang ; Duan, Huawei ; Xie, Han ; Zhu, Juanxiu ; Liu, Ruiqi. Computer networks (Amsterdam, Netherlands : 1999), 2022, Vol.216

PEER REVIEWED

Unbalanced network attack traffic detection based on feature extraction and GFDA-WGAN

Available Online 


CONFERENCE PROCEEDING

Linear-FM fuze signal enhancement based on WGAN

Jing, Zhiyuan ; Jing, Xingyu. 2022

Linear-FM fuze signal enhancement based on WGAN

All 3 versions

CONFERENCE PROCEEDING

Multiscale Carbonate Rock Reconstruction Using a Hybrid WGAN-GP and Super-Resolution

Zhang, Zhen ; Li, Yiteng ; AlSinan, Marwah ; He, Xupeng ; Kwak, Hyung ; Hoteit, Hussein. 2022

Multiscale Carbonate Rock Reconstruction Using a Hybrid WGAN-GP and Super-Resolution

No Online Access 


2022  PATENT

基于WGAN-GP的CAN总线模糊测试用例生成方法及模糊测试系统

OPEN ACCESS

基于WGAN-GP的CAN总线模糊测试用例生成方法及模糊测试系统

No Online Access 

[Chinese  CAN bus fuzz test case generation method and fuzz test system based on WGAN-GP]



2022  PATENT

FC-VoVNet and WGAN-based B ultrasonic image denoising method

WEI JIANHUA ; CHEN DEHAI. 2022

OPEN ACCESS

FC-VoVNet and WGAN-based B ultrasonic image denoising method

No Online Access 

 

2022


PATENT

基于SVAE-WGAN的过程工业软测量数据补充方法

2022

OPEN ACCESS

基于SVAE-WGAN的过程工业软测量数据补充方法

No Online Access 

[Chinese  SVAE-WGAN-based data supplementation method for soft sensing in process industry]


 

ARTICLE

Application of WGAN-GP in recommendation and Questioning the relevance of GAN-based approaches

Hichem Ammar Khodja ; Boudjeniba, Oussama. arXiv.org, 2022

OPEN ACCESS

Application of WGAN-GP in recommendation and Questioning the relevance of GAN-based approaches

Available Online  

2022 book

Application of WGAN-GP in recommendation and Questioning the relevance of GAN-based approaches

Author:Oussama Boudjeniba

Summary:Many neural-based recommender systems were proposed in recent years and part of them used Generative Adversarial Networks (GAN) to model user-item interactions. However, the exploration of Wasserstein GAN with Gradient Penalty (WGAN-GP) on recommendation has received relatively less scrutiny. In this paper, we focus on two questions: 1- Can we successfully apply WGAN-GP on recommendation and does this approach give an advantage compared to the best GAN models? 2- Are GAN-based recommender systems relevant? To answer the first question, we propose a recommender system based on WGAN-GP called CFWGAN-GP which is founded on a previous model (CFGAN). We successfully applied our method on real-world datasets on the top-k recommendation task and the empirical results show that it is competitive with state-of-the-art GAN approaches, but we found no evidence of significant advantage of using WGAN-GP instead of the original GAN, at least from the accuracy point of view. As for the second question, we conduct a simple experiment in which we show that a well-tuned conceptually simpler method outperforms GAN-based models by a considerable margin, questioning the use of such models

Show more

Book, Apr 28, 2022

Publication:arXiv.org, Apr 28, 2022, n/a

Publisher:Apr 28, 2022

2022  PATENT

基于半监督WGAN-GP的高光谱图像分类方法

OPEN ACCESS

[Chinese  Hyperspectral image classification method based on semi-supervised WGAN-GP]

Cited by 2 Related articles 

2022 PATENT

基于改进WGAN-GP的多波段图像同步融合与增强方法

OPEN ACCESS

基于改进WGAN-GP的多波段图像同步融合与增强方法

No Online Access 

[Chinese  Synchronous fusion and enhancement method of multi-band image based on improved WGAN-GP]

PATENT

一种基于BiLSTMWGAN-GP网络的sEMG数据增强方法

OPEN ACCESS

一种基于BiLSTMWGAN-GP网络的sEMG数据增强方法

No Online Access 

[Chinese  A sEMG data augmentation method based on BiLSTM and WGAN-GP network]

<–—2022———2022———1420—


2022  PATENT

基于WGAN动态惩罚的网络安全不平衡数据集分析方法

OPEN ACCESS

基于WGAN动态惩罚的网络安全不平衡数据集分析方法

No Online Access 

[Chinese  Network security imbalanced dataset analysis method based on WGAN dynamic penalty]

2022  PATENT

一种基于孤立森林与WGAN网络的风电输出功率预测方法

OPEN ACCESS

一种基于孤立森林与WGAN网络的风电输出功率预测方法

No Online Access 

[Chinese  A wind power output prediction method based on isolation forest and WGAN network]

 

2022 see 2021 PATENT

Data depth enhancement method based on WGAN-GP data generation and Poisson fusion

ZHANG HUITING ; LIU ZHUO ; HOU YUE ; CHEN NING ; CHEN YANYAN. 2022

OPEN ACCESS

Data depth enhancement method based on WGAN-GP data generation and Poisson fusion

No Online Access  


PATENT

Microseism record denoising method based on improved WGAN network and CBDNet

SHENG GUANQUN ; MA KAI ; JING TANG ; ZHENG YUELIN ; YU MEI ; ZHANG JINGLAN. 2022

OPEN ACCESS

Microseism record denoising method based on improved WGAN network and CBDNet

No Online Access 

 

2022 see 2021  PATENT

Power system harmonic law calculation method based on WGAN

WU OU ; PAN YANYI ; YAO CHANGQING ; MEI WENBO ; MEI ZHICHAO ; ZHU SHUAI. 2022

OPEN ACCESS

Power system harmonic law calculation method based on WGAN

No Online Access 


2022

  

2022  PATENT

基于WGAN-GP的雷达HRRP数据库构建方法

OPEN ACCESS

基于WGAN-GP的雷达HRRP数据库构建方法

No Online Access 

[Chinese  Construction method of radar HRRP database based on WGAN-GP]

 

2022 PATENT

基于WGAN-GP和U-net改进的图像增强的方法、装置及存储介质

OPEN ACCESS

基于WGAN-GP和U-net改进的图像增强的方法、装置及存储介质

No Online Access 

[Chinese  mproved image enhancement method, device and storage medium based on WGAN-GP and U-net]


ARTICLE

适用于局放模式识别的WGAN-GP数据增强方法

路士杰 ; 董驰 ; 顾朝敏 ; 郑宝良 ; 刘兆宸 ; 谢庆 ; 谢军. 南方电网技术, 2022, Vol.16 (7), p.55-60

适用于局放模式识别的WGAN-GP数据增强方法

No Online Access  

[Chinese  WGAN-GP data augmentation method for partial discharge (PD) pattern recognition]


ARTICLE

融合软奖励和退出机制的WGAN知识图谱补全方法

张得祥 ; 王海荣 ; 钟维幸 ; 郭瑞萍. 郑州大学学报(理学版), 2022, Vol.54 (2), p.67-73

融合软奖励和退出机制的WGAN知识图谱补全方法

No Online Access 

[Chinese  A WGAN Knowledge Graph Completion Method Integrating Soft Reward and Exit Mechanism]

 

  

ARTICLE

基于GRU神经网络的WGAN短期负荷预测模型

高翱 ; 王帅 ; 韩兴臣 ; 张智晟. 电气工程学报, 2022, Vol.17 (2), p.168-175

[Chinese  WGAN short-term load forecasting model based on GRU neural network]

<–—2022———2022———1430—



ARTICLE

基于VAE-WGAN的多维时间序列异常检测方法

段雪源 ; 付钰 ; 王坤. 通信学报, 2022, Vol.43 (3), p.1-13

基于VAE-WGAN的多维时间序列异常检测方法

No Online Access 

[Chinese  Anomaly detection method for multidimensional time series based on VAE-WGAN]

ARTICLE

基于WGAN-GP的变压器故障样本扩充模型的构建与评价

王锦 ; 徐新. 光源与照明, 2022 (3), p.128-131

基于WGAN-GP的变压器故障样本扩充模型的构建与评价

No Online Access 

[Chinese  Construction and Evaluation of Transformer Fault Sample Expansion Model Based on WGAN-GP]

 

ARTICLE

Short-term prediction of wind power based on BiLSTM–CNN–WGAN-GP

Huang, Ling ; Li, Linxia ; Wei, Xiaoyuan ; Zhang, Dongsheng. Soft computing (Berlin, Germany), 2022, Vol.26 (20), p.10607-10621


CONFERENCE PROCEEDING

Imbalanced Cell-Cycle Classification Using Wgan-Div and Mixup

Rana, Priyanka ; Sowmya, Arcot ; Meijering, Erik ; Song, Yang. 2022 IEEE 19th International Symposium on Biomedical Imaging (ISBI), 2022, p.1-4


2022  ARTICLE

基于WGAN-GP的微多普勒雷达人体动作识别

屈乐乐 ; 王禹桐. 雷达科学与技术, 2022, Vol.20 (2), p.195-201

基于WGAN-GP的微多普勒雷达人体动作识别

No Online Access 

[Chinese  Human action recognition with micro-Doppler radar based on WGAN-GP]


2022



BOOK CHAPTER

Contrastive Prototypical Network with Wasserstein Confidence Penalty

Wang, Haoqing ; Deng, Zhi-Hong. Computer Vision – ECCV 2022, 2022, p.665-682

Contrastive Prototypical Network with Wasserstein Confidence Penalty

Available Online 



BOOK CHAPTER

The Continuous Formulation of Shallow Neural Networks as Wasserstein-Type Gradient Flows

Fernández-Real, Xavier ; Figalli, Alessio. Analysis at Large, 2022, p.29-57

The Continuous Formulation of Shallow Neural Networks as Wasserstein-Type Gradient Flows

No Online Access 



BOOK CHAPTER

An Embedding Carrier-Free Steganography Method Based on Wasserstein GAN

Yu, Xi ; Cui, Jianming ; Liu, Ming. Algorithms and Architectures for Parallel Processing, 2022, p.532-545

 .... In this paper, we proposed a carrier-free steganography method based on Wasserstein GAN. We segmented the target information and input it into the trained Wasserstein GAN, and then generated the visual-real image...

PEER REVIEWED

An Embedding Carrier-Free Steganography Method Based on Wasserstein GAN

Available Online 

BOOK CHAPTER

A Radar HRRP Target Recognition Method Based on Conditional Wasserstein VAEGAN and 1-D CNN

He, Jiaxing ; Wang, Xiaodan ; Xiang, Qian. Pattern Recognition and Computer Vision, 2022, p.762-777

A Radar HRRP Target Recognition Method Based on Conditional Wasserstein VAEGAN and 1-D CNN

Available Online 

All 2 versions

 

BOOK CHAPTER

A Reusable Methodology for Player Clustering Using Wasserstein Autoencoders

Tan, Jonathan ; Katchabaw, Mike. Entertainment Computing – ICEC 2022, 2022, p.296-308

A Reusable Methodology for Player Clustering Using Wasserstein Autoencoders

Available Online 

<–—2022———2022———1440—


 

2022  thesis

Computational Inversion with Wasserstein Distances and Neural Network Induced Loss Functions
Author: Wen Ding
Summary:This thesis presents a systematic computational investigation of loss functions in solving inverse problems of partial differential equations. The primary efforts are spent on understanding optimization-based computational inversion with loss functions defined with the Wasserstein metrics and with deep learning models. The scientific contributions of the thesis can be summarized in two directions. In the first part of this thesis, we investigate the general impacts of different Wasserstein metrics and the properties of the approximate solutions to inverse problems obtained by minimizing loss functions based on such metrics. We contrast the results to those of classical computational inversion with loss functions based on the L² and related classical metrics. We identify critical parameters, both in the metrics and the inverse problems to be solved, that control the performance of the reconstruction algorithms. We highlight the frequency disparity in the reconstructions with the Wasserstein metrics as well as its consequences, for instance, the pre-conditioning effect, the robustness against high-frequency noise, and the loss of resolution when data used contain random noise. Show more
Thesis, Dissertation, 2022
English
Publisher:[publisher not identified], [New York, N.Y.?], 2022

[PDF] columbia.edu

[BOOK] Computational Inversion with Wasserstein Distances and Neural Network Induced Loss Functions

W Ding - 2022 - search.proquest.com

… functions defined with the Wasserstein metrics and with deep … the general impacts of different 

Wasserstein metrics and the … in the reconstructions with the Wasserstein metrics as well as …

 Related articles All 3 versions
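The contrast the thesis studies can be illustrated with a toy one-dimensional example (an illustration only, not the thesis's experiments): an L² misfit saturates once two pulses stop overlapping, while a Wasserstein misfit keeps growing with the shift.

import numpy as np
from scipy.stats import wasserstein_distance

x = np.linspace(0.0, 1.0, 400)

def pulse(center, width=0.02):
    p = np.exp(-((x - center) ** 2) / (2 * width ** 2))
    return p / p.sum()  # normalize so the pulse reads as a probability mass

ref = pulse(0.30)
for shift in (0.00, 0.05, 0.20, 0.40):
    moved = pulse(0.30 + shift)
    l2 = np.linalg.norm(ref - moved)                                 # classical L2 misfit
    w1 = wasserstein_distance(x, x, u_weights=ref, v_weights=moved)  # 1-D Wasserstein misfit
    print(f"shift={shift:.2f}  L2={l2:.4f}  W1={w1:.4f}")

The Wasserstein values grow roughly linearly in the shift while the L² values flatten out, which is the behaviour usually invoked when Wasserstein losses are preferred for such inverse problems.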

2022  thesis
Non-parametric threshold for smoothed empirical Wasserstein distance
Authors: Zeyu Jia (Author), Yury Polyanskiy, Sasha Rakhlin, Massachusetts Institute of Technology
Abstract: Consider an empirical measure P_n induced by n iid samples from a d-dimensional K-subgaussian distribution P. We show that when K < σ, the smoothed Wasserstein distance between P_n and P (each convolved with Gaussian noise N(0, σ²I_d)) converges at the parametric rate O(1/n), and when K > σ, there exists a K-subgaussian distribution P for which it does not. This resolves the open problems in [7] and closes the gap between the regime where the parametric rate holds and the regime where it fails. Our result provides a complete characterization of the range of parametric rates for subgaussian P. Show more
Thesis, Dissertation, 2022
English
Publisher:Massachusetts Institute of Technology, Cambridge, Massachusetts, 2022


2022

Optimal 1-Wasserstein Distance for WGANsAuthors:Stéphanovitch, Arthur (Creator), Tanielian, Ugo (Creator), Cadre, Benoît (Creator), Klutchnikoff, Nicolas (Creator), Biau, Gérard (Creator)
Summary:The mathematical forces at work behind Generative Adversarial Networks raise challenging theoretical issues. Motivated by the important question of characterizing the geometrical properties of the generated distributions, we provide a thorough analysis of Wasserstein GANs (WGANs) in both the finite sample and asymptotic regimes. We study the specific case where the latent space is univariate and derive results valid regardless of the dimension of the output space. We show in particular that for a fixed sample size, the optimal WGANs are closely linked with connected paths minimizing the sum of the squared Euclidean distances between the sample points. We also highlight the fact that WGANs are able to approach (for the 1-Wasserstein distance) the target distribution as the sample size tends to infinity, at a given convergence rate and provided the family of generative Lipschitz functions grows appropriately. We derive in passing new results on optimal transport theory in the semi-discrete settingShow more
Publisher:2022-01-08

Cited by 2 Related articles All 2 versions
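For context (the standard WGAN objective, stated only to fix notation for the summary above): with generator $G_\theta$ pushing latent noise $P_Z$ forward and critic $D$ ranging over 1-Lipschitz functions,

\[ \min_{\theta} \; W_1\big( P, (G_\theta)_{\#} P_Z \big) \;=\; \min_{\theta} \; \sup_{\mathrm{Lip}(D) \le 1} \; \mathbb{E}_{X \sim P}[D(X)] - \mathbb{E}_{Z \sim P_Z}[D(G_\theta(Z))]. \]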

 On linear optimization over Wasserstein balls

Yue, MC; Kuhn, D and Wiesemann, W

Sep 2022 | 

MATHEMATICAL PROGRAMMING

 195 (1-2) , pp.1107-1122

Wasserstein balls, which contain all probability measures within a pre-specified Wasserstein distance to a reference measure, have recently enjoyed wide popularity in the distributionally robust optimization and machine learning communities to formulate and solve data-driven optimization problems with rigorous statistical guarantees. In this technical note we prove that the Wasserstein ball is

Show more

Free Submitted Article From Repository  View full text


Working Paper

Efficient Gradient Flows in Sliced-Wasserstein Space

Bonet, Clément; Courty, Nicolas; Septier, François; Lucas Drumetz.

arXiv.org; Ithaca, Nov 15, 2022.

Cite  Email

Save to My Research

Full Text

Abstract/DetailsGet full text

opens in a new window

2022


Working Paper

A New Family of Dual-norm regularized -Wasserstein Metrics

Manupriya, Piyushi; Nath, J Saketha; Jawanpuria, Pratik.

arXiv.org; Ithaca, Nov 7, 2022.

Cite  Email

Save to My Research

Full Text

Abstract/DetailsGet full text

opens in a new window


2022 see 2021  Working Paper

Projection Robust Wasserstein Distance and Riemannian Optimization

Lin, Tianyi; Fan, Chenyou; Ho, Nhat; Cuturi, Marco; Jordan, Michael I.

arXiv.org; Ithaca, Nov 6, 2022.

Cite  Email

Save to My Research

Full Text

Abstract/DetailsGet full text

opens in a new window

Working Paper

Fair and Optimal Classification via Transports to Wasserstein-Barycenter

Ruicheng Xian; Lang, Yin; Zhao, Han.

arXiv.org; Ithaca, Nov 3, 2022.

Cite  Email

Save to My Research

Full Text

Abstract/DetailsGet full text

opens in a new window

Working Paper

Quadratic Wasserstein metrics for von Neumann algebras via transport plans

Duvenhage, Rocco.

arXiv.org; Ithaca, Nov 2, 2022.

Cite  Email

Save to My Research

Full Text

Abstract/DetailsGet full text

opens in a new window
MR4534899


Working Paper

Quantum Wasserstein isometries on the qubit state space

György Pál Gehér; Pitrik, József; Titkos, Tamás; Virosztek, Dániel.

arXiv.org; Ithaca, Oct 28, 2022.

Cite  Email

Save to My Research

Full Text

Abstract/DetailsGet full text

opens in a new window

<–—2022———2022———1450—


Working Paper

Geometric Sparse Coding in Wasserstein Space

Mueller, Marshall; Shuchin Aeron; Murphy, James M; Abiy Tasissa.

arXiv.org; Ithaca, Oct 21, 2022.

Cite  Email  Save to My Research  Full Text

Abstract/DetailsGet full text

opens in a new window



Health evaluation methodology of remote maintenance control system of natural gas pipeline based on ACWGAN-GP algorithm

Zhang, Shibin; Song, Fei; Zhang, Xiaojun; Jia, Lidong; Shi, Wei; et al.

Journal of Physics: Conference Series; Bristol Vol. 2247, Iss. 1,  (Apr 2022): 012006.

Cite  Email  Save to My Research  Full Text

Abstract/DetailsFull text - PDF (431 KB)‎

2022
Improving SSH detection model using IPA time and WGAN-GP
Authors: Junwon Lee, Heejo Lee
Summary:In the machine learning-based detection model, the detection accuracy tends to be proportional to the quantity and quality of the training dataset. The machine learning-based SSH detection model’s performance is affected by the size of the training dataset and the ratio of target classes. However, in an actual network environment within a short period, it is inconvenient to collect a sufficient and diverse training dataset. Even though many training data samples are collected, it takes a lot of effort and time to prepare the training dataset through data classification. To overcome these limitations, we generate sophisticated samples using the WGAN-GP algorithm and present how to select samples by comparing generator loss. The synthetic training dataset with generated samples improves the performance of the SSH detection model. Furthermore, we add the new features to include the distinction of inter-packet arrival time. The enhanced SSH detection model decreases false positives and provides a 0.999 F1-score by applying the synthetic dataset and the packet inter-arrival time featuresShow more
Article, 2022
Publication:Computers & Security, 116, 202205
Publisher:2022
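Since several entries above and below rely on WGAN-GP for sample generation, a minimal sketch of the gradient-penalty critic loss may help (PyTorch-style; the dimensions and the toy critic are placeholders, and this is the generic WGAN-GP recipe rather than any one paper's code).

import torch

def gradient_penalty(critic, real, fake):
    # Penalize deviations of the critic's gradient norm from 1 on random
    # interpolates between real and generated samples (the "GP" in WGAN-GP).
    eps = torch.rand(real.size(0), 1).expand_as(real)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    grad, = torch.autograd.grad(critic(interp).sum(), interp, create_graph=True)
    return ((grad.view(grad.size(0), -1).norm(2, dim=1) - 1.0) ** 2).mean()

# toy critic over 20-dimensional feature vectors (dimension is a placeholder)
critic = torch.nn.Sequential(
    torch.nn.Linear(20, 64), torch.nn.LeakyReLU(0.2), torch.nn.Linear(64, 1)
)
real = torch.randn(32, 20)   # stand-in for real feature vectors
fake = torch.randn(32, 20)   # stand-in for generator output
critic_loss = (critic(fake).mean() - critic(real).mean()
               + 10.0 * gradient_penalty(critic, real, fake))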
2022  Peer-reviewed
Lung image segmentation based on DRD U-Net and combined WGAN with Deep Neural Network
Authors: Luoyu Lian, Xin Luo, Canyu Pan, Jinlong Huang, Wenshan Hong, Zhendong Xu
Summary:COVID-19 is a hot issue right now, and it's causing a huge number of infections in people, posing a grave threat to human life. Deep learning-based image diagnostic technology can effectively enhance the deficiencies of the current main detection method. This paper proposes a multi-classification model diagnosis based on segmentation and classification multi-taskShow more
Article
Publication:Computer Methods and Programs in Biomedicine, 226, November 2022


2022
Peer-reviewed
Tool wear state recognition under imbalanced data based on WGAN-GP and lightweight neural network ShuffleNet
Authors: Wen Hou, Hong Guo, Bingnan Yan, Zhuang Xu, Chao Yuan, Yuan Mao
Summary:Abstract: The tool is an important part of machining, and its condition determines the operational safety of the equipment and the quality of the workpiece. Therefore, tool condition monitoring (TCM) is of great significance. To address the imbalance of the tool monitoring signal and achieve a lightweight model, a TCM method based on WGAN-GP and ShuffleNet is proposed in this paper. The tool monitoring data are enhanced and balanced using WGAN-GP, and the 1D signal data are converted into 2D grayscale images. The existing ShuffleNet is improved by adding a channel attention mechanism to construct the entire model. The tool wear state is recognized through experimental validation of the milling dataset and compared with those through other models. Results show that the proposed model achieves an accuracy of 99.78 % in recognizing the wear state of tools under imbalanced data while ensuring a light weight, showing the superiority of the methodShow more
Article, 2022
Publication:Journal of Mechanical Science and Technology, 36, 20221003, 4993
Publisher:2022

Cited by 4 Related articles
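The 1D-signal-to-grayscale-image step in the summary above can be done in several ways; one simple possibility (an assumption for illustration, the paper may use a different mapping) is min-max scaling followed by row-wise folding.

import numpy as np

def signal_to_grayscale(signal, size=64):
    # Min-max scale a 1-D monitoring signal to [0, 1], truncate/pad to
    # size*size samples, and fold it row-wise into an 8-bit grayscale image.
    s = np.asarray(signal, dtype=float)[: size * size]
    s = (s - s.min()) / (s.max() - s.min() + 1e-12)
    img = np.zeros(size * size)
    img[: s.size] = s
    return (img.reshape(size, size) * 255).astype(np.uint8)

image = signal_to_grayscale(np.sin(np.linspace(0, 50, 5000)))  # toy tool-force signal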

2022


2022  Peer-reviewed
Unbalanced network attack traffic detection based on feature extraction and GFDA-WGAN
Authors: Kehong Li, Wengang Ma, Huawei Duan, Han Xie, Juanxiu Zhu, Ruiqi Liu
Summary:Detecting various types of attack traffic is critical to computer network security. The current detection methods require massive amounts of data to detect attack traffic. However, in most cases, the attack traffic samples are unbalanced. A typical neural network model cannot detect such unbalanced attack traffic. Additionally, malicious network noise traffic has a detrimental effect on the detection stability. Very few effective methods exist to detect unbalanced attack traffic. In this paper, we develop a method to detect unbalanced attack traffic. A dynamic chaotic cross-optimized bidirectional residual-gated recurrent unit (DCCSO-Res-BIGRU) and an adaptive Wasserstein generative adversarial network with generated feature domains (GFDA-WGAN) are proposed. First, feature extraction is achieved using the DCCSO-Res-BIGRU. The GFDA-WGAN can then be used to detect the unbalanced attack traffic. We use a conditional WGAN network to generate the pseudo-sample features of the invisible classes. A GFDA strategy for conditional WGAN optimization is also proposed. Furthermore, we use an invisible sample and supervised learning to detect unbalanced attack traffic. Finally, the performance of the proposed method is validated using four network datasets. According to the experimental results, the proposed method significantly improves sample convergence and generation. It has a higher detection accuracy with respect to detecting unbalanced attack traffic. Furthermore, it provides the most powerful and effective visual classification. When noise is added, it outperforms all other conventionally used methods. Real-time traffic detection is also possible using this methodShow m

Article, 2022

Publication:Computer Networks, 216, 20221024

Publisher:2022


2022  Peer-reviewed

Finger vein image inpainting using neighbor binary-wasserstein generative adversarial networks (NB-WGAN)

Authors: Hanqiong Jiang, Lei Shen, Huaxia Wang, Yudong Yao, Guodong Zhao
Summary:Abstract: Traditional inpainting methods obtain poor performance for finger vein images with blurred texture. In this paper, a finger vein image inpainting method using Neighbor Binary-Wasserstein Generative Adversarial Networks (NB-WGAN) is proposed. Firstly, the proposed algorithm uses texture loss, reconstruction loss, and adversarial loss to constrain the network, which protects the texture in the inpainting process. Secondly, the proposed NB-WGAN is designed with a coarse-to-precise generator network and a discriminator network composed of two Wasserstein Generative Adversarial Networks with Gradient Penalty (WGAN-GP). The cascade of a coarse generator network and a precise generator network based on Poisson fusion can obtain richer information and get natural boundary connection. The discriminator consists of a global WGAN-GP and a local WGAN-GP, which enforces consistency between the entire image and the repaired area. Thirdly, a training dataset is designed by analyzing the locations and sizes of the damaged finger vein images in practical applications (i.e., physical oil dirt, physical finger molting, etc). Experimental results show that the performance of the proposed algorithm is better than traditional inpainting methods including Curvature Driven Diffusions algorithm without texture constraints, a traditional inpainting algorithm with Gabor texture constraints, and a WGAN inpainting algorithm based on attention mechanism without texture constraintsShow more

Cited by 2 Related articles All 3 versions

2022  Peer-reviewed
Short-term prediction of wind power based on BiLSTM–CNN–WGAN-GP
Authors: Ling Huang, Linxia Li, Xiaoyuan Wei, Dongsheng Zhang
Summary:A short-term wind power prediction model based on BiLSTM–CNN–WGAN-GP (LCWGAN-GP) is proposed in this paper, aiming at the problems of instability and low prediction accuracy of short-term wind power prediction. Firstly, the original wind energy data are decomposed into subsequences of natural mode functions with different frequencies by using the variational mode decomposition (VMD) algorithm. The VMD algorithm relies on a decision support system for the decomposition of the data into natural mode functions. Once the decomposition is performed, the nonlinear and dynamic behavior are extracted from each natural mode function. Next, the BiLSTM network is chosen as the generation model of the generative adversarial network (WGAN-GP) to obtain the data distribution characteristics of wind power’s output. The convolutional neural network (CNN) is chosen as the discrimination model, and the semi-supervised regression layer is utilized to design the discrimination model to predict wind power. The minimum–maximum game is formed by the BiLSTM and CNN network models to improve the quality of sample generation and further improve the prediction accuracy. Finally, the actual data of a wind farm in Jiuquan City, Gansu Province, China is taken as an example to prove that the proposed method has superior performance compared with other prediction algorithmsShow more
Downloadable Article, 2022
Publication:Soft Computing, 20220117, 1
Publisher:2022
Cited by 4 Related articles


2022  Peer-reviewed
ERP-WGAN: A data augmentation method for EEG single-trial detection
Authors: Rongkai Zhang, Ying Zeng, Li Tong, Jun Shu, Runnan Lu, Kai Yang, Zhongrui Li, Bin Yan
Summary:Brain computer interaction based on EEG presents great potential and becomes the research hotspots. However, the insufficient scale of EEG database limits the BCI system performance, especially the positive and negative sample imbalance caused by oddball paradigm. To alleviate the bottleneck problem of scarce EEG sample, we propose a data augmentation method based on generative adversarial network to improve the performance of EEG signal classification. Taking the characteristics of EEG into account in wasserstein generative adversarial networks (WGAN), the problems of model collapse and poor quality of artificial data were solved by using resting noise, smoothing and random amplitude. The quality of artificial data was comprehensively evaluated from verisimilitude, diversity and accuracy. Compared with the three artificial data methods and two data sampling methods, the proposed ERP-WGAN framework significantly improve the performance of both subject and general classifiers, especially the accuracy of general classifiers trained by less than 5 dimensional features is improved by 20-25%. Moreover, we evaluate the training sets performance with different mixing ratios of artificial and real samples. ERP-WGAN can reduced at least 73% of the real subject data and acquisition cost, which greatly saves the test cycle and research costShow more
Article
Publication:Journal of Neuroscience Methods, 376, 2022-07-01
SVAE-WGAN-Based Soft Sensor Data Supplement Method for Process Industry
Authors: Shiwei Gao, Sulong Qiu, Zhongyu Ma, Ran Tian, Yanxing Liu
Summary:Challenges of process industry, which is characterized as hugeness of process variables in complexity of industrial environment, can be tackled effectively by the use of soft sensor technology. However, how to supplement the dataset with effective data supplement method under harsh industrial environment is a key issue for the enhancement of prediction accuracy in soft-sensing model. Aimed at this problem, a SVAE-WGAN based soft sensor data supplement method is proposed for process industry. Firstly, deep features are extracted with the stacking of the variational autoencoder (SVAE). Secondly, a generation model is constructed with the combination of stacked variational autoencoder (SVAE) and Wasserstein generative adversarial network (WGAN). Thirdly, the proposed model is optimized with training of dataset in industrial process. Finally, the proposed model is evaluated with abundant experimental tests in terms of MSE, RMSE and MAE. It is shown in the results that the proposed SVAE-WGAN generation network is significantly better than that of the traditional VAE, GAN and WGAN generation network in case of industrial steam volume dataset. Specially, the proposed method is more effective than the latest reference VA-WGAN generation network in terms of RMSE, which is enhanced about 9.08% at most. Moreover, the prediction precision of soft sensors could be improved via the supplement of the training samplesShow more
Article, 2022
Publication:IEEE Sensors Journal, 22, 20220101, 601
Publisher:2022

<–—2022———2022———1460—


2022 Peer-reviewed
A CWGAN-GP-based multi-task learning model for consumer credit scoring
Authors: Yanzhe Kang, Liao Chen, Ning Jia, Wei Wei, Jiang Deng, Haizhang Qian
Summary:In consumer credit scoring practice, there is often an imbalanced distribution in accepted borrowers, which means there are far fewer defaulters than borrowers who pay on time. This makes it difficult for traditional models to function. Aside from traditional sampling methods for imbalanced data, the idea of using rejected information to one’s benefit is new. Without historical repayment performance, rejected data are often discarded or simply disposed of during credit scoring modeling. However, these data play an important role because they capture the distribution of the borrower population as well as the accepted data. Besides, due to the increasing complexity in loan businesses, the current methods have difficulties in addressing high-dimensional multi-source data. Thus, a more effective credit scoring approach towards imbalanced data should be studied. Inspired by the state-of-the-art neural network methods, in this paper, we propose a conditional Wasserstein generative adversarial network with a gradient penalty (CWGAN-GP)-based multi-task learning (MTL) model (CWGAN-GP-MTL) for consumer credit scoring. First, the CWGAN-GP model is employed to learn about the distribution of the borrower population given both accepted and rejected data. Then, the data distribution between good and bad borrowers is adjusted through augmenting synthetic bad data generated by CWGAN-GP. Next, we design an MTL framework for both accepted and rejected and good and bad data, which improves risk prediction ability through parameter sharing. The proposed model was evaluated on real-world consumer loan datasets from a Chinese financial technology company. The empirical results indicate that the proposed model performed better than baseline models across different evaluation metrics, demonstrating its promising application potentialShow more
Article
Publication:Expert Systems With Applications, 206, 2022-11-15


2022  Peer-reviewed
Multi-classification of arrhythmias using ResNet with CBAM on CWGAN-GP augmented ECG Gramian Angular Summation Field
Authors: Ke Ma, Chang'an A. Zhan, Feng Yang
Summary:Cardiovascular diseases are the leading cause of death globally. Arrhythmias are the most common symptoms and can cause sudden cardiac death. Accurate and reliable detection of arrhythmias from large amount of ECG signals remains a challenge. We here propose to use ResNet with convolutional block attention modules (CBAM-ResNet) to classify the major types of cardiac arrhythmias. To facilitate the classifier in extracting the rich information in the ECG signals, we transform the time series into Gramian angular summation field (GASF) images. In order to overcome the imbalanced data problem, we employ the conditional Wasserstein generative adversarial network with gradient penalty (CWGAN-GP) model to augment the minor categories. Tested using the MIT-BIH arrhythmia database, our method shows classification accuracy of 99.23%, average precision of 99.13%, sensitivity of 97.50%, specificity of 99.81% and the average F1 score of 98.29%. Compared with the performance of the state-of-the-art algorithms in the extant literature, our method is highest accuracy and specificity, comparable in precision, sensitivity and F1 score. These results suggest that transforming the ECG time series into GASF images is a valid approach to representing the rich ECG features for arrhythmia classification, and that CWGAN-GP based data augmentation provides effective solution to the imbalanced data problem and helps CBAM-ResNet to achieve excellent classification performanceShow more
Article
Publication:Biomedical Signal Processing and Control, 77, August 2022
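The Gramian angular summation field used here has a compact closed form; a minimal NumPy sketch of the generic GASF transform (not the paper's full pipeline):

import numpy as np

def gasf(series):
    # Rescale the series to [-1, 1], map to angles phi = arccos(x),
    # and return the Gramian angular summation field cos(phi_i + phi_j).
    x = np.asarray(series, dtype=float)
    x = 2.0 * (x - x.min()) / (x.max() - x.min() + 1e-12) - 1.0
    phi = np.arccos(np.clip(x, -1.0, 1.0))
    return np.cos(phi[:, None] + phi[None, :])

beat = np.sin(np.linspace(0, 2 * np.pi, 128))  # stand-in for one ECG segment
image = gasf(beat)                              # (128, 128) image fed to the CNN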


2022  Peer-reviewed
Integrated Model of ACWGAN-GP and Computer Vision for Breakout Prediction in Continuous Casting
Authors:Yanyu Wang; Xudong Wang; Man Yao
Summary:Accurate prediction of mold sticking breakout is an important prerequisite for stable and smooth operation of the continuous casting process. When sticking breakout occurs, the sticking region expands vertically along the casting direction and horizontally along the strand width direction, forming a V-shaped area on the strand surface. This paper uses computer vision technology to visualize the temperature of the mold copper plates, extract the geometric and movement characteristics of the sticking region from time and space perspectives, and construct feature vectors that characterize the V-shaped sticking breakout region. We train and test the auxiliary classifier WGAN-GP (ACWGAN-GP) model on true and false sticking feature vector samples, developing a breakout prediction method based on computer vision and a generative adversarial network. The test results show that the model can distinguish between true and false sticking breakouts from the mold copper plate temperature, providing a new approach for monitoring abnormalities in the continuous casting process.

Article, 2022
Publication:Metallurgical and Materials Transactions B, 53, 20220627, 2873
Publisher:2022


2022  Peer-reviewed
Bearing Remaining Useful Life Prediction Based on AdCNN and CWGAN under Few Samples
Authors:Junfeng Man; Minglei Zheng; Yi Liu; Yiping Shen; Qianqian Li
Summary:At present, deep learning is widely used to predict the remaining useful life (RUL) of rotating machinery in prognostics and health management (PHM). However, in actual manufacturing, massive rotating machinery data are not easily obtained, which degrades the prediction accuracy of data-driven deep learning methods. Firstly, a novel prognostic framework is proposed, comprising conditional Wasserstein distance-based generative adversarial networks (CWGAN) and adversarial convolutional neural networks (AdCNN), which can stably generate high-quality training samples to augment the bearing degradation dataset and address the few-sample problem. Then, bearing RUL prediction is realized by feeding the monitoring data into a one-dimensional convolutional neural network (1DCNN) for adversarial training. The reliability of the proposed method is verified on the bearing degradation dataset of the IEEE 2012 PHM data challenge. Finally, experimental results show that our approach outperforms others in RUL prediction in terms of average absolute deviation and average square root error.
Article, 2022
Publication:Shock and Vibration, 2022, 20220630
Publisher:2022

 
2022  Peer-reviewed
Gene-CWGAN: a data enhancement method for gene expression profile based on improved CWGAN-GP
Authors:Fei Han; Shaojun Zhu; Qinghua Ling; Henry Han; Hailong Li; Xinli Guo; Jiechuan Cao
Article, 2022
Publication:Neural Computing & Applications, 34, October 2022, 16325
Publisher:2022


2022


2022  New Arrhythmia Findings Has Been Reported by Investigators at Southern Medical University (Multi-classification of Arrhythmias Using Resnet With Cbam On Cwgan-gp Augmented Ecg Gramian Angular Summation Field)
Article, 2022
Publication:Obesity, Fitness & Wellness Week, August 6 2022, 2090
Publisher:2022

2022  Peer-reviewed
Corrigendum to “Multi-classification of arrhythmias using ResNet with CBAM on CWGAN-GP augmented ECG Angular Summation Field” [Biomed. Signal Process. Control 77 (2022) 103684]
Authors:Ke Ma; Chang'an A. Zhan; Feng Yang
Article
Publication:Biomedical Signal Processing and Control, 77, August 2022


2022  Peer-reviewed
Health evaluation methodology of remote maintenance control system of natural gas pipeline based on ACWGAN-GP algorithm
Authors:Shibin Zhang; Fei Song; Xiaojun Zhang; Lidong Jia; Wei Shi; Yueqiao Ai; Weichun Hei
Summary:The remote maintenance system of a natural gas pipeline is of great significance for ensuring the safe and stable operation of the pipeline network. A multi-classification method based on machine learning is more effective for the health evaluation of the remote maintenance control system than the traditional evaluation method based on expert experience. In view of the severe imbalance in the number of samples across the five health levels, a health evaluation methodology for the remote maintenance control system based on the Wasserstein distance and an auxiliary classifier generative adversarial network (ACWGAN-GP) is proposed. Firstly, model stability is improved by introducing the Wasserstein distance and a gradient penalty. The generator generates balanced data, while the discriminator trains on generated and actual data; in this way, several ACWGAN-GP sub-models are trained. Then, the health levels of the sub-models are obtained directly by using the discriminator to classify the samples. Finally, according to the hierarchical relationship of the system, a parallel-serial combined evaluation method is adopted. By this means, a health evaluation model of the remote maintenance control system composed of ACWGAN-GP sub-models is constructed. Experimental results on 13 KEEL and UCI multi-class imbalanced datasets and on actual sampled data show that the proposed method improves significantly in effectiveness over existing typical algorithms.
Article, 2022
Publication:Journal of Physics: Conference Series, 2247, 20220401
Publisher:2022


2022

Evaluation of Seismic Data Interpolation Performance Using U-Net and cWGAN [U-Net cWGAN 이용한 탄성파 탐사 자료 보간 성능 평가]. Authors:Jiyun Yu (유지윤); Daeung Yoon (윤대웅)
Summary (translated from Korean):When acquiring seismic survey data, part of the data may be lost, so data interpolation is essential. Machine-learning-based seismic data interpolation has recently been studied actively; in particular, CNN-based and GAN-based algorithms used for image super-resolution in image processing are also being applied to seismic data interpolation. In this study, to find an interpolation method that recovers lost seismic data with high accuracy, U-Net (a CNN-based algorithm) and cWGAN (conditional Wasserstein Generative Adversarial Network, a GAN-based algorithm) were used as seismic data interpolation models, and their performance was evaluated and compared. The prediction process was divided into Case I and Case II for model training and evaluation. In Case I, the models were trained only on data with 50% of traces regularly missing and were evaluated on six test data sets composed of combinations of regular/irregular sampling ratios. In Case II, a model was built for each of the six test data formats using data sampled in the same way, and these models were applied to the same test sets as in Case I for comparison. As a result, cWGAN showed higher prediction accuracy than U-Net and also achieved higher values of the quantitative metrics PSNR and SSIM. However, cWGAN produced additional noise in its predictions; an ensemble procedure was performed to remove this noise and improve accuracy. Ensembling the cWGAN models built in Case II successfully removed the noise and further improved PSNR and SSIM over the individual models.
Downloadable Article, 2022
Publication:지구물리와 물리탐사, 25, 20220831, 140
Publisher:2022


2022  Based on CWGAN Deep Learning Architecture to Predict Chronic Wound Depth Image. Authors:Chiun-Li Chin; Tzu-Yu Sun; Jun-Cheng Lin; Chieh-Yu Li; Yan-Ming Lai; Ting Chen. 2022 IEEE International Conference on Consumer Electronics - Taiwan
Summary:In order to observe the healing of a patient's wound, doctors detect the depth of the wound by inserting a cotton swab into its deepest part, a process that can cause the patient discomfort. Therefore, we propose the Chronic Wound Depth Generative Adversarial Network (CWGAN) to convert wound images into wound depth maps segmented into four types: shallow, semi-medium, medium, and deep. The accuracy of CWGAN reaches 84.8%. According to the experimental results, this method can accurately segment wounds of different depths in images and also reduces patients' pain.
Chapter, 2022
Publication:2022 IEEE International Conference on Consumer Electronics - Taiwan, 20220706, 275
Publisher:2022

<–—2022———2022———1470—


2022  Evaluation of Seismic Data Interpolation Performance Using U-Net and cWGAN [U-Net cWGAN 이용한 탄성파 탐사 자료 보간 성능 평가]. Authors:Jiyun Yu (유지윤); Daeung Yoon (윤대웅)
Summary (translated from Korean):When acquiring seismic survey data, part of the data may be lost, so data interpolation is essential. Machine-learning-based seismic data interpolation has recently been studied actively; in particular, in the field of image processing, image ...

 [Korean. 2022 Evaluation of Seismic Survey Data Interpolation Performance Using U-Net and cWGAN. Authors: Yoo Ji-yoon (Jiyun Yu), Yoon Dae-woong (Daeung Yoon). Summary: When acquiring seismic survey data, part of the data may be lost, and data interpolation is essential for this purpose. Recently, research on machine learning-based seismic data interpolation has been actively conducted, especially in the field of image processing.]


Perfomance comparison of CGANs and WGANs for crop disease image synthesis. Authors:Arsene Djatche; Achim Ibenthal; Cordula Reisch; Hochschule für Angewandte Wissenschaft und Kunst (Other)
Thesis, Dissertation, 2022
English
Publisher:2022

[CITATION] Perfomance Comparison of CGANs and WGANs for Crop Disease Image Synthesis

A Djatche - 2022 - HAWK Hochschule für angewandte …

2022 see 2021

Delon, Julie; Desolneux, Agnes; Salmona, Antoine

Gromov-Wasserstein distances between Gaussian distributions. (English) Zbl 07616207

J. Appl. Probab. 59, No. 4, 1178-1198 (2022).

MSC:  60E05 68T09 62H25 49Q22

PDF BibTeX XML Cite

Full Text: DOI 
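As background for this entry (the paper itself studies Gromov-Wasserstein distances between Gaussians, whose formulas are not reproduced here), the classical 2-Wasserstein distance between Gaussians has the well-known closed form W2^2(N(m1,S1), N(m2,S2)) = |m1-m2|^2 + tr(S1 + S2 - 2(S1^{1/2} S2 S1^{1/2})^{1/2}). A small NumPy/SciPy sketch of that formula, written for illustration only:

import numpy as np
from scipy.linalg import sqrtm

def w2_gaussians(m1, S1, m2, S2):
    # Bures term: tr(S1 + S2 - 2 (S1^{1/2} S2 S1^{1/2})^{1/2})
    root = sqrtm(S1)
    cross = sqrtm(root @ S2 @ root)
    bures = np.trace(S1 + S2 - 2.0 * np.real(cross))  # sqrtm may return tiny imaginary parts
    mean_part = np.sum((np.asarray(m1) - np.asarray(m2)) ** 2)
    return float(np.sqrt(mean_part + max(bures, 0.0)))

print(w2_gaussians(np.zeros(2), np.eye(2), np.ones(2), 2.0 * np.eye(2)))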


Wasserstein Generalization Bound for Few-Shot Learning

Vector Quantized Wasserstein Auto-Encoder | OpenReview

Nov 15, 2022

A Higher Precision Algorithm for Computing the $1

Nov 8, 2022

Edge Wasserstein Distance Loss for Oriented Object Detection

Nov 11, 2022

Deep Generative Wasserstein Gradient Flows | OpenReview

Nov 16, 2022

More results from openreview.net


Gromov–Wasserstein Distances and the Metric Approach to ...

https://media.adelaide.edu.au › acvt › Publications


by F Mémoli · 2011 · Cited by 372 — Keywords Gromov–Hausdorff distances · Gromov–Wasserstein distances · Data ... Lem2.(König's Lemma) Let E A × B be a closed set.

71 pages


2022


Wasserstein dropout | SpringerLink

https://link.springer.com › article


by J Sicking · 2022 — Wasserstein dropout (left column) employs sub-networks to model ... In B. Bonet, & S. Koenig (Eds.), Proceedings of the twenty-ninth AAAI ...



Global Wasserstein Margin maximization for boosting generalization in adversarial training

T Yu, S Wang, X Yu - Applied Intelligence, 2022 - Springer

… called Global Wasserstein Margin Maximization (… Wasserstein Margin in training process. 

Based on the work in [6], we design a conditional discriminator to measure the Wasserstein


[PDF] openreview.net

Wasserstein Threats

H Jin, Z Yu, X Zhang - Advances in Neural Information Processing Systems - openreview.net

… To address this issue, we propose measuring the perturbation with the orthogonal 

Gromov-Wasserstein discrepancy, and building its Fenchel biconjugate to facilitate convex …

All 3 versions


 [PDF] openreview.net

Meta-Learning without Data via Wasserstein Distributionally-Robust Model Fusion

…, X Wang, L Shen, Q Suo, K Song, D Yu… - The 38th Conference …, 2022 - openreview.net

… in various ways, including KL-divergence, Wasserstein ball, etc. DRO has been applied to 

many … This paper adopts the Wasserstein ball to characterize the task embedding uncertainty …

Cited by 5 Related articles All 3 versions

[PDF] openreview.net

Orthogonal Gromov Wasserstein Distance with Efficient Lower Bound

H Jin, Z Yu, X Zhang - The 38th Conference on Uncertainty in …, 2022 - openreview.net

… The Gromov-Wasserstein (GW) discrepancy formulates a … the orthogonal Gromov-Wasserstein 

(OGW) discrepancy that … It also directly extends to the fused Gromov-Wasserstein (FGW) …
<-—2022———2022———1480—



[PDF] ecva.net

Contrastive Prototypical Network with Wasserstein Confidence Penalty

H Wang, ZH Deng - European Conference on Computer Vision, 2022 - Springer

… To this end, we propose WassersteinWasserstein distance and introduce the semantic 

relationships with cost matrix. With semantic relationships as prior information, our Wasserstein

All 3 versions

 

[PDF] arxiv.org

Wasserstein Barycenter-based Model Fusion and Linear Mode Connectivity of Neural Networks

AK Akash, S Li, NG Trillos - arXiv preprint arXiv:2210.06671, 2022 - arxiv.org

… RNNs and LSTMs, we first discuss a Wasserstein barycenter (WB) based fusion algorithm … 

and the problem of computing Wasserstein (or Gromov-Wasserstein) barycenters, our aim is …

Cited by 1 Related articles All 2 versions


 arXiv:2211.12746  [pdf cs.CV
Completing point cloud from few points by Wasserstein GAN and Transformers
Authors: Xianfeng Wu, Jinhui Qian, Qing Wei, Xianzu Wu, Xinyi Liu, Luxin Hu, Yanli Gong, Zhongyuan Lai, Libing Wu
Abstract: In many vision and robotics applications, it is common that the captured objects are represented by very few points. Most of the existing completion methods are designed for partial point clouds with many points, and they perform poorly or even fail completely in the case of few points. However, due to the lack of detail information, completing objects from few points faces a huge challenge. Inspi…  More
Submitted 23 November, 2022; originally announced November 2022.

Euler and Betti curves are stable under Wasserstein deformations of...
by Perez, Daniel
11/2022
Euler and Betti curves of stochastic processes defined on a $d$-dimensional compact Riemannian manifold which are almost surely in a Sobolev space...
Journal Art

arXiv:2211.12384  [pdfpsother math.PR   math.AT
Euler and Betti curves are stable under Wasserstein deformations of distributions of stochastic processes
Authors: Daniel Perez
Abstract: Euler and Betti curves of stochastic processes defined on a d
-dimensional compact Riemannian manifold which are almost surely in a Sobolev space Wn,s (X,R)
 (with d<n
) are stable under perturbations of the distributions of said processes in a Wasserstein metric. Moreover, Wasserstein stability is shown to hold for all p>d
n   for persistence diagrams stemming from functions…  More
Submitted 22 November, 2022; originally announced November 2022.
Comments: 9 pages
MSC Class: 60G60; 62M40; 55N31

arXiv:2211.11891  [pdfother stat.ML   cs.LG
A Bi-level Nonlinear Eigenvector Algorithm for Wasserstein Discriminant Analysis
Authors: Dong Min Roh, Zhaojun Bai
Abstract: Much like the classical Fisher linear discriminant analysis, Wasserstein discriminant analysis (WDA) is a supervised linear dimensionality reduction method that seeks a projection matrix to maximize the dispersion of different data classes and minimize the dispersion of same data classes. However, in contrast, WDA can account for both global and local inter-connections between data classes using a…  More
Submitted 21 November, 2022; originally announced November 2022.

2022


Wasserstein bounds in CLT of approximative MCE and MLE of the drift parameter for Ornstein-Uhlenbeck...
by Es-Sebaiy, Khalifa; Alazemi, Fares; Al-Foraih, Mishari
11/2022
This paper deals with the rate of convergence for the central limit theorem of estimators of the drift coefficient, denoted $\theta$, for a Ornstein-Uhlenbeck...
Journal Article  Full Text Online

arXiv:2211.11566  [pdfpsother math.ST   math.PR
Wasserstein bounds in CLT of approximative MCE and MLE of the drift parameter for Ornstein-Uhlenbeck processes observed at high frequency
Authors: Khalifa Es-Sebaiy, Fares Alazemi, Mishari Al-Foraih
Abstract: This paper deals with the rate of convergence for the central limit theorem of estimators of the drift coefficient, denoted θ
, for a Ornstein-Uhlenbeck process $X \coloneqq \{X_t,t\geq0\}$ observed at high frequency. We provide an Approximate minimum contrast estimator and an approximate maximum likelihood estimator of θ
, namely…  More
Submitted 18 November, 2022; originally announced November 2022.
Comments: arXiv admin note: text overlap with arXiv:2102.04810

arXiv:2211.11137  [pdfother cs.CV
Long Range Constraints for Neural Texture Synthesis Using Sliced Wasserstein Loss
Authors: Liping Yin, Albert Chua
Abstract: In the past decade, exemplar-based texture synthesis algorithms have seen strong gains in performance by matching statistics of deep convolutional neural networks. However, these algorithms require regularization terms or user-added spatial tags to capture long range constraints in images. Having access to a user-added spatial tag for all situations is not always feasible, and regularization terms…  More
Submitted 20 November, 2022; originally announced November 2022.
Comments: Submitted to IEEE for possible publication
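Since several entries here (the texture-synthesis loss above, the hyperbolic variant below) build on sliced Wasserstein distances, a generic Monte-Carlo estimator is sketched for orientation. This is my own minimal version under stated assumptions (NumPy, equal-sized point sets, n_proj random projections), not any paper's loss.

import numpy as np

def sliced_wasserstein(X, Y, n_proj=64, p=2, seed=0):
    # X, Y: (n, d) arrays with the same number of points
    rng = np.random.default_rng(seed)
    theta = rng.normal(size=(n_proj, X.shape[1]))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)  # directions on the unit sphere
    # Project onto each direction, sort, and use the closed-form 1-D Wasserstein distance
    xp = np.sort(X @ theta.T, axis=0)
    yp = np.sort(Y @ theta.T, axis=0)
    return float(np.mean(np.abs(xp - yp) ** p) ** (1.0 / p))

X = np.random.randn(500, 3)
Y = np.random.randn(500, 3) + 1.0
print(sliced_wasserstein(X, Y))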

arXiv:2211.10066  [pdfother cs.LG   stat.ME  stat.ML
Hyperbolic Sliced-Wasserstein via Geodesic and Horospherical Projections
Authors: Clément Bonet, Laetitia Chapel, Lucas Drumetz, Nicolas Courty
Abstract: It has been shown beneficial for many types of data which present an underlying hierarchical structure to be embedded in hyperbolic spaces. Consequently, many tools of machine learning were extended to such spaces, but only few discrepancies to compare probability distributions defined over those spaces exist. Among the possible candidates, optimal transport distances are well defined on such Riem…  More
Submitted 18 November, 2022; originally announced November 2022.


Lung image segmentation based on DRD U-Net and combined WGAN with Deep...
by Lian, Luoyu; Luo, Xin; Pan, Canyu ; More...
Computer methods and programs in biomedicine, 11/2022, Volume 226
PURPOSECOVID-19 is a hot issue right now, and it's causing a huge number of infections in people, posing a grave threat to human life. Deep learning-based...
Article PDFPDF
Journal Article  Full Text Online
View in Context Browse Journal
Open Access


Application of WGAN in financial time series generation compared with RNN
by Liao, Qingyao; Lu, Yuan; Luo, Yinghao ; More...
11/2022
This paper discusses WGAN, an important variant of the GAN model, and applies it to the generation of financial asset time series. Both WGAN and RNN can be...
Conference Proceeding  Full Text Online

<-—2022———2022———1490—



WGAN Approach to Synthetic TBM Data Generation
by Unterlass, Paul J.; Erharter, Georg H.; Sapronova, Alla ; More...
Trends on Construction in the Digital Era, 11/2022
In this work we propose a generative adversarial network (GAN) based approach of generating synthetic geotechnical data for further applications in research...
Book ChapterCitation Online

2022 patent
基于改进WGAN-GP的半监督恶意流量检测方法
11/2022
Patent  Available Online
Open Access
 [Chinese  Semi-supervised malicious traffic detection method based on improved WGAN-GP]


2022 patent
基于WGAN-CNN煤矿井下粉尘浓度预测方法和系统
11/2022
Patent  Available Online
 [Chinese  Prediction method and system of underground dust concentration in coal mine based on WGAN-CNN]
 


——————————————





2022 patent news
Beijing Industrial Univ Seeks Patent for Data Depth Enhancement Method Based on WGAN-GP Data Generation and Poisson Fusion
Global IP News. Information Technology Patent News, Nov 21, 2022
Newspaper Article  Full Text Online
2022  Wire Feed patent news

Beijing Industrial Univ Seeks Patent for Data Depth Enhancement Method Based on WGAN-GP Data Generation and Poisson Fusion

Global IP News. Information Technology Patent News; New Delhi [New Delhi]. 21 Nov 2022. 


 2022 patent news
Jiangsu Chegah Energy Tech Submits Chinese Patent Application for Power System Harmonic Law Calculation Method Based on WGAN
Global IP News. Electrical Patent News, Nov 19, 2022
Newspaper Article  Full Text Online
2022 Wire Feed patent news

Jiangsu Chegah Energy Tech Submits Chinese Patent Application for Power System Harmonic Law Calculation Method Based on WGAN

Global IP News. Electrical Patent News; New Delhi [New Delhi]. 19 Nov 2022. 




2022 patent news
State Intellectual Property Office of China Receives Three Gorges Univ's Patent Application for Microseism Record Denoising Method Based on Improved WGAN...
Global IP News. Broadband and Wireless Network News, Nov 21, 2022
Newspaper Article  Full Text Online

On isometries of compact L .sup.p--Wasserstein spaces
by Santos-Rodríguez, Jaime
Advances in mathematics (New York. 1965), 11/2022, Volume 409
Keywords Wasserstein distance; Isometry group Let (X,d,m) be a compact non-branching metric measure space equipped with a qualitatively non-degenerate measure...
Article PDFPDF
Journal Article  Full Text Online
View in Context Browse Journal

2022 see 2021
Distributionally robust chance-constrained programs with right-hand side uncertainty under Wasserstein...
by Ho-Nguyen, Nam; Kılınç-Karzan, Fatma; Küçükyavuz, Simge ; More...
Mathematical programming, 2022, Volume 196, Issue 1-2
We consider exact deterministic mixed-integer programming (MIP) reformulations of distributionally robust chance-constrained programs (DR-CCP) with random...
Article PDFPDF
Journal Article  Full Text Online
View in Context Browse Journal

2023


Ensemble data assimilation using optimal control in the Wasserstein metric
by Liu, Xin; Frank, Jason
Journal of computational science, 11/2022, Volume 65
An ensemble data assimilation method is proposed that is based on optimal control minimizing the cost of mismatch in the Wasserstein metric on the observation...
Article PDFPDF
Journal ArticleCitation Online


Wasserstein gradient flows policy optimization via input convex neural...
by Wang, Yixuan
11/2022
Reinforcement learning (RL) is a widely used learning paradigm today. As a common RL method, policy optimization usually updates parameters by maximizing the...
Conference Proceeding  Full Text Online

<-—2022———2022———1500—



Handwriting Recognition Using Wasserstein Metric in Adversarial Learning
by Jangpangi, Monica; Kumar, Sudhanshu; Bhardwaj, Diwakar ; More...
SN computer science, 11/2022, Volume 4, Issue 1
Deep intelligence provides a great way to deal with understanding the complex handwriting of the user. Handwriting is challenging due to its irregular shapes,...
Journal ArticleCitation Online

Fault detection and diagnosis for liquid rocket engines with sample imbalance based on Wasserstein...
by Deng, Lingzhi; Cheng, Yuqiang; Yang, Shuming ; More...
Proceedings of the Institution of Mechanical Engineers. Part G, Journal of aerospace engineering, 11/2022
The reliability of liquid rocket engines (LREs), which are the main propulsion device of launch vehicles, cannot be overemphasised. The development of fault...
Article PDFPDF
Journal Article  Full Text Online
View in Context Browse Jou


Efficient Convex PCA with applications to Wasserstein geodesic PCA and ranked...
by Campbell, Steven; Wong, Ting-Kam Leonard
11/2022
Convex PCA, which was introduced by Bigot et al., is a dimension reduction methodology for data with values in a convex subset of a Hilbert space. This setting...
Journal Article  Full Text Online
Efficient Convex PCA with applications to Wasserstein geodesic PCA and ranked...
by Campbell, Steven; Ting-Kam, Leonard Wong
arXiv.org, 11/2022
Convex PCA, which was introduced by Bigot et al., is a dimension reduction methodology for data with values in a convex subset of a Hilbert space. This setting...
Paper  Full Text Onlin


Contrastive Prototypical Network with Wasserstein Confidence Penalty
by Wang, Haoqing; Deng, Zhi-Hong
Computer Vision – ECCV 2022, 11/2022
Unsupervised few-shot learning aims to learn the inductive bias from unlabeled dataset for solving the novel few-shot tasks. The existing unsupervised few-shot...
Book Chapter  Full Text Online


A New Family of Dual-norm regularized \(p\)-Wasserstein Metrics
by Manupriya, Piyushi; Nath, J Saketha; Jawanpuria, Pratik
arXiv.org, 11/2022
We develop a novel family of metrics over measures, using \(p\)-Wasserstein style optimal transport (OT) formulation with dual-norm based regularized marginal...
Paper  Full Text Online

 2022


Wasserstein gradient flows policy optimization via input convex neural networks
by Wang, Yixuan
11/2022
Reinforcement learning (RL) is a widely used learning paradigm today. As a common RL method, policy optimization usually updates parameters by maximizing the...
Conference Proceeding  Full Text Online

Related articles All 3 versions

2022 thesis

MR4511551 Thesis Yenisey, Mehmet; The Metric Geometry 

Nature of Wasserstein Spaces of Probability Measures: On the Point of View of Submetry Projections. Thesis (Ph.D.)–University of Kansas. 2022. 68 pp. ISBN: 979-8352-92325-2, ProQuest LLC

Review PDF Clipboard Series Thesis

 Cited by 2 Related articles All 4 versions

2022 see 2021

MR4509072 Prelim Liao, Zhong-Wei; Ma, Yutao; Xia, Aihua; 

On Stein's Factors for Poisson Approximation in Wasserstein Distance with Nonlinear Transportation Costs. J. Theoret. Probab. 35 (2022), no. 4, 2383–2412. 60 (49 62)

Review PDF Clipboard Journal Article


2022

Vo Nguyen Le Duy (RIKEN-AIP) - Exact Statistical Inference ...

Vo Nguyen Le Duy (RIKEN-AIP) - Exact Statistical Inference for the Wasserstein Distance by ...

www.youtube.com › watch

... Wasserstein Distance by Selective Inference Abstract The Wasserstein distance (WD), ... e.g., in the form of confidence interval (CI).

YouTube · Center for Intelligent Systems CIS EPFL · 

Nov 16, 2022



  Arrhythmia Detection Based on WGAN-GP and SE-ResNet1D

Qin, J; Gao, FJ; (...); Ji, CQ

Nov 2022 | 

ELECTRONICS

 11 (21)

Enriched Cited References

A WGAN-GP-based ECG signal expansion and an SE-ResNet1D-based ECG classification method are proposed to address the problem of poor modeling results due to the imbalanced sample distribution of ECG data sets. The network architectures of WGAN-GP and SE-ResNet1D are designed according to the characteristics of ECG signals so that they can be better applied to the generation and classification of

Free Full Text from Publisher

44 References  Related records




<-—2022———2022———1510—


Scholarly Journal

Super-resolution of Sentinel-2 images using Wasserstein GAN

Latif, Hasan; Ghuffar, Sajid; Hafiz Mughees Ahmad.

Remote Sensing Letters; Abingdon Vol. 13, Iss. 12,  (Dec 2022): 1194-1202.



Scholarly Journal

Gromov–Wasserstein distances between Gaussian distributions

Delon, Julie; Desolneux, Agnes; Salmona, Antoine.

Journal of Applied Probability; Sheffield Vol. 59, Iss. 4,  (Dec 2022): 1178-1198.


2022 see 2021  Scholarly Journal

Rate of convergence for particle approximation of PDEs in Wasserstein space

Germain, Maximilien; Pham, Huyên; Warin, Xavier.

Journal of Applied Probability; Sheffield Vol. 59, Iss. 4,  (Dec 2022): 992-1008.


Working Paper

Completing point cloud from few points by Wasserstein GAN and Transformers

Wu, Xianfeng; Qian, Jinhui; Wei, Qing; Wu, Xianzu; Liu, Xinyi; et al.

arXiv.org; Ithaca, Nov 23, 2022.



Xerra (@Xerra_EO) / Twitter

Semi-supervised Conditional Density Estimation with Wasserstein Laplacian ... A new open-source tool created by #NASA's Alaska Satellite Facility (ASF) ...

Twitter · 

Apr 20, 2022


2022


Earth Mover's Distance and Maximum Mean Discrepancy

Introduction to the Wasserstein distance ... Regularized Wasserstein Distances & Minimum Kantorovich Estimators. ... NASA Night Sky Network.

YouTube · Krishnaswamy Lab · Jan 13, 2022
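As a pointer for readers of entries like this one: in one dimension the earth mover's (1-Wasserstein) distance reduces to comparing quantile functions and is available directly in SciPy. A quick, self-contained illustration on synthetic samples (my own example, unrelated to the video):

import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
u = rng.normal(0.0, 1.0, 1000)
v = rng.normal(0.5, 1.0, 1000)
print(wasserstein_distance(u, v))  # close to 0.5, the shift between the two means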


Generative Data Augmentation via Wasserstein Autoencoder for Text Classification

Kyohoon Jin; Junho Lee; Juhwan Choi; Soojin Jang; Youngbin Kim

2022 13th International Conference on Information and Communication Technology Convergence (ICTC)

Year: 2022 | Conference Paper | Publisher: IEEE

Wasserstein Distance-based Distributionally Robust Chance-constrained Clustered Generation Expansion Planning Considering Flexible Resource Investments

Baorui Chen; Tianqi Liu; Xuan Liu; Chuan He; Lu Nan; Lei Wu; Xueneng Su; Jian Zhang

IEEE Transactions on Power Systems

Year: 2022 | Early Access Article | Publisher: IEEE

Renewable Energy Scenario Generation Method Based on Order-Preserving Wasserstein Distance

Hang Zhou; Zhihang Mao; Yi Gao; Shuai Luo; Yingyun Sun

2022 IEEE/IAS Industrial and Commercial Power System Asia (I&CPS Asia)

Year: 2022 | Conference Paper | Publisher: IEEE


On a Linear Gromov–Wasserstein Distance

Florian Beier; Robert Beinert; Gabriele Steidl

IEEE Transactions on Image Processing

Year: 2022 | Volume: 31 | Journal Article | Publisher: IEEE

Abstract  HTML

Cited by 4 Related articles All 6 versions

<-—2022———2022———1520—



Semi-supervised Malicious Traffic Detection with Improved Wasserstein Generative Adversarial Network with Gradient Penalty

Jiafeng Wang; Ming Liu; Xiaokang Yin; Yuhao Zhao; Shengli Liu

2022 IEEE 6th Advanced Information Technology, Electronic and Automation Control Conference (IAEAC )

Year: 2022 | Conference Paper | Publisher: IEEE

Abstract  HTML


Bartolomeo Stellato (@b_stellato) / Twitter

twitter.com › b_stellato

Princeton, NJ stellato.io Joined October 2009 ... We derive theoretical bounds on how to adjust the Wasserstein ball radius to compensate for clustering.

Twitter · 

Mar 21, 2022 


Chapter 5. Wasserstein GAN - GANs in Action video ... - O'Reilly

www.oreilly.com › library › view › gans-in-action


Get GANs in Action video edition now with the O'Reilly learning platform. O'Reilly members experience live online training, plus books, videos, ...

O'Reilly · Jakub Langr ·

Nov 7, 2022

  
The Wasserstein-Martingale projection of a Brownian motion ...

www.imsi.institute › Videos


The Wasserstein-Martingale projection of a Brownian motion given initial and terminal ... Monday, May 16, 2022 ... 1155 E. 60th Street, Chicago, IL 60637.

Institute for Mathematical and Statistical Innovation · 

May 16, 2022


arXiv:2211.14923  [pdfother cs.CL   cs.AI
Unsupervised Opinion Summarisation in the Wasserstein Space
Authors: Jiayu Song, Iman Munire Bilal, Adam Tsakalidis, Rob Procter, Maria Liakata
Abstract: Opinion summarisation synthesises opinions expressed in a group of documents discussing the same topic to produce a single summary. Recent work has looked at opinion summarisation of clusters of social media posts. Such posts are noisy and have unpredictable structure, posing additional challenges for the construction of the summary distribution and the preservation of meaning compared to online r…  More
Submitted 27 November, 2022; originally announced November 2022.

2022


arXiv:2211.14881  [pdfpsother math.OC
An Efficient HPR Algorithm for the Wasserstein Barycenter Problem with O(Dim(P)/ε) Computational Complexity
Authors: Guojun Zhang, Yancheng Yuan, Defeng Sun
Abstract: In this paper, we propose and analyze an efficient Halpern-Peaceman-Rachford (HPR) algorithm for solving the Wasserstein barycenter problem (WBP) with fixed supports. While the Peaceman-Rachford (PR) splitting method itself may not be convergent for solving the WBP, the HPR algorithm can achieve an O(1/ε) non-ergodic iteration complexity with respect to the Karush-Kuhn-Tucker (KKT) res…  More
Submitted 27 November, 2022; originally announced November 2022.
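Unrelated to the HPR algorithm itself, but useful as a sanity check for any barycenter solver: in one dimension the 2-Wasserstein barycenter of empirical measures is obtained by averaging their quantile functions. A minimal NumPy sketch under that assumption (my own illustration):

import numpy as np

def wasserstein_barycenter_1d(samples, weights=None, grid_size=200):
    # samples: list of 1-D arrays; returns the barycenter's quantile function on a uniform grid
    k = len(samples)
    w = np.full(k, 1.0 / k) if weights is None else np.asarray(weights, dtype=float)
    qs = np.linspace(0.0, 1.0, grid_size)
    quantiles = np.stack([np.quantile(s, qs) for s in samples])  # shape (k, grid_size)
    return w @ quantiles

rng = np.random.default_rng(1)
bary = wasserstein_barycenter_1d([rng.normal(-2, 1, 500), rng.normal(3, 1, 500)])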

arXiv:2211.13386  [pdfpsother math.OC
A Riemannian exponential augmented Lagrangian method for computing the projection robust Wasserstein distance
Authors: Bo Jiang, Ya-Feng Liu
Abstract: Projecting the distance measures onto a low-dimensional space is an efficient way of mitigating the curse of dimensionality in the classical Wasserstein distance using optimal transport. The obtained maximized distance is referred to as projection robust Wasserstein (PRW) distance. In this paper, we equivalently reformulate the computation of the PRW distance as an optimization problem over the Ca…  More
Submitted 23 November, 2022; originally announced November 2022.
Comments: 25 pages, 20 figures, 4 tables
MSC Class: 65K10; 90C26; 90C47


2022 see 2021

Liao, Zhong-Wei; Ma, Yutao; Xia, Aihua

On Stein’s factors for Poisson approximation in Wasserstein distance with nonlinear transportation costs. (English) Zbl 07621015

J. Theor. Probab. 35, No. 4, 2383-2412 (2022).

MSC:  60F05 60E15 60J27

PDF BibTeX XML Cite

Full Text: DOI 

Zbl 07621015

The Wasserstein-Martingale projection of a B.M. for ... - YouTube

www.youtube.com › watch
Julio Backhoff-Veraguas (University of Vienna, Austria). The Wasserstein-Martingale projection of a Brownian motion given initial and terminal ...

YouTube · SPChile CL · 4 days ago

Nov 26, 2022

 

IFAC_Control - Twitter

twitter.com › ifac_control

Regular/Invited Papers deadline is Novenber 11, 2022 (firm deadline, ... Wasserstein attraction flows for dynamic mass transport over networks'.

Twitter · 2 weeks ago

Nov 16, 2022

<-—2022———2022———1530—



PhD & Postdoc Program - ELLIS Society

ellis.eu › phd-postdoc
ELLIS Doctoral Symposium 2022: 150 PhD students and leading AI ... (IST Austria) ... Wasserstein gradient flows of various metrics applied to the train.

ELLIS Society · ELLIS · 1 month ago

OCT 30, 2022

   

 

Frank Hutter on Twitter: "TabPFN is radically different from ...

mobile.twitter.com › FrankRHutter › status


10:52 AM ·.·Twitter Web App ... Gifford and Jaakkola on learning Wasserstein flows is one of my all time favorite on Optimal ...

 Oct 21, 2022


2022 grant

Nonlocal Reaction-Diffusion Equations and Wasserstein Gradient Flows

Award Number:2204722; Principal Investigator:Olga Turanova; Co-Principal Investigator:; Organization:Michigan State University;NSF Organization:DMS Start Date:08/01/2022; Award Amount:$230,826.00; Relevance:75.0;


Rohan Anil on Twitter: "Today, we present our paper on ...

twitter.com › _arohan_ › status

5:30 PM · Twitter Web App ... but this paper of Hashimoto, Gifford and Jaakkola on learning Wasserstein flows is one of my all ...

Twitter · 4 days ago

 Sep 23, 2022 


 [PDF] eurasip.org

Auto-weighted Sequential Wasserstein Distance and Application to Sequence Matching

M Horie, H Kasai - 2022 30th European Signal Processing …, 2022 - ieeexplore.ieee.org

… start-end matching points. This paper presents a proposal of a shapeaware Wasserstein 

distance … that the sequence matching method using our proposed Wasserstein distance robustly …

All 2 versions


2022



2022

Welcome to IPSN 2022 - ACM

ipsn.acm.org › 2022


The International Conference on Information Processing in Sensor Networks (IPSN) is a leading annual forum on research in networked sensing and control, ...

IPSN · IPSN YouTube · 

Aug 17, 2022


2022

Face super-resolution using WGANs - YouTube

www.youtube.com › watch

Face super-resolution using WGANs. 230 views ... THE 2022 OPPENHEIMER LECTURE: THE QUANTUM ORIGINS OF GRAVITY. UC Berkeley Events.

YouTube · Zhimin Chen · 

May 5, 2017

 

2022

Jiqing Wu | Papers With Code

paperswithcode.com › author › jiqing-wu

1 code implementation • 1 Mar 2022 • Jiqing Wu, Inti Zlobec, Maxime Lafarge, ... among which the family of Wasserstein GANs (WGANs) is considered to be ...

Papers With Code · cantabilewq · 


  2022

EASI-STRESS on Twitter: "Part 2 of the EASI-STRESS ...

twitter.com › EASI_STRESS › status

8:12 AM · Mar 29, 2022 · Twitter Web App ... optimal transport along with new JAX software for (Euclidean) Wasserstein-2 OT! https://arxiv.org/abs/2210.12153 ...

Twitter · 1 month ago

Mar 29, 2022 


Conditional Wasserstein Generator.

Kim, Young-Geun; Lee, Kyungbok; and Paik, Myunghee Cho

2022-nov-10 | 

IEEE transactions on pattern analysis and machine intelligence

 PP

The statistical distance of conditional distributions is an essential element of generating target data given some data as in video prediction. We establish how the statistical distances between two joint distributions are related to those between two conditional distributions for three popular statistical distances: f-divergence, Wasserstein distance, and integral probability metrics. Such cha

Full Text at Publisher

Cited by 1 Related articles All 4 versions

<-—2022———2022———1540—

2022 patent

Method for determining relevance of object, involves calculating Wasserstein distance between detection object and tracking object according to Gaussian distribution of detection object and Gaussian distribution of tracking object

CN115273019-A

Inventor(s) LIN J

Assignee(s) JIUSHI ZHIXING BEIJING TECHNOLOGY CO LTD

Derwent Primary Accession Number 

2022-D93909

 

  

2022 patent

Method for upsampling network on point cloud, involves constructing loss function of point cloud upsampling network based on Wasserstein distance, and obtaining point cloud upsampling network by training sparse point cloud training data and dense point cloud training data

CN115294179-A

Inventor(s) LUO ZLI Z and LEI N

Assignee(s) UNIV DALIAN TECHNOLOGY

Derwent Primary Accession Number 

2022-D9871X



2022  patent

Method for removing artifacts from extremely sparse angular computed tomography reconstruction, involves sending image set as input to UNet generator network trained based on WGAN-GP generative adversarial network combined with transformers block

CN115239588-A

Inventor(s) QIN YJIANG W; (...); DI J

Assignee(s) UNIV GUANGDONG TECHNOLOGY

Derwent Primary Accession Number 

2022-D5413F



2022 see 2021

Optimal continuous-singular control of stochastic McKean-Vlasov system in Wasserstein space of probability measures

Boukaf, S; Guenane, L and Hafayed, M

2022 | 

INTERNATIONAL JOURNAL OF DYNAMICAL SYSTEMS AND DIFFERENTIAL EQUATIONS

 12 (4) , pp.301-315

In this paper, we study the local form of maximum principle for optimal stochastic continuous-singular control of nonlinear Ito stochastic differential equation of McKean-Vlasov type, with incomplete information. The coefficients of the system are nonlinear and depend on the state process as well as its probability law. The control variable is allowed to enter into both drift and diffusion coef

View full text

31 References  Related records

 Zbl 07632182


Existence and stability results for an isoperimetric problem with a non-local interaction of Wasserstein type

Candau-Tilh, J and Goldman, M

Jun 14 2022 | 

ESAIM-CONTROL OPTIMISATION AND CALCULUS OF VARIATIONS

 28

The aim of this paper is to prove the existence of minimizers for a variational problem involving the minimization under volume constraint of the sum of the perimeter and a non-local energy of Wasserstein type. This extends previous partial results to the full range of parameters. We also show that in the regime where the perimeter is dominant, the energy is uniquely minimized by balls.

Free Full Text From Publisher · References

Related records


2022


Wasserstein generative adversarial network-based approach for real-time track irregularity estimation using vehicle dynamic responses

Yuan, ZD; Luo, J; (...); Zhai, WM

Dec 2 2022 | Nov 2021 (Early Access) | 

VEHICLE SYSTEM DYNAMICS

 60 (12) , pp.4186-4205

Accurate and timely estimation of track irregularities is the foundation for predictive maintenance and high-fidelity dynamics simulation of the railway system. Therefore, it's of great interest to devise a real-time track irregularity estimation method based on dynamic responses of the in-service train. In this paper, a Wasserstein generative adversarial network (WGAN)-based framework is devel

View full text

27 References  Related records


2022

Fault detection and diagnosis for liquid rocket engines with sample imbalance based on Wasserstein generative adversarial nets and multilayer perceptron

Deng, LZ; Cheng, YQ; (...); Shi, YH

Nov 2022 (Early Access) | 

PROCEEDINGS OF THE INSTITUTION OF MECHANICAL ENGINEERS PART G-JOURNAL OF AEROSPACE ENGINEERING

The reliability of liquid rocket engines (LREs), which are the main propulsion device of launch vehicles, cannot be overemphasised. The development of fault detection and diagnosis (FDD) technology for LREs can effectively improve the safety and reliability of launch vehicles, which has important theoretical and engineering significance. With the rapid development of artificial intelligence tec

Full Text at Publisher

30 References  Related records



Working Paper

WASCO: A Wasserstein-based statistical tool to compare conformational ensembles of intrinsically disordered proteins

Gonzalez-Delgado, Javier; Sagar, Amin; Zanon, Christophe; Lindorff-Larsen, Kresten; Bernado, Pau; et al.

bioRxiv; Cold Spring Harbor, Dec 2, 2022.


Cited by 6 Related articles All 16 versions

2022 see 2021, 2020  Working Paper

Wasserstein Stability for Persistence Diagrams

Skraba, Primoz; Turner, Katharine.

arXiv.org; Ithaca, Nov 29, 2022.


2022 see 2021  Working Paper

Gradient Flows for Frame Potentials on the Wasserstein Space

Wickman, Clare; Okoudjou, Kasso.

arXiv.org; Ithaca, Nov 29, 2022.


<-—2022———2022———1550—



Working Paper

Unsupervised Opinion Summarisation in the Wasserstein Space

Song, Jiayu; Bilal, Iman Munire; Tsakalidis, Adam; Procter, Rob; Liakata, Maria.

arXiv.org; Ithaca, Nov 27, 2022.


Working Paper

An Efficient HPR Algorithm for the Wasserstein Barycenter Problem with Computational Complexity

Zhang, Guojun; Yancheng Yuan; Sun, Defeng.

arXiv.org; Ithaca, Nov 27, 2022.

An Efficient HPR Algorithm for the Wasserstein Barycenter Problem with...

by Zhang, Guojun; Yuan, Yancheng; Sun, Defeng
11/2022
In this paper, we propose and analyze an efficient Halpern-Peaceman-Rachford (HPR) algorithm for solving the Wasserstein barycenter problem (WBP) with fixed...
Journal Article  Full Text Online


Working Paper

A Riemannian exponential augmented Lagrangian method for computing the projection robust Wasserstein distance

Jiang, Bo; Ya-Feng, Liu.

arXiv.org; Ithaca, Nov 24, 2022.



Working Paper

Euler and Betti curves are stable under Wasserstein deformations of distributions of stochastic processes

Perez, Daniel.

arXiv.org; Ithaca, Nov 22, 2022.


Working Paper

A Bi-level Nonlinear Eigenvector Algorithm for Wasserstein Discriminant Analysis

Roh, Dong Min; Bai, Zhaojun.

arXiv.org; Ithaca, Nov 21, 2022.


  2022


Working Paper

Long Range Constraints for Neural Texture Synthesis Using Sliced Wasserstein Loss

Yin, Liping; Chua, Albert.

arXiv.org; Ithaca, Nov 21, 2022.



Working Paper

On Approximations of Data-Driven Chance Constrained Programs over Wasserstein Balls

Chen, Zhi; Kuhn, Daniel; Wiesemann, Wolfram.

arXiv.org; Ithaca, Nov 20, 2022.


Working Paper

Multi-marginal Approximation of the Linear Gromov-Wasserstein Distance

Beier, Florian; Beinert, Robert.

arXiv.org; Ithaca, Nov 15, 2022.


Working Paper

Two-Stage Distributionally Robust Conic Linear Programming over 1-Wasserstein Balls

Byeon, Geunyeong; Kim, Kibaek.

arXiv.org; Ithaca, Nov 10, 2022.



arXiv:2212.01310  [pdfother]  cs.LG   stat.ML
Gaussian Process regression over discrete probability measures: on the non-stationarity relation between Euclidean and Wasserstein Squared Exponential Kernels
Authors: Antonio Candelieri, Andrea Ponti, Francesco Archetti
Abstract: Gaussian Process regression is a kernel method successfully adopted in many real-life applications. Recently, there is a growing interest on extending this method to non-Euclidean input spaces, like the one considered in this paper, consisting of probability measures. Although a Positive Definite kernel can be defined by using a suitable distance -- the Wasserstein distance -- the common procedure… 
More
Submitted 2 December, 2022; originally announced December 2022.
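To make the kernel in this abstract concrete, here is a minimal sketch of a squared exponential kernel whose distance is the 1-D Wasserstein distance between empirical measures. The assumptions are mine (SciPy's closed-form 1-D distance, a single lengthscale), and, as the abstract itself cautions, positive definiteness of such constructions is not automatic.

import numpy as np
from scipy.stats import wasserstein_distance

def wasserstein_se_gram(measures_a, measures_b, lengthscale=1.0):
    # k(mu, nu) = exp(-W1(mu, nu)^2 / (2 * lengthscale^2)) over lists of 1-D samples
    K = np.empty((len(measures_a), len(measures_b)))
    for i, mu in enumerate(measures_a):
        for j, nu in enumerate(measures_b):
            K[i, j] = np.exp(-wasserstein_distance(mu, nu) ** 2 / (2.0 * lengthscale ** 2))
    return K

rng = np.random.default_rng(2)
measures = [rng.normal(m, 1.0, 200) for m in (0.0, 0.5, 3.0)]
print(wasserstein_se_gram(measures, measures))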

<-—2022———2022———1560—



Wasserstein distributional harvesting for highly dense 3D point clouds
by Shu, Dong Wook; Park, Sung Woo; Kwon, Junseok
Pattern recognition, 12/2022, Volume 132
•Our method outputs the surface distributions and samples an arbitrary number of 3D points.•Our stochastic instance normalization transfers the implicit...
Article PDFPDF
Journal Article  Full Text Online
View in Context Browse Journal

 
On Stein’s Factors for Poisson Approximation in Wasserstein Distance...
by Liao, Zhong-Wei; Ma, Yutao; Xia, Aihua
Journal of theoretical probability, 2022, Volume 35, Issue 4
We establish various bounds on the solutions to a Stein equation for Poisson approximation in the Wasserstein distance with nonlinear transportation costs. The...
Article PDFPDF
Journal Article  Full Text Online


Dual Wasserstein generative adversarial network condition; a generative...
by Wang Zixu; Wang Shoudong; Zhou Chen ; More...
Geophysics, 12/2022, Volume 87, Issue 6
Deep learning neural networks offer some advantages over conventional methods in acoustic impedance inversion. Because labeled data may be difficult to obtain...
Journal Article  Full Text Online

 
2022 see 2021
A Bismut–Elworthy inequality for a Wasserstein diffusion on the circle
by Marx, Victor
Stochastic partial differential equations : analysis and computations, 

2022, Volume 10, Issue 4
We introduce in this paper a strategy to prove gradient estimates for some infinite-dimensional diffusions on L 2 -Wasserstein spaces. For a specific example...
Article PDFPDF
Journal Article  Full Text Online



patent
Zhou Kou Teaching Univ Submits Patent Application for Distribution Robust Optimization Method Based on Wasserstein...
Global IP News. Measurement & Testing Patent News, Nov 26, 2022
Newspaper ArticleCitation Online


2022



On Stein’s Factors for Poisson Approximation in Wasserstein Distance with Nonlinear Transportation...
by Liao, Zhong-WeiMa, YutaoXia, Aihua
Journal of theoretical probability, 2022, Volume 35, Issue 4
We establish various bounds on the solutions to a Stein equation for Poisson approximation in the Wasserstein distance with nonlinear transportation costs. The...
ArticleView Article PDF
Journal Article  Full Text Online
View Complete Issue Browse Now

 
Dual Wasserstein generative adversarial network condition; a generative adversarial network-based...
by Wang ZixuWang ShoudongZhou Chen ; More...
Geophysics, 12/2022, Volume 87, Issue 6
Deep learning neural networks offer some advantages over conventional methods in acoustic impedance inversion. Because labeled data may be difficult to obtain...
Article Link Read Article
Journal Article  Full Text Online
View Complete Issue Browse Now

 
An Efficient HPR Algorithm for the Wasserstein Barycenter Problem with $O({Dim(P)}/\varepsilon)$...
by Zhang, GuojunYuan, YanchengSun, Defeng
11/2022
In this paper, we propose and analyze an efficient Halpern-Peaceman-Rachford (HPR) algorithm for solving the Wasserstein barycenter problem (WBP) with fixed...
Journal Article  Full Text Online



2022 patent news
Beijing Industrial Univ Seeks Patent for Data Depth Enhancement Method Based on WGAN-GP Data...
Global IP News: Information Technology Patent News, Nov 21, 2022
Newspaper ArticleCitation Online

 
2022 patent news
Beijing Industrial Univ Seeks Patent for Data Depth Enhancement Method Based on WGAN-GP Data...
Global IP News. Information Technology Patent News, Nov 21, 2022
Newspaper Article  Full Text Online

<-—2022———2022———1570— 


arXiv:2212.08504  [pdfother astro-ph.IM  astro-ph.GA
Morphological Classification of Radio Galaxies with wGAN-supported Augmentation
Authors: Lennart Rustige, Janis Kummer, Florian Griese, Kerstin Borras, Marcus Brüggen, Patrick L. S. Connor, Frank Gaede, Gregor Kasieczka, Tobias Knopp, Peter Schleper
Abstract: Machine learning techniques that perform morphological classification of astronomical sources often suffer from a scarcity of labelled training data. Here, we focus on the case of supervised deep learning models for the morphological classification of radio galaxies, which is particularly topical for the forthcoming large radio surveys. We demonstrate the use of generative models, specifically Was…  More
Submitted 16 December, 2022; originally announced December 2022.
Comments: 12 pages, 7+2 figures, 1+2 tables. Submitted, comments welcome
 

  

2022 thesis

thesis_July26th.pdf - Toronto Math Blogs

http://blog.math.toronto.edu › files › 2022/08 › thes...

Optimal transport, congested transport, and Wasserstein generative ... The second part of this thesis presents new algorithms for generative.



The Wasserstein distance of order 1 for quantum spin systems ...

https://arxiv.org › pdf

by G De Palma · 2022 — October 21, 2022. Abstract. We propose a generalization of the Wasserstein distance of order 1 to quantum spin systems on the lattice Zd, ...



2022 thesis

On the Wasserstein median of probability measures - arXiv
https://arxiv.org › pdf
PDF
by K You · 2022 — In the field of optimal transport, the Wasserstein barycenter ... ME] 9 Sep 2022 ... thesis, University of California Los Angeles..

Related articles All 2 versions

Recent PhD Graduates - Department of Mathematics

https://mathematics.ku.edu › recent-phd-graduates

https://mathematics.ku.edu › recent-phd-graduates

Mehmet Yenisey. Graduation date: Summer 2022. Research area: Probability Theory Thesis title: The metric geometry nature of Wasserstein spaces of ...

U Kansas  Ph.D. math dept

Mehmet Yenisey

Graduation date:  Summer 2022
Research area:  Probability Theory
Thesis title:  The metric geometry nature of Wasserstein spaces of probability measures: On the point of view of submetry projections
Thesis advisor:  Jin Feng
First job:  Visiting Lecturer, Rochester Institute of Technology (RIT)


 2022

 

Viet Anh Nguyen

http://www.vietanhnguyen.net

December 2022: I am presenting at the Optimization in the Big Data Era workshop. ... Distributionally Robust Inverse Covariance Estimation: The Wasserstein ...


Mean field information Hessian matrices on graphs - Wuchen Li

https://people.math.sc.edu › wuchen › By_year

2022. J. Yu, R.J. Lai, W.C. Li, S. Osher. Computational mean field games on ... Optimal Neural Network Approximation of Wasserstein Gradient Direction via ...

 

NeurIPS 2022

https://neurips.cc › 2022 › ScheduleMultitrack

Wasserstein Generative Adversarial Networks (WGANs) are the popular generative ... After successfully defending his PhD thesis in Foundations of Computer ...



[PDF] arxiv.org

Hyperbolic Sliced-Wasserstein via Geodesic and Horospherical Projections

C Bonet, L Chapel, L Drumetz, N Courty - arXiv preprint arXiv:2211.10066, 2022 - arxiv.org

… background on optimal transport with the Wasserstein and the slicedWasserstein distance. 

We then review two … The main tool of OT is the Wasserstein distance which we introduce now. …

All 2 versions


2022 see 2021  [PDF] openreview.net

Efficient Gradient Flows in Sliced-Wasserstein Space

C Bonet, N Courty, F Septier, L Drumetz - 2022 - openreview.net

Wasserstein distance. We first derive some properties of this new class of flows and discuss 

links with Wasserstein … Sliced-Wasserstein gradient flows to the Wasserstein gradient flows. …

All 2 versions

<-—2022———2022———1580—


[PDF] archives-ouvertes.fr

Semi-relaxed Gromov-Wasserstein divergence for graphs classification

…, M Corneli, T Vayer, N Courty - … de Traitement du …, 2022 - hal.archives-ouvertes.fr

Comparing structured objects such as graphs is a fundamental operation involved in many 

learning tasks. To this end, the Gromov- Wasserstein (GW) distance, based on Optimal …

 All 5 versions


2022

Towards Instant Calibration in Myoelectric Prosthetic Hands

www.youtube.com › watch
Digby Chappell presenting the ICORR 2022 paper: D. Chappell, Z. Yang, ... Based on the Wasserstein Distance,” Proceedings of the 2022 IEEE ...

YouTube · Imperial REDS Lab ·  

Aug 4, 2022



Peer-reviewed
A Data Augmentation Method for Prohibited Item X-Ray Pseudocolor Images in X-Ray Security Inspection Based on Wasserstein Generative Adversarial Network and Spatial-and-Channel Attention Block
Authors:Dongming Liu; Jianchang Liu; Peixin Yuan; Feng Yu

Summary:For public security and crime prevention, the detection of prohibited items in X-ray security inspection based on deep learning has attracted widespread attention. However, the pseudocolor image dataset is scarce due to security, which brings an enormous challenge to the detection of prohibited items in X-ray security inspection. In this paper, a data augmentation method for prohibited item X-ray pseudocolor images in X-ray security inspection is proposed. Firstly, we design a framework of our method to achieve the dataset augmentation using the datasets with and without prohibited items. Secondly, in the framework, we design a spatial-and-channel attention block and a new base block to compose our X-ray Wasserstein generative adversarial network model with gradient penalty. The model directly generates high-quality dual-energy X-ray data instead of pseudocolor images. Thirdly, we design a composite strategy to composite the generated and real dual-energy X-ray data with background data into a new X-ray pseudocolor image, which can simulate the real overlapping relationship among items. Finally, two object detection models with and without our data augmentation method are applied to verify the effectiveness of our method. The experimental results demonstrate that our method can achieve the data augmentation for prohibited item X-ray pseudocolor images in X-ray security inspection effectively
Article, 2022
Publication:Computational Intelligence and Neuroscience, 2022, 20220318
Publisher:2022


Rolling Bearing Fault Diagnosis Based on Deep Adversarial Networks with Convolutional Layer and Wasserstein Distance

Authors:Xinyu Gao; Rui Yang; Eng Gee Lim. 2022 27th International Conference on Automation and Computing (ICAC)
Summary:Intelligent bearing fault diagnosis techniques have been well developed to meet economy and safety criteria. Machine learning and deep learning schemes have shown to be promising tools for rolling bearing defect diagnosis, but they require a large amount of labelled data in the training phase and assume that the training and testing samples follow the same data distribution. In real-world industrial contexts these two preconditions are almost impossible to satisfy, whereas approaches based on transfer learning are potent instruments for addressing both challenges. Consequently, this paper presents an unsupervised method for diagnosing rolling bearing defects based on transfer learning. Convolutional neural networks, adversarial networks, and the Wasserstein distance are adopted to extract domain-invariant features, narrow the discrepancy between the source and target domains, and precisely categorize the faulty samples. A series of experiments corroborate that the proposed model can effectively improve overall performance and outperforms several traditional approaches under six measurement metrics.
Chapter, 2022
Publication:2022 27th International Conference on Automation and Computing (ICAC), 20220901, 1
Publisher:2022



2022  Peer-reviewed
VGAN: Generalizing MSE GAN and WGAN-GP for Robot Fault Diagnosis
Authors: Ziqiang Pu; Diego Cabrera; Chuan Li; Jose Valente de Oliveira
Summary:Generative adversarial networks (GANs) have shown their potential for data generation. However, this type of generative model often suffers from oscillating training processes and mode collapse, among other issues. To mitigate these, this work proposes a generalization of both mean square error (mse) GAN and Wasserstein GAN (WGAN) with gradient penalty, referred to as VGAN. Within the framework of conditional WGAN with gradient penalty, VGAN resorts to the Vapnik V-matrix-based criterion that generalizes mse. Also, a novel early stopping-like strategy is proposed that keeps track during training of the most suitable model. A comprehensive set of experiments on a fault-diagnosis task for an industrial robot where the generative model is used as a data augmentation tool for dealing with imbalance datasets is presented. The statistical analysis of the results shows that the proposed model outperforms nine other models, including vanilla GAN, conditional WGAN with and without conventional regularization, and synthetic minority oversampling technique, a classic data augmentation techniqueShow more
Article, 2022
Publication:IEEE Intelligent Systems, 37, 20220501, 65
Publisher:2022
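
Note: several of the entries above and below rely on the WGAN-GP critic objective (a Wasserstein loss plus a gradient penalty that pushes the critic toward being 1-Lipschitz). As a point of reference only, and not the implementation of any cited paper, here is a minimal Python/PyTorch sketch of that objective; the names gradient_penalty, critic_loss and the default lambda_gp=10.0 are illustrative assumptions.

import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    # Interpolate between real and generated samples (one coefficient per sample,
    # broadcast over the remaining dimensions).
    eps = torch.rand([real.size(0)] + [1] * (real.dim() - 1), device=real.device)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    # Per-sample gradients of the critic output with respect to the interpolates.
    grads = torch.autograd.grad(scores.sum(), interp, create_graph=True)[0]
    grad_norm = grads.flatten(start_dim=1).norm(2, dim=1)
    # Penalise deviation of the gradient norm from 1 (the 1-Lipschitz target).
    return lambda_gp * ((grad_norm - 1.0) ** 2).mean()

def critic_loss(critic, real, fake):
    # WGAN-GP critic objective: E[D(fake)] - E[D(real)] + gradient penalty.
    return critic(fake).mean() - critic(real).mean() + gradient_penalty(critic, real, fake)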

2022
 
2022  Peer-reviewed
VGAN: Generalizing MSE GAN and WGAN-GP for robot fault diagnosis
Authors: Jose Valente de Oliveira; Chuan Li; Diego Cabrera; Ziqiang Pu
Summary:Generative adversarial networks (GANs) have shown their potential for data generation. However, this type of generative model often suffers from oscillating training processes and mode collapse, among other issues. To mitigate these, this work proposes a generalization of both MSE GAN and WGAN-GP, referred to as VGAN. Within the framework of conditional Wasserstein GAN with gradient penalty, VGAN resorts to the Vapnik V-matrix based criterion that generalizes MSE. Also, a novel early stopping like strategy is proposed that keeps track during training of the most suitable model. A comprehensive set of experiments on a fault diagnosis task for an industrial robot where the generative model is used as a data augmentation tool for dealing with imbalance data sets is presented. The statistical analysis of the results shows that the proposed model outperforms nine other models including vanilla GAN, conditional WGAN with and without conventional regularization, and SMOTE, a classic data augmentation techniqueShow more
Article, 2022
Publication:IEEE Intelligent Systems, 202204, 1
Publisher:2022

2022  Peer-reviewed
R-WGAN-Based Multitimescale Enhancement Method for Predicting f-CaO Cement Clinker
Authors: Xiaochen Hao; Lin Liu; Gaolu Huang; Yuxuan Zhang; Yifu Zhang; Hui Dang
Summary:To address the problem that the high dimensionality, time series, coupling, and multiple timescales of production data in the process industry lead to the low accuracy of traditional prediction models, we propose a multitimescale data enhancement and cement clinker free calcium oxide (f-CaO) prediction method based on the regression-Wasserstein generative adversarial net (R-WGAN) model. The model is built using a combination of WGAN and regression prediction networks. First, the data are extracted according to the principle of sliding window to eliminate the effect of time-varying delay between data in data enhancement and prediction, and a dual data pathway is used for data stitching so that data of different timescales can be enhanced at the same time. We then augment the data with a generator network, use a discriminator network to judge the goodness of the generated data, and propose an auxiliary evaluation strategy to evaluate whether the high-dimensional generated data conform to the actual laws, expand the training set of the regression prediction network with the generated data that conform to the laws, and finally achieve the prediction of cement clinker f-CaO. The model was applied in the quality management system of a cement company for simulation, and experiments showed that the model with data enhancement has the advantages of high accuracy, robustness, and good generalization in cement clinker f-CaO predictionShow more
Article, 2022
Publication:IEEE Transactions on Instrumentation and Measurement, 71, 2022, 1
Publisher:2022

2022  Peer-reviewed
Intelligent data expansion approach of vibration gray texture images of rolling bearing based on improved WGAN-GP
Authors: Hongwei Fan; Jiateng Ma; Xuhui Zhang; Ceyi Xue; Yang Yan; Ningge Ma
Summary:Rolling bearing is one of the components with the high fault rate for rotating machinery. Big data-based deep learning is a hot topic in the field of bearing fault diagnosis. However, it is difficult to obtain the big actual data, which leads to a low accuracy of bearing fault diagnosis. WGAN-based data expansion approach is discussed in this paper. Firstly, the vibration signal is converted into the gray texture image by LBP to build the original data set. The small original data set is used to generate the new big data set by WGAN with GP. In order to verify its effectiveness, MMD is used for the expansion evaluation, and then the effect of the newly generated data on the original data expansion in different proportions is verified by CNN. The test results show that WGAN-GP data expansion approach can generate the high-quality samples, and CNN-based classification accuracy increases from 92.5% to 97.5% before and after the data expansionShow more
Article, 2022
Publication:Advances in Mechanical Engineering, 14, 202203
Publisher:2022


2022 thesis
Generative Adversarial Networks and Data Starvation
Authors: Brendan Jugan (Author); Patrick Drew McDaniel; Schreyer Honors College
Summary:Generative adversarial networks (GAN) have shown impressive results in data generation tasks, particularly in the image domain [1, 2, 3, 4]. Recent research has employed GANs to generate high-quality synthetic images of animals, scenes in nature, and even complex human faces. While GANs have seen great success, they are notoriously difficult to train. If improperly configured, their adversarial nature can lead to failures such as model divergence and mode collapse. It has been documented that training dataset size and quality influences GAN sensitivity to these failure modes [5]. However, there is limited research as to the extent of data starvation's negative impact on modern architectures. In this paper, we present a framework for evaluating modern architecture performance when given limited training data. Specifically, we apply data starvation techniques to GAN training and evaluate their performances using common metrics utilized by the research community. To accomplish this, we use a state-of-the-art image generation benchmark dataset [6], as well as publicly available architecture implementations provided by the Pohang University of Science and Technology's Computer Vision Lab [7]. We evaluate our data-starved GANs by recording inception score and Frechet inception distance, which are effective, and commonly used metrics for measuring GAN performance [8]. Our results show that SNGAN and BigGAN require more data than DCGAN, WGAN-GP, and SAGAN to avoid model divergence. We also find that data starvation has fewer performance implications when used on datasets from less complex domains, like those including handwritten digitsShow more
Thesis, Dissertation, 2022
English
Publisher:Pennsylvania State University, [University Park, Pennsylvania], 2022



2022
Small sample reliability assessment with online time-series data based on a worm WGAN learning method
Authors: Bo Sun; Zeyu Wu; Qiang Feng; Zili Wang; Yi Ren; Dezhen Yang; Quan Xia
Summary:The scarcity of time-series data constrains the accuracy of online reliability assessment. Data expansion is the most intuitive way to address this problem. However, conventional, small-sample reliability evaluation methods either depend on prior knowledge or are inadequate for time series. This article proposes a novel auto-augmentation network, the worm Wasserstein generative adversarial network (WWGAN), which generates synthetic time-series data that carry realistic intrinsic patterns with the original data and expands a small sample without prior knowledge or hypotheses for reliability evaluation. After verifying the augmentation ability and demonstrating the quality of the generated data by manual datasets, the proposed method is demonstrated with an experimental case: the online reliability assessment of lithium battery cells. Compared with conventional methods, the proposed method accomplished a breakthrough of the online reliability assessment for an extremely small sample of time-series data and provided credible resultsShow more
Article, 2022
Publication:IEEE Transactions on Industrial Informatics, PP, 20220419, 1
Publisher:2022
<-—2022———2022———1590—



2022
Discriminator Contrastive Divergence: Semi-Amortized Generative Modeling by Exploring Energy of the Discriminator
Authors:Song, Yuxuan (Creator), Ye, Qiwei (Creator), Xu, Minkai (Creator), Liu, Tie-Yan (Creator)
Summary:Generative Adversarial Networks (GANs) have shown great promise in modeling high dimensional data. The learning objective of GANs usually minimizes some measure discrepancy, \textit{e.g.}, $f$-divergence~($f$-GANs) or Integral Probability Metric~(Wasserstein GANs). With $f$-divergence as the objective function, the discriminator essentially estimates the density ratio, and the estimated ratio proves useful in further improving the sample quality of the generator. However, how to leverage the information contained in the discriminator of Wasserstein GANs (WGAN) is less explored. In this paper, we introduce the Discriminator Contrastive Divergence, which is well motivated by the property of WGAN's discriminator and the relationship between WGAN and energy-based model. Compared to standard GANs, where the generator is directly utilized to obtain new samples, our method proposes a semi-amortized generation procedure where the samples are produced with the generator's output as an initial state. Then several steps of Langevin dynamics are conducted using the gradient of the discriminator. We demonstrate the benefits of significant improved generation on both synthetic data and several real-world image generation benchmarksShow more
Downloadable Archival Material, 2020-04-04
Undefined
Publisher:202



2022  Peer-reviewed
Inverse airfoil design method for generating varieties of smooth airfoils using conditional WGAN-gp
Authors: Kazuo Yonekura; Nozomu Miyamoto; Katsuyuki Suzuki
Summary: Machine learning models are recently adopted to generate airfoil shapes. A typical task is to obtain airfoil shapes that satisfy the required lift coefficient. These inverse design problems can be solved by generative adversarial networks (GAN). However, the shapes obtained from ordinal GAN models are not smooth; hence, flow analysis cannot be conducted. Therefore, Bézier curves or smoothing methods are required. This study employed conditional Wasserstein GAN with gradient penalty (cWGAN-gp) to generate smooth airfoil shapes without any smoothing method. In the proposed method, the cWGAN-gp model outputs a shape that indicates the specified lift coefficient. Then, the results obtained from the proposed model are compared with those of ordinal GANs and variational autoencoders; in addition, the proposed method outputs the smoothest shape owing to the earth mover's distance used in cWGAN-gp. By adopting the proposed method, no additional smoothing method is required to conduct flow analysis.
Article, 2022
Publication:Structural and Multidisciplinary Optimization, 65, 20220603
Publisher:2022


2022
Wasserstein Adversarial Transformer for Cloud Workload Prediction

Authors:Arbat, Shivani (Creator), Jayakumar, Vinodh Kumaran (Creator), Lee, Jaewoo (Creator), Wang, Wei (Creator), Kim, In Kee (Creator)
Summary:Predictive Virtual Machine (VM) auto-scaling is a promising technique to optimize cloud applications operating costs and performance. Understanding the job arrival rate is crucial for accurately predicting future changes in cloud workloads and proactively provisioning and de-provisioning VMs for hosting the applications. However, developing a model that accurately predicts cloud workload changes is extremely challenging due to the dynamic nature of cloud workloads. Long-Short-Term-Memory (LSTM) models have been developed for cloud workload prediction. Unfortunately, the state-of-the-art LSTM model leverages recurrences to predict, which naturally adds complexity and increases the inference overhead as input sequences grow longer. To develop a cloud workload prediction model with high accuracy and low inference overhead, this work presents a novel time-series forecasting model called WGAN-gp Transformer, inspired by the Transformer network and improved Wasserstein-GANs. The proposed method adopts a Transformer network as a generator and a multi-layer perceptron as a critic. The extensive evaluations with real-world workload traces show WGAN-gp Transformer achieves 5 times faster inference time with up to 5.1 percent higher prediction accuracy against the state-of-the-art approach. We also apply WGAN-gp Transformer to auto-scaling mechanisms on Google cloud platforms, and the WGAN-gp Transformer-based auto-scaling mechanism outperforms the LSTM-based mechanism by significantly reducing VM over-provisioning and under-provisioning ratesShow more
Downloadable Archival Material, 2022-03-12
Undefined
Publisher:2022-03-12


2022
Causality Learning With Wasserstein Generative Adversarial Networks
Authors:Petkov, Hristo (Creator), Hanley, Colin (Creator), Dong, Feng (Creator)
Summary:Conventional methods for causal structure learning from data face significant challenges due to combinatorial search space. Recently, the problem has been formulated into a continuous optimization framework with an acyclicity constraint to learn Directed Acyclic Graphs (DAGs). Such a framework allows the utilization of deep generative models for causal structure learning to better capture the relations between data sample distributions and DAGs. However, so far no study has experimented with the use of Wasserstein distance in the context of causal structure learning. Our model named DAG-WGAN combines the Wasserstein-based adversarial loss with an acyclicity constraint in an auto-encoder architecture. It simultaneously learns causal structures while improving its data generation capability. We compare the performance of DAG-WGAN with other models that do not involve the Wasserstein metric in order to identify its contribution to causal structure learning. Our model performs better with high cardinality data according to our experimentsShow more
Downloadable Archival Material, 2022-06-03
Undefined
Publisher:2022-06-03


2022
Kantorovich Strikes Back! Wasserstein GANs are not Optimal Transport?

Authors:Korotin, Alexander (Creator), Kolesov, Alexander (Creator), Burnaev, Evgeny (Creator)
Summary:Wasserstein Generative Adversarial Networks (WGANs) are the popular generative models built on the theory of Optimal Transport (OT) and the Kantorovich duality. Despite the success of WGANs, it is still unclear how well the underlying OT dual solvers approximate the OT cost (Wasserstein-1 distance, $\mathbb{W}_{1}$) and the OT gradient needed to update the generator. In this paper, we address these questions. We construct 1-Lipschitz functions and use them to build ray monotone transport plans. This strategy yields pairs of continuous benchmark distributions with the analytically known OT plan, OT cost and OT gradient in high-dimensional spaces such as spaces of images. We thoroughly evaluate popular WGAN dual form solvers (gradient penalty, spectral normalization, entropic regularization, etc.) using these benchmark pairs. Even though these solvers perform well in WGANs, none of them faithfully compute $\mathbb{W}_{1}$ in high dimensions. Nevertheless, many provide a meaningful approximation of the OT gradient. These observations suggest that these solvers should not be treated as good estimators of $\mathbb{W}_{1}$, but to some extent they indeed can be used in variational problems requiring the minimization of $\mathbb{W}_{1}$Show more
Downloadable Archival Material, 2022-06-15
Undefined
Publisher:2022-06-15

Cited by 6 Related articles All 4 versions

2022


Unsupervised Denoising of Optical Coherence Tomography Images with Dual_Merged CycleWGAN
Authors: Du, Jie (Creator); Yang, Xujian (Creator); Jin, Kecheng (Creator); Qi, Xuanzheng (Creator); Chen, Hu (Creator)
Summary: Noise is an important cause of low-quality optical coherence tomography (OCT) images. Neural network models based on convolutional neural networks (CNNs) have demonstrated excellent performance in image denoising. However, OCT image denoising still faces great challenges because many previous neural network algorithms require a large amount of labeled data, which can be time-consuming or expensive to obtain. Besides, these CNN-based algorithms need numerous parameters and careful tuning, which consumes hardware resources. To solve these problems, we propose a new cycle-consistent generative adversarial network called Dual-Merged Cycle-WGAN for retinal OCT image denoising, which achieves remarkable performance with less unlabeled training data. Our model consists of two Cycle-GAN networks with an improved generator, discriminator and Wasserstein loss to achieve good training stability and better performance. Using an image merging technique between the two Cycle-GAN networks, our model can obtain more detailed information and hence a better training effect. The effectiveness and generality of the proposed network have been proven via ablation and comparative experiments. Compared with other state-of-the-art methods, our unsupervised method obtains the best subjective visual effect and higher objective evaluation indicators.
Downloadable Archival Material, 2022-05-02
Undefined
Publisher:2022-05-02



arXiv:2212.02468  [pdf, other]  cs.CL
Quantized Wasserstein Procrustes Alignment of Word Embedding Spaces
Authors: Prince O Aboagye; Yan Zheng; Michael Yeh; Junpeng Wang; Zhongfang Zhuang; Huiyuan Chen; Liang Wang; Wei Zhang; Jeff Phillips
Abstract: Optimal Transport (OT) provides a useful geometric framework to estimate the permutation matrix under unsupervised cross-lingual word embedding (CLWE) models that pose the alignment task as a Wasserstein-Procrustes problem. However, linear programming algorithms and approximate OT solvers via Sinkhorn for computing the permutation matrix come with a significant computational burden since they scal…  More
Submitted 5 December, 2022; originally announced December 2022.
Journal ref: AMTA 2022
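
To make the Wasserstein-Procrustes formulation above concrete, the following toy Python sketch alternates between an orthogonal Procrustes step and an exact assignment step. It is not the quantized method of the paper (whose point is precisely to avoid the expensive assignment step); the function name wasserstein_procrustes, the iteration count, and the use of an exact Hungarian solver (which only scales to small vocabularies) are assumptions for illustration.

import numpy as np
from scipy.optimize import linear_sum_assignment

def wasserstein_procrustes(X, Y, iters=20):
    # Align the rows of X (source embeddings) to the rows of Y (target embeddings)
    # by alternating between an orthogonal map Q and a permutation perm.
    n = X.shape[0]
    perm = np.arange(n)
    for _ in range(iters):
        # Orthogonal Procrustes step given the current matching.
        U, _, Vt = np.linalg.svd(X.T @ Y[perm])
        Q = U @ Vt
        # Matching step given the current rotation: a discrete assignment / OT problem.
        cost = -(X @ Q) @ Y.T          # maximise inner products
        _, perm = linear_sum_assignment(cost)
    return Q, perm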



Study Results from Taiyuan University of Science & Technology in the Area of Information Technology Reported (Tool Wear State Recognition Under Imbalanced Data Based On Wgan-gp and Lightweight Neural Network Shufflenet)
Downloadable Article, 2022
Publication:Information Technology Daily, 20221107
Publisher:2022

2022 patent news
Univ Xidian Submits Chinese Patent Application for Radar HRRP Database Construction Method Based on WGAN-GP
Article, 2022
Publication:Global IP News: Software Patent News, June 6 2022, NA
Publisher:2022

Beijing Industrial Univ Seeks Patent for Data Depth Enhancement Method Based on WGAN-GP Data Generation and Poisson Fusion
Article, 2022
Publication:Global IP News: Information Technology Patent News, November 21 2022, NA
Publisher:2022

<-—2022———2022———1600—



New Technology Study Findings Recently Were Reported by Researchers at Southwest Jiaotong University (Ldos Attack Traffic Detection Based On Feature Optimization Extraction and Dpsa-wgan)
Downloadable Article, 2022
Publication:Tech Daily News, 20221115
Publisher:2022


Researchers at Tsinghua University Have Published New Data on Networks (A Pipe Ultrasonic Guided Wave Signal Generation Network Suitable for Data Enhancement in Deep Learning: US-WGAN)
Downloadable Article, 2022
Publication:Network Daily News, 20221012
Publisher:2022

Study Findings from Harbin Institute of Technology Update Knowledge in Spacecraft (Informer-WGAN: High Missing Rate Time Series Imputation Based on Adversarial Training and a Self-Attention Mechanism)
Downloadable Article, 2022
Publication:Defense & Aerospace Daily, 20220810
Publisher:2022

Research on the Application of Hotel Cleanliness Compliance Detection Algorithm Based on WGAN
Authors: Hui Gao; Xiang Kang. 2022 International Conference on Frontiers of Artificial Intelligence and Machine Learning (FAIML)
Summary:Aiming at the problems of irregular cleaning and supervision difficulties in the cleaning process of hotel bathrooms, a target detection algorithm based on deep learning is proposed to detect the cleaning process transmitted by the sensor in real time and analyze its prescriptivity. However, the cleaning process has factors such as occlusion, light influence and insufficient data volume, resulting in inefficient detection. Therefore, this paper proposes a deep convolutional generation adversarial network (DCGAN) as the basic framework to expand the data set, improve the adaptability and robustness of the detector to different detection targets, take advantage of the fast speed and high accuracy of the YOLOv5 target detection network to detect the target, and then design a compliance detection network algorithm to detect whether the target meets the cleanliness standards. Experimental results show that the method has rapidity, practicality and high accuracy, and fully meets the engineering needs of hotel cleaning process detection and supervisionShow more
Chapter, 2022
Publication:2022 International Conference on Frontiers of Artificial Intelligence and Machine Learning (FAIML), 202206, 92
Publisher:2022
All 2 versions


Peer-reviewed
WGAN-Based Image Denoising Algorithm
Authors: XiuFang Zou; Dingju Zhu; Jun Huang; Wei Lu; Xinchu Yao; Zhaotong Lian
Summary:Traditional image denoising algorithms are generally based on spatial domains or transform domains to denoise and smooth the image. The denoised images are not exhaustive, and the depth-of-learning algorithm has better denoising effect and performs well while retaining the original image texture details such as edge characters. In order to enhance denoising capability of images by the restoration of texture details and noise reduction, this article proposes a network model based on the Wasserstein GAN. In the generator, small convolution size is used to extract image features with noise. The extracted image features are denoised, fused and reconstructed into denoised images. A new residual network is proposed to improve the noise removal effect. In the confrontation training, different loss functions are proposed in this paperShow more
Article, 2022
Publication:Journal of Global Information Management (JGIM), 30, 20220415, 1
Publisher:2022


2022


Peer-reviewed
Improving SSH detection model using IPA time and WGAN-GP
Authors: Junwon Lee; Heejo Lee
Article, 2022
Publication:Computers & security, 116, 2022
Publisher:2022


An Improved WGAN-Based Fault Diagnosis of Rolling Bearings
Authors: Chengli Zhao; Lu Zhang; Maiying Zhong. 2022 IEEE International Conference on Sensing, Diagnostics, Prognostics, and Control (SDPC)
Summary:The generative adversarial network (GAN) has been extensively applied in the field of fault diagnosis of rolling bearings under data imbalance. However, it still suffers from unstable training and poor quality of generated data, especially when training data is extremely scarce. To deal with these problems, an improved Wasserstein generative adversarial network (IWGAN)-based fault diagnosis method is put forward in this article. A classifier is introduced into the discriminator for gaining label information, thus the model will be trained in a supervised way to enhance stability. In addition, the matching mechanism of feature map is considered to ameliorate the quality of generated fault data. Then, by blending original data with generated data, a fault diagnosis method, by using stacked denoising autoencoder, is designed to realize fault diagnosis. Finally, the availability of proposed model is verified on the benchmark fault dataset from Case Western Reserve University. The results of the comparative experiments strongly indicate that IWGAN can not only effectively strengthen the balance of the original data but also enhance the diagnosing precision of rolling bearingsShow more
Chapter, 2022
Publication:2022 IEEE International Conference on Sensing, Diagnostics, Prognostics, and Control ( SDPC), 20220805, 322
Publisher:2022



Reports Outline Robotics Study Results from University of Algarve (Vgan: Generalizing Mse Gan and Wgan-gp for Robot Fault Diagnosis)
Downloadable Article, 2022
Publication:Robotics & Machine Learning Daily News, 20220831
Publisher:2022

Findings from South China Normal University Has Provided New Data on Information Management (Wgan-based Image Denoising Algorithm)
Downloadable Article, 2022
Publication:Information Technology Daily, 20220804
Publisher:2022
Related articles
 All 2 versions


Studies from Xidian University Yield New Data on Remote Sensing (AC-WGAN-GP: Generating Labeled Samples for Improving Hyperspectral Image Classification with Small-Samples)
Downloadable Article, 2022
Publication:Tech Daily News, 20221101
Publisher:2022

<-—2022———2022———1610—



Peer-reviewed
Speech Emotion Recognition on Small Sample Learning by Hybrid WGAN-LSTM Networks
Authors: Cunwei Sun; Luping Ji; Hailing Zhong
Article, 2022
Publication:Journal of circuits, systems and computers, 31, 2022, 2250073
Publisher:2022


Globally Consistent Image Inpainting based on WGAN-GP Network optimization
Authors: Na Ge; Wenhui Guo; Yanjiang Wang. 2022 16th IEEE International Conference on Signal Processing (ICSP)
Summary:There have been many methods applied to image inpainting. Although these algorithms can roughly produce visually plausible image structure and texture, they also create a lot of chaotic structural information and blurry texture details, resulting in inconsistencies with the surrounding content area. This paper proposes a globally consistent image inpainting network with a nonlocal module based on WGAN-GP optimization. It can make the network obtain the relevant information on long-distance dependence without superimposing network layers. And it is also able to prevent the limitations such as inefficient calculation and complex optimization caused by the local operation of the convolutional neural network. Thus making full use of the surrounding information of the area to be repaired will improve the semantic and structural consistency of generating predictions with the entire background area. Experiments with this model are conducted on a Places2 dataset, and the results prove that our method was superior to ordinary convolutional neural networksShow more
Chapter, 2022
Publication:2022 16th IEEE International Conference on Signal Processing (ICSP), 1, 20221021, 70
Publisher:2022

New Findings from Dalian University in the Area of Arrhythmia Described (Arrhythmia Detection Based on WGAN-GP and SE-ResNet1D)
Downloadable Article, 2022
Publication:NewsRx Cardiovascular Daily, 20221128
Publisher:2022



Wasserstein Distributionally Robust Optimization with Wasserstein Barycenters
Authors:Lau, Tim Tsz-Kit (Creator), Liu, Han (Creator)
Summary:In many applications in statistics and machine learning, the availability of data samples from multiple possibly heterogeneous sources has become increasingly prevalent. On the other hand, in distributionally robust optimization, we seek data-driven decisions which perform well under the most adverse distribution from a nominal distribution constructed from data samples within a certain discrepancy of probability distributions. However, it remains unclear how to achieve such distributional robustness in model learning and estimation when data samples from multiple sources are available. In this work, we propose constructing the nominal distribution in optimal transport-based distributionally robust optimization problems through the notion of Wasserstein barycenter as an aggregation of data samples from multiple sources. Under specific choices of the loss function, the proposed formulation admits a tractable reformulation as a finite convex program, with powerful finite-sample and asymptotic guarantees. As an illustrative example, we demonstrate with the problem of distributionally robust sparse inverse covariance matrix estimation for zero-mean Gaussian random vectors that our proposed scheme outperforms other widely used estimators in both the low- and high-dimensional regimesShow more
Downloadable Archival Material, 2022-03-22
Undefined
Publisher:2022-03-22
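
For orientation, the two generic objects combined in the entry above can be written as follows (standard textbook forms, not necessarily the paper's exact formulation): the Wasserstein barycenter of distributions $\mu_1,\dots,\mu_k$ with weights $\lambda_i$, and a distributionally robust problem whose ambiguity set is a Wasserstein ball of radius $\rho$ around the nominal distribution $\bar\mu$:

\[
\bar{\mu} \in \operatorname*{arg\,min}_{\mu} \sum_{i=1}^{k} \lambda_i\, W_2^2(\mu, \mu_i), \qquad \lambda_i \ge 0,\quad \sum_{i=1}^{k} \lambda_i = 1,
\]
\[
\min_{\theta} \; \sup_{Q \,:\, W(Q, \bar{\mu}) \le \rho} \; \mathbb{E}_{Q}\big[ \ell(\theta, \xi) \big].
\]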



Optimal Transport Tools (OTT): A JAX Toolbox for all things Wasserstein
Authors:Cuturi, Marco (Creator), Meng-Papaxanthos, Laetitia (Creator), Tian, Yingtao (Creator), Bunne, Charlotte (Creator), Davis, Geoff (Creator), Teboul, Olivier (Creator)
Summary: Optimal transport tools (OTT-JAX) is a Python toolbox that can solve optimal transport problems between point clouds and histograms. The toolbox builds on various JAX features, such as automatic and custom reverse mode differentiation, vectorization, just-in-time compilation and accelerator support. The toolbox covers elementary computations, such as the resolution of the regularized OT problem, and more advanced extensions, such as barycenters, Gromov-Wasserstein, low-rank solvers, estimation of convex maps, differentiable generalizations of quantiles and ranks, and approximate OT between Gaussian mixtures. The toolbox code is available at https://github.com/ott-jax/ott
Downloadable Archival Material, 2022-01-28
Undefined
Publisher:2022-01-28
Cited by 15 Related articles All 2 versions
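
The regularized OT problem mentioned in the OTT-JAX entry above is typically solved with Sinkhorn iterations. The snippet below is a plain NumPy sketch of that scheme for intuition only; it deliberately does not use the OTT-JAX API, and the function name sinkhorn and the defaults eps=0.05, iters=500 are assumptions.

import numpy as np

def sinkhorn(a, b, C, eps=0.05, iters=500):
    # Entropic-regularized OT between histograms a and b (both summing to 1)
    # with cost matrix C: alternately rescale the Gibbs kernel K = exp(-C/eps)
    # so that the transport plan matches the two marginals.
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(iters):
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]    # approximate transport plan
    return np.sum(P * C)               # transport cost of the regularized plan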


2022


2022 see 2021
WATCH: Wasserstein Change Point Detection for High-Dimensional Time Series Data
Authors:Faber, Kamil (Creator), Corizzo, Roberto (Creator), Sniezynski, Bartlomiej (Creator), Baron, Michael (Creator), Japkowicz, Nathalie (Creator)
Summary:Detecting relevant changes in dynamic time series data in a timely manner is crucially important for many data analysis tasks in real-world settings. Change point detection methods have the ability to discover changes in an unsupervised fashion, which represents a desirable property in the analysis of unbounded and unlabeled data streams. However, one limitation of most of the existing approaches is represented by their limited ability to handle multivariate and high-dimensional data, which is frequently observed in modern applications such as traffic flow prediction, human activity recognition, and smart grids monitoring. In this paper, we attempt to fill this gap by proposing WATCH, a novel Wasserstein distance-based change point detection approach that models an initial distribution and monitors its behavior while processing new data points, providing accurate and robust detection of change points in dynamic high-dimensional data. An extensive experimental evaluation involving a large number of benchmark datasets shows that WATCH is capable of accurately identifying change points and outperforming state-of-the-art methodsShow more
Downloadable Archival Material, 2022-01-18
Undefined
Publisher:2022-01-18
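
As a univariate illustration of the idea behind distance-based change point detection (not the multivariate WATCH algorithm itself), the Python sketch below compares sliding windows of a stream to a reference window via the one-dimensional empirical Wasserstein-1 distance; the names w1_empirical, detect_change and the window/threshold defaults are hypothetical.

import numpy as np

def w1_empirical(x, y):
    # 1-D Wasserstein-1 distance between two equal-size empirical samples:
    # with equal weights it reduces to the mean absolute difference of order statistics.
    return np.mean(np.abs(np.sort(x) - np.sort(y)))

def detect_change(stream, window=200, threshold=0.5):
    # Flag window start indices whose distribution drifts far (in W1) from the
    # initial reference window; equal-size windows keep the W1 formula above valid.
    ref = stream[:window]
    alarms = []
    for t in range(window, len(stream) - window, window):
        if w1_empirical(ref, stream[t:t + window]) > threshold:
            alarms.append(t)
    return alarms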



WGAN-GP and LSTM based Prediction Model for Aircraft 4-D Trajectory
Authors: Lei Zhang; Huiping Chen; Peiyan Jia; Zhihong Tian; Xiaojiang Du. 2022 International Wireless Communications and Mobile Computing (IWCMC)
Summary: The rapid growth of air traffic flow has brought the airspace capacity close to saturation and, at the same time, has resulted in great stress for air traffic controllers. The 4-D trajectory-based operation system is an important solution to problems in the current civil aviation field. The system mainly relies on accurate 4-D trajectory prediction technology to share trajectory information among air traffic control, airlines, and aircraft to achieve coordinated decision-making between flight and control. However, due to the complexity of trajectory data processing, the current 4-D trajectory prediction technology cannot meet actual needs. Therefore, a data generation and prediction network model (DGPNM) is proposed. It integrates Wasserstein generative adversarial networks with gradient penalty (WGAN-GP) and long short-term memory (LSTM) neural networks. With its outstanding performance, the LSTM neural network is utilized in both the generation module and the prediction module. The proposed model generates plenty of sample data to enlarge the training set, so overfitting could be reduced in the process of LSTM training. Experimental results prove that compared with other classical methods, the altitude prediction accuracy in the proposed model far exceeds that in current research results, which improves the prediction accuracy of the 4-D trajectory.
Chapter, 2022
Publication:2022 International Wireless Communications and Mobile Computing (IWCMC), 20220530, 937
Publisher:2022


Fault Feature Extraction Method of a Permanent Magnet Synchronous Motor Based on VAE-WGAN
Authors: Zhan Liu; Xiaowei Xu; Feng Qian; Qiong Luo
Summary:This paper focuses on the difficulties that appear when the number of fault samples collected by a permanent magnet synchronous motor is too low and seriously unbalanced compared with the normal data. In order to effectively extract the fault characteristics of the motor and provide the basis for the subsequent fault mechanism and diagnosis method research, a permanent magnet synchronous motor fault feature extraction method based on variational auto-encoder (VAE) and improved generative adversarial network (GAN) is proposed in this paper. The VAE is used to extract fault features, combined with the GAN to extended data samples, and the two-dimensional features are extracted by means of mean and variance for visual analysis to measure the classification effect of the model on the features. Experimental results show that the method has good classification and generation capabilities to effectively extract the fault features of the motor and its accuracy is as high as 98.26%Show more
Article
Publication:Processes, 10, 2022, 200

Related articles All 4 versions 

Research on Partial Discharge Recognition in GIS Based on Mobilenet V2 and Improved WGAN
Authors: Li Tao; Niu Shuofeng; Liu Hongling; Li Zhenzuo; Du Yinjing; Lei Shengfeng. 2022 IEEE International Conference on High Voltage Engineering and Applications (ICHVE)
Summary:Partial discharge (PD) is an important reason for the deterioration of GIS insulation performance. Accurate PD pattern recognition is of great significance to GIS operation and maintenance. Due to the small number of samples, low recognition accuracy and long running time of traditional PD pattern recognition methods, this paper proposed a GIS PD pattern recognition method based on improved WGAN and MobileNet-V2 network. Firstly, the test platform for PD was designed and built to obtain UHF signals under typical defects, and the PRPD spectrum of UHF signals was generated. Then, the improved WGAN was used to expand the PRPD spectrum. Finally, the pattern recognition of PD was realized based on MobileNet-V2 network. The results show that the proposed method which has less parameters can effectively solve the problem of insufficient data volume, and it has a high accuracy. So the model can be applied to the GIS operation and maintenance process, which has practical engineering valueShow more
Chapter, 2022
Publication:2022 IEEE International Conference on High Voltage Engineering and Applications (ICHVE), 20220925, 1
Publisher:2022

Radio Galaxy Classification with wGAN-Supported Augmentation
Authors: Janis Kummer; Lennart Rustige; Florian Griese; Kerstin Borras; Marcus Brüggen; Patrick L S Connor; Frank Gaede; Gregor Kasieczka; Peter Schleper
Summary: Novel techniques are indispensable to process the flood of data from the new generation of radio telescopes. In particular, the classification of astronomical sources in images is challenging. Morphological classification of radio galaxies could be automated with deep learning models that require large sets of labelled training data. Here, we demonstrate the use of generative models, specifically Wasserstein GANs (wGAN), to generate artificial data for different classes of radio galaxies. Subsequently, we augment the training data with images from our wGAN. We find that a simple fully-connected neural network for classification can be improved significantly by including generated images into the training set.
Book, Oct 7, 2022
Publication:arXiv.org, Oct 7, 2022, n/a
Publisher:Oct 7, 2022

<-—2022———2022———1620—



Application of WGAN-GP in recommendation and Questioning the relevance of GAN-based approaches
Author:Oussama Boudjeniba
Summary:Many neural-based recommender systems were proposed in recent years and part of them used Generative Adversarial Networks (GAN) to model user-item interactions. However, the exploration of Wasserstein GAN with Gradient Penalty (WGAN-GP) on recommendation has received relatively less scrutiny. In this paper, we focus on two questions: 1- Can we successfully apply WGAN-GP on recommendation and does this approach give an advantage compared to the best GAN models? 2- Are GAN-based recommender systems relevant? To answer the first question, we propose a recommender system based on WGAN-GP called CFWGAN-GP which is founded on a previous model (CFGAN). We successfully applied our method on real-world datasets on the top-k recommendation task and the empirical results show that it is competitive with state-of-the-art GAN approaches, but we found no evidence of significant advantage of using WGAN-GP instead of the original GAN, at least from the accuracy point of view. As for the second question, we conduct a simple experiment in which we show that a well-tuned conceptually simpler method outperforms GAN-based models by a considerable margin, questioning the use of such modelsShow more
Book, Apr 28, 2022
Publication:arXiv.org, Apr 28, 2022, n/a
Publisher:Apr 28, 2022


DAG-WGAN: Causal Structure Learning With Wasserstein Generative Adversarial Networks
Authors: Hristo Petkov; Colin Hanley; Dong Feng
Summary:The combinatorial search space presents a significant challenge to learning causality from data. Recently, the problem has been formulated into a continuous optimization framework with an acyclicity constraint, allowing for the exploration of deep generative models to better capture data sample distributions and support the discovery of Directed Acyclic Graphs (DAGs) that faithfully represent the underlying data distribution. However, so far no study has investigated the use of Wasserstein distance for causal structure learning via generative models. This paper proposes a new model named DAG-WGAN, which combines the Wasserstein-based adversarial loss, an auto-encoder architecture together with an acyclicity constraint. DAG-WGAN simultaneously learns causal structures and improves its data generation capability by leveraging the strength from the Wasserstein distance metric. Compared with other models, it scales well and handles both continuous and discrete data. Our experiments have evaluated DAG-WGAN against the state-of-the-art and demonstrated its good performanceShow more
Book, Apr 1, 2022
Publication:arXiv.org, Apr 1, 2022, n/a
Publisher:Apr 1, 2022

2022 see 2021
GMT-WGAN: An Adversarial Sample Expansion Method for Ground Moving Targets Classification
Authors: Xin Yao; Xiaoran Shi; Yaxin Li; Li Wang; Han Wang; Shijie Ren; Feng Zhou
Summary:In the field of target classification, detecting a ground moving target that is easily covered in clutter has been a challenge. In addition, traditional feature extraction techniques and classification methods usually rely on strong subjective factors and prior knowledge, which affect their generalization capacity. Most existing deep-learning-based methods suffer from insufficient feature learning due to the lack of data samples, which makes it difficult for the training process to converge to a steady-state. To overcome these limitations, this paper proposes a Wasserstein generative adversarial network (WGAN) sample enhancement method for ground moving target classification (GMT-WGAN). First, the micro-Doppler characteristics of ground moving targets are analyzed. Next, a WGAN is constructed to generate effective time–frequency images of ground moving targets and thereby enrich the sample database used to train the classification network. Then, image quality evaluation indexes are introduced to evaluate the generated spectrogram samples, with an aim to verify the distribution similarity of generated and real samples. Afterward, by feeding augmented samples to the deep convolutional neural networks with good generalization capacity, the classification performance of the GMT-WGAN is improved. Finally, experiments conducted on different datasets validate the effectiveness and robustness of the proposed methodShow more
Article
Publication:Remote Sensing, 14, 2022, 123



Application of WGAN-GP in recommendation and Questioning the relevance of GAN-based approaches
Authors:Khodja, Hichem Ammar (Creator), Boudjeniba, Oussama (Creator)
Summary:Many neural-based recommender systems were proposed in recent years and part of them used Generative Adversarial Networks (GAN) to model user-item interactions. However, the exploration of Wasserstein GAN with Gradient Penalty (WGAN-GP) on recommendation has received relatively less scrutiny. In this paper, we focus on two questions: 1- Can we successfully apply WGAN-GP on recommendation and does this approach give an advantage compared to the best GAN models? 2- Are GAN-based recommender systems relevant? To answer the first question, we propose a recommender system based on WGAN-GP called CFWGAN-GP which is founded on a previous model (CFGAN). We successfully applied our method on real-world datasets on the top-k recommendation task and the empirical results show that it is competitive with state-of-the-art GAN approaches, but we found no evidence of significant advantage of using WGAN-GP instead of the original GAN, at least from the accuracy point of view. As for the second question, we conduct a simple experiment in which we show that a well-tuned conceptually simpler method outperforms GAN-based models by a considerable margin, questioning the use of such modelsShow more
Downloadable Archival Material, 2022-04-26
Undefined
Publisher:2022-04-26


Peer-reviewed
On a prior based on the Wasserstein information matrix

Authors: W. Li; F.J. Rubio
Summary: We introduce a prior for the parameters of univariate continuous distributions, based on the Wasserstein information matrix, which is invariant under reparameterisations. We discuss the links between the proposed prior and information geometry. We present sufficient conditions for the propriety of the posterior distribution for general classes of models. We present a simulation study that shows that the induced posteriors have good frequentist properties.
Article, 2022
Publication:Statistics and Probability Letters, 190, 202211
Publisher:2022
Peer-reviewed
O
Publication:Computers and Operations Research, 138, February 2022


2022


2022 see 2021  Peer-reviewed
Alpha Procrustes metrics between positive definite operators: A unifying formulation for the Bures-Wasserstein and Log-Euclidean/Log-Hilbert-Schmidt metrics
Author:Hà Quang Minh
Summary:This work presents a parametrized family of distances, namely the Alpha Procrustes distances, on the set of symmetric, positive definite (SPD) matrices. The Alpha Procrustes distances provide a unified formulation encompassing both the Bures-Wasserstein and Log-Euclidean distances between SPD matrices. We show that the Alpha Procrustes distances are the Riemannian distances corresponding to a family of Riemannian metrics on the manifold of SPD matrices, which encompass both the Log-Euclidean and Wasserstein Riemannian metrics. This formulation is then generalized to the set of positive definite Hilbert-Schmidt operators on a Hilbert space, unifying the infinite-dimensional Bures-Wasserstein and Log-Hilbert-Schmidt distances. In the setting of reproducing kernel Hilbert spaces (RKHS) covariance operators, we obtain closed form formulas for all the distances via the corresponding kernel Gram matrices. From a statistical viewpoint, the Alpha Procrustes distances give rise to a parametrized family of distances between Gaussian measures on Euclidean space, in the finite-dimensional case, and separable Hilbert spaces, in the infinite-dimensional case, encompassing the 2-Wasserstein distance, with closed form formulas via Gram matrices in the RKHS setting. The presented formulations are new both in the finite and infinite-dimensional settingsShow more
Article
Publication:Linear Algebra and Its Applications, 636, 2022-03-01, 25
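
For reference alongside the entry above, the Bures-Wasserstein distance between SPD matrices $A$ and $B$, and its role in the closed-form 2-Wasserstein distance between Gaussian measures, are the standard expressions

\[
d_{\mathrm{BW}}(A,B) = \Big( \operatorname{tr} A + \operatorname{tr} B - 2\, \operatorname{tr}\big[(A^{1/2} B A^{1/2})^{1/2}\big] \Big)^{1/2},
\]
\[
W_2^2\big(\mathcal{N}(m_1,\Sigma_1), \mathcal{N}(m_2,\Sigma_2)\big) = \|m_1 - m_2\|^2 + d_{\mathrm{BW}}(\Sigma_1, \Sigma_2)^2 .
\]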



WPPNets and WPPFlows: The Power of Wasserstein Patch Priors for Superresolution
Authors:Altekrüger, Fabian (Creator), Hertrich, Johannes (Creator)
Summary: Exploiting image patches instead of whole images has proved to be a powerful approach to tackle various problems in image processing. Recently, Wasserstein patch priors (WPP), which are based on the comparison of the patch distributions of the unknown image and a reference image, were successfully used as data-driven regularizers in the variational formulation of superresolution. However, for each input image, this approach requires the solution of a non-convex minimization problem which is computationally costly. In this paper, we propose to learn two kinds of neural networks in an unsupervised way based on WPP loss functions. First, we show how convolutional neural networks (CNNs) can be incorporated. Once the network, called WPPNet, is learned, it can be applied very efficiently to any input image. Second, we incorporate conditional normalizing flows to provide a tool for uncertainty quantification. Numerical examples demonstrate the very good performance of WPPNets for superresolution in various image classes even if the forward operator is known only approximately.
Downloadable Archival Material, 2022-01-20
Undefined
Publisher:2022-01-20



Wasserstein t-SNE
Authors:Bachmann, Fynn (Creator), Hennig, Philipp (Creator), Kobak, Dmitry (Creator)
Summary:Scientific datasets often have hierarchical structure: for example, in surveys, individual participants (samples) might be grouped at a higher level (units) such as their geographical region. In these settings, the interest is often in exploring the structure on the unit level rather than on the sample level. Units can be compared based on the distance between their means, however this ignores the within-unit distribution of samples. Here we develop an approach for exploratory analysis of hierarchical datasets using the Wasserstein distance metric that takes into account the shapes of within-unit distributions. We use t-SNE to construct 2D embeddings of the units, based on the matrix of pairwise Wasserstein distances between them. The distance matrix can be efficiently computed by approximating each unit with a Gaussian distribution, but we also provide a scalable method to compute exact Wasserstein distances. We use synthetic data to demonstrate the effectiveness of our 
Downloadable Archival Material, Undefined, 2022-05-16

[2205.07531] Wasserstein t-SNE - arXiv

https://arxiv.org › cs

by F Bachmann · 2022 — Wasserstein t-SNE. Authors: Fynn Bachmann, Philipp Hennig, Dmitry Kobak · Download PDF. Abstract: Scientific datasets often have hierarchical ...
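
A minimal Python sketch of the Gaussian-approximation route described in the Wasserstein t-SNE abstract above: each unit is summarised by its sample mean and covariance, pairwise closed-form 2-Wasserstein distances are collected in a matrix, and that matrix can then be handed to any t-SNE implementation that accepts precomputed distances. The function names gaussian_w2 and unit_distance_matrix are illustrative, not the authors' code.

import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2(m1, S1, m2, S2):
    # Closed-form 2-Wasserstein distance between the Gaussian approximations
    # N(m1, S1) and N(m2, S2); the covariance part is the Bures metric.
    s1h = np.real(sqrtm(S1))
    cross = np.real(sqrtm(s1h @ S2 @ s1h))
    bures2 = float(np.trace(S1 + S2 - 2.0 * cross))
    return float(np.sqrt(np.sum((m1 - m2) ** 2) + max(bures2, 0.0)))

def unit_distance_matrix(units):
    # units: list of arrays, each of shape (n_samples_i, n_features).
    stats = [(u.mean(axis=0), np.cov(u, rowvar=False)) for u in units]
    n = len(stats)
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            D[i, j] = D[j, i] = gaussian_w2(*stats[i], *stats[j])
    return D   # feed to a t-SNE variant that accepts precomputed distances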


2022 see 2021
Fast and Smooth Interpolation on Wasserstein Space

Authors: Massachusetts Institute of Technology Department of Mathematics (Contributor), Chewi, Sinho (Creator), Clancy, Julien (Creator), Le Gouic, Thibaut (Creator), Rigollet, Philippe (Creator), Stepaniants, George (Creator), Stromme, Austin J (Creator)
Downloadable Archival Material, 2022-10-14T15:59:19Z
English
Publisher:2022-10-14T15:59:19Z


2022

Apple disease recognition based on Wasserstein generative adversarial networks and hybrid attention mechanism residual network

Y Xueying, G Jiyong, W Shoucheng… - Journal of Chinese …, 2022 - zgnjhxb.niam.com.cn

Abstract: It is important to identify and control the disease accurately to improve the yield and quality of apples. Aiming at the problem of low recognition accuracy of apple disease …

<-—2022———2022———1630—


Reports from Tata Institute for Fundamental Research Provide New Insights into Algebra (A Note On Relative Vaserstein Symbol)

Downloadable Article, 2022

Publication:Math Daily News, 20221007

Code for the Article "Modeling of Political Systems using Wasserstein Gradient Flows"
by Lanzetti, Nicolas; Hajar, Joudi; Dörfler, Florian
12/2022
Web ResourceCitation Online

Related articles All 5 versions

Peer-reviewed
On isometries of compact L^p–Wasserstein spaces
Author:Jaime Santos-Rodríguez
Article, 2022
Publication:Advances in Mathematics, 409, 202211, 108632
Publisher:2022


Dirac Mixture Reduction Using Wasserstein Distances on Projected Cumulative Distributions

D Prossel, UD Hanebeck - 2022 25th International Conference …, 2022 - ieeexplore.ieee.org

… It can be viewed as the approximation of a given Dirac mixture density with another one, …, the Wasserstein distance is established as a suitable measure to compare two Dirac mixtures. …

Cited by 2 All 2 versions



Distributionally Robust Second-Order Stochastic Dominance Constrained Optimization with Wasserstein Ball

Yu Mei, Jia Liu, and Zhiping Chen

SIAM Journal on Optimization, Vol. 32, No. 2, pp. 715–738, 2022



2022

EASI-STRESS on Twitter: "Part 2 of the EASI-STRESS ...

twitter.com › EASI_STRESS › status

8:12 AM · Mar 29, 2022 ·Twitter Web App ... optimal transport along with new JAX software for (Euclidean) Wasserstein-2 OT! https://arxiv.org/abs/2210.12153 ...

Twitter · 1 month ago

Mar 29, 2022


Working Paper
Fast and Provably Convergent Algorithms for Gromov-Wasserstein in Graph Data
Li, Jiajin; Tang, Jianheng; Kong, Lemin; Liu, Huikang; Li, Jia; et al.
 arXiv.org; Ithaca, Dec 14, 2022.
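
For context, the discrete Gromov-Wasserstein objective that such solvers target can be written (in its standard squared-loss form, not necessarily the exact variant of this working paper) as

\[
\mathrm{GW}^2(\mu,\nu) = \min_{\pi \in \Pi(\mu,\nu)} \sum_{i,k} \sum_{j,l} \big| C^{X}_{ik} - C^{Y}_{jl} \big|^2 \, \pi_{ij}\, \pi_{kl},
\]

where $C^{X}$ and $C^{Y}$ are intra-domain (e.g. graph) cost matrices and $\Pi(\mu,\nu)$ is the set of couplings of the two node distributions.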


Working Paper
Score-based Generative Modeling Secretly Minimizes the Wasserstein Distance
Kwon, Dohyun; Fan, Ying; Lee, Kangwook.
 arXiv.org; Ithaca, Dec 13, 2022.


Working Paper
On Generalization and Regularization via Wasserstein Distributionally Robust Optimization
Wu, Qinyu; Jonathan Yu-Meng Li; Mao, Tiantian.
 arXiv.org; Ithaca, Dec 12, 2022.
 

Working Paper
Graph-Regularized Manifold-Aware Conditional Wasserstein GAN for Brain Functional Connectivity Generation
Yee-Fan, Tan; Chee-Ming, Ting; Noman, Fuad; Phan, Raphaël C -W; Ombao, Hernando.
 arXiv.org; Ithaca, Dec 10, 2022.

<-—2022———2022———1640—


Working Paper
Wasserstein distance estimates for jump-diffusion processes
Breton, Jean-Christophe; Privault, Nicolas.
 arXiv.org; Ithaca, Dec 9, 2022.

2022  Wire Feed patent news
State Intellectual Property Office of China Releases Hangzhou Electronics Science and Technology Univ's Patent Application for Network Security Unbalanced Data Set Analysis Method Based on WGAN Dynamic Penalty
Global IP News. Security & Protection Patent News; New Delhi [New Delhi]. 10 Dec 2



2022

MR4514541 Prelim Oostrum, Jesse van; 

Bures–Wasserstein geometry for positive-definite Hermitian matrices and their trace-one subset. Inf. Geom. 5 (2022), no. 2, 405–425. 53

Review PDF Clipboard Journal Article


Liao, Qichen; Chen, Jing; Wang, Zihao; Bai, Bo; Jin, Shi; Wu, Hao

Fast Sinkhorn. I: An O(N) algorithm for the Wasserstein-1 metric. (English) Zbl 07632197

Commun. Math. Sci. 20, No. 7, 2053-2057 (2022).

MSC:  49M25 65K10 49Q22



 

Yatracos, Yannis G.

Limitations of the Wasserstein MDE for univariate data. (English) Zbl 07630671

Stat. Comput. 32, No. 6, Paper No. 95, 11 p. (2022).

MSC:  62-08 62E20 62G05


2022 patent

Wasserstein generative adversarial network (WGAN)- convolutional neural network (CNN) coal mine dust concentration prediction method involves obtaining real-time data of coal mine underground respiration dust and characteristic parameter, and inputting reliability verification of prediction model

CN115310361-A

Inventor(s) JIANG W; LI H; (...); QIN B

Assignee(s) UNIV CHINA MINING & TECHNOLOGY BEIJING

Derwent Primary Accession Number 

2022-E22744

 

2022 patent

Method for predicting free calcium oxide data of cement clinker based on generative adversarial network, involves building SSP-WGAN model, combining WGAN and regression prediction network, and realizing accurate prediction

CN115331756-A

Inventor(s) ZHANG Y; DANG H and HAO X

Assignee(s) UNIV YANSHAN

Derwent Primary Accession Number 

2022-E3734C


2022 patent

Seismic data sample set augmented method, involves evaluating quality of seismic data augmented model generating fault seismic data using Wasserstein distance as evaluation index of anti-network model training effect

CN115310515-A

Inventor(s) YANG J; ZHANG Y; (...); DING R

Assignee(s) UNIV SHANDONG SCI & TECHNOLOGY

Derwent Primary Accession Number 

2022-E21666


 2022 patent

Wasserstein generative adversarial network-GP based semi-supervised malicious flow detecting method, involves training discriminator on small amount of marked sample, and training judger on sample generated by generator and unmarked sample

CN115314254-A

Inventor(s) CHENG J; WU F; (...); LIU S

Assignee(s) UNIV CHINESE PEOPLES LIBERATION ARMY

Derwent Primary Accession Number 

2022-E4155R


Defect Detection of MEMS Based on Data Augmentation, WGAN-DIV-DC, and a YOLOv5 Model.

Shi, Zhenman; Sang, Mei; (...); Liu, Tiegen

2022-12-02 | 

Sensors (Basel, Switzerland)

 22 (23)

Surface defect detection of micro-electromechanical system (MEMS) acoustic thin film plays a crucial role in MEMS device inspection and quality control. The performances of deep learning object detection models are significantly affected by the number of samples in the training dataset. However, it is difficult to collect enough defect samples during production. In this paper, an improved YOLOv


<-—2022———2022———1650—

 

Tool wear state recognition under imbalanced data based on WGAN-GP and lightweight neural network ShuffleNet

Hou, W; Guo, H; (...); Mao, Y

Oct 2022 | Oct 2022 (Early Access) | 


 

Data-Driven Approximation of the Perron-Frobenius Operator Using the Wasserstein Metric

Karimi, A and Georgiou, TT

25th International Symposium on Mathematical Theory of Networks and Systems (MTNS)

2022 | 

IFAC PAPERSONLINE

 55 (30) , pp.341-346

This manuscript introduces a regression-type formulation for approximating the Perron-Frobenius Operator by relying on distributional snapshots of data. These snapshots may represent densities of particles. The Wasserstein metric is leveraged to define a suitable functional optimization in the space of distributions. The formulation allows seeking suitable dynamics so as to interpolate the dist




Distributionally robust portfolio optimization with second-order stochastic dominance based on Wasserstein metric

Hosseini-Nodeh, Z; Khanjani-Shiraz, R and Pardalos, PM

Oct 2022 | 

INFORMATION SCIENCES

 613 , pp.828-852

In portfolio optimization, we may be dealing with misspecification of a known distribution that stock returns are assumed to follow. The unknown true distribution is considered in terms of a Wasserstein-neighborhood of P to examine the tractable formulations of the portfolio selection problem. This study considers a distributionally robust portfolio optimization problem with an ambiguous stochastic dominan

Show more

Convergence and finite sample approximations of entropic regularized Wasserstein distances in Gaussian and RKHS settings

Minh, HQ

Nov 2022 (Early Access) | 

ANALYSIS AND APPLICATIONS

This work studies the convergence and finite sample approximations of entropic regularized Wasserstein distances in the Hilbert space setting. Our first main result is that for Gaussian measures on an infinite-dimensional Hilbert space, convergence in the 2-Sinkhorn divergence is strictly weaker than convergence in the exact 2-Wasserstein distance. Specifically, a sequence of centered Gaussian

Show more

Free Submitted Article From Repository. View full text
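As background for the entropic-regularization entries above and below, here is a minimal numpy sketch of Sinkhorn iterations for entropy-regularized optimal transport between two discrete distributions. It is illustrative only (the function name sinkhorn and the toy data are ours) and is not code from the cited paper, whose Gaussian/RKHS setting is infinite-dimensional.

import numpy as np

def sinkhorn(a, b, C, reg=0.05, n_iter=500):
    # Entropy-regularized OT: approximately minimize <P, C> + reg * KL(P || a b^T)
    # over couplings P with marginals a and b, via Sinkhorn-Knopp scaling iterations.
    K = np.exp(-C / reg)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]
    return np.sum(P * C)  # transport cost of the regularized plan

# Toy example: two empirical measures on the line.
x = np.linspace(0.0, 1.0, 50)
y = np.linspace(0.2, 1.2, 60)
C = (x[:, None] - y[None, :]) ** 2     # squared-distance cost matrix
a = np.full(50, 1.0 / 50)
b = np.full(60, 1.0 / 60)
print(sinkhorn(a, b, C))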

 

WAD-CMSN: Wasserstein distance-based cross-modal semantic network for zero-shot sketch-based image retrieval

Xu, GL; Hu, ZS and Cai, J

Nov 2022 (Early Access) | 

INTERNATIONAL JOURNAL OF WAVELETS MULTIRESOLUTION AND INFORMATION PROCESSING

Zero-shot sketch-based image retrieval (ZSSBIR) aims at retrieving natural images given free hand-drawn sketches that may not appear during training. Previous approaches used semantic aligned sketch-image pairs or utilized memory expensive fusion layer for projecting the visual information to a low-dimensional subspace, which ignores the significant heterogeneous cross-domain discrepancy betwee

Show more

Free Submitted Article From Repository. Full Text at Publisher



Distributionally Robust Optimization Model for a Minimum Cost Consensus with Asymmetric Adjustment Costs Based on the Wasserstein Metric

Wu, ZQ; Zhu, K and Qu, SJ

Nov 2022 | 

MATHEMATICS

 10 (22)

When solving the problem of the minimum cost consensus with asymmetric adjustment costs, decision makers need to face various uncertain situations (such as individual opinions and unit adjustment costs for opinion modifications in the up and down directions). However, in the existing methods for dealing with this problem, robust optimization will lead to overly conservative results, and stochas

Show more

Free Full Text from Publisher

Cited by 1 Related articles All 4 versions

arXiv:2212.05316 [pdf, other] cs.LG cs.CV q-bio.NC
Graph-Regularized Manifold-Aware Conditional Wasserstein GAN for Brain Functional Connectivity Generation
Authors: Yee-Fan Tan, Chee-Ming Ting, Fuad Noman, Raphaël C.-W. Phan, Hernando Ombao
Abstract: Common measures of brain functional connectivity (FC) including covariance and correlation matrices are semi-positive definite (SPD) matrices residing on a cone-shape Riemannian manifold. Despite its remarkable success for Euclidean-valued data generation, use of standard generative adversarial networks (GANs) to generate manifold-valued FC data neglects its inherent SPD structure and hence the in…  More
Submitted 10 December, 2022; originally announced December 2022.
Comments: 10 pages, 4 figures


arXiv:2212.09293 [pdf, ps, other] math.AP math-ph
Stability estimates for the Vlasov-Poisson system in $p$-kinetic Wasserstein distances
Authors: Mikaela Iacobelli, Jonathan Junné
Abstract: We extend Loeper's $L^2$-estimate relating the electric fields to the densities for the Vlasov-Poisson system to $L^p$ with $1<p<+\infty$, based on the Helmholtz-Weyl decomposition. This allows us to generalize both the classical Loeper's 2-Wasserstein stability estimate and the recent stability estimate by the first author relying on the newly introduced kinetic Wasserstein distance to kin…  More
Submitted 19 December, 2022; originally announced December 2022.
MSC Class: 35Q83; 82C40; 82D10; 35B35



A Wasserstein GAN with Gradient Penalty for 3D Porous Media Generation.

M Corrales, M Izzatullah, H Hoteit… - Second EAGE Subsurface …, 2022 - earthdoc.org

Linking the pore-scale and reservoir-scale subsurface fluid flow remains an open challenge in areas such as oil recovery and Carbon Capture and Storage (CCS). One of the main factors hindering our knowledge of such a process is the scarcity of physical samples from geological areas of interest. One way to tackle this issue is by creating accurate, digital representations of the available rock samples to perform numerical fluid flow simulations. Recent advancements in Machine Learning and Deep Generative Modeling open up a new …

All 2 versions

[CITATION] DIG-Kaust/RockGAN: Reproducible material for A Wasserstein GAN with gradient penalty for 3D porous media generation.

M Izzatullah, H Hoteit, M Ravasi, MA Corrales Guerrero - 2022 - repository.kaust.edu.sa

DIG-Kaust/RockGAN: Reproducible material for A Wasserstein GAN with gradient penalty for 3D porous media generation. … Reproducible material for A Wasserstein GAN with …


marco cuturi (@CuturiMarco) / Twitter

twitter.com › cuturimarco — Molecular Machine Learning Conference 2022 | MIT Jameel Clinic ... Optimal Transport Tools (OTT): A JAX Toolbox for all things Wasserstein.

Twitter · 

Jul 29, 2022

<-—2022———2022———1660—



Chenhao Li @ CoRL 2022 (@breadli428) / Twitter

twitter.com › breadli428

Chenhao Li @ CoRL 2022. @breadli428. Visiting researcher. @MIT ...

 Wasserstein Adversarial Skill Imitation (WASABI) acquires agile behaviors from partial ...

Twitter · 1 month ago

Nov 11, 2022


Welcome to Hao Su's homepage - UCSD CSE

cseweb.ucsd.edu › ~haosu

haosu AT eng.ucsd.edu / bio / CV / google scholar / publication ... Use VAE and Wasserstein Distance to align policies from local and global perspectives, ...

UCSD CSE · SU Lab UC San Diego · 

Aug 4, 2022



Defect Detection of MEMS Based on Data Augmentation, WGAN...
by Shi, Zhenman; Sang, Mei; Huang, Yaokang ; More...
Sensors (Basel, Switzerland), 12/2022, Volume 22, Issue 23
Surface defect detection of micro-electromechanical system (MEMS) acoustic thin film plays a crucial role in MEMS device inspection and quality control. The...
Article PDF PDF
Journal Article Full Text Online
More Options
View in Context Browse Journal
Open Access
 

 
SSP-WGAN-Based Data Enhancement and...
by Hao, Xiaochen; Dang, Hui; Zhang, Yuxuan ; More...
IEEE sensors journal, 12/2022, Volume 22, Issue 23
Aiming at the problem of low prediction accuracy of traditional prediction models due to the limited labeled sample data and the imbalance of multitimescale...
Article PDF PDF
Journal Article Full Text Online
View in Context Browse Journal

 
 
Morphological Classification of Radio Galaxies with wGAN...

by Rustige, Lennart; Kummer, Janis; Griese, Florian ; More...
12/2022
Machine learning techniques that perform morphological classification of astronomical sources often suffer from a scarcity of labelled training data. Here, we...
Journal Article Full Text Online
Open Access

 2922


Morphological Classification of Radio Galaxies with wGAN...

by Rustige, Lennart; Kummer, Janis; Griese, Florian ; More...
arXiv.org, 12/2022
Machine learning techniques that perform morphological classification of astronomical sources often suffer from a scarcity of labelled training data. Here, we...
Paper Full Text Online

2022 patent news
State Intellectual Property Office of China Releases Hangzhou Electronics Science and Technology Univ's Patent Application for Network Security Unbalanced Data Set Analysis Method Based on WGAN...

Global IP News: Security & Protection Patent News, Dec 10, 2022
Newspaper Article
State Intellectual Property Office of China Releases Hangzhou Electronics Science and Technology Univ's Patent Application for Network Security Unbalanced Data Set Analysis Method Based on WGAN...

Global IP News. Security & Protection Patent News, Dec 10, 2022
Newspaper Article Full Text Online

Representing Graphs via Gromov-Wasserstein...
by Xu, Hongteng; Liu, Jiachang; Luo, Dixin ; More...
IEEE transactions on pattern analysis and machine intelligence, 02/2022, Volume PP, Issue 1
We propose a new nonlinear factorization model for graphs that have topological structures, and optionally, node attributes. This model is based on a...
Article PDF PDF
Journal Article Full Text Online
View in Context Browse Journal

 
Optimal visual tracking using Wasserstein...
by Hong, Jin; Kwon, Junseok
Expert systems with applications, 12/2022, Volume 209
We propose a novel visual tracking method based on the Wasserstein transport proposal (WTP). In this study, we theoretically derive the optimal proposal...
Article PDF PDF
Journal Article Full Text Online
View in Context Browse Journal

<-—2022———2022———1670—



Gromov–Wasserstein distances between...
by Delon, Julie; Desolneux, Agnes; Salmona, Antoine
Journal of applied probability, 12/2022, Volume 59, Issue 4
Gromov–Wasserstein distances were proposed a few years ago to compare distributions which do not lie in the same space. In particular, they offer an...
Journal Article Full Text Online
View in Context Browse Journal

Bayesian learning with Wasserstein...
by Backhoff-Veraguas, Julio; Fontbona, Joaquin; Rios, Gonzalo ; More...
Probability and statistics, 2022, Volume 26
We introduce and study a novel model-selection strategy for Bayesian learning, based on optimal transport, along with its associated predictive posterior law:...
Article PDF PDF
Journal Article Full Text Online
View in Context Browse Journal

Electromagnetic Full Waveform Inversion Based on Quadratic Wasserstein...

by Deng, Jian; Zhu, Peimin; Kofman, Wlodek ; More...
IEEE transactions on antennas and propagation, 12/2022, Volume 70, Issue 12
Electromagnetic full waveform inversion (FWI) is a high-resolution method to reveal the distribution of dielectric parameters of the medium. Traditionally, the...
Article PDF PDF
Journal Article Full Text Online
View in Context Browse Journal
 
The isometry group of Wasserstein spaces:...
by Gehér, György Pál; Titkos, Tamás; Virosztek, Dániel

Journal of the London Mathematical Society, 12/2022, Volume 106, Issue 4
Motivated by Kloeckner's result on the isometry group of the quadratic Wasserstein space $\mathcal{W}_2(\mathbb{R}^n)$, we describe the isometry group...
Article PDF PDF
Journal Article Full Text Online
View in Context Browse Journal



Accelerated Bregman Primal-Dual Methods Applied to Optimal Transport and Wasserstein...

by Chambolle, Antonin; Contreras, Juan Pablo
SIAM journal on mathematics of data science, 12/2022, Volume 4, Issue 4
This paper discusses the efficiency of Hybrid Primal-Dual (HPD) type algorithms to approximate solve discrete Optimal Transport (OT) and Wasserstein Barycenter...
Journal Article Full Text Online
MR4522876
   Zbl 07669881


2022

MHA-WoML: Multi-head attention and Wasserst...
by Yang, Junyan; Jiang, Jie; Guo, Yanming
International journal of multimedia information retrieval, 2022, Volume 11, Issue 4
Few-shot learning aims to classify novel classes with extreme few labeled samples. Existing metric-learning-based approaches tend to employ the off-the-shelf...
Article PDF PDF
Journal Article Full Text Online
View in Context Browse Journal

Wasserstein Generative Adversarial Network to...
by Man, Cheuk Ki; Quddus, Mohammed; Theofilatos, Athanasios ; More...
IEEE transactions on intelligent transportation systems, 12/2022, Volume 23, Issue 12
Article PDF PDF
Journal Article Full Text Online
View in Context Browse Journal

Super-resolution of Sentinel-2 images using Wasserstein...
by Latif, Hasan; Ghuffar, Sajid; Ahmad, Hafiz Mughees
Remote sensing letters, 12/2022, Volume 13, Issue 12
The Sentinel-2 satellites deliver 13 band multi-spectral imagery with bands having 10m, 20m or 60m spatial resolution. The low-resolution bands can be...
Journal Article

Likelihood estimation of sparse topic distributions in topic models and its applications to Wasserstein...

by Bing, Xin; Bunea, Florentina; Strimas-Mackey, Seth ; More...
The Annals of statistics, 12/2022, Volume 50, Issue 6
Journal Article Full Text Online
View in Context Browse Journal
MR4524498
   Zbl 07641127


A Bismut–Elworthy inequality for a Wasser...
by Marx, Victor
Stochastic partial differential equations : analysis and computations, 2022, Volume 10, Issue 4
We introduce in this paper a strategy to prove gradient estimates for some infinite-dimensional diffusions on L2-Wasserstein spaces. For a specific example...
Article PDF PDF
Journal Article Full Text Online
View in Context Browse Journal

<-—2022———2022———1680—



Rate of convergence for particle approximation of PDEs in Wasserstein...

by Germain, Maximilien; Pham, Huyên; Warin, Xavier
Journal of applied probability, 12/2022, Volume 59, Issue 4
We prove a rate of convergence for the N-particle approximation of a second-order partial differential equation in the space of probability measures, such as...
Journal Article Full Text Online
View in Context Browse Journal

 
A novel conditional weighting transfer Was...
by Zhao, Ke; Jia, Feng; Shao, Haidong
Knowledge-based systems, 12/2022
Transfer learning based on a single source domain to a target domain has received a lot of attention in the cross-domain fault diagnosis tasks of rolling...
Article PDF PDF
Journal Article Full Text Online
View in Context Browse Journal

 
 2022 see 2021
Projected Wasserstein Gradient Descent for...
by Wang, Yifei; Chen, Peng; Li, Wuchen
SIAM/ASA journal on uncertainty quantification, 12/2022, Volume 10, Issue 4
Journal Article Full Text Online
View in Context Browse Journal
 
A Wasserstein generative adversarial...
by Yuan, Zhandong; Luo, Jun; Zhu, Shengyang ; More...
Vehicle system dynamics, 12/2022, Volume 60, Issue 12
Accurate and timely estimation of track irregularities is the foundation for predictive maintenance and high-fidelity dynamics simulation of the railway...
Journal Article 


 
Gromov-Wasserstein Distances: Entropic...
by Zhang, Zhengxin; Goldfeld, Ziv; Mroueh, Youssef ; More...
12/2022
The Gromov-Wasserstein (GW) distance quantifies dissimilarity between metric measure spaces and provides a meaningful figure of merit for applications...
Journal Article Full Text Online 


2022



Score-based Generative Modeling Secretly Minimizes the Wasserstein...

by Kwon, Dohyun; Fan, Ying; Lee, Kangwook
12/2022
36th Conference on Neural Information Processing Systems (NeurIPS 2022) Score-based generative models are shown to achieve remarkable empirical performances in...
Journal Article Full Text Online

 
On Generalization and Regularization via Wass...
by Wu, Qinyu; Li, Jonathan Yu-Meng; Mao, Tiantian
12/2022
Wasserstein distributionally robust optimization (DRO) has found success in operations research and machine learning applications as a powerful means to obtain...
Journal Article Full Text Online


On Generalization and Regularization via Was...
by Wu, Qinyu; Jonathan Yu-Meng Li; Mao, Tiantian
arXiv.org, 12/2022
Wasserstein distributionally robust optimization (DRO) has found success in operations research and machine learning applications as a powerful means to obtain...
Paper Full Text Online

 

Quantized Wasserstein Procrustes Alignment...
by Aboagye, Prince O; Zheng, Yan; Yeh, Michael ; More...
12/2022
AMTA 2022 Optimal Transport (OT) provides a useful geometric framework to estimate the permutation matrix under unsupervised cross-lingual word embedding...
Journal Article Full Text Online

Quantized Wasserstein Procrustes Alignment...
by Aboagye, Prince O; Zheng, Yan; Yeh, Michael ; More...
arXiv.org, 12/2022
Optimal Transport (OT) provides a useful geometric framework to estimate the permutation matrix under unsupervised cross-lingual word embedding (CLWE) models...
Paper Full Text Online 

<-—2022———2022———1690— .



Covariance-based soft clustering of functional data based on the Wasserstein...

by Masarotto, V; Masarotto, G
12/2022
We consider the problem of clustering functional data according to their covariance structure. We contribute a soft clustering methodology based on the...
Journal Article Full Text Online


Covariance-based soft clustering of functional data based on the Wasserstein...

by Masarotto, V; Masarotto, G
arXiv.org, 12/2022
We consider the problem of clustering functional data according to their covariance structure. We contribute a soft clustering methodology based on the...
Paper Full Text Online
 
Stability estimates for the Vlasov-Poisson system in $p$-kinetic Wasserstein...

by Iacobelli, Mikaela; Junné, Jonathan
12/2022
We extend Loeper's $L^2$-estimate relating the electric fields to the densities for the Vlasov-Poisson system to $L^p$, with $1 < p < +\infty$, based on the...
Journal Article Full Text Online

Graph-Regularized Manifold-Aware Conditional Wasserstein...

by Tan, Yee-Fan; Ting, Chee-Ming; Noman, Fuad ; More...
12/2022
Common measures of brain functional connectivity (FC) including covariance and correlation matrices are semi-positive definite (SPD) matrices residing on a...
Journal Article Full Text Online 



Graph-Regularized Manifold-Aware Conditional Wasserstein...

by Tan, Yee-Fan; Ting, Chee-Ming; Noman, Fuad ; More...
12/2022
Common measures of brain functional connectivity (FC) including covariance and correlation matrices are semi-positive definite (SPD) matrices residing on a...
Web Resource 


2022



2022 see 2021
Variational Wasserstein Barycenters with...
by Chi, Jinjin; Yang, Zhiyao; Ouyang, Jihong ; More...
arXiv.org, 12/2022
Wasserstein barycenter, built on the theory of optimal transport, provides a powerful framework to aggregate probability distributions, and it has increasingly...
Paper Full Text Online
Open Access


The general class of Wasserstein Sobolev...
by Sodini, Giacomo Enrico
12/2022
We show that the algebra of cylinder functions in the Wasserstein Sobolev space $H^{1,q}(\mathcal{P}_p(X,\mathsf{d}), W_{p, \mathsf{d}}, \mathfrak{m})$...
Journal Article Full Text Online


The general class of Wasserstein Sobolev...
by Sodini, Giacomo Enrico
arXiv.org, 12/2022
We show that the algebra of cylinder functions in the Wasserstein Sobolev space \(H^{1,q}(\mathcal{P}_p(X,\mathsf{d}), W_{p, \mathsf{d}}, \mathfrak{m})\)...
Paper Full Text Online 


Gaussian Process regression over discrete probability measures: on the non-stationarity relation between Euclidean and Wasserstein...
by Candelieri, Antonio; Ponti, Andrea; Archetti, Francesco
12/2022
Gaussian Process regression is a kernel method successfully adopted in many real-life applications. Recently, there is a growing interest on extending this...
Journal Article Full Text Online 


 Gaussian Process regression over discrete probability measures: on the non-stationarity relation between Euclidean and Wasserstein...

by Candelieri, Antonio; Ponti, Andrea; Archetti, Francesco
arXiv.org, 12/2022
Gaussian Process regression is a kernel method successfully adopted in many real-life applications. Recently, there is a growing interest on extending this...
Paper Full Text Online 

Cited by 1 Related articles All 2 versions 

<-—2022———2022———1700—

 
Stability estimates for the Vlasov-Poisson system in \(p\)-kinetic Wasserstein...

by Iacobelli, Mikaela; Junné, Jonathan
arXiv.org, 12/2022
We extend Loeper's \(L^2\)-estimate relating the electric fields to the densities for the Vlasov-Poisson system to \(L^p\), with \(1 < p < +\infty\), based on...
Paper Full Text Online


Code for the Article "Modeling of Political Systems using Wasserstein...

by Lanzetti, Nicolas; Hajar, Joudi; Dörfler, Florian
12/2022
Web Resource


WASCO: A Wasserstein-based statistical tool to...
by Gonzalez-Delgado, Javier; Sagar, Amin; Zanon, Christophe ; More...
bioRxiv, 12/2022
The structural investigation of intrinsically disordered proteins (IDPs) requires ensemble models describing the diversity of the conformational states of the...
Paper
WASCO: A Wasserstein-based statistical tool to...
Life Science Weekly, 12/2022
Newsletter


Alibaba Researchers Describe Recent Advances in Mathematics (Distributionally Robust Optimization Model for a Minimum Cost Consensus with Asymmetric Adjustment Costs Based on the Wasserstein...

Entertainment Business Newsweekly, 12/2022
Newsletter 

Alibaba Researchers Describe Recent Advances in Mathematics (Distributionally Robust Optimization Model for a Minimum Cost Consensus with Asymmetric Adjustment Costs Based on the Wasserstein Metric)
Article, 2022
Publication:Entertainment Business Newsweekly, December 18 2022, 71
Publisher:2022


一种基于路网像素化的Wasserstein生成对抗流...
12/2022
Patent Available Online
Open Access
[Chinese: A Wasserstein Generative Adversarial Flow Method Based on Road Network Pixelation…]
 
2022


2022 patent news
State Intellectual Property Office of China Receives River and Sea Univ's Patent Application for Power System Bad Data Identification Method Based on Improved Wasserstein...
Global IP News. Electrical Patent News, Dec 17, 2022
Newspaper Article Full Text Online 



Bayesian learning with Wasserstein barycenters

by Backhoff-Veraguas, Julio; Fontbona, Joaquin; Rios, Gonzalo ; More...
Probability and statistics, 2022, Volume 26
We introduce and study a novel model-selection strategy for Bayesian learning, based on optimal transport, along with its associated predictive posterior law:...
Article View Article PDF
Journal Article Full Text Online
View Complete Issue Browse Now
Peer-Reviewed
Open Access

A GPM-based algorithm for solving regularized Wasserstein...

by Kum, S.; Duong, M.H.; Lim, Y. ; More...
Journal of computational and applied mathematics, 12/2022, Volume 416
Keywords: Wasserstein barycenter; q-Gaussian measures; Gradient projection method; Optimization. In this paper, we focus on the analysis of the regularized...
Article View Article PDF
Journal Article
View Complete Issue Browse Now
Peer-Reviewed
Open Access

Wasserstein distance-based probabilistic...
by Zhang, Shitao; Wu, Zhangjiao; Ma, Zhenzhen ; More...
Ekonomska istraživanja, 12/2022, Volume 35, Issue 1
The evaluation of sustainable rural tourism potential is a key work in sustainable rural tourism development. Due to the complexity of the rural tourism...
Article View Article PDF
Journal Article Full Text Online
More Options
View Complete Issue Browse Now


Wasserstein Distributionally Robust...
by Hakobyan, Astghik; Yang, Insoon
12/2022
Distributionally robust control (DRC) aims to effectively manage distributional ambiguity in stochastic systems. While most existing works address inaccurate...
Journal Article Full Text Online 

<-—2022———2022———1710—


Wasserstein Distributionally Robust...
by Hakobyan, Astghik; Yang, Insoon
arXiv.org, 12/2022
Distributionally robust control (DRC) aims to effectively manage distributional ambiguity in stochastic systems. While most existing works address inaccurate...
Paper Full Text Online 



[PDF] arxiv.org

Score-based Generative Modeling Secretly Minimizes the Wasserstein Distance

D Kwon, Y Fan, K Lee - arXiv preprint arXiv:2212.06359, 2022 - arxiv.org

Score-based generative models are shown to achieve remarkable empirical performances in various applications such as image generation and audio synthesis. However, a theoretical understanding of score-based diffusion models is still incomplete. Recently, Song et al. showed that the training objective of score-based generative models is equivalent to minimizing the Kullback-Leibler divergence of the generated distribution from the data distribution. In this work, we show that score-based models also minimize the Wasserstein …

Save Cite All 3 versions 

 

 Comparison Results for Gromov-Wasserstein...
by Mémoli, Facundo; Needham, Tom
12/2022
Inspired by the Kantorovich formulation of optimal transport distance between probability measures on a metric space, Gromov-Wasserstein (GW) distances...
Journal Article Full Text Online


Comparison Results for Gromov-Wasserstein...
by Mémoli, Facundo; Needham, Tom
arXiv.org, 12/2022
Inspired by the Kantorovich formulation of optimal transport distance between probability measures on a metric space, Gromov-Wasserstein (GW) distances...
Paper Full Text Online 


Square Root Normal Fields for Lipschitz surfaces and the Wasserstein...

by Hartman, Emmanuel; Bauer, Martin; Klassen, Eric
12/2022
The Square Root Normal Field (SRNF) framework is a method in the area of shape analysis that defines a (pseudo) distance between unparametrized surfaces. For...
Journal Article Full Text Online

 2022


Using affine policies to reformulate two-stage Wasserstein...

by Cho, Youngchae; Yang, Insoon
12/2022
Intensively studied in theory as a promising data-driven tool for decision-making under ambiguity, two-stage distributionally robust optimization (DRO)...
Journal Article Full Text Online

 2022 see 2021
Internal Wasserstein Distance for...
by Tan, Mingkui; Zhang, Shuhai; Cao, Jiezhang ; More...
arXiv.org, 12/2022
Deep neural networks (DNNs) are known to be vulnerable to adversarial attacks that would trigger misclassification of DNNs but may be imperceptible to human...
Paper Full Text Online 



arXiv:2301.00284 [pdf, ps, other] math.DG math.FA
Square Root Normal Fields for Lipschitz surfaces and the Wasserstein Fisher Rao metric
Authors: Emmanuel Hartman, Martin Bauer, Eric Klassen
Abstract: The Square Root Normal Field (SRNF) framework is a method in the area of shape analysis that defines a (pseudo) distance between unparametrized surfaces. For piecewise linear (PL) surfaces it was recently proved that the SRNF distance between unparametrized surfaces is equivalent to the Wasserstein Fisher Rao (WFR) metric on the space of finitely supported measures on $S^2$. In the present article…  More
Submitted 31 December, 2022; originally announced January 2023.
Comments: 17 pages


2022

arXiv:2301.00191 [pdf, other] math.OC eess.SY
Using affine policies to reformulate two-stage Wasserstein distributionally robust linear programs to be independent of sample size
Authors: Youngchae Cho, Insoon Yang
Abstract: Intensively studied in theory as a promising data-driven tool for decision-making under ambiguity, two-stage distributionally robust optimization (DRO) problems over Wasserstein balls are not necessarily easy to solve in practice. This is partly due to large sample size. In this article, we study a generic two-stage distributionally robust linear program (2-DRLP) over a 1-Wasserstein ball using an…  More
Submitted 31 December, 2022; originally announced January 2023.

arXiv:2212.14123 [pdf, ps, other] math.MG
Comparison Results for Gromov-Wasserstein and Gromov-Monge Distances
Authors: Facundo Mémoli, Tom Needham
Abstract: Inspired by the Kantorovich formulation of optimal transport distance between probability measures on a metric space, Gromov-Wasserstein (GW) distances comprise a family of metrics on the space of isomorphism classes of metric measure spaces. In previous work, the authors introduced a variant of this construction which was inspired by the original Monge formulation of optimal transport; elements of t…  More
Submitted 28 December, 2022; originally announced December 2022.
Comments: Some of these results appeared in an appendix to earlier versions of our previous paper arXiv:1810.09646, but were removed from the published version

<-—2022———2022———1720—



arXiv:2212.12848 [pdf, other] math.ST
Gromov-Wasserstein Distances: Entropic Regularization, Duality, and Sample Complexity
Authors: Zhengxin Zhang, Ziv Goldfeld, Youssef Mroueh, Bharath K. Sriperumbudur
Abstract: The Gromov-Wasserstein (GW) distance quantifies dissimilarity between metric measure spaces and provides a meaningful figure of merit for applications involving heterogeneous data. While computational aspects of the GW distance have been widely studied, a strong duality theory and fundamental statistical questions concerning empirical convergence rates remained obscure. This work closes these gaps…  More
Submitted 24 December, 2022; originally announced December 2022.
Comments: 32 pages
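For context, the Gromov-Wasserstein distance discussed in the entry above and in the Mémoli-Needham entry below is usually written (our notation, standard in the literature; the paper's precise duality results are not reproduced here) as

\[
\mathrm{GW}\big((X, d_X, \mu), (Y, d_Y, \nu)\big)
= \Big( \inf_{\pi \in \Pi(\mu, \nu)} \iint \big| d_X(x, x') - d_Y(y, y') \big|^2 \, d\pi(x, y)\, d\pi(x', y') \Big)^{1/2},
\]

where $\Pi(\mu,\nu)$ is the set of couplings of the two measures; the entropic version adds a penalty $\varepsilon\,\mathrm{KL}(\pi \,\|\, \mu \otimes \nu)$ inside the infimum.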

arXiv:2212.10955 [pdf, ps, other] math.FA math.MG
The general class of Wasserstein Sobolev spaces: density of cylinder functions, reflexivity, uniform convexity and Clarkson's inequalities
Authors: Giacomo Enrico Sodini
Abstract: We show that the algebra of cylinder functions in the Wasserstein Sobolev space $H^{1,q}(\mathcal{P}_p(X,\mathsf{d}), W_{p,\mathsf{d}}, \mathfrak{m})$ generated by a finite and positive Borel measure $\mathfrak{m}$ on the $(p,\mathsf{d})$-Wasserstein space $(\mathcal{P}_p(X,\mathsf{d}), W_{p,\mathsf{d}})$ on a complete and separable metric space $(X,\mathsf{d})$ is dense in energy. As an applica…  More
Submitted 21 December, 2022; originally announced December 2022.
Comments: 35 pages
MSC Class: 46E36; 49Q22; 46B10; 46B20

Dvinskikh, Darina

Stochastic approximation versus sample average approximation for Wasserstein barycenters. (English) Zbl 07634893

Optim. Methods Softw. 37, No. 5, 1603-1635 (2022).

MSC:  90Cxx 65Kxx

PDF BibTeX XML Cite


Wasserstein Isometric Mapping for Image Manifold Learning

http://www.fields.utoronto.ca › talks › Wasserstein-Iso...

Speaker: Keaton Hamm, University of Texas at Arlington

Jun 2, 2022 — Wassmap represents images via probability measures in Wasserstein space, ... on various image data manifolds show that Wassmap yields.

Stat.ML Papers on Twitter: "Wassmap: Wasserstein Isometric ...

https://mobile.twitter.com › StatMLPapers › status

Dec 14, 2022 — Wassmap: Wasserstein Isometric Mapping for Image Manifold Learning. (arXiv:2204.06645v2 [cs.LG] UPDATED).


2022


Brain Parcellation and Connectivity Mapping ... - ResearchGate
https://www.researchgate.net › publication › 324178113_...
Jul 5, 2022 — Request PDF | On Jan 1, 2018, Hamza Farooq and others published Brain Parcellation and Connectivity Mapping Using Wasserstein Geometry ...


2022

HackGAN: Harmonious Cross-Network Mapping Using ...

https://ieeexplore.ieee.org › document

by L Yang · 2022 · Cited by 2 — Network alignment (NA) that identifies equivalent nodes across networks is an effective tool for integrating knowledge from multiple ...



Peer-reviewed
Two-Variable Wasserstein Means of Positive Definite Operators
Authors:Jinmi Hwang, Sejong Kim
Summary:Abstract: We investigate the two-variable Wasserstein mean of positive definite operators, as a unique positive solution of the nonlinear equation obtained from the gradient of the objective function of the least squares problem. A various properties of two-variable Wasserstein mean including the symmetry and the refinement of the self-duality are shown. Furthermore, interesting inequalities such as the Ando–Hiai inequality and bounds for the difference between the two-variable arithmetic and Wasserstein mean are provided. Finally, we explore the relationship between the tolerance relation and two-variable Wasserstein mean of positive definite Hermitian matricesShow more
Article, 2022
Publication:Mediterranean Journal of Mathematics, 19, 20220412
Publisher:2022

 
Optimization in a traffic flow model as an inverse problem in the Wasserstein space
Authors:Roman Chertovskih, Fernando Lobo Pereira, Nikolay Pogodaev, Maxim Staritsyn
Summary:We address an inverse problem for a dynamical system in the space of probability measures, namely, the problem of restoration of the time-evolution of a probability distribution from certain given statistical information. The dynamics of the distribution is described by a nonlocal continuity equation in the Wasserstein space of probability measures. For the simplest version of this problem, associated with a toy one-dimensional model of traffic flow, we derive a necessary optimality condition and design, on its base, a numerical algorithm of the type of gradient descent. We also discuss some technical aspects of the realization of the elaborated algorithm, and present the results of computational experiments implementing an eloquent numeric scenarioShow mor
Article
Publication:IFAC PapersOnLine, 55, 2022, 32

Peer-reviewed
Subexponential Upper and Lower Bounds in Wasserstein Distance for Markov Processes
Authors:Nikola Sandrić, Ari Arapostathis, Guodong Pang
Summary:Abstract: In this article, relying on Foster–Lyapunov drift conditions, we establish subexponential upper and lower bounds on the rate of convergence in the -Wasserstein distance for a class of irreducible and aperiodic Markov processes. We further discuss these results in the context of Markov Lévy-type processes. In the lack of irreducibility and/or aperiodicity properties, we obtain exponential ergodicity in the -Wasserstein distance for a class of Itô processes under an asymptotic flatness (uniform dissipativity) assumption. Lastly, applications of these results to specific processes are presented, including Langevin tempered diffusion processes, piecewise Ornstein–Uhlenbeck processes with jumps under constant and stationary Markov controls, and backward recurrence time chains, for which we provide a sharp characterization of the rate of convergence via matching upper and lower boundsShow more
Article, 2022
Publication:Applied Mathematics & Optimization, 85, 20220510
Publisher:2022

<-—2022———2022———1730—



Peer-reviewed
T-copula and Wasserstein distance-based stochastic neighbor embedding
Authors:Yanyong Huang, Kejun Guo, Xiuwen Yi, Jing Yu, Zongxin Shen, Tianrui Li
Summary:The aim of dimensionality reduction is to obtain the faithful low-dimensional representations of high-dimensional data by preserving the data quality. It is beneficial to better visualize the high-dimensional data and improve the classification or clustering performance. Many dimensionality reduction methods based on the framework of stochastic neighbor embedding have been developed. However, most of them use the Euclidean distance to describe the dissimilarity of data points in high-dimensional space, which is not suitable for high-dimensional data with non-linear manifold structure. In addition, they usually use the family of normal distributions as their embedding distributions in low-dimensional space. This will incur that they are only suitable to deal with the spherical data. In order to deal with these issues, we present a novel dimensionality reduction method by integrating the Wasserstein distance and t-copula function into the stochastic neighbor embedding model. We first employ the Gaussian distribution equipped with the Wasserstein distance to describe the pairwise similarity in the high-dimensional space. Then, the t-copula function is used to generate a general heavy-tailed distribution for the description of low-dimensional pairwise similarity, which can process different shapes of data and avoid the crowding problem. Furthermore, Kullback-Leibler divergence is employed to measure the difference between the high-dimensional and low-dimensional similarities. Finally, a gradient descent algorithm with adaptive moment estimation is developed to solve the proposed objective function. Extensive experiments are conducted on eight real-world datasets to demonstrate the effectiveness of the proposed method in terms of the dimensional reduction quality, classification and clustering evaluation metricsShow more
Article, 2022
Publication:Knowledge-Based Systems, 243, 20220511
Publisher:2022
Cited by 3 Related articles All 2 versions


Peer-reviewed
Stochastic saddle-point optimization for the Wasserstein barycenter problem
Authors:Daniil Tiapkin, Alexander Gasnikov, Pavel Dvurechensky
Summary:Abstract: We consider the population Wasserstein barycenter problem for random probability measures supported on a finite set of points and generated by an online stream of data. This leads to a complicated stochastic optimization problem where the objective is given as an expectation of a function given as a solution to a random optimization problem. We employ the structure of the problem and obtain a convex–concave stochastic saddle-point reformulation of this problem. In the setting when the distribution of random probability measures is discrete, we propose a stochastic optimization algorithm and estimate its complexity. The second result, based on kernel methods, extends the previous one to the arbitrary distribution of random probability measures. Moreover, this new algorithm has a total complexity better than the Stochastic Approximation approach combined with the Sinkhorn algorithm in many cases. We also illustrate our developments by a series of numerical experimentsShow more
Article, 2022
Publication:Optimization Letters, 16, 20220205, 2145
Publisher:2022

Cited by 1 Related articles All 2 versions
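As a reminder of the object these barycenter entries optimize, the Wasserstein barycenter of measures $\mu_1,\dots,\mu_m$ with weights $\lambda_i \ge 0$, $\sum_i \lambda_i = 1$, is (standard definition, not specific to the paper above)

\[
\bar\nu \in \arg\min_{\nu} \ \sum_{i=1}^{m} \lambda_i\, W_2^2(\mu_i, \nu),
\]

with the finite sum replaced by an expectation $\mathbb{E}\,W_2^2(\mu, \nu)$ over a random measure $\mu$ in the population/stochastic setting considered there.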

Peer-reviewed
A 3D reconstruction method of porous media based on improved WGAN-GP
Authors:Ting Zhang, Qingyang Liu, Xianwu Wang, Xin Ji, Yi Du
Summary:The reconstruction of porous media is important to the development of petroleum industry, but the accurate characterization of the internal structures of porous media is difficult since these structures cannot be directly described using some formulae or languages. As one of the mainstream technologies for reconstructing porous media, numerical reconstruction technology can reconstruct pore structures similar to the real pore spaces through numerical generation and has the advantages of low cost and good reusability compared to imaging methods. One of the recent variants of generative adversarial network (GAN), Wasserstein GAN with gradient penalty (WGAN-GP), has shown favorable capability of extracting features for generating or reconstructing similar images with training images. Therefore, a 3D reconstruction method of porous media based on an improved WGAN-GP is presented in this paper, in which the original multi-layer perceptron (MLP) in WGAN-GP is replaced by convolutional neural network (CNN) since CNN is composed of deep convolution structures with strong feature learning abilities. The proposed method uses real 3D images as training images and finally generates 3D reconstruction of porous media with the features of training images. Compared with some traditional numerical generation methods and WGAN-GP, this method has certain advantages in terms of reconstruction quality and efficiencyShow more
Article
Publication:Computers and Geosciences, 165, August 2022
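Several of the WGAN-GP entries in this block rely on the gradient-penalty term of Gulrajani et al.; the following PyTorch sketch (a toy fully connected critic on random tensors, with function and variable names of our choosing, not the architectures used in these papers) shows how that penalty is typically computed.

import torch
import torch.nn as nn

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    # Interpolate between real and fake samples.
    eps = torch.rand(real.size(0), 1)
    x_hat = eps * real + (1.0 - eps) * fake
    x_hat.requires_grad_(True)
    scores = critic(x_hat)
    grads = torch.autograd.grad(outputs=scores.sum(), inputs=x_hat,
                                create_graph=True)[0]
    # Penalize deviation of the gradient norm from 1 (enforces an approximately 1-Lipschitz critic).
    return lambda_gp * ((grads.norm(2, dim=1) - 1.0) ** 2).mean()

critic = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 1))
real = torch.randn(32, 16)   # stand-in for real samples
fake = torch.randn(32, 16)   # stand-in for generator output
loss_c = critic(fake).mean() - critic(real).mean() + gradient_penalty(critic, real, fake)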

Peer-reviewed
A Strabismus Surgery Parameter Design Model with WGAN-GP Data Enhancement Method
Authors:Renhao Tang, Wensi Wang, Qingyu Meng, Shuting Liang, Zequn Miao, Lili Guo, Lejin Wang
Summary:The purpose of this paper is a machine learning model that could predict the strabismus surgery parameter through the data of patients as accurately as possible. A strabismus surgery parameter design model’s input is a Medical records and return is a surgical value. The Machine learning algorithms is difficult to get a desired result in this process because of the small amount and uneven distribution strabismus surgery data. This paper enhanced the data set through a WGAN-GP model to improve the performance of the LightGBM algorithm. The performance of model is increased from 69.32% to 84.52%Show more
Article, 2022
Publication:Journal of Physics: Conference Series, 2179, 20220101
Publisher:2022
 


Arrhythmia Detection Based on WGAN-GP and SE-ResNet1D
Authors:Jing Qin, Fujie Gao, Zumin Wang, Lu Liu, Changqing Ji
Summary:A WGAN-GP-based ECG signal expansion and an SE-ResNet1D-based ECG classification method are proposed to address the problem of poor modeling results due to the imbalanced sample distribution of ECG data sets. The network architectures of WGAN-GP and SE-ResNet1D are designed according to the characteristics of ECG signals so that they can be better applied to the generation and classification of ECG signals. First, ECG data were generated using WGAN-GP on the MIT-BIH arrhythmia database to balance the dataset. Then, the experiments were performed using the AAMI category and inter-patient data partitioning principles, and classification experiments were performed using SE-ResNet1D on the imbalanced and balanced datasets, respectively, and compared with three networks, VGGNet, DenseNet and CNN+Bi-LSTM. The experimental results show that using WGAN-GP to balance the dataset can improve the accuracy and robustness of the model classification, and the proposed SE-ResNet1D outperforms the comparison model, with a precision of 95.80%, recall of 96.75% and an F1 measure of 96.27% on the balanced dataset. Our methods have the potential to be a useful diagnostic tool to assist cardiologists in the diagnosis of arrhythmiasShow more
Downloadable Article, 2022
Publication:11, 20221001, 3427
Publisher:2022

2022


Peer-reviewed
Limitations of the Wasserstein MDE for univariate data
Author:Yannis G. Yatracos
Summary:Abstract: Minimum Kolmogorov and Wasserstein distance estimates, and respectively, of model parameter, are empirically compared, obtained assuming the model is intractable. For the Cauchy and Lognormal models, simulations indicate both estimates have expected values nearly but has in all repetitions of the experiments smaller SD than and ’s relative efficiency with respect to improves as the sample size, n,  increases. The minimum expected Kolmogorov distance estimate, has eventually bias and SD both smaller than the corresponding Wasserstein estimate, and ’s relative efficiency improves as n increases. These results hold also for stable models with stability index and For the Uniform and the Normal models the estimates have similar performance. The disturbing empirical findings for are due to the unboudedness and non-robustness of the Wasserstein distance and the heavy tails of the underlying univariate models.Theoretical confirmation is provided for stable models with which have finite first moment. Similar results are expected to hold for multivariate heavy tail models. Combined with existing results in the literature, the findings do not support the use of Wasserstein distance in statistical inference, especially for intractable and Black Box models with unverifiable heavy tailsShow more
Article, 2022
Publication:Statistics and Computing, 32, 20221017
Publisher:2022


WAD-CMSN: Wasserstein Distance based Cross-Modal Semantic Network for Zero-Shot Sketch-Based Image Retrieval
Authors:Xu, Guanglong (Creator), Hu, Zhensheng (Creator), Cai, Jia (Creator)

Summary:Zero-shot sketch-based image retrieval (ZSSBIR), as a popular studied branch of computer vision, attracts wide attention recently. Unlike sketch-based image retrieval (SBIR), the main aim of ZSSBIR is to retrieve natural images given free hand-drawn sketches that may not appear during training. Previous approaches used semantic aligned sketch-image pairs or utilized memory expensive fusion layer for projecting the visual information to a low dimensional subspace, which ignores the significant heterogeneous cross-domain discrepancy between highly abstract sketch and relevant image. This may yield poor performance in the training phase. To tackle this issue and overcome this drawback, we propose a Wasserstein distance based cross-modal semantic network (WAD-CMSN) for ZSSBIR. Specifically, it first projects the visual information of each branch (sketch, image) to a common low dimensional semantic subspace via Wasserstein distance in an adversarial training manner. Furthermore, identity matching loss is employed to select useful features, which can not only capture complete semantic knowledge, but also alleviate the over-fitting phenomenon caused by the WAD-CMSN model. Experimental results on the challenging Sketchy (Extended) and TU-Berlin (Extended) datasets indicate the effectiveness of the proposed WAD-CMSN model over several competitors
Show more
Downloadable Archival Material, 2022-02-11
Undefined
Publisher:2022-02-11

Exact SDP Formulation for Discrete-Time Covariance Steering with Wasserstein Terminal Cost
Authors:Balci, Isin M. (Creator), Bakolas, Efstathios (Creator)
Summary:In this paper, we present new results on the covariance steering problem with Wasserstein distance terminal cost. We show that the state history feedback control policy parametrization, which has been used before to solve this class of problems, requires an unnecessarily large number of variables and can be replaced by a randomized state feedback policy which leads to more tractable problem formulations without any performance loss. In particular, we show that under the latter policy, the problem can be equivalently formulated as a semi-definite program (SDP) which is in sharp contrast with our previous results that could only guarantee that the stochastic optimal control problem can be reduced to a difference of convex functions program. Then, we show that the optimal policy that is found by solving the associated SDP corresponds to a deterministic state feedback policy. Finally, we present non-trivial numerical simulations which show the benefits of our proposed randomized state feedback policy derived from the SDP formulation of the problem over existing approaches in the field in terms of computational efficacy and controller performanceShow more
Downloadable Archival Material, 2022-05-22
Undefined
Publisher:2022-05-22
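For reference, the closed form that makes Wasserstein terminal costs tractable for Gaussian state distributions (as in the covariance-steering entry above) is the Bures-Wasserstein formula

\[
W_2^2\big(\mathcal{N}(m_1, \Sigma_1), \mathcal{N}(m_2, \Sigma_2)\big)
= \|m_1 - m_2\|^2
+ \operatorname{tr}\Big(\Sigma_1 + \Sigma_2 - 2\big(\Sigma_1^{1/2}\,\Sigma_2\,\Sigma_1^{1/2}\big)^{1/2}\Big);
\]

how the paper exploits this inside its SDP reformulation is not reproduced here.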


On Affine Policies for Wasserstein Distributionally Robust Unit Commitment
Authors:Cho, Youngchae (Creator), Yang, Insoon (Creator)
Summary:This paper proposes a unit commitment (UC) model based on data-driven Wasserstein distributionally robust optimization (WDRO) for power systems under uncertainty of renewable generation as well as its tractable exact reformulation. The proposed model is formulated as a WDRO problem relying on an affine policy, which nests an infinite-dimensional worst-case expectation problem and satisfies the non-anticipativity constraint. To reduce conservativeness, we develop a novel technique that defines a subset of the uncertainty set with a probabilistic guarantee. Subsequently, the proposed model is recast as a semi-infinite programming problem that can be efficiently solved using existing algorithms. Notably, the scale of this reformulation is invariant with the sample size. As a result, a number of samples are easily incorporated without using sophisticated decomposition algorithms. Numerical simulations on 6- and 24-bus test systems demonstrate the economic and computational efficiency of the proposed modelShow more
Downloadable Archival Material, 2022-03-29
Undefined
Publisher:2022-03-29


Distribution Regression with Sliced Wasserstein Kernels
Authors:Meunier, Dimitri (Creator), Pontil, Massimiliano (Creator), Ciliberto, Carlo (Creator)
Summary:The problem of learning functions over spaces of probabilities - or distribution regression - is gaining significant interest in the machine learning community. A key challenge behind this problem is to identify a suitable representation capturing all relevant properties of the underlying functional mapping. A principled approach to distribution regression is provided by kernel mean embeddings, which lifts kernel-induced similarity on the input domain at the probability level. This strategy effectively tackles the two-stage sampling nature of the problem, enabling one to derive estimators with strong statistical guarantees, such as universal consistency and excess risk bounds. However, kernel mean embeddings implicitly hinge on the maximum mean discrepancy (MMD), a metric on probabilities, which may fail to capture key geometrical relations between distributions. In contrast, optimal transport (OT) metrics, are potentially more appealing. In this work, we propose an OT-based estimator for distribution regression. We build on the Sliced Wasserstein distance to obtain an OT-based representation. We study the theoretical properties of a kernel ridge regression estimator based on such representation, for which we prove universal consistency and excess risk bounds. Preliminary experiments complement our theoretical findings by showing the effectiveness of the proposed approach and compare it with MMD-based estimatorsShow more
Downloadable Archival Material, 2022-02-08
Undefined
Publisher:2022-02-08

Cited by 4 Related articles All 4 versions

<-—2022———2022———1740—


Improving Human Image Synthesis with Residual Fast Fourier Transformation and Wasserstein Distance

Authors:Wu, Jianhan (Creator), Si, Shijing (Creator), Wang, Jianzong (Creator), Xiao, Jing (Creator)
Summary:With the rapid development of the Metaverse, virtual humans have emerged, and human image synthesis and editing techniques, such as pose transfer, have recently become popular. Most of the existing techniques rely on GANs, which can generate good human images even with large variants and occlusions. But from our best knowledge, the existing state-of-the-art method still has the following problems: the first is that the rendering effect of the synthetic image is not realistic, such as poor rendering of some regions. And the second is that the training of GAN is unstable and slow to converge, such as model collapse. Based on the above two problems, we propose several methods to solve them. To improve the rendering effect, we use the Residual Fast Fourier Transform Block to replace the traditional Residual Block. Then, spectral normalization and Wasserstein distance are used to improve the speed and stability of GAN training. Experiments demonstrate that the methods we offer are effective at solving the problems listed above, and we get state-of-the-art scores in LPIPS and PSNRShow more
Downloadable Archival Material, 2022-05-24
Undefined
Publisher:2022-05-24


Revisiting Sliced Wasserstein on Images: From Vectorization to Convolution
Authors:Nguyen, Khai (Creator), Ho, Nhat (Creator)
Summary:The conventional sliced Wasserstein is defined between two probability measures that have realizations as vectors. When comparing two probability measures over images, practitioners first need to vectorize images and then project them to one-dimensional space by using matrix multiplication between the sample matrix and the projection matrix. After that, the sliced Wasserstein is evaluated by averaging the two corresponding one-dimensional projected probability measures. However, this approach has two limitations. The first limitation is that the spatial structure of images is not captured efficiently by the vectorization step; therefore, the later slicing process becomes harder to gather the discrepancy information. The second limitation is memory inefficiency since each slicing direction is a vector that has the same dimension as the images. To address these limitations, we propose novel slicing methods for sliced Wasserstein between probability measures over images that are based on the convolution operators. We derive convolution sliced Wasserstein (CSW) and its variants via incorporating stride, dilation, and non-linear activation function into the convolution operators. We investigate the metricity of CSW as well as its sample complexity, its computational complexity, and its connection to conventional sliced Wasserstein distances. Finally, we demonstrate the favorable performance of CSW over the conventional sliced Wasserstein in comparing probability measures over images and in training deep generative modeling on imagesShow more
Downloadable Archival Material, 2022-04-03
Undefined
Publisher:2022-04-03
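The "conventional" sliced Wasserstein procedure that the entry above takes as its starting point (vectorize, project onto random directions, compare sorted 1D projections) can be sketched in a few lines of numpy. This toy version (function name ours) assumes equally many, equally weighted points in each cloud and is not the convolution-based slicer proposed in the paper.

import numpy as np

def sliced_wasserstein(X, Y, n_projections=100, p=2, seed=None):
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Random directions on the unit sphere.
    theta = rng.normal(size=(n_projections, d))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)
    # Project, sort, and average the 1D Wasserstein-p costs over directions.
    X_proj = np.sort(X @ theta.T, axis=0)
    Y_proj = np.sort(Y @ theta.T, axis=0)
    return np.mean(np.abs(X_proj - Y_proj) ** p) ** (1.0 / p)

X = np.random.randn(256, 3)
Y = np.random.randn(256, 3) + 1.0
print(sliced_wasserstein(X, Y))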

Simple Approximative Algorithms for Free-Support Wasserstein Barycenters
Author:von Lindheim, Johannes (Creator)
Summary:Computing Wasserstein barycenters of discrete measures has recently attracted considerable attention due to its wide variety of applications in data science. In general, this problem is NP-hard, calling for practical approximative algorithms. In this paper, we analyze two straightforward algorithms for approximating barycenters, which produce sparse support solutions and show promising numerical results. These algorithms require $N-1$ and $N(N-1)/2$ standard two-marginal OT computations between the $N$ input measures, respectively, so that they are fast, in particular the first algorithm, as well as memory-efficient and easy to implement. Further, they can be used with any OT solver as a black box. Based on relations of the barycenter problem to the multi-marginal optimal transport problem, which are interesting on their own, we prove sharp upper bounds for the relative approximation error. In the second algorithm, this upper bound can be evaluated specifically for the given problem, which always guaranteed an error of at most a few percent in our numerical experimentsShow more
Downloadable Archival Material, 2022-03-10
Undefined
Publisher:2022-03-10


Generalized Zero-Shot Learning Using Conditional Wasserstein Autoencoder

J Kim, B Shim - ICASSP 2022-2022 IEEE International …, 2022 - ieeexplore.ieee.org

… , called conditional Wasserstein autoencoder (CWAE), minimizes the Wasserstein distance

… In measuring the distance between the two distributions, we use Wasserstein distance1 …

Related articles


Dimensionality Reduction and Wasserstein Stability for Kernel Regression
Authors:Eckstein, Stephan (Creator), Iske, Armin (Creator), Trabs, Mathias (Creator)
Summary:In a high-dimensional regression framework, we study consequences of the naive two-step procedure where first the dimension of the input variables is reduced and second, the reduced input variables are used to predict the output variable. More specifically we combine principal component analysis (PCA) with kernel regression. In order to analyze the resulting regression errors, a novel stability result of kernel regression with respect to the Wasserstein distance is derived. This allows us to bound errors that occur when perturbed input data is used to fit a kernel function. We combine the stability result with known estimates from the literature on both principal component analysis and kernel regression to obtain convergence rates for the two-step procedureShow more
Downloadable Archival Material, 2022-03-17
Undefined
Publisher:2022-03-17
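The two-step procedure analyzed in the entry above (reduce the inputs, then run kernel regression on the reduced variables) looks roughly like the following scikit-learn sketch on synthetic data; the paper's contribution is the Wasserstein-stability error analysis of this pipeline, not the code itself, and the data here is made up for illustration.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.kernel_ridge import KernelRidge
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 50))                    # high-dimensional inputs
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)  # output driven by a low-dimensional signal

# Step 1: PCA reduces the input dimension; step 2: kernel ridge regression on the scores.
model = make_pipeline(PCA(n_components=5), KernelRidge(kernel="rbf", alpha=1e-2))
model.fit(X[:400], y[:400])
print(model.score(X[400:], y[400:]))              # R^2 on held-out data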


2022


Approximative Algorithms for Multi-Marginal Optimal Transport and Free-Support Wasserstein Barycenters
Author:von Lindheim, Johannes (Creator)
Summary:Computationally solving multi-marginal optimal transport (MOT) with squared Euclidean costs for $N$ discrete probability measures has recently attracted considerable attention, in part because of the correspondence of its solutions with Wasserstein-$2$ barycenters, which have many applications in data science. In general, this problem is NP-hard, calling for practical approximative algorithms. While entropic regularization has been successfully applied to approximate Wasserstein barycenters, this loses the sparsity of the optimal solution, making it difficult to solve the MOT problem directly in practice because of the curse of dimensionality. Thus, for obtaining barycenters, one usually resorts to fixed-support restrictions to a grid, which is, however, prohibitive in higher ambient dimensions $d$. In this paper, after analyzing the relationship between MOT and barycenters, we present two algorithms to approximate the solution of MOT directly, requiring mainly just $N-1$ standard two-marginal OT computations. Thus, they are fast, memory-efficient and easy to implement and can be used with any sparse OT solver as a black box. Moreover, they produce sparse solutions and show promising numerical results. We analyze these algorithms theoretically, proving upper and lower bounds for the relative approximation errorShow more
Downloadable Archival Material, 2022-02-02
Undefined
Publisher:2022-02-02


Low-rank Wasserstein polynomial chaos expansions in the framework of optimal transport
Authors:Gruhlke, Robert (Creator), Eigel, Martin (Creator)
Summary:A unsupervised learning approach for the computation of an explicit functional representation of a random vector $Y$ is presented, which only relies on a finite set of samples with unknown distribution. Motivated by recent advances with computational optimal transport for estimating Wasserstein distances, we develop a new \textit{Wasserstein multi-element polynomial chaos expansion} (WPCE). It relies on the minimization of a regularized empirical Wasserstein metric known as debiased Sinkhorn divergence. As a requirement for an efficient polynomial basis expansion, a suitable (minimal) stochastic coordinate system $X$ has to be determined with the aim to identify ideally independent random variables. This approach generalizes representations through diffeomorphic transport maps to the case of non-continuous and non-injective model classes $\mathcal{M}$ with different input and output dimension, yielding the relation $Y=\mathcal{M}(X)$ in distribution. Moreover, since the used PCE grows exponentially in the number of random coordinates of $X$, we introduce an appropriate low-rank format given as stacks of tensor trains, which alleviates the curse of dimensionality, leading to only linear dependence on the input dimension. By the choice of the model class $\mathcal{M}$ and the smooth loss function, higher order optimization schemes become possible. It is shown that the relaxation to a discontinuous model class is necessary to explain multimodal distributions. Moreover, the proposed framework is applied to a numerical upscaling task, considering a computationally challenging microscopic random non-periodic composite material. This leads to tractable effective macroscopic random field in adopted stochastic coordinatesShow more
Downloadable Archival Material, 2022-03-17
Undefined
Publisher:2022-03-17

Rate of convergence of the smoothed empirical Wasserstein distance
Authors:Block, Adam (Creator), Jia, Zeyu (Creator), Polyanskiy, Yury (Creator), Rakhlin, Alexander (Creator)
Summary:Consider an empirical measure $\mathbb{P}_n$ induced by $n$ iid samples from a $d$-dimensional $K$-subgaussian distribution $\mathbb{P}$ and let $\gamma = \mathcal{N}(0,\sigma^2 I_d)$ be the isotropic Gaussian measure. We study the speed of convergence of the smoothed Wasserstein distance $W_2(\mathbb{P}_n * \gamma, \mathbb{P}*\gamma) = n^{-\alpha + o(1)}$ with $*$ being the convolution of measures. For $K<\sigma$ and in any dimension $d\ge 1$ we show that $\alpha = {1\over2}$. For $K>\sigma$ in dimension $d=1$ we show that the rate is slower and is given by $\alpha = {(\sigma^2 + K^2)^2\over 4 (\sigma^4 + K^4)} < 1/2$. This resolves several open problems in \cite{goldfeld2020convergence}, and in particular precisely identifies the amount of smoothing $\sigma$ needed to obtain a parametric rate. In addition, we also establish that $D_{KL}(\mathbb{P}_n * \gamma \|\mathbb{P}*\gamma)$ has rate $O(1/n)$ for $K<\sigma$ but only slows down to $O({(\log n)^{d+1}\over n})$ for $K>\sigma$. The surprising difference of the behavior of $W_2^2$ and KL implies the failure of $T_{2}$-transportation inequality when $\sigma < K$. Consequently, the requirement $K<\sigma$ is necessary for validity of the log-Sobolev inequality (LSI) for the Gaussian mixture $\mathbb{P} * \mathcal{N}(0, \sigma^{2})$, closing an open problem in \cite{wang2016functional}, who established the LSI under precisely this conditionShow more
Downloadable Archival Material, 2022-05-04
Undefined
Publisher:2022-05-04


Partial Wasserstein Adversarial Network for Non-rigid Point Set Registration
Authors:Wang, Zi-Ming (Creator), Xue, Nan (Creator), Lei, Ling (Creator), Xia, Gui-Song (Creator)
Summary:Given two point sets, the problem of registration is to recover a transformation that matches one set to the other. This task is challenging due to the presence of a large number of outliers, the unknown non-rigid deformations and the large sizes of point sets. To obtain strong robustness against outliers, we formulate the registration problem as a partial distribution matching (PDM) problem, where the goal is to partially match the distributions represented by point sets in a metric space. To handle large point sets, we propose a scalable PDM algorithm by utilizing the efficient partial Wasserstein-1 (PW) discrepancy. Specifically, we derive the Kantorovich-Rubinstein duality for the PW discrepancy, and show its gradient can be explicitly computed. Based on these results, we propose a partial Wasserstein adversarial network (PWAN), which is able to approximate the PW discrepancy by a neural network, and minimize it by gradient descent. In addition, it also incorporates an efficient coherence regularizer for non-rigid transformations to avoid unrealistic deformations. We evaluate PWAN on practical point set registration tasks, and show that the proposed PWAN is robust, scalable and performs more favorably than the state-of-the-art methods.
Downloadable Archival Material, 2022-03-04
Undefined
Publisher:2022-03-04


On a linearization of quadratic Wasserstein distance
Authors:Greengard, Philip (Creator), Hoskins, Jeremy G. (Creator), Marshall, Nicholas F. (Creator), Singer, Amit (Creator)
Summary:This paper studies the problem of computing a linear approximation of quadratic Wasserstein distance $W_2$. In particular, we compute an approximation of the negative homogeneous weighted Sobolev norm whose connection to Wasserstein distance follows from a classic linearization of a general Monge-Amp\'ere equation. Our contribution is threefold. First, we provide expository material on this classic linearization of Wasserstein distance including a quantitative error estimate. Second, we reduce the computational problem to solving an elliptic boundary value problem involving the Witten Laplacian, which is a Schr\"odinger operator of the form $H = -\Delta + V$, and describe an associated embedding. Third, for the case of probability distributions on the unit square $[0,1]^2$ represented by $n \times n$ arrays we present a fast code demonstrating our approach. Several numerical examples are presented.
Downloadable Archival Material, 2022-01-31
Undefined
Publisher:2022-01-31
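For orientation, the classic linearization the abstract refers to can be stated as follows (standard notation, mine, not necessarily the paper's): for a small signed perturbation $\delta\mu$ of a density $\mu$,
$$W_2(\mu,\mu+\delta\mu)=\|\delta\mu\|_{\dot H^{-1}(\mu)}+o(\delta\mu),\qquad \|\delta\mu\|_{\dot H^{-1}(\mu)}^2=\int|\nabla\varphi|^2\,d\mu,\quad -\nabla\cdot(\mu\nabla\varphi)=\delta\mu,$$
so evaluating the weighted negative homogeneous Sobolev norm amounts to solving the elliptic problem on the right, which is where the Witten-Laplacian boundary value problem mentioned above enters.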

<-—2022———2022———1750—


Geodesic Properties of a Generalized Wasserstein Embedding for Time Series Analysis
Authors:Li, Shiying (Creator), Rubaiyat, Abu Hasnat Mohammad (Creator), Rohde, Gustavo K. (Creator)
Summary:Transport-based metrics and related embeddings (transforms) have recently been used to model signal classes where nonlinear structures or variations are present. In this paper, we study the geodesic properties of time series data with a generalized Wasserstein metric and the geometry related to their signed cumulative distribution transforms in the embedding space. Moreover, we show how understanding such geometric characteristics can provide added interpretability to certain time series classifiers, and be an inspiration for more robust classifiers.
Downloadable Archival Material, 2022-06-04
Undefined
Publisher:2022-06-04
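As background for the transport-based time-series embeddings above: in one dimension the Wasserstein distance reduces to comparing empirical quantile functions, which SciPy exposes directly for $p=1$. The snippet below is generic illustration only, not the paper's signed cumulative distribution transform; the sample data are hypothetical.

import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
series_a = rng.normal(0.0, 1.0, size=500)  # hypothetical signal samples
series_b = rng.normal(0.5, 1.2, size=500)
# 1-Wasserstein distance computed from sorted samples / quantile functions
print(wasserstein_distance(series_a, series_b))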


Wasserstein Generative Adversarial Networks for Online Test Generation for Cyber Physical Systems
Authors:Peltomäki, Jarkko (Creator), Spencer, Frankie (Creator), Porres, Ivan (Creator)
Summary:We propose a novel online test generation algorithm WOGAN based on Wasserstein Generative Adversarial Networks. WOGAN is a general-purpose black-box test generator applicable to any system under test having a fitness function for determining failing tests. As a proof of concept, we evaluate WOGAN by generating roads such that a lane assistance system of a car fails to stay on the designated lane. We find that our algorithm has competitive performance with respect to previously published algorithms.
Downloadable Archival Material, 2022-05-23
Undefined
Publisher:2022-05-23


Optimal Neural Network Approximation of Wasserstein Gradient Direction via Convex Optimization
Authors:Wang, Yifei (Creator), Chen, Peng (Creator), Pilanci, Mert (Creator), Li, Wuchen (Creator)
Summary:The computation of Wasserstein gradient direction is essential for posterior sampling problems and scientific computing. The approximation of the Wasserstein gradient with finite samples requires solving a variational problem. We study the variational problem in the family of two-layer networks with squared-ReLU activations, for which we derive a semi-definite programming (SDP) relaxation. This SDP can be viewed as an approximation of the Wasserstein gradient in a broader function family including two-layer networks. By solving the convex SDP, we obtain the optimal approximation of the Wasserstein gradient direction in this class of functions. Numerical experiments including PDE-constrained Bayesian inference and parameter estimation in COVID-19 modeling demonstrate the effectiveness of the proposed method.
Downloadable Archival Material, 2022-05-25
Undefined
Publisher:2022-05-25
Cited by 2
Related articles All 6 versions


Semi-supervised Surface Wave Tomography with Wasserstein Cycle-consistent GAN: Method and Application on Southern California Plate Boundary Region
Authors:Cai, Ao (Creator), Qiu, Hongrui (Creator), Niu, Fenglin (Creator)
Summary:Machine learning algorithms have been applied to shear wave velocity (Vs) inversion in surface wave tomography, where a set of starting 1-D Vs profiles and their corresponding synthetic dispersion curves are used in network training. Previous studies showed that the performance of such a trained network depends on the diversity of the training data set, which limits its application to previously poorly understood regions. Here, we present an improved semi-supervised algorithm-based network that takes both model-generated and observed surface wave dispersion data in the training process. The algorithm is termed Wasserstein cycle-consistent generative adversarial networks (Wasserstein Cycle-GAN [Wcycle-GAN]). Different from conventional supervised approaches, the GAN architecture enables the inclusion of unlabeled data (the observed surface wave dispersion) in the training process that can complement the model-generated data set. The cycle-consistency and Wasserstein metric significantly improve the training stability of the proposed algorithm. We benchmark the Wcycle-GAN method using 4,076 pairs of fundamental mode Rayleigh wave phase and group velocity dispersion curves derived in periods from 3 to 16 s in Southern California. The final 3-D Vs model given by the best trained network shows large-scale features consistent with the surface geology. The resulting Vs model has reasonable data misfits and provides sharper images of structures near faults in the top 15 km compared with those from conventional machine learning methods.
Downloadable Archival Material, 2022-06-15T15:10:50Z
Undefined
Publisher:Wiley, 2022-06-15T15:10:50Z


Topological Classification in a Wasserstein Distance Based Vector Space
Authors:Songdechakraiwut, Tananun (Creator), Krause, Bryan M. (Creator), Banks, Matthew I. (Creator), Nourski, Kirill V. (Creator), Van Veen, Barry D. (Creator)
Summary:Classification of large and dense networks based on topology is very difficult due to the computational challenges of extracting meaningful topological features from real-world networks. In this paper we present a computationally tractable approach to topological classification of networks by using principled theory from persistent homology and optimal transport to define a novel vector representation for topological features. The proposed vector space is based on the Wasserstein distance between persistence barcodes. The 1-skeleton of the network graph is employed to obtain 1-dimensional persistence barcodes that represent connected components and cycles. These barcodes and the corresponding Wasserstein distance can be computed very efficiently. The effectiveness of the proposed vector space is demonstrated using support vector machines to classify simulated networks and measured functional brain networks.
Downloadable Archival Material, 2022-02-02
Undefined
Publisher:2022-02-02

2022


Poems for life : celebrities choose their favorite poem and say why it inspires them
Author:Anna Quindlen (Writer of introduction)
Summary:When a group of fifth-graders asked fifty celebrities what their favorite poem was and why, the answers they received became a beautiful collection of some of the world's most beloved poems, from classic to contemporary. In this new edition, Poems for Life continues to offer inspiration, solace, wisdom, and sometimes humor. Each poem is accompanied by the celebrity's brief letter explaining why they chose it and its resonance for them.
Print Book, 2022
English
Publisher:Arcade Publishing, New York, 2022
Also available as eBook


Wassmap: Wasserstein Isometric Mapping for Image Manifold Learning
Authors:Hamm, Keaton (Creator), Henscheid, Nick (Creator), Kang, Shujie (Creator)
Summary:In this paper, we propose Wasserstein Isometric Mapping (Wassmap), a parameter-free nonlinear dimensionality reduction technique that provides solutions to some drawbacks in existing global nonlinear dimensionality reduction algorithms in imaging applications. Wassmap represents images via probability measures in Wasserstein space, then uses pairwise quadratic Wasserstein distances between the associated measures to produce a low-dimensional, approximately isometric embedding. We show that the algorithm is able to exactly recover parameters of some image manifolds including those generated by translations or dilations of a fixed generating measure. Additionally, we show that a discrete version of the algorithm retrieves parameters from manifolds generated from discrete measures by providing a theoretical bridge to transfer recovery results from functional data to discrete data. Testing of the proposed algorithms on various image data manifolds shows that Wassmap yields good embeddings compared with other global techniques.
Downloadable Archival Material, 2022-04-13
Undefined
Publisher:2022-04-13

Cited by 5 Related articles All 2 versions 
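A rough sketch of the Wassmap recipe summarized above, assuming the POT (`ot`) and scikit-learn packages: compute pairwise quadratic Wasserstein distances between images treated as discrete measures on the pixel grid, then feed the distance matrix to classical MDS. The helper names are mine, exact EMD is only practical for small images, and the paper's algorithm and guarantees go well beyond this illustration.

import numpy as np
import ot  # Python Optimal Transport (POT)
from sklearn.manifold import MDS

def image_to_measure(img):
    # pixel-grid support with normalized intensities as weights
    h, w = img.shape
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    support = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)
    weights = img.ravel().astype(float)
    return support, weights / weights.sum()

def pairwise_w2(images):
    measures = [image_to_measure(im) for im in images]
    n = len(measures)
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            (xi, ai), (xj, aj) = measures[i], measures[j]
            M = ot.dist(xi, xj)  # squared-Euclidean ground cost
            D[i, j] = D[j, i] = np.sqrt(ot.emd2(ai, aj, M))  # exact 2-Wasserstein distance
    return D

# embedding = MDS(n_components=2, dissimilarity="precomputed").fit_transform(pairwise_w2(images))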


Multi-Variate Risk Measures under Wasserstein Barycenter
Authors:M. Andrea Arias-Serna; Jean Michel Loubes; Francisco J. Caro-Lopera
Summary:When the uni-variate risk measure analysis is generalized into the multi-variate setting, many complex theoretical and applied problems arise, and therefore the mathematical models used for risk quantification usually present model risk. As a result, regulators have started to require that the internal models used by financial institutions are more precise. For this task, we propose a novel multi-variate risk measure, based on the notion of the Wasserstein barycenter. The proposed approach robustly characterizes the company's exposure, filtering the partial information available from individual sources into an aggregate risk measure, providing an easily computable estimation of the total risk incurred. The new approach allows effective computation of Wasserstein barycenter risk measures in any location-scatter family, including the Gaussian case. In such cases, the Wasserstein barycenter Value-at-Risk belongs to the same family, thus it is characterized just by its mean and deviation. It is important to highlight that the proposed risk measure is expressed in closed analytic forms which facilitate its use in day-to-day risk management. The performance of the new multi-variate risk measures is illustrated in United States market indices of high volatility during the global financial crisis (2008) and during the COVID-19 pandemic situation, showing that the proposed approach provides the best forecasts of risk measures not only for "normal periods", but also for periods of high volatility.
Downloadable Article, 2022
Publication:10, 20220901, 180
Publisher:2022
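For context on the location-scatter computations mentioned above, two standard facts (notation mine, not necessarily the paper's) are what make such barycenter risk measures computable in closed form. The 2-Wasserstein distance between Gaussians is
$$W_2^2\big(\mathcal N(m_1,\Sigma_1),\mathcal N(m_2,\Sigma_2)\big)=\|m_1-m_2\|^2+\operatorname{tr}\!\Big(\Sigma_1+\Sigma_2-2\big(\Sigma_1^{1/2}\Sigma_2\Sigma_1^{1/2}\big)^{1/2}\Big),$$
and the barycenter of Gaussians $\mathcal N(m_i,\Sigma_i)$ with weights $\lambda_i$ is again Gaussian, with mean $\sum_i\lambda_i m_i$ and covariance solving the fixed-point equation $\Sigma=\sum_i\lambda_i\big(\Sigma^{1/2}\Sigma_i\Sigma^{1/2}\big)^{1/2}$.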

3033 see 2021  Peer-reviewed
A Wasserstein generative adversarial network-based approach for real-time track irregularity estimation using vehicle dynamic responses
Authors:Zhandong Yuan; Jun Luo; Shengyang Zhu; Wanming Zhai
Summary:Accurate and timely estimation of track irregularities is the foundation for predictive maintenance and high-fidelity dynamics simulation of the railway system. Therefore, it's of great interest to devise a real-time track irregularity estimation method based on dynamic responses of the in-service train. In this paper, a Wasserstein generative adversarial network (WGAN)-based framework is developed to estimate the track irregularities using the vehicle's axle box acceleration (ABA) signal. The proposed WGAN is composed of a generator architected by an encoder-decoder structure and a spectral normalised (SN) critic network. The generator is supposed to capture the correlation between ABA signal and track irregularities, and then estimate the irregularities with the measured ABA signal as input; while the critic is supposed to instruct the generator's training by optimising the calculated Wasserstein distance. We combine supervised learning and adversarial learning in the network training process, where the estimation loss and adversarial loss are jointly optimised. Optimising the estimation loss is anticipated to estimate the long-wave track irregularities while optimising the adversarial loss accounts for the short-wave track irregularities. Two numerical cases, namely vertical and spatial vehicle-track coupled dynamics simulation, are implemented to validate the accuracy and reliability of the proposed method.
Article
Publication:Vehicle System Dynamics, 60, 20221202, 4186
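As a reminder of the Wasserstein (critic) objective that WGAN-style estimators such as the one above optimise, here is a hedged PyTorch sketch of one critic update with a spectrally normalised toy critic; the shapes, architecture and optimiser are placeholders of mine, not the authors' network.

import torch
import torch.nn as nn

critic = nn.Sequential(  # toy spectrally normalised critic
    nn.utils.spectral_norm(nn.Linear(128, 64)), nn.LeakyReLU(0.2),
    nn.utils.spectral_norm(nn.Linear(64, 1)),
)
generator = nn.Sequential(nn.Linear(32, 128))  # toy generator
opt_c = torch.optim.RMSprop(critic.parameters(), lr=5e-5)

real = torch.randn(16, 128)  # stand-in for measured ABA windows
fake = generator(torch.randn(16, 32)).detach()

# The critic maximizes E[f(real)] - E[f(fake)]; equivalently, minimize the negation.
loss_c = critic(fake).mean() - critic(real).mean()
opt_c.zero_grad()
loss_c.backward()
opt_c.step()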

Peer-reviewed
Maps on positive definite cones of $C^*$-algebras preserving the Wasserstein mean
Author:Lajos Molnár
Summary:The primary aim of this paper is to present the complete description of the isomorphisms between positive definite cones of $C^*$-algebras with respect to the recently introduced Wasserstein mean and to show the nonexistence of nonconstant such morphisms into the positive reals in the case of von Neumann algebras without type I$_2$, I$_1$ direct summands. A comment on the algebraic properties of the Wasserstein mean relating to associativity is also made.
Downloadable Article, 2022
Publication:Proceedings of the American Mathematical Society, 150, March 1, 2022, 1209
Publisher:2022

<-—2022———2022———1760—



Peer-reviewed
Wasserstein generative adversarial uncertainty quantification in physics-informed neural networks
Authors:Yihang Gao; Michael K. Ng
Summary:In this paper, we study a physics-informed algorithm for Wasserstein Generative Adversarial Networks (WGANs) for uncertainty quantification in solutions of partial differential equations. By using groupsort activation functions in adversarial network discriminators, network generators are utilized to learn the uncertainty in solutions of partial differential equations observed from the initial/boundary data. Under mild assumptions, we show that the generalization error of the computed generator converges to the approximation error of the network with high probability, when sufficiently many samples are taken. According to our established error bound, we also find that our physics-informed WGANs place a higher requirement on the capacity of discriminators than on that of generators. Numerical results on synthetic examples of partial differential equations are reported to validate our theoretical results and demonstrate how uncertainty quantification can be obtained for solutions of partial differential equations and the distributions of initial/boundary data. However, a theory for the quality or accuracy of the uncertainty quantification at all interior points remains open and requires further research.
Article
Publication:Journal of Computational Physics, 463, 2022-08-15

Cited by 8 Related articles All 5 versions


A brief survey on Computational Gromov-Wasserstein distance
Authors:Lei Zheng; Yang Xiao; Lingfeng Niu
Summary:A graph is a widely used data structure which can be regarded as a generalized metric measure space. Since a graph includes not only points but also relations between points, the distance between two graphs is difficult to measure. In recent years, the Gromov-Wasserstein discrepancy has been proposed as a pseudometric on graphs which has a complete theoretical basis, but has few applications. Therefore, this paper reviews the basic ideas, applications and improvements of Gromov-Wasserstein. Since the Gromov-Wasserstein discrepancy is a quadratic program and difficult to compute, this paper focuses on the iterative algorithms for solving this discrepancy. At the end, we look forward to the development of the Gromov-Wasserstein discrepancy.
Article
Publication:Procedia Computer Science, 199, 2022, 697


Peer-reviewed
Optimal visual tracking using Wasserstein transport proposals
Authors:Jin Hong; Junseok Kwon
Summary:We propose a novel visual tracking method based on the Wasserstein transport proposal (WTP). In this study, we theoretically derive the optimal proposal function in Markov chain Monte Carlo (MCMC) based visual tracking frameworks. For this objective, we adopt the optimal transport theory in the Wasserstein space and present a new transport map that can transform from a simple proposal distribution to the optimal target distribution. To find the best transport map, we conduct an additional Monte Carlo simulation. Experimental results demonstrate that the proposed method outperforms other state-of-the-art visual tracking methods. The proposed WTP can be substituted with conventional proposal functions in an MCMC framework, and thus can be plugged into any existing MCMC-based visual tracker.

• We propose a visual tracker using the optimal proposal function in MCMC. • We use a Wasserstein transport proposal (WTP) as the optimal proposal function. • The proposed WTP is highly applicable to conventional MCMC frameworks.
Article, 2022
Publication:Expert Systems With Applications, 209, 20221215
Publisher:2022
Peer-reviewed
Wasserstein Adversarial Regularization for learning with label noise
Authors:Kilian Fatras; Bharath Bhushan Damodaran; Sylvain Lobry; Remi Flamary; Devis Tuia; Nicolas Courty
Summary:Noisy labels often occur in vision datasets, especially when they are obtained from crowdsourcing or Web scraping. We propose a new regularization method, which enables learning robust classifiers in the presence of noisy data. To achieve this goal, we propose a new adversarial regularization scheme based on the Wasserstein distance. Using this distance allows taking into account specific relations between classes by leveraging the geometric properties of the labels space. Our Wasserstein Adversarial Regularization (WAR) encodes a selective regularization, which promotes smoothness of the classifier between some classes, while preserving sufficient complexity of the decision boundary between others. We first discuss how and why adversarial regularization can be used in the context of noise and then show the effectiveness of our method on five datasets corrupted with noisy labels: in both benchmarks and real datasets, WAR outperforms the state-of-the-art competitors.
Article
Publication:IEEE Transactions on Pattern Analysis and Machine Intelligence, 44, 2022, 7296



Peer-reviewed
Sliced Wasserstein Distance for Neural Style Transfer
Authors:Jie Li; Dan Xu; Shaowen Yao
Summary:Neural Style Transfer (NST) aims to render a content image with the style of another image in the feature space of a Convolution Neural Network (CNN). A fundamental concept of NST is to define the features extracted from a CNN as a distribution so that the style similarity can be computed by measuring the distance between distributions. Conceptually, Wasserstein Distance (WD) is ideal for measuring the distance between distributions as it theoretically guarantees the similarity of style distributions with the WD between them equaling 0. However, due to the high computation cost of WD, previous WD-based methods either oversimplify the style distribution or only use a lower bound of WD, therefore, losing the theoretical guarantee of WD. In this paper, we propose a new style loss based on Sliced Wasserstein Distance (SWD), which has a theoretical approximation guarantee. Besides, an adaptive sampling algorithm is also proposed to further improve the style transfer results. Experiment results show that the proposed method improves the similarity of style distributions, and such improvements result in visually better style transfer results.
Article
Publication:Computers & Graphics, 102, February 2022, 89
Cited by 1 Related articles All 2 versions
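Since several entries here rely on the sliced Wasserstein distance, a minimal NumPy sketch may help: project both point clouds onto random directions and average the 1-D squared 2-Wasserstein distances of the projections, computed by sorting. This assumes equally sized, uniformly weighted feature sets and is generic background, not the paper's adaptive sampling scheme; all names are mine.

import numpy as np

def sliced_wasserstein_sq(x, y, n_proj=100, seed=0):
    # x, y: (n, d) feature sets of equal size with uniform weights
    rng = np.random.default_rng(seed)
    d = x.shape[1]
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)  # random unit direction
        px, py = np.sort(x @ theta), np.sort(y @ theta)  # sorted 1-D projections
        total += np.mean((px - py) ** 2)  # squared 2-Wasserstein distance in 1-D
    return total / n_proj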


Peer-reviewed
Wasserstein-based texture analysis in radiomic studies
Authors:Zehor Belkhatir; Raúl San José Estépar; Allen R. Tannenbaum
Summary:The emerging field of radiomics that transforms standard-of-care images to quantifiable scalar statistics endeavors to reveal the information hidden in these macroscopic images. The concept of texture is widely used and essential in many radiomic-based studies. Practice usually reduces spatial multidimensional texture matrices, e.g., gray-level co-occurrence matrices (GLCMs), to summary scalar features. These statistical features have been demonstrated to be strongly correlated and tend to contribute redundant information, and they do not account for the spatial information hidden in the multivariate texture matrices. This study proposes a novel pipeline to deal with spatial texture features in radiomic studies. A new set of textural features that preserve the spatial information inherent in GLCMs is proposed and used for classification purposes. The set of the new features uses the Wasserstein metric from optimal mass transport theory (OMT) to quantify the spatial similarity between samples within a given label class. In particular, based on a selected subset of texture GLCMs from the training cohort, we propose new representative spatial texture features, which we incorporate into a supervised image classification pipeline. The pipeline relies on the support vector machine (SVM) algorithm along with Bayesian optimization and the Wasserstein metric. The selection of the best GLCM references is considered for each classification label and is performed during the training phase of the SVM classifier using a Bayesian optimizer. We assume that sample fitness is defined based on closeness (in the sense of the Wasserstein metric) and high correlation (Spearman's rank sense) with other samples in the same class. Moreover, the newly defined spatial texture features consist of the Wasserstein distance between the optimally selected references and the remaining samples. We assessed the performance of the proposed classification pipeline in diagnosing the coronavirus disease 2019 (COVID-19) from computed tomographic (CT) images. To evaluate the proposed spatial features' added value, we compared the performance of the proposed classification pipeline with other SVM-based classifiers that account for different texture features, namely: statistical features only, optimized spatial features using Euclidean metric, non-optimized spatial features with Wasserstein metric. The proposed technique, which accounts for the optimized spatial texture feature with Wasserstein metric, shows great potential in classifying new COVID CT images that the algorithm has not seen in the training step. The MATLAB code of the proposed classification pipeline is made available. It can be used to find the best reference samples in other data cohorts, which can then be employed to build different prediction models.

• Large data cohorts inherently have representative "reference" samples. • Radiomics statistical texture features lose the spatial information inherent in high-dimensional texture matrices. • Proposing spatial texture features considering optimal reference samples. • Robust Wasserstein metric from Optimal Mass transport (OMT) theory used in the proposed classification pipeline.
Article, 2022
Publication:Computerized Medical Imaging and Graphics, 102, 202212
Publisher:2022
Related articles All 3 versions


2022


Peer-reviewed
Unsupervised domain adaptation of bearing fault diagnosis based on Join Sliced Wasserstein Distance

Authors:Pengfei Chen; Rongzhen Zhao; Tianjing He; Kongyuan Wei; Qidong Yang
Summary:Deep neural networks have been successfully utilized in mechanical fault diagnosis; however, most of them rely on the assumption that training and test datasets follow the same distribution. Unfortunately, mechanical systems are easily affected by environmental noise interference and changes in speed or load. Consequently, the trained networks generalize poorly under varying working conditions. Recently, unsupervised domain adaptation has attracted more and more attention since it can handle different but related data. The Sliced Wasserstein Distance has been successfully utilized in unsupervised domain adaptation and obtained excellent performance. However, most of these approaches ignore the class-conditional distribution. In this paper, a novel approach named Join Sliced Wasserstein Distance (JSWD) is proposed to address this issue. Four bearing datasets are selected to validate the practicability and effectiveness of the JSWD framework. The experimental results demonstrate that JSWD improves accuracy by about 5% when the conditional probability is taken into account compared with when it is not; in addition, further experiments indicate that JSWD effectively captures distinguishable and domain-invariant representations and matches data distributions better than previous methods under various application scenarios.

• We consider the correlation of the output probabilities of each sample and obtain the conditional probability of the Sliced Wasserstein Distance. • Our work directly takes the raw signals as the input of the CNN, which provides an end-to-end model. • Different noises are added to the fault datasets to explore the sensitivity and robustness of the proposed method.
Article, 2022
Publication:ISA Transactions, 129, 202210, 504
Publisher:2022
Cited by 23
 Related articles All 3 versions


Virtual persistence diagrams, signed measures, Wasserstein distances, and Banach spaces
Authors:Peter Bubenik; Alex Elchesen
Summary:Abstract: Persistence diagrams, an important summary in topological data analysis, consist of a set of ordered pairs, each with positive multiplicity. Persistence diagrams are obtained via Möbius inversion and may be compared using a one-parameter family of metrics called Wasserstein distances. In certain cases, Möbius inversion produces sets of ordered pairs which may have negative multiplicity. We call these virtual persistence diagrams. Divol and Lacombe recently showed that there is a Wasserstein distance for Radon measures on the half plane of ordered pairs that generalizes both the Wasserstein distance for persistence diagrams and the classical Wasserstein distance from optimal transport theory. Following this work, we define compatible Wasserstein distances for persistence diagrams and Radon measures on arbitrary metric spaces. We show that the 1-Wasserstein distance extends to virtual persistence diagrams and to signed measures. In addition, we characterize the Cauchy completion of persistence diagrams with respect to the Wasserstein distances. We also give a universal construction of a Banach space with a 1-Wasserstein norm. Persistence diagrams with the 1-Wasserstein distance isometrically embed into this Banach space.
Article, 2022
Publication:Journal of Applied and Computational Topology, 6, 20220421, 429
Publisher:2022

Semi-Supervised Surface Wave Tomography With Wasserstein Cycle-Consistent GAN: Method and Application to Southern California Plate Boundary Region
Authors:Ao Cai; Hongrui Qiu; Fenglin Niu
Article, 2022
Publication:Journal of geophysical research.Solid earth, 127, 2022, N
Publisher:2022

Peer-reviewed
Isometric rigidity of Wasserstein spaces: The graph metric case
Authors:Gergely Kiss; Tamás Titkos
Summary:The aim of this paper is to prove that the $p$-Wasserstein space $\mathcal {W}_p(X)$ is isometrically rigid for all $p≥ 1$ whenever $X$ is a countable graph metric space. As a consequence, we obtain that for every countable group ${H}$ and any $p≥ 1$ there exists a $p$-Wasserstein space whose isometry group is isomorphic to ${H}$.
Downloadable Article, 2022
Publication:Proceedings of the American Mathematical Society, 150, September 1, 2022, 4083
Publisher:2022


Peer-reviewed
Wasserstein convergence rate for empirical measures on noncompact manifolds
Author:Feng-Yu Wang
Summary:Let $X_t$ be the (reflecting) diffusion process generated by $L=\Delta+\nabla V$ on a complete connected Riemannian manifold $M$, possibly with a boundary $\partial M$, where $V\in C^1(M)$ is such that $\mu(\mathrm{d}x)=\mathrm{e}^{V(x)}\,\mathrm{d}x$ is a probability measure. We estimate the convergence rate for the empirical measure $\mu_t=\frac{1}{t}\int_0^t\delta_{X_s}\,\mathrm{d}s$ under the Wasserstein distance. As a typical example, when $M=\mathbb{R}^d$ and $V(x)=c_1-c_2|x|^p$ for some constants $c_1\in\mathbb{R}$, $c_2>0$ and $p>1$, explicit upper and lower bounds are presented for the convergence rate, which are of sharp order when either $d<\frac{4(p-1)}{p}$ or $d\ge 4$ and $p\to\infty$.
Article, 2022
Publication:Stochastic Processes and their Applications, 144, 202202, 271
Publisher:2022

<-—2022———2022———1770—


ECG Classification based on Wasserstein Scalar Curvature

Authors:Sun, Fupeng (Creator), Ni, Yin (Creator), Luo, Yihao (Creator), Sun, Huafei (Creator)
Summary:Electrocardiogram (ECG) analysis is one of the most important ways to diagnose heart disease. This paper proposes an efficient ECG classification method based on Wasserstein scalar curvature to comprehend the connection between heart disease and the mathematical characteristics of ECG. The newly proposed method converts an ECG into a point cloud on the family of Gaussian distributions, where the pathological characteristics of the ECG will be extracted by the Wasserstein geometric structure of the statistical manifold. Technically, this paper defines the histogram dispersion of Wasserstein scalar curvature, which can accurately describe the divergence between different heart diseases. By combining medical experience with mathematical ideas from geometry and data science, this paper provides a feasible algorithm for the new method, and the theoretical analysis of the algorithm is carried out. Digital experiments on the classical database with large samples show the new algorithm's accuracy and efficiency when dealing with the classification of heart disease.
Downloadable Archival Material, 2022-06-25
Undefined
Publisher:2022-06-25

Related articles All 8 versions

Wasserstein Adversarial Learning based Temporal Knowledge Graph Embedding
Authors:Dai, Yuanfei (Creator), Guo, Wenzhong (Creator), Eickhoff, Carsten (Creator)
Summary:Research on knowledge graph embedding (KGE) has emerged as an active field in which most existing KGE approaches mainly focus on static structural data and ignore the influence of temporal variation involved in time-aware triples. In order to deal with this issue, several temporal knowledge graph embedding (TKGE) approaches have been proposed to integrate temporal and structural information in recent years. However, these methods only employ a uniformly random sampling to construct negative facts. As a consequence, the corrupted samples are often too simplistic for training an effective model. In this paper, we propose a new temporal knowledge graph embedding framework by introducing adversarial learning to further refine the performance of traditional TKGE models. In our framework, a generator is utilized to construct high-quality plausible quadruples and a discriminator learns to obtain the embeddings of entities and relations based on both positive and negative samples. Meanwhile, we also apply a Gumbel-Softmax relaxation and the Wasserstein distance to prevent vanishing gradient problems on discrete data; an inherent flaw in traditional generative adversarial networks. Through comprehensive experimentation on temporal datasets, the results indicate that our proposed framework can attain significant improvements based on benchmark models and also demonstrate the effectiveness and applicability of our framework.
Downloadable Archival Material, 2022-05-03
Undefined
Publisher:2022-05-03

Amortized Projection Optimization for Sliced Wasserstein Generative Models
Authors:Nguyen, Khai (Creator), Ho, Nhat (Creator)
Summary:Seeking informative projecting directions has been an important task in utilizing sliced Wasserstein distance in applications. However, finding these directions usually requires an iterative optimization procedure over the space of projecting directions, which is computationally expensive. Moreover, the computational issue is even more severe in deep learning applications, where computing the distance between two mini-batch probability measures is repeated several times. This nested loop has been one of the main challenges that prevent the usage of sliced Wasserstein distances based on good projections in practice. To address this challenge, we propose to utilize the learning-to-optimize technique or amortized optimization to predict the informative direction of any given two mini-batch probability measures. To the best of our knowledge, this is the first work that bridges amortized optimization and sliced Wasserstein generative models. In particular, we derive linear amortized models, generalized linear amortized models, and non-linear amortized models which correspond to three types of novel mini-batch losses, named amortized sliced Wasserstein. We demonstrate the favorable performance of the proposed sliced losses in deep generative modeling on standard benchmark datasets.
Downloadable Archival Material, 2022-03-24
Undefined
Publisher:2022-03-24

Cited by 13 Related articles All 5 versions


The Quadratic Wasserstein Metric With Squaring Scaling For Seismic Velocity Inversion
Authors:Li, Zhengyang (Creator), Tang, Yijia (Creator), Chen, Jing (Creator), Wu, Hao (Creator)
Summary:The quadratic Wasserstein metric has shown its power in measuring the difference between probability densities; it gives the optimization objective function better convexity and is insensitive to data noise. Nevertheless, it is always an important question to make the seismic signals suitable for comparison using the quadratic Wasserstein metric. The squaring scaling is worth exploring since it guarantees the convexity caused by data shift. However, as mentioned in [Commun. Inf. Syst., 2019, 19:95-145], the squaring scaling may lose uniqueness and result in more local minima of the misfit function. In our previous work [J. Comput. Phys., 2018, 373:188-209], the quadratic Wasserstein metric with squaring scaling was successfully applied to the earthquake location problem. But it only discussed the inverse problem with few degrees of freedom. In this work, we will present a more in-depth study on the combination of the squaring scaling technique and the quadratic Wasserstein metric. By discarding some inapplicable data, picking seismic phases, and developing a new normalization method, we successfully invert the seismic velocity structure based on the squaring scaling technique and the quadratic Wasserstein metric. The numerical experiments suggest that this newly proposed method is an efficient approach to obtain more accurate inversion results.
Downloadable Archival Material, 2022-01-26
Undefined
Publisher:2022-01-26
 
A Unified Wasserstein Distributional Robustness Framework for Adversarial Training
Authors:Bui, Tuan Anh (Creator), Le, Trung (Creator), Tran, Quan (Creator), Zhao, He (Creator), Phung, Dinh (Creator)
Summary:It is well-known that deep neural networks (DNNs) are susceptible to adversarial attacks, exposing a severe fragility of deep learning systems. As a result, the adversarial training (AT) method, by incorporating adversarial examples during training, represents a natural and effective approach to strengthen the robustness of a DNN-based classifier. However, most AT-based methods, notably PGD-AT and TRADES, typically seek a pointwise adversary that generates the worst-case adversarial example by independently perturbing each data sample, as a way to "probe" the vulnerability of the classifier. Arguably, there are unexplored benefits in considering such adversarial effects from an entire distribution. To this end, this paper presents a unified framework that connects Wasserstein distributional robustness with current state-of-the-art AT methods. We introduce a new Wasserstein cost function and a new series of risk functions, with which we show that standard AT methods are special cases of their counterparts in our framework. This connection leads to an intuitive relaxation and generalization of existing AT methods and facilitates the development of a new family of distributional robustness AT-based algorithms. Extensive experiments show that our distributional robustness AT algorithms robustify further their standard AT counterparts in various settings.
Downloadable Archival Material, 2022-02-27
Undefined
Publisher:2022-02-27


2022


A Simple Duality Proof for Wasserstein Distributionally Robust Optimization
Authors:Zhang, Luhao (Creator), Yang, Jincheng (Creator), Gao, Rui (Creator)
Summary:We present a short and elementary proof of the duality for Wasserstein distributionally robust optimization, which holds for any arbitrary Kantorovich transport distance, any arbitrary measurable loss function, and any arbitrary nominal probability distribution, as long as a certain interchangeability principle holds.
Downloadable Archival Material, 2022-04-30
Undefined
Publisher:2022-04-30
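For reference, the duality being proved can be written (in standard notation, mine) as
$$\sup_{Q:\,W_c(Q,\widehat P)\le\rho}\ \mathbb E_{Q}\big[\ell(\zeta)\big]\;=\;\inf_{\lambda\ge 0}\Big\{\lambda\rho+\mathbb E_{\xi\sim\widehat P}\Big[\sup_{\zeta}\big(\ell(\zeta)-\lambda\,c(\zeta,\xi)\big)\Big]\Big\},$$
where $\widehat P$ is the nominal distribution, $c$ is the transport cost defining the Wasserstein (Kantorovich) distance $W_c$, and $\rho$ is the radius of the ambiguity ball.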

Graph Auto-Encoder Via Neighborhood Wasserstein Reconstruction
Authors:Tang, Mingyue (Creator), Yang, Carl (Creator), Li, Pan (Creator)
Summary:Graph neural networks (GNNs) have drawn significant research attention recently, mostly under the setting of semi-supervised learning. When task-agnostic representations are preferred or supervision is simply unavailable, the auto-encoder framework comes in handy with a natural graph reconstruction objective for unsupervised GNN training. However, existing graph auto-encoders are designed to reconstruct the direct links, so GNNs trained in this way are only optimized towards proximity-oriented graph mining tasks, and will fall short when the topological structures matter. In this work, we revisit the graph encoding process of GNNs which essentially learns to encode the neighborhood information of each node into an embedding vector, and propose a novel graph decoder to reconstruct the entire neighborhood information regarding both proximity and structure via Neighborhood Wasserstein Reconstruction (NWR). Specifically, from the GNN embedding of each node, NWR jointly predicts its node degree and neighbor feature distribution, where the distribution prediction adopts an optimal-transport loss based on the Wasserstein distance. Extensive experiments on both synthetic and real-world network datasets show that the unsupervised node representations learned with NWR are much more advantageous in structure-oriented graph mining tasks, while also achieving competitive performance in proximity-oriented ones.
Downloadable Archival Material, 2022-02-18
Undefined
Publisher:2022-02-18


Wasserstein Two-Sided Chance Constraints with An Application to Optimal Power Flow

Authors:Shen, Haoming (Creator), Jiang, Ruiwei (Creator)
Summary:As a natural approach to modeling system safety conditions, chance constraint (CC) seeks to satisfy a set of uncertain inequalities individually or jointly with high probability. Although a joint CC offers a stronger reliability certificate, it is oftentimes much more challenging to compute than individual CCs. Motivated by the application of optimal power flow, we study a special joint CC, named two-sided CC. We model the uncertain parameters through a Wasserstein ball centered at a Gaussian distribution and derive a hierarchy of conservative approximations based on second-order conic constraints, which can be efficiently computed by off-the-shelf commercial solvers. In addition, we show the asymptotic consistency of these approximations and derive their approximation guarantee when only a finite hierarchy is adopted. We demonstrate the out-of-sample performance and scalability of the proposed model and approximations in a case study based on the IEEE 118-bus and 3120-bus systems.
Downloadable Archival Material, 2022-03-31
Undefined
Publisher:2022-03-31


Detecting tiny objects in aerial images: A normalized Wasserstein distance and a new benchmark

Authors:Xu, Chang (Creator), Wang, Jinwang (Creator), Yang, Wen (Creator), Yu, Huai (Creator), Yu, Lei (Creator), Xia, Gui-Song (Creator)
Summary:Tiny object detection (TOD) in aerial images is challenging since a tiny object only contains a few pixels. State-of-the-art object detectors do not provide satisfactory results on tiny objects due to the lack of supervision from discriminative features. Our key observation is that the Intersection over Union (IoU) metric and its extensions are very sensitive to the location deviation of the tiny objects, which drastically deteriorates the quality of label assignment when used in anchor-based detectors. To tackle this problem, we propose a new evaluation metric dubbed Normalized Wasserstein Distance (NWD) and a new RanKing-based Assigning (RKA) strategy for tiny object detection. The proposed NWD-RKA strategy can be easily embedded into all kinds of anchor-based detectors to replace the standard IoU threshold-based one, significantly improving label assignment and providing sufficient supervision information for network training. Tested on four datasets, NWD-RKA can consistently improve tiny object detection performance by a large margin. Besides, observing prominent noisy labels in the Tiny Object Detection in Aerial Images (AI-TOD) dataset, we are motivated to meticulously relabel it and release AI-TOD-v2 and its corresponding benchmark. In AI-TOD-v2, the missing annotation and location error problems are considerably mitigated, facilitating more reliable training and validation processes. Embedding NWD-RKA into DetectoRS, the detection performance achieves 4.3 AP points improvement over state-of-the-art competitors on AI-TOD-v2. Datasets, codes, and more visualizations are available at: https://chasel-tsui.github.io/AI-TOD-v2
Downloadable Archival Material, 2022-06-28
Undefined
Publisher:2022-06-28


The Performance of Wasserstein Distributionally Robust M-Estimators in High Dimensions
Authors:Aolaritei, Liviu (Creator), Shafieezadeh-Abadeh, Soroosh (Creator), Dörfler, Florian (Creator)
Summary:Wasserstein distributionally robust optimization has recently emerged as a powerful framework for robust estimation, enjoying good out-of-sample performance guarantees, well-understood regularization effects, and computationally tractable dual reformulations. In such framework, the estimator is obtained by minimizing the worst-case expected loss over all probability distributions which are close, in a Wasserstein sense, to the empirical distribution. In this paper, we propose a Wasserstein distributionally robust M-estimation framework to estimate an unknown parameter from noisy linear measurements, and we focus on the important and challenging task of analyzing the squared error performance of such estimators. Our study is carried out in the modern high-dimensional proportional regime, where both the ambient dimension and the number of samples go to infinity, at a proportional rate which encodes the under/over-parametrization of the problem. Under an isotropic Gaussian features assumption, we show that the squared error can be recovered as the solution of a convex-concave optimization problem which, surprisingly, involves at most four scalar variables. To the best of our knowledge, this is the first work to study this problem in the context of Wasserstein distributionally robust M-estimation.
Downloadable Archival Material, 2022-06-27
Undefined
Publisher:2022-06-27

<-—2022———2022———1780—



A Wasserstein GAN for Joint Learning of Inpainting and its Spatial Optimisation
Author:Peter, Pascal (Creator)
Summary:Classic image inpainting is a restoration method that reconstructs missing image parts. However, a carefully selected mask of known pixels that yield a high quality inpainting can also act as a sparse image representation. This challenging spatial optimisation problem is essential for practical applications such as compression. So far, it has been almost exclusively addressed by model-based approaches. First attempts with neural networks seem promising, but are tailored towards specific inpainting operators or require postprocessing. To address this issue, we propose the first generative adversarial network for spatial inpainting data optimisation. In contrast to previous approaches, it allows joint training of an inpainting generator and a corresponding mask optimisation network. With a Wasserstein distance, we ensure that our inpainting results accurately reflect the statistics of natural images. This yields significant improvements in visual quality and speed over conventional stochastic models and also outperforms current spatial optimisation networks.
Downloadable Archival Material, 2022-02-11
Undefined
Publisher:2022-02-11

Cited by 1 Related articles All 3 versions


On the Generalization of Wasserstein Robust Federated Learning
Authors:Nguyen, Tung-Anh (Creator), Nguyen, Tuan Dung (Creator), Le, Long Tan (Creator), Dinh, Canh T. (Creator), Tran, Nguyen H. (Creator)
Summary:In federated learning, participating clients typically possess non-i.i.d. data, posing a significant challenge to generalization to unseen distributions. To address this, we propose a Wasserstein distributionally robust optimization scheme called WAFL. Leveraging its duality, we frame WAFL as an empirical surrogate risk minimization problem, and solve it using a local SGD-based algorithm with convergence guarantees. We show that the robustness of WAFL is more general than related approaches, and the generalization bound is robust to all adversarial distributions inside the Wasserstein ball (ambiguity set). Since the center location and radius of the Wasserstein ball can be suitably modified, WAFL shows its applicability not only in robustness but also in domain adaptation. Through empirical evaluation, we demonstrate that WAFL generalizes better than the vanilla FedAvg in non-i.i.d. settings, and is more robust than other related methods in distribution shift settings. Further, using benchmark datasets we show that WAFL is capable of generalizing to unseen target domains.
Downloadable Archival Material, 2022-06-03
Undefined
Publisher:2022-06-03


 


Variational inference via Wasserstein gradient flows
Authors:Lambert, Marc (Creator), Chewi, Sinho (Creator), Bach, Francis (Creator), Bonnabel, Silvère (Creator), Rigollet, Philippe (Creator)
Summary:Along with Markov chain Monte Carlo (MCMC) methods, variational inference (VI) has emerged as a central computational approach to large-scale Bayesian inference. Rather than sampling from the true posterior $\pi$, VI aims at producing a simple but effective approximation $\hat \pi$ to $\pi$ for which summary statistics are easy to compute. However, unlike the well-studied MCMC methodology, VI is still poorly understood and dominated by heuristics. In this work, we propose principled methods for VI, in which $\hat \pi$ is taken to be a Gaussian or a mixture of Gaussians, which rest upon the theory of gradient flows on the Bures-Wasserstein space of Gaussian measures. Akin to MCMC, it comes with strong theoretical guarantees when $\pi$ is log-concave.
Downloadable Archival Material, 2022-05-31
Undefined
Publisher:2022-05-31


Fast convergence of empirical barycenters in Alexandrov spaces and the Wasserstein space

Authors:Massachusetts Institute of Technology Department of Mathematics (Contributor), Le Gouic, Thibaut (Creator), Paris, Quentin (Creator), Rigollet, Philippe (Creator), Stromme, Austin J (Creator)
Downloadable Archival Material, 2022-10-14T16:28:22Z
English
Publisher:European Mathematical Society - EMS - Publishing House GmbH, 2022-10-14T16:28:22Z

Peer-reviewed
Indeterminacy estimates, eigenfunctions and lower bounds on Wasserstein distances

Authors:Nicolò De Ponti; Sara Farinelli
Summary:Abstract: In the paper we prove two inequalities in the setting of spaces using similar techniques. The first one is an indeterminacy estimate involving the p-Wasserstein distance between the positive part and the negative part of a function and the measure of the interface between the positive part and the negative part. The second one is a conjectured lower bound on the p-Wasserstein distance between the positive and negative parts of a Laplace eigenfunction.
Article, 2022
Publication:Calculus of Variations and Partial Differential Equations, 61, 20220505
Publisher:2022


2022



The Quantum Wasserstein Distance of Order 1

Authors:De Palma, Giacomo (Creator), Marvian, Milad (Creator), Trevisan, Dario (Creator), Lloyd, Seth (Creator)
Downloadable Archival Material, 2022-01-11T16:08:53Z
English
Publisher:Institute of Electrical and Electronics Engineers (IEEE), 2022-01-11T16:08:53Z


Estimation of Wasserstein distances in the Spiked Transport Model
Authors:Massachusetts Institute of Technology Department of Mathematics (Contributor), Niles-Weed, Jonathan (Creator), Rigollet, Philippe (Creator)
Downloadable Archival Material, 2022-10-14T16:56:23Z
English
Publisher:Bernoulli Society for Mathematical Statistics and Probability, 2022-10-14T16:56:23Z

Peer-reviewed
Multisource Wasserstein Adaptation Coding Network for EEG emotion recognition

Authors:Lei Zhu; Wangpan Ding; Jieping Zhu; Ping Xu; Yian Liu; Ming Yan; Jianhai Zhang
Summary:Emotion recognition has an important application in human-computer interaction (HCI). Electroencephalogram (EEG) is a reliable method in emotion recognition and is widely studied. However, owing to the individual variability of EEG, it is difficult to build a generic model for different subjects. In addition, EEG signals change across recording sessions, which has a great impact on the model. Therefore, building an effective model for cross-session, cross-subject, and cross-subject-and-session settings has become challenging. In order to solve this problem, we propose a new emotion recognition method called Multisource Wasserstein Adaptation Coding Network (MWACN). MWACN can simplify output data and retain important information of the input data through an Autoencoder. It also uses the Wasserstein distance and Association Reinforcement to adapt the marginal and conditional distributions. We validated the effectiveness of the model on the SEED and SEED-IV datasets. In the cross-session experiment, the accuracy of our model achieves 92.08% on SEED and 76.04% on SEED-IV. In the cross-subject experiment, the accuracy of our model achieves 87.59% on SEED and 74.38% on SEED-IV. In the cross-subject-and-session experiment, the accuracy of MWACN is improved by 3.72% on SEED and 5.88% on SEED-IV. The results show that our proposed MWACN outperforms recent domain adaptation algorithms.
Article
Publication:Biomedical Signal Processing and Control, 76, July 2022

Peer-reviewed
Detecting tiny objects in aerial images: A normalized Wasserstein distance and a new benchmark

Authors:Chang Xu; Jinwang Wang; Wen Yang; Huai Yu; Lei Yu; Gui-Song Xia
Summary:Tiny object detection (TOD) in aerial images is challenging since a tiny object only contains a few pixels. State-of-the-art object detectors do not provide satisfactory results on tiny objects due to the lack of supervision from discriminative features. Our key observation is that the Intersection over Union (IoU) metric and its extensions are very sensitive to the location deviation of the tiny objects, which drastically deteriorates the quality of label assignment when used in anchor-based detectors. To tackle this problem, we propose a new evaluation metric dubbed Normalized Wasserstein Distance (NWD) and a new RanKing-based Assigning (RKA) strategy for tiny object detection. The proposed NWD-RKA strategy can be easily embedded into all kinds of anchor-based detectors to replace the standard IoU threshold-based one, significantly improving label assignment and providing sufficient supervision information for network training. Tested on four datasets, NWD-RKA can consistently improve tiny object detection performance by a large margin. Besides, observing prominent noisy labels in the Tiny Object Detection in Aerial Images (AI-TOD) dataset, we are motivated to meticulously relabel it and release AI-TOD-v2 and its corresponding benchmark. In AI-TOD-v2, the missing annotation and location error problems are considerably mitigated, facilitating more reliable training and validation processes. Embedding NWD-RKA into DetectoRS, the detection performance achieves 4.3 AP points improvement over state-of-the-art competitors on AI-TOD-v2. Datasets, codes, and more visualizations are available at: https://chasel-tsui.github.io/AI-TOD-v2/
Article, 2022
Publication:ISPRS Journal of Photogrammetry and Remote Sensing, 190, 202208, 79

Cited by 13 Related articles All 5 versions


Wasserstein generative adversarial networks for form defects modeling

Authors:Yifan Qie; Mahdieh Balaghi; Nabil Anwer
Summary:Geometric deviations of mechanical products are specified by tolerancing in the design stage for a functional purpose. In order to verify the impact of geometric deviations on functional surfaces while considering the manufacturing process, form defects have been considered in tolerance analysis in recent years. As a digital representation of geometrical defects in mechanical parts and assemblies, Skin Model Shapes enables the rapid and comprehensive generation of non-ideal shapes from either measurement or via data augmentation using simulation approaches. This paper presents a novel method for form defects modeling using Generative Adversarial Networks (GAN). The form defects of cylindrical surfaces considering machining process are represented and used for training a Wasserstein GAN. The pre-trained network is able to generate realistic form defects for cylindrical Skin Model Shapes rapidly and automatically without explicitly formulated representations. Manufacturing errors in turning process are considered in this approach and the generated samples from WGAN can be re-used for generating new cylindrical surfaces with a mapping strategy considering specification. A case study of a cylindricity specification is used in the paper to illustrate the effectiveness of the proposed method.
Article
Publication:Procedia CIRP, 114, 2022, 7

Peer-reviewed
Wasserstein-based fairness interpretability framework for machine learning models

Authors:Alexey Miroshnikov; Konstandinos Kotsiopoulos; Ryan Franks; Arjun Ravi Kannan
Summary:Abstract: The objective of this article is to introduce a fairness interpretability framework for measuring and explaining the bias in classification and regression models at the level of a distribution. In our work, we measure the model bias across sub-population distributions in the model output using the Wasserstein metric. To properly quantify the contributions of predictors, we take into account favorability of both the model and predictors with respect to the non-protected class. The quantification is accomplished by the use of transport theory, which gives rise to the decomposition of the model bias and bias explanations to positive and negative contributions. To gain more insight into the role of favorability and allow for additivity of bias explanations, we adapt techniques from cooperative game theory.
Article, 2022
Publication:Machine Learning, 111, 20220721, 3307
Publisher:2022

<-—2022———2022———1790—



Peer-reviewed
Wasserstein approximate bayesian computation for visual tracking

Authors:Jinhee ParkJunseok Kwon
Summary:In this study, we present novel visual tracking methods based on the Wasserstein approximate Bayesian computation (ABC). For visual tracking, the proposed Wasserstein ABC (WABC) method approximates the likelihood within the Wasserstein space more accurately than the conventional ABC methods by directly measuring the discrepancy between the likelihood distributions. To encode the temporal dependency among time-series likelihood distributions, we extend the WABC method to the time-series WABC (TWABC) method. Subsequently, the proposed Hilbert TWABC (HTWABC) method reduces the computational costs caused by the TWABC method while substituting the original Wasserstein distance with the Hilbert distance. Experimental results demonstrate that the proposed visual trackers outperform other state-of-the-art visual tracking methods quantitatively. Moreover, ablation studies verify the effectiveness of individual components consisting of the proposed method (e.g., the Wasserstein distance, curve matching, and Hilbert metric)Show more
Article
Publication:Pattern Recognition, 131, November 2022
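For readers unfamiliar with Wasserstein approximate Bayesian computation, the toy sketch below shows the generic rejection-ABC loop with a 1-D Wasserstein discrepancy in place of summary statistics; it is only a schematic baseline under simple assumptions (1-D data, a user-supplied simulator and prior), not the trackers' WABC/TWABC/HTWABC algorithms.

import numpy as np
from scipy.stats import wasserstein_distance

def wasserstein_abc(observed, simulate, prior_sample, n_draws=10000, eps=0.1):
    # Toy rejection-ABC: keep the parameter draws whose simulated data sets
    # lie within eps of the observed data in 1-Wasserstein distance.
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()
        synthetic = simulate(theta)
        if wasserstein_distance(observed, synthetic) < eps:
            accepted.append(theta)
    return np.array(accepted)

# Toy usage: infer the mean of a unit-variance Gaussian.
rng = np.random.default_rng(0)
obs = rng.normal(1.5, 1.0, size=200)
posterior_draws = wasserstein_abc(
    obs,
    simulate=lambda m: rng.normal(m, 1.0, size=200),
    prior_sample=lambda: rng.uniform(-5, 5),
    eps=0.2,
)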

Peer-reviewed
Tackling Algorithmic Bias in Neural-Network Classifiers using Wasserstein-2 Regularization

Authors:Laurent RisserAlberto González SanzQuentin VincenotJean-Michel Loubes
Summary:Abstract: The increasingly common use of neural network classifiers in industrial and social applications of image analysis has allowed impressive progress these last years. Such methods are, however, sensitive to algorithmic bias, i.e., to an under- or an over-representation of positive predictions or to higher prediction errors in specific subgroups of images. We then introduce in this paper a new method to temper the algorithmic bias in Neural-Network-based classifiers. Our method is Neural-Network architecture agnostic and scales well to massive training sets of images. It indeed only overloads the loss function with a Wasserstein-2-based regularization term for which we back-propagate the impact of specific output predictions using a new model, based on the Gâteaux derivatives of the predictions distribution. This model is algorithmically reasonable and makes it possible to use our regularized loss with standard stochastic gradient-descent strategies. Its good behavior is assessed on the reference Adult census, MNIST, CelebA datasetsShow mor
Article, 2022
Publication:Journal of Mathematical Imaging and Vision, 64, 20220427, 672
Publisher:2022
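As background for Wasserstein-2 regularization of classifier outputs, a minimal differentiable penalty can compare the empirical quantile functions of the prediction scores in two groups; this quantile-grid sketch is a generic stand-in and not the Gâteaux-derivative-based regularizer of the paper above.

import torch

def w2_penalty(scores_a, scores_b, n_q=100):
    # Approximate squared 2-Wasserstein distance between two 1-D score
    # distributions by comparing their empirical quantile functions on a
    # common grid. Differentiable, so it can be added to a training loss,
    # e.g. total_loss = task_loss + lam * w2_penalty(scores_a, scores_b).
    q = torch.linspace(0.0, 1.0, n_q, device=scores_a.device)
    qa = torch.quantile(scores_a, q)
    qb = torch.quantile(scores_b, q)
    return torch.mean((qa - qb) ** 2)

# Example with random scores for two hypothetical subgroups.
penalty = w2_penalty(torch.rand(512), 0.2 + torch.rand(256))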


Cited by 7 Related articles All 4 versions


Peer-reviewed
Adversarial classification via distributional robustness with Wasserstein ambiguity
Authors:Nam Ho-NguyenStephen J. Wright
Summary:We study a model for adversarial classification based on distributionally robust chance constraints. We show that under Wasserstein ambiguity, the model aims to minimize the conditional value-at-risk of the distance to misclassification, and we explore links to adversarial classification models proposed earlier and to maximum-margin classifiers. We also provide a reformulation of the distributionally robust model for linear classification, and show it is equivalent to minimizing a regularized ramp loss objective. Numerical experiments show that, despite the nonconvexity of this formulation, standard descent methods appear to converge to the global minimizer for this problem. Inspired by this observation, we show that, for a certain class of distributions, the only stationary point of the regularized ramp loss minimization problem is the global minimizerShow more
Downloadable Article, 2022
Publication:Mathematical Programming, 20220405, 1
Publisher:2022


Peer-reviewed
Obstructions to extension of Wasserstein distances for variable masses

Authors:Luca LombardiniFrancesco Rossi
Summary:We study the possibility of defining a distance on the whole space of measures, with the property that the distance between two measures having the same mass is the Wasserstein distance, up to a scaling factor. We prove that, under very weak and natural conditions, if the base space is unbounded, then the scaling factor must be constant, independently of the mass. Moreover, no such distance can exist, if we include the zero measure. Instead, we provide examples with non-constant scaling factors for the case of bounded base spaces.
Downloadable Article, 2022
Publication:Proceedings of the American Mathematical Society, 150, November 1, 2022, 4879
Publisher:2022
 

2022 see 2021
Uncertainty quantification in a mechanical submodel driven by a Wasserstein-GAN

Authors:Hamza BOUKRAICHINissrine AKKARIFabien CASENAVEDavid RYCKELYNCK
Summary:The analysis of parametric and non-parametric uncertainties of very large dynamical systems requires the construction of a stochastic model of said system. Linear approaches relying on random matrix theory Soize (2000) and principal component analysis can be used when systems undergo low-frequency vibrations. In the case of fast dynamics and wave propagation, we investigate a random generator of boundary conditions for fast submodels by using machine learning. We show that the use of non-linear techniques in machine learning and data-driven methods is highly relevantShow more
Article
Publication:IFAC PapersOnLine, 55, 2022, 469

 2022

Peer-reviewed
Energy data generation with Wasserstein Deep Convolutional Generative Adversarial Networks

Authors:Jianbin LiZhiqiang ChenLong ChengXiufeng Liu
Summary:Residential energy consumption data and related sociodemographic information are critical for energy demand management, including providing personalized services, ensuring energy supply, and designing demand response programs. However, it is often difficult to collect sufficient data to build machine learning models, primarily due to cost, technical barriers, and privacy. Synthetic data generation becomes a feasible solution to address data availability issues, while most existing work generates data without considering the balance between usability and privacy. In this paper, we first propose a data generation model based on the Wasserstein Deep Convolutional Generative Adversarial Network (WDCGAN), which is capable of synthesizing fine-grained energy consumption time series and corresponding sociodemographic information. The WDCGAN model can generate realistic data by balancing data usability and privacy level by setting a hyperparameter during training. Next, we take the classification of sociodemographic information as an application example and train four classical classification models with the generated datasets, including CNN, LSTM, SVM, and LightGBM. We evaluate the proposed data generator using Irish data, and the results show that the proposed WDCGAN model can generate realistic load profiles with satisfactory similarity in terms of data distribution, patterns, and performance. The classification results validate the usability of the generated data for real-world machine learning applications with privacy guarantee, e.g., most of the differences in classification accuracy and F 1 scores are less than 8% between using real and synthesized data.

• An improved GAN method for generating residential electricity load profiles. • Load profile generation to address data privacy and availability issues. • A data generation method balancing data usability and privacy. • Experimental studies validating synthetic load profiles that are closely similar to real profiles.
Article, 2022
Publication:Energy, 257, 20221015
Publisher:2022


Peer-reviewed
A data-driven scheduling model of virtual power plant using Wasserstein distributionally robust optimization

Authors:Huichuan LiuJing QiuJunhua Zhao
Summary:• A data-driven Wasserstein distributionally robust optimization model is proposed. • The day-head scheduling decision of VPP can be solved by off-the-shell solver. • A set of data-driven linearization power flow constraints are constructed. • The model computation efficiency is improved for solving the decisions.

Distributed energy resources (DER) can be efficiently aggregated by aggregators to sell excessive electricity to spot market in the form of Virtual Power Plant (VPP). The aggregator schedules DER within VPP to participate in day-ahead market for maximizing its profits while keeping the static operating envelope provided by distribution system operator (DSO) in real-time operation. Aggregator, however, needs to make a decision of its offer for biding under the uncertainties of market price and wind power. This paper proposes a two-stage data-driven scheduling model of VPP in day-ahead (DA) and real time (RT) market. In DA market, in order to determine VPP output for biding, a piece-wise affine formulation of VPP profits combing with CVaR for avoiding market price risk is constructed firstly, and then a data-driven distributionally robust model using a Wasserstein ambiguity set is constructed under uncertainties of market price and wind forecast errors. A set of data-driven linearization power constraints are applied in both DA and RT operation when the parameters of distribution network are unknown or inexact. The model then is reformulated equivalently to a mixed 0-1 convex programming problem. The proposed scheduling model is tested on the IEEE 33-bus distribution network showing that under same 1000-sample dataset in training, proposed DRO model has over 85% of reliability while the stochastic optimization has only 69% under the market risk, which means the proposed model has a better out-of-sample performance for uncertaintiesShow more
Article, 2022
Publication:International Journal of Electrical Power and Energy Systems, 137, 202205
Publisher:2022
Cited by 13
 Related articles All 2 versions


Peer-reviewed
A New Perspective on Wasserstein Distances for Kinetic Problems

Author:Mikaela Iacobelli
Summary:Abstract: We introduce a new class of Wasserstein-type distances specifically designed to tackle questions concerning stability and convergence to equilibria for kinetic equations. Thanks to these new distances, we improve some classical estimates by Loeper (J Math Pures Appl (9) 86(1):68–79, 2006) and Dobrushin (Funktsional Anal i Prilozhen 13:48–58, 1979) on Vlasov-type equations, and we present an application to quasi-neutral limits.
Article, 2022
Publication:Archive for Rational Mechanics and Analysis, 244, 20220207, 27
Publisher:2022

<-—2022———2022———1800— 


Peer-reviewed
Learning brain representation using recurrent Wasserstein generative adversarial net

Authors:Ning QiangQinglin DongHongtao LiangJin LiShu ZhangCheng ZhangBao GeYifei SunJie GaoTianming Liu
Summary:To understand brain cognition and disorders, modeling the mapping between mind and brain has been of great interest to the neuroscience community. The key is the brain representation, including functional brain networks (FBN) and their corresponding temporal features. Recently, it has been proven that deep learning models have superb representation power on functional magnetic resonance imaging (fMRI) over traditional machine learning methods. However, due to the lack of high-quality data and labels, deep learning models tend to suffer from overfitting in the training processShow more
Article
Publication:Computer Methods and Programs in Biomedicine, 223, August 2022
Cited by 4
Related articles All 4 versions


Peer-reviewed
Multiview Wasserstein generative adversarial network for imbalanced pearl classification

Authors:Shuang GaoYun DaiYingjie LiKaixin LiuKun ChenYi Liu
Summary:This work described in this paper aims to enhance the level of automation of industrial pearl classification through deep learning methods. To better extract the features of different classes and improve classification accuracy, balanced training datasets are usually needed for machine learning methods. However, the pearl datasets obtained in practice are often imbalanced; in particular, the acquisition cost of some classes is high. An enhanced generative adversarial network, named the multiview Wasserstein generative adversarial network (MVWGAN), is proposed for the imbalanced pearl classification problem. For the minority classes in the training datasets, the MVWGAN method can generate high-quality multiview images simultaneously to balance the original imbalanced datasets. The augmented balanced datasets are used to train a multistream convolution neural network (MS-CNN) for pearl classification. The experimental results show that MVWGAN can overcome the imbalanced learning problem and improve the classification performance of MS-CNN effectively. Moreover, feature visualization is implemented to intuitively explain the effectiveness of MVWGANShow more
Article, 2022
Publication:Measurement Science and Technology, 33, 20220801
Publisher:2022
Cited by 10 Related articles All 2 versions


Peer-reviewed
Universality of persistence diagrams and the bottleneck and Wasserstein distances
Authors:Peter BubenikAlex Elchesen
Summary:We prove that persistence diagrams with the p-Wasserstein distance form the universal p-subadditive commutative monoid on an underlying metric space with a distinguished subset. This result applies to persistence diagrams, to barcodes, and to multiparameter persistence modules. In addition, the 1-Wasserstein distance satisfies Kantorovich-Rubinstein duality.
Article
Publication:Computational Geometry: Theory and Applications, 105-106, August-October 2022

Peer-reviewed
Single image super-resolution using Wasserstein generative adversarial network with gradient penalty

Authors:Yinggan TangChenglu LiuXuguang Zhang
Summary:Due to its strong sample generating ability, the Generative Adversarial Network (GAN) has been used to solve the single image super-resolution (SISR) problem and obtains high perceptual quality super-resolution (SR) images. However, GAN suffers from training instability and can even fail to converge. In this paper, a new SISR method is proposed based on Wasserstein GAN (WGAN), a GAN that is more stable to train thanks to the Wasserstein metric. To further increase the SR performance and make the training process easier and more stable, two modifications are made to the original WGAN. First, a gradient penalty (GP) is adopted to replace weight clipping. Second, a new residual block with “pre-activation” of the weight layer is constructed in the generators of WGAN. Extensive experiments show that the proposed method yields superior SR performance compared with the original GAN-based SR methods and many other methods in accuracy and perceptual quality.
Article
Publication:Pattern Recognition Letters, 163, November 2022, 32
Cited by 1 Related articles All 3 versions
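Since several entries in this list rely on WGAN with gradient penalty, a minimal sketch of the standard penalty term (Gulrajani et al.'s formulation, assuming 4-D image batches and a PyTorch critic) may be useful; it is generic and not tied to the super-resolution model above.

import torch

def gradient_penalty(critic, real, fake, device="cpu"):
    # Standard WGAN-GP penalty: penalize deviations of the critic's gradient
    # norm from 1 on random interpolates of real and fake samples.
    batch = real.size(0)
    eps = torch.rand(batch, 1, 1, 1, device=device)          # per-sample mixing weight
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads = torch.autograd.grad(
        outputs=scores, inputs=interp,
        grad_outputs=torch.ones_like(scores),
        create_graph=True, retain_graph=True,
    )[0]
    grads = grads.view(batch, -1)
    return ((grads.norm(2, dim=1) - 1.0) ** 2).mean()

# Critic loss sketch:
# loss_D = scores_fake.mean() - scores_real.mean() + lambda_gp * gradient_penalty(critic, real, fake)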


Peer-reviewed
On a linear Gromov-Wasserstein distance
Authors:Florian BeierRobert BeinertGabriele Steidl
Summary:Gromov-Wasserstein distances are generalizations of Wasserstein distances, which are invariant under distance preserving transformations. Although a simplified version of optimal transport in Wasserstein spaces, called linear optimal transport (LOT), was successfully used in practice, there does not exist a notion of linear Gromov-Wasserstein distances so far. In this paper, we propose a definition of linear Gromov-Wasserstein distances. We motivate our approach by a generalized LOT model, which is based on barycentric projection maps of transport plans. Numerical examples illustrate that the linear Gromov-Wasserstein distances, similarly as LOT, can replace the expensive computation of pairwise Gromov-Wasserstein distances in applications like shape classification.
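For context, the pairwise Gromov-Wasserstein computation that linear Gromov-Wasserstein distances aim to avoid can be evaluated with the POT library roughly as follows (a sketch assuming uniform weights and squared-Euclidean intra-space costs):

import numpy as np
import ot  # Python Optimal Transport (POT)

# Two point clouds known only through their internal distance matrices;
# they may even live in spaces of different dimension.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
Y = rng.normal(size=(50, 3))

C1 = ot.dist(X, X)                     # pairwise squared Euclidean distances
C2 = ot.dist(Y, Y)
p = ot.unif(len(X))                    # uniform weights on each cloud
q = ot.unif(len(Y))

gw_cost = ot.gromov.gromov_wasserstein2(C1, C2, p, q, loss_fun="square_loss")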

2022


Peer-reviewed
Hypothesis Test and Confidence Analysis With Wasserstein Distance on General Dimension

Authors:Masaaki ImaizumiHirofumi OtaTakuo Hamaguchi
Summary:We develop a general framework for statistical inference with the 1-Wasserstein distance. Recently, the Wasserstein distance has attracted considerable attention and has been widely applied to various machine learning tasks because of its excellent properties. However, hypothesis tests and a confidence analysis for it have not been established in a general multivariate setting. This is because the limit distribution of the empirical distribution with the Wasserstein distance is unavailable without strong restriction. To address this problem, in this study, we develop a novel nonasymptotic gaussian approximation for the empirical 1-Wasserstein distance. Using the approximation method, we develop a hypothesis test and confidence analysis for the empirical 1-Wasserstein distance. We also provide a theoretical guarantee and an efficient algorithm for the proposed approximation. Our experiments validate its performance numericallyShow more
Downloadable Article, 2022
Publication:Neural Computation, 34, 20220519, 1448
Publisher:2022

Cited by 4 Related articles All 9 versions
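As a generic baseline for testing with the empirical 1-Wasserstein distance (the paper above instead develops a non-asymptotic Gaussian approximation, which this sketch does not implement), a simple permutation calibration in one dimension looks like this:

import numpy as np
from scipy.stats import wasserstein_distance

def w1_permutation_test(x, y, n_perm=2000, seed=0):
    # Two-sample test with the empirical 1-Wasserstein distance as the test
    # statistic, calibrated by randomly permuting the pooled sample.
    rng = np.random.default_rng(seed)
    stat = wasserstein_distance(x, y)
    pooled = np.concatenate([x, y])
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        if wasserstein_distance(perm[:len(x)], perm[len(x):]) >= stat:
            count += 1
    p_value = (count + 1) / (n_perm + 1)
    return stat, p_value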

MR4522876 Prelim Chambolle, Antonin; Contreras, Juan Pablo; 

Accelerated Bregman Primal-Dual Methods Applied to Optimal Transport and Wasserstein Barycenter Problems. SIAM J. Math. Data Sci. 4 (2022), no. 4, 1369–1395. 65Y20 (49Q22 90C05 90C06 90C08 90C47)

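For orientation, a fixed-support entropic Wasserstein barycenter of two histograms can be computed with POT's iterative Bregman projections as below; this is the standard Sinkhorn-type baseline, not the accelerated Bregman primal-dual method of the MR entry above.

import numpy as np
import ot

# Two 1-D histograms supported on a common grid of n bins.
n = 100
x = np.arange(n, dtype=np.float64)
a1 = np.exp(-0.5 * ((x - 30) / 6.0) ** 2); a1 /= a1.sum()
a2 = np.exp(-0.5 * ((x - 70) / 6.0) ** 2); a2 /= a2.sum()
A = np.vstack([a1, a2]).T                          # shape (n_bins, n_distributions)

M = ot.dist(x.reshape(-1, 1), x.reshape(-1, 1))    # squared Euclidean ground cost
M /= M.max()

bary = ot.bregman.barycenter(A, M, reg=1e-3, weights=np.array([0.5, 0.5]))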



MR4522524 Prelim Sun, Fupeng; Ni, Yin; Luo, Yihao; Sun, Huafei; ECG Classification Based on Wasserstein Scalar Curvature. Entropy 24 (2022), no. 10, Paper No. 1450. 62 (53 94)




MR4522098 Prelim Dvinskikh, Darina; 

Stochastic approximation versus sample average approximation for Wasserstein barycenters. Optim. Methods Softw. 37 (2022), no. 5, 1603–1635.


MRWM: A Multiple Residual Wasserstein Driven Model for Image Denoising

He, RQ; Lan, WS and Liu, F

IEEE ACCESS

 10 , pp.127397-127411

Enriched Cited References

Residual histograms can provide valuable information for vision research. However, current image restoration methods have not fully exploited the potential of multiple residual histograms, especially their role as overall regularization constraints. In this paper, we propose a novel framework of multiple residual Wasserstein driven model (MRWM) that can organically combine multiple residual Wasserstein …

Free Full Text from Publisher

61 References Related records

Cited by 1 Related articles

<-—2022———2022———1810—


One Loss for Quantization: Deep Hashing with Discrete Wasserstein Distributional Matching

Doan, KD; Yang, P and Li, P

IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)

2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR)

 , pp.9437-9447

Image hashing is a principled approximate nearest neighbor approach to find similar items to a query in a large collection of images. Hashing aims to learn a binary-output function that maps an image to a binary vector. For optimal retrieval performance, producing balanced hash codes with low-quantization error to bridge the gap between the learning stage's continuous relaxation and the inference …

Free Submitted Article From Repository; Full Text at Publisher

59 References  Related records


 

2022

The "Unreasonable" Effectiveness of the Wasserstein Distance in Analyzing Key Performance Indicators of a Network of Stores

Ponti, A; Giordani, I; (...); Archetti, F

Dec 2022 | 

BIG DATA AND COGNITIVE COMPUTING

6 (4)
Enriched Cited References

Large retail companies routinely gather huge amounts of customer data, which are to be analyzed at a low granularity. To enable this analysis, several Key Performance Indicators (KPIs), acquired for each customer through different channels are associated to the main drivers of the customer experience. Analyzing the samples of customer behavior only through parameters such as average and variance …

Free Full Text from Publisher

30 References  Related records



Working Paper
Covariance-based soft clustering of functional data based on the Wasserstein-Procrustes metric
Masarotto, V; Masarotto, G.
 arXiv.org; Ithaca, Dec 26, 2022.


Working Paper
Wasserstein Distributionally Robust Control of Partially Observable Linear Stochastic Systems
Hakobyan, Astghik; Yang, Insoon.
 arXiv.org; Ithaca, Dec 22, 2022.
Cited by 2
 Related articles All 3 versions


Working Paper
Stability estimates for the Vlasov-Poisson system in p-kinetic Wasserstein distances
Iacobelli, Mikaela; Junné, Jonathan.
 arXiv.org; Ithaca, Dec 19, 2022.


2022


2022  patent news Wire Feed
State Intellectual Property Office of China Receives River and Sea Univ's Patent Application for Power System Bad Data Identification Method Based on Improved Wasserstein Gan
Global IP News. Electrical Patent News; New Delhi [New Delhi]. 17 Dec 2022.  


2022 see 2021  Working Paper
Variational Wasserstein Barycenters with c-Cyclical Monotonicity

Chi, Jinjin; Yang, Zhiyao; Ouyang, Jihong; Li, Ximing. arXiv.org; Ithaca, Dec 17, 2022.


Working Paper
A Wasserstein GAN for Joint Learning of Inpainting and Spatial Optimisation
Peter. arXiv.org; Ithaca, Dec 2, 2022.


  2022 see 2021  Working Paper
A new method for determining Wasserstein 1 optimal transport maps from Kantorovich potentials, with deep learning applications
; Bilocq, Étienne; Nachman, Adrian. arXiv.org; Ithaca, Nov 2, 2022.

2022 see 2021  Working Paper
The Wasserstein distance to the Circular Law

Jalowy, Jonas. arXiv.org; Ithaca, Oct 28, 2022.
<-—2022———2022———1820—


2022 see 2021  Working Paper
Some inequalities on Riemannian manifolds linking Entropy, Fisher information, Stein discrepancy and Wasserstein distance
Li-Juan, Cheng; Feng-Yu, Wang; Thalmaier, Anton. arXiv.org; Ithaca, Oct 18, 2022.



Handwriting Recognition Using Wasserstein Metric in Adversarial Learning

Authors:Monica JangpangiSudhanshu KumarDiwakar BhardwajByung-Gyu KimPartha Pratim Roy
Summary:Abstract: Deep intelligence provides a great way to deal with understanding the complex handwriting of the user. Handwriting is challenging due to its irregular shapes, which vary from one user to another. In recent advancements in artificial intelligence, deep learning has unprecedented potential to recognize the user’s handwritten characters or words more accurately than traditional algorithms. It works well on the concept of the neural network, many algorithms such as convolutional neural network (CNN), recurrent neural network (RNN), and long short term memory (LSTM) are the best approaches to get high accuracy in handwritten recognition. However, much of the existing literature work lacks the feature space in a scalable manner. A model consisting of CNN, RNN, and transcription layer called CRNN and the Adversarial Feature Deformation Module (AFDM) is used for the affine transformation to overcome the limitation of existing literature. Finally, we propose an adversarial architecture comprised of two separate networks; one is seven layers of CNN with spatial transformation networks (STN), which act as a generator network. Another is Bi-LSTM, with the transcription layer working as a discriminator network and applying Wasserstein’s function to verify the model effectiveness using IAM word and IndBAN dataset. The performance of the proposed model was evaluated through different baseline models. Finally, It reduced the overall word error and character error rate using our proposed approachShow more
Article, 2022
Publication:SN Computer Science, 4, 20221107
Publisher:2022


Wasserstein Patch Prior for Image Superresolution
Authors:Johannes HertrichAntoine HoudardClaudia Redenbach
Summary:Many recent superresolution methods are based on supervised learning. That means, that they require a large database of pairs of high- and low-resolution images as training data. However, for many applications, acquiring registered pairs of high and low resolution data or even imaging a large area with a high resolution is unrealistic. To overcome this problem, we introduce a Wasserstein patch prior for unsupervised superresolution of two- and three-dimensional images. In addition to the low-resolution observation, our method only requires one, possibly small, reference image which has a similar patch distribution as the high resolution ground truth. This assumption can e.g. be fulfilled when working with texture images or images of homogeneous material microstructures. The proposed regularizer penalizes the Wasserstein-2-distance of the patch distributions within the reconstruction and the reference image at different scales. We demonstrate the performance of the proposed method by applying it to two- and three-dimensional images of materials' microstructuresShow more
Article, 2022
Publication:IEEE Transactions on Computational Imaging, PP, 2022, 1
Publisher:2022
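A rough way to evaluate the kind of patch-distribution discrepancy used by a Wasserstein patch prior is to sample patches from the reconstruction and the reference and solve a small discrete OT problem; the sketch below (grayscale images, uniform patch weights, single scale) only evaluates the distance and is not the paper's multiscale differentiable regularizer.

import numpy as np
import ot

def patch_w2(img, ref, patch=7, n_samples=500, seed=0):
    # Empirical 2-Wasserstein distance between the patch distributions of a
    # reconstruction and a reference image, estimated from randomly sampled
    # patches via exact OT on the sampled point clouds.
    rng = np.random.default_rng(seed)

    def sample_patches(a):
        H, W = a.shape
        ys = rng.integers(0, H - patch, n_samples)
        xs = rng.integers(0, W - patch, n_samples)
        return np.stack([a[y:y + patch, x:x + patch].ravel() for y, x in zip(ys, xs)])

    P, Q = sample_patches(img), sample_patches(ref)
    M = ot.dist(P, Q)                              # squared Euclidean patch costs
    w = np.full(n_samples, 1.0 / n_samples)
    return np.sqrt(ot.emd2(w, w, M))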


Conditional Wasserstein GAN for Energy Load Forecasting in Large Buildings

Authors:George-Silviu NastasescuDumitru-Clementin Cercel2022 International Joint Conference on Neural Networks (IJCNN)
Summary:Energy forecasting is necessary for planning electricity consumption, and large buildings play a huge role when making these predictions. Because of its importance, numerous methods to predict the buildings' energy load have appeared during the last decades, remaining an open area of research. In recent years, traditional machine learning techniques such as Random Forest, K-Nearest Neighbors, and AutoRegressive Integrated Moving Average (ARIMA) have been replaced with deep learning methods, which have an increased ability to capture underlying consumption trends. However, large amounts of data are mandatory for training neural networks to forecast energy load. With scarce data, augmentation techniques are necessary to ensure high-quality predictions. This paper introduces cWGAN-GP-SN, a conditional (convolutional) Wasserstein Generative Adversarial Network with Gradient Penalty and Spectral Normalization used to generate new electrical records. Our architecture leverages the advantages of multiple GAN models to enrich training stability and data quality. The experimental results based on the Building Data Genome dataset show how classification and regression tasks benefit from the enrichment of the dataset. Additionally, adversarial attacks were performed to investigate whether models trained on large amounts of synthetic data are more robustShow more
Chapter, 2022
Publication:2022 International Joint Conference on Neural Networks (IJCNN), 20220718, 1
Publisher:2022


Wasserstein Loss With Alternative Reinforcement Learning for Severity-Aware Semantic Segmentation

Authors:Xiaofeng LiuYunhong LuXiongchang LiuSong BaiSite LiJane You
Summary:Semantic segmentation is important for many real-world systems, e.g., autonomous vehicles, which predict the class of each pixel. Recently, deep networks achieved significant progress w.r.t. the mean Intersection-over-Union (mIoU) with the cross-entropy loss. However, the cross-entropy loss essentially ignores how severe different wrong predictions are for an autonomous car. For example, predicting a car as road is much more severe than recognizing it as a bus. Targeting this difficulty, we develop a Wasserstein training framework to explore the inter-class correlation by defining its ground metric as the misclassification severity. The ground metric of the Wasserstein distance can be pre-defined following the experience on a specific task. From the optimization perspective, we further propose to set the ground metric as an increasing function of the pre-defined ground metric. Furthermore, an adaptive learning scheme for the ground matrix is proposed to utilize the high-fidelity CARLA simulator. Specifically, we follow a reinforcement alternative learning scheme. The experiments on both the CamVid and Cityscapes datasets evidenced the effectiveness of our Wasserstein loss. The SegNet, ENet, FCN and Deeplab networks can be adapted in a plug-in manner. We achieve significant improvements on the predefined important classes, and much longer continuous play time in our simulator.
Article, 2022
Publication:IEEE Transactions on Intelligent Transportation Systems, 23, 202201, 587
Publisher:2022

2022


Peer-reviewed
Data-Driven Chance Constrained Programs over Wasserstein Balls
Authors:Zhi ChenDaniel KuhnWolfram Wiesemann
Summary:In the era of modern business analytics, data-driven optimization has emerged as a popular modeling paradigm to transform data into decisions. By constructing an ambiguity set of the potential data-generating distributions and subsequently hedging against all member distributions within this ambiguity set, data-driven optimization effectively combats the ambiguity with which real-life data sets are plagued. Chen et al. (2022) study data-driven, chance-constrained programs in which a decision has to be feasible with high probability under every distribution within a Wasserstein ball centered at the empirical distribution. The authors show that the problem admits an exact deterministic reformulation as a mixed-integer conic program and demonstrate (in numerical experiments) that the reformulation compares favorably to several state-of-the-art data-driven optimization schemesShow more
Downloadable Article, 2022
Publication:Operations Research, 20220721
Publisher:2022
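For background, the Wasserstein ball and the standard worst-case bound it yields for an L-Lipschitz loss (a textbook consequence of Kantorovich-Rubinstein duality, not the exact mixed-integer reformulation derived in the paper above) read, in LaTeX:

\mathcal{B}_\varepsilon(\widehat{P}_N)=\bigl\{\,Q:\ W_1(Q,\widehat{P}_N)\le\varepsilon\,\bigr\},
\qquad
\sup_{Q\in\mathcal{B}_\varepsilon(\widehat{P}_N)}\ \mathbb{E}_Q\bigl[\ell(\xi)\bigr]
\ \le\ \frac{1}{N}\sum_{i=1}^{N}\ell(\widehat{\xi}_i)+L\,\varepsilon .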

Bayesian learning with Wasserstein barycenters*

Authors:Julio Backhoff-VeraguasJoaquin FontbonaGonzalo RiosFelipe Tobar
Summary:We introduce and study a novel model-selection strategy for Bayesian learning, based on optimal transport, along with its associated predictive posterior law: the Wasserstein population barycenter of the posterior law over models. We first show how this estimator, termed Bayesian Wasserstein barycenter (BWB), arises naturally in a general, parameter-free Bayesian model-selection framework, when the considered Bayesian risk is the Wasserstein distance. Examples are given, illustrating how the BWB extends some classic parametric and non-parametric selection strategies. Furthermore, we also provide explicit conditions granting the existence and statistical consistency of the BWB, and discuss some of its general and specific properties, providing insights into its advantages compared to usual choices, such as the model average estimator. Finally, we illustrate how this estimator can be computed using the stochastic gradient descent (SGD) algorithm in Wasserstein space introduced in a companion paper, and provide a numerical example for experimental validation of the proposed methodShow more
Article, 2022
Publication:ESAIM: Probability and Statistics, 26, 2022, 436
Publisher:2022

Topic Embedded Representation Enhanced Variational Wasserstein Autoencoder for Text Modeling
Authors:Zheng XiangXiaoming LiuGuan YangYang Liu2022 IEEE 5th International Conference on Electronics Technology (ICET)
Summary:Variational Autoencoder (VAE) is now popular in text modeling and language generation tasks, which need to pay attention to the diversity of generation results. The existing models are insufficient in capturing the built-in relationships between topic representation and sequential words. At the same time, there is a massive contradiction between the commonly used simple Gaussian prior and the actual complex distribution of language texts. To address the above problems, we introduce a hybrid Wasserstein Autoencoder (WAE) with Topic Embedded Representation (TER) for text modeling. TER is obtained through an embedding-based topic model and can capture the dependencies and semantic similarities between topics and words. In this case, the learned latent variable has rich semantic knowledge with the help of TER and is easier to explain and control. Our experiments show that our method is competitive with other VAEs in text modelingShow more
Chapter, 2022
Publication:2022 IEEE 5th International Conference on Electronics Technology (ICET), 20220513, 1318
Publisher:2022

Peer-reviewed
Fault data expansion method of permanent magnet synchronous motor based on Wasserstein-generative adversarial network
Authors:Liu ZhanXiaowei XuXue QiaoZhixiong LiQiong Luo
Summary:Aiming at the characteristics of non-smooth, non-linear, multi-source heterogeneity, low density of value and unevenness of fault data collected by the online monitoring equipment of permanent magnet synchronous motor (PMSM), and the difficulty of fault mechanism analysis, this paper proposes a method of PMSM data expansion based on the improved generative adversarial network. First, use the real fault data of the motor to train the model to obtain a mature and stable generative countermeasure network. Secondly, use the generative countermeasure network model to test the remaining data and generate pseudo samples. Finally, use the two-dimensional data analysis method and the time-domain analysis method to generate validity analysis of samples. Aiming at the characteristics of unbalanced motor data, the data expansion method of inter-turn short-circuit faults is carried out based on the data expansion method of the improved generative countermeasure network, and the two-dimensional data analysis method and the time-domain analysis method are used for analysis. The experimental results show that the improved Wasserstein-Generative Adversarial Network (W-GAN) has a better ability to generate fake data, which provides a data basis for the mechanism analysis and machine fault diagnosis of PMSMs. Data analysis results show that the improved W-GAN effectively solves the problem of poor convergence of GANShow more
Article, 2022
Publication:Proceedings of the Institution of Mechanical Engineers, Part C: Journal of Mechanical Engineering Science, 20220515
Publisher:2022


Peer-reviewed
Conditional Wasserstein Generator

Authors:Myunghee Cho PaikKyungbok LeeYoung-geun Kim
Summary:The statistical distance of conditional distributions is an essential element of generating target data given some data as in video prediction. We establish how the statistical distances between two joint distributions are related to those between two conditional distributions for three popular statistical distances: f-divergence, Wasserstein distance, and integral probability metrics. Such characterization plays a crucial role in deriving a tractable form of the objective function to learn a conditional generator. For Wasserstein distance, we show that the distance between joint distributions is an upper bound of the expected distance between conditional distributions, and derive a tractable representation of the upper bound. Based on this theoretical result, we propose a new conditional generator, the conditional Wasserstein generator. Our proposed algorithm can be viewed as an extension of Wasserstein autoencoders [1] to conditional generation or as a Wasserstein counterpart of stochastic video generation (SVG) model by Denton and Fergus [2]. We apply our algorithm to video prediction and video interpolation. Our experiments demonstrate that the proposed algorithm performs well on benchmark video datasets and produces sharper videos than state-of-the-art methodsShow more
Article, 2022
Publication:IEEE Transactions on Pattern Analysis & Machine Intelligence, 202211, 1

<-—2022———2022———1830—



Peer-reviewed
Second-Order Conic Programming Approach for Wasserstein Distributionally Robust Two-Stage Linear Programs
Authors:Zhuolin WangKeyou YouShiji SongYuli Zhang
Summary:This article proposes a second-order conic programming (SOCP) approach to solve distributionally robust two-stage linear programs over 1-Wasserstein balls. We start from the case with distribution uncertainty only in the objective function and then explore the case with distribution uncertainty only in constraints. The former program is exactly reformulated as a tractable SOCP problem, whereas the latter one is proved to be generally NP-hard as it involves a norm maximization problem over a polyhedron. However, it reduces to an SOCP problem if the extreme points of the polyhedron are given as a prior. This motivates the design of a constraint generation algorithm with provable convergence to approximately solve the NP-hard problem. Moreover, the least favorable distribution achieving the worst case cost is given as an “empirical” distribution by simply perturbing each original sample for both cases. Finally, experiments illustrate the advantages of the proposed model in terms of the out-of-sample performance and computational complexity. Note to Practitioners —The two-stage program with distribution uncertainty is an important decision problem in broad applications, e.g., two-stage schedule problems, facility location problems, and recourse allocation problems. To deal with the uncertainty, this work proposes a novel data-driven model over the 1-Wasserstein ball and develops an efficient second-order conic programming (SOCP)-based solution approach, where the sample data set can be easily exploited to reduce the distribution uncertainty. The good out-of-sample performance and computational complexity of the proposed model are validated by the experiments on the two-stage portfolio programs and material order programsShow more
Article, 2022
Publication:IEEE Transactions on Automation Science and Engineering, 19, 202204, 946
Publisher:2022

 
Peer-reviewed
Linear and Deep Order-Preserving Wasserstein Discriminant Analysis
Authors:Ying WuJi-Rong WenJiahuan ZhouBing Su
Summary:Supervised dimensionality reduction for sequence data learns a transformation that maps the observations in sequences onto a low-dimensional subspace by maximizing the separability of sequences in different classes. It is typically more challenging than conventional dimensionality reduction for static data, because measuring the separability of sequences involves non-linear procedures to manipulate the temporal structures. In this paper, we propose a linear method, called order-preserving Wasserstein discriminant analysis (OWDA), and its deep extension, namely DeepOWDA, to learn linear and non-linear discriminative subspace for sequence data, respectively. We construct novel separability measures between sequence classes based on the order-preserving Wasserstein (OPW) distance to capture the essential differences among their temporal structures. Specifically, for each class, we extract the OPW barycenter and construct the intra-class scatter as the dispersion of the training sequences around the barycenter. The inter-class distance is measured as the OPW distance between the corresponding barycenters. We learn the linear and non-linear transformations by maximizing the inter-class distance and minimizing the intra-class scatter. In this way, the proposed OWDA and DeepOWDA are able to concentrate on the distinctive differences among classes by lifting the geometric relations with temporal constraints. Experiments on four 3D action recognition datasets show the effectiveness of OWDA and DeepOWDAShow more
Article, 2022
Publication:IEEE Transactions on Pattern Analysis & Machine Intelligence, 44, 202206, 3123
Publisher:2022

Peer-reviewed
Distributionally Robust Inverse Covariance Estimation: The Wasserstein Shrinkage Estimator
Authors:Viet Anh NguyenDaniel KuhnPeyman Mohajerin Esfahani
Summary:The optimal solutions of many decision problems such as the Markowitz portfolio allocation and the linear discriminant analysis depend on the inverse covariance matrix of a Gaussian random vector. In “Distributionally Robust Inverse Covariance Estimation: The Wasserstein Shrinkage Estimator,” Nguyen, Kuhn, and Mohajerin Esfahani propose a distributionally robust inverse covariance estimator, obtained by robustifying the Gaussian maximum likelihood problem with a Wasserstein ambiguity set. In the absence of any prior structural information, the estimation problem has an analytical solution that is naturally interpreted as a nonlinear shrinkage estimator. Besides being invertible and well conditioned, the new shrinkage estimator is rotation equivariant and preserves the order of the eigenvalues of the sample covariance matrix. If there are sparsity constraints, which are typically encountered in Gaussian graphical models, the estimation problem can be solved using a sequential quadratic approximation algorithm.
Downloadable Article, 2022
Publication:Operations Research, 70, 202201, 490
Publisher:2022
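A useful background identity behind Wasserstein ambiguity sets centered at Gaussian models (not the paper's shrinkage formula itself) is the closed form of the 2-Wasserstein distance between two Gaussian distributions:

W_2^2\bigl(\mathcal{N}(m_1,\Sigma_1),\,\mathcal{N}(m_2,\Sigma_2)\bigr)
=\|m_1-m_2\|_2^2+\operatorname{tr}\!\Bigl(\Sigma_1+\Sigma_2-2\bigl(\Sigma_1^{1/2}\Sigma_2\Sigma_1^{1/2}\bigr)^{1/2}\Bigr).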


Wasserstein-Based Projections with Applications to Inverse Problems
Authors:Howard HeatonSamy Wu FungAlex Tong LinStanley OsherWotao Yin
Summary:Inverse problems consist of recovering a signal from a collection of noisy measurements. These are typically cast as optimization problems, with classic approaches using a data fidelity term and an analytic regularizer that stabilizes recovery. Recent plug-and-play (PnP) works propose replacing the operator for analytic regularization in optimization methods by a data-driven denoiser. These schemes obtain state-of-the-art results, but at the cost of limited theoretical guarantees. To bridge this gap, we present a new algorithm that takes samples from the manifold of true data as input and outputs an approximation of the projection operator onto this manifold. Under standard assumptions, we prove this algorithm generates a learned operator, called Wasserstein-based projection (WP), that approximates the true projection with high probability. Thus, WPs can be inserted into optimization methods in the same manner as PnP, but now with theoretical guarantees. Provided numerical examples show WPs obtain state-of-the-art results for unsupervised PnP signal recovery. All codes for this work can be found at https://github.com/swufung/WassersteinBasedProjections
Cited by 11
Related articles All 4 versions

Downloadable Article
Publication:SIAM Journal on Mathematics of Data Science, 4, 2022, 581
Peer-reviewed
Wasserstein Adversarial Regularization for Learning With Label Noise
Authors:Kilian FatrasBharath Bhushan DamodaranSylvain LobryRemi FlamaryDevis TuiaNicolas Courty
Summary:Noisy labels often occur in vision datasets, especially when they are obtained from crowdsourcing or Web scraping. We propose a new regularization method, which enables learning robust classifiers in presence of noisy data. To achieve this goal, we propose a new adversarial regularization scheme based on the Wasserstein distance. Using this distance allows taking into account specific relations between classes by leveraging the geometric properties of the labels space. Our Wasserstein Adversarial Regularization (WAR) encodes a selective regularization, which promotes smoothness of the classifier between some classes, while preserving sufficient complexity of the decision boundary between others. We first discuss how and why adversarial regularization can be used in the context of noise and then show the effectiveness of our method on five datasets corrupted with noisy labels: in both benchmarks and real datasets, WAR outperforms the state-of-the-art competitorsShow more
Article, 2022
Publication:IEEE Transactions on Pattern Analysis and Machine Intelligence, 44, 20221001, 7296
Publisher:2022

2022



CNN-Based Continuous Authentication on Smartphones With Conditional Wasserstein Generative Adversarial Network
Authors:Yantao LiJiaxing LuoShaojiang DengGang Zhou
Summary:With the widespread usage of mobile devices, the authentication mechanisms are urgently needed to identify users for information leakage prevention. In this article, we present CAGANet, a convolutional neural network (CNN)-based continuous authentication on smartphones using a conditional Wasserstein generative adversarial network (CWGAN) for data augmentation, which utilizes smartphone sensors of the accelerometer, gyroscope, and magnetometer to sense phone movements incurred by user operation behaviors. Specifically, based on the preprocessed real data, CAGANet employs CWGAN to generate additional sensor data for data augmentation that are used to train the designed CNN. With the augmented data, CAGANet utilizes the trained CNN to extract deep features and then performs principal component analysis (PCA) to select appropriate representative features for different classifiers. With the CNN-extracted features, CAGANet trains four one-class classifiers of OC-SVM, LOF, isolation forest (IF), and EE in the enrollment phase and authenticates the current user as a legitimate user or an impostor based on the trained classifiers in the authentication phase. To evaluate the performance of CAGANet, we conduct extensive experiments in terms of the efficiency of CWGAN, the effectiveness of CWGAN augmentation and the designed CNN, the accuracy on unseen users, and comparison with traditional augmentation approaches and with representative authentication methods, respectively. The experimental results show that CAGANet with the IF classifier can achieve the lowest equal error rate (EER) of 3.64% on 2-s sampling dataShow more
Article, 2022
Publication:IEEE Internet of Things Journal, 9, 20220401, 5447
Publisher:2022



Peer-reviewed
Rectified Wasserstein Generative Adversarial Networks for Perceptual Image Restoration
Authors:Haichuan MaDong LiuFeng Wu
Summary:Wasserstein generative adversarial network (WGAN) has attracted great attention due to its solid mathematical background, i.e., to minimize the Wasserstein distance between the generated distribution and the distribution of interest. In WGAN, the Wasserstein distance is quantitatively evaluated by the discriminator, also known as the critic. The vanilla WGAN trained the critic with the simple Lipschitz condition, which was later shown less effective for modeling complex distributions, like the distribution of natural images. We try to improve the WGAN training by introducing pairwise constraint on the critic, oriented to image restoration tasks. In principle, pairwise constraint is to suggest the critic assign a higher rating to the original (real) image than to the restored (generated) image, as long as such a pair of images are available. We show that such pairwise constraint may be implemented by rectifying the gradients in WGAN training, which leads to the proposed rectified Wasserstein generative adversarial network (ReWaGAN). In addition, we build interesting connections between ReWaGAN and the perception-distortion tradeoff. We verify ReWaGAN on two representative image restoration tasks: single image super-resolution (4× and 8×) and compression artifact reduction, where our ReWaGAN not only beats the vanilla WGAN consistently, but also outperforms the state-of-the-art perceptual quality-oriented methods significantly. Our code and models are publicly available at https://github.com/mahaichuan/ReWaGAN
Article, 2022
Publication:IEEE Transactions on Pattern Analysis and Machine Intelligence, PP, 2022, 1
Publisher:2022


Peer-reviewed
Distributionally Robust Mean-Variance Portfolio Selection with Wasserstein Distances
Authors:Jose BlanchetLin ChenXun Yu Zhou
Summary:We revisit Markowitz’s mean-variance portfolio selection model by considering a distributionally robust version, in which the region of distributional uncertainty is around the empirical measure and the discrepancy between probability measures is dictated by the Wasserstein distance. We reduce this problem into an empirical variance minimization problem with an additional regularization term. Moreover, we extend the recently developed inference methodology to our setting in order to select the size of the distributional uncertainty as well as the associated robust target return rate in a data-driven way. Finally, we report extensive back-testing results on S&P 500 that compare the performance of our model with those of several well-known models including the Fama-French and Black-Litterman models. This paper was accepted by David Simchi-Levi, finance.
Downloadable Article, 2022
Publication:Management Science, 68, 202209, 6382
Publisher:2022

Synthetic Traffic Generation with Wasserstein Generative Adversarial Networks
Authors:Chao-Lun WuYu-Ying ChenPo-Yu ChouChih-Yu WangGLOBECOM 2022 - 2022 IEEE Global Communications Conference

Summary:Network traffic data are critical for network research. With the help of synthetic traffic, researchers can readily generate data for network simulation and performance evaluation. However, the state-of-the-art traffic generators are either too simple to generate realistic traffic or require the implementation of original applications and user operations. We propose Synthetic PAcket Traffic Generative Adversarial Networks (SPATGAN) that are capable of generating synthetic traffic. The framework includes a server agent and a client agent, which transmit synthetic packets to each other and take the opponent's synthetic packets as conditional labels for the built-in Timing Synthesis Generative Adversarial Networks (TSynGAN) and a Packet Synthesis Generative Adversarial Networks (PSynGAN) to generate synthetic traffic. The evaluations demonstrate that the proposed framework can generate traffic whose distribution resembles real traffic distribution
Chapter, 2022
Publication:GLOBECOM 2022 - 2022 IEEE Global Communications Conference, 20221204, 1503
Publisher:2022

<-—2022———2022———1840—



Minimax Robust Quickest Change Detection using Wasserstein Ambiguity Sets

Authors:Liyan Xie2022 IEEE International Symposium on Information Theory (ISIT)
Summary:We study the robust quickest change detection under unknown pre- and post-change distributions. To deal with uncertainties in the data-generating distributions, we formulate two data-driven ambiguity sets based on the Wasserstein distance, without any parametric assumptions. The minimax robust test is constructed as the CUSUM test under least favorable distributions, a representative pair of distributions in the ambiguity sets. We show that the minimax robust test can be obtained in a tractable way and is asymptotically optimal. We investigate the effectiveness of the proposed robust test over existing methods, including the generalized likelihood ratio test and the robust test under KL divergence based ambiguity setsShow more
Chapter, 2022
Publication:2022 IEEE International Symposium on Information Theory (ISIT), 20220626, 1909
Publisher:2022
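The detector in the entry above is a CUSUM test run under least favorable distributions; the generic CUSUM recursion itself is short, and the sketch below assumes the log-likelihood ratio of those (pre-computed) least favorable distributions is handed in as a function.

import numpy as np

def cusum(samples, log_lr, threshold):
    # Classical CUSUM recursion: S_t = max(0, S_{t-1} + log LR(x_t)); declare a
    # change the first time S_t exceeds the threshold. In the minimax-robust
    # setting, log_lr would come from the least favorable pre-/post-change
    # distributions in the Wasserstein ambiguity sets (assumed given here).
    s = 0.0
    for t, x in enumerate(samples):
        s = max(0.0, s + log_lr(x))
        if s > threshold:
            return t          # alarm time
    return None               # no alarm raised

# Toy usage: detect a mean shift from N(0, 1) to N(1, 1).
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0, 1, 200), rng.normal(1, 1, 200)])
alarm = cusum(data, log_lr=lambda x: x - 0.5, threshold=10.0)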



Peer-reviewed
Decision Making Under Model Uncertainty: Fréchet-Wasserstein Mean Preferences
Authors:Electra V. PetracouAnastasios XepapadeasAthanasios N. Yannacopoulos
Summary:This paper contributes to the literature on decision making under multiple probability models by studying a class of variational preferences. These preferences are defined in terms of Fréchet mean utility functionals, which are based on the Wasserstein metric in the space of probability models. In order to produce a measure that is the “closest” to all probability models in the given set, we find the barycenter of the set. We derive explicit expressions for the Fréchet-Wasserstein mean utility functionals and show that they can be expressed in terms of an expansion that provides a tractable link between risk aversion and ambiguity aversion. The proposed utility functionals are illustrated in terms of two applications. The first application allows us to define the social discount rate under model uncertainty. In the second application, the functionals are used in risk securitization. The barycenter in this case can be interpreted as the model that maximizes the probability that different decision makers will agree on, which could be useful for designing and pricing a catastrophe bond. This paper was accepted by Manel Baucells, decision analysis.
Downloadable Article, 2022
Publication:Management Science, 68, 202202, 1195
Publisher:2022

Peer-reviewed
Fault Diagnosis of Rotating Machinery Based on Wasserstein Distance and Feature Selection
Authors:Francesco FerracutiAlessandro FreddiAndrea MonteriuLuca Romeo
Summary:This article presents a fault diagnosis algorithm for rotating machinery based on the Wasserstein distance. Recently, the Wasserstein distance has been proposed as a new research direction to find better distribution mapping when compared with other popular statistical distances and divergences. In this work, first, frequency- and time-based features are extracted by vibration signals, and second, the Wasserstein distance is considered for the learning phase to discriminate the different machine operating conditions. Specifically, the 1-D Wasserstein distance is considered due to its low computational burden because it can be evaluated directly by the order statistics of the extracted features. Furthermore, a distance weighting stage based on neighborhood component features selection (NCFS) is exploited to achieve robust fault diagnosis at low signal-to-noise ratio (SNR) conditions and with high-dimensional features. In detail, the NCFS framework is here adapted to weight 1-D Wasserstein distances evaluated from time/frequency features. Experiments are conducted on two benchmark data sets to verify the effectiveness of the proposed fault diagnosis method at different SNR conditions. The comparison with state-of-the-art fault diagnosis algorithms shows promising results. Note to Practitioners —This article was motivated by the problem of fault diagnosis of rotating machinery under low SNR and different machine operating conditions. The algorithm employs a statistical distance-based fault diagnosis technique, which permits to obtain an estimation of the fault signature without the need for training a classifier. The algorithm is computationally efficient during the training and testing stages, and thus, it can be used in embedded hardware. Finally, the proposed methodology can be applied to other application domains such as system monitoring and prognostics, which can help to schedule the maintenance of rotating machineryShow more
Article, 2022
Publication:IEEE Transactions on Automation Science and Engineering, 19, 202207, 1997
Publisher:2022
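The abstract's remark that the 1-D Wasserstein distance can be read off the order statistics of the extracted features amounts to the following few lines (equal sample sizes assumed; the feature extraction and NCFS weighting of the paper are not shown):

import numpy as np

def w1_order_stats(x, y):
    # For two samples of equal size, the empirical 1-Wasserstein distance is
    # the mean absolute difference of the order statistics, so it can be
    # evaluated with a sort and no optimization.
    xs, ys = np.sort(x), np.sort(y)
    return np.mean(np.abs(xs - ys))

# Toy usage: compare a feature extracted under healthy vs. faulty conditions.
rng = np.random.default_rng(0)
healthy = rng.normal(0.0, 1.0, 1024)
faulty = rng.normal(0.3, 1.2, 1024)
print(w1_order_stats(healthy, faulty))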

Peer-reviewed
Dynamic Facial Expression Generation on Hilbert Hypersphere With Conditional Wasserstein Generative Adversarial Nets
Authors:Stefano BerrettiLahoucine BallihiAnis KacemMohamed DaoudiNaima Otberdout
Summary:In this work, we propose a novel approach for generating videos of the six basic facial expressions given a neutral face image. We propose to exploit the face geometry by modeling the facial landmarks motion as curves encoded as points on a hypersphere. By proposing a conditional version of manifold-valued Wasserstein generative adversarial network (GAN) for motion generation on the hypersphere, we learn the distribution of facial expression dynamics of different classes, from which we synthesize new facial expression motions. The resulting motions can be transformed to sequences of landmarks and then to images sequences by editing the texture information using another conditional Generative Adversarial Network. To the best of our knowledge, this is the first work that explores manifold-valued representations with GAN to address the problem of dynamic facial expression generation. We evaluate our proposed approach both quantitatively and qualitatively on two public datasets; Oulu-CASIA and MUG Facial Expression. Our experimental results demonstrate the effectiveness of our approach in generating realistic videos with continuous motion, realistic appearance and identity preservation. We also show the efficiency of our framework for dynamic facial expressions generation, dynamic facial expression transfer and data augmentation for training improved emotion recognition modelsShow more
Article, 2022
Publication:IEEE Transactions on Pattern Analysis & Machine Intelligence, 44, 202202, 848
Publisher:2022


Wasserstein Barycenters Are NP-Hard to Compute
Authors:Jason M. Altschuler; Enric Boix-Adserà
Summary:Computing Wasserstein barycenters (a.k.a. optimal transport barycenters) is a fundamental problem in geometry which has recently attracted considerable attention due to many applications in data science. While there exist polynomial-time algorithms in any fixed dimension, all known running times suffer exponentially in the dimension. It is an open question whether this exponential dependence is improvable to a polynomial dependence. This paper proves that unless ${P} = {NP}$, the answer is no. This uncovers a “curse of dimensionality” for Wasserstein barycenter computation which does not occur for optimal transport computation. Moreover, our hardness results for computing Wasserstein barycenters extend to approximate computation, to seemingly simple cases of the problem, and to averaging probability distributions in other optimal transport metrics.
Downloadable Article
Publication:SIAM Journal on Mathematics of Data Science, 4, 2022, 179

2022


Detecting Incipient Fault Using Wasserstein Distance
Authors:Cheng Lu; Jiusun Zeng; Shihua Luo; Uwe Kruger. 2022 IEEE 11th Data Driven Control and Learning Systems Conference (DDCLS)
Summary:This article develops a novel process monitoring method based on the Wasserstein distance for incipient fault detection. The core idea is to measure the difference between the normal data and the faulty data. For Gaussian distributed process variables, the paper proves that the difference measured by the Wasserstein distance is more sensitive than Hotelling's $T^2$ and the Squared Prediction Error (SPE) statistics in the Principal Component Analysis (PCA) framework. For non-Gaussian distributed data, a Projection Robust Wasserstein (PRW) distance model under the PCA framework is proposed, and a Riemannian Block Coordinate Descent (RBCD) algorithm is used to solve this model, which is fast when the number of sampled data is large. An application study to a glass melter demonstrates the effectiveness of the proposed method.
Chapter, 2022
Publication:2022 IEEE 11th Data Driven Control and Learning Systems Conference (DDCLS), 20220803, 1044
Publisher:2022
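For the Gaussian case discussed in the entry above, the 2-Wasserstein distance between N(m1, S1) and N(m2, S2) has a well-known closed form, ||m1 - m2||^2 + tr(S1 + S2 - 2 (S2^{1/2} S1 S2^{1/2})^{1/2}). The paper's sensitivity comparison with $T^2$ and SPE is not reproduced here; the sketch below only evaluates that standard closed form on hypothetical toy inputs:

```python
import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2(m1, S1, m2, S2):
    """Closed-form 2-Wasserstein distance between N(m1, S1) and N(m2, S2):
    W2^2 = ||m1 - m2||^2 + tr(S1 + S2 - 2 (S2^{1/2} S1 S2^{1/2})^{1/2})."""
    S2_half = sqrtm(S2)
    cross = sqrtm(S2_half @ S1 @ S2_half)
    bures2 = np.trace(S1) + np.trace(S2) - 2.0 * np.trace(np.real(cross))
    w2_sq = float(np.sum((np.asarray(m1) - np.asarray(m2)) ** 2) + bures2)
    return np.sqrt(max(w2_sq, 0.0))  # guard against tiny negative round-off

# toy monitoring example: nominal distribution vs. a slightly shifted one
m_ok, S_ok = np.zeros(2), np.eye(2)
m_shift, S_shift = np.array([0.1, 0.0]), np.diag([1.2, 0.9])
print(gaussian_w2(m_ok, S_ok, m_shift, S_shift))
```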



Peer-reviewed
Full Attention Wasserstein GAN With Gradient Normalization for Fault Diagnosis Under Imbalanced Data
Authors:Jigang Fan; Xianfeng Yuan; Zhaoming Miao; Zihao Sun; Xiaoxue Mei; Fengyu Zhou
Summary:The fault diagnosis of rolling bearings is vital for the safe and reliable operation of mechanical equipment. However, the imbalanced data collected from the real engineering scenario bring great challenges to the deep learning-based diagnosis methods. For this purpose, this article proposes a methodology called full attention Wasserstein generative adversarial network (WGAN) with gradient normalization (FAWGAN-GN) for data augmentation and uses a shallow 1-D convolutional neural network (CNN) to perform fault diagnosis. First, a gradient normalization (GN) is introduced into the discriminator as a model-wise constraint to make it more flexible in setting the structure of the network, which leads to a more stable and faster training process. Second, the full attention (FA) mechanism is utilized to let the generator pay more attention to learning the discriminative features of the original data and generate high-quality samples. Third, to more thoroughly and deeply evaluate the data generation performance of generative adversarial networks (GANs), a more comprehensive multiple indicator-based evaluation framework is developed to avoid the one-sidedness and superficiality of using one or two simple indicators. Based on two widely applied fault diagnosis datasets and a real rolling bearing fault diagnosis testbed, extensive comparative fault diagnosis experiments are conducted to validate the effectiveness of the proposed method. Experimental results reveal that the proposed FAWGAN-GN can effectively solve the sample imbalance problem and outperforms the state-of-the-art imbalanced fault diagnosis methods.
Article, 2022
Publication:IEEE Transactions on Instrumentation and Measurement, 71, 2022, 1
Publisher:2022


Peer-reviewed
Distributed Kalman Filter With Faulty/Reliable Sensors Based on Wasserstein Average Consensus
Authors:Dong-Jin Xin; Ling-Feng Shi; Xingkai Yu
Summary:This brief considers the distributed Kalman filtering problem for systems with sensor faults. A trust-based classification fusion strategy is proposed to resist sensor faults. First, the local sensors collect measurements and then update their state estimations and estimation error covariance matrices. Then, sensors exchange the information (state estimations and estimation error covariance matrices) with their neighboring sensors. After obtaining the estimation information from neighboring sensors, an iterative classification/clustering algorithm, which contains three steps (Initialization Step, Assignment Step, and Update Step), is proposed to classify the collected estimations into two clusters (trusted and untrusted clusters). Third, the fused states and error covariance matrices are computed by a Wasserstein average algorithm. Finally, the time update is performed on the basis of the fusion information. Stability and convergence of the proposed filter are analyzed. A target tracking simulation example is provided to verify the effectiveness of the proposed distributed filter in a wireless sensor network.
Article, 2022
Publication:IEEE Transactions on Circuits and Systems II: Express Briefs, 69, 202204, 2371
Publisher:2022


Peer-reviewed
Linear and Deep Order-Preserving Wasserstein Discriminant Analysis
Authors:Bing Su; Jiahuan Zhou; Ji-Rong Wen; Ying Wu
Summary:Supervised dimensionality reduction for sequence data learns a transformation that maps the observations in sequences onto a low-dimensional subspace by maximizing the separability of sequences in different classes. It is typically more challenging than conventional dimensionality reduction for static data, because measuring the separability of sequences involves non-linear procedures to manipulate the temporal structures. In this paper, we propose a linear method, called order-preserving Wasserstein discriminant analysis (OWDA), and its deep extension, namely DeepOWDA, to learn linear and non-linear discriminative subspace for sequence data, respectively. We construct novel separability measures between sequence classes based on the order-preserving Wasserstein (OPW) distance to capture the essential differences among their temporal structures. Specifically, for each class, we extract the OPW barycenter and construct the intra-class scatter as the dispersion of the training sequences around the barycenter. The inter-class distance is measured as the OPW distance between the corresponding barycenters. We learn the linear and non-linear transformations by maximizing the inter-class distance and minimizing the intra-class scatter. In this way, the proposed OWDA and DeepOWDA are able to concentrate on the distinctive differences among classes by lifting the geometric relations with temporal constraints. Experiments on four 3D action recognition datasets show the effectiveness of OWDA and DeepOWDA.
Article, 2022
Publication:IEEE Transactions on Pattern Analysis and Machine Intelligence, 44, 20220601, 3123
Publisher:2022

 
Peer-reviewed
Rectified Wasserstein Generative Adversarial Networks for Perceptual Image Restoration
Authors:Feng Wu; Dong Liu; Haichuan Ma
Summary:Wasserstein generative adversarial network (WGAN) has attracted great attention due to its solid mathematical background, i.e., to minimize the Wasserstein distance between the generated distribution and the distribution of interest. In WGAN, the Wasserstein distance is quantitatively evaluated by the discriminator, also known as the critic. The vanilla WGAN trained the critic with the simple Lipschitz condition, which was later shown less effective for modeling complex distributions, like the distribution of natural images. We try to improve the WGAN training by introducing pairwise constraint on the critic, oriented to image restoration tasks. In principle, pairwise constraint is to suggest the critic assign a higher rating to the original (real) image than to the restored (generated) image, as long as such a pair of images are available. We show that such pairwise constraint may be implemented by rectifying the gradients in WGAN training, which leads to the proposed rectified Wasserstein generative adversarial network (ReWaGAN). In addition, we build interesting connections between ReWaGAN and the perception-distortion tradeoff. We verify ReWaGAN on two representative image restoration tasks: single image super-resolution (4× and 8×) and compression artifact reduction, where our ReWaGAN not only beats the vanilla WGAN consistently, but also outperforms the state-of-the-art perceptual quality-oriented methods significantly. Our code and models are publicly available at https://github.com/mahaichuan/ReWaGAN.
Article, 2022
Publication:IEEE Transactions on Pattern Analysis & Machine Intelligence, 202206, 1
Publisher:2022

Related articles All 4 versions

<-—2022———2022———1850—



Peer-reviewed
Distributionally Robust Stochastic Optimization with Wasserstein Distance
Authors:Rui Gao; Anton Kleywegt
Summary:Distributionally robust stochastic optimization (DRSO) is an approach to optimization under uncertainty in which, instead of assuming that there is a known true underlying probability distribution, one hedges against a chosen set of distributions. In this paper, we first point out that the set of distributions should be chosen to be appropriate for the application at hand and some of the choices that have been popular until recently are, for many applications, not good choices. We next consider sets of distributions that are within a chosen Wasserstein distance from a nominal distribution. Such a choice of sets has two advantages: (1) The resulting distributions hedged against are more reasonable than those resulting from other popular choices of sets. (2) The problem of determining the worst-case expectation over the resulting set of distributions has desirable tractability properties. We derive a strong duality reformulation of the corresponding DRSO problem and construct approximate worst-case distributions (or an exact worst-case distribution if it exists) explicitly via the first-order optimality conditions of the dual problem. Our contributions are fourfold. (i) We identify necessary and sufficient conditions for the existence of a worst-case distribution, which are naturally related to the growth rate of the objective function. (ii) We show that the worst-case distributions resulting from an appropriate Wasserstein distance have a concise structure and a clear interpretation. (iii) Using this structure, we show that data-driven DRSO problems can be approximated to any accuracy by robust optimization problems, and thereby many DRSO problems become tractable by using tools from robust optimization. (iv) Our strong duality result holds in a very general setting. As examples, we show that it can be applied to infinite dimensional process control and intensity estimation for point processes.
Downloadable Article, 2022
Publication:Mathematics of Operations Research, 20220805
Publisher:2022
Cited by 507 Related articles All 5 versions
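The strong duality referred to in the entry above is usually stated, in the form common in the Wasserstein DRSO literature (the exact assumptions and notation of the paper are not reproduced here), as

$$\sup_{\mu \,:\, W_p(\mu,\nu)\le\rho}\ \mathbb{E}_{\mu}[f(X)] \;=\; \min_{\lambda\ge 0}\Big\{ \lambda\rho^{p} + \mathbb{E}_{\nu}\Big[\sup_{x}\big(f(x)-\lambda\, d(x,\xi)^{p}\big)\Big]\Big\},$$

where $\nu$ is the nominal distribution, $\rho$ the radius of the Wasserstein ball, and $d$ the ground metric; the inner supremum is what yields the explicit (approximate) worst-case distributions mentioned in the abstract.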


Randomized Wasserstein Barycenter Computation: Resampling with Statistical Guarantees
Authors:Florian Heinemann; Axel Munk; Yoav Zemel
Summary:We propose a hybrid resampling method to approximate finitely supported Wasserstein barycenters on large-scale datasets, which can be combined with any exact solver. Nonasymptotic bounds on the expected error of the objective value as well as the barycenters themselves allow one to calibrate computational cost and statistical accuracy. The rate of these upper bounds is shown to be optimal and independent of the underlying dimension, which appears only in the constants. Using a simple modification of the subgradient descent algorithm of Cuturi and Doucet, we showcase the applicability of our method on myriad simulated datasets, as well as a real-data example from cell microscopy, which are out of reach for state-of-the-art algorithms for computing Wasserstein barycenters.
Downloadable Article
Publication:SIAM Journal on Mathematics of Data Science, 4, 2022, 229
Cited by 13 Related articles All 4 versions

 

Peer-reviewed
Wasserstein autoregressive models for density time series
Authors:Chao Zhang; Piotr Kokoszka; Alexander Petersen
Article, 2022
Publication:Journal of Time Series Analysis, 43, January 2022, 30
Publisher:2022



Peer-reviewed
Inferential Wasserstein generative adversarial networks
Authors:Yao Chen; Qingyi Gao; Xiao Wang
Article, 2022
Publication:Journal of the Royal Statistical Society: Series B (Statistical Methodology), 84, February 2022, 83
Publisher:2022

One Loss for Quantization: Deep Hashing with Discrete Wasserstein Distributional Matching
Authors:Doan, Khoa D. (Creator), Yang, Peng (Creator), Li, Ping (Creator)
Summary:Image hashing is a principled approximate nearest neighbor approach to find similar items to a query in a large collection of images. Hashing aims to learn a binary-output function that maps an image to a binary vector. For optimal retrieval performance, producing balanced hash codes with low-quantization error to bridge the gap between the learning stage's continuous relaxation and the inference stage's discrete quantization is important. However, in the existing deep supervised hashing methods, coding balance and low-quantization error are difficult to achieve and involve several losses. We argue that this is because the existing quantization approaches in these methods are heuristically constructed and not effective to achieve these objectives. This paper considers an alternative approach to learning the quantization constraints. The task of learning balanced codes with low quantization error is re-formulated as matching the learned distribution of the continuous codes to a pre-defined discrete, uniform distribution. This is equivalent to minimizing the distance between two distributions. We then propose a computationally efficient distributional distance by leveraging the discrete property of the hash functions. This distributional distance is a valid distance and enjoys lower time and sample complexities. The proposed single-loss quantization objective can be integrated into any existing supervised hashing method to improve code balance and quantization error. Experiments confirm that the proposed approach substantially improves the performance of several representative hashing methods.
Downloadable Archival Material, 2022-05-31
Undefined
Publisher:2022-05-31

2022


Peer-reviewed
A Self-Attention Based Wasserstein Generative Adversarial Networks for Single Image Inpainting
Authors:Yuanxin Mao; Tianzhuang Zhang; Bo Fu; Dang N. H. Thanh
Article, 2022
Publication:Pattern Recognition and Image Analysis, 32, 20221019, 591
Publisher:2022
Peer-reviewed
Health Indicator Construction Method of Bearings Based on Wasserstein Dual-Domain Adversarial Networks Under Normal Data Only
Related articles All 3 versions

Authors:Jie Li; Yanyang Zi; Yu Wang; Ying Yang
Summary:Rolling bearings are the most critical parts of rotating machinery, and their damage is the leading cause of system failures. To ensure the reliability of the system, it is necessary to construct a health indicator (HI) to assess the state of degradation. However, existing HI construction methods (HICMs) have two limitations. First, the integration of well-designed features relies heavily on the experience of domain expert knowledge. Second, the construction of intelligent HIs relies too much on life-cycle data. To cope with these limitations, this article proposes an HICM based on Wasserstein dual-domain adversarial networks (WD-DAN), namely HICM-WD-DAN, which can extract generalized features with only normal data during training. The dual-domain regularization constraint drives the generated signals to approach normal samples, making the constructed HI more robust and accurate. Moreover, to balance the weights of the dual-domain parts automatically, an independent weighting structure is introduced. Finally, considering the actual degradation state of the system, modified monotonicity and trendability indexes are proposed to evaluate the performance of the HI. The effectiveness of HICM-WD-DAN is verified on bearings' life-cycle data, and the results show that the constructed HI can represent the irreversible degradation process of bearings accurately and monotonically.
Article, 2022
Publication:IEEE Transactions on Industrial Electronics, 69, 202210, 10615
Publisher:2022


Improving Human Image Synthesis with Residual Fast Fourier Transformation and Wasserstein Distance
Authors:Jianhan Wu; Shijing Si; Jianzong Wang; Jing Xiao. 2022 International Joint Conference on Neural Networks (IJCNN)
Summary:With the rapid development of the Metaverse, virtual humans have emerged, and human image synthesis and editing techniques, such as pose transfer, have recently become popular. Most of the existing techniques rely on GANs, which can generate good human images even with large variants and occlusions. However, to the best of our knowledge, the existing state-of-the-art methods still have two problems: first, the rendering of the synthesized image is not realistic, with some regions rendered poorly; second, GAN training is unstable and slow to converge, for example due to mode collapse. We propose several methods to address these two problems. To improve the rendering effect, we use the Residual Fast Fourier Transform Block to replace the traditional Residual Block. Then, spectral normalization and the Wasserstein distance are used to improve the speed and stability of GAN training. Experiments demonstrate that the proposed methods are effective at solving the problems listed above, and we obtain state-of-the-art scores in LPIPS and PSNR.
Chapter, 2022
Publication:2022 International Joint Conference on Neural Networks (IJCNN), 20220718, 1
Publisher:2022
Related articles All 5 versions


Peer-reviewed
On isometries of compact Lp-Wasserstein spaces
Author:Jaime Santos-Rodríguez
Article
Publication:Advances in Mathematics: Part A, 409, 2022-11-19

<-—2022———2022———1860—



A Wasserstein-based measure of conditional dependence
Authors:Jalal Etesami; Kun Zhang; Negar Kiyavash
Summary:Measuring conditional dependencies among the variables of a network is of great interest to many disciplines. This paper studies some shortcomings of the existing dependency measures in detecting direct causal influences or their lack of ability for group selection to capture strong dependencies, and accordingly introduces a new statistical dependency measure to overcome them. This measure is inspired by Dobrushin's coefficients and based on the fact that there is no dependency between X and Y given another variable Z if and only if the conditional distribution of Y given (X, Z) = (x, z) does not change when X takes another realization x' while Z takes the same realization z. We show the advantages of this measure over the related measures in the literature. Moreover, we establish the connection between our measure and the integral probability metric (IPM), which helps to develop estimators of the measure with lower complexity compared to other relevant information-theoretic measures. Finally, we show the performance of this measure through numerical simulations.
Article, 2022
Publication:Behaviormetrika, 49, 20220625, 343

Publisher:2022

Cited by 1 Related articles All 2 versions

A Mayer optimal control problem on Wasserstein spaces over Riemannian manifolds
Authors:F. Jean; O. Jerhaoui; H. Zidani
Summary:This paper concerns an optimal control problem on the space of probability measures over a compact Riemannian manifold. The motivation behind it is to model certain situations where the central planner of a deterministic controlled system has only a probabilistic knowledge of the initial condition. The lack of information here is very specific. In particular, we show that the value function verifies a dynamic programming principle and we prove that it is the unique viscosity solution to a suitable Hamilton-Jacobi-Bellman equation. The notion of viscosity is defined using test functions that are directionally differentiable in the space of probability measures.
Article
Publication:IFAC PapersOnLine, 55, 2022, 44



Peer-reviewed
Wasserstein stability of porous medium-type equations on manifolds with Ricci curvature bounded below
Authors:Nicolò De Ponti; Matteo Muratori; Carlo Orrieri
Summary:Given a complete, connected Riemannian manifold
Article
Publication:Journal of Functional Analysis, 283, 2022-11-01
Peer-reviewed
Well-posedness for some non-linear SDEs and related PDE on the Wasserstein space
Authors:Paul-Eric Chaudru de Raynal; Noufel Frikha
Summary:In this paper, we investigate the well-posedness of the martingale problem associated to non-linear stochastic differential equations (SDEs) in the sense of McKean-Vlasov under mild assumptions on the coefficients, as well as classical solutions for a class of associated linear partial differential equations (PDEs) defined on the Wasserstein space.
Article
Publication:Journal de mathématiques pures et appliquées, 159, March 2022, 1
Cited by 19
Related articles All 4 versions


RoBiGAN: A bidirectional Wasserstein GAN approach for online robot fault diagnosis via internal anomaly detection

T Schnell, K Bott, L Puck, T Buettner… - 2022 IEEE/RSJ …, 2022 - ieeexplore.ieee.org

… Finally, TadGAN [17] offers a model similar to the BiGAN architecture using Wasserstein …

Therefore, we introduce a bidirectional Wasserstein GAN architecture fit for online anomaly …

Related articles
 

Peer-reviewed
Wasserstein Distances, Geodesics and Barycenters of Merge Trees
Authors:Mathieu Pont; Jules Vidal; Julie Delon; Julien Tierny
Summary:This paper presents a unified computational framework for the estimation of distances, geodesics and barycenters of merge trees. We extend recent work on the edit distance [104] and introduce a new metric, called the Wasserstein distance between merge trees, which is purposely designed to enable efficient computations of geodesics and barycenters. Specifically, our new distance is strictly equivalent to the $L^2$-Wasserstein distance between extremum persistence diagrams, but it is restricted to a smaller solution space, namely, the space of rooted partial isomorphisms between branch decomposition trees. This enables a simple extension of existing optimization frameworks [110] for geodesics and barycenters from persistence diagrams to merge trees. We introduce a task-based algorithm which can be generically applied to distance, geodesic, barycenter or cluster computation. The task-based nature of our approach enables further accelerations with shared-memory parallelism. Extensive experiments on public ensembles and SciVis contest benchmarks demonstrate the efficiency of our approach - with barycenter computations in the orders of minutes for the largest examples - as well as its qualitative ability to generate representative barycenter merge trees, visually summarizing the features of interest found in the ensemble. We show the utility of our contributions with dedicated visualization applications: feature tracking, temporal reduction and ensemble clustering. We provide a lightweight C++ implementation that can be used to reproduce our results.
Article, 2022
Publication:IEEE Transactions on Visualization and Computer Graphics, 28, 202201, 291
Publisher:2022

2022


Generative Data Augmentation via Wasserstein Autoencoder for Text Classification
Authors:Kyohoon Jin; Junho Lee; Juhwan Choi; Soojin Jang; Youngbin Kim. 2022 13th International Conference on Information and Communication Technology Convergence (ICTC)
Summary:Generative latent variable models are commonly used in text generation and augmentation. However, generative latent variable models such as the variational autoencoder (VAE) suffer from a posterior collapse problem, in which learning is ignored for a subset of latent variables during training. In particular, this phenomenon frequently occurs when the VAE is applied to natural language processing, which may degrade the reconstruction performance. In this paper, we propose a data augmentation method based on a pre-trained language model (PLM) using the Wasserstein autoencoder (WAE) structure. The WAE is used to prevent posterior collapse in the generative model, and the PLM is placed in the encoder and decoder to improve the augmentation performance. We evaluate the proposed method on seven benchmark datasets and demonstrate the augmentation effect.
Chapter, 2022
Publication:2022 13th International Conference on Information and Communication Technology Convergence (ICTC), 20221019, 603
Publisher:2022

Distributionally Safe Path Planning: Wasserstein Safe RRT
Authors:Paul Lathrop; Beth Boardman; Sonia Martinez
Summary:In this paper, we propose a Wasserstein metric-based random path planning algorithm. Wasserstein Safe RRT (W-Safe RRT) provides finite-sample probabilistic guarantees on the safety of a returned path in an uncertain obstacle environment. Vehicle and obstacle states are modeled as distributions based upon state and model observations. We define limits on distributional sampling error so the Wasserstein distance between a vehicle state distribution and obstacle distributions can be bounded. This enables the algorithm to return safe paths with a confidence bound through combining finite sampling error bounds with calculations of the Wasserstein distance between discrete distributions. W-Safe RRT is compared against a baseline minimum encompassing ball algorithm, which ensures balls that minimally encompass discrete state and obstacle distributions do not overlap. The improved performance is verified in a 3D environment using single, multi, and rotating non-convex obstacle cases, with and without forced obstacle error in adversarial directions, showing that W-Safe RRT can handle poorly modeled complex environments.
Article, 2022
Publication:IEEE Robotics and Automation Letters, 7, 202201, 430
Publisher:2022

Peer-reviewed
Existence and stability results for an isoperimetric problem with a non-local interaction of Wasserstein type
Authors:Jules Candau-Tilh; Michael Goldman
Summary:The aim of this paper is to prove the existence of minimizers for a variational problem involving the minimization under volume constraint of the sum of the perimeter and a non-local energy of Wasserstein type. This extends previous partial results to the full range of parameters. We also show that in the regime where the perimeter is dominant, the energy is uniquely minimized by balls.
Article, 2022
Publication:ESAIM: Control, Optimisation and Calculus of Variations, 28, 2022
Publisher:2022

Peer-reviewed
Accelerating the discovery of anticancer peptides targeting lung and breast cancers with the Wasserstein autoencoder model and PSO algorithm
Authors:Lijuan Yang; Guanghui Yang; Zhitong Bing; Yuan Tian; Liang Huang; Yuzhen Niu; Lei Yang
Article, 2022
Publication:Briefings in bioinformatics, 23, 2022
Publisher:2022

Peer-reviewed
On Wasserstein-1 distance in the central limit theorem for elephant random walk
Authors:Xiaohui Ma; Mohamed El Machkouri; Xiequan Fan
Article, 2022
Publication:Journal of mathematical physics, 63, 2022
Publisher:2022

<-—2022———2022———1870—


Peer-reviewed
Interval-valued functional clustering based on the Wasserstein distance with application to stock data
Authors:Lirong Sun; Lijun Zhu; Wencheng Li; Chonghui Zhang; Tomas Balezentis
Article, 2022
Publication:Information sciences, 606, 2022, 910
Publisher:2022
Zbl 07814178


 
Peer-reviewed
DeepParticle: Learning invariant measure by a deep neural network minimizing Wasserstein distance on data generated from an interacting particle method
Authors:Zhongjian Wang; Jack Xin; Zhiwen Zhang
Article, 2022
Publication:Journal of computational physics, 464, 2022
Publisher:2022

Cited by 4 Related articles All 10 versions


A Wasserstein GAN Autoencoder for SCMA Networks
Authors:Luciano Miuccio; Daniela Panno; Salvatore Riolo
Article, 2022
Publication:IEEE wireless communications letters, 11, 2022, 1298
Publisher:2022

Weisfeiler-Lehman meets Gromov-Wasserstein
Authors:Chen, Samantha (Creator), Lim, Sunhyuk (Creator), Mémoli, Facundo (Creator), Wan, Zhengchao (Creator), Wang, Yusu (Creator)
Summary:The Weisfeiler-Lehman (WL) test is a classical procedure for graph isomorphism testing. The WL test has also been widely used both for designing graph kernels and for analyzing graph neural networks. In this paper, we propose the Weisfeiler-Lehman (WL) distance, a notion of distance between labeled measure Markov chains (LMMCs), of which labeled graphs are special cases. The WL distance is polynomial time computable and is also compatible with the WL test in the sense that the former is positive if and only if the WL test can distinguish the two involved graphs. The WL distance captures and compares subtle structures of the underlying LMMCs and, as a consequence of this, it is more discriminating than the distance between graphs used for defining the state-of-the-art Wasserstein Weisfeiler-Lehman graph kernel. Inspired by the structure of the WL distance we identify a neural network architecture on LMMCs which turns out to be universal w.r.t. continuous functions defined on the space of all LMMCs (which includes all graphs) endowed with the WL distance. Finally, the WL distance turns out to be stable w.r.t. a natural variant of the Gromov-Wasserstein (GW) distance for comparing metric Markov chains that we identify. Hence, the WL distance can also be construed as a polynomial time lower bound for the GW distance which is in general NP-hard to compute.
Downloadable Archival Material, 2022-02-05
Undefined
Publisher:2022-02-05


Peer-reviewed
Wasserstein-based methods for convergence complexity analysis of MCMC with applications
Authors:Qian Qin; James P. Hobert
Article, 2022
Publication:Annals of applied probability, 32, 2022, 124
Publisher:2022
Cited by 7 Related articles All 5 versions


2022


Peer-reviewed
Estimation of Wasserstein distances in the Spiked Transport Model

Authors:J. Niles-Weed; P. Rigollet
Article, 2022
Publication:Bernoulli, 28, 2022, 2663
Publisher:2022

Cited by 57 Related articles All 7 versions


A Target SAR Image Expansion Method Based on Conditional Wasserstein Deep Convolutional GAN for Automatic Target Recognition
Authors:Jikai Qin; Zheng Liu; Lei Ran; Rong Xie; Junkui Tang; Zekun Guo
Article, 2022
Publication:IEEE journal of selected topics in applied earth observations and remote sensing, 15, 2022, 7153
Publisher:2022

Fast and Provably Convergent Algorithms for Gromov-Wasserstein in Graph Learning
Authors:Li, Jiajin (Creator), Tang, Jianheng (Creator), Kong, Lemin (Creator), Liu, Huikang (Creator), Li, Jia (Creator), So, Anthony Man-Cho (Creator), Blanchet, Jose (Creator)
Summary:In this paper, we study the design and analysis of a class of efficient algorithms for computing the Gromov-Wasserstein (GW) distance tailored to large-scale graph learning tasks. Armed with the Luo-Tseng error bound condition (Luo and Tseng, 1992), two proposed algorithms, called Bregman Alternating Projected Gradient (BAPG) and hybrid Bregman Proximal Gradient (hBPG), are proven to be (linearly) convergent. Upon task-specific properties, our analysis further provides novel theoretical insights to guide how to select the best fit method. As a result, we are able to provide comprehensive experiments to validate the effectiveness of our methods on a host of tasks, including graph alignment, graph partition, and shape matching. In terms of both wall-clock time and modeling performance, the proposed methods achieve state-of-the-art results.
Downloadable Archival Material, 2022-05-17
Undefined
Publisher:2022-05-17


Peer-reviewed
Time discretizations of Wasserstein—Hamiltonian flows
Authors:Jianbo Cui; Luca Dieci; Haomin Zhou
Article, 2022
Publication:Mathematics of computation, 335, 2022, 1019
Publisher:2022

Peer-reviewed
Distributionally Robust Second-Order Stochastic Dominance Constrained Optimization with Wasserstein Ball
Authors:Yu Mei; Jia Liu; Zhiping Chen
Article, 2022
Publication:SIAM journal on optimization, 32, 2022, 715
Publisher:2022

<-—2022———2022———1880—




Multi-Marginal Gromov-Wasserstein Transport and Barycenters
Authors:Beier, Florian (Creator), Beinert, Robert (Creator), Steidl, Gabriele (Creator)
Summary:Gromov-Wasserstein (GW) distances are generalizations of Gromov-Hausdorff and Wasserstein distances. Due to their invariance under certain distance-preserving transformations they are well suited for many practical applications. In this paper, we introduce a concept of multi-marginal GW transport as well as its regularized and unbalanced versions. Then we generalize a bi-convex relaxation of the GW transport to our multi-marginal setting which is tight if the cost function is conditionally negative definite in a certain sense. The minimization of this relaxed model can be done by an alternating algorithm, where each step can be performed by a Sinkhorn scheme for a multi-marginal transport problem. We show a relation of our multi-marginal GW problem for a tree-structured cost function to an (unbalanced) GW barycenter problem and present different proof-of-concept numerical results.
Downloadable Archival Material, 2022-05-13
Undefined
Publisher:2022-05-13
Cited by 1 Related articles All 2 versions


Caluya, Kenneth F.
Halder, Abhishek

Wasserstein proximal algorithms for the Schrödinger bridge problem: density control with nonlinear drift. (English) Zbl 07560631

IEEE Trans. Autom. Control 67, No. 3, 1163-1178 (2022).

MSC:  93-XX


Peer-reviewed
Wasserstein Proximal Algorithms for the Schrödinger Bridge Problem: Density Control With Nonlinear Drift
Authors:Kenneth F. Caluya; Abhishek Halder
Article, 2022
Publication:IEEE transactions on automatic control, 67, 2022, 1163
Publisher:2022


Gromov-Wasserstein Discrepancy with Local Differential Privacy for Distributed Structural Graphs
Authors:Jin, Hongwei (Creator), Chen, Xun (Creator)
Summary:Learning the similarity between structured data, especially the graphs, is one of the essential problems. Besides the approach like graph kernels, Gromov-Wasserstein (GW) distance recently draws big attention due to its flexibility to capture both topological and feature characteristics, as well as handling the permutation invariance. However, structured data are widely distributed for different data mining and machine learning applications. With privacy concerns, accessing the decentralized data is limited to either individual clients or different silos. To tackle these issues, we propose a privacy-preserving framework to analyze the GW discrepancy of node embedding learned locally from graph neural networks in a federated flavor, and then explicitly place local differential privacy (LDP) based on Multi-bit Encoder to protect sensitive information. Our experiments show that, with strong privacy protections guaranteed by the $\varepsilon$-LDP algorithm, the proposed framework not only preserves privacy in graph learning but also presents a noised structural metric under GW distance, resulting in comparable and even better performance in classification and clustering tasks. Moreover, we reason the rationale behind the LDP-based GW distance analytically and empirically.
Downloadable Archival Material, 2022-02-01
Undefined
Publisher:2022-02-01

On Assignment Problems Related to Gromov-Wasserstein Distances on the Real Line
Authors:Beinert, Robert (Creator), Heiss, Cosmas (Creator), Steidl, Gabriele (Creator)
Summary:Let $x_1 < \dots < x_n$ and $y_1 < \dots < y_n$, $n \in \mathbb N$, be real numbers. We show by an example that the assignment problem $$ \max_{\sigma \in S_n} F_\sigma(x,y) := \frac12 \sum_{i,k=1}^n |x_i - x_k|^\alpha \, |y_{\sigma(i)} - y_{\sigma(k)}|^\alpha, \quad \alpha >0, $$ is in general solved neither by the identical permutation (id) nor the anti-identical permutation (a-id) if $n > 2 +2^\alpha$. Indeed, the above maximum can be, depending on the number of points, arbitrarily far away from $F_\text{id}(x,y)$ and $F_\text{a-id}(x,y)$. The motivation to deal with such assignment problems came from their relation to Gromov-Wasserstein divergences, which have recently attained a lot of attention.
Downloadable Archival Material, 2022-05-18
Undefined
Publisher:2022-05-18

Cited by 19 Related articles All 4 versions
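The objective $F_\sigma$ stated in the entry above can be checked by brute force for small $n$. A hypothetical sketch with random points (the paper's specific counterexample is not reproduced; whether the maximizer differs from id and a-id depends on the configuration):

```python
import itertools
import numpy as np

def F(sigma, x, y, alpha):
    """F_sigma(x, y) = 0.5 * sum_{i,k} |x_i - x_k|^alpha * |y_sigma(i) - y_sigma(k)|^alpha."""
    ys = y[list(sigma)]
    return 0.5 * float(np.sum((np.abs(x[:, None] - x[None, :]) ** alpha)
                              * (np.abs(ys[:, None] - ys[None, :]) ** alpha)))

rng = np.random.default_rng(1)
n, alpha = 7, 1.0                      # n > 2 + 2**alpha, so id/a-id need not be optimal
x = np.sort(rng.uniform(size=n))
y = np.sort(rng.uniform(size=n))

identity = tuple(range(n))
anti_id = tuple(range(n - 1, -1, -1))
best = max(itertools.permutations(range(n)), key=lambda s: F(s, x, y, alpha))

print("F(id)   =", F(identity, x, y, alpha))
print("F(a-id) =", F(anti_id, x, y, alpha))
print("F(best) =", F(best, x, y, alpha), "attained by", best)
```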


Peer-reviewed
Maps on positive definite cones of C*-algebras preserving the Wasserstein mean
Author:Lajos Molnár
Article, 2022
Publication:Proceedings of the American Mathematical Society, 150, 2022, 1209
Publisher:2022

2022


Learning to Predict Graphs with Fused Gromov-Wasserstein Barycenters
Authors:Brogat-Motte, Luc (Creator), Flamary, Rémi (Creator), Brouard, Céline (Creator), Rousu, Juho (Creator), d'Alché-Buc, Florence (Creator)
Summary:This paper introduces a novel and generic framework to solve the flagship task of supervised labeled graph prediction by leveraging Optimal Transport tools. We formulate the problem as regression with the Fused Gromov-Wasserstein (FGW) loss and propose a predictive model relying on a FGW barycenter whose weights depend on inputs. First we introduce a non-parametric estimator based on kernel ridge regression for which theoretical results such as consistency and excess risk bound are proved. Next we propose an interpretable parametric model where the barycenter weights are modeled with a neural network and the graphs on which the FGW barycenter is calculated are additionally learned. Numerical experiments show the strength of the method and its ability to interpolate in the labeled graph space on simulated data and on a difficult metabolic identification problem where it can reach very good performance with very little engineering.
Downloadable Archival Material, 2022-02-08
Undefined
Publisher:2022-02-08

Cited by 4 Related articles All 7 versions



Spherical Sliced-Wasserstein
Authors:Bonet, Clément (Creator), Berg, Paul (Creator), Courty, Nicolas (Creator), Septier, François (Creator), Drumetz, Lucas (Creator), Pham, Minh-Tan (Creator)
Summary:Many variants of the Wasserstein distance have been introduced to reduce its original computational burden. In particular the Sliced-Wasserstein distance (SW), which leverages one-dimensional projections for which a closed-form solution of the Wasserstein distance is available, has received a lot of interest. Yet, it is restricted to data living in Euclidean spaces, while the Wasserstein distance has been studied and used recently on manifolds. We focus more specifically on the sphere, for which we define a novel SW discrepancy, which we call spherical Sliced-Wasserstein, making a first step towards defining SW discrepancies on manifolds. Our construction is notably based on closed-form solutions of the Wasserstein distance on the circle, together with a new spherical Radon transform. Along with efficient algorithms and the corresponding implementations, we illustrate its properties in several machine learning use cases where spherical representations of data are at stake: density estimation on the sphere, variational inference or hyperspherical auto-encoders.
Downloadable Archival Material, 2022-06-17
Undefined
Publisher:2022-06-17

Cited by 4 Related articles All 5 versions
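The spherical construction in the entry above relies on closed forms on the circle and a spherical Radon transform, which are not reproduced here. To make the "sliced" idea concrete, the sketch below shows only the standard Euclidean sliced-Wasserstein estimator (random one-dimensional projections plus sorting), with made-up toy inputs:

```python
import numpy as np

def sliced_wasserstein(X, Y, n_proj=100, p=2, seed=0):
    """Monte Carlo sliced-Wasserstein distance between two point clouds in R^d.

    Each random unit direction gives 1-D projections of X and Y, whose
    p-Wasserstein distance has a closed form via sorting; the results are
    averaged over directions (equal sample sizes assumed for simplicity).
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)
        xp = np.sort(X @ theta)
        yp = np.sort(Y @ theta)
        total += np.mean(np.abs(xp - yp) ** p)
    return (total / n_proj) ** (1.0 / p)

X = np.random.default_rng(1).normal(size=(300, 3))
Y = np.random.default_rng(2).normal(loc=0.5, size=(300, 3))
print(sliced_wasserstein(X, Y))
```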

Approximating 1-Wasserstein Distance with Trees
Authors:Yamada, Makoto (Creator), Takezawa, Yuki (Creator), Sato, Ryoma (Creator), Bao, Han (Creator), Kozareva, Zornitsa (Creator), Ravi, Sujith (Creator)
Summary:Wasserstein distance, which measures the discrepancy between distributions, shows efficacy in various types of natural language processing (NLP) and computer vision (CV) applications. One of the challenges in estimating Wasserstein distance is that it is computationally expensive and does not scale well for many distribution comparison tasks. In this paper, we aim to approximate the 1-Wasserstein distance by the tree-Wasserstein distance (TWD), where TWD is a 1-Wasserstein distance with tree-based embedding and can be computed in linear time with respect to the number of nodes on a tree. More specifically, we propose a simple yet efficient L1-regularized approach to learning the weights of the edges in a tree. To this end, we first show that the 1-Wasserstein approximation problem can be formulated as a distance approximation problem using the shortest path distance on a tree. We then show that the shortest path distance can be represented by a linear model and can be formulated as a Lasso-based regression problem. Owing to the convex formulation, we can obtain a globally optimal solution efficiently. Moreover, we propose a tree-sliced variant of these methods. Through experiments, we demonstrated that the weighted TWD can accurately approximate the original 1-Wasserstein distance.
Downloadable Archival Material, 2022-06-24
Undefined
Publisher:2022-06-24
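Given a fixed weighted tree, the tree-Wasserstein distance used in the entry above has the closed form sum over edges of (edge weight) times |mass difference across the edge|, which is what makes it linear-time. A minimal sketch of that evaluation on a hypothetical toy tree (the paper's Lasso-based learning of the edge weights is not shown):

```python
import numpy as np

def tree_wasserstein(parent, edge_w, mu, nu):
    """1-Wasserstein distance between two distributions on the nodes of a tree.

    parent[i] is the parent of node i (root 0 has parent -1) and edge_w[i] is
    the weight of edge (i, parent[i]); the distance is
        sum_i edge_w[i] * |subtree mass of i under mu - subtree mass under nu|.
    Assumes nodes are numbered so that parent[i] < i.
    """
    diff = np.asarray(mu, dtype=float) - np.asarray(nu, dtype=float)
    sub = diff.copy()
    for i in range(len(parent) - 1, 0, -1):   # children before parents
        sub[parent[i]] += sub[i]
    return float(sum(edge_w[i] * abs(sub[i]) for i in range(1, len(parent))))

# toy tree: node 0 is the root, nodes 1-2 hang below it, nodes 3-4 below node 1
parent = [-1, 0, 0, 1, 1]
edge_w = [0.0, 1.0, 2.0, 0.5, 0.5]
mu = [0.0, 0.2, 0.3, 0.25, 0.25]
nu = [0.1, 0.1, 0.5, 0.15, 0.15]
print(tree_wasserstein(parent, edge_w, mu, nu))
```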

Peer-reviewed
Convergence and finite sample approximations of entropic regularized Wasserstein distances in Gaussian and RKHS settings
Author:Hà Quang Minh
Article, 2022
Publication:Analysis and Applications, 20221126, 1
Publisher:2022

Peer-reviewed
Optimal continuous-singular control of stochastic McKean-Vlasov system in Wasserstein space of probability measures
Authors:Samira Boukaf; Lina Guenane; Mokhtar Hafayed
Article, 2022
Publication:International Journal of Dynamical Systems and Differential Equations, 12, 2022, 301
Publisher:2022

<-—2022———2022———1890—


Bayesian learning with Wasserstein barycenters
Authors:Julio Backhoff-Veraguas; Joaquin Fontbona; Gonzalo Rios; Felipe Tobar
Article, 2022
Publication:ESAIM: Probability and Statistics, 26, 2022, 436
Publisher:2022

Zbl 07730417


Peer-reviewed
On a Linear Gromov–Wasserstein Distance
Authors:Florian Beier; Robert Beinert; Gabriele Steidl
Article, 2022
Publication:IEEE Transactions on Image Processing, 31, 2022, 7292
Publisher:2022
Cited by 5 Related articles All 6 versions


Peer-reviewed
A Continuation Multiple Shooting Method for Wasserstein Geodesic Equation
Authors:Jianbo Cui; Luca Dieci; Haomin Zhou
Summary:In this paper, we propose a numerical method to solve the classic $L^2$-optimal transport problem. Our algorithm is based on the use of multiple shooting, in combination with a continuation procedure, to solve the boundary value problem associated to the transport problem. Based on the viewpoint of Wasserstein Hamiltonian flow with initial and target densities, our algorithm reflects the Hamiltonian structure of the underlying problem and exploits it in the numerical discretization. Several numerical examples are presented to illustrate the performance of the method.
Downloadable Article
Publication:SIAM Journal on Scientific Computing, 44, 2022, A2918


Local Sliced-Wasserstein Feature Sets for Illumination-invariant Face Recognition
Authors:Zhuang, Yan (Creator), Li, Shiying (Creator), Shifat-E-Rabbi, Mohammad (Creator), Yin, Xuwang (Creator), Rubaiyat, Abu Hasnat Mohammad (Creator), Rohde, Gustavo K. (Creator)Show more
Summary:We present a new method for face recognition from digital images acquired under varying illumination conditions. The method is based on mathematical modeling of local gradient distributions using the Radon Cumulative Distribution Transform (R-CDT). We demonstrate that lighting variations cause certain types of deformations of local image gradient distributions which, when expressed in R-CDT domain, can be modeled as a subspace. Face recognition is then performed using a nearest subspace in R-CDT domain of local gradient distributions. Experiment results demonstrate the proposed method outperforms other alternatives in several face recognition tasks with challenging illumination conditions. Python code implementing the proposed method is available, which is integrated as a part of the software package PyTransKit.
Downloadable Archival Material, 2022-02-21
Undefined
Publisher:2022-02-21
Cited by 2 Related articles All 2 versions


Low-rank Wasserstein polynomial chaos expansions in the framework of optimal transport
Authors:Robert Gruhlke; Martin Eigel; Weierstraß-Institut für Angewandte Analysis und Stochastik
Summary:An unsupervised learning approach for the computation of an explicit functional representation of a random vector Y is presented, which only relies on a finite set of samples with unknown distribution. Motivated by recent advances with computational optimal transport for estimating Wasserstein distances, we develop a new Wasserstein multi-element polynomial chaos expansion (WPCE). It relies on the minimization of a regularized empirical Wasserstein metric known as the debiased Sinkhorn divergence.
eBook, 2022
English
Publisher:Weierstraß-Institut für Angewandte Analysis und Stochastik Leibniz-Institut im Forschungsverbund Berlin e.V, Berlin, 2022
Cited by 2 Related articles All 5 versions


2022

Bures–Wasserstein geometry for positive-definite Hermitian matrices and their trace-one subset

Author:Oostrum, Jesse van (Creator)
Summary:In his classical argument, Rao derives the Riemannian distance corresponding to the Fisher metric using a mapping between the space of positive measures and Euclidean space. He obtains the Hellinger distance on the full space of measures and the Fisher distance on the subset of probability measures. In order to highlight the interplay between Fisher theory and quantum information theory, we extend this construction to the space of positive-definite Hermitian matrices using Riemannian submersions and quotient manifolds. The analog of the Hellinger distance turns out to be the Bures–Wasserstein (BW) distance, a distance measure appearing in optimal transport, quantum information, and optimisation theory. First we present an existing derivation of the Riemannian metric and geodesics associated with this distance. Subsequently, we present a novel derivation of the Riemannian distance and geodesics for this metric on the subset of trace-one matrices, analogous to the Fisher distance for probability measures.
Downloadable Archival Material, 2022-09-22
English
Publisher:Springer Singapore, 2022-09-22
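The Bures–Wasserstein distance discussed in the entry above is commonly written as d(A, B)^2 = tr(A) + tr(B) - 2 tr((A^{1/2} B A^{1/2})^{1/2}) for positive-definite Hermitian A, B. A small numerical sketch of that formula (illustrative only; the paper's trace-one and geodesic derivations are not reproduced):

```python
import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein(A, B):
    """Bures-Wasserstein distance between positive-definite Hermitian matrices:
    d(A, B)^2 = tr(A) + tr(B) - 2 tr((A^{1/2} B A^{1/2})^{1/2})."""
    A_half = sqrtm(A)
    middle = sqrtm(A_half @ B @ A_half)
    d2 = np.trace(A) + np.trace(B) - 2.0 * np.trace(np.real(middle))
    return float(np.sqrt(max(np.real(d2), 0.0)))  # guard against round-off

A = np.array([[2.0, 0.3], [0.3, 1.0]])
B = np.array([[1.0, 0.0], [0.0, 1.5]])
print(bures_wasserstein(A, B))
```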


The Wasserstein Impact Measure (WIM): A practical tool for quantifying prior impact in Bayesian statistics

Authors:Ley, Christophe (Creator), Ghaderinezhad, Fatemeh (Creator), Serrien, Ben (Creator)
Abstract:The prior distribution is a crucial building block in Bayesian analysis, and its choice will impact the subsequent inference. It is therefore important to have a convenient way to quantify this impact, as such a measure of prior impact will help to choose between two or more priors in a given situation. To this end a new approach, the Wasserstein Impact Measure (WIM), is introduced. In three simulated scenarios, the WIM is compared to two competitor prior impact measures from the literature, and its versatility is illustrated via two real datasets.
Downloadable Archival Material, 2022-10

English
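A prior-impact measure of the kind described in the entry above can be illustrated by computing a Wasserstein distance between the posteriors obtained under two candidate priors. The following is a hypothetical Beta-Bernoulli sketch (not taken from the paper; priors, data and sample sizes are made up), using scipy's sample-based 1-Wasserstein distance:

```python
import numpy as np
from scipy.stats import beta, wasserstein_distance

# simulated Bernoulli data
rng = np.random.default_rng(0)
data = rng.binomial(1, 0.3, size=20)
s, f = int(data.sum()), int(len(data) - data.sum())

post_uniform = beta(1 + s, 1 + f)        # posterior under a Beta(1, 1) prior
post_informative = beta(10 + s, 2 + f)   # posterior under a Beta(10, 2) prior

# W1 distance between the two posteriors, estimated from posterior samples
impact = wasserstein_distance(post_uniform.rvs(5000, random_state=1),
                              post_informative.rvs(5000, random_state=2))
print("estimated prior impact (W1 between posteriors):", impact)
```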


Fast Sinkhorn I: An O(N) Algorithm for the Wasserstein-1 Metric
Authors:Liao, Qichen (Creator), Chen, Jing (Creator), Wang, Zihao (Creator), Bai, Bo (Creator), Jin, Shi (Creator), Wu, Hao (Creator)

Summary:The Wasserstein metric is broadly used in optimal transport for comparing two probabilistic distributions, with successful applications in various fields such as machine learning, signal processing, seismic inversion, etc. Nevertheless, the high computational complexity is an obstacle for its practical applications. The Sinkhorn algorithm, one of the main methods in computing the Wasserstein metric, solves an entropy regularized minimizing problem, which allows arbitrary approximations to the Wasserstein metric with O(N^2) computational cost. However, higher accuracy of its numerical approximation requires more Sinkhorn iterations with repeated matrix-vector multiplications, which is still unaffordable. In this work, we propose an efficient implementation of the Sinkhorn algorithm to calculate the Wasserstein-1 metric with O(N) computational cost, which achieves the optimal theoretical complexity. By utilizing the special structure of Sinkhorn's kernel, the repeated matrix-vector multiplications can be implemented with O(N) times multiplications and additions, using the Qin Jiushao or Horner's method for efficient polynomial evaluation, leading to an efficient algorithm without losing accuracy. In addition, the log-domain stabilization technique, used to stabilize the iterative procedure, can also be applied in this algorithm. Our numerical experiments show that the newly developed algorithm is one to three orders of magnitude faster than the original Sinkhorn algorithm.
Downloadable Archival Material, 2022
English
Publisher:Int Press Boston, Inc, 2022
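The O(N) kernel trick of the entry above is not reproduced here; for reference, the textbook entropic Sinkhorn iteration that it accelerates (O(N^2) per iteration because of the dense matrix-vector products) looks roughly as follows, on hypothetical toy 1-D marginals with |x - y| ground cost:

```python
import numpy as np

def sinkhorn(mu, nu, C, eps=0.05, n_iter=500):
    """Textbook Sinkhorn iteration for entropy-regularized optimal transport.

    Returns the regularized transport cost <P, C> for marginals mu, nu and
    cost matrix C; each iteration performs two dense matrix-vector products
    with the kernel K = exp(-C / eps).
    """
    K = np.exp(-C / eps)
    u = np.ones_like(mu)
    for _ in range(n_iter):
        v = nu / (K.T @ u)
        u = mu / (K @ v)
    P = u[:, None] * K * v[None, :]
    return float(np.sum(P * C))

x = np.linspace(0, 1, 200)
mu = np.exp(-((x - 0.3) ** 2) / 0.01); mu /= mu.sum()
nu = np.exp(-((x - 0.7) ** 2) / 0.02); nu /= nu.sum()
C = np.abs(x[:, None] - x[None, :])   # W1 ground cost, as in the entry above
print(sinkhorn(mu, nu, C))
```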


Risk Averse Path Planning Using Lipschitz Approximated Wasserstein Distributionally Robust Deep Q-Learning
Author:Alptürk, Cem (Creator)
Summary:We investigate the problem of risk averse robot path planning using the deep reinforcement learning and distributionally robust optimization perspectives. Our problem formulation involves modelling the robot as a stochastic linear dynamical system, assuming that a collection of process noise samples is available. We cast the risk averse motion planning problem as a Markov decision process and propose a continuous reward function design that explicitly takes into account the risk of collision with obstacles while encouraging the robot’s motion towards the goal. We learn the risk-averse robot control actions through Lipschitz approximated Wasserstein distributionally robust deep Q-Learning to hedge against the noise uncertainty. The learned control actions result in a safe and risk averse trajectory from the source to the goal, avoiding all the obstacles. Various supporting numerical simulations are presented to demonstrate our proposed approach.
Downloadable Archival Material, 2022
English
Publisher:Lunds universitet/Institutionen för reglerteknik, 2022


Neural Subgraph Counting with Wasserstein Estimator
Authors:Wang, Hanchen (Creator), Hu, Rong (Creator), Zhang, Ying (Creator), Qin, Lu (Creator), Wang, Wei (Creator), Zhang, Wenjie (Creator)
Summary:Subgraph counting is a fundamental graph analysis task which has been widely used in many applications. As the problem of subgraph counting is NP-complete and hence intractable, approximate solutions have been widely studied, which fail to work with large and complex query graphs. Alternatively, Machine Learning techniques have been recently applied for this problem, yet the existing ML approaches either only support very small data graphs or cannot make full use of the data graph information, which inherently limits their scalability, estimation accuracies and robustness. In this paper, we propose a novel approximate subgraph counting algorithm, NeurSC, that can exploit and combine information from both the query graphs and the data graphs effectively and efficiently. It consists of two components: (1) an extraction module that adaptively generates simple yet representative substructures from data graph for each query graph and (2) an estimator WEst that first computes the representations from individual and joint distributions of query and data graphs and then estimates subgraph counts with the learned representations. Furthermore, we design a novel Wasserstein discriminator in WEst to minimize the Wasserstein distance between query and data graphs by updating the parameters in network with the vertex correspondence relationship between query and data graphs. By doing this, WEst can better capture the correlation between query and data graphs which is essential to the quality of the estimation. We conduct experimental studies on seven large real-life labeled graphs to demonstrate the superior performance of NeurSC in terms of estimation accuracy and robustness.
Downloadable Archival Material, 2022
English
Publisher:Association for Computing Machinery, 2022

<-—2022———2022———1900—




Peer-reviewed
A novel conditional weighting transfer Wasserstein auto-encoder for rolling bearing fault diagnosis with multi-source domains
Authors:Ke Zhao; Feng Jia; Haidong Shao
Summary:Transfer learning based on a single source domain to a target domain has received a lot of attention in the cross-domain fault diagnosis tasks of rolling bearing. However, the practical issues often contain multiple source domain data, and the information contained in the target domain is quite different from one source domain to another. Therefore, the transfer pattern from multiple source domains to the target domain undoubtedly has brighter application prospects. Based on these discussions, a multi-source domain transfer learning approach called conditional weighting transfer Wasserstein auto-encoder is developed to deal with the challenges of cross-domain fault diagnosis. Different from the traditional distribution alignment idea of directly aligning the source and target domains, the proposed framework adopts an indirect latent alignment idea to achieve better feature alignment, that is, indirectly aligning the feature distribution of source and target in the latent feature space with the help of Gaussian prior distribution. Furthermore, considering the variability of different source domains containing information about the target domain, an ingenious conditional weighting strategy is designed to quantify the similarity of different source domains to target domain, and further help the proposed model to minimize the discrepancy in conditional distribution. The cross-domain fault diagnosis tasks adequately verify that the proposed framework can sufficiently transfer knowledge from all source domains to the target domain, and has extensive application prospects.
Article
Publication:Knowledge-Based Systems

MHA-WoML: Multi-head attention and Wasserstein-OT for few-shot learning
Authors:Junyan Yang; Jie Jiang; Yanming Guo
Summary:Few-shot learning aims to classify novel classes with extremely few labeled samples. Existing metric-learning-based approaches tend to employ the off-the-shelf CNN models for feature extraction, and conventional clustering algorithms for feature matching. These methods neglect the importance of image regions and might trap in over-fitting problems during feature clustering. In this work, we propose a novel MHA-WoML framework for few-shot learning, which adaptively focuses on semantically dominant regions, and well relieves the over-fitting problem. Specifically, we first design a hierarchical multi-head attention (MHA) module, which consists of three functional heads (i.e., rare head, syntactic head and positional head) with masks, to extract comprehensive image features, and screen out invalid features. The MHA behaves better than current transformers in few-shot recognition. Then, we incorporate the optimal transport theory into Wasserstein distance and propose a Wasserstein-OT metric learning (WoML) module for category clustering. The WoML module focuses more on calculating the appropriately approximate barycenter to avoid the over accurate sub-stage fitting which may threaten the global fitting, thus alleviating the problem of over-fitting in the training process. Experimental results show that our approach achieves remarkably better performance compared to current state-of-the-art methods by scoring about 3% higher accuracy, across four benchmark datasets including MiniImageNet, TieredImageNet, CIFAR-FS and CUB200.
Article, 2022
Publication:International Journal of Multimedia Information Retrieval, 11, 20220921, 681
Publisher:2022

Peer-reviewed
Distributionally Robust Inverse Covariance Estimation: The Wasserstein Shrinkage Estimator
Authors:Viet Anh Nguyen; Daniel Kuhn; P. Mohajerin Esfahani
Summary:We introduce a distributionally robust maximum likelihood estimation model with a Wasserstein ambiguity set to infer the inverse covariance matrix of a p-dimensional Gaussian random vector from n independent samples. The proposed model minimizes the worst case (maximum) of Stein’s loss across all normal reference distributions within a prescribed Wasserstein distance from the normal distribution characterized by the sample mean and the sample covariance matrix. We prove that this estimation problem is equivalent to a semidefinite program that is tractable in theory but beyond the reach of general-purpose solvers for practically relevant problem dimensions p. In the absence of any prior structural information, the estimation problem has an analytical solution that is naturally interpreted as a nonlinear shrinkage estimator. Besides being invertible and well conditioned even for p > n, the new shrinkage estimator is rotation equivariant and preserves the order of the eigenvalues of the sample covariance matrix. These desirable properties are not imposed ad hoc but emerge naturally from the underlying distributionally robust optimization model. Finally, we develop a sequential quadratic approximation algorithm for efficiently solving the general estimation problem subject to conditional independence constraints typically encountered in Gaussian graphical models.
Article
Publication:Operations Research, 70, 2022



Data-Driven Approximation of the Perron-Frobenius Operator Using the Wasserstein Metric
Authors:Amirhossein Karimi; Tryphon T. Georgiou
Summary:This manuscript introduces a regression-type formulation for approximating the Perron-Frobenius Operator by relying on distributional snapshots of data. These snapshots may represent densities of particles. The Wasserstein metric is leveraged to define a suitable functional optimization in the space of distributions. The formulation allows seeking suitable dynamics so as to interpolate the distributional flow in function space. A first-order necessary condition for optimality is derived and utilized to construct a gradient flow approximating algorithm. The framework is exemplified with numerical simulations.
Article
Publication:IFAC PapersOnLine, 55, 2022, 341

Peer-reviewed
Stein factors for variance-gamma approximation in the Wasserstein and Kolmogorov distances
Author:Robert E. Gaunt
Summary:We obtain new bounds for the solution of the variance-gamma (VG) Stein equation that are of the correct form for approximations in terms of the Wasserstein and Kolmogorov metrics. These bounds hold for all parameter values of the four-parameter VG class. As an application we obtain explicit Wasserstein and Kolmogorov distance error bounds in a six moment theorem for VG approximation of double Wiener-Itô integrals.
Article
Publication:Journal of Mathematical Analysis and Applications, 514, 2022-10-01

2022


Tracial smooth functions of non-commuting variables and the free Wasserstein manifold
Authors:David Jekel; Dimitri Shlyakhtenko; Wuchen Li; Polska Akademia Nauk
Print Book, 2022
English
Publisher:Instytut Matematyczny PAN, Warszawa, 2022

Parallel translations, Newton flows and Q-Wiener processes on the Wasserstein space
Authors:Hao Ding; Shizan Fang; Xiangdong Li; José-Luis Jaramillo; Fengyu Wang; Nicolas Juillet; Zhao Dong; Université Bourgogne Franche-Comté; Academy of Mathematics and Systems Science, CAS, Beijing; École doctorale Carnot-Pasteur (Besançon / Dijon) (2012-....)
Summary:- We extend Lott's definition of the Levi-Civita connection to the Wasserstein space of probability measures admitting a density and a divergence. A vector field along an absolutely continuous curve is extended to the whole space so that parallel transports can be defined as in differential geometry. We prove the existence of parallel transports in Lott's strong sense in the case of the torus. - We prove existence and uniqueness for the Newton equation on the Wasserstein space and highlight the relation between the relaxed Newton flow and the Keller-Segel equation. - We establish an intrinsic formalism for Itô stochastic calculus on the Wasserstein space through the three typical functionals. We construct the weak and strong forms of the stochastic partial differential equation defining parallel transport, whose existence and uniqueness are proved in the case of the torus. Non-degenerate diffusion processes are constructed using the eigenfunctions of the Laplacian. - We construct a new approach connecting interacting particle systems to solutions of the martingale problem for the Dean-Kawasaki equation on the torus under a weaker condition on the spatial correlation intensity.
Computer Program, 2022
English
Publisher:2022

Parallel translations, Newton flows and Q-Wiener processes on the Wasserstein space  thesis


2022 thesis
Gromov-Wasserstein Distances and their Lower Bounds
Authors:Christoph Alexander Weitkamp; Prof. Munk; Dr. Proksch
Summary:In various applications in biochemistry, computer vision and machine learning, it is of great interest to compare general objects in a pose invariant manner. Recently, the following approach has received increased attention: Model the objects considered as metric measure spaces and compare them with the Gromov-Wasserstein distance. While this distance has many theoretically appealing properties and is a natural distance concept in numerous frameworks, it is NP-hard to compute. In consequence, several alternatives to the precise determination of this distance have been proposed. On the one h...
Thesis, Dissertation, 2022
English
Publisher:2022

Super-resolution of Sentinel-2 images using Wasserstein GAN

Authors:Hasan Latif; Sajid Ghuffar; Hafiz Mughees Ahmad
Summary:The Sentinel-2 satellites deliver 13 band multi-spectral imagery with bands having 10m, 20m or 60m spatial resolution. The low-resolution bands can be upsampled to match the high resolution bands to extract valuable information at higher spatial resolution. This paper presents a Wasserstein Generative Adversarial Network (WGAN) based approach named as DSen2-WGAN to super-resolve the low-resolution (i.e., 20m and 60m) bands of Sentinel-2 images to a spatial resolution of 10m. A proposed generator is trained in an adversarial manner using the min-max game to super-resolve the low-resolution bands with the guidance of available high-resolution bands in an image. The performance evaluated using metrics such as Signal Reconstruction Error (SRE) and Root Mean Squared Error (RMSE) shows the effectiveness of the proposed approach as compared to the state-of-the-art method, DSen2 as the DSen2-WGAN reduced RMSE by 14.68% and 7%, while SRE improved by almost 4% and 1.6% for 6 and 2 super-resolution. Lastly, for further evaluation, we have used trained DSen2-WGAN model to super-resolve the bands of EuroSAT dataset, a satellite image classification dataset based on Sentinel-2 images. The per band classification accuracy of low-resolution bands shows significant improvement after super-resolution using our proposed approach.
Article
Publication:Remote Sensing Letters, 13, 20221202, 1194

Peer-reviewed
Deep Distributional Sequence Embeddings Based on a Wasserstein Loss
Authors:Ahmed Abdelwahab; Niels Landwehr
Summary:Deep metric learning employs deep neural networks to embed instances into a metric space such that distances between instances of the same class are small and distances between instances from different classes are large. In most existing deep metric learning techniques, the embedding of an instance is given by a feature vector produced by a deep neural network and Euclidean distance or cosine similarity defines distances between these vectors. This paper studies deep distributional embeddings of sequences, where the embedding of a sequence is given by the distribution of learned deep features across the sequence. The motivation for this is to better capture statistical information about the distribution of patterns within the sequence in the embedding. When embeddings are distributions rather than vectors, measuring distances between embeddings involves comparing their respective distributions. The paper therefore proposes a distance metric based on Wasserstein distances between the distributions and a corresponding loss function for metric learning, which leads to a novel end-to-end trainable embedding model. We empirically observe that distributional embeddings outperform standard vector embeddings and that training with the proposed Wasserstein metric outperforms training with other distance functions.
Downloadable Article, 2022
Publication:Neural Processing Letters, 20220318, 1
Publisher:2022
 Cited by 11 Related articles All 5 versions
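
As a small illustration of the basic quantity such distributional losses build on (a sketch, not the authors' model): for one-dimensional feature samples the 1-Wasserstein distance has a closed form based on sorted values, and SciPy exposes it directly. The toy Gaussian samples below are assumptions for the demo.

```python
# 1-D empirical 1-Wasserstein distance between two feature samples (toy data).
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
features_a = rng.normal(0.0, 1.0, size=500)   # deep features pooled from sequence A (hypothetical)
features_b = rng.normal(0.5, 1.2, size=500)   # deep features pooled from sequence B (hypothetical)

# For 1-D samples this equals the area between the two empirical quantile functions.
print(wasserstein_distance(features_a, features_b))
```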


Entropy-regularized 2-Wasserstein distance between Gaussian measures
Authors:Anton Mallasto; Augusto Gerolin; Hà Quang Minh
Summary:Gaussian distributions are plentiful in applications dealing in uncertainty quantification and diffusivity. They furthermore stand as important special cases for frameworks providing geometries for probability measures, as the resulting geometry on Gaussians is often expressible in closed-form under the frameworks. In this work, we study the Gaussian geometry under the entropy-regularized 2-Wasserstein distance, by providing closed-form solutions for the distance and interpolations between elements. Furthermore, we provide a fixed-point characterization of a population barycenter when restricted to the manifold of Gaussians, which allows computations through the fixed-point iteration algorithm. As a consequence, the results yield closed-form expressions for the 2-Sinkhorn divergence. As the geometries change by varying the regularization magnitude, we study the limiting cases of vanishing and infinite magnitudes, reconfirming well-known results on the limits of the Sinkhorn divergence. Finally, we illustrate the resulting geometries with a numerical study.
Article
Publication:Information Geometry, 5, 2022, 289
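
For reference alongside this entry, a minimal sketch of the unregularized closed form that the entropy-regularized expressions recover in the vanishing-regularization limit: the 2-Wasserstein (Bures) distance between two Gaussians. The example means and covariances are made up, and the regularized distance and 2-Sinkhorn divergence of the paper are not reproduced here.

```python
# Closed-form W2 between N(m1, S1) and N(m2, S2): ||m1 - m2||^2 + Bures^2(S1, S2).
import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2(m1, S1, m2, S2):
    r = sqrtm(S1)
    cross = np.real(sqrtm(r @ S2 @ r))                 # (S1^1/2 S2 S1^1/2)^1/2
    bures2 = np.trace(S1 + S2 - 2.0 * cross)
    return float(np.sqrt(np.sum((m1 - m2) ** 2) + max(bures2, 0.0)))

m1, S1 = np.zeros(2), np.eye(2)
m2, S2 = np.array([1.0, 0.0]), np.diag([2.0, 0.5])
print(gaussian_w2(m1, S1, m2, S2))
```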

<-—2022———2022———1910-—


Image Outpainting using Wasserstein Generative Adversarial Network with Gradient Penalty
Authors:Aashish Nair; Jay Deshmukh; Akash Sonare; Tarun Mishra; Richard Joseph; 2022 6th International Conference on Computing Methodologies and Communication (ICCMC)
Summary:With advancements in AI technology, machines can perform or even mimic tasks that humans can do. One of its achievements can be seen in image generation, one example being Image Inpainting (completion). In Image Inpainting, AI is used to complete missing data in an image. This is an extensive field of research, but its contemporary field, i.e. image outpainting, is not a well-researched one. In Image Outpainting (extrapolation) the image is extended beyond its borders. This is similar to our brain picturing the whole image of an object that is partially seen through a gap. This task can be achieved by using Generative Adversarial Networks (GANs). Compared to Inpainting, the biggest challenge is to achieve spatial correlation between the generated image and the ground truth image. The process of overcoming this challenge is also sometimes affected by the training instability of GANs. With the help of Wasserstein GAN (WGAN), the above issue can be solved. So, a model is proposed based on the Wasserstein GAN with Gradient Penalty (WGAN-GP) algorithm and deep convolutional neural networks for image outpainting using a dataset of natural images. From this proposed model it is found that the results of the WGAN-GP algorithm were better than the GAN algorithm in various aspects.
Chapter, 2022
Publication:2022 6th International Conference on Computing Methodologies and Communication (ICCMC), 20220329, 1248
Publisher:2022
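
Since several entries on this page rely on WGAN-GP, here is a minimal sketch of the gradient penalty term of Gulrajani et al. that such models add to the critic loss; `critic` is assumed to be any torch.nn.Module mapping image batches of shape (B, C, H, W) to scalars, and nothing below is taken from the outpainting paper itself.

```python
# Gradient penalty: push the critic's gradient norm toward 1 on random interpolates.
import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    eps = torch.rand(real.size(0), 1, 1, 1, device=real.device)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads = torch.autograd.grad(outputs=scores, inputs=interp,
                                grad_outputs=torch.ones_like(scores),
                                create_graph=True, retain_graph=True)[0]
    grad_norm = grads.flatten(start_dim=1).norm(2, dim=1)
    return lambda_gp * ((grad_norm - 1.0) ** 2).mean()

# usage sketch: critic_loss = fake_scores.mean() - real_scores.mean() + gradient_penalty(critic, real, fake)
```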



Computing Wasserstein-$p$ Distance Between Images with Linear Cost
Authors:Zhonghua Lu; Chen Li; Yidong Chen; 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
Summary:When the images are formulated as discrete measures, computing Wasserstein-p distance between them is challenging due to the complexity of solving the corresponding Kantorovich's problem. In this paper, we propose a novel algorithm to compute the Wasserstein-p distance between discrete measures by restricting the optimal transport (OT) problem on a subset. First, we define the restricted OT problem and prove the solution of the restricted problem converges to Kantorovich's OT solution. Second, we propose the SparseSinkhorn algorithm for the restricted problem and provide a multi-scale algorithm to estimate the subset. Finally, we implement the proposed algorithm on CUDA and illustrate the linear computational cost in terms of time and memory requirements. We compute Wasserstein-p distance, estimate the transport mapping, and transfer color between color images with size ranges from $64\times 64$ to $1920\times 1200$. (Our code is available at https://github.com/ucascnic/CudaOT)
Chapter, 2022
Publication:2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 202206, 509
Publisher:2022
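
Not the SparseSinkhorn method of the paper: a minimal dense Sinkhorn iteration for entropy-regularized optimal transport between two discrete measures, which is the standard baseline that restricted/sparse schemes accelerate. The 1-D grids, weights and regularization value below are arbitrary toy choices.

```python
# Dense Sinkhorn for entropic OT; returns an entropic approximation of W_p^p.
import numpy as np

def sinkhorn_cost(a, b, C, eps=0.05, iters=500):
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(iters):
        v = b / (K.T @ u)
        u = a / (K @ v)
    plan = u[:, None] * K * v[None, :]
    return float(np.sum(plan * C))

x, y = np.linspace(0, 1, 50), np.linspace(0, 1, 60)
a, b = np.full(50, 1 / 50), np.full(60, 1 / 60)
C = np.abs(x[:, None] - y[None, :]) ** 2          # squared ground cost (p = 2)
print(sinkhorn_cost(a, b, C) ** 0.5)              # approximate W_2
```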


The Continuous Formulation of Shallow Neural Networks as Wasserstein-Type Gradient Flows
Authors:Xavier Fernández-Real; Alessio Figalli
Summary:It has been recently observed that the training of a single hidden layer artificial neural network can be reinterpreted as a Wasserstein gradient flow for the weights for the error functional. In the limit, as the number of parameters tends to infinity, this gives rise to a family of parabolic equations. This survey aims to discuss this relation, focusing on the associated theoretical aspects appealing to the mathematical community and providing a list of interesting open problems.
Chapter, 2022
Publication:Analysis at Large: Dedicated to the Life and Work of Jean Bourgain, 20220520, 29
Publisher:2022

The Parisi formula is a Hamilton-Jacobi equation in Wasserstein space
Author:Jean-Christophe Mourrat
Summary:The Parisi formula is a self-contained description of the infinite-volume limit of the free energy of mean-field spin glass models. We show that this quantity can be recast as the solution of a Hamilton-Jacobi equation in the Wasserstein space of probability measures on the positive half-line.
Article
Publication:Canadian Journal of Mathematics, 74, 20220628, 607

Projected Wasserstein Gradient Descent for High-Dimensional Bayesian Inference
Authors:Yifei Wang; Peng Chen; Wuchen Li
Summary:Abstract. We propose a projected Wasserstein gradient descent method (pWGD) for high-dimensional Bayesian inference problems. The underlying density function of a particle system of Wasserstein gradient descent (WGD) is approximated by kernel density estimation (KDE), which faces the long-standing curse of dimensionality. We overcome this challenge by exploiting the intrinsic low-rank structure in the difference between the posterior and prior distributions. The parameters are projected into a low-dimensional subspace to alleviate the approximation error of KDE in high dimensions. We formulate a projected Wasserstein gradient flow and analyze its convergence property under mild assumptions. Several numerical experiments illustrate the accuracy, convergence, and complexity scalability of pWGD with respect to parameter dimension, sample size, and processor cores.
Downloadable Article
Publication:SIAM/ASA Journal on Uncertainty Quantification, 10, 20221231, 1513
Projected Wasserstein Gradient Descent for High-Dimensional Bayesian Inference
Authors:Yifei Wang; Peng Chen; Wuchen Li
Article, 2022
Publication:SIAM/ASA Journal on Uncertainty Quantification, 10, 20221231, 1513
Publisher:2022
Zbl 1506.62270
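
A sketch of the plain (unprojected) scheme that the projected variant builds on, under the assumption that the Wasserstein gradient flow of KL(rho || pi) is discretized with particles and rho is replaced by a Gaussian kernel density estimate; the score function `grad_log_pi`, the bandwidth and the toy Gaussian target are all assumptions, and the paper's low-dimensional projection step is not shown.

```python
# One step of KDE-based Wasserstein gradient descent: particles follow
# grad log pi - grad log rho, with rho estimated by a Gaussian KDE.
import numpy as np

def wgd_step(X, grad_log_pi, h=0.3, step=0.05):
    diff = X[:, None, :] - X[None, :, :]                     # pairwise differences
    K = np.exp(-np.sum(diff ** 2, axis=-1) / (2 * h ** 2))   # Gaussian kernel matrix
    grad_log_rho = -(K[:, :, None] * diff / h ** 2).sum(axis=1) / K.sum(axis=1, keepdims=True)
    return X + step * (grad_log_pi(X) - grad_log_rho)

rng = np.random.default_rng(1)
X = rng.normal(3.0, 0.5, size=(200, 2))                      # particles start far from the target
for _ in range(200):
    X = wgd_step(X, lambda x: -x)                            # target N(0, I): grad log pi(x) = -x
print(X.mean(axis=0))                                        # drifts toward the origin
```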


2022


The Wasserstein Distance Using QAOA: A Quantum Augmented Approach to Topological Data Analysis
Authors:M. Saravanan; Mannathu Gopikrishnan; 2022 International Conference on Innovative Trends in Information Technology (ICITIIT)
Summary:This paper examines the implementation of Topological Data Analysis methods based on Persistent Homology to meet the requirements of the telecommunication industry. Persistent Homology based methods are especially useful in detecting anomalies in time series data and show good prospects of being useful in network alarm systems. Of crucial importance to this method is a metric called the Wasserstein Distance, which measures how much two Persistence Diagrams differ from one another. This metric can be formulated as a minimum weight maximum matching problem on a bipartite graph. We here solve the combinatorial optimization problem of finding the Wasserstein Distance by applying the Quantum Approximate Optimization Algorithm (QAOA) using gate-based quantum computing methods. This technique can then be applied to detect anomalies in time series datasets involving network traffic/throughput data in telecommunication systems. The methodology stands to provide a significant technological advantage to service providers who adopt this, once practical gate-based quantum computers become ubiquitous.
Chapter, 2022
Publication:2022 International Conference on Innovative Trends in Information Technology (ICITIIT), 20220212, 1
Publisher:2022
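
The classical, non-quantum counterpart of the matching problem described in this entry (a sketch, not the paper's QAOA encoding): the 1-Wasserstein distance between two persistence diagrams as a min-cost assignment in which every point may instead be matched to its diagonal projection. The L-infinity ground metric and the toy diagrams are assumptions.

```python
# 1-Wasserstein distance between persistence diagrams via bipartite assignment.
import numpy as np
from scipy.optimize import linear_sum_assignment

def pd_wasserstein1(D1, D2):
    n, m, BIG = len(D1), len(D2), 1e9
    C = np.full((n + m, n + m), BIG)
    C[:n, :m] = np.max(np.abs(D1[:, None, :] - D2[None, :, :]), axis=-1)  # point-to-point
    C[np.arange(n), m + np.arange(n)] = (D1[:, 1] - D1[:, 0]) / 2.0       # D1 point -> diagonal
    C[n + np.arange(m), np.arange(m)] = (D2[:, 1] - D2[:, 0]) / 2.0       # diagonal -> D2 point
    C[n:, m:] = 0.0                                                       # diagonal <-> diagonal
    rows, cols = linear_sum_assignment(C)
    return float(C[rows, cols].sum())

D1 = np.array([[0.0, 1.0], [0.2, 0.5]])
D2 = np.array([[0.1, 1.1]])
print(pd_wasserstein1(D1, D2))    # 0.25 for these toy diagrams
```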

Renewable Energy Scenario Generation Method Based on Order-Preserving Wasserstein Distance
Authors:Hang Zhou; Zhihang Mao; Yi Gao; Shuai Luo; Yingyun Sun; 2022 IEEE/IAS Industrial and Commercial Power System Asia (I&CPS Asia)
Summary:With the increasing penetration rate of renewable energy generation, how to describe the uncertainty of renewable energy output becomes a key problem. Aiming at this problem, this paper proposes a day-ahead scenario generation method for renewable energy based on order-preserving Wasserstein distance. In this method, order-preserving Wasserstein distance is used as the loss function of the discriminator to design a network structure suitable for day-ahead scenario generation. Through the game training of a conditional generative adversarial network (CGAN), the mapping between noise distribution and day-ahead scenario set under the prediction condition can be learned by the generator. In this paper, actual wind power data (including forecasting and actual generation data) are used to test the proposed method, and the results show that the proposed model can more accurately describe the day-ahead wind power uncertainty.
Chapter, 2022
Publication:2022 IEEE/IAS Industrial and Commercial Power System Asia (I&CPS Asia), 20220708, 1754
Publisher:2022


Wasserstein Cross-Lingual Alignment For Named Entity Recognition
Authors:Rui Wang; Ricardo Henao; ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Summary:Supervised training of Named Entity Recognition (NER) models generally requires large amounts of annotations, which are hardly available for less widely used (low resource) languages, e.g., Armenian and Dutch. Therefore, it would be desirable to leverage knowledge extracted from a high resource language (source), e.g., English, so that NER models for the low resource languages (target) could be trained more efficiently with less cost associated with annotations. In this paper, we study cross-lingual alignment for NER, an approach for transferring knowledge from high- to low-resource languages, via the alignment of token embeddings between different languages. Specifically, we propose to align by minimizing the Wasserstein distance between the contextualized token embeddings from source and target languages. Experimental results show that our method yields improved performance over existing works for cross-lingual alignment in NER tasks.
Chapter, 2022
Publication:ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 20220523, 8342
Publisher:2022


Wasserstein Generative Adversarial Networks for Online Test Generation for Cyber Physical Systems
Authors:Ivan Porres; Frankie Spencer; Jarkko Peltomaki; 2022 IEEE/ACM 15th International Workshop on Search-Based Software Testing (SBST)
Summary:We propose a novel online test generation algorithm WOGAN based on Wasserstein Generative Adversarial Networks. WOGAN is a general-purpose black-box test generator applicable to any system under test having a fitness function for determining failing tests. As a proof of concept, we evaluate WOGAN by generating roads such that a lane assistance system of a car fails to stay on the designated lane. We find that our algorithm has competitive performance with respect to previously published algorithms.
Chapter, 2022
Publication:2022 IEEE/ACM 15th International Workshop on Search-Based Software Testing (SBST), 202205, 1
Publisher:2022

 Cited by 3 Related articles All 6 versions


Wasserstein Sensitivity of Risk and Uncertainty Propagation
Authors:Oliver G. Ernst; Alois Pichler; Björn Sprungk
Summary:Abstract. When propagating uncertainty in the data of differential equations, the probability laws describing the uncertainty are typically themselves subject to uncertainty. We present a sensitivity analysis of uncertainty propagation for differential equations with random inputs to perturbations of the input measures. We focus on the elliptic diffusion equation with random coefficient and source term, for which the probability measure of the solution random field is shown to be Lipschitz-continuous in both total variation and Wasserstein distance. The result generalizes to the solution map of any differential equation with locally Hölder dependence on input parameters. In addition, these results extend to Lipschitz-continuous quantities of interest of the solution as well as to coherent risk functionals of these applied to evaluate the impact of their uncertainty. Our analysis is based on the sensitivity of risk functionals and pushforward measures for locally Hölder mappings with respect to the Wasserstein distance of perturbed input distributions. The established results are applied, in particular, to the case of lognormal diffusion and the truncation of series representations of input random fields.
Downloadable Article
Publication:SIAM/ASA Journal on Uncertainty Quantification, 10, 20220816, 915


Wassertrain: An Adversarial Training Framework Against Wasserstein Adversarial Attacks
Authors:Qingye Zhao; Xin Chen; Zhuoyu Zhao; Enyi Tang; Xuandong Li; ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Summary:This paper presents an adversarial training framework WasserTrain for improving model robustness against the adversarial attacks in terms of the Wasserstein distance. First, an effective attack method WasserAttack is introduced with a novel encoding of the optimization problem, which directly finds the worst point within the Wasserstein ball while keeping the relaxation error of the Wasserstein transformation as small as possible. The proposed adversarial training framework utilizes these high-quality adversarial examples to train robust models. Experiments on MNIST show that the adversarial loss arising from adversarial examples found by our method is about three times as much as that found by the PGD-based attack method. Furthermore, within the Wasserstein ball with a radius of 0.5, the WasserTrain model achieves 31% adversarial robustness against WasserAttack, which is 22% higher than that on the PGD-based training model.
Chapter, 2022
Publication:ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 20220523, 2734
Publisher:2022


<-—2022———2022———1920-—



Unbalanced optimal total variation transport problems and generalized Wasserstein barycenters

Authors:Nhan-Phu Chung; Thanh-Son Trinh
Summary:In this paper, we establish a Kantorovich duality for unbalanced optimal total variation transport problems. As consequences, we recover a version of duality formula for partial optimal transports established by Caffarelli and McCann; and we also get another proof of Kantorovich-Rubinstein theorem for generalized Wasserstein distance $\widetilde {W}_1^{a,b}$ proved before by Piccoli and Rossi. Then we apply our duality formula to study generalized Wasserstein barycenters. We show the existence of these barycenters for measures with compact supports. Finally, we prove the consistency of our barycenters.
Article
Publication:Proceedings of the Royal Society of Edinburgh: Section A Mathematics, 152, 202206, 674


A typical scenario generation method for active distribution network based on Wasserstein distance
Authors:Zhijie Liu; Shouzhen Zhu; Gongxiang Lv; Peng Zhang; 2022 Power System and Green Energy Conference (PSGEC)
Summary:With the rapid development of renewable energy such as wind power and solar power and the rapid popularization of electric vehicles, the operation and planning of active distribution networks need to consider the consequent uncertainties. To solve this problem, a typical scenario generation algorithm based on the Wasserstein probability distance is proposed. The algorithm first transforms the continuous probability density functions of wind power / solar power / electric vehicle output at a single time into discrete quantiles containing precise probability information through the Wasserstein probability distance index. Then, considering the amount of calculation and its probability loss, the scheduling interval is divided into several sub-intervals. The k-means clustering algorithm is used to reduce the number of typical scenarios in each interval, and the Cartesian product connection is used between the intervals. By analyzing cluster validity, two indices of intra-class compactness and inter-class separation are proposed, and the optimal number of clusters is determined according to the effectiveness index. Through iterative scenario reduction and fragment mergence operations, a typical scenario is finally formed. Finally, an IEEE 33 bus distribution network is taken as an example to verify the effectiveness of the proposed algorithm. The results show that the proposed typical scenario set has better effectiveness and reduction degree.
Chapter, 2022
Publication:2022 Power System and Green Energy Conference (PSGEC), 202208, 1210
Publisher:2022
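
A generic quantile-based reduction in the same spirit as the discretization step described above (not necessarily the exact index used in the paper): an absolutely continuous forecast distribution is replaced by n equally weighted atoms at the quantile levels (2i-1)/(2n), which minimizes the 1-Wasserstein distance among equal-weight n-point approximations. The Beta-distributed forecast below is a toy assumption.

```python
# W1-optimal equal-weight discretization of a continuous distribution via its quantiles.
import numpy as np
from scipy import stats

def w1_discretize(ppf, n):
    levels = (2 * np.arange(1, n + 1) - 1) / (2 * n)
    return ppf(levels), np.full(n, 1.0 / n)        # support points and probabilities

atoms, probs = w1_discretize(stats.beta(2, 5).ppf, n=10)   # e.g. normalized wind-power output
print(np.round(atoms, 3))
```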

Dirac Mixture Reduction Using Wasserstein Distances on Projected Cumulative Distributions
Authors:Dominik Prossel; Uwe D. Hanebeck; 2022 25th International Conference on Information Fusion (FUSION)
Summary:The reapproximation of discrete probability densities is a common task in sample-based filters such as the particle filter. It can be viewed as the approximation of a given Dirac mixture density with another one, typically with fewer samples. In this paper, the Wasserstein distance is established as a suitable measure to compare two Dirac mixtures. The resulting minimization problem is also known as the location-allocation or facility location problem and cannot be solved in polynomial time. Therefore, the well-known sliced Wasserstein distance is introduced as a replacement and its ties to the projected cumulative distribution (PCD) are shown. An iterative algorithm is proposed to minimize the sliced Wasserstein distance between the given distribution and approximation.
Chapter, 2022
Publication:2022 25th International Conference on Information Fusion (FUSION), 20220704, 1
Publisher:2022
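
A minimal sketch of the sliced Wasserstein distance used above, for two equally weighted Dirac mixtures with the same number of particles: project onto random directions and average the closed-form one-dimensional distances obtained by sorting the projections. The sample sizes and number of projections are arbitrary choices.

```python
# Monte-Carlo sliced 2-Wasserstein distance between two particle sets of equal size.
import numpy as np

def sliced_w2(X, Y, n_proj=200, seed=0):
    rng = np.random.default_rng(seed)
    theta = rng.normal(size=(n_proj, X.shape[1]))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)    # random unit directions
    px = np.sort(X @ theta.T, axis=0)                        # sorted 1-D projections
    py = np.sort(Y @ theta.T, axis=0)
    return float(np.sqrt(np.mean((px - py) ** 2)))

X = np.random.default_rng(1).normal(size=(500, 3))
Y = np.random.default_rng(2).normal(loc=1.0, size=(500, 3))
print(sliced_w2(X, Y))
```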

Wasserstein Metric Attack on Person Re-identification

Authors:Rajiv Ratn Shah; A. V. Subramanyam; Astha Verma; 2022 IEEE 5th International Conference on Multimedia Information Processing and Retrieval (MIPR)
Summary:Adversarial attacks in the $l_p$ ball have been recently investigated against person re-identification (ReID) models. However, the $l_p$ ball attacks disregard the geometry of the samples. To this end, the Wasserstein metric is a robust alternative as the attack incorporates a cost matrix for pixel mass movement. In our work, we propose the Wasserstein metric to perform adversarial attacks on ReID systems by projecting adversarial samples in the Wasserstein ball. We perform white-box and black-box attacks on state-of-the-art (SOTA) ReID models trained on Market-1501, DukeMTMC-reID, and MSMT17 datasets. The performance of the best SOTA ReID models decreases drastically from 90.2% to as low as 0.4%. Our model outperforms the SOTA attack methods by 17.2% in white-box attacks and 14.4% in black-box attacks. To the best of our knowledge, our work is the first to propose the Wasserstein metric towards generating adversarial samples for the ReID task.
Chapter, 2022
Publication:2022 IEEE 5th International Conference on Multimedia Information Processing and Retrieval (MIPR), 202208, 234
Publisher:2022
Related articles All 2 versions
  

Wasserstein-Based Graph Alignment
Authors:Hermina Petric Maretic; Mireille El Gheche; Matthias Minder; Giovanni Chierchia; Pascal Frossard
Summary:A novel method for comparing non-aligned graphs of various sizes is proposed, based on the Wasserstein distance between graph signal distributions induced by the respective graph Laplacian matrices. Specifically, a new formulation for the one-to-many graph alignment problem is cast, which aims at matching a node in the smaller graph with one or more nodes in the larger graph. By incorporating optimal transport into our graph comparison framework, a structurally-meaningful graph distance, and a signal transportation plan that models the structure of graph data are generated. The resulting alignment problem is solved with stochastic gradient descent, where a novel Dykstra operator is used to ensure that the solution is a one-to-many (soft) assignment matrix. The performance of our novel framework is demonstrated on graph alignment, graph classification and graph signal transportation. Our method is shown to lead to significant improvements with respect to the state-of-the-art algorithms on each of these tasks.
Article, 2022
Publication:IEEE Transactions on Signal and Information Processing over Networks, 8, 2022, 353
Publisher:2022
Cited by 15
 Related articles All 6 versions


2022


2022 see 2021  Peer-reviewed
A Wasserstein generative adversarial network-based approach for real-time track irregularity estimation using vehicle dynamic responses
Authors:Zhandong Yuan; Jun Luo; Shengyang Zhu; Wanming Zhai
Summary:Accurate and timely estimation of track irregularities is the foundation for predictive maintenance and high-fidelity dynamics simulation of the railway system. Therefore, it’s of great interest to devise a real-time track irregularity estimation method based on dynamic responses of the in-service train. In this paper, a Wasserstein generative adversarial network (WGAN)-based framework is developed to estimate the track irregularities using the vehicle’s axle box acceleration (ABA) signal. The proposed WGAN is composed of a generator architected by an encoder-decoder structure and a spectral normalised (SN) critic network. The generator is supposed to capture the correlation between ABA signal and track irregularities, and then estimate the irregularities with the measured ABA signal as input; while the critic is supposed to instruct the generator’s training by optimising the calculated Wasserstein distance. We combine supervised learning and adversarial learning in the network training process, where the estimation loss and adversarial loss are jointly optimised. Optimising the estimation loss is anticipated to estimate the long-wave track irregularities while optimising the adversarial loss accounts for the short-wave track irregularities. Two numerical cases, namely vertical and spatial vehicle-track coupled dynamics simulation, are implemented to validate the accuracy and reliability of the proposed method.
Article, 2022
Publication:Vehicle System Dynamics, 60, 20221202, 4186
Publisher:2022


Contrastive Prototypical Network with Wasserstein Confidence Penalty

Authors:Haoqing Wang; Zhi-Hong Deng; European Conference on Computer Vision
Summary:Unsupervised few-shot learning aims to learn the inductive bias from unlabeled dataset for solving the novel few-shot tasks. The existing unsupervised few-shot learning models and the contrastive learning models follow a unified paradigm. Therefore, we conduct empirical study under this paradigm and find that pairwise contrast, meta losses and large batch size are the important design factors. This results in our CPN (Contrastive Prototypical Network) model, which combines the prototypical loss with pairwise contrast and outperforms the existing models from this paradigm with modestly large batch size. Furthermore, the one-hot prediction target in CPN could lead to learning the sample-specific information. To this end, we propose Wasserstein Confidence Penalty which can impose appropriate penalty on overconfident predictions based on the semantic relationships among pseudo classes. Our full model, CPNWCP (Contrastive Prototypical Network with Wasserstein Confidence Penalty), achieves state-of-the-art performance on miniImageNet and tieredImageNet under unsupervised setting. Our code is available at https://github.com/Haoqing-Wang/CPNWCP
Chapter, 2022
Publication:Computer Vision – ECCV 2022: 17th European Conference, Tel Aviv, Israel, October 23–27, 2022, Proceedings, Part XIX, 20221109, 665
Publisher:2022


Improving weight clipping in Wasserstein GANs

Authors:Estelle Massart; 2022 26th International Conference on Pattern Recognition (ICPR)
Summary:Weight clipping is a well-known strategy to keep the Lipschitz constant of the critic under control, in Wasserstein GAN training. After each training iteration, all parameters of the critic are clipped to a given box, impacting the progress made by the optimizer. In this work, we propose a new strategy for weight clipping in Wasserstein GANs. Instead of directly clipping the parameters, we first obtain an equivalent model that is closer to the clipping box, and only then clip the parameters. Our motivation is to decrease the impact of the clipping strategy on the objective, at each iteration. This equivalent model is obtained by following invariant curves in the critic loss landscape, whose existence is a consequence of the positive homogeneity of common activations: rescaling the input and output signals to each activation by inverse factors preserves the loss. We provide preliminary experiments showing that the proposed strategy speeds up training on Wasserstein GANs with simple feed-forward architectures.
Chapter, 2022
Publication:2022 26th International Conference on Pattern Recognition (ICPR), 202208, 2286
Publisher:2022
Related articles All 3 versions
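
For context, a sketch of the baseline weight-clipping step that this paper modifies: after each critic update, every parameter is clipped to [-c, c] as a crude Lipschitz control. The loss-invariant rescaling the authors apply before clipping is not reproduced here.

```python
# Vanilla WGAN weight clipping, applied to the critic after each optimizer step.
import torch

def clip_critic_weights(critic: torch.nn.Module, c: float = 0.01) -> None:
    with torch.no_grad():
        for p in critic.parameters():
            p.clamp_(-c, c)          # keep every weight inside the clipping box [-c, c]
```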


Peer-reviewed
Wasserstein Distributionally Robust Optimization and Variation Regularization

Authors:Rui Gao; Xi Chen; Anton J. Kleywegt
Summary:This paper builds a bridge between two areas, optimization and machine learning, by establishing a general connection between Wasserstein distributional robustness and variation regularization. It helps to demystify the empirical success of Wasserstein distributionally robust optimization and devise new regularization schemes for machine learning.
Downloadable Article, 2022
Publication:Operations Research, 20221101
Publisher:2022

Peer-reviewed
Computed tomography image generation from magnetic resonance imaging using Wasserstein metric for MR-only radiation therapy
Authors:Jiffy Joseph; Challa Hemanth; Pournami Pulinthanathu Narayanan; Jayaraj Pottekkattuvalappil Balakrishnan; Niyas Puzhakkal
Article, 2022
Publication:International Journal of Imaging Systems and Technology, 32, November 2022, 2080
Publisher:2022

<-—2022———2022———1930-—


 


Computing Wasserstein-p Distance Between Images with Linear Cost
Authors:Yidong Chen; Chen Li; Zhonghua Lu; 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
Summary:When the images are formulated as discrete measures, computing Wasserstein-p distance between them is challenging due to the complexity of solving the corresponding Kantorovich's problem. In this paper, we propose a novel algorithm to compute the Wasserstein-p distance between discrete measures by restricting the optimal transport (OT) problem on a subset. First, we define the restricted OT problem and prove the solution of the restricted problem converges to Kantorovich's OT solution. Second, we propose the SparseSinkhorn algorithm for the restricted problem and provide a multi-scale algorithm to estimate the subset. Finally, we implement the proposed algorithm on CUDA and illustrate the linear computational cost in terms of time and memory requirements. We compute Wasserstein-p distance, estimate the transport mapping, and transfer color between color images with size ranges from 64×64 to 1920×1200. (Our code is available at https://github.com/ucascnic/CudaOT)
Chapter, 2022
Publication:2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 202206, 509
Publisher:2022

Cited by 3 Related articles All 3 versions


Optimal HVAC Scheduling under Temperature Uncertainty using the Wasserstein Metric
Authors:Guanyu Tian; Qun Zhou Sun; 2022 IEEE Power & Energy Society General Meeting (PESGM)
Summary:The heating, ventilation and air conditioning (HVAC) system consumes the most energy in commercial buildings, accounting for over 60% of total energy usage in the U.S. Flexible HVAC system setpoint scheduling could potentially save building energy costs. This paper proposes a distributionally robust optimal (DRO) HVAC scheduling method that minimizes the daily operation cost with constraints of indoor air temperature comfort and mechanical operating requirements. Considering the uncertainties from ambient temperature, a Wasserstein metric-based ambiguity set is adopted to enhance the robustness against probabilistic prediction errors. The schedule is optimized under the worst-case distribution within the ambiguity set. The proposed DRO method is initially formulated as a two-stage problem and then reformulated into a tractable mixed-integer linear programming (MILP) form. The paper evaluates the feasibility and optimality of the optimized schedules for a real commercial building. The numerical results indicate that the costs of the proposed DRO method are up to 6.6% lower compared with conventional techniques of optimization under uncertainties. They also provide granular risk-benefit options for decision-making in demand response programs.
Chapter, 2022
Publication:2022 IEEE Power & Energy Society General Meeting (PESGM), 20220717, 1
Publisher:2022


Generalized Zero-Shot Learning Using Conditional Wasserstein Autoencoder
Authors:Junhan Kim; Byonghyo Shim; ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Summary:Generalized zero-shot learning (GZSL) is a technique to train a deep learning model to identify unseen classes. Conventionally, conditional generative models have been employed to generate training data for unseen classes from the attribute. In this paper, we propose a new conditional generative model that improves the GZSL performance greatly. In a nutshell, the proposed model, called conditional Wasserstein autoencoder (CWAE), minimizes the Wasserstein distance between the real and generated image feature distributions using an encoder-decoder architecture. From the extensive experiments on various benchmark datasets, we show that the proposed CWAE outperforms conventional generative models in terms of the GZSL classification performance.
Chapter, 2022
Publication:ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 20220523, 3413
Publisher:2022

Cited by 1 Related articles
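
A generic WAE-style latent regularizer sometimes used to realize this kind of objective (a sketch under that assumption, not the authors' exact CWAE loss): penalize the discrepancy between encoded latent codes and samples from a Gaussian prior, here with a biased RBF-kernel MMD estimate as a tractable stand-in for the Wasserstein term; `encoder`, `lam` and the reconstruction term are assumed placeholders.

```python
# RBF-kernel MMD between encoded latents and Gaussian prior samples (biased estimate).
import torch

def rbf_mmd(z_q, z_p, sigma=1.0):
    def k(a, b):
        return torch.exp(-torch.cdist(a, b) ** 2 / (2 * sigma ** 2))
    return k(z_q, z_q).mean() + k(z_p, z_p).mean() - 2 * k(z_q, z_p).mean()

# usage sketch: z_q = encoder(features, attributes); z_p = torch.randn_like(z_q)
# loss = reconstruction_loss + lam * rbf_mmd(z_q, z_p)
```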

Cross-session Specific Emitter Identification using Adversarial Domain Adaptation with Wasserstein distance

Authors:Yalan Ye; Chunji Wang; Hai Dong; Li Lu; Qiang Zhao; 2022 26th International Conference on Pattern Recognition (ICPR)
Summary:Accurate and robust specific emitter identification (SEI) is very challenging since distribution shift of signals occurs in the cross-session scenario. General domain adaptation (DA) is proposed to alleviate the shift by aligning different signal distributions. However, existing general-DA based SEI methods which focus on the shift in the same session cannot be directly applied to cross-session SEI, since the distribution of signals varies more drastically in different sessions due to the continuously changing hardware imperfections. In this paper, we propose a novel method named adversarial domain adaptation with Wasserstein distance (ADAW) to tackle cross-session SEI. Specifically, to alleviate the more severe distribution shift of signals in different sessions, a generative model is applied to map the data of the previous session to the latter session regardless of the degree of radio frequency fingerprint (RFF) variations. Then, a Wasserstein distance guided adversarial unsupervised domain adaptation (UDA) strategy is introduced to learn common feature representations for signals of different sessions, such that the model trained on the signals of the previous session can precisely identify the signals of the latter session. Experiments on ADS-B signals of the same emitters in three distinct time sessions validate the capability of ADAW for SEI under cross-session and noisy conditions.
Chapter, 2022
Publication:2022 26th International Conference on Pattern Recognition (ICPR), 20220821, 3119
Publisher:2022

Approximating 1-Wasserstein Distance between Persistence Diagrams by Graph Sparsification
Authors:Tamal K. Dey (Author), Simon Zhang (Author)
Summary:Abstract Persistence diagrams (PDs) play a central role in topological data analysis. This analysis requires computing distances among such diagrams such as the 1-Wasserstein distance. Accurate computation of these PD distances for large data sets that render large diagrams may not scale appropriately with the existing methods. The main source of difficulty ensues from the size of the bipartite graph on which a matching needs to be computed for determining these PD distances. We address this problem by making several algorithmic and computational observations in order to obtain an approximation. First, taking advantage of the proximity of PD points, we condense them thereby decreasing the number of nodes in the graph for computation. The increase in point multiplicities is addressed by reducing the matching problem to a min-cost flow problem on a transshipment network. Second, we use Well Separated Pair Decomposition to sparsify the graph to a size that is linear in the number of points. Both node and arc sparsifications contribute to the approximation factor where we leverage a lower bound given by the Relaxed Word Mover's distance. Third, we eliminate bottlenecks during the sparsification procedure by introducing parallelism. Fourth, we develop an open source software called PDoptFlow based on our algorithm, exploiting parallelism by GPU and multicore. We perform extensive experiments and show that the actual empirical error is very low. We also show that we can achieve high performance at low guaranteed relative errors, improving upon the state of the art.
Chapter
Publication:2022 Proceedings of the Symposium on Algorithm Engineering and Experiments (ALENEX), 2022, 169

2022


Peer-reviewed
Application of an unbalanced optimal transport distance and a mixed L1/Wasserstein distance to full waveform inversion
Authors:Da Li; Michael P Lamoureux; Wenyuan Liao
Summary:Full waveform inversion (FWI) is an important and popular technique in subsurface Earth property estimation. In this paper, several improvements to the FWI methodology are developed and demonstrated with numerical examples, including a simple two-layer seismic velocity model, a cross borehole Camembert model and a surface seismic Marmousi model. We introduce an unbalanced optimal transport (UOT) distance with Kullback–Leibler divergence to replace the L2 distance in the FWI problem. Also, a mixed L1/Wasserstein distance is constructed that preserves the convex properties with respect to shift, dilation, and amplitude change operation. An entropy regularization approach and convolutional scaling algorithms are used to compute the distance and the gradient efficiently. Two strategies of normalization methods that transform the seismic signals into non-negative functions are discussed. The numerical examples are then presented at the end of the paper.
Article, 2022
Publication:Geophysical Journal International, 230, 20220328, 1338
Publisher:2022
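
A sketch of the basic trace-wise optimal-transport misfit that OT-based FWI objectives of this kind build on, under one simple normalization strategy (shift to non-negative values and rescale to unit mass); it is not the paper's mixed L1/Wasserstein or unbalanced objective, and the synthetic traces are assumptions.

```python
# 1-D W2 misfit between two normalized seismic traces via their quantile functions.
import numpy as np

def trace_w2(f, g, t):
    f = f - f.min() + 1e-8                       # one possible non-negativity strategy
    g = g - g.min() + 1e-8
    f, g = f / np.trapz(f, t), g / np.trapz(g, t)
    Fc, Gc = np.cumsum(f) * (t[1] - t[0]), np.cumsum(g) * (t[1] - t[0])
    u = np.linspace(1e-3, 1 - 1e-3, 1000)
    qf, qg = np.interp(u, Fc, t), np.interp(u, Gc, t)   # inverse CDFs
    return float(np.sqrt(np.trapz((qf - qg) ** 2, u)))

t = np.linspace(0.0, 1.0, 400)
print(trace_w2(np.sin(8 * np.pi * t), np.sin(8 * np.pi * (t - 0.02)), t))
```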



Part-Based Convolutional Neural Network and Dual Interactive Wasserstein Generative Adversarial Networks for Land Mark Detection and Localization of Autonomous Robots in Outdoor Environment
Authors:S. Sindhu; M. Saravanan; 2022 1st International Conference on Computational Science and Technology (ICCST)

Summary:Robot localization is a fundamental competency required by an autonomous robot because the robot's location knowledge is an essential precursor to making decisions about future actions. Accurate localization of robots or autonomous vehicles is the most important requirement for autonomous applications. In this manuscript, a part-based convolutional neural network and dual interactive Wasserstein generative adversarial networks for landmark detection and localization of autonomous robots in an outdoor environment are proposed (P-CNN-DIWGAN-LMD-LZ). This research contains two phases: a landmark detection phase and a localization phase. In the landmark detection phase, the part-based convolutional neural network (P-CNN) is proposed to detect landmarks in captured images. This landmark detection process creates three categories of responses for every detected landmark instance: bounding box, label, and score. The bounding box gives the position and size of the detected landmark in the input imagery. The label indicates the detected landmark's class name. The score is an objectness score that scales bounding-box membership for the landmark or background classes. In the localization phase, a dual interactive Wasserstein generative adversarial network (DIWGAN) is proposed to determine robot location coordinates. Finally, the proposed method attains high robot localization recall at high accuracy in the real-world environment. Here, the outdoor robot localization dataset is taken from the KITTI dataset. The proposed method is implemented in Python; its performance is estimated under certain performance metrics, like mean absolute error (MAE), cosine proximity (CP), and accuracy. The performance of the proposed method shows higher accuracy compared with existing approaches, like DQP-ODA-LMD-LZ, TDL-LMD-LZ, and 3D-RISS-MLEVD-LMD-LZ.

Chapter, 2022
Publication:2022 1st International Conference on Computational Science and Technology (ICCST), 20221109, 1062
Publisher:2022


Data-Driven PMU Noise Emulation Framework using Gradient-Penalty-Based Wasserstein GAN
Authors:Austin R Lassetter; Kaveri Mahapatra; David J. Sebastian-Cardenas; Sri Nikhil Gupta Gourisetti; James G. O'Brien; James P. Ogle; 2022 IEEE Power & Energy Society General Meeting (PESGM)
Summary:Availability of phasor measurement unit (PMU) data has led to research on data-driven algorithms for event monitoring, control and ensuring stability of the grid. Unavailability of infrequent critical event field PMU data with component failures is driving the need to generate realistic synthetic PMU data for research. The synthetic data from power system simulation software often neglect noise profiles of received phasors, thus creating some discrepancies between real PMU data and synthetic ones. To address this issue, this work presents an initial study on the noise characteristics of PMUs, as well as presenting models for recreating their unique noise signatures. The proposed method, utilizing the Wasserstein generative adversarial network with gradient penalty (WGAN-GP) architecture, provides an excellent benchmark for matching the noise distribution. One can use a well-learned GAN model to draw noise signatures from a distribution that seemingly mirrors the real PMU noise distribution, while also being able to be detached from the PMU data once the training is done. Based on the observed results and employed data-driven methodology, it is expected that the proposed methods can be adapted to replicate the behavior of other sensors, providing research and other applications with a tool for data synthesis and sensor characterization.
Chapter, 2022
Publication:2022 IEEE Power & Energy Society General Meeting (PESGM), 20220717, 1
Publisher:2022


Peer-reviewed
The isometry group of Wasserstein spaces: the Hilbertian case
Authors:György Pál Gehér; Tamás Titkos; Dániel Virosztek
Article, 2022
Publication:Journal of the London Mathematical Society, 106, December 2022, 3865
Publisher:2022


Article, 2022
Publication:Journal of Theoretical Probability, 20221219
Publisher:2022
Peer-reviewed
Distributionally Robust Inverse Covariance Estimation: The Wasserstein Shrinkage Estimator.(Methods)
Authors:Viet Anh Nguyen; Daniel Kuhn; Peyman Mohajerin Esfahani
Article, 2022
Publication:Operations Research, 70, Jan-Feb 2022, 490
Publisher:2022

<-—2022———2022———1940-—

 
 

Peer-reviewed
Physics-driven learning of Wasserstein GAN for density reconstruction in dynamic tomography
Authors:Zhishen Huang; Marc Klasky; Trevor Wilcox; Saiprasad Ravishankar
Article, 2022
Publication:Applied optics, 61, 2022, 2805
Publisher:2022

Peer-reviewed
Approximate Wasserstein attraction flows for dynamic mass transport over networks
Authors:Ferran Arqué; César A. Uribe; Carlos Ocampo-Martinez
Article, 1963-
Publication:Automatica : the journal of IFAC, the International Federation of Automatic Control., 143, 2022
Publisher:Elsevier, Amsterdam, 1963-

University of California Davis Researcher Furthers Understanding of Nonlinear Science (Exploring predictive states via Cantor embeddings and Wasserstein distance)
Article, 2022
Publication:Science Letter, December 23 2022, 892
Publisher:2022

Peer-reviewed
Decision Making Under Model Uncertainty: Frechet-Wasserstein Mean Preferences
Authors:Electra V. Petracou; Anastasios Xepapadeas; Athanasios N. Yannacopoulos
Article, 2022
Publication:Management Science, 68, February 2022, 1195
Publisher:2022


2022


Peer-reviewed
Convergence rates for empirical measures of Markov chains in dual and Wasserstein distances

Author:Adrian Riekert
Article, 2022
Publication:Statistics & probability letters, 189, 2022
Publisher:2022


Distributionally Safe Path Planning: Wasserstein Safe RRT
Authors:Paul Daniel Lathrop; Beth Leigh Boardman; Sonia Martinez; Los Alamos National Lab (LANL), Los Alamos, NM (United States)
Article, 2022
Publication:IEEE Robotics and Automation Letters, 7, 20220101
Publisher:2022

Functional anomaly detection and robust estimation

Authors:Guillaume Staerman; Florence d'Alché-Buc; Pavlo Mozharovskyi; Nicolas Vayatis; Zhi-Hua Zhou; Zoltán Szabó; Rémi Flamary; Sara Lopez-Pintado; Institut polytechnique de Paris; École doctorale de l'Institut polytechnique de Paris
Summary:Enthusiasm for machine learning extends to almost every field, such as energy, medicine or finance. The omnipresence of sensors makes more and more data available at an ever finer granularity. An abundance of new applications, such as the monitoring of complex infrastructures like aircraft or energy networks, together with the availability of massive, potentially corrupted data samples, has put pressure on the scientific community to develop new reliable machine learning methods and algorithms. The work presented in this thesis belongs to this line of research and focuses on two axes: unsupervised detection of functional anomalies and robust learning, from both a practical and a theoretical point of view. The first part of this thesis is devoted to the development of efficient anomaly detection algorithms in the functional setting. More precisely, we introduce Functional Isolation Forest (FIF), an algorithm based on flexibly partitioning the functional space at random in order to progressively isolate functions from one another. We also propose a new notion of functional depth based on the area of the convex hull of sampled curves, naturally capturing gradual departures from centrality. Estimation and computational issues are addressed, and various numerical experiments provide empirical evidence of the relevance of the proposed approaches. Finally, in order to provide practical recommendations, the performance of recent functional anomaly detection techniques is evaluated on two real data sets related to the monitoring of helicopters in flight and to the spectrometry of construction materials. The second part is devoted to the design and analysis of several, potentially robust, statistical approaches combining data depth and robust mean estimators. The Wasserstein distance is a popular metric arising from a transport cost between two probability distributions and measuring their similarity. Although it has shown promising results in many machine learning applications, it suffers from high sensitivity to outliers. We therefore study how to leverage median-of-means (MoM) estimators to strengthen the estimation of the Wasserstein distance with theoretical guarantees. We then introduce a new statistical depth function called the Affine-Invariant Integrated Rank-Weighted (AI-IRW) depth. Beyond the theoretical analysis carried out, numerical results are presented that confirm the relevance of this depth. The upper-level sets of statistical depths give rise to a possible extension of quantile functions to multivariate spaces. We propose a new similarity measure between two probability distributions. It relies on the average Hausdorff distance between the depth-induced quantile regions of each distribution. We show that it inherits the appealing properties of data depths, such as robustness and interpretability. All the algorithms developed in this thesis are available online.
Computer Program, 2022
English
Publisher:2022


One Loss for Quantization: Deep Hashing with Discrete Wasserstein Distributional Matching
Authors:Ping Li; Peng Yang; Khoa D. Doan; 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
Summary:Image hashing is a principled approximate nearest neighbor approach to find similar items to a query in a large collection of images. Hashing aims to learn a binary-output function that maps an image to a binary vector. For optimal retrieval performance, producing balanced hash codes with low-quantization error to bridge the gap between the learning stage's continuous relaxation and the inference stage's discrete quantization is important. However, in the existing deep supervised hashing methods, coding balance and low-quantization error are difficult to achieve and involve several losses. We argue that this is because the existing quantization approaches in these methods are heuristically constructed and not effective to achieve these objectives. This paper considers an alternative approach to learning the quantization constraints. The task of learning balanced codes with low quantization error is re-formulated as matching the learned distribution of the continuous codes to a pre-defined discrete, uniform distribution. This is equivalent to minimizing the distance between two distributions. We then propose a computationally efficient distributional distance by leveraging the discrete property of the hash functions. This distributional distance is a valid distance and enjoys lower time and sample complexities. The proposed single-loss quantization objective can be integrated into any existing supervised hashing method to improve code balance and quantization error. Experiments confirm that the proposed approach substantially improves the performance of several representative hashing methods.
Chapter, 2022
Publication:2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 202206, 9437
Publisher:2022
 Cited by 8 Related articles All 6 versions


Rate of convergence for particle approximation of PDEs in Wasserstein space
Authors:Maximilien Germain; Huyên Pham; Xavier Warin
Summary:We prove a rate of convergence for the N-particle approximation of a second-order partial differential equation in the space of probability measures, such as the master equation or Bellman equation of the mean-field control problem under common noise. The rate is of order $1/N$ for the pathwise error on the solution v and of order $1/\sqrt{N}$ for the $L^2$-error on its L-derivative $\partial_\mu v$. The proof relies on backward stochastic differential equation techniques.
Article
Publication:Journal of Applied Probability, 59, 20221228, 992

<-—2022———2022———1950—



MRWM: A Multiple Residual Wasserstein Driven Model for Image Denoising
Authors:Rui-Qiang He; Wang-Sen Lan; Fang Liu
Article, 2022
Publication:IEEE Access, 10, 2022, 127397
Publisher:2022


Safe Reinforcement Learning Using Wasserstein Distributionally Robust MPC and Chance Constraint
Authors: Arash Bahari Kordabad, Rafael Wisniewski, Sebastien Gros
Article, 2022
Publication:IEEE Access, 10, 2022, 130058
Publisher:2022
Related articles


Causal Discovery on Discrete Data via Weighted Normalized Wasserstein Distance
Authors: Yi Wei, Xiaofei Li, Lihui Lin, Dengming Zhu, Qingyong Li
Article, 2022
Publication:IEEE Transactions on Neural Networks and Learning Systems, 2022, 1
Publisher:2022


The Parisi formula is a Hamilton–Jacobi equation in Wasserstein space

Author:Jean-Christophe Mourrat
Article, 2022
Publication:Canadian journal of mathematics, 74, 2022, 607
Publisher:2022

Bures-Wasserstein geometry for positive-definite Hermitian matrices and their trace-one subset
Author:Jesse van Oostrum
Downloadable Article, 2022
English
Publication:In: Information Geometry 5 (2): 405-425 (2022)
Publisher:Universitätsbibliothek der Technischen Universität Hamburg, Hamburg, 2022

2022


  Peer-reviewed

The Impact of Edge Displacement Vaserstein Distance on UD Parsing Performance

Authors: Mark Anderson, Carlos Gómez-Rodríguez

Summary:We contribute to the discussion on parsing performance in NLP by introducing a measurement that evaluates the differences between the distributions of edge displacement (the directed distance of edges) seen in training and test data. We hypothesize that this measurement will be related to differences observed in parsing performance across treebanks. We motivate this by building upon previous work and then attempt to falsify this hypothesis by using a number of statistical methods. We establish that there is a statistical correlation between this measurement and parsing performance even when controlling for potential covariants. We then use this to establish a sampling technique that gives us an adversarial and complementary split. This gives an idea of the lower and upper bounds of parsing systems for a given treebank in lieu of freshly sampled data. In a broader sense, the methodology presented here can act as a reference for future correlation-based exploratory work in NLP
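
A minimal Python sketch of the kind of measurement described in the summary above: the 1-Wasserstein (earth mover's) distance between the empirical distributions of signed edge displacements in two treebanks. The toy (head, dependent) representation and the displacement convention are assumptions for illustration, not the authors' code.

```python
# Minimal sketch: W1 distance between edge-displacement distributions of two treebanks.
# Edge displacement here = dependent index minus head index (a signed convention assumed for illustration).
from scipy.stats import wasserstein_distance

def edge_displacements(treebank):
    """treebank: list of sentences, each a list of (head_idx, dep_idx) pairs."""
    return [dep - head for sent in treebank for head, dep in sent]

train = [[(2, 1), (0, 2), (2, 3)], [(1, 2), (0, 1)]]   # toy gold trees
test  = [[(3, 1), (3, 2), (0, 3)]]                     # toy test trees
print(wasserstein_distance(edge_displacements(train), edge_displacements(test)))
```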


Isometric rigidity of Wasserstein tori and spheres - NASA/ADS

https://ui.adsabs.harvard.edu › abs › abstract

by G Pál Gehér · 2022 — Abstract. We prove isometric rigidity for $p$-Wasserstein spaces over finite-dimensional tori and spheres for all $p$. We present a unified approach to ...

[CITATION] Isometric rigidity of the Wasserstein torus and the Wasserstein sphere

GP Gehér, T Titkos, D Virosztek - arXiv preprint arXiv:2203.04054, 2022


 
2022

Wasserstein-Type Distances of Two-Type Continuous-State Branching Processes in Levy Random Environments

Chen, SK; Fang, RJ and Zheng, XQ

Dec 2022 (Early Access) | 

JOURNAL OF THEORETICAL PROBABILITY

Under natural conditions, we prove exponential ergodicity in the L-1-Wasserstein distance of two-type continuous-state branching processes in Levy random environments with immigration. Furthermore, we express precisely the parameters of the exponent. The coupling method and the conditioned branching property play an important role in the approach. Using the tool of superprocesses, ergodicity in

Show more

Free Submitted Article From RepositoryFull Text at Publishermore_horiz

23 References  Related records

2022 see 2021  conference paper

C-WGAN-GP: Augmenting ECG and GSR Signals using 

Conditional Generative Models for Arousal Classification

https://www.researchgate.net › publication › 354834020_...

Aug 15, 2022 — using Conditional Generative Models for Arousal Classification ... We test AC-WGAN-GP for g


2022

 Tweets with replies by Laurence Aitchison ... - Twitter

mobile.twitter.com › laurence_ai › with_replies

From local circuit computations to brain area interactions. ... Mingxuan's paper on Sliced Wasserstein Variational Inference has been awarded the Best ...

Twitter · 

Sep 29, 2022

<-—2022———2022———1960—


 

Safe Reinforcement Learning Using Wasserstein Distributionally Robust MPC and Chance Constraint

Kordabad, AB; Wisniewski, R and Gros, S

2022 | 

IEEE ACCESS

 10 , pp.130058-130067

In this paper, we address the chance-constrained safe Reinforcement Learning (RL) problem using the function approximators based on Stochastic Model Predictive Control (SMPC) and Distributionally Robust Model Predictive Control (DRMPC). We use Conditional Value at Risk (CVaR) to measure the probability of constraint violation and safety. In order to provide a safe policy by construction, we fir

Show more

Free Full Text from Publishermore_horiz

47 References  Related records


2022

Dynamical mode recognition of triple flickering buoyant diffusion flames in Wasserstein space

Chi, YC; Yang, T and Zhang, P

Feb 2023 | 

COMBUSTION AND FLAME 248

Triple flickering buoyant diffusion flames in an isosceles triangle arrangement, as a nonlinear dynamical system of coupled oscillators, were experimentally studied. The focus of the study is two-fold: we established a well-controlled gas-fuel diffusion flame experiment, which well remedies the deficiencies of prevalent candle-flame experiments, and we developed a Wasserstein-space-based metho

Show more

Free Submitted Article From RepositoryView full textmore_horiz

51 References  Related records


 MR4533107 Thesis Milne, Tristan; 

Optimal Transport, Congested Transport, and Wasserstein Generative Adversarial Networks. Thesis (Ph.D.)–University of Toronto (Canada). 2022. 199 pp. ISBN: 979-8357-55114-6, ProQuest LLC

Review PDF Clipboard Series Thesis


2022

Conditional Wasserstein Generator | IEEE Journals & Magazine

https://ieeexplore.ieee.org › document

by Y Kim · 2022 — Our proposed algorithm can be viewed as an extension of Wasserstein autoencoders [1] to conditional generation or as a Wasserstein counterpart ...

Related articles All 4 versions

2022

LPOT: Locality-Preserving Gromov–Wasserstein Discrepancy for Nonrigid Point Set Registration

Gang Wang

IEEE Transactions on Neural Networks and Learning Systems

Year: 2022 | Early Access Article | Publisher: IEEE


2022


Distributed Kalman Filter With Faulty/Reliable Sensors Based on Wasserstein Average Consensus

DJ Xin, LF Shi, X Yu - IEEE Transactions on Circuits and …, 2022 - ieeexplore.ieee.org

… The main contribution of this brief lies in that we propose a Wasserstein average consensus

… perform local information fusion based on Wasserstein average consensus. The remainder …

Cited by 5 Related articles


2022

An Efficient Content Popularity Prediction of Privacy Preserving Based on Federated Learning and Wasserstein GAN

K Wang, N Deng, X Li - IEEE Internet of Things Journal, 2022 - ieeexplore.ieee.org

To relieve the high backhaul load and long transmission time caused by the huge mobile

data traffic, caching devices are deployed at the edge of mobile networks. The key to efficient …

Related articles


Synthetic Traffic Generation with Wasserstein Generative Adversarial Networks

CL Wu, YY Chen, PY Chou… - … 2022-2022 IEEE Global …, 2022 - ieeexplore.ieee.org

… Wasserstein GAN (WGAN) is proposed to ameliorate the problems mentioned previously.

It alternatively adopts the Wasserstein Distance to offer meaningful gradients to the generator …

 Related articles


[PDF] arxiv.org

The Wasserstein distance of order 1 for quantum spin systems on infinite lattices

G De Palma, D Trevisan - arXiv preprint arXiv:2210.11446, 2022 - arxiv.org

… We propose a generalization of the Wasserstein distance of order 1 to quantum spin systems

on the lattice Zd, which we call specific quantum W1 distance. The proposal is based on …

Related articles All 2 versions 


2022 see 2021 PDF

Wasserstein Convergence for Empirical Measures of ... - arXiv

https://arxiv.org › pdf

by H Li · 2022 — Abstract. We investigate long-time behaviors of empirical measures associated with subordi- nated Dirichlet diffusion processes on a compact ...

<-—2022———2022———1970—


View all

2022 see 2021

Local well-posedness in the Wasserstein space for a ...

https://dl.acm.org › doi › abs

by K Kang · 2022 — Published:01 August 2022Publication History ... refine the result on the existence of a weak solution of a Fokker–Planck equation in the Wasserstein space. 


Global Seminar

https://vega-institute.org › students › global-seminar


April 9, 2022 Mikhail Zhitlukhin (Steklov Mathematical Institute) ... Topic: 

Sensitivity analysis for Wasserstein Distributionally Robust Optimization and ...


Video Archive - Department of Mathematical Sciences, MCS

https://www.cmu.edu › math › cna › events › video-arc...


Matt Jacobs (Purdue University). 

Adversarial training and the generalized Wasserstein barycenter problem. . September 27, 2022 ...


Catchup results for math from Mon, 26 Dec 2022

http://128.84.4.18 › catchup

http://128.84.4.18 › catchup

Journal-ref: Seminaire Laurent Schwartz - EDP et applications (2021-2022), Talk III ... 

Fields for Lipschitz surfaces and the Wasserstein Fisher Rao metric.



2022 PDF

Distances Between Probability Distributions of Different ...

https://www.stat.uchicago.edu › work › probdist

by Y Cai · 2022 · Cited by 15 — Downloaded on May 21,2022 at 03:09:22 UTC from IEEE Xplore. ... 2-Wasserstein metric to obtain distances on probability mea- ... Inst. Steklov., vol.

12 pages


2022


Justin Solomon | Papers With Code

https://paperswithcode.com › author › justin-solomon

https://paperswithcode.com › author › justin-solomon

no code implementations • 18 May 2022 • Christopher Scarvelis, ... 

Continuous Wasserstein-2 Barycenter Estimation without Minimax Optimization.


Contrastive Prototypical Network with Wasserstein Confidence ...

Association for Computing Machinery·

https://dl.acm.org › doi › abs

Association for Computing Machinery

https://dl.acm.org › doi › abs

by H Wang · 2022 — Computer Vision – ECCV 2022: 17th European Conference, Tel Aviv, ... To this end, we propose Wasserstein Confidence Penalty which can impose ...


The Wasserstein distance as a hydrological objective function

https://egusphere.copernicus.org › preprints › 2022


by JC Magyar · 2022 — https://doi.org/10.5194/egusphere-2022-1117. Preprint. Discussion started: 10 November 2022 cс Author(s) 2022. CC BY 4.0 License.

Analysis Seminar | Department of Mathematics - Penn Math

https://www.math.upenn.edu › events › seminars › anal...nal...

Thursday, November 17, 2022 - 3:30pm. Dóminique Kemp, IAS ... 

The Exponential Formula for the Wasserstein Metric ... Andrei Kapaev, Steklov Institute.


Personal Homepage of Prof. Dr. Alessio Figalli - People

https://people.math.ethz.ch › ~afigalli › lecture-notes

Preprint 2022. 

The continuous formulation of shallow neural networks as wasserstein-type gradient flows (with X. Fernández-Real) ... Inst. Steklov.

Zbl 07672483

Cited by 2 Related articles All 5 versions

Zbl 1518.35567

<-—2022———2022———1980—

2022 see 2008

Jonathan C. Mattingly - Duke University

https://fds.duke.edu › cv-28-82-1757-p

10 (September, 2022), Cambridge University Press (CUP) ... with Martin Hairer,, 

Spectral gaps in Wasserstein distances and the 2D stochastic Navier-Stokes ...


Decoding Biology through Mathematics - Digital Library

https://rajapakse.lab.medicine.umich.edu › papers › digi...

https://rajapakse.lab.medicine.umich.edu › papers › digi...

2839–2846, May 2022 ... Cell 185.4 (2022): 690-711. Rao, Suhas SP, et al. ... "Gromov–

Wasserstein distances and the metric approach to object matching.


2022

The “Unreasonable” Effectiveness of the Wasserstein ... - MDPI

https://www.mdpi.com › pdfP

by A Ponti · 2022 · Cited by 1 — the Wasserstein Distance in Analyzing Key Performance Indicators of a Network of Stores. Big Data Cogn. Comput. 2022, 6, 138.


Exploring predictive states via Cantor embeddings and ...

https://aip.scitation.org › doi › abs

by SP Loomis · 2022 — Full Submitted: 10 June 2022 Accepted: 02 November 2022 Published Online: 05 ... Exploring predictive states via Cantor embeddings and Wasserstein distance.


[2210.14298] Wasserstein Archetypal Analysis - arXiv
https://arxiv.org › stat
by K Craig · 2022 — [Submitted on 25 Oct 2022] ... formulation of archetypal analysis based on the Wasserstein metric, which we call Wasserstein archetypal analysis (WAA).


2022


2022 see 2021

Publications - Math

https://faculty.math.illinois.edu › Macaulay2 › Publicati...

Steklov Inst. Math. ... 

Wasserstein Distance to Independence Models, by Türkü Özlüm Çelik, Asgar Jamneshan, Guido Montúfar, Bernd Sturmfels, and Lorenzo ...

 , Proc. AMS, Vol. 150 (11), pp. 4879–4890, 2022.


NeurIPS 2022

https://neurips.cc › 2022 › ScheduleMultitrack

https://neurips.cc › 2022 › ScheduleMultitrack

... cryogenic electron microscopy density maps by minimizing their Wasserstein distance ... 

She is an ACM Fellow, a Fellow of the Royal Society of Canada, ...


2022

Thibaut Le Gouic - Google Scholar

https://scholar.google.co.id › citations

Existence and consistency of Wasserstein barycenters ... Proceedings of the 23rd ACM Conference on Economics and Computation, 208-209, 2022.


Gromov-Wasserstein Guided Representation Learning for Cross-Domain Recommendation

X Li, Z Qiu, X Zhao, Z Wang, Y Zhang, C Xing… - Proceedings of the 31st …, 2022 - dl.acm.org

… through enhancing the representation learning, it can be … the Gromov-Wasserstein distance

between two representation distribu… to further optimize the representation learning module of …

Related articles


ACM SIGMOD Conference 2022: Philadelphia, PA, USA - DBLP
https://dblp.org › Conferences and Workshops › SIGMOD
Bibliographic content of ACM SIGMOD Conference 2022. ... 

Neural Subgraph Counting with Wasserstein Estimator. 160-175.

Cited by 6 Related articles All 2 versions

<-—2022———2022———1990—


2022 see 2015

Convolutional wasserstein distances - Archive ouverte HAL

https://hal.science › hal-01188953

To this end, we approximate optimal transportation distances using entropic regularization ... Last modified: Friday, 18 November 2022, 09:23:35.

Significant New Researcher Award - ACM SIGGRAPH

https://www.siggraph.org › Awards

https://www.siggraph.org › Awards

ACM SIGGRAPH is pleased to present the 2022 Significant New Researcher Award ... 

and Convolutional Wasserstein Distances for optimal transport on meshes and ...


 2022

ACM Multimedia 2022 in Lisbon: Detailed Program

http://2022.acmmm.org › uploads › 2022/10 › A...PD

Weakly-Supervised Temporal Action Alignment Driven by Unbalanced Spectral Fused Gromov-Wasserstein Distance -- Dixin Luo (Beijing Institute of Technology), ...

Convolution Sliced Wasserstein - GitHub
https://github.com › CSW

@article{nguyen2022revisting, title={Revisiting Sliced Wasserstein on Images: From Vectorization to Convolution}, author={Khai Nguyen and Nhat Ho}, ...

Cited by 7 Related articles All 6 versions

2022 Workshop Videos | Banff International Research Station

http://www.birs.ca › videos › 2022

http://www.birs.ca › videos › 2022

Tuesday Sep 6, 2022 11:14 - 11:29. 

Tangent Space and Dimension Estimation with the Wasserstein Distance ... Boris Kashin, Steklov Mathematics Institute.


2022

New Trends - Caltech

https://newtrends.caltech.edu

https://newtrends.caltech.edu

This conference will be held at Caltech on December 19-21, 2022 ... 

Haomin Zhou: 

Wasserstein Hamiltonian flow and its structure preserving computations.


2022



2022 SEE 2021

Ryoma Sato - Google Scholar

https://scholar.google.com › citations

Proceedings of the Fifteenth ACM International Conference on Web Search and …, 2022‏. 5, 2022. 

Supervised tree-wasserstein distance‏. Y Takezawa, R Sato, ..

 2022 see 2021 2023

Jundong Li - UVA Engineering

https://engineering.virginia.edu › faculty › jundong-li

He has won several prestigious awards, including SIGKDD 2022 Best ... 

Graph Alignment with Wasserstein Distance Discriminator", ACM SIGKDD Conference on ...


Accueil - Laboratoire de Mathématiques de Besançon (UMR 6623)

https://hal.telecom-paris.fr › LMB › folder › list5

Thibault Modeste, Clément Dombry. 

Characterization of translation invariant MMD on R d and connections with Wasserstein distances. 2022. hal-03855093.

index - Archive ouverte HAL

https://hal.science › LMB

Thibault Modeste, Clément Dombry. Characterization of translation invariant MMD on R d and connections with Wasserstein distances. 2022. hal-03855093.


2022 see 2021

Homepage of Nathael Gozlan - Publications - Google Sites

https://sites.google.com › view › publications

International Mathematics Research Notices, Volume 2022, Issue 17, ... 

Generalized Wasserstein barycenters between probability measures living on different ...


Gromov-Wasserstein Guided Representation Learning for Cross-Domain Recommendation

X Li, Z Qiu, X Zhao, Z Wang, Y Zhang, C Xing… - … of the 31st ACM …, 2022 - dl.acm.org

… Here, we exploit Gromov-Wasserstein discrepancy as the … -Wasserstein (GW) OT 

problem (defined in Equation (3)). The Figure 3 shows the detailed process of Gromov-Wasserstein

 Related articles

<-—2022———2022———2000—


2022 see 2021  [HTML] mdpi.com

Panchromatic Image super-resolution via self attention-augmented wasserstein generative adversarial network

J Du, K Cheng, Y Yu, D Wang, H Zhou - Sensors, 2021 - mdpi.com

… Our system reconstructs high-resolution images via Wasserstein generative adversarial 

networks with the channel and spatial attention to obtain more representative features. Especially…

Cited by 7 Related articles All 6 versions



2022

Khai Nguyen on Twitter: "In our new #NeurIPS2022 paper, we ...

twitter.com › KhaiBaNguyen › status

twitter.com › KhaiBaNguyen › status

In our new #NeurIPS2022 paper, we show that using multiple ... over images to one dimension is better for the sliced Wasserstein than doing ...

Nov 26, 2022


 Jia Li - Eberly College of Science - Penn State

science.psu.edu › ... › 2022 lectures


Frontiers Of Science 2022 ... These methods exploit mathematical tools such as optimal transport and the Wasserstein barycenter.

Eberly College of Science · Penn State Eberly College of Science · 

Jan 5, 2022


Rémi Flamary (@RFlamary) / Twitter

twitter.com › rflamary

Glad to announce that our paper on "Spherical Sliced-Wasserstein" was accepted ... Graph Neural Network with Optimal Transport Distances" at #NeurIPS2022.

Nov 25, 2022

2022
LPOT: Locality-Preserving Gromov–Wasserstein Discrepancy for Nonrigid Point Set Registration

G Wang - IEEE Transactions on Neural Networks and Learning …, 2022 - ieeexplore.ieee.org

… to compute registration mappings and apply them for registering images. Feydy et al. [59] … 

Here, we focus on the registration of points extracted from 2-D images and 3-D scenes. …

 Related articles

2022


[DAG-WGAN: Causal Structure Learning with Wasserstein ...

www.youtube.com › watch


DAG-WGAN: Causal Structure Learning with Wasserstein Generative Adversarial Networks. Authors: Hristo Petkov, Colin Hanley and Feng Dong, ...

YouTube · Computer Science & IT Conference Proceedings · 

Apr 7, 2022

2022 see 2021

[PDF] jmlr.org

[PDF] Intrinsic Dimension Estimation Using Wasserstein Distance

A Block, Z Jia, Y Polyanskiy, A Rakhlin - Journal of Machine Learning …, 2022 - jmlr.org

images from MNIST in datasets of size ranging in powers of 2 from 32 to 2048, calculate 

the Wasserstein … distances to compute the Wasserstein distance between the empirical …

 Related articles All 2 versions

MR4577752 

[PDF] arxiv.org

Wasserstein distributionally robust optimization and variation regularization

R Gao, X Chen, AJ Kleywegt - Operations Research, 2022 - pubsonline.informs.org

… The connection between Wasserstein DRO and … variation regularization effect of the 

Wasserstein DRO—a new form … -variation tradeoff intrinsic in the Wasserstein DRO, which …

 Cited by 32 Related articles All 4 versions
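
Schematically, the regularization connection studied in this line of work can be summarized as follows (the notation is assumed here; the precise conditions and the exact form of the variation term $\mathcal{V}$ are what the paper establishes):

\[
\sup_{Q:\,W_p(Q,\widehat{P}_n)\le\varepsilon}\ \mathbb{E}_{Q}\bigl[\ell_\theta(\xi)\bigr]
\;\approx\;
\mathbb{E}_{\widehat{P}_n}\bigl[\ell_\theta(\xi)\bigr] + \varepsilon\,\mathcal{V}(\ell_\theta),
\]

where $\widehat{P}_n$ is the empirical distribution, $W_p$ the order-$p$ Wasserstein distance and $\varepsilon$ the radius of the Wasserstein ball.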


Apple disease recognition based on Wasserstein generative adversarial networks and hybrid attention mechanism residual network

Y Xueying, G Jiyong, W Shoucheng… - Journal of Chinese …, 2022 - zgnjhxb.niam.com.cn

… disease images caused … Wasserstein Generative Adversarial Networks (WGAN). Through 

the antagonistic training between generator and discriminator, 10 000 apple disease images

Related articles 

<-—2022———2022———2010—



[PDF] arxiv.org

WOGAN at the SBST 2022 CPS tool competition

J Peltomäki, F Spencer, I Porres - 2022 IEEE/ACM 15th …, 2022 - ieeexplore.ieee.org

… We chose to train a Wasserstein generative adversarial network (WGAN) which is capable 

… Copyrights for components of this work owned by others than ACM must be honored. …

Cited by 1 Related articles All 6 versions 


2022

Nhat Ho (@nhatptnk8912) / Twitter

twitter.com › nhatptnk8912

Austin, Texas nhatptnk8912.github.io Joined March 2022 ... probability measure over images to one dimension is better for the sliced Wasserstein than doing ...

Nov 25, 2022



Khai Nguyen (@KhaiBaNguyen) / Twitter

twitter.com › KhaiBaNguyen


In our new #NeurIPS2022 paper, we show that using multiple convolution layers ... measure over images to one dimension is better for the sliced Wasserstein ...

Twitter · 

Oct 24, 2022

Neural Operator || Physics Embedded Network || Seminar on

www.youtube.com › watch

... GeONet: a neural operator for learning the Wasserstein geodesic2. Ruiy... ... Physics Embedded Network || Seminar on: November 18, 2022.

YouTube · CRUNCH Group: Home of Math + Machine Learning + X · 

Nov 18, 2022


[PDF]  [PDF] arxiv.org

Long Range Constraints for Neural Texture Synthesis Using Sliced Wasserstein Loss

L Yin, A Chua - arXiv preprint arXiv:2211.11137, 2022 - arxiv.org

… to capture long range constraints in images. Having access to a … synthesis based on Sliced 

Wasserstein Loss and create a … long range constraints in images and compare our results to …

 Related articles All 2 versions

2022


PDF] arxiv.org

Learning to generate Wasserstein barycenters

J Lacombe, J Digne, N Courty, N Bonneel - Journal of Mathematical …, 2022 - Springer

Wasserstein barycenters in milliseconds. It shows that this can be done by learning Wasserstein 

… of our model due to our training images being significantly different from these images. …

Cited by 3 Related articles All 5 versions
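
For intuition about the object such models approximate: in one dimension the Wasserstein-2 barycenter has a closed form, obtained by averaging the quantile functions of the input distributions. The Python sketch below uses that fact on empirical samples; it is a didactic example, unrelated to the network proposed in the paper above.

```python
# 1D sketch: the W2 barycenter of empirical distributions is obtained by
# averaging their quantile functions (this shortcut is valid in one dimension only).
import numpy as np

def w2_barycenter_1d(samples, weights=None, n_quantiles=200):
    """samples: list of 1D arrays; returns n_quantiles points of the barycenter's quantile curve."""
    k = len(samples)
    weights = np.full(k, 1.0 / k) if weights is None else np.asarray(weights)
    qs = np.linspace(0.0, 1.0, n_quantiles)
    quantile_curves = np.stack([np.quantile(s, qs) for s in samples])  # (k, n_quantiles)
    return weights @ quantile_curves                                    # weighted quantile average

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    a = rng.normal(-2.0, 0.5, 1000)
    b = rng.normal(+3.0, 1.5, 1000)
    bary = w2_barycenter_1d([a, b])
    print(bary.mean())   # roughly the average of the two means (about 0.5 here)
```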


Text to Face generation using Wasserstein stackGAN

A Kushwaha, P Chanakya… - 2022 IEEE 9th Uttar …, 2022 - ieeexplore.ieee.org

Wasserstein Loss The basic idea is to generate a score for real and fake images passed … 

,ie generator output are of similar kind or images in our case and another problem is with the …

 Related articles



[PDF] thecvf.com

Sliced wasserstein discrepancy for unsupervised domain adaptation

CY Lee, T Batra, MH Baig… - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com

… SYNSIG GTSRB In this setting, we evaluate the adaptation ability from synthetic images 

SYNSIG to real images GTSRB. We randomly selected 31367 samples for target training and …

Cited by 386 Related articles All 9 versions


2022


[PDF] arxiv.org

Learning to solve inverse problems using Wasserstein loss

J Adler, A Ringh, O Öktem, J Karlsson - arXiv preprint arXiv:1710.10898, 2017 - arxiv.org

… Moreover, c(x1,x2)1/4 is in fact a metric on R2 (see lemma 6 in the appendix) and thus 

W4(µ0,µ1) := T(µ0,µ1)1/4 gives rise to a Wasserstein metric on the space of images, where T(µ0,…

Cited by 28 Related articles All 4 versions



Deep joint two-stream Wasserstein auto-encoder and selective attention alignment for unsupervised domain adaptation

Z Chen, C Chen, X Jin, Y Liu, Z Cheng - Neural computing and …, 2020 - Springer

images have the same distribution as real target images, and thus, only the synthetic target 

images … In this work, we propose a method that joints two-stream Wasserstein auto-encoder (…

Cited by 18 Related articles All 4 versions

<-—2022———2022———2020—



[PDF] aaai.org

Manifold-valued image generation with wasserstein generative adversarial nets

Z Huang, J Wu, L Van Gool - Proceedings of the AAAI Conference on …, 2019 - ojs.aaai.org

… a new Wasserstein distance on complete manifolds. By adopting the proposed Wasserstein 

For the FID computation, we translate CB images back to RGB images. By comparing with …

 Cited by 17 Related articles All 10 versions


[PDF] ieee.org

Accelerating CS-MRI reconstruction with fine-tuning Wasserstein generative adversarial network

M Jiang, Z Yuan, X Yang, J Zhang, Y Gong, L Xia… - IEEE …, 2019 - ieeexplore.ieee.org

… approach is to introduce Wasserstein distance as the new … reconstructed images we 

calculate the Wasserstein distance … to minimize the Wasserstein distance and RMSProp is used …

 Cited by 18 Related articles


[PDF] arxiv.org

An Efficient HPR Algorithm for the Wasserstein Barycenter Problem with Computational Complexity

G Zhang, Y Yuan, D Sun - arXiv preprint arXiv:2211.14881, 2022 - arxiv.org

… the model of the Wasserstein barycenter problem. Then we proposed a linear time complexity 

procedure for the linear system involved in solving the Wasserstein barycenter problem. …

 Related articles All 2 versions



PDF] arxiv.org

Single image haze removal using conditional wasserstein generative adversarial networks

JP Ebenezer, B Das… - 2019 27th European …, 2019 - ieeexplore.ieee.org

… methods have required a prior on natural images or multiple images of the same scene. We 

… of clear images conditioned on the haze-affected images using the Wasserstein loss function…

 Cited by 16 Related articles All 7 versions



 

[PDF] arxiv.org

Improving the improved training of wasserstein gans: A consistency term and its dual effect

X Wei, B Gong, Z Liu, W Lu, L Wang - arXiv preprint arXiv:1803.01541, 2018 - arxiv.org

… The corresponding algorithm, called Wasserstein GAN (WGAN), … -10 images and is the first 

that exceeds the accuracy of 90% on the CIFAR-10 dataset using only 4,000 labeled images, …

Cited by 238 Related articles All 5 versions
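
For context next to the WGAN-GP papers above, here is a minimal PyTorch sketch of the standard gradient-penalty term (Gulrajani et al. style), not the consistency term proposed in this particular paper; the image-shaped tensors and the toy critic are assumptions for illustration.

```python
# Minimal sketch of the WGAN-GP gradient penalty: penalize the critic's gradient
# norm on random interpolates between real and fake batches.
import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    # eps broadcasts over (B, C, H, W) image batches (an assumed tensor layout).
    eps = torch.rand(real.size(0), 1, 1, 1, device=real.device)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads = torch.autograd.grad(outputs=scores.sum(), inputs=interp,
                                create_graph=True)[0]
    grad_norm = grads.flatten(start_dim=1).norm(2, dim=1)
    return lambda_gp * ((grad_norm - 1.0) ** 2).mean()

if __name__ == "__main__":
    critic = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 8 * 8, 1))
    real = torch.randn(4, 3, 8, 8)
    fake = torch.randn(4, 3, 8, 8)
    # Typical critic loss: critic(fake).mean() - critic(real).mean() + gradient_penalty(...)
    print(gradient_penalty(critic, real, fake))
```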

2022


Infrared and visible image fusion using dual discriminators generative adversarial networks with Wasserstein distance

J Li, H Huo, K Liu, C Li - Information Sciences, 2020 - Elsevier

images simultaneously, we consider the fusion image has been keeping enough intensity 

information and texture information from source images… discriminators Wasserstein generative …

Cited by 46 Related articles


Unsupervised segmentation of colonic polyps in narrow-band imaging data based on manifold representation of images and Wasserstein distance

IN Figueiredo, L Pinto, PN Figueiredo, R Tsai - … Signal Processing and …, 2019 - Elsevier

… The images in our dataset are RGB color images with 576 × 726 pixels (per color channel) 

that were resized to 600 × 600 for further processing. For the computation in this paper, the …

Cited by 6 Related articles All 2 versions


[PDF] neurips.cc

Computing Kantorovich-Wasserstein Distances on d-dimensional histograms using (d+1)-partite graphs

G Auricchio, F Bassetti, S Gualandi… - Advances in Neural …, 2018 - proceedings.neurips.cc

… This paper presents a novel method to compute the exact Kantorovich-Wasserstein 

Kantorovich-Wasserstein distance of order 2 among two sets of instances: gray scale images and d-…

 Cited by 16 Related articles All 8 versions


[PDF] arxiv.org

Improved image wasserstein attacks and defenses

JE Hu, A Swaminathan, H Salman, G Yang - arXiv preprint arXiv …, 2020 - arxiv.org

Wasserstein ball. In this work, we define the Wasserstein threat model such that it applies to 

all imagesWasserstein radius. Our algorithm uses a constrained Sinkhorn iteration to project …

Cited by 11 Related articles All 2 versions


[PDF] mdpi.com

Towards Generating Realistic Wrist Pulse Signals Using Enhanced One Dimensional Wasserstein GAN

J Chang, F Hu, H Xu, X Mao, Y Zhao, L Huang - Sensors, 2023 - mdpi.com

… SWD: The Wasserstein distance expresses the price of changing one distribution into … 

The sliced Wasserstein distance is a 1d projection-based approximation of the Wasserstein

All 2 versions 
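
The snippet above describes the sliced Wasserstein distance as a 1D projection-based approximation of the Wasserstein distance. A minimal Monte-Carlo NumPy sketch of that construction is given below; the number of random projections and the equal sample sizes are simplifying assumptions, and this is not the paper's implementation.

```python
# Minimal Monte-Carlo sliced Wasserstein distance (order 1) between two point clouds.
import numpy as np

def sliced_wasserstein(x, y, n_projections=100, rng=None):
    """x, y: arrays of shape (n, d) with the same n."""
    rng = np.random.default_rng(rng)
    d = x.shape[1]
    theta = rng.normal(size=(n_projections, d))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)    # unit projection directions
    px, py = x @ theta.T, y @ theta.T                        # (n, n_projections) 1D projections
    # 1D W1 along each direction = mean |sorted difference|, then average over directions.
    return np.mean(np.abs(np.sort(px, axis=0) - np.sort(py, axis=0)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(0, 1, (500, 3))
    y = rng.normal(1, 1, (500, 3))
    print(sliced_wasserstein(x, y))
```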

<-—2022———2022———2030—


2022 see 2021  [PDF] arxiv.org

The cramer distance as a solution to biased wasserstein gradients

MG Bellemare, I Danihelka, W Dabney… - arXiv preprint arXiv …, 2017 - arxiv.org

… As additional supporting material, we provide here the results of experiments on learning a 

probabilistic generative model on images using either the 1-Wasserstein, Cramér, or KL loss. …

 Cited by 332 Related articles All 3 versions



[PDF] neurips.cc

Do neural optimal transport solvers work? a continuous wasserstein-2 benchmark

A Korotin, L Li, A Genevay… - Advances in …, 2021 - proceedings.neurips.cc

… task of generative modeling for CelebA 64 × 64 images of faces. For comparison, we add tQCs, 

… We show sample generated images in the top row of each subplot of Figure 5 and report …

 Cited by 22 Related articles All 6 versions


[PDF] thecvf.com

Wasserstein GAN with quadratic transport cost

H Liu, X Gu, D Samaras - Proceedings of the IEEE/CVF …, 2019 - openaccess.thecvf.com

… Results on the CelebA-HQ dataset We resize the face images in CelebA-HQ to 256×256 

and train WGAN-QC on them. We can see that most of the randomly generated images by …

Cited by 58 Related articles All 5 versions



2022 see 2021   [PDF] thecvf.com

Deepacg: Co-saliency detection via semantic-aware contrast gromov-wasserstein distance

K Zhang, M Dong, B Liu, XT Yuan… - Proceedings of the …, 2021 - openaccess.thecvf.com

… the co-occurring salient objects in a group of images. To address this task, we introduce 

a … Gromov-Wasserstein distance (DeepACG). We first adopt the Gromov-Wasserstein (GW) …

Cited by 15 Related articles All 4 versions



 

Waserstein model reduction approach for parametrized flow ...

https://arxiv.org › math

by B Battisti · 2022 · Cited by 2 — The aim of this work is to build a reduced-order model for parametrized porous media equations. The main challenge of this type of problems is ...


2022 


arXiv:2302.01459  [pdf, other]  cs.CV
A sliced-Wasserstein distance-based approach for out-of-class-distribution detection
Authors: Mohammad Shifat E Rabbi, Abu Hasnat Mohammad Rubaiyat, Yan Zhuang, Gustavo K Rohde
Abstract: There exist growing interests in intelligent systems for numerous medical imaging, image processing, and computer vision applications, such as face recognition, medical diagnosis, character recognition, and self-driving cars, among others. These applications usually require solving complex classification problems involving complex images with unknown data generative processes. In addition to recen…  More
Submitted 2 February, 2023; originally announced February 2023.

2022

A new data generation approach with modified Wasserstein auto-encoder for rotating machinery fault diagnosis with limited fault data

K Zhao, H Jiang, C Liu, Y Wang, K Zhu - Knowledge-Based Systems, 2022 - Elsevier

… Wasserstein auto-encoder (MWAE) to generate data that are highly similar to the known

data. The sliced Wasserstein … The sliced Wasserstein distance with a gradient penalty is …

Cited by 17 Related articles All 2 versions


2022

Comparing Beta-VAE to WGAN-GP for Time Series Augmentation to Improve Classification Performance

D Kavran, B Žalik, N Lukač - … , ICAART 2022, Virtual Event, February 3–5 …, 2023 - Springer

… A comparison is presented between Beta-VAE and WGAN-GP as … The use of WGAN-GP 

generated synthetic sets to train … the use of Beta-VAE and WGAN-GP generated synthetic sets …

Cite Related articles


2022 see 2021  [PDF] tandfonline.com

Stochastic approximation versus sample average approximation for Wasserstein barycenters

D Dvinskikh - Optimization Methods and Software, 2022 - Taylor & Francis

… We show that for the Wasserstein barycenter problem, this superiority can be inverted. We 

… by the expectation to have other applications besides the Wasserstein barycenter problem. …

 Cited by 5 Related articles All 6 versions


2022 see 2021

MR4524213 Prelim Gehér, György Pál; Titkos, Tamás; Virosztek, Dániel; 

The isometry group of Wasserstein spaces: the Hilbertian case. J. Lond. Math. Soc. (2) 106 (2022), no. 4, 3865–3894. 46E27 (46G12 47B49 54 60)

Review PDF Clipboard Journal Article 

<-—2022———2022———2040—


 Accelerating the discovery of anticancer peptides targeting lung and breast cancers with the Wasserstein autoencoder model and PSO algorithm

L Yang, G Yang, Z Bing, Y Tian, L Huang… - Briefings in …, 2022 - academic.oup.com

… In this work, we report a framework of ACPs generation, which combines Wasserstein 

autoencoder (WAE) generative model and Particle Swarm Optimization (PSO) forward search …

Related articles All 4 versions


 2022

A novel hybrid sampling method based on CWGAN for extremely imbalanced backorder prediction

H Liu, Q Liu, M Liu - … on Systems, Man, and Cybernetics (SMC), 2022 - ieeexplore.ieee.org

Product backorder is a common problem in supply chain management systems. It is essential

for entrepreneurs to predict the likelihood of backorder accurately to minimize a company’s …

 Related articles


 

[PDF] sbc.org.br

Motor Imagery EEG Data Augmentation with cWGAN-GP for Brain-Computer Interfaces

LH dos Santos, DG Fantinato - Anais do XIX Encontro Nacional de …, 2022 - sol.sbc.org.br

… Based on this, in this work, we propose using cWGAN-GP to perform data augmentation

for dataset 1 of BCI Competition IV [Blankertz et al. 2007], which presents considerably small …

 Related articles All 3 versions 



2022

Commencement 2023 - Harvard Law School

May 24, 2023 ... at the corner of Massachusetts Avenue and Everett Street, which is near the North entrance to Wasserstein Hall (1585 Massachusetts Avenue).

Harvard Law Sch

 

2022

Wasserstein Distributionally Robust Optimization and Variation Regularization

Gao, R; Chen, X and Kleywegt, AJ

Nov 2022 (Early Access) | 

OPERATIONS RESEARCH

Wasserstein distributionally robust optimization (DRO) is an approach to optimization under uncertainty in which the decision maker hedges against a set of probability distributions, specified by a Wasserstein ball, for the uncertain parameters. This approach facilitates robust machine learning, resulting in models that sustain good performance when the data are to some extent different from th

Show more

66 References. Related records


2022


2022. Data

MEDIGAN MODEL UPLOAD: 00022_WGAN_CARDIAC_AGING

Campello, Victor M and Skorupko, Grzegorz

2022 | 

Zenodo

 | Software

Model ID: 00022_WGAN_CARDIAC_AGING. Uploaded via: API Tags: ['Cardiac imaging', 'pix2pix', 'Pix2Pix'] Usage: This GAN is used as part of the medigan library. This GAN's metadata is therefore stored in and retrieved from medigan's config file. medigan is an open-source Python library on Github that allows developers and researchers to easily add synthetic imaging data into their model training pipeli

Show more 


[PDF] arxiv.org

Fair and Optimal Classification via Transports to Wasserstein-Barycenter

R Xian, L Yin, H Zhao - arXiv preprint arXiv:2211.01528, 2022 - arxiv.org

… Our insight comes from the key observation that finding the optimal fair classifier is equivalent 

to solving a Wasserstein-barycenter problem under l1-norm restricted to the vertices of the …

Related articles All 2 versions



Entropic Gromov-Wasserstein between Gaussian Distributions

K Le, DQ Le, H Nguyen, D Do… - … on Machine Learning, 2022 - proceedings.mlr.press

… Gaussian distributions. Finally, we consider an entropic inner product Gromov-Wasserstein

barycenter of multiple Gaussian distributions. We prove that the barycenter is a Gaussian …

 Cited by 2 Related articles All 7 versions 


[HTML] springer.com

[HTML] Entropy-regularized 2-Wasserstein distance between Gaussian measures

A Mallasto, A Gerolin, HQ Minh - Information Geometry, 2022 - Springer

… Gaussian distributions are plentiful in applications dealing in uncertainty quantification

and … In this work, we study the Gaussian geometry under the entropy-regularized 2-Wasserstein …

Cited by 21 Related articles All 7 versions
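
For reference alongside the entropy-regularized results above, the classical (unregularized) 2-Wasserstein distance between two Gaussians has the closed form

\[
W_2^2\bigl(\mathcal{N}(m_1,\Sigma_1),\,\mathcal{N}(m_2,\Sigma_2)\bigr)
= \|m_1-m_2\|^2
+ \operatorname{tr}\Bigl(\Sigma_1+\Sigma_2-2\bigl(\Sigma_1^{1/2}\Sigma_2\Sigma_1^{1/2}\bigr)^{1/2}\Bigr),
\]

which is the baseline that the entropic variants studied in these papers modify.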


[PDF] arxiv.org

Entropic regularization of Wasserstein distance between infinite-dimensional Gaussian measures and Gaussian processes

HQ Minh - Journal of Theoretical Probability, 2022 - Springer

… 2-Wasserstein distance on an infinite-dimensional Hilbert space, in particular for the Gaussian

… of two Gaussian measures on Hilbert space with the smallest mutual information are joint …

 Cited by 3 Related articles All 5 versions



MR4585115 

<-—2022———2022———2050—


[PDF] arxiv.org

Gaussian Process regression over discrete probability measures: on the non-stationarity relation between Euclidean and Wasserstein Squared Exponential Kernels

A Candelieri, A Ponti, F Archetti - arXiv preprint arXiv:2212.01310, 2022 - arxiv.org

… the Wasserstein … the Wasserstein distance, under certain assumptions or based on some

variants of the original distance. Although a PD kernel can be defined by using the Wasserstein …

Related articles All 2 versions 


Gromov–Wasserstein distances between Gaussian distributions

J Delon, A Desolneux, A Salmona - Journal of Applied Probability, 2022 - cambridge.org

… We focus on the Gromov–Wasserstein distance with a ground cost defined as the squared …

between Gaussian distributions. We show that when the optimal plan is restricted to Gaussian …

Cited by 1 Related articles


Accelerated Bregman primal-dual methods applied to optimal transport and Wasserstein Barycenter problems

A Chambolle, JP Contreras - SIAM Journal on Mathematics of Data Science, 2022 - SIAM

… Transport (OT) and Wasserstein Barycenter (WB) problems, with … the dual space has a

Bregman divergence, and the dual … Finally, we introduce a new Bregman divergence based on a …

Cited by 6 Related articles All 12


2022 see arXiv

The Impact of Edge Displacement Vaserstein Distance on UD Parsing Performance

M Anderson, C Gómez-Rodríguez - Computational Linguistics, 2022 - direct.mit.edu

We contribute to the discussion on parsing performance in NLP by introducing a measurement

that evaluates the differences between the distributions of edge displacement (the directed distance of edges) … Related articles All 9 versions



Wasserstein distance-based probabilistic linguistic TODIM method with application to the evaluation of sustainable rural tourism potential

S Zhang, Z Wu, Z Ma, X Liu, J Wu - Economic Research …, 2022 - Taylor & Francis

… , Wasserstein distance and classical TODIM method. In Section 3, we propose a Wasserstein-…

In Section 4, the Wasserstein distance-based extended PL-TODIM method is proposed to …

Cited by 4 Related articles All 2 versions


2022


[PDF] arxiv.org

Lidar Upsampling With Sliced Wasserstein Distance

A Savkin, Y Wang, S Wirkert, N Navab… - IEEE Robotics and …, 2022 - ieeexplore.ieee.org

… We argue that, unlike existing point cloud upsampling methods, our edge-aware one-… -Wasserstein

distance can reconstruct fine details of lidar scans and surpass existing upsampling …

Cited by 5 Related articles


Handwriting Recognition Using Wasserstein Metric in Adversarial Learning

M Jangpangi, S Kumar, D Bhardwaj, BG Kim… - SN Computer …, 2022 - Springer

… Handwriting is challenging due to its irregular shapes, which … discriminator network and

applying Wasserstein’s function to … We found that using the Wasserstein adversarial approach in …

 Related articles


[PDF] arxiv.org

Generalizing to Unseen Domains with Wasserstein Distributional Robustness under Limited Source Knowledge

J Wang, L Xie, Y Xie, SL Huang, Y Li - arXiv preprint arXiv:2207.04913, 2022 - arxiv.org

… -specific Wasserstein uncertainty set. Compared with Kullback–Leibler divergence, Wasserstein

… While the classic DRO with one Wasserstein uncertainty set can be formulated into a …

 Cited by 1 Related articles All 2 versions 


[PDF] mlsb.io

[PDF] 3D alignment of cryogenic electron microscopy density maps by minimizing their Wasserstein distance

AT Riahi, G Woollard, F Poitevin, A Condon, KD Duc - mlsb.io

… electron density maps of multiple conformations of a biomolecule from Cryogenic electron

microscopy … for a rotation that minimizes the Wasserstein distance between two maps, …

Related articles 


[PDF] arxiv.org

Wasserstein Steepest Descent Flows of Discrepancies with Riesz Kernels

J Hertrich, M Gräf, R Beinert, G Steidl - arXiv preprint arXiv:2211.01804, 2022 - arxiv.org

… introduce Wasserstein steepest decent flows which rely on the concept of the geometric

Wasserstein … , there exists a unique Wasserstein steepest descent flow, which coincides with the …

 Cited by 2 Related articles All 2 versions 

<——2022———2022———2060—



[PDF] arxiv.org

Lidar Upsampling With Sliced Wasserstein Distance

A Savkin, Y Wang, S Wirkert, N Navab… - IEEE Robotics and …, 2022 - ieeexplore.ieee.org

… We propose to train lidar generation models using loss function based on Sliced-Wasserstein

distance as opposed to commonly utilized CD, EMD-based losses. Our benchmark …

 Cited by 5 Related articles


Small Sample Reliability Assessment With Online Time-Series Data Based on a Worm Wasserstein Generative Adversarial Network Learning Method

B Sun, Z Wu, Q Feng, Z Wang, Y Ren… - IEEE Transactions …, 2022 - ieeexplore.ieee.org

… [23] pointed out that the Wasserstein GAN (WGAN) is more suitable for expanding … of

earthworms, this article introduced a novel worm Wasserstein GAN (WWGAN) method for ORA with …

Cited by 1 Related articles


[PDF] researchgate.net

Graph Wasserstein Autoencoder-Based Asymptotically Optimal Motion Planning With Kinematic Constraints for Robotic Manipulation

C Xia, Y Zhang, SA Coleman, CY Weng… - IEEE Transactions …, 2022 - ieeexplore.ieee.org

… a new methodology based on a graph wasserstein autoencoder (GraphWAE) to learn the

bias … Essentially, the GraphWAE is an extension and improvement of the standard wasserstein …

Related articles All 2 versions


Vehicle Detection Based on Lidar-Camera Fusion Using Wasserstein Distance Method

Z Wei, H Zhu, D Liu, T He - 2022 China Automation Congress …, 2022 - ieeexplore.ieee.org

… lidar and camera fusion algorithm by Wasserstein distance method is proposed. In the propose

method, the advantage of Wasserstein … Therefore, we use Wasserstein distance to fuse …

[PDF] researchsquare.com

[PDF] Reliability Metrics of Explainable CNN based on Wasserstein Distance for Cardiac Evaluation

Y Omae, Y Kakimoto, Y Saito, D Fukamachi… - 2022 - researchsquare.com

… distributions by Wasserstein distance (WSD). When the CNN estimates PAWP from areas

other than the cardiac region, the WSD value is high. Therefore, WSD is a reliability metrics for …

Cited by 1 Related articles All 3 versions 


2022


A Wasserstein Distance-based Distributionally Robust Chance-constrained Clustered Generation Expansion Planning Considering Flexible Resource Investments

B Chen, T Liu, X Liu, C He, L Nan, L Wu… - … on Power Systems, 2022 - ieeexplore.ieee.org

… into the planning model, and the uncertainty is modeled via a Wasserstein distance (WD)-… 

-risk approximation method, the 

proposed planning model is reformulated as a tractable mixed-…

Cited by 1 Related articles



2022 see 2021. [PDF] arxiv.org

On Stein's Factors for Poisson Approximation in Wasserstein Distance with Nonlinear Transportation Costs

ZW Liao, Y Ma, A Xia - Journal of Theoretical Probability, 2022 - Springer

… We establish various bounds on the solutions to a Stein equation for Poisson approximation 

in the Wasserstein distance with nonlinear transportation costs. The proofs are a refinement …

Cited by 2 Related articles All 6 versions


Improving Text Classifiers Through Controlled Text Generation Using Transformer Wasserstein Autoencoder

C Harikrishnan, NM Dhanya - Inventive Communication and …, 2022 - Springer

… text using the transformer-based Wasserstein autoencoder which helps in improving the … 

The novelty of this paper is a transformer-based Wasserstein autoencoder which is used for …

Cited by 2 Related articles All 3 versions

 

[PDF] arxiv.org

Wasserstein distance estimates for jump-diffusion processes

JC Breton, N Privault - arXiv preprint arXiv:2212.04766, 2022 - arxiv.org

… Abstract We derive Wasserstein distance bounds between the probability distributions of 

a stochastic integral (Itô) process with jumps (Xt)t[0,T] and a jump-diffusion process (X t )t[…

Cited by 2 Related articles All 3 versions



[PDF] arxiv.org

Mean-field neural networks: learning mappings on Wasserstein space

H Pham, X Warin - arXiv preprint arXiv:2210.15179, 2022 - arxiv.org

Wasserstein space of probability measures and a space of functions, like eg in meanfield 

games/control problems. Two classes of neural … two mean-field neural networks, and show their …

Cited by 4 Related articles All 4 versions


[PDF] arxiv.org

Long Range Constraints for Neural Texture Synthesis Using Sliced Wasserstein Loss

L Yin, A Chua - arXiv preprint arXiv:2211.11137, 2022 - arxiv.org

… For the purpose of this paper, the Sliced Wasserstein … of texture synthesis via Sliced 

Wasserstein Loss that has the ability … that arises from using Sliced Wasserstein Loss is the …

Related articles All 2 versions

<——2022———2022———2070—


Wasserstein gradient flows policy optimization via input convex neural networks

Y Wang - … on Artificial Intelligence, Automation, and High …, 2022 - spiedigitallibrary.org

… Finally, in order to use Wasserstein gradient flow reinforcement learning method on a large 

scale, we introduce the input convex neural network to approximate the JKO method with the …

Related articles All 3 versions


   

[PDF] mlsb.io

[PDF] 3D alignment of cryogenic electron microscopy density maps by minimizing their Wasserstein distance

AT Riahi, G Woollard, F Poitevin, A Condon, KD Duc - mlsb.io

… For two given point clouds, A = {a1,...,an} and B = {b1,...,bn}, we define a cost matrix Ci,j =

d(ai,bj)2, where d is the Euclidean distance. The entropy regularized 2-Wasserstein distance …

 Related articles 


Wasserstein Distance Transfer Learning Algorithm based on Matrix-norm Regularization

X Wang, Y Yu - … Computing and Artificial Intelligence (AHPCAI), 2022 - ieeexplore.ieee.org

Wasserstein distance has been applied in the transfer learning algorithm, but the existing

methods are not ideal for the solution of Lipschitz constraint condition. To solve this problem, a …



[PDF] researchgate.net

Wasserstein metric-based two-stage distributionally robust optimization model for optimal daily peak shaving dispatch of cascade hydroplants under renewable …

X Jin, B Liu, S Liao, C Cheng, Y Zhang, Z Zhao, J Lu - Energy, 2022 - Elsevier

… of the Wasserstein metric, this paper uses the Wasserstein metric to assess the distance of 

P to P N . The comprehensive and well-known advantages of the Wasserstein metric can be …

Cited by 6 Related articles All 5 versions


2022


RoBiGAN: A bidirectional Wasserstein GAN approach for online robot fault diagnosis via internal anomaly detection

T Schnell, K Bott, L Puck, T Buettner… - 2022 IEEE/RSJ …, 2022 - ieeexplore.ieee.org

… , highly dependent data as needed for internal anomaly detection in complex robots. … 

bidirectional Wasserstein GAN architecture fit for online anomaly detection on internal sensor data …

Related articles


2022

The impact of WGAN-GP and BAGAN-GP generated cRBSs on glucarate biosensor dynamic range

By: Ding, Nn; Zhou, Shenghu

FlowRepository

Source URL: ‏ http://flowrepository.org/id/FR-FCM-Z4RF

Time:2021-12-25 - 2021-10-15

Viewed Date: 09 Jan 2022



Neural Subgraph Counting with Wasserstein Estimator

H Wang, R Hu, Y Zhang, L Qin, W Wang… - Proceedings of the 2022 …, 2022 - dl.acm.org

… Furthermore, we design a novel Wasserstein discriminator in WEst to minimize the … a

Wasserstein discriminator in the training process to optimize the parameters in the graph neural …

Cited by 6 Related articles All 2 versions


[PDF] arxiv.org

Optimal neural network approximation of wasserstein gradient direction via convex optimization

Y Wang, P Chen, M Pilanci, W Li - arXiv preprint arXiv:2205.13098, 2022 - arxiv.org

… on the Wasserstein gradient descent direction of KL divergence functional. Later on, we

design a neural network convex optimization problems to approximate Wasserstein gradient in …

Cited by 2 Related articles All 6 versions 



[PDF] arxiv.org

DeepParticle: Learning invariant measure by a deep neural network minimizing Wasserstein distance on data generated from an interacting particle method

Z Wang, J Xin, Z Zhang - Journal of Computational Physics, 2022 - Elsevier

… We design a neural network that has independently batched input for parameters so it can

learn … We train the neural network by minimizing the 2-Wasserstein distance between the input …

Cited by 4 Related articles All 10 versions

<——2022———2022———2080—



[HTML] rsc.org

[HTML] Pesticide detection combining the Wasserstein generative adversarial network and the residual neural network based on terahertz spectroscopy

R Yang, Y Li, B Qin, D Zhao, Y Gan, J Zheng - RSC advances, 2022 - pubs.rsc.org

… the Wasserstein generative adversarial network (WGAN) and the residual neural network

(ResNet), to detect carbendazim based on terahertz spectroscopy. The Wasserstein generative …

Cited by 5 Related articles All 6 versions



[PDF] arxiv.org

Mean-field neural networks: learning mappings on Wasserstein space

H Pham, X Warin - arXiv preprint arXiv:2210.15179, 2022 - arxiv.org

… Wasserstein space of probability measures and a space of functions, like eg in meanfield

games/control problems. Two classes of neural … two mean-field neural networks, and show their …

Cited by 4 Related articles All 4 versions 



[PDF] sciencedirect.com

Measuring phase-amplitude coupling between neural oscillations of different frequencies via the Wasserstein distance

T Ohki - Journal of Neuroscience Methods, 2022 - Elsevier

… mathematical framework of the Wasserstein distance to enhance the intuitive comprehension

of the Wasserstein Modulation Index (wMI). The Wasserstein distance is an optimization …

Cited by 3 Related articles All 3 versions



[PDF] arxiv.org

GeONet: a neural operator for learning the Wasserstein geodesic

A Gracyk, X Chen - arXiv preprint arXiv:2209.14440, 2022 - arxiv.org

… In this paper, we propose a deep neural operator learning framework GeONet for the

Wasserstein geodesic. Our method is based on learning the optimality conditions in the dynamic …

Related articles All 3 versions 



[HTML] springer.com

[HTML] … intrusion detection based on conditional wasserstein variational autoencoder with generative adversarial network and one-dimensional convolutional neural …

J He, X Wang, Y Song, Q Xiang, C Chen - Applied Intelligence, 2022 - Springer

… a method of generating conditional Wasserstein variational autoencoder generative

adversarial network (CWVAEGAN) and one-dimensional Convolutional neural network (1D-CNN). …

Related articles


2022


Tackling algorithmic bias in neural-network classifiers using wasserstein-2 regularization

L Risser, AG Sanz, Q Vincenot, JM Loubes - Journal of Mathematical …, 2022 - Springer

… use the Wasserstein metric when training Neural Networks, … Wasserstein distance appears

in this framework as a smooth … , the authors specifically used Wasserstein-1 to post-process …

Cited by 7 Related articles All 4 versions


[PDF] arxiv.org

Long Range Constraints for Neural Texture Synthesis Using Sliced Wasserstein Loss

L Yin, A Chua - arXiv preprint arXiv:2211.11137, 2022 - arxiv.org

… For the purpose of this paper, the Sliced Wasserstein … of texture synthesis via Sliced

Wasserstein Loss that has the ability … that arises from using Sliced Wasserstein Loss is the …

 Related articles All 2 versions 

 


Sliced wasserstein distance for neural style transfer

J Li, D Xu, S Yao - Computers & Graphics, 2022 - Elsevier

Neural Style Transfer (NST) aims to render a content image with the style of another image

in the feature space of a Convolution Neural Network (CNN). A fundamental concept of NST …

Cited by 1 Related articles All 2 versions


2022

Exact statistical inference for the Wasserstein distance

by VNL Duy · 2022 · Cited by 6 — 2017). This distance measures the cost to couple one distribution with another, which arises from the notion of optimal transport (Villani 2009) ...


2022

Bayesian optimization in Wasserstein spaces

A Candelieri, A Ponti, F Archetti - International Conference on Learning …, 2022 - Springer

Bayesian Optimization (BO) is a sample efficient approach for approximating the global optimum of black-box and computationally expensive optimization problems which has proved its effectiveness in a wide range of engineering and machine learning problems. A limiting factor in its applications is the difficulty of scaling over 15–20 dimensions. It has been remarked that global optimization problems often have a lower intrinsic dimensionality which can be exploited to construct a feature mapping the original problem into low dimension …

 Cited by 1

<——2022———2022———2090— 


2022

Estimation and inference for the Wasserstein distance between mixing measures in topic...

by Bing, Xin; Bunea, Florentina; Niles-Weed, Jonathan

06/2022

.... This work proposes a new canonical interpretation of this distance and provides tools to perform inference on the Wasserstein distance between mixing measures in topic models...


2022

A Wasserstein distance-based spectral clustering method...

by Zhu, Yingqiu; Huang, Danyang; Zhang, Bo

03/2022

.... We adopt Wasserstein distance to measure the dissimilarity between any two merchants and propose the Wasserstein-distance-based spectral clustering (WSC) approach...

Journal Article Full Text Online



2022

Wasserstein convergence rates in the invariance...

by Liu, Zhenxin; Wang, Zhe

04/2022

In this paper, we consider the convergence rate with respect to Wasserstein distance in the invariance principle for deterministic nonuniformly hyperbolic systems, where both discrete time systems and flows are included...

Journal Article Full Te


2022

On the Existence of Monge Maps for the Gromov-Wasse...

by Dumont, Théo; Lacombe, Théo; Vialard, François-Xavier

10/2022

...) problem on Euclidean spaces for two different costs. The first one is the scalar product for which we prove that it is always possible to find optimizers as Monge maps and we detail the structure of such optimal maps...

Journal Article Full Te

 

 2022

Robust $Q$-learning Algorithm for Markov Decision Processes under Wasserstein Uncertainty

by Neufeld, Ariel; Sester, Julian

09/2022

We present a novel $Q$-learning algorithm to solve distributionally robust Markov decision problems, where the corresponding ambiguity set of transition...

Journal Article Full Text 


2022


ARTICLE

Morphological Classification of Radio Galaxies with wGAN-supported Augmentation

Rustige, Lennart ; Kummer, Janis ; Griese, Florian ; Borras, Kerstin ; Brüggen, Marcus ; Connor, Patrick L. S ; Gaede, Frank ; Kasieczka, Gregor ; Knopp, Tobias ; Schleper, Peter

2022

 RAS Techniques and Instruments 2 (2023) no.1, 264-277 Machine learning techniques that perform morphological classification of astronomical sources often suffer from a scarcity of labelled training data...

OPEN ACCESS

BrowZine PDF Icon Download PDF 

Morphological Classification of Radio Galaxies with wGAN-supported Augmentation


Detection of false data injection attack in power information physical system based on SVM–GAB algorithm

by Xiong, Xiaoping; Hu, Siding; Sun, Di ; More...

Energy reports, 08/2022, Volume 8

.... Among them, false data injection attack (FDIA) is not easy to be found by traditional bad data detection methods, and becomes one of the main threats to the safe operation of power systems...


2022

A Convolutional Wasserstein Distance for Tractography Evaluation: Complementarity Study to State-of-the-Art Measures

T Durantel, J Coloigner… - 2022 IEEE 19th …, 2022 - ieeexplore.ieee.org

… on the computation of the Wasserstein distance, derived from op… The 2-Wasserstein distance,

simply called Wasserstein dis… in development, our new Wasserstein measure can be used …

Related articles All 5 versions


2022

The Multivariate Rate of Convergence for Selberg's Central Limit Theorem

by Roberts, Asher

arXiv.org, 12/2022

In this paper we quantify the rate of convergence in Selberg's central limit theorem for \(\log|\zeta(1/2...

Paper Full Text Online


 

2022

Wasserstein Distance for Attention based cross modality Person Re-Identification

Murali, N and Mishra, D

19th IEEE-India-Council International Conference (INDICON)

2022 | 

2022 IEEE 19TH INDIA COUNCIL INTERNATIONAL CONFERENCE, INDICON

Cross-modality based person re-identification is a challenging task because of the high inter modality gap present between the RGB and the IR data. These systems have to learn how to discriminate between different identities as well as how to match two modalities. This has been handled using various distance metrics and custom loss functions. But the already used distance metrics might not perf

Full Text at Publisher

32 References. Related records

<——2022———2022———2200— 



2022 patent

Cloud environment intrusion detection method based on WGAN and LightGBM

CN116248344A 裴廷睿 湘潭大学

Filed 2022-12-28 • Published 2023-06-09

6. The cloud environment intrusion detection method according to claim 1, wherein S33 includes the WGAN model taking the preprocessing data as an input of the discrimination model, the random noise as an input of the generation model, performing a reciprocal game using the generation model and the …


2022 patent

Grid deformation data enhancement method based on WGAN-GP model

CN CN115937038A 李静 上海大学

Priority 2022-12-27 • Filed 2022-12-27 • Published 2023-04-07

The invention discloses a method for enhancing grid deformation data based on a WGAN-GP model, which relates to the technical field of image processing and data enhancement and comprises the following steps: constructing a training data set and a test data set; constructing a WGAN-GP model;


2022 patent

Method for selecting drive resistor of eGaN HEMT power converter

CN CN115859889A 贺远航 广东工业大学

Priority 2022-11-16 • Filed 2022-11-16 • Published 2023-03-28

10. The method for selecting the drive resistor of the eGaN HEMT power converter according to claim 1, wherein S3.6 specifically comprises: the method for improving the overall efficiency of the converter is to select a reasonable driving resistor to obtain a reasonable driving waveform, reduce the …


 
2022 patent

Antagonistic sample generation method and system based on WGAN-Unet

CN CN115761399A 秦中元 东南大学

Priority 2022-11-01 • Filed 2022-11-01 • Published 2023-03-07

s3, constructing a WGAN-Unet-based anti-attack network model: the anti-attack network model consists of a generator, a discriminator and the target network model obtained in the step S2, wherein the generator is constructed according to original sample data based on a Unet framework, performs …


 

2022 patent

WGAN-based FRP sheet and concrete interface bonding slippage model generation …

CN CN115544864A 贺畅 同济大学

Priority 2022-09-16 • Filed 2022-09-16 • Published 2022-12-30

2. The method for generating the WGAN-based FRP sheet and concrete interface bonding slip model according to claim 1, wherein the step 1: strain data acquisition is realized by modeling through finite element software LS-DYNA, and data corresponding to strain at every two seconds, namely average …


2022


2022 patent

Vehicle following behavior modeling method based on Transformer-WGAN

CN CN115630683A 徐东伟 浙江工业大学

Priority 2022-09-14 • Filed 2022-09-14 • Published 2023-01-20

2. The method of claim 1, wherein the method for modeling vehicle-following behavior based on Transformer-WGAN is characterized in that: acquiring state sequence data of a plurality of groups of following vehicles and vehicle

  

2022  patent

Micro-seismic signal denoising method combining WGAN-GP and SADNet

CN CN115600089A 余梅 三峡大学(Cn

Priority 2022-09-02 • Filed 2022-09-02 • Published 2023-01-13

1. A micro-seismic signal denoising method combining WGAN-GP and SADNet is characterized by comprising the following steps: inputting a microseism signal sample into a WGAN-GP network, adding a noise signal condition, generating a large number of training sample sets by generating a confrontation …


2022 patent

… system for predicting underground dust concentration of coal mine based on WGAN

CN CN115310361A 秦波涛 中国矿业大学

Priority 2022-08-16 • Filed 2022-08-16 • Published 2022-11-08

9. The WGAN-CNN-based coal mine dust concentration prediction system of claim 8, the prediction model building module further comprises: and the reliability inspection unit is used for inputting the test data set into the coal mine underground dust concentration prediction model to obtain a …

2022 patent

Satellite cloud picture prediction method based on WGAN-GP network and optical …

CN CN115546257A 谈玲 南京信息工程大学

Priority 2022-08-09 • Filed 2022-08-09 • Published 2022-12-30

5. The W

2022  patent

Semi-supervised malicious flow detection method based on improved WGAN-GP

CN CN115314254A 刘胜利 中国人民解放军战略支援部队信息工程大学

Priority 2022-07-07 • Filed 2022-07-07 • Published 2022-11-08

The invention belongs to the technical field of malicious traffic detection, and particularly relates to a semi-supervised malicious traffic detection method based on improved WGAN-GP. The method carries out detection according to the established semi-supervised malicious flow detection model.


2022 patent

XRF-EGAN model-based soil XRF spectrogram background subtraction method

CN CN114861541A 赵彦春 电子科技大学长三角研究院(湖州)

Priority 2022-05-13 • Filed 2022-05-13 • Published 2022-08-05

and step 3: and loading an XRF-EGAN generator network model, carrying out XRF spectrum background deduction on new soil XRF spectrum data measured by an XRF fluorescence analyzer by using the XRF-EGAN generator network, and obtaining output after background deduction. 4. The XRF-EGAN model-based …

<——2022———2022———2210— 



2023 patent

Laplace noise and Wasserstein regularization-based multi-test EEG source …

CN CN116152372A 刘柯 重庆邮电大学

Priority 2023-02-07 • Filed 2023-02-07 • Published 2023-05-23

s4, establishing a multi-test robust EEG diffuse source imaging model based on Laplace noise and Wasserstein regularization in a projection space according to the lead matrix, the difference operator and the minimum distance matrix, and obtaining a multi-test estimated source by utilizing an ADMM …


2022 patent

… recognition domain self-adaption method and system combining Wasserstein …

CN CN115601535A 陈元姣 杭州电子科技大学(Cn

Priority 2022-11-08 • Filed 2022-11-08 • Published 2023-01-13

5. The chest radiograph abnormality recognition domain adaptation method combining Wasserstein distance and difference measure of claim 4, wherein the construction of the total objective function through the obtained Wasserstein distance and contrast domain difference in step S3 comprises the …


Telecom customer churn prediction method based on condition Wasserstein GAN

CN CN115688048A 苏畅 重庆邮电大学

Priority 2022-10-31 • Filed 2022-10-31 • Published 2023-02-03

3. the method of claim 2, wherein the conditional Wasserstein GAN-based telecommunications customer churn prediction method comprises: in S2, the mixed attention mechanism CBAM includes two parts: channel attention module CAM and spatial attention module SAM; the overall attention process for CBAM …


Road network pixelation-based Wasserstein generation countermeasure flow data …

CN CN115510174A 王蓉 重庆邮电大学

Priority 2022-09-29 • Filed 2022-09-29 • Published 2022-12-23

6. The road network pixelation-based Wasserstein generation countermeasure network traffic data interpolation method as claimed in claim 1, wherein the process of repairing missing data by a road network traffic data generation countermeasure network model comprises: splicing traffic flow data to …


Maximum slice Wasserstein measurement-based pedestrian target association …

CN CN115630190A 陈亮 南京信息技术研究院

Priority 2022-09-07 • Filed 2022-09-07 • Published 2023-01-20

5. The method for monitoring network pedestrian target association based on maximum slice Wasserstein measurement as claimed in claim 1, wherein said step S4 specifically comprises the steps of: s401: associating R' according to cross-mirror pedestrian target " j According to (C) i1 T i1,j1 T i, …


2022


2022 see 2018 patent  

Wasserstein distance-based battery SOH estimation method and device

CN CN114839552A 林名强 泉州装备制造研究所

Priority 2022-04-08 • Filed 2022-04-08 • Published 2022-08-02

3. The wasserstein distance-based battery SOH estimation method according to claim 1, wherein: in S1, the aging data of the pouch batteries is specifically aging data of eight nominal 740Ma · h pouch batteries recorded in advance. 4. A wasserstein distance-based battery SOH estimation method …



2022 patent

Conditional Wasserstein generating countermeasure network based motor fault data enhancing method, involves detecting data generated during operation of motor in real time, and determining whether fault of motor occurs to obtain motor failure mode for verifying validity of data enhancement

CN114154405-A

Inventor(s) CHEN Q

Assignee(s) DONGFENG YUEXIANG TECHNOLOGY CO LTD and DONGFENG MOTOR GROUP CO LTD

Derwent Primary Accession Number 

2022-411781


2022 patent

Small sample fault diagnosis method for mechanical pump based on WGAN-GP-C and metric learning, involves reading time sequence data and pre-processing data, and performing fault diagnosis on pre-processed and data enhanced data by using optimized model

CN114037001-A

Inventor(s) ZHANG H; XUN B; (...); WANG X

Assignee(s) PLA NO 92578 TROOPS PLA

Derwent Primary Accession Number 

2022-33301S

2022 patent

Wasserstein-GAN based power system harmonic rule calculating method, involves generating electric power system current harmonic data based on generator of Wasserstein-GAN, obtaining current harmonic rule according to data

CN114217132-A

Inventor(s) YAO C; PAN Y; (...); MEI W

Assignee(s) JIANGSU YIHE ENERGY TECHNOLOGY CO LTD

Derwent Primary Accession Number 

2022-49648K


2022 patent

Method for removing micro-seismic record based on improved wasserstein generative adversarial network and convolutional blind denoising network (CBDNet) involves inputting seismic data to be de-noised into CBDNet, and outputting microseismic data after removing target noise

CN114218982-A

Inventor(s) ZHANG J; TANG J; (...); SHENG G

Assignee(s) UNIV CHINA THREE GORGES

Derwent Primary Accession Number 

2022-46756K

<——2022———2022———2220— 




2022 patent

Distribution robust optimization method based on Wasserstein measurement and nuclear density estimation, involves using nuclear density estimation method to push probability density of wind power prediction error, and establishing distributed robust unit combination model

CN114243683-A

Inventor(s) HOU W

Assignee(s) UNIV ZHOUKOU NORMAL

Derwent Primary Accession Number 

2022-489825


2022 patent

Method for removing shadow of railway fastener image of unmanned aerial vehicle based on non-woven Wasserstein generative adversarial network, involves receiving track image data shot by zoom camera carried on unmanned aerial vehicle, and pre-processing received track image data to expand data set

CN114266713-A

Inventor(s) CHEN P; MU Z; (...); QIN Y

Assignee(s) UNIV BEIJING JIAOTONG and BEIJING SHANGHAI HIGH SPEED RAILWAY CORP

Derwent Primary Accession Number 

2022-51232N


2022 patent

Power system bad data identification method based on improved Wasserstein GAN, involves determining threshold value of bad data based on bad data threshold value setting method of C4.5 decision tree model

CN114330486-A

Inventor(s) ZHU Y; HAN H; (...); ZANG H

Assignee(s) UNIV HOHAI

Derwent Primary Accession Number 

2022-567483


2022 patent

Method for validating simulation data of simulation model of technical system such as software, hardware or embedded system, involves determining metric between first probability distribution with simulation data and second probability distribution with reference data using 1-Wasserstein metric

DE102020213199-A1; US2022121792-A1; CN114385479-A

Inventor(s) SCHMIDT J

Assignee(s) BOSCH GMBH ROBERT

Derwent Primary Accession Number 

2022-53275K


2022


2022 patent

Method for performing industrial quality detection, involves taking registration image as input to discriminator of wasserstein generative adversarial network (WGAN) network, and carrying out industrial detection according to discrimination result outputted by discriminator

CN114399505-A; CN114399505-B

Inventor(s) ZHANG Z; ZHENG J; (...); HANG T

Assignee(s) JIANGSU SVFACTORY CO LTD

Derwent Primary Accession Number 

2022-69038S


2022 patent

 Subject modeling method based on Wasserstein self-encoder and Gaussian mixture distribution as priori in natural language processing technology field, involves representing text data set as input of bag model BOW, and iterating, updating model parameter to optimize generating model

CN114417852-A

Inventor(s) FANG Y and LIU H

Assignee(s) UNIV CHONGQING POSTS & TELECOM

Derwent Primary Accession Number 

2022-64348A


2022 patent

Computer-implemented method for validating simulation data of simulation model of technical system, involves determining total difference between simulation signals and reference signals using p-Wasserstein metric based on total metric

DE102020213892-A1; CN114444250-A; US2022138377-A1

Inventor(s) BAUMANN M; VON KELER J; (...); SCHMITT J

Assignee(s) BOSCH GMBH ROBERT

Derwent Primary Accession Number 

2022-588927


2022 patent

Method for improving former model based on Wasserstein distance in long sequence timing prediction system for influenza-like disease prediction, involves using ProbSparse self-attention mechanism in Informer model, and replacing Kullback-Leibler divergence by Wasserstein distance

CN114444584-A

Inventor(s) ZHU T; CHENG J; (...); ZHANG D

Assignee(s) UNIV CHINA GEOSCIENCES WUHAN

Derwent Primary Accession Number 

2022-66519N


2022 patent

Health monitoring method of electromagnetic valve involves calculating health monitoring index, and using Wasserstein distance for health index, and determining health status of solenoid valve according to health index

CN114611633-A; CN114611633-B

Inventor(s) LU X; WANG P; (...); WANG Q

Assignee(s) CHINA AERODYNAMICS RES & DEV CENT HIGH

Derwent Primary Accession Number 

2022-831279

<——2022———2022———2230—



2022 patent

Method for monitoring state of rotary machine based on Wasserstein depth digital twinning model, involves obtaining health state parameter corresponding to each measuring and evaluating health state of rotating measuring of time period

CN114662712-A

Inventor(s) LIU Y; CHU F; (...); HU W

Assignee(s) UNIV TSINGHUA

Derwent Primary Accession Number 

2022-86388J


2022 patent

Learning device for performing reverse reinforcement learning, has function input unit that receives input of compensation function set by feature to satisfy Lipschitz continuous condition, and update unit that updates parameter of reward function so as to maximize Wasserstein distance

WO2022137520-A1

Inventor(s) RIKI E

Assignee(s) NEC CORP

Derwent Primary Accession Number 

2022-85972V


2022 patent

Early fault detection method based on Wasserstein distance in industrial control system, involves establishing monitoring statistic based on hypothesis test in main space and residual space for judging whether fault is occurred

CN114722888-A

Inventor(s) YE F; CAI J; (...); ZENG J

Assignee(s) UNIV CHINA JILIANG

Derwent Primary Accession Number 

2022-924189


2022 patent

Component authenticating application evidence wasserstein distance algorithm, has set of instructions determining whether maximum value cannot exceed 1 and minimum value cannot be less than 0 according to limitation of boundary

CN114818957-A

Inventor(s) HE H; HE L and XIAO F

Assignee(s) XIAO F

Derwent Primary Accession Number 

2022-A39748


2022 patent

Battery state of health (SOH) estimation method based on wasserstein distance, involves adjusting and determining value of each super parameter, and using later data set to train prediction model of determined super-parameter

CN114839552-A

Inventor(s) YAN C; WU J and LIN M

Assignee(s) QUANZHOU EQUIP MFG INST

Derwent Primary Accession Number 

2022-A7923V

<——2022———2022———2240—


2022


 

2022 patent


Method for selecting cross-project software defect prediction data in software testing, involves calculating Wasserstein distance between each source item data and target item data after pre-processing, and judging similarity of each data

CN114924962-A

Inventor(s) YU Y; HU Z; (...); WU Y

Assignee(s) UNIV BEIHANG

Derwent Primary Accession Number 

2022-B1074L


2022 patent

Method for generating fuzzy test case of controller local area network (CAN) bus based on Wasserstein generative adversarial network (WGAN)-Gradient Penalty (GP), involves obtaining WGAN-GP model to generate multiple virtual CAN data frames

CN114936149-A

Inventor(s) ZHANG H; WAN Z and HUANG K

Assignee(s) KAIYUAN WANGAN IOT TECHNOLOGY WUHAN CO and UNIV HUAZHONG SCI & TECHNOLOGY

Derwent Primary Accession Number 

2022-B6009R

2022 patent

Method for designing ultra-surface unit structure inverse based on improved generative adversarial network, involves obtaining output classification loss and Wasserstein distance, and updating generator model parameter

CN115169235-A

Inventor(s) TAN J; LIU Y; (...); WANG H

Assignee(s) HARBIN INST TECHNOLOGY

Derwent Primary Accession Number 

2022-D0840M

 

2022 patent

Distributed robust optimization method for considering flexibility of power grid, involves constructing distributed robust optimization model based on Wasserstein distance, and utilizing conjugate function to optimize model

CN115425638-A

Inventor(s) SUN W; LIU Z; (...); WANG Y

Assignee(s) STATE GRID SICHUAN ELECTRIC POWER CO

Derwent Primary Accession Number 

2022-F40108


2022 patent

Power equipment infrared image ultra-resolution reconstruction model, has generative adversarial network-instrument network provided with input layer, feature extraction layer and output layer for fixing anti-device network in original wasserstein generative adversarial network anti-device structure

CN115456877-A

Inventor(s) HE W; ZHANG D; (...); LIU H

Assignee(s) UNIV KUNMING SCI & TECHNOLOGY

Derwent Primary Accession Number 

2022-F43291





2022 patent

Data depth enhancing method based on WGAN-GP data generation and Poisson fusion, involves solving Poisson equation according to specified boundary condition to realize continuous update optimization of network by iteration, and realizing continuous on gradient domain

CN114219778-A

Inventor(s) CHEN Y; LIU Z; (...); HOU Y

Assignee(s) UNIV BEIJING TECHNOLOGY

Derwent Primary Accession Number 

2022-46303R


 2022 patent

Method for generating a semantic countermeasure sample based on GAN, involves pre-training the WGAN-GP neural network model by using the local training set, and obtaining the generator and the arbiter

CN114219969-A

Inventor(s) LIU N; XIE H; (...); XIANG X

Assignee(s) CHINA ACAD SPACE TECHNOLOGY

Derwent Primary Accession Number 

2022-470231


2022 patent

Network security imbalanced dataset analysis method based on WGAN dynamic penalty, involves collecting network safety data without label, inputting trained shallow machine learning model, and outputting corresponding prediction label

CN114301667-A

Inventor(s) CHEN Z; ZHANG L; (...); XU Y

Assignee(s) UNIV HANGZHOU DIANZI

Derwent Primary Accession Number 

2022-54829E

2022 patent

Method for performing sEMG data enhancement based on BiLSTM and WGAN-GP network to measure electromyographic activity stimulated by nerve, involves comparing generated sample set and test data set for analyzing model error to judge stability of model training process

CN114372490-A

Inventor(s) QIAN Y and FANG Y

Assignee(s) UNIV HANGZHOU DIANZI

Derwent Primary Accession Number 

2022-60561T

 



2022


2022 patent

Method for type-B ultrasonic image denoising on FC-VoVNet and WGAN used in clinical diagnosis and treatment, involves verifying model on test set while testing model generalization performance

CN114764766-A

Inventor(s) WEI J and CHEN D

Assignee(s) WUXI KEMEIDA MEDICAL TECHNOLOGY CO LTD

Derwent Primary Accession Number 

2022-97333U




2022 patent

Method for removing artifacts from extremely sparse angular computed tomography reconstruction, involves sending image set as input to UNet generator network trained based on WGAN-GP generative adversarial network combined with transformers block

CN115239588-A

Inventor(s) QIN Y; JIANG W; (...); DI J

Assignee(s) UNIV GUANGDONG TECHNOLOGY

Derwent Primary Accession Number 

2022-D5413F


2022 patent

Wasserstein generative adversarial network (WGAN)- convolutional neural network (CNN) coal mine dust concentration prediction method involves obtaining real-time data of coal mine underground respiration dust and characteristic parameter, and inputting reliability verification of prediction model

CN115310361-A

Inventor(s) JIANG W; LI H; (...); QIN B

Assignee(s) UNIV CHINA MINING & TECHNOLOGY BEIJING

Derwent Primary Accession Number 

2022-E22744


2022  

Comparison of Seismic Data Interpolation Performance using U-Net and cWGAN

U-Net cWGAN 이용한 탄성파 탐사 자료 보간 성능 평가

Yu, J and Yoon, D

2022 | 

GEOPHYSICS AND GEOPHYSICAL EXPLORATION (지구물리와 물리탐사)

 25 (3) , pp.140-161

Seismic data with missing traces are often obtained regularly or irregularly due to environmental and economic constraints in their acquisition. Accordingly, seismic data interpolation is an essential step in seismic data processing. Recently, research activity on machine learning-based seismic data interpolation has been flourishing. In particular, convolutional neural network (CNN) and genera


View full text

1 Citation. 29 References. Related records

<——2022———2022———2250—



2022 patent

Method for detecting network security internal threat used in network security abnormality detection technology field, involves expanding generated data to query after finishing detection task based on CWGAN network data enhancement

CN115203683-A

Inventor(s) YANG C; ZHAO F; (...); TAO X

Assignee(s) UNIV GUILIN ELECTRONIC TECHNOLOGY

Derwent Primary Accession Number 

2022-D3106Q


 2022

基于Z-Wasserstein距离的多属性决策扩展Exp-TODIM方法

x-mol.com

https://newsletter.x-mol.com › paper

An extended Exp-TODIM method for multiple attribute decision making based on the Z-Wasserstein distance Expert Systems with Applications ( IF8.5 ) Pub Date : 2022-



[CITATION] An extended Exp-TODIM method for multiple attribute decision making based on the Z-Wasserstein distance

H Sun, Z Yang, Q Cai, G Wei, Z Mo - Expert Systems with Applications, 2023

 Cited by 2 Related articles


2022 thesis

Elchesen, Alexander University of Florida 2022

UNIVERSALITY OF THE WASSERSTEIN DISTANCES AND RELATIVE OPTIMAL TRANSPORT

 

2022 thesis

Talbi, Mehdi Institut Polytechnique de Paris 2022

Mean-field optimal stopping and approximations of partial differential equations on Wasserstein space [Arrêt optimal champ-moyen et approximations d'équations aux dérivées partielles sur l'espace de Wasserstein]

Mathematics Subject Classification: 60—Probability theory and stochastic processes


2022 thesis

Warren, Andrew Carnegie Mellon University 2022

 Nonlocal Wasserstein Geometry: Metric and Asymptotic Properties

Mathematics Subject Classification: 49—Calculus of variations and optimal control

2022


Duvenhage, Rocco

Quadratic Wasserstein metrics for von Neumann algebras via transport plans. (English) Zbl 07734184

J. Oper. Theory 88, No. 2, 289-308 (2022)


WASSERSTEIN GAN BASED FRAMEWORK FOR ADVERSARIAL ATTACKS AGAINST INTRUSION DETECTION SYSTEMS.

books.google.com › books


Fangda Cui · 2022 · ‎ No preview

Intrusion detection system (IDS) detects malicious activities in network flows and is essential for modern communication networks.

<——2022———2022———2257— end 2022.   e22 

 including 3 titles with Vaserstein.

and 1 title with Wasserstein


 


 

start 2023 Wasserstein


CryoSWD: Sliced Wasserstein Distance Minimization for 3D Reconstruction in Cryo-electron Microscopy

M Zehni, Z Zhao - … 2023-2023 IEEE International Conference on …, 2023 - ieeexplore.ieee.org

… Therefore, we propose to replace Wasserstein-1 distance with SWD in the CryoGAN framework, hence the name CryoSWD. In low noise regimes, we show how CryoSWD eliminates …

Related articles All 3 versions
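For context on the sliced Wasserstein distance (SWD) used above: it averages one-dimensional Wasserstein distances over random projection directions. The following is a minimal NumPy sketch of that idea, assuming equal-size point clouds; the function name sliced_wasserstein and all parameters are illustrative and are not taken from the CryoSWD code.

```python
# Minimal NumPy sketch of the sliced Wasserstein distance between two point
# clouds of equal size; names and defaults are illustrative assumptions.
import numpy as np

def sliced_wasserstein(X, Y, n_projections=100, p=2, rng=None):
    """Monte Carlo estimate of SW_p between the empirical measures on X and Y."""
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    # Random directions on the unit sphere.
    theta = rng.normal(size=(n_projections, d))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)
    # Project both clouds onto each direction.
    X_proj = X @ theta.T   # shape (n, n_projections)
    Y_proj = Y @ theta.T
    # In one dimension the optimal coupling simply sorts both samples.
    X_sorted = np.sort(X_proj, axis=0)
    Y_sorted = np.sort(Y_proj, axis=0)
    return np.mean(np.abs(X_sorted - Y_sorted) ** p) ** (1.0 / p)

X = np.random.default_rng(0).normal(size=(256, 3))
Y = np.random.default_rng(1).normal(loc=1.0, size=(256, 3))
print(sliced_wasserstein(X, Y))
```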


2023    see 2022

Two-variable Wasserstein mean of positive operators

https://meetings.ams.org › math › meetingapp.cgi › Paper

by S Kim · 2023 — We study two-variable Wasserstein mean of positive definite operators, as a unique positive definite solution of the nonlinear equation obtained from ...

[CITATION] Two-variable Wasserstein mean of positive operators

S Kim - 2023 Joint Mathematics Meetings (JMM 2023), 2023 - meetings.ams.org
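Background for the entry above, added for reference and not taken from Kim's abstract: for centered Gaussians, i.e. for positive definite matrices, the 2-Wasserstein distance reduces to the Bures-Wasserstein metric, and the weighted Wasserstein mean is its barycenter, characterized by a nonlinear fixed-point equation of the kind the abstract refers to.

```latex
% Bures--Wasserstein distance between positive definite A, B and the
% fixed-point characterization of the weighted Wasserstein mean
% (standard background formulas, generic notation).
\[
  d_W(A,B) = \Bigl(\operatorname{tr} A + \operatorname{tr} B
    - 2\,\operatorname{tr}\bigl(A^{1/2} B A^{1/2}\bigr)^{1/2}\Bigr)^{1/2},
\]
\[
  \Omega(w;A_1,\dots,A_n) = \arg\min_{X>0} \sum_{j=1}^{n} w_j\, d_W^2(X,A_j),
  \qquad
  X = \sum_{j=1}^{n} w_j \bigl(X^{1/2} A_j X^{1/2}\bigr)^{1/2}.
\]
```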



https://meetings.ams.org › math › meetingapp.cgi › Paper

Robust Estimation of Wasserstein Distances

by SB Nietert · 2023 — Robust Estimation of Wasserstein Distances. Thursday, January 5, 2023. 8:00 AM - 12:00 PM EST ...

[CITATION] Robust Estimation of Wasserstein Distances

SB Nietert - 2023 Joint Mathematics Meetings (JMM 2023), 2023 - meetings.ams.org

Gromov Wasserstein distances for uniformly distributed ...

American Mathematical Society

https://meetings.ams.org › meetingapp.cgi › Paper


by A Auddy · 2023 — In this talk we will describe some recent developments towards understanding the GW distances on uniform distributions on the unit balls of dimensions and .

[CITATION] Gromov Wasserstein distances for uniformly distributed points on spheres

A Auddy - 2023 Joint Mathematics Meetings (JMM 2023), 2023 - meetings.ams.org


Related articles 

2023 see 2022 2021   ARTICLE
Least Wasserstein distance between...
by Novack, Michael; Topaloglu, Ihsan; Venkatraman, Raghavendra
Journal of functional analysis, 01/2023, Volume 284, Issue 1
We prove the existence of global minimizers to the double minimization problem [Display omitted] where P(E) denotes the perimeter of the set E, Wp is the...
Article PDF PDF
Journal Article Full Text Online
View in Context Browse Journal
 Zbl 07616882

Cited by 1 Related articles All 6 versions


2023


Wasserstein Labeled Graph Metrics and Stabilities for ...

American Mathematical Society

https://meetings.ams.org › meetingapp.cgi › Paper


by MG Rawson · 2023 — Abstract. We explore metrics and stabilities in labeled graph spaces. Graphs are used to describe and model many systems. Often, stable regions ...

[CITATION] Wasserstein Labeled Graph Metrics and Stabilities for Clustering

MG Rawson - 2023 Joint Mathematics Meetings (JMM 2023), 2023 - meetings.ams.org


Related articles 

2023  ARTICLE

Novel robust minimum error entropy wasserstein distribution kalman filter under model uncertainty and non-gaussian noise

Feng, Zhenyu ; Wang, Gang ; Peng, Bei ; He, Jiacheng ; Zhang, Kun. Signal processing, 2023, Vol.203

 ....•A novel MEE criterion minimax Wasserstein distribution Kalman Filter is proposed.•The proposed algorithm is proven to be convergent...

PEER REVIEWED

Novel robust minimum error entropy wasserstein distribution kalman filter under model uncertainty and non-gaussian noise

Available Online 


Cited by 3 Related articles All 2 versions

2023  ARTICLE

Wasserstein asymptotics for the empirical measure of fractional Brownian motion on a flat torus

Huesmann, Martin ; Mattesini, Francesco ; Trevisan, Dario. Stochastic processes and their applications, 2023, Vol.155, p.1-26

PEER REVIEWED

Wasserstein asymptotics for the empirical measure of fractional Brownian motion on a flat torus

Available Online 

Cited by 1 Related articles All 2 versions
Zbl 07628744


2023   ARTICLE

An extended Exp-TODIM method for multiple attribute decision making based on the Z-Wasserstein distance

Sun, Hong ; Yang, Zhen ; Cai, Qiang ; Wei, Guiwu ; Mo, Zhiwen. Expert systems with applications, 2023, Vol.214

 •The Wasserstein method measures the distance between two Z-numbers.•The Exp-TODIM method for MADM is built on the Z-Wasserstein distance...

PEER REVIEWED

An extended Exp-TODIM method for multiple attribute decision making based on the Z-Wasserstein distance

No Online Access 

Cited by 2 Related articles



2023  ARTICLE

The backward Euler-Maruyama method for invariant measures of stochastic differential equations with super-linear coefficients

Liu, Wei ; Mao, Xuerong ; Wu, Yue. Applied numerical mathematics, 2023, Vol.184, p.137-150

 ....•The convergence rate is r/2 in r-Wasserstein distance. The backward Euler-Maruyama (BEM) method is employed to approximate the invariant measure of stochastic differential...

PEER REVIEWED

OPEN ACCESS

The backward Euler-Maruyama method for invariant measures of stochastic differential equations with super-linear coefficients

Available Online 

<–—2023———2023———10—


2023  ARTICLE

Controlled generation of unseen faults for Partial and Open-Partial domain adaptation

Rombach, Katharina ; Michau, Gabriel ; Fink, Olga. Reliability engineering & system safety, 2023, Vol.230

 ... and Open-Partial domain adaptation based on generating distinct fault signatures with a Wasserstein GAN...

PEER REVIEWED

OPEN ACCESS

Controlled generation of unseen faults for Partial and Open-Partial domain adaptation

No Online Access 


 A New Framework of Quantitative analysis Based on WGAN
by Jiang, Xingru; Jiang, Kaiwen
SHS web of conferences, 2023, Volume 165
This paper follows the logic of financial investment strategies based on WGAN, one of AI algorithms. The trend prediction module and the distribution...
Article PDFPDF
Journal Article  Full Text Online

[CITATION] A New Framework of Quantitative analysis Based on WGAN

X Jiang, K Jiang - SHS Web of Conferences - shs-conferences.org

A New Framework of Quantitative analysis Based on WGAN | SHS Web of Conferences …

All 2 versions

Handwriting Recognition Using Wasserstein Metric in Adversarial Learning

M Jangpangi, S Kumar, D Bhardwaj, BG Kim… - SN Computer …, 2023 - Springer

… mechanism, we try to use Wasserstein Generative Adversarial Network (WGAN) [9] in the model. WGAN is also termed as earth mover's distance (EMD). Wasserstein's function tries to …


Protein secondary structure prediction based on Wasserstein generative adversarial networks and temporal convolutional networks with convolutional block attention …

L Yuan, Y Ma, Y Liu - Mathematical Biosciences and Engineering, 2023 - aimspress.com

… , which combines Wasserstein generative adversarial network with gradient penalty (WGAN-GP), … In the proposed model, the mutual game of generator and discriminator in WGAN-GP …

Related articles All 3 versions
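Several of the entries above and below rely on WGAN-GP, i.e. a Wasserstein GAN critic trained with a gradient penalty that softly enforces the 1-Lipschitz constraint. Below is a hedged PyTorch sketch of that critic loss in the spirit of Gulrajani et al.; the toy linear critic, the tensor shapes, and the name critic_loss_wgan_gp are placeholders, not the architecture of the cited protein-structure model.

```python
# Hedged sketch of a WGAN-GP critic loss; all names and shapes are
# illustrative assumptions, not taken from any of the cited works.
import torch

def critic_loss_wgan_gp(critic, real, fake, lambda_gp=10.0):
    # Wasserstein part: the critic should score real samples high and fakes low.
    loss_w = critic(fake).mean() - critic(real).mean()
    # Gradient penalty on random interpolates between real and fake samples.
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    interp = (eps * real + (1 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads, = torch.autograd.grad(
        outputs=scores, inputs=interp,
        grad_outputs=torch.ones_like(scores), create_graph=True)
    gp = ((grads.flatten(1).norm(2, dim=1) - 1) ** 2).mean()
    return loss_w + lambda_gp * gp

# Toy usage with a linear critic on 8-dimensional vectors.
critic = torch.nn.Linear(8, 1)
real, fake = torch.randn(16, 8), torch.randn(16, 8)
print(critic_loss_wgan_gp(critic, real, fake))
```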

2023


Synthetically Generated High-Resolution Reflected Ultra Violet Imaging System (RUVIS) Database Using Generative Adversarial Network

R Dhaneshwar, M Kaur, M Kaur - ICT Analysis and Applications, 2023 - Springer

… The improved-WGAN (I-WGAN) architecture utilizes Wasserstein distance for calculating the value function which leads to better performance than the available generative approaches …

All 2 versions

2023
Wasserstein asymptotics for the empirical measure of fractional Brownian motion on...
by Huesmann, MartinMattesini, FrancescoTrevisan, Dario
Stochastic processes and their applications, 01/2023, Volume 155
We establish asymptotic upper and lower bounds for the Wasserstein distance of any order p≥1 between the empirical measure of a fractional Brownian motion on a...
View NowPDF
Journal article  Full Text Online
View in Context Browse Journal

Cited by 10 Related articles All 7 versions

2023 see 2022
Isometric rigidity of Wasserstein tori and...
by Gehér, György Pál; Titkos, Tamás; Virosztek, Dániel
Mathematika, 01/2023, Volume 69, Issue 1
Article PDF PDF
Journal Article Full Text Online
View in Context Browse Journal


2023
Using Hybrid Penalty and Gated Linear Units to Improve Wasserstein...

by Zhu, Xiaojun; Huang, Heming
Computer modeling in engineering & sciences, 2023, Volume 135, Issue 3
Article PDF PDF
Journal Article 


Lidar Upsampling With Sliced Wasserstein...
by Savkin, Artem; Wang, Yida; Wirkert, Sebastian ; More...
IEEE robotics and automation letters, 01/2023, Volume 8, Issue 1
Article PDF PDF
Journal Article Full Text Online 

<–—2023———2023———20—



Distributionally robust chance constrained svm model with $\ell_2$-Wasserstein...
by Ma, Qing; Wang, Yanjun

Journal of industrial and management optimization, 2023, Volume 19, Issue 2

In this paper, we propose a distributionally robust chance-constrained SVM model with $\ell_2$...

Article PDF PDF

Journal Article



Small Sample Reliability Assessment With Online Time-Series Data Based on a Worm Wasserstein...

by Sun, Bo; Wu, Zeyu; Feng, Qiang ; More...
IEEE transactions on industrial informatics, 02/2023, Volume 19, Issue 2
The scarcity of time-series data constrains the accuracy of online reliability assessment. Data expansion is the most intuitive way to address this problem....
Article PDF PDF
Journal Article Full Text Online
View in Context Browse Journal


A Novel Graph Kernel Based on the Wasserst...
by Liu, Yantao; Rossi, Luca; Torsello, Andrea
Structural, Syntactic, and Statistical Pattern Recognition, 01/2023
Spectral signatures have been used with great success in computer vision to characterise the local and global topology of 3D meshes. In this paper, we propose...
Book Chapter Full Text Online 

A Novel Graph Kernel Based on the Wasserstein Distance and Spectral Signatures

Related articles All 2 versions

Santambrogio, Filippo

Sharp Wasserstein estimates for integral sampling and Lorentz summability of transport densities. (English) Zbl 07634004

J. Funct. Anal. 284, No. 4, Article ID 109783, 12 p. (2023).

MSC:  49Q22 60B05 49Jxx 42Bxx

PDF BibTeX XML Cite

Sharp Wasserstein estimates for integral sampling and Lorentz summability of transport densities
Related articles
All 2 versions

Peer-reviewed
Wasserstein asymptotics for the empirical measure of fractional Brownian motion on a flat torus

Authors:Martin HuesmannFrancesco MattesiniDario Trevisan
Summary:We establish asymptotic upper and lower bounds for the Wasserstein distance of any order p≥1 between the empirical measure of a fractional Brownian motion on a flat torus and the uniform Lebesgue measure. Our inequalities reveal an interesting interaction between the Hurst index H and the dimension d of the state space, with a “phase-transition” in the rates when d=2+1/H, akin to the Ajtai-Komlós-Tusnády theorem for the optimal matching of i.i.d. points in two-dimensions. Our proof couples PDE’s and probabilistic techniques, and also yields a similar result for discrete-time approximations of the process, as well as a lower bound for the same problem on RdShow more
Article, 2023
Publication:Stochastic Processes and their Applications, 155, 202301, 1
Publisher:2023

Cited by 6 Related articles All 5 versions

2023


MR4504022 Prelim Liao, Qichen; Chen, Jing; Wang, Zihao; Bai, Bo; Jin, Shi; Wu, Hao; Fast Sinkhorn I: an O(N) algorithm for the Wasserstein-1 metric. Commun. Math. Sci. 20 (2022), no. 7, 2053–2067. 49M25 (49Q22 65K10)

Review PDF Clipboard Journal Article

2023  patent news

Chongqing Post and Telecommunication Univ's Patent Application for Theme Modeling Method Based on Wasserstein Auto-Encoder and Gaussian Mixture Distribution as Prior
Global IP News. Telecom Patent News; New Delhi [New Delhi]. 06 Jan 2023.  
Cite  Email
Full Text   DetailsFull text


2023 see 2022  Working Paper
Robust Q-learning Algorithm for Markov Decision Processes under Wasserstein Uncertainty
Neufeld, Ariel; Sester, Julian.
 arXiv.org; Ithaca, Jan 5, 2023.
Cite
Email
  Full Text
Abstract/DetailsGet full text
opens in a new window

Cited by 10 Related articles All 4 versions


2023  Working Paper
WPPNets and WPPFlows: The Power of Wasserstein Patch Priors for Superresolution
Altekrüger, Fabian; Hertrich, Johannes.
 arXiv.org; Ithaca, Jan 5, 2023.
   Cite     Email

Abstract/DetailsGet full text
opens in a new window
MR4613230

Cited by 12 Related articles All 3 versions


Working Paper
Learning Gaussian Mixtures Using the Wasserstein-Fisher-Rao Gradient Flow
Yan, Yuling; Wang, Kaizheng; Rigollet, Philippe.
 arXiv.org; Ithaca, Jan 4, 2023.
Cite  Email
Abstract/DetailsGet full text
opens in a new window

All 2 versions 

Learning Gaussian Mixtures Using the Wasserstein-Fisher-Rao Gradient Flow
by Yan, Yuling; Wang, Kaizheng; Rigollet, Philippe
01/2023
Gaussian mixture models form a flexible and expressive parametric family of distributions that has found applications in a wide variety of applications....
Journal Article  Full Text Online

Cited by 14 Related articles All 2 versions 

<–—2023———2023———30—


Working Paper
Wasserstein convergence rates in the invariance principle for deterministic dynamical systems
Liu, Zhenxin; Wang, Zhe.
 arXiv.org; Ithaca, Jan 3, 2023.
Cite  Email
Abstract/DetailsGet full text
opens in a new window

2023 see 2022  Scholarly Journal
Representing Graphs via Gromov-Wasserstein Factorization
Xu, Hongteng; Liu, Jiachang; Luo, Dixin; Lawrence, Carin.
 IEEE Transactions on Pattern Analysis and Machine Intelligence; New York Vol. 45, Iss. 1,  (2023): 999-1016.
Cite  Email

Citation/Abstract
Abstract/Details Get full textopens in a new window

Scholarly Journal
Small Sample Reliability Assessment With Online Time-Series Data Based on a Worm Wasserstein Generative Adversarial Network Learning Method
Sun, Bo; Wu, Zeyu; Feng, Qiang; Wang, Zili; Ren, Yi; et al.
 IEEE Transactions on Industrial Informatics; Piscataway Vol. 19, Iss. 2,  (2023): 1207-1216.
Cite  Email
Citation/Abstract

Abstract/Details
 

2023 see 2022  Scholarly Journal
Graph Wasserstein Autoencoder-Based Asymptotically Optimal Motion Planning With Kinematic Constraints for Robotic Manipulation

Xia, Chongkun; Zhang, Yunzhou; Coleman, Sonya A; Ching-Yen, Weng; Houde, Liu; et al. IEEE Transactions on Automation Science and Engineering; New York Vol. 20, Iss. 1,  (2023): 244-257.
Cite
  Email
Citation  Details 

2023 see 2022  Scholarly Journal
Lidar Upsampling With Sliced Wasserstein Distance
Savkin, Artem; Wang, Yida; Wirkert, Sebastian; Navab, Nassir; Tombari, Federico.
 IEEE Robotics and Automation Letters; Piscataway Vol. 8, Iss. 1,  (2023): 392-399.
Cite  Email
Citation/Abstract
Abstract/Details 
Working Paper
Lidar Upsampling with Sliced Wasserstein Distance
Savkin, Artem; Wang, Yida; Wirkert, Sebastian; Navab, Nassir; Tombari, Federico. arXiv.org; Ithaca, Jan 31, 2023.
Cite  Email
Save to My Research


2023

 
2023 see 2021  Working Paper
Projection Robust Wasserstein Distance and Riemannian Optimization

Lin, Tianyi; Fan, Chenyou; Ho, Nhat; Cuturi, Marco; Jordan, Michael I.
 arXiv.org; Ithaca, Jan 1, 2023.
Cite  Email
Full Text
Abstract/DetailsGet full text
opens in a new window

 
Scholarly Journal
Exact statistical inference for the Wasserstein distance by selective inference
Duy Vo Nguyen Le; Takeuchi Ichiro.
 Annals of the Institute of Statistical Mathematics; Tokyo Vol. 75, Iss. 1,  (2023): 127-157.
Cite  Email
Citation/Abstract
Abstract/Details Get full textopens in a new window


Scholarly Journal
The existence of minimizers for an isoperimetric problem with Wasserstein penalty term in unbounded domains
Advances in Calculus of Variations; Berlin Vol. 16, Iss. 1,  (Jan 2023): 1-15.
Cite  Email  Full Text
Abstract/Details Get full textopens in a new window

Cited by 8 Related articles All 10 versions

arXiv:2301.04791  [pdfother stat.ML   cs.CV  cs.GR  cs.LG
Self-Attention Amortized Distributional Projection Optimization for Sliced Wasserstein Point-Cloud Reconstruction
Authors: Khai NguyenDang NguyenNhat Ho
Abstract: Max sliced Wasserstein (Max-SW) distance has been widely known as a solution for redundant projections of sliced Wasserstein (SW) distance. In applications that have various independent pairs of probability measures, amortized projection optimization is utilized to predict the ``max" projecting directions given two input measures instead of using projected gradient ascent multiple times. Despite b…  More
Submitted 11 January, 2023; originally announced January 2023.
Comments: 31 pages, 6 figures, 5 tables

Self-Attention Amortized Distributional Projection Optimization for Sliced Wasserstein Point-Cloud Reconstruction
by Nguyen, Khai; Nguyen, Dang; Ho, Nhat
01/2023
Max sliced Wasserstein (Max-SW) distance has been widely known as a solution for redundant projections of sliced Wasserstein (SW) distance. In applications...
Journal Article  Full Text Online


arXiv:2301.04441  [pdfother math.OC   math.NA  math.PR
Wasserstein Gradient Flows of the Discrepancy with Distance Kernel on the Line
Authors: Johannes HertrichRobert BeinertManuel GräfGabriele Steidl
Abstract: This paper provides results on Wasserstein gradient flows between measures on the real line. Utilizing the isometric embedding of the Wasserstein space P2(R) into the Hilbert space L2((0,1)), Wasserstein gradient flows of functionals on P2(R) can be characterized as subgradient flows of associated functionals on L2((0,1)). For the maximum mean discrepa…  More
Submitted 11 January, 2023; originally announced January 2023.
Comments: arXiv admin note: text overlap with arXiv:2211.01804
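The isometric embedding of P2(R) into L2((0,1)) mentioned in the abstract is the quantile-function embedding: on the real line, W2 between equal-size empirical measures is simply the L2 distance between sorted samples. A small NumPy illustration of this fact (function name and example distributions are mine, not the authors'):

```python
# Empirical version of the embedding P2(R) -> L2((0,1)): for equal-size
# samples, W_2 equals the root-mean-square gap between sorted samples
# (discrete quantile functions). Names are illustrative.
import numpy as np

def w2_on_the_line(x, y):
    x, y = np.sort(np.asarray(x)), np.sort(np.asarray(y))
    assert x.shape == y.shape, "sketch assumes equal sample sizes"
    return np.sqrt(np.mean((x - y) ** 2))

rng = np.random.default_rng(0)
print(w2_on_the_line(rng.normal(size=1000), rng.normal(loc=2.0, size=1000)))
# For N(0,1) vs N(2,1) the true W_2 is 2; the estimate should be close.
```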

<–—2023———2023———40—



arXiv:2301.03749  [pdfother stat.ML   cs.LG
Markovian Sliced Wasserstein Distances: Beyond Independent Projections
Authors: Khai NguyenTongzheng RenNhat Ho
Abstract: Sliced Wasserstein (SW) distance suffers from redundant projections due to independent uniform random projecting directions. To partially overcome the issue, max K sliced Wasserstein (Max-K-SW) distance (K≥1), seeks the best discriminative orthogonal projecting directions. Despite being able to reduce the number of projections, the metricity of Max-K-SW cannot be guaranteed in practice due t…  More
Submitted 9 January, 2023; originally announced January 2023.
Comments: 37 pages, 9 figures, 5 tables
Working Paper

Markovian Sliced Wasserstein Distances: Beyond Independent Projections

Nguyen, Khai; Ren, Tongzheng; Ho, Nhat. arXiv.org; Ithaca, Jan 10, 2023.

Cite  Email

Save to My ResearchFull Text

Abstract/DetailsGet full text
opens in a new window
All 2 versions
 


arXiv:2301.03662  [pdfother cs.LG   math.AP  math.OC  math.PR
On adversarial robustness and the use of Wasserstein ascent-descent dynamics to enforce it
Authors: Camilo Garcia TrillosNicolas Garcia Trillos
Abstract: We propose iterative algorithms to solve adversarial problems in a variety of supervised learning settings of interest. Our algorithms, which can be interpreted as suitable ascent-descent dynamics in Wasserstein spaces, take the form of a system of interacting particles. These interacting particle dynamics are shown to converge toward appropriate mean-field limit equations in certain large number…  More
Submitted 9 January, 2023; originally announced January 2023.



2023 see 2021
Xia, Qinglan
Zhou, Bohan

The existence of minimizers for an isoperimetric problem with Wasserstein penalty term in unbounded domains. (English) Zbl 07641998

Adv. Calc. Var. 16, No. 1, 1-15 (2023).

MSC:  49J45 49Q20 49Q05 49J20 60B05

PDF BibTeX XML Cite

Full Text: DOI 

MR4529381 

https://math.dartmouth.edu › publications

Cited by 4
 Related articles All 7 versions


Representing Graphs via Gromov-Wasserstein Factorization

Xu, HT; Liu, JC; (...); Carin, L

Jan 1 2023 | 

IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE

 45 (1) , pp.999-1016

Graph representation is a challenging and significant problem for many real-world applications. In this work, we propose a novel paradigm called "Gromov-Wasserstein Factorization " (GWF) to learn graph representations in a flexible and interpretable way. Given a set of graphs, whose correspondence between nodes is unknown and whose sizes can be different, our GWF model reconstructs each graph b

Show more

Free Published Article From Repository. View full text

74 References  Related records

 

Isometric rigidity of Wasserstein tori and spheres

Geher, GP; Titkos, T and Virosztek, D

Jan 2023 | 

MATHEMATIKA

 69 (1) , pp.20-32

Enriched Cited References

We prove isometric rigidity for p-Wasserstein spaces over finite-dimensional tori and spheres for all p. We present a unified approach to proving rigidity that relies on the robust method of recovering measures from their Wasserstein potentials.

Free Submitted Article From Repository. Full Text at Publisher

19 References  Related records

Cited by 4 Related articles All 2 versions

2023

2023 see 2022  Scholarly Journal

WAD-CMSN: Wasserstein distance-based cross-modal semantic network for zero-shot sketch-based image retrieval

Xu, Guanglong; Hu, Zhensheng; Cai, Jia. International Journal of Wavelets, Multiresolution and Information Processing; Singapore Vol. 21, Iss. 2,  (Mar 2023).

Cite  Email

Save to My Research

Citation/Abstract

Abstract/Details Get full textopens in a new window

MR4527996 


Scholarly Journal

Distributionally robust chance constrained svm model with ℓ2-Wasserstein distance

Ma, Qing; Wang, Yanjun. Journal of Industrial and Management Optimization; Springfield Vol. 19, Iss. 2,  (Feb 2023): 916.

Cite  Email

Save to My Research

Citation/Abstract

Abstract/Details Get full textopens in a new window


2023 see 2022  Scholarly Journal

Wasserstein distance based multi-scale adversarial domain adaptation method for remaining useful life prediction

Shi, Huaitao; Huang, Chengzhuang; Zhang, Xiaochen; Zhao, Jinbao; Li, Sihui. Applied Intelligence; Boston Vol. 53, Iss. 3,  (Feb 2023): 3622-3637.

Cite  Email

Save to My Research

Citation/Abstract

Abstract/Details 


Working Paper

Algebraic Wasserstein distances and stable homological invariants of data

Agerberg, Jens; Guidolin, Andrea; Ren, Isaac; Scolamiero, Martina. arXiv.org; Ithaca, Jan 16, 2023.

Cite  Email

Save to My Research

Full Text

Abstract/DetailsGet full text
opens in a new window

Algebraic Wasserstein distances and stable homological invariants of data
by Agerberg, Jens; Guidolin, Andrea; Ren, Isaac ; More...
01/2023
Distances have an ubiquitous role in persistent homology, from the direct comparison of homological representations of data to the definition and optimization...
Journal Article  Full Text Online

Related articles All 2 versions 

Working Paper

Wasserstein Steepest Descent Flows of Discrepancies with Riesz Kernels

Hertrich, Johannes; Gräf, Manuel; Beinert, Robert; Steidl, Gabriele. arXiv.org; Ithaca, Jan 16, 2023.

Cite  Email

Save to My Research

Full Text

Abstract/DetailsGet full text
opens in a new window

Cited by 9 Related articles All 3 versions

<–—2023———2023———50—


Working Paper

Minimax Q-learning Control for Linear Systems Using the Wasserstein Metric

Zhao, Feiran; You, Keyou. arXiv.org; Ithaca, Jan 16, 2023.

Cite  Email

Save to My Research

Full Text

Abstract/DetailsGet full text
opens in a new window

Minimax Q-learning control for linear systems using the Wasserstein...
by Zhao, Feiran; You, Keyou
Automatica (Oxford), 03/2023, Volume 149
Stochastic optimal control usually requires an explicit dynamical model with probability distributions, which are difficult to obtain in practice. In this...
Article PDFPDF
Journal Article  Full Text Online
Open Access

Working Paper

The Unbalanced Gromov Wasserstein Distance: Conic Formulation and Relaxation

Thibault Séjourné; Vialard, François-Xavier; Peyré, Gabriel. arXiv.org; Ithaca, Jan 16, 2023.

Cite  Email

Save to My Research

Full Text

Abstract/DetailsGet full text
opens in a new window


Working Paper

Wasserstein Logistic Regression with Mixed Features

Aras Selvi; Belbasi, Mohammad Reza; Haugh, Martin B; Wiesemann, Wolfram. arXiv.org; Ithaca, Jan 14, 2023.

Cite  Email

Save to My Research

Full Text

Abstract/DetailsGet full text
opens in a new window


Working Paper

Self-Attention Amortized Distributional Projection Optimization for Sliced Wasserstein Point-Cloud Reconstruction

Nguyen, Khai; Nguyen, Dang; Ho, Nhat. arXiv.org; Ithaca, Jan 12, 2023.

Cite   Email

Save to My Research Full Text

Abstract/DetailsGet full text
… Wasserstein … -SW with distributional sliced Wasserstein distance with von Mises-…

Cited by 6 Related articles All 2 versions 

Working Paper

Wasserstein Gradient Flows of the Discrepancy with Distance Kernel on the Line

Hertrich, Johannes; Beinert, Robert; Gräf, Manuel; Steidl, Gabriele. arXiv.org; Ithaca, Jan 11, 2023.

Cite  Email

Save to My Research

Full Text

Abstract/DetailsGet full text
opens in a new window
All 2 versions
 


Wasserstein Gradient Flows of the Discrepancy with Distance Kernel on the Line
by Hertrich, Johannes; Beinert, Robert; Gräf, Manuel ; More...
01/2023
This paper provides results on Wasserstein gradient flows between measures on the real line. Utilizing the isometric embedding of the Wasserstein space...
Journal Article  Full Text Online

Cited by 6 Related articles All 4 versions


2023


Working Paper

Entropy-regularized Wasserstein distributionally robust shape and topology optimization

Dapogny, Charles; Iutzeler, Franck; Meda, Andrea; Thibert, Boris. arXiv.org; Ithaca, Jan 11, 2023.

Cite  Email

Save to My Research

Full Text

Abstract/DetailsGet full text
opens in a new window

Related articles All 9 versions

 Novel Small Samples Fault Diagnosis Method Based on the Self-attention Wasserstein Generative Adversarial Network
by Shang, Zhiwu; Zhang, Jie; Li, Wanxiang ; More...
Neural processing letters, 01/2023
Article PDFPDF
Journal Article  Full Text Online


Working Paper

On adversarial robustness and the use of Wasserstein ascent-descent dynamics to enforce it

Camilo Garcia Trillos; Nicolas Garcia Trillos. arXiv.org; Ithaca, Jan 9, 2023.

Cite  Email

Save to My Research  Full Text

Abstract/DetailsGet full text
opens in a new window

On adversarial robustness and the use of Wasserstein ascent-descent dynamics to enforce it
by Trillos, Camilo Garcia; Trillos, Nicolas Garcia
01/2023
We propose iterative algorithms to solve adversarial problems in a variety of supervised learning settings of interest. Our algorithms, which can be...
Journal Article  Full Text Online


Working Paper

Kantorovich Strikes Back! Wasserstein GANs are not Optimal Transport?

Korotin, Alexander; Kolesov, Alexander; Burnaev, Evgeny. arXiv.org; Ithaca, Jan 9, 2023.

Cite  Email

Save to My Research  Full Text

Abstract/DetailsGet full text
opens in a new window


2023 see 2022  Working Paper

Efficient Approximation of Gromov-Wasserstein Distance Using Importance Sparsification

Li, Mengyu; Yu, Jun; Xu, Hongteng; Cheng, Meng. arXiv.org; Ithaca, Jan 9, 2023.

Cite  Email

Save to My Research Full Text

Abstract/DetailsGet full text
opens in a new window

Efficient Approximation of Gromov-Wasserstein Distance Using Importance Sparsification
by Li, Mengyu; Yu, Jun; Xu, Hongteng ; More...
Journal of computational and graphical statistics, 01/2023
ArticleView Article PDF
Journal Article  Full Text Online

Cited by 10 Related articles All 3 versions

<–—2023———2023———60—


2023 see 2022  Working Paper

Wasserstein Iterative Networks for Barycenter Estimation

Korotin, Alexander; Egiazarian, Vage; Li, Lingxiao; Burnaev, Evgeny. arXiv.org; Ithaca, Jan 9, 2023.

Cite  Email  

Save to My Research  Full Text


Wasserstein Distributionally Robust Chance-Constrained Program with Moment Information

Z Luo, Y Yin, D Wang, TCE Cheng, CC Wu - Computers & Operations …, 2023 - Elsevier

This paper studies a distributionally robust joint chance-constrained program with a hybrid ambiguity set including the Wasserstein metric, and moment and bounded support …

Cited by 4 Related articles All 3 versions

MR4537607
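For orientation, a generic Wasserstein-ball ambiguity set and distributionally robust chance constraint look as follows. This is standard background notation only; the cited paper's hybrid set additionally incorporates moment and bounded-support information, which is not reproduced here.

```latex
% Generic Wasserstein ambiguity set and distributionally robust chance
% constraint (background notation, not the paper's exact formulation).
\[
  \mathcal{B}_\varepsilon(\widehat{P}_N)
  = \bigl\{ Q : W_1\bigl(Q, \widehat{P}_N\bigr) \le \varepsilon \bigr\},
  \qquad
  \min_{x} \; c^\top x
  \quad \text{s.t.} \quad
  \inf_{Q \in \mathcal{B}_\varepsilon(\widehat{P}_N)}
    Q\bigl[\, g(x,\xi) \le 0 \,\bigr] \ge 1-\alpha .
\]
```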

2023 see 2022

Dynamical mode recognition of triple flickering buoyant diffusion flames in Wasserstein space

Y Chi, T YangP Zhang - Combustion and Flame, 2023 - Elsevier

… to it can be quantified by the Wasserstein distance between their probability distributions of … points and Wasserstein distance as the metric; this metric space is called Wasserstein space. …

Cited by 11 Related articles All 5 versions

A novel conditional weighting transfer Wasserstein auto-encoder for rolling bearing fault diagnosis with multi-source domains

K Zhao, F Jia, H Shao - Knowledge-Based Systems, 2023 - Elsevier

… Wasserstein distance is regarded as an optimal transport problem, and its purpose is to search an optimal transport strategy. Wasserstein distance can not only measure the distance …

 A novel conditional weighting transfer Wasserstein auto-encoder for rolling bearing fault diagnosis with...
by Zhao, KeJia, FengShao, Haidong
Knowledge-based systems, 02/2023, Volume 262
Transfer learning based on a single source domain to a target domain has received a lot of attention in the cross-domain fault diagnosis tasks of rolling...
Journal ArticleCitation Online
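As the snippet above notes, the Wasserstein distance is defined through an optimal transport problem; for reference, in generic notation that is not specific to the cited paper:

```latex
% p-Wasserstein distance as an optimal transport (Kantorovich) problem.
\[
  W_p(\mu,\nu)
  = \Bigl( \inf_{\pi \in \Pi(\mu,\nu)}
      \int d(x,y)^p \, d\pi(x,y) \Bigr)^{1/p},
\]
% where \Pi(\mu,\nu) is the set of couplings (transport plans) whose
% marginals are \mu and \nu.
```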

Wasserstein generative digital twin model in health monitoring of rotating machines

W HuT Wang, F Chu - Computers in Industry, 2023 - Elsevier

… The Wasserstein GAN (WGAN) is among the most … the WGAN is employed as the core of the entire configuration instead of the conventional simulation model. Under the Wasserstein …
Cited by 6 Related articles All 2 versions


 2023


2023 see 2022  [HTML] sciencedirect.com

[HTML] Bounding Kolmogorov distances through Wasserstein and related integral probability metrics

RE Gaunt, S Li - Journal of Mathematical Analysis and Applications, 2023 - Elsevier

… their smooth Wasserstein distance … Wasserstein metric. It should also be noted that whilst we have provided our 

 Cited by 5 Related articles All 5 versions

MR4533915 
Cited by 8 Related articles All 5 versions

[PDF] arxiv.org

Limit theorems in Wasserstein distance for empirical measures of diffusion processes on Riemannian manifolds

FY Wang, JX Zhu - Annales de l'Institut Henri Poincaré …, 2023 - projecteuclid.org

… where Ex is the expectation with respect to the process with initial condition x and W2 is the L2-Wasserstein distance associated with the Riemannian metric of the space. The limit is finite if and …

Cited by 11 Related articles All 3 versions

MR4533736

[PDF] arxiv.org

LiAll 2 versionsmit theorems in Wasserstein distance for empirical measures of diffusion processes on Riemannian manifolds

FY Wang, JX Zhu - Annales de l'Institut Henri Poincaré …, 2023 - projecteuclid.org

… where E_x denotes expectation with respect to the process with initial condition x and W_2 is the L^2-Wasserstein distance associated with the … Cited by 7 Related articles All 2 versions


[PDF] arxiv.org

Signature-Wasserstein-1 metric

PD Lozano, TL Bagén, J Vives - arXiv preprint arXiv:2301.01315, 2023 - arxiv.org

… Unordered Wasserstein-1 metric: We compare the Wasserstein-1 distance between the real one dimensional distributions that are given by: a) taking the data points yt from the output …
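
Background note (generic identity, not specific to this preprint): for one-dimensional distributions such as those compared in the snippet above, the Wasserstein-1 distance has the closed form

$$W_1(\mu,\nu) = \int_{\mathbb{R}} \bigl|F_\mu(t) - F_\nu(t)\bigr| \, \mathrm{d}t,$$

where $F_\mu$ and $F_\nu$ denote the cumulative distribution functions of $\mu$ and $\nu$; this is what makes the one-dimensional comparisons cheap to evaluate.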


  2023 see 2022  [PDF] sns.it

Sharp Wasserstein estimates for integral sampling and Lorentz summability of transport densities

F Santambrogio - Journal of Functional Analysis, 2023 - Elsevier

We prove some Lorentz-type estimates for the average in time of suitable geodesic interpolations of probability measures, obtaining as a by product a new estimate for transport …

<–—2023———2023———70—


 2023 see 2022

Handwriting Recognition Using Wasserstein Metric in Adversarial Learning

M Jangpangi, S Kumar, D Bhardwaj, BG Kim… - SN Computer …, 2023 - Springer

… mechanism, we try to use Wasserstein Generative Adversarial Network (WGAN) [9] in the model. WGAN is also termed as earth mover’s distance (EMD). Wasserstein’s function tries to …



2023 see 2022  Peer-reviewed
Representing Graphs via Gromov-Wasserstein Factorization

Authors: Lawrence Carin, Dixin Luo, Jiachang Liu, Hongteng Xu
Summary:Graph representation is a challenging and significant problem for many real-world applications. In this work, we propose a novel paradigm called “Gromov-Wasserstein Factorization” (GWF) to learn graph representations in a flexible and interpretable way. Given a set of graphs, whose correspondence between nodes is unknown and whose sizes can be different, our GWF model reconstructs each graph by a weighted combination of some “graph factors” under a pseudo-metric called Gromov-Wasserstein (GW) discrepancy. This model leads to a new nonlinear factorization mechanism of the graphs. The graph factors are shared by all the graphs, which represent the typical patterns of the graphs’ structures. The weights associated with each graph indicate the graph factors’ contributions to the graph's reconstruction, which lead to a permutation-invariant graph representation. We learn the graph factors of the GWF model and the weights of the graphs jointly by minimizing the overall reconstruction error. When learning the model, we reparametrize the graph factors and the weights to unconstrained model parameters and simplify the backpropagation of gradient with the help of the envelope theorem. For the GW discrepancy (the critical training step), we consider two algorithms to compute it, which correspond to the proximal point algorithm (PPA) and Bregman alternating direction method of multipliers (BADMM), respectively. Furthermore, we propose some extensions of the GWF model, including (i) combining with a graph neural network and learning graph representations in an auto-encoding manner, (ii) representing the graphs with node attributes, and (iii) working as a regularizer for semi-supervised graph classification. Experiments on various datasets demonstrate that our GWF model is comparable to the state-of-the-art methods. The graph representations derived by it perform well in graph clustering and classification tasks.
Article, 2023
Publication:IEEE Transactions on Pattern Analysis & Machine Intelligence, 45, 202301, 999
Publisher:2023
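
Background sketch (the generic squared-loss GW objective, not necessarily the exact variant implemented in the paper): the Gromov-Wasserstein discrepancy used by the GWF model compares two graphs through their intra-graph (dis)similarity matrices $C^{(1)}, C^{(2)}$ and node weights $p, q$:

$$\mathrm{GW}(C^{(1)}, p; C^{(2)}, q) = \min_{T \in \Pi(p,q)} \sum_{i,k} \sum_{j,l} \bigl|C^{(1)}_{ik} - C^{(2)}_{jl}\bigr|^2 \, T_{ij} T_{kl},$$

where $\Pi(p,q)$ is the set of couplings with marginals $p$ and $q$; the PPA and BADMM solvers mentioned in the summary are two ways of approximating this non-convex problem.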

Towards inverse modeling of landscapes using the Wasserstein distance

MJ Morris, AG Lipp, GG Roberts - Authorea Preprints, 2023 - authorea.com

… noise introduced to force channelisation, the widely used Euclidean measures of similarity (eg … Instead, we introduce the Wasserstein distance as a means to measure misfit between …



2023 see 2022  Peer-reviewed
Isometric rigidity of Wasserstein tori and spheres

Authors: György Pál Gehér, Tamás Titkos, Dániel Virosztek
Article, 2023
Publication:Mathematika, 69, 2023, 20
Publisher:2023

Cited by 2 Related articles All 2 versions




Lidar Upsampling With Sliced Wasserstein Distance

Authors: Artem Savkin, Yida Wang, Sebastian Wirkert, Nassir Navab, Federico Tombari
Article, 2023
Publication:IEEE Robotics and Automation Letters, 8, 202301, 392
Publisher:2023
arXiv 2023

Lidar Upsampling With Sliced Wasserstein Distance

Exact statistical inference for the Wasserstein distance by selective inference: Selective Inference for the Wasserstein Distance
by Duy, Vo Nguyen Le; Takeuchi, Ichiro
Annals of the Institute of Statistical Mathematics, 2023, Volume 75, Issue 1
In this paper, we study statistical inference for the Wasserstein distance, which has attracted much attention and has been applied to various machine learning...
Article PDFPDF
Journal Article  Full Text Online
View in Context Browse Journal
Open Access

Cited by 5 Related articles All 4 versions

MR4530140 
 
 2023 see 2021
Scenario Reduction Network Based on Wasserstein Distance with Regularization
by Dong, Xiaochong; Sun, Yingyun; Malik, Sarmad Majeed ; More...
IEEE transactions on power systems, 2023
Article PDFPDF
Journal Article  Full Text Online
View in Context Browse Journal
Cited by 1
 Related articles All 2 versions

 2023 see 2022
Stable parallel training of Wasserstein conditional generative adversarial neural networks
by Lupo Pasini, Massimiliano; Yin, Junqi
The Journal of supercomputing, 2023, Volume 79, Issue 2
We propose a stable, parallel approach to train Wasserstein conditional generative adversarial neural networks (W-CGANs) under the constraint of a fixed...
Article PDFPDF
Journal Article  Full Text Online
View in Context Browse Journal

Cited by 2 Related articles All 9 versions


 2023 see 2022
Wasserstein generative adversarial networks for modeling marked events
by Dizaji, S. Haleh S.; Pashazadeh, Saeid; Niya, Javad Musevi
The Journal of supercomputing, 2023, Volume 79, Issue 3
Marked temporal events are ubiquitous in several areas, where the events’ times and marks (types) are usually interrelated. Point processes and their...
Article PDFPDF
Journal Article  Full Text Online
Cited by 1
 Related articles All 2 versions


Distributionally robust chance constrained svm model with $\ell_2$-Wasserstein distance
by Ma, Qing; Wang, Yanjun
Journal of industrial and management optimization, 2023, Volume 19, Issue 2
In this paper, we propose a distributionally robust chance-constrained SVM model with $\ell_2$-...
Article PDFPDF
Journal ArticleCitation Online

Related articles All 2 versions

<–—2023———2023———80—


Using Hybrid Penalty and Gated Linear Units to Improve Wasserstein Generative Adversarial Networks for Single-Channel Speech Enhancement
by Zhu, Xiaojun; Huang, Heming
Computer modeling in engineering & sciences, 2023, Volume 135, Issue 3
Article PDFPDF
Journal ArticleCitation Online

IMPROVING SEMANTIC SEGMENTATION OF HIGH-RESOLUTION REMOTE SENSING IMAGES USING WASSERSTEIN GENERATIVE ADVERSARIAL NETWORK
by Hosseinpour, H. R.; Samadzadegan, F.; Dadrass Javan, F. ; More...
International archives of the photogrammetry, remote sensing and spatial information sciences., 01/2023, Volume XLVIII-4/W2-2022
Semantic segmentation of remote sensing images with high spatial resolution has many applications in a wide range of problems in this field. In recent years,...
Article PDFPDF
Journal Article  Full Text Online
More Options 

Markovian Sliced Wasserstein Distances: Beyond Independent Projections
by Nguyen, Khai; Ren, Tongzheng; Ho, Nhat
01/2023
Sliced Wasserstein (SW) distance suffers from redundant projections due to independent uniform random projecting directions. To partially overcome the issue,...
Journal Article  Full Text Online
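
Background sketch in Python (our own illustrative code, not the Markovian variant proposed in the paper): the plain sliced Wasserstein distance that this line of work starts from is a Monte Carlo average of one-dimensional Wasserstein distances over random projection directions.

import numpy as np
from scipy.stats import wasserstein_distance

def sliced_wasserstein(X, Y, n_projections=100, seed=0):
    """Monte Carlo estimate of the sliced Wasserstein-1 distance between
    point clouds X (n, d) and Y (m, d) using uniform random directions."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_projections):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)        # uniform direction on the unit sphere
        # 1D Wasserstein-1 distance between the projected samples
        total += wasserstein_distance(X @ theta, Y @ theta)
    return total / n_projections

The redundancy the paper targets comes precisely from drawing these directions independently and uniformly.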

On adversarial robustness and the use of Wasserstein ascent-descent dynamics to enforce it
by Camilo Garcia Trillos; Nicolas Garcia Trillos
arXiv.org, 01/2023
We propose iterative algorithms to solve adversarial problems in a variety of supervised learning settings of interest. Our algorithms, which can be...
Paper  Full Text Online
Related articles


On the Existence of Monge Maps for the Gromov-Wasserstein Problem
by Dumont, Théo; Lacombe, Théo; Vialard, François-Xavier
arXiv.org, 01/2023
In this work, we study the structure of minimizers of the quadratic Gromov--Wasserstein (GW) problem on Euclidean spaces for two different costs. The first one...
Paper  Full Text Online

2023

Robust \(Q\)-learning Algorithm for Markov Decision Processes under Wasserstein Uncertainty
by Neufeld, Ariel; Sester, Julian
arXiv.org, 01/2023
We present a novel \(Q\)-learning algorithm to solve distributionally robust Markov decision problems, where the corresponding ambiguity set of transition...
Paper  Full Text Online


A Novel Graph Kernel Based on the Wasserstein Distance and Spectral Signatures
by Liu, Yantao; Rossi, Luca; Torsello, Andrea
Structural, Syntactic, and Statistical Pattern Recognition, 01/2023
Spectral signatures have been used with great success in computer vision to characterise the local and global topology of 3D meshes. In this paper, we propose...
Book Chapter  Full Text Online


2023 patent news 
State Intellectual Property Office of China Receives Chongqing Post and Telecommunication Univ's Patent Application for Theme Modeling Method Based on Wasserstein Auto-Encoder and Gaussian...
Global IP News. Telecom Patent News, Jan 6, 2023
Newspaper Article  Full Text Online


A Novel Small Samples Fault Diagnosis Method Based on the Self-attention Wasserstein Generative Adversarial Network
by Shang, Zhiwu; Zhang, Jie; Li, Wanxiang; More...
Neural processing letters, 01/2023
ArticleView Article PDF
Journal Article  Full Text Online


arXiv:2301.07963  [pdfother math.OC   math.PR
Wasserstein-type metric for generic mixture models, including location-scatter and group invariant measures
Authors: Geneviève Dusson, Virginie Ehrlacher, Nathalie Nouaime
Abstract: In this article, we study Wasserstein-type metrics and corresponding barycenters for mixtures of a chosen subset of probability measures called atoms hereafter. In particular, this works extends what was proposed by Delon and Desolneux [A Wasserstein-Type Distance in the Space of Gaussian Mixture Models. SIAM J. Imaging Sci. 13, 936-970 (2020)] for mixtures of gaussian measures to other mixtures.…  More
Submitted 19 January, 2023; originally announced January 2023.

Related articles 
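
Background formula (standard, not taken from this preprint): the Gaussian-mixture construction of Delon and Desolneux cited above rests on the closed form of the 2-Wasserstein distance between Gaussian atoms,

$$W_2^2\bigl(\mathcal{N}(m_1,\Sigma_1), \mathcal{N}(m_2,\Sigma_2)\bigr) = \|m_1 - m_2\|^2 + \operatorname{tr}\Bigl(\Sigma_1 + \Sigma_2 - 2\bigl(\Sigma_1^{1/2}\Sigma_2\Sigma_1^{1/2}\bigr)^{1/2}\Bigr),$$

which is what makes restricting the transport problem to mixtures of such atoms computationally attractive.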

<–—2023———2023———90-—


arXiv:2301.07934  [pdfpsother math.FA
Weak log-majorization between the spectral geometric and Wasserstein means
Authors: Luyining Gan, Sejong Kim
Abstract: In this paper, we establish the weak log-majorization between the spectra of the Wasserstein mean and the spectral geometric mean for two positive definite Hermitian matrices. In particular, for a specific range of the parameter, the two-variable Wasserstein mean converges decreasingly to the log-Euclidean mean with respect to the weak log-majorization.
Submitted 19 January, 2023; originally announced January 2023.
Comments: 12 pages


Duy, Vo Nguyen Le; Takeuchi, Ichiro

Exact statistical inference for the Wasserstein distance by selective inference. Selective inference for the Wasserstein distance. (English) Zbl 07643834

Ann. Inst. Stat. Math. 75, No. 1, 127-157 (2023).

MSC:  62-XX

PDF BibTeX XML Cite

Full Text: DOI  

) Zbl 07643834

2023 see 2022

Graph Wasserstein Autoencoder-Based Asymptotically Optimal Motion Planning With Kinematic Constraints for Robotic Manipulation

Chongkun Xia; Yunzhou Zhang; Sonya A. Coleman; Ching-Yen Weng; Houde Liu; Shichang Liu; I-Ming Chen

IEEE Transactions on Automation Science and Engineering

Year: 2023 | Volume: 20, Issue: 1 | Journal Article | Publisher: IEEE
Abstract 
HTML


 

arXiv:2301.09411  [pdfother astro-ph.CO   astro-ph.GA
Wasserstein distance as a new tool for discriminating cosmologies through the topology of large scale structure
Authors: Maksym Tsizh, Vitalii Tymchyshyn, Franco Vazza
Abstract: In this work we test Wasserstein distance in conjunction with persistent homology, as a tool for discriminating large scale structures of simulated universes with different values of the $\sigma_8$ cosmological parameter (present root-mean-square matter fluctuation averaged over a sphere of radius 8 Mpc comoving). The Wasserstein distance (a.k.a. the pair-matching distance) was proposed to measure the diff…  More
Submitted 23 January, 2023; originally announced January 2023.
Comments: submitted to Monthly notices of the royal astronomical society
All 6 versions


arXiv:2301.08420  [pdfpsother math.PR
Convergence in Wasserstein Distance for Empirical Measures of Non-Symmetric Subordinated Diffusion Processes
Authors: Feng-Yu Wang
Abstract: By using the spectrum of the underlying symmetric diffusion operator, the convergence in Wasserstein distance is characterized for the empirical measure of non-symmetric subordinated diffusion processes in an abstract framework. In particular, let $\mu(\mathrm{d}x) := e^{V(x)}\,\mathrm{d}x$ be a probability measure on an $n$-dimensional compact connected Riemannian manifold $M$ …


2023


[PDF] arxiv.org

A kernel formula for regularized Wasserstein proximal operators

W Li, S Liu, S Osher - arXiv preprint arXiv:2301.10301, 2023 - arxiv.org

… One has to develop an optimization step to compute or approximate Wasserstein metrics …

the Wasserstein proximal operator. We use an optimal control formulation of the Wasserstein …

 

2023 see 2022. [PDF] arxiv.org

On a linear fused Gromov-Wasserstein distance for graph structured data

DH NguyenK Tsuda - Pattern Recognition, 2023 - Elsevier

We present a framework for embedding graph structured data into a vector space, taking

into account node features and structures of graphs into the optimal transport (OT) problem. …

Cited by 2 Related articles All 3 versions

2023 see 2022  [PDF] arxiv.org

Invariance encoding in sliced-Wasserstein space for image classification with limited training data

M Shifat-E-Rabbi, Y Zhuang, S Li, AHM Rubaiyat… - Pattern Recognition, 2023 - Elsevier

Deep convolutional neural networks (CNNs) are broadly considered to be state-of-the-art

generic end-to-end image classification systems. However, they are known to underperform …

 Cited by 1 Related articles All 3 versions


A Novel Graph Kernel Based on the Wasserstein Distance and Spectral Signatures

Y Liu, L Rossi, A Torsello - … , and Statistical Pattern Recognition: Joint IAPR …, 2023 - Springer

Spectral signatures have been used with great success in computer vision to characterise the

local and global topology of 3D meshes. In this paper, we propose to use two widely used …

Related articles All 2 versions


[PDF] arxiv.org

Least Wasserstein distance between disjoint shapes with perimeter regularization

M Novack, I Topaloglu, R Venkatraman - Journal of Functional Analysis, 2023 - Elsevier

We prove the existence of global minimizers to the double minimization problem where P (

E ) denotes the perimeter of the set E, W p is the p-Wasserstein distance between Borel …

Cited by 4 Related articles All 11 versions

<–—2023———2023———100-—


[PDF] berkeley.edu

[PDF] Wasserstein Distance

B Sturmfels - 2023 - math.berkeley.edu

Familiar examples of polyhedral norms are||·||∞ and||·|| 1, where the unit ball B is the cube

and the crosspolytope respectively. Polyhedral norms are very important in optimal transport …

Related articles 


 2023 see 2021  [PDF] aimsciences.org

Distributionally robust chance constrained svm model with ℓ2-Wasserstein distance

Q Ma, Y Wang - Journal of Industrial and Management …, 2023 - aimsciences.org

… -Wasserstein ambiguity. We present equivalent formulations of distributionally robust chance

constraints based on 2Wasserstein … problem when the 2-Wasserstein distance is discrete …

Related articles All 2 versions

MR4509255 
 

Related articles All 3 versions

[PDF] uu.nl

Stock Price Simulation under Jump-Diffusion Dynamics: A WGAN-Based Framework with Anomaly Detection

R Gan - 2023 - studenttheses.uu.nl

… GAN model-Wasserstein GAN and Wasserstein GAN with gradient penalty, which can

effectively prevent the GANs failure modes because of the introduced Wasserstein loss. Chapter 5 …

 

Comparing Beta-VAE to WGAN-GP for Time Series Augmentation to Improve Classification Performance

D Kavran, B Žalik, N Lukač - … , ICAART 2022, Virtual Event, February 3–5 …, 2023 - Springer

… A comparison is presented between Beta-VAE and WGAN-GP as … The use of WGAN-GP

generated synthetic sets to train … the use of Beta-VAE and WGAN-GP generated synthetic sets …

 Related articles


2023z1.   Z1

[CITATION] An extended Exp-TODIM method for multiple attribute decision making based on the Z-Wasserstein distance

H Sun, Z Yang, Q Cai, G Wei, Z Mo - Expert Systems with Applications, 2023

Cited by 5 Related articles
  2023z5

[CITATION] An extended Exp-TODIM method for multiple attribute decision making based on the Z-Wasserstein distance

H Sun, Z Yang, Q Cai, G Wei, Z Mo - Expert Systems with Applications, 2023

Cited by 5 Related articles



2023


2023 see 2022  [PS] uc.pt

[PS] … ), 301-316.[24] Weed, J., & Bach, F.(2019). Sharp asymptotic and finite-sample rates of convergence of empirical measures in Wasserstein distance. Bernoulli 25 …

PE OLIVEIRA, N PICADO - surfaces - mat.uc.pt

Let M be a compact manifold of Rd. The goal of this paper is to decide, based on a sample of

points, whether the interior of M is empty or not. We divide this work in two main parts. Firstly…

 Related articles 



Comparing Beta-VAE to WGAN-GP for Time Series Augmentation to Improve Classification...
by Kavran, Domen; Žalik, Borut; Lukač, Niko
Agents and Artificial Intelligence, 01/2023
Datasets often lack diversity to train robust classification models, capable of being used in real-life scenarios. Neural network-based generative models learn...
Book Chapter  Full Text Online

A multi-period emergency medical service location problem based on Wasserstein-metric...
by Yuan, Yuefei; Song, Qiankun; Zhou, Bo
International journal of systems science, 01/2023
Journal ArticleCitation Online

 
IWGAN: Anomaly Detection in Airport Based on Improved Wasserstein Generative Adversarial...
by Huang, Ko-Wei; Chen, Guan-Wei; Huang, Zih-Hao ; More...
Applied sciences, 01/2023, Volume 13, Issue 3
Anomaly detection is an important research topic in the field of artificial intelligence and visual scene understanding. The most significant challenge in...
Article PDFPDF
Journal Article  Full Text Online
 
Wasserstein distance as a new tool for discriminating cosmologies through the...
by Tsizh, Maksym; Tymchyshyn, Vitalii; Vazza, Franco
01/2023
In this work we test Wasserstein distance in conjunction with persistent homology, as a tool for discriminating large scale structures of simulated universes...

Wasserstein distance as a new tool for discriminating cosmologies through the topology

All 6 versions
<–—2023———2023———110-—

 

  Convergence in Wasserstein Distance for Empirical Measures of Non-Symmetric...

by Wang, Feng-Yu

arXiv.org, 01/2023

By using the spectrum of the underlying symmetric diffusion operator, the convergence in Wasserstein distance is characterized for the empirical measure of...

Paper  Full Text Online

 
A multi-period emergency medical service location problem based on Wasserstein-metric approach using generalised...

by Yuan, Yuefei; Song, Qiankun; Zhou, Bo
International journal of systems science, 01/2023
Journal ArticleCitation Online

 
IWGAN: Anomaly Detection in Airport Based on Improved Wasserstein Generative Adversarial Network
by Huang, Ko-Wei; Chen, Guan-Wei; Huang, Zih-Hao; More...
Applied sciences, 01/2023, Volume 13, Issue 3
Anomaly detection is an important research topic in the field of artificial intelligence and visual scene understanding. The most significant challenge in...
ArticleView Article PDF
Journal Article  Full Text Online
View Complete Issue Browse Now
[HTML] mdpi.com

[HTML] IWGAN: Anomaly Detection in Airport Based on Improved Wasserstein Generative Adversarial Network

KW Huang, GW Chen, ZH Huang, SH Lee - Applied Sciences, 2023 - mdpi.com

… vector space of all images, the present … Wasserstein-GAN (WGAN) and Skip-GANomaly

models to distinguish between normal and abnormal images, is called the Improved Wasserstein …

Cited by 1 Related articles 

.arXiv:2301.11624  [pdfother cs.LG  math.OC math.PR
Neural Wasserstein Gradient Flows for Maximum Mean Discrepancies with Riesz Kernels
Authors: Fabian Altekrüger, Johannes Hertrich, Gabriele Steidl
Abstract: Wasserstein gradient flows of maximum mean discrepancy (MMD) functionals with non-smooth Riesz kernels show a rich structure as singular measures can become absolutely continuous ones and conversely. In this paper we contribute to the understanding of such flows. We propose to approximate the backward scheme of Jordan, Kinderlehrer and Otto for computing such Wasserstein gradient flows as well as…  More
Submitted 27 January, 2023; originally announced January 2023.
Comments: arXiv admin note: text overlap with arXiv:2211.01804

Cited by 12 Related articles All 6 versions 
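
Background sketch (the standard JKO step, stated only for orientation): the backward scheme of Jordan, Kinderlehrer and Otto mentioned in this abstract discretizes the Wasserstein gradient flow of a functional $\mathcal{F}$ by the proximal iteration

$$\mu_{k+1} \in \operatorname*{arg\,min}_{\mu \in \mathcal{P}_2} \; \mathcal{F}(\mu) + \frac{1}{2\tau} W_2^2(\mu, \mu_k), \qquad \tau > 0,$$

so each step trades off decreasing $\mathcal{F}$ against staying Wasserstein-close to the previous measure.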


arXiv:2301.11496  [pdfother math.ST
On Excess Mass Behavior in Gaussian Mixture Models with Orlicz-Wasserstein Distances
Authors: Aritra Guha, Nhat Ho, XuanLong Nguyen
Abstract: Dirichlet Process mixture models (DPMM) in combination with Gaussian kernels have been an important modeling tool for numerous data domains arising from biological, physical, and social sciences. However, this versatility in applications does not extend to strong theoretical guarantees for the underlying parameter estimates, for which only a logarithmic rate is achieved. In this work, we (re)intro…  More
Submitted 26 January, 2023; originally announced January 2023.
Comments: 2 figures
Cited by 2
 All 2 versions 
 

2023


arXiv:2301.10301  [pdfother math.OC   math.NA
A kernel formula for regularized Wasserstein proximal operators
Authors: Wuchen Li, Siting Liu, Stanley Osher
Abstract: We study a class of regularized proximal operators in Wasserstein-2 space. We derive their solutions by kernel integration formulas. We obtain the Wasserstein proximal operator using a pair of forward-backward partial differential equations consisting of a continuity equation and a Hamilton-Jacobi equation with a terminal time potential function and an initial time density function. We regularize…  More
Submitted 24 January, 2023; originally announced January 2023.

Cited by 3 Related articles All 4 versions

 Wasserstein Barycenter and Its Application to Texture Mixing

https://link.springer.com › chapter

by J Rabin · 2012 · Cited by 544 — Wasserstein Barycenter and Its Application to Texture Mixing ... ACM Trans. on Graphics 24, 777–786 (2005) ... 2023 Springer Nature Switzerland AG.


Learning Gaussian Mixtures Using the Wasserstein-Fisher ...

https://arxiv.org › math

by Y Yan · 2023 — [Submitted on 4 Jan 2023] ... method is based on gradient descent over the space of probability measures equipped with the Wasserstein-Fisher-Rao geometry ...

Markovian Sliced Wasserstein Distances - arXiv
https://arxiv.org › pdf


Publications - Xiangyu Zhao

https://zhaoxyai.github.io › pub

... ACM International Conference on Web Search and Data Mining (WSDM'2023) ... Chunxiao Xing, 

Xian Wu, 

Gromov-Wasserstein Guided Representation Learning for ...


Graph Classification Method Based on Wasserstein Distance

https://iopscience.iop.org › article

by W Wu · 2021 — [2] Perozzi B., Al-Rfou R. and Skiena S. Proceedings of the 20th ACM SIGKDD ... machine learning Learning convolutional neural networks for graphs 2014-2023.

<–—2023———2023———120-—


 

Award # 1915967 - Robust Wasserstein Profile Inference

https://www.nsf.gov › awardsearch › showAward

https://www.nsf.gov › awardsearch › showAward

End Date: June 30, 2023 (Estimated) ... Test for Probabilistic Fairness" FAccT '21: Proceedings of the 2021 ACM Conference on Fairness, Accountability, ...


home - University of California, Berkeley

https://zheng.ieor.berkeley.edu

... Networks and Wasserstein Training, with Tingyu Zhu and Haoyu Liu, accepted by ACM Transactions on Modeling and Computer Simulation (TOMACS), 2023.


Limit theorems in Wasserstein distance for empirical ...

https://projecteuclid.org › issue-1 › 22-AIHP1251

by FY Wang · 2023 · Cited by 7 — In this paper, we characterize the long time behaviour of empirical measures for diffusion processes by using eigenvalues of the generator. Let M be a d- ...

Ann. Appl. Probab. 6. Feng-Yu Wang (王凤雨) and Jie-Xiang Zhu.

Limit Theorems in Wasserstein Distance for Empirical Measures of Diffusion Processes on Riemannian Manifolds.

 Cited by 2 Related articles All 3 versions
 Zbl 07657659


2023 see 2021

Publications - Electrical and Computer Engineering

http://www.ece.virginia.edu › publications

http://www.ece.virginia.edu › publications

ACM International Conference on Web Search and Data Mining (WSDM), 2023. ... 

Unsupervised Graph Alignment with Wasserstein Distance Discriminator


2023 see 2022

Distributionally robust chance constrained svm model with

https://www.aimsciences.org › article › doi › jimo.2021212

by Q Ma · 2023 — Journal of Industrial and Management Optimization, 2023, 19(2): 916-931. doi: ... 

Distributionally robust stochastic optimization with wasserstein distance, ...


2023

[PDF] mdpi.com

Towards Generating Realistic Wrist Pulse Signals Using Enhanced One Dimensional Wasserstein GAN

J Chang, F Hu, H Xu, X Mao, Y Zhao, L Huang - Sensors, 2023 - mdpi.com

… In this study, for the first time, we address the challenging by presenting a novel one-dimension 

generative adversarial networks (GAN) for generating wrist pulse signals, which manages to 

learn a mapping strategy from a random noise space to the original wrist pulse data distribution automatically. Concretely, Wasserstein GAN with gradient penalty (WGAN-GP) is employed to 

alleviate the mode collapse problem of vanilla GANs, which could be able to further … 

Therefore, its performance remains unclear in the case of wrist pulse signals. Motivated by …

 All 2 versions 


Towards Generating Realistic Wrist Pulse Signals Using ...

All 2 versions

2023 see 2022

Efficient Approximation of Gromov-Wasserstein Distance ...

https://www.tandfonline.com › ... › Latest Articles

https://www.tandfonline.com › ... › Latest Articles

Jan 9, 2023 — As a valid metric of metric-measure spaces, Gromov-Wasserstein (GW) distance has ... Accepted author version posted online: 09 Jan 2023.


Full Program - Joint Mathematics Meetings 2023

https://www.jointmathematicsmeetings.org › 2270_progfull

https://www.jointmathematicsmeetings.org › 2270_progfull

A probabilistic approach to vanishing viscosity for PDEs on the Wasserstein space. Ludovic Tangpi*, Princeton University (1183-49-19179); 11:00 a.m.


2023 see  022
Data Science Seminars - UMN-CSEhttps://cse.umn.edu › ima › data-science-seminars
https://cse.umn.edu › ima › data-science-seminars
The IMA Data Science Seminars are a forum for data scientists of IMA ... April 25, 2023 ... 

The Back-And-Forth Method For Wasserstein Gradient Flows

2023 see 2021
Continual learning of generative models with limited data: From wasserstein-1 barycenter to adaptive coalescence

M Dedeoglu, S Lin, Z Zhang, J Zhang

arXiv preprint arXiv:2101.09225 (IEEE INFOCOM 2023).

Cited by 1 Related articles All 2 versions

<–—2023———2023———130-—


2023 see 2021

Markovian Sliced Wasserstein Distances - arXiv

https://arxiv.org › pdf

PDF by K Nguyen · 2023 — January 11, 2023. Abstract. Sliced Wasserstein (SW) distance suffers from redundant projections due to independent.


2023 Jan 1 modified
2-Wasserstein barycenter of 4 images - ResearchGate

https://www.researchgate.net › figure › 2-Wasserstein-bar...

https://www.researchgate.net › figure › 2-Wasserstein-bar...
Computing optimal transport (OT) distances between pairs of probability measures or histograms, such as the earth mover's distance [37,32] and Monge-Kantorovich ...
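
Illustrative sketch (our own code, using only SciPy): for small histograms, the earth mover's / Monge-Kantorovich cost referred to in this snippet can be computed exactly as a linear program over transport plans.

import numpy as np
from scipy.optimize import linprog

def discrete_ot_cost(p, q, C):
    """Exact optimal transport cost between histograms p (n,) and q (m,)
    with ground-cost matrix C (n, m); p and q must sum to the same mass."""
    n, m = C.shape
    A_eq = np.zeros((n + m, n * m))
    for i in range(n):                    # row marginals: sum_j T[i, j] = p[i]
        A_eq[i, i * m:(i + 1) * m] = 1.0
    for j in range(m):                    # column marginals: sum_i T[i, j] = q[j]
        A_eq[n + j, j::m] = 1.0
    b_eq = np.concatenate([p, q])
    res = linprog(C.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
    return res.fun

With C built from pairwise distances raised to the power r, the returned value is the r-th power of the Wasserstein-r distance between the two histograms.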


arXiv:2302.01237
  [pdfother stat.ML   cs.LG  math.ST
Robust Estimation under the Wasserstein Distance
Authors: Sloan Nietert, Rachel Cummings, Ziv Goldfeld
Abstract: We study the problem of robust distribution estimation under the Wasserstein metric, a popular discrepancy measure between probability distributions rooted in optimal transport (OT) theory. We introduce a new outlier-robust Wasserstein distance $W_p^\varepsilon$ which allows for $\varepsilon$ outlier mass to be removed from its input distributions, and show that minimum distance estimatio…  More
Submitted 2 February, 2023; originally announced February 2023.

arXiv:2302.00975  [pdfpsother math.ST
Stone's theorem for distributional regression in Wasserstein distance
Authors: Clément Dombry, Thibault Modeste, Romain Pic
Abstract: We extend the celebrated Stone's theorem to the framework of distributional regression. More precisely, we prove that weighted empirical distribution with local probability weights satisfying the conditions of Stone's theorem provide universally consistent estimates of the conditional distributions, where the error is measured by the Wasserstein distance of order $p \ge 1$. Furthermore, for $p = 1$,…  More
Submitted 2 February, 2023; originally announced February 2023.
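
Background sketch (notation ours): the weighted empirical distributions analyzed in this abstract have the familiar Stone-type form

$$\widehat{\mu}_n(\cdot \mid x) = \sum_{i=1}^{n} W_{ni}(x)\, \delta_{Y_i},$$

with local probability weights $W_{ni}(x) \ge 0$ summing to one (e.g. nearest-neighbor or kernel weights), and the consistency statement controls $W_p\bigl(\widehat{\mu}_n(\cdot \mid x), \mu(\cdot \mid x)\bigr)$.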

arXiv:2301.12880  [pdfother math.DS
Gromov-Wasserstein Transfer Operators
Authors: Florian Beier
Abstract: Gromov-Wasserstein (GW) transport is inherently invariant under isometric transformations of the data. Having this property in mind, we propose to estimate dynamical systems by transfer operators derived from GW transport plans, when merely the initial and final states are known. We focus on entropy regularized GW transport, which allows to utilize the fast Sinkhorn algorithm and a spectral cluste…  More
Submitted 30 January, 2023; originally announced January 2023.
MSC Class: 65K10; 28A35; 49M20; 37A30

2023


arXiv:2301.12461  [pdfpsother eess.SY
Stochastic Wasserstein Gradient Flows using Streaming Data with an Application in Predictive Maintenance
Authors: Nicolas Lanzetti, Efe C. Balta, Dominic Liao-McPherson, Florian Dörfler
Abstract: We study estimation problems in safety-critical applications with streaming data. Since estimation problems can be posed as optimization problems in the probability space, we devise a stochastic projected Wasserstein gradient flow that keeps track of the belief of the estimated quantity and can consume samples from online data. We show the convergence properties of our algorithm. Our analysis comb…  More
Submitted 29 January, 2023; originally announced January 2023.

arXiv:2301.12197  [pdfother cs.LG   cs.AI  cs.IR
Mutual Wasserstein Discrepancy Minimization for Sequential Recommendation
Authors: Ziwei Fan, Zhiwei Liu, Hao Peng, Philip S Yu
Abstract: Self-supervised sequential recommendation significantly improves recommendation performance by maximizing mutual information with well-designed data augmentations. However, the mutual information estimation is based on the calculation of Kullback Leibler divergence with several limitations, including asymmetrical estimation, the exponential need of the sample size, and training instability. Also,…  More
Submitted 28 January, 2023; originally announced January 2023.
Comments: This paper is accepted by The Web Conference 2023. 11 pages
WWW '23: Proceedings of the ACM Web Conference 2023
April 2023, pp. 1375–1385. https://doi.org/
Cited by 5 Related articles All 4 versions


Zhao, Feiran; You, Keyou

Minimax Q-learning control for linear systems using the Wasserstein metric. (English) Zbl 07649504

Automatica 149, Article ID 110850, 4 p. (2023).

MSC:  93-XX

PDF BibTeX XML Cite

Full Text: DOI 

Zbl 07649504  |  MR4539468

 

arXiv:2305.12056  [pdfpsother stat.ML   cs.LG   math.OC
Uniform-in-Time Wasserstein Stability Bounds for (Noisy) Stochastic Gradient Descent
Authors: Lingjiong Zhu, Mert Gurbuzbalaban, Anant Raj, Umut Simsekli
Abstract: Algorithmic stability is an important notion that has proven powerful for deriving generalization bounds for practical algorithms. The last decade has witnessed an increasing number of stability bounds for different algorithms applied on different classes of loss functions. While these bounds have illuminated various properties of optimization algorithms, the analysis of each case typically requir…  More
Submitted 19 May, 2023; originally announced May 2023.
Comments: 47 pages

 Cited by 3 Related articles All 5 versions 


The back-and-forth method for the quadratic Wasserstein distance-based full-waveform inversion

H Zhang, W He, J Ma - Geophysics, 2023 - library.seg.org

Wasserstein function is the heavy computation cost. This computational challenge can be 

Publication:GEOPHYSICS, 88, 20230701, R469

2023 see 2022  Working Paper

Spherical Sliced-Wasserstein

Bonet, Clément; Berg, Paul; Courty, Nicolas; Septier, François; Lucas Drumetz; et al. arXiv.org; Ithaca, Jan 30, 2023.


<–—2023———2023———140-—

Working Paper

Stochastic Wasserstein Gradient Flows using Streaming Data with an Application in Predictive Maintenance

Lanzetti, Nicolas; Balta, Efe C; Liao-McPherson, Dominic; Dörfler, Florian. arXiv.org; Ithaca, Jan 29, 2023.

All 2 versions
 
   

Working Paper

Multivariate stable approximation in Wasserstein distance by Stein's method

Chen, Peng; Nourdin, Ivan; Xu, Lihu; Yang, Xiaochuan. arXiv.org; Ithaca, Jan 25, 2023.



  

2023 see 2022  Working Paper

Simple Approximative Algorithms for Free-Support Wasserstein Barycenters

Johannes von Lindheim. arXiv.org; Ithaca, Jan 24, 2023.

Cited by 1


2023 see arXiv
Wasserstein convergence rates in the invariance principle for deterministic dynamical systems

Liu, Zhenxin; Wang, Zhe. arXiv.org; Ithaca, Jan 3, 2023.


Related articles All 2 versions 

2023 see 2022

 MR4530916 Prelim Gehér, György Pál; Pitrik, József; Titkos, Tamás; Virosztek, Dániel; 

Quantum Wasserstein isometries on the qubit state space. J. Math. Anal. Appl. 522 (2023), no. 2, Paper No. 126955.

Cited by 2 Related articles All 3 versions

2023


[PDF] arxiv.org

Weak log-majorization between the spectral geometric and Wasserstein means

L Gan, S Kim - arXiv preprint arXiv:2301.07934, 2023 - arxiv.org

In this paper, we establish the weak log-majorization between the spectra of the Wasserstein 

mean and the spectral geometric mean for two positive definite Hermitian matrices. In …

Related articles


2023 see 2022  [PDF] arxiv.org

Least Wasserstein distance between disjoint shapes with perimeter regularization

M Novack, I Topaloglu, R Venkatraman - Journal of Functional Analysis, 2023 - Elsevier

We prove the existence of global minimizers to the double minimization problem where P ( 

E ) denotes the perimeter of the set E, W p is the p-Wasserstein distance between Borel …

Cited by 2 Related articles All 9 versions



2023 see 2022  

Stable parallel training of Wasserstein conditional generative adversarial neural networks

M Lupo Pasini, J Yin - The Journal of Supercomputing, 2023 - Springer

We propose a stable, parallel approach to train Wasserstein conditional generative 

adversarial neural networks (W-CGANs) under the constraint of a fixed computational budget. …

Related articles All 4 versions


On Wasserstein distances, barycenters, and the cross-section methodology for proxy credit curves

M Michielon, A Khedher, P Spreij - International Journal of Financial …, 2023 - World Scientific

… In particular, we investigate how to embed the concepts of Wasserstein distance and 

Wasserstein barycenter between implied CDS probability distributions in a cross-sectional …


2023 see 2021

[PDF] researchgate.net

[PDF] Markovian Sliced Wasserstein Distances: Beyond Independent Projections

K Nguyen, T Ren, N Ho - 2023 - researchgate.net

… background for Wasserstein distance, sliced Wasserstein distance, and max sliced Wasserstein 

distance in Section 2. In Section 3, we propose Markovian sliced Wasserstein distances …

Cited by 2 Related articles All 2 versions

<–—2023———2023———150-—


[PDF] berkeley.edu

[PDF] Wasserstein Distance

B Sturmfels - 2023 - math.berkeley.edu

Familiar examples of polyhedral norms are||·||∞ and||·|| 1, where the unit ball B is the cube 

and the crosspolytope respectively. Polyhedral norms are very important in optimal transport …

 Related articles

On adversarial robustness and the use of Wasserstein ascent-descent dynamics to enforce it

C Garcia Trillos, N Garcia Trillos - arXiv e-prints, 2023 - ui.adsabs.harvard.edu

We propose iterative algorithms to solve adversarial problems in a variety of supervised 

learning settings of interest. Our algorithms, which can be interpreted as suitable ascent-descent …

Related articles


2023 see 2021

Scenario Reduction Network Based on Wasserstein Distance with Regularization

X Dong, Y Sun, SM Malik, T Pu, Y Li… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org

… In the stagewise independent setting, Nested distance can be decomposed into multiple 

Wasserstein distances. Therefore, Wasserstein distance is used in this paper to measure the …

Related articles All 2 versions


[HTML] mdpi.com

[HTML] Prediction of Tumor Lymph Node Metastasis Using Wasserstein Distance-Based Generative Adversarial Networks Combing with Neural Architecture Search …

Y Wang, S Zhang - Mathematics, 2023 - mdpi.com

… So we introduced the Wasserstein distance as a screening indicator for the hidden … the 

Wasserstein distances as the supervision of the model training degree and when the Wasserstein

 Cited by 1 All 4 versions 

[PDF] ieee.org

WOMT: Wasserstein Distribution based minimization of False Positives in Breast Tumor classification using Deep Learning

L Lakshmi, KDS Devi, KAN Reddy, SK Grandhi… - IEEE …, 2023 - ieeexplore.ieee.org

… of the Lr- Wasserstein distance between µ1 and µ2 on an image space d. This section 

includes on how Wasserstein cumulative distributions are arrived upon from point masses. …

 

2023


2023 see 2022  [PDF] arxiv.org

Wasserstein distance as a new tool for discriminating cosmologies through the topology of large scale structure

M Tsizh, V Tymchyshyn, F Vazza - arXiv preprint arXiv:2301.09411, 2023 - arxiv.org

… ABSTRACT In this work we test Wasserstein distance in … The Wasserstein distance (aka the 

pairmatching distance) … -death) diagrams and evaluate Wasserstein distance between them. …

All 6 versions

[PDF] arxiv.org

Efficient Approximation of Gromov-Wasserstein Distance Using Importance Sparsification

M Li, J Yu, H Xu, C Meng - Journal of Computational and Graphical …, 2023 - Taylor & Francis

… In this section, we extend the Spar-GW algorithm to approximate the unbalanced Gromov-Wasserstein 

(UGW) distance. Similar to the unbalanced optimal transport (UOT) (Liero et al., …

Cited by 1 Related articles All 4 versions


Wasserstein Distance-based Full-waveform Inversion with A Regularizer Powered by Learned Gradient

F Yang, J Ma - IEEE Transactions on Geoscience and Remote …, 2023 - ieeexplore.ieee.org

… To further mitigate local minima issues, the Wasserstein distance induced by optimal transport 

theory with a new preprocessing transformation is applied as a measure in data domain. …


[PDF] arxiv.org

Gromov-Wasserstein Transfer Operators

F Beier - arXiv preprint arXiv:2301.12880, 2023 - arxiv.org

… This leads to a fused version of the GW and the Wasserstein distance. To incorporate label 

information, we introduce an additional set A Rm endowed with dA := dE|A×A. We assume …

All 2 versions


[PDF] arxiv.org

Wasserstein Gradient Flows of the Discrepancy with Distance Kernel on the Line

J Hertrich, R Beinert, M Gräf, G Steidl - arXiv preprint arXiv:2301.04441, 2023 - arxiv.org

… This paper provides results on Wasserstein gradient flows between measures on the real … 

of the Wasserstein space P2(R) into the Hilbert space L2((0, 1)), Wasserstein gradient flows of …

Related articles All 2 versions
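
Background identity (standard, stated for orientation): the embedding of $\mathcal{P}_2(\mathbb{R})$ into the Hilbert space $L^2((0,1))$ used in this snippet is the quantile-function map, which is an isometry because

$$W_2(\mu,\nu) = \Bigl(\int_0^1 \bigl|F_\mu^{-1}(s) - F_\nu^{-1}(s)\bigr|^2 \, \mathrm{d}s\Bigr)^{1/2};$$

on the real line, Wasserstein geodesics therefore correspond to straight-line interpolation of quantile functions.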

<–—2023———2023———160—


[PDF] arxiv.org

Neural Wasserstein Gradient Flows for Maximum Mean Discrepancies with Riesz Kernels

F Altekrüger, J Hertrich, G Steidl - arXiv preprint arXiv:2301.11624, 2023 - arxiv.org

… We introduce Wasserstein gradient flows and Wasserstein steepest descent flows as well 

as a backward and forward scheme for their time discretization in Sect. 2. In Sect. …

Cited by 2 All 2 versions

[PDF] aiaa.org

Improved Generative Adversarial Network Method for Flight Crew Dialog Speech Enhancement

N Chen, W Ning, Y Man, J Li - Journal of Aerospace Information …, 2023 - arc.aiaa.org

… Firstly, the model integrates the deep convolutional generative adversarial network and the 

Wasserstein distance based on the generative adversarial network. Secondly, it introduces a …

 

Neural SDEs for Conditional Time Series Generation and the Signature-Wasserstein-1 metric

P Díaz Lozano, T Lozano Bagén, J Vives - arXiv e-prints, 2023 - ui.adsabs.harvard.edu

(Conditional) Generative Adversarial Networks (GANs) have found great success in recent 

years, due to their ability to approximate (conditional) distributions over extremely high …

that are given by: a) taking the data points yt from the output …

 Cited by 2 Related articles 

2023 see 2022  [PDF] arxiv.org

On a linear fused Gromov-Wasserstein distance for graph structured data

DH Nguyen, K Tsuda - Pattern Recognition, 2023 - Elsevier

… faster approximate of pairwise Wasserstein distance for large-… the concept of linear 

Wasserstein embedding for learning … the 2-Wasserstein distance to the Fused Gromov-Wasserstein

 Related articles All 3 versions


2023 see 2022  [PDF] arxiv.org

Robust W-GAN-based estimation under Wasserstein contamination

Z Liu, PL Loh - Information and Inference: A Journal of the IMA, 2023 - academic.oup.com

… In this paper, we study several estimation problems under a Wasserstein contamination model 

… Specifically, we analyze the properties of Wasserstein GAN-based estimators for location …

Cited by 1 Related articles All 5 versions

 

2023


2023 see 2021
On Adaptive Confidence Sets for the Wasserstein Distances
by Deo, Neil; Randrianarisoa, Thibault
Bernoulli : official journal of the Bernoulli Society for Mathematical Statistics and Probability, 02/2023
In the density estimation model, we investigate the problem of constructing adaptive honest confidence sets with diameter measured in Wasserstein distance Wp,...
Journal Article  Full Text Online
MR4580910
 

Cited by 2 Related articles All 10 versions


Distributionally robust chance constrained svm model with ℓ2-Wasserstein distance
by Ma, Qing; Wang, Yanjun
Journal of industrial and management optimization, 02/2023, Volume 19, Issue 2
In this paper, we propose a distributionally robust chanceconstrained SVM model with 2-Wasserstein ambiguity. We present equivalent formulations of...
Article PDFPDF
Journal ArticleCitation Online
Related articles
All 2 versions


Protein secondary structure prediction based on Wasserstein generative...
by Yuan, Lu; Ma, Yuming; Liu, Yihui
Mathematical Biosciences and Engineering, 2022, Volume 20, Issue 2
As an important task in bioinformatics, protein secondary structure prediction (PSSP) is not only beneficial to protein function research and tertiary...
Journal Article  Full Text Online
View in Context Browse Journal

Gromov-Wasserstein Transfer Operators
by Beier, Florian
01/2023
Gromov-Wasserstein (GW) transport is inherently invariant under isometric transformations of the data. Having this property in mind, we propose to estimate...
Journal Article  Full Text Online
All 2 versions 
 
Robust Estimation under the Wasserstein Distance
by Nietert, Sloan; Cummings, Rachel; Goldfeld, Ziv
02/2023
We study the problem of robust distribution estimation under the Wasserstein metric, a popular discrepancy measure between probability distributions rooted in...
Journal Article  Full Text Online

All 2 versions 

<–—2023———2023———170—e


Stone's theorem for distributional regression in Wasserstein distance
by Dombry, Clément; Modeste, Thibault; Pic, Romain
02/2023
We extend the celebrated Stone's theorem to the framework of distributional regression. More precisely, we prove that weighted empirical distribution with...
Journal Article  Full Text Online

Bayesian Optimization in Wasserstein Spaces
by Candelieri, Antonio; Ponti, Andrea; Archetti, Francesco
Learning and Intelligent Optimization, 02/2023
Bayesian Optimization (BO) is a sample efficient approach for approximating the global optimum of black-box and computationally expensive optimization problems...
Book Chapter  Full Text Online
All 3 versions


Generating Bipedal Pokémon Images by Implementing the Wasserstein...
by Jermyn, Jacqueline
International Journal for Research in Applied Science and Engineering Technology, 01/2023, Volume 11, Issue 1
Pokémon is a video game series wherein players capture and train fauna that are known as Pokémons. These creatures vary in colour, shape, size, and have...
Journal Article  Full Text Online



Network Vulnerability Analysis in Wasserstein Spaces
by Ponti, Andrea; Irpino, Antonio; Candelieri, Antonio ; More...
Learning and Intelligent Optimization, 02/2023
The main contribution of this paper is the proposal of a new family of vulnerability measures based on a probabilistic representation framework in which the...
Book Chapter  Full Text Online
All 5 versions

 

Wasserstein distance based multi-scale adversarial domain adaptation method for remaining useful life...
by Shi, Huaitao; Huang, Chengzhuang; Zhang, Xiaochen; More...
Applied intelligence (Dordrecht, Netherlands), 2023, Volume 53, Issue 3
Accurate remaining useful life (RUL) prediction can formulate timely maintenance strategies for mechanical equipment and reduce the costs of industrial...
ArticleView Article PDF
Journal Article  Full Text Online
View Complete Issue Browse Now

 2023
 
Protein secondary structure prediction based on Wasserstein generative adversarial networks and...
by Yuan, Lu; Ma, Yuming; Liu, Yihui
Mathematical Biosciences and Engineering, 2022, Volume 20, Issue 2
As an important task in bioinformatics, protein secondary structure prediction (PSSP) is not only beneficial to protein function research and tertiary...
Article Link Read Article
Journal Article  Full Text Online

 
Generating Bipedal Pokémon Images by Implementing the Wasserstein Generative Adversarial Network
by Jermyn, Jacqueline
International Journal for Research in Applied Science and Engineering Technology, 01/2023, Volume 11, Issue 1
Pokémon is a video game series wherein players capture and train fauna that are known as Pokémons. These creatures vary in colour, shape, size, and have...
Article Link Read Article (via Unpaywall)
Journal Article  Full Text Online

 [PDF] arxiv.org

Linear optimal transport embedding: Provable Wasserstein classification for certain rigid transformations and perturbations

C Moosmüller, A Cloninger - … and Inference: A Journal of the IMA, 2023 - academic.oup.com

… In this section, we derive the error that occurs when approximating the Wasserstein

distance by the |$L^2$| distance obtained in the LOT embedding. We are thus interested in the …

Cited by 8 Related articles All 5 versions


[PDF] researchgate.net

Distributionally robust optimization with Wasserstein metric for multi-period portfolio selection under uncertainty

Z Wu, K Sun - Applied Mathematical Modelling, 2023 - Elsevier

… to estimate the radius of the Wasserstein balls under time-… by the different radius of the 

Wasserstein balls. Finally, we … the Wasserstein metric and definition of the Wasserstein ball. …

 Related articles All 2 versions
 Zbl 07682496


[PDF] arxiv.org

Invariance encoding in sliced-Wasserstein space for image classification with limited training data

M Shifat-E-Rabbi, Y Zhuang, S Li, AHM Rubaiyat… - Pattern Recognition, 2023 - Elsevier

… subspace classification model in sliced-Wasserstein space by exploiting certain mathematical

… Throughout this manuscript, we consider images s to be square integrable functions such …

 Related articles All 6 versions

<–—2023———2023———180—


Wasserstein Distance-based Full-waveform Inversion with A Regularizer Powered by Learned Gradient

F YangJ Ma - IEEE Transactions on Geoscience and Remote …, 2023 - ieeexplore.ieee.org

… To further mitigate local minima issues, the Wasserstein distance induced by optimal …

gradient, whose training does not need any geological images, thus paving the way to develop FWI …
by Yang, Fangshu; Ma, Jianwei
IEEE transactions on geoscience and remote sensing, 2023, Volume 61
Full-waveform inversion (FWI) is a powerful technique for building high-quality subsurface geological structures. It is known to suffer from local minima...
Article PDFPDF
Journal Article  Full Text Online
Cited by 3
 Related articles All 2 versions
 

2023 see 2022  [PDF] arxiv.org

A sliced-Wasserstein distance-based approach for out-of-class-distribution detection

MSE Rabbi, AHM Rubaiyat, Y Zhuang… - arXiv preprint arXiv …, 2023 - arxiv.org

… complex classification problems involving complex images with unknown data generative

… on the distribution of sliced-Wasserstein distance from the Radon Cumulative Distribution …

Related articles All 2 versions 

 

2023 see 2022[PDF] arxiv.org

Robust W-GAN-based estimation under Wasserstein contamination

Z Liu, PL Loh - Information and Inference: A Journal of the IMA, 2023 - academic.oup.com

… In this paper, we study several estimation problems under a Wasserstein contamination model

… Specifically, we analyze the properties of Wasserstein GAN-based estimators for location …

Cited by 1 Related articles All 5 versions

EvaGoNet: an integrated network of variational autoencoder and Wasserstein generative adversarial network with gradient penalty for binary classification tasks

C Luo, Y Xu, Y Shao, Z Wang, J Hu, J Yuan, Y Liu… - Information …, 2023 - Elsevier

… WGAN uses the Wasserstein distance and Lipschitz continuum to restrain variations in the objective function, as defined in Eq. (6): $L = \mathbb{E}_{x \sim p_{\mathrm{data}}(x)}[D(x)] - \mathbb{E}_{x \sim p_G(x)}[D(x)]$ …

All 2 versions
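
Illustrative sketch (our own PyTorch code, function name and defaults ours): the gradient-penalty variant of the WGAN objective quoted above, used by EvaGoNet and several other entries this year, softly enforces the critic's 1-Lipschitz constraint by penalizing its gradient norm on random interpolates between real and generated samples.

import torch

def wgan_gp_penalty(critic, real, fake, lambda_gp=10.0):
    """Gradient penalty term of WGAN-GP for a batch of real and fake samples."""
    # Random convex combination of real and generated samples
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads = torch.autograd.grad(outputs=scores.sum(), inputs=interp,
                                create_graph=True)[0]
    grad_norm = grads.flatten(start_dim=1).norm(2, dim=1)
    # Penalize deviation of the critic's gradient norm from 1 (soft Lipschitz constraint)
    return lambda_gp * ((grad_norm - 1.0) ** 2).mean()

This term is added to the negated critic objective $-(\mathbb{E}[D(x_{\mathrm{real}})] - \mathbb{E}[D(x_{\mathrm{fake}})])$ during critic updates.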

[PDF] wiley.com

Full View

Self-supervised non-rigid structure from motion with improved training of Wasserstein GANs

Y Wang, X Peng, W Huang, X Ye… - IET Computer …, 2023 - Wiley Online Library

… Wasserstein distance of the random 2D projection from the real 2D projection. To ensure

the rationality of the reconstruction results, experiments show that the feedback from the 2D …


2023

 

Global Pose Initialization Based on Gridded Gaussian Distribution With Wasserstein Distance

C Yang, Z Zhou, H Zhuang, C Wang… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org

… The second step applies the Wasserstein distance that this … Compared with these two

methods, Wasserstein distance can … Therefore, this study uses Wasserstein distance to evaluate …

Cited by 4 Related articles All 3 versions

Towards inverse modeling of landscapes using the Wasserstein distance

MJ Morris, AG Lipp, GG Roberts - Authorea Preprints, 2023 - authorea.com

… Instead, we introduce the Wasserstein distance as a means to measure misfit between 

observed and theoretical landscapes. We first demonstrate its use with a one-dimensional …


[HTML] mdpi.com

[HTML] WGAN-GP-Based Scenarios Generation Method for Wind and Solar Power Complementary Study

X Ma, Y Liu, J Yan, H Wang - Energies, 2023 - mdpi.com

… In this paper, the generated scenarios with the highest probability generated by WGAN-GP

are close … for WGAN-GP generated data under different complementary modes, respectively. …

 Cited by 2 Related articles All 5 versions 

[PDF] uu.nl

Stock Price Simulation under Jump-Diffusion Dynamics: A WGAN-Based Framework with Anomaly Detection

R Gan - 2023 - studenttheses.uu.nl

… GAN model: Wasserstein GAN and Wasserstein GAN with gradient penalty, which can effectively prevent the GANs failure modes because of the introduced Wasserstein loss. Chapter 5 …

 Related articles 

2023 see 2022

Comparing Beta-VAE to WGAN-GP for Time Series Augmentation to Improve Classification Performance

D Kavran, B Žalik, N Lukač - … , ICAART 2022, Virtual Event, February 3–5 …, 2023 - Springer

… A comparison is presented between Beta-VAE and WGAN-GP as … The use of WGAN-GP 

generated synthetic sets to train … the use of Beta-VAE and WGAN-GP generated synthetic sets …

<–—2023———2023———190—


arXiv:2302.04610  [pdfother cs.LG   stat.ML
Outlier-Robust Gromov Wasserstein for Graph Data
Authors: Lemin Kong, Jiajin Li, Anthony Man-Cho So
Abstract: Gromov Wasserstein (GW) distance is a powerful tool for comparing and aligning probability distributions supported on different metric spaces. It has become the main modeling technique for aligning heterogeneous data for a wide range of graph learning tasks. However, the GW distance is known to be highly sensitive to outliers, which can result in large inaccuracies if the outliers are given the sa…  More
Submitted 9 February, 2023; originally announced February 2023.

arXiv:2302.03372  [pdfpsother math.PR
Wasserstein-1 distance between SDEs driven by Brownian motion and stable processes
Authors: Changsong Deng, Rene L. Schilling, Lihu Xu
Abstract: We are interested in the following two $\mathbb{R}^d$-valued stochastic differential equations (SDEs): $\mathrm{d}X_t = b(X_t)\,\mathrm{d}t + \sigma\,\mathrm{d}L_t$, …, where $\sigma$ is an invertible $d\times d$ matrix, $L_t$ is a rotationally symmetric $\alpha$-stable Lévy process, and $B_t$ is a $d$-dimensional standard Brownia…  More
Submitted 7 February, 2023; originally announced February 2023.



Working Paper
Wasserstein-1 distance between SDEs driven by Brownian motion and stable processes
Deng, Changsong; Schilling, Rene L; Xu, Lihu. arXiv.org; Ithaca, Feb 7, 2023.
 Related articles All 4 versions 

2023 see 2022  Working Paper
Hierarchical Sliced Wasserstein Distance
Nguyen, Khai; Ren, Tongzheng; Nguyen, Huy; Rout, Litu; Nguyen, Tan; et al. arXiv.org; Ithaca, Feb 6, 2023.


2023 see 2022  Working Paper
Linear-Time Gromov Wasserstein Distances using Low Rank Couplings and Costs
Meyer Scetbon; Peyré, Gabriel; Cuturi, Marco. arXiv.org; Ithaca, Feb 6, 2023.


2023


Peer-reviewed
Dynamical mode recognition of triple flickering buoyant diffusion flames in Wasserstein space

Authors: Yicheng Chi, Tao Yang, Peng Zhang
Summary:Triple flickering buoyant diffusion flames in an isosceles triangle arrangement, as a nonlinear dynamical system of coupled oscillators, were experimentally studied. The focus of the study is two-fold: we established a well-controlled gas-fuel diffusion flame experiment, which well remedies the deficiencies of prevalent candle-flame experiments, and we developed a Wasserstein-space-based methodology for dynamical mode recognition, which is validated in the present triple-flame systems but can be readily generalized to the dynamical systems consisting of an arbitrary finite number of flames. By use of the present experiment and methodology, seven distinct stable dynamical modes were recognized, such as the in-phase mode, the flickering death mode, the partially flickering death mode, the partially in-phase mode, the rotation mode, the partially decoupled mode, and the decoupled mode. These modes unify the literature results for the triple flickering flame system in the straight-line and equal-lateral triangle arrangements. Compared with the mode recognitions in physical space and phase space, the Wasserstein-space-based methodology avoids personal subjectivity and is more applicable in high-dimensional systems, as it is based on the concept of distance between distribution functions of phase points. Consequently, the identification or discrimination of two dynamical modes can be quantified as the small or large Wasserstein distance, respectively.
 
Article
Publication:Combustion and Flame, 248, February 2023
Cited by 2
 Related articles



Small Sample Reliability Assessment With Online Time-Series Data Based on a Worm Wasserstein Generative Adversarial Network Learning Method
Authors: Bo Sun, Zeyu Wu, Qiang Feng, Zili Wang, Yi Ren, Dezhen Yang, Quan Xia
Summary: The scarcity of time-series data constrains the accuracy of online reliability assessment. Data expansion is the most intuitive way to address this problem. However, conventional small-sample reliability evaluation methods either depend on prior knowledge or are inadequate for time series. This article proposes a novel autoaugmentation network, the worm Wasserstein generative adversarial network, which generates synthetic time-series data that carry realistic intrinsic patterns with the original data and expands a small sample without prior knowledge or hypotheses for reliability evaluation. After verifying the augmentation ability and demonstrating the quality of the generated data by manual datasets, the proposed method is demonstrated with an experimental case: the online reliability assessment of lithium battery cells. Compared with conventional methods, the proposed method accomplished a breakthrough in the online reliability assessment for an extremely small sample of time-series data and provided credible results.
Article, 2023
Publication: IEEE Transactions on Industrial Informatics, 19 (Feb. 2023), 1207
Publisher: 2023

[PDF] ieee.org

Improved Domain Adaptation Network Based on Wasserstein Distance for Motor Imagery EEG Classification

Q She, T Chen, F Fang, J Zhang… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org

… domain adaption network based on Wasserstein distance, which utilizes existing labeled 

data … Next, the domain discriminator adopts the Wasserstein matrix to measure the distance

Cited by 1 All 3 versions


[PDF] arxiv.org

Robust Estimation under the Wasserstein Distance

S Nietert, R Cummings, Z Goldfeld - arXiv preprint arXiv:2302.01237, 2023 - arxiv.org

… estimation under the Wasserstein metric, a popular … Wasserstein distance Wε p which allows 

for ε outlier mass to be removed from its input distributions, and show that minimum distance

All 2 versions


[PDF] arxiv.org

A sliced-Wasserstein distance-based approach for out-of-class-distribution detection

MSE Rabbi, AHM Rubaiyat, Y Zhuang… - arXiv preprint arXiv …, 2023 - arxiv.org

Wasserstein distance method. The proposed mathematical solution attains high classification 

accuracy (compared with state-of-the-art end-to-end systems without out-of-class detection …

All 2 versions

<–—2023———2023———200—



[PDF] arxiv.org

Stone's theorem for distributional regression in Wasserstein distance

C Dombry, T Modeste, R Pic - arXiv preprint arXiv:2302.00975, 2023 - arxiv.org

… measured by the Wasserstein distance of order p … Wasserstein distance has a simple explicit 

form, but also the case of a multivariate output Y Rd. The use of the Wasserstein distance

All 5 versions


[PDF] arxiv.org

Algebraic Wasserstein distances and stable homological invariants of data

J Agerberg, A Guidolin, I Ren, M Scolamiero - arXiv preprint arXiv …, 2023 - arxiv.org

… define a richer family of parametrized Wasserstein distances where, in addition to standard 

Wasserstein distances are defined as a generalization of the algebraic Wasserstein distances

 Related articles All 2 versions


  EvaGoNet: an integrated network of variational autoencoder and Wasserstein generative adversarial network with gradient penalty for binary classification tasks

C Luo, Y Xu, Y Shao, Z Wang, J Hu, J Yuan, Y Liu… - Information …, 2023 - Elsevier

… This study proposes EvaGoNet, which refines the decoder module of the Gaussian mixture 

variational autoencoder using the Wasserstein generative adversarial network with gradient …

All 2 versions
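
Several entries here rely on WGAN with gradient penalty (WGAN-GP). As a reference point only, below is the textbook gradient-penalty term in PyTorch, not code from any of the cited papers; `critic`, the penalty weight `lambda_gp = 10` and the detaching of inputs are generic assumptions.

```python
# Generic WGAN-GP gradient penalty: penalise the critic's gradient norm on
# random interpolates between real and generated samples.
import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    interp = (eps * real.detach() + (1 - eps) * fake.detach()).requires_grad_(True)
    scores = critic(interp)
    grads, = torch.autograd.grad(outputs=scores, inputs=interp,
                                 grad_outputs=torch.ones_like(scores),
                                 create_graph=True, retain_graph=True)
    grad_norm = grads.view(grads.size(0), -1).norm(2, dim=1)
    return lambda_gp * ((grad_norm - 1.0) ** 2).mean()

# Critic loss to minimise: critic(fake).mean() - critic(real).mean() + gradient_penalty(...)
```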

[PDF] arxiv.org

Mutual Wasserstein Discrepancy Minimization for Sequential Recommendation

Z Fan, Z Liu, H Peng, PS Yu - arXiv preprint arXiv:2301.12197, 2023 - arxiv.org


Cited by 1 All 4 versions

Network Vulnerability Analysis in Wasserstein Spaces

A Ponti, A Irpino, A Candelieri, A Bosio… - Learning and Intelligent …, 2023 - Springer

… of Wasserstein-based vulnerability measures and network clustering in the Wasserstein space. 

… We use the Wasserstein distance among possible distances between discrete probability …

 All 3 versions


2023


[PDF] arxiv.org

Outlier-Robust Gromov Wasserstein for Graph Data

L Kong, J Li, AMC So - arXiv preprint arXiv:2302.04610, 2023 - arxiv.org

… of Gromov Wasserstein distance and formally formulate the robust Gromov Wasserstein. 

Then, we discuss the statistical properties of the proposed robust Gromov-Wasserstein model …



[HTML] springer.com

[HTML] Wasserstein barycenter regression for estimating the joint dynamics of renewable and fossil fuel energy indices

ME De Giuli, A Spelta - Computational Management Science, 2023 - Springer

… That is, our forecasted densities are Wasserstein barycenter, a measure that minimizes the 

sum of its Wasserstein distances to each element in a set. Namely, we propose to obtain the …

All 6 versions


WorldCat Wasserstein barycenter regression for estimating the joint dynamics of renewable and fossil fuel energy indices
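
As a hedged one-dimensional illustration of the barycenter object used in this entry (not the authors' regression procedure): for 1-D samples of equal size, the 2-Wasserstein barycenter with weights w_k is obtained by averaging the quantile functions, i.e. the sorted samples.

```python
# 1-D Wasserstein barycenter of equal-size samples via quantile averaging.
import numpy as np

def wasserstein_barycenter_1d(samples, weights):
    """samples: list of 1-D arrays of equal length; weights: nonnegative, summing to 1."""
    sorted_samples = np.stack([np.sort(s) for s in samples])          # (K, n)
    return np.tensordot(np.asarray(weights), sorted_samples, axes=1)  # (n,) barycenter sample

rng = np.random.default_rng(0)
s1 = rng.normal(loc=-2.0, size=1000)
s2 = rng.normal(loc=3.0, scale=2.0, size=1000)
bary = wasserstein_barycenter_1d([s1, s2], [0.5, 0.5])
print(bary.mean(), bary.std())  # roughly the average location and spread
```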

Human-related Anomalous Event Detection via Memory-augmented Wasserstein Generative Adversarial Network with Gradient Penalty

N Li, F Chang, C Liu - Pattern Recognition, 2023 - Elsevier

… patterns and performing anomaly detection, in additional to the traditional reconstruction 

and prediction errors based loss, we propose the adversarial loss based on the Wasserstein

All 6 versions


[PDF] arxiv.org

Learning Gaussian Mixtures Using the Wasserstein-Fisher-Rao Gradient Flow

Y Yan, K Wang, P Rigollet - arXiv preprint arXiv:2301.01766, 2023 - arxiv.org

… of probability measures equipped with the Wasserstein-Fisher-Rao … Wasserstein-Fisher-Rao 

distance which is a composite of the Fisher-Rao distance and the (quadratic) Wasserstein …

Related articles All 2 versions


[PDF] arxiv.org

Linear optimal transport embedding: Provable Wasserstein classification for certain rigid transformations and perturbations

C Moosmüller, A Cloninger - … and Inference: A Journal of the …, 2023 - academic.oup.com

… In this section, we derive the error that occurs when approximating the Wasserstein

distance by the $L^2$ distance obtained in the LOT embedding. We are thus interested in the …

Cited by 16 Related articles All 5 versions

Zbl 07655458

<–—2023———2023———210—


[PDF] arxiv.org

A Wasserstein-type metric for generic mixture models, including location-scatter and group invariant measures

G Dusson, V Ehrlacher, N Nouaime - arXiv preprint arXiv:2301.07963, 2023 - arxiv.org

… In this article, we study Wasserstein-type metrics and corresponding barycenters for … is also 

geodesic space for the defined modified Wasserstein metric. We then focus on two particular …

Cited by 2 Related articles All 8 versions 

2023 see 2022  [PDF] arxiv.org

Convergence in Wasserstein Distance for Empirical Measures of Non-Symmetric Subordinated Diffusion Processes

FY Wang - arXiv preprint arXiv:2301.08420, 2023 - arxiv.org

Wasserstein distance is an intrinsic object in the theory of optimal transport and calculus in 

Wasserstein … So, it is crucial and interesting to study the convergence in Wasserstein distance …

All 2 versions


[PDF] hal.science

[PDF] Central Limit Theorem and bootstrap procedure for Wasserstein's barycenter variations and application to structural relationships between distributions

E del Barrio, H Lescornel, JM Loubes - hal.science

… of the empirical process of the Wasserstein’s variation using a … with respect to their 

Wasserstein’s barycenters for which we … criterion with respect to the Wasserstein’s barycenter of a …

All 2 versions

Small ship detection based on YOLOX and modified Gaussian Wasserstein distance in SAR images

W Yu, J Li, Y Wang, Z Wang… - … Conference on Geographic …, 2023 - spiedigitallibrary.org

… proposes a modified Gaussian Wasserstein distance. Based on the one-stage anchorfree 

detector YOLOX [9], the proposed Modified Gaussian Wasserstein Distance can be used to …

All 3 versions

 

2023 see 2022  [PDF] arxiv.org

Invariance encoding in sliced-Wasserstein space for image classification with limited training data

M Shifat-E-Rabbi, Y Zhuang, S Li, AHM Rubaiyat… - Pattern Recognition, 2023 - Elsevier

… subspace classification model in sliced-Wasserstein space by exploiting certain mathematical 

… Throughout this manuscript, we consider images s to be square integrable functions such …

 Cited by 1 Related articles All 3 versions

 
arXiv:2302.07373  [pdf, other] cs.LG math.NA stat.ML
Linearized Wasserstein dimensionality reduction with approximation guarantees
Authors: Alexander Cloninger, Keaton Hamm, Varun Khurana, Caroline Moosmüller
Abstract: We introduce LOT Wassmap, a computationally feasible algorithm to uncover low-dimensional structures in the Wasserstein space. The algorithm is motivated by the observation that many datasets are naturally interpreted as probability measures rather than points in $\mathbb{R}^n$, and that finding low-dimensional descriptions of such datasets requires manifold learning algorithms in the Wasserstein…  More
Submitted 14 February, 2023; originally announced February 2023.
Comments: 38 pages, 10 figures. Submitted

arXiv:2302.06673  [pdf, other] q-bio.NC
Unified Topological Inference for Brain Networks in Temporal Lobe Epilepsy Using the Wasserstein Distance
Authors: Moo K. Chung, Camille Garcia Ramos, Felipe Branco De Paiva, Jedidiah Mathis, Vivek Prabharakaren, Veena A. Nair, Elizabeth Meyerand, Bruce P. Hermann, Jeffery R. Binder, Aaron F. Struck
Abstract: Persistent homology can extract hidden topological signals present in brain networks. Persistent homology summarizes the changes of topological structures over multiple different scales called filtrations. Doing so detects hidden topological signals that persist over multiple scales. However, a key obstacle of applying persistent homology to brain network studies has always been the lack of coheren…  More
Submitted 13 February, 2023; originally announced February 2023.

arXiv:2302.05917  [pdf, other] cs.LG
Vector Quantized Wasserstein Auto-Encoder
Authors: Tung-Long Vuong, Trung Le, He Zhao, Chuanxia Zheng, Mehrtash Harandi, Jianfei Cai, Dinh Phung
Abstract: Learning deep discrete latent presentations offers a promise of better symbolic and summarized abstractions that are more useful to subsequent downstream tasks. Inspired by the seminal Vector Quantized Variational Auto-Encoder (VQ-VAE), most of work in learning deep discrete representations has mainly focused on improving the original VQ-VAE form and none of them has studied learning deep discrete…  More
Submitted 12 February, 2023; originally announced February 2023.

arXiv:2302.05833  [pdf, other] math.PR math.DG
Bregman-Wasserstein divergence: geometry and applications
Authors: Cale Rankin, Ting-Kam Leonard Wong
Abstract: Consider the Monge-Kantorovich optimal transport problem where the cost function is given by a Bregman divergence. The associated transport cost, which we call the Bregman-Wasserstein divergence, presents a natural asymmetric extension of the squared 2-Wasserstein metric and has recently found applications in statistics and machine learning. On the other hand, Bregman divergence is a fundamental…  More
Submitted 11 February, 2023; originally announced February 2023.
Comments: 46 pages, 3 figures
MSC Class: 53B12 (Primary) 49Q22; 58B20 (Secondary)

arXiv:2302.05356  [pdf, other] math.NA
Approximation and Structured Prediction with Sparse Wasserstein Barycenters
Authors: Minh-Hieu Do, Jean Feydy, Olga Mula
Abstract: We develop a general theoretical and algorithmic framework for sparse approximation and structured prediction in $\mathcal{P}_2(\Omega)$ with Wasserstein barycenters. The barycenters are sparse in the sense that they are computed from an available dictionary of measures but the approximations only involve a reduced number of atoms. We show that the best reconstruction from the class of sparse barycente…  More
Submitted 10 February, 2023; originally announced February 2023.
<–—2023———2023———220—



Open-Set Signal Recognition Based on Transformer and Wasserstein Distance

Zhang, W; Huang, D; (...); Wang, XF

Feb 2023 | 

APPLIED SCIENCES-BASEL

 13 (4)

Featured Application Signal Processing. Open-set signal recognition provides a new approach for verifying the robustness of models by introducing novel unknown signal classes into the model testing and breaking the conventional closed-set assumption, which has become very popular in real-world scenarios. In the present work, we propose an efficient open-set signal recognition algorithm, which c

Show more


[PDF] mdpi.com

Open-Set Signal Recognition Based on Transformer and Wasserstein Distance

W Zhang, D Huang, M Zhou, J Lin, X Wang - Applied Sciences, 2023 - mdpi.com

… metric sub-module based on Wasserstein distance, and a class … modeled in terms of the

Wasserstein distance instead of the … In this paper, the Wasserstein distance is used to improve …

Cited by 5 Related articles All 4 versions 


  Wasserstein distance based multi-scale adversarial domain adaptation method for remaining useful life prediction

H Shi, C Huang, X Zhang, J Zhao, S Li - Applied Intelligence, 2023 - Springer

… paper, the Wasserstein distance is used as a metric of distribution distance. The Wasserstein

distance was … For two distributions P S and P T , the Wasserstein-1 distance is defined as: …

Cited by 8 Related articles All 3 versions

[PDF] arxiv.org

A Novel Small Samples Fault Diagnosis Method Based on the Self-attention Wasserstein Generative Adversarial Network

Z Shang, J Zhang, W Li, S Qian, J Liu, M Gao - Neural Processing Letters, 2023 - Springer

… To address the above problems, we propose a self-attention gradient penalty wasserstein

generative adversarial network (SA-WGAN-GP) for the expansion of sample capacity and the …

Related articles


 2023 see 2022  [PDF] researchgate.net

Distributionally robust optimization with Wasserstein metric for multi-period portfolio selection under uncertainty

Z Wu, K Sun - Applied Mathematical Modelling, 2023 - Elsevier

… to estimate the radius of the Wasserstein balls under time-… by the different radius of the

Wasserstein balls. Finally, we … the Wasserstein metric and definition of the Wasserstein ball. …

Related articles All 2 versions

MR4527311

arXiv:2302.12693  [pdf, ps, other] cs.LG math.ST stat.ML
Wasserstein Projection Pursuit of Non-Gaussian Signals
Authors: Satyaki Mukherjee, Soumendu Sundar Mukherjee, Debarghya Ghoshdastidar
Abstract: We consider the general dimensionality reduction problem of locating in a high-dimensional data cloud, a $k$-dimensional non-Gaussian subspace of interesting features. We use a projection pursuit approach -- we search for mutually orthogonal unit directions which maximise the 2-Wasserstein distance of the empirical distribution of data-projections along these directions from a standard Gaussian. U…  More
Submitted 24 February, 2023; originally announced February 2023.
All 2 versions
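
A simplified stand-in for the idea of this entry (random search over directions rather than the authors' optimisation): among random unit directions, keep the one whose projected data are farthest, in 1-D Wasserstein distance, from a standard Gaussian sample. The standardisation step and the candidate count are my assumptions.

```python
# Random-search sketch of Wasserstein projection pursuit.
import numpy as np
from scipy.stats import wasserstein_distance

def best_non_gaussian_direction(X, n_candidates=500, seed=None):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Z = rng.standard_normal(n)                 # reference Gaussian sample
    Xs = (X - X.mean(0)) / X.std(0)            # crude per-coordinate standardisation
    best_dir, best_dist = None, -np.inf
    for _ in range(n_candidates):
        u = rng.standard_normal(d)
        u /= np.linalg.norm(u)
        dist = wasserstein_distance(Xs @ u, Z)
        if dist > best_dist:
            best_dir, best_dist = u, dist
    return best_dir, best_dist

# Example: a uniform (non-Gaussian) coordinate hidden among Gaussian noise.
rng = np.random.default_rng(0)
X = rng.standard_normal((2000, 5))
X[:, 0] = rng.uniform(-3, 3, size=2000)
u, dist = best_non_gaussian_direction(X, seed=1)
print(np.round(u, 2), round(dist, 3))
```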
 


2023


arXiv:2302.10682  [pdf, other] math.NA
Approximation of Splines in Wasserstein Spaces
Authors: Jorge Justiniano, Martin Rumpf, Matthias Erbar
Abstract: This paper investigates a time discrete variational model for splines in Wasserstein spaces to interpolate probability measures. Cubic splines in Euclidean space are known to minimize the integrated squared acceleration subject to a set of interpolation constraints. As generalization on the space of probability measures the integral over the squared acceleration is considered as a spline energy an…  More
Submitted 21 February, 2023; originally announced February 2023.
Comments: 25 pages, 9 figures
MSC Class: 53B20; 65D07; 35Q49; 65K10; 68U10

Liu, Zheng; Loh, Po-Ling

Robust W-GAN-based estimation under Wasserstein contamination. (English) Zbl 07655457

Inf. Inference 12, No. 1, 312-362 (2023).

MSC:  62-XX

PDF BibTeX XML Cite

 Cited by 1 Related articles All 5 versions

Gaunt, Robert E.; Li, Siqi

Bounding Kolmogorov distances through Wasserstein and related integral probability metrics. (English) Zbl 07655403

J. Math. Anal. Appl. 522, No. 1, Article ID 126985, 24 p. (2023).

MSC:  60E05 60Bxx 60Gxx 62Exx

PDF BibTeX XML Cite

Full Text: DOI 

Cited by 5 Related articles All 5 versions


Medical multivariate time series imputation and forecasting based on a recurrent conditional Wasserstein...
by Festag, Sven; Spreckelsen, Cord
Journal of biomedical informatics, 03/2023, Volume 139
In the fields of medical care and research as well as hospital management, time series are an important part of the overall data basis. To ensure high quality...
Journal Article  Full Text Online
All 3 versions


A novel conditional weighting transfer Wasserstein auto-encoder for rolling...
by Zhao, Ke; Jia, Feng; Shao, Haidong
Knowledge-based systems, 02/2023, Volume 262
Transfer learning based on a single source domain to a target domain has received a lot of attention in the cross-domain fault diagnosis tasks of rolling...
Journal Article  Full Text Online

Cited by 57 Related articles All 2 versions

<–—2023———2023———230—


Vector Quantized Wasserstein Auto-Encoder
by Vuong, Tung-Long; Le, Trung; Zhao, He ; More...
02/2023
Learning deep discrete latent presentations offers a promise of better symbolic and summarized abstractions that are more useful to subsequent downstream...
Journal Article  Full Text Online

Cited by 7 Related articles All 7 versions 


 2023 see arxiv
Approximation of Splines in Wasserstein Spaces
by Justiniano, Jorge; Rumpf, Martin; Erbar, Matthias
02/2023
This paper investigates a time discrete variational model for splines in Wasserstein spaces to interpolate probability measures. Cubic splines in Euclidean...
Journal Article  Full Text Online
Related articles
 All 3 versions 
 
Linearized Wasserstein dimensionality reduction with approximation guarantees
by Cloninger, Alexander; Hamm, Keaton; Khurana, Varun ; More...
02/2023
We introduce LOT Wassmap, a computationally feasible algorithm to uncover low-dimensional structures in the Wasserstein space. The algorithm is motivated by...
Journal Article  Full Text Online

Cited by 7 Related articles All 2 versions 


Bregman-Wasserstein divergence: geometry and applications
by Rankin, Cale; Wong, Ting-Kam Leonard
02/2023
Consider the Monge-Kantorovich optimal transport problem where the cost function is given by a Bregman divergence. The associated transport cost, which we call...
Journal Article  Full Text Online
All 2 versions
 

Cited by 4 Related articles All 2 versions 


Outlier-Robust Gromov Wasserstein for Graph Data
by Kong, Lemin; Li, Jiajin; So, Anthony Man-Cho
02/2023
Gromov Wasserstein (GW) distance is a powerful tool for comparing and aligning probability distributions supported on different metric spaces. It has become...
Journal Article  Full Text Online
All 2 versions
 


2023

Approximation and Structured Prediction with Sparse Wasserstein Barycenters
by Do, Minh-Hieu; Feydy, Jean; Mula, Olga
02/2023
We develop a general theoretical and algorithmic framework for sparse approximation and structured prediction in $\mathcal{P}_2(\Omega)$ with Wasserstein...
Journal Article  Full Text Online

Unified Topological Inference for Brain Networks in Temporal Lobe Epilepsy Using the Wasserstein...
by Chung, Moo K; Ramos, Camille Garcia; De Paiva, Felipe Branco ; More...
02/2023
Persistent homology can extract hidden topological signals present in brain networks. Persistent homology summarizes the changes of topological structures over...
Journal Article  Full Text Online


 2023 see 2022 2021
Internal Wasserstein Distance for Adversarial Attack and Defense
by Wang, Qicheng; Zhang, Shuhai; Cao, Jiezhang ; More...
arXiv.org, 02/2023
Deep neural networks (DNNs) are known to be vulnerable to adversarial attacks that would trigger misclassification of DNNs but may be imperceptible to human...
Paper  Full Text Online

 
2023 see 2022
Wassmap: Wasserstein Isometric Mapping for Image Manifold Learning
by Hamm, Keaton; Henscheid, Nick; Kang, Shujie
arXiv.org, 02/2023
In this paper, we propose Wasserstein Isometric Mapping (Wassmap), a nonlinear dimensionality reduction technique that provides solutions to some drawbacks in...
Paper  Full Text Online

Cited by 12 Related articles All 3 versions
MR4597927

Weak log-majorization between the geometric and Wasserstein means
by Gan, Luyining; Kim, Sejong
arXiv.org, 02/2023
There exist lots of distinct geometric means on the cone of positive definite Hermitian matrices such as the metric geometric mean, spectral geometric mean,...
Paper  Full Text Online

<–—2023———2023———240—



Wasserstein distance-based spectral clustering method for transaction data...
by Zhu, Yingqiu; Huang, Danyang; Zhang, Bo
arXiv.org, 02/2023
With the rapid development of online payment platforms, it is now possible to record massive transaction data. Clustering on transaction data significantly...
Paper  Full Text Online
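
A hedged sketch of the general recipe in this entry (not the authors' algorithm): compute pairwise 1-D Wasserstein distances between per-entity transaction samples, turn them into a Gaussian affinity, and feed it to spectral clustering with a precomputed affinity. The toy log-normal "transaction amounts", the two behavioural groups and the median bandwidth are illustrative assumptions.

```python
# Wasserstein-distance-based spectral clustering on toy transaction samples.
import numpy as np
from scipy.stats import wasserstein_distance
from sklearn.cluster import SpectralClustering

rng = np.random.default_rng(0)
groups = [rng.lognormal(mean=m, sigma=0.5, size=200) for m in [1.0] * 10 + [2.5] * 10]

n = len(groups)
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        D[i, j] = D[j, i] = wasserstein_distance(groups[i], groups[j])

sigma = np.median(D[D > 0])
A = np.exp(-(D ** 2) / (2 * sigma ** 2))     # affinity from distances
labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                            random_state=0).fit_predict(A)
print(labels)
```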


 
Research Data from University of the Aegean Update Understanding of Management Science (Decision Making Under Model Uncertainty: Frechet-wasserstein...
Obesity, fitness, & wellness week, 02/2023
Journal ArticleCitation Online


Research Data from China University of Geosciences Update Understanding of Mathematics (Prediction of Tumor Lymph Node Metastasis Using Wasserstein...
Health & Medicine Week, 02/2023
Newsletter  Full Text Online


 

Wasserstein Distributionally Robust Chance-Constrained Program with Moment Information

Z Luo, Y Yin, D Wang, TCE Cheng, CC Wu - Computers & Operations …, 2023 - Elsevier

This paper studies a distributionally robust joint chance-constrained program with a hybrid ambiguity set including the Wasserstein metric, and moment and bounded support information of uncertain parameters. For the considered mathematical program, the random variables are located in a given support space, so a set of random constraints with a high threshold probability for all the distributions that are within a specified Wasserstein distance from an empirical distribution, and a series of moment constraints have to be simultaneously …

Related articles
Zbl 07706559


 2023 see 2022

MR4551562 Prelim Pagès, Gilles; Panloup, Fabien; 

Unadjusted Langevin algorithm with multiplicative noise: Total variation and Wasserstein bounds. Ann. Appl. Probab. 33 (2023), no. 1, 726–779. 65 (37 60 62 93)

Review PDF Clipboard Journal Article

Cited by 17 Related articles All 13 versions


MR4551543 Prelim Wang, Feng-Yu; 

Convergence in Wasserstein distance for empirical measures of semilinear SPDEs. Ann. Appl. Probab. 33 (2023), no. 1, 70–84. 60

Review PDF Clipboard Journal Article

Cited by 9 Related articles All 2 versions


MR4551016 Prelim Zhou, Datong; Chen, Jing; Wu, Hao; Yang, Dinghui; Qiu, Lingyun; 

The Wasserstein-Fisher-Rao metric for waveform based earthquake location. J. Comput. Math. 41 (2023), no. 3, 437–458. 86 (65K10)

Review PDF Clipboard Journal Article


2023 see 2022

MR4550961 Prelim Shen, Haoming; Jiang, Ruiwei; 

Chance-constrained set covering with Wasserstein ambiguity. Math. Program. 198 (2023), no. 1, Ser. A, 621–674. 90C15 (90C11 90C47)

Review PDF Clipboard Journal Article

Cited by 21 Related articles All 5 versions


MR4549971 Prelim Dapogny, Charles; Iutzeler, Franck; Meda, Andrea; Thibert, Boris; 

Entropy-regularized Wasserstein distributionally robust shape and topology optimization. Struct. Multidiscip. Optim. 66 (2023), no. 3, 42. 49Q10 (49Q12 49Q22 74)

Review PDF Clipboard Journal Articles

Related articles All 9 versions

MR4549474 Prelim Feng, Chunrong; Liu, Yujia; Zhao, Huaizhong; 

Periodic measures and Wasserstein distance for analysing periodicity of time series datasets. Commun. Nonlinear Sci. Numer. Simul. 120 (2023), Paper No. 107166. 60B12 (37A44 37A50 62M05)

Review PDF Clipboard Journal Article

Cited by 2 Related articles All 4 versions

<–—2023———2023———250—


MR4547374 Prelim De Giuli, Maria Elena; Spelta, Alessandro; 

Wasserstein barycenter regression for estimating the joint dynamics of renewable and fossil fuel energy indices. Comput. Manag. Sci. 20 (2023), no. 1, 1. 62 (49Q22 91)

Review PDF Clipboard Journal Article

All 6 versions

MR4515811 Pending Santambrogio, Filippo 

Sharp Wasserstein estimates for integral sampling and Lorentz summability of transport densities. J. Funct. Anal. 284 (2023), no. 4, Paper No. 109783, 12 pp. 49Q22 (35J96 46E30)

Review PDF Clipboard Journal Article


[HTML] Periodic measures and Wasserstein distance for analysing periodicity of time series datasets

C Feng, Y Liu, H Zhao - … in Nonlinear Science and Numerical Simulation, 2023 - Elsevier

In this article, we establish the probability foundation of the periodic measure approach in analysing periodicity of a dataset. It is based on recent work of random periodic processes. While random periodic paths provide a pathwise model for time series datasets with a periodic pattern, their law is a periodic measure and gives a statistical description and the ergodic theory offers a scope of statistical analysis. The connection of a sample path and the periodic measure is revealed in the law of large numbers (LLN). We prove first the period is …

Cited by 1 All 4 versions

Zbl 07676856
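
In the spirit of this entry, here is a toy periodicity check (my own hedged sketch, not the authors' periodic-measure construction): for a candidate period, compare the empirical distribution observed at each phase with the pooled distribution of the whole series using the 1-D Wasserstein distance; a matching period yields phase-wise distributions far from the pooled one.

```python
# Toy Wasserstein-based periodicity score for a noisy sinusoid.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
t = np.arange(2000)
series = np.sin(2 * np.pi * t / 50) + 0.3 * rng.standard_normal(t.size)

def period_score(x, period):
    """Average W1 between each phase-slice distribution and the pooled distribution."""
    phases = [x[p::period] for p in range(period)]
    return np.mean([wasserstein_distance(ph, x) for ph in phases])

for p in (10, 50, 73):          # the true period 50 should score highest
    print(p, round(period_score(series, p), 3))
```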


  Sharp Wasserstein estimates for integral sampling and Lorentz summability of transport densities

Santambrogio, F

Feb 15 2023 | 

JOURNAL OF FUNCTIONAL ANALYSIS

 284 (4)

We prove some Lorentz-type estimates for the average in time of suitable geodesic interpolations of probability measures, obtaining as a by product a new estimate for transport densities and a new integral inequality involving Wasserstein distances and norms of gradients. This last inequality was conjectured in a paper by S. Steinerberger.(c) 2022 Elsevier Inc. All rights reserved.


23  References  Related records 

Related articles All 2 versions

Working Paper see arXiv

Cutoff ergodicity bounds in Wasserstein distance for a viscous energy shell model with Lévy noise

Barrera, G; Högele, M A; Pardo, J C; Pavlyukevich, I. arXiv.org; Ithaca, Feb 27, 2023.


2023


Working Paper see arXiv

Wasserstein-Kelly Portfolios: A Robust Data-Driven Solution to Optimize Portfolio Growth

Jonathan Yu-Meng Li. arXiv.org; Ithaca, Feb 27, 2023.

Cited by 1 All 5 versions 

 

2023 see 2022  Working Paper

Gromov-Wasserstein Autoencoders

Nakagawa, Nao; Togo, Ren; Ogawa, Takahiro; Haseyama, Miki. arXiv.org; Ithaca, Feb 24, 2023.

Cited by 7 Related articles All 4 versions 

 

2023 see 1011  Working Paper see arxiv

Discrete Langevin Sampler via Wasserstein Gradient Flow

Sun, Haoran; Dai, Hanjun; Dai, Bo; Zhou, Haomin; Schuurmans, Dale. arXiv.org; Ithaca, Feb 22, 2023.


Working Paper

A Wasserstein distance-based spectral clustering method for transaction data analysis

Zhu, Yingqiu; Huang, Danyang; Zhang, Bo. arXiv.org; Ithaca, Feb 16, 2023.


Wire Feed

Wasserstein Releases Solar Charger and Wireless Chime Compatible with the Blink Video Doorbell

PR Newswire; New York [New York]. 06 Feb 2023.  


<–—2023———2023———260—



Attacking Mouse Dynamics Authentication using Novel Wasserstein Conditional DCGAN

A Roy, KS Wong, RCW Phan - IEEE Transactions on …, 2023 - ieeexplore.ieee.org

Behavioral biometrics is an emerging trend due to their cost-effectiveness and non-intrusive


Wasserstein-metric-based distributionally robust optimization method for unit commitment considering wind turbine uncertainty

D Qi - Authorea Preprints, 2023 - authorea.com

… Based on Wasserstein metric, an ambiguity set is established to reflect the probabilistic …

controlling the sample size and the confidence of Wasserstein ambiguity set radius. In addition, …


[PDF] cuhk.edu.hk

[PDF] A Convergent Single-Loop Algorithm for Relaxation of Gromov-Wasserstein in Graph Data

J Li, J Tang, L Kong, H Liu, J Li, AMC So, J Blanchet - se.cuhk.edu.hk

In this work, we present the Bregman Alternating Projected Gradient (BAPG) method, a

single-loop algorithm that offers an approximate solution to the Gromov-Wasserstein (GW) …



[PDF] fgv.br

[PDF] 1D-Wasserstein approximation of measures

A Chambolle, JM Machado - eventos.fgv.br

• $(P_\lambda)$ always admits a solution $\nu$.
• If $\rho_0$ has an $L^\infty$ density w.r.t. $\mathcal{H}^1$, so does $\nu$.
• If $\rho_0 \in \mathcal{P}(\mathbb{R}^2)$ does not give mass to 1D sets, then $\nu$ is also a solution to $(P_\lambda)$.
• $\Sigma$ is Ahlfors regular: there is $C$…


2023 see 1011  

Gradient flows of modified Wasserstein distances and porous medium equations with nonlocal pressure

NP Chung, QH Nguyen - Acta Mathematica Vietnamica, 2023 - Springer

… We construct their weak solutions via JKO schemes for modified Wasserstein distances.

We also establish the regularization effect and decay estimates for the L p norms. … To do …

Related articles All 2 versions

MR4581117 
Related articles
 All 4 versions


2023


Wasserstein distance-based expansion planning for integrated energy system considering hydrogen fuel cell vehicles

X Wei, KW Chan, T Wu, G Wang, X Zhang, J Liu - Energy, 2023 - Elsevier

Due to the increasing pressure from environmental concerns and the energy crisis, transportation 

electrification constitutes one of the key initiatives for global decarbonization. The zero …

Cited by 1 All 3 versions

2023 see 2022

A novel conditional weighting transfer Wasserstein auto-encoder for rolling bearing fault diagnosis with multi-source domains

K Zhao, F Jia, H Shao - Knowledge-Based Systems, 2023 - Elsevier

Wasserstein distance is regarded as an optimal transport problem, and its purpose is to 

search an optimal transport strategy. Wasserstein distance can not only measure the distance …

Cited by 8 Related articles

Sliced Wasserstein cycle consistency generative adversarial networks for fault data augmentation of an industrial robot

Z Pu, D Cabrera, C Li, JV de Oliveira - Expert Systems with Applications, 2023 - Elsevier

… Namely, the sliced Wasserstein distance is proposed for this type of generative model. Both 

… cases, sliced Wasserstein distance outperforms classic Wasserstein distance in CycleGANs. …

All 3 versions

 

Wasserstein Adversarial Learning for Identification of Power Quality Disturbances with Incomplete Data

G Feng, KW Lao - IEEE Transactions on Industrial Informatics, 2023 - ieeexplore.ieee.org

… framework of Wasserstein adversarial … of Wasserstein adversarial imputation (WAI) and 

Wasserstein adversarial domain adaptation (WADA). WAI minimizes the improved Wasserstein


[PDF] arxiv.org

Bregman-Wasserstein divergence: geometry and applications

C Rankin, TKL Wong - arXiv preprint arXiv:2302.05833, 2023 - arxiv.org

Wasserstein divergence and prove some basic properties including its relation with the 2-Wasserstein 

genuinely new features of the Bregman-Wasserstein geometry. In Section 4 we lift …

All 2 versions

<–—2023———2023———270—


[PDF] unimib.it

Integration of heterogeneous single cell data with Wasserstein Generative Adversarial Networks

V Giansanti - 2023 - boa.unimib.it

… A Bayesian regressor is then applied to select the mini-batches with which a particular deep-learning architecture, the Wasserstein …, is trained.

All 2 versions 

Integration of heterogeneous single cell data with Wasserstein Generative Adversarial Networks
Authors:Giansanti, V (Contributor), ANTONIOTTI, MARCO (Contributor), SCHETTINI, RAIMONDO (Contributor), GIANSANTI, VALENTINA (Creator)
Summary: Tissues, organs and organisms are complex biological systems, studied with the aim of characterizing their biological processes. Understanding how they function and interact in healthy and diseased samples makes it possible to interfere with, correct and prevent the dysfunctions from which diseases may develop. Recent developments in single-cell sequencing technologies are expanding the ability to profile, at the single-cell level, several molecular layers (transcriptome, genome, epigenome, proteome). The number, size and modalities of the datasets produced keep growing. This drives the development of robust methods for the integration of multi-omic datasets, whether or not they describe the same cells. Integrating multiple sources of information yields a broader and more complete description of the analysed system. Most integration tools available today allow the simultaneous analysis of a limited number of omics (generally two) and require prior knowledge of their relationships. These methods often force the translation of one modality into the variables expressed by another (for example, ATAC peaks are converted into a gene activity matrix). This step introduces a level of approximation into the data that may compromise downstream analyses. This is the motivation for MOWGAN (Multi Omic Wasserstein Generative Adversarial Network), a deep-learning-based framework for the simulation of paired multimodal data that supports a large number of datasets (more than two) and is agnostic about the relationships among them (no assumption is imposed). Each modality is projected into a reduced descriptive space whose dimensions are fixed for all datasets. This step avoids translation between modalities. The cells, described by vectors in the reduced space, are sorted according to the first component of the…
Thesis, Dissertation, 2023-02-17T00:00:00+01:00
English
Publisher:Università degli Studi di Milano-Bicocca country:Italy, 2023-02-17T00:00:00+01:00

All 2 versions

Vibration signal augmentation method for fault diagnosis of low-voltage circuit breaker based on W-CGAN

J Yang, G Zhang, B Chen… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org

… method based on Wasserstein distance and conditional … On the other hand, Wasserstein 

distance is used to calculate … adversarial networks, Wasserstein distances and explains …

 Cited by 1


Distributionally robust day-ahead combined heat and power plants scheduling with Wasserstein Metric

M Skalyga, M Amelin, Q Wu, L Söder - Energy, 2023 - Elsevier

… We define D ξ t with the Wasserstein distance in Section 3. The Wasserstein distance has … 

Here we use the Wasserstein metric to establish the distances between P ̂ ξ t and P ξ t as …

Cited by 1 All 2 versions

2023 see 2022  

[PDF] arxiv.org

A two-step approach to Wasserstein distributionally robust chance-and security-constrained dispatch

A Maghami, E Ursavas… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org

… Motivated by this fact, in this paper, we propose a Wasserstein metric based DR approach 

for solving a dispatch problem that takes into account N-1 security constraints and the …

Related articles All 2 versions



[PDF] aimspress.com

[PDF] WG-ICRN: Protein 8-state secondary structure prediction based on Wasserstein generative adversarial networks and residual networks with Inception modules

S Li, L Yuan, Y Ma, Y Liu - Mathematical Biosciences and …, 2023 - aimspress.com

… First, we use the Wasserstein generative adversarial network (WGAN) to extract protein 

features in the position-specific scoring matrix (PSSM). The extracted features are combined with …

All 3 versions

2023

arXiv:2303.15350  [pdf, other] cs.CL cs.IR cs.LG doi:10.1007/978-3-031-28238-6_21
Improving Neural Topic Models with Wasserstein Knowledge Distillation
Authors: Suman Adhya, Debarshi Kumar Sanyal
Abstract: Topic modeling is a dominant method for exploring document collections on the web and in digital libraries. Recent approaches to topic modeling use pretrained contextualized language models and variational autoencoders. However, large neural topic models have a considerable memory footprint. In this paper, we propose a knowledge distillation framework to compress a contextualized topic model witho…  More
Submitted 27 March, 2023; originally announced March 2023.
Comments: Accepted at ECIR 2023


arXiv:2303.15160  [pdf, ps, other] math.PR math.AP
On smooth approximations in the Wasserstein space
Authors: Andrea Cosso, Mattia Martini
Abstract: In this paper we investigate the approximation of continuous functions on the Wasserstein space by smooth functions, with smoothness meant in the sense of Lions differentiability. In particular, in the case of a Lipschitz function we are able to construct a sequence of infinitely differentiable functions having the same Lipschitz constant as the original function. This solves an open problem raise…  More
Submitted 27 March, 2023; originally announced March 2023.
MSC Class: 28A33; 28A15; 49N80


arXiv:2303.15095  [pdf, ps, other] math.MG math-ph math.FA
Isometries and isometric embeddings of Wasserstein spaces over the Heisenberg group
Authors: Zoltán M. Balogh, Tamás Titkos, Dániel Virosztek
Abstract: Our purpose in this paper is to study isometries and isometric embeddings of the $p$-Wasserstein space $\mathcal{W}_p(\mathbb{H}^n)$ over the Heisenberg group $\mathbb{H}^n$ for all $p \ge 1$ and for all $n \ge 1$. First, we create a link between optimal transport maps in the Euclidean space $\mathbb{R}^{2n}$ and the Heisenberg group $\mathbb{H}^n$. Then we use this link to understand isometric embe…  More
Submitted 27 March, 2023; originally announced March 2023.
Comments: 29 pages
MSC Class: 46E27; 49Q22; 54E40
All 2 versions
 


arXiv:2303.14950  [pdf, other] stat.AP
Parameter estimation for many-particle models from aggregate observations: A Wasserstein distance based sequential Monte Carlo sampler
Authors: Chen Cheng, Linjie Wen, Jinglai Li
Abstract: In this work we study systems consisting of a group of moving particles. In such systems, often some important parameters are unknown and have to be estimated from observed data. Such parameter estimation problems can often be solved via a Bayesian inference framework. However in many practical problems, only data at the aggregate level is available and as a result the likelihood function is not a…  More
Submitted 27 March, 2023; originally announced March 2023.


arXiv:2303.14085  [pdf, other] math.ST math.OC math.PR
Optimal transport and Wasserstein distances for causal models
Authors: Stephan Eckstein, Patrick Cheridito
Abstract: In this paper we introduce a variant of optimal transport adapted to the causal structure given by an underlying directed graph. Different graph structures lead to different specifications of the optimal transport problem. For instance, a fully connected graph yields standard optimal transport, a linear graph structure corresponds to adapted optimal transport, and an empty graph leads to a notion…  More
Submitted 24 March, 2023; originally announced March 2023.
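
For reference on the base object this entry builds on: exact discrete optimal transport (the "fully connected" case, before any causal constraints are imposed) can be computed by linear programming. This is a generic sketch with scipy, not the paper's construction; the tiny two- and three-point measures are made up for illustration.

```python
# Exact discrete optimal transport via linear programming.
import numpy as np
from scipy.optimize import linprog

def discrete_ot(a, b, C):
    """a, b: marginal probability vectors; C: (n, m) cost matrix."""
    n, m = C.shape
    A_eq = np.zeros((n + m, n * m))
    for i in range(n):                      # row marginals equal a
        A_eq[i, i * m:(i + 1) * m] = 1.0
    for j in range(m):                      # column marginals equal b
        A_eq[n + j, j::m] = 1.0
    res = linprog(C.ravel(), A_eq=A_eq, b_eq=np.concatenate([a, b]),
                  bounds=(0, None), method="highs")
    return res.x.reshape(n, m), res.fun

a = np.array([0.5, 0.5])
b = np.array([0.25, 0.25, 0.5])
x_src, x_tgt = np.array([0.0, 1.0]), np.array([0.0, 0.5, 1.0])
C = (x_src[:, None] - x_tgt[None, :]) ** 2     # squared-distance cost
plan, cost = discrete_ot(a, b, C)
print(plan, cost)                               # cost = squared 2-Wasserstein distance
```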



arXiv:2303.12558  [pdf, other] cs.LG cs.AI
Wasserstein Auto-encoded MDPs: Formal Verification of Efficiently Distilled RL Policies with Many-sided Guarantees
Authors: Florent Delgrange, Ann Nowé, Guillermo A. Pérez
Abstract: Although deep reinforcement learning (DRL) has many success stories, the large-scale deployment of policies learned through these advanced techniques in safety-critical scenarios is hindered by their lack of formal guarantees. Variational Markov Decision Processes (VAE-MDPs) are discrete latent space models that provide a reliable framework for distilling formally verifiable controllers from any R…  More
Submitted 22 March, 2023; originally announced March 2023.
Comments: ICLR 2023, 9 pages main text, 14 pages appendix (excluding references)

<–—2023———2023———280—


arXiv:2303.12357  [pdf, other] cs.LG cs.AI
Wasserstein Adversarial Examples on Univariant Time Series Data
Authors: Wenjie Wang, Li Xiong, Jian Lou
Abstract: Adversarial examples are crafted by adding indistinguishable perturbations to normal examples in order to fool a well-trained deep learning model to misclassify. In the context of computer vision, this notion of indistinguishability is typically bounded by $L^\infty$ or other norms. However, these norms are not appropriate for measuring indistinguishiability for time series data. In this work, w…  More
Submitted 22 March, 2023; originally announced March 2023.


arXiv:2303.11844  [pdf, other] math.OC cs.LG stat.ML
Doubly Regularized Entropic Wasserstein Barycenters
Authors: Lénaïc Chizat
Abstract: We study a general formulation of regularized Wasserstein barycenters that enjoys favorable regularity, approximation, stability and (grid-free) optimization properties. This barycenter is defined as the unique probability measure that minimizes the sum of entropic optimal transport (EOT) costs with respect to a family of given probability measures, plus an entropy term. We denote it (λ,τ)-baryc…  More
Submitted 21 March, 2023; originally announced March 2023.
MSC Class: 49N99 (Primary) 62G05; 90C30 (Secondary)
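
As background for this entry, a plain Sinkhorn iteration for entropic optimal transport between two discrete measures (the basic EOT building block; the doubly regularised barycenter of the paper adds further structure on top of it). The grid, the regularisation eps = 0.05 and the iteration count are illustrative assumptions.

```python
# Basic Sinkhorn iterations for entropic optimal transport between two histograms.
import numpy as np

def sinkhorn(a, b, C, eps=0.05, n_iter=500):
    """a, b: probability vectors; C: cost matrix. Returns plan and transport cost."""
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]
    return P, float(np.sum(P * C))

x = np.linspace(0, 1, 50)
a = np.exp(-((x - 0.2) ** 2) / 0.01); a /= a.sum()
b = np.exp(-((x - 0.7) ** 2) / 0.02); b /= b.sum()
C = (x[:, None] - x[None, :]) ** 2
P, cost = sinkhorn(a, b, C)
print(cost)   # approximates the squared W2 between the two histograms
```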


arXiv:2303.08950  [pdf, other] math.NA math.OC
High order spatial discretization for variational time implicit schemes: Wasserstein gradient flows and reaction-diffusion systems
Authors: Guosheng Fu, Stanley Osher, Wuchen Li
Abstract: We design and compute first-order implicit-in-time variational schemes with high-order spatial discretization for initial value gradient flows in generalized optimal transport metric spaces. We first review some examples of gradient flows in generalized optimal transport spaces from the Onsager principle. We then use a one-step time relaxation optimization problem for time-implicit schemes, namely…  More

Submitted 15 March, 2023; originally announced March 2023.



arXiv:2303.06595  [pdf, other] cs.CG cs.LG math.OC
A Convergent Single-Loop Algorithm for Relaxation of Gromov-Wasserstein in Graph Data
Authors: Jiajin Li, Jianheng Tang, Lemin Kong, Huikang Liu, Jia Li, Anthony Man-Cho So, Jose Blanchet
Abstract: In this work, we present the Bregman Alternating Projected Gradient (BAPG) method, a single-loop algorithm that offers an approximate solution to the Gromov-Wasserstein (GW) distance. We introduce a novel relaxation technique that balances accuracy and computational efficiency, albeit with some compromises in the feasibility of the coupling map. Our analysis is based on the observation that the GW…  More
Submitted 12 March, 2023; originally announced March 2023.
Comments: Accepted by ICLR 2023

All 4 versions 


arXiv:2303.06398  [pdf, ps, other] stat.CO cs.CE eess.SY stat.ML
Variational Gaussian filtering via Wasserstein gradient flows
Authors: Adrie Corenflos, Hany Abdulsamad
Abstract: In this article, we present a variational approach to Gaussian and mixture-of-Gaussians assumed filtering. Our method relies on an approximation stemming from the gradient-flow representations of a Kullback--Leibler discrepancy minimization. We outline the general method and show its competitiveness in parameter estimation and posterior representation for two models for which Gaussian approximatio…  More
Submitted 11 March, 2023; originally announced March 2023.
Comments: 5 pages, 2 figures, double column


2023


arXiv:2303.05978  [pdf, other] cs.LG
Neural Gromov-Wasserstein Optimal Transport
Authors: Maksim Nekrashevich, Alexander Korotin, Evgeny Burnaev
Abstract: We present a scalable neural method to solve the Gromov-Wasserstein (GW) Optimal Transport (OT) problem with the inner product cost. In this problem, given two distributions supported on (possibly different) spaces, one has to find the most isometric map between them. Our proposed approach uses neural networks and stochastic mini-batch optimization which allows to overcome the limitations of exist…  More
Submitted 10 March, 2023; originally announced March 2023.

Working Paper
Variational Gaussian filtering via Wasserstein gradient flows

Corenflos, Adrie; Abdulsamad, Hany. arXiv.org; Ithaca, Mar 11, 2023.

arXiv:2303.05798  [pdf, other] cs.LG eess.SP stat.ML
Sliced-Wasserstein on Symmetric Positive Definite Matrices for M/EEG Signals
Authors: Clément Bonet, Benoît Malézieux, Alain Rakotomamonjy, Lucas Drumetz, Thomas Moreau, Matthieu Kowalski, Nicolas Courty
Abstract: When dealing with electro or magnetoencephalography records, many supervised prediction tasks are solved by working with covariance matrices to summarize the signals. Learning with these matrices requires using Riemanian geometry to account for their structure. In this paper, we propose a new method to deal with distributions of covariance matrices and demonstrate its computational efficiency on M…  More
Submitted 10 March, 2023; originally announced March 2023.

All 2 versions 

arXiv:2303.05119  [pdf, other] stat.ML cs.LG
Entropic Wasserstein Component Analysis
Authors: Antoine Collas, Titouan Vayer, Rémi Flamary, Arnaud Breloy
Abstract: Dimension reduction (DR) methods provide systematic approaches for analyzing high-dimensional data. A key requirement for DR is to incorporate global dependencies among original and embedded samples while preserving clusters in the embedding space. To achieve this, we combine the principles of optimal transport (OT) and principal component analysis (PCA). Our method seeks the best linear subspace…  More
Submitted 9 March, 2023; originally announced March 2023.


arXiv:2303.04294  [pdf, ps, other] math.MG math.OC
Ultralimits of Wasserstein spaces and metric measure spaces with Ricci curvature bounded from below
Authors: Andrew Warren
Abstract: We investigate the stability of the Wasserstein distance, a metric structure on the space of probability measures arising from the theory of optimal transport, under metric ultralimits. We first show that if $(X_i, d_i)_{i \in \mathbb{N}}$ is a sequence of metric spaces with metric ultralimit $(\hat{X}, \hat{d})$, then the $p$-Wasserstein space $(\mathcal{P}_p(\hat{X}), W_p)$ embeds isometrically…  More
Submitted 7 March, 2023; originally announced March 2023.
Comments: 42 pages
MSC Class: 28E05 (Primary); 49Q22; 28A33; 53C23 (Secondary)


arXiv:2303.03883  [pdf, ps, other] math.OC
A note on the Bures-Wasserstein metric
Authors: Shravan Mohan
Abstract: In this brief note, it is shown that the Bures-Wasserstein (BW) metric on the space positive definite matrices lends itself to convex optimization. In other words, the computation of the BW metric can be posed as a convex optimization problem. In turn, this leads to efficient computations of (i) the BW distance between convex subsets of positive definite matrices, (ii) the BW barycenter, and (iii)…  More
Submitted 7 March, 2023; originally announced March 2023.

All 2 versions 
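
For reference, the Bures-Wasserstein distance between positive semi-definite matrices discussed here has the standard closed form, which also equals the 2-Wasserstein distance between centred Gaussians with those covariances. Below is a direct numerical evaluation of that formula (a generic sketch, not code from the note; the two example matrices are made up).

```python
# Bures-Wasserstein distance: BW(A, B)^2 = tr(A) + tr(B) - 2 tr((A^{1/2} B A^{1/2})^{1/2}).
import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein(A, B):
    rA = sqrtm(A)
    cross = sqrtm(rA @ B @ rA)
    val = np.trace(A) + np.trace(B) - 2.0 * np.trace(cross)
    return float(np.sqrt(max(np.real(val), 0.0)))   # clip tiny negative round-off

A = np.array([[2.0, 0.3], [0.3, 1.0]])
B = np.array([[1.0, -0.2], [-0.2, 1.5]])
print(bures_wasserstein(A, B))
```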

<–—2023———2023———290—


arXiv:2303.03027  [pdf, other] stat.ML cs.LG

Critical Points and Convergence Analysis of Generative Deep Linear Networks Trained with Bures-Wasserstein Loss
Authors: Pierre Bréchet, Katerina Papagiannouli, Jing An, Guido Montúfar
Abstract: We consider a deep matrix factorization model of covariance matrices trained with the Bures-Wasserstein distance. While recent works have made important advances in the study of the optimization problem for overparametrized low-rank matrix approximation, much emphasis has been placed on discriminative settings and the square loss. In contrast, our model considers another interesting type of loss a…  More
Submitted 6 March, 2023; originally announced March 2023.
Comments: 35 pages, 1 figure


arXiv:2303.02378  [pdf, other] cs.LG cs.AI
Wasserstein Actor-Critic: Directed Exploration via Optimism for Continuous-Actions Control
Authors: Amarildo Likmeta, Matteo Sacco, Alberto Maria Metelli, Marcello Restelli
Abstract: Uncertainty quantification has been extensively used as a means to achieve efficient directed exploration in Reinforcement Learning (RL). However, state-of-the-art methods for continuous actions still suffer from high sample complexity requirements. Indeed, they either completely lack strategies for propagating the epistemic uncertainty throughout the updates, or they mix it with aleatoric uncerta…  More
Submitted 4 March, 2023; originally announced March 2023.


arXiv:2303.02183  [pdf, other] math.MG math.AP math.PR
Extending the Wasserstein metric to positive measures
Authors: Hugo Leblanc, Thibaut Le Gouic, Jacques Liandrat, Magali Tournus
Abstract: We define a metric in the space of positive finite positive measures that extends the 2-Wasserstein metric, i.e. its restriction to the set of probability measures is the 2-Wasserstein metric. We prove a dual and a dynamic formulation and extend the gradient flow machinery of the Wasserstein space. In addition, we relate the barycenter in this space to the barycenter in the Wasserstein space of th…  More
Submitted 3 March, 2023; originally announced March 2023.


arXiv:2303.00398  [pdf, ps, other] math.PR math.FA
Wasserstein geometry and Ricci curvature bounds for Poisson spaces
Authors: Lorenzo Dello Schiavo, Ronan Herry, Kohei Suzuki
Abstract: Let $\Upsilon$ be the configuration space over a complete and separable metric base space, endowed with the Poisson measure $\pi$. We study the geometry of $\Upsilon$ from the point of view of optimal transport and Ricci-lower bounds. To do so, we define a formal Riemannian structure on $\mathcal{P}_1(\Upsilon)$, the space of probability measures over $\Upsilon$ with finite first momen…  More
Submitted 1 March, 2023; originally announced March 2023.
Comments: 45 pages, comments are welcome
MSC Class: 60G55; 49Q22; 30L99

Wasserstein geometry and Ricci curvature bounds for Poisson spaces

L Dello Schiavo, R Herry, K Suzuki - arXiv e-prints, 2023 - ui.adsabs.harvard.edu

Let $\varUpsilon $ be the configuration space over a complete and separable metric base

space, endowed with the Poisson measure $\pi $. We study the geometry of $\varUpsilon $ …

Cited by 2 Related articles All 6 versions 

arXiv:2302.14618  [pdf, other] stat.ME stat.CO
Barycenter Estimation of Positive Semi-Definite Matrices with Bures-Wasserstein Distance
Authors: Jingyi Zheng, Huajun Huang, Yuyan Yi, Yuexin Li, Shu-Chin Lin
Abstract: Brain-computer interface (BCI) builds a bridge between human brain and external devices by recording brain signals and translating them into commands for devices to perform the user's imagined action. The core of the BCI system is the classifier that labels the input signals as the user's imagined action. The classifiers that directly classify covariance matrices using Riemannian geometry are wide…  More
Submitted 24 February, 2023; originally announced February 2023.

All 2 versions 

2023


arXiv:2302.13968  [pdf, other] math-ph math.DS math.PR
Cutoff ergodicity bounds in Wasserstein distance for a viscous energy shell model with Lévy noise
Authors: Gerardo Barrera, Michael A. Högele, Juan Carlos Pardo, Ilya Pavlyukevich
Abstract: This article establishes non-asymptotic ergodic bounds in the renormalized, weighted Kantorovich-Wasserstein-Rubinstein distance for a viscous energy shell lattice model of turbulence with random energy injection. The obtained bounds turn out to be asymptotically sharp and establish abrupt thermalization. The types of noise under consideration are Gaussian and symmetric $\alpha$-stable, white and stati…  More
Submitted 3 March, 2023; v1 submitted 27 February, 2023; originally announced February 2023.
Comments: 26 pages
MSC Class: 60H10; 37L15; 37L60; 76M35; 76F20

Cited by 2 Related articles All 4 versions 

  

2023 see 202- 

Ho-Nguyen, Nam; Wright, Stephen J.

Adversarial classification via distributional robustness with Wasserstein ambiguity. (English) Zbl 07667538

Math. Program. 198, No. 2 (B), 1411-1447 (2023).

MSC:  68T09 90C17 90C26 90C30

PDF BibTeX XML Cite

Full Text: DOI 



 2023 see 2022

Cavagnari, Giulia; Savaré, Giuseppe; Sodini, Giacomo Enrico

Dissipative probability vector fields and generation of evolution semigroups in Wasserstein spaces. (English) Zbl 07661875

Probab. Theory Relat. Fields 185, No. 3-4, 1087-1182 (2023).

MSC:  34A06 34A45 34A12 34A34 34A60 28A50

PDF BibTeX XML Cite

Full Text: DOI  



Zhou, Datong; Chen, Jing; Wu, Hao; Yang, Dinghui; Qiu, Lingyun

The Wasserstein-Fisher-Rao metric for waveform based earthquake location. (English) Zbl 07661692

J. Comput. Math. 41, No. 3, 437-458 (2023).

MSC:  65K10 86A15 86A22

PDF BibTeX XML Cite

Full Text: DOI  


OpenURL 

2023 see 2022

Gehér, György Pál; Pitrik, József; Titkos, Tamás; Virosztek, Dániel

Quantum Wasserstein isometries on the qubit state space. (English) Zbl 07659440

J. Math. Anal. Appl. 522, No. 2, Article ID 126955, 17 p. (2023).

MSC:  46L89 81P47 81P45

PDF BibTeX XML Cite

Full Text: DOI


 arXiv 

<–—2023———2023———300—

Xu, Guanglong; Hu, Zhensheng; Cai, Jia

Wad-CMSN: Wasserstein distance-based cross-modal semantic network for zero-shot sketch-based image retrieval. (English) Zbl 07659391

Int. J. Wavelets Multiresolut. Inf. Process. 21, No. 2, Article ID 2250054, 19 p. (2023).

MSC:  68U10 68P20 68T05

PDF BibTeX XML Cite

Full Text: DOI 

 arXiv 

Shen, Haoming; Jiang, Ruiwei


2023 see 2022

Chance-constrained set covering with Wasserstein ambiguity. (English) Zbl 07658261

Math. Program. 198, No. 1 (A), 621-674 (2023).

MSC:  90C15 90C47 90C11

PDF BibTeX XML Cite

Full Text: DOI 

 arXiv  


Bistroń, R.; Eckstein, M.; Życzkowski, K.

Monotonicity of a quantum 2-Wasserstein distance. (English) Zbl 07657631

J. Phys. A, Math. Theor. 56, No. 9, Article ID 095301, 24 p. (2023).

MSC:  81-XX 82-XX

PDF BibTeX XML Cite

Full Text: DOI 


Cited by 5 Related articles All 2 versions

A novel prediction approach of polymer gear contact fatigue based on a WGAN‐XGBoost model

C Jia, P Wei, Z Lu, M Ye, R Zhu, H Liu - Fatigue & Fracture of … - Wiley Online Library

… In the data enhancement part, the WGAN algorithm is used to model the durability test …

WGAN algorithm introduces the Wasserstein distance based on the GAN model. The Wasserstein …


 

Wasserstein-based distributionally robust neural network for non-intrusive load monitoring

Zhang, Q; Yan, Y; (...); Yang, LF

Apr 5 2023 | 

FRONTIERS IN ENERGY RESEARCH

 11

Non-intrusive load monitoring (NILM) is a technique that uses electrical data analysis to disaggregate the total energy consumption of a building or home into the energy consumption of individual appliances. To address the data uncertainty problem in non-intrusive load monitoring, this paper constructs an ambiguity set to improve the robustness of the model based on the distributionally robust

Show more


34 References. Related records

[CITATION] A Wasserstein-based distributionally robust neural network for non-intrusive load monitoring

Q Zhang, Y Yan, F Kong, S Chen, L Yang - Frontiers in Energy Research - Frontiers

 

2023



[HTML] hanspub.org

[HTML] 基于 WGAN-GP 的建筑垃圾数据集的优化与扩充

邬欣诺 - Computer Science and Application, 2023 - hanspub.org

The model framework adopted in this paper is the Wasserstein GAN (WGAN) [11]. WGAN proposes a new way of measuring distance, using the Wasserstein distance (also called the Earth-Mover distance) to measure the distance between the real data distribution and the generated data distribution.

[Chinese. Optimization and Expansion of Construction Waste Dataset Based on WGAN-GP]

2023 see 2021. [PDF] arxiv.org

Stochastic Wasserstein Hamiltonian Flows

J Cui, S Liu, H Zhou - Journal of Dynamics and Differential Equations, 2023 - Springer

… the lens of conditional probability, induces the stochastic Wasserstein Hamiltonian flow on

… structures in the density manifold without the help of conditional probability (see Sect. 3). …

Cited by 4 Related articles All 7 versions

[PDF] wiley.com

Full View

Self‐supervised non‐rigid structure from motion with improved training of Wasserstein GANs

Y Wang, X Peng, W Huang, X Ye… - IET Computer …, 2023 - Wiley Online Library

This study proposes a self‐supervised method to reconstruct 3D limbic structures from 2D

landmarks extracted from a single view. The loss of self‐consistency can be reduced by …

ARTICLE

Self‐supervised non‐rigid structure from motion with improved training of Wasserstein GANs

Wang, Yaming ; Peng, Xiangyang ; Huang, Wenqing ; Ye, Xiaoping ; Jiang, Mingfeng; Wiley

IET computer vision, 2023, Vol.17 (4), p.404-414

PEER REVIEWED

OPEN ACCESS

Related articles
 All 3 versions


[PDF] wiley.com

Sparse super resolution and its trigonometric approximation in the p-Wasserstein distance

P Catala, M Hockmann, S Kunis - PAMM, 2023 - Wiley Online Library

… Hence, we can quantify rates of weak convergence in terms of Wasserstein distances being

a … of the Wasserstein metric. Frequently, we will use the dual formulation of the Wasserstein …

Related articles All 3 versions

[PDF] aimspress.com

[PDF] … show that the WGAN has excellent feature extraction capabilities, and …

 All 2 versions 

<–—2023———2023———310—



 

Text-to-image Generation Model Based on Diffusion Wasserstein Generative Adversarial Networks

H ZHAO, W LI - Journal of Electronics & Information Technology (电子与信息学报), 2023 - jeit.ac.cn

… 200 datasets show that D-WGAN achieves stable training while … These results indicate that

D-WGAN can generate higher … : A framework based on gp-wgan and enhanced faster R-CNN[…


Wasserstein Loss for Semantic Editing in the Latent Space of GANs
by Doubinsky, Perla; Audebert, Nicolas; Crucianu, Michel ; More...
03/2023
The latent space of GANs contains rich semantics reflecting the training data. Different methods propose to learn edits in latent space corresponding to...
Journal Article  Full Text Online

[PDF] hal.science

Wasserstein Loss for Semantic Editing in the Latent Space of GANs

P Doubinsky, N Audebert, M Crucianu, H Le Borgne - 2023 - hal.science

… We propose an alternative formulation based on the Wasserstein loss that avoids such

 classifier-based approaches. We …

Cited by 1 Related articles All 7 versions



On Wasserstein distances, barycenters, and the cross-section methodology for proxy credit curves

M Michielon, A Khedher, P Spreij - International Journal of Financial …, 2023 - World Scientific

… In particular, we investigate how to embed the concepts of Wasserstein distance and

Wasserstein barycenter between implied CDS probability distributions in a cross-sectional …

MR4591927 
All 4 versions


[PDF] int-arch-photogramm-remote-sens-spatial-inf-sci.net

Improving Semantic Segmentation of High-Resolution Remote Sensing Images Using Wasserstein Generative Adversarial Network

HR Hosseinpour… - … Archives of the …, 2023 - … -remote-sens-spatial-inf-sci.net

… In this research, WGAN is specifically used to train the GAN network. … of the segmentation

network a

All 6 versions 

2023 see 2022

[HTML] Wasserstein t-SNE

F Bachmann, P Hennig, D Kobak - … 19–23, 2022, Proceedings, Part I, 2023 - Springer

… exact Wasserstein distances. We use synthetic data to demonstrate the effectiveness of our

… the Wasserstein metric [9] to compute pairwise distances between units. The Wasserstein …

Related articles All 6 versions


2023


[PDF] researchgate.net

[PDF] Markovian Sliced Wasserstein Distances: Beyond Independent Projections

K Nguyen, T Ren, N Ho - 2023 - researchgate.net

… background for Wasserstein distance, sliced Wasserstein distance, and max sliced Wasserstein

distance in Section 2. In Section 3, we propose Markovian sliced Wasserstein distances …

Cited by 2 Related articles All 2 versions 

 

[PDF] wiley.com

Multi‐marginal Approximation of the Linear Gromov–Wasserstein Distance

F Beier, R Beinert - PAMM, 2023 - Wiley Online Library

Recently, two concepts from optimal transport theory have successfully been brought to the

Gromov–Wasserstein (GW) setting. This introduces a linear version of the GW distance and …

Related articles All 2 versions


[PDF] researchgate.net

[PDF] Wasserstein Bounds in the Clt of Estimators of The Drift Parameter for Ornstein-Uhlenbeck Processes Observed at High Frequency

M AL-FORAIH - Conference Proceedings Report, 2023 - researchgate.net

This paper deals with the rate of convergence for the central limit theorem of estimators of the

drift coefficient, denoted θ, for a Ornstein-Uhlenbeck process X:={Xt, t≥ 0} observed at high …

Related articles All 3 versions 

Data Generation Scheme for Photovoltaic Power Forecasting Using Wasserstein Gan with Gradient Penalty Combined with Autoencoder and Regression Models

S Park, J MoonE Hwang - Available at SSRN 4388129 - papers.ssrn.com

… Wasserstein GAN with gradient penalty (WGAN-GP), autoencoder (AE), and regression model.

AE guides the WGAN… and the regression model guides the WGAN-GP to generate output …

 

[PDF] openreview.net

Provable Robustness against Wasserstein Distribution Shifts via Input Randomization

A Kumar, A Levine, T Goldstein, S Feizi - The Eleventh International … - openreview.net

Certified robustness in machine learning has primarily focused on adversarial perturbations

with a fixed attack budget for each sample in the input distribution. In this work, we present 

<–—2023———2023———320—


[HTML] 基于 Wasserstein 生成对抗网络和残差网络的 8 类蛋白质二级结构预测

李舜, 马玉明, 刘毅慧 - Hans Journal of Computational Biology, 2023 - hanspub.org

… a method for 8-state protein secondary structure prediction based on a Wasserstein generative adversarial network (WGAN) and a residual network (ResNet). Using the Wasserstein generative adversarial network (WGAN) … Experiments show that the Wasserstein generative adversarial network (WGAN) …

Related articles All 3 versions

[Chinese. Eight-class protein secondary structure prediction based on Wasserstein generative adversarial networks and residual networks]


[PDF] hal.science

[PDF] A travers et autour des barycentres de Wasserstein
[French. Through and around Wasserstein barycenters]

IP GENTIL, AR SUVORIKOVA - theses.hal.science

… We are mainly motivated by the Wasserstein barycenter problem introduced by M. Agueh

and G. Carlier in 2011: … We refer to the recent monograph [PZ20] for more details on …

 


Aero-engine high speed bearing fault diagnosis for data imbalance: A sample enhanced diagnostic method based on pre-training WGAN-GP

J Chen, Z Yan, C Lin, B Yao, H Ge - Measurement, 2023 - Elsevier

… classifier Wasserstein generative adversarial network with gradient penalty (PT-WGAN-GP). …

and incorporated into the discriminator and classifier of WGAN-GP for feature adaptive and …

Related articles All 3 versions 

[HTML] mdpi.com

[HTML] A Solar Irradiance Forecasting Framework Based on the CEE-WGAN-LSTM Model

Q Li, D Zhang, K Yan - Sensors, 2023 - mdpi.com

… Therefore, this paper uses the WGAN model to train the high-frequency irradiance

subsequences after CEEMDAN decomposition, and its model structure is shown in Figure 3. Our …

Cited by 9 Related articles All 8 versions 

2023


[PDF] ieee.org

DPBA-WGAN: A Vector-Valued Differential Private Bilateral Alternative Scheme on WGAN for Image Generation

D Wu, W Zhang, P Zhang - IEEE Access, 2023 - ieeexplore.ieee.org

… WGAN For better understanding the above algorithm, we discuss how the DPBA-WGAN can

… The WGAN training focuses on the privacy barrier of the discriminator. From the direction of …



TextureWGAN: texture preserving WGAN with multitask regularizer for computed tomography inverse problems

M Ikuta, J Zhang - Journal of Medical Imaging, 2023 - spiedigitallibrary.org

… TextureWGAN uses the discriminator and the generator of the Wasserstein GAN (WGAN). …

TextureWGAN is extended from the WGAN method. We show how the WGAN is turned into …

All 4 versions


[PDF] iop.org

Single-Location and Multi-Locations Scenarios Generation for Wind Power Based On WGAN-GP

J Zhu, Q Ai, Y Chen, J Wang - Journal of Physics: Conference …, 2023 - iopscience.iop.org

… According to the table, both WGAN and WGAN-GP can track the power generation characteristics

of the original data. The MMD index of the WGAN-GP data is closer to the index of the …

All 3 versions


A continual encrypted traffic classification algorithm based on WGAN

X Ma, W Zhu, Y Jin, Y Gao - Third International Seminar on …, 2023 - spiedigitallibrary.org

… In this paper, we propose a continual encrypted traffic classification method based on

WGAN. We use WGAN to train a separate generator for each class of encrypted traffic. The …

All 2 versions


2023 see 2022. [PDF] arxiv.org

Quantum Wasserstein isometries on the qubit state space

GP Gehér, J Pitrik, T Titkos, D Virosztek - Journal of Mathematical Analysis …, 2023 - Elsevier

… We describe Wasserstein isometries of the quantum bit state … On the other hand, for the cost 

generated by the qubit ‘‘clock” … surprising properties of the quantum Wasserstein distance. …

Cited by 2 Related articles All 3 versions

<–—2023———2023———330—



Checkpoints for Morphological Classification of Radio Galaxies with wGAN...
by Griese, Florian; Kummer, Janis; Rustige, Lennart
02/2023
Checkpoint for the Generator Model described in https://github.com/floriangriese/wGAN-supported-augmentation
Data SetCitation Online
Open Access
 

2023 patent

一种基于WGAN-Unet...
03/2023
Patent  Available Online

Open Access
[Chinese. A WGAN-Unet-based …]



Hydrological objective functions and ensemble averaging with the Wasserstein...
by Magyar, Jared C; Sambridge, Malcolm
Hydrology and earth system sciences, 03/2023, Volume 27, Issue 5
When working with hydrological data, the ability to quantify the similarity of different datasets is useful. The choice of how to make this quantification has...
Article PDFPDF
Journal Article  Full Text Online
More Options
Open Access

 
More actions

Medical multivariate time series imputation and forecasting based on a recurrent conditional Wasserstein...
by Festag, Sven; Spreckelsen, Cord

Journal of biomedical informatics, 03/2023, Volume 139
In the fields of medical care and research as well as hospital management, time series are an important part of the overall data basis. To ensure high quality...
Article PDFPDF
Journal Article  Full Text Online
Open Access

2023 see 2022
An Efficient Content Popularity Prediction of Privacy Preserving Based on Federated Learning and Wasserstein...
by Wang, Kailun; Deng, Na; Li, Xuanheng
IEEE internet of things journal, 03/2023, Volume 10, Issue 5
To relieve the high backhaul load and long transmission time caused by the huge mobile data traffic, caching devices are deployed at the edge of mobile...
Journal Article  Full Text Online


2023


2023 see 2022
WASCO: A Wasserstein-...
by González-Delgado, Javier; Sagar, Amin; Zanon, Christophe ; More...
Journal of mol


Sparse super resolution and its trigonometric approximation in the p‐Wasserstein...
by Catala, Paul; Hockmann, Mathias; Kunis, Stefan
Proceedings in applied mathematics and mechanics, 03/2023, Volume 22, Issue 1
We consider the approximation of measures by trigonometric polynomials with respect to the p‐Wasserstein distance for general p ≥ 1. While the best...
Article PDFPDF
Journal Article  Full Text Online
Open Access

Entropic Wasserstein...
by Collas, Antoine; Vayer, Titouan; Flamary, Rémi ; More...
03/2023
Dimension reduction (DR) methods provide systematic approaches for analyzing high-dimensional data. A key requirement for DR is to incorporate global...
Journal Article  Full Text Online

2023 see arXiv
Doubly Regularized Entropic Wasserstein...
by Chizat, Lénaïc
03/2023
We study a general formulation of regularized Wasserstein barycenters that enjoys favorable regularity, approximation, stability and (grid-free) optimization...
Journal Article  Full Text Online

2023 see arXiv
 

Neural Gromov-Wasserstein Optimal Transport
Neural Gromov-W...
by Nekrashevich, Maksim; Korotin, Alexander; Burnaev, Evgeny
03/2023
We present a scalable neural method to solve the Gromov-Wasserstein (GW) Optimal Transport (OT) problem with the inner product cost. In this problem, given two...
Journal Article  Full Text Online

All 2 versions 

<–—2023———2023———340 —


2023 see arXiv
Variational Gaussian filtering via Wasserstein...
by Corenflos, Adrien; Abdulsamad, Hany
03/2023
In this article, we present a variational approach to Gaussian and mixture-of-Gaussians assumed filtering. Our method relies on an approximation stemming from...
Journal Article  Full Text Online

2023 see arXiv
A note on the Bures-Wasserstein...
by Mohan, Shravan
03/2023
In this brief note, it is shown that the Bures-Wasserstein (BW) metric on the space positive definite matrices lends itself to convex optimization. In other...
Journal Article  Full Text Online
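
As a pointer for the Bures-Wasserstein entries in this list, the following is a minimal sketch (my own toy code, not taken from the note above) of the standard closed form d_BW(A,B)^2 = tr(A) + tr(B) - 2 tr((A^{1/2} B A^{1/2})^{1/2}) for symmetric positive definite matrices, which coincides with the 2-Wasserstein distance between the centered Gaussians N(0,A) and N(0,B).

# Bures-Wasserstein distance between two SPD matrices (illustrative sketch only)
import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein(A: np.ndarray, B: np.ndarray) -> float:
    root_A = np.real(sqrtm(A))                      # matrix square root of A
    cross = np.real(sqrtm(root_A @ B @ root_A))     # (A^{1/2} B A^{1/2})^{1/2}
    d2 = np.trace(A) + np.trace(B) - 2.0 * np.trace(cross)
    return float(np.sqrt(max(d2, 0.0)))             # clamp tiny negative round-off

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(5, 5)); A = X @ X.T + np.eye(5)   # assumed toy SPD inputs
    Y = rng.normal(size=(5, 5)); B = Y @ Y.T + np.eye(5)
    print(bures_wasserstein(A, B))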

2023 see arXiv
Extending the Wasser...
by Leblanc, Hugo; Gouic, Thibaut Le; Liandrat, Jacques ; More...
03/2023
We define a metric on the space of finite positive measures that extends the 2-Wasserstein metric, i.e. its restriction to the set of probability...
Journal Article  Full Text Online

2023 see arXiv 

Wasserstein Adversarial Examples on Univariant Time Series Data
Wasserstein Adversarial...

by Wang, Wenjie; Xiong, Li; Lou, Jian
03/2023
Adversarial examples are crafted by adding indistinguishable perturbations to normal examples in order to fool a well-trained deep learning model to...
Journal Article  Full Text Online
 All 2 versions


2023 see arXiv
 [Submitted on 22 Mar 2023]

Wasserstein Auto-encoded MDPs: Formal Verification of Efficiently Distilled RL Policies with Many-sided

Guarantee

Wasserstein...

by Delgrange, Florent; Nowé, Ann; Pérez, Guillermo A

03/2023

Although deep reinforcement learning (DRL) has many success stories, the large-scale

deployment of policies learned through these advanced techniques in...

Journal Article  Full Text Online

Cited by 2 All 4 versions

2023


2023 see arXiv

High order spatial discretization for variational time implicit

schemes: Wasserstein...

by Fu, Guosheng; Osher, Stanley; Li, Wuchen

03/2023

We design and compute first-order implicit-in-time variational schemes with high-order

spatial discretization for initial value gradient flows in generalized...

Journal Article  Full Text Online



2023 see arXiv
A Convergent Single-Loop Algorithm for Relaxation of Gromov-Wasserstein...
by Li, Jiajin; Tang, Jianheng; Kong, Lemin ; More...
03/2023
In this work, we present the Bregman Alternating Projected Gradient (BAPG) method, a single-loop algorithm that offers an approximate solution to the...
Journal Article  Full Text Online


2023 arXiv 

Sliced-Wasserstein on Symmetric Positive Definite Matrices for M/EEG Signals
Sliced-Wasserstein...
by Bonet, Clément; Malézieux, Benoît; Rakotomamonjy, Alain ; More...
03/2023
When dealing with electro or magnetoencephalography records, many supervised prediction tasks are solved by working with covariance matrices to summarize the...
Journal Article  Full Text Online
All 2 versions


2023 see arXiv 
Ultralimits of Wasserstein spaces and metric measure spaces with Ricci curvature bounded from below
Ultralimits of Wasserstein...
by Warren, Andrew
03/2023
We investigate the stability of the Wasserstein distance, a metric structure on the space of probability measures arising from the theory of optimal transport,...
Journal Article  Full Text Online
 

2023 see arXiv
Critical Points and Convergence Analysis of Generative Deep Linear Networks Trained with Bures-Wasserstein...
by Bréchet, Pierre; Papagiannouli, Katerina; An, Jing ; More...
03/2023
We consider a deep matrix factorization model of covariance matrices trained with the Bures-Wasserstein distance. While recent works have made important...
Journal Article  Full Text Online

<–—2023———2023———350 —



arXiv Wasserstein Actor-Critic: Directed Exploration via Optimism for Continuous-Actions Control

Wasserstein...

by Likmeta, Amarildo; Sacco, Matteo; Metelli, Alberto Maria ; More...

03/2023

Uncertainty quantification has been extensively used as a means to achieve efficient directed exploration in Reinforcement

Learning (RL). However,...

Journal Article  Full Text Online


2023 patent news
Univ Jiliang China Submits Chinese Patent Application for Early Fault Detection Method Based on Wasserstein...
Global IP News. Measurement & Testing Patent News, 03/2023
NewsletterCitation Online
Univ Qinghua Seeks Patent for Rotating Machine State Monitoring Method Based on Wasserstein...
Global IP News. Tools and Machinery Patent News, 03/2023
NewsletterCitation Online
 Quick Look
Univ Jiliang China Submits Chinese Patent Application for Early Fault Detection Method Based on Wasserstein...
Global IP News: Measurement & Testing Patent News, Mar 15, 2023
Newspaper ArticleCitation Online
Univ Qinghua Seeks Patent for Rotating Machine State Monitoring Method Based on Wasserstein...
Global IP News: Tools and Machinery Patent News, Mar 1, 2023
Newspaper ArticleCitation Online
 


MR4565740
 Prelim Moosmüller, Caroline; Cloninger, Alexander; Linear optimal transport embedding: provable Wasserstein classification for certain rigid transformations and perturbations. Inf. Inference 12 (2023), no. 1, 363–389.

Review PDF Clipboard Journal Article

Cited by 23 Related articles All 4 versions
[PDF] researchgate.net


Distributionally robust optimization with Wasserstein metric for multi-period portfolio selection under uncertainty

Z Wu, K Sun - Applied Mathematical Modelling, 2023 - Elsevier

… to estimate the radius of the Wasserstein balls under time-… by the different radius of the 

Wasserstein balls. Finally, we … the Wasserstein metric and definition of the Wasserstein ball. …

 Related articles All 2 versions
 Zbl 07682496


MR4565739 Prelim Liu, Zheng; Loh, Po-Ling; Robust W-GAN-based estimation under Wasserstein contamination. Inf. Inference 12 (2023), no. 1, 312–362.

Review PDF Clipboard Journal Article

Cited by 1 Related articles All 5 versions

2023


MR4563421 Prelim Kim, Kihyun; Yang, Insoon; Distributional Robustness in Minimax Linear Quadratic Control with Wasserstein Distance. SIAM J. Control Optim. 61 (2023), no. 2, 458–483. 93E20 (49N10 93C55)

Review PDF Clipboard Journal Article
Cited by 14
 Related articles All 3 versions


MR4562218 Prelim Ho-Nguyen, Nam; Wright, Stephen J.; Adversarial classification via distributional robustness with Wasserstein …

Review PDF Clipboard Journal Article

Zbl 07667538


MR4561070 Prelim Chen, Zhi; Kuhn, Daniel; Wiesemann, Wolfram; On approximations of data-driven chance constrained programs over Wasserstein balls. Oper. Res. Lett. 51 (2023), no. 3, 226–233. 90C15

Review PDF Clipboard Journal Article

Cited by 9 Related articles All 7 versions

 Cited by 173 Related articles All 12 versions

MR4557952 Prelim Lacombe, Julien; Digne, Julie; Courty, Nicolas; Bonneel, Nicolas; Learning to

Review PDF Clipboard Journal Article

MR4557142 Prelim Li, Shun; Yuan, Lu; Ma, Yuming; Liu, Yihui; WG-ICRN: Protein 8-state secondary structure prediction based on Wasserstein generative adversarial networks and residual networks with Inception modules. Math. Biosci. Eng. 20 (2023), no. 5, 7721–7737.

Review PDF Clipboard Journal Article

Cited by 1 Related articles All 3 versions 

<–—2023———2023———360 —


MR4556289 Prelim Cavagnari, Giulia; Savaré, Giuseppe; Sodini, Giacomo Enrico; Dissipative probability vector fields and generation of evolution semigroups in Wasserstein spaces. Probab. Theory Related Fields 185 (2023), no. 3-4, 1087–1182. 35R60 (49J40 49Q20)

Review PDF Clipboard Journal Article

Cited by 6 Related articles All 7 versions

MR4555084 Prelim Bistroń, R.; Eckstein, M.; Życzkowski, K.; Monotonicity of a quantum 2-Wasserstein distance. J. Phys. A 56 (2023), no. 9, Paper No. 095301, 24 pp. 49Q22 (35Q40 53B12 81P16 82C70)

Review PDF Clipboard Journal Article

Monotonicity of a quantum 2-Wasserstein distance

Bistroń, R; Eckstein, M and Życzkowski, K

Mar 3 2023 | 

JOURNAL OF PHYSICS A-MATHEMATICAL AND THEORETICAL

 56 (9)

We study a quantum analogue of the 2-Wasserstein distance as a measure of proximity on the set Ω(N) of density matrices of dimension N. We show that such (semi-)distances do not induce Riemannian metrics on the tangent bundle of Ω(N) and are typically not unitarily invariant. Nevertheless, we prove that for the N = 2 dimensional Hilbert space the quantum 2-Wasserstein distance (unique up to rescaling) …

Show more

Free Full Text From Publishermore_horiz

47 References  Related records

Cited by 5 Related articles All 2 versions

2023 arXiv  

The Wasserstein Believer: Learning Belief Updates for Partially Observable Environments through Reliable Latent Space Models
The Wasserstein...
by Avalos, Raphael; Delgrange, Florent; Nowé, Ann ; More...
03/2023
Partially Observable Markov Decision Processes (POMDPs) are useful tools to model environments where the full state cannot be perceived by an agent. As such...
Journal Article  

Full Text Online

All 5 versions

Peer-reviewed
An extended Exp-TODIM method for multiple attribute decision making based on the Z-Wasserstein distance
Show more
Authors:Hong Sun; Zhen Yang; Qiang Cai; Guiwu Wei; Zhiwen Mo
Summary:• The Wasserstein method measures the distance between two Z-numbers. • The Exp-TODIM method for MADM is built on the Z-Wasserstein distance. • A case study for choosing a reasonable carbon storage site is given. • The sensitivity analysis is given to illustrate the stability of the method. • Some comparative analysis is used to state the advantages of the method.

Z-numbers, as relatively emerging fuzzy numbers, are to a large extent close to human language. For this reason, the Z-number is a powerful tool for representing expert evaluation information. However, the Z-number is more complex than the general structure of fuzzy numbers since it consists of both the fuzzy restriction A and the reliability measure B. As a result, calculation with Z-numbers is a very complex process. This paper uses a modified Wasserstein distance to measure the distance between two Z-numbers, which avoids the loss of information better than the existing metric. Then a new decision model is constructed by combining the Z-Wasserstein distance with the exponential TODIM method (exp-TODIM), which is less susceptible to changes in parameters and has good stability. Next, a detailed example of choosing a reasonable carbon storage site is given to illustrate the feasibility of the exp-TODIM method with Wasserstein distance. Finally, a sensitivity analysis is given to illustrate the stability of the method, and a comparative analysis is used to state the advantages of the method.
Show more
Article, 2023
Publication:Expert Systems With Applications, 214, 20230315
Publisher:2023
Cited by 56
 Related articles All 2 versions
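
For orientation only: the Z-number variant above modifies the classical Wasserstein distance between one-dimensional distributions. A minimal sketch of that classical quantity, using SciPy and toy support points and weights of my own (nothing here is the authors' data or code):

# Classical 1-Wasserstein (Earth Mover's) distance between two discrete 1D distributions
import numpy as np
from scipy.stats import wasserstein_distance

u_vals, u_wts = np.array([0.0, 1.0, 2.0]), np.array([0.2, 0.5, 0.3])   # assumed toy data
v_vals, v_wts = np.array([0.5, 1.5, 3.0]), np.array([0.4, 0.4, 0.2])

print(wasserstein_distance(u_vals, v_vals, u_wts, v_wts))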


Unified Topological Inference for Brain Networks in Temporal Lobe Epilepsy Using the Wasserstein Distance
Show more
Authors:Moo K Chung; Camille Garcia Ramos; Felipe Branco De Paiva; Jedidiah Mathis; Vivek Prabharakaren; Veena A Nair; Elizabeth Meyerand; Bruce P Hermann; Jeffery R Binder; Aaron F Struck
Show more
Summary:Persistent homology can extract hidden topological signals present in brain networks. Persistent homology summarizes the changes of topological structures over multiple different scales called filtrations. Doing so detects hidden topological signals that persist over multiple scales. However, a key obstacle to applying persistent homology in brain network studies has always been the lack of a coherent statistical inference framework. To address this problem, we present a unified topological inference framework based on the Wasserstein distance. Our approach makes no explicit model or distributional assumptions. The inference is performed in a completely data-driven fashion. The method is applied to the resting-state functional magnetic resonance images (rs-fMRI) of temporal lobe epilepsy patients collected at two different sites: University of Wisconsin-Madison and the Medical College of Wisconsin. However, the topological method is robust to variations due to sex and acquisition, and thus there is no need to account for sex and site as categorical nuisance covariates. We are able to localize brain regions that contribute the most to topological differences. We made a MATLAB package available at https://github.com/laplcebeltrami/dynamicTDA that was used to perform all the analysis in this study.
Article, 2023
Publication:ArXiv, 20230213
Publisher:2023


2023


2023 see 2022. Peer-reviewed
Rectified Wasserstein Generative Adversarial Networks for Perceptual Image Restoration
Authors:Feng Wu; Dong Liu; Haichuan Ma
Summary:Wasserstein generative adversarial network (WGAN) has attracted great attention due to its solid mathematical background, i.e., to minimize the Wasserstein distance between the generated distribution and the distribution of interest. In WGAN, the Wasserstein distance is quantitatively evaluated by the discriminator, also known as the critic. The vanilla WGAN trained the critic with the simple Lipschitz condition, which was later shown less effective for modeling complex distributions, like the distribution of natural images. We try to improve the WGAN training by introducing pairwise constraint on the critic, oriented to image restoration tasks. In principle, pairwise constraint is to suggest the critic assign a higher rating to the original (real) image than to the restored (generated) image, as long as such a pair of images are available. We show that such pairwise constraint may be implemented by rectifying the gradients in WGAN training, which leads to the proposed rectified Wasserstein generative adversarial network (ReWaGAN). In addition, we build interesting connections between ReWaGAN and the perception-distortion tradeoff. We verify ReWaGAN on two representative image restoration tasks: single image super-resolution (4× and 8×) and compression artifact reduction, where our ReWaGAN not only beats the vanilla WGAN consistently, but also outperforms the state-of-the-art perceptual quality-oriented methods significantly. Our code and models are publicly available at https://github.com/mahaichuan/ReWaGAN.
Show more
Article, 2023
Publication:IEEE Transactions on Pattern Analysis & Machine Intelligence, 45, 202303, 3648
Publisher:2023

2023 see 2022 
Peer-reviewed
Graph Wasserstein Autoencoder-Based Asymptotically Optimal Motion Planning With Kinematic Constraints for Robotic Manipulation
Show more
Authors:Chongkun Xia; Yunzhou Zhang; Sonya A. Coleman; Ching-Yen Weng; Houde Liu; Shichang Liu; I-Ming Chen
Show more
Summary:This paper presents a learning based motion planning method for robotic manipulation, aiming to solve the asymptotically-optimal motion planning problem with nonlinear kinematics in a complex environment. The core of the proposed method is based on a novel neural network model, i.e., graph wasserstein autoencoder (GraphWAE) network, which is used to represent the implicit sampling distributions of the configuration space (C-space) for sampling-based planning algorithms. Through learning the implicit distributions, we can guide the planning process to search or extend in the desired region to reduce the collision checks dramatically for fast and high-quality motion planning. The theoretical analysis and proofs are given to demonstrate the probabilistic completeness and asymptotic optimality of the proposed method. Numerical simulations and experiments are conducted to validate the effectiveness of the proposed method through a series of planning problems from 2D, 6D and 12D robot C-spaces in the challenging scenes. Results indicate that the proposed method can achieve better planning performance than the state-of-the-art planning algorithms. Note to Practitioners—The motivation of this work is to develop a fast and high-quality asymptotically optimal motion planning method for practical applications such as autonomous driving, robotic manipulation and others. Due to the time consumption caused by collision detection, current planning algorithms usually take much time to converge to the optimal motion path especially in the complicated environment. In this paper, we present a neural network model based on GraphWAE to learn the biasing sampling distributions as the sample generation source to further reduce or avoid collision checks of sampling-based planning algorithms. The proposed method is general and can be also deployed in other sampling-based planning algorithms for improving planning performance in different robot applications
Show more
Article, 2023
Publication:IEEE Transactions on Automation Science and Engineering, 20, 202301, 244
Publisher:2023

A Small Town in Ukraine by Bernard Wasserstein review -- on the border and at the centre of history; In this touching account, the author traces his ancestors' terrible ordeal while living in an 'insignificant place' that became a battleground for more than a century of conflicts.(Books)
Show more
Article, 2023
Publication:The Observer (London, England), February 14 2023, NA
Publisher:2023


2023 see 2022 2021. Peer-reviewed
Exact statistical inference for the Wasserstein distance by selective inference
Show more
Authors:Vo Nguyen Le Duy; Ichiro Takeuchi
Article, 2023
Publication:Annals of the Institute of Statistical Mathematics, 75, 202302, 127
Publisher:2023
Cited by 6
Related articles All 4 versions

[CITATION] Exact statistical inference for the Wasserstein distance by selective inference Selective Inference for the Wasserstein Distance

VN Le Duy, I Takeuchi - ANNALS OF …,


Research Data from China University of Geosciences Update Understanding of Mathematics (Prediction of Tumor Lymph Node Metastasis Using Wasserstein Distance-Based Generative Adversarial Networks Combing with Neural Architecture Search for ...)
Show more
Article, 2023
Publication:Health & Medicine Week, February 24 2023, 3694
Publisher:2023

<–—2023———2023———370 —



Shortfall-Based Wasserstein Distributionally Robust Optimization
Authors:Ruoxuan Li; Wenhua Lv; Tiantian Mao
Article, 2023
Publication:Mathematics, 11, 20230207, 849
Publisher:2023


IWGAN: Anomaly Detection in Airport Based on Improved Wasserstein Generative Adversarial Network
Show more
Authors:Ko-Wei Huang; Guan-Wei Chen; Zih-Hao Huang; Shih-Hsiung Lee
Article, 2023
Publication:Applied Sciences, 13, 20230120, 1397
Publisher:2023


2023 see 2022. Peer-reviewed
Wasserstein generative adversarial networks for modeling marked events
Authors:S. Haleh S. Dizaji; Saeid Pashazadeh; Javad Musevi Niya
Article, 2023
Publication:The Journal of Supercomputing, 79, 202302, 2961
Publisher:2023
Related articles
All 2 versions


Peer-reviewed
Scalable model-free feature screening via sliced-Wasserstein dependency
Authors:Tao Li; Jun Yu; Cheng Meng
Article, 2023
Publication:Journal of Computational and Graphical Statistics, 20230223, 1
Publisher:2023
Scalable Model-Free Feature Screening via Sliced-Wasserstein...

by Li, Tao; Yu, Jun; Meng, Cheng
04/2023
We consider the model-free feature screening problem that aims to discard non-informative features before downstream analysis. Most of the existing feature...
Data SetCitation Online

Cited by 1 Related articles



Research Findings from University of Tehran Update Understanding of Photogrammetry Remote Sensing and Spatial Information Sciences (Improving Semantic Segmentation of High-resolution Remote Sensing Images Using Wasserstein Generative ...)
Show more
Article, 2023
Publication:Science Letter, February 3 2023, 562
Publisher:2023
All 6 versions
 


2023


Peer-reviewed
Human-related anomalous event detection via memory-augmented Wasserstein generative adversarial network with gradient penalty
Show more
Authors:Nanjun Li; Faliang Chang; Chunsheng Liu
Article, 2023
Publication:Pattern Recognition, 138, 202306, 109398
Publisher:2023
All 3 versions


Using Hybrid Penalty and Gated Linear Units to Improve Wasserstein Generative Adversarial Networks for Single-Channel Speech Enhancement
Show more
Authors:Xiaojun Zhu; Heming Huang
Article, 2023
Publication:Computer Modeling in Engineering & Sciences, 135, 2023, 2155
Publisher:2023
[CITATION] Using Hybrid Penalty and Gated Linear Units to Improve Wasserstein Generative Adversarial Networks for Single-Channel Speech Enhancement

X Zhu, H Huang - … -COMPUTER MODELING IN …, 2023 - TECH SCIENCE PRESS 871 …


Cited by 1 Related articles All 2 versions 


Peer-reviewed
A Novel Small Samples Fault Diagnosis Method Based on the Self-attention Wasserstein Generative Adversarial Network
Show more
Authors:Zhiwu Shang; Jie Zhang; Wanxiang Li; Shiqi Qian; Jingyu Liu; Maosheng Gao
Article, 2023
Publication:Neural Processing Letters, 20230107
Publisher:2023


Peer-reviewed
Least Wasserstein distance between disjoint shapes with perimeter regularization
Authors:Michael Novack; Ihsan Topaloglu; Raghavendra Venkatraman
Article, 2023
Publication:Journal of functional analysis, 284, 2023
Publisher:2023
Cited by 2
Related articles All 9 versions


Peer-reviewed
Medical multivariate time series imputation and forecasting based on a recurrent conditional Wasserstein GAN and attention
Show more
Authors:Sven Festag; Cord Spreckelsen
Article, 2023
Publication:Journal of Biomedical Informatics, 139, 202303, 104320
Publisher:2023

<–—2023———2023———380 —


Open-Set Signal Recognition Based on Transformer and Wasserstein Distance
Authors:Wei Zhang; Da Huang; Minghui Zhou; Jingran Lin; Xiangfeng Wang
Article, 2023
Publication:Applied Sciences, 13, 20230207, 2151
Publisher:2023

Peer-reviewed
Self-supervised non-rigid structure from motion with improved training of Wasserstein GANs
Authors:Yaming Wang; Xiangyang Peng; Wenqing Huang; Xiaoping Ye; Mingfeng Jiang
Article, 2023
Publication:IET Computer Vision, 20230206
Publisher:2023

Computing the Gromov-Wasserstein Distance between Two Surface Meshes Using Optimal Transport
Show more
Authors:Patrice Koehl; Marc Delarue; Henri Orland
Article, 2023
Publication:Algorithms, 16, 20230228, 131
Publisher:2023
All 2 versions
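
A hedged sketch of the general computation behind the mesh-comparison entry above: the Gromov-Wasserstein coupling compares two point sets only through their intra-set distance matrices. This assumes the POT library (pip install pot) and uses random point clouds of my own as stand-ins for mesh vertices; it is not the authors' code.

# Gromov-Wasserstein coupling between two metric-measure point clouds (illustrative)
import numpy as np
import ot  # Python Optimal Transport

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 3))            # toy "mesh" vertices (assumed data)
Y = rng.normal(size=(40, 3)) * 0.5

C1 = ot.dist(X, X)                      # intra-domain cost matrices
C2 = ot.dist(Y, Y)
C1 /= C1.max(); C2 /= C2.max()

p = ot.unif(len(X))                     # uniform weights on the vertices
q = ot.unif(len(Y))

T = ot.gromov.gromov_wasserstein(C1, C2, p, q, loss_fun='square_loss')    # coupling
gw2 = ot.gromov.gromov_wasserstein2(C1, C2, p, q, loss_fun='square_loss') # GW value
print(T.shape, gw2)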


[PDF] arxiv.org
2023 see 2022.
Peer-reviewed
On approximations of data-driven chance constrained programs over Wasserstein balls
Authors:Zhi Chen; Daniel Kuhn; Wolfram Wiesemann
Article, 2023
Publication:Operations Research Letters, 51, 202305, 226
Publisher:2023
Cited by 3
Related articles All 4 versions


Peer-reviewed
A multi-period emergency medical service location problem based on Wasserstein-metric approach using generalised benders decomposition method
Show more
Authors:Yuefei Yuan; Qiankun Song; Bo Zhou
Article, 2023
Publication:International Journal of Systems Science, 20230124, 1
Publisher:2023
Zbl 07706621


2023



Peer-reviewed
Isometric rigidity of Wasserstein tori and spheres
Authors:György Pál Gehér; Tamás Titkos; Dániel Virosztek
Article, 2023
Publication:Mathematika, 69, 2023, 20
Publisher:2023


Gradient Flows of Modified Wasserstein Distances and Porous Medium Equations with Nonlocal Pressure
Show more
Authors:Nhan-Phu Chung; Quoc-Hung Nguyen
Article, 2023
Publication:Acta Mathematica Vietnamica, 20230214
Publisher:2023

Wasserstein information matrix
Authors:Wuchen Li; Jiaxi Zhao
Article, 2023
Publication:Information Geometry, 20230214
Publisher:2023
Zbl 07686832


2023 see 2022
Global Pose Initialization Based on Gridded Gaussian Distribution With Wasserstein Distance
Authors:Chenxi Yang; Zhibo Zhou; Hanyang Zhuang; Chunxiang Wang; Ming Yang
Article, 2023
Publication:IEEE Transactions on Intelligent Transportation Systems, 2023, 1
Publisher:2023

 Cited by 4 Related articles All 3 versions

Peer-reviewed
Portfolio optimization using robust mean absolute deviation model: Wasserstein metric approach
Authors:Zohreh Hosseini-Nodeh; Rashed Khanjani-Shiraz; Panos M. Pardalos
Article, 2023
Publication:Finance Research Letters, 202302, 103735
Publisher:2023

Cited by 1 All 2 versions

<–—2023———2023———390 — 



An Efficient Content Popularity Prediction of Privacy Preserving Based on Federated Learning and Wasserstein GAN
Show more
Authors:Kailun Wang; Na Deng; Xuanheng Li
Article, 2023
Publication:IEEE Internet of Things Journal, 10, 20230301, 3786
Publisher:2023

Peer-reviewed
Wasserstein Distance-Based Full-Waveform Inversion With a Regularizer Powered by Learned Gradient
Show more
Authors:Fangshu Yang; Jianwei Ma
Article, 2023
Publication:IEEE Transactions on Geoscience and Remote Sensing, 61, 2023, 1
Publisher:2023

2023 see 2022.
Peer-reviewed
A two-step approach to Wasserstein distributionally robust chance- and security-constrained dispatch
Show more
Authors:Amin Maghami; Evrim Ursavas; Ashish Cherukuri
Article, 2023
Publication:IEEE Transactions on Power Systems, 2023, 1
Publisher:2023

Attacking Mouse Dynamics Authentication using Novel Wasserstein Conditional DCGAN
Authors:Arunava Roy; KokSheik Wong; Raphael C.-W. Phan
Article, 2023
Publication:IEEE Transactions on Information Forensics and Security, 2023, 1
Publisher:2023

2023


A Novel Graph Kernel Based on the Wasserstein Distance and Spectral Signatures
Authors:Yantao Liu; Luca Rossi; Andrea Torsello; Joint IAPR International Workshops on Statistical Techniques in Pattern Recognition (SPR) and Structural and Syntactic Pattern Recognition (SSPR)
Show more
Summary:Spectral signatures have been used with great success in computer vision to characterise the local and global topology of 3D meshes. In this paper, we propose to use two widely used spectral signatures, the Heat Kernel Signature and the Wave Kernel Signature, to create node embeddings able to capture local and global structural information for a given graph. For each node, we concatenate its structural embedding with the one-hot encoding vector of the node feature (if available) and we define a kernel between two input graphs in terms of the Wasserstein distance between the respective node embeddings. Experiments on standard graph classification benchmarks show that our kernel performs favourably when compared to widely used alternative kernels as well as graph neural networks
Show more
Chapter, 2023
Publication:Structural, Syntactic, and Statistical Pattern Recognition: Joint IAPR International Workshops, S+SSPR 2022, Montreal, QC, Canada, August 26–27, 2022, Proceedings, 20230101, 122
Publisher:2023
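
To make the construction in the graph-kernel entry above concrete, here is a hedged sketch (my own, not the paper's implementation) of a Wasserstein-based graph kernel: node embeddings, standing in for heat/wave kernel signatures, are compared with an exact Wasserstein distance and a Laplacian-type kernel exp(-gamma * W) is placed on top. The POT library and the random placeholder embeddings are assumptions.

# Wasserstein graph kernel between two graphs represented by node-embedding matrices
import numpy as np
import ot  # Python Optimal Transport

def wasserstein_kernel(emb_g1: np.ndarray, emb_g2: np.ndarray, gamma: float = 1.0) -> float:
    a = ot.unif(emb_g1.shape[0])                    # uniform mass on the nodes of graph 1
    b = ot.unif(emb_g2.shape[0])                    # uniform mass on the nodes of graph 2
    M = ot.dist(emb_g1, emb_g2, metric='euclidean') # pairwise cost between embeddings
    W = ot.emd2(a, b, M)                            # exact 1-Wasserstein cost
    return float(np.exp(-gamma * W))

rng = np.random.default_rng(0)
print(wasserstein_kernel(rng.normal(size=(12, 8)), rng.normal(size=(15, 8))))

Note that kernels of the form exp(-gamma * W) built on a Wasserstein distance are not guaranteed to be positive definite in general; in practice they are often used as indefinite similarities.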

2023 see 2021
Distributional Robustness in Minimax Linear Quadratic Control with Wasserstein Distance
Authors:Kihyun Kim; Insoon Yang
Summary:Abstract. To address the issue of inaccurate distributions in discrete-time stochastic systems, a minimax linear quadratic control method using the Wasserstein metric is proposed. Our method aims to construct a control policy that is robust against errors in an empirical distribution of underlying uncertainty by adopting an adversary that selects the worst-case distribution at each time. The opponent receives a Wasserstein penalty proportional to the amount of deviation from the empirical distribution. As a tractable solution, a closed-form expression of the optimal policy pair is derived using a Riccati equation. We identify nontrivial stabilizability and observability conditions under which the Riccati recursion converges to the unique positive semidefinite solution of an algebraic Riccati equation. Our method is shown to possess several salient features, including closed-loop stability, a guaranteed-cost property, and a probabilistic out-of-sample performance guarantee
Show more
Downloadable Article
Publication:SIAM Journal on Control and Optimization, 61, 20230430, 458
 Zbl 07669365

Cited by 1 All 4 versions


University of Electronic Science and Technology of China Researcher Has Published New Study Findings on Applied Sciences (Open-Set Signal Recognition Based on Transformer and Wasserstein Distance)
Show more
Article, 2023
Publication:Science Letter, March 3 2023, 1367
Publisher:2023
All 3 versions
 

Cited by 5 Related articles All 4 versions 


Wasserstein Releases Solar Charger and Wireless Chime Compatible with the Blink Video Doorbell
Article, 2023
Publication:PR Newswire, February 6 2023, NA
Publisher:2023

Univ Jiliang China Submits Chinese Patent Application for Early Fault Detection Method Based on Wasserstein Distance
Article, 2023
Publication:Global IP News: Measurement & Testing Patent News, March 15 2023, NA
Publisher:2023
<–—2023———2023———400 —



Univ Qinghua Seeks Patent for Rotating Machine State Monitoring Method Based on Wasserstein Depth Digital Twinborn Model
Show more
Article, 2023
Publication:Global IP News: Tools and Machinery Patent News, March 1 2023, NA
Publisher:2023


2023 see 2022. Peer-reviewed
Entropy-regularized Wasserstein distributionally robust shape and topology optimization
Authors:Charles Dapogny; Franck Iutzeler; Andrea Meda; Boris Thibert
Article, 2023
Publication:Structural and Multidisciplinary Optimization, 66, 202303
Publisher:2023


2023 see 2022. Peer-reviewed
Wasserstein distance based multi-scale adversarial domain adaptation method for remaining useful life prediction
Show more
Authors:Huaitao Shi; Chengzhuang Huang; Xiaochen Zhang; Jinbao Zhao; Sihui Li
Article, 2023
Publication:Applied Intelligence, 53, 202302, 3622
Publisher:2023
2023 see 2022. Peer-reviewed
Simple approximative algorithms for free-support Wasserstein barycenters
Author:Johannes von Lindheim
Article, 2023
Publication:Computational Optimization and Applications, 20230301
Publisher:2023

Cited by 1 Related articles All 2 versions

National Kaohsiung University of Science and Technology Researchers Publish New Study Findings on Applied Sciences (IWGAN: Anomaly Detection in Airport Based on Improved Wasserstein Generative Adversarial Network)
Show more
Article, 2023
Publication:Science Letter, March 3 2023, 384
Publisher:2023
Zbl 07680908

Cited by 1 Related articles 


Wasserstein Distance in Deep Learning
Authors:Junior Leo; Ernest Ge; Stotle Li
Article, 2023
Publication:SSRN Electronic Journal, 2023
Publisher:2023
Cited by 2
 Related articles All 5 versions 


2023
 

2023 see 2022
Image Reconstruction for Electrical Impedance Tomography (EIT) With Improved Wasserstein Generative Adversarial Network (WGAN)
Show more
Authors:Hanyu Zhang; Qi Wang; Ronghua Zhang; Xiuyan Li; Xiaojie Duan; Yukuan Sun; Jianming Wang; Jiabin Jia
Article, 2023
Publication:IEEE Sensors Journal, 23, 20230301, 4466
Publisher:2023

Unified Topological Inference for Brain Networks in Temporal Lobe Epilepsy Using the Wasserstein Distance
Show more
Authors:Moo K. Chung; Camille Garcia Ramos; Felipe Branco De Paiva; Jedidiah Mathis; Vivek Prabharakaren; Veena A. Nair; Elizabeth Meyerand; Bruce P. Hermann; Jeffery R. Binder; Aaron F. Struck
Show more
Article, 2023
Publication:ArXiv, 20230213
Publisher:2023

Generating Bipedal Pokémon Images by Implementing the Wasserstein Generative Adversarial Network
Show more
Author:Jacqueline Jermyn
Article, 2023
Publication:International Journal for Research in Applied Science and Engineering Technology, 11, 20230131, 1211
Publisher:2023



TextureWGAN: texture preserving WGAN with multitask regularizer for computed tomography inverse problems
Show more
Authors:Masaki Ikuta; Jun Zhang
Summary:This paper presents a deep learning (DL) based method called TextureWGAN. It is designed to preserve image texture while maintaining high pixel fidelity for computed tomography (CT) inverse problems. Over-smoothed images by postprocessing algorithms have been a well-known problem in the medical imaging industry. Therefore, our method tries to solve the over-smoothing problem without compromising pixel fidelity
Show more
Article, 2023
Publication:Journal of medical imaging (Bellingham, Wash.), 10, 202303, 024003
Publisher:2023
All 4 versions


Peer-reviewed
Aero-engine high speed bearing fault diagnosis for data imbalance: A sample enhanced diagnostic method based on pre-training WGAN-GP
Show more
Authors:Jiayu Chen; Zitong Yan; Cuiyin Lin; Boqing Yao; Hongjuan Ge
Summary:Rolling bearing is the key supporting component of aero-engines, of which fault diagnosis is very important to ensure its reliable operation and continuous airworthiness. However, the data imbalance problem caused by its complex and harsh environment restricts the intelligent diagnosis. This paper proposes a sample enhanced diagnostic method based on pre-training and auxiliary classifier Wasserstein generative adversarial network with gradient penalty (PT-WGAN-GP). Firstly, a pre-training network is proposed and incorporated into the discriminator and classifier of WGAN-GP for feature adaptive and efficient extraction. Meanwhile, a new generator is constructed by introducing a residual network and the instance batch to improve its data-fitting ability. Finally, the data-enhanced model, PT-WGAN-GP, can stably generate high-quality faulty samples, which balances the testing dataset and completes the optimization training of network structure. Two cases under imbalanced data have verified the effectiveness of the proposed method, as well as its superiority over other widely used methods
Show more
Article
Publication:Measurement, 213, 2023-05-31
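
Several of the entries in this stretch (PT-WGAN-GP, FEF_WGAN_GP, the wind-power and rice-disease WGAN-GP papers) share the same gradient-penalty ingredient. The sketch below shows a generic WGAN-GP penalty term in the Gulrajani et al. style, written in PyTorch with placeholder tensor shapes; it is illustrative background, not code from any of the cited papers.

# Generic WGAN-GP gradient penalty: penalize critic gradients away from unit norm
import torch

def gradient_penalty(critic, real, fake, device="cpu"):
    batch = real.size(0)
    # random interpolation between real and generated samples (broadcast over all dims)
    eps = torch.rand([batch] + [1] * (real.dim() - 1), device=device)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads = torch.autograd.grad(outputs=scores, inputs=interp,
                                grad_outputs=torch.ones_like(scores),
                                create_graph=True, retain_graph=True)[0]
    grads = grads.view(batch, -1)
    return ((grads.norm(2, dim=1) - 1.0) ** 2).mean()

# critic loss (sketch): E[critic(fake)] - E[critic(real)] + lambda_gp * gradient_penalty(...)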

<–—2023———2023———410 —




Peer-reviewed
Research on bearing vibration signal generation method based on filtering WGAN_GP with small samples
Show more
Authors:Jiesong Li; Tao Liu; Xing Wu
Summary:In the practical application of bearing fault diagnosis, the data imbalance problems caused by the lack of available fault data lead to inaccurate diagnosis. The high cost and difficulty of obtaining fault samples has become an obstacle to the development of intelligent diagnosis technology. Aiming at the problem of data imbalance caused by small samples, this paper proposes a data generation method called FEF_WGAN_GP based on Wasserstein generative adversarial networks with gradient penalty (WGAN_GP) and feature Euclidean distance filtering (FEF) theory. Firstly, WGAN_GP is used to obtain signals with similar distribution to the small sample data, which can alleviate the imbalance of the dataset. Then, the FEF method is used to filter the generated data in order to obtain a higher quality of the samples. In the test validation part, not only the used dataset is evaluated to obtain a more reasonable dataset, but also the generated signals are evaluated from multiple perspectives. In addition, this paper evaluates the effects of the number, length and signal-to-noise ratio of the parent data on the quality of the generated signals, as well as the effect of the setting of the threshold of the data filtering method on the accuracy of the classifier. The experimental results indicate that this method performs well in processing unbalance fault data. It has better stability and diagnostic accuracy than the current stable method
Show more
Article, 2023
Publication:Proceedings of the Institution of Mechanical Engineers, Part C: Journal of Mechanical Engineering Science, 20230222
Publisher:2023



DPBA-WGAN: A Vector-Valued Differential Private Bilateral Alternative Scheme on WGAN for Image Generation
Show more
Authors:Danhua Wu; Wenyong Zhang; Panfeng Zhang
Article, 2023
Publication:IEEE access, 11, 2023, 13889
Publisher:2023

[PDF] ieee.org

DPBA-WGAN: A Vector-Valued Differential Private Bilateral Alternative Scheme on WGAN for Image Generation

D Wu, W Zhang, P Zhang - IEEE Access, 2023 - ieeexplore.ieee.org

WGAN For better understanding the above algorithm, we discuss how the DPBA-WGAN can 

… The WGAN training focuses on the privacy barrier of the discriminator. From the direction of …



Enhanced CNN Classification Capability for Small Rice Disease Datasets Using Progressive WGAN-GP: Algorithms and Applications
Show more
Authors:Yang Lu; Xianpeng Tao; Nianyin Zeng; Jiaojiao Du; Rou Shang
Article, 2023
Publication:Remote Sensing, 15, 20230327, 1789
Publisher:2023
All 4 versions
 


Peer-reviewed
Aero-engine high speed bearing fault diagnosis for data imbalance: A sample enhanced diagnostic method based on pre-training WGAN-GP
Show more
Authors:Jiayu Chen; Zitong Yan; Cuiyin Lin; Boqing Yao; Hongjuan Ge
Article, 2023
Publication:Measurement, 213, 202305, 112709
Publisher:2023

Cited by 1 All 2 versions

[CITATION] … high speed bearing fault diagnosis for data imbalance: A sample enhanced diagnostic method based on pre-training WGAN-GP”[Measurement 213 (2023 …

J Chen, Z Yan, C Lin, B Yao, H Ge - Measurement, 202



ARTICLE

Single-Location and Multi-Locations Scenarios Generation for Wind Power Based On WGAN-GP

Zhu, Jianan ; Ai, Qian ; Chen, Yun ; Wang, Jiayu; Bristol: IOP Publishing

Journal of physics. Conference series, 2023, Vol.2452 (1), p.12022

Journal of Physics: Conference Series 2452 (2023) 012022, doi:10.1088/1742-6596/2452/1/012022

PEER REVIEWED

OPEN ACCESS

BrowZine PDF Icon Download PDF 

BrowZine Book Icon View Issue Contents
Peer-reviewed
Single-Location and Multi-Locations Scenarios Generation for Wind Power Based On WGAN-GP
Authors:Jianan Zhu; Qian Ai; Yun Chen; Jiayu Wang
Summary:To address the randomness of renewable energy, scenario generation can simulate the random process of renewable energy. Still, most of the previous scenario generation algorithms are based on assumed probability distribution models, which are difficult to grasp the dynamic characteristics of renewable energy accurately. Based on an improved generative adversarial network algorithm, this paper generates single-site and multi-site scenarios for wind power generation data. The temporal and spatial correlations of the generated data were judged based on the maximum mean difference and the Pearson coefficient. Finally, multiple sets of wind power generation data are used to verify that the proposed method can reflect not only the volatility of renewable energy but also ensure the temporal and spatial correlation between the data
Show more
Article, 2023
Publication:Journal of Physics: Conference Series, 2452, 20230301
Publisher:2023
Cited by 1 All 3 versions


2023


 Wuxi Cansonic Medical Science & Tech Seeks Patent for FC-VoVNet and WGAN-Based B Ultrasonic Image Denoising Method
Show more
Article, 2023
Publication:Global IP News: Optics & Imaging Patent News, March 23 2023, NA
Publisher:2023


Peer-reviewed
A novel prediction approach of polymer gear contact fatigue based on a WGAN-XGBoost model
Authors:Chenfan Jia; Peitang Wei; Zehua Lu; Mao Ye; Rui Zhu; Huaiju Liu
Article, 2023
Publication:Fatigue & Fracture of Engineering Materials & Structures, 20230315
Publisher:2023


Optimization and Expansion of Construction Waste Dataset Based on WGAN-GP
Author:Xinnuo Wu (邬欣诺)
Article, 2023
Publication:Computer Science and Application, 13, 2023, 136
Publisher:2023

Peer-reviewed
EAF-WGAN: Enhanced Alignment Fusion-Wasserstein Generative Adversarial Network for Turbulent Image Restoration
Show more
Authors:Xiangqing Liu; Gang Li; Zhenyang Zhao; Qi Cao; Zijun Zhang; Shaoan Yan; Jianbin Xie; Minghua Tang
Article, 2023
Publication:IEEE Transactions on Circuits and Systems for Video Technology, 2023, 1
Publisher:2023


Class-rebalanced wasserstein distance for multi-source domain adaptation
Wang, Qi; Wang, Shengsheng; Wang, Bilin. Applied Intelligence; Boston Vol. 53, Iss. 7,  (Apr 2023): 8024-8038.
C Citation/Abstract

Abstract/Details.  2  Quick look

Cited by 1 Related articles All 2 versions

<–—2023———2023———420 —



Working Paper
Variational Wasserstein Barycenters for Geometric Clustering

Liang Mi. arXiv.org; Ithaca, Mar 30, 2023.
Full Text

Abstract/DetailsGet full text
opens in a new window


 
Working Paper
Continuum Swarm Tracking Control: A Geometric Perspective in Wasserstein Space
Emerick, Max; Bamieh, Bassam. arXiv.org; Ithaca, Mar 27, 2023.
 Abstract/DetailsGet full text
opens in a new window

Isometries and isometric embeddings of Wasserstein spaces over the Heisenberg group

by Balogh, Zoltán M; Titkos, Tamás; Virosztek, Dániel
03/2023
Our purpose in this paper is to study isometries and isometric embeddings of the $p$-Wasserstein space $\mathcal{W}_p(\mathbb{H}^n)$ over the Heisenberg group...
Journal Article  Full Text Online
Working Paper
Isometries and isometric embeddings of Wasserstein spaces over the Heisenberg group

Balogh, Zoltán M; Titkos, Tamás; Virosztek, Dániel. arXiv.org; Ithaca, Mar 27, 2023.

Abstract/DetailsGet full text
opens in a new window
arXiv:2303.15638 [pdf, other] eess.SY cs.MA

Related articles All 3 versions 


2023 see arXiv. Working Paper

On smooth approximations in the Wasserstein space
by Cosso, Andrea; Martini, Mattia
03/2023
In this paper we investigate the approximation of continuous functions on the Wasserstein space by smooth functions, with smoothness meant in the sense of...
Journal Article  Full Text Online
On smooth approximations in the Wasserstein space
Cosso, Andrea; Martini, Mattia. arXiv.org; Ithaca, Mar 27, 2023.
Cited by 1
 All 2 versions 
Cited by 2
 Related articles All 6 versions

 

2023 see arXiv. Working Paper
Parameter estimation for many-particle models from aggregate observations: A Wasserstein distance based sequential Monte Carlo sampler
Chen, Cheng; Wen, Linjie; Li, Jinglai. arXiv.org; Ithaca, Mar 27, 2023.
 Full Text

All 2 versions


2023 see. arXiv    Working Paper
Improving Neural Topic Models with Wasserstein Knowledge Distillation

Adhya, Suman; Sanyal, Debarshi Kumar. arXiv.org; Ithaca, Mar 27, 2023.

Abstract/DetailsGet full text
opens in a new window

 Get full textopens in a new window

Cited by 2 Related articles All 4 versions

2023 see. arXiv    Working Paper
Stability of Entropic Wasserstein Barycenters and application to random geometric graphs

Theveneau, Marc; Keriven, Nicolas. arXiv.org; Ithaca, Mar 27, 2023.
 Abstract/DetailsGet full text
opens in a new window

2023 see. arXiv    Working Paper

Optimal transport and Wasserstein distances for causal models
by Eckstein, Stephan; Cheridito, Patrick
03/2023
In this paper we introduce a variant of optimal transport adapted to the causal structure given by an underlying directed graph. Different graph structures...
Journal Article  Full Text Online
Optimal transport and Wasserstein distances for causal models

Eckstein, Stephan; Cheridito, Patrick. arXiv.org; Ithaca, Mar 24, 2023.

Abstract/DetailsGet full text
opens in a new window
Related articles
 All 2 versions 


2023 see. arXiv    Working Paper
Gromov-Wasserstein Distances: Entropic Regularization, Duality, and Sample Complexity

Zhang, Zhengxin; Goldfeld, Ziv; Mroueh, Youssef; Sriperumbudur, Bharath K. arXiv.org; Ithaca, Mar 24, 2023.
Full Text

Abstract/DetailsGet full text
opens in a new window

 
2023 -patent news Wire Feed

Wuxi Cansonic Medical Science & Tech Seeks Patent for FC-VoVNet and WGAN-Based B Ultrasonic Image Denoising Method

Global IP News. Optics & Imaging Patent News; New Delhi [New Delhi]. 23 Mar 2023.  

DetailsFull text

Wuxi Cansonic Medical Science & Tech Seeks Patent for FC-VoVNet and WGAN-Based B Ultrasonic Image Denoising Method

Global IP News. Optics & Imaging Patent News; New

<–—2023———2023———430 —


2023 see 2022
Regularization for Wasserstein Distributionally Robust Optimization

Azizian, Waïss; Iutzeler, Franck; Malick, Jérôme. arXiv.org; Ithaca, Mar 23, 2023.

Abstract/DetailsGet full text
opens in a new window
MR4586572
 

Cited by 8 Related articles All 12 versions
Cited by 8 Related articles All 12 versions

 2023 see arXiv. Working Paper
Wasserstein Auto-encoded MDPs: Formal Verification of Efficiently Distilled RL Policies with Many-sided Guarantees

Delgrange, Florent; Nowé, Ann; Pérez, Guillermo A. arXiv.org; Ithaca, Mar 22, 2023.

Abstract/DetailsGet full text
opens in a new window
Cited by 2
 All 4 versions 
[CITATION] WASSERSTEIN AUTO-ENCODED MDPS

F Delgrange, A Nowé, GA Pérez, F Make

Related articles


2023 see arxiv. Working Paper

Wasserstein Adversarial Examples on Univariant Time Series Data

Wang, Wenjie; Xiong, Li; Lou, Jian. arXiv.org; Ithaca, Mar 22, 2023.

Abstract/DetailsGet full text
opens in a new window
All 2 versions
 

 
2023 see arXiv. Working Paper

Doubly Regularized Entropic Wasserstein Barycenters

Chizat, Lénaïc. arXiv.org; Ithaca, Mar 21, 2023.

Abstract/DetailsGet full text
opens in a new window
Cited by 7
 Related articles All 2 versions 

Working Paper
Estimation and inference for the Wasserstein distance between mixing measures in topic models

Xin Bing; Bunea, Florentina; Niles-Weed, Jonathan. arXiv.org; Ithaca, Mar 17, 2023.

Abstract/DetailsGet full text
opens in a new window

2023

High order spatial discretization for variational time implicit schemes: Wasserstein gradient flows and reaction-diffusion systems
by Fu, Guosheng; Osher, Stanley; Li, Wuchen
03/2023
We design and compute first-order implicit-in-time variational schemes with high-order spatial discretization for initial value gradient flows in generalized...
Journal Article  Full Text Online
2023 see arXiv. Working Paper
High order spatial discretization for variational time implicit schemes: Wasserstein gradient flows and reaction-diffusion systems

Fu, Guosheng; Osher, Stanley; Li, Wuchen. arXiv.org; Ithaca, Mar 15, 2023.
Full Text

Abstract/DetailsGet full text
opens in a new window
Cited by 1
 All 4 versions 
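For context, the first-order implicit-in-time variational step that schemes of this kind build on is the JKO (Jordan–Kinderlehrer–Otto) step; a minimal statement follows (the paper's high-order spatial discretizations and generalized distances go beyond this):

    \rho^{k+1} \in \operatorname*{arg\,min}_{\rho} \ \frac{1}{2\tau} W_2^2(\rho, \rho^k) + \mathcal{E}(\rho)

where \tau > 0 is the time step and \mathcal{E} is the driving energy; iterating this step approximates the Wasserstein gradient flow of \mathcal{E}.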

 
 2023 see 2022. Working Paper
Convergence of the empirical measure in expected Wasserstein distance: non asymptotic explicit bounds in ℝ^d

Fournier, Nicolas. arXiv.org; Ithaca, Mar 14, 2023.
Full Text
Abstract/DetailsGet full text
opens in a new window

A Convergent Single-Loop Algorithm for Relaxation of Gromov-Wasserstein in Graph Data
by Li, Jiajin; Tang, Jianheng; Kong, Lemin; More...
03/2023
In this work, we present the Bregman Alternating Projected Gradient (BAPG) method, a single-loop algorithm that offers an approximate solution to the...
Journal Article  Full Text Online

2023 see arXiv. Working Paper

A Convergent Single-Loop Algorithm for Relaxation of Gromov-Wasserstein in Graph Data

Li, Jiajin; Tang, Jianheng; Kong, Lemin; Liu, Huikang; Li, Jia; et al. arXiv.org; Ithaca, Mar 12, 2023.

Abstract/DetailsGet full text
opens in a new window

All 5 versions 


2023 see arXiv
Neural Gromov-Wasserstein Optimal Transport
by Nekrashevich, Maksim; Korotin, Alexander; Burnaev, Evgeny
03/2023
We present a scalable neural method to solve the Gromov-Wasserstein (GW) Optimal Transport (OT) problem with the inner product cost. In this problem, given two...
Journal Article  Full Text Online


2023 see arXiv.   Working Paper
Neural Gromov-Wasserstein Optimal Transport

Nekrashevich, Maksim; Korotin, Alexander; Burnaev, Evgeny. arXiv.org; Ithaca, Mar 10, 2023.
Full Text

Abstract/DetailsGet full text
opens in a new window

Cited by 5 Related articles All 2 versions 

<–—2023———2023———440 —


Sliced-Wasserstein on Symmetric Positive Definite Matrices for M/EEG Signals
by Bonet, Clément; Malézieux, Benoît; Rakotomamonjy, Alain; More...
03/2023
When dealing with electro or magnetoencephalography records, many supervised prediction tasks are solved by working with covariance matrices to summarize the...
Journal Article  Full Text Online
 2023 see arXiv. Working Paper
Sliced-Wasserstein on Symmetric Positive Definite Matrices for M/EEG Signals

Bonet, Clément; Malézieux, Benoît; Rakotomamonjy, Alain; Lucas Drumetz; Moreau, Thomas; et al. arXiv.org; Ithaca, Mar 10, 2023.
Full Text

Abstract/DetailsGet full text
opens in a new window

Cited by 9
 Related articles All 11 versions 
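For readers new to the tool, here is a minimal Monte-Carlo sketch of the sliced 1-Wasserstein distance between plain point clouds in R^d (the paper's SPD-matrix version needs projections adapted to that manifold; the function name and sample data below are illustrative only):

    import numpy as np
    from scipy.stats import wasserstein_distance

    def sliced_wasserstein(x, y, n_proj=100, seed=0):
        # Average the 1-D Wasserstein distance of projections of the two
        # point clouds x (n, d) and y (m, d) onto random unit directions.
        rng = np.random.default_rng(seed)
        total = 0.0
        for _ in range(n_proj):
            theta = rng.normal(size=x.shape[1])
            theta /= np.linalg.norm(theta)          # random direction on the sphere
            total += wasserstein_distance(x @ theta, y @ theta)
        return total / n_proj

    rng = np.random.default_rng(1)
    x = rng.normal(size=(200, 3))
    y = rng.normal(loc=0.5, size=(300, 3))
    print(sliced_wasserstein(x, y))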


2023 see 2022. Working Paper
Quantitative Stability of Barycenters in the Wasserstein Space

Carlier, Guillaume; Delalande, Alex; Merigot, Quentin. arXiv.org; Ithaca, Mar 10, 2023.
Full Text

Abstract/DetailsGet full text
opens in a new window


 2023 see arXiv. Working Paper
Entropic Wasserstein Component Analysis

Collas, Antoine; Vayer, Titouan; Flamary, Rémi; Breloy, Arnaud. arXiv.org; Ithaca, Mar 9, 2023.
Full Text

Abstract/DetailsGet full text
opens in a new window

Cited by 4 Related articles All 16 versions

Ultralimits of Wasserstein spaces and metric measure spaces with Ricci curvature bounded from below
by Warren, Andrew
03/2023
We investigate the stability of the Wasserstein distance, a metric structure on the space of probability measures arising from the theory of optimal transport,...
Journal Article  Full Text Online
 2023 see arXiv. Working Paper
Ultralimits of Wasserstein spaces and metric measure spaces with Ricci curvature bounded from below

Warren, Andrew. arXiv.org; Ithaca, Mar 8, 2023.
Full TextAbstract/DetailsGet full text
opens in a new window

Related articles All 3 versions 


2023 see arXiv. Working Paper
A note on the Bures-Wasserstein metric

Mohan, Shravan. arXiv.org; Ithaca, Mar 7, 2023.
Full Text

Abstract/DetailsGet full text
opens in a new window


2023



arXiv:2303.03284 [pdf, other] cs.LG cs.AI
The Wasserstein Believer: Learning Belief Updates for Partially Observable Environments through Reliable Latent Space Models
Authors: Raphael Avalos, Florent Delgrange, Ann Nowé, Guillermo A. Pérez, Diederik M. Roijers
Abstract: Partially Observable Markov Decision Processes (POMDPs) are useful tools to model environments where the full state cannot be perceived by an agent. As such the agent needs to reason taking into account the past observations and actions. However, simply remembering the full history is generally intractable due to the exponential growth in the history space. Keeping a probability distribution that…  More
Submitted 6 March, 2023; originally announced March 2023.
 Working Paper
The Wasserstein Believer: Learning Belief Updates for Partially Observable Environments through Reliable Latent Space Models

Avalos, Raphael; Delgrange, Florent; Nowé, Ann; Pérez, Guillermo A; Roijers, Diederik M. arXiv.org; Ithaca, Mar 6, 2023.
Full Text

Abstract/DetailsGet full text
opens in a new window
All 5 versions
 


2023 see arXiv. Working Paper
Low Budget Active Learning via Wasserstein Distance: An Integer Programming Approach

Mahmood, Rafid; Fidler, Sanja; Law, Marc T. arXiv.org; Ithaca, Mar 7, 2023.
Full Text

Abstract/DetailsGet full text
opens in a new window

Critical Points and Convergence Analysis of Generative Deep Linear Networks Trained with Bures-Wasserstein Loss
by Bréchet, Pierre; Papagiannouli, Katerina; An, Jing; More...
03/2023
We consider a deep matrix factorization model of covariance matrices trained with the Bures-Wasserstein distance. While recent works have made important...
Journal Article  Full Text Online
2023 see arXiv. Working Paper
Critical Points and Convergence Analysis of Generative Deep Linear Networks Trained with Bures-Wasserstein Loss

Bréchet, Pierre; Papagiannouli, Katerina; An, Jing; Montúfar, Guido. arXiv.org; Ithaca, Mar 6, 2023.
Full Text

Abstract/DetailsGet full text
opens in a new window
Cited by 4
 Related articles All 6 versions 


Wasserstein Actor-Critic: Directed Exploration via Optimism for Continuous-Actions Control
by Likmeta, Amarildo; Sacco, Matteo; Metelli, Alberto Maria; More...
03/2023
Uncertainty quantification has been extensively used as a means to achieve efficient directed exploration in Reinforcement Learning (RL). However,...
Journal Article  Full Text Online
2023 see arXiv. Working Paper
Wasserstein Actor-Critic: Directed Exploration via Optimism for Continuous-Actions Control

Likmeta, Amarildo; Sacco, Matteo; Metelli, Alberto Maria; Restelli, Marcello. arXiv.org; Ithaca, Mar 4, 2023.
Full Text

Abstract/DetailsGet full text
opens in a new window
All 2 versions 


2023 see arxiv. Working Paper

Extending the Wasserstein metric to positive measures
by Leblanc, Hugo; Le Gouic, Thibaut; Liandrat, Jacques; More...
03/2023
We define a metric in the space of finite positive measures that extends the 2-Wasserstein metric, i.e. its restriction to the set of probability...
Journal Article  Full Text Online
Extending the Wasserstein metric to positive measures

Leblanc, Hugo; Thibaut Le Gouic; Liandrat, Jacques; Tournus, Magali. arXiv.org; Ithaca, Mar 3, 2023.
Full Text

Abstract/DetailsGet full text
opens in a new window

All 2 versions 

<–—2023———2023———450 —



2023 patent

Variational autoencoder and Wasserstein generative adversarial network-based incremented software measurement defect data augmentation method, involves utilizing Wasserstein generative adversarial network to generate latent vector, and inputting generated code into decoder

CN115630612-A

Inventor(s) GUO Z

Assignee(s) GUO Z

Derwent Primary Accession Number 

2023-132479


DPBA-WGAN: A Vector-Valued Differential Private Bilateral Alternative Scheme on WGAN for Image Generation

Wu, DH; Zhang, WY and Zhang, PF

2023 | 

IEEE ACCESS

 11 , pp.13889-13905

The large amount of sensitive personal information used in deep learning models has attracted considerable attention for privacy security. Sensitive data may be memorialized or encoded into the parameters or the generation of the Wasserstein Generative Adversarial Networks (WGAN), which can be prevented by implementing privacy-preserving algorithms during the parameter training process. Meanwhi

Show more

Free Full Text from Publishermore_horiz

53 References. Related records

 Related articles All 2 versions


Method for generating conditional data for generating target data, involves inputting received conditional data to learned conditional Wasserstein generator to generate target data for conditional data, where conditional Wasserstein generator is learned

KR2023023464-A

Inventor(s) JO M; KIM Y and LEE K B

Assignee(s) UNIV SEOUL NAT R & DB FOUND

Derwent Primary Accession Number 

2023-21420J

 

2023 patent

TextureWGAN: texture preserving WGAN with multitask regularizer for computed tomography inverse problems.

Ikuta, Masaki and Zhang, Jun

2023-mar | 

Journal of medical imaging (Bellingham, Wash.)

 10 (2) , pp.024003

Purpose: This paper presents a deep learning (DL) based method called TextureWGAN. It is designed to preserve image texture while maintaining high pixel fidelity for computed tomography (CT) inverse problems. Over-smoothed images by postprocessing algorithms have been a well-known problem in the medical imaging industry. Therefore, our method tries to solve the over-smoothing problem without co

Show more

View full textmore_horiz 

All 4 versions
2023 patent

Method for modeling a vehicle following behavior based on WGAN, involves making discriminator difficult to judge result of prediction is false, so as to minimize error between prediction result and real result

CN115630683-A

Inventor(s) LI J; LI C; (...); XU D

Assignee(s) UNIV ZHEJIANG TECHNOLOGY

Derwent Primary Accession Number 

2023-13424N


2023


Shortfall-Based Wasserstein Distributionally Robust Optimization

Li, RX; Lv, WH and Mao, TT

Feb 2023 |  

Related articles All 5 versions 


2023 patent

Method for predicting telecommunication client loss based on condition Wasserstein generative adversarial network used in e.g. telecommunication industry, involves processing telecommunication client data set based on comprehensive generative adversarial network model of WGANGP and CGAN

CN115688048-A

Inventor(s) XIE X; WEI L and SU C

Assignee(s) UNIV CHONGQING POSTS & TELECOM

Derwent Primary Accession Number 

2023-183199

Method for identifying domain adaptation abnormality of combined Wasserstein distance and difference measure for chest image piece utilized in medical image processing field, involves performing classification prediction task of sternum

CN115601535-A

Inventor(s) XU Z; CHEN H; (...); CHEN Y

Assignee(s) HANGZHOU SHUZHILAIDA TECHNOLOGY CO LTD and UNIV HANGZHOU DIANZI

Derwent Primary Accession Number 

2023-093898


2023 patent

Gradient inversion method, involves generating random noise data of Gaussian distribution, performing iteration inversion to virtual gradients and to-be-inverted model by using Wasserstein distance to obtain original training privacy data

CN115661610-A

Inventor(s) TAN W; HE X; (...); PENG C

Assignee(s) UNIV GUIZHOU

Derwent Primary Accession Number 

2023-17519T


2023 patent

Mixed neural network and generating countermeasure based power system small sample load forecasting method, involves calculating wasserstein distance between source field and target field distribution, and updating target field characteristic extractor parameter

CN115640901-A

Inventor(s) CHEN Y; ZENG J; (...); LIU J

Assignee(s) UNIV SOUTH CHINA TECHNOLOGY

Derwent Primary Accession Number 

2023-151025

<–—2023———2023———460 —



Research on bearing vibration signal generation method based on filtering WGAN_GP with small samples

Li, JS; Liu, T and Wu, X

Feb 2023 (Early Access) | 

PROCEEDINGS OF THE INSTITUTION OF MECHANICAL ENGINEERS PART C-JOURNAL OF MECHANICAL ENGINEERING SCIENCE

In the practical application of bearing fault diagnosis, the data imbalance problems caused by the lack of available fault data lead to inaccurate diagnosis. The high cost and difficulty of obtaining fault samples has become an obstacle to the development of intelligent diagnosis technology. Aiming at the problem of data imbalance caused by small samples, this paper proposes a data generation m

Show more

Full Text at Publishermore_horiz

50 References. Related records



2023 see 2022

Simple approximative algorithms for free-support Wasserstein barycenters

von Lindheim, J

Mar 2023 (Early Access) | 

COMPUTATIONAL OPTIMIZATION AND APPLICATIONS

Computing Wasserstein barycenters of discrete measures has recently attracted considerable attention due to its wide variety of applications in data science. In general, this problem is NP-hard, calling for practical approximative algorithms. In this paper, we analyze a well-known simple framework for approximating Wasserstein -p barycenters, where we mainly consider the most common case p = 2

Show more

Free Full Text From Publishermore_horiz

59 References. Related records


2023 patent

Single battery consistency detection algorithm based on vehicle-connected network large data platform used in field of new energy automobile and scale energy storage, has Wasserstein distance calculated between abnormal and average monomer voltage sequences

CN115792681-A

Inventor(s) GAO K and DAI R

Assignee(s) ZHEJIANG LEAP ENERGY TECHNOLOGY CO LTD

Derwent Primary Accession Number 

2023-30507E



2023 patent

Anti-sample generating system based on WGAN-Unet for intelligent traffic, has generator for constructing Unet architecture according to original sample data, performing feature extraction on input sample, pool and up-sampling, and outputting confrontation disturbance corresponding to original sample

CN115761399-A

Inventor(s) CHEN Y; SUN L; (...); QIN Z

Assignee(s) UNIV SOUTHEAST

Derwent Primary Accession Number 

2023-26661P

A novel prediction approach of polymer gear contact fatigue based on a WGAN-XGBoost model

Jia, CF; Wei, PT; (...); Liu, HJ

Mar 2023 (Early Access) | 

FATIGUE & FRACTURE OF ENGINEERING MATERIALS & STRUCTURES

Polymer gears have long been used on power transmissions with the fundamental durability data, including fatigue S-N curves, yielding important data informing reliable and compact designs. This paper proposed a prediction method for polyformaldehyde (POM) gear fatigue life based on the innovative WGAN-XGBoost algorithm. The findings generated herein revealed that the proposed method performs we

Show more

View full textmore_horiz

31 References. Related records

Cited by 1

2023

2023 patent

Building photovoltaic data completion method based on WGAN and whale optimization algorithm for a residential building, a commercial building, an office building, and an industrial building, involves obtaining historical photovoltaic output data on building roofs and preprocessing the data

CN115688982-A

Inventor(s) YANG Y; CAO K; (...); CUI L

Assignee(s) HUANENG JIANGSU INTEGRATED ENERGY SERVIC

Derwent Primary Accession Number 

2023-195556


2023 see 2022

Rectified Wasserstein Generative Adversarial Networks for Perceptual Image Restoration

Ma, HC; Liu, D and Wu, F

Mar 1 2023 | 

IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE

 45 (3) , pp.3648-3663

Wasserstein generative adversarial network (WGAN) has attracted great attention due to its solid mathematical background, i.e., to minimize the Wasserstein distance between the generated distribution and the distribution of interest. In WGAN, the Wasserstein distance is quantitatively evaluated by the discriminator, also known as the critic. The vanilla WGAN trained the critic with the simple L

Show more

Full Text at Publishermore_horiz

62 References. Related records
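As background for this and the other WGAN entries, the critic mentioned in the abstract estimates the Kantorovich–Rubinstein dual of the 1-Wasserstein distance; this is the standard formulation, not the rectified variant proposed in the paper:

    W_1(P_r, P_g) = \sup_{\|f\|_{L} \le 1} \ \mathbb{E}_{x \sim P_r}[f(x)] - \mathbb{E}_{x \sim P_g}[f(x)]

The critic f is approximated by a neural network kept approximately 1-Lipschitz (by weight clipping or a gradient penalty), and the generator is trained to minimize the resulting estimate.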


Distributionally robust learning-to-rank under the Wasserstein metric.

Sotudian, Shahabeddin; Chen, Ruidi and Paschalidis, Ioannis Ch

2023-03-30 | 

PloS one

 18 (3) , pp.e0283574

Despite their satisfactory performance, most existing listwise Learning-To-Rank (LTR) models do not consider the crucial issue of robustness. A data set can be contaminated in various ways, including human error in labeling or annotation, distributional data shift, and malicious adversaries who wish to degrade the algorithm's performance. It has been shown that Distributionally Robust Optimizat

Show more

Free Full Text from Publishermore_horiz

All 5 versions


2023 patent

Method for correlating pedestrian target in large-scale video monitoring network based on maximum slice Wasserstein measurement for use in smart city, involves performing merge association within single camera for detection and tracking records of pedestrian targets formed by entire cameras

CN115630190-A

Inventor(s) JU L; ZHANG J; (...); CHEN L

Assignee(s) NANJING INFORMATION INST TECHNOLOGY

Derwent Primary Accession Number 

2023-13414D


 2023 patent

Method for expanding aviation engine assembly vibration sample based on regression Wasserstein deep convolution generative adversarial network model for aviation engine data processing field, involves generating data set using random noise training regression model, and performing data enhancement

CN115640515-A

Inventor(s) ZHOU X; LIN L; (...); ZHONG S

Assignee(s) SHANDONG TIANLAN INFORMATION TECHNOLOGY CO LTD

Derwent Primary Accession Number 

2023-17952E

CONVERGENCE IN WASSERSTEIN DISTANCE FOR EMPIRICAL MEASURES OF SEMILINEAR SPDES

Wang, FY

Feb 2023 | 

ANNALS OF APPLIED PROBABILITY

 33 (1) , pp.70-84

The convergence rate in Wasserstein distance is estimated for the empirical measures of symmetric semilinear SPDEs. Unlike in the finite-dimensional case, where the convergence is of algebraic order in time, in the present situation the convergence is of log order with a power given by eigenvalues of the underlying linear operator.

Free Accepted Article From RepositoryFull Text at Publishermore_horiz

12 References. Related records

Cited by 13 Related articles All 8 versions
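A quick numerical illustration of the underlying notion, in one dimension only (nothing like the infinite-dimensional SPDE setting of the paper): the Wasserstein distance between an empirical measure and a large reference sample from the same law shrinks as the sample size grows.

    import numpy as np
    from scipy.stats import wasserstein_distance

    rng = np.random.default_rng(0)
    reference = rng.normal(size=200_000)        # stand-in for the target law
    for n in (100, 1_000, 10_000):
        sample = rng.normal(size=n)             # empirical measure with n atoms
        print(n, wasserstein_distance(sample, reference))
    # the printed W1 values decrease roughly like n**-0.5 in this 1-D example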

<–—2023———2023———470 —


Hydrological objective functions and ensemble averaging with the Wasserstein distance

Magyar, JC and Sambridge, M

Mar 6 2023 | 

HYDROLOGY AND EARTH SYSTEM SCIENCES

 27 (5) , pp.991-1010

When working with hydrological data, the ability to quantify the similarity of different datasets is useful. The choice of how to make this quantification has a direct influence on the results, with different measures of similarity emphasising particular sources of error (for example, errors in amplitude as opposed to displacements in time and/or space). The Wasserstein distance considers the s

Show more

Free Full Text from Publishermore_horiz

38 References. Related records

Cited by 4 Related articles All 11 versions 
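To illustrate the point the abstract makes about error sources, a small sketch comparing an amplitude-based misfit (RMSE) with the 1-D Wasserstein distance for two identical hydrograph-like pulses offset in time (synthetic data, not from the paper):

    import numpy as np
    from scipy.stats import wasserstein_distance

    t = np.linspace(0, 100, 1001)
    def pulse(center):                       # synthetic hydrograph-like pulse
        return np.exp(-0.5 * ((t - center) / 5.0) ** 2)

    q_obs, q_sim = pulse(40.0), pulse(45.0)  # simulated peak arrives 5 time units late

    rmse = np.sqrt(np.mean((q_obs - q_sim) ** 2))
    # treat the (normalized) hydrographs as probability distributions over time
    w1 = wasserstein_distance(t, t, u_weights=q_obs, v_weights=q_sim)
    print(rmse, w1)                          # w1 is about 5, i.e. the timing error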

Sharp Wasserstein estimates for integral sampling and Lorentz summability of transport densities

Santambrogio, F

Feb 15 2023 | 

JOURNAL OF FUNCTIONAL ANALYSIS

 284 (4)

We prove some Lorentz-type estimates for the average in time of suitable geodesic interpolations of probability measures, obtaining as a by product a new estimate for transport densities and a new integral inequality involving Wasserstein distances and norms of gradients. This last inequality was conjectured in a paper by S. Steinerberger.(c) 2022 Elsevier Inc. All rights reserved.

Full Text at Publishermore_horiz

23 References. Related records

Related articles All 2 versions

Towards Generating Realistic Wrist Pulse Signals Using Enhanced One Dimensional Wasserstein GAN

Chang, JX; Hu, F; (...); Huang, LQ

Feb 2023 | 

SENSORS

 23 (3)

For the past several years, there has been an increasing focus on deep learning methods applied into computational pulse diagnosis. However, one factor restraining its development lies in the small wrist pulse dataset, due to privacy risks or lengthy experiments cost. In this study, for the first time, we address the challenging by presenting a novel one-dimension generative adversarial network

Show more

Free Full Text from Publishermore_horiz

57 References.  Related records 

All 8 versions 

Improved Domain Adaptation Network Based on Wasserstein Distance for Motor Imagery EEG Classification

She, QS; Chen, T; (...); Zhang, YC

2023 | 

IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION ENGINEERING

 31 , pp.1137-1148

Motor Imagery (MI) paradigm is critical in neural rehabilitation and gaming. Advances in brain-computer interface (BCI) technology have facilitated the detection of MI from electroencephalogram (EEG). Previous studies have proposed various EEG-based classification algorithms to identify the MI, however, the performance of prior models was limited due to the cross-subject heterogeneity in EEG da

Show more

Free Full Text From Publisher

39 References. Related records
Cited by 22
 Related articles All 3 versions



2023

WASCO: A Wasserstein-based Statistical Tool to Compare Conformational Ensembles of Intrinsically Disordered Proteins.

Gonzalez-Delgado, Javier; Sagar, Amin; (...); Cortes, Juan

2023-mar-18 | 

Journal of molecular biology

 , pp.168053

The structural investigation of intrinsically disordered proteins (IDPs) requires ensemble models describing the diversity of the conformational states of the molecule. Due to their probabilistic nature, there is a need for new paradigms that understand and treat IDPs from a purely statistical point of view, considering their conformational ensembles as well-defined probability distributions. I

Show more

Free Submitted Article From RepositoryView full textmore_horiz 


Unified Topological Inference for Brain Networks in Temporal Lobe Epilepsy Using the Wasserstein Distance.

Chung, Moo K; Ramos, Camille Garcia; (...); Struck, Aaron F

2023-02-13 |

ArXiv

Persistent homology can extract hidden topological signals present in brain networks. Persistent homology summarizes the changes of topological structures over multiple different scales called filtrations. Doing so detect hidden topological signals that persist over multiple scales. However, a key obstacle of applying persistent homology to brain network studies has always been the lack of cohe

Cited by 1 All 6 versions

Human-related anomalous event detection via memory-augmented Wasserstein generative adversarial network with gradient penalty

Li, NJ; Chang, FL and Liu, CS

Jun 2023 | Feb 2023 (Early Access) | 

PATTERN RECOGNITION

 138

Timely detection of human-related anomaly in surveillance videos is a challenging task. Generally, the irregular human motion and action patterns can be regarded as abnormal human-related events. In this paper, we utilize the skeleton trajectories to learn the regularities of human motion and action in videos for anomaly detection. The skeleton trajectories are decomposed into global and local

Show more

Full Text at Publisher

Cited by 11 Related articles All 3 versions

2023 see 2022

An Efficient Content Popularity Prediction of Privacy Preserving Based on Federated Learning and Wasserstein GAN

Wang, KL; Deng, N and Li, XH

Mar 1 2023 | 

IEEE INTERNET OF THINGS JOURNAL

 10 (5) , pp.3786-3798

To relieve the high backhaul load and long transmission time caused by the huge mobile data traffic, caching devices are deployed at the edge of mobile networks. The key to efficient caching is to predict the content popularity accurately while touching the users' privacy as little as possible. Recently, many studies have applied federated learning in content caching to improve data security. H

Show more

Full Text at Publishermore_horiz

43 References. Related records

Medical multivariate time series imputation and forecasting based on a recurrent conditional Wasserstein GAN and attention

Festag, S and Spreckelsen, C

Mar 2023 | Feb 2023 (Early Access) | 

JOURNAL OF BIOMEDICAL INFORMATICS  139

Objective: In the fields of medical care and research as well as hospital management, time series are an important part of the overall data basis. To ensure high quality standards and enable suitable decisions, tools for precise and generic imputations and forecasts that integrate the temporal dynamics are of great importance. Since forecasting and imputation tasks involve an inherent uncertain

Show more

Free Full Text From Publishermore_horiz

22  References   Related records

<–—2023———2023———480 —



A multi-period emergency medical service location problem based on Wasserstein-metric approach using generalised benders decomposition method

Yuan, YF; Song, QK and Zhou, B

Jan 2023 (Early Access) | 

INTERNATIONAL JOURNAL OF SYSTEMS SCIENCE

This paper considers a multi-period location and sizing problem for an emergency medical service (EMS) system based on a distributionally robust optimisation (DRO) chance-constrained programming approach. The dynamic uncertain emergency medical requests are described in the ambiguity set, which is constructed based on Wasserstein-metric. The model of this problem focuses on minimising long-term

Show more

Full Text at Publishermore_horiz

40 References. Related records

 

Protein secondary structure prediction based on Wasserstein generative adversarial networks and temporal convolutional networks with convolutional block attention modules

Yuan, L; Ma, YM and Liu, YH

2023 | 

MATHEMATICAL BIOSCIENCES AND ENGINEERING

 20 (2) , pp.2203-2218

As an important task in bioinformatics, protein secondary structure prediction (PSSP) is not only beneficial to protein function research and tertiary structure prediction, but also to promote the design and development of new drugs. However, current PSSP methods cannot sufficiently extract effective features. In this study, we propose a novel deep learning model WGACSTCN, which combines Wasser

Show more

Free Full Text from Publishermore_horiz

40 References. Related records


Prediction of Tumor Lymph Node Metastasis Using Wasserstein Distance-Based Generative Adversarial Networks Combing with Neural Architecture Search for Predicting

Wang, YW and Zhang, SH

Feb 2023 | 

MATHEMATICS

 11 (3)

Long non-coding RNAs (lncRNAs) play an important role in development and gene expression and can be used as genetic indicators for cancer prediction. Generally, lncRNA expression profiles tend to have small sample sizes with large feature sizes; therefore, insufficient data, especially the imbalance of positive and negative samples, often lead to inaccurate prediction results. In this study, we

Show more

Free Full Text from Publishermore_horiz

36 References. Related records
Cited by 1
 All 4 versions 


Multi-source monitoring information fusion method for dam health diagnosis based on Wasserstein distance

A Chen, X Tang, BC Cheng, JP He - Information Sciences, 2023 - Elsevier

… The BWD proposed in this paper is obtained by combining Wasserstein–1 distance with 

BPAs. It not only retains the excellent mathematical properties of Wasserstein–1 distance but …


Parameterized Wasserstein means

S Kim - Journal of Mathematical Analysis and Applications, 2023 - Elsevier

… the Wasserstein mean for t = 1 / 2 , so we call it the parameterized Wasserstein mean … 

In this paper, we investigate a norm inequality of the parameterized Wasserstein mean, give …

Zbl 07680629
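For orientation, the unparameterized two-matrix Wasserstein mean that this line of work starts from is the midpoint of the Bures–Wasserstein geodesic between positive definite matrices A and B; the t-weighted version reads (the parameterization studied in the paper may differ):

    A \diamond_t B = \big((1-t)I + tT\big)\, A\, \big((1-t)I + tT\big), \qquad T = A^{-1/2}\big(A^{1/2} B A^{1/2}\big)^{1/2} A^{-1/2}

Here T is the optimal transport map between the centered Gaussians with covariances A and B, and t = 1/2 recovers the Wasserstein mean mentioned in the snippet.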

2023


[PDF] arxiv.org

A kernel formula for regularized Wasserstein proximal operators

W Li, S Liu, S Osher - arXiv preprint arXiv:2301.10301, 2023 - arxiv.org

… One has to develop an optimization step to compute or approximate Wasserstein metrics … 

the Wasserstein proximal operator. We use an optimal control formulation of the Wasserstein

  All 4 versions


Distributionally Robust Chance Constrained Games under Wasserstein Ball

T Xia, J Liu, A Lisser - Operations Research Letters, 2023 - Elsevier

… This paper considers distributionally robust chance constrained games with a Wasserstein 

distance based uncertainty set. We assume that the center of the uncertainty set is an …

Zbl 07705459

[PDF] arxiv.org

Optimal transport and Wasserstein distances for causal models

S Eckstein, P Cheridito - arXiv preprint arXiv:2303.14085, 2023 - arxiv.org

… in causal Wasserstein distance. Finally, in Section 4.3 we study a Wasserstein interpolation 

… that respects the causal structure and compare it to standard Wasserstein interpolation. …

All 2 versions

 

 

arXiv:2304.02402 [pdf, other] stat.ME math.PR math.ST
Wasserstein Principal Component Analysis for Circular Measures
Authors: Mario Beraha, Matteo Pegoraro
Abstract: We consider the 2-Wasserstein space of probability measures supported on the unit-circle, and propose a framework for Principal Component Analysis (PCA) for data living in such a space. We build on a detailed investigation of the optimal transportation problem for measures on the unit-circle which might be of independent interest. In particular, we derive an expression for optimal transport maps i…  More
Submitted 5 April, 2023; originally announced April 2023.

All 3 versions 

arXiv:2304.01343 [pdf, other] math.OC cs.DM
Distributionally robust mixed-integer programming with Wasserstein metric: on the value of uncertain data
Authors: Sergey S. Ketkov
Abstract: This study addresses a class of linear mixed-integer programming (MIP) problems that involve uncertainty in the objective function coefficients. The coefficients are assumed to form a random vector, which probability distribution can only be observed through a finite training data set. Unlike most of the related studies in the literature, we also consider uncertainty in the underlying data set. Th…  More
Submitted 3 April, 2023; originally announced April 2023.

<–—2023———2023———490 —



 

arXiv:2303.18067 [pdf, other] physics.ao-ph stat.AP
Rediscover Climate Change during Global Warming Slowdown via Wasserstein Stability Analysis
Authors: Zhiang Xie, Dongwei Chen, Puxi Li
Abstract: Climate change is one of the key topics in climate science. However, previous research has predominantly concentrated on changes in mean values, and few research examines changes in Probability Distribution Function (PDF). In this study, a novel method called Wasserstein Stability Analysis (WSA) is developed to identify PDF changes, especially the extreme event shift and non-linear physical value…  More
Submitted 29 March, 2023; originally announced March 2023.
Comments: 12 pages, 4 figures, and 1 Algorithm  


Well-posedness of Hamilton-Jacobi equations on the Wasserstein space on graphs

Wilfrid Gangbo, University of California, Los Angeles (UCLA)
Slides

Abstract:

We study a Hamilton-Jacobi equation on the Wasserstein space on graphs, in the presence of linear operators which include the discrete individual noise operator. Under appropriate conditions, slightly different from the ones covered by the classical theory, we prove a comparison principle, which allows one to apply standard arguments for a well-posedness theory.
(This talk is based on a joint work with C. Mou and A. Swiech).

1155 E. 60th Street, Chicago, IL 60637

Monday, 

February 20, 2023


[PDF] wiley.com

Multi‐marginal Approximation of the Linear Gromov–Wasserstein Distance

F Beier, R Beinert - PAMM, 2023 - Wiley Online Library

Recently, two concepts from optimal transport theory have successfully been brought to the 

Gromov–Wasserstein (GW) setting. This introduces a linear version of the GW distance and …

Related articles All 2 versions


[PDF] fgv.br

[PDF] 1D-Wasserstein approximation of measures

A Chambolle, JM Machado - eventos.fgv.br

• (Pλ) always admits a solution ν.
• If ρ0 has an L∞ density w.r.t. H¹, so does ν.
• If ρ0 ∈ P(R²) does not give mass to 1D sets, then ν is also a solution to (Pλ).
• Σ is Ahlfors regular: there is C…


[PDF] researchgate.net

[PDF] Wasserstein Bounds in the CLT of Estimators of the Drift Parameter for Ornstein-Uhlenbeck Processes Observed at High Frequency

M AL-FORAIH - Conference Proceedings Report, 2023 - researchgate.net

This paper deals with the rate of convergence for the central limit theorem of estimators of the 

drift coefficient, denoted θ, for a Ornstein-Uhlenbeck process X:={Xt, t≥ 0} observed at high …

All 3 versions


2023

2023 book

 


 

[HTML] cmes.org

[HTML] An imprecise probability model updating method based on the Wasserstein distance measure (基于 Wasserstein 距离测度的非精确概率模型修正方法)

Yang Lechang, Han Dongxu, Wang Pidong - Journal of Mechanical Engineering (机械工程学报), 2023 - qikan.cmes.org

… To address this problem, a model updating method based on the Wasserstein distance measure is proposed. The method builds a kernel function on the Wasserstein distance measure and uses the geometric properties of the Wasserstein distance in the p-dimensional parameter space to quantify the discrepancy between different probability distributions …

 All 2 versions

[Chinese.  Correction Method of Inexact Probability Model Based on Wasserstein Distance Measure]


[PDF] openreview.net

Provable Robustness against Wasserstein Distribution Shifts via Input Randomization

A Kumar, A Levine, T Goldstein, S Feizi - The Eleventh International … - openreview.net

Certified robustness in machine learning has primarily focused on adversarial perturbations 

with a fixed attack budget for each sample in the input distribution. In this work, we present …

 

[HTML] hanspub.org

[HTML] Eight-class protein secondary structure prediction based on Wasserstein generative adversarial networks and residual networks (基于 Wasserstein 生成对抗网络和残差网络的 8 类蛋白质二级结构预测)

Li Shun, Ma Yuming, Liu Yihui - Hans Journal of Computational Biology, 2023 - hanspub.org

… a method for 8-state protein secondary structure prediction using the Wasserstein generative adversarial network (WGAN) and residual network (ResNet). The method first uses the Wasserstein generative adversarial network (WGAN) … Experiments show that the Wasserstein generative adversarial network (WGAN) …

 All 3 versions

[Chinese. 8 Protein Classes Based on Wasserstein Generative Adversarial Networks and Residual Networks]


Hamilton-Jacobi-Bellman equation on the Wasserstein Space P₂(ℝ^d)

H Frankowska, Z Badreddine - math.bas.bg

… type optimal control problem on the Wasserstein space P₂(ℝ^d) of Borel probability measures: 

… We also discuss some viability and invariance theorems in the Wasserstein space and …

<–—2023———2023———500—



 2023 see 2021. [PDF] hal.science

[PDF] A travers et autour des barycentres de Wasserstein [French: Through and around Wasserstein barycenters]

IP GENTIL, AR SUVORIKOVA - theses.hal.science

… We are mainly motivated by the Wasserstein barycenter problem introduced by M. Agueh 

and G. Carlier in 2011: … We refer to the recent monograph [PZ20] for more details on …


[HTML] mdpi.com

[HTML] A WGAN-GP-Based Scenarios Generation Method for Wind and Solar Power Complementary Study

X Ma, Y Liu, J Yan, H Wang - Energies, 2023 - mdpi.com

… In this paper, the generated scenarios with the highest probability generated by WGAN-GP 

are close … for WGAN-GP generated data under different complementary modes, respectively. …

All 5 versions 

[PDF] openreview.net

A Higher Precision Algorithm for Computing the 1-Wasserstein Distance

PK Agarwal, S Raghvendra, P Shirzadian… - … Conference on Learning … - openreview.net

We consider the problem of computing the $1$-Wasserstein distance $\mathcal{W}(\mu,\nu)$ 

between two $d$-dimensional discrete distributions $\mu$ and $\nu$ whose support lie …
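As a baseline for what such algorithms accelerate, here is a minimal exact computation of the 1-Wasserstein distance between two small discrete distributions, obtained by solving the optimal transport linear program directly (illustrative only; it scales far worse than the methods discussed above):

    import numpy as np
    from scipy.optimize import linprog
    from scipy.spatial.distance import cdist

    def wasserstein1(x, y, mu, nu):
        # Exact W1 between the discrete measures mu on points x (n, d)
        # and nu on points y (m, d), via the transportation linear program.
        n, m = len(mu), len(nu)
        C = cdist(x, y)                        # ground cost: Euclidean distances
        A_eq = np.zeros((n + m, n * m))
        for i in range(n):
            A_eq[i, i * m:(i + 1) * m] = 1.0   # row sums of the plan equal mu
        for j in range(m):
            A_eq[n + j, j::m] = 1.0            # column sums of the plan equal nu
        b_eq = np.concatenate([mu, nu])
        res = linprog(C.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
        return res.fun

    rng = np.random.default_rng(0)
    x, y = rng.normal(size=(5, 2)), rng.normal(size=(7, 2))
    print(wasserstein1(x, y, np.full(5, 1 / 5), np.full(7, 1 / 7)))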

 

A continual encrypted traffic classification algorithm based on WGAN

X Ma, W Zhu, Y Jin, Y Gao - Third International Seminar on …, 2023 - spiedigitallibrary.org

… In this paper, we propose a continual encrypted traffic classification method based on 

WGAN. We use WGAN to train a separate generator for each class of encrypted traffic. The …

All 2 versions


 
Wasserstein-...
by Qing Zhang; Qing Zhang; Yi Yan ; More...
Frontiers in energy research, 04/2023, Volume 11
Non-intrusive load monitoring (NILM) is a technique that uses electrical data analysis to disaggregate the total energy consumption of a building or home into...
Article PDFPDF
Journal Article  Full Text Online
View in Context Browse Journal
Open Access. A Wasserstein-based distributionally robust neural network for non-intrusive load monitoring

[CITATION] A Wasserstein-based distributionally robust neural network for non-intrusive load monitoring

Q Zhang, Y Yan, F Kong, S Chen, L Yang - Frontiers in Energy Research - Frontiers

All 2 versions

2023


[HTML] hanspub.org

[HTML] Optimization and expansion of a construction waste dataset based on WGAN-GP (基于 WGAN-GP 的建筑垃圾数据集的优化与扩充)

Wu Xinnuo - Computer Science and Application, 2023 - hanspub.org

… The model framework adopted in this paper is the Wasserstein GAN (WGAN) [11]. WGAN introduces a new way of measuring distance, using the Wasserstein distance (also called the Earth-Mover distance) to measure the distance between the distributions of real data and generated data, …

 [Chinese. Optimization and Expansion of Construction Waste Dataset Based on WGAN-GP]

2023  see 2021

[CITATION] Decomposition methods for Wasserstein-based data-driven distributionally robust problems

T Homem de Mello - 2023 - repositorio.uai.cl

Decomposition methods for Wasserstein-based data-driven distributionally robust problems … 

Decomposition methods for Wasserstein-based data-driven distributionally robust problems …


arXiv:2304.07415 [pdf, other] math.OC
Nonlinear Wasserstein Distributionally Robust Optimal Control
Authors: Zhengang Zhong, Jia-Jie Zhu
Abstract: This paper presents a novel approach to addressing the distributionally robust nonlinear model predictive control (DRNMPC) problem. Current literature primarily focuses on the static Wasserstein distributionally robust optimal control problem with a prespecified ambiguity set of uncertain system states. Although a few studies have tackled the dynamic setting, a practical algorithm remains elusive.…  More
Submitted 14 April, 2023; originally announced April 2023.
Comments: 13 pages, 3 figures

All 3 versions 

arXiv:2304.07048 [pdf, other] stat.ML cs.LG math.OC
Wasserstein PAC-Bayes Learning: A Bridge Between Generalisation and Optimisation
Authors: Maxime Haddouche, Benjamin Guedj
Abstract: PAC-Bayes learning is an established framework to assess the generalisation ability of learning algorithm during the training phase. However, it remains challenging to know whether PAC-Bayes is useful to understand, before training, why the output of well-known algorithms generalise well. We positively answer this question by expanding the \emph{Wasserstein PAC-Bayes} framework, briefly introduced…  More
Submitted 14 April, 2023; originally announced April 2023.

All 5 versions 

arXiv:2304.06783 [pdf, ps, other] math.OC cs.LG eess.SY
A Distributionally Robust Approach to Regret Optimal Control using the Wasserstein Distance
Authors: Shuhao Yan, Feras Al Taha, Eilyan Bitar
Abstract: This paper proposes a distributionally robust approach to regret optimal control of discrete-time linear dynamical systems with quadratic costs subject to stochastic additive disturbance on the state process. The underlying probability distribution of the disturbance process is unknown, but assumed to lie in a given ball of distributions defined in terms of the type-2 Wasserstein distance. In this…  More
Submitted 13 April, 2023; originally announced April 2023.
Comments: 6 pages

<–—2023———2023———510—


arXiv:2304.05398 [pdf, other] math.ST cs.LG math.OC
Forward-backward Gaussian variational inference via JKO in the Bures-Wasserstein Space
Authors: Michael Diao, Krishnakumar Balasubramanian, Sinho Chewi, Adil Salim
Abstract: Variational inference (VI) seeks to approximate a target distribution π by an element of a tractable family of distributions. Of key interest in statistics and machine learning is Gaussian VI, which approximates π by minimizing the Kullback-Leibler (KL) divergence to π over the space of Gaussians. In this work, we develop the (Stochastic) Forward-Backward Gaussian Variational Inference (FB-G…  More
Submitted 10 April, 2023; originally announced April 2023.
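The closed form behind the "Bures-Wasserstein space" referred to in this entry: the 2-Wasserstein distance between two Gaussians is explicit,

    W_2^2\big(\mathcal{N}(m_1,\Sigma_1), \mathcal{N}(m_2,\Sigma_2)\big) = \|m_1 - m_2\|^2 + \operatorname{tr}\!\Big(\Sigma_1 + \Sigma_2 - 2\big(\Sigma_1^{1/2}\Sigma_2\Sigma_1^{1/2}\big)^{1/2}\Big)

so Gaussian variational inference can be phrased as optimization over means and covariances equipped with this metric.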


-Wasserstein distance. (English) Zbl 07668806

J. Ind. Manag. Optim. 19, No. 2, 916-931 (2023).

MSC:  58F15 58F17 53C35



[HTML] cmes.org

[HTML] An imprecise probability model updating method based on the Wasserstein distance measure (基于 Wasserstein 距离测度的非精确概率模型修正方法)

Yang Lechang, Han Dongxu, Wang Pidong - Journal of Mechanical Engineering (机械工程学报), 2023 - qikan.cmes.org

… To address this problem, a model updating method based on the Wasserstein distance measure is proposed. The method builds a kernel function on the Wasserstein distance measure and uses the geometric properties of the Wasserstein distance in the p-dimensional parameter space to quantify the discrepancy between different probability distributions …

Related articles All 5 versions

[Chinese. Correction Method of Inexact Probability Model Based on Wasserstein Distance Measure]

Hamilton-Jacobi-Bellman equation on the Wasserstein Space P₂(ℝ^d)

H Frankowska, Z Badreddine - math.bas.bg

… type optimal control problem on the Wasserstein space P₂(ℝ^d) of Borel probability measures:

… We also discuss some viability and invariance theorems in the Wasserstein space and …



A Multi-Sensor Detection Method Based on WGAN-GP and Attention-Bi-GRU for Well Control Pipeline Defects

H Liang, Z Yang, Z Zhang - Journal of Nondestructive Evaluation, 2023 - Springer

… -sensor well control pipeline defect data, the WGAN-GP based enhanced model is used to

… % To be contribution and novelty (1) The WGAN-GP based enhancement model is used to …

 

2023


[PDF] openreview.net

A Higher Precision Algorithm for Computing the 1-Wasserstein Distance

PK Agarwal, S Raghvendra, P Shirzadian… - … Conference on Learning … - openreview.net

We consider the problem of computing the $1$-Wasserstein distance $\mathcal{W}(\mu,\nu)$

between two $d$-dimensional discrete distributions $\mu$ and $\nu$ whose support lie …


 2023 see 2021

[CITATION] Decomposition methods for Wasserstein-based data-driven distributionally robust problems

T Homem de Mello - 2023 - repositorio.uai.cl

Decomposition methods for Wasserstein-based data-driven distributionally robust problems …

Decomposition methods for Wasserstein-based data-driven distributionally robust problems …

 




 Class-rebalanced wasserstein...
by Wang, Qi; Wang, Shengsheng; Wang, Bilin
Applied intelligence (Dordrecht, Netherlands), 04/2023, Volume 53, Issue 7
In the study of machine learning, multi-source domain adaptation (MSDA) handles multiple datasets which are collected from different distributions by using...
Article PDFPDF
Journal Article  Full Text Online
Class-rebalanced wasserstein distance for multi-source domain adaptation


Reflecting image-dependent SDEs in Wasserstein...
by Yang, Xue
Stochastics (Abingdon, Eng. : 2005), 04/2023, Volume ahead-of-print, Issue ahead-of-print
In this article, we study a class of reflecting stochastic differential equations whose coefficients depend on image measures of solutions under a given...
Journal ArticleCitation Online

2023 see arXiv
Nonlinear Wasserstein...
by Zhong, Zhengang; Zhu, Jia-Jie
04/2023
This paper presents a novel approach to addressing the distributionally robust nonlinear model predictive control (DRNMPC) problem. Current literature...
Journal Article  Full Text Online

<–—2023———2023———520—



2023 see arXiv

Wasserstein Principal...
by Beraha, Mario; Pegoraro, Matteo
04/2023
We consider the 2-Wasserstein space of probability measures supported on the unit-circle, and propose a framework for Principal Component Analysis (PCA) for...
Journal Article  Full Text Online

Wasserstein Principal...
by Beraha, Mario; Pegoraro, Matteo
arXiv.org, 04/2023
We consider the 2-Wasserstein space of probability measures supported on the unit-circle, and propose a framework for Principal Component Analysis (PCA) for...
Paper  Full Text Online


  Wasserstein...
by Haddouche, Maxime; Guedj, Benjamin
04/2023
PAC-Bayes learning is an established framework to assess the generalisation ability of learning algorithm during the training phase. However, it remains...
Journal Article  Full Text Online
Wasserstein PAC-Bayes Learning: A Bridge Between Generalisation and Optimisation
Open Access

2023 see arXiv

A Distributionally Robust Approach to Regret Optimal Control using the Wasserstein...
by Yan, Shuhao; Taha, Feras Al; Bitar, Eilyan
04/2023
This paper proposes a distributionally robust approach to regret optimal control of discrete-time linear dynamical systems with quadratic costs subject to...
Journal Article  Full Text Online
 

2023 see arXiv
Forward-backward Gaussian variational inference via JKO in the Bures-Wasserstein...

by Diao, Michael; Balasubramanian, Krishnakumar; Chewi, Sinho ; More...
04/2023
Variational inference (VI) seeks to approximate a target distribution $\pi$ by an element of a tractable family of distributions. Of key interest in statistics...
Journal Article  Full Text Online
Open Access

2023

2023 see arXiv
Distributionally robust mixed-integer programming with Wasserstein...
by Ketkov, Sergey S
04/2023
This study addresses a class of linear mixed-integer programming (MIP) problems that involve uncertainty in the objective function coefficients. The...
Journal Article  Full Text Online
Open Access

2023 patent news see journal article
Quanzhou Institute of Equipment Mfg Submits Chinese Patent Application for Wasserstein...
Global IP News. Electrical Patent News, 04/2023
Newsletter  Full Text Online
Wasserstein PAC-Bayes Learning: A Bridge Between Generalisation and Optimisation

Quanzhou Institute of Equipment Mfg Submits Chinese Patent Application for Wasserstein...
Global IP News: Electrical Patent News, Apr 10, 2023
Newspaper Article
All 5 versions
 

 2023 patent news see2022 patent
Xiao Fuyuan Applies for Patent on Application of Evidence Wasserstein...
Global IP News: Software Patent News, Apr 5, 2023
Newspaper ArticleCitation Online
 Evidence Wasserstein Distance Algorithm in Aspect of Component


2023 see 2022

MR4577196 Prelim Cui, Jianbo; Liu, Shu; Zhou, Haomin; 

Wasserstein Hamiltonian Flow with Common Noise on Graph. SIAM J. Appl. Math. 83 (2023), no. 2, 484–509. 58B20 (35Q41 49Q20 58J65)

Review PDF Clipboard Journal Article
Cited by 3
 Related articles All 2 versions


2023 see 2022

MR4575023 Prelim Konarovskyi, Vitalii; 

Coalescing-fragmentating Wasserstein dynamics: Particle approach. Ann. Inst. Henri Poincaré Probab. Stat. 59 (2023), no. 2, 983–1028. 60 (82)

Review PDF Clipboard Journal Article

Cited by 9 Related articles All 7 versions

<–—2023———2023———530—


MR4575022 Prelim Barrera, Gerardo; 

Lukkarinen, Jani; Quantitative control of Wasserstein distance between Brownian motion and the Goldstein–Kac telegraph process. Ann. Inst. Henri Poincaré Probab. Stat. 59 (2023), no. 2, 933–982. 60 (35)

Review PDF Clipboard Journal Article

Related articles All 4 versions

2023 see 2022

MR4575021 Prelim Fuhrmann, Sven; Kupper, Michael; Nendel, Max; 

Wasserstein perturbations of Markovian transition semigroups. Ann. Inst. Henri Poincaré Probab. Stat. 59 (2023), no. 2, 904–932. 60 (35 49)

Review PDF Clipboard Journal Article

Cited by 6 Related articles All 9 versions 
Wasserstein perturbations of Markovian transition semigroups


MR4573902 Prelim Li, Zhengyang; Tang, Yijia; Chen, Jing; Wu, Hao; 

On quadratic Wasserstein metric with squaring scaling for seismic velocity inversion. Numer. Math. Theory Methods Appl. 16 (2023), no. 2, 277–297. 49 (65 86)

Review PDF Clipboard Journal Article

MR4571311 Prelim Lindheim, Johannes von; 

Simple approximative algorithms for free-support Wasserstein barycenters. Comput. Optim. Appl. 85 (2023), no. 1, 213–246. 65D18 (49Q22 90B80)

Review PDF Clipboard Journal Article


MR4571253 Prelim Xia, Tian; Liu, Jia; Lisser, Abdel; 

Distributionally robust chance constrained games under Wasserstein ball. Oper. Res. Lett. 51 (2023), no. 3, 315–321. 91

Review PDF Clipboard Journal Article

Related articles All 2 versions

2023


2023 see 2022

MR4570453 Prelim Minh, Hà Quang; 

Convergence and finite sample approximations of entropic regularized Wasserstein distances in Gaussian and RKHS settings. Anal. Appl. (Singap.) 21 (2023), no. 3, 719–775. 28C20 (46E22 49Q22 60B10)

Review PDF Clipboard Journal Article


MR4570293 Prelim Kim, Sejong; 

Parameterized Wasserstein means. J. Math. Anal. Appl. 525 (2023), no. 1, Paper No. 127272. 15B48 (15A45 47A63 47A64)

Review PDF Clipboard Journal Article

 
MR4567275
 Prelim Nguyen, Viet Anh; Shafieezadeh-Abadeh, Soroosh; Kuhn, Daniel; Esfahani, Peyman Mohajerin;

Bridging bayesian and minimax mean square error estimation via Wasserstein distributionally robust optimization. (English summary)

Math. Oper. Res. 48 (2023), no. 1, 1–37.

90C47 (62C20 91A05 94A12)

Cited by 28 Related articles All 8 versions

2023 see 2022
Learning to Generate W...
by Lacombe, Julien; Digne, Julie; Courty, Nicolas ; More...
Journal of mathematical imaging and vision, 2023, Volume 65, Issue 2
Optimal transport is a notoriously difficult problem to solve numerically, with current approaches often remaining intractable for very large-scale...
Article PDFPDF
Journal Article  Full Text Online
Open Access
Learning to Generate Wasserstein Barycenters


2023 see 2021 2022
Least Wasserstein distance between disjoint shapes with ...
by Novack, Michael; Topaloglu, Ihsan; Venkatraman, Raghavendra
Journal of functional analysis, 01/2023, Volume 284, Issue 1
We prove the existence of global minimizers to the double minimization problem [Display omitted] where P(E) denotes the perimeter of the set E, Wp is the...
Article PDFPDF
Journal Article  Full Text Online

<–—2023———2023———540—

A two-step approach to Wasserstein...
by Maghami, Amin; Ursavas, Evrim; Cherukuri, Ashish
IEEE transactions on power systems, 2023
Journal Article  Full Text Online



On the exotic isometry flow of the quadratic Wasserstein...
by Gehér, György Pál; Titkos, Tamás; Virosztek, Dániel
Linear algebra and its applications
Kloeckner discovered that the quadratic Wasserstein space over the real line (denoted by W2(R)) is quite peculiar, as its isometry group contains an exotic...
Journal ArticleCitation Online


 
2023 see 2022
Wasserstein t-SNE | SpringerLink

springer.com

https://link.springer.com › chapter


by F Bachmann · 2023 — Here we develop an approach for exploratory analysis of hierarchical datasets using the Wasserstein distance metric that takes into account ...

Wasserstein t-SNE


AA-WGAN: Attention augmented Wasserstein generative adversarial network with application to fundus retinal vessel segmentation.

Liu, Meilin; Wang, Zidong; (...); Zeng, Nianyin

2023-mar-30 | 

Computers in biology and medicine

 158 , pp.106874

In this paper, a novel attention augmented Wasserstein generative adversarial network (AA-WGAN) is proposed for fundus retinal vessel segmentation, where a U-shaped network with attention augmented convolution and squeeze-excitation module is designed to serve as the generator. In particular, the complex vascular structures make some tiny vessels hard to segment, while the proposed AA-WGAN can

All 3 versions

51References. Related records
Cited by 25
 Related articles All 4 versions


Data

MatthewJMorris/landscape-wasserstein: submission

Morris, Matthew J

2023 | 

Zenodo

 | Software

Code accompanying 'Towards inverse modeling of landscapes using the Wasserstein distance' Copyright: Open Access

View datamore_horiz
Related articles
 All 6 versions


2023


Simple approximative algorithms for free-support Wasserstein barycenters

von Lindheim, J

May 2023 | Mar 2023 (Early Access) | 

COMPUTATIONAL OPTIMIZATION AND APPLICATIONS

 85 (1) , pp.213-246

Computing Wasserstein barycenters of discrete measures has recently attracted considerable attention due to its wide variety of applications in data science. In general, this problem is NP-hard, calling for practical approximative algorithms. In this paper, we analyze a well-known simple framework for approximating Wasserstein -p barycenters, where we mainly consider the most common case p = 2

Show more

59 References. Related records

Cited by 1 Related articles All 2 versions

2023 see 2022

Adversarial classification via distributional robustness with Wasserstein ambiguity

Nam, HN and Wright, SJ

Apr 2023 | Apr 2022 (Early Access) | 

MATHEMATICAL PROGRAMMING

 198 (2) , pp.1411-1447

We study a model for adversarial classification based on distributionally robust chance constraints. We show that under Wasserstein ambiguity, the model aims to minimize the conditional value-at-risk of the distance to misclassification, and we explore links to adversarial classification models proposed earlier and to maximum-margin classifiers. We also provide a reformulation of the distributi

Show more

Citation

57 References. Related records

Cited by 15 Related articles All 8 versions
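For reference, the generic problem underlying this and the other Wasserstein distributionally robust optimization entries in this list is (each paper adds its own structure, constraints, or reformulations):

    \min_{\theta} \ \sup_{Q:\, W_p(Q, \widehat{P}_n) \le \varepsilon} \ \mathbb{E}_{\xi \sim Q}\big[\ell(\theta; \xi)\big]

where \widehat{P}_n is the empirical distribution of the data, \varepsilon is the radius of the Wasserstein ball, and \ell is the loss.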

2023 Data

Scalable model-free feature screening via sliced-Wasserstein dependency

Li, Tao; Yu, Jun and Meng, Cheng

Figshare

 | Data set

We consider the model-free feature screening problem that aims to discard non-informative features before downstream analysis. Most of the existing feature screening approaches have at least quadratic computational cost with respect to the sample size n, thus may suffer from a huge computational burden when n is large. To alleviate the computational burden, we propose a scalable model-free sure

Cited by 1 All 2 versions

2023 see 2022

Wasserstein generative adversarial uncertainty quantification in physics-informed neural networks

Gao, YH and Ng, MK

Aug 15 2022 | May 2022 (Early Access) | 

JOURNAL OF COMPUTATIONAL PHYSICS

 463

In this paper, we study a physics-informed algorithm for Wasserstein Generative Adversarial Networks (WGANs) for uncertainty quantification in solutions of partial differential equations. By using groupsort activation functions in adversarial network discriminators, network generators are utilized to learn the uncertainty in solutions of partial differential equations observed from the initial/

Show more

 Citation 66 References  Related records 

 Cited by 8 Related articles All 5 versions


Fast and Accurate Deep Leakage from Gradients Based on Wasserstein Distance

He, X; Peng, CG and Tan, WJ

Mar 21 2023 | 

INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS

 2023

Shared gradients are widely used to protect the private information of training data in distributed machine learning systems. However, Deep Leakage from Gradients (DLG) research has found that private training data can be recovered from shared gradients. The DLG method still has some issues such as the "Exploding Gradient," low attack success rate, and low fidelity of recovered data. In this st

Show more

Free Full Text from Publishermore_horiz

57 References. Related records

<–—2023———2023———550—



LPOT: Locality-Preserving Gromov-Wasserstein Discrepancy for Nonrigid Point Set Registration

Wang, G

Dec 2022 (Early Access) | 

IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS

The main problems in point registration involve recovering correspondences and estimating transformations, especially in a fully unsupervised way without any feature descriptors. In this work, we propose a robust point matching method using discrete optimal transport (OT), which is a natural and useful approach for assignment tasks, to recover the underlying correspondences and improve the nonr

Show more

Full Text at Publisher. 77 References. Related records


Computing the Gromov-Wasserstein Distance between Two Surface Meshes Using Optimal Transport

Koehl, P; Delarue, M and Orland, H

Mar 2023 | 

ALGORITHMS

 16 (3)

The Gromov-Wasserstein (GW) formalism can be seen as a generalization of the optimal transport (OT) formalism for comparing two distributions associated with different metric spaces. It is a quadratic optimization problem and solving it usually has computational costs that can rise sharply if the problem size exceeds a few hundred points. Recently fast techniques based on entropy regularization

Show more

55 References. Related records

All 5 versions 
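For readers comparing the many Gromov-Wasserstein entries above, the quantity being computed is, in its squared 2-GW form,

    \mathrm{GW}_2^2(\mu,\nu) = \min_{\pi \in \Pi(\mu,\nu)} \iint \big| d_X(x,x') - d_Y(y,y') \big|^2 \, d\pi(x,y)\, d\pi(x',y')

a quadratic optimization over couplings \pi, which is why entropic regularization and other approximations (as in the entry above) are needed at scale.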

Wasserstein distance-based expansion planning for integrated energy system considering hydrogen fuel cell vehicles

Wei, X; Chan, KW; (...); Liu, JW

Jun 1 2023 | Mar 2023 (Early Access) | 

ENERGY v. 272

Due to the increasing pressure from environmental concerns and the energy crisis, transportation electrification constitutes one of the key initiatives for global decarbonization. The zero on-road global greenhouse gas emis-sions feature of electric vehicles (EVs) and hydrogen fuel cell vehicles (FCVs) are encouraged to facilitate the electrification of the transportation sector to reduce carbo

Show more 

37 References. Related records

Cite Cited by 16 Related articles All 4 versions

2023 see 2021 arXiv

Continual Learning of Generative Models With Limited Data: From Wasserstein-1 Barycenter to Adaptive Coalescence

Dedeoglu, M; Lin, S; (...); Zhang, JS

Mar 2023 (Early Access) | 

IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS

Learning generative models is challenging for a network edge node with limited data and computing power. Since tasks in similar environments share a model similarity, it is plausible to leverage pretrained generative models from other edge nodes. Appealing to optimal transport theory tailored toward Wasserstein-1 generative adversarial networks (WGANs), this study aims to develop a framework th

Show more 

59 References. Related records

Cited by 2 Related articles All 6 versions

Sliced Wasserstein cycle consistency generative adversarial networks for fault data augmentation of an industrial robot

Pu, ZQ; Cabrera, D; (...); de Oliveira, JV

Jul 15 2023 | Mar 2023 (Early Access) | 

EXPERT SYSTEMS WITH APPLICATIONS

 v. 222. 2023-04-10

We investigate the role of the loss function in cycle consistency generative adversarial networks (CycleGANs). Namely, the sliced Wasserstein distance is proposed for this type of generative model. Both the unconditional and the conditional CycleGANs with and without squeeze-and-excitation mechanisms are considered. Two data sets are used in the evaluation of the models, i.e., the well-known MN

Show more

View full textmore_horiz

37 References. Related records

Cited by 2 Related articles All 5 versions

2023


Dual Interactive Wasserstein Generative Adversarial Network optimized with arithmetic optimization algorithm-based job scheduling in cloud-based IoT

Sravanthi, G and Moparthi, NR

Apr 2023 (Early Access) | 

CLUSTER COMPUTING-THE JOURNAL OF NETWORKS SOFTWARE TOOLS AND APPLICATIONS

Job scheduling plays a prominent part in cloud computing, and the production schedule of jobs can increase the cloud system's effectiveness. When serving millions of users at once, cloud computing must provide all user requests with excellent performance and ensure Quality of Service (QoS). A suitable task scheduling algorithm is needed to appropriately and effectively fulfil these requests. Se

Show more

30 References. Related records

Cited by 1 Related articles


2023 see 2022. Working Paper
Dimensionality Reduction and Wasserstein Stability for Kernel Regression
Eckstein, Stephan; Iske, Armin; Trabs, Mathias.
 arXiv.org; Ithaca, Apr 17, 2023.

Working Paper
Wasserstein Distributionally Robust Optimization with Expected Value Constraints
Fonseca, Diego; Junca, Mauricio.
 arXiv.org; Ithaca, Apr 15, 2023.
 
Working Paper
Nonlinear Wasserstein Distributionally Robust Optimal Control
Zhong, Zhengang; Jia-Jie, Zhu.
 arXiv.org; Ithaca, Apr 14, 2023.
 
Working Paper
A Distributionally Robust Approach to Regret Optimal Control using the Wasserstein Distance
Yan, Shuhao; Feras Al Taha; Bitar, Eilyan.
 arXiv.org; Ithaca, Apr 13, 2023.

All 3 versions 

<–—2023———2023———560—



2023 see 2021. Working Paper
Time-Series Imputation with Wasserstein Interpolation for Optimal Look-Ahead-Bias and Variance Tradeoff
Blanchet, Jose; Hernandez, Fernando; Nguyen, Viet Anh; Pelger, Markus; Zhang, Xuhui.
 arXiv.org; Ithaca, Apr 11,
 
2023 patent news. Wire Feed
Quanzhou Institute of Equipment Mfg Submits Chinese Patent Application for Wasserstein Distance-Based Battery SOH (State of Health) Estimation Method and Device
Global IP News. Electrical Patent News; New Delhi [New Delhi]. 10 Apr 2023.  
 
Working Paper
Forward-backward Gaussian variational inference via JKO in the Bures-Wasserstein Space
Diao Michael; Balasubramanian, Krishnakumar; Chewi, Sinho; Salim, Adil.
 arXiv.org; Ithaca, Apr 10, 2023.
All 2 versions
 


Working Paper
Stochastic Wasserstein Gradient Flows using Streaming Data with an Application in Predictive Maintenance
Lanzetti, Nicolas; Balta, Efe C; Liao-McPherson, Dominic; Dörfler, Florian.
 arXiv.org; Ithaca, Apr 6, 2023.
 

2023 patent news Wire Feed
Xiao Fuyuan Applies for Patent on Application of Evidence Wasserstein Distance Algorithm in Aspect of Component Identification
Global IP News. Software Patent News; New Delhi [New Delhi]. 05 Apr 2023.  
 

2023


Working Paper
Wasserstein Principal Component Analysis for Circular Measures
Beraha, Mario; Pegoraro, Matteo.
 arXiv.org; Ithaca, Apr 5, 2023.
 All 3 versions


2023 see 2022 Working Paper

Coresets for Wasserstein Distributionally Robust Optimization Problems
 Huang, Ruomin; Hua


 Working Paper
Distributionally robust mixed-integer programming with Wasserstein metric: on the value of uncertain data
Ketkov, Sergey S.
 arXiv.org; Ithaca, Apr 3, 2023.

Working Paper
Variational Wasserstein Barycenters for Geometric Clustering
Liang Mi.
 arXiv.org; Ithaca, Mar 30, 2023.
 
Working Paper
Rediscover Climate Change during Global Warming Slowdown via Wasserstein Stability Analysis
Xie, Zhiang; Chen, Dongwei; Li, Puxi.
 arXiv.org; Ithaca, Mar 29, 2023.
 

<–—2023———2023———570—



Working Paper
Continuum Swarm Tracking Control: A Geometric Perspective in Wasserstein Space
Emerick, Max; Bamieh, Bassam.
 arXiv.org; Ithaca, Mar 27, 2023.
 All 2 versions

Continuum Swarm Tracking Control: A Geometric Perspective in Wasserstein 

Chapter, 2023
Publication:2023 62nd IEEE Conference on Decision and Control (CDC), 20231213, 1367
Publisher: 2023



Working Paper
Adjusted Wasserstein Distributionally Robust Estimator in Statistical Learning
Xie, Yiling; Huo, Xiaoming.
 arXiv.org; Ithaca, Mar 27, 2023.
All 2 versions
 

 
Working Paper
Isometries and isometric embeddings of Wasserstein spaces over the Heisenberg group
Balogh, Zoltán M; Titkos, Tamás; Virosztek, Dániel.
 arXiv.org; Ithaca, Mar 27, 2023.
 
Working Paper
On smooth approximations in the Wasserstein space
Cosso, Andrea; Martini, Mattia.
 arXiv.org; Ithaca, Mar 27, 2023.
 
Working Paper
Parameter estimation for many-particle models from aggregate observations: A Wasserstein distance based sequential Monte Carlo sampler
Chen, Cheng; Wen, Linjie; Li, Jinglai.
 arXiv.org; Ithaca, Mar 27, 2023.
All 2 versions

 

2023


Working Paper
Improving Neural Topic Models with Wasserstein Knowledge Distillation
Adhya, Suman; Sanyal, Debarshi Kumar.
 arXiv.org; Ithaca, Mar 27, 2023.
All 4 versions


2023 see 2022. Working Paper
Stability of Entropic Wasserstein Barycenters and application to random geometric graphs
Theveneau, Marc; Keriven, Nicolas.
 arXiv.org; Ithaca, Mar 27, 2023.

Abstract/Details. Get full text

Working Paper
Gromov-Wasserstein Distances: Entropic Regularization, Duality, and Sample Complexity
Zhang, Zhengxin; Goldfeld, Ziv; Mroueh, Youssef; Sriperumbudur, Bharath K.
 arXiv.org; Ithaca, Mar 24, 2023.

Abstract/Details. Get full text

2023 see 2022. Working Paper
Regularization for Wasserstein Distributionally Robust Optimization


Scholarly Journal
Shortfall-Based Wasserstein Distributionally Robust Optimization

Li, Ruoxuan; Lv, Wenhua; Mao, Tiantian. Mathematics; Basel Vol. 11, Iss. 4,  (2023): 849.

Abstract/Details. Full text. Full text - PDF (607 KB)

<–—2023———2023———580—



Scholarly Journal
Chance-constrained set covering with Wasserstein ambiguity

Shen Haoming; Jiang Ruiwei. Mathematical Programming; Heidelberg Vol. 198, Iss. 1,  (2023): 621-674.

Abstract/Details. Get full text

 
2023 see 2022. Scholarly Journal
Learning to Generate Wasserstein Barycenters

Lacombe, Julien; Digne, Julie; Courty, Nicolas; Bonneel, Nicolas. Journal of Mathematical Imaging and Vision; New York Vol. 65, Iss. 2, (2023): 354-370.

Abstract/Details. Get full text
Cited by 5
Related articles All 5 versions

Zbl 07696249


Scholarly Journal
Fast and Accurate Deep Leakage from Gradients Based on Wasserstein Distance

He, Xing; Peng, Changgen; Tan, Weijie. International Journal of Intelligent Systems; New York Vol. 2023,  (2023).
Abstract/Details. Full text. Full text - PDF (2 MB)

All 4 versions


Conference Paper
Gaussian Wasserstein distance based ship target detection algorithm

Wang, Suying. The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2023).



Conference Paper
Energy Theft Detection Using the Wasserstein Distance on Residuals

Altamimi, Emran; Al-Ali, Abdulaziz; Malluhi, Qutaibah M; Al-Ali, Abdulla K. The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings; Piscataway, (2023).

Abstract/Details


2023


Scholarly Journal
A Robust Fault Classification Method for Streaming Industrial Data Based on Wasserstein Generative Adversarial Network and Semi-Supervised Ladder Network

Zhang, Chuanfang; Peng, Kaixiang; Dong, Jie; Zhang, Xueyi; Yang, Kaixuan. IEEE Transactions on Instrumentation and Measurement

Abstract/Details
Scholarly Journal
Small Sample Reliability Assessment With Online Time-Series Data Based on a Worm Wasserstein Generative Adversarial Network Learning Method

Sun, Bo; Wu, Zeyu; Feng, Qiang; Wang, Zili; Ren, Yi; et al. IEEE Transactions on Industrial Informatics; Piscataway Vol. 19, Iss. 2,  (2023): 1207-1216.

Abstract/Details

Cited by 8 Related articles


Scholarly Journal
AVO Inversion Based on Closed-Loop Multitask Conditional Wasserstein Generative Adversarial Network

Wang, Zixu; Wang, Shoudong; Chen, Zhou; Cheng, Wanli. IEEE Transactions on Geoscience and Remote Sensing; New York Vol. 61,  (2023): 1-13.

Abstract/Details


Scholarly Journal
HackGAN: Harmonious Cross-Network Mapping Using CycleGAN With Wasserstein–Procrustes Learning for Unsupervised Network Alignment

Yang, Linyao; Wang, Xiao; Zhang, Jun; Yang, Jun; Xu, Yancai; et al. IEEE Transactions on Computational Social Systems; Piscataway Vol. 10, Iss. 2,  (2023): 746-759.
 Abstract/Details


Scholarly Journal
Hydrological objective functions and ensemble averaging with the Wasserstein distance

Magyar, Jared C; Sambridge, Malcolm. Hydrology and Earth System Sciences; Katlenburg-Lindau Vol. 27, Iss. 5,  (2023): 991-1010.

Abstract/Details. Full text. Full text - PDF (6 MB)

Cited by 2 All 8 versions 

<–—2023———2023———590—


Working Paper
Projection Robust Wasserstein Distance and Riemannian Optimization

Lin, Tianyi; Fan, Chenyou; Ho, Nhat; Cuturi, Marco; Jordan, Michael I. arXiv.org; Ithaca, Jan 1, 2023.


Articles in Advance | Operations Research - PubsOnLine

INFORMS.org

https://pubsonline.informs.org › toc › opre

Published Online: April 21, 2023 ...

Wasserstein Distributionally Robust Optimization and Variation Regularization.


2023 see 2022 2021

Articles in Advance | INFORMS Journal on Optimization

Informs.org

https://pubsonline.informs.org › toc › ijoo

Published Online:February 21, 2023 ... for Distributionally Robust Chance-Constrained Programs with Left-Hand Side Uncertainty Under Wasserstein Ambiguity.


Zhiwei (Tony) Qin

INFORMS.org

https://meetings.informs.org › Home › Speakers

 He is an INFORMS Franz Edelman Award Finalist in 2023, received the INFORMS ... GEM was developed as a generalized Wasserstein distance between the supply ...

Articles in Advance | Mathematics of Operations Research

Informs.org

https://pubsonline.informs.org › toc › moor



2023 see 2022. [PDF] mlr.press

Fair learning with Wasserstein barycenters for non-decomposable performance measures

S Gaucher, N Schreuder… - … on Artificial Intelligence …, 2023 - proceedings.mlr.press

This work provides several fundamental characterizations of the optimal classification function 

under the demographic parity constraint. In the awareness framework, akin to the classical …

Cited by 2 Related articles All 4 versions


2023

[PDF] arxiv.org

An Asynchronous Decentralized Algorithm for Wasserstein Barycenter Problem

C Zhang, H Qian, J Xie - arXiv preprint arXiv:2304.11653, 2023 - arxiv.org

… the Wasserstein barycenter problem(WBP) in the semi-discrete setting, which estimates the 

barycenter of a set of continuous probability distributions under the Wasserstein distance, ie, …

 

[PDF] mlr.press

Wasserstein Distributional Learning via Majorization-Minimization

C Tang, N Lenssen, Y Wei… - … on Artificial Intelligence …, 2023 - proceedings.mlr.press

… , Wasserstein Distributional Learning (WDL), that trains Semi-parametric Conditional 

Gaussian Mixture Models (SCGMM) for conditional density functions and uses the Wasserstein

 


[PDF] mlr.press

Discrete Langevin Samplers via Wasserstein Gradient Flow

H Sun, H Dai, B Dai, H Zhou… - … Artificial Intelligence …, 2023 - proceedings.mlr.press

… flow that minimizes KL divergence on a Wasserstein manifold. The superior efficiency of such 

… In this work, we show how the Wasserstein gradient flow can be generalized naturally to …

 Cited by 3 Related articles All 2 version


[PDF] arxiv.org

Wasserstein Loss for Semantic Editing in the Latent Space of GANs

P Doubinsky, N Audebert, M Crucianu… - arXiv preprint arXiv …, 2023 - arxiv.org

… We propose an alternative formulation based on the Wasserstein loss that avoids such 

problems, while maintaining performance on-par with classifier-based approaches. We …

All 5 versions


[PDF] upc.edu

[PDF] On the Size of Fully Diverse Sets of Polygons using the Earth Movers Distance or Wasserstein Distance

F Klute, M van Kreveld - dccg.upc.edu

… In this paper we extend these results by considering the Earth Movers Distance and the 

Wasserstein Distance: we show that the maximum size of fully diverse sets is O(1) in both cases. …

Related articles 

 <–—2023———2023———600—

Distributionally robust optimal dispatching method of integrated electricity and heating system based on improved Wasserstein metric

H Li, H Liu, J Ma, D Li, W Zhang - … Journal of Electrical Power & Energy …, 2023 - Elsevier

… system based on an improved Wasserstein metric, for dispatching … power forecasted error 

is the Wasserstein ball centered at the … The out-of-sample analysis with the IEEE 118-bus test …

  

Multidimensional scaling method for complex time series based on the Wasserstein–Fourier distance in complex systems

F Zhang, P Shang, X Mao - Nonlinear Dynamics, 2023 - Springer

… new time series classification method that combines the Wasserstein–Fourier (WF) distance 

… between time series by computing the Wasserstein distance between the normalized power …


Text-to-image Generation Model Based on Diffusion Wasserstein Generative Adversarial Networks

H ZHAO, W LI - 电子与信息学报 (Journal of Electronics & Information Technology), 2023 - jeit.ac.cn

Text-to-image generation is a comprehensive task that combines the fields of Computer Vision 

(CV) and Natural Language Processing (NLP). Research on the methods of text to image …


Energy Theft Detection Using the Wasserstein Distance on Residuals

Emran Altamimi;

Abdulaziz Al-Ali;

Qutaibah M. Malluhi;

Abdulla K. Al-Ali

2023 IEEE Texas Power and Energy Conference (TPEC)

Year: 2023 | Conference Paper

Cited by 1 Related articles All 3 versions


2023


AVO Inversion Based on Closed-Loop Multitask Conditional Wasserstein Generative Adversarial Network

Zixu Wang;

Shoudong Wang;

Chen Zhou;

Wanli Cheng

IEEE Transactions on Geoscience and Remote Sensing

Year: 2023 | Volume: 61 | Journal Article | Publisher: IEEE

Cited by 3 Related articles All 2 versions


Gaussian Wasserstein distance based ship target detection algorithm

Suying Wang

2023 IEEE 2nd International Conference on Electrical Engineering, Big Data and Algorithms (EEBDA)

Year: 2023 | Conference Paper | 


  

EAF-WGAN: Enhanced Alignment Fusion-Wasserstein Generative Adversarial Network for Turbulent Image Restoration

Xiangqing Liu; Gang Li; Zhenyang Zhao; Qi Cao; Zijun Zhang; Shaoan Yan;

Jianbin Xie; Minghua Tang

IEEE Transactions on Circuits and Systems for Video Technology

Year: 2023 | Early Access Article | Publisher: IEEE


2023 see 2022

A two-step approach to Wasserstein distributionally robust chance- and security-constrained dispatch

Amin Maghami;

Evrim Ursavas;

Ashish Cherukuri

IEEE Transactions on Power Systems

Year: 2023 | Early Access Article | Publisher: IEEE

arXiv:2304.13586 [pdf, other] stat.ML cs.CV cs.GR cs.LG
Energy-Based Sliced Wasserstein Distance
Authors: Khai Nguyen, Nhat Ho
Abstract: The sliced Wasserstein (SW) distance has been widely recognized as a statistically effective and computationally efficient metric between two probability measures. A key component of the SW distance is the slicing distribution. There are two existing approaches for choosing this distribution. The first approach is using a fixed prior distribution. The second approach is optimizing for the best dis…  More
Submitted 26 April, 2023; originally announced April 2023.
Comments: 36 pages, 7 figures, 6 tables

All 2 versions 
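
Note (added for context): the baseline that this and the other sliced-Wasserstein entries build on is the Monte Carlo estimator of the sliced Wasserstein distance with a uniform slicing distribution. A minimal sketch in Python, assuming the two inputs are (n, d) and (m, d) arrays of samples; the function name and defaults are illustrative, not from the paper:

import numpy as np
from scipy.stats import wasserstein_distance  # exact 1-D Wasserstein-1 between empirical samples

def sliced_wasserstein(x, y, n_projections=100, seed=0):
    # Average the 1-D Wasserstein distance of the two point clouds over random directions.
    rng = np.random.default_rng(seed)
    d = x.shape[1]
    total = 0.0
    for _ in range(n_projections):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)   # uniform direction on the unit sphere
        total += wasserstein_distance(x @ theta, y @ theta)
    return total / n_projections

The energy-based method above replaces the uniform slicing distribution with one informed by the projected distances; that variant is not sketched here.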

A two-step approach to Wasserstein distributionally robust chance- and security-constrained dispatch
by Maghami, Amin; Ursavas, Evrim; Cherukuri, Ashish
IEEE transactions on power systems, 2023
This paper considers a security constrained dispatch problem involving generation and line contingencies in the presence of the renewable generation. The...
Article PDFPDF
Journal Article  Full Text Online

<–—2023———2023———610—


arXiv:2304.12093 [pdf, other] math.OC
Wasserstein Tube MPC with Exact Uncertainty Propagation
Authors: Liviu Aolaritei, Marta Fochesato, John Lygeros, Florian Dörfler
Abstract: We study model predictive control (MPC) problems for stochastic LTI systems, where the noise distribution is unknown, compactly supported, and only observable through a limited number of i.i.d. noise samples. Building upon recent results in the literature, which show that distributional uncertainty can be efficiently captured within a Wasserstein ambiguity set, and that such ambiguity sets propaga…  More
Submitted 24 April, 2023; originally announced April 2023.


arXiv:2304.12029 [pdf, other] math.PR
Reconstructing discrete measures from projections. Consequences on the empirical Sliced Wasserstein Distance
Authors: Eloi Tanguy, Rémi Flamary, Julie Delon
Abstract: This paper deals with the reconstruction of a discrete measure γ_Z on R^d from the knowledge of its pushforward measures P_i#γ_Z by linear applications P_i : R^d → R^{d_i} (for instance projections onto subspaces). The measure γ_Z being fixed, assuming that the rows of the matrices P_i are independent realizations of laws which do not give mass to h…  More
Submitted 24 April, 2023; originally announced April 2023.


arXiv:2304.11945 [pdf, ps, other] math.OC math.AP
On the Viability and Invariance of Proper Sets under Continuity Inclusions in Wasserstein Spaces
Authors: Benoît Bonnet-Weill, Hélène Frankowska
Abstract: In this article, we derive necessary and sufficient conditions for the existence of solutions to state-constrained continuity inclusions in Wasserstein spaces whose right-hand sides may be discontinuous in time. These latter are based on fine investigations of the infinitesimal behaviour of the underlying reachable sets, through which we show that up to a negligible set of times, every admissible…  More
Submitted 27 April, 2023; v1 submitted 24 April, 2023; originally announced April 2023.
Comments: 43 pages
MSC Class: 28B20; 34G25; 46N20; 49Q22


arXiv:2304.11653 [pdf, ps, other] cs.LG
An Asynchronous Decentralized Algorithm for Wasserstein Barycenter Problem
Authors: Chao Zhang, Hui Qian, Jiahao Xie
Abstract: Wasserstein Barycenter Problem (WBP) has recently received much attention in the field of artificial intelligence. In this paper, we focus on the decentralized setting for WBP and propose an asynchronous decentralized algorithm (A²DWB). A²DWB is induced by a novel stochastic block coordinate descent method to optimize the dual of entropy regularized WBP. To our knowledge, A²DWB is the fir…  More
Submitted 23 April, 2023; originally announced April 2023.

All 2 versions 

arXiv:2304.10508 [pdf, other] cs.CV cs.AI
Wasserstein Loss for Semantic Editing in the Latent Space of GANs
Authors: Perla Doubinsky, Nicolas Audebert, Michel Crucianu, Hervé Le Borgne
Abstract: The latent space of GANs contains rich semantics reflecting the training data. Different methods propose to learn edits in latent space corresponding to semantic attributes, thus allowing to modify generated images. Most supervised methods rely on the guidance of classifiers to produce such edits. However, classifiers can lead to out-of-distribution regions and be fooled by adversarial samples. We…  More
Submitted 22 March, 2023; originally announced April 2023.

 

2023


2023 see 2021. [PDF] arxiv.org

Gaussian approximation for penalized Wasserstein barycenters

N Buzun - Mathematical Methods of Statistics, 2023 - Springer

In this work we consider regularized Wasserstein barycenters (average in Wasserstein

distance) in Fourier basis. We prove that random Fourier parameters of the barycenter converge …

Related articles All 3 versions

MR4581750 
Cited by 1
 Related articles All 3 versions

[PDF] arxiv.org

On Excess Mass Behavior in Gaussian Mixture Models with Orlicz-Wasserstein Distances

A Guha, N Ho, XL Nguyen - arXiv preprint arXiv:2301.11496, 2023 - arxiv.org

… they showed that in the finite Gaussian mixture setting with overfitted … The infinite Gaussian

mixture setting is generally more … Gaussian mixture setting, the rates captured by Wasserstein …

Cited by 1 All 2 versions 


[PDF] arxiv.org

Learning Gaussian Mixtures Using the Wasserstein-Fisher-Rao Gradient Flow

Y Yan, K Wang, P Rigollet - arXiv preprint arXiv:2301.01766, 2023 - arxiv.org

… Gaussian mixture models form a flexible and … a Gaussian mixture model. Our method is

based on gradient descent over the space of probability measures equipped with the Wasserstein

Cited by 1 Related articles All 2 versions 


[PDF] arxiv.org

Forward-backward Gaussian variational inference via JKO in the Bures-Wasserstein Space

M Diao, K Balasubramanian, S Chewi… - arXiv preprint arXiv …, 2023 - arxiv.org

… space is the metric space P2(Rd) endowed with the 2-Wasserstein distance W2 (which

we simply refer to as the Wasserstein distance). We recall that the Wasserstein distance is …

All 2 versions 


[PDF] arxiv.org

Variational Gaussian filtering via Wasserstein gradient flows

A Corenflos, H Abdulsamad - arXiv preprint arXiv:2303.06398, 2023 - arxiv.org

… Abstract—In this article, we present a variational approach to Gaussian and mixture-of-…

representation for two models for which Gaussian approximations typically fail: a multiplicative …

 All 2 versions 

<–—2023———2023———620—



[PDF] arxiv.org

Wasserstein Projection Pursuit of Non-Gaussian Signals

S Mukherjee, SS Mukherjee… - arXiv preprint arXiv …, 2023 - arxiv.org

… which maximise the 2-Wasserstein distance of the empirical distribution of … Gaussian.

Under a generative model, where there is a underlying (unknown) low-dimensional non-Gaussian …

All 2 versions 


Gaussian Wasserstein distance based ship target detection algorithm

S Wang - 2023 IEEE 2nd International Conference on Electrical …, 2023 - ieeexplore.ieee.org

… Gaussian distribution to model arbitrary target-oriented detection bounding boxes and

further improve the small target detection accuracy by calculating the Gaussian Wasserstein …

 

Small ship detection based on YOLOX and modified Gaussian Wasserstein distance in SAR images

W Yu, J Li, Y Wang, Z Wang… - … Conference on Geographic …, 2023 - spiedigitallibrary.org

… , this paper proposes a modified Gaussian Wasserstein distance. Based on the one-stage

anchorfree detector YOLOX [9], the proposed Modified Gaussian Wasserstein Distance can be …

 All 2 versions
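
Note (added for context, not quoted from either paper): these detection papers model oriented bounding boxes as 2-D Gaussians, which makes the 2-Wasserstein distance available in closed form:

W_2^2( N(m_1, Σ_1), N(m_2, Σ_2) ) = ||m_1 − m_2||^2 + Tr( Σ_1 + Σ_2 − 2 (Σ_2^{1/2} Σ_1 Σ_2^{1/2})^{1/2} ),

so the box-to-box distance reduces to means and covariances built from box centers, sizes and angles; the symbols m_i, Σ_i are generic notation, not taken from the papers.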


[PDF] arxiv.org

Gaussian approximation for penalized Wasserstein barycenters

N Buzun - Mathematical Methods of Statistics, 2023 - Springer

… regularized Wasserstein barycenters (average in Wasserstein distance) in Fourier basis.

We prove that random Fourier parameters of the barycenter converge to some Gaussian …

Related articles All 2 versions


On Excess Mass Behavior in Gaussian Mixture Models with Orlicz-Wasserstein Distances

A Guha, N Ho, XL Nguyen - arXiv preprint arXiv:2301.11496, 2023 - arxiv.org

… We believe the usage of Orlicz-Wasserstein metrics for parameter estimation in Dirichlet …

models under Wasserstein distances. Section 3.1 introduces Orlicz-Wasserstein distances and …

Cited by 1 All 2 versions


2023

Wasserstein perturbations of Markovian transition semigroups

S Fuhrmann, M Kupper, M Nendel - … de l'Institut Henri Poincare (B) …, 2023 - projecteuclid.org

… , we consider a logarithmic version of the Wasserstein distance), the uncertainty in the

generator does not depend on the order of the Wasserstein distance. Our results in Section 3 …

 Cited by 5 Related articles All 9 versions 


[PDF] arxiv.org

Markovian Sliced Wasserstein Distances: Beyond Independent Projections

K Nguyen, T Ren, N Ho - arXiv preprint arXiv:2301.03749, 2023 - arxiv.org

… background for Wasserstein distance, sliced Wasserstein distance, and max sliced Wasserstein

distance in Section 2. In Section 3, we propose Markovian sliced Wasserstein distances …

 Cited by 1 Related articles All 2 versions 

[PDF] researchgate.net

[PDF] Markovian Sliced Wasserstein Distances: Beyond Independent Projections

K Nguyen, T Ren, N Ho - 2023 - researchgate.net

… background for Wasserstein distance, sliced Wasserstein distance, and max sliced Wasserstein

distance in Section 2. In Section 3, we propose Markovian sliced Wasserstein distances …

Related articles 


 2023 see 2022

Minh, Hà Quang

Convergence and finite sample approximations of entropic regularized Wasserstein distances in Gaussian and RKHS settings. (English) Zbl 07680420

Anal. Appl., Singap. 21, No. 3, 719-775 (2023).

MSC:  28C20 49Q22 46E22

PDF BibTeX XML Cite

Full Text: DOI 


Wasserstein Tube MPC...
by Aolaritei, Liviu; Fochesato, Marta; Lygeros, John ; More...
04/2023
We study model predictive control (MPC) problems for stochastic LTI systems, where the noise distribution is unknown, compactly supported, and only observable...
Journal Article  Full Text Online
Open Access


2023  patent
US Patent Issued to CGG SERVICES on April 25 for 

"Methods and devices performing adaptive quadratic Wasserstein...

US Fed News Service, Including US State News, 04/2023

<–—2023———2023———630—


HackGAN: Harmonious Cross-Network Mapping Using CycleGAN With Wasserstein–Procrustes Learning for Unsupervised Network Alignment Linyao Yang; Xiao Wang; Jun Zhang; Jun Yang; Yancai Xu; Jiachen Hou; Kejun Xin;

Fei-Yue Wang

IEEE Transactions on Computational Social Systems

Year: 2023 | Volume: 10, Issue: 2 | Journal Article | Publisher: IEEE


Wasserstein Adversarial Learning for Identification of Power Quality Disturbances with Incomplete Data

Guangxu Feng;

Keng-Weng Lao

IEEE Transactions on Industrial Informatics

Year: 2023 | Early Access Article | Publisher: IEEE

Cited by 4 Related articles


2023 see 2022

Image Reconstruction for Electrical Impedance Tomography (EIT) With Improved Wasserstein Generative Adversarial Network (WGAN)

Hanyu Zhang; Qi Wang; Ronghua Zhang; Xiuyan Li; Xiaojie Duan; Yukuan Sun; Jianming Wang; Jiabin Jia

IEEE Sensors Journal

Year: 2023 | Volume: 23, Issue: 5 | Journal Article | Publisher: IEEE

Cited by: Papers (1)


A Robust Fault Classification Method for Streaming Industrial Data Based on Wasserstein Generative Adversarial Network and Semi-Supervised Ladder Network Chuanfang Zhang; Kaixiang Peng; Jie Dong; Xueyi Zhang;

Kaixuan Yang

IEEE Transactions on Instrumentation and Measurement

Year: 2023 | Volume: 72 | Journal Article | Publisher: IEEE

Cited by 4 Related articles All 2 versions


 

Attacking Mouse Dynamics Authentication using Novel Wasserstein Conditional DCGAN

Arunava Roy;

KokSheik Wong;

Raphaël C. -W Phan

IEEE Transactions on Information Forensics and Security

Year: 2023 | Early Access Article | Publisher: IEEE


2023



2023 patent 

Military intelligence analysis and deduction method based on map of affairs for use in data processing, involves automatically constructing event map, with building Wasserstein generative confrontation network model

CN115878811-A

Inventor(s) LIU B; YANG Y; (...); WANG Y

Assignee(s) BEIJING COMPUTER TECHNOLOGY & APPL RES

Derwent Primary Accession Number 

2023-38809J


ECG Classification Based on Wasserstein Scalar Curvature

Sun, FP; Ni, Y; (...); Sun, HF

Oct 2022 | 

ENTROPY

 24 (10)

Electrocardiograms (ECG) analysis is one of the most important ways to diagnose heart disease. This paper proposes an efficient ECG classification method based on Wasserstein scalar curvature to comprehend the connection between heart disease and the mathematical characteristics of ECG. The newly proposed method converts an ECG into a point cloud on the family of Gaussian distribution, where th

Show more

Free Full Text from Publisher

Cited by 5 Related articles All 5 versions



2023 patent

Method for generating energy scene based on WGAN-GP and GMM, involves obtaining original data set, and using Gaussian mixture model and K-means clustering technique to reduce scene of basic scene set to obtain classical scene set

CN115859809-A

Inventor(s) DU R; HE F; (...); XIN X

Assignee(s) INNER MONGOLIA ELECTRIC POWER SCI RES

Derwent Primary Accession Number 

2023-39049L

   

Distributionally robust day-ahead combined heat and power plants scheduling with Wasserstein Metric

Skalyga, M; Amelin, M; (...); Soder, L

Apr 15 2023 | Jan 2023 (Early Access) | 

ENERGY

Cited by 5 Related articles All 5 versions


Combined heat and power (CHP) plants are main generation units in district heating systems that produce both heat and electric power simultaneously. Moreover, CHP plants can participate in electricity markets, selling and buying the extra power when profitable. However, operational decisions have to be made with unknown electricity prices. The distribution of unknown electricity prices is also

Show more

Full Text at Publisher

1 Citation. 59 References. Related records
Cited by 5
 Related articles All 5 versions



Multidimensional scaling method for complex time series based on the Wasserstein-Fourier distance in complex systems

Zhang, F; Shang, PJ and Mao, XG

Apr 2023 (Early Access) | 

NONLINEAR DYNAMICS

In this paper, we propose a multidimensional scaling (MDS) method based on the Wasserstein-Fourier (WF) distance to analyze and classify complex time series from a frequency domain perspective in complex systems. Three properties with rigorous derivation are stated to reveal the basics structure of MDS method based on the WF distance and validate it as an excellent metric for time series classi

Show more

View full text

48 References. Related records

<–—2023———2023———640—+

Data

Normalized Wasserstein metric between intensity distributions of real and naive (orange bars, Naive) and real and optimized data (blue bars, Optimized).

Bruch, Roman; Keller, Florian; (...); Reischl, Markus

2023 | 

Figshare

 | Data set

The measure is calculated for three image regions: background outside and inside the spheroid, and foreground. The images are divided into an upper, middle and lower part. A value of one indicates a perfect result, while a value of zero indicates a bad result. Copyright: CC BY 4.0

View data


Scholarly Journal

Wasserstein bounds in CLT of approximative MCE and MLE of the drift parameter for Ornstein-Uhlenbeck processes observed at high frequency

Es-Sebaiy, Khalifa; Alazemi, Fares; Al-Foraih, Mishari. Journal of Inequalities and Applications; Heidelberg Vol. 2023, Iss. 1,  (Dec 2023): 62.

Scholarly Journal


AA-WGAN: Attention augmented Wasserstein generative adversarial network with application to fundus retinal vessel segmentation

Computers in Biology and Medicine; Oxford Vol. 158,  (May 2023).   

2023 patent news  Wire Feed

CGG Services SAS Gets Patent for Methods and Devices Performing Adaptive Quadratic Wasserstein Full-Waveform Inversion

Global IP News. Information Technology Patent News; New Delhi [New Delhi]. 28 Apr 2023.  

Details. Full text

2023 patent news

NEWSPAPER ARTICLE

CGG Services SAS Gets Patent for Methods and Devices Performing Adaptive Quadratic Wasserstein Full-Waveform Inversion

Pedia Content Solutions Pvt. Ltd

Global IP News: Information Technology Patent News

Working Paper

On the Viability and Invariance of Proper Sets under Continuity Inclusions in Wasserstein Spaces

Bonnet-Weill, Benoît; Frankowska, Hélène. arXiv.org; Ithaca, Apr 27, 2023.

Abstract/Details. Get full text

Cited by 4
 Related articles All 9 versions 


2023

 

2023 see 2021. Working Paper

Lifting couplings in Wasserstein spaces

Perrone, Paolo. arXiv.org; Ithaca, Apr 27, 2023.

Abstract/Details


Working Paper

Energy-Based Sliced Wasserstein Distance

Nguyen, Khai; Ho, Nhat. arXiv.org; Ithaca, Apr 26, 2023.

Abstract/Details. Get full text
All 2 versions


Wire Feed

Methods and Devices Performing Adaptive Quadratic Wasserstein Full-Waveform Inversion

Targeted News Service; Washington, D.C. [Washington, D.C]. 25 Apr 2023.  

DetailsFull text

NEWSLETTER ARTICLE

US Patent Issued to CGG SERVICES on April 25 for "Methods and devices performing adaptive quadratic Wasserstein full-waveform inversion" (Texas Inventors)

Washington, D.C: HT Digital Streams Limited

US Fed News Service, Including US State News, 2023


Working Paper

Wasserstein Tube MPC with Exact Uncertainty Propagation

Aolaritei, Liviu; Fochesato, Marta; Lygeros, John; Dörfler, Florian. arXiv.org; Ithaca, Apr 24, 2023.

Abstract


Working Paper

Reconstructing discrete measures from projections. Consequences on the empirical Sliced Wasserstein Distance

Tanguy, Eloi; Flamary, Rémi; Delon, Julie. arXiv.org; Ithaca, Apr 24, 2023.

Abstract/Details. Get full text

<–—2023———2023———650—


2023 thesis

     Applications of the Bures-Wasserstein Distance in Linear ...

Figshare

https://figshare.com › articles › thesis › Applications_o...


2023

   University of California, Berkeley

https://math.berkeley.edu › ~bernd › tuesday

Wasserstein Distance [PDF]

by B Sturmfels · 2023 — Wasserstein Distance. Bernd Sturmfels. January 31, 2023. A basic problem in metric algebraic geometry is finding a point in a variety X in Rn that.

12 pages

[PDF] berkeley.edu

[PDF] Wasserstein Distance

B Sturmfels - 2023 - math.berkeley.edu

… The variety X will be an independence model in a probability simplex, described algebraically

by matrices or tensors of low rank, and we measure distances using a Wasserstein metric. …

Related articles 

[PDF] berkeley.edu

[PDF] Wasserstein Distance

B Sturmfels - 2023 - math.berkeley.edu

… To compute Wasserstein distances, we need to describe the Lipschitz polytope Pd as

explicitly as possible. All three metrics above are graph metrics. This means that there exists an …

 Related articles 
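
Note (added for context): the Lipschitz polytope P_d mentioned in this snippet is the feasible set of the dual linear program for the Wasserstein-1 distance on a finite metric space (X, d):

W_1(μ, ν) = max { Σ_{x in X} f(x) (μ(x) − ν(x)) : |f(x) − f(y)| ≤ d(x, y) for all x, y },

so describing P_d explicitly is what makes these distances computable by linear programming; the formula is the standard Kantorovich-Rubinstein duality, stated here for orientation rather than quoted from the notes.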

 Fast and Accurate Deep Leakage from Gradients Based on Wasserstein Distance
by X He — In this study, a Wasserstein DLG method, named WDLG, is proposed; ... Volume 2023 | Article ID 5510329 | https://doi.org/10.11
All 4 versions


2023 Master's thesis see 2021

Efficient and Robust Classification for Positive Definite Matrices with Wasserstein Metric

The results obtained in this paper include that Bures-Wasserstein simple ... dc.type, Master's Thesis, en_US ... dc.embargo.enddate, 2023-04-16, en_US.


2023 thesis
 Optimal Control in Wasserstein Spaces

ResearchGate

https://www.researchgate.net › publication › 337311998...


Jan 4, 2023 — In this thesis, we extend for the first time several of these concepts to the framework of control theory.The first result presented in this ...


2023


[PDF] arxiv.org

1d approximation of measures in Wasserstein spaces

A Chambolle, V Duval, JM Machado - arXiv preprint arXiv:2304.14781, 2023 - arxiv.org

… The problem consists in minimizing a Wasserstein distance as a data term with a regularization 

given by the length of the support. As it is challenging to prove existence of solutions to …

All 3 versions


 [PDF] arxiv.org

On the 1-Wasserstein Distance between Location-Scale Distributions and the Effect of Differential Privacy

S Chhachhi, F Teng - arXiv preprint arXiv:2304.14869, 2023 - arxiv.org

… for the 1Wasserstein distance between independent location-… Specifically, we find that the 

1-Wasserstein distance between … A new linear upper bound on the 1-Wasserstein distance is …
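
Note (added for context, not quoted from the paper): the underlying one-dimensional identity is

W_1(μ, ν) = ∫_R | F_μ(t) − F_ν(t) | dt,

where F_μ and F_ν are the CDFs; for two members of a location-scale family the comonotone coupling makes the difference of the coupled variables again a location-scale variable, which is what yields closed forms and bounds of the kind discussed above.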


 

[PDF] arxiv.org

Approximation of Splines in Wasserstein Spaces

J Justiniano, M Rumpf, M Erbar - arXiv preprint arXiv:2302.10682, 2023 - arxiv.org

… distance and the approximate average will be the Wasserstein barycenter. … the Wasserstein 

distance between probability measures, the Riemannian perspective on Wasserstein spaces …

All 2 versions


[PDF] arxiv.org

Wasserstein Dictionaries of Persistence Diagrams

K Sisouk, J Delon, J Tierny - arXiv preprint arXiv:2304.14852, 2023 - arxiv.org

… , in the form of weighted Wasserstein barycenters [99], [101] of … of our approach, with 

Wasserstein dictionary computations in … framework based on a Wasserstein dictionary defined with …



[PDF] arxiv.org

Control Variate Sliced Wasserstein Estimators

K Nguyen, N Ho - arXiv preprint arXiv:2305.00402, 2023 - arxiv.org

… the expectation of the Wasserstein distance between two one-… the closed-form of the 

Wasserstein-2 distance between two … an upper bound of the Wasserstein-2 distance between two …

<–—2023———2023———660—



 2023 see 2022. [PDF] mlr.press

Sliced Wasserstein variational inference

M Yi, S Liu - Asian Conference on Machine Learning, 2023 - proceedings.mlr.press

… variational inference method by minimizing sliced Wasserstein distance–a valid metric 

arising from optimal transport. This sliced Wasserstein distance can be approximated simply by …

 Cited by 11


[PDF] arxiv.org

Approximation and Structured Prediction with Sparse Wasserstein Barycenters

MH Do, J Feydy, O Mula - arXiv preprint arXiv:2302.05356, 2023 - arxiv.org

… on closed forms for Wasserstein distances, and barycenters … recall the necessary background 

on Wasserstein spaces and … numerous computations of Wasserstein distances, barycenters…

 All 2 versions



[PDF] mlr.press

Bures-Wasserstein Barycenters and Low-Rank Matrix Recovery

T Maunu, T Le Gouic, P Rigollet - … Conference on Artificial …, 2023 - proceedings.mlr.press

… can solve a specific Wasserstein barycenter problem rather than the original matrix recovery 

problem (2.4). In other words, any methods that solve this Wasserstein barycenter problem …

Related articles All 3 versions

W2 barycenters for radially related distributions

N Ghaffari, SG Walker - Statistics & Probability Letters, 2023 - Elsevier

… We fit Student-t distributions to each sample dataset of daily log … However, employing the

quadratic Wasserstein distance, we can … From this we obtain the dfs of the closest Student-t …

Related articles

On the exotic isometry flow of the quadratic Wasserstein space over the real line

GP Gehér, T Titkos, D Virosztek - Linear Algebra and its Applications, 2023 - Elsevier

… We look at the Wasserstein space W_2(R) as a convex and closed subset of L^2((0,1)) whose linear span is dense in L^2((0,1)), via the identification μ ↦ F_μ^{-1}. Therefore by […

Cited by 1

2023


[PDF] arxiv.org

Wasserstein Dictionaries of Persistence Diagrams

K Sisouk, J Delon, J Tierny - arXiv preprint arXiv:2304.14852, 2023 - arxiv.org

… our approach, with Wasserstein dictionary computations in the … framework based on a

Wasserstein dictionary defined with a … 2D layouts obtained with our approach (W2-Dict) and these …


[PDF] mlr.press

Wasserstein Distributional Learning via Majorization-Minimization

C Tang, N Lenssen, Y Wei… - … Conference on Artificial …, 2023 - proceedings.mlr.press

… optimization algorithm, Wasserstein Distributional Learning (… functions and uses the

Wasserstein distance W2 as a proper … on the one-dimensional 2-Wasserstein distance W2(f1,f2) …


Methods and devices performing adaptive quadratic Wasserstein full-waveform inversion

W Diancheng, P Wang - US Patent 11,635,540, 2023 - Google Patents

… R43-R62) articulates an FWI problem formulation based on the 2-Wasserstein (W 2 ) metric.

Numerical simulations for both these formulations have shown that optimal transport can …

Cited by 1 Related articles All 2 versions 

Methods and devices performing adaptive quadratic Wasserstein full-waveform inversion
by CGG SERVICES SAS
04/2023
2Methods and devices for seismic exploration of an underground structure apply W-based full-wave inversion to transformed synthetic and seismic data. Data...
Patent  Available Online

The back-and-forth method for the quadratic Wasserstein distance-based full-waveform inversion

H Zhang, W He, J Ma - Geophysics, 2023 - library.seg.org

… Wasserstein distance (W2): a solution of the OT problem with … make the quadratic Wasserstein

distance convex with … The results obtained using the L2, 1D W2 and

Related articles All 2 versions

Related articles All 4 versions

[PDF] arxiv.org

Control Variate Sliced Wasserstein Estimators

K NguyenN Ho - arXiv preprint arXiv:2305.00402, 2023 - arxiv.org

… the expectation of the Wasserstein distance between two one-… the closed-form of the

Wasserstein-2 distance between two … an upper bound of the Wasserstein-2 distance between two …

Cited by 4 Related articles All 2 versions 

<–—2023———2023———670— 


Reflecting image-dependent SDEs in Wasserstein space and large deviation principle

X Yang - Stochastics, 2023 - Taylor & Francis

… R^d, and equip it with the second order Wasserstein distance. In essence, this is a reflecting

problem for the image stochastic process in Wasserstein space and we characterize the …

All 2 versions



[HTML] sciencedirect.com

[HTML] Medical multivariate time series imputation and forecasting based on a recurrent conditional Wasserstein GAN and attention

S Festag, C Spreckelsen - Journal of Biomedical Informatics, 2023 - Elsevier

… To circumvent the problem during optimisation, the Wasserstein-1 algorithm was applied 

for … the (scaled) Wasserstein-1 distance between the real conditional time series distribution …

All 4 versions


[PDF] acm.org

Dual Critic Conditional Wasserstein GAN for Height-Map Generation

N Ramos, P Santos, J Dias - … of the 18th International Conference on the …, 2023 - dl.acm.org

… Another line of works [2, 8] uses conditional GANs to introduce some measure of designer … 

In this line of research, we propose a conditional WGAN that uses two critics - one focused in …


Optical proximity correction with the conditional Wasserstein GAN

P Yuan, P Xu, Y Wei - DTCO and Computational Patterning II, 2023 - spiedigitallibrary.org

… To solve this problem, we introduce the Wasserstein distance in the loss function of the 

conditional GAN. It improves the training process of the GAN-OPC, and we can obtain the …


[PDF] arxiv.org

Spectral CT denoising using a conditional Wasserstein generative adversarial network

D Hein, M Persson - Medical Imaging 2023: Physics of …, 2023 - spiedigitallibrary.org

Next generation X-ray computed tomography, based on photon-counting detectors, is now 

clinically available. These new detectors come with the promise of higher contrast-to-noise …

All 2 versions


2023


[PDF] vub.be

[PDF] WAE-PCN: Wasserstein-autoencoded Pareto Conditioned Networks

F Delgrange, M Reymond, A Nowé… - 2023 Adaptive and …, 2023 - researchportal.vub.be

… In this preliminary work, we propose a method that combines two frameworks, Pareto 

Conditioned Networks (PCN) and Wasserstein auto-encoded MDPs (WAE-MDPs), to efficiently …

All 3 versions 


[PDF] arxiv.org

Variational Gaussian filtering via Wasserstein gradient flows

A Corenflos, H Abdulsamad - arXiv preprint arXiv:2303.06398, 2023 - arxiv.org

… The Wasserstein-flow and particle filter deliver consistent … methods based on linearizing the 

conditional observation mean E [… In the particular case of the Wasserstein filter, this is made …

 All 2 versions


[PDF] Smoothed Wasserstein Distance.

J Xu, L Luo, C Deng, H Huang - IJCAI, 2018 - ijcai.org

… Wasserstein distance through a shared distance matrix and local smoothed Wasserstein

dis… diagonal matrix such that ∀i, A_{i,i} = a_i. e represents a unit vector, and I is the unit matrix. …

Cited by 15 Related articles All 5 versions 


[PDF] mlr.press

Wasserstein fair classification

R Jiang, A Pacchiano, T Stepleton… - … artificial intelligence, 2020 - proceedings.mlr.press

… This is achieved through enforcing small Wasserstein distances … We demonstrate that using

Wasserstein-1 distances to the … We introduce a Wasserstein-1 penalized logistic regression …

Cited by 125 Related articles All 5 versions 


[PDF] mlr.press

Stochastic optimization for regularized wasserstein estimators

M Ballu, Q Berthet, F Bach - International Conference on …, 2020 - proceedings.mlr.press

… If c is a distance and if ε = η = 0, then OTε is a Wasserstein distance and our problem can

be seen as computing a projection of µ onto M. In the discrete case, the solution to the …

Cited by 16 Related articles All 12 versions 

<–—2023———2023———680— 



[PDF] mlr.press

A fast proximal point method for computing exact wasserstein distance

Y Xie, X Wang, R Wang, H Zha - … in artificial intelligence, 2020 - proceedings.mlr.press

… cost of Wasserstein distance has been a thorny issue and has limited its application to

challenging machine learning problems. In this paper we focus on Wasserstein distance for …

Cited by 136 Related articles All 6 versions 


[PDF] mlr.press

Fast dictionary learning with a smoothed Wasserstein loss

A Rolet, M Cuturi, G Peyré - Artificial Intelligence and …, 2016 - proceedings.mlr.press

… Our goal in this paper is to generalize these approaches using a regularized Wasserstein

(aka … We motivate the idea of using a Wasserstein fitting error with a toy example described in …

 Cited by 145 Related articles All 9 versions 


2023 see 2022. [PDF] neurips.cc

Sliced gromov-wasserstein

V Titouan, R Flamary, N Courty… - Advances in Neural …, 2019 - proceedings.neurips.cc

Recently used in various machine learning contexts, the Gromov-Wasserstein distance (GW)

allows for comparing distributions whose supports do not necessarily lie in the same metric …

Cited by 59 Related articles All 10 versions 


[PDF] aaai.org

Wasserstein distance guided representation learning for domain adaptation

J Shen, Y Qu, W Zhang, Y Yu - … Conference on Artificial Intelligence, 2018 - ojs.aaai.org

… Wasserstein GAN, in this paper we propose a novel approach to learn domain invariant feature

representations, namely Wasserstein … critic, to estimate empirical Wasserstein distance be…

 Cited by 663 Related articles All 6 versions 
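
Note (added for context): the domain critic in this and the other WGAN-style entries estimates the Kantorovich-Rubinstein dual form

W_1(P, Q) = sup over 1-Lipschitz f of  E_{x~P}[ f(x) ] − E_{x~Q}[ f(x) ],

with f parameterized by a neural network and the Lipschitz constraint enforced approximately (weight clipping or a gradient penalty); this is the standard formulation, summarized here rather than quoted from the paper.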


[PDF] mlr.press

Stochastic wasserstein barycenters

S Claici, E Chien, J Solomon - International Conference on …, 2018 - proceedings.mlr.press

… approximation to the true Wasserstein barycenter. The support … and sample from the

Wasserstein barycenter of a collection of … perform gradient ascent using the formula in (10) where …

Cited by 82 Related articles All 13 versions 


2023



2023 see 2021. [PDF] aaai.org

Deep wasserstein graph discriminant learning for graph classification

T Zhang, Y Wang, Z Cui, C Zhou, B Cui… - … on Artificial Intelligence, 2021 - ojs.aaai.org

Graph topological structures are crucial to distinguish different-class graphs. In this work, we

propose a deep Wasserstein graph discriminant learning (WGDL) framework to learn …

 Cited by 7 Related articles All 3 versions 




[PDF] projecteuclid.org

Notes on the Wasserstein metric in Hilbert spaces

JA Cuesta, C Matrán - The Annals of Probability, 1989 - JSTOR

… The interest in the Wasserstein metrics relies on the fact that … ) as the definition of the

Wasserstein distance between P and Q… Then Ai is contained in Xl(a) if and only if A * is contained …

Cited by 109 Related articles All 7 versions



2023 see 2021

Learning graphons via structured gromov-wasserstein barycenters

H Xu, D Luo, L Carin, H Zha - … AAAI Conference on Artificial Intelligence, 2021 - ojs.aaai.org

… -Wasserstein … Wasserstein barycenter of the given graphs. Furthermore, we develop

several enhancements and extensions of the basic algorithm, eg, the smoothed GromovWasserstein …

 Cited by 12 Related articles All 5 versions 


2023 see 2021. [PDF] arxiv.org

Scalable computations of wasserstein barycenter via input convex neural networks

J Fan, A Taghvaei, Y Chen - arXiv preprint arXiv:2007.04462, 2020 - arxiv.org

… Wasserstein Barycenter is a principled approach to … to approximate the Wasserstein

Barycenters aiming at high… the Kantorovich dual formulation of the Wasserstein-2 distance as well …

Cited by 30 Related articles All 4 versions 


2023 see 2022. [PDF] mlr.press

Meta-learning without data via wasserstein distributionally-robust model fusion

Z Wang, X Wang, L Shen, Q Suo… - … Artificial Intelligence, 2022 - proceedings.mlr.press

… in various ways, including KLdivergence, Wasserstein ball, etc. DRO has been applied to

many … This paper adopts the Wasserstein ball to characterize the task embedding uncertainty …

Cited by 5 Related articles All 3 versions 

<–—2023———2023———690—



2023 see 2021. [PDF] aaai.org

Swift: Scalable wasserstein factorization for sparse nonnegative tensors

A Afshar, K Yin, S Yan, C Qian, J Ho, H Park… - … on Artificial Intelligence, 2021 - ojs.aaai.org

… Wasserstein distance, which can handle non-negative inputs. We introduce SWIFT, which

minimizes the Wasserstein … In particular, we define the N-th order tensor Wasserstein loss for …

Cited by 10 Related articles All 13 versions 


[PDF] aaai.org

EWGAN: Entropy-based Wasserstein GAN for imbalanced learning

J Ren, Y Liu, J Liu - … of the AAAI Conference on Artificial Intelligence, 2019 - ojs.aaai.org

In this paper, we propose a novel oversampling strategy dubbed Entropy-based Wasserstein

Generative Adversarial Network (EWGAN) to generate data samples for minority classes in …

Cited by 16 Related articles All 4 versions 


[PDF] aaai.org

Partial Wasserstein Covering

K Kawano, S Koide, K Otaki - … AAAI Conference on Artificial Intelligence, 2022 - ojs.aaai.org

We consider a general task called partial Wasserstein covering with the goal of providing

information on what patterns are not being taken into account in a dataset (eg, dataset used …

 Cited by 3 Related articles All 7 versions 


[PDF] arxiv.org

Fixed support tree-sliced Wasserstein barycenter

Y Takezawa, R Sato, Z Kozareva, S Ravi… - arXiv preprint arXiv …, 2021 - arxiv.org

… However, our goal is to compute a barycenter on Ω fast by approximating the Wasserstein

distance with the tree-Wasserstein distance. The probability on a leaf node is considered as …

 Cited by 7 Related articles All 4 versions 
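
Note (added for context): the tree-Wasserstein distance used for the approximation in this entry has a closed form; for a tree with edge weights w_e,

TW(μ, ν) = Σ_e w_e | μ(Γ(v_e)) − ν(Γ(v_e)) |,

where Γ(v_e) is the set of nodes in the subtree below edge e. This is the standard formula for W_1 under a tree metric, stated here for orientation rather than taken from the paper.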

 

2023 see 2022. [PDF] aaai.org

Wasserstein unsupervised reinforcement learning

S He, Y Jiang, H Zhang, J Shao, X Ji - … on Artificial Intelligence, 2022 - ojs.aaai.org

… Therefore, we choose Wasserstein distance, a well-studied … By maximizing Wasserstein

distance, the agents equipped … First, we propose a novel framework adopting Wasserstein …

Cited by 5 Related articles All 5 versions 


2023


2023 see 2022. [PDF] mlr.press

Fair learning with Wasserstein barycenters for non-decomposable performance measures

S Gaucher, N Schreuder… - … on Artificial Intelligence …, 2023 - proceedings.mlr.press

This work provides several fundamental characterizations of the optimal classification function

under the demographic parity constraint. In the awareness framework, akin to the classical …

Cited by 8 Related articles All 5 versions 


[PDF] arxiv.org

On smooth approximations in the Wasserstein space

A Cosso, M Martini - arXiv preprint arXiv:2303.15160, 2023 - arxiv.org

… In this paper we investigate the approximation of continuous functions on the Wasserstein

space by smooth functions, with smoothness meant in the sense of Lions differentiability. In …

Cited by 1 All 2 versions


[PDF] mlr.press

Discrete Langevin Samplers via Wasserstein Gradient Flow

H Sun, H Dai, B Dai, H Zhou… - International …, 2023 - proceedings.mlr.press

… -defined gradients in the sample space. In this work, we show how the Wasserstein gradient

flow can be generalized naturally to discrete spaces. Given the proposed formulation, we …
Cited by 13
 Related articles All 5 versions 


Wasserstein Loss for Semantic Editing in the Latent Space of GANs

P Doubinsky, N Audebert, M Crucianu… - arXiv preprint arXiv …, 2023 - arxiv.org

… space using the guidance of the Wasserstein loss with an Euclidean cost, which can be

combined with a Wasserstein loss with a cost computed in the attribute space … the latent space of …

All 5 versions 

 

2023 see 2022. [PDF] mlr.press

Sliced Wasserstein variational inference

M Yi, S Liu - Asian Conference on Machine Learning, 2023 - proceedings.mlr.press

… Wasserstein distance measures the cost of such a transformation. We denote X the sample

space and let Qp(X) be the set of Borel probability measures with finite p-th moment. Given …

 Cited by 18 Related articles All 4 versions 

<–—2023———2023———700—


2023 see 2022

Learning to generate wasserstein barycenters

J Lacombe, J Digne, N Courty, N Bonneel - Journal of Mathematical …, 2023 - Springer

… PCA in the Wasserstein space require the ability to compute Wasserstein barycenters ;

they have been studied by Bigot et al. [7] but could only be computed in 1-d where theory is …

Cited by 8 Related articles All 8 versions

2023 see 2022. [PDF] arxiv.org

Wasserstein information matrix

W Li, J Zhao - Information Geometry, 2023 - Springer

… a Wasserstein information matrix (WIM). We derive the WIM by pulling back the Wasserstein

metric from a immense probability space to … Wasserstein score functions with a Wasserstein …

 Cited by 19 Related articles All 5 versions

 MR4611775 


[PDF] mlr.press

Bures-Wasserstein Barycenters and Low-Rank Matrix Recovery

T Maunu, T Le Gouic, P Rigollet - … Conference on Artificial …, 2023 - proceedings.mlr.press

… Wasserstein distance defines a metric over P2(Rd), and the resulting geodesic metric space

is referred to as 2-Wasserstein space. … of 2-Wasserstein space, meaning there always exist 2…

Cited by 4 Related articles All 8 versions 

[CITATION] Bures-Wasserstein Barycenters and Low-Rank Matrix Recovery

T Maunu, T Le Gouic, P Rigollet - 2023 Joint Mathematics …, 2023 - meetings.ams.org

https://meetings.ams.org › math › meetingapp.cgi › Paper


2023 see 2022. [PDF] arxiv.org

Invariance encoding in sliced-Wasserstein space for image classification with limited training data

M Shifat-E-Rabbi, Y Zhuang, S Li, AHM Rubaiyat… - Pattern Recognition, 2023 - Elsevier

… strategy to encode invariances as typically done in machine learning, here we propose to

mathematically augment a nearest subspace classification model in sliced-Wasserstein space …

Cited by 1 Related articles All 3 versions


2023 see 2022. [PDF] arxiv.org

Quantum Wasserstein isometries on the qubit state space

GP Gehér, J Pitrik, T Titkos, D Virosztek - Journal of Mathematical Analysis …, 2023 - Elsevier

… We describe Wasserstein isometries of the quantum bit state space with respect to …

This phenomenon mirrors certain surprising properties of the quantum Wasserstein distance…

Cited by 2 Related articles All 3 versions


2023


[HTML] A Robust Continuous Authentication System Using Smartphone Sensors and Wasserstein Generative Adversarial Networks

S Zou, H Sun, G Xu, C Wang, X Zhang… - Security and …, 2023 - hindawi.com

… differences between the proposed method and state-ofthe-art methods, we analyze the

differences with other authentication methods. As shown in Table 6, this table shows the existing …

All 4 versions 

[PDF] preprints.org

Parallel Guiding Sparse Auto-Encoder with Wasserstein Regularization for Efficient Classification

H Lee, C Hur, B Ibrokhimov, S Kang - 2023 - preprints.org

… Most of the regularization-based autoencoders optimize the … prior distribution using Fused

Gromov-Wasserstein (FGW) [28] … difference by using Wasserstein distance to improve the …

All 2 versions 

A note on the Bures-Wasserstein metric

S Mohan - arXiv preprint arXiv:2303.03883, 2023 - arxiv.org

In this brief note, it is shown that the Bures-Wasserstein (BW) metric on the space of positive definite matrices lends itself to convex optimization. In other words, the computation of the BW …

All 2 versions 
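
Note (added for context, not quoted from the note): the Bures-Wasserstein metric on positive semidefinite matrices is

d_BW(A, B)^2 = Tr(A) + Tr(B) − 2 Tr( (A^{1/2} B A^{1/2})^{1/2} ),

which coincides with the 2-Wasserstein distance between the centered Gaussians N(0, A) and N(0, B).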


[PDF] mlr.press

Wasserstein Distributionally Robust Linear-Quadratic Estimation under Martingale Constraints

K Lotidis, N Bambos, J Blanchet… - … Conference on Artificial …, 2023 - proceedings.mlr.press

… Our work contributes to this line of research by studying the impact of natural constraints in

the adversarial Wasserstein-based perturbations case. We choose the Wasserstein distance …

Cited by 5 Related articles All 2 versions 


2023 see 2021  [PDF] arxiv.org

Wasserstein Adversarially Regularized Graph Autoencoder

H Liang, J Gao - Neurocomputing, 2023 - Elsevier

… To match the encoded distribution P_g(z | A, X) with the target distribution P_r, we introduce a Wasserstein regularizer that minimizes the 1-Wasserstein distance between P_r and P_g …

 Cited by 3 Related articles All 4 versions

 <–—2023———2023———710=-


 

[PDF] upc.edu

[PDF] On the Size of Fully Diverse Sets of Polygons using the Earth Movers Distance or Wasserstein Distance

F Klute, M van Kreveld - dccg.upc.edu

… area of symmetric difference to determine the distance between two simple polygons. In this

… the Earth Movers Distance and the Wasserstein Distance: we show that the maximum size of …

 Related articles 

 


[HTML] hindawi.com

[HTML] Fast and Accurate Deep Leakage from Gradients Based on Wasserstein Distance

X He, C Peng, W Tan - International Journal of Intelligent Systems, 2023 - hindawi.com

… In this study, a Wasserstein DLG method, named WDLG, is … In the proposed method, the

Wasserstein distance is used to calculate … Based on the superior performance of the Wasserstein …

All 4 versions 




[PDF] ijeast.com

[PDF] WASSERSTEIN GAN-GRADIENT PENALTY WITH DEEP TRANSFER LEARNING FOR 3D MRI ALZHEIMER DISEASE CLASSIFICATION

NR Thota, D Vasumathi - ijeast.com

MRI scans for Alzheimer's disease (AD) detection are popular. Recent computer vision (CV)

and deep learning (DL) models help construct effective computer assisted diagnosis (CAD) …



Spectral CT denoising using a conditional Wasserstein generative adversarial network

D Hein, M Persson - Medical Imaging 2023: Physics of …, 2023 - spiedigitallibrary.org

… This paper proposes a deep learning-based spectral CT denoiser. We formulate this …

This is achieved by pitting the generator against another deep neural network, called the …

All 2 versions

 

2023


2023 see 2022  [PDF] arxiv.org

Wasserstein GAN for Joint Learning of Inpainting and Spatial Optimisation

P Peter - Image and Video Technology: 10th Pacific-Rim …, 2023 - Springer

… With a Wasserstein distance, we ensure that our inpainting results accurately reflect the

statistics of … After a brief review of Wasserstein GANs in Sect. 2 we introduce our deep spatial …

Cited by 2 Related articles All 3 versions


Optical proximity correction with the conditional Wasserstein GAN

P Yuan, P Xu, Y Wei - DTCO and Computational Patterning II, 2023 - spiedigitallibrary.org

… The generalization capability of the deep neural network is important here. The generator …

To improve this, we use Wasserstein distance as the loss function and stabilize the training …

Related articles All 3 versions

[PDF] vub.be

[PDF] WAE-PCN: Wasserstein-autoencoded Pareto Conditioned Networks

F. Delgrange, M. Reymond, A. Nowé… - 2023 Adaptive and …, 2023 - researchportal.vub.be

… Concretely, we inspire ourselves from Deep-Sea-Treasure [16], a classic benchmark

MOMDP. 

Cited by 2 Related articles All 4 versions 

Spectral CT denoising using a conditional Wasserstein generative adversarial network

D Hein, M Persson - Medical Imaging 2023: Physics of …, 2023 - spiedigitallibrary.org

… In this abstract, we propose to tackle this issue by including an adversarial loss based on

the Wasserstein generative adversarial network with gradient penalty. The adversarial loss will …

Cited by 2 Related articles All 4 versions


[PDF] researchgate.net

[PDF] Distributionally Robust Two-Stage Linear Programs with Wasserstein Distance: Tractable Formulations

N Jiang, W Xie - 2023 - researchgate.net

… 11] that solving a DRTSLP with r−Wasserstein ambiguity set can be NP-hard. This section

reviews the tractable reformulations of DRTSLP (1) with r−Wasserstein ambiguity set with r = 1,…

Cited by 1 Related articles 

<–—2023———2023———720—


[PDF] projecteuclid.org

Lipschitz continuity of the Wasserstein projections in the convex order on the line

B Jourdain, W Margheriti… - Electronic Communications …, 2023 - projecteuclid.org

… that show sharpness of the obtained bounds for the 1-Wasserstein distance. … of

Wasserstein projections in the convex order, see [2]. For p ≥ 1, we denote the celebrated p-Wasserstein …

Cited by 2 Related articles All 5 versions

MR4596533 

Computation of Rate-Distortion-Perception Functions With Wasserstein Barycenter
by Chen, Chunhui; Niu, Xueyan; Ye, Wenhao ; More...
04/2023
The nascent field of Rate-Distortion-Perception (RDP) theory is seeing a surge of research interest due to the application of machine learning techniques in...
Journal Article  Full Text Online

[PDF] arxiv.org

Computation of Rate-Distortion-Perception Functions With Wasserstein Barycenter

C. Chen, X. Niu, W. Ye, S. Wu, B. Bai, W. Chen… - arXiv preprint arXiv …, 2023 - arxiv.org

… model appears to be in the form of the celebrated Wasserstein Barycenter problem [17]–[19]…

to the Wasserstein metric. Our model therefore will be referred to as the Wasserstein …

Cited by 1 All 2 versions 

2023 see 2022. [PDF] arxiv.org

Wasserstein Convergence for Empirical Measures of Subordinated Fractional Brownian Motions on the Flat Torus

H Li, B Wu - arXiv preprint arXiv:2305.01228, 2023 - arxiv.org

… Let m be the uniform distribution on the torus Td. The main … of Wasserstein distances between

empirical measures associated with the subordinated fractional Brownian motion and m. In …

All 2 versions

 

[PDF] arxiv.org

Nonparametric Generative Modeling with Conditional and Locally-Connected Sliced-Wasserstein Flows

C. Du, T. Li, T. Pang, S. Yan, M. Lin - arXiv preprint arXiv:2305.02164, 2023 - arxiv.org

… in the Wasserstein space is an absolutely continuous curve (pt)t≥… The Wasserstein gradient

flows are shown to be strongly … conditions) the Wasserstein gradient flows (pt)t coincide with …

All 2 versions 


2023 see 2021

Some inequalities on Riemannian manifolds linking Entropy, Fisher information, Stein discrepancy and Wasserstein distance

LJ Cheng, A Thalmaier, FY Wang - Journal of Functional Analysis, 2023 - Elsevier

… on M. Taking μ as reference measure, we derive inequalities for probability measures on

M linking relative entropy, Fisher information, Stein discrepancy and Wasserstein distance. …

Cited by 2 Related articles All 5 versions
MR4591327


2023

2023 see 2021. [HTML] sciencedirect.com

[HTML] Wasserstein distance between noncommutative dynamical systems

R Duvenhage - Journal of Mathematical Analysis and Applications, 2023 - Elsevier

… of quadratic Wasserstein distances on spaces consisting of generalized dynamical systems

on a von Neumann algebra. We emphasize how symmetry of such a Wasserstein distance …

 Cited by 3 Related articles All 4 versions

MR4586050

[PDF] arxiv.org

 A multi-period emergency medical service location problem based on Wasserstein-metric approach using generalised benders decomposition method

Y Yuan, Q Song, B Zhou - International Journal of Systems …, 2023 - Taylor & Francis

… Unlikely, the Franco-German group is able to provide urgent treatment on-board. Recent

years, … By dealing with the inherent uncertainty of the EMS system, a Wasserstein-metric-based …

MR4587834 

Multidimensional scaling method for complex time series based on the Wasserstein–Fourier distance in complex systems

F Zhang, P Shang, X Mao - Nonlinear Dynamics, 2023 - Springer

… new time series classification method that combines the Wasserstein–Fourier (WF) distance

… between time series by computing the Wasserstein distance between the normalized power …
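The Wasserstein–Fourier construction mentioned in this entry compares time series through their normalized power spectra. A rough sketch of that general idea (not the paper's exact method; the function name wf_distance is illustrative, NumPy/SciPy are assumed, and the 1-Wasserstein distance is used):

import numpy as np
from scipy.signal import periodogram
from scipy.stats import wasserstein_distance

def wf_distance(x, y, fs=1.0):
    # Compare two signals through their normalized power spectral densities.
    f_x, p_x = periodogram(x, fs=fs)
    f_y, p_y = periodogram(y, fs=fs)
    p_x = p_x / p_x.sum()
    p_y = p_y / p_y.sum()
    # 1-D Wasserstein distance between the two spectral distributions
    return wasserstein_distance(f_x, f_y, u_weights=p_x, v_weights=p_y)

rng = np.random.default_rng(0)
t = np.arange(0, 10, 0.01)
a = np.sin(2 * np.pi * 1.0 * t) + 0.1 * rng.standard_normal(t.size)
b = np.sin(2 * np.pi * 1.5 * t) + 0.1 * rng.standard_normal(t.size)
print(wf_distance(a, b, fs=100.0))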


[HTML] sciencedirect.com

[HTML] Medical multivariate time series imputation and forecasting based on a recurrent conditional Wasserstein GAN and attention

S Festag, C Spreckelsen - Journal of Biomedical Informatics, 2023 - Elsevier

… during optimisation, the Wasserstein-1 algorithm was applied for the actual training [18].

This means, in every training iteration the critic estimates the (scaled) Wasserstein-1 distance …

Cited by 6 Related articles All 4 versions


 [PDF] arxiv.org

Nonparametric Generative Modeling with Conditional and Locally-Connected Sliced-Wasserstein Flows

C. Du, T. Li, T. Pang, S. Yan, M. Lin - arXiv preprint arXiv:2305.02164, 2023 - arxiv.org

… 2022) propose different variants of the sliced Wasserstein distance and apply them on

generative models. These works adopt OT in different dimensions, while they are all parametric …

All 2 versions 

<–—2023———2023———730—


2023 see 2021.  [PDF] wiley.com

Multi‐marginal Approximation of the Linear Gromov–Wasserstein Distance

F Beier, R Beinert - PAMM, 2023 - Wiley Online Library

… In [22], the authors propose multi-marginal Gromov–Wasserstein transport to … –Wasserstein

In this section we go over fundamental definitions in relation to the Gromov–Wasserstein …

 Related articles All 6 versions

Wasserstein information matrix

W Li, J Zhao - Information Geometry, 2023 - Springer

… of classical Fisher information matrices. We introduce Wasserstein score functions and

study covariance operators in statistical models. Using them, we establish Wasserstein–Cramer–…

Cited by 18 Related articles All 5 versions


 

arXiv:2305.05492  [pdf, ps, other]  math.MG  math-ph  math.FA
Isometric rigidity of the Wasserstein space W_1(G) over Carnot groups
Authors: Zoltán M. Balogh, Tamás Titkos, Dániel Virosztek
Abstract: This paper aims to study isometries of the 1-Wasserstein space W_1(G) over Carnot groups endowed with horizontally strictly convex norms. Well-known examples of horizontally strictly convex norms on Carnot groups are the Heisenberg group H^n endowed with the Heisenberg-Korányi norm, or with the Naor-Lee norm; and H-type Iwasawa groups endowed with a Korányi-type…  More
Submitted 9 May, 2023; originally announced May 2023.
Comments: 20 pages. arXiv admin note: text overlap with arXiv:2303.15095
MSC Class: 46E27; 49Q22; 54E40


arXiv:2305.05211  [pdf, ps, other]  math.FA  math.DS  math.OC  math.PR
A Lagrangian approach to totally dissipative evolutions in Wasserstein spaces
Authors: Giulia Cavagnari, Giuseppe Savaré, Giacomo Enrico Sodini
Abstract: We introduce and study the class of totally dissipative multivalued probability vector fields (MPVF) F on the Wasserstein space (P_2(X), W_2) of Euclidean or Hilbertian probability measures. We show that such class of MPVFs is in one to one correspondence with law-invariant dissipative operators in a Hilbert space…  More
Submitted 9 May, 2023; originally announced May 2023.
Comments: 86 pages
MSC Class: Primary: 34A06; 47B44; 49Q22. Secondary: 34A12; 34A60; 28D05


arXiv:2305.04410  [pdf, other]  cs.IR
WSFE: Wasserstein Sub-graph Feature Encoder for Effective User Segmentation in Collaborative Filtering
Authors: Yankai Chen, Yifei Zhang, Menglin Yang, Zixing Song, Chen Ma, Irwin King
Abstract: Maximizing the user-item engagement based on vectorized embeddings is a standard procedure of recent recommender models. Despite the superior performance for item recommendations, these methods however implicitly deprioritize the modeling of user-wise similarity in the embedding space; consequently, identifying similar users is underperforming, and additional processing schemes are usually require…  More
Submitted 7 May, 2023; originally announced May 2023.

Cited by 1 All 2 versions 

2023


arXiv:2305.04290  [pdf, other]  math.ST
Wasserstein distance bounds on the normal approximation of empirical autocovariances and cross-covariances under non-stationarity and stationarity
Authors: Andreas Anastasiou, Tobias Kley
Abstract: The autocovariance and cross-covariance functions naturally appear in many time series procedures (e.g., autoregression or prediction). Under assumptions, empirical versions of the autocovariance and cross-covariance are asymptotically normal with covariance structure depending on the second and fourth order spectra. Under non-restrictive assumptions, we derive a bound for the Wasserstein distance…  More
Submitted 7 May, 2023; originally announced May 2023.
MSC Class: 62E17; 62F12


arXiv:2305.04034  [pdf, other]  cs.AI  cs.DB  cs.LG
Wasserstein-Fisher-Rao Embedding: Logical Query Embeddings with Local Comparison and Global Transport
Authors: Zihao Wang, Weizhi Fei, Hang Yin, Yangqiu Song, Ginny Y. Wong, Simon See
Abstract: Answering complex queries on knowledge graphs is important but particularly challenging because of the data incompleteness. Query embedding methods address this issue by learning-based models and simulating logical reasoning with set operators. Previous works focus on specific forms of embeddings, but scoring functions between embeddings are underexplored. In contrast to existing scoring functions…  More
Submitted 6 May, 2023; originally announced May 2023.
Comments: Findings in ACL 2023. 16 pages, 6 figures, and 8 tables. Our implementation can be found at https://github.com/HKUST-KnowComp/WFRE

All 2 versions

arXiv:2305.03565  [pdf, other]  stat.ML  cs.LG  math.OC  math.PR  q-fin.MF
The geometry of financial institutions -- Wasserstein clustering of financial data
Authors: Lorenz Riess, Mathias Beiglböck, Johannes Temme, Andreas Wolf, Julio Backhoff
Abstract: The increasing availability of granular and big data on various objects of interest has made it necessary to develop methods for condensing this information into a representative and intelligible map. Financial regulation is a field that exemplifies this need, as regulators require diverse and often highly granular data from financial institutions to monitor and assess their activities. However, p…  More
Submitted 5 May, 2023; originally announced May 2023.


arXiv:2305.02745  [pdf, other]  cs.CV
Age-Invariant Face Embedding using the Wasserstein Distance
Authors: Eran Dahan, Yosi Keller
Abstract: In this work, we study face verification in datasets where images of the same individuals exhibit significant age differences. This poses a major challenge for current face recognition and verification techniques. To address this issue, we propose a novel approach that utilizes multitask learning and a Wasserstein distance discriminator to disentangle age and identity embeddings of facial images.…  More
Submitted 4 May, 2023; originally announced May 2023.

All 3 versions 

arXiv:2305.02164  [pdf, other]  cs.LG
Nonparametric Generative Modeling with Conditional and Locally-Connected Sliced-Wasserstein Flows
Authors: Chao Du, Tianbo Li, Tianyu Pang, Shuicheng Yan, Min Lin
Abstract: Sliced-Wasserstein Flow (SWF) is a promising approach to nonparametric generative modeling but has not been widely adopted due to its suboptimal generative quality and lack of conditional modeling capabilities. In this work, we make two major contributions to bridging this gap. First, based on a pleasant observation that (under certain conditions) the SWF of joint distributions coincides with thos…  More
Submitted 3 May, 2023; originally announced May 2023.
Comments: ICML 2023

<–—2023———2023———740—


arXiv:2304.14869  [pdf, other]  math.PR  stat.AP  stat.ML
On the 1-Wasserstein Distance between Location-Scale Distributions and the Effect of Differential Privacy
Authors: Saurab Chhachhi, Fei Teng
Abstract: We provide an exact expressions for the 1-Wasserstein distance between independent location-scale distributions. The expressions are represented using location and scale parameters and special functions such as the standard Gaussian CDF or the Gamma function. Specifically, we find that the 1-Wasserstein distance between independent univariate location-scale distributions is equivalent to the mean…  More
Submitted 28 April, 2023; originally announced April 2023.
Comments: 11 pages, 3 figures

All 2 versions
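For one-dimensional laws such as the location-scale families above, W_1 reduces to the area between the two CDFs, W_1(mu, nu) = \int |F(x) - G(x)| dx, which can be checked numerically. A small illustrative sketch of that generic fact (not the closed-form expressions derived in the paper; NumPy/SciPy assumed):

import numpy as np
from scipy.stats import norm, wasserstein_distance

# W1 between two Gaussian location-scale laws, computed two ways.
mu1, s1, mu2, s2 = 0.0, 1.0, 1.5, 2.0

# (1) numerically integrate |F - G| on a grid
x = np.linspace(-20.0, 20.0, 40001)
w1_cdf = np.sum(np.abs(norm.cdf(x, mu1, s1) - norm.cdf(x, mu2, s2))) * (x[1] - x[0])

# (2) SciPy's sample-based estimator, for comparison
rng = np.random.default_rng(1)
w1_emp = wasserstein_distance(rng.normal(mu1, s1, 200000), rng.normal(mu2, s2, 200000))

print(w1_cdf, w1_emp)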


arXiv:2304.14781  [pdf, other]  math.AP
1d approximation of measures in Wasserstein spaces
Authors: Antonin Chambolle, Vincent Duval, Joao Miguel Machado
Abstract: We propose a variational approach to approximate measures with measures uniformly distributed over a 1 dimensional set. The problem consists in minimizing a Wasserstein distance as a data term with a regularization given by the length of the support. As it is challenging to prove existence of solutions to this problem, we propose a relaxed formulation, which always admits a solution. In the sequel…  More
Submitted 28 April, 2023; originally announced April 2023.



Discrete Langevin Samplers via Wasserstein Gradient Flow

H. Sun, H. Dai, B. Dai, H. Zhou… - International …, 2023 - proceedings.mlr.press

… flow that minimizes KL divergence on a Wasserstein manifold… In this work, we show how the

Wasserstein gradient flow can … With this new understanding, we reveal how recent gradient …

Cited by 4 Related articles All 2 versions 

 


[PDF] arxiv.org

Neural Wasserstein Gradient Flows for Maximum Mean Discrepancies with Riesz Kernels

F. Altekrüger, J. Hertrich, G. Steidl - arXiv preprint arXiv:2301.11624, 2023 - arxiv.org

… We introduce Wasserstein gradient flows and Wasserstein steepest descent flows as well

as a backward and forward scheme for their time discretization in Sect. 2. In Sect. …

 Cited by 1 All 2 versions 


[PDF] arxiv.org

Wasserstein Gradient Flows of the Discrepancy with Distance Kernel on the Line

J. Hertrich, R. Beinert, M. Gräf, G. Steidl - arXiv preprint arXiv:2301.04441, 2023 - arxiv.org

… This paper provides results on Wasserstein gradient flows between measures on the real …

of the Wasserstein space P2(R) into the Hilbert space L2((0, 1)), Wasserstein gradient flows of …

 Related articles All 2 versions 


2023


[PDF] arxiv.org

Variational Gaussian filtering via Wasserstein gradient flows

A Corenflos, H Abdulsamad - arXiv preprint arXiv:2303.06398, 2023 - arxiv.org

In this article, we present a variational approach to Gaussian and mixture-of-Gaussians

assumed filtering. Our method relies on an approximation stemming from the gradient-flow …

All 2 versions 


[HTML] hindawi.com

[HTML] Fast and Accurate Deep Leakage from Gradients Based on Wasserstein Distance

X He, C Peng, W Tan - International Journal of Intelligent Systems, 2023 - hindawi.com

… independent of the approximation of the shared gradient, and thus, the label … Wasserstein

distance is used to calculate the error loss between the shared gradient and the virtual gradient

 Cited by 3 Related articles All 5 versions 





[PDF] arxiv.org

Least Wasserstein distance between disjoint shapes with perimeter regularization

M. Novack, I. Topaloglu, R. Venkatraman - Journal of Functional Analysis, 2023 - Elsevier

… Indeed, length-minimizing Wasserstein geodesics between … problems involving the Wasserstein

distance between equal … R n , W p denotes the p-Wasserstein distance on the space of …

Cited by 2 Related articles All 9 versions

 


2023 see 2021. [PDF] arxiv.org

Convergence in Wasserstein Distance for Empirical Measures of Non-Symmetric Subordinated Diffusion Processes

FY Wang - arXiv preprint arXiv:2301.08420, 2023 - arxiv.org

… operator, the convergence in Wasserstein distance is characterized for the empirical … M,

let Wp be 

All 4 versions 

<–—2023———2023———750—



Class-rebalanced wasserstein distance for multi-source domain adaptation

Q Wang, S Wang, B Wang - Applied Intelligence, 2023 - Springer

… scheme, class-rebalanced Wasserstein distance (CRWD), for … structure by rectifying the

Wasserstein mapping from source … ground metric of Mahalanobis distance to better metricise the …

 Related articles


 

[PDF] arxiv.org

Algebraic Wasserstein distances and stable homological invariants of data

J Agerberg, A Guidolin, I Ren, M Scolamiero - arXiv preprint arXiv …, 2023 - arxiv.org

… define a richer family of parametrized Wasserstein distances where, in addition to standard

… Wasserstein distances are defined as a generalization of the algebraic Wasserstein distances

Related articles All 2 versions 



A Novel Graph Kernel Based on the Wasserstein Distance and Spectral Signatures

Y. Liu, L. Rossi, A. Torsello - … , and Statistical Pattern Recognition: Joint IAPR …, 2023 - Springer

… With these distributions to hand, similarly to [29], we propose to take the negative

exponential of the Wasserstein distance, a widely used distance function between probability …

 Related articles All 2 versions


Stone's theorem for distributional regression in Wasserstein distance

C Dombry, T Modeste, R Pic - arXiv preprint arXiv:2302.00975, 2023 - arxiv.org

… measured by the Wasserstein distance of order p … Wasserstein distance has a simple explicit

form, but also the case of a multivariate output Y ∈ R^d. The use of the Wasserstein distance …

Cited by 1 Related articles All 6 versions

[PDF] arxiv.org

On the 1-Wasserstein Distance between Location-Scale Distributions and the Effect of Differential Privacy

S. Chhachhi, F. Teng - arXiv preprint arXiv:2304.14869, 2023 - arxiv.org

… for the 1-Wasserstein distance between independent location-… Specifically, we find that the

1-Wasserstein distance between … A new linear upper bound on the 1-Wasserstein distance is …

 All 2 versions 


2023


2023 see 2021. 

Scenario Reduction Network Based on Wasserstein Distance with Regularization

X. Dong, Y. Sun, S. M. Malik, T. Pu, Y. Li… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org

… This paper presents a scenario reduction network model based on Wasserstein distance. …

reduction network corresponds to the Sinkhorn distance function. The scenario reduction …

Related articles All 2 versions


[PDF] researchgate.net

[PDF] Markovian Sliced Wasserstein Distances: Beyond Independent Projections

K. Nguyen, T. Ren, N. Ho - 2023 - researchgate.net

… for Wasserstein distance, sliced Wasserstein distance, and max … Wasserstein distance based

on orthogonal projecting directions. We refer the distance as K sliced Wasserstein distance (…

Related articles 


Towards inverse modeling of landscapes using the Wasserstein distance

M. J. Morris, A. G. Lipp, G. G. Roberts - Authorea Preprints, 2023 - authorea.com

… Instead, we introduce the Wasserstein distance as a means to measure misfit between

observed and theoretical landscapes. We first demonstrate its use with a one-dimensional …


2023 see 2022. [PDF] arxiv.org

Quantitative control of Wasserstein distance between Brownian motion and the Goldstein–Kac telegraph process

G. Barrera, J. Lukkarinen - Annales de l'Institut Henri Poincare (B) …, 2023 - projecteuclid.org

… with suitable diffusivity constant via a Wasserstein distance with quadratic average cost. In

… [French: with appropriate diffusivity with respect to the Wasserstein distance and with an average cost …]

Related articles All 4 versions


 2023 see 2021. [PDF] arxiv.org

On adaptive confidence sets for the Wasserstein distances

N. Deo, T. Randrianarisoa - Bernoulli, 2023 - projecteuclid.org

… In the density estimation model, we investigate the problem of constructing adaptive

honest confidence sets with diameter measured in Wasserstein distance Wp, p ≥ 1, and for …

Related articles All 4 versions

<–—2023———2023———760—


Related searches: wasserstein metric distributionally robust optimization; wasserstein metric convergence; wasserstein metric probability measures; wasserstein metric probability distributions; wasserstein metric rüschendorf; gaussian wasserstein metric; wasserstein metric chance constrained; wasserstein metric markov chain


Privacy-utility equilibrium data generation based on Wasserstein generative adversarial networks

H Liu, Y Tian, C Peng, Z Wu - Information Sciences, 2023 - Elsevier

… For privacy analysis, this paper uses Wasserstein distance and Euclidean distance to

evaluate the effect of privacy protection. The Wasserstein distance is the difference between the …

Cited by 4 Related articles All 2 versions


GA-ENs: A novel drug–target interactions prediction method by incorporating prior Knowledge Graph into dual Wasserstein...
by Li, Guodong; Sun, Weicheng; Xu, Jinsheng ; More...
Applied soft computing, 05/2023, Volume 139
Bipartite graph-based drug–target interactions (DTIs) prediction methods are commonly limited by the sparse structure of the graph, resulting in acquiring...
Article PDFPDF
Journal Article  Full Text Online
View in Context Browse Journal
All 2 versions

Cited by 1 Related articles


 Distributionally robust optimization with Wasserstein...
by Wu, Zhongming; Sun, Kexin
Applied mathematical modelling, 05/2023, Volume 117
•A new distributionally robust mean-variance model with Wasserstein metric was developed.•The novel model was transformed into a tractable convex problem by...
Article PDFPDF
Journal Article  Full Text Online
View in Context Browse Journal
 
On approximations of data-driven chance constrained programs over Wasserstein...
by Chen, Zhi; Kuhn, Daniel; Wiesemann, Wolfram
Operations research letters, 05/2023, Volume 51, Issue 3
Distributionally robust chance constrained programs minimize a deterministic cost function subject to the satisfaction of one or more safety conditions with...
Article PDFPDF
Journal Article  Full Text Online
View in Context Browse Journal
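The Wasserstein ambiguity sets used in these distributionally robust formulations are usually written, in generic notation not specific to either paper above, as a ball of radius epsilon around the empirical distribution \hat{P}_N:

\mathcal{B}_\varepsilon(\hat{P}_N) = \{\, Q \in \mathcal{P}(\Xi) : W_p(Q, \hat{P}_N) \le \varepsilon \,\},
\qquad
\min_{x \in X} \ \sup_{Q \in \mathcal{B}_\varepsilon(\hat{P}_N)} \mathbb{E}_{Q}[\ell(x, \xi)].

The radius epsilon trades off robustness against conservatism; the chance-constrained variants instead require the safety conditions to hold with high probability under every Q in the ball.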

2023
 
WG-ICRN: Protein 8-state secondary structure prediction based on Wasserstein generative adversarial networks and residual networks with Inception modules
by Li, Shun; Yuan, Lu; Ma, Yuming ; More...
Mathematical Biosciences and Engineering, 05/2023, Volume 20, Issue 5
Protein secondary structure is the basis of studying the tertiary structure of proteins, drug design and development, and the 8-state protein secondary...
Article PDFPDF
Journal Article  Full Text Online
View in Context Browse Journal
Open Access
All 3 versions


2023 see arXiv
Control Variate Sliced Wasserstein...
by Nguyen, Khai; Ho, Nhat
04/2023
The sliced Wasserstein (SW) distances between two probability measures are defined as the expectation of the Wasserstein distance between two one-dimensional...
Journal Article  Full Text Online
Open Access
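The sliced Wasserstein distance referenced in this and several nearby entries averages one-dimensional Wasserstein distances over random projection directions. A plain Monte Carlo sketch of SW_1 follows (the baseline estimator only, not the control-variate construction proposed in the paper; the function name sliced_wasserstein is illustrative and NumPy/SciPy are assumed):

import numpy as np
from scipy.stats import wasserstein_distance

def sliced_wasserstein(X, Y, n_projections=200, seed=0):
    # X: (n, d) samples from one measure, Y: (m, d) samples from the other.
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_projections):
        theta = rng.standard_normal(d)
        theta /= np.linalg.norm(theta)  # uniform direction on the unit sphere
        # 1-D Wasserstein distance between the projected samples
        total += wasserstein_distance(X @ theta, Y @ theta)
    return total / n_projections

X = np.random.default_rng(1).normal(0.0, 1.0, (500, 5))
Y = np.random.default_rng(2).normal(1.0, 1.0, (500, 5))
print(sliced_wasserstein(X, Y))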


Semidefinite Programming Relaxations of the Simplified Wasserstein...
by Cheng, Jiahui
2023
The Simplified Wasserstein Barycenter problem, the problem of picking k points each chosen from a distinct set of n points as to minimize the sum of distances...
Dissertation/ThesisCitation Online
Open Access
[PDF] uwaterloo.ca

Semidefinite Programming Relaxations of the Simplified Wasserstein Barycenter Problem: An ADMM Approach

J Cheng - 2023 - uwspace.uwaterloo.ca

… transport distance, i.e. Wasserstein distance. Even though this … oriented applications, efficient

computation of Wasserstein … space is to compute their Wasserstein barycenter, the closest …
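For context, the (unsimplified) Wasserstein barycenter of measures mu_1, ..., mu_n with weights lambda_i >= 0, sum_i lambda_i = 1, is standardly defined as

\bar{\mu} \in \arg\min_{\nu \in \mathcal{P}_2(\mathbb{R}^d)} \sum_{i=1}^{n} \lambda_i \, W_2^2(\nu, \mu_i);

per the abstract above, the "simplified" problem instead picks one point from each given point set so as to minimize a sum of distances.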

 


2023 see arXiv
The geometry of financial institutions -- Wasserstein...
by Riess, Lorenz; Beiglböck, Mathias; Temme, Johannes ; More...
05/2023
The increasing availability of granular and big data on various objects of interest has made it necessary to develop methods for condensing this information...
Journal Article  Full Text Online

Age-Invariant Face Embedding using the Wasserstein...
by Dahan, Eran; Keller, Yosi
05/2023
In this work, we study face verification in datasets where images of the same individuals exhibit significant age differences. This poses a major challenge for...
Journal Article  Full Text Online
Open Access

<–—2023———2023———770—



Control Variate Sliced Wasserstein...
by Nguyen, Khai; Ho, Nhat
arXiv.org, 04/2023
The sliced Wasserstein (SW) distances between two probability measures are defined as the expectation of the Wasserstein distance between two one-dimensional...
Paper  Full Text Online
Open Access

WSFE: Wasserstein...
by Chen, Yankai; Zhang, Yifei; Yang, Menglin ; More...
05/2023
Maximizing the user-item engagement based on vectorized embeddings is a standard procedure of recent recommender models. Despite the superior performance for...
Journal Article  Full Text Online
Open Access

2023 see 2022
Wasserstein-Fisher-Rao...
by Wang, Zihao; Fei, Weizhi; Yin, Hang ; More...
05/2023
Answering complex queries on knowledge graphs is important but particularly challenging because of the data incompleteness. Query embedding methods address...
Journal Article  Full Text Online
Open Access
 

 2023 see arxiv
Nonparametric Generative Modeling with Conditional and Locally-Connected Sliced-Wasserstein..
by Du, Chao; Li, Tianbo; Pang, Tianyu ; More...
05/2023
Sliced-Wasserstein Flow (SWF) is a promising approach to nonparametric generative modeling but has not been widely adopted due to its suboptimal generative...
Journal Article  Full Text Online
Open Access
 2023 see arXiv
Wasserstein Convergence...
by Li, Huaiqian; Wu, Bingyao
05/2023
We estimate rates of convergence for empirical measures associated with the subordinated fractional Brownian motion to the uniform distribution on the flat...
Journal Article  Full Text Online
Open Access

The geometry of financial institutions -- Wasserstein...
by Riess, Lorenz; Beiglböck, Mathias; Temme, Johannes ; More...
arXiv.org, 05/2023
The increasing availability of granular and big data on various objects of interest has made it necessary to develop methods for condensing this information...
Paper  Full Text Online
Open Access

2023

Age-Invariant Face Embedding using the Wasserstein...
by Dahan, Eran; Keller, Yosi
arXiv.org, 05/2023
In this work, we study face verification in datasets where images of the same individuals exhibit significant age differences. This poses a major challenge for...
Paper  Full Text Online
Open Access

Nonparametric Generative Modeling with Conditional and Locally-Connected Sliced-Wasserstein...
by Du, Chao; Li, Tianbo; Pang, Tianyu ; More...
arXiv.org, 05/2023
Sliced-Wasserstein Flow (SWF) is a promising approach to nonparametric generative modeling but has not been widely adopted due to its suboptimal generative...
Paper  Full Text Online
Open Access
Cited by 1
 Related articles All 2 versions 

Wasserstein Convergence...
by Li, Huaiqian; Wu, Bingyao
arXiv.org, 05/2023
We estimate rates of convergence for empirical measures associated with the subordinated fractional Brownian motion to the uniform distribution on the flat...
Paper  Full Text Online
Open Access

 
Wasserstein Graph...
by Fang, Zhongxi; Huang, Jianming; Su, Xun ; More...
arXiv.org, 05/2023
The Weisfeiler-Lehman (WL) test is a widely used algorithm in graph machine learning, including graph kernels, graph metrics, and graph neural networks....
Paper  Full Text Online
Open Access

MR4581742 Prelim Es-Sebaiy, Khalifa; Alazemi, Fares; Al-Foraih, Mishari; Wasserstein bounds in CLT of approximative MCE and MLE of the drift parameter for Ornstein-Uhlenbeck processes observed at high frequency. J. Inequal. Appl. 2023, Paper No. 62. 60F05 (60H07 62M05)

Review PDF Clipboard Journal Article

Cited by 1 Related articles All 9 versions

<–—2023———2023———780—


2023. Data

AC-WGAN-GP loss functions.

Gutta, Cristiano; Morhard, Christoph and Rehm, Markus

2023 | 

Figshare

 | Data set

(A) Loss functions of the discriminator identifying real vs fake patients and (B) risk category. (C) Loss function of the generator. Loss functions were computed over 1000 training epochs. (TIF) Copyright: CC BY 4.0
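The discriminator (critic) and generator losses tracked in this record follow the standard WGAN-GP objectives; stated generically (this is the usual formulation of the method, not something extracted from the dataset itself):

L_D = \mathbb{E}_{\tilde{x} \sim P_g}[D(\tilde{x})] - \mathbb{E}_{x \sim P_r}[D(x)]
      + \lambda \, \mathbb{E}_{\hat{x} \sim P_{\hat{x}}}\big[ (\| \nabla_{\hat{x}} D(\hat{x}) \|_2 - 1)^2 \big],
\qquad
L_G = - \mathbb{E}_{\tilde{x} \sim P_g}[D(\tilde{x})],

where \hat{x} is sampled uniformly along segments between real and generated samples and \lambda is the gradient-penalty weight.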


  

 2023 patent

Bearing fault diagnosis method based on the improved WGAN-GP and the Alxnet, involves acquiring original vibration signal of bearing, and converting original vibration signal into time-frequency domain signals of different fault categories

CN115962946-A

Inventor(s) JIANG X; ZHANG Y; (...); FU W

Assignee(s) UNIV CHINA THREE GORGES

Derwent Primary Accession Number 

2023-43173F

 Related records



2023 patent

Computer-based method for unsupervised alignment of embedding spaces in natural language processing, involves updating permutation matrix and orthogonal matrix based on quantized two-Wasserstein distance and gradient descent and Procrustes problem

WO2023059503-A1

Inventor(s) ZHENG Y; ABOAGYE P O; (...); ZHANG W

Assignee(s) VISA INT SERVICE ASSOC

Derwent Primary Accession Number 

2023-378480


2023 patent

Method for enhancing grid deformation data based on Wasserstein generative adversarial network gradient penalty model used in image processing and data enhancement field, involves respectively warping medical endoscope image and deformation grid to obtain data enhanced medical endoscope image

CN115937038-A

Inventor(s) SHEN N; HU P and LI J

Assignee(s) UNIV SHANGHAI

Derwent Primary Accession Number 

2023-42316D


Multi-source monitoring information fusion method for dam health diagnosis based on Wasserstein distance

Chen, AY; Tang, XQ; (...); He, JP

Jun 2023 | Mar 2023 (Early Access) | 

INFORMATION SCIENCES

 632 , pp.378-389

The fusion of multi-source monitoring information has become the main trend in the field of dam health diagnosis because of the increasing amount of monitoring data that can be obtained from different sensors. However, the Dempster-Shafer (D-S) evidence theory, an important method in multi-source information fusion, may produce counter-intuitive results when fusing conflicting pieces of evidence …

42 References Related records

Cited by 5 Related articles All 2 versions

2023

2023. Scholarly Journal

Wasserstein bounds in CLT of approximative MCE and MLE of the drift parameter for Ornstein-Uhlenbeck processes observed at high frequency

Es-Sebaiy, Khalifa; Alazemi, Fares; Al-Foraih, Mishari. Journal of Inequalities and Applications; Heidelberg Vol. 2023, Iss. 1,  (Dec 2023): 62.


2023. Working Paper

A Fused Gromov-Wasserstein Framework for Unsupervised Knowledge Graph Entity Alignment

Tang, Jianheng; Zhao, Kangfei; Li, Jia. arXiv.org; Ithaca, May 11, 2023.

Abstract/DetailsGet full text
opens in a new window


 2023 see 2022.  Working Paper

Coresets for Wasserstein Distributionally Robust Optimization Problems

Huang, Ruomin; Huang, Jiawei; Liu, Wenjie; Hu, Ding. arXiv.org; Ithaca, May 9, 2023.

Abstract/DetailsGet full text
opens in a new window

Coresets for Wasserstein Distributionally Robust Optimization Problems
by Huang, Ruomin; Huang, Jiawei; Liu, Wenjie ; More...
arXiv.org, 05/2023
Wasserstein distributionally robust optimization (\textsf{WDRO}) is a popular model to enhance the robustness of machine learning with ambiguous data. However,...

2023. Working Paper

Improved Image Wasserstein Attacks and Defenses

Hu, Edward J; Swaminathan, Adith; Salman, Hadi; Yang, Greg. arXiv.org; Ithaca, May 9, 2023.

Abstract/DetailsGet full text
opens in a new window
  Improved Image Wasserstein Attacks and Defenses
by Hu, Edward J; Swaminathan, Adith; Salman, Hadi ; More...
arXiv.org, 05/2023
Robustness against image perturbations bounded by a \(\ell_p\) ball have been well-studied in recent literature. Perturbations in the real-world, however,...
Paper  Full Text Online


2023. Working Paper

Wasserstein distance bounds on the normal approximation of empirical autocovariances and cross-covariances under non-stationarity and stationarity

Anastasiou, Andreas; Kley, Tobias. arXiv.org; Ithaca, May 7, 2023.

Full Text

Abstract/DetailsGet full text
opens in a new window

<–—2023———2023———790—



2023 see 2022. Working Paper

Wasserstein multivariate auto-regressive models for modeling distributional time series and its application in graph learning

Jiang, Yiye. arXiv.org; Ithaca, May 6, 2023.

Abstract/DetailsGet full text
opens in a new window

[PDF] arxiv.org

Gromov–Wasserstein Transfer Operators

F Beier - Scale Space and Variational Methods in Computer …, 2023 - Springer

Gromov–Wasserstein (GW) transport is inherently invariant under isometric transformations of the data. Having this property in mind, we propose to estimate dynamical systems by …

All 3 versions

Cited by 2 Related articles All 4 versions

2023 see 2022

Entropic regularization of Wasserstein distance between infinite-dimensional Gaussian measures and Gaussian processes

HQ Minh - Journal of Theoretical Probability, 2023 - ideas.repec.org

… formulation of the 2-Wasserstein distance on an infinite-… plan, entropic 2-Wasserstein distance, and Sinkhorn divergence … , both the entropic 2-Wasserstein distance and Sinkhorn …

Cited by 6 Related articles All 9 versions

[CITATION] Entropic regularization of Wasserstein distance between infinite-dimensional Gaussian measures and Gaussian processes

HQ Minh - Journal of Theoretical Probability, 2023 - … 

ResNet-WGAN Based End-to-End Learning for IoV Communication with Unknown Channels

J Zhao, H Mu, Q Zhang, H Zhang - IEEE Internet of Things …, 2023 - ieeexplore.ieee.org

… Finally, we present the simulation results of the ResNet-WGAN under additive white Gaussian … WGAN to end-to-end learning. The training steps of ResNet-WGAN are given in Algorithm …
Cited by 2
 Related articles


[PDF] mdpi.com

extendGAN+: Transferable Data Augmentation Framework Using WGAN-GP for Data-Driven Indoor Localisation Model

S Yean, W Goh, BS Lee, HL Oh - Sensors, 2023 - mdpi.com

… on the WGAN-GP to create synthetic RSS data as a WGAN-… Upon selecting locmax, the WGAN-GP model is trained from … RSS in dmax amount using the trained WGAN-GP. For another …

Related articles All 9 versions 


2023

 

基于 WGAN-GP CNN-LSTM-Attention 的短期光伏功率预测

雷柯松, 吐松江, 卡日, 苏宁, 吴现, 崔传世 - 电力系统保护与控制, 2023 - epspc.net

[Chinese snippet: Wasserstein generative adversarial network with gradient penalty (WGAN-GP) … First, the K-means++ clustering algorithm is used to divide the historical photovoltaic data into several weather types, and WGAN-GP is used to generate …]

[Chinese. Short-Term Photovoltaic Power Prediction Based on WGAN-GP and CNN-LSTM-Attention]

All 2 versions 


An extended Exp-TODIM method for multiple attribute decision making based on the Z-Wasserstein distance

H Sun, Z Yang, Q Cai, G Wei, Z Mo - Expert Systems with Applications, 2023 - Elsevier

… Therefore, based on the above analysis we propose a novel exp-TODIM method combined

with Z-Wasserstein distances and apply it to the carbon storage siting problem. Its main …

Cited by 7 Related articles
 
Parameter estimation from aggregate observations: A Wasserstein distance based sequential Monte Carlo sampler
by Chen, Cheng; Wen, Linjie; Li, Jinglai
arXiv.org, 05/2023
In this work we study systems consisting of a group of moving particles. In such systems, often some important parameters are unknown and have to be estimated...
Paper  Full Text Online
Open Access

[PDF] arxiv.org

Algebraic Wasserstein distances and stable homological invariants of data

J Agerberg, A Guidolin, I Ren, M Scolamiero - arXiv preprint arXiv …, 2023 - arxiv.org

… based on the algebraic Wasserstein distance defined in [ST20] and … Wasserstein stable

ranks on synthetic and real-world data, learning optimal parameters of algebraic Wasserstein …

Related articles All 2 versions 



2023 see 2022
INVITATION TO OPTIMAL TRANSPORT, WASSERSTEIN DISTANCES, AND GRADIENT FLOWS
Author: ALESSIO FIGALLI
Print Book, 2023
English

<–—2023———2023———800—


[PDF] aimspress.com

WG-ICRN: Protein 8-state secondary structure prediction based on Wasserstein generative adversarial networks and residual networks with Inception modules

S Li, L Yuan, Y Ma, Y Liu - Mathematical Biosciences and …, 2023 - aimspress.com

… , and the 8-state protein secondary structure can provide more adequate protein information

… 8-state s

 All 3 versions 

 



A Wasserstein generative digital twin model in health monitoring of rotating machines

W Hu, T Wang, F Chu - Computers in Industry, 2023 - Elsevier

Artificial intelligence-based rotating machine health monitoring and diagnosis methods often encounter problems, such as a lack of faulty samples. Although the simulation-based digital twin model may potentially alleviate these problems with sufficient prior knowledge and a large amount of time, the more demanding requirements of adaptivity, autonomy, and context-awareness may not be satisfied. This study attempted to address these problems by proposing a novel digital twin model referred to as the Wasserstein generative digital twin …

 Related articles



arXiv:2305.10411  [pdf, other]  cs.LG  cs.RO
Wasserstein Gradient Flows for Optimizing Gaussian Mixture Policies
Authors: Hanna Ziesche, Leonel Rozo
Abstract: Robots often rely on a repertoire of previously-learned motion policies for performing tasks of diverse complexities. When facing unseen task conditions or when new task requirements arise, robots must adapt their motion policies accordingly. In this context, policy optimization is the \emph{de facto} paradigm to adapt robot policies as a function of task-specific objectives. Most commonly-used mo…  More
Submitted 17 May, 2023; originally announced May 2023.

 Cited by 1 Related articles All 5 versions 


arXiv:2305.10089  [pdf, ps, other]  cs.LG  cs.AI  stat.ML
A proof of imitation of Wasserstein inverse reinforcement learning for multi-objective optimization
Authors: Akira Kitaoka, Riki Eto
Abstract: We prove Wasserstein inverse reinforcement learning enables the learner's reward values to imitate the expert's reward values in a finite iteration for multi-objective optimizations. Moreover, we prove Wasserstein inverse reinforcement learning enables the learner's optimal solutions to imitate the expert's optimal solutions for multi-objective optimizations with lexicographic order.
Submitted 17 May, 2023; v1 submitted 17 May, 2023; originally announced May 2023.
Comments: 9 pages. This text is continuation from arXiv:2305.06137

Related articles All 2 versions 

arXiv:2305.09760  [pdf, other]  eess.SY  math.OC
Distributionally Robust Differential Dynamic Programming with Wasserstein Distance
Authors: Astghik Hakobyan, Insoon Yang
Abstract: Differential dynamic programming (DDP) is a popular technique for solving nonlinear optimal control problems with locally quadratic approximations. However, existing DDP methods are not designed for stochastic systems with unknown disturbance distributions. To address this limitation, we propose a novel DDP method that approximately solves the Wasserstein distributionally robust control (WDRC) pro…  More
Submitted 16 May, 2023; originally announced May 2023.


2023


Buzun, Nazar

Gaussian approximation for penalized Wasserstein barycenters. (English) Zbl 07686805

Math. Methods Stat. 32, No. 1, 1-26 (2023).

MSC:  62-XX 60-XX

PDF BibTeX XML Cite

Full Text: DOI

 arXiv



Enhanced CNN Classification Capability for Small Rice Disease Datasets Using Progressive WGAN-GP: Algorithms and Applications
by Lu, Yang; Tao, Xianpeng; Zeng, Nianyin ; More...
Remote sensing (Basel, Switzerland), 04/2023, Volume 15, Issue 7
An enhancement generator model with a progressive Wasserstein generative adversarial network and gradient penalized (PWGAN-GP) is proposed to solve the problem...
ArticleView Article PDF
Journal Article  Full Text Online
All 4 versions
 
2023 see 2022
Distributionally Robust Stochastic Optimization with Wasserstein Distance
by Gao, Rui; Kleywegt, Anton
Mathematics of operations research, 05/2023, Volume 48, Issue 2
Distributionally robust stochastic optimization (DRSO) is an approach to optimization under uncertainty in which, instead of assuming that there is a known...
ArticleView Article PDF
Journal Article  Full Text Online
Cited by 1
 Related articles All 4 versions 
 
Wasserstein distance-based distributionally robust parallel-machine scheduling
by Yin, Yunqiang; Luo, Zunhao; Wang, Dujuan ; More...
Omega (Oxford), 05/2023
ArticleView Article PDF
Journal Article  Full Text Online

 

2023 see 2022

Partial Discharge Data Augmentation Based on Improved Wasserstein Generative Adversarial Network With Gradient Penalty
by Zhu, Guangya; Zhou, Kai; Lu, Lu ; More...
IEEE transactions on industrial informatics, 05/2023, Volume 19, Issue 5
The partial discharge (PD) classification for electric power equipment based on machine learning algorithms often leads to insufficient generalization ability...
ArticleView Article PDF
Journal Article  Full Text Online

<–—2023———2023———810—



Sparse super resolution and its trigonometric approximation in the p‐Wasserstein distance
by Catala, Paul; Hockmann, Mathias; Kunis, Stefan
Proceedings in applied mathematics and mechanics, 03/2023, Volume 22, Issue 1
We consider the approximation of measures by trigonometric polynomials with respect to the p‐Wasserstein distance for general p ≥ 1. While the best...
ArticleView Article PDF
Journal Article  Full Text Online

2023 see 2022
HackGAN: Harmonious Cross-Network Mapping Using CycleGAN With Wasserstein-Procrustes Learning for Unsupervised Network Alignment
by Yang, Linyao; Wang, Xiao; Zhang, Jun ; More...
IEEE transactions on computational social systems, 04/2023, Volume 10, Issue 2
Network alignment (NA) that identifies equivalent nodes across networks is an effective tool for integrating knowledge from multiple networks. The...
ArticleView Article PDF
Journal Article  Full Text Online


2023 see 2021
Learning to Simulate Sequentially Generated Data via Neural Networks and Wasserstein Training
by Zhu, Tingyu; Liu, Haoyu; Zheng, Zeyu
ACM transactions on modeling and computer simulation, 04/2023
We propose a new framework of a neural network-assisted sequential structured simulator to model, estimate, and simulate a wide class of sequentially generated...
ArticleView Article PDF
Journal Article  Full Text Online

Convergence of the empirical measure in expected Wasserstein distance: non asymptotic explicit bounds in Rd
by Fournier, Nicolas
Probability and statistics, 05/2023
We provide some non asymptotic bounds, with explicit constants, that measure the rate of convergence, in expected Wasserstein distance, of the empirical...
ArticleView Article PDF
Journal Article  Full Text Online
 
Geological model automatic reconstruction based on conditioning Wasserstein generative adversarial network with gradient penalty
by Fan, Wenyao; Liu, Gang; Chen, Qiyu ; More...
Earth science informatics, 04/2023
ArticleView Article PDF
Journal Article  Full Text Online
 

2023

 
2023 see 2022
Strong Formulations for Distributionally Robust Chance-Constrained Programs with Left-Hand Side Uncertainty Under Wasserstein Ambiguity
by Ho-Nguyen, Nam; Kilinç-Karzan, Fatma; Küçükyavuz, Simge ; More...
INFORMS journal on optimization, 04/2023, Volume 5, Issue 2
Distributionally robust chance-constrained programs (DR-CCPs) over Wasserstein ambiguity sets exhibit attractive out-of-sample performance and admit...
ArticleView Article PDF
Journal Article  Full Text Online


2023 see arXiv
Energy-Based Sliced Wasserstein Distance
by Nguyen, Khai; Ho, Nhat
04/2023
The sliced Wasserstein (SW) distance has been widely recognized as a statistically effective and computationally efficient metric between two probability...
Journal Article  Full Text Online

Cited by 8 Related articles All 6 versions 


2023 see arXiv
1d approximation of measures in Wasserstein spaces
by Chambolle, Antonin; Duval, Vincent; Machado, Joao Miguel
04/2023
We propose a variational approach to approximate measures with measures uniformly distributed over a 1 dimensional set. The problem consists in minimizing a...
Journal Article  Full Text Online
Related articles
 All 5 versions 


2023 see arXiv
Wasserstein Tube MPC with Exact Uncertainty Propagation
by Aolaritei, Liviu; Fochesato, Marta; Lygeros, John ; More...
04/2023
We study model predictive control (MPC) problems for stochastic LTI systems, where the noise distribution is unknown, compactly supported, and only observable...
Journal Article  Full Text Online
Cited by 9
 Related articles All 4 versions


2023 see arXiv
Nonlinear Wasserstein Distributionally Robust Optimal Control
by Zhong, Zhengang; Zhu, Jia-Jie
04/2023
This paper presents a novel approach to addressing the distributionally robust nonlinear model predictive control (DRNMPC) problem. Current literature...
Journal Article  Full Text Online

Cited by 3 Related articles All 4 versions 

<–—2023———2023———820—


2023 see arXiv
Wasserstein Principal Component Analysis for Circular Measures
by Beraha, Mario; Pegoraro, Matteo
04/2023
We consider the 2-Wasserstein space of probability measures supported on the unit-circle, and propose a framework for Principal Component Analysis (PCA) for...
Journal Article  Full Text Online

All 3 versions View as HT

Cited by 4 Related articles All 3 versions 

2023 see arXiv

Variational Gaussian filtering via Wasserstein gradient flows
by Corenflos, Adrien; Abdulsamad, Hany
03/2023
In this article, we present a variational approach to Gaussian and mixture-of-Gaussians assumed filtering. Our method relies on an approximation stemming from...
Journal Article  Full Text Online
Related articles All 8 versions


2023 see arXiv
A note on the Bures-Wasserstein metric
by Mohan, Shravan
03/2023
In this brief note, it is shown that the Bures-Wasserstein (BW) metric on the space of positive definite matrices lends itself to convex optimization. In other...
Journal Article  Full Text Online
Related articles
 All 2 versions 
 
2023 see arXiv
A Lagrangian approach to totally dissipative evolutions in Wasserstein spaces
by Cavagnari, Giulia; Savaré, Giuseppe; Sodini, Giacomo Enrico
05/2023
We introduce and study the class of totally dissipative multivalued probability vector fields (MPVF) $\boldsymbol{\mathrm F}$ on the Wasserstein space...
Journal Article  Full Text Online
Cited by 1
 All 3 versions 
 
2023 see arXiv
The geometry of financial institutions -- Wasserstein clustering of financial data
by Riess, Lorenz; Beiglböck, Mathias; Temme, Johannes ; More...
05/2023
The increasing availability of granular and big data on various objects of interest has made it necessary to develop methods for condensing this information...
Journal Article  Full Text Online


2023
 


2023 see arXiv
Age-Invariant Face Embedding using the Wasserstein Distance
by Dahan, Eran; Keller, Yosi
05/2023
In this work, we study face verification in datasets where images of the same individuals exhibit significant age differences. This poses a major challenge for...
Journal Article  Full Text Online

Cited by 1 Related articles All 3 versions 


2023 see arXiv
An Asynchronous Decentralized Algorithm for Wasserstein Barycenter Problem
by Zhang, Chao; Qian, Hui; Xie, Jiahao
04/2023
Wasserstein Barycenter Problem (WBP) has recently received much attention in the field of artificial intelligence. In this paper, we focus on the decentralized...
Journal Article  Full Text Online

2023 see arXiv
Wasserstein PAC-Bayes Learning: A Bridge Between Generalisation and Optimisation
by Haddouche, Maxime; Guedj, Benjamin
04/2023
PAC-Bayes learning is an established framework to assess the generalisation ability of learning algorithm during the training phase. However, it remains...
Journal Article  Full Text Online
Cited by 4
 Related articles All 6 versions 

 
2023 see working paper
Adjusted Wasserstein Distributionally Robust Estimator in Statistical Learning
by Xie, Yiling; Huo, Xiaoming
03/2023
We propose an adjusted Wasserstein distributionally robust estimator -- based on a nonlinear transformation of the Wasserstein distributionally robust (WDRO)...
Journal Article  Full Text Online
Cited by 1
 Related articles All 2 versions 


2023 see working paper
Continuum Swarm Tracking Control: A Geometric Perspective in Wasserstein Space
by Emerick, Max; Bamieh, Bassam
03/2023
We consider a setting in which one swarm of agents is to service or track a second swarm, and formulate an optimal control problem which trades off between the...
Journal Article  Full Text Online

Cited by 2 Related articles All 3 versions

<–—2023———2023———830—


2023 see 2021
Lifting couplings in Wasserstein spaces
by Perrone, Paolo
arXiv.org, 04/2023
This paper makes mathematically precise the idea that conditional probabilities are analogous to path liftings in geometry. The idea of lifting is modelled in...
Paper  Full Text Online

2023 see working paper
A Fused Gromov-Wasserstein Framework for Unsupervised Knowledge Graph Entity Alignment
by Tang, Jianheng; Zhao, Kangfei; Li, Jia
05/2023
Entity alignment is the task of identifying corresponding entities across different knowledge graphs (KGs). Although recent embedding-based entity alignment...
Journal Article  Full Text Online
Cited by 5
 Related articles All 3 versions 

2023 see arxiv
Isometric rigidity of the Wasserstein space $\mathcal{W}_1(\mathbf{G})$ over Carnot groups
by Balogh, Zoltán M; Titkos, Tamás; Virosztek, Dániel
05/2023
This paper aims to study isometries of the $1$-Wasserstein space $\mathcal{W}_1(\mathbf{G})$ over Carnot groups endowed with horizontally strictly convex...
Journal Article  Full Text Online

2023 see arXiv
WSFE: Wasserstein Sub-graph Feature Encoder for Effective User Segmentation in Collaborative Filtering
by Chen, Yankai; Zhang, Yifei; Yang, Menglin ; More...
05/2023
Maximizing the user-item engagement based on vectorized embeddings is a standard procedure of recent recommender models. Despite the superior performance for...
Journal Article  Full Text Online
 Cited by 1 All 2 versions 

2023 see arXiv
Wasserstein-Fisher-Rao Embedding: Logical Query Embeddings with Local Comparison and Global Transport
by Wang, Zihao; Fei, Weizhi; Yin, Hang ; More...
05/2023
Answering complex queries on knowledge graphs is important but particularly challenging because of the data incompleteness. Query embedding methods address...
Journal Article  Full Text Online

Cited by 7 Related articles All 4 versions 


2023


2023 see arXiv
Wasserstein Convergence for Empirical Measures of Subordinated Fractional Brownian Motions on the Flat Torus
by Li, Huaiqian; Wu, Bingyao
05/2023
We estimate rates of convergence for empirical measures associated with the subordinated fractional Brownian motion to the uniform distribution on the flat...
Journal Article  Full Text Online
Cited by 7
 Related articles All 4 versions

2023 see arXiv
Reconstructing discrete measures from projections. Consequences on the empirical Sliced Wasserstein Distance
by Tanguy, Eloi; Flamary, Rémi; Delon, Julie
04/2023
This paper deals with the reconstruction of a discrete measure $\gamma_Z$ on $\mathbb{R}^d$ from the knowledge of its pushforward measures $P_i\#\gamma_Z$ by...
Journal Article  Full Text Online


2023 see arXiv

Forward-backward Gaussian variational inference via JKO in the Bures-Wasserstein Space
by Diao, Michael; Balasubramanian, Krishnakumar; Chewi, Sinho ; More...
04/2023
Variational inference (VI) seeks to approximate a target distribution $\pi$ by an element of a tractable family of distributions. Of key interest in statistics...
Journal Article  Full Text Online
Cited by 4
All 3 versions

2023 see arXiv
Distributionally robust mixed-integer programming with Wasserstein metric: on the value of uncertain data
by Ketkov, Sergey S
04/2023
This study addresses a class of linear mixed-integer programming (MIP) problems that involve uncertainty in the objective function coefficients. The...
Journal Article  Full Text Online
Cited by 16 Related articles All 7 versions 


2023 see arXiv
Rediscover Climate Change during Global Warming Slowdown via Wasserstein Stability Analysis
by Xie, Zhiang; Chen, Dongwei; Li, Puxi
03/2023
Climate change is one of the key topics in climate science. However, previous research has predominantly concentrated on changes in mean values, and few...
Journal Article  Full Text Online

All 2 versions 

<–—2023———2023———840—



2023 see arXiv
The Wasserstein Believer: Learning Belief Updates for Partially Observable Environments through Reliable Latent Space Models
by Avalos, Raphael; Delgrange, Florent; Nowé, Ann ; More...
03/2023
Partially Observable Markov Decision Processes (POMDPs) are useful tools to model environments where the full state cannot be perceived by an agent. As such...
Journal Article  Full Text Online
Cited by 2
 Related articles All 8 versions 
 
2023 see 2022
Variational inference via Wasserstein gradient flows
by Lambert, Marc; Chewi, Sinho; Bach, Francis ; More...
arXiv.org, 04/2023
Along with Markov chain Monte Carlo (MCMC) methods, variational inference (VI) has emerged as a central computational approach to large-scale Bayesian...
Paper  Full Text Online

Wasserstein distance bounds on the normal approximation of empirical autocovariances and cross-covariances under non-stationarity and...
by Anastasiou, Andreas; Kley, Tobias
05/2023
The autocovariance and cross-covariance functions naturally appear in many time series procedures (e.g., autoregression or prediction). Under assumptions,...
Journal Article  Full Text Online
All 4 versions


Parameter estimation for many-particle models from aggregate observations: A Wasserstein distance based sequential Monte Carlo...
by Cheng, Chen; Wen, Linjie; Li, Jinglai
03/2023
In this work we study systems consisting of a group of moving particles. In such systems, often some important parameters are unknown and have to be estimated...
Journal Article  Full Text Online

Wasserstein GAN for Joint Learning of Inpainting and Spatial Optimisation
by Peter, Pascal
Image and Video Technology, 04/2023
Image inpainting is a restoration method that reconstructs missing image parts. However, a carefully selected mask of known pixels that yield a high quality...
Book Chapter  Full Text Online

 2023

 
 A Lagrangian approach to totally dissipative evolutions in Wasserstein spaces
by Cavagnari, Giulia; Savaré, Giuseppe; Sodini, Giacomo Enrico
arXiv.org, 05/2023
We introduce and study the class of totally dissipative multivalued probability vector fields (MPVF) \(\boldsymbol{\mathrm F}\) on the Wasserstein space...
Paper  Full Text Online



 
Wasserstein Distributionally Robust Optimization with Expected Value Constraints
by Fonseca, Diego; Junca, Mauricio
arXiv.org, 04/2023
We investigate a stochastic program with expected value constraints, addressing the problem in a general context through Distributionally Robust Optimization...
Paper  Full Text Online
 

2023 see 2022
Wasserstein Graph Distance Based on \(L_1\)-Approximated Tree Edit Distance between Weisfeiler-Lehman Subtrees
by Fang, Zhongxi; Huang, Jianming; Su, Xun; More...
arXiv.org, 05/2023
The Weisfeiler-Lehman (WL) test is a widely used algorithm in graph machine learning, including graph kernels, graph metrics, and graph neural networks....
Paper  Full Text Online

2023 see 2021
Averaging on the Bures-Wasserstein manifold: dimension-free convergence of gradient descent
by Altschuler, Jason M.; Chewi, Sinho; Gerber, Patrik; More...
arXiv.org, 04/2023
We study first-order optimization algorithms for computing the barycenter of Gaussian distributions with respect to the optimal transport metric. Although the...
Paper  Full Text Online
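
Background sketch (not the authors' code): for centred Gaussians the 2-Wasserstein distance has the closed Bures-Wasserstein form \(W_2^2(N(0,A), N(0,B)) = \operatorname{tr} A + \operatorname{tr} B - 2 \operatorname{tr}\big((A^{1/2} B A^{1/2})^{1/2}\big)\), which the following minimal Python snippet (NumPy/SciPy assumed available) evaluates on two small synthetic covariance matrices.

import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein_sq(A, B):
    # Squared 2-Wasserstein distance between N(0, A) and N(0, B).
    rA = sqrtm(A)
    cross = sqrtm(rA @ B @ rA)
    val = np.trace(A) + np.trace(B) - 2.0 * np.trace(cross)
    return float(np.real(val))  # discard negligible imaginary round-off

A = np.array([[2.0, 0.3], [0.3, 1.0]])   # synthetic SPD matrices
B = np.array([[1.0, 0.0], [0.0, 1.5]])
print(bures_wasserstein_sq(A, B))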

Convergence of the empirical measure in expected Wasserstein distance: non asymptotic explicit bounds in \(\mathbb{R}^d\)
by Fournier, Nicolas
arXiv.org, 03/2023
We provide some non asymptotic bounds, with explicit constants, that measure the rate of convergence, in expected Wasserstein distance, of the empirical...
Paper  Full Text Online

Cited by 2 Related articles All 6 versions

<–—2023———2023———850—


2023 patent news
CGG Services SAS Gets Patent for Methods and Devices Performing Adaptive Quadratic Wasserstein Full-Waveform Inversion
Global IP News. Information Technology Patent News, 04/2023
Newsletter  Full Text Online

2023 patent news
Univ Jiliang China Submits Chinese Patent Application for Early Fault Detection Method Based on Wasserstein Distance
Global IP News. Measurement & Testing Patent News, 03/2023
Newsletter  Full Text Online


2023 patent news
Xiao Fuyuan Applies for Patent on Application of Evidence Wasserstein Distance Algorithm in Aspect of Component Identification
Global IP News. Software Patent News, 04/2023
Newsletter  Full Text Online

2023 patent news
Univ Qinghua Seeks Patent for Rotating Machine State Monitoring Method Based on Wasserstein Depth Digital Twinborn Model
Global IP News. Tools and Machinery Patent News, 03/2023
Newsletter  Full Text Online

 
2023 patent news
US Patent Issued to CGG SERVICES on April 25 for "Methods and devices performing adaptive quadratic Wasserstein full-waveform...
US Fed News Service, Including US State News, 04/2023
Newsletter  Full Text Online
Methods and devices performing adaptive quadratic Wasserstein full-waveform...
by Wang, Diancheng; Wang, Ping
04/2023
Methods and devices for seismic exploration of an underground structure apply W2-based full-wave inversion to transformed synthetic and seismic data. Data...
Patent  Available Online

 2023

 
2023 patent news
Quanzhou Institute of Equipment Mfg Submits Chinese Patent Application for Wasserstein Distance-Based Battery SOH (State of...
Global IP News. Electrical Patent News, 04/2023
Newsletter  Full Text Online

 2023 patent news
Billionaire’s Love Child Sues His Other Kids for $100M: Bruce Wasserstein died in 2009, but a feud over his assets rages on
by Kirsch, Noah
The Daily Beast, May 2, 2023
Newspaper Article  Full Text Online


EAF-WGAN: Enhanced Alignment Fusion-Wasserstein Generative Adversarial Network for Turbulent Image Restoration
by Liu, Xiangqing; Li, Gang; Zhao, Zhenyang ; More...
IEEE transactions on circuits and systems for video technology, 2023
Article PDFPDF
Journal Article  Full Text Online
Cited by 2
 Related articles All 2 versions


Comparing Beta-VAE to WGAN-GP for Time Series Augmentation to Improve Classification Performance
by Kavran, Domen; Žalik, Borut; Lukač, Niko
Agents and Artificial Intelligence, 01/2023
Datasets often lack diversity to train robust classification models, capable of being used in real-life scenarios. Neural network-based generative models learn...
Book Chapter  Full Text Online

Wasserstein GAN-based Digital Twin Inspired Model for Early Drift Fault Detection in Wireless Sensor Networks
by Hasan, Md. Nazmul; Jan, Sana Ullah; Koo, Insoo
IEEE sensors journal, 2023
Article PDFPDF
Journal Article  Full Text Online 

All 3 versions

<–—2023———2023———860—


Bounds in L1 Wasserstein distance on the normal approximation of general M-estimators
by Bachoc, François; Fathi, Max
Electronic journal of statistics, 01/2023, Volume 17, Issue 1
Article PDFPDF
Journal Article  Full Text Online

MR4588477

[PDF] arxiv.org

Unified Topological Inference for Brain Networks in Temporal Lobe Epilepsy Using the Wasserstein Distance

MK Chung, CG Ramos, FB De Paiva, J Mathis… - arXiv preprint arXiv …, 2023 - arxiv.org

… for differentiating brain networks in a two-sample comparison setting based on the Wasserstein distance. We will show that the proposed method based on the Wasserstein distance can …

 All 6 versions 


arXiv:2305.14248  [pdf, ps, other]  math.PR
Improved rates of convergence for the multivariate Central Limit Theorem in Wasserstein distance
Authors: Thomas Bonis
Abstract: We provide new bounds for rates of convergence of the multivariate Central Limit Theorem in Wasserstein distances of order p ≥ 2. In particular, we obtain an asymptotic bound for measures with a continuous component which we conjecture to be optimal.
Submitted 23 May, 2023; originally announced May 2023.
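
For reference, the order-p Wasserstein distance appearing throughout these entries is the standard one (not taken from the record above):

\[ W_p(\mu, \nu) = \Big( \inf_{\gamma \in \Pi(\mu, \nu)} \int \|x - y\|^p \, \mathrm{d}\gamma(x, y) \Big)^{1/p}, \]

where \(\Pi(\mu, \nu)\) denotes the set of couplings of \(\mu\) and \(\nu\).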


arXiv:2305.12310  [pdf, other]  eess.IV  q-bio.QM  stat.ML
Alignment of Density Maps in Wasserstein Distance
Authors: Amit Singer, Ruiyi Yang
Abstract: In this paper we propose an algorithm for aligning three-dimensional objects when represented as density maps, motivated by applications in cryogenic electron microscopy. The algorithm is based on minimizing the 1-Wasserstein distance between the density maps after a rigid transformation. The induced loss function enjoys a more benign landscape than its Euclidean counterpart and Bayesian optimizat…  More
Submitted 20 May, 2023; originally announced May 2023.

All 3 versions 

[HTML] mdpi.com

[HTML] Intrusion Detection in Networks by Wasserstein Enabled Many-Objective Evolutionary Algorithms

A Ponti, A Candelieri, I Giordani, F Archetti - Mathematics, 2023 - mdpi.com

… A graph variant of the Wasserstein distance is called the Gromow–Wasserstein (GW) distance, … The Wasserstein distance is used to propagate their measurements to the whole network, …

All 4 versions 


2023


Data Augmentation Strategy for Power Inverter Fault Diagnosis Based on Wasserstein Distance and Auxiliary Classification Generative Adversarial Network

Q Sun, F Peng, X Yu, H Li - Reliability Engineering & System Safety, 2023 - Elsevier

… review of sample imbalance. Table 2 shows the Pros and cons of the GAN-based methods in the literature review … an auxiliary classification based on Wasserstein distance to generative …



[HTML] copernicus.org

[HTML] Short communication: The Wasserstein distance as a dissimilarity metric for comparing detrital age spectra and …

A Lipp, P Vermeesch - Geochronology, 2023 - gchron.copernicus.org

… In the following sections, we first introduce the Wasserstein … We then proceed to compare the Wasserstein distance to the … distance has been added to the IsoplotR software, and …

[HTML] The Wasserstein distance as a dissimilarity metric for comparing detrital age spectra and other geological distributions

A Lipp, P Vermeesch - Geochronology, 2023 - gchron.copernicus.org

… In the following sections, we first introduce the Wasserstein … We then proceed to compare the Wasserstein distance to the … real datasets using both the Wasserstein and KS distances. We …

Cited by 6 Related articles All 7 versions 
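
Illustrative sketch only (not the IsoplotR implementation mentioned above): in one dimension the Wasserstein distance between two empirical age spectra can be computed directly, for example with SciPy; the age samples below are purely synthetic.

import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
ages_a = rng.normal(1050.0, 60.0, size=300)                    # synthetic ages (Ma)
ages_b = np.concatenate([rng.normal(1050.0, 60.0, size=200),
                         rng.normal(2700.0, 80.0, size=100)])  # second synthetic spectrum

# First-order (W1) Wasserstein distance between the two empirical distributions.
print(wasserstein_distance(ages_a, ages_b))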

 

Source-Independent Full-Waveform Inversion Based on Convolutional Wasserstein Distance Objective Function

S Jiang, H Chen, H Li, H Zhou, L Wang… - … on Geoscience and …, 2023 - ieeexplore.ieee.org

… To solve this dilemma, we construct a novel convolutional Wasserstein distance (CW) objective function by applying the OTD objective function to convolved seismograms. Before the …


[PDF] ijeast.com

[PDF] WASSERSTEIN GAN-GRADIENT PENALTY WITH DEEP TRANSFER LEARNING FOR 3D MRI ALZHEIMER DISEASE CLASSIFICATION

NR Thota, D Vasumathi - ijeast.com

MRI scans for Alzheimer's disease (AD) detection are popular. Recent computer vision (CV) and deep learning (DL) models help construct effective computer assisted diagnosis (CAD) …

Related articles All 2 versions 

[PDF] arxiv.org

The geometry of financial institutions--Wasserstein clustering of financial data

L Riess, M Beiglböck, J Temme, A Wolf… - arXiv preprint arXiv …, 2023 - arxiv.org

… Wasserstein barycenter, introduced by Delon, Gozlan, and Saint-Dizier [7], which extends the classical Wasserstein … to optimal transport and regularized Wasserstein distance, we refer …

Cited by 1 Related articles All 4 versions 

<–—2023———2023———870—

MR4593235 Prelim  Candelieri, Antonio; Ponti, Andrea; Giordani, Ilaria; Archetti, Francesco; 

On the use of Wasserstein distance in the distributional analysis of human decision making under uncertainty. Ann. Math. Artif. Intell. 91 (2023), no. 2-3, 217–238.

Review PDF Clipboard Journal Article

Cite Cited by 4 Related articles All 4 versions

MR4592868 Prelim Le Gouic, Thibaut; Paris, Quentin; Rigollet, Philippe; Stromme, Austin J.; Fast convergence of empirical barycenters in Alexandrov spaces and the Wasserstein space. J. Eur. Math. Soc. (JEMS) 25 (2023), no. 6, 2229–2250.

Review PDF Clipboard Journal Article

2023 see 2022

MR4588138 Prelim Bubenik, Peter; Scott, Jonathan; Stanley, Donald; 

Exact weights, path metrics, and algebraic Wasserstein distances. J. Appl. Comput. Topol. 7 (2023), no. 2, 185–219. 55N31 (18E10)

Review PDF Clipboard Journal Article

Cited by 6 Related articles

Working Paper

Large Sample Theory for Bures-Wasserstein Barycentres

Santoro, Leonardo V; Panaretos, Victor M. arXiv.org; Ithaca, May 24, 2023.

 

Working Paper

Wasserstein Gaussianization and Efficient Variational Bayes for Robust Bayesian Synthetic Likelihood

Nguyen, Nhat-Minh; Tran, Minh-Ngoc; Drovandi, Christopher; Nott, David. arXiv.org; Ithaca, May 24, 2023.

by Nguyen, Nhat-Minh; Tran, Minh-Ngoc; Drovandi, Christopher ; More...

Wasserstein Gaussianization and Efficient Variational Bayes for Robust 

arXiv.org, 05/2023

The Bayesian Synthetic Likelihood (BSL) method is a widely-used tool for likelihood-free Bayesian inference. This method assumes that some summary statistics...

Paper  Full Text Online

Open Access


2023

  

2023 see arXive

Working Paper

Sliced-Wasserstein on Symmetric Positive Definite Matrices for M/EEG Signals

Bonet, Clément; Malézieux, Benoît; Rakotomamonjy, Alain; Lucas Drumetz; Moreau, Thomas; et al. arXiv.org; Ithaca, May 24, 2023.


Improved rates of convergence for the multivariate Central Limit Theorem in Wasserstein distance

Bonis, Thomas. arXiv.org; Ithaca, May 23, 2023.

Rediscover Climate Change during Global Warming Slowdown via Wasserstein Stability Analysis

by Xie, Zhiang; Chen, Dongwei; Li, Puxi

arXiv.org, 05/2023

Climate change is one of the key topics in climate science. However, previous research has predominantly concentrated on changes in mean values, and few...

Paper  Full Text Online

 Based on Wasserstein distance, we develop a novel method, named as Wasserstein Stability …

Related articles All 3 versions 


Working Paper

Alignment of Density Maps in Wasserstein Distance

Singer, Amit; Yang, Ruiyi. arXiv.org; Ithaca, May 21, 2023.


Working Paper

Uniform-in-Time Wasserstein Stability Bounds for (Noisy) Stochastic Gradient Descent

Zhu, Lingjiong; Gurbuzbalaban, Mert; Anant Raj; Simsekli, Umut. arXiv.org; Ithaca, May 20, 2023.


 

Working Paper

A proof of imitation of Wasserstein inverse reinforcement learning for multi-objective optimization

Kitaoka, Akira; Eto, Riki. arXiv.org; Ithaca, May 18, 2023.


<–—2023———2023———880—



Working Paper

Wasserstein Gradient Flows for Optimizing Gaussian Mixture Policies

Ziesche, Hanna; Rozo, Leonel. arXiv.org; Ithaca, May 17, 2023.

 

Working Paper

Distributionally Robust Differential Dynamic Programming with Wasserstein Distance

Hakobyan, Astghik; Yang, Insoon. arXiv.org; Ithaca, May 16, 2023.

 Publication:IEEE control systems letters, 7, 2023, 2329
Publisher: 2023

Library


Working Paper

Optimal control of the Fokker-Planck equation under state constraints in the Wasserstein space

Daudin, Samuel. arXiv.org; Ithaca, May 15, 2023.

Zbl 07693669.    MR4598928

2023 see 2022

Conditional Wasserstein Generator.

Kim, Young-Geun; Lee, Kyungbok and Paik, Myunghee Cho

2023-jun | 

IEEE transactions on pattern analysis and machine intelligence

 45 (6) , pp.7208-7219

The statistical distance of conditional distributions is an essential element of generating target data given some data as in video prediction. We establish how the statistical distances between two joint distributions are related to those between two conditional distributions for three popular statistical distances: f-divergence, Wasserstein distance, and integral probability metrics. Such cha

Show more

Full Text at Publisher

 


extendGAN+: Transferable Data Augmentation Framework Using WGAN-GP for Data-Driven Indoor Localisation Model

Yean, S; Goh, W; (...); Oh, HL

Apr 30 2023 | 

SENSORS

 23 (9)

For indoor localisation, a challenge in data-driven localisation is to ensure sufficient data to train the prediction model to produce a good accuracy. However, for WiFi-based data collection, human effort is still required to capture a large amount of data as the representation Received Signal Strength (RSS) could easily be affected by obstacles and other factors. In this paper, we propose an

Show more

Free Full Text from Publisher

36 References   Related records



2023


2023 see 2022

Stochastic approximation versus sample average approximation for Wasserstein barycenters

Dvinskikh, D

Sep 3 2022 | Sep 2021 (Early Access) | 

OPTIMIZATION METHODS & SOFTWARE

 37 (5) , pp.1603-1635

In the machine learning and optimization community, there are two main approaches for the convex risk minimization problem, namely the Stochastic Approximation (SA) and the Sample Average Approximation (SAA). In terms of the oracle complexity (required number of stochastic gradient evaluations), both approaches are considered equivalent on average (up to a logarithmic factor). The total complex

Show more

Free Full Text From Publisher

1 Citation  78 References

Related records

Show more

Free Full Text from Publisher  View Associated Data

1 Citation  101 References  Related records



Variant Wasserstein Generative Adversarial Network Applied on Low Dose CT Image Denoising

Mahmoud, AA; Sayed, HA and Mohamed, SS

2023 | 

CMC-COMPUTERS MATERIALS & CONTINUA

 75 (2) , pp.4535-4552

Computed Tomography (CT) images have been extensively employed in disease diagnosis and treatment, causing a huge concern over the dose of radiation to which patients are exposed. Increasing the radiation dose to get a better image may lead to the development of genetic disorders and cancer in the patients; on the other hand, decreasing it by using a LowDose CT (LDCT) image may cause more noise

Show more

Free Full Text from Publisher

45 References Related records 

Cited by 2 Related articles All 2 versions 


2023 see 2022

Quantitative control of Wasserstein distance between Brownian motion and the Goldstein-Kac telegraph process

Barrera, G and Lukkarinen, J

May 2023 | 

ANNALES DE L INSTITUT HENRI POINCARE-PROBABILITES ET STATISTIQUES

 59 (2) , pp.933-982

In this manuscript, we provide a non-asymptotic process level control between the telegraph process and the Brownian motion with suitable diffusivity constant via a Wasserstein distance with quadratic average cost. In addition, we derive non-asymptotic estimates for the corresponding time average p-th moments. The proof relies on coupling techniques such as coin-flip coupling, synchronous coupl

Show more

Free Full Text From Publisher

47 References. Related records


2023. Research article
Distributionally robust optimal dispatching method of integrated electricity and heating system based on improved Wasserstein metric
International Journal of Electrical Power & Energy Systems, 11 April 2023...

Hongwei Li, Hongpeng Liu, Wei Zhang
Cited by 3
 Related articles

2023. Research article
Portfolio optimization using robust mean absolute deviation model: Wasserstein metric approach
Finance Research Letters, 28 February 2023...

Zohreh Hosseini-Nodeh, Rashed Khanjani-Shiraz, Panos M. Pardalos

Cited by 8 Related articles All 3 versions

<–—2023———2023———890—


2023. Research article

Wasserstein distance‐based distributionally robust parallel‐machine scheduling
Omega, 5 May 2023...

Yunqiang Yin, Zunhao Luo, T. C. E. Cheng

arXiv:2305.17076  [pdf, ps, other]  cs.LG  stat.ML
Exact Generalization Guarantees for (Regularized) Wasserstein Distributionally Robust Models
Authors: Waïss Azizian, Franck Iutzeler, Jérôme Malick
Abstract: Wasserstein distributionally robust estimators have emerged as powerful models for prediction and decision-making under uncertainty. These estimators provide attractive generalization guarantees: the robust objective obtained from the training distribution is an exact upper bound on the true risk with high probability. However, existing guarantees either suffer from the curse of dimensionality, ar…  More
Submitted 26 May, 2023; originally announced May 2023.
Comments: 46 pages


arXiv:2305.16984  [pdf, other]  math.ST  math.PR
Wasserstein contraction and spectral gap of slice sampling revisited
Authors: Philip Schär
Abstract: We propose a new class of Markov chain Monte Carlo methods, called k-polar slice sampling (k-PSS), as a technical tool that interpolates between and extrapolates beyond uniform and polar slice sampling. By examining Wasserstein contraction rates and spectral gaps of k-PSS, we obtain strong quantitative results regarding its performance for different kinds of target distributions. Because…  More
Submitted 26 May, 2023; originally announced May 2023.
Comments: 28 pages, 3 figures
MSC Class: 65C05 (Primary) 60J05; 60J22 (Secondary)


arXiv:2305.16557  [pdf, other]  stat.ML  cs.LG  math.PR
Tree-Based Diffusion Schrödinger Bridge with Applications to Wasserstein Barycenters
Authors: Maxence Noble, Valentin De Bortoli, Arnaud Doucet, Alain Durmus
Abstract: Multi-marginal Optimal Transport (mOT), a generalization of OT, aims at minimizing the integral of a cost function with respect to a distribution with some prescribed marginals. In this paper, we consider an entropic version of mOT with a tree-structured quadratic cost, i.e., a function that can be written as a sum of pairwise cost functions between the nodes of a tree. To address this problem, we…  More
Submitted 25 May, 2023; originally announced May 2023.

Cited by 1 Related articles All 9 versions 

arXiv:2305.15592  [pdf, ps, other]  math.PR  math.ST
Large Sample Theory for Bures-Wasserstein Barycentres
Authors: Leonardo V. Santoro, Victor M. Panaretos
Abstract: We establish a strong law of large numbers and a central limit theorem in the Bures-Wasserstein space of covariance operators -- or equivalently centred Gaussian measures -- over a general separable Hilbert space. Specifically, we show that under a minimal first-moment condition, empirical barycentre sequences indexed by sample size are almost certainly relatively compact, with accumulation points…  More
Submitted 24 May, 2023; originally announced May 2023.
MSC Class: 60B12; 60G57; 60H25; 62R20; 62R30

Cited by 3 Related articles All 3 versions 

2023


arXiv:2305.14746  [pdf, other]  stat.CO  stat.ML
Wasserstein Gaussianization and Efficient Variational Bayes for Robust Bayesian Synthetic Likelihood
Authors: Nhat-Minh Nguyen, Minh-Ngoc Tran, Christopher Drovandi, David Nott
Abstract: The Bayesian Synthetic Likelihood (BSL) method is a widely-used tool for likelihood-free Bayesian inference. This method assumes that some summary statistics are normally distributed, which can be incorrect in many applications. We propose a transformation, called the Wasserstein Gaussianization transformation, that uses a Wasserstein gradient flow to approximately transform the distribution of th…  More
Submitted 24 May, 2023; originally announced May 2023.

Related articles All 3 versions 

Research on abnormal detection of gas load based on LSTM-WGAN

X Xu, X Ai, Z Meng - International Conference on Computer …, 2023 - spiedigitallibrary.org

… The anomaly detection model based on LSTM-WGAN proposed in this paper is shown in Figure 2. The LSTM-WGAN model is divided into two stages of training and testing. …

All 2 versions

[PDF] ieee.org

 A continual encrypted traffic classification algorithm based on WGAN

X Ma, W Zhu, Y Jin, Y Gao - Third International Seminar on …, 2023 - spiedigitallibrary.org

… In this paper, we propose a continual encrypted traffic classification method based on WGAN. We use WGAN to train a separate generator for each class of encrypted traffic. The …

Related articles All 3 versions

[HTML] hanspub.org

[HTML] 基于 WGAN 状态重构的智能电网虚假数据注入攻击检测

张笑, 孙越 - Modeling and Simulation, 2023 - hanspub.org

… problem, a state-reconstruction method based on WGAN (Wasserstein generative adversarial networks) is proposed … Then, a Wasserstein generative adversarial network is used to reconstruct the missing variables; the WGAN measures the generated distribution via the Wasserstein distance …

[Chinese: Detection of False Data Injection Attacks in Smart Grid Based on WGAN State Reconstruction]


 2023 see 2022

 Conditional Wasserstein Generator

National Institutes of Health (.gov)

https://pubmed.ncbi.nlm.nih.gov › ...

by YG Kim · Cited by 1 — 2023 Jun;45(6):7208-7219. doi: 10.1109/TPAMI.2022.3220965. ... statistical distances: f-divergence, Wasse

<–—2023———2023———900—



[HTML] copernicus.org

[HTML] Short communication: The Wasserstein distance as a dissimilarity metric for comparing detrital age spectra and …

A Lipp, P Vermeesch - Geochronology, 2023 - gchron.copernicus.org

… In the following sections, we first introduce the Wasserstein … We then proceed to compare the Wasserstein distance to the … real datasets using both the Wasserstein and KS distances. We …

[PDF] copernicus.org

[PDF] The Wasserstein distance as a dissimilarity metric for comparing detrital age spectra, and other geological distributions

A Lipp, P Vermeesch - egusphere.copernicus.org

… In the following sections, we first introduce the Wasserstein … We then proceed to compare the Wasserstein distance to the … real datasets using both the Wasserstein and KS distances. We …

All 2 versions 



[PDF] arxiv.org

WSFE: Wasserstein Sub-graph Feature Encoder for Effective User Segmentation in Collaborative Filtering

Y Chen, Y Zhang, M Yang, Z Song, C Ma… - arXiv preprint arXiv …, 2023 - arxiv.org

… Then WSFE explicitly captures the distribution distances with Wasserstein metrics from the optimal transport theory [19, 32, 35, 42]. Consequently, the encoded user representations can …

 Cited by 1 All 2 versions


Wasserstein distance-based distributionally robust parallel-machine scheduling

Y Yin, Z Luo, D Wang, TCE Cheng - Omega, 2023 - Elsevier

Wasserstein distance-based DR parallel-machine scheduling, where the ambiguity set is defined as a Wasserstein … the distributions arising from the Wasserstein ambiguity set, subject …


[PDF] arxiv.org

A Fused Gromov-Wasserstein Framework for Unsupervised Knowledge Graph Entity Alignment

J Tang, K Zhao, J Li - arXiv preprint arXiv:2305.06574, 2023 - arxiv.org

… Instead of following the “embedding-learning-and-matching” paradigm, we invoke the Fused Gromov-Wasserstein distance to realize a more explicit and comprehensive comparison of …

All 2 versions
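
Background sketch, not the authors' alignment pipeline: a plain (non-fused) Gromov-Wasserstein coupling between two metric-measure spaces can be computed with the POT library's ot.gromov module (assumed installed); the data below are synthetic.

import numpy as np
import ot  # POT: Python Optimal Transport (assumed available)

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))          # synthetic node embeddings, first space
Y = rng.normal(size=(25, 3))          # synthetic node embeddings, second space
C1 = ot.dist(X, X)                    # intra-space cost matrices
C2 = ot.dist(Y, Y)
p = ot.unif(len(X))                   # uniform node weights
q = ot.unif(len(Y))

T = ot.gromov.gromov_wasserstein(C1, C2, p, q, loss_fun='square_loss')
print(T.shape)                        # (20, 25) soft correspondence between nodes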


[PDF] arxiv.org

A Lagrangian approach to totally dissipative evolutions in Wasserstein spaces

G Cavagnari, G Savaré, GE Sodini - arXiv preprint arXiv:2305.05211, 2023 - arxiv.org

… After a quick review in Section 2 of the main tools on Wasserstein spaces used in the sequel, we summarize in Subsection 2.2 the notation and the results concerning Multivalued …

Cited by 1 All 3 versions 


2023



 [PDF] openreview.net

Deep Generative Wasserstein Gradient Flows

A Heng, AF Ansari, H Soh - 2023 - openreview.net

… equipped with the 2-Wasserstein metric (P2(Ω), W2). Given a functional F : P2(Ω) → R in the 2-Wasserstein space, the gradient … We call such curves Wasserstein gradient flows (WGF). …

Cited by 1 


[PDF] openreview.net

Wasserstein Fair Autoencoders

S Lee, H Lee, JH Won - 2023 - openreview.net

… In this paper, we demonstrate that Wasserstein autoencoders (WAEs) are highly flexible in embracing structural constraints. Well-known extensions of VAEs for this purpose are …



[HTML] mdpi.com

[HTML] A Modified Gradient Method for Distributionally Robust Logistic Regression over the Wasserstein Ball

L Wang, B Zhou - Mathematics, 2023 - mdpi.com

… problem with the Wasserstein metric into decomposable semi… developed to address a Wasserstein distributionally robust LR. … CG method for the Wasserstein distributionally robust LR …

 All 3 versions 



[PDF] ieee.org

WOMT: Wasserstein Distribution based minimization of False Positives in Breast Tumor classification using Deep Learning

L Lakshmi, KDS Devi, KAN Reddy, SK Grandhi… - IEEE …, 2023 - ieeexplore.ieee.org

… of the Lr-Wasserstein distance between µ1 and µ2 on an image space d. This section includes on how Wasserstein cumulative distributions are arrived upon from point masses. …

Related articles All 2 versions

 

An Intelligent Method for Early Motor Bearing Fault Diagnosis Based on Wasserstein Distance Generative Adversarial Networks Meta Learning

P Luo, Z Yin, D Yuan, F Gao… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org

… This paper uses Wasserstein distance to judge the feature distribution between real signal and generated signal, that is, it solves the extreme value problem between the G and D. The …

<–—2023———2023———910—



Brain Tumour Segmentation Using Wasserstein Generative Adversarial Networks (WGANs)

S Nyamathulla, CS Meghana… - 2023 7th International …, 2023 - ieeexplore.ieee.org

… In the previous few years, the India has recorded an enormous number of cases of brain tumours, many of which resulted in death [1]. As it involves life and death, numerous studies …


Indoor Localization Advancement Using Wasserstein Generative Adversarial Networks

S Kumar, S Majumder… - 2023 IEEE 8th …, 2023 - ieeexplore.ieee.org

… Wasserstein distance between the real and generated samples. To reduce the Wasserstein … The Wasserstein distance between the real and generated samples is thereby minimized. …

Related articles 


arXiv:2306.02420  [pdf, other]  cs.LG  cs.AI  math.NA  math.OC
Complexity of Block Coordinate Descent with Proximal Regularization and Applications to Wasserstein CP-dictionary Learning
Authors: Dohyun Kwon, Hanbaek Lyu
Abstract: We consider the block coordinate descent methods of Gauss-Seidel type with proximal regularization (BCD-PR), which is a classical method of minimizing general nonconvex objectives under constraints that has a wide range of practical applications. We theoretically establish the worst-case complexity bound for this algorithm. Namely, we show that for general nonconvex smooth objectives with block-wi…  More
Submitted 4 June, 2023; originally announced June 2023.
Comments: Proceedings of the 40th International Conference on Machine Learning


arXiv:2306.00560  [pdf, other]  cs.LG  stat.ML
Hinge-Wasserstein: Mitigating Overconfidence in Regression by Classification
Authors: Ziliang Xiong, Abdelrahman Eldesokey, Joakim Johnander, Bastian Wandt, Per-Erik Forssen
Abstract: Modern deep neural networks are prone to being overconfident despite their drastically improved performance. In ambiguous or even unpredictable real-world scenarios, this overconfidence can pose a major risk to the safety of applications. For regression tasks, the regression-by-classification approach has the potential to alleviate these ambiguities by instead predicting a discrete probability den…  More

Submitted 1 June, 2023; originally announced June 2023.

All 3 versions

arXiv:2306.00191  [pdf, other]  math.NA
Parameterized Wasserstein Hamiltonian Flow
Authors: Hao Wu, Shu Liu, Xiaojing Ye, Haomin Zhou
Abstract: In this work, we propose a numerical method to compute the Wasserstein Hamiltonian flow (WHF), which is a Hamiltonian system on the probability density manifold. Many well-known PDE systems can be reformulated as WHFs. We use parameterized function as push-forward map to characterize the solution of WHF, and convert the PDE to a finite-dimensional ODE system, which is a Hamiltonian system in the p…  More
Submitted 31 May, 2023; originally announced June 2023.
Comments: We welcome any comments and suggestions

Cited by 2 Related articles All 2 versions 

2023


arXiv:2306.00182  [pdf, other]  math.OC  math.ST
Entropic Gromov-Wasserstein Distances: Stability, Algorithms, and Distributional Limits
Authors: Gabriel Rioux, Ziv Goldfeld, Kengo Kato
Abstract: The Gromov-Wasserstein (GW) distance quantifies discrepancy between metric measure spaces, but suffers from computational hardness. The entropic Gromov-Wasserstein (EGW) distance serves as a computationally efficient proxy for the GW distance. Recently, it was shown that the quadratic GW and EGW distances admit variational forms that tie them to the well-understood optimal transport (OT) and entro…  More
Submitted 31 May, 2023; originally announced June 2023.
Comments: 66 pages, 3 figures

Cited by 9 Related articles All 2 versions 

arXiv:2305.19738  [pdf, other]  stat.ML  cs.LG  cs.SI  eess.SP
Bures-Wasserstein Means of Graphs
Authors: Isabel Haasler, Pascal Frossard
Abstract: Finding the mean of sampled data is a fundamental task in machine learning and statistics. However, in cases where the data samples are graph objects, defining a mean is an inherently difficult task. We propose a novel framework for defining a graph mean via embeddings in the space of smooth graph signal distributions, where graph similarity can be measured using the Wasserstein metric. By finding…  More
Submitted 31 May, 2023; originally announced May 2023.

Related articles All 2 versions 

arXiv:2305.19371  [pdf, ps, other]  math-ph
On the Wasserstein distance and Dobrushin's uniqueness theorem
Authors: Tony C. Dorlas, Baptiste Savoie
Abstract: In this paper, we revisit Dobrushin's uniqueness theorem for Gibbs measures of lattice systems of interacting particles at thermal equilibrium. In a nutshell, Dobrushin's uniqueness theorem provides a practical way to derive sufficient conditions on the inverse-temperature and model parameters assuring uniqueness of Gibbs measures by reducing the uniqueness problem to a suitable estimate of the Wa…  More
Submitted 30 May, 2023; originally announced May 2023.
Comments: 47 pages
MSC Class: 2010: 82B05; 82B10; 82B20; 82B26 (Primary); 28C20; 46G10; 60B05; 60B10 (Secondary)

Related articles All 6 versions 

arXiv:2305.17555  [pdf, other]  cs.CV
Diffeomorphic Deformation via Sliced Wasserstein Distance Optimization for Cortical Surface Reconstruction
Authors: Tung Le, Khai Nguyen, Shanlin Sun, Kun Han, Nhat Ho, Xiaohui Xie
Abstract: Mesh deformation is a core task for 3D mesh reconstruction, but defining an efficient discrepancy between predicted and target meshes remains an open problem. A prevalent approach in current deep learning is the set-based approach which measures the discrepancy between two surfaces by comparing two randomly sampled point-clouds from the two meshes with Chamfer pseudo-distance. Nevertheless, the se…  More
Submitted 27 May, 2023; originally announced May 2023.


Pagès, Gilles; Panloup, Fabien

Unadjusted Langevin algorithm with multiplicative noise: total variation and Wasserstein bounds. (English) Zbl 07692273

Ann. Appl. Probab. 33, No. 1, 726-779 (2023).

MSC:  65C05 37M25 60F05 62L10 65C40

PDF BibTeX XML Cite

Full Text: DOI  OpenURL 

Cited by 6 Related articles All 14 versions

<–—2023———2023———920—


2023 see 2021

Wang, Feng-Yu

Convergence in Wasserstein distance for empirical measures of semilinear SPDEs. (English) Zbl 07692254

Ann. Appl. Probab. 33, No. 1, 70-84 (2023).

MSC:  60H15 60F99

PDF BibTeX XML Cite

Full Text: DOI

 


 

2023 see 2022

Li, Huaiqian; Wu, Bingyao

Wasserstein convergence rates for empirical measures of subordinated processes on noncompact manifolds. (English) Zbl 07692077

J. Theor. Probab. 36, No. 2, 1243-1268 (2023).

MSC:  60D05 58J65 60J60 60J76

PDF BibTeX XML Cite

Full Text: DOI 


Deo, Neil; Randrianarisoa, Thibault

On adaptive confidence sets for the Wasserstein distances. (English) Zbl 07691575

Bernoulli 29, No. 3, 2119-2141 (2023).

MSC:  62Gxx 60Fxx 62Cxx

PDF BibTeX XML Cite

Full Text: DOI 



2023 see 2021

Bachoc, François; Fathi, Max

Bounds in L1 Wasserstein distance on the normal approximation of general M-estimators. (English) Zbl 07690328

Electron. J. Stat. 17, No. 1, 1457-1491 (2023).

MSC:  62-XX

PDF BibTeX XML Cite

Full Text: DOI 


2023.   OpenUR

Cui, Jianbo; Liu, Shu; Zhou, Haomin

Wasserstein Hamiltonian flow with common noise on graph. (English) Zbl 07689740

SIAM J. Appl. Math. 83, No. 2, 484-509 (2023).

MSC:  35R02 35Q55 49Q22

PDF BibTeX XML Cite

Full Text: DOI 



Wasserstein GAN Synthesis for Time Series with Complex Temporal Dynamics: Frugal Architectures and Arbitrary Sample-Size Generation

T Beroud, P Abry, Y Malevergne… - ICASSP 2023-2023 …, 2023 - ieeexplore.ieee.org

… a contribution towards sustainable Artificial Intelligence. … into researches towards frugal AI, in contradistinction against …


[PDF] arxiv.org

Linearized Wasserstein dimensionality reduction with approximation guarantees

A Cloninger, K Hamm, V Khurana… - arXiv preprint arXiv …, 2023 - arxiv.org

… With these approximation schemes at hand, we define the empirical linearized Wasserstein-… of the r…
All 2 versions



Parameterized Wasserstein Hamiltonian Flow
Authors:Wu, Hao (Creator), Liu, Shu (Creator), Ye, Xiaojing (Creator), Zhou, Haomin (Creator)
Show more
Summary:In this work, we propose a numerical method to compute the Wasserstein Hamiltonian flow (WHF), which is a Hamiltonian system on the probability density manifold. Many well-known PDE systems can be reformulated as WHFs. We use parameterized function as push-forward map to characterize the solution of WHF, and convert the PDE to a finite-dimensional ODE system, which is a Hamiltonian system in the phase space of the parameter manifold. We establish error analysis results for the continuous time approximation scheme in Wasserstein metric. For the numerical implementation, we use neural networks as push-forward maps. We apply an effective symplectic scheme to solve the derived Hamiltonian ODE system so that the method preserves some important quantities such as total energy. The computation is done by fully deterministic symplectic integrator without any neural network training. Thus, our method does not involve direct optimization over network parameters and hence can avoid the error introduced by stochastic gradient descent (SGD) methods, which is usually hard to quantify and measure. The proposed algorithm is a sampling-based approach that scales well to higher dimensional problems. In addition, the method also provides an alternative connection between the Lagrangian and Eulerian perspectives of the original WHF through the parameterized ODE dynamics
Show more
Downloadable Archival Material, 2023-05-31
Undefined
Publisher:2023-05-31


Doubly Regularized Entropic Wasserstein Barycenters
Author:Chizat, Lénaïc (Creator)
Summary:We study a general formulation of regularized Wasserstein barycenters that enjoys favorable regularity, approximation, stability and (grid-free) optimization properties. This barycenter is defined as the unique probability measure that minimizes the sum of entropic optimal transport (EOT) costs with respect to a family of given probability measures, plus an entropy term. We denote it $(\lambda,\tau)$-barycenter, where $\lambda$ is the inner regularization strength and $\tau$ the outer one. This formulation recovers several previously proposed EOT barycenters for various choices of $\lambda,\tau \geq 0$ and generalizes them. First, in spite of -- and in fact owing to -- being \emph{doubly} regularized, we show that our formulation is debiased for $\tau=\lambda/2$: the suboptimality in the (unregularized) Wasserstein barycenter objective is, for smooth densities, of the order of the strength $\lambda^2$ of entropic regularization, instead of $\max\{\lambda,\tau\}$ in general. We discuss this phenomenon for isotropic Gaussians where all $(\lambda,\tau)$-barycenters have closed form. Second, we show that for $\lambda,\tau>0$, this barycenter has a smooth density and is strongly stable under perturbation of the marginals. In particular, it can be estimated efficiently: given $n$ samples from each of the probability measures, it converges in relative entropy to the population barycenter at a rate $n^{-1/2}$. And finally, this formulation lends itself naturally to a grid-free optimization algorithm: we propose a simple \emph{noisy particle gradient descent} which, in the mean-field limit, converges globally at an exponential rate to the barycenter
Show more
Downloadable Archival Material, 2023-03-21
Undefined
Publisher:2023-03-21
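
Related illustration, not the (λ, τ)-barycenter of the paper: a plain entropy-regularized Wasserstein barycenter of two histograms can be obtained with Sinkhorn iterations via the POT library (assumed installed), as in this sketch with synthetic 1D histograms.

import numpy as np
import ot  # POT: Python Optimal Transport (assumed available)

n = 64
x = np.linspace(0.0, 1.0, n)
a1 = np.exp(-((x - 0.25) ** 2) / 0.005); a1 /= a1.sum()   # synthetic histogram 1
a2 = np.exp(-((x - 0.75) ** 2) / 0.005); a2 /= a2.sum()   # synthetic histogram 2
A = np.vstack([a1, a2]).T           # histograms as columns
M = ot.dist(x.reshape(-1, 1))       # squared Euclidean ground cost
M /= M.max()

# Entropy-regularized Wasserstein barycenter via Sinkhorn iterations.
bary = ot.bregman.barycenter(A, M, reg=1e-2, weights=np.array([0.5, 0.5]))
print(bary.sum())                   # approximately 1.0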

<–—2023———2023———930—



Peer-reviewed
On the exotic isometry flow of the quadratic Wasserstein space over the real line
Show more
Authors:György Pál Gehér, Tamás Titkos, Dániel Virosztek
Summary:Kloeckner discovered that the quadratic Wasserstein space over the real line (denoted by
Show more
Article
Publication:Linear Algebra and Its Applications
 


Wasserstein Projection Pursuit of Non-Gaussian Signals
Authors:Mukherjee, Satyaki (Creator), Mukherjee, Soumendu Sundar (Creator), Ghoshdastidar, Debarghya (Creator)
Show more
Summary:We consider the general dimensionality reduction problem of locating in a high-dimensional data cloud, a $k$-dimensional non-Gaussian subspace of interesting features. We use a projection pursuit approach -- we search for mutually orthogonal unit directions which maximise the 2-Wasserstein distance of the empirical distribution of data-projections along these directions from a standard Gaussian. Under a generative model, where there is a underlying (unknown) low-dimensional non-Gaussian subspace, we prove rigorous statistical guarantees on the accuracy of approximating this unknown subspace by the directions found by our projection pursuit approach. Our results operate in the regime where the data dimensionality is comparable to the sample size, and thus supplement the recent literature on the non-feasibility of locating interesting directions via projection pursuit in the complementary regime where the data dimensionality is much larger than the sample size
Show more
Downloadable Archival Material, 2023-02-24
Undefined
Publisher:2023-02-24
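
A minimal illustration of the objective described above, not the paper's estimator: project the data onto a candidate unit direction and score the projected sample against a standard Gaussian with an empirical 2-Wasserstein distance (quantile matching). Everything below is synthetic.

import numpy as np
from scipy.stats import norm

def w2_sq_to_std_gaussian(z):
    # Empirical squared 2-Wasserstein distance between the law of z and N(0, 1),
    # using the 1D order-statistics / quantile representation.
    z = np.sort(z)
    n = len(z)
    q = norm.ppf((np.arange(1, n + 1) - 0.5) / n)
    return float(np.mean((z - q) ** 2))

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 10))
X[:, 0] = rng.laplace(size=2000)                 # hide one non-Gaussian coordinate
u = np.zeros(10); u[0] = 1.0                     # candidate unit direction
print(w2_sq_to_std_gaussian(X @ u))              # relatively large: non-Gaussian direction
print(w2_sq_to_std_gaussian(X @ np.eye(10)[1]))  # near zero: Gaussian direction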

Age-Invariant Face Embedding using the Wasserstein Distance
Authors:Dahan, Eran (Creator), Keller, Yosi (Creator)
Summary:In this work, we study face verification in datasets where images of the same individuals exhibit significant age differences. This poses a major challenge for current face recognition and verification techniques. To address this issue, we propose a novel approach that utilizes multitask learning and a Wasserstein distance discriminator to disentangle age and identity embeddings of facial images. Our approach employs multitask learning with a Wasserstein distance discriminator that minimizes the mutual information between the age and identity embeddings by minimizing the Jensen-Shannon divergence. This improves the encoding of age and identity information in face images and enhances the performance of face verification in age-variant datasets. We evaluate the effectiveness of our approach using multiple age-variant face datasets and demonstrate its superiority over state-of-the-art methods in terms of face verification accuracy
Show more
Downloadable Archival Material, 2023-05-04
Undefined
Publisher:2023-05-04

The geometry of financial institutions -- Wasserstein clustering of financial data
Show more
Authors:Riess, Lorenz (Creator), Beiglböck, Mathias (Creator), Temme, Johannes (Creator), Wolf, Andreas (Creator), Backhoff, Julio (Creator)
Show more
Summary:The increasing availability of granular and big data on various objects of interest has made it necessary to develop methods for condensing this information into a representative and intelligible map. Financial regulation is a field that exemplifies this need, as regulators require diverse and often highly granular data from financial institutions to monitor and assess their activities. However, processing and analyzing such data can be a daunting task, especially given the challenges of dealing with missing values and identifying clusters based on specific features. To address these challenges, we propose a variant of Lloyd's algorithm that applies to probability distributions and uses generalized Wasserstein barycenters to construct a metric space which represents given data on various objects in condensed form. By applying our method to the financial regulation context, we demonstrate its usefulness in dealing with the specific challenges faced by regulators in this domain. We believe that our approach can also be applied more generally to other fields where large and complex data sets need to be represented in concise form
Show more
Downloadable Archival Material, 2023-05-05
Undefined
Publisher:2023-05-05



Peer-reviewed
Distributionally robust learning-to-rank under the Wasserstein metric
Show more
Authors:Shahabeddin Sotudian, Ruidi Chen, Ioannis Ch. Paschalidis
Summary:Despite their satisfactory performance, most existing listwise Learning-To-Rank (LTR) models do not consider the crucial issue of robustness. A data set can be contaminated in various ways, including human error in labeling or annotation, distributional data shift, and malicious adversaries who wish to degrade the algorithm's performance. It has been shown that Distributionally Robust Optimization (DRO) is resilient against various types of noise and perturbations. To fill this gap, we introduce a new listwise LTR model called Distributionally Robust Multi-output Regression Ranking (DRMRR). Different from existing methods, the scoring function of DRMRR was designed as a multivariate mapping from a feature vector to a vector of deviation scores, which captures local context information and cross-document interactions. In this way, we are able to incorporate the LTR metrics into our model. DRMRR uses a Wasserstein DRO framework to minimize a multi-output loss function under the most adverse distributions in the neighborhood of the empirical data distribution defined by a Wasserstein ball. We present a compact and computationally solvable reformulation of the min-max formulation of DRMRR. Our experiments were conducted on two real-world applications: medical document retrieval and drug response prediction, showing that DRMRR notably outperforms state-of-the-art LTR models. We also conducted an extensive analysis to examine the resilience of DRMRR against various types of noise: Gaussian noise, adversarial perturbations, and label poisoning. Accordingly, DRMRR is not only able to achieve significantly better performance than other baselines, but it can maintain a relatively stable performance as more noise is added to the data
Show more
Article, 2023
Publication:PloS one, 18, 2023, e0283574
Publisher:2023
All 6 versions
 


2023


Computation of Rate-Distortion-Perception Functions With Wasserstein Barycenter
Show more
Authors:Chen, Chunhui (Creator), Niu, Xueyan (Creator), Ye, Wenhao (Creator), Wu, Shitong (Creator), Bai, Bo (Creator), Chen, Weichao (Creator), Lin, Sian-Jheng (Creator)
Show more
Summary:The nascent field of Rate-Distortion-Perception (RDP) theory is seeing a surge of research interest due to the application of machine learning techniques in the area of lossy compression. The information RDP function characterizes the three-way trade-off between description rate, average distortion, and perceptual quality measured by discrepancy between probability distributions. However, computing RDP functions has been a challenge due to the introduction of the perceptual constraint, and existing research often resorts to data-driven methods. In this paper, we show that the information RDP function can be transformed into a Wasserstein Barycenter problem. The nonstrictly convexity brought by the perceptual constraint can be regularized by an entropy regularization term. We prove that the entropy regularized model converges to the original problem. Furthermore, we propose an alternating iteration method based on the Sinkhorn algorithm to numerically solve the regularized optimization problem. Experimental results demonstrate the efficiency and accuracy of the proposed algorithm
Show more
Downloadable Archival Material, 2023-04-27
Undefined
Publisher:2023-04-27

Cited by 3 Related articles All 4 versions



Exact Generalization Guarantees for (Regularized) Wasserstein Distributionally Robust Models
Show more
Authors:Azizian, Waïss (Creator), Iutzeler, Franck (Creator), Malick, Jérôme (Creator)
Show more
Summary:Wasserstein distributionally robust estimators have emerged as powerful models for prediction and decision-making under uncertainty. These estimators provide attractive generalization guarantees: the robust objective obtained from the training distribution is an exact upper bound on the true risk with high probability. However, existing guarantees either suffer from the curse of dimensionality, are restricted to specific settings, or lead to spurious error terms. In this paper, we show that these generalization guarantees actually hold on general classes of models, do not suffer from the curse of dimensionality, and can even cover distribution shifts at testing. We also prove that these results carry over to the newly-introduced regularized versions of Wasserstein distributionally robust problems
Show more
Downloadable Archival Material, 2023-05-26
Undefined
Publisher:2023-05-26

Exact Generalization Guarantees for (Regularized) Wasserstein Distributionally Robust Models

by Azizian, Waïss; Iutzeler, Franck; Malick, Jérôme

arXiv.org, 05/2023

Wasserstein distributionally robust estimators have emerged as powerful models for prediction and decision-making under uncertainty. These estimators provide...

Paper  Full Text Online

Open Access

Peer-reviewed
On a linear fused Gromov-Wasserstein distance for graph structured data
Show more
Authors:Dai Hai Nguyen, Koji Tsuda
Summary:We present a framework for embedding graph structured data into a vector space, taking into account node features and structures of graphs into the optimal transport (OT) problem. Then we propose a novel distance between two graphs, named LinearFGW, defined as the Euclidean distance between their embeddings. The advantages of the proposed distance are twofold: 1) it takes into account node features and structures of graphs for measuring the dissimilarity between graphs in a kernel-based framework, 2) it is more efficient for computing a kernel matrix than pairwise OT-based distances, particularly fused Gromov-Wasserstein [1], making it possible to deal with large-scale data sets. Our theoretical analysis and experimental results demonstrate that our proposed distance leads to an increase in performance compared to the existing state-of-the-art graph distances when evaluated on graph classification and clustering tasks
Show more
Article
Publication:Pattern Recognition, 138, June 2023
 

2023 see arXiv
Diffeomorphic Deformation via Sliced Wasserstein Distance Optimization for Cortical Surface Reconstruction
Show more
Authors:Le, Tung (Creator), Nguyen, Khai (Creator), Sun, Shanlin (Creator), Han, Kun (Creator), Ho, Nhat (Creator), Xie, Xiaohui (Creator)
Show more
Summary:Mesh deformation is a core task for 3D mesh reconstruction, but defining an efficient discrepancy between predicted and target meshes remains an open problem. A prevalent approach in current deep learning is the set-based approach which measures the discrepancy between two surfaces by comparing two randomly sampled point-clouds from the two meshes with Chamfer pseudo-distance. Nevertheless, the set-based approach still has limitations such as lacking a theoretical guarantee for choosing the number of points in sampled point-clouds, and the pseudo-metricity and the quadratic complexity of the Chamfer divergence. To address these issues, we propose a novel metric for learning mesh deformation. The metric is defined by sliced Wasserstein distance on meshes represented as probability measures that generalize the set-based approach. By leveraging probability measure space, we gain flexibility in encoding meshes using diverse forms of probability measures, such as continuous, empirical, and discrete measures via \textit{varifold} representation. After having encoded probability measures, we can compare meshes by using the sliced Wasserstein distance which is an effective optimal transport distance with linear computational complexity and can provide a fast statistical rate for approximating the surface of meshes. Furthermore, we employ a neural ordinary differential equation (ODE) to deform the input surface into the target shape by modeling the trajectories of the points on the surface. Our experiments on cortical surface reconstruction demonstrate that our approach surpasses other competing methods in multiple datasets and metrics
Show more
Downloadable Archival Material, 2023-05-27
Undefined
Publisher:2023-05-27
Diffeomorphic Deformation via Sliced Wasserstein Distance Optimization for Cortical Surface Reconstruction

by Le, Tung; Nguyen, Khai; Sun, Shanlin ; More...

05/2023

Mesh deformation is a core task for 3D mesh reconstruction, but defining an efficient discrepancy between predicted and target meshes remains an open problem....

Journal Article  Full Text Online
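
For readers unfamiliar with the sliced Wasserstein distance used above, a minimal Monte Carlo estimator for two equally sized point clouds looks as follows (illustrative NumPy sketch, not the paper's mesh/varifold construction).

import numpy as np

def sliced_wasserstein(X, Y, n_proj=200, p=2, seed=0):
    # Average the 1D p-Wasserstein distances of random one-dimensional projections.
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)
        xp = np.sort(X @ theta)                # sorting gives the optimal 1D coupling
        yp = np.sort(Y @ theta)
        total += np.mean(np.abs(xp - yp) ** p)
    return (total / n_proj) ** (1.0 / p)

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))                  # synthetic stand-in for mesh samples
Y = rng.normal(size=(500, 3)) + 0.5
print(sliced_wasserstein(X, Y))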



Wasserstein PAC-Bayes Learning: Exploiting Optimisation Guarantees to Explain Generalisation
Show more
Authors:Haddouche, Maxime (Creator), Guedj, Benjamin (Creator)
Summary:PAC-Bayes learning is an established framework to both assess the generalisation ability of learning algorithms, and design new learning algorithm by exploiting generalisation bounds as training objectives. Most of the exisiting bounds involve a \emph{Kullback-Leibler} (KL) divergence, which fails to capture the geometric properties of the loss function which are often useful in optimisation. We address this by extending the emerging \emph{Wasserstein PAC-Bayes} theory. We develop new PAC-Bayes bounds with Wasserstein distances replacing the usual KL, and demonstrate that sound optimisation guarantees translate to good generalisation abilities. In particular we provide generalisation bounds for the \emph{Bures-Wasserstein SGD} by exploiting its optimisation properties
Show more
Downloadable Archival Material, 2023-04-14
Undefined
Publisher:2023-04-14

<–—2023———2023———940—



Self-Attention Amortized Distributional Projection Optimization for Sliced Wasserstein Point-Cloud Reconstruction
Show more
Authors:Nguyen, Khai (Creator), Nguyen, Dang (Creator), Ho, Nhat (Creator)
Show more
Summary:Max sliced Wasserstein (Max-SW) distance has been widely known as a solution for less discriminative projections of sliced Wasserstein (SW) distance. In applications that have various independent pairs of probability measures, amortized projection optimization is utilized to predict the ``max" projecting directions given two input measures instead of using projected gradient ascent multiple times. Despite being efficient, Max-SW and its amortized version cannot guarantee metricity property due to the sub-optimality of the projected gradient ascent and the amortization gap. Therefore, we propose to replace Max-SW with distributional sliced Wasserstein distance with von Mises-Fisher (vMF) projecting distribution (v-DSW). Since v-DSW is a metric with any non-degenerate vMF distribution, its amortized version can guarantee the metricity when performing amortization. Furthermore, current amortized models are not permutation invariant and symmetric. To address the issue, we design amortized models based on self-attention architecture. In particular, we adopt efficient self-attention architectures to make the computation linear in the number of supports. With the two improvements, we derive self-attention amortized distributional projection optimization and show its appealing performance in point-cloud reconstruction and its downstream applications
Show more
Downloadable Archival Material, 2023-01-11
Undefined
Publisher:2023-01-11
  Cited by 13
 Related articles All 11 versions 

Geological model automatic reconstruction based on conditioning Wasserstein generative adversarial network with gradient penalty
Show more
Authors:Wenyao Fan, Gang Liu, Qiyu Chen, Zhesi Cui, Zixiao Yang, Qianhong Huang, Xuechao Wu
Show mo
Summary:Abstract: Due to the structure complexity and heterogeneity of the geological models, it is difficult for traditional methods to characterize the corresponding anisotropic and structural features. Therefore, one of the generative models called Generative Adversarial Network (GAN) are introduced to the geological modeling fields, which describes the complex structural features effectively according to fitting the high-order statistical characteristics. However, the traditional GAN might cause gradient explosion or vanishment, insufficient model diversity, resulting the network cannot capture the spatial pattern and characteristics of geological models so that the reconstruction always has a bad performance. For this issue, this paper introduced the Wasserstein Generative Adversarial Network with Gradient Penalty (WGAN-GP), which can better measure the distribution discrepancy between the generative data and real data and provide a meaningful gradient for the training process. In addition, the gradient penalty term can make the objective function conform with Lipschitz constraints, which ensures the training process more stable and the correlation between the generative and real samples. Meanwhile, the conditioning loss function can make the reconstruction conform with the conditioning constraints. The 2D and 3D categorical facies model were introduced to perform experimental verification. The results show that the CWGAN-GP ensure the conditioning constraints and the reconstruction diversity simultaneously. In addition, for the network finished training, through inputting different kinds of conditioning data, a variety of stochastic simulation results can be generated, thereby realizing rapid and automatic geological model reconstruction
Show more
Article, 2023
Publication:Earth Science Informatics, 20230427, 1
Publisher:2023

Cited by 2 Related articles All 3 versions
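
Generic sketch of the gradient-penalty term that gives WGAN-GP its name (PyTorch assumed; this is a toy illustration, not the conditioned geological model of the entry above).

import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    # WGAN-GP: push the critic's gradient norm towards 1 on points interpolated
    # between real and generated samples (a surrogate for the Lipschitz constraint).
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads = torch.autograd.grad(outputs=scores, inputs=interp,
                                grad_outputs=torch.ones_like(scores),
                                create_graph=True)[0]
    grad_norm = grads.view(grads.size(0), -1).norm(2, dim=1)
    return lambda_gp * ((grad_norm - 1.0) ** 2).mean()

critic = torch.nn.Linear(16, 1)                # toy critic on flat vectors
real = torch.randn(8, 16)
fake = torch.randn(8, 16)
print(gradient_penalty(critic, real, fake))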
 

2023 see arXiv Bbl
Complexity of Block Coordinate Descent with Proximal Regularization and Applications to Wasserstein CP-dictionary Learning
Show more
Authors:Kwon, Dohyun (Creator), Lyu, Hanbaek (Creator)
Summary:We consider the block coordinate descent methods of Gauss-Seidel type with proximal regularization (BCD-PR), which is a classical method of minimizing general nonconvex objectives under constraints that has a wide range of practical applications. We theoretically establish the worst-case complexity bound for this algorithm. Namely, we show that for general nonconvex smooth objectives with block-wise constraints, the classical BCD-PR algorithm converges to an epsilon-stationary point within O(1/epsilon) iterations. Under a mild condition, this result still holds even if the algorithm is executed inexactly in each step. As an application, we propose a provable and efficient algorithm for `Wasserstein CP-dictionary learning', which seeks a set of elementary probability distributions that can well-approximate a given set of d-dimensional joint probability distributions. Our algorithm is a version of BCD-PR that operates in the dual space, where the primal problem is regularized both entropically and proximally
Show more
Downloadable Archival Material, 2023-06-04
Undefined
Publisher:2023-06-04

 

Wasserstein Auto-encoded MDPs: Formal Verification of Efficiently Distilled RL Policies with Many-sided Guarantees
Authors:Delgrange, Florent (Creator), Nowé, Ann (Creator), Pérez, Guillermo A. (Creator)
Summary:Although deep reinforcement learning (DRL) has many success stories, the large-scale deployment of policies learned through these advanced techniques in safety-critical scenarios is hindered by their lack of formal guarantees. Variational Markov Decision Processes (VAE-MDPs) are discrete latent space models that provide a reliable framework for distilling formally verifiable controllers from any RL policy. While the related guarantees address relevant practical aspects such as the satisfaction of performance and safety properties, the VAE approach suffers from several learning flaws (posterior collapse, slow learning speed, poor dynamics estimates), primarily due to the absence of abstraction and representation guarantees to support latent optimization. We introduce the Wasserstein auto-encoded MDP (WAE-MDP), a latent space model that fixes those issues by minimizing a penalized form of the optimal transport between the behaviors of the agent executing the original policy and the distilled policy, for which the formal guarantees apply. Our approach yields bisimulation guarantees while learning the distilled policy, allowing concrete optimization of the abstraction and representation model quality. Our experiments show that, besides distilling policies up to 10 times faster, the latent model quality is indeed better in general. Moreover, we present experiments from a simple time-to-failure verification algorithm on the latent space. The fact that our approach enables such simple verification techniques highlights its applicability
Downloadable Archival Material, 2023-03-22
Undefined
Publisher:2023-03-22


2023 see 2021. Peer-reviewed
Stochastic Wasserstein Hamiltonian Flows
Authors: Jianbo Cui, Shu Liu, Haomin Zhou
Summary:Abstract: In this paper, we study the stochastic Hamiltonian flow in the Wasserstein manifold, the probability density space equipped with the L2-Wasserstein metric tensor, via the Wong–Zakai approximation. We begin our investigation by showing that the stochastic Euler–Lagrange equation, regardless of whether it is deduced from the variational principle or from particle dynamics, can be interpreted as a stochastic kinetic Hamiltonian flow in the Wasserstein manifold. We further propose a novel variational formulation to derive more general stochastic Wasserstein Hamiltonian flows, and demonstrate that this new formulation is applicable to various systems including the stochastic Schrödinger equation, the Schrödinger equation with random dispersion, and the Schrödinger bridge problem with common noise
Article, 2023
Publication:Journal of Dynamics and Differential Equations, 20230418, 1
Publisher:2023


2023



High order spatial discretization for variational time implicit schemes: Wasserstein gradient flows and reaction-diffusion systems
Authors:Fu, Guosheng (Creator), Osher, Stanley (Creator), Li, Wuchen (Creator)
Summary:We design and compute first-order implicit-in-time variational schemes with high-order spatial discretization for initial value gradient flows in generalized optimal transport metric spaces. We first review some examples of gradient flows in generalized optimal transport spaces from the Onsager principle. We then use a one-step time relaxation optimization problem for time-implicit schemes, namely generalized Jordan-Kinderlehrer-Otto schemes. Their minimizing systems satisfy implicit-in-time schemes for initial value gradient flows with first-order time accuracy. We adopt the first-order optimization scheme ALG2 (Augmented Lagrangian method) and high-order finite element methods in spatial discretization to compute the one-step optimization problem. This allows us to derive the implicit-in-time update of initial value gradient flows iteratively. We remark that the iteration in ALG2 has a simple-to-implement point-wise update based on optimal transport and Onsager's activation functions. The proposed method is unconditionally stable for convex cases. Numerical examples are presented to demonstrate the effectiveness of the methods in two-dimensional PDEs, including Wasserstein gradient flows, Fisher--Kolmogorov-Petrovskii-Piskunov equation, and two and four species reversible reaction-diffusion systems
Downloadable Archival Material, 2023-03-15
Undefined
Publisher:2023-03-15
Cited by 1
 All 4 versions 


Peer-reviewed
GA-ENs: A novel drug-target interactions prediction method by incorporating prior Knowledge Graph into dual Wasserstein Generative Adversarial Network with gradient penalty
Authors: Guodong Li, Weicheng Sun, Jinsheng Xu, Lun Hu, Weihan Zhang, Ping Zhang
Summary:Bipartite graph-based drug-target interactions (DTIs) prediction methods are commonly limited by the sparse structure of the graph, resulting in acquiring suboptimal node feature representations. In addition, these defective node features will also interfere with the representation quality of corresponding edge features. To alleviate the sparsity of bipartite graph and get better node representation, according to the prior Knowledge Graph (KG), we developed a novel prediction model based on Variational Graph Auto-Encoder (VGAE) combined with our proposed dual Wasserstein Generative Adversarial Network with gradient penalty strategy (dual-WGAN-GP) for generating edge information and augmenting their representations. Specifically, GA-ENs first utilized dual-WGAN-GP to fill possible edges by a prior KG containing various molecular associations knowledge when constructing a bipartite graph of known DTIs. Moreover, we utilized the KG transfer probability matrix to redefine the drug-drug and target-target similarity matrix, thus constructing the final graph adjacent matrix. Combining graph adjacent matrix with node features, we learn node representations by VGAE and augment them by utilizing dual-WGAN-GP again, thus obtaining final edge representations. Finally, a fully connected network with three layers was designed to predict potential DTIs. Extensive experiment results show that GA-ENs has excellent performance for DTIs prediction and could be a supplement tool for practical DTIs biological screening
Article
Publication:Applied Soft Computing Journal, 139, May 2023

Peer-reviewed
Dual Interactive Wasserstein Generative Adversarial Network optimized with arithmetic optimization algorithm-based job scheduling in cloud-based IoT
Authors: Gunaganti Sravanthi, Nageswara Rao Moparthi
Summary:Abstract: Job scheduling plays a prominent part in cloud computing, and the production schedule of jobs can increase the cloud system’s effectiveness. When serving millions of users at once, cloud computing must provide all user requests with excellent performance and ensure Quality of Service (QoS). A suitable task scheduling algorithm is needed to appropriately and effectively fulfil these requests. Several methods were proposed for job scheduling in cloud computing, but the existing techniques do not provide better efficiency. The Dual Interactive Wasserstein Generative Adversarial Network Optimized with Arithmetic Optimization Algorithm Based Job Scheduling in Cloud-Based Internet of Things is proposed to overcome this issue. Primarily, the data from the Alibaba dataset is preprocessed using the Kernel Co-Relation (KC) method. The preprocessed data is given to the Dual Interactive Wasserstein Generative Adversarial Network (DIWGAN) for task forecasting in the dynamic cloud environment, and it generates scheduled tasks as output. Then the Arithmetic Optimization Algorithm (AOA) is utilized to optimize the weight parameters of the Dual Interactive Wasserstein Generative Adversarial Network. The proposed method precisely predicts the future workload and diminishes extravagant power consumption at cloud data centres. The proposed method is implemented in MATLAB. The proposed method attains lower MSE (Mean Square Error), RMSE (Root Mean Square Error), MAPE (Mean Squared Prediction Error), MAE (Mean Absolute Error), and higher results without optimization algorithm before and after normalization compared to the current approaches
Article, 2023
Publication:Cluster Computing : The Journal of Networks, Software Tools and Applications, 20230403, 1
Publisher:2023

 
Entropic Wasserstein Component Analysis
Authors:Collas, Antoine (Creator), Vayer, Titouan (Creator), Flamary, Rémi (Creator), Breloy, Arnaud (Creator)
Summary:Dimension reduction (DR) methods provide systematic approaches for analyzing high-dimensional data. A key requirement for DR is to incorporate global dependencies among original and embedded samples while preserving clusters in the embedding space. To achieve this, we combine the principles of optimal transport (OT) and principal component analysis (PCA). Our method seeks the best linear subspace that minimizes reconstruction error using entropic OT, which naturally encodes the neighborhood information of the samples. From an algorithmic standpoint, we propose an efficient block-majorization-minimization solver over the Stiefel manifold. Our experimental results demonstrate that our approach can effectively preserve high-dimensional clusters, leading to more interpretable and effective embeddings. Python code of the algorithms and experiments is available online
Downloadable Archival Material, 2023-03-09
Undefined
Publisher:2023-03-09
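The entry above builds on entropic optimal transport. As a minimal, hypothetical illustration of that building block (not the authors' ECA code), the following NumPy sketch runs Sinkhorn iterations to compute an entropic transport plan between two small point clouds; the regularization `eps` and iteration count are assumptions.

```python
# Hypothetical sketch of entropic OT via Sinkhorn iterations.
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iter=200):
    """Entropic OT plan between histograms a, b with cost matrix C."""
    K = np.exp(-C / eps)              # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)             # scale columns to match marginal b
        u = a / (K @ v)               # scale rows to match marginal a
    P = u[:, None] * K * v[None, :]   # transport plan
    return P, float(np.sum(P * C))    # plan and its transport cost

# Example: two small point clouds with uniform weights.
rng = np.random.default_rng(0)
x, y = rng.normal(size=(5, 2)), rng.normal(size=(6, 2))
C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
P, cost = sinkhorn(np.full(5, 1 / 5), np.full(6, 1 / 6), C)
```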

2023 see 2022
Vector Quantized Wasserstein Auto-Encoder
Authors:Vuong, Tung-Long (Creator), Le, Trung (Creator), Zhao, He (Creator), Zheng, Chuanxia (Creator), Harandi, Mehrtash (Creator), Cai, Jianfei (Creator), Phung, Dinh (Creator)
Summary:Learning deep discrete latent presentations offers a promise of better symbolic and summarized abstractions that are more useful to subsequent downstream tasks. Inspired by the seminal Vector Quantized Variational Auto-Encoder (VQ-VAE), most of work in learning deep discrete representations has mainly focused on improving the original VQ-VAE form and none of them has studied learning deep discrete representations from the generative viewpoint. In this work, we study learning deep discrete representations from the generative viewpoint. Specifically, we endow discrete distributions over sequences of codewords and learn a deterministic decoder that transports the distribution over the sequences of codewords to the data distribution via minimizing a WS distance between them. We develop further theories to connect it with the clustering viewpoint of WS distance, allowing us to have a better and more controllable clustering solution. Finally, we empirically evaluate our method on several well-known benchmarks, where it achieves better qualitative and quantitative performances than the other VQ-VAE variants in terms of the codebook utilization and image reconstruction/generation
Downloadable Archival Material, 2023-02-12
Undefined
Publisher:2023-02-12 

<–—2023———2023———950—



Peer-reviewed
WASCO: A Wasserstein-based Statistical Tool to Compare Conformational Ensembles of Intrinsically Disordered Proteins
Authors: Javier González-Delgado, Amin Sagar, Christophe Zanon, Kresten Lindorff-Larsen, Pau Bernadó, Pierre Neuvial, Juan Cortés
Summary:The structural investigation of intrinsically disordered proteins (IDPs) requires ensemble models describing the diversity of the conformational states of the molecule. Due to their probabilistic nature, there is a need for new paradigms that understand and treat IDPs from a purely statistical point of view, considering their conformational ensembles as well-defined probability distributions. In this work, we define a conformational ensemble as an ordered set of probability distributions and provide a suitable metric to detect differences between two given ensembles at the residue level, both locally and globally. The underlying geometry of the conformational space is properly integrated, one ensemble being characterized by a set of probability distributions supported on the three-dimensional Euclidean space (for global-scale comparisons) and on the two-dimensional flat torus (for local-scale comparisons). The inherent uncertainty of the data is also taken into account to provide finer estimations of the differences between ensembles. Additionally, an overall distance between ensembles is defined from the differences at the residue level. We illustrate the interest of the approach with several examples of applications for the comparison of conformational ensembles: (i) produced from molecular dynamics (MD) simulations using different force fields, and (ii) before and after refinement with experimental data. We also show the usefulness of the method to assess the convergence of MD simulations, and discuss other potential applications such as in machine-learning-based approaches. The numerical tool has been implemented in Python through easy-to-use Jupyter Notebooks available at https://gitlab.laas.fr/moma/WASCO
Article
Publication:Journal of Molecular Biology
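As a toy illustration of the ensemble-comparison idea in the entry above (not part of WASCO), the sketch below compares two sampled ensembles of a one-dimensional quantity with SciPy's 1D Wasserstein distance; the Gaussian ensembles are purely illustrative.

```python
# Hypothetical sketch: 1D Wasserstein distance between two sampled ensembles.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
ensemble_a = rng.normal(loc=0.0, scale=1.0, size=5000)
ensemble_b = rng.normal(loc=0.3, scale=1.2, size=5000)

# For empirical 1D samples this amounts to comparing quantile functions.
d = wasserstein_distance(ensemble_a, ensemble_b)
print(f"W1 distance between ensembles: {d:.3f}")
```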


2023 see 2021. Peer-reviewed
Wasserstein Adversarially Regularized Graph Autoencoder
Authors: Huidong Liang, Junbin Gao
Article, 2023
Publication:Neurocomputing, 541, 202307, 126235
Publisher:2023
 
Bures-Wasserstein Means of Graphs
Authors:Haasler, Isabel (Creator), Frossard, Pascal (Creator)
Summary:Finding the mean of sampled data is a fundamental task in machine learning and statistics. However, in cases where the data samples are graph objects, defining a mean is an inherently difficult task. We propose a novel framework for defining a graph mean via embeddings in the space of smooth graph signal distributions, where graph similarity can be measured using the Wasserstein metric. By finding a mean in this embedding space, we can recover a mean graph that preserves structural information. We establish the existence and uniqueness of the novel graph mean, and provide an iterative algorithm for computing it. To highlight the potential of our framework as a valuable tool for practical applications in machine learning, it is evaluated on various tasks, including k-means clustering of structured graphs, classification of functional brain networks, and semi-supervised node classification in multi-layer graphs. Our experimental results demonstrate that our approach achieves consistent performance, outperforms existing baseline approaches, and improves state-of-the-art methods
Downloadable Archival Material, 2023-05-31
Undefined
Publisher:2023-05-31
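The graph means in the entry above rest on the Bures-Wasserstein metric between covariance matrices. Below is a minimal, hypothetical NumPy/SciPy sketch of that distance between two SPD matrices (a generic formula sketch, not the authors' code).

```python
# Hypothetical sketch of the Bures-Wasserstein distance between SPD matrices.
import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein(A, B):
    """W2 distance between zero-mean Gaussians with covariances A and B."""
    sqrt_A = sqrtm(A)
    cross = sqrtm(sqrt_A @ B @ sqrt_A)            # (A^{1/2} B A^{1/2})^{1/2}
    d2 = np.trace(A) + np.trace(B) - 2.0 * np.trace(cross)
    return np.sqrt(max(float(np.real(d2)), 0.0))  # clip tiny round-off

# Example with two random SPD matrices.
rng = np.random.default_rng(0)
X, Y = rng.normal(size=(4, 4)), rng.normal(size=(4, 4))
A, B = X @ X.T + np.eye(4), Y @ Y.T + np.eye(4)
print(bures_wasserstein(A, B))
```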
 
Multi-marginal Approximation of the Linear Gromov-Wasserstein Distance
Authors: Florian Beier, Robert Beinert
Article, 2023
Publication:PAMM, 22, March 2023, n/a
Publisher:2023



2023 see 2021. Peer-reviewed
Bridging Bayesian and Minimax Mean Square Error Estimation via Wasserstein Distributionally Robust Optimization
Authors: Viet Anh Nguyen, Soroosh Shafieezadeh-Abadeh, Daniel Kuhn, Peyman Mohajerin Esfahani
Summary:We introduce a distributionally robust minimum mean square error estimation model with a Wasserstein ambiguity set to recover an unknown signal from a noisy observation. The proposed model can be viewed as a zero-sum game between a statistician choosing an estimator (that is, a measurable function of the observation) and a fictitious adversary choosing a prior (that is, a pair of signal and noise distributions ranging over independent Wasserstein balls), with the goal to minimize and maximize the expected squared estimation error, respectively. We show that, if the Wasserstein balls are centered at normal distributions, then the zero-sum game admits a Nash equilibrium, by which the players' optimal strategies are given by an affine estimator and a normal prior, respectively. We further prove that this Nash equilibrium can be computed by solving a tractable convex program. Finally, we develop a Frank-Wolfe algorithm that can solve this convex program orders of magnitude faster than state-of-the-art general-purpose solvers. We show that this algorithm enjoys a linear convergence rate and that its direction-finding subproblems can be solved in quasi-closed form. Funding: This research was supported by the Swiss National Science Foundation [Grants BSCGI0_157733 and 51NF40_180545], an Early Postdoc.Mobility Fellowship [Grant P2ELP2_195149], and the European Research Council [Grant TRUST-949796]
Downloadable Article, 2023
Publication:Mathematics of Operations Research, 48, 202302, 1
Publisher:2023
Cited by 31
 Related articles All 10 versions


2023



Sparse super resolution and its trigonometric approximation in the p-Wasserstein distance
Authors: Paul Catala, Mathias Hockmann, Stefan Kunis
Article, 2023
Publication:PAMM, 22, March 2023, n/a
Publisher:2023


Nonparametric Generative Modeling with Conditional Sliced-Wasserstein Flows
Authors:Du, Chao (Creator), Li, Tianbo (Creator), Pang, Tianyu (Creator), Yan, Shuicheng (Creator), Lin, Min (Creator)
Summary:Sliced-Wasserstein Flow (SWF) is a promising approach to nonparametric generative modeling but has not been widely adopted due to its suboptimal generative quality and lack of conditional modeling capabilities. In this work, we make two major contributions to bridging this gap. First, based on a pleasant observation that (under certain conditions) the SWF of joint distributions coincides with those of conditional distributions, we propose Conditional Sliced-Wasserstein Flow (CSWF), a simple yet effective extension of SWF that enables nonparametric conditional modeling. Second, we introduce appropriate inductive biases of images into SWF with two techniques inspired by local connectivity and multiscale representation in vision research, which greatly improve the efficiency and quality of modeling images. With all the improvements, we achieve generative performance comparable with many deep parametric generative models on both conditional and unconditional tasks in a purely nonparametric fashion, demonstrating its great potential
Downloadable Archival Material, 2023-05-03
Undefined
Publisher:2023-05-03
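Since the entry above is built on sliced-Wasserstein machinery, here is a minimal, hypothetical NumPy/SciPy sketch of the sliced-Wasserstein distance between two point clouds via random one-dimensional projections (a generic illustration, not the paper's CSWF algorithm; the number of projections is an assumption).

```python
# Hypothetical sketch of the sliced-Wasserstein distance.
import numpy as np
from scipy.stats import wasserstein_distance

def sliced_wasserstein(X, Y, n_proj=100, seed=0):
    """Average 1D Wasserstein distance over random projection directions."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=X.shape[1])
        theta /= np.linalg.norm(theta)          # random unit direction
        total += wasserstein_distance(X @ theta, Y @ theta)
    return total / n_proj

X = np.random.randn(200, 3)
Y = np.random.randn(200, 3) + 0.5
print(sliced_wasserstein(X, Y))
```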


Learning via Wasserstein-Based High Probability Generalisation Bounds
Authors:Viallard, Paul (Creator), Haddouche, Maxime (Creator), Simsekli, Umut (Creator), Guedj, Benjamin (Creator)
Summary:Minimising upper bounds on the population risk or the generalisation gap has been widely used in structural risk minimisation (SRM) - this is in particular at the core of PAC-Bayesian learning. Despite its successes and unfailing surge of interest in recent years, a limitation of the PAC-Bayesian framework is that most bounds involve a Kullback-Leibler (KL) divergence term (or its variations), which might exhibit erratic behavior and fail to capture the underlying geometric structure of the learning problem - hence restricting its use in practical applications. As a remedy, recent studies have attempted to replace the KL divergence in the PAC-Bayesian bounds with the Wasserstein distance. Even though these bounds alleviated the aforementioned issues to a certain extent, they either hold in expectation, are for bounded losses, or are nontrivial to minimize in an SRM framework. In this work, we contribute to this line of research and prove novel Wasserstein distance-based PAC-Bayesian generalisation bounds for both batch learning with independent and identically distributed (i.i.d.) data, and online learning with potentially non-i.i.d. data. Contrary to previous art, our bounds are stronger in the sense that (i) they hold with high probability, (ii) they apply to unbounded (potentially heavy-tailed) losses, and (iii) they lead to optimizable training objectives that can be used in SRM. As a result we derive novel Wasserstein-based PAC-Bayesian learning algorithms and we illustrate their empirical advantage on a variety of experiments
Downloadable Archival Material, 2023-06-07
Undefined
Publisher:2023-06-07 

All 3 versions


Peer-reviewed
Reflecting image-dependent SDEs in Wasserstein space and large deviation principle
Author:Xue Yang
Article, 2023
Publication:Stochastics, 20230413, 1
Publisher:2023

Peer-reviewed
A Robust Continuous Authentication System Using Smartphone Sensors and Wasserstein Generative Adversarial Networks
Authors: Shihong Zou, Huizhong Sun, Guosheng Xu, Chenyu Wang, Xuanwen Zhang, Ruijie Quan, Gökhan Kul
Article, 2023
Publication:Security and Communication Networks, 2023, 20230426, 1
Publisher:2023

<–—2023———2023———960—



A Convergent Single-Loop Algorithm for Relaxation of Gromov-Wasserstein in Graph Data
Authors:Li, Jiajin (Creator), Tang, Jianheng (Creator), Kong, Lemin (Creator), Liu, Huikang (Creator), Li, Jia (Creator), So, Anthony Man-Cho (Creator), Blanchet, Jose (Creator)
Summary:In this work, we present the Bregman Alternating Projected Gradient (BAPG) method, a single-loop algorithm that offers an approximate solution to the Gromov-Wasserstein (GW) distance. We introduce a novel relaxation technique that balances accuracy and computational efficiency, albeit with some compromises in the feasibility of the coupling map. Our analysis is based on the observation that the GW problem satisfies the Luo-Tseng error bound condition, which relates to estimating the distance of a point to the critical point set of the GW problem based on the optimality residual. This observation allows us to provide an approximation bound for the distance between the fixed-point set of BAPG and the critical point set of GW. Moreover, under a mild technical assumption, we can show that BAPG converges to its fixed point set. The effectiveness of BAPG has been validated through comprehensive numerical experiments in graph alignment and partition tasks, where it outperforms existing methods in terms of both solution quality and wall-clock time
Downloadable Archival Material, 2023-03-12
Undefined
Publisher:2023-03-12
Cited by 9 Related articles All 4 versions 

Critical Points and Convergence Analysis of Generative Deep Linear Networks Trained with Bures-Wasserstein Loss
Authors:Bréchet, Pierre (Creator), Papagiannouli, Katerina (Creator), An, Jing (Creator), Montúfar, Guido (Creator)
Summary:We consider a deep matrix factorization model of covariance matrices trained with the Bures-Wasserstein distance. While recent works have made important advances in the study of the optimization problem for overparametrized low-rank matrix approximation, much emphasis has been placed on discriminative settings and the square loss. In contrast, our model considers another interesting type of loss and connects with the generative setting. We characterize the critical points and minimizers of the Bures-Wasserstein distance over the space of rank-bounded matrices. For low-rank matrices the Hessian of this loss can theoretically blow up, which creates challenges to analyze convergence of optimization methods. We establish convergence results for gradient flow using a smooth perturbative version of the loss and convergence results for finite step size gradient descent under certain assumptions on the initial weights
Downloadable Archival Material, 2023-03-06
Undefined
Publisher:2023-03-06


Peer-reviewed
Data augmentation strategy for power inverter fault diagnosis based on Wasserstein distance and auxiliary classification generative adversarial network
Authors: Quan Sun, Fei Peng, Xianghai Yu, Hongsheng Li
Article, 2023
Publication:Reliability Engineering & System Safety, 237, 202309, 109360
Publisher:2023
Related articles
 All 2 versions
 
Findings in Hydrology and Earth System Sciences Reported from Australian National University (Hydrological objective functions and ensemble averaging with the Wasserstein distance)


Peer-reviewed
Wasserstein distance as a new tool for discriminating cosmologies through the topology of large-scale structure
Authors: Maksym Tsizh, Vitalii Tymchyshyn, Franco Vazza
Article, 2023
Publication:Monthly Notices of the Royal Astronomical Society, 522, 20230421, 2697
Publisher:2023

2023


Research Findings from University of Tehran Update Understanding of Photogrammetry Remote Sensing and Spatial Information Sciences (Improving Semantic Segmentation of High-resolution Remote Sensing Images Using Wasserstein Generative ...)
Related articles
 All 9 versions 


2023 see 2021. Peer-reviewed
Learning to Simulate Sequentially Generated Data via Neural Networks and Wasserstein Training
Authors: Tingyu Zhu, Haoyu Liu, Zeyu Zheng
Article, 2023
Publication:ACM Transactions on Modeling and Computer Simulation, 20230403
Publisher:2023
Cited by 2
 Related articles
 

2023 see 2022
Strong Formulations for Distributionally Robust Chance-Constrained Programs with Left-Hand Side Uncertainty Under Wasserstein Ambiguity
Authors: Nam Ho-Nguyen, Fatma Kilinç-Karzan, Simge Küçükyavuz, Dabeen Lee
Article, 2023
Publication:INFORMS Journal on Optimization, 5, 202304, 211
Publisher:2023
MR4604128
 

New Findings from Qilu University of Technology (Shandong Academy of Sciences) in the Area of Mathematical Biosciences and Engineering Described (WG-ICRN: Protein 8-state secondary structure prediction based on Wasserstein generative ...)
Article, 2023
Publication:Obesity, Fitness & Wellness Week, March 25 2023, 2294
Publisher:2023

2023 see 2022.
Peer-reviewed
Quantum Wasserstein distance of order 1 between channels
Authors: Rocco Duvenhage, Mathumo Mapaya
Article, 2023
Publication:Infinite Dimensional Analysis, Quantum Probability and Related Topics, 20230421
Publisher:2023

<–—2023———2023———970—




Sparse super resolution and its trigonometric approximation in the p‐Wasserstein distance
Authors: Paul Catala, Mathias Hockmann, Stefan Kunis
Article, 2023
Publication:Proceedings in applied mathematics and mechanics, 22, 2023, n/a
Publisher:2023



Peer-reviewed
On Quadratic Wasserstein Metric with Squaring Scaling for Seismic Velocity Inversion
Authors: Zhengyang Li, Yijia Tang, Jing Chen, Hao Wu
Article, 2023
Publication:Numerical Mathematics: Theory, Methods and Applications, 16, 202306, 277
Publisher:2023


Peer-reviewed
Scenario Reduction Network Based on Wasserstein Distance with Regularization
Authors: Xiaochong Dong, Yingyun Sun, Sarmad Majeed Malik, Tianjiao Pu, Ye Li, Xinying Wang
Article, 2023
Publication:IEEE Transactions on Power Systems, 2023, 1
Publisher:2023



2023 see 2022.
Convergence of the empirical measure in expected Wasserstein distance: non asymptotic explicit bounds in R^d
Author:Nicolas Fournier
Article, 2023
Publication:ESAIM: Probability and Statistics, 20230504
Publisher:2023
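As a quick, hypothetical numerical illustration of the theme of the entry above (convergence of empirical measures in Wasserstein distance, not the paper's bounds), the following SciPy snippet shows the 1D W1 distance between an n-sample empirical measure and a large reference sample shrinking as n grows.

```python
# Hypothetical illustration of empirical-measure convergence in W1.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
reference = rng.normal(size=200_000)   # large-sample proxy for the true law
for n in (10, 100, 1_000, 10_000):
    sample = rng.normal(size=n)
    print(n, round(wasserstein_distance(sample, reference), 4))
```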


2023 see 2021
Wasserstein perturbations of Markovian transition semigroups
Authors: Sven Fuhrmann, Michael Kupper, Max Nendel
Article, 2023
Publication:Annales de l'Institut Henri Poincaré, Probabilités et Statistiques, 59, 20230501
Publisher:2023
Zbl 07699946


2023

2023 see 2022

Coalescing-fragmentating Wasserstein dynamics: Particle approach
Author:Vitalii Konarovskyi
Article, 2023
Publication:Annales de l'Institut Henri Poincaré, Probabilités et Statistiques, 59, 20230501
Publisher:2023



Quantitative control of Wasserstein distance between Brownian motion and the Goldstein–Kac telegraph process
Authors: Gerardo Barrera, Jani Lukkarinen
Article, 2023
Publication:Annales de l'Institut Henri Poincaré, Probabilités et Statistiques, 59, 20230501
Publisher:2023
Related articles
All 4 versions


2023 book see 2022. Book
An invitation to optimal transport, Wasserstein distances, and gradient flows
Authors: Alessio Figalli, Federico Glaudo
Computer File, 2023
English, Second edition
Publisher:EMS Press, Berlin, 2023
An invitation to optimal transport, Wasserstein distances, and gradient flows

Library book

2023 see 2022. Peer-reviewed
Conditional Wasserstein Generator
Authors: Myunghee Cho Paik, Kyungbok Lee, Young-geun Kim

Summary:The statistical distance of conditional distributions is an essential element of generating target data given some data as in video prediction. We establish how the statistical distances between two joint distributions are related to those between two conditional distributions for three popular statistical distances: f-divergence, Wasserstein distance, and integral probability metrics. Such characterization plays a crucial role in deriving a tractable form of the objective function to learn a conditional generator. For Wasserstein distance, we show that the distance between joint distributions is an upper bound of the expected distance between conditional distributions, and derive a tractable representation of the upper bound. Based on this theoretical result, we propose a new conditional generator, the conditional Wasserstein generator. Our proposed algorithm can be viewed as an extension of Wasserstein autoencoders (Tolstikhin et al. 2018) to conditional generation or as a Wasserstein counterpart of stochastic video generation (SVG) model by Denton and Fergus (Denton et al. 2018). We apply our algorithm to video prediction and video interpolation. Our experiments demonstrate that the proposed algorithm performs well on benchmark video datasets and produces sharper videos than state-of-the-art methods
Article, 2023
Publication:IEEE Transactions on Pattern Analysis & Machine Intelligence, 45, 202306, 7208
Publisher:2023


A Wasserstein GAN for Joint Learning of Inpainting and Spatial Optimisation
Authors: Pascal Peter; Pacific-Rim Symposium on Image and Video Technology
Summary:Image inpainting is a restoration method that reconstructs missing image parts. However, a carefully selected mask of known pixels that yield a high quality inpainting can also act as a sparse image representation. This challenging spatial optimisation problem is essential for practical applications such as compression. So far, it has been almost exclusively addressed by model-based approaches. First attempts with neural networks seem promising, but are tailored towards specific inpainting operators or require postprocessing. To address this issue, we propose the first generative adversarial network (GAN) for spatial inpainting data optimisation. In contrast to previous approaches, it allows joint training of an inpainting generator and a corresponding mask optimisation network. With a Wasserstein distance, we ensure that our inpainting results accurately reflect the statistics of natural images. This yields significant improvements in visual quality and speed over conventional stochastic models. It also outperforms current spatial optimisation networks

<–—2023———2023———980—


2023 see 2022
Wassmap: Wasserstein Isometric Mapping for Image Manifold Learning
Authors: Keaton Hamm, Nick Henscheid, Shujie Kang
Summary:Abstract. In this paper, we propose Wasserstein Isometric Mapping (Wassmap), a nonlinear dimensionality reduction technique that provides solutions to some drawbacks in existing global nonlinear dimensionality reduction algorithms in imaging applications. Wassmap represents images via probability measures in Wasserstein space, then uses pairwise Wasserstein distances between the associated measures to produce a low-dimensional, approximately isometric embedding. We show that the algorithm is able to exactly recover parameters of some image manifolds, including those generated by translations or dilations of a fixed generating measure. Additionally, we show that a discrete version of the algorithm retrieves parameters from manifolds generated from discrete measures by providing a theoretical bridge to transfer recovery results from functional data to discrete data. Testing of the proposed algorithms on various image data manifolds shows that Wassmap yields good embeddings compared with other global and local techniques
Downloadable Article
Publication:SIAM Journal on Mathematics of Data Science, 5, 20230630, 475
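In the spirit of the Wassmap entry above (pairwise Wasserstein distances followed by an isometric embedding), here is a minimal, hypothetical NumPy sketch that turns a precomputed pairwise-distance matrix into a 2D embedding via classical multidimensional scaling; the distance matrix `D` is assumed to come from any Wasserstein computation, and the sketch is not the authors' algorithm.

```python
# Hypothetical sketch: classical MDS embedding of a pairwise-distance matrix.
import numpy as np

def classical_mds(D, dim=2):
    """Embed points from a symmetric distance matrix D via classical MDS."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:dim]         # top eigenpairs
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

# Example with a toy distance matrix of four points on a line.
pts = np.array([0.0, 1.0, 2.0, 4.0])
D = np.abs(pts[:, None] - pts[None, :])
print(classical_mds(D))
```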

 


Indoor Localization Advancement Using Wasserstein Generative Adversarial Networks
Authors: Shivam Kumar, Saikat Majumder, Sumit Chakravarty; 2023 IEEE 8th International Conference for Convergence in Technology (I2CT)
Summary:Fingerprint-based indoor localization methods rely on a database of Received Signal Strength (RSS) measurements and corresponding location labels. However, collecting and maintaining such a database can be costly and time consuming. In this work, we proposed Wasserstein Generative Adversarial Networks (WGAN) to generate synthetic data for fingerprinting-based indoor localization. The proposed system consists of a WGAN that is trained on a dataset of real RSS measurements and corresponding location labels. The generator of the WGAN learns to generate synthetic RSS measurements, and the critic learns to differentiate the generated and the real measurements. We validate the proposed system on a dataset of real RSS measurements. The result of the proposed system shows better localization accuracy as compared to using real data, while being more cost-effective and scalable
Chapter, 2023
Publication:2023 IEEE 8th International Conference for Convergence in Technology (I2CT), 20230407, 1
Publisher:2023


2023 see 2021. Peer-reviewed
Wasserstein distance between noncommutative dynamical systems
Author:Rocco Duvenhage
Article, 2023
Publication:Journal of Mathematical Analysis and Applications, 527, 202311, 127353
Publisher:2023


A Novel Graph Kernel Based on the Wasserstein Distance and Spectral Signatures
Authors: Yantao Liu, Luca Rossi, Andrea Torsello; Joint IAPR International Workshops on Statistical Techniques in Pattern Recognition (SPR) and Structural and Syntactic Pattern Recognition (SSPR)
Summary:Spectral signatures have been used with great success in computer vision to characterise the local and global topology of 3D meshes. In this paper, we propose to use two widely used spectral signatures, the Heat Kernel Signature and the Wave Kernel Signature, to create node embeddings able to capture local and global structural information for a given graph. For each node, we concatenate its structural embedding with the one-hot encoding vector of the node feature (if available) and we define a kernel between two input graphs in terms of the Wasserstein distance between the respective node embeddings. Experiments on standard graph classification benchmarks show that our kernel performs favourably when compared to widely used alternative kernels as well as graph neural networks


Variant Wasserstein Generative Adversarial Network Applied on Low Dose CT Image Denoising
Authors: Anoud A. Mahmoud, Hanaa A. Sayed, Sara S. Mohamed
Article, 2023


2023

2023 see 2021. Peer-reviewed
Some inequalities on Riemannian manifolds linking Entropy, Fisher information, Stein discrepancy and Wasserstein distance

Authors: Li-Juan Cheng, Anton Thalmaier, Feng-Yu Wang
Article, 2023
Publication:Journal of Functional Analysis, 285, 202309, 109997
Publisher:2023
Zbl 07694895

2023 see 2022. Peer-reviewed
Entropic Regularization of Wasserstein Distance Between Infinite-Dimensional Gaussian Measures and Gaussian Processes

Author:Hà Quang Minh
Article, 2023
Publication:Journal of Theoretical Probability, 36, 202303, 201
Publisher:2023

Prediction of Tumor Lymph Node Metastasis Using Wasserstein Distance-Based Generative Adversarial Networks Combing with Neural Architecture Search for Predicting

Authors: Yawen Wang, Shihua Zhang
Article, 2023
Publication:Mathematics, 11, 20230201, 729
Publisher:2023
Cited by 6 Related articles All 5 versions 



Brain Tumour Segmentation Using Wasserstein Generative Adversarial Networks(WGANs)

Authors: S. Nyamathulla, Ch. Sai Meghana, K. Yasaswi; 2023 7th International Conference on Trends in Electronics and Informatics (ICOEI)
Summary:The majority of health-related applications use technology and require a lot of data to operate. Even more information is required for brain tumor segmentation, but many people lack access to it because of privacy laws governing medical information. Therefore, this study utilizes GAN technology, which creates synthetic images, to resolve this issue. AGGrGAN is used for aggregation and discrimination, and PGAN, WGAN, and DCGAN are used to create synthetic pictures: synthetic brain tumor masks with the same visual characteristics as the real samples. The study examines the potential benefits of using GANs to synthesize new images from a real dataset of brain MR scans and corresponding hand-labelled segmentation masks (annotations), and evaluates the potential performance gains achieved when using synthetic images to augment a dataset that is used to train a neural network to solve a brain tumour segmentation task
Chapter, 2023
Publication:2023 7th International Conference on Trends in Electronics and Informatics (ICOEI), 20230411, 1571
Publisher:2023

2023 see 2021. ARTICLE
Distributionally robust chance constrained svm model with $\ell_2$-Wasserstein distance

Authors: Qing Ma, Yanjun Wang
Article, 2023
Publication:Journal of Industrial and Management Optimization, 19, 2023, 916
Publisher:2023

ARTICLE

Distributionally robust chance constrained svm model with $\ell_2$-Wasserstein distance

Ma, Qing ; Wang, Yanjun

Journal of industrial and management optimization, 2023, Vol.19 (2), p.916

PEER REVIEWED

BrowZine PDF Icon Download PDF 

Related articles All 2 versions

<–—2023———2023———990—



2023 see 2022

Lipschitz continuity of the Wasserstein projections in the convex order on the line

Authors: Benjamin Jourdain, William Margheriti, Gudmund Pammer
Article, 2023
Publication:Electronic Communications in Probability, 28, 20230101
Publisher:2023

Attacking Mouse Dynamics Authentication using Novel Wasserstein Conditional DCGAN

Authors: Arunava Roy, KokSheik Wong, Raphaël C.-W. Phan
Article, 2023
Publication:IEEE Transactions on Information Forensics and Security, 2023, 1
Publisher:2023
All 3 versions


Protein 8-State Secondary Structure Prediction Based on Wasserstein Generative Adversarial Network and Residual Network
Author:
Article, 2023
Publication:Hans Journal of Computational Biology, 13, 2023, 1
Publisher:2023


2023 see 2022
Partial Discharge Data Augmentation Based on Improved Wasserstein Generative Adversarial Network With Gradient Penalty

Authors: Guangya Zhu, Kai Zhou, Lu Lu, Yao Fu, Zhaogui Liu, Xiaomin Yang
Article, 2023
Publication:IEEE Transactions on Industrial Informatics, 19, 202305, 6565
Publisher:2023


2023 see 2022

An Efficient Content Popularity Prediction of Privacy Preserving Based on Federated Learning and Wasserstein GAN
Authors:Kailun WangDengXuanheng Li
Article, 2023
Publication:IEEE Internet of Things Journal, 10, 20230301, 3786
Publisher:2023


2023



Peer-reviewed
A novel prediction approach of polymer gear contact fatigue based on a WGAN-XGBoost model

Authors: Chenfan Jia, Peitang Wei, Zehua Lu, Mao Ye, Rui Zhu, Huaiju Liu
Article, 2023
Publication:Fatigue & Fracture of Engineering Materials & Structures, 46, 202306, 2272
Publisher:2023
Cited by 8
 Related articles


Listen: Portland’s Question A, Home Equity Theft on WGAN
by Maine Policy Institute
CE Think Tank Newswire, 06/2023
Newsletter  Full Text Online

Wasserstein Regression
by Chen, Yaqing; Lin, Zhenhua; Müller, Hans-Georg
Journal of the American Statistical Association, 06/2023, Volume 118, Issue 542
The analysis of samples of random objects that do not lie in a vector space is gaining increasing attention in statistics. An important class of such object...
Article PDF
Journal Article  Full Text Online
MR4595462
 


EvaGoNet: An integrated network of variational autoencoder and Wasserstein generative adversarial network with gradient penalty for...
by Luo, Changfan; Xu, Yiping; Shao, Yongkang ; More...
Information sciences, 06/2023, Volume 629
Feature engineering is an effective method for solving classification problems. Many existing feature engineering studies have focused on image or video data...
Article PDF
Journal Article  Full Text Online


2023 see 2022
Wasserstein Convergence Rates for Empirical Measures of Subordinated Processes on Noncompact Manifolds
by Li, Huaiqian; Wu, Bingyao
Journal of theoretical probability, 06/2023, Volume 36, Issue 2
The asymptotic behavior of empirical measures has been studied extensively. In this paper, we consider empirical measures of given subordinated processes on...
Article PDF
Journal Article  Full Text Online

Cited by 3 Related articles All 5 versions

<–—2023———2023———1000—



Fault detection and diagnosis for liquid rocket engines with sample imbalance based on Wasserstein generative adversarial nets and...
by Deng, Lingzhi; Cheng, Yuqiang; Yang, Shuming ; More...
Proceedings of the Institution of Mechanical Engineers. Part G, Journal of aerospace engineering, 06/2023, Volume 237, Issue 8
The reliability of liquid rocket engines (LREs), which are the main propulsion device of launch vehicles, cannot be overemphasised. The development of fault...
Article PDF
Journal Article  Full Text Online


Nonparametric Generative Modeling with Conditional Sliced-Wasserstein Flows
by Du, Chao; Li, Tianbo; Pang, Tianyu ; More...
arXiv.org, 05/2023
Sliced-Wasserstein Flow (SWF) is a promising approach to nonparametric generative modeling but has not been widely adopted due to its suboptimal generative...
Paper  Full Text Online


2023 see 2022
Shedding a PAC-Bayesian Light on Adaptive Sliced-Wasserstein Distances
by Ohana, Ruben; Kimia Nadjahi; Rakotomamonjy, Alain ; More...

arXiv.org, 05/2023
The Sliced-Wasserstein distance (SW) is a computationally efficient and theoretically grounded alternative to the Wasserstein distance. Yet, the literature on...
Paper  Full Text Online

Wasserstein PAC-Bayes Learning: Exploiting Optimisation Guarantees to Explain Generalisation
by Haddouche, Maxime; Guedj, Benjamin
arXiv.org, 05/2023
PAC-Bayes learning is an established framework to both assess the generalisation ability of learning algorithms, and design new learning algorithm by...
Paper  Full Text Online
Cited by 1
 Related articles All 4 versions 


2023 see 2022
From geodesic extrapolation to a variational BDF2 scheme for Wasserstein gradient flows
by Gallouët, Thomas; Natale, Andrea; Todeschi, Gabriele
arXiv.org, 05/2023
We introduce a time discretization for Wasserstein gradient flows based on the classical Backward Differentiation Formula of order two. The main building block...
Paper  Full Text Online

 2023


2023 see 2021
Controlling Wasserstein Distances by Kernel Norms with Application to Compressive Statistical Learning
by Vayer, Titouan; Gribonval, Rémi
MR4596096

Cited by 8 Related articles All 8 versions


arXiv.org, 05/2023
Comparing probability distributions is at the crux of many machine learning algorithms. Maximum Mean Discrepancies (MMD) and Wasserstein distances are two...
Paper  Full Text Online


[PDF] arxiv.org

Neural Wasserstein gradient flows for maximum mean discrepancies with Riesz kernels

F Altekrüger, J Hertrich, G Steidl - arXiv preprint arXiv:2301.11624, 2023 - arxiv.org

… neural schemes, we benchmark them on the interaction energy. Here we provide analytic

formulas for Wasserstein … Finally, we illustrate our neural MMD flows by numerical examples. …

Cited by 2 All 2 versions 



[PDF] arxiv.org

Neural SDEs for Conditional Time Series Generation and the Signature-Wasserstein-1 metric

PD Lozano, TL Bagén, J Vives - arXiv preprint arXiv:2301.01315, 2023 - arxiv.org

… Unordered Wasserstein-1 metric: We compare the Wasserstein-1 distance between the

real one dimensional distributions that are given by: a) taking the data points yt from the output …

 Cited by 1 Related articles 



Neural SDEs for Conditional Time Series Generation and the Signature-Wasserstein-1 metric

P Díaz Lozano, T Lozano Bagén, J Vives - arXiv e-prints, 2023 - ui.adsabs.harvard.edu

(Conditional) Generative Adversarial Networks (GANs) have found great success in recent

years, due to their ability to approximate (conditional) distributions over extremely high …

 Related articles


[HTML] springer.com

[HTML] Wasserstein enabled Bayesian optimization of composite functions

A Candelieri, A Ponti, F Archetti - Journal of Ambient Intelligence and …, 2023 - Springer

… becomes a functional in the Wasserstein space. The minimizer of the acquisition functional

in the Wasserstein space is then mapped back to the original space using a neural network. …

Wasserstein enabled Bayesian optimization of composite functions

Cited by 2 Related articles All 2 versions

<–—2023———2023———1010—


[PDF] arxiv.org

Learning via Wasserstein-Based High Probability Generalisation Bounds

P Viallard, M Haddouche, U Simsekli… - arXiv preprint arXiv …, 2023 - arxiv.org

… We consider that the models are either linear or neural networks (… For neural networks, we

consider fully connected ReLU neural … Note that neural networks are Lipschitz when the set of …

All 2 versions 


基于统计信息系数和 Wasserstein 生成对抗网络的 风火系统暂态特征选择与两阶段稳定评估.

赵冬梅, 谢家康, 杜泽航, 魏中庆… - Electric Power …, 2023 - search.ebscohost.com

Second, the Wasserstein distance replaces the JS divergence of the traditional generative adversarial network, … proposes a … based on the Wasserstein generative adversarial network …

This paper applies W-GAN for data generation; its core idea is to use the Wasserstein distance instead of the JS divergence to measure the real …

[Chinese. Transient feature selection and two-stage stability assessment for wind-thermal power systems based on the statistical information coefficient and a Wasserstein generative adversarial network]

Related articles

2023 see 2021 [PDF] projecteuclid.org

Bounds in  Wasserstein distance on the normal approximation of general M-estimators

F Bachoc, M Fathi - Electronic Journal of Statistics, 2023 - projecteuclid.org

… , for the L1 Wasserstein distance between the distribution of n1/… This enables to decompose

the target Wasserstein distance … of this paper to general Lp Wasserstein distances, p > 1). …

Related articles All 7 versions

[HTML] mdpi.com

[HTML] Crossline Reconstruction of 3D Seismic Data Using 3D cWGAN: A Comparative Study on Sleipner Seismic Survey Data

J Yu, D Yoon - Applied Sciences, 2023 - mdpi.com

… Our results show that the 3D cWGAN is more efficient in enhancing resolution and … cWGAN

and 3D cWGAN used in this study is illustrated in Figure 2. The generator for the 2D cWGAN …

All 3 versions 


An Effective WGAN-Based Anomaly Detection Model for IoT Multivariate Time Series

S Qi, J Chen, P Chen, P Wen, W Shan… - Pacific-Asia Conference on …, 2023 - Springer

… a model named Wasserstein-GAN with gradient Penalty and effective Scoring (WPS). In this

model, Wasserstein Distance … We address this issue by using Wasserstein distance with GP. …

 Cited by 1 Related articles All 2 versions

2023

A novel adversarial example generation algorithm based on WGAN-Unet

T Yao, J Fan, Z Qin, L Chen - International Conference on …, 2023 - spiedigitallibrary.org

The security of deep neural networks has triggered extensive research on the adversarial

example. The gradient or optimization-based adversarial example generation algorithm has …

Related articles All 3 versions


Low-count PET image reconstruction algorithm based on WGAN-GP

R Fang, R Guo, M Zhao, M Yao - Proceedings of the 2023 3rd …, 2023 - dl.acm.org

… based on the gradient penalized Wasserstein Generative Adversarial Network (WGAN-GP)

[… network based on the architecture of WGANGP, where the reconstruction network based on …



[PDF] openreview.net

Convergence of Generative Deep Linear Networks Trained with Bures-Wasserstein Loss

P Bréchet, K Papagiannouli, J An, G Montufar - 2023 - openreview.net

… of the Bures-Wasserstein loss when the matrices drop rank, we … • For the smooth Bures-Wasserstein

loss, in Theorem 5.6 we … • For the Bures-Wasserstein loss and its smooth version, in …



[PDF] arxiv.org

Optimal control of the Fokker-Planck equation under state constraints in the Wasserstein space

S Daudin - Journal de Mathématiques Pures et Appliquées, 2023 - Elsevier

… of the Fokker-Planck equation with state constraints in the Wasserstein space of probability

… -field game system of partial differential equations associated with an exclusion condition. …

Cited by 8 Related articles All 3 versions



[HTML] mdpi.com

[HTML] Wasserstein Distance-Based Deep Leakage from Gradients

Z Wang, C Peng, X He, W Tan - Entropy, 2023 - mdpi.com

… on DLG, which uses the Wasserstein distance to measure the distance between the …

Wasserstein distance; the analysis results show that Wasserstein distance substitution for Euclidean …

Related articles All 7 versions 

MR4603060

Cited by 1 Related articles All 9 versions 

<–—2023———2023———1020—


[PDF] arxiv.org

Isometric rigidity of the Wasserstein space  over Carnot groups

ZM Balogh, T Titkos, D Virosztek - arXiv preprint arXiv:2305.05492, 2023 - arxiv.org

… (4.5) Here, and in the sequel we use the same notation || · || for the Euclidean norm of vectors

in various Euclidean spaces of (possibly) different dimensions. To continue the proof let us …

All 2 versions 



arXiv:2306.07176
[pdf, other] cs.LG math.OC
Unbalanced Optimal Transport meets Sliced-Wasserstein
Authors: Thibault Séjourné, Clément Bonet, Kilian Fatras, Kimia Nadjahi, Nicolas Courty
Abstract: Optimal transport (OT) has emerged as a powerful framework to compare probability measures, a fundamental task in many statistical and machine learning problems. Substantial advances have been made over the last decade in designing OT variants which are either computationally and statistically more efficient, or more robust to the measures and datasets to compare. Among them, sliced OT distances h…  More
Submitted 12 June, 2023; originally announced June 2023.



arXiv:2306.04878 [pdf, other] quant-ph
Quantum Wasserstein distance between unitary operations
Authors: Xinyu Qiu, Lin Chen
Abstract: Quantifying the effect of noise on unitary operations is an essential task in quantum information processing. We propose the quantum Wasserstein distance between unitary operations, which shows an explanation for quantum circuit complexity and characterizes local distinguishability of multi-qudit operations. We show analytical calculation of the distance between identity and widely-used quantum ga…  More
Submitted 7 June, 2023; originally announced June 2023.



arXiv:2306.04375 [pdf, ps, other] stat.ML cs.LG
Learning via Wasserstein-Based High Probability Generalisation Bounds
Authors: Paul Viallard, Maxime Haddouche, Umut Simsekli, Benjamin Guedj
Abstract: Minimising upper bounds on the population risk or the generalisation gap has been widely used in structural risk minimisation (SRM) - this is in particular at the core of PAC-Bayesian learning. Despite its successes and unfailing surge of interest in recent years, a limitation of the PAC-Bayesian framework is that most bounds involve a Kullback-Leibler (KL) divergence term (or its variations), whi…  More
Submitted 7 June, 2023; originally announced June 2023.

Cited by 6 Related articles All 18 versions

MR4599319 Prelim Bartl, Daniel; Wiesel, Johannes; Sensitivity of Multiperiod Optimization Problems with Respect to the Adapted Wasserstein Distance. SIAM J. Financial Math. 14 (2023), no. 2, 704–720. 91G10 (90C15)

Review PDF Clipboard Journal Article

Zbl 07707121

2023


2023 see 2021

MR4599250 Prelim Mathey-Prevot, Maxime; Valette, Alain; Wasserstein distance and metric trees. Enseign. Math. 69 (2023), no. 3-4, 315–333.

Review PDF Clipboard Journal Article


2023 see 2022

A SMOOTH VARIATIONAL PRINCIPLE ON WASSERSTEIN SPACE

Bayraktar, E; Ekren, I and Zhang, X

May 2023 (Early Access) | 

PROCEEDINGS OF THE AMERICAN MATHEMATICAL SOCIETY

In this note, we provide a smooth variational principle on Wasserstein space by constructing a smooth gauge-type function using the sliced Wasserstein distance. This function is a crucial tool for optimization problems and in viscosity theory of PDEs on Wasserstein space.

Free Submitted Article From Repository. View full text

8 References Related records

MR4607651


Working Paper
MonoFlow: Rethinking Divergence GANs via the Perspective of Wasserstein Gradient Flows
Yi, Mingxuan; Zhu, Zhanxing; Liu, Song.
 arXiv.org; Ithaca, Jun 9, 2023.
Working Paper
Quantum Wasserstein distance between unitary operations
Qiu, Xinyu; Chen, Lin.
 arXiv.org; Ithaca, Jun 8, 2023.
Cited by 3
 Related articles All 6 versions 


Working Paper
Learning with symmetric positive definite matrices via generalized Bures-Wasserstein geometry
Han, Andi; Mishra, Bamdev; Jawanpuria, Pratik; Gao, Junbin.
 arXiv.org; Ithaca, Jun 8, 2023.

<–—2023———2023———1030—



2023 see 2022. Working Paper
Viability and Exponentially Stable Trajectories for Differential Inclusions in Wasserstein Spaces
Bonnet, Benoît; Frankowska, Hélène.
 arXiv.org; Ithaca, Jun 5, 2023.
 

 
2023 see 2022. Scholarly Journal
DoS attack traffic detection based on feature optimization extraction and DPSA-WGAN
Ma, Wengang; Liu, Ruiqi; Guo, Jin.
 Applied Intelligence; Boston Vol. 53, Iss. 11,  (Jun 2023): 13924-13955.
 

2023 see 2022. ARTICLE

Global Wasserstein Margin maximization for boosting generalization in adversarial training

Yu, Tingyue ; Wang, Shen ; Yu, Xiangzhan; New York: Springer US

Applied intelligence (Dordrecht, Netherlands), 2023, Vol.53 (10), p.11490-11504

PEER REVIEWED


  

ARTICLE

Source-Independent Full-Waveform Inversion Based on Convolutional Wasserstein Distance Objective Function

Jiang, Shuqi ; Chen, Hanming ; Li, Honghui ; Zhou, Hui ; Wang, Lingqian ; Zhang, Mingkun ; Jiang, Chuntao; New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)

IEEE transactions on geoscience and remote sensing, 2023, Vol.61, p.1-14

PEER REVIEWED

BrowZine PDF Icon Download PDF 

BrowZine Book Icon View Issue Contents 

Related articles All 2 versions

 

2023 see 2021. ARTICLE

 Conditional Wasserstein generative adversarial networks applied to acoustic metamaterial design

Lai, Peter ; Amirkulova, Feruza ; Gerstoft, Peter; United States

The Journal of the Acoustical Society of America, 2021, Vol.150 (6), p.4362-4374

PEER REVIEWED

BrowZine PDF Icon Download PDF 

BrowZine Book Icon View Issue Contents 


2023


ARTICLE

Wasserstein GAN-based Digital Twin Inspired Model for Early Drift Fault Detection in Wireless Sensor Networks

Hasan, Md. Nazmul ; Jan, Sana Ullah ; Koo, Insoo

IEEE sensors journal, 2023, p.1-1

PEER REVIEWED

BrowZine PDF Icon Download PDF 

BrowZine Book Icon View Issue Contents 

Cited by 9 Related articles All 4 versions

2023 see 2022

Bayesian Optimization in Wasserstein Spaces.

Bibliographic details on Bayesian Optimization in Wasserstein Spaces. ... type: Conference or Workshop Paper. metadata version: 2023-02-14. view.

BOOK CHAPTER

Bayesian Optimization in Wasserstein Spaces

Simos, Dimitris E ; Rasskazova, Varvara A ; Archetti, Francesco ; Kotsireas, Ilias S ; Pardalos, Panos M; Switzerland: Springer International Publishing AG

Learning and Intelligent Optimization, 2023, Vol.13621, p.248-262

PEER REVIEWED



2023 see 2022. ARTICLE

On the use of Wasserstein distance in the distributional analysis of human decision making under uncertainty

Candelieri, Antonio ; Ponti, Andrea ; Giordani, Ilaria ; Archetti, Francesco; Cham: Springer International Publishing

Annals of mathematics and artificial intelligence, 2023, Vol.91 (2-3), p.217-238

PEER REVIEWED

OPEN ACCESS

BrowZine PDF Icon Download PDF 

BrowZine Book Icon View Issue Contents 

Zbl 07709590

ARTICLE

An Intelligent Method for Early Motor Bearing Fault Diagnosis Based on Wasserstein Distance Generative Adversarial Networks Meta Learning

Luo, Peien ; Yin, Zhonggang ; Yuan, Dongsheng ; Gao, Fengtao ; Liu, Jing

IEEE transactions on instrumentation and measurement, 2023, p.1-1

PEER REVIEWED

BrowZine PDF Icon Download PDF 

BrowZine Book Icon View Issue Contents 

Cited by 14 Related articles All 2 versions

 

2023 see 2022. ARTICLE

A novel conditional weighting transfer Wasserstein auto-encoder for rolling bearing fault diagnosis with multi-source domains

Zhao, Ke ; Jia, Feng ; Shao, Haidong; Elsevier B.V

Knowledge-based systems, 2023, Vol.262, p.110203

PEER REVIEWED

BrowZine PDF Icon Download PDF 

BrowZine Book Icon View Issue Contents 

<–—2023———2023———1040—


2023 see 2022. ARTICLE

Exact weights, path metrics, and algebraic Wasserstein distances

Bubenik, Peter ; Scott, Jonathan ; Stanley, Donald; Cham: Springer International Publishing

Journal of applied and computational topology, 2023, Vol.7 (2), p.185-219

PEER REVIEWED

Zbl 07716161


2023 see 2022. ARTICLE

HackGAN: Harmonious Cross-Network Mapping Using CycleGAN With Wasserstein-Procrustes Learning for Unsupervised Network Alignment

Yang, Linyao ; Wang, Xiao ; Zhang, Jun ; Yang, Jun ; Xu, Yancai ; Hou, Jiachen ; Xin, Kejun ; Wang, Fei-Yue; Piscataway: IEEE

IEEE transactions on computational social systems, 2023, Vol.10 (2), p.1-14

PEER REVIEWED

BrowZine PDF Icon Download PDF 

BrowZine Book Icon View Issue Contents 


2023 see 2022. ARTICLE

Fast convergence of empirical barycenters in Alexandrov spaces and the Wasserstein space

Le Gouic, Thibaut ; Paris, Quentin ; Rigollet, Philippe ; Stromme, Austin J.

Journal of the European Mathematical Society : JEMS, 2023, Vol.25 (6), p.2229-2250

PEER REVIEWED

BrowZine PDF Icon Download PDF 

Zbl 07714611


2023 see 2022. ARTICLE

Wasserstein-Type Distances of Two-Type Continuous-State Branching Processes in Lévy Random Environments

Chen, Shukai ; Fang, Rongjuan ; Zheng, Xiangqi

Journal of theoretical probability, 2022

PEER REVIEWED

OPEN ACCESS

BrowZine PDF Icon Download PDF 

BrowZine Book Icon View Issue Contents 

MR4621076 

 

ARTICLE

Network intrusion detection based on conditional Wasserstein variational autoencoder with generative adversarial network and one-dimensional convolutional neural networks

He, Jiaxing ; Wang, Xiaodan ; Song, Yafei ; Xiang, Qian ; Chen, Chen; New York: Springer US

Applied intelligence (Dordrecht, Netherlands), 2023, Vol.53 (10), p.12416-12436

PEER REVIEWED

OPEN ACCESS

BrowZine PDF Icon Download PDF 

BrowZine Book Icon View Issue Contents 


2023


ARTICLE

Sample generation based on a supervised Wasserstein Generative Adversarial Network for high-resolution remote-sensing scene classification

Han, Wei ; Wang, Lizhe ; Feng, Ruyi ; Gao, Lang ; Chen, Xiaodao ; Deng, Ze ; Chen, Jia ; Liu, Peng; Elsevier Inc

Information sciences, 2020, Vol.539, p.177-194, Article 177

PEER REVIEWED

BrowZine PDF Icon Download PDF 

BrowZine Book Icon View Issue Contents 

 

CONFERENCE PROCEEDING

Conservative Wasserstein Training for Pose Estimation

Liu, Xiaofeng ; Zou, Yang ; Che, Tong ; Jia, Ping ; Ding, Peng ; You, Jane ; Kumar, B. V. K. Vijaya; Piscataway: IEEE

2019 IEEE/CVF International Conference on Computer Vision (ICCV), 2019, p.8261-8271

OPEN ACCESS


ARTICLE

Fault detection and diagnosis for liquid rocket engines with sample imbalance based on Wasserstein generative adversarial nets and multilayer perceptron

Deng, Lingzhi ; Cheng, Yuqiang ; Yang, Shuming ; Wu, Jianjun ; Shi, Yehui; London, England: SAGE Publications

Proceedings of the Institution of Mechanical Engineers. Part G, Journal of aerospace engineering, 2023, Vol.237 (8), p.1751-1763

PEER REVIEWED


ARTICLE

Wasserstein距离的在线机器学习算法研究

Li, Zhaoen ; Zhang, Zhihai

SCIENTIA SINICA Technologica, 2023

PEER REVIEWED

[Chinese. Research on Online Machine Learning Algorithm Based on Wasserstein Distance]


CONFERENCE PROCEEDING

DWGAN: a dual Wasserstein-autoencoder-based generative adversarial network for image super-resolution

Gao, Hongfan ; Zeng, Zhenbing; SPIE

2023

<–—2023———2023———1050—



ARTICLE

A Modified Gradient Method for Distributionally Robust Logistic Regression over the Wasserstein Ball

Wang, Luyun ; Zhou, Bo; Basel: MDPI AG

Mathematics (Basel), 2023, Vol.11 (11), p.2431

PEER REVIEWED

OPEN ACCESS

BrowZine PDF Icon Download PDF 

BrowZine Book Icon View Issue Contents 

Related articles All 5 versions

 

ARTICLE

Intrusion Detection in Networks by Wasserstein Enabled Many-Objective Evolutionary Algorithms

Ponti, Andrea ; Candelieri, Antonio ; Giordani, Ilaria ; Archetti, Francesco; Basel: MDPI AG

Mathematics (Basel), 2023, Vol.11 (10), p.2342

PEER REVIEWED

OPEN ACCESS

BrowZine PDF Icon Download PDF 

BrowZine Book Icon View Issue Contents 


 

ARTICLE

On potentials of regularized Wasserstein generative adversarial networks for realistic hallucination of tiny faces

Shao, Wen-Ze ; Xu, Jing-Jing ; Chen, Long ; Ge, Qi ; Wang, Li-Qian ; Bao, Bing-Kun ; Li, Hai-Bo; Elsevier B.V

Neurocomputing (Amsterdam), 2019, Vol.364, p.1-15, Article 1

PEER REVIEWED

BrowZine PDF Icon Download PDF 

BrowZine Book Icon View Issue Contents 


 

ARTICLE

Short communication: The Wasserstein distance as a dissimilarity metric for comparing detrital age spectra and other geological distributions

Lipp, Alex ; Vermeesch, Pieter

Geochronology (Göttingen. Online), 2023, Vol.5 (1), p.263-270

PEER REVIEWED

BrowZine Article Link Icon Read Article (via Unpaywall) 

 

ARTICLE

Wasserstein GAN-gradient penalty with deep transfer learning based Alzheimer disease classification on 3D MRI scans

Narasimha, Rao Thota ; Vasumathi, D.; Nagercoil: iManager Publications

i-manager's Journal on Image Processing, 2022, Vol.9 (4), p.9


2023


ARTICLE

Data augmentation-based conditional Wasserstein generative adversarial network-gradient penalty for XSS attack detection system

Mokbal, Fawaz Mahiuob Mohammed ; Wang, Dan ; Wang, Xiaoxi ; Fu, Lihua; United States: PeerJ. Ltd

PeerJ. Computer science, 2020, Vol.6, p.e328-e328

PEER REVIEWED

OPEN ACCESS

BrowZine PDF Icon Download PDF (via Unpaywall) 


  

DISSERTATION

Semidefinite Programming Relaxations of the Simplified Wasserstein Barycenter Problem: An ADMM Approach

Cheng, Jiahui; University of Waterloo

2023

OPEN ACCESS

 

ARTICLE

A note on the Bures-Wasserstein metric

Mohan, Shravan

2023

OPEN ACCESS



ARTICLE

Extending the Wasserstein metric to positive measures

Leblanc, Hugo ; Gouic, Thibaut Le ; Liandrat, Jacques ; Tournus, Magali

2023

OPEN ACCESS

 

ARTICLE

Parameterized Wasserstein Hamiltonian Flow

Wu, Hao ; Liu, Shu ; Ye, Xiaojing ; Zhou, Haomin; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

<–—2023———2023———1060—



BOOK CHAPTER

Gromov–Wasserstein Transfer Operators

Calatroni, Luca ; Donatelli, Marco ; Morigi, Serena ; Prato, Marco ; Santacesaria, Matteo; Switzerland: Springer International Publishing AG

Scale Space and Variational Methods in Computer Vision, 2023, Vol.14009


ARTICLE

Entropic Gromov-Wasserstein Distances: Stability, Algorithms, and Distributional Limits

Rioux, Gabriel ; Goldfeld, Ziv ; Kato, Kengo

2023

OPEN ACCESS

All 2 versions View as HTML
Entropic Gromov-Wasserstein Distances: Stability, Algorithms, and Distributional Limits

by Rioux, Gabriel; Goldfeld, Ziv; Kato, Kengo

arXiv.org, 05/2023

The Gromov-Wasserstein (GW) distance quantifies discrepancy between metric measure spaces, but suffers from computational hardness. The entropic...

Paper  Full Text Online

Open Access


ARTICLE

On the Wasserstein distance and Dobrushin's uniqueness theorem

Dorlas, Tony C ; Savoie, Baptiste

2023

OPEN ACCESS

On the Wasserstein distance and Dobrushin's uniqueness theorem

by Dorlas, Tony C; Savoie, Baptiste

arXiv.org, 05/2023

In this paper, we revisit Dobrushin's uniqueness theorem for Gibbs measures of lattice systems of interacting particles at thermal equilibrium. In a nutshell,...

Paper  Full Text Online


ARTICLE

Wasserstein contraction and spectral gap of slice sampling revisited

Schär, Philip

2023

OPEN ACCESS

 Wasserstein contraction and spectral gap of slice sampling revisited

by Schär, Philip

arXiv.org, 05/2023

We propose a new class of Markov chain Monte Carlo methods, called \(k\)-polar slice sampling (\(k\)-PSS), as a technical tool that interpolates between and...

Paper  Full Text Online

Open Access


ARTICLE

A Lagrangian approach to totally dissipative evolutions in Wasserstein spaces

Cavagnari, Giulia ; Savaré, Giuseppe ; Sodini, Giacomo Enrico

2023

OPEN ACCESS

 Cited by 1 Related articles All 5 versions 

 2023

2023 see 2022. ARTICLE

Gromov-Wasserstein Autoencoders

Nakagawa, Nao ; Togo, Ren ; Ogawa, Takahiro ; Haseyama, Miki; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS


2023 see 2022. ARTICLE

Hierarchical Sliced Wasserstein Distance

Nguyen, Khai ; Ren, Tongzheng ; Nguyen, Huy ; Rout, Litu ; Nguyen, Tan ; Ho, Nhat; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS


ARTICLE

Spherical Sliced-Wasserstein

Bonet, Clément ; Berg, Paul ; Courty, Nicolas ; Septier, François ; Lucas Drumetz ; Minh-Tan, Pham; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS


ARTICLE

Weak log-majorization between the geometric and Wasserstein means

Gan, Luyining ; Kim, Sejong

2023

OPEN ACCESS

Cited by 1 Related articles All 3 versions

ARTICLE

An innovative generative information steganography method based on Wasserstein GAN

Cui, Jianming ; Yu, Xi ; Liu, Ming; Jiaozuo: Henan Polytechnic University

Henan Ligong Daxue Xuebao. Ziran Kexue Ban = Journal of Henan Polytechnic University. Natural Science, 2023, Vol.42 (3), p.146

<–—2023———2023———1070—e



ARTICLE

Energy-Based Sliced Wasserstein Distance

Nguyen, Khai ; Ho, Nhat; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

 

ARTICLE

Tree-Based Diffusion Schrödinger Bridge with Applications to Wasserstein Barycenters

Noble, Maxence ; De Bortoli, Valentin ; Doucet, Arnaud ; Durmus, Alain

2023

OPEN ACCESS

Tree-Based Diffusion Schrödinger Bridge with Applications to Wasserstein Barycenters

by Noble, Maxence; De Bortoli, Valentin; Doucet, Arnaud ; More...

05/2023

Multi-marginal Optimal Transport (mOT), a generalization of OT, aims at minimizing the integral of a cost function with respect to a distribution with some...

Journal Article  Full Text Online

Open Access

ARTICLE

Isometric rigidity of the Wasserstein space $\mathcal{W}_1(\mathbf{G})$ over Carnot groups

Balogh, Zoltán M ; Titkos, Tamás ; Virosztek, Dániel

 

2023. ARTICLE

WSFE: Wasserstein Sub-graph Feature Encoder for Effective User Segmentation in Collaborative Filtering

Chen, Yankai ; Zhang, Yifei ; Yang, Menglin ; Song, Zixing ; Ma, Chen ; King, Irwin

OPEN ACCESS

Cited by 18 Related articles All 4 versions

ARTICLE

Wasserstein Convergence for Empirical Measures of Subordinated Fractional Brownian Motions on the Flat Torus

Li, Huaiqian ; Wu, Bingyao

2023

OPEN ACCESS


2023


ARTICLE

Vector Quantized Wasserstein Auto-Encoder

Tung-Long Vuong ; Le, Trung ; Zhao, He ; Zheng, Chuanxia ; Harandi, Mehrtash ; Cai, Jianfei ; Dinh Phung; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

 

2023. ARTICLE

Parameter estimation from aggregate observations: A Wasserstein distance based sequential Monte Carlo sampler

Cheng, Chen ; Wen, Linjie ; Li, Jinglai

OPEN ACCESS

 Parameter estimation from aggregate observations: A Wasserstein distance based sequential Monte Carlo sampler

by Chen, Cheng; Wen, Linjie; Li, Jinglai

arXiv.org, 05/2023

In this work we study systems consisting of a group of moving particles. In such systems, often some important parameters are unknown and have to be estimated...

Paper  Full Text Online. Open Access. Related articles All 8 versions

ARTICLE

Cutoff ergodicity bounds in Wasserstein distance for a viscous energy shell model with Lévy noise

Barrera, Gerardo ; Högele, Michael A ; Pardo, Juan Carlos ; Pavlyukevich, Ilya

2023

OPEN ACCESS


ARTICLE

Wasserstein-$1$ distance between SDEs driven by Brownian motion and stable processes

Deng, Changsong ; Schilling, Rene L ; Xu, Lihu

2023

OPEN ACCESS


ARTICLE

MonoFlow: Rethinking Divergence GANs via the Perspective of Wasserstein Gradient Flows

Yi, Mingxuan ; Zhu, Zhanxing ; Liu, Song

2023

OPEN ACCESS

<–—2023———2023———1080—e


ARTICLE

On Excess Mass Behavior in Gaussian Mixture Models with Orlicz-Wasserstein Distances

Guha, Aritra ; Ho, Nhat ; Nguyen, XuanLong

2023

OPEN ACCESS

Cited by 3 Related articles All 7 versions 


2023 see 2022. ARTICLE

Discrete Langevin Sampler via Wasserstein Gradient Flow

Sun, Haoran ; Dai, Hanjun ; Dai, Bo ; Zhou, Haomin ; Schuurmans, Dale

arXiv.org, 2023

OPEN ACCESS


2023 see 2022 ARTICLE

Wasserstein Logistic Regression with Mixed Features

Aras Selvi ; Belbasi, Mohammad Reza ; Haugh, Martin B ; Wiesemann, Wolfram

arXiv.org, 2023

OPEN ACCESS


2023 see 2022 ARTICLE

Wasserstein Iterative Networks for Barycenter Estimation

Korotin, Alexander ; Egiazarian, Vage ; Li, Lingxiao ; Burnaev, Evgeny

arXiv.org, 2023

OPEN ACCESS

 

BOOK CHAPTER

A Wasserstein GAN for Joint Learning of Inpainting and Spatial Optimisation

Peter, Pascal

Image and Video Technology, 2023, Vol.13763, p.132-145


2023

 

Tree-Based Diffusion Schrödinger Bridge with Applications ...

ARTICLE

Tree-Based Diffusion Schrödinger Bridge with Applications to Wasserstein Barycenters

Noble, Maxence ; De Bortoli, Valentin ; Doucet, Arnaud ; Durmus, Alain

arXiv.org, 2023

OPEN ACCESS


 

Improving Neural Topic Models with Wasserstein ...

BOOK CHAPTER

Improving Neural Topic Models with Wasserstein Knowledge Distillation

Kamps, Jaap ; Goeuriot, Lorraine ; Crestani, Fabio ; Maistro, Maria ; Joho, Hideo ; Davis, Brian ; Gurrin, Cathal ; Kruschwitz, Udo ; Caputo, Annalina

Advances in Information Retrieval, 2023, Vol.13981



Wasserstein Gradient Flows for Optimizing Gaussian ...

ARTICLE

Wasserstein Gradient Flows for Optimizing Gaussian Mixture Policies

Ziesche, Hanna ; Rozo, Leonel; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

 

2023 see 2022. ARTICLE

Wasserstein Steepest Descent Flows of Discrepancies with Riesz Kernels

Hertrich, Johannes ; Gräf, Manuel ; Beinert, Robert ; Steidl, Gabriele; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

arXiv [v3] Mon, 16 Jan 2023 09:39:10 UTC (6,014 KB)


2023 see 2022. ARTICLE

Quantitative Stability of Barycenters in the Wasserstein Space

Carlier, Guillaume ; Delalande, Alex ; Merigot, Quentin; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

Doubly Regularized Entropic Wasserstein Barycenters

L Chizat - arXiv preprint arXiv:2303.11844, 2023 - arxiv.org

… Second, we show that for λ,τ > 0, the barycenter has a smooth density and is strongly stable under perturbation of the marginals. In particular, it can be estimated efficiently: given n …

Save Cite Cited by 1 All 2 versions 

<–—2023———2023———1090—



ARTICLE

Wasserstein geometry and Ricci curvature bounds for Poisson spaces

Lorenzo Dello Schiavo ; Herry, Ronan ; Suzuki, Kohei; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

[PDF] arxiv.org

Wasserstein geometry and Ricci curvature bounds for Poisson spaces

LD Schiavo, R Herry, K Suzuki - arXiv preprint arXiv:2303.00398, 2023 - arxiv.org

Let $\varUpsilon$ be the configuration space over a complete and separable metric base space, endowed with the Poisson measure $\pi$. We study the geometry of $\varUpsilon$ from the point of view of optimal transport and Ricci-lower bounds. To do so, we define a formal Riemannian structure on $\mathscr{P}_1(\varUpsilon)$, the space of probability measures over $\varUpsilon$ with finite first moment, and we construct an extended distance $\mathcal{W}$ on $\mathscr{P}_1(\varUpsilon)$. The distance $\mathcal{W}$ …

Cited by 1 All 4 versions 

 

2023 see 2021. ARTICLE

Internal Wasserstein Distance for Adversarial Attack and Defense

Wang, Qicheng ; Zhang, Shuhai ; Cao, Jiezhang ; Li, Jincheng ; Tan, Mingkui ; Yang, Xiang; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

arXiv PDF [v4] Tue, 21 Feb 2023 02:15:45 UTC (3,842 KB)

 

ARTICLE

Multivariate stable approximation in Wasserstein distance by Stein's method

Chen, Peng ; Nourdin, Ivan ; Xu, Lihu ; Yang, Xiaochuan; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

Multivariate Stable Approximation by Stein's Method

2023 see 2021. ARTICLE

The Unbalanced Gromov Wasserstein Distance: Conic Formulation and Relaxation

Thibault Séjourné ; Vialard, François-Xavier ; Peyré, Gabriel; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

arXiv PDF 16 Jan 2023 13:06:12 UTC (6,059 KB)

  

2023 see 2022. ARTICLE

Kantorovich Strikes Back! Wasserstein GANs are not Optimal Transport?

Korotin, Alexander ; Kolesov, Alexander ; Burnaev, Evgeny; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

arXiv PDF  [v2] Mon, 9 Jan 2023 09:40:46 UTC (3,263 KB)


 2023

 


ARTICLE

Learning with symmetric positive definite matrices via generalized Bures-Wasserstein geometry

Han, Andi ; Mishra, Bamdev ; Jawanpuria, Pratik ; Gao, Junbin; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

Learning with symmetric positive definite matrices via...


ARTICLE

Viability and Exponentially Stable Trajectories for Differential Inclusions in Wasserstein Spaces

Bonnet, Benoît ; Frankowska, Hélène; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

 Viability and Exponentially Stable Trajectories for...


2023 see 2021. ARTICLE

Controlling Wasserstein Distances by Kernel Norms with Application to Compressive Statistical Learning

Vayer, Titouan ; Gribonval, Rémi; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

Controlling Wasserstein Distances by Kernel Norms...

  


ARTICLE

Isometric rigidity of the Wasserstein space \(\mathcal{W}_1(\mathbf{G})\) over Carnot groups

Balogh, Zoltán M ; Titkos, Tamás ; Virosztek, Dániel; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

Isometric rigidity of the Wasserstein space $\mathcal{W}_1(\mathbf{G})$ over Carnot groups


2023 see 2022. ARTICLE

Wasserstein multivariate auto-regressive models for modeling distributional time series and its application in graph learning

Jiang, Yiye; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

arXiv. PDF [v2] Sat, 6 May 2023 19:39:32 UTC (12,820 KB)

<–—2023———2023———1100—



ARTICLE

Wasserstein Graph Distance Based on \(L_1\)-Approximated Tree Edit Distance between Weisfeiler-Lehman Subtrees

Fang, Zhongxi ; Huang, Jianming ; Su, Xun ; Kasai, Hiroyuki; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

 Wasserstein Graph Distance Based on \(L_1\)-Approximated Tree Edit Distance between Weisfeiler-Lehman...

 

2023 see 2021. ARTICLE

Averaging on the Bures-Wasserstein manifold: dimension-free convergence of gradient descent

Altschuler, Jason M ; Chewi, Sinho ; Gerber, Patrik ; Stromme, Austin J; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

Averaging on the Bures-Wasserstein manifold: dimension-free convergence of gradient descent 

 


2023 see 2022. ARTICLE

Estimation and inference for the Wasserstein distance between mixing measures in topic models

Xin Bing ; Bunea, Florentina ; Niles-Weed, Jonathan; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

arXiv. PDF Fri, 17 Mar 2023 18:53:37 UTC (1,583 KB)

 


2023 see 2021. ARTICLE

Low Budget Active Learning via Wasserstein Distance: An Integer Programming Approach

Mahmood, Rafid ; Fidler, Sanja ; Law, Marc T; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

arXiv. PDF. [v4] Tue, 7 Mar 2023 00:09:11 UTC (26,823 KB)

 

ARTICLE

Cutoff ergodicity bounds in Wasserstein distance for a viscous energy shell model with Lévy noise

Barrera, Gerardo ; Högele, Michael A ; Pardo, Juan Carlos ; Pavlyukevich, Ilya; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

Cutoff ergodicity bounds in arXiv

Cutoff ergodicity bounds in journal


2023



ARTICLE

Wasserstein-Kelly Portfolios: A Robust Data-Driven Solution to Optimize Portfolio Growth

Jonathan Yu-Meng Li; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

Wasserstein-Kelly Portfolios:   journal

Wasserstein-Kelly Portfolios: arxiv

Cited by 2 Related articles All 5 versions 

ARTICLE

Barycenter Estimation of Positive Semi-Definite Matrices with Bures-Wasserstein Distance

Zheng, Jingyi ; Huang, Huajun ; Yi, Yuyan ; Li, Yuexin ; Shu-Chin, Lin; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

Barycenter Estimation of Positive Semi-Definite Matrices with Bures-Wasserstein...   arXiv


2023 see 2022. ARTICLE

A Wasserstein distance-based spectral clustering method for transaction data analysis

Zhu, Yingqiu ; Huang, Danyang ; Zhang, Bo; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

A Wasserstein distance-based spectral clustering method... arXiv

All 2 versions

Related articles All 2 versions 

ARTICLE

Wasserstein-\(1\) distance between SDEs driven by Brownian motion and stable processes

Deng, Changsong ; Schilling, Rene L ; Xu, Lihu; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

Wasserstein-$1$ distance journal


2023 see 2022. 2021. ARTICLE

Linear-Time Gromov Wasserstein Distances using Low Rank Couplings and Costs

Meyer Scetbon ; Peyré, Gabriel ; Cuturi, Marco; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

Linear-Time Gromov … arXiv

<–—2023———2023———1110—



2023 see 2022. ARTICLE

WPPNets and WPPFlows: The Power of Wasserstein Patch Priors for Superresolution

Altekrüger, Fabian ; Hertrich, Johannes; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

WPPNets and WPPFlows: The Power of Wasserstein Patch...

by Altekrüger, Fabian; Hertrich, Johannes

 


ARTICLE

Robust \(Q\)-learning Algorithm for Markov Decision Processes under Wasserstein Uncertainty

Neufeld, Ariel ; Sester, Julian; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

Robust \(Q\)-learning Algorithm for Markov Decision Processes under Wasserstein Uncertainty



ARTICLE

On the Existence of Monge Maps for the Gromov-Wasserstein Problem

Dumont, Théo ; Lacombe, Théo ; Vialard, François-Xavier; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

On the Existence of Monge Maps for the Gromov-Wasse...

 


ARTICLE

Wasserstein convergence rates in the invariance principle for deterministic dynamical systems

Liu, Zhenxin ; Wang, Zhe; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

Wasserstein convergence rates in the invariance...

 


ARTICLE

Wasserstein distance bounds on the normal approximation of empirical autocovariances and cross-covariances under non-stationarity and stationarity

Anastasiou, Andreas ; Kley, Tobias; Ithaca: Cornell University Library, arXiv.org

arXiv.org, 2023

OPEN ACCESS

Wasserstein distance bounds on the normal approximation of empirical autocovariances and cross-covariances under... journal

Wasserstein distance bounds on the normal approximation of empirical autocovariances and cross-covariances under... arXiv

  

2023


PATENT

Telecommunication customer loss prediction method based on conditional Wasserstein GAN

XIE XIANZHONG ; WEI LINGLIN ; SU CHANG

2023

OPEN ACCESS

Telecommunication customer loss prediction method based on conditional Wasserstein GAN


PATENT

Wasserstein distance and difference measurement-combined chest radiograph anomaly recognition domain self-adaptive method and Wasserstein distance and difference measurement-combined chest radiograph anomaly recognition domain self-adaptive system

CHEN YUANJIAO ; HE BISHI ; WANG DIAO ; XU ZHE ; CHEN HUI

2023

OPEN ACCESS

Wasserstein distance and difference measurement-combined chest radiograph anomaly recognition domain self-adaptive method and Wasserstein distance and difference measurement-combined chest radiograph anomaly recognition...

  

 

2023. PATENT

Monitoring network pedestrian target association method based on maximum slice Wasserstein measurement

JU LIWEI ; CHEN LIANG ; LI QI ; ZHANG JING

2023

OPEN ACCESS

Monitoring network pedestrian target association method based on maximum slice Wasserstein measurement

 

2023. PATENT

METHOD AND APPARATUS FOR CONDITIONAL DATA GENERATION USING CONDITIONAL WASSERSTEIN GENERATOR

CHO MYUNG HEE ; LEE KYUNG BOK ; KIM YOUNG GEUN

2023

OPEN ACCESS

METHOD AND APPARATUS FOR CONDITIONAL DATA GENERATION USING CONDITIONAL WASSERSTEIN GENERATOR


2023 patent

一种基于Laplace噪声和Wasserstein正则的多试次EEG源成像方法

OPEN ACCESS

一种基于Laplace噪声和Wasserstein正则的多试次EEG源成像方法

 [Chinese A Multi-trial EEG Source Imaging Method Based on Laplace Noise and Wasserstein Regularization]

<–—2023———2023———1120——


ARTICLE

Corrigendum to “Aero-engine high speed bearing fault diagnosis for data imbalance: A sample enhanced diagnostic method based on pre-training WGAN-GP” [Measurement 213 (2023) 112709]

Chen, Jiayu ; Yan, Zitong ; Lin, Cuiyin ; Yao, Boqing ; Ge, Hongjuan; Elsevier Ltd

Measurement : journal of the International Measurement Confederation, 2023, Vol.217, p.113035

PEER REVIEWED

BrowZine PDF Icon Download PDF 

BrowZine Book Icon View Issue Contents 

Corrigendum to “Aero-engine high speed bearing fault diagnosis for data imbalance: A sample enhanced diagnostic method based on...

 Journal

 

ARTICLE

A Multi-Sensor Detection Method Based on WGAN-GP and Attention-Bi-GRU for Well Control Pipeline Defects

Liang, Haibo ; Yang, Ziwei ; Zhang, Zhidong; New York: Springer US

Journal of nondestructive evaluation, 2023, Vol.42 (2)

PEER REVIEWED

BrowZine PDF Icon Download PDF 

BrowZine Book Icon View Issue Contents 

A Multi-Sensor Detection Method Based on WGAN-GP and Attention-Bi-GRU for Well Control Pipeline Defects. Journal pdf

A Multi-Sensor Detection Method Based on WGAN-

2023 see 2022. ARTICLE

Learned pseudo-random number generator: WGAN-GP for generating statistically robust random numbers

Okada, Kiyoshiro ; Endo, Katsuhiro ; Yasuoka, Kenji ; Kurabayashi, Shuichi; San Francisco: Public Library of Science

PloS one, 2023, Vol.18 (6), p.e0287025

PEER REVIEWED

OPEN ACCESS

BrowZine PDF Icon Download PDF 

Learned pseudo-random number generator: WGAN-GP for generating statistically robust random numbers pdf

 

ARTICLE

extendGAN+: Transferable Data Augmentation Framework Using WGAN-GP for Data-Driven Indoor Localisation Model

Yean, Seanglidet ; Goh, Wayne ; Lee, Bu-Sung ; Oh, Hong Lye; Switzerland: MDPI AG

Sensors (Basel, Switzerland), 2023, Vol.23 (9), p.4402

PEER REVIEWED

OPEN ACCESS

BrowZine PDF Icon Download PDF 

BrowZine Book Icon View Issue Contents 

extendGAN+: Transferable Data Augmentation Framework Using WGAN-GP for Data-Driven Indoor Localisation Model

All 8 versions


DATASET

Checkpoints for Morphological Classification of Radio Galaxies with wGAN-supported Augmentation

Griese, Florian ; Kummer, Janis ; Rustige, Lennart; Zenodo

2023

OPEN ACCESS

Checkpoints for Morphological Classification of Radio Galaxies ...

 


2023



CONFERENCE PROCEEDING

A novel adversarial example generation algorithm based on WGAN-Unet

Yao, Tian ; Fan, Jiarong ; Qin, Zhongyuan ; Chen, Liquan; SPIE

2023

A Novel Physical Layer Key Generation Method Based on WGAN-GP Adversarial Autoencoder

 

ARTICLE

A New Framework of Quantitative analysis Based on WGAN

Jiang, Xingru ; Jiang, Kaiwen; Les Ulis: EDP Sciences

SHS Web of Conferences, 2023, Vol.165, p.1018

PEER REVIEWED

OPEN ACCESS

BrowZine PDF Icon Download PDF 

BrowZine Book Icon View Issue Contents 

A New Framework of Quantitative analysis Based on WGAN
All 2 versions


ARTICLE

ResNet-WGAN Based End-to-End Learning for IoV Communication with Unknown Channels

Zhao, Junhui ; Mu, Huiqin ; Zhang, Qingmiao ; Zhang, Huan

IEEE internet of things journal, 2023, p.1-1

BrowZine PDF Icon Download PDF 

 

 

ARTICLE

Detection of False Data Injection Attack in Smart Grid Based on WGAN State Reconfiguration


Modeling and Simulation, 2023, Vol.12 (3), p.2182-2196


BOOK CHAPTER

An Effective WGAN-Based Anomaly Detection Model for IoT Multivariate Time Series

Qi, Sibo ; Chen, Juan ; Chen, Peng ; Wen, Peian ; Shan, Wenyu ; Xiong, Ling; Cham: Springer Nature Switzerland

Advances in Knowledge Discovery and Data Mining, 2023, p.80-91

<–—2023———2023———1130——



2023 PATENT

Adversarial sample generation method and system based on WGAN-Unet

CHEN YUQING ; SUN LEI ; QIN ZHONGYUAN ; YAO TIAN ; ZHANG QUNFANG

2023

OPEN ACCESS

 

2023. PATENT

Transform-WGAN-based vehicle following behavior modeling method

XU DONGWEI ; GAO GUANGYAN ; LI JIANGPENG ; LI CHENGBIN

2023

OPEN ACCESS


2023 PATENT

WGAN-GP and SADNet combined microseismic signal denoising method

SHENG GUANQUN ; MA KAI ; JING TANG ; WANG XIANGYU ; ZHENG YUELIN ; YU MEI

2023

OPEN ACCESS


2023. PATENT

Building photovoltaic data completion method based on WGAN and whale optimization algorithm

CUI LEI ; LI FENG ; YANG YANG ; YIN JIE ; CAO QINGWEI ; LI DONG ; GUO XI ; CAO KENAN

2023

OPEN ACCESS


2023 see 2022. PATENT

Software measurement defect data augmentation method based on VAE and WGAN

GUO ZHAOYANG

2023

OPEN ACCESS


2023



2023 PATENT

一种基于改进WGAN网络的轴承故障诊断方法

2023

OPEN ACCESS

[Chinese. Bearing Fault Diagnosis Method Based on Improved WGAN Network]


 2023 patent
基于改进WGAN的服装属性编辑方法
11/2023
Patent  Available Online

[Chinese. Clothing Attribute Editing Method Based on Improved WGAN]



ansform/WGAN to Solve the Data Imbalance of Brine Pipeline Leakage]

2023. PATENT

基于SVAE-WGAN模型的局部放电故障数据增强处理方法及装置

2023

OPEN ACCESS

[Chinese. Partial discharge fault data enhancement processing method and device based on SVAE-WGAN model]

  

2023. PATENT

基于改进WGAN-GPAlxnet的轴承故障诊断方法

2023

OPEN ACCESS

[Chinese. Bearing Fault Diagnosis Method Based on Improved WGAN-GP and Alxnet]

CN CN115962946A 付文龙 三峡大学

2023. PATENT

一种基于WGAN-GP模型的网格形变数据增强方法

2023

OPEN ACCESS

[Chinese. A Mesh Deformation Data Augmentation Method Based on the WGAN-GP Model]

<–—2023———2023———1140——



2023 patent news. NEWSLETTER ARTICLE

Wuxi Cansonic Medical Science & Tech Seeks Patent for FC-VoVNet and WGAN-Based B Ultrasonic Image Denoising Method

New Delhi: Pedia Content Solutions Pvt. Ltd

Global IP News. Optics & Imaging Patent News, 2023


2023 patent news. NEWSLETTER ARTICLE

Wuxi Cansonic Medical Science & Tech Seeks Patent for FC-VoVNet and WGAN-Based B Ultrasonic Image Denoising Method

New Delhi: Pedia Content Solutions Pvt. Ltd. Global IP News. Optics & Imaging Patent News, 2023

 ... Status: Application Beijing, March 23 -- Wuxi Cansonic Medical Science & Tech has sought patent for FC-VoVNet and WGAN-based B ultrasonic image denoising method...



arXiv:2306.11616  [pdf, other]  math.PR  math-ph
Ergodicity bounds for stable Ornstein-Uhlenbeck systems in Wasserstein distance with applications to cutoff stability
Authors: Gerardo Barrera, Michael A. Högele
Abstract: This article establishes cutoff stability or abrupt thermalization for generic multidimensional Hurwitz stable Ornstein-Uhlenbeck systems with moderate (possibly degenerate) Lévy noise at constant noise intensity. The result is based on several ergodicity bounds which make use of the recently established shift linearity property of the Wasserstein distance by the authors. It covers such irregular…  More
Submitted 20 June, 2023; originally announced June 2023.
Comments: 21 pages, 1 figure
MSC Class: 60H10; 37A25; 37A30;


arXiv:2306.10601  [pdf, other]  stat.ME
Sliced Wasserstein Regression
Authors: Han Chen, Hans-Georg Müller
Abstract: While statistical modeling of distributional data has gained increased attention, the case of multivariate distributions has been somewhat neglected despite its relevance in various applications. This is because the Wasserstein distance that is commonly used in distributional data analysis poses challenges for multivariate distributions. A promising alternative is the sliced Wasserstein distance,…  More
Submitted 18 June, 2023; originally announced June 2023.
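As a quick orientation for this entry, here is a minimal Monte Carlo sketch of the sliced Wasserstein distance between two empirical samples. It only illustrates the random-projection idea, not the regression methodology of Chen and Müller; the sample sizes, number of projections, and seed are illustrative assumptions.

import numpy as np
from scipy.stats import wasserstein_distance  # exact 1-D Wasserstein-1 between samples

def sliced_wasserstein(X, Y, n_projections=200, seed=0):
    # Average the 1-D Wasserstein-1 distance of the projected samples over
    # random directions drawn uniformly on the unit sphere.
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_projections):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)
        total += wasserstein_distance(X @ theta, Y @ theta)
    return total / n_projections

X = np.random.default_rng(1).normal(size=(500, 3))
Y = 0.5 + np.random.default_rng(2).normal(size=(500, 3))
print(sliced_wasserstein(X, Y))

Averaging one-dimensional transport costs over directions is what keeps the sliced distance cheap for multivariate distributions, which is the computational point the abstract alludes to.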


arXiv:2306.10586  [pdf, ps, other]  math.MG  math.OC
The Gromov-Wasserstein distance between spheres
Authors: Shreya Arya, Arnab Auddy, Ranthony Edmonds, Sunhyuk Lim, Facundo Memoli, Daniel Packer
Abstract: In this paper we consider a two-parameter family $\{d_{GW_{p,q}}\}_{p,q}$ of Gromov-Wasserstein distances between metric measure spaces. By exploiting a suitable interaction between specific values of the parameters $p$ and $q$ and the metric of the underlying spaces, we determine the exact value of the distance $d_{GW_{4,2}}$ between all pairs of unit spheres of different dimension endowed with their Euclidean distance…  More
Submitted 18 June, 2023; originally announced June 2023.
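For context only: the familiar one-parameter Gromov-Wasserstein distance between metric measure spaces $(X, d_X, \mu_X)$ and $(Y, d_Y, \mu_Y)$ is, up to normalization conventions,

$$ d_{GW,p}(X,Y) = \tfrac{1}{2}\,\inf_{\mu \in \mathcal{C}(\mu_X,\mu_Y)} \Big( \iint \big| d_X(x,x') - d_Y(y,y') \big|^p \, d\mu(x,y)\, d\mu(x',y') \Big)^{1/p}, $$

where $\mathcal{C}(\mu_X,\mu_Y)$ denotes the set of couplings of $\mu_X$ and $\mu_Y$. The paper's two-parameter family $d_{GW_{p,q}}$ refines how the pairwise distortion is measured; its exact definition should be taken from the paper itself.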


2023


arXiv:2306.10155  [pdf, other]  stat.ML  cs.CY  cs.LG
Fairness in Multi-Task Learning via Wasserstein Barycenters
Authors: François Hu, Philipp Ratz, Arthur Charpentier
Abstract: Algorithmic Fairness is an established field in machine learning that aims to reduce biases in data. Recent advances have proposed various methods to ensure fairness in a univariate environment, where the goal is to de-bias a single task. However, extending fairness to a multi-task setting, where more than one objective is optimised using a shared representation, remains underexplored. To bridge t…  More
Submitted 16 June, 2023; originally announced June 2023.
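Since this entry hinges on Wasserstein barycenters, here is a minimal one-dimensional illustration: for measures on the real line, the quantile function of the Wasserstein-2 barycenter is the weighted average of the input quantile functions. This is a generic sketch, not the multi-task de-biasing procedure of the paper; the quantile grid and weights are illustrative assumptions.

import numpy as np

def barycenter_quantiles_1d(samples, weights, qs=np.linspace(0.01, 0.99, 99)):
    # Quantile function of the W2 barycenter of 1-D empirical distributions:
    # a weighted average of the per-sample quantile functions.
    quantile_curves = np.stack([np.quantile(s, qs) for s in samples])
    return np.average(quantile_curves, axis=0, weights=weights)

a = np.random.default_rng(0).normal(0.0, 1.0, 1000)
b = np.random.default_rng(1).normal(3.0, 2.0, 1000)
print(barycenter_quantiles_1d([a, b], weights=[0.5, 0.5])[:5])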


arXiv:2306.09844  [pdf, other]  cs.LG  cs.CV  math.OC  math.PR
Wasserstein distributional robustness of neural networks
Authors: Xingjian Bai, Guangyi He, Yifan Jiang, Jan Obloj
Abstract: Deep neural networks are known to be vulnerable to adversarial attacks (AA). For an image recognition task, this means that a small perturbation of the original can result in the image being misclassified. Design of such attacks as well as methods of adversarial training against them are subject of intense research. We re-cast the problem using techniques of Wasserstein distributionally robust opt…  More
Submitted 16 June, 2023; originally announced June 2023.
Comments: 23 pages, 6 figures, 8 tables
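As background for the recasting mentioned in the abstract, Wasserstein distributionally robust training typically minimizes a worst-case risk of the form below (a standard formulation, not necessarily the paper's exact one):

$$ \min_{\theta}\; \sup_{Q:\, W_p(Q, \widehat{P}_n) \le \delta}\; \mathbb{E}_{Z \sim Q}\big[\ell(\theta; Z)\big], $$

where $\widehat{P}_n$ is the empirical distribution of the data, $\delta$ the perturbation budget measured in Wasserstein distance, and $\ell$ the training loss.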


arXiv:2306.09836  [pdf, other]  math.OC
Distributionally Robust Airport Ground Holding Problem under Wasserstein Ambiguity Sets
Authors: Haochen Wu, Max Z. Li
Abstract: The airport ground holding problem seeks to minimize flight delay costs due to reductions in the capacity of airports. However, the critical input of future airport capacities is often difficult to predict, presenting a challenging yet realistic setting. Even when capacity predictions provide a distribution of possible capacity scenarios, such distributions may themselves be uncertain (e.g., distr…  More
Submitted 16 June, 2023; originally announced June 2023.
Comments: 18 pages, 9 figures


arXiv:2306.09120  [pdf, ps, other]  math.FA  math.OC
Some Convexity Criteria for Differentiable Functions on the 2-Wasserstein Space
Authors: Guy Parker
Abstract: We show that a differentiable function on the 2-Wasserstein space is geodesically convex if and only if it is also convex along a larger class of curves which we call `acceleration-free'. In particular, the set of acceleration-free curves includes all generalised geodesics. We also show that geodesic convexity can be characterised through first and second-order inequalities involving the Wasserste…  More
Submitted 20 June, 2023; v1 submitted 15 June, 2023; originally announced June 2023.
Comments: Subsection 1.5 added and reference list updated; 18 pages
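For readers new to the terminology in this abstract, geodesic convexity on the 2-Wasserstein space is the standard notion, quoted here in its usual form rather than verbatim from the paper: a functional $F$ is geodesically convex if

$$ F(\mu_t) \le (1-t)\,F(\mu_0) + t\,F(\mu_1) \qquad \text{for all } t \in [0,1], $$

for every constant-speed $W_2$-geodesic $(\mu_t)_{t\in[0,1]}$ joining $\mu_0$ to $\mu_1$; the paper's contribution is to characterize this through the larger class of acceleration-free curves.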


arXiv:2306.08854  [pdf, other]  cs.LG  cs.AI  stat.CO  stat.ML
A Gromov--Wasserstein Geometric View of Spectrum-Preserving Graph Coarsening
Authors: Yifan Chen, Rentian Yao, Yun Yang, Jie Chen
Abstract: Graph coarsening is a technique for solving large-scale graph problems by working on a smaller version of the original graph, and possibly interpolating the results back to the original graph. It has a long history in scientific computing and has recently gained popularity in machine learning, particularly in methods that preserve the graph spectrum. This work studies graph coarsening from a diffe…  More
Submitted 15 June, 2023; originally announced June 2023.
Comments: To appear at ICML 2023. Code is available at https://github.com/ychen-stat-ml/GW-Graph-Coarsening

<–—2023———2023———1150——


 [HTML] mdpi.com

[HTML] A Solar Irradiance Forecasting Framework Based on the CEE-WGAN-LSTM Model

Q Li, D Zhang, K Yan - Sensors, 2023 - mdpi.com

learning framework for solar irradiance forecasting by integrating modern machine learning 

methods, including CEEMDAN, WGAN, … Therefore, this paper uses the WGAN model to train …

Cited by 1 All 8 versions

Solar Irradiance Forecasting Framework Based on the CEE-WGAN-LSTM Model

Library


2023 see 2022  [HTML] plos.org

[HTML] Learned pseudo-random number generator: WGAN-GP for generating statistically robust random numbers

K Okada, K EndoK YasuokaS Kurabayashi - Plos one, 2023 - journals.plos.org

… by MT using the WGAN [46, 47]. We trained the WGAN with the MT random numbers

used as training data and used the WGAN as a model to output the random numbers. The …

Cited by 3 Related articles All 11 versions 
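The "-GP" in this and the neighbouring WGAN-GP entries refers to the gradient penalty of Gulrajani et al. (2017). Below is a minimal PyTorch sketch of that penalty term for flat (batch, features) tensors; the critic, batch shapes, and penalty weight are illustrative assumptions and do not reproduce any of the cited papers' architectures.

import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    # Encourage a 1-Lipschitz critic by penalizing deviations of the gradient
    # norm from 1 on random interpolates between real and fake batches.
    eps = torch.rand(real.size(0), 1, device=real.device)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads, = torch.autograd.grad(outputs=scores.sum(), inputs=interp, create_graph=True)
    return lambda_gp * ((grads.norm(2, dim=1) - 1.0) ** 2).mean()

# Typical use inside the critic update (sketch):
# loss_d = critic(fake).mean() - critic(real).mean() + gradient_penalty(critic, real, fake)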


2023 see 2022

[PDF] infocomm-journal.com

基于 CWGAN-SLM 的多小波 OFDM 系统峰均比抑制算法研究

杨光, 吴朝阳, 聂敏, 闫晓红, 江帆 - 通信学报, 2023 - infocomm-journal.com

条件 Wasserstein 生成对抗网络(CWGAN, conditional Wasserstein generative adversarial

network)[19]相结 ,提出基于CWGAN-SLM PAPR 抑制算法. 其主要思想是利用CWGAN 生成K …

[PDF] infocomm-journal.com

基于 CWGAN-SLM 的多小波 OFDM 系统峰均比抑制算法研究

杨光, 吴朝阳, 聂敏, 闫晓红, 江帆 - 通信学报, 2023 - infocomm-journal.com

条件 Wasserstein 生成对抗网络(CWGAN, conditional Wasserstein generative adversarial

network)[19]相结 ,提出基于CWGAN-SLM PAPR 抑制算法. 其主要思想是利用CWGAN 生成K …

 All 2 versions

 [Chinese. Research on peak-to-average ratio suppression algorithm of multi-wavelet OFDM system based on CWGAN-SLM

Yang Guang, Wu Chaoyang, Nie Min, Yan Xiaohong, Jiang Fan - Journal of Communications, 2023 - infocomm-journal.com

… Conditional Wasserstein generative adversarial network (CWGAN, conditional Wasserstein generative adversarial

network)[19], a PAPR suppression algorithm based on CWGAN-SLM is proposed. The main idea is to use CWGAN to generate K ...]

Related articles All 3 versions


 Research on abnormal detection of gas load based on LSTM-WGAN

X Xu, X Ai, Z Meng - International Conference on Computer …, 2023 - spiedigitallibrary.org

… The anomaly detection model based on LSTM-WGAN proposed in this paper is shown in

Figure 2. The LSTM-WGAN model is divided into two stages of training and testing. …

All 2 versions


[PDF] openreview.net

Neural Wasserstein Gradient Flows for Discrepancies with Riesz Kernels

F AltekrügerJ HertrichG Steidl - 2023 - openreview.net

… scheme for so-called Wasserstein steepest descent flows by … Here we provide analytic

formulas for Wasserstein schemes … We introduce Wasserstein gradient flows and Wasserstein …


2023

 

Markovian Sliced Wasserstein Distances: Beyond Independent Projections

K NguyenT RenN Ho - arXiv preprint arXiv:2301.03749, 2023 - arxiv.org

… For the random walk transition, we use the von Mises-Fisher with the mean as the previous

projecting direction as the conditional distribution. For the orthogonal-based transition, we …

Cited by 7 Related articles All 5 versions 

[HTML] Interactive Guiding Sparse Auto-Encoder with Wasserstein Regularization for Efficient Classification

H Lee, C Hur, B Ibrokhimov, S Kang - Applied Sciences, 2023 - mdpi.com

… The interactive guiding layers keep the main distribution using Wasserstein distance, which 

is a metric of distribution difference, and it suppresses the leverage of guiding features to …



2023 see 2022

LDoS attack traffic detection based on feature optimization extraction and DPSA-WGAN

by Ma, Wengang; Liu, Ruiqi; Guo, Jin

Applied intelligence (Dordrecht, Netherlands), 06/2023, Volume 53, Issue 11

Low-rate Denial of Service (LDoS) attacks cause severe destructiveness to network security. Moreover, they are more difficult to detect because they are more...

ArticleView Article PDF

Journal Article  Full Text Online

View Complete Issue Browse Now

Peer-Reviewed

Cited by 1 Related articles All 2 versions

    

2023 see 2022

Wasserstein Regression

by Chen, Yaqing; Lin, Zhenhua; Müller, Hans-Georg

Journal of the American Statistical Association, 06/2023, Volume 118, Issue 542

The analysis of samples of random objects that do not lie in a vector space is gaining increasing attention in statistics. An important class of such object...

ArticleView Article PDF

Journal Article  Full Text Online

View Complete Issue Browse Now

Peer-Reviewed

    

MeshWGAN: Mesh-to-Mesh Wasserstein GAN With Multi-Task Gradient Penalty for 3D Facial Geometric Age Transformation

by Zhang, Jie; Zhou, Kangneng; Luximon, Yan ; More...

IEEE transactions on visualization and computer graphics, 06/2023, Volume PP

ArticleView Article PDF

Journal Article  Full Text Online

View Complete Issue Browse Now

Peer-Reviewed

<–—2023———2023———1160——



2023 see arXiv

Unbalanced Optimal Transport meets Sliced-Wasserstein

by Séjourné, Thibault; Bonet, Clément; Fatras, Kilian ; More...

06/2023

Optimal transport (OT) has emerged as a powerful framework to compare probability measures, a fundamental task in many statistical and machine learning...

Journal Article  Full Text Online

Open Access

    


2023 see arXiv

Quantum Wasserstein distance between unitary operations

by Qiu, Xinyu; Chen, Lin

06/2023

Quantifying the effect of noise on unitary operations is an essential task in quantum information processing. We propose the quantum Wasserstein distance...

Journal Article  Full Text Online

Open Access

    

A Gromov--Wasserstein Geometric View of Spectrum-Preserving Graph Coarsening

by Chen, Yifan; Yao, Rentian; Yang, Yun ; More...

06/2023

Graph coarsening is a technique for solving large-scale graph problems by working on a smaller version of the original graph, and possibly interpolating the...

Journal Article  Full Text Online

Open Access

    

2023 see arXiv

Learning via Wasserstein-Based High Probability Generalisation Bounds

by Viallard, Paul; Haddouche, Maxime; Simsekli, Umut ; More...

06/2023

Minimising upper bounds on the population risk or the generalisation gap has been widely used in structural risk minimisation (SRM) - this is in particular at...

Journal Article  Full Text Online

Open Access

    

2023 see arXiv

Hinge-Wasserstein: Mitigating Overconfidence in Regression by Classification

by Xiong, Ziliang; Eldesokey, Abdelrahman; Johnander, Joakim ; More...

06/2023

Modern deep neural networks are prone to being overconfident despite their drastically improved performance. In ambiguous or even unpredictable real-world...

Journal Article  Full Text Online

Open Access


2023


2023 see arXiv

Entropic Gromov-Wasserstein Distances: Stability, Algorithms, and Distributional Limits

by Rioux, Gabriel; Goldfeld, Ziv; Kato, Kengo

05/2023

The Gromov-Wasserstein (GW) distance quantifies discrepancy between metric measure spaces, but suffers from computational hardness. The entropic...

Journal Article  Full Text Online

Open Access

    


2023 see arXiv

On the Wasserstein distance and Dobrushin's uniqueness theorem

by Dorlas, Tony C; Savoie, Baptiste

05/2023

In this paper, we revisit Dobrushin's uniqueness theorem for Gibbs measures of lattice systems of interacting particles at thermal equilibrium. In a nutshell,...

Journal Article  Full Text Online

Open Access

     


2023 see arXiv

Wasserstein contraction and spectral gap of slice sampling revisited

by Schär, Philip

05/2023

We propose a new class of Markov chain Monte Carlo methods, called $k$-polar slice sampling ($k$-PSS), as a technical tool that interpolates between and...

Journal Article  Full Text Online

Open Access

   

2023 see arXiv

Exact Generalization Guarantees for (Regularized) Wasserstein Distributionally Robust Models

by Azizian, Waïss; Iutzeler, Franck; Malick, Jérôme

05/2023

Wasserstein distributionally robust estimators have emerged as powerful models for prediction and decision-making under uncertainty. These estimators provide...

Journal Article  Full Text Online

Open Access

  Cited by 2 Related articles All 15 versions 


 Complexity of Block Coordinate Descent with Proximal Regularization and Applications to Wasserstein CP-dictionary Learning

by Kwon, Dohyun; Lyu, Hanbaek

06/2023

We consider the block coordinate descent methods of Gauss-Seidel type with proximal regularization (BCD-PR), which is a classical method of minimizing general...

Journal Article 

Cited by 1 Related articles All 7 versions 

<–—2023———2023———1170——


2023 see arxiv

Wasserstein Gaussianization and Efficient Variational Bayes for Robust Bayesian Synthetic Likelihood

by Nguyen, Nhat-Minh; Tran, Minh-Ngoc; Drovandi, Christopher ; More...

05/2023

The Bayesian Synthetic Likelihood (BSL) method is a widely-used tool for likelihood-free Bayesian inference. This method assumes that some summary statistics...

Journal Article  Full Text Online

Open Access

 Wasserstein Gaussianization and Efficient Variational Bayes for Robust Bayesian Synthetic Likelihood   


2023 see arXiv, see 2022 arXiv

Improved rates of convergence for the multivariate Central Limit Theorem in Wasserstein distance

by Bonis, Thomas

05/2023

We provide new bounds for rates of convergence of the multivariate Central Limit Theorem in Wasserstein distances of order $p \geq 2$. In particular, we obtain...

Journal Article  Full Text Online

Open Access

The Multivariate Rate of Convergence for Selberg's Central Limit Theorem

 Related articles All 3 versions 

   

2023 see 2022

Shedding a PAC-Bayesian Light on Adaptive Sliced-Wasserstein Distances

by Ohana, Ruben; Kimia Nadjahi; Rakotomamonjy, Alain ; More...

arXiv.org, 05/2023

The Sliced-Wasserstein distance (SW) is a computationally efficient and theoretically grounded alternative to the Wasserstein distance. Yet, the literature on...

Paper  Full Text Online

Open Access

Cited by 5 Related articles All 7 versions 

    

MonoFlow: Rethinking Divergence GANs via the Perspective of Wasserstein Gradient Flows

by Yi, Mingxuan; Zhu, Zhanxing; Liu, Song

arXiv.org, 06/2023

The conventional understanding of adversarial training in generative adversarial networks (GANs) is that the discriminator is trained to estimate a divergence,...

Paper  Full Text Online

Open Access

[2302.01075] MonoFlow: Rethinking Divergence GANs via ... arXiv: https://arxiv.org › stat

Cited by 3 Related articles All 6 versions 

Learning with symmetric positive definite matrices via generalized Bures-Wasserstein geometry

by Han, Andi; Mishra, Bamdev; Jawanpuria, Pratik ; More...

arXiv.org, 06/2023

Learning with symmetric positive definite (SPD) matrices has many applications in machine learning. Consequently, understanding the Riemannian geometry of SPD...

Paper  Full Text Online

Open Access


2023


Small ship detection based on YOLOX and modified Gaussian Wasserstein distance in SAR images

by Yu, Wenbo; Li, Jiamu; Wang, Yi ; More...

02/2023

Due to the increase in data quantity, ship detection in Synthetic Aperture Radar (SAR) images has attracted numerous studies. As most ship targets are small...

Conference Proceeding  Full Text Online

Related articles All 3 versions
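For orientation on the "Gaussian Wasserstein distance" in this title: bounding boxes are commonly modelled as 2-D Gaussians, and the 2-Wasserstein distance between Gaussians has the standard closed form below. This is the textbook formula only; the paper's modified or normalized variant should be taken from the paper itself.

$$ W_2^2\big(\mathcal{N}(m_1,\Sigma_1), \mathcal{N}(m_2,\Sigma_2)\big) = \|m_1 - m_2\|_2^2 + \operatorname{tr}\!\Big(\Sigma_1 + \Sigma_2 - 2\big(\Sigma_2^{1/2}\,\Sigma_1\,\Sigma_2^{1/2}\big)^{1/2}\Big). $$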

MR4604128 Prelim Ho-Nguyen, Nam; Kilinç-Karzan, Fatma; Kuçukyavuz, Simge; Lee, Dabeen; 

Strong formulations for distributionally robust chance-constrained programs with left-hand side uncertainty under Wasserstein ambiguity. INFORMS J. Optim. 5 (2023), no. 2, 211–232. 90C15

MR4603681 Prelim Wickman, Clare; Okoudjou, Kasso A.; 

Gradient Flows for Probabilistic Frame Potentials

Cited by 19 Related articles All 4 versions

2023. Data

Sliced Wasserstein CycleGAN

Pu, Ziqiang; Cabrera, Diego; (...); De Oliveira, Jose Valente

2023 | 

Code Ocean

 | Software

"Sliced Wasserstein cycle consistency generative adversarial networks for fault data augmentation of an industrial robot" is proposed by Ziqiang Pu, Diego Cabrera, Chuan Li and Jose Valente de Oliveira in the jornual of Expert Systems with Applications, 2023. [Accepted]Tensorflow 2.0 implementation. Both unconditional and conditional SW-CycleGAN were verified with the MNIST handwritting dataset

Show more

View data



RDP-WGAN: Image Data Privacy Protection based on Renyi Differential Privacy

Ma, XB; Yang, R and Zheng, MB

18th IEEE International Conference on Mobility, Sensing and Networking (MSN)

2022 | 

2022 18TH INTERNATIONAL CONFERENCE ON MOBILITY, SENSING AND NETWORKING, MSN

 , pp.320-324

In recent years, artificial intelligence technology based on image data has been widely used in various industries. Rational analysis and mining of image data can not only promote the development of the technology field but also become a new engine to drive economic development. However, the privacy leakage problem has become more and more serious. To solve the privacy leakage problem of image

Show more

Full Text at Publisher

<–—2023———2023———1180——



2023 patent

Multi-test electroencephalography source imaging method based on Laplace noise and Wasserstein regularization, involves obtaining tested personal brain anatomical structure data, and calculating head model

CN116152372-A

Inventor(s) WANG Z and LIU K

Assignee(s) UNIV CHONGQING POSTS & TELECOM

Derwent Primary Accession Number 

2023-58160K


 

 

2023 see 2021

Wasserstein distance between noncommutative dynamical systems

Duvenhage, R

Nov 1 2023 | May 2023 (Early Access) | 

JOURNAL OF MATHEMATICAL ANALYSIS AND APPLICATIONS

 527 (1)

We introduce and study a class of quadratic Wasserstein distances on spaces consisting of generalized dynamical systems on a von Neumann algebra. We emphasize how symmetry of such a Wasserstein distance arises, but also study the asymmetric case. This setup is illustrated in the context of reduced dynamics, and a number of simple examples are also presented.(c) 2023 The Author(s). Published by

Show more

Free Full Text From Publisher

53 References   Related records


Learned pseudo-random number generator: WGAN-GP for generating statistically robust random numbers.

Okada, Kiyoshiro; Endo, Katsuhiro; (...); Kurabayashi, Shuichi

2023-06-14 | 

PloS one

 18 (6) , pp.e0287025

Pseudo-random number generators (PRNGs) are software algorithms generating a sequence of numbers approximating the properties of random numbers. They are critical components in many information systems that require unpredictable and nonarbitrary behaviors, such as parameter configuration in machine learning, gaming, cryptography, and simulation. A PRNG is commonly validated through a statistica

Show more

Free Full Text from Publisher. View Full Text on ProQuest


2023patent

Method for detecting abnormality of water supply pipe network by using computer device, involves calculating Wasserstein similarity between node monitoring value vector and predicted value vector, and performing pipe network abnormality judgment according to Wasserstein similarity

CN116108604-A

Inventor(s) XIA Z and WANG J

Assignee(s) SICHUAN AOTU ENVIRONMENTAL PROTECTION

Derwent Primary Accession Number 

2023-54767T


Su, C; Wei, LL and Xie, XZ

29th Annual IEEE International Conference on High Performance Computing, Data, and Analytics (HiPC)

2022 | 

2022 IEEE 29TH INTERNATIONAL CONFERENCE ON HIGH PERFORMANCE COMPUTING, DATA, AND ANALYTICS, HIPC

 , pp.186-191

In recent years, with the globalization and advancement of the telecommunications industry, the competition in the telecommunications market has become more intense, accompanied by high customer churn rates. Therefore, telecom operators urgently need to formulate effective marketing strategies to prevent the churning of customers. Customer churn prediction is an important means to prevent custo

Show more

Full Text at Publisher

21 References  Related records


2023



Wasserstein Generative Adversarial Networks Based Differential Privacy Metaverse Data Sharing.

Liu, Hai; Xu, Dequan; (...); Wang, Ziyue

2023-jun-16 | 

IEEE journal of biomedical and health informatics

 PP

Although differential privacy metaverse data sharing can avoid privacy leakage of sensitive data, randomly perturbing local metaverse data will lead to an imbalance between utility and privacy. Therefore, this work proposed models and algorithms of differential privacy metaverse data sharing using Wasserstein generative adversarial networks (WGAN). Firstly, this study constructed the mathematic

Show more

View full text
Cited by 5
 Related articles All 3 versions


2023 see 2022

Global Wasserstein Margin maximization for boosting generalization in adversarial training

Yu, TY; Wang, S and Yu, XZ

May 2023 | Sep 2022 (Early Access) | 

APPLIED INTELLIGENCE

 53 (10) , pp.11490-11504

In recent researches on adversarial robustness boosting, the trade-off between standard and robust generalization has been widely concerned, in which margin, the average distance from samples to the decision boundary, has become the bridge between the two ends. In this paper, the problems of the existing methods to improve the adversarial robustness by maximizing the margin are discussed and an

Show more

Full Text at Publisher

38 References Related records

Cited by 1 Related articles All 2 versions


2023 see 2022

Exact convergence analysis for Metropolis-Hastings independence samplers in Wasserstein distances

Brown, A and Jones, GL

Jun 2023 (Early Access) | 

JOURNAL OF APPLIED PROBABILITY

Under mild assumptions, we show that the exact convergence rate in total variation is also exact in weaker Wasserstein distances for the Metropolis-Hastings independence sampler. We develop a new upper and lower bound on the worst-case Wasserstein distance when initialized from points. For an arbitrary point initialization, we show that the convergence rate is the same and matches the convergen

Show more

Free Submitted Article From Repository. View full text

53 References. Related records


Bounds in L1 Wasserstein distance on the normal approximation of general M-estimators

Bachoc, F and Fathi, M

2023 | 

ELECTRONIC JOURNAL OF STATISTICS

 17 (1) , pp.1457-1491

We derive quantitative bounds on the rate of convergence in L1 Wasserstein distance of general M-estimators, with an almost sharp (up to a logarithmic term) behavior in the number of observations. We focus on situations where the estimator does not have an explicit expression as a function of the data. The general method may be applied even in situations where the observations are not independe

Show more

View full text

61 References. Related records


Aero-engine high speed bearing fault diagnosis for data imbalance: A sample enhanced diagnostic method based on pre-training WGAN-GP (vol 213, 112709, 2023)

Chen, JY; Yan, ZT; (...); Ge, HJ

Aug 2023 | May 2023 (Early Access) | 

MEASUREMENT

 217

Free Full Text From Publisher

1 Reference. Related records

Conditional Time Series Generation and the Signature-

[CITATION] … high speed bearing fault diagnosis for data imbalance: A sample enhanced diagnostic method based on pre-training WGAN-GP"[Measurement 213 (2023 …

J Chen, Z Yan, C Lin, B Yao, H Ge - Measurement, 2023 - ui.adsabs.harvard.edu

Corrigendum to "Aero-engine high speed bearing fault diagnosis for data imbalance: A

sample enhanced diagnostic method based on pre-training WGAN-GP" [Measurement 213 (2023) …

Related articles

<–—2023———2023———1190——


Wasserstein distance as a new tool for discriminating cosmologies through the topology of large-scale structure

Tsizh, M; Tymchyshyn, V and Vazza, F

Apr 21 2023 | 

MONTHLY NOTICES OF THE ROYAL ASTRONOMICAL SOCIETY

 522 (2) , pp.2697-2706

In this work, we test Wasserstein distance in conjunction with persistent homology as a tool for discriminating large-scale structures of simulated universes with different values of sigma(8) cosmological parameter (present root-mean-square matter fluctuation averaged over a sphere of radius 8 Mpc comoving). The Wasserstein distance (a.k.a. the pair-matching distance) was proposed to measure th

Show more

Free Submitted Article From Repository. View full text. View Full Text on ProQuest

48 References. Related records

Cited by 4 Related articles All 7 versions

MeshWGAN: Mesh-to-Mesh Wasserstein GAN With Multi-Task Gradient Penalty for 3D Facial Geometric Age Transformation.

Zhang, Jie; Zhou, Kangneng; (...); Li, Ping

2023-jun-12 | 

IEEE transactions on visualization and computer graphics

 PP

As the metaverse develops rapidly, 3D facial age transformation is attracting increasing attention, which may bring many potential benefits to a wide variety of users, e.g., 3D aging figures creation, 3D facial data augmentation and editing. Compared with 2D methods, 3D face aging is an underexplored problem. To fill this gap, we propose a new mesh-to-mesh Wasserstein generative adversarial netw

Show more

View full text

Related articles All 5 versions


2023 see 2022

Network intrusion detection based on conditional wasserstein variational autoencoder with generative adversarial network and one-dimensional convolutional neural networks

He, JX; Wang, XD; (...); Chen, C

May 2023 | Sep 2022 (Early Access) | 

APPLIED INTELLIGENCE

 53 (10) , pp.12416-12436

There is a class-imbalance problem that the number of minority class samples is significantly lower than that of majority class samples in common network traffic datasets. Class-imbalance phenomenon will affect the performance of the classifier and reduce the robustness of the classifier to detect unknown anomaly detection. And the distribution of the continuous features in the dataset does not

Free Full Text From Publisher

40 References. Related records

Cited by 4 Related articles All 3 versions

2023 see 2022

Fault detection and diagnosis for liquid rocket engines with sample imbalance based on Wasserstein generative adversarial nets and multilayer perceptron

Deng, LZ; Cheng, YQ; (...); Shi, YH

Jun 2023 | Nov 2022 (Early Access) | 

PROCEEDINGS OF THE INSTITUTION OF MECHANICAL ENGINEERS PART G-JOURNAL OF AEROSPACE ENGINEERING

 237 (8) , pp.1751-1763

The reliability of liquid rocket engines (LREs), which are the main propulsion device of launch vehicles, cannot be overemphasised. The development of fault detection and diagnosis (FDD) technology for LREs can effectively improve the safety and reliability of launch vehicles, which has important theoretical and engineering significance. With the rapid development of artificial intelligence tec

Full Text at Publisher

30 References. Related records

Cited by 2 Related articles

Wasserstein bounds in CLT of approximative MCE and MLE of the drift parameter for Ornstein-Uhlenbeck processes observed at high frequency

Es-Sebaiy, K; Mishari, FA and Al-Foraih, M

Apr 26 2023 | 

JOURNAL OF INEQUALITIES AND APPLICATIONS

 2023 (1)

This paper deals with the rate of convergence for the central limit theorem of estimators of the drift coefficient, denoted $\theta$, for the Ornstein-Uhlenbeck process $X := \{X_t, t \ge 0\}$ observed at high frequency. We provide an approximate minimum contrast estimator and an approximate maximum likelihood estimator of $\theta$, namely $\tilde{\theta}_n := 1/\big(\tfrac{2}{n}\sum_{i=1}^{n} X_{t_i}^2\big)$, and theta(sim

Cited by 1 Related articles All 9 versions


2023



Mitigating Discrimination in Insurance with Wasserstein Barycenters

Blog, Podcast, or Website

Mitigating Discrimination in Insurance with Wasserstein Barycenters

Charpentier, Arthur. Weblog post. Freakonometrics [BLOG], Montreal: Newstex. Jun 23, 2023.


Mitigating Discrimination in Insurance with Wasserstein ... arXiv

arXiv PDF

ResearchGate

Mitigating Discrimination in Insurance with Wasserstein Barycenters

Working Paper

Mitigating Discrimination in Insurance with Wasserstein Barycenters

Charpentier, Arthur; Hu, François; Ratz, Philipp. arXiv.org; Ithaca, Jun 22, 2023.



Fairness in Multi-Task Learning via Wasserstein Barycenters

Blog, Podcast, or Website

Fairness in Multi-Task Learning via Wasserstein Barycenters

Charpentier, Arthur. Weblog post. Freakonometrics [BLOG], Montreal: Newstex. Jun 21, 2023.


Fairness in Multi-Task Learning via Wasserstein Barycenters
Cited by 6
 Related articles All 4 versions


The Gromov-Wasserstein distance between spheres

Working Paper

The Gromov-Wasserstein distance between spheres

Arya, Shreya; Auddy, Arnab; Edmonds, Ranthony; Lim, Sunhyuk; Memoli, Facundo; et al. arXiv.org; Ithaca, Jun 21, 2023.


Ergodicity bounds for stable Ornstein-Uhlenbeck systems in Wasserstein distance with applications to cutoff stability

Working Paper

Ergodicity bounds for stable Ornstein-Uhlenbeck systems in Wasserstein distance with applications to cutoff stability

Barrera, Gerardo; Högele, Michael A. arXiv.org; Ithaca, Jun 20, 2023.


Some Convexity Criteria for Differentiable Functions on the 2-Wasserstein Space

Working Paper

Some Convexity Criteria for Differentiable Functions on the 2-Wasserstein Space

Parker, Guy. arXiv.org; Ithaca, Jun 20, 2023.


Cited by 2 Related articles All 3 versions 

 Some Convexity Criteria for Differentiable Functions on the 2-Wasserstein Space

<–—2023———2023———1200——



Sliced Wasserstein Regression

Working Paper

Sliced Wasserstein Regression

Chen, Han; Müller, Hans-Georg. arXiv.org; Ithaca, Jun 18, 2023.


Zbl 07707208

Listen: Childcare, inspection fees and budget talks on WGAN

Wire Feed

Listen: Childcare, inspection fees and budget talks on WGAN

Posik, Jacob. CE Think Tank Newswire; Miami [Miami]. 16 June 2023.  


Cited by 2 Related articles All 2 versions 


Listen: Childcare, inspection fees and budget talks on WGAN

Distributionally Robust Airport Ground Holding Problem under Wasserstein Ambiguity Sets

Working Paper

Distributionally Robust Airport Ground Holding Problem under Wasserstein Ambiguity Sets

Wu, Haochen; Li, Max Z. arXiv.org; Ithaca, Jun 16, 2023.



Wasserstein distributional robustness of neural networks

Working Paper

Wasserstein distributional robustness of neural networks

Bai, Xingjian; He, Guangyi; Jiang, Yifan; Obloj, Jan. arXiv.org; Ithaca, Jun 16, 2023.


Sensitivity of multiperiod optimization problems in adapted Wasserstein distance

Working Paper

Sensitivity of multiperiod optimization problems in adapted Wasserstein distance

Bartl, Daniel; Wiesel, Johannes. arXiv.org; Ithaca, Jun 16, 2023.


 

2023 patent

Bearing fault diagnosis method based on improved WGAN network

CN CN116223038A 张辉 江苏科技大学

Priority 2023-01-09 • Filed 2023-01-09 • Published 2023-06-06

5. The method for diagnosing bearing failure based on the improved WGAN network as claimed in claim 4, wherein: the R-FCN network model is built in the step (3.2) as a discriminator model for improving the WGAN network, the discriminator model is utilized to judge the picture generated by the …


2023


2023 patent

Bearing fault diagnosis method based on improved WGAN-GP and Alxnet

CN CN115962946A 付文龙 三峡大学

Priority 2023-01-18 • Filed 2023-01-18 • Published 2023-04-14

8. The bearing fault diagnosis method based on improved WGAN-GP and Alxnet according to claim 1, characterized in that: the step 5 comprises the following steps: s5.1: taking the expanded balanced data set as input, and extracting deep features through the convolutional layer, the Relu activation …


2023

Electronic device for colorizing black and white image using gan based model …

KR KR20230025676A 이범식 조선대학교산학협력단

Priority 2023-02-03 • Filed 2023-02-03 • Published 2023-02-22

According to claim 4, The GAN-based model is trained using a total loss (L total ) considering all pixel wise (L1) loss function (L L1 ), VGG loss function (L VGG ), and WGAN loss function (L wgan ), The pixel wise (L1) loss function (L L1 ), the VGG loss function (L VGG ), the WGAN loss function ( …
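To illustrate the kind of composite objective described in this claim (pixel-wise L1, VGG-perceptual and WGAN terms combined into a total loss), here is a minimal PyTorch-style sketch; the weights, the `critic` and the `vgg_features` extractor are placeholder assumptions, not the patent's implementation.

    import torch

    def total_generator_loss(fake, real, critic, vgg_features,
                             w_l1=1.0, w_vgg=0.1, w_wgan=0.01):
        """Hypothetical composite loss: w_l1*L_L1 + w_vgg*L_VGG + w_wgan*L_WGAN."""
        l_l1 = torch.mean(torch.abs(fake - real))                           # pixel-wise L1 term
        l_vgg = torch.mean((vgg_features(fake) - vgg_features(real)) ** 2)  # VGG feature (perceptual) term
        l_wgan = -torch.mean(critic(fake))                                  # WGAN generator term
        return w_l1 * l_l1 + w_vgg * l_vgg + w_wgan * l_wgan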





2023 patent

Laplace noise and Wasserstein regularization-based multi-test EEG source …

CN CN116152372A 刘柯 重庆邮电大学

Priority 2023-02-07 • Filed 2023-02-07 • Published 2023-05-23

s4, establishing a multi-test robust EEG diffuse source imaging model based on Laplace noise and Wasserstein regularization in a projection space according to the lead matrix, the difference operator and the minimum distance matrix, and obtaining a multi-test estimated source by utilizing an ADMM …


Attacking Mouse Dynamics Authentication using Novel Wasserstein Conditional DCGAN

A Roy, KS Wong, RCW Phan - IEEE Transactions on …, 2023 - ieeexplore.ieee.org

… As a remedy to our attack, we put forward a novel mouse … an end-to-end attack on existing

mouse dynamics authentication … is robust against our proposed attacks. Section V chronicles …
<–—2023———2023———1210——


A Novel Conditional Wasserstein Deep Convolutional Generative Adversarial Network

A Roy, D Dasgupta - IEEE Transactions on Artificial Intelligence, 2023 - ieeexplore.ieee.org

… MDA: We consider one realistic surrogate-based attack scenario, where an attacker was

able to gain control over the target benign client for a specific duration, however, has no access …

Cite Cited by 2 Related articles All 3 versions


arXiv:2306.16051 [pdf, ps, other] math.PR
Uniform Wasserstein convergence of penalized Markov processes
Authors: Nicolas Champagnat, Edouard Strickler, Denis Villemonais
Abstract: For general penalized Markov processes with soft killing, we propose a simple criterion ensuring uniform convergence of conditional distributions in Wasserstein distance to a unique quasi-stationary distribution. We give several examples of application where our criterion can be checked, including Bernoulli convolutions and piecewise deterministic Markov processes of the form of switched dynamical…  More
Submitted 28 June, 2023; originally announced June 2023.

All 4 versions 


arXiv:2306.15524 [pdf, ps, other] q-fin.MF
Robust Wasserstein Optimization and its Application in Mean-CVaR
Authors: Xin Hai, Kihun Nam
Abstract: We refer to recent inference methodology and formulate a framework for solving the distributionally robust optimization problem, where the true probability measure is inside a Wasserstein ball around the empirical measure and the radius of the Wasserstein ball is determined by the empirical data. We transform the robust optimization into a non-robust optimization with a penalty term and provide th…  More
Submitted 27 June, 2023; originally announced June 2023.

Related articles All 4 versions 
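For orientation, the distributionally robust problem studied in entries like this one is usually posed over a Wasserstein ball around the empirical measure; the display below is the standard generic formulation (notation assumed here, not quoted from the paper), with $\hat{P}_N$ the empirical measure, $\varepsilon$ the ball radius and $\ell$ a loss such as a Mean-CVaR objective.

\[
\min_{x \in \mathcal{X}} \;\; \sup_{Q \,:\, W_p(Q, \hat{P}_N) \le \varepsilon} \; \mathbb{E}_{\xi \sim Q}\big[\ell(x,\xi)\big].
\]

The penalty reformulation mentioned in the abstract replaces the hard ball constraint by adding a multiple of the transport cost to the objective.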

arXiv:2306.15163 [pdf, other] stat.ML cs.LG
Wasserstein Generative Regression
Authors: Shanshan Song, Tong Wang, Guohao Shen, Yuanyuan Lin, Jian Huang
Abstract: In this paper, we propose a new and unified approach for nonparametric regression and conditional distribution learning. Our approach simultaneously estimates a regression function and a conditional generator using a generative learning framework, where a conditional generator is a function that can generate samples from a conditional distribution. The main idea is to estimate a conditional genera…  More
Submitted 26 June, 2023; originally announced June 2023.
Comments: 50 pages, including appendix. 5 figures and 6 tables in the main text. 1 figure and 7 tables in the appendix
MSC Class: 62G08; 68T07

Related articles All 2 versions 

arXiv:2306.14363 [pdf, ps, other] math.OC
Minimal Wasserstein Surfaces
Authors: Wuchen Li, Tryphon T. Georgiou
Abstract: In finite-dimensions, minimal surfaces that fill in the space delineated by closed curves and have minimal area arose naturally in classical physics in several contexts. No such concept seems readily available in infinite dimensions. The present work is motivated by the need for precisely such a concept that would allow natural coordinates for a surface with a boundary of a closed curve in the Was…  More
Submitted 25 June, 2023; originally announced June 2023.
MSC Class: 49Q20; 49Kxx; 49Jxx

All 3 versions 

2023


arXiv:2306.12912 [pdf, other] stat.ML cs.CY cs.LG
Mitigating Discrimination in Insurance with Wasserstein Barycenters
Authors: Arthur Charpentier, François Hu, Philipp Ratz
Abstract: The insurance industry is heavily reliant on predictions of risks based on characteristics of potential customers. Although the use of said models is common, researchers have long pointed out that such practices perpetuate discrimination based on sensitive features such as gender or race. Given that such discrimination can often be attributed to historical data biases, an elimination or at least m…  More
Submitted 22 June, 2023; originally announced June 2023.

 Cited by 2 Related articles All 2 versions 


Konarovskyi, Vitalii

Coalescing-fragmentating Wasserstein dynamics: particle approach. (English) Zbl 07699948

Ann. Inst. Henri Poincaré, Probab. Stat. 59, No. 2, 983-1028 (2023).

MSC:  60K35 60B12 60G44 60J60 82B21

PDF BibTeX XML Cite

Full Text: DOI   arXiv

  arXiv 


Zbl 07699948

2023 see 2022

Barrera, Gerardo; Lukkarinen, Jani

Quantitative control of Wasserstein distance between Brownian motion and the Goldstein-Kac telegraph process. (English) Zbl 07699947

Ann. Inst. Henri Poincaré, Probab. Stat. 59, No. 2, 933-982 (2023).

MSC:  60G50 35L99 60J65 60J99 60K35 60K37 60K40

PDF BibTeX XML Cite

Full Text: DOI  

 arXiv 



Lacombe, Julien; Digne, Julie; Courty, Nicolas; Bonneel, Nicolas

Learning to generate Wasserstein barycenters. (English) Zbl 07696249

J. Math. Imaging Vis. 65, No. 2, 354-370 (2023).

MSC:  68T07 49Q22 68U10

PDF BibTeX XML Cite

Full Text: DOI 


2023 patent

Method for interpolating anti-traffic data generated by Wasserstein based on road network pixelate, involves re-inputting road network traffic data generative adversarial network model to repair data when realness is greater than set threshold value to finish repairing

CN115510174-A

Inventor(s) GUO Q; WANG H; (...); WANG R

Assignee(s) UNIV CHONGQING POSTS & TELECOM

Derwent Primary Accession Number 

2023-01639L

<–—2023———2023———1220——



2023 patent

Method for generating bond-slip model of fiber reinforced plastic sheet and concrete interface based on wasserstein generative adversarial networks, involves collecting strain data samples corresponding to corresponding position of strain data sample collection, and drawing multiple curves

CN115544864-A

Inventor(s) YANG Z; XIONG Q; (...); HE C

Assignee(s) UNIV TONGJI

Derwent Primary Accession Number 

2023-049921


2023 patent

Method for generating conditional data for generating target data, involves inputting received conditional data to learned conditional Wasserstein generator to generate target data for conditional data, where conditional Wasserstein generator is learned

KR2023023464-A

Inventor(s) JO M; KIM Y and LEE K B

Assignee(s) UNIV SEOUL NAT R & DB FOUND

Derwent Primary Accession Number 

2023-21420J


2023 patent

Satellite cloud image prediction method based on WGAN-GP network and optical flow method, involves inputting historical satellite cloud image data into generator of trained dual-discriminator network to obtain prediction result

CN115546257-A

Inventor(s) XIA J; KANG R and TAN L

Assignee(s) UNIV NANJING INFORMATION SCI & TECHNOLOG

Derwent Primary Accession Number 



Unbalanced Optimal Transport meets Sliced-Wasserstein

T Séjourné, C Bonet, K Fatras, K Nadjahi… - arXiv preprint arXiv …, 2023 - arxiv.org

… well-posedness of approximating an unbalanced Wasserstein gradient flow [36] using SUOT,

as done in [37, 38] for SOT. Unbalanced Wasserstein gradient flows have been a key tool …

Related articles All 2 versions 


2023


Model and observation of the feasible region for PV integration capacity considering Wasserstein-distance-based distributionally robust chance constraints

S Zhang, S Ge, H Liu, J Li, C Wang - Applied Energy, 2023 - Elsevier

… ], a Wasserstein ambiguity set P based on the Wasserstein metric can be constructed to

address the potentials of true distribution. The Wasserstein … From the definition, the Wasserstein …

 Cited by 3 Related articles All 2 versions



Data augmentation strategy for power inverter fault diagnosis based on wasserstein distance and auxiliary classification generative adversarial network

Q Sun, F Peng, X Yu, H Li - Reliability Engineering & System Safety, 2023 - Elsevier

… augmentation method based on Wasserstein distance and auxiliary … At the same time, the

Wasserstein distance is chosen to … Therefore, on the premise of meeting the fixed range, the …

Cited by 10 Related articles All 4 versions

[PDF] openreview.net

Interpolation for Robust Learning: Data Augmentation on Wasserstein Geodesics

J Zhu, J Qiu, A Guha, Z Yang, XL Nguyen, B Li, D Zhao - 2023 - openreview.net

… Specifically, (1) we augment the data by finding the worst-case Wasserstein barycenter on

the … Wasserstein barycenter and geodesic Equipped with the Wasserstein distance, we can …

Cite Related articles All 8 versions 

Working Paper

Wasserstein medians: robustness, PDE characterization and numerics
Carlier, Guillaume; Chenchene, Enis; Eichinger, Katharina. arXiv.org; Ithaca, Jul 4, 2023.
Working Paper
Stability Analysis Framework for Particle-based Distance GANs with Wasserstein Gradient Flow
Chen, Chuqi; Wu, Yue; Yang, Xiang.
 arXiv.org; Ithaca, Jul 4, 2023.

, which is Wasserstein gradient flow based on particle-…

Related articles All 2 versions 

<–—2023———2023———1230——



2023 see 2022. Scholarly Journal

A novel sEMG data augmentation based on WGAN-GP
Coelho, Fabrício; Pinto, Milena F; Melo, Aurélio G; Ramos, Gabryel S; Marcato, André L M.
 Computer Methods in Biomechanics and Biomedical Engineering; Abingdon Vol. 26, Iss. 9,  (Sep 2023): 1008-1017.


Working Paper

Fast Optimal Transport through Sliced Wasserstein Generalized Geodesics
Mahey, Guillaume; Chapel, Laetitia; Gasso, Gilles; Bonet, Clément; Courty, Nicolas.
 arXiv.org; Ithaca, Jul 4, 2023.
Cited by 1
 Related articles All 3 versions 


Working Paper
On the reach of isometric embeddings into Wasserstein type spaces
Casado, Javier; Cuerno, Manuel; Santos-Rodríguez, Jaime.
 arXiv.org; Ithaca, Jul 3, 2023.
Cite
 
Working Paper
Wasserstein-1 distance and nonuniform Berry-Esseen bound for a supercritical branching process in a random environment
Wu, Hao; Fan, Xiequan; Gao, Zhiqiang; Ye, Yinna.
 arXiv.org; Ithaca, Jul 3, 2023.
 
Working Paper
Sulcal Pattern Matching with the Wasserstein Distance
Chen, Zijian; Das, Soumya; Chung, Moo K.
 arXiv.org; Ithaca, Jul 1, 2023.

arXiv:2307.01084  [pdfpsother math.PR   math.ST


2023



Working Paper
Approximating Probability Distributions by using Wasserstein Generative Adversarial Networks
Gao, Yihang; Ng, Michael K; Zhou, Mingjie.
 arXiv.org; Ithaca, Jun 30, 2023.
 

 
Working Paper

Uniform Wasserstein convergence of penalized Markov processes


Champagnat, Nicolas; Strickler, Edouard; Villemonais, Denis. arXiv.org; Ithaca, Jun 28, 2023.



arXiv:2307.02509 [pdf, other] cs.LG cs.CG cs.CV cs.GR
Wasserstein Auto-Encoders of Merge Trees (and Persistence Diagrams)
Authors: Mathieu Pont, Julien Tierny
Abstract: This paper presents a computational framework for the Wasserstein auto-encoding of merge trees (MT-WAE), a novel extension of the classical auto-encoder neural network architecture to the Wasserstein metric space of merge trees. In contrast to traditional auto-encoders which operate on vectorized data, our formulation explicitly manipulates merge trees on their associated metric space at each laye…  More
Submitted 5 July, 2023; originally announced July 2023.
Comments: arXiv admin note: text overlap with arXiv:2207.10960

Cited by 1 Related articles All 6 versions


arXiv:2307.01879 [pdf, other] cs.LG
Stability Analysis Framework for Particle-based Distance GANs with Wasserstein Gradient Flow
Authors: Chuqi Chen, Wu Yue, Yang Xiang
Abstract: In this paper, we investigate the training process of generative networks that use a type of probability density distance named particle-based distance as the objective function, e.g. MMD GAN, Cramér GAN, EIEG GAN. However, these GANs often suffer from the problem of unstable training. In this paper, we analyze the stability of the training process of these GANs from the perspective of probability…  More
Submitted 4 July, 2023; originally announced July 2023.


2023


arXiv:2307.01770 [pdf, other] stat.ML cs.LG stat.AP
Fast Optimal Transport through Sliced Wasserstein Generalized Geodesics
Authors: Guillaume Mahey, Laetitia Chapel, Gilles Gasso, Clément Bonet, Nicolas Courty
Abstract: Wasserstein distance (WD) and the associated optimal transport plan have been proven useful in many applications where probability measures are at stake. In this paper, we propose a new proxy of the squared WD, coined min-SWGG, that is based on the transport map induced by an optimal one-dimensional projection of the two input distributions. We draw connections between min-SWGG and Wasserstein gen…  More
Submitted 4 July, 2023; originally announced July 2023.
Comments: Main: 10 pages, 4 figures, tables. Supplementary: 19 pages, 13 figures, 1 table. Submitted to NeurIPS 2023
MSC Class: 62; 65 ACM Class: G.3

Cited by 2 Related articles All 3 versions 

arXiv:2307.01765 [pdf, other] math.OC math.AP stat.AP
Wasserstein medians: robustness, PDE characterization and numerics
Authors: Guillaume Carlier, Enis Chenchene, Katharina Eichinger
Abstract: We investigate the notion of Wasserstein median as an alternative to the Wasserstein barycenter, which has become popular but may be sensitive to outliers. In terms of robustness to corrupted data, we indeed show that Wasserstein medians have a breakdown point of approximately 1/2. We give explicit constructions of Wasserstein medians in dimension one which enable us to obtain $L^p$ esti…  More
Submitted 4 July, 2023; originally announced July 2023.
Comments: 38 pages, 6 figures

Cited by 1 Related articles All 13 versions 
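For context, a common way to state the object studied here (written for the quadratic cost; the paper's exact setting may differ): given probability measures $\mu_1,\dots,\mu_n$ and weights $\lambda_i$, the Wasserstein median minimizes unsquared distances, in contrast with the barycenter's squared distances,

\[
\operatorname{med} \in \arg\min_{\nu}\ \sum_{i=1}^{n} \lambda_i\, W_2(\nu,\mu_i),
\qquad
\operatorname{bar} \in \arg\min_{\nu}\ \sum_{i=1}^{n} \lambda_i\, W_2^{2}(\nu,\mu_i),
\]

mirroring the median/mean distinction on the real line, which is what drives the improved robustness (breakdown point about 1/2) mentioned in the abstract.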

 

arXiv:2307.01051 [pdf, ps, other] math.MG
On the reach of isometric embeddings into Wasserstein type spaces
Authors: Javier Casado, Manuel Cuerno, Jaime Santos-Rodríguez
Abstract: We study the reach (in the sense of Federer) of the natural isometric embedding $X \hookrightarrow W_p(X)$ of $X$ inside its $p$-Wasserstein space, where $(X, \mathrm{dist})$ is a geodesic metric space. We prove that if a point $x \in X$ can be joined to another point $y \in X$ by two minimizing geodesics, then $\mathrm{reach}(x, X \hookrightarrow W_p(X)) = 0$. This includes the cases where $X$ is…  More
Submitted 3 July, 2023; originally announced July 2023.
MSC Class: 49Q20; 28A33; 30L15; 49Q22; 53C21; 55N31

Related articles All 3 versions 

arXiv:2307.00385 [pdf, other] q-bio.NC eess.IV
Sulcal Pattern Matching with the Wasserstein Distance
Authors: Zijian Chen, Soumya Das, Moo K. Chung
Abstract: We present the unified computational framework for modeling the sulcal patterns of human brain obtained from the magnetic resonance images. The Wasserstein distance is used to align the sulcal patterns nonlinearly. These patterns are topologically different across subjects making the pattern matching a challenge. We work out the mathematical details and develop the gradient descent algorithms for…  More
Submitted 1 July, 2023; originally announced July 2023.
Comments: In press in IEEE ISBI

Related articles All 4 versions

Duvenhage, Rocco

Wasserstein distance between noncommutative dynamical systems. (English) Zbl 07708119

J. Math. Anal. Appl. 527, No. 1, Part 2, Article ID 127353, 26 p. (2023).

MSC:  46Lxx 81Pxx 37Axx

PDF BibTeX XML Cite

Full Text: DOI 

 arXiv 

Zbl 07708119

<–—2023———2023———1250——



  Ponnoprat, Donlapark

Universal consistency of Wasserstein k-NN classifier: a negative and some positive results. (English) Zbl 07720187

Inf. Inference 12, No. 3, Article ID iaad027, 23 p. (2023).

MSC:  60A10 90Cxx 62F15

PDF BibTeX XML Cite

Full Text: DOI 

 arXiv  


Chen, Zhi; Kuhn, Daniel; Wiesemann, Wolfram

On approximations of data-driven chance constrained programs over Wasserstein balls. (English) Zbl 07705444

Oper. Res. Lett. 51, No. 3, 226-233 (2023).

MSC:  90-XX

PDF BibTeX XML Cite

Full Text: DOI 


 arXiv 


 

2023 see 2022

Talbi, Mehdi; Touzi, Nizar; Zhang, Jianfeng

Viscosity solutions for obstacle problems on Wasserstein space. (English) Zbl 07704040

SIAM J. Control Optim. 61, No. 3, 1712-1736 (2023).

MSC:  60G40 35Q89 49N80 49L25 60H30

PDF BibTeX XML Cite

Full Text: DOI 

 arXiv 


 

2023 see 2022

Bayraktar, Erhan; Ekren, Ibrahim; Zhang, Xin

A smooth variational principle on Wasserstein space. (English) Zbl 07702414

Proc. Am. Math. Soc. 151, No. 9, 4089-4098 (2023).

MSC:  58E30 90C05 35D40

PDF BibTeX XML Cite

Full Text: DOI 

 arXiv 


 2023 see 2022. [PDF] hal.science

Measuring 3D-reconstruction quality in probabilistic volumetric maps with the Wasserstein Distance

S Aravecchia, A Richard, M Clausel, C Pradalier - 2023 - hal.science

… quality based directly on the voxels’ occupancy likelihood: the Wasserstein Distance. Finally,

we evaluate this Wasserstein Distance metric in simulation, under different level of noise in …

Related articles All 6 versions 


2023



Renewable Energy Hosting Capacity Evaluation of Distribution Network Based on WGAN Scenario Generation

Z Wei, J Li, X Wu, J Ye, Y Liang… - 2023 IEEE International …, 2023 - ieeexplore.ieee.org

… Under the above background, this paper uses the Wasserstein Generative Adversarial

Networks (WGAN) to generate renewable energy power output scenarios, which have the same …


An Intelligent Diagnosis Approach Combining Resampling and CWGAN-GP of Single-to-Mixed Faults of Rolling Bearings Under Unbalanced Small …

[CITATION] 

H fan, J Ma, X Cao, X Zhang, Q Mao - International Journal of …, 2023 - World Scientific

Rolling bearing is a key component with the high fault rate in the rotary machines, and its

fault diagnosis is important for the safe and healthy operation of the entire machine. In recent …

 Related articles

 

2023 see 2022

On a linear fused Gromov-Wasserstein distance for graph structured data

by Nguyen, Dai Hai; Tsuda, Koji

Pattern recognition, 06/2023, Volume 138

Article PDFPDF

Journal Article  Full Text Online

More Options 

[CITATION] On a linear fused Gromov-Wasserstein distance for graph structured data

グエン,ダイハイ,Tsuda,Koji - Pattern Recognition, 2023 - cir.nii.ac.jp

On a linear fused Gromov-Wasserstein distance for graph structured data | CiNii Research …

On a linear fused Gromov-Wasserstein distance for graph structured data … Related articles 


Cited by 8 Related articles All 4 versions


2023z z4 see z1

[CITATION] An extended Exp-TODIM method for multiple attribute decision making based on the Z-Wasserstein distance

H Sun, Z Yang, Q Cai, GW Wei, ZW Mo - Expert Systems with Applications, 2023

Cited by 4 Related articles


   

TextureWGAN: texture preserving WGAN with multitask 

 by M Ikuta · 2023 — Purpose: This paper presents a deep learning (DL) based method called TextureWGAN. It is designed to preserve image texture while

<–—2023———2023———1260—



 2023 v2. see 2022   

Morphological Classification of Radio Galaxies with wGAN-supported Augmentation
by Rustige, Lennart; Kummer, Janis; Griese, Florian ; More...
arXiv.org, 6/2023
Paper  Full Text Online

The Wasserstein distance of order 1 for quantum spin systems on infinite lattices

by De Palma, Giacomo; Trevisan, Dario

arXiv.org, 06/2023

We propose a generalization of the Wasserstein distance of order 1 to quantum spin systems on the lattice \(\mathbb{Z}^d\), which we call specific quantum...

Paper  Full Text Online

Cited by 6 Related articles All 4 versions

Wasserstein-$1$ distance and nonuniform Berry-Esseen bound for a supercritical branching process in a random environment
by Wu, Hao; Fan, Xiequan; Gao, Zhiqiang ; More...
07/2023
Let $ (Z_{n})_{n\geq 0} $ be a supercritical branching process in an independent and identically distributed random environment. We establish an optimal...
Journal Article  Full Text Online
Open Access

2023 see 2022

Hyperbolic Sliced-Wasserstein via Geodesic and Horospherical Projections

by Bonet, Clément; Chapel, Laetitia; Lucas Drumetz ; More...

arXiv.org, 06/2023

It has been shown beneficial for many types of data which present an underlying hierarchical structure to be embedded in hyperbolic spaces.

Consequently, many...

Paper  Full Text Online

Open Access

Cited by 5 Related articles All 9 versions

2023 see 2021

Intrinsic Sliced Wasserstein Distances for Comparing Collections of Probability Distributions on Manifolds and Graphs
by Rustamov, Raif; Majumdar, Subhabrata
arXiv.org, 06/2023
Collections of probability distributions arise in a variety of applications ranging from user activity pattern analysis to brain connectomics. In practice...
Paper  Full Text Online
Cited by 10
 Related articles All 8 versions 


 2023
  

Following the Launch of the Google Pixel Tablet, Wasserstein Announces their Dedicated Pixel Tablet Speaker Stand


2023 patent

基于Wasserstein生成对抗网络模型的高能图像合成方法、装置

06/2023

Patent  Available Online

Open Access

[Chinese. High-energy image synthesis method and device based on Wasserstein generative confrontation network model]

2023 patent

一种基于WGAN和LightGBM的云环境入侵检测方法

06/2023

Patent  Available Online

Bibliographic data: CN116248344 (A)

[Chinese.  A cloud environment intrusion detection method based on WGAN and LightGBM]


2023 patent

一种基于S变换/WGAN解决输卤管道泄漏数据不平衡的算法

05/2023

Patent  Available Online

[Chinese.  An Algorithm Based on S-Transform/WGAN to Solve the Data Imbalance of Brine Pipeline Leakage]



Sensitivity of Multiperiod Optimization Problems with Respect to the Adapted Wasserstein Distance
Authors: Daniel Bartl, Johannes Wiesel
Summary:Abstract. We analyze the effect of small changes in the underlying probabilistic model on the value of multiperiod stochastic optimization problems and optimal stopping problems. We work in finite discrete time and measure these changes with the adapted Wasserstein distance. We prove explicit first-order approximations for both problems. Expected utility maximization is discussed as a special case
Downloadable Article
Publication:SIAM Journal on Financial Mathematics, 14, 20230630, 704

Zbl 07707121
<–—2023———2023———1270——



Renewable Energy Hosting Capacity Evaluation of Distribution Network Based on WGAN Scenario Generation
Authors: Zhiwen Wei, Junhui Li, Xinxiong Wu, Jianpeng Ye, Yongqiu Liang, Jiaqi Li
2023 IEEE International Conference on Power Science and Technology (ICPST)
Summary:With the continuous improvement of the proportion of renewable energy access in the distribution system, how to reasonably consider the uncertainty of renewable energy power output and efficiently and accurately evaluate the renewable energy hosting capacity of the distribution network is an urgent issue to be considered in current power grid planning. In this work, we proposed a scenario generation method based on Wasserstein Generative Adversarial Networks for renewable energy power output, which can generate the scenarios with the same distribution as real data and more diversity. Then, we extracted representative typical scenarios of renewable energy based on the improved K-means algorithm. We also established a mathematical model for calculating the hosting capacity of the distribution network with the maximum installed capacity of renewable energy as the objective function, including constraints such as power flow balance, voltage deviation, and voltage fluctuation. Finally, based on wind and solar power generation data for one year from NREL laboratory , a simulation calculation was conducted for a 45-bus distribution network to verify the effectiveness of the proposed model and method
Chapter, 2023
Publication:2023 IEEE International Conference on Power Science and Technology (ICPST), 20230505, 276
Publisher:2023


Low-count PET image reconstruction algorithm based on WGAN-GP
Authors:Ruiqi Fang (Author), Ruipeng Guo (Author), Min Zhao (Author), Min Yao (Author)
Summary:Positron emission tomography (PET) technique can visualize the working status or fluid flow state inside opaque devices, and how to reconstruct high-quality images from low-count (LC) projection data with short scan time to meet the real-time online inspection remains an important research problem. A direct reconstruction algorithm CED-PET based on gradient-penalized Wasserstein Generative Adversarial Network (WGAN-GP) architecture is proposed. This network combines content loss, perceptual loss, and adversarial loss to achieve fast and high-quality reconstruction of low-count projection data. In addition, a special dataset for obtuse body bypassing was produced by combining Computational Fluid Dynamics (CFD) simulation software and the Geant4 Application for Tomographic Emission (GATE) simulation platform. The results on this dataset show that CED-PET can quickly reconstruct high-quality images with more realistic detail contours
Chapter, 2023
Publication:Proceedings of the 2023 3rd International Conference on Bioinformatics and Intelligent Computing, 20230210, 60
Publisher:2023

Gradient Flows for Probabilistic Frame Potentials in the Wasserstein Space
Authors: Clare Wickman, Kasso A. Okoudjou
Summary:Abstract. In this paper we bring together some of the key ideas and methods of two disparate fields of mathematical research, frame theory, and optimal transport, using the methods of the second to answer questions posed in the first. In particular, we construct gradient flows in the Wasserstein space for a new potential, the tightness potential, which is a modification of the probabilistic frame potential. It is shown that the potential is suited for the application of a gradient descent scheme from optimal transport that can be used as the basis of an algorithm to evolve an existing frame toward a tight probabilistic frame
Downloadable Article
Publication:SIAM Journal on Mathematical Analysis, 55, 20230630, 2324

Zbl 07714700


2023 see 2022

On Assignment Problems Related to Gromov-Wasserstein Distances on the Real Line
Show more
Authors: Robert Beinert, Cosmas Heiss, Gabriele Steidl
Summary:Abstract. Let and , , be real numbers. We show by an example that the assignment problem is in general neither solved by the identical permutation ( ) nor the anti-identical permutation ( ) if . Indeed the above maximum can be, depending on the number of points, arbitrarily far away from and . The motivation to deal with such assignment problems came from their relation to Gromov-Wasserstein distances, which have recently received a lot of attention in imaging and shape analysis
Downloadable Article
Publication:SIAM Journal on Imaging Sciences, 16, 20230630, 1028
Society for Industrial and Applied Mathematics

https://epubs.siam.org › doi › abs

by R Beinert · 2023 · Cited by 6 — The motivation to deal with such assignment problems came from their relation to GromovWasserstein distances, which have recently received a lot of attention ...



2023 video

WGANs: A stable alternative to traditional GANs - YouTube

www.youtube.com › watch

In this video, we'll explore the Wasserstein GAN with Gradient Penalty, which addresses the instability issues in traditional GANs.

YouTube · Developers Hutt · 


May 1, 2023
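Since this entry and several above revolve around WGAN-GP, a minimal PyTorch sketch of the gradient-penalty term is included here for reference; the `critic`, the image-shaped batches and the coefficient `lambda_gp = 10` are assumptions taken from common practice, not from the video.

    import torch

    def gradient_penalty(critic, real, fake, lambda_gp=10.0):
        """WGAN-GP penalty: push the critic's gradient norm toward 1 on interpolates."""
        # eps is broadcast over an image batch of shape (N, C, H, W); adjust for other shapes.
        eps = torch.rand(real.size(0), 1, 1, 1, device=real.device)
        interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
        scores = critic(interp)
        grads = torch.autograd.grad(outputs=scores, inputs=interp,
                                    grad_outputs=torch.ones_like(scores),
                                    create_graph=True, retain_graph=True)[0]
        grad_norm = grads.view(grads.size(0), -1).norm(2, dim=1)
        return lambda_gp * ((grad_norm - 1.0) ** 2).mean()

    # Critic objective sketch: critic(fake).mean() - critic(real).mean() + gradient_penalty(critic, real, fake)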


2023


2023 video

Training WGAN-GP to generate fake People portrait images

pylessons.com › wgan-gp

28:19

Published ... The Wasserstein GAN, or WGAN, was a breakthrough in Generative Adversarial Network ... Wasserstein GAN with Gradient Penalty.

PyLessons · Python Lessons · 1 month ago

May 30, 2023 


2023 no month
video

DVGAN: Stabilize Wasserstein GAN training for time-domain Gravitational Wave 

www.computer.org › csdl › video-library › video

2023 see 2022. [PDF] unimib.it

Integration of heterogeneous single cell data with Wasserstein Generative Adversarial Networks

V Giansanti - 2023 - boa.unimib.it

… Mini-batches are selected by a Bayesian ridge regressor to train a Wasserstein Generative

Adversarial Network with gradient penalty. The output of the generative network is used to …

Related articles All 2 versions 



Brain Tumour Segmentation Using Wasserstein Generative Adversarial Networks(WGANs)

S. Nyamathulla;

Ch.Sai Meghana;

K. Yasaswi

2023 7th International Conference on Trends in Electronics and Informatics (ICOEI)

Year: 2023 | Conference Paper | Publisher: IEEE

Related articles

A Method for Generating Subsynchronous Oscillation Data of Power System Based on Wasserstein Generative Adversarial Network

Yuhang Zheng;

Miao Miao;

Xiangjia Peng;

Jiaxing Lei;

Shuang Feng

2023 IEEE 6th International Electrical and Energy Conference (CIEEC)

Year: 2023 | Conference Paper | Publisher: IEEE

<–—2023———2023———1280——



WGAN-Based Dialogue System for Embedding Humor, Empathy, and Cultural Aspects in Education

Chunpeng Zhai;

Santoso Wibowo

IEEE Access

Year: 2023 | Early Access Article | Publisher: IEEE

Related articles All 2 versions


WGAN-GP with Residual Network Model for Lithium Battery Thermal Image Data Expansion with Quantitative Metrics

Fengshuo Hu;

Chaoyu Dong;

Mingshen Wang;

Qian Xiao;

Yunfei Mu;

Hongjie Jia

2023 IEEE 6th International Electrical and Energy Conference (CIEEC)

Year: 2023 | Conference Paper | Publisher: IEEE

Cited by 1 Related articles

2023 see 2022

Wasserstein Distributionally Robust Planning Model for Renewable Sources and Energy Storage Systems Under Multiple Uncertainties

Junkai Li;

Zhengyang Xu;

Hong Liu;

Chengshan Wang;

Liyong Wang;

Chenghong Gu

IEEE Transactions on Sustainable Energy

Year: 2023 | Volume: 14, Issue: 3 | Journal Article | Publisher: IEEE


2023 see 2022

Distributed Wasserstein Barycenters via Displacement Interpolation

Pedro Cisneros-Velarde;

Francesco Bullo

IEEE Transactions on Control of Network Systems

Year: 2023 | Volume: 10, Issue: 2 | Journal Article | Publisher: IEEE



Wasserstein GAN-Based Digital Twin-Inspired Model for Early Drift Fault Detection in Wireless Sensor Networks

Md. Nazmul Hasan;

Sana Ullah Jan;

Insoo Koo

IEEE Sensors Journal

Year: 2023 | Volume: 23, Issue: 12 | Journal Article | Publisher: IEEE


2023


Correntropy-Induced Wasserstein GCN: Learning Graph Embedding via Domain Adaptation

Wei Wang;

Gaowei Zhang;

Hongyong Han;

Chi Zhang

IEEE Transactions on Image Processing

Year: 2023 | Volume: 32 | Journal Article | Publisher: IEEE


Path Planning Using Wasserstein Distributionally Robust Deep Q-learning

Cem Alptürk;

Venkatraman Renganathan

2023 European Control Conference (ECC)

Year: 2023 | Conference Paper | Publisher: IEEE

Related articles All 8 versions

[PDF] arxiv.org

Computing Wasserstein distance for persistence diagrams on a quantum computer

JJ Berwald, JM Gottlieb, E Munch - arXiv preprint arXiv:1809.06433, 2018 - arxiv.org

… the constraints, the quantum computer will also fail to … quantum computer finds Wasserstein 

distances corresponding to the low-energy states correctly for small problems. The quantum

Cited by 25 Related articles All 3 versions


2023 see 2021. [PDF] arxiv.org

Wasserstein complexity of quantum circuits

L Li, K Bu, DE Koh, A Jaffe, S Lloyd - arXiv preprint arXiv:2208.06306, 2022 - arxiv.org

quantum resources to computational resources. Our results provide novel applications of 

the quantum Wasserstein … of the resources needed to implement a quantum computation. …

Cited by 6 Related articles All 2 versions

2023 see 2021. [PDF] neurips.cc

Quantum Wasserstein generative adversarial networks

S Chakrabarti, H Yiming, T Li… - Advances in Neural …, 2019 - proceedings.neurips.cc

… a definition of the Wasserstein semimetric between quantum data, … We also demonstrate how 

to turn the quantum Wasserstein … our quantum WGAN on an actual quantum computer? Our …

Cited by 68 Related articles All 8 versions

<–—2023———2023———1290—



2023 see 2022. [PDF] arxiv.org

The quantum Wasserstein distance of order 1

G De Palma, M Marvian, D Trevisan… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org

… W1 distance to be a powerful tool with a broad range of applications in quantum information, 

quantum computing and quantum machine learning. We propose a few of them in the …

Cited by 55 Related articles All 15 versions



2023 see 2022. [PDF] arxiv.org

Wasserstein solution quality and the quantum approximate optimization algorithm: a portfolio optimization case study

JS Baker, SK Radha - arXiv preprint arXiv:2202.06782, 2022 - arxiv.org

… Using the WD (W), we were able to provide insights into how quantum computational 

difficulty scales with increasing n and p having identified the existence of critical p-values where …

Cited by 12 Related articles All 5 versions



[PDF] academia.edu

Machine learning algorithms in quantum computing: A survey

SB Ramezani, A Sommers… - … joint conference on …, 2020 - ieeexplore.ieee.org

Wasserstein GANs (WGAN) is a variation of such networks that uses Wasserstein distance 

to determine the distance between the actual and generated distributions and is optimistic for …

Related articles All 2 versions



2023 see 2022

The Wasserstein Distance Using QAOA: A Quantum Augmented Approach to Topological Data Analysis

M Saravanan, M Gopikrishnan - 2022 International Conference …, 2022 - ieeexplore.ieee.org

Wasserstein Distance by applying the Quantum Approximate Optimization Algorithm (QAOA) 

using gate-based quantum computing … gate-based quantum computers become ubiquitous. …

Related articles



[PDF] metacatalogo.com

On quantum versions of the classical Wasserstein distance

J Agredo, F Fagnola - Stochastics, 2017 - Taylor & Francis

… We investigate a definition of quantum Wasserstein distance of two states based on their 

couplings on the product algebra as in the classical case. A detailed analysis of the two qubit …

Cited by 15 Related articles All 5 versions


2023



arXiv:2307.10099 [pdf, other] math.ST stat.CO stat.ML
Memory Efficient And Minimax Distribution Estimation Under Wasserstein Distance Using Bayesian Histograms
Authors: Peter Matthew Jacobs, Lekha Patel, Anirban Bhattacharya, Debdeep Pati
Abstract: We study Bayesian histograms for distribution estimation on $[0,1]^d$ under the Wasserstein $W_v$, $1 \le v < \infty$, distance in the i.i.d. sampling regime. We newly show that when $d < 2v$, histograms possess a special \textit{memory efficiency} property, whereby in reference to the sample size $n$, order $n^{d/2v}$ bins are needed to obtain minimax rate optimality. This result holds for the poste…  More
Submitted 19 July, 2023; originally announced July 2023.



arXiv:2307.10093 [pdf, other] cs.LG q-bio.GN stat.ML
Revisiting invariances and introducing priors in Gromov-Wasserstein distances
Authors: Pinar Demetci, Quang Huy Tran, Ievgen Redko, Ritambhara Singh
Abstract: Gromov-Wasserstein distance has found many applications in machine learning due to its ability to compare measures across metric spaces and its invariance to isometric transformations. However, in certain applications, this invariance property can be too flexible, thus undesirable. Moreover, the Gromov-Wasserstein distance solely considers pairwise sample similarities in input datasets, disregardi…  More
Submitted 19 July, 2023; originally announced July 2023.

Related articles All 2 versions 
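For reference, the order-2 Gromov-Wasserstein discrepancy that these entries build on is commonly written as follows (standard textbook form, not a quotation from the paper):

\[
\mathrm{GW}_2\big((X,d_X,\mu),(Y,d_Y,\nu)\big)^2
= \min_{\pi \in \Pi(\mu,\nu)} \iint \big|d_X(x,x') - d_Y(y,y')\big|^2 \, d\pi(x,y)\, d\pi(x',y'),
\]

where $\Pi(\mu,\nu)$ is the set of couplings of $\mu$ and $\nu$; invariance to isometries is immediate because only the internal distances $d_X$ and $d_Y$ enter the cost, which is exactly the flexibility the paper proposes to temper with priors.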

arXiv:2307.09057 [pdf, ps, other] math.OC cs.LG stat.ML
Globally solving the Gromov-Wasserstein problem for point clouds in low dimensional Euclidean spaces
Authors: Martin Ryner, Jan Kronqvist, Johan Karlsson
Abstract: This paper presents a framework for computing the Gromov-Wasserstein problem between two sets of points in low dimensional spaces, where the discrepancy is the squared Euclidean norm. The Gromov-Wasserstein problem is a generalization of the optimal transport problem that finds the assignment between two sets preserving pairwise distances as much as possible. This can be used to quantify the simil…  More
Submitted 18 July, 2023; originally announced July 2023.
Comments: 20 pages, 5 figures
MSC Class: 90C26


arXiv:2307.08402 [pdf, ps, other] math.PR

Wasserstein distance in terms of the Comonotonicity Copula
Authors: Mariem Abdellatif, Peter Kuchling, Barbara Rüdiger, Irene Ventura
Abstract: In this article, we represent the Wasserstein metric of order $p$, where $p \in [1,\infty)$, in terms of the comonotonicity copula, for the case of probability measures on $\mathbb{R}^d$, by revisiting existing results. In 1973, Vallender established the link between the $1$-Wasserstein metric and the corresponding distribution functions for $d=1$. In 1956 Giorgio dall'Aglio showed that the p-Wasserstein m…  More
Submitted 17 July, 2023; originally announced July 2023.

All 2 versions 
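The one-dimensional facts alluded to in this abstract (Vallender 1973; dall'Aglio 1956) can be summarized as follows for measures on $\mathbb{R}$ with distribution functions $F_\mu, F_\nu$; this is the standard statement rather than the paper's own notation:

\[
W_p(\mu,\nu) = \Big(\int_0^1 \big|F_\mu^{-1}(u)-F_\nu^{-1}(u)\big|^p\,du\Big)^{1/p},
\qquad
W_1(\mu,\nu) = \int_{\mathbb{R}} \big|F_\mu(x)-F_\nu(x)\big|\,dx,
\]

and the optimal coupling is the comonotone (quantile) coupling, i.e. the one induced by the comonotonicity copula.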

arXiv:2307.07273 [pdf, ps, other] math.OA math.FA doi:10.13001/ela.2023.7679
Preservers of the p-power and the Wasserstein means on 2×2 matrices
Authors: Richárd Simon, Dániel Virosztek
Abstract: In one of his recent papers \cite{ML1}, Molnár showed that if $\mathcal{A}$ is a von Neumann algebra without $I_1, I_2$-type direct summands, then any function from the positive definite cone of $\mathcal{A}$ to the positive real numbers preserving the Kubo-Ando power mean for some $0 \neq p \in (-1,1)$ is necessarily constant. It was shown in that paper that $I_1$-type algebras admit nontrivial…  More
Submitted 14 July, 2023; originally announced July 2023.
Comments: accepted manuscript version
MSC Class: Primary: 15A24. Secondary: 47A64
Journal ref: Electron. J. Linear Algebra 39 (2023), 395-408

<–—2023———2023———1300——


arXiv:2307.07084 [pdf, other] cs.LG cs.AI cs.RO eess.SY
Safe Reinforcement Learning as Wasserstein Variational Inference: Formal Methods for Interpretability
Authors: Yanran Wang, David Boyle
Abstract: Reinforcement Learning or optimal control can provide effective reasoning for sequential decision-making problems with variable dynamics. Such reasoning in practical implementation, however, poses a persistent challenge in interpreting the reward function and corresponding optimal policy. Consequently, formalizing the sequential decision-making problems as inference has a considerable value, as pr…  More
Submitted 13 July, 2023; originally announced July 2023.
Comments: 24 pages, 8 figures, containing Appendix


arXiv:2307.07050 [pdf, other] physics.comp-ph cs.LG physics.chem-ph
Wasserstein Quantum Monte Carlo: A Novel Approach for Solving the Quantum Many-Body Schrödinger Equation
Authors: Kirill Neklyudov, Jannes Nys, Luca Thiede, Juan Carrasquilla, Qiang Liu, Max Welling, Alireza Makhzani
Abstract: Solving the quantum many-body Schrödinger equation is a fundamental and challenging problem in the fields of quantum physics, quantum chemistry, and material sciences. One of the common computational approaches to this problem is Quantum Variational Monte Carlo (QVMC), in which ground-state solutions are obtained by minimizing the energy of the system within a restricted family of parameterized wa…  More
Submitted 6 July, 2023; originally announced July 2023.

Cited by 5 Related articles All 4 versions 

arXiv:2307.06137 [pdf, other] stat.ME math.ST
Distribution-on-Distribution Regression with Wasserstein Metric: Multivariate Gaussian Case
Authors: Ryo Okano, Masaaki Imaizumi
Abstract: Distribution data refers to a data set where each sample is represented as a probability distribution, a subject area receiving burgeoning interest in the field of statistics. Although several studies have developed distribution-to-distribution regression models for univariate variables, the multivariate scenario remains under-explored due to technical complexities. In this study, we introduce mod…  More
Submitted 12 July, 2023; originally announced July 2023.
Comments: 32 pages


arXiv:2307.05802 [pdf, ps, other] math.MG math.OC math.PR
Sliced Wasserstein Distance between Probability Measures on Hilbert Spaces
Authors: Ruiyu Han
Abstract: The sliced Wasserstein distance as well as its variants have been widely considered in comparing probability measures defined on $\mathbb{R}^d$; however, we are not aware of whether the notion can be extended to probability measures on a noncompact infinite dimensional space. Here we derive an analogy of sliced Wasserstein distance for measures on an infinite dimensional separable Hilbert space,…  More
Submitted 11 July, 2023; originally announced July 2023.
Comments: 11 pages, 0 figures
MSC Class: 53B12
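A finite-dimensional Monte Carlo sketch of the construction this entry generalizes: project both samples onto random directions and average the closed-form one-dimensional distances. The toy Gaussian samples and the number of projections are illustrative assumptions.

    import numpy as np
    from scipy.stats import wasserstein_distance  # closed-form 1-D W1

    def sliced_wasserstein(x, y, n_proj=200, seed=0):
        """Monte Carlo estimate of the sliced W1 distance between samples x, y in R^d."""
        rng = np.random.default_rng(seed)
        d = x.shape[1]
        total = 0.0
        for _ in range(n_proj):
            theta = rng.normal(size=d)
            theta /= np.linalg.norm(theta)          # random direction on the unit sphere
            total += wasserstein_distance(x @ theta, y @ theta)
        return total / n_proj

    # Toy Gaussian samples (illustrative only).
    x = np.random.default_rng(1).normal(size=(500, 3))
    y = np.random.default_rng(2).normal(loc=0.5, size=(500, 3))
    print(sliced_wasserstein(x, y))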


arXiv:2307.04966 [pdf, other] math.OC
Wasserstein Distributionally Robust Regret-Optimal Control under Partial Observability
Authors: Joudi Hajar, Taylan Kargin, Babak Hassibi
Abstract: This paper presents a framework for Wasserstein distributionally robust (DR) regret-optimal (RO) control in the context of partially observable systems. DR-RO control considers the regret in LQR cost between a causal and non-causal controller and aims to minimize the worst-case regret over all disturbances whose probability distribution is within a certain Wasserstein-2 ball of a nominal distribut…  More
Submitted 10 July, 2023; originally announced July 2023.


2023


arXiv:2307.04188 [pdf, ps, other] math.PR math.ST
Wasserstein-p Bounds in the Central Limit Theorem Under Local Dependence
Authors: Tianle Liu, Morgane Austern
Abstract: The central limit theorem (CLT) is one of the most fundamental results in probability; and establishing its rate of convergence has been a key question since the 1940s. For independent random variables, a series of recent works established optimal error bounds under the Wasserstein-p distance (with p>=1). In this paper, we extend those results to locally dependent random variables, which include m…  More
Submitted 9 July, 2023; originally announced July 2023.
Comments: 49 pages. arXiv admin note: substantial text overlap with arXiv:2209.09377
MSC Class: 60F05


 2023 see 2022

Jeong, Miran; Hwang, Jinmi; Kim, Sejong

Right mean for the αz-Bures-Wasserstein quantum divergence. (English) Zbl 07713414

Acta Math. Sci., Ser. B, Engl. Ed. 43, No. 5, 2320-2332 (2023).

MSC:  81P17 15B48

PDF BibTeX XML Cite

Full Text: DOI 

 arXiv 

OpenURL 

MR4614017


 2023 see 2022

Li, Wuchen; Zhao, Jiaxi

Wasserstein information matrix. (English) Zbl 07711389

Inf. Geom. 6, No. 1, 203-255 (2023).

MSC:  53C23 94A17 49Q22 90C26

PDF BibTeX XML Cite

Full Text: DOI 

 arXiv 

OpenURL 

 Zbl 07711389




Chen, Yaqing; Lin, Zhenhua; Müller, Hans-Georg

Wasserstein regression. (English) Zbl 07707208

J. Am. Stat. Assoc. 118, No. 542, 869-882 (2023).

MSC:  62-XX

PDF BibTeX XML Cite

Full Text: DOI 

 arXiv 


 


Augmentation of FTIR spectral datasets using Wasserstein generative adversarial networks for cancer liquid biopsies
by McHardy, Rose G.; Antoniou, Georgios; Conn, Justin J. A. ; More...
Analyst (London), 07/2023
Over recent years, deep learning (DL) has become more widely used within the field of cancer diagnostics. However, DL often requires large training datasets to...
Article PDFPDF
Journal Article  Full Text Online

<–—2023———2023———1310——


 
Modified locally joint sparse marginal embedding and wasserstein generation adversarial network for bearing fault diagnosis
by Zhou, Hongdi; Zhang, Hang; Li, Zhi ; More...
Journal of vibration and control, 07/2023
Rolling bearings are essential parts for manufacturing machines. Vast quantities of features are often extracted from measured signals to comprehensively...
Article PDFPDF
Journal Article  Full Text Online

Preservers of the p-power and the Wasserstein means on 2x2 matrices
by Simon, Richárd; Virosztek, Dániel
The Electronic journal of linear algebra, 07/2023, Volume 39
In one of his recent papers, Molnár showed that if $\mathcal{A}$ is a von Neumann algebra without $I_1, I_2$-type direct summands, then any function from the...
Article PDFPDF
Journal Article  Full Text Online

Wasserstein Generative Adversarial Network–Gradient Penalty-Based Model with Imbalanced Data Enhancement for Network Intrusion Detection
by Lee, Gwo-Chuan; Li, Jyun-Hong; Li, Zi-Yang
Applied sciences, 07/2023, Volume 13, Issue 14
In today’s network intrusion detection systems (NIDS), certain types of network attack packets are sparse compared to regular network packets, making them...
Article PDFPDF
Journal Article  Full Text Online
View in Context Browse Journal

2023 see arXiv

Distribution-on-Distribution Regression with Wasserstein Metric: Multivariate Gaussian Case
by Okano, Ryo; Imaizumi, Masaaki
07/2023
Distribution data refers to a data set where each sample is represented as a probability distribution, a subject area receiving burgeoning interest in the...
Journal Article  Full Text Online
Cited by 1
 Related articles All 2 versions 


2023 see arXiv
Sliced Wasserstein Distance between Probability Measures on Hilbert Spaces
by Han, Ruiyu
07/2023
The sliced Wasserstein distance as well as its variants have been widely considered in comparing probability measures defined on $\mathbb R^d$; however, we are...
Journal Article  Full Text Online
Cite Cited by 1 Related articles All 2 versions 


2023


2023 see arXiv Zbl
Wasserstein Distributionally Robust Regret-Optimal Control under Partial Observability

by Hajar, Joudi; Kargin, Taylan; Hassibi, Babak
Cited by 1
 Related articles All 3 versions


07/2023
This paper presents a framework for Wasserstein distributionally robust (DR) regret-optimal (RO) control in the context of partially observable systems. DR-RO...
Journal Article  Full Text Online
 
2023 see arXiv
Globally solving the Gromov-Wasserstein problem for point clouds in low dimensional Euclidean spaces

by Ryner, Martin; Kronqvist, Jan; Karlsson, Johan
07/2023
This paper presents a framework for computing the Gromov-Wasserstein problem between two sets of points in low dimensional spaces, where the discrepancy is the...
Journal Article  Full Text Online
Cited by 4
 Related articles All 4 versions 


Preservers of the $p$-power and the Wasserstein means on $2 \times 2$ matrices

by Simon, Richárd; Virosztek, Dániel
07/2023
Electron. J. Linear Algebra 39 (2023), 395-408 In one of his recent papers \cite{ML1}, Moln\'ar showed that if $\mathcal{A}$ is a von Neumann algebra without...
Article PDFPDF
Journal Article  Full Text Online
View in Context Browse Journal

2023 see arXiv
Safe Reinforcement Learning as Wasserstein Variational Inference: Formal Methods for Interpretability
by Wang, Yanran; Boyle, David
07/2023
Reinforcement Learning or optimal control can provide effective reasoning for sequential decision-making problems with variable dynamics. Such reasoning in...
Journal Article  Full Text Online
Related articles
 All 2 versions 

2023 see 2022
Multi-Marginal Gromov-Wasserstein Transport and Barycenters
by Beier, Florian; Beinert, Robert; Steidl, Gabriele
arXiv.org, 07/2023
Gromov-Wasserstein (GW) distances are combinations of Gromov-Hausdorff and Wasserstein distances that allow the comparison of two different metric measure...
Paper  Full Text Online

<–—2023———2023———1320——



2023 see 2021
Balanced Coarsening for Multilevel Hypergraph Partitioning via Wasserstein Discrepancy
by Guo, Zhicheng; Zhao, Jiaxuan; Jiao, Licheng ; More...
arXiv.org, 07/2023
We propose a balanced coarsening scheme for multilevel hypergraph partitioning. In addition, an initial partitioning algorithm is designed to improve the...
Paper  Full Text Online


 2023 see 2022
Rate of convergence of the smoothed empirical Wasserstein distance

by Block, Adam; Jia, Zeyu; Polyanskiy, Yury ; More...

arXiv.org, 07/2023
Consider an empirical measure \(\mathbb{P}_n\) induced by \(n\) iid samples from a \(d\)-dimensional \(K\)-subgaussian distribution \(\mathbb{P}\) and let...
Paper  Full Text Online



Preservers of the \(p\)-power and the Wasserstein means on \(2 \times 2\) matrices
by Richárd Simon; Virosztek, Dániel
arXiv.org, 07/2023
In one of his recent papers \cite{ML1}, Molnár showed that if \(\mathcal{A}\) is a von Neumann algebra without \(I_1, I_2\)-type direct summands, then any...
Paper  Full Text Online


New Proteomics Study Findings Have Been Reported from University of Toulouse (Two-sample Goodness-of-fit Tests On the Flat Torus Based On Wasserstein...
Health & Medicine Week, 07/2023
Newsletter  Full Text Online



 Conditional Wasserstein Generator

Kim, YG; Lee, K and Paik, MC

Jun 1 2023 | 

IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE

 45 (6) , pp.7208-7219

The statistical distance of conditional distributions is an essential element of generating target data given some data as in video prediction. We establish how the statistical distances between two joint distributions are related to those between two conditional distributions for three popular statistical distances: f-divergence, Wasserstein distance, and integral probability metrics. Such cha

Show more

Full Text at Publisher

Cited by 4 Related articles All 6 versions

2023


Multi-Angle Facial Expression Recognition Algorithm Combined with Dual-Channel WGAN-GP

Deng, Y; Shi, YP; (...); Liu, J

Sep 2022 | 

LASER & OPTOELECTRONICS PROGRESS

 59 (18)

A multi-angle facial expression recognition algorithm combined with dual-channel WGAN-GP is suggested to address the concerns of poor performance of standard algorithms for multi-angle facial expression identification and bad quality of frontal face pictures generated under deflection angles. Traditional models only use profile features to recognize the multi-angle facial expression, which lead

Show more

View full text. 1 Citation. 42 References. Related records
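Editorial note on the WGAN-GP component mentioned above (generic form, not the dual-channel architecture of the cited paper): the gradient penalty pushes the critic's gradient norm towards 1 on interpolates between real and generated samples. A minimal PyTorch sketch, assuming 4-D image batches; the names critic, real, fake and the value lambda_gp=10.0 are illustrative defaults.

import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    # Penalize deviations of the critic's gradient norm from 1 on random interpolates.
    eps = torch.rand(real.size(0), 1, 1, 1, device=real.device)  # assumes (B, C, H, W) tensors
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads = torch.autograd.grad(outputs=scores.sum(), inputs=interp, create_graph=True)[0]
    grad_norm = grads.reshape(grads.size(0), -1).norm(2, dim=1)
    return lambda_gp * ((grad_norm - 1.0) ** 2).mean()

# Critic loss for one batch (sketch): E[D(fake)] - E[D(real)] + gradient penalty.
# loss_D = critic(fake).mean() - critic(real).mean() + gradient_penalty(critic, real, fake)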

 

2023 patent

…between current evacuation distribution and original distribution, and generating evacuation scheme according to original and current evacuation distributions

CN116205773-A

Inventor(s) LUO Z; ZHENG X; (...); LEI N

Assignee(s) UNIV DALIAN TECHNOLOGY

Derwent Primary Accession Number 

2023-61640R


Computer program for performing visual tracking using Wasserstein distance, has set of instructions for determining next position of target by using distance between distribution estimated for candidate positions and external distribution

KR2023080165-A

Inventor(s) KWON J and KIM Y

Assignee(s) UNIV CHUNG ANG IND ACAD COOP FOUND

Derwent Primary Accession Number 

2023-625809


From p-Wasserstein bounds to moderate deviations

Fang, X and Koike, Y

2023 | 

ELECTRONIC JOURNAL OF PROBABILITY

 28

We use a new method via p-Wasserstein bounds to prove Cramer-type moderate deviations in (multivariate) normal approximations. In the classical setting that W is a standardized sum of n independent and identically distributed (i.i.d.) random variables with sub-exponential tails, our method recovers the optimal range of 0 ≤ x = o(n^(1/6)) and the near-optimal error rate O(1)(1+x)(log n + x^2)/√n for P(

Show more

Free Full Text from Publisher

48 References. Related records

MR4609449 

2023 patent

Computer-implemented method for training and testing deep wasserstein generative adversarial network, involves measuring quality of noiseless images using peak signal to noise ratio and structural similarity index measure performance metrics

IN202341031956-A

Inventor(s) PRAKASH C and KAVITHA G

Assignee(s) PRAKASH C and KAVITHA G

Derwent Primary Accession Number 

2023-66607F

<–—2023———2023———1330——



2023 see 2022

Quantum Wasserstein distance of order 1 between channels

Duvenhage, R and Mapaya, M

Jun 2023 (Early Access) | 

INFINITE DIMENSIONAL ANALYSIS QUANTUM PROBABILITY AND RELATED TOPICS

We set up a general theory leading to a quantum Wasserstein distance of order 1 between channels in an operator algebraic framework. This gives a metric on the set of channels from one composite system to another, which is deeply connected to reductions of the channels. The additivity and stability properties of this metric are studied.

Free Submitted Article From Repository. Full Text at Publisher

51 References. Related records


Scalable Gromov-Wasserstein Based Comparison of Biological Time Series

Kravtsova, N; McGee, RLM and Dawes, AT

Aug 2023 | 

BULLETIN OF MATHEMATICAL BIOLOGY

 85 (8)

A time series is an extremely abundant data type arising in many areas of scientific research, including the biological sciences. Any method that compares time series data relies on a pairwise distance between trajectories, and the choice of distance measure determines the accuracy and speed of the time series comparison. This paper introduces an optimal transport type distance for comparing ti

Show more

Free Full Text From Publisher

36 References. Related records

MR4614626   


Correntropy-Induced Wasserstein GCN: Learning Graph Embedding via Domain Adaptation.

Wang, Wei; Zhang, Gaowei; (...); Zhang, Chi

2023 | 

IEEE transactions on image processing : a publication of the IEEE Signal Processing Society

 32 , pp.3980-3993

Graph embedding aims at learning vertex representations in a low-dimensional space by distilling information from a complex-structured graph. Recent efforts in graph embedding have been devoted to generalizing the representations from the trained graph in a source domain to the new graph in a different target domain based on information transfer. However, when the graphs are contaminated by unp

Show more

Free Full Text From Publisher

Cited by 4 Related articles All 5 versions


Interactive Guiding Sparse Auto-Encoder with Wasserstein Regularization for Efficient Classification

Lee, HE; Hur, C; (...); Kang, SG

Jun 2023 | 

APPLIED SCIENCES-BASEL

 13 (12)

In the era of big data, feature engineering has proved its efficiency and importance in dimensionality reduction and useful information extraction from original features. Feature engineering can be expressed as dimensionality reduction and is divided into two types of methods, namely, feature selection and feature extraction. Each method has its pros and cons. There are a lot of studies that co

Show more

Free Full Text from Publisher. View Full Text on ProQuest

50 References. Related records


Augmentation of FTIR spectral datasets using Wasserstein generative adversarial networks for cancer liquid biopsies
McHardy, Rose G; Antoniou, Georgios; (...); Palmer, David S

2023-jul-12 | 

The Analyst

Over recent years, deep learning (DL) has become more widely used within the field of cancer diagnostics. However, DL often requires large training datasets to prevent overfitting, which can be difficult and expensive to acquire. Data augmentation is a method that can be used to generate new data points to train DL models. In this study, we use attenuated total reflectance Fourier-transform inf

Show more

Free Full Text From Publisher


2023



Synthetic high-energy computed tomography image via a Wasserstein generative adversarial network with the convolutional block attention module

Kong, H; Yuan, ZD; (...); Hu, ZL

Jun 2023 (Early Access) | 

QUANTITATIVE IMAGING IN MEDICINE AND SURGERY

Background: Computed tomography (CT) is now universally applied into clinical practice with its noninvasive quality and reliability for lesion detection, which highly improves the diagnostic accuracy of patients with systemic diseases. Although low-dose CT reduces X-ray radiation dose and harm to the human body, it inevitably produces noise and artifacts that are detrimental to information acqu

Show more

Related articles All 7 versions


Two-sample goodness-of-fit tests on the flat torus based on Wasserstein distance and their relevance to structural biology*

Gonzalez-Delgado, J; Gonzalez-Sanz, A; (...); Neuvial, P

2023 | 

ELECTRONIC JOURNAL OF STATISTICS

 17 (1) , pp.1547-1586

This work is motivated by the study of local protein structure, which is defined by two variable dihedral angles that take values from probability distributions on the flat torus. Our goal is to provide the space P(R2/Z2) with a metric that quantifies local structural modifications due to changes in the protein sequence, and to define associated two-sample goodness-of-fit testing approaches. Du

Show more

Free Full Text from Publisher

67 References. Related records 

Cited by 13 Related articles All 20 versions


[PDF] aaai.org

Privacy-Preserved Evolutionary Graph Modeling via Gromov-Wasserstein Autoregression

Y Xiang, D Luo, H Xu - Proceedings of the AAAI Conference on Artificial …, 2023 - ojs.aaai.org

… Gromov–Wasserstein distances and the metric approach to object matching. Foundations

of computational mathematics, 11(4): 417–487. Panzarasa, P.; Opsahl, T.; and Carley, KM …



Wasserstein Distance in Deep Learning

J Leo, E Ge, S Li - Available at SSRN 4368733, 2023 - papers.ssrn.com

… The mathematical foundation of Wasserstein distance lies in optimal transport theory, which

provides a framework for solving mass transportation problems and has been extensively …

Cited by 2 Related articles 
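To make the optimal-transport formulation mentioned in the snippet above concrete (an editorial sketch, not the content of the SSRN note): the Wasserstein distance between two discrete measures is the value of a linear program over couplings, which the POT package solves exactly. The example assumes POT and numpy.

import numpy as np
import ot

rng = np.random.default_rng(0)
x = rng.normal(size=(50, 2))              # support points of the source measure
y = rng.normal(loc=2.0, size=(70, 2))     # support points of the target measure
a = ot.unif(50)                           # uniform source weights
b = ot.unif(70)                           # uniform target weights

M = ot.dist(x, y, metric='sqeuclidean')   # ground cost matrix
w2_squared = ot.emd2(a, b, M)             # exact optimal transport cost (squared W2 here)
print(np.sqrt(w2_squared))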


[PDF] acm.org

Dual Critic Conditional Wasserstein GAN for Height-Map Generation

N Ramos, P Santos, J Dias - … Conference on the Foundations of Digital …, 2023 - dl.acm.org

… We adapted one of the most successful models described, the Wasserstein GAN [1], … In

order to accomplish this goal we introduced the Dual Critic Conditional Wasserstein GAN (…

Related articles All 2 versions

<–—2023———2023———1340——


2023 see 2022. [PDF] snu.ac.kr

[PDF] Generating synthetic data with Inferential Wasserstein Generative Adversarial Network

KIM SEUNG-JONG - 2023 - s-space.snu.ac.kr

… In Chapter 2, there will be an introduction about GAN and Wasserstein GAN, or WGAN, which

… The Wasserstein Autoencoder, which is a foundation for the iWGAN, proposes an encoder …



Working Paper

Convergence of SGD for Training Neural Networks with Sliced Wasserstein Losses

Tanguy, Eloi. arXiv.org; Ithaca, Jul 21, 2023


Working Paper

Wasserstein Asymptotics for Brownian Motion on the Flat Torus and Brownian Interlacements

Mariani, Mauro; Trevisan, Dario. arXiv.org; Ithaca, Jul 19, 2023.

arXiv:2307.10325  [pdfpsother]  math.PR

Wasserstein Asymptotics for Brownian Motion on the Flat Torus and Brownian Interlacements

Authors: Mauro Mariani, Dario Trevisan

Abstract: We study the large time behavior of the optimal transportation cost towards the uniform distribution, for the occupation measure of a stationary Brownian motion on the flat torus in d dimensions, where the cost of transporting a unit of mass is given by a power of the flat distance. We establish a global upper bound, in terms of the limit for the analogue problem concerning the occupation measur…  More

Submitted 19 July, 2023; originally announced July 2023.

MSC Class: 60D05; 90C05; 39B62; 60F25; 35J05

 Related articles All 3 versions 

Convergence of SGD for Training Neural Networks with Sliced Wasserstein Loss

Properties of Discrete Sliced Wasserstein Losses

Tanguy, Eloi; Flamary, Rémi; Delon, Julie. arXiv.org; Ithaca, Jul 19, 2023.

arXiv:2307.10352  [pdfother stat.ML   cs.LG  math.OC math.PR
Properties of Discrete Sliced Wasserstein Losses
Authors: Eloi Tanguy, Rémi Flamary, Julie Delon
Abstract: The Sliced Wasserstein (SW) distance has become a popular alternative to the Wasserstein distance for comparing probability measures. Widespread applications include image processing, domain adaptation and generative modelling, where it is common to optimise some parameters in order to minimise SW, which serves as a loss function between discrete probability measures (since measures admitting dens…  More
Submitted 19 July, 2023; originally announced July 2023.

Cited by 2 Related articles All 8 versions 


Working Paper
Memory Efficient And Minimax Distribution Estimation Under Wasserstein Distance Using Bayesian Histograms
Jacobs, Peter Matthew; Patel, Lekha; Bhattacharya, Anirban; Pati, Debdeep. arXiv.org; Ithaca, Jul 19, 2023.
Related articles
 All 2 versions 


2023

 2023 see 2022

Working Paper

 Wasserstein contraction and Poincaré inequalities for elliptic diffusions at high temperature

Monmarché, Pierre. arXiv.org; Ithaca, Jul 19, 2023. 

Listen: Vaccine mandate reversal, special appropriations spending on WGAN

Maine Policy Institute. CE Think Tank Newswire; Miami [Miami]. 14 July 2023.  

Full Text

 Cited by 1 Related articles All 2 versions 

 arXiv:2307.13370  [pdfpsother math.OC s.LG  stat.ML
Computational Guarantees for Doubly Entropic Wasserstein Barycenters via Damped Sinkhorn Iterations
Authors: Lénaïc Chizat, Tomas Vaškevičius
Abstract: We study the computation of doubly regularized Wasserstein barycenters, a recently introduced family of entropic barycenters governed by inner and outer regularization strengths. Previous research has demonstrated that various regularization parameter choices unify several notions of entropy-penalized barycenters while also revealing new ones, including a special case of debiased barycenters. In t…  More
Submitted 25 July, 2023; originally announced July 2023.

arXiv:2307.13362  [pdfother math.PR   math.AP
Wasserstein contraction for the stochastic Morris-Lecar neuron model
Authors: Maxime Herda, Pierre Monmarché, Benoît Perthame
Abstract: Neuron models have attracted a lot of attention recently, both in mathematics and neuroscience. We are interested in studying long-time and large-population emerging properties in a simplified toy model. From a mathematical perspective, this amounts to study the long-time behaviour of a degenerate reflected diffusion process. Using coupling arguments, the flow is proven to be a contraction of the…  More
Submitted 25 July, 2023; originally announced July 2023.
MSC Class: 35Q84; 60J60; 92B20

Related articles All 8 versions 


arXiv:2307.13135  [pdfother math.OC
High-dimensional Optimal Density Control with Wasserstein Metric Matching
Authors: Shaojun Ma, Mengxue Hou, Xiaojing Ye, Haomin Zhou
Abstract: We present a novel computational framework for density control in high-dimensional state spaces. The considered dynamical system consists of a large number of indistinguishable agents whose behaviors can be collectively modeled as a time-evolving probability distribution. The goal is to steer the agents from an initial distribution to reach (or approximate) a given target distribution within a fix…  More
Submitted 24 July, 2023; originally announced July 2023.
Comments: 8 pages, 4 figures. Accepted for IEEE Conference on Decision and Control 2023
MSC Class: 93E20; 76N25; 49L99

Cited by 2 Related articles All 4 versions

<–—2023———2023———1350——


arXiv:2307.12884  [pdfpsother math.MG   math.AT
Coarse embeddability of Wasserstein space and the space of persistence diagrams
Authors: Neil Pritchard, Thomas Weighill
Abstract: We prove an equivalence between open questions about the embeddability of the space of persistence diagrams and the space of probability distributions (i.e.~Wasserstein space). It is known that for many natural metrics, no coarse embedding of either of these two spaces into Hilbert space exists. Some cases remain open, however. In particular, whether coarse embeddings exist with respect to the…  More
Submitted 24 July, 2023; originally announced July 2023.
Comments: 11 pages, 1 figure
MSC Class: 55N99; 51F30



arXiv:2307.12508  [pdfpsother math.ST   stat.ML
Information Geometry of Wasserstein Statistics on Shapes and Affine Deformations
Authors: Shun-ichi Amari, Takeru Matsuda
Abstract: Information geometry and Wasserstein geometry are two main structures introduced in a manifold of probability distributions, and they capture its different characteristics. We study characteristics of Wasserstein geometry in the framework of Li and Zhao (2023) for the affine deformation statistical model, which is a multi-dimensional generalization of the location-scale model. We compare merits an…  More
Submitted 23 July, 2023; originally announced July 2023.


Related articles All 2 versions 

2023 see 2022

 [PDF] arxiv.org

WAD-CMSN: Wasserstein distance-based cross-modal semantic network for zero-shot sketch-based image retrieval

G Xu, Z Hu, J Cai - International Journal of Wavelets, Multiresolution …, 2023 - World Scientific

… To tackle this issue and overcome this drawback, we propose a Wasserstein distance-…

low-dimensional semantic subspace via Wasserstein distance in an adversarial training manner. …

Cited by 1 Related articles All 4 versions


[PDF] jst.go.jp

Cross-domain recommendation using the Gromov–Wasserstein distance [Gromov--Wasserstein 距離を用いたクロスドメイン推薦]

熊谷雄介, 野沢悠哉, 牛久雅崇… - Proceedings of the Annual Conference of the Japanese Society for Artificial Intelligence (人工知能学会全国大会論文) …, 2023 - jstage.jst.go.jp

… Our method utilizes the Gromov–Wasserstein distance to determine the similarity of users

across domains. Through experiments conducted on multiple real-world data sets, we …

Related articles All 2 versions

WGAN-Based Generative Strategy in Evolutionary Multitasking for Multi-objective Optimization

T Zhou, X Yao, G Yue, B Niu - International Conference on Swarm …, 2023 - Springer

… algorithm named MTMO-WGAN, which leverages Wasserstein GAN(WGAN) with weight …

Based on the MTMOO benchmark problems, MTMO-WGAN outperforms EMT-GS in the bulk …

All 2 versions


2023


[PDF] ieee.org

Correntropy-Induced Wasserstein GCN: Learning Graph Embedding via Domain Adaptation

W Wang, G Zhang, H Han… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org

… Next, we learn the target GCN based on extending the Wasserstein distance to avoid noisy 

source nodes as well. At the testing time, the target nodes in the shared space are classified …

All 3 versions



Augmentation of FTIR spectral datasets using Wasserstein generative adversarial networks for cancer liquid biopsies †
Authors:McHardy, Rose G. (Creator), Antoniou, Georgios (Creator), Conn, Justin J. A. (Creator), Baker, Matthew James (Creator), Palmer, David S. (Creator)
Show more
Summary:Over recent years, deep learning (DL) has become more widely used within the field of cancer diagnostics. However, DL often requires large training datasets to prevent overfitting, which can be difficult and expensive to acquire. Data augmentation is a method that can be used to generate new data points to train DL models. In this study, we use attenuated total reflectance Fourier-transform infrared (ATR-FTIR) spectra of patient dried serum samples and compare non-generative data augmentation methods to Wasserstein generative adversarial networks (WGANs) in their ability to improve the performance of a convolutional neural network (CNN) to differentiate between pancreatic cancer and non-cancer samples in a total cohort of 625 patients. The results show that WGAN augmented spectra improve CNN performance more than non-generative augmented spectra. When compared with a model that utilised no augmented spectra, adding WGAN augmented spectra to a CNN with the same architecture and same parameters, increased the area under the receiver operating characteristic curve (AUC) from 0.661 to 0.757, presenting a 15% increase in diagnostic performance. In a separate test on a colorectal cancer dataset, data augmentation using a WGAN led to an increase in AUC from 0.905 to 0.955. This demonstrates the impact data augmentation can have on DL performance for cancer diagnosis when the amount of real data available for model training is limited
Show more
Downloadable Archival Material, English, 2023-07-06
Publisher:The Royal Society of Chemistry, 2023-07-06
Access Free



2023 library see arXiv
Wasserstein Actor-Critic: Directed Exploration via Optimism for Continuous-Actions Control
Authors:Likmeta, Amarildo (Creator), Sacco, Matteo (Creator), Metelli, Alberto Maria (Creator), Restelli, Marcello (Creator)
Summary:Uncertainty quantification has been extensively used as a means to achieve efficient directed exploration in Reinforcement Learning (RL). However, state-of-the-art methods for continuous actions still suffer from high sample complexity requirements. Indeed, they either completely lack strategies for propagating the epistemic uncertainty throughout the updates, or they mix it with aleatoric uncertainty while learning the full return distribution (e.g., distributional RL). In this paper, we propose Wasserstein Actor-Critic (WAC), an actor-critic architecture inspired by the recent Wasserstein Q-Learning (WQL) \citep{wql}, that employs approximate Q-posteriors to represent the epistemic uncertainty and Wasserstein barycenters for uncertainty propagation across the state-action space. WAC enforces exploration in a principled way by guiding the policy learning process with the optimization of an upper bound of the Q-value estimates. Furthermore, we study some peculiar issues that arise when using function approximation, coupled with the uncertainty estimation, and propose a regularized loss for the uncertainty estimation. Finally, we evaluate our algorithm on standard MujoCo tasks as well as suite of continuous-actions domains, where exploration is crucial, in comparison with state-of-the-art baselines
Show more
Downloadable Archival Material, Undefined, 2023-03-04
Publisher:2023-03-04
Access Free

  2023 library
Outlier-Robust Gromov Wasserstein for Graph Data
Authors:Kong, Lemin (Creator), Li, Jiajin (Creator), So, Anthony Man-Cho (Creator)
Summary:Gromov Wasserstein (GW) distance is a powerful tool for comparing and aligning probability distributions supported on different metric spaces. It has become the main modeling technique for aligning heterogeneous data for a wide range of graph learning tasks. However, the GW distance is known to be highly sensitive to outliers, which can result in large inaccuracies if the outliers are given the same weight as other samples in the objective function. To mitigate this issue, we introduce a new and robust version of the GW distance called RGW. RGW features optimistically perturbed marginal constraints within a $\varphi$-divergence based ambiguity set. To make the benefits of RGW more accessible in practice, we develop a computationally efficient algorithm, Bregman proximal alternating linearization minimization, with a theoretical convergence guarantee. Through extensive experimentation, we validate our theoretical results and demonstrate the effectiveness of RGW on real-world graph learning tasks, such as subgraph matching and partial shape correspondence
Show more
Downloadable Archival Material, Undefined, 2023-02-09
Publisher:2023-02-09
Access Free



arXiv:2307.16421  [pdfother math.PR  math.AP  stat.ML
Wasserstein Mirror Gradient Flow as the limit of the Sinkhorn Algorithm
Authors: Nabarun Deb, Young-Heon Kim, Soumik Pal, Geoffrey Schiebinger
Abstract: We prove that the sequence of marginals obtained from the iterations of the Sinkhorn algorithm or the iterative proportional fitting procedure (IPFP) on joint densities, converges to an absolutely continuous curve on the 2-Wasserstein space, as the regularization parameter ε goes to zero and the number of iterations is scaled as 1/ε (and other technical assumptions). This…  More
Submitted 31 July, 2023; originally announced July 2023.
Comments: 49 pages, 2 figures
MSC Class: 49N99; 49Q22
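For readers unfamiliar with the Sinkhorn/IPFP iterations referred to above, a minimal discrete numpy sketch follows (editorial illustration only; the paper's scaling limit is not reproduced here, and eps and n_iter are placeholder values).

import numpy as np

def sinkhorn(a, b, M, eps=0.05, n_iter=500):
    # Entropic optimal transport: alternately rescale rows and columns of the
    # Gibbs kernel so the coupling matches the prescribed marginals a and b.
    K = np.exp(-M / eps)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]

rng = np.random.default_rng(0)
M = rng.random((30, 40))                       # arbitrary cost matrix
P = sinkhorn(np.full(30, 1 / 30), np.full(40, 1 / 40), M)
print(P.sum(axis=1)[:3])                       # row marginals, each close to 1/30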

<–—2023———2023———1360——



arXiv:2307.15764  [pdfpsother math.PR
Geometric Ergodicity, Unique Ergodicity and Wasserstein Continuity of Non-Linear Filters with Compact State Space
Authors: Yunus Emre Demirci, Serdar Yüksel
Abstract: In this paper, we present conditions for the geometric ergodicity of non-linear filter processes, which has received little attention in the literature. Furthermore, we provide additional results on the unique ergodicity of filter processes associated with ergodic hidden Markov models, generalizing existing results to compact state spaces. While previous studies in the field of non-linear filterin…  More
Submitted 28 July, 2023; originally announced July 2023.
MSC Class: 60J05; 60J10; 93E11; 93E15

Cited by 1 Related articles All 2 versions 

arXiv:2307.15423  [pdfother math.NA
Nonlinear reduced basis using mixture Wasserstein barycenters: application to an eigenvalue problem inspired from quantum chemistry
Authors: Maxime Dalery, Genevieve Dusson, Virginie Ehrlacher, Alexei Lozinski
Abstract: The aim of this article is to propose a new reduced-order modelling approach for parametric eigenvalue problems arising in electronic structure calculations. Namely, we develop nonlinear reduced basis techniques for the approximation of parametric eigenvalue problems inspired from quantum chemistry applications. More precisely, we consider here a one-dimensional model which is a toy model for the…  More
Submitted 28 July, 2023; originally announced July 2023.


arXiv:2307.14953  [pdfother cs.LG   cs.AI stat.ML
Multi-Source Domain Adaptation through Dataset Dictionary Learning in Wasserstein Space
Authors: Eduardo Fernandes Montesuma, Fred Ngolè Mboula, Antoine Souloumiac
Abstract: This paper seeks to solve Multi-Source Domain Adaptation (MSDA), which aims to mitigate data distribution shifts when transferring knowledge from multiple labeled source domains to an unlabeled target domain. We propose a novel MSDA framework based on dictionary learning and optimal transport. We interpret each domain in MSDA as an empirical distribution. As such, we express each domain as a Wasse…  More
Submitted 27 July, 2023; originally announced July 2023.
Comments: 13 pages, 9 figures. Accepted as a conference paper at the 


Figalli, Alessio; Glaudo, Federico

An invitation to optimal transport. Wasserstein distances, and gradient flows. 2nd edition. (English) Zbl 07713911

EMS Textbooks in Mathematics. Berlin: European Mathematical Society (EMS) (ISBN 978-3-98547-050-1/hbk; 978-3-98547-550-6/ebook). (2023).

MSC:  49-01 49-02 49Q22 60B05 28A33 35A15 35Q35 49N15 28A50

PDF BibTeX XML

Full Text: DOI 

 Cited by 2 Related articles

…We show that Wasserstein barycenters are an …

Cited by 7 Related articles All 5 versions 

Chen, Zhi; Kuhn, Daniel; Wiesemann, Wolfram

On approximations of data-driven chance constrained programs over Wasserstein balls. (English)

Oper. Res. Lett. 51, No. 3, 226-233 (2023).

MSC:  90-XX

PDF BibTeX XML Cite


2023


2023 see 2022

MR4613194 Prelim Ponnoprat, Donlapark; Universal consistency of Wasserstein k-NN classifier: a negative and some positive results. Inf. Inference 12 (2023), no. 3, iaad027. 62H30 (49Q22 62G20)

Review PDF Clipboard Journal Article

Full Text: DOI 



MR4613165 Prelim Hakobyan, Astghik; Yang, Insoon; 

Distributionally robust differential dynamic programming with Wasserstein distance. IEEE Control Syst. Lett. 7 (2023), 2329–2334. 93E20 (49)

Review PDF Clipboard Journal Article


MR4605212 Prelim Beinert, Robert; Heiss, Cosmas; Steidl, Gabriele; 

On assignment problems related to Gromov–Wasserstein distances on the real line. SIAM J. Imaging Sci. 16 (2023), no. 2, 1028–1032. 90C27 (28A35 49Q22 90B80)

Review PDF Clipboard Journal Article

Cited by 14 Related articles All 3 versions


MLNAN: Multi-level noise-aware network for low-dose CT imaging implemented with constrained cycle Wasserstein generative adversarial networks

Huang, ZX; Li, WB; (...); Zhang, N

Sep 2023 | Jul 2023 (Early Access) | 

ARTIFICIAL INTELLIGENCE IN MEDICINE

 143

Low-dose CT techniques attempt to minimize the radiation exposure of patients by estimating the high -resolution normal-dose CT images to reduce the risk of radiation-induced cancer. In recent years, many deep learning methods have been proposed to solve this problem by building a mapping function between low-dose CT images and their high-dose counterparts. However, most of these methods ignore

Show more

View full text

76 References. Related records



Method for diagnosing bearing fault based on improved Wasserstein distance generative adversarial network, involves inputting processed vibration signal, and obtaining diagnosis precision and fault type of bearing to be tested

CN116223038-A

Inventor(s) SHEN J; ZHU C; (...); ZHANG H

Assignee(s) UNIV JIANGSU SCI & TECHNOLOGY

Derwent Primary Accession Number 

2023-637004

<–—2023———2023———1370—



2022

ECG Classification Based on Wasserstein Scalar Curvature

Sun, FP; Ni, Y; (...); Sun, HF

Oct 2022 | 

ENTROPY

 24 (10)

Electrocardiograms (ECG) analysis is one of the most important ways to diagnose heart disease. This paper proposes an efficient ECG classification method based on Wasserstein scalar curvature to comprehend the connection between heart disease and the mathematical characteristics of ECG. The newly proposed method converts an ECG into a point cloud on the family of Gaussian distribution, where th

Show more

  

Data

Comparisons of the portfolio performance under different Wasserstein radius.

He, Qingyun and Hong, Chuanyang

2023 | 

Figshare

 | Data set

Comparisons of the portfolio performance under different Wasserstein radius. Copyright: CC BY 4.0

 

2023 patent

Enhanced YOLOv5 based short wave frequency hopping signal sorting method, involves replacing intersection of Union (IoU) measurement in Non-maximum suppression-maximum Suppression (NMS) and regression loss function during small target detection by using Normalized Wasserstein Distance

CN116318249-A

Inventor(s) GONG K; CHEN P; (...); ZHU Z

Assignee(s) UNIV ZHENGZHOU

Derwent Primary Accession Number 

2023-76332X
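Editorial illustration of a normalized Wasserstein distance between bounding boxes in the spirit of the patent abstract above (a common published formulation, not necessarily the patent's exact construction): each box (cx, cy, w, h) is modelled as a 2-D Gaussian, and the closed-form 2-Wasserstein distance between the Gaussians is exponentiated into a similarity in (0, 1].

import math

def normalized_wasserstein(box1, box2, C=12.8):
    cx1, cy1, w1, h1 = box1
    cx2, cy2, w2, h2 = box2
    # Squared W2 between N([cx, cy], diag(w^2/4, h^2/4)) Gaussians.
    w2_sq = ((cx1 - cx2) ** 2 + (cy1 - cy2) ** 2
             + ((w1 - w2) ** 2 + (h1 - h2) ** 2) / 4.0)
    # C is a dataset-dependent normalization constant; 12.8 is only a placeholder.
    return math.exp(-math.sqrt(w2_sq) / C)

print(normalized_wasserstein((10, 10, 4, 4), (11, 10, 4, 5)))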


Schematic representation of the learning process of the Wasserstein GAN.

Okada, Kiyoshiro; Endo, Katsuhiro; (...); Kurabayashi, Shuichi

2023 | 

Figshare

 | Data set

At each iteration, the generator receives the latent variable z and outputs a false image, and the discriminator receives the false and real images and calculates the loss using the Wasserstein distance. Copyright: CC BY 4.0


Unsupervised domain adaptation for Covid-19 classification based on balanced slice Wasserstein distance.

Gu, Jiawei; Qian, Xuan; (...); Wu, Fang

2023-jul-05 | 

Computers in biology and medicine

 164 , pp.107207

Covid-19 has swept the world since 2020, taking millions of lives. In order to seek a rapid diagnosis of Covid-19, deep learning-based Covid-19 classification methods have been extensively developed. However, deep learning relies on many samples with high-quality labels, which is expensive. To this end, we propose a novel unsupervised domain adaptation method to process many different but relat

Cited by 4 Related articles All 4 versions


2023

 

Research on image inpainting algorithm of improved total variation minimization method

Chen, YT; Zhang, HP; (...); Xie, JB

Jan 2021 (Early Access) | 

JOURNAL OF AMBIENT INTELLIGENCE AND HUMANIZED COMPUTING

In order to solve the issue mismatching and structure disconnecting in exemplar-based image inpainting, an image completion algorithm based on improved total variation minimization method had been proposed in the paper, refer as ETVM. The structure of image had been extracted using improved total variation minimization method, and the known information of image is sufficiently used by existing

Show more


An Integrated Method Based on Wasserstein Distance and Graph for Cancer Subtype Discovery.

Cao, Qingqing; Zhao, Jianping; (...); Zheng, Chunhou

2023-aug-01 | 

IEEE/ACM transactions on computational biology and bioinformatics

 PP

Due to the complexity of cancer pathogenesis at different omics levels, it is necessary to find a comprehensive method to accurately distinguish and find cancer subtypes for cancer treatment. In this paper, we proposed a new cancer multi-omics subtype identification method, which is based on variational autoencoder measured by Wasserstein distance and graph autoencoder (WVGMO). This method depe

Show more

INFORMATION AND INFERENCE-A JOURNAL OF THE IMA

 12 (1) , pp.363-389

Discriminating between distributions is an important problem in a number of scientific fields. This motivated the introduction of Linear Optimal Transportation (LOT), which embeds the space of distributions into an L-2 -space. The transform is defined by computing the optimal transport of each distribution to a fixed reference distribution and has a number of benefits when it comes to speed of

Show more

Related articles All 5 versions

 

Wasserstein Generative Adversarial Network-Gradient Penalty-Based Model with Imbalanced Data Enhancement for Network Intrusion Detection

Lee, GC; Li, JH and Li, ZY

Jul 2023 | 

APPLIED SCIENCES-BASEL

 13 (14)

In today's network intrusion detection systems (NIDS), certain types of network attack packets are sparse compared to regular network packets, making them challenging to collect, and resulting in significant data imbalances in public NIDS datasets. With respect to attack types with rare data, it is difficult to classify them, even by using various algorithms such as machine learning and deep le

Show more


MLNAN: Multi-level noise-aware network for low-dose CT imaging implemented with constrained cycle Wasserstein generative adversarial networks

Huang, ZX; Li, WB; (...); Zhang, N

Sep 2023 | Jul 2023 (Early Access) | 

ARTIFICIAL INTELLIGENCE IN MEDICINE

 143

Low-dose CT techniques attempt to minimize the radiation exposure of patients by estimating the high -resolution normal-dose CT images to reduce the risk of radiation-induced cancer. In recent years, many deep learning methods have been proposed to solve this problem by building a mapping function between low-dose CT images and their high-dose counterparts. However, most of these methods ignore

Cited by 2 Related articles All 4 versions


2023 Data
WGAN model.

Okada, Kiyoshiro; Endo, Katsuhiro; (...); Kurabayashi, Shuichi

Figshare

 | Data set

Upper figure: Generator model. Lower figure: discriminator model. Copyright: CC BY 4.0

View data

<–—2023———2023———1380——



2023 Data

NIST test results when WGAN learned with dropout layer.

Okada, Kiyoshiro; Endo, Katsuhiro; (...); Kurabayashi, Shuichi

Figshare

 | Data set

NIST test results when WGAN learned with dropout layer. Copyright: CC BY 4.0

View data


 

2023 patent

Method for first-arrival picking of microseismic signals based on improved WGAN-GP and Picking-Net, involves inputting microseismic signal to be picked by first arrival into trained Picking-Net, and outputting picked first arrival signal

CN116299676-A

Inventor(s) TANG JXIE K; (...); SHENG G

Assignee(s) UNIV CHINA THREE GORGES

Derwent Primary Accession Number 

2023-71165L




arXiv:2312.17376  [pdfpsother eess.SY   ath.OC
Wasserstein Distributionally Robust Regret-Optimal Control in the Infinite-Horizon
Authors: Taylan Kargin, Joudi Hajar, Vikrant Malik, Babak Hassibi
Abstract: We investigate the Distributionally Robust Regret-Optimal (DR-RO) control of discrete-time linear dynamical systems with quadratic cost over an infinite horizon. Regret is the difference in cost obtained by a causal controller and a clairvoyant controller with access to future disturbances. We focus on the infinite-horizon framework, which results in stability guarantees. In this DR setting, the p…  More
Submitted 28 December, 2023; originally announced December 2023.
Comments: Submitted to L4DC 

Cite Related articles All 2 versions 

arXiv:2312.15762  [pdfother cs.LG 
s.AI
On Robust Wasserstein Barycenter: The Model and Algorithm
Authors: Xu Wang, Jiawei Huang, Qingyuan Yang, Jinpeng Zhang
Abstract: The Wasserstein barycenter problem is to compute the average of m given probability measures, which has been widely studied in many different areas; however, real-world data sets are often noisy and huge, which impedes its applications in practice. Hence, in this paper, we focus on improving the computational efficiency of two types of robust Wasserstein barycenter problem (RWB): fixed-support R…  More
Submitted 25 December, 2023; originally announced December 2023.
Comments: Algorithms for accelerating robust Wasserstein barycenter problem

Related articles All 2 versions 

arXiv:2312.15394  [pdfpsother math.FA
Order relations of the Wasserstein mean and the spectral geometric mean
Authors: Luyining Gan, Huajun Huang
Abstract: On the space of positive definite matrices, several operator means are popular and have been studied extensively. In this paper, we provide the eigenvalue entrywise order between the Wasserstein mean and the spectral geometric mean. We also study the near order relation proposed by Dimitru and Franco on the matrix means, which is weaker than the Löwner order but stronger than the eigenvalue entryw…  More
Submitted 23 December, 2023; originally announced December 2023.
Comments: 14 pages
MSC Class: 15A42; 15A45; 15B48
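The Wasserstein mean discussed above is built on the Bures-Wasserstein distance between positive definite matrices, d(A, B)^2 = tr(A) + tr(B) - 2 tr((A^{1/2} B A^{1/2})^{1/2}). A small numpy/scipy sketch of that distance follows (editorial illustration; the order relations of the paper are not reproduced).

import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein(A, B):
    # d(A, B)^2 = tr(A) + tr(B) - 2 tr((A^{1/2} B A^{1/2})^{1/2})
    rootA = sqrtm(A)
    cross_tr = np.real(np.trace(sqrtm(rootA @ B @ rootA)))  # sqrtm may return tiny imaginary parts
    return float(np.sqrt(max(np.trace(A) + np.trace(B) - 2 * cross_tr, 0.0)))

A = np.array([[2.0, 0.3], [0.3, 1.0]])
B = np.array([[1.0, 0.0], [0.0, 3.0]])
print(bures_wasserstein(A, B))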


2023



arXiv:2312.14572  [pdfother math.OC 
stat.ML
Semidefinite Relaxations of the Gromov-Wasserstein Distance
Authors: Junyu Chen, Binh T. Nguyen, Yong Sheng Soh
Abstract: The Gromov-Wasserstein (GW) distance is a variant of the optimal transport problem that allows one to match objects between incomparable spaces. At its core, the GW distance is specified as the solution of a non-convex quadratic program and is not known to be tractable to solve. In particular, existing solvers for the GW distance are only able to find locally optimal solutions. In this work, we pr…  More
Submitted 26 December, 2023; v1 submitted 22 December, 2023; originally announced December 2023.


arXiv:2312.12769  [pdfother math.OC
Wasserstein robust combinatorial optimization problems
Authors: Marcel Jackiewicz, Adam Kasperski, Pawel Zielinski
Abstract: This paper discusses a class of combinatorial optimization problems with uncertain costs in the objective function. It is assumed that a sample of the cost realizations is available, which defines an empirical probability distribution for the random cost vector. A Wasserstein ball, centered at the empirical distribution, is used to define an ambiguity set of probability distributions. A solution m…  More
Submitted 20 December, 2023; originally announced December 2023.

Related articles All 3 versions 

arXiv:2312.12230  [pdfother math.OC  cs.LG
It's All in the Mix: Wasserstein Machine Learning with Mixed Features
Authors: Reza Belbasi, Aras Selvi, Wolfram Wiesemann
Abstract: Problem definition: The recent advent of data-driven and end-to-end decision-making across different areas of operations management has led to an ever closer integration of prediction models from machine learning and optimization models from operations research. A key challenge in this context is the presence of estimation errors in the prediction models, which tend to be amplified by the subseque…  More
Submitted 19 December, 2023; originally announced December 2023.
Comments: 48 pages (31 main + proofs), 7 tables, 2 colored plots, an early version appeared in NeurIPS 2022 main track (arXiv 2205.13501)

Cited by 1 Related articles All 3 versions 

arXiv:2312.10322  [pdfpsother math.OC   math.AP   math.PR
Viscosity Solutions of a class of Second Order Hamilton-Jacobi-Bellman Equations in the Wasserstein Space
Authors: Hang Cheung, Ho Man Tai, Jinniao Qiu
Abstract: This paper is devoted to solving a class of second order Hamilton-Jacobi-Bellman (HJB) equations in the Wasserstein space, associated with mean field control problems involving common noise. We provide the well-posedness of viscosity solutions to the HJB equation in the sense of Crandall-Lions' definition, under general assumptions on the coefficients. Our approach adopts the smooth metric develop…  More
Submitted 21 February, 2024; v1 submitted 16 December, 2023; originally announced December 2023.
Comments: 32 pages;
MSC Class: 49L25


arXiv:2312.10295  [pdfother math.OC  cs.DM
On a Generalization of Wasserstein Distance and the Beckmann Problem to Connection Graphs
Authors: Sawyer Robertson, Dhruv Kohli, Gal Mishne, Alexander Cloninger
Abstract: The intersection of connection graphs and discrete optimal transport presents a novel paradigm for understanding complex graphs and node interactions. In this paper, we delve into this unexplored territory by focusing on the Beckmann problem within the context of connection graphs. Our study establishes feasibility conditions for the resulting convex optimization problem on connection graphs. Furt…  More
Submitted 15 December, 2023; originally announced December 2023.
Comments: 19 pages, 6 figures
MSC Class: 65K10; 05C21; 90C25; 68R10; 05C50

Related articles All 2 versions 

<–—2023———2023———1480—



arXiv:2312.09862  [pdfother math.ST 
stat.ME
Wasserstein-based Minimax Estimation of Dependence in Multivariate Regularly Varying Extremes
Authors: Xuhui Zhang, Jose Blanchet, Youssef Marzouk, Viet Anh Nguyen, Sven Wang
Abstract: We study minimax risk bounds for estimators of the spectral measure in multivariate linear factor models, where observations are linear combinations of regularly varying latent factors. Non-asymptotic convergence rates are derived for the multivariate Peak-over-Threshold estimator in terms of the p-th order Wasserstein distance, and information-theoretic lower bounds for the minimax risks are es…  More
Submitted 15 December, 2023; originally announced December 2023.


arXiv:2312.08227  [pdfother stat.ML  cs.CR 
cs.LG
Differentially Private Gradient Flow based on the Sliced Wasserstein Distance for Non-Parametric Generative Modeling
Authors: Ilana Sebag, Muni Sreenivas Pydi, Jean-Yves Franceschi, Alain Rakotomamonjy, Mike Gartrell, Jamal Atif, Alexandre Allauzen
Abstract: Safeguarding privacy in sensitive training data is paramount, particularly in the context of generative modeling. This is done through either differentially private stochastic gradient descent, or with a differentially private metric for training models or generators. In this paper, we introduce a novel differentially private generative modeling approach based on parameter-free gradient flows in t…  More
Submitted 13 December, 2023; originally announced December 2023.

Related articles All 2 versions 

arXiv:2312.07788  [pdfother math-ph  eess.SY  math.OC
Wasserstein speed limits for Langevin systems
Authors: Ralph Sabbagh, Olga Movilla Miangolarra, Tryphon T. Georgiou
Abstract: Physical systems transition between states with finite speed that is limited by energetic costs. In this Letter, we derive bounds on transition times for general Langevin systems that admit a decomposition into reversible and irreversible dynamics, in terms of the Wasserstein distance between states and the energetic costs associated with respective reversible and irreversible currents. For illust…  More
Submitted 12 December, 2023; originally announced December 2023.
Comments: 7 pages, 1 figure
MSC Class: 82C31; 82M60; 82Cxx; 80M60; 93E03

Related articles All 2 versions 


arXiv:2312.07397  [pdfother math.ST
Neural Entropic Gromov-Wasserstein Alignment
Authors: Tao Wang, Ziv Goldfeld
Abstract: The Gromov-Wasserstein (GW) distance, rooted in optimal transport (OT) theory, provides a natural framework for aligning heterogeneous datasets. Alas, statistical estimation of the GW distance suffers from the curse of dimensionality and its exact computation is NP hard. To circumvent these issues, entropic regularization has emerged as a remedy that enables parametric estimation rates via plug-in…  More
Submitted 12 December, 2023; originally announced December 2023.

arXiv:2312.07048  [pdfother cs.CV
Edge Wasserstein Distance Loss for Oriented Object Detection
Authors: Yuke Zhu, Yumeng Ruan, Zihua Xiong, Sheng Guo
Abstract: Regression loss design is an essential topic for oriented object detection. Due to the periodicity of the angle and the ambiguity of width and height definition, traditional L1-distance loss and its variants have been suffered from the metric discontinuity and the square-like problem. As a solution, the distribution based methods show significant advantages by representing oriented boxes as distri…  More
Submitted 12 December, 2023; originally announced December 2023.

Related articles All 3 versions 

arXiv:2312.06591  [pdfother stat.ML 
cs.LG
Concurrent Density Estimation with Wasserstein Autoencoders: Some Statistical Insights
Authors: Anish Chakrabarty, Arkaprabha Basu, Swagatam Das
Abstract: Variational Autoencoders (VAEs) have been a pioneering force in the realm of deep generative models. Amongst its legions of progenies, Wasserstein Autoencoders (WAEs) stand out in particular due to the dual offering of heightened generative quality and a strong theoretical backbone. WAEs consist of an encoding and a decoding network forming a bottleneck with the prime objective of generating new s…  More
Submitted 11 December, 2023; originally announced December 2023.

Related articles All 2 versions 

2023


arXiv:2312.04481  [pdfother stat.ME
Wasserstein complexity penalization priors: a new class of penalizing complexity priors
Authors: David Bolin, Alexandre B. Simas, Zhen Xiong
Abstract: Penalizing complexity (PC) priors is a principled framework for designing priors that reduce model complexity. PC priors penalize the Kullback-Leibler Divergence (KLD) between the distributions induced by a ``simple'' model and that of a more complex model. However, in many common cases, it is impossible to construct a prior in this way because the KLD is infinite. Various approximations are used…  More
Submitted 7 December, 2023; originally announced December 2023.

Related articles All 2 versions 

arXiv:2312.03573  [pdfother math.OC 
eess.SY
On data-driven Wasserstein distributionally robust Nash equilibrium problems with heterogeneous uncertainty
Authors: George Pantazis, Barbara Franci, Sergio Grammatico
Abstract: We study stochastic Nash equilibrium problems subject to heterogeneous uncertainty on the cost functions of the individual agents. In our setting, we assume no prior knowledge of the underlying probability distributions of the uncertain variables. To account for this lack of knowledge, we consider an ambiguity set around the empirical probability distribution under the Wasserstein metric. We then…  More
Submitted 26 January, 2024; v1 submitted 6 December, 2023; originally announced December 2023.

Related articles All 2 versions 

arXiv:2312.02849  [pdfpsother math.ST  cs.LG  math.OC
Algorithms for mean-field variational inference via polyhedral optimization in the Wasserstein space
Authors: Yiheng Jiang, Sinho Chewi, Aram-Alexandre Pooladian
Abstract: We develop a theory of finite-dimensional polyhedral subsets over the Wasserstein space and optimization of functionals over them via first-order methods. Our main application is to the problem of mean-field variational inference, which seeks to approximate a distribution π over R^d by a product measure π*. When π is strongly log-concave and log-smooth, we provide (1) approxi…  More
Submitted 5 December, 2023; originally announced December 2023.
Comments: 40 pages

arXiv:2312.02324  [pdfpsother math.AP 
math.OC 
math.PR
Well-posedness of Hamilton-Jacobi equations in the Wasserstein space: non-convex Hamiltonians and common noise
Authors: Samuel Daudin, Joe Jackson, Benjamin Seeger
Abstract: We establish the well-posedness of viscosity solutions for a class of semi-linear Hamilton-Jacobi equations set on the space of probability measures on the torus. In particular, we focus on equations with both common and idiosyncratic noise, and with Hamiltonians which are not necessarily convex in the momentum variable. Our main results show (i) existence, (ii) the comparison principle (and hence…  More
Submitted 4 December, 2023; originally announced December 2023.

arXiv:2312.01584  [pdfpsother math.AP
Homogenization of Wasserstein gradient flows
Authors: Yuan Gao, Nung Kwan Yip
Abstract: We prove the convergence of a Wasserstein gradient flow of a free energy in an inhomogeneous media. Both the energy and media can depend on the spatial variable in a fast oscillatory manner. In particular, we show that the gradient flow structure is preserved in the limit which is expressed in terms of an effective energy and Wasserstein metric. The gradient flow and its limiting behavior is analy…  More
Submitted 3 December, 2023; originally announced December 2023.

<–—2023———2023———1490—



arXiv:2312.00800  [pdfpsother math.AP
On the global convergence of Wasserstein gradient flow of the Coulomb discrepancy
Authors: Siwan Boufadène, François-Xavier Vialard
Abstract: In this work, we study the Wasserstein gradient flow of the Riesz energy defined on the space of probability measures. The Riesz kernels define a quadratic functional on the space of measure which is not in general geodesically convex in the Wasserstein geometry, therefore one cannot conclude to global convergence of the Wasserstein gradient flow using standard arguments. Our main result is the ex…  More
Submitted 29 January, 2024; v1 submitted 21 November, 2023; originally announced December 2023.

Cited by 1 Related articles All 14 versions 


arXiv:2312.00541  [pdfpsother math-ph  math.PR
Limit theorems for empirical measures of interacting quantum systems in Wasserstein space
Authors: Lorenzo Portinale, Simone Rademacher, Dániel Virosztek
Abstract: We prove fundamental properties of empirical measures induced by measurements performed on quantum N-body systems. More precisely, we consider measurements performed on the ground state of an interacting, trapped Bose gas in the Gross--Pitaevskii regime, known to exhibit Bose--Einstein condensation. For the corresponding empirical measure, we prove a weak law of large numbers with limit induced…  More
Submitted 1 December, 2023; originally announced December 2023.
MSC Class: 60F05 35Q40 81Q99 49Q22



arXiv:2311.18826  [pdfother cs.LG 
stat.ML
Geometry-Aware Normalizing Wasserstein Flows for Optimal Causal Inference
Authors: Kaiwen Hou
Abstract: This paper presents a groundbreaking approach to causal inference by integrating continuous normalizing flows (CNFs) with parametric submodels, enhancing their geometric sensitivity and improving upon traditional Targeted Maximum Likelihood Estimation (TMLE). Our method employs CNFs to refine TMLE, optimizing the Cramér-Rao bound and transitioning from a predefined distribution p_0 to a data-dri…  More
Submitted 1 February, 2024; v1 submitted 30 November, 2023; originally announced November 2023.

Related articles All 2 versions 

arXiv:2311.18645  [pdfother cs.CV 
cs.AI
Stochastic Vision Transformers with Wasserstein Distance-Aware Attention
Authors: Franciskus Xaverius Erick, Mina Rezaei, Johanna Paula Müller, Bernhard Kainz
Abstract: Self-supervised learning is one of the most promising approaches to acquiring knowledge from limited labeled data. Despite the substantial advancements made in recent years, self-supervised models have posed a challenge to practitioners, as they do not readily provide insight into the model's confidence and uncertainty. Tackling this issue is no simple feat, primarily due to the complexity involve…  More
Submitted 30 November, 2023; originally announced November 2023.

All 2 versions 

arXiv:2311.18613  [pdfpsother math.ST
Wasserstein GANs are Minimax Optimal Distribution Estimators
Authors: Arthur Stéphanovitch, Eddie Aamari, Clément Levrard
Abstract: We provide non asymptotic rates of convergence of the Wasserstein Generative Adversarial networks (WGAN) estimator. We build neural networks classes representing the generators and discriminators which yield a GAN that achieves the minimax optimal rate for estimating a certain probability measure μ with support in R^p. The probability μ is considered to be the push forward of the Le…  More
Submitted 30 November, 2023; originally announced November 2023.


2023



arXiv:2311.18531  [pdfother cs.CV  s.AI  s.LG
Dataset Distillation via the Wasserstein Metric
Authors: Haoyang Liu, Yijiang Li, Tiancheng Xing, Vibhu Dalal, Luwei Li, Jingrui He, Haohan Wang
Abstract: Dataset Distillation (DD) emerges as a powerful strategy to encapsulate the expansive information of large datasets into significantly smaller, synthetic equivalents, thereby preserving model performance with reduced computational overhead. Pursuing this objective, we introduce the Wasserstein distance, a metric grounded in optimal transport theory, to enhance distribution matching in DD. Our appr…  More
Submitted 15 March, 2024; v1 submitted 30 November, 2023; originally announced November 2023.
Comments: 21 pages, 8 figures

All 2 versions 

arXiv:2311.16988  [pdfother stat.ME
Wasserstein-type Distance for Gaussian Mixtures on Vector Bundles with Applications to Shape Analysis
Authors: Michael Wilson, Tom Needham, Chiwoo Park, Suprateek Kundu, Anuj Srivastava
Abstract: This paper uses sample data to study the problem of comparing populations on finite-dimensional parallelizable Riemannian manifolds and more general trivial vector bundles. Utilizing triviality, our framework represents populations as mixtures of Gaussians on vector bundles and estimates the population parameters using a mode-based clustering algorithm. We derive a Wasserstein-type metric between…  More
Submitted 28 November, 2023; originally announced November 2023.


arXiv:2311.13595  [pdfother math.ST   s.LG   tat.ME   tat.ML
Covariance alignment: from maximum likelihood estimation to Gromov-Wasserstein
Authors: Yanjun Han, Philippe Rigollet, George Stepaniants
Abstract: Feature alignment methods are used in many scientific disciplines for data pooling, annotation, and comparison. As an instance of a permutation learning problem, feature alignment presents significant statistical and computational challenges. In this work, we propose the covariance alignment model to study and compare various alignment methods and establish a minimax lower bound for covariance ali…  More
Submitted 22 November, 2023; originally announced November 2023.
Comments: 41 pages, 2 figures
MSC Class: Primary 62C20; 90B80; 49Q22; secondary 62R07; 05C60 ACM Class: G.3

 Related articles All 2 versions 

arXiv:2311.13159  [pdfother cs.LG   math.OC  stat.ML
Multi-Objective Optimization via Wasserstein-Fisher-Rao Gradient Flow
Authors: Yinuo Ren, Tesi Xiao, Tanmay Gangwani, Anshuka Rangi, Holakou Rahmanian, Lexing Ying, Subhajit Sanyal
Abstract: Multi-objective optimization (MOO) aims to optimize multiple, possibly conflicting objectives with widespread applications. We introduce a novel interacting particle method for MOO inspired by molecular dynamics simulations. Our approach combines overdamped Langevin and birth-death dynamics, incorporating a "dominance potential" to steer particles toward global Pareto optimality. In contrast to pr…  More
Submitted 21 November, 2023; originally announced November 2023.

Related articles All 2 versions 

arXiv:2311.12689  [pdfother cs.CL   s.CY  s.LG
Fair Text Classification with Wasserstein Independence
Authors: Thibaud Leteno, Antoine Gourru, Charlotte Laclau, Rémi Emonet, Christophe Gravier
Abstract: Group fairness is a central research topic in text classification, where reaching fair treatment between sensitive groups (e.g. women vs. men) remains an open challenge. This paper presents a novel method for mitigating biases in neural text classification, agnostic to the model architecture. Considering the difficulty to distinguish fair from unfair information in a text encoder, we take inspirat…  More
Submitted 21 November, 2023; originally announced November 2023.

Related articles All 10 versions 

<–—2023———2023———1500—


arXiv:2311.12684  [pdfother cs.LG
Adversarial Reweighting Guided by Wasserstein Distance for Bias Mitigation
Authors: Xuan Zhao, Simone Fabbrizzi, Paula Reyero Lobo, Siamak Ghodsi, Klaus Broelemann, Steffen Staab, Gjergji Kasneci
Abstract: The unequal representation of different groups in a sample population can lead to discrimination of minority groups when machine learning models make automated decisions. To address these issues, fairness-aware machine learning jointly optimizes two (or more) metrics aiming at predictive effectiveness and low unfairness. However, the inherent under-representation of minorities in the data makes th…  More

Submitted 21 November, 2023; originally announced November 2023.

…the latent space during training, our reweighting approach leads to predictions that …

 Cite  All 3 versions 

arXiv:2311.11003  [pdfother cs.LG  math.PR  stat.ML
Wasserstein Convergence Guarantees for a General Class of Score-Based Generative Models
Authors: Xuefeng Gao, Hoang M. Nguyen, Lingjiong Zhu
Abstract: Score-based generative models (SGMs) is a recent class of deep generative models with state-of-the-art performance in many applications. In this paper, we establish convergence guarantees for a general class of SGMs in 2-Wasserstein distance, assuming accurate score estimates and smooth log-concave data distribution. We specialize our result to several concrete SGMs with specific choices of forwar…  More
Submitted 18 November, 2023; originally announced November 2023.

Cited by 6 Related articles All 2 versions 

arXiv:2311.10618  [pdfpsother math.AP
General distance-like functions on the Wasserstein space
Authors: Huajian Jiang, Xiaojun Cui
Abstract: Viscosity solutions to the eikonal equations play a fundamental role to study the geometry, topology and geodesic flows. The classical definition of viscosity solution depends on the differential structure and can not extend directly to a general metric space. However, the distance-like functions, which are exactly viscosity solutions of the eikonal equation on a Riemannian manifold, is independen…  More
Submitted 17 November, 2023; originally announced November 2023.
Comments: 19 pages

Related articles All 2 versions 

arXiv:2311.10108  [pdfpsother hep-lat
Study of topological quantities of lattice QCD by a modified Wasserstein generative adversarial network
Authors: Lin GaoHeping YingJianbo Zhang
Abstract: A modified Wasserstein generative adversarial network (M-WGAN) is proposed to study the distribution of the topological charge in lattice QCD based on the Monte Carlo (MC) simulations. We construct new generator and discriminator in M-WGAN to support the generation of high-quality distribution. Our results show that the M-WGAN scheme of the Machine learning should be helpful for us to calculate ef…  More
Submitted 18 March, 2024; v1 submitted 15 November, 2023; originally announced November 2023.

Related articles All 3 versions 

arXiv:2311.09385  [pdfpsother math.FA  math.ST
Non-injectivity of Bures--Wasserstein barycentres in infinite dimensions
Authors: Yoav Zemel
Abstract: We construct a counterexample to the injectivity conjecture of Masarotto et al (2018). Namely, we construct a class of examples of injective covariance operators on an infinite-dimensional separable Hilbert space for which the Bures--Wasserstein barycentre is highly non injective -- it has a kernel of infinite dimension.
Submitted 15 November, 2023; originally announced November 2023.

Related articles All 2 versions 

2023


arXiv:2311.08549  [pdfother stat.ML  cs.LG  math.DG
Manifold learning in Wasserstein space
Authors: Keaton HammCaroline MoosmüllerBernhard SchmitzerMatthew Thorpe
Abstract: This paper aims at building the theoretical foundations for manifold learning algorithms in the space of absolutely continuous probability measures on a compact and convex subset of R^d, metrized with the Wasserstein-2 distance W_2. We begin by introducing a natural construction of submanifolds Λ of probability measures equipped with metric W_Λ, the geodesic restriction of W_2 to…  More
Submitted 14 November, 2023; originally announced November 2023.
MSC Class: 49Q22; 41A65; 58B20; 53Z50

Cited by 1 Related articles All 3 versions 

arXiv:2311.08343  [pdfpsother math.PR
Eigenvalues of random matrices from compact classical groups in Wasserstein metric
Authors: Bence Borda
Abstract: The circular unitary ensemble and its generalizations concern a random matrix from a compact classical group U(N), SU(N), O(N), SO(N) or USp(N) distributed according to the Haar measure. The eigenvalues are known to be very evenly distributed on the unit circle. In this paper, we study the distance from the empirical measure of the eigenvalues…  More
Submitted 14 November, 2023; originally announced November 2023.
Comments: 31 pages, 4 tables, 1 figure
MSC Class: 60B20; 60F05; 60G55; 49Q22

Related articles All 2 versions 
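
As a quick numerical companion to the quantity studied above, the sketch below samples a Haar-distributed unitary matrix with scipy.stats.unitary_group and measures how far its eigenangles are from an equispaced grid in 1-Wasserstein distance. This is only an illustration under simplifying assumptions: scipy's wasserstein_distance works on the real line rather than on the circle metric used in the paper, and the equispaced grid stands in for the limiting uniform measure.

import numpy as np
from scipy.stats import unitary_group, wasserstein_distance

N = 200
U = unitary_group.rvs(N, random_state=0)            # Haar-distributed element of U(N)
angles = np.sort(np.angle(np.linalg.eigvals(U)))    # eigenvalue angles in (-pi, pi]
grid = -np.pi + 2 * np.pi * (np.arange(N) + 0.5) / N    # equispaced reference angles

# Real-line W1 between the empirical eigenangle measure and the uniform grid
# (a crude stand-in for the circular Wasserstein distances analyzed in the paper).
print(wasserstein_distance(angles, grid))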

arXiv:2311.05573  [pdfother stat.ML   cs.LG  math.OC
Outlier-Robust Wasserstein DRO
Authors: Sloan NietertZiv GoldfeldSoroosh Shafiee
Abstract: Distributionally robust optimization (DRO) is an effective approach for data-driven decision-making in the presence of uncertainty. Geometric uncertainty due to sampling or localized perturbations of data points is captured by Wasserstein DRO (WDRO), which seeks to learn a model that performs uniformly well over a Wasserstein ball centered around the observed data distribution. However, WDRO fails…  More
Submitted 9 November, 2023; originally announced November 2023.
Comments: Appearing at NeurIPS 2023

 Cited by 1 Related articles All 5 versions 

arXiv:2311.05445  [pdfother cs.CE
Airfoil generation and feature extraction using the conditional VAE-WGAN-gp
Authors: Kazuo YonekuraYuki TomoriKatsuyuki Suzuki
Abstract: A machine learning method was applied to solve an inverse airfoil design problem. A conditional VAE-WGAN-gp model, which couples the conditional variational autoencoder (VAE) and Wasserstein generative adversarial network with gradient penalty (WGAN-gp), is proposed for an airfoil generation method, and then it is compared with the WGAN-gp and VAE models. The VAEGAN model couples the VAE and GAN m…  More
Submitted 9 November, 2023; originally announced November 2023.

Cited by 1 Related articles All 2 versions 
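
For readers unfamiliar with the WGAN-gp building block mentioned in this record, here is a minimal PyTorch sketch of the gradient-penalty term alone; it is not the authors' conditional VAE-WGAN-gp model, and the names critic, real, fake and lambda_gp are placeholders chosen for illustration.

import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    # Interpolate between real and generated samples (standard WGAN-gp recipe).
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    interp = (eps * real.detach() + (1.0 - eps) * fake.detach()).requires_grad_(True)
    scores = critic(interp)
    grads, = torch.autograd.grad(
        outputs=scores, inputs=interp,
        grad_outputs=torch.ones_like(scores),
        create_graph=True, retain_graph=True)
    grad_norm = grads.flatten(start_dim=1).norm(2, dim=1)
    return lambda_gp * ((grad_norm - 1.0) ** 2).mean()

# Critic loss to minimize: critic(fake).mean() - critic(real).mean() + gradient_penalty(...)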


arXiv:2311.05134  [pdfpsother math.AP  math.FA  math.PR
Geometry and analytic properties of the sliced Wasserstein space
Authors: Sangmin ParkDejan Slepčev
Abstract: The sliced Wasserstein metric compares probability measures on R^d by taking averages of the Wasserstein distances between projections of the measures to lines. The distance has found a range of applications in statistics and machine learning, as it is easier to approximate and compute than the Wasserstein distance in high dimensions. While the geometry of the Wasserstein metric is quit…  More
Submitted 19 December, 2023; v1 submitted 8 November, 2023; originally announced November 2023.
Comments: 49 pages; some revisions in Sections 5 and 6
MSC Class: 49Q22; 46E27; 60B10; 44A12

Cited by 1 Related articles All 2 versions 
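
Since the record above starts from the definition of the sliced Wasserstein metric (averaging one-dimensional Wasserstein distances of projections), a minimal Monte Carlo estimator may help fix ideas. This is a generic sketch for equal-size samples with uniform weights, not code from the paper; the number of projections and the example data are arbitrary.

import numpy as np

def sliced_wasserstein(X, Y, n_projections=200, p=2, rng=None):
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    dirs = rng.normal(size=(n_projections, d))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)   # uniform directions on the sphere
    total = 0.0
    for theta in dirs:
        # 1D Wasserstein-p between projected equal-size samples via sorted values.
        x, y = np.sort(X @ theta), np.sort(Y @ theta)
        total += np.mean(np.abs(x - y) ** p)
    return (total / n_projections) ** (1.0 / p)

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
Y = rng.normal(loc=1.0, size=(500, 3))
print(sliced_wasserstein(X, Y))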

<–—2023———2023———1510— 


arXiv:2311.05045  [pdfother math.OC
Exact Solutions for the NP-hard Wasserstein Barycenter Problem using a Doubly Nonnegative Relaxation and a Splitting Method
Authors: Abdo AlfakihJeffery ChengWoosuk L. JungWalaa M. MoursiHenry Wolkowicz
Abstract: The simplified Wasserstein barycenter problem consists in selecting one point from k given sets, each set consisting of n points, with the aim of minimizing the sum of distances to the barycenter of the k points chosen. This problem is known to be NP-hard. We compute the Wasserstein barycenter by exploiting the Euclidean distance matrix structure to obtain a facially reduced doubly nonnegati…  More
Submitted 8 November, 2023; originally announced November 2023.
MSC Class: 90C26; 65K10; 90C27; 90C22

Related articles All 3 versions 
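
To make the combinatorial problem in the abstract concrete, the sketch below solves a tiny instance of the simplified barycenter problem by brute force: pick one point from each of k sets so that the sum of squared distances to the mean of the chosen points is minimal. The squared-distance objective and the random data are assumptions made for illustration; the paper's doubly nonnegative relaxation and splitting method are not reproduced here.

import itertools
import numpy as np

def simplified_barycenter_bruteforce(sets):
    # Enumerate all n^k choices of one point per set; feasible only for tiny instances.
    best_val, best_pts = np.inf, None
    for choice in itertools.product(*sets):
        pts = np.array(choice, dtype=float)
        center = pts.mean(axis=0)
        val = np.sum(np.linalg.norm(pts - center, axis=1) ** 2)
        if val < best_val:
            best_val, best_pts = val, pts
    return best_val, best_pts

rng = np.random.default_rng(1)
sets = [rng.normal(size=(4, 2)) for _ in range(3)]   # k = 3 sets of n = 4 points in R^2
val, pts = simplified_barycenter_bruteforce(sets)
print(val)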

arXiv:2311.02953  [pdfother math.OC
Data-Driven Bayesian Nonparametric Wasserstein Distributionally Robust Optimization
Authors: Chao NingXutao Ma
Abstract: In this work, we develop a novel data-driven Bayesian nonparametric Wasserstein distributionally robust optimization (BNWDRO) framework for decision-making under uncertainty. The proposed framework unifies a Bayesian nonparametric method and the Wasserstein metric to decipher the global-local features of uncertainty data and encode these features into a novel data-driven ambiguity set. By establis…  More
Submitted 6 November, 2023; originally announced November 2023.
Comments: 9 pages, including Supplementary Material

Related articles All 3 versions


arXiv:2311.01331  [pdfother]  cs.LG  cs.AI
Offline Imitation from Observation via Primal Wasserstein State Occupancy Matching
Authors: Kai YanAlexander G. SchwingYu-xiong Wang
Abstract: In real-world scenarios, arbitrary interactions with the environment can often be costly, and actions of expert demonstrations are not always available. To reduce the need for both, Offline Learning from Observations (LfO) is extensively studied, where the agent learns to solve a task with only expert states and \textit{task-agnostic} non-expert state-action pairs. The state-of-the-art DIstributio…  More
Submitted 21 November, 2023; v1 submitted 2 November, 2023; originally announced November 2023.
Comments: 23 pages. Accepted to the Optimal Transport and Machine Learning Workshop at NeurIPS 2023

Related articles All 4 versions 

arXiv:2311.00850  [pdfother]  q-bio.BM
EMPOT: partial alignment of density maps and rigid body fitting using unbalanced Gromov-Wasserstein divergence
Authors: Aryan Tajmir RiahiChenwei ZhangJames ChenAnne CondonKhanh Dao Duc
Abstract: Aligning EM density maps and fitting atomic models are essential steps in single particle cryogenic electron microscopy (cryo-EM), with recent methods leveraging various algorithms and machine learning tools. As aligning maps remains challenging in the presence of a map that only partially fits the other (e.g. one subunit), we here propose a new procedure, EMPOT (EM Partial alignment with Optimal…  More
Submitted 1 November, 2023; originally announced November 2023.

All 2 versions 


arXiv:2311.00109  [pdfother]  cs.LG  stat.ML
FairWASP: Fast and Optimal Fair Wasserstein Pre-processing
Authors: Zikai XiongNiccolò DalmassoAlan MishlerVamsi K. PotluruTucker BalchManuela Veloso
Abstract: Recent years have seen a surge of machine learning approaches aimed at reducing disparities in model outputs across different subgroups. In many settings, training data may be used in multiple downstream applications by different users, which means it may be most effective to intervene on the training data itself. In this work, we present FairWASP, a novel pre-processing approach designed to reduc…  More
Submitted 8 February, 2024; v1 submitted 31 October, 2023; originally announced November 2023.
Comments: Accepted at AAAI 2024, Main Track. 15 pages, 4 figures, 1 table

Abstract: Variational inference is a technique that approximates a target distribution by optimizing within the parameter space of variational families. On the other hand, Wasserstein gradient flows describe optimization within the space of probability measures where they do not necessarily admit a parametric density function. In this paper, we bridge the gap between these two methods. We…

Approximation Theory, Computing, and Deep Learning on the Wasserstein Space
Authors: Massimo Fornasier, Pascal Heid, Giacomo Enri…
Abstract: The challenge of approximating functions in infinite-dimensional spaces from finite samples is widely regarded as formidable. In this study, we delve into the challenging problem of the numerical approximation of Sobolev-smooth functions defined on probability spaces. Our particular focus centers on the Wasserstein distance function, which serves as a relevant example. In contrast to the existing…  More
MSC Class: 49Q22; 33F05; 46E36; 28A…

Cited by 1 Related articles All 2 versions 



arXiv:2310.18908  [pdfother]  cs.IT  cs.LG  stat.AP  stat.ML
Estimating the Rate-Distortion Function by Wasserstein Gradient Descent
Authors: Yibo YangStephan EcksteinMarcel NutzStephan Mandt
Abstract: In the theory of lossy compression, the rate-distortion (R-D) function R(D)
 describes how much a data source can be compressed (in bit-rate) at any given level of fidelity (distortion). Obtaining R(D)
 for a given data source establishes the fundamental performance limit for all compression algorithms. We propose a new method to estimate R(D)
 from the perspective of optimal transport. Unlike…  More
Submitted 29 October, 2023; originally announced October 2023.
Comments: Accepted as conference paper at NeurIPS 2023

Cited by 1 Related articles All 8 versions [PDF] neurips.cc

2023



arXiv:2310.18678  [pdfpsother]  math.PR
Diffusion processes as Wasserstein gradient flows via stochastic control of the volatility matrix
Authors: Bertram Tschiderer
Abstract: We consider a class of time-homogeneous diffusion processes on R^n with common invariant measure but varying volatility matrices. In Euclidean space, we show via stochastic control of the diffusion coefficient that the corresponding flow of time-marginal distributions admits an entropic gradient flow formulation in the quadratic Wasserstein space if the volatility matrix of the diffus…  More
Submitted 28 October, 2023; originally announced October 2023.
MSC Class: Primary 60H30; 60G44; secondary 60J60; 94A17


arXiv:2310.17897  [pdfother]  physics.comp-ph 
hep-ex
Event Generation and Consistence Test for Physics with Sliced Wasserstein Distance
Authors: Chu-Cheng PanXiang DongYu-Chang SunAo-Yan ChengAo-Bo WangYu-Xuan HuHao Cai
Abstract: In the field of modern high-energy physics research, there is a growing emphasis on utilizing deep learning techniques to optimize event simulation, thereby expanding the statistical sample size for more accurate physical analysis. Traditional simulation methods often encounter challenges when dealing with complex physical processes and high-dimensional data distributions, resulting in slow perfor…  More
Submitted 27 October, 2023; originally announced October 2023.

Related articles All 5 versions 

arXiv:2310.17582  [pdfother]  stat.ML  cs.LG  math.OC 
math.ST
Convergence of flow-based generative models via proximal gradient descent in Wasserstein space
Authors: Xiuyuan ChengJianfeng LuYixin TanYao Xie
Abstract: Flow-based generative models enjoy certain advantages in computing the data generation and the likelihood, and have recently shown competitive empirical performance. Compared to the accumulating theoretical studies on related score-based diffusion models, analysis of flow-based models, which are deterministic in both forward (data-to-noise) and reverse (noise-to-data) directions, remain sparse. In…  More
Submitted 26 October, 2023; originally announced October 2023.

Cited by 4 Related articles All 2 versions 

arXiv:2310.16705  [pdfother]  cs.LG  stat.ML
Wasserstein Gradient Flow over Variational Parameter Space for Variational Inference
Authors: Dai Hai NguyenTetsuya SakuraiHiroshi Mamitsuka
Abstract: Variational inference (VI) can be cast as an optimization problem in which the variational parameters are tuned to closely align a variational distribution with the true posterior. The optimization task can be approached through vanilla gradient descent in black-box VI or natural-gradient descent in natural-gradient VI. In this work, we reframe VI as the optimization of an objective that concerns…  More
Submitted 25 October, 2023; originally announced October 2023.



arXiv:2310.16552  [pdf, ps, other]  cs.LG  doi: 10.1145/3340531.3412125
DECWA : Density-Based Clustering using Wasserstein Distance
Authors: Nabil El MalkiRobin CugnyOlivier TesteFranck Ravat
Abstract: Clustering is a data analysis method for extracting knowledge by discovering groups of data called clusters. Among these methods, state-of-the-art density-based clustering methods have proven to be effective for arbitrary-shaped clusters. Despite their encouraging results, they suffer to find low-density clusters, near clusters with similar densities, and high-dimensional data. Our proposals are a…  More
Submitted 25 October, 2023; originally announced October 2023.
Comments: 6 pages, CIKM 2020
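
The building block behind DECWA-style clustering is a Wasserstein distance between the one-dimensional distributions attached to sub-clusters. The sketch below only computes such pairwise 1-Wasserstein distances with scipy; the sub-cluster samples and the merging threshold are made up, and none of the DECWA pipeline itself is implemented here.

import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
# Three hypothetical sub-clusters described by 1D samples (e.g. intra-cluster distances or densities).
subclusters = [rng.normal(0.0, 1.0, 300), rng.normal(0.2, 1.0, 250), rng.normal(3.0, 0.5, 200)]

W = np.array([[wasserstein_distance(a, b) for b in subclusters] for a in subclusters])
print(np.round(W, 3))
# Sub-clusters whose pairwise W1 falls below some threshold (hypothetical, say 0.5)
# would be candidates for merging in a DECWA-like agglomeration step.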

<–—2023———2023———1520— 



arXiv:2310.16516  [pdfother]  stat.ML   cs.LG
Particle-based Variational Inference with Generalized Wasserstein Gradient Flow
Authors: Ziheng ChengShiyue ZhangLonglin YuCheng Zhang
Abstract: Particle-based variational inference methods (ParVIs) such as Stein variational gradient descent (SVGD) update the particles based on the kernelized Wasserstein gradient flow for the Kullback-Leibler (KL) divergence. However, the design of kernels is often non-trivial and can be restrictive for the flexibility of the method. Recent works show that functional gradient flow approximations with quadr…  More
Submitted 25 October, 2023; originally announced October 2023.

Cited by 1 Related articles All 5 versions 
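
The baseline that the record above generalizes is Stein variational gradient descent, i.e. the kernelized Wasserstein gradient flow of the KL divergence. Below is a plain NumPy sketch of the SVGD update with an RBF kernel and a standard Gaussian target; the step size, bandwidth and target are arbitrary choices, and the paper's generalized functional gradient flow is not implemented.

import numpy as np

def rbf_kernel(X, h=1.0):
    diff = X[:, None, :] - X[None, :, :]                 # diff[a, b] = x_a - x_b
    K = np.exp(-(diff ** 2).sum(-1) / (2 * h ** 2))
    gradK = -diff / h ** 2 * K[:, :, None]               # gradK[a, b] = grad_{x_a} k(x_a, x_b)
    return K, gradK

def svgd_step(X, grad_logp, step=0.1, h=1.0):
    n = X.shape[0]
    K, gradK = rbf_kernel(X, h)
    # phi[i] = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]
    phi = (K @ grad_logp(X) + gradK.sum(axis=0)) / n
    return X + step * phi

grad_logp = lambda X: -X                                 # score of a standard Gaussian target
rng = np.random.default_rng(0)
X = rng.normal(loc=3.0, size=(100, 2))
for _ in range(200):
    X = svgd_step(X, grad_logp)
print(X.mean(axis=0))                                    # particles drift toward the origin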

arXiv:2310.15897  [pdfpsother]  math.PR
L2-Wasserstein contraction for Euler schemes of elliptic diffusions and interacting particle systems
Authors: Linshan LiuMateusz B. MajkaPierre Monmarché
Abstract: We show the L2-Wasserstein contraction for the transition kernel of a discretised diffusion process, under a contractivity at infinity condition on the drift and a sufficiently high diffusivity requirement. This extends recent results that, under similar assumptions on the drift but without the diffusivity restrictions, showed the L1-Wasserstein contraction, or Lp-Wasserstein bounds for…  More
Submitted 24 October, 2023; originally announced October 2023.
Comments: 28 pages

Related articles All 2 versions 

arXiv:2310.14446  [pdfpsother]  math.OC 
math.AP 
math.PR
A Viscosity Solution Theory of Stochastic Hamilton-Jacobi-Bellman equations in the Wasserstein Space
Authors: Hang CheungJinniao QiuAlexandru Badescu
Abstract: This paper is devoted to a viscosity solution theory of the stochastic Hamilton-Jacobi-Bellman equation in the Wasserstein spaces for the mean-field type control problem which allows for random coefficients and may thus be non-Markovian. The value function of the control problem is proven to be the unique viscosity solution. The major challenge lies in the mixture of the lack of local compactness…  More
Submitted 22 October, 2023; originally announced October 2023.
Comments: 41 pages
MSC Class: 49L25


arXiv:2310.14038  [pdfother]  cs.RO  eess.SY
Risk-Aware Wasserstein Distributionally Robust Control of Vessels in Natural Waterways
Authors: Juan Moreno NadalesAstghik HakobyanDavid Muñoz de la PeñaDaniel LimonInsoon Yang
Abstract: In the realm of maritime transportation, autonomous vessel navigation in natural inland waterways faces persistent challenges due to unpredictable natural factors. Existing scheduling algorithms fall short in handling these uncertainties, compromising both safety and efficiency. Moreover, these algorithms are primarily designed for non-autonomous vessels, leading to labor-intensive operations vuln…  More
Submitted 21 October, 2023; originally announced October 2023.

Related articles All 2 versions

arXiv:2310.13832  [pdf, ps, other]  math.PR  math.AP  math.DG
Absolute continuity of Wasserstein barycenters on manifolds with a lower Ricci curvature bound
Authors: Jianyu Ma
Abstract: Given a complete Riemannian manifold M with a lower Ricci curvature bound, we consider barycenters in the Wasserstein space W_2(M) of probability measures on M. We refer to them as Wasserstein barycenters, which by definition are probability measures on M. The goal of this article is to present a novel approach to proving their absolute continuity. We introduce a new class of dis…  More
Submitted 20 October, 2023; originally announced October 2023.

Related articles All 2 versions 

2023


arXiv:2310.13433  [pdfother]  cs.LG  math.ST 
stat.ML
Y-Diagonal Couplings: Approximating Posteriors with Conditional Wasserstein Distances
Authors: Jannis ChemseddinePaul HagemannChristian Wald
Abstract: In inverse problems, many conditional generative models approximate the posterior measure by minimizing a distance between the joint measure and its learned approximation. While this approach also controls the distance between the posterior measures in the case of the Kullback Leibler divergence, it does not hold true for the Wasserstein distance. We will introduce a conditional Wasserstein distan…  More
Submitted 20 October, 2023; originally announced October 2023.
Comments: 26 pages, 9 figures


arXiv:2310.12714  [pdfother]  cond-mat.stat-mech
Wasserstein distance in speed limit inequalities for Markov jump processes
Authors: Naoto Shiraishi
Abstract: The role of the Wasserstein distance in the thermodynamic speed limit inequalities for Markov jump processes is investigated. We elucidate the nature of the Wasserstein distance in the thermodynamic speed limit inequality from three different perspectives with resolving three remaining problems. In the first part, we derive a unified speed limit inequality for a general weighted graph, which repro…  More
Submitted 19 October, 2023; originally announced October 2023.
Comments: 28 pages, 1 figure

Related articles All 2 versions 

arXiv:2310.12498  [pdfother]  cs.LG 
math.NA
Quasi Manhattan Wasserstein Distance
Authors: Evan Unit Lim
Abstract: The Quasi Manhattan Wasserstein Distance (QMWD) is a metric designed to quantify the dissimilarity between two matrices by combining elements of the Wasserstein Distance with specific transformations. It offers improved time and space complexity compared to the Manhattan Wasserstein Distance (MWD) while maintaining accuracy. QMWD is particularly advantageous for large datasets or situations with l…  More
Submitted 19 October, 2023; originally announced October 2023.


arXiv:2310.11762  [pdfother]  cs.LG
A Quasi-Wasserstein Loss for Learning Graph Neural Networks
Authors: Minjie ChengHongteng Xu
Abstract: When learning graph neural networks (GNNs) in node-level prediction tasks, most existing loss functions are applied for each node independently, even if node embeddings and their labels are non-i.i.d. because of their graph structures. To eliminate such inconsistency, in this study we propose a novel Quasi-Wasserstein (QW) loss with the help of the optimal transport defined on graphs, leading to n…  More
Submitted 13 March, 2024; v1 submitted 18 October, 2023; originally announced October 2023.


arXiv:2310.10649  [pdfother]  cs.LG  math.OC  stat.ML
A Computational Framework for Solving Wasserstein Lagrangian Flows
Authors: Kirill NeklyudovRob BrekelmansAlexander TongLazar AtanackovicQiang LiuAlireza Makhzani
Abstract: The dynamical formulation of the optimal transport can be extended through various choices of the underlying geometry (kinetic energy), and the regularization of density paths (potential energy). These combinations yield different variational problems (Lagrangians), encompassing many variations of the optimal transport problem such as the Schrödinger bridge, unbala…  More
Submitted 17 October, 2023; v1 submitted 16 October, 2023; originally announced October 2023.

Cited by 2 Related articles All 4 versions 

<–—2023———2023———1530— 



arXiv:2310.10143  [pdfother]  stat.ML  cs.LG
An Empirical Study of Self-supervised Learning with Wasserstein Distance
Authors: Makoto YamadaYuki TakezawaGuillaume HouryKira Michaela DusterwaldDeborah SulemHan ZhaoYao-Hung Hubert Tsai
Abstract: In this study, we delve into the problem of self-supervised learning (SSL) utilizing the 1-Wasserstein distance on a tree structure (a.k.a., Tree-Wasserstein distance (TWD)), where TWD is defined as the L1 distance between two tree-embedded vectors. In SSL methods, the cosine similarity is often utilized as an objective function; however, it has not been well studied when utilizing the Wasserstein…  More
Submitted 5 February, 2024; v1 submitted 16 October, 2023; originally announced October 2023.
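
The abstract above defines the tree-Wasserstein distance (TWD) as an L1 distance between tree-embedded vectors. The sketch below spells that out on a tiny hand-made tree: each distribution over the leaves is embedded into the vector of edge-weighted subtree masses, and TWD is the L1 distance between the two embeddings. The tree, its edge weights and the example distributions are invented for illustration and are unrelated to the paper's experiments.

import numpy as np

# Tree: root 0, internal node 1, leaves 2, 3, 4.  Edges as (child, parent, edge weight).
edges = [(1, 0, 1.0), (2, 1, 0.5), (3, 1, 0.5), (4, 0, 2.0)]
leaves = [2, 3, 4]

def subtree_leaf_sets(edges, leaves):
    # For each edge (child -> parent), the set of leaves in the subtree rooted at the child.
    children = {}
    for c, p, _ in edges:
        children.setdefault(p, []).append(c)
    def collect(node):
        if node in leaves:
            return {node}
        out = set()
        for c in children.get(node, []):
            out |= collect(c)
        return out
    return [collect(c) for c, _, _ in edges]

def tree_embed(prob_on_leaves, edges, leaves):
    leaf_index = {leaf: i for i, leaf in enumerate(leaves)}
    vec = []
    for (c, p, w), leafset in zip(edges, subtree_leaf_sets(edges, leaves)):
        mass = sum(prob_on_leaves[leaf_index[l]] for l in leafset)
        vec.append(w * mass)                  # edge weight times subtree mass
    return np.array(vec)

def tree_wasserstein(mu, nu, edges, leaves):
    return np.abs(tree_embed(mu, edges, leaves) - tree_embed(nu, edges, leaves)).sum()

mu = np.array([0.5, 0.3, 0.2])                # probabilities on leaves 2, 3, 4
nu = np.array([0.2, 0.2, 0.6])
print(tree_wasserstein(mu, nu, edges, leaves))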


arXiv:2310.09369  [pdfpsother]  math.MG
Coarse embeddings of quotients by finite group actions via the sliced Wasserstein distance
Authors: Thomas Weighill
Abstract: We prove that for a metric space X and a finite group G acting on X by isometries, if X coarsely embeds into a Hilbert space, then so does the quotient X/G. A crucial step towards our main result is to show that for any integer k>0 the space of unordered k-tuples of points in Hilbert space, with the 1-Wasserstein distance, itself coarsely embeds into Hilbert space. Our proof reli…  More
Submitted 13 October, 2023; originally announced October 2023.
Comments: 10 pages
MSC Class: 51F30


arXiv:2310.09254  [pdfother]  stat.ML 
cs.LG
Entropic (Gromov) Wasserstein Flow Matching with GENOT
Authors: Dominik KleinThéo UsciddaFabian TheisMarco Cuturi
Abstract: Optimal transport (OT) theory has reshaped the field of generative modeling: Combined with neural networks, recent \textit{Neural OT} (N-OT) solvers use OT as an inductive bias, to focus on ``thrifty'' mappings that minimize average displacement costs. This core principle has fueled the successful application of N-OT solvers to high-stakes scientific challenges, notably single-cell genomics. N-OT…  More
Submitted 12 March, 2024; v1 submitted 13 October, 2023; originally announced October 2023.


arXiv:2310.09149  [pdfpsother]  stat.ML 
cs.LG 
math.NA
Lattice Approximations in Wasserstein Space
Authors: Keaton HammVarun Khurana
Abstract: We consider structured approximation of measures in Wasserstein space W_p(R^d) for p ∈ [1,∞) by discrete and piecewise constant measures based on a scaled Voronoi partition of R^d. We show that if a full rank lattice Λ is scaled by a factor of h ∈ (0,1], then approximation of a measure based on the Voronoi partition of hΛ is O(h) regardless of d or p. We th…  More
Submitted 13 October, 2023; originally announced October 2023.

Related articles All 2 versions 

arXiv:2310.08492  [pdfpsother]  math.PR   q-fin.MF
Maximal Martingale Wasserstein Inequality
Authors: Benjamin JourdainKexin Shao
Abstract: In this note, we complete the analysis of the Martingale Wasserstein Inequality started in arXiv:2011.11599 by checking that this inequality fails in dimension d≥2 when the integrability parameter ρ belongs to [1,2) while a stronger Maximal Martingale Wasserstein Inequality holds whatever the dimension d when ρ≥2.
Submitted 12 October, 2023; originally announced October 2023.
Comments: 7 pages

Related articles All 11 versions 

2023


arXiv:2310.08371  [pdf, other]  cs.CV  doi: 10.1155/1970/9353816

Worst-Case Morphs using Wasserstein ALI and Improved MIPGAN
Authors: Una M. KellyMeike NautaLu LiuLuuk J. SpreeuwersRaymond N. J. Veldhuis
Abstract: A morph is a combination of two separate facial images and contains identity information of two different people. When used in an identity document, both people can be authenticated by a biometric Face Recognition (FR) system. Morphs can be generated using either a landmark-based approach or approaches based on deep learning such as Generative Adversarial Networks (GAN). In a recent paper, we intr…  More
Submitted 13 October, 2023; v1 submitted 12 October, 2023; originally announced October 2023.

Related articles All 5 versions 

arXiv:2310.04918  [pdfother]  cs.AI
SWAP: Sparse Entropic Wasserstein Regression for Robust Network Pruning
Authors: Lei YouHei Victor Cheng
Abstract: This study addresses the challenge of inaccurate gradients in computing the empirical Fisher Information Matrix during neural network pruning. We introduce SWAP, a formulation of Entropic Wasserstein regression (EWR) for pruning, capitalizing on the geometric properties of the optimal transport problem. The ``swap'' of the commonly used linear regression with the EWR in optimization is analyticall…  More
Submitted 20 February, 2024; v1 submitted 7 October, 2023; originally announced October 2023.
Comments: Published as a conference paper at ICLR 2024

Related articles All 2 versions 

arXiv:2310.04141  [pdfother]  math.OC  eess.SY
Wasserstein distributionally robust risk-constrained iterative MPC for motion planning: computationally efficient approximations
Authors: Alireza ZolanvariAshish Cherukuri
Abstract: This paper considers a risk-constrained motion planning problem and aims to find the solution combining the concepts of iterative model predictive control (MPC) and data-driven distributionally robust (DR) risk-constrained optimization. In the iterative MPC, at each iteration, safe states visited and stored in the previous iterations are imposed as terminal constraints. Furthermore, samples collec…  More
Submitted 6 October, 2023; originally announced October 2023.
Comments: 8 pages, 6 figures, Proceedings of the IEEE Conference on Decision and Control, Singapore, 2023

Related articles All 3 versions

arXiv:2310.03945  [pdfother]  stat.ML cs.LG
On Wasserstein distances for affine transformations of random vectors
Authors: Keaton HammAndrzej Korzeniowski
Abstract: We expound on some known lower bounds of the quadratic Wasserstein distance between random vectors in R^n, with an emphasis on affine transformations that have been used in manifold learning of data in Wasserstein space. In particular, we give concrete lower bounds for rotated copies of random vectors in R^2 by computing the Bures metric between the covariance matrices. We als…  More
Submitted 7 February, 2024; v1 submitted 5 October, 2023; originally announced October 2023.

Related articles All 2 versions 
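
The lower bounds discussed above are phrased via the Bures metric between covariance matrices, which is the trace term in the closed-form 2-Wasserstein distance between Gaussians, W2^2 = ||m1 - m2||^2 + tr(S1 + S2 - 2 (S1^{1/2} S2 S1^{1/2})^{1/2}). The sketch below evaluates this standard identity for a rotated copy of a covariance matrix; it is a generic formula, not the paper's specific bounds.

import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2(m1, S1, m2, S2):
    rS1 = sqrtm(S1)
    cross = sqrtm(rS1 @ S2 @ rS1)
    bures_sq = np.trace(S1 + S2 - 2 * cross)      # squared Bures metric between S1 and S2
    return float(np.sqrt(np.sum((m1 - m2) ** 2) + bures_sq.real))

theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
S = np.diag([4.0, 1.0])
print(gaussian_w2(np.zeros(2), S, np.zeros(2), R @ S @ R.T))   # rotated copy, as in the abstract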

arXiv:2310.03629  [pdfother]  cs.IT cs.CV eess.IV
Wasserstein Distortion: Unifying Fidelity and Realism
Authors: Yang QiuAaron B. WagnerJohannes BalléLucas Theis
Abstract: We introduce a distortion measure for images, Wasserstein distortion, that simultaneously generalizes pixel-level fidelity on the one hand and realism on the other. We show how Wasserstein distortion reduces mathematically to a pure fidelity constraint or a pure realism constraint under different parameter choices. Pairs of images that are close under Wasserstein distortion illustrate its utility.…  More
Submitted 5 October, 2023; originally announced October 2023.


Cited by 2 Related articles All 3 versions

<–—2023———2023———1450— 



arXiv:2310.03398  [pdfother]  cs.LG  stat.ML
Interpolating between Clustering and Dimensionality Reduction with Gromov-Wasserstein
Authors: Hugues Van AsselCédric Vincent-CuazTitouan VayerRémi FlamaryNicolas Courty
Abstract: We present a versatile adaptation of existing dimensionality reduction (DR) objectives, enabling the simultaneous reduction of both sample and feature sizes. Correspondances between input and embedding samples are computed through a semi-relaxed Gromov-Wasserstein optimal transport (OT) problem. When the embedding sample size matches that of the input, our model recovers classical popular DR model…  More
Submitted 5 October, 2023; originally announced October 2023.

arXiv:2310.01973  [pdfother]  cs.LG 
cs.DC
Federated Wasserstein Distance
Authors: Alain RakotomamonjyKimia NadjahiLiva Ralaivola
Abstract: We introduce a principled way of computing the Wasserstein distance between two distributions in a federated manner. Namely, we show how to estimate the Wasserstein distance between two samples stored and kept on different devices/clients whilst a central entity/server orchestrates the computations (again, without having access to the samples). To achieve this feat, we take advantage of the geomet…  More
Submitted 3 October, 2023; originally announced October 2023.
Comments: 23 pages



arXiv:2310.01670  [pdfpsother]  math.PR
Asymptotic behavior of Wasserstein distance for weighted empirical measures of diffusion processes on compact Riemannian manifolds
Authors: Jie-Xiang Zhu
Abstract: Let (X_t)_{t≥0} be a diffusion process defined on a compact Riemannian manifold, and for α>0, let μ_t^{(α)} = (α/t^α) ∫_0^t δ_{X_s} s^{α−1} ds be the associated weighted empirical measure. We investigate the asymptotic behavior of E_ν W_2^2 for sufficiently large t, where W_2 is the quadratic Wasserstein d…  More
Submitted 2 October, 2023; originally announced October 2023.

Related articles All 2 versions 


arXiv:2310.01285  [pdfother]  q-fin.CP  cs.LG  q-fin.MF  stat.ML
Automated regime detection in multidimensional time series data using sliced Wasserstein k-means clustering
Authors: Qinmeng LuanJames Hamp
Abstract: Recent work has proposed Wasserstein k-means (Wk-means) clustering as a powerful method to identify regimes in time series data, and one-dimensional asset returns in particular. In this paper, we begin by studying in detail the behaviour of the Wasserstein k-means clustering algorithm applied to synthetic one-dimensional time series data. We study the dynamics of the algorithm and investigate how…  More
Submitted 2 October, 2023; originally announced October 2023.

Related articles All 5 versions 
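
A minimal version of Wasserstein k-means on one-dimensional samples can be written by representing each window of the series by its sorted sample, i.e. its empirical quantile vector: the 2-Wasserstein distance is then the Euclidean distance between quantile vectors and the W2 barycenter of a cluster is their pointwise mean. The sketch below applies this representation to synthetic two-regime data; it illustrates the idea only and is not the authors' implementation.

import numpy as np

def wasserstein_kmeans_1d(windows, k=2, n_iter=50, rng=None):
    rng = np.random.default_rng(rng)
    Q = np.sort(np.asarray(windows, dtype=float), axis=1)        # quantile vectors per window
    centers = Q[rng.choice(len(Q), size=k, replace=False)]
    for _ in range(n_iter):
        d = ((Q[:, None, :] - centers[None, :, :]) ** 2).sum(-1)  # squared W2 to each center
        labels = d.argmin(axis=1)
        centers = np.array([Q[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

# Synthetic two-regime "returns": low-volatility and high-volatility windows.
rng = np.random.default_rng(0)
calm = rng.normal(scale=0.5, size=(30, 100))
wild = rng.normal(scale=2.0, size=(30, 100))
labels, _ = wasserstein_kmeans_1d(np.vstack([calm, wild]), k=2, rng=0)
print(labels)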


arXiv:2309.16604  [pdfother]  stat.ML 
cs.LG
Exploiting Edge Features in Graphs with Fused Network Gromov-Wasserstein Distance
Authors: Junjie YangMatthieu LabeauFlorence d'Alché-Buc
Abstract: Pairwise comparison of graphs is key to many applications in Machine learning ranging from clustering, kernel-based classification/regression and more recently supervised graph prediction. Distances between graphs usually rely on informative representations of these structured objects such as bag of substructures or other graph embeddings. A recently popular solution consists in representing graph…  More
Submitted 28 September, 2023; originally announced September 2023.


arXiv:2309.16171  [pdfother]  math.ST
Distributionally Robust Quickest Change Detection using Wasserstein Uncertainty Sets
Authors: Liyan XieYuchen LiangVenugopal V. Veeravalli
Abstract: The problem of quickest detection of a change in the distribution of a sequence of independent observations is considered. It is assumed that the pre-change distribution is known (accurately estimated), while the only information about the post-change distribution is through a (small) set of labeled data. This post-change data is used in a data-driven minimax robust framework, where an uncertainty…  More
Submitted 28 September, 2023; originally announced September 2023.



2023


arXiv:2309.16017  [pdfpsother]  math.DG
The Wasserstein distance for Ricci shrinkers
Authors: Franciele ConradoDetang Zhou
Abstract: Let (M^n, g, f) be a Ricci shrinker such that Ric_f = (1/2)g and the measure induced by the weighted volume element (4π)^{−n/2} e^{−f} dv_g is a probability measure. Given a point p ∈ M, we consider two probability measures defined in the tangent space T_p M, namely the Gaussian measure γ and the measure ν induced by the exponential map of M to p. In t…  More
Submitted 27 September, 2023; originally announced September 2023.

All 2 versions 

arXiv:2309.15300  [pdf, ps, other]  math.ST  doi: 10.48550/arXiv.2111.06846
Wasserstein convergence in Bayesian and frequentist deconvolution models
Authors: Judith RousseauCatia Scricciolo
Abstract: We study the multivariate deconvolution problem of recovering the distribution of a signal from independent and identically distributed observations additively contaminated with random errors (noise) from a known distribution. For errors with independent coordinates having ordinary smooth densities, we derive an inversion inequality relating the L1-Wasserstein distance between two distributions…  More
Submitted 26 September, 2023; originally announced September 2023.
Comments: arXiv admin note: text overlap with arXiv:2111.06846



arXiv:2309.12997  [pdfother]  math.PR  math.NA  stat.ML
Scaling Limits of the Wasserstein information matrix on Gaussian Mixture Models
Authors: Wuchen LiJiaxi Zhao
Abstract: We consider the Wasserstein metric on the Gaussian mixture models (GMMs), which is defined as the pullback of the full Wasserstein metric on the space of smooth probability distributions with finite second moment. It derives a class of Wasserstein metrics on probability simplices over one-dimensional bounded homogeneous lattices via a scaling limit of the Wasserstein metric on GMMs. Specifically,…  More
Submitted 22 September, 2023; originally announced September 2023.
Comments: 32 pages, 3 figures
MSC Class: 62B11; 41A60


arXiv:2309.12221  [pdfother]  astro-ph.HE
Optimizing the Wasserstein GAN for TeV Gamma Ray Detection with VERITAS
Authors: Deivid RibeiroYuping ZhengRamana SankarKameswara Mantha
Abstract: The observation of very-high-energy (VHE, E>100 GeV) gamma rays is mediated by the imaging atmospheric Cherenkov technique (IACTs). At these energies, gamma rays interact with the atmosphere to create a cascade of electromagnetic air showers that are visible to the IACT cameras on the ground with distinct morphological and temporal features. However, hadrons with significantly higher incidence rat…  More
Submitted 21 September, 2023; originally announced September 2023.


arXiv:2309.11713  [pdfother]  stat.ML  cs.GR  cs.LG
Quasi-Monte Carlo for 3D Sliced Wasserstein
Authors: Khai NguyenNicola BarilettoNhat Ho
Abstract: Monte Carlo (MC) integration has been employed as the standard approximation method for the Sliced Wasserstein (SW) distance, whose analytical expression involves an intractable expectation. However, MC integration is not optimal in terms of absolute approximation error. To provide a better class of empirical SW, we propose quasi-sliced Wasserstein (QSW) approximations that rely on Quasi-Monte Car…  More
Submitted 16 February, 2024; v1 submitted 20 September, 2023; originally announced September 2023.
Comments: Accepted to ICLR 2024 (Spotlight), 25 pages, 13 figures, 6 tables

Cited by 3 Related articles All 4 versions 
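
One simple way to obtain low-discrepancy directions on the 3D sphere, in the spirit of the quasi-sliced Wasserstein estimators above, is to push a scrambled Sobol point set through the Gaussian quantile function and normalize; such directions can then replace the random ones in a Monte Carlo sliced Wasserstein estimator like the one sketched earlier in this list. This particular construction is only one option chosen for illustration and is not claimed to be among the point sets studied in the paper.

import numpy as np
from scipy.stats import qmc, norm

def sobol_sphere_directions(n=128, seed=0):
    sampler = qmc.Sobol(d=3, scramble=True, seed=seed)
    u = sampler.random(n)                        # low-discrepancy points in [0, 1)^3
    g = norm.ppf(u)                              # push through the Gaussian quantile function
    return g / np.linalg.norm(g, axis=1, keepdims=True)   # normalize onto the sphere

dirs = sobol_sphere_directions()
print(dirs.shape, np.linalg.norm(dirs, axis=1)[:3])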

<–—2023———2023———1540— 



arXiv:2309.08748  [pdfother]  cs.LG
Wasserstein Distributionally Robust Policy Evaluation and Learning for Contextual Bandits
Authors: Yi ShenPan XuMichael M. Zavlanos
Abstract: Off-policy evaluation and learning are concerned with assessing a given policy and learning an optimal policy from offline data without direct interaction with the environment. Often, the environment in which the data are collected differs from the environment in which the learned policy is applied. To account for the effect of different environments during learning and execution, distributionally…  More
Submitted 17 January, 2024; v1 submitted 15 September, 2023; originally announced September 2023.


arXiv:2309.08702  [pdfpsother]  math.PR
Stochastic differential equations and stochastic parallel translations in the Wasserstein space
Authors: Hao DingShizan FangXiang-dong Li
Abstract: We will develop some elements in stochastic analysis in the Wasserstein space P_2(M) over a compact Riemannian manifold M, such as intrinsic Itô formulae, stochastic regular curves and parallel translations along them. We will establish the existence of parallel translations along regular curves, or stochastic regular curves in case of P_2(T). Surprisingly enough…  More
Submitted 15 September, 2023; originally announced September 2023.


arXiv:2309.08700  [pdf, other]  cs.RO  cs.LG  eess.SY
Wasserstein Distributionally Robust Control Barrier Function using Conditional Value-at-Risk with Differentiable Convex Programming
Authors: Alaa Eddine ChriatChuangchuang Sun
Abstract: Control Barrier functions (CBFs) have attracted extensive attention for designing safe controllers for their deployment in real-world safety-critical systems. However, the perception of the surrounding environment is often subject to stochasticity and further distributional shift from the nominal one. In this paper, we present distributional robust CBF (DR-CBF) to achieve resilience under distribu…  More
Submitted 15 September, 2023; originally announced September 2023.

Cited by 2 Related articles All 2 versions

arXiv:2309.08189  [pdfpsother]  math.PR
Rates of convergence in the distances of Kolmogorov and Wasserstein for standardized martingales
Authors: Xiequan FanZhonggen Su
Abstract: We give some rates of convergence in the distances of Kolmogorov and Wasserstein for standardized martingales with differences having finite variances. For the Kolmogorov distances, we present some exact Berry-Esseen bounds for martingales, which generalizes some Berry-Esseen bounds due to Bolthausen. For the Wasserstein distance, with Stein's method and Lindeberg's telescoping sum argument, the r…  More
Submitted 15 September, 2023; originally announced September 2023.
Comments: 31 pages
MSC Class: Primary 60G42; 60F05; Secondary 60E15

Related articles All 2 versions 

arXiv:2309.07692  [pdfother]  math.ST  stat.ME
A minimum Wasserstein distance approach to Fisher's combination of independent discrete p-values
Authors: Gonzalo ContadorZheyang Wu
Abstract: This paper introduces a comprehensive framework to adjust a discrete test statistic for improving its hypothesis testing procedure. The adjustment minimizes the Wasserstein distance to a null-approximating continuous distribution, tackling some fundamental challenges inherent in combining statistical significances derived from discrete distributions. The related theory justifies Lancaster's mid-p…  More
Submitted 14 September, 2023; originally announced September 2023.
MSC Class: 62E17; 62G10; 60E07

Related articles All 2 versions 

2023

arXiv:2309.07351  [pdfother math.OC
Wasserstein Consensus ADMM
Authors: Iman NodoziAbhishek Halder
Abstract: We introduce Wasserstein consensus alternating direction method of multipliers (ADMM) and its entropic-regularized version: Sinkhorn consensus ADMM, to solve measure-valued optimization problems with convex additive objectives. Several problems of interest in stochastic prediction and learning can be cast in this form of measure-valued convex additive optimization. The proposed algorithm generaliz…  More
Submitted 13 September, 2023; originally announced September 2023.

Related articles All 3 versions 


arXiv:2309.07031  [pdfpsother math.PR
Smooth Edgeworth Expansion and Wasserstein-p Bounds for Mixing Random Fields
Authors: Tianle LiuMorgane Austern
Abstract: In this paper, we consider d-dimensional mixing random fields (X_…) and study the convergence of the empirical average W_n… Under α-mixing and moment conditions, we obtain smooth Edgeworth expansions for W_n of any order k≥1 with better controlled remainder terms. We exploit this to obtain rates for the convergence…  More
Submitted 5 December, 2023; v1 submitted 13 September, 2023; originally announced September 2023.
Comments: 93 pages, 4 figures. arXiv admin note: substantial text overlap with arXiv:2209.09377
MSC Class: 60F05


arXiv:2309.05522  [pdfpsother math.AP
Maximizers of nonlocal interactions of Wasserstein type
Authors: Almut BurchardDavide CarazzatoIhsan Topaloglu
Abstract: We characterize the maximizers of a functional involving the minimization of the Wasserstein distance between equal volume sets. This functional appears as a repulsive interaction term in some models describing biological membranes. We combine a symmetrization-by-reflection technique with the uniqueness of optimal transport plans to prove that balls are the only maximizers. Further, in one dimensi…  More
Submitted 11 September, 2023; originally announced September 2023.

Related articles All 4 versions 


arXiv:2309.05315  [pdfother math.OC
Computing Wasserstein Barycenter via operator splitting: the method of averaged marginals
Authors: D. MimouniP MalisaniJ. ZhuW. de Oliveira
Abstract: The Wasserstein barycenter (WB) is an important tool for summarizing sets of probabilities. It finds applications in applied probability, clustering, image processing, etc. When the probability supports are finite and fixed, the problem of computing a WB is formulated as a linear optimization problem whose dimensions generally exceed standard solvers' capabilities. For this reason, the WB problem…  More
Submitted 11 September, 2023; originally announced September 2023.

Related articles All 29 versions 

arXiv:2309.05040  [pdfpsother math.AP  math.OC  math.PR
Comparison of viscosity solutions for a class of second order PDEs on the Wasserstein space
Authors: Erhan BayraktarIbrahim EkrenXin Zhang
Abstract: We prove a comparison result for viscosity solutions of second order parabolic partial differential equations in the Wasserstein space. The comparison is valid for semisolutions that are Lipschitz continuous in the measure in a Fourier-Wasserstein metric and uniformly continuous in time. The class of equations we consider is motivated by Mckean-Vlasov control problems with common noise and filteri…  More
Submitted 13 September, 2023; v1 submitted 10 September, 2023; originally announced September 2023.
Comments: Keywords: Wasserstein space, second order PDEs, viscosity solutions, comparison principle, Ishii's Lemma. In version 2 some small typos are fixed
MSC Class: 58E30; 90C05

Cited by 8 Related articles All 2 versions 

<–—2023———2023———1550— 


arXiv:2309.04674  [pdfpsother math.PR
Wasserstein Convergence Rate for Empirical Measures of Markov Processes
Authors: Feng-Yu Wang
Abstract: The convergence rate in Wasserstein distance is estimated for empirical measures of ergodic Markov processes, and the estimate can be sharp in some specific situations. The main result is applied to subordinations of typical models excluded by existing results, which include: stochastic Hamiltonian systems on R^n × R^m, spherical velocity Langevin processes on…  More
Submitted 2 October, 2023; v1 submitted 8 September, 2023; originally announced September 2023.
Comments: 35 pages

Related articles All 2 versions 

arXiv:2308.16381  [pdfother cs.RO
Wasserstein Distributionally Robust Chance Constrained Trajectory Optimization for Mobile Robots within Uncertain Safe Corridor
Authors: Shaohang XuHaolin RuanWentao ZhangYian WangLijun ZhuChin Pang Ho
Abstract: Safe corridor-based Trajectory Optimization (TO) presents an appealing approach for collision-free path planning of autonomous robots, offering global optimality through its convex formulation. The safe corridor is constructed based on the perceived map, however, the non-ideal perception induces uncertainty, which is rarely considered in trajectory generation. In this paper, we propose Distributio…  More
Submitted 30 August, 2023; originally announced August 2023.
Comments: 7 pages


Related articles All 2 versions 

arXiv:2308.15174  [pdfpsother math.AP  math.OC
A comparison principle for semilinear Hamilton-Jacobi-Bellman equations in the Wasserstein space
Authors: Samuel DaudinBenjamin Seeger
Abstract: The goal of this paper is to prove a comparison principle for viscosity solutions of semilinear Hamilton-Jacobi equations in the space of probability measures. The method involves leveraging differentiability properties of the 2-Wasserstein distance in the doubling of variables argument, which is done by introducing a further entropy penalization that ensures that the relevant optima are achieve…  More
Submitted 29 August, 2023; originally announced August 2023.

Cited by 6 Related articles All 3 versions 


arXiv:2308.14945  [pdfother stat.ML  cs.LG  stat.CO
Noise-Free Sampling Algorithms via Regularized Wasserstein Proximals
Authors: Hong Ye TanStanley OsherWuchen Li
Abstract: We consider the problem of sampling from a distribution governed by a potential function. This work proposes an explicit score based MCMC method that is deterministic, resulting in a deterministic evolution for particles rather than a stochastic differential equation evolution. The score term is given in closed form by a regularized Wasserstein proximal, using a kernel convolution that is approxim…  More
Submitted 2 October, 2023; v1 submitted 28 August, 2023; originally announced August 2023.
MSC Class: 65C05; 62G07



arXiv:2308.14048  [pdfother stat.ML  cs.LG  stat.AP  stat.CO stat.ME
A Bayesian Non-parametric Approach to Generative Models: Integrating Variational Autoencoder and Generative Adversarial Networks using Wasserstein and Maximum Mean Discrepancy
Authors: Forough Fazeli-AslMichael Minyi Zhang
Abstract: Generative models have emerged as a promising technique for producing high-quality images that are indistinguishable from real images. Generative adversarial networks (GANs) and variational autoencoders (VAEs) are two of the most prominent and widely studied generative models. GANs have demonstrated excellent performance in generating sharp realistic images and VAEs have shown strong abilities to…  More
Submitted 27 August, 2023; originally announced August 2023.



arXiv:2308.13840  [pdfother math.NA  cs.LG
Optimal Transport-inspired Deep Learning Framework for Slow-Decaying Problems: Exploiting Sinkhorn Loss and Wasserstein Kernel
Authors: Moaad KhamlichFederico PichiGianluigi Rozza
Abstract: Reduced order models (ROMs) are widely used in scientific computing to tackle high-dimensional systems. However, traditional ROM methods may only partially capture the intrinsic geometric characteristics of the data. These characteristics encompass the underlying structure, relationships, and essential features crucial for accurate modeling. To overcome this limitation, we propose a novel ROM fr…  More
Submitted 26 August, 2023; originally announced August 2023.
MSC Class: 68T05; 65D99; 41A05; 65N30; 41A46; 90C25



2023


arXiv:2308.12540  [pdfother stat.ME
Wasserstein Regression with Empirical Measures and Density Estimation for Sparse Data
Authors: Yidong ZhouHans-Georg Müller
Abstract: The problem of modeling the relationship between univariate distributions and one or more explanatory variables has found increasing interest. Traditional functional data methods cannot be applied directly to distributional data because of their inherent constraints. Modeling distributions as elements of the Wasserstein space, a geodesic metric space equipped with the Wasserstein metric that is re…  More
Submitted 23 August, 2023; originally announced August 2023.
Comments: 27 pages, 5 figures, 2 tables

Related articles All 2 versions 

arXiv:2308.10869  [pdfother cs.LG  cs.AI  eess.SP
A Novel Loss Function Utilizing Wasserstein Distance to Reduce Subject-Dependent Noise for Generalizable Models in Affective Computing
Authors: Nibraas KhanMahrukh TauseefRitam GhoshNilanjan Sarkar
Abstract: Emotions are an essential part of human behavior that can impact thinking, decision-making, and communication skills. Thus, the ability to accurately monitor and identify emotions can be useful in many human-centered applications such as behavioral training, tracking emotional well-being, and development of human-computer interfaces. The correlation between patterns in physiological data and affec…  More
Submitted 16 August, 2023; originally announced August 2023.
Comments: 9 pages

Related articles All 2 versions 

arXiv:2308.10753  [pdfother math.AP  math.OC
The Total Variation-Wasserstein Problem
Authors: Antonin ChambolleVincent DuvalJoao Miguel Machado
Abstract: In this work we analyze the Total Variation-Wasserstein minimization problem. We propose an alternative form of deriving optimality conditions from the approach of Calier\&Poon'18, and as result obtain further regularity for the quantities involved. In the sequel we propose an algorithm to solve this problem alongside two numerical experiments.
Submitted 21 August, 2023; originally announced August 2023.

All 3 versions 

arXiv:2308.10341  [pdfpsother math.PR 
math.OC
Computable Bounds on Convergence of Markov Chains in Wasserstein Distance
Authors: Yanlin QuJose BlanchetPeter Glynn
Abstract: We introduce a unified framework to estimate the convergence of Markov chains to equilibrium using Wasserstein distance. The framework provides convergence bounds with various rates, ranging from polynomial to exponential, all derived from a single contractive drift condition. This approach removes the need for finding a specific set with drift outside and contraction inside. The convergence bound…  More
Submitted 20 August, 2023; originally announced August 2023.
MSC Class: 60J05


arXiv:2308.10145  [pdfother stat.ML  cs.LG
Wasserstein Geodesic Generator for Conditional Distributions
Authors: Young-geun KimKyungbok LeeYoungwon ChoiJoong-Ho WonMyunghee Cho Paik
Abstract: Generating samples given a specific label requires estimating conditional distributions. We derive a tractable upper bound of the Wasserstein distance between conditional distributions to lay the theoretical groundwork to learn conditional distributions. Based on this result, we propose a novel conditional generation algorithm where conditional distributions are fully characterized by a metric spa…  More
Submitted 28 August, 2023; v1 submitted 19 August, 2023; originally announced August 2023.

Related articles All 2 versions 

<–—2023———2023———1560— 



arXiv:2308.08672  [pdfother math.ST
Nearly Minimax Optimal Wasserstein Conditional Independence Testing
Authors: Matey NeykovLarry WassermanIlmun KimSivaraman Balakrishnan
Abstract: This paper is concerned with minimax conditional independence testing. In contrast to some previous works on the topic, which use the total variation distance to separate the null from the alternative, here we use the Wasserstein distance. In addition, we impose Wasserstein smoothness conditions which on bounded domains are weaker than the corresponding total variation smoothness imposed, for inst…  More
Submitted 16 August, 2023; originally announced August 2023.
Comments: 24 pages, 1 figure, ordering of the last three authors is random

Related articles All 3 versions 

arXiv:2308.05065  [pdf, other]  math.MG  math-ph  math.FA
Isometric rigidity of Wasserstein spaces over Euclidean spheres
Authors: György Pál GehérAranka HruškováTamás TitkosDániel Virosztek
Abstract: We study the structure of isometries of the quadratic Wasserstein space W_2(…) over the sphere endowed with the distance inherited from the norm of R^…. We prove that W_2 is isometrically rigid, meaning that its isometry group is isomorphic to that of…  More
Submitted 9 August, 2023; originally announced August 2023.
Comments: 16 pages, 1 figure
MSC Class: 54E40; 46E27 (Primary) 60A10; 60B05 (Secondary)


arXiv:2308.04097  [pdfpsother math.OC
Viscosity Solutions of the Eikonal Equation on the Wasserstein Space
Authors: H. Mete SonerQinxin Yan
Abstract: Dynamic programming equations for mean field control problems with a separable structure are Eikonal equations on the Wasserstein space. Standard differentiation using linear derivatives yield a direct extension of the classical viscosity theory. We use Fourier representation of the Sobolev norms on the space of measures, together with the standard techniques from the finite dimensional theory to…  More
Submitted 7 January, 2024; v1 submitted 8 August, 2023; originally announced August 2023.
Comments: 13 pages
MSC Class: 35D40; 35Q89; 49L12; 49L25; 60G99


arXiv:2308.03133  [pdfpsother math.OC 
math.MG
A Duality-Based Proof of the Triangle Inequality for the Wasserstein Distances
Authors: François Golse
Abstract: This short note gives a proof of the triangle inequality based on the Kantorovich duality formula for the Wasserstein distances of exponent p ∈ [1,+∞) in the case of a general Polish space. In particular it avoids the "glueing of couplings" procedure used in most textbooks on optimal transport.
Submitted 6 August, 2023; originally announced August 2023.
Comments: 10 pages, no figure
MSC Class: 49Q22; 49N15 (60B10)



arXiv:2308.02607  [pdfother physics.comp-ph
Wasserstein-penalized Entropy closure: A use case for stochastic particle methods
Authors: Mohsen SadrNicolas G. HadjiconstantinouM. Hossein Gorji
Abstract: We introduce a framework for generating samples of a distribution given a finite number of its moments, targeted to particle-based solutions of kinetic equations and rarefied gas flow simulations. Our model, referred to as the Wasserstein-Entropy distribution (WE), couples a physically-motivated Wasserstein penalty term to the traditional maximum-entropy distribution (MED) functions, which serves…  More
Submitted 4 August, 2023; originally announced August 2023.



2023



arXiv:2308.01853  [pdf, other]  stat.ML  cs.LG  math.ST
Statistical Estimation Under Distribution Shift: Wasserstein Perturbations and Minimax Theory
Authors: Patrick ChaoEdgar Dobriban
Abstract: Distribution shifts are a serious concern in modern statistical learning as they can systematically change the properties of the data away from the truth. We focus on Wasserstein distribution shifts, where every data point may undergo a slight perturbation, as opposed to the Huber contamination model where a fraction of observations are outliers. We consider perturbations that are either independe…  More
Submitted 9 October, 2023; v1 submitted 3 August, 2023; originally announced August 2023.
Comments: 60 pages, 7 figures

Related articles  

arXiv:2308.00989  [pdf, other]  cs.LG  cs.AI
Wasserstein Diversity-Enriched Regularizer for Hierarchical Reinforcement Learning
Authors: Haorui LiJiaqi LiangLinjing LiDaniel Zeng
Abstract: Hierarchical reinforcement learning composites subpolicies in different hierarchies to accomplish complex tasks. Automated subpolicies discovery, which does not depend on domain knowledge, is a promising approach to generating subpolicies. However, the degradation problem is a challenge that existing methods can hardly deal with due to the lack of consideration of diversity or the employment of weak…  More
Submitted 2 August, 2023; originally announced August 2023.

Related articles All 4 versions

arXiv:2308.00273  [pdfother cs.LG
Neural approximation of Wasserstein distance via a universal architecture for symmetric and factorwise group invariant functions
Authors: Samantha ChenYusu Wang
Abstract: Learning distance functions between complex objects, such as the Wasserstein distance to compare point sets, is a common goal in machine learning applications. However, functions on such complex objects (e.g., point sets and graphs) are often required to be invariant to a wide variety of group actions e.g. permutation or rigid transformation. Therefore, continuous and symmetric product functions (…  More
Submitted 17 November, 2023; v1 submitted 1 August, 2023; originally announced August 2023.
Comments: Accepted to NeurIPS 2023

Related articles All 4 versions 


arXiv:2307.16421  [pdfother math.PR  math.AP  stat.ML
Wasserstein Mirror Gradient Flow as the limit of the Sinkhorn Algorithm
Authors: Nabarun DebYoung-Heon KimSoumik PalGeoffrey Schiebinger
Abstract: We prove that the sequence of marginals obtained from the iterations of the Sinkhorn algorithm or the iterative proportional fitting procedure (IPFP) on joint densities, converges to an absolutely continuous curve on the 2-Wasserstein space, as the regularization parameter ε goes to zero and the number of iterations is scaled as 1/ε (and other technical assumptions). This…  More
Submitted 31 July, 2023; originally announced July 2023.
Comments: 49 pages, 2 figures
MSC Class: 49N99; 49Q22; 60J60

Cited by 6 Related articles All 3 versions 
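
For orientation only, here is a minimal numerical sketch (not taken from the paper above) of the Sinkhorn / iterative proportional fitting iterations that the entry studies: alternating diagonal scalings of a Gibbs kernel so that the regularized transport plan matches the two prescribed marginals. The grid, eps and iteration count are illustrative assumptions; much smaller eps would need a log-domain implementation.

import numpy as np

def sinkhorn(mu, nu, C, eps, n_iter=2000):
    # Entropically regularized optimal transport via Sinkhorn/IPFP scalings.
    # mu, nu: probability vectors; C: cost matrix; eps: regularization strength.
    K = np.exp(-C / eps)                  # Gibbs kernel
    u = np.ones_like(mu)
    v = np.ones_like(nu)
    for _ in range(n_iter):
        u = mu / (K @ v)                  # enforce the row marginal
        v = nu / (K.T @ u)                # enforce the column marginal
    return u[:, None] * K * v[None, :]    # regularized transport plan

x = np.linspace(0.0, 1.0, 50)
mu = np.exp(-(x - 0.3) ** 2 / 0.01); mu /= mu.sum()
nu = np.exp(-(x - 0.7) ** 2 / 0.02); nu /= nu.sum()
C = (x[:, None] - x[None, :]) ** 2        # squared-distance cost
P = sinkhorn(mu, nu, C, eps=0.05)
print("regularized transport cost:", (P * C).sum())
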


arXiv:2307.15764  [pdf, ps, other]  math.PR
Geometric Ergodicity and Wasserstein Continuity of Non-Linear Filters
Authors: Yunus Emre DemirciSerdar Yüksel
Abstract: In this paper, we present conditions for the geometric ergodicity and Wasserstein regularity of non-linear filter processes, which has received little attention in the literature. While previous studies in the field of non-linear filtering have mainly focused on unique ergodicity and the weak Feller property, our work extends these findings in three main directions: (i) We present conditions on th…  More
Submitted 29 October, 2023; v1 submitted 28 July, 2023; originally announced July 2023.
MSC Class: 60J05; 60J10; 93E11; 93E15

<–—2023———2023———1570— 



arXiv:2307.15423  [pdf, other]  math.NA
Nonlinear reduced basis using mixture Wasserstein barycenters: application to an eigenvalue problem inspired from quantum chemistry
Authors: Maxime DaleryGenevieve DussonVirginie EhrlacherAlexei Lozinski
Abstract: The aim of this article is to propose a new reduced-order modelling approach for parametric eigenvalue problems arising in electronic structure calculations. Namely, we develop nonlinear reduced basis techniques for the approximation of parametric eigenvalue problems inspired from quantum chemistry applications. More precisely, we consider here a one-dimensional model which is a toy model for the…  More
Submitted 28 July, 2023; originally announced July 2023.

Related articles All 12 versions 

Gao, Rui

Finite-sample guarantees for Wasserstein distributionally robust optimization: breaking the curse of dimensionality. (English) Zbl 07819191

Oper. Res. 71, No. 6, 2291-2306 (2023).

MSC:  90Cxx

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 



Li, ZhengyangTang, YijiaChen, JingWu, Hao

On quadratic Wasserstein metric with squaring scaling for seismic velocity inversion. (English) Zbl 07814756

Numer. Math., Theory Methods Appl. 16, No. 2, 277-297 (2023).

MSC:  49N45 65K10 86-08 86A15

PDFBibTeX XMLCite

Full Text: DOI 



Monmarché, Pierre

Wasserstein contraction and Poincaré inequalities for elliptic diffusions with high diffusivity. (Contraction Wasserstein et inégalité de Poincaré pour des diffusions elliptiques à forte diffusivité.) (English. French summary) Zbl 07814461

Ann. Henri Lebesgue 6, 941-973 (2023).

MSC:  60J60

PDFBibTeX XMLCite

Full Text: DOI 


Gao, RuiKleywegt, Anton

Distributionally robust stochastic optimization with Wasserstein distance. (English) Zbl 07808961

Math. Oper. Res. 48, No. 2, 603-655 (2023).

MSC:  90C15 90C17 90C46

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

Cited by 693 Related articles All 8 versions


2023



Nguyen, Viet AnhShafieezadeh-Abadeh, SorooshKuhn, DanielEsfahani, Peyman Mohajerin

Bridging Bayesian and minimax mean square error estimation via Wasserstein distributionally robust optimization. (English) Zbl 07808926

Math. Oper. Res. 48, No. 1, 1-37 (2023).

MSC:  90Cxx

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


Wu, HaoFan, XiequanGao, ZhiqiangYe, Yinna

Wasserstein-1 distance and nonuniform Berry-Esseen bound for a supercritical branching process in a random environment. (English) Zbl 07808400

J. Math. Res. Appl. 43, No. 6, 737-753 (2023).

MSC:  60J80 60K37 60F05 62E20

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


Bensoussan, AlainHuang, ZiyuYam, Sheung Chi Phillip

Control theory on Wasserstein space: a new approach to optimality conditions. (English) Zbl 07800856

Ann. Math. Sci. Appl. 8, No. 3, 565-628 (2023).

MSC:  35Q93 35Q84 49L25 49N80 93E20 93B52 60H30 60H10 60H15 35F21

PDFBibTeX XMLCite

Full Text: DOI 


Yang, Xue

Reflecting image-dependent SDEs in Wasserstein space and large deviation principle. (English) Zbl 07800074

Stochastics 95, No. 8, 1361-1394 (2023).

Reviewer: Ivan Podvigin (Novosibirsk)

MSC:  60H15 60G46 60H10

PDFBibTeX XMLCite

Full Text: DOI 



Li, MengyuYu, JunXu, HongtengMeng, Cheng

Efficient approximation of Gromov-Wasserstein distance using importance sparsification. (English) Zbl 07792634

J. Comput. Graph. Stat. 32, No. 4, 1512-1523 (2023).

MSC:  62-XX

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

<–—2023———2023———1580—  



Li, TaoYu, JunMeng, Cheng

Scalable model-free feature screening via sliced-Wasserstein dependency. (English) Zbl 07792633

J. Comput. Graph. Stat. 32, No. 4, 1501-1511 (2023).

MSC:  62-XX

PDFBibTeX XMLCite

Full Text: DOI 



Liu, TianleAustern, Morgane

Wasserstein-p bounds in the central limit theorem under local dependence. (English) Zbl 07790312

Electron. J. Probab. 28, Paper No. 117, 47 p. (2023).

Reviewer: Fraser Daly (Edinburgh)

MSC:  60F05 62E17

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 



Schär, Philip

Wasserstein contraction and spectral gap of slice sampling revisited. (English) Zbl 07790291

Electron. J. Probab. 28, Paper No. 136, 28 p. (2023).

MSC:  65C05 60J05 60J22

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


Delon, JulieGozlan, NathaelDizier, Alexandre Saint

Generalized Wasserstein barycenters between probability measures living on different subspaces. (English) Zbl 07789638

Ann. Appl. Probab. 33, No. 6A, 4395-4423 (2023).

MSC:  60A10 49J40 49K21 49N15

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


Chambolle, AntoninDuval, VincentMachado, João Miguel

The total variation-Wasserstein problem: a new derivation of the Euler-Lagrange equations. (English) Zbl 07789238

Nielsen, Frank (ed.) et al., Geometric science of information. 6th international conference, GSI 2023, St. Malo, France, August 30 – September 1, 2023. Proceedings. Part I. Cham: Springer. Lect. Notes Comput. Sci. 14071, 610-619 (2023).

MSC:  35A35 35A15 49Q22

PDFBibTeX XMLCite

Full Text: DOI 


2023


Han, AndiMishra, BamdevJawanpuria, PratikGao, Junbin

Learning with symmetric positive definite matrices via generalized Bures-Wasserstein geometry. (English) Zbl 07789217

Nielsen, Frank (ed.) et al., Geometric science of information. 6th international conference, GSI 2023, St. Malo, France, August 30 – September 1, 2023. Proceedings. Part I. Cham: Springer. Lect. Notes Comput. Sci. 14071, 405-415 (2023).

MSC:  53B12

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

Related articles All 4 versions

Jalowy, Jonas

The Wasserstein distance to the circular law. (English) Zbl 07788740

Ann. Inst. Henri Poincaré, Probab. Stat. 59, No. 4, 2285-2307 (2023).

MSC:  60B20 41A25 49Q22 60G55

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


Borda, Bence

Empirical measures and random walks on compact spaces in the quadratic Wasserstein metric. (English) Zbl 07788732

Ann. Inst. Henri Poincaré, Probab. Stat. 59, No. 4, 2017-2035 (2023).

MSC:  60G50 49Q22 60B15 60G10

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 



Mémoli, FacundoMunk, AxelWan, ZhengchaoWeitkamp, Christoph

The ultrametric Gromov-Wasserstein distance. (English) Zbl 07781566

Discrete Comput. Geom. 70, No. 4, 1378-1450 (2023).

MSC:  53-XX 49Q22 53C23 53Z50

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


Es-Sebaiy, KhalifaAlazemi, FaresAl-Foraih, Mishari

Wasserstein bounds in CLT of approximative MCE and MLE of the drift parameter for Ornstein-Uhlenbeck processes observed at high frequency. (English) Zbl 07778059

J. Inequal. Appl. 2023, Paper No. 62, 17 p. (2023).

MSC:  60F05 60G15 60G10 62F12 60H07

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

<–—2023———2023———1590— 


2023



De Giuli, Maria ElenaSpelta, Alessandro

Wasserstein barycenter regression for estimating the joint dynamics of renewable and fossil fuel energy indices. (English) Zbl 07778005

Comput. Manag. Sci. 20, Paper No. 1, 17 p. (2023).

MSC:  90Bxx

PDFBibTeX XMLCite

Full Text: DOI 

Piccoli, BenedettoRossi, FrancescoTournus, Magali

A Wasserstein norm for signed measures, with application to non-local transport equation with source term. (English) Zbl 1527.35336

Commun. Math. Sci. 21, No. 5, 1279-1301 (2023).

MSC:  35Q49 28A33 35A01 35A02 35R06

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 



Fu, GuoshengOsher, StanleyLi, Wuchen

High order spatial discretization for variational time implicit schemes: Wasserstein gradient flows and reaction-diffusion systems. (English) Zbl 07771291

J. Comput. Phys. 491, Article ID 112375, 30 p. (2023).

MSC:  35Kxx 65Mxx 35Qxx

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

Cited by 11 Related articles All 7 versions


Pesenti, Silvana M.Jaimungal, Sebastian

Portfolio optimization within a Wasserstein ball. (English) Zbl 07770149

SIAM J. Financ. Math. 14, No. 4, 1175-1214 (2023).

MSC:  91G10 91G70 62H05

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

Gao, YihangNg, Michael K.Zhou, Mingjie

Approximating probability distributions by using Wasserstein generative adversarial networks. (English) Zbl 07768240

SIAM J. Math. Data Sci. 5, No. 4, 949-976 (2023).

MSC:  68Q32 68T15 68W40

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 



2023



Friesecke, GeroPenka, Maximilian

The GenCol algorithm for high-dimensional optimal transport: general formulation and application to barycenters and Wasserstein splines. (English) Zbl 07768238

SIAM J. Math. Data Sci. 5, No. 4, 899-919 (2023).

MSC:  65Kxx 90C06

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 




Chen, DaliWu, YuweiLi, JingquanDing, XiaohuiChen, Caihua

Distributionally robust mean-absolute deviation portfolio optimization using Wasserstein metric. (English) Zbl 1528.90295

J. Glob. Optim. 87, No. 2-4, 783-805 (2023).

MSC:  90C90 90C15 90C26

PDFBibTeX XMLCite

Full Text: DOI 



De Palma, GiacomoTrevisan, Dario

The Wasserstein distance of order 1 for quantum spin systems on infinite lattices. (English) Zbl 07761144

Ann. Henri Poincaré 24, No. 12, 4237-4282 (2023).

MSC:  81Q35 81R25 81P55 81P65 81P17 26B05 80A10

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 



Jimenez, ChloéMarigonda, AntonioQuincampoix, Marc

Dynamical systems and Hamilton-Jacobi-Bellman equations on the Wasserstein space and their L2 representations. (English) Zbl 1526.35118

SIAM J. Math. Anal. 55, No. 5, 5919-5966 (2023).

MSC:  35F21 35D40 49L25 49Q22

PDFBibTeX XMLCite

Full Text: DOI 



Duvenhage, RoccoMapaya, Mathumo

Quantum Wasserstein distance of order 1 between channels. (English) Zbl 07757254

Infin. Dimens. Anal. Quantum Probab. Relat. Top. 26, No. 3, Article ID 2350006, 36 p. (2023).

MSC:  81P47 81S22 49Q22 46L60 32E05 82C70

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

 arXiv 

<–—2023———2023———1600—



Wang, Feng-Yu

Convergence in Wasserstein distance for empirical measures of Dirichlet diffusion processes on manifolds. (English) Zbl 1525.58011

J. Eur. Math. Soc. (JEMS) 25, No. 9, 3695-3725 (2023).

MSC:  58J65

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 



Thanwerdas, YannPennec, Xavier

Bures-Wasserstein minimizing geodesics between covariance matrices of different ranks. (English) Zbl 1528.15027

SIAM J. Matrix Anal. Appl. 44, No. 3, 1447-1476 (2023).

Reviewer: Benjamin McKay (Cork)

MSC:  15B48 15A63 53B20 53C22 58D17 54E50 58A35

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


Li, WuchenLiu, SitingOsher, Stanley

A kernel formula for regularized Wasserstein proximal operators. (English) Zbl 1525.35064

Res. Math. Sci. 10, No. 4, Paper No. 43, 16 p. (2023).

MSC:  35C15 35A22 35K08

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 



Ballesio, MarcoJasra, Ajayvon Schwerin, ErikTempone, Raúl

A Wasserstein coupled particle filter for multilevel estimation. (English) Zbl 07750820

Stochastic Anal. Appl. 41, No. 5, 820-859 (2023).

MSC:  62-XX 93-XX

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 



Wang, Feng-YuWu, Bingyao

Wasserstein convergence for empirical measures of subordinated diffusions on Riemannian manifolds. (English) Zbl 1523.58036

Potential Anal. 59, No. 3, 933-954 (2023).

MSC:  58J65

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


2023


Karakhanyan, Aram L.

A nonlocal free boundary problem with Wasserstein distance. (English) Zbl 1528.35240

Calc. Var. Partial Differ. Equ. 62, No. 9, Paper No. 240, 22 p. (2023).

Reviewer: Mariana Vega Smit (Bellingham)

MSC:  35R35 35A15 35J60 35J87 49Q20

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 



Slepčev, DejanWarren, Andrew

Nonlocal Wasserstein distance: metric and asymptotic properties. (English) Zbl 07748712

Calc. Var. Partial Differ. Equ. 62, No. 9, Paper No. 238, 66 p. (2023).

MSC:  46E27 49J99 60J76 60B10 45G10

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


 



Fornasier, MassimoSavaré, GiuseppeSodini, Giacomo Enrico

Density of subalgebras of Lipschitz functions in metric Sobolev spaces and applications to Wasserstein Sobolev spaces. (English) Zbl 07744988

J. Funct. Anal. 285, No. 11, Article ID 110153, 76 p. (2023).

MSC:  46E36 31C25 49Q20 28A33

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 




Chakraborty, Kuntal

A note on relative Vaserstein symbol. (English) Zbl 07742052

J. Algebra Appl. 22, No. 10, Article ID 2350210, 29 p. (2023).

Reviewer: Moshe Roitman (Haifa)

MSC:  13C10 13H05 19B14 19B10

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 




Pál, Gehér GyörgyTitkos, TamásVirosztek, Dániel

On isometries of Wasserstein spaces. (English) Zbl 1527.54010

RIMS Kôkyûroku Bessatsu B93, 239-250 (2023).

Reviewer: Aleksey A. Dovgoshey (Slovyansk)

MSC:  54E40 46E27 60B05

PDFBibTeX XMLCite

Full Text: Link 

<–—2023———2023———1610-



Gehér, György PálTitkos, TamásVirosztek, Dániel

Isometric rigidity of Wasserstein tori and spheres. (English) Zbl 07740588

Mathematika 69, No. 1, 20-32 (2023).

Reviewer: Aleksey A. Dovgoshey (Slovyansk)

MSC:  54E40 46E27 54E70

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 



Baudier, F.Gartland, C.Schlumprecht, Th.

L1-distortion of Wasserstein metrics: a tale of two dimensions. (English) Zbl 07739839

Trans. Am. Math. Soc., Ser. B 10, 1077-1118 (2023).

MSC:  46B85 68R12 51F30 05C63 46B99

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


Wang, Yu-ZhaoLi, Sheng-JieZhang, Xinxin

Generalized displacement convexity for nonlinear mobility continuity equation and entropy power concavity on Wasserstein space over Riemannian manifolds. (English) Zbl 1521.49035

Manuscr. Math. 172, No. 1-2, 405-426 (2023).

MSC:  49Q20 49K20 49K45 53C22 58J35 53E40

PDFBibTeX XMLCite

Full Text: DOI 


Azizian, WaïssIutzeler, FranckMalick, Jérôme

Regularization for Wasserstein distributionally robust optimization. (English) Zbl 1522.90059

ESAIM, Control Optim. Calc. Var. 29, Paper No. 33, 31 p. (2023).

MSC:  90C17 90C25 49N15 49Q22

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


Cosso, AndreaMartini, Mattia

On smooth approximations in the Wasserstein space. (English) Zbl 07734100

Electron. Commun. Probab. 28, Paper No. 30, 11 p. (2023).

MSC:  28A33 28A15 49N80

PDFBibTeX XMLCite

Full Text: DOI 

arXiv


2023



 

Fournier, Nicolas

Convergence of the empirical measure in expected Wasserstein distance: non-asymptotic explicit bounds in ℝ^d. (English) Zbl 07730458

ESAIM, Probab. Stat. 27, 749-775 (2023).

Reviewer: Carlo Sempi (Lecce)

MSC:  60F25 65C05

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 




Sodini, Giacomo Enrico

The general class of Wasserstein Sobolev spaces: density of cylinder functions, reflexivity, uniform convexity and Clarkson’s inequalities. (English) Zbl 07727384

Calc. Var. Partial Differ. Equ. 62, No. 7, Paper No. 212, 41 p. (2023).

MSC:  46E36 49Q22 46B10 46B20

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 




Simon, RichárdVirosztek, Dániel

Preservers of the p-power and the Wasserstein means on 2×2 matrices. (English) Zbl 1521.15024

Electron. J. Linear Algebra 39, 395-408 (2023).

MSC:  15A86 15A24 47B49 47A64

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv    Link 




Hamm, KeatonHenscheid, NickKang, Shujie

Wassmap: Wasserstein isometric mapping for image manifold learning. (English) Zbl 07726190

SIAM J. Math. Data Sci. 5, No. 2, 475-501 (2023).

MSC:  68T10 49Q22

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 




González-Delgado, JavierGonzález-Sanz, AlbertoCortés, JuanNeuvial, Pierre

Two-sample goodness-of-fit tests on the flat torus based on Wasserstein distance and their relevance to structural biology. (English) Zbl 07725163

Electron. J. Stat. 17, No. 1, 1547-1586 (2023).

MSC:  62-XX

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv

<–—2023———2023———1620--



Mathey-Prevot, MaximeValette, Alain

Wasserstein distance and metric trees. (English) Zbl 07724327

Enseign. Math. (2) 69, No. 3-4, 315-333 (2023).

MSC:  05C12 05C05 46B85 62P10

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


Altekrüger, FabianHertrich, Johannes

WPPNets and WPPFlows: the power of Wasserstein patch priors for superresolution. (English) Zbl 1517.94016

SIAM J. Imaging Sci. 16, No. 3, 1033-1067 (2023).

MSC:  94A08 62F15 68T07 62C10 68U10 68T37 49Q22

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


Beinert, RobertHeiss, CosmasSteidl, Gabriele

On assignment problems related to Gromov-Wasserstein distances on the real line. (English) Zbl 1519.49022

SIAM J. Imaging Sci. 16, No. 2, 1028-1032 (2023).

MSC:  49M20 28A35 90C27

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

<–—2023———2023———1650--


Kravtsova, NataliaMcGee, Reginald L. IIDawes, Adriana T.

Scalable Gromov-Wasserstein based comparison of biological time series. (English) Zbl 1519.92005

Bull. Math. Biol. 85, No. 8, Paper No. 77, 26 p. (2023).

MSC:  92B15

PDFBibTeX XMLCite

Full Text: DOI 


Chen, ShukaiFang, RongjuanZheng, Xiangqi

Wasserstein-type distances of two-type continuous-state branching processes in Lévy random environments. (English) Zbl 07722781

J. Theor. Probab. 36, No. 3, 1572-1590 (2023).

Reviewer: Victor V. Goryainov (Moskva)

MSC:  60J25 60J68 60J80 60J76

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


Jourdain, BenjaminMargheriti, WilliamPammer, Gudmund

Lipschitz continuity of the Wasserstein projections in the convex order on the line. (English) Zbl 1519.49032

Electron. Commun. Probab. 28, Paper No. 18, 13 p. (2023).

MSC:  49Q22

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


Fang, XiaoKoike, Yuta

From p-Wasserstein bounds to moderate deviations. (English) Zbl 1519.60032

Electron. J. Probab. 28, Paper No. 83, 52 p. (2023).

MSC:  60F05 60F10 62E17

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


Ponnoprat, Donlapark

Universal consistency of Wasserstein k-NN classifier: a negative and some positive results. (English) Zbl 07720187

Inf. Inference 12, No. 3, Article ID iaad027, 23 p. (2023).

MSC:  62H30 49Q22 90Cxx 62F15

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


2023



Bubenik, PeterScott, JonathanStanley, Donald

Exact weights, path metrics, and algebraic Wasserstein distances. (English) Zbl 1522.55007

J. Appl. Comput. Topol. 7, No. 2, 185-219 (2023).

Reviewer: Massimo Ferri (Bologna)

MSC:  55N31 18E10

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


Wickman, ClareOkoudjou, Kasso A.

Gradient flows for probabilistic frame potentials in the Wasserstein space. (English) Zbl 1518.42045

SIAM J. Math. Anal. 55, No. 3, 2324-2346 (2023).

MSC:  42C15 60D05 94A12 35Q82 35R60

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

Cited by 2 Related articles All 2 versions


Le Gouic, ThibautParis, QuentinRigollet, PhilippeStromme, Austin J.

Fast convergence of empirical barycenters in Alexandrov spaces and the Wasserstein space. (English) Zbl 07714611

J. Eur. Math. Soc. (JEMS) 25, No. 6, 2229-2250 (2023).

MSC:  62G05 51F99

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

Figalli, AlessioGlaudo, Federico

An invitation to optimal transport, Wasserstein distances, and gradient flows. 2nd edition. (English) Zbl 1527.49001

EMS Textbooks in Mathematics. Berlin: European Mathematical Society (EMS) (ISBN 978-3-98547-050-1/hbk; 978-3-98547-550-6/ebook). vi, 146 p. (2023).

Reviewer: Antonio Masiello (Bari)

MSC:  49-01 49-02 49Q22 60B05 28A33 35A15 35Q35 49N15 28A50

PDFBibTeX XMLCite

Full Text: DOI 




Jeong, MiranHwang, JinmiKim, Sejong

Right mean for the α−z Bures-Wasserstein quantum divergence. (English) Zbl 1524.81013

Acta Math. Sci., Ser. B, Engl. Ed. 43, No. 5, 2320-2332 (2023).

MSC:  81P17 15B48

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

<–—2023———2023———1650--



Li, WuchenZhao, Jiaxi

Wasserstein information matrix. (English) Zbl 1521.94014

Inf. Geom. 6, No. 1, 203-255 (2023).

Reviewer: Sorin-Mihai Grad (Paris)

MSC:  94A15 94A17 62H10 62F10 49Q22 90C26 15B52

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 




Candelieri, AntonioPonti, AndreaGiordani, IlariaArchetti, Francesco

On the use of Wasserstein distance in the distributional analysis of human decision making under uncertainty. (English) Zbl 07709590

Ann. Math. Artif. Intell. 91, No. 2-3, 217-238 (2023).

MSC:  68Txx

PDFBibTeX XMLCite

Full Text: DOI 


Duvenhage, Rocco

Wasserstein distance between noncommutative dynamical systems. (English) Zbl 1528.46053

J. Math. Anal. Appl. 527, No. 1, Part 2, Article ID 127353, 26 p. (2023).

MSC:  46L55 37A55

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


Chen, YaqingLin, ZhenhuaMüller, Hans-Georg

Wasserstein regression. (English) Zbl 07707208

J. Am. Stat. Assoc. 118, No. 542, 869-882 (2023).

MSC:  62-XX

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


Bartl, DanielWiesel, Johannes

Sensitivity of multiperiod optimization problems with respect to the adapted Wasserstein distance. (English) Zbl 1520.91364

SIAM J. Financ. Math. 14, No. 2, 704-720 (2023).

MSC:  91G10 93E20 60G40

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

Cited by 16 Related articles


2023

Yuan, YuefeiSong, QiankunZhou, Bo

A 


Luo, ZunhaoYin, YunqiangWang, DujuanCheng, T. C. E.Wu, Chin-Chia

Wasserstein distributionally robust chance-constrained program with moment information. (English) Zbl 07706559

Comput. Oper. Res. 152, Article ID 106150, 22 p. (2023).

MSC:  90Bxx

PDFBibTeX XMLCite

Full Text: DOI 


Xia, TianLiu, JiaLisser, Abdel

Distributionally robust chance constrained games under Wasserstein ball. (English) Zbl 1525.91008

Oper. Res. Lett. 51, No. 3, 315-321 (2023).

MSC:  91A10

PDFBibTeX XMLCite

Full Text: DOI   

 

Chen, ZhiKuhn, DanielWiesemann, Wolfram

On approximations of data-driven chance constrained programs over Wasserstein balls. (English) Zbl 1525.90289

Oper. Res. Lett. 51, No. 3, 226-233 (2023).

MSC:  90C15 90C11

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

Cited by 9 Related articles All 7 versions


Talbi, MehdiTouzi, NizarZhang, Jianfeng

Viscosity solutions for obstacle problems on Wasserstein space. (English) Zbl 1515.60112

SIAM J. Control Optim. 61, No. 3, 1712-1736 (2023).

MSC:  60G40 35Q89 49N80 49L25 60H30

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

<–—2023———2023———1660--(



Bayraktar, ErhanEkren, IbrahimZhang, Xin

A smooth variational principle on Wasserstein space. (English) Zbl 07702414

Proc. Am. Math. Soc. 151, No. 9, 4089-4098 (2023).

MSC:  58E30 90C05 35D40

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


Konarovskyi, Vitalii

Coalescing-fragmentating Wasserstein dynamics: particle approach. (English) Zbl 07699948

Ann. Inst. Henri Poincaré, Probab. Stat. 59, No. 2, 983-1028 (2023).

MSC:  60K35 60B12 60G44 60J60 82B21

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

Barrera, GerardoLukkarinen, Jani

Quantitative control of Wasserstein distance between Brownian motion and the Goldstein-Kac telegraph process. (English) Zbl 07699947

Ann. Inst. Henri Poincaré, Probab. Stat. 59, No. 2, 933-982 (2023).

MSC:  60G50 60K99 60J76 60K35 60K40

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

Fuhrmann, SvenKupper, MichaelNendel, Max

Wasserstein perturbations of Markovian transition semigroups. (English) Zbl 1516.60045

Ann. Inst. Henri Poincaré, Probab. Stat. 59, No. 2, 904-932 (2023).

MSC:  60J35 47H20 60G65 62G35 90C31

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


Lacombe, JulienDigne, JulieCourty, NicolasBonneel, Nicolas

Learning to generate Wasserstein barycenters. (English) Zbl 1512.68299

J. Math. Imaging Vis. 65, No. 2, 354-370 (2023).

MSC:  68T07 49Q22 68U10

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 



2023


Cheng, Li-JuanThalmaier, AntonWang, Feng-Yu

Some inequalities on Riemannian manifolds linking entropy, Fisher information, Stein discrepancy and Wasserstein distance. (English) Zbl 1516.60013

J. Funct. Anal. 285, No. 5, Article ID 109997, 42 p. (2023).

Reviewer: Fraser Daly (Edinburgh)

MSC:  60E15 35K08 46E35 42B35

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

Cited by 3 Related articles All 6 versions



Daudin, Samuel

Optimal control of the Fokker-Planck equation under state constraints in the Wasserstein space. (English. French summary) Zbl 1522.49024

J. Math. Pures Appl. (9) 175, 37-75 (2023).

Reviewer: Christian Clason (Graz)

MSC:  49K20 49J20 49J30 93E20 35K99

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

Cited by 16 Related articles All 6 versions



Pagès, GillesPanloup, Fabien

Unadjusted Langevin algorithm with multiplicative noise: total variation and Wasserstein bounds. (English) Zbl 1515.65032

Ann. Appl. Probab. 33, No. 1, 726-779 (2023).

MSC:  65C30 37M25 60F05 60H10 62L20

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


Wang, Feng-Yu

Convergence in Wasserstein distance for empirical measures of semilinear SPDEs. (English) Zbl 1521.60033

Ann. Appl. Probab. 33, No. 1, 70-84 (2023).

Reviewer: Anhui Gu (Chongqing)

MSC:  60H15 60F99

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


Li, HuaiqianWu, Bingyao

Wasserstein convergence rates for empirical measures of subordinated processes on noncompact manifolds. (English) Zbl 1516.58014

J. Theor. Probab. 36, No. 2, 1243-1268 (2023).

Reviewer: Feng-Yu Wang (Tianjin)

MSC:  58J65 60J60 60J76

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

<–—2023———2023———1670--


Deo, NeilRandrianarisoa, Thibault

On adaptive confidence sets for the Wasserstein distances. (English) Zbl 07691575

Bernoulli 29, No. 3, 2119-2141 (2023).

MSC:  62Gxx 60Fxx 62Cxx

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

 Link 


Bachoc, FrançoisFathi, Max

Bounds in L1 Wasserstein distance on the normal approximation of general M-estimators. (English) Zbl 07690328

Electron. J. Stat. 17, No. 1, 1457-1491 (2023).

MSC:  62-XX

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

 Link 


Cui, JianboLiu, ShuZhou, Haomin

Wasserstein Hamiltonian flow with common noise on graph. (English) Zbl 1516.35434

SIAM J. Appl. Math. 83, No. 2, 484-509 (2023).

MSC:  35R02 35Q55 35R60 49Q22

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


Nhan-Phu ChungQuoc-Hung Nguyen

Gradient flows of modified Wasserstein distances and porous medium equations with nonlocal pressure. (English) Zbl 1514.35007

Acta Math. Vietnam. 48, No. 1, 209-235 (2023).

MSC:  35A15 35A01 35A35 35R11

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


Buzun, Nazar

Gaussian approximation for penalized Wasserstein barycenters. (English) Zbl 07686805

Math. Methods Stat. 32, No. 1, 1-26 (2023).

MSC:  62-XX 60-XX

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


2023

Minh, Hà Quang

Entropic regularization of Wasserstein distance between infinite-dimensional Gaussian measures and Gaussian processes. (English) Zbl 1514.49024

J. Theor. Probab. 36, No. 1, 201-296 (2023).

MSC:  49Q22 28C20 60G15 47B65

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


Wu, ZhongmingSun, Kexin

Distributionally robust optimization with Wasserstein metric for multi-period portfolio selection under uncertainty. (English) Zbl 1510.91155

Appl. Math. Modelling 117, 513-528 (2023).

MSC:  91G10 49Q22 90C17

PDFBibTeX XMLCite

Full Text: DOI 


von Lindheim, Johannes

Simple approximative algorithms for free-support Wasserstein barycenters. (English) Zbl 1515.65055

Comput. Optim. Appl. 85, No. 1, 213-246 (2023).

MSC:  65D18 68U10 90B80

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

Cited by 7 Related articles All 9 versions

Kim, Sejong

Parameterized Wasserstein means. (English) Zbl 1514.15006

J. Math. Anal. Appl. 525, No. 1, Article ID 127272, 14 p. (2023).

MSC:  15A04 15A45 60B20 47A64 43A07

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

Minh, Hà Quang

Convergence and finite sample approximations of entropic regularized Wasserstein distances in Gaussian and RKHS settings. (English) Zbl 07680420

Anal. Appl., Singap. 21, No. 3, 719-775 (2023).

MSC:  28C20 49Q22 46E22

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

Cited by 1 Related articles All 4 versions

<–—2023———2023———1680--


Feng, ChunrongLiu, YujiaZhao, Huaizhong

Periodic measures and Wasserstein distance for analysing periodicity of time series datasets. (English) Zbl 1511.60011

Commun. Nonlinear Sci. Numer. Simul. 120, Article ID 107166, 31 p. (2023).

MSC:  60B12 62M05 37A50 37A44

PDFBibTeX XMLCite

Full Text: DOI 

Kim, KihyunYang, Insoon

Distributional robustness in minimax linear quadratic control with Wasserstein distance. (English) Zbl 1511.93142

SIAM J. Control Optim. 61, No. 2, 458-483 (2023).

MSC:  93E20 49N05 93C55 93C05

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


Ma, QingWang, Yanjun

Distributionally robust chance constrained SVM model with ℓ2-Wasserstein distance. (English) Zbl 07668806

J. Ind. Manag. Optim. 19, No. 2, 916-931 (2023).

MSC:  90C11

PDFBibTeX XMLCite

Full Text: DOI 


Ho-Nguyen, NamWright, Stephen J.

Adversarial classification via distributional robustness with Wasserstein ambiguity. (English) Zbl 07667538

Math. Program. 198, No. 2 (B), 1411-1447 (2023).

MSC:  68T09 90C17 90C26 90C30

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

Cited by 19 Related articles All 11 versions

Cavagnari, GiuliaSavaré, GiuseppeSodini, Giacomo Enrico

Dissipative probability vector fields and generation of evolution semigroups in Wasserstein spaces. (English) Zbl 1512.35696

Probab. Theory Relat. Fields 185, No. 3-4, 1087-1182 (2023).

MSC:  35R60 28A50 34A06 35Q49 47J20 47J35 49J40

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


2023

Zhou, DatongChen, JingWu, HaoYang, DinghuiQiu, Lingyun

The Wasserstein-Fisher-Rao metric for waveform based earthquake location. (English) Zbl 1515.86012

J. Comput. Math. 41, No. 3, 437-458 (2023).

MSC:  86A15 86A22 65K10

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


Gehér, György PálPitrik, JózsefTitkos, TamásVirosztek, Dániel

Quantum Wasserstein isometries on the qubit state space. (English) Zbl 1518.46048

J. Math. Anal. Appl. 522, No. 2, Article ID 126955, 17 p. (2023).

MSC:  46L89 81P47 81P45

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

 


Xu, GuanglongHu, ZhenshengCai, Jia

Wad-CMSN: Wasserstein distance-based cross-modal semantic network for zero-shot sketch-based image retrieval. (English) Zbl 1506.68177

Int. J. Wavelets Multiresolut. Inf. Process. 21, No. 2, Article ID 2250054, 19 p. (2023).

MSC:  68U10 68P20 68T05

PDFBibTeX



Shen, HaomingJiang, Ruiwei

Chance-constrained set covering with Wasserstein ambiguity. (English) Zbl 1512.90154

Math. Program. 198, No. 1 (A), 621-674 (2023).

MSC:  90C15 90C47 90C11

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 




Wang, Feng-YuZhu, Jie-Xiang

Limit theorems in Wasserstein distance for empirical measures of diffusion processes on Riemannian manifolds. (English. French summary) Zbl 1508.58010

Ann. Inst. Henri Poincaré, Probab. Stat. 59, No. 1, 437-475 (2023).

MSC:  58J65 60J60 60F05 60F10

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

<–—2023———2023———1690--    



Bistroń, R.Eckstein, M.Życzkowski, K.

Monotonicity of a quantum 2-Wasserstein distance. (English) Zbl 1519.81126

J. Phys. A, Math. Theor. 56, No. 9, Article ID 095301, 24 p. (2023).

MSC:  81P47 35A16 81R60 35Q49

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 




Moosmüller, CarolineCloninger, Alexander

Linear optimal transport embedding: provable Wasserstein classification for certain rigid transformations and perturbations. (English) Zbl 07655458

Inf. Inference 12, No. 1, 363-389 (2023).

MSC:  62-XX 90Bxx

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 



Liu, ZhengLoh, Po-Ling

Robust W-GAN-based estimation under Wasserstein contamination. (English) Zbl 07655457

Inf. Inference 12, No. 1, 312-362 (2023).

MSC:  62-XX

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 



Gaunt, Robert E.Li, Siqi

Bounding Kolmogorov distances through Wasserstein and related integral probability metrics. (English) Zbl 1510.60010

J. Math. Anal. Appl. 522, No. 1, Article ID 126985, 24 p. (2023).

Reviewer: Carlo Sempi (Lecce)

MSC:  60E05

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 


Zhao, FeiranYou, Keyou

Minimax Q-learning control for linear systems using the Wasserstein metric. (English) Zbl 1512.93154

Automatica 149, Article ID 110850, 4 p. (2023).

Reviewer: Kurt Marti (München)

MSC:  93E20

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

 Cited by 3 Related articles All 4 versions


2023


Duy, Vo Nguyen LeTakeuchi, Ichiro

Exact statistical inference for the Wasserstein distance by selective inference. Selective inference for the Wasserstein distance. (English) Zbl 07643834

Ann. Inst. Stat. Math. 75, No. 1, 127-157 (2023).

MSC:  62-XX

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

[CITATION] Exact statistical inference for the Wasserstein distance by selective inference Selective Inference for the Wasserstein Distance

VN Le Duy, I Takeuchi - ANNALS OF …, 2023 - … TIERGARTENSTRASSE 17, D …

Related articles



 

Xia, QinglanZhou, Bohan

The existence of minimizers for an isoperimetric problem with Wasserstein penalty term in unbounded domains. (English) Zbl 1508.49006

Adv. Calc. Var. 16, No. 1, 1-15 (2023).

Reviewer: Luca Lussardi (Torino)

MSC:  49J45 49Q20 49Q05 49J20 60B05

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

Cited by 8 Related articles All 10 versions


Santambrogio, Filippo

Sharp Wasserstein estimates for integral sampling and Lorentz summability of transport densities. (English) Zbl 1507.49039

J. Funct. Anal. 284, No. 4, Article ID 109783, 12 p. (2023).

Reviewer: Nicolò De Ponti (Trieste)

MSC:  49Q22 60B05 49J55 42B35

PDFBibTeX XMLCite

Full Text: DOI 




Huesmann, MartinMattesini, FrancescoTrevisan, Dario

Wasserstein asymptotics for the empirical measure of fractional Brownian motion on a flat torus. (English) Zbl 1508.60050

Stochastic Processes Appl. 155, 1-26 (2023).

MSC:  60G22 49Q22 60B05 60B10

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 




Novack, MichaelTopaloglu, IhsanVenkatraman, Raghavendra

Least Wasserstein distance between disjoint shapes with perimeter regularization. (English) Zbl 1501.49022

J. Funct. Anal. 284, No. 1, Article ID 109732, 26 p. (2023).

MSC:  49Q10 49J10 49Q20

PDFBibTeX XMLCite

Full Text: DOI 

 arXiv 

<–—2023———2023—— —1700--   



Kargin, TaylanHajar, JoudiMalik, VikrantHassibi, Babak

Wasserstein Distributionally Robust Regret-Optimal Control in the Infinite-Horizon. arXiv:2312.17376

Preprint, arXiv:2312.17376 [eess.SY] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 



Gan, LuyiningHuang, Huajun

Order relations of the Wasserstein mean and the spectral geometric mean. arXiv:2312.15394

Preprint, arXiv:2312.15394 [math.FA] (2023).

MSC:  15A42 15A45 15B48

BibTeX Cite

Full Text: arXiv 

OA License 



Chen, JunyuNguyen, Binh T.Soh, Yong Sheng

Semidefinite Relaxations of the Gromov-Wasserstein Distance. arXiv:2312.14572

Preprint, arXiv:2312.14572 [math.OC] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 



Jackiewicz, MarcelKasperski, AdamZielinski, Pawel

Wasserstein robust combinatorial optimization problems. arXiv:2312.12769

Preprint, arXiv:2312.12769 [math.OC] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 



Belbasi, RezaSelvi, ArasWiesemann, Wolfram

It’s All in the Mix: Wasserstein Machine Learning with Mixed Features. arXiv:2312.12230

Preprint, arXiv:2312.12230 [math.OC] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 


2023


Cheung, HangTai, Ho ManQiu, Jinniao

Viscosity Solutions of a class of Second Order Hamilton-Jacobi-Bellman Equations in the Wasserstein Space. arXiv:2312.10322

Preprint, arXiv:2312.10322 [math.OC] (2023).

MSC:  49L25

BibTeX Cite

Full Text: arXiv 

OA License 



Robertson, SawyerKohli, DhruvMishne, GalCloninger, Alexander

On a Generalization of Wasserstein Distance and the Beckmann Problem to Connection Graphs. arXiv:2312.10295

Preprint, arXiv:2312.10295 [math.OC] (2023).

MSC:  65K10 05C21 90C25 68R10 05C50

BibTeX Cite

Full Text: arXiv 

OA License 



Sabbagh, RalphMiangolarra, Olga MovillaGeorgiou, Tryphon T.

Wasserstein speed limits for Langevin systems. arXiv:2312.07788

Preprint, arXiv:2312.07788 [math-ph] (2023).

MSC:  82C31 82M60 82Cxx 80M60 93E03

BibTeX Cite

Full Text: arXiv 

OA License 



Wang, TaoGoldfeld, Ziv

Neural Entropic Gromov-Wasserstein Alignment. arXiv:2312.07397

Preprint, arXiv:2312.07397 [math.ST] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 



Pantazis, GeorgeFranci, BarbaraGrammatico, Sergio

On data-driven Wasserstein distributionally robust Nash equilibrium problems with heterogeneous uncertainty. arXiv:2312.03573

Preprint, arXiv:2312.03573 [math.OC] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 

 arXiv 

<–—2023———2023———1710--     

 



Jiang, YihengChewi, SinhoPooladian, Aram-Alexandre

Algorithms for mean-field variational inference via polyhedral optimization in the Wasserstein space. arXiv:2312.02849

Preprint, arXiv:2312.02849 [math.ST] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 



Daudin, SamuelJackson, JoeSeeger, Benjamin

Well-posedness of Hamilton-Jacobi equations in the Wasserstein space: non-convex Hamiltonians and common noise. arXiv:2312.02324

Preprint, arXiv:2312.02324 [math.AP] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 



Gao, YuanYip, Nung Kwan

Homogenization of Wasserstein gradient flows. arXiv:2312.01584

Preprint, arXiv:2312.01584 [math.AP] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 

Boufadène, SiwanVialard, François-Xavier

On the global convergence of Wasserstein gradient flow of the Coulomb discrepancy. arXiv:2312.00800

Preprint, arXiv:2312.00800 [math.AP] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 

Portinale, LorenzoRademacher, SimoneVirosztek, Dániel

Limit theorems for empirical measures of interacting quantum systems in Wasserstein space. arXiv:2312.00541

Preprint, arXiv:2312.00541 [math-ph] (2023).

MSC:  60F05 35Q40 81Q99 49Q22

BibTeX Cite

Full Text: arXiv 

OA License 

Related articles All 3 versions 

2023


Stéphanovitch, ArthurAamari, EddieLevrard, Clément

Wasserstein GANs are Minimax Optimal Distribution Estimators. arXiv:2311.18613

Preprint, arXiv:2311.18613 [math.ST] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 



Ren, YinuoXiao, TesiGangwani, TanmayRangi, AnshukaRahmanian, HolakouYing, LexingSanyal, Subhajit

Multi-Objective Optimization via Wasserstein-Fisher-Rao Gradient Flow. arXiv:2311.13159

Preprint, arXiv:2311.13159 [cs.LG] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 



Gao, XuefengNguyen, Hoang M.Zhu, Lingjiong

Wasserstein Convergence Guarantees for a General Class of Score-Based Generative Models. arXiv:2311.11003

Preprint, arXiv:2311.11003 [cs.LG] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 



Jiang, HuajianCui, Xiaojun

General distance-like functions on the Wasserstein space. arXiv:2311.10618

Preprint, arXiv:2311.10618 [math.AP] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 



Zemel, Yoav

Non-injectivity of Bures–Wasserstein barycentres in infinite dimensions. arXiv:2311.09385

Preprint, arXiv:2311.09385 [math.FA] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 


<–—2023———2023———1720--     



Hamm, KeatonMoosmüller, CarolineSchmitzer, BernhardThorpe, Matthew

Manifold learning in Wasserstein space. arXiv:2311.08549

Preprint, arXiv:2311.08549 [stat.ML] (2023).

MSC:  49Q22 41A65 58B20 53Z50

BibTeX Cite

Full Text: arXiv 

OA License 



Borda, Bence

Eigenvalues of random matrices from compact classical groups in Wasserstein metric. arXiv:2311.08343

Preprint, arXiv:2311.08343 [math.PR] (2023).

MSC:  60B20 60F05 60G55 49Q22

BibTeX Cite

Full Text: arXiv 

OA License 



Nietert, SloanGoldfeld, ZivShafiee, Soroosh

Outlier-Robust Wasserstein DRO. arXiv:2311.05573

Preprint, arXiv:2311.05573 [stat.ML] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 



Park, SangminSlepčev, Dejan

Geometry and analytic properties of the sliced Wasserstein space. arXiv:2311.05134

Preprint, arXiv:2311.05134 [math.AP] (2023).

MSC:  49Q22 46E27 60B10 44A12

BibTeX Cite

Full Text: arXiv 

OA License 



Alfakih, AbdoCheng, JefferyJung, Woosuk L.Moursi, Walaa M.Wolkowicz, Henry

Exact Solutions for the NP-hard Wasserstein Barycenter Problem using a Doubly Nonnegative Relaxation and a Splitting Method. arXiv:2311.05045

Preprint, arXiv:2311.05045 [math.OC] (2023).

MSC:  90C26 65K10 90C27 90C22

BibTeX Cite

Full Text: arXiv 

OA License 


2023


Ning, ChaoMa, Xutao

Data-Driven Bayesian Nonparametric Wasserstein Distributionally Robust Optimization. arXiv:2311.02953

Preprint, arXiv:2311.02953 [math.OC] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 

Fornasier, MassimoHeid, PascalSodini, Giacomo Enrico

Approximation Theory, Computing, and Deep Learning on the Wasserstein Space. arXiv:2310.19548

Preprint, arXiv:2310.19548 [math.OC] (2023).

MSC:  49Q22 33F05 46E36 28A33 68T07

BibTeX Cite

Full Text: arXiv 

OA License 

Cited by 1 Related articles All 4 versions 


Tschiderer, Bertram

Diffusion processes as Wasserstein gradient flows via stochastic control of the volatility matrix. arXiv:2310.18678

Preprint, arXiv:2310.18678 [math.PR] (2023).

MSC:  60H30 60G44 60J60 94A17

BibTeX Cite

Full Text: arXiv 

OA License 

Cheng, XiuyuanLu, JianfengTan, YixinXie, Yao

Convergence of flow-based generative models via proximal gradient descent in Wasserstein space. arXiv:2310.17582

Preprint, arXiv:2310.17582 [stat.ML] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 

prove the Kullback-Leibler (KL) guarantee of data generation by a JKO flow …

Cited by 4 Related articles All 2 versions 

Liu, LinshanMajka, Mateusz B.Monmarché, Pierre

L2-Wasserstein contraction for Euler schemes of elliptic diffusions and interacting particle systems. arXiv:2310.15897

Preprint, arXiv:2310.15897 [math.PR] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 

<–—2023———2023———1730—     


 



Cheung, HangQiu, JinniaoBadescu, Alexandru

A Viscosity Solution Theory of Stochastic Hamilton-Jacobi-Bellman equations in the Wasserstein Space. arXiv:2310.14446

Preprint, arXiv:2310.14446 [math.OC] (2023).

MSC:  49L25

BibTeX Cite

Full Text: arXiv 

OA License 



Ma, Jianyu

Absolute continuity of Wasserstein barycenters on manifolds with a lower Ricci curvature bound. arXiv:2310.13832

Preprint, arXiv:2310.13832 [math.PR] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 



Lim, Evan Unit

Quasi Manhattan Wasserstein Distance. arXiv:2310.12498

Preprint, arXiv:2310.12498 [cs.LG] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 



Neklyudov, KirillBrekelmans, RobTong, AlexanderAtanackovic, LazarLiu, QiangMakhzani, Alireza

A Computational Framework for Solving Wasserstein Lagrangian Flows. arXiv:2310.10649

Preprint, arXiv:2310.10649 [cs.LG] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 



Weighill, Thomas

Coarse embeddings of quotients by finite group actions via the sliced Wasserstein distance. arXiv:2310.09369

Preprint, arXiv:2310.09369 [math.MG] (2023).

MSC:  51F30

BibTeX Cite

Full Text: arXiv 

OA License 



2023


Hamm, KeatonKhurana, Varun

Lattice Approximations in Wasserstein Space. arXiv:2310.09149

Preprint, arXiv:2310.09149 [stat.ML] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 



Jourdain, BenjaminShao, Kexin

Maximal Martingale Wasserstein Inequality. arXiv:2310.08492

Preprint, arXiv:2310.08492 [math.PR] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 



Zhu, Jie-Xiang

Asymptotic behavior of Wasserstein distance for weighted empirical measures of diffusion processes on compact Riemannian manifolds. arXiv:2310.01670

Preprint, arXiv:2310.01670 [math.PR] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 



Xie, LiyanLiang, YuchenVeeravalli, Venugopal V.

Distributionally Robust Quickest Change Detection using Wasserstein Uncertainty Sets. arXiv:2309.16171

Preprint, arXiv:2309.16171 [math.ST] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 

Cited by 1 Related articles All 2 versions 

Conrado, FrancieleZhou, Detang

The Wasserstein distance for Ricci shrinkers. arXiv:2309.16017

Preprint, arXiv:2309.16017 [math.DG] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 

<–—2023———2023———1740—     



Rousseau, JudithScricciolo, Catia

Wasserstein convergence in Bayesian and frequentist deconvolution models. arXiv:2309.15300

Preprint, arXiv:2309.15300 [math.ST] (2023).

BibTeX Cite

Full Text: DOI 

 arXiv 

OA License 



Li, WuchenZhao, Jiaxi

Scaling Limits of the Wasserstein information matrix on Gaussian Mixture Models. arXiv:2309.12997

Preprint, arXiv:2309.12997 [math.PR] (2023).

MSC:  62B11 41A60

BibTeX Cite

Full Text: arXiv 

OA License 



Schär, PhilipStier, Thilo D.

A dimension-independent bound on the Wasserstein contraction rate of geodesic slice sampling on the sphere for uniform target. arXiv:2309.09097

Preprint, arXiv:2309.09097 [math.ST] (2023).

MSC:  60J05 60G50 65C05

BibTeX Cite

Full Text: arXiv 

OA License 



Ding, HaoFang, ShizanLi, Xiang-dong

Stochastic differential equations and stochastic parallel translations in the Wasserstein space. arXiv:2309.08702

Preprint, arXiv:2309.08702 [math.PR] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 


Fan, XiequanSu, Zhonggen

Rates of convergence in the distances of Kolmogorov and Wasserstein for standardized martingales. arXiv:2309.08189

Preprint, arXiv:2309.08189 [math.PR] (2023).

MSC:  60G42 60F05 60E15

BibTeX Cite

Full Text: arXiv 

OA License 

Related articles All 2 versions 


2023


Nodozi, ImanHalder, Abhishek

Wasserstein Consensus ADMM. arXiv:2309.07351

Preprint, arXiv:2309.07351 [math.OC] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 



Liu, TianleAustern, Morgane

Smooth Edgeworth Expansion and Wasserstein-p Bounds for Mixing Random Fields. arXiv:2309.07031

Preprint, arXiv:2309.07031 [math.PR] (2023).

MSC:  60F05

BibTeX Cite

Full Text: arXiv 

OA License 



Burchard, AlmutCarazzato, DavideTopaloglu, Ihsan

Maximizers of nonlocal interactions of Wasserstein type. arXiv:2309.05522

Preprint, arXiv:2309.05522 [math.AP] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 



Mimouni, D.Malisani, PZhu, J.de Oliveira, W.

Computing Wasserstein Barycenter via operator splitting: the method of averaged marginals. arXiv:2309.05315

Preprint, arXiv:2309.05315 [math.OC] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 



Bayraktar, ErhanEkren, IbrahimZhang, Xin

Comparison of viscosity solutions for a class of second order PDEs on the Wasserstein space. arXiv:2309.05040

Preprint, arXiv:2309.05040 [math.AP] (2023).

MSC:  58E30 90C05

BibTeX Cite

Full Text: arXiv 

OA License 

<–—2023———2023———1750—    



Wang, Feng-Yu

Wasserstein Convergence Rate for Empirical Measures of Markov Processes. arXiv:2309.04674

Preprint, arXiv:2309.04674 [math.PR] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 



Daudin, SamuelSeeger, Benjamin

A comparison principle for semilinear Hamilton-Jacobi-Bellman equations in the Wasserstein space. arXiv:2308.15174

Preprint, arXiv:2308.15174 [math.AP] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 



Khamlich, MoaadPichi, FedericoRozza, Gianluigi

Optimal Transport-inspired Deep Learning Framework for Slow-Decaying Problems: Exploiting Sinkhorn Loss and Wasserstein Kernel. arXiv:2308.13840

Preprint, arXiv:2308.13840 [math.NA] (2023).

MSC:  68T05 65D99 41A05 65N30 41A46 90C25

BibTeX Cite

Full Text: arXiv 

OA License 

Related articles 


Chambolle, AntoninDuval, VincentMachado, Joao Miguel

The Total Variation-Wasserstein Problem. arXiv:2308.10753

Preprint, arXiv:2308.10753 [math.AP] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 


Qu, YanlinBlanchet, JoseGlynn, Peter

Computable Bounds on Convergence of Markov Chains in Wasserstein Distance. arXiv:2308.10341

Preprint, arXiv:2308.10341 [math.PR] (2023).

MSC:  60J05

BibTeX Cite

Full Text: arXiv 

OA License 

Related articles All 2 versions 


2023


Neykov, MateyWasserman, LarryKim, IlmunBalakrishnan, Sivaraman

Nearly Minimax Optimal Wasserstein Conditional Independence Testing. arXiv:2308.08672

Preprint, arXiv:2308.08672 [math.ST] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 

Gehér, György PálHrušková, ArankaTitkos, TamásVirosztek, Dániel

Isometric rigidity of Wasserstein spaces over Euclidean spheres. arXiv:2308.05065

Preprint, arXiv:2308.05065 [math.MG] (2023).

MSC:  54E40 46E27 60A10 60B05

BibTeX Cite

Full Text: arXiv 

OA License 


Soner, H. MeteYan, Qinxin

Viscosity Solutions of the Eikonal Equation on the Wasserstein Space. arXiv:2308.04097

Preprint, arXiv:2308.04097 [math.OC] (2023).

MSC:  35D40 35Q89 49L12 49L25 60G99

BibTeX Cite

Full Text: arXiv 

OA License 



Golse, François

A Duality-Based Proof of the Triangle Inequality for the Wasserstein Distances. arXiv:2308.03133

Preprint, arXiv:2308.03133 [math.OC] (2023).

MSC:  49Q22 49N15 60B10

BibTeX Cite

Full Text: arXiv 

OA License 

Related articles All 13 versions


Deb, Nabarun; Kim, Young-Heon; Pal, Soumik; Schiebinger, Geoffrey

Wasserstein Mirror Gradient Flow as the limit of the Sinkhorn Algorithm. arXiv:2307.16421

Preprint, arXiv:2307.16421 [math.PR] (2023).

MSC:  49N99 49Q22 60J60

BibTeX Cite

Full Text: arXiv 

OA License 

<–—2023———2023———1760—    



Demirci, Yunus Emre; Yüksel, Serdar

Geometric Ergodicity and Wasserstein Continuity of Non-Linear Filters. arXiv:2307.15764

Preprint, arXiv:2307.15764 [math.PR] (2023).

MSC:  60J05 60J10 93E11 93E15

BibTeX Cite

Full Text: arXiv 

OA License 



Dalery, Maxime; Dusson, Genevieve; Ehrlacher, Virginie; Lozinski, Alexei

Nonlinear reduced basis using mixture Wasserstein barycenters: application to an eigenvalue problem inspired from quantum chemistry. arXiv:2307.15423

Preprint, arXiv:2307.15423 [math.NA] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 



Liu, Zhenxin; Wang, Zhe

Wasserstein convergence rates in the invariance principle for sequential dynamical systems. arXiv:2307.13913

Preprint, arXiv:2307.13913 [math.DS] (2023).

MSC:  37A50 60F17 37C99 60B10

BibTeX Cite

Full Text: arXiv 

OA License 



Chizat, Lénaïc; Vaškevičius, Tomas

Computational Guarantees for Doubly Entropic Wasserstein Barycenters via Damped Sinkhorn Iterations. arXiv:2307.13370

Preprint, arXiv:2307.13370 [math.OC] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 

Related articles All 2 versions 
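For orientation on the Sinkhorn iterations referenced above, here is a plain, undamped Sinkhorn sketch between two discrete measures; the damping and the barycenter structure analyzed in the preprint are not reproduced, the regularization and iteration count are arbitrary, and only NumPy is assumed.

import numpy as np

def sinkhorn(a, b, M, reg=0.05, n_iter=500):
    # Entropy-regularized OT plan between histograms a, b with cost matrix M.
    K = np.exp(-M / reg)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]

a = np.full(5, 0.2)
b = np.full(5, 0.2)
M = (np.arange(5)[:, None] - np.arange(5)[None, :]) ** 2.0
M = M / M.max()
P = sinkhorn(a, b, M)
print(P.sum(axis=1), P.sum(axis=0))   # both marginals close to a and b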


Herda, Maxime; Monmarché, Pierre; Perthame, Benoît

Wasserstein contraction for the stochastic Morris-Lecar neuron model. arXiv:2307.13362

Preprint, arXiv:2307.13362 [math.PR] (2023).

MSC:  35Q84 60J60 92B20

BibTeX Cite

Full Text: arXiv 

OA License 



 

2023

Ma, Shaojun; Hou, Mengxue; Ye, Xiaojing; Zhou, Haomin

High-dimensional Optimal Density Control with Wasserstein Metric Matching. arXiv:2307.13135

Preprint, arXiv:2307.13135 [math.OC] (2023).

MSC:  93E20 76N25 49L99

BibTeX Cite

Full Text: arXiv 

OA License 



Pritchard, Neil; Weighill, Thomas

Coarse embeddability of Wasserstein space and the space of persistence diagrams. arXiv:2307.12884

Preprint, arXiv:2307.12884 [math.MG] (2023).

MSC:  55N99 51F30

BibTeX Cite

Full Text: arXiv 

OA License 



Tanguy, Eloi

Convergence of SGD for Training Neural Networks with Sliced Wasserstein Losses. arXiv:2307.11714

Preprint, arXiv:2307.11714 [cs.LG] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 

Cited by 2 Related articles All 5 versions 
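Several entries in this block work with sliced Wasserstein losses; as a generic reference point, not code from those papers, the following Monte Carlo sketch estimates the sliced 2-Wasserstein distance between two equal-size point clouds, with the number of random projections chosen arbitrarily and only NumPy assumed.

import numpy as np

def sliced_wasserstein2(X, Y, n_proj=200, seed=0):
    # Average the closed-form 1-D W2^2 over random directions on the sphere.
    rng = np.random.default_rng(seed)
    theta = rng.normal(size=(n_proj, X.shape[1]))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)
    sw2 = 0.0
    for t in theta:
        px, py = np.sort(X @ t), np.sort(Y @ t)     # 1-D projections, quantile pairing
        sw2 += np.mean((px - py) ** 2)
    return np.sqrt(sw2 / n_proj)

X = np.random.default_rng(1).normal(size=(500, 3))
Y = np.random.default_rng(2).normal(size=(500, 3)) + 1.0
print(sliced_wasserstein2(X, Y))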


Tanguy, Eloi; Flamary, Rémi; Delon, Julie

Properties of Discrete Sliced Wasserstein Losses. arXiv:2307.10352

Preprint, arXiv:2307.10352 [stat.ML] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 



Mariani, Mauro; Trevisan, Dario

Wasserstein Asymptotics for Brownian Motion on the Flat Torus and Brownian Interlacements. arXiv:2307.10325

Preprint, arXiv:2307.10325 [math.PR] (2023).

MSC:  60D05 90C05 39B62 60F25 35J05

BibTeX Cite

Full Text: arXiv 

OA License 

<–—2023———2023———1770—    


 


Abdellatif, Mariem; Kuchling, Peter; Rüdiger, Barbara; Ventura, Irene

Wasserstein distance in terms of the Comonotonicity Copula. arXiv:2307.08402

Preprint, arXiv:2307.08402 [math.PR] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 
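The comonotonicity copula in the title above corresponds, in one dimension, to the optimal coupling, so W1 can be read directly off the quantile functions; the sketch below checks the sorted-sample formula against scipy.stats.wasserstein_distance on synthetic data, as an illustration of that classical fact rather than of the preprint's results (NumPy and SciPy assumed).

import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=5000)
y = rng.normal(1.0, 2.0, size=5000)

w1_quantile = np.mean(np.abs(np.sort(x) - np.sort(y)))   # comonotone (quantile) pairing
w1_scipy = wasserstein_distance(x, y)                    # SciPy's 1-D W1
print(w1_quantile, w1_scipy)                             # nearly identical values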



Han, Ruiyu

Sliced Wasserstein Distance between Probability Measures on Hilbert Spaces. arXiv:2307.05802

Preprint, arXiv:2307.05802 [math.MG] (2023).

MSC:  53B12

BibTeX Cite

Full Text: arXiv 

OA License 



Hajar, Joudi; Kargin, Taylan; Hassibi, Babak

Wasserstein Distributionally Robust Regret-Optimal Control under Partial Observability. arXiv:2307.04966

Preprint, arXiv:2307.04966 [math.OC] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 



Carlier, Guillaume; Chenchene, Enis; Eichinger, Katharina

Wasserstein medians: robustness, PDE characterization and numerics. arXiv:2307.01765

Preprint, arXiv:2307.01765 [math.OC] (2023).

BibTeX Cite

Full Text: arXiv 

OA License  



2023


Casado, Javier; Cuerno, Manuel; Santos-Rodríguez, Jaime

On the reach of isometric embeddings into Wasserstein type spaces. arXiv:2307.01051

Preprint, arXiv:2307.01051 [math.MG] (2023).

MSC:  49Q20 28A33 30L15 49Q22 53C21 55N31

BibTeX Cite

Full Text: arXiv 

OA License 



Champagnat, Nicolas; Strickler, Edouard; Villemonais, Denis

Uniform Wasserstein convergence of penalized Markov processes. arXiv:2306.16051

Preprint, arXiv:2306.16051 [math.PR] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 



Li, Wuchen; Georgiou, Tryphon T.

Minimal Wasserstein Surfaces. arXiv:2306.14363

Preprint, arXiv:2306.14363 [math.OC] (2023).

MSC:  49Q20 49Kxx 49Jxx

BibTeX Cite

Full Text: arXiv 

OA License 



Barrera, Gerardo; Högele, Michael A.

Ergodicity bounds for stable Ornstein-Uhlenbeck systems in Wasserstein distance with applications to cutoff stability. arXiv:2306.11616

Preprint, arXiv:2306.11616 [math.PR] (2023).

MSC:  60H10 37A25 37A30

BibTeX Cite

Full Text: arXiv 

OA License 



Arya, Shreya; Auddy, Arnab; Edmonds, Ranthony; Lim, Sunhyuk; Memoli, Facundo; Packer, Daniel

The Gromov-Wasserstein distance between spheres. arXiv:2306.10586

Preprint, arXiv:2306.10586 [math.MG] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 


<–—2023———2023———1780—    




Bai, Xingjian; He, Guangyi; Jiang, Yifan; Obloj, Jan

Wasserstein distributional robustness of neural networks. arXiv:2306.09844

Preprint, arXiv:2306.09844 [cs.LG] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 



Wu, Haochen; Li, Max Z.

Distributionally Robust Airport Ground Holding Problem under Wasserstein Ambiguity Sets. arXiv:2306.09836

Preprint, arXiv:2306.09836 [math.OC] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 
Cited by 1 Related articles All 2 versions 


Parker, Guy

Some Convexity Criteria for Differentiable Functions on the 2-Wasserstein Space. arXiv:2306.09120

Preprint, arXiv:2306.09120 [math.FA] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 


Séjourné, Thibault; Bonet, Clément; Fatras, Kilian; Nadjahi, Kimia; Courty, Nicolas

Unbalanced Optimal Transport meets Sliced-Wasserstein. arXiv:2306.07176

Preprint, arXiv:2306.07176 [cs.LG] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 

Cited by 5 Related articles All 3 versions 


Wu, Hao; Liu, Shu; Ye, Xiaojing; Zhou, Haomin

Parameterized Wasserstein Hamiltonian Flow. arXiv:2306.00191

Preprint, arXiv:2306.00191 [math.NA] (2023).

BibTeX Cite

Full Text: arXiv  

OA License 

 

2023


Rioux, Gabriel; Goldfeld, Ziv; Kato, Kengo

Entropic Gromov-Wasserstein Distances: Stability and Algorithms. arXiv:2306.00182

Preprint, arXiv:2306.00182 [math.OC] (2023).

BibTeX Cite

Full Text: arXiv 

OA License 



Dorlas, Tony C.; Savoie, Baptiste

On the Wasserstein distance and Dobrushin’s uniqueness theorem. arXiv:2305.19371

Preprint, arXiv:2305.19371 [math-ph] (2023).

MSC:  82B05 82B10 82B20 82B26 28C20 46G10 60B05 60B10

BibTeX Cite

Full Text: arXiv 

OA License  




[PDF] arxiv.org

Robust Network Pruning With Sparse Entropic Wasserstein Regression

L You, HV Cheng - arXiv preprint arXiv:2310.04918, 2023 - arxiv.org

This study unveils a cutting-edge technique for neural network pruning that judiciously addresses

noisy gradients during the computation of the empirical Fisher Information Matrix (FIM). …

Related articles All 2 versions 

Wasserstein Topology Transfer for Joint Distilling Embeddings of Knowledge Graph Entities and Relations

J Yu, Y Wu, S Liang - 2023 6th International Conference on Algorithms …, 2023 - dl.acm.org

… To address these issues, we propose Wasserstein Topology Transfer (termed as WTT), …

(OT) regularizers: Reversing Wasserstein Regularizer and Blending Wasserstein Regularizer, to …

 Related articles



[PDF] colorado.edu

On characterizing optimal Wasserstein GAN solutions for non-Gaussian data

YJ Huang, SC Lin, YC Huang, KH Lyu… - 2023 IEEE …, 2023 - ieeexplore.ieee.org

… In this paper, we focus on the characterization of optimal WGAN parameters … WGANs with

non-linear sigmoid and ReLU activation functions. Extensions to high-dimensional WGANs are …

 Related articles All 4 versions

<–—2023———2023———1790—    



… Non-parametric Approach to Generative Models: Integrating Variational Autoencoder and Generative Adversarial Networks using Wasserstein and Maximum Mean …

F Fazeli-Asl, MM Zhang - arXiv preprint arXiv:2308.14048, 2023 - arxiv.org

… By considering both the Wasserstein and MMD loss functions, our proposed model benefits

… Next, in Section 3, we introduce a probabilistic method for calculating the Wasserstein …

Related articles All 3 versions 


Image inpainting based on double joint predictive filtering and Wasserstein generative adversarial networks

Y Liu, Z Pan - Journal of Electronic Imaging, 2023 - spiedigitallibrary.org

… , a model based on Wasserstein GAN (WGAN) and double joint … Our method uses the

Wasserstein distance of the WGAN … do not overlap, the Wasserstein distance can still reflect their …

Related articles All 3 versions



2023 see arXiv 2021.   [PDF] arxiv.org

Bounding Wasserstein distance with couplings

N Biswas, L Mackey - Journal of the American Statistical …, 2023 - Taylor & Francis

… Entropy-regularized variants of the Wasserstein distance … upper bound estimates for

Wasserstein distances. The … upper bounds on the Wasserstein distance between the limiting …

Cited by 5 Related articles All 6 versions
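The elementary fact behind coupling-based bounds such as the one above is that any joint draw of (X, Y) with the prescribed marginals gives E|X - Y| >= W1; the toy sketch below, which is not the estimator of Biswas and Mackey, contrasts a deliberately suboptimal coupling with the optimal quantile coupling in one dimension (NumPy only).

import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x = rng.normal(size=n)        # X ~ N(0, 1)
y = 1.0 - x                   # Y ~ N(1, 1), coupled to X in a deliberately suboptimal way

coupling_bound = np.mean(np.abs(x - y))              # E|X - Y| under this coupling
exact_w1 = np.mean(np.abs(np.sort(x) - np.sort(y)))  # optimal (quantile) coupling in 1-D
print(coupling_bound, ">=", exact_w1)                # roughly 1.8 >= 1.0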


[PDF] semanticscholar.org

[PDF] 3D Magnetic Resonance Image Denoising using Wasserstein Generative Adversarial Network with Residual Encoder-Decoders and Variant Loss Functions

HA Sayed, AA Mahmoud… - International Journal of …, 2023 - pdfs.semanticscholar.org

… WGAN-SSIM model has also been developed using Structural Similarity Loss SSIM. The

proposed RED-WGAN-SSL and RED-WGAN… fine image better than RED-WGAN, so our models …

 Related articles All 2 versions 


[PDF] openreview.net

Duality and Sample Complexity for the Gromov-Wasserstein Distance

Z Zhang, Z Goldfeld, Y Mroueh… - … Optimal Transport and …, 2023 - openreview.net

The Gromov-Wasserstein (GW) distance, rooted in optimal transport (OT) theory, quantifies

dissimilarity between metric measure spaces and provides a framework for aligning …

Related articles All 2 versions 


2023

[PDF] arxiv.org

Wasserstein distance and entropic divergences between quantum states of light

S Paul, S Ramanan, V Balakrishnan… - arXiv preprint arXiv …, 2024 - arxiv.org

We assess the extent of similarity between pairs of probability distributions that arise naturally

in quantum optics. We employ the Wasserstein distance, the Kullback-Leibler divergence …

Related articles All 3 versions 



[PDF] sciencedirect.com

A Novel approach using WGAN-GP and conditional WGAN-GP for generating artificial thermal images of induction motor faults

S Hejazi, M Packianather, Y Liu - Procedia Computer Science, 2023 - Elsevier

… the utilisation of WGAN-GP and cWGANGP with health condition labels to create high-quality

thermal images artificially. The results demonstrate that the cWGAN-GP approach is …

Related articles All 3 versions
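Several entries in this block rely on WGAN-GP; for orientation, here is the standard gradient-penalty term in PyTorch, a generic sketch that assumes a critic module and batches of real and fake samples, and does not reproduce the architectures or conditioning of the cited works.

import torch

def gradient_penalty(critic, real, fake):
    # Penalize deviations of the critic's gradient norm from 1 on random interpolates.
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    interp = (eps * real + (1 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads = torch.autograd.grad(outputs=scores.sum(), inputs=interp, create_graph=True)[0]
    return ((grads.flatten(1).norm(2, dim=1) - 1.0) ** 2).mean()

# Critic loss: critic(fake).mean() - critic(real).mean() + lambda_gp * gradient_penalty(critic, real, fake)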



Incorporating Least-Effort Loss to Stabilize Training of Wasserstein GAN

F Li, L Wang, B Yang, P Guan - 2023 International Joint …, 2023 - ieeexplore.ieee.org

… To evaluate the effectiveness of the least-effort loss, we introduce it into Wasserstein GAN. …

the convergence properties and generation quality of WGAN. Furthermore, the behaviors of …

Related articles



[PDF] arxiv.org

Sharp bounds for the max-sliced Wasserstein distance

MT Boedihardjo - arXiv preprint arXiv:2403.00666, 2024 - arxiv.org

We obtain sharp upper and lower bounds for the expected max-sliced 1-Wasserstein distance

between a probability measure on a separable Hilbert space and its empirical distribution …

Related articles 
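To make the object in the title above concrete, the sketch below estimates the max-sliced 1-Wasserstein distance by maximizing the 1-D distance over a finite set of random directions; this is a crude surrogate for the supremum in the definition and has nothing to do with the sharp bounds of the preprint (NumPy only).

import numpy as np

def max_sliced_w1(X, Y, n_proj=500, seed=0):
    rng = np.random.default_rng(seed)
    theta = rng.normal(size=(n_proj, X.shape[1]))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)
    best = 0.0
    for t in theta:
        w1 = np.mean(np.abs(np.sort(X @ t) - np.sort(Y @ t)))   # 1-D W1 along direction t
        best = max(best, w1)
    return best

X = np.random.default_rng(3).normal(size=(1000, 5))
Y = X + np.array([2.0, 0.0, 0.0, 0.0, 0.0])    # shift along the first coordinate
print(max_sliced_w1(X, Y))                     # close to 2 for directions near e_1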


2023 see 2022 2021

Slosh: Set locality sensitive hashing via sliced-wasserstein embeddings

Y Lu, X Liu, A Soltoggio… - Proceedings of the IEEE …, 2024 - openaccess.thecvf.com

Learning from set-structured data is an essential problem with many applications in

machine learning and computer vision. This paper focuses on non-parametric and data-independent …

Cited by 5 Related articles All 4 versions 

<–—2023———2023———1800—  



[PDF] radioeng.cz

[PDF] A Wasserstein Distance-Based Cost-Sensitive Framework for Imbalanced Data Classification

R FENG, H JI, Z ZHU, L WANG - Radioengineering, 2023 - radioeng.cz

… Wasserstein distance WNi,P between Ni and the minority class set P, as well as the Wasserstein

… Figure 1 illustrates the construction of the Wasserstein distance-guided data clusters. …

Related articles All 4 versions 



Stereographic Spherical Sliced Wasserstein Distances

H Tran, Y Bai, A Kothapalli, A Shahbazi, X Liu… - arXiv preprint arXiv …, 2024 - arxiv.org

Comparing spherical probability distributions is of great interest in various fields, including

geology, medical domains, computer vision, and deep representation learning. The utility of …

Related articles All 2 versions 



[PDF] arxiv.org

Preservers of the -power and the Wasserstein means on  matrices

R Simon, D Virosztek - arXiv preprint arXiv:2307.07273, 2023 - arxiv.org

… A similar result occurred in another recent paper of Molnár [20] concerning the Wasserstein

mean. We prove the conjecture for I2-type algebras in regard of the Wasserstein mean, too. …

Related articles All 5 versions 

W-IRL: Inverse Reinforcement Learning via Wasserstein Metric

K Wu, C Li, F Wu, J Zhao - 2023 3rd International Conference …, 2023 - ieeexplore.ieee.org

… learning algorithm based on the Wasserstein metric (W-IRL). Initially, aiming at the problem

of hard training and slow convergence, this paper uses the Wasserstein metric as a loss …

Related articles

2023 see 2022

GeONet: a neural operator for learning the Wasserstein geodesic

A Gracyk, X Chen - 2023 - openreview.net

Optimal transport (OT) offers a versatile framework to compare complex data distributions in

a geometrically meaningful way. Traditional methods for computing the Wasserstein …

Cited by 1 Related articles All 4 versions 


2023

2023 see 2021

[HTML] sciencedirect.com

[HTML] Learning domain invariant representations by joint Wasserstein distance minimization

L Andéol, Y Kawakami, Y Wada, T Kanamori… - Neural Networks, 2023 - Elsevier

… We contribute several bounds relating the Wasserstein distance between the joint … With the

proposed theoretical grounding, one can show that (1) the Wasserstein distance between the …

 Cited by 3 Related articles All 7 versions

Adversarial Debiasing using Regularized Direct Entropy Minimization in Wasserstein GAN with Gradient Penalty

MN Amin, A Al Imran, FS Bayram… - 2023 26th International …, 2023 - ieeexplore.ieee.org

… By harnessing the power of adversarial training with the Wasserstein GAN enhanced by

gradient penalty (WGAN-GP), our approach strives to minimize gender-specific information in …

Related articles



[PDF] arxiv.org

Quantum Wasserstein GANs for State Preparation at Unseen Points of a Phase Diagram

W Jurasz, CB Mendl - arXiv preprint arXiv:2309.09543, 2023 - arxiv.org

Generative models and in particular Generative Adversarial Networks (GANs) have become

very popular and powerful data generation tool. In recent years, major progress has been …

Related articles All 3 versions 


[PDF] biorxiv.org

Toolbox for Gromov-Wasserstein optimal transport: Application to unsupervised alignment in neuroscience

M Sasaki, K Takeda, K Abe, M Oizumi - bioRxiv, 2023 - biorxiv.org

… interpreted as an extension of the well-researched Wasserstein … of the optimization, we can

obtain the Gromov-Wasserstein … 117 Gromov-Wasserstein optimal transport problem is to find …

Cited by 2 Related articles All 2 versions 
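For readers who want to try the Gromov-Wasserstein alignment discussed above, the following is a minimal usage sketch with the POT library and not the toolbox of Sasaki et al.; ot.gromov.gromov_wasserstein is assumed available, and the point clouds, uniform weights and square loss are arbitrary choices.

import numpy as np
import ot  # POT: Python Optimal Transport, assumed available

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 2))      # metric space 1: points in R^2
Y = rng.normal(size=(40, 5))      # metric space 2: points in R^5

C1 = ot.dist(X, X); C1 /= C1.max()    # intra-space distance matrices
C2 = ot.dist(Y, Y); C2 /= C2.max()
p = np.full(30, 1 / 30)               # uniform weights on each space
q = np.full(40, 1 / 40)

T = ot.gromov.gromov_wasserstein(C1, C2, p, q, loss_fun='square_loss')
print(T.shape)                        # (30, 40): a soft correspondence between the two spaces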


[PDF] arxiv.org

Coarse embeddings of quotients by finite group actions via the sliced Wasserstein distance

T Weighill - arXiv preprint arXiv:2310.09369, 2023 - arxiv.org

We prove that for a metric space $X$ and a finite group $G$ acting on $X$ by isometries, if $X$

coarsely embeds into a Hilbert space, then so does the quotient $X/G$. A crucial step …

Related articles All 2 versions 

<–—2023———2023———1810—



 Reflecting image-dependent SDEs in Wasserstein space and large deviation principle

X Yang - Stochastics, 2023 - Taylor & Francis

In this article, we study a class of reflecting stochastic differential equations whose coefficients

depend on image measures of solutions under a given initial measure in Wasserstein …

Cited by 1 Related articles All 2 versions


2023 see 2022

Wasserstein adversarial learning based temporal knowledge graph embedding

Y Dai, W Guo, C Eickhoff - Information Sciences, 2024 - Elsevier

… Meanwhile, we also apply a Gumbel-Softmax relaxation and the Wasserstein distance to

prevent vanishing gradient problems on discrete data; an inherent flaw in traditional generative …

Related articles

[PDF] arxiv.org

 

The Total Variation-Wasserstein Problem: A New Derivation of the Euler-Lagrange Equations

A Chambolle, V Duval, JM Machado - International Conference on …, 2023 - Springer

In this work we analyze the Total Variation-Wasserstein minimization problem. We propose

an alternative form of deriving optimality conditions from the approach of [ 8 ], and as result …

Related articles All 2 versions


[PDF] semarakilmu.com.my

Clustering Datasaurus Dozen Using Bottleneck Distance, Wasserstein Distance (WD) and Persistence Landscapes

RU Gobithaasan, KD Selvarajh… - Journal of Advanced …, 2024 - semarakilmu.com.my

Topological Data Analysis (TDA) is an emerging field of study that helps to obtain insights

from the topological information of datasets. Motivated by the emergence of TDA, we applied …

Related articles 



Multi-marginal Gromov–Wasserstein transport and barycentres

F Beier, R Beinert, G Steidl - … and Inference: A Journal of the IMA, 2023 - academic.oup.com

Gromov–Wasserstein (GW) distances are combinations of Gromov–Hausdorff and

Wasserstein distances that allow the comparison of two different metric measure spaces (mm-spaces)…

Cited by 1 Related articles All 2 versions

2023


DD-WGAN: Generative Adversarial Networks with Wasserstein Distance and Dual-Domain Discriminators for Low-Dose CT

X Bai, H Wang, S Yang, Z Wang… - 2023 IEEE 20th …, 2023 - ieeexplore.ieee.org

… We propose the DD-WGAN network to solve the optimization problem in equation (2).

The network takes three adjacent LDCT images as inputs, and the generator produces the …

Related articles

 

[PDF] arxiv.org

Exact convergence analysis for Metropolis–Hastings independence samplers in Wasserstein distances

A Brown, GL Jones - Journal of Applied Probability, 2024 - cambridge.org

… Wasserstein distances, which we refer to as just Wasserstein … –Hastings algorithms using

specific Wasserstein distances [7… different metrics used in Wasserstein distances for the MHI …

Cited by 4 Related articles All 4 versions


2023 see 2022

[PDF] springer.com

Strong posterior contraction rates via Wasserstein dynamics

E Dolera, S Favaro, E Mainini - Probability Theory and Related Fields, 2024 - Springer

… 2 we recall the definition of PCR, presenting an equivalent definition in terms of the Wasserstein

distance, and we outline the main steps of our approach to PCRs. Section 3 contains the …

Cited by 1 Related articles All 3 versions



Smooth Edgeworth Expansion and Wasserstein-p Bounds for Mixing Random Fields

T Liu, M Austern - arXiv preprint arXiv:2309.07031, 2023 - arxiv.org

In this paper, we consider $d$-dimensional mixing random fields $\bigl(X^{(n)}_{i}\bigr)_{i\in

T_{n}}$ and study the convergence of the empirical average $W_n:=\sigma_n^{-1} \sum_{i\…

Related articles All 2 versions 



Synthetic Batik Pattern Generator Using Wasserstein Generative Adversarial Network with Gradient Penalty

K Andriadi, Y Heryadi, W Suparta… - 2023 6th International …, 2023 - ieeexplore.ieee.org

… The objective function of WGAN is based on the Wasserstein distance (Earth Mover’s … with

WGAN, generator still can learn even if discriminator is doing a good job. the WGAN objective …

Related articles

<–—2023———2023———1820—


[PDF] researchgate.net

A Semi-supervised Approach For Brain Tumor Classification Using Wasserstein Generative Adversarial Network with Gradient Penalty

A Yeafi, M Islam, SK Mondal… - 2023 6th …, 2023 - ieeexplore.ieee.org

… and detection technique was employed using the WGAN-GP algorithm in this research work.

… for training the WGAN-GP model. The overall workflow of this paper is illustrated in Fig. 2. …

Related articles All 3 versions


2023 see 2022

Square Root Normal Fields for Lipschitz Surfaces and the Wasserstein Fisher Rao Metric

E Hartman, M Bauer, E Klassen - SIAM Journal on Mathematical Analysis, 2024 - SIAM

The square root normal field (SRNF) framework is a method in the area of shape analysis

that defines a (pseudo)distance between unparametrized surfaces. For piecewise linear …

Related articles All 2 versions



[PDF] researchsquare.com

Towards Analysis of Covariance Matrices through Bures-Wasserstein Distance

J Zheng, H Huang, Y Yi, Y Li, SC Lin - 2024 - researchsquare.com

… In this paper, we tackle these issues by introducing the Bures-Wasserstein (BW) distance

for analyzing positive semi-definite matrices. Both theoretical and computational aspects of …

Related articles All 2 versions 
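The Bures-Wasserstein distance appearing above has a closed form between Gaussians, W2^2 = ||m1 - m2||^2 + tr(C1 + C2 - 2(C1^{1/2} C2 C1^{1/2})^{1/2}); the sketch below implements that standard formula with SciPy's matrix square root and is a generic illustration, not the methodology of the cited manuscript.

import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2(m1, C1, m2, C2):
    # Closed-form 2-Wasserstein distance between N(m1, C1) and N(m2, C2).
    s1 = sqrtm(C1)
    cross = np.real(sqrtm(s1 @ C2 @ s1))
    bures2 = np.trace(C1 + C2 - 2.0 * cross)
    return np.sqrt(np.sum((m1 - m2) ** 2) + max(bures2, 0.0))

m1, m2 = np.zeros(3), np.ones(3)
C1 = np.eye(3)
C2 = np.diag([1.0, 2.0, 0.5])
print(gaussian_w2(m1, C1, m2, C2))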



A Data-Driven Wasserstein Distributionally Robust Weight-Based Joint Power Optimization for Dynamic Multi-WBAN

M Wang, F Hu, Z Ling, D Jia, S Li - GLOBECOM 2023-2023 …, 2023 - ieeexplore.ieee.org

… The distribution P of vector v is uncertain, we build an ambiguity set based on wasserstein

distance, which includes the family of all eligible probability distributions. The eligible …

Related articles



Graph Contrastive Learning with Wasserstein Distance for Recommendation

J Sun, J Li, Y Ma - 2023 8th International Conference on …, 2023 - ieeexplore.ieee.org

… samples subgraphs first, then uses a Wasserstein-based subgraph similarity measure to …

then build a contrastive loss based on Wasserstein distance to capture the distinction between …

Related articles All 2 versions  


2023


Point Cloud Registration based on Gaussian Mixtures and Pairwise Wasserstein Distances

S Steuernagel, A Kurda, M Baum - 2023 IEEE Symposium …, 2023 - ieeexplore.ieee.org

… This is achieved using an efficient approximation of the Gaussian Wasserstein distance,

which we find a suitable metric capturing the similarity between shape and position of two …

Related articles



Using Fourier Coefficients and Wasserstein Distances to Estimate Entropy in Time Series

S Perkey, A Carvalho… - 2023 IEEE 19th …, 2023 - ieeexplore.ieee.org

Time series from real data measurements are often noisy, under-sampled, irregularly

sampled, and inconsistent across long-term measurements. Typically, in analyzing these time …

Related articles All 3 versions

[PDF] openreview.net

WASSERSTEIN-GUIDED SYMBOLIC REGRESSION: MODEL DISCOVERY OF NETWORK DYNAMICS

R Dakhmouche, I Lunati, H Gorji - 2023 - openreview.net

Real-world complex systems often miss high-fidelity physical descriptions and are typically

subject to partial observability. Learning dynamics of such systems is a challenging and …

Related articles   


Wasserstein GAN Based Underwater Acoustic Channel Simulator

M Zhou, J Wang, H Sun - 2023 IEEE International Conference …, 2023 - ieeexplore.ieee.org

… , A Wasserstein generative adversarial network (WGAN)based … Then the CNN-based WGAN

is constructed with the earth-… , Xiamen, the proposed WGAN performs steadily in simulating …

Related articles


Wasserstein GANs are Minimax Optimal Distribution Estimators

A Stéphanovitch, E Aamari, C Levrard - arXiv preprint arXiv:2311.18613, 2023 - arxiv.org

We provide non asymptotic rates of convergence of the Wasserstein Generative Adversarial

networks (WGAN) estimator. We build neural networks classes representing the generators …

Cited by 1 Related articles All 2 versions 

<–—2023———2023———1830—  


[PDF] openreview.net

Wasserstein-2 Distance for Efficient Reconstruction of Stochastic Differential Equations

M Xia, X Li, Q Shen, T Chou - 2023 - openreview.net

… We provide an analysis of the squared Wasserstein-2 (W2) distance between two probability …

To demonstrate the practical use our Wasserstein distance-based loss function, we carry …

Related articles 


Bolstering IoT security with IoT device type Identification using optimized Variational Autoencoder Wasserstein Generative Adversarial Network

JS Sankar, S Dhatchnamurthy… - Network: Computation in …, 2024 - Taylor & Francis

… Wasserstein GAN is used because it can rectify the mode faint problem. Here, critic discriminator

scale Wasserstein distance Earth-Mover distance (EMD) betwixt real D K known device …

Related articles All 3 versions


[PDF] openreview.net

Entropic Gromov-Wasserstein Distances: Stability and Algorithms

G Rioux, Z Goldfeld, K Kato - NeurIPS 2023 Workshop Optimal …, 2023 - openreview.net

The Gromov-Wasserstein (GW) distance quantifies discrepancy between metric measure

spaces, but suffers from computational hardness. The entropic Gromov-Wasserstein (EGW) …

Cited by 1 Related articles 



[PDF] case.edu

Generalized Gromov Wasserstein Distance for Seed-Informed Network Alignment

M Li, M Koyutürk - International Conference on Complex Networks and …, 2023 - Springer

… framework for Gromov-Wasserstein based network alignment … The proposed “generalized

Gromov-Wasserstein distance” … proposed Generalized Gromov-Wasserstein-based Network …

Related articles All 2 versions



[HTML] github.io

[HTML] Wasserstein GAN

DDD Blog - 2024 - ddangchani.github.io

In a previous post we saw that the GAN (Generative Adversarial Network) objective ultimately reduces to minimizing the Jensen-Shannon divergence. In that case the density ratio discriminator $ D …

Related articles 


2023



[PDF] openreview.net

LMCC-MBC: Metric-Constrained Model-Based Clustering with Wasserstein-2 Distance of Gaussian Markov Random Fields

Z Wang, G Mai, K Janowicz, N Lao - 2023 - openreview.net

… We use Wasserstein-2 distance as dm. Then we compute the generalized semivariogram …

out that the choice of Wasserstein-2 distance and GMRF is not heuristic. In fact, Wasserstein-2 …

Related articles 



[PDF] hal.science

An Innovative Framework for Static and Dynamic Clustering using Histogram Models and Wasserstein Distance over Sliding Windows

X Qian, G Cabanes, P Rastin, MA Guidani… - Available at SSRN …, 2023 - papers.ssrn.com

In this article, we present an innovative clustering framework designed for large datasets and

real-time data streams which uses a sliding window and histogram model to address the …

Related articles All 6 versions 



EC-WGAN: Enhanced Conditional and Wasserstein GAN for Fault Samples Augmentation

L Li, Z Li, X Wang, J Gong - 2023 6th International Conference …, 2023 - ieeexplore.ieee.org

… and Wasserstein GAN (EC-GAN) … Wasserstein distance. Moreover, the gradient penalty is

applied for keeping continuous of the Lipschitz function and gradient vanishing in Wasserstein …

Related articles



[PDF] arxiv.org

Improved rates of convergence for the multivariate Central Limit Theorem in Wasserstein distance

T Bonis - arXiv preprint arXiv:2305.14248, 2023 - arxiv.org

… We provide new bounds for rates of convergence of the multivariate Central Limit

Theorem in Wasserstein distances of order p ≥ 2. In particular, we obtain an asymptotic …

Related articles All 3 versions 



[PDF] openreview.net

Constrained Reinforcement Learning as Wasserstein Variational Inference: Formal Methods for Interpretability

Y WANG, D Boyle - 2023 - openreview.net

… In this section, we extend the Wasserstein distance into the variational inference, and present

the derivation of how we transform the GSWD between the two posteriors to the optimality …

Related articles

<–—2023———2023———1840—



[PDF] mit.edu

Proximal Gradient Algorithms for Gaussian Variational Inference: Optimization in the Bures–Wasserstein Space

MZ Diao - 2023 - dspace.mit.edu

… Wasserstein geometry and introduce the Bures–Wasserstein … optimization versus

Bures–Wasserstein optimization, laying … to the setting of the Bures–Wasserstein space in order to …

Related articles All 2 versions 


[PDF] uni-goettingen.de

[PDF] Wasserstein-like Distance on Vector Fields

V Sommer - 2023 - ediss.uni-goettingen.de

… work with the Wasserstein-2-distance over the Wasserstein-1-distance. Numerically, the

problem of finding a Wasserstein geodesic is referred to as the dynamical Wasserstein problem, …

Related articles All 2 versions 


2023


Distributed Robust Particle Filter Based on Wasserstein Metric

H Wang, A Yang, Q Sun - 2023 3rd International Conference on …, 2023 - ieeexplore.ieee.org

In this paper, for nonlinear systems with uncertain interference factors, particle filter can no

longer accurately estimate the state when both process noise and measurement noise are …

Related articles



[PDF] researchgate.net

Wasserstein Distance and Realized Volatility

H Gobato Souto, A Moradi - Available at SSRN 4539628, 2023 - papers.ssrn.com

This research proposes a novel loss function function for neural network models that

explores the topological structure of stock realized volatility (RV) data through the addition of …

Related articles All 3 versions 



[PDF] openreview.net

Fast Stochastic Kernel Approximation by Dual Wasserstein Distance Method

Z Lin, A Ruszczynski - 2023 - openreview.net

We introduce a generalization of the Wasserstein metric, originally designed for probability

measures, to establish a novel distance between probability kernels of Markov systems. We …

Related articles 



Emotion Recognition Based on Wasserstein Distance Fusion of Audiovisual Features

N Ai, S Zhang, N Yi, Z Ma - 2023 6th International Conference …, 2023 - ieeexplore.ieee.org

… To address these challenges, we introduce a model based on the Wasserstein distance.

This model extracts meaningful features from each modality by minimizing the Wasserstein …

Related articles



[PDF] openreview.net

Fourier-Based Bounds for Wasserstein Distances and Their Implications in Computational Inversion

W Hong, VA Kobzar, K Ren - NeurIPS 2023 Workshop Optimal …, 2023 - openreview.net

… We focus on the case when this metric is the Wasserstein-p (Wp) distance between

probability measures as well as its generalizations by Piccoli et al., for unbalanced measures, …

Related articles All 3 versions  


[HTML] arxiv.org

A Proof of the Central Limit Theorem Using the 2-Wasserstein Metric

CW Chin - arXiv preprint arXiv:2402.17085, 2024 - arxiv.org

… Wasserstein generative adversarial networks (WGANs) in machine learning, for example.

There is a p-Wasserstein … In this note, we use the 2-Wasserstein metric to prove the central limit …

Related articles 


[PDF] openreview.net

An Enhanced Gromov-Wasserstein Barycenter Method for Graph-based Clustering

C Liu, Z Zhang - 2023 - openreview.net

Optimal Transport (OT) recently has gained remarkable success in machine learning. These

methods based on the Gromov-Wasserstein (GW) distance have proven highly effective in …

Related articles 


[HTML] cyberleninka.ru

Full View

[HTML] Wasserstein and weighted metrics for multidimensional Gaussian distributions

MY Kelbert, Y Suhov - Известия Саратовского университета …, 2023 - cyberleninka.ru

We present a number of lower and upper bounds for Lévy-Prokhorov, Wasserstein, Fréchet,

and Hellinger distances between probability distributions of the same or different dimensions…

Related articles All 6 versions 

<–—2023———2023———1850— 



Bayesian Inference for Markov Kernels Valued in Wasserstein Spaces

K Eikenberry - 2023 - search.proquest.com

In this work, the author analyzes quantitative and structural aspects of Bayesian inference

using Markov kernels, Wasserstein metrics, and Kantorovich monads. In particular, the author …

Related articles All 2 versions



[PDF] openreview.net

Network Regression with Wasserstein Distances

A Zalles, KM Hung, AE Finneran… - … Optimal Transport and …, 2023 - openreview.net

We study the problem of network regression, where the graph topology is inferred for unseen

predictor values. We build upon recent developments on generalized regression models …

Related articles 



GWS: Rotation object detection in aerial remote sensing images based on Gauss–Wasserstein scattering

L Gan, X Tan, L Hu - AI Communications, 2023 - content.iospress.com

… a new regression loss function named Gauss–Wasserstein scattering (GWS). First, the …

Gaussian distributions are calculated as the Wasserstein scatter; By analyzing the gradient of …

Related articles



Incipient Fault Detection of CRH Suspension system Based on PRPCA and Wasserstein Distance

K Fang, Y Wu, Y Zhou, Z Zhu… - 2023 42nd Chinese …, 2023 - ieeexplore.ieee.org

As an important part of CRH(China Railway High-speed) trains, the stability and stationarity

of a suspension system is of great significance to the vehicle system. Based on the …

Related articles



[PDF] stanford.edu

Entropic Regularization in Wasserstein Gans: Robustness, Generalization and Privacy

D Reshetova - 2023 - search.proquest.com

… In this thesis, we study the consequences of regularizing Wasserstein GANs with entropic …

Wasserstein distance promotes sparsification of the solution while replacing the Wasserstein …

Related articles All 2 versions


 

2023


 

[PDF] openreview.net

Local Differential Privacy with Entropic Wasserstein Distance

D Reshetova, WN Chen, A Ozgur - 2023 - openreview.net

… • LDP Framework for Wasserstein GANs: We propose a novel modification to the widely

adopted Wasserstein GAN framework that enables it to learn effectively from LDP samples with …

Related articles 



Semi-discrete Gromov-Wasserstein distances: Existence of Gromov-Monge Maps and Statistical Theory

G Rioux, Z Goldfeld, K Kato - NeurIPS 2023 Workshop Optimal …, 2023 - openreview.net

The Gromov-Wasserstein (GW) distance serves as a discrepancy measure between metric …

As is the case with the standard Wasserstein distance, the rate we derive in the semi-discrete …

Related articles 



[HTML] risk.net

Neural stochastic differential equations for conditional time series generation using the Signature-Wasserstein-1 metric

PD Lozano, TL Bagén, J Vives - Journal of Computational Finance, 2023 - papers.ssrn.com

… However, in our case we will focus on the Wasserstein GAN formulation, where the adversarial

network (called the “critic”) aims at approximating the Wasserstein-1 distance (Arjovsky et …

Related articles All 2 versions 



[PDF] ethz.ch

[PDF] Gromov–Wasserstein Alignment: Statistical and Computational Advancements via Duality

Z Goldfeld - International Zurich Seminar on Information …, 2024 - research-collection.ethz.ch

The Gromov-Wasserstein (GW) distance quantifies dissimilarity between metric measure (mm)

spaces and provides a natural correspondence between them. As such, it serves as a …

Related articles All 2 versions 



[HTML] arxiv.org

Wasserstein- distance and nonuniform Berry-Esseen bound for a supercritical branching process in a random environment

H Wu, X Fan, Z Gao, Y Ye - arXiv preprint arXiv:2307.01084, 2023 - arxiv.org

Let $ (Z_{n})_{n\geq 0} $ be a supercritical branching process in an independent and

identically distributed random environment. We establish an optimal convergence rate in the …

Related articles All 3 versions 

<–—2023———2023———1860—



[PDF] sns.it

[PDF] Long-Time Asymptotics of the Sliced-Wasserstein Flow

G Cozzi, F Santambrogio - 2024 - cvgmt.sns.it

… facts about optimal transport and Wasserstein spaces, then we present the definition and

first properties of the slicedWasserstein distance and of the sliced-Wasserstein flow. In section …

Related articles 



A Novel Transfer Learning Method for Robot Bearing Fault Diagnosis Based on Deep Convolutional Residual Wasserstein Adversarial Network

B Pan, X Xiong, H Hu, J He, S Yang - International Conference on …, 2023 - Springer

… This paper replaces JS divergence with Wasserstein distance. The advantage of Wasserstein

distance is that it can measure the … The expression of Wasserstein distance is shown as: …

Related articles All 2 versions



[PDF] arxiv.org

Right Mean for the α − z Bures-Wasserstein Quantum Divergence

M Jeong, J Hwang, S Kim - Acta Mathematica Scientia, 2023 - Springer

… of α−z Bures-Wasserstein quantum divergences to given positive … Moreover, we verify the

trace inequality with the Wasserstein … We also show the trace inequality with the Wasserstein …

Cited by 2 Related articles All 6 versions

 


Parsimonious Wasserstein Text-mining

S Gadat, S Villeneuve - 2023 - publications.ut-capitole.fr

This document introduces a parsimonious novel method of processing textual data based on

the NMF factorization and on supervised clustering with Wasserstein barycenters to reduce …

Related articles All 5 versions 



Parameter estimation for many-particle models from aggregate observations: A Wasserstein distance based sequential Monte Carlo sampler

C Cheng, L Wen, J Li - arXiv preprint arXiv:2303.14950, 2023 - arxiv.org

… Wasserstein distance based sequential Monte Carlo sampler to solve the problem: the

Wasserstein … To address the issue, we propose to use the Wasserstein distance [16] as a distance …

Related articles All 2 versions 


2023 


[PDF] essopenarchive.org

Generation of Heterogeneous Pore-Space Images Using Improved Pyramid Wasserstein Generative Adversarial Networks

L Zhu, B Bijeljic, MJ Blunt - Authorea Preprints, 2023 - essopenarchive.org

We use Wasserstein Generative Adversarial Networks to learn and integrate multi-scale

features in segmented three-dimensional images of porous materials, enabling the dependable …

Related articles All 4 versions 


A Dual Approach to Wasserstein-Robust Counterfactuals

J Gu, T Russell - Available at SSRN 4517842, 2023 - papers.ssrn.com

We study the identification of scalar counterfactual parameters in partially identified structural

models, paying particular attention to relaxing parametric distributional assumptions on the …

Related articles 


 

Detect Rumors by CycleGAN with Wasserstein Distance

Z Hongzhi, D Zhiping, D Fangmin… - Data Analysis and …, 2024 - manu44.magtech.com.cn

… [Objective] By CycleGAN and improved generation loss through Wasserstein distance improving

… We use the Wasserstein distance to upgrade the cycle consistency loss and identity loss, and …

Related articles



Improved Indoor Localization Algorithm Combining K-means Clustering Algorithm And Wasserstein Generative Adversarial Network Algorithm

C Li, Y Mao - 2023 19th International Conference on Natural …, 2023 - ieeexplore.ieee.org

… K-means clustering and Wasserstein Generative Adversarial Network(WGAN). Firstly, K-… is

located, and the fingerprint features are expanded by WGAN to the cluster in which the point is …

Related articles


2023 see 2022

[PDF] arxiv.org

Decentralized convex optimization on time-varying networks with application to Wasserstein barycenters

O Yufereva, M Persiianov, P Dvurechensky… - Computational …, 2024 - Springer

… Inspired by recent advances in distributed algorithms for approximating Wasserstein …

efficiency of the proposed algorithm when applied to the Wasserstein barycenter problem. …

Related articles All 7 versions

<–—2023———2023———1870— 




[PDF] purdue.edu

Energy-Dissipative Methods in Numerical Analysis, Optimization and Deep Neural Networks for Gradient Flows and Wasserstein Gradient Flows

S Zhang - 2023 - hammer.purdue.edu

This thesis delves into the development and integration of energy-dissipative methods, with

applications spanning numerical analysis, optimization, and deep neural networks, primarily …

Related articles All 2 versions 




[PDF] openreview.net

Sample Complexity Bounds for Estimating the Wasserstein Distance under Invariances

B Tahmasebi, S Jegelka - 2023 - openreview.net

… of estimating the Wasserstein distance under group invariances… on the convergence rate

of the Wasserstein distance W1(., .) … • We also study the convergence rate of the Wasserstein …

Related articles



[PDF] hal.science

Wasserstein gradient flow of the Fisher information from a non-smooth convex minimization viewpoint

G Carlier, JD Benamou, D Matthes - 2023 - inria.hal.science

Motivated by the Derrida-Lebowitz-Speer-Spohn (DLSS) quantum drift equation, which is

the Wasserstein gradient flow of the Fisher information, we study in details solutions of the …

Related articles All 4 versions 


 

Wasserstein domain adversarial neural networks for grade prediction in zinc flotation process

X Li, L Cen - Third International Conference on Control and …, 2023 - spiedigitallibrary.org

… To solve the deficiency of the binary classification, a wasserstein domain adversarial

neural networks (W-DANN) is proposed, which calculates the domain loss with wasserstein …

Related articles All 3 versions



[PDF] imca.edu.pe

[PDF] Differential Inclusions on Wasserstein spaces

H Frankowska - isora2023.imca.edu.pe

… of Wasserstein spaces, ie metric spaces of Borel probability measures endowed with the

Wasserstein … inclusions to the setting of general Wasserstein spaces. Anchoring our analysis on …

Related articles 


2023


ating Information Processing of Wasserstein Distances for Spike Data Analysis

원종현 - 2023 - repository.hanyang.ac.kr

… Wasserstein distance for spike data analysis. Recently, it has been shown that the Wasserstein

… data, we demonstrate the usefulness of the Wasserstein distance for spike data analysis.|…

Related articles 



Adaptive Motions of ADS in Wasserstein Metric Space

Y Sugiyama - Dynamics of Asymmetric Dissipative Systems: From …, 2023 - Springer

In Chap. 8 we have studied the variations of macroscopic formation organized by non-equilibrium

motions of many particles by asymmetric dissipative interactions. The interaction …

Related articles


 

[PDF] hal.science

A minimality property of the value function in optimal control over the Wasserstein space

C Hermosilla, A Prost - 2024 - hal.science

An optimal control problem with (possibly) unbounded terminal cost is considered in P2(Rd),

the space of Borel probability measures with finite second moment. We consider the metric …

Related articles All 2 versions 



Stochastic Scenario Generation for Wind Power and Photovoltaic System Based on Wasserstein Distance Metric

X Ling, B Yang, H Chen, W Xu… - 2023 6th Asia …, 2023 - ieeexplore.ieee.org

… In this paper, we utilize a Wasserstein distance-based scenario generation method commonly

applicable to wind and photovoltaic (PV) power is proposed, which use discretized optimal …

Related articles



[PDF] researchgate.net

[PDF] Scaling limit of Wasserstein metric on Gaussian mixture models

J Zhao, W Li - researchgate.net

… • Apply numerical simulations of the partial differential equation with structure preserving

properties inherited from Wasserstein gradient flow. Computational cost is low as the …

Related articles 

<–—2023———2023———1880—



[PDF] openreview.net

Distributionally Robust Federated Learning with Wasserstein Barycenter

W Li, S Fu, Y Pang - The Second Tiny Papers Track at ICLR 2024 - openreview.net

… of the Federated labeled Wasserstein barycenter. Then we conduct a simple comparison

to show the validation performance based on the Wasserstein barycenter and Euclidean …

 Related articles 



Small object detection for mobile behavior recognition based on Wasserstein distance and partial convolution

B Cai, L Kong, Y Zhou, L Dong… - … Imaging and Multimedia …, 2023 - spiedigitallibrary.org

While mobile phones offer convenience in our daily lives, they also introduce associated

security risks. For instance, in high-security settings like confidential facilities, casual mobile …

Related articles All 3 versions


[PDF] openreview.net

Functional Wasserstein Bridge Inference for Bayesian Deep Learning

M Wu, J Xuan, J Lu - 2023 - openreview.net

… Functional Wasserstein Bridge Inference (FWBI), which can assign meaningful functional priors and …

Related articles 

 


2023 see 2022

[PDF] esaim-proc.org

Wasserstein model reduction approach for parametrized flow problems in porous media

B Battisti, T Blickhan, G Enchery… - ESAIM: Proceedings …, 2023 - esaim-proc.org

… based on the use of Wasserstein barycenters, which was originally … Note that the use of Wasserstein …

Cited by 2 Related articles All 5 versions



Wasserstein Generative Adversarial Networks are Minimax Optimal Distribution Estimators

A Stéphanovitch, E Aamari, C Levrard - 2023 - hal.science

We provide non asymptotic rates of convergence of the Wasserstein Generative Adversarial

networks (WGAN) estimator. We build neural networks classes representing the generators …

Cited by 1 Related articles All 3 versions 


 2023


Wasserstein Autoencoder Based End-to-End Learning Strategy of Geometric Shaping for an OAM Mode Division Multiplexing IM/DD Transmission

Z Cheng, R Gao, Q Xu, F Wang… - … Meetings (ACP/POEM), 2023 - ieeexplore.ieee.org

… Compared with traditional AE, we add the Maximum Mean Discrepancy (MMD) regularization

term DZ in the loss function, introducing the Wasserstein Autoencoder proposed in [9] to perform …

Related articles

 


[PDF] osaka-u.ac.jp

[PDF] Wasserstein distance on solutions to stochastic differential equations with jumps

A Takeuchi - 2023 - math.sci.osaka-u.ac.jp

… Wasserstein distance between two jump processes determined by stochastic differential

equations in Rd or the Riemannian manifold M. As an application, the study on the Wasserstein …

Related articles All 2 versions 


[PDF] hal.science

[PDF] EQUIVALENCE BETWEEN STRICT VISCOSITY SOLUTION AND VISCOSITY SOLUTION IN THE WASSERSTEIN SPACE AND REGULAR EXTENSION OF …

C JIMENEZ - 2023 - hal.science

This article aims to build bridges between several notions of viscosity solution of first order

dynamic Hamilton-Jacobi equations. The first main result states that, under assumptions, the …

Related articles All 9 versions 


[PDF] uncg.edu

[BOOK] Injective and Coarse Embeddings of Persistence Diagrams and Wasserstein Space

N Pritchard - 2023 - search.proquest.com

… -Wasserstein metric uniformly coarsely embed into the space of persistence diagrams with the

p-Wasserstein … of persistence diagrams embed into Wasserstein space (Proposition 2.4.5). …

Related articles All 2 versions 



Wasserstein Barycenter-based Evolutionary Algorithm for the optimization of sets of points

B Sow, R Le Riche, J Pelamatti, M Keller… - PGMO DAYS …, 2023 - hal-emse.ccsd.cnrs.fr

… , we rely on the Wasserstein barycenter [2]. The Wasserstein barycenter being a contracting

… We implement the Wasserstein barycenter computation using the POT library [1]. These …

Related articles All 2 versions 

<–—2023———2023———1890—



 A novel sEMG data augmentation based on WGAN-GP

F Coelho, MF Pinto, AG Melo, GS Ramos… - Computer methods in …, 2023 - Taylor & Francis

… WGAN-GP focus is to obtain stable models during the training phase. However, to the best of

our knowledge, no works in the literature used WGAN… network called WGAN with a gradient …

Cited by 1 Related articles All 7 versions



Subgraph Matching via Fused Gromov-Wasserstein Distance

W Pan - 2023 - repository.tudelft.nl

… subgraph matching frameworks using the Fused Gromov-Wasserstein (FGW) distance, namely

the … a sliding window framework and Wasserstein pruning to enhance the performance, …

Related articles 



DWGAN: a dual Wasserstein-autoencoder-based generative adversarial network for image super-resolution

H Gao, Z Zeng - … Conference on Image, Signal Processing, and …, 2023 - spiedigitallibrary.org

… To address this challenge, we propose DWGAN: a dual Wasserstein-Autoencoder based …

models the latent distribution with Wasserstein Autoencoders and the adversarial training in …

Related articles All 3 versions


Application of the Wasserstein Distance to identify inter-crystal scatter in a light-sharing depth-encoding PET detector

EW Petersen, A Goldan - 2023 IEEE Nuclear Science …, 2023 - ieeexplore.ieee.org

… centroid position of the event and a Wasserstein distanced-based method that incorporates

… However, the Wasserstein-based classification outperformed the contour method, indicating …

Related articles



The Kantorovich-Wasserstein distance for spatial statistics: The Spatial-KWD library

F Ricciato, S Gualandi - Statistical Journal of the IAOS - content.iospress.com

In this paper we present Spatial-KWD, a free open-source tool for efficient computation of the

Kantorovich-Wasserstein Distance (KWD), also known as Earth Mover Distance, between …

 Related articles


2023


Wasserstein Distributionally Robust Optimization in Multivariate Ridge Regression

W Liu, C Fang - 2023 3rd International Conference on Frontiers …, 2023 - ieeexplore.ieee.org

Distributionally robust optimization is an effective method to deal with uncertainty. In this

paper, we apply distributionally robust optimization methods to multivariate ridge regression …

Related articles All 2 versions


2023 see 2022

[PDF] hal.science

On the complexity of the data-driven Wasserstein distributionally robust binary problem

H Kim, D Watel, A Faye, H Cédric - 2023 - hal.science

… In this paper, we use a data-driven Wasserstein … set we use is a Wasserstein ball which is,

using the Wasserstein metric, the … DRO is called data-driven Wasserstein DRO. In the case of a …

Cited by 1 Related articles All 3 versions 

Multi-target Weakly Supervised Regression Using Manifold Regularization and Wasserstein Metric

K Kalmutskiy, L Cherikbayeva… - … Optimization Theory and …, 2023 - books.google.com

In this paper, we consider the weakly supervised multi-target regression problem where the

observed data is partially or imprecisely labelled. The model of the multivariate normal …

Related articles



[PDF] americaserial.com

ENHANCING POWER FLOW DATASETS WITH WASSERSTEIN-GRADIENT FLOW-BASED SAMPLE REPLENISHMENT

Z Wei - Michigan Journal of Engineering and Technology, 2023 - americaserial.com

The application of artificial intelligence (AI) methods in power grid analysis necessitates the

utilization of power flow datasets for model training. Presently, power flow data sources …

Related articles 

 "Noncommutative Wasserstein metrics" 7 May 10h30 ...



On Wasserstein Distributionally Robust Mean Semi-Absolute Deviation Portfolio Model: Robust Selection and Efficient Computation

W Zhou, YJ Liu - arXiv preprint arXiv:2403.00244, 2024 - arxiv.org

… Wasserstein distributionally robust mean-lower semiabsolute deviation (DR-MLSAD)

model, where the ambiguity set is a Wasserstein … the size of the Wasserstein radius for DR-MLSAD …

Related articles 



[PDF] arxiv.org

The Ultrametric Gromov–Wasserstein Distance

F Mémoli, A Munk, Z Wan, C Weitkamp - Discrete & Computational …, 2023 - Springer

We investigate compact ultrametric measure spaces which form a subset U^w …
\usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \…

Cited by 9 Related articles All 6 versions

<–—2023———2023——2000—



[HTML] arxiv.org

Enhancing selectivity using Wasserstein distance based reweighing

P Worah - arXiv preprint arXiv:2401.11562, 2024 - arxiv.org

… Wasserstein distance computation that we found in this area were [8] and [16], which focus

on exact solution of the Wasserstein … but efficient computation of Wasserstein distance. That is

Related articles All 2 versions 



[PDF] projecteuclid.org

Wasserstein-p bounds in the central limit theorem under local dependence

T Liu, M Austern - Electronic Journal of Probability, 2023 - projecteuclid.org

… recent works established optimal error bounds under the Wasserstein-p distance (with p

≥ … , we derive optimal rates in the CLT for the Wasserstein-p distance. Our proofs rely on …

Related articles All 4 versions



FLWGAN: Federated Learning with Wasserstein Generative Adversarial Network for Brain Tumor Segmentation

D Peketi, V Chalavadi, CK Mohan… - 2023 International Joint …, 2023 - ieeexplore.ieee.org

… real data and the data generated by WGAN-GP. Finally, all the … brain tumor segmentation

using Wasserstein GANs (FLWGAN) … 2) We propose a modification to the standard WGAN-GP …

 Cited by 7 Related articles


2023 see 2022

[PDF] arxiv.org

Wasserstein coupled particle filter for multilevel estimation

M Ballesio, A Jasra, E von Schwerin… - Stochastic Analysis and …, 2023 - Taylor & Francis

… squared Wasserstein distance with L 2 as the metric (we call this the “Wasserstein coupling”)…

resampling step corresponds to sampling the optimal Wasserstein coupling of the filters. We …

Cited by 13 Related articles All 9 versions



Multi-Mode Rayleigh Wave Dispersion Spectrum Inversion Using Wasserstein Distance Coupled with Bayesian Optimization

Y Niu, G Fang, YE Li, SC Chian, E Nilot - Geophysics, 2024 - library.seg.org

We propose a new automatic framework for non-destructive multi-channel analysis of surface

waves (MASW) that combines multi-mode dispersion spectrum matching and the finite …

Related articles


2023 see 2021

[PDF] arxiv.org

Wasserstein index of dependence for random measures

M Catalano, H Lavenant, A Lijoi… - Journal of the American …, 2023 - Taylor & Francis

… Wasserstein distance between Lévy measures, highlight its relation to the classical Wasserstein

… state some key properties underlying the Wasserstein index of dependence. We refer to …

Cited by 6 Related articles All 4 versions 




[HTML] A mechanical derivation of the evolution equation for scintillating crystals: Recombination–diffusion–drift equations, gradient flows and Wasserstein measures

F Daví - Mechanics Research Communications, 2023 - Elsevier

… structure: moreover, we show that the drift–diffusion part is a Wasserstein gradient flow and

we show how the energy dissipation is correlated with an appropriate Wasserstein distance. …

Related articles All 2 versions


[HTML] arxiv.org

Diffusion Processes on p-Wasserstein Space over Banach Space

P Ren, FY Wang, S Wittmann - arXiv preprint arXiv:2402.15130, 2024 - arxiv.org

To study diffusion processes on the $p$-Wasserstein space $\scr P_p $ for $p\in [1,\infty) $

over a separable Banach space $X$, we present a criterion on the quasi-regularity of …

Related articles All 2 versions 


[PDF] arxiv.org

Wasserstein Diffusion on Multidimensional Spaces

KT Sturm - arXiv preprint arXiv:2401.12721, 2024 - arxiv.org

… Our goal now is to introduce a ‘canonical’ Wasserstein Dirichlet form EW and associated

Wasserstein diffusion process on P(M). We will define the former as the relaxation on L2(P(M),…

Related articles All 2 versions 

 

[PDF] arxiv.org

Wood: Wasserstein-based out-of-distribution detection

Y Wang, W Sun, J Jin, Z Kong… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org

… To overcome these challenges, we propose a Wasserstein-based out-of-distribution detection

(WOOD) method. The basic idea is to define a Wassersteinbased score that evaluates the …

Cited by 5 Related articles All 9 versions

<–—2023———2023——2010——



Gaussian process regression with Sliced Wasserstein Weisfeiler-Lehman graph kernels

RC Perez, S Da Veiga, J Garnier, B Staber - arXiv preprint arXiv …, 2024 - arxiv.org

… on the Wasserstein distance, the sliced Wasserstein distance … complexity reduction, the

sliced Wasserstein (SW) distance … which is not the case with the Wasserstein distance. Similarly …

Related articles All 6 versions 


  

[PDF] aimsciences.org

Wasserstein distance and total variation regularized model for image reconstruction problems

Y Gao - Inverse Problems and Imaging, 2023 - aimsciences.org

… In this paper, we incorporate Wasserstein distance, together with total variation, into static

inverse problems as a prior regularization. The Wasserstein distance formulated by Benamou-…

   All 2 versions


[HTML] arxiv.org

Measures determined by their values on balls and Gromov-Wasserstein convergence

A van Delft, AJ Blumberg - arXiv preprint arXiv:2401.11125, 2024 - arxiv.org

A classical question about a metric space is whether Borel measures on the space are

determined by their values on balls. We show that for any given measure this property is stable …

Related articles All 2 versions 



WGAN for Data Augmentation

M Patil, MM Patil, S Agrawal - GANs for Data Augmentation in Healthcare, 2023 - Springer

… WGANs work on the distance between the expected probability and the parameterized …

This chapter focuses on the Wasserstein distance in deep, data augmentation using WGANs …

 Related articles


2023 see 2021.  [PDF] arxiv.org

Generalized Wasserstein barycenters between probability measures living on different subspaces

J Delon, N Gozlan, A Saint Dizier - The Annals of Applied …, 2023 - projecteuclid.org

In this paper, we introduce a generalization of the Wasserstein barycenter, to a case where

the initial probability measures live on different subspaces of R d . We study the existence …

Cited by 9 Related articles All 9 versions


 

2023



Wasserstein Distance-assisted Variational Bayesian Kalman Filter with Statistical Similarity Measure for GNSS/INS Integrated Navigation

Z Chen, Z Li, W Chen, Y Sun, X Ding - IEEE Sensors Journal, 2024 - ieeexplore.ieee.org

… Thus, we propose a Wasserstein distance (W distance)-assisted variational Bayesian Kalman

filter with a statistical similarity measure (VBSSM) that aims to provide a unified algorithm …

Related articles



 

VAEWGAN-NCO in image deblurring framework using variational autoencoders and Wasserstein generative adversarial network

A Ranjan, M Ravinder - Signal, Image and Video Processing, 2024 - Springer

This article proposes a novel “Deep Salient Image Deblurring (DSID) Framework” for kernel-free

image deblurring that combines saliency detection and variational autoencoders and …

Related articles


[PDF] arxiv.org

Wasserstein Differential Privacy

C Yang, J Qi, A Zhou - arXiv preprint arXiv:2401.12436, 2024 - arxiv.org

… Wasserstein distance and define our Wasserstein differential privacy. Definition 1 (Wasserstein …

obtain the 1-Wasserstein distance applied in Wasserstein generative adversarial network (…

Related articles All 2 versions 


[PDF] aimspress.com

[PDF] A new imbalanced data oversampling method based on Bootstrap method and Wasserstein Generative Adversarial Network

B Hou, G Chen - Mathematical Biosciences and Engineering, 2024 - aimspress.com

… method and the Wasserstein GAN Network (BM-WGAN). In our … WGAN improves the

classification performance greatly compared to other oversampling algorithms. The BM-WGAN …

 Related articles All 2 versions 

<–—2023———2023——2020—— 



[PDF] arxiv.org

The algebraic degree of the Wasserstein distance

C Meroni, B Reinke, K Wang - arXiv preprint arXiv:2401.12735, 2024 - arxiv.org

… The Wasserstein distance allows to put a distance between the space of probability … of

Wasserstein distance for finitely supported measures and relate the Wasserstein distance of …

 Related articles All 2 versions 



[PDF] arxiv.org

Wasserstein Nonnegative Tensor Factorization with Manifold Regularization

J Wang, L Tang - arXiv preprint arXiv:2401.01842, 2024 - arxiv.org

… Wasserstein manifold nonnegative tensor factorization (WMNTF), which minimizes the

Wasserstein … Although some researches about Wasserstein distance have been proposed in …

Related articles All 2 versions 



[PDF] paperplaza.net

Finite-Horizon Optimal Control of Continuous-Time Stochastic Systems with Terminal Cost of Wasserstein Distance

K Hoshino - 2023 62nd IEEE Conference on Decision and …, 2023 - ieeexplore.ieee.org

… of the Wasserstein distance in the deep learning problem can be viewed as the optimal

control with the Wasserstein distance. The optimal control with the Wasserstein distance can be …

Related articles

 


[PDF] arxiv.org

Memory Efficient And Minimax Distribution Estimation Under Wasserstein Distance Using Bayesian Histograms

PM Jacobs, L Patel, A Bhattacharya, D Pati - arXiv preprint arXiv …, 2023 - arxiv.org

We study Bayesian histograms for distribution estimation on $[0,1]^d$ under the Wasserstein

$W_v, 1 \leq v < \infty$ distance in the iid sampling regime. We newly show that when $d < …

 Related articles All 2 versions 



The Wasserstein mean of unipotent matrices

S Kim, VN Mer - Linear and Multilinear Algebra, 2023 - Taylor & Francis

… We define the Wasserstein mean of n × n unipotent matrices by solving its … two-variable

Wasserstein means. Furthermore, we show that the explicit formula of multi-variable Wasserstein …

Cited by 1 Related articles


2023


2023 see 2021. [PDF] arxiv.org

Backward and forward Wasserstein projections in stochastic order

YH Kim, Y Ruan - Journal of Functional Analysis, 2024 - Elsevier

… , we propose to study Wasserstein projections onto the cones … numerical benefits offered by

Wasserstein projections in a … function c, we define the Wasserstein transport cost as T c ( μ , …

Cited by 6 Related articles All 3 versions


Optimal State Estimation in the Presence of Non-Gaussian Uncertainty via Wasserstein Distance Minimization

H Prabhat, R Bhattacharya - arXiv preprint arXiv:2403.13828, 2024 - arxiv.org

… Abstract—This paper presents a novel distributionagnostic Wasserstein … The Wasserstein

metric is used to quantify the effort of … Notably, the proposed Wasserstein filter does not rely on …

Related articles All 2 versions 



[PDF] arxiv.org

Viscosity Solution of Second Order Hamilton-Jacobi-Bellman Equations in the Wasserstein Space

H Cheung, HM Tai, J Qiu - arXiv preprint arXiv:2312.10322, 2023 - arxiv.org

This paper is devoted to mean field control problems and the associated second-order

Hamilton-Jacobi-Bellman (HJB) equations in the Wasserstein space. Through the incorporation of …

Related articles All 3 versions 

 



[PDF] openreview.net

Modeling Changes in Molecular Dynamics Time Series as Wasserstein Barycentric Interpolations

J Damjanovic, YS Lin, JM Murphy - … Conference on Sampling …, 2023 - ieeexplore.ieee.org

… We note that Wasserstein geodesics are special cases of Wasserstein barycenters and

Laplace distributions are preserved under barycentric combinations [20]; see the Supplement for …

Cited by 1 Related articles All 2 versions



[HTML] arxiv.org

Correction to "Wasserstein distance estimates for the distributions of numerical approximations to ergodic stochastic differential equations"

D Paulin, PA Whalley - arXiv preprint arXiv:2402.08711, 2024 - arxiv.org

A method for analyzing non-asymptotic guarantees of numerical discretizations of ergodic

SDEs in Wasserstein-2 distance is presented by Sanz-Serna and Zygalakis in ``Wasserstein …

Related articles All 2 versions 

<–—2023———2023——2030—


Data-Driven Distributionally Robust Risk-Averse Two-Stage Stochastic Linear Programming over Wasserstein Ball

Y Gu, Y Huang, Y Wang - Journal of Optimization Theory and Applications, 2024 - Springer

… the Wasserstein balls rather than moment-based ambiguity sets to capture distribution uncertainty.

Specifically, we apply the 1-Wasserstein … is that the 1-Wasserstein ball shrinks with the …

Related articles All 4 versions


[PDF] kyoto-u.ac.jp

On isometries of Wasserstein spaces (Research on preserver problems on Banach algebras and related topics)

GP Gehér, T Titkos… - RIMS Kôkyûroku …, 2023 - repository.kulib.kyoto-u.ac.jp

It is known that if p ≥ 1, then the isometry group of the metric space (X, ϱ) embeds into the

isometry group of the Wasserstein space Wp(X, ϱ). Those isometries that belong to the image …

Related articles All 2 versions 

 


[HTML] arxiv.org

Causal Tracking of Distributions in Wasserstein Space: A Model Predictive Control Scheme

M Emerick, J Jonas, B Bamieh - arXiv preprint arXiv:2403.15702, 2024 - arxiv.org

… 2 (Rt,Dt), denotes the square of the 2-Wasserstein distance … In the optimal transport literature,

the Wasserstein distance is … We emphasize that in our setting, the Wasserstein distance is …

Related articles All 2 versions 



Multi-residual unit fusion and Wasserstein distance-based deep transfer learning for mill load recognition

H Xu, X Luo, W Xiao - Signal, Image and Video Processing, 2024 - Springer

This paper proposes the ball mill load recognition algorithm (MRUF-WD) based on multi-residual

unit fusion (MRUF) and Wasserstein distance transfer learning to address the problem …

Related articles



[HTML] arxiv.org

Minimum energy density steering of linear systems with Gromov-Wasserstein terminal cost

K Morimoto, K Kashima - arXiv preprint arXiv:2402.15942, 2024 - arxiv.org

In this study, we address optimal control problems focused on steering the probabilistic

distribution of state variables in linear dynamical systems. Specifically, we address the problem …

Related articles All 2 versions 


2023


Non-homogeneous Riemannian gradient equations for sum of squares of Bures–Wasserstein metric

J Hwang, S Kim, VN Mer - Journal of Computational and Applied …, 2024 - Elsevier

… ( X , A j ) where d W denotes the Bures–Wasserstein metric. On the special case where the

… Bures–Wasserstein metrics vanishes, its unique solution is known as the Wasserstein mean …

Related articles All 3 versions



[PDF] bsc.org.cn

[PDF] Transition Time Determination of Single-Molecule FRET Trajectories via Wasserstein

T Chen, F Gao, YW Tan - 4th National Conference on Modern Biophysical Methods and Techniques and Single-Molecule … - meeting.bsc.org.cn

… In this study, we introduce a novel methodology called WAVE (Wasserstein distance Analysis

in … We then apply Maximum Wasserstein Distance (MWD) analysis to differentiate the FRET …

Related articles All 2 versions 



[PDF] unipd.it

[PDF] Wasserstein regularity in mean field control problems

C BONESINI - 2023 - thesis.unipd.it

In this thesis we deal with a class of mean field control problems that are obtained as limits

of optimal control problems for large particle systems. Developing on [Cardaliaguet, P., …

Related articles All 2 versions 



[PDF] wgtn.ac.nz

[PDF] Applications of the Bures-Wasserstein Distance in Linear Metric Learning

D Cooper - openaccess.wgtn.ac.nz

… -Wasserstein distance are developed, utilising its novel properties. The first set of algorithms

we contribute use the Bures-Wasserstein … Learning with the Bures-Wasserstein distance to a …

Related articles All 2 versions 



[PDF] unicamp.br

[PDF] An Irregularity Measure Based on Wasserstein Metric for Multivariate Mathematical Morphology

S FRANCISCO - repositorio.unicamp.br

… is defined using the Wasserstein metric and the generalized … optimization problem to obtain

the Wasserstein metric. We prove … as an approximation for the Wasserstein metric in order to …

Related articles 

<–—2023———2023——2040— 


[PDF] openreview.net

Imitation Learning Using Generalized Sliced Wasserstein Distances

I Ovinnikov, JM Buhmann - openreview.net

Imitation learning methods allow to train reinforcement learning policies by way of minimizing

a divergence measure between the state occupancies of the expert agent and the novice …

Related articles 



[PDF] oaes.cc

[PDF] Rolling bearing fault diagnosis method based on 2D grayscale images and Wasserstein Generative Adversarial Nets under unbalanced sample condition

J He, Z Lv, X Chen - 2023 - f.oaes.cc

… method based on 2D grayscale images and WGAN. By converting time-domain signals into

… and generate images well, while WGAN introduces Wasserstein distance to make the model …

Related articles 


[PDF] ethz.ch

[PDF] AN INTRODUCTION TO OPTIMAL TRANSPORT AND WASSERSTEIN GRADIENT FLOWS

A FIGALLI - people.math.ethz.ch

These short notes summarize a series of lectures given by the author during the School “Optimal

Transport on Quantum Structures”, which took place on September 19th-23rd, 2022, at …

Related articles 



Stable and Feature Decoupled of Deep Mutual Information Maximization Based on Wasserstein Distance

X He, C Peng, L Wang - papers.ssrn.com

… based on the Wasserstein distance is proposed [26], and we give the Wasserstein distance

definition and its approximate optimal estimation algorithm. Wasserstein distance definition: …

Related articles 



Estimating location errors in precipitation forecasts with the Wasserstein and Attribution distances

L Lledó, G Skok, T Haiden - 2023 - meetingorganizer.copernicus.org

… The Wasserstein distance, defined as … Wasserstein distances to circumvent too literal

comparisons. As a result, new algorithms have been developed that can approximate Wasserstein …

Related articles All 2 versions 


2023


[PDF] unl.pt

[PDF] THE HAMILTON-JACOBI EQUATION IN THE WASSERSTEIN SPACE OF PROBABILITY MEASURES ON THE 𝑑-DIMENSIONAL TORUS

DC CABANAS - 2023 - run.unl.pt

… the way we define and understand the Wasserstein distance. … We will look at narrow

convergence and the Wasserstein … and the Wasserstein distance, recalling some of their …

 Related articles 



[PDF] researchgate.net

[PDF] Wasserstein Distributionally Robust Estimation in High Dimensions: Performance Analysis and Optimal Hyperparameter Tuning

L Aolaritei, S Shafiee, F Dörfler - researchgate.net

… Wasserstein distributionally robust optimization has recently emerged as a powerful … close,

in a Wasserstein sense, to the empirical distribution. In this paper, we propose a Wasserstein …

Related articles All 2 versions View as HTML



[PDF] iwaponline.com

Alleviating sample imbalance in water quality assessment using the VAE–WGAN–GP model

J Xu, D Xu, K Wan, Y Zhang - Water Science & Technology, 2023 - iwaponline.com

… that utilizes the VAE–WGAN–GP model. The VAE–WGAN–GP model combines the …

potential distribution learning ability of the proposed VAE–WGAN–GP model, (3) introducing the …

Related articles All 2 versions



Fine-Yolov7: A Small Object Detection Network Based on Cross-Layer Integration and Normalized Wasserstein Distance

Y Zhou, K Cao, D Li, J Piao - Available at SSRN 4612609 - papers.ssrn.com

Object detection remains a vital yet challenging task in the domain of computer vision. Despite

high-performance computers delivering satisfactory results, existing methods struggle to …

Related articles 



 [PDF] arxiv.org

Gradient flows of modified Wasserstein distances and porous medium equations with nonlocal pressure

NP Chung, QH Nguyen - Acta Mathematica Vietnamica, 2023 - Springer

… We construct their weak solutions via JKO schemes for modified Wasserstein distances.

We also establish the regularization effect and decay estimates for the L p norms. … To do …

Related articles All 4 versions

<–—2023———2023——2050—


[PDF] wazizian.fr

[PDF] Exact Generalization Guarantees for Wasserstein Distributionally Robust Models

W Azizian, F Iutzeler, J Malick - wazizian.fr

… Wasserstein Distributionally Robust Optimization … Confidence regions in Wasserstein

distributionally robust estimation. Biometrika, 2022. … On the rate of convergence in …

Related articles All 7 versions 


[PDF] sissa.it

[PDF] Enhancing Deep Learning for Slow-Decaying Problems: An Optimal Transport-based Approach with Sinkhorn Loss and Wasserstein Kernel

M Khamlich, F Pichi, G Rozza - people.sissa.it

… Wasserstein barycenters between the … Wasserstein distance (Wreg) between probability

measures by introducing entropy (ε) into the optimization problem. The regularized Wasserstein …

Related articles 



Wasserstein speed limits for underdamped Brownian particles

R Sabbagh, O Movilla Miangolarra, T Georgiou - Bulletin of the American …, 2024 - APS

We derive thermodynamic speed limits for underdamped Brownian particles subject to

arbitrary forcing by utilizing the Benamou-Brenier fluid dynamics formulation of optimal mass …

Related articles 


[PDF] cmes.org

[PDF] Imprecise Probabilistic Model Updating Using A Wasserstein Distance-based Uncertainty Quantification Metric

杨乐昌, 韩东旭, 王丕东 - Journal of Mechanical Engineering - qikan.cmes.org

To address this problem, a model updating method based on the Wasserstein distance measure is proposed; the method builds a kernel function from the Wasserstein

distance measure and uses the geometric properties of the Wasserstein distance in p-dimensional parameter space to quantify differences between probability distributions …

Related articles All 2 versions 



An intelligent IoT intrusion detection system using HeInit-WGAN and SSO-BNMCNN based multivariate feature analysis

J Wu, SA Haider, H Yu, M Irshad, M Soni… - … Applications of Artificial …, 2024 - Elsevier

… of a He initialization-based Wasserstein gradient penalty loss generative adversarial network

(HeInit-WGAN). The HeInit-WGAN technique identifies attacks more reliably and accurately …

Related articles All 2 versions


2023



Control theory on Wasserstein space: a new approach to optimality conditions

A Bensoussan, Z Huang, SCP Yam - Annals of Mathematical Sciences …, 2023 - intlpress.com

We study the deterministic control problem in the Wasserstein space, following the recent

works of Bonnet and Frankowska, but with a new approach. One of the major advantages of …

 Cited by 2 Related articles 

 

[PDF] hal.science

[PDF] Contraction in the Wasserstein metric for some Markov chains, and applications to the dynamics of expanding maps

AO Lopes, M Stadlbauer, BR Kloeckner - hal.science

We employ techniques from optimal transport in order to prove decay of transfer operators

associated to iterated functions systems and expanding maps, giving rise to a new proof …

Related articles 


Wasserstein-1 metric

P Díaz Lozano, T Lozano Bagén, J Vives - arXiv e-prints, 2023 - ui.adsabs.harvard.edu

(Conditional) Generative Adversarial Networks (GANs) have found great success in recent

years, due to their ability to approximate (conditional) distributions over extremely high …

Related articles



2023 see 2022. [PDF] arxiv.org

Distributionally robust joint chance-constrained programming with Wasserstein metric

Y Gu, Y Wang - Optimization Methods and Software, 2023 - Taylor & Francis

In this paper, we develop an exact reformulation and a deterministic approximation for

distributionally robust joint chance-constrained programmings ( DRCCPs ) with a general class of …

Cited by 1 Related articles All 3 versions


[PDF] antoinecollas.fr

[PDF] Optimal transport and dimension reduction: Entropic Wasserstein Component Analysis

A Collas - antoinecollas.fr

… Optimal Transport (OT): Wasserstein distance … the squared 2-Wasserstein distance

with the 2 metric is … Entropic Wasserstein Component Analysis (EWCA): …

Related articles

<–—2023———2023——2060—


PELID: Enhancing real-time intrusion detection with augmented WGAN and parallel ensemble learning

HV Vo, HP Du, HN Nguyen - Computers & Security, 2024 - Elsevier

… WGAN method, AWGAN, generates realistic samples for minority classes using the WGAN.

… We propose using the K-Means algorithm in conjunction with WGAN to eradicate ineffective …

Cited by 4 Related articles All 2 versions



[HTML] springer.com

Full View

[HTML] Mdwgan-gp: data augmentation for gene expression data based on multiple discriminator WGAN-GP

R Li, J Wu, G Li, J Liu, J Xuan, Q Zhu - BMC bioinformatics, 2023 - Springer

… Wasserstein generative adversarial network with gradient penalty (WGAN-GP) [24] is an

modified model based on WGAN… [26] further improved the WGAN-GP model from the addition of …

[HTML] frontiersin.org

[HTML] Wasserstein generative adversarial network with gradient penalty for active sonar signal reverberation suppression

Z Wang, H Zhang, W Huang, X Chen… - Frontiers in Marine …, 2023 - frontiersin.org

… This paper proposes a Wasserstein generative adversarial … , the structure design referred to

WGAN-GP comprehensively. In … , we propose a Wasserstein generative adversarial network …

Related articles

TLS-WGAN-GP: A generative adversarial network model for data-driven fault root cause location

S Xu, X Xu, H Gao, F Xiao - IEEE Transactions on Consumer …, 2023 - ieeexplore.ieee.org

… In the third section, we introduce the architecture of TLS-WGAN-GP in detail and introduce

the preliminary theoretical knowledge algorithm of the discriminator in the TLS-WGAN-GP …

Cited by 2 Related articles



[PDF] Bayesian nonparametric mixing distribution estimation in the Gaussian-smoothed 1-Wasserstein distance

C Scricciolo - 2024 - dse.univr.it

We study the problem of mixing distribution estimation for mixtures of discrete exponential

family models, taking a Bayesian nonparametric approach. It has been recently shown that, …

Related articles All 2 versions



Application of the Wasserstein distance in fault detection of liquid rocket engines.

程玉强, 邓凌志 - Journal of National University of Defense …, 2023 - search.ebscohost.com

[20] proposed the Wasserstein generative adversarial network (WGAN), which uses the

Wasserstein distance to measure the difference between two sample distributions; the Wasserstein distance is also known as the earth mover's distance (earth …

Cited by 1 Related articles



2023



[PDF] arxiv.org

Some Convexity Criteria for Differentiable Functions on the 2-Wasserstein Space

G Parker - arXiv preprint arXiv:2306.09120, 2023 - arxiv.org

We show that a differentiable function on the 2-Wasserstein space is geodesically convex if

and only if it is also convex along a larger class of curves which we call `acceleration-free'. In …

Related articles All 2 versions 



Power Quality Disturbances Classification with Imbalanced/Insufficient Samples Based on Wgan-Gp-Sa and Dcnn

Y Xi, X Li, F Zhou - Insufficient Samples Based on Wgan-Gp-Sa and Dcnn - papers.ssrn.com

… For making WGAN more stable and easier to converge, WGAN-GP (Wasserstein generative

… to enhance the optimization of the Wasserstein distance by adding a gradient penalty to the …

Related articles 
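Background on the gradient-penalty construction that the WGAN-GP entries above refer to (this is the standard Gulrajani et al. formulation, quoted only for orientation and not taken from any single item in this list): the penalized critic loss is usually written as

L_{\mathrm{critic}} = \mathbb{E}_{\tilde{x} \sim P_g}[D(\tilde{x})] - \mathbb{E}_{x \sim P_r}[D(x)] + \lambda \, \mathbb{E}_{\hat{x}}\big[ (\lVert \nabla_{\hat{x}} D(\hat{x}) \rVert_2 - 1)^2 \big],

where P_r and P_g are the real and generated distributions, \hat{x} is sampled uniformly along line segments between real and generated samples, and \lambda is the penalty weight (10 is the commonly reported value). The penalty softly enforces the 1-Lipschitz constraint required by the dual form of the 1-Wasserstein distance.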



Bayesian mixing distribution estimation in the Gaussian-smoothed 1-Wasserstein distance

C Scricciolo - SIS 2023-Statistical Learning, Sustainability and Impact …, 2023 - iris.univr.it

We consider the problem of nonparametric mixing distribution estimation for discrete exponential

family models. It has been recently shown that, under the Gaussian-smoothed optimal …

Related articles 


[HTML] sciencedirect.com

[HTML] CWGAN-GP with residual network model for lithium-ion battery thermal image data expansion with quantitative metrics

F Hu, C Dong, L Tian, Y Mu, X Yu, H Jia - Energy and AI, 2024 - Elsevier

… called time series Wasserstein generative adversarial network (TS-WGAN) which effectively

… a Condition Wasserstein GAN with gradient penalty and a residual network model (CWGAN-…

Related articles



[HTML] springer.com

[HTML] A nonlocal free boundary problem with Wasserstein distance

AL Karakhanyan - Calculus of Variations and Partial Differential …, 2023 - Springer

… 2 we recall some facts on the Wasserstein distance and Fourier transformation of measures.

One of the key facts that we use is that the logarithmic term can be written as a weighted \(L^…

Related articles All 6 versions

<–—2023———2023——2070— 



A novel quality inspection method of compressors based on Deep SVDD and CWGAN-XGBoost

J Wang, X Jin, Y Lyu, Z Jia - International Journal of Refrigeration, 2024 - Elsevier

… inspection method based on Deep SVDD and CWGAN-XGBoost are presented in this research.

… CWGAN algorithm is used to generate a large number of fake samples of labeled fault …

Related articles



[HTML] mdpi.com

[HTML] Enhancer Recognition: A Transformer Encoder-Based Method with WGAN-GP for Data Augmentation

T Feng, T Hu, W Liu, Y Zhang - International Journal of Molecular …, 2023 - mdpi.com

… introduces the Wasserstein GAN with a gradient penalty (WGAN-GP) [35]. The WGAN-GP …

The combination of Transformers and the WGAN-GP allows for the effective application of …

Related articles All 6 versions 



Construction Health Indicator using Physically-Informed 1D-WGAN-GP Joint Attention LSTM-DenseNet Method

H Yang, XD Yang, D Sun, Y Hu - Measurement Science and …, 2024 - iopscience.iop.org

In data-driven prognosis methods, the accuracy of predicting the remaining useful life (RUL)

of mechanical systems is predominantly contingent upon the efficacy of system health …

Related articles

 


ROD-WGAN hybrid: A Generative Adversarial Network for Large-Scale Protein Tertiary Structures

MNA Khalaf, THA Soliman… - … on Computer and …, 2023 - ieeexplore.ieee.org

… The ROD-WGAN model has shown promise in generating … In this paper, we tried to refine

the ROD-WGAN model by … -WGAN model, and finally, an explanation of the ROD-WGAN hybrid …

Related articles



Dual-WGAN Ensemble Model for Alzheimer's Dataset Augmentation with Minority Class Boosting

MS Ansari, K Ilyas, A Aslam - 2023 International Conference on …, 2023 - ieeexplore.ieee.org

… model based on Wasserstein Generative Adversarial Networks (WGAN) [?]. This WGAN-based …

This paper makes a dual contribution: firstly, the development of a WGAN to augment the …

Related articles All 2 versions



2023


 

Research on rumor detection based on cycle generative adversarial networks and Wasserstein loss

张洪志, 但志平, 董方敏, 高准… - Data Analysis and Knowledge Discovery (数据分析与知识 …), 2024 - manu44.magtech.com.cn

… [Objective] Improve the controllability of the generation target through a cycle generative adversarial network and the Wasserstein distance, and use the Wasserstein

distance to improve the model's generation loss, raising … [Conclusion] The cycle generative adversarial network whose generation loss is improved with the Wasserstein distance …

Related articles



Research on abnormal detection of gas load based on LSTM-WGAN

X Xu, X Ai, Z Meng - International Conference on Computer …, 2023 - spiedigitallibrary.org

… The anomaly detection model based on LSTM-WGAN proposed in this paper is shown in

Figure 2. The LSTM-WGAN model is divided into two stages of training and testing. …

Cited by 1 Related articles All 3 versions



Enhancing Subject-Independent EEG-Based Auditory Attention Decoding with WGAN and Pearson Correlation Coefficient

S Pahuja, G Ivucic, F Putze, S Cai, H Li… - … on Systems, Man …, 2023 - ieeexplore.ieee.org

… using a Wasserstein Generative Adversarial Network (WGAN), … are similar to real EEG data

with the Wasserstein Loss [14]. PCC, … In our method, WGAN+PCC works together as WGAN …

Related articles



CyberGuard: Detecting Adversarial DDoS Attacks in SDN using WGAN-CNN-GRU

KS Goud, SR Giduturi - 2024 Fourth International Conference …, 2024 - ieeexplore.ieee.org

… Our proposed methodology capitalizes on the strengths of both WGAN and CNN-GRU to

effectively identify subtle adversarial attack patterns. Firstly, we utilize WGAN to synthesize …

Related articles


A Data Retrieval Method Based on AGCN-WGAN

G Sun, G Peng, X Tian, L Li, Y Zhao… - 2024 IEEE 4th …, 2024 - ieeexplore.ieee.org

… Therefore, a nonlinear model AGCN-WGAN is proposed to solve retrieval tasks in relational

… of the loss function, we use Wasserstein distance to replace JS distance to construct WGAN: …

Related articles

<–—2023———2023——2080— 



[PDF] iaeng.org

[PDF] Research on WGAN-based Image Super-resolution Reconstruction Method.

X Chen, S Lv, C Qian - IAENG International Journal of Computer Science, 2023 - iaeng.org

… L is the Wasserstein distance between the generated and true distributions. WGAN does not

use the … Due to the approximately fitted Wasserstein distance, WGAN turns the dichotomous …

Related articles All 2 versions 
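A reminder of the fact that the WGAN entries in this list build on (standard Kantorovich-Rubinstein duality, stated here for orientation only): the 1-Wasserstein distance between the real distribution P_r and the generated distribution P_g admits the dual form

W_1(P_r, P_g) = \sup_{\lVert f \rVert_L \le 1} \; \mathbb{E}_{x \sim P_r}[f(x)] - \mathbb{E}_{x \sim P_g}[f(x)],

where the supremum runs over 1-Lipschitz functions f. In a WGAN the critic network plays the role of f, and the Lipschitz constraint is enforced approximately, e.g. by weight clipping or by the gradient penalty of WGAN-GP.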



 


Research on vehicle trajectory anomaly detection algorithm based on GRU and WGAN

YH Liu, L Wang, XY Zhao, HM Lu… - 2023 8th …, 2023 - ieeexplore.ieee.org

… In this paper, GRU-WGAN deep learning model based on … The GRU-WGAN model

combining GRU and WGAN is then … The experiments demonstrate that the GRU-WGAN model …

 Related articles All 2 versions




Anti-Jamming Method of Near-Field Underwater Acoustic Detection Based on WGAN

Z Jingbo, J Zhe, L Daojiang… - 2023 IEEE International …, 2023 - ieeexplore.ieee.org

… enhancement, this study employs Wasserstein GAN and integrates the characteristics of …

WGAN to enhance training stability. Simulation analysis demonstrates that the trained WGAN …

Related articles



WGAN-GP Framework for SAR and Optical Remote Sensing Image Fusion

A Ajay, G Amith, GS Kumar… - … on Intelligent Systems …, 2023 - ieeexplore.ieee.org

… • We introduce a novel Wasserstein GAN with Gradient Penalty (WGAN-GP) to realize …

variant of WGAN-GP for the fusion of SAR and optical images. The modified WGAN-GP model …

 Related articles



DMM-WGAN: An industrial process data augmentation approach

S Gao, Z Chen, X Dang, X Dong… - … , Computing and Data …, 2023 - ieeexplore.ieee.org

… This paper proposes a DMM-WGAN data augmentation method … Firstly, we use the

DMM-WGAN to generate samples to … effectiveness of the DMM-WGAN generation capabilities. The …

Related articles All 2 versions


2023



[PDF] Conditional Sound Effects Generation with Regularized WGAN

Y Liu, C Jin - 2023 - researchgate.net

… models for sound effects with a conditional Wasserstein GAN (WGAN) model. We train our

… The results indicate that a conditional WGAN model trained on log-magnitude spectrograms …

Related articles All 2 versions 


DHAN-WGAN: Adversarial Image Colorization Using Deep Hybrid Attention Aggregation Network

H Feng, Y Yang - … Conference on Culture-Oriented Science and …, 2023 - ieeexplore.ieee.org

… PROPOSED APPROACH DHAN-WGAN focuses on the coloring … Wasserstein GAN

with Gradient Penalty (WGAN-GP) to further train DHAN-class and obtain the model DHAN-WGAN. …

Related articles All 2 versions


Stochastic Optimal Scheduling of Photovoltaic-Energy Storage Charging Station Based on WGAN-GP Scenario Generation

X Bao, Y Chi, H Zhou, Y Huang… - 2023 8th International …, 2023 - ieeexplore.ieee.org

… First, this paper uses WGAN-GP to generate the scenario of photovoltaic output and charging

load, and combines the DPC algorithm to reduce the scenario to improve the quality of …

Related articles



[PDF] conf-icnc.org

[PDF] WGAN-based Oversampling for QoS-Aware M2M Network Power Allocation

J Zhou, Y Tao - conf-icnc.org

… In this work, we used WGAN to conduct the oversampling of … first work that utilizes the WGAN

to conduct the oversampling in … Then, in Section IV, WGAN and oversampling are described …

Related articles 



WGAN-based Missing Data Causal Discovery Method

Y Gao, Q Cai - 2023 4th International Conference on Big Data …, 2023 - ieeexplore.ieee.org

The state-of-the-art causal discovery algorithms are typically based on complete observed

data. However, in reality, technical issues, human errors, and data collection methods among …

Related articles

<–—2023———2023——2090— 



RUL Prediction of Turbofan Engine Based on WGAN-Trans Under Small Samples

C Qi, Z Mao, W Liu - 2023 China Automation Congress (CAC), 2023 - ieeexplore.ieee.org

… To solve this problem, this paper proposes a WGAN-Trans model, which expands the data set

… paper uses the improved Wasserstein distance instead of JS divergence to form WGAN-F, …

Related articles



End-to-end image dehazing by joint atmospheric scattering and WGAN model

H Zhang, Q Sang - Third International Conference on …, 2023 - spiedigitallibrary.org

The performance of most existing dehazing methods are limited by the independency of the

transmission map estimation and atmospheric light. To ameliorate this, we present an …

Related articles All 3 versions



Dynamic Residual Attention UNet for Precipitation Nowcasting Based on WGAN

C Li, F Huang, J Zhang, L Ma… - 2023 China Automation …, 2023 - ieeexplore.ieee.org

… Modules(DRAM) while leveraging the Wasserstein GAN training strategy to perform generative

… Moreover, the utilization of the Wasserstein GAN(WGAN) strategy during model training …

Related articles



A Compound Generative Adversarial Network Designed for Stock Price Prediction Based on WGAN

Z Chang, Z Zhang - 2023 International Conference on Cyber …, 2023 - ieeexplore.ieee.org

In the stock market, the combination of historical stock data and machine learning methods

has gradually replaced the investment method that relies solely on human experience. We …

Related articles



Classification of IoT intrusion detection data based on WGAN-gp and E-GraphSAGE

C Wang, Z Dong, W Hu, X Jin, X Huang… - … , and Internet of …, 2023 - spiedigitallibrary.org

… Therefore, in this study, we applied WGAN-gp to enhance the intrusion detection data

before using graph neural network algorithms for detection. We utilized the characteristics of …

Related articles All 3 versions


2023


[PDF] researchgate.net

[PDF] EEG Data Privacy Enhancement using Differential Privacy in WGAN-based Federated Learning

N JAHAN - researchgate.net

Protecting the privacy of EEG (Electroencephalography) data poses a critical challenge

amid its burgeoning applications in neuroscience and healthcare. This study explores an …

Related articles All 2 versions 


An RF Fingerprint Data Enhancement Method Based on WGAN

B Li, D Liu, J Yang, H Zhou, D Lin - International Conference in …, 2023 - Springer

… In this paper, we propose an RF fingerprint data enhancement method based on Wasserstein

Generative Adversarial Network (WGAN). The experimental results show that the method …

Related articles


Power Load Data Cleaning Method Based on DBSCAN Clustering and WGAN Algorithm

L Wei, Y Ding, E Wang, L Liu - 2023 IEEE 5th International …, 2023 - ieeexplore.ieee.org

… Then, the Wasserstein distance is used to improve on the original GAN network. Through

non-supervised training of WGAN, the neural network will automatically learn complex spatio-…

Related articles


Research on Two-stage Identification of Distributed Photovoltaic Output Based on WGAN Data Reconstruction Technology

Y Su, J Gu, J Zhang, X Yang, Y Jin… - 2023 4th International …, 2023 - ieeexplore.ieee.org

… Therefore, this article selects the improved Wasserstein GAN with Gradient Penalty (WGAN)

model based on gradient penalty optimization. WGAN uses Wasserstein distance to …

Related articles


[PDF] jst.go.jp

The optimal transport problem and the Wasserstein distance: what are they?

星野健太 - Systems, Control and Information (システム/制御/情報), 2023 - jstage.jst.go.jp

In this paper, the evaluation function is given using the concept of the Wasserstein distance. With the Wasserstein

distance, the similarity between probability distributions can be measured; the Wasserstein distance comes from a problem called the optimal transport problem …

Related articles All 2 versions
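For orientation, the textbook definition behind all of these entries (not specific to any one of them): for probability measures \mu and \nu on a metric space (X, d) with finite p-th moments,

W_p(\mu, \nu) = \Big( \inf_{\pi \in \Pi(\mu, \nu)} \int_{X \times X} d(x, y)^p \, \mathrm{d}\pi(x, y) \Big)^{1/p},

where \Pi(\mu, \nu) denotes the set of couplings of \mu and \nu, i.e. joint probability measures on X \times X with marginals \mu and \nu.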

<–—2023———2023——2100—



An irregularity measure based on the Wasserstein metric for multivariate mathematical morphology

S Francisco - 2023 - repositorio.ifsp.edu.br

… the Wasserstein metric and the generalized sum of the … are used to obtain the Wasserstein

metric. It is proved that the index … an approximation of the Wasserstein metric with the aim of …

Related articles 



Folded Handwritten Digit Recognition Based on WGAN-GP Model

J Wei, H Song, X Lin, S Jin, S Chen… - 2023 4th International …, 2023 - ieeexplore.ieee.org

The study of overlapped handwritten digit recognition algorithms is critical for improving

automated recognition accuracy, improving document processing, and automating recognition …

Related articles


 

[PDF] Hyperspectral image classification based on the Wasserstein distance and generative adversarial networks

晏远翔, 曹国, 张友强 - 2023 - csa.org.cn

… the Wasserstein distance is used to alleviate the mode-collapse problem in GAN networks; building on the ADGAN method, this paper makes

improvements and proposes a new SPCA-AD-WGAN … training with D is carried out through a GAN network based on the Wasserstein distance …

Related articles All 2 versions




[PDF] unipd.it

The adapted Wasserstein distance: properties and an application.

L VIGOLO - thesis.unipd.it

In this thesis we study the adapted Wasserstein distance, a variant of the usual

Wasserstein distance between probability measures, defined with the aim of taking into account the …

Related articles 



Bearing Fault Diagnosis Based on CWGAN-GP and CNN

J Lei, T Jian, Y Chao-yue, LYU Ting-ting - Computer and …, 2023 - cam.org.cn

… ) and gradient penalized Wasserstein distance-based generative adversarial network (WGAN-GP).

Then, a small number of bearing fault data samples are input into CWGAN-GP, in …

Related articles All 2 versions 



2023


[PDF] github.io

[PDF] Nonlinear model reduction in the Wasserstein space and metamodeling for certain one-dimensional conservation laws

P Noble, O Roustant, P Lafitte, T Filière, S Olaru - iain-pl-henderson.github.io

… optimal transport, the Wasserstein space. We then present a method of …

Wasserstein, it will always be implicit that the setting p = 2 is used. The Wasserstein distance …

Related articles All 2 versions 



[PDF] hal.science

[PDF] Through and around Wasserstein barycenters

IP GENTIL, AR SUVORIKOVA - theses.hal.science

… We are mainly motivated by the Wasserstein barycenter problem introduced by M. Agueh

and G. Carlier in 2011: … We refer to the recent monograph [PZ20] for more details on …

Related articles 

[PDF] unipd.it

[PDF] Optimal Transport and Sliced Wasserstein Gradient Flow

G COZZI - thesis.unipd.it

… the gradient flow generated by the Sliced Wasserstein distance does not provide optimal …

transport and Wasserstein spaces, then we will present the sliced Wasserstein distance and its …

Related articles All 2 versions 


[HTML] arxiv.org

p-Wasserstein barycenters

C Brizzi, G Friesecke, T Ried - arXiv preprint arXiv:2402.13176, 2024 - arxiv.org

We generalize the notion and theory of Wasserstein barycenters introduced by Agueh and Carlier …

Related articles All 2 versions 

[CITATION] Weak Wasserstein Barycenters: Theory and Applications to Machine Learning

T Valencia, F Tobar, J Fontbona

Related articles


23z5z

[CITATION] Research on t-SNE Similarity Measurement Method Based on Wasserstein Divergence

L Xin-peng, S Xiang-hong… - …, 2023 - OFFICE SPECTROSCOPY & …

Related articles


2023z6

[CITATION] A novel loss function for neural network models exploring stock realized volatility using wasserstein distance. Decision Analytics Journal, 100369

HG Souto, A Moradi - 2023

<–—2023———2023——2110—



2023z see 2022

[CITATION] Dynamic persistent homology for brain networks via wasserstein graph clustering, arXiv (2022)

MK Chung, SG Huang, IC Carroll, VD Calhoun… - 2024 - Apreprint-JANUARY

Cited by 2 Related articles

[CITATION] Wasserstein distributional sensitivity to model uncertainty in a dynamic context

Y Jiang - DPhil Transfer of Status Thesis. University of Oxford, 2023

Cited by 4 Related articles



2023z

[CITATION] Wasserstein Nonlinear MPC

A Kandel - Version 0.0, 2023

 Cited by 2 Related articles



2023z

[CITATION] Reaction-diffusion-drift equations for scintillators. From multi-scale mechanics to Gradient Flows and Wasserstein measures

F Daví - 57th Meeting of Society for Natural Philosophy

Related articles



2023z

[CITATION] Tomato leaf disease recognition based on WGAN and MCA-MobileNet

ZQ Wang, XY Yu, XJ Yang, YB Lan, XN Jin, JY Ma - Transactions of the Chinese …, 2023

Cited by 3 Related articles


2023



2023z

[CITATION] Short-term wind power prediction based on SAM-WGAN-GP. J

L Huang, LX Li, Y Cheng - Solar Energy, 2023

Cited by 2 Related articles



2023z

[CITATION] Isometries and isometric embeddings of Wasserstein spaces over the Heisenberg group, manuscript

ZM Balogh, T Titkos, D Virosztek - arXiv preprint arXiv:2303.15095, 2023

Cited by 2 Related articles



[CITATION] An extended Exp-TODIM method for multiple attribute decision making based on the Z-Wasserstein distance

H Sun, Z Yang, Q Cai, GW Wei, ZW Mo - Expert Systems with Applications, 2023

Cited by 2 Related articles


2023z

[CITATION] Wasserstein Metric-Based Clustering for Large-Scale Power Distribution

AE Oneto, B Gjorgiev… - … Annual Meeting 2023, 2023 - research-collection.ethz.ch

Wasserstein Metric-Based Clustering for Large-Scale Power Distribution - Research

Collection … Wasserstein Metric-Based Clustering for Large-Scale Power Distribution …

Related articles 


2023z

[CITATION] Aero-engine high speed bearing fault diagnosis for data imbalance: A sample enhanced diagnostic method based on pre-training WGAN-GP (vol 213 …

J Chen, Z Yan, C Lin, B Yao… - …, 2023 - ELSEVIER SCI LTD THE …

Related articles

<–—2023———2023——2120—



2023z

[CITATION] Entropic Regularization for Wasserstein Distributionally Robust Chance-Constrained Process Optimization

SB Yang, Z Li - 2023 AIChE Annual Meeting, 2023 - aiche.confex.com

Cite Related articles 


2023z

[CITATION] Optimizing Allosteric Analysis: A Wasserstein Distance and Heat Kernel-based Methodology for Investigating P53 Energetics

BS Cowan - 2023 - Wesleyan University

Related articles 



On isometries of Wasserstein spaces

G Gehér, T Titkos, D Virosztek - RIMS KOKYUROKU BESSATSU, 2023 - real.mtak.hu

It is known that if p ≥ 1, then the isometry group of the metric space (X, ϱ) embeds into the

isometry group of the Wasserstein space Wp(X, ϱ). Those isometries that belong to the image …

Related articles All 2 versions 

2023z  z5

[CITATION] Tomato leaf disease recognition based on WGAN and MCA-MobileNet

王志强, 于雪莹, 杨晓婧, 兰玉彬, 金鑫宁, 马景余 - 农业机械学报 (Transactions of the Chinese Society for Agricultural Machinery), 2023

 Cited by 3 Related articles


Bearing fault diagnosis method based on CWGAN-GP and CNN

江蕾, 唐建, 杨超越, 吕婷婷 - 计算机与现代化 (Computer and Modernization), 2023 - cam.org.cn

… a method based on a conditional Wasserstein generative adversarial network (CWGAN-GP) and convolutional … a Wasserstein-distance-based generative

adversarial network (WGAN-GP) is used to build the CWGAN-GP network; then a small number of bearing-fault data samples are fed into CWGAN …

Related articles All 2 versions 

 

2023

  


GIS partial discharge pattern recognition based on CWGAN-div and Mi-CNN.

刘航斌, 林厚飞, 褚静, 叶静… - Zhejiang Electric …, 2023 - search.ebscohost.com

… GAN, CWGAN-div (a Wasserstein generative adversarial network with conditional constraints), which uses the Wasserstein distance in place of the JS divergence …

unlike the three schemes above, the method adopted here introduces the Wasserstein distance again on top of the Wasserstein …

Related articles


Combining the Metropolis-Hastings algorithm and the WGAN model for time-series prediction of stock prices

蕭仁鴻 - 2023 - nckur.lib.ncku.edu.tw

… However, the challenges associated with convergence have led to the adoption of the

Wasserstein GAN (WGAN) as the foundational model in this study. To improve the model's ability …

Related articles

[PDF] chinacaj.net

[PDF] Research on a generative information steganography method based on WGAN

崔建明, 余茜, 刘铭 - 河南理工大学学报 (自然科学版) (Journal of Henan Polytechnic University, Natural Science), 2023 - chinacaj.net

… an information steganography method based on the Wasserstein distance was proposed in this

… as noise fragments and input into the pre-trained WGAN. Finally, the network outputs a …

Related articles 




Integrating WGAN-GP and YOLOv5 for defect detection on an imbalanced steel strip metal surface dataset

JX Lin, MS Lu - 危機管理學刊 (Journal of Crisis Management), 2023 - airitilibrary.com

In the steel belt production environment, the influence of equipment and environmental

factors leads to surface defects in steel belts, and for the steel industry, surface defects are the …

Related articles


[PDF] cssc709.net

[PDF] A face adversarial example generation method based on gradient-penalty WGAN

梁杰, 彭长根, 谭伟, 杰何兴 - 计算机与数字工程 (Computer and Digital Engineering), 2023 - jsj.journal.cssc709.net

… a face adversarial example generation method, AdvFace-GP, based on WGAN with gradient penalty (WGAN-GP). Contributions: exploiting the fact that the

WGAN-GP model is more stable than WGAN, a face adversarial example generation method based on the generative adversarial network WGAN-GP is proposed …

 Related articles 

<–—2023———2023——2130—



Rotor shape generation for motors using Conditional WGAN-GP

加藤信人, 鈴木圭介, 近藤慶長, 鈴木克幸… - … Division Conference Proceedings …, 2023 - jstage.jst.go.jp

In this study, we utilize a deep generative model called Conditional Wasserstein Generative

Adversarial Networks with gradient penalty to generate the rotor geometry of an interior …

Related articles


2023z6

[CITATION] Study on data generation using WGAN and fault detection of STFT image-based collaborative robot drive modules

최승환 - Proceedings of KIIT Conference, 2023 - dbpia.co.kr

Study on data generation using WGAN and fault detection of STFT image-based … Study on

data generation using WGAN and fault detection of STFT image-based collaborative robot …

 Related articles


2023z7

[CITATION] Optimal control of probability distributions using the Wasserstein distance and its application to one-way car sharing

星野健太 - Systems, Control and Information (システム・制御・情報), 2023 - cir.nii.ac.jp

Optimal control of probability distributions using the Wasserstein distance and its application to one-way car sharing | CiNii

Research … Optimal control of probability distributions using the Wasserstein distance and its application to …

Related articles 

[CITATION] Research on bearing fault diagnosis based on WGAN and CNN

佘媛, 温秀兰, 唐颖, 赫忠乐… - 南京工程学院学报自然科学 … (Journal of Nanjing Institute of Technology, Natural Science Edition), 2023 - xbnew.njit.edu.cn

… Bearing Fault Diagnosis Based on WGAN and CNN … In this paper, a method of bearing

fault diagnosis based on the improved generative adversarial network, Wasserstein GAN (WGAN) …

Related articles All 2 versions 


2023z

[CITATION] Data augmentation and defect-state classification of laminated composites using a Wasserstein GAN model

김성준, 김흥수 - KSME Spring/Autumn Conference (대한기계학회 춘추학술대회), 2023 - dbpia.co.kr

To supplement the insufficient vibration data, a Wasserstein GAN (WGAN) model was used;

data augmentation with the WGAN model resolved the data imbalance problem, and the 1D CNN model's data …

Related articles 



Empirical martingale projections via the adapted Wasserstein distance

J Blanchet, J Wiesel, E Zhang, Z Zhang - arXiv preprint arXiv:2401.12197, 2024 - arxiv.org

… is the Wasserstein distance, which in our context is given by … Q even though the Wasserstein

distance between Q and P is … Wasserstein distance which addresses these types of issues. …

Related articles All 3 versions 



2023



Multi-scale Wasserstein Shortest-path Graph Kernels for Graph Classification

W Ye, H Tian, Q Chen - IEEE Transactions on Artificial …, 2023 - ieeexplore.ieee.org

… called the Multi-scale Wasserstein Shortest-Path graph kernel (… We use the Wasserstein

distance to compute the similarity … In this paper, we adopt the Wasserstein distance to better …

Related articles All 2 versions



[HTML] arxiv.org

Supervised Gromov-Wasserstein Optimal Transport

Z Cang, Y Wu, Y Zhao - arXiv preprint arXiv:2401.06266, 2024 - arxiv.org

… Gromov-Wasserstein (sGW) optimal transport, an extension of Gromov-Wasserstein by

incorporating … Through comparisons with other Gromov-Wasserstein variants on real data, we …

Related articles All 2 versions 



[HTML] arxiv.org

Squared Wasserstein-2 Distance for Efficient Reconstruction of Stochastic Differential Equations

M Xia, X Li, Q Shen, T Chou - arXiv preprint arXiv:2401.11354, 2024 - arxiv.org

… We provide an analysis of the squared Wasserstein-2 (W2) distance between two probability

… To demonstrate the practicality of our Wasserstein distance-based loss functions, we …

Related articles All 2 versions 



[HTML] springer.com

[HTML] Scalable Gromov–Wasserstein Based Comparison of Biological Time Series

N Kravtsova, RL McGee II, AT Dawes - Bulletin of Mathematical Biology, 2023 - Springer

… –Wasserstein distance optimization program, reducing the problem to a Wasserstein … to the

scalability of the one-dimensional Wasserstein distance. We discuss theoretical properties of …

 Related articles All 6 versions


2023 see 2022

[PDF] hal.science

Characterization of translation invariant MMD on R d and connections with Wasserstein distances

T Modeste, C Dombry - 2023 - hal.science

… between translation invariant MMDs and Wasserstein distances on Rd. We show in …

Wasserstein distance of order β < α. We also provide examples of kernels metrizing the Wasserstein …

Cited by 5 Related articles All 5 versions 

<–—2023———2023——2140— 



[HTML] mdpi.com

[HTML] … Monitoring Data Quality Based on the Hierarchical Density-Based Spatial Clustering of Applications with a Noise–Wasserstein Slim Generative Adversarial …

F Zhang, J Guo, F Yuan, Y Qiu, P Wang, F Cheng, Y Gu - Sensors, 2023 - mdpi.com

In order to solve low-quality problems such as data anomalies and missing data in the

condition monitoring data of hydropower units, this paper proposes a monitoring data quality …

Related articles All 5 versions 


[PDF] arxiv.org

Towards Understanding the Riemannian SGD and SVRG Flows on Wasserstein Probabilistic Space

M Yi, B Wang - arXiv preprint arXiv:2401.13530, 2024 - arxiv.org

… ) optimization method on Wasserstein space is Riemannian … optimization methods in the

Wasserstein space by extending the … By leveraging the structures in Wasserstein space, we …

Related articles All 2 versions 



[HTML] mdpi.com

[HTML] Fault Diagnosis in Hydroelectric Units in Small-Sample State Based on Wasserstein Generative Adversarial Network

W Sun, Y Zou, Y Wang, B Xiao, H Zhang, Z Xiao - Water, 2024 - mdpi.com

… Wasserstein distance as a loss function, calculated as shown in Equation (1). Wasserstein

… When the Wasserstein loss value decreases, the similarity of the generated data increases. …

Cite Cited by 1 Related articles All 3 versions 



[PDF] gatech.edu

[PDF] Well-posedness for Hamilton-Jacobi equations on the Wasserstein space on graphs

W Gangbo, C Mou, A Swiech - Preprint, 2023 - swiech.math.gatech.edu

… the mathematical setup for the Wasserstein space of probability measures on a finite graph.

Section 3 collects preliminary material about calculus on the Wasserstein space on a graph …

Cited by 1 Related articles 



[HTML] sciencedirect.com

[HTML] Wasserstein distance loss function for financial time series deep learning

HG Souto, A Moradi - Software Impacts, 2024 - Elsevier

This paper presents user-friendly code for the implementation of a loss function for neural

network time series models that exploits the topological structures of financial data. By …

Related articles All 2 versions


2023


[PDF] ieee.org

Rolling Bearing Fault Diagnosis Using Deep Transfer Learning Based on Joint Generalized Sliced Wasserstein Distance

N Lei, J Cui, J Han, X Chen, Y Tang - IEEE Access, 2024 - ieeexplore.ieee.org

… Wasserstein distances guided transfer learning (JGSWD) is proposed in this article that

utilizes the generalized sliced Wasserstein … generalized sliced Wasserstein distances guided …

 Related articles



[HTML] mdpi.com

[HTML] Anomaly Detection for Wind Turbines Using Long Short-Term Memory-Based Variational Autoencoder Wasserstein Generation Adversarial Network under …

C Zhang, T Yang - Energies, 2023 - mdpi.com

… variational autoencoder Wasserstein generation adversarial network (LSTM-based VAE-WGAN) …

and true distribution was quantified using Wasserstein distance, enabling complex high-…

Related articles All 4 versions 


[PDF] arxiv.org

Scalable Wasserstein Gradient Flow for Generative Modeling through Unbalanced Optimal Transport

J Choi, J Choi, M Kang - arXiv preprint arXiv:2402.05443, 2024 - arxiv.org

Wasserstein Gradient Flow (WGF) describes the gradient dynamics of probability density

within the Wasserstein space. WGF provides a promising approach for conducting optimization …

Related articles All 2 versions 



Sliced Wasserstein Estimation with Control Variates

K Nguyen, N Ho - NeurIPS 2023 Workshop Optimal Transport and …, 2023 - openreview.net

… the expectation of the Wasserstein distance between two one-… the closed-form of the

Wasserstein-2 distance between two … an upper bound of the Wasserstein-2 distance between two …

Cited by 1 Related articles All 3 versions 
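The one-dimensional closed form mentioned in the snippet above is the standard quantile identity (a general fact, not a contribution of the cited paper): if F and G are the cumulative distribution functions of \mu and \nu on the real line, then

W_p^p(\mu, \nu) = \int_0^1 \lvert F^{-1}(u) - G^{-1}(u) \rvert^p \, \mathrm{d}u,

so the optimal coupling matches quantiles; this is what makes one-dimensional, and hence sliced, Wasserstein distances cheap to evaluate.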


 

[HTML] arxiv.org

Wasserstein proximal operators describe score-based generative models and resolve memorization

BJ Zhang, S Liu, W Li, MA Katsoulakis… - arXiv preprint arXiv …, 2024 - arxiv.org

… the Wasserstein metric and the cross-entropy loss. Our primary contribution in this section

is connecting the Wasserstein … Kernel formulas for approximating the Wasserstein proximal …

 Related articles All 2 versions 

<–—2023———2023——2150— 



[PDF] arxiv.org

End-to-end Supervised Prediction of Arbitrary-size Graphs with Partially-Masked Fused Gromov-Wasserstein Matching

P Krzakala, J Yang, R Flamary, FA Buc… - arXiv preprint arXiv …, 2024 - arxiv.org

We present a novel end-to-end deep learning-based approach for Supervised Graph

Prediction (SGP). We introduce an original Optimal Transport (OT)-based loss, the Partially-…

Related articles All 2 versions 



[HTML] arxiv.org

Order p quantum Wasserstein distances from couplings

E Beatty, DS França - arXiv preprint arXiv:2402.16477, 2024 - arxiv.org

… expected from the Wasserstein distance. For … Wasserstein distance based on the coupling

approach. Our novel definition departs from the observation that, classically, the Wasserstein …

Related articles All 5 versions 



Full-waveform tomography of the northeastern Tibetan Plateau based on the quadratic Wasserstein-metric

XP DONG, DH YANG, WJ MENG - Chinese Journal of Geophysics, 2024 - en.dzkx.org

This study collected waveform data from 50 regional seismic events recorded by 114 broadband

seismic stations located at the northeastern margin of the Tibetan Plateau. Utilizing the …

Related articles



[HTML] arxiv.org

Semi-Supervised Image Captioning Considering Wasserstein Graph Matching

Y Yang - arXiv preprint arXiv:2403.17995, 2024 - arxiv.org

… from traditional wasserstein distance considering continuous probability distributions, we

turn to deal with finite sets of node embedding. Therefore, we can reformulate the wasserstein …

 Related articles All 2 versions 



Unsupervised Domain Adaptation for Grade Prediction of Froth Flotation Based on Wasserstein Distance and Transformer

L Cen, X Li, X Chen, Y Xie, Z Tang - JOM, 2024 - Springer

… , the wasserstein-transformer domain adversarial neural network (WT-DANN), which uses

transformer to extract global features and calculates domain loss with wasserstein distance. …

Related articles



2023



[HTML] arxiv.org

Ornstein-Uhlenbeck type processes on Wasserstein spaces

P Ren, FY Wang - Stochastic Processes and their Applications, 2024 - Elsevier

… on the Wasserstein space, we introduce an inherent Gauss measure on the Wasserstein

space… process on the Wasserstein space, which will be addressed in the forthcoming paper [37]. …

 Cited by 2 Related articles All 2 versions


[HTML] mdpi.com

[HTML] Wasserstein Dissimilarity for Copula-Based Clustering of Time Series with Spatial Information

A Benevento, F Durante - Mathematics, 2023 - mdpi.com

… In general, the use of the Wasserstein metric for capturing … Here, instead, we will consider

the Wasserstein distance … The use of a dissimilarity based on the Wasserstein distance …

Related articles All 5 versions 


2023 see 2022. [PDF] jmlr.org

Dimensionality reduction and wasserstein stability for kernel regression

S Eckstein, A Iske, M Trabs - Journal of Machine Learning Research, 2023 - jmlr.org

In a high-dimensional regression framework, we study consequences of the naive two-step

procedure where first the dimension of the input variables is reduced and second, the …

Cited by 2 Related articles All 4 versions 



[HTML] arxiv.org

Active Learning for Regression based on Wasserstein distance and GroupSort Neural Networks

B Bobbia, M Picard - arXiv preprint arXiv:2403.15108, 2024 - arxiv.org

This paper addresses a new active learning strategy for regression problems. The presented

Wasserstein active regression model is based on the principles of distribution-matching to …

Related articles All 2 versions 



Wasserstein Distance-Preserving Vector Space of Persistent Homology

T Songdechakraiwut, BM Krause, MI Banks… - … Conference on Medical …, 2023 - Springer

… The associated vector space preserves the Wasserstein distance between persistence

diagrams and fully leverages the Wasserstein stability properties. This vector space …

Related articles All 2 versions

<–—2023———2023——2160—



An Integrated Method Based on Wasserstein Distance and Graph for Cancer Subtype Discovery

Q Cao, J Zhao, H Wang, Q Guan… - IEEE/ACM Transactions …, 2023 - ieeexplore.ieee.org

… autoencoder measured by Wasserstein distance and graph … autoencoder measured by

Wasserstein distance (WVAE), … We take the 2-Wasserstein distance on Euclidean space to …

 Related articles All 5 versions


 


[PDF] arxiv.org

A data-dependent approach for high-dimensional (robust) wasserstein alignment

H Ding, W Liu, M Ye - ACM Journal of Experimental Algorithmics, 2023 - dl.acm.org

… Wasserstein flow simultaneously. Moreover, due to the flexibility of rigid transformations, we

Cited by 1 Related articles All 3 versions


[PDF] nsf.gov

[PDF] Provable Robustness against Wasserstein Distribution Shifts via Input Randomization

A Kumar, A Levine, T Goldstein, S Feizi - 2023 - par.nsf.gov

Certified robustness in machine learning has primarily focused on adversarial perturbations

with a fixed attack budget for each sample in the input distribution. In this work, we present …

Cited by 2 Related articles All 3 versions 



[HTML] arxiv.org

Sliced-Wasserstein Estimation with Spherical Harmonics as Control Variates

R Leluc, A Dieuleveut, F Portier, J Segers… - arXiv preprint arXiv …, 2024 - arxiv.org

The Sliced-Wasserstein (SW) distance between probability measures is defined as the

average of the Wasserstein distances resulting for the associated one-dimensional projections. …

Related articles All 7 versions 
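To make the definition quoted above concrete (the sliced distance as an average of one-dimensional Wasserstein distances over random projections), here is a minimal Monte Carlo sketch in Python/NumPy. It assumes equal-size samples and uses the sorted-projection formula for the one-dimensional distance; the function name sliced_wasserstein and all parameter choices are illustrative, not taken from the papers listed here.

import numpy as np

def sliced_wasserstein(X, Y, n_projections=100, p=2, seed=None):
    # Monte Carlo estimate of the sliced p-Wasserstein distance between two
    # equal-size point clouds X, Y of shape (n, d).
    rng = np.random.default_rng(seed)
    _, d = X.shape
    assert Y.shape == X.shape, "sketch assumes equal-size samples"
    total = 0.0
    for _ in range(n_projections):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)        # random direction on the unit sphere
        x_proj = np.sort(X @ theta)           # 1-D projections, sorted
        y_proj = np.sort(Y @ theta)
        total += np.mean(np.abs(x_proj - y_proj) ** p)   # 1-D W_p^p via quantile matching
    return (total / n_projections) ** (1.0 / p)

# Example: two Gaussian clouds whose means differ by 0.5 in every coordinate.
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(500, 5))
Y = rng.normal(0.5, 1.0, size=(500, 5))
print(sliced_wasserstein(X, Y, n_projections=200))

Averaging the one-dimensional W_p^p over directions and taking the 1/p power is the usual Monte Carlo estimator; increasing n_projections reduces the projection-sampling variance.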



Linearity of Cartan and Wasserstein means

H Choi, S Kim, Y Lim - Linear Algebra and its Applications, 2024 - Elsevier

… Similarly we consider the linearity problem for the Wasserstein geodesic:(1.3) A t B = x A

+ y B , ( 0 < t < 1 ) . This is equivalent to the problem of determining A and B such that the real …

 Related articles


2023



2023 see 2022

[HTML] Dissipative probability vector fields and generation of evolution semigroups in Wasserstein spaces

G Cavagnari, G Savaré, GE Sodini - Probability Theory and Related Fields, 2023 - Springer

… operators in Hilbert spaces and of Wasserstein gradient flows for geodesically convex …

By using the properties of the Wasserstein distance, we will first compute the right derivative …

Cited by 10 Related articles All 10 versions



[HTML] arxiv.org

Wasserstein Distance-based Expansion of Low-Density Latent Regions for Unknown Class Detection

P Mallick, F Dayoub, J Sherrah - arXiv preprint arXiv:2401.05594, 2024 - arxiv.org

This paper addresses the significant challenge in open-set object detection (OSOD): the

tendency of state-of-the-art detectors to erroneously classify unknown objects as known …

Related articles All 2 versions 



[HTML] springer.com

Full View

[HTML] Target detection based on generalized Bures–Wasserstein distance

Z Huang, L Zheng - EURASIP Journal on Advances in Signal Processing, 2023 - Springer

… With the latest development in this topic, the Wasserstein metric is used to distinguish the …

We noted that the Wasserstein distance of order 2 between two Gaussian variables with …

Related articles All 7 versions



[HTML] arxiv.org

Tangential Fixpoint Iterations for Gromov-Wasserstein Barycenters

F Beier, R Beinert - arXiv preprint arXiv:2403.08612, 2024 - arxiv.org

… The Wasserstein space is a metric space and exhibits a rich … 6] and gives rise to the very

active study of Wasserstein … A shortcoming of the Wasserstein distance is that it is heavily …

 Related articles All 2 versions 



[PDF] arxiv.org

An Interacting Wasserstein Gradient Flow Strategy to Robust Bayesian Inference

F Igea, A Cicirello - arXiv preprint arXiv:2401.11607, 2024 - arxiv.org

… To address these limitations a Robust Bayesian Inference approach based on an interacting

Wasserstein gradient flows has been developed in this paper. The method estimates the …

 Cite Related articles All 2 versions 

<–—2023———2023——2170—



On the convergence of continuous and discrete unbalanced optimal transport models for 1-wasserstein distance

Z Xiong, L Li, YN Zhu, X Zhang - SIAM Journal on Numerical Analysis, 2024 - SIAM

We consider a Beckmann formulation of an unbalanced optimal transport (UOT) problem. The

\(\Gamma\) -convergence of this formulation of UOT to the corresponding optimal transport …

Cited by 1 Related articles



The Fibonacci constant, the Wasserstein distance, and biological tumor aggressiveness in prostate cancer

W Przemyslaw - 2023 24th International Conference on Control …, 2023 - ieeexplore.ieee.org

… measure, the Wasserstein distance. In that way, topological similarity can be quantified in

the objective manner. However, values of the Wasserstein distance overlap between the …

Cited by 1 Related articles



[PDF] emerald.com

A prelude to statistics in Wasserstein metric spaces

C Van Le, UH Pham - Asian Journal of Economics and Banking, 2023 - emerald.com

… In Section 2, we elaborate on Wasserstein metrics in a concrete data set consisting of (…

spaces to Wasserstein spaces. In Section 4, we mention an application of Wasserstein metrics to …

Related articles All 3 versions
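
Editor's note: the object shared by the entries on this page is the p-Wasserstein distance between probability measures μ, ν on a metric space (X, d), which the cited tutorial elaborates on. The standard definition (generic notation, not specific to that paper) is

W_p(\mu,\nu) = \Big( \inf_{\pi \in \Pi(\mu,\nu)} \int_{X \times X} d(x,y)^p \, \mathrm{d}\pi(x,y) \Big)^{1/p},

where \Pi(\mu,\nu) is the set of couplings of μ and ν.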



Improving cement production process with data-augmented sequence to sequence-Wasserstein generative adversarial networks model for accurate prediction of f …

Y Zhang, J Liu, H Dang, Y Zhang, G Huang… - Review of Scientific …, 2023 - pubs.aip.org

… By reconstructing the input and output layers of a WGAN generator, unlabeled cement-…

-WGAN to address the issue of time scale imbalance in cement production. Seq2Seq-WGAN …

Related articles All 4 versions



[PDF] arxiv.org

Sticky-reflecting diffusion as a Wasserstein gradient flow

JB Casteras, L Monsaingeon… - arXiv preprint arXiv …, 2024 - arxiv.org

In this paper we identify the Fokker-Planck equation for (reflected) Sticky Brownian Motion

as a Wasserstein gradient flow in the space of probability measures. The driving functional is …

Related articles All 5 versions 



2023


Regularized Wasserstein distance-based joint distribution adaptation approach for fault detection under variable working conditions

D Yang, X Peng, C Su, L Li, Z Cao… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org

… Therefore, the 2-Wasserstein distance is used as the metric function in this paper. For … In

general, the complexity of computing Wasserstein distance is high. Cuturi et al. demonstrated …

Related articles
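
Editor's note: the snippet alludes to Cuturi's entropic regularization, which replaces the exact Wasserstein computation with cheap matrix scaling. A minimal Sinkhorn sketch of that idea (illustrative only; the histogram names, regularization value and iteration count are arbitrary, and this is not the cited paper's code):

import numpy as np

def sinkhorn_plan(a, b, C, reg=0.05, n_iters=500):
    # Entropic-regularized optimal transport between histograms a and b with
    # cost matrix C: alternately scale the Gibbs kernel K = exp(-C / reg).
    K = np.exp(-C / reg)
    u = np.ones_like(a, dtype=float)
    for _ in range(n_iters):
        v = b / (K.T @ u)
        u = a / (K @ v)
    # Transport plan; the regularized cost is np.sum(plan * C).
    return u[:, None] * K * v[None, :]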



2023 see 2022. [PDF] arxiv.org

Stochastic Wasserstein gradient flows using streaming data with an application in predictive maintenance

N Lanzetti, EC Balta, D Liao-McPherson, F Dörfler - IFAC-PapersOnLine, 2023 - Elsevier

We study estimation problems in safety-critical applications with streaming data. Since estimation

problems can be posed as optimization problems in the probability space, we devise a …

Cited by 1 Related articles All 4 versions


 


[HTML] arxiv.org

Distributionally Robust Density Control with Wasserstein Ambiguity Sets

J Pilipovsky, P Tsiotras - arXiv preprint arXiv:2403.12378, 2024 - arxiv.org

Precise control under uncertainty requires a good understanding and characterization of the

noise affecting the system. This paper studies the problem of steering state distributions of …

Related articles All 2 versions 


[PDF] arxiv.org

Covariance‐based soft clustering of functional data based on the Wasserstein–Procrustes metric

V Masarotto, G Masarotto - Scandinavian Journal of Statistics, 2023 - Wiley Online Library

… Wasserstein distance of Optimal Transport in order to cluster data with respect to differences

in their covariance structure. The Wasserstein … ed version of the Wasserstein distance and …

Cited by 1 Related articles All 5 versions


[PDF] Wasserstein SVM: Support Vector Machines made fair

E Carrizosa, T Halskov, DR Morales - 2023 - researchgate.net

… We propose to measure unfairness as the Wasserstein distance (… We show that the hereafter

called Wasserstein SVM … Support Vector Machines and the Wasserstein distance. Section 3 …

Cited by 1 Related articles 

<–—2023———2023——2180—



[PDF] arxiv.org

Tensor train based sampling algorithms for approximating regularized Wasserstein proximal operators

F Han, S Osher, W Li - arXiv preprint arXiv:2401.13125, 2024 - arxiv.org

… To address this, we consider a kernel formula to approximate a regularized Wasserstein

operator. Specifically, we first recall the Wasserstein proximal with linear energy …

Related articles All 3 versions 



[HTML] arxiv.org

Sliced-Wasserstein Distances and Flows on Cartan-Hadamard Manifolds

C Bonet, L Drumetz, N Courty - arXiv preprint arXiv:2403.06560, 2024 - arxiv.org

… for the Sliced-Wasserstein distance on the Euclidean space endowed with the Mahalanobis

distance on a document classification task, and of the Sliced-Wasserstein distance on …

Related articles All 2 versions 


2023


Sequential Wasserstein Uncertainty Sets for Minimax Robust Online Change Detection

Y Yang, L Xie - … 2024-2024 IEEE International Conference on …, 2024 - ieeexplore.ieee.org

We consider the robust online change-point detection problem with unknown post-change

distributions. An online sequence of non-parametric uncertainty sets are constructed for the …

Related articles



[PDF] arxiv.org

Wasserstein Gradient Flows for Moreau Envelopes of f-Divergences in Reproducing Kernel Hilbert Spaces

S Neumayer, V Stein, G Steidl - arXiv preprint arXiv:2402.04613, 2024 - arxiv.org

… Subsequently, we use our findings to analyze Wasserstein gradient flows of MMD-regularized

f-divergences. Finally, we consider Wasserstein gradient flows starting from empirical …

Cited by 1 Related articles All 2 versions 



[PDF] iop.org

The use of Wasserstein Generative Adversarial Networks in searches for new resonances at the LHC.

B Lieberman, SE Dahbi, B Mellado - Journal of Physics …, 2023 - iopscience.iop.org

… Wasserstein generative adversarial network, WGAN, is used as an event generator for a Zγ

final state dataset. The data generated by WGAN … by the pre-trained WGAN can then be used …

Cite Related articles All 4 versions



[PDF] arxiv.org

Wasserstein distance estimates for jump-diffusion processes

JC Breton, N Privault - Stochastic Processes and their Applications, 2024 - Elsevier

We derive Wasserstein distance bounds between the probability distributions of a stochastic

integral (Itô) process with jumps ( X t ) t [ 0 , T ] and a jump-diffusion process ( X t ) t [ …

Cited by 2 Related articles All 4 versions


Spatial and channel attention-based conditional Wasserstein GAN for direct and rapid image reconstruction in ultrasound computed tomography

X Long, C Tian - Biomedical Engineering Letters, 2024 - Springer

… The discriminator in this study is based on the design principles of WGAN and Pix2Pix, as

… ’s discriminator is removed, and the Wasserstein distance is used to measure the difference …

Related articles All 3 versions



Multifrequency matched-field source localization based on Wasserstein metric for probability measures

Q Zhu, C Sun, M Li - The Journal of the Acoustical Society of America, 2023 - pubs.aip.org

… A p-Wasserstein metric cannot be expressed as an f-divergence. This paper aims to present

a novel version of MFP that uses the Wasserstein … In certain conditions, the Wasserstein …

 Related articles All 4 versions



Rotated SAR Ship Detection based on Gaussian Wasserstein Distance Loss

C Xu, H Su, L Gao, J Wu, W Yan - Mobile Networks and Applications, 2023 - Springer

… ship detection algorithm based on the Gaussian Wasserstein Distance (GWD) loss function

… -dimensional Gaussian encodings, and the Wasserstein distance between the distributions is …

Related articles

Wasserstein information matrix

W Li, J Zhao - Information Geometry, 2023 - Springer

… Wasserstein score functions and study covariance operators in statistical models. Using

them, we establish Wasserstein–… We derive the online asymptotic efficiency for Wasserstein …

Cited by 10 Related articles All 2 versions



[PDF] dal.ca

Wasserstein GAN-based framework for adversarial attacks against intrusion detection systems

F Cui, Q Ye, P Kibenge-MacLeod - ICC 2023-IEEE International …, 2023 - ieeexplore.ieee.org

… In this paper, we propose a framework based on Wasserstein generative adversarial networks

(WGANs) to generate adversarial traffic to evade ML/DL-based IDS. Compared with the …

Cited by 1 Related articles All 2 versions 

<–—2023———2023——2190—



[HTML] arxiv.org

Fair Wasserstein Coresets

Z Xiong, N Dalmasso, VK Potluru, T Balch… - arXiv preprint arXiv …, 2023 - arxiv.org

… by minimizing the Wasserstein distance between the … Wasserstein distance is particularly

useful when generating coresets, as downstream model performance is tied to the Wasserstein …

Related articles All 3 versions 




Synthetic aperture radar ground target image generation based on improved Wasserstein generative adversarial networks with gradient penalty

Z Qu, G Fan, Z Zhao, L Jia, J Shi… - Journal of Applied Remote …, 2023 - spiedigitallibrary.org

… To solve these problems, we propose an improved Wasserstein GAN with gradient

penalty (IWGAN-GP), which introduces dense connection in the generator, integrates feature …

Cited by 1 Related articles All 3 versions



[PDF] sjtu.edu.cn

A Machine Learning Framework for Geodesics Under Spherical Wasserstein–Fisher–Rao Metric and Its Application for Weighted Sample Generation

Y Jing, J Chen, L Li, J Lu - Journal of Scientific Computing, 2024 - Springer

… particle transport process compared with Wasserstein distance. The spherical WFR metric …

Wasserstein distance has been well studied [49, 55]. For example, in the case of Wasserstein

Related articles All 3 versions



[PDF] arxiv.org

Isometric rigidity of Wasserstein spaces over Euclidean spheres

GP Gehér, A Hrušková, T Titkos, D Virosztek - arXiv preprint arXiv …, 2023 - arxiv.org

We study the structure of isometries of the quadratic Wasserstein space $\mathcal{W}_2\left(\mathbb{S}^n,\varrho_{\|\cdot\|}\right)$

over the sphere endowed with the distance inherited …

Cited by 1 Related articles 



Universal consistency of Wasserstein k-NN classifier: a negative and some positive results

D Ponnoprat - Information and Inference: A Journal of the IMA, 2023 - academic.oup.com

… ) of probability measures under the Wasserstein distance. We … the base metric space, or the

Wasserstein space itself. To this … the geodesic structures of the Wasserstein spaces for |$p=1$…

Related articles All 3 versions


2023




Poisson Equation on Wasserstein Space and Diffusion Approximations for Multiscale McKean–Vlasov Equation

Y Li, F Wu, L Xie - SIAM Journal on Mathematical Analysis, 2024 - SIAM

We consider the fully-coupled McKean–Vlasov equation with multi-time-scale potentials,

and all the coefficients depend on the distributions of both the slow component and the fast …

Related articles



A study of distributionally robust mixed-integer programming with Wasserstein metric: on the value of incomplete data

SS Ketkov - European Journal of Operational Research, 2024 - Elsevier

… realization of data, we focus on a Wasserstein ball wrt l 1 -norm, … be slightly modified to

Related articles All 4 versions



CR-Net: A robust craniofacial registration network by introducing Wasserstein distance constraint and geometric attention mechanism

Z Dai, J Zhao, X Deng, F Duan, D Li, Z Pan… - Computers & Graphics, 2023 - Elsevier

Accurate registration of three-dimensional (3D) craniofacial data is fundamental work for

craniofacial reconstruction and analysis. The complex topology and low-quality 3D models …

Related articles



[PDF] arxiv.org

Low-Rate, Low-Distortion Compression with Wasserstein Distortion

Y Qiu, AB Wagner - arXiv preprint arXiv:2401.16858, 2024 - arxiv.org

… Abstract—Wasserstein distortion is a one-… for Wasserstein in the extreme cases of pure

fidelity and pure realism, we prove the first coding theorems for compression under Wasserstein …

Related articles All 2 versions 



Alzheimer Brain Imaging Dataset Augmentation Using Wasserstein Generative Adversarial Network

K Ilyas, B Zahid Hussain, I Andleeb, A Aslam… - … Conference on Data …, 2023 - Springer

… Contribution: Considering the lack of a high-quality dataset (in the public domain) for

training deep neural networks for AD detection, we propose a Wasserstein GAN (WGAN) [13]-…

Related articles All 2 versions

<–—2023———2023——2200—



[PDF] nsf.gov

Non-Parametric and Regularized Dynamical Wasserstein Barycenters for Sequential Observations

KC Cheng, EL Miller, MC Hughes… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org

… corresponding to a Wasserstein barycenter by detailing the … an overview of the Wasserstein

distance and barycenter … -scaling property of the Wasserstein barycenter as well as the …

Related articles All 4 versions



[PDF] neurips.cc

Outlier-Robust Gromov-Wasserstein for Graph Data

L Kong, J Li, J Tang, AMC So - Advances in Neural …, 2024 - proceedings.neurips.cc

… • On the statistical side, we demonstrate that the robust Gromov-Wasserstein is bounded

above … of Gromov-Wasserstein distance and formally formulate the robust Gromov-Wasserstein. …

Related articles All 5 versions 


2023 see 2022

Fast Approximation of the Generalized Sliced-Wasserstein Distance

D Le, H Nguyen, K Nguyen… - ICASSP 2024-2024 …, 2024 - ieeexplore.ieee.org

… -Wasserstein distance is a variant of slicedWasserstein distance … -Wasserstein distance,

generalized slicedWasserstein is … of generalized sliced-Wasserstein distance, which is mainly …

Cited by 1 Related articles All 3 versions



[HTML] arxiv.org

Wasserstein Graph Distance Based on Distributions of Probabilistic Node Embeddings

M Scholkemper, D Kühn, G Nabbefeld, S Musall… - arXiv preprint arXiv …, 2024 - arxiv.org

… of the Wasserstein distance used: The full Wasserstein distance, the scaled Wasserstein …

, and the tied Wasserstein distance, where we assume Σi =Σj =Σ, which further simplifies the …

Related articles All 2 versions 



[HTML] arxiv.org

Numerical Analysis on Neural Network Projected Schemes for Approximating One Dimensional Wasserstein Gradient Flows

X Zuo, J Zhao, S Liu, S Osher, W Li - arXiv preprint arXiv:2402.16821, 2024 - arxiv.org

… This study continues the study of the Wasserstein … of Wasserstein gradient flows of free

energies in both Eulerian and Lagrangian coordinates. We formulate the projected Wasserstein …

Related articles All 3 versions 


 

2023



Positive Definite Wasserstein Graph Kernel for Brain Disease Diagnosis

K Ma, X Wen, Q Zhu, D Zhang - International Conference on Medical …, 2023 - Springer

… To address this problem, we propose a graph sliced Wasserstein distance to measure the …

sliced Wasserstein distance, we propose a new graph kernel called sliced Wasserstein graph …

Cited by 1 Related articles All 2 versions



[HTML] arxiv.org

Validating Climate Models with Spherical Convolutional Wasserstein Distance

RC Garrett, T Harris, B Li, Z Wang - arXiv preprint arXiv:2401.14657, 2024 - arxiv.org

The validation of global climate models is crucial to ensure the accuracy and efficacy of

model output. We introduce the spherical convolutional Wasserstein distance to more …

Related articles All 2 versions 



[HTML] mdpi.com

[HTML] Rotating Machinery Fault Diagnosis with Limited Multisensor Fusion Samples by Fused Attention-Guided Wasserstein GAN

W Fu, K Yang, B Wen, Y Shan, S Li, B Zheng - Symmetry, 2024 - mdpi.com

… [16] proposed a full-attention mechanism with Wasserstein GAN (WGAN), which integrated

… -guided WGAN. In the proposed approach, fused attention-guided WGAN is combined with …

Cited by 1 Related articles 



[PDF] mlr.press

Geometrically Regularized Wasserstein Dictionary Learning

M Mueller, S Aeron, JM Murphy… - Topological, Algebraic …, 2023 - proceedings.mlr.press

… Wasserstein dictionary learning is an unsupervised approach to learning a collection of

probability distributions that generate observed distributions as Wasserstein … for Wasserstein …

 Related articles All 2 versions 


A Multi-Objective Geoacoustic Inversion of Modal-Dispersion and Waveform Envelope Data Based on Wasserstein Metric

J Ding, X Zhao, P Yang, Y Fu - Remote Sensing, 2023 - mdpi.com

… including the Wasserstein metric and … Wasserstein metric (Wasserstein-MOBO) and L2 norm

(L2-MOBO). In Section 4, numerical experiments are performed to compare the Wasserstein-…

 Related articles All 4 versions 

<–—2023———2023——2210—


[PDF] researchgate.net

Enhanced data imputation framework for bridge health monitoring using Wasserstein generative adversarial networks with gradient penalty

S Gao, C Wan, Z Zhou, J Hou, L Xie, S Xue - Structures, 2023 - Elsevier

… data imputation with Wasserstein distance and gradient … The loss function of the generator

is composed of Wasserstein … of critic is attributed to Wasserstein distance loss and gradient …

Cited by 1 Related articles All 2 versions
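
Editor's note: several entries on this page (this one and, e.g., the SAR and oil-wear-particle items) rely on the WGAN-GP critic objective the snippet describes: a Wasserstein distance term plus a gradient penalty. A generic PyTorch-style sketch of that loss, not taken from any of the cited papers (critic, real, fake and the penalty weight lam are placeholders):

import torch

def wgan_gp_critic_loss(critic, real, fake, lam=10.0):
    # Standard WGAN-GP critic loss:
    #   E[D(fake)] - E[D(real)] + lam * E[(||grad_x D(x_hat)||_2 - 1)^2],
    # with x_hat sampled on segments between real and fake samples.
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    x_hat = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    grads = torch.autograd.grad(critic(x_hat).sum(), x_hat, create_graph=True)[0]
    penalty = ((grads.flatten(1).norm(2, dim=1) - 1.0) ** 2).mean()
    return critic(fake).mean() - critic(real).mean() + lam * penalty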






[PDF] arxiv.org

A two-step approach to Wasserstein distributionally robust chance-and security-constrained dispatch

A Maghami, E Ursavas… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org

This paper considers a security constrained dispatch problem involving generation and line

contingencies in the presence of the renewable generation. The uncertainty due to …

Cited by 5 Related articles All 6 versions



Optimal Transport and the Wasserstein Distance for Fuzzy Measures: An Example

V Torra - International Conference on Intelligent and Fuzzy …, 2023 - Springer

… Among them, we can distinguish the Wasserstein distance which is based on the optimal

transport problem. Both the optimal transport problem and the Wasserstein distance have been …

Cited by 1 Related articles All 2 versions


Wasserstein Distance for OWA Operators

IÁ Harmati, L Coroianu, R Fullér - Fuzzy Sets and Systems, 2024 - Elsevier

… First we associate an OWA operator with a unique regular increasing monotone quantifier

and then define the distance between two OWA operators as the Wasserstein-1 distance …

Related articles
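
Editor's note: the construction described in the snippet reduces to the one-dimensional Wasserstein-1 distance, which has the closed form W1(F, G) = ∫ |F(x) − G(x)| dx in terms of the two CDFs. A tiny sketch of that computation for weight vectors on an equispaced grid (an illustrative setup, not necessarily the quantifier construction used in the paper):

import numpy as np

def w1_on_grid(w_a, w_b):
    # Wasserstein-1 distance between two discrete distributions supported on
    # the equispaced grid i/n, i = 1..n: sum |F - G| times the grid spacing.
    F, G = np.cumsum(w_a), np.cumsum(w_b)
    return float(np.sum(np.abs(F - G)) / len(w_a))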


2023


[PDF] aaai.org

Wasserstein Graph Distance Based on L1–Approximated Tree Edit Distance between Weisfeiler–Lehman Subtrees

Z Fang, J Huang, X Su, H Kasai - … of the AAAI Conference on Artificial …, 2023 - ojs.aaai.org

… In this paper, we propose a novel graph metric called the Wasserstein WL Subtree (WWLS) …

Subsequently, we combine the Wasserstein distance and the L1TED to define the WWLS …

Cited by 3 Related articles All 7 versions


[PDF] arxiv.org

Optimizing the Wasserstein GAN for TeV Gamma Ray Detection with VERITAS

D Ribeiro, Y Zheng, R Sankar, K Mantha - arXiv preprint arXiv:2309.12221, 2023 - arxiv.org

… In this study, we propose an unsupervised Wasserstein Generative Adversarial Network (WGAN) …

the model (WGAN-gp,[10]). In this project, we utilize WGAN-gp (hereafter just WGAN) to …

Cited by 1 Related articles All 4 versions 



Wasserstein‐metric‐based distributionally robust optimization method for unit commitment considering wind turbine uncertainty

G Chen, D Qi, Y Yan, Y Chen, Y Wang… - Engineering …, 2023 - Wiley Online Library

… Based on Wasserstein metric, an ambiguity set is established to reflect the probabilistic …

controlling the sample size and the confidence of Wasserstein ambiguity set radius. In addition, …

 Cited by 2 Related articles All 5 versions



[HTML] springer.com

[HTML] Data-driven decadal climate forecasting using Wasserstein time-series generative adversarial networks

A Bouteska, ML Seranto, P Hajek… - Annals of Operations …, 2023 - Springer

… To solve the problem of low or zero gradients, here we propose to use the Wasserstein loss

function in the TimeGAN model. The Wasserstein loss function is grounded on the distance …

Related articles


[PDF] openreview.net

In Defence Of Wasserstein

A Elnekave, Y Weiss - 2023 - openreview.net

… the connection between Wasserstein distance and WGANs. The … WGAN training protocol

is employed, WGANs with a CNN-GAP discriminator indeed minimize the patch Wasserstein …

Related articles 

<–—2023———2023——2220— 



The Wasserstein metric matrix and its computational property

ZZ Bai - Linear Algebra and its Applications, 2024 - Elsevier

… computational properties about the Wasserstein-1 metric … one- and two-dimensional

Wasserstein-1 metric matrices, as … generalized and extended Wasserstein-1 metric matrices. …

 Related articles



2023 see 2022. [PDF] eartharxiv.org

[PDF] Comparing detrital age spectra, and other geological distributions, using the Wasserstein distance

A Lipp, P Vermeesch - Geochronology, 2023 - eartharxiv.org

… For the toy example, the Wasserstein distance simply … In the following sections, we first

introduce the Wasserstein … We then proceed to compare the Wasserstein distance to the KS …

 Cited by 2 Related articles All 3 versions 



[HTML] arxiv.org

Neural Entropic Gromov-Wasserstein Alignment

T Wang, Z Goldfeld - arXiv preprint arXiv:2312.07397, 2023 - arxiv.org

The Gromov-Wasserstein (GW) distance, rooted in optimal transport (OT) theory, provides a

natural framework for aligning heterogeneous datasets. Alas, statistical estimation of the GW …

Related articles All 2 versions 



Optimal Charging of Lithium-Ion Battery Using Distributionally Robust Model Predictive Control With Wasserstein Metric

G Dong, Z Zhu, Y Lou, J Yu, L Wu… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org

Developing a fast and safe charging strategy has been one of the key breakthrough points in

lithium battery development owing to its range anxiety and long charging time. The majority …

 Related articles



[PDF] arxiv.org

Efficient Solvers for Partial Gromov-Wasserstein

Y Bai, RD Martin, H Du, A Shahbazi… - arXiv preprint arXiv …, 2024 - arxiv.org

The partial Gromov-Wasserstein (PGW) problem facilitates the comparison of measures with

unequal masses residing in potentially distinct metric spaces, thereby enabling unbalanced …

 Related articles All 2 versions 


2023



[HTML] Stable and Fast Deep Mutual Information Maximization Based on Wasserstein Distance

X He, C Peng, L Wang, W Tan, Z Wang - Entropy, 2023 - mdpi.com

… Wasserstein distance metric encoder and the prior distribution as the loss of the prior

discriminator based on the superiority of the Wasserstein … value of the Wasserstein distance metric …

Related articles All 9 versions 




An oil wear particle identification method based on Wasserstein generative adversarial network and improved CNN using a custom-built optical imaging sensor

Z Liu, Y Liu, F Bai, H Zuo, J Dhupia, H Fei - Measurement, 2024 - Elsevier

… the image data, we present an innovative method based on Wasserstein Generative

Adversarial Network with gradient penalty (WGAN-GP). This method is devised to augment the …

Related articles



Bayesian Nonparametric Two-Stage Distributionally Robust Unit Commitment Optimization: From Global Multimodality to Local Trimming-Wasserstein Ambiguity

X Ma, C Ning, L Li, H Qiu, W Gu… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org

Uncertainty brought by the deep penetration of renewable energy has imposed great challenges

on the operation of power systems. Accurately characterizing the global-local features …

 Related articles



Wasserstein filter for variable screening in binary classification in the reproducing kernel Hilbert space

S Jeong, C Kim, H Yang - Journal of Nonparametric Statistics, 2024 - Taylor & Francis

… classification based on the Wasserstein distance accounting for … (RKHS), we consider the

Wasserstein filter's capacity to detect the … We prove that the Wasserstein filter satisfies the sure …

Cited by 1 Related articles



Wasserstein distance-based full waveform inversion method for density reconstruction

H Liu, G Wu, Z Jia, Q Li, J Shan, S Yang - Journal of Applied Geophysics, 2024 - Elsevier

… This paper introduces the Wasserstein distance into the full waveform inversion for velocity …

Furthermore, due to the sensitivity of the quadratic Wasserstein distance to low-frequency, we …

 Related articles

<–—2023———2023——2230—



Grownbb: Gromov–Wasserstein learning of neural best buddies for cross-domain correspondence

R Tang, W Wang, Y Han, X Feng - The Visual Computer, 2024 - Springer

Identifying pixel correspondences between two images is a fundamental task in computer

vision, and has been widely used for 3D reconstruction, image morphing, and image retrieval. …

Related articles



[HTML] arxiv.org

Hamilton--Jacobi equations for Wasserstein controlled gradient flows: existence of viscosity solutions

G Conforti, RC Kraaij, L Tamanini, D Tonon - arXiv preprint arXiv …, 2024 - arxiv.org

… work applies to controlled Wasserstein gradient flows only, … may well differ from the

Wasserstein space. Some candidate … definition of Wasserstein distance and Wasserstein space. In …

Cited by 1 Related articles All 2 versions 




Sig‐Wasserstein GANs for conditional time series generation

S Liao, H Ni, M Sabate‐Vidales, L Szpruch… - Mathematical …, 2023 - Wiley Online Library

… We propose the generic conditional SigWGAN framework by integrating Wasserstein-GANs

(WGANs) with mathematically principled and efficient path feature extraction called the …

Related articles All 2 versions


[HTML] mdpi.com

[HTML] Wasserstein-Enabled Leaks Localization in Water Distribution Networks

A Ponti, I Giordani, A Candelieri, F Archetti - Water, 2024 - mdpi.com

… can be captured by the Wasserstein distance. This choice … in the Wasserstein space using

Wasserstein barycenters as … distribution endowed with the Wasserstein distance. Experiments …

Cited by 1 Related articles All 3 versions 


[PDF] arxiv.org

Wasserstein distributionally robust optimization and its tractable regularization formulations

H Chu, M Lin, KC Toh - arXiv preprint arXiv:2402.03942, 2024 - arxiv.org

… We study a variety of Wasserstein distributionally robust optimization (WDRO) problems

where the distributions in the ambiguity set are chosen by constraining their Wasserstein …

Related articles All 2 versions 
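
Editor's note: the object shared by the distributionally-robust entries on this page is the Wasserstein ambiguity set, the ball of distributions within radius δ of the empirical measure, and the DRO problem that hedges against the worst case over it (generic notation, not specific to the cited paper):

\mathbb{B}_\delta(\widehat{P}_n) = \{ Q : W_p(Q, \widehat{P}_n) \le \delta \}, \qquad \min_{\theta} \ \sup_{Q \in \mathbb{B}_\delta(\widehat{P}_n)} \ \mathbb{E}_{Q}[\ell(\theta;\xi)].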


2023


[PDF] arxiv.org

Wasserstein Auto-Encoders of Merge Trees (and Persistence Diagrams)

M Pont, J Tierny - IEEE Transactions on Visualization and …, 2023 - ieeexplore.ieee.org

… the Wasserstein auto-encoding of merge trees (MT-WAE), a novel extension of the classical

auto-encoder neural network architecture to the Wasserstein … both the Wasserstein distances …

Cited by 1 Related articles All 6 versions



[PDF] ieee.org

Explainable AI using the Wasserstein Distance

SS Chaudhury, P Sadhukhan, K Sengupta - IEEE Access, 2024 - ieeexplore.ieee.org

… A pair of classes (distribution of points) with more Wasserstein distance between them will …

of Wasserstein distance. For a dataset, we propose to measure the Wasserstein distances for …

Related articles All 2 versions



Conditional Wasserstein Distances with Applications in Bayesian OT Flow Matching

J Chemseddine, P Hagemann, C Wald… - arXiv preprint arXiv …, 2024 - arxiv.org

… true for the Wasserstein distance. In this paper, we introduce a conditional Wasserstein

distance via a set of restricted couplings that equals the expected Wasserstein distance of the …

Related articles All 2 versions 



[HTML] optica.org

Full View

Combination of near-infrared spectroscopy with Wasserstein generative adversarial networks for rapidly detecting raw material quality for formula products

X Xin, J Jia, S Pang, R Hu, H Gong, X Gao, X Ding - Optics Express, 2024 - opg.optica.org

… NIRS with Wasserstein generative adversarial networks (WGANs). … Then, the WGAN

augments the database by generating … Experimental results show the NIRS-WGAN method …

Related articles All 2 versions



[PDF] ams.org

𝐿₁-distortion of Wasserstein metrics: A tale of two dimensions

F Baudier, C Gartland, T Schlumprecht - Transactions of the American …, 2023 - ams.org

… The metric space (P(X), dW1 ) is referred to as the 1-Wasserstein space over X, and we

denote it by Wa1(X). Wasserstein metrics are of high theoretical interest but most importantly they …

Cited by 2 Related articles All 7 versions

<–—2023———2023——2240—



 An infrared small target detection model via Gather-Excite attention and normalized Wasserstein distance

K Sun, J Huo, Q Liu, S Yang - … and engineering: MBE, 2023 - pubmed.ncbi.nlm.nih.gov

Infrared small target detection (ISTD) is the main research content for defense confrontation,

long-range precision strikes and battlefield intelligence reconnaissance. Targets from the …

Cited by 1 Related articles All 3 versions

[HTML] arxiv.org

A Statistical Analysis of Wasserstein Autoencoders for Intrinsically Low-dimensional Data

S Chakraborty, PL Bartlett - arXiv preprint arXiv:2402.15710, 2024 - arxiv.org

Variational Autoencoders (VAEs) have gained significant popularity among researchers as

a powerful tool for understanding unknown distributions based on limited samples. This …

Related articles All 3 versions 



[HTML] arxiv.org

Wasserstein perspective of Vanilla GANs

L Kunkel, M Trabs - arXiv preprint arXiv:2403.15312, 2024 - arxiv.org

… Wasserstein GANs can be extended to Vanilla GANs. In particular, we obtain an oracle

inequality for Vanilla GANs in Wasserstein … GANs as well as Wasserstein GANs as estimators of …

Cite Related articles All 2 versions 


A proximal forward-backward splitting based algorithmic framework for Wasserstein logistic regression using heavy ball strategy

B Zhou, Y Yuan, Q Song - International Journal of Systems …, 2024 - Taylor & Francis

In this paper, a forward-backward splitting based algorithmic framework incorporating the

heavy ball strategy is proposed so as to efficiently solve the Wasserstein logistic regression …

Related articles All 5 versions


Transition Time Determination of Single-Molecule FRET Trajectories via Wasserstein Distance Analysis in Steady-State Variations in smFRET (WAVE)

T Chen, F Gao, YW Tan - The Journal of Physical Chemistry B, 2023 - ACS Publications

… In this study, we introduce a novel methodology called WAVE (Wasserstein distance Analysis

in … We then apply Maximum Wasserstein Distance analysis to differentiate the FRET state …

Related articles All 3 versions



2023



[HTML] arxiv.org

Distributional Reduction: Unifying Dimensionality Reduction and Clustering with Gromov-Wasserstein Projection

H Van Assel, C Vincent-Cuaz, N Courty… - arXiv preprint arXiv …, 2024 - arxiv.org

… Leveraging tools from optimal transport, particularly the Gromov-Wasserstein distance, we

… We review in this section the Gromov-Wasserstein formulation of OT aiming at comparing …

Related articles All 2 versions 



2023 see 2022. [PDF] arxiv.org

On combinatorial properties of greedy Wasserstein minimization

S Steinerberger - Journal of Mathematical Analysis and Applications, 2024 - Elsevier

We discuss a phenomenon where Optimal Transport leads to a remarkable amount of

combinatorial regularity. Consider infinite sequences ( x k ) k = 1 ∞ in [ 0 , 1 ] constructed in a …

Cited by 3 Related articles All 3 versions

Shared wasserstein adversarial domain adaption

S Yao, Y Chen, Y Zhang, Z Xiao, J Ni - Multimedia Tools and Applications, 2024 - Springer

In numerous real-world applications, obtaining labeled data for a specific deep learning

task can be prohibitively expensive. We present an innovative framework for unsupervised …

Related articles


Improving Android Malware Detection Through Data Augmentation Using Wasserstein Generative Adversarial Networks

K Stalin, MB Mekoya - arXiv preprint arXiv:2403.00890, 2024 - arxiv.org

Generative Adversarial Networks (GANs) have demonstrated their versatility across various

applications, including data augmentation and malware detection. This research explores …

Related articles All 2 versions


2023 see 2022

Single image super-resolution using Wasserstein generative adversarial network with gradient penalty

Y Tang, C Liu, X Zhang - Pattern Recognition Letters, 2022 - Elsevier

… based on Wasserstein GAN, which is a training more stable GAN with Wasserstein metric. …

and stable, two modifications are made on the original WGAN. First, a gradient penalty (GP) is …

Cited by 8 Related articles All 3 versions

<–—2023———2023——2250—



[HTML] rsc.org

[HTML] Augmentation of FTIR spectral datasets using Wasserstein generative adversarial networks for cancer liquid biopsies

RG McHardy, G Antoniou, JJA Conn, MJ Baker… - Analyst, 2023 - pubs.rsc.org

… The results show that WGAN augmented spectra improve … no augmented spectra, adding

WGAN augmented spectra to a … , data augmentation using a WGAN led to an increase in AUC …

Cited by 3 Related articles All 10 versions


2023 see 2022.

[PDF] openrepository.com

Wasserstein gan based chest x-ray dataset augmentation for deep learning models: Covid-19 detection use-case

BZ Hussain, I Andleeb, MS Ansari… - 2022 44th annual …, 2022 - ieeexplore.ieee.org

… of a Wasserstein Generative Adversarial Network (WGAN) could lead to an effective and

lightweight solution. It is demonstrated that the WGAN … Therefore we propose a WGAN based …

Cited by 12 Related articles All 8 versions


 


2023 see 2022. [PDF] arxiv.org

Wasserstein-based graph alignment

HP Maretic, M El Gheche, M Minder… - … on Signal and …, 2022 - ieeexplore.ieee.org

… , where we consider the Wasserstein distance to measure the … Wasserstein distance

combined with the one-to-many graph assignment permits to outperform both Gromov-Wasserstein …

Cited by 22 Related articles All 6 versions



Wasserstein Embedding Learning for Deep Clustering: A Generative Approach

J Cai, Y Zhang, S Wang, J Fan… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org

… We provide two realization approaches to the Wasserstein embedding clustering, one is …

introduce the Wasserstein embedding learning to address this issue, as Wasserstein distance …

Related articles


[HTML] springer.com

[HTML] Wasserstein GAN-based architecture to generate collaborative filtering synthetic datasets

J Bobadilla, A Gutiérrez - Applied Intelligence, 2024 - Springer

… To reduce mode collapse even further, our proposed WGANRS method introduces the

Wasserstein concept into the GAN kernel (Fig. 1c). The Wasserstein approach has been shown to …

Related articles


2023



2023 see 2022. [PDF] arxiv.org

Accelerated Bregman primal-dual methods applied to optimal transport and Wasserstein Barycenter problems

A Chambolle, JP Contreras - SIAM Journal on Mathematics of Data Science, 2022 - SIAM

This paper discusses the efficiency of Hybrid Primal-Dual (HPD) type algorithms to approximately

solve discrete Optimal Transport (OT) and Wasserstein Barycenter (WB) problems, …

Cited by 14 Related articles All 7 versions


2023 see 2022.

[PDF] arxiv.org

Projected Wasserstein gradient descent for high-dimensional Bayesian inference

Y Wang, P Chen, W Li - SIAM/ASA Journal on Uncertainty Quantification, 2022 - SIAM

… Wasserstein gradient descent method (pWGD) for high-dimensional Bayesian inference

problems. The underlying density function of a particle system of Wasserstein … Wasserstein …

Cited by 18 Related articles All 4 versions




2023 see 2022.  [PDF] arxiv.org

Improving EEG Signal Classification Accuracy Using Wasserstein Generative Adversarial Networks

J Park, P Mahey, O Adeniyi - arXiv preprint arXiv:2402.09453, 2024 - arxiv.org

… The WGAN was trained on the BCI2000 dataset, consisting of around 1500 … WGAN model

was able to emulate the spectral and spatial properties of the EEG training data. The WGAN-…

 Related articles All 2 versions 



[PDF] researchgate.net

Lifewatch: Lifelong wasserstein change point detection

K Faber, R Corizzo, B Sniezynski… - … Joint Conference on …, 2022 - ieeexplore.ieee.org

… In this paper, we attempt to fill this gap by proposing LIFEWATCH, a novel Wasserstein-based

change point detection approach with memory capable of modeling multiple data …

 Cited by 10 Related articles All 2 versions


2023 see 2022. [PDF] arxiv.org

Randomized Wasserstein barycenter computation: resampling with statistical guarantees

F Heinemann, A Munk, Y Zemel - SIAM Journal on Mathematics of Data …, 2022 - SIAM

We propose a hybrid resampling method to approximate finitely supported Wasserstein

barycenters on large-scale datasets, which can be combined with any exact solver. …

Cited by 16 Related articles All 6 versions

<–—2023———2023——2260—



[PDF] univr.it

Dynamical Systems and Hamilton–Jacobi–Bellman Equations on the Wasserstein Space and their L2 Representations

C Jimenez, A Marigonda, M Quincampoix - SIAM Journal on Mathematical …, 2023 - SIAM

… control problems, both stated in the Wasserstein space of probability measures. Since … the

Wasserstein space and to investigate the relations between dynamical systems in Wasserstein …

Cited by 8 Related articles All 10 versions



Novel dual-network autoencoder based adversarial domain adaptation with Wasserstein divergence for fault diagnosis of unlabeled data

JF Yang, N Zhang, YL He, QX Zhu, Y Xu - Expert Systems with Applications, 2024 - Elsevier

… domain adaptation with Wasserstein divergence (DWADA). … the Wasserstein distance that

measures the difference in feature distribution between different domains. Finally, Wasserstein …

Cited by 2 Related articles All 2 versions



2023 see 2022

A novel multi-speakers Urdu singing voices synthesizer using Wasserstein Generative Adversarial Network

A Saeed, MF Hayat, T Habib, DA Ghaffar… - Speech …, 2022 - Elsevier

In this paper, the first-ever Urdu language singing voices corpus is developed using linguistic

(phonetic) and vocoder (F0 contours) features. Singer identity feature vector along with the …

Cited by 3 Related articles All 2 versions



[PDF] arxiv.org

Density of subalgebras of Lipschitz functions in metric Sobolev spaces and applications to Wasserstein Sobolev spaces

M Fornasier, G Savaré, GE Sodini - Journal of Functional Analysis, 2023 - Elsevier

… linear continuous Wasserstein-… Wasserstein Sobolev spaces. In particular, the techniques

developed in the present paper can also be applied to study the general class of Wasserstein …

Cited by 8 Related articles All 8 versions


[HTML] mdpi.com

[HTML] Hyperspectral anomaly detection based on wasserstein distance and spatial filtering

X Cheng, M Wen, C Gao, Y Wang - Remote Sensing, 2022 - mdpi.com

… This article proposes a hyperspectral AD method based on Wasserstein distance (WD)

and spatial filtering (called AD-WDSF). Based on the assumption that both background and …

Cited by 8 Related articles All 5 versions 


2023



2023 see 2022

EvaGoNet: An integrated network of variational autoencoder and Wasserstein generative adversarial network with gradient penalty for binary classification tasks

C Luo, Y Xu, Y Shao, Z Wang, J Hu, J Yuan, Y Liu… - Information …, 2023 - Elsevier

Feature engineering is an effective method for solving classification problems. Many existing

feature engineering studies have focused on image or video data and not on structured data…

Cited by 2 Related articles All 2 versions



2023 see 2022

Decision making under model uncertainty: Fréchet–Wasserstein mean preferences

EV Petracou, A Xepapadeas… - Management …, 2022 - pubsonline.informs.org

This paper contributes to the literature on decision making under multiple probability models

by studying a class of variational preferences. These preferences are defined in terms of …

Cited by 15 Related articles All 5 versions



2023 see 2022

Conditional wasserstein gan for energy load forecasting in large buildings

GS Năstăsescu, DC Cercel - 2022 International Joint …, 2022 - ieeexplore.ieee.org

Energy forecasting is necessary for planning electricity consumption, and large buildings

play a huge role when making these predictions. Because of its importance, numerous …

Cited by 4 Related articles All 2 versions



2023 see 2022. [PDF] arxiv.org

Rate of convergence for particle approximation of PDEs in Wasserstein space

M Germain, H Pham, X Warin - Journal of Applied Probability, 2022 - cambridge.org

We prove a rate of convergence for the N-particle approximation of a second-order partial

differential equation in the space of probability measures, such as the master equation or …

Cited by 21 Related articles All 21 versions



2023 see 2022. [PDF] arxiv.org. [PDF] aaai.org

Variational wasserstein barycenters with c-cyclical monotonicity regularization

J Chi, Z Yang, X Li, J Ouyang, R Guan - Proceedings of the AAAI …, 2023 - ojs.aaai.org

… The barycenter of multiple given probability distributions under Wasserstein distance is … of

Wasserstein distances to all input distributions. Due to geometric properties, the Wasserstein …

Cited by 3 Related articles All 3 versions

<–—2023———2023——2270—



2023 see 2022 [PDF] researchgate.net

[PDF] The sketched Wasserstein distance for mixture distributions

X Bing, F Bunea, J Niles-Weed - arXiv preprint arXiv:2206.12768, 2022 - researchgate.net

… Wasserstein space over X = (A,d). This result establishes a universality property for the

Wasserstein … on the risk of estimating the Wasserstein distance between distributions on a K-point …

Cited by 7 Related articles 




[HTML] arxiv.org

On the metric property of quantum Wasserstein divergences

G Bunth, J Pitrik, T Titkos, D Virosztek - arXiv preprint arXiv:2402.13150, 2024 - arxiv.org

… Quantum Wasserstein divergences are modified versions of quantum Wasserstein … We

prove triangle inequality for quantum Wasserstein divergences for any finitedimensional …

Cited by 1 Related articles All 3 versions 



2023 see 2022

RoBiGAN: A bidirectional Wasserstein GAN approach for online robot fault diagnosis via internal anomaly detection

T Schnell, K Bott, L Puck, T Buettner… - 2022 IEEE/RSJ …, 2022 - ieeexplore.ieee.org

… Finally, TadGAN [17] offers a model similar to the BiGAN architecture using Wasserstein …

Therefore, we introduce a bidirectional Wasserstein GAN architecture fit for online anomaly …

Cited by 3 Related articles



[PDF] arxiv.org

WGAN-AFL: Seed Generation Augmented Fuzzer with Wasserstein-GAN

L Yang, C Li, Y Qiu, C Wei, J Yang, H Guo… - arXiv preprint arXiv …, 2024 - arxiv.org

… with WGAN, which leverages the wasserstein distance for GAN model optimization. WGAN

provides … Furthermore, We substitute GAN with WGAN, which effectively mitigates the gradient …

Related articles All 2 versions 



[PDF] neurips.cc

Fast Bellman Updates for Wasserstein Distributionally Robust MDPs

Z Yu, L Dai, S Xu, S Gao, CP Ho - Advances in Neural …, 2024 - proceedings.neurips.cc

… for solving distributionally robust MDPs with Wasserstein ambiguity sets. By exploiting the

… and actions when the distance metric of the Wasserstein distance is chosen to be $ L_1 $, $ …

Cited by 2 Related articles All 2 versions 


2023



[HTML] Unified topological inference for brain networks in temporal lobe epilepsy using the Wasserstein distance

MK Chung, CG Ramos, FB De Paiva, J Mathis… - NeuroImage, 2023 - Elsevier

… However, the Wasserstein distance in these applications is purely geometric in nature, and

… graphs through the Wasserstein distance. We directly build the Wasserstein distance using …

Cited by 4 Related articles All 11 versions



Wasserstein generative adversarial networks for modeling marked events

SHS Dizaji, S Pashazadeh, JM Niya - The Journal of Supercomputing, 2023 - Springer

… for Marks In addition to our proposed conditional WGAN model for marked events, the

original WGAN method was trained with an independent WGAN model for generating marks of …

Cited by 1 Related articles All 3 versions



[PDF] neurips.cc

On distributionally robust generalized Nash games defined over …

X Bai, G He, Y Jiang, J Obloj - Advances in Neural …, 2024 - proceedings.neurips.cc

… using techniques of Wasserstein distributionally robust … Wasserstein ambiguity set

B_δ(P), which is a ball centered at the reference distribution P with radius δ under the Wasserstein …

Related articles All 4 versions 



[PDF] arxiv.org

Humanmimic: Learning natural locomotion and transitions for humanoid robot via wasserstein adversarial imitation

A Tang, T Hiraoka, N Hiraoka, F Shi… - arXiv preprint arXiv …, 2023 - arxiv.org

… In this study, we introduce a Wasserstein adversarial imitation learning system, allowing …

Additionally, we employ a specific Integral Probabilistic Metric (IPM), namely the Wasserstein-1 …

Cited by 3 Related articles All 2 versions 



Empirical measures and random walks on compact spaces in the quadratic Wasserstein metric

B Borda - Annales de l'Institut Henri Poincare (B) Probabilites et …, 2023 - projecteuclid.org

Estimating the rate of convergence of the empirical measure of an iid sample to the reference

measure is a classical problem in probability theory. Extending recent results of Ambrosio, …

Cited by 2 Related articles All 3 versions



[PDF] researchgate.net

Contrastive prototypical network with wasserstein confidence penalty

H Wang, ZH Deng - European Conference on Computer Vision, 2022 - Springer

… To this end, we propose Wasserstein … Wasserstein distance and introduce the semantic

relationships with cost matrix. With semantic relationships as prior information, our Wasserstein …

Cited by 3 Related articles All 5 versions



[PDF] mlr.press

Wasserstein distributional learning via majorization-minimization

C Tang, N Lenssen, Y Wei… - … Conference on Artificial …, 2023 - proceedings.mlr.press

… the Wasserstein loss is notoriously challenging, which has been the obstacle for distributional

learning under the Wasserstein … problem associated with the Wasserstein geometry. It …

Cited by 1 Related articles 





On distributionally robust generalized Nash games defined over the Wasserstein ball

F Fabiani, B Franci - Journal of Optimization Theory and Applications, 2023 - Springer

In this paper we propose an exact, deterministic, and fully continuous reformulation of

generalized Nash games characterized by the presence of soft coupling constraints in the form of …

Cited by 4 Related articles All 6 versions



Well-posedness of Hamilton-Jacobi equations in the Wasserstein space: non-convex Hamiltonians and common noise

S Daudin, J Jackson, B Seeger - arXiv preprint arXiv:2312.02324, 2023 - arxiv.org

We establish the well-posedness of viscosity solutions for a class of semi-linear Hamilton-Jacobi

equations set on the space of probability measures on the torus. In particular, we focus …

Cited by 5 Related articles All 3 versions 

<–—2023———2023——2280— 




2023 see 2021

Projected statistical methods for distributional data on the real line with the Wasserstein metric

M Pegoraro, M Beraha - Journal of Machine Learning Research, 2022 - jmlr.org

… Second, by exploiting a geometric characterization of Wasserstein space closely related

to its weak Riemannian structure, we build a novel approximation of the Wasserstein space …

Cited by 15 Related articles All 12 versions 



2023 see 2022.  [HTML] rsc.org

[HTML] Pesticide detection combining the Wasserstein generative adversarial network and the residual neural network based on terahertz spectroscopy

R Yang, Y Li, B Qin, D Zhao, Y Gan, J Zheng - RSC advances, 2022 - pubs.rsc.org

… a WGAN-ResNet method, which combines two deep learning networks, the Wasserstein

generative adversarial network (WGAN) … The Wasserstein generative adversarial network and …

Cited by 8 Related articles All 8 versions



[PDF] researchsquare.com

Enhancing genomic data synthesis: A WGAN-GP approach for haplotype generation and evaluation using quasi Manhattan Wasserstein distance

EU Lim, AMW Lim, CSJ Fann - 2024 - researchsquare.com

… the efficacy of Wasserstein GANs with Gradient Penalty (WGAN-GP) in generating synthetic

haplotype data. Overcoming challenges observed in traditional GANs, WGAN-GP produced …

Cite Related articles All 3 versions 


[PDF] arxiv.org

Geometric sparse coding in Wasserstein space

M Mueller, S Aeron, JM Murphy, A Tasissa - arXiv preprint arXiv …, 2022 - arxiv.org

… regularizer for Wasserstein space … in Wasserstein space and addresses the problem of

non-uniqueness of barycentric representation. Moreover, when data is generated as Wasserstein …

Cited by 4 Related articles All 2 versions 



[HTML] oup.com

Full View

Morphological classification of radio galaxies with Wasserstein generative adversarial network-supported augmentation

L Rustige, J Kummer, F Griese, K Borras… - RAS Techniques …, 2023 - academic.oup.com

… models, specifically Wasserstein generative adversarial … data with images from our wGAN

on three different classification … In addition, we apply wGAN-supported augmentation to a …

Cited by 4 Related articles All 4 versions


2023

[PDF] arxiv.org

Sliced Wasserstein with random-path projecting directions

K Nguyen, S Zhang, T Le, N Ho - arXiv preprint arXiv:2401.15889, 2024 - arxiv.org

… From the RPSD, we introduce two novel variants of sliced Wasserstein. The first variant is

called random-path projection sliced Wasserstein (RPSW), which replaces the uniform …

Cited by 1 Related articles All 2 versions 

Imbalanced fault diagnosis using conditional wasserstein generative adversarial networks with switchable normalization

W Fu, Y Chen, H Li, X Chen, B Chen - IEEE Sensors Journal, 2023 - ieeexplore.ieee.org

… of GAN training, wasserstein generative adversarial network (WGAN) has been proposed,

which adopt the Wasserstein distance to … The Wasserstein distance is defined as follows: …

Cited by 2 Related articles All 2 versions

2023 see 2022

Super-resolution of Sentinel-2 images using Wasserstein GAN

H Latif, S Ghuffar, HM Ahmad - Remote Sensing Letters, 2022 - Taylor & Francis

… and proposes DSen2-Wasserstein GAN (DSen2-WGAN), which … study of WGAN on

super-resolution of Sentinel-2 images. … This paper proposes a new approach: DSen2-WGAN to …

Cited by 2 Related articles All 2 versions

2023 see 2022. [PDF] arxiv.org

The performance of Wasserstein distributionally robust M-estimators in high dimensions

L Aolaritei, S Shafieezadeh-Abadeh… - arXiv preprint arXiv …, 2022 - arxiv.org

… a Wasserstein sense, to the empirical distribution. In this paper, we propose a Wasserstein

… work to study this problem in the context of Wasserstein distributionally robust M-estimation. …

Cited by 7 Related articles All 2 versions 



2023 see 2022.  [HTML] springer.com

[HTML] The general class of Wasserstein Sobolev spaces: density of cylinder functions, reflexivity, uniform convexity and Clarkson's inequalities

GE Sodini - Calculus of Variations and Partial Differential …, 2023 - Springer

We show that the algebra of cylinder functions in the Wasserstein Sobolev space H^{1,q}(P_p(X,d), W_{p,d}, m) …

Cited by 6 Related articles All 11 versions

<–—2023———2023——2290—




Intelligent Bearing Anomaly Detection for Industrial Internet of Things Based on Auto-Encoder Wasserstein Generative Adversarial Network

R Liu, D Xiao, D Lin, W Zhang - IEEE Internet of Things Journal, 2024 - ieeexplore.ieee.org

… Wasserstein distance [28] is proposed, which provides a smoother training process, thereby

enhancing stability. In WGAN, the discriminator quantifies the Wasserstein … called WGAN-GP …

Related articles



[PDF] mlr.press

Hyperbolic sliced-wasserstein via geodesic and horospherical projections

C Bonet, L Chapel, L Drumetz… - Topological, Algebraic …, 2023 - proceedings.mlr.press

… background on Optimal Transport with the Wasserstein and the slicedWasserstein distance.

We then review … The main tool of OT is the Wasserstein distance which we introduce now. …

Cited by 5 Related articles All 9 versions 


Wasserstein barycenter for link prediction in temporal networks

A Spelta, N Pecora - Journal of the Royal Statistical Society …, 2024 - academic.oup.com

… problem associated with Wasserstein barycenter, which is … is established, the Wasserstein

barycentric coordinates are … that minimises the sum of its Wasserstein distances to each past …

Cited by 1 Related articles All 4 versions
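
Editor's note: for one-dimensional distributions, the Wasserstein-2 barycenter mentioned in the snippet has a simple characterization: its quantile function is the (weighted) average of the input quantile functions. A minimal sketch for equal-size empirical samples (a generic fact, not the temporal-network method of the cited paper):

import numpy as np

def w2_barycenter_1d(samples, weights=None):
    # W2 barycenter of 1-D empirical measures with equal sample sizes:
    # average the sorted samples, i.e. the empirical quantile functions.
    sorted_samples = np.stack([np.sort(np.asarray(s, dtype=float)) for s in samples])
    if weights is None:
        weights = np.full(len(samples), 1.0 / len(samples))
    return np.asarray(weights) @ sorted_samples  # support points of the barycenter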


[HTML] mdpi.com

[HTML] Computing the Gromov-Wasserstein distance between two surface meshes using optimal transport

P Koehl, M Delarue, H Orland - Algorithms, 2023 - mdpi.com

The Gromov-Wasserstein (GW) formalism can be seen as a generalization of the optimal

transport (OT) formalism for comparing two distributions associated with different metric spaces. …

Cited by 4 Related articles All 5 versions 


[HTML] mdpi.com

[HTML] A Novel Approach to Satellite Component Health Assessment Based on the Wasserstein Distance and Spectral Clustering

Y Hui, Y Cheng, B Jiang, X Han, L Yang - Applied Sciences, 2023 - mdpi.com

This research presents a multiparameter approach to satellite component health assessment

aimed at addressing the increasing demand for in-orbit satellite component health …

Cited by 2 Related articles All 4 versions 


2023


[PDF] neurips.cc

Disentangled Wasserstein Autoencoder for T-Cell Receptor Engineering

T Li, H Guo, F Grazioli, M Gerstein… - Advances in Neural …, 2024 - proceedings.neurips.cc

… To automate this process from a data-driven perspective, we propose a disentangled

Wasserstein autoencoder with an auxiliary classifier, which isolates the function-related patterns …

Cited by 2 Related articles All 4 versions 



[PDF] arxiv.org

Convergence analysis for general probability flow ODEs of diffusion models in Wasserstein distances

X Gao, L Zhu - arXiv preprint arXiv:2401.17958, 2024 - arxiv.org

… Can we establish Wasserstein convergence guarantees for probability flow ODE … 2-Wasserstein

distance, we can decompose the 2-Wasserstein error in terms of the 2-Wasserstein error …

Cited by 3 Related articles All 2 versions 



[HTML] mdpi.com

[HTML] One-Dimensional Convolutional Wasserstein Generative Adversarial Network Based Intrusion Detection Method for Industrial Control Systems

Z Cai, H Du, H Wang, J Zhang, Y Si, P Li - Electronics, 2023 - mdpi.com

… can generate more samples by optimizing the Wasserstein distance. In general, WGANs are

… generation method, 1D CWGAN, which integrates 1D CNN and WGAN. The algorithm uses …

Cited by 1 Related articles All 3 versions 



[HTML] quantum-journal.org

[HTML] Quantum Wasserstein distance based on an optimization over separable states

G Tóth, J Pitrik - Quantum, 2023 - quantum-journal.org

… We define the quantum Wasserstein distance such that the … We discuss how the quantum

Wasserstein distance … can be obtained from the quantum Wasserstein distance by replacing the …

Cited by 3 Related articles All 15 versions 


[HTML] mdpi.com

[HTML] Learning Wasserstein Contrastive Color Histogram Representation for Low-Light Image Enhancement

Z Sun, S Hu, H Song, P Liang - Mathematics, 2023 - mdpi.com

… To this end, this paper introduces a Wasserstein contrastive regularization method (WCR) …

Afterwards, to ensure color consistency, we utilize the Wasserstein distance (WD) to quantify …

Cited by 1 Related articles All 5 versions 

<–—2023———2023——2300—


[PDF] arxiv.org

An empirical study of simplicial representation learning with wasserstein distance

M Yamada, Y Takezawa, G Houry… - arXiv preprint arXiv …, 2023 - arxiv.org

In this paper, we delve into the problem of simplicial representation learning utilizing the 1-Wasserstein

distance on a tree structure (aka, Tree-Wasserstein distance (TWD)), where TWD …

Cited by 1 Related articles All 3 versions 


2023 see 2022 [PDF] ams.org

Master Bellman equation in the Wasserstein space: Uniqueness of viscosity solutions

A Cosso, F Gozzi, I Kharroubi, H Pham… - Transactions of the …, 2024 - ams.org

… We study the Bellman equation in the Wasserstein space aris… -Lions extended to our

Wasserstein setting, we prove a … nature of the underlying Wasserstein space. The adopted strategy …

Cited by 33 Related articles All 15 versions



Regularized Hypothesis-Induced Wasserstein Divergence for unsupervised domain adaptation

L Si, H Dong, W Qiang, C Zheng, J Yu, F Sun - Knowledge-Based Systems, 2024 - Elsevier

… We use the Wasserstein distance as a metric to measure the divergence between two

probability distributions. The Wasserstein distance has the advantage of being continuous and …

Cited by 1 Related articles All 2 versions
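Note: the snippet above points to the continuity of the Wasserstein distance between probability distributions. A minimal, self-contained check of this behaviour in one dimension (illustrative only, not code from the cited paper; it assumes SciPy is installed) is:

```python
# Illustrative sketch, not from the cited paper: SciPy's 1-D Wasserstein distance
# between two empirical samples responds smoothly to a small distribution shift.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
p = rng.normal(0.0, 1.0, size=5000)   # samples from N(0, 1)
q = rng.normal(0.5, 1.0, size=5000)   # samples from N(0.5, 1)

# For distributions differing only by a location shift, the 1-Wasserstein
# distance equals the shift, so this prints a value close to 0.5.
print(wasserstein_distance(p, q))
```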



[PDF] thecvf.com

Wasserstein Expansible Variational Autoencoder for Discriminative and Generative Continual Learning

F Ye, AG Bors - Proceedings of the IEEE/CVF International …, 2023 - openaccess.thecvf.com

… new approach called the Wasserstein Expansible Variational … , we evaluate the Wasserstein

distance for representing the … through the proposed Wasserstein expansion mechanism, …

Cited by 2 Related articles All 4 versions 


[HTML] mdpi.com

[HTML] An efficient rep-style gaussian–wasserstein network: Improved uav infrared small object detection for urban road surveillance and safety

T Aibibu, J Lan, Y Zeng, W Lu, N Gu - Remote Sensing, 2023 - mdpi.com

… reduces the detection accuracy, so we provide a novel Gaussian–Wasserstein Points (GWPIoU)

calculation method based on Wasserstein [35] and minimum points distances [36]. …

Cited by 1 Related articles All 5 versions 


2023



 arXiv:2309.09543
[pdf, other]  quant-ph cs.LG
Quantum Wasserstein GANs for State Preparation at Unseen Points of a Phase Diagram
Authors: Wiktor Jurasz, Christian B. Mendl
Abstract: Generative models and in particular Generative Adversarial Networks (GANs) have become a very popular and powerful data generation tool. In recent years, major progress has been made in extending this concept into the quantum realm. However, most of the current methods focus on generating classes of states that were supplied in the input set and seen at the training time. In this work, we propose a…  More
Submitted 18 September, 2023; originally announced September 2023.



2023 see 2022. [PDF] arxiv.org

Finite-sample guarantees for Wasserstein distributionally robust optimization: Breaking the curse of dimensionality

R Gao - Operations Research, 2023 - pubsonline.informs.org

… out-of-sample performance for Wasserstein robust learning and the … Wasserstein DRO

problems without suffering from the curse of dimensionality. Our results highlight that Wasserstein …

Cited by 72 Related articles All 8 versions


2023 see 2022. [PDF] arxiv.org

Viscosity solutions for obstacle problems on Wasserstein space

M Talbi, N Touzi, J Zhang - SIAM Journal on Control and Optimization, 2023 - SIAM

… on the Wasserstein space, which we call an obstacle equation on Wasserstein space by …

the unique solution of the obstacle equation on the Wasserstein space, provided it has \(C^{…

Cited by 15 Related articles All 9 versions

<–—2023———2023——2310— 



[HTML] sciencedirect.com

[HTML] A time-series Wasserstein GAN method for state-of-charge estimation of lithium-ion batteries

X Gu, KW See, Y Liu, B Arshad, L Zhao… - Journal of Power Sources, 2023 - Elsevier

… In this work, the proposed TS-WGAN model is trained utilizing the WGAN-GP loss function.

This function incorporates a Wasserstein-1 distance to evaluate the distribution between …

Cited by 5 Related articles All 3 versions
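Note: several of the entries above and below (TS-WGAN here, and the many WGAN-GP items later in this list) rely on the gradient-penalty form of the Wasserstein GAN loss. A generic PyTorch-style sketch of that penalty term (after Gulrajani et al., 2017) is given for reference; it is not the cited authors' code, and the `critic` module and flat (batch, dim) input shape are assumptions made for brevity.

```python
# Generic WGAN-GP gradient penalty sketch (not the TS-WGAN implementation).
# `critic` is assumed to be a torch.nn.Module mapping (batch, dim) -> (batch, 1).
import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    # Interpolate between real and generated samples with per-sample weights.
    eps = torch.rand(real.size(0), 1, device=real.device)
    x_hat = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(x_hat)
    # Gradient of the critic output with respect to the interpolated input.
    grads = torch.autograd.grad(
        outputs=scores, inputs=x_hat,
        grad_outputs=torch.ones_like(scores),
        create_graph=True, retain_graph=True,
    )[0]
    # Penalize deviations of the gradient norm from 1 (soft 1-Lipschitz constraint).
    grad_norm = grads.view(grads.size(0), -1).norm(2, dim=1)
    return lambda_gp * ((grad_norm - 1.0) ** 2).mean()
```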


[HTML] springer.com

Full View

[HTML] Identifying disease-related microbes based on multi-scale variational graph autoencoder embedding Wasserstein distance

H Zhu, H Hao, L Yu - BMC biology, 2023 - Springer

… Furthermore, the Wasserstein distance was employed to substitute KL divergence to maintain

… In addition, we utilized Wasserstein distance to precisely measure two distributions. The …

Cited by 9 Related articles All 9 versions



[PDF] mlr.press

A Gromov-Wasserstein geometric view of spectrum-preserving graph coarsening

Y Chen, R Yao, Y Yang, J Chen - … Conference on Machine …, 2023 - proceedings.mlr.press

Graph coarsening is a technique for solving large-scale graph problems by working on a

smaller version of the original graph, and possibly interpolating the results back to the original …

Cited by 3 Related articles All 9 versions 


Wasserstein barycenter matching for graph size generalization of message passing neural networks

X Chu, Y Jin, X Wang, S Zhang… - International …, 2023 - proceedings.mlr.press

… -generating space, we propose to use Wasserstein barycenters as graph-level consensus …

Wasserstein barycenter matching (WBM) layer that represents an input graph by Wasserstein …

Cited by 1 Related articles All 6 versions 
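Note: the barycenter-based entries in this list build on the standard notion of a 2-Wasserstein barycenter (Agueh and Carlier), recalled here for context rather than as a description of any particular cited layer:

```latex
% 2-Wasserstein barycenter of measures \mu_1,\dots,\mu_N with weights
% \lambda_i \ge 0, \sum_i \lambda_i = 1.
\bar{\mu} \;=\; \operatorname*{arg\,min}_{\mu \in \mathcal{P}_2(\mathbb{R}^d)}
\sum_{i=1}^{N} \lambda_i \, W_2^2(\mu, \mu_i).
```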



Diffusion-based Wasserstein generative adversarial network for blood cell image augmentation

EE Ngasa, MA Jang, SA Tarimo, J Woo… - … Applications of Artificial …, 2024 - Elsevier

… This study proposes incorporating a diffusion process from the DDPM into the WGAN-GP …

It is subsequently utilized as input for the WGAN-GP’s generator to generate photorealistic …

Related articles


  2023


2023 see 2022. [PDF] arxiv.org

Mean-field neural networks: learning mappings on Wasserstein space

H Pham, X Warin - Neural Networks, 2023 - Elsevier

We study the machine learning task for models with operators mapping between the Wasserstein

space of probability measures and a space of functions, like eg in mean-field games/…

Cited by 9 Related articles All 9 versions





Wasserstein distance‐based distributionally robust parallel‐machine scheduling

Y Yin, Z Luo, D Wang, TCE Cheng - Omega, 2023 - Elsevier

… Wasserstein distance-based DR parallel-machine scheduling, where the ambiguity set is

defined as a Wasserstein … the distributions arising from the Wasserstein ambiguity set, subject …

Cited by 3 Related articles All 4 versions



Hybrid machine condition monitoring based on interpretable dual tree methods using Wasserstein metrics

Y Liu, T Wang, F Chu - Expert Systems with Applications, 2024 - Elsevier

For condition monitoring and predictive maintenance of high-end manufacturing equipment,

surface roughness is a critical metric to evaluate machining quality. Designing a method that …

Cited by 2 Related articles All 2 versions


[HTML] mdpi.com

[HTML] Delamination detection framework for the imbalanced dataset in laminated composite using wasserstein generative adversarial network-based data …

S Kim, MM Azad, J Song, H Kim - Applied Sciences, 2023 - mdpi.com

… technique using the Wasserstein Generative Adversarial Network (WGAN) model. WGAN is

a … structures is less necessary to use the WGAN model to generate the synthetic data. As a …

Cited by 2 Related articles All 4 versions 



[PDF] arxiv.org

Wasserstein regression

Y Chen, Z Lin, HG Müller - Journal of the American Statistical …, 2023 - Taylor & Francis

… Adopting the Wasserstein metric, we develop a class of regression models for such data, …

of random measures endowed with the Wasserstein metric for mapping distributions to tangent …

Cite Cited by 73 Related articles All 10 versions

<–—2023———2023——2320— 



2023 see 2022

Unsupervised learning model of sparse filtering enhanced using wasserstein distance for intelligent fault diagnosis

G Vashishtha, R Kumar - Journal of Vibration Engineering & Technologies, 2023 - Springer

… features is done by Wasserstein distance with MMD … Wasserstein distance with MMD has

sed. The GNSF is obtained by normalizing the feature matrix whereas Wasserstein …

 Cited by 13 Related articles All 2 versions


 

[PDF] neurips.cc

Fused Gromov-Wasserstein Graph Mixup for Graph-level Classifications

X Ma, X Chu, Y Wang, Y Lin, J Zhao… - Advances in Neural …, 2024 - proceedings.neurips.cc

Graph data augmentation has shown superiority in enhancing generalizability and robustness

of GNNs in graph-level classifications. However, existing methods primarily focus on the …

Cited by 5 Related articles All 2 versions 



Crash injury severity prediction considering data imbalance: A Wasserstein generative adversarial network with gradient penalty approach

Y Li, Z Yang, L Xing, C Yuan, F Liu, D Wu… - Accident Analysis & …, 2023 - Elsevier

… learning method, the Wasserstein generative adversarial network with gradient penalty (WGAN-GP), …

To e the effectiveness of the WGAN-GP model, we systematically compare …

Cited by 2 Related articles All 7 versions


2023
MR4641859 Pending Chakraborty, Kuntal A note on relative Vaserstein symbol. J. Algebra Appl. 22 (2023), no. 10, Paper No. 2350210, 29 pp. 19B14 (13C10 13H05 19B99)

Review PDF Clipboard Journal Article



MR4649662 Reviewed Cheng, Kevin C.; Miller, Eric L.; Hughes, Michael C.; Aeron, Shuchin Nonparametric and regularized dynamical Wasserstein barycenters for sequential observations. IEEE Trans. Signal Process. 71 (2023), 3164–3178. 62G05 (62L10 62M10)

Review PDF Clipboard Journal Article


2023

MR4649509 Pending Duvenhage, Rocco; Mapaya, Mathumo Quantum Wasserstein distance of order 1 between channels. Infin. Dimens. Anal. Quantum Probab. Relat. Top. 26 (2023), no. 3, Paper No. 2350006, 36 pp. 81P47 (46L60 49Q22 81S22)

Review PDF Clipboard Journal Article 1 Citation

MR4648573 Pending Karakhanyan, Aram L. A nonlocal free boundary problem with Wasserstein distance. Calc. Var. Partial Differential Equations 62 (2023), no. 9, Paper No. 240, 22 pp. 49Q20 (35J60 35R35)

Review PDF Clipboard Journal Article


MR4647938 Pending Wang, Feng-Yu; Wu, Bingyao Wasserstein convergence for empirical measures of subordinated diffusions on Riemannian manifolds. Potential Anal. 59 (2023), no. 3, 933–954. 60D05 (58J65)

Review PDF Clipboard Journal Article


MR4646878 Pending Slepčev, Dejan; Warren, Andrew Nonlocal Wasserstein distance: metric and asymptotic properties. Calc. Var. Partial Differential Equations 62 (2023), no. 9, Paper No. 238, 66 pp. 60B10 (45G10 46E27 49Q22 60J76)

Review PDF Clipboard Journal Article

MR4645670 Pending Thanwerdas, Yann; Pennec, Xavier Bures-Wasserstein minimizing geodesics between covariance matrices of different ranks. SIAM J. Matrix Anal. Appl. 44 (2023), no. 3, 1447–1476. 53C22 (15A63 15B48 54E50 58D17)

Review PDF Clipboard Journal Article

<–—2023———2023——2330— 



  

 MR4722805 Thesis Reshetova, Daria; Entropic Regularization in Wasserstein Gans: Robustness, Generalization and Privacy. Thesis (Ph.D.)–Stanford University. 2023. 124 pp. ISBN: 979-8381-01960-5, ProQuest LLC

Review PDF Clipboard Series Thesis

 




MR4722609 Thesis Eikenberry, Keenan; Bayesian Inference for Markov Kernels Valued in Wasserstein Spaces. Thesis (Ph.D.)–Arizona 

Review PDF Clipboard Series Thesis


MR4705818 Prelim Gao, Rui; Finite-sample guarantees for Wasserstein distributionally robust optimization: breaking the curse of dimensionality. Oper. Res. 71 (2023), no. 6, 2291–2306. 90C15 (49Q22 90C47)

Review PDF Clipboard Journal Article 2 Citations


MR4702121 Pending Ning, ChaoMa, Xutao Data-driven Bayesian nonparametric Wasserstein distributionally robust optimization. IEEE Control Syst. Lett. 7 (2023), 3597–3602. 90C15 (62F15 62H30)

Review PDF Clipboard Journal Article


2023


MR4700944 Prelim Thach, Nguyen Ngoc; Trung, Nguyen Duc; Padilla, R. Noah; Why Wasserstein metric is useful in econometrics. Internat.  

Review PDF Clipboard Journal Article


MR4690283 Pending Eckstein, Stephan; Iske, Armin; Trabs, Mathias Dimensionality reduction and Wasserstein stability for kernel regression. J. Mach. Learn. Res. 24 (2023), Paper No. [334], 35 pp. 62G08 (62H25 68T09)

Review PDF Clipboard Journal Article


MR4678939 Pending Monmarché, Pierre Wasserstein contraction and Poincaré inequalities for elliptic diffusions with high diffusivity. Ann. H. Lebesgue 6 (2023), 941–973. 60J60

Review PDF Clipboard Journal Article

 

MR4675506 Thesis Pritchard, Neil; Injective and Coarse Embeddings of Persistence Diagrams and Wasserstein Space. Thesis (Ph.D.)–The University of North Carolina at Greensboro. 2023. 46 pp. ISBN: 979-8380-17160-1, ProQuest LLC

Review PDF Clipboard Series Thesis


MR4674289 Reviewed Kelbert, M. Y.; Suhov, Y. Wasserstein and weighted metrics for multidimensional Gaussian distributions. Izv. Sarat. Univ. (N.S.) Ser. Mat. Mekh. Inform. 23 (2023), no. 4, 422–434. 60E05

Review PDF Clipboard Journal Article

<–—2023———2023——2340—



MR4674055 Pending Delon, Julie; Gozlan, Nathael; Saint Dizier, Alexandre Generalized Wasserstein barycenters between probability measures living on different subspaces. Ann. Appl. Probab. 33 (2023), no. 6A, 4395–4423. 60A10 (49N15 49Q22)

Review PDF Clipboard Journal Article


MR4670363 Pending Mémoli, Facundo; Munk, Axel; Wan, Zhengchao; Weitkamp, Christoph The ultrametric Gromov-Wasserstein distance. Discrete Comput. Geom. 70 (2023), no. 4, 1378–1450. 51F30 (49Q22 53C23)

Review PDF Clipboard Journal Article


MR4670302 Reviewed Wu, Hao; Fan, Xiequan; Gao, Zhiqiang; Ye, Yinna Wasserstein-1 distance and nonuniform Berry-Esseen bound for a supercritical branching process in a random environment. J. Math. Res. Appl. 43 (2023), no. 6, 737–753. 60J80 (60F05 60K37)

Review PDF Clipboard Journal Article



MR4669265 Expansion Li, Mengyu; Yu, Jun; Xu, Hongteng; Meng, Cheng; Efficient approximation of Gromov-Wasserstein distance using importance sparsification. J. Comput. Graph. Statist. 32 (2023), no. 4, 1512–1523.

Review PDF Clipboard Journal Article


MR4669264 Expansion Li, Tao; Yu, Jun; Meng, Cheng; Scalable model-free feature screening via sliced-Wasserstein dependency. J. Comput. Graph. Statist. 32 (2023), no. 4, 1501–1511.

Review PDF Clipboard Journal Article


2023


MR4667978 Reviewed Barrera, Gerardo; Högele, Michael A. Ergodicity bounds for stable Ornstein-Uhlenbeck systems in Wasserstein distance with applications to cutoff stability. Chaos 33 (2023), no. 11, Paper No. 113124, 19 pp. 60H10

Review PDF Clipboard Journal Article


MR4667976 Pending Bensoussan, Alain; Huang, Ziyu; Yam, Sheung Chi Phillip Control theory on Wasserstein space: a new approach to optimality conditions. Ann. Math. Sci. Appl. 8 (2023), no. 3, 565–628. 49N80 (49K45 49L20 60H10 60H15 60H30 91A16 93E20)

Review PDF Clipboard Journal Article



MR4666691 Pending Yang, Xue Reflecting image-dependent SDEs in Wasserstein space and large deviation principle. Stochastics 95 (2023), no. 8, 1361–1394. 60H10 (60F10 60G46)

Review PDF Clipboard Journal Article



MR4664458 Reviewed Schär, Philip Wasserstein contraction and spectral gap of slice sampling revisited. Electron. J. Probab. 28 (2023), Paper No. 136, 28 pp. 65C05 (60J10 60J22)

Review PDF Clipboard Journal Article



MR4663523 Pending Jalowy, Jonas The Wasserstein distance to the circular law. Ann. Inst. Henri Poincaré Probab. Stat. 59 (2023), no. 4, 2285–2307. 60B20 (41A25 49Q22 60G55)

Review PDF Clipboard Journal Article 2 Citations

<–—2023———2023——2350—









MR4663515 Reviewed Borda, Bence Empirical measures and random walks on compact spaces in the quadratic Wasserstein metric. Ann. Inst. Henri Poincaré Probab. Stat. 59 (2023), no. 4, 2017–2035. 60B05 (49Q22 60B15 60G10)

Review PDF Clipboard Journal Article 1 Citation

 


MR4662767 Reviewed Gao, Yihang; Ng, Michael K.; Zhou, Mingjie Approximating probability distributions by using Wasserstein generative adversarial networks. SIAM J. Math. Data Sci. 5 (2023), no. 4, 949–976. 68T07

Review PDF Clipboard Journal Article


MR4662765 Pending Pesenti, Silvana M.; Jaimungal, Sebastian Portfolio optimization within a Wasserstein ball. SIAM J. Financial Math. 14 (2023), no. 4, 1175–1214. 91G10

Review PDF Clipboard Journal Article


MR4661822 Reviewed Chen, Dali; Wu, Yuwei; Li, Jingquan; Ding, Xiaohui; Chen, Caihua Distributionally robust mean-absolute deviation portfolio optimization using Wasserstein metric. J. Global Optim. 87 (2023), no. 2-4, 783–805. 90C90 (90C15)

Review PDF Clipboard Journal Article



MR4660917 Reviewed Battisti, Beatrice; Blickhan, Tobias; Enchery, Guillaume; Ehrlacher, Virginie; Lombardi, Damiano; Mula, Olga Wasserstein model reduction approach for parametrized flow problems in porous media. CEMRACS 2021—data assimilation and reduced modeling for high dimensional problems, 28–47, ESAIM Proc. Surveys, 73, EDP Sci., Les Ulis, 2023. 76S05 (65M70)

Review PDF Clipboard Series Chapter


2023


MR4659880 Pending De Palma, Giacomo; Trevisan, Dario The Wasserstein distance of order 1 for quantum spin systems on infinite lattices. Ann. Henri Poincaré 24 (2023), no. 12, 4237–4282. 81P45 (81P17 82B20)

Review PDF Clipboard Journal Article



MR4659835 Pending Liu, Tianle; Austern, Morgane Wasserstein-p bounds in the central limit theorem under local dependence. Electron. J. Probab. 28 (2023), Paper No. 117, 47 pp. 60F05

Review PDF Clipboard Journal Article


MR4659330 Indexed Friesecke, Gero; Penka, Maximilian The GenCol algorithm for high-dimensional optimal transport: general formulation and application to barycenters and Wasserstein splines. SIAM J. Math. Data Sci. 5 (2023), no. 4, 899–919. 65J10 (49Q22 68W50)

Review PDF Clipboard Journal Article



MR4658258 Reviewed Beier, Florian; Beinert, Robert; Steidl, Gabriele Multi-marginal Gromov-Wasserstein transport and barycentres. Inf. Inference 12 (2023), no. 4, 2720–2752. 49Q22 (28A33 28A35 65J15)

Review PDF Clipboard Journal Article 1 Citation


MR4656008 Reviewed Gehér, György Pál; Titkos, Tamás; Virosztek, Dániel On isometries of Wasserstein spaces. Research on preserver problems on Banach algebras and related topics, 239–250, RIMS Kôkyûroku Bessatsu, B93, Res. Inst. Math. Sci. (RIMS), Kyoto, 2023. 54E40 (46E27 60B05)

Review PDF Clipboard Series Chapter

<–—2023———2023——2360— 



MR4655923 Reviewed Figalli, Alessio; Glaudo, Federico An invitation to optimal transport, Wasserstein distances, and gradient flows. Second edition [of MR4331435]. EMS Textbooks in Mathematics. EMS Press, Berlin, [2023], ©2023. vi+146 pp. ISBN: 978-3-98547-050-1; 978-3-98547-550-6 (Reviewer: Luca Granieri) 49-01 (28A33 35A15 49N15 49Q22 60B05)

Review PDF Clipboard Series Book


MR4651466 Reviewed Kalmutskiy, Kirill; Cherikbayeva, Lyailya; Litvinenko, Alexander; Berikov, Vladimir Multi-target weakly supervised regression using manifold regularization and Wasserstein metric. Mathematical optimization theory and operations research—recent trends, 364–375, Commun. Comput. Inf. Sci., 1881, Springer, Cham, [2023], ©2023. 62G08 (49Q22 68T05)

Review PDF Clipboard Series Chapter


MR4651068 Pending Fabiani, Filippo; Franci, Barbara On distributionally robust generalized Nash games defined over the Wasserstein ball. J. Optim. Theory Appl. 199 (2023), no. 1, 298–309. 91A15 (90C11 90C15)

Review PDF Clipboard Journal Article


MR4650916 Pending Jimenez, Chloé; Marigonda, Antonio; Quincampoix, Marc Dynamical systems and Hamilton-Jacobi-Bellman equations on the Wasserstein space and their representations. SIAM J. Math. Anal. 55 (2023), no. 5, 5919–5966. 49J15 (34A60 49J52 49L25 49Q22 93C15)

Review PDF Clipboard Journal Article


2023

MR4650053 Reviewed Li, Wuchen; Liu, Siting; Osher, Stanley A kernel formula for regularized Wasserstein proximal operators. Res. Math. Sci. 10 (2023), no. 4, Paper No. 43, 16 pp. 65M06 (35L05)

Review PDF Clipboard Journal Article



MR4641607 Pending Fornasier, Massimo; Savaré, Giuseppe; Sodini, Giacomo Enrico Density of subalgebras of Lipschitz functions in metric Sobolev spaces and applications to Wasserstein Sobolev spaces. J. Funct. Anal. 285 (2023), no. 11, Paper No. 110153, 76 pp. 46E36 (28A33 31C25 49Q20)

Review PDF Clipboard Journal Article 1 Citation



MR4638301 Pending Chambolle, Antonin; Duval, Vincent; Machado, João Miguel The total variation-Wasserstein problem: a new derivation of the Euler-Lagrange equations. Geometric science of information. Part I, 610–619, Lecture Notes in Comput. Sci., 14071, Springer, Cham, [2023], ©2023. 49Q22 (35A15)

Review PDF Clipboard Series Chapter



MR4638280 Reviewed Han, Andi; Mishra, Bamdev; Jawanpuria, Pratik; Gao, Junbin Learning with symmetric positive definite matrices via generalized Bures-Wasserstein geometry. Geometric science of information. Part I, 405–415, Lecture Notes in Comput. Sci., 14071, Springer, Cham, [2023], ©2023. 53B99

Review PDF Clipboard Series Chapter



MR4637091 Reviewed Piccoli, Benedetto; Rossi, Francesco; Tournus, Magali A Wasserstein norm for signed measures, with application to non-local transport equation with source term. Commun. Math. Sci. 21 (2023), no. 5, 1279–1301. 35Q49 (28A33)

Review PDF Clipboard Journal Article 3 Citations

<–—2023———2023——2370— 



MR4635230 Indexed Zhu, Tingyu; Liu, Haoyu; Zheng, Zeyu Learning to simulate sequentially generated data via neural networks and Wasserstein training. ACM Trans. Model. Comput. Simul. 33 (2023), no. 3, Art. 9, 34 pp. 62L10 (65C20 68T07)

Review PDF Clipboard Journal Article


MR4634681 Pending Wang, Feng-Yu Convergence in Wasserstein distance for empirical measures of Dirichlet diffusion processes on manifolds. J. Eur. Math. Soc. (JEMS) 25 (2023), no. 9, 3695–3725. 60D05 (58J65)

Review PDF Clipboard Journal Article 1 Citation


MR4632933 Pending Wang, Yu-Zhao; Li, Sheng-Jie; Zhang, Xinxin Generalized displacement convexity for nonlinear mobility continuity equation and entropy power concavity on Wasserstein space over Riemannian manifolds. Manuscripta Math. 172 (2023), no. 1-2, 405–426. 58J05 (58J35)

Review PDF Clipboard Journal Article


MR4631994 Pending Baudier, F.; Gartland, C.; Schlumprecht, Th. L1-distortion of Wasserstein metrics: a tale of two dimensions. Trans. Amer. Math. Soc. Ser. B 10 (2023), 1077–1118. 46B85 (05C63 46B20 51F30 68R12)

Review PDF Clipboard Journal Article



MR4629039 Reviewed Ding, Hu; Liu, Wenjie; Ye, Mingquan A data-dependent approach for high-dimensional (robust) Wasserstein alignment. ACM J. Exp. Algorithmics 28 (2023), Art. 1.8, 32 pp. 68Q87

Review PDF Clipboard Journal Article


2023

MR4627412 Pending Cosso, Andrea; Martini, Mattia On smooth approximations in the Wasserstein space. Electron. Commun. Probab. 28 (2023), Paper No. 30, 11 pp. 28A33 (28A15 46E27 49N80)

 

Review PDF Clipboard Journal Article



MR4627299 Pending Sodini, Giacomo Enrico The general class of Wasserstein Sobolev spaces: density of cylinder functions, reflexivity, uniform convexity and Clarkson's inequalities. Calc. Var. Partial Differential Equations 62 (2023), no. 7, Paper No. 212, 41 pp. 46E36 (46B10 46B20 49Q22)

Review PDF Clipboard Journal Article 1 Citation



MR4627136 Reviewed Ballesio, Marco; Jasra, Ajay; von Schwerin, Erik; Tempone, Raúl A Wasserstein coupled particle filter for multilevel estimation. Stoch. Anal. Appl. 41 (2023), no. 5, 820–859. 62M20 (60G35 65C05)

Review PDF Clipboard Journal Article


MR4626657 Reviewed Simon, Richárd; Virosztek, Dániel Preservers of the p-power and the Wasserstein means on 2×2 matrices. Electron. J. Linear Algebra 39 (2023), 395–408. (Reviewer: Mohamed Bendaoud) 47B49 (15A24 47A63 47A64)

Review PDF Clipboard Journal Article



MR4626409 Reviewed Fu, Guosheng; Osher, Stanley; Li, Wuchen High order spatial discretization for variational time implicit schemes: Wasserstein gradient flows and reaction-diffusion systems. J. Comput. Phys. 491 (2023), Paper No. 112375, 30 pp. 65M60

Review PDF Clipboard Journal Article 2 Citations

<–—2023———2023——2380— 



MR4624322 Pending Fournier, Nicolas Convergence of the empirical measure in expected Wasserstein distance: non-asymptotic explicit bounds in ℝ^d. ESAIM Probab. Stat. 27 (2023), 749–775. 60F25 (65C05)

Review PDF Clipboard Journal Article

 


MR4616173 Pending Cisneros-Velarde, Pedro; Bullo, Francesco Distributed Wasserstein barycenters via displacement interpolation. IEEE Trans. Control Netw. Syst. 10 (2023), no. 2, 785–795. 49Q22 (60B10 91D30)

Review PDF Clipboard Journal Article




Peer-reviewed

WGAN-CL: A Wasserstein GAN with confidence loss for small-sample augmentation

Authors: Jiaqi Mi, Congcong Ma, Lihua Zheng, Man Zhang, Minzan Li, Minjuan Wang

Summary:• We propose a GAN-based method for image small-sample augmentation named WGAN-CL. • WGAN-CL designs shortcut-stream connections to broaden the model’s solution space. • We design a confidence loss to improve the model's learning capability. • Experiments achieve state-of-the-art performance in image quality and diversity.

The small-sample task is a current challenge in the field of deep learning, due to the huge annotation cost and the inherent limitations of targets, such as the acquisition of rare animal and plant images. Data augmentation is an effective method to address the semantic sparseness and overfitting of deep convolutional neural networks in small-sample classification, but its effectiveness remains to be improved. We propose a Wasserstein GAN with confidence loss (WGAN-CL) to implement the expansion of a small-sample plant dataset. Firstly, a shallower GAN structure is designed to adapt to the limited plant data. Meanwhile, shortcut-stream connections are brought into the basic network to enlarge the solution space of the model without producing additional training parameters. Secondly, the Wasserstein distance combined with confidence loss is used for optimizing the model. Experiments demonstrate that the Wasserstein distance with gradient penalty guarantees the stability of model training and the diversity of outputs, and the sample screening strategy based on confidence loss ensures that the generated image is close to the real image in semantic features, which is critical for subsequent image classification. To verify the effectiveness of WGAN-CL in plant small-sample augmentation, 2000 flower images of 5 categories in the “Flowers” dataset are utilized as training samples, while 2000 augmented images are also employed for model training to improve the performance of a classical classifier. WGAN-CL yields a significant performance improvement over state-of-the-art technologies, i.e., a 2.2% improvement in recall and a 2% improvement in F1-score. Experiments on the “Plant Leaves” dataset also achieved excellent results, demonstrating that WGAN-CL can be migrated to other tasks. WGAN-CL uses fewer computational resources while considering both effectiveness and robustness, proving the practicality of the model

Article, 2023

Publication:Expert Systems With Applications, 233, 20231215

Publisher: 2023



Peer-reviewed
Alleviating sample imbalance in water quality assessment using the VAE-WGAN-GP model
Authors:Jingbin Xu, Degang Xu, Kun Wan, Ying Zhang
Summary:Water resources are essential for sustaining human life and promoting sustainable development. However, rapid urbanization and industrialization have resulted in a decline in freshwater availability. Effective prevention and control of water pollution are essential for ecological balance and human well-being. Water quality assessment is crucial for monitoring and managing water resources. Existing machine learning-based assessment methods tend to classify the results into the majority class, leading to inaccuracies in the outcomes due to the prevalent issue of imbalanced class sample distribution in practical scenarios. To tackle the issue, we propose a novel approach that utilizes the VAE-WGAN-GP model. The VAE-WGAN-GP model combines the encoding and decoding mechanisms of VAE with the adversarial learning of GAN. It generates synthetic samples that closely resemble real samples, effectively compensating data of the scarcity category in water quality evaluation. Our contributions include (1) introducing a deep generative model to alleviate the issue of imbalanced category samples in water quality assessment, (2) demonstrating the faster convergence speed and improved potential distribution learning ability of the proposed VAE-WGAN-GP model, (3) introducing the compensation degree concept and conducting comprehensive compensation experiments, resulting in a 9.7% increase in the accuracy of water quality assessment for multi-classification imbalance samples.HIGHLIGHTSNovel method: the VAE-WGAN-GP model is introduced to alleviate the problem of imbalanced category distribution in water quality evaluation and improve the accuracy of assessment.;Water resource management: our research bridges the gap in the distribution of categories in water management by providing deep generative models to compensate for data scarcity in water quality assessments.
Downloadable Article, 2023
Publication:Water Science and Technology, 88, 20231201, 2762
Publisher: 2023
Access Free



Peer-reviewed
Mdwgan-gp: data augmentation for gene expression data based on multiple discriminator WGAN-GP

Authors: Rongyuan Li, Jingli Wu, Gaoshi Li, Jiafei Liu, Junbo Xuan, Qi Zhu
Summary:Background: Although gene expression data play significant roles in biological and medical studies, their applications are hampered due to the difficulty and high expenses of gathering them through biological experiments. It is an urgent problem to generate high quality gene expression data with computational methods. WGAN-GP, a generative adversarial network-based method, has been successfully applied in augmenting gene expression data. However, mode collapse or over-fitting may take place for small training samples because only one discriminator is adopted in the method. Results: In this study, an improved data augmentation approach MDWGAN-GP, a generative adversarial network model with multiple discriminators, is proposed. In addition, a novel method is devised for enriching training samples based on a linear graph convolutional network. Extensive experiments were implemented on real biological data. Conclusions: The experimental results have demonstrated that compared with other state-of-the-art methods, the MDWGAN-GP method can produce higher quality generated gene expression data in most cases
Downloadable Article, 2023
Publication:BMC Bioinformatics, 24, 20231113
Publisher: 2023
Access Free


2023



A WGAN-Based Dialogue System for Embedding Humor, Empathy, and Cultural Aspects
Authors:Chunpeng Zhai, Santoso Wibowo

Summary:Artificial intelligence (AI) technologies have been utilized in the education industry for enhancing student’s performance by generating spontaneous, timely, and personalized query response. One such technology is a dialogue system which is capable of generating humorous and empathetic responses for enhancing students’ learning outcomes. There is, however, limited research on the combination of humor, empathy, and culture in education. Thus, this paper proposes a dialogue system that is based on Wasserstein’s Generative Adversarial Network (WGAN) for generating responses with humor, empathy, and cultural sensitivity. The dialogue system has the ability to generate responses that take into account both coarse-grained emotions at the conversation level and fine-grained emotions at the token level, allowing for a nuanced understanding of a student’s emotional state. It can utilize external knowledge and prior context to enhance the ability of AI dialogue systems to comprehend emotions in a multimodal context. It can also analyze large corpora of text and other data, providing valuable insights into cultural context, semantic properties, and language variations. The dialogue system is a promising AI technology that can improve learning outcomes in various academic fields by generating responses with humor, empathy, and cultural sensitivity. In our study, the dialogue system achieved an accuracy rate of 94.12%, 93.83% and 92.60% in humor, empathy and culture models, respectively
Downloadable Article, 2023
Publication:IEEE Access, 11, 20230101, 71940
Publisher: 2023
Access Free

A Novel approach using WGAN-GP and Conditional WGAN-GP for Generating Artificial
Authors: Shahd Hejazi, Michael Packianather, Ying Liu
Summary:This paper proposes a novel approach for generating artificial thermal images for induction motor faults using Wasserstein Generative Adversarial Network with Gradient Penalty (WGAN-GP) and Conditional Wasserstein Generative Adversarial Network with Gradient Penalty (cWGAN-GP) frameworks. Traditional fault classification methods based on vibration signals often require extensive preprocessing and are more susceptible to noise. In contrast, thermal images offer easier classification and require less preprocessing. However, challenges arise due to the limited availability of thermal images representing different fault conditions and data confidentiality. To overcome these challenges, this paper introduces the utilisation of WGAN-GP and cWGAN-GP with health condition labels to create high-quality thermal images artificially. The results demonstrate that the cWGAN-GP approach is superior in generating thermal images that closely resemble real images of induction motors under various health conditions with a Maximum Mean Discrepancy (MMD) score of 1.023 compared to 1.078 using WGAN-GP. Furthermore, cWGAN-GP requires less training time (7.25 hours to train all health conditions classes) compared to WGAN-GP (12 hours to train the Inner fault class only) using NVIDIA V100. In addition to using EMD and MMD metrics for quantitative analysis of the GAN model, the evaluation process incorporated the expertise of a pre-trained CNN model, namely AlexNet, to assess cWGAN-GP's discriminative capabilities of the generated samples and their alignment with the real thermal images, which resulted in an overall accuracy of 98.41%. Therefore, these proposed approaches offer a promising solution to address the lack of public datasets containing induction motor thermal images representing different health states. By leveraging these models, it will be feasible to enhance induction motor condition monitoring systems and improve the process of fault diagnosis
Article, 2023
Publication:Procedia Computer Science, 225, 2023, 3681
Publisher: 2023



Stochastic Optimal Scheduling of Photovoltaic-Energy Storage Charging Station Based on WGAN-GP Scenario Generation
Authors:Xiang Bao, Yingchen Chi, Hua Zhou, Yan Huang, Xiu Wan, Fan Chen, 2023 8th International Conference on Power and Renewable Energy (ICPRE)
Summary:To address the optimal scheduling of photovoltaic-energy storage charging station (PV-ES CS) under the background of the new power system construction, a stochastic optimization strategy based on the Wasserstein Generative Adversarial Network with Gradient Penalty (WGAN-GP) scenario generation is proposed. This work comprehensively considers the randomness of photovoltaic (PV) power and electric vehicles (EV) charging load, as well as the influence of carbon trading and demand response mechanisms on carbon emissions and operating costs. First, the WGAN-GP network is utilized to generate scenarios of PV output and charging load, followed by scenario reduction using the Density Peak Clustering (DPC) algorithm. Second, the ladder-type carbon trading and demand response mechanisms are introduced to reduce carbon emissions and promote PV consumption. Finally, a stochastic optimal dispatching model with the objective of maximizing the revenue of the PV-ES CS is constructed, considering carbon emission cost, grid electricity purchase cost, PV curtailment cost, and demand response cost. A case study using the actual parameters of a domestic PV-ES CS is conducted to validate the effectiveness of the scenario generation algorithm and the proposed optimal scheduling strategy. Results of case study demonstrate that the ladder-type carbon trading and demand response mechanisms can further limit carbon emissions and promote PV consumption
Chapter, 2023
Publication:2023 8th International Conference on Power and Renewable Energy (ICPRE), 20230922, 1157
Publisher: 2023



Dual-WGAN Ensemble Model for Alzheimer’s Dataset Augmentation with Minority Class Boosting

Authors: Mohammad Samar Ansari, Kulsum Ilyas, Asra Aslam, 2023 International Conference on Innovation and Intelligence for Informatics, Computing, and Technologies (3ICT)

Summary:Deep learning models have become very efficient and robust for several computer vision applications. However, to harness the benefits of state-of-art deep networks in the realm of disease detection and prediction, it is crucial that high-quality datasets be made available for the models to train on. This work recognizes the lack of training data (both in terms of quality and quantity of images) for using such networks for the detection of Alzheimer’s Disease. To address this issue, a Wasserstein Generative Adversarial Network (WGAN) is proposed to generate synthetic images for augmentation of an existing Alzheimer brain image dataset. The proposed approach is successful in generating high-quality images for inclusion in the Alzheimer image dataset, potentially making the dataset more suitable for training high-end models. This paper presents a two-fold contribution: (i) a WGAN is first developed for augmenting the non-dominant class (i.e. Moderate Demented) of the Alzheimer image dataset to bring the sample count (for that class) at par with the other classes, and (ii) another lightweight WGAN is used to augment the entire dataset for increasing the sample counts for all classes
Chapter, 2023
Publication:2023 International Conference on Innovation and Intelligence for Informatics, Computing, and Technologies (3ICT), 20231120, 333
Publisher: 2023


Power Load Data Cleaning Method Based on DBSCAN Clustering and WGAN Algorithm

Authors: Liyong Wei, Yi Ding, En Wang, Lixin Liu, 2023 IEEE 5th International Conference on Power, Intelligent Computing and Systems (ICPICS)
Summary:The various processes of acquisition and transmission of measurement data are subject to malfunction or interference, resulting in missing data. Traditional data restoration methods ignore the historical load change pattern and have low reconstruction accuracy. In this paper, we first use the DBSCAN clustering algorithm to detect outliers in the original load dataset and remove the outliers with large deviations to form a dataset with vacant values. Then, the Wasserstein distance is used to improve on the original GAN network. Through non-supervised training of WGAN, the neural network will automatically learn complex spatio-temporal relationships that are difficult to model explicitly, such as correlations between measurements and load fluctuation patterns. Finally, the authenticity constraint and contextual similarity constraint are used to optimize the hidden variables, so that the trained generator will be able to generate highly accurate reconstructed data. The algorithm analysis proves the performance stability of the proposed method, and the reconstructed data can reflect the real time characteristics of the measured data
Chapter, 2023
Publication:2023 IEEE 5th International Conference on Power, Intelligent Computing and Systems (ICPICS), 20230714, 652
Publisher: 2023

<–—2023———2023——2390—



Enhancing Subject-Independent EEG-Based Auditory Attention Decoding with WGAN and Pearson Correlation Coefficient

Authors: Saurav Pahuja, Gabriel Ivucic, Felix Putze, Siqi Cai, Haizhou Li, Tanja Schultz, 2023 IEEE International Conference on Systems, Man, and Cybernetics (SMC)
Summary:Electroencephalography (EEG) related research faces a significant challenge of subject independence due to the variation in brain signals and responses among individuals. While deep learning models hold promise in addressing this challenge, their effectiveness depends on large datasets for training and generalization across participants. To overcome this limitation, we propose a solution to the above limitation by increasing the size and quality of training data for subject-independent auditory attention decoding (AAD) using EEG with deep learning. Specifically, our method employs a Wasserstein Generative Adversarial Network (WGAN) to generate synthetic data, with Pearson correlation filtering the most realistic samples. We evaluated this method on a publicly available dataset of selective auditory attention experiments and showed superior performance in subject-independent AAD performance. The mixed training set, consisting of both real and artificial data generated by the WGAN+Pearson Correlation Coefficient, demonstrated approximately 4% improvement in AAD accuracy for a 1-second window. These results demonstrate that deep learning remains a viable approach to overcoming data scarcity in subject-independent AAD tasks based on EEG. Moreover, the proposed method has the potential to improve the generalization and reliability of EEG classification tasks
Chapter, 2023
Publication:2023 IEEE International Conference on Systems, Man, and Cybernetics (SMC), 20231001, 3715
Publisher: 2023


DD-WGAN: Generative Adversarial Networks with Wasserstein Distance and Dual-Domain Discriminators for Low-Dose CT
Authors: Xiao Bai, Huamin Wang, Shuo Yang, Zhe Wang, Guohua Cao, 2023 IEEE 20th International Symposium on Biomedical Imaging (ISBI)

Summary:X-ray computed tomography (CT) is a mainstream medical imaging modality. The widespread use of CT has made image denoising of low-dose CT (LDCT) images a key issue in medical imaging. Deep learning (DL) methods have been successful in this area over the past few years, but most DL-based dual-domain methods directly filter the sinogram domain data, which is prone to induce new artifacts in the reconstructed image. This paper proposes a new method called DD-WGAN, which has an image domain generator network (IDG-Net) and two discriminator networks, namely the image domain discriminator network (ID-Net) and the sinogram domain discriminator network (SD-Net). We use dual-domain discriminators to balance the data weights of sinogram and image. Experimental results show that the proposed method achieves significantly improved LDCT denoising performance
Chapter, 2023
Publication:2023 IEEE 20th International Symposium on Biomedical Imaging (ISBI), 20230418, 1
Publisher: 2023

Anti-Jamming Method of Near-Field Underwater Acoustic Detection Based on WGAN

Authors: Zhang Ji, 2023 IEEE International Conference on Signal Processing, Communications and Computing (ICSPCC)
Summary:This paper analyzes the issue of artificial interference encountered in underwater near-field detection. We briefly examine three common types of artificial jamming signals and their mechanisms. Taking inspiration from the application of Generative Adversarial Networks in speech signal enhancement, this study employs Wasserstein GAN and integrates the characteristics of detection signals. L2 loss is added to the generator's loss function in WGAN to enhance training stability. Simulation analysis demonstrates that the trained WGAN generator effectively combats the three types of artificial jamming
Chapter, 2023
Publication:2023 IEEE International Conference on Signal Processing, Communications and Computing (ICSPCC), 20231114, 1
Publisher: 2023

RUL Prediction of Turbofan Engine Based on WGAN-Trans Under Small Samples
Authors: Chenman Qi, Zehui Mao, Wenjing Liu, 2023 China Automation Congress (CAC)
Summary:The prediction of the remaining useful life (RUL) of an aircraft's turbofan engine is a very important part of its PHM. With the arrival of the era of big data, data-driven RUL prediction methods are gradually emerging; however, in actual industry it is often difficult to collect sufficient data to predict the RUL with deep learning, which means there will be a problem of small samples. Moreover, in the early stage of degradation, it is difficult to accurately classify the health status of equipment, resulting in the inability to provide early warning before the rapid degradation of equipment performance, thus losing the significance of developing maintenance strategies in advance. To solve this problem, this paper proposes a WGAN-Trans model, which expands the data set by improving the Generative adversarial network, uses a one-dimensional Convolutional Neural Network to divide the health state, and finally uses the Transformer model to predict the RUL, which is verified on the commercial aircraft turbofan engine data set provided by NASA. The results show that the proposed model has good performance
Chapter, 2023
Publication:2023 China Automation Congress (CAC), 20231117, 6506
Publisher: 2023


Multi-Track Music Generation with WGAN-GP and Attention Mechanisms

Authors: Luyu Chen, Lin Shen, Dan Yu, Zhihua Wang, Kun Qian, Bin Hu, Bjorn W. Schuller, Yoshiharu Yamamoto, 2023 IEEE 12th Global Conference on Consumer Electronics (GCCE)

Summary:Music generation with artificial intelligence is a complex and captivating task. The utilisation of generative adversarial networks (GANs) has exhibited promising outcomes in producing realistic and diverse music compositions. In this paper, we propose a model based on Wasserstein GAN with gradient penalty (WGAN-GP) for multi-track music generation. This model incorporates self-attention and introduces a novel cross-attention mechanism in the generator to enhance its expressive capability. Additionally, we transpose all music to C major in training to ensure data consistency and quality. Experimental results demonstrate that our model can produce multi-track music with enhanced rhythm and sound characteristics, accelerate convergence, and improve generation quality
Chapter, 2023
Publication:2023 IEEE 12th Global Conference on Consumer Electronics (GCCE), 20231010, 606
Publisher: 2023


2023



Peer-reviewed
Enhancer Recognition: A Transformer Encoder-Based Method with WGAN-GP for Data Augmentation
Authors:Tianyu Feng, Tao Hu, Wenyu Liu, Yang Zhang

Summary:Enhancers are located upstream or downstream of key deoxyribonucleic acid (DNA) sequences in genes and can adjust the transcription activity of neighboring genes. Identifying enhancers and determining their functions are important for understanding gene regulatory networks and expression regulatory mechanisms. However, traditional enhancer recognition relies on manual feature engineering, which is time-consuming and labor-intensive, making it difficult to perform large-scale recognition analysis. In addition, if the original dataset is too small, there is a risk of overfitting. In recent years, emerging methods, such as deep learning, have provided new insights for enhancing identification. However, these methods also present certain challenges. Deep learning models typically require a large amount of high-quality data, and data acquisition demands considerable time and resources. To address these challenges, in this paper, we propose a data-augmentation method based on generative adversarial networks to solve the problem of small datasets. Moreover, we used regularization methods such as weight decay to improve the generalizability of the model and alleviate overfitting. The Transformer encoder was used as the main component to capture the complex relationships and dependencies in enhancer sequences. The encoding layer was designed based on the principle of k-mers to preserve more information from the original DNA sequence. Compared with existing methods, the proposed approach made significant progress in enhancing the accuracy and strength of enhancer identification and prediction, demonstrating the effectiveness of the proposed method. This paper provides valuable insights for enhancer analysis and is of great significance for understanding gene regulatory mechanisms and studying disease correlations


 Article, 2023
Publication:Applied Intelligence, 53, 202306, 13924
Publisher: 2023


ResNet-WGAN-Based End-to-End Learning for IoV Communication With Unknown Channels

Authors: Junhui Zhao, Huiqin Mu, Qingmiao Zhang, Huan Zhang
Article, 2023
Publication:IEEE Internet of things journal, 10, 2023, 17184
Publisher: 2023


A novel prediction approach of polymer gear contact fatigue based on a WGAN‐XGBoost model
Authors: Chenfan Jia, Peitang Wei, Zehua Lu, Mao Ye, Rui Zhu, Huaiju Liu
Article, 2023
Publication:Fatigue & fracture of engineering materials & structures, 46, 2023, 2272
Publisher: 2023

A WGAN-Based Dialogue System for Embedding Humor, Empathy, and Cultural Aspects in Education
Authors: Chunpeng Zhai, Santoso Wibowo
Article, 2023
Publication:IEEE access, 11, 2023, 71940
Publisher: 2023

<–—2023———2023——2400— 



 Dynamic Residual Attention UNet for Precipitation Nowcasting Based on WGAN

Authors: Ce Li, Fan Huang, Jianwei Zhang, Lin Ma, Huizhong Chen, Chaoyue Li, 2023 China Automation Congress (CAC)
Summary:In recent years, using radar echo maps for precipitation nowcasting has been a research hotspot. How to use deep learning methods to forecast precipitation is a challenge. Radar echo map contains rich temporal and spatial information, capturing the location distribution and intensity characteristics of radar echo is a key problem in precipitation prediction. To tackle these challenges, the paper presents a novel approach called the Dynamic Residual Attention UNet model(DRA-UNet). This model incorporates Decoupled Dynamic Filter(DDF) and Dynamic Residual Attention Modules(DRAM) while leveraging the Wasserstein GAN training strategy to perform generative adversarial training. A decoupled Dynamic Filter can adaptively adjust the convolution kernel in the feature extraction stage, effectively reducing blank areas in the feature maps. By exploring the correlation between residual paths and input image statistics, and appropriately weighting each residual path, the model's focus on precipitation positions is enhanced. Moreover, the utilization of the Wasserstein GAN(WGAN) strategy during model training enhances the image generation quality when facing the discriminator in adversarial training. This advancement ensures that the model's outputs closely approximate real results, leading to further improvements in overall model performance. We comprehensively evaluate the performance of our model on the KNMI dataset, and a large number of experimental results show that our method achieves remarkable results on the precipitation prediction task
Chapter, 2023
Publication:2023 China Automation Congress (CAC), 20231117, 6265
Publisher: 2023


Short-term wind power prediction based on SAM-WGAN-GP

Authors: Huang Ling, Li Linxia, Cheng Yu, Xu Zijian
Article, 1980-
Publication:Tai yang neng xue bao = Acta energiae solaris sinica /, 44, 2023, 188
Publisher: Zhongguo tai yang neng xue hui, Beijing, 1980-


結合 Metropolis-Hastings 演算法和 WGAN 模型進行股票價格的時間序列預測 = Integration of Metropolis-Hastings Algorithm and WGAN Model for Time Series Prediction of Stock Price
Authors: JenHung Hsiao (蕭仁鴻)
Thesis, Dissertation, English, 2023
Publisher: 2023



DMM-WGAN: An Industrial Process Data Augmentation Approach
Authors:Wenfeng Zhao, Xiaohui Dong, Xiaochao Dang, Zhiwei Chen, Shiwei Gao, 2023 International Conference on Algorithms, Computing and Data Processing (ACDP)

Summary:How to apply effective data augmentation methods to supplement datasets in harsh industrial environments is an important problem in complex industrial process modeling. In response to this problem, this paper proposes a new industrial process data augmentation method, DMM-WGAN, based on WGAN. Firstly, a Deep Threshold Mixing Feature Extraction Module (DMM) is proposed in the generator, which adopts a dual-channel fusion strategy. one channel extracts deep features of industrial data, while the other channel extracts global features of industrial data to enhance the feature extraction ability of the generator. Then, the DMM module and the Wasserstein Generative Adversarial Network are combined to establish the DMM-WGAN generation model. Finally, the proposed model is optimized and extensively experimented on a thermal power plant dataset, and the results are evaluated based on MSE, RMSE, MAE, and R2. The results show that the proposed DMM-WGAN generation model is superior to traditional VAE, GAN and WGAN generation models
Chapter, 2023
Publication:2023 International Conference on Algorithms, Computing and Data Processing (ACDP), 202306, 80
Publisher: 2023

2023



WGAN for Data Augmentation
Authors: Mallanagouda Patil, Malini M. Patil, Surbhi Agrawal
Summary:Large annotated data sets play an important role in deep learning models as they need a lot of data to be trained to resemble the real data distribution. However, it is sometimes difficult and expensive to generate such realistic synthetic data that mimic the original distribution of the data set. Therefore, augmentation of data is essential to expand the size of the data set used for learning purposes while introducing more variety in what the model looks at and learns from. Data augmentation is the addition of new data artificially derived from existing data by including slightly modified variants of already available data. This additional data can be anything ranging from text to video, and its use in machine learning algorithms helps improve the model performance. Data augmentation provides enough data for the model to understand and train all the available parameters. Generative adversarial networks (GANs) have been employed for data augmentation for refining deep learning models by generating additional information, with no pre-planned process, to produce realistic samples from the existing data and improve the model performance. The Wasserstein Generative Adversarial Network (WGAN) is an improved version of GANs which enhances model stability and overcomes issues such as mode collapse and convergence. WGANs also deliver understandable training curves for testing and hyperparameter findings while building the model. WGANs provide a loss value that reflects the quality of the produced image data. WGANs work on the distance between the expected probability and the parameterized probability distributions to improve the produced image quality. They learn distributions in high-dimensional feature spaces. This chapter focuses on the Wasserstein distance in deep learning and data augmentation using WGANs, along with their detailed design, advantages, and limitations. The chapter also discusses a case study and concludes with future scope and research issues
Chapter, 2023
Publication:GANs for Data Augmentation in Healthcare, 20231114, 223
Publisher: 2023
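Note: the chapter summary above describes WGANs as working with the distance between the data distribution and the parameterized model distribution. The form of the Wasserstein-1 distance that WGAN critics approximate is the Kantorovich-Rubinstein dual, recalled here as standard background rather than quoted from the chapter:

```latex
% Kantorovich--Rubinstein duality; the WGAN critic approximates the
% 1-Lipschitz test function f.
W_1(P_r, P_\theta) \;=\; \sup_{\|f\|_{\mathrm{Lip}} \le 1}
\mathbb{E}_{x \sim P_r}[f(x)] \;-\; \mathbb{E}_{x \sim P_\theta}[f(x)].
```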


One software defective data augmentation method based on VAE and WGAN
Authors: Mengtian Cui, Zhaoyang Guo, 2023 International Conference on Frontiers of Robotics and Software Engineering (FRSE)

Summary:In software metric datasets, the number of defective samples is always much smaller than that of non-defective samples, which makes follow-up research complex and difficult. Therefore, this essay provides a method of software defective data augmentation based on a variational autoencoder (VAE) and Wasserstein Generative Adversarial Network (WGAN). This process contains several steps. First, dimensionality reduction of software metric data is achieved by utilizing VAE, producing a set of codes (latent vectors); the distribution of the set is studied by WGAN and then “the code set (latent vectors)” is generated. Finally, the generated codes (latent vectors) are input into VAE to acquire defective data. This paper proposes a data augmentation method for a minority class, based on VAE and WGAN. The experiments performed on MNIST and NASA MDP confirm that 1) this data augmentation method is superior to WGAN, AE+WGAN, and SMOTE; 2) by introducing the variance of generated samples to the WGAN generator, the variety of the generated samples is efficiently improved
Chapter, 2023
Publication:2023 International Conference on Frontiers of Robotics and Software Engineering (FRSE), 202306, 44
Publisher: 2023


 A WGAN-GP Framework for SAR and Optical Remote Sensing Image Fusion
Authors: Akshay Ajay, Amith G, Gokul S Kumar, R Malavika, Anup Aprem, 2023 Annual International Conference on Emerging Research Areas: International Conference on Intelligent Systems (AICERA/ICIS)
Summary:Synthetic Aperture Radar (SAR) and optical images are widely used in remote sensing applications due to their unique capabilities and complementary data. SAR can penetrate through cloud cover and adverse weather conditions, while optical images provide rich spectral information but are affected by cloud cover. This paper proposes a novel approach for SAR and optical image fusion using the Wasserstein Generative Adversarial Network with gradient penalty (WGAN-GP). In addition, we introduce a content loss function that combines the structural similarity loss in the literature with the pixel loss and gradient loss for optimal fusion. Experiments on the Sentinel dataset show that the proposed methodology outperforms existing state-of-the-art architectures based on CNN and wavelet transform. We also illustrate how the fused image is more beneficial for both human perception and automatic computer analysis
Chapter, 2023
Publication:2023 Annual International Conference on Emerging Research Areas: International Conference on Intelligent Systems (AICERA/ICIS), 20231116, 1
Publisher: 2023
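The gradient-penalty term that distinguishes WGAN-GP from plain WGAN, used in the fusion framework above and in several later entries, can be sketched as follows. The critic and the feature dimension are placeholders for illustration, not the paper's architecture.

# Minimal WGAN-GP gradient penalty: penalize deviations of the critic's
# gradient norm from 1 on random interpolations between real and fake samples.
import torch
import torch.nn as nn

critic = nn.Sequential(nn.Linear(256, 64), nn.ReLU(), nn.Linear(64, 1))

def gradient_penalty(critic, real, fake, lam=10.0):
    eps = torch.rand(real.size(0), 1)                        # per-sample mixing weights
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads, = torch.autograd.grad(scores.sum(), interp, create_graph=True)
    return lam * ((grads.norm(2, dim=1) - 1.0) ** 2).mean()

real = torch.randn(16, 256)   # stand-ins for flattened image features
fake = torch.randn(16, 256)
print(gradient_penalty(critic, real, fake))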


Research on vehicle trajectory anomaly detection algorithm based on GRU and WGAN
Authors:YuHang Liu, Lei Wang, XiaoYong Zhao, HuaMing Lu, JingLe Zhang, JianHua Li, DeBin Han
2023 8th International Conference on Intelligent Computing and Signal Processing (ICSP)
Summary:The uncertainty of vehicle trajectories and the existence of anomalous data lead to challenges in their application in the field of digital transportation. In this paper, a GRU-WGAN deep learning model based on GANs is proposed for vehicle trajectory feature extraction and anomalous trajectory detection. First, a VAE utilises a GRU neural network as its encoder and decoder, which can deeply extract features from the original data at the encoding layer and perform variational inference. At the same time, learning deep feature extraction helps the VAE model restore, to the maximum extent, the approximate probability distribution of the initial data at the coding layer, thus improving the efficiency of anomaly detection. The GRU-WGAN model combining GRU and WGAN is then proposed to learn the output of the feature extraction part and the latent features of the real data to complete the task of anomaly detection for vehicle track data. In addition, comparative experiments were set up to validate the proposed model. The experiments demonstrate that the GRU-WGAN model outperforms conventional algorithms in terms of accuracy, recall and F1 metrics. Therefore, the proposed model can be effectively applied to feature extraction and vehicle trajectory anomaly detection tasks
Chapter, 2023
Publication:2023 8th International Conference on Intelligent Computing and Signal Processing (ICSP), 20230421, 1452
Publisher: 2023







 

ROD-WGAN hybrid: A Generative Adversarial Network for Large-Scale Protein Tertiary Structures
Authors:Mena Nagy A. Khalaf, Taysir Hassan A. Soliman, Sara Salah Mohamed
2023 International Conference on Computer and Applications (ICCA)
Summary:The tertiary structures of proteins play a critical role in determining their functions, interactions, and bonding in molecular chemistry. Proteins are known to demonstrate natural dynamism under various physiological conditions, which enables them to adjust their tertiary structures and effectively interact with the surrounding molecules. The present study utilized the remarkable progress made in Generative Adversarial Networks (GANs) to generate tertiary structures that accurately mimic the inherent attributes of actual proteins, including the backbone conformation as well as the local and distal characteristics of proteins. The current study introduces a robust model, the ROD-WGAN hybrid, that is able to generate large-scale tertiary protein structures that closely mimic those found in nature. We have made several noteworthy contributions in pursuit of this objective by integrating the ROD-WGAN model with a hybrid loss function that incorporates adversarial loss, perceptual loss, distance-by-distance loss, and structural similarity loss. Through this innovative approach, we achieved remarkable results, particularly in the generation of protein structures of considerable length. The ROD-WGAN hybrid model transcends the limitations inherent in the ROD-WGAN model, demonstrating its capability to synthesize high-quality proteins of up to 256 amino acids in length. This accomplishment demonstrates the effectiveness and potential of the proposed methodologies. Furthermore, the generated protein structures serve as a valuable resource for data augmentation in crucial applications such as molecular structure prediction, inpainting, dynamics, and drug design. Interested individuals can access the data, code, and trained models at https://github.conmena01/ROD-WGAN-and-ROD-WGAN-hybird-models
Chapter, 2023
Publication:2023 International Conference on Computer and Applications (ICCA), 20231128, 1
Publisher: 2023

<–—2023———2023——2410—



Shape Generation of IPM Motor Rotor Using Conditional WGAN-gp
Authors:Nobuhito KATO, Keisuke SUZUKI, Yoshihisa KONDO, Katsuyuki SUZUKI, Kazuo YONEKURA
Article, 2023
Publication:The Proceedings of Design & Systems Conference, 2023.33, 2023, 3206
Publisher: 2023

Shape generation of motor rotors using Conditional WGAN-gp (in Japanese)
Authors:Nobuhito Kato, Keisuke Suzuki, Yoshihisa Kondo, Katsuyuki Suzuki, Kazuo Yonekura (Design Engineering)
Downloadable Article, 2023
Publication:Proceedings of the Design & Systems Division Conference 2023, 2023, 3206
Publisher: 2023

Research on Two-stage Identification of Distributed Photovoltaic Output Based on WGAN Data Reconstruction Technology

Authors:Yun Su, Jun Gu, Jianxin Zhang, Xiu Yang, Yu Jin, Wenhao Li
2023 4th International Conference on Advanced Electrical and Energy Systems (AEES)
Summary:In recent years, household distributed photovoltaics have developed rapidly, and their unobservable characteristics bring huge challenges to the power sector for grid planning and control and for user-side energy management. In view of this, this article proposes a two-stage identification method for distributed photovoltaic output based on data reconstruction technology. The method transforms the problem of distributed photovoltaic output identification into the problem of reconstructing missing actual load data. Based on this, it fully considers the hidden photovoltaic feature information in the net load, corrects the photovoltaic output identification results, and achieves precise identification of the target user's photovoltaic output. Compared with existing methods, the proposed method does not need to consider the impact of environmental factors on photovoltaic output during identification, and its identification accuracy is significantly higher than that of existing technologies
Chapter, 2023
Publication:2023 4th International Conference on Advanced Electrical and Energy Systems (AEES), 20231201, 710
Publisher: 2023


A Compound Generative Adversarial Network Designed for Stock Price Prediction Based on WGAN

Authors:Zhichao Chang, Zuping Zhang
2023 International Conference on Cyber-Physical Social Intelligence (ICCSI)
Summary:In the stock market, the combination of historical stock data and machine learning methods has gradually replaced investment methods that rely solely on human experience. We have implemented a composite Generative Adversarial Network based on a pre-training model, which deeply analyzes the characteristics of stock trends and is more suitable for processing stock data than traditional methods. The models introduced in this manuscript include a pre-training model and a deep training model. The deep training model also includes the Generative Adversarial Network, ARIMA-Lasso units, and selectors. When applied to the historical data of the American stock market over the past three years, we find that the optimal accuracy of our model exceeds 84% in all experiments. In addition, we also compared this model with other excellent models, which also proved that this model is outstanding
Chapter, 2023
Publication:2023 International Conference on Cyber-Physical Social Intelligence (ICCSI), 20231020, 256
Publisher: 2023
 
A new method for mandala image synthesis based on WGAN-GP
Authors:Zhengyu Cao, Wei He, Fengsheng Lin, Changyi Liu
Article, 2023
Publication:Applied and Computational Engineering, 6, 20230614, 561
Publisher: 2023


2023

A WGAN-based Missing Data Causal Discovery Method
Authors:Yanyang Gao, Qingsong Cai, 2023 4th International Conference on Big Data, Artificial Intelligence and Internet of Things Engineering (ICBAIE)
Summary:The state-of-the-art causal discovery algorithms are typically based on complete observed data. However, in reality, technical issues, human errors, and data collection methods among other reasons result in missing data. The methods for handling missing data mainly involve statistical and machine learning approaches, where statistical methods are simple and practical, while machine learning methods offer higher accuracy. The typical approach for causal structure discovery in the presence of missing data involves two steps: First, applying missing data imputation algorithms to address the issue of missing data, and then using causal discovery algorithms to identify the causal structure. However, this two-step approach is suboptimal because imputing missing data may introduce biases in the underlying data distribution, making it challenging to accurately assess causal effects between variables. This paper proposes an iterative approach based on generative models for both missing data imputation and causal structure discovery. This approach incorporates an architecture based on Wasserstein generative adversarial networks and autoencoders (AE) to respectively impute missing data and output the causal structure. Through extensive experiments comparing against various state-of-the-art baseline algorithms, the effectiveness and superiority of this method are validated, providing valuable insights for further research on causal structures in the context of missing data
Chapter, 2023
Publication:2023 4th International Conference on Big Data, Artificial Intelligence and Internet of Things Engineering (ICBAIE), 20230825, 136
Publisher: 2023

EC-WGAN: Enhanced Conditional and Wasserstein GAN for Fault Samples Augmentation
Authors:Lingli Li, Zhongxin Li, Xiuli Wang, Jianye Gong
2023 6th International Conference on Robotics, Control and Automation Engineering (RCAE)
Summary:Considering the issue of limited and imbalanced fault samples, an enhanced conditional and Wasserstein GAN (EC-WGAN) is proposed for fault sample augmentation of bearings. First, random noise along with the fault category is input into the generator to produce synthetic samples. Then, the authenticity of real versus synthetic samples is judged by the discriminator in a minimax cost function with the Wasserstein distance. Moreover, a gradient penalty is applied to maintain the Lipschitz continuity of the critic and to avoid vanishing gradients in Wasserstein GAN training. As the collected samples are time series data, one-dimensional convolutional layers replace the fully connected layers of the generator and the discriminator. Finally, the effectiveness of the proposed EC-WGAN is verified on the limited data of the Case Western Reserve University bearing dataset
Chapter, 2023
Publication:2023 6th International Conference on Robotics, Control and Automation Engineering (RCAE), 20231103, 401
Publisher: 2023

Folded Handwritten Digit Recognition Based on WGAN-GP Model
Authors:Jiacheng Wei, Huijia Song, Xiaozhu Lin, Shaoning Jin, Senliu Chen, Tianqian Zhou
2023 4th International Conference on Intelligent Computing and Human-Computer Interaction (ICHCI)
Summary:The study of overlapped handwritten digit recognition algorithms is critical for improving automated recognition accuracy, improving document processing, and automating recognition systems. The majority of current research in this field is focused on detecting two overlapped handwritten numbers. However, when one handwritten digit is folded onto another, recognition becomes more difficult, and there is currently no well-established recognition algorithm for this circumstance. To solve the issue of folded digit recognition, a method is provided that reconstructs the folded handwritten digit images using Generative Adversarial Networks (GANs) and transformation optimization algorithm, followed by recognition using a recognition network. The MNIST dataset is used to validate the suggested approach. The experimental findings demonstrate that the recognition accuracy reaches 97.22%, proving the suggested approach's considerable promise in solving the recognition of folded handwritten digit images
Chapter, 2023
Publication:2023 4th International Conference on Intelligent Computing and Human-Computer Interaction (ICHCI), 20230804, 


2023 thesis

Injective and coarse embeddings of persistence diagrams and Wasserstein space
Authors:Christopher Neil Pritchard (Author), NC Digital Online Collection of Knowledge and Scholarship (NCDOCKS)
Summary:"In this dissertation we will examine questions related to two fields of mathematics, topological data analysis (TDA) and optimal transport (OT). Both of these fields center on complex data types to which one often needs to apply standard machine learning or statistical methods. Such application will typically mandate that these data types are embedded into a vector space. It has been shown that for many natural metrics such embeddings necessarily have high distortion, i.e. are not even coarse embeddings. Whether coarse embeddings exist with respect to the p-Wasserstein distance for 1 ≤ p ≤ 2 remains an open question, however, both for persistence diagrams (from TDA) and planar distributions (from OT). In the first part of this dissertation, we use coarse geometric techniques to show that the TDA and OT sides of this open question are equivalent for p > 1. In the second, we study an embedding of persistence diagrams, and show that under mild conditions it is injective, i.e. distinguishes between distinct diagrams."--Abstract from author supplied metadata
Thesis, Dissertation, English, 2023
Publisher: [University of North Carolina at Greensboro], [Greensboro, N.C.], 2023
Access Free



The science of deep learning
Author:Iddo Drori (Author)
Summary:The Science of Deep Learning emerged from courses taught by the author that have provided thousands of students with training and experience for their academic studies, and prepared them for careers in deep learning, machine learning, and artificial intelligence in top companies in industry and academia. The book begins by covering the foundations of deep learning, followed by key deep learning architectures. Subsequent parts on generative models and reinforcement learning may be used as part of a deep learning course or as part of a course on each topic. The book includes state-of-the-art topics such as Transformers, graph neural networks, variational autoencoders, and deep reinforcement learning, with a broad range of applications. The appendices provide equations for computing gradients in backpropagation and optimization, and best practices in scientific writing and reviewing. The text presents an up-to-date guide to the field built upon clear visualizations using a unified notation and equations, lowering the barrier to entry for the reader. The accompanying website provides complementary code and hundreds of exercises with solutions
Print Book, English, 2023
Publisher: Cambridge University Press, Cambridge, 2023
Also available as eBook

<–—2023———2023——2410—



Optimizing allosteric analysis: a Wasserstein distance and heat kernel-based methodology for investigating p53 energetics
Author:Benjamin Scott Cowan (Author)
Thesis, Dissertation, English, 2023
Publisher: 2023




Peer-reviewed

A mechanical derivation of the evolution equation for scintillating crystals: Recombination-diffusion-drift equations, gradient flows and Wasserstein measures
Author:Fabrizio Daví
Summary:In a series of previous papers we obtained, by means of the mechanics of continua with microstructure, the Reaction-Diffusion-Drift equation which describes the evolution of charge carriers in scintillators. Here we deal, first of all, with the consequences of constitutive assumptions for the entropic and dissipative terms. In the case of Boltzmann-Gibbs entropy, we show that the equation admits a gradient flow structure: moreover, we show that the drift-diffusion part is a Wasserstein gradient flow and we show how the energy dissipation is correlated with an appropriate Wasserstein distance.
• We obtain a continuum mechanics model for the evolution of charges in scintillators. • The evolution equation is an RDD equation coupled with the equation of electrostatics. • The RDD equations are Wasserstein gradient flows, since the charge densities are measures
Article, 2023
Publication:Mechanics Research Communications, 134, 202312
Publisher: 2023


 
Learning Wasserstein Contrastive Color Histogram Representation for Low-Light Image Enhancement

Authors:Zixuan Sun, Shenglong Hu, Huihui Song, Peng Liang
Summary:The goal of low-light image enhancement (LLIE) is to enhance perception to restore normal-light images. The primary emphasis of earlier LLIE methods was on enhancing the illumination while paying less attention to the color distortions and noise in the dark. In comparison to the ground truth, the restored images frequently exhibit inconsistent color and residual noise. To this end, this paper introduces a Wasserstein contrastive regularization method (WCR) for LLIE. The WCR regularizes the color histogram (CH) representation of the restored image to keep its color consistency while removing noise. Specifically, the WCR contains two novel designs including a differentiable CH module (DCHM) and a WCR loss. The DCHM serves as a modular component that can be easily integrated into the network to enable end-to-end learning of the image CH. Afterwards, to ensure color consistency, we utilize the Wasserstein distance (WD) to quantify the resemblance of the learnable CHs between the restored image and the normal-light image. Then, the regularized WD is used to construct the WCR loss, which is a triplet loss and takes the normal-light images as positive samples, the low-light images as negative samples, and the restored images as anchor samples. The WCR loss pulls the anchor samples closer to the positive samples and simultaneously pushes them away from the negative samples so as to help the anchors remove the noise in the dark. Notably, the proposed WCR method was only used for training, and was shown to achieve high performance and high speed inference using lightweight networks. Therefore, it is valuable for real-time applications such as night automatic driving and night reversing image enhancement. Extensive evaluations on benchmark datasets such as LOL, FiveK, and UIEB showed that the proposed WCR method achieves superior performance, outperforming existing state-of-the-art methods
Downloadable Article, 2023
Publication:Mathematics, 11, 20231001, 4194
Publisher: 2023
Access Free
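For orientation, the per-channel quantity that a histogram-based Wasserstein regularizer of this kind compares reduces, in one dimension, to the L1 distance between cumulative histograms. The sketch below is a generic illustration with made-up histograms and bin count, not the WCR implementation.

# 1-D Wasserstein distance between two normalized color histograms on a common grid:
# W1 equals the L1 distance between their cumulative distribution functions.
import numpy as np

def wasserstein_1d(hist_p, hist_q, bin_width=1.0):
    p = hist_p / hist_p.sum()
    q = hist_q / hist_q.sum()
    return bin_width * np.abs(np.cumsum(p) - np.cumsum(q)).sum()

bins = 256
restored = np.random.rand(bins)      # histogram of a restored-image channel (synthetic)
reference = np.random.rand(bins)     # histogram of a normal-light channel (synthetic)
print(wasserstein_1d(restored, reference))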


2023 see 2921. Peer-reviewed

Learning domain invariant representations by joint Wasserstein distance minimization

Authors:Léo Andéol, Yusei Kawakami, Yuichiro Wada, Takafumi Kanamori, Klaus-Robert Müller, Grégoire Montavon
Summary:Domain shifts in the training data are common in practical applications of machine learning; they occur for instance when the data is coming from different sources. Ideally, a ML model should work well independently of these shifts, for example, by learning a domain-invariant representation. However, common ML losses do not give strong guarantees on how consistently the ML model performs for different domains, in particular, whether the model performs well on a domain at the expense of its performance on another domain. In this paper, we build new theoretical foundations for this problem, by contributing a set of mathematical relations between classical losses for supervised ML and the Wasserstein distance in joint space (i.e. representation and output space). We show that classification or regression losses, when combined with a GAN-type discriminator between domains, form an upper-bound to the true Wasserstein distance between domains. This implies a more invariant representation and also more stable prediction performance across domains. Theoretical results are corroborated empirically on several image datasets. Our proposed approach systematically produces the highest minimum classification accuracy across domains, and the most invariant representation
Article, 2023
Publication:Neural Networks, 167, 202310, 233
Publisher: 2023

2023


2023 thesis 

Parameter estimation from aggregate observations: a Wasserstein distance-based sequential Monte Carlo sampler
Authors:Chen Cheng, Linjie Wen, Jinglai Li

Summary:In this work, we study systems consisting of a group of moving particles. In such systems, often some important parameters are unknown and have to be estimated from observed data. Such parameter estimation problems can often be solved via a Bayesian inference framework. However, in many practical problems, only data at the aggregate level is available and as a result the likelihood function is not available, which poses a challenge for Bayesian methods. In particular, we consider the situation where the distributions of the particles are observed. We propose a Wasserstein distance (WD)-based sequential Monte Carlo sampler to solve the problem: the WD is used to measure the similarity between the observed and the simulated particle distributions, and the sequential Monte Carlo sampler is used to deal with the sequentially available observations. Two real-world examples are provided to demonstrate the performance of the proposed method
Downloadable Article, 2023
Publication:Royal Society Open Science, 10, 20230801
Publisher: 2023
Access Free
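A toy version of the likelihood-free comparison described above, with scipy's one-dimensional Wasserstein distance standing in for the discrepancy between observed and simulated particle clouds; the particle model and parameter values are invented for illustration and are not the paper's system.

# Compare an observed particle cloud with simulations at candidate parameters,
# using the 1-D empirical Wasserstein distance in place of a likelihood.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
observed_particles = rng.normal(loc=0.0, scale=1.0, size=500)

def simulate(theta, n=500):
    """Toy aggregate model: particle positions depend on the unknown theta."""
    return rng.normal(loc=theta, scale=1.0, size=n)

# An SMC sampler would use this discrepancy to weight candidate parameters.
for theta in (0.0, 0.5, 2.0):
    print(theta, wasserstein_distance(observed_particles, simulate(theta)))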

 

Peer-reviewed
On the exotic isometry flow of the quadratic Wasserstein space over the real line

Authors:György Pál Gehér, Tamás Titkos, Dániel Virosztek
Summary:Kloeckner discovered that the quadratic Wasserstein space over the real line (denoted by
Article
Publication:Linear Algebra and Its Applications

 

Peer-reviewed

Decentralized convex optimization on time-varying networks with application to Wasserstein barycenters

Authors:Olga Yufereva, Michael Persiianov, Pavel Dvurechensky, Alexander Gasnikov, Dmitry Kovalev
Summary:Abstract: Inspired by recent advances in distributed algorithms for approximating Wasserstein barycenters, we propose a novel distributed algorithm for this problem. The main novelty is that we consider time-varying computational networks, which are motivated by examples when only a subset of sensors can observe each time step, and yet, the goal is to average signals (e.g., satellite pictures of some area) by approximating their barycenter. We embed this problem into a class of non-smooth dual-friendly distributed optimization problems over time-varying networks and develop a first-order method for this class. We prove non-asymptotic convergence rates, accelerated in the sense of Nesterov, and explicitly characterize their dependence on the parameters of the network and its dynamics. In the experiments, we demonstrate the efficiency of the proposed algorithm when applied to the Wasserstein barycenter problem
Article, 2023
Publication:Computational Management Science, 21, 202406
Publisher: 2023


Wasserstein and weighted metrics for multidimensional Gaussian distributions
Authors:Kelbert, Mark Yakovlevich; Suhov, Yurii M.
Summary:We present a number of lower and upper bounds for the Lévy-Prokhorov, Wasserstein, Fréchet, and Hellinger distances between probability distributions of the same or different dimensions. Weighted (or context-sensitive) total variation and Hellinger distances are introduced, and upper and lower bounds for these weighted metrics are proved. Lower bounds for the minimum of different errors in sensitive hypothesis testing are also proved.
Downloadable Article, 2023
Publication:Известия Саратовского университета. Новая серия. Серия Математика. Механика. Информатика (Izvestiya of Saratov University. Mathematics. Mechanics. Informatics), 23, 20231101, 422
Publisher: 2023
Access Free
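As a point of reference for entries like the one above, the 2-Wasserstein distance between two Gaussians has a well-known closed form, W2^2 = ||m1 - m2||^2 + tr(C1 + C2 - 2 (C2^{1/2} C1 C2^{1/2})^{1/2}). The sketch below evaluates this standard identity on arbitrary example parameters; it is not a result specific to the cited paper.

# Closed-form 2-Wasserstein distance between N(m1, C1) and N(m2, C2).
import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2(m1, C1, m2, C2):
    cross = sqrtm(sqrtm(C2) @ C1 @ sqrtm(C2))          # (C2^{1/2} C1 C2^{1/2})^{1/2}
    return np.sqrt(np.sum((m1 - m2) ** 2) + np.trace(C1 + C2 - 2 * np.real(cross)))

m1, C1 = np.zeros(2), np.eye(2)
m2, C2 = np.array([1.0, 0.0]), np.diag([2.0, 0.5])
print(gaussian_w2(m1, C1, m2, C2))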

 

 Peer-reviewed
A Wasserstein Distance-Based Cost-Sensitive Framework for Imbalanced Data Classification

Authors:R. Feng, H. Ji, Z. Zhu, L. Wang
Summary:Class imbalance is a prevalent problem in many real-world applications, and imbalanced data distribution can dramatically skew the performance of classifiers. In general, the higher the imbalance ratio of a dataset, the more difficult it is to classify. However, it is found that standard classifiers can still achieve good classification results on some highly imbalanced datasets. Obviously, the class imbalance is only a superficial characteristic of the data, and the underlying structural information is often the key factor affecting the classification performance. As implicit prior knowledge, structural information has been validated to be crucial for designing a good classifier. This paper proposes a Wasserstein-based cost-sensitive support vector machine (CS-WSVM) for class imbalance learning, incorporating prior structural information and a cost-sensitive strategy. The Wasserstein distance is introduced to model the distribution of majority and minority samples to capture the structural information, which is employed to weight the majority and minority samples. Comprehensive experiments on synthetic and real-world datasets, especially on the radar emitter signal dataset, demonstrated that CS-WSVM can achieve outstanding performance in imbalanced scenarios
Downloadable Article, 2023
Publication:Radioengineering, 32, 20230901, 451
Publisher: 2023
Access Free

<–—2023———2023——2420—



Peer-reviewed

Wasserstein Distance-Based Deep Leakage from Gradients
Authors:Zifan Wang, Changgen Peng, Xing He, Weijie Tan
Summary:Federated learning protects the private information in the data set by sharing only the average gradient. However, the "Deep Leakage from Gradients" (DLG) algorithm, a gradient-based feature reconstruction attack, can recover private training data from the gradients shared in federated learning, resulting in private information leakage. The algorithm, however, suffers from slow model convergence and poor accuracy of the inverted images. To address these issues, a Wasserstein distance-based DLG method is proposed, named WDLG. The WDLG method uses the Wasserstein distance as the training loss function to improve the quality of the inverted images and the convergence of the model. The hard-to-calculate Wasserstein distance is converted into an iteratively computable form using the Lipschitz condition and Kantorovich-Rubinstein duality. Theoretical analysis proves the differentiability and continuity of the Wasserstein distance. Finally, experimental results show that the WDLG algorithm is superior to DLG in training speed and inversion image quality. At the same time, we show through experiments that differential privacy can be used for disturbance protection, which provides some ideas for the development of a privacy-preserving deep learning framework
Downloadable Article, 2023
Publication:Entropy, 25, 20230501, 810
Publisher: 2023
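The Kantorovich-Rubinstein duality invoked in the summary above is the standard representation

W_1(P, Q) \;=\; \sup_{\|f\|_{\mathrm{Lip}} \le 1} \Bigl( \mathbb{E}_{x \sim P}[f(x)] \;-\; \mathbb{E}_{y \sim Q}[f(y)] \Bigr),

so a 1-Lipschitz critic trained to maximize the right-hand side yields a differentiable surrogate for the Wasserstein loss. This is a textbook identity, stated here for reference rather than taken from the paper.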
 

 
Peer-reviewed
A Machine Learning Framework for Geodesics Under Spherical Wasserstein–Fisher–Rao Metric and Its Application for Weighted Sample Generation

Authors:Yang Jing, Jiaheng Chen, Lei Li, Jianfeng Lu
Summary:Abstract: Wasserstein–Fisher–Rao (WFR) distance is a family of metrics to gauge the discrepancy of two Radon measures, which takes into account both transportation and weight change. Spherical WFR distance is a projected version of WFR distance for probability measures so that the space of Radon measures equipped with WFR can be viewed as metric cone over the space of probability measures with spherical WFR. Compared to the case for Wasserstein distance, the understanding of geodesics under the spherical WFR is less clear and still an ongoing research focus. In this paper, we develop a deep learning framework to compute the geodesics under the spherical WFR metric, and the learned geodesics can be adopted to generate weighted samples. Our approach is based on a Benamou–Brenier type dynamic formulation for spherical WFR. To overcome the difficulty in enforcing the boundary constraint brought by the weight change, a Kullback–Leibler divergence term based on the inverse map is introduced into the cost function. Moreover, a new regularization term using the particle velocity is introduced as a substitute for the Hamilton–Jacobi equation for the potential in dynamic formula. When used for sample generation, our framework can be beneficial for applications with given weighted samples, especially in the Bayesian inference, compared to sample generation with previous flow models
Article, 2023
Publication:Journal of Scientific Computing, 98, 202401
Publisher: 2023


Peer-reviewed
Model and observation of the feasible region for PV integration capacity considering Wasserstein-distance-based distributionally robust chance constraints

Authors:Shida Zhang, Shaoyun Ge, Hong Liu, Junkai Li, Chengshan Wang
Summary:• PV hosting capability for multiple locations is modeled as a feasible region. • The uncertainty of PV output is addressed by data-driven WDRCC. • Active management with limited budget is used to improve the hosting capability. • A systematic solution procedure is proposed to observe the feasible region. • Each PV integration request in the feasible region is a feasible request.

The increasing integration scale of photovoltaic (PV) systems brings enormous challenges on distribution networks (DNs). To provide an explicit boundary of feasible PV integration capacity (PVIC) associated with each integration location, this paper proposes a novel depiction of PV hosting capability, which is the feasible region for PVIC in high-dimensional space. Next, a multi-objective optimization model based on information gap decision theory (IGDT) is proposed to observe the feasible region, where active distribution network management (ADNM) schemes with a limited budget are deployed to improve PV hosting capability. The impact of PV output uncertainty on security constraints is addressed by data-driven Wasserstein-distance-based distributionally robust chance constraints (WDRCCs). Finally, a systematic procedure is developed to solve the proposed model. Note that the exact power flow model associated with a new equivalent WDRCC reformulation method is deployed so as to guarantee the accuracy of the assessment. The effectiveness, solution accuracy, computational efficiency, and scalability of the proposed model and method are verified with the 4-bus system, the IEEE 33-bus system, and the IEEE 123-bus system. The output of the common method based on the linearized power flow deviates by more than 10% from the output of the proposed method based on the exact power flow in the 33-bus system. Each multi-location PV integration request is feasible as its locations and capacities belong to the proposed region. It implies that this region can provide guidance for PV allocation
Article, 2023
Publication:Applied Energy, 347, 20231001
Publisher: 2023

Peer-reviewed
Wasserstein enabled Bayesian optimization of composite functions

Authors:Candelieri, A (Contributor), Ponti, A (Contributor), Archetti, F (Contributor)
Summary:Abstract: Bayesian optimization (BO) based on the Gaussian process model (GP-BO) has become the most used approach for the global optimization of black-box functions and computationally expensive optimization problems. BO has proved its sample efficiency and its versatility in a wide range of engineering and machine learning problems. A limiting factor in its applications is the difficulty of scaling over 15–20 dimensions. In order to mitigate this drawback, it has been remarked that optimization problems can have a lower intrinsic dimensionality. Several optimization strategies, built on this observation, map the original problem into a lower dimension manifold. In this paper we take a novel approach mapping the original problem into a space of discrete probability distributions endowed with a Wasserstein metric. The Wasserstein space is a non-linear manifold whose elements are discrete probability distributions. The input of the Gaussian process is given by discrete probability distributions and the acquisition function becomes a functional in the Wasserstein space. The minimizer of the acquisition functional in the Wasserstein space is then mapped back to the original space using a neural network. Computational results for three test functions with dimensionality ranging from 5 to 100, show that the exploration in the Wasserstein space is significantly more effective than that performed by plain Bayesian optimization in the Euclidean space and its advantage grows with the dimensions of the search space
Article, 2023
Publication:Journal of Ambient Intelligence and Humanized Computing, 14, 202308, 11263
Publisher: 2023


2023



 
Peer-reviewed
Data-Driven Distributionally Robust Risk-Averse Two-Stage Stochastic Linear Programming over Wasserstein Ball

Authors:Yining Gu, Yicheng Huang, Yanjun Wang
Summary:Abstract: In this paper, we consider a data-driven distributionally robust two-stage stochastic linear optimization problem over 1-Wasserstein ball centered at a discrete empirical distribution. Differently from the traditional two-stage stochastic programming which involves the expected recourse function as the preference criterion and hence is risk-neutral, we take the conditional value-at-risk (CVaR) as the risk measure in order to model its effects on decision making problems. We mainly explore tractable reformulations for the proposed robust two-stage stochastic programming with mean-CVaR criterion by analyzing the first case where uncertainties are only in the objective function and then the second case where uncertainties are only in the constraints. We demonstrate that the first model can be exactly reformulated as a deterministic convex programming. Furthermore, it is shown that under several different support sets, the resulting convex optimization problems can be converted into computationally tractable conic programmings. Besides, the second model is generally NP-hard since checking constraint feasibility can be reduced to a norm maximization problem over a polytope. However, even with the case of uncertainty in constraints, tractable conic reformulations can be established when the extreme points of the polytope are known. Finally, we present numerical results to discuss how to control the risk for the best decisions and illustrate the computational effectiveness and superiority of the proposed models
 Article, 2023
Publication:Journal of Optimization Theory and Applications, 200, 202401, 242
Publisher: 2023
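For readers new to this strand of work, the ambiguity set in data-driven Wasserstein DRO formulations such as the entries above and below is the ball around the empirical distribution of the N observed samples,

\mathcal{B}_{\varepsilon}(\hat{P}_N) \;=\; \bigl\{\, Q \;:\; W_1\bigl(Q, \hat{P}_N\bigr) \le \varepsilon \,\bigr\}, \qquad \hat{P}_N \;=\; \tfrac{1}{N}\sum_{i=1}^{N} \delta_{\xi_i},

and the decision is chosen to minimize the worst-case objective (here, a mean-CVaR criterion) over all distributions Q in this ball. This generic definition is added for orientation and is not quoted from the paper.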


 
Wasserstein-metric-based distributionally robust optimization method for unit commitment considering wind turbine uncertainty

Authors:Gengrui Chen, Donglian Qi, Yunfeng Yan, Yulin Chen, Yaxin Wang, Jingcheng Mei
Summary:Abstract The penetration of wind turbines in the power grid is increasing rapidly. Still, the wind turbine output power has uncertainty, leading to poor grid reliability, affecting the grid's dispatching plan, and increasing the total cost. Thus, a distributionally robust optimization method for thermal power unit commitment considering the uncertainty of wind power is proposed. For this method, energy storage and interruptible load are added to simulate increasingly complex electricity consumption scenarios. Furthermore, the amount of load cutting reflects the satisfaction level of electricity consumption on the user side. Based on Wasserstein metric, an ambiguity set is established to reflect the probabilistic distribution information of the wind power uncertainty. An ambiguity set preprocessing method is proposed to depict the probability distribution of ambiguity set more clearly, to minimize the operation cost under the condition that the uncertainty of wind turbine output power obeys the extreme probabilistic distribution of the ambiguity set. The test case in a modified version of the IEEE 6-bus system shows that the proposed method can flexibly adjust the robustness and economy of optimization decisions by controlling the sample size and the confidence of Wasserstein ambiguity set radius. In addition, the proposed ambiguity set preprocessing method can obtain more economical dispatching decisions with a smaller sample size
Downloadable Article, 2023
Publication:Engineering Reports, 5, 20231001, n/a
Publisher: 2023
Access Free

 






An Invitation to Optimal Transport, Wasserstein Distances, and Gradient Flows
Authors:Alessio Figalli (Author), Federico Glaudo (Author)
Summary:"This book provides a self-contained introduction to optimal transport, and it is intended as a starting point for any researcher who wants to enter into this beautiful subject. The presentation focuses on the essential topics of the theory: Kantorovich duality, existence and uniqueness of optimal transport maps, Wasserstein distances, the JKO scheme, Otto's calculus, and Wasserstein gradient flows. At the end, a presentation of some selected applications of optimal transport is given. The book is suitable for a course at the graduate level and also includes an appendix with a series of exercises along with their solutions. The present second edition contains a number of additions, such as a new section on the Brunn-Minkowski inequality, new exercises, and various corrections throughout the text."-- publisher
eBook, English, 2023
Edition: Second edition
Publisher: EMS Press, Berlin, Germany, 2023
Also available as Print Book

<–—2023———2023——2430—



Peer-reviewed
CR-Net: A robust craniofacial registration network by introducing Wasserstein distance constraint and geometric attention mechanism

Authors:Zhenyu Dai, Junli Zhao, Xiaodan Deng, Fuqing Duan, Dantong Li, Zhenkuan Pan, Mingquan Zhou
Summary:Accurate registration of three-dimensional (3D) craniofacial data is fundamental work for craniofacial reconstruction and analysis. The complex topology and low quality of 3D models make craniofacial registration challenging in the iterative optimization process. In this paper, we propose a craniofacial registration network (CR-Net) that can automatically learn the registration parameters of the non-rigid thin plate spline (TPS) transformation from the training data sets and perform the required geometric transformations to align craniofacial point clouds. The proposed CR-Net employs an improved point cloud encoder architecture with a specially designed attention mechanism that can perceive the geometric structure of the point cloud. In order to align the source and target data, a Wasserstein distance loss is introduced and combined with the Chamfer loss and a Gaussian Mixture Model (GMM) loss as an unsupervised loss function dedicated to improving registration accuracy. After efficient training, the network can automatically generate the transformation parameters for registration, transforming the reference craniofacial data to the target craniofacial data without manual calibration of feature points or an iterative optimization process. Experimental results show that our method has high registration accuracy and is robust to low-quality models.

• A neural network for robust craniofacial point cloud registration. • Geometric attention mechanism to perceive the geometric structure of point clouds. • Introducing a Wasserstein distance loss to constrain unsupervised training
Article, 2023
Publication:Computers & Graphics, 116, 202311, 194
Publisher: 2023



Peer-reviewed
A time-series Wasserstein GAN method for state-of-charge estimation of lithium-ion batteries

Authors:Xinyu Gu, K.W. See, Yanbin Liu, Bilal Arshad, Liang Zhao, Yunpeng Wang
Summary:Estimating the state-of-charge (SOC) of lithium-ion batteries is essential for maintaining secure and reliable battery operation while minimizing long-term service and maintenance expenses. In this work, we present a novel Time-Series Wasserstein Generative Adversarial Network (TS-WGAN) approach for SOC estimation of lithium-ion batteries, characterized by a well-designed data preprocessing process and a distinctive WGAN-GP architecture. In the data preprocessing stage, we employ the Pearson correlation coefficient (PCC) to identify strongly associated features and apply feature scaling techniques for data normalization. Moreover, we leverage polynomial regression to expand the original features and utilize principal component analysis (PCA) to reduce the computational load and retain essential information by projecting features into a lower-dimensional subspace. Within the WGAN-GP architecture, we originally devise a Transformer as the generator and a Convolution Neural Network (CNN) as the critic to make the most of local (CNN) and global (Transformer) variables. The overall model is trained with the WGAN, incorporating gradient penalty loss for training purposes. Simulation outcomes using a real-road dataset and a laboratory dataset reveal that TS-WGAN surpasses all baseline methods with enhanced accuracy, stability, and robustness. The coefficient of determination (R2) for both datasets exceeds 99.50%, demonstrating its potential for practical application.

• The TS-WGAN model integrates local (CNN) and global (Transformer) information. • The WGAN-GP model is adapted for one-dimensional SOC time series forecasting. • The model's performance is tested using experimental and real-world road datasets
Article, 2023
Publication:Journal of Power Sources, 581, 20231015
Publisher: 2023
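The preprocessing chain described in the summary above (correlation screening, scaling, polynomial expansion, PCA) follows a standard pattern. The sketch below reproduces that pattern on synthetic data with assumed thresholds, degrees and column names; it is not the authors' code.

# Correlation screening -> scaling -> polynomial expansion -> PCA, on synthetic data.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler, PolynomialFeatures
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
df = pd.DataFrame(rng.normal(size=(1000, 5)),
                  columns=["voltage", "current", "temperature", "resistance", "noise"])
soc = df["voltage"] * 0.6 - df["current"] * 0.3 + rng.normal(scale=0.1, size=1000)

# 1) Keep features whose absolute Pearson correlation with SOC exceeds a threshold.
corr = df.apply(lambda col: col.corr(soc)).abs()
selected = df.loc[:, corr > 0.2]

# 2) Scale, 3) expand polynomially, 4) compress with PCA (keep 95% of the variance).
scaled = StandardScaler().fit_transform(selected)
expanded = PolynomialFeatures(degree=2, include_bias=False).fit_transform(scaled)
reduced = PCA(n_components=0.95).fit_transform(expanded)
print(selected.columns.tolist(), expanded.shape, reduced.shape)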

 
2023 see 2022    Peer-reviewed
Mean-field neural networks: Learning mappings on Wasserstein space

Authors:Huyên Pham, Xavier Warin
Summary:We study the machine learning task for models with operators mapping between the Wasserstein space of probability measures and a space of functions, like e.g. in mean-field games/control problems. Two classes of neural networks based on bin density and on cylindrical approximation, are proposed to learn these so-called mean-field functions, and are theoretically supported by universal approximation theorems. We perform several numerical experiments for training these two mean-field neural networks, and show their accuracy and efficiency in the generalization error with various test distributions. Finally, we present different algorithms relying on mean-field neural networks for solving time-dependent mean-field problems, and illustrate our results with numerical tests for the example of a semi-linear partial differential equation in the Wasserstein space of probability measures
Article, 2023
Publication:Neural Networks, 168, 202311, 380
Publisher: 2023
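A rough sketch of the bin-density idea mentioned in the summary above: an input probability measure is summarized by its histogram over fixed bins, and an ordinary network is trained on that fixed-length vector. The bin count, network size and input measure below are assumptions chosen for illustration, not the authors' construction.

# Bin-density approximation of a mean-field function F(mu): represent the input
# measure mu by its normalized histogram over fixed bins and feed it to an MLP.
import torch
import torch.nn as nn

n_bins = 50
net = nn.Sequential(nn.Linear(n_bins, 64), nn.ReLU(), nn.Linear(64, 1))

def measure_to_bins(samples):
    """Empirical measure -> normalized bin-density vector on [-3, 3]."""
    hist = torch.histc(samples, bins=n_bins, min=-3.0, max=3.0)
    return hist / hist.sum()

# Example: evaluate the network on the empirical measure of a Gaussian sample.
mu_samples = torch.randn(1000)
print(net(measure_to_bins(mu_samples).unsqueeze(0)))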

 
Peer-reviewed
ICWGAN-GP: an image fusion method based on infrared compensator and wasserstein generative adversarial network with gradient penalty

Authors:Xiao Wang, Gang Liu, Lili Tang, Durga Prasad Bavirisetti, Gang Xiao
Summary:Abstract: The existing Generative adversarial network (GAN)-based infrared (IR) and visible (VIS) image fusion methods mainly used multiple discriminators to preserve salient information in source images, which brings difficulty in balancing the performance of these discriminators during training, leading to unideal fused results. To tackle this disadvantage, an image fusion method based on IR compensator and Wasserstein generative adversarial network with gradient penalty (WGAN-GP) is proposed, called ICWGAN-GP. The generator of ICWGAN-GP employs an adjustment mechanism to obtain more VIS gradients while getting IR intensities, and important details in VIS images are highlighted through the adversarial game between a discriminator and a generator. Using one discriminator allows ICWGAN-GP to focus on learning the feature distribution in a source image, which avoids the balance problem caused by multiple discriminators, and improves the efficiency of the ICWGAN-GP. In addition, an IR compensator based on Quadtree-Bézier method is designed to make up for bright IR features in the fused images. Extensive experiments on public datasets show that ICWGAN-GP can highlight bright target features while generating rich texture in the fused images, and achieves better objective metrics in terms of SCD, CC, FMI_W and VIF than the state-of-the-art methods like U2Fusion, MDLatLRR, DDcGAN, etc. Moreover, in our further fusion tracking experiments, ICWGAN-GP also demonstrates good tracking performance
Article, 2023
Publication:Applied Intelligence : The International Journal of Research on Intelligent Systems for Real Life Complex Problems, 53, 202311, 27637
Publisher: 2023


Peer-reviewed
A Wasserstein generative digital twin model in health monitoring of rotating machines

Authors:Wenyang Hu, Tianyang Wang, Fulei Chu
Summary:Artificial intelligence-based rotating machine health monitoring and diagnosis methods often encounter problems, such as a lack of faulty samples. Although the simulation-based digital twin model may potentially alleviate these problems with sufficient prior knowledge and a large amount of time, the more demanding requirements of adaptivity, autonomy, and context-awareness may not be satisfied. This study attempted to address these problems by proposing a novel digital twin model referred to as the Wasserstein generative digital twin model (WGDT). The model employs a Wasserstein generative adversarial network (WGAN) as its core to model virtual samples with high fidelity to the healthy physical samples obtained from different industrial assets, thereby meeting the adaptivity requirement. Further, through a designed consistency test criterion mechanism, samples with high fidelity were generated by checking the similarity of distributions between generated samples and healthy physical samples, to ensure that the training process is conducted in a timely manner and manual involvement is avoided, thereby catering to the need for autonomy. This mechanism is based on the synchronous evolution of the generator and critic during training. Furthermore, the structure of the critic network can be customized according to the service-end tasks and testing conditions, thereby fulfilling the context awareness requirement. Subsequently, the critic network in the Wasserstein generative adversarial network (WGAN) can be used to perform different service-end tasks. The performance of the digital twin model was evaluated using two experimental cases and the results indicated that the WGDT model can efficiently and stably perform service-end tasks such as health monitoring, early fault detection, and degradation tracking without the requirement of prior knowledge, historical test samples, and faulty samples regarding the asset.

• Wasserstein generative digital twin model is proposed and overcomes lack of faulty samples. • Wasserstein generative adversarial network as the core of model ensures adaptivity. • Consistency test criterion mechanism is designed to ensure autonomy. • Customization ability of critic network structure ensures context awareness. • Efficiency and reliability to perform service end tasks confirmed via experiments
Article, 2023
Publication:Computers in Industry, 145, 202302
Publisher: 2023


2023



Peer-reviewed
Privacy-utility equilibrium data generation based on Wasserstein generative adversarial networks

Authors:Hai Liu, Youliang Tian, Changgen Peng, Zhenqiang Wu
Summary:To solve the contradictory problem of local data sharing and privacy protection, this paper presented the models and algorithms for privacy-utility equilibrium data generation based on Wasserstein generative adversarial networks. First, we introduce the expected estimation error between the discriminant probability of real data and generated data into Wasserstein generative adversarial networks, and we formally construct a basic mathematical model of privacy-utility equilibrium data generation based on computationally indistinguishable. Second, we construct the basic model and the basic algorithm of privacy-utility equilibrium data generation based on Wasserstein generative adversarial networks according to the constructed basic mathematical model, and our theoretical analysis results show that the basic algorithm can achieve the equilibrium between local data sharing and privacy protection. Third, according to the constructed basic model, we construct the federated model and the federated algorithm of privacy-utility equilibrium data generation based on Wasserstein generative adversarial networks using the serialized training method of federated learning, and our theoretical analysis results also show that the federated algorithm can achieve the equilibrium between local data sharing and privacy protection in a distributed environment. Finally, our experimental results show that the proposed algorithms can achieve the equilibrium between local data sharing and privacy protection in this paper. Therefore, the constructed basic mathematical model provides a theoretical basis of achieving the equilibrium between local data sharing and privacy protection. At the same time, the proposed basic model and the federated model of privacy-utility equilibrium data generation provide a concrete method of achieving the equilibrium between local data sharing and privacy protection in centralized and distributed environment respectively
Article, 2023
Publication:Information Sciences, 642, 202309
Publisher: 2023


Peer-reviewed

Cloud intrusion detection framework using variational auto encoder Wasserstein generative adversarial network optimized with archerfish hunting optimization algorithm
Authors:G. Senthilkumar, K. Tamilarasi, J. K. Periasamy
Summary:Abstract: The cloud computing environment has been severely harmed by security issues, which has a negative impact on the healthy and sustainable development of the cloud. Intrusion detection technologies are protecting the cloud computing environment from malicious attacks. To overcome this problem, Variational auto encoder Wasserstein generative adversarial networks enhanced by Gazelle optimization algorithm embraced cloud intrusion detection (CIDF-VAWGAN-GOA) is proposed in this manuscript. Here, the data is collected via NSL-KDD dataset. Then the data is supplied to pre-processing. In pre-processing, it purges the redundancy and missing value is restored using Difference of Gaussian filtering. Then the pre-processing output is fed to the feature selection. In feature selection, the optimal feature is selected using archerfish hunting optimizer (AHOA). The optimal features based, data is characterized by normal and anomalous under VAWGAN. Generally, VAWGAN does not adopt any optimization techniques to compute the optimum parameters for assuring accurate detection of intruder in cloud intrusion detection. Therefore, in this work, GOA is used for optimizing VAWGAN. The proposed CIDF-VAWGAN-GOA technique is implemented in Python under NSL-KDD data set. The performance metrics, like accuracy, sensitivity, specificity, precision, F-Score, Computation Time, Error rate, AUC are examined. The proposed method provides higher recall of 17.58%, 23.18% and 13.92%, high AUC of 19.43%, 12.84% and 21.63% and lower computation Time of 15.37%, 1.83%,18.34% compared to the existing methods, like Cloud intrusion detection depending on stacked contractive auto-encoder with support vector machine (CIDF-SVM), Efficient feature selection with classification using ensemble method for network intrusion detection on cloud computing (CIDF-DNN) and Deep belief network under chronological salp swarm approach for intrusion detection in cloud utilizing fuzzy entropy (CIDF-DBN) respectively
Article, 2023
Publication:Wireless Networks : The Journal of Mobile Communication, Computation and Information, 20231201, 1
Publisher: 2023



Peer-reviewed
The Ultrametric Gromov–Wasserstein Distance

Authors:Facundo Mémoli, Axel Munk, Zhengchao Wan, Christoph Weitkamp
Summary:Abstract: We investigate compact ultrametric measure spaces, which form a subset of the collection of all metric measure spaces. In analogy with the notion of the ultrametric Gromov–Hausdorff distance on the collection of ultrametric spaces, we define ultrametric versions of two metrics on this collection, namely of Sturm's Gromov–Wasserstein distance of order p and of the Gromov–Wasserstein distance of order p. We study the basic topological and geometric properties of these distances as well as their relation, and derive a polynomial time algorithm for their calculation in a special case. Further, several lower bounds for both distances are derived and some of our results are generalized to the case of finite ultra-dissimilarity spaces. Finally, we study the relation between the Gromov–Wasserstein distance and its ultrametric version (as well as the relation between the corresponding lower bounds) in simulations and apply our findings to phylogenetic tree shape comparisons
Article, 2023
Publication:Discrete & Computational Geometry, 70, 202312, 1378
Publisher: 2023


A Novel Approach to Satellite Component Health Assessment Based on the Wasserstein Distance and Spectral Clustering

Authors:Yongchao Hui, Yuehua Cheng, Bin Jiang, Xiaodong Han, Lei Yang
Summary:This research presents a multiparameter approach to satellite component health assessment aimed at addressing the increasing demand for in-orbit satellite component health assessment. The method encompasses three key enhancements. Firstly, the utilization of the Wasserstein distance as an indicator simplifies the decision-making process for assessing the health of data distributions. This enhancement allows for a more robust handling of noisy sensor data, resulting in improved accuracy in health assessment. Secondly, the original limitation of assessing component health within the same parameter class is overcome by extending the evaluation to include multiple parameter classes. This extension leads to a more comprehensive assessment of satellite component health. Lastly, the method employs spectral clustering to determine the boundaries of different health status classes, offering an objective alternative to traditional expert-dependent approaches. By adopting this technique, the proposed method enhances the objectivity and accuracy of the health status classification. The experimental results show that the method is able to accurately describe the trends in the health status of components. Its effectiveness in real-time health assessment and monitoring of satellite components is confirmed. This research provides a valuable reference for further research on satellite component health assessment. It introduces novel and enhanced ideas and methodologies for practical applications
Downloadable Article, 2023
Publication:Applied Sciences, 13, 20230801, 9438
Publisher: 2023
Access Free
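The entry above describes using the Wasserstein distance between telemetry distributions as a health indicator and spectral clustering to set the boundaries between health-status classes. Below is a minimal Python sketch of that idea, not the authors' code; the synthetic telemetry, window sizes and number of clusters are illustrative assumptions.

```python
# Minimal sketch: Wasserstein-distance health indicator + spectral clustering.
import numpy as np
from scipy.stats import wasserstein_distance
from sklearn.cluster import SpectralClustering

rng = np.random.default_rng(0)

# Simulated telemetry: a healthy reference window plus test windows whose
# distribution drifts as the (hypothetical) component degrades.
reference = rng.normal(loc=0.0, scale=1.0, size=2000)
windows = [rng.normal(loc=0.05 * k, scale=1.0 + 0.02 * k, size=2000) for k in range(30)]

# Health indicator: Wasserstein distance of each window to the reference.
indicator = np.array([wasserstein_distance(reference, w) for w in windows])

# Spectral clustering groups the indicator values into health-status classes
# (e.g. healthy / degraded / faulty) without hand-picked thresholds.
labels = SpectralClustering(n_clusters=3, random_state=0).fit_predict(indicator.reshape(-1, 1))
print(indicator.round(3), labels)
```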

Peer-reviewed
Identifying disease-related microbes based on multi-scale variational graph autoencoder embedding Wasserstein distance

Authors:Huan Zhu, Hongxia Hao, Liang Yu
Summary:Abstract: Background: Enormous clinical and biomedical research has demonstrated that microbes are crucial to human health. Identifying associations between microbes and diseases can not only reveal potential disease mechanisms, but also facilitate early diagnosis and promote precision medicine. Due to data perturbation and unsatisfactory latent representations, there is significant room for improvement. Results: In this work, we proposed a novel framework, Multi-scale Variational Graph AutoEncoder embedding Wasserstein distance (MVGAEW), to predict disease-related microbes; it has the ability to resist data perturbation and effectively generates latent representations for both microbes and diseases from the perspective of distribution. First, we calculated multiple similarities and integrated them through similarity network fusion. Subsequently, we obtained node latent representations by an improved variational graph autoencoder. Ultimately, an XGBoost classifier was employed to predict potential disease-related microbes. We also introduced multi-order node embedding reconstruction to enhance the representation capacity, and performed ablation studies to evaluate the contribution of each section of our model. Moreover, we conducted experiments on common drugs and case studies, including Alzheimer's disease, Crohn's disease, and colorectal neoplasms, to validate the effectiveness of our framework. Conclusions: Significantly, our model exceeded other current state-of-the-art methods, exhibiting a great improvement on the HMDAD database
Downloadable Article, 2023
Publication:BMC Biology, 21, 20231201, 1
Publisher: 2023
Access Free

<–—2023———2023——2440—



One-Dimensional Convolutional Wasserstein Generative Adversarial Network Based Intrusion Detection Method for Industrial Control Systems
Authors:Zengyu Cai, Hongyu Du, Haoqi Wang, Jianwei Zhang, Yajie Si, Pengrong Li
Summary:The imbalance between normal and attack samples in the industrial control systems (ICSs) network environment leads to a low recognition rate of the intrusion detection model for a few abnormal samples when classifying. Since traditional machine learning methods can no longer meet the needs of increasingly complex networks, many researchers use deep learning to replace traditional machine learning methods. However, when a large amount of unbalanced data is used for training, the detection performance of deep learning decreases significantly. This paper proposes an intrusion detection method for industrial control systems based on a 1D CWGAN. The 1D CWGAN is a network attack sample generation method that combines a 1D CNN and WGAN. Firstly, the problem of low ICS intrusion detection accuracy caused by a few types of attack samples is analyzed. This method balances the number of various attack samples in the data set from the aspect of data enhancement to improve detection accuracy. According to the temporal characteristics of network traffic, the algorithm uses 1D convolution and 1D transposed convolution to construct the modeling framework of network traffic data of two competing networks and uses gradient penalty instead of weight clipping in the Wasserstein Generative Adversarial Network (WGAN) to generate virtual samples similar to real samples. After a large number of data sets are used for verification, the experimental results show that the method improves the classification performance of the CNN and BiSRU. For the CNN, after data balancing, the accuracy rate is increased by 0.75%, and the accuracy, recall rate and F1 are improved. Compared with the BiSRU without data processing, the accuracy of the 1D CWGAN-BiSRU is increased by 1.34%, and the accuracy, recall and F1 are increased by 7.2%, 3.46% and 5.29%
Downloadable Article, 2023
Publication:Electronics, 12, 20231101, 4653
Publisher: 2023
Access Free
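The abstract above relies on the standard WGAN-GP trick of replacing weight clipping with a gradient penalty. The PyTorch sketch below shows that penalty in isolation; the toy critic, feature length and penalty weight are illustrative assumptions, not the paper's 1D CNN architecture.

```python
# Minimal sketch of the WGAN gradient penalty (assumed lambda_gp = 10).
import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    """Gradient penalty on random interpolates between real and fake samples."""
    eps = torch.rand(real.size(0), 1, device=real.device)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads = torch.autograd.grad(outputs=scores, inputs=interp,
                                grad_outputs=torch.ones_like(scores),
                                create_graph=True)[0]
    return lambda_gp * ((grads.norm(2, dim=1) - 1.0) ** 2).mean()

# Toy critic over flattened traffic feature vectors of length 64 (placeholder).
critic = torch.nn.Sequential(torch.nn.Linear(64, 128), torch.nn.LeakyReLU(0.2),
                             torch.nn.Linear(128, 1))
real = torch.randn(32, 64)
fake = torch.randn(32, 64)
# Critic loss = E[critic(fake)] - E[critic(real)] + gradient penalty.
loss_c = critic(fake).mean() - critic(real).mean() + gradient_penalty(critic, real, fake)
loss_c.backward()
```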


Wasserstein Dissimilarity for Copula-Based Clustering of Time Series with Spatial Information

Authors:Alessia Benevento, Fabrizio Durante
Summary:The clustering of time series with geo-referenced data requires a suitable dissimilarity matrix interpreting the comovements of the time series and taking into account the spatial constraints. In this paper, we propose a new way to compute the dissimilarity matrix, merging both types of information, which leverages on the Wasserstein distance. We then make a quasi-Gaussian assumption that yields more convenient formulas in terms of the joint correlation matrix. The method is illustrated in a case study involving climatological data
Downloadable Article, 2023
Publication:Mathematics, 12, 20231201, 67
Publisher: 2023
Access Free
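As a rough illustration of building a Wasserstein-based dissimilarity matrix for clustering time series, the Python sketch below computes pairwise 1D Wasserstein distances between synthetic series and feeds them to hierarchical clustering. It deliberately omits the paper's copula-based comovement term, quasi-Gaussian formulas and spatial constraints.

```python
# Minimal sketch: pairwise Wasserstein dissimilarity matrix + clustering.
import numpy as np
from scipy.stats import wasserstein_distance
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(1)
series = np.vstack([rng.normal(0, 1, 500),
                    rng.normal(0, 1.1, 500),
                    rng.normal(2, 1, 500),
                    rng.normal(2.1, 1, 500)])

n = series.shape[0]
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        D[i, j] = D[j, i] = wasserstein_distance(series[i], series[j])

# Agglomerative clustering on the condensed dissimilarity matrix.
Z = linkage(squareform(D), method="average")
print(fcluster(Z, t=2, criterion="maxclust"))   # expected grouping: {0,1} and {2,3}
```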

 

Peer-reviewed
Anomaly Detection for Wind Turbines Using Long Short-Term Memory-Based Variational Autoencoder Wasserstein Generation Adversarial Network under Semi-Supervised Training

Authors:Chen Zhang, Tao Yang
Summary:Intelligent anomaly detection for wind turbines using deep-learning methods has been extensively researched and yielded significant results. However, supervised learning necessitates sufficient labeled data to establish the discriminant boundary, while unsupervised learning lacks prior knowledge and heavily relies on assumptions about the distribution of anomalies. A long short-term memory-based variational autoencoder Wasserstein generation adversarial network (LSTM-based VAE-WGAN) was established in this paper to address the challenge of small and noisy wind turbine datasets. The VAE was utilized as the generator, with LSTM units replacing hidden layer neurons to effectively extract spatiotemporal factors. The similarity between the model-fit distribution and true distribution was quantified using Wasserstein distance, enabling complex high-dimensional data distributions to be learned. To enhance the performance and robustness of the proposed model, a two-stage adversarial semi-supervised training approach was implemented. Subsequently, a monitoring indicator based on reconstruction error was defined, with the threshold set at a 99.7% confidence interval for the distribution curve fitted by kernel density estimation (KDE). Real cases from a wind farm in northeast China have confirmed the feasibility and advancement of the proposed model, while also discussing the effects of various applied parameters
Downloadable Article, 2023
Publication:Energies, 16, 20231001, 7008
Publisher: 2023
Access Free
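The thresholding step described above (a 99.7% confidence limit on a KDE-fitted reconstruction-error distribution) can be sketched in a few lines of Python; the gamma-distributed errors below are synthetic stand-ins for the reconstruction errors produced by the LSTM-based VAE-WGAN.

```python
# Minimal sketch: KDE-based 99.7% threshold on reconstruction errors.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)
errors = rng.gamma(shape=2.0, scale=0.05, size=5000)   # healthy-data reconstruction errors

kde = gaussian_kde(errors)
grid = np.linspace(errors.min(), errors.max() * 2, 4000)
pdf = kde(grid)
cdf = np.cumsum(pdf)
cdf /= cdf[-1]                                   # normalize to an empirical CDF
threshold = grid[np.searchsorted(cdf, 0.997)]    # 99.7% point of the fitted curve

new_error = 0.9
print(f"threshold={threshold:.3f}, anomaly={new_error > threshold}")
```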

 
Right Mean for the α − z Bures-Wasserstein Quantum Divergence
Authors:Miran Jeong, Jinmi Hwang, Sejong Kim
Summary:Abstract: The optimization problem to minimize the weighted sum of α−z Bures-Wasserstein quantum divergences to given positive definite Hermitian matrices has been solved. We call the unique minimizer the α − z weighted right mean, which provides a new non-commutative version of generalized mean (Hölder mean). We investigate its fundamental properties, and give many interesting operator inequalities with the matrix power mean including the Cartan mean. Moreover, we verify the trace inequality with the Wasserstein mean and provide bounds for the Hadamard product of two right means
Article, 2023
Publication:Acta Mathematica Scientia, 43, 202309, 2320
Publisher: 2023
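For reference, the classical Bures-Wasserstein distance between positive definite matrices, which the α − z divergence discussed above generalizes, has the closed form d_BW(A,B)^2 = tr(A) + tr(B) - 2 tr((A^{1/2} B A^{1/2})^{1/2}). A minimal Python sketch with illustrative matrices:

```python
# Minimal sketch of the Bures-Wasserstein distance between SPD matrices.
import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein(A, B):
    rootA = sqrtm(A)
    cross = sqrtm(rootA @ B @ rootA)
    # Discard the negligible imaginary part sqrtm may introduce numerically.
    val = np.real(np.trace(A) + np.trace(B) - 2.0 * np.trace(cross))
    return np.sqrt(max(val, 0.0))

A = np.array([[2.0, 0.3], [0.3, 1.0]])
B = np.array([[1.5, -0.2], [-0.2, 0.8]])
print(bures_wasserstein(A, B))
```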

 
Rotated SAR Ship Detection based on Gaussian Wasserstein Distance Loss

Authors:Congan Xu, Hang Su, Long Gao, Junfeng Wu, Wenjun Yan
Summary:Abstract: Deep learning-based rotated ship detection algorithms in Synthetic Aperture Radar images suffer from low detection accuracy and convergence speed due to the boundary discontinuity and angle sensitivity problems. At the same time, in complex scenarios such as inshore, the detection accuracy is limited due to more interference. To address these problems, this paper proposes a rotated SAR ship detection algorithm based on the Gaussian Wasserstein Distance (GWD) loss function and a salient feature extraction network. Based on the anchor-free detection framework, the rotated bounding boxes are converted to two-dimensional Gaussian encodings, the Wasserstein distance between the distributions is used as the loss function for the rotated bounding boxes, and the model is guided to focus on the key features by the salient feature extraction network. Experimental results on the publicly available rotated SAR ship dataset SSDD+ demonstrate that the proposed method obtains remarkable performance compared to one-stage and anchor-free methods; in particular, the average precision (AP) of the proposed method is 79.30% in the nearshore scenario, which is 4.90% higher than the suboptimal method
Article, 2023
Publication:Mobile Networks and Applications : The Journal of SPECIAL ISSUES on Mobility of Systems, Users, Data and Computing, 20230810, 1
Publisher: 2023
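A minimal Python sketch of the Gaussian encoding step described above: each rotated box (cx, cy, w, h, theta) is mapped to a 2D Gaussian and the closed-form Wasserstein-2 distance between the Gaussians is computed. Any nonlinear rescaling the paper may apply on top of this distance to form the final loss is omitted, and the boxes are illustrative.

```python
# Minimal sketch: rotated box -> 2D Gaussian, then closed-form W2 distance.
import numpy as np
from scipy.linalg import sqrtm

def box_to_gaussian(cx, cy, w, h, theta):
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    S = np.diag([w ** 2 / 4.0, h ** 2 / 4.0])
    return np.array([cx, cy]), R @ S @ R.T

def gaussian_w2(box1, box2):
    m1, S1 = box_to_gaussian(*box1)
    m2, S2 = box_to_gaussian(*box2)
    root1 = sqrtm(S1)
    cross = sqrtm(root1 @ S2 @ root1)
    val = np.sum((m1 - m2) ** 2) + np.real(np.trace(S1 + S2 - 2.0 * cross))
    return np.sqrt(max(val, 0.0))

print(gaussian_w2((0, 0, 10, 4, 0.0), (1, 0, 10, 4, np.pi / 12)))
```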


2023



Peer-reviewed
Scalable Gromov–Wasserstein Based Comparison of Biological Time Series
Authors:Natalia Kravtsova, Reginald L. McGee II, Adriana T. Dawes
Summary:Abstract: A time series is an extremely abundant data type arising in many areas of scientific research, including the biological sciences. Any method that compares time series data relies on a pairwise distance between trajectories, and the choice of distance measure determines the accuracy and speed of the time series comparison. This paper introduces an optimal transport type distance for comparing time series trajectories that are allowed to lie in spaces of different dimensions and/or with differing numbers of points possibly unequally spaced along each trajectory. The construction is based on a modified Gromov–Wasserstein distance optimization program, reducing the problem to a Wasserstein distance on the real line. The resulting program has a closed-form solution and can be computed quickly due to the scalability of the one-dimensional Wasserstein distance. We discuss theoretical properties of this distance measure, and empirically demonstrate the performance of the proposed distance on several datasets with a range of characteristics commonly found in biologically relevant data. We also use our proposed distance to demonstrate that averaging oscillatory time series trajectories using the recently proposed Fused Gromov–Wasserstein barycenter retains more characteristics in the averaged trajectory when compared to traditional averaging, which demonstrates the applicability of Fused Gromov–Wasserstein barycenters for biological time series. Fast and user friendly software for computing the proposed distance and related applications is provided. The proposed distance allows fast and meaningful comparison of biological time series and can be efficiently used in a wide range of applications
Article, 2023
Publication:Bulletin of Mathematical Biology : A journal devoted to research at the interface of the life and mathematical sciences, 85, 202308
Publisher: 2023
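The scalability claimed above rests on the closed form of the Wasserstein distance on the real line, W_p(u, v)^p = \int_0^1 |F_u^{-1}(t) - F_v^{-1}(t)|^p dt. A minimal Python sketch with synthetic trajectories (not the paper's software):

```python
# Minimal sketch: one-dimensional Wasserstein distance via quantile functions.
import numpy as np

def wasserstein_1d(x, y, p=2, n_grid=1000):
    t = (np.arange(n_grid) + 0.5) / n_grid      # midpoints of a uniform grid on (0, 1)
    qx = np.quantile(x, t)                       # empirical quantile functions
    qy = np.quantile(y, t)
    return (np.mean(np.abs(qx - qy) ** p)) ** (1.0 / p)

rng = np.random.default_rng(3)
traj_a = np.sin(np.linspace(0, 4 * np.pi, 200)) + 0.1 * rng.normal(size=200)
traj_b = np.sin(np.linspace(0, 4 * np.pi, 150)) + 0.1 * rng.normal(size=150)
print(wasserstein_1d(traj_a, traj_b))
```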

 

An infrared small target detection model via Gather-Excite attention and normalized Wasserstein distance

Authors:Kangjian Sun, Ju Huo, Qi Liu, Shunyuan Yan
Summary:Infrared small target detection (ISTD) is the main research content for defense confrontation, long-range precision strikes and battlefield intelligence reconnaissance. Targets from the aerial view have the characteristics of small size and dim signal. These characteristics affect the performance of traditional detection models. At present, the target detection model based on deep learning has made huge advances. The You Only Look Once (YOLO) series is a classic branch. In this paper, a model with better adaptation capabilities, namely ISTD-YOLOv7, is proposed for infrared small target detection. First, the anchors of YOLOv7 are updated to provide priors. Second, Gather-Excite (GE) attention is embedded in YOLOv7 to exploit feature context and spatial location information. Finally, Normalized Wasserstein Distance (NWD) replaces IoU in the loss function to alleviate the sensitivity of YOLOv7 to location deviations of small targets. Experiments on a standard dataset show that the proposed model has stronger detection performance than YOLOv3, YOLOv5s, SSD, CenterNet, FCOS, YOLOXs, DETR and the baseline model, with a mean Average Precision (mAP) of 98.43%. Moreover, ablation studies indicate the effectiveness of the improved components
Article, 2023
Publication:Mathematical biosciences and engineering : MBE, 20, 20231011, 19040
Publisher: 2023
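A minimal Python sketch of the Normalized Wasserstein Distance used above in place of IoU: axis-aligned boxes are modelled as 2D Gaussians, whose Wasserstein-2 distance reduces to a Euclidean distance between (cx, cy, w/2, h/2) vectors, and the result is mapped to (0, 1] with an exponential. The normalizing constant C is dataset-dependent; the value below is only a placeholder.

```python
# Minimal sketch of the Normalized Wasserstein Distance for small boxes.
import numpy as np

def nwd(box_a, box_b, C=12.8):
    a = np.array([box_a[0], box_a[1], box_a[2] / 2.0, box_a[3] / 2.0])
    b = np.array([box_b[0], box_b[1], box_b[2] / 2.0, box_b[3] / 2.0])
    w2 = np.sqrt(np.sum((a - b) ** 2))          # Wasserstein-2 distance between the Gaussians
    return np.exp(-w2 / C)                       # normalize to (0, 1]

# Two tiny, slightly offset boxes: IoU is very sensitive to the shift,
# while NWD degrades smoothly.
print(nwd((100, 100, 8, 8), (103, 100, 8, 8)))
```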


Peer-reviewed

An Approach for EEG Denoising Based on Wasserstein Generative Adversarial Network
Authors:Yuanzhe Dong, Xi Tang, Qingge Li, Yingying Wang, Naifu
Summary:Electroencephalogram (EEG) recordings often contain artifacts that would lower signal quality. Many efforts have been made to eliminate or at least minimize the artifacts, and most of them rely on visual inspection and manual operations, which is time/labor-consuming, subjective, and incompatible to filter massive EEG data in real-time. In this paper, we proposed a deep learning framework named Artifact Removal Wasserstein Generative Adversarial Network (AR-WGAN), where the well-trained model can decompose input EEG, detect and delete artifacts, and then reconstruct denoised signals within a short time. The proposed approach was systematically compared with commonly used denoising methods including Denoised AutoEncoder, Wiener Filter, and Empirical Mode Decomposition, with both public and self-collected datasets. The experimental results proved the promising performance of AR-WGAN on automatic artifact removal for massive data across subjects, with correlation coefficient up to 0.726±0.033, and temporal and spatial relative root-mean-square error as low as 0.176±0.046 and 0.761±0.046, respectively. This work may demonstrate the proposed AR-WGAN as a high-performance end-to-end method for EEG denoising, with many on-line applications in clinical EEG monitoring and brain-computer interfaces
Article, 2023
Publication:IEEE transactions on neural systems and rehabilitation engineering : a publication of the IEEE Engineering in Medicine and Biology Society, 31, 2023, 3524
Publisher: 2023


 
An Integrated Method Based on Wasserstein Distance and Graph for Cancer Subtype Discovery
Authors:Qingqing Cao, Jianping Zhao, Haiyun Wang, Qi Guan, Chunhou Zheng
Summary:Due to the complexity of cancer pathogenesis at different omics levels, it is necessary to find a comprehensive method to accurately distinguish and find cancer subtypes for cancer treatment. In this paper, we proposed a new cancer multi-omics subtype identification method, WVGMO, which is based on a variational autoencoder measured by the Wasserstein distance and a graph autoencoder. This method depends on two foremost models. The first model is a variational autoencoder measured by the Wasserstein distance (WVAE), which is used to extract potential spatial information of each omic data type. The second model is the graph autoencoder (GAE) with second-order proximity. It has the capability to retain the topological structure information and feature information of the multi-omics data. Cancer subtypes are then identified via k-means clustering. Extensive experiments were conducted on seven different cancers based on four omics data types from TCGA. The results show that WVGMO provides equivalent or even better results than most of the advanced synthesis methods
Article, 2023
Publication:IEEE/ACM transactions on computational biology and bioinformatics, 20, 2023, 3499
Publisher: 2023

Event Generation and Consistence Test for Physics with Sliced Wasserstein Distance

Authors:Pan, Chu-Cheng (Creator), Dong, Xiang (Creator), Sun, Yu-Chang (Creator), Cheng, Ao-Yan (Creator), Wang, Ao-Bo (Creator), Hu, Yu-Xuan (Creator), Cai, Hao (Creator)

Summary:In the field of modern high-energy physics research, there is a growing emphasis on utilizing deep learning techniques to optimize event simulation, thereby expanding the statistical sample size for more accurate physical analysis. Traditional simulation methods often encounter challenges when dealing with complex physical processes and high-dimensional data distributions, resulting in slow performance. To overcome these limitations, we propose a solution based on deep learning with the sliced Wasserstein distance as the loss function. Our method shows its ability on high-precision and large-scale simulations, and demonstrates its effectiveness in handling complex physical processes. By employing an advanced transformer learning architecture, we initiate the learning process from a Monte Carlo sample and generate high-dimensional data while preserving all original distribution features. The generated data samples have passed the consistence test, which is developed to calculate the confidence of the high-dimensional distributions of the generated data samples through permutation tests. This fast simulation strategy, enabled by deep learning, holds significant potential not only for increasing sample sizes and reducing statistical uncertainties but also for applications in numerical integration, which is crucial in partial wave analysis, high-precision sample checks, and other related fields. It opens up new possibilities for improving event simulation in high-energy physics research
Downloadable Archival Material, Undefined, 2023-10-27
Publisher: 2023-10-27
Access Free
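A minimal Python sketch (not the paper's code) of the sliced Wasserstein distance used above as the training loss: both samples are projected onto random directions and the closed-form 1D Wasserstein distances of the projections are averaged. Sample shapes and the number of projections are illustrative assumptions.

```python
# Minimal sketch of the sliced Wasserstein distance between two point clouds.
import numpy as np

def sliced_wasserstein(X, Y, n_proj=200, p=2, seed=0):
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    dirs = rng.normal(size=(n_proj, d))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)   # random unit directions
    total = 0.0
    for u in dirs:
        px, py = np.sort(X @ u), np.sort(Y @ u)
        # Equal sample sizes assumed, so sorted matching is the optimal 1D plan.
        total += np.mean(np.abs(px - py) ** p)
    return (total / n_proj) ** (1.0 / p)

rng = np.random.default_rng(4)
mc_sample = rng.normal(size=(1024, 4))             # stand-in for Monte Carlo events
gen_sample = rng.normal(loc=0.05, size=(1024, 4))  # stand-in for generated events
print(sliced_wasserstein(mc_sample, gen_sample))
```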

<–—2023———2023——2450—


Study of topological quantities of lattice QCD by a modified Wasserstein generative adversarial network
Authors:Gao, Lin (Creator), Ying, Heping (Creator), Zhang, Jianbo (Creator)
Summary:A modified Wasserstein generative adversarial network (M-WGAN) is proposed to study the distribution of the topological charge in lattice QCD based on the Monte Carlo (MC) simulations. We construct new generator and discriminator in M-WGAN to support the generation of high-quality distribution. Our results show that the M-WGAN scheme of the Machine learning should be helpful for us to calculate efficiently the 1D distribution of topological charge compared with the results by the MC simulation alone
Downloadable Archival Material, Undefined, 2023-11-15
Publisher: 2023-11-15
Access Free



2023 see 2022
Computation of Rate-Distortion-Perception Functions With Wasserstein Barycenter

Authors:Chen, Chunhui (Creator), Niu, Xueyan (Creator), Ye, Wenhao (Creator), Wu, Shitong (Creator), Bai, Bo (Creator), Chen, Weichao (Creator), Lin, Sian-Jheng (Creator)

Summary:The nascent field of Rate-Distortion-Perception (RDP) theory is seeing a surge of research interest due to the application of machine learning techniques in the area of lossy compression. The information RDP function characterizes the three-way trade-off between description rate, average distortion, and perceptual quality measured by the discrepancy between probability distributions. However, computing RDP functions has been a challenge due to the introduction of the perceptual constraint, and existing research often resorts to data-driven methods. In this paper, we show that the information RDP function can be transformed into a Wasserstein Barycenter problem. The non-strict convexity brought by the perceptual constraint can be regularized by an entropy regularization term. We prove that the entropy-regularized model converges to the original problem. Furthermore, we propose an alternating iteration method based on the Sinkhorn algorithm to numerically solve the regularized optimization problem. Experimental results demonstrate the efficiency and accuracy of the proposed algorithm
Downloadable Archival Material, Undefined, 2023-04-27
Publisher: 2023-04-27
Access Free
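A minimal Python sketch of the entropy-regularized (Sinkhorn) iteration that the alternating method above builds on; this is the plain two-marginal Sinkhorn loop on synthetic histograms, not the paper's barycenter formulation of the RDP function.

```python
# Minimal sketch of the Sinkhorn algorithm for entropy-regularized OT.
import numpy as np

def sinkhorn(mu, nu, C, eps=0.05, n_iter=500):
    """Entropy-regularized OT plan between histograms mu and nu with cost matrix C."""
    K = np.exp(-C / eps)                  # Gibbs kernel
    u = np.ones_like(mu)
    for _ in range(n_iter):
        v = nu / (K.T @ u)                # alternate scaling updates
        u = mu / (K @ v)
    P = u[:, None] * K * v[None, :]       # transport plan
    return P, np.sum(P * C)               # plan and regularized transport cost

x = np.linspace(0, 1, 50)
mu = np.exp(-((x - 0.3) ** 2) / 0.01); mu /= mu.sum()
nu = np.exp(-((x - 0.7) ** 2) / 0.02); nu /= nu.sum()
C = (x[:, None] - x[None, :]) ** 2
plan, cost = sinkhorn(mu, nu, C)
print(cost)
```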

 Optimizing the Wasserstein GAN for TeV Gamma Ray Detection with VERITAS

Authors:Ribeiro, Deivid (Creator), Zheng, Yuping (Creator), Sankar, Ramana (Creator), Mantha, Kameswara (Creator)
Summary:The observation of very-high-energy (VHE, E>100 GeV) gamma rays is mediated by the imaging atmospheric Cherenkov technique (IACTs). At these energies, gamma rays interact with the atmosphere to create a cascade of electromagnetic air showers that are visible to the IACT cameras on the ground with distinct morphological and temporal features. However, hadrons with significantly higher incidence rates are also imaged with similar features, and must be distinguished with handpicked parameters extracted from the images. The advent of sophisticated deep learning models has enabled an alternative image analysis technique that has been shown to improve the detection of gamma rays, by improving background rejection. In this study, we propose an unsupervised Wasserstein Generative Adversarial Network (WGAN) framework trained on normalized, uncleaned stereoscopic shower images of real events from the VERITAS observatory to extract the landscape of their latent space and optimize against the corresponding inferred latent space of simulated gamma-ray events. We aim to develop a data driven approach to guide the understanding of the extracted features of real gamma-ray images, and will optimize the WGAN to calculate a probabilistic prediction of "gamma-ness" per event. In this poster, we present results of ongoing work toward the optimization of the WGAN, including the exploration of conditional parameters and multi-task learning
Downloadable Archival Material, Undefined, 2023-09-21
Publisher: 2023-09-21
Access Free


Peer-reviewed

Target detection based on generalized Bures-Wasserstein distance
Authors:Zhizhong Huang, Lin Zheng
Summary:Abstract: Radar target detection with fewer echo pulses in a non-Gaussian clutter background is a challenging problem. In this instance, the conventional detectors using coherent accumulation are not very satisfactory. In contrast, the matrix detector based on Riemannian manifolds has shown potential on this issue, since the covariance matrix of radar echo data during one coherent processing interval (CPI) has a smooth manifold structure. The Affine Invariant (AI) Riemannian distance between the cell under test (CUT) and the reference cells has been used as a statistic to achieve improved detection performance. This paper uses the Bures-Wasserstein (BW) distance and the Generalized Bures-Wasserstein (GBW) distance on Riemannian manifolds as test statistics of matrix detectors, and proposes a corresponding target detection method. Maximizing the GBW distance is formulated as an optimization problem and is solved by the Riemannian trust-region (RTR) method to achieve enhanced discrimination for target detection. Our evaluation of simulated data and measured data shows that the matrix detector based on the GBW distance leads to a significant performance gain over existing methods
Downloadable Article, 2023
Publication:EURASIP Journal on Advances in Signal Processing, 2023, 20231201, 1
Publisher: 2023
Access Free


Wasserstein Generative Adversarial Networks Based Differential Privacy Metaverse Data Sharing
Authors:Hai Liu, Dequan Xu, Youliang Tian, Changgen Peng, Zhenqiang Wu, Ziyue Wang
Summary:Although differential privacy metaverse data sharing can avoid privacy leakage of sensitive data, randomly perturbing local metaverse data will lead to an imbalance between utility and privacy. Therefore, this work proposed models and algorithms for differential privacy metaverse data sharing using Wasserstein generative adversarial networks (WGAN). Firstly, this study constructed the mathematical model of differential privacy metaverse data sharing by introducing an appropriate regularization term related to the generated data's discriminant probability into the WGAN. Secondly, we established a basic model and algorithm for differential privacy metaverse data sharing using the WGAN based on the constructed mathematical model, and theoretically analyzed the basic algorithm. Thirdly, we established a federated model and algorithm for differential privacy metaverse data sharing using the WGAN by serialized training based on the basic model, and theoretically analyzed the federated algorithm. Finally, based on utility and privacy metrics, we conducted a comparative analysis of the basic algorithm of differential privacy metaverse data sharing using the WGAN, and the experimental results validate the theoretical results, showing that the algorithms of differential privacy metaverse data sharing using the WGAN maintain an equilibrium between privacy and utility
Article, 2023
Publication:IEEE journal of biomedical and health informatics, PP, 20230616
Publisher: 2023

 
2023

Peer-reviewed

A Multi-Objective Geoacoustic Inversion of Modal-Dispersion and Waveform Envelope Data Based on Wasserstein Metric
Summary:The inversion of acoustic field data to estimate geoacoustic parameters has been a prominent research focus in the field of underwater acoustics for several decades. Modal-dispersion curves have been used to invert seabed sound speed and density profiles, but such techniques do not account for attenuation inversion. In this study, a new approach in which modal-dispersion and waveform envelope data are simultaneously inverted under a multi-objective framework is proposed. The inversion is performed using the Multi-Objective Bayesian Optimization (MOBO) method. The posterior probability densities (PPD) of the estimation results are obtained by resampling from the exploited state space using the Gibbs Sampler. In this study, the implemented MOBO approach is compared with individual inversions both from modal-dispersion curves and from the waveform data. In addition, the effective use of the Wasserstein metric from optimal transport theory is explored. The MOBO performance is then tested against two different cost functions based on the L2 norm and the Wasserstein metric, respectively. Numerical experiments are employed to evaluate the effect of different cost functions on inversion performance. It is found that the MOBO approach may have more profound advantages when applied to Wasserstein metrics. Results obtained from our study reveal that the MOBO approach exhibits reduced uncertainty in the inverse results when compared to individual inversion methods, such as modal-dispersion inversion or waveform inversion. However, it is important to note that this enhanced uncertainty reduction comes at the cost of sacrificing accuracy in certain parameters other than the sediment sound speed and attenuation
Downloadable Article, 2023
Publication:Remote Sensing, 15, 20231001, 4893
Publisher: 2023
Access Free
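As a rough illustration of a Wasserstein-type cost function between waveform envelopes, in the spirit of the entry above: normalize the envelopes to unit mass and compute W1 as the L1 distance between their cumulative distributions on a common time grid. The envelopes below are synthetic placeholders, and this is not the authors' exact objective.

```python
# Minimal sketch: W1 misfit between two normalized waveform envelopes.
import numpy as np

def w1_misfit(env_obs, env_sim, dt=1.0):
    p = env_obs / env_obs.sum()                  # treat envelopes as probability masses
    q = env_sim / env_sim.sum()
    # W1 on the line = integral of |CDF difference| over time.
    return np.sum(np.abs(np.cumsum(p) - np.cumsum(q))) * dt

t = np.linspace(0, 10, 1000)
obs = np.exp(-((t - 4.0) ** 2) / 0.5)            # synthetic observed envelope
sim = np.exp(-((t - 4.4) ** 2) / 0.6)            # synthetic simulated envelope
print(w1_misfit(obs, sim, dt=t[1] - t[0]))
```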


 
2023 see 2022
Quantum Wasserstein distance based on an optimization over separable states

Authors:Géza Tóth, József Pitrik
Summary:We define the quantum Wasserstein distance such that the optimization of the coupling is carried out over bipartite separable states rather than bipartite quantum states in general, and examine its properties. Surprisingly, we find that the self-distance is related to the quantum Fisher information. We present a transport map corresponding to an optimal bipartite separable state. We discuss how the quantum Wasserstein distance introduced is connected to criteria detecting quantum entanglement. We define variance-like quantities that can be obtained from the quantum Wasserstein distance by replacing the minimization over quantum states by a maximization. We extend our results to a family of generalized quantum Fisher information quantities
Downloadable Article, 2023
Publication:Quantum, 7, 20231001, 1143
Publisher: 2023
Access Free



Peer-reviewed

Unsupervised domain adaptation for Covid-19 classification based on balanced slice Wasserstein distance

Authors:Jiawei Gu, Xuan Qian, Qian Zhang, Hongliang Zhang, Fang Wu
Summary:Covid-19 has swept the world since 2020, taking millions of lives. In order to seek a rapid diagnosis of Covid-19, deep learning-based Covid-19 classification methods have been extensively developed. However, deep learning relies on many samples with high-quality labels, which is expensive. To this end, we propose a novel unsupervised domain adaptation method to process many different but related Covid-19 X-ray images. Unlike existing unsupervised domain adaptation methods that cannot handle conditional class distributions, we adopt a balanced Slice Wasserstein distance as the metric for unsupervised domain adaptation to solve this problem. Multiple standard datasets for domain adaptation and X-ray datasets of different Covid-19 are adopted to verify the effectiveness of our proposed method. Experimented by cross-adopting multiple datasets as source and target domains, respectively, our proposed method can effectively capture discriminative and domain-invariant representations with better data distribution matching.

• Proposed a Balanced Slice Wasserstein Distance (BSWD) for depth-domain adaptation. • BSWD uses pseudo-labels from pre-trained models to align unlabeled target domains. • BSWD validated via synthetic, standard, and Covid-19 datasets with good performance
Article, 2023
Publication:Computers in Biology and Medicine, 164, 202309
Publisher: 2023

Peer-reviewed
Transition Time Determination of Single-Molecule FRET Trajectories via Wasserstein Distance Analysis in Steady-State Variations in smFRET (WAVE)

Authors:Ting Chen, Fengnan Gao, Yan-Wen Tan
Summary:Many biological molecules respond to external stimuli that can cause their conformational states to shift from one steady state to another. Single-molecule FRET (Fluorescence Resonance Energy Transfer) is of particular interest to not only define the steady-state conformational ensemble usually averaged out in the ensemble of molecules but also characterize the dynamics of biomolecules. To study steady-state transitions, i.e., non-equilibrium transitions, a data analysis methodology is necessary to analyze single-molecule FRET photon trajectories, which contain mixtures of contributions from two steady-state statuses and include non-equilibrium transitions. In this study, we introduce a novel methodology called WAVE (Wasserstein distance Analysis in steady-state Variations in smFRET) to detect and locate non-equilibrium transition positions in FRET trajectories. Our method first utilizes a combined STaSI-HMM (Stepwise Transitions with State Inference Hidden Markov Model) algorithm to convert the original FRET trajectories into discretized trajectories. We then apply Maximum Wasserstein Distance analysis to differentiate the FRET state compositions of the fitting trajectories before and after the non-equilibrium transition. Forward and backward algorithms, based on the Minimum Description Length (MDL) principle, are used to find the refined positions of the non-equilibrium transitions. This methodology allows us to observe changes in experimental conditions in chromophore-tagged biomolecules or vice versa
Article, 2023
Publication:The journal of physical chemistry. B, 127, 20230921, 7819
Publisher: 2023

Peer-reviewed
Absolutely continuous and BV-curves in 1-Wasserstein spaces

Authors:Ehsan Abedi, Zhenhao Li, Timo Schultz
Summary:Abstract: We extend the result of Lisini (Calc Var Partial Differ Equ 28:85–120, 2007) on the superposition principle for absolutely continuous curves in p-Wasserstein spaces to the special case of p = 1. In contrast to the case of p > 1, it is not always possible to have lifts on absolutely continuous curves. Therefore, one needs to relax the notion of a lift by considering curves of bounded variation, or shortly BV-curves, and replace the metric speed by the total variation measure. We prove that any BV-curve in a 1-Wasserstein space can be represented by a probability measure on the space of BV-curves which encodes the total variation measure of the Wasserstein curve. In particular, when the curve is absolutely continuous, the result gives a lift concentrated on BV-curves which also characterizes the metric speed. The main theorem is then applied to the characterization of geodesics and the study of the continuity equation in a discrete setting
Article, 2023
Publication:Calculus of Variations and Partial Differential Equations, 63, 202401
Publisher: 2023

<–—2023———2023——2460— 



Peer-reviewed
Crash injury severity prediction considering data imbalance: A Wasserstein generative adversarial network with gradient penalty approach

Authors:Ye Li, Zhanhao Yang, Lu Xing, Chen Yuan, Fei Liu, Dan

Summary:For each road crash event, it is necessary to predict its injury severity. However, predicting crash injury severity with the imbalanced data frequently results in ineffective classifier. Due to the rarity of severe injuries in road traffic crashes, the crash data is extremely imbalanced among injury severity classes, making it challenging to the training of prediction models. To achieve interclass balance, it is possible to generate certain minority class samples using data augmentation techniques. Aiming to address the imbalance issue of crash injury severity data, this study applies a novel deep learning method, the Wasserstein generative adversarial network with gradient penalty (WGAN-GP), to investigate a massive amount of crash data, which can generate synthetic injury severity data linked to traffic crashes to rebalance the dataset. To evaluate the effectiveness of the WGAN-GP model, we systematically compare performances of various commonly-used sampling techniques (random under-sampling, random over-sampling, synthetic minority over-sampling technique and adaptive synthetic sampling) with respect to dataset balance and crash injury severity prediction. After rebalancing the dataset, this study categorizes the crash injury severity using logistic regression, multilayer perceptron, random forest, AdaBoost and XGBoost. The AUC, specificity and sensitivity are employed as evaluation indicators to compare the prediction performances. Results demonstrate that sampling techniques can considerably improve the prediction performance of minority classes in an imbalanced dataset, and the combination of XGBoost and WGAN-GP performs best with an AUC of 0.794 and a sensitivity of 0.698. Finally, the interpretability of the model is improved by the explainable machine learning technique SHAP (SHapley Additive exPlanation), allowing for a deeper understanding of the effects of each variable on crash injury severity. Findings of this study shed light on the prediction of crash injury severity with data imbalance using data-driven approaches
Article, 2023
Publication:Accident; analysis and prevention, 192, 202311, 107271
Publisher: 2023

Peer-reviewed
Target detection based on generalized Bures–Wasserstein distance

Authors:Zhizhong Huang, Lin Zheng
Summary:Abstract: Radar target detection with fewer echo pulses in a non-Gaussian clutter background is a challenging problem. In this instance, the conventional detectors using coherent accumulation are not very satisfactory. In contrast, the matrix detector based on Riemannian manifolds has shown potential on this issue, since the covariance matrix of radar echo data during one coherent processing interval (CPI) has a smooth manifold structure. The Affine Invariant (AI) Riemannian distance between the cell under test (CUT) and the reference cells has been used as a statistic to achieve improved detection performance. This paper uses the Bures–Wasserstein (BW) distance and the Generalized Bures–Wasserstein (GBW) distance on Riemannian manifolds as test statistics of matrix detectors, and proposes a corresponding target detection method. Maximizing the GBW distance is formulated as an optimization problem and is solved by the Riemannian trust-region (RTR) method to achieve enhanced discrimination for target detection. Our evaluation of simulated data and measured data shows that the matrix detector based on the GBW distance leads to a significant performance gain over existing methods
Article, 2023
Publication:EURASIP Journal on Advances in Signal Processing, 2023, 20231206
Publisher: 2023

Peer-reviewed
Wasserstein Auto-Encoders of Merge Trees (and Persistence Diagrams)

Authors:Mathieu Pont, Julien Tierny
Summary:This paper presents a computational framework for the Wasserstein auto-encoding of merge trees (MT-WAE), a novel extension of the classical auto-encoder neural network architecture to the Wasserstein metric space of merge trees. In contrast to traditional auto-encoders which operate on vectorized data, our formulation explicitly manipulates merge trees on their associated metric space at each layer of the network, resulting in superior accuracy and interpretability. Our novel neural network approach can be interpreted as a non-linear generalization of previous linear attempts [72] at merge tree encoding. It also trivially extends to persistence diagrams. Extensive experiments on public ensembles demonstrate the efficiency of our algorithms, with MT-WAE computations in the orders of minutes on average. We show the utility of our contributions in two applications adapted from previous work on merge tree encoding [72]. First, we apply MT-WAE to merge tree compression, by concisely representing them with their coordinates in the final layer of our auto-encoder. Second, we document an application to dimensionality reduction, by exploiting the latent space of our auto-encoder, for the visual analysis of ensemble data. We illustrate the versatility of our framework by introducing two penalty terms, to help preserve in the latent space both the Wasserstein distances between merge trees, as well as their clusters. In both applications, quantitative experiments assess the relevance of our framework. Finally, we provide a C++ implementation that can be used for reproducibility
Article, 2023
Publication:IEEE transactions on visualization and computer graphics, PP, 20231128
Publisher: 2023
 

Peer-reviewed
Multifrequency matched-field source localization based on Wasserstein metric for probability measures

Authors:Qixuan Zhu, Chao Sun, Mingyang Li
Summary:Matched-field processing (MFP) for underwater source localization serves as a generalized beamforming approach that assesses the correlation between the received array data and a dictionary of replica vectors. In this study, the processing scheme of MFP is reformulated by computing a statistical metric between two Gaussian probability measures with the cross-spectral density matrices (CSDMs). To achieve this, the Wasserstein metric, a widely used notion of metric in the space of probability measures, is employed for developing the processor to attach the intrinsic properties of CSDMs, expressing the underlying optimal value of the statistic. The Wasserstein processor uses the embedded metric structure to suppress ambiguities, resulting in the ability to distinguish between multiple sources. In this foundation, a multifrequency processor that combines the information at different frequencies is derived, providing improved localization statistics with deficient snapshots. The effectiveness and robustness of the Wasserstein processor are demonstrated using acoustic simulation and the event S5 of the SWellEx-96 experiment data, exhibiting correct localization statistics and a notable reduction in ambiguity. Additionally, this paper presents an approach to derive the averaged Bartlett processor by evaluating the Wasserstein metric between two Dirac measures, providing an innovative perspective for MFP
Article, 2023
Publication:The Journal of the Acoustical Society of America, 154, 20231101, 3062
Publisher: 2023

Peer-reviewed

Stable and Fast Deep Mutual Information Maximization Based on Wasserstein Distance
Authors:Xing He, Changgen Peng, Lin Wang, Weijie Tan, Zifan Wang
Summary:Deep learning is one of the most exciting and promising techniques in the field of artificial intelligence (AI), which drives AI applications to be more intelligent and comprehensive. However, existing deep learning techniques usually require a large amount of expensive labeled data, which limit the application and development of deep learning techniques, and thus it is imperative to study unsupervised machine learning. The learning of deep representations by mutual information estimation and maximization (Deep InfoMax or DIM) method has achieved unprecedented results in the field of unsupervised learning. However, in the DIM method, to restrict the encoder to learn more normalized feature representations, an adversarial network learning method is used to make the encoder output consistent with a priori positively distributed data. As we know, the model training of the adversarial network learning method is difficult to converge, because there is a logarithmic function in the loss function of the cross-entropy measure, and the gradient of the model parameters is susceptible to the "gradient explosion" or "gradient disappearance" phenomena, which makes the training of the DIM method extremely unstable. In this regard, we propose a Wasserstein distance-based DIM method to solve the stability problem of model training, and our method is called the WDIM. Subsequently, the training stability of the WDIM method and the classification ability of unsupervised learning are verified on the CIFAR10, CIFAR100, and STL10 datasets. The experiments show that our proposed WDIM method is more stable to parameter updates, has faster model convergence, and at the same time, has almost the same accuracy as the DIM method on the classification task of unsupervised learning. Finally, we also propose a reflection of future research for the WDIM method, aiming to provide a research idea and direction for solving the image classification task with unsupervised learning
Article, 2023
Publication:Entropy (Basel, Switzerland), 25, 20231130
Publisher: 2023


2023



Synthetic high-energy computed tomography image via a Wasserstein generative adversarial network with the convolutional block attention module

Authors:Hai Kong, Zhidong Yuan, Haojie Zhou, Ganglin Liang, Zhonghong Yan, Guanxun Cheng, Zhanli Hu
Summary:Computed tomography (CT) is now universally applied into clinical practice with its non-invasive quality and reliability for lesion detection, which highly improves the diagnostic accuracy of patients with systemic diseases. Although low-dose CT reduces X-ray radiation dose and harm to the human body, it inevitably produces noise and artifacts that are detrimental to information acquisition and medical diagnosis for CT images
Article, 2023
Publication:Quantitative imaging in medicine and surgery, 13, 20230701, 4365
Publisher: 2023


Wasserstein model reduction approach for parametrized flow problems in porous media

Authors:Battisti Beatrice, Blickhan Tobias, Enchery Guillaume, Ehrlacher Virginie, Lombardi Damiano, Mula Olga
Summary:The aim of this work is to build a reduced order model for parametrized porous media equations. The main challenge of this type of problems is that the Kolmogorov width of the solution manifold typically decays quite slowly and thus makes usual linear model order reduction methods inappropriate. In this work, we investigate an adaptation of the methodology proposed in [Ehrlacher et al., Nonlinear model reduction on metric spaces. Application to one-dimensional conservative PDEs in Wasserstein spaces, ESAIM: Mathematical Modelling and Numerical Analysis (2020)], based on the use of Wasserstein barycenters [Agueh & Carlier, Barycenters in the Wasserstein Space, SIAM Journal on Mathematical Analysis (2011)], to the case of non-conservative problems. Numerical examples in one-dimensional test cases illustrate the advantages and limitations of this approach and suggest further research directions that we intend to explore in the future
Downloadable Article, 2023
Publication:ESAIM: Proceedings and Surveys, 73, 20230101, 28
Publisher: 2023
Access Free
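The reduced-order model above leans on Wasserstein barycenters; in one dimension the barycenter has a simple characterization via quantile averaging, sketched below in Python on illustrative densities (not the paper's porous-media snapshots).

```python
# Minimal sketch: 1D Wasserstein barycenter by averaging quantile functions.
import numpy as np

def quantile_function(x, density, t):
    dx = x[1] - x[0]
    p = density / (density.sum() * dx)           # normalize to a pdf on the grid
    cdf = np.cumsum(p) * dx
    cdf /= cdf[-1]
    return np.interp(t, cdf, x)                  # inverse CDF evaluated at t

x = np.linspace(-5, 5, 1000)
d1 = np.exp(-((x + 2.0) ** 2) / 0.5)
d2 = np.exp(-((x - 2.0) ** 2) / 1.0)

t = np.linspace(1e-3, 1 - 1e-3, 500)
# Equal-weight barycenter: average the two quantile functions pointwise.
q_bar = 0.5 * quantile_function(x, d1, t) + 0.5 * quantile_function(x, d2, t)
# q_bar is the quantile function of the barycenter; histogramming samples of it
# recovers the barycentric density.
print(q_bar[:5])
```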


Peer-reviewed

An Efficient Rep-Style Gaussian-Wasserstein Network: Improved UAV Infrared Small Object Detection for Urban Road Surveillance and Safety
Authors:Tuerniyazi Aibibu, Jinhui Lan, Yiliang Zeng, Weijian Lu, Naiwei
Summary:Owing to the significant application potential of unmanned aerial vehicles (UAVs) and infrared imaging technologies, researchers from different fields have conducted numerous experiments on aerial infrared image processing. To continuously detect small road objects 24 h/day, this study proposes an efficient Rep-style Gaussian-Wasserstein network (ERGW-net) for small road object detection in infrared aerial images. This method aims to resolve problems of small object size, low contrast, few object features, and occlusions. The ERGW-net adopts the advantages of ResNet, Inception net, and YOLOv8 networks to improve object detection efficiency and accuracy by improving the structure of the backbone, neck, and loss function. The ERGW-net was tested on a DroneVehicle dataset with a large sample size and the HIT-UAV dataset with a relatively small sample size. The results show that the detection accuracy of different road targets (e.g., pedestrians, cars, buses, and trucks) is greater than 80%, which is higher than the existing methods
Downloadable Article, 2023
Publication:Remote Sensing, 16, 20231201, 25
Publisher: 2023
Access Free

 
 
 



Peer-reviewed

Data-driven decadal climate forecasting using Wasserstein time-series generative adversarial networks

Authors:Ahmed Bouteska, Marco Lavazza Seranto, Petr Hajek, Mohammad
Summary:Abstract: Recent trends in global climate modeling, coupled with the availability of more fine-scale datasets, have opened up opportunities for deep learning-based climate prediction to improve the accuracy of predictions over traditional physics-based models. For this, however, large ensembles of data are needed. Generative models have recently proven to be a suitable solution to this problem. For a sound generative model for time-series forecasting, it is essential that temporal dynamics are preserved in that the generated data obey the original data distributions over time. Existing forecasting methods aided by generative models are not adequate for capturing such temporal relationships. Recently, generative models have been proposed that generate realistic time-series data by exploiting combinations of unsupervised and supervised learning. However, these models suffer from unstable learning and mode collapse problems. To overcome these issues, here we propose the Wasserstein Time-Series Generative Adversarial Network (WTGAN), a new forecasting model that effectively imitates the dynamics of the original data by generating realistic synthetic time-series data. To validate the proposed forecasting model, we evaluate it by backtesting the challenging decadal climate forecasting problem. We show that the proposed forecasting model outperforms state-of-the-art generative models. Another advantage of the proposed model is that once WTGAN is tuned, generating time-series data is very fast, whereas standard simulators consume considerable computer time. Thus, a large amount of climate data can be generated, which can substantially improve existing data-driven climate forecasting models
Article, 2023
Publication:Annals of Operations Research, 20231201, 1
Publisher: 2023

<–—2023———2023——2470—



Peer-reviewed

Improving cement production process with data-augmented sequence to sequence-Wasserstein generative adversarial networks model for accurate prediction of f-CaO

Authors:Ying Zhang, Jinbo Liu, Hui Dang, Yifu Zhang, Gaolu Huang, Junze Jiao, Xiaochen Hao
Summary:This paper proposes a method to address the issue of insufficient capture of temporal dependencies in cement production processes, which is based on a data-augmented Seq2Seq-WGAN (Sequence to Sequence-Wasserstein Generative Adversarial Network) model. Considering the existence of various temporal scales in cement production processes, we use WGAN to generate a large amount of f-CaO label data and employ Seq2Seq to solve the problem of unequal-length input-output sequences. We use the unlabeled relevant variable data as the input to the encoder of the Seq2Seq-WGAN model and use the generated labels as the input to the decoder, thus fully exploring the temporal dependency relationships between input and output variables. We use the hidden vector containing the temporal characteristics of cement produced by the encoder as the initial state of the gate recurrent unit in the decoder to achieve accurate prediction of key points and continuous time. The experimental results show that the Seq2Seq-WGAN model can achieve accurate prediction of continuous time series of free calcium and offer direction for subsequent production planning. This method has high practicality and application prospects, and can provide strong support for the production scheduling of the cement industry
Article, 2023
Publication:The Review of scientific instruments, 94, 20231001
Publisher: 2023



2023 thesis MIT

Proximal gradient algorithms for Gaussian variational inference : optimization in the Bures-Wasserstein space
Authors:Michael Ziyang Diao (Author), Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science
Abstract:Variational inference (VI) seeks to approximate a target distribution [pi] by an element of a tractable family of distributions. Of key interest in statistics and machine learning is Gaussian VI, which approximates [pi] by minimizing the Kullback-Leibler (KL) divergence to [pi] over the space of Gaussians. In this work, we develop the (Stochastic) Forward-Backward Gaussian Variational Inference (FB-GVI) algorithm to solve Gaussian VI. Our approach exploits the composite structure of the KL divergence, which can be written as the sum of a smooth term (the potential) and a non-smooth term (the entropy) over the Bures-Wasserstein (BW) space of Gaussians endowed with the Wasserstein distance. For our proposed algorithm, we obtain state-of-the-art convergence guarantees when [pi] is log-smooth and log-concave, as well as the first convergence guarantees to first-order stationary solutions when [pi] is only log-smooth. Additionally, in the setting where the potential admits a representation as the average of many smooth component functionals, we develop and analyze a variance-reduced extension to (Stochastic) FB-GVI with improved complexity guarantees
Thesis, Dissertation, English, 2023
Publisher: Massachusetts Institute of Technology, Cambridge, Massachusetts, 2023


Wasserstein GAN Based Underwater Acoustic Channel Simulator
Authors:Mingzhang Zhou, Junfeng Wang, Haixin Sun

Summary:Underwater acoustic channel is a crucial part of the implementation of underwater communications, and its measurement is expensive. This leads to a lack of samples for the training of deep neural network (DNN)-based underwater communication receivers. To solve this problem, a Wasserstein generative adversarial network (WGAN)-based underwater acoustic channel simulator is proposed in this paper for channel sample augmentation. The overlap between the distributions of the measured channel and the generator output is analyzed. Then the CNN-based WGAN is constructed with the earth-mover's distance as part of the loss function. Tested with the measured channels in Wuyuanwan Bay, Xiamen, the proposed WGAN performs steadily in simulating the complete channel impulse responses and the distributions of their single taps. Moreover, a simple DNN-based OFDM channel estimator is built and trained entirely with the generated channels. The simulation results show that, in terms of BER, the DNN-based channel estimator outperforms the LS and MMSE estimators
Chapter, 2023
Publication:2023 IEEE International Conference on Signal Processing, Communications and Computing (ICSPCC), 20231114, 1
Publisher: 2023


The Fibonacci Constant, the Wasserstein Distance, and Biological Tumor Aggressiveness in Prostate Cancer
Authors:Waliszewski, Przemyslaw
Summary:Both epithelial and mesenchymal cells interact with each other at various levels of hierarchical organization forming tissue; a complex dynamic system. The Fibonacci constant sets a limit for self-organization of epithelial cells into tissue structures of the higher order, such as glands. It also determines the optimal use of space available for growth. Patterns of self-organization of cancer cells can be stratified according to the novel parameter, the function of cellular expansion. This function is related to the spatial global capacity fractal dimension D0. That stratification enables a selection of the objective reference set of images for a neural network. Neither normal nor malignant epithelial cells fill the available space in the perfect manner. The first ones self-organize approaching values far from the Fibonacci constant. The second ones self-organize until they reach the value zero. The spatial distribution of cancer cell nuclei can be compared between different prostate carcinomas using a topological measure, the Wasserstein distance. In that way, topological similarity can be quantified in the objective manner. However, values of the Wasserstein distance overlap between the classes of complexity. This may end up in a significant inaccuracy during the automated evaluation of biological tumor aggressiveness by neural networks
Chapter, 2023
Publication:2023 24th International Conference on Control Systems and Computer Science (CSCS), 202305, 229
Publisher: 2023


 
CryoSWD: Sliced Wasserstein Distance Minimization for 3D Reconstruction in Cryo-electron Microscopy

Authors: Mona Zehni, Zhizhen Zhao. ICASSP 2023 - 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Summary: Single particle reconstruction (SPR) in cryo-electron microscopy (cryo-EM) is a prominent imaging method that recovers the 3D shape of a biomolecule, given a large number of its noisy projections from random and unknown views. Recently, CryoGAN [1] cast SPR as an unsupervised distribution matching problem and solved it via a Wasserstein generative adversarial network (WGAN) framework. The approach bypasses the estimation of the projection parameters. The reconstruction criterion in CryoGAN is the Wasserstein-1 distance. Despite the desirable properties of Wasserstein distances (WD), such as continuity and almost-everywhere differentiability, they are difficult to compute and require careful tuning for stable training. The sliced Wasserstein distance (SWD), on the other hand, has shown desirable training stability and is easy to compute. Therefore, we propose to replace the Wasserstein-1 distance with SWD in the CryoGAN framework, hence the name CryoSWD. In low-noise regimes, we show how CryoSWD eliminates the need for a discriminator, which is crucial in CryoGAN. However, coupling CryoSWD with a discriminator boosts its performance, especially in high-noise settings. While performing as well as CryoGAN, CryoSWD does not require a gradient penalty term for stabilizing the training and imposing Lipschitz continuity of the discriminator
Chapter, 2023
Publication:ICASSP 2023 - 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 20230604, 1
Publisher: 2023
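Since this entry swaps the Wasserstein-1 criterion for the sliced Wasserstein distance, a small Monte Carlo sketch of SWD between two point clouds may help fix ideas: project onto random directions, then average the closed-form 1D Wasserstein distances obtained by sorting. It assumes uniform weights and equally sized samples, and is not the CryoSWD reconstruction pipeline.

```python
# Monte Carlo estimate of the sliced Wasserstein distance between two point clouds
# with uniform weights: project onto random directions, then average the closed-form
# 1D Wasserstein distances (sorted-sample matching). Illustrative only, not CryoSWD.
import numpy as np

def sliced_wasserstein(X, Y, n_projections=200, p=1, rng=None):
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    # random unit directions on the sphere
    theta = rng.normal(size=(n_projections, d))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)
    # 1D projections; assumes X and Y contain the same number of points
    px = np.sort(X @ theta.T, axis=0)
    py = np.sort(Y @ theta.T, axis=0)
    return (np.abs(px - py) ** p).mean() ** (1.0 / p)

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
Y = rng.normal(loc=1.0, size=(500, 3))
print(sliced_wasserstein(X, Y, p=1))
```

With enough random projections the estimate stabilizes, and the cost is dominated by the sorting step, which is what makes SWD cheap compared with exact multi-dimensional Wasserstein distances.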


2023



Incipient Fault Detection of CRH Suspension system Based on PRPCA and Wasserstein Distance
Authors: Kangyue Fang, Yunkai Wu, Yang Zhou, Zhiyu Zhu, Qingjun Zeng. 2023 42nd Chinese Control Conference (CCC)
Summary: As an important part of CRH (China Railway High-speed) trains, the stability and stationarity of the suspension system is of great significance to the vehicle system. Based on the framework of probability relevant principal component analysis (PRPCA), a novel data-driven incipient fault detection method is proposed. Firstly, simulation data including fault information are derived from a Simpack-Matlab/Simulink co-simulation platform. Secondly, real-time monitoring of the high-speed train suspension system is proposed based on PRPCA theory combined with the Wasserstein distance. Furthermore, compared with traditional PCA-based fault detection and diagnosis (FDD) methods, the proposed PRPCA-based method performs better and is more suitable for actual fault data with nonlinear and non-Gaussian characteristics. Finally, according to comparison results with other multivariate-statistical-analysis-based methods, the incipient fault detection method proposed in this paper has a higher sensitivity to incipient spring/damping faults of the CRH suspension system
Chapter, 2023
Publication:2023 42nd Chinese Control Conference (CCC), 20230724, 5082
Publisher: 2023

 

An Improved Bayesian Learner Based on Optimized Kernel Density Estimation and Wasserstein Soft Ensemble Strategy

Authors: Zhiwei Ye, Yuanhu Liu, Jiazhi Lv, Yingying Liu, Zhenwei Wu, Wanfang Bai. 2023 IEEE 12th International Conference on Intelligent Data Acquisition and Advanced Computing Systems: Technology and Applications (IDAACS)
Summary: Bayesian classification is a common data analysis and modeling method in data mining. In this paper, an improved ensemble method and optimized kernel density estimation are applied to a Bayesian classifier. Unlike the traditional Bayesian classifier, in which the conditional probability density is calculated from an assumed statistical model, our method estimates the probability values without a model assumption: they are obtained by a kernel density estimator with an optimized window width, and the Wasserstein distance is used to induce diversity among the base classifiers in the integration process. The optimized window parameters are solved by minimizing the Unbiased Cross-Validation (UCV) objective function using Chaos Particle Swarm Optimization (CPSO). The Wasserstein distance is used in a novel way to evaluate the similarity between probability distributions and to generate the weights of the base classifiers. The final ensemble outputs are calculated by a soft voting strategy. The experimental results illustrate the effectiveness and improvements of the proposed method in terms of density curve fitting, ensemble ability, and overall classification accuracy
Chapter, 2023
Publication:2023 IEEE 12th International Conference on Intelligent Data Acquisition and Advanced Computing Systems: Technology and Applications (IDAACS), 1, 20230907, 90
Publisher: 2023
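The abstract above uses the Wasserstein distance between estimated distributions to weight base classifiers before soft voting. The toy sketch below illustrates one such weighting rule with scipy's 1D Wasserstein distance and synthetic classifier scores; the weighting rule, data, and names are illustrative assumptions, and the paper's UCV/CPSO bandwidth optimization is not reproduced.

```python
# Toy sketch: weight base classifiers by how dissimilar their class-conditional score
# distributions are (1D Wasserstein distance), then the weights could be used in soft
# voting. The weighting rule here is an assumption for illustration, not the paper's rule.
import numpy as np
from scipy.stats import wasserstein_distance

def ensemble_weights(score_sets_class0, score_sets_class1):
    """One weight per base classifier, proportional to the Wasserstein separation
    between its scores on the two classes (more separation -> more weight)."""
    sep = np.array([wasserstein_distance(s0, s1)
                    for s0, s1 in zip(score_sets_class0, score_sets_class1)])
    w = np.exp(sep - sep.max())          # softmax-style normalization
    return w / w.sum()

rng = np.random.default_rng(1)
# scores of 3 hypothetical base classifiers on samples of class 0 and class 1
s0 = [rng.normal(0.3, 0.1, 200), rng.normal(0.45, 0.1, 200), rng.normal(0.4, 0.2, 200)]
s1 = [rng.normal(0.7, 0.1, 200), rng.normal(0.55, 0.1, 200), rng.normal(0.6, 0.2, 200)]
print(ensemble_weights(s0, s1))
```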

 

Point Cloud Registration based on Gaussian Mixtures and Pairwise Wasserstein Distance

Authors: Simon Steuernagel, Aaron Kurda, Marcus Baum. 2023 IEEE Symposium Sensor Data Fusion and International Conference on Multisensor Fusion and Integration (SDF-MFI)

Summary: Point cloud registration has plenty of applications in robotics, e.g., for matching two consecutive LiDAR scans in order to estimate the motion of a mobile robot. In particular for dense point clouds, it can be advantageous to work with accumulative features such as Gaussian distributions. In this context, we propose a novel iterative method that directly aligns two Gaussian mixtures. This is achieved using an efficient approximation of the Gaussian Wasserstein distance, which we find to be a suitable metric for capturing the similarity between the shape and position of two components of the mixtures. The method is first analyzed in a simulation study and afterwards further evaluated on real-world data. We find it to be a promising approach for point cloud registration, which can directly be extended to LiDAR odometry
Chapter, 2023
Publication:2023 IEEE Symposium Sensor Data Fusion and International Conference on Multisensor Fusion and Integration (SDF-MFI), 20231127, 1
Publisher: 2023
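The registration method above relies on a Wasserstein-type distance between Gaussian components. For reference, the 2-Wasserstein distance between two Gaussians has a closed form; the sketch below evaluates it directly with scipy's matrix square root, a plain (not especially efficient) implementation rather than the approximation used by the authors.

```python
# Closed-form squared 2-Wasserstein distance between Gaussians N(m1, S1) and N(m2, S2):
#   W2^2 = ||m1 - m2||^2 + tr(S1 + S2 - 2 (S2^{1/2} S1 S2^{1/2})^{1/2}).
# Reference sketch only; the paper uses an efficient approximation of this metric.
import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2_squared(m1, S1, m2, S2):
    s2_half = np.real(sqrtm(S2))
    cross = np.real(sqrtm(s2_half @ S1 @ s2_half))   # discard tiny imaginary round-off
    return float(np.sum((m1 - m2) ** 2) + np.trace(S1 + S2 - 2 * cross))

m1, m2 = np.zeros(2), np.array([1.0, 0.0])
S1 = np.eye(2)
S2 = np.array([[2.0, 0.3], [0.3, 1.0]])
print(gaussian_w2_squared(m1, S1, m2, S2))
```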

 


Multi-scale Wasserstein Shortest-path Graph Kernels for Graph Classification

Authors: Qijun Chen, Hao Tian, Wei Ye
Summary: Graph kernels are conventional methods for computing graph similarities. However, existing R-convolution graph kernels cannot resolve two challenges: 1) comparing graphs at multiple different scales, and 2) considering the distributions of substructures when computing the kernel matrix. These two challenges limit their performance. To mitigate both challenges, we propose a novel graph kernel called the Multi-scale Wasserstein Shortest-Path graph kernel (MWSP), at the heart of which is the multi-scale shortest-path node feature map, each element of which denotes the number of occurrences of a shortest path around a node. The shortest path is represented by the concatenation of all the labels of the nodes in it. Since the shortest-path node feature map can only compare graphs at local scales, we incorporate into it multiple different scales of the graph structure, which are captured by the truncated BFS trees of different depths rooted at each node in a graph. We use the Wasserstein distance to compute the similarity between the multi-scale shortest-path node feature maps of two graphs, considering the distributions of shortest paths. We empirically validate MWSP on various benchmark graph datasets and demonstrate that it achieves state-of-the-art performance on most datasets
Article, 2023
Publication:IEEE Transactions on Artificial Intelligence, 1, 202311, 1
Publisher: 2023


 

A Data-Driven Wasserstein Distributionally Robust Weight-Based Joint Power Optimization for Dynamic Multi-WBAN
Authors: Mingyang Wang, Fengye Hu, Zhuang Ling, Difei Jia, Shuang Li. GLOBECOM 2023 - 2023 IEEE Global Communications Conference

Summary: To improve the reliability of a dynamic multiple wireless body area network (WBAN) system, it is indispensable to comprehensively consider interference mitigation and user data differences. In this paper, we study a multi-WBAN system in which sensors receive radio frequency (RF) signals from the access point (AP) and then transmit the monitored vital signs to the sink node. Considering the dynamic network topology and the individuality of users, we propose a data-driven Wasserstein distributionally robust weight-based joint power allocation (DW-JPA) scheme. In particular, we formulate a sum-weighted transmission rate maximization problem by optimizing the dynamic weight and transmit power ratio subject to data transmission and energy limitation constraints. We divide the problem into a dynamic weight subproblem and a transmission power control subproblem. We utilize the collected physiological data to predict the optimal actual weight assignment. Then, we quantify the criticality of sensors and build an ambiguity set based on the Wasserstein distance for the probability distributions of the criticality. In essence, the optimal weight is obtained by using the distributionally robust optimization (DRO) method. Furthermore, due to the non-convexity of the power control subproblem, we convert the subproblem to a difference of convex (DC) problem and use an iterative algorithm to alternately optimize the power ratio. The results reveal that the proposed scheme achieves a significantly higher weighted transmission rate with physiological data compared with traditional schemes
Chapter, 2023
Publication:GLOBECOM 2023 - 2023 IEEE Global Communications Conference, 20231204, 7031
Publisher: 2023

<–—2023———2023——2480—-



FLWGAN: Federated Learning with Wasserstein Generative Adversarial Network for Brain Tumor Segmentation

Authors: Divya Peketi, Vishnu Chalavadi, C. Krishna Mohan, Yen Wei Chen. 2023 International Joint Conference on Neural Networks (IJCNN)

Summary: Recently, the potential of deep learning in identifying complex patterns has been gaining research interest in medical applications, specifically for brain tumor diagnosis. To segment tumors accurately in brain MRIs, a large amount of data is needed for training deep learning models. Also, hospitals cannot share patient data for centralization on a server, since health records are prone to privacy and ownership challenges. To deal with these challenges, we set up an efficient federated learning (FL) pipeline with Wasserstein generative adversarial networks (FLWGAN) to ensure data privacy and data sufficiency. FL preserves the data privacy of clients by sharing only the trained model parameters with a centralized server instead of raw data. A modified 3D Wasserstein generative adversarial network with gradient penalty (WGAN-GP) is incorporated at the client side to generate image-segmentation pairs for efficiently training segmentation models. Here, a 3D-UNet with an attention module is used for brain MRI segmentation. The attention module is integrated into the 3D-UNet encoder network for effective brain tumor segmentation. Our approach aims to allow each client to benefit from locally available real data and synthetic data. This process enhances learning performance while respecting data privacy. The efficacy of our proposed pipeline is demonstrated on the brain tumor task of the medical segmentation decathlon (MSD) dataset. We designed FLWGAN frameworks for predicting four segmentation tasks, i.e., whole tumor (WT), enhanced tumor (ET), tumor core (TC), and multiclass. Our proposed approach achieves state-of-the-art performance in terms of various segmentation metrics

Chapter, 2023

Publication:2023 International Joint Conference on Neural Networks (IJCNN), 20230618, 1
Publisher: 2023
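This entry, like several others in this list, trains with the WGAN-GP objective. As a generic reference, the sketch below implements the gradient penalty of Gulrajani et al. (2017) with a placeholder fully connected critic; it is not the 3D segmentation model described above, and the data shapes are assumptions.

```python
# Gradient penalty from WGAN-GP (Gulrajani et al., 2017): penalize deviations of the
# critic's gradient norm from 1 at points interpolated between real and fake samples.
# Generic sketch with a placeholder critic, not the FLWGAN segmentation model.
import torch
import torch.nn as nn

critic = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 1))  # placeholder

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    eps = torch.rand(real.size(0), 1)                     # per-sample mixing weight
    x_hat = (eps * real + (1 - eps) * fake).requires_grad_(True)
    scores = critic(x_hat)
    grads, = torch.autograd.grad(outputs=scores, inputs=x_hat,
                                 grad_outputs=torch.ones_like(scores),
                                 create_graph=True)
    return lambda_gp * ((grads.norm(2, dim=1) - 1) ** 2).mean()

real, fake = torch.randn(8, 16), torch.randn(8, 16)
print(gradient_penalty(critic, real, fake).item())
# the full critic loss would be: critic(fake).mean() - critic(real).mean() + penalty
```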


Peer-reviewed

A mechanical derivation of the evolution equation for scintillating crystals: Recombination-diffusion-drift equations, gradient flows and Wasserstein measures
Author:Fabrizio Daví

Summary: In a series of previous papers we obtained, by means of the mechanics of continua with microstructure, the Reaction-Diffusion-Drift equation which describes the evolution of charge carriers in scintillators. Here we deal, first of all, with the consequences of constitutive assumptions for the entropic and dissipative terms. In the case of the Boltzmann-Gibbs entropy, we show that the equation admits a gradient-flow structure; moreover, we show that the drift-diffusion part is a Wasserstein gradient flow, and we show how the energy dissipation is correlated with an appropriate Wasserstein distance

Article, 2023

Publication:Mechanics Research Communications, 134, December 2023
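For readers unfamiliar with the gradient-flow structure mentioned in this abstract, the standard statement for a generic drift-diffusion equation (Otto calculus and the JKO scheme) is recalled below; the specific entropic and dissipative terms of the scintillator model are in the paper itself, so the free energy here is only the textbook Boltzmann-Gibbs-plus-potential form.

```latex
% Drift-diffusion as the 2-Wasserstein gradient flow of the free energy
%   \mathcal{F}[\rho] = \int \rho \log \rho \, dx + \int V \rho \, dx
% (Boltzmann-Gibbs entropy plus a potential term):
\begin{align}
  \partial_t \rho
    &= \nabla \cdot \Big( \rho \, \nabla \frac{\delta \mathcal{F}}{\delta \rho} \Big)
     = \Delta \rho + \nabla \cdot \big( \rho \, \nabla V \big), \\
  \rho^{k+1}
    &\in \operatorname*{arg\,min}_{\rho}\;
       \frac{1}{2\tau} W_2^2\big(\rho, \rho^{k}\big) + \mathcal{F}[\rho]
       \qquad \text{(JKO time discretization with step } \tau\text{)}.
\end{align}
```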


Distributed IoT Community Detection via Gromov-Wasserstein Metric

Authors: Shih Yu Chang, Yi Chen, Yi-Chih Kao, Hsiao Hwa Chen

Summary: The Internet of Things (IoT) network is a complex system interconnecting different types of devices, e.g., sensors, smartphones, computers, etc. Community detection is a critical component for understanding and managing complex IoT networks. Although several community detection algorithms have been proposed, they generally suffer from several issues, such as the lack of optimal solutions, poor scalability, and difficulty in being applied to a dynamic IoT environment. In this work, we propose a framework that uses Distributed Community Detection (DCD) algorithms based on the Gromov-Wasserstein (GW) metric, namely GW-DCD, to support scalable community detection and address the issues with existing community detection algorithms. The proposed GW-DCD applies the Gromov-Wasserstein metric to detect communities of IoT devices embedded in a Euclidean space or in a graph space. GW-DCD is able to handle community detection problems in a dynamic IoT environment, utilizing the translation/rotation invariance properties of the GW metric. In addition, a distributed community detection approach and parallel matrix computations can be integrated into GW-DCD to shorten its execution time. Finally, a new metric, Gromov-Wasserstein driven mutual information (GWMI), is derived to measure the performance of community detection by considering the internal structure within each community. Numerical experiments for the proposed GW-DCD were conducted with simulated and real-world datasets. Compared to existing community detection algorithms, the proposed GW-DCD achieves much better performance in terms of GWMI and runtime

Article, 2023

Publication:IEEE Internet of Things Journal, PP, 20231129, 1

Publisher: 2023




Peer-reviewed

Multilevel Laser-Induced Pain Measurement with Wasserstein Generative Adversarial Network — Gradient Penalty Model
Authors:Jiancai Leng, Jianqun Zhu, Yihao Yan, Xin Yu, Ming Liu, Yitai Lou, Yanbing Liu, Licai Gao, Yuan Sun, Tianzheng He, Qingbo Yang, Chao Feng, Dezheng Wang, Yang Zhang, Qing Xu, Fangzhou Xu
Summary: Pain is an experience of unpleasant sensations and emotions associated with actual or potential tissue damage. In the global context, billions of people are affected by pain disorders. There are particular challenges in the measurement and assessment of pain, and commonly used pain measuring tools include traditional subjective scoring methods and biomarker-based measures. The main tools for biomarker-based analysis are electroencephalography (EEG), electrocardiography and functional magnetic resonance imaging. EEG-based quantitative pain measurements are of immense value in clinical pain management and can provide objective assessments of pain intensity. The assessment of pain is now primarily limited to identifying the presence or absence of pain, with less research on multilevel pain. A high-power laser stimulation pain experimental paradigm and a five-level pain classification method based on EEG data augmentation are presented. First, the EEG features are extracted using a modified S-transform, and the time-frequency information of the features is retained. Based on the pain recognition effect, the 20-40 Hz frequency band features are optimized. Afterwards the Wasserstein generative adversarial network with gradient penalty is used for feature data augmentation. It can be inferred from the good classification performance of features in the parietal region of the brain that the sensory function of the parietal lobe region is effectively activated during the occurrence of pain. By comparing the latest data augmentation methods and classification algorithms, the proposed method shows significant advantages on the five-level pain dataset. This research provides new ways of thinking and research methods related to pain recognition, which is essential for the study of the neural and regulatory mechanisms of pain
Article, 2023
Publication:International Journal of Neural Systems, 34, 30 November 2023
Publisher: 2023



Peer-reviewed

MLNAN: Multi-level noise-aware network for low-dose CT imaging implemented with constrained cycle Wasserstein generative adversarial network

Authors: Zhenxing Huang, Wenbo Li, Yunling Wang, Zhou Liu, Qiyang Zhang, Yuxi Jin, Ruodai Wu, Guotao Quan, Dong Liang, Zhanli Hu, Na

Summary: Low-dose CT techniques attempt to minimize the radiation exposure of patients by estimating high-resolution normal-dose CT images, reducing the risk of radiation-induced cancer. In recent years, many deep learning methods have been proposed to solve this problem by building a mapping function between low-dose CT images and their high-dose counterparts. However, most of these methods ignore the effect of different radiation doses on the final CT images, which results in large differences in the intensity of the noise observable in CT images. What's more, the noise intensity of low-dose CT images differs significantly across medical device manufacturers. In this paper, we propose a multi-level noise-aware network (MLNAN), implemented with constrained cycle Wasserstein generative adversarial networks, to recover low-dose CT images under uncertain noise levels. In particular, the noise-level classification is predicted and reused as a prior pattern in the generator networks, and the discriminator network introduces noise-level determination. Under two dose-reduction strategies, experiments to evaluate the performance of the proposed method are conducted on two datasets, including the simulated clinical AAPM challenge datasets and commercial CT datasets from United Imaging Healthcare (UIH). The experimental results illustrate the effectiveness of our proposed method in terms of noise suppression and structural detail preservation compared with several other deep-learning-based methods. Ablation studies validate the effectiveness of the individual components regarding the afforded performance improvement. Further research for practical clinical applications and other medical modalities is required in future work
Article, 2023
Publication:Artificial Intelligence In Medicine, 143, September 2023
Publisher: 2023

2023

Modeling Changes in Molecular Dynamics Time Series as Wasserstein Barycentric Interpolations
Authors: Jovan Damjanovic, Yu-Shan Lin, James M. Murphy. 2023 International Conference on Sampling Theory and Applications (SampTA)
Summary:Molecular dynamics (MD) simulations are a powerful computational tool for elucidation of molecular behavior. These simulations generate an abundance of high-dimensional time series data and parsing these data into a human-interpretable format is nontrivial. Clustering trajectory segments obtained via change point detection has been shown to lower memory complexity and yield improved partitioning resolution of the time series compared to the state of the art. However, accurate change point placement is often inhibited by the presence of gradual changes between long-lived metastable states. The trajectory regions corresponding to these gradual changes are not well-modeled by a single distribution, and therefore are frequently over-segmented. In this work, we model such regions using weighted Wasserstein barycentric interpolations between adjacent metastable states, allowing for gradual changes to be resolved correctly. The improved detection performance of our proposed method is demonstrated on a range of toy and real MD simulation data, showing significant potential for faithfully modeling and compressing complex MD simulations
Chapter, 2023
Publication:2023 International Conference on Sampling Theory and Applications (SampTA), 20230710, 1
Publisher: 2023

  

A Wasserstein GAN-based Framework for Adversarial Attacks Against Intrusion Detection Systems
Authors: Fangda Cui, Qiang Ye, Patricia Kibenge-MacLeod. ICC 2023 - IEEE International Conference on Communications
Summary:Intrusion detection system (IDS) has become an essential component of modern communication networks. The major responsibility of an IDS is to monitor communication networks for malicious attacks or policy violations. Over the past years, machine learning (ML) and deep learning (DL) have been employed to construct effective IDS. However, recent studies have shown that the reliability of ML/DL-based IDS is questionable under adversarial attacks. In this paper, we propose a framework based on Wasserstein generative adversarial networks (WGANs) to generate adversarial traffic to evade ML/DL-based IDS. Compared with the existing adversarial attack generation schemes, the proposed framework only involves highly restricted modification operations and the output of the framework is carefully regulated, ultimately preserving the type of the intended malicious traffic. In our research, we validated the effectiveness of the proposed framework by launching adversarial attacks of varied types against multiple ML/DL-based IDS. Our experimental results in terms of detection rate and evasion increase rate indicate that the proposed framework can completely deceive the IDS based on Naive Bayes (NB), Logistic Regression (LR), Random Forest (RF), and Recurrent Neural Network (RNN). In addition, the framework can partially evade the IDS based on Decision Tree (DT), Gradient Boosting (GB), and Multilayer Perceptrons (MLP)
Chapter, 2023
Publication:ICC 2023 - IEEE International Conference on Communications, 20230528, 3187
Publisher: 2023
 


Using Fourier Coefficients and Wasserstein Distances to Estimate Entropy in Time Series
Authors: Scott Perkey, Ana Carvalho, Alberto Krone-Martins. 2023 IEEE 19th International Conference on e-Science (e-Science)

Summary:Time series from real data measurements are often noisy, under-sampled, irregularly sampled, and inconsistent across long-term measurements. Typically, in analyzing these time series, particularly within astronomy, it is common to use estimators such as sample entropy and multi-scale entropy that require interpolation to avoid irregular sampling. In this work, we analyze and consider a new entropy estimator that combines permutations, Fourier Coefficients, and Wasserstein distances to address the concern of irregularly sampled data
Chapter, 2023
Publication:2023 IEEE 19th International Conference on e-Science (e-Science), 20231009, 1
Publisher: 2023

 

Wasserstein Expansible Variational Autoencoder for Discriminative and Generative Continual Learning

Authors: Fei Ye, Adrian G. Bors. 2023 IEEE/CVF International Conference on Computer Vision (ICCV)
Summary:Task-Free Continual Learning (TFCL) represents a challenging learning paradigm where a model is trained on the non-stationary data distributions without any knowledge of the task information, thus representing a more practical approach. Despite promising achievements by the Variational Autoencoder (VAE) mixtures in continual learning, such methods ignore the redundancy among the probabilistic representations of their components when performing model expansion, leading to mixture components learning similar tasks. This paper proposes the Wasserstein Expansible Variational Autoencoder (WEVAE), which evaluates the statistical similarity between the probabilistic representation of new data and that represented by each mixture component and then uses it for deciding when to expand the model. Such a mechanism can avoid unnecessary model expansion while ensuring the knowledge diversity among the trained components. In addition, we propose an energy-based sample selection approach that assigns high energies to novel samples and low energies to the samples which are similar to the model’s knowledge. Extensive empirical studies on both supervised and unsupervised benchmark tasks demonstrate that our model outperforms all competing methods. The code is available at https://github.com/dtuzi123/WEVAE/
Chapter, 2023
Publication:2023 IEEE/CVF International Conference on Computer Vision (ICCV), 20231001, 18619
Publisher: 2023


 
A Novel Conditional Wasserstein Deep Convolutional Generative Adversarial Network

Authors: Arunava Roy, Dipankar Dasgupta

Summary:Generative Adversarial Networks (GAN) and their several variants have not only been used for adversarial purposes but also used for extending the learning coverage of different AI/ML models. Most of these variants are unconditional and do not have enough control over their outputs. Conditional GANs (CGANs) have the ability to control their outputs by conditioning their generator and discriminator with an auxiliary variable (such as class labels, and text descriptions). However, CGANs have several drawbacks such as unstable training, non-convergence and multiple mode collapses like other unconditional basic GANs (where the discriminators are classifiers). DCGANs, WGANs, and MMDGANs enforce significant improvements to stabilize the GAN training although have no control over their outputs. We developed a novel conditional Wasserstein GAN model, called CWGAN (a.k.a RD-GAN named after the initials of the authors' surnames) that stabilizes GAN training by replacing relatively unstable JS divergence with Wasserstein-1 distance while maintaining better control over its outputs. We have shown that the CWGAN can produce optimal generators and discriminators irrespective of the original and input noise data distributions. We presented a detailed formulation of CWGAN and highlighted its salient features along with proper justifications. We showed the CWGAN has a wide variety of adversarial applications including preparing fake images through a CWGAN-based deep generative hashing function and generating highly accurate user mouse trajectories for fooling any underlying mouse dynamics authentications (MDAs). We conducted detailed experiments using well-known benchmark datasets in support of our claims
Article, 2023
Publication:IEEE Transactions on Artificial Intelligence, PP, 20230623, 1
Publisher: 2023
<–—2023———2023——2490—


Peer-reviewed
Wasserstein Regression

Authors: Yaqing Chen, Zhenhua Lin, Hans-Georg Müller
Summary:The analysis of samples of random objects that do not lie in a vector space is gaining increasing attention in statistics. An important class of such object data is univariate probability measures defined on the real line. Adopting the Wasserstein metric, we develop a class of regression models for such data, where random distributions serve as predictors and the responses are either also distributions or scalars. To define this regression model, we use the geometry of tangent bundles of the space of random measures endowed with the Wasserstein metric for mapping distributions to tangent spaces. The proposed distribution-to-distribution regression model provides an extension of multivariate linear regression for Euclidean data and function-to-function regression for Hilbert space-valued data in functional data analysis. In simulations, it performs better than an alternative transformation approach where one maps distributions to a Hilbert space through the log quantile density transformation and then applies traditional functional regression. We derive asymptotic rates of convergence for the estimator of the regression operator and for predicted distributions and also study an extension to autoregressive models for distribution-valued time series. The proposed methods are illustrated with data on human mortality and distributional time series of house prices
Article, 2023
Publication:Journal of the American Statistical Association, 118, 20230403, 869
Publisher: 2023
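For univariate distributions the 2-Wasserstein geometry can be handled through quantile functions, where W_2 reduces to an L_2 distance. The sketch below uses that fact to fit a naive least-squares map from predictor to response quantile curves on synthetic data; it is a simplified stand-in for the tangent-bundle regression developed in the paper, and the fitted curves are not constrained to be monotone.

```python
# For univariate distributions, W_2^2(mu, nu) = \int_0^1 (F_mu^{-1}(t) - F_nu^{-1}(t))^2 dt.
# Sketch: represent each distribution by its empirical quantile function on a grid, then
# fit an ordinary least-squares map from predictor to response quantile curves.
# Simplified illustration of working in Wasserstein coordinates, not the paper's estimator.
import numpy as np

rng = np.random.default_rng(0)
t_grid = np.linspace(0.01, 0.99, 50)

def quantile_curve(sample):
    return np.quantile(sample, t_grid)

# synthetic pairs: predictor distribution N(m, 1), response distribution N(2m + 1, 1)
n_pairs = 200
means = rng.uniform(-2, 2, n_pairs)
X = np.stack([quantile_curve(rng.normal(m, 1.0, 500)) for m in means])         # predictors
Y = np.stack([quantile_curve(rng.normal(2 * m + 1, 1.0, 500)) for m in means])  # responses

# least-squares regression of response quantile curves on predictor quantile curves
X1 = np.hstack([np.ones((n_pairs, 1)), X])
B, *_ = np.linalg.lstsq(X1, Y, rcond=None)

# approximate squared W_2 between fitted and observed response distributions
# (Riemann-type average over the nearly uniform grid of t values)
resid = Y - X1 @ B
print("mean in-sample squared W2 error:", np.mean(resid ** 2, axis=1).mean())
```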


Peer-reviewed
Modified locally joint sparse marginal embedding and wasserstein generation adversarial network for bearing fault diagnosis
Authors:Hongdi Zhou, Hang Zhang, Zhi Li, Fei Zhong
Summary: Rolling bearings are essential parts of manufacturing machines. Vast quantities of features are often extracted from measured signals to comprehensively reflect the condition of bearings, which may cause high dimensionality, information redundancy, and time consumption. In addition, it is extremely difficult, expensive, and time-consuming to collect samples with label information during bearing fault diagnosis in real-world scenarios. In this study, a novel bearing defect diagnosis method for small sample sizes is proposed based on modified local joint sparse marginal embedding (MLJSME) and Wasserstein generative adversarial networks (WGANs). MLJSME can effectively extract intrinsic sparse discriminant features of a high-dimensional dataset by preserving both global and local structures. Graph embedding and a Gaussian kernel function are adopted to preserve the locality structure of the dataset. The global structure and discriminant information are preserved by the maximum margin criterion, which can also avoid the small-sample-size problem. Moreover, joint sparsity is applied to preserve the sparse property and improve robustness to noise and outliers. An abundance of artificial samples can be obtained with a WGAN and a few labeled samples. Firstly, a high-dimensional feature dataset consisting of time-domain and frequency-domain features is extracted from the original vibration signals; then MLJSME is utilized to extract sensitive low-dimensional features, and a small number of low-dimensional features are fed into the WGAN to generate a large number of artificial samples that are used to train the classifier, and the bearing fault types can finally be identified. The effectiveness and feasibility of the proposed method are validated by analyzing different experimental cases
Article, 2023
Publication:Journal of Vibration and Control, 20230717
Publisher: 2023

 
Synthetic Batik Pattern Generator Using Wasserstein Generative Adversarial Network with Gradient Penalty
Authors: Kus Andriadi, Yaya Heryadi, Lukas, Wayan Suparta, Ilvico Sonata. 2023 6th International Seminar on Research of Information Technology and Intelligent Systems (ISRITI)
Summary: Batik is an Indonesian world cultural heritage. Batik consists of many kinds of patterns depending on where the batik comes from, and batik-making techniques continue to develop along with technology. Among the widely used batik-making techniques are hand-writing, stamping, and printing. Batik motifs have been widely used as research material, especially in the field of artificial intelligence. The diverse appearance of batik motifs has attracted many researchers to carry out research on generating synthetic batik patterns, one approach being the Generative Adversarial Network. This paper presents a synthetic batik pattern model based on the Wasserstein Generative Adversarial Network with Gradient Penalty. This model has been shown to create new synthetic batik patterns quite well, producing images almost identical to those provided in the dataset, provided the dataset is large
Chapter, 2023
Publication:2023 6th International Seminar on Research of Information Technology and Intelligent Systems (ISRITI), 20231211, 492
Publisher: 2023

 

Incorporating Least-Effort Loss to Stabilize Training of Wasserstein GAN
Authors:Fanqi Li, Lin Wang, Bo Yang, Pengwei Guan, 2023 International Joint Conference on Neural Networks (IJCNN)
Summary: In order to further improve the convergence properties of generative adversarial networks, in this paper we analyze how stability can be affected by the so-called best-effort manner of the discriminator in the minimax game. We point out that this manner can cause the multistate problem and the optimization entangling problem. To alleviate these, we propose an alternative least-effort loss to regularize the training behavior of the discriminator. With this loss, the discriminator only updates when it is unable to distinguish the distributions. To evaluate the effectiveness of the least-effort loss, we introduce it into the Wasserstein GAN. Experiments on a Dirac delta distribution and image datasets demonstrate that the least-effort loss can effectively improve the convergence properties and generation quality of WGAN. Furthermore, the behaviors of the discriminator and generator during training show that, with the least-effort loss, the state space of the discriminator shrinks, and the optimization of the discriminator and the generator disentangles to some extent
Chapter, 2023
Publication:2023 International Joint Conference on Neural Networks (IJCNN), 20230618, 1
Publisher: 2023



Application of the Wasserstein Distance to identify inter-crystal scatter in a light-sharing depth-encoding PET detector

Authors:E. W. Petersen, A. Goldan, 2023 IEEE Nuclear Science Symposium, Medical Imaging Conference and International Symposium on Room-Temperature Semiconductor Detectors (NSS MIC RTSD)
Summary: In finely pixelated PET detectors, Compton scatter of the incident annihilation photon generates a spatial blurring known as inter-crystal scatter (ICS). This blurring is particularly exacerbated in light-sharing depth-encoding detectors - a design that is otherwise an excellent candidate for cost-effective high-resolution imaging. Accurate identification of ICS in these detectors is crucial to establishing their viability as a high-performance imaging platform. We therefore developed a pair of data-driven ICS identification algorithms - a contour-based method that utilizes only the centroid position of the event and a Wasserstein distance-based method that incorporates the full dimensionality of the detector response pattern. As a proof of concept, both algorithms were tested on experimental calibration data acquired from the Prism-PET brain scanner, and each event was classified as ICS or photoelectric (PE). Results of the classification were evaluated by inspecting distributions of the energy and the DOI estimation parameter (w) for both ICS- and PE-classified events. Excellent classification performance was demonstrated by both methods via suppression of high-energy components of the energy distribution and shaping of the DOI-parameter distribution to align with expectations from the Beer-Lambert relation. However, the Wasserstein-based classification outperformed the contour method, indicating the importance of utilizing the full dimensionality of the input detector response data
Chapter, 2023
Publication:2023 IEEE Nuclear Science Symposium, Medical Imaging Conference and International Symposium on Room-Temperature Semiconductor Detectors (NSS MIC RTSD), 20231104, 1
Publisher: 2023


2023



On characterizing optimal Wasserstein GAN solutions for non-Gaussian data
Authors: Yu-Jui Huang, Shih-Chun Lin, Yu-Chih Huang, Kuan-Hui Lyu, Hsin-Hua Shen, Wan-Yi Lin. 2023 IEEE International Symposium on Information Theory

Summary:The generative adversarial network (GAN) aims to approximate an unknown distribution via a parameterized neural network (NN). While GANs have been widely applied in reinforcement and semi-supervised learning as well as computer vision tasks, selecting their parameters often needs an exhaustive search and only a few selection methods can be proved to be theoretically optimal. One of the most promising GAN variants is the Wasserstein GAN (WGAN). Prior work on optimal parameters for WGAN is limited to the linear-quadratic-Gaussian (LQG) setting, where the NN is linear and the data is Gaussian. In this paper, we focus on the characterization of optimal WGAN parameters beyond the LQG setting. We derive closed-form optimal parameters for one-dimensional WGANs with non-linear sigmoid and ReLU activation functions. Extensions to high-dimensional WGANs are also discussed. Empirical studies show that our closed-form WGAN parameters have good convergence behavior with data under both Gaussian and Laplace distributions
Chapter, 2023
Publication:2023 IEEE International Symposium on Information Theory 



Enhanced data imputation framework for bridge health monitoring using Wasserstein generative adversarial networks with gradient penalty
Authors: Shuai Gao, Chunfeng Wan, Zhenwei Zhou, Jiale Hou, Liyu Xie, Songtao Xue
Summary:The availability of complete data is essential for accurately assessing structural stability and condition in structural health monitoring (SHM) systems. Unfortunately, data missing is a common occurrence in daily monitoring operations, which hinders real-time analysis and evaluation of structural conditions. Although considerable research has been conducted to efficiently recover missing data, the implementation of these recovery methods often encounters issues such as serious mode collapse and gradient vanishing. To address these challenges, this paper proposes a missing data imputation framework called WGAIN-GP based on Wasserstein Generative Adversarial Network with Gradient Penalty. This framework aims to enhance the stability and convergence rate of the network during the missing data recovery process. The effectiveness and robustness of the proposed method are extensively evaluated using measured acceleration data from a long-span highway-railway dual-purpose bridge. The results of the implementation demonstrate that the proposed method achieves superior recovery performance even under various missing data conditions, including high missing rates of up to 90%. Furthermore, the generality of the method is validated by successfully recovering data from different missing sensors. Additionally, the recovered data is utilized for modal analysis of the bridge's structural state, further verifying the reliability of the recovery method. The proposed recovery method offers several advantages, with its stability and robustness being particularly noteworthy. By significantly enhancing the reliability of the recovered data, this method contributes to improving the overall accuracy and effectiveness of structural health monitoring systems
Article, 2023
Publication:Structures, 57, November 2023
Publisher: 2023


 

Finite-Horizon Optimal Control of Continuous-Time Stochastic Systems with Terminal Cost of Wasserstein Distance

Authors: Kenta Hoshino. 2023 62nd IEEE Conference on Decision and Control (CDC)
Summary: This study addresses a stochastic optimal control problem for continuous-time systems aimed at steering the probability distribution of the terminal state towards a desired probability distribution. The problem formulation incorporates the Wasserstein distance, a metric on the space of probability measures, in the cost functional. We provide an optimality condition for this optimal control problem in the form of Pontryagin's minimum principle. The condition is obtained by carefully examining the properties of the Wasserstein distance. Consequently, we obtain an optimality condition described by a forward-backward stochastic differential equation and a Kantorovich potential, which appears in optimal transport theory
Chapter, 2023
Publication:2023 62nd IEEE Conference on Decision and Control (CDC), 20231213, 5825
Publisher: 2023
 

Peer-reviewed

Convergence of the empirical measure in expected Wasserstein distance: non-asymptotic explicit bounds in ℝ^d

Author:Nicolas Fournier
Summary: We provide some non-asymptotic bounds, with explicit constants, that measure the rate of convergence, in expected Wasserstein distance, of the empirical measure associated to an i.i.d. N-sample of a given probability distribution on ℝ^d
Article, 2023
Publication:ESAIM: Probability and Statistics, 27, 2023, 749
Publisher: 2023
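Fournier's bounds control the expected Wasserstein distance between the empirical measure of an i.i.d. N-sample and the underlying distribution. The sketch below only illustrates the qualitative decay numerically in d = 1 for a standard Gaussian, using a large reference sample as a stand-in for the true distribution; the explicit non-asymptotic constants of the paper are not reproduced.

```python
# Numerical illustration of the decay of E[W_1(mu_N, mu)] for the empirical measure of
# an i.i.d. N-sample from a standard Gaussian in d = 1, estimated against a large
# reference sample. Qualitative decay only; not the paper's explicit bounds.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
reference = rng.normal(size=200_000)        # proxy for the true distribution

for N in [100, 1_000, 10_000]:
    est = np.mean([wasserstein_distance(rng.normal(size=N), reference)
                   for _ in range(20)])     # average over 20 independent samples
    print(f"N = {N:6d}   E[W1] approx {est:.4f}")
```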


Imbalanced Fault Diagnosis Using Conditional Wasserstein Generative Adversarial Networks With Switchable Normalization
Authors: Wenlong Fu, Yupeng Chen, Hongyan Li, Xiaoyue Chen, Baojia
Summary:Mechanical equipment usually runs under normal condition (NC), making it prohibitively challenging to collect sufficient fault samples and the dataset is prone to imbalanced characteristics, which severely limits the performance of intelligent fault diagnosis methods. In view of this, a conditional Wasserstein generative adversarial network with switchable normalization (SN-CWGAN) is proposed. First, self-attention mechanism and dense convolutional network (DenseNet) are integrated into SN-CWGAN to enhance the transmission of key features, so as to obtain more discriminative feature information. Simultaneously, switchable normalization is performed within discriminators to increase the generalization capability of the SN-CWGAN model. Then, a two time-scale update rule (TTUR) is applied to improve the convergence speed and stability of the model during training. Accordingly, the SN-CWGAN model can generate high-quality fault samples to balance the dataset. Finally, the AlexNet classifier is trained on the balanced dataset to realize fault diagnosis. The effectiveness of the proposed method is validated by two case studies. The diagnostic results and comparative experiments indicate that the proposed method achieves significant improvements in diagnostic accuracy and stability
Article, 2023
Publication:IEEE Sensors Journal, 23, 20231201, 29119
Publisher: 2023

<–—2023———2023——2500-



Peer-reviewed

Bures–Wasserstein Minimizing Geodesics between Covariance Matrices of Different Ranks
Authors: Yann Thanwerdas, Xavier Pennec

Article, 2023
Publication:SIAM Journal on Matrix Analysis and Applications, 44, 20230930, 1447
Publisher: 2023
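For full-rank covariance matrices the Bures-Wasserstein geodesic has a well-known closed form via the optimal transport map; the sketch below implements only that non-degenerate case, whereas the paper above treats geodesics between matrices of different ranks, which this formula does not cover.

```python
# Bures-Wasserstein geodesic between two full-rank covariance matrices S0, S1:
#   T   = S0^{-1/2} (S0^{1/2} S1 S0^{1/2})^{1/2} S0^{-1/2}   (optimal transport map)
#   S_t = ((1-t) I + t T) S0 ((1-t) I + t T)
# Full-rank sketch only; the degenerate / different-rank case needs the analysis in the paper.
import numpy as np
from scipy.linalg import sqrtm, inv

def bw_geodesic(S0, S1, t):
    r0 = np.real(sqrtm(S0))
    r0_inv = inv(r0)
    T = r0_inv @ np.real(sqrtm(r0 @ S1 @ r0)) @ r0_inv
    M = (1 - t) * np.eye(S0.shape[0]) + t * T
    return M @ S0 @ M

S0 = np.array([[2.0, 0.5], [0.5, 1.0]])
S1 = np.array([[1.0, -0.3], [-0.3, 3.0]])
print(bw_geodesic(S0, S1, 0.5))   # midpoint of the geodesic
```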


Peer-reviewed
An enhanced Wasserstein generative adversarial network with Gramian Angular Fields for efficient stock market prediction during market crash periods

Authors: Alireza Ghasemieh, Rasha Kashef

Summary: At the beginning of 2020, the COVID-19 pandemic caused a sharp decline in equity market indices, which remained stagnant for a considerable period. This resulted in significant losses for many investors. Despite extensive research on stock market prediction and the development of various effective models, there has been no specific effort to create a stable model during a financial crisis. Several studies have been conducted to forecast stock market trends and prices using advanced techniques like machine learning, deep learning, generative adversarial networks, and reinforcement learning. However, none of the existing forecasting models address the issue of market crashes, leading to substantial losses. We propose a GAF-EWGAN, a stacking ensemble model that combines enhanced WGANs with Gramian Angular Fields. This model demonstrates a high level of resilience during stock market crashes, effectively preventing investors from experiencing losses and generating significant profits. The GAF-EWGAN model achieved an average annual return of 16.49% across 20 selected stocks. Financial indicators indicate its reliability for real-world transactions
Article, 2023
Publication:Applied Intelligence : The International Journal of Research on Intelligent Systems for Real Life Complex Problems, 53, 202312, 28479
Publisher: 2023

Spatial and channel attention-based conditional Wasserstein GAN for direct and rapid image reconstruction in ultrasound computed tomography

Authors: Xiaoyun Long, Chao Tian

Summary: Ultrasound computed tomography (USCT) is an emerging technology that offers a noninvasive and radiation-free imaging approach with high sensitivity, making it promising for the early detection and diagnosis of breast cancer. The speed-of-sound (SOS) parameter plays a crucial role in distinguishing between benign masses and breast cancer. However, traditional SOS reconstruction methods face challenges in achieving a balance between resolution and computational efficiency, which hinders their clinical application due to high computational complexity and long reconstruction times. In this paper, we propose a novel and efficient approach for direct SOS image reconstruction based on an improved conditional generative adversarial network. The generator directly reconstructs SOS images from time-of-flight information, eliminating the need for intermediate steps. Residual spatial-channel attention blocks are integrated into the generator to adaptively determine the relevance of the arrival time from the transducer pair corresponding to each pixel in the SOS image. An ablation study verified the effectiveness of this module. Qualitative and quantitative evaluation results on breast phantom datasets demonstrate that this method is capable of rapidly reconstructing high-quality SOS images, achieving better generation results and image quality. Therefore, we believe that the proposed algorithm represents a new direction in the research area of USCT SOS reconstruction

Article, 2023
Publication:Biomedical Engineering Letters, 14, 202401, 57
Publisher: 2023



A Wasserstein Generative Adversarial Network-Gradient Penalty-Based Model with Imbalanced Data Enhancement for Network Intrusion Detection

Authors: Gwo-Chuan Lee, Jyun-Hong Li, Zi-Yang Li

Summary: In today's network intrusion detection systems (NIDS), certain types of network attack packets are sparse compared to regular network packets, making them challenging to collect and resulting in significant data imbalances in public NIDS datasets. With respect to attack types with rare data, it is difficult to classify them, even by using various algorithms such as machine learning and deep learning. To address this issue, this study proposes a data augmentation technique based on the WGAN-GP model to enhance the recognition accuracy of sparse attacks in network intrusion detection. The enhanced performance of the WGAN-GP model on sparse attack classes is validated by evaluating three sparse data generation methods, namely Gaussian noise, WGAN-GP, and SMOTE, using the NSL-KDD dataset. Additionally, machine learning algorithms, including KNN, SVM, random forest, and XGBoost, as well as neural network models such as multilayer perceptron neural networks (MLP) and convolutional neural networks (CNN), are applied to classify the enhanced NSL-KDD dataset. Experimental results revealed that the WGAN-GP generation model is the most effective for detecting sparse data probes. Furthermore, a two-stage fine-tuning algorithm based on the WGAN-GP model is developed, fine-tuning the classification algorithms and model parameters to optimize the recognition accuracy of the sparse data probes. The final experimental results demonstrate that the MLP classifier significantly increases the accuracy rate from 74% to 80% after fine-tuning, surpassing all other classifiers. The proposed method exhibits a 10%, 7%, and 13% improvement over untuned Gaussian noise enhancement, untuned SMOTE enhancement, and no enhancement

Downloadable Article, 2023

Publication:Applied Sciences, 13, 20230701, 8132

Publisher: 2023


Universal consistency of Wasserstein k-NN classifier: a negative and some positive results

Author:Donlapark Ponnoprat

Article, 2023

Publication:Information and Inference: A Journal of the IMA, 12, 20230427, 1997

Publisher: 2023



Peer-reviewed

The use of Wasserstein Generative Adversarial Networks in searches for new resonances at the LHC

Authors: Benjamin Lieberman, Salah-Eddine Dahbi, Bruce Mellado

Summary: In the search for physics beyond the standard model, machine learning classifiers provide methods for extracting signals from background processes in data produced at the LHC. Semi-supervised machine learning models are trained on a labeled background and unlabelled signal. When using semi-supervised techniques in the training of machine learning models, over-training can lead to background events incorrectly being labeled as signal events. The extent of false signals generated must therefore be quantified before semi-supervised techniques can be used in resonance searches. In this study, a frequentist methodology is presented to quantify the extent of fake signals generated in the training of semi-supervised DNN classifiers when confronting side-bands and the signal regions. The use of a WGAN is explored as a machine learning based data generator

Article, 2023 

Publication:Journal of Physics: Conference Series, 2586, 20230901

Publisher: 2023


2023



2023 see 2021

Unsupervised Learning Model of Sparse Filtering Enhanced Using Wasserstein Distance for Intelligent Fault Diagnosis
Authors: Govind Vashishtha, Rajesh Kumar
Article, 2023
Publication:Journal of Vibration Engineering & Technologies, 11, 202310, 2985
Publisher: 2023
 

Peer-reviewed

Dynamical Systems and Hamilton–Jacobi–Bellman Equations on the Wasserstein Space and their L2 Representations

Authors: Chloé Jimenez, Antonio Marigonda, Marc Quincampoix
Article, 2023
Publication:SIAM Journal on Mathematical Analysis, 55, 20231031, 5919
Publisher: 2023


Peer-reviewed

The Wasserstein mean of unipotent matrices
Authors: Sejong Kim, Vatsalkumar N. Mer
Article, 2023
Publication:Linear and Multilinear Algebra, 20231226, 1
Publisher: 2023

Sparse super resolution and its trigonometric approximation in the p-Wasserstein distance

Authors: Paul Catala, Mathias Hockmann, Stefan Kunis
Article, 2023
Publication:PAMM, 22, 202303
Publisher: 2023


A Wasserstein Generative Adversarial Network–Gradient Penalty-Based Model with Imbalanced Data Enhancement for Network Intrusion Detection
Authors:Gwo-Chuan Lee, Jyun-Hong Li, Zi-Yang Li
Article, 2023
Publication:Applied Sciences, 13, 20230712, 8132
Publisher: 2023

<–—2023———2023——2510—



Peer-reviewed

A Data-dependent Approach for High-dimensional (Robust) Wasserstein Alignment
Authors: Hu Ding, Wenjie Liu, Mingquan Ye
Article, 2023
Publication:ACM Journal of Experimental Algorithmics, 28, 20231231, 1
Publisher: 2023

3D Magnetic Resonance Image Denoising using Wasserstein Generative Adversarial Network with Residual Encoder-Decoders and Variant Loss Functions

Authors: Hanaa A. Sayed, Anoud A. Mahmoud, Sara S. Mohamed

Article, 2023
Publication:International Journal of Advanced Computer Science and Applications, 14, 2023
Publisher: 2023


Delamination Detection Framework for the Imbalanced Dataset in Laminated Composite Using Wasserstein Generative Adversarial Network-Based Data Augmentation

Authors: Sungjun Kim, Muhammad Muzammil Azad, Jinwoo Song, Heungsoo Kim

Article, 2023
Publication:Applied Sciences, 13, 20231029, 11837
Publisher: 2023


Peer-reviewed

On Distributionally Robust Generalized Nash Games Defined over the Wasserstein Ball

Authors:Filippo Fabiani, Barbara Franci

Summary: In this paper we propose an exact, deterministic, and fully continuous reformulation of generalized Nash games characterized by the presence of soft coupling constraints in the form of distributionally robust (DR) joint chance-constraints (CCs). We first rewrite the underlying uncertain game introducing mixed-integer variables to cope with DR–CCs, where the integer restriction actually amounts to a binary decision vector only, and then extend it to an equivalent deterministic problem with one additional agent handling all those introduced variables. Successively we show that, by means of a careful choice of tailored penalty functions, the extended deterministic game with additional agent can be equivalently recast in a fully continuous setting

Article, 2023

Publication:Journal of Optimization Theory and Applications, 199, 202310, 298

Publisher: 2023

Peer-reviewed

Enhancement Methods of Hydropower Unit Monitoring Data Quality Based on the Hierarchical Density-Based Spatial Clustering of Applications with Noise–Wasserstein Slim Generative Adversarial Imputation Network with a Gradient Penalty

Authors: Fangqing Zhang, Jiang Guo, Fang Yuan, Yuanfeng Qiu, Pei

Summary: In order to solve low-quality problems such as data anomalies and missing data in the condition monitoring data of hydropower units, this paper proposes a monitoring data quality enhancement method based on HDBSCAN-WSGAIN-GP, which improves the quality and usability of the condition monitoring data of hydropower units by combining the advantages of density clustering and a generative adversarial network. First, the monitoring data are grouped according to density level by the HDBSCAN clustering method in combination with the working conditions, and the anomalies in this dataset are detected, recognized adaptively and cleaned. Further combining the superiority of the WSGAIN-GP model in data filling, the missing values in the cleaned data are automatically generated by unsupervised learning of the features and the distribution of real monitoring data. The validation analysis is carried out on the online monitoring dataset of actual operating units, and the comparison experiments show that the clustering contour coefficient (SCI) of the HDBSCAN-based anomaly detection model reaches 0.4935, which is higher than that of the other comparative models, indicating that the proposed model has superiority in distinguishing between valid samples and anomalous samples. The probability density distribution of the data filling model based on WSGAIN-GP is similar to that of the measured data, and the KL divergence, JS divergence and Hellinger distance between the filled data and the original data are close to 0. Compared with filling methods such as SGAIN, GAIN, KNN, etc., the effect of data filling with different missing rates is verified, and the RMSE error of data filling with WSGAIN-GP is lower than that of the other comparative models. The WSGAIN-GP method has the lowest RMSE error under different missing rates, which proves that the proposed filling model has good accuracy and generalization, and the research results in this paper provide a high-quality data basis for subsequent trend prediction and state warning

Article, 2023

Publication:Sensors (Basel, Switzerland), 24, 20231225

Publisher: 2023

  

Research on an Improved Auxiliary Classifier Wasserstein Generative Adversarial Network with Gradient Penalty Fault Diagnosis Method for Tilting Pad Bearing of Rotating Equipment

Authors: Chunlei Zhou, Qingfeng Wang, Yang Xiao, Wang Xiao, Yue Shu

Article, 2023

Publication:Lubricants, 11, 20231002, 423

Publisher: 2023


2023


Peer-reviewed

Wasserstein filter for variable screening in binary classification in the reproducing kernel Hilbert space

Authors: Sanghun Jeong, Choongrak Kim, Hojin Yang

Article, 2023

Publication:Journal of Nonparametric Statistics, 20230716, 1

Publisher: 2023


2023 see 2022. Peer-reviewed

Network intrusion detection based on conditional Wasserstein variational autoencoder with generative adversarial network and one-dimensional convolutional neural networks

Authors: Jiaxing He, Xiaodan Wang, Yafei Song, Qian Xiang, Chen Chen

Publication:Applied Intelligence, 53, 202305, 12416



Peer-reviewed
Exact convergence analysis for Metropolis–Hastings independence samplers in Wasserstein distances
Authors: Austin Brown, Galin L. Jones

Article, 2023
Publication:Journal of Applied Probability, 20230605, 1
Publisher: 2023

2023 see 2022. Peer-reviewed
Global Wasserstein Margin maximization for boosting generalization in adversarial training

Authors: Tingyue Yu, Shen Wang, Xiangzhan Yu

Article, 2023
Publication:Applied Intelligence, 53, 202305, 11490
Publisher: 2023

<–—2023———2023——2520—



Assignment Problems Related to Gromov–Wasserstein Distances on the Real Line

Authors: Robert Beinert, Cosmas Heiss, Gabriele Steidl

Article, 2023
Publication:SIAM Journal on Imaging Sciences, 16, 20230630, 1028
Publisher: 2023



2023. Peer-reviewed

Distributionally robust joint chance-constrained programming with Wasserstein metric

Authors: Yining Gu, Yanjun Wang

Article, 2023

Publication:Optimization Methods and Software, 20230808, 1
Publisher: 2023

Multi-marginal Gromov–Wasserstein transport and barycentres

Authors:Florian BeierRobert BeinertGabriele Steidl

Article, 2023
Publication:Information and Inference: A Journal of the IMA, 12, 20230918, 2753
Publisher: 2023


Peer-reviewed
Covariance-based soft clustering of functional data based on the Wasserstein–Procrustes metric

Authors:Valentina MasarottoGuido Masarotto

Article, 2023
Publication:Scandinavian Journal of Statistics, 20231028
Publisher: 2023


 
Morphological classification of radio galaxies with Wasserstein generative adversarial network-supported augmentation

Authors:Lennart RustigeJanis KummerFlorian GrieseKerstin BorrasMarcus BrüggenPatrick L S ConnorFrank GaedeGregor 

Article, 2023
Publication:RAS Techniques and Instruments, 2, 20230117, 264
Publisher: 2023

  2023


Online machine learning algorithms based on Wasserstein distance
Authors:ZhaoEn LIZhi-Hai ZHANG

Article, 2023
Publication:SCIENTIA SINICA Technologica, 20230601
Publisher: 2023

NATURE OF WASSERSTEIN METRIC ON WAVEFORM SIMILARITY EVALUATION AND EXAMPLE OF APPLICATION TO SEMBLANCE ANALYSIS

Authors:Tatsuki NARAHiroyuki GOTO

Article, 2023
Publication:Japanese Journal of JSCE, 79, 2023, n/a
Publisher: 2023


Wasserstein Generative Adversarial Network optimized with Remora optimization algorithm based Lung Disease Detection using Chest X-Ray Images
Authors:K. RavikumarMohamed Shameem PBeaulah DavidG. Simi 

Article, 2023
Publication:International Journal of Bio-Inspired Computation, 1, 2023
Publisher: 2023
Peer-reviewed
Why Wasserstein Metric Is Useful in Econometrics
Authors:Nguyen Ngoc Thach, Nguyen Duc Trung, R. Noah Padilla
Article, 2023
Publication:International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, 31, 202312, 259
Publisher: 2023

Peer-reviewed
A Data-dependent Approach for High-dimensional (Robust) Wasserstein Alignment
Authors:Hu Ding, Wenjie Liu, Mingquan Ye

Summary:Many real-world problems can be formulated as the alignment between two geometric patterns. Previously, a great amount of research focused on the alignment of two-dimensional (2D) or 3D patterns in the field of computer vision. Recently, the alignment problem in high dimensions has found several novel applications in practice. However, the research is still rather limited in the algorithmic aspect. To the best of our knowledge, most existing approaches are just simple extensions of their counterparts for the 2D and 3D cases and often suffer from issues such as high computational complexity. In this article, we propose an effective framework to compress high-dimensional geometric patterns. Any existing alignment method can be applied to the compressed geometric patterns, and the time complexity can be significantly reduced. Our idea is inspired by the observation that high-dimensional data often have a low intrinsic dimension. Our framework is a "data-dependent" approach whose complexity depends on the intrinsic dimension of the input data. Our experimental results reveal that running the alignment algorithm on compressed patterns can achieve similar quality compared with the results on the original patterns, while the runtimes (including the time cost of compression) are substantially lower.
Article, 2023
Publication:ACM Journal of Experimental Algorithmics, 28, 20230811, 1
Publisher: 2023
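
The compression step described in the summary above can be illustrated with a simple stand-in: project both high-dimensional point sets through one shared low-dimensional map and run an alignment on the compressed patterns. The random projection and the assignment-based alignment below are illustrative assumptions, not the authors' data-dependent construction:

import numpy as np
from scipy.optimize import linear_sum_assignment

def alignment_cost(a, b):
    # Wasserstein-type alignment of two equal-size point sets:
    # optimal one-to-one matching under squared Euclidean cost.
    cost = ((a[:, None, :] - b[None, :, :]) ** 2).sum(axis=2)
    rows, cols = linear_sum_assignment(cost)
    return cost[rows, cols].mean()

rng = np.random.default_rng(1)
x = rng.normal(size=(200, 1000))                  # two high-dimensional patterns
y = x + 0.1 * rng.normal(size=(200, 1000))

proj = rng.normal(size=(1000, 20)) / np.sqrt(20)  # one shared random projection
print(alignment_cost(x @ proj, y @ proj))         # cheap alignment on the compressed patterns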


Peer-reviewed
Image inpainting based on double joint predictive filtering and Wasserstein generative adversarial networks
Authors:Yuanchen Liu, Zhongliang Pan
Article, 2023
Publication:Journal of Electronic Imaging, 32, 20231115, 063008

<–—2023———2023——2530—



Cross-domain recommendation using the Gromov–Wasserstein distance [Gromov--Wasserstein 距離を用いたクロスドメイン推薦]

Authors:熊谷 雄介, 野沢 悠哉, 牛久 雅崇, 横井

Downloadable Article, 2023
Publication:Proceedings of the Annual Conference of JSAI (人工知能学会全国大会論文集), 37 (2023), 2023, 4L2GS402
Publisher: 2023

Privacy-Preserved Evolutionary Graph Modeling via Gromov-Wasserstein Autoregression
Authors:Yue XiangDixin LuoHongteng Xu

Article, 2023
Publication:Proceedings of the AAAI Conference on Artificial Intelligence, 37, 20230626, 14566
Publisher: 2023

Peer-reviewed
Source-Independent Full-Waveform Inversion Based on Convolutional Wasserstein Distance Objective Function
Authors:Shuqi JiangHanming ChenHonghui LiHui ZhouLingqian WangMingkun ZhangChuntao Jiang

Article, 2023
Publication:IEEE Transactions on Geoscience and Remote Sensing, 61, 2023, 1
Publisher: 2023


Wasserstein GAN-Based Digital Twin-Inspired Model for Early Drift Fault Detection in Wireless Sensor Networks

Authors:Md Nazmul Hasan, Sana Ullah Jan, Insoo Koo

Article, 2023
Publication:IEEE Sensors Journal, 23, 20230615, 13327
Publisher: 2023

2023

From p-Wasserstein bounds to moderate deviations

Authors:Xiao FangYuta Koike

Article, 2023
Publication:Electronic Journal of Probability, 28, 20230101
Publisher: 2023

Wasserstein Graph Distance Based on L1–Approximated Tree Edit Distance between Weisfeiler–Lehman Subtrees

Article, 2023
Publication:Proceedings of the AAAI Conference on Artificial Intelligence, 37, 20230626, 7539
Publisher: 2023

Anomaly Detection on Time Series with Wasserstein GAN applied to PHM
Authors:Mélanie DucoffeIlyass HalouiJayant Sen Gupta

Article, 2023
Publication:International Journal of Prognostics and Health Management, 10, 20230604
Publisher: 2023

Optimal transport problems: what is the Wasserstein distance? [最適輸送問題Wasserstein距離って何?]
Author:星野 健太
Downloadable Article, 2023
Publication:Systems, Control and Information (システム/制御/情報), 67, 20230515, 202
Publisher: 2023

Synthetic aperture radar ground target image generation based on improved Wasserstein generative adversarial networks with gradient penalty

Authors:Jiaqiu AiGaowei FanLu JiaZheng QuJun ShiZhicheng Zhao

Article, 2023
Publication:Journal of Applied Remote Sensing, 17, 20230722, 036501
Publisher: 2023

<–—2023———2023——2540—



2023 see 2022
A Wasserstein Distributionally Robust Planning Model for Renewable Sources and Energy Storage Systems Under Multiple Uncertainties

Authors:Junkai LiZhengyang XuHong LiuChengshan WangLiyong 

Article, 2023
Publication:IEEE Transactions on Sustainable Energy, 14, 202307, 1346
Publisher: 2023



Peer-reviewed

Preservers of the p-power and the Wasserstein means on 2x2 matrices
Authors:Richárd SimonDániel Virosztek
Article, 2023
Publication:The Electronic Journal of Linear Algebra, 39, 20230713, 395
Publisher: 2023

2023 see 2022
Distributed Wasserstein Barycenters via Displacement Interpolation

Authors:Pedro Cisneros-VelardeFrancesco Bullo

Article, 2023
Publication:IEEE Transactions on Control of Network Systems, 10, 202306, 785
Publisher: 2023


Peer-reviewed
Nonparametric and Regularized Dynamical Wasserstein Barycenters for Sequential Observations

Authors:Kevin C. ChengEric L. MillerMichael C. HughesShuchin 

Article, 2023
Publication:IEEE Transactions on Signal Processing, 71, 2023, 3164
Publisher: 2023


Peer-reviewed
Gradient Flows for Probabilistic Frame Potentials in the Wasserstein Space

Authors:Clare WickmanKasso A. Okoudjou

Article, 2023
Publication:SIAM journal on mathematical analysis, 55, 2023, 2324
Publisher: 2023


2023



2023 see 2022. Peer-reviewed

A Wasserstein Distance-Based Distributionally Robust Chance-Constrained Clustered Generation Expansion Planning Considering Flexible Resource Investments

Authors:Baorui ChenTianqi LiuXuan LiuChuan HeLu NanLei 

Article, 2023
Publication:IEEE Transactions on Power Systems, 38, 202311, 5635
Publisher: 2023


A Wasserstein generative adversarial network with gradient penalty for active sonar signal reverberation suppression

Authors:Zhen WangHao ZhangWei HuangXiao ChenNing TangYuan 

Article, 2023
Publication:Frontiers in Marine Science, 10, 20231023
Publisher: 2023
 

 
Variational Wasserstein Barycenters with C-cyclical Monotonicity Regularization

Authors:Jinjin ChiZhiyao YangXiming LiJihong OuyangRenchu Guan

Article, 2023
Publication:Proceedings of the AAAI Conference on Artificial Intelligence, 37, 20230626, 7157
Publisher: 2023



Peer-reviewed

Quantitative control of Wasserstein distance between Brownian motion and the Goldstein–Kac telegraph process
Authors:G. BarreraJ. Lukkarinen
Article, 2023
Publication:Annales de l'I.H.P. Probabilités et statistiques, 59, 2023, 933
Publisher: 2023



Optimal transport and the Wasserstein distance for fuzzy measures: an example
Author:Torra, Vicenç
Summary:Probabilities and, in general, additive measures are extensively used in all kinds of applications. A key concept in mathematics is that of a distance. Different distances provide different implementations of what it means to be near. The Wasserstein distance is one of them for probabilities, with interesting properties and a large number of applications; it is based on the optimal transport problem. Non-additive measures, also known as fuzzy measures, capacities and monotonic games, generalize probabilities by replacing the additivity axiom with a monotonicity condition. Applications have been developed for this type of measure. In a recent paper we introduced the optimal transport problem for non-additive measures, which makes it possible to define the Wasserstein distance for non-additive measures; it is based on the (max, +)-transform. We review this definition in this paper and provide some examples. The examples have been computed with an implementation we have provided in Python.
Downloadable Archival Material, English, 2023
Publisher: Umeå universitet, Institutionen för datavetenskap, 2023
Access Free
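
For the classical additive (probability) case that this work generalizes, the Wasserstein distance comes from the Kantorovich optimal transport linear program. A minimal sketch of that baseline for two discrete measures with a generic LP solver; the non-additive, (max, +)-transform setting of the paper is not reproduced here:

import numpy as np
from scipy.optimize import linprog

def discrete_ot_cost(p, q, cost):
    # Kantorovich LP: minimize <cost, plan> subject to the plan having marginals p and q.
    n, m = cost.shape
    constraints = []
    for i in range(n):                 # row sums: sum_j plan[i, j] = p[i]
        row = np.zeros(n * m)
        row[i * m:(i + 1) * m] = 1.0
        constraints.append(row)
    for j in range(m):                 # column sums: sum_i plan[i, j] = q[j]
        col = np.zeros(n * m)
        col[j::m] = 1.0
        constraints.append(col)
    res = linprog(cost.ravel(), A_eq=np.array(constraints),
                  b_eq=np.concatenate([p, q]), bounds=(0, None))
    return res.fun                     # optimal transport cost

# Toy usage: 1-Wasserstein distance between two distributions on the points 0, 1, 2.
points = np.array([0.0, 1.0, 2.0])
cost = np.abs(points[:, None] - points[None, :])
print(discrete_ot_cost(np.array([0.5, 0.5, 0.0]), np.array([0.0, 0.5, 0.5]), cost))  # 1.0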

<–—2023———2023——2550—



Recognition of partial discharge patterns of GIS based on CWGAN-div and Mi-CNN

Authors:LIU HangbinLIN HoufeiCHU JingYE JingLIN Quanwei

Summary:In order to overcome the constraints that the limited number and uneven distribution of samples place on the performance of deep learning models for partial discharge pattern recognition in GIS (gas-insulated switchgear), a CWGAN-div (conditional Wasserstein generative adversarial network with divergence) model is proposed to guide the generation of multi-class partial discharge patterns. It overcomes the training instability of the original GAN (generative adversarial network), enhances the sample data, and reduces the average imbalance rate from 11.01 to 3.03. In comparative experiments with five kinds of classifiers before and after sample enhancement, the mean F1 value of each classifier improved by more than 3.7% after enhancement. In the experiments, the Mi-CNN (multi-input convolutional neural network) model proposed in this paper can use the PRPD (phase-resolved partial discharge) spectra of the ultra-high-frequency method and the ultrasonic method at the same time, and its final mean F1 value reaches 95.8%.
Downloadable Article, 2023
Publication:Zhejiang dianli, 42, 20230801, 75
Publisher: 2023
Access Free

Peer-reviewed
An Intelligent Diagnosis Approach Combining Resampling and CWGAN-GP of Single-to-Mixed Faults of Rolling Bearings Under Unbalanced Small Samples

Authors:Hongwei fanJiateng MaXiangang CaoXuhui ZhangQinghua 

Summary:The rolling bearing is a key component with a high fault rate in rotary machines, and its fault diagnosis is important for the safe and healthy operation of the entire machine. In recent years, deep learning has been widely used for mechanical fault diagnosis. However, during equipment operation the state data are usually unbalanced: the number of effective samples differs between states, and the gap is usually large, which makes it difficult to apply deep learning directly. This paper proposes a new data enhancement method combining resampling and Conditional Wasserstein Generative Adversarial Networks with Gradient Penalty (CWGAN-GP), and uses a gray-image-based Convolutional Neural Network (CNN) to realize the intelligent fault diagnosis of rolling bearings. First, resampling is used to expand the small number of samples to a large level. Second, the conditional label of Conditional Generative Adversarial Networks (CGAN) is combined with WGAN-GP to control the generated samples. Meanwhile, the Maximum Mean Discrepancy (MMD) is used to filter the samples to obtain a high-quality expanded data set. Finally, a CNN is trained on the obtained dataset and carries out the fault classification. In the experiments, single, compound and mixed fault cases of rolling bearings are simulated in turn. For each case, different sets with varying data imbalance ratios are constructed. The results show that the proposed method significantly improves the fault diagnosis accuracy of rolling bearings, which provides a feasible way for the intelligent diagnosis of mechanical components with complex fault modes and unbalanced small data.
Article, 2023
Publication:International Journal of Pattern Recognition and Artificial Intelligence, 37, 17 October 2023
Publisher: 2023
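
Several entries on this page (this one, the GIS partial-discharge model above, and the photovoltaic data-generation scheme further down) rely on the WGAN-GP gradient penalty. A minimal PyTorch sketch of that penalty term for a conditional critic; the critic(samples, labels) interface and the surrounding loss are assumptions for illustration, not code from any of the cited papers:

import torch

def gradient_penalty(critic, real, fake, labels):
    # WGAN-GP: penalize the critic's gradient norm at points interpolated
    # between real and generated samples, pushing the norm toward 1.
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp, labels)
    grads = torch.autograd.grad(outputs=scores.sum(), inputs=interp,
                                create_graph=True)[0]
    grads = grads.reshape(grads.size(0), -1)
    return ((grads.norm(2, dim=1) - 1.0) ** 2).mean()

# The critic loss is then typically
#   critic(fake, labels).mean() - critic(real, labels).mean() + lambda_gp * gradient_penalty(...)
# with lambda_gp commonly set to 10; the generator maximizes critic(fake, labels).mean().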
 
Crossline Reconstruction of 3D Seismic Data Using 3D cWGAN: A Comparative Study on Sleipner Seismic Survey Data
Authors:Jiyun YuDaeung Yoon

Article, 2023
Publication:Applied Sciences, 13, 20230513, 5999

Publisher: 2023


Method for generating image of Wasserstein generative adversarial network (WGAN), involves inputting output result of generator and pre-processed image to discriminator to judge authenticity of image, and obtaining generated image based on improved WGAN model

CN116912349-A

Inventor WU J

Assignee UNIV JIANGXI SCI & TECHNOLOGY

Derwent Primary Accession Number 

2023-B1370M


Method for detecting abnormality of power slice framework based on federated countermeasure learning, involves performing model training on local WGAN-GP model to obtain trained WGAN-GP model, detecting virtual machine operation data by using trained WGAN-GP model to obtain detection result

CN117076050-A

Inventors ZHAO Z; WU C; (...); ZHANG D

Assignees STATE GRID CHONGQING ELECTRIC POWER CO and STATE GRID CORP CHINA

Derwent Primary Accession Number 

2023-C72301


2023



Method for predicting cloud resource based on machine learning, involves performing data normalization and using WGAN-GP training data and extracting information by using BIGRU network and calculating the similarity and predicting method

CN116489039-A

Inventors LIU Y; CHEN J; (...); XIE X

Assignee UNIV GUILIN TECHNOLOGY

Derwent Primary Accession Number 

2023-793043

 


Anti-sample generating system based on WGAN-Unet for intelligent traffic, has generator for constructing Unet architecture according to original sample data, performing feature extraction on input sample, pool and up-sampling, and outputting confrontation disturbance corresponding to original sample

CN115761399-A

Inventors CHEN Y; SUN L; (...); QIN Z

Assignee UNIV SOUTHEAST

Derwent Primary Accession Number 

2023-26661P



Wasserstein distance estimation method, involves minimizing optimal transport loss by sampling data sub-sets from generated data, and computing Wasserstein distances between each pair of first and second subsets

VN95333-A

Inventors NGUYEN B K; NGUYEN D Q; (...); PHONG Q D

Assignee VINAI AI RES & APPL JOINT STOCK CO

Derwent Primary Accession Number 

2023-79367W

 

Method for training three-dimensional point cloud self-encoder, involves using adaptive-sliced Wasserstein algorithm to evaluate sliced Wasserstein distance to ensure that evaluated value is close enough to true value

VN95973-A

Inventors NGUYEN D T; PHAM H T and HOA B S

Assignee VINAI AI RES & APPL JOINT STOCK CO

Derwent Primary Accession Number 

2023-B45698
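
The sliced Wasserstein distance referenced in this patent averages one-dimensional Wasserstein distances over random projection directions, which is cheap because the 1-D case reduces to sorting. A minimal sketch for two equal-size point clouds; the number of projections is an arbitrary choice, and the adaptive slicing of the patent is not reproduced:

import numpy as np

def sliced_wasserstein(x, y, n_projections=100, seed=0):
    # Monte Carlo estimate of the sliced 2-Wasserstein distance between
    # two equal-size point clouds of shape (n_points, dim).
    rng = np.random.default_rng(seed)
    directions = rng.normal(size=(n_projections, x.shape[1]))
    directions /= np.linalg.norm(directions, axis=1, keepdims=True)  # unit vectors
    total = 0.0
    for d in directions:
        px, py = np.sort(x @ d), np.sort(y @ d)   # project, then sort
        total += np.mean((px - py) ** 2)          # 1-D W2^2 via sorted matching
    return np.sqrt(total / n_projections)

rng = np.random.default_rng(1)
cloud_a = rng.normal(size=(1024, 3))
cloud_b = rng.normal(size=(1024, 3)) + np.array([0.5, 0.0, 0.0])
print(sliced_wasserstein(cloud_a, cloud_b))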


2023 patent

Risk integration prediction method based on Wasserstein distributed robust optimization, involves determining best integrated strategy, obtaining risk integrated prediction model, and predicting risk

CN116384744-A

Inventors YUAN J; LI J and HAO J

Assignee UNIV CHINESE ACAD SCI

Derwent Primary Accession Number 

2023-736623

<–—2023———2023——2560—



Small object detection method based on semantic enhancement and Gaussian loss, involves evaluating prediction frame result through Gaussian Wasserstein distance loss function to train training model

CN116524274-A

Inventors BAI C; MAO J; (...); CUI J

Assignee UNIV ZHEJIANG TECHNOLOGY

Derwent Primary Accession Number 

2023-839238



Method for facilitating expansion of chiral terahertz spectrum database based on a Wasserstein generative adversarial network-gradient penalty, involves constructing chiral terahertz spectrum database, and finishing expansion of database

CN116741284-A

Inventors CAO Y; SUN H; (...); WU W

Assignee UNIV PEKING

Derwent Primary Accession Number 

2023-98252S



Method for graph convolution structure depth embedded clustering based on sliced-wasserstein distance, involves constructing an adjacent matrix, constructing a self-encoder module, and constructing an integrated network

CN116563587-A

Inventors CHEN H; YING N; (...); GUO C

Assignee UNIV HANGZHOU DIANZI

Derwent Primary Accession Number 

2023-86125J

 


Method for training language comprehension model for performing event detection task, involves providing associative representation learning mechanism and Wasserstein distance-based technique for selecting data in adversarial learning

VN94330-A

Inventor NGUYEN H T

Assignee VINAI AI RES & APPL JOINT STOCK CO

Derwent Primary Accession Number 

2023-79391U

 

 

Computer program for performing visual tracking using Wasserstein distance, has set of instructions for determining next position of target by using distance between distribution estimated for candidate positions and external distribution

KR2023080165-A

Inventors KWON J and KIM Y

Assignee UNIV CHUNG ANG IND ACAD COOP FOUND

Derwent Primary Accession Number 

2023-625809


2023


Typical scene generation method based on the Wasserstein distance measure explicit/implicit model, involves aiming at wind power and photovoltaic data, and respectively generating a first wind power and photovoltaic typical scene set

CN117113628-A

Inventors CHEN H; XU W; (...); WANG C

Assignees UNIV WUHAN and STATE GRID CORP CHINA CO LTD

Derwent Primary Accession Number 

2023-C66719


Method for predicting telecommunication client loss based on condition Wasserstein generative adversarial network used in e.g. telecommunication industry, involves processing telecommunication client data set based on comprehensive generative adversarial network model of WGANGP and CGAN

CN115688048-A

Inventors XIE X; WEI L and SU C

Assignee UNIV CHONGQING POSTS & TELECOM



Method for reconstructing structural health monitoring missing data based on Wasserstein generative adversarial network-Gradient Penalty (WGANGP)-U-shaped encoder-decoder network (Unet), involves verifying data reconstruction effect

CN116502060-A

Inventors GE H; ZHANG W; (...); WAN H

Assignee UNIV INNOVATION CENT YANGTZE RIVER DELTA

Derwent Primary Accession Number 

2023-827988



Method for detecting water surface floating small target based on YOLOv6 network model, involves using standard Gaussian Wasserstein distance to calculate regression loss in model training by YOLOv6 network model

CN116863306-A

Inventors XU S; YUAN B; (...); LI N

Assignee UNIV CHANGZHOU

Derwent Primary Accession Number 

2023-A7447D
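
Several detection patents on this page model each bounding box as a 2-D Gaussian and score pairs of boxes with a Gaussian Wasserstein distance. The underlying closed form for the 2-Wasserstein distance between Gaussians is

W_2^2\bigl(\mathcal{N}(m_1,\Sigma_1),\,\mathcal{N}(m_2,\Sigma_2)\bigr) = \lVert m_1 - m_2 \rVert_2^2 + \operatorname{tr}\Bigl(\Sigma_1 + \Sigma_2 - 2\bigl(\Sigma_2^{1/2}\Sigma_1\Sigma_2^{1/2}\bigr)^{1/2}\Bigr).

For axis-aligned boxes with diagonal covariances built from the box width and height this reduces to a simple expression in the centre offsets and the half-width and half-height differences; the training loss is then usually a monotone remapping of this quantity (the exact remapping used in each patent is not specified here).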



Cross-scale defect detection method, involves detecting bounding box prediction result and classification result of defect, using Wasserstein distance as loss function, training and weight updating of model, and obtaining final defect detection model

CN117094999-A; CN117094999-B

Inventors SHAN Z; GAO C; (...); WANG J

Assignee UNIV NANJING AERONAUTICS & ASTRONAUTICS

Derwent Primary Accession Number 

2023-C9253S

<–—2023———2023——2570—





 

Method for facilitating unsupervised reinforcement learning based on Wasserstein distance, involves replacing reward fed back from environment in target reinforcement learning framework with pseudo reward, and guiding current policy of agent to maintain maximum distance from other historical policy

US11823062-B1

Inventors JIANG Y; HE S and JI X

Assignee UNIV TSINGHUA

Derwent Primary Accession Number 

2023-C0176D


 

Method for setting Wasserstein centroid matching layer over message transmission neural network by using electronic device, involves inputting graph representation vector into classifier network to obtain prediction type corresponding to original graph signal

CN117056726-A

Inventors CHU X; WANG X and ZHU W

Assignee UNIV TSINGHUA

Derwent Primary Accession Number 

2023-C29781


 

Single battery consistency detection algorithm based on vehicle-connected network large data platform used in field of new energy automobile and scale energy storage, has Wasserstein distance calculated between abnormal and average monomer voltage sequences

CN115792681-A

Inventors GAO K and DAI R

Assignee ZHEJIANG LEAP ENERGY TECHNOLOGY CO LTD

Derwent Primary Accession Number 

2023-30507E


 

Method for detecting early fault of suspension system of high speed train, involves obtaining Wasserstein distance index at each moment under fault working condition by using idea of moving window with reference signal

CN116818377-A

Inventors ZHU Z; ZHOU Y; (...); WU Y

Assignee UNIV JIANGSU SCI & TECHNOLOGY

Derwent Primary Accession Number 

2023-A52034


 

2023


 

Method for localizing underwater sound source based on matching probability measure, used in sonar system, involves designing matching field cost function based on Wasserstein distance, and obtaining underwater sound source positioning results through computer numerical simulation

CN117148273-A; CN117148273-B

Inventors SUN C and ZHU Q

Assignee UNIV NORTHWESTERN POLYTECHNICAL QINGDAO

Derwent Primary Accession Number 

2023-C9062T


 

Method for training detection model using electronic device, involves predicting category of target based on target feature map to obtain prediction box position, and evaluating prediction frame position by Gaussian wasserstein distance loss function

CN117218496-A

Inventors WANG J; LIU Y; (...); WANG W

Assignee HARBIN INST TECHNOLOGY SHENZHEN GRADUATE

Derwent Primary Accession Number 

2023-D3165T


 

Method for detecting abnormality of water supply pipe network by using computer device, involves calculating Wasserstein similarity between node monitoring value vector and predicted value vector, and performing pipe network abnormality judgment according to Wasserstein similarity

CN116108604-A; CN116108604-B

Inventors XIA Z and WANG J

Assignee SICHUAN AOTU ENVIRONMENTAL PROTECTION

Derwent Primary Accession Number 

2023-54767T


 

Performing glutarylation site prediction method based on dual generator Wasserstein generative adversarial network with gradient penalty, involves obtaining glutarylated protein data, taking lysine residue as center of window, and setting fixed window size to intercept sequence fragments

CN116758982-A

Inventors WU M; QI Z and NING Q

Assignee UNIV DALIAN MARITIME

Derwent Primary Accession Number 

2023-A0286B


 

Method for object detection in unmanned aerial vehicle (UAV) images based on multi-scale and Gaussian Wasserstein distance, involves using trained improved feature extraction network for target prediction for UAV images containing small targets in test set

CN116469020-A

Inventors YANG L; MENG L and LI H

Assignee UNIV BEIHANG

Derwent Primary Accession Number 

2023-80277X

<–—2023———2023——2580—


Bearing fault diagnosis method based on multi-scale feature fusion and migration learning, involves inputting feature sequence generated in transformer encoder layer into countermeasure generation network to perform confrontation learning based on wasserstein distance metric

CN116383757-A; CN116383757-B

Inventors ZHAO W; LIU Y; (...); ZOU Y

Assignees UNIV CHANGCHUN and UNIV HARBIN SCI & TECHNOLOGY

Derwent Primary Accession Number 

2023-73672S


 

[HTML] Bearing Fault Diagnosis Using ACWGAN-GP Enhanced by Principal Component Analysis

B Chen, C Tao, J Tao, Y Jiang, P Li - Sustainability, 2023 - mdpi.com

… In this paper, the proposed PCA-ACWGAN-GP model is compared with the ACWGAN-GP

model without principal component analysis (PCA) and the ACGAN model to verify the …

Cited by 3



2023  ebook
Data Generation Scheme for Photovoltaic Power Forecasting Using Wasserstein Gan with Gradient Penalty Combined with Autoencoder and Regression Models

Authors:Sungwoo ParkJaeuk MoonEenjun Hwang
Summary:Machine learning and deep learning (DL)-based forecasting models have shown excellent predictive performances, but they require a large amount of data for model construction. Insufficient data can be augmented using generative adversarial networks (GANs), but these are not effective for generating tabular data. In this paper, we propose a novel data generation scheme that can generate tabular data for photovoltaic power forecasting (PVPF). The proposed scheme consists of the Wasserstein GAN with gradient penalty (WGAN-GP), autoencoder (AE), and regression model. AE guides the WGAN-GP to generate input variables similar to the real data, and the regression model guides the WGAN-GP to generate output variables that well reflect the relationship with the input variables. We conducted extensive comparative experiments with various GAN-based models on different datasets to verify the effectiveness of the proposed scheme. Experimental results show that the proposed scheme generates data similar to real data compared to other models and, as a result, improves the performance of PVPF models. Especially the deep neural network showed 62% and 70% improvements in mean absolute error and root mean squared error, respectively, when using the data generated through the proposed scheme, indicating the effectiveness of the proposed scheme in DL-based forecasting models
eBook, 2023


2023 book

Figalli, Alessio; Glaudo, Federico

An invitation to optimal transport, Wasserstein distances, and gradient flows. 2nd edition. (English) Zbl 1527.49001

EMS Textbooks in Mathematics. Berlin: European Mathematical Society (EMS) (ISBN 978-3-98547-050-1/hbk; 978-3-98547-550-6/ebook). vi, 146 p. (2023).

This is the second edition of a graduate text which gives an introduction to optimal transport theory. The first chapter gives an introduction to the historical roots of optimal transport, with the work of Gaspard Monge and Leonid Kantorovich. Moreover, the basic notions of measure theory and Riemannian geometry are presented. Finally, some examples of transport maps are presented.

Chapter 2 presents the core of optimal transport theory: the solution of Kantorovich's problem for general costs and the solution of Monge's problem for suitable costs. Other applications are presented, such as the polar decomposition and an application to the Euler equation of fluid dynamics.

Chapter 3 presents some connections between optimal transport, gradient flows and partial differential equations. The Wasserstein distances and gradient flows in Hilbert spaces are introduced. Then the authors show that the gradient flow of the entropy functional in the Wasserstein space coincides with the heat equation, following the seminal approach of Jordan, Kinderlehrer and Otto.
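
The Jordan–Kinderlehrer–Otto result mentioned here can be stated concretely: the minimizing-movement (JKO) scheme

\rho^{k+1} \in \operatorname*{arg\,min}_{\rho} \Bigl\{ \tfrac{1}{2\tau}\, W_2^2(\rho, \rho^{k}) + \int \rho \log \rho \, dx \Bigr\}

produces, as the time step \tau \to 0, a curve of probability densities solving the heat equation \partial_t \rho = \Delta \rho; in this sense the heat flow is the gradient flow of the entropy with respect to the 2-Wasserstein distance.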

Chapter 4 is devoted to an analysis of optimal transport from the differential point of view; in particular, several important partial differential equations are interpreted as gradient flows with respect to the 2-Wasserstein distance.

The final Chapter 5 suggests some further reading on optimal transport.

The book also contains two appendices: Appendix A, which presents some exercises on optimal transport, and Appendix B, in which the authors give a sketch of the proof of a disintegration theorem, referring to a book by Luigi Ambrosio, Nicola Fusco and Diego Pallara for a complete proof.

For the first edition of the book, see [A. Figalli and F. Glaudo, An invitation to optimal transport, Wasserstein distances, and gradient flows. Berlin: European Mathematical Society (EMS) (2021; Zbl 1472.49001)].

Reviewer: Antonio Masiello (Bari)

MSC:
49-01 Introductory exposition (textbooks, tutorial papers, etc.) pertaining to calculus of variations and optimal control
49-02 Research exposition (monographs, survey articles) pertaining to calculus of variations and optimal control
49Q22 Optimal transportation
60B05 Probability measures on topological spaces
28A33 Spaces of measures, convergence of measures
35A15 Variational methods applied to PDEs
35Q35 PDEs in connection with fluid mechanics
49N15 Duality theory (optimization)
28A50 Integration and disintegration of measures

Keywords: optimal transport; Wasserstein distance; duality; gradient flows; measure theory; displacement convexity

Citations: Zbl 1472.49001
 

Integration of Metropolis-Hastings Algorithm and WGAN Model ...

books.google.com › books
2023 · No preview


2023



 2023 patent
Clothing attribute editing method based on improved WGAN [基于改进WGAN的服装属性编辑方法]
11/2023
Patent  Available Online
Open Access

2023 patent
Microseismic signal first arrival pickup method based on improved WGAN-GP and...
by SHENG GUANQUN; MA KAI; XIE KAI ; More...
06/2023
The invention provides a microseismic signal first arrival pickup method based on an improved WGAN-GP and Picking-Net, and the method comprises the steps: 1,...
Patent  Available Online
Open Access

2023 patent
One-dimensional time-series data augmentation method based on WGAN [基于WGAN的一维时序数据增广方法]
09/2023
Patent  Available Online
Open Access



2023 patent

A new-energy capacity configuration method based on WGAN scenario simulation and sequential production simulation [一种基于WGAN场景模拟和时序生产模拟的新能源容量配置方法]

09/2023

Patent  Available Online

<–—2023———2023——2590—


Patent  Available Online

2023 patent
An adversarial-perturbation image generation method based on WGAN-GP [一种基于WGAN-GP的对抗扰动图像生成方法]
08/2023
...Patent  Available Online
Open Access



2023 patent see 2022
Semi-supervised malicious traffic detection method based on improved WGAN-GP [基于改进WGAN-GP的半监督恶意流量检测方法]
06/2023
Patent  Available Online 

<–—2023———2023——2600—


Proximal Gradient Algorithms for Gaussian Variational Inference: Optimization in the Bures–Wasserstein Space
2023

Diao, Michael Ziyang



Wasserstein distance and proxies, application and statistics

The Wasserstein distance (WD) has been proven useful in many ...

YouTube · SPChile CL · Nov 28, 2023


Nicolas Champagnat: Wasserstein convergence of penalized Markov processes

Sep 28, 2023

Wasserstein Distance Explained | Data Science Fundamentals

www.youtube.com › watch
YouTube · NannyML · Apr 14, 2023


James Murphy, Intrinsically Low-Dimensional Models for ...

www.youtube.com › watch

47:53

... Wasserstein space. We consider a general barycentric coding model in which data are represented as Wasserstein-2 (W2) barycenters of a set ...

YouTube · CodEx Seminar · Jun 23, 2023



Keenan Eikenberry, Markov Kernels Valued in Wasserstein ...

www.youtube.com › watch

51:00

Speaker: Keenan Eikenberry (Arizona State University) Title: Markov Kernels Valued in Wasserstein Spaces Date: 07.18.2023 Abstract: I'll ...

YouTube · CodEx Seminar · Oct 16, 2023



2023


Giacomo De Palma | October 4, 2022 | The Quantum Wasserstein Distance of Order 1

www.youtube.com › watch
1:09:28

YouTube · Mathematical Picture Language

Carmin.tv · Centre International de Rencontres Mathématiques


<–—2023———2023——2606—end 2023-

                                                                                                                                                                                   

end   end  end  end      yyy c     end  end   end    end   end     end    end    

 

2023   26012606

2022   2257

2021   3034

                                                                                                                                                                             5,