Lecture Notes on Machine Learning
Maximum Product of Numbers of Constant Sum
Christian Bauckhage
B-IT, University of Bonn
In this short note, we prove that the maximum value of the product of n positive numbers whose sum is fixed occurs when all the numbers are equal.
Introduction
In our introductory note on constrained optimization [1], we observed that, for a fixed perimeter, the rectangle of largest area is a square. Here, we briefly show that this observation is but a special case of a more general result, namely: the product of n positive numbers with constant sum is largest when all the numbers are equal.
In the more abstract language of optimization theory, this is to say that, for n numbers x_1, …, x_n where x_i ≥ 0, the constrained maximization problem

\[
P_n = \max_{x_1, \ldots, x_n} \; \prod_{i=1}^{n} x_i
\quad \text{s.t.} \quad \sum_{i=1}^{n} x_i = S_n
\tag{1}
\]

is solved by x_i = S_n / n for all 1 ≤ i ≤ n.
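Before proving this, a quick numerical sanity check can build confidence in the claim. The sketch below (assuming NumPy is available; the choices n = 5, S_n = 10, and the sample size are arbitrary) draws random positive vectors with the prescribed sum and compares their products against the equal split:

```python
import numpy as np

rng = np.random.default_rng(0)
n, S = 5, 10.0

# Dirichlet samples lie on the probability simplex; scaling by S yields
# random positive vectors x_1, ..., x_n whose entries sum to S.
X = S * rng.dirichlet(np.ones(n), size=100_000)

best_random = np.prod(X, axis=1).max()   # best product found by chance
equal_split = (S / n) ** n               # product when all x_i = S / n

# No randomly sampled point should beat the equal split.
print(best_random <= equal_split)  # True
```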
Since this result may come in handy when designing machine learning algorithms [2], we will actually prove it. This will require only marginally more effort than we had to spend on the special case mentioned above.
Maximum Product of Numbers of Constant Sum
Theorem 1. If n ≥ 2 positive numbers x_1, …, x_n have a constant sum of

\[
S_n = \sum_{i=1}^{n} x_i,
\]

the value of their product

\[
P_n = \prod_{i=1}^{n} x_i
\]

is largest if x_i = S_n / n for all 1 ≤ i ≤ n.
Proof. We resort to induction over n and make use of basic calculus.
n = 2: Since S_2 = x_1 + x_2 immediately provides x_2 = S_2 − x_1, so that

\[
P_2 = x_1 x_2 = x_1 (S_2 - x_1) = S_2 x_1 - x_1^2
\]

is concave, the point where the derivative

\[
\frac{dP_2}{dx_1} = S_2 - 2 x_1
\]

vanishes will maximize P_2. Equating this derivative to zero then yields x_1 = S_2/2 and thus x_2 = S_2/2.
© C. Bauckhage
licensed under Creative Commons License CC BY-NC
n → n + 1: Since S_n = S_{n+1} − x_{n+1}, the product

\[
P_n = \prod_{i=1}^{n} x_i
\]

is maximal if

\[
x_i = \frac{S_{n+1} - x_{n+1}}{n}
\]

for all 1 ≤ i ≤ n.

To maximize P_{n+1}, we therefore consider

\[
P_{n+1} = \prod_{i=1}^{n} \frac{S_{n+1} - x_{n+1}}{n} \cdot x_{n+1}
= \left( \frac{S_{n+1} - x_{n+1}}{n} \right)^{n} \cdot x_{n+1}
\]

and ask for the optimal value of x_{n+1}. Deriving P_{n+1} w.r.t. x_{n+1} results in

\[
\frac{dP_{n+1}}{dx_{n+1}}
= n \left( \frac{S_{n+1} - x_{n+1}}{n} \right)^{n-1} \left( -\frac{1}{n} \right) x_{n+1}
+ \left( \frac{S_{n+1} - x_{n+1}}{n} \right)^{n}
\]

and equating to zero yields, after factoring out the positive term ((S_{n+1} − x_{n+1})/n)^{n−1}, the condition (S_{n+1} − x_{n+1})/n = x_{n+1} and hence

\[
x_{n+1} = \frac{S_{n+1}}{n+1}.
\]

Plugging this back into the above expression for x_i, we find

\[
x_i = \frac{S_{n+1}}{n+1}
\]

for all 1 ≤ i ≤ n, so that all n + 1 numbers are equal. □
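As a final sanity check on the induction step, one can maximize P_{n+1} over x_{n+1} numerically. The sketch below (NumPy assumed; n = 4 and S_{n+1} = 10 are arbitrary choices) recovers x_{n+1} = S_{n+1}/(n+1) = 2:

```python
import numpy as np

n, S = 4, 10.0                     # S plays the role of S_{n+1}
x = np.linspace(0.0, S, 100_001)   # candidate values for x_{n+1}
P = ((S - x) / n) ** n * x         # P_{n+1} as a function of x_{n+1} alone

x_star = x[np.argmax(P)]
print(x_star)  # ~2.0, i.e. S / (n + 1)
```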
Acknowledgments
This material was prepared within project P3ML which is funded by
the Ministry of Education and Research of Germany (BMBF) under
grant number 01/S17064. The authors gratefully acknowledge this
support.
References
[1] C. Bauckhage and D. Speicher. Lecture Notes on Machine Learning: Constrained Optimization – Setting the Stage. B-IT, University of Bonn, 2019.
[2] C. Bauckhage, E. Brito, K. Cvejoski, C. Ojeda, R. Sifa, and S. Wrobel. Ising Models for Binary Clustering via Adiabatic Quantum Computing. In Proc. EMMCVPR, volume 10746 of LNCS. Springer, 2017.