Decentralized Jointly Sparse Optimization by
Reweighted Lq Minimization
Qing Ling
Department of Automation
University of Science and Technology of China
Joint work with Zaiwen Wen (SJTU) and Wotao Yin (RICE)
2012/09/05
A brief introduction to my research interests
optimization and control in networked multi-agent systems
autonomous agents
- collect data
- process data
- communicate
problem: how to efficiently accomplish in-network optimization and control tasks through the collaboration of agents?
Large-scale wireless sensor networks: decentralized
signal processing, node localization, sensor selection …
how to localize blind nodes using anchor nodes?
how to fuse big sensory data?
e.g. structural health monitoring (SHM)
difficulty in data transmission
→ decentralized optimization
without any fusion center
how to assign sensors to targets?
Computer/server networks with big data: collaborative
data mining
new challenges in the big data era
- big data is stored in distributed computers/servers
- data transmission is restricted by bandwidth, privacy, …
- computers/servers collaborate to do data mining
distributed/decentralized optimization
Wireless sensor and actuator networks: with application
in large-scale greenhouse control
wireless sensing
- temperature
- humidity
-…
wireless actuating
- circulating fan
- wet curtain
-…
disadvantages of traditional centralized control
- communication burden in collecting distributed sensory data
- lack of robustness due to packet loss, time delay, …
decentralized control system design
Recent work
wireless sensor networks
- decentralized signal processing with application in SHM
- decentralized node localization using SDP and SOCP
- decentralized sensor node selection for target tracking
collaborative data mining
- decentralized approaches to jointly sparse signal recovery
- decentralized approaches to matrix completion
wireless sensor and actuator networks
- modeling, hardware design, controller design, prototype
theoretical issues
- convergence and convergence rate analysis
Decentralized Jointly Sparse Optimization by
Reweighted Lq Minimization
Qing Ling
Department of Automation
University of Science and Technology of China
Joint work with Zaiwen Wen (SJTU) and Wotao Yin (RICE)
2012/09/05
Outline
Background
decentralized jointly sparse optimization with applications
Roadmap
nonconvex versus convex, difficulty in decentralized computing
Algorithm development
successive linearization, inexact average consensus
Simulation and conclusion
Background (I): jointly sparse optimization
Structured signals
A sparse signal: only a few elements are nonzero
Jointly sparse signals: each is sparse, and all share the same nonzero support
[figure: the matrix X of jointly sparse signals, with a few shared nonzero rows and all other rows zero]
Jointly sparse optimization: to recover X from the linear measurements y^(i) = A^(i) x^(i) + e^(i), where A^(i) is the measurement matrix and e^(i) is the measurement noise (a display form follows)
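For concreteness, here is a display form of the model, reconstructed from the annotations above and the notation used on the next two slides; the stacking X = [x^(1), …, x^(L)] is this sketch's convention:

```latex
% Measurements at agent i = 1, ..., L:
y^{(i)} = A^{(i)} x^{(i)} + e^{(i)}, \qquad
X = \big[\, x^{(1)}, \dots, x^{(L)} \,\big] \in \mathbb{R}^{N \times L}
% Joint sparsity: all columns share one small support
\operatorname{supp}\big(x^{(1)}\big) = \cdots = \operatorname{supp}\big(x^{(L)}\big),
\qquad \big|\operatorname{supp}\big(x^{(i)}\big)\big| \ll N
```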
Background (II): decentralized jointly sparse optimization
Decentralized computing in a network
Data distributed across agents & no fusion center
Considerations of privacy, difficulty of data collection, etc.
Decentralized jointly sparse optimization
Goal: agent i, which holds y^(i) and A^(i), recovers x^(i) through collaboration
Background (III): applications
Cooperative spectrum sensing [1][2]
Cognitive radios sense jointly sparse spectra {x^(i)}
Measure from the time domain [1] or through frequency-selective filters [2]
Decentralized recovery from {y^(i) = A^(i) x^(i)}
Decentralized event detection [3]
Sensors {i} sense a few targets represented by jointly sparse {x^(i)}
Decentralized recovery from {y^(i) = A^(i) x^(i)}
Collaborative data mining, distributed human action recognition, etc
[1] F. Zeng, C. Li, and Z. Tian, “Distributed compressive spectrum sensing in cooperative multi-hop wideband
cognitive networks,” IEEE Journal of Selected Topics in Signal Processing, vol. 5, pp. 37–48, 2011
[2] J. Meng, W. Yin, H. Li, E. Hossain, and Z. Han, “Collaborative spectrum sensing from sparse observations in
cognitive radio networks,” IEEE Journal on Selected Areas in Communications, vol. 29, pp. 327–337, 2011
[3] N. Nguyen, N. Nasrabadi, and T. Tran, “Robust multi-sensor classification via joint sparse representation,”
submitted to Journal of Advances in Information Fusion
Roadmap (I): nonconvex versus convex
Convex model: group lasso, i.e., L2,1-norm minimization with regularization parameter λ (a standard formulation is written out below)
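A standard group-lasso formulation consistent with the annotation above; the exact variant used in the talk may differ slightly, so treat this as a representative sketch:

```latex
% Group lasso / L_{2,1}-norm minimization (representative form):
\min_{X}\; \frac{1}{2}\sum_{i=1}^{L} \big\| y^{(i)} - A^{(i)} x^{(i)} \big\|_2^2
\;+\; \lambda \,\| X \|_{2,1},
\qquad
\| X \|_{2,1} = \sum_{n=1}^{N} \big\| \bar{x}_n \big\|_2
% where \bar{x}_n = [x_n^{(1)}, ..., x_n^{(L)}] is the n-th row of X
% and \lambda > 0 is the regularization parameter.
```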
Nonconvex versus convex
Convex: with global convergence guarantee
Nonconvex: often with better recovery performance
Looking back at nonconvex models for recovering a single sparse signal
Reweighted L1/L2 norm minimization [4][5]
Reweighted algorithms for jointly sparse optimization?
[4] E. Candes, M. Wakin, and S. Boyd, “Enhancing sparsity by reweighted L1 minimization,” Journal of Fourier
Analysis and Applications, vol. 14, pp. 877–905, 2008
[5] R. Chartrand and W. Yin, “Iteratively reweighted algorithms for compressive sensing,” In: Proceedings of
ICASSP, 2008
Roadmap (II): difficulty in decentralized computing
A popular decentralized computing technique: consensus
Minimize the sum of each agent i's objective function f_i over a common optimization variable; equivalently, give every agent a local copy of the variable and require neighboring copies to be equal (both forms are written out below)
Obviously, the two problems are equivalent for a connected network
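The two equivalent formulations, reconstructed from the annotations on this slide; f_i is agent i's objective, x the common variable, x_i agent i's local copy, and E the set of neighboring pairs:

```latex
% Original problem: one common variable
\min_{x}\; \sum_{i=1}^{L} f_i(x)
% Consensus reformulation: local copies with equality constraints
\min_{\{x_i\}}\; \sum_{i=1}^{L} f_i(x_i)
\quad \text{s.t.} \quad x_i = x_j \;\; \forall\,(i,j) \in E
```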
Efficient algorithms (ADM, SGD, etc.) exist for the consensus form when the objective is convex [6]
Nothing yet for consensus in jointly sparse optimization!
The signals differ across agents; the common support brings nonconvexity
[6] D. Bertsekas and J. Tsitsiklis, Parallel and Distributed Computation: Numerical Methods, Second Edition, Athena
Scientific, 1997
Roadmap (III): solution overview
Nonconvex model + convex decentralized computing subproblem
Nonconvex model → successive linearization → reweighted Lq
Natural decentralized computing, one nontrivial subproblem
Inexactly solving the subproblem still leads to good recovery
Algorithm (I): successive linearization
Nonconvex model (q = 1 or 2), with smoothing parameter ε and regularization parameter λ (a representative instance is written out below)
“Successive linearization” of the joint-sparsity term at iterate t
This is actually a majorization-minimization approach
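A representative instance of the model and its linearization, following the log-sum construction of [4]; the exact penalty used in the talk may differ, so this is a sketch rather than the talk's formula (\bar{x}_n denotes the n-th row of X):

```latex
% Smoothed nonconvex joint-sparsity model (log-sum surrogate, q = 1 or 2):
\min_{\{x^{(i)}\}}\;
\frac{1}{2}\sum_{i=1}^{L}\big\|y^{(i)} - A^{(i)}x^{(i)}\big\|_2^2
+ \lambda \sum_{n=1}^{N} \log\!\big(\|\bar{x}_n\|_q + \varepsilon\big)
% Linearizing the concave log at the current iterate \bar{x}_n^t gives weights
u_n^{t} = \frac{1}{\|\bar{x}_n^{t}\|_q + \varepsilon}
% and the convex reweighted-Lq subproblem
\min_{\{x^{(i)}\}}\;
\frac{1}{2}\sum_{i}\big\|y^{(i)} - A^{(i)}x^{(i)}\big\|_2^2
+ \lambda \sum_{n} u_n^{t}\,\|\bar{x}_n\|_q
% Since \log is concave, its linearization is an upper bound (up to a constant),
% so each subproblem decreases the original objective: majorization-minimization.
```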
Algorithm (II): reweighted algorithm
Centralized reweighted Lq minimization algorithm
Updating weight vector
weight vector u = [u1; u2; …; uN]
Updating signals
From a decentralized implementation perspective …
The x-update admits natural decentralized computing
The u-update contains one subproblem that needs a decentralized solution (see the sketch below)
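A minimal centralized sketch of this reweighted loop for q = 1, solving each weighted-lasso x-update with a few ISTA steps; all function names and parameter values are illustrative choices, not the talk's implementation:

```python
import numpy as np

def soft(v, tau):
    # Elementwise soft-thresholding: prox of tau * |.|_1
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def reweighted_l1(A, Y, lam=0.1, eps=1e-2, outer=20, inner=50):
    """A: (L, M, N) measurement matrices; Y: (L, M) measurements.
    Returns X: (N, L), whose column i is agent i's recovered signal."""
    L, M, N = A.shape
    X = np.zeros((N, L))
    for _ in range(outer):
        # u-update: reweight by the current row magnitudes (q = 1);
        # the row sum over agents is what needs consensus when decentralized
        u = 1.0 / (np.sum(np.abs(X), axis=1) + eps)
        # x-update: decouples across agents; each solves a weighted lasso
        for i in range(L):
            Ai, yi = A[i], Y[i]
            step = 1.0 / np.linalg.norm(Ai, 2) ** 2   # 1 / Lipschitz constant
            x = X[:, i]
            for _ in range(inner):
                v = x - step * (Ai.T @ (Ai @ x - yi))  # gradient step
                x = soft(v, step * lam * u)            # weighted prox step
            X[:, i] = x
    return X
```

Note how the x-update splits into L independent weighted lassos, while the weights u need the row sums across all agents; that sum is exactly the quantity the next slides compute by average consensus.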
Algorithm (III): average consensus
Check the u-update: each weight needs a network-wide sum of local quantities, i.e., an average consensus problem
Rewrite it in the more familiar forms below
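One way to write the averaging step in standard forms; here a^(i) collects agent i's local quantities (e.g., |x_n^(i)|^q for q = 1 or 2), and the notation is this sketch's, not necessarily the talk's:

```latex
% Goal: every agent obtains the network average
\bar{a} = \frac{1}{L} \sum_{i=1}^{L} a^{(i)}
% Equivalent unconstrained form: \bar{a} is the unique minimizer of
\min_{z}\; \sum_{i=1}^{L} \tfrac{1}{2}\,\big\| z - a^{(i)} \big\|_2^2
% Consensus-constrained form amenable to decentralized algorithms:
\min_{\{z_i\}}\; \sum_{i=1}^{L} \tfrac{1}{2}\,\big\| z_i - a^{(i)} \big\|_2^2
\quad \text{s.t.} \quad z_i = z_j \;\; \forall\,(i,j) \in E
```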
Algorithm (IV): inexact average consensus
Solve the average consensus problem with ADM (outer iteration t, inner slots s = 1, …, S)
Updating weight vectors (local copies)
Updating Lagrange multipliers (c is a positive constant); a standard form of both updates is sketched below
Exact average consensus versus inexact average consensus
Exact average consensus: an exact implementation of reweighted Lq,
but the inner loops cost coordination & communication
Inexact average consensus: run only one iteration of the inner loop per outer iteration
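A standard decentralized ADM iteration for this quadratic consensus problem (in the spirit of [6]); whether the talk uses exactly these coefficients is an assumption of this sketch. N_i denotes agent i's neighbor set and c > 0 the penalty parameter:

```latex
% ADM inner iteration s -> s+1 at agent i (local copy z_i, multiplier \alpha_i):
z_i^{s+1} =
\frac{ a^{(i)} - \alpha_i^{s} + c \sum_{j \in N_i} \big( z_i^{s} + z_j^{s} \big) }
     { 1 + 2c\,|N_i| }
\alpha_i^{s+1} = \alpha_i^{s} + c \sum_{j \in N_i} \big( z_i^{s+1} - z_j^{s+1} \big)
% Each step uses only neighbor-to-neighbor communication; as s grows,
% every z_i converges to the network average \bar{a}.
```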
Algorithm (V): decentralized reweighted Lq
Algorithm outline (a runnable sketch follows)
Updating weight vectors (local copies) and Lagrange multipliers via one inexact average-consensus step (c is a positive constant)
Updating signals by solving each agent's local weighted-Lq subproblem
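A minimal simulation-style sketch of the decentralized reweighted L1 loop with one ADM consensus iteration per outer iteration; all agents are simulated in a single process, and every name and parameter value is illustrative rather than the talk's:

```python
import numpy as np

def decentralized_reweighted_l1(A, Y, nbrs, lam=0.1, eps=1e-2, c=1.0,
                                outer=100, ista=10):
    """A: (L, M, N); Y: (L, M); nbrs[i]: list of agent i's neighbors."""
    L, M, N = A.shape
    X = np.zeros((N, L))
    Z = np.zeros((L, N))                # z_i: local estimates of the average
    alpha = np.zeros((L, N))            # Lagrange multipliers
    for _ in range(outer):
        a = np.abs(X).T                 # a^(i) = |x^(i)|: quantities to average
        # one inexact ADM consensus step (neighbor communication only)
        Zold = Z.copy()
        for i in range(L):
            s = sum(Zold[i] + Zold[j] for j in nbrs[i])
            Z[i] = (a[i] - alpha[i] + c * s) / (1.0 + 2.0 * c * len(nbrs[i]))
        for i in range(L):
            alpha[i] += c * sum(Z[i] - Z[j] for j in nbrs[i])
        # each agent forms weights from its own estimate of the row averages,
        # u_n ~ 1 / (L * average_n + eps), then takes a few local ISTA steps
        for i in range(L):
            u = 1.0 / (L * np.maximum(Z[i], 0.0) + eps)
            Ai, yi = A[i], Y[i]
            step = 1.0 / np.linalg.norm(Ai, 2) ** 2
            x = X[:, i]
            for _ in range(ista):
                v = x - step * (Ai.T @ (Ai @ x - yi))
                x = np.sign(v) * np.maximum(np.abs(v) - step * lam * u, 0.0)
            X[:, i] = x
    return X
```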
Simulation (I): simulation settings
Network settings
L = 50 agents, randomly deployed in a 100×100 area
Communication range = 30; links are bidirectional and the network is connected
Measurement settings
Signal dimension N=20, signal sparsity K=2
Measurement dimension M=10
Random measurement matrices and random measurement noise
Parameter settings
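A sketch reproducing the network and measurement setup listed above (NumPy-based; the regularization and penalty parameter values are not given in the transcript, so any defaults used with it are placeholders):

```python
import numpy as np

rng = np.random.default_rng(0)
L, N, K, M = 50, 20, 2, 10          # agents, dimension, sparsity, measurements
AREA, RANGE = 100.0, 30.0

# Random deployment; agents within communication range are neighbors
pos = rng.uniform(0.0, AREA, size=(L, 2))
dist = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=2)
nbrs = [[j for j in range(L) if j != i and dist[i, j] <= RANGE]
        for i in range(L)]

# Jointly sparse signals: one common support of size K, distinct values
support = rng.choice(N, size=K, replace=False)
X = np.zeros((N, L))
X[support, :] = rng.standard_normal((K, L))

# Random measurement matrices and noisy measurements y^(i) = A^(i) x^(i) + e^(i)
A = rng.standard_normal((L, M, N)) / np.sqrt(M)
noise = 0.01 * rng.standard_normal((L, M))
Y = np.einsum('imn,ni->im', A, X) + noise

# e.g., feed into the earlier sketch:
# X_hat = decentralized_reweighted_l1(A, Y, nbrs)
```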
Simulation (II): recovery performance
Simulation (III): convergence rate
Conclusion
Decentralized jointly sparse optimization problem
Jointly sparse signal recovery in a distributed network
Reweighted Lq minimization algorithms
Feature #1: nonconvex model ← successive linearization
Feature #2: decentralized computing ← inexact average consensus
Good news and bad news
Good: local convergence of the centralized algorithms
Good: excellent empirical performance of the decentralized algorithms
Bad: no theoretical performance guarantee yet (recovery and convergence)
Outlook: many open questions in decentralized optimization
Thanks for your attention!