Poster session


The Poster Session will be on Monday 23 May at 17:00

  • H. Amoualian (LIG)
Title: Streaming-LDA: a copula-based approach to modeling topic dependencies in document streams.
Abstract: We propose a new model for modeling topic and word-topic dependencies between consecutive documents in document streams. This model makes use of copulas, which constitute a generic tool to model dependencies between random variables. They are symmetric and associative and are thus appropriate for exchangeable random variables. Our experiments, conducted on three standard collections that have been used in several studies on topic modeling, show that our proposals outperform previous ones (such as dynamic topic models and temporal LDA), both in terms of perplexity and for tracking similar topics in a document stream.
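As a toy illustration of the copula mechanism this abstract builds on (and not of the Streaming-LDA model itself), the sketch below couples the topic proportions of two consecutive documents through a Gaussian copula; the number of topics, the correlation parameter and the Dirichlet prior are arbitrary assumptions.

```python
# Minimal illustration (not the authors' model): coupling the topic
# proportions of two consecutive documents with a Gaussian copula.
import numpy as np
from scipy.stats import norm, gamma

rng = np.random.default_rng(0)
K, rho, alpha = 5, 0.8, 0.5          # topics, copula correlation, Dirichlet prior (assumptions)

# Sample correlated uniforms via a Gaussian copula, one pair per topic.
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal(np.zeros(2), cov, size=K)   # (K, 2) correlated Gaussian draws
u = norm.cdf(z)                                          # (K, 2) uniforms with dependence

# Push each uniform through a Gamma quantile and normalise: this yields two
# Dirichlet-distributed topic vectors whose components co-vary across documents.
g = gamma(alpha).ppf(u)                                  # (K, 2) dependent Gamma variates
theta_prev, theta_next = (g / g.sum(axis=0)).T           # topic proportions of doc t-1 and doc t
print(theta_prev.round(3), theta_next.round(3))
```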
  • G. Balikas (LIG)
Abstract: We propose a Polylingual text Embedding (PE) strategy that learns language-independent representations of texts using neural networks, and we study the effects of bilingual representation learning on text classification. We empirically show that the learned representations yield better classification performance than traditional bag-of-words representations and other monolingual distributed representations, especially in the interesting case where few labeled examples are available for training the classifiers.
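A minimal sketch of the general idea (not the authors' architecture): language-specific embedding tables feeding a shared encoder, so that documents from either language land in a common space before classification. The language pair, vocabulary sizes, dimensions and layer choices below are all assumptions.

```python
import torch
import torch.nn as nn

class PolylingualEncoder(nn.Module):
    def __init__(self, vocab_en=20000, vocab_fr=20000, dim=100, n_classes=4):
        super().__init__()
        self.embed = nn.ModuleDict({
            "en": nn.EmbeddingBag(vocab_en, dim, mode="mean"),   # one table per language
            "fr": nn.EmbeddingBag(vocab_fr, dim, mode="mean"),
        })
        self.shared = nn.Sequential(nn.Linear(dim, dim), nn.Tanh())  # shared, language-independent layer
        self.clf = nn.Linear(dim, n_classes)                          # classifier on the shared space

    def forward(self, token_ids, lang):
        doc = self.embed[lang](token_ids)    # bag-of-embeddings document vector
        return self.clf(self.shared(doc))    # logits computed from the shared representation

# Usage: a batch of two documents in the "en" table, given as rows of token ids.
model = PolylingualEncoder()
batch = torch.randint(0, 20000, (2, 30))     # 2 documents, 30 tokens each
print(model(batch, "en").shape)              # torch.Size([2, 4])
```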
  • Nicolas Ducros (CREATIS)
Abstract: Spectral computed tomography (CT) exploits measurements from x-rays with different energies to obtain a 3D description of the patient in a material basis. It requires solving two subproblems, namely the material decomposition and the tomographic reconstruction problems, either sequentially or jointly. In this work, we address the material decomposition problem, which is an ill-posed non-linear problem. Our main contribution is to introduce a material-dependent spatial regularization scheme. The problem is solved iteratively using the Gauss-Newton method. The framework is validated on numerical experiments with a thorax phantom made of soft tissue, bone and gadolinium, scanned with a 90 kV source and a 3-bin photon counting detector.
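A rough sketch of how a Gauss-Newton loop with material-dependent regularization weights could look (an illustration, not the authors' implementation); the generic exponential forward model, the spatial operator L and the weights in the toy usage are assumptions.

```python
import numpy as np

def gauss_newton_decomposition(F, J, y, x0, L, lams, n_iter=20):
    """Regularized Gauss-Newton for min_x 0.5||F(x)-y||^2 + 0.5*sum_m lam_m*||L x_m||^2.
    x is stacked as (n_materials, n_pixels); L acts on one material map;
    lam_m is the material-dependent regularization weight."""
    x = x0.copy()
    n_mat, n_pix = x.shape
    LtL = L.T @ L
    for _ in range(n_iter):
        r = F(x) - y                                   # data residual
        Jx = J(x)                                      # Jacobian w.r.t. the flattened x
        H = Jx.T @ Jx                                  # Gauss-Newton Hessian approximation
        g = Jx.T @ r                                   # gradient of the data term
        for m, lam in enumerate(lams):                 # add material-dependent regularization
            sl = slice(m * n_pix, (m + 1) * n_pix)
            H[sl, sl] += lam * LtL
            g[sl] += lam * (LtL @ x[m])
        x = x - np.linalg.solve(H, g).reshape(n_mat, n_pix)   # Gauss-Newton update
    return x

# Toy usage on an assumed exponential (Beer-Lambert-like) forward model.
rng = np.random.default_rng(0)
n_mat, n_pix, n_meas = 2, 16, 40
A = rng.random((n_meas, n_mat * n_pix)) / n_pix
x_true = rng.random((n_mat, n_pix))
F = lambda x: np.exp(-A @ x.ravel())
J = lambda x: -A * np.exp(-A @ x.ravel())[:, None]
L = np.eye(n_pix)                                      # plain Tikhonov stand-in for a spatial operator
x_hat = gauss_newton_decomposition(F, J, F(x_true), np.full((n_mat, n_pix), 0.5), L, lams=[1e-3, 1e-6])
print(np.abs(x_hat - x_true).max())
```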
  • Adrien DULAC (LIG)
Title: Graphical Models for Diffusion in Social Networks
Abstract: Modeling how information diffuses in social networks is a difficult task, as two dynamics interact at the same time: that of the information itself and that of the network topology. Recently, several graphical models have been proposed to provide a generative view of the dynamics of the information in a social graph as well as to model the uncertainty. We intend to extend these models with an explicit modeling of the dynamics of the network itself. The complete model should provide explanations of (as well as predictive capacities for) how the diffusion process behaves.
  • I. Gannaz (ICJ)
Title : Estimation of fractal connectivity in multivariate time series with long-range dependence: application to cerebral connectivity
Abstract : A challenge in imaging neuroscience is to characterize brain organization through the integration of interactions between segregated areas. One way to estimate functional connectivity consists in estimating correlations between pairs of measurements of neuronal activity. The aim of the present work is to take into account the long-range dependence properties of the recordings. Fractal connectivity can be statistically defined as the spectral correlation between long-memory processes over a range of low frequencies. It can be seen as the asymptotic limit of Fourier and wavelet correlations at low frequencies. Fractal connectivity thus corresponds to the “structural” or long-term covariation between the processes. We first introduce a semi-parametric multivariate model defining fractal connectivity for a large class of multivariate time series. This model includes the multivariate fractional Brownian motion and some fractionally integrated processes. We propose an estimation of the long-range dependence parameters and of the fractal connectivity, based on the Whittle approximation and on a wavelet representation of the time series. The theoretical properties of the estimators show their asymptotic optimality. A simulation study confirms the satisfactory behaviour of the procedure on finite samples. Finally, we propose an application to the estimation of a human brain functional network based on MEG data sets. Our study highlights the benefit of the multivariate analysis, namely improved efficiency in the estimation of dependence parameters and of long-term correlations.
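A crude illustration of the low-frequency wavelet-correlation idea behind fractal connectivity (not the wavelet-Whittle estimator of the abstract); the wavelet family, number of levels, coarse scales kept and the toy long-memory surrogates are assumptions.

```python
import numpy as np
import pywt

def coarse_scale_correlation(x, y, wavelet="db4", levels=8, coarse=3):
    """Correlation of detail coefficients restricted to the `coarse` lowest-frequency scales."""
    cx = pywt.wavedec(x, wavelet, level=levels)[1:1 + coarse]   # coarsest detail scales of x
    cy = pywt.wavedec(y, wavelet, level=levels)[1:1 + coarse]   # same scales of y
    return np.corrcoef(np.concatenate(cx), np.concatenate(cy))[0, 1]

# Toy example: two correlated random walks as a simple long-memory surrogate.
rng = np.random.default_rng(0)
e = rng.standard_normal((2, 2**12))
x = np.cumsum(e[0])
y = np.cumsum(0.7 * e[0] + 0.3 * e[1])
print(coarse_scale_correlation(x, y))
```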
  • B. Joshi (LIG)
Abstract: In the context of large-scale problems, traditional multiclass classification approaches have to deal with class imbalance and complexity issues which make them inoperative in some extreme cases. In this work we study a transformation that reduces the initial multiclass classification of examples to a binary classification of pairs of examples and classes. We present generalization error bounds that exhibit the interdependency between the pairs of examples and that recover known results on binary classification with i.i.d. data. We show the efficiency of the resulting algorithm compared to state-of-the-art multiclass classification strategies on two large-scale document collections, especially in the interesting case where the number of classes becomes very large.
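An illustrative sketch of the example-class pair reduction (with made-up pair features based on class centroids; the paper's joint representation, sampling scheme and bounds are not reproduced here). A pair gets label 1 if the class is the true one and 0 otherwise; prediction scores every candidate class with the same binary model and keeps the best.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def pair_features(X, centroids, ks):
    """Features of the pairs (X[i], class ks[i]): dot product and distance to the class centroid."""
    C = centroids[ks]
    return np.column_stack([(X * C).sum(1), np.linalg.norm(X - C, axis=1)])

def fit(X, y, n_classes, rng):
    centroids = np.stack([X[y == k].mean(0) for k in range(n_classes)])
    wrong = (y + rng.integers(1, n_classes, size=len(y))) % n_classes     # one sampled negative class per example
    Z = np.vstack([pair_features(X, centroids, y), pair_features(X, centroids, wrong)])
    t = np.r_[np.ones(len(y)), np.zeros(len(y))]                          # 1 = true class, 0 = wrong class
    return LogisticRegression().fit(Z, t), centroids

def predict(model, centroids, X, n_classes):
    scores = np.column_stack([model.predict_proba(pair_features(X, centroids, np.full(len(X), k)))[:, 1]
                              for k in range(n_classes)])
    return scores.argmax(1)                                               # best-scoring class per example

# Toy data: 3 Gaussian clusters, one per class.
rng = np.random.default_rng(0)
y = rng.integers(0, 3, 300)
X = rng.standard_normal((300, 5)) + y[:, None]
model, centroids = fit(X, y, 3, rng)
print((predict(model, centroids, X, 3) == y).mean())                      # training accuracy on the toy data
```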
  • C. Lartizien (CREATIS) Joint work with Meriem El Azami (CREATIS, Lyon) and Stéphane Canu (LITIS, Rouen)
Title : Converting SVDD Scores into Probability Estimates
Abstract: To enable post-processing, the output of a support vector data description (SVDD) should be a calibrated probability, as is done for SVMs. Standard SVDD does not provide such probabilities. To create probabilities, we first generalize the SVDD model and propose two calibration functions. The first one uses a sigmoid model and the other is based on a generalized extreme value distribution model. To estimate the calibration parameters, we use the consistency property of the estimator associated with a single SVDD model. A synthetic dataset and datasets from the UCI repository are used to compare the performance against a robust kernel density estimator.
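A minimal sketch of the sigmoid-calibration idea in the one-class setting, using scikit-learn's OneClassSVM (RBF kernel) as a stand-in for SVDD and a simple Platt-style fit on a small labeled hold-out set. This is an assumption-laden illustration, not the calibration scheme proposed in the abstract.

```python
import numpy as np
from sklearn.svm import OneClassSVM
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X_train = rng.standard_normal((300, 2))                     # nominal (target) data only
X_holdout = np.vstack([rng.standard_normal((50, 2)),        # held-out inliers...
                       rng.standard_normal((50, 2)) + 4])   # ...and outliers, used for calibration
y_holdout = np.r_[np.ones(50), np.zeros(50)]                # 1 = inlier, 0 = outlier

model = OneClassSVM(kernel="rbf", gamma=0.5, nu=0.05).fit(X_train)
s = model.decision_function(X_holdout).reshape(-1, 1)       # raw boundary scores

calib = LogisticRegression().fit(s, y_holdout)              # sigmoid p = 1 / (1 + exp(-(a*s + b)))

# Calibrated probability of being an inlier for two new points.
X_new = np.array([[0.0, 0.0], [4.0, 4.0]])
p = calib.predict_proba(model.decision_function(X_new).reshape(-1, 1))[:, 1]
print(p.round(3))
```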
  • K. Polisano (LJK) Joint work with M. Clausel, L. Condat and V. Perrier
Title : Texture modeling by Gaussian fields with prescribed local orientation.
Abstract: Texture modeling is a challenging issue in image processing. There is a variety of texture methods in the field of computer vision, namely structural, statistical, model-based and transform-based methods. Identifying the perceived characteristics of a texture in an image (regularity, roughness, frequency content, directionality, etc.) is thus an important first step towards building mathematical models for textures. We are interested in textures presenting similar patterns at different scales, as is often the case for objects appearing in nature, like clouds or mountains. We focus on stochastic models with a property of self-similarity, characteristic of a fractal behavior, and of anisotropy, characteristic of directionality. We introduce a new class of Gaussian fields, called Locally Anisotropic Fractional Brownian Fields (LAFBF), with prescribed local orientation at any point. These fields are a local version of a specific class of anisotropic self-similar Gaussian fields with stationary increments. The simulation of such textures is obtained with a new algorithm mixing the tangent field formulation and a turning-band method, the latter having proved efficient for generating stationary anisotropic textures. Numerical experiments show the ability of the method to synthesize textures with prescribed local orientation.
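As background to the self-similar Gaussian fields this abstract generalises, here is an approximate spectral synthesis of an isotropic fractional Brownian field; it is neither the LAFBF model nor the tangent-field/turning-band algorithm of the abstract, and the grid size and Hurst exponent H are arbitrary assumptions.

```python
import numpy as np

def fractional_brownian_field(n=256, H=0.7, seed=0):
    rng = np.random.default_rng(seed)
    fx = np.fft.fftfreq(n)[:, None]
    fy = np.fft.fftfreq(n)[None, :]
    radius = np.sqrt(fx**2 + fy**2)
    radius[0, 0] = np.inf                                   # kill the zero frequency (mean)
    amplitude = radius ** (-(H + 1.0))                      # |xi|^-(H+1), i.e. PSD ~ |xi|^-(2H+2)
    noise = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    field = np.fft.ifft2(noise * amplitude).real            # filtered white noise
    return (field - field.mean()) / field.std()             # normalised texture

texture = fractional_brownian_field()
print(texture.shape, texture.std().round(2))                # (256, 256) 1.0
```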
  • H. Raguet (I2M) (Collaboration with L. Landrieu (ENPC))
Abstract: Nowadays, primal-dual algorithms seem to be largely favored among proximal algorithms for solving large-scale, nondifferentiable convex optimization problems. This follows the tradition of the method of Arrow et al. (1958), of the alternating direction method of multipliers (ADMM; Gabay and Mercier, 1976) and, more recently, of the popular algorithm of Chambolle and Pock (2011). Still, primal-only approaches can often be used as well, and can sometimes be better adapted to the structure of the problem at hand, benefiting from symmetry and/or differentiability of certain parts of the functional. In this work, we present a preconditioning of a generalized forward-backward splitting algorithm for minimizing functionals of the form \sum g_i + f with f smooth, using only the gradient of f and the proximity operator of each g_i separately. By adapting the underlying metric, such preconditioning can serve two practical purposes: first, it might accelerate convergence or, second, it might simplify the computation of the proximity operator of g_i for some i. In addition, in many cases of interest, our preconditioning strategy allows savings in storage and computation for some auxiliary variables. In particular, we show how this approach can handle large-scale, nonsmooth, convex optimization problems structured on graphs, which arise in many image processing or learning applications, and that it compares favorably to alternatives in the literature.
References
K. J. Arrow, L. Hurwicz, and H. Uzawa. Studies in linear and nonlinear programming. Stanford University Press, 1958.
A. Chambolle and T. Pock. A first-order primal-dual algorithm for convex problems with applications to imaging. Journal of Mathematical Imaging and Vision, 2011.
D. Gabay and B. Mercier. A dual algorithm for the solution of nonlinear variational problems via finite element approximation. Computers & Mathematics with Applications, 1976.
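For reference, a minimal sketch of a plain (non-preconditioned) generalized forward-backward iteration for minimizing f + \sum g_i, using only the gradient of f and the proximity operator of each g_i; the preconditioning by an adapted metric discussed in the abstract is not shown, and the toy problem at the end is an assumed example.

```python
import numpy as np

def generalized_forward_backward(grad_f, proxes, x0, gamma=0.1, lam=1.0, n_iter=200):
    """proxes: list of functions prox_i(v, step) = argmin_u g_i(u) + ||u - v||^2 / (2*step)."""
    n = len(proxes)
    z = [x0.copy() for _ in range(n)]           # one auxiliary variable per non-smooth term
    x = x0.copy()
    for _ in range(n_iter):
        grad = grad_f(x)
        for i, prox in enumerate(proxes):
            z[i] += lam * (prox(2 * x - z[i] - gamma * grad, n * gamma) - x)
        x = sum(z) / n                          # equal weights w_i = 1/n
    return x

# Toy usage: min 0.5*||x - b||^2 + mu*||x||_1 + indicator(x >= 0).
b, mu = np.array([1.0, -2.0, 0.5]), 0.3
soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - mu * t, 0)   # prox of mu*||.||_1
nonneg = lambda v, t: np.maximum(v, 0)                               # prox of the constraint x >= 0
print(generalized_forward_backward(lambda x: x - b, [soft, nonneg], np.zeros(3)))   # ~ [0.7, 0, 0.2]
```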
  • S. Zouarhi (LIG)
Title : Constrained data transmission with Critical Services
Abstract : Transmission of constrained data is a major issue in industrial systems. Today, more and more sensitive data are in circulation. Regarding the transmission channel, questions of security and confidentiality, but also of end-to-end integrity and traceability, are hot research topics. This is of course a major concern in healthcare, especially in telemedicine, where telecommunications are used to enable tele-expertise or tele-monitoring services. This is why constrained data can only be correctly handled by critical services. With the Internet of Things, the world is becoming a connected place, which is why it is important, starting now, to understand and anticipate the risks that come with this transformation.