/usr/lib/python2.7/dist-packages/MDP-3.5.egg-info/PKG-INFO is in python-mdp 3.5-1.

This file is owned by root:root, with mode 0o644.

The actual contents of the file can be viewed below.

Metadata-Version: 1.1
Name: MDP
Version: 3.5
Summary: MDP is a Python library for building complex data processing software by combining widely used machine learning algorithms into pipelines and networks.
Home-page: http://mdp-toolkit.sourceforge.net
Author: MDP Developers
Author-email: mdp-toolkit-devel@lists.sourceforge.net
License: http://mdp-toolkit.sourceforge.net/license.html
Download-URL: http://sourceforge.net/projects/mdp-toolkit/files/mdp-toolkit/3.5/MDP-3.5.tar.gz
Description: **The Modular toolkit for Data Processing (MDP)** package is a library
        of widely used data processing algorithms that can be combined
        to form pipelines for building more complex data processing
        software.
        
        MDP has been designed to be used as-is and as a framework for
        scientific data processing development.
        
        From the user's perspective, MDP consists of a collection of *units*,
        which process data. For example, these include algorithms for
        supervised and unsupervised learning, principal and independent
        component analysis, and classification.
        
        These units can be chained into data processing flows, to create
        pipelines as well as more complex feed-forward network
        architectures. Given a set of input data, MDP takes care of training
        and executing all nodes in the network in the correct order and
        passing intermediate data between the nodes. This allows the user to
        specify complex algorithms as a series of simpler data processing
        steps.
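        
        As a brief, non-authoritative sketch of this idea (the node choices,
        ``output_dim`` value and array shapes below are illustrative
        assumptions), two standard nodes can be chained into a ``Flow``,
        trained, and executed::
        
            import numpy as np
            import mdp

            # a two-node pipeline: PCA followed by Slow Feature Analysis
            flow = mdp.Flow([mdp.nodes.PCANode(output_dim=5),
                             mdp.nodes.SFANode()])

            x = np.random.random((1000, 20))  # 1000 samples, 20 variables
            flow.train(x)   # trains each node in turn, in the correct order
            y = flow(x)     # executes the whole pipeline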
        
        The number of available algorithms is steadily increasing and includes
        signal processing methods (Principal Component Analysis, Independent
        Component Analysis, Slow Feature Analysis), manifold learning methods
        ([Hessian] Locally Linear Embedding), several classifiers,
        probabilistic methods (Factor Analysis, RBM), data pre-processing
        methods, and many others.
        
        Particular care has been taken to make computations efficient in terms
        of speed and memory. To reduce the memory footprint, it is possible to
        perform learning using batches of data. For large data-sets, it is
        also possible to specify that MDP should use single precision floating
        point numbers rather than double precision ones. Finally, calculations
        can be parallelised using the ``parallel`` subpackage, which offers a
        parallel implementation of the basic nodes and flows.
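        
        A minimal sketch of these options (the chunk sizes and ``PCANode``
        parameters are assumptions made for illustration) could look like
        this, training a single-precision node on batches of data::
        
            import numpy as np
            import mdp

            # request single precision to reduce the memory footprint
            pca = mdp.nodes.PCANode(output_dim=10, dtype='float32')

            # present the data in batches instead of one large array
            for _ in range(5):
                chunk = np.random.random((200, 50)).astype('float32')
                pca.train(chunk)
            pca.stop_training()

            y = pca.execute(np.random.random((10, 50)).astype('float32'))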
        
        From the developer's perspective, MDP is a framework that makes the
        implementation of new supervised and unsupervised learning algorithms
        easy and straightforward. The basic class, ``Node``, takes care of tedious
        tasks like numerical type and dimensionality checking, leaving the
        developer free to concentrate on the implementation of the learning
        and execution phases. Because of the common interface, the node then
        automatically integrates with the rest of the library and can be used
        in a network together with other nodes.
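        
        A sketch of such a node (the ``MeanFreeNode`` name and the statistic
        it computes are purely illustrative; the point is the ``_train``,
        ``_stop_training`` and ``_execute`` hooks) could look like this::
        
            import mdp

            class MeanFreeNode(mdp.Node):
                """Illustrative node that subtracts the mean of its input."""
                def __init__(self, input_dim=None, dtype=None):
                    super(MeanFreeNode, self).__init__(input_dim=input_dim,
                                                       dtype=dtype)
                    self.avg = None   # running sum, later the mean
                    self.tlen = 0     # number of samples seen so far

                def _train(self, x):
                    # type and dimensionality checks are done by mdp.Node
                    if self.avg is None:
                        self.avg = mdp.numx.zeros(self.input_dim,
                                                  dtype=self.dtype)
                    self.avg += x.sum(axis=0)
                    self.tlen += x.shape[0]

                def _stop_training(self):
                    self.avg /= self.tlen

                def _execute(self, x):
                    return x - self.avg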
        
        A node can have multiple training phases and even an undetermined
        number of phases. Multiple training phases mean that the training data
        is presented multiple times to the same node. This allows the
        implementation of algorithms that need to collect some statistics on
        the whole input before proceeding with the actual training, and others
        that need to iterate over a training phase until a convergence
        criterion is satisfied. It is possible to train each phase using
        chunks of input data if the chunks are given as an iterable. Moreover,
        crash recovery can be optionally enabled, which will save the state of
        the flow in case of a failure for later inspection.
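        
        A short sketch of chunked training with crash recovery enabled (the
        data, chunk sizes and node choices are again assumptions)::
        
            import numpy as np
            import mdp

            flow = mdp.Flow([mdp.nodes.PCANode(output_dim=5),
                             mdp.nodes.SFANode()])
            # if training fails, dump the flow to a file for later inspection
            flow.set_crash_recovery(True)

            # one iterable of data chunks per node in the flow
            chunks = [np.random.random((500, 20)) for _ in range(4)]
            flow.train([chunks, chunks])

            y = flow.execute(np.random.random((100, 20)))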
        
        MDP is distributed under the open source BSD license. It has been
        written in the context of theoretical research in neuroscience, but it
        has been designed to be helpful in any context where trainable data
        processing algorithms are used. Its simplicity on the user's side, the
        variety of readily available algorithms, and the reusability of the
        implemented nodes also make it a useful educational tool.
        
        http://mdp-toolkit.sourceforge.net
Platform: Any
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: Education
Classifier: Intended Audience :: Science/Research
Classifier: License :: OSI Approved :: BSD License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 3
Classifier: Topic :: Scientific/Engineering :: Information Analysis
Classifier: Topic :: Scientific/Engineering :: Mathematics