in Lexical Data

Julian Szymański1 and Włodzisław Duch2,3

1 Department of Computer Systems Architecture, Gdańsk University of Technology, Poland, julian.szymanski@eti.pg.gda.pl

2 Department of Informatics, Nicolaus Copernicus University, Toruń, Poland

3 School of Computer Engineering, Nanyang Technological University, Singapore, Google: W. Duch

Abstract. An unsupervised organization of a set of lexical concepts that captures common-sense knowledge and induces a meaningful partitioning of the data is described. Projection of the data on principal components allows for identification of clusters with wide margins, and the procedure is recursively repeated within each cluster. Application of this idea to a simple dataset describing animals creates a hierarchical partitioning, with each cluster related to a set of features that have a common-sense interpretation.

Keywords: hierarchical clustering, spectral analysis, PCA.

1 Introduction

Categorization of concepts into meaningful hierarchies lies at the foundation of understanding their meaning. Ontologies provide such hand-crafted hierarchical classifications, but they are usually based on expert knowledge, not on common-sense knowledge. For example, most biological taxonomies are hard to understand for lay people. There is no relationship between linguistic labels and their referents, so words may only point at a concept, inducing brain states that contain semantic information and predisposing people to meaningful associations and answers. In particular, visual similarity is not related to names. Dog breeds are categorized by their function, as in Sheepdogs and Cattle Dogs, Scenthounds, Pointing Dogs, Retrievers, Companion and Toy Dogs, with many diverse breeds within each category. Such categories may have very little in common when all properties are considered. Differences between two similar dog breeds may be based on rare traits, not relevant to general classification. This makes identification of objects by their description quite difficult, and the problem of forming common-sense natural categories worth studying.

In this paper we focus on relatively simple data describing animals. First, this is a domain where everyone has a relatively good understanding of similarity and hierarchical description; second, there is a lot of structured information in Internet resources that may be used to create detailed descriptions of animals; third, one can test the quality of such data by playing word games. We shall look at a novel way of using principal component analysis (PCA) to create hierarchical descriptions, but many other choices and other knowledge domains (for example, automatic classification of library subjects) may be treated using a similar methodology.

B.-L. Lu, L. Zhang, and J. Kwok (Eds.): ICONIP 2011, Part II, LNCS 7063, pp. 726–734, 2011.

© Springer-Verlag Berlin Heidelberg 2011


2 The Data

The data used in the experiments has been obtained using automatic knowledge acquisition followed by corrections resulting from the 20-questions word game [1]. The point of this game is to guess the concept the opponent is thinking of by asking questions that should narrow down the set of likely concepts. In our implementation1 the program tries to make a guess by asking people questions. The results are used to correct lexical knowledge, and in its final stage a controlled dialog between human and computer, based on several plausible scenarios, is added to acquire additional knowledge. If the program wins, guessing the concept correctly, it strengthens the knowledge related to this concept. If it fails, the human is asked an additional question, "What did you think of?", and concepts related to the answer are added or features are modified according to the information harvested during the game.
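The paper does not specify the update rule in detail; a minimal sketch of one plausible realization, with an assumed concept-feature weight store, an assumed +1/-1 answer encoding, and a hypothetical learning rate, might look like this:

```python
from collections import defaultdict

# Hypothetical store: weights[concept][feature] -> association strength in [-1, 1]
weights = defaultdict(lambda: defaultdict(float))

def update_after_game(concept, answers, won, lr=0.1):
    """Move each concept-feature weight toward the yes/no answer (+1/-1)
    collected during one game; reinforce more strongly after a won game.
    The learning rate and the doubling on a win are assumptions, not the
    rule used by the authors' system."""
    step = lr * (2.0 if won else 1.0)
    for feature, answer in answers.items():
        w = weights[concept][feature]
        weights[concept][feature] = w + step * (answer - w)

update_after_game("dog", {"is-mammal": 1.0, "lay-egg": -1.0}, won=True)
print(weights["dog"]["is-mammal"])  # -> 0.2
```

Repeated games move each weight toward the consensus answer, which is how thousands of games can gradually correct the object-feature matrix.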

Fig. 1. Data used in the experiments visualized with Self-Organizing Map

Implementation of our knowledge acquisition system based on the 20-questions game uses a semantic memory model [2] to store lexical concepts. This approach makes it more versatile than using just a correlation matrix, as has been successfully done in another implementation of this word game2. The matrix stores correlations between objects and features using weights that describe mutual associations derived from thousands of games, providing a decomposition of each concept into a sum of contributions from questions. Such a representation is flat and does not treat lexical features as natural

1http://diodor.eti.pg.gda.pl

2http://www.20-q.net


language concepts that allow for the creation of a hierarchy of common-sense objects. Our program, based on the semantic memory representation, shows elementary linguistic competence, collecting common-sense knowledge in restricted domains [1], and the knowledge generated may be used in many ways, for example to generate word puzzles.

The lexical data in semantic memory may be reorganized in a way that introduces generalizations and increases cognitive economy [3]. The hierarchy is induced by searching for the directions with highest variance using PCA eigenvectors, separating subsets of concepts, and repeating the process to create consecutive subspaces. To illustrate and better understand this process a relatively small experiment has been performed.

Fig. 2. The data used in the experiments visualized with MDS

A test dataset with 84 concepts (animals, or in general some objects) described by 71 features has been constructed after performing 346 games. The dataset used in the experiments is displayed using Self-Organizing Map (SOM) [4] visualization in Fig. 1 and with parametric Multidimensional Scaling (MDS) [5] in Fig. 2. Distances between points that represent dissimilarities between animals are calculated using the cosine measure d(X, Z) = 1 − X · Z/(||X|| ||Z||).
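The cosine measure above is easy to sketch in code; the feature vectors below are hypothetical stand-ins for rows of the 84 × 71 object-feature matrix:

```python
import numpy as np

def cosine_dissimilarity(x, z):
    """d(X, Z) = 1 - X.Z / (||X|| ||Z||): 0 for identical directions,
    1 for vectors with no shared features."""
    return 1.0 - np.dot(x, z) / (np.linalg.norm(x) * np.linalg.norm(z))

# Hypothetical binary vectors over (lay-egg, has-wing, is-mammal, has-teeth)
sparrow = np.array([1.0, 1.0, 0.0, 0.0])
dog     = np.array([0.0, 0.0, 1.0, 1.0])
cat     = np.array([0.0, 0.0, 1.0, 1.0])

print(cosine_dissimilarity(dog, cat))      # identical feature sets -> ~0.0
print(cosine_dissimilarity(sparrow, dog))  # disjoint feature sets -> 1.0
```

For binary feature vectors this measure depends only on shared versus unshared features, which makes it a natural choice for the SOM and MDS visualizations.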

3 PCA Directions

Expert taxonomies are frequently based on a single feature, such as mammals, and then marsupials, but common-sense categorization is based on combinations of features that make objects similar. Principal Component Analysis [6] finds directions of highest data variance. Projecting the data on these directions shows interesting combinations of features and thus helps to select groups of correlated features that separate data points, creating subsets of animals.
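A minimal sketch of this step, assuming a binary animals × features matrix (the four-row toy matrix and feature names are illustrative, not the paper's data), computes the principal directions by SVD of the centered data:

```python
import numpy as np

def pca_directions(X):
    """Principal directions (rows), ordered by decreasing variance,
    obtained via SVD of the mean-centered object-feature matrix."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt

# Toy animals x features matrix (hypothetical subset of the 84 x 71 data)
features = ["lay-egg", "has-wing", "is-mammal", "has-teeth"]
X = np.array([[1, 1, 0, 0],   # sparrow
              [1, 1, 0, 0],   # pigeon
              [0, 0, 1, 1],   # dog
              [0, 0, 1, 1]], dtype=float)

pc1 = pca_directions(X)[0]
scores = (X - X.mean(axis=0)) @ pc1
print(scores)  # birds land at one end of the axis, mammals at the other
```

The coefficients of `pc1` play the role of the feature weights shown in Fig. 4: correlated features (here lay-egg with has-wing, is-mammal with has-teeth) receive coefficients of the same sign.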


Fig. 3. Dataset visualization using the two highest Principal Components

A pair of PCA directions may be used for visualization of the data. Projection on the first two directions with largest variance is shown in Fig. 3. The three visualizations (Figures 1, 2, 3) show different aspects of the data. Note for example the cluster formed in the SOM containing lion, panthera and tiger. MDS shows their similarity but can still distinguish between them, while in the PCA projection some objects appear between them (fox, bear) that do not fall into that cluster. PCA is able to find groups of related features and thus extract some common-sense knowledge, approximating meaningful directions in the feature space. At one end of an axis objects that have a mixture of features making them similar to each other are placed; at the other end are objects that do not have such features. Fig. 4 shows coefficients of features in our semantic space for the first six principal components. Each feature, such as lay-eggs or is-mammal, is placed above the line in one of the 6 columns, one for each component, to indicate the value of its coefficient in the PCA vector. The most important features (those having the highest absolute coefficient weights) in terms of data partitioning can be obtained from subsequent components. In the first vector (lowest row) the most negative (leftmost) coefficients correspond to the features lay-eggs and has-wings, describing insects and birds, while the most positive (rightmost) are for is-mammal, has-teeth, has-coat, is-warmblooded, and others typical for mammals. The second PCA component has most positive coefficients for has-beak, has-bill, has-feathers, is-bird, has-wings, indicating that this group of features is characteristic for birds.

Hierarchical clustering over such groups of features should show interesting common-sense clusters. In Fig. 5 a direct projection of all vectors describing animals on each of these principal directions is shown. These projections show different aspects of the data; for example, the projection on the second PCA component shows a clear cluster for birds, starting with swan and ending with owl as a less typical bird, while a third cluster starts with vulture and groups other hunting animals. Projection on each PCA component may be used to generate different partitions of all objects.
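Turning a one-dimensional projection into a partition amounts to cutting at the widest gap between sorted scores. A minimal sketch (the score values are made up for illustration):

```python
import numpy as np

def widest_gap_split(scores):
    """Split 1-D projection scores at the largest gap between consecutive
    sorted values; returns the threshold and the two index sets."""
    order = np.argsort(scores)
    s = scores[order]
    gaps = np.diff(s)
    i = int(np.argmax(gaps))                 # location of the widest margin
    threshold = (s[i] + s[i + 1]) / 2.0
    left = [int(j) for j in order[:i + 1]]
    right = [int(j) for j in order[i + 1:]]
    return threshold, left, right

# Hypothetical scores of 6 objects on one principal component
scores = np.array([-1.9, -2.1, -1.8, 1.2, 1.5, 0.9])
t, left, right = widest_gap_split(scores)
print(sorted(left), sorted(right))   # -> [0, 1, 2] [3, 4, 5]
```

Applying this cut to the second-component scores of Fig. 5 would separate the bird cluster from the remaining animals.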


Fig. 4. Groups of features related to the principal components

Fig. 5. Projections of the data on the first 6 principal components


4 Creating Hierarchical Partitioning

4.1 Hierarchical Agglomerative Partitioning

Creating a hierarchy based on similarity is one of the most effective ways of presenting large sets of concepts. Agglomerative clustering [7] is the approach most frequently used for showing hierarchical organization of data. The result of the bottom-up approach using average linkage between clusters on each hierarchy level is shown in Fig. 6.
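A sketch of this baseline using SciPy, assuming the cosine dissimilarity from Section 2 and average linkage (the four-row toy matrix is hypothetical, standing in for the 84 × 71 data behind Fig. 6):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Toy binary animals x features matrix (hypothetical rows and features)
X = np.array([[1, 1, 0, 0],   # sparrow
              [1, 1, 0, 0],   # pigeon
              [0, 0, 1, 1],   # dog
              [0, 1, 1, 1]])  # bat (mixed features)

# Bottom-up clustering: cosine dissimilarities, average linkage between clusters
Z = linkage(pdist(X, metric="cosine"), method="average")
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)  # two top-level clusters: {sparrow, pigeon} and {dog, bat}
```

The linkage matrix `Z` encodes the full merge tree, which is what a dendrogram such as Fig. 6 displays.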

Fig. 6. Dendrogram for the animal kingdom dataset

Hierarchical agglomerative clustering using the bottom-up approach binds together groups of objects in a way that frequently does not agree with intuitive partitioning. Moreover, the features used to construct a cluster are not easily traceable.

4.2 Hierarchical Partitioning with Principal Components

The distribution of the data points along the first 6 principal components (Fig. 5) shows a large gap between two groups projected on the second principal component. These two clusters of data are separated with the largest margin and thus should be meaningful. Hierarchical organization of the data can also be analyzed from the point of view of graph theory. In terms of graph bisection the second eigenvector is most important [8], allowing for creation of a normalized cut (a partition of the vertices of a graph into two disjoint subsets) [9]. Thus selecting the second principal component is a good start for constructing a hierarchical partitioning. Typical approaches to spectral clustering employ the second largest component [10] (which minimizes graph conductance) or the second smallest component [11] (due to the Rayleigh theorem).


Fig. 7. Projections of the reduced data set using succeeding single principal components

Analyzing subsequent PCA component projections (given in Figs. 5 and 7) shows that the second principal component does not always lead to the best cut in the graph. It is better to select the component that produces the widest separation margin within the data, choosing a different principal component for each hierarchy level. For creating the first hierarchy level the second component is selected, separating birds from other animals and creating one pure and one mixed cluster (Fig. 5). Features of the second PCA component (Fig. 4) with lowest and highest weights include: (-)climb, (-)cold-blooded and (+)beak, (+)feather, (+)bird, (+)wing, (+)warmblooded. Note that the single feature is-bird alone is sufficient to create this partitioning, but correlated features separate this cluster in a better way.

To capture some common-sense knowledge, hierarchical partitioning is created in a top-down way. Each of the newly created clusters is analyzed using PCA, and the principal component that gives the widest separation margin is selected for data partitioning. PCA is performed recursively on the reduced data that belong to the selected cluster. In Fig. 7 the first 6 components computed for the large mixed cluster (the one that does not contain birds), created on the second hierarchy level, are presented. This cluster has been formed after separating the birds from the other animals with the second component (shown in Fig. 5). Within this cluster the widest margin is created by the first component, and it separates mammals (with the exception of dolphins and whales) from other animals.
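The recursive procedure can be summarized in code. This is a minimal sketch under assumptions not fixed by the paper: splits are made at the widest gap among the projections on the first few principal components, and a hypothetical minimum cluster size serves as the stopping criterion:

```python
import numpy as np

def best_margin_component(X, n_pc=6):
    """Among the first n_pc principal components of X, pick the one whose
    1-D projection has the widest gap; return (gap, threshold, scores)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    best = None
    for k in range(min(n_pc, Vt.shape[0])):
        s = np.sort(Xc @ Vt[k])
        gaps = np.diff(s)
        i = int(np.argmax(gaps))
        if best is None or gaps[i] > best[0]:
            best = (gaps[i], (s[i] + s[i + 1]) / 2.0, Xc @ Vt[k])
    return best

def recursive_partition(X, idx=None, min_size=2):
    """Top-down hierarchy: split on the widest-margin PC, recurse on each part."""
    if idx is None:
        idx = np.arange(len(X))
    if len(idx) <= min_size:
        return list(idx)
    _, thr, scores = best_margin_component(X[idx])
    left, right = idx[scores <= thr], idx[scores > thr]
    if len(left) == 0 or len(right) == 0:   # no usable margin: stop here
        return list(idx)
    return [recursive_partition(X, left, min_size),
            recursive_partition(X, right, min_size)]

# Toy data: three pairs of identical rows standing in for three animal groups
X = np.array([[1, 1, 0, 0, 0],
              [1, 1, 0, 0, 0],
              [0, 0, 1, 1, 0],
              [0, 0, 1, 1, 0],
              [0, 0, 0, 1, 1],
              [0, 0, 0, 1, 1]], dtype=float)
print(recursive_partition(X))  # nested lists = the induced hierarchy
```

Each recursion re-runs PCA on the reduced data, so the separating feature combinations can differ at every hierarchy level, as seen when comparing Figs. 5 and 7.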


Fig. 8. Hierarchy of the data and features used to create it

Repeating the process described above, a hierarchical organization of the data is introduced. In Fig. 8 the top part of the created hierarchy is shown. At each level of the hierarchy the most important features used to create the partition are also displayed.

5 Discussion and Future Directions

An approach to creating hierarchical common-sense partitionings of data using recursive Principal Component Analysis has been presented. Results of this procedure have been illustrated on simple data describing animals, created using the 20-questions game that is based on a model of semantic memory [1]. This approach has been used for creating general clusters within the semantic memory model that stores natural language concepts. Such analysis allows for finding additional correlations between features, facilitating associative processes for existing concepts and improving the learning process when new information is added to the system. In the neurolinguistic approach to natural language processing [12] it has been conjectured that the right brain hemisphere creates receptive fields (called "cosets", or constraint-sets) that constrain semantic interpretation, although they do not have linguistic labels themselves. The process described


here may be an approximation of some of the neural processes responsible for language comprehension.

Hierarchical organization of lexical data has been created here in an unsupervised way by selecting linear combinations of features that provide a clear separation of concepts. An extension of this approach may be based on bi-clustering, taking into account clusters of features that are relevant for creating meaningful clusters of data. The main idea is to strengthen features that are correlated with the dominant one, or with features given by the user, who may want to view the data from a specific angle [13]. Non-negative matrix factorization [14] is another useful technique that may replace PCA.

Many other variants of unsupervised data analysis methods are worth exploring in the context of this approach to induction of the common-sense hierarchies in data.

Acknowledgements. The work has been supported by the Polish Ministry of Science and Higher Education under research grant N519 432 338.

References

1. Szymański, J., Duch, W.: Information retrieval with semantic memory model. Cognitive Systems Research (in print, 2011)

2. Tulving, E.: Episodic and semantic memory. Organization of Memory, 381–402 (1972)

3. Conrad, C.: Cognitive economy in semantic memory (1972)

4. Kohonen, T.: The self-organizing map. Proceedings of the IEEE 78, 1464–1480 (1990)

5. Shepard, R.: Multidimensional scaling, tree-fitting, and clustering. Science 210, 390 (1980)

6. Jolliffe, I.: Principal component analysis. Wiley Online Library (2002)

7. Day, W., Edelsbrunner, H.: Efficient algorithms for agglomerative hierarchical clustering methods. Journal of Classification 1, 7–24 (1984)

8. Rahimi, A., Recht, B.: Clustering with normalized cuts is clustering with a hyperplane. Statistical Learning in Computer Vision (2004)

9. Dhillon, I.: Co-clustering documents and words using bipartite spectral graph partitioning. In: Proceedings of the Seventh ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 269–274. ACM (2001)

10. Kannan, R., Vetta, A.: On clusterings: Good, bad and spectral. Journal of the ACM (JACM) 51, 497–515 (2004)

11. Shi, J., Malik, J.: Normalized cuts and image segmentation. IEEE Transactions on Pattern Analysis and Machine Intelligence 22, 888–905 (2000)

12. Duch, W., Matykiewicz, P., Pestian, J.: Neurolinguistic approach to natural language processing with applications to medical text analysis. Neural Networks 21(10), 1500–1510 (2008)

13. Szymański, J., Duch, W.: Dynamic Semantic Visual Information Management. In: Proceedings of the 9th International Conference on Information and Management Sciences, pp. 107–117 (2010)

14. Lee, D., Seung, S.: Learning the parts of objects by non-negative matrix factorization. Nature 401, 788–791 (1999)
