[image 00354] Resend: Announcement of the tutorial on Wed., Nov. 13

Akihiro Sugimoto sugimoto @ nii.ac.jp
Tue, 12 Nov 2013 22:49:05 JST


The tutorial announced earlier takes place tomorrow, so I am sending
this reminder.

Sugimoto

On Sun, 03 Nov 2013 02:49:00 +0900
Akihiro Sugimoto <sugimoto @ nii.ac.jp> wrote:

> Dear all,
> 
> We will hold a tutorial on pattern recognition as scheduled below, so
> please feel free to join. No pre-registration is required and
> admission is free.
> 
> Sugimoto
> ------------------
> Date:   Wednesday, 13 November 2013, from 13:30
> Venue:  National Institute of Informatics, Room 2004 (20F)
>    (Hitotsubashi, Chiyoda-ku, Tokyo)
>             http://www.nii.ac.jp/about/access/
> Lecturer: 
>    Vaclav Hlavac
>     Professor
>     Czech Technical University in Prague, Czech Republic
>    http://cmp.felk.cvut.cz/~hlavac
> 
> Title: The Appetizer to Pattern Recognition, a tutorial
> 
> Introduction:
> 
> As a visiting professor at NII, I agreed with my NII host Prof.
> Akihiro Sugimoto that it would be worthwhile to give a tutorial on the
> basics of pattern recognition (also called machine learning). The tutorial
> consists of two (optionally three) lectures of 60 minutes each. No special
> foreknowledge is expected, and the topics should be practically useful. The
> tutorial will be given in a single afternoon so that the audience can use
> their time efficiently.
> 
> Tutorial 1: Informal introduction to pattern recognition
> 
> The core of pattern recognition (machine learning) is to learn a decision
> rule (a classifier) from empirical experience, i.e. from observed
> examples. This approach will be motivated and put into a wider context, and
> the need for a statistical approach will be explained. The classifier will
> be introduced informally first. The Bayesian formulation nicely unifies
> various approaches; based on it, it will be explained which tasks can be
> solved within pattern recognition and which cannot. In this way the
> territory of pattern recognition will be outlined informally.
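As a taste of the Bayesian decision rule the abstract refers to, here is a minimal sketch: the two classes, their priors, and the Gaussian class-conditional densities are illustrative assumptions, not material from the tutorial itself.

```python
import math

# Illustrative assumptions: two classes with known priors and
# Gaussian class-conditional densities (mean, std).
PRIORS = {"A": 0.6, "B": 0.4}
PARAMS = {"A": (0.0, 1.0), "B": (3.0, 1.0)}

def gaussian_pdf(x, mean, std):
    """Density of N(mean, std^2) at x."""
    z = (x - mean) / std
    return math.exp(-0.5 * z * z) / (std * math.sqrt(2 * math.pi))

def classify(x):
    """Bayesian decision rule: pick the class maximizing
    prior * likelihood (the posterior up to a normalizing constant)."""
    return max(PRIORS, key=lambda c: PRIORS[c] * gaussian_pdf(x, *PARAMS[c]))

print(classify(0.5))  # A (close to class A's mean)
print(classify(2.8))  # B (close to class B's mean)
```

In practice the priors and densities are unknown and must themselves be estimated from examples, which is exactly what makes the statistical approach necessary.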
> 
> Tutorial 2: Evaluation of the classifier performance
> 
> Classifiers (decision rules) are often learned empirically from examples,
> either in a supervised manner from labeled examples provided by a
> "teacher" or in an unsupervised manner. Natural questions arise: Can the
> classifier performance be evaluated empirically? Is the classifier able to
> generalize, i.e. will it perform well on data unseen during learning? Is it
> accurate enough? An established methodology stemming from statistical
> hypothesis testing is used in this context. It will be explained together
> with concepts such as the confusion matrix and the ROC (Receiver Operating
> Characteristic) curve.
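The two evaluation concepts named above can be sketched in a few lines; the toy label vectors below are assumptions for illustration only.

```python
def confusion_matrix(y_true, y_pred):
    """Binary confusion matrix as counts
    (true positives, false positives, false negatives, true negatives)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return tp, fp, fn, tn

def roc_point(tp, fp, fn, tn):
    """One point on the ROC curve: (false positive rate, true positive rate)."""
    return fp / (fp + tn), tp / (tp + fn)

# Toy evaluation data (assumed for illustration).
y_true = [1, 1, 1, 0, 0, 0, 0, 1]
y_pred = [1, 0, 1, 0, 1, 0, 0, 1]
cm = confusion_matrix(y_true, y_pred)
print(cm)              # (3, 1, 1, 3)
print(roc_point(*cm))  # (0.25, 0.75)
```

Sweeping a decision threshold of a scoring classifier and recording one such (FPR, TPR) point per threshold traces out the full ROC curve.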
> 
> Tutorial 3: Learning formulated as an optimization task, several optimality
> criteria
> (optional, only if the audience of the previous two tutorials wants it)
> 
> Learning has both an intuitive, almost fairy-tale facet and a rigorous
> mathematical formulation as an optimization task. The tutorial will discuss
> the relation between these two limiting standpoints. Statistical learning
> will be presented in the Bayesian framework, in which the optimally learned
> classifier minimizes the Bayesian risk. The trouble is that the empirical
> data provided by a training set do not provide the needed information. Four
> surrogate optimization criteria will be introduced and discussed. This
> approach provides useful insight into classifier learning.
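The gap the abstract points at, that the Bayesian risk needs the true distribution while a training set supplies only samples, motivates surrogates such as the empirical risk. A minimal sketch, assuming a 1-D threshold classifier and toy training data (both illustrative, not from the tutorial):

```python
def empirical_risk(threshold, samples):
    """Empirical risk: fraction of training examples (x, label)
    misclassified by the rule 'predict 1 iff x >= threshold'."""
    errors = sum(1 for x, y in samples if (x >= threshold) != (y == 1))
    return errors / len(samples)

def learn_threshold(samples):
    """Surrogate learning: pick the candidate threshold minimizing the
    empirical risk instead of the (unknowable) Bayesian risk."""
    candidates = sorted(x for x, _ in samples)
    return min(candidates, key=lambda t: empirical_risk(t, samples))

# Toy 1-D training set (assumed for illustration).
data = [(0.2, 0), (0.8, 0), (1.1, 0), (1.9, 1), (2.4, 1), (3.0, 1)]
t = learn_threshold(data)
print(t, empirical_risk(t, data))  # 1.9 0.0
```

Zero empirical risk on the training set does not imply zero Bayesian risk on unseen data, which is why further criteria (e.g. regularized or margin-based ones) enter the picture.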
> 
> ----------
> 
> _______________________________________________
> image mailing list
> image @ imageforum.org
> http://www.imageforum.org/mailman/listinfo/image
