[image 04170] PRMU Dec. 2020 (12/17, Program)

Masashi Nishiyama nishiyama @ tottori-u.ac.jp
Tuesday, November 17, 2020, 19:23:58 JST


This is Nishiyama, assistant secretary of the IEICE Technical Committee on Pattern Recognition and Media Understanding (PRMU).
(Please excuse us if you receive this announcement more than once.)

The December 2020 PRMU meeting, on the theme of "Transfer Learning and Learning from Limited Data",
will be held online from Thursday, December 17 to Friday, December 18.

Those wishing to attend should register via the form below by December 16, the day before the meeting:
https://docs.google.com/forms/d/e/1FAIpQLScKzqCzkhXcYrCq5729-8i86_3xvIOsUVBYvnyd4Tn7bbuF3A/viewform

Instructions on how to join the online meeting will be sent out by the day before the event.
Note that the participation fee is the same as for a regular on-site meeting.

========================================================================

★ Technical Committee on Pattern Recognition and Media Understanding (PRMU)
Chair: 佐藤 洋一 (Univ. of Tokyo)    Vice Chairs: 木村 昭悟 (NTT), 岩村 雅一 (Osaka Pref. Univ.)
Secretaries: 内田 祐介 (Mobility Technologies), 山下 隆義 (Chubu Univ.)
Assistant Secretaries: 柴田 剛志 (NTT), 西山 正志 (Tottori Univ.)

Dates: Thursday, December 17, 2020, 10:00-18:00
       Friday, December 18, 2020, 10:00-18:00

Venue: Online

Topic: Transfer Learning and Learning from Limited Data

Thursday, December 17, Morning: Session 1 (10:00-12:00)

(1) 10:00 - 10:15
Inter-intra Contrastive Framework for Self-supervised Spatio-temporal Learning
○Li Tao・Xueting Wang・Toshihiko Yamasaki (UTokyo)

(2) 10:15 - 10:30
Synthesis of anime-style talking-head video via the real-face image domain
○藤内 俊・八馬 遼・長谷川邦洋・斎藤英雄 (Keio Univ.)

(3) 10:30 - 10:45
Simultaneous learning of object foreground, pose, and class using only class-level supervision
○米田駿介 (Tottori Univ.)・入江 豪 (NTT)・西山正志・岩井儀雄 (Tottori Univ.)

(4) 10:45 - 11:00
Simultaneous estimation of pose and person regions for person tracking
○渡邉和彦・和田俊和 (Wakayama Univ.)

(5) 11:00 - 11:15
A fast algorithm for low-rank tensor completion in high-order delay-embedding spaces
○山本龍宜・横田達也 (Nagoya Inst. of Tech.)・今倉 暁 (Univ. of Tsukuba)・本谷秀堅 (Nagoya Inst. of Tech.)

(6) 11:15 - 11:30
Data Augmentation for Sentiment Analysis Using Supervised Learning Sentence Compression Based SeqGAN
○Jiawei Luo・Mondher Bouazizi・Tomoaki Ohtsuki (Keio Univ.)

(7) 11:30 - 12:00
Individual discussions

This is a discussion-oriented session, one of PRMU's initiatives, based on the view that the technical meeting should be a forum for discussing ideas. In this session style introduced by PRMU, the presenters first give their 15-minute talks back to back, and the remaining time is then used for individual, parallel discussions with each presenter, so that ideas can be discussed in depth. We look forward to your active participation.

  Break (90 min)

Thursday, December 17, Afternoon: Invited Talk 1 (13:30-14:30)

(8) 13:30 - 14:30
[Invited Talk] Advances and applications of transfer learning: from basic concepts to meta-learning and continual learning
○松井孝太 (Nagoya Univ.)

  Break (10 min)

Thursday, December 17, Afternoon: Session 2 (14:40-16:10)

(9) 14:40 - 14:55
Belonging Network
-- Few-shot One-class Image Classification for Classes with Various Distributions --
○Takumi Ohkuma・Hideki Nakayama (UT)

(10) 14:55 - 15:10
Improving the accuracy of unsupervised segmentation with a Laplacian-filter-based loss function -- adaptation to automotive wire harness components --
○松本悠希 (Sumitomo Electric)

(11) 15:10 - 15:25
Hierarchical Contrastive Adaptation for Cross-Domain Object Detection
○Ziwei Deng・Quan Kong・Naoto Akira・Tomoaki Yoshinaga (Hitachi)

(12) 15:25 - 15:40
A study of an automated visual inspection system trained on a small number of anomaly samples with DevNet
○北口勝久・西﨑陽平・齋藤 守 (ORIST)

(13) 15:40 - 16:10
Individual discussions

This is a discussion-oriented session, one of PRMU's initiatives, based on the view that the technical meeting should be a forum for discussing ideas. In this session style introduced by PRMU, the presenters first give their 15-minute talks back to back, and the remaining time is then used for individual, parallel discussions with each presenter, so that ideas can be discussed in depth. We look forward to your active participation.

  Break (10 min)

Thursday, December 17, Afternoon: Session 3 (16:20-18:00)

(14) 16:20 - 16:30
[Short Paper] Few-shot continual learning with supervised autoencoders
○髙山啓太・宇都有昭・篠田浩一 (Tokyo Tech)

(15) 16:30 - 16:45
Towards Discovery of Relevant Latent Factors with Limited Data
○Mohit Chhabra・Quan Kong・Tomoaki Yoshinaga (Hitachi)

(16) 16:45 - 17:00
Vehicle region estimation in nighttime in-vehicle camera images by visualizing the decision evidence of deep learning
○大薮達也・大橋剛介 (Shizuoka Univ.)

(17) 17:00 - 17:15
A method for learning ambiguous boundaries of lesion regions in endoscopic images
○河内祐太 (Univ. of Tsukuba/AIST)・野里博和 (AIST)・池田篤史 (Univ. of Tsukuba Hospital)・坂無英徳 (AIST)

(18) 17:15 - 17:30
Transfer learning from sparse models
○酒井智弥・山田頼弥・石橋諒士・高田寛之 (Nagasaki Univ.)

(19) 17:30 - 18:00
Individual discussions

This is a discussion-oriented session, one of PRMU's initiatives, based on the view that the technical meeting should be a forum for discussing ideas. In this session style introduced by PRMU, the presenters first give their 15-minute talks back to back, and the remaining time is then used for individual, parallel discussions with each presenter, so that ideas can be discussed in depth. We look forward to your active participation.

Friday, December 18, Morning: Session 4 (10:00-12:00)

(20) 10:00 - 10:15
Report on the MIRU2020 Young Researchers Program
○内海ゆづ子 (Osaka Pref. Univ.)・岩口尭史 (Kyushu Univ.)・汪 雪? (Univ. of Tokyo)・菅沼雅徳 (Tohoku Univ./RIKEN)・
西村真衣 (Kyoto Univ./OSX)・原 健翔 (AIST)・平川 翼 (Chubu Univ.)・福井 宏 (NEC)

(21) 10:15 - 10:30
A confidence-aware corner point extraction method for homography transformation of planar objects
○藤浦弘也・和田俊和 (Wakayama Univ.)

(22) 10:30 - 10:45
A study on detecting European pear flower clusters using deep learning and branch region extraction
○青木俊介・山﨑達也 (Niigata Univ.)

(23) 10:45 - 11:00
Local feature extraction with CNN and 2D BLSTM for handwritten mathematical expression recognition
○森住 啓・Cuong Tuan Nguyen・清水郁子・中川正樹 (Tokyo Univ. of Agri. and Tech.)

(24) 11:00 - 11:15
Depth image prediction of transparent objects based on image translation
○飯森 亮 (Kanazawa Inst. of Tech.)・久保田涼介・小暮 潔 (Kanazawa Inst. of Tech.)

(25) 11:15 - 11:30
$B65;U$"$j3X=,$K$h$k2r$-$[$0$5$l$?FCD'I=8=$N3X=,(B $B!A(B $BJ,N`4o$rMQ$$$?FCD'I=8=$N2r$-$[$0$7(B $B!A(B
○黒田修二郎・和田俊和 (Wakayama Univ.)

(26) 11:30 - 12:00
Individual discussions

This is a discussion-oriented session, one of PRMU's initiatives, based on the view that the technical meeting should be a forum for discussing ideas. In this session style introduced by PRMU, the presenters first give their 15-minute talks back to back, and the remaining time is then used for individual, parallel discussions with each presenter, so that ideas can be discussed in depth. We look forward to your active participation.

  Break (75 min)

Friday, December 18, Afternoon: Invited Talk 2 (13:15-14:15)

(27) 13:15 - 14:15
[Invited Talk] The forefront of few-sample learning techniques for overcoming the shortage of training data
○野中雄一 (Hitachi)

  Break (10 min)

Friday, December 18, Afternoon: Session 5 (14:25-16:10)

(28) 14:25 - 14:40
A zero-shot generative model that accounts for the uncertainty of attribute information
○阪井優太 (Waseda Univ.)・三川健太 (Shonan Inst. of Tech.)・後藤正幸 (Waseda Univ.)

(29) 14:40 - 14:55
Building an SSD model with Feature Contraction and Rand Augment for small training datasets
○尾澤知憲・松本悠希・三浦勝司 (Sumitomo Electric)・奥野拓也

(30) 14:55 - 15:10
Regularization using knowledge distillation for learning from small data
○東 遼太・和田俊和 (Wakayama Univ.)

(31) 15:10 - 15:25
A Hybrid Sampling Strategy for Improving the Accuracy of Image Classification with less Data
○Ruiyun Zhu・Fumihiko Ino (Osaka Univ.)

(32) 15:25 - 15:40
Multi-Task Attention Learning for Fine-grained Recognition
○Dichao Liu (NU)・Yu Wang (Rits)・Kenji Mase (NU)・Jien Kato (Rits)

(33) 15:40 - 16:10
Individual discussions

This is a discussion-oriented session, one of PRMU's initiatives, based on the view that the technical meeting should be a forum for discussing ideas. In this session style introduced by PRMU, the presenters first give their 15-minute talks back to back, and the remaining time is then used for individual, parallel discussions with each presenter, so that ideas can be discussed in depth. We look forward to your active participation.

  Break (10 min)

Friday, December 18, Afternoon: Session 6 (16:20-18:00)

(34) 16:20 - 16:30
[Short Paper] Case Discrimination: Self-supervised Learning for classification of Medical Image
○Haohua Dong・Yutaro Iwamoto (Ritsumeikan Univ.)・Xianhua Han (Yamaguchi Univ.)・Lanfen Lin (Zhejiang Univ.)・
Hongjie Hu・Xiujun Cai (Sir Run Run Shaw Hospital)・Yen-Wei Chen (Ritsumeikan Univ.)

(35) 16:30 - 16:45
Estimating 3D regions suitable for object grasping
○塚本淳基・小暮 潔 (Kanazawa Inst. of Tech.)

(36) 16:45 - 17:00
Analysis of the contribution and likelihood of intermediate layers in deep perceptrons based on unit initialization
○久保田祥平・早志英朗 (Kyushu Univ.)・早瀬友裕 (Fujitsu Labs.)・内田誠一 (Kyushu Univ.)

(37) 17:00 - 17:15
A method for evaluating the characteristics of region-extraction AI based on variations in contribution patterns under added noise
○森 靖英・浜 直史・恵木正史 (Hitachi)

(38) 17:15 - 17:30
Rethinking the local similarity in content-based image retrieval
○Longjiao Zhao (Nagoya Univ.)・Yu Wang (Ritsumeikan Univ.)・Yoshiharu Ishikawa (Nagoya Univ.)・Jien Kato (Ritsumeikan Univ.)

(39) 17:30 - 18:00
Individual discussions

This is a discussion-oriented session, one of PRMU's initiatives, based on the view that the technical meeting should be a forum for discussing ideas. In this session style introduced by PRMU, the presenters first give their 15-minute talks back to back, and the remaining time is then used for individual, parallel discussions with each presenter, so that ideas can be discussed in depth. We look forward to your active participation.

[Contact]
PRMU Technical Committee Secretaries: prmu-organizer @ mail.ieice.org

--
*****************************************************
 Masashi Nishiyama

Graduate School of Engineering, Tottori University
Department of Information and Electronics, Course of Intelligent Information Engineering
Address : 4-101 Koyama-cho Minami, Tottori, 680-8550 Japan
TEL : (0857)31-6083
E-mail : nishiyama @ tottori-u.ac.jp
*****************************************************



