KDST

Tags

  • pruning
  • 강화학습 (Reinforcement Learning)
  • cnn
  • Knowledge Distillation
  • ICCV
  • bearing fault detection
  • bearing fault diagnosis
  • autoencoder
  • anomaly detection
  • Reinforcement Learning
  • gan
  • DQN
  • NaturalInversion
  • 2022-AAAI
  • Data-Free Image Synthesis
  • 2022AAAI
  • AAAI 2021
  • Out of Distribution
  • DecAug
  • Manifold Regularized Dynamic Network Pruning
  • Dynamic Pruning
  • CVPR 2021
  • AdaShare
  • Neural Architecture Search
  • Multi-Task Learning
  • neuron merging
  • Neural network inversion
  • AdaptiveDeepInversion
  • Dreaming to Distill
  • CVPR 2020
  • Data-free
  • DeepInversion
  • generative model
  • Generative adversarial networks
  • auto-encoders
  • tensor decomposition
  • Episodic Backward Update
  • reinforcement learning
  • Normalized Sparse Autoencoder
  • VQA
  • semi-supervised
  • SinGAN
  • ICCV 2019
  • TICNN
  • Network Slimming
  • EfficientNet
  • AnoGAN
  • AnomalyDetection
  • NeurIPS
  • Q-learning
  • deeplearning
  • Convolutional Neural network
  • MTL
  • Computer Vision
  • OOD
  • embedded
  • IID
  • Nas
  • 프레임워크 (Framework)
  • rkd

KDST (KIST Data Science Team) is a team that researches a range of topics related to data and intelligence, including machine learning.

Recent Posts

  • NaturalInversion: Data-Free Image Sy⋯.
  • Digital Health Hackathon 2021 Grand Prize (⋯.
  • DecAug: Out-of-Distribution Generali⋯.
  • Revisiting Knowledge Distillation: A⋯.
  • Manifold Regularized Dynamic Network⋯.

Archives

  • 2021/12 (1)
  • 2021/10 (1)
  • 2021/09 (1)
  • 2021/08 (1)

Life Blog is powered by Daum & Tattertools. Designed by Tistory.