A Roundup of 204 Papers on Time Series Models! Covering the Latest Research on Time Series Forecasting, Time Series Transformers, Time Series Foundation Models, and More
About 8,740 words · 18-minute read
2024-05-22 16:16
Spatio-temporal forecasting has become a new hotspot: TimeGPT, the first foundation model for time series forecasting, has sparked heated discussion in the industry, while Transformer + time series and diffusion models + time series are strong "seed" contenders for new top-conference directions. Time series combined with other directions is shaping up as the dark horse the AI community is watching.
This post compiles 204 recent papers across four directions: time series forecasting, time series Transformers, time series foundation models, and time series diffusion models.
Scan the QR code and reply "时序" (time series) to get the collection of 204 papers.
ICLR 2024
ClimODE: Climate Forecasting With Physics-informed Neural ODEs
AAAI 2024
MSGNet: Learning Multi-Scale Inter-Series Correlations for Multivariate Time Series Forecasting
NeurIPS 2023
Frequency-domain MLPs are More Effective Learners in Time Series Forecasting
ICML 2023
Learning Deep Time-index Models for Time Series Forecasting
KDD 2023
TSMixer: Lightweight MLP-Mixer Model for Multivariate Time Series Forecasting
Due to space constraints, only the first 5 papers are shown.
Scan the QR code and reply "时序" (time series) to get the collection of 204 papers.
1. iTransformer: Inverted Transformers Are Effective for Time Series Forecasting
2. Pathformer: Multi-Scale Transformers with Adaptive Pathways for Time Series Forecasting
3. Scaleformer: Iterative Multi-scale Refining Transformers for Time Series Forecasting
4. InParformer: Evolutionary Decomposition Transformers with Interactive Parallel Attention for Long-Term Time Series Forecasting
5. ContiFormer: Continuous-Time Transformer for Irregular Time Series Modeling
Due to space constraints, only the first 5 papers are shown.
Scan the QR code and reply "时序" (time series) to get the collection of 204 papers.
Convolutional neural network methods (4 algorithm models)
1. CNN
Recent advances in convolutional neural networks
2. WaveNet-CNN
Conditional time series forecasting with convolutional neural networks
3. K-means-CNN
Short-term load forecasting in smart grid: a combined CNN and K-means clustering approach
4. TCN (a minimal causal-convolution sketch follows this list)
An empirical evaluation of generic convolutional and recurrent networks for sequence modeling
Due to space constraints, only the first 4 papers are shown.
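To make the TCN entry concrete, here is a minimal sketch of a causal, dilated 1-D convolution block in PyTorch. The class name, layer sizes, and hyperparameters are illustrative assumptions, not code from the cited paper.

```python
# Minimal sketch of a causal, dilated 1-D convolution block in the spirit of TCN.
# All names and hyperparameters are illustrative; this is not code from the cited paper.
import torch
import torch.nn as nn

class CausalConvBlock(nn.Module):
    def __init__(self, channels: int, kernel_size: int = 3, dilation: int = 1):
        super().__init__()
        # Left-pad so the convolution never looks at future time steps (causality).
        self.pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)
        self.relu = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time)
        out = nn.functional.pad(x, (self.pad, 0))   # pad only on the left
        out = self.relu(self.conv(out))
        return out + x                              # residual connection

# Stacking blocks with exponentially growing dilation enlarges the receptive field,
# which is what lets TCN-style models cover long histories with few layers.
model = nn.Sequential(*[CausalConvBlock(16, dilation=2 ** i) for i in range(4)])
y = model(torch.randn(8, 16, 96))                   # 8 series, 16 channels, 96 steps
```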
Recurrent neural network methods (3 algorithm models)
1. RNN
Bidirectional recurrent neural networks
2. LSTM (Long Short-Term Memory; see the forecasting sketch after this list)
Long short-term memory
3. GRU (Gated Recurrent Unit)
Learning phrase representations using RNN encoder-decoder for statistical machine translation
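As a companion to the LSTM entry, the following is a minimal one-step-ahead LSTM forecaster in PyTorch. The class name, layer sizes, and tensor shapes are illustrative assumptions and are not tied to any of the cited papers.

```python
# Minimal sketch of a one-step-ahead LSTM forecaster; purely illustrative.
import torch
import torch.nn as nn

class LSTMForecaster(nn.Module):
    def __init__(self, n_features: int = 1, hidden: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_features)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, features); predict the next value from the last hidden state
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])

model = LSTMForecaster()
history = torch.randn(4, 24, 1)      # 4 series, 24 past steps, 1 feature each
next_step = model(history)           # (4, 1) one-step-ahead forecast
```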
Transformer methods (11 algorithm models)
1. Transformer
Attention-based models for speech recognition
2. BERT
BERT: pre-training of deep bidirectional transformers for language understanding
3. AST
Adversarial sparse transformer for time series forecasting
4. Informer
Informer: beyond efficient transformer for long sequence time-series forecasting
Due to space constraints, only a selection is shown.
Scan the QR code and reply "时序" (time series) to get the collection of 204 papers.
Large language models for time series
1. Prompt-based methods
Leveraging Language Foundation Models for Human Mobility Forecasting
2. Discretizing time series (see the tokenization sketch after this list)
AudioLM: a Language Modeling Approach to Audio Generation
3. Time series–text alignment (representative paper)
Open Vocabulary Electroencephalography-To-Text Decoding and Zero-shot Sentiment Classification
4. Incorporating visual information
Leveraging Vision-Language Models for Granular Market Change Prediction
5. Large models as tools
Unleashing the Power of Shared Label Structures for Human Activity Recognition
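To illustrate the "discretizing time series" idea in general terms (not the specific tokenizer of AudioLM or any paper listed above), here is a minimal NumPy sketch that quantizes a continuous series into a small vocabulary of bin tokens that a language model could consume; all function names and the bin count are assumptions made for illustration.

```python
# Minimal sketch: turn a continuous series into discrete tokens via uniform binning.
# This shows the general "discretize, then model with an LM" recipe only; it is not
# the tokenizer of AudioLM or any paper listed above.
import numpy as np

def series_to_tokens(series: np.ndarray, n_bins: int = 256) -> np.ndarray:
    lo, hi = series.min(), series.max()
    edges = np.linspace(lo, hi, n_bins + 1)
    # np.digitize maps each value to a bin index in [1, n_bins]; shift to [0, n_bins - 1]
    return np.clip(np.digitize(series, edges) - 1, 0, n_bins - 1)

def tokens_to_series(tokens: np.ndarray, lo: float, hi: float, n_bins: int = 256) -> np.ndarray:
    # Decode each token back to the center of its bin (lossy reconstruction).
    width = (hi - lo) / n_bins
    return lo + (tokens + 0.5) * width

x = np.sin(np.linspace(0, 6 * np.pi, 100))
tokens = series_to_tokens(x)                       # integer token ids, usable as LM inputs
x_hat = tokens_to_series(tokens, x.min(), x.max()) # approximate reconstruction
```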
Training foundation models for the time series domain
TimeGPT-1
Scan the QR code and reply "时序" (time series) to get the collection of 204 papers.
With time series research running this hot, we have invited instructor chichi, a reviewer for top conferences, to walk through the current state and recent hot topics in spatio-temporal / time series forecasting.
Course outline:
- GNN-based spatio-temporal forecasting models
Classic GNN spatio-temporal forecasting algorithms (e.g., STGCN, Graph WaveNet)
GNN spatio-temporal forecasting algorithms from the past two years
Recent research hotspots in GNN spatio-temporal forecasting
- Transformer-based time series forecasting models
Classic Transformer time series forecasting algorithms (e.g., Autoformer, FEDformer)
Transformer time series forecasting algorithms from the past two years
Recent research hotspots in Transformer time series forecasting
- Applications of LLMs to spatio-temporal / time series forecasting
Fine-tuning LLMs for spatio-temporal and time series forecasting
Language-enhanced time series / spatio-temporal forecasting models
Future challenges and research directions for LLM-based time series forecasting
- Summary of new time series hotspots and applications
Scan the QR code to unlock
the live course on the latest time series hotspots
In addition, we have prepared a 32-lesson time series course series, organized into five modules.
Module 1, Fundamentals: Understanding the data scientist role
Module 2, Advanced: Introduction to time series forecasting + paper and code walkthroughs
Module 3, Project practice: Hands-on time series projects on stock prices and retail data
Module 4, Competition practice: Data science beginner competition + answer-correctness prediction competition
Module 5, Frontiers: Cutting-edge machine learning for time series + time series analysis tasks and the latest application scenarios
Unlock the Time Series Course Series for 0.01 CNY
32 lessons + 37 hours + selected slides + selected assignments and code
Module 1: Learn what a data scientist is, including job distribution, responsibilities, skills, salary, and career paths
— Duration: 1 hour
Module 2:
1: Introduction to time series forecasting
2: Hands-on coding for time series forecasting
3: Paper and code walkthrough of Forecasting at Scale
— Duration: 6 hours
Module 3:
1: Solidify algorithm fundamentals, understand the underlying principles, and solve problems through hands-on coding
2: Systematically master time series analysis and forecasting methods
— Duration: 8 hours
Module 4:
1: Data science beginner competition
2: Answer-correctness prediction competition
— Duration: 20 hours
Module 5:
Cutting-edge machine learning for time series
Time series analysis tasks and the latest application scenarios
— Duration: 2 hours
This updated time series course runs 37 hours across 32 lessons and can be unlocked for 0.01 CNY.
Unlock the Time Series Course Series for 0.01 CNY
32 lessons + 37 hours + selected slides + selected assignments and code
