Introductory psychology and neuroscience reading:
The Psychology Book: Big Ideas Simply Explained
Psychology: Everything You Need to Know to Master the Subject
Psych 101
Neuroscience (Dale Purves): covers the fundamentals of every area of neuroscience, how signals arise and how they affect function (memory, sleep, normal behavior, emotion, etc.)
Neuroscience: Exploring the Brain (Mark F. Bear): explains behaviors simply and directly in neuroscience terms
Principles of Neural Science: the bible for getting into neuroscience, comprehensive and deep
Google 2020 blog posts on machine learning:
Data mining:
Large Scale Parallel Document Mining for Machine Translation
Effective Parallel Corpus Mining using Bilingual Sentence Embeddings
Denoising Neural Machine Translation Training with Trusted Data and Online Data Selection (denoising)
Models:
Learning a Multi-Domain Curriculum for Neural Machine Translation
tz November 1st, 2021 at 11:08 am
The weather is slowly turning cold.
tz October 26th, 2021 at 11:16 am
For character-level machine translation, adding a one-dimensional convolution layer on top of the embeddings gives noticeably better results (see "Why don't people use character-level machine translation?").
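As a minimal sketch of that idea (all shapes, names, and weights here are my own illustrative assumptions, not taken from the paper), a same-padded 1-D convolution over character embeddings might look like:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for a character-level NMT encoder input.
vocab_size, emb_dim, kernel_size, seq_len = 64, 8, 3, 10

embedding = rng.normal(size=(vocab_size, emb_dim))
# One filter per output channel, spanning kernel_size positions:
# shape (out_channels, kernel_size, in_channels).
filters = rng.normal(size=(emb_dim, kernel_size, emb_dim))

def char_conv_embed(char_ids):
    """Embed characters, then apply a same-padded 1-D convolution + ReLU."""
    x = embedding[char_ids]                  # (seq_len, emb_dim)
    pad = kernel_size // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))     # same-length output
    out = np.empty((len(char_ids), emb_dim))
    for t in range(len(char_ids)):
        window = xp[t:t + kernel_size]       # (kernel_size, emb_dim)
        out[t] = np.einsum('ki,oki->o', window, filters)
    return np.maximum(out, 0.0)              # ReLU

ids = rng.integers(0, vocab_size, size=seq_len)
out = char_conv_embed(ids)
print(out.shape)  # (10, 8)
```

The convolved output would then feed the encoder in place of raw character embeddings, letting each position see a small window of neighboring characters.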
tz October 22nd, 2021 at 06:19 pm
Paper list:
Training Tips for the Transformer Model
Semantics-aware Attention Improves Neural Machine Translation
Why don't people use character-level machine translation?
MSP: Multi-Stage Prompting for Making Pre-trained Language Models Better Translators
Well-classified Examples are Underestimated in Classification with Deep Neural Networks
tz October 21st, 2021 at 04:51 pm
Every major paradigm shift in linguistics has largely been triggered by a change in how the field views its basic data.
tz October 13th, 2021 at 04:17 pm
Tested today: across most of the sentences I tried, DeepL produces fluent translations with very good word order.
tz October 13th, 2021 at 11:18 am
in a couple of days
tz October 13th, 2021 at 11:18 am
How can machine translation handle numbers that need exact versus fuzzy treatment, e.g. the figurative number in the idiom 千钧一发 versus a literal value like 123.1?
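One common workaround for the exact case (my own hedged sketch, not something proposed in this note): protect literal numbers with placeholders before translation and restore them afterwards, while figurative numbers inside idioms are left for the model to render non-literally.

```python
import re

# Matches literal integers and decimals such as "123.1".
NUM = re.compile(r'\d+(?:\.\d+)?')

def protect(text):
    """Replace literal numbers with a placeholder; remember the originals."""
    nums = NUM.findall(text)
    masked = NUM.sub('<NUM>', text)
    return masked, nums

def restore(translated, nums):
    """Put the original numbers back, in order of appearance."""
    for n in nums:
        translated = translated.replace('<NUM>', n, 1)
    return translated

masked, nums = protect('重量为123.1千克')
print(masked)                                    # 重量为<NUM>千克
# ... the masked text would go through the MT system here ...
print(restore('The weight is <NUM> kg', nums))   # The weight is 123.1 kg
```

This guarantees digit-exact output for values like 123.1; idioms contain no placeholder, so the model still translates them freely.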
tz October 9th, 2021 at 05:33 pm
AAAI 2018, Translating Pro-Drop Languages with Reconstruction Models: for spoken-language text where pronouns are dropped, the paper introduces a reconstructor (similar to an auto-encoder) that regenerates the source sentence, and a reconstructor is also added on the decoder side.
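The training objective behind such models can be sketched as a translation loss plus a weighted reconstruction loss. A toy illustration with made-up token distributions (the weight lam and all numbers are my assumptions, not the paper's values):

```python
import numpy as np

def cross_entropy(probs, target_ids):
    """Mean negative log-likelihood of the target token ids."""
    return -np.mean(np.log([p[t] for p, t in zip(probs, target_ids)]))

# Predicted distributions over a 3-token vocabulary for 3 positions:
trans_probs = np.array([[0.7, 0.2, 0.1],    # translation direction
                        [0.1, 0.8, 0.1],
                        [0.2, 0.2, 0.6]])
rec_probs = np.array([[0.6, 0.3, 0.1],      # reconstructing the source
                      [0.2, 0.7, 0.1],
                      [0.1, 0.3, 0.6]])

target_ids = [0, 1, 2]   # gold target tokens
source_ids = [0, 1, 2]   # source tokens the reconstructor must regenerate

lam = 1.0  # weight of the reconstruction term
loss = (cross_entropy(trans_probs, target_ids)
        + lam * cross_entropy(rec_probs, source_ids))
print(round(loss, 3))
```

The reconstruction term pushes the hidden states to retain enough source information (including dropped pronouns) to rebuild the original sentence.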
tz October 9th, 2021 at 05:28 pm
TACL 2018b, Modeling Past and Future for Neural Machine Translation: the decoder state in machine translation has to carry three roles (past, present, and future); the authors separate the past and future roles out and model them independently.
Analyzing the Source and Target Contributions to Predictions in Neural Machine Translation (translation visualization)
tz September 29th, 2021 at 05:06 pm
Revisiting Low-Resource Neural Machine Translation: A Case Study (a paper on low-resource translation)
admin August 19th, 2021 at 05:52 pm
Ehud Reiter's blog: https://ehudreiter.com/
admin August 19th, 2021 at 09:42 am
Paper list:
Dictionary-based Data Augmentation for Cross-Domain Neural Machine Translation
Intelligent Selection of Language Model Training Data
Chinese Syntactic Reordering for Statistical Machine Translation
Train, Sort, Explain: Learning to Diagnose Translation Models
Statistical Power and Translationese in Machine Translation Evaluation
Neural Machine Translation with Reconstruction
Learning Deep Transformer Models for Machine Translation
Improving Deep Transformer with Depth-Scaled Initialization and Merged Attention
Levenshtein Transformer
Reformer: The Efficient Transformer
Large models have been developing for four or five years now, and new ones appear in an endless stream; anyone can propose a new model, but deployment is king. Almost every scenario can make use of large models, yet when you think carefully about actually deploying them, you first need state-of-the-art technology and a clear sense of which things must be done and which should not be touched. Some things are a flash in the pan; others must be pursued. For example, skipping interpretability or controllability of large models is simply not an option, even if the depth of that work varies; the point is to turn intermediate successes into deployed products quickly. Stand in the customer's shoes and ask which models are genuinely useful. It does not happen in one stroke: the work has to be polished together with customers, creating an iteration loop so the technology follows the market, while also anticipating technology and laying out plans ahead of time.
Don't just theorize on paper or expect success in one stroke; work patiently.
Make broad contact with knowledge from other fields, and go deep into problems.
Understand the history: the symbolic and the statistical approaches.
Minh-Thang Luong and Christopher D. Manning. Fast domain adaptation for neural machine translation. http://www.statmt.org/OSMOSES/Stanford-IWSLT-15.pdf
Markus Freitag and Yaser Al-Onaizan. Fast domain adaptation for neural machine translation. http://arxiv.org/abs/1612.06897
Christophe Servan, Josep Maria Crego, and Jean Senellart. Domain specialization: a post-training domain adaptation for neural machine translation. http://arxiv.org/abs/1612.06141
Chenhui Chu, Raj Dabre, and Sadao Kurohashi. An empirical comparison of simple domain adaptation methods for neural machine translation. http://arxiv.org/abs/1701.03214
Hassan Sajjad, Nadir Durrani, Fahim Dalvi, Yonatan Belinkov, and Stephan Vogel. Neural machine translation training in a multi-domain scenario. https://arxiv.org/pdf/1708.08712.pdf
————————————————
Copyright notice: this list is from an original article by CSDN blogger warrioR_wx, released under the CC 4.0 BY-SA license; reposts must include the original source link and this notice.
Original link: https://blog.csdn.net/wangxinginnlp/article/details/77717570
Simplification of English and Bengali Sentences for Improving Quality of Machine Translation (sentence simplification)
What does the greatest loneliness look like? For ordinary people like us, I think it is probably this: we get many things done, yet in the end those things seem destined to be forgotten or never mentioned. We arrive with nothing and will leave with nothing, so in real life loneliness is simply unavoidable; whatever stage of life we are in, there is a loneliness that belongs to that stage. That is why we need to keep pushing ourselves and keep accumulating experiences. Whether we succeed or not, all of it eventually becomes memory, and memory is surely the strongest weapon against loneliness, because remembering lasts far longer than experiencing. Every village has old people who sit idle at their doorways all day, waiting. What can rescue them from loneliness? Only their memories of the past and their hopes for the future, and the same is true for us. Fortunately, our loneliness today is not built on being invaded by foreign powers; fortunately, it has not been swept away by some hurricane; fortunately, it rests on the strong China of today.
Loneliness is not frightening. What is frightening is going through a whole life without ever doing anything by our own lights; what is frightening is never daring to face loneliness alone; what is frightening is believing we are never lonely. Loneliness cannot be avoided, but life gets better because of everyone who keeps carrying its weight forward.
When at a low point, do not trouble others. When riding high, do not toy with others. When meeting others, do not scheme against them. When parting, do not disparage them. When striving, act like a person; only when you can let go are you truly one.
伊利, do you still remember?
I dreamed bombs were being dropped. I kept watching the sky and dodging, and it went on for a long time. At the end someone said it was a movie being filmed, and I thought: then how did those people die? The shock felt very real; I was thrown into the air several times.
Open-source code for visualizing machine translation: https://github.com/lena-voita/the-story-of-heads
Facebook's open-source release: https://github.com/facebookresearch/UnsupervisedMT/
Just a test.
The MT-reading-list is an excellent resource; keep reading the translation literature persistently.