
MindtheGap Team Record (till 2023Nov)


Official Contests

**Year 2023**
ICPC Nanjing: steel
CCPC Shenzhen: bronze
ICPC Jinan: not yet held
ICPC Hangzhou (*): not yet held

The Submission Bible

" 语言别交错 题目别交ß错
long long 有没有开
空间够不够 大小够不够
自己的样例试过没
格式'\n'有没有
板子有没有写错
有没有取题目要求的mod
读入的变量类型与题目保持一致否
调试输出是否已关 "
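
To make the Bible concrete, here is a minimal C++ sketch (hypothetical, not from the original post) of how several items look in code: long long throughout, a single MOD constant copied from the statement, '\n' in the output, and a debug macro that compiles away unless a local flag is set. The MOD value, the LOCAL flag, and the dbg macro are all assumptions for illustration.

```cpp
#include <bits/stdc++.h>
using namespace std;

// Bible: did you use long long?
using ll = long long;

// Bible: take the mod the problem requires (998244353 is a placeholder;
// copy the exact value from the statement).
constexpr ll MOD = 998244353;

// Bible: is debug output turned off? Compile with -DLOCAL while testing;
// the macro vanishes in the submitted build.
#ifdef LOCAL
#define dbg(x) cerr << #x << " = " << (x) << '\n'
#else
#define dbg(x)
#endif

int main() {
    ios::sync_with_stdio(false);
    cin.tie(nullptr);

    // Bible: input variable types must match the statement.
    ll n;
    cin >> n;

    ll ans = 1;
    for (ll i = 2; i <= n; ++i) ans = ans * i % MOD; // n! mod MOD as a stand-in task
    dbg(ans); // no-op unless LOCAL is defined

    // Bible: is the '\n' format there?
    cout << ans << '\n';
    return 0;
}
```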

Contest Tactics

1. Before getting on the machine, sort out your approach: first confirm what the problem is asking, then verify feasibility, and go through the Submission Bible once before submitting (remember to print or hand-copy it).
2. When stuck, really don't panic; don't fixate on a problem that many teams have solved but we can't crack (hopefully that won't happen). I should get teammates to switch problems in time and coordinate how we split our time.
3. When describing a solution out loud, think carefully about what your teammate means; for complex or easily ambiguous problems, the listener should repeat the statement back to the presenter.
4. Don't get too worked up when the contest heats up; stay calm and stay focused (this one matters a lot; next time let's not eat lunch mid-contest when we're not even hungry).
5. Hopefully on future trips we can stay one extra day and bring less stuff to the contest; the more stuff we carry, the easier it is to get flustered.

Training Log

Year 2023 (from Oct):
Oct10 The 2021 Sichuan Provincial Collegiate Programming Contest: Solved: 8/13, Rank: 22
Oct11 2019-2020 ICPC Northwestern European Regional Contest: Solved: 5/11, Rank: 35
Oct12 2023 Hubei Provincial Collegiate Programming Contest: Solved: 7/13, Rank: 16
Oct25 2019-2020 ACM-ICPC Latin American Regional Programming Contest: Solved: 5/13, Rank: 29
Oct28 2020-2021 ACM-ICPC, Asia Seoul Regional Contest: Solved: 4/12, Rank: 36
Oct29 2020-2021 Saint-Petersburg Open High School Programming Contest (SpbKOSHP 20): Solved: 8/12, Rank: 27
Nov01 2018 Arab Collegiate Programming Contest (ACPC 2018): Solved: 7/12, Rank: 6
Nov08 2023 China Collegiate Programming Contest (CCPC) Guilin Onsite (The 2nd Universal Cup. Stage 8: Guilin): Solved: 2/12, Rank: 316/429
Nov15 2023 United Kingdom and Ireland Programming Contest (UKIEPC 2023): Solved: 8/13, Rank: 18/206
Nov16 The 2023 CCPC (Qinhuangdao) Onsite (The 2nd Universal Cup. Stage 9: Qinhuangdao): Solved: 4/13, Rank: 298/403
Nov17 2019 ICPC Asia-East Continent Final: Solved: 3/13, Rank: 143/386
Nov18 2023 United Kingdom and Ireland Programming Contest (UKIEPC 2023): Solved: 6/13, Rank: 27/96
Nov19 2019-2020 ICPC Central Europe Regional Contest (CERC 19): Solved: 6/12, Rank: 36/63
Nov23 The 2nd Universal Cup. Stage 10: Harbin: Solved: 3/13, Rank: 283/392
Nov25 2023-2024 ICPC, NERC, Southern and Volga Russian Regional Contest (problems intersect with Educational Codeforces Round 157): Solved: 8/14, Rank: 6/86

