[1] 许林玉. 高温超导体的五大应用场景[J]. 世界科学, 2023: 24-25.
[2] BUTLER K T, DAVIES D W, CARTWRIGHT H, et al. Machine Learning for Molecular and Materials Science[J]. Nature, 2018, 559(7715): 547-555.
[3] RAMPRASAD R, BATRA R, PILANIA G, et al. Machine Learning in Materials Informatics: Recent Applications and Prospects[J]. npj Computational Materials, 2017, 3(1): 54.
[4] BURKE K. Perspective on Density Functional Theory[J]. The Journal of Chemical Physics, 2012, 136(15): 150901.
[5] ONNES H K. Further Experiments with Liquid Helium. G. On the Electrical Resistance of Pure Metals, etc. VI. On the Sudden Change in the Rate at which the Resistance of Mercury Disappears[M]. Dordrecht: Springer Netherlands, 1991: 267-272.
[6] COMBESCOT R. Superconductivity: An Introduction[M]. Cambridge University Press, 2022.
[7] MEISSNER W, OCHSENFELD R. Ein Neuer Effekt Bei Eintritt der Supraleitfähigkeit[J]. Naturwissenschaften, 1933, 21: 787-788.
[8] BARDEEN J, COOPER L N, SCHRIEFFER J R. Microscopic Theory of Superconductivity[J]. Phys. Rev., 1957, 106: 162-164.
[9] BEDNORZ J G, MÜLLER K A. Possible High Tc Superconductivity in the Ba−La−Cu−O System[J]. Zeitschrift für Physik B Condensed Matter, 1986, 64(2): 189-193.
[10] WU M K, ASHBURN J R, TORNG C J, et al. Superconductivity at 93 K in a New Mixed-Phase Y-Ba-Cu-O Compound System at Ambient Pressure[J]. Phys. Rev. Lett., 1987, 58: 908-910.
[11] 赵忠贤. Sr(Ba)-La-Cu 氧化物的高临界温度超导电性[J]. 科学通报, 1987(03): 177-179.
[12] NAGAMATSU J, NAKAGAWA N, MURANAKA T, et al. Superconductivity at 39 K in Magnesium Diboride[J]. Nature, 2001, 410(6824): 63-64.
[13] TAKAHASHI H, IGAWA K, ARII K, et al. Superconductivity at 43 K in an Iron-Based Layered Compound LaO1-xFxFeAs[J]. Nature, 2008, 453(7193): 376-378.
[14] REN Z A, CHE G C, DONG X L, et al. Superconductivity and Phase Diagram in Iron-Based Arsenic-Oxides ReFeAsO1−δ (Re = rare-earth metal) without Fluorine Doping[J]. Europhysics Letters, 2008, 83(1): 17002.
[15] 罗会仟. 高压室温超导电性的新进展[J]. 中国科学: 物理学力学天文学, 2021, 51(11): 130-133.
[16] PURI M, SOLANKI A, PADAWER T, et al. Chapter 1 - Introduction to Artificial Neural Network (ANN) as a Predictive Tool for Drug Design, Discovery, Delivery, and Disposition: Basic Concepts and Modeling[M]//Artificial Neural Network for Drug Design, Delivery and Disposition. Boston: Academic Press, 2016: 3-13.
[17] MCCULLOCH W S, PITTS W. A Logical Calculus of the Ideas Immanent in Nervous Activity[J]. The bulletin of mathematical biophysics, 1943, 5(4): 115-133.
[18] MORRIS R. D.O. Hebb: The Organization of Behavior, Wiley: New York; 1949[J]. Brain Research Bulletin, 1999, 50(5): 437.
[19] ROSENBLATT F. The Perceptron: A Probabilistic Model for Information Storage and Organization in the Brain[J]. Psychological Review, 1958, 65(6): 386-408.
[20] WERBOS P. Backpropagation Through Time: What It Does and How to Do It[J]. Proceedings of the IEEE, 1990, 78(10): 1550-1560.
[21] RUMELHART D E, HINTON G E, WILLIAMS R J. Learning Representations by Back-Propagating Errors[J]. Nature, 1986, 323(6088): 533-536.
[22] HINTON G E, SALAKHUTDINOV R R. Reducing the Dimensionality of Data with Neural Networks[J]. Science, 2006, 313(5786): 504-507.
[23] HINTON G E, OSINDERO S, TEH Y W. A Fast Learning Algorithm for Deep Belief Nets[J]. Neural Computation, 2006, 18(7): 1527-1554.
[24] HINTON G E. Learning Multiple Layers of Representation[J]. Trends in Cognitive Sciences, 2007, 11(10): 428-434.
[25] LEE H, GROSSE R, RANGANATH R, et al. Convolutional Deep Belief Networks for Scalable Unsupervised Learning of Hierarchical Representations[C]//ICML '09: Proceedings of the 26th Annual International Conference on Machine Learning. New York, NY, USA: Association for Computing Machinery, 2009: 609-616.
[26] RANZATO M, BOUREAU Y L, LECUN Y. Sparse Feature Learning for Deep Belief Networks[J]. Advances in Neural Information Processing Systems, 2008, 20: 1185-1192.
[27] RIFAI S, VINCENT P, MULLER X, et al. Contractive Auto-Encoders: Explicit Invariance During Feature Extraction[C]//International Conference on Machine Learning. 2011.
[28] HINTON G E, KRIZHEVSKY A, WANG S D. Transforming Auto-Encoders[C]//HONKELA T, DUCH W, GIROLAMI M, et al. Artificial Neural Networks and Machine Learning – ICANN 2011. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011: 44-51.
[29] JAIN V, SEUNG S. Natural Image Denoising with Convolutional Networks[C]//KOLLER D, SCHUURMANS D, BENGIO Y, et al. Advances in Neural Information Processing Systems: volume 21. Curran Associates, Inc., 2008.
[30] KAVUKCUOGLU K, SERMANET P, BOUREAU Y L, et al. Learning Convolutional Feature Hierarchies for Visual Recognition[C]//LAFFERTY J, WILLIAMS C, SHAWE-TAYLOR J, et al. Advances in Neural Information Processing Systems: volume 23. Curran Associates, Inc., 2010.
[31] ISAYEV O, FOURCHES D, MURATOV E N, et al. Materials Cartography: Representing and Mining Materials Space Using Structural and Electronic Fingerprints[J]. Chemistry of Materials, 2015, 27(3): 735-743.
[32] STANEV V, OSES C, KUSNE A G, et al. Machine Learning Modeling of Superconducting Critical Temperature[J]. npj Computational Materials, 2018, 4(1): 29.
[33] ZIATDINOV M, MAKSOV A, LI L, et al. Deep Data Mining in a Real Space: Separation of Intertwined Electronic Responses in a Lightly Doped BaFe2As2[J]. Nanotechnology, 2016, 27(47).
[34] 崔志强, 罗颖, 张云蔚. 通过机器学习设计新型超导材料[J]. 硅酸盐学报, 2023, 51: 411-415.
[35] 胡杰. 基于机器学习的材料超导性质预测[D]. 西南交通大学, 2022.
[36] WARD L, AGRAWAL A, CHOUDHARY A, et al. A General-Purpose Machine Learning Framework for Predicting Properties of Inorganic Materials[J]. npj Computational Materials, 2016, 2(1): 16028.
[37] LECUN Y, BENGIO Y, HINTON G. Deep learning[J]. Nature, 2015, 521(7553): 436-444.
[38] BENGIO Y, LEE D H, BORNSCHEIN J, et al. Towards Biologically Plausible Deep Learning[A]. 2016. arXiv: 1502.04156.
[39] MARBLESTONE A H, WAYNE G, KORDING K P. Toward an Integration of Deep Learningand Neuroscience[J]. Frontiers in Computational Neuroscience, 2016, 10.
[40] AGHAEE A, KHAN M O. Performance of Fourier-Based Activation Function in Physics-Informed Neural Networks for Patient-Specific Cardiovascular Flows[J]. Computer Methods and Programs in Biomedicine, 2024, 247: 108081.
[41] ZHOU C, LIU G, LIAO S. Probing Dominant Flow Paths in Enhanced Geothermal Systems with a Genetic Algorithm Inversion Model[J]. Applied Energy, 2024, 360: 122841.
[42] BREIMAN L. Bagging Predictors[J]. Machine Learning, 1996, 24(2): 123-140.
[43] AMIT Y, GEMAN D. Shape Quantization and Recognition with Randomized Trees[J]. NeuralComputation, 1997, 9(7): 1545-1588.
[44] DIETTERICH T G. An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting, and Randomization[J]. Machine Learning, 2000, 40(2): 139-157.
[45] FENNELL P G, ZUO Z, LERMAN K. Predicting and Explaining Behavioral Data with Structured Feature Space Decomposition[J]. EPJ Data Science, 2019, 8(1): 23.
[46] BREIMAN L. Arcing Classifier (with Discussion and a Rejoinder by the Author)[J]. The Annals of Statistics, 1998, 26(3): 801-849.
[47] FREUND Y, SCHAPIRE R E. A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting[C]//Computational Learning Theory. Berlin, Heidelberg: Springer Berlin Heidelberg, 1995: 23-37.
[48] PIRYONESI S M, EL-DIRABY T E. Data Analytics in Asset Management: Cost-Effective Prediction of the Pavement Condition Index[J]. Journal of Infrastructure Systems, 2020, 26(1): 04019036.
[49] ANANDAN B, MANIKANDAN M. Machine Learning Approach with Various Regression Models for Predicting the Ultimate Tensile Strength of the Friction Stir Welded AA 2050-T8 Joints by the K-Fold Cross-Validation Method[J]. Materials Today Communications, 2023, 34: 105286.