Data Analytics in Bioinformatics. Group of authors

Title: Data Analytics in Bioinformatics

Author: Group of authors

Publisher: John Wiley & Sons Limited

ISBN: 9781119785606
      Application And Challenges. Compusoft, 9, 1, 3560–3565, 2020.

      3. Géron, A., Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems, O’Reilly Media, United States of America, 2019.

      4. Alshemali, B. and Kalita, J., Improving the reliability of deep neural networks in NLP: A review. Knowl.-Based Syst., 191, 105210, 2020.

      5. Klaine, P.V., Imran, M.A., Onireti, O., Souza, R.D., A survey of machine learning techniques applied to self-organizing cellular networks. IEEE Commun. Surv. Tut., 19, 4, 2392–2431, 2017.

      6. Abadi, M., Barham, P., Chen, J., Chen, Z., Davis, A., Dean, J., Kudlur, M., Tensorflow: A system for large-scale machine learning, in: 12th USENIX Symposium on Operating Systems Design and Implementation (OSDI 16), pp. 265–283, 2016.

      7. Alpaydin, E., Introduction to machine learning, MIT Press, United Kingdom, 2020.

      8. Larranaga, P., Calvo, B., Santana, R., Bielza, C., Galdiano, J., Inza, I., Robles, V., Machine learning in bioinformatics. Briefings Bioinf., 7, 1, 86–112, 2006.

      9. Almomani, A., Gupta, B.B., Atawneh, S., Meulenberg, A., Almomani, E., A survey of phishing email filtering techniques. IEEE Commun. Surv. Tut., 15, 4, 2070–2090, 2013.

      10. Kononenko, I., Machine learning for medical diagnosis: History, state of the art and perspective. Artif. Intell. Med., 23, 1, 89–109, 2001.

      11. Kotsiantis, S.B., Zaharakis, I., Pintelas, P., Supervised machine learning: A review of classification techniques, in: Emerging Artificial Intelligence Applications in Computer Engineering, vol. 160, pp. 3–24, 2007.

      12. Freitag, D., Machine learning for information extraction in informal domains. Mach. Learn., 39, 2–3, 169–202, 2000.

      13. Radford, A., Narasimhan, K., Salimans, T., Sutskever, I., Improving language understanding by generative pre-training, URL https://s3-us-west-2.amazonaws.com/openai-assets/research-covers/language-unsupervised/language_understanding_paper.pdf, 2018.

      15. Miyato, T., Maeda, S.I., Koyama, M., Ishii, S., Virtual adversarial training: a regularization method for supervised and semi-supervised learning. IEEE Trans. Pattern Anal. Mach. Intell., 41, 8, 1979–1993, 2018.

      16. Tarvainen, A. and Valpola, H., Mean teachers are better role models: Weight-averaged consistency targets improve semi-supervised deep learning results, in: Advances in Neural Information Processing Systems, pp. 1195–1204, 2017.

      17. Baldi, P., Autoencoders, unsupervised learning, and deep architectures, in: Proceedings of ICML Workshop on Unsupervised and Transfer Learning, 2012, June, pp. 37–49.

      18. Srivastava, N., Mansimov, E., Salakhudinov, R., Unsupervised learning of video representations using LSTMs, in: International Conference on Machine Learning, 2015, June, pp. 843–852.

      19. Niebles, J.C., Wang, H., Fei-Fei, L., Unsupervised learning of human action categories using spatial-temporal words. Int. J. Comput. Vision, 79, 3, 299–318, 2008.

      20. Lee, H., Grosse, R., Ranganath, R., Ng, A.Y., Unsupervised learning of hierarchical representations with convolutional deep belief networks. Commun. ACM, 54, 10, 95–103, 2011.

      21. Memisevic, R. and Hinton, G., Unsupervised learning of image transformations, in: 2007 IEEE Conference on Computer Vision and Pattern Recognition, 2007, June, IEEE, pp. 1–8.

      22. Dy, J.G. and Brodley, C.E., Feature selection for unsupervised learning. J. Mach. Learn. Res., 5, Aug, 845–889, 2004.

      23. Kim, Y., Street, W.N., Menczer, F., Feature selection in unsupervised learning via evolutionary search, in: Proceedings of the Sixth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2000, August, pp. 365–369.

      24. Shi, Y. and Sha, F., Information-theoretical learning of discriminative clusters for unsupervised domain adaptation, Proceedings of the International Conference on Machine Learning, 1, pp. 1079–1086, 2012.

      25. Balakrishnan, P.S., Cooper, M.C., Jacob, V.S., Lewis, P.A., A study of the classification capabilities of neural networks using unsupervised learning: A comparison with K-means clustering. Psychometrika, 59, 4, 509–525, 1994.

      26. Pedrycz, W. and Waletzky, J., Fuzzy clustering with partial supervision. IEEE Trans. Syst. Man Cybern. Part B (Cybern.), 27, 5, 787–795, 1997.

      27. Andreae, J.H., The future of associative learning, in: Proceedings 1995 Second New Zealand International Two-Stream Conference on Artificial Neural Networks and Expert Systems, 1995, November, IEEE, pp. 194–197.

      29. Abbeel, P. and Ng, A.Y., Apprenticeship learning via inverse reinforcement learning, in: Proceedings of the Twenty-First International Conference on Machine learning, 2004, July, p. 1.

      30. Wiering, M. and Van Otterlo, M., Reinforcement learning. Adapt. Learn. Optim., 12, 3, 2012.

      31. Ziebart, B.D., Maas, A.L., Bagnell, J.A., Dey, A.K., Maximum entropy inverse reinforcement learning, in: AAAI, vol. 8, pp. 1433–1438, 2008.

      32. Rothkopf, C.A. and Dimitrakakis, C., Preference elicitation and inverse reinforcement learning, in: Joint European Conference on Machine Learning and Knowledge Discovery in Databases, 2011, September, Springer, Berlin, Heidelberg, pp. 34–48.

      33. Anderson, M.J., Carl Linnaeus: Father of Classification, Enslow Publishing, LLC, New York, 2009.

      34. Becker, H.S., Problems of inference and proof in participant observation. Am. Sociol. Rev., 23, 6, 652–660, 1958.

      35. Zaffalon, M. and Miranda, E., Conservative inference rule for uncertain reasoning under incompleteness. J. Artif. Intell. Res., 34, 757–821, 2009.

      36. Sathya, R. and Abraham, A., Comparison of supervised and unsupervised learning algorithms for pattern classification. Int. J. Adv. Res. Artif. Intell., 2, 2, 34–38, 2013.

      37. Tao, D., Li, X., Hu, W., Maybank, S., Wu, X., Supervised tensor learning, in: Fifth IEEE International Conference on Data Mining (ICDM’05), 2005, November, IEEE, p. 8.

      38. Krawczyk, B., Woźniak, M., Schaefer, G., Cost-sensitive decision tree ensembles for effective imbalanced classification. Appl. Soft Comput., 14, 554–562, 2014.

      39. Wang, B., Tu, Z., Tsotsos, J.K., Dynamic label propagation for semi-supervised multi-class multi-label classification, in: Proceedings of the IEEE International Conference on Computer Vision, pp. 425–432, 2013.
