Department of Informatics

Faculty of Physics, Astronomy and Informatics and
NeuroCognitive Laboratory, Centre for Modern Interdisciplinary Technologies, Nicolaus Copernicus University

Scientific achievements, August 2019

Wlodzislaw Duch: A list of talks and links to some presentations | conferences, past and planned
Some DuchLab projects | chronological list of papers.
Google Profile | arXiv page | ORCID ID


Scientific activities: Włodzisław Duch, Department of Informatics, Nicolaus Copernicus University (NCU).


Neural networks.

  1. Novel neural network models
    1. k-separability generalizes the concept of linear separability, making the goal of learning much easier: networks should transform the input data into an image with distributions that can be handled directly.

    2. Duch W. (2018) Separability is not the best goal for machine learning. cs arXiv preprint, arXiv:1807.02873
    3. Duch W. (2006) K-separability. Lecture Notes in Computer Science Vol. 4131, pp. 188-197, 2006. | PDF file.

      Solving highly non-separable problems is a challenge. The k-separability index can characterize the complexity of such problems, and projection pursuit networks can find useful projections of the data with large pure clusters.

    4. Grochowski M, Duch W. (2011) Fast Projection Pursuit Based on Quality of Projected Clusters. Lecture Notes in Computer Science Vol. 6594, pp. 89-97, 2011.
    5. Grochowski M, Duch W. (2010) Constructive Neural Network Algorithms that Solve Highly Non-Separable Problems. In: Constructive Neural Networks, Springer Studies in Computational Intelligence Vol. 258, pp. 49-70.
    6. Grochowski M, Duch W. (2009) Constrained Learning Vector Quantization or Relaxed k-Separability. Lecture Notes in Computer Science Vol. 5768: 151-160
    7. Grochowski M, Duch W. (2008) Projection Pursuit Constructive Neural Networks Based on Quality of Projected Clusters. Lecture Notes in Computer Science, vol. 5164, 754-762, 2008.
    8. Grochowski M, Duch W. (2008) A Comparison of Methods for Learning of Highly Non-Separable Problems. Lecture Notes in Computer Science, Vol. 5097, 566-577, 2008. | Abstract.
    9. Duch W. (2007) Learning data structures with inherent complex logic: neurocognitive perspective. 6th WSEAS International Conference on Computational Intelligence, Man-Machine Systems and Cybernetics (CIMMACS '07), Tenerife, Canary Islands, Spain, Dec. 14-16, 2007, pp. 294-303 | Abstract.
    10. Grochowski M, Duch W. (2007) Learning highly non-separable Boolean functions using Constructive Feedforward Neural Network. Lecture Notes in Computer Science, Vol. 4668, 180-189, 2007 | Abstract.

      Support Feature Machine is an algorithm that searches for good features that facilitate problem solving. Combining different types of features increases the chance of finding simple solutions, so the method works better than kernel SVM, and local kernels can be optimized.

    11. Duch W, Maszczyk T, Grochowski M. (2011) Optimal Support Features for Meta-Learning. Book chapter, in: Meta-learning in Computational Intelligence. Studies in Computational Intelligence. Eds: N. Jankowski, K. Grabczewski, W. Duch, Springer 2011, pp. 317-358.

      Meta-learning has many aspects, but its final goal is to discover in an automatic way many interesting models for a given data. Our early attempts in this area involved heterogeneous learning systems combined with a complexity-guided search for optimal models, performed within the framework of (dis)similarity based methods to discover “knowledge granules”. This approach, inspired by neurocognitive mechanisms of information processing in the brain, is generalized here to learning based on parallel chains of transformations that extract useful information granules and use it as additional features. Various types of transformations that generate hidden features are analyzed and methods to generate them are discussed. They include restricted random projections, optimization of these features using projection pursuit methods, similarity-based and general kernel-based features, conditionally defined features, features derived from partial successes of various learning algorithms, and using the whole learning models as new features. In the enhanced feature space the goal of learning is to create image of the input data that can be directly handled by relatively simple decision processes. The focus is on hierarchical methods for generation of information, starting from new support features that are discovered by different types of data models created on similar tasks and successively building more complex features on the enhanced feature spaces. Resulting algorithms facilitate deep learning, and also enable understanding of structures present in the data by visualization of the results of data transformations and by creating logical, fuzzy and prototype-based rules based on new features. Relations to various machine-learning approaches, comparison of results, and neurocognitive inspirations for meta-learning are discussed.
    12. Maszczyk T, Duch W, Support Feature Machines: Support Vectors are not enough. World Congress on Computational Intelligence, IEEE Press, pp. 3852-3859, 2010.
      Also Arxiv 1901.09643 in cs.LG.
    13. Maszczyk T, Duch W, Support Feature Machine for DNA microarray data. Lecture Notes in Artificial Intelligence Vol. 6086, pp. 178-186, 2010.
    14. Maszczyk T, Duch W. (2012) Locally Optimized Kernels. Lecture Notes in Computer Science Vol. 7267, pp. 412–420, 2012.
    15. A Universal Learning Machine (ULM) should find the simplest data model for arbitrary data distributions, avoiding specific biases that make a method suitable only for particular kinds of problems.

    16. Duch W, Maszczyk T. (2009) Universal Learning Machines. Lecture Notes in Computer Science Vol. 5864: 206–215.

      Generation and filtering of useful random projections lead to algorithms that have better biological justification than classical MLPs, are faster and easier to train, and may in practice solve non-separable problems of higher complexity than typical feedforward neural networks.

    17. Maszczyk T, Duch W. (2010) Almost Random Projection Machine with Margin Maximization and Kernel Features. Lecture Notes in Computer Science Vol. 6353: 40-48, 2010.
    18. Duch W, Maszczyk T. (2009) Almost Random Projection Machine. Lecture Notes in Computer Science Vol. 5768: 789-798, 2009.

      Non-Euclidean distance functions replacing the sigmoidal activation function offer a natural generalization of the standard MLP model, providing more flexible decision borders. An alternative way leading to similar results is based on renormalization of the input vectors using non-Euclidean norms in extended feature spaces. Both approaches change the shapes of decision borders dramatically, allowing the complexity of MLP networks to be reduced.

    19. Duch W, Adamczak R, Diercksen GHF (1999) Neural Networks in non-Euclidean spaces. Neural Processing Letters  10: 201-210 | PDF file.
    20. Duch W, Adamczak R, Diercksen GHF (1999) Distance-based multilayer perceptrons. In: Computational Intelligence for Modelling Control and Automation. Neural Networks and Advanced Control Strategies. Ed. M. Mohammadian, IOS Press, Amsterdam, pp. 75-80 | PDF file
    21. Duch W, Adamczak R (1999) Neural networks in non-Euclidean metric spaces,
      1999 IEEE International Joint Conference on Neural Networks, Washington, July 1999, paper no. 740 (6 pages)
    22. Duch W, Grudzinski K and Diercksen G.H.F (1998) Minimal distance neural methods. World Congress of Computational Intelligence, May 1998, Anchorage, Alaska, IEEE IJCNN'98 Proceedings, pp. 1299-1304
    23. Duch W (1997) Neural minimal distance methods. Third Conference on Neural Networks and Their Applications, Kule, October 1997, pp. 183-188
    24. Surprisingly, many cheap classification methods work very well on most benchmark data. We have identified such cases; these datasets should not be used when comparing new algorithms.

    25. Duch W, Maszczyk T, Jankowski N, Make it cheap: learning with O(nd) complexity.
      2012 IEEE World Congress on Computational Intelligence, Brisbane, Queensland, Australia, 10-15.06.2012, IJCNN, IEEE Press (ISBN: 978-1-4673-1489-3), pp. 132-135.
    26. Variable Step Search Training is based on numerical techniques for optimization of neural network parameters. It is simple to implement and more efficient than backpropagation.

    27. Kordos M, Duch W. (2008) Variable Step Search Training for Feedforward Neural Networks. Neurocomputing 71(13-15), 2470-2480, 2008. | Abstract.
    28. Kordos M, Duch W. (2006) Variable Step Search MLP Training Method. International Journal of Information Technology and Intelligent Computing 1, 45-56, 2006 | PDF file.
    29. Kordos M, Duch W. (2004) Variable step size search algorithm for MLP training. The 8th IASTED International Conference on Artificial Intelligence and Soft Computing (ASC 2004), Marbella, Spain, pp.215-220
    30. Kordos M, Duch W. (2003) Multilayer Perceptron Trained with Numerical Gradient. International Conference on Artificial Neural Networks (ICANN) and International Conference on Neural Information Processing (ICONIP), Istanbul, June 2003, pp. 106-109 | PDF file.
    31. Kordos M, Duch, W. (2003) Search-based training for logical rule extraction by Multilayer Perceptron. International Conference on Artificial Neural Networks (ICANN) and International Conference on Neural Information Processing (ICONIP), Istanbul, June 2003, pp. 86-89 | PDF file.
    32. Duch W (1999) Alternatives to gradient-based neural training and optimization, 4th Conference on Neural Networks and Their Applications, Zakopane, May 1999, pp. 59-64
    33. Duch W, Grąbczewski K (1999) Searching for optimal MLP, 4th Conference on Neural Networks and Their Applications, Zakopane, May 1999, pp. 65-70
    34. Error surfaces determine the efficiency of network training; they can be visualized in the PCA space of the weight sequences. Surprisingly, we have never seen local minima, only large plateaus.

    35. Kordos M, Duch W. (2004) A Survey of Factors Influencing MLP Error Surface. Control and Cybernetics 33(4): 611-631, 2004 | PDF file.

      Learning Vector Quantization methods may be improved in many ways.

    36. Blachnik M, Duch W, Improving accuracy of LVQ algorithm by instance weighting. Lecture Notes in Computer Science Vol. 6353, pp. 256-266, 2010.
    37. Support Vector Neural Training, like SVM, starts from all data but near the end of training uses only a small subset of vectors near the decision border.

    38. Duch W. (2005) Support Vector Neural Training. Lecture Notes in Computer Science, Vol 3697, 67-72, 2005 | PDF file.

      Ontogenic neural networks adjust their structure to the complexity of the data, growing, pruning and merging node functions.

    39. Jankowski N, Duch W. (2000) Ontogeniczne sieci neuronowe [Ontogenic neural networks]. Biocybernetyka 2000, Tom 6: Sieci neuronowe (eds. W. Duch, J. Korbicz, L. Rutkowski, R. Tadeusiewicz), chapter I.8, pp. 257-294
    40. Training of neural networks may be improved using global optimization methods, statistical approaches, and better initialization.

    41. Duch W, Korczak J, Optimization and global minimization methods suitable for neural networks. Neural Computing Surveys (journal discontinued) | PDF file
    42. Duch W, Adamczak R (1998) Statistical methods for construction of neural networks. International Conference on Neural Information Processing, ICONIP'98, Kitakyushu, Japan, Oct. 1998, Vol. 2, pp. 629-642 | PDF file
    43. Duch W, Adamczak R, Jankowski N (1997) Initialization of adaptive parameters in density networks. Third Conference on Neural Networks and Their Applications, Kule, October 1997, pp. 99-104
    44. Duch W, Adamczak R, Jankowski N (1997) Initialization and optimization of multilayered perceptrons. Third Conference on Neural Networks and Their Applications, Kule, October 1997, pp. 105-110
    45. Duch W (1997) Scaling properties of neural classifiers. Third Conference on Neural Networks and Their Applications, Kule, October 1997, pp. 189-194
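The k-separability index described at the top of this list can be sketched in a few lines. This is an illustrative reconstruction, not code from the papers: a dataset projected on a line is k-separable if the sorted projections form k consecutive intervals, each containing vectors of a single class.

```python
import numpy as np

def k_separability(y, labels):
    """Count pure-label intervals along a 1-D projection y.

    Projected data are k-separable when the sorted projections
    split into k consecutive segments, each of a single class."""
    order = np.argsort(y)
    sorted_labels = np.asarray(labels)[order]
    # Count label changes along the sorted projection; k = changes + 1.
    changes = np.sum(sorted_labels[1:] != sorted_labels[:-1])
    return int(changes) + 1

# XOR-like data projected on the direction w = (1, 1).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
labels = np.array([0, 1, 1, 0])          # XOR labels
y = X @ np.array([1.0, 1.0])             # projections: 0, 1, 1, 2
print(k_separability(y, labels))         # XOR is 3-separable on this direction
```

For n-bit parity the bit-counting projection gives k = n + 1 alternating intervals, which is why such functions defeat separability-based learners but become easy once k-separability is the goal.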

  2. Feature Space Mapping
  3. A paper published in 1994 in "Neural Network World" introduced "Floating Gaussian Mapping for Modeling of Human Conceptual Space", an idea similar to Gaussian mixture modeling for estimation of probability densities in feature spaces characterizing psychological concepts. The use of Gaussian functions to model probability densities in quantum mechanics was the main motivation for creating such a model. This idea gave rise to the more general Feature Space Mapping neurofuzzy system, based on separable basis functions. The model was implemented in our GhostMiner data mining system, which was marketed by Fujitsu.

    1. Duch W and Diercksen GHF (1995) Feature Space Mapping as a universal adaptive system. Computer Physics Communications 87: 341-371
    2. Duch W, Adamczak R, Diercksen GHF. (2000) Feature space mapping neural network applied to structure-activity relationship problems. 7th International Conference on Neural Information Processing, Nov. 2000, Dae-jong, Korea, ed. by Soo-Young Lee, pp. 270 - 274
    3. Adamczak R, Duch W, Model FSM w zastosowaniu do klasyfikacji [The FSM model applied to classification].
      Biocybernetyka 2000, Tom 6: Sieci neuronowe (eds. W. Duch, J. Korbicz, L. Rutkowski, R. Tadeusiewicz), chapter III.26, pp. 801-824
    4. Adamczak R, Duch W, Jankowski N (1997) New developments in the Feature Space Mapping model.
      Third Conference on Neural Networks and Their Applications, Kule, October 1997, pp. 65-70; long version CIL-KMK-2/97, Computational Intelligence Lab. Technical Report.
    5. Duch W and Adamczak R (1996) Feature Space Mapping network for classification. | Abstract. Second Conference on Neural Networks and their applications, Orle Gniazdo, 30.IV-4.V.1996, pp. 125-130
    6. Duch W, Jankowski N, Naud A, Adamczak R (1995) Feature Space Mapping: a neurofuzzy network for system identification. Proc. of Engineering Applications of Neural Networks (EANN'95), Helsinki 21-23.08.1995, pp. 221-224
    7. Duch W (1993) Floating Gaussian Mapping for Modeling of Human Conceptual Space. UMK-KMK-TR 3/93 report
    8. Duch W, Adamczak R, Grąbczewski K (1999) Neural optimization of linguistic variables and membership functions.
      International Conference on Neural Information Processing (ICONIP'99), Perth, Australia, Nov. 1999, Vol. II, pp. 616-621
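As a hedged sketch of the separable basis functions mentioned above (an illustration, not the GhostMiner implementation): an FSM node computes a product of one-dimensional factors, one per feature, instead of a radial function of a single distance, so each dimension can have its own width or even its own type of membership function.

```python
import numpy as np

def separable_gaussian(x, centers, widths):
    """Separable basis function: a product of 1-D Gaussian factors,
    one per feature, as in Feature Space Mapping nodes. Unlike a
    radial Gaussian, each dimension has its own width, and any
    factor could be replaced by another 1-D membership function."""
    factors = np.exp(-((x - centers) ** 2) / (2.0 * widths ** 2))
    return float(np.prod(factors))

centers = np.array([0.5, 1.0])
widths = np.array([0.2, 0.5])
print(separable_gaussian(np.array([0.5, 1.0]), centers, widths))  # 1.0 at the center
```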

  4. Floating Gaussian Mapping.
  5. Floating Gaussian Mapping was inspired by the use of Gaussian functions to calculate probability densities in quantum physics; the model is equivalent to a Radial Basis Function network. Early papers date from the beginning of the 1990s. This work also led to papers on neural versions of minimal distance methods, later developed into a general approach to similarity-based methods.

    1. Duch W (1994) Floating Gaussian Mapping: a new model of adaptive systems. Neural Network World 4: 645-654
    2. Duch W and Diercksen GHF (1994) Neural networks as tools to solve problems in physics and chemistry. Computer Physics Communications 82: 91-103
    3. Duch W (1994). Neural networks for approximation. In: Materiały Konferencyjne I Krajowej Konferencji "Sieci Neuronowe i ich Zastosowania", Kule, 12-15.IV.1994, pp. 218-223
    4. Duch W (1993) Modeling neural networks - a physicist's point of view. In: Psychological and neurophysiological backgrounds of new computer technologies, Eds. Kulikowski J.L and Zmyslowski W (International Center of Biocybernetics, Warsaw 1993); also UMK-KMK-TR 1/92 report.
    5. Duch W (1993) Neural networks for approximation. UMK-KMK-TR 4/93 report
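A minimal sketch of the Floating Gaussian Mapping idea, using the equivalence to RBF networks noted above (illustrative only; parameter adaptation in the original model is richer):

```python
import numpy as np

def fgm_activation(x, centers, sigma=1.0):
    """Floating Gaussian Mapping output: a sum of Gaussians placed at
    adaptive ("floating") centers -- formally equivalent to an RBF
    network with Gaussian nodes. During training the centers float
    toward regions of high data density."""
    d2 = np.sum((centers - x) ** 2, axis=1)
    return float(np.sum(np.exp(-d2 / (2.0 * sigma ** 2))))

centers = np.array([[0.0, 0.0], [5.0, 5.0]])
# Response is largest near a center and decays with distance:
print(fgm_activation(np.array([0.0, 0.0]), centers) >
      fgm_activation(np.array([2.5, 2.5]), centers))
```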

  6. Novel neural transfer functions.
  7. Some of the complexity of neural networks may be hidden in more sophisticated transfer functions, not just sigmoids or Gaussians. There is a hierarchy of neural-like models, from simple nodes implementing single-parameter functions with fixed weighted connections, to complex agents implementing various functions and procedures, exchanging information by sending messages and forming ad-hoc subnetworks. Initially this subject was ignored, but it has now gained popularity.

    1. Duch W. (2005) Uncertainty of data, fuzzy membership functions, and multi-layer perceptrons. IEEE Transactions on Neural Networks 16(1): 10-23, 2005
    2. Duch W, Jankowski N (1999) Survey of neural transfer functions, Neural Computing Surveys 2: 163-213
    3. Duch W and Jankowski N (1997) New neural transfer functions. Applied Mathematics and Computer Science 7 (1997) 639-658
    4. Blachnik M, Duch W. (2008) Building Localized Basis Function Networks Using Context Dependent Clustering. Lecture Notes in Computer Science, vol. 5163, 482-491, 2008.
    5. Duch W, Jankowski N. (2001) Transfer functions: hidden possibilities for better neural networks. 9th European Symposium on Artificial Neural Networks (ESANN), Brugge 2001. De-facto publications, pp. 81-94
    6. Duch W, Adamczak R, Diercksen GHF. (2001) Constructive density estimation network based on several different separable transfer functions. 9th European Symposium on Artificial Neural Networks (ESANN), Brugge 2001. De-facto publications, pp. 107-112
    7. Jankowski N, Duch W. (2001) Optimal transfer function neural networks. 9th European Symposium on Artificial Neural Networks (ESANN), Brugge 2001. De-facto publications, pp. 101-106
    8. Duch W, Grudziński K and Stawski G. (2000) Symbolic features in neural networks. 5th Conference on Neural Networks and Soft Computing, Zakopane, June 2000, pp. 180-185
    9. Duch W, Jankowski N. (2000) Taxonomy of neural transfer functions, IEEE, International Joint Conference on Neural Networks 2000 (IJCNN), Vol. III, pp. 477-484
    10. Duch W and Jankowski N (1996) Bi-radial transfer functions. Second Conference on Neural Networks and their applications, Orle Gniazdo, 30.IV-4.V.1996, pp. 131-137
    11. Duch W, Jankowski N (1996) Bi-radial transfer functions. | Abstract. UMK-KMK-TR 1/96, Computational Intelligence Lab. Technical Report
    12. Duch W (1993) On the optimal processing functions for neural network elements. UMK-KMK-TR 6/93 report
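The bi-radial functions listed above can be sketched as a product of sigmoid pairs, one pair per dimension; the parameterization in the papers is richer (independent slopes and biases per dimension), so this shows only the basic shape:

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def biradial(x, b, bprime, s=5.0):
    """Bi-radial transfer function sketch: per dimension, a product of
    two sigmoids forms a soft window between b_i and b'_i; the product
    over dimensions gives a flexible, separable localized function."""
    window = sigmoid(s * (x - b)) * (1.0 - sigmoid(s * (x - bprime)))
    return float(np.prod(window))

b, bprime = np.array([0.0]), np.array([1.0])
print(biradial(np.array([0.5]), b, bprime) >
      biradial(np.array([5.5]), b, bprime))  # localized: large inside the window
```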

  8. Applications of neural networks
    1. Duch W, Swaminathan K, Meller J. (2007) Artificial Intelligence Approaches to Rational Drug Design and Discovery. Current Pharmaceutical Design, Vol. 13(14), 1497-1508, 2007 | Abstract.
    2. Duch W, Dobosz K. (2013) Sieci neuronowe w modelowaniu chorób psychicznych [Neural networks in modeling mental disorders], book chapter in: Tadeusiewicz R, Duch W, Korbicz J, Rutkowski L (Eds), Sieci neuronowe w inżynierii biomedycznej [Neural networks in biomedical engineering]. Wyd. Exit, pp. 637-666, 2013; Russian translation available.
    3. Lee W.K, Duch W, Ng G.S. (2006) Robot Space Exploration Using Peano Paths Generated by Self-Organizing Maps. | PDF file.
    4. Grudzinski K, Karwowski M, Duch W. (2003) Computational Intelligence Study of the Iron Age Glass Data. | PDF file.
      International Conference on Artificial Neural Networks (ICANN) and International Conference on Neural Information Processing (ICONIP), Istanbul, June 2003, pp. 17-20
    5. Adamczak R, Duch W. (2000) Neural networks for structure-activity relationship problems. 5th Conference on Neural Networks and Soft Computing, Zakopane, June 2000, pp. 669-674
    6. Duch W, Adamczak R, Grąbczewski K (1999) Neural methods for analysis of psychometric data. Proc. of the International Conference EANN'99, Warsaw, 13-15.09.1999, pp. 45-50 | PDF file.
    7. Duch W, Adamczak R, Grąbczewski K, Jankowski N, Żal G (1998) Medical diagnosis support using neural and machine learning methods, Proc. of the Intern. Conference EANN'98, Gibraltar, 10-12.06.1998, pp.  292-295

Neural networks for data understanding.

Main papers comparing neural methods to other rule-based methods. The results have been presented in many tutorials at top conferences -- IJCNN (2001, 2000), WCCI (2002), ICONIP (2000), ICANN (2002, 2001), NNSC (2000) -- and in book chapters and conference papers.

  1. Data understanding
    1. Duch W. (2013) Rule discovery. Encyclopedia of Systems Biology, W. Dubitzky, O. Wolkenhauer, H. Yokota, K-H Cho (Eds.), Springer 2013, pp. 1879-1883
    2. Duch W, Jankowski N, Grabczewski K. (2005) Computational intelligence methods for information understanding and information management. Series of Information and Management Sciences, California Polytechnic State University, pp. 281-287
    3. Duch W, Setiono R, Zurada J.M. (2004) Computational intelligence methods for understanding of data. Proc. of the IEEE 92(5) (2004) 771-805 (see also the front cover of the issue and the Prolog by J. Esch)
    4. Duch W, Adamczak R, Grąbczewski K. (2001) A new methodology of extraction, optimization and application of crisp and fuzzy logical rules. IEEE Transactions on Neural Networks 12 (2001) 277-306
    5. Grąbczewski K, Duch W, Adamczak R. (2000) Neuronowe metody odkrywania wiedzy w danych [Neural methods of knowledge discovery in data]. Biocybernetyka 2000, Tom 6: Sieci neuronowe (eds. W. Duch, J. Korbicz, L. Rutkowski, R. Tadeusiewicz), chapter III.20, pp. 637-662

  2. MLP2LN algorithm
  3. The MLP2LN algorithm: constrained multilayer backpropagation networks are converted into a set of logical rules.

    1. Grudziński K, Grochowski M, Duch W. (2010) Pruning Classification Rules with Reference Vectors Selection Methods. Lecture Notes in Computer Science Vol. 6113, pp. 347-354, 2010.
    2. Duch W. (2005) Rules, Similarity, and Threshold Logic. Commentary on Emmanuel M. Pothos, The Rules versus Similarity distinction. Behavioral and Brain Sciences Vol. 28 (1): 23-23, 2005
    3. Duch W, Adamczak R, Grąbczewski K, Żal G (1999) Hybrid neural-global minimization method of logical rule extraction, Journal of Advanced Computational Intelligence and Intelligent Informatics, 3(5): 348-356
    4. Duch W, Adamczak R, Grąbczewski K (1998) Extraction of logical rules from backpropagation networks. Neural Processing Letters 7: 211-219
    5. Blachnik M, Duch W, Wieczorek T. (2005) Threshold rules decision list. In: T. Burczyński et al. (Eds), Methods of Artificial Intelligence, AI-METH Series, Gliwice 2005, pp. 23-24
    6. Duch W, Grąbczewski K, Adamczak R, Grudziński K, Hippe Z.S. (2001) Rules for melanoma skin cancer diagnosis. 2nd Polish Conference on Computer Pattern Recognition Systems (KOSYR 2001), Wroclaw 2001, pp. 59-68
    7. Duch W, Adamczak R, Grąbczewski K, Grudziński K, Jankowski N, Naud N. (2001) Extraction of Knowledge from Data using Computational Intelligence Methods. In: International Conference on Artificial Neural Networks (ICANN), Vienna, 21-25.08.2001 (tutorial, separate brochure, 63 pp)
    8. Duch W, Adamczak R, Grąbczewski K, Jankowski N. (2000) Neural methods of knowledge extraction,
      Control and Cybernetics 29 (4) (2000) 997-1018
    9. Duch W, Jankowski N, Grąbczewski K, Adamczak R. (2000) Optimization and interpretation of rule-based classifiers. Intelligent Information Systems 2000, Advances in Soft Computing, Physica Verlag (Springer), pp. 1-13
    10. Duch W and Hayashi Y. (2000) Computational intelligence methods and data understanding. In: Quo Vadis computational Intelligence? New trends and approaches in computational intelligence. Eds. P. Sincak, J. Vascak, Springer studies in fuzziness and soft computing, Vol. 54 (2000), pp. 256-270
    11. Duch W, Adamczak R, Grąbczewski K, Grudziński K, Jankowski N, Naud N. (2000) Extraction of knowledge from data using Computational Intelligence methods. In: ICONIP-2000, 7th International Conference on Neural Information Processing, Nov. 2000, Dae-jong, Korea (tutorial, separate brochure, 54 pp)
    12. Duch W, Adamczak R, Grąbczewski K, Żal G, Hayashi Y (1999) Fuzzy and crisp logical rule extraction methods in application to medical data. Fuzzy Systems in Medicine. Physica - Verlag, Springer 2000, pp. 593-616.
    13. Duch W, Adamczak R, Grąbczewski K (1999) Optimization of logical rules derived by neural procedures, 1999 IEEE International Joint Conference on Neural Networks, Washington, July 1999, paper no. 741 (6 pp)
    14. Duch W, Adamczak R, Grąbczewski K (1999) Methodology of extraction, optimization and application of logical rules. Intelligent Information Systems VIII, Ustroń, Poland, 14-18.06.1999, pp. 22-31
    15. Kasabov N, Kozma R, Duch W (1998) Rule Extraction from Linguistic Rule Networks and from Fuzzy Neural Networks: Propositional versus Fuzzy Rules, 4th International Conference on Neural Networks and their Applications, March 11-13, 1998, Marseille, France, pp. 403-406
    16. Duch W, Adamczak R, Grąbczewski K, Żal G (1998) Hybrid neural-global minimization logical rule extraction method for medical diagnosis support, Intelligent Information Systems VII, Malbork, Poland, 15-19.06.1998, pp. 85-94
    17. Duch W, Adamczak R, Grąbczewski K, Żal G (1998) A hybrid method for extraction of logical rules from data. Second Polish Conference on Theory and Applications of Artificial Intelligence, Łódź, 28-30 Sept. 1998, pp. 61-82
    18. Duch W, Adamczak R, Grąbczewski K (1997) Constraint MLP and density estimation for extraction of crisp logical rules from data. ICONIP'97, New Zealand, Nov.1997, pp. 831-834
    19. Duch W, Adamczak R, Grąbczewski K (1997) Extraction of crisp logical rules using constrained backpropagation networks. IEEE International Joint Conference on Neural Networks (IJCNN'97), Houston, Texas, 9-12.6.1997, pp. 2384-2389
    20. Duch W, Adamczak R, Grąbczewski K, Ishikawa M, Ueda H (1997) Extraction of crisp logical rules using constrained backpropagation networks - comparison of two new approaches. Proc. of the European Symposium on Artificial Neural Networks (ESANN'97), Brugge 16-18.4.1997, pp. 109-114
    21. Duch W, Adamczak R, Grąbczewski (1997) Logical rules for classification of medical data using ontogenic neural algorithm. Solving Engineering Problems with Neural Networks, Intern. Conference EANN'97, Stockholm, pp. 199-202
    22. Duch W, Adamczak R, Grąbczewski K (1997) Extraction of crisp logical rules from medical datasets, Third Conference on Neural Networks and Their Applications, Kule, October 1997, pp. 707-712
    23. Duch W, Adamczak R, Grąbczewski K (1996) Extraction of logical rules from training data using backpropagation networks | Abstract. CAI'96, First Polish Conference on Theory and Applications of Artificial Intelligence, Łódź, pp. 171-178
    24. Duch W, Adamczak R, Grąbczewski K (1996) Constrained backpropagation for feature selection and extraction of logical rules | Abstract First Polish Conference on Theory and Applications of Artificial Intelligence, Łódź, pp. 163-170
    25. Duch W, Adamczak R, Grąbczewski K (1996) Extraction of logical rules from training data using backpropagation networks The 1st Online Workshop on Soft Computing, 19-30.Aug. 1996, pp. 25-30
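A sketch of the weight regularization that makes MLP2LN-style rule extraction possible: a pruning term plus a term with minima at 0 and ±1, so that the trained network can be read off as logical rules. The functional form follows the general idea described above; the coefficient values are illustrative assumptions, not taken from the papers.

```python
import numpy as np

def mlp2ln_penalty(w, lam1=1e-4, lam2=1e-2):
    """Regularizer sketch in the spirit of MLP2LN: the first term
    prunes weights toward 0, the second pushes surviving weights
    toward +/-1. Coefficients here are illustrative only."""
    w = np.asarray(w, dtype=float)
    prune = 0.5 * lam1 * np.sum(w ** 2)
    binarize = 0.5 * lam2 * np.sum(w ** 2 * (w - 1.0) ** 2 * (w + 1.0) ** 2)
    return prune + binarize

# The penalty nearly vanishes at the logical weight values {0, +1, -1}:
print(mlp2ln_penalty([0.0, 1.0, -1.0]))  # only the small pruning term remains
print(mlp2ln_penalty([0.5, -0.3]))       # weights far from {0, +1, -1} are penalized
```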

  4. Prototype-based logical rules.
    1. Duch W, Grudziński K, Prototype based rules - new way to understand the data. IEEE International Joint Conference on Neural Networks, Washington D.C. 14-18.07. 2001, pp. 1858-1863
    2. Blachnik M, Duch W, LVQ algorithm with instance weighting for generation of prototype-based rules. Neural Networks 24(8), 824–830, 2011. DOI: 10.1016/j.neunet.2011.05.013.
    3. Blachnik M, Kordos M, Duch W. (2012), Extraction of prototype-based threshold rules using neural training procedure. Lecture Notes in Computer Science 7553, pp. 255–262, 2012
    4. Blachnik M, Duch W. (2007) Prototype rules from SVM. Book chapter, in: Rule Extraction from Support Vector Machines, ed. J. Diederich, Springer Studies in Computational Intelligence, Vol. 80, 163-184, 2008.
    5. Blachnik M, Duch W. (2006) Prototype-based threshold rules. Lecture Notes in Computer Science, Vol. 4234, 1028-1037, 2006.
    6. Blachnik M, Duch W, Wieczorek T. (2006) Selection of prototypes rules – context searching via clustering. Lecture Notes in Artificial Intelligence, Vol. 4029, 573-582, 2006
    7. Wieczorek T, Blachnik M, Duch W. (2006), Heterogeneous distance functions for prototype rules: influence of parameters on probability estimation. | Abstract. International Journal of Artificial Intelligence Studies (journal never started).
    8. Blachnik M, Duch W, Wieczorek T. (2005) Probabilistic distance measures for prototype-based rules. 12th Int. Conference on Neural Information Processing (ICONIP'2005), Taipei, Taiwan, pp. 445-450
    9. Wieczorek T, Blachnik M, Duch W. (2005), Influence of probability estimation parameters on stability of accuracy in prototype rules using heterogeneous distance functions. Artificial Intelligence Studies, Vol.2 (25), 2005, pp. 71-78.
    10. Duch W, Blachnik M. (2004) Fuzzy rule-based systems derived from similarity to prototypes. Lecture Notes in Computer Science, Vol. 3316 (2004) 912-917.
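In their simplest crisp form, the prototype-based rules above reduce to nearest-prototype classification; a minimal sketch (Euclidean distance here, whereas the papers also study heterogeneous and probabilistic distance functions, and the class names below are invented for illustration):

```python
import numpy as np

def prototype_rule(x, prototypes, classes):
    """Prototype-based rule: assign x the class of the nearest
    prototype. Each prototype p_i is readable as a rule:
    'IF x is most similar to p_i THEN class(p_i)'."""
    d = np.linalg.norm(prototypes - x, axis=1)
    return classes[int(np.argmin(d))]

protos = np.array([[0.0, 0.0], [1.0, 1.0]])
cls = ["class A", "class B"]
print(prototype_rule(np.array([0.9, 0.8]), protos, cls))  # nearest to (1, 1)
```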

  5. Understanding neural network functions through visualization.
    1. Duch W, Dobosz K. (2011) Attractors in Neurodynamical Systems. Advances in Cognitive Neurodynamics II (eds. R. Wang, F. Gu), pp. 157-161, 2011
    2. Kordos M, Duch W. (2004) On Some Factors Influencing MLP Error Surface. Lecture Notes in Artificial Intelligence Vol. 3070 (2004) 217-222
    3. Duch W. (2004) Visualization of hidden node activity in neural networks: I. Visualization methods. Lecture Notes in Artificial Intelligence 3070 (2004) 38-43
    4. Duch W. (2004) Visualization of hidden node activity in neural networks: II. Application to RBF networks. | PDF file. The 7th International Conference on Artificial Intelligence and Soft Computing (ICAISC), Zakopane, Poland, June 2004. Eds. L. Rutkowski, J. Siekemann, R. Tadeusiewicz, L. Zadeh. Lecture Notes in Artificial Intelligence Vol. 3070 (2004) 44-49
    5. Duch W. (2003) Coloring black boxes: visualization of neural network decisions. International Joint Conference on Neural Networks, Portland, Oregon, 2003, IEEE Press, Vol. I, pp. 1735-1740
    6. Maszczyk T, Duch W. (2008) Support Vector Machines for visualization and dimensionality reduction. Lecture Notes in Computer Science, vol. 5163, 346-356, 2008.
    7. Duch W. (2005) Internal representations of multi-layered perceptrons. In: Issues in Intelligent Systems: Paradigms. Eds. O. Hryniewicz, J. Kacprzyk, J. Koronacki, S.T. Wierzchoń, Exit, Warszawa 2005, pp. 49-62.

Meta-learning.

Operating in the space of all available data transformations and optimization techniques, meta-learning algorithms use meta-knowledge about learning processes, automatically extracted from the experience of solving diverse problems, to search for the most interesting compositions of transformations and optimizations, uncovering various aspects of knowledge hidden in the data. Meta-learning shifts the focus of the whole CI field from individual learning algorithms to the higher level of learning how to learn. This idea was first introduced in the framework of similarity-based methods in 2001.

General meta-learning

  1. Grąbczewski, K. (2014). Meta-Learning in Decision Tree Induction. Springer International Publishing.

    The book focuses on different variants of decision tree induction but also describes the meta-learning approach in general which is applicable to other types of machine learning algorithms. The book discusses different variants of decision tree induction and represents a useful source of information to readers wishing to review some of the techniques used in decision tree learning, as well as different ensemble methods that involve decision trees. It is shown that the knowledge of different components used within decision tree learning needs to be systematized to enable the system to generate and evaluate different variants of machine learning algorithms with the aim of identifying the top-most performers or potentially the best one. A unified view of decision tree learning enables to emulate different decision tree algorithms simply by setting certain parameters. As meta-learning requires running many different processes with the aim of obtaining performance results, a detailed description of the experimental methodology and evaluation framework is provided. Meta-learning is discussed in great detail in the second half of the book. The exposition starts by presenting a comprehensive review of many meta-learning approaches explored in the past described in literature, including for instance approaches that provide a ranking of algorithms. The approach described can be related to other work that exploits planning whose aim is to construct data mining workflows. The book stimulates interchange of ideas between different, albeit related, approaches.

  2. Jankowski N, Duch W, Grąbczewski K. (2011) Meta-learning in Computational Intelligence. Studies in Computational Intelligence, Vol. 358, 1st Edition, pp. X + 362. 127 illus, 76 in color, Springer 2011. | Show abstract

    The Computational Intelligence (CI) community has developed hundreds of algorithms for intelligent data analysis, but many hard problems in computer vision, signal processing, or text and multimedia understanding, problems that require deep learning techniques, remain open. Modern data mining packages contain numerous modules for data acquisition, pre-processing, feature selection and construction, instance selection, classification, association and approximation methods, optimization techniques, pattern discovery, clustering, visualization and post-processing. A large data mining package allows for billions of ways in which these modules can be combined. No human expert can claim to explore and understand all possibilities in the knowledge discovery process.
    This is where algorithms that learn how to learn come to the rescue. Operating in the space of all available data transformations and optimization techniques, these algorithms use meta-knowledge about learning processes automatically extracted from the experience of solving diverse problems. Inferences about transformations useful in different contexts help to construct learning algorithms that can uncover various aspects of knowledge hidden in the data. Meta-learning shifts the focus of the whole CI field from individual learning algorithms to the higher level of learning how to learn.
    This book defines and reveals new theoretical and practical trends in meta-learning, inspiring the readers to further research in this exciting field.

  3. Duch W, Grudziński K. (2001) Meta-learning: searching in the model space. arxiv entry. International Conference on Neural Information Processing, Shanghai, 2001, Vol. I, pp. 235-240
  4. Duch W, Grudziński K. (2002) Meta-learning via search combined with parameter optimization. Intelligent Information Systems, Advances in Soft Computing, Physica Verlag (Springer) 2002, pp. 13-22
  5. Duch W. (2006) Towards comprehensive foundations of computational intelligence | Show abstract
    Although computational intelligence (CI) covers a vast variety of different methods, it still lacks an integrative theory. Several proposals for CI foundations are discussed: computing and cognition as compression, meta-learning as search in the space of data models, (dis)similarity based methods providing a framework for such meta-learning, and a more general approach based on chains of transformations. Many useful transformations that extract information from features are discussed. Heterogeneous adaptive systems are presented as a particular example of transformation-based systems, and the goal of learning is redefined to facilitate creation of simpler data models. The need to understand data structures leads to techniques for logical and prototype-based rule extraction, and to generation of multiple alternative models, while the need to increase predictive power of adaptive models leads to committees of competent models. Learning from partial observations is a natural extension towards reasoning based on perceptions, and an approach to intuitive solving of such problems is presented. Throughout the paper neurocognitive inspirations are frequently used and are especially important in modeling of the higher cognitive functions. Promising directions such as liquid and laminar computing are identified and many open problems are presented.
    In: W. Duch and J. Mandziuk, Challenges for Computational Intelligence. Springer Studies in Computational Intelligence, Vol. 63, 261-316, 2007.
  6. Duch W. (2013) Meta-learning. Encyclopedia of Systems Biology, W. Dubitzky, O. Wolkenhauer, H. Yokota, K-H Cho (Eds.), Springer 2013, pp. 1293-1296
  7. Duch W. (2012) Autonomy requires creativity and meta-learning, Journal of Artificial General Intelligence 3(2), 39–41, 2012
  8. Duch W, Maszczyk T, Grochowski M. (2011) Optimal Support Features for Meta-Learning. Book chapter, in: Meta-learning in Computational Intelligence. Studies in Computational Intelligence. Eds: N. Jankowski, K. Grabczewski, W. Duch, Springer 2011, pp. 317-358. | Show abstract

    Meta-learning has many aspects, but its final goal is to discover in an automatic way many interesting models for given data. Our early attempts in this area involved heterogeneous learning systems combined with a complexity-guided search for optimal models, performed within the framework of (dis)similarity based methods to discover “knowledge granules”. This approach, inspired by neurocognitive mechanisms of information processing in the brain, is generalized here to learning based on parallel chains of transformations that extract useful information granules and use them as additional features. Various types of transformations that generate hidden features are analyzed and methods to generate them are discussed. They include restricted random projections, optimization of these features using projection pursuit methods, similarity-based and general kernel-based features, conditionally defined features, features derived from partial successes of various learning algorithms, and using whole learning models as new features. In the enhanced feature space the goal of learning is to create an image of the input data that can be directly handled by relatively simple decision processes. The focus is on hierarchical methods for generation of information, starting from new support features that are discovered by different types of data models created on similar tasks and successively building more complex features on the enhanced feature spaces. Resulting algorithms facilitate deep learning, and also enable understanding of structures present in the data by visualization of the results of data transformations and by creating logical, fuzzy and prototype-based rules based on new features. Relations to various machine-learning approaches, comparison of results, and neurocognitive inspirations for meta-learning are discussed.
  9. Jankowski N, Duch W, Grąbczewski K. (2011) Preface to: Meta-learning in Computational Intelligence. Studies in Computational Intelligence. Eds: N. Jankowski, K. Grabczewski, W. Duch, Springer 2011.
  10. Maszczyk T, Grochowski M, Duch W. (2010) Discovering Data Structures using Meta-learning, Visualization and Constructive Neural Networks. In: Advances in Machine Learning II, Springer Studies in Computational Intelligence, Vol. 262, pp. 467-484.

Similarity-based methods.

  1. Framework for similarity based methods
    1. Maszczyk T, Duch W, Recursive Similarity-Based Algorithm for Deep Learning.
      T. Huang et al. (Eds.): Lecture Notes in Computer Science 7665, pp. 390–397. Springer, Heidelberg, 2012
    2. Duch W. (2005) Rules, Similarity, and Threshold Logic.
      Commentary on Emmanuel M. Pothos, The Rules versus Similarity distinction.

      Behavioral and Brain Sciences Vol. 28 (1): 23-23, 2005
    3. Duch W, Grudziński K, Ensembles of Similarity-Based Models.
      Advances in Soft Computing, Physica Verlag (Springer), pp. 75-85
    4. Duch W. (2000) Similarity based methods: a general framework for classification, approximation and association. Control and Cybernetics 29 (4) (2000) 937-968
    5. Duch W, Adamczak R, Diercksen G.H.F. (2000) Classification, Association and Pattern Completion using Neural Similarity Based Methods. Applied Mathematics and Computer Science 10:4 (2000) 101-120
    6. Duch W, Adamczak R, Diercksen GHF, Neural Networks from Similarity Based Perspective.
      New Frontiers in Computational Intelligence and its Applications. IOS Press, Amsterdam 2000, pp. 93-108
    7. Grudziński K, Duch W, SBL-PM: A Simple Algorithm for Selection of Reference Instances for Similarity Based Methods, Intelligent Information Systems 2000, Advances in Soft Computing, Physica Verlag (Springer), pp. 99-108
    8. Duch W, Grudziński K (1999) Search and global minimization in similarity-based methods,
      1999 IEEE International Joint Conference on Neural Networks, Washington, July 1999, paper no. 742 (6 pages)
    9. Duch W, Grudziński K (1999) Weighting and selection of features in Similarity Based Methods | PDF file.
      Intelligent Information Systems VIII, Ustroń, Poland, 14-18.06.1999, pp. 32-36
    10. Duch W (1998) A framework for similarity-based classification methods, Intelligent Information Systems VII, Malbork, Poland, 15-19.06.1998, pp. 288-291
    11. Duch W, Grudziński K (1998) A framework for similarity-based methods. Second Polish Conference on Theory and Applications of Artificial Intelligence, Łódź, 28-30 Sept. 1998, pp. 33-60

  2. Variants of similarity based methods
    1. Duch W, Grudziński K (1999) The weighted k-NN method with selection of features and its neural realization. 4th Conference on Neural Networks and Their Applications, Zakopane, May 1999, pp. 191-196
    2. Duch W, Adamczak R, Jankowski N (1996) Improved memory-based classification. Solving Engineering Problems with Neural Networks, Proc. of the Intern. conference EANN'96, London, 17-19.06.1996 (ed. A.B. Bulsari, S. Kallio, D. Tsaptsinos), pp. 447-450
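The weighted k-NN method listed above can be sketched in a few lines (the toy data, weights and parameter choices are illustrative assumptions, not the published algorithm): feature weights scale each dimension's contribution to the distance, and setting a weight to zero amounts to feature selection.

```python
import math
from collections import Counter

def weighted_knn(query, data, weights, k=3):
    """Classify by majority vote of the k nearest neighbours under a
    feature-weighted Euclidean distance (weight 0 removes a feature)."""
    def dist(a, b):
        return math.sqrt(sum(w * (x - y) ** 2
                             for w, x, y in zip(weights, a, b)))
    neighbours = sorted(data, key=lambda item: dist(query, item[0]))[:k]
    return Counter(label for _, label in neighbours).most_common(1)[0][0]

# Toy data: only the first feature is informative; the second is noise.
data = [((0.0, 5.0), "A"), ((0.1, -5.0), "A"), ((0.2, 9.0), "A"),
        ((1.0, 4.0), "B"), ((1.1, -6.0), "B"), ((0.9, 8.0), "B")]

# With equal weights the noisy feature dominates the vote; zeroing its
# weight recovers the class determined by the informative feature.
noisy = weighted_knn((0.05, 4.0), data, weights=(1.0, 1.0))
clean = weighted_knn((0.05, 4.0), data, weights=(1.0, 0.0))
```

In the cited papers the weights themselves are optimized by search, which is what ties the similarity-based framework to meta-learning.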

Computational intelligence algorithms.

  1. General CI/ML
    1. Duch W. (2007) What is Computational Intelligence and where is it going? In: W. Duch and J. Mandziuk, Challenges for Computational Intelligence. Springer Studies in Computational Intelligence, Vol. 63, 1-13, 2007 | Abstract.
    2. Duch W, Mandziuk J. (2004) Quo Vadis Computational Intelligence? In: Machine Intelligence. Quo Vadis? Eds: P. Sinčák, J. Vaščák, K. Hirota. Advances in Fuzzy Systems - Applications and Theory - Vol. 21, World Scientific, 2004, pp. 3-28 | PDF file.
    3. Duch W, Mandziuk J. (2007) Preface: Challenges for Computational Intelligence. In: W. Duch and J. Mandziuk, Challenges for Computational Intelligence. Springer Studies in Computational Intelligence, Vol. 63, 2007.
    4. Duch, W. (2003) Dokąd zmierza inteligencja obliczeniowa?. W: R. Cierniak (ed.), Ewolucja czy rewolucja: Nowoczesne techniki informatyczne, Katedra Inżynierii Komputerowej Politechniki Częstochowskiej, s. 19-40, 2003.
    5. Duch W, Grudziński K. (2000) Sieci Neuronowe i Uczenie Maszynowe: próba integracji.
      Biocybernetyka 2000, Tom 6: Sieci neuronowe (red. W. Duch, J. Korbicz, L. Rutkowski i R. Tadeusiewicz), rozdz. III.21, pp. 663-690

  2. Feature selection
    1. Duch W. (2006) Filter Methods. In: Feature extraction, foundations and applications. Eds: I. Guyon, S. Gunn, M. Nikravesh, L. Zadeh, Studies in Fuzziness and Soft Computing, Physica-Verlag, Springer, 2006, pp. 89-118
    2. Kachel A, Duch W, Blachnik M, Biesiada J. (2010) Infosel++: Information Based Feature Selection C++ Library. Lecture Notes in Computer Science Vol. 6113, pp. 388-396, 2010.
    3. Blachnik M, Duch W, Maszczyk T. (2012), Feature ranking methods used for selection of prototypes. Lecture Notes in Computer Science 7553, pp. 296–304, 2012
    4. Blachnik M, Duch W, Kachel A, Biesiada J. (2009). Feature Selection for High-Dimensional Data: A Kolmogorov-Smirnov Class Correlation-Based Filter. Methods of Artificial Intelligence 2009, pp. 33-40.
    5. Duch W, Maszczyk T, Grochowski M. (2011) Optimal Support Features for Meta-Learning. Book chapter, in: Meta-learning in Computational Intelligence. Studies in Computational Intelligence. Eds: N. Jankowski, K. Grabczewski, W. Duch, Springer 2011, pp. 317-358.
    6. Biesiada J, Duch W. (2008) A Kolmogorov-Smirnov correlation-based filter solution for microarray gene expressions data. | Abstract.
      Springer Lecture Notes in Computer Science, Vol. 4985, pp. 285–294, 2008,
      Presented at the 14th Int. Conference on Neural Information Processing (ICONIP07), Kitakyushu, Japan, Nov.2007
    7. Biesiada J, Duch W. (2007) Feature Selection for High-Dimensional Data: A Pearson Redundancy Based Filter | Abstract. Advances in Soft Computing Vol. 45, 242-249, 2008.
    8. Duch W, Biesiada J. (2006) Margin-based feature selection filters for microarray gene expression data. International Journal of Information Technology and Intelligent Computing 1, 9-33, 2006
    9. Biesiada J, Duch W (2005). Feature Selection for High-Dimensional Data: A Kolmogorov-Smirnov Correlation-Based Filter. Advances in Soft Computing, pp. 95-104, 2005.
    10. Biesiada J, Duch W, Kachel A, Maczka K, Palucha S (2005), Feature ranking methods based on information entropy with Parzen windows. 9th International Conference on Research in Electrotechnology and Applied Informatics (REI'05), Katowice-Kraków, Poland, Vol. I, pp. 109-119.
    11. Duch W, Winiarski T, Biesiada J, Kachel, A. (2003) Feature Ranking, Selection and Discretization. International Conference on Artificial Neural Networks (ICANN) and International Conference on Neural Information Processing (ICONIP), Istanbul, June 2003, pp. 251-254.
    12. Duch W, Biesiada J, Winiarski T, Grudziński K, Grąbczewski K. (2002) Feature selection based on information theory filters and feature elimination wrapper methods. In: Neural Networks and Soft Computing (eds. L. Rutkowski and J. Kacprzyk), Advances in Soft Computing, Physica Verlag (Springer), pp. 173-176, 2002.
    13. Duch W, Winiarski T, Grąbczewski K, Biesiada J, Kachel, A. (2002) Feature selection based on information theory, consistency and separability indices. International Conference on Neural Information Processing (ICONIP), Vol. IV, pp. 1951-1955, Singapore 2002
    14. Duch W, Adamczak R, Grąbczewski K (1996) Constrained backpropagation for feature selection and extraction of logical rules First Polish Conference on Theory and Applications of Artificial Intelligence, Łódź, pp. 163-170 | Abstract
    15. Duch W, Wieczorek T, Biesiada J, Blachnik M. (2004) Comparison of feature ranking methods based on information entropy. International Joint Conference on Neural Networks (IJCNN), Budapest 2004, IEEE Press, pp. 1415-1420
    16. Marczak M, Duch W, Grudziński K, Naud A. (2002) Transformation Distances, Strings and Identification of DNA Promoters. In: Neural Networks and Soft Computing (eds. L. Rutkowski and J. Kacprzyk), Advances in Soft Computing, Physica Verlag (Springer), pp. 620-625, 2002.
    17. Duch W, Grudziński K (1999) The weighted k-NN method with selection of features and its neural realization. 4th Conference on Neural Networks and Their Applications, Zakopane, May 1999, pp. 191-196
    18. Duch W, Grudziński K (1999) Weighting and selection of features in Similarity Based Methods | PDF file. Intelligent Information Systems VIII, Ustroń, Poland, 14-18.06.1999, pp. 32-36

  3. Decision trees and separability criterion
    1. Grąbczewski, K. (2014). Meta-Learning in Decision Tree Induction. Springer International Publishing. | Show abstract

      The book focuses on different variants of decision tree induction and also describes the meta-learning approach in general, which is applicable to other types of machine learning algorithms. It is a useful source of information for readers wishing to review techniques used in decision tree learning, as well as different ensemble methods that involve decision trees. It is shown that knowledge of the different components used within decision tree learning needs to be systematized to enable the system to generate and evaluate different variants of machine learning algorithms, with the aim of identifying the top performers or potentially the best one. A unified view of decision tree learning makes it possible to emulate different decision tree algorithms simply by setting certain parameters. As meta-learning requires running many different processes with the aim of obtaining performance results, a detailed description of the experimental methodology and evaluation framework is provided. Meta-learning is discussed in great detail in the second half of the book. The exposition starts with a comprehensive review of meta-learning approaches described in the literature, including, for instance, approaches that provide a ranking of algorithms. The approach described can be related to other work that exploits planning to construct data mining workflows. The book stimulates an interchange of ideas between different, albeit related, approaches.

    2. Maszczyk T, Duch W. (2008) Comparison of Shannon, Renyi and Tsallis Entropy used in Decision Trees. Lecture Notes in Computer Science, Vol. 5097, 643-651, 2008. | Abstract.
    3. Grąbczewski K, Duch W. (2002) Heterogenous forests of decision trees. Lecture Notes in Computer Science Vol. 2415, 504-509, 2002.
    4. Grąbczewski K, Duch W. (2002) Forests of decision trees. Advances in Soft Computing, Physica Verlag (Springer), pp. 602-607, 2002.
    5. Grąbczewski K and Duch W. (2000) The separability of split value criterion.
      5th Conference on Neural Networks and Soft Computing, Zakopane, June 2000, pp. 201-208. Full Bibtex Entry
    6. Grąbczewski K, Duch W (1999) A general purpose separability criterion for classification systems, 4th Conference on Neural Networks and Their Applications, Zakopane, May 1999, pp. 203-208

  4. Heterogeneous systems
    1. Grąbczewski K, Duch W. (2002) Heterogenous forests of decision trees. Lecture Notes in Computer Science Vol. 2415 (2002) 504-509.
    2. Duch W, Grąbczewski K. (2002) Heterogeneous adaptive systems IEEE World Congress on Computational Intelligence, Honolulu, May 2002, pp. 524-529.

  5. Other machine learning algorithms
    1. Duch W, Itert L, Committees of Undemocratic Competent Models. | PDF file.
      International Conference on Artificial Neural Networks (ICANN) and International Conference on Neural Information Processing (ICONIP), Istanbul, June 2003, pp. 33-36
    2. Duch W, Itert L, Grudziński K. (2002) Competent undemocratic committees.
      In: Neural Networks and Soft Computing (eds. L. Rutkowski and J. Kacprzyk), Advances in Soft Computing, Physica Verlag (Springer), pp. 412-417, 2002.
    3. Duch W, Itert L. (2002) A posteriori corrections to classification methods. In: Neural Networks and Soft Computing (eds. L. Rutkowski and J. Kacprzyk), Advances in Soft Computing, Physica Verlag (Springer), pp. 406-411, 2002.
    4. Duch W, Adamczak R, Hayashi Y. (2000) Eliminators and classifiers, ICONIP-2000, 7th International Conference on Neural Information Processing, pp. 1029 - 1034, and Arxiv 1901.09632 in cs.LG
    5. Duch W. (2012) Future Trends in Computational Intelligence from the 2012 perspective. Foresight 2020, book chapter (2012); presented at the 11th International Conference on Artificial Intelligence and Soft Computing (ICAISC 2012), Zakopane, Poland.

Text Processing.

  1. Natural Language Processing
    1. Duch W. (2014) Komunikacja jako rezonans między mózgami, w: Współczesne oblicza komunikacji i informacji. Problemy, badania, hipotezy. Red. E. Głowacka, M. Kowalska, P. Krysiński. Wyd. Naukowe UMK, Toruń 2014, str. 19-50
    2. Matykiewicz P, Duch W. (2014) Multiple inheritance problem in semantic spreading activation networks. Brain Informatics and Health. Lecture Notes in Artificial Intelligence, Vol. 8609, 252-265, 2014.
    3. Szymański J, Duch W. (2012) Context Search Algorithm for Lexical Knowledge Acquisition. Control and Cybernetics 41(1), 1–16, 2012
    4. Szymański J, Duch W, Annotating Words Using WordNet Semantic Glosses. Lecture Notes in Computer Science 7666, pp. 180–187, Springer, Heidelberg, 2012
    5. Szymański J, Duch W. (2012) Self Organizing Maps for visualization of categories. Lecture Notes in Computer Science 7663, pp. 160–167, Springer, Heidelberg, 2012.
    6. Szymański J, Duch W. (2010) Representation of hypertext documents based on terms, links and text compressibility. Lecture Notes in Computer Science Vol. 6443, pp. 282-289, 2010.
    7. Szymański J, Duch W. (2010) Dynamic Semantic Visual Information Management. In: Series of Information and Management Sciences, California Polytechnic State University, 9th Int Conf on Information and Management Sciences (IMS 2010), pp. 130-138.
    8. Matykiewicz P, Duch W, Pestian J.P, Clustering semantic spaces of suicide notes and newsgroups articles. Proceedings of BioNLP Workshop, ACL, pp. 179-184, 2009.
    9. Szymański J, Duch W. (2008) Knowledge representation and acquisition for large-scale semantic memory. World Congress on Computational Intelligence (WCCI'08), Hong Kong, 1-6 June 2008, IEEE Press, pp. 3117-3124
    10. Szymański J, Sarnatowicz T, Duch W. (2008) Towards Avatars with Artificial Minds: Role of Semantic Memory. Journal of Ubiquitous Computing and Intelligence, American Scientific Publishers, 2, 1-11, 2008. | Abstract.
    11. Szymański J, Duch W. (2011) Induction of the common-sense hierarchies in lexical data. Lecture Notes in Computer Science Vol. 7063, pp. 726-734, 2011.
    12. Duch W, Szymański J. (2008) Semantic Web: Asking the Right Questions. Series of Information and Management Sciences, M. Gen, X. Zhao and J. Gao, Eds, California Polytechnic State University, CA, USA, pp. 456-463, 2008.
    13. Itert L, Duch W, Pestian J. (2007) Influence of a priori Knowledge on Medical Document Categorization, IEEE Symposium on Computational Intelligence in Data Mining, IEEE Press, April 2007, pp. 163-170 | Abstract.
    14. Pestian J, Brew C, Matykiewicz P, Hovermale D.J, Johnson N, Cohen K.B, Duch W. (2007) A shared task involving multi-label classification of clinical free text. BioNLP 2007: Biological, translational, and clinical language processing, pp. 97–104, ACL 2007. | Abstract.
    15. Rajan J, Davis K.C, Matykiewicz P, Duch W, Pestian J. (2007) Medical Acronym Disambiguation Using Online Sources. International Conference on Enterprise Information Systems and Web Technologies, pp. 123-130, 2007.
    16. Szymański J, Duch W. (2007) Semantic Memory Knowledge Acquisition Through Active Dialogues. 20th Int. Joint Conference on Neural Networks (IJCNN), Orlando, IEEE Press, August 2007, pp. 536-541 | Abstract.
    17. Szymański J, Duch W. (2007) Semantic Memory Architecture for Knowledge Acquisition and Management. 6th International Conference on Information and Management Sciences (IMS2007), July 1-6, 2007, California Polytechnic State University, CA, pp. 342-348 | Abstract.
    18. Pestian J.P, Itert L, Andersen C, Duch W. (2006) Preparing Clinical Text for Use in Biomedical Research. Journal of Database Management 17(2), 1-11, 2006.
    19. Matykiewicz P, Duch W, Pestian J. (2006) Nonambiguous Concept Mapping in Medical Domain, | PDF file.
      Lecture Notes in Artificial Intelligence, Vol. 4029, 941-950, 2006
    20. Matykiewicz P, Pestian J, Duch W, and Johnson N. (2006) Unambiguous Concept Mapping in Radiology Reports: Graphs of Consistent Concepts, AMIA Annu Symp Proc. 2006; 2006: 1024.
    21. Itert L, Duch W, Pestian J. (2005) Medical document categorization using a priori knowledge, Lecture Notes in Computer Science, Vol 3696, 641-646, 2005 | PDF file.
    22. Szymański J, Sarnatowicz T, Duch W. (2005) Semantic memory for avatars in cyberspace. 2005 International Conference on Cyberworlds, Singapore 23-25 Nov. 2005, T.L. Kunii, S.H. Soon and A. Sourin (eds), IEEE Computer Society, pp. 165-171 | PDF file.
    23. Duch W, Szymański J, Sarnatowicz T, Concept description vectors and the 20 question game. Intelligent Information Processing and Web Mining, Advances in Soft Computing, Springer Verlag, ISBN 3-540-25056-5 (Eds. Klopotek, M.A., Wierzchon, S.T., Trojanowski, K.), pp. 41-50, 2005.
    24. Duch W, Matykiewicz, P. Minimum Spanning Trees Displaying Semantic Similarity. Intelligent Information Processing and Web Mining, Advances in Soft Computing, pp. 31-40, 2005.
    25. Pestian J, Itert L, and Duch W. (2004) Development of a Pediatric Text-Corpus for Part-of-Speech Tagging. Advances in Soft Computing, Springer 2004, pp. 219-226

  2. Neurocognitive NLP
    1. Matykiewicz P, Duch W, Zender P.M, Crutcher K.A, Pestian J.P. (2009) Neurocognitive approach to clustering of PubMed query results. Lecture Notes in Computer Science 5507, pp. 70–79, 2009.
    2. Duch W, Matykiewicz P, and Pestian J, Neurolinguistic Approach to Natural Language Processing with Applications to Medical Text Analysis. Neural Networks 21(10), 1500-1510, 2008 | Abstract.
    3. Matykiewicz P, Duch W, Zender P.M, Crutcher K.A, Pestian J.P. (2008) Neurocognitive approach to clustering of PubMed query results.
      In: Neural Information Processing, 15th Int. Conference ICONIP 2008, Auckland, New Zealand, pp. 160-161, 2008.
    4. Duch W, Matykiewicz P, Pestian J. (2007) Neurolinguistic Approach to Vector Representation of Medical Concepts. 20th Int. Joint Conference on Neural Networks (IJCNN), Orlando, IEEE Press, August 12-17, 2007, pp. 3110-3115 | Abstract.
    5. Duch W, Matykiewicz P, Pestian J. (2007) Towards Understanding of Natural Language: Neurocognitive Inspirations. Lecture Notes in Computer Science, Vol. 4668, 953–962, 2007 | Abstract.

  3. Information Retrieval
    1. O. Sokolov, W. Osińska, A. Mreła, W. Duch. (2018) Modeling of scientific publications disciplinary collocation based on optimistic fuzzy aggregation norms.
      Advances in Intelligent Systems and Computing, Vol. 853, pp. 145-153
    2. Szymański J, Duch W. (2012) Information Retrieval with Semantic Memory model. Cognitive Systems Research 14 (2012) 84-100.
    3. Rzeniewicz J, Szymański J, and Duch W. (2012) Adaptive Algorithm for Interactive Question-based Search, Intelligent Information Processing 385, 186-195, 2012.
      Best Paper Award at the 7th International Conference on Intelligent Information Processing (IIP2012), Guilin, China, 12-15.10.2012.
    4. Szymański J, Duch W. (2011) Wizualizacja struktury Wikipedii do wspomagania wyszukiwania informacji. Wizualizacja wiedzy. Od Biblii pauperum do hipertekstu, Warszawa 2011, str. 312-319
    5. Duch W (1999) Inteligentne metody szukania informacji medycznych. IV Konferencja Internetu Medycznego, 12-13.11.1999 (materiały na CD-ROMie)

Visualization methods.

A fuzzy version of symbolic dynamics helps to understand dynamical systems (in particular neurodynamics) by showing their trajectories.
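A minimal sketch of the fuzzy symbolic dynamics idea (the probe positions, Gaussian width and toy trajectory are illustrative assumptions): membership functions centred at a few reference points in the state space map a high-dimensional trajectory to a low-dimensional path of membership values that can be plotted directly.

```python
import math

# Reference points ("probes") placed in the state space; the Gaussian
# membership of a state in each probe's neighbourhood gives its fuzzy symbol.
probes = [(0.0, 0.0), (1.0, 1.0)]
sigma = 0.5

def membership(state, probe):
    """Gaussian membership of a state in a probe's fuzzy region."""
    d2 = sum((s - p) ** 2 for s, p in zip(state, probe))
    return math.exp(-d2 / (2 * sigma ** 2))

def fuzzify(trajectory):
    """Map each state to its vector of probe memberships (2-D here)."""
    return [tuple(membership(x, p) for p in probes) for x in trajectory]

# A toy trajectory drifting from the first probe towards the second.
trajectory = [(t / 10, t / 10) for t in range(11)]
fuzzy_path = fuzzify(trajectory)
```

Plotting `fuzzy_path` shows the trajectory leaving the basin of the first probe and entering that of the second; attractor basins of a neurodynamical system show up as clusters on such plots.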

  1. Visualization of dynamics
    1. Duch W, Dobosz K. (2011) Visualization for Understanding of Neurodynamical Systems. Cognitive Neurodynamics 5(2), 145-160, 2011.
    2. Duch W, Dobosz K. (2011) Attractors in Neurodynamical Systems.
      Advances in Cognitive Neurodynamics II (eds. R. Wang, F. Gu), pp. 157-161, 2011
    3. Dobosz K, Duch W. (2010) Understanding Neurodynamical Systems via Fuzzy Symbolic Dynamics. Neural Networks 23 (2010) 487-496, 2010
    4. Duch W, Dobosz K, Jovanovic A, Klonowski W. (2010) Exploring the landscape of brain states NeuroMath COST Action BM0601, archived in Frontiers in Neuroscience.
    5. Dobosz K, Duch W. (2008) Fuzzy Symbolic Dynamics for Neurodynamical Systems. Lecture Notes in Computer Science, vol. 5164, 471-478, 2008.
    6. Dobosz K, Duch W. (2008) Global Visualization of Neural Dynamics. Neuromath Workshop, Dornburg Castle, Jena, Germany, 28-29 April 2008, pp. 15-16

  2. Visualization of static data: development of MDS, SOM, triangular visualization and other methods.

    1. Duch W (1995) Quantitative measures for the self-organized topographical mapping. Open Systems and Information Dynamics 2:295-302 (rediscovery of MDS and alternative formulation).
    2. Szymański J, Duch W. (2012) Self Organizing Maps for visualization of categories. T. Huang et al. (Eds.): Lecture Notes in Computer Science 7663, pp. 160–167. Springer, Heidelberg, 2012
    3. Maszczyk T, Grochowski M, Duch W. (2010) Discovering Data Structures using Meta-learning, Visualization and Constructive Neural Networks. In: Advances in Machine Learning II, Springer Studies in Computational Intelligence, Vol. 262, pp. 467-484.
    4. Maszczyk T, Duch W. (2008) Support Vector Machines for visualization and dimensionality reduction. Lecture Notes in Computer Science, vol. 5163, 346-356, 2008.
    5. Maszczyk T, Duch W. (2010) Triangular Visualization. Lecture Notes in Computer Science Vol. 6113, pp. 445-452, 2010.
    6. Naud A, Duch W. (2002) Visualization of large data sets using MDS combined with LVQ. In: Neural Networks and Soft Computing (eds. L. Rutkowski and J. Kacprzyk), Advances in Soft Computing, Physica Verlag (Springer), pp. 632-637, 2002.
    7. Naud A, Duch W. (2000) Interactive data exploration using MDS mapping.
      5th Conference on Neural Networks and Soft Computing, Zakopane, June 2000, pp. 255-260
    8. Duch W and Naud A (1996) Simplexes, Multi-Dimensional Scaling and Self-Organized Mapping 8th joint EPS-APS International Conference on Physics Computing '96, Kraków 17-21.9.1996, pp. 367-370
    9. Duch W and Naud A (1996) On global self-organizing maps European Symposium on Artificial Neural Networks (ESANN'96), Bruge 22-26.4.1996, pp. 91-96
    10. Duch W and Naud A (1996) Multidimensional scaling and Kohonen's self-organizing maps. Second Conference on Neural Networks and their applications, Orle Gniazdo, 30.IV-4.V.1996, pp. 138-143
    11. Duch W, Naud A (1996) Simplexes, Multi-Dimensional Scaling and Self-Organized Mapping. | Abstract. ICONIP'96, Hong Kong 24-27.9.1996 (submitted; missed the deadline); UMK-KMK-TR 2/96 report (1996)
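The core quantity behind the MDS-based visualizations above is the stress function, the mismatch between the original and mapped inter-point distances. A minimal sketch (the example points and the candidate embeddings are invented for illustration):

```python
import itertools
import math

def dist(a, b):
    """Euclidean distance between two points of any dimension."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def stress(high, low):
    """Raw MDS stress: summed squared mismatch between the original
    distances and the distances in the low-dimensional embedding."""
    pairs = itertools.combinations(range(len(high)), 2)
    return sum((dist(high[i], high[j]) - dist(low[i], low[j])) ** 2
               for i, j in pairs)

# Three points forming a 3-4-5 right triangle in 2-D ...
high = [(0.0, 0.0), (3.0, 0.0), (0.0, 4.0)]
# ... and two candidate 1-D embeddings; the second preserves two of the
# three distances exactly, so its stress is lower.
embedding_a = [(0.0,), (1.0,), (2.0,)]
embedding_b = [(0.0,), (3.0,), (-4.0,)]
```

An MDS algorithm minimizes this stress over the embedding coordinates, e.g. by gradient descent; the papers above combine such minimization with LVQ, SOM and simplex-based initializations.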

Cognitive systems/technologies.

Searching for models that serve as bridges between mind and brain. In the "mind space" model, mental events are represented using a neurofuzzy approach to probability density modelling. Activation of sequences of areas in the mind space represents object recognition, actions and mental trajectories, providing links between neurodynamics and the visualization of mental events.
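The "mind space" idea can be sketched with a toy neurofuzzy density model (the object names, centres and widths are illustrative assumptions, not taken from the papers): each mental object is a fuzzy density bump over a feature space, and recognition corresponds to the object whose density dominates at the currently activated point.

```python
import math

# Each "mental object" is a Gaussian density bump in a 2-D feature space.
objects = {
    "cat": ((1.0, 0.0), 0.3),   # (centre, width)
    "dog": ((0.0, 1.0), 0.3),
}

def activation(point, centre, width):
    """Fuzzy membership / density of an object at a point of the mind space."""
    d2 = sum((x - c) ** 2 for x, c in zip(point, centre))
    return math.exp(-d2 / (2 * width ** 2))

def recognize(point):
    """Recognition = the object whose density dominates at this point."""
    acts = {name: activation(point, c, w)
            for name, (c, w) in objects.items()}
    return max(acts, key=acts.get), acts

label, acts = recognize((0.9, 0.1))
```

A mental trajectory is then a sequence of such points, and its path through the density bumps links neurodynamics to a visualizable sequence of recognized objects.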

  1. Cognitive architectures and models
    1. Duch, W, Tan, Ah-Hwee, Franklin, Stan (2012). Cognitive architectures and autonomy: Commentary and Response. Special issue of the Journal of Artificial General Intelligence 3(2), 2012
    2. Duch W, Lee M (Eds). (2012) Preface to the special issue on "Computational Modeling and Application of Cognitive Systems". Cognitive Systems Research 14, 2012.
    3. Duch W, Oentaryo R.J, Pasquier M. (2008). Cognitive architectures: where do we go from here? In: Frontiers in Artificial Intelligence and Applications, Vol. 171 (Ed. by Pei Wang, Ben Goertzel, and Stan Franklin), IOS Press, pp. 122-136.
    4. Duch W (1996) From cognitive models to neurofuzzy systems - the mind space approach. Systems Analysis-Modelling-Simulation 24 (1996) 53-65
    5. Duch W (1995) From cognitive models to neurofuzzy systems
      In: E. Kącki, editor, Volume 3, Artificial Neural Networks and Their Applications (Polish Society of Medical Informatics, Łódź 1995), pp. 22-27 (Proc. from "System, Modelling, Control", Zakopane, 1-5.05.1995)

  2. Mind-brain relations
    1. Duch, W. (2019) Mind as a shadow of neurodynamics. Physics of Life Reviews, Special Issue "Physics of mind", Ed. F. Schoeller (IF. 13.8)
    2. Duch W, Nowakowski P, Wachowski W. (2011) What would the robots play? Interview with J. Kevin O’Regan (translated by Witold Wachowski). Avant. The Journal of the Philosophical-Interdisciplinary Vanguard Vol. 2(2), pp. 21-26, 125-131, 2011.
    3. Duch W. and Diercksen GHF (1995) Feature Space Mapping as a universal adaptive system. Computer Physics Communications 87: 341-371
    4. Duch W. (1998) What constitutes a good theory of mind? CogPrints as DI UMK Technical Preprint 1/1998 | PDF file.
    5. Duch W (2001) A solution to fundamental problems of cognitive sciences.
      Zamojskie Studia i Materiały (ISSN 1507-9090), no 6, pp. 45-70 (red. W. A. Kamiński, CBS Wyższa Szkoła Zarządzania i Administracji w Zamościu, 2001)
    6. Duch W (1994) Mind space approach to neurofuzzy systems.
      In: Proc. of the Japanese Neural Networks Society 1994, Tsukuba, 9-11.11.1994, Japan, pp. 173-174
    7. Duch W (1994) A solution to the fundamental problems of cognitive sciences. UMK-KMK-TR 1/94 report, also in the International Philosophical Preprint Exchange, WWW Archive
    8. Duch W (1994) Towards Artificial Minds. In: First National Conference "Neural Networks and Their Applications", Kule, 12-15.IV.1994, pp. 17-28

  3. Computational Creativity
    1. Duch W. (2007) Intuition, Insight, Imagination and Creativity. IEEE Computational Intelligence Magazine 2(3), August 2007, pp. 40-52
    2. Duch W. (2013) Computational Creativity. Encyclopedia of Systems Biology, W. Dubitzky, O. Wolkenhauer, H. Yokota, K-H Cho (Eds.), Springer 2013, pp. 464-468
    3. Pilichowski M, Duch W. (2013) BrainGene: computational creativity algorithm that invents novel interesting names. 2013 IEEE Symposium on Computational Intelligence for Human-like Intelligence (CIHLI) Singapore, April 2013, pp. 92-99.
    4. Pilichowski M, Duch W. (2009) Neurocognitive Approach to Creativity in the Domain of Word-invention. Lecture Notes in Computer Science 5507, pp. 88–96, 2009.
    5. Pilichowski M, Duch W. (2008) Neurocognitive Approach to Creativity in the Domain of Word-invention.
      In: Neural Information Processing, 15th Int. Conference ICONIP 2008, Auckland, New Zealand, pp. 481-482, 2008 (extended abstract).
    6. Duch W. (2006) Creativity and the Brain.
      A Handbook of Creativity for Teachers. Ed. Ai-Girl Tan, Singapore: World Scientific Publishing 2007, pp. 507-530. | Abstract.
    7. Duch W. (2006) Computational Creativity.
      Presented at the World Congress on Computational Intelligence (WCCI'06), Vancouver, 16-21 July 2006, IEEE Press, pp. 1162-1169 | PDF file.
    8. Duch W, Pilichowski M. (2007) Experiments with computational creativity. Neural Information Processing – Letters and Reviews, Vol. 11, No. 4-6, April/June 2007, pp. 123-133 | Abstract.

  4. Cognitive training systems
    1. Linowiecki R, Matulewski J, Bałaj B, Ignaczewska A, Dreszer J, Duch W. (2017) GCAF: Narzędzie do tworzenia eksperymentów dotyczących badania nabywania mowy i treningu z interakcją wzrokową u niemowląt (GCAF: a tool for creating experiments on speech acquisition and gaze-interaction training in infants, in Polish).
      Lingwistyka Stosowana (Applied Linguistics / Angewandte Linguistik) 20, 83-99.
    2. Gut M, Goraczewski Ł, Matulewski J, Finc K, Ignaczewska A, Bałaj B, Dreszer J, Kmiecik M, Stępińska J, Majewski J, Bendlin E, Cholewa P, Duch W. (2016) Trening poznawczy przy użyciu komputerowej gry matematycznej a przetwarzanie informacji numerycznej u dzieci – wyniki badań pilotażowych (Cognitive training with a computer mathematical game and numerical information processing in children – pilot study results, in Polish). Book chapter in: Cywilizacja zabawy, rozrywki i wypoczynku, M. Suchacka (Ed.), Wydawnictwo E-bookowo, pp. 57-83.

    3. Sokolov O, Dobosz K, Dreszer J, Bałaj B, Duch W, Grzelak S, Komendzinski T, Mikolajewski D, Piotrowski T, Świerkocka M, and Weber, P. (2015) Intelligent emotions stabilization system using standardized images, breath sensor and biofeedback - preliminary findings - short communication. Journal of Education, Health and Sport. 5(2):260-268, 2015.

  5. Medical applications
    1. Guang Lan Zhang, Hifzur Rahman Ansari, Phil Bradley, Gavin C. Cawley, Tomer Hertz, Xihao Hu, Jim C. Huang, Nebojsa Jojic, Yohan Kim, Oliver Kohlbacher, Ole Lund, Claus Lundegaard, Craig A. Magaret, Morten Nielsen, Harris Papadopoulos, G. P. S. Raghava, Vider-Shalit Tal, Li Xue, Chen Yanover, Hao Zhang, Shanfeng Zhu, Michael T. Rock, James E. Crowe Jr., Christos Panayiotou, Marios M. Polycarpou, Włodzisław Duch, Vladimir Brusic. Machine Learning Competition in Immunology – Prediction of HLA class I molecules. Journal of Immunological Methods 30;374(1-2):1-4, 2011.
      This was the first paper to define the field of computational immunology based on machine learning.
    2. Sokolov, O., Meszyński, S., Weber, P., Mikołajewski, D., Piotrowski, T., Dreszer, J., Bałaj, B., Grzelak, S., Duch, W., Komendziński, T., Dobosz, K., Świerkocka, M. (2015) Multiagent modeling of emotions influence on physiological systems: new concept. Journal of Education, Health and Sport. 5(1):221-240, 2015.
    3. Sokolov O, Dobosz K, Dreszer J, Bałaj B, Duch W, Grzelak S, Komendzinski T, Mikolajewski D, Piotrowski T, Świerkocka M, and Weber, P. (2015) Spirometry Data Analysis and Monitoring in Medical and Physiological Tests. Journal of Education, Health and Sport. 2015;5(3):35-46. ISSN 2391-8306. DOI: 10.5281/zenodo.16171
    4. Sokolov O, Dobosz K, Dreszer J, Bałaj B, Duch W, Grzelak S, Komendziński T, Mikołajewski D, Piotrowski T, Świerkocka M, Weber P. Intelligent emotions stabilization system using standardized images, breath sensor and biofeedback - new concept. 2014 IEEE Symposium on Computational Intelligence in Healthcare and e-health (CICARE) Proceedings, Orlando, Florida, USA, December 9-12, 2014, P. 48-55.
    5. Duch W, Dobosz K. (2013) Sieci neuronowe w modelowaniu chorób psychicznych (Neural networks in modeling mental disorders, in Polish), book chapter in: Tadeusiewicz R, Duch W, Korbicz J, Rutkowski L (Eds), Sieci neuronowe w inżynierii biomedycznej. Wyd. Exit, pp. 637-666, 2013; Russian translation available.
    6. Matykiewicz P, Duch W, Pestian J.P. (2009) Clustering semantic spaces of suicide notes and newsgroups articles. Proceedings of BioNLP Workshop, ACL, pp. 179-184, 2009.
    7. Duch, W. (2003) Complex Agents for Socio-Cognitive Engineering.
      First International Workshop on Socio-Cognitive Engineering Foundations (SCEF-2003), Rome, Italy, Sept 30 - Oct 1, 2003
    8. Duch W. (1998) Rewolucja informatyczna w medycynie (The information revolution in medicine, in Polish). Kardiologia Polska Supp. III, 49 (1998) 87-92

Brain functions.

  1. Consciousness
    1. Duch W. (2018) Hylomorphism extended: dynamical forms and minds. Philosophies 2018, 3, 36.
    2. Duch W. (2017) Why minds cannot be received, but are created by brains

      There is no controversy in psychology or the brain sciences that brains create mind and consciousness. Doubts and opinions to the contrary are quite frequently expressed in non-scientific publications. In particular, the idea that the conscious mind is received, rather than created, by the brain is quite often used against a “materialistic” understanding of consciousness. I summarize here arguments against such a position, show that neuroscience gives a coherent view of mind and consciousness, and that this view is intrinsically non-materialistic.

      Scientia et Fides 5(2), 171-198.
    3. Duch W, Nowak W, Meller J, Osinski G, Dobosz K, Mikołajewski D, and Wójcik G.M. (2011) Consciousness and attention in autism spectrum disorders. In: Proceedings of Cracow Grid Workshop 2010, pp. 202-211, 2011.
    4. Duch W. (2009) Umysł, świadomość i działania twórcze (Mind, consciousness and creativity, in Polish). Kognitywistyka i Media w Edukacji 1-2, 9-40, 2008 (appeared in 2009).
    5. Duch W. (2008) Consciousness, Imagery and Music.
      COST BM0605 Meeting, Consciousness: A transdisciplinary, integrated approach, Ghent, Belgium, Nov. 2008 (abstract), pp. 15-16.

    6. Duch W. (2005) Brain-inspired conscious computing architecture. Journal of Mind and Behavior, Vol. 26(1-2), 1-22, 2005
    7. Duch W. (2003) Just bubbles? Commentary on Steven Lehar, Gestalt Isomorphism and the Primacy of Subjective Conscious Experience: A Gestalt Bubble Model. Behavioral and Brain Sciences 26(4) (2003) 410-411
    8. Duch W. (2003) Neurokognitywna teoria świadomości (A neurocognitive theory of consciousness, in Polish). Studia z kognitywistyki i filozofii umysłu (Eds. W. Dziarnowska, A. Klawiter). Vol. I, Subiektywność a świadomość. Zysk i S-ka, Poznań 2003, pp. 133-154.
    9. Duch W. (2002) Geometryczny model umysłu (A geometric model of mind, in Polish). Kognitywistyka i Media w Edukacji, Vol. 6 (2002) 199-230 (appeared in 2003)
    10. Duch W. (2002) Fizyka umysłu (Physics of mind, in Polish). Postępy Fizyki 53D (2002), 92-103
    11. Duch W. (2002) Neurokognitywna teoria świadomości (A neurocognitive theory of consciousness, in Polish). Kognitywistyka i Media w Edukacji, T.5 (2) 2001, pp. 47-67 | PDF file.
    12. Duch W. (2001) Facing the hard question.
      Commentary on J.A. Gray, The contents of consciousness: a neuropsychological conjecture. Behavioral and Brain Sciences 24 (2001) 187-188
    13. Duch W. (2001) Psychophysics. Zamojskie Studia i Materiały (ISSN 1507-9090), no. 6, pp. 71-86 (Ed. W. A. Kamiński, CBS Wyższa Szkoła Zarządzania i Administracji w Zamościu, 2001)
    14. Duch W (2001) Psychika i świadomość (Psyche and consciousness, in Polish). Encyklopedia Multimedialna PWN, 2001 (7 pp.)
    15. Duch W. (2000) Świadomość i dynamiczne modele działania mózgu (Consciousness and dynamical models of brain function, in Polish). Neurologia i Neurochirurgia Polska T. 34 (50), Supl. 2, 2000, pp. 69-84
    16. Duch W. (1999) Jaka teoria umysłu w pełni nas zadowoli? (What theory of mind will fully satisfy us?, in Polish). Kognitywistyka i Media w Edukacji 3(1) (2000) 29-53
    17. Duch W (1997) Platonic model of mind as an approximation to neurodynamics. In: Brain-like computing and intelligent information systems, ed. S-i. Amari, N. Kasabov (Springer, Singapore 1997), chap. 20, pp. 491-512
    18. Duch W (1996) Computational physics of the mind. | PDF file. Computer Physics Communications 97: 136-153
    19. Duch W (1995) Transparent theory of consciousness: is there a problem? Behavioral and Brain Sciences (submitted, KMK TR 1/95)
    20. Duch W (1995) Physics of consciousness. 4th National Conference "Modeling of Biological Systems", Kraków, 2-3.06.1995, pp. 101-114

  2. Neurodynamics.
    1. Duch W, Dobosz K. (2011) Attractors in Neurodynamical Systems. Advances in Cognitive Neurodynamics II (eds. R. Wang, F. Gu), pp. 157-161, 2011
    2. Duch W. (2011) Neurodynamics and the mind. International Joint Conference on Neural Networks, San Jose, CA, IEEE Press, pp. 3227-3234, 2011.
    3. Duch W (1998) Neural implementation of psychological spaces. | PDF file. Int. Conf. on Neural Network and Brain, Beijing, China, October 1998, pp. 32-35
    4. Duch W (1996) On simplifying brain functions | Abstract. Second Conference on Neural Networks and Their Applications, Orle Gniazdo, 30.IV-4.V.1996, pp. 118-124
    5. Duch W (1996) Categorization, Prototype Theory and Neural Dynamics. Proc. of the 4th International Conference on Soft Computing '96, Iizuka, Japan, ed. T. Yamakawa and G. Matsumoto, pp. 482-485 (invited paper, cognitive science session)

  3. Imagery.
    1. Duch W. (2013) Amuzja Wyobrażeniowa (Imagery Amusia, in Polish), book chapter in: Neuroestetyka muzyki, Eds. P. Podlipniak, P. Przybysz. Wydawnictwo Poznańskiego Towarzystwa Przyjaciół Nauk 2013, pp. 243-266.
    2. Duch W. (2008) Consciousness, Imagery and Music.
      COST BM0605 Meeting, Consciousness: A transdisciplinary, integrated approach, Ghent, Belgium, Nov. 2008 (abstract), pp. 15-16.
    3. Duch W, Klonowski W, Perovic A, Jovanovic A. (2008) Some computational aspects of the Brain Computer Interfaces based on Inner Music. Neuromath, Sept. 2008.
    4. Duch W. (2007) Neuroestetyka i ewolucyjne podstawy przeżyć estetycznych (Neuroaesthetics and the evolutionary basis of aesthetic experience, in Polish). Współczesna Neuroestetyka, Wyd. Poli-Graf-Jak, Poznań 2007, pp. 47-52.

  4. Cognitive science.
    1. Duch W. (2019) Tańczące mózgi, tańczące ciała (Dancing brains, dancing bodies, in Polish).
      Chapter in the monograph for the 45th anniversary of the Polish Dance Theatre, Wielość spojrzeń na taniec. 1973-2018. Wyd. Polski Teatr Tańca, Poznań, pp. 91-107. ISBN 978-83-951669-0-7
    2. Duch W. (2017) Where will the neurosciences lead us? Interfaces, codes, symbols. The future of communication. City of the Future / Laboratory Wroclaw. E-book chapter, pp. 204-222, 2017
    3. Duch W. (2016) Gdzie nas zaprowadzą neuronauki? (Polish version of "Where will the neurosciences lead us?")
      Interfejsy, kody, symbole. Przyszłość komunikowania. Miasto Przyszłości / Laboratorium Wrocław (e-book chapter), pp. 20-39. ISBN: 978-83-946602-8-4
    4. Duch W. (2001) Kognitywistyka drogą do zrozumienia człowieka (Cognitive science as a road to understanding human beings, in Polish). Kognitywistyka i Media w Edukacji, T.5 (2) 2001, pp. 40-42.
    5. Duch W (1999) Duch i dusza, czyli prehistoria kognitywistyki (Spirit and soul, or the prehistory of cognitive science, in Polish).
      Kognitywistyka i Media w Edukacji 1 (1999) pp. 7-38
    6. Duch W (1998) Czym jest kognitywistyka? (What is cognitive science?, in Polish). Kognitywistyka i Media w Edukacji 1 (1998) pp. 9-50

  5. Brain signals.
    1. Jovanovic A, Perovic A, Klonowski W, Duch W, Dordević Z, Spasić S. (2010) Detection of Structural Features in Biological Signals. Journal of Signal Processing Systems (Springer), 60(1), 115-129, 2010.
    2. Perovic A, Klonowski W, Duch W, Djordjevic Z and Jovanovic A. (2010) Weak brain connectivity and causality measures
      NeuroMath COST Action BM0601, archived in Frontiers in Neuroscience.
    3. Spasić S, Perović A, Klonowski W, Duch W, Jovanovic A. (2010) Forensics of Features in the Spectra of Biological Signals. International Journal of Bioelectromagnetism 12(2): 62 - 75, 2010
    4. Klonowski W, Duch W, Perovic A, Jovanovic A, Some Computational Aspects of the Brain Computer Interfaces Based on Inner Music. Computational Intelligence and Neuroscience, Vol 2009, Article ID 950403, 9 pages, 2009.
    5. Klonowski W, Duch W, Perovic A, Jovanovic A. (2009) Automatic detection of spectroscopic features. Proc. of Neuromath, Leuven 2009.
    6. Duch W, Klonowski W, Perovic A, Jovanovic A, Some computational aspects of the Brain Computer Interfaces based on Inner Music. Neuromath, Leuven 12-13.03.2009, pp. xx (abstract).
    7. Perovic A, Klonowski W, Duch W, Djordjevic Z, Jovanovic A, Detection of structural features in brain signals and causality tests. COST Consciousness, Cyprus 2009, (poster P04, extended abstract).
    8. Klonowski W, Duch W, Djordjevic Z, Spasic S, Perovic A, Jovanovic A, Automatic recognition of features in biological signals.
      COST Neuromath: Advanced Methods for the Estimation of Human Brain Activity and Connectivity, Sept. 2009 (extended abstract).
    9. Duch W, Klonowski W, Perovic A, Jovanovic A. (2008) Some computational aspects of the Brain Computer Interfaces based on Inner Music. Neuromath, Sept. 2008.

  6. Other brain functions.
    1. Duch W. (2013) Brains and Education: Towards Neurocognitive Phenomics. In: "Learning while we are connected", Vol. 3, Eds. N. Reynolds, M. Webb, M.M. Sysło, V. Dagiene. pp. 12-23.
    2. Duch W. (2013) Mózgi i Edukacja: w stronę Neurokognitywnej Fenomiki (Brains and Education: Towards Neurocognitive Phenomics, in Polish). Informatyka w Edukacji, Toruń, 2-5.07.2013, pp. 1-14.
    3. Mikulska K, Jakubowski R, Peplowski L, Dabrowski M, Gogolinska A, Duch W, Nowak W. (2011) On applications of virtual atomic force microscope in studies of brain proteins, European Biophysics Journal With Biophysics Letters S42, p. 93
    4. Duch W. (2008) Perspektywy neuromarketingu (Prospects of neuromarketing, in Polish). In: Neuromarketing. Interdyscyplinarne spojrzenie na klienta, H. Mruk, M. Schneider (Eds.), Wyd. Uniw. Przyrodniczego w Poznaniu, pp. 39-49, 2008.
    5. Duch W. (2001) Models of topographic map formation and comparison with experimental data. 5th Congress of the Polish Neuroscience Society, Torun, Poland, September 2001, 1 p. abstract.

Neuroinformatics, neuropsychiatry and mental models.

  1. Neuroinformatics
    1. Aghabeig M, Bałaj B, Dreszer J, Lewandowska M, Milner R, Pawlaczyk N, Piotrowski T, Szmytke N, and Duch W.
      Perception of non-native phoneme contrasts in 8-13-month-old infants: tensor-based analysis of EEG signals. 27th European Signal Processing Conference, EUSIPCO 2019, A Coruña, Spain, 2-6 Sept. 2019
    2. Bonna, Kamil; Finc, Karolina; Zimmermann, Maria; Bola, Łukasz; Szul, Maciej; Mostowski, Piotr; Rutkowski, Paweł; Duch, Włodzisław; Marchewka, Artur; Jednoróg, Katarzyna; Szwed, Marcin.
      Early deafness leads to re-shaping of global functional connectivity beyond the auditory cortex. arXiv.org > q-bio > arXiv:1903.11915 and in review in Brain Imaging and Behavior (sub. 12/2018)
    3. Duch W. (2018), Multi-level Explanations in Neuroscience I: From genes to subjective experiences. Acta Physica Polonica B 49(12), 1981-2010.
    4. Komorowski M, Aghabeig M, Nikadon J, Piotrowski T, Dreszer J, Bałaj B, Lewandowska M, Wojciechowski J, Pawlaczyk N, Szmytke M, Cichocki A, and Duch W. (2018) Multi-level Explanations in Neuroscience II: EEG Spectral Fingerprints and Tensor Decompositions for Understanding Brain Activity - initial results. Acta Physica Polonica B 49(12), 2011-2028.
    5. Finc K, Bonna K, Lewandowska M, Wolak T, Nikadon J, Dreszer J, Duch W, Kühn S. (2017) Transition of the functional brain network related to increasing cognitive demands. Human Brain Mapping 38(7), 3659–3674.

      Network neuroscience provides tools that can easily be used to verify the main assumptions of the global workspace theory (GWT), such as the existence of highly segregated information processing during effortless task performance, the engagement of multiple distributed networks during effortful tasks, and the critical role of long-range connections in workspace formation. A number of studies support the assumptions of GWT by showing the reorganization of the whole-brain functional network during cognitive task performance; however, the involvement of specific large-scale networks in the formation of the workspace is still not well understood. The aims of our study were: (1) to examine changes in the whole-brain functional network under the increased cognitive demands of working memory during an n-back task, and their relationship with behavioral outcomes; and (2) to provide a comprehensive description of local changes that may be involved in the formation of the global workspace, using hub detection and the network-based statistic. Our results show that network modularity decreased with increasing cognitive demands, and this change allowed us to predict behavioral performance. The number of connector hubs increased, whereas the number of provincial hubs decreased when the task became more demanding. We also found that the default mode network (DMN) increased its connectivity to other networks while decreasing connectivity between its own regions. These results, apart from replicating previous findings, provide valuable insight into the mechanisms of the formation of the global workspace, highlighting the role of the DMN in the processes of network integration.
    6. Weihui Dai, Wlodzislaw Duch, Abdul Hanan Abdullah, Dongrong Xu, and Ye-sho Chen. (2015) Recent Advances in Learning Theory. Computational Intelligence and Neuroscience, Vol. 2015, Article ID 395948.
    7. Duch W. (2015) Memetics and Neural Models of Conspiracy Theories. arXiv.org > q-bio > arXiv:1508.04561
    8. Duch W. (2009) Neurocognitive Informatics Manifesto. In: Series of Information and Management Sciences, California Polytechnic State University, 8th Int Conf on Information and Management Sciences (IMS 2009), Kunming-Banna, Yunnan, China, pp. 264-282.
    9. Chojnowski A, Duch W. (2000) Analiza szeregów czasowych obrazów (Analysis of image time series, in Polish). Biocybernetyka 2000, Vol. 6: Sieci neuronowe (Eds. W. Duch, J. Korbicz, L. Rutkowski, R. Tadeusiewicz), chap. II.17, pp. 569-588
    10. Duch W, Bekkering H, Neggers B (1997) Sensorimotor integration. Technical Report CIL-KMK/MPPF-4/97
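The graph measures named in the Finc et al. (2017) abstract above (modularity, connector vs. provincial hubs) can be illustrated on a toy network. This is a minimal, self-contained sketch, not the authors' pipeline: the edge list, partition and node labels are invented for illustration, and Newman's modularity Q and the participation coefficient are computed directly from their textbook definitions.

```python
# Toy "functional network": two dense modules {0..3} and {4..7},
# joined by a single bridging edge (3,4).
edges = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3),
         (4, 5), (4, 6), (4, 7), (5, 6), (5, 7), (6, 7),
         (3, 4)]
nodes = range(8)
comm = {n: 0 if n < 4 else 1 for n in nodes}  # module assignment

deg = {n: 0 for n in nodes}
for u, v in edges:
    deg[u] += 1
    deg[v] += 1
m = len(edges)

def modularity():
    # Newman's Q = sum over modules c of (e_c/m - (d_c/2m)^2), where
    # e_c = within-module edges and d_c = total degree of module c
    e, d = {}, {}
    for u, v in edges:
        if comm[u] == comm[v]:
            e[comm[u]] = e.get(comm[u], 0) + 1
    for n in nodes:
        d[comm[n]] = d.get(comm[n], 0) + deg[n]
    return sum(e.get(c, 0) / m - (d[c] / (2 * m)) ** 2 for c in d)

def participation(n):
    # P_n = 1 - sum over modules c of (k_nc/k_n)^2; high P marks
    # "connector" hubs, P near 0 marks "provincial" hubs
    per = {}
    for u, v in edges:
        if u == n:
            per[comm[v]] = per.get(comm[v], 0) + 1
        elif v == n:
            per[comm[u]] = per.get(comm[u], 0) + 1
    return 1.0 - sum((k / deg[n]) ** 2 for k in per.values())

Q = modularity()
print(f"Q = {Q:.3f}")                                  # ~0.423 here
print(f"P(bridge node 3) = {participation(3):.3f}")    # connector-like
print(f"P(interior node 0) = {participation(0):.3f}")  # provincial
```

On this toy graph the bridging node gets a high participation coefficient (connector-like), while nodes whose edges stay inside one module score zero (provincial); the abstract's finding is that the balance between these hub types shifts as cognitive load grows.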

  2. Autism Spectrum Disorders
    1. Gravier A, Quek H.C, Duch W, Abdul Wahab, Gravier-Rymaszewska J. Neural network modelling of the influence of channelopathies on reflex visual attention. Cognitive Neurodynamics 10(1), 49-72, 2016.
    2. Duch W. (2019), Autism Spectrum Disorder and deep attractors in neurodynamics

      Although all Research Domain Criteria (RDoC) units of analysis are important, understanding the mechanics of mental functions should be done at the circuit level. Functions of neural networks depend on the cellular, molecular and genetic levels. Complex functions responsible for behavior result from neurodynamics. Therefore a good strategy, which should help to find causal relations between different levels of analysis and show how RDoC psychological constructs emerge from biology, is to identify the biophysical parameters of neurons required for normal neural network activity and to explore all changes that may lead to abnormal functions, behavioral symptoms, cognitive phenotypes and syndromes. Computational simulations of neurodynamics generate hypotheses for experimental verification and help to interpret neuroimaging data. Neurodynamics provides a language that relates measurable brain processes to RDoC psychological constructs. As an example of such an approach I focus here on the Autism Spectrum Disorders (ASD). Many confusing observations may find an explanation at this level and lead to hypotheses that may be experimentally verified.

      Book chapter, Springer Handbook of Multi-Scale Models of Brain Disorders: From Microscopic to Macroscopic Assessment of Brain Dynamics (accepted, in press 4/2019).
    3. Duch W, Dobosz K, Mikołajewski D. (2013) Autism and ADHD – two ends of the same spectrum?
      Lecture Notes in Computer Science Vol. 8226, pp. 623-630, 2013.
      Presented at the 20th International Conference on Neural Information Processing 2013, Daegu, Korea.
    4. Dobosz K, Mikołajewski D, Wójcik G.M, Duch W. (2013) Simple cyclic movements as a distinct autism feature - computational approach.
      Computer Science 14(3), pp. 475-490, 2013. DOI: 10.7494/csci.2013.14.3.475
    5. Duch W, Nowak W, Meller J, Osiński G, Dobosz K, Mikołajewski D, and Wójcik G.M. (2012) Computational approach to understanding autism spectrum disorders.
      Computer Science 13(2): 47–61, 2012.
    6. Duch W, Nowak W, Meller J, Osinski G, Dobosz K, Mikołajewski D, and Wójcik G.M. (2011) Consciousness and attention in autism spectrum disorders.
      In: Proceedings of Cracow Grid Workshop 2010, pp. 202-211, 2011.
    7. Duch W. (2011) From autism to ADHD: comprehensive theory based on computational simulations. In: Models of Physiology and Disease Symposium, Center for Life Sciences, NUS, Singapore, p. 34
    8. Alexandre Gravier, Quek Hiok Chai, Wlodzislaw Duch, Abdul Wahab. (2011) Modeling bottom-up visual pathway to assess the influence of individual neuron dynamics on reflex attention. In: Models of Physiology and Disease Symposium, Center for Life Sciences, NUS, Singapore.
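The "deep attractors" idea in the Duch (2019) abstract above can be illustrated with a generic toy model. This is not the chapter's simulator: it is a minimal Hopfield-style network with noisy Glauber dynamics, in which a single gain parameter scales the Hebbian weights; network size, temperature and the gain values are arbitrary illustrative choices. With weak synapses, noise quickly kicks the state out of the stored pattern; with strong synapses, the attractor basin is deep enough to trap the dynamics.

```python
import math
import random

random.seed(1)
N = 64
pattern = [random.choice([-1, 1]) for _ in range(N)]  # one stored state

def run(gain, T=1.0, steps=3000):
    # Hebbian weights for the single pattern, scaled by 'gain';
    # larger gain makes the attractor basin deeper.
    W = [[gain * pattern[i] * pattern[j] / N if i != j else 0.0
          for j in range(N)] for i in range(N)]
    s = pattern[:]  # start inside the attractor
    for _ in range(steps):
        i = random.randrange(N)
        h = sum(W[i][j] * s[j] for j in range(N))     # local field
        p_up = 1.0 / (1.0 + math.exp(-2.0 * h / T))   # Glauber rule
        s[i] = 1 if random.random() < p_up else -1
    # overlap with the stored pattern: 1 means still in the attractor
    return abs(sum(a * b for a, b in zip(s, pattern)) / N)

shallow = run(gain=0.4)  # weak synapses: noise destroys the memory
deep = run(gain=4.0)     # strong synapses: the state stays trapped
print(f"final overlap: shallow={shallow:.2f}, deep={deep:.2f}")
```

The clinical reading (e.g. attention or behavior stuck in overly deep attractor states in ASD) is the paper's hypothesis; the sketch only demonstrates the underlying dynamical mechanism of basin depth controlled by a biophysical parameter.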

  3. Brain stem models
    1. Mikołajewski D, Duch W. (2018) Brain stem modeling at a system level – chances and limitations. Bio-Algorithms and Med-Systems 20180015.
    2. Mikołajewski D, Duch W (2013) Modelowanie pnia mózgu (Brain stem modeling, in Polish), book chapter in: Tadeusiewicz R, Korbicz J, Rutkowski L, Duch W (Eds), Sieci neuronowe w inżynierii biomedycznej. Wyd. Exit, pp. 605-636, 2013; Russian translation available.
    3. Dobosz K, Osiński G, Duch W, Computational model of brain stem functions. Presented at the Neuromath Workshop, Rome, 1-5 Dec. 2007, p. 34

  4. Other Brain Disorders
    1. Duch W, Dobosz K. (2013) Sieci neuronowe w modelowaniu chorób psychicznych (Neural networks in modeling mental disorders, in Polish), book chapter in: Tadeusiewicz R, Duch W, Korbicz J, Rutkowski L (Eds), Sieci neuronowe w inżynierii biomedycznej. Wyd. Exit, pp. 637-666, 2013; Russian translation available.
    2. Cutsuridis V, Heida C , Duch W, Doya K. (2011) Neurocomputational Models of Brain Disorders. Preface to the special issue of the Neural Networks 24(6), 513-514, 2011.
    3. Duch W. (2007) Computational Models of Dementia and Neurological Problems. Chapter 17 of a book: Neuroinformatics, C.J. Crasto (Ed), "Methods in Molecular Biology" series (J. Walker, series ed.), Humana Press, Totowa, NJ, pp. 307-336, 2007 | Abstract
    4. Duch W. (2000) Therapeutic applications of computer models of brain activity for Alzheimer disease. J. Medical Informatics and Technologies 5 (2000) 27-34
    5. Duch W. (2000) Sieci neuronowe w modelowaniu zaburzeń neuropsychologicznych i chorób psychicznych (Neural networks in modeling neuropsychological disorders and mental illness, in Polish). Biocybernetyka 2000, Vol. 6: Sieci neuronowe, chap. II.18, pp. 589-616

Artificial Intelligence.

In 1992 W. Duch published the first preprints on noisy data trees, and a year later two ideas were presented in preprints: "Syntactic and semantic information in finite systems", introducing a pragmatic information measure that can be used to evaluate the value of new data for cognitive systems, including knowledge-based systems, and "Complex systems, information theory and neural networks".

  1. Artificial Intelligence algorithms
    1. Matykiewicz P, Duch W. (2014) Multiple inheritance problem in semantic spreading activation networks. Brain Informatics and Health. Lecture Notes in Artificial Intelligence Vol. 8609, 252-265, 2014.
    2. Duch, W, Tan, Ah-Hwee, Franklin, Stan (2012). Cognitive architectures and autonomy: Commentary and Response. Special issue of the Journal of Artificial General Intelligence 3(2), 2012
    3. Duch W. (2010) Architektury kognitywne (Cognitive architectures, in Polish).
      In: Neurocybernetyka teoretyczna, Wyd. Uniwersytetu Warszawskiego, chap. 14, pp. 329-361, ed. R. Tadeusiewicz
    4. Duch W, Oentaryo R.J, Pasquier M. (2008). Cognitive architectures: where do we go from here? In: Frontiers in Artificial Intelligence and Applications, Vol. 171 (Ed. by Pei Wang, Ben Goertzel, and Stan Franklin), IOS Press, pp. 122-136.
    5. Duch W (1997) Artificial Intelligence Support for Computational Chemistry. Advances in Quantum Chemistry 28: 329-343
    6. Duch W and Jankowski N (1994) Complex Systems, Information Theory and Neural Networks. In: Proceedings of the First National Conference "Neural Networks and Their Applications", Kule, 12-15.IV.1994, pp. 224-230
    7. Duch W (1993) Complex systems, Information Theory and Neural Networks.
      UMK-KMK-TR 5/93 report
    8. Duch W (1993) Syntactic and semantic information in finite systems. UMK-KMK-TR 1/93 report
    9. Duch W (1992) Noisy data trees or learning without adaptation. UMK-KMK-TR 2/92 report
    10. Duch W. (2002) Sceptycyzm wobec sceptycyzmu (kontynuacja dyskusji o AI) (Scepticism about scepticism – a continuation of the AI discussion, in Polish). Kognitywistyka i Media w Edukacji, 2002.

Psychology and philosophy.

  1. Psychology
    1. Duch W. (2018), Kurt Lewin, psychological constructs and sources of brain cognitive activity. Polish Psychological Forum 23(1), 5-19, and arXiv:1711.01767, Neurons and Cognition

      Understanding mind-brain-environment relations is one of the key topics in psychology. Kurt Lewin, inspired by theoretical physics, tried to establish topological and vector psychology analyzing patterns of interaction between the individual and her/his environment. The time is ripe to reformulate his ambitious goals, searching for ways to interpret objectively measured brain processes in terms of suitable psychological constructs. Connecting cognitive and social psychology constructs to neurophenomics, as it is done now in psychiatry, should ground them in physical reality.
    2. Duch W. (2018) Eliminatywizm i konstrukty psychologiczne (Eliminativism and psychological constructs, in Polish). Chapter in the book "Filozof w Krainie Umysłów", dedicated to Andrzej Klawiter on the 45th anniversary of his scientific work. Wyd. Naukowe Wydziału Nauk Społecznych UAM, Poznań, pp. 201-216, 2018

      One of the central problems of cognitive science research is the relation between mental constructs, which allow observed mental and behavioral phenomena to be described intersubjectively at the verbal level, and the objectively measurable properties of the processes that create them. As a result we have endless disputes about "consciousness", "thinking", "understanding" or "intelligence", especially in the context of the development of artificial intelligence in recent years. Physical concepts such as "energy", "radiation" or "magnetism" are used in pseudoscientific theories in meaningless ways. Arguments based on concepts that are not well defined do not contribute to a better understanding of mental processes.
      The question thus arises whether the concepts of folk psychology and scientific psychology will allow us to describe reality, and how accurate a description we can hope for. Can we verbalize the behavior of highly complex nonlinear dynamical systems, which allow brain processes to be described ever more accurately? Can these processes be described in terms of unambiguous concepts, so that classical logic can be used to decide whether they occurred? Or should all the concepts of folk psychology be eliminated? What should replace them, and how can we sensibly build the theories needed to communicate? (Abstract translated from Polish.)
    3. Duch W. (2012) What can we know about ourselves and how do we know it?
      In: Ed. B. Buszewski, M. Jaskuła, The World Without Borders - Science Without Borders. Societas Humboldtiana Polonorum, 2012, pp. 181-208.
    4. Duch W. (2011) Jak reprezentowane są pojęcia w mózgu i co z tego wynika (How concepts are represented in the brain and what follows from it, in Polish). Book chapter in: „Pojęcia. Jak reprezentujemy i kategoryzujemy świat”, Eds. J. Bremer, A. Chuderski, Wyd. TAiWPN, Kraków 2011, pp. 459-494
    5. Duch W. (2011) Neuronauki i natura ludzka. Krótkie uwagi (Neurosciences and human nature: brief remarks, in Polish). Wszechświat, Pismo Przyrodnicze, Vol. 117, no. 1-3.
    6. Duch W. (2010) Reprezentacje umysłowe jako aproksymacje stanów mózgu (Mental representations as approximations of brain states, in Polish). Studia z Kognitywistyki i Filozofii Umysłu 3: 5-28, 2009 (appeared in 2010)
    7. Duch W (2006) Od mózgu do umysłu (From brain to mind, in Polish). Charaktery 1, pp. 18-23
    8. Duch W, Adamczak R, Grąbczewski K (1999) Neural methods for analysis of psychometric data. Proc. of the International Conference EANN'99, Warsaw, 13-15.09.1999, pp. 45-50 | PDF file.

  2. Philosophy
    1. Duch W. (2018) Hylomorphism extended: dynamical forms and minds. Philosophies 2018, 3, 36.
    2. Duch W. (2012) Mind-Brain Relations, Geometric Perspective and Neurophenomenology. American Philosophical Association Newsletter 12(1), 1-7, 2012.
    3. Duch W. (2012) Neuronauki i natura ludzka (Neurosciences and human nature, in Polish). In: Nauki przyrodnicze a nowy ateizm, "Filozofia przyrody i nauk przyrodniczych" series, M. Słomka (Ed.), pp. 79-122.
      Book published after the conference, 16-17.11.2011, KUL, Lublin (conference recordings).
    4. Duch W. (2011) Free Will and the Brain: Are we automata? In: 3rd International Forum on Ethics and Humanism in European Science, Environment and Culture, Ed. M.Jaskuła, B.Buszewski, A. Sękowski and Z. Zagórski, Societas Humboldtiana Polonorum, 2011, pp. 155-170.
    5. Duch W. (2010) Czy jesteśmy automatami? Mózgi, wolna wola i odpowiedzialność (Are we automata? Brains, free will and responsibility, in Polish). Chap. 8, pp. 219-264, in: Na ścieżkach neuronauki, P. Francuz (Ed.), Lublin: Wydawnictwo KUL.
    6. Duch W. (2006) Madhyamika, nauka i natura rzeczywistości. Uwagi na marginesie książki: Matthieu Ricard i Trinh Xuan Thuan, Nieskończoność w Jednej Dłoni: Od Wielkiego Wybuchu do Oświecenia (Madhyamika, science and the nature of reality. Remarks on the book by Matthieu Ricard and Trinh Xuan Thuan, The Quantum and the Lotus: a Journey to the Frontiers Where Science and Buddhism Meet). Kognitywistyka i Media w Edukacji 1-2:293-316, 2006 (in Polish).
    7. Duch W. (2006) Debata: "Mózg - Maszyna - Świadomość - Dusza" (Debate: "Brain - Machine - Consciousness - Soul", in Polish), held at the Warsaw School of Social Psychology (SWPS), Warszawa, 18 March 2006. Jak należy rozumieć relację mózg–umysł–świadomość–dusza? (How should the brain-mind-consciousness-soul relation be understood?) Kognitywistyka i Media w Edukacji, 111-152, 2008 (appeared in 2009).

Computational physics/chemistry.

  1. Development of the direct configuration interaction method for the calculation of molecular properties including electron correlation (Duch, PhD 1980). Several papers on the configuration interaction method followed.

    Computational physics/chemistry

    1. Duch W, Diercksen GHF (1994) Size-extensivity corrections in the configuration interaction method. Journal of Chemical Physics 101:3018-3030
    2. Duch W (1991) Configuration Interaction Method: the past and future perspectives. Journal of Molecular Structure (THEOCHEM) 234: 27-49
    3. Duch W (1990) Towards flexible CI. International Journal of Quantum Chemistry S24: 683-692
    4. Diercksen GHF, Duch W, Karwowski J (1990) Method for locating errors in Hamiltonian matrices. Physical Review A 41: 3503-3510
    5. Duch W (1989) Operator algebra for the many body problem in the spin eigenfunction basis. Journal of Chemical Physics 91: 2452-2456
    6. Duch W (1986) From determinants to spin eigenfunctions - a simple algorithm. International Journal of Quantum Chemistry 30:799-807
    7. Duch W (1986) Calculation of the one-electron coupling coefficients in the configuration interaction method. Chemical Physics Letters 124:442-446
    8. Karwowski J, Duch W, Valdemoro C (1986) Matrix elements of spin-adapted reduced Hamiltonian. Physical Review A 33:2254-2261
    9. Duch W (1985) Graphical representation of Slater determinants. Journal of Physics A 18:3283-3307
    10. Duch W (1985) On the number of spin functions in the first order interaction space. Theoretica Chimica Acta (Berl.) 67:263-269
    11. Duch W (1983) Matrix elements of x^k and x^k e^ax in the harmonic oscillator basis. Journal of Physics A 16:4233-4236
    12. Duch W (1980) Large-scale N-fermion calculations. Computer Physics Communications 20:49-52
    13. Duch W, Karwowski J (1979) Coupling constants in the direct configuration interaction method. Theoretica Chimica Acta (Berl.) 51:175-188

  2. Development of symmetric and unitary group theory as applied to computational problems of quantum mechanics, and implementation of a large graph-based software system called SGGA-CI for the calculation of molecular properties, used by several groups around the world (Duch, Karwowski, Diercksen).
    Symmetric group graphical approach

    1. Meller J, Duch W (1999) SGA derivation of matrix elements between spin-adapted perturbative wavefunctions. International Journal of Quantum Chemistry 74: 123-133
    2. Duch W, Diercksen GHF (1992) Perturbation theory in multireference spaces. Physical Review A 46: 95-104
    3. Duch W, Karwowski J (1987) Multireference direct CI program based on the symmetric group graphical approach. Theoretica Chimica Acta (Berl.) 71:187-199
    4. Duch W, Karwowski J (1985) Symmetric group approach to configuration interaction methods. Computer Physics Reports 2:92-170
    5. Duch W (1985) Efficient method for computation of the representation matrices of the unitary group generators. Int J Quantum Chem 27:59-70
    6. Duch W, Karwowski J (1982) Symmetric group graphical approach to the direct configuration interaction method. Int J Quantum Chem 22:783-824
    7. Duch W, Karwowski J (1981) Symmetric group graphical approach to the configuration interaction method. Lecture Notes in Chem 22:260-271
    8. Duch W (1980) The direct configuration interaction method for general multireference expansions: symmetric group approach. Theoretica Chimica Acta (Berl.) 57:299-313

  3. Book on graphical approach and group theory ideas for general variational calculations: "Graphical Representation of Model Spaces" (Springer Verlag, 1986). This approach was the basis of 4 PhDs done at the Max Planck Institute of Astrophysics in Munich, Germany. Results were presented at various conferences in Europe, as a long tutorial at the "Workshop on Algebraic Methods in Molecular Physics" in Jerusalem (1988), and as a 3-day, 24-hour marathon lecture series on applications of symmetric group theory and graphical techniques at Tokyo University in 1994.

  4. Development of superdirect configuration interaction and multireference superdirect configuration interaction in third order, based on equations derived using computer algebra system "Maple" (Duch, Meller, 1989-1994).
    Superdirect CI

    1. Duch W (1989) Superdirect approach to the configuration interaction method. Chemical Physics Letters 162: 56-60
    2. Duch W, Meller J (1994) On multireference superdirect configuration interaction in third order. International Journal of Quantum Chemistry 50: 243-271
  5. Applications of computational quantum mechanics to the calculation of complex molecular properties. First calculations of the absorption and magnetic circular dichroism spectrum of a Jahn-Teller distorted excited state of cyclopropane (Duch, Segal, 1983) and other applications.
    Calculation of complex properties

    1. Duch W, Segal GA (1983) Theoretical calculation of the absorption and magnetic circular dichroism spectrum of a Jahn-Teller distorted excited state: the 1E' excited state of cyclopropane. J. Chemical Physics 79:2951-2963
    2. Diercksen GHF, Duch W, Karwowski J (1990) CI calculations on the Rydberg spectrum of H3 molecule. Chemical Physics Letters 168: 69-74

Foundations of physics, quantum informatics.

Papers on quantum entanglement, written before this field developed into quantum informatics (Duch, 1988-2003).

Foundations of physics

  1. Duch W (1988) Schrödinger's thoughts on perfect knowledge.
    In: The Concept of Probability, Ed. Bitsakis EI and Nicolaides CA (Kluwer Academic Publishers), pp. 5-14
    A surprising result of this paper, called the "quantum Mach principle": the energy of a localized system (e.g. a single atom), calculated with the total wavefunction of the world, lies entirely in the interactions with the rest of the world, although the distances are infinitely large!
  2. Duch W, Aerts D (1986) Microphysical reality and quantum formalism. Physics Today 6:11-13
  3. Duch W (1988) Violation of Bell's inequalities in interference experiments. In: Open Problems in Physics, Eds. Kostro L, Posiewnik A, Pykacz J and Zukowski M, (World Scientific, Singapore), pp. 483-486; comment by Zukowski and Pykacz.
  4. Duch W, Complementarity, Superluminal Telegraph and the Einstein-Podolsky-Rosen Paradox (unpublished manuscript). I proposed to measure correlations between pairs of particles in a double Mach-Zehnder experiment. I sent it to two journals and then gave up; referees found it "seriously flawed" and "completely wrong". Five years later a Berkeley group performed an experiment based on the same idea, and the excitement about entangled systems continues to this day; see my letter to Physics Today here.
  5. Duch W, Synchronicity, Mind and Matter. Neuroquantology 1 (2003) 36-57, reprinted from The International Journal of Transpersonal Studies 21 (2002) 155-170.

  6. Participation in the "Academy of Consciousness" in Princeton (NJ, USA) resulted in the speculative report below. I have not continued this line of research, as making real progress here seems unlikely.
  7. Duch W (1994) Synchronicity and the Unified View of Mind and Matter
    UMK-KMK-TR 2/94 report

Our Books

  1. Mikołajewski D, Duch W. (2017) Pień mózgu. Przybliżenie aspektów medycznych dzięki modelowaniu biocybernetycznemu. WN UMK 2017, hard cover ISBN 978-83-231-3942-3; soft: ISBN 978-83-231-3941-6

  2. Tadeusiewicz R, Korbicz J, Rutkowski L, Duch W (Eds), Sieci neuronowe w inżynierii biomedycznej

    1. Jakie zadania mogą realizować sieci neuronowe w zastosowaniach biomedycznych

    Część I. Sieci neuronowe jako narzędzia przetwarzania sygnałów biomedycznych
    2. Przygotowanie danych i planowanie eksperymentu
    3. Wykorzystanie sieci neuronowych do przetwarzania sygnałów bioelektrycznych na przykładzie EKG
    4. Przetwarzanie sygnałów ABR przy użyciu sieci neuronowych w zadaniach diagnostyki słuchu
    5. Sieci neuronowe w przetwarzaniu obrazów medycznych
    6. Wykorzystanie sieci neuronowych w tomografii komputerowej
    7. Neuronowo-rozmyte przetwarzanie obrazów cytologicznych w diagnostyce nowotworu piersi
    8. Ocena jakości ziaren i nasion przy pomocy sieci neuronowych
    9. Wykorzystanie wiedzy eksperckiej w sieci RBF na przykładzie zadania segmentacji obrazów medycznych

    Część II. Zastosowanie sieci neuronowych do analizy danych medycznych oraz do wspomagania procesu diagnostycznego
    10. Sieci neuronowe w ocenie prawdopodobieństwa istnienia raka jajnika u kobiet z guzami przydatkowymi
    11. Wykorzystanie sieci neuronowych i metody wektorów nośnych SVM w procesie rozpoznawania aktywności ruchowej pacjentów dotkniętych chorobą Parkinsona
    12. Zastosowanie sieci neuronowych w analizie wyników badania spirometrycznego
    13. Analiza sygnałów EEG za pomocą sztucznych sieci neuronowych
    14. Wykorzystanie wielowarstwowych sieci perceptronowych dla wspomagania obrazowej diagnostyki medycznej
    15. Sieci neuronowe w bioinformatyce

    Część III. Wspomaganie osób niepełnosprawnych z wykorzystaniem sieci neuronowych oraz ich wykorzystanie w protetyce i terapii
    16. Neuronowe systemy sterowania urządzeniami wspomagającymi samoobsługę osób niepełnosprawnych wykorzystujące ocenę sygnałów biologicznych
    17. Rozpoznawanie ruchów i gestów wykonywanych ustami w obrazie wizyjnym z użyciem sieci neuronowych
    18. Sieci neuronowe oparte na strukturze Hopfielda w układach wspomagania osób niedowidzących
    19. Perspektywy zastosowań impulsowej sieci neuronowej w systemach maszynowego sprzęgania układu nerwowego
    20. Neuronowe modele przewidujące własności biomateriałów

    Część IV. Sieci neuronowe zastosowane do modelowania choroby, terapii oraz do prognozowania wyników leczenia
    21. Zastosowania sieci neuronowych w analizie przeżycia
    22. Samooptymalizujące sieci neuronowe jako narzędzie wspomagające w długoterminowym przewidywaniu ryzyka wystąpienia ponownego zawału serca
    23. Zastosowanie sieci neuronowych w analizie rokowniczej u chorych leczonych przeszczepieniem komórek krwiotwórczych
    24. Klasyfikacja i analiza bólów kręgosłupa przy pomocy sztucznych sieci neuronowych
    25. Ekstrakcja reguł z sieci neuronowych
    26. Modelowanie pnia mózgu
    27. Sieci neuronowe w modelowaniu chorób psychicznych
    Dodatek - Kompendium sieci neuronowych


    Wyd. Exit, Warszawa 2013, 775 pp. Russian translation available.

  3. Jankowski N, Duch W, Grąbczewski K. (Eds) (2011) Meta-learning in Computational Intelligence.

    The Computational Intelligence (CI) community has developed hundreds of algorithms for intelligent data analysis, but many hard problems in computer vision, signal processing, or text and multimedia understanding, problems that require deep learning techniques, remain open. Modern data mining packages contain numerous modules for data acquisition, pre-processing, feature selection and construction, instance selection, classification, association and approximation methods, optimization techniques, pattern discovery, clustering, visualization and post-processing. A large data mining package allows for billions of ways in which these modules can be combined. No human expert can claim to explore and understand all possibilities in the knowledge discovery process.
    This is where algorithms that learn how to learn come to the rescue. Operating in the space of all available data transformations and optimization techniques, these algorithms use meta-knowledge about learning processes, automatically extracted from the experience of solving diverse problems. Inferences about transformations useful in different contexts help to construct learning algorithms that can uncover various aspects of knowledge hidden in the data. Meta-learning shifts the focus of the whole CI field from individual learning algorithms to the higher level of learning how to learn.
    This book defines and reveals new theoretical and practical trends in meta-learning, inspiring readers to further research in this exciting field.

    Studies in Computational Intelligence, Vol. 358, 1st Edition, pp. X + 362. 127 illus, 76 in color, Springer 2011.
  4. Duch W, Mandziuk J (Eds.) (2007) Challenges for Computational Intelligence.
    Springer "Studies in Computational Intelligence" Series, Vol. 63, June 2007, 488 pp.
    Review in IEEE Transactions on Neural Networks Vol. 20(2)(2009) 542-543.
  5. Duch W, Korbicz J, Rutkowski L, Tadeusiewicz R (Eds), Biocybernetyka i Inżynieria Biomedyczna 2000. Tom 6: Sieci neuronowe.
    Akademicka Oficyna Wydawnicza EXIT, Warszawa 2000, 850 pp, ISBN 83-87674-18-4

  6. Duch W, Kucharski T, Gomuła J, Adamczak R, Metody uczenia maszynowego w analizie danych psychometrycznych. Zastosowanie do wielowymiarowego kwestionariusza osobowości MMPI-WISKAD.
    (Toruń, March 1999; 650 pp., ISBN 83-231-0986-9)
  7. Duch W (1997) Fascynujący świat programów komputerowych. (Nakom, Poznań, November 1997; 456 pp, ISBN 83-85060-92-8)
  8. Duch W (1997) Fascynujący świat komputerów. (Nakom, Poznań, 20 May 1997; 444 pp, ISBN 83-86969-09-1)
  9. Duch W (1986) Graphical Representation of Model Spaces; see also Google Books.
    Springer Verlag, Berlin Lecture Notes in Chem Vol. 42, 190 pp.
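The meta-learning idea summarized in the abstract of the "Meta-learning in Computational Intelligence" volume above (searching the space of combinations of data-transformation and learning modules, rather than tuning a single fixed algorithm) can be illustrated with a minimal toy sketch. All module names and the leave-one-out scoring below are hypothetical illustrations, not code from the book:

```python
# Toy meta-learning sketch: enumerate pipelines built from small modules
# and keep the one that scores best. Real meta-learning systems would
# also exploit meta-knowledge to prune this search.
import itertools
import statistics

def identity(X):
    return X

def standardize(X):
    # column-wise z-score; a zero std is replaced by 1.0 to avoid division by zero
    cols = list(zip(*X))
    mu = [statistics.mean(c) for c in cols]
    sd = [statistics.pstdev(c) or 1.0 for c in cols]
    return [[(v - m) / s for v, m, s in zip(row, mu, sd)] for row in X]

def _dist(a, b):
    return sum((u - v) ** 2 for u, v in zip(a, b))

def nearest_centroid(train_X, train_y, x):
    # predict the class whose mean vector is closest to x
    groups = {}
    for xi, yi in zip(train_X, train_y):
        groups.setdefault(yi, []).append(xi)
    means = {label: [statistics.mean(c) for c in zip(*pts)]
             for label, pts in groups.items()}
    return min(means, key=lambda label: _dist(means[label], x))

def one_nn(train_X, train_y, x):
    # predict the label of the single nearest training point
    return min(zip(train_X, train_y), key=lambda p: _dist(p[0], x))[1]

def evaluate(pipeline, X, y):
    # leave-one-out accuracy of a (transform, classifier) pipeline
    transform, classify = pipeline
    Xt = transform(X)
    hits = 0
    for i in range(len(Xt)):
        tr_X = Xt[:i] + Xt[i + 1:]
        tr_y = y[:i] + y[i + 1:]
        hits += classify(tr_X, tr_y, Xt[i]) == y[i]
    return hits / len(Xt)

def meta_search(X, y):
    # exhaustive search over all transform x classifier combinations
    transforms = [identity, standardize]
    classifiers = [nearest_centroid, one_nn]
    scored = [((t, c), evaluate((t, c), X, y))
              for t, c in itertools.product(transforms, classifiers)]
    return max(scored, key=lambda s: s[1])
```

On a toy two-cluster dataset this search evaluates all four pipelines and returns the best-scoring one; in a real meta-learning system both the module space and the meta-knowledge guiding the search would be far richer.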

Books - Conferences Proceedings

  1. Stefan Wermter, Cornelius Weber, Włodzisław Duch, Timo Honkela, Petia Koprinkova-Hristova, Sven Magg, Günther Palm, Alessandro E.P. Villa (Eds.), Artificial Neural Networks and Machine Learning -- ICANN 2014

    The book constitutes the proceedings of the 24th International Conference on Artificial Neural Networks, ICANN 2014, held in Hamburg, Germany, in September 2014. The 107 papers included in the proceedings were carefully reviewed and selected from 173 submissions. The focus of the papers is on the following topics: recurrent networks; competitive learning and self-organisation; clustering and classification; trees and graphs; human-machine interaction; deep networks; theory; reinforcement learning and action; vision; supervised learning; dynamical models and time series; neuroscience; and applications.

    Springer Lecture Notes in Computer Science, Vol. 8681, XXVI, 852 p. 338 illus.
  2. Villa, A.E.P, Duch, W, Érdi, P, Masulli, F, Palm, G, Artificial Neural Networks and Machine Learning ICANN 2012, Part II
    Springer Lecture Notes in Computer Science, Vol. 7553, pp. XXVII, 587 p. 172 illus.
  3. Villa, A.E.P, Duch, W, Érdi, P, Masulli, F, Palm, G, Artificial Neural Networks and Machine Learning ICANN 2012, Part I
    Springer Lecture Notes in Computer Science, Vol. 7552, pp. XXVII, 739 p. 318 illus.
  4. Honkela T, Duch W, Girolami M, Kaski S, Artificial Neural Networks and Machine Learning Research. ICANN 2011, Part I
    Springer Lecture Notes in Computer Science, Vol. 6791, pp. 387
  5. Honkela T, Duch W, Girolami M, Kaski S, Artificial Neural Networks and Machine Learning Research. ICANN 2011, Part II
    Springer Lecture Notes in Computer Science, Vol. 6792, pp. 470
  6. Diamantaras K, Duch W, and Iliadis L.S. (Eds.), 20th International Conference on Artificial Neural Networks (ICANN 2010), Part I,
    Springer Lecture Notes in Computer Science, Vol. 6352, 2010, XXXI, 587 p., ISBN 978-3-642-15818-6 (print), 978-3-642-15819-3 (electronic).

  7. Diamantaras K, Duch W, and Iliadis L.S. (Eds.), 20th International Conference on Artificial Neural Networks (ICANN 2010), Part II,
    Springer Lecture Notes in Computer Science, Vol. 6353, 2010, XVI, 544 p., ISBN 978-3-642-15821-6 (print).

  8. Diamantaras K, Duch W, and Iliadis L.S. (Eds.), 20th International Conference on Artificial Neural Networks (ICANN 2010), Part III,
    Springer Lecture Notes in Computer Science, Vol. 6354, 2010, XVII, 575 p., ISBN 978-3-642-15824-7 (print), 978-3-642-15825-4 (electronic).

  9. Duch Wlodzislaw, Zhao Xiande, Gao Jinwu,
    Proceedings of the Ninth International Conference on Information and Management Sciences, Urumchi, China, August 11-20, 2010. Series of Information and Management Sciences, California Polytechnic State University, ISSN 1539-2023
  10. Wang H-F, Neace M.B, Zhu Y, Duch W,
    Proceedings of the Eighth International Conference on Information and Management Sciences, Kunming-Banna, China, July 20-28, 2009. Series of Information and Management Sciences, California Polytechnic State University, ISSN 1539-2023
  11. Marques de Sá, J, Alexandre L.A, Duch W, and Mandic D.P. (Eds.), Proceedings of the 17th International Conference on Artificial Neural Networks (ICANN 2007), Part I,
    Springer Lecture Notes in Computer Science, Vol. 4668, 2007, XXXI, 978 pp. ISBN 978-3-540-74689-8.

  12. Marques de Sá, J, Alexandre L.A, Duch W, and Mandic D.P. (Eds.), Proceedings of the 17th International Conference on Artificial Neural Networks (ICANN 2007), Part II,
    Springer Lecture Notes in Computer Science, Vol. 4669, 2007, XXXI, 990 pp. ISBN 978-3-540-74693-5.

  13. Kollias S, Stafylopatis A, Duch W, Oja E. (Eds.), Proceedings of the 16th International Conference on Artificial Neural Networks. Athens, Greece, September 10-14, 2006, Proceedings, Part I.
    Springer Lecture Notes in Computer Science, Vol 4131, 2006, XXXIV, 1008 pp. Softcover, ISBN: 3-540-38625-4.
  14. Kollias S, Stafylopatis A, Duch W, Oja E. (Eds.), Proceedings of the 16th International Conference on Artificial Neural Networks. Athens, Greece, September 10-14, 2006, Proceedings, Part II.
    Springer Lecture Notes in Computer Science, Vol 4132, 2006, XXXIV, 1028 pp. Softcover, ISBN: 3-540-38871-0.
  15. Duch W, Kacprzyk J, Oja E, Zadrozny S (Eds), Artificial Neural Networks: Biological Inspirations.
    Proceedings of the 15th International Conference on Artificial Neural Networks (ICANN 2005), Warsaw, Poland, September 11-15, 2005, Part I.
    Springer Lecture Notes in Computer Science, Vol 3696, 2005, XXXI, 703 pp. Softcover, ISBN: 3-540-28752-3.

  16. Duch W, Kacprzyk J, Oja E, Zadrozny S (Eds), Artificial Neural Networks: Formal Models and Their Applications.
    Proceedings of the 15th International Conference on Artificial Neural Networks (ICANN 2005), Warsaw, Poland, September 11-15, 2005, Part II.
    Springer Lecture Notes in Computer Science, Vol. 3697, 2005, XXXII, 1045 pp. Softcover, ISBN: 3-540-28755-8.
  17. Duch W, Engineering Applications of Neural Networks.
    Proceedings of the 5th International Conference EANN'99, 13-15 September 1999, Warsaw, Poland
    Wyd. Adam Marszałek, Toruń, 313 pp, ISBN 83-7174-512-512-5

Papers submitted and in print

  1. Bonna, Kamil; Finc, Karolina; Zimmermann, Maria; Bola, Łukasz; Szul, Maciej; Mostowski, Piotr; Rutkowski, Paweł; Duch, Włodzisław; Marchewka, Artur; Jednoróg, Katarzyna; Szwed, Marcin.
    Early deafness leads to re-shaping of global functional connectivity beyond the auditory cortex. Scientific Reports (sub. 12/2018)
  2. Mikołajewski D, Duch W, Brain Stem – From General View To Computational Model Based On Switchboard Rules Of Operation.
    Neurologia i Neurochirurgia Polska (in revision)
  3. Duch W.
    Meta-analizy w badaniach nad nabywaniem języka. Psychologia Rozwojowa (12 pp, sub. 2018).

  4. Gut, M. et al. Dyscalculia. Frontiers in Psychology, special issue "On the Development of Space-Number Relations: Linguistic and Cognitive Determinants, Influences, and Associations" (in revision, 2018).

  5. Ratajczak E, Dreszer J, Duch W. (2018). Changes of heart. The influence of HRV-biofeedback treatment on various conditions – a detailed review and experimental guide. In revision, Applied Psychophysiology and Biofeedback.

  6. Ratajczak E, Duch W. (2015). Global warning: Psychology and psychopathology of modern changing environment. (in submission).

    In print

  7. Boruta, M., Dreszer, J., Pawlaczyk, N., Kmiecik, M., Grzybowska, M., Ignaczewska, A., Duch, W. (2018). Gestures reflected – tracing gestural development from a child’s EEG signal. In: Cuskley, C., Flaherty, M., Little, H., McCrohon, L., Ravignani, A. & Verhoef, T. (Eds.): The Evolution of Language: Proceedings of the 12th International Conference (EVOLANGXII). doi:10.12775/3991-1.010
  8. Jacek Matulewski, Bibianna Bałaj, Ewelina Marek, Łukasz Piasecki, Włodzisław Duch. MovEye: Gaze Control of Video Playback.
    ACM Symposium on Eye Tracking Research and Applications (ETRA 2018), Symposium on Communication by Gaze Interaction (COGAIN), Warsaw 14-17.06.2018.

Conference posters and abstracts

    Posters presented at conferences in 2018

  1. Miriam Kosik, Karolina Finc, Kamil Bonna, Simone Kühn, Włodzisław Duch. Brain Functional Network Alterations Due to Increasing Demands of Visuospatial and Auditory Working Memory Task.
    Network Neuroscience, Paris 11.06.2018, P5

  2. Karolina Finc, Miriam Kosik, Kamil Bonna, Włodzisław Duch and Simone Kühn. Task-related functional network reconfiguration following 6-week working memory training.
    Network Neuroscience, Paris 11.06.2018, P15

  3. Kamil Bonna, Karolina Finc, Miriam Kosik, Włodzisław Duch, Simone Kühn. Short and Long-Term Changes of Functional Network Segregation Over the Course of Working Memory Training.
    Network Neuroscience, Paris 11.06.2018, P19

  4. Komorowski M, Wojciechowski J, Nikadon J, Piotrowski T, Dreszer J, Duch W, EEG Spectral Fingerprints as a method for classifying different regions of the human brain – initial results. 11th Conference on Electrophysiological Techniques in Bioelectricity Research: from Ion Channels to Neural Networks, pp. 21. Warsaw, Nencki Institute of Experimental Biology of the Polish Academy of Sciences, 25-26.05.2018.

  5. Julia Rodkiewicz, Małgorzata Gut, Łukasz Goraczewski, Jacek Matulewski, Karolina Finc, Katarzyna Mańkowska, Dominika Ciechalska, Aleksandra Mielewczyk, Natalia Witkowska, Natalia Sobolewska, Jakub Słupczewski, Marta Szymańska and Włodzisław Duch. The effect of cognitive and cognitive-motor training with the use of mathematical computer game “Kalkulilo” on the number line estimation and number magnitude comparison.
    Many studies confirm the positive effect of cognitive training using computer games and modern technology in education on the level of mathematical skills. The aim of the study was to examine the effect of cognitive training with the computer math game “Kalkulilo” on the development of such skills. Sixty-eight children (aged 7-10) participated in the study. They were divided into 3 groups: the 1st group trained with the “Kalkulilo” game on a laptop, the 2nd group trained with “Kalkulilo” and Kinect sensor control of movement, and the 3rd group was a passive control. Training took 5 h and was divided into 10 sessions. Before and after training we measured the level of mathematical skills of participants using a computer test. The results indicate an effect of training on the development of spatial representations of numbers, as it improved the accuracy of number line estimation. This effect is particularly pronounced in the cognitive-motor training group (with Kinect), which further suggests that this type of motor-cognitive training is more effective than standard training using only a computer. It can be concluded that mathematical game training may therefore be a valuable tool not only in math education but also in overcoming the cognitive deficits observed in dyscalculia.

    Neuronus 2018, IBRO Science Forum, Kraków 20-22.04.2018, pp. 129-130.

  6. Karolina Finc, Miriam Kosik, Kamil Bonna, Włodzisław Duch, Simone Kühn. Task-based functional network changes following 6-week working memory training.
    Neuronus 2018, IBRO Science Forum, Kraków 20-22.04.2018, pp. 127.

  7. Małgorzata Gut, Łukasz Goraczewski, Karolina Finc, Jacek Matulewski, Anna Walerzak-Więckowska, Włodzisław Duch. Number line estimation strategies used by children with dyscalculia and healthy controls.
    Neuronus 2018, IBRO Science Forum, Kraków 20-22.04.2018, pp. 129.

  8. Julia Rodkiewicz, Małgorzata Gut, Jacek Matulewski, Łukasz Goraczewski, Karolina Finc, Katarzyna Mańkowska, Dominika Ciechalska, Aleksandra Mielewczyk, Natalia Witkowska, Natalia Sobolewska, Jakub Słupczewski, Marta Szymańska, Włodzisław Duch. The effect of cognitive and cognitive-motor training with the use of mathematical computer game “Kalkulilo” on the number line estimation and number magnitude comparison.
    Neuronus 2018, IBRO Science Forum, Kraków 20-22.04.2018, pp. 129.

  9. Miriam Kosik, Karolina Finc, Kamil Bonna, Włodzisław Duch, Simone Kühn. Exploring working memory modalities - functional network alterations due to increasing demands of visuospatial and auditory working memory tasks.
    Neuronus 2018, IBRO Science Forum, Kraków 20-22.04.2018, pp. 131.

  10. Michał Komorowski, J Wojciechowski, J Nikadon, T Piotrowski, J Dreszer, W Duch, Recognizing cortical and deep sources of human brain activity from resting-state EEG spectral fingerprints.
    Neuronus 2018, IBRO Science Forum, Kraków 20-22.04.2018, pp. 134.

  11. Michał Komorowski, J Wojciechowski, J Nikadon, T Piotrowski, J Dreszer, W Duch, Klasyfikacja struktur korowych i podkorowych na podstawie sygnału EEG za pomocą metody Spectral Fingerprints, p. 11
    VI Neuromania conference, Toruń 2-4.06.2018

  12. Marcin Hajnowski, Mateusz Stawicki, Ewa Ratajczak, Michał Meina, Włodzisław Duch. Trening idealny, p. 48
    VI Neuromania conference, Toruń 2-4.06.2018

Posters presented at conferences in 2017

  1. Finc, K, Bonna, K, Lewandowska, M, Wolak, T, Nikadon, J, Dreszer, J, Duch, W, Kühn, S. (2017) Default Mode Network Role in Global Workspace Formation During Increasing Cognitive Demands. Keystone Symposia: Connectomics, Santa Fe (USA), invited talk and poster.

  2. Bonna, K, Finc, K., Duch, W. (2017) Discovering generative model of human connectome by symbolic regression. Keystone Symposia: Connectomics, Santa Fe (USA)
  3. Ratajczak E, Hajnowski M, Fojutowska J, Stawicki M, Szczęsny P, Dreszer-Drogorób J, Duch W.
    Achievement or approach? Is psychophysiological stress upon divergent thinking related to task performance or to trait anxiety? (Aspects of Neuroscience 2017)

  4. Gut M, Goraczewski Ł, Matulewski J, Finc K, Mańkowska K, Babiuch K, Ciechalska D, Mielewczyk A, Poczopko K, Witkowska N, Duch W. (2017)
    The cognitive-motor training in development of a mental number line with the use of the mathematical computer game “Kalkulilo”. European Congress of Psychology, Amsterdam 2017.

  5. Joanna Dołżycka, Agnieszka Ignaczewska, Joanna Dreszer, Bibianna Bałaj, Jacek Matulewski, Rafał Linowiecki, Bartosz Rafał, Włodzisław Duch. (2017)
    Użycie eye-trackera w badaniach nad słuchem fonematycznym niemowląt między 8. a 12. miesiącem życia. Neuromania, Toruń, 27-28.05

  6. Joanna Dołżycka, Agnieszka Ignaczewska, Joanna Dreszer, Bibianna Bałaj, Włodzisław Duch. (2017)
    Różnice widoczne w słuchu fonematycznym między osobami dorosłymi a niemowlętami do 12. miesiąca życia. Neuromania, Toruń, 27-28.05

  7. Dygasiewicz M., Kępiński D., Joachimiak M., Duch W. (2017)
    System wirtualnej rzeczywistości wspomagający rehabilitację osób wybudzonych ze śpiączki. Neuromania, Toruń, 27-28.05

Posters presented at conferences in 2016

Posters presented at the Neuromania conference, Toruń 28-28.05.2016

  1. Fojutowska J, Szczęsny P, Ratajczak E, Wojciechowski J, Szczypiński J, Dreszer J, Duch W.
    Aktywność przedczołowa w procesach twórczych – badanie techniką EEG.
  2. Mańkowska K, M. Gut, J. Matulewski, Ł. Goraczewski, K. Finc, M. Kmiecik, N. Witkowska D. Sebastian, A. Mielewczyk, E. Sobiechowski, K. Poczopko, K. Babiuch, J. Majewski, P. Cholewa, E. Bendlin, W. Duch.
    Kształtowanie mentalnej reprezentacji osi liczbowej pod wpływem edukacyjno-terapeutycznej gry komputerowej „Kalkulilo”.
  3. Mielewczyk A, Gut M, Matulewski J, Finc K, Goraczewski Ł, Kmiecik M, Mańkowska K, D. Sebastian, Sobiechowski E, Witkowska N, Duch W.
    Operowanie różnymi formatami liczb i zależności numeryczno-przestrzenne u dzieci w wieku wczesnoszkolnym.
  4. Rafał B, Dreszer J, Bałaj B, Matulewski J, Ignaczewska A, Duch W.
    Wzrokowa reakcja antycypacyjna u niemowląt.
  5. Ratajczak E, Wojciechowski J, Fojutowska J, Szczęsny P, Szczypiński J, Bałaj B, Dreszer J, Duch W.
    Twórczy intelekt? Neuroobrazowanie zależności między kreatywnością a inteligencją płynną techniką EEG.
  6. Stawicki M, Szreder K, Ratajczak E, Dreszer J, Duch W.
    Odzwierciedlenie poziomu odczuwanego stresu oraz lęku w zmienności rytmu serca.
  7. Stawicki M, Ratajczak E, Dreszer J, Duch W.
    Zmienności rytmu serca a poziomu odczuwanego stresu.
  8. Wojciechowski J, Jan Szczypiński, Ewa Ratajczak, Julita Fojutowska, Joanna Dreszer, Bibianna Bałaj, Duch W.
    Kontrola zachowania a aktywność spoczynkowa mózgu.

    Posters presented at the Neuronus conference, Kraków 22-24.04.2016

  9. Fojutowska J, Ratajczak E, Szczęsny P, Wojciechowski J, Szczypiński J, Bałaj B, Dreszer J, Duch W.
    Frontal complexity: higher variability correlates with lower creativity and lower HRV.
  10. Kmiecik M, Goraczewski, Ł., Matulewski, J., Gut, M., Finc, K., Ignaczewska, A., Stępińska, J., Bałaj, B., Dreszer J., Majewski, J., Bendlin, E., Cholewa, P., Duch, W.
    The cognitive training with the game “Kalkulilo” and mathematical abilities in children – the preliminary results of a pilot study.
  11. Szczęsny P, Ratajczak E, Fojutowska J, Wojciechowski J, Szczypiński J, Bałaj B, Dreszer J, Duch W.
    Heart Rate Variability dynamics as a psychophysiological marker of temperament and anxiety.
  12. Szczypiński J., Wojciechowski J., Fojutowska J., Szczęsny P., Dreszer J., Ratajczak E., Duch W.
    Behavioral control linked to resting-state EEG complexity.

    Remaining posters 2016

  13. Fojutowska J., Ratajczak E., Szczęsny P., Dreszer J., Duch W.
    Divergent thinking and Heart Rate Variability Biofeedback (Aspects of Neuroscience, Warsaw, 25-27.11.2016)
  14. Ratajczak E., Szczęsny P., Fojutowska J., Dreszer-Drogorób J., Duch W.
    HRV-biofeedback: the effects of session count on psychophysiological functioning – preliminary results. (Aspects of Neuroscience 2016)
  15. Szczypiński J, Wojciechowski J, Ratajczak E, Fojutowska J, Bałaj B, Dreszer-Drogorób J, Duch W.
    Is brain neurodynamics tied to self-control? (Aspects of Neuroscience 2016)
  16. Ignaczewska A, Dreszer J, Bałaj B, Matulewski J, Linowiecki R, Duch W.
    Badanie słuchu fonematycznego niemowląt – zastosowanie procedury z wykorzystaniem eyetrackera (IV Polska Konferencja Eyetrackingowa, 7-8.04.2016).
  17. Ratajczak E, Szczypiński J, Wojciechowski J, Fojutowska J, Szczęsny P, Bałaj B, Dreszer J, Duch W.
    Creative intellect? An EEG study of creativity and fluid intelligence (Krakowska Konferencja Kognitywistyczna 2016)
  18. Ratajczak E, Szczypiński J, Wojciechowski J, Fojutowska J, Szczęsny P, Bałaj B, Dreszer J, Duch W.
    Między twórczością a inteligencją płynną – neuroobrazowanie zależności techniką EEG (Zjazd Polskiego Towarzystwa Kognitywistycznego, Białystok 2016)

Posters presented at conferences in 2015

  1. Dreszer J, Bałaj B, Matulewski J, Lewandowska M, Goraczewski Ł, Duch W. XVIII European Conference on Eye Movements (ECEM 2015), Vienna, 16-21.08.2015.
    The Gaze Controlled Game as a Cognitive Training for Children with Math Difficulties, p. 260.
    Poster: A gaze-contingent paradigm as a basis for interactive training of phonetic contrast discrimination: a pilot study.
  2. Stępińska, J., Goraczewski, Ł., Matulewski, J., Gut, M., Finc, K., Bałaj, B., Dreszer J., Majewski, J., Bendlin, E., Cholewa, P., Ignaczewska, A., Szczypiński, J., Kmiecik, M. i Duch, W. (2015).
    The computer game “Kalkulilo” as a cognitive training method for children with developmental dyscalculia and its application value in mathematical education. Poster presented at the Vth International Conference Aspects of Neuroscience, Warszawa, November 2015.
  3. Ignaczewska A, Dreszer J, Bałaj B, Matulewski J, Linowiecki R, Rafał B, Duch W.
    Using an eye-tracker in the study of phonemic hearing in infants - a comparison of research methods. Vth International Conference Aspects of Neuroscience, Warszawa, 11/2015, poster.
  4. Kmiecik, M., Goraczewski, Ł., Matulewski, J., Gut, M., Finc, K., Ignaczewska, A., Bałaj, B., Dreszer J., Szczypiński, J., Stępińska, J., Majewski, J., Bendlin E., Cholewa P, Duch, W. (2015).
    The cognitive training with the game “Kalkulilo” and mathematical abilities in children – the preliminary results of a pilot study. Vth International Conference Aspects of Neuroscience, Warszawa, 11/2015, poster.
  5. Wojciechowski J, Czarnecka M, Dołżycka J, Szczypiński J, Bałaj B, Dreszer J, Duch W.
    Differentiation of French phonemes that are not present in the Polish language by monolingual Polish individuals - an EEG study. Vth International Conference Aspects of Neuroscience, Warszawa, 11/2015 (presentation).
  6. Ratajczak E, Szczęsny P, Wojciechowski J, Szczypiński J, Nikadon J, Meina M, Bałaj B, Dreszer-Drogorób J, Duch W.
    In the Heart of Creativity. Divergent thinking and HRV in a Computerized Alternative Uses Task. An EEG-ECG pilot study. Vth International Conference Aspects of Neuroscience, Warszawa, 11/2015 (poster).
  7. Ratajczak E, Wojciechowski J, Szczypiński J, Nikadon J, Bałaj B, Dreszer-Drogorób J, Duch W.
    Creative thinking in computerized Alternative Uses Task – an EEG pilot study. Neuronus, Kraków 05/2015 (poster).
  8. Ratajczak E, Szczęsny P, Wojciechowski J, Szczypiński J, Nikadon J, Bałaj B, Dreszer-Drogorób J, Duch W.
    Neuronalne korelaty twórczości (Neural correlates of creativity). Neuromania, Toruń, 5/2015 (poster).

    Neuromania, Toruń Cognitive Science Students' Conference, 25-26.05.2013; posters:

  9. Beata Janicka, Patrycja Dzianok, Joanna Dreszer-Drogorób, Monika Lewandowska, Rafał Milner, Włodzisław Duch, Wpływ masażu wibracyjnego na spontaniczną aktywność bioelektryczną mózgu – projekt badań pilotażowych (The effect of vibration massage on spontaneous bioelectrical brain activity – a pilot study design)
  10. Karol Sontowski, Bibianna Bałaj, Joanna Dreszer-Drogorób, Monika Lewandowska, Włodzisław Duch, Niemowlęta wiedzą jak mają brzmieć słowa: Uniwersalia językowe (Infants know how words should sound: language universals)
  11. Dorota Sobiepanek, Bibianna Bałaj, Joanna Dreszer-Drogorób, Włodzisław Duch, Badania niemowląt z użyciem eye-trackera: warunki skutecznej kalibracji (Eye-tracker studies of infants: conditions for effective calibration)
  12. Marta Milewska, Bibianna Bałaj, Joanna Dreszer-Drogorób, Włodzisław Duch, Uwaga wzrokowa u niemowląt w wieku od 8 do 12 miesięcy (Visual attention in infants aged 8 to 12 months)
  13. Romana Owedyk, Joanna Dreszer-Drogorób, Monika Lewandowska, Włodzisław Duch, Różnicowanie fonemów, uwaga słuchowa i temperament u niemowląt – projekt badań (Phoneme discrimination, auditory attention and temperament in infants – a study design)

Popular articles on artificial intelligence and cognitive science (most in Polish).

  1. Sztuczna inteligencja (Artificial intelligence)
    1. Duch W. (2019), Czy już rozumiemy świadomość? (online version; a slightly shorter version in print in "Psychologia Dziś").
    2. Duch W. (2017) Czy neuronauki pomogą nam rozwinąć pełny potencjał? Głos Uczelni 5/2017, pp. 12-16.
    3. Duch W. (2011) 55 lat sztucznej inteligencji. "Niezbędnik inteligenta", Polityka, July 2011.
    4. Duch W. (2008) Nieludzka kreatywność.
      Wiedza i Życie, Special issue 2/2008, pp. 71-75
    5. Duch W. (2007) Czy komputery będą kiedyś świadome?
      Popular article, partially published in Newsweek, 3.02.2008 as "Komputer a myśli".
    6. Duch W (2000) Debata: Sztuczny mózg, sztuczna inteligencja. Kognitywistyka i Media w Edukacji 3(1) (2000) 95-98.
    7. Duch W (1997) Sztuczna Inteligencja. Computerworld 24 (1997) 31
    8. Duch W (1995) Komputery 5 generacji. ComputerWorld, 3/1995, 16.01.1995
    9. Duch W (1995) Życie wewnętrzne komputerów (cont.). Komputer w Edukacji, 3-4: 19-27
    10. Duch W (1994) Życie wewnętrzne komputerów. Toruńskie Studia Dydaktyczne, vol. III(6), pp. 191-206
    11. Duch W (1992) Myślące maszyny? Nowa Europa 11/4/1992
    12. Duch W (1991) Neurokomputery. Problemy, 9/91
    13. Duch W (1984) Sztuczna Inteligencja. Problemy 6/1984
    14. Duch W (1983) O sztucznej inteligencji. Wczoraj, dziś, jutro. Przekrój N. 1974, 10/4/1983.
    15. Duch W (1983) Kiedy komputer wygra z Arcymistrzem? Przekrój N. 1973, 3/4/1983
    16. Duch W (1983) Czy Komputery Myślą? Przekrój N. 1971, 20/3/1983

  2. Nauki kognitywne i edukacja (Cognitive science and education)
    1. Duch W. (2018) Fizyka, Informatyka i Kognitywistyka. Głos Uczelni 1/2018.
    2. Duch W. (2017) Czy neuronauki pomogą nam rozwinąć pełny potencjał? Głos Uczelni 5/2017, pp. 12-16.
    3. Duch W. (2016). Muzyka a wyobraźnia: Mózgowe preludium (Cognitive Science and Music). Słyszę 4(150), pp. 50-51, 2016. Presented at International Scientific Conference Hearing Implants and Music, World Hearing Center, Kajetany near Warsaw, 16.07.2015
    4. Duch W. Nie płaczmy nad humanistyką! Polityka no. 8 (2946), 19.02–25.02.2014, pp. 66-67 (reprinted in Głos Uczelni 2/4, 2014).
    5. Duch W, "Myśli przestają być prywatne", Rzeczpospolita 23-24.08.2014, A14.
    6. Duch W, Barbarzyńcy w życiu publicznym, 12.2014 (unpublished popular article on scientific research using animals).
    7. Duch W. (2005) Future of the information society and information technology from the 2005 perspective. In: New Age Communication Media. ICFAI Press 2005.
    8. M. Berndt-Schreiber, W. Duch, A. B. Kwiatkowska, A. Polewczyński, K. Skowronek (2002) Pokolenie dorastające z komputerem wkracza na uniwersytety - nowe wyzwania edukacyjne. In: "Rola i Miejsce Technologii Informacyjnej w Okresie Reform Edukacyjnych w Polsce", Eds. T. Lewowicki, B. Siemieniecki. Wyd. Adam Marszalek, Torun 2002, pp. 307-314.
    9. Duch W. (2002) Przyszłość technologii informacyjnych i przyszłość książki. Wirtualna Edukacja Nr. 9 (2002)
    10. Duch W. (2001) Future of the information society and information technology. In: Wissenschaft und Bildung in einer informatorischen Gesellschaft in der Zeit der europäischen Integration, Ed. Ryszard Grzaslewicz, Wydawnictwo Akademii Rolniczej, Wroclaw 2001
    11. Duch W, (1994) Cerebrations on computational science.
      Academic Programs in Computational Science and Engineering Education, Albuquerque, New Mexico, 10-12.02.1994; available in electronic form via gopher from the Educational High-Performance Computing Project; also in the Proceedings of the "Toruń Unix Center" Conference, 25-26.03.1994
    12. Duch W (1993) Przyszłość komputerów w edukacji. IX Konferencja "Informatyka w szkole": Metody i Środki Informatyki w Zmieniającej się Szkole (15-18.09.93, Toruń).
    13. Duch W (1993) Metody komputerowe w leksykografii. UMK-KMK-TR 2/93 report
    14. Duch W (1992) Elektroniczne podróże w przeszłość (Humanistyka cyfrowa). Nowa Europa, 8/06/1992
    15. Duch W (1984) Z czego zrobiony jest świat? ITD 18/3/1984; Grand Prix in the "Śladami Infelda" science popularization competition
    16. Duch W (1983) Każdy ma prawo być inteligentnym. Przekrój N. 2006, 10/1983, p. 20. On Luis Alberto Machado's book about the intelligence project in Venezuela.
    17. Duch W (1978) Tradycje Wschodu: Buddyzm. Przekrój N. 1747, 1/10/1978
    18. Duch W (1978) Medytacja Transcendentalna. Przekrój N. 1736, 16/7/1978
    19. Duch W (1978) Medytacja. Przekrój N. 1734, 2/7/1978
    20. Duch W (1978) Elektronika i stresy, Przekrój N. 1730, 4/6/1978, pp. 21-22. The first article in Polish on biofeedback!

  3. Science policy/Polityka naukowa
    1. Duch W. (2019), Hamulce rozwoju nauki. Wszystko co najważniejsze (online version; a slightly shorter version in print).
    2. Duch W. (2019), Hamulce rozwoju nauki: ocena czasopism. Pauza Akademicka 471, 16.05.2019, p. 4.
    3. Duch W. (2019) Hamulce rozwoju nauki: tradycja. Pauza Akademicka 469, 2.05.2019, p. 1.
    4. Duch W. (2019), AI: projekty, które mają przyszłość. Comments on the strategic program for artificial intelligence in Poland, "Podstawy uczenia maszynowego, informatyka kognitywna i technologie neurokognitywne", prepared for OPI PIB.
    5. Duch W. (2018) Naukowa ślepota. Pauza Akademicka 435, 6.09.2018, pp. 2-3.
    6. Duch W. (2018) Fizyka, Informatyka i Kognitywistyka. Głos Uczelni 1/2018.
    7. Duch W. (2018) Honorowy Doktorat dla Informatyka. Głos Uczelni 2/2018, pp. 7-8.
    8. Duch W. (2017) Ocena nauki czy jednostek naukowych? Pauza Akademicka 372, 16.02.2017
    9. Duch W. (2017) Którędy do cyfrowego świata? Pauza Akademicka 1.06.2017, p. 3
    10. Duch W. (2016) Wyzwania Polityki Naukowej w Polsce. Zagadnienia Naukoznawstwa 1(207), 2016, pp. 135-143
      Remarks delivered at the discussion panel „Współczesne wyzwania dla naukoznawcy w świetle polityki naukowej, jak i refleksji filozoficznej”, Poznań, 18.09.2015.
    11. Duch W. (2015) Life sciences and the quality of life, European Files, March 2015, pp. 10-11.
    12. Duch W. (2001) Szufladki w nauce. Sprawy Nauki 12(75) (2001) 12-13.
    13. Duch W (2001) Droga do piekła. Głos Uczelni, 9/2001, p. 10
    14. Duch W (2001) Smutne refleksje z Brukseli. Głos Uczelni, 7/2001, p. 1
    15. Duch W (2001) Nieuctwo szkodzi bardziej. Głos Uczelni, 6/2001, p. 1

  4. Interviews and media (Wywiady i media)
    1. Nauka w Polsce, PAP 2019, short video: Wyzwania umysłowe.
    2. Lekcja mistrza Zen. Interview with Włodzisław Duch by Anna Mateja. Znak, no. 766, March 2019, pp. 6-13.
    3. "Nadchodzi era Homo Sapiens Digital", Express Bydgoski, 15.03.2019.
    4. Duch W, interview: Trzy razy wynalazłem koło? Forum Akademickie 10, pp. 18-20, 2015.
    5. Duch W, interview: Otwarta wiedza dla otwartych głów. Przegląd Techniczny 6-7, pp. 8-9, 2015.
    6. Duch W, interview: Nauka na "kliknięcie". Przegląd Techniczny 8, pp. 31-32, 2015.
    7. Duch W, Wszystkie grzechy ludzkiej pamięci, interview, Gazeta Wyborcza, 14.03.2015.
    8. Duch W, A long way to go (interview, Cristina Gallardo). Research Europe 411, 18.06.2015.
    9. Duch W, Od dźwięków gwizdanych do rapu. Słyszę, 4/144, July/August 2015.
    10. Duch W, Mamy dużo do zrobienia (interview, Marcin Kilanowski). Res Publica 3/2014, pp. 33-38.
    11. Duch W, Czy zabawka może pomóc dziecku w nauce języków obcych? Interview with Prof. Włodzisław Duch, bimonthly Słyszę, March/April 2014.
    12. Duch W, Gimnastyka neuronów. Kwartalnik Neuropozytywni, 2(09), 2014, pp. 45-47.
    13. Duch W, interview: Od mózgu do Matriksa. Głos Uczelni, no. 7 (341), July 2014.
    14. Duch W. (2013) Mózg kobiety, mózg mężczyzny. "Wysokie Obcasy Extra", April 2013, pp. 45-47.
    15. Duch W. (2011) Wszyscy żyjemy złudzeniami. Interview, Gazeta Wyborcza, 3.06.2011.
    16. Duch W. (2011) O tym jak nie zgłupieć. Interview, Gazeta Wyborcza, 8.09.2011, pp. 16-17.

Patents and inventions

  1. Processing clinical text with domain-specific spreading activation methods.
    US Patent Application No. 12/006,813 (April 2008), granted and published 6.01.2015 as U.S. Patent 8,930,178 B2.
    Co-authors: John Pestian, Paweł Matykiewicz, Włodzisław Duch, Tracy Glauser, Robert Kowatch, Jackie Grupp-Phelan. A related patent, U.S. No. 9,477,655, Processing Text with Domain-Specific Spreading Activation Methods, was granted on Oct 25, 2016.
  2. Włodzisław Duch, Bibianna Bałaj, Joanna Dreszer-Drogorób, Oleksandr Sokolov, Tomasz Komendziński, Jacek Matulewski, Dariusz Mikołajewski, Tomasz Piotrowski, Michał Meina, System do wspomagania rozwoju percepcyjno-poznawczego niemowląt i małych dzieci (System supporting the perceptual-cognitive development of infants and small children), Polish patent application PL P.411648 (2015).
  3. Układ aktywnego stymulatora ośrodków mowy, zwłaszcza niemowląt i dzieci (Active speech-center stimulator system, especially for infants and children).
    Abstract of the invention published in BUP no. 99/03, p. 73.
    Polish patent no. 184102, granted in 2002; application no. 321411, filed 29.07.1997.

Awards for inventions:

My company DuchSoft developed the data mining software GhostMiner between 1998 and 2004; it was distributed in the "business intelligence" category by Fujitsu (FQS Poland). The software was based on four PhD theses of my students (Norbert Jankowski, Krzysztof Grąbczewski, Rafał Adamczak and Antoine Naud). Using this software, our group ranked 3rd in the NIPS 2003 Feature Selection Challenge.