What has been done so far?

  • Cooperating with a Spanish research group (University of Valencia), we investigated whether dogs and humans discriminate similarly between dependent and independent motion patterns performed by geometric shapes. We found that in both dogs and humans, looking times decreased in response to the dependent pattern and increased in response to the independent pattern. We argue that dogs and humans spontaneously recognized the specific pattern that “made sense” and habituated to it rapidly, but continued to show interest in the “puzzling” pattern. Our finding that both species tend to recognize inanimate agents as animate, relying solely on their motion, was published in Biology Letters.

  • Applying a fully non-invasive methodology to study sleep and memory in pet dogs, we performed polysomnography recordings following a command learning task. We provided evidence that learning has an effect on dogs’ sleep EEG spectrum. Furthermore, spectral features of the EEG were related to post-sleep performance improvement, and sleep or waking activity during the retention interval had both short- and long-term effects. This is the first evidence that dogs’ human-analogue social learning skills may be related to sleep-dependent memory consolidation.

  • To investigate the evolutionary origin of the organization of human resting-state networks (spatially distributed, functionally connected brain regions), we scanned 22 awake, unrestrained companion dogs and carried out spatial independent component analysis to explore whole-brain connectivity patterns. Studying resting-state networks provides information about the large-scale functional organization of the brain, and alterations in these networks are considered to play a role in a wide range of neurological conditions and in cognition. Using resting-state functional magnetic resonance imaging, we described multiple resting-state networks in dogs.

  • We measured fMRI adaptation in awake family dogs while they listened to lexically and intonationally marked and unmarked words. Short-term fMRI adaptation reflected intonation sensitivity in the bilateral auditory thalamus, and long-term fMRI adaptation reflected lexical meaning sensitivity in the right auditory cortex. This multilevel fMRI adaptation in dogs reveals a temporally and anatomically human-analogue speech processing hierarchy for acoustics-based and lexical representations, suggesting that the human lexical processing hierarchy is partially based on ancient brain mechanisms already present in a non-primate species.

  • In the prestigious journal Trends in Neurosciences, we published a review on the role of the dog as an innovative and unique model species that complements more traditional animal models in comparative neuroscience.

    Ádám Miklósi was asked by an American/English publisher to write a popular science book summarising our studies on dogs and providing an overview of dogs’ evolution, ecology, behaviour, and connections to humans. Two members of the group contributed to several chapters. (Miklósi Á ed. (2018). The Dog. A Natural History, London: Ivy Publishing Group, 224 p., ISBN: 978-1-78240-562-7)

  • Ethon, the mobile robot developed by us in cooperation with our colleagues at BME, was presented at the 2017 NIDays for Experts event, the conference of National Instruments Hungary Kft.

  • We investigated the development of social learning and social referencing (the process by which individuals use cues from the emotional displays of a social partner to form their response to a new situation) in pet dogs. Puppies tested in the presence of a human expressing positive emotional signals towards the stimulus were more likely to approach it than puppies tested with a human expressing neutral emotional signals. This shows that the ability for social referencing develops early in the ontogeny of companion dogs, as it is already present at 8 weeks of age. Another study investigated whether the capacity for social learning is already developed in dogs at an early age. We found that social learning skills are already present in 8-week-old puppies: they learned to solve the task from both conspecific and human demonstrators, underscoring dogs’ flexibility in learning from different social partners. In a third test, dogs trained with the Do as I Do method observed either a demonstration of a goal-directed action (opening a closet to take out a wallet) or a demonstration of the same action without the goal (opening the closet for no apparent reason). Dogs imitated the action more often when the goal was not shown, while they tended to solve the problem in different ways when the goal was shown. This study suggests that, similarly to humans, dogs understand their human companion’s goals and tend to modify their behaviour accordingly.

    We developed a new way to investigate cat behaviour and optimize welfare by understanding how cats respond to a new environment and to familiar and unfamiliar people. The analysis of the original data collected should provide unprecedented insight into the cat-human bond, especially in comparison with our other companion, the dog.

  • Applying the go/no-go paradigm using a touch-screen device, we revealed human-analogue associations among behavioural inhibition, attention, hyperactivity/impulsivity, and personality scores in pet dogs.

  • In a study investigating dog emotions, we found that pet dogs display jealous behaviour when their owner attends to a social rival. Thus, our research confirmed that a behaviour so far attributed only to humans can be found in non-human species as well. These findings can also contribute to the field of animal-robot interaction, both by providing information on how to improve the behaviour of robots for social interactions and by offering a novel method for learning more about the nature of these interactions.

    Our findings showed that dogs, similarly to adult humans, not only tend to perceive inanimate objects as animate based on simple motion cues, but are also sensitive to cues eliciting animacy perception (alignment of motion direction and trajectory) similar to those identified in humans.

  • As a first step of the new social robot project, we acquired and programmed a robot (Biscee) to navigate autonomously, act upon simple commands like “go to the kitchen”, and give limited social reactions with its appendage. Biscee was extensively tested for three consecutive days in the real-life scenario of autonomously navigating a crowded supermarket, which was a great success. We gathered much information on how to further fine-tune the robot’s behaviour; the robot did not cause any problems for people and was generally perceived as amicable and approachable.

  • In addition to the tests, our research group extended the observation capacity of our self-developed smart collars to 24 hours, making continuous measurement feasible, and is now gathering long-term data.

  • In our polysomnography study, we presented a detailed analysis of the cardiac and respiratory activity of dogs during sleep. Using a semi-automatic method, we successfully analysed heart rate, heart rate variability, and respiratory measures across the different sleep-wake phases of a large sample of adult pet dogs during three-hour polysomnography EEG recordings. Since variations in these physiological signals reflect the dynamics of autonomic functions, a more detailed understanding of their changes helps gain better insight into the internal/emotional processes of dogs in response to different external stimuli. As such, our results are important because they are directly comparable to human findings and may serve as a basis for future studies on dogs, for example by helping to interpret physiological data measured in response to stimulation.

  • Our results suggest that the dog is a promising animal model of the association between an optimistic cognitive bias and personality: dogs with higher conscientiousness and extraversion scores were more likely to exhibit a “go” (positive) response to ambiguous stimuli.

  • Drawing on our continuing experience in collecting and analysing smart collar data from dogs, we compiled a methodological recommendation for measuring the accuracy of behaviour recognition in dogs, in order to make results comparable across research groups.

  • Investigating dog-robot interaction, we found that companion dogs not only solicit help from a robot showing social-like behaviour but also tend to engage in playful interaction with it. Furthermore, dogs remember the behaviour of the robot after a month and continue to engage in interaction with it without further familiarization, showing that the experience with the robotic agent affects dogs’ behaviour in the longer term.

  • We showed that companion cats, similarly to dogs and adult humans, discriminate between animate and inanimate objects relying on their motion displayed on a screen. However, cats show a preference opposite to that of the other two species, raising the question of whether their behaviour is influenced by animacy perception per se.

  • We collaborated on a project (Celsa) applying dog-robot interaction as a methodological approach to studying dogs as a model species for autism spectrum disorder (ASD). Our tests were based on the hypothesis that dogs with ASD-like symptoms would not recognize the social-like behaviour of the robot or consider it an interactive partner, in contrast to dogs with a typical or high level of social competence.

  • Based on our previous findings on dogs’ jealous behaviour, we also tested whether dogs consider an interactive robot a social partner in this respect.

  • We upgraded our robot with a head and a tray so that it can perform more versatile tasks, and we built a web interface allowing non-technical users to run and monitor more complicated tasks, such as switching from one set of behaviours to another. The robot was prepared for a series of experiments in a café or restaurant, where customers’ reactions to the robot’s various behaviours and different embodiments can be observed.