Imagine for a moment that we are on a safari watching a giraffe graze. We look away for a second, and when we look back the animal has lowered its head and sat down. What happened in the meantime? Computer scientists from the University of Konstanz's Centre for the Advanced Study of Collective Behaviour have found a way to encode an animal's pose and appearance in order to show the intermediate motions that are statistically likely to have taken place.
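The idea of filling in missing motion can be illustrated by interpolating between two poses in a learned embedding space. The sketch below is purely hypothetical and is not the Konstanz team's model: it uses random linear maps in place of trained encoder/decoder networks, and the keypoint count and latent size are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

N_KEYPOINTS = 17   # number of skeletal nodes (assumed for this sketch)
LATENT_DIM = 8     # size of the pose embedding (assumed)

# Stand-ins for trained networks: a linear "encoder" and a
# least-squares "decoder" built from its pseudo-inverse.
W_enc = rng.normal(size=(LATENT_DIM, N_KEYPOINTS * 3))
W_dec = np.linalg.pinv(W_enc)

def encode(pose):
    """Map a (N_KEYPOINTS, 3) pose to a latent vector."""
    return W_enc @ pose.reshape(-1)

def decode(z):
    """Map a latent vector back to a (N_KEYPOINTS, 3) pose."""
    return (W_dec @ z).reshape(N_KEYPOINTS, 3)

def interpolate_poses(pose_a, pose_b, n_steps=5):
    """Generate intermediate poses by linear interpolation
    between the two latent codes."""
    za, zb = encode(pose_a), encode(pose_b)
    ts = np.linspace(0.0, 1.0, n_steps)
    return [decode((1 - t) * za + t * zb) for t in ts]

# Example: a "grazing" pose and a "sitting" pose (random stand-ins).
pose_grazing = rng.normal(size=(N_KEYPOINTS, 3))
pose_sitting = rng.normal(size=(N_KEYPOINTS, 3))
frames = interpolate_poses(pose_grazing, pose_sitting, n_steps=7)
print(len(frames), frames[0].shape)
```

A real system would replace the linear maps with trained neural networks and interpolate along a learned motion manifold rather than a straight line, so that the in-between frames look like plausible animal movement instead of a simple blend.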
from News on Artificial Intelligence and Machine Learning https://ift.tt/RKv4OfG
Designing a 'neural puppeteer' to recognize skeletal nodes