Dasha's MSc Thesis

Wednesday, October 19, 2005

Thesis title

Still undecided, I had to come up with a title so the master's programme director could send it to the faculty secretariat... and, reassured by the fact that I can change it later if I don't like it, the new title is:

Educational Robotics - programming emotional behaviours in small robots.


So what will the thesis be? I will work on the ciberato simulator, adapting it to my needs; I will specify the emotional brain of a robot, program it in Delphi, and test it in the simulator... The various robots (agents) will have to be able to communicate with each other and react to external stimuli, including visual and prosodic ones (since I will not do image or speech processing, I will define the decision module and test it with a limited number of pre-processed examples). I will define what an emotional behaviour in a robot is, how it can be externally configured by the user, how the information will be processed, and how the robot's response will be returned. The idea is for the user to be able to configure an emotional robot according to the required characteristics, that is, to create an emotion-oriented programming application which, underneath, will be event-oriented, except that the event space will be focused on emotions.
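The "event space focused on emotions" idea can be sketched as a small dispatcher. This is only an illustration of the concept, not the Delphi code to be written; all names here (EmotiveAgent, on_emotion, perceive) are hypothetical:

```python
# Minimal sketch of emotion-oriented event programming (hypothetical names).
# Behaviours are registered per emotion, then fired by processed stimuli.

class EmotiveAgent:
    def __init__(self):
        self.handlers = {}          # emotion -> list of callbacks

    def on_emotion(self, emotion, handler):
        """Register a behaviour for an emotion, like an event handler."""
        self.handlers.setdefault(emotion, []).append(handler)

    def perceive(self, stimulus):
        """Map a pre-processed stimulus to an emotion and fire its handlers."""
        emotion = stimulus.get("emotion", "neutral")
        return [h(stimulus) for h in self.handlers.get(emotion, [])]

agent = EmotiveAgent()
agent.on_emotion("anger", lambda s: "back away")
agent.on_emotion("joy", lambda s: "wag antennae")
responses = agent.perceive({"emotion": "anger", "source": "vision"})  # -> ["back away"]
```

Underneath it is plain event dispatch; only the keys of the event space are emotions, which matches the idea above.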

What is this for... there are several projects connecting new technologies and education where we need natural communication with robots or virtual agents; it is useful for e-Learning, help-desk applications, educational games, etc... Since the model I will create and program is adaptable to different situations, it can be reused in projects involving configurable emotional agents.

What I will use: C++ to modify the simulator, Delphi for the agents, and Logo (perhaps) for a visual emotion-programming interface. The emotional core will be written in Delphi, because it is in the agent that all external information will be processed, activating the decision modules that generate the robot's emotional response.

My fear: not managing to get anything done...

Tuesday, August 02, 2005

Here are some emotional behaviors I have found for my thesis... This is just the beginning; I'll find more. So, for my work I'll have {creepy, bored, mad, shocked, chillin} states for robotic behavior. :)
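A toy sketch of how those five states could sit in a transition table; the stimuli and the transitions themselves are made up purely for illustration:

```python
# Hypothetical state set and transitions for a robot's emotional behaviour.
STATES = {"creepy", "bored", "mad", "shocked", "chillin"}

TRANSITIONS = {
    ("chillin", "loud_noise"): "shocked",
    ("shocked", "silence"):    "chillin",
    ("bored",   "obstacle"):   "mad",
}

def next_state(state, stimulus):
    """Look up the next emotional state; stay put if nothing matches."""
    return TRANSITIONS.get((state, stimulus), state)

state = next_state("chillin", "loud_noise")   # -> "shocked"
```

Keeping the transitions in a table rather than in code is what would later let a user reconfigure them from outside.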

My feelings about my work and thesis... :) Oh god... it's heavy...

An idea...

After analysing some of the work done in the area of emotional robotics, the following idea came to me:

I will build a visual interface for programming a set of robotic units, making this a multi-agent system. Imagine that one robot is responsible for operating the legs, another for the arms, and another for the head (each robot has two motors). My interface will then specify three objects, to which I will associate certain behaviours. Each object is identified by its IP address (communication is done over wireless), and each one has a system that sends information about its current state and receives information about the other robots. We can describe this as a multi-agent system without centralized control, because each robot decides what to do according to its own programming, but also according to the information received from the other agents in the system. If one of them fails, this does not prevent the others from operating. Each robot has a decision system that, using a metric (to be defined), decides the next step according to its programming and the state of the other robots, which makes the system dynamic. The challenge is making the visual interface accessible and the parameterizations feasible...
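The decentralized coordination described above can be sketched as follows. Real robots would exchange these state messages over wireless by IP; here the "network" is simulated by direct method calls, and all names (RobotAgent, receive, decide) are illustrative:

```python
# Sketch of decentralized coordination: each agent keeps the last known
# states of its peers and decides locally, with no central controller.

class RobotAgent:
    def __init__(self, ip, role):
        self.ip, self.role = ip, role
        self.peers = {}                     # peer ip -> last reported state

    def receive(self, ip, state):
        """Store a state update broadcast by another agent."""
        self.peers[ip] = state

    def decide(self):
        """Local decision using own programming plus peer states."""
        if any(s == "failed" for s in self.peers.values()):
            return "continue without failed peer"
        return f"coordinate {self.role}"

legs = RobotAgent("10.0.0.1", "legs")
legs.receive("10.0.0.2", "moving")   # arms robot reports in
legs.receive("10.0.0.3", "failed")   # head robot has failed
decision = legs.decide()             # -> "continue without failed peer"
```

The key property from the idea above is visible in the last line: the failure of one agent changes the others' decisions but does not stop them.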

Thinking about other possibilities...

eMuu - an emotional robot based on two LEGO Mindstorms RCX bricks

eMuu is another emotional robot that is very interesting to consider. The eMuu software consists of three components. The game engine, which implements the negotiation task, and the character engine, which controls the behaviour of the character, run on a PC using Java.
The emotional reasoning is based on the OCC model (Ortony, 1988). The emotion engine, which controls the emotional state and the facial expression, runs on two LEGO Mindstorms RCX bricks (Mindstorms, 2001) inside the robot and communicates with the PC via infrared. Since infrared communication is rather slow, the communication is limited to the exchange of emotional states and behaviour control. The software for the RCX is written in Java and runs on the leJOS firmware (Solorzano, 2001), a Java Virtual Machine for the RCX.
This architecture builds on Sloman's (Sloman, 1999) evolutionary approach to the mind: the emotions and sensor-motor control sit in the lower part of the conscious (the processor inside the RCX), and the reasoning in the higher conscious (the PC).

References:
  • Bartneck, C., Okada, M. (2000). eMuu – An Emotional Robot. http://www.bartneck.de/work/bartneck_robofesta.pdf
  • Ortony, A., Clore, G., Collins, A. (1988). The Cognitive Structure of Emotions. Cambridge: Cambridge University Press.
  • Solorzano, J. (2001). leJOS: Java based OS for the Lego RCX. http://lejos.sourceforge.net/
  • LEGO Mindstorms (2001). LEGO Mindstorms website. http://www.legomindstorms.com
  • Sloman, A. (1999). 'Architectural requirements for human-like agents both natural and artificial'. In K. Dautenhahn (ed.): Human Cognition and Social Agent Technology, Advances in Consciousness Research 19. Amsterdam: John Benjamins Publishing Company.

Virtual Human - a virtual emotive being

Another interesting project is Virtual Human. The aim of this joint project is to achieve, through close cooperation among Germany's leading research groups in computer graphics and multimodal user interfaces, a worldwide leading position in the development of virtual characters as personal dialogue partners. It aims at an entirely new quality of interactive systems. One possible field of application is educational software, where the personification of the tutoring system demonstrably leads to better learning efficiency and secures lasting learning success. Alongside this field of application, all interactive web applications, and especially e-business, contribute to the huge application potential of the technologies to be developed in this project.
This joint project will address the entirely new question of how the animation and dialogue behaviour of virtual characters can be planned completely autonomously in real time. This idea stands in clear opposition to the manually animated virtual actors created in long-winded computational processes for the film industry, as well as to avatars in chat rooms and video conferences that are remotely controlled by a human actor. Since the virtual character has to react to the behaviour of a human user as naturally as possible, and appropriately to the situation, the requirements on the computer graphics design as well as on the quality of the multimodal user interaction are novel and highly demanding.

AIBO - emotional robot

A computer character with emotional behaviours will probably be met by the user with more complicity, stimulating better interaction between them. A good example of an artificial companion is the robotic dog AIBO. AIBO's personality develops by interacting with people, and each AIBO grows in a different way, based on its individual experiences. AIBO becomes customized based on feedback and the software being used. AIBO's mood changes with its environment, and its mood affects its behavior. AIBO also has instincts to move around, to look for its toys, to satisfy its curiosity, to play and communicate with its owner, to recharge when its battery is low, and to wake up when it's done sleeping. AIBO is capable of six feelings - happiness, sadness, fear, dislike, surprise, and anger. Its unique personality is developed with a combination of these instincts and feelings.
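How a mood could mediate between environment and behaviour can be sketched with a toy model in which events nudge feeling intensities and the strongest feeling selects the behaviour. This is purely an illustration of the idea, not Sony's actual AIBO software:

```python
# Toy mood model: environment events nudge feeling intensities, and the
# dominant feeling drives the behaviour. Numbers and effects are made up.

FEELINGS = ["happiness", "sadness", "fear", "dislike", "surprise", "anger"]

def update_mood(mood, event_effects):
    """Add each event's effect to the matching feeling, clamped to [0, 1]."""
    return {f: max(0.0, min(1.0, mood[f] + event_effects.get(f, 0.0)))
            for f in FEELINGS}

def dominant_feeling(mood):
    """The feeling with the highest intensity wins control of behaviour."""
    return max(mood, key=mood.get)

mood = {f: 0.1 for f in FEELINGS}
mood = update_mood(mood, {"happiness": 0.5})   # e.g. the owner petted the robot
behaviour = dominant_feeling(mood)             # -> "happiness"
```

The clamping keeps intensities bounded, so repeated stimuli saturate a feeling instead of growing it without limit.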

Virtual Clone - contribution

Student Virtual Clone (SVC) is another international project we participate in. At this moment it is in the submission phase, and I am trying to specify my contribution to this project so that it benefits my thesis work. I will direct my contribution towards pattern recognition of prosodic and gestural inputs to generate emotional behaviours in a computer character. Below is the detailed description I have written of what I can do in this project:

Our company's contribution within the Virtual Clone Project WP Dialogue will focus on pattern recognition tasks oriented to the study and development of algorithms for controlling emotional behaviour. Using our know-how in the artificial intelligence field and applying the state of the art in pattern recognition, emotional intelligence, and artificial life, we will provide an analysis of the tasks needed to make a virtual being emotionally intelligent. We will also research what emotions are in an artificial-life context, how we can integrate them into the virtual clone, how we can configure them, and how they can be useful.

At this point, it is important to emphasize that we will be working on the research and algorithmic conception of emotional behaviours. Consequently, we will research the best way to integrate emotions into the virtual clone, what this involves, and how it can be implemented.

Vision and Goals
What we want is to research how emotions are used in artificial beings such as robots or 3D characters: what kinds of emotions are useful, and how we can analyse inputs to transmit the correct idea to the end user. We will analyse possible interactions between the user and the virtual clone, possible feedback from the system, and define the emotional responses of the virtual clone. Based on this research, made in collaboration with other partners, we will specify the best configuration for the input data.

It will be necessary to proceed with the analysis of certain emotional patterns: classify them, specify requirements for the dataset, create the dataset, discover the best data-mining technique for the pattern recognition task, and implement this method (for example, a neural net, a Bayesian net, a decision tree, etc.). An important task is to define the structure of the dataset and then carry out data analysis and treatment to prevent noise and over-fitting and to increase data quality. We need to analyse all components of our dataset using different statistical techniques. Once the pattern recognition analysis is concluded, it will be necessary to integrate these results into the SVC simulation. We will implement algorithms to generate all the necessary behaviour output to be included in the graphic and prosodic modules. The link between the graphic and prosodic modules and the behaviour-returning algorithm will be implemented by the person responsible for each of those modules.
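The pattern-recognition step can be sketched on made-up prosodic features (pitch, energy). Here a simple nearest-centroid rule stands in for whichever data-mining technique the research eventually selects (neural net, Bayesian net, decision tree...); the training values are invented for illustration:

```python
# Sketch of emotion classification from two hypothetical prosodic
# features, (pitch_hz, energy). Nearest-centroid is a stand-in for the
# real technique still to be chosen.
import math

TRAINING = {            # emotion -> list of (pitch_hz, energy) examples
    "anger":   [(300, 0.9), (280, 0.8)],
    "neutral": [(180, 0.4), (200, 0.5)],
}

def centroid(points):
    """Mean point of a list of feature vectors."""
    return tuple(sum(coords) / len(points) for coords in zip(*points))

CENTROIDS = {label: centroid(pts) for label, pts in TRAINING.items()}

def classify(sample):
    """Assign the emotion whose centroid is closest to the sample."""
    return min(CENTROIDS, key=lambda lbl: math.dist(sample, CENTROIDS[lbl]))

label = classify((290, 0.85))    # -> "anger"
```

In practice the dataset structure, feature choice, and the noise/over-fitting treatment mentioned above all matter far more than the particular classifier in this sketch.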
The major problem is to get adequate feedback for most of the interaction between a human and his virtual clone. For example: if the user talks to the clone with anger, the system identifies the angry behaviour, and the clone receives the content of the user's speech plus information about the user's emotional state. What algorithm will decide the clone's emotional response to the user? We will research this task to find a technique that defines the most adequate response of the clone.
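The shape of that decision can be sketched as a response policy: given the user's detected emotional state, pick the clone's response. A lookup table stands in here for the actual algorithm still to be researched, and all entries are invented:

```python
# Sketch of a clone response policy (hypothetical; the real mapping is
# exactly the research question described above).

RESPONSE_POLICY = {
    "anger":   "calm, soothing reply",
    "sadness": "empathetic reply",
    "joy":     "enthusiastic reply",
}

def clone_response(user_emotion, default="neutral reply"):
    """Pick the clone's response style from the user's detected emotion."""
    return RESPONSE_POLICY.get(user_emotion, default)

reply = clone_response("anger")    # -> "calm, soothing reply"
```

A learned or context-sensitive policy would replace the table, but the interface (detected emotion in, response style out) would stay the same.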

Another important point we will have to be aware of is the personalization of each virtual clone. The user should have the possibility to parameterize some personal characteristics to make the virtual character a trusted virtual companion. For this we are thinking about creating a visual interface to simplify emotion parameterization. We will study and develop the concept of this visual interface.
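The kind of personal characteristics such an interface could expose might look like the following; every field name here is hypothetical, chosen only to make the idea concrete:

```python
# Sketch of user-configurable personality parameters for a virtual clone.
# Field names and ranges are illustrative, not a specified design.
from dataclasses import dataclass

@dataclass
class PersonalityProfile:
    cheerfulness: float = 0.5    # 0 = gloomy, 1 = upbeat
    patience: float = 0.5        # dampens anger-type responses
    expressiveness: float = 0.5  # how strongly emotions are shown

    def scaled_intensity(self, raw_intensity):
        """Scale a raw emotion intensity by expressiveness, clamped to [0, 1]."""
        return max(0.0, min(1.0, raw_intensity * self.expressiveness))

profile = PersonalityProfile(expressiveness=0.8)
shown = profile.scaled_intensity(0.5)    # a muted version of the raw emotion
```

A visual interface would then be little more than sliders bound to these fields, which is what keeps the parameterization simple for the user.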
Finally, we will try as much as possible to promote the use of open solutions in order to facilitate the interoperability and extensibility of the system. All research and development must satisfy the requirement of being reusable in other projects involving emotional intelligence and virtual or robotic agents.

Abstract

This research work aims to develop behavioural algorithms and a Visual Interface for programming small robots, based on: studies in Emotional Psychology and Human Communication; electronic models of robots and sensors; a programming language; studies of cognition and learning processes; and intelligent agents representing the various intelligent entities present in a team of robots with emotional intelligence. The behavioural algorithms and the Visual Interface to be developed will target the wiCrickets robots (under development); they will be specified in pseudo-code and implemented in the LOGO language. The application will be tested in a simulation environment using configurable robotic agents (to be developed). It will thus be possible to use the developed algorithms, adapting them to any robotic agent (real or virtual) that is intended to have emotional intelligence, collaborative behaviours, or simply to be programmable by behaviours or events.

The system developed will have a behavioural knowledge base, oriented mainly to emotions, which will be designed to allow its reuse in other scenarios through a manual configuration module. Part of the research will address how knowledge is acquired and learned, considering the alternatives of statistical or supervised learning.

Saturday, April 16, 2005


handy cricket

What is the Cricket?



The Cricket is a tiny computer, powered by a 9 volt battery, that can control two motors and receive information from two sensors. Crickets are equipped with an infrared communication system that allows them to communicate with each other. Crickets are the result of cross-breeding our work on Programmable Bricks with the wearable Thinking Tag. Like the Brick, Crickets can be used for robotic applications, but because they are so small, Crickets can be used for other investigations like body-monitoring and data collecting.
The Cricket is based on the Microchip PIC series of microprocessors. User programs are downloaded to the Cricket via its infrared communications system. The Cricket has a button that, when pressed, triggers it to run the program that was downloaded to it. LEDs on the Cricket indicate whether it is running a program or sitting idle, the state of the two motor outputs, and infrared transmission activity.
Crickets are programmed in a dialect of the Logo programming language, a procedural language that includes constructs like if, repeat, and loop, global and local variables, 8-bit numeric operations (addition, subtraction, multiplication, division, comparison), motor and sensor primitives, timing functions, a tone-playing function, and a random number function.
We have developed a variety of activities based on Crickets, as part of our Beyond Black Boxes NSF research:

  • Crickets are small and light enough that they can be carried around in a shirt pocket, collecting data about body activities.
  • A collection of Crickets communicating with each other to simulate natural life.
  • A network of Crickets positioned in an indoor environment to collect and share data about human traffic patterns, room temperature variations, lighting preferences, and other dynamic qualities.
  • Mobile, creature-like robots built on a smaller scale than was previously possible.

You can find out more about the technological infrastructure that runs the Crickets by taking a look at the Technical Overview presentation, given as talk slides at a May 29, 1997 Media Lab presentation.
The Cricket grew out of Fred Martin's work with Programmable Bricks. The Cricket project has had many contributors, but has been primarily created by Fred Martin, Brian Silverman, Bakhtiar Mikhak, and Robbie Berg.

Wednesday, April 13, 2005

The final countdown to start my Master Degree thesis

Today I took the first step in preparing for my MSc thesis by creating this blog. Here I will report doubts, problems, solutions, research, and other related items. A possible title for my thesis is "Programmable wiCrickets - Visual language and algorithm for robot collaboration and emotion programming". Stay tuned and come back to this blog often!