CISUC has nine research groups:

G1: Computer Science
G2: Artificial Intelligence: Foundations and Applications
G3: Simulation and Information Technologies in Education and Training
G4: Adaptive Computation
G5: Dependable Systems
G6: Communications and Telematics
G7: Databases
G8: Information Systems
G9: Evolutionary and Complex Systems

The list of PhD proposals is divided by group. Each proposal has a reference Gx.y (x is the number of the group; y is the number of the proposal within that group). Use these references to fill in the Scholarship Application Form.

CISUC will offer 9 scholarships in the upcoming year. Students who do not have any other source of funding can apply for these scholarships. The scholarship includes payment of the tuition fees and a monthly stipend of 980€. Students should be aware that this scholarship covers only the first year; financial support for the remaining years of the PhD is the responsibility of the Supervisor. Students are therefore advised to discuss with their future Supervisor the possibilities for financial support for the rest of the PhD programme.

Candidates should choose 3 thesis proposals, listed in order of preference. The chosen proposals can be from the same or from different groups.

Each CISUC group has one scholarship to offer, and the best candidate in each group will receive it. The selection of scholarship candidates is not a global process but a distributed one, with nine independent selection queues. Each group is responsible for selecting its best candidate, and the final list will be approved by the Scientific Commission of CISUC.

In the following pages we present the list of proposed PhD theses for the next year (2007-2009).
Scholarship candidates should read this list carefully and choose the 3 proposals that best fit their interests and research background.

Deadline to apply for a scholarship: 21st of July, 2006

G1: Computer Science
http://cisuc.dei.uc.pt/csg/

PhD Thesis Proposal: G1.1
Title: "Human readable ATP proofs in Euclidean Geometry"
Keywords: Automatic Theorem Proving, Axiomatic Proofs in Euclidean Geometry.
Supervisor: Prof. Pedro Quaresma (firstname.lastname@example.org)
Summary:
Automated theorem proving (ATP) in geometry has two major lines of research: the axiomatic proof style and the algebraic proof style (see the references below, for instance, for a survey). Algebraic proof style methods are based on reducing geometric properties to algebraic properties expressed in terms of Cartesian coordinates. These methods are usually very efficient, but the proofs they produce do not reflect the geometric nature of the problem and they give only a yes/no conclusion. Axiomatic methods attempt to automate traditional geometry proof methods that produce human-readable proofs. Building on top of existing ATPs (namely GCLCprover [5, 4, 8, 9, 10], which implements the area method [1, 2, 3, 7, 8, 11], or ATPs dealing with constructions), the goal is to build an ATP capable of producing human-readable proofs, with a clean connection between the geometric conjectures and their proofs.
References:
[1] Shang-Ching Chou, Xiao-Shan Gao, and Jing-Zhong Zhang. Automated production of traditional proofs for constructive geometry theorems. In Moshe Vardi, editor, Proceedings of the Eighth Annual IEEE Symposium on Logic in Computer Science (LICS), pages 48-56. IEEE Computer Society Press, June 1993.
[2] Shang-Ching Chou, Xiao-Shan Gao, and Jing-Zhong Zhang. Automated generation of readable proofs with geometric invariants, I. Multiple and shortest proof generation. Journal of Automated Reasoning, 17:325-347, 1996.
[3] Shang-Ching Chou, Xiao-Shan Gao, and Jing-Zhong Zhang. Automated generation of readable proofs with geometric invariants, II. Theorem proving with full-angles. Journal of Automated Reasoning, 17:349-370, 1996.
[4] Predrag Janicic and Pedro Quaresma. Automatic verification of regular constructions in dynamic geometry systems. In Proceedings of ADG06, 2006.
[5] Predrag Janicic and Pedro Quaresma. System description: GCLCprover + GeoThms. In Ulrich Furbach and Natarajan Shankar, editors, IJCAR 2006, LNAI. Springer-Verlag, 2006.
[6] Noboru Matsuda and Kurt VanLehn. GRAMY: A geometry theorem prover capable of construction. Journal of Automated Reasoning, (32):3-33, 2004.
[7] Julien Narboux. A decision procedure for geometry in Coq. In Proceedings of TPHOLs 2004, volume 3223 of Lecture Notes in Computer Science. Springer, 2004.
[8] Pedro Quaresma and Predrag Janicic. Framework for constructive geometry (based on the area method). Technical Report 2006/001, Centre for Informatics and Systems of the University of Coimbra, 2006.
[9] Pedro Quaresma and Predrag Janicic. GeoThms - geometry framework. Technical Report 2006/002, Centre for Informatics and Systems of the University of Coimbra, 2006.
[10] Pedro Quaresma and Predrag Janicic. Integrating dynamic geometry software, deduction systems, and theorem repositories. In J. Borwein and W. Farmer, editors, MKM 2006, LNAI. Springer-Verlag, 2006.
[11] Jing-Zhong Zhang, Shang-Ching Chou, and Xiao-Shan Gao. Automated production of traditional proofs for theorems in Euclidean geometry I. The Hilbert intersection point theorems. Annals of Mathematics and Artificial Intelligence, 13:109-137, 1995.

PhD Thesis Proposal: G1.2
Title: "Formal languages in knowledge base management"
Keywords: Formal language, knowledge base, knowledge base inconsistency, knowledge base management.
Supervisor: Prof. Maria de Fátima Gonçalves (email@example.com)
Summary:
Knowledge base management is understood as the acquisition and normalization of new knowledge and the confrontation of this knowledge with existing knowledge, resolving potential conflicts and updating it. When updating a knowledge base, several problems may arise.
Among them are the redundancy of the updated knowledge and knowledge base inconsistency. Formal languages can be used successfully to develop an innovative system for knowledge base management that resolves these potential updating problems.

PhD Thesis Proposal: G1.3
Title: "Similarity Measures and Applications"
Keywords: Kolmogorov complexity, normalized distances, machine learning, clustering, universal similarity metric.
Supervisor: Prof. Ana Maria de Almeida (firstname.lastname@example.org)
Summary:
A recurring problem in many strategies that rely on the acquisition of knowledge (learning) in order to act is the identification of "similarities". How can we measure "similarity" so as, for example, to determine the evolutionary distance between two sequences, such as two documents on the Internet, two computer programs, two musical scores, two genomes, or two echocardiographic traces?
This project intends to study the emerging area of similarity distance measures, useful in data mining, pattern recognition, learning and automatic semantic extraction. Vitányi et al. created the "normalized information distance", based on the notion of Algorithmic Information Complexity (commonly known as Kolmogorov complexity), and showed it to be a universal measure for the class in question, since it uncovers all effective similarities. They also showed that it is a metric and that it takes values in [0,1]. There are already several practical examples with interesting final results: molecular similarities, identification of composers from musical scores, fetal echocardiograms, among others. The theory is general and, as such, applicable to diverse areas or collections of objects.
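The normalized information distance itself is uncomputable, since Kolmogorov complexity is; in practice it is usually approximated by the normalized compression distance (NCD), which replaces Kolmogorov complexity with the output length of a real compressor. A minimal illustrative sketch (not part of the proposal itself) using zlib:

```python
import zlib

def clen(data: bytes) -> int:
    """Compressed length: a computable stand-in for Kolmogorov complexity K(x)."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance, the practical approximation of the
    normalized information distance: near 0 for very similar inputs, near 1
    for inputs sharing no structure the compressor can exploit."""
    cx, cy, cxy = clen(x), clen(y), clen(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

doc1 = b"the quick brown fox jumps over the lazy dog " * 20
doc2 = b"the quick brown fox jumps over the lazy cat " * 20
unrelated = bytes(range(256)) * 4

print(ncd(doc1, doc2))       # small: the two texts share most of their structure
print(ncd(doc1, unrelated))  # much larger: nearly nothing in common
```

Because it needs nothing beyond a general-purpose compressor, the same function can be applied unchanged to documents, programs, scores, or genome fragments, which is exactly the generality the summary claims for the theory.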
The aim is to investigate its practical adequacy in particular cases such as learning algorithms (kernel functions) and clustering.

PhD Thesis Proposal: G1.4
Title: "(Kolmogorov) Entropy in the Definition of Rate-Distortion (R-D) Models for Video Signal Coding"
Keywords: Kolmogorov complexity, Entropy, Rate-Distortion Theory.
Supervisor: Prof. Ana Maria de Almeida (email@example.com)
Summary:
In the last decade, Algorithmic Information Complexity (AIC, or Kolmogorov complexity) has established itself as the theoretical foundation for a series of applications in the most diverse areas, allowing the (re)definition of models and concepts. In particular, owing to an interesting strong connection between the statistical notion of Shannon entropy and the deterministic AIC measure, we have witnessed not only several demonstrations of this theoretical connection, but also studies and models showing that AIC can replace entropy, for example in cryptography, with the advantage that it is no longer necessary to know probability distributions which, in truth, are not really known so far.
Since a theoretical model, parallel to Shannon's work, is already known for replacing entropy in the evaluation of Rate-Distortion (R-D) theory, on which signal transmission is based, this project intends to investigate the possible practical implementation of this new model for the optimization of R-D functions in video signal coding.

G2: Artificial Intelligence: Foundations and Applications
http://ailab.dei.uc.pt/

PhD Thesis Proposal: G2.1
Title: "Computational Aesthetics"
Keywords: Evolutionary Art, Artificial Artists, Computational Aesthetics
Supervisor: Prof. Penousal Machado (firstname.lastname@example.org)
Summary:
The use of biologically inspired techniques for image generation tasks is a recent, exciting and significant area of research.
There is a growing interest in the application of these techniques in fields such as visual art and music generation, analysis and interpretation; sound synthesis; architecture; video; design; etc. In most cases, these systems resort to a human-guided evolutionary algorithm: the user provides fitness scores for the images, thus steering evolution towards images that match his/her aesthetic preferences.
Although these systems are interesting in their own right as computer-aided creativity tools, they lack autonomy and the capacity to make their own aesthetic judgements; consequently, they cannot be considered artificial artists. According to the framework proposed by Machado et al. (2003), an artificial artist can be seen as a system composed of two modules: a creator and a critic.
The stunning imagery created with interactive evolutionary art tools indicates that an evolutionary computation approach is suitable for the creator role. As such, the bottleneck lies in the development of adequate artificial critics.
In Machado et al. (2004) the authors propose a generic framework for the development of artificial art critics composed of two main modules: a feature extractor, which is responsible for the "perception" of the artwork, collecting a set of low-level aesthetically relevant features; and an evaluator, which performs an assessment of the aesthetic merits of the artwork based on the output of the feature extractor module.
The proposed approach was tested in several author and style identification tasks, achieving high success rates (above 95%), which shows the feasibility of the approach.
Following the previous work of Machado et al. (e.g. 2003, 2004), the present thesis will focus on the development of artificial art critics and on their integration with existing interactive art tools.
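The two-module critic architecture can be pictured as a simple pipeline: a feature extractor turns the artwork into a vector of low-level measurements, and an evaluator maps that vector to a score. The sketch below is purely illustrative; the toy features and the linear evaluator are assumptions, not the ones used by Machado et al.:

```python
from statistics import mean, pstdev

def extract_features(pixels):
    """Map a grayscale image (flat list of 0-255 values) to low-level features.
    Illustrative features only: global brightness, contrast, and a crude
    complexity estimate based on neighbour differences."""
    diffs = [abs(a - b) for a, b in zip(pixels, pixels[1:])]
    return [mean(pixels) / 255, pstdev(pixels) / 255, mean(diffs) / 255]

def evaluate(features, weights):
    """Evaluator: here just a weighted sum; a trained model in practice."""
    return sum(f * w for f, w in zip(features, weights))

flat = [128] * 64                            # uniform image: no contrast
noisy = [(i * 97) % 256 for i in range(64)]  # high-variation image

weights = [0.2, 0.4, 0.4]  # hypothetical learned weights
print(evaluate(extract_features(flat), weights))
print(evaluate(extract_features(noisy), weights))
```

The research questions named in the proposal live in exactly these two boxes: which features to extract, and how to train the evaluator online instead of fixing its weights by hand.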
The key issues to be addressed are the further development of the current feature extractor, the analysis of the relevance of the incorporated features, and the online training of the evaluator module.
References:
- Penousal Machado, Juan Romero, Bill Manaris, Antonino Santos, Amilcar Cardoso, "Power to the Critics - A Framework for the Development of Artificial Art Critics", IJCAI'2003 Workshop on Creative Systems, Acapulco, Mexico, August 2003.
- Penousal Machado, Juan Romero, Maria Ares, Amilcar Cardoso, Bill Manaris, "Adaptive Critics for Evolutionary Artists", 2nd European Workshop on Evolutionary Music and Art, Coimbra, Portugal, April 2004.

PhD Thesis Proposal: G2.2
Title: "Stylistic Based Image Retrieval and Classification"
Keywords: Content Based Image Retrieval, Artificial Art Critics, Computational Aesthetics
Supervisor: Prof. Penousal Machado (email@example.com)
Summary:
The days of the text-based World Wide Web are over. Today the web is dominated by multimedia content. Nevertheless, the most popular search engines are text-based. Although a few search engines for finding images and video are available, these engines are still based on text. The large commercial image providers use human indexers to select keywords and to classify each of the images. Google and other popular search engines allowing image or video search base the retrieval on the textual content of the page or on the keywords inserted by a human user.
Content based image retrieval is a complex task. Taken to the limit, it is a generic vision problem, requiring object recognition, image understanding, concept formation, object classification, content analysis, etc. Computer vision is probably one of the hardest problems in Artificial Intelligence and one that has been baffling computer scientists for decades. Although significant progress has been made, it is not likely to be solved in the near future.
This poses limits on what content based image retrieval can accomplish; however, it also creates opportunities.
In Machado et al. (2004) the authors presented a system composed of two main modules: a feature extractor, which is responsible for the "perception" of the image; and an evaluator, which performs an assessment of the image based on the output of the feature extractor module. The feature extractor is composed of a set of low-level features that proved to be relevant to the stylistic classification of images. The proposed approach was tested in several author and style identification tasks, achieving high success rates (above 95%), which shows the feasibility of the approach and its potential for the development of stylistic based image retrieval engines.
The main objective of this thesis is the further development of the current feature extractor, focusing on the incorporation of features proved to be relevant for content based image retrieval tasks. Unlike other content based image retrieval systems, which focus on the identification of images containing a given set of objects, we are primarily interested in the retrieval of stylistically similar images. Therefore, particular emphasis will be given to features that are able to capture stylistic characteristics. Additionally, the development of a fully working prototype and its empirical testing are key aspects of this thesis.
References:
- Penousal Machado, Juan Romero, Bill Manaris, Antonino Santos, Amilcar Cardoso, "Power to the Critics - A Framework for the Development of Artificial Art Critics", IJCAI'2003 Workshop on Creative Systems, Acapulco, Mexico, August 2003.
- Penousal Machado, Juan Romero, Maria Ares, Amilcar Cardoso, Bill Manaris, "Adaptive Critics for Evolutionary Artists", 2nd European Workshop on Evolutionary Music and Art, Coimbra, Portugal, April 2004.

PhD Thesis Proposal: G2.3
Title: "Image Representation for Evolutionary Computation"
Keywords: Evolutionary Art, Programmatic Compression
Supervisor: Prof.
Penousal Machado (firstname.lastname@example.org)
Summary:
The use of evolutionary computation techniques for image generation tasks is a recent, exciting and significant area of research. Following the footsteps of Karl Sims, most systems resort to the evolution of symbolic expressions. Once evaluated, these expressions result in images. The tree-like nature of the expressions allows their meaningful manipulation through conventional Genetic Programming operators, and the results attained by several researchers during the past years show that this is a powerful image generation method. Although expression-based representations prove to be adequate in the context of an evolutionary approach, they suffer from a major problem: the generation of an image from a given expression is straightforward, but the inverse problem, finding the symbolic expression for a given image, is (NP-)hard. The goals of this thesis are twofold: 1) explore alternative image representation schemes, e.g. line-based representations, that are adequate for evolutionary computation but do not suffer from the aforementioned problem; 2) following the work of McGuire, Nordin, and others, explore the feasibility of using programmatic image compression methods in order to find compact symbolic expressions for a given image.

PhD Thesis Proposal: G2.4
Title: "Evolution of Dynamic Reactive Artworks"
Keywords: Evolutionary Art, Fitness Automation
Supervisor: Prof. Penousal Machado (email@example.com)
Summary:
The use of evolutionary computation techniques for image generation tasks is a recent, exciting and significant area of research. During the past few years we have watched the emergence of several evolutionary art tools. Most of these tools are interactive, in the sense that the user guides the evolutionary process, thus steering evolution towards images that match his/her aesthetic preferences. However, the resulting artworks are static images and not interactive installations.
The typical evolutionary art tool evolves programs; once executed, these programs result in images. The programs take no input; as a result, their output is static. The objective of this thesis is the development of an evolutionary art tool able to produce dynamic and reactive artworks. To achieve this, the programs being evolved receive an input signal, e.g. a hand gesture, and their output depends on that input. Therefore different "gestures" will yield different images, which results in a dynamic reactive artwork. The programs being evolved can be seen as mappings between the input signal and images. The difficulty lies in the development of interesting mappings. Given the time-based nature of the domain, resorting to human-guided evolution is not a viable option. Instead we wish to explore ways of automating fitness assignment, so that uninteresting mappings can be eliminated beforehand. Additionally, although "gestures" are a natural choice of input signal, others exist and should be explored in this thesis, allowing the evolution of artworks that react to sound, lighting, bio-signals, etc.

PhD Thesis Proposal: G2.5
Title: "An Affect-based Multi-Agent System"
Keywords: Affect, emotion, autonomous agents, multi-agent systems
Supervisor: Prof. Luís Macedo (firstname.lastname@example.org)
Summary:
Emotion and motivation (merged in the broader term affect) are essential for survival, well-being and communication in humans, playing, among other functions, a central role in cognitive activities such as decision-making, planning and creativity. So the question is: why don't artificial agents take advantage of emotions and motivations as humans do? What can emotional artificial agents do better than those that are not based on emotion?
What can emotion and motivation offer to artificial agents? Certainly, not all of the advantages that humans enjoy are applicable to artificial agents. But we can think of a series of situations in which the emotional advantage is visible: text-to-speech systems that give more intonation to speech, entertainment, preventive medicine, helping autistic people, artificial pets, personalized agents that act on behalf of someone by selecting news, music, etc. according to that person's mood, and consumer feedback obtained by measuring the emotions of consumers when dealing with a specific product. Such applications require the abilities to recognize, express and experience emotions. Research in Artificial Intelligence has almost ignored this significant role of emotions in reasoning, and only recently has this issue been taken seriously, mainly because of recent advances in neuroscience, which have given evidence that cognitive tasks of humans, and particularly planning and decision-making, are influenced by emotion.
The research question/thesis statement is that the tasks mentioned above can be robustly performed by affective agents. The approach comprises the development of a multi-agent system of affective agents [Macedo & Cardoso, 2004] so that this framework can be used to build agent-based applications.
References:
- Macedo, L. and A. Cardoso (2004). Exploration of Unknown Environments with Motivational Agents. Proceedings of the Third International Joint Conference on Autonomous Agents and Multiagent Systems. N. Jennings and M. Tambe. New York, IEEE Computer Society: 328-335.
- Macedo, L. The Exploration of Unknown Environments by Affective Agents. PhD Thesis, Universidade de Coimbra, 2006.

PhD Thesis Proposal: G2.6
Title: "Collaborative Multi-Agent Exploration of 3-D Dynamic Environments"
Keywords: Collaborative Multi-Agent Exploration of 3-D Dynamic Environments
Supervisor: Prof. Luís Macedo (email@example.com)
Summary:
Exploration gathers information about the unknown.
Exploration of unknown environments by artificial agents (usually mobile robots) has been an active research field [Macedo & Cardoso, 2004]. The exploration domains include planetary exploration (e.g., Mars or lunar exploration), the search for meteorites in Antarctica, volcano exploration, map-building of interiors, etc. Several exploration techniques have been proposed and tested in both simulated and real, indoor and outdoor environments, using single or multiple agents. The main advantage of multi-agent approaches is avoiding having the same area covered by two or more agents. However, there is still much to be done, especially in dynamic environments such as those mentioned above. Moreover, real environments consist of objects. For example, office environments possess chairs, doors, garbage cans, etc., and cities comprise several kinds of buildings (houses, offices, hospitals, churches, etc.), cars, and so on. Many of these objects are non-stationary, that is, their locations may change over time. This observation motivates research on a new generation of mapping algorithms, which represent environments as collections of objects. At a minimum, such object models would enable a robot to track changes in the environment. For example, a cleaning robot entering an office at night might realize that a garbage can has moved from one location to another. It might do so without the need to learn a model of this garbage can from scratch, as would be necessary with existing robot mapping techniques. This thesis addresses the problem of finding multi-agent strategies for the collaborative exploration of unknown, 3-D, dynamic environments. The strategy or strategies should be tested against other exploration strategies found in the literature.
References:
- Macedo, L. The Exploration of Unknown Environments by Affective Agents. PhD Thesis, Universidade de Coimbra, 2006.
- Macedo, L. and A. Cardoso (2004). Exploration of Unknown Environments with Motivational Agents.
Proceedings of the Third International Joint Conference on Autonomous Agents and Multiagent Systems. N. Jennings and M. Tambe. New York, IEEE Computer Society: 328-335.

PhD Thesis Proposal: G2.7
Title: "Case-Based Hierarchical-Task Network Planning"
Keywords: HTN Planning, Case-based Planning, Decision-theoretic Planning
Supervisor: Prof. Luís Macedo (firstname.lastname@example.org)
Summary:
Hierarchical Task Network (HTN) planning is a planning methodology that is more expressive than STRIPS-style planning. Given a set of tasks that need to be performed (the planning problem), the planning process decomposes them into simpler subtasks until primitive tasks, or actions that can be directly executed, are reached. Methods provided by the domain theory indicate how tasks are decomposed into subtasks. However, for many real-world domains it is sometimes hard to collect methods that completely model the generation of plans. For this reason, an alternative approach based on cases of methods has been taken in combination with methods. Real-world domains are usually dynamic and uncertain. In these domains actions may have several outcomes, some of which may be more valuable than others. Planning in these domains requires special techniques for dealing with uncertainty. This has actually been one of the main concerns of planning research in recent years, and several decision-theoretic planning approaches have been proposed and used successfully, some based on the extension of classical planning and others on Markov Decision Processes. In these decision-theoretic planning frameworks actions are usually probabilistic conditional actions, preferences over the outcomes of the actions are expressed in terms of a utility function, and plans are evaluated in terms of their expected utility. The main goal is to find the plan or set of plans that maximizes an expected utility function, i.e., to find the optimal plan.
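The expected-utility criterion described above (probabilistic outcomes, utilities over outcomes, plans ranked by expected utility) can be illustrated in a few lines; the plans and the numbers are made up purely for illustration:

```python
# Each plan is modelled as a list of (probability, utility) outcome pairs.
plans = {
    "safe":  [(1.0, 5.0)],                # one certain, modest outcome
    "risky": [(0.6, 10.0), (0.4, -3.0)],  # high reward, possible loss
}

def expected_utility(outcomes):
    """EU(plan) = sum over outcomes of p * U(outcome)."""
    return sum(p * u for p, u in outcomes)

# Decision-theoretic plan selection: pick the plan maximizing expected utility.
best = max(plans, key=lambda name: expected_utility(plans[name]))
print(best, expected_utility(plans[best]))  # prints: safe 5.0
```

Here the risky plan has EU 0.6*10 + 0.4*(-3) = 4.8, so the certain plan wins; with a utility of 12 for the good outcome the ranking would flip, which is exactly the sensitivity to the utility function that decision-theoretic planners must handle.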
In this thesis a planner that combines the technique of decision-theoretic planning with the methodology of HTN planning should be built in order to deal with uncertain, dynamic, large-scale real-world domains [Macedo & Cardoso, 2004]. Unlike in regular HTN planning, methods for task decomposition shouldn't be used, but instead cases of plans. The planner should generate a variant of an HTN, a kind of AND/OR tree of probabilistic conditional tasks, that expresses all the possible ways to decompose an initial task network.
References:
- Macedo, L. and A. Cardoso (2004). Case-Based, Decision-Theoretic, HTN Planning. Advances in Case-Based Reasoning: Proceedings of the 7th European Conference on Case-Based Reasoning. P. Calero and P. Funk. Berlin, Springer: 257-271.
- Macedo, L. The Exploration of Unknown Environments by Affective Agents. PhD Thesis, Universidade de Coimbra, 2006.

PhD Thesis Proposal: G2.8
Title: "Ontology Learning from Text in Portuguese"
Keywords: Ontology Learning, Information Extraction, Natural Language Processing
Supervisor: Prof. Paulo Gomes (email@example.com)
Summary:
Much of today's knowledge is gathered in textual format (e.g. imagine the amount of knowledge available on the web); as such, the extraction and mining of knowledge from texts becomes an important dimension of current research. One way of representing knowledge is through the use of an ontology. Simply put, an ontology is a shared understanding of some domain of interest in which hidden connections between concepts are made explicit. The use of such a structure allows communities and computer systems to share a consistent understanding of what information means, in other words, its semantics.
The main objective of this thesis is the development of a set of methodologies that allow the extraction of important concepts (relative to a certain domain) from text, along with the implicit and explicit relations that hold between them.
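One well-known starting point for extracting explicit relations (not mandated by the proposal, but illustrative of the task) is lexico-syntactic patterns in the style of Hearst, e.g. "X such as Y" signalling that Y is a kind of X. A toy sketch for English; a Portuguese system would need its own patterns and proper linguistic processing:

```python
import re

# Single illustrative pattern: "X such as Y1, Y2 and Y3" => (Y_i, is-a, X).
PATTERN = re.compile(r"(\w+(?: \w+)?) such as ((?:\w+(?:, | and )?)+)")

def extract_isa(text):
    """Return (hyponym, 'is-a', hypernym) triples found via the pattern."""
    relations = []
    for concept, examples in PATTERN.findall(text):
        for ex in re.split(r", | and ", examples):
            if ex:
                relations.append((ex, "is-a", concept))
    return relations

text = "Languages such as Portuguese, Spanish and Italian descend from Latin."
print(extract_isa(text))
```

Patterns like this have high precision but low recall, which is why the thesis would combine them with statistical methods and address the evaluation of what is extracted.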
Another aspect the thesis should address pertains to the evaluation of the knowledge extracted and its applicability in information systems.

PhD Thesis Proposal: G2.9
Title: "Intelligent Knowledge Management using the Semantic Web"
Keywords: Semantic Web, Knowledge Management, Artificial Intelligence, Ontologies
Supervisor: Prof. Paulo Gomes (firstname.lastname@example.org)
Summary:
Nowadays, companies gather and store large amounts of information in databases. This information represents potentially high-value knowledge for a company, but most of it is never transformed into knowledge, remaining lost in databases or document repositories. Software development is a knowledge-intensive activity involving several types of know-how and skills. Development teams usually have several members, which makes the sharing and dissemination of knowledge crucial for project success. One evolving technology that can be used for building knowledge management tools for the software development area is the semantic web. Semantics are the missing link between information/data and knowledge, and the semantic web provides the infrastructure needed to make true sharing of knowledge possible.
The semantic web is an infrastructure providing semantics associated with words in web resources. By itself, however, it does not provide a tool for knowledge management. What is needed are tools that enable the usage of the semantic web in an intelligent way, so that users can take advantage of knowledge sharing.
The main problem to be dealt with in this thesis is how a team of software development engineers can be aided by a tool, or a set of tools, that enables them to reuse knowledge more efficiently, thus increasing their productivity.
The main objective of this thesis is to develop a set of tools based on the semantic web. These tools are intended to have a set of intelligent characteristics, such as learning, proactive reasoning, semantic searching and retrieval of knowledge, representation of knowledge, knowledge acquisition, personalization, and others. Several reasoning methods have been developed in Artificial Intelligence and are ideal candidates to be used in this research work. Among the expected results of this research work are new algorithms and methodologies for knowledge management.

PhD Thesis Proposal: G2.10
Title: "Automatic Document Indexation in Portuguese"
Keywords: Document Indexing, Natural Language Processing, Semantic Web
Supervisor: Prof. Paulo Gomes (email@example.com)
Summary:
The main goal of this thesis proposal is to develop a system capable of indexing documents in an ontology. The application area for this thesis is the domain of software development. The main research contribution of this thesis is the development of classification and indexation methodologies for documents related to software engineering. Target documents are written in Portuguese and are related to software development (manuals, reports, papers, ...). The approach to be followed is based on a repository of knowledge objects, which is structured by an ontology. The main infrastructure for the storage and indexing of these objects is based on the languages developed for the Semantic Web. New classification algorithms will also have to be developed, so that they can cope with this document diversity. Another major challenge of this thesis is the correct disambiguation of document topics.
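A conventional baseline for weighting the terms that link a document to ontology concepts, and one that any new classification algorithm would be measured against, is tf-idf; a minimal sketch on made-up toy documents:

```python
import math
from collections import Counter

def tfidf(docs):
    """Return, per document, a term -> weight map (raw tf times idf).
    Terms occurring in fewer documents get higher idf, so they are more
    discriminative when deciding where a document belongs."""
    n = len(docs)
    df = Counter(term for doc in docs for term in set(doc.split()))
    weights = []
    for doc in docs:
        tf = Counter(doc.split())
        weights.append({t: tf[t] * math.log(n / df[t]) for t in tf})
    return weights

docs = [
    "uml class diagram software design",
    "software manual installation guide",
    "uml use case diagram",
]
w = tfidf(docs)
# "uml" appears in 2 of the 3 documents, "class" in only 1,
# so "class" receives the higher weight in the first document.
```

Real documents in Portuguese would additionally need tokenization, stemming and stop-word handling before this kind of weighting is meaningful.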
This work is to be integrated into ReBuilder (rebuilder.dei.uc.pt), a software tool for the reuse of UML diagrams.

PhD Thesis Proposal: G2.11
Title: "Automatic Named Entity Recognition"
Keywords: Named Entity Recognition, Natural Language Processing, Case-Based Reasoning
Supervisor: Prof. Paulo Gomes (firstname.lastname@example.org)
Summary:
Automatic named entity recognition is the identification and classification of linguistic expressions that refer to a specific entity. For example, "Coimbra University" is a named entity comprising a sequence of words that refers to the University of Coimbra. These entities (represented by sequences of words) possess their own linguistic properties. Natural language processing systems and other applications dealing with natural language text must be able to identify and classify these entities in order to use the associated semantics, which is very different from using the words individually. There are several international contests that compare and evaluate systems for named entity identification, for example MUC (Message Understanding Conference) and, later, ACE (Automatic Content Extraction). More recently, HAREM (http://poloxldb.linguateca.pt/harem.php), the first contest for Portuguese, has appeared. The goal of this proposal is to develop a named entity recognition system for the Portuguese language and participate in the HAREM contest. A Case-Based Reasoning (CBR) approach is suggested. CBR can be defined as a way of reasoning based on past experiences, and from our point of view it can be applied successfully to this problem. CBR also enables the integration of other approaches, making it a good framework for solving this problem.

PhD Thesis Proposal: G2.12
Title: "Converting Text into UML Diagrams"
Keywords: Natural Language Processing, Case-Based Reasoning, Software Reuse, UML
Supervisor: Prof. Paulo Gomes (email@example.com)
Summary:
Language is the most common form of communication between humans, both written and spoken.
Software developers are no exception, with natural language text being an important part of software specification documents. In the last decade, software modeling languages such as UML have been developed and used in the specification of software systems. The main idea of this proposal is to develop an approach for the conversion of natural language software specifications into UML diagrams, both use cases and class diagrams. This work is to be integrated into ReBuilder (rebuilder.dei.uc.pt) – a software tool for reuse of UML diagrams.

PhD Thesis Proposal: G2.13
Title: “Reusing Software Design Patterns”
Keywords: Software Design Patterns, UML, Case-Based Reasoning, Software Design Reuse
Supervisor: Prof. Paulo Gomes (firstname.lastname@example.org)
Summary:
Software engineers and programmers deal with repeated problems and situations in the course of software design. Software design patterns were developed to deal with this type of situation, where the same abstract solution is used several times. Software design patterns can be defined as descriptions of abstract solutions for categories of design problems. One of the main advantages of patterns is design reusability. Another main advantage is that the application of design patterns improves software and makes its maintenance easier – design for change. Software design patterns are described in natural language and have no formalization. This is due to the abstract level of design patterns, which makes their application a human-dependent task. Existing approaches to pattern application using computer tools need the help and guidance of a human designer. This is especially true in the selection of the design pattern to apply: it is difficult to automate the identification of the context in which a pattern can be applied. Human designers must also identify which objects are involved in the pattern application.
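As a toy illustration of how pattern selection can be framed as case-based retrieval, consider matching boolean design-problem features against stored cases. The features, cases and similarity measure below are invented for illustration only; they are not the representation the proposal would use:

```python
# Toy case base: each case maps design-problem features to the pattern applied.
# Features and cases are invented for illustration only.
CASES = [
    ({"one_instance", "global_access"}, "Singleton"),
    ({"notify_dependents", "loose_coupling"}, "Observer"),
    ({"interchangeable_algorithms"}, "Strategy"),
]

def jaccard(a, b):
    """Set-overlap similarity between two feature sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def suggest_pattern(problem_features):
    """Retrieve the stored case whose features best match the new problem."""
    best = max(CASES, key=lambda case: jaccard(problem_features, case[0]))
    return best[1]

print(suggest_pattern({"notify_dependents", "gui_events"}))
```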
The automation of this task opens the possibility of CASE design tools providing complete automation in applying design patterns. This new functionality can help the software designer to improve systems and achieve better software reuse. Case-Based Reasoning can be defined as a way of reasoning based on past experiences. From our point of view, CBR can be applied successfully to the automation of software design patterns. The aim of this thesis proposal is to develop an approach that addresses this problem. The proposed CBR framework must be able to select which pattern to apply to a target design problem, generating a new design. It can also learn new cases from the application of design patterns. This work is to be integrated into ReBuilder (rebuilder.dei.uc.pt) – a software tool for reuse of UML diagrams.

PhD Thesis Proposal: G2.14
Title: “Adaptation and Reuse Mechanisms for UML Diagrams”
Keywords: Case-Based Reasoning, Ontologies, Software Reuse, UML
Supervisor: Prof. Paulo Gomes (email@example.com)
Summary:
Management and reuse of UML diagrams is an important aspect for any software development company. The productivity increase that can be obtained from effective reuse of software development knowledge is crucial for market survival. This proposal intends to develop new mechanisms for adaptation and reuse of UML diagrams. This work is to be integrated into ReBuilder (rebuilder.dei.uc.pt) – a software tool for reuse of UML diagrams. ReBuilder will work as a development platform into which the developed approaches will be integrated. This enables easy testing and experimentation of the developed approaches.

PhD Thesis Proposal: G2.15
Title: “Algorithms for Semantic Annotation of Positioning Information”
Keywords: locations, places, positioning systems, location based services
Supervisors: Prof. Francisco Câmara (firstname.lastname@example.org)
Prof.
Carlos Bento (email@example.com)
Summary:
Although today we find a myriad of positioning technologies (from the “common” GPS to wireless, GSM cell or Ultra-Wide-Band positioning algorithms), the interpretation of what exactly a position means is still cumbersome. For example, the information that “we are at latitude 4,234W and longitude 30,123N” or “my current GSM cell ID is 1098” is poor in terms of meaning for a user. Information such as “I am in Morocco”, “my current location is in Coimbra” or “I am at work” is clearly richer and more useful for a wealth of applications. This is known as the “From Position to Place” problem (Hightower, 2003) and is currently a hot topic in the Ubiquitous Computing area. The primary goal of this PhD project is to study and develop methodologies that can contribute to solving the problem just described. The expected approach will likely take into account the user model, context and social interaction. This work is one of the central topics of research of the Ubiquitous Systems Group of the AILab and has high potential applicability in a range of state-of-the-art ubiquitous systems.

PhD Thesis Proposal: G2.16
Title: “Improvement of Algorithms for Indoor Location Supported on GSM and WiFi Signatures”
Keywords: indoor location, GSM and WiFi signatures, location based services, ubiquitous computing
Supervisors: Prof. Francisco Câmara (firstname.lastname@example.org)
Prof. Carlos Bento (email@example.com)
Summary:
Information on location is a main concern for ubiquitous computing. Many applications of ubiquitous systems depend on location to achieve their goal. Although the problem of positioning a device or person outdoors is reasonably solved with GPS technologies, the problem of indoor location is much more challenging. Various technologies exist for indoor location, but in general they need a dedicated infrastructure.
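One infrastructure-light direction, sketched below as a toy, is to match the signal strengths a device currently observes against previously recorded, place-labelled fingerprints. The access points, dBm values and place names are all invented for illustration:

```python
# Toy WiFi fingerprinting: match observed signal strengths (dBm) against
# place-labelled recordings. Access points, values and places are invented.
FINGERPRINTS = [
    ({"ap1": -40, "ap2": -70}, "office"),
    ({"ap1": -80, "ap2": -45}, "lab"),
    ({"ap1": -60, "ap2": -60}, "corridor"),
]

def distance(obs, ref, missing=-100):
    """Euclidean distance in signal space; unseen APs count as very weak."""
    aps = set(obs) | set(ref)
    return sum((obs.get(a, missing) - ref.get(a, missing)) ** 2 for a in aps) ** 0.5

def locate(observation):
    """1-nearest-neighbour over stored signatures (a minimal CBR retrieval step)."""
    return min(FINGERPRINTS, key=lambda fp: distance(observation, fp[0]))[1]

print(locate({"ap1": -42, "ap2": -68}))
```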
Some researchers have followed a promising approach that consists of using the information provided by GSM and WiFi equipment on the level of the signals received from the base stations (the signature) to infer the current location of the equipment. Our group has consolidated research in this direction, using GSM and WiFi signatures to infer the current location with case-based reasoning techniques. The theme of this thesis focuses on improving these algorithms, in terms of precision and accuracy, following concurrent inference approaches, not necessarily restricted to case-based reasoning.

PhD Thesis Proposal: G2.17
Title: “Context Modelling from Data on Ubiquitous Computational Devices”
Keywords: context awareness, context models, inference of context models, proactivity, ubiquitous devices
Supervisors: Prof. Francisco Câmara (firstname.lastname@example.org)
Prof. Carlos Bento (email@example.com)
Summary:
The growing availability of a wide range of sensors and communication services (such as camera, GPS, GSM or wireless) in devices such as PDAs or mobile phones is opening a new world of possibilities in Ubiquitous Computing. The relatively new area of Context Awareness is dedicated to modelling this information as well as to the design of possible applications. In this PhD project, we expect to explore the predictability associated with the use of ubiquitous devices; in other words, we believe it is possible to build user models from context (and other) data and make inferences about future interactions from these models. This will bring intelligence and more ease of use to ubiquitous devices. This project is part of a central topic of research within the Ubiquitous Group of the AILab: Context Awareness.

PhD Thesis Proposal: G2.18
Title: “Emotion Expression with Music”
Keywords: Affective computing, AI and Music, Computational Creativity
Supervisor: Prof.
Amílcar Cardoso (firstname.lastname@example.org)
Summary:
Music assumes a central role in driving the emotional experience in a diverse range of situations, from computer games to cinema, theatre and many other artistic, entertainment and educational setups. Composing a soundtrack for such scenarios requires the ability to align the expected emotional effect of the music and sound effects with the intended emotional experience. The task is particularly hard when a predefined fixed script doesn’t exist and the course of action is decided dynamically, as in most computer games and other interactive applications. Roughly, soundtrack production includes the composition of a set of music sections and the collection of sound and music effects, which constitute inputs for a production phase where they are sequenced, mixed and blended. All these phases are typically conducted by the soundtrack composer, although for computer games several techniques exist to provide real-time sequencing solutions. The ultimate goal of this work is to investigate computational approaches to the problem of generating a soundtrack given a variable “emotion spectrum” as input. Several branches of the problem may be explored, depending on the student's background, including but not limited to: music classification and retrieval by emotional effect, generation of emotionally affecting music, and music alignment (e.g., according to rhythm, dynamics, or emotional effect).

G3: Simulation and Information Technologies in Education and Training
http://cisuc.dei.uc.pt/sitg/

PhD Thesis Proposal: G3.1
Title: “Alternative specification and visualization representations in initial programming learning”
Keywords: computer science education; programming learning; alternative representations.
Supervisor: Prof.
Maria José Marcelino (email@example.com)
Summary:
The main objective of this thesis is to study, propose and validate, on one hand, new alternative forms of representation for algorithm and program specification and, on the other, new alternative ways of algorithm and program visualization to support initial programming learning, and to evaluate their impact on the quality of students’ learning. Initial programming learning is quite hard for the majority of students. It is usually supported by one (or more) of three typical modes of algorithm/program representation: pseudo-code, flowcharts, and code in a specific programming language. Concerning algorithm/program visualization, several approaches have also been used: variable logs, debugging aids, and simulated algorithm/program animation. Each student has her/his own preferences among these representation and visualization metaphors. There are particular types of programming problems that are mandatory in initial programming learning and for which typical students’ solutions (good as well as erroneous) have been identified. We believe that, although final programs must be coded in one particular programming language, during initial learning stages many programming students could benefit from the study and implementation of diverse alternative solution representations and visualizations, especially if these are closer to students’ previous experience and context. In the scope of this thesis, students' preferred alternative representations, both at the level of algorithm and program specification and at the level of results visualization, will be identified and evaluated. Afterwards, new forms will be developed and proposed in order to cope with students' most commonly found difficulties that traditional approaches cannot deal with.
These new forms will afterwards be the object of thorough evaluation.

PhD Thesis Proposal: G3.2
Title: “Learning communities to support initial programming learning”
Keywords: computer science education; programming learning; learning communities.
Supervisor: Prof. António José Mendes (firstname.lastname@example.org)
Summary:
Initial programming learning is known to be a hard task for many novice students at college level, leading to high failure and drop-out rates in many courses. Many reasons can be found for this scenario, and several approaches have been proposed to facilitate students’ learning. However, problems continue to exist, and it is necessary to investigate new solutions that may help programming students and teachers. The learning communities concept has existed for some time. It has been presented as a way to create rich learning contexts where teachers, students and other people, namely experts, can coexist and collaborate in the production of knowledge, consequently leading to enhanced learning. This thesis proposal includes, first, the study of representative successful cases and characteristics of learning communities, and then the study, proposal and creation of a learning communities support platform specially adapted to the needs of students during programming learning. The platform and its utilization will undergo a full evaluation, in order to assess its success in promoting programming learning. It is expected that this platform will include innovative characteristics, for example the inclusion of virtual members that may interact with real members when necessary, and specially tailored features and tools that may improve the quality of programming learning.

PhD Thesis Proposal: G3.3
Title: “Problem solving patterns and remediation strategies in programming learning”
Keywords: computer science education; programming learning; learning communities.
Supervisors: Prof. António José Mendes (email@example.com)
Prof.
Maria José Marcelino (firstname.lastname@example.org)
Summary:
Initial programming learning is known to be a difficult task for many novice students at college level. In those courses it is common to use a set of typical problems to introduce students to basic programming concepts and also to stimulate them to develop their first programs and programming skills. This work is essential, since it should allow beginners to develop the basic programming problem-solving skills to be further developed and refined later. This first learning stage is therefore crucial to students’ performance in all programming-related courses. This thesis proposal includes a study of the different ways students approach these typical basic problems, leading to the identification of common problem-solving patterns. Some of these patterns will be adequate, while others will not lead to the development of correct solutions, being considered wrong or erroneous patterns that must be identified and corrected in the student's strategic knowledge. Based on this information, the thesis's main objective will be the proposal, implementation and evaluation of methods and/or tools that can identify novice students’ strategies, categorize typical wrong patterns and common errors, and interact with students, giving personalized remediation feedback when necessary. The forms of this feedback must also be studied, so that it becomes effective not only in helping students to solve the current problem, but mainly in helping them to develop better approaches that may lead to correct solutions in later problems and learning stages.

PhD Thesis Proposal: G3.4
Title: “Using Design Patterns in Modeling and Simulation”
Keywords: Model Reuse, Design Patterns, Parallel & Distributed Simulation
Supervisor: Prof.
Fernando Barros (email@example.com)
Summary:
Software Design Patterns (SDPs) are a widely used technique in software development. Time-based SDPs have been developed for building real-time software, and their use is a promising approach to building modeling and simulation software. This proposal intends to develop new SDPs to support the development of reusable simulation models and reusable simulation kernels able to deal with both conservative and optimistic parallel & distributed approaches.

PhD Thesis Proposal: G3.5
Title: “Modeling and Simulation of Adaptive Sampling Systems”
Keywords: Numerical Simulation, Sampled-Based Systems
Supervisor: Prof. Fernando Barros (firstname.lastname@example.org)
Summary:
Adaptive step-size numerical methods make it possible to improve simulation performance while yielding the same accuracy as fixed step-size methods. The use of asynchronous numerical solvers makes it possible to concentrate computing power on the most demanding models, enabling larger systems to be represented. The Digital Control and Digital Signal Processing areas are currently exploiting multirate sampling and adaptive sampling techniques as a more efficient alternative to conventional fixed-sampling-rate approaches. In this proposal, we intend to develop new algorithms based on adaptive sampling and apply them to numerical solvers, event detectors, and signal and control systems.

G4: Adaptive Computation
http://cisuc.dei.uc.pt/acg/

PhD Thesis Proposal: G4.1
Title: “Intelligent Data Mining in GRID Technology”
Keywords: Machine Learning techniques such as Neural Networks, Support Vector Machines and Clustering in data mining, prediction and recognition
Supervisor: Prof. Bernardete Ribeiro (email@example.com)
Summary:
Grid technology emerged from distributed computing with the goal of generating processing power to meet users’ needs. To increase computing power, computing resources are gathered across physical places.
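As a flavour of the simplest of the parallelization techniques relevant here, parallel parameter search, the toy sketch below farms out candidate parameters to concurrent workers and keeps the best score. The objective function is invented for illustration; a real grid deployment would distribute work across machines rather than threads:

```python
from concurrent.futures import ThreadPoolExecutor

def objective(c):
    # Invented "model quality" function that peaks at c = 3.
    return -(c - 3) ** 2

def parallel_search(candidates, workers=4):
    """Evaluate candidate parameters concurrently and return the best one."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        scores = list(pool.map(objective, candidates))
    best = max(range(len(candidates)), key=scores.__getitem__)
    return candidates[best]

print(parallel_search([0, 1, 2, 3, 4, 5]))
```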
The idea was to match unused computing cycles with the needs created by applications and other users. This notion is now a ubiquitous solution practised all around the world. It ensures continuous computing availability despite scheduled maintenance, power outages, and unexpected failures. The main idea of this research is to prepare selected data mining algorithms, preferably those originating in soft computing, to run in distributed environments such as clusters and global grids. Different techniques will be used, from architectural parallelization of the models and data parallelization of the models to parallel parameter-search methods for sequential models. An analysis will be carried out focusing on the suitability of the techniques for particular distributed environments, in combination with the implemented data mining models. The implementations of the methods on distributed computing resources will be tested on a wide variety of databases from industry, medicine and the Internet, in order to investigate their efficiency and robustness.

PhD Thesis Proposal: G4.2
Title: “Learning from Side-Information and From Heterogeneous Data Sources”
Keywords: Machine Learning techniques such as Neural Networks, Support Vector Machines and Clustering in data mining, prediction and recognition
Supervisor: Prof. Bernardete Ribeiro (firstname.lastname@example.org)
Summary:
In the last decade we have witnessed an impressive growth of on-line information, mostly available through the Web, intranets and other sources. It is estimated that information density doubles every 12 to 15 months, while the capacity for reading and analysing it remains constant. This is not only due to better technology that allows fast acquisition, but also to faster computer technology that allows exploiting the data in machine learning tasks.
It seems, however, that the growth of the data is more explosive than the boost in computing power, and this evolution is unlikely to change when Moore’s law saturates as the limits of electronics are approached. Then only algorithmics can make further speed-ups possible. Along with the explosive growth of data availability, an increasing diversity of data types can be observed. The questions of how to deal with this heterogeneity and how to weight the importance of different sources of data and information remain to be solved. Learning from heterogeneous sources of data, and learning in semi-supervised settings, is the main theme of this research. Applications of these settings abound: in the broad domain of bioinformatics (mainly learning from heterogeneous information), in machine vision (mainly learning from side-information, for image and video segmentation), in many generic classification problems in the field of text mining and information extraction, and in many others.

PhD Thesis Proposal: G4.3
Title: “Assigning Confidence Score in Page Ranking for Intelligent Web Search”
Keywords: Ranking, Text Mining, Machine Learning
Supervisor: Prof. Bernardete Ribeiro (email@example.com)
Summary:
The Web has become the main centre of research around the globe. Users face an overload of data when a simple search is fed into Google or a similar web search engine. A recurrent problem is to unveil the desired information from the wealth of available search results. Ranking, which can be achieved by providing a meaningful score for each classification decision, is important in most practical settings. Text retrieval systems typically produce a ranking of documents and let the user decide how far down that ranking to go. Several machine learning techniques allow the definition of scores or confidences coupled with their classification decisions.
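To make the score-then-rank step concrete, here is a minimal sketch that scores documents against a query by cosine similarity of term-frequency vectors and sorts by score. The documents and query are invented; a system of the kind proposed would use a learned (e.g. Bayesian) scoring model instead:

```python
import math
from collections import Counter

# Toy collection: documents and query are invented for illustration.
DOCS = {
    "d1": "grid computing with distributed resources",
    "d2": "ranking documents for web search",
    "d3": "search engines rank web documents by relevance",
}

def tf(text):
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def ranked(query):
    """Return (doc_id, confidence score) pairs, best first."""
    q = tf(query)
    scores = {d: cosine(q, tf(text)) for d, text in DOCS.items()}
    return sorted(scores.items(), key=lambda kv: -kv[1])

print(ranked("web search ranking")[0][0])
```

The per-document score is exactly the kind of confidence value a user (or a downstream system) can use to decide how far down the ranking to go.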
The main idea of the current proposal is to explore ranking systems based on Bayesian approaches to the web learning problem that allow a refinement of existing systems. Moreover, classification systems can be improved by enriching information and information representation with external background information, such as ontology-related data. Evaluation can be done on benchmarks, but also with real users defining the goals and assessing the final results, including score changes in the final ranking.

PhD Thesis Proposal: G4.4
Title: “Homecare Diagnosis of Pediatric Obstructive Sleep Apnea”
Keywords: homecare; obstructive sleep apnea; reduction of complexity; biosignals processing; computational intelligence; automatic diagnosis.
Supervisor: Prof. Jorge Henriques (firstname.lastname@example.org)
Summary:
The main goal of this work is to investigate homecare solutions that could stratify normal and apnea events for diagnostic purposes in children suspected of obstructive sleep apnea syndrome. Obstructive sleep apnea syndrome (OSAS) is a condition whereby recurrent episodes of airway obstruction are associated with asphyxia and arousal from sleep. It is estimated to affect between 1 and 3% of young children, and its potential consequences include excessive daytime somnolence, behavioral disturbances and learning deficits, pulmonary and systemic hypertension, and growth impairment. The currently accepted method for diagnosis of OSAS is overnight polysomnography (PSG), done in sleep laboratories, where multiple signals are collected by means of a face mask, scalp electrodes, chest bands, etc. It monitors different activities, including brain waves (EEG), eye movement (EOG), muscle activity (EMG), heartbeat (ECG), blood oxygen levels and respiration. However, the diagnosis of OSAS from this huge collection of data is sometimes not straightforward for clinicians, since the major relations between features and consequents are most often very high-dimensional, non-linear and complex.
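For a sense of the simplest biosignal-processing building block involved, consider flagging oxygen-desaturation runs in an SpO2 series. This is a toy: the baseline, drop threshold and sample values below are invented, and real screening criteria are clinically validated and far more elaborate:

```python
# Toy desaturation detector: flag runs where SpO2 stays at least `drop`
# percentage points below an assumed baseline. All numbers are invented.
def desaturation_events(spo2, drop=4, baseline=97):
    events, start = [], None
    for i, v in enumerate(spo2):
        if v <= baseline - drop and start is None:
            start = i                      # a desaturation run begins
        elif v > baseline - drop and start is not None:
            events.append((start, i - 1))  # the run ended at the previous sample
            start = None
    if start is not None:
        events.append((start, len(spo2) - 1))
    return events

print(desaturation_events([97, 96, 92, 90, 91, 96, 97, 93, 92, 97]))
```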
These requirements impose the need for innovative signal processing techniques and computationally intelligent data interpretation methodologies, such as neural networks and fuzzy systems. One of the main goals of this work is to provide clinicians with tools that can help them in their diagnosis. Although PSG is considered the gold standard for the diagnosis of OSAS, given the relatively high medical costs associated with such tests and the insufficient number of pediatric sleep laboratories, PSG is not readily accessible to children in all geographic areas. Thus, the validity of alternative diagnostic approaches should be analysed, even assuming their accuracy is suboptimal. The second goal of this work points in this direction. It aims to investigate the viability of reducing the number and complexity of measurements, in order to make the stratification of OSAS possible in the children's natural environment.

PhD Thesis Proposal: G4.5
Title: “Architectures and algorithms for real-time learning in interpretable neuro-fuzzy systems”
Keywords: on-line learning; neuro-fuzzy systems; interpretability; machine learning
Supervisor: Prof. António Dourado (email@example.com)
Summary:
The development of fuzzy rules for knowledge extraction from data acquired in real time needs new recursive clustering techniques to produce well-designed fuzzy systems. For Takagi–Sugeno–Kang (TSK) systems this applies mainly to the antecedents, while for Mamdani-type systems it applies to both the antecedent and consequent fuzzy sets. To improve the a-posteriori interpretability of the fuzzy rules, so that some semantics may be deduced from them, pruning techniques should be developed to allow a human-interpretable labelling of the fuzzy sets in the antecedents and consequents of the rules. For this purpose, convenient similarity measures between fuzzy sets and techniques for merging fuzzy rules should be developed and applied.
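One classical similarity measure of the kind referred to here is the fuzzy Jaccard index, computed from the pointwise min (intersection) and max (union) of membership grades; it is a standard choice, not necessarily the one the thesis would adopt, and the sampled membership functions below are invented:

```python
# Fuzzy Jaccard similarity between two fuzzy sets sampled on the same
# universe of discourse: sum(min) / sum(max) of membership grades.
def fuzzy_jaccard(mu_a, mu_b):
    num = sum(min(a, b) for a, b in zip(mu_a, mu_b))
    den = sum(max(a, b) for a, b in zip(mu_a, mu_b))
    return num / den if den else 1.0

# Membership functions sampled at 5 points (values invented for illustration).
low    = [1.0, 0.5, 0.0, 0.0, 0.0]
lowish = [0.8, 0.6, 0.2, 0.0, 0.0]
high   = [0.0, 0.0, 0.0, 0.5, 1.0]

print(fuzzy_jaccard(low, lowish))  # strongly overlapping: candidates for merging
print(fuzzy_jaccard(low, high))    # disjoint sets score zero
```

Rule merging can then be driven by a threshold on this score: antecedent sets whose similarity exceeds it are merged and given a single linguistic label.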
The applications envisaged are in industrial processes and medical fields.

PhD Thesis Proposal: G4.6
Title: “Intelligent Monitoring of Industrial Processes with application to a Refinery”
Keywords: intelligent process monitoring; multidimensional scaling; computational intelligence; clustering
Supervisor: Prof. António Dourado (firstname.lastname@example.org)
Summary:
High-dimensional data in industrial complexes can be profitably used for advanced process monitoring if it is reduced to a dimension where human interpretability is easily achieved. Multidimensional scaling may be used to reduce it to two or three dimensions if appropriate measures of similarity/dissimilarity are developed. These measures express the distance between attributes, the essence of the information, and a similar difference should be guaranteed in the reduced space in order to preserve the informative content of the data. Research on appropriate measures and reduction methods is needed. In the reduced space, classification of the actual operating point should be done through appropriate recursive clustering and pattern recognition techniques. The classification is intended to clearly show the quality level of the actual and past operating points, in such a way that the human operator finds in it a useful decision support system for the daily operation of the plant. The work will be applied to the visbreaker process at the Galp Sines Refinery.

G5: Dependable Systems
http://cisuc.dei.uc.pt/dsg/

PhD Thesis Proposal: G5.1
Title: “Grid Computing Support for Distributed Unreliable Networks”
Keywords: Grid computing, parallel programming, distributed computing, desktop grids
Supervisor: Prof. Paulo Marques (email@example.com)
Summary:
Nowadays there is a huge need for parallel computing. Researchers in areas like biology, physics and computer networks, among others, need to perform a huge number of computer experiments in order to gather results. At the same time, there is an increasing need for on-demand experiments (e.g.
“I want to know this result now!”). Researchers want to quickly run an experiment, which may involve thousands of calculations, in order to know how to set “yet another parameter” or which part of a search space to explore. They are not willing to wait days or weeks to get simple answers that merely guide the direction of their research – they want them at the touch of a button. Although many frameworks for parallel computing exist (e.g. MPI, PVM, OpenMP), they are typically designed for cluster computing. This raises a problem, because not all researchers have a readily available cluster for performing experiments. Even when they do have access to a cluster, in most cases the size of the cluster is small relative to the number of researchers wanting to use the resources, which increases the turn-around time for running experiments. At the same time, cluster environments are not very compatible with researchers' increasing requests for on-demand computing. The alternative is to try to run scientific applications on non-dedicated machines, also known as desktop grids (e.g. classroom/office PCs), which abound in organizations. But in that case the use of MPI and similar frameworks is very inappropriate (e.g. the fault model of MPI implies that if one process crashes, the whole computation dies – which is incompatible with the unreliability of those computers!). Although many frameworks have been developed for running parallel applications on non-dedicated machines (e.g. Condor, BOINC, Alchemi, etc.), in order to cope with the unreliability of the machines they typically don’t allow communication between nodes, and the computations are based on bounded work units assigned to the nodes. For instance, using most of these frameworks, it is quite difficult to write grid-based algorithms (e.g. for calculating the air flow over an airplane wing) or a global back-tracking algorithm (e.g.
for optimizing a path through a network). Finally, researchers in areas other than computer science are now coming to terms with more modern programming environments and easier-to-use computer languages. For instance, Python and Numeric Python, as well as Matlab and Mathematica, are quite popular with biologists, physicists and even social scientists. It is now time to move parallel programming beyond C and Fortran. Clearly, research is needed on programming models and infrastructures that allow parallel programs to run on unreliable distributed networks and, at the same time, allow them to be written in modern, easy-to-use and productive computer languages.
Proposal
This PhD dissertation will consist in investigating, implementing and evaluating a programming model that allows parallel programs to be easily written and reliably executed on desktop grids. It is also a specific objective of the thesis to deviate from the traditional message-passing paradigm and from the remote method invocation models of creating distributed applications. The framework to be developed will address questions such as (but not necessarily all of): a) programming easiness; b) global reliability of the computation in the presence of failures; c) access to stable storage for reading and writing results; d) security; e) deployment and monitoring; f) visualization. In this context, some technologies will be interesting to consider and explore: distributed hash tables; P2P routing and service discovery; erasure codes; distributed map/reduce programming models; self-organization; consensus; group membership and election algorithms; and mobile code, among others.

PhD Thesis Proposal: G5.2
Title: “Sensor Networks for Space Exploration”
Keywords: Sensor networks, Ad-hoc networking, Space exploration
Supervisor: Prof. Paulo Marques (firstname.lastname@example.org)
Summary:
Sensor networks are currently a hot topic in distributed systems research.
A sensor network consists of tens or hundreds of inexpensive sensing devices, typically not much larger than a coin, that are able to gather information from the environment, coordinate among themselves, and relay that information to a remote location. This type of system has a huge number of application areas, such as earth observation, environment monitoring, security and medical healthcare, among others. Typical deployment scenarios include scattering devices through a forest to detect the start of wildfires; placing devices throughout a river basin to detect pollutant dumping; or even equipping cars with such devices to detect and warn about immediate collision danger. One extremely interesting application scenario for sensor networks is space exploration. It is quite easy to imagine that deploying hundreds of sensors over some kilometers while a space probe is descending to Mars can be quite beneficial. Instead of being limited to gathering data where the spacecraft lands or where its rovers can move, truly distributed data acquisition can take place. The same applies, for instance, in orbit, for gathering data and performing distributed experiments in Earth’s upper atmosphere, or even for gathering data from orbiting probes (the current largest European satellite is the size of a TIR truck!). Using sensor networks in space presents unique and challenging problems. Space is quite a harsh place: radiation abounds, causing software and hardware failures; temperature is typically well below zero; and electromagnetic interference makes radio links quite unreliable. Since these devices are normally small, cheap and disposable, they are typically quite limited in terms of computational power, energy and transmission bandwidth.
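One of the cheapest reliability mechanisms such constrained nodes can afford, sketched below as a toy, is majority voting over redundant neighbour readings: a minority of faulty (even arbitrarily wrong) sensors is outvoted, at the cost of extra communication and energy. The readings are invented, and real sensor fusion is considerably richer:

```python
from collections import Counter

# Toy fault masking: take the strict majority of redundant quantized readings,
# so a minority of faulty sensors cannot corrupt the result.
def majority_reading(readings):
    value, count = Counter(readings).most_common(1)[0]
    if count <= len(readings) // 2:
        raise ValueError("no majority - cannot mask the faults")
    return value

# Five neighbours report a quantized temperature; two nodes are faulty.
print(majority_reading([-63, -63, 120, -63, -40]))
```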
This makes engineering this type of network for reliability difficult, especially for space applications.
Proposal
This PhD dissertation will consist in investigating, implementing and evaluating algorithms for fault tolerance in sensor networks operating in harsh, Byzantine environments, for space exploration. In particular, the thesis will focus on exploring the spectrum of possibilities for achieving different degrees of reliability in computationally limited devices, when this reliability comes at the cost of spending energy and having to communicate with other sensing nodes. Currently, it is envisioned that this work will be carried out in the context of a research project where other partners will develop the sensing platform hardware and also provide a realistic context for fault injection and testing, possibly in collaboration with the European Space Agency (to be confirmed).

PhD Thesis Proposal: G5.3
Title: “Self-Healing Techniques for Application Servers”
Keywords: Autonomic computing, self-healing, software aging, rejuvenation, dependability.
Supervisor: Prof. Luís Moura e Silva (email@example.com)
Summary:
One of the current big challenges of the computer industry is to deal with the complexity of systems. The Autonomic Computing initiative driven by IBM defined the following functional areas as the cornerstone of an autonomic system: self-configuration, self-healing, self-optimization and self-protection. The self-healing property refers to the automatic prediction and discovery of potential failures and their automatic correction, to possibly avoid downtime of the computer system. This leads to the vision of “computers that heal themselves” and do not depend so much on a system manager to take care of them. While there has been some interesting work on self-healing techniques for mission-critical systems, there is a long way to go to achieve that goal in commercial off-the-shelf (COTS) servers running Apache/Linux, Tomcat, JBoss or Microsoft .NET.
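A minimal sketch of the aging-detection idea: fit a linear trend to a monitored resource metric and forecast when it crosses an exhaustion threshold, so rejuvenation can be triggered before the crash. The metric, samples and limit below are invented, and real detectors use considerably more robust statistics:

```python
# Toy software-aging detector: least-squares trend on a memory-usage series
# (MB per sampling interval), forecasting when an exhaustion limit is reached.
def time_to_exhaustion(samples, limit):
    n = len(samples)
    xs = range(n)
    mean_x, mean_y = (n - 1) / 2, sum(samples) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, samples)) / sum(
        (x - mean_x) ** 2 for x in xs
    )
    if slope <= 0:
        return None  # no upward trend: no aging detected
    intercept = mean_y - slope * mean_x
    return (limit - intercept) / slope  # interval index at which the limit is hit

usage = [100, 110, 120, 130, 140]  # leaking roughly 10 MB per interval
print(time_to_exhaustion(usage, limit=200))  # schedule rejuvenation before this
```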
The purpose of this PhD is to study and propose low-cost and highly effective self-healing techniques for these application servers.

One of the potential causes of failures in 24x7 server systems is the occurrence of software aging. This phenomenon should be studied in detail, together with high-level techniques for application-level failure detection. Mathematical techniques should be applied to detect software aging and to forecast the potential time of failure of the server system. When aging is detected, the server system should proactively apply a software rejuvenation technique to avoid the potential crash and keep the service up and running. Techniques for micro-rejuvenation should be further studied to avoid downtime of the server. The final result of this PhD should be a set of software artifacts and the refinement of data-analysis techniques to apply in COTS application servers, in order to predict failures and software aging in advance and to apply some corrective action to avoid a server crash.

Proposal
This PhD will comprise the following initial tasks: (1) state of the art on autonomic systems, self-healing, software aging, software rejuvenation, micro-rebooting and dependability benchmarking; (2) machine-learning techniques to forecast failures and software aging; (3) application-level techniques for failure prediction and early detection; (4) micro-rejuvenation techniques for application servers; (5) extension of the techniques to SOA-based and N-tier applications; (6) dependability benchmarking; (7) implementation of an experimental framework; (8) analysis of experimental results.

PhD Thesis Proposal: G5.4
Title: “Dependable and Self-Managed VoIP Infrastructures”
Keywords: Autonomic computing, Self-healing, Dependability, Peer-to-Peer, VoIP, QoS.
Supervisor: Prof.
Luís Moura e Silva (firstname.lastname@example.org)
Summary:
Peer-to-peer techniques have been widely applied to decentralized file sharing on the Internet, distributed computing, content distribution, and to support applications like Voice-over-IP. The most popular example is Skype, which is based on a supernode-based P2P network. Since in those P2P networks some of the server-based services can be provided by the client machines, there is a mandatory need to provide the P2P infrastructure with techniques for self-configuration, self-healing and self-management, in the line of autonomic computing systems.

The goal of this thesis is to devise and study software techniques to enhance the dependability and autonomic-computing capabilities of VoIP infrastructures. In a first step, the goal is to provide fault-tolerance mechanisms for VoIP servers: studying the reliability of staged-event servers, the use of software fail-over mechanisms, failure analysis and prediction to anticipate the occurrence of software aging inside a server, the use of micro-rebooting of service components, and complementary techniques to enhance the self-healing capabilities of a VoIP server. In the second phase of the thesis, it would be interesting to study the use of high-level reconfiguration techniques, which will require architectural changes in the VoIP infrastructure and some reconfigurable use of the SIP/RTP protocols, with the ultimate goal of providing higher QoS and transparency of failures to the end-user of the application.
The result should be a highly reliable P2P infrastructure that can be used to enhance the QoS of VoIP applications like Skype.

Proposal
This PhD will comprise the following initial tasks: (1) state of the art on peer-to-peer networks, VoIP infrastructures, STUN/TURN servers, the SIP and RTP protocols, self-healing techniques, staged-event servers, load balancing and server reconfiguration; (2) application-level techniques for failure prediction and early detection; (3) prediction of server availability; (4) software techniques for load balancing (Wackamole project); (5) study of self-healing techniques for staged-event servers (SEDA project); (6) micro-rebooting techniques for VoIP servers; (7) reconfiguration techniques for VoIP servers; (8) support for protocol reconfiguration; (9) construction of the framework and results; (10) data analysis.

PhD Thesis Proposal: G5.5
Title: “Wired self-emerging ad hoc network”
Keywords: peer-to-peer, ad hoc, distributed hash table.
Supervisor: Prof. Filipe Araújo (email@example.com)
Summary:
In recent years, computer communication has been departing from the client-server architecture and moving increasingly toward a peer-to-peer architecture. One aspect that characterizes this kind of interaction is the opportunistic participation of many of the peers: they connect to the network for only a few moments, just to discover and download (or not) what they are looking for, and then they disconnect. Interestingly, mobility and battery exhaustion can reproduce this same trend in wireless ad hoc networks, comprised of devices that use radio broadcast to communicate.

While wired peer-to-peer and wireless ad hoc networks share a number of common features, like self-configuration and decentralized, fault-tolerant operation, they have an important difference: wired peer-to-peer networks run as overlay networks on top of the IP infrastructure.
This raises the following question: can we take the paradigms from wireless networks and create IP-less, self-organizing wired networks? Our goal is to plug in and out new devices, or even entire networks, from the wired infrastructure, in a scalable and decentralized way and without the need for any a priori configuration. In contrast, current IP networks can only scale because they are highly hierarchical and require a considerable amount of human assistance. As a consequence, they are often highly congested, expensive to maintain and unreliable.

The fundamental difference between the solution we seek and wireless ad hoc networks has to do with available bandwidth. In fact, the most important constraint that makes the collection of routing information so challenging, and that limits the pace of change of topology in wireless ad hoc networks, is the (lack of) available bandwidth. Available bandwidth is a very scarce resource, because it is shared among all the nodes. This makes it theoretically impossible to create a wireless ad hoc network that scales with the number of nodes. As a consequence, algorithms for wireless ad hoc networks are often localized or have, at most, very limited information about distant regions of the network. This is very unlike the situation in wired networks: for the same pace of topological change, the supply of bandwidth is not shared and is much larger.
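The shared-medium scaling limit can be made concrete with the well-known Gupta-Kumar result, which bounds per-node throughput in a random wireless ad hoc network at roughly W/sqrt(n log n) for n nodes sharing W bits/s of radio bandwidth. The back-of-envelope sketch below is purely illustrative (the 11 Mbit/s figure and the unit constant factor are assumptions, not measurements):

```python
# Back-of-envelope illustration of the Gupta-Kumar per-node throughput bound
# for random wireless ad hoc networks: order of W / sqrt(n * log n).
# The shared bandwidth W and the constant factor are illustrative assumptions.
import math

def per_node_throughput(n: int, shared_bandwidth: float = 11e6) -> float:
    """Order-of-magnitude per-node throughput (bits/s) for n nodes."""
    return shared_bandwidth / math.sqrt(n * math.log(n))

# Per-node capacity shrinks as the network grows, so the network cannot
# scale with the number of nodes, unlike a wired network, where links
# are not a shared medium.
```

With these assumptions, 10 nodes would each see on the order of 2 Mbit/s, but 1000 nodes only on the order of 130 kbit/s, which is why wireless ad hoc algorithms must stay localized.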
This larger supply of bandwidth paves the way for better and more powerful solutions, which, we believe, are largely unexplored in the literature.

Proposal:
This PhD work encompasses the following tasks: (a) review of the state of the art; (b) design of the architecture; (c) evaluation of the scalability of the architecture (admissible number of nodes and topological changes versus available bandwidth); (d) exact and range-based lookup algorithms that leverage previous work on distributed hash tables and peer-to-peer file-sharing applications; (e) design of an interconnection infrastructure to connect islands of wired ad hoc networks with the IP network.

PhD Thesis Proposal: G5.6
Title: “Fast Moving Wireless Ad Hoc Nodes”
Keywords: peer-to-peer, wireless ad hoc, wireless infrastructured, Wireless Access for the Vehicular Environment (WAVE).
Supervisors: Prof. Filipe Araújo (firstname.lastname@example.org)
Prof. Luís Moura e Silva (email@example.com)
Summary:
In recent years we have witnessed an increasing interest in wireless networks. While most current applications seem to be set for sensor networks, we can foresee many other applications for mobile ad hoc or mixed ad hoc/infrastructured networks, where nodes are mainly mobile and communication goes beyond the simple data gathering of a sensor network. For instance, applications can enhance the behavior of a crowd by providing additional services to users holding mobile wireless devices, like searching for a given person that is momentarily lost, searching for a person that matches some social interests, or exchanging diverse information, e.g., about a product. Another extremely promising context is that of a spontaneous network formed by cars on a road, enriched with some infrastructure that is able to provide traffic, weather and other information to drivers.
By letting cars share their information, it may be possible to save significant costs in the infrastructure and still considerably improve the quality and quantity of information.

In this PhD work we want to leverage some existing routing algorithms for wireless ad hoc networks and make them work in particular environments with specific patterns of mobility. Interestingly, in networks with a high degree of mobility it is often possible to increase the speed of the flow of information, because mobility creates more opportunities to exchange this information. In particular, we want to consider a scenario where the network is comprised of fast-moving cars equipped with IEEE 802.11p network adapters (Wireless Access for the Vehicular Environment – WAVE). This is a case where part of the information is created and sent to some points of the infrastructure through a chain of nodes, while at the same time cars can also introduce new information into the network, for instance by signaling their presence to cars in front, in the rear, or traveling in the opposite direction. In particular, the information shared first with cars going in the opposite direction, and then with the base stations located along the road, is of paramount utility, as it has the potential to propagate very accurate data about traffic jams or accidents at virtually no cost.
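The opposite-direction dissemination pattern described above is, in essence, opportunistic epidemic spreading: each radio contact between two cars is a chance to hand over messages the other side has not yet seen. A toy sketch (node names and contact events are hypothetical, for illustration only):

```python
# Toy epidemic dissemination sketch: a message spreads whenever two nodes
# (cars, or a car and a base station) come into radio contact; contact
# events and node names are hypothetical, for illustration only.

def spread(contacts, origin):
    """Return the set of nodes holding the message after a time-ordered
    sequence of pairwise contacts, starting from the originating node."""
    holders = {origin}
    for a, b in contacts:                 # each contact is an exchange opportunity
        if a in holders or b in holders:  # either side may forward the message
            holders.update((a, b))
    return holders

# A car heading the other way ("car_b") picks up an accident alert from
# "car_a" and later relays it to a roadside base station.
contacts = [("car_a", "car_b"), ("car_b", "base_station")]
reached = spread(contacts, origin="car_a")
```

Note that contact order matters: a contact that happens before a node holds the message forwards nothing, which is exactly why high mobility (more contacts per unit time) speeds up the flow of information.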
We expect to use similar principles in more complex but slower-moving networks comprised of people with handheld or other wireless devices walking in crowds.

Proposal:
This PhD work encompasses the following tasks: (a) review of the state of the art; (b) design of routing algorithms for environments with high mobility; (c) design of information-sharing applications for environments with high mobility; (d) simulation of real environments.

PhD Thesis Proposal: G5.7
Title: “Detecting Software Aging in Database Servers”
Keywords: Software aging, software rejuvenation, autonomic computing, database management systems, dependability benchmarking
Supervisors: Prof. Marco Vieira (firstname.lastname@example.org)
Prof. Luís Moura e Silva (email@example.com)
Summary:
One of the main problems in software systems of some complexity is software aging, a phenomenon observed in long-running applications where the execution of the software degrades over time, leading to expensive hangs and/or crash failures. Software aging is not only a problem for desktop operating systems: it has been observed in telecommunication systems, web servers, enterprise clusters, OLTP systems, spacecraft systems and safety-critical systems.

Software aging happens due to the exhaustion of system resources, caused by memory leaks, unreleased locks, non-terminated threads, shared-memory pool latching, storage fragmentation, data corruption and the accumulation of numerical errors. There are several commercial tools that help to identify some sources of memory leaks in the software during the development phase. However, not all faults can be avoided, and those tools cannot work on third-party software modules when there is no access to the source code. This means that existing production systems have to deal with the problem of software aging.

The natural procedure to combat software aging is to apply the well-known technique of software rejuvenation.
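A minimal sketch of what aging detection and forecasting can look like (the monitored metric, the sampling model and the data are hypothetical assumptions, not the proposal's method): fit a linear trend to samples of a shrinking resource, such as free memory, and extrapolate when it will be exhausted, so that rejuvenation can be scheduled before the crash:

```python
# Minimal software-aging detection sketch: fit a linear trend to samples of
# free memory and, if it is steadily shrinking, forecast the time to
# exhaustion. Metric choice and data are hypothetical, for illustration.

def fit_trend(samples):
    """Least-squares slope and intercept of y over x = 0, 1, 2, ..."""
    n = len(samples)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(samples) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, samples))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

def steps_to_exhaustion(free_mem_samples):
    """Forecast how many more sampling intervals until free memory hits zero;
    returns None when no downward (aging) trend is detected."""
    slope, intercept = fit_trend(free_mem_samples)
    if slope >= 0:
        return None                       # resource not shrinking: no aging seen
    last = len(free_mem_samples) - 1
    current_fit = intercept + slope * last
    return current_fit / -slope           # steps until the fitted line reaches 0
```

Real detectors would use the richer time-series, machine-learning or neural-network techniques the proposals mention, but the shape of the problem is the same: estimate the degradation trend, then act before the forecast failure time.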
Basically, there are two rejuvenation policies: time-based and prediction-based rejuvenation. The first applies a rejuvenation action periodically, while the second makes use of predictive techniques to forecast the occurrence of software aging and applies the rejuvenation action only when necessary.

The goal of this PhD thesis is to study the phenomenon of software aging in commercial database engines, and to devise and implement techniques to collect vital information from the engine and forecast the occurrence of aging or potential anomalies. With this knowledge, the database engine can apply a controlled rejuvenation action to avoid a crash or a partial failure of the system. The ultimate goal is to improve the autonomic-computing capabilities of a database engine, mainly when subjected to high workload and stress-load from the client applications.

Proposal:
The PhD work will comprise the following initial tasks: (a) overview of the state of the art on software aging, rejuvenation and dependability benchmarking; (b) development of a tool for dependability benchmarking of database engines; (c) development of a workload and stress-load tool for databases; (d) infrastructure of probes (using Ganglia) to collect vital information from a database engine; (e) development of mathematical techniques to forecast the occurrence of software aging (time-series analysis, data mining, machine learning, neural networks); (f) experimental study and analysis of results; (g) adaptation of rejuvenation techniques for database engines; (h) writing of papers.

PhD Thesis Proposal: G5.8
Title: “Security benchmarking of COTS components”
Keywords: Software reliability, Security benchmarking, Experimental evaluation, Dependability benchmarking
Supervisors: Prof. Henrique Madeira (firstname.lastname@example.org)
Prof.
João Durães (email@example.com)
Summary:
One of the main problems in software systems is vulnerability to malicious attacks. Complex systems, and systems that have a high degree of interaction with other systems or users, are more prone to be successfully attacked. The consequences of a successful attack are potentially very severe, and may include the theft of mission-critical information and trade secrets. Given the pervasive nature of software systems in modern society, security, and testing for vulnerability to attacks, is an important research area.

The vulnerability of software systems is caused by several factors. Two of these factors are the integration of third-party off-the-shelf components to build larger components, and bad programming practices. The integration of third-party general-purpose components may introduce vulnerabilities in the larger system, due to interface mismatches between the components that may be exploited for attacks. Bad programming practices may lead to weaknesses that can be exploited by tailored user inputs. Testing a system or component against malicious attacks is a difficult problem and is currently an open research area. Testing for vulnerabilities to malicious attacks cannot be performed as traditional testing, because there is no previous knowledge about the nature of the attacks. However, these attacks follow a logic based on exploiting possible weaknesses inside the software, and this logic can be used to forecast the existence of vulnerabilities.

The goal of this PhD thesis is to study the phenomenon of attacks on software systems and devise a methodology to assess the vulnerability to these attacks. This includes the proposal of experimental techniques to test systems and components following the logic of dependability benchmarking and experimental evaluation. It is expected that, at the conclusion of the thesis, there will be a case study with practical results of security assessment and vulnerability forecasting for comparison purposes.
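Robustness testing of the kind suggested can be sketched as systematically feeding malformed and boundary inputs to a component's interface and recording which ones make it misbehave. In the sketch below, both the input corpus and the component under test are hypothetical stand-ins, not artifacts of the proposal:

```python
# Minimal robustness-testing sketch: drive a component's interface with
# malformed/boundary inputs and record which ones crash it.
# The target component and the input corpus are hypothetical examples.

MALFORMED_INPUTS = ["", "0" * 10_000, "\x00\x01", "'; DROP TABLE users; --", None]

def robustness_test(target, inputs=MALFORMED_INPUTS):
    """Call target on each input; return a list of (input, exception-name) pairs."""
    failures = []
    for value in inputs:
        try:
            target(value)
        except Exception as exc:   # any unhandled exception is a robustness failure
            failures.append((value, type(exc).__name__))
    return failures

# Hypothetical component under test: a naive parser that assumes a
# well-formed "key=value" string and crashes on most malformed inputs.
def naive_parse(text):
    key, value = text.split("=")
    return {key: value}
```

Against a black-box COTS component, the same loop would drive the component's public interface instead of a local function, which is exactly where fault injection and interface-level robustness testing become enabling techniques.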
Web servers are suggested as one of the possible case studies. Fault-injection techniques and robustness-testing techniques should be considered as enabling techniques for the purposes of the thesis.

Proposal:
The PhD work will comprise the following initial tasks:
(a) overview of the state of the art on software security, software vulnerabilities, software defects, validation methods, robustness testing and dependability benchmarking;
(b) development of methods and tools for the identification of patterns related to vulnerabilities and the automated testing of possible vulnerabilities (case studies include web servers);
(c) proposal of generic test methodologies for the evaluation of software vulnerability to malicious attacks, based on software defects and program-pattern analysis, for system-comparison purposes;
(d) proposal of formal methodologies for the experimental assessment of security and vulnerability forecasting on third-party (black-box) software components;
(e) development of an experimental infrastructure of tools for the practical demonstration of the above on real systems (case studies include web servers);
(f) experimental study and analysis of results;
(g) writing of papers.

Target conferences to publish papers:
- Dependable Systems and Networks (DSN)
- International Conference on COTS-Based Software Systems (ICCBSS)
- International Conference on Computer Safety, Reliability and Security (SAFECOMP)
- International Symposium on Software Reliability Engineering (ISSRE)

G6: Communications and Telematics
http://cisuc.dei.uc.pt/lct/

PhD Thesis Proposal: G6.1
Title: “Security in Wireless Sensor Networks”
Keywords: Sensor Networks, Security, Mobility
Supervisor: Prof. Jorge Sá Silva (firstname.lastname@example.org)
Summary:
Although in recent years we have witnessed an increase in the processing capabilities and bandwidth of communication systems, several researchers consider that, in the near future, an inversion of trends will occur.
These new computational systems will not consist of devices with higher processing power, but simply of networks of sensors. Wireless Sensor Networks (WSNs) are composed of a high number of nodes, each one equipped with a microprocessor, low memory and a basic communication system.

The integration of WSNs in the Internet will revolutionize several concepts and will require new paradigms. Recently, a new group was created at the IETF, 6LoWPAN, which is responsible for producing problem statements, assumptions and goals for network elements with restricted requirements, such as limited power, in which WSNs can be included.

However, security mechanisms for these networks are scarce and inefficient. The research work of this PhD program will comprise the study and proposal of new models for the security of these new Internet elements. This is particularly important in mobile environments, as the research community has generally assumed sensors to be static nodes. The new WSN protocols should consider security issues to protect against eavesdropping and malicious behavior.

PhD Thesis Proposal: G6.2
Title: “Multicast in Next Generation Networks”
Keywords: Multicast, 4G Networks, Mobility
Supervisor: Prof. Jorge Sá Silva (email@example.com)
Summary:
Multicast communication in the Internet has deserved increasing attention in the last few years. Nowadays, more and more applications require communication systems with multipoint communication capabilities. Multicast communication reduces both the time it takes to send data to a large set of receivers and the amount of network resources required to deliver such data.

The appearance of such new applications, with multicast requirements, evidenced the need for truly multicast protocols at the IP layer.
However, traditional solutions were too complex to implement, and only a few network devices support them.

The purpose of this research work is to study new multicast paradigms that offer simple solutions for the Next Generation Internet, which will include elements such as laptops, PDAs, mobile phones and sensors. To date, the design goals for multicast protocols in wired or wireless environments have not included sensor nodes. However, this will be crucial, as sensors will be preponderant elements in the Next Generation Internet.

PhD Thesis Proposal: G6.3
Title: “Routing for Resilience in Ambient Networks”
Keywords: Routing, resilience, ambient networks
Supervisors: Prof. Edmundo Monteiro (firstname.lastname@example.org)
Prof. Marilia Curado (email@example.com)
Summary:
The new types of applications and technologies used nowadays for communication among users, and the diversity of types of users, have shown that traditional routing paradigms are not capable of coping with these recent realities. Therefore, the role of routing in IP networks has shifted from single shortest-path routing to multiple-path routing subject to multiple constraints, such as Quality of Service requirements and fault tolerance. Moreover, traditional routing protocols have several problems concerning routing-information distribution, which compromises routing decisions. Namely, routing decisions based on inaccurate information, due to bad routing configurations caused either by faulty or malicious actions, will cause severe disruption in the service that the network should provide.

These issues are particularly important in networks that involve different types of communication devices and media, as happens in ambient networks. Ambient networks pose an additional challenge to routing protocols, since network composition changes very often when compared to traditional IP networks, and networks are expected to cooperate with each other on demand, without relying on previous configuration.
Moreover, associated with the dynamic structure of ambient networks, traffic patterns also change very often, due to the composition and decomposition of the network structure.

The work proposed for this thesis aims at studying the existing vulnerabilities of the routing protocols currently used in the Internet, and at proposing a resilient routing scheme that overcomes these weaknesses in order to improve network availability and survivability. The work will comprise the study of the state of the art of routing protocols for resilience and of the characteristics of ambient networks, and the proposal of enhancements to existing routing schemes in order to improve the contribution of the routing protocol to the resilience of ambient networks.

The research work of the PhD candidate will be included in the European Union Integrated Project WEIRD (WiMAX Extension to Isolated Research Data networks - http://www.ist-weird.eu).

PhD Thesis Proposal: G6.4
Title: “Community networks connectivity and service guarantees”
Keywords: Mobility, nomadicity, community networks
Supervisor: Prof. Fernando Boavida (firstname.lastname@example.org)
Summary:
A number of recent technological developments have enabled the formation of wireless community-wide local area networks. Dispersed users (residents or moving users) within the boundaries of a geographical region (neighbourhood or municipality) form a heterogeneous network and enjoy network services such as Internet connectivity. This environment, named Community Networks, is well suited for both traditional Internet access and the deployment of peer-to-peer services.

Achieving and retaining connectivity in this highly heterogeneous environment is a major issue. Although the technology advances in wireless networks are fairly mature, one further step in the management of Community Networks is to provide mobility and nomadicity support.
Nomadicity allows connectivity everywhere, while mobility includes the maintenance of connections and sessions while the node is moving from one place to another. Mobility and nomadicity in community and home networks still pose several challenges, as these environments are highly heterogeneous.

Seamless handover between different layer-two technologies is still a challenge. Seamless multimedia content distribution to the home may involve several network technologies, such as WLAN, power-line, GPRS or UMTS. Thus, the inter-layer issues involved are complex, and a lot of work is required to match MIP6 with them in order to provide seamless mobility for multimedia information. The ability to support sessions on multiple access networks is another open issue. These issues will be the central concern of the proposed PhD work.

The research work of the PhD candidate will be included in the European Union IST FP6 CONTENT Network of Excellence (CONTENT – Content Networks and Services for Home Users, http://www.ist-content.eu) and will be carried out in close cooperation with a foreign institution.

G7: Databases
http://gbd.dei.uc.pt/

PhD Thesis Proposal: G7.1
Title: “The Optimization Problem (OP) for QoS-compliance in Systems Engineering”
Keywords: QoS-Broker, Distributed Programming, SLAs, Optimization
Supervisor: Prof. Pedro Furtado (email@example.com)
Summary:
The capacity to monitor and control QoS parameters and their lifecycle is an important aspect of today's mature systems and software. Quality of Service and automatic adaptation are also at the centre of most current developments in systems and software. Another frequent issue in today's parallel, distributed, mobile and, generically, networked systems is the optimization of content placement and replication and its applications. Our group has been working on both issues and their relationship, and has two projects running. We have interesting and innovative proposals that make very good PhD theses.
We also have some basic algorithmic and solver pilot prototypes for the OP and for QoS functionality as starting points, together with detailed knowledge of current and promising work. Besides these issues, the proposal is an opportunity for the PhD candidate to learn and work with technologies such as Condor, Java and solver software, and to work on exciting algorithms within a helpful team.

PhD Thesis Proposal: G7.2
Title: “QoS-Brokering Lifecycle with Automated Monitoring for Generic Applications and Networked Data Intensive Systems”
Keywords: QoS-Broker, Distributed Programming
Supervisor: Prof. Pedro Furtado (firstname.lastname@example.org)
Summary:
The Generic QoS-Broker is a piece of software that can sit anywhere in any system, and whose purpose is to make contracts, monitor, and react with any piece of software in any desired manner, to provide factual or optimistic Quality of Service guarantees. It can also interact with lower-level QoS-Brokers, so that in principle any desired QoS objective can be met in any context. The questions that must be answered are: how does it work? Given certain contexts, how is it applied (e.g. mobile, real-time, data management, etc.)? Our group has been busy working on some of these issues and has two related projects. We have some initial prototypes for these systems. As a result, we have quite a few interesting and innovative proposals that make very good PhD theses.

PhD Thesis Proposal: G7.3
Title: “Automatic Time-prediction and QoS in Distributed Data-Intensive Systems”
Keywords: QoS-Broker, Distributed Programming
Supervisor: Prof. Pedro Furtado (email@example.com)
Summary:
The capacity to monitor and control QoS parameters is an important aspect of today's mature systems and software. Quality of Service and automatic adaptation are also at the centre of most current developments in systems and software. There are several works on these issues. Our group has been working on mixing generic QoS with data services.
We would like to explore this in a typical current distributed computing platform. The proposal is an opportunity for the PhD candidate to learn and work with technologies such as Condor and Java, and to work on exciting algorithms within a helpful team. Related projects: Adapt-DB, and a current proposal in cooperation with another CISUC group.

PhD Thesis Proposal: G7.4
Title: “Optimal Caching, Replication, Placement and Compression of Streams for Multimedia Streaming in Mobile and Heterogeneous Systems”
Keywords: QoS, Replication, Placement, Caching
Supervisor: Prof. Pedro Furtado (firstname.lastname@example.org)
Summary:
Given the rise and convergence of network technologies and of streaming and rich-media content delivery, the caching, replica placement and compression of streams become increasingly important issues. Given our expertise in replica management, caching and QoS brokering, we expect to produce innovative proposals in this context that make very good PhD theses. Related projects: a current proposal, in cooperation with another CISUC group.

PhD Thesis Proposal: G7.5
Title: “Predictability, Portability, Auto-Adaptability and Re-Organization of Data Services”
Keywords: RDBMS, Replication, Placement, Caching, Query Processing
Supervisor: Prof. Pedro Furtado (email@example.com)
Summary:
Auto-adaptability of services and applications is currently a hot topic. We have developed work on both run-anywhere data-management services and generic QoS-Broker architectures. Now we are particularly interested in providing automatic tools to determine placement, replication and other adaptation strategies, based on our monitoring and history-analysis capacity. We have a basic prototype and several initial ideas on these issues. As a result, we have quite a few interesting and innovative proposals that make very good PhD theses. Related projects: Auto-DWPA.

PhD Thesis Proposal: G7.6
Title: “Data Anywhere, at Anytime with QoS in Heterogeneous Networks”
Keywords: QoS, Replication, Placement, Caching
Supervisor: Prof.
Pedro Furtado (firstname.lastname@example.org)
Summary:
The current trend towards mobile and ubiquitous computing should fundamentally change our concepts of mostly static data storage and access. In the future, access will increasingly be mobile, from multiple devices and places and through different mediums, and the user will not define or worry about where and how the access happens or where the contents are located. This represents a paradigm shift for the user, from specifying “where and how” to specifying “object properties”. UbiData is an automatic, transparent manager for user content through QoS definition. Users specify all kinds of object, access and resource properties, and the system handles the objects automatically (storage, consistency, availability and other QoS parameters). Related projects: Adapt-DB, current proposals.

PhD Thesis Proposal: G7.7
Title: “QoS and Replica-based Strategies for QoS-Brokering in Adaptable Data Services”
Keywords: QoS, Transactional Systems
Supervisor: Prof. Pedro Furtado (email@example.com)
Summary:
Auto-adaptability of services and applications is currently a hot topic. We have developed work on both run-anywhere data-management services and generic QoS-Broker architectures. Now we are working on adopting interesting QoS and replication strategies in transactional environments. As a result, we have quite a few interesting and innovative proposals that make very good PhD theses. Related projects: Adapt-DB.

PhD Thesis Proposal: G7.8
Title: “Timely ACID Transactions in DBMS”
Keywords: Databases, transaction processing, performance and QoS, timely transactions, real-time databases, fault-tolerance
Supervisors: Prof. Marco Vieira (firstname.lastname@example.org)
Prof. Henrique Madeira (email@example.com)
Summary:
On-time data management is becoming a key difficulty faced by the information infrastructure of most organizations. A major problem is the capability of database applications to access and update data in a timely manner.
In fact, database applications for critical areas (e.g., air traffic control, factory production control, etc.) are giving increasing importance to the timely execution of transactions. Database applications with timeliness requirements have to deal with the possible occurrence of timing failures, when the operations specified in the transaction do not complete within the expected deadlines. For instance, in a database application designed to manage information about a critical activity (e.g., a nuclear reactor), a transaction that reads and stores the current reading of a sensor must be executed in a short time, as the longer it takes to execute the transaction, the less useful the reading becomes. This way, when a transaction is submitted and does not complete before a specified deadline, that transaction becomes irrelevant, and this situation must be reported to the application/business layer in order to be handled in an adequate way.

In spite of the importance of timeliness requirements in database applications, commercial DBMS do not assure any temporal properties, not even the detection of cases where a transaction takes longer than the expected/desired time. The goal of this work is to bring timeliness properties to the typical ACID (atomicity, consistency, isolation, durability) transactions, putting together classic database transactions and recent achievements in the field of real-time and distributed transactions.
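A minimal sketch of the missing capability, detecting a missed deadline, rolling the transaction back and reporting the timing failure to the application layer, is shown below. SQLite stands in for a commercial DBMS purely for illustration, and the deadline policy is an assumption of this sketch, not the TACID project's actual mechanism:

```python
# Minimal deadline-aware transaction sketch: run a transaction and, if it
# does not finish within its deadline, roll it back and report a timing
# failure to the caller. Illustrative only (SQLite stands in for a DBMS);
# this is not the TACID project's actual mechanism.
import sqlite3
import time

class TimingFailure(Exception):
    """Raised when a transaction misses its deadline."""

def timely_transaction(conn, operations, deadline_s):
    """Execute operations(conn) atomically, enforcing a wall-clock deadline."""
    start = time.monotonic()
    try:
        operations(conn)
        if time.monotonic() - start > deadline_s:
            conn.rollback()               # result would be stale: discard it
            raise TimingFailure(f"transaction exceeded {deadline_s}s deadline")
        conn.commit()
    except TimingFailure:
        raise
    except Exception:
        conn.rollback()                   # ordinary failure: preserve atomicity
        raise

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (sensor TEXT, value REAL)")

def store_reading(c):
    c.execute("INSERT INTO readings VALUES ('temp', 21.5)")

timely_transaction(conn, store_reading, deadline_s=1.0)
```

Note that this sketch only detects the miss after the fact; aborting a transaction pre-emptively, before the deadline expires, requires support inside the DBMS core, which is precisely part of what this proposal aims to investigate.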
This work will be developed in the context of the TACID (Timely ACID Transactions in DBMS) research project, POSC/EIA/61568/2004, funded by FCT.
Proposal
The PhD work will comprise the following initial tasks: (a) overview of the state-of-the-art on timely computing and real-time databases; (b) characterization of timed transactions; (c) analysis of DBMS core implementations; (d) infrastructure to support timely execution of ACID transactions; (e) development of mathematical techniques to forecast transaction execution times; (f) implementation and evaluation; (g) writing papers.

PhD Thesis Proposal: G7.9
Title: "Security Benchmarking for Transactional Systems"
Keywords: Security, benchmarking, database management systems, transactional systems
Supervisors: Prof. Marco Vieira (firstname.lastname@example.org)
Prof. Henrique Madeira (email@example.com)
Summary:
One of the main problems faced by organizations is the protection of their data against unauthorized access or corruption due to malicious actions. Database management systems (DBMS) constitute the kernel of the information systems used today to support the daily operations of most organizations and represent the ultimate layer in preventing unauthorized access to data stored in information systems. In spite of the key role played by the DBMS in overall data security, no practical way has been proposed so far to characterize the security of such systems or to compare alternative solutions concerning security features. Benchmarks are standard tools that allow the evaluation and comparison of different systems or components according to specific characteristics (e.g., performance, robustness, dependability, etc.). In this work we are particularly interested in benchmarking the security aspects of transactional systems. Thus, the main goal is to research ways to compare transactional systems from a security point of view.
This work will be developed in the context of a research cooperation with the Center for Risk and Reliability of the University of Maryland, MD, USA. During this work the student will have the opportunity to visit the University of Maryland in order to carry out joint work with local researchers.
Proposal
The PhD work will comprise the following initial tasks: (a) overview of the state-of-the-art on security, security evaluation, and dependability benchmarking; (b) definition of a security benchmarking approach for transactional systems; (c) study of attacks for security benchmarking; (d) definition of a standard approach for security evaluation and comparison; (e) implementation and evaluation; (f) writing papers.

PhD Thesis Proposal: G7.10
Title: "Dependability Benchmarking for Distributed and Parallel Database Environments"
Keywords: Databases, distributed systems, parallel systems, dependability benchmarking, fault-tolerance
Supervisors: Prof. Marco Vieira (firstname.lastname@example.org)
Prof. Henrique Madeira (email@example.com)
Summary:
The ascendance of networked information in our economy and daily lives has increased awareness of the importance of dependability features. In many cases, such as in e-commerce systems, service outages may result in a huge loss of money or in an unaffordable loss of prestige for companies. In fact, due to the impressive growth of the Internet, a few minutes of downtime in a server somewhere may be directly exposed as a loss of service to thousands of users around the world. Database systems constitute the kernel of the information systems used today to support the daily operations of most organizations. Additionally, in recent years there has been an explosive growth in the use of databases for decision support. The biggest differences between decision support systems and operational systems, besides their different goals, are the type of operations executed and the supporting database platform.
While operational systems execute thousands or even millions of small transactions per day, decision support systems execute only a small number of queries on the data (in addition to the loading operations executed offline). Advanced database technology, such as parallel and distributed databases, is a way to achieve high performance and availability in both operational and decision support systems. However, although distributed and parallel database systems are increasingly being used in complex business-critical systems, no practical way has been proposed so far to characterize the impact of faults in such environments or to compare alternative solutions concerning dependability features. The fact that many businesses require very high dependability for their database servers shows that a practical tool allowing the comparison of alternative solutions in terms of dependability is of the utmost importance. In spite of the pertinence of having dependability benchmarks for distributed and parallel database systems, the reality is that no such benchmark has been proposed so far. A dependability benchmark is a specification of a standard procedure to assess dependability-related measures of a computer system or computer component. Awareness of the importance of dependability benchmarks has increased in recent years, and dependability benchmarking is currently the subject of intense research. In previous work, the first known dependability benchmark for transactional systems was proposed. However, that benchmark focuses on single-server transactional databases. The goal of this work is to study the problem of dependability benchmarking in distributed and parallel databases. One of the key aspects to be addressed is how to apply a faultload (a set of faults and stressful conditions that emulate real faults experienced by systems in the field) in a distributed/parallel environment.
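A faultload of the kind just described can be represented as a schedule of fault-injection events, each naming a fault class, a target node, and a trigger time. The `Fault` record and the time-slotting scheme below are illustrative assumptions for the sketch, not the benchmark's actual faultload:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Fault:
    kind: str         # e.g. "operator", "software", "hardware", "network"
    target_node: int  # which node of the distributed/parallel DBMS to hit
    trigger_s: float  # injection time, relative to the start of the run

def build_faultload(n_nodes, slot_s=60.0):
    """Illustrative faultload: one fault of each kind per node, spread over
    consecutive time slots so each fault's effects can be measured apart."""
    kinds = ["operator", "software", "hardware", "network"]
    faults = []
    for node in range(n_nodes):
        for i, kind in enumerate(kinds):
            faults.append(Fault(kind, node, (node * len(kinds) + i) * slot_s))
    return sorted(faults, key=lambda f: f.trigger_s)

faultload = build_faultload(n_nodes=3)
```

A benchmark run would then replay this schedule against the cluster while the workload executes, which is exactly where the distributed setting complicates matters: injection must be coordinated across nodes rather than applied to a single server.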
Several types of faults will be considered, namely: operator faults, software faults, and hardware faults (including network faults).
Proposal
The PhD work will comprise the following initial tasks: (a) overview of the state-of-the-art on parallel and distributed databases, dependability assessment, and dependability benchmarking; (b) definition of a dependability benchmarking approach for distributed and parallel databases; (c) study of typical faults in distributed and parallel database environments; (d) definition of a standard approach for dependability evaluation and comparison in distributed and parallel databases; (e) implementation and evaluation; (f) writing papers.

PhD Thesis Proposal: G7.11
Title: "Techniques to Improve Performance in Affordable Data Warehouses"
Keywords: Data warehousing, on-line analytical processing (OLAP), performance, parallel and distributed databases
Supervisors: Prof. Jorge Bernardino (firstname.lastname@example.org)
Prof. Henrique Madeira (email@example.com)
Summary:
In recent years, there has been an explosive growth in the use of databases for decision support. These systems, generically called data warehouses, involve manipulations of massive amounts of data that push database management technology to the limit, especially in what concerns performance and scalability. In fact, typical data warehouse utilization has an interactive character, which assumes short query response times. Therefore, the huge data volumes stored in a typical data warehouse, and the complexity of the queries with their intrinsic ad hoc nature, make the performance of query execution the central problem of large data warehouses. The main goal of this work is to investigate ways to achieve a dramatic reduction of the hardware, software, and administration costs when compared to traditional data warehouses. The affordable data warehouse solution will be built upon the high scalability and high performance of the DWS (Data Warehouse Striping) technology.
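The core of the DWS idea named above (fact rows striped across the nodes of a cluster, each node answering a query on its slice, with partial results merged) can be sketched in a few lines. The capacity-weighted distribution below is an illustrative assumption standing in for the load-balancing problem on a heterogeneous cluster; it is not the DWS algorithm itself:

```python
def distribute(rows, capacities):
    """Stripe rows across nodes, proportionally to each node's capacity.
    With equal capacities this degenerates to plain round-robin striping."""
    schedule = [n for n, c in enumerate(capacities) for _ in range(c)]
    partitions = [[] for _ in capacities]
    for i, row in enumerate(rows):
        partitions[schedule[i % len(schedule)]].append(row)
    return partitions

def parallel_sum(partitions):
    """Each node computes its partial aggregate (here sequentially, standing
    in for parallel execution); the coordinator merges the partials."""
    partials = [sum(p) for p in partitions]
    return sum(partials)

rows = list(range(1, 101))                       # toy fact table: 100 measures
parts = distribute(rows, capacities=[2, 1, 1])   # node 0 is twice as fast
```

Because every node scans only its own slice, the query's scan cost per node drops roughly in proportion to the number of nodes, which is the near-optimal speedup the text refers to.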
Starting from the classic method of uniform partitioning at the lowest level (facts), DWS includes a new technique that distributes a data warehouse over an arbitrary number of computers. Queries are executed in parallel by all the computers, guaranteeing a nearly optimal speedup. This work will focus on various aspects related to:
• Automatic data balancing: as each node in the cluster may have different processing capabilities, it is important to provide load balancing algorithms that automatically find the best data distribution. With these mechanisms the system will be able to reorganize the data whenever needed, in order to keep the load on each node as balanced as possible, allowing similar response times on every node.
• Auto-administration and tuning: by using a cluster of machines, administration complexity and costs tend to increase dramatically. Although the nodes normally have similar configurations, some discrepancies are expected due to the heterogeneous nature of the cluster. To achieve the best configuration we need to tune each node individually. Thus, we have to develop a solution for automatic administration and tuning in distributed data warehouses that allows a reduction of the administration cost and an efficient use of the system resources.
Proposal
The PhD work will comprise the following initial tasks: (a) overview of the state-of-the-art; (b) characterization of affordable data warehouse requirements; (c) infrastructure for improved performance in affordable data warehouses; (d) implementation and evaluation; (e) writing papers.

PhD Thesis Proposal: G7.12
Title: "Towards High Dependability in Affordable Data Warehouses"
Keywords: Data warehousing, on-line analytical processing (OLAP), fault-tolerance, data security, parallel and distributed databases
Supervisors: Prof. Marco Vieira (firstname.lastname@example.org)
Prof.
Henrique Madeira (email@example.com)
Summary:
In recent years, there has been an explosive growth in the use of databases for decision support. These systems, generically called data warehouses, involve manipulations of massive amounts of data that push database management technology to the limit, especially in what concerns performance and dependability. The huge data volumes stored in a typical data warehouse make performance and availability two central problems of large data warehouses. An affordable data warehouse solution is being built upon the high scalability and high performance of the DWS (Data Warehouse Striping) technology. Starting from the classic method of uniform partitioning at the lowest level (facts), DWS includes a new technique that distributes a data warehouse over an arbitrary number of computers. The fact that the data warehouse is distributed over a large number of computers raises new challenges, as the probability of failure of one or more computers greatly increases. The main goal of this work is to investigate ways to achieve high dependability in the affordable data warehouse solution while allowing a dramatic reduction of the hardware, software, and administration costs when compared to traditional data warehouses. Thus, this work will focus on various aspects related to:
– Data security: the affordable data warehouse solution will be based on open-source database management systems (DBMS). However, these databases do not provide the security mechanisms normally available in commercial DBMSs. In addition, the distributed database approach increases the data security requirements. The goal is to investigate the security needs of distributed data warehouses over open-source DBMSs and to propose advanced mechanisms that improve overall system security.
– Data replication and recovery: compared to a single-server database, one of the consequences of using a cluster of affordable machines is the increased probability of failure.
This way, one of the goals is to research a new technique for data replication and recovery that allows the system to continue working in the presence of failures in several nodes and facilitates the recovery of failed nodes.
Proposal
The PhD work will comprise the following initial tasks: (a) overview of the state-of-the-art; (b) characterization of affordable data warehouse dependability requirements; (c) infrastructure to support high dependability in distributed data warehouses; (d) implementation and evaluation; (e) writing papers.

G8: Information Systems
http://isg.dei.uc.pt/
PhD Thesis Proposal: G8.1
Title: "Learning Contexts and Social Networking"
Keywords: e-learning 2.0, e-learning, learning contexts, learning, LMS, social networking, technology enhanced learning (TEL), web 2.0
Supervisor: Prof. Antonio Dias de Figueiredo (firstname.lastname@example.org)
Summary:
This thesis concentrates on the design, implementation and management of learning contexts. The major tenet of our Learning Contexts Project, which has been gaining strength over more than thirty years devoted to ICT and education, is that the future of learning is to be found not just in content but also, and very much, in context, that is, in making learning happen within activity-rich, interaction-rich, and culturally rich social environments that never existed before, that the intelligent use of technology is making possible, and where different paradigms apply. Many of the most dynamic fields of research in learning and education, such as computer-supported cooperative learning, situated learning, or learning communities, relate to learning contexts. Hundreds of expressions used in education, such as project-based learning, action learning, learning by doing, case studies, scenario building, simulations, and role playing, pertain to learning contexts.
The advantage of concentrating on context as a whole, rather than on the multiplicity of its manifestations studied by disparate research groups, is that, by doing so, we can articulate that multitude of theories and practices into a single, coherent, organic, and operational worldview. The proposed thesis pushes forward our current efforts in this field by exploring the relationship between learning contexts and social networking. This may include collaboration with another of our projects, the Serendipity Project, centred on the development of a serendipitous social search engine for which we hold a US patent application. In order to stimulate the creativity of the candidates, plenty of leeway will be given to them, so that they may choose to concentrate on theoretical aspects, on practical educational issues, or on the specification and design of the ideal, and as yet nonexistent, learning context management system (LXMS).
Prospective candidates wishing to clarify the research implications of learning contexts may download from the journal Interactive Educational Multimedia our paper "Learning Contexts: a Blueprint for Research". Further information can be obtained by downloading Chapter 1, "Context and Learning: a Philosophical Framework", of our book Managing Learning in Virtual Settings: the Role of Context, published by Information Science Publishing (Idea Group). Successful candidates will have, or be willing to develop throughout their PhD, a mixed profile of educational technologist and educational and social researcher.
- Figueiredo, A. D. (2005) "Learning Contexts: A Blueprint for Research", Interactive Educational Multimedia, No. 11, October 2005. http://www.ub.es/multimedia/iem/down/c11/Learning_Contexts.pdf
- Figueiredo, A. D. and Afonso, A. P. (2005) "Context and Learning: a Philosophical Framework", in Figueiredo, A. D. and A. P.
Afonso, Managing Learning in Virtual Settings: The Role of Context, Information Science Publishing (Idea Group), October 2005. http://www.idea-group.com/downloads/excerpts/Figueiredo01.pdf

PhD Thesis Proposal: G8.2
Title: "Quality management and information systems: getting more than the sum of the parts"
Keywords: Quality Management, Information Systems
Supervisor: Prof. Paulo Rupino (email@example.com)
Summary:
Quality management of products, services, and business processes is today a key issue for the success of most companies operating in global contexts. In fact, holding a quality certification, such as established by the ISO 9001:2000 standard, is becoming a basic requirement for companies to play in several international markets. On the other hand, the design and deployment of information systems is another key aspect to consider when modern organizations define their business models and strategy. It is thus quite surprising that, in spite of the fact that both quality management and information systems architecting require intensive strategic analysis and the extensive involvement of staff and managers in the examination and redesign of business processes, the two endeavors are still treated as completely distinct. They are usually conducted as separate projects, handled by different teams, equipped with unconnected methodologies. The integrated design of these two pillars of modern organizations, in a manner in which they depend on, support, and reinforce each other, enables a quantum leap, as it lets organizational tasks be reengineered in the light of: (i) effectiveness, consistency, and evidence of compliance, as required by quality systems; and (ii) efficiency, harnessing the power of digital information storage, processing, and communication in the renewed business processes. Typical criticisms of traditional implementations of quality management systems can also be alleviated, namely by reducing the added bureaucracy and overhead imposed on users by traditional implementations.
The economic impact on organizations can be considerable, not only at the initial planning stage but, more importantly, throughout the lifecycle of operation of this unified system. The likelihood of synergy between quality management and IT infrastructure has been suggested by a few authors, but no systematic processes for leveraging those synergies can be found. A successful PhD in this unexplored field will arm its holder with the skills and tools to act in an increasingly appealing consulting arena.

PhD Thesis Proposal: G8.3
Title: "Visualizing and Manipulating Work Load Control over Business Networks"
Keywords: Information Systems, Work Load Control Methodology, Human-Computer Interaction, Information Visualization, Direct Manipulation Interfaces, Delegation, Interface Agents
Supervisor: Prof. Licínio Roque (firstname.lastname@example.org)
Summary:
In a previous project we designed a web-based planning and control system for small and medium enterprises operating in make-to-order clusters (a case in the mould-making industry) that implemented an adaptation of the Work Load Control planning methodology proposed at Lancaster. The Work Load Control methodology enables production management across shop floors by controlling pending workload levels across workcentres, thus effectively managing the overlapping "windows of opportunity" for completing every task at each production unit. We have developed a system where the user sets the planning conditions and delegates to the system the generation of plan proposals.
Unmet restrictions are then iteratively resolved by taking management decisions that adapt a candidate plan to actual production conditions, and vice-versa. Some of the conclusions of the project regarding the adoption of the new planning tool were: a) the Work Load Control methodology, while particularly flexible and adaptable for SMEs running make-to-order operating models, poses a learning obstacle for people trained to think in time-sliced models (like those depicted in Gantt charts); b) a web-based system, while easy to deploy and manage, imposes a heavy cognitive load, as the principal interaction mode is linguistic (using menus, dialogs, and forms, with some graphics for visualizing resulting plans); c) current business globalization increasingly involves high levels of subcontracted work that needs to be managed across enterprise networks with only partial knowledge of production conditions, which makes it difficult to use methodologies that assume full knowledge of and control over production units. With clients, we have come to the conclusion that a planning tool for the Work Load Control methodology needs a visualization and direct manipulation component to reduce the cognitive overhead posed by the complexity and "non-intuitiveness" of the methodology and to enable the person to dynamically envision and track events across networks of enterprises. This case provides an ideal opportunity to attempt an integration of the linguistic, direct manipulation, and delegation modes of interface, to develop novel visualizations, and to test usability evaluation techniques. The research implies acquiring knowledge of the methodology and conceiving and studying appropriate solutions for the case study by designing innovative interaction techniques.
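The workload-controlling idea described above, releasing work only while the pending load at each workcentre stays within a norm, can be sketched with a minimal release rule. The workcentres, norms, and job routings below are illustrative assumptions, not the Lancaster methodology in full:

```python
def try_release(job, pending, norms):
    """Work Load Control release rule (sketch): release `job` to the shop
    floor only if it keeps every workcentre it visits below its norm.
    job: dict mapping workcentre -> hours of work the job adds there.
    pending: dict mapping workcentre -> hours already released.
    Returns True and updates `pending` if the job fits everywhere."""
    if any(pending[wc] + hours > norms[wc] for wc, hours in job.items()):
        return False  # would overload some workcentre: keep the job pooled
    for wc, hours in job.items():
        pending[wc] += hours
    return True

norms = {"milling": 40.0, "assembly": 30.0}     # load norms per workcentre
pending = {"milling": 0.0, "assembly": 0.0}
released = [try_release(j, pending, norms) for j in [
    {"milling": 25.0, "assembly": 10.0},
    {"milling": 20.0},                           # would push milling past 40
    {"milling": 10.0, "assembly": 15.0},
]]
```

Even this toy form shows why the method is hard to read off a Gantt-style timeline: what matters is the aggregate pending load per centre, not time-sliced task bars, which is precisely the visualization gap the proposal targets.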
Relevance is expected in Decision Support Systems, Human-Computer Interaction and, more generally, in the Information Systems academic and business communities.

PhD Thesis Proposal: G8.4
Title: "Designing Games as Learning Contexts"
Keywords: Information Systems, Human-Computer Interaction, Context Engineering, Learning Games, Learning Contexts, Social Constructivism
Supervisor: Prof. Licínio Roque (email@example.com)
Summary:
Several authors have argued that videogames could be exploited as learning environments. James Paul Gee has written extensively on the subject of learning from computer games, noticing that we can hardly ignore the learning that takes place with this new medium. Marc Prensky has argued this idea with respect to the simulation aspects of games. Raph Koster, a designer and consultant, adopts the perspective that games are actually meant to be learned, and makes playing-as-learning the basis for game design. Simon Egenfeldt-Nielsen produced a PhD thesis on the educational potential of computer games, analyzing cases that used commercially available games. Seeing games in the light of socio-technical theories such as Actor-Network Theory, we have come to an interpretation of games as constructions designed and built to enforce specific "programs of action", while their storytelling and underlying rules provide the basis for stable translation regimes for the player. Games as simulations can also be understood as embodied theories of physical or social reality. By providing specific embodied concepts and representations, physical or other abstract rules, and characters with recognizable behavior, the designer builds a learning context with conditions of engagement that concur to enable playful experiences. These can be more or less flexible or open to interpretation through player choices, but always enforce underlying worldviews or inscribed theories that are meant to be learned in the course of playing the game if the player is to achieve the game's goal.
It is this activity conditioning that we suppose to be the usable basis for explicitly conceiving games as learning devices. An alternative approach would be to take game design itself as the learning activity and explore the learning potential inherent in the design activity. Either way, the research should focus on the methodological problem of explicitly modeling and building games as learning contexts. The Context Engineering approach can be used to frame the development of specific contexts, by prototyping on available multiplayer game development technology and focusing on design aspects and their relation to the proposed problem. Adequate evaluation techniques should also be a consideration in the studied contexts if they are to be socially accepted as effective learning alternatives. Relevance is expected for the Learning Sciences, Human and Social Sciences, Information Systems and Human-Computer Interaction, Game Studies, Media Studies, and society at large.

PhD Thesis Proposal: G8.5
Title: "Visual Modeling Language for Game Design"
Keywords: Information Systems, Human-Computer Interaction, Visual Modeling Language, Games and Design
Supervisor: Prof. Licínio Roque (firstname.lastname@example.org)
Summary:
Game design is a complex activity that deals with multiple, heterogeneous, concurrent constraints. A game designer has to consider the combined effects of multiple elements, effectively requiring a trans-disciplinary background that can range, with variable intensity, from the Humanities to Media Studies, Psychology and Sociology, Aesthetics, Informatics, and Economics. While the advent of a common design language still seems far in the future, some design patterns can be readily recognized from the 30+ year history of videogame development. An example of such a systematization is Björk and Holopainen's "Patterns in Game Design".
Nonetheless, the idea of a visual language for game modeling and design seems not only possible and relevant, but a pressing research goal. In pursuing the goal of a visual modeling language for game design, it is expected that knowledge of socio-technical studies of science and technology will give useful insights into the problems of trans-disciplinarity, and in particular that Actor-Network Theory constructs can serve as a base language for the analysis of game contexts. One requisite for such a language would be that we could build a prototype modeling tool that could be used to sketch and generate game code to be run on a current software game platform. Another basic requisite would be that it could serve as a basis for design dialogue between people with diverse disciplinary backgrounds. This research project would involve the trans-disciplinary background study and the drafting of a language prototype that could then be evaluated and evolved through successive design iterations, against updated usability requirements. Relevance and adequacy would be judged based on actual empirical design experience and historical design accounts, in the fields of Information Systems, Human-Computer Interaction, Game Development, Design, Game Studies and Media Studies.

PhD Thesis Proposal: G8.6
Title: "Programming Games by Demonstration (and Learning to Program)"
Keywords: Information Systems, Human-Computer Interaction, Games and Design, Programming by Demonstration, Programming by Example
Supervisor: Prof. Licínio Roque (email@example.com)
Summary:
Game programming and development from scratch requires advanced skills and specialties often unavailable to a game designer or an artist, let alone to the proverbial man on the street. The advent of general-purpose game engines and game generation environments has made it simpler and more affordable, or at least a less specialized task, to develop and deploy complex game scenarios.
Yet even the simplest game design attempt still requires, if not mastery of complex programming, at least some skill with a scripting language. Towards democratizing this technology and the videogame medium, and taking the historical lesson from what happened with the appearance of personal film cameras and the development of cinema, and again with video and TV, it seems interesting to work on a solution that would enable a wider public to become active participants in the creation of interactive content. Adopting and evolving programming-by-demonstration or programming-by-example techniques could play a significant part in lowering the learning barrier to achieving useful effects analogous to behavior scripting. Building a system that enables reflexive action, by letting the user inspect and animate the programming results of demonstrated actions, could serve as a basis for semi-autonomous learning of programming concepts and skills. It is intended that the candidate researcher pursue this goal by designing and building a prototype system on top of an existing software game platform or virtual environment and proceed to evaluate the resulting concept through actual empirical cases with targeted user segments. Relevance and adequacy would be judged based on actual empirical design experience, with results published in the fields of Information Systems, Human-Computer Interaction, Game Development, Design, Game Studies and Media Studies.

G9: Evolutionary and Complex Systems
http://cisuc.dei.uc.pt/ecos/
PhD Thesis Proposal: G9.1
Title: "Evolving Representations for Evolutionary Algorithms"
Keywords: Evolutionary Computation, Gene Regulatory Networks, Self-Organization
Supervisor: Prof. Ernesto Costa (firstname.lastname@example.org)
Summary:
Evolutionary Algorithms typically approach the genotype-phenotype relationship in a simple way.
As a matter of fact, conventional EAs consider the genotype as the complex structure and rely on more or less simple mechanisms to map from genotype to phenotype. Some work is being done on the relation between the user-designed representations used by the EA (the genotype) and the fitness landscape induced by the problem (the phenotype). The idea is to understand the role played by representations in improving evolvability. This is important, but we can advance a step further. Exploring ideas from developmental biology in the context of evolutionary algorithms is not new. Notwithstanding, the challenge here is to better understand how we can combine the theory of evolution with embryonic development in a unified framework and explore it computationally, aiming at evolving the representations to be explored by an EA instead of designing them offline, attaining a self-organized evolutionary algorithm (SOEA). To achieve that goal we have to identify the building blocks for representations as well as the transformational rules that end up in the definition of an adapted individual.

PhD Thesis Proposal: G9.2
Title: "Harnessing Dynamic Environments: the problem of prediction"
Keywords: Evolutionary Computation, Dynamic Environments, Prediction
Supervisors: Prof. Ernesto Costa (email@example.com), Prof. Juergen Branke
Summary:
Most real-world optimization problems change dynamically over time (e.g., scheduling, routing, transportation, or robot navigation problems). In such cases, the task of an optimization algorithm changes from finding an optimal solution to continuously adapting an existing solution to the changing environment. Nature-inspired optimization algorithms have proven to be successful candidates for such problems. When changes occur repeatedly, it should be possible to learn from experience and predict future changes of the environment.
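The prediction idea just raised can be made concrete with a toy sketch: if the optimum of a dynamic problem drifts over time, a simple extrapolation from the history of previously found optima anticipates where it will be next, so the algorithm can re-seed its search there instead of reacting only after the change. The drift model and the linear predictor are illustrative assumptions:

```python
def predict_next(history):
    """Linear extrapolation from the last two observed optimum positions."""
    if len(history) < 2:
        return history[-1]  # no trend information yet
    return history[-1] + (history[-1] - history[-2])

# Toy environment: the optimum drifts by +0.5 every change period.
optima = [1.0, 1.5, 2.0, 2.5]
history, errors = [], []
for true_opt in optima:
    if history:
        errors.append(abs(predict_next(history) - true_opt))
    history.append(true_opt)  # assume the algorithm re-finds each optimum
```

After two observations the predictor locks onto the drift and its error drops to zero; the hard research questions the proposal points at are what to do when the dynamics are noisy or nonlinear, and how to fold such predictions into the population of a nature-inspired algorithm.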
Such predictions, even if uncertain, could help the algorithm make decisions that prepare it for what is to come, allowing an even faster adaptation and avoiding getting stuck in "dead ends". In the context of dynamic environments, almost all work has focused on enabling the algorithm to adapt quickly after a change; only very few papers have attempted to anticipate changes. So far, there is no fundamental and general investigation of the importance of, and possibilities for, integrating prediction into nature-inspired optimization. This thesis would aim at making some fundamental investigations into the role and potential of prediction in dynamic environments, and at developing new and effective ways to integrate prediction into various nature-inspired optimization heuristics.

PhD Thesis Proposal: G9.3
Title: "Evolutionary Hybridization with State-of-the-art Exact Methods"
Keywords: Evolutionary computation, optimization, hybridization
Supervisor: Prof. Francisco Pereira (firstname.lastname@example.org)
Summary:
Standard Evolutionary Algorithms (EAs) often perform poorly when searching for good solutions to complex optimization problems and may benefit from being combined with other techniques. Broadly speaking, we can consider two large classes of hybrid architectures: the EA can be complemented with other search methods, or it can be enhanced with problem-specific heuristics that add explicit knowledge about the problem being solved. A drawback of research conducted on this topic is that many reported approaches are somewhat naive in nature and of limited applicability. Moreover, EAs are in most situations combined with basic standard procedures such as hill-climbing algorithms or simulated annealing. This project aims at conducting an inclusive study of evolutionary hybridization to analyze whether it is possible to develop new architectures that perform better than today's methods.
Special attention will be given to hybridization with exact algorithms, such as linear programming or gradient-based search. The challenge is to understand how the key properties of these exact techniques, such as the capability to reduce the search space or the effective exploration of neighborhoods, might be used by the EA to efficiently perform a global exploration of the search space. Several optimization problems will be used to perform a comprehensive analysis of the developed hybrid architectures.
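The flavor of such a hybrid can be conveyed with a minimal memetic-style sketch: a tiny evolutionary loop whose offspring are refined by a few gradient steps, the gradient standing in for the exact, derivative-based component. The objective, operators, and parameters are all assumptions made for the sketch, not one of the architectures the project will develop:

```python
import random

def f(x):       # toy objective to minimize, optimum at x = 3
    return (x - 3.0) ** 2

def grad(x):    # exact gradient of f, the "exact method" ingredient
    return 2.0 * (x - 3.0)

def local_refine(x, steps=5, lr=0.2):
    """Gradient-based local search applied to an offspring."""
    for _ in range(steps):
        x -= lr * grad(x)
    return x

def hybrid_ea(pop_size=10, generations=30, seed=0):
    rng = random.Random(seed)
    pop = [rng.uniform(-10.0, 10.0) for _ in range(pop_size)]
    for _ in range(generations):
        parent = min(pop, key=f)                            # elitist selection
        child = local_refine(parent + rng.gauss(0.0, 1.0))  # mutate + refine
        worst = max(range(pop_size), key=lambda i: f(pop[i]))
        if f(child) < f(pop[worst]):
            pop[worst] = child                              # replace the worst
    return min(pop, key=f)

best = hybrid_ea()
```

The division of labor mirrors the proposal's framing: the EA supplies global exploration through mutation and selection, while the exact technique exploits local structure (here, the gradient) to polish each candidate.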