ICTer 2017 – Keynote Speakers

Dependable Computing Platforms for the Internet of Things

Prof. Marcel Baunach, Institute of Technical Informatics, Graz University of Technology, Austria.

 

A central demand on the Internet of Things (IoT) as a global infrastructure is its ability to provide continuously changing services and functions dependably on an unprecedented number of heterogeneous devices. While today’s embedded devices are still statically designed for specific applications and suffer from severe security flaws, future software and hardware must be much more flexible and inherently protected. Highly adaptive computing platforms will be required to allow the dynamic composition of functions and even the modification of computational units at runtime (maintainability). At the same time, operations must still be completed within guaranteed response times (real-time), and the devices and network must remain protected against alteration due to environmental perturbation or deliberate attacks (security, safety). This talk addresses the related challenges, and presents novel approaches for co-designing highly flexible and secure middleware and MCU architectures for dependable embedded platforms in the IoT.

Trustworthy Software and Automated Program Repair

Prof. Abhik Roychoudhury, School of Computing, National University of Singapore.

 

Software controls many critical infrastructures, and a variety of software analysis methods have been proposed to enhance the quality, reliability and security of software components. In this talk, we will first briefly survey the gamut of methods developed so far in software validation research, ranging from systematic testing, to analysis of program source code and binaries, to formal reasoning about software components. We will discuss the research on trustworthy software at NUS that makes software vulnerability detection, localization and patching much more systematic, with the help of scalable semantic analysis. We will then explore research on futuristic programming environments which enable auto-patching of software vulnerabilities, with a focus on automatic program repair, where software errors are detected and fixed continuously. Apart from reducing the burden on programmers, this research aims to realize the vision of self-healing software for autonomous cyber-physical systems.

Evolutionary Many-objective Optimization and Some Real-World Applications

Prof. Kiyoshi Tanaka, Faculty of Engineering, Shinshu University, Japan.

 

Multi-objective evolutionary algorithms (MOEAs) are widely used in practice for solving multi-objective design and optimization problems. Historically, most applications of MOEAs have dealt with two- and three-objective problems, leading to the development of several evolutionary approaches that work successfully in these low-dimensional objective spaces. Recently, there has been growing interest in industry in solving problems that require the simultaneous optimization of four or more objectives, known as many-objective optimization problems. However, conventional MOEAs scale poorly with the number of objectives of the problem.
The development of robust, scalable, many-objective optimizers is an ongoing effort and a promising line of research. Critical to the development of such algorithms is an understanding of fundamental features of many-objective landscapes and the interaction between selection, variation, and population size to appropriately support the evolutionary search in high-dimensional spaces.
This talk will give an introduction to evolutionary many-objective optimization, discussing some characteristics of many-objective landscapes and relating them to working principles, performance and behavior of the optimizers. It will also present a general overview of the approaches to many-objective optimization, together with their state-of-the-art algorithms and techniques. Further, it will illustrate the use of many-objective optimization for some real-world applications.
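Central to the multi- and many-objective setting described above is Pareto dominance: one solution dominates another if it is no worse in every objective and strictly better in at least one. A minimal Python sketch (illustrative only, assuming all objectives are minimized; not taken from any particular MOEA):

```python
from typing import Sequence

def dominates(a: Sequence[float], b: Sequence[float]) -> bool:
    """True if objective vector `a` Pareto-dominates `b` (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated(points: list) -> list:
    """Return the Pareto-optimal subset of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# A toy population of 2-objective vectors: (3.0, 3.0) is dominated by (2.0, 2.0).
pop = [(1.0, 4.0), (2.0, 2.0), (3.0, 3.0), (4.0, 1.0)]
print(non_dominated(pop))  # [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0)]
```

With only two or three objectives, dominance prunes the population effectively; as the number of objectives grows, almost all solutions become mutually non-dominated, which is precisely the scalability problem for conventional MOEAs that the talk addresses.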

Assessing for Quality Decision Making

Prof. Geoffrey Crisp, The University of New South Wales, Sydney, Australia.

 

This session will explore the 21st century skills and capabilities that our students will need as they live and work in a world dominated by ubiquitous technology, complexity and increasing uncertainty. Our assessment practices will need to change; we cannot continue to give students static content-based assessment tasks that ignore the contextual consequences of working in a complex environment with many stakeholders. We will need to expand our repertoire of assessment tasks to include a more sophisticated use of physical and virtual spaces that allow students to construct their responses with access to whatever resources they require in order to make a meaningful response to a meaningful task. We should be able to identify students’ decision-making processes when they propose a solution to a real-life problem. Students will need to be provided with more engaging tasks that enable them to use the full range of capabilities they have developed during their learning. We will examine some of the implications of this new educational environment and reflect on our current assessment practices in relation to the requirements of this brave new world.

Computational Science in Multidisciplinary Domains

Prof. S.R. Subramanya, School of Engineering and Computing, National University, California.

 

Computer, Internet, and mobile technologies have matured to the point of ubiquity, driving the systems and processes of numerous domains such as engineering, the sciences, manufacturing, transportation, banking and finance, retail, defense and security, healthcare, public safety, and energy. Against this backdrop, a new area, Computational Science, has been emerging. Computational Science is a rapidly growing multidisciplinary field that draws on techniques from computer science, numerical analysis, mathematics, simulation, and data visualization to solve complex real-world problems across many domains.

It is now widely recognized that Computational Science forms the “third pillar” of scientific inquiry, alongside traditional theory and experimentation. Many real-world problems are far too complex for either pillar alone: experiments in numerous scientific and other domains are highly complex, expensive, and risky, while many problems are too complex to be tackled by analytical methods. Computational Science enables us to understand such problems, develop solutions, obtain both qualitative and quantitative insight into complex system behavior, and determine the viability of solutions. It enables professionals to solve, in a cost- and time-efficient manner, many real-world problems in renewable energy, the microbiological basis of disease, drug discovery, economic forecasting, epidemiology, the earth sciences, space exploration, weather and climate prediction, global financial markets, and more. Computational Science is beginning to change the way science and engineering are done, and it is expected to play an even more prominent role in the future, impacting numerous areas of the world around us.

In this context, there is a pressing need for a new breed of professionals who can understand and model complex real-world problems, apply the techniques of Computational Science to develop solutions, and interpret the results. To this end, it is important, even critical, to develop courses and programs in and related to Computational Science in higher-education curricula, in order to meet future demand for a workforce skilled in understanding the problems of contemporary society and in developing efficient and effective solutions for the greater good of society.

Prof. Roger Stern, Statistical Services Centre, University of Reading, United Kingdom.

 

Roger Stern is a Professor of Applied Statistics at the Statistical Services Centre, University of Reading, United Kingdom. He obtained his MSc in Statistics from the University of Sussex, England, and his PhD in Statistics from the University of Reading. He worked as a lecturer in statistics at the same university and has also worked overseas for 10 years, particularly in Sri Lanka, Nigeria and Niger, in both universities and agricultural research institutes.

His main areas of research have been on developing methods for processing historical climatic data, particularly rainfall data, in ways that are of direct relevance to users. He is also particularly concerned with the development of effective methods for training in applied statistical methods in general, and statistics in applied climatology in particular. His current role in the Statistical Services Centre is largely that of offering support to research activities in many fields, but particularly in agricultural research and in issues concerning climate variability and climate change.

He can provide expert opinion on data analytics and statistical applications, including the planning of experiments; data organization, analysis and reporting; statistical software; statistical climatology and the analysis of historical climatic data; and training in research methods and in statistical climatology.
