Links

The Intelligent Systems Research Laboratory (ISR, University of Reading)

The Intelligent Systems Research (ISR) Laboratory operates as a multi-disciplinary research centre at the School of Systems Engineering, University of Reading, with a staff of around 20 research scientists, software engineers and project management staff at Masters, Post-Doctoral and Professorial level. ISR comprises five Technology Teams: i) Virtual-user and Living-Laboratory based, user-driven integrated requirements prioritisation and dynamic usability evaluation methodologies (UI-REF); ii) Simulation and Modelling, agent behaviour and risk modelling, and intelligent information management; iii) Computer network security and trustworthy Internet of Things (IoT) architectures; iv) Multi-modal real-time Robo-Humatics© and Affective Human-Computer Interaction; v) Multi-modal (multi)media information retrieval and semantic media technologies.

Recent projects at ISR have included advanced personalisation, accessibility design and profiling management; semantic technologies (ontological networks, virtualisation-agents-semantics); the Trustworthy Future Internet of People, Things & Services – IoPTS (authentication, security, privacy and trust modelling – Mobi-PETS GRID); service-oriented architectures; behaviour modelling; social criminal networks modelling; semantic video interpretation; pattern discovery and multi-view data mining; semantic workflow integration and decision support; smart transcoding for dynamic media adaptation; audio analysis and semantic parametric music representation and labelling; automated mood-responsive affective music composition; multi-modal semantic-collateral media indexing and retrieval (DREAM, KAIFIA); testability frameworks; dynamic usability relationships modelling and evaluation (UI-REF); user-centred requirements engineering and co-design; HCI-HRI for advanced personalisation, particularly for mobile intimate systems to support workers and users; embedded systems design; and Lab-on-Chip and sensor technologies for real-time adaptive systems exploiting FPGA and semantic integration, as applied to network security intrusion prevention and detection, or to tele-healthcare and energy efficiency management systems (FASTMATCH); as well as Smart Surveillance and Privacy-Aware Security (MOSAIC and VIDEOSENSE).


The Coordinator:

Prof. Atta Badii

As Founding Director of ISR, Atta is assisted by a number of senior research scientists, RTD and project managers. He is a high-ranking professor at the School of Systems Engineering, University of Reading, where he holds the Chair of Secure Pervasive Technologies, and has multi-disciplinary academic and industrial research experience in the fields of Distributed Intelligent and Multi-modal Interactive Systems, Pattern Recognition, Security & Trust Architectures, and Semantic Workflow and Knowledge Integration. He has contributed to 25 collaborative projects to date and has served as the Scientific and Technical Leader of several projects at both national and international levels; he has successfully coordinated several UK/EU-funded projects (e.g. FastMatch, CompanionAble, Dream, MOSAIC, VideoSense), pioneered several paradigms in user-centred assistive-ambient technologies, and has around 170 publications to date. He has served on editorial and research steering boards as coordinator, technical leader or invited expert, e.g. as Chair of the Security Architectures and Virtualisation Taskforce of the European Road Map Project SECURIST and as Chair of VideoSense, the European Video-Analytics Network of Excellence. Atta has made fundamental contributions to pushing forward the frontiers of research in Security Context Representation (e.g. the Hydra LinkSmart Technology) and Mitigation (e.g. the FastMatch Next Generation IDS-IPS Architectures), Access Security and Privacy Enhancing Technologies (e.g. Mobi-PETS-GRID, Hydra), Ontology-based Semantic Integration, User-centred Co-design and Integrated Requirements Prioritisation and Usability Evaluation (e.g. UI-REF), Human-Computer-Robot Interaction (e.g. CompanionAble, CORBYS), Multi-modal Communications Control (MoveOn), Advanced Affective-Interactive Interfaces (CALLAS), Dynamic Media Adaptation (Axmedis, MoveOn), Semantic Workflow Integration (Axmedis, 2020-3D Media), Advanced Multi-modal Media Indexing and Retrieval (DREAM, Content Safari), Automated Affective Music Composition (CALLAS), Semantic Music Representation & Search (I-Maestro, CC-MOLE, SoundScape), and framework architectures for Man-in-the-Loop Assistive-Interactive Systems (CORBYS).