In August, La Trobe University announced that it will become Victoria’s first zero-emissions university. The $75 million initiative, comprising 20 separate projects, will ensure that LTU reaches Net Zero emissions by 2029.
Net Zero carbon emissions, also known as carbon neutrality, are achieved by balancing the amount of carbon released with an equal amount of carbon offset.
The La Trobe Energy Analytics Platform (LEAP) provides the technology pillar for the Net Zero initiative. Designed and implemented by researchers and students from LBS’ Centre for Data Analytics and Cognition (CDAC), the platform will monitor energy consumption in up to 50 smart buildings and make lighting, heating and cooling adjustments in real time to reduce energy consumption.
The creation of a smart building allows the building to ‘think’ for itself in optimising its energy consumption. Building such a system involves Artificial Intelligence, Unsupervised Machine Learning, Data Analytics and Software Development, including CDAC’s own suite of algorithms developed over the past decade. CDAC’s research is internationally renowned and has been applied successfully in industry engagements spanning Health, Transport, Fire and Emergency Services, Sport and Energy. The Centre also hosts a unique blend of research expertise among its staff and researchers, which makes it the ideal candidate to develop such a platform and espouses the concept of a Living Lab that La Trobe University champions.
LEAP Technical Architect (and LBS PhD candidate) Nishan Mills summarised the system as:
Buildings and spaces display distinctive behaviours in energy consumption. The LEAP platform will use available data streams to create digital twins for buildings and spaces in the University environment in order to capture this behavioural profile. This allows the platform to detect, analyse and suggest corrective measures to achieve the most efficient energy consumption across the university.
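As a loose illustration of that behavioural-profile idea, the sketch below flags energy readings that deviate sharply from a building’s typical consumption. The function name, threshold and data are hypothetical, and LEAP’s actual digital-twin models are far more sophisticated; this is only a minimal sketch of the detect-and-analyse step.

```python
from statistics import mean, stdev

def flag_anomalies(readings, threshold=3.0):
    """Flag hourly energy readings (kWh) that deviate from the
    building's typical profile by more than `threshold` standard
    deviations. A simplified, hypothetical illustration only."""
    mu = mean(readings)
    sigma = stdev(readings)
    return [i for i, r in enumerate(readings)
            if sigma > 0 and abs(r - mu) / sigma > threshold]

# a mostly steady profile with one abnormal spike at hour 20
profile = [12.0] * 20 + [60.0] + [12.0] * 10
print(flag_anomalies(profile))  # → [20]
```

A real platform would compare readings against a learned seasonal profile rather than a single mean, but the principle of scoring deviation from expected behaviour is the same.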
LBS hosted the 3rd International Conference on Big Data and Internet of Things (BDIOT 2019) and provided a workshop on deep learning for big data and internet of things (IoT) applications.
About BDIOT 2019
The main purpose of
BDIOT 2019 was to provide an international platform for presenting and
publishing the latest scientific research outcomes related to the topics of big
data and IoT. The rapid advancement and ubiquitous penetration of mobile networks, web-based information creation and sharing, and software-defined networking have enabled sensing, predicting and controlling the physical world with information technology. Every business process can be empowered by these capabilities, and various industries are redesigning their business models and processes around this paradigm.
Rashmika Nawaratne and Achini Adikari provided a
workshop on deep learning for big data
and internet of things (IoT) applications. The
workshop demonstrated how to use deep learning theories in practical
applications such as transport, health and energy. Around 25 participants from
diverse backgrounds, such as IoT, Business, Sports, Data Mining, Computer
Science and Geography, took part in the workshop. Participants came from
countries such as Japan, Germany, China, Thailand, India and Pakistan.
The workshop conveners
Rashmika and Achini are LBS PhD candidates and researchers at our Centre for Data Analytics and Cognition (CDAC). Rashmika is pursuing research on brain-inspired Artificial Intelligence (AI) algorithms. During his PhD, he plans to conceptualise, design and develop a brain-inspired self-learning AI algorithm to comprehend video and IoT data that can be used in application areas such as national security, smart cities and smart homes. Achini is engaged in multiple research projects involving text analysis in public health forums and social media data, with a particular interest in human emotion analysis using self-learning AI. Her PhD focuses on modelling emotions from digital data in social media conversations using novel AI techniques. Prior to their PhDs, both Rashmika and Achini worked as Technical Team Leads at software product engineering organisations.
What is deep learning?
Deep learning is a rapidly maturing artificial intelligence
paradigm in research and practice. It has a formidable evidence base and
increasing potential for applications in big data and IoT environments across
energy, manufacturing, transport, communication and human engagement. According to Rashmika, it is essential to showcase the practical use of
these AI techniques in real-world scenarios rather than focusing only on
theories and concepts.
The workshop aimed to develop essential knowledge of deep learning and
key skills in industrial applications using big data and IoT, and incorporated hands-on
tutorials in Python, using Google Colaboratory and Jupyter Notebook.
The sessions started with exploring the structural
elements of deep learning models, hyper-parameters, and comparison to standard
machine learning algorithms, followed by the theory and application of deep
neural networks (classification), convolutional neural networks (image
processing), and deep recurrent neural networks (time-series prediction).
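To make those structural elements concrete, here is a toy feedforward network in plain Python with hand-set weights and a threshold activation, computing XOR. It is purely illustrative, not the frameworks or models used in the workshop, but it shows the layer-of-weighted-sums structure that deep networks scale up.

```python
def step(z):
    # threshold activation: fires when the weighted sum exceeds zero
    return 1 if z > 0 else 0

def layer(inputs, weights, biases):
    # each neuron: weighted sum of its inputs plus a bias, then activation
    return [step(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

def xor_net(x1, x2):
    # hidden layer: one OR-like neuron and one AND-like neuron
    hidden = layer([x1, x2], [[1, 1], [1, 1]], [-0.5, -1.5])
    # output layer combines them: OR and not AND, i.e. XOR
    return layer(hidden, [[1, -1]], [-0.5])[0]

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, "->", xor_net(a, b))
```

In practice the weights are not hand-set but learned from data, and the hyper-parameters mentioned above (layer sizes, activations, learning rate) control that learning process.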
Participants then attempted hands-on experiments with each technique using a
benchmark dataset, for training, testing and evaluation. Rashmika and Achini
also demonstrated each technique in the context
of separate real-life projects involving big data and IoT data. One of
these projects was vehicular traffic prediction using IoT smart sensors
deployed across arterial road networks; this scenario contains
over 190 million records of traffic data generated by the sensor network.
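As a simplified illustration of time-series prediction on traffic counts, the baseline below forecasts each next value as a moving average over a sliding window. The data and function are invented for illustration, not drawn from the 190-million-record dataset; deep recurrent networks aim to beat exactly this kind of naive baseline.

```python
def moving_average_forecast(series, window=3):
    """Predict each next value as the mean of the previous `window`
    observations -- a naive time-series baseline."""
    preds = []
    for t in range(window, len(series)):
        preds.append(sum(series[t - window:t]) / window)
    return preds

# hypothetical hourly vehicle counts from one road sensor
counts = [100, 110, 120, 130, 140, 150]
print(moving_average_forecast(counts))  # → [110.0, 120.0, 130.0]
```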
After completing the workshop, participants walked away with a solid theoretical
foundation in deep learning, an understanding of when and where to use it, and
the skills to design, implement, validate and deploy deep learning models in
industrial settings. Feedback from participants has been very positive.
“Most of the workshops on deep learning focus on theoretical aspects, but this workshop focused on practical aspects of using deep learning for industry applications on Big Data and IoT.”
“Easy to understand for a beginner. For a person who does not have a background in AI, it was quite easy to capture the essence of what deep learning means and its hype.”
“Was able to understand what deep learning is and completely implement an AI solution for a business problem within 3 hours.”
Data is everywhere and being created almost constantly. This calls for a new way of making sense of it.
With traditional data, the person who wants to use the data has to create it. Although called Artificial Intelligence (AI), traditionally AI needs to be trained with data and known outcomes, so researchers have to build the AI and algorithms to suit the problem and the data. Now data is generated by machines, producing vast amounts of data yet to be interpreted; machines can generate data at rates of up to hundreds of thousands of data points a second. This data is created through media, the cloud, the web, the Internet of Things, sensors and more. It is no longer possible to connect each of these individual data points to objects in the real world, and it is no longer possible to build the AI in advance, because the data is unknown, and so is the problem or outcome.
This new type of data is real-time, online, machine-generated, high-volume and granular, which means we need a new paradigm: one that allows AI to make sense of data collected through, for example, text, images and videos, especially on social media, and to capture feelings, emotions and personality.
Centre for Data Analytics and Cognition Lab
LBS’ Centre for Data Analytics and Cognition (CDAC) specialises in the research and development of cutting-edge artificial intelligence (AI) and machine learning algorithms, and in transforming these into practical tools and technology for business and other applications in advanced analytics. The team consists of big data experts with outstanding research and academic achievements on top of many years of industry experience in finance, telecommunications, IT and business.
CDAC and the new paradigm
One of the key areas CDAC is involved in is building AI that builds itself, called self-structuring AI. Another aspect is unsupervised learning: the AI not only builds itself, it also learns by itself. It does not need to be trained; it can still be trained, but it has the capability to learn on its own. The CDAC team then transforms this research into practical AI building blocks that can be used to serve the community.
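Unsupervised learning of the kind described, finding structure in data without any labels or known outcomes, can be sketched with a minimal one-dimensional k-means clustering. This is a generic textbook illustration with invented data, not CDAC’s self-structuring algorithms.

```python
import random

def kmeans_1d(points, k=2, iters=20, seed=0):
    """Minimal 1-D k-means: groups unlabelled values into k clusters
    with no training labels. Illustrative sketch only."""
    random.seed(seed)
    centres = random.sample(points, k)
    for _ in range(iters):
        # assign each point to its nearest centre
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centres[i]))
            clusters[nearest].append(p)
        # move each centre to the mean of its assigned points
        centres = [sum(c) / len(c) if c else centres[i]
                   for i, c in enumerate(clusters)]
    return sorted(centres)

data = [1.0, 1.2, 0.8, 9.0, 9.5, 8.7]  # two obvious groups, no labels
print(kmeans_1d(data))  # two centres, near 1.0 and 9.07
```

The algorithm discovers the two groups purely from the shape of the data, which is the essence of learning without being trained on known outcomes.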
The great news is that collaboration is already taking place with industry, governments and academia.
Contact or visit
For more information visit the Centre for Data Analytics and Cognition.
Donald Whitehead Building, Room 301, Level 3, La Trobe University, Bundoora, VIC 3086