Search results for: integrated models of reading comprehension
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10549


9859 Automated Detection of Targets and Retrieval of the Corresponding Analytics Using Augmented Reality

Authors: Suvarna Kumar Gogula, Sandhya Devi Gogula, P. Chanakya

Abstract:

Augmented reality is defined as the overlay of digital (computer-generated) information, such as images, audio, video, and 3D models, onto the real-time environment. It can be thought of as a blend between the completely synthetic and the completely real. Augmented reality offers scope in a wide range of industries, such as manufacturing, retail, gaming, advertising, and tourism, and brings new dimensions to the modern digital world. Because it overlays content on the real world, it enhances users' knowledge by blending digital content with their surroundings. In this application, we integrated augmented reality with data analytics and with the cloud, so that virtual content is generated on the basis of the data present in the database; we used marker-based augmented reality, where every marker is stored in the database with a corresponding unique ID. This application can be used in a wide range of industries for different business processes, but in this paper, we focus mainly on the marketing industry, helping customers gain knowledge about products on the market, in particular their prices, customer feedback, quality, and other benefits. The application also provides better market-strategy information for marketing managers, who obtain data about stocks, sales, customer response to the product, etc. In this paper, we also include reports on the feedback gathered from different people after the demonstration, and finally, we present the future scope of augmented reality in different business processes through integration with new technologies such as cloud, big data, and artificial intelligence.

Keywords: augmented reality, data analytics, catch room, marketing and sales

Procedia PDF Downloads 237
9858 Air Quality Analysis Using Machine Learning Models Under Python Environment

Authors: Salahaeddine Sbai

Abstract:

Air quality analysis using machine learning models is a method employed to assess and predict air pollution levels. This approach leverages the capability of machine learning algorithms to analyze vast amounts of air quality data and extract valuable insights. By training these models on historical air quality data, they can learn patterns and relationships between factors such as weather conditions, pollutant emissions, and geographical features. The trained models can then be used to predict air quality levels in real time or to forecast future pollution levels. This application of machine learning in air quality analysis enables policymakers, environmental agencies, and the general public to make informed decisions regarding health, environmental impact, and mitigation strategies. By understanding the factors influencing air quality, interventions can be implemented to reduce pollution levels, mitigate health risks, and enhance overall air quality management. Climate change is having significant impacts on Morocco, affecting various aspects of the country's environment, economy, and society. In this study, we use several machine learning models under a Python environment to predict and analyze air quality change over northern Morocco and to evaluate the impact of climate change on agriculture.
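
A workflow like the one described, training a regressor on historical meteorological and emission features to predict a pollutant level, can be sketched as follows. The feature set and data below are synthetic stand-ins, not the study's actual dataset:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
# Illustrative features: temperature (C), wind speed (m/s), humidity (%), emission proxy
X = np.column_stack([
    rng.uniform(5, 40, n),     # temperature
    rng.uniform(0, 15, n),     # wind speed (disperses pollution)
    rng.uniform(20, 95, n),    # relative humidity
    rng.uniform(0, 1, n),      # emission/traffic proxy
])
# Synthetic PM2.5-like target: rises with emissions, falls with wind
y = 20 + 60 * X[:, 3] - 2.5 * X[:, 1] + rng.normal(0, 3, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"R^2 on held-out data: {model.score(X_te, y_te):.2f}")
```

The same fit/predict pattern applies whichever scikit-learn regressor is chosen; real studies would substitute monitored station data for the synthetic arrays.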

Keywords: air quality, machine learning models, pollution, pollutant emissions

Procedia PDF Downloads 91
9857 Advancement of Oscillating Water Column Wave Energy Technologies through Integrated Applications and Alternative Systems

Authors: S. Doyle, G. A. Aggidis

Abstract:

Wave energy converter technologies continue to show good progress in worldwide research. One of the most researched, the Oscillating Water Column (OWC), is arguably among the most popular categories of converter technology due to its robustness, simplicity and versatility. However, the versatility of the OWC is still largely untapped, with most deployments following similar trends with respect to applications and operating systems. As the competitiveness of the energy market increases, so does the demand for wave energy technologies to be innovative. For existing wave energy technologies, this requires identifying areas in which to diversify for lower costs of energy, with respect to applications and synergies or integrated systems. This paper provides a review of all OWC systems integrated into alternative applications, past and present. The aspects of and variations in their design, deployment and system operation are discussed. Particular focus is given to Multi-OWCs (M-OWCs) and their great potential to increase capture on a larger scale, especially in synergy applications. These steps need to be taken because progression to date shows that stand-alone, single-function devices are not economical. Findings reveal that the trend of development is moving toward these integrated applications in order to reduce the Levelised Cost of Energy (LCOE), and will continue in this direction in efforts to make wave energy a competitive option in the renewable energy mix.

Keywords: wave energy converter, oscillating water column, ocean energy, renewable energy

Procedia PDF Downloads 134
9856 A Review of Literature on Theories of Construction Accident Causation Models

Authors: Samuel Opeyemi Williams, Razali Bin Adul Hamid, M. S. Misnan, Taki Eddine Seghier, D. I. Ajayi

Abstract:

Construction sites are characterized by occupational risks. A review of the literature on construction accidents reveals that many theories have been propounded over the years, coupled with numerous models developed by different proponents at different times. Accidents are unplanned events, prominent on construction sites, involving materials, objects and people, with attendant damage, losses and injuries. Models were developed to investigate the causation of accidents with the aim of preventing their occurrence. Some of these theories have, however, been criticized; the Heinrich domino theory, most notably, is faulted for placing much of the blame on operatives rather than on management. The purpose of this paper is to set out the significant construction accident causation theories and models so that they are better understood, enabling construction stakeholders to identify potential hazards on construction sites, as all stakeholders have significant roles to play in preventing accidents. Accidents are preventable; hence, understanding the risk factors of accidents and the causation theories paves the way for their prevention. However, findings reveal that gaps remain in the existing models, and it is recommended that further research be carried out to develop models that help maintain zero accidents on construction sites.

Keywords: domino theory, construction site, site safety, accident causation model

Procedia PDF Downloads 304
9855 Modelling and Simulation of Diffusion Effect on the Glycol Dehydration Unit of a Natural Gas Plant

Authors: M. Wigwe, J. G Akpa, E. N Wami

Abstract:

Mathematical models of the absorber of a glycol dehydration facility were developed using the principles of conservation of mass and energy. The models predict the variation of the water content of the gas (in mole fraction) and the variation of gas and liquid temperatures across the packing height, and contain contributions from both bulk and diffusion flows. The effect of diffusion on the process occurring in the absorber was studied in this work. The models were validated using initial conditions from the plant data of Company W's TEG unit in Nigeria. The results obtained showed that the effect of diffusion was noticeable between z = 0 and z = 0.004 m. A deviation from plant data of 0% was observed for the gas water content at a residence time of 20 seconds, at z = 0.004 m. Similarly, deviations of 1.584% and 2.844% were observed for the gas and TEG temperatures, respectively.
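
The bulk-plus-diffusion transport the authors describe can be illustrated with a steady one-dimensional advection-diffusion-absorption balance for the gas water content along the packing height. All coefficient values below are arbitrary illustrative numbers, not the plant's data:

```python
import numpy as np

# Steady 1-D balance: u dC/dz = D d2C/dz2 - k (C - C_eq)
# C: gas water content (mole fraction), u: gas velocity, D: diffusion coefficient,
# k: absorption rate into the glycol, C_eq: equilibrium water content (all assumed)
u, D, k, C_eq, C_in = 0.5, 1e-3, 2.0, 0.0002, 0.002
L, N = 0.02, 201                    # packing height (m) and grid points
z = np.linspace(0.0, L, N)
dz = z[1] - z[0]

A = np.zeros((N, N)); b = np.zeros(N)
A[0, 0] = 1.0; b[0] = C_in          # inlet: known water content
for i in range(1, N - 1):           # interior nodes: central differences
    A[i, i - 1] = D / dz**2 + u / (2 * dz)
    A[i, i]     = -2 * D / dz**2 - k
    A[i, i + 1] = D / dz**2 - u / (2 * dz)
    b[i] = -k * C_eq
A[-1, -1] = 1.0; A[-1, -2] = -1.0   # outlet: zero-gradient condition

C = np.linalg.solve(A, b)
print(f"water content drops from {C[0]:.4f} to {C[-1]:.4f} across the packing")
```

The profile decays from the inlet value toward equilibrium, which is the qualitative behaviour the validated plant models reproduce quantitatively.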

Keywords: separations, absorption, simulation, dehydration, water content, triethylene glycol

Procedia PDF Downloads 499
9854 Comparative Study of Experimental and Theoretical Convective and Evaporative Coefficients for Two Model Distillers

Authors: Khaoula Hidouri, Ali Benhmidene, Bechir Chouachi

Abstract:

The purification of brackish water and seawater is becoming a necessity rather than a choice in the face of demographic and industrial growth, especially in third-world countries. Two models are used in this work: a simple solar still and a simple solar still coupled with a heat pump. In this research, the water productivity of the Simple Solar Distiller (SSD) and the Simple Solar Distiller with Hybrid Heat Pump (SSDHP) was determined as a function of orientation, the use of a heat pump, and a single or double glass cover. Productivity can exceed 1.2 L/m²h for the SSDHP model and 0.5 L/m²h for the SSD model. The global efficiency determined for the SSD and SSDHP models is 30% and 50%, respectively. The internal efficiency attained 35% for the SSD and 60% for the SSDHP model. The convective heat transfer coefficient attained 0.5 W/m²°C and 2.5 W/m²°C for the SSD and SSDHP models, respectively.
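
The reported productivity and efficiency figures can be related through the usual still-efficiency definition, eta = m·h_fg / (G·A·t): distillate latent-heat output divided by solar input. The irradiance and area below are assumed values, used only to check order-of-magnitude consistency with the SSD's 0.5 L/m²h and ~30%:

```python
# Illustrative consistency check, not the authors' measured data
h_fg = 2.26e6   # latent heat of vaporization of water, J/kg
G = 1000.0      # assumed solar irradiance, W/m^2
A = 1.0         # assumed still aperture area, m^2
m_dot = 0.5     # SSD productivity from the abstract, kg/(m^2 h)

eta = (m_dot * A * h_fg) / (G * A * 3600.0)   # distillate energy / solar input per hour
print(f"global efficiency ~ {eta:.0%}")        # -> global efficiency ~ 31%
```

The result lands close to the 30% global efficiency reported for the SSD; the SSDHP's higher figure additionally reflects the heat pump's energy input.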

Keywords: productivity, efficiency, convective heat coefficient, SSD model, SSDHP model

Procedia PDF Downloads 213
9853 Evaluating the Effectiveness of Animated Videos in Learning Economics

Authors: J. Chow

Abstract:

In laboratory settings, this study measured and reported the effects on learning microeconomics of undergraduate students watching animated videos, as compared with the effectiveness of reading written texts. The study describes an experiment on learning microeconomics in higher education using two different types of learning materials, reporting the effectiveness of watching animated videos and of reading written texts. Undergraduate students were randomly assigned to either a 'video group' or a 'text group' in the experiment. Previously validated multiple-choice questions on fundamental concepts of microeconomics were administered. Both groups showed improvement between the pre-test and the post-test. The experience of learning using text and video materials was also assessed. After controlling for student characteristics, the analyses showed that both types of materials yielded a comparable level of perceived learning experience. The effect size and statistical significance of these results supported the hypothesis that animated video is an effective alternative to text materials as a learning tool for students. The findings suggest that such animated videos may support the teaching of microeconomics in higher education.
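
A pre-test/post-test comparison between two randomly assigned groups, as described here, is commonly analysed on gain scores with a two-sample test and an effect size. The scores below are synthetic, not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic gain scores (post - pre) for 40 students per group (illustrative)
gain_video = rng.normal(loc=12.0, scale=5.0, size=40)   # hypothetical video group
gain_text = rng.normal(loc=9.0, scale=5.0, size=40)     # hypothetical text group

t, p = stats.ttest_ind(gain_video, gain_text)
# Cohen's d as a simple standardized effect-size measure
pooled_sd = np.sqrt((gain_video.var(ddof=1) + gain_text.var(ddof=1)) / 2)
d = (gain_video.mean() - gain_text.mean()) / pooled_sd
print(f"t = {t:.2f}, p = {p:.3f}, d = {d:.2f}")
```

Controlling for student characteristics, as the study does, would replace the plain t-test with a regression of post-test score on group plus covariates.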

Keywords: animated videos for education, laboratory experiment, microeconomics education, undergraduate economics education

Procedia PDF Downloads 146
9852 Hydrological Modeling of Watersheds Using the Only Corresponding Competitor Method: The Case of M’Zab Basin, South East Algeria

Authors: Oulad Naoui Noureddine, Cherif ELAmine, Djehiche Abdelkader

Abstract:

Water resources management spans several disciplines; modeling the rainfall-runoff relationship is the most important of them for preventing natural hazards. Several models exist for studying the rainfall-runoff relationship in watersheds; however, the majority are not applicable to all basins of the world. In this study, a new stochastic method called the Only Corresponding Competitor (OCC) method was used for the hydrological modeling of the M'Zab watershed (south-east Algeria) to adapt a few empirical models to any hydrological regime. The results obtained open up a number of avenues in which it would be interesting to experiment with hydrological models that, collectively or separately, improve the data of a catchment through the OCC method.

Keywords: modelling, optimization, rainfall-runoff relationship, empirical model, OCC

Procedia PDF Downloads 265
9851 Rapid Building Detection in Population-Dense Regions with Overfitted Machine Learning Models

Authors: V. Mantey, N. Findlay, I. Maddox

Abstract:

The quality and quantity of global satellite data have been increasing exponentially in recent years as spaceborne systems become more affordable and the sensors themselves become more sophisticated. This is a valuable resource for many applications, including disaster management and relief. However, while more information can be valuable, the volume of data available is impossible to manually examine. Therefore, the question becomes how to extract as much information as possible from the data with limited manpower. Buildings are a key feature of interest in satellite imagery with applications including telecommunications, population models, and disaster relief. Machine learning tools are fast becoming one of the key resources to solve this problem, and models have been developed to detect buildings in optical satellite imagery. However, by and large, most models focus on affluent regions where buildings are generally larger and constructed further apart. This work is focused on the more difficult problem of detection in populated regions. The primary challenge with detecting small buildings in densely populated regions is both the spatial and spectral resolution of the optical sensor. Densely packed buildings with similar construction materials will be difficult to separate due to a similarity in color and because the physical separation between structures is either non-existent or smaller than the spatial resolution. This study finds that training models until they are overfitting the input sample can perform better in these areas than a more robust, generalized model. An overfitted model takes less time to fine-tune from a generalized pre-trained model and requires fewer input data. The model developed for this study has also been fine-tuned using existing, open-source, building vector datasets. This is particularly valuable in the context of disaster relief, where information is required in a very short time span. 
Leveraging existing datasets means that little to no manpower or time is required to collect data in the region of interest. The training period itself is also shorter for smaller datasets. Requiring less data means that only a few quality areas are necessary, and so any weaknesses or underpopulated regions in the data can be skipped over in favor of areas with higher quality vectors. In this study, a landcover classification model was developed in conjunction with the building detection tool to provide a secondary source to quality check the detected buildings. This has greatly reduced the false positive rate. The proposed methodologies have been implemented and integrated into a configurable production environment and have been employed for a number of large-scale commercial projects, including continent-wide DEM production, where the extracted building footprints are being used to enhance digital elevation models. Overfitted machine learning models are often considered too specific to have any predictive capacity. However, this study demonstrates that, in cases where input data is scarce, overfitted models can be judiciously applied to solve time-sensitive problems.
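
The counter-intuitive claim above, that a deliberately overfitted model can be preferable when labelled data are scarce and the deployment area matches the training area, can be shown in miniature with an unconstrained classifier driven to zero training error. The sketch uses a decision tree rather than the Mask R-CNN named in the keywords, purely to stay self-contained:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))
# Noisy binary labels driven by the first two features (synthetic stand-in data)
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.8, 300) > 0).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# max_depth=None lets the tree grow until every training sample is memorised
tree = DecisionTreeClassifier(max_depth=None, random_state=0).fit(X_tr, y_tr)
print(f"train accuracy: {tree.score(X_tr, y_tr):.2f}")   # 1.00 -- fully overfitted
print(f"test  accuracy: {tree.score(X_te, y_te):.2f}")   # lower on held-out data
```

The gap between the two scores is the generalisation cost the authors accept in exchange for faster fine-tuning and a smaller labelled dataset in the target region.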

Keywords: building detection, disaster relief, Mask R-CNN, satellite mapping

Procedia PDF Downloads 169
9850 Understanding Music through the Framework of Feminist Confessional Literary Criticism: Heightening Audience Identification and Prioritising the Female Voice

Authors: Katharine Pollock

Abstract:

Feminist scholars assert that a defining aspect of feminist confessional literature is that it expresses both an individual and a communal identity, one predicated on the commonly shared aspects of female experience. Reading feminist confessional literature in this way accommodates a plurality of readerly experiences and textual interpretations. It affirms the individual whilst acknowledging the experiences that bind women together, and refuses traditional objective criticism. It invites readers to see themselves reflected in the text and encourages them to share their own stories. Similarly, music which communicates women's personal experience, fictive or not, expresses a dual identity. There is an inherent risk in imposing a confessional reading upon a musical or literary text. Understanding music as multivocal in the same way as confessional literature negates this patriarchal tendency and allows listeners to engage with both the subjective and collective aspects of a text. By hearing their own stories reflected in the music, listeners engage in an ongoing dialogic process in which female stories are prioritised. This refuses patriarchal silencing and ensures a diversity of female voices. To demonstrate the veracity of these claims, literary criticism is applied to Lily Allen's music and her memoir, My Thoughts Exactly.

Keywords: confession, female, feminist, literature, music

Procedia PDF Downloads 154
9849 Lumped Parameter Models for Numerical Simulation of the Dynamic Response of Hoisting Appliances

Authors: Candida Petrogalli, Giovanni Incerti, Luigi Solazzi

Abstract:

This paper describes three lumped parameter models for the study of the dynamic behaviour of a boom crane. The models proposed here allow evaluating the fluctuations of the load arising from the rope and structure elasticity and from the type of motion command imposed by the winch. Calculation software was developed in order to determine the actual acceleration of the lifted mass and the dynamic overload during the lifting phase. Some application examples are presented, with the aim of showing the correlation between the magnitude of the stress and the type of motion command employed.
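
A single-degree-of-freedom version of such a lumped model, a load mass on an elastic rope whose upper end is suddenly driven by the winch, already reproduces the dynamic overload the authors compute. All parameter values below are illustrative, not taken from the paper:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters: 1 t load, rope stiffness 4e5 N/m, light damping
m, k, c, g = 1000.0, 4.0e5, 2.0e3, 9.81
v_w = 0.5   # winch hoisting speed applied as a step at t = 0, m/s

# State y = [s, ds]: rope stretch and stretch rate.
# Once the winch runs at constant speed: m * s'' = m*g - k*s - c*s'
def rhs(t, y):
    s, ds = y
    return [ds, g - (k * s + c * ds) / m]

s0 = m * g / k                                   # static stretch before hoisting
sol = solve_ivp(rhs, (0.0, 2.0), [s0, v_w], max_step=1e-3)
F = k * sol.y[0] + c * sol.y[1]                  # instantaneous rope force
overload = F.max() / (m * g)                     # peak force / static weight
print(f"dynamic overload factor: {overload:.2f}")
```

Here the abrupt velocity command roughly doubles the rope force relative to the static weight, which is exactly the command-dependent stress correlation the paper studies with its richer multi-mass models.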

Keywords: crane, dynamic model, overloading condition, vibration

Procedia PDF Downloads 575
9848 Advances in Artificial Intelligence Using Speech Recognition

Authors: Khaled M. Alhawiti

Abstract:

This research study presents a retrospective survey of speech recognition systems and artificial intelligence. Speech recognition has become one of the most widely used technologies, as it offers a great opportunity to interact and communicate with automated machines. Precisely, it can be affirmed that speech recognition facilitates its users and helps them perform their daily routine tasks in a more convenient and effective manner. This research presents recent technological advancements associated with artificial intelligence. Recent research has revealed that decoding speech remains the principal challenge in speech recognition. To address it, researchers have developed different statistical models; some of the most prominent are the acoustic model (AM), the language model (LM), the lexicon model, and hidden Markov models (HMMs). This research helps in understanding these statistical models of speech recognition. Researchers have also formulated different decoding methods, used for realistic decoding tasks and constrained artificial languages; these include pattern recognition, acoustic-phonetic, and artificial-intelligence approaches. Artificial intelligence has been recognized as the most efficient and reliable of the methods used in speech recognition.
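
Decoding with an HMM, one of the statistical models listed, reduces to finding the most probable hidden state sequence given the observations, usually via the Viterbi algorithm. A minimal toy decoder follows; the two-state "phone class" model is invented purely for illustration, not a real acoustic model:

```python
import math

# Toy HMM: hidden phone classes emit observed acoustic symbols (all values illustrative)
states = ["vowel", "consonant"]
start = {"vowel": 0.6, "consonant": 0.4}
trans = {"vowel": {"vowel": 0.5, "consonant": 0.5},
         "consonant": {"vowel": 0.7, "consonant": 0.3}}
emit = {"vowel": {"a": 0.7, "t": 0.1, "s": 0.2},
        "consonant": {"a": 0.1, "t": 0.5, "s": 0.4}}

def viterbi(obs):
    # V[state] = best log-probability of any path ending in state; back stores the paths
    V = {s: math.log(start[s]) + math.log(emit[s][obs[0]]) for s in states}
    back = {s: [s] for s in states}
    for o in obs[1:]:
        V_new, back_new = {}, {}
        for s in states:
            prev = max(states, key=lambda p: V[p] + math.log(trans[p][s]))
            V_new[s] = V[prev] + math.log(trans[prev][s]) + math.log(emit[s][o])
            back_new[s] = back[prev] + [s]
        V, back = V_new, back_new
    return back[max(states, key=lambda s: V[s])]

print(viterbi(["t", "a", "s"]))   # -> ['consonant', 'vowel', 'consonant']
```

Production recognisers work the same way at vastly larger scale, with Gaussian-mixture or neural emission probabilities and language-model transition scores.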

Keywords: speech recognition, acoustic phonetic, artificial intelligence, hidden Markov models (HMM), statistical models of speech recognition, human machine performance

Procedia PDF Downloads 478
9847 An Integrated Framework for Wind-Wave Study in Lakes

Authors: Moien Mojabi, Aurelien Hospital, Daniel Potts, Chris Young, Albert Leung

Abstract:

The wave analysis is an integral part of the hydrotechnical assessment carried out during the permitting and design phases for coastal structures, such as marinas. This analysis aims to quantify: i) the suitability of the coastal structure design against the Small Craft Harbour wave tranquility safety criterion; ii) potential environmental impacts of the structure (e.g., effects on waves, flow, and sediment transport); iii) mooring and dock design; and iv) requirements set by regulatory agencies (e.g., WSA section 11 application). While a complex three-dimensional hydrodynamic modelling approach can be applied on large-scale projects, the need for an efficient and reliable wave analysis method suitable for smaller-scale marina projects was identified. As a result, Tetra Tech has developed and applied an integrated analysis framework (hereafter the TT approach), which takes advantage of state-of-the-art numerical models while preserving a level of simplicity that fits smaller-scale projects. The present paper describes the TT approach and highlights the key advantages of using this integrated framework in lake marina projects. The core of the methodology is the integration of wind, water level, bathymetry, and structure geometry data. To respond to the needs of specific projects, several add-on modules have been added to this core. The main advantages of this method over simplified analytical approaches are: i) accounting for the proper physics of the lake by modelling the entire lake (capturing the real lake geometry) instead of using a simplified fetch approach; ii) providing a more realistic representation of the waves by modelling random waves instead of monochromatic waves; iii) modelling wave-structure interaction (e.g., wave transmission/reflection for floating structures and piles, among others); iv) accounting for wave interaction with the lakebed (e.g., bottom friction, refraction, and breaking); v) providing the inputs for flow and sediment transport assessment at the project site; vi) taking into consideration historical and geographical variations of the wind field; and vii) independence from the scale of the reservoir under study. Overall, in comparison with simplified analytical approaches, this integrated framework provides a more realistic and reliable estimation of wave parameters (and their spatial distribution) in lake marinas, leading to a realistic hydrotechnical assessment accessible to any project size, from the development of a new marina to marina expansion and pile replacement. Tetra Tech has successfully utilized this approach for many years in the Okanagan area.

Keywords: wave modelling, wind-wave, extreme value analysis, marina

Procedia PDF Downloads 84
9846 Optimization and Simulation Models Applied in Engineering Planning and Management

Authors: Abiodun Ladanu Ajala, Wuyi Oke

Abstract:

Mathematical simulation and optimization models packaged within interactive computer programs provide a common way for planners and managers to predict the behaviour of any proposed water resources system design or management policy before it is implemented. Modeling is a principal technique for predicting the behaviour of proposed infrastructure designs or management policies, and models can be developed and used to help identify the specific alternative plans that best meet stated objectives. This study discusses various types of models, their development, architecture, data requirements, and applications in the field of engineering. It also outlines the advantages and limitations of each of the optimization and simulation models presented. The techniques explored in this review include dynamic programming, linear programming, fuzzy optimization, evolutionary algorithms, and, finally, artificial intelligence techniques. Previous studies carried out using some of these techniques were reviewed, and most of the results from different studies showed that optimization and simulation indeed provide viable alternatives and predictions, which form a basis for decision-making in building engineering structures as well as in engineering planning and management.
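
Of the techniques reviewed, linear programming is the easiest to demonstrate end to end. A toy water-allocation LP follows; the objective coefficients and constraint limits are invented for illustration:

```python
from scipy.optimize import linprog

# Maximise benefit 5*x1 + 4*x2 from allocating water to two users,
# subject to a total-supply limit and a conveyance-capacity limit.
# linprog minimises, so the objective is negated.
c = [-5.0, -4.0]
A_ub = [[1.0, 1.0],    # x1 + x2 <= 10   (total supply)
        [2.0, 1.0]]    # 2*x1 + x2 <= 14 (conveyance capacity)
b_ub = [10.0, 14.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)   # optimal allocation and total benefit
```

The optimum sits at the vertex x = (4, 6) with benefit 44; dynamic programming, fuzzy, and evolutionary methods reviewed in the paper address the nonlinear and uncertain cases this simple formulation cannot.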

Keywords: linear programming, mutation, optimization, simulation

Procedia PDF Downloads 590
9845 Exploring Students’ Self-Evaluation on Their Learning Outcomes through an Integrated Cumulative Grade Point Average Reporting Mechanism

Authors: Suriyani Ariffin, Nor Aziah Alias, Khairil Iskandar Othman, Haslinda Yusoff

Abstract:

An Integrated Cumulative Grade Point Average (iCGPA) is a mechanism and strategy for ensuring that the curriculum of an academic programme is constructively aligned to the expected learning outcomes, with student performance based on the attainment of those learning outcomes reported objectively in a spider web. Much effort and time have been spent developing a viable mechanism and training academics to utilize the platform for reporting. The question is: how well do learners conceive the idea of their achievement via iCGPA, and have quality learner attributes been nurtured through the iCGPA mechanism? This paper presents the architecture of an integrated CGPA mechanism purported to address a holistic evaluation, from the evaluation of course learning outcomes to the attainment of aligned programme learning outcomes. The paper then discusses the students' understanding of the mechanism and their evaluation of their achievement from the generated spider web. A set of questionnaires was distributed to a group of students with iCGPA reporting, and frequency analysis was used to compare students' perspectives on their performance. In addition, the questionnaire explored how they conceive the idea of an integrated, holistic report and how it generates their motivation to improve. The iCGPA group was found to be receptive to what they had achieved throughout their study period. They agreed that the achievement level generated from their spider web allows them to develop interventions and enhance the programme learning outcomes before they graduate.

Keywords: learning outcomes attainment, iCGPA, programme learning outcomes, spider web, iCGPA reporting skills

Procedia PDF Downloads 208
9844 Return to Work after a Mental Health Problem: Analysis of Two Different Management Models

Authors: Lucie Cote, Sonia McFadden

Abstract:

Mental health problems in the workplace are currently one of the main causes of absence. Research has highlighted the importance of a collaborative process involving the stakeholders in the return-to-work process and has established best management practices to ensure a successful return to work. However, very few studies have specifically explored the combination of various management models and determined whether they could satisfy the needs of the stakeholders. The objective of this study is to analyze two models for managing the return to work, the 'medical-administrative' model and the 'support of the worker' model, in order to understand the actions and actors involved in each. The study also aims to explore whether these models meet the needs of the actors involved in managing the return to work. A qualitative case study was conducted in a Canadian federal organization. Abundant internal documentation and semi-structured interviews with six managers, six workers, and four human resources professionals involved in managing the files of employees returning to work after a mental health problem provided a complete picture of the return-to-work management practices used in this organization. Triangulation of these data facilitated the examination of the benefits and limitations of each approach. The results suggest that the management actions for employee return to work from both the 'support of the worker' and 'medical-administrative' models are compatible and can meet the needs of the actors involved in the return to work. More research is needed to develop a structured model integrating the best practices of the two approaches to ensure a successful return to work.

Keywords: return to work, mental health, management models, organizations

Procedia PDF Downloads 212
9843 Effect of Traffic Volume and Its Composition on Vehicular Speed under Mixed Traffic Conditions: A Kriging Based Approach

Authors: Subhadip Biswas, Shivendra Maurya, Satish Chandra, Indrajit Ghosh

Abstract:

The use of speed prediction models sometimes appears a feasible alternative to laborious field measurement, particularly when field data cannot fulfill the designer's requirements. However, developing speed models is a challenging task, especially in the context of developing countries like India, where vehicles with diverse static and dynamic characteristics use the same right of way without any segregation; here, traffic composition plays a significant role in determining vehicular speed. The present research examines the effects of traffic volume and its composition on vehicular speed under mixed traffic conditions. Classified traffic volume and speed data were collected from geometrically identical six-lane divided arterials in New Delhi. Based on these field data, speed prediction models were developed for each vehicle category using the Kriging approximation technique, an alternative to commonly used regression. The models were validated against the data set kept aside earlier for validation. The predicted speeds showed a great deal of agreement with the observed values, and the models outperform all other existing speed models. Finally, the proposed models were utilized to evaluate the effect of traffic volume and its composition on speed.
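
Kriging is equivalent to Gaussian-process regression, so the approach can be sketched with scikit-learn. The traffic observations below are synthetic stand-ins for the New Delhi field data, and the feature set is a simplification:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
# Features: total volume (veh/h) and heavy-vehicle share; target: car speed (km/h)
volume = rng.uniform(500, 4000, 80)
heavy_share = rng.uniform(0.0, 0.4, 80)
speed = 65 - 0.008 * volume - 30 * heavy_share + rng.normal(0, 1.5, 80)  # synthetic

X = np.column_stack([volume / 4000.0, heavy_share])   # scale features to similar ranges
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3) + WhiteKernel(),
                              normalize_y=True).fit(X, speed)

pred, sd = gp.predict(np.array([[2000 / 4000.0, 0.2]]), return_std=True)
print(f"predicted speed: {pred[0]:.1f} km/h (+/- {sd[0]:.1f})")
```

Unlike ordinary regression, the Kriging predictor also returns a standard deviation at each query point, which is useful for judging where additional field measurement would pay off.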

Keywords: speed, Kriging, arterial, traffic volume

Procedia PDF Downloads 353
9842 Statistical and Analytical Comparison of GIS Overlay Modelings: An Appraisal on Groundwater Prospecting in Precambrian Metamorphics

Authors: Tapas Acharya, Monalisa Mitra

Abstract:

Overlay modeling is the most widely used conventional analysis in spatial decision support systems. It requires a set of themes with different weightages, computed in various manners, which together give a resultant input for further integrated analysis. Despite its popularity, it gives inconsistent and erroneous results for similar inputs when processed through different GIS overlay techniques. This study compares and analyses the differences in the outputs of different overlay methods on a GIS platform, using the same set of themes for the Precambrian metamorphics, to assess groundwater prospects in Precambrian metamorphic rocks. The objective of the study is to identify the most suitable overlay method for groundwater prospecting in older Precambrian metamorphics. Seven input thematic layers (slope, Digital Elevation Model (DEM), soil thickness, lineament intersection density, average groundwater table fluctuation, stream density, and lithology) were used in the fuzzy overlay, weighted overlay, and weighted sum overlay models to delineate suitable groundwater prospective zones. Spatial concurrence analysis with the high-yielding wells of the study area and statistical comparison of the outputs of the overlay models using RStudio reveal that the weighted overlay model is the most efficient GIS overlay model for delineating groundwater prospecting zones in Precambrian metamorphic rocks.
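
Computationally, the weighted overlay the authors favour is a weighted sum of reclassified rasters. A minimal sketch with three synthetic theme layers follows; the weights, class ranges, and thresholds are illustrative, not those of the study:

```python
import numpy as np

rng = np.random.default_rng(0)
shape = (50, 50)
# Three synthetic theme rasters already reclassified to suitability scores 1..5
slope = rng.integers(1, 6, shape)       # e.g. from a DEM-derived slope map
lineament = rng.integers(1, 6, shape)   # lineament-intersection density classes
lithology = rng.integers(1, 6, shape)   # lithology suitability classes

weights = {"slope": 0.25, "lineament": 0.45, "lithology": 0.30}  # must sum to 1
prospect = (weights["slope"] * slope
            + weights["lineament"] * lineament
            + weights["lithology"] * lithology)

# Threshold the composite score into prospect zones (0 = poor .. 3 = excellent)
zones = np.digitize(prospect, bins=[2.0, 3.0, 4.0])
print("share of good-to-excellent cells:", (zones >= 2).mean())
```

Fuzzy overlay replaces the weighted sum with fuzzy membership operators (e.g. product or gamma), which is one source of the inter-method disagreement the study quantifies.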

Keywords: fuzzy overlay, GIS overlay model, groundwater prospecting, Precambrian metamorphics, weighted overlay, weighted sum overlay

Procedia PDF Downloads 128
9841 The Signaling Power of ESG Accounting in Sub-Sahara Africa: A Dynamic Model Approach

Authors: Haruna Maama

Abstract:

Environmental, social and governance (ESG) reporting is gaining considerable attention despite being voluntary. Meanwhile, providing ESG reporting consumes resources, raising the question of its value relevance. The study examined the impact of ESG reporting on the market value of listed firms in SSA, drawing on the annual and integrated reports of 276 listed sub-Saharan Africa (SSA) firms. The integrated reporting scores of the firms were derived using a content analysis method. A multiple regression estimation technique using a GMM approach was employed for the analysis. The results revealed that ESG has a positive relationship with firms' market value, suggesting that investors are interested in the ESG information disclosure of firms in SSA. This suggests that extensive ESG disclosures are attempts by firms to obtain the approval of powerful social, political and environmental stakeholders, especially institutional investors. Furthermore, the market value evidence is consistent with signalling theory, which postulates that firms provide integrated reports as a signal to influence the behaviour of stakeholders. This finding reflects the value investors place on social, environmental and governance disclosures, affirming the view that conventional investors care about the social, environmental and governance issues of their potential or existing investee firms. Overall, the evidence is consistent with the prediction of signalling theory. In the context of this theory, integrated reporting is seen as part of firms' overall competitive strategy to influence investors' behaviour. The findings of this study make unique contributions to knowledge and practice in corporate reporting.
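The value-relevance test can be illustrated with a deliberately simplified sketch: plain OLS on synthetic firm data rather than the study's dynamic GMM panel estimator, with every number invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 276                               # number of listed SSA firms in the study

# Hypothetical firm-level data: ESG disclosure score and one control
esg   = rng.uniform(0, 100, n)        # content-analysis ESG score
size  = rng.normal(10, 2, n)          # log total assets (control variable)
value = 0.8 * esg + 5.0 * size + rng.normal(0, 20, n)   # market value proxy

# Value-relevance regression: market value on ESG score plus control
X = np.column_stack([np.ones(n), esg, size])
beta, *_ = np.linalg.lstsq(X, value, rcond=None)
print(f"ESG coefficient: {beta[1]:.2f}")   # positive -> ESG is value relevant
```

A dynamic GMM estimator would additionally include lagged market value and instrument for its endogeneity; the sign test on the ESG coefficient is the same.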

Keywords: environmental accounting, ESG accounting, signalling theory, sustainability reporting, sub-Saharan Africa

Procedia PDF Downloads 77
9840 Gradient-Based Reliability Optimization of Integrated Energy Systems Under Extreme Weather Conditions: A Case Study in Ningbo, China

Authors: Da LI, Peng Xu

Abstract:

Recent extreme weather events, such as the 2021 European floods and North American heatwaves, have exposed the vulnerability of energy systems to both extreme demand scenarios and potential physical damage. Current integrated energy system designs often overlook performance under these challenging conditions. This research, focusing on a regional integrated energy system in Ningbo, China, proposes a distinct design method to optimize system reliability during extreme events. A multi-scenario model was developed, encompassing various extreme load conditions and potential system damages caused by severe weather. Based on this model, a comprehensive reliability improvement scheme was designed, incorporating a gradient approach to address different levels of disaster severity through the integration of advanced technologies like distributed energy storage. The scheme's effectiveness was validated through Monte Carlo simulations. Results demonstrate significant enhancements in energy supply reliability and peak load reduction capability under extreme scenarios. The findings provide several insights for improving energy system adaptability in the face of climate-induced challenges, offering valuable references for building reliable energy infrastructure capable of withstanding both extreme demands and physical threats across a spectrum of disaster intensities.
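The Monte Carlo validation step can be sketched with a toy simulation; the load distribution, damage probability, and storage levels below are assumptions for illustration, not the Ningbo case-study data:

```python
import random

random.seed(42)

def supply_reliability(storage_kwh, trials=10_000):
    """Estimate the probability that demand is met on an extreme-weather day.

    Toy assumptions: a heatwave inflates demand, physical damage may halve
    grid import, and distributed storage bridges the resulting gap.
    """
    met = 0
    for _ in range(trials):
        demand = random.gauss(1200, 150)          # extreme-day load, kWh
        damage = random.random() < 0.10           # 10% chance of line damage
        grid   = 1000 * (0.5 if damage else 1.0)  # damaged line halves import
        if grid + storage_kwh >= demand:
            met += 1
    return met / trials

for s in (0, 200, 400):   # gradient of storage deployment levels
    print(f"storage {s:>3} kWh -> reliability {supply_reliability(s):.3f}")
```

The gradient approach in the paper corresponds to evaluating such reliability curves across disaster-severity levels and sizing the interventions accordingly.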

Keywords: extreme weather events, integrated energy systems, reliability improvement, climate change adaptation

Procedia PDF Downloads 25
9839 Individualized Emotion Recognition Through Dual-Representations and Ground-Established Ground Truth

Authors: Valentina Zhang

Abstract:

While facial expression is a complex and individualized behavior, all facial emotion recognition (FER) systems known to us rely on a single facial representation and are trained on universal data. We conjecture that: (i) different facial representations can provide different, sometimes complementary, views of emotions; (ii) when employed collectively in a discussion group setting, they enable more accurate emotion reading, which is highly desirable in autism care and other error-sensitive application contexts. In this paper, we first study FER using pixel-based versus semantics-based deep learning (DL) in the context of deepfake videos. Our experiment indicates that while the semantics-trained model performs better on articulated facial feature changes, the pixel-trained model outperforms it on subtle or rare facial expressions. Armed with these findings, we constructed an adaptive FER system that learns from both types of models for dyadic or small interacting groups and further leverages the synthesized group emotions as the ground truth for individualized FER training. Using a collection of group conversation videos, we demonstrate that FER accuracy and personalization can benefit from such an approach.
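A minimal sketch of the dual-representation idea: confidence-weighted late fusion of a pixel-based and a semantics-based classifier. The class set, the probabilities, and the fixed fusion weight are illustrative assumptions; the paper's system adapts the combination:

```python
import numpy as np

EMOTIONS = ["happy", "sad", "angry", "neutral"]

def fuse(pixel_probs, semantic_probs, w_pixel=0.5):
    """Late fusion of two facial-representation models (assumed weight).

    The pixel-based model is favoured for subtle expressions and the
    semantics-based one for articulated feature changes; a simple weighted
    average of class probabilities stands in for the adaptive scheme.
    """
    p = np.asarray(pixel_probs)
    s = np.asarray(semantic_probs)
    fused = w_pixel * p + (1 - w_pixel) * s
    return EMOTIONS[int(np.argmax(fused))], fused

# The pixel model hedges between happy/neutral; the semantic model is confident
label, fused = fuse([0.40, 0.05, 0.05, 0.50], [0.70, 0.10, 0.10, 0.10])
print(label)   # the confident semantic view tips the vote
```

In a group setting, the fused per-person labels could then be aggregated into a group emotion that serves as the training ground truth described above.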

Keywords: neurodivergence care, facial emotion recognition, deep learning, ground truth for supervised learning

Procedia PDF Downloads 147
9838 Reading Literacy, Storytelling and Cognitive Learning: An Effective Connection in Sustainability Education

Authors: Rosa Tiziana Bruno

Abstract:

The connection between education and sustainability has been posited to benefit social development compatible with environmental protection. However, an educational paradigm based on the transmission of information, or on the fear of catastrophe, might not favor the acquisition of an eco-identity. To build a sustainable world, it is necessary to "become people" in harmony with other human beings, aware of belonging to the same human community that is part of the natural world. This can only be achieved within an authentic educating community, and the most effective tools for building educating communities are reading literacy and storytelling. This paper reports on a research-action carried out in this direction, in agreement with the sociology department of the University of Salerno, which involved four hundred children and their teachers in a path based on the combination of reading literacy, storytelling, autobiographical writing and outdoor education. The goal of the research was to create an authentic educational community within the school, capable of encouraging pupils' acquisition of an eco-identity, that is, personal and relational growth in the full realization of the Self, in harmony with the social and natural environment, with a view to an authentic education for sustainability. To ensure reasonable validity and reliability of the findings, the inquiry started with participant observation, and a process of triangulation was used, including semi-structured interviews, socio-semiotic analysis of conversation and time budgets. In essence, multiple independent sources of data were used to answer the research questions: observing the phenomenon through multiple "windows" helped compare data through a variety of lenses. All teachers implemented a socio-didactic strategy called "Fiabadiario" and could use it with approaches that fit their students.
The data collected come from the very students and teachers engaged with this strategy. The educational path tested during the research produced sustainable relationships and conflict resolution within the school system and between school and families, creating an authentic and sustainable learning community.

Keywords: educating community, education for sustainability, literature in education, social relations

Procedia PDF Downloads 122
9837 'Go Baby Go': Community-Based Integrated Early Childhood and Maternal Child Health Model Improving Early Childhood Stimulation, Care Practices and Developmental Outcomes in Armenia: A Quasi-Experimental Study

Authors: Viktorya Sargsyan, Arax Hovhannesyan, Karine Abelyan

Abstract:

Introduction: During the last decade, scientific studies have proven the importance of Early Childhood Development (ECD) interventions. These interventions create strong foundations for children's intellectual, emotional and physical well-being and shape learning and economic outcomes as children mature into adulthood. Many children in rural Armenia fail to reach their full developmental potential due to a lack of early brain stimulation (playing, singing, reading, etc.) from their parents and a lack of community tools and services to follow up children's neurocognitive development. This is exacerbated by high rates of stunting and anemia among children under 3 (CU3). This study tested the effectiveness of an integrated ECD and Maternal, Newborn and Child Health (MNCH) model, called "Go Baby, Go!" (GBG), against the traditional MNCH strategy, which focuses solely on preventive health and nutrition interventions. The hypothesis of this quasi-experimental study was: children exposed to GBG will have better neurocognitive and nutrition outcomes than those receiving only the MNCH intervention. The secondary objective was to assess the effect of GBG on parental child care and nutrition practices. Methodology: The 14-month study targeted all 1,300 children aged 0 to 23 months living in 43 study communities in the Gavar and Vardenis regions (Gegharkunik province, Armenia). Twenty-three intervention communities (680 children) received GBG, and 20 control communities (630 children) received MNCH interventions only. Baseline and evaluation data on child development, nutrition status and parental child care and nutrition practices were collected (caregiver interviews, direct child assessment).
In the intervention sites, in addition to MNCH activities (maternity schools, supportive supervision for Health Care Providers (HCPs)), trained GBG facilitators conducted six interactive group sessions for mothers (key messages, information, group discussions, role playing, video watching, toy/book preparation, according to the GBG curriculum) and two sessions (condensed GBG) for adult family members (husbands, grandmothers). The trained HCPs received quality supervision for ECD counseling and screening. Findings: The GBG model proved effective in improving ECD outcomes. Children in the intervention sites had 83% higher odds of a higher total ECD composite score (cognitive, language, motor) compared to children in the control sites (aOR 1.83; 95 percent CI: 1.08-3.09; p=0.025). Caregivers also demonstrated better child care and nutrition practices (the odds of minimum dietary diversity in the intervention sites were 55 percent higher than in the control sites (aOR=1.55, 95 percent CI 1.10-2.19, p=0.013); support for learning and disciplining practices (aOR=2.22, 95 percent CI 1.19-4.16, p=0.012)). However, there was no evidence of stunting reduction in either study arm. The effect of the integrated model was more prominent in Vardenis, a community characterised by high food insecurity and limited knowledge of positive parenting skills. Conclusion: The GBG model is effective and could be applied in target areas with the greatest economic disadvantages and parenting challenges to improve ECD, care practices and developmental outcomes. Longitudinal studies are needed to assess the long-term effects of GBG on learning and school readiness.
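The reported effect sizes are adjusted odds ratios from the full model. As a rough illustration of how an (unadjusted) odds ratio and its 95% confidence interval are computed, with hypothetical 2x2 counts chosen to land near the reported aOR of 1.55:

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio and 95% CI from a 2x2 table (Woolf / log method).

    a/b: outcome present/absent in the intervention arm,
    c/d: outcome present/absent in the control arm.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts: children meeting minimum dietary diversity, yes/no
or_, lo, hi = odds_ratio(374, 306, 277, 353)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

The study's adjusted estimates additionally control for covariates via regression; the interpretation of the ratio and interval is the same.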

Keywords: early childhood development, integrated interventions, parental practices, quasi-experimental study

Procedia PDF Downloads 172
9836 Artificial Intelligence for Generative Modelling

Authors: Shryas Bhurat, Aryan Vashistha, Sampreet Dinakar Nayak, Ayush Gupta

Abstract:

As technology advances toward greater computational resources, there is a paradigm shift in how these resources are used to optimize the design process. This paper discusses the use of generative design with artificial intelligence to build better models that apply operations like selection, mutation, and crossover to generate results. The human mind tends toward the simplest approach when designing an object, whereas the intelligence learns from past designs and produces complex, optimized CAD models. Generative design takes the boundary conditions and iterates over multiple solutions to arrive at a sturdy design with the most optimal parameters, saving large amounts of time and resources. The new production techniques at our disposal, such as additive manufacturing and 3D printing, allow us to save resources and produce artistically engineered CAD models. This paper also discusses the genetic algorithm and the non-domination technique for choosing the right results, drawing on biomimicry, whose principles have evolved in nature over millions of years. The computer uses parametric models to generate newer models through an iterative approach and uses cloud computing to store these iterative designs. The latter part of the paper compares topology optimization, previously used to generate CAD models, with generative design. Finally, the paper reports the performance of the algorithms and shows how they help in designing resource-efficient models.
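The selection, mutation, and crossover operations mentioned above can be sketched as a minimal genetic algorithm over a single design parameter; the fitness function and all constants are toy assumptions:

```python
import random

random.seed(1)

def fitness(x):
    # Toy design objective: quadratic with its optimum at x = 3
    return -(x - 3.0) ** 2

def evolve(pop_size=30, generations=60):
    pop = [random.uniform(0, 10) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]            # selection: keep the fittest half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = (a + b) / 2                   # crossover: blend two parents
            child += random.gauss(0, 0.1)         # mutation: small random nudge
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(f"best design parameter: {best:.2f}")   # converges near the optimum
```

A real generative-design run evaluates fitness by simulation (stiffness, weight, stress) over many parameters and keeps a non-dominated front rather than a single best solution.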

Keywords: genetic algorithm, biomimicry, generative modeling, non-domination techniques

Procedia PDF Downloads 149
9835 Suitable Models and Methods for the Steady-State Analysis of Multi-Energy Networks

Authors: Juan José Mesas, Luis Sainz

Abstract:

The motivation for this paper lies in the need for energy networks to reduce losses, improve performance, optimize their operation and benefit from the interconnection capacity with networks for other energy carriers. These interconnections generate interdependencies between energy networks, which requires suitable models and methods for their analysis. Traditionally, the modeling and study of energy networks have been carried out independently for each energy carrier. Thus, there are well-established models and methods for the steady-state analysis of electrical networks, gas networks, and thermal networks separately. The aim is to extend and combine them appropriately so that the steady-state analysis of networks with multiple energy carriers can be addressed in an integrated way. Firstly, the added value of multi-energy networks, their operation, and the basic principles that characterize them are explained. In addition, two currently highly relevant aspects are presented: the storage technologies and the coupling elements used to interconnect one energy network with another. Secondly, the characteristic equations of the different energy networks necessary to carry out the steady-state analysis are detailed. The electrical network, the natural gas network, and the thermal network of heat and cold are considered in this paper. After the presentation of the equations, a particular case of the steady-state analysis of a specific multi-energy network is studied. This network is represented graphically, the interconnections between the different energy carriers are described, its technical data are presented, and the equations previously introduced theoretically are formulated and developed. Finally, the two iterative numerical resolution methods considered in this paper are presented, as well as the resolution procedure and the results obtained.
The pros and cons of applying both methods are explained. It is verified that the results obtained for the electrical network (voltages in modulus and angle), the natural gas network (pressures), and the thermal network (mass flows and temperatures) are correct, since they comply with the distribution, operation, consumption and technical characteristics of the multi-energy network under study.
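The iterative numerical resolution can be illustrated with a Newton-Raphson sketch on a toy single-pipe gas subnetwork; the pipe constant, source pressure, and demand are invented, and the real analysis solves the full coupled multi-carrier system:

```python
def newton(f, df, x0, tol=1e-9, max_iter=50):
    """Generic Newton-Raphson iteration, the kind of method used for the
    coupled steady-state equations (applied here to one toy gas equation)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Toy gas pipe: the flow C*sqrt(ps^2 - p^2) must equal the nodal demand d
C, ps, d = 0.1, 60.0, 4.0          # assumed pipe constant, source pressure, demand
f  = lambda p: C * (ps**2 - p**2) ** 0.5 - d    # flow-balance residual
df = lambda p: -C * p / (ps**2 - p**2) ** 0.5   # its derivative

p = newton(f, df, x0=50.0)
print(f"nodal pressure: {p:.3f}")
```

For a full multi-energy network, f becomes a vector of residuals over voltages, pressures, mass flows, and temperatures, and df the corresponding Jacobian; the iteration is identical in structure.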

Keywords: coupling elements, energy carriers, multi-energy networks, steady-state analysis

Procedia PDF Downloads 79
9834 A Local Invariant Generalized Hough Transform Method for Integrated Circuit Visual Positioning

Authors: Wei Feilong

Abstract:

In this study, a local invariant generalized Hough transform (LI-GHT) method is proposed for integrated circuit (IC) visual positioning. The original generalized Hough transform (GHT) is robust to external noise; however, it is not suitable for visual positioning of IC chips because its four-dimensional (4D) parameter space leads to substantial storage requirements and high computational complexity. The proposed LI-GHT method reduces the dimensionality of the parameter space to 2D thanks to the rotational invariance of local invariant geometric features, and it can estimate the position and rotation angle of IC chips accurately in real time under noise and blur. Experimental results show that the proposed LI-GHT estimates the position and rotation angle of IC chips with high accuracy and at high speed. The LI-GHT algorithm was implemented in the IC visual positioning system of radio frequency identification (RFID) packaging equipment.
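For orientation, a minimal classic GHT restricted to translation is sketched below; the paper's LI-GHT additionally achieves rotation invariance via local invariant features, which this sketch does not implement, and the template points are invented:

```python
from collections import defaultdict

def build_r_table(template_points, reference):
    """R-table: displacement from each template edge point to the reference."""
    rx, ry = reference
    return [(rx - x, ry - y) for x, y in template_points]

def ght_vote(image_points, r_table):
    """Each image edge point casts votes for candidate reference locations;
    the accumulator peak is the detected position."""
    acc = defaultdict(int)
    for x, y in image_points:
        for dx, dy in r_table:
            acc[(x + dx, y + dy)] += 1
    return max(acc, key=acc.get)

# Toy template (an L-shape of edge points) and its shifted copy in the image
template = [(0, 0), (1, 0), (2, 0), (0, 1), (0, 2)]
r_table = build_r_table(template, reference=(0, 0))
image = [(x + 7, y + 3) for x, y in template]     # template translated by (7, 3)

print(ght_vote(image, r_table))
```

Adding rotation to this scheme is what inflates the parameter space; the LI-GHT's local invariant features are what allow the accumulator to stay 2D.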

Keywords: integrated circuit visual positioning, generalized Hough transform, local invariant generalized Hough transform, IC packaging equipment

Procedia PDF Downloads 264
9833 XAI Implemented Prognostic Framework: Condition Monitoring and Alert System Based on RUL and Sensory Data

Authors: Faruk Ozdemir, Roy Kalawsky, Peter Hubbard

Abstract:

Accurate estimation of remaining useful life (RUL) provides a basis for effective predictive maintenance, reducing unexpected downtime for industrial equipment. However, while models such as the Random Forest have strong predictive capabilities, they are so-called 'black box' models whose limited interpretability is a barrier to the critical diagnostic decisions required in industries such as aviation. The purpose of this work is to present a prognostic framework that embeds Explainable Artificial Intelligence (XAI) techniques to provide essential transparency into the decision-making mechanisms of machine learning methods based on sensor data, with the objective of producing actionable insights for the aviation industry. Sensor readings are gathered from critical equipment such as a turbofan jet engine and landing gear, and RUL is predicted by a Random Forest model through steps of data gathering, feature engineering, model training, and evaluation. The datasets for these critical components are trained and evaluated independently. Although the predictions are adequate and their performance metrics reasonably good, such complex models obscure the reasoning behind their predictions and may undermine the confidence of decision-makers and maintenance teams. The second phase therefore adds global explanations using SHAP and local explanations using LIME to bridge this reliability gap in industrial contexts. These tools analyze model decisions, highlighting feature importance and explaining how each input variable affects the output. This dual approach offers a general comprehension of overall model behavior together with detailed insight into specific predictions. In its third component, the proposed framework incorporates causal analysis in the form of Granger causality tests in order to move beyond correlation toward causation.
This allows the model not only to predict failures but also to present reasons, linking key sensor features to possible failure mechanisms for the relevant personnel. Establishing causality between sensor behaviors and equipment failures creates considerable value for maintenance teams through better root-cause identification and more effective preventive measures, and this step makes the system more explainable. In a further stage, several simple surrogate models, including decision trees and linear models, are used to approximate the complex Random Forest model. These simpler models act as backups, replicating important aspects of the original model's behavior. When the feature explanations obtained from a surrogate model are cross-validated against the primary model, the derived insights become more reliable and give an intuitive sense of how the input variables affect the predictions. We then create an iterative explainable feedback loop, in which the knowledge learned from the explainability methods feeds back into model training, driving continuous improvement in both model accuracy and interpretability over time. By systematically integrating new findings, the model is expected to adapt to changed conditions and further develop its prognostic capability. These components are then presented to decision-makers through a fully transparent condition monitoring and alert system. The system provides a holistic tool for maintenance operations by leveraging RUL predictions, feature importance scores, persistent sensor threshold values, and autonomous alert mechanisms. Because the system provides explanations for its predictions, along with active alerts, maintenance personnel can make informed decisions about the correct interventions to extend the life of critical machinery.
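As a toy illustration of the feature-attribution idea behind SHAP-style explanations, the sketch below computes permutation importance for a simple linear surrogate of an RUL model; the sensor data, model form, and all numbers are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic sensor data: only sensor 0 actually drives remaining useful life
X = rng.normal(size=(500, 3))                       # three sensor channels
rul = 100 - 30 * X[:, 0] + rng.normal(0, 1, 500)    # RUL depends on sensor 0

# Fit a simple linear surrogate (stand-in for the Random Forest RUL model)
A = np.column_stack([np.ones(500), X])
coef, *_ = np.linalg.lstsq(A, rul, rcond=None)
predict = lambda X_: np.column_stack([np.ones(len(X_)), X_]) @ coef
base_err = np.mean((predict(X) - rul) ** 2)

# Permutation importance: how much the error grows when one sensor is shuffled
importance = []
for j in range(3):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])
    importance.append(np.mean((predict(Xp) - rul) ** 2) - base_err)

print(np.argmax(importance))   # the sensor that explains the RUL
```

SHAP assigns per-prediction attributions with game-theoretic guarantees and LIME fits local surrogates; the shuffle-and-measure idea here is the simplest member of the same family.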

Keywords: predictive maintenance, explainable artificial intelligence, prognostic, RUL, machine learning, turbofan engines, C-MAPSS dataset

Procedia PDF Downloads 7
9832 Mixed Effects Models for Short-Term Load Forecasting for the Spanish Regions: Castilla-Leon, Castilla-La Mancha and Andalucia

Authors: C. Senabre, S. Valero, M. Lopez, E. Velasco, M. Sanchez

Abstract:

This paper focuses on an application of linear mixed models to short-term load forecasting. The challenge of this research is to improve a currently operational model at the Spanish Transport System Operator, programmed by us and based on linear autoregressive techniques and neural networks. The forecasting system currently forecasts each of the regions within the Spanish grid separately, even though the behavior of the load in each region is affected by the same factors in similar ways. The load forecasting system has been verified in this work using real data from a utility. As a starting point, several regions have been integrated into a linear mixed model in order to exploit information from other regions: first, the system learns general behaviors present in all regions, and second, individual deviations in each region are identified. The technique can be especially useful when modeling the effect of special days with scarce information from the past. The three most relevant regions of the system have been used to test the model, focusing on special days and improving the performance of the two currently working models used as benchmarks. A range of comparisons with different forecasting models has been conducted. The forecasting results demonstrate the superiority of the proposed methodology.
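The fixed-plus-random-effects decomposition can be sketched as follows; the regional loads, the per-region effects, and the fixed shrinkage factor are invented, and a real mixed model would estimate the shrinkage from the variance components rather than assume it:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical daily peak loads (MW) for three regions sharing common drivers
true_region_effect = {"Castilla-Leon": 5.0, "Castilla-La Mancha": -3.0,
                      "Andalucia": 12.0}
data = {r: 100 + e + rng.normal(0, 4, size=30)
        for r, e in true_region_effect.items()}

# Fixed effect: general behavior shared by all regions
grand_mean = np.mean([x for xs in data.values() for x in xs])

# Random effects: region deviations shrunk toward 0 (partial pooling);
# lam stands in for the variance-ratio shrinkage of a true mixed model
lam = 0.9
random_effect = {r: lam * (np.mean(xs) - grand_mean) for r, xs in data.items()}

for r in data:
    print(f"{r}: forecast baseline = {grand_mean + random_effect[r]:.1f} MW")
```

The partial pooling is what helps on special days: a region with few past observations of a holiday borrows strength from the shared fixed effect instead of relying on its own noisy mean.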

Keywords: short-term load forecasting, mixed effects models, neural networks

Procedia PDF Downloads 189
9831 Assessing Students’ Readiness for an Open and Distance Learning Higher Education Environment

Authors: Upasana G. Singh, Meera Gungea

Abstract:

Learning is no longer confined to the traditional classroom and teacher-student interaction. Many universities offer courses through the Open and Distance Learning (ODL) mode, attracting a diversity of learners in terms of age, gender, and profession, to name a few. The ODL mode has surfaced as one of the most sought-after modes of learning, allowing learners to invest in their educational growth without hampering their personal and professional commitments. This mode of learning, however, requires that those who choose it be prepared to undertake studies through such a medium. The purpose of this research is to assess whether students who join universities offering courses through the ODL mode are ready to embark on and study within such a framework. This study helps unveil the challenges students face in such an environment and thus contributes to developing a framework to ease adoption of and integration into the ODL environment. Prior to the implementation of e-learning, a readiness assessment is essential for any institution that wants to adopt any form of e-learning. Various e-learning readiness assessment models have been developed over the years. This study, however, is based on a conceptual 'hybrid model' for e-learning readiness assessment. The hybrid model consists of four main parameters: (1) technological readiness, (2) culture readiness, (3) content readiness, and (4) demographic factors, with four sub-areas, namely technology, innovation, people and self-development. The model also includes users' attitudes towards the adoption of e-learning as an important aspect of assessing e-learning readiness. For this study, some factors and sub-factors of the hybrid model have been considered and adapted, together with the 'attitude' component.
A questionnaire was designed based on the model; the target population was students enrolled in undergraduate and postgraduate courses at the Open University of Mauritius. Preliminary findings indicate that most learners (68%) have average knowledge of the ODL form of learning, even though many (72%) have no previous experience with ODL. Despite learning through ODL, 74% of learners preferred hard-copy learning material, and 48% found it difficult to read learning material on electronic devices.
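The hybrid model's aggregation step can be sketched as a weighted readiness index; the factor weights, the sample responses, and the 3.4 readiness threshold are assumptions for illustration, not values reported by this study:

```python
# Hypothetical survey aggregation for a hybrid e-readiness model:
# factor scores on a 5-point Likert scale combined with assumed weights
FACTORS = {"technological": 0.35, "culture": 0.25,
           "content": 0.25, "attitude": 0.15}

def readiness_index(scores):
    """Weighted mean of factor scores; 3.4 on a 5-point scale is used here
    as an assumed 'ready, with improvements needed' cut-off."""
    return sum(FACTORS[f] * s for f, s in scores.items())

student = {"technological": 3.8, "culture": 3.2, "content": 2.9, "attitude": 4.0}
idx = readiness_index(student)
print(f"readiness = {idx:.2f} -> {'ready' if idx >= 3.4 else 'not yet ready'}")
```

Per-factor scores below the threshold (here, content readiness) would flag exactly the kind of challenge the preliminary findings surface, such as difficulty reading material on electronic devices.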

Keywords: open learning, distance learning, student readiness, hybrid model

Procedia PDF Downloads 109
9830 Predominance of Teaching Models Used by Math Teachers in Secondary Education

Authors: Verónica Diaz Quezada

Abstract:

This research examines the teaching models used by secondary mathematics teachers when teaching logarithmic, quadratic and exponential functions. To this end, descriptive case studies were carried out on five secondary teachers chosen from three scientific-humanistic and technical schools in Chile. Data were obtained through non-participant class observation and the application of a questionnaire and a rubric to the teachers. According to the results, the didactic model that prevails starts with an interactive strategy, moves to a more content-based structure, and ends with a reinforcement stage. Nonetheless, there is always influence from the teachers, their methods, and the group of students.

Keywords: teaching models, math teachers, functions, secondary education

Procedia PDF Downloads 189