Search results for: heuristics and biases approach
11581 Machine Learning-Assisted Selective Emitter Design for Solar Thermophotovoltaic System
Authors: Ambali Alade Odebowale, Andargachew Mekonnen Berhe, Haroldo T. Hattori, Andrey E. Miroshnichenko
Abstract:
Solar thermophotovoltaic systems (STPV) have emerged as a promising solution to overcome the Shockley-Queisser limit, a significant impediment in the direct conversion of solar radiation into electricity using conventional solar cells. The STPV system comprises essential components such as an optical concentrator, selective emitter, and a thermophotovoltaic (TPV) cell. The pivotal element in achieving high efficiency in an STPV system lies in the design of a spectrally selective emitter or absorber. Traditional methods for designing and optimizing selective emitters are often time-consuming and may not yield highly selective emitters, posing a challenge to the overall system performance. In recent years, the application of machine learning techniques in various scientific disciplines has demonstrated significant advantages. This paper proposes a novel nanostructure composed of four-layered materials (SiC/W/SiO2/W) to function as a selective emitter in the energy conversion process of an STPV system. Unlike conventional approaches widely adopted by researchers, this study employs a machine learning-based approach for the design and optimization of the selective emitter. Specifically, a random forest algorithm (RFA) is employed for the design of the selective emitter, while the optimization process is executed using genetic algorithms. This innovative methodology holds promise in addressing the challenges posed by traditional methods, offering a more efficient and streamlined approach to selective emitter design. The utilization of a machine learning approach brings several advantages to the design and optimization of a selective emitter within the STPV system. Machine learning algorithms, such as the random forest algorithm, have the capability to analyze complex datasets and identify intricate patterns that may not be apparent through traditional methods. This allows for a more comprehensive exploration of the design space, potentially leading to highly efficient emitter configurations. Moreover, the application of genetic algorithms in the optimization process enhances the adaptability and efficiency of the overall system. Genetic algorithms mimic the principles of natural selection, enabling the exploration of a diverse range of emitter configurations and facilitating the identification of optimal solutions. This not only accelerates the design and optimization process but also increases the likelihood of discovering configurations that exhibit superior performance compared to traditional methods. In conclusion, the integration of machine learning techniques in the design and optimization of a selective emitter for solar thermophotovoltaic systems represents a groundbreaking approach. This innovative methodology not only addresses the limitations of traditional methods but also holds the potential to significantly improve the overall performance of STPV systems, paving the way for enhanced solar energy conversion efficiency.
Keywords: emitter, genetic algorithm, radiation, random forest, thermophotovoltaic
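As an illustration of the pipeline described above, the sketch below trains a random forest surrogate and optimizes it with a simple genetic algorithm. The layer-thickness bounds, the toy fitness function, and all hyperparameters are assumptions for demonstration, not the authors' actual model or data.

```python
# Hedged sketch (not the authors' code): random forest surrogate + simple GA.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
lo = np.array([10.0, 5.0, 50.0, 5.0])      # assumed thickness bounds, nm
hi = np.array([500.0, 100.0, 800.0, 100.0])

# Hypothetical training set: SiC/W/SiO2/W layer thicknesses -> selectivity
# score (a toy function standing in for an electromagnetic solver).
X_train = rng.uniform(lo, hi, size=(2000, 4))
y_train = -np.sum((X_train - np.array([120, 30, 400, 20])) ** 2, axis=1)

surrogate = RandomForestRegressor(n_estimators=200, random_state=0)
surrogate.fit(X_train, y_train)

pop = rng.uniform(lo, hi, size=(60, 4))
for _ in range(100):
    fit = surrogate.predict(pop)                       # surrogate fitness
    # Tournament selection between random pairs of individuals
    i = rng.integers(0, len(pop), size=(len(pop), 2))
    parents = pop[np.where(fit[i[:, 0]] > fit[i[:, 1]], i[:, 0], i[:, 1])]
    # Blend crossover with a reversed copy of the parent pool, then mutate
    alpha = rng.random((len(pop), 1))
    pop = alpha * parents + (1 - alpha) * parents[::-1]
    pop = np.clip(pop + rng.normal(0, 5, pop.shape), lo, hi)

best = pop[np.argmax(surrogate.predict(pop))]
print("best layer thicknesses (nm):", best.round(1))
```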
Procedia PDF Downloads 62
11580 Analysis of Crisis Management Systems of United Kingdom and Turkey
Authors: Recep Sait Arpat, Hakan Güreşci
Abstract:
Emergency, disaster, and crisis management terms are generally perceived as the same processes. This conflation affects the approach and delegation policy of the political order. Crisis management starts in the aftermath of the mismanagement of disaster and emergency. In light of this, the crisis management systems of Turkey and the United Kingdom (UK) are analyzed in this article. The article's main aim is to clarify the main points of the UK's emergency management system and Turkey's disaster management system by comparing them. To do this, a prototype model of the political decision-making processes of the two countries is drawn, and their decision-making mechanisms and planning functions are compared. As a result, it is found that emergency management policy in Turkey is reactive, whereas it is proactive in the UK; in terms of delegation policy, Turkey's system is similar to the UK's; the levels of emergency situations are similar but not the same, with the differences stemming from the civil order and the effectiveness of nongovernmental organizations; the UK has a detailed government engagement model for emergencies, which shapes the doctrine of its approach to emergencies and is successful in gathering and controlling the whole state's efforts; crisis management is a sub-phase of UK emergency management, whereas in Turkey it is treated as an outmoded management perception; and the focal point of crisis management in the UK is security crises and natural disasters, while in Turkey it is natural disasters. Proposals for Turkey are given in each analysis.
Keywords: crisis management, disaster management, emergency management, Turkey, United Kingdom
Procedia PDF Downloads 373
11579 Computational Approach for GRP78–NF-κB Binding Interactions in the Context of Neuroprotective Pathway in Brain Injuries
Authors: Janneth Gonzalez, Marco Avila, George Barreto
Abstract:
GRP78 participates in multiple functions in the cell during normal and pathological conditions, controlling calcium homeostasis, protein folding, and the unfolded protein response. GRP78 is located in the endoplasmic reticulum, but it can change its location under stress, hypoxic, and apoptotic conditions. NF-κB represents the keystone of the inflammatory process and regulates the transcription of several genes related to apoptosis, differentiation, and cell growth. A possible relationship between GRP78 and NF-κB could support and explain several mechanisms that may regulate a variety of cell functions, especially following brain injuries. Although several reports show interactions between NF-κB and heat shock protein family members, there is a lack of information on how GRP78 may interact with NF-κB and possibly regulate its downstream activation. Therefore, we assessed computational predictions of the protein-protein interactions between GRP78 (Chain A) and the NF-κB complex (IκB alpha and p65). The interaction interface of the docking model showed that the amino acids ASN 47, GLU 215, GLY 403 of GRP78 and THR 54, ASN 182, and HIS 184 of NF-κB are key residues involved in the docking. The electrostatic field between the GRP78 and NF-κB interfaces and molecular dynamics simulations support the possible interaction between the proteins. In conclusion, this work sheds some light on the possible GRP78–NF-κB complex, indicating key residues in this crosstalk, which may be used as input for better drug design strategies targeting NF-κB downstream signaling as a new therapeutic approach following brain injuries.
Keywords: computational biology, protein interactions, GRP78, bioinformatics, molecular dynamics
Procedia PDF Downloads 344
11578 Pinch Technology for Minimization of Water Consumption at a Refinery
Authors: W. Mughees, M. Alahmad
Abstract:
Water is the most significant entity that controls local and global development. For the Gulf region, especially Saudi Arabia, with its limited potable water resources, the fresh water problem is highly significant. This research involves the design and analysis of pinch-based water/wastewater networks. Multiple water/wastewater networks were developed using pinch analysis involving the direct recycle/material recycle method. A property-integration technique was adopted to carry out the direct recycle method. In particular, a petroleum refinery was considered as a case study. In the direct recycle methodology, minimum water discharge and minimum fresh water resource targets were estimated, and the water allocation in the networks was re-designed (retrofitted). Chemical Oxygen Demand (COD) and hardness were taken as the pollutant properties. The analysis was based on single and double contaminant approaches: with the single contaminant approach, fresh water demand was reduced from 340.0 m³/h to 149.0 m³/h (43.8%) for COD and to 208.0 m³/h (61.18%) for hardness; with the double contaminant approach, fresh water demand was reduced to 132.0 m³/h (38.8%). The analysis was also carried out using a mathematical programming technique; software such as LINGO was used for these studies, which verified the graphical method results in a valuable and accurate way. Among the multiple water networks, one possible water allocation network was developed based on mass exchange.
Keywords: minimization, water pinch, water management, pollution prevention
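The following is a minimal sketch of single-contaminant freshwater targeting in the spirit of water pinch analysis. The operation data are hypothetical placeholders, not the refinery case study, and the targeting rule is the simplified fixed-load form with contaminant-free freshwater.

```python
# Simplified water-pinch targeting sketch (hypothetical data): the minimum
# freshwater flow is the largest cumulative-load/concentration ratio over
# the outlet-concentration levels; the level where it occurs is the pinch.
operations = [   # (contaminant mass load, kg/h; max outlet concentration, ppm)
    (2.0, 100.0),
    (5.0, 200.0),
    (30.0, 800.0),
]

freshwater_target = 0.0
for level in sorted({c for _, c in operations}):
    load = sum(m for m, c_out in operations if c_out <= level)   # kg/h
    # kg/h divided by ppm gives t/h of water, roughly m3/h for water
    freshwater_target = max(freshwater_target, load / level * 1000.0)

print(f"minimum freshwater target: {freshwater_target:.1f} m3/h")
```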
Procedia PDF Downloads 480
11577 The Impact of Mergers and Acquisitions on Financial Deepening in the Nigerian Banking Sector
Authors: Onyinyechi Joy Kingdom
Abstract:
Mergers and Acquisitions (M&A) have been proposed as a mechanism through which, problems associated with inefficiency or poor performance in financial institution could be addressed. The aim of this study is to examine the proposition that recapitalization of banks, which encouraged Mergers and Acquisitions in Nigeria banking system, would strengthen the domestic banks, improve financial deepening and the confidence of depositors. Hence, this study examines the impact of the 2005 M&A in the Nigerian-banking sector on financial deepening using mixed method (quantitative and qualitative approach). The quantitative process of this study utilised annual time series for financial deepening indicator for the period of 1997 to 2012. While, the qualitative aspect adopted semi-structured interview to collate data from three merged banks and three stand-alone banks to explore, understand and complement the quantitative results. Furthermore, a framework thematic analysis is employed to analyse the themes developed using NVivo 11 software. Using the quantitative approach, findings from the equality of mean test (EMT) used suggests that M&A have significant impact on financial deepening. However, this method is not robust enough given its weak validity as it does not control for other potential factors that may determine financial deepening. Thus, to control for other factors that may affect the level of financial deepening, a Multiple Regression Model (MRM) and Interrupted Times Series Analysis (ITSA) were applied. The coefficient for M&A dummy turned negative and insignificant using MRM. In addition, the estimated linear trend of the post intervention when ITSA was applied suggests that after M&A, the level of financial deepening decreased annually; however, this was statistically insignificant. Similarly, using the qualitative approach, the results from the interview supported the quantitative results from ITSA and MRM. The result suggests that interest rate should fall when capital base is increased to improve financial deepening. Hence, this study contributes to the existing literature the importance of other factors that may affect financial deepening and the economy when policies that will enhance bank performance and the economy are made. In addition, this study will enable the use of valuable policy instruments relevant to monetary authorities when formulating policies that will strengthen the Nigerian banking sector and the economy.Keywords: mergers and acquisitions, recapitalization, financial deepening, efficiency, financial crisis
Procedia PDF Downloads 398
11576 FisherONE: Employing Distinct Pedagogy through Technology Integration in Senior Secondary Education
Authors: J. Kontoleon, D. Gall, M. Pidskalny
Abstract:
FisherONE offers a distinct pedagogic model for senior secondary education that integrates advanced technology to meet the learning needs of Year 11 and 12 students across Catholic schools in Queensland. As a fully online platform, FisherONE employs pedagogy that combines flexibility with personalized, data-driven learning. The model leverages tools like the MaxHub hybrid interactive system and AI-powered learning assistants to create tailored learning pathways that promote student autonomy and engagement. This paper examines FisherONE's success in employing pedagogic strategies through technology. Initial findings suggest that students benefit from the blended approach of virtual assessments and real-time support, even as AI-assisted tools remain in the proof-of-concept phase. The study outlines how FisherONE plans to continue refining its educational methods to better serve students in distance learning environments, specifically in challenging subjects like physics. The integration of technology in FisherONE enhances the effectiveness of teaching and learning, addressing common challenges in online education by offering scalable, individualized learning experiences. This approach demonstrates the future potential of technology in education and the role it can play in fostering meaningful student outcomes.
Keywords: AI-assisted learning, innovative pedagogy, personalized learning, senior education, technology in education
Procedia PDF Downloads 20
11575 Balance Control Mechanisms in Individuals With Multiple Sclerosis in Virtual Reality Environment
Authors: Badriah Alayidi, Emad Alyahya
Abstract:
Background: Most people with Multiple Sclerosis (MS) report worsening balance as the condition progresses. Poor balance control is also well known to be a significant risk factor for both falling and fear of falling. The increased risk of falls with disease progression thus makes balance control an essential target of gait rehabilitation amongst people with MS. Intervention programs have developed various methods to improve balance control, and accumulating evidence suggests that exercise programs may help people with MS improve their balance. Among these methods, virtual reality (VR) is growing in popularity as a balance-training technique owing to its potential benefits, including better compliance and greater user satisfaction. However, it is not clear whether a VR environment will induce different balance control mechanisms in MS as compared to healthy individuals or traditional environments. Therefore, this study aims to examine how individuals with MS control their balance in a VR setting. Methodology: The proposed study takes an empirical approach to estimate and determine the role of balance responses in persons with MS using a VR environment. It will use primary data collected through patient observations, physiological and biomechanical evaluation of balance, and data analysis. Results: A preliminary systematic review and meta-analysis indicated variability in the outcomes used to assess balance responses in people with MS. The preliminary results of these assessments have the potential to provide essential indicators of the progression of MS and contribute to the individualization of treatment and the evaluation of the interventions' effectiveness. The literature describes patients who have had the opportunity to experiment in VR settings and then used what they learned in the real world, suggesting that a VR setting could be more appealing than conventional settings. The findings of the proposed study will be beneficial in estimating and determining the effect of VR on balance control in persons with MS. In previous studies, VR was shown to be an interesting approach to neurological rehabilitation, but more data are needed to support this approach in MS. Conclusions: The proposed study enables an assessment of balance and the evaluation of a variety of physiological implications related to neural activity, as well as biomechanical implications related to movement analysis.
Keywords: multiple sclerosis, virtual reality, postural control, balance
Procedia PDF Downloads 76
11574 Sea of Light: A Game-Based Approach for Evidence-Centered Assessment of Collaborative Problem Solving
Authors: Svenja Pieritz, Jakab Pilaszanovich
Abstract:
Collaborative Problem Solving (CPS) is recognized as one of the most important skills of the 21st century, with a potential impact on education, job selection, and collaborative systems design. Therefore, CPS has been adopted in several standardized tests, including the Programme for International Student Assessment (PISA) in 2015. A significant challenge in evaluating CPS is the underlying interplay of cognitive and social skills, which requires a more holistic assessment. However, the majority of existing tests use a questionnaire-based assessment, which oversimplifies this interplay and undermines ecological validity. Two major difficulties were identified: firstly, the creation of a controllable, real-time environment allowing natural behaviors and communication between at least two people; secondly, the development of an appropriate method to collect and synthesize both cognitive and social metrics of collaboration. This paper proposes a more holistic and automated approach to the assessment of CPS. To address these two difficulties, a multiplayer problem-solving game called Sea of Light was developed: an environment allowing students to deploy a variety of measurable collaborative strategies. This controlled environment enables researchers to monitor behavior through the analysis of game actions and chat. The corresponding statistical model is a combined approach of Natural Language Processing (NLP) and Bayesian network analysis. Social exchanges via the in-game chat are analyzed through NLP and fed into the Bayesian network along with other game actions. This Bayesian network synthesizes evidence to track and update different subdimensions of CPS. Major findings focus on the correlations between the evidence collected through in-game actions, the participants' chat features, and the CPS self-evaluation metrics. These results give an indication of which game mechanics can best describe CPS evaluation. Overall, Sea of Light gives test administrators control over different problem-solving scenarios and difficulties while keeping the student engaged. It enables a more complete assessment based on complex, socio-cognitive information on actions and communication. This tool permits further investigation of the effects of group constellations and personality in collaborative problem-solving.
Keywords: Bayesian network, collaborative problem solving, game-based assessment, natural language processing
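To illustrate how a Bayesian network can synthesize streamed evidence into a CPS subdimension, here is a deliberately tiny single-node update; the probabilities and the event stream are invented, and the paper's actual network is far richer and also ingests NLP features from the chat.

```python
# Toy Bayesian evidence synthesis for one CPS subdimension (assumed numbers).
import numpy as np

# Prior over a latent CPS subdimension, e.g. "coordination": [low, high]
belief = np.array([0.5, 0.5])

# Assumed likelihood of a cooperative in-game action given the skill level
p_coop = np.array([0.2, 0.7])   # P(cooperative action | low), P(... | high)

def update(belief, cooperative):
    """One Bayesian evidence update from a coded game/chat event."""
    likelihood = p_coop if cooperative else 1.0 - p_coop
    posterior = likelihood * belief
    return posterior / posterior.sum()

for event in [True, True, False, True]:   # stream of coded events
    belief = update(belief, event)

print("P(high coordination) =", belief[1].round(3))
```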
Procedia PDF Downloads 133
11573 Abandoning 'One-Time' Optional Information Literacy Workshops for Year 1 Medical Students and Gearing towards an 'Embedded Librarianship' Approach
Authors: R. L. David, E. C. P. Tan, M. A. Ferenczi
Abstract:
This study aimed to investigate the effect of a 'one-time' optional Information Literacy (IL) workshop on Year 1 medical students' literature search, writing, and citation management skills, as directed by a customized five-year IL framework developed for LKC Medicine students. At the end of the IL workshop, students overall rated finding, citing, and using information from sources as 'somewhat difficult'. The study method is experimental, using a standardized IL test to study the cohort effect of a 'one-time' optional IL workshop by comparing Year 1 students (experimental group) with Year 2 students (control group). Test scores from both groups were compared and analyzed using mean scores and one-way analysis of variance (ANOVA). Unexpectedly, there were no statistically significant differences between group means as determined by one-way ANOVA (F₁,₁₉₃ = 3.37, p = 0.068, ηp² = 0.017). The challenges and shortfalls posed by 'one-time' interventions raised a rich discussion on adopting an 'embedded librarianship' approach, which shifts the medical librarians' role into the curriculum and uses Team Based Learning to teach IL skills to medical students. The customized five-year IL framework developed for LKC Medicine students thus becomes a useful librarian-faculty model for embedding and bringing IL into the classroom.
Keywords: information literacy, 'one-time' interventions, medical students, standardized tests, embedded librarianship, curriculum, medical librarians
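For readers who want to reproduce this style of comparison, a sketch of a one-way ANOVA on two cohorts' test scores is shown below; the score arrays are synthetic placeholders, not the study data (which yielded F₁,₁₉₃ = 3.37, p = 0.068).

```python
# One-way ANOVA sketch on synthetic cohort scores (not the study data).
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(42)
year1_scores = rng.normal(62, 10, 98)   # experimental group (workshop)
year2_scores = rng.normal(64, 10, 97)   # control group

F, p = f_oneway(year1_scores, year2_scores)
dof = len(year1_scores) + len(year2_scores) - 2
print(f"F(1, {dof}) = {F:.2f}, p = {p:.3f}")
```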
Procedia PDF Downloads 113
11572 Cleaning of Polycyclic Aromatic Hydrocarbons (PAH) Obtained from Ferroalloys Plant
Authors: Stefan Andersson, Balram Panjwani, Bernd Wittgens, Jan Erik Olsen
Abstract:
Polycyclic aromatic hydrocarbons (PAH) are organic compounds consisting only of hydrogen and carbon arranged in aromatic rings. PAH are neutral, non-polar molecules produced by the incomplete combustion of organic matter. These compounds are carcinogenic and interact with biological nucleophiles to inhibit the normal metabolic functions of cells. In Norway, the most important sources of PAH pollution are considered to be aluminum plants, the metallurgical industry, offshore oil activity, transport, and wood burning. Stricter governmental regulations regarding emissions to the outer and internal environment, combined with increased awareness of the potential health effects, have motivated Norwegian metal industries to increase their efforts to reduce emissions considerably. One of the objectives of the ongoing industry- and Norwegian Research Council-supported "SCORE" project is to reduce potential PAH emissions from the off-gas stream of a ferroalloy furnace through controlled combustion in a dedicated combustion chamber. The sizing and configuration of the combustion chamber depend on the combined properties of the bulk gas stream and the properties of the PAH itself. In order to achieve efficient and complete combustion, the residence time and minimum temperature need to be optimized. For this design approach, reliable kinetic data for the individual PAH species and/or groups thereof are necessary. However, kinetic data on the combustion of PAH are difficult to obtain, and there is only a limited number of studies. This paper presents an evaluation of kinetic data for some of the PAH obtained from the literature. In the present study, the oxidation is modelled for pure PAH and also for PAH mixed with process gas. Using a perfectly stirred reactor modelling approach, the oxidation is modelled with advanced reaction kinetics to study the influence of residence time and temperature on the conversion of PAH to CO₂ and water. A Chemical Reactor Network (CRN) approach is developed to understand the oxidation of PAH inside the combustion chamber. Chemical reactor network modelling has been found to be a valuable tool in evaluating the oxidation behavior of PAH under various conditions.
Keywords: PAH, PSR, energy recovery, ferroalloy furnace
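As a back-of-the-envelope illustration of the residence-time/temperature trade-off in a perfectly stirred reactor, the sketch below assumes a single lumped first-order PAH oxidation step with illustrative, not validated, Arrhenius parameters.

```python
# PSR conversion sketch: first-order reaction in an ideal stirred reactor,
# X = k*tau / (1 + k*tau). A and Ea are assumed values for illustration.
import numpy as np

A = 1.0e10          # pre-exponential factor, 1/s (assumed)
Ea = 1.8e5          # activation energy, J/mol (assumed)
R = 8.314           # gas constant, J/(mol K)

def psr_conversion(T, tau):
    """Conversion of a first-order reactant in a PSR at temperature T [K]
    and residence time tau [s]."""
    k = A * np.exp(-Ea / (R * T))
    return k * tau / (1.0 + k * tau)

for T in (1000, 1200, 1400):           # K
    for tau in (0.1, 0.5, 2.0):        # s
        print(f"T={T} K, tau={tau} s -> X = {psr_conversion(T, tau):.3f}")
```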
Procedia PDF Downloads 274
11571 Barriers and Facilitators to Inclusive Programming for Children with Mental and/or Developmental Challenges: A Participatory Action Research of Perspectives from Families and Professionals
Authors: Minnie Y. Teng, Kathy Xie, Jarus Tal
Abstract:
Rationale: The traditional approach to community programs for children with mental and/or developmental challenges often involves segregation from typically-developing peers. However, studies show that inclusive education improves children's quality of life, self-concept, and long-term health outcomes. Investigating factors that influence inclusion can thus have important implications for the design and facilitation of community programs such that all children - across a spectrum of needs and abilities - may benefit. Objectives: This study explores barriers and facilitators to inclusive community programming for children aged 0 to 12 with developmental/mental challenges. Methods: Using a participatory-action research methodology, semi-structured focus groups and interviews will be used to explore the perspectives of students, instructors, and staff. Data will be transcribed and coded thematically. Practice Implications or Results: By having a deeper understanding of the barriers and facilitators to inclusive programming in the community, researchers can work with the broader community to facilitate inclusion in children's community programs. Conclusions: Expanding inclusive practices may improve the health and wellbeing of pediatric populations with disabilities, who consistently report lower levels of participation. These findings may help to identify gaps in existing practices and ways to approach them.
Keywords: aquatic programs, children, disabilities, inclusion, community programs
Procedia PDF Downloads 116
11570 From Shallow Semantic Representation to Deeper One: Verb Decomposition Approach
Authors: Aliaksandr Huminski
Abstract:
Semantic Role Labeling (SRL), as a shallow semantic parsing approach, includes recognizing and labeling the arguments of a verb in a sentence. Verb participants are linked with specific semantic roles (Agent, Patient, Instrument, Location, etc.). Thus, SRL can answer key questions such as 'Who', 'When', 'What', 'Where' in a text, and it is widely applied in dialog systems, question answering, named entity recognition, information retrieval, and other fields of NLP. However, SRL has the following flaw: two sentences with identical (or almost identical) meaning can have different semantic role structures. Consider two sentences: (1) John put butter on the bread. (2) John buttered the bread. SRL for (1) and (2) will be significantly different. For the verb put in (1) it is [Agent + Patient + Goal], but for the verb butter in (2) it is [Agent + Goal]. This happens because of one of the most interesting and intriguing features of a verb: its ability to capture participants, as in the case of the verb butter, or their features, as, say, in the case of the verb drink, where the participant's feature of being liquid is shared with the verb. This capture looks like a total fusion of meaning and cannot be decomposed in a direct way (in comparison with compound verbs like babysit or breastfeed). From this perspective, SRL looks too shallow to represent semantic structure. If the key point of semantic representation is the opportunity to use it for making inferences and finding hidden reasons, it is assumed by default that two different but semantically identical sentences must have the same semantic structure; otherwise we will have different inferences from the same meaning. To overcome the above-mentioned flaw, the following approach is suggested. Assume that: P is a participant of a relation; F is a feature of a participant; Vcp is a verb that captures a participant; Vcf is a verb that captures a feature of a participant; Vpr is a primitive verb, i.e., a verb that does not capture any participant and represents only a relation. In other words, a primitive verb is a verb whose meaning does not include meanings from its surroundings. Then Vcp and Vcf can be decomposed as: Vcp = Vpr + P; Vcf = Vpr + F. If all Vcp and Vcf are represented this way, then primitive verbs Vpr can be considered a canonical form for SRL. As a result, there will be no hidden participants caught by a verb, since all participants will be explicitly unfolded. An obvious example of a Vpr is the verb go, which represents pure movement. In this case, the verb drink can be represented as a man-made movement of liquid in a specific direction. Extracting and using primitive verbs for SRL creates a canonical representation that is unique for semantically identical sentences. This leads to the unification of semantic representation, and the critical flaw of SRL described above is resolved.
Keywords: decomposition, labeling, primitive verbs, semantic roles
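A toy sketch of this canonicalization idea follows; the lexicon entries, primitive-verb names, and role labels are illustrative only, not an actual lexical resource.

```python
# Toy verb-decomposition sketch: Vcp = Vpr + P, Vcf = Vpr + F.
LEXICON = {
    # Vcp: 'butter' captures its Patient (the butter itself)
    "butter": {"primitive": "put", "participant": {"Patient": "butter"}},
    # Vcf: 'drink' shares the feature 'liquid' with its Patient
    "drink": {"primitive": "move_into", "feature": {"Patient": "liquid"}},
}

def canonicalize(verb, roles):
    """Unfold captured participants so that semantically identical
    sentences receive the same canonical role structure."""
    entry = LEXICON.get(verb)
    if entry is None:           # already a primitive verb (Vpr)
        return {"verb": verb, "roles": roles}
    unfolded = dict(roles)
    unfolded.update(entry.get("participant", {}))
    return {"verb": entry["primitive"], "roles": unfolded}

# 'John buttered the bread' -> same structure as 'John put butter on the bread'
print(canonicalize("butter", {"Agent": "John", "Goal": "bread"}))
# {'verb': 'put', 'roles': {'Agent': 'John', 'Goal': 'bread', 'Patient': 'butter'}}
```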
Procedia PDF Downloads 368
11569 The Integration of Digital Humanities into the Sociology of Knowledge Approach to Discourse Analysis
Authors: Gertraud Koch, Teresa Stumpf, Alejandra Tijerina García
Abstract:
Discourse analysis research approaches belong to the central research strategies applied throughout the humanities; they focus on the countless forms and ways digital texts and images shape present-day notions of the world. Despite the constantly growing number of relevant digital, multimodal discourse resources, digital humanities (DH) methods are thus far not systematically developed and accessible for discourse analysis approaches. Specifically, the significance of multimodality and the modelling of meaning plurality are yet to be sufficiently addressed. In order to address this research gap, the D-WISE project aims to develop a prototypical working environment as digital support for the sociology of knowledge approach to discourse analysis, together with new IT-analysis approaches for the use of context-oriented embedding representations. Playing an essential role throughout our research endeavor is the constant optimization of hermeneutical methodology in the use of (semi-)automated processes and their corresponding epistemological reflection. Among the discourse analyses, the sociology of knowledge approach to discourse analysis is characterised by reconstructive, accompanying research into the formation of knowledge systems in social negotiation processes. The approach analyses how dominant understandings of a phenomenon develop, i.e., the way they are expressed and consolidated by various actors in specific arenas of discourse until a specific understanding of the phenomenon and its socially accepted structure are established. This article presents insights and initial findings from D-WISE, a joint research project running since 2021 between the Institute of Anthropological Studies in Culture and History and the Language Technology Group of the Department of Informatics at the University of Hamburg. As an interdisciplinary team, we develop central innovations with regard to the availability of relevant DH applications by building a uniform working environment that supports the procedure of the sociology of knowledge approach to discourse analysis within open corpora and heterogeneous, multimodal data sources for researchers in the humanities. We hereby expand the existing range of DH methods by developing contextualized embeddings for improved modelling of the plurality of meaning and the integrated processing of multimodal data. The alignment of this methodological and technical innovation is based on the epistemological working methods of grounded theory as a hermeneutic methodology. In order to systematically relate, compare, and reflect the approaches of structural-IT and hermeneutic-interpretative analysis, the discourse analysis is carried out both manually and digitally. Using the example of current discourses on digitization in the healthcare sector and the associated issues regarding data protection, we have manually built an initial data corpus, in which the relevant actors and discourse positions are analysed by conventional qualitative discourse analysis. At the same time, we are building an extensive digital corpus on the same topic based on the use and further development of entity-centered research tools such as topic crawlers and automated newsreaders. In addition to the text material, this consists of multimodal sources such as images, video sequences, and apps.
In a blended reading process, the data material is filtered, annotated, and finally coded with the help of NLP tools such as dependency parsing, named entity recognition, co-reference resolution, entity linking, sentiment analysis, and other project-specific tools that are being adapted and developed. The coding process is carried out (semi-)automatically by programs that propose coding paradigms based on the calculated entities and their relationships. Simultaneously, these can be specifically trained by manual coding in a closed reading process and specified according to the content issues. Overall, this approach enables purely qualitative, fully automated, and semi-automated analyses to be compared and reflected upon.
Keywords: entanglement of structural IT and hermeneutic-interpretative analysis, multimodality, plurality of meaning, sociology of knowledge approach to discourse analysis
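As a hedged illustration of the kind of NLP preprocessing listed above (dependency parsing and named entity recognition), the snippet below uses spaCy; the model name and the crude actor/position heuristics are assumptions for demonstration, not the D-WISE toolchain.

```python
# Minimal NER + dependency-parsing sketch with spaCy (illustrative only).
import spacy

# requires: python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

text = ("The health ministry argued that patient data must be protected, "
        "while insurers pushed for broader digitization.")
doc = nlp(text)

# Named entities as candidate discourse actors
actors = [(ent.text, ent.label_) for ent in doc.ents]

# Subject-verb pairs as crude proposals for who takes which discourse position
positions = [(tok.text, tok.head.text) for tok in doc if tok.dep_ == "nsubj"]

print("candidate actors:", actors)
print("who-says-what candidates:", positions)
```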
Procedia PDF Downloads 229
11568 Runoff Estimates of Rapidly Urbanizing Indian Cities: An Integrated Modeling Approach
Authors: Rupesh S. Gundewar, Kanchan C. Khare
Abstract:
Runoff contribution from urban areas comes from manmade structures and a few natural contributors. The manmade structures are buildings, roads, and other paved areas, whereas the natural contributors are groundwater, overland flows, etc. Runoff alleviation is provided by manmade as well as natural storages. Manmade storages are storage tanks or other storage structures such as soakaways or soak pits, which are more common in western and European countries. Natural storages are catchment slope, infiltration, catchment length, channel rerouting, drainage density, depression storage, etc. A literature survey on manmade and natural storages/inflows presents the percentage contribution of each. Sanders et al. report that a vegetation canopy reduces runoff by 7% to 12%. Nassif et al. report that catchment slope has an impact of 16% on rainfall runoff for bare standard soil and 24% for grassed soil. Infiltration, being dependent on the pervious/impervious ratio, is catchment specific, but a literature survey presents a range of 15% to 30% loss of rainfall runoff across various catchment study areas. Catchment length and channel rerouting also play a considerable role in the reduction of rainfall runoff. Ground infiltration inflow adds to the runoff where the groundwater table is very shallow and the soil saturates even in a lower-intensity storm; this inflow, together with surface inflow, contributes about 2% of the total runoff volume. Considering the various contributing factors in runoff, a literature survey shows that an integrated modelling approach needs to be considered. Traditional storm water network models are able to predict to a fair/acceptable degree of accuracy provided no interactions with receiving water (river, sea, canal, etc.), ground infiltration, treatment works, etc. are assumed. When such interactions are significant, it becomes difficult to reproduce the actual flood extent using the traditional discrete modelling approach, and as a result the correct flooding situation is very rarely addressed accurately. Since the development of spatially distributed hydrologic models, predictions have become more accurate, at the cost of requiring more accurate spatial information. The integrated approach provides a greater understanding of the performance of the entire catchment. It enables identification of the source of flow in the system and an understanding of how it is conveyed, along with its impact on the receiving body. It also confirms important pain points, hydraulic controls, and the source of flooding, which could not be easily understood with a discrete modelling approach. This also enables decision makers to identify solutions that can be spread throughout the catchment rather than being concentrated at the single point where the problem exists. Thus, it can be concluded from the literature survey that the representation of urban details can be a key differentiator in successfully understanding flooding issues. The intent of this study is to accurately predict the runoff from impermeable areas in an urban area in India. A representative area for which data were available has been selected, and predictions have been made and corroborated with the actual measured data.
Keywords: runoff, urbanization, impermeable response, flooding
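For orientation, a screening-level estimate of peak urban runoff with the rational method (Q = C·i·A) is sketched below; the runoff coefficients, rainfall intensity, and areas are illustrative, and the paper's integrated model is far more detailed than this lumped calculation.

```python
# Rational-method sketch for peak runoff (illustrative values only).
def rational_method(c, intensity_mm_per_hr, area_ha):
    """Peak runoff in m3/s: C dimensionless, i in mm/h, A in hectares."""
    return c * intensity_mm_per_hr * area_ha / 360.0

subcatchments = [  # (runoff coefficient, rainfall intensity mm/h, area ha)
    (0.90, 50.0, 12.0),   # paved/impermeable response
    (0.35, 50.0, 8.0),    # grassed area with slope-reduced response
]

q_peak = sum(rational_method(c, i, a) for c, i, a in subcatchments)
print(f"combined peak runoff: {q_peak:.2f} m3/s")
```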
Procedia PDF Downloads 251
11567 D3Advert: Data-Driven Decision Making for Ad Personalization through Personality Analysis Using BiLSTM Network
Authors: Sandesh Achar
Abstract:
Personalized advertising holds greater potential for higher conversion rates compared to generic advertisements. However, its widespread application in the retail industry faces challenges due to complex implementation processes. These complexities impede the swift adoption of personalized advertisement on a large scale. Personalized advertisement, being a data-driven approach, necessitates consumer-related data, adding to its complexity. This paper introduces an innovative data-driven decision-making framework, D3Advert, which personalizes advertisements by analyzing personalities using a BiLSTM network. The framework utilizes the Myers–Briggs Type Indicator (MBTI) dataset for development. The employed BiLSTM network, specifically designed and optimized for D3Advert, classifies user personalities into one of the sixteen MBTI categories based on their social media posts. The classification accuracy is 86.42%, with precision, recall, and F1-Score values of 85.11%, 84.14%, and 83.89%, respectively. The D3Advert framework personalizes advertisements based on these personality classifications. Experimental implementation and performance analysis of D3Advert demonstrate a 40% improvement in impressions. D3Advert's innovative and straightforward approach has the potential to transform personalized advertising and foster widespread personalized advertisement adoption in marketing.
Keywords: personalized advertisement, deep learning, MBTI dataset, BiLSTM network, NLP
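A sketch of a BiLSTM personality classifier of the kind described, written in PyTorch; the vocabulary size, layer dimensions, and tokenization are placeholders rather than D3Advert's actual hyperparameters.

```python
# BiLSTM text classifier sketch for 16 MBTI classes (assumed dimensions).
import torch
import torch.nn as nn

class BiLSTMClassifier(nn.Module):
    def __init__(self, vocab_size=20000, embed_dim=128, hidden=64, n_types=16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.bilstm = nn.LSTM(embed_dim, hidden, batch_first=True,
                              bidirectional=True)
        self.out = nn.Linear(2 * hidden, n_types)  # 16 MBTI categories

    def forward(self, token_ids):
        x = self.embed(token_ids)
        _, (h_n, _) = self.bilstm(x)
        # concatenate the final forward and backward hidden states
        h = torch.cat([h_n[0], h_n[1]], dim=-1)
        return self.out(h)

model = BiLSTMClassifier()
dummy_posts = torch.randint(1, 20000, (8, 120))    # batch of tokenized posts
logits = model(dummy_posts)                        # shape (8, 16)
loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 16, (8,)))
loss.backward()
print(logits.shape)
```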
Procedia PDF Downloads 45
11566 Analysis of Minimizing Investment Risks in Power and Energy Business Development by Combining Total Quality Management and International Financing Institutions Project Management Tools
Authors: M. Radunovic
Abstract:
The region of Southeastern Europe has substantial energy resource potential and is witnessing an increasing rate of power and energy project investments. This comes as a result of countries harmonizing their legal frameworks and market regulations to conform to those of the European Union, enabling direct private investments. Funding in the power and energy market in this region originates from various resources and investment entities, including commercial and institutional ones. Risk anticipation and assessment are crucial to project success, especially given the long exploitation period of projects in the power and energy domain, as well as the wide range of stakeholders involved. This paper analyzes the possibility of the combined application of tools used in total quality management and by international financing institutions for project planning, execution, and evaluation, with the goal of anticipating, assessing, and minimizing the risks that might occur in the development and execution phases of a power and energy project in the market of Southeastern Europe. The history of successful project management and investments in both the industrial and institutional sectors provides sufficient experience, guidance, and internationally adopted tools for proper project assessment for investments in power and energy. The business environment of Southeastern Europe provides immense potential for developing power and energy projects of various magnitudes, depending on stakeholders' interests. Diversification of investment sources provides assurance that there is interest in and commitment to investing in this market. Global economic and political developments will intensify the pace of investments in the upcoming period. The proposed approach accounts for key parameters that contribute to the sustainability and profitability of a project, which include the technological, educational, social, and economic gaps between the Southeastern European region and Western Europe, market trends in equipment design and production on a global level, the environmentally friendly approach to renewable energy sources as well as conventional power generation systems, and finally the long-term effect of the One Belt One Road Initiative, led by the People's Republic of China, on the power and energy market of this region in the upcoming period. The analysis will outline the key benefits of the approach as well as the accompanying constraints. In parallel, it will provide an overview of the dominant threats and opportunities in the present and future business environment and their influence on the proposed application. Through concrete examples, the full potential of this approach will be presented, along with the necessary improvements that need to be implemented. The number of power and energy projects being developed in Southeastern Europe will increase in the upcoming period, and proper risk analysis will lead to minimizing project failures. The proposed combination of reliable project planning tools from different investment areas can prove beneficial in future power and energy investments and guarantee their sustainability and profitability.
Keywords: capital investments, lean six sigma, logical framework approach, logical framework matrix, one belt one road initiative, project management tools, quality function deployment, Southeastern Europe, total quality management
Procedia PDF Downloads 110
11565 Movie Genre Preference Prediction Using Machine Learning for Customer-Based Information
Authors: Haifeng Wang, Haili Zhang
Abstract:
Most movie recommendation systems have been developed to help customers find items of interest. This work introduces a predictive model usable by small and medium-sized enterprises (SMEs) that need a data-based and analytical approach to stock proper movies for local audiences and retain more customers. We used classification models to extract features from thousands of customers' demographic, behavioral, and social information to predict their movie genre preferences. In the implementation, a Gaussian kernel support vector machine (SVM) classification model and a logistic regression model were established to extract features from the sample data, and their in-sample test errors were compared. Out-of-sample errors were also compared under different Vapnik–Chervonenkis (VC) dimensions in the machine learning algorithm to find and prevent overfitting. The Gaussian kernel SVM prediction model can correctly predict movie genre preferences in 85% of positive cases. The accuracy of the algorithm increased to 93% with a smaller VC dimension and less overfitting. These findings advance our understanding of how to use a machine learning approach to predict customers' preferences with a small data set and to design prediction tools for these enterprises.
Keywords: computational social science, movie preference, machine learning, SVM
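The comparison described above can be sketched as follows with scikit-learn; synthetic data stands in for the customer features and genre labels, and the hyperparameters are illustrative.

```python
# RBF-kernel SVM vs. logistic regression on synthetic customer features.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=2000, n_features=20, n_informative=8,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

svm = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)
logit = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

# Comparing in-sample vs. out-of-sample error exposes overfitting, in the
# spirit of controlling the effective VC dimension of the hypothesis set.
for name, m in [("SVM (RBF)", svm), ("logistic", logit)]:
    print(f"{name}: E_in = {1 - m.score(X_tr, y_tr):.3f}, "
          f"E_out = {1 - m.score(X_te, y_te):.3f}")
```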
Procedia PDF Downloads 261
11564 Reconfigurable Intelligent Surfaces (RIS)-Assisted Integrated Leo Satellite and UAV for Non-terrestrial Networks Using a Deep Reinforcement Learning Approach
Authors: Tesfaw Belayneh Abebe
Abstract:
We investigate how to enhance throughput in a non-terrestrial network (NTN) that integrates low Earth orbit (LEO) satellites and unmanned aerial vehicles (UAVs) with the assistance of reconfigurable intelligent surfaces (RIS). We propose a method to jointly optimize the associations with the LEO satellite, the 3D trajectory of the UAV, and the phase shifts of the RIS to maximize communication throughput for RIS-assisted integrated LEO satellite and UAV-enabled wireless communications. This is challenging due to the time-varying position of the LEO satellite, the high mobility of UAVs, the enormous number of possible control actions, and the large number of RIS elements. Utilizing a multi-agent double deep Q-network (MADDQN), our approach dynamically adjusts LEO satellite association, UAV positioning, and RIS phase shifts. Simulation results demonstrate that our method significantly outperforms baseline strategies in maximizing throughput. Thanks to the integrated network and the RIS, the proposed scheme achieves up to 65.66x higher peak throughput and 25.09x higher worst-case throughput.
Keywords: low Earth orbit (LEO) satellites, unmanned aerial vehicles (UAVs), non-terrestrial network (NTN), reconfigurable intelligent surfaces (RIS), multi-agent double deep Q-network (MADDQN)
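A heavily simplified single-agent double-DQN update is sketched below to illustrate the learning mechanics behind MADDQN; the state encoding (satellite/UAV/RIS geometry), network sizes, and rewards are placeholders, not the paper's setup.

```python
# Simplified double-DQN update step in PyTorch (assumed sizes and rewards).
import torch
import torch.nn as nn

n_state, n_action = 32, 10   # assumed flattened observation/action sizes
q_net = nn.Sequential(nn.Linear(n_state, 128), nn.ReLU(),
                      nn.Linear(128, n_action))
target_net = nn.Sequential(nn.Linear(n_state, 128), nn.ReLU(),
                           nn.Linear(128, n_action))
target_net.load_state_dict(q_net.state_dict())
opt = torch.optim.Adam(q_net.parameters(), lr=1e-3)
gamma = 0.99

def double_dqn_step(s, a, r, s_next, done):
    with torch.no_grad():
        # online net picks the next action, target net evaluates it
        a_star = q_net(s_next).argmax(dim=1, keepdim=True)
        q_next = target_net(s_next).gather(1, a_star).squeeze(1)
        target = r + gamma * (1 - done) * q_next
    q = q_net(s).gather(1, a.unsqueeze(1)).squeeze(1)
    loss = nn.functional.smooth_l1_loss(q, target)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

B = 64  # one synthetic batch (replay-buffer samples in a real agent)
loss = double_dqn_step(torch.randn(B, n_state),
                       torch.randint(0, n_action, (B,)),
                       torch.randn(B), torch.randn(B, n_state),
                       torch.zeros(B))
print("TD loss:", round(loss, 4))
```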
Procedia PDF Downloads 52
11563 Design of a Cooperative Neural Network, Particle Swarm Optimization (PSO) and Fuzzy Based Tracking Control for a Tilt Rotor Unmanned Aerial Vehicle
Authors: Mostafa Mjahed
Abstract:
Tilt-rotor UAVs (Unmanned Aerial Vehicles) are naturally unstable and difficult to maneuver. The purpose of this paper is to design controllers for the stabilization and trajectory tracking of this type of UAV. To this end, artificial intelligence methods have been exploited. First, the dynamics of this UAV were modeled using the Lagrange-Euler method. The conventional method based on Proportional, Integral, and Derivative (PID) control was applied by decoupling the different flight modes. To improve the stability and trajectory tracking of the tilt-rotor, the fuzzy approach and the technique of multilayer neural networks (NN) have been used. Thus, Fuzzy Proportional Integral and Derivative (FPID) and Neural Network-based Proportional Integral and Derivative (NNPID) controllers have been developed. A meta-heuristic approach based on the Particle Swarm Optimization (PSO) method allowed the setting parameters of the NNPID controller to be tuned, giving an improved NNPID-PSO controller. Simulation results in the Matlab environment show the efficiency of the approaches adopted. Moreover, the tilt-rotor UAV becomes stable and follows different types of trajectories with acceptable precision. The fuzzy, NN, and NN-PSO-based approaches demonstrated their robustness, because the presence of disturbances did not alter the stability or the trajectory tracking of the tilt-rotor UAV.
Keywords: neural network, fuzzy logic, PSO, PID, trajectory tracking, tilt-rotor UAV
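To illustrate PSO-based controller tuning, the sketch below tunes PID gains on a toy first-order plant; the plant model, cost function (integral absolute error), gain bounds, and PSO settings are stand-ins, not the tilt-rotor dynamics.

```python
# PSO tuning of PID gains on a toy first-order plant (illustrative only).
import numpy as np

def pid_cost(gains, dt=0.01, t_end=5.0, tau=0.8):
    """Integral absolute error of a PID loop around a first-order plant."""
    kp, ki, kd = gains
    y, integ, e_prev, cost = 0.0, 0.0, 1.0, 0.0
    for _ in range(int(t_end / dt)):
        e = 1.0 - y                               # unit step reference
        integ += e * dt
        u = kp * e + ki * integ + kd * (e - e_prev) / dt
        u = max(min(u, 20.0), -20.0)              # actuator saturation
        e_prev = e
        y += dt * (u - y) / tau                   # Euler step of the plant
        cost += abs(e) * dt
    return cost

rng = np.random.default_rng(3)
n, dim = 20, 3
lo, hi = np.zeros(dim), np.array([10.0, 5.0, 1.0])  # assumed gain bounds
pos = rng.uniform(lo, hi, (n, dim))
vel = np.zeros((n, dim))
pbest, pbest_f = pos.copy(), np.array([pid_cost(p) for p in pos])
gbest = pbest[pbest_f.argmin()]

for _ in range(40):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    f = np.array([pid_cost(p) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[pbest_f.argmin()]

print("tuned [Kp, Ki, Kd]:", gbest.round(2), "IAE:", round(pbest_f.min(), 3))
```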
Procedia PDF Downloads 122
11562 Ionic Liquids as Substrates for Metal-Organic Framework Synthesis
Authors: Julian Mehler, Marcus Fischer, Martin Hartmann, Peter S. Schulz
Abstract:
During the last two decades, the synthesis of metal-organic frameworks (MOFs) has gained ever-increasing attention. Based on their pore size and shape as well as host-guest interactions, they are of interest for numerous fields related to porous materials, like catalysis and gas separation. Usually, MOF synthesis takes place in an organic solvent between room temperature and approximately 220 °C, with mixtures of polyfunctional organic linker molecules and metal precursors as substrates. Reaction temperatures above the boiling point of the solvent, i.e. solvothermal reactions, are run in autoclaves or sealed glass vessels under autogenous pressure. A relatively new approach for the synthesis of MOFs is the so-called ionothermal synthesis route. It applies an ionic liquid as a solvent, which can serve as a structure-directing template and/or a charge-compensating agent in the final coordination polymer structure. Furthermore, this method often allows for less harsh reaction conditions than the solvothermal route. Here a variation of the ionothermal approach is reported, where the ionic liquid also serves as an organic linker source. By using 1-ethyl-3-methylimidazolium terephthalates ([EMIM][Hbdc] and [EMIM]₂[bdc]), the one-step synthesis of MIL-53(Al)/boehmite composites with interesting features is possible. The resulting material is already formed at moderate temperatures (90-130 °C) and is stabilized in the usually unfavored ht-phase. Additionally, in contrast to already published procedures for MIL-53(Al) synthesis, no further activation at high temperatures is mandatory. A full characterization of this novel composite material is provided, including XRD, SS-NMR, elemental analysis, and SEM as well as sorption measurements, and its interesting features are compared to MIL-53(Al) samples produced by the classical solvothermal route. Furthermore, the syntheses of the applied ionic liquids and salts are discussed. The influence of the degree of ionicity of the linker source [EMIM]x[H(2-x)bdc] on the crystal structure and the achievable synthesis temperature is investigated and gives insight into the role of the IL during synthesis. Aside from the synthesis of MIL-53 from EMIM terephthalates, the use of the phosphonium cation in this approach is discussed as well. Additionally, the employment of ILs in the preparation of other MOFs is presented briefly. This includes the ZIF-4 framework from the respective imidazolate ILs and chiral camphorate-based frameworks from their imidazolium precursors.
Keywords: ionic liquids, ionothermal synthesis, material synthesis, MIL-53, MOFs
Procedia PDF Downloads 209
11561 Review of Assessment of Integrated Information System (IIS) in Organisation
Authors: Mariya Salihu Ingawa, Sani Suleiman Isah
Abstract:
The assessment of an Integrated Information System (IIS) in an organisation is an important initiative to enable Information System (IS) managers, as well as top management, to understand the success status of their investment in IS integration efforts. However, without a proper assessment, an organisation will not know its IIS status, which may affect its judgment on what action should be taken onwards. Current research on IIS assessment is lacking, and the related literature focuses more on assessing the technical aspect of IIS. It is argued that assessing the technical aspect alone is inadequate, since organisational and strategic aspects of IIS should also be considered. Current methods, techniques, and tools used by vendors for IIS assessment also lack comprehensive measures to fully assess the Integrated Information System in terms of the technical, organisational, and strategic domains. The purpose of this study is to establish critical success factors for measuring the success of an Integrated Information System. These factors are used as the basis for constructing an approach to comprehensively assess IIS in an organisation. A comprehensive list of success factors for IIS assessment, established from the literature, was initially presented. Expert surveys using both manual and online methods were conducted to verify the factors. Based on the factors, an instrument for IIS assessment was constructed. The results from a case study indicate that through the comprehensive assessment approach, not only is the level of success known, but the contributing factors are also revealed. This research contributes to the field of Information Systems, specifically in the area of Integrated Information System assessment.
Keywords: integrated information system, expert surveys, organisation, assessment
Procedia PDF Downloads 389
11560 An Ensemble Learning Method for Applying Particle Swarm Optimization Algorithms to Systems Engineering Problems
Authors: Ken Hampshire, Thomas Mazzuchi, Shahram Sarkani
Abstract:
As a subset of metaheuristics, nature-inspired optimization algorithms such as particle swarm optimization (PSO) have shown promise both in solving intractable problems and in their extensibility to novel problem formulations, due to their general approach requiring few assumptions. Unfortunately, single instantiations of algorithms require detailed tuning of parameters and cannot be proven to be best suited to a particular illustrative problem on account of the "no free lunch" (NFL) theorem. Using these algorithms in real-world problems requires exquisite knowledge of the many techniques and is not conducive to reconciling the various approaches to given classes of problems. This research aims to present a unified view of PSO-based approaches from the perspective of relevant systems engineering problems, with the express purpose of then eliciting the best solution for any problem formulation in an ensemble learning, bucket-of-models approach. The central hypothesis of the research is that extending the PSO algorithms found in the literature to real-world optimization problems requires a general ensemble-based method for all problem formulations, but a specific implementation and solution for any instance. The main results are a problem-based literature survey and a general method to find more globally optimal solutions for any systems engineering optimization problem.
Keywords: particle swarm optimization, nature-inspired optimization, metaheuristics, systems engineering, ensemble learning
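A minimal sketch of the bucket-of-models idea: run several PSO parameterizations (an ensemble of variants) on the same objective and keep whichever wins, sidestepping per-problem tuning. The variants and the benchmark objective below are illustrative placeholders.

```python
# Ensemble of PSO variants on one objective; keep the best performer.
import numpy as np

def sphere(x):                # stand-in for a systems engineering objective
    return float(np.sum(x ** 2))

def pso(w, c1, c2, dim=10, n=30, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-5, 5, (n, dim))
    vel = np.zeros((n, dim))
    pbest, pbest_f = pos.copy(), np.array([sphere(p) for p in pos])
    gbest = pbest[pbest_f.argmin()]
    for _ in range(iters):
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        f = np.array([sphere(p) for p in pos])
        better = f < pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]
        gbest = pbest[pbest_f.argmin()]
    return pbest_f.min()

bucket = {"canonical": (0.72, 1.49, 1.49),   # Clerc-style constriction values
          "high-inertia": (0.9, 2.0, 2.0),
          "exploitative": (0.4, 1.0, 2.5)}
results = {name: pso(*params) for name, params in bucket.items()}
print("per-variant best:", results)
print("ensemble winner:", min(results, key=results.get))
```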
Procedia PDF Downloads 100
11559 Lectures in Higher Education Using Teaching Strategies and Digital Tools to Overcome Challenges Faced in South Africa by Implementing Blended Learning
Authors: Thaiurie Govender, Shannon Verne
Abstract:
The Fourth Industrial Revolution has ushered in an era where technology significantly impacts various aspects of life, including higher education. Blended learning, which combines synchronous and asynchronous learning, has gained popularity as a pedagogical approach. However, its effective implementation is a challenge, particularly in the context of the COVID-19 pandemic and the technological obstacles faced in South Africa. This study focused on the teaching and learning practices lecturers use to implement blended learning, aiming to understand the strategies used, together with the integration of digital tools, to facilitate the blended learning approach within a private higher educational institution in South Africa. Using heutagogy and constructivism as theoretical frameworks, the study aimed to uncover insights into lecturers' practices for overcoming challenges in designing and facilitating blended learning modules. Through qualitative analysis, the themes of student engagement, teaching and learning strategies, digital tools, and feedback emerged, highlighting the complexities and opportunities of a blended learning classroom. The findings emphasize the importance of tailoring methods to students' needs and the subject matter, aligning with constructivist principles. Recommendations include promoting professional development opportunities, addressing infrastructure issues, and fostering a supportive learning environment.
Keywords: blended learning, digital tools, higher education, teaching strategies
Procedia PDF Downloads 56
11558 Qualitative and Quantitative Research Methodology Theoretical Framework and Descriptive Theory: PhD Construction Management
Authors: Samuel Quashie
Abstract:
PhD research in Construction Management often designs its methods based on those established in the social sciences, using theoretical models to collect, gather, and analyse data to answer research questions. The aim of this work is to apply qualitative and quantitative methods as the data analysis approach and, as part of the theoretical framework, descriptive theory, in order to improve the ability to replicate the research's contribution to knowledge. A practical triangulation approach is used, which covers interviews and observations, literature review and (archival) document studies, project-based case studies, questionnaire surveys, and a review of integrated systems used in construction and construction-related industries. Clarification of the organisational context and management delivery that influence organisational performance and the quality of products and measures is achieved. The results illustrate the improved reliability of this research approach when interpreting real-world phenomena; cumulative results of research can be applied with confidence under similar environments. This assists the validity of the PhD research outcomes and strengthens the confidence to apply cumulative results of research under similar conditions in Built Environment research systems, which have been criticised for a lack of reliability in their approaches when interpreting real-world phenomena.
Keywords: case studies, descriptive theory, theoretical framework, qualitative and quantitative research
Procedia PDF Downloads 388
11557 Off-Policy Q-learning Technique for Intrusion Response in Network Security
Authors: Zheni S. Stefanova, Kandethody M. Ramachandran
Abstract:
With the increasing dependency on our computer devices, we face the necessity of adequate, efficient, and effective mechanisms for protecting our networks. There are two main problems that Intrusion Detection Systems (IDS) attempt to solve: 1) to detect an attack by analyzing the incoming traffic and inspecting the network (intrusion detection); and 2) to produce a prompt response when the attack occurs (intrusion prevention). It is critical to create an intrusion detection model that detects a breach in the system on time, and it is also challenging to make it provide an automatic response, with an acceptable delay, at every single stage of the monitoring process. We cannot afford to adopt security measures that demand high computational power, nor can we accept a mechanism that reacts with a delay. In this paper, we propose an intrusion response mechanism that is based on artificial intelligence, and more precisely, reinforcement learning techniques (RLT). RLT helps us create a decision agent that controls the process of interacting with the undetermined environment. The goal is to find an optimal policy, which represents the intrusion response; the reinforcement learning problem is therefore solved using a Q-learning approach. Our agent produces an optimal immediate response in the process of evaluating the network traffic. This Q-learning approach establishes the balance between exploration and exploitation and provides a unique, self-learning, and strategic artificial intelligence response mechanism for IDS.
Keywords: cyber security, intrusion prevention, optimal policy, Q-learning
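A toy tabular Q-learning sketch of such a response agent is given below; the states, actions, rewards, and transition model are invented stand-ins for real traffic, and the paper's setting is more elaborate.

```python
# Tabular off-policy Q-learning sketch for intrusion response (toy setup).
import numpy as np

states = ["benign", "suspicious", "attack"]
actions = ["allow", "throttle", "block"]
Q = np.zeros((len(states), len(actions)))
alpha, gamma, eps = 0.1, 0.9, 0.2
rng = np.random.default_rng(7)

# Assumed rewards: blocking an attack is good, blocking benign traffic is not.
R = np.array([[ 1.0, -0.5, -1.0],    # benign
              [-0.5,  0.5,  0.2],    # suspicious
              [-1.0,  0.2,  1.0]])   # attack

for episode in range(5000):
    s = rng.integers(len(states))
    # epsilon-greedy balances exploration and exploitation
    a = rng.integers(len(actions)) if rng.random() < eps else int(Q[s].argmax())
    s_next = rng.integers(len(states))          # stand-in traffic dynamics
    # Off-policy update: bootstrap with the max over next-state actions
    Q[s, a] += alpha * (R[s, a] + gamma * Q[s_next].max() - Q[s, a])

policy = {states[s]: actions[int(Q[s].argmax())] for s in range(len(states))}
print("learned response policy:", policy)
```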
Procedia PDF Downloads 240
11556 Agriculture, Food Security and Poverty Reduction in Nigeria: Cointegration and Granger Causality Approach
Authors: Ogunwole Cecilia Oluwakemi, Timothy Ayomitunde Aderemi
Abstract:
Provision of sufficient food and the elimination of abject poverty are the conventional benefits of agriculture in any society. Meanwhile, despite the fact that Nigeria is an agrarian society, food insecurity and poverty have become issues of concern among both scholars and policymakers in recent times. Against this backdrop, this study examined the nexus among agriculture, food security, and poverty reduction in Nigeria from 1990 to 2019 within the framework of the cointegration and Granger causality approach. Data were collected from the Central Bank of Nigeria Statistical Bulletin and the World Development Indicators, respectively. The major results are as follows. A long-run equilibrium relationship exists among agricultural value added, the food production index, and GDP per capita in Nigeria. Similarly, a unidirectional causality flows from the food production index to poverty reduction in Nigeria, and a one-way causality flows from poverty reduction to agricultural value added. Consequently, the study recommends to policymakers in Nigeria, and by extension in other African countries, that agricultural value added and food production are variables that cannot be undermined when poverty reduction occupies the central focus of policy: whenever policymakers want to reduce poverty, policies that drive agricultural value added and food production should be embarked upon. The study thereby contributes to the literature by establishing the type of linkage that exists between agriculture, food security, and poverty reduction in Nigeria.
Keywords: agriculture, value added, food production, GDP per capita, Nigeria
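For readers who want to reproduce this style of analysis, here is a minimal sketch of a cointegration test and a pairwise Granger causality test using statsmodels. The series are synthetic stand-ins for the 1990-2019 annual data; the variable names, the lag order, and the choice of the Engle-Granger test (the abstract does not specify which cointegration test was used) are assumptions for illustration.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import coint, grangercausalitytests

# Synthetic stand-ins for the annual series; in practice these would come
# from the CBN Statistical Bulletin and the WDI database.
rng = np.random.default_rng(1)
n = 30
trend = np.cumsum(rng.normal(0.5, 1.0, n))  # shared stochastic trend
df = pd.DataFrame({
    "agric_value_added": trend + rng.normal(0, 0.5, n),
    "food_production":   trend + rng.normal(0, 0.5, n),
    "gdp_per_capita":    trend + rng.normal(0, 0.5, n),
})

# Engle-Granger test for a long-run equilibrium relationship.
t_stat, p_value, _ = coint(df["agric_value_added"], df["gdp_per_capita"])
print(f"cointegration p-value: {p_value:.3f}")  # small p suggests cointegration

# Does food production Granger-cause GDP per capita?
# grangercausalitytests expects the caused variable in column 0.
results = grangercausalitytests(df[["gdp_per_capita", "food_production"]], maxlag=2)
```

Running the test in both column orders distinguishes unidirectional from bidirectional causality, which is how one-way flows such as those reported above are established.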
Procedia PDF Downloads 201
11555 In the Valley of the Shadow of Death: Gossip, God, and Scapegoating in Susannah, an American Opera by Carlisle Floyd
Authors: Shirl H. Terrell
Abstract:
In the telling of mythologies and stories of cultural and religious histories, the creative arts provide an archetypal lens through which the personal and collective unconscious are viewed, thus revealing mysteries of the unknown psyche. To that end, the author of this paper, using the hermeneutic approach, argues that Carlisle Floyd's (1955) English-language opera Susannah illuminates humanity's instinctual nature and behaviors through music, libretto, and drama. While impressive musical works such as Wagner's Ring Cycle and Webber's Phantom of the Opera have received extensive Jungian analyses, critics and scholars often ignore less esteemed works such as Susannah, notwithstanding the fact that they have been consistently performed on the theater circuit. Such pieces, when given notice, allow viewers to grasp the soul-making depth and timeless quality of productions that may otherwise go unrecognized as culturally or psychologically significant. Although Susannah has sometimes been described as unsophisticated and simple in scope, the author demonstrates why Floyd's 'little' opera, set in New Hope Valley, Appalachia, a cultural region in the Eastern United States known for its prevailing myths and distortions of isolation, temperament, and the judgmentally conservative behavior of its inhabitants, belongs among opera's hallmark works. Its approach to powerful underlying archetypal themes, which give rise to poignant and haunting depictions of the darker and destructive side of the human soul, the Shadow, lends the work its crucial significance. The Shadow's manifestation in the form of the scapegoating complex is central to the plot of Susannah; the church's meting out of rules, judgment, and reparation for sins points to the foreboding aspects of human behavior that evoke their intrinsic nature. The scapegoating complex is highlighted in an eight-step process gleaned from the works of Kenneth Burke and Rene Girard. In summary, through depth-psychological terms and mythological motifs, the author provides an insightful approach to perceiving instinctual behaviors as they play out in an American opera that has been staged over eight hundred times yet, unfortunately, remains in the shadows. Susannah's timelessness is now.
Keywords: archetypes, mythology, opera, scapegoating, Shadow, Susannah
Procedia PDF Downloads 151
11554 Development of a Data-Driven Method for Diagnosing the State of Health of Battery Cells, Based on the Use of an Electrochemical Aging Model, with a View to Their Use in Second Life
Authors: Desplanches Maxime
Abstract:
Accurate estimation of the remaining useful life of lithium-ion batteries for electronic devices is crucial. Data-driven methodologies encounter challenges related to data volume and acquisition protocols, particularly in capturing a comprehensive range of aging indicators. To address these limitations, we propose a hybrid approach that integrates an electrochemical model with state-of-the-art data analysis techniques, yielding a comprehensive database. Our methodology involves infusing an aging phenomenon into a Newman model, leading to the creation of an extensive database capturing various aging states based on non-destructive parameters. This database serves as a robust foundation for subsequent analysis. Leveraging advanced data analysis techniques, notably principal component analysis and t-distributed Stochastic Neighbor Embedding (t-SNE), we extract pivotal information from the data. This information is harnessed to construct a regression function using either random forest or support vector machine algorithms. The resulting predictor demonstrates a 5% error margin in estimating remaining battery life, providing actionable insights for optimizing usage. Furthermore, the database was built from the Newman model calibrated for aging and performance using data from a European project called Teesmat. The model was then initialized numerous times with different aging values, for instance, with varying thicknesses of SEI (Solid Electrolyte Interphase). This comprehensive approach ensures a thorough exploration of battery aging dynamics, enhancing the accuracy and reliability of our predictive model. Of particular importance is our reliance on the database generated through the integration of the electrochemical model. This database serves as a crucial asset in advancing our understanding of aging states. Beyond its capability for precise remaining life predictions, this database-driven approach offers valuable insights for optimizing battery usage and adapting the predictor to various scenarios. This underscores the practical significance of our method in facilitating better decision-making regarding lithium-ion battery management.
Keywords: Li-ion battery, aging, diagnostics, data analysis, prediction, machine learning, electrochemical model, regression
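A minimal sketch of the downstream estimation step described above: dimensionality reduction followed by a random forest regression of remaining life. The features and target here are synthetic placeholders for the database generated from the aging-infused Newman model; the column meanings, dataset size, and the number of retained components are assumptions for illustration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_percentage_error

# Synthetic stand-in for the model-generated database: rows are simulated
# cells, columns are non-destructive aging indicators (e.g. SEI thickness
# was one of the varied parameters); the target is remaining life in cycles.
rng = np.random.default_rng(42)
latent = rng.normal(size=(2000, 5))                      # underlying aging factors
X = latent @ rng.normal(size=(5, 20)) + 0.1 * rng.normal(size=(2000, 20))
y = 1000 - 40 * latent[:, 0] + 15 * latent[:, 1] + rng.normal(0, 5, 2000)

# Dimensionality reduction before regression, as in the described pipeline.
X_red = PCA(n_components=5).fit_transform(X)

X_tr, X_te, y_tr, y_te = train_test_split(X_red, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)

err = mean_absolute_percentage_error(y_te, model.predict(X_te))
print(f"relative error on synthetic data: {err:.1%}")  # the paper reports ~5%
```

The same skeleton applies with a support vector machine regressor swapped in for the random forest, which is the alternative the abstract mentions.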
Procedia PDF Downloads 71
11553 Identifying Autism Spectrum Disorder Using Optimization-Based Clustering
Authors: Sharifah Mousli, Sona Taheri, Jiayuan He
Abstract:
Autism spectrum disorder (ASD) is a complex developmental condition involving persistent difficulties with social communication, restricted interests, and repetitive behavior. The challenges associated with ASD can interfere with an affected individual's ability to function in social, academic, and employment settings. Although, to the best of our knowledge, there is no effective medication for treating ASD, early intervention can significantly improve an affected individual's overall development. Hence, an accurate diagnosis of ASD at an early phase is essential. The use of machine learning approaches improves and speeds up the diagnosis of ASD. In this paper, we focus on the application of unsupervised clustering methods to ASD, as the large volume of ASD data generated through hospitals, therapy centers, and mobile applications has no pre-existing labels. We conduct a comparative analysis using seven clustering approaches (K-means, agglomerative hierarchical, model-based, fuzzy C-means, affinity propagation, self-organizing maps, and linear vector quantisation) as well as the recently developed optimization-based clustering (COMSEP-Clust) approach. We evaluate the performance of the clustering methods extensively on real-world ASD datasets encompassing different age groups: toddlers, children, adolescents, and adults. Our experimental results suggest that the COMSEP-Clust approach outperforms the other seven methods in recognizing ASD with well-separated clusters.
Keywords: autism spectrum disorder, clustering, optimization, unsupervised machine learning
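A minimal sketch of the comparative protocol, limited to the four baseline methods readily available in scikit-learn (COMSEP-Clust, fuzzy C-means, self-organizing maps, and linear vector quantisation are not in the standard library and are omitted here). The synthetic two-cluster data are a stand-in for the unlabelled ASD screening records, and the choice of two clusters and of the silhouette score as the comparison metric are assumptions for illustration.

```python
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans, AgglomerativeClustering, AffinityPropagation
from sklearn.mixture import GaussianMixture
from sklearn.metrics import silhouette_score

# Synthetic stand-in for an unlabelled ASD screening dataset.
X, _ = make_blobs(n_samples=300, centers=2, n_features=10, random_state=7)

methods = {
    "k-means": KMeans(n_clusters=2, n_init=10, random_state=0),
    "agglomerative": AgglomerativeClustering(n_clusters=2),
    "model-based (GMM)": GaussianMixture(n_components=2, random_state=0),
    "affinity propagation": AffinityPropagation(random_state=0),
}

# Compare how well-separated each method's clusters are.
for name, model in methods.items():
    labels = model.fit_predict(X)
    print(f"{name:22s} silhouette = {silhouette_score(X, labels):.3f}")
```

Note that affinity propagation chooses its own number of clusters, so its silhouette score is not directly comparable when it settles on more than two.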
Procedia PDF Downloads 118
11552 Object-Based Flow Physics for Aerodynamic Modelling in Real-Time Environments
Authors: William J. Crowther, Conor Marsh
Abstract:
Object-based flow simulation allows fast computation of arbitrarily complex aerodynamic models made up of simple objects with limited flow interactions. The proposed approach is universally applicable to objects made from arbitrarily scaled ellipsoid primitives at arbitrary aerodynamic attitude and angular rate. The use of a component-based aerodynamic modelling approach increases efficiency by allowing selective inclusion of different physics models at run-time and allows extensibility through the development of new models. Insight into the numerical stability of the model under first-order fixed-time-step integration schemes is provided by stability analysis of the drag component. The compute cost of model components and functions is evaluated and compared against numerical benchmarks. Model static outputs are verified against theoretical expectations and dynamic behaviour using falling plate data from the literature. The model is applied to a range of case studies to demonstrate the efficacy of its application in extensibility, ease of use, and low computational cost. Dynamically complex multi-body systems can be implemented in a transparent and efficient manner, and we successfully demonstrate large scenes with hundreds of objects interacting with diverse flow fields.
Keywords: aerodynamics, real-time simulation, low-order model, flight dynamics
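As a minimal sketch of the kind of stability analysis mentioned for the drag component, the following integrates a quadratic drag term with a first-order fixed-time-step (explicit Euler) scheme and reports the standard linearized stability bound dt < 2/k. The density, drag coefficient, reference area, and mass are illustrative constants, not the paper's model parameters.

```python
import numpy as np

RHO = 1.225   # air density, kg/m^3
CD = 0.8      # drag coefficient (placeholder value)
AREA = 0.05   # reference area, m^2
MASS = 1.0    # kg
DT = 0.01     # fixed time step, s

def drag_accel(v):
    """Quadratic drag acceleration opposing the velocity vector."""
    speed = np.linalg.norm(v)
    return -0.5 * RHO * CD * AREA * speed * v / MASS

def step(v, dt=DT):
    """First-order fixed-step (explicit Euler) integration of the drag term."""
    return v + dt * drag_accel(v)

# Linearising the drag term about speed |v| gives an effective decay rate
# k = rho*Cd*A*|v|/m; explicit Euler is stable only while dt < 2/k.
v = np.array([50.0, 0.0, 0.0])
k = RHO * CD * AREA * np.linalg.norm(v) / MASS
print(f"decay rate k = {k:.2f} 1/s, stability bound dt < {2.0 / k:.3f} s")

for _ in range(200):      # integrate 2 s of flight
    v = step(v)
print("speed after 2 s:", round(float(np.linalg.norm(v)), 2))
```

Because k grows with speed, the stability bound tightens for fast objects, which is why a fixed-step real-time engine must check the worst-case speed in a scene when selecting its time step.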
Procedia PDF Downloads 104