Search results for: raw complex data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28321

26551 Developing Fault Tolerance Metrics of Web and Mobile Applications

Authors: Ahmad Mohsin, Irfan Raza Naqvi, Syda Fatima Usamn

Abstract:

Applications with a higher fault tolerance index are considered more reliable and trustworthy drivers of quality. In recent years, application development has shifted from traditional desktop and web software to native and hybrid applications for web and mobile platforms. With the emergence of the Internet of Things (IoT), cloud, and big data trends, the need to measure fault tolerance for these complex applications has increased in order to evaluate their performance. There is a phenomenal gap between fault tolerance metrics development and measurement. Classic quality metric models focused on metrics for traditional systems, ignoring the software, hardware, and deployment characteristics of today's applications. In this paper, we propose simple metrics to measure fault tolerance based on general requirements for web and mobile applications. We align factors and sub-factors using the Goal-Question-Metric (GQM) approach for metrics development, considering the nature of mobile and web apps. A systematic mathematical formulation is given to measure the metrics quantitatively. Three web and mobile applications are selected to measure fault tolerance factors using the formulated metrics. The applications are then analysed on the basis of results from observations in a controlled environment on different mobile devices. Quantitative results are presented depicting the fault tolerance of the respective applications.
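
As a rough illustration of how GQM-derived metrics can be aggregated, the sketch below computes a weighted fault tolerance index from sub-factor scores; the sub-factor names and weights are invented for illustration and are not the authors' actual formulation.

```python
# Illustrative only: a weighted fault tolerance index aggregated from
# hypothetical sub-factor scores (names and weights are assumptions,
# not the metrics formulated in the paper).

def fault_tolerance_index(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average of sub-factor scores in [0, 1]."""
    total_weight = sum(weights.values())
    return sum(scores[k] * weights[k] for k in weights) / total_weight

# Example sub-factor scores observed for one app in a controlled environment.
scores = {"error_recovery": 0.82, "graceful_degradation": 0.64,
          "data_integrity": 0.91, "network_resilience": 0.55}
weights = {"error_recovery": 0.3, "graceful_degradation": 0.2,
           "data_integrity": 0.3, "network_resilience": 0.2}

print(f"Fault tolerance index: {fault_tolerance_index(scores, weights):.3f}")
```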

Keywords: web and mobile applications, reliability, fault tolerance metric, quality metrics, GQM based metrics

Procedia PDF Downloads 339
26550 Digital Material Characterization Using the Quantum Fourier Transform

Authors: Felix Givois, Nicolas R. Gauger, Matthias Kabel

Abstract:

Efficient digital material characterization is of great interest to many fields of application. It consists of the following three steps. First, a 3D reconstruction from 2D scans must be performed. Then, the resulting gray-value image of the material sample is enhanced by image processing methods. Finally, partial differential equations (PDE) are solved on the segmented image, and by averaging the resulting solution fields, effective properties like stiffness or conductivity can be computed. Due to the high resolution of current CT images, the latter is typically performed with matrix-free solvers. Among them, a solver that uses the explicit formula of the Green-Eshelby operator in Fourier space has been proposed by Moulinec and Suquet. Its algorithmically most complex part is the fast Fourier transform (FFT). In our talk, we will discuss the potential quantum advantage that can be obtained by replacing the FFT with the quantum Fourier transform (QFT). We will especially show that the data transfer for noisy intermediate-scale quantum (NISQ) devices can be improved by using appropriate boundary conditions for the PDE, which also allows using semi-classical versions of the QFT. In the end, we will compare the results of the QFT-based algorithm for simple geometries with the results of the FFT-based homogenization method.
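
To make the role of the FFT concrete, here is a minimal sketch (our assumption of the general scheme, not the authors' code) of a Moulinec-Suquet-style fixed-point iteration for a 1D periodic conductivity problem; every iteration is dominated by a forward and an inverse FFT, which are exactly the operations a QFT would replace.

```python
import numpy as np

# Minimal 1D Moulinec-Suquet-style basic scheme for periodic conductivity
# homogenization (illustrative sketch, not the paper's implementation).
n = 256
sigma = np.where(np.arange(n) < n // 2, 1.0, 10.0)  # two-phase material
sigma0 = 0.5 * (sigma.min() + sigma.max())          # reference medium
E = 1.0                                             # prescribed mean gradient

e = np.full(n, E)                                   # initial gradient field
for _ in range(200):
    tau = (sigma - sigma0) * e                      # polarization field
    tau_hat = np.fft.fft(tau)
    e_hat = -tau_hat / sigma0                       # Green operator, 1D case
    e_hat[0] = n * E                                # enforce the mean gradient
    e_new = np.fft.ifft(e_hat).real
    if np.max(np.abs(e_new - e)) < 1e-10:
        e = e_new
        break
    e = e_new

sigma_eff = np.mean(sigma * e) / np.mean(e)         # effective conductivity
print(sigma_eff, 1.0 / np.mean(1.0 / sigma))        # exact 1D result: harmonic mean
```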

Keywords: maximum likelihood quantum amplitude estimation (MLQAE), numerical homogenization, quantum Fourier transformation (QFT), NISQ devices

Procedia PDF Downloads 72
26549 Comparative Spatial Analysis of a Re-Arranged Hospital Building

Authors: Burak Köken, Hatice D. Arslan, Bilgehan Y. Çakmak

Abstract:

Analyzing the relational networks of hospital buildings, which have complex structures and distinctive spatial relationships, is quite difficult. Hospital buildings, which require specialized spatial relationship solutions during design and continuous renewal as technology develops, should survive and keep providing service even after disasters such as earthquakes. In this study, a hospital building whose load-bearing system was strengthened because of insufficient earthquake performance, and to which an additional building was attached to meet the increasing need for space, is discussed, and a comparative spatial evaluation of the hospital building is made with regard to its status before and after the change. For this purpose, the spatial organization of the building before and after the change was analyzed by means of the Space Syntax method, and the effects of the change on space organization parameters were investigated through an analytical procedure. Using UCL's Depthmap software, connectivity, visual mean depth, beta and visual integration analyses were conducted. Based on the data obtained from the analyses, it was seen that the relationships between the spaces of the building increased after the change and that the building became more explicit and understandable for its occupants. Furthermore, the analysis findings indicated that increased depth makes spaces harder to perceive, and that changes addressing this problem generally ease spatial use.
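
As a hedged illustration of the graph measures involved, the sketch below computes connectivity (node degree) and mean depth (average shortest-path distance) for a toy spatial adjacency graph; the room layout is invented, and the computation only approximates what tools like Depthmap report, not this study's actual data.

```python
import networkx as nx

# Toy spatial adjacency graph: nodes are spaces, edges are direct connections
# (hypothetical layout, not the hospital analysed in the study).
G = nx.Graph([("lobby", "corridor"), ("corridor", "ward_a"),
              ("corridor", "ward_b"), ("ward_b", "service"),
              ("lobby", "clinic")])

for node in G.nodes:
    connectivity = G.degree(node)
    depths = nx.shortest_path_length(G, source=node)
    mean_depth = sum(depths.values()) / (len(G) - 1)
    # Space syntax integration is commonly taken as a normalized
    # reciprocal of mean depth; higher means easier to reach.
    integration = 1.0 / mean_depth
    print(f"{node:10s} connectivity={connectivity} "
          f"mean_depth={mean_depth:.2f} integration={integration:.2f}")
```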

Keywords: architecture, hospital building, space syntax, strengthening

Procedia PDF Downloads 516
26548 Understanding Level 5 Sport Students' Perspectives of the Barriers to Progression and Attainment

Authors: Emma Whewell, Lee Waters, Mark Wall

Abstract:

This paper is a mixed-methods investigation into the perceived barriers to attainment and progression. Initially, entry-level data were analysed to identify some of the key characteristics of the student cohort, for example entry route, age and ethnic background. Secondly, a phenomenological case study of the lived experiences of 15 level 5 sport and exercise students was conducted. It aimed to understand the complexities of success in higher education, far beyond entry qualifications, indices of deprivation and POLAR characteristics, to offer a first-hand account of student perceptions and interpretations of the barriers they face in progression, retention and completion on their programme. Using focus groups and interviews with students from a range of indices, we offer a set of rich case studies exploring the interpretations of our students' lived experiences and challenges. Findings demonstrate a complex set of circumstances centring on managing workload, use of support services, and aspirations of students that conflict with university priorities. Conclusions centre on the role of academic and pastoral support, assumptions about students' priorities, and practical interventions to support achievement.

Keywords: access and participation, higher education, progression and retention, barriers

Procedia PDF Downloads 103
26547 Correlation Analysis between Sensory Processing Sensitivity (SPS), Meares-Irlen Syndrome (MIS) and Dyslexia

Authors: Kaaryn M. Cater

Abstract:

Students with sensory processing sensitivity (SPS), Meares-Irlen Syndrome (MIS) and dyslexia can become overwhelmed and struggle to thrive in traditional tertiary learning environments. An estimated 50% of tertiary students who disclose learning-related issues are dyslexic. This study explores the relationship between SPS, MIS and dyslexia. Baseline measures will be analysed to establish any correlation between these three minority methods of information processing. SPS is an innate sensitivity trait found in 15-20% of the population and has been identified in over 100 species of animals. Humans with SPS are referred to as Highly Sensitive People (HSP), and SPS is measured with a 27-item self-report scale known as the Highly Sensitive Person Scale (HSPS). A 2016 study conducted by the author established baseline data for HSP students at a tertiary institution in New Zealand. The results of that study showed that all participating HSP students believed knowledge of SPS to be life-changing and useful in managing life and study; in addition, they believed that all tutors and incoming students should be given information on SPS. MIS is a visual processing and perception disorder found in approximately 10% of the population, with a variety of symptoms including visual fatigue, headaches and nausea. One way to ease some of these symptoms is through the use of colored lenses or overlays. Dyslexia is a complex phonologically based information processing variation present in approximately 10% of the population. An estimated 50% of dyslexics are thought to have MIS. The study exploring possible correlations between these minority forms of information processing is due to begin in February 2017. An invitation will be extended to all first-year students enrolled in degree programmes across all faculties and schools within the institution. An estimated 900 students will be eligible to participate in the study. Participants will be asked to complete a battery of online questionnaires including the Highly Sensitive Person Scale, the International Dyslexia Association adult self-assessment and the adapted Irlen indicator. All three scales have been used extensively in the literature and have been validated among many populations. All participants whose scores on any (or some) of the three questionnaires suggest a minority method of information processing will receive an invitation to meet with a learning advisor and will be given access to counselling services if they choose. Meeting with a learning advisor is not mandatory, and some participants may choose not to receive help. Data will be collected using the QuestionPro platform, and baseline data will be analysed using correlation and regression analysis to identify relationships and predictors between SPS, MIS and dyslexia. This study forms part of a larger three-year longitudinal study, and participants will be required to complete questionnaires at annual intervals in subsequent years until completion of (or withdrawal from) their degree. At these data collection points, participants will be questioned on any additional support received relating to their minority method(s) of information processing. Data from this study will be available by April 2017.

Keywords: dyslexia, highly sensitive person (HSP), Meares-Irlen Syndrome (MIS), minority forms of information processing, sensory processing sensitivity (SPS)

Procedia PDF Downloads 231
26546 Quasi-Photon Monte Carlo on Radiative Heat Transfer: An Importance Sampling and Learning Approach

Authors: Utkarsh A. Mishra, Ankit Bansal

Abstract:

At high temperature, radiative heat transfer is the dominant mode of heat transfer. It is governed by various phenomena such as photon emission, absorption, and scattering. The solution of the governing integrodifferential equation of radiative transfer is a complex process, more so when the effects of the participating medium and wavelength properties are taken into consideration. Although a generic formulation of such radiative transport problems can be modeled for a wide variety of problems with non-gray, non-diffusive surfaces, there is always a trade-off between the simplicity and accuracy of the problem. Recently, solutions of complicated mathematical problems with statistical methods based on randomization of naturally occurring phenomena have gained significant importance. Photon bundles with discrete energy can be replicated with random numbers describing the emission, absorption, and scattering processes. Photon Monte Carlo (PMC) is a simple yet powerful technique to solve radiative transfer problems in complicated geometries with an arbitrary participating medium. The method, on the one hand, increases the accuracy of estimation and, on the other hand, increases the computational cost. The participating media (generally a gas such as CO₂, CO, or H₂O) present complex emission and absorption spectra. Modeling the emission and absorption accurately with random numbers requires weighted sampling, as different sections of the spectrum carry different importance. Importance sampling (IS) was implemented to sample random photons of arbitrary wavelength, and the sampled data provided unbiased training of MC estimators for better results. A better replacement for uniform random numbers is deterministic, quasi-random sequences; Halton, Sobol, and Faure low-discrepancy sequences are used in this study. They possess better space-filling performance than a uniform random number generator and give rise to low-variance, stable Quasi-Monte Carlo (QMC) estimators with faster convergence. An optimal supervised learning scheme was further considered to reduce the computational costs of the PMC simulation. A one-dimensional plane-parallel slab problem with participating media was formulated. The histories of some randomly sampled photon bundles were recorded to train an artificial neural network (ANN) back-propagation model. The flux calculated using the standard quasi PMC was taken as the training target. Results obtained with the proposed model for the one-dimensional problem are compared with the exact analytical solution and the PMC model with the line-by-line (LBL) spectral model. The approximate variance obtained was around 3.14%. Results were analyzed with respect to time and the total flux in both cases. A significant reduction in variance as well as a faster rate of convergence was observed with the QMC method over the standard PMC method. However, the results obtained with the ANN method showed greater variance (around 25-28%) than the other cases. There is great scope for machine learning models to help further reduce computational cost once trained successfully. Multiple ways of selecting the input data, as well as various architectures, will be tried so that the problem setting can be fully conveyed to the ANN model. Better results can be achieved in this unexplored domain.
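
As a minimal, hedged illustration of why low-discrepancy sequences reduce estimator variance, the sketch below compares pseudo-random and Sobol sampling on a simple 1D integral; it is a generic QMC demonstration, not the paper's radiative transfer code.

```python
import numpy as np
from scipy.stats import qmc

# Estimate I = integral of exp(x) over [0, 1] = e - 1, with plain MC vs. Sobol QMC.
f = np.exp
n = 2**10            # Sobol sequences balance best at powers of two
exact = np.e - 1

rng = np.random.default_rng(0)
mc_estimates = [f(rng.random(n)).mean() for _ in range(50)]

sobol_estimates = []
for seed in range(50):
    sampler = qmc.Sobol(d=1, scramble=True, seed=seed)
    x = sampler.random(n).ravel()
    sobol_estimates.append(f(x).mean())

print(f"exact          : {exact:.6f}")
print(f"MC    mean/std : {np.mean(mc_estimates):.6f} / {np.std(mc_estimates):.2e}")
print(f"Sobol mean/std : {np.mean(sobol_estimates):.6f} / {np.std(sobol_estimates):.2e}")
```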

Keywords: radiative heat transfer, Monte Carlo Method, pseudo-random numbers, low discrepancy sequences, artificial neural networks

Procedia PDF Downloads 217
26545 Model for Introducing Products to New Customers through Decision Tree Using Algorithm C4.5 (J-48)

Authors: Komol Phaisarn, Anuphan Suttimarn, Vitchanan Keawtong, Kittisak Thongyoun, Chaiyos Jamsawang

Abstract:

This article analyzes insurance information containing data on customer decisions when purchasing a life insurance pay package. The data were analyzed in order to present new customers with the Life Insurance Perfect Pay package that meets their needs as closely as possible. The basic data of the insurance pay package were collected for data mining, thus reducing the scattering of information. The data were then classified in order to obtain a decision model, or decision tree, using algorithm C4.5 (J-48). In the classification, WEKA tools were used to form the model, and testing datasets were used to test the decision tree for accurate decisions. Validation of this model showed an accurate prediction rate of 68.43%, while 31.25% were errors. The same set of data was then tested with other models, i.e. Naive Bayes and ZeroR. The results showed that the J-48 method predicted more accurately. The researcher therefore applied the decision tree in writing the program used to introduce the product to new customers, to support customers' decision making in purchasing the insurance package that meets their needs as closely as possible.
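
For readers who want to reproduce the general workflow outside WEKA, here is a hedged sketch using scikit-learn's CART-based DecisionTreeClassifier with the entropy criterion as a C4.5-like stand-in (scikit-learn does not implement C4.5/J-48 itself); the customer attributes and labels are invented placeholders.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Hypothetical customer records: [age, income, num_dependents]
rng = np.random.default_rng(42)
X = np.column_stack([rng.integers(20, 70, 500),
                     rng.normal(30_000, 8_000, 500),
                     rng.integers(0, 5, 500)])
# Hypothetical label: bought the pay package or not.
y = (X[:, 1] + 5_000 * X[:, 2] > 40_000).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Entropy-based splits approximate C4.5's information-gain criterion.
tree = DecisionTreeClassifier(criterion="entropy", max_depth=4, random_state=0)
tree.fit(X_train, y_train)
print(f"accuracy: {accuracy_score(y_test, tree.predict(X_test)):.2%}")
```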

Keywords: decision tree, data mining, customers, life insurance pay package

Procedia PDF Downloads 423
26544 The Perception of ‘School’ as a Positive Support Factor

Authors: Yeliz Yazıcı, Alev Erenler

Abstract:

A school is an institution designed to provide learning and teaching places and environments under the guidance of selected teachers. A school is not just a place or an institution; it is a complex, living structure that is constantly changing. It is also an undeniable fact that schools have shaped ideas, the future, and society, as well as students and their lives. Against this background, schools with academic excellence are considered successful ones, and academic excellence is a composition of excellence in teachers, management, and the physical environment. This is the general perception of authorities and parents where excellence is concerned, but the school is also a developing and supporting organism. In this context, the main aim of this study is to compare student and teacher perceptions of the school as a 'positive support factor'. The study has a quantitative and qualitative design, and a questionnaire was administered to both teachers and students via online and face-to-face meetings. The aim is to define participants' perceptions of the school as a positive support factor, that is, the role of the school in establishing self-efficacy, shaping and acquiring behaviour, etc. The gathered data are analysed with the SPSS program, and a detailed discussion is carried out in the frame of the related literature.

Keywords: positive support factor, education, school, student teacher perception

Procedia PDF Downloads 169
26543 Farmers’ Perception and Response to Climate Change Across Agro-ecological Zones in Conflict-Ridden Communities in Cameroon

Authors: Lotsmart Fonjong

Abstract:

The livelihood of rural communities in the West African state of Cameroon, which is largely dictated by natural forces (rainfall, temperature, and soil), is today threatened by climate change and armed conflict. This paper investigates the extent to which rural communities are aware of climate change and how their perceptions of changes across different agro-ecological zones have impacted farming practices, output, and lifestyles, on the one hand, and the extent to which local armed conflicts are confounding their efforts and adaptation abilities, on the other. The paper is based on a survey conducted among small farmers in selected localities within the forest and savanna ecological zones of the conflict-ridden Northwest and Southwest regions of Cameroon. Attention is paid to farmers' gender and to the scale and type of farming. Farmers' perception of and response to climate change are analysed alongside local rainfall and temperature data and mobilization for climate justice. Findings highlight the fact that farmers' perceptions generally corroborate local climatic data. Climatic instability has negatively affected farmers' output, food prices, standards of living, and food security. However, the vulnerability of the population varies across ecological zones, gender, and crop types. While these factors also account for differences in local response and adaptation to climate change, ongoing armed conflicts in these regions have further complicated opportunities for climate-driven agricultural innovations, inputs, and exchange of information among farmers. This situation underlines how poor communities, as victims, are forced into many complex problems outside their making. It is therefore important to mainstream farmers' perceptions and differences into policy strategies that consider both climate change and the Anglophone conflict as national security concerns for sustainable development in Cameroon.

Keywords: adaptation policies, climate change, conflict, small farmers, Cameroon

Procedia PDF Downloads 152
26542 Analysis of the Role of Population Ageing on Crosstown Roads' Traffic Accidents Using Latent Class Clustering

Authors: N. Casado-Sanz, B. Guirao

Abstract:

The population aged 65 and over is projected to double in the coming decades. Due to this increase, the driver population is expected to grow, and in the near future all countries will be faced with population ageing of varying intensity and on unique time frames. This is the greatest challenge facing industrialized nations, and due to this fact, the study of the dependency relationships between population ageing and road safety is becoming increasingly relevant. Although the deterioration of driving skills in the elderly has been analyzed in depth, to our knowledge few research studies have focused on the road infrastructure and the mobility of this particular group of users. In Spain, crosstown roads have one of the highest fatality rates. These rural routes have a higher percentage of elderly people, who are more dependent on driving due to the absence or limitations of urban public transportation. Analysing road safety on these routes is very complex because of the variety of their features, the dispersion of the data, and the complete lack of related literature. The objective of this paper is to identify key factors that cause traffic accidents. The cases under study were accidents involving people killed or seriously injured on Spanish crosstown roads during the period 2006-2015. Latent class cluster analysis was applied as a preliminary tool for segmenting the accidents, considering population ageing as the main input among other socioeconomic indicators. Subsequently, a linear regression analysis was carried out to estimate the degree of dependence between the accident rate and the variables that define each group. The results show that segmenting the data provides valuable additional information. Additionally, the results revealed the clear influence of the ageing variable in the clusters obtained. Other variables related to infrastructure and mobility levels, such as the crosstown road layout and the traffic intensity, appeared to be among the key factors in the causality of road accidents.
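
The following hedged sketch illustrates the segmentation idea with scikit-learn's GaussianMixture as a convenient stand-in for latent class clustering (true latent class analysis works on categorical indicators, typically via dedicated packages); the accident features are synthetic.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic accident records: [share of population over 65, traffic
# intensity (AADT), crosstown road length in km] - invented data.
rng = np.random.default_rng(1)
X = np.vstack([
    rng.normal([0.30, 2_000, 1.5], [0.05, 400, 0.4], (100, 3)),  # aged, quiet
    rng.normal([0.15, 8_000, 0.8], [0.04, 900, 0.3], (100, 3)),  # younger, busy
])

# Model-based clustering; BIC can guide the number of latent classes.
best = min((GaussianMixture(k, random_state=0).fit(X) for k in range(1, 6)),
           key=lambda m: m.bic(X))
print(f"classes selected by BIC: {best.n_components}")
print("class means:\n", best.means_.round(2))
```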

Keywords: cluster analysis, population ageing, rural roads, road safety

Procedia PDF Downloads 107
26541 Enhancing Knowledge Graph Convolutional Networks with Structural Adaptive Receptive Fields for Improved Node Representation and Information Aggregation

Authors: Zheng Zhihao

Abstract:

Recently, knowledge graph convolutional networks (KGCN) have developed powerful capabilities in knowledge representation and reasoning tasks. However, traditional KGCNs often use a fixed weight mechanism when aggregating information, failing to make full use of rich structural information; this limits the expressive power of node representations and easily causes over-smoothing problems. To address these challenges, this paper proposes a distinct graph neural network model called KGCN-STAR (Knowledge Graph Convolutional Network with Structural Adaptive Receptive Fields). The model dynamically adjusts each node's receptive field range by introducing structurally adaptive receptive fields, and a subgraph aggregator is designed to capture local structural information more effectively. Experimental results show that KGCN-STAR achieves significant performance improvements on multiple knowledge graph datasets, especially in representation learning over complex structures.
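
Since the abstract does not give the exact formulation, the sketch below is our schematic guess at what a structurally adaptive receptive field can look like: each node aggregates features over one or two hops depending on its degree, instead of a fixed one-hop neighbourhood.

```python
import numpy as np

# Schematic adaptive-receptive-field aggregation (our illustrative guess,
# not the KGCN-STAR formulation): low-degree nodes look 2 hops out,
# high-degree nodes stay at 1 hop to limit over-smoothing.
A = np.array([[0, 1, 0, 0, 0],
              [1, 0, 1, 1, 0],
              [0, 1, 0, 1, 0],
              [0, 1, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
X = np.eye(5)                              # toy node features

deg = A.sum(axis=1)
reach_2hop = np.clip(A + A @ A, 0, 1)      # 2-hop reachability (unweighted)
np.fill_diagonal(reach_2hop, 0)            # exclude self-loops

H = np.zeros_like(X)
for v in range(len(A)):
    hood = A[v] if deg[v] >= 3 else reach_2hop[v]   # adaptive radius
    neighbours = hood > 0
    H[v] = X[neighbours].mean(axis=0)      # mean aggregation over the field

print(H.round(2))
```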

Keywords: knowledge graph (KG), graph neural networks (GNN), structural adaptive receptive fields, information aggregation

Procedia PDF Downloads 15
26540 The Impact of Window Opening Occupant Behavior Models on Building Energy Performance

Authors: Habtamu Tkubet Ebuy

Abstract:

Purpose: Conventional dynamic energy simulation tools go beyond the static dimension of simplified methods by providing better and more accurate predictions of building performance. However, their ability to forecast actual performance is undermined by a low representation of human interactions. The purpose of this study is to examine the potential benefits of incorporating information on occupant diversity into the occupant behavior models used to simulate building performance; co-simulation of the stochastic behavior of the occupants substantially increases the accuracy of the simulation. Design/methodology/approach: In this article, probabilistic models of inhabitants' window opening and closing behavior were developed in a separate multi-agent platform, SimOcc, and implemented in the building simulation, TRNSYS, in such a way that the window behavior and its interconnectivity can be reflected in the simulation analysis of the building. Findings: The results of the study show that modelling such complex behaviors is important for predicting actual building performance, and they help identify the gap between reality and existing simulation methods. We hope this study and its results will serve as a guide for researchers interested in investigating occupant behavior in the future. Research limitations/implications: Further case studies involving multi-user behavior in complex commercial buildings are needed to better understand the impact of occupant behavior on building performance. Originality/value: This study offers a suitable tool to help stakeholders in the design phase of new or retrofitted buildings improve the performance of office buildings, in support of the national strategy.
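
As a hedged, generic illustration of a stochastic window-opening model (the coefficients are invented, not those fitted in SimOcc), the sketch below simulates a Markov-style occupant who opens or closes a window with a logistic probability driven by indoor temperature.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(7)
hours = 24
t_indoor = 22 + 4 * np.sin(np.linspace(0, 2 * np.pi, hours))  # daily cycle

window_open = False
states = []
for t in t_indoor:
    if window_open:
        p_close = sigmoid(-0.8 * (t - 24.0))   # close more readily when cool
        window_open = rng.random() >= p_close  # stay open unless close fires
    else:
        p_open = sigmoid(0.8 * (t - 26.0))     # open more readily when warm
        window_open = rng.random() < p_open
    states.append(window_open)

print("".join("O" if s else "." for s in states))  # hourly open/closed trace
```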

Keywords: occupant behavior, co-simulation, energy consumption, thermal comfort

Procedia PDF Downloads 96
26539 The Use of Social Stories and Digital Technology as Interventions for Autistic Children; A State-Of-The-Art Review and Qualitative Data Analysis

Authors: S. Hussain, C. Grieco, M. Brosnan

Abstract:

Background and Aims: Autism is a complex neurobehavioural disorder, characterised by impairments in the development of language and communication skills. The study involved a state-of-the-art systematic review, in addition to qualitative data analysis, to establish the evidence for social stories as an intervention strategy for autistic children. An up-to-date review of the use of digital technologies in the delivery of interventions to autistic children was also carried out, to assess whether digital technologies combined with social stories can improve intervention outcomes for autistic children. Methods: Two student researchers reviewed a range of randomised controlled trials and observational studies. The aim of the review was to establish whether there was adequate evidence to justify recommending social stories to autistic patients. The students devised their own search strategies to be used across a range of search engines, including Ovid-Medline, Google Scholar and PubMed, and then critically appraised the generated literature. Additionally, qualitative data obtained from a comprehensive online questionnaire on social stories were thematically analysed. The thematic analysis was carried out independently by each researcher, using a 'bottom-up' approach: contributors read the responses to each question, devised semantic themes from them, and placed each response into a semantic theme or sub-theme. The researchers then met to discuss merging their theme headings. Inter-rater reliability (IRR) was calculated before and after the theme headings were merged, giving IRR values for pre- and post-discussion. Lastly, the thematic analysis was assessed by a third researcher, a professor of psychology and the director of the Centre for Applied Autism Research at the University of Bath. Results: The review of the literature, as well as the thematic analysis of the qualitative data, found supporting evidence for social story use. The thematic analysis uncovered some interesting themes from the questionnaire responses, relating to the reasons why social stories were used and the factors influencing their effectiveness in each case. However, overall, the evidence for digital technology interventions was limited, and the literature could not prove a causal link between better intervention outcomes for autistic children and the use of technologies, although it did offer valid proposed theories for the suitability of digital technologies for autistic children. Conclusions: Overall, the review concluded that there was adequate evidence to justify advising the use of social stories with autistic children. The role of digital technologies is a fast-emerging field and appears to be a promising method of intervention for autistic children; however, it should not yet be considered an evidence-based approach. Using this research, the students developed ideas on social story interventions which aim to help autistic children.
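
Inter-rater reliability of the kind reported here is commonly quantified with Cohen's kappa; the sketch below, with invented theme codings, shows the pre- and post-discussion comparison using scikit-learn.

```python
from sklearn.metrics import cohen_kappa_score

# Invented theme codes assigned by two raters to ten questionnaire responses.
rater_a_before = ["support", "workload", "support", "stigma", "workload",
                  "support", "stigma", "workload", "support", "stigma"]
rater_b_before = ["support", "workload", "stigma", "stigma", "workload",
                  "workload", "stigma", "workload", "support", "support"]
# After merging overlapping theme headings, the raters agree more often.
rater_a_after = rater_a_before
rater_b_after = ["support", "workload", "support", "stigma", "workload",
                 "support", "stigma", "workload", "support", "stigma"]

print(f"kappa before discussion: {cohen_kappa_score(rater_a_before, rater_b_before):.2f}")
print(f"kappa after discussion : {cohen_kappa_score(rater_a_after, rater_b_after):.2f}")
```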

Keywords: autistic children, digital technologies, intervention, social stories

Procedia PDF Downloads 115
26538 A Modular and Reusable Bond Graph Model of Epithelial Transport in the Proximal Convoluted Tubule

Authors: Leyla Noroozbabaee, David Nickerson

Abstract:

We introduce a modular, consistent, reusable bond graph model of the renal nephron's proximal convoluted tubule (PCT), which can reproduce biological behaviour. In this work, we focus on ion and volume transport in the proximal convoluted tubule of the renal nephron. Modelling complex systems requires complex modelling problems to be broken down into manageable pieces. This can be enabled by developing models of subsystems that are subsequently coupled hierarchically, which bond graphs support naturally because they are based on a graph structure. In the current work, we define two modular subsystems: the resistive module representing the membrane and the capacitive module representing solution compartments. Each module is analyzed in terms of thermodynamic processes, and all the subsystems are reintegrated into circuit theory in network thermodynamics. The epithelial transport system we introduce in the current study consists of five transport membranes and four solution compartments. Coupled dissipations in the system occur in the membrane subsystems, and coupled free-energy increasing or decreasing processes appear in the solution compartment subsystems. These structural subsystems also consist of elementary thermodynamic processes: dissipations, free-energy changes, and power conversions. We provide free and open access to the Python implementation to ensure our model is accessible, enabling the reader to explore the model by setting up their own simulations and reproducibility tests.
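
The authors provide an open Python implementation; independently of it, here is a minimal, hedged sketch of the two module types described, a resistive membrane carrying a concentration-driven flux and capacitive compartments integrating that flux (all parameter values are placeholders, not taken from the published PCT model).

```python
import numpy as np

# Two capacitive solution compartments coupled by one resistive membrane.
# Placeholder parameters; not the published PCT model.
P = 1e-6                        # membrane permeability (resistive module), m/s
area = 1e-4                     # membrane area, m^2
V = np.array([1e-9, 1e-9])      # compartment volumes, m^3
C = np.array([140.0, 100.0])    # solute concentrations, mol/m^3

dt, steps = 0.01, 10_000
for _ in range(steps):
    flux = P * area * (C[0] - C[1])          # mol/s, down the gradient
    C += dt * np.array([-flux, flux]) / V    # capacitive update
print(C.round(3))                            # concentrations equilibrate
```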

Keywords: bond graph, epithelial transport, water transport, mathematical modeling

Procedia PDF Downloads 82
26537 Exploring the Role of Data Mining in Crime Classification: A Systematic Literature Review

Authors: Faisal Muhibuddin, Ani Dijah Rahajoe

Abstract:

This in-depth exploration, through a systematic literature review, scrutinizes the nuanced role of data mining in the classification of criminal activities. The research focuses on investigating various methodological aspects and recent developments in leveraging data mining techniques to enhance the effectiveness and precision of crime categorization. Commencing with an exposition of the foundational concepts of crime classification and its evolutionary dynamics, this study details the paradigm shift from conventional methods towards approaches supported by data mining, addressing the challenges and complexities inherent in the modern crime landscape. Specifically, the research delves into various data mining techniques, including K-means clustering, Naïve Bayes, K-nearest neighbour, and clustering methods. A comprehensive review of the strengths and limitations of each technique provides insights into their respective contributions to improving crime classification models. The integration of diverse data sources takes centre stage in this research. A detailed analysis explores how the amalgamation of structured data (such as criminal records) and unstructured data (such as social media) can offer a holistic understanding of crime, enriching classification models with more profound insights. Furthermore, the study explores the temporal implications in crime classification, emphasizing the significance of considering temporal factors to comprehend long-term trends and seasonality. The availability of real-time data is also elucidated as a crucial element in enhancing responsiveness and accuracy in crime classification.

Keywords: data mining, classification algorithm, naïve bayes, k-means clustering, k-nearest neighbour, crime, data analysis, systematic literature review

Procedia PDF Downloads 60
26536 A Cooperative Signaling Scheme for Global Navigation Satellite Systems

Authors: Keunhong Chae, Seokho Yoon

Abstract:

Recently, global navigation satellite systems (GNSS) such as Galileo and GPS have been employing more satellites to provide a higher degree of accuracy for the location service, thus calling for a more efficient signaling scheme among the satellites used in the overall GNSS network. Spatial diversity can be an efficient signaling scheme in that it improves the network throughput; however, it requires multiple antennas, which could cause a significant increase in the complexity of the GNSS. Thus, a diversity scheme called cooperative signaling was proposed, where virtual multiple-input multiple-output (MIMO) signaling is realized using only a single antenna in the transmit satellite of interest and by modeling the neighboring satellites as relay nodes. The main drawback of cooperative signaling is that the relay nodes receive the transmitted signal at different time instants, i.e., they operate in an asynchronous way, and thus the overall performance of the GNSS network can degrade severely. To tackle the problem, several modified cooperative signaling schemes have been proposed; however, all of them are difficult to implement due to the signal decoding required at the relay nodes. Although the implementation at the relay nodes could be made somewhat simpler by employing time-reversal and conjugation operations instead of signal decoding, it would be more efficient to implement these operations at the source node, which has more resources than the relay nodes. So, in this paper, we propose a novel cooperative signaling scheme where the data signals are combined in a unique way at the source node, obviating the need for complex operations such as signal decoding, time-reversal and conjugation at the relay nodes. The numerical results confirm that the proposed scheme provides the same cooperative diversity and bit error rate (BER) performance as the conventional scheme, while reducing the complexity at the relay nodes significantly. Acknowledgment: This work was supported by the National GNSS Research Center program of Defense Acquisition Program Administration and Agency for Defense Development.
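
The paper's specific combining rule is not given in the abstract; as generic background, the sketch below shows the classical Alamouti-style combining that underlies many virtual MIMO relay schemes, using invented channel and noise values.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two QPSK symbols sent over two slots (Alamouti space-time code);
# h1, h2 are the two virtual-antenna channel gains (invented values).
s1, s2 = (1 + 1j) / np.sqrt(2), (1 - 1j) / np.sqrt(2)
h1, h2 = rng.normal(size=2) + 1j * rng.normal(size=2)
noise = 0.01 * (rng.normal(size=2) + 1j * rng.normal(size=2))

r1 = h1 * s1 + h2 * s2 + noise[0]                      # slot 1
r2 = -h1 * np.conj(s2) + h2 * np.conj(s1) + noise[1]   # slot 2

# Linear combining recovers the symbols with full diversity gain.
gain = abs(h1) ** 2 + abs(h2) ** 2
s1_hat = (np.conj(h1) * r1 + h2 * np.conj(r2)) / gain
s2_hat = (np.conj(h2) * r1 - h1 * np.conj(r2)) / gain
print(np.round([s1_hat, s2_hat], 3))                   # close to s1, s2
```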

Keywords: global navigation satellite network, cooperative signaling, data combining, nodes

Procedia PDF Downloads 277
26535 An Accurate Brain Tumor Segmentation for High Graded Glioma Using Deep Learning

Authors: Sajeeha Ansar, Asad Ali Safi, Sheikh Ziauddin, Ahmad R. Shahid, Faraz Ahsan

Abstract:

Gliomas are the most challenging and aggressive type of tumors, appearing in different sizes and locations and with scattered boundaries. CNNs are an efficient deep learning approach with an outstanding capability for solving image analysis problems. A fully automatic deep learning based 2D-CNN model for brain tumor segmentation is presented in this paper. We used small convolution filters (3 x 3) to make the architecture deeper, and increased the number of convolutional layers for efficient learning of complex features from a large dataset; we achieved better results by pushing the number of convolutional layers up to 16 for the HGG model. We achieved reliable and accurate results through fine-tuning of the dataset and hyper-parameters. Pre-processing for this model includes generation of the brain pipeline, intensity normalization, bias correction and data augmentation. We used the BRATS-2015 dataset, and the Dice Similarity Coefficient (DSC) is used as the performance measure for the evaluation of the proposed method. Our method achieved DSC scores of 0.81 for the complete, 0.79 for the core, and 0.80 for the enhanced tumor regions; these results are comparable with methods using already implemented 2D CNN architectures.
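
A minimal PyTorch sketch of the ingredients named here, 3 x 3 convolutions stacked for depth and a Dice score for evaluation, is given below; the layer count and channel widths are illustrative, not the paper's 16-layer HGG architecture.

```python
import torch
import torch.nn as nn

# Small stack of 3x3 convolutions for 2D segmentation (illustrative depth).
model = nn.Sequential(
    nn.Conv2d(4, 32, kernel_size=3, padding=1), nn.ReLU(),     # 4 MRI modalities in
    nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 1, kernel_size=3, padding=1), nn.Sigmoid(),  # tumor mask out
)

def dice_score(pred: torch.Tensor, target: torch.Tensor, eps=1e-6) -> float:
    """Dice Similarity Coefficient on binarized masks."""
    pred = (pred > 0.5).float()
    inter = (pred * target).sum()
    return float((2 * inter + eps) / (pred.sum() + target.sum() + eps))

x = torch.randn(1, 4, 64, 64)                   # dummy multi-modal slice
target = (torch.rand(1, 1, 64, 64) > 0.7).float()
print(f"DSC on random data: {dice_score(model(x), target):.3f}")
```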

Keywords: brain tumor segmentation, convolutional neural networks, deep learning, HGG

Procedia PDF Downloads 247
26534 Assessing Supply Chain Performance through Data Mining Techniques: A Case of Automotive Industry

Authors: Emin Gundogar, Burak Erkayman, Nusret Sazak

Abstract:

Providing effective management performance throughout the whole supply chain is a critical issue and hard to put into practice. Proper evaluation of integrated data can yield accurate information. Analysing supply chain data through OLAP (On-Line Analytical Processing) technologies can provide a multi-angle view of the work and its consolidation. In this study, association rules and classification techniques are applied to measure the supply chain performance metrics of an automotive manufacturer in Turkey. The main criteria and important rules are determined, and a comparison of the results of the algorithms is presented.
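
For context, association rules are scored by support, confidence and lift; the sketch below computes these by hand for one hypothetical rule over invented delivery records.

```python
# Hypothetical delivery records; each set holds the attributes observed
# for one shipment (invented data, not the manufacturer's).
records = [
    {"late_supplier", "line_stoppage"},
    {"late_supplier", "line_stoppage", "rush_freight"},
    {"on_time", "no_stoppage"},
    {"late_supplier", "no_stoppage"},
    {"on_time", "line_stoppage"},
]

antecedent, consequent = {"late_supplier"}, {"line_stoppage"}
n = len(records)
support_a = sum(antecedent <= r for r in records) / n          # P(A)
support_c = sum(consequent <= r for r in records) / n          # P(C)
support_ac = sum((antecedent | consequent) <= r for r in records) / n  # P(A and C)

confidence = support_ac / support_a
lift = confidence / support_c
print(f"support={support_ac:.2f} confidence={confidence:.2f} lift={lift:.2f}")
```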

Keywords: supply chain performance, performance measurement, data mining, automotive

Procedia PDF Downloads 507
26533 The Batch Method Approach for Adsorption Mechanism Processes of Some Selected Heavy Metal Ions and Methylene Blue by Using Chemically Modified Luffa Cylindrica

Authors: Akanimo Emene, Mark D. Ogden, Robert Edyvean

Abstract:

Adsorption is a low-cost, efficient and economically viable wastewater treatment process. However, its utilization has not been fully realized because the adsorption system is complex and not fully understood. Optimizing the process requires choosing a sufficient adsorbent and further studying the experimental parameters that influence the adsorption design system. The chemically modified adsorbent Luffa cylindrica was used to adsorb heavy metal ions and an organic pollutant, methylene blue, from aqueous environmental solution under varying experimental conditions. The experimental factors studied were adsorption time, initial metal ion or organic pollutant concentration, ionic strength, and solution pH. The experimental data were analyzed with kinetic and isotherm models. The antagonistic effect between methylene blue and some heavy metal ions was recorded. An understanding of the use of this treated Luffa cylindrica for the removal of these toxic substances will establish and improve the commercial application of the adsorption process in the treatment of contaminated waters.
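
Isotherm analysis of batch data typically means fitting a model such as the Langmuir isotherm, q = q_max*K*C/(1 + K*C), to equilibrium points; below is a hedged sketch with invented measurements, not the study's results.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(C, q_max, K):
    """Langmuir isotherm: adsorbed amount vs. equilibrium concentration."""
    return q_max * K * C / (1 + K * C)

# Invented batch equilibrium data (mg/L vs. mg/g), for illustration only.
C_eq = np.array([5, 10, 25, 50, 100, 200], dtype=float)
q_eq = np.array([8.2, 14.1, 24.9, 33.5, 40.2, 44.0])

(q_max, K), _ = curve_fit(langmuir, C_eq, q_eq, p0=[50, 0.01])
print(f"q_max = {q_max:.1f} mg/g, K = {K:.4f} L/mg")
```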

Keywords: adsorption, heavy metal ions, Luffa cylindrica, wastewater treatment

Procedia PDF Downloads 190
26532 Multimodal Data Fusion Techniques in Audiovisual Speech Recognition

Authors: Hadeer M. Sayed, Hesham E. El Deeb, Shereen A. Taie

Abstract:

In the big data era, we are facing a diversity of datasets from different sources in different domains that describe a single life event. These datasets consist of multiple modalities, each of which has a different representation, distribution, scale, and density. Multimodal fusion is the concept of integrating information from multiple modalities into a joint representation with the goal of predicting an outcome through a classification or regression task. In this paper, multimodal fusion techniques are classified into two main classes: model-agnostic techniques and model-based approaches. The paper provides a comprehensive study of recent research in each class and outlines the benefits and limitations of each. Furthermore, the audiovisual speech recognition task is presented as a case study of multimodal data fusion approaches, and open issues arising from the limitations of current studies are discussed. This paper can be considered a powerful guide for researchers interested in the field of multimodal data fusion and audiovisual speech recognition in particular.
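
As a hedged example of a model-agnostic technique, the sketch below performs late (decision-level) fusion by averaging the class probabilities of separately trained audio and visual classifiers; the features are synthetic.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
y = rng.integers(0, 2, n)                          # word class (synthetic)
X_audio = y[:, None] + rng.normal(0, 1.0, (n, 8))  # audio features
X_visual = y[:, None] + rng.normal(0, 1.5, (n, 6)) # lip-region features

audio_clf = LogisticRegression().fit(X_audio[:150], y[:150])
visual_clf = LogisticRegression().fit(X_visual[:150], y[:150])

# Late fusion: average the per-modality class probabilities.
p_fused = 0.5 * (audio_clf.predict_proba(X_audio[150:]) +
                 visual_clf.predict_proba(X_visual[150:]))
acc = (p_fused.argmax(axis=1) == y[150:]).mean()
print(f"fused accuracy: {acc:.2%}")
```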

Keywords: multimodal data, data fusion, audio-visual speech recognition, neural networks

Procedia PDF Downloads 104
26531 Advancing Inclusive Curriculum Development for Special Needs Education in Africa

Authors: Onosedeba Mary Ayayia

Abstract:

Inclusive education has emerged as a critical global imperative, aiming to provide equitable educational opportunities for all, regardless of their abilities or disabilities. In Africa, the pursuit of inclusive education faces significant challenges, particularly concerning the development and implementation of inclusive curricula tailored to the diverse needs of students with disabilities. This study delves into the heart of this issue, seeking to address the pressing problem of exclusion and marginalization of students with disabilities in mainstream educational systems across the continent. The problem is complex, entailing issues of limited access to tailored curricula, shortages of qualified teachers in special needs education, stigmatization, limited research and data, policy gaps, inadequate resources, and limited community awareness. These challenges perpetuate a system where students with disabilities are systematically excluded from quality education, limiting their future opportunities and societal contributions. This research proposes a comprehensive examination of the current state of inclusive curriculum development and implementation in Africa. Through an innovative and explicit exploration of the problem, the study aims to identify effective strategies, guidelines, and best practices that can inform the development of inclusive curricula. These curricula will be designed to address the diverse learning needs of students with disabilities, promote teacher capacity building, combat stigmatization, generate essential data, enhance policy coherence, allocate adequate resources, and raise community awareness. The goal of this research is to contribute to the advancement of inclusive education in Africa by fostering an educational environment where every student, regardless of ability or disability, has equitable access to quality education. Through this endeavor, the study aligns with the broader global pursuit of social inclusion and educational equity, emphasizing the importance of inclusive curricula as a foundational step towards a more inclusive and just society.

Keywords: inclusive education, special education, curriculum development, Africa

Procedia PDF Downloads 59
26530 Knowledge-Driven Decision Support System Based on Knowledge Warehouse and Data Mining by Improving Apriori Algorithm with Fuzzy Logic

Authors: Pejman Hosseinioun, Hasan Shakeri, Ghasem Ghorbanirostam

Abstract:

In recent years, there has been increasing research interest in knowledge sources, decision support systems, data mining, and the process of knowledge discovery in databases, and each of these aspects is considered to affect the others. In this article, we merge an information source and a knowledge source to suggest a knowledge-based system, within the limits of management, based on storing and restoring knowledge to manage information and improve decision making and resources. We use data mining with the Apriori algorithm in the knowledge discovery process. One problem of the Apriori algorithm is that the user must specify a minimum support threshold for the regularities. Imagine that a user wants to apply the Apriori algorithm to a database with millions of transactions: the user certainly does not have knowledge of all the transactions in that database and therefore cannot specify a suitable threshold. Our purpose in this article is to improve the Apriori algorithm. To achieve this goal, we use fuzzy logic to put the data into different clusters before applying the Apriori algorithm to the data in the database, and we also try to suggest the most suitable threshold to the user automatically.
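
A hedged sketch of the cluster-then-mine idea follows, with KMeans standing in for the fuzzy clustering step and mlxtend's Apriori implementation; the transaction data and the per-cluster threshold heuristic are our assumptions, not the paper's method.

```python
import pandas as pd
from sklearn.cluster import KMeans
from mlxtend.frequent_patterns import apriori

# One-hot transactions (invented). Columns are items, rows are baskets.
df = pd.DataFrame(
    [[1, 1, 0, 0], [1, 1, 1, 0], [0, 0, 1, 1],
     [0, 1, 1, 1], [1, 1, 0, 1], [0, 0, 1, 1]],
    columns=["life", "health", "auto", "home"],
)

# Stand-in for the fuzzy clustering step: group similar baskets first.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(df)

for k in sorted(set(labels)):
    cluster = df[labels == k].astype(bool)
    # Heuristic per-cluster threshold (an assumption): half the mean
    # item frequency, so the user need not pick min_support globally.
    min_sup = max(0.1, cluster.mean().mean() / 2)
    print(f"cluster {k}: min_support={min_sup:.2f}")
    print(apriori(cluster, min_support=min_sup, use_colnames=True))
```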

Keywords: decision support system, data mining, knowledge discovery, data discovery, fuzzy logic

Procedia PDF Downloads 328
26529 Success Factors for Innovations in SME Networks

Authors: J. Gochermann

Abstract:

Due to complex markets and products and an increasing need to innovate, cooperation between small and medium-sized enterprises has arisen during the last decades that is not primarily driven by process optimization or sales enhancement. Small and medium-sized enterprises (SME) in particular collaborate increasingly in innovation and knowledge networks to enhance their knowledge and innovation potential and to find strategic partners for product and market development. These networks are characterized by dual objectives: the superordinate goal of the whole network and the specific objectives of the individual members, which can cause target conflicts. Moreover, most SMEs do not have structured innovation processes, and they are not accustomed to collaborating on complex innovation projects in an open network structure. On the other hand, SMEs have characteristics suitable for promising networking: they are flexible and spontaneous, they have flat hierarchies, and the acting people are not anonymous. These characteristics indeed distinguish them from bigger concerns. German SME networks have been investigated to identify success factors for SME innovation networks. The fundamental network principles, donation-return and confidence, could be confirmed and identified as basic success factors. Further factors are voluntariness, an adequate number of network members, quality of communication, neutrality and competence of the network management, as well as reliability and obligingness of the network services. Innovation and knowledge networks with an appreciable number of members from science and technology institutions also need active sense-making to bring different disciplines into successful collaboration. It has also been investigated whether and how involvement in an innovation network impacts the innovation structure and culture inside the member companies. The degree of reaction grows with the time and intensity of commitment.

Keywords: innovation and knowledge networks, SME, success factors, innovation structure and culture

Procedia PDF Downloads 278
26528 The Use of Geographic Information System Technologies for Geotechnical Monitoring of Pipeline Systems

Authors: A. G. Akhundov

Abstract:

Issues of obtaining unbiased data on the status of pipeline systems for oil and oil product transportation become especially important when laying and operating pipelines under severe natural and climatic conditions. Particular attention is paid here to researching exogenous processes and their impact on the linear facilities of the pipeline system. Reliable operation of pipelines under severe natural and climatic conditions, and timely planning and implementation of compensating measures, are only possible if the operating conditions of the pipeline systems are regularly monitored and changes in permafrost soil and hydrological conditions are accounted for. One of the main reasons for emergency situations is the geodynamic factor. Experience shows that emergency situations occur within areas characterized by certain environmental conditions and develop according to similar scenarios depending on the active processes. The analysis of the natural and technical systems of main pipelines at different stages of monitoring makes it possible to forecast the dynamics of change. The integration of GIS technologies, traditional means of geotechnical monitoring (in-line inspection, geodetic methods, field observations), and remote methods (aerial visual inspection, aerial photography, airborne and ground laser scanning) provides the most efficient solution to the problem. The unified environment of a geographic information system (GIS) is a comfortable way to implement a monitoring system on main pipelines, since it provides the means to describe a complex natural and technical system, and every element thereof, with any set of parameters. Such a GIS enables comfortable simulation of main pipelines (both in 2D and 3D), the analysis of situations, and the selection of recommendations to prevent negative natural or man-made processes and to mitigate their consequences. The specifics of such systems include multi-dimensional simulation of the facilities of the pipeline system, mathematical modelling of the processes to be observed, and the use of efficient numerical algorithms and software packages for forecasting and analysis. One of the most interesting uses of the monitoring results is the generation of up-to-date 3D models of a facility and the surrounding area on the basis of aerial laser scanning, aerial photography, in-line inspection, and instrument measurements. The resulting 3D model forms the basis of an information system providing the means to store and process data of geotechnical observations with references to the facilities of the main pipeline, to plan compensating measures, and to control their implementation. The use of GISs for geotechnical monitoring of pipeline systems is aimed at improving the reliability of their operation, reducing the probability of negative events (accidents and disasters), and mitigating their consequences should they still occur.

Keywords: databases, 3D GIS, geotechnical monitoring, pipelines, laser scanning

Procedia PDF Downloads 187
26527 Design and Implementation of Low-code Model-building Methods

Authors: Zhilin Wang, Zhihao Zheng, Linxin Liu

Abstract:

This study proposes a low-code model-building approach that aims to simplify the development and deployment of artificial intelligence (AI) models. With an intuitive drag-and-drop interface for connecting components, users can easily build complex models and integrate multiple algorithms for training. After training is completed, the system automatically generates a callable model service API. This method not only lowers the technical threshold of AI development and improves development efficiency but also enhances the flexibility of algorithm integration and simplifies the deployment process of models. The core strength of this method lies in its ease of use and efficiency. Users do not need a deep programming background and can complete the design and implementation of complex models with simple drag-and-drop operations. This feature greatly expands the reach of AI technology, allowing more non-technical people to participate in the development of AI models. At the same time, the method performs well in algorithm integration, supporting many different types of algorithms working together, which further improves the performance and applicability of the models. In the experimental part, we performed several performance tests on the method. The results show that, compared with traditional model construction methods, this method makes more efficient use of computing resources and greatly shortens model training time. In addition, the system-generated model service interface has been optimized for high availability and scalability and can adapt to the needs of different application scenarios.
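
The generate-a-pipeline-from-components idea can be pictured in a few lines of Python; the sketch below turns a declarative component list (our invented spec format, not the paper's) into a scikit-learn pipeline and wraps it in a callable stand-in for the generated model service API.

```python
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.datasets import make_classification

# Invented low-code spec: what a drag-and-drop canvas might serialize to.
SPEC = [{"component": "scaler"}, {"component": "classifier", "C": 0.5}]
REGISTRY = {"scaler": lambda **kw: StandardScaler(**kw),
            "classifier": lambda **kw: LogisticRegression(**kw)}

def build_pipeline(spec):
    """Instantiate each spec entry from the registry and chain them."""
    steps = [(c["component"],
              REGISTRY[c["component"]](**{k: v for k, v in c.items()
                                          if k != "component"}))
             for c in spec]
    return Pipeline(steps)

X, y = make_classification(n_samples=200, random_state=0)
model = build_pipeline(SPEC).fit(X, y)

def predict_api(payload: list[list[float]]) -> list[int]:
    """Stand-in for the auto-generated model service endpoint."""
    return model.predict(payload).tolist()

print(predict_api(X[:3].tolist()))
```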

Keywords: low-code, model building, artificial intelligence, algorithm integration, model deployment

Procedia PDF Downloads 18
26526 The Study of Dengue Fever Outbreak in Thailand Using Geospatial Techniques, Satellite Remote Sensing Data and Big Data

Authors: Tanapat Chongkamunkong

Abstract:

The objective of this paper is to present a practical use of Geographic Information Systems (GIS) in public health, based on the spatial correlation between multiple factors and dengue fever outbreaks. Meteorological, demographic and environmental factors are compiled using GIS techniques along with Global Satellite Mapping remote sensing (RS) data. We use monthly dengue fever cases, population density, precipitation, and Digital Elevation Model (DEM) data. The study covers the climate influence of the El Niño-Southern Oscillation (ENSO), indicated by sea surface temperature (SST), and a study area of 12 provinces of Thailand, with remote sensing data from January 2007 to December 2014.

Keywords: dengue fever, sea surface temperature, Geographic Information System (GIS), remote sensing

Procedia PDF Downloads 192
26525 Spontaneous and Posed Smile Detection: Deep Learning, Traditional Machine Learning, and Human Performance

Authors: Liang Wang, Beste F. Yuksel, David Guy Brizan

Abstract:

A computational model of affect that can distinguish between spontaneous and posed smiles with no errors on a large, popular data set using deep learning techniques is presented in this paper. A Long Short-Term Memory (LSTM) classifier, a type of recurrent neural network, is utilized and compared to human classification. Results showed that while human classification (mean of 0.7133) was above chance, the LSTM model was more accurate than both human classification and other comparable state-of-the-art systems. Additionally, a high accuracy rate was maintained with small amounts of training videos (70 instances). The important features were derived and analyzed to further understand the success of our computational model, and it was inferred that thousands of pairs of points within the eyes and mouth are important throughout all time segments of a smile. This suggests that distinguishing between a posed and a spontaneous smile is a complex task, one which may account for the difficulty and lower accuracy of human classification compared to machine learning models.
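
Below is a minimal PyTorch sketch of the kind of LSTM sequence classifier described; the feature dimension (pairs of facial points per frame), hidden size, and clip length are illustrative assumptions, not the paper's configuration.

```python
import torch
import torch.nn as nn

class SmileLSTM(nn.Module):
    """Binary posed-vs-spontaneous classifier over landmark sequences."""
    def __init__(self, n_features=40, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)

    def forward(self, x):                  # x: (batch, frames, features)
        _, (h_n, _) = self.lstm(x)         # last hidden state summarizes clip
        return self.head(h_n[-1])

model = SmileLSTM()
clips = torch.randn(8, 120, 40)           # 8 clips, 120 frames, 40 features
logits = model(clips)
print(logits.shape)                       # torch.Size([8, 2])
```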

Keywords: affective computing, affect detection, computer vision, deep learning, human-computer interaction, machine learning, posed smile detection, spontaneous smile detection

Procedia PDF Downloads 121
26524 Anti-Corruption, an Important Challenge for the Construction Industry!

Authors: Ahmed Stifi, Sascha Gentes, Fritz Gehbauer

Abstract:

The construction industry is perhaps one of the oldest industries in the world. Ancient monuments like the Egyptian pyramids, the temples of the Greeks and Romans such as the Parthenon and the Pantheon, robust bridges, old Roman theatres, citadels and many more are the best testament to that. The industry also has a symbiotic relationship with other industries: heavy engineering industries provide construction machinery, the chemical industry develops innovative construction materials, the finance sector provides funding solutions for complex construction projects, and many more. The construction industry is not only mammoth but also very complex in nature. Because of this complexity, the construction industry is prone to various tribulations which may have the propensity to hamper its growth. Comparative study of this industry with others shows that it is associated with a state of tardiness and delay, especially when we focus on managerial aspects and the triple constraint (time, cost and scope). While some institutes cite the complexity associated with it as a major reason, others, like lean construction, refer to the waste produced across the construction process as the prime reason. This paper introduces corruption as one of the prime factors for such delays. To support this, many international reports and studies are available depicting that the construction industry is one of the most corrupt sectors worldwide, and that corruption can take place throughout the project cycle, comprising project selection, planning, design, funding, pre-qualification, tendering, execution, operation and maintenance, and even the reconstruction phase. It also happens in many forms, such as bribery, fraud, extortion, collusion, embezzlement and conflict of interest. As a solution to cope with corruption in the construction industry, the paper introduces integrity as a key factor and builds a new integrity framework to develop and implement an integrity management system for construction companies and construction projects.

Keywords: corruption, construction industry, integrity, lean construction

Procedia PDF Downloads 374
26523 Modeling Spatio-Temporal Variation in Rainfall Using a Hierarchical Bayesian Regression Model

Authors: Sabyasachi Mukhopadhyay, Joseph Ogutu, Gundula Bartzke, Hans-Peter Piepho

Abstract:

Rainfall is a critical component of climate governing vegetation growth and production, forage availability and quality for herbivores. However, reliable rainfall measurements are not always available, making it necessary to predict rainfall values for particular locations through time. Predicting rainfall in space and time can be a complex and challenging task, especially where the rain gauge network is sparse and measurements are not recorded consistently for all rain gauges, leading to many missing values. Here, we develop a flexible Bayesian model for predicting rainfall in space and time and apply it to Narok County, situated in southwestern Kenya, using data collected at 23 rain gauges from 1965 to 2015. Narok County encompasses the Maasai Mara ecosystem, the northern-most section of the Mara-Serengeti ecosystem, famous for its diverse and abundant large mammal populations and spectacular migration of enormous herds of wildebeest, zebra and Thomson's gazelle. The model incorporates geographical and meteorological predictor variables, including elevation, distance to Lake Victoria and minimum temperature. We assess the efficiency of the model by comparing it empirically with the established Gaussian process, Kriging, simple linear and Bayesian linear models. We use the model to predict total monthly rainfall and its standard error for all 5 * 5 km grid cells in Narok County. Using the Monte Carlo integration method, we estimate seasonal and annual rainfall and their standard errors for 29 sub-regions in Narok. Finally, we use the predicted rainfall to predict large herbivore biomass in the Maasai Mara ecosystem on a 5 * 5 km grid for both the wet and dry seasons. We show that herbivore biomass increases with rainfall in both seasons. The model can handle data from a sparse network of observations with many missing values and performs at least as well as or better than four established and widely used models, on the Narok data set. The model produces rainfall predictions consistent with expectation and in good agreement with the blended station and satellite rainfall values. The predictions are precise enough for most practical purposes. The model is very general and applicable to other variables besides rainfall.
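
While the paper's hierarchical Bayesian model is richer than this, a hedged scikit-learn sketch of one comparison baseline, Gaussian-process (kriging-style) interpolation of gauge rainfall with an elevation covariate, looks as follows; the gauge locations and values are synthetic.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(5)
# Synthetic gauges: [longitude, latitude, elevation in km]; invented values.
X = np.column_stack([rng.uniform(35, 36, 23), rng.uniform(-2, -1, 23),
                     rng.uniform(1.5, 2.5, 23)])
rain = 80 + 40 * X[:, 2] + rng.normal(0, 5, 23)   # monthly rainfall, mm

gp = GaussianProcessRegressor(
    kernel=RBF(length_scale=[0.5, 0.5, 0.5]) + WhiteKernel(),
    normalize_y=True).fit(X, rain)

# Predict rainfall and its standard error for an unmeasured grid cell.
cell = np.array([[35.4, -1.6, 2.0]])
mean, std = gp.predict(cell, return_std=True)
print(f"predicted rainfall: {mean[0]:.1f} +/- {std[0]:.1f} mm")
```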

Keywords: non-stationary covariance function, gaussian process, ungulate biomass, MCMC, maasai mara ecosystem

Procedia PDF Downloads 290
26522 A Comparative Analysis of Various Companding Techniques Used to Reduce PAPR in VLC Systems

Authors: Arushi Singh, Anjana Jain, Prakash Vyavahare

Abstract:

Recently, Li-Fi (light fidelity) has been launched based on the visible light communication (VLC) technique, 100 times faster than WiFi, and the 5G mobile communication system is proposed to use VLC-OFDM as the transmission technique. The VLC system, based on visible light, is considered for efficient spectrum use and easy intensity modulation through LEDs. The reason for the high speed of VLC is the LED, as LEDs flicker incredibly fast (on the order of MHz). Another advantage of employing LEDs is that they act as low-pass filters, resulting in no out-of-band emission. VLC systems fall under the category of 'green technology' for utilizing LEDs. In the present scenario, OFDM is used for high data rates, interference immunity, and high spectral efficiency. In spite of these advantages, OFDM suffers from a large PAPR, ICI among carriers, and frequency offset errors. Since the data transmission technique used in the VLC system is OFDM, the system suffers the drawbacks of both OFDM and VLC: non-linearity due to the non-linear characteristics of the LED, and the PAPR of OFDM, due to which the high power amplifier enters its non-linear region. This paper focuses on the reduction of PAPR in VLC-OFDM systems. Many techniques have been applied to reduce PAPR: clipping introduces distortion in the carrier; the selective mapping technique suffers from wastage of bandwidth; and the partial transmit sequence is very complex due to the exponentially increased number of sub-blocks. The paper discusses three companding techniques, namely µ-law, A-law, and an advanced A-law companding technique. The analysis shows that the advanced A-law companding technique reduces the PAPR of the signal by adjusting the companding parameter within its range. VLC-OFDM systems are the future of wireless communication, but non-linearity in VLC-OFDM is a severe issue. This paper discusses techniques to reduce PAPR, one of the non-linearities of the system; the companding techniques discussed here provide better results without increasing the complexity of the system.
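
To make the metric concrete, the sketch below generates a QPSK OFDM symbol, applies classical µ-law companding, and compares PAPR before and after; this is a generic demonstration with illustrative parameters, not the paper's advanced A-law scheme.

```python
import numpy as np

rng = np.random.default_rng(11)
n_sc = 256                                   # subcarriers
bits = rng.integers(0, 2, (n_sc, 2))
qpsk = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)
x = np.fft.ifft(qpsk) * np.sqrt(n_sc)        # OFDM time-domain symbol

def papr_db(sig):
    """Peak-to-average power ratio in dB."""
    p = np.abs(sig) ** 2
    return 10 * np.log10(p.max() / p.mean())

def mu_law(sig, mu=16):
    """Classical mu-law companding: compress magnitude, keep phase."""
    v = np.abs(sig).max()                    # peak normalization level
    mag = v * np.log1p(mu * np.abs(sig) / v) / np.log1p(mu)
    return mag * np.exp(1j * np.angle(sig))

print(f"PAPR before companding: {papr_db(x):.2f} dB")
print(f"PAPR after  companding: {papr_db(mu_law(x)):.2f} dB")
```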

Keywords: non-linear companding techniques, peak to average power ratio (PAPR), visible light communication (VLC), VLC-OFDM

Procedia PDF Downloads 281