Search results for: multimodal data
22345 Measure-Valued Solutions to a Class of Nonlinear Parabolic Equations with Degenerate Coercivity and Singular Initial Data
Authors: Flavia Smarrazzo
Abstract:
Initial-boundary value problems for nonlinear parabolic equations having a Radon measure as initial data have been widely investigated, looking for solutions which for positive times take values in some function space. On the other hand, if the diffusivity degenerates too fast at infinity, it is well known that function-valued solutions may not exist, singularities may persist, and it looks very natural to consider solutions which, roughly speaking, for positive times describe an orbit in the space of finite Radon measures. In this general framework, our purpose is to introduce a concept of measure-valued solution which is consistent with respect to regularizing and smoothing approximations, in order to develop an existence theory which depends neither on the degree of degeneracy of the diffusivity at infinity nor on the choice of the initial measures. In more detail, we prove existence of suitably defined measure-valued solutions to the homogeneous Dirichlet initial-boundary value problem for a class of nonlinear parabolic equations without strong coercivity. Moreover, we also discuss some qualitative properties of the constructed solutions concerning the evolution of their singular part, including conditions (depending both on the initial data and on the strength of the degeneracy) under which the constructed solutions are in fact function-valued or not.
Keywords: degenerate parabolic equations, measure-valued solutions, Radon measures, Young measures
Procedia PDF Downloads 281
22344 Developing the P1-P7 Management and Analysis Software for Thai Child Evaluation (TCE) of Food and Nutrition Status
Authors: S. Damapong, C. Kingkeow, W. Kongnoo, P. Pattapokin, S. Pruenglamphu
Abstract:
Given the presence of the double burden of malnutrition among Thai children, we conducted a project to promote holistic age-appropriate nutrition for Thai children. Researchers developed the P1-P7 computer software for managing and analyzing diverse types of collected data. The study objectives were: i) to use the software to manage and analyze the collected data, and ii) to evaluate the children's nutritional status and their caretakers' nutrition practices in order to create regulations for improving nutrition. Data were collected by means of questionnaires, called P1-P7. P1, P2 and P5 were for children and caretakers, and the others were for institutions. The children's nutritional status was evaluated against height-for-age, weight-for-age, and weight-for-height standards calculated using Thai child z-score references. Institution evaluations consisted of various standard regulations, including the use of our software. The results showed that the software was used in 44 out of 118 communities (37.3%), 57 out of 240 child development centers and nurseries (23.8%), and 105 out of 152 schools (69.1%). No major problems have been reported with the software, although user efficiency can be increased further through additional training. As a result, the P1-P7 software was used to manage and analyze nutritional status, nutrition behavior, and environmental conditions, in order to conduct Thai Child Evaluation (TCE). The software was most widely used in schools. Some aspects of P1-P7's questionnaires could be modified to increase ease of use and efficiency.
Keywords: P1-P7 software, Thai child evaluation, nutritional status, malnutrition
Procedia PDF Downloads 356
22343 Determination of the Factors Affecting Adjustment Levels of First Class Students at Elementary School
Authors: Sibel Yoleri
Abstract:
This research aims to determine the school adjustment of students attending the first grade of elementary school in terms of several variables. The study group consists of 286 students (131 female, 155 male) who attended the first grade of elementary school in the 2013-2014 academic year in the city center of Uşak. In the research, the 'Personal Information Form' and the 'Walker-McConnell Scale of Social Competence and School Adjustment' were used as data collection tools. In the analysis of the data, the independent samples t-test was applied to determine whether the students' school adjustment scores differ according to the sex variable. For the data identified as not normally distributed, the Mann-Whitney U test was applied for two-group comparisons and the Kruskal-Wallis H test for multiple comparisons. All statistical tests were evaluated as two-tailed, and the level of significance was accepted as .05. According to the results, no meaningful difference was identified in the students' level of adjustment to school in terms of the sex variable. The research also found that the adjustment level of students who started school at the age of seven is higher than that of those who started at the age of five, and that the adjustment level of students who received preschool education before elementary school is higher than that of those who did not.
Keywords: starting school, preschool education, school adjustment, Walker-McConnell Scale
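A minimal sketch of the two nonparametric tests named above, using SciPy on simulated adjustment scores (group sizes, means, and spreads are illustrative, not the study's data):

    import numpy as np
    from scipy.stats import mannwhitneyu, kruskal

    rng = np.random.default_rng(0)
    age5 = rng.normal(45, 8, 40)   # hypothetical scores, school entry at age five
    age7 = rng.normal(55, 8, 60)   # hypothetical scores, school entry at age seven
    g1, g2, g3 = (rng.normal(m, 8, 30) for m in (50, 52, 58))  # a three-level factor

    u, p_u = mannwhitneyu(age7, age5)   # two-group comparison
    h, p_h = kruskal(g1, g2, g3)        # multiple-group comparison
    print(f"Mann-Whitney U p = {p_u:.4f}; Kruskal-Wallis H p = {p_h:.4f}")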
Procedia PDF Downloads 488
22342 Comparative Analysis of Medical Tourism Industry among Key Nations in Southeast Asia
Authors: Nur A. Azmi, Suseela D. Chandran, Fadilah Puteh, Azizan Zainuddin
Abstract:
Medical tourism has been recognized as a global phenomenon in developed and developing countries in the 21st century. Medical tourism is defined as an activity in which individuals travel from one country to another to seek or receive medical healthcare. Based on the global trend, the number of medical tourists is increasing annually, especially in the Southeast Asia (SEA) region. Since the establishment of the Association of Southeast Asian Nations (ASEAN) in 1967, the SEA nations have worked towards regional integration in medical tourism. Medical tourism in SEA has become the third-largest sector contributing towards economic development. Previous research has demonstrated several factors that affect the development of medical tourism. However, despite the literature published on SEA's medical tourism over the last ten years, there remains a scarcity of research on the niche areas of each SEA country. Hence, this paper is significant in enriching the literature in the field of medical tourism, particularly in showcasing the niche markets of medical tourism among the SEA region's best performers, namely Singapore, Thailand, Malaysia and Indonesia. This paper also contributes by offering a comparative analysis of whether the said nations complement or compete with each other in the medical tourism sector. This, in turn, will increase the availability of information on medical tourism in the SEA region. The data were collected through in-depth interviews with various stakeholders and private hospitals. The data were then analyzed using two approaches, namely thematic analysis (interview data) and document analysis (secondary data). The paper concludes by arguing that the ASEAN countries have specific niche markets with which to promote their medical tourism industries, and that these key nations complement each other in the industry. In addition, the medical tourism sector in the SEA region offers greater prospects for market development and expansion, witnessed in the emergence of new key players from other nations.
Keywords: healthcare services, medical tourism, medical tourists, SEA region, comparative analysis
Procedia PDF Downloads 144
22341 Care: A Cluster Based Approach for Reliable and Efficient Routing Protocol in Wireless Sensor Networks
Authors: K. Prasanth, S. Hafeezullah Khan, B. Haribalakrishnan, D. Arun, S. Jayapriya, S. Dhivya, N. Vijayarangan
Abstract:
The main goal of our approach is to find the optimum positions for the sensor nodes, reinforcing the communications at points where a lack of connectivity is found. Routing is the major problem in a sensor network's data transfer between nodes. We provide an efficient routing technique so that data signals reach the base station quickly and without interruption. Clustering and routing are the two key factors to be considered in a WSN. To carry out the communication from the nodes to their cluster head, we propose a parameterizable protocol so that the developer can indicate whether the routing should be sensitive to either the link quality of the nodes or their battery levels.
Keywords: clusters, routing, wireless sensor networks, three phases, sensor networks
Procedia PDF Downloads 505
22340 Adaptive Process Monitoring for Time-Varying Situations Using Statistical Learning Algorithms
Authors: Seulki Lee, Seoung Bum Kim
Abstract:
Statistical process control (SPC) is a practical and effective method for quality control. The most important and widely used technique in SPC is the control chart. The main goal of a control chart is to detect any assignable changes that affect the quality output. Most conventional control charts, such as Hotelling's T2 chart, are based on the assumption that the quality characteristics follow a multivariate normal distribution. However, in modern complicated manufacturing systems, appropriate control chart techniques that can efficiently handle nonnormal processes are required. To overcome the shortcomings of conventional control charts for nonnormal processes, several methods have been proposed that combine statistical learning algorithms and multivariate control charts. Statistical learning-based control charts, such as support vector data description (SVDD)-based charts and k-nearest neighbors-based charts, have proven their improved performance in nonnormal situations compared to that of the T2 chart. Besides nonnormality, time-varying operations are also quite common in real manufacturing fields because of various factors such as product and set-point changes, seasonal variations, catalyst degradation, and sensor drifting. However, traditional control charts cannot accommodate future condition changes of the process because they are formulated based on the data recorded in the early stage of the process. In the present paper, we propose an SVDD-based control chart, which is capable of adaptively monitoring time-varying and nonnormal processes. We reformulated the SVDD algorithm into a time-adaptive SVDD algorithm by adding a weighting factor that reflects time-varying situations. Moreover, we defined the updating region for the efficient model-updating structure of the control chart. The proposed control chart simultaneously allows efficient model updates and timely detection of out-of-control signals. The effectiveness and applicability of the proposed chart were demonstrated through experiments with simulated data and real data from the metal frame process in mobile device manufacturing.
Keywords: multivariate control chart, nonparametric method, support vector data description, time-varying process
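A minimal sketch of the time-weighting idea, assuming exponentially decaying sample weights; scikit-learn's OneClassSVM learns an SVDD-style boundary and accepts per-sample weights (the forgetting factor and data are illustrative, not the paper's formulation):

    import numpy as np
    from sklearn.svm import OneClassSVM

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 4))   # in-control training window, 4 quality characteristics

    # Exponentially decaying weights: recent observations dominate the boundary
    lam = 0.02                               # hypothetical forgetting factor
    ages = np.arange(len(X))[::-1]           # 0 = most recent sample
    w = np.exp(-lam * ages)

    clf = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale")
    clf.fit(X, sample_weight=w)

    x_new = rng.normal(size=(1, 4)) + 3.0    # shifted observation
    score = clf.decision_function(x_new)     # negative => outside the boundary
    print("out-of-control signal:", score[0] < 0)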
Procedia PDF Downloads 299
22339 Ethnic and National Determinants in the Process of Building Peace in Afghanistan After the Withdrawal of Western Forces in 2021
Authors: Małgorzata Cichy
Abstract:
Afghanistan is a source of conflicts that affect security on a global scale. The role of ethnic and national determinants in the peacebuilding process in this country remains an extremely important factor in this respect. Research methods include literature and data analysis (scientific literature, documents of governmental and non-governmental organizations, statistical data and media reports), institutional and legal analysis, as well as the decision-making method. The main objective of the research is a comprehensive answer to the question of how ethnic and national factors affect the process of building peace in Afghanistan after 2021 and what impact this has on international security.
Keywords: Afghanistan, Pashtuns, peace, Taliban
Procedia PDF Downloads 95
22338 In-situ Acoustic Emission Analysis of a Polymer Electrolyte Membrane Water Electrolyser
Authors: M. Maier, I. Dedigama, J. Majasan, Y. Wu, Q. Meyer, L. Castanheira, G. Hinds, P. R. Shearing, D. J. L. Brett
Abstract:
Increasing the efficiency of electrolyser technology is commonly seen as one of the main challenges on the way to the Hydrogen Economy. There is a significant lack of understanding of the different states of operation of polymer electrolyte membrane water electrolysers (PEMWE) and how these influence the overall efficiency. This applies in particular to the two-phase flow through the membrane, gas diffusion layers (GDL) and flow channels. In order to increase the efficiency of PEMWE and facilitate their spread as a commercial hydrogen production technology, new analytical approaches have to be found. Acoustic emission (AE) offers the possibility to analyse the processes within a PEMWE in a non-destructive, fast and cheap in-situ way. This work describes the generation and analysis of AE data coming from a PEM water electrolyser for, to the best of our knowledge, the first time in the literature. Different experiments are carried out. Each experiment is designed so that only specific physical processes occur and AE related solely to one process can be measured. Therefore, a range of experimental conditions is used to induce different flow regimes within the flow channels and GDL. The resulting AE data are first separated into different events, which are defined by exceeding the noise threshold. Each acoustic event consists of a number of consecutive peaks and ends when the wave diminishes below the noise threshold. For all these acoustic events the following key attributes are extracted: maximum peak amplitude, duration, number of peaks, peaks before the maximum, average intensity of a peak and time until the maximum is reached. Each event is then expressed as a vector containing the normalized values for all criteria. Principal Component Analysis is performed on the resulting data, which orders the criteria by the eigenvalues of their covariance matrix. This can be used as an easy way of determining which criteria convey the most information about the acoustic data. In the following, the data are ordered in the two- or three-dimensional space formed by the most relevant criteria axes. By finding regions of this space occupied only by acoustic events originating from one of the three experiments, it is possible to relate physical processes to certain acoustic patterns. Due to the complex nature of the AE data, modern machine learning techniques are needed to recognize these patterns in-situ. Using the AE data produced before allows one to train a self-learning algorithm and develop an analytical tool to diagnose different operational states in a PEMWE. Combining this technique with the measurement of polarization curves and electrochemical impedance spectroscopy allows for in-situ optimization and recognition of suboptimal states of operation.
Keywords: acoustic emission, gas diffusion layers, in-situ diagnosis, PEM water electrolyser
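A minimal sketch of the normalize-then-PCA step described above, with a synthetic event-feature matrix standing in for the measured AE attributes:

    import numpy as np
    from sklearn.preprocessing import MinMaxScaler
    from sklearn.decomposition import PCA

    # Hypothetical feature matrix: one row per acoustic event, six extracted attributes
    # (max amplitude, duration, n_peaks, peaks_before_max, mean peak intensity, time_to_max)
    rng = np.random.default_rng(1)
    events = rng.random((500, 6))

    X = MinMaxScaler().fit_transform(events)   # normalize each criterion to [0, 1]
    pca = PCA(n_components=3).fit(X)

    print("explained variance ratio:", pca.explained_variance_ratio_)
    coords = pca.transform(X)   # 3-D coordinates for visual separation of flow regimes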
Procedia PDF Downloads 156
22337 An Exploratory Study on the Integration of Neurodiverse University Students into Mainstream Learning and Their Performance: The Case of the Jones Learning Center
Authors: George Kassar, Phillip A. Cartwright
Abstract:
Based on data collected from The Jones Learning Center (JLC), University of the Ozarks, Arkansas, U.S., this study explores the impact of inclusive classroom practices on neurodiverse college students and their consequent academic performance after participating in integrative therapies designed to support students who are intellectually capable of obtaining a college degree but who require support for learning challenges owing to disabilities, AD/HD, or ASD. The purpose of this study is two-fold. The first objective is to explore the general process, special techniques, and practices of the JLC inclusive program. The second objective is to identify and analyze the effectiveness of these processes, techniques, and practices in supporting the academic performance of enrolled college students with learning disabilities following integration into mainstream university learning. Integrity, transparency, and confidentiality are vital in the research. All questions were shared in advance with and confirmed by the concerned management at the JLC. While administering the questionnaire and conducting the interviews, the purpose of the study and its scope, aims, and objectives were clearly explained to all participants before starting. The confidentiality of all participants was assured by using encrypted identification of individuals, limiting access to the data to only the researcher, and storing the data in a secure location. Respondents were also informed that their participation in this research was voluntary and that they could withdraw at any time prior to submission if they wished. Ethical consent was obtained from the participants before proceeding with video recording of the interviews. This research uses a mixed methods approach. The research design involves collecting, analyzing, and "mixing" quantitative and qualitative methods and data to enable the research inquiry. The research process is organized around a five-pillar approach. The first three pillars focus on testing the first hypothesis (H1), directed toward determining the extent to which the academic performance of JLC students improved after involvement with the comprehensive JLC special program. The other two pillars relate to the second hypothesis (H2), directed toward determining the extent to which the collective and applied knowledge at JLC is distinctive from typical practices in the field. The data collected for this research were obtained from three sources: 1) a set of secondary data in the form of Grade Point Average (GPA) records received from the registrar, 2) a set of primary data collected through a structured questionnaire administered to students and alumni at JLC, and 3) another set of primary data collected through interviews conducted with staff and educators at JLC. The significance of this study is two-fold. First, it validates the effectiveness of the special program at JLC for college-level students who learn differently. Second, it identifies the distinctiveness of the mix of techniques, methods, and practices, including the special individualized and personalized one-on-one approach, at JLC.
Keywords: education, neurodiverse students, program effectiveness, Jones Learning Center
Procedia PDF Downloads 74
22336 Statistical Models and Time Series Forecasting on Crime Data in Nepal
Authors: Dila Ram Bhandari
Abstract:
Throughout the 20th century, new governments were created in which identities such as ethnicity, religion, language, caste, community, and tribe played a part in the development of constitutions and the legal systems of victim and criminal justice. Acute issues with extremism, poverty, environmental degradation, cybercrime, human rights violations, and crimes against, and victimization of, both individuals and groups have recently plagued South Asian nations. A massive number of crimes take place every day, and these frequent crimes have made the lives of ordinary citizens restless. Crime is one of the major threats to society and to civilization, and a point of contention that can create societal disturbance. Old-style crime-solving practices cannot live up to the requirements of the current crime situation. Crime analysis is one of the most important activities of the majority of intelligence and law enforcement organizations all over the world. The South Asia region lacks a regional coordination mechanism, unlike the Central Asia and Asia-Pacific regions, to facilitate criminal intelligence sharing and operational coordination related to organized crime, including illicit drug trafficking and money laundering. There have been numerous conversations in recent years about using data mining technology to combat crime and terrorism. The Data Detective program from Sentient, a software company, uses data mining techniques to support the police (Sentient, 2017). The goals of this internship were to test several predictive model solutions and choose the most effective and promising one. First, extensive literature reviews on data mining, crime analysis, and crime data mining were conducted. Sentient offered a 7-year archive of crime statistics that was aggregated daily to produce a univariate dataset. Moreover, a daily incidence-type aggregation was performed to produce a multivariate dataset. Each solution's forecast period lasted seven days. The experiments were split into two main groups: statistical models and neural network models. For the crime data, neural networks fared better than statistical models. This study gives a general review of the applied statistical and neural network models. A comparative analysis of all the models on a comparable dataset provides a detailed picture of each model's performance on the available data and its generalizability. The studies demonstrated that, in comparison to other models, Gated Recurrent Units (GRU) produced better predictions. The crime records for 2005-2019 were collected from the Nepal Police headquarters and analysed in R. In conclusion, a gated recurrent unit implementation could benefit police in predicting crime. Hence, time series analysis using GRU could be a prospective additional feature in Data Detective.
Keywords: time series analysis, forecasting, ARIMA, machine learning
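A minimal sketch of a GRU forecaster with a seven-day horizon on a synthetic daily crime-count series (window length, layer size, and data are illustrative; the study itself worked in R):

    import numpy as np
    import tensorflow as tf

    # Hypothetical univariate series: daily aggregated crime counts
    rng = np.random.default_rng(2)
    series = rng.poisson(lam=20, size=2000).astype("float32")

    WINDOW, HORIZON = 28, 7   # 4-week input window, 7-day forecast
    n = len(series) - WINDOW - HORIZON
    X = np.stack([series[i:i + WINDOW] for i in range(n)])[..., None]
    y = np.stack([series[i + WINDOW:i + WINDOW + HORIZON] for i in range(n)])

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(WINDOW, 1)),
        tf.keras.layers.GRU(32),
        tf.keras.layers.Dense(HORIZON),   # one output per forecast day
    ])
    model.compile(optimizer="adam", loss="mse")
    model.fit(X, y, epochs=5, batch_size=64, verbose=0)

    forecast = model.predict(X[-1:])      # next 7 days
    print(forecast.round(1))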
Procedia PDF Downloads 164
22335 Relative Entropy Used to Determine the Divergence of Cells in Single Cell RNA Sequence Data Analysis
Authors: An Chengrui, Yin Zi, Wu Bingbing, Ma Yuanzhu, Jin Kaixiu, Chen Xiao, Ouyang Hongwei
Abstract:
Single cell RNA sequencing (scRNA-seq) is one of the effective tools for studying the transcriptomics of biological processes. Currently, the similarity between cells is usually measured with Euclidean distance or its derivatives. However, the process of scRNA-seq follows a multivariate Bernoulli event model, so we hypothesize that it would be more efficient to value the divergence between cells with relative entropy than with Euclidean distance. In this study, we compared the performance of Euclidean distance, Spearman correlation distance and relative entropy using scRNA-seq data from the early, medial and late stages of limb development generated in our lab. Relative entropy is better than the other methods according to the cluster potential test. Furthermore, we developed KL-SNE, an algorithm that modifies t-SNE by changing its definition of the divergence between cells from Euclidean distance to Kullback-Leibler divergence. The results showed that KL-SNE was more effective at dissecting cell heterogeneity than t-SNE, indicating the better performance of relative entropy over Euclidean distance. Specifically, the chondrocytes expressing Comp were clustered together with KL-SNE but not with t-SNE. Surprisingly, cells in the early stage were surrounded by cells in the medial stage in the KL-SNE embedding, while medial cells neighbored the late stage in the t-SNE embedding. These results parallel the heatmap, which showed that cells in the medial stage were more heterogeneous than cells in the other stages. In addition, we also found that the results of KL-SNE tend to follow a Gaussian distribution compared with those of t-SNE, which could also be verified with the analysis of scRNA-seq data from another study on human embryo development. Therefore, it is also an effective way to convert a non-Gaussian distribution to a Gaussian distribution and facilitate subsequent statistical processing. Thus, relative entropy is potentially a better way to determine the divergence of cells in scRNA-seq data analysis.
Keywords: single cell RNA sequence, similarity measurement, relative entropy, KL-SNE, t-SNE
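KL-SNE itself replaces the divergence inside t-SNE; a lighter-weight approximation of the same idea, sketched below, feeds a symmetrized Kullback-Leibler matrix into scikit-learn's t-SNE as a precomputed distance (counts are synthetic):

    import numpy as np
    from scipy.stats import entropy
    from sklearn.manifold import TSNE

    # Hypothetical counts matrix: cells x genes
    rng = np.random.default_rng(3)
    counts = rng.poisson(1.0, size=(100, 300)) + 1e-9   # avoid zero probabilities
    P = counts / counts.sum(axis=1, keepdims=True)      # per-cell expression distribution

    n = len(P)
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            # symmetrized KL so the matrix behaves like a distance
            d = entropy(P[i], P[j]) + entropy(P[j], P[i])
            D[i, j] = D[j, i] = d

    emb = TSNE(metric="precomputed", init="random", perplexity=30).fit_transform(D)
    print(emb.shape)   # (100, 2)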
Procedia PDF Downloads 340
22334 Secret Security Smart Lock Using Artificial Intelligence Hybrid Algorithm
Authors: Vahid Bayrami Rad
Abstract:
Ever since humans developed a collective way of life and, with it, urbanization, security has been considered one of the most important challenges of life. To protect property, locks have always been a practical tool. With the advancement of technology, locks have changed from mechanical to electric. One of the most widespread applications of artificial intelligence is in surveillance and security systems. Currently, the technologies used in smart anti-theft door handles are among the most promising fields for applying artificial intelligence. Artificial intelligence can learn, calculate, interpret and process by analyzing data with the help of algorithms and mathematical models, and make smart decisions. We use an Arduino board to process the data.
Keywords: Arduino board, artificial intelligence, image processing, solenoid lock
Procedia PDF Downloads 69
22333 A Machine Learning Approach for Efficient Resource Management in Construction Projects
Authors: Soheila Sadeghi
Abstract:
Construction projects are complex and often subject to significant cost overruns due to the multifaceted nature of the activities involved. Accurate cost estimation is crucial for effective budget planning and resource allocation. Traditional methods for predicting overruns often rely on expert judgment or analysis of historical data, which can be time-consuming and subjective and may fail to consider important factors. However, with the increasing availability of data from construction projects, machine learning techniques can be leveraged to improve the accuracy of overrun predictions. This study applied machine learning algorithms to enhance the prediction of cost overruns in a case study of a construction project. The methodology involved the development and evaluation of two machine learning models: Random Forest and Neural Networks. Random Forest can handle high-dimensional data, capture complex relationships, and provide feature importance estimates. Neural Networks, particularly Deep Neural Networks (DNNs), are capable of automatically learning and modeling complex, non-linear relationships between input features and the target variable. These models can adapt to new data, reduce human bias, and uncover hidden patterns in the dataset. The findings of this study demonstrate that both Random Forest and Neural Networks can significantly improve the accuracy of cost overrun predictions compared to traditional methods. The Random Forest model also identified key cost drivers and risk factors, such as changes in the scope of work and delays in material delivery, which can inform better project risk management. However, the study acknowledges several limitations. First, the findings are based on a single construction project, which may limit the generalizability of the results to other projects or contexts. Second, the dataset, although comprehensive, may not capture all relevant factors influencing cost overruns, such as external economic conditions or political factors. Third, the study focuses primarily on cost overruns, while schedule overruns are not explicitly addressed. Future research should explore the application of machine learning techniques to a broader range of projects, incorporate additional data sources, and investigate the prediction of both cost and schedule overruns simultaneously.
Keywords: resource allocation, machine learning, optimization, data-driven decision-making, project management
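A minimal sketch of the Random Forest side of the methodology, on fully synthetic project records (the feature names and target are illustrative assumptions, not the study's dataset):

    import numpy as np
    import pandas as pd
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(4)
    df = pd.DataFrame({
        "scope_changes": rng.integers(0, 10, 400),
        "material_delay_days": rng.integers(0, 60, 400),
        "planned_budget_musd": rng.uniform(1, 50, 400),
        "crew_size": rng.integers(5, 120, 400),
    })
    overrun_pct = (2.0 * df["scope_changes"] + 0.3 * df["material_delay_days"]
                   + rng.normal(0, 3, 400))            # synthetic target

    X_tr, X_te, y_tr, y_te = train_test_split(df, overrun_pct, random_state=0)
    rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)

    print("R^2 on held-out projects:", round(rf.score(X_te, y_te), 3))
    print(dict(zip(df.columns, rf.feature_importances_.round(3))))  # cost-driver ranking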
Procedia PDF Downloads 39
22332 Hydrogen: Contention-Aware Hybrid Memory Management for Heterogeneous CPU-GPU Architectures
Authors: Yiwei Li, Mingyu Gao
Abstract:
Integrating hybrid memories with heterogeneous processors could leverage heterogeneity in both compute and memory domains for better system efficiency. To ensure performance isolation, we introduce Hydrogen, a hardware architecture to optimize the allocation of hybrid memory resources to heterogeneous CPU-GPU systems. Hydrogen supports efficient capacity and bandwidth partitioning between CPUs and GPUs in both memory tiers. We propose decoupled memory channel mapping and token-based data migration throttling to enable flexible partitioning. We also support epoch-based online search for optimized configurations and lightweight reconfiguration with reduced data movements. Hydrogen significantly outperforms existing designs by 1.21x on average and up to 1.31x.
Keywords: hybrid memory, heterogeneous systems, DRAM cache, graphics processing units
Procedia PDF Downloads 96
22331 Computer-Based versus Paper-Based Tests: A Comparative Study of Two Types of Indonesian National Examination for Senior High School Students
Authors: Faizal Mansyur
Abstract:
The objective of this research is to find out whether there is a significant difference in the English language scores of senior high school students in the Indonesian National Examination between students tested using computer-based and paper-based tests. The population of this research is senior high school students in South Sulawesi Province who sat the Indonesian National Examination in the 2015/2016 academic year. The sample consists of 800 students' scores from 8 schools, taken by employing the multistage random sampling technique. The data are secondary data, obtained from the education office of South Sulawesi. In analyzing the collected data, the researcher employed the independent samples t-test with the help of the SPSS v.24 program. The findings reveal that there is a significant difference in the English language scores of senior high school students in the Indonesian National Examination between students tested using computer-based and paper-based tests (p < .05). Moreover, students tested using the PBT (Mean = 63.13, SD = 13.63) achieved higher scores than those tested using the CBT (Mean = 46.33, SD = 14.68).
Keywords: computer-based test, paper-based test, Indonesian national examination, testing
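A minimal sketch of the same test in Python rather than SPSS, with scores simulated to match the reported group means and standard deviations (the group sizes are assumed):

    import numpy as np
    from scipy.stats import ttest_ind

    rng = np.random.default_rng(5)
    pbt = rng.normal(63.13, 13.63, 400)   # paper-based group
    cbt = rng.normal(46.33, 14.68, 400)   # computer-based group

    t, p = ttest_ind(pbt, cbt)            # independent samples t-test
    print(f"t = {t:.2f}, p = {p:.4f}, significant at .05: {p < 0.05}")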
Procedia PDF Downloads 167
22330 Close-Range Remote Sensing Techniques for Analyzing Rock Discontinuity Properties
Authors: Sina Fatolahzadeh, Sergio A. Sepúlveda
Abstract:
This paper presents advanced developments in close-range, terrestrial remote sensing techniques to enhance the characterization of rock masses. The study integrates two state-of-the-art laser-scanning technologies, the HandySCAN and GeoSLAM laser scanners, to extract high-resolution geospatial data for rock mass analysis. These instruments offer high accuracy, precision, low acquisition time, and high efficiency in capturing intricate geological features in small- to medium-size outcrops and slope cuts. Using the HandySCAN and GeoSLAM laser scanners facilitates real-time, three-dimensional mapping of rock surfaces, enabling comprehensive assessments of rock mass characteristics. The collected data provide valuable insights into structural complexities, surface roughness, and discontinuity patterns, which are essential for geological and geotechnical analyses. The synergy of these advanced remote sensing technologies contributes to a more precise and straightforward understanding of rock mass behavior. In this case, the main parameters of RQD, joint spacing, persistence, aperture, roughness, infill, weathering, water condition, and joint orientation in a slope cut along the Sea-to-Sky Highway, BC, were remotely analyzed to calculate and evaluate the Rock Mass Rating (RMR) and Geological Strength Index (GSI) classification systems. Automatic and manual analyses of the acquired data were then compared with field measurements. The results show the usefulness of the proposed remote sensing methods and their good agreement with the actual field data.
Keywords: remote sensing, rock mechanics, rock engineering, slope stability, discontinuity properties
Procedia PDF Downloads 66
22329 Application of Data Mining for Aquifer Environmental Assessment
Authors: Saman Javadi, Mehdi Hashemy, Mohahammad Mahmoodi
Abstract:
Vulnerability maps are employed as an important tool for managing the entry of pollution into aquifers. The common way to produce a vulnerability map is DRASTIC. However, the method is not easy to apply to an arbitrary aquifer, because appropriate constant values must be chosen for its weights and ranks. In this study, a new approach using k-means clustering is applied to produce vulnerability maps. Four features (depth to groundwater, hydraulic conductivity, recharge value, and the vadose zone) were considered simultaneously as clustering features. Five regions were recognized in the case study, representing zones with different levels of vulnerability. The results show that clustering provides a realistic vulnerability map: the Pearson correlation coefficient between nitrate concentrations and the clustering-based vulnerability is 61%.
Keywords: clustering, data mining, groundwater, vulnerability assessment
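A minimal sketch of the clustering step, assuming one row per grid cell and the four features named above (the value ranges and the nitrate validation series are synthetic):

    import numpy as np
    from scipy.stats import pearsonr
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(6)
    features = np.column_stack([
        rng.uniform(2, 50, 300),     # depth to groundwater (m)
        rng.uniform(0.1, 30, 300),   # hydraulic conductivity (m/day)
        rng.uniform(50, 400, 300),   # recharge (mm/yr)
        rng.uniform(1, 9, 300),      # vadose zone rating
    ])

    X = StandardScaler().fit_transform(features)
    labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)

    # Validate clusters against observed nitrate concentrations (synthetic here)
    nitrate = rng.uniform(5, 80, 300)
    r, _ = pearsonr(labels, nitrate)
    print("cluster sizes:", np.bincount(labels), "Pearson r:", round(r, 2))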
Procedia PDF Downloads 603
22328 Further Investigation of α+12C and α+16O Elastic Scattering
Authors: Sh. Hamada
Abstract:
The current work aims to study the rainbow-like structure observed in the elastic scattering of alpha particles on both 12C and 16O nuclei. We reanalyzed the experimental elastic scattering angular distribution data for the α+12C and α+16O nuclear systems at different energies using both the optical model and double folding potentials derived from different interaction models: CDM3Y1, DDM3Y1, CDM3Y6 and BDM3Y1. The potential created by the BDM3Y1 interaction model has the shallowest depth, which reflects the necessity of using a higher renormalization factor (Nr). Both the optical model and the double folding potentials of the different interaction models reproduce the experimental data fairly well.
Keywords: density distribution, double folding, elastic scattering, nuclear rainbow, optical model
Procedia PDF Downloads 237
22327 Energy Consumption and Economic Growth: Testimony of Selected Sub-Saharan Africa Countries
Authors: Alfred Quarcoo
Abstract:
The main purpose of this paper is to examine the causal relationship between energy consumption and economic growth in Sub-Saharan Africa using panel data techniques. Annual data on energy consumption and economic growth (proxied by real gross domestic product per capita) spanning 1990 to 2016, taken from the World Bank development indicators database, were used. The results of the Augmented Dickey-Fuller unit root test show that the series for all countries are not stationary in levels. The log of economic growth in Benin and Congo becomes stationary after first differencing, the log of energy consumption becomes stationary for all countries, and the log of economic growth in Kenya and Zimbabwe becomes stationary after second differencing of the panel series. The findings of the Johansen cointegration test demonstrate that the log of energy consumption and the log of economic growth are not cointegrated in the cases of Kenya and Zimbabwe, so no long-run relationship between the variables was established in any country. The Granger causality test indicates a unidirectional causality running from energy use to economic growth in Kenya, and no causal linkage between energy consumption and economic growth in Benin, Congo and Zimbabwe.
Keywords: cointegration, Granger causality, Sub-Saharan Africa, World Bank Development Indicators
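A minimal sketch of the three-step testing sequence (ADF, Johansen, Granger) via statsmodels, on a synthetic two-variable series for a single country; the lag orders and data are illustrative:

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.stattools import adfuller, grangercausalitytests
    from statsmodels.tsa.vector_ar.vecm import coint_johansen

    rng = np.random.default_rng(7)
    years = pd.date_range("1990", periods=27, freq="YS")
    df = pd.DataFrame({
        "ln_energy": np.cumsum(rng.normal(0.02, 0.05, 27)),
        "ln_gdp_pc": np.cumsum(rng.normal(0.03, 0.04, 27)),
    }, index=years)

    # 1) ADF unit root test on levels
    for col in df:
        stat, p, *_ = adfuller(df[col])
        print(col, "ADF p-value:", round(p, 3))

    # 2) Johansen cointegration test (constant term, 1 lagged difference)
    jres = coint_johansen(df, det_order=0, k_ar_diff=1)
    print("trace stats:", jres.lr1, "95% critical:", jres.cvt[:, 1])

    # 3) Granger causality on differences: does energy use help predict growth?
    grangercausalitytests(df[["ln_gdp_pc", "ln_energy"]].diff().dropna(), maxlag=2)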
Procedia PDF Downloads 52
22326 Time Travel Testing: A Mechanism for Improving Renewal Experience
Authors: Aritra Majumdar
Abstract:
While organizations strive to expand their new customer base, retaining existing relationships is a key aspect of improving overall profitability and of showcasing how successful an organization is in holding on to its customers. It is an experimentally proven fact that the lion's share of profit always comes from existing customers. Hence, seamless management of renewal journeys across different channels goes a long way toward improving trust in the brand. From a quality assurance standpoint, time travel testing provides an approach for both business and technology teams to enhance the customer experience when customers look to extend their partnership with the organization for a defined period of time. This whitepaper focuses on the key pillars of time travel testing: time travel planning, time travel data preparation, and enterprise automation. Along with that, it calls out some of the best practices and common accelerator implementation ideas which are generic across verticals like healthcare, insurance, etc. In this abstract, a high-level snapshot of these pillars is provided. Time Travel Planning: The first step in setting up a time travel testing roadmap is appropriate planning. Planning includes identifying the impacted systems that need to be time traveled backward or forward depending on the business requirement, aligning time travel with other releases, deciding the frequency of time travel testing, preparing to handle renewal issues in production after time travel testing is done and, most importantly, planning for test automation during time travel testing. Time Travel Data Preparation: One of the most complex areas in time travel testing is test data coverage. Aligning test data to cover required customer segments and narrowing it down to multiple offer sequences based on defined parameters are key to successful time travel testing. Another aspect is the availability of sufficient data for similar combinations to support activities like defect retesting, regression testing, post-production testing (if required), etc. This section discusses the necessary steps for suitable data coverage and sufficient data availability from a time travel testing perspective. Enterprise Automation: Time travel testing is never restricted to a single application. The workflow needs to be validated in the downstream applications to ensure consistency across the board. Along with that, the correctness of offers across different digital channels needs to be checked in order to ensure a smooth customer experience. This section discusses the focus areas of enterprise automation and how automation testing can be leveraged to improve overall quality without compromising the project schedule. Along with the above-mentioned items, the whitepaper elaborates on the best practices that need to be followed during time travel testing and some ideas pertaining to accelerator implementation. To sum up, this paper is based on the real-world experience the author has had with time travel testing. While actual customer names and program-related details will not be disclosed, the paper highlights the key learnings, which will help other teams implement time travel testing successfully.
Keywords: time travel planning, time travel data preparation, enterprise automation, best practices, accelerator implementation ideas
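One common way to "time travel" application logic in a test environment is to freeze or shift the system clock; the sketch below uses Python's freezegun library against a hypothetical renewal-window rule (the 30-day rule and dates are invented for illustration):

    from datetime import date
    from freezegun import freeze_time

    def renewal_offer(today: date, expiry: date) -> str:
        """Hypothetical renewal rule: the offer opens 30 days before expiry."""
        return "offer" if (expiry - today).days <= 30 else "too-early"

    # Travel the clock forward to exercise the renewal window
    with freeze_time("2030-06-10"):
        assert renewal_offer(date.today(), date(2030, 7, 1)) == "offer"

    # Travel the clock backward to confirm the offer is still closed
    with freeze_time("2030-01-01"):
        assert renewal_offer(date.today(), date(2030, 7, 1)) == "too-early"

    print("renewal window behaves correctly under both frozen dates")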
Procedia PDF Downloads 159
22325 Electronic Data Interchange (EDI) in the Supply Chain: Impact on Customer Satisfaction
Authors: Hicham Amine, Abdelouahab Mesnaoui
Abstract:
Electronic data interchange (EDI) is the computer-to-computer exchange of structured business information. This information typically takes the form of standardized electronic business documents, such as invoices, purchase orders, bills of lading, and so on. The purpose of this study is to identify the impact EDI might have on the supply chain, and particularly on customer satisfaction, keeping in mind the constraints the organization might face. This study included 139 subject matter experts (SMEs) who participated by responding to a survey that was distributed. 85% were strongly in favor of the implementation, while 10% were neutral and 5% were against it. In the quality assurance department, 75% of the clients agreed to move ahead with the change, whereas 10% stayed neutral and 15% were against it. In the legal department, 80% of the answers favored the implementation, 10% of the participants stayed neutral, and the remaining 10% were against it. The survey participants were 40% male and 60% female (sex ratio F/M = 1.5). The survey also distinguished three categories of technical background: 80% of participants were from a technical background, 15% were from a nontechnical background, and 5% had an average technical background. This study examines the impact of EDI on customer satisfaction, which is the primary hypothesis, and justifies the importance of the implementation, which enhances customer satisfaction.
Keywords: electronic data interchange, supply chain, subject matter experts, customer satisfaction
Procedia PDF Downloads 340
22324 Accelerating Side Channel Analysis with Distributed and Parallelized Processing
Authors: Kyunghee Oh, Dooho Choi
Abstract:
Although a cryptographic algorithm may have no theoretical weakness, side channel analysis can extract secret data from the physical implementation of a cryptosystem. The analysis is based on extra information such as timing information, power consumption, electromagnetic leaks or even sound, which can be exploited to break the system. Differential power analysis (DPA) is one of the most popular analyses, computing the statistical correlations between secret-key hypotheses and power consumption. It usually requires processing huge amounts of data and takes a long time; it may take several weeks for some devices with countermeasures. We suggest and evaluate methods to shorten the time needed to analyze cryptosystems. Our methods include distributed computing and parallelized processing.
Keywords: DPA, distributed computing, parallelized processing, side channel analysis
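A minimal sketch of parallelizing a DPA-style correlation attack over the 256 key-byte guesses with Python's multiprocessing; the traces are random and the Hamming-weight leakage model is a standard illustrative assumption, not the paper's setup:

    import numpy as np
    from multiprocessing import Pool

    rng = np.random.default_rng(8)
    N_TRACES, N_SAMPLES = 2000, 500
    traces = rng.normal(size=(N_TRACES, N_SAMPLES)).astype(np.float32)
    plaintexts = rng.integers(0, 256, N_TRACES, dtype=np.uint8)

    HW = np.array([bin(v).count("1") for v in range(256)], dtype=np.float32)

    def corr_for_guess(guess: int) -> float:
        # Predicted leakage under this key-byte guess (toy model: HW of pt XOR guess)
        model = HW[plaintexts ^ guess]
        m = model - model.mean()
        t = traces - traces.mean(axis=0)
        # Pearson correlation against every time sample; keep the strongest peak
        r = (m @ t) / (np.linalg.norm(m) * np.linalg.norm(t, axis=0))
        return float(np.abs(r).max())

    if __name__ == "__main__":
        with Pool() as pool:                  # distribute the 256 guesses over cores
            peaks = pool.map(corr_for_guess, range(256))
        print("best key-byte guess:", int(np.argmax(peaks)))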
Procedia PDF Downloads 427
22323 Monitoring of Hydrological Parameters in the Alexandra Jukskei Catchment in South Africa
Authors: Vhuhwavho Gadisi, Rebecca Alowo, German Nkhonjera
Abstract:
It has been noted that technical tools for managing groundwater resources are not readily accessible. The lack of such systems hinders the groundwater management processes necessary for decision-making through monitoring and evaluation of the Jukskei River of the Crocodile River (West) Basin in Johannesburg, South Africa. Several challenges have been identified in the Jukskei Catchment concerning groundwater management, including the following: gaps in data records; a need for training and equipping of monitoring staff; a need for formal accreditation of monitoring capacities and equipment; and no access to regulation terms (e.g., meters). Taking into consideration necessities and human requirements according to typical densities in various regions of South Africa, there is a need to construct several groundwater level monitoring stations in a particular segment; the available raw data on groundwater level should be converted into consumable products, for example, short reports on sensitive areas (e.g., dolomite compartments, wetlands, aquifers, and sole sources); and, considering the increasing civil unrest, there has been vandalism and theft of groundwater monitoring infrastructure. GIS was employed at the catchment level to map the relationship between the identified groundwater parameters in the catchment area and the identified borehole. GIS-based maps were designed for groundwater monitoring, to be pretested on one borehole in the Jukskei catchment. These data will be used to establish changes in the borehole compared to changes in the catchment area according to the identified parameters.
Keywords: GIS, monitoring, Jukskei, catchment
Procedia PDF Downloads 94
22322 Transportation Mode Classification Using GPS Coordinates and Recurrent Neural Networks
Authors: Taylor Kolody, Farkhund Iqbal, Rabia Batool, Benjamin Fung, Mohammed Hussaeni, Saiqa Aleem
Abstract:
The rising threat of climate change has led to an increase in public awareness of and care about our collective and individual environmental impact. A key component of this impact is our use of cars and other polluting forms of transportation, but it is often difficult for an individual to know how severe this impact is. While there are applications that offer this feedback, they require manual entry of the transportation mode used for a given trip, which can be burdensome. In order to alleviate this shortcoming, data from the 2016 TRIPlab datasets have been used to train a variety of machine learning models to automatically recognize the mode of transportation. An accuracy of 89.6% is achieved using a single deep neural network model with a Gated Recurrent Unit (GRU) architecture applied directly to trip data points over four primary classes, namely walking, public transit, car, and bike. These results are comparable in accuracy to results achieved by others using ensemble methods and require far less computation when classifying new trips. The lack of trip context data, e.g., bus routes, bike paths, etc., and the need for only a single set of weights make this an appropriate methodology for applications hoping to reach a broad demographic and have responsive feedback.
Keywords: classification, gated recurrent unit, recurrent neural network, transportation
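A minimal sketch of a single-GRU classifier over fixed-length trip sequences with the four classes named above; the input features, sequence length, and data are illustrative stand-ins for the TRIPlab trips:

    import numpy as np
    import tensorflow as tf

    # Hypothetical trips: sequences of (lat, lon, speed) points, 4 mode labels
    rng = np.random.default_rng(9)
    N, T, F = 1000, 60, 3
    X = rng.normal(size=(N, T, F)).astype("float32")
    y = rng.integers(0, 4, N)   # 0 walk, 1 transit, 2 car, 3 bike

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(T, F)),
        tf.keras.layers.GRU(64),
        tf.keras.layers.Dense(4, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(X, y, validation_split=0.2, epochs=3, verbose=0)

    mode = model.predict(X[:1]).argmax(axis=1)
    print("predicted mode class:", mode[0])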
Procedia PDF Downloads 137
22321 Data Mining to Capture User-Experience: A Case Study in Notebook Product Appearance Design
Authors: Rhoann Kerh, Chen-Fu Chien, Kuo-Yi Lin
Abstract:
In an era of a rapidly growing notebook market, consumer electronics manufacturers face a highly dynamic and competitive environment. In particular, a product's appearance is the first element by which a user distinguishes it from the products of other brands. A notebook product should differ in its appearance to engage users and contribute to the user experience (UX). UX evaluation compares various product concepts to find the design that meets user needs and, in addition, helps the designer further understand the appearance preferences of different market segments. However, few studies have explored the relationship between consumer background and reactions to product appearance. This study aims to propose a data mining framework to capture users' information and the important relations between product appearance factors. The proposed framework consists of problem definition and structuring, data preparation, rules generation, and results evaluation and interpretation. An empirical study was conducted in Taiwan that recruited 168 subjects from different backgrounds to experience the appearance of 11 different portable computers. The results assist designers in developing product strategies based on the characteristics of consumers and the product concepts related to the UX, which helps launch products to the right customers and increase market share. The results demonstrate the practical feasibility of the proposed framework.
Keywords: consumer decision making, product design, rough set theory, user experience
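The keywords point to rough set theory for the rules-generation step; below is a from-scratch sketch of the basic lower/upper approximation idea on a toy subject table (the attributes and responses are invented for illustration):

    from collections import defaultdict

    # (age group, gender) are condition attributes; the last field is the decision:
    # did the subject like this notebook appearance concept?
    subjects = [
        ("young", "F", True), ("young", "F", True), ("young", "M", False),
        ("young", "M", False), ("senior", "F", True), ("senior", "F", False),
    ]

    classes = defaultdict(set)
    for idx, (age, gender, _) in enumerate(subjects):
        classes[(age, gender)].add(idx)        # indiscernibility classes

    liked = {i for i, s in enumerate(subjects) if s[2]}

    # Lower approximation: classes entirely inside the decision set (certain rules);
    # upper approximation: classes that touch it (possible rules).
    lower = set().union(*([c for c in classes.values() if c <= liked] or [set()]))
    upper = set().union(*([c for c in classes.values() if c & liked] or [set()]))

    print("certainly like:", sorted(lower))    # -> [0, 1]  (young women)
    print("possibly like:", sorted(upper))     # -> [0, 1, 4, 5]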
Procedia PDF Downloads 313
22320 Audit of TPS photon beam dataset for small field output factors using OSLDs against RPC standard dataset
Authors: Asad Yousuf
Abstract:
Purpose: The aim of the present study was to audit a treatment planning system (TPS) beam dataset for small field output factors against the standard dataset produced by the Radiological Physics Center (RPC) from a multicenter study. Such data are crucial for the validity of special techniques, e.g., IMRT or stereotactic radiosurgery. Materials/Method: In this study, multiple small-field output factor datasets were measured and calculated for 6 to 18 MV x-ray beams using the RPC-recommended methods. These beam datasets were measured at 10 cm depth for 10 × 10 cm2 to 2 × 2 cm2 field sizes, defined by collimator jaws at 100 cm. The measurements were made with Landauer's nanoDot OSLDs, whose volume is small enough to gather a full ionization reading even for the 1 × 1 cm2 field size. At our institute, the beam data, including output factors, have been commissioned at 5 cm depth with an SAD setup. For comparison with the RPC data, the output factors were converted to an SSD setup using tissue phantom ratios. The SSD setup also ensures coverage of the ion chamber in the 2 × 2 cm2 field size. The measured output factors were also compared with those calculated by the Eclipse™ treatment planning software. Result: The measured and calculated output factors agree with the RPC dataset within 1% and 4%, respectively. The larger discrepancies in the TPS reflect the increased challenge of converting measured data into a commissioned beam model for very small fields. Conclusion: OSLDs are a simple, durable, and accurate tool to verify doses delivered using small photon beam fields down to 1 × 1 cm2 field sizes. The study emphasizes that the treatment planning system should always be evaluated for small field output factors to ensure accurate dose delivery in the clinical setting.
Keywords: small field dosimetry, optically stimulated luminescence, audit treatment, radiological physics center
Procedia PDF Downloads 327
22319 Nonlinear Multivariable Analysis of CO2 Emissions in China
Authors: Hsiao-Tien Pao, Yi-Ying Li, Hsin-Chia Fu
Abstract:
This paper addresses the impacts of energy consumption, economic growth, financial development, and population size on environmental degradation using grey relational analysis (GRA) for China, where foreign direct investment (FDI) inflows serve as the proxy variable for financial development. The more recent historical data for the period 2004-2011 are used, because the use of very old data may not be suitable for rapidly developing countries. The results of the GRA indicate that the linkage effects of energy consumption-emissions and GDP-emissions are ranked first and second, respectively. These reveal that energy consumption and economic growth are strongly correlated with emissions. Higher economic growth requires more energy consumption and increases environmental pollution. Likewise, more efficient energy use requires a higher level of economic development. Therefore, policies to improve energy efficiency and create a low-carbon economy can reduce emissions without hurting economic growth. The FDI-emissions linkage is ranked third. This indicates that China does not apply weak environmental regulations to attract inward FDI. Nevertheless, China's government should strengthen environmental policy while attracting inward FDI. The population-emissions linkage effect is ranked fourth, implying that population size does not directly affect CO2 emissions, even though China has the world's largest population and Chinese people use energy-related products very economically. Overall, energy conservation, efficiency improvement, demand management, and financial development, which aim to curtail energy waste and reduce both energy consumption and emissions without loss of the country's competitiveness, can be adopted by developing economies. GRA is one of the best ways to build a dynamic analysis model from a small amount of data.
Keywords: China, CO₂ emissions, foreign direct investment, grey relational analysis
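A from-scratch sketch of Deng's grey relational grade, the core of GRA, on synthetic 2004-2011 series (using per-series min/max is a common simplification; all numbers are illustrative):

    import numpy as np

    rng = np.random.default_rng(10)
    emissions = np.linspace(5.0, 9.0, 8) + rng.normal(0, 0.1, 8)   # reference series
    factors = {
        "energy": np.linspace(1.8, 3.2, 8), "gdp": np.linspace(2.0, 6.0, 8),
        "fdi": np.linspace(0.6, 1.2, 8), "population": np.linspace(1.29, 1.35, 8),
    }

    def grey_relational_grade(ref, cmp_, rho=0.5):
        ref, cmp_ = ref / ref[0], cmp_ / cmp_[0]   # initial-value normalization
        delta = np.abs(ref - cmp_)                 # absolute difference series
        coef = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
        return coef.mean()                         # grade = mean relational coefficient

    grades = {name: grey_relational_grade(emissions, s) for name, s in factors.items()}
    print(sorted(grades.items(), key=lambda kv: -kv[1]))   # linkage ranking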
Procedia PDF Downloads 403
22318 A Data-Driven Compartmental Model for Dengue Forecasting and Covariate Inference
Authors: Yichao Liu, Peter Fransson, Julian Heidecke, Jonas Wallin, Joacim Rockloev
Abstract:
Dengue, a mosquito-borne viral disease, poses a significant public health challenge in endemic tropical and subtropical countries, including Sri Lanka. To reveal insights into the complexity of the dynamics of this disease and study its drivers, a comprehensive model is essential, one capable of both robust forecasting and insightful inference of drivers while capturing the co-circulation of several virus strains. However, existing studies mostly focus on only one aspect at a time and do not integrate or carry insights across these siloed approaches. While mechanistic models are developed to capture immunity dynamics, they are often oversimplified and lack integration of all the diverse drivers of disease transmission. On the other hand, purely data-driven methods lack constraints imposed by immuno-epidemiological processes, making them prone to overfitting and inference bias. This research presents a hybrid model that combines machine learning techniques with mechanistic modelling to overcome the limitations of existing approaches. Leveraging eight years of newly reported dengue case data, along with socioeconomic factors such as human mobility, weekly climate data from 2011 to 2018, genetic data detecting the introduction and presence of new strains, and estimates of seropositivity for different districts in Sri Lanka, we derive a data-driven vector (SEI) to human (SEIR) model across 16 regions in Sri Lanka at the weekly time scale. By conducting ablation studies, the lag effects of time-varying climate factors, allowing delays of up to 12 weeks, were determined. The model demonstrates superior predictive performance over a pure machine learning approach when considering lead times of 5 and 10 weeks on data withheld from model fitting. It further reveals several interesting, interpretable findings about the drivers while adjusting for the dynamics and influence of immunity and the introduction of a new strain. The study uncovers strong influences of socioeconomic variables: population density, mobility, household income, and the rural vs. urban split of the population. The study reveals substantial sensitivity to the diurnal temperature range and precipitation, while mean temperature and humidity appear less important in the study location. Additionally, the model indicates sensitivity to the vegetation index, both maximum and average. Predictions on testing data reveal high model accuracy. Overall, this study advances the knowledge of dengue transmission in Sri Lanka and demonstrates the importance of hybrid modelling techniques that use biologically informed model structures with flexible data-driven estimates of model parameters. The findings show the potential for both inference of drivers in situations of complex disease dynamics and robust forecasting.
Keywords: compartmental model, climate, dengue, machine learning, socioeconomic
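A minimal sketch of the mechanistic core, a vector (SEI) to human (SEIR) ODE system integrated with SciPy; all rates and population sizes are illustrative weekly values, not the fitted parameters:

    import numpy as np
    from scipy.integrate import solve_ivp

    def sei_seir(t, y, beta_hv=0.3, beta_vh=0.3, sigma_h=1.0, gamma=0.7,
                 sigma_v=1.2, mu_v=0.25):
        Sh, Eh, Ih, Rh, Sv, Ev, Iv = y
        Nh, Nv = Sh + Eh + Ih + Rh, Sv + Ev + Iv
        dSh = -beta_vh * Sh * Iv / Nh                      # human infection by vectors
        dEh = beta_vh * Sh * Iv / Nh - sigma_h * Eh
        dIh = sigma_h * Eh - gamma * Ih
        dRh = gamma * Ih
        dSv = mu_v * Nv - beta_hv * Sv * Ih / Nh - mu_v * Sv   # vector birth/death
        dEv = beta_hv * Sv * Ih / Nh - (sigma_v + mu_v) * Ev
        dIv = sigma_v * Ev - mu_v * Iv
        return [dSh, dEh, dIh, dRh, dSv, dEv, dIv]

    y0 = [1e6 - 10, 0, 10, 0, 2e6, 0, 0]           # one district, seeded with 10 cases
    sol = solve_ivp(sei_seir, (0, 104), y0, t_eval=np.arange(105))  # two years, weekly
    print("peak weekly infectious humans:", int(sol.y[2].max()))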
Procedia PDF Downloads 84
22317 Estimation of Longitudinal Dispersion Coefficient Using Tracer Data
Authors: K. Ebrahimi, Sh. Shahid, M. Mohammadi Ghaleni, M. H. Omid
Abstract:
The longitudinal dispersion coefficient is a crucial parameter for 1-D water quality analysis of riverine flows. So far, different types of empirical equations for estimating the coefficient have been developed, based on various case studies. The main objective of this paper is to develop an empirical equation for estimating the coefficient for a riverine flow. For this purpose, a set of tracer experiments was conducted with a salt tracer at three sections located downstream along a lengthy canal. Tracer data were measured at three mixing lengths along the canal: 45, 75 and 100 m. According to the results, the coefficients obtained from the newly developed empirical equation gave an encouraging level of agreement with the theoretical values.
Keywords: coefficients, dispersion, river, tracer, water quality
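One standard way to extract the coefficient from tracer breakthrough curves measured at two sections is the method of moments, D_L = U^2 * (var_t2 - var_t1) / (2 * (tbar2 - tbar1)); a sketch on synthetic Gaussian curves (the section spacing and timings are assumed):

    import numpy as np
    from scipy.integrate import trapezoid

    def moments(t, c):
        m0 = trapezoid(c, t)                        # zeroth moment (tracer mass proxy)
        mean = trapezoid(t * c, t) / m0             # centroid arrival time
        var = trapezoid((t - mean) ** 2 * c, t) / m0
        return mean, var

    t = np.linspace(0, 600, 601)                          # time (s)
    c1 = np.exp(-((t - 150) ** 2) / (2 * 30.0 ** 2))      # upstream breakthrough curve
    c2 = np.exp(-((t - 300) ** 2) / (2 * 55.0 ** 2))      # downstream breakthrough curve

    t1, s1 = moments(t, c1)
    t2, s2 = moments(t, c2)
    U = 50.0 / (t2 - t1)                   # mean velocity from assumed 50 m spacing
    DL = U ** 2 * (s2 - s1) / (2 * (t2 - t1))
    print(f"U = {U:.3f} m/s, D_L = {DL:.3f} m^2/s")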
Procedia PDF Downloads 389
22316 Game-Based Learning in a Higher Education Course: A Case Study with Minecraft Education Edition
Authors: Salvador Antelmo Casanova Valencia
Abstract:
This study documents the use of the Minecraft Education Edition application to explore immersive game-based learning environments. We analyze the contributions of fourth-year university students pursuing a degree in Administrative Computing at the Universidad Michoacana de San Nicolas de Hidalgo. In this study, descriptive data and statistical inference are detailed, using a quasi-experimental design with the Wilcoxon test. The instruments provide data validation. Game-based learning in immersive environments necessarily implies greater student participation and commitment, resulting in greater study motivation and significant improvements, and promoting cooperation and autonomous learning.
Keywords: game-based learning, gamification, higher education, Minecraft
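A minimal sketch of the paired Wilcoxon signed-rank test that a pre/post quasi-experimental design typically calls for, on simulated scores (the cohort size and values are illustrative):

    import numpy as np
    from scipy.stats import wilcoxon

    rng = np.random.default_rng(11)
    pre = rng.integers(40, 80, 30).astype(float)   # scores before the Minecraft unit
    post = pre + rng.integers(-5, 15, 30)          # most students improve afterwards

    stat, p = wilcoxon(pre, post)                  # paired signed-rank test
    print(f"W = {stat:.1f}, p = {p:.4f}, improvement significant: {p < 0.05}")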
Procedia PDF Downloads 163