Search results for: data security
22077 The Importance of Generating Electricity through Wind Farms in the Brazilian Electricity Matrix, from 2013 to 2020
Authors: Alex Sidarta Guglielmoni
Abstract:
Since the 1970s, sustainable development has become increasingly present on the international agenda. The general objective of the present work is to analyze, discuss and answer the following question: what is the importance of electricity generation from wind farms in the Brazilian electricity matrix between 2013 and 2020? To answer this question, we analyzed the generation of renewable energy from wind farms and the consumption of electricity in Brazil from January 2013 until December 2020. The specific objectives of this research are to analyze the public data, to identify the total wind generation, to identify the total wind generation capacity, and to identify the percentage share of wind energy generation and generation capacity in the Brazilian electricity matrix. Developing this research required a bibliographic search, collection of secondary data, tabulation of generation and capacity data, and a comparative analysis between wind power and the Brazilian electricity matrix. As a result, it was possible to observe how important Brazil is for global sustainable development and how much the country can still grow in this respect, in view of its capacity and potential for generating wind power, since this percentage share has grown in the past few years. Keywords: wind power, Brazilian market, electricity matrix, generation capacity
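As a minimal illustration of the kind of tabulation the abstract describes, the sketch below computes the percentage share of wind generation in a hypothetical electricity matrix; the figures are round placeholders, not the study's data.

```python
# Sketch: percentage share of wind power in an electricity matrix.
# All numbers are illustrative placeholders, not data from the study.

def share_of_matrix(wind_value, matrix_total):
    """Return wind participation as a percentage of the whole matrix."""
    return 100.0 * wind_value / matrix_total

# Hypothetical annual generation figures (GWh)
wind_generation_gwh = {2013: 6_600, 2020: 57_000}
total_generation_gwh = {2013: 570_000, 2020: 620_000}

for year in sorted(wind_generation_gwh):
    pct = share_of_matrix(wind_generation_gwh[year], total_generation_gwh[year])
    print(f"{year}: wind supplied {pct:.1f}% of total generation")
```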
Procedia PDF Downloads 126
22076 Revisiting the Swadesh Wordlist: How Long Should It Be
Authors: Feda Negesse
Abstract:
One of the most important indicators of research quality is a good data-collection instrument that can yield reliable and valid data. The Swadesh wordlist has been used for more than half a century for collecting data in comparative and historical linguistics, though arbitrariness is observed in its application and size. This research compares the classification results of the 100-item Swadesh wordlist with those of its subsets to determine whether reducing the size of the wordlist impacts its effectiveness. In the comparison, the 100, 50 and 40 wordlists were used to compute lexical distances of 29 Cushitic and Semitic languages spoken in Ethiopia and neighbouring countries. Gabmap, a web-based application, was employed to compute the lexical distances and to divide the languages into related clusters. The study shows that the subsets are not as effective as the 100 wordlist in clustering languages into smaller subgroups, but they are equally effective in dividing languages into bigger groups such as subfamilies. It is noted that the subsets may lead to an erroneous classification whereby unrelated languages by chance form a cluster which is not attested by a comparative study. The chance of getting a wrong result is higher when the subsets are used to classify languages which are not closely related. Though a further study is still needed to settle the issues around the size of the Swadesh wordlist, this study indicates that the 50 and 40 wordlists cannot be recommended as reliable substitutes for the 100 wordlist under all circumstances. The choice seems to be determined by the objective of a researcher and the degree of affiliation among the languages to be classified. Keywords: classification, Cushitic, Swadesh, wordlist
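Gabmap derives aggregate lexical distances from string differences between word forms; the sketch below is a simplified stand-in, not Gabmap's exact algorithm or the study's wordlists: it averages a normalized Levenshtein distance over a tiny invented wordlist and then clusters the languages hierarchically.

```python
# Sketch: aggregate lexical distance between languages from a wordlist,
# followed by hierarchical clustering. Toy data; not the study's wordlists.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def levenshtein(a: str, b: str) -> int:
    """Classic edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

# Hypothetical word forms for a 3-item list in 4 languages
wordlist = {
    "lang_A": ["nama", "bira", "lika"],
    "lang_B": ["nami", "bera", "liko"],
    "lang_C": ["sunta", "gori", "mepa"],
    "lang_D": ["sunto", "guri", "mepo"],
}
langs = list(wordlist)
n = len(langs)
dist = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        pairs = zip(wordlist[langs[i]], wordlist[langs[j]])
        d = np.mean([levenshtein(x, y) / max(len(x), len(y)) for x, y in pairs])
        dist[i, j] = dist[j, i] = d

# Cluster languages from the condensed distance matrix
condensed = dist[np.triu_indices(n, k=1)]
groups = fcluster(linkage(condensed, method="average"), t=2, criterion="maxclust")
print(dict(zip(langs, groups)))
```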
Procedia PDF Downloads 298
22075 Factors Impacting Geostatistical Modeling Accuracy and Modeling Strategy of Fluvial Facies Models
Authors: Benbiao Song, Yan Gao, Zhuo Liu
Abstract:
Geostatistical modeling is the key technique for reservoir characterization, and the quality of geological models greatly influences the prediction of reservoir performance, but few studies have been done to quantify the factors impacting geostatistical reservoir modeling accuracy. In this study, 16 fluvial prototype models were established to represent different levels of geological complexity, and 6 cases ranging from 16 to 361 wells were defined to reproduce all 16 prototype models by different methodologies, including SIS, object-based and MPFS algorithms, together with different constraint parameters. A modeling accuracy ratio was defined to quantify the influence of each factor, and ten realizations were averaged to represent each accuracy ratio under the same modeling condition and parameter association. In total, 5760 simulations were run to quantify the relative contribution of each factor to the simulation accuracy, and the results can be used as a strategy guide for facies modeling under similar conditions. It is found that data density, geological trend and geological complexity have a great impact on modeling accuracy. Modeling accuracy may reach up to 90% when channel sand width reaches 1.5 times the well spacing under any condition for the SIS and MPFS methods. When well density is low, the contribution of the geological trend may increase the modeling accuracy from 40% to 70%, while the use of a proper variogram may have a very limited contribution for the SIS method. This implies that when well data are dense enough to cover simple geobodies, little effort is needed to construct an acceptable model, whereas when geobodies are complex and data are insufficient, it is better to construct a set of robust geological trends than to rely on a reliable variogram function. For the object-based method, the modeling accuracy does not increase as obviously with data density as for the SIS method, but it keeps a rational appearance when data density is low. MPFS methods show a trend similar to the SIS method, but using a proper geological trend together with a rational variogram may give better modeling accuracy than the MPFS method. This implies that the geological modeling strategy for a real reservoir case needs to be optimized by evaluating the dataset, geological complexity, geological constraint information and the modeling objective. Keywords: fluvial facies, geostatistics, geological trend, modeling strategy, modeling accuracy, variogram
Procedia PDF Downloads 264
22074 Predicting Costs in Construction Projects with Machine Learning: A Detailed Study Based on Activity-Level Data
Authors: Soheila Sadeghi
Abstract:
Construction projects are complex and often subject to significant cost overruns due to the multifaceted nature of the activities involved. Accurate cost estimation is crucial for effective budget planning and resource allocation. Traditional methods for predicting overruns often rely on expert judgment or analysis of historical data, which can be time-consuming, subjective, and may fail to consider important factors. However, with the increasing availability of data from construction projects, machine learning techniques can be leveraged to improve the accuracy of overrun predictions. This study applied machine learning algorithms to enhance the prediction of cost overruns in a case study of a construction project. The methodology involved the development and evaluation of two machine learning models: Random Forest and Neural Networks. Random Forest can handle high-dimensional data, capture complex relationships, and provide feature importance estimates. Neural Networks, particularly Deep Neural Networks (DNNs), are capable of automatically learning and modeling complex, non-linear relationships between input features and the target variable. These models can adapt to new data, reduce human bias, and uncover hidden patterns in the dataset. The findings of this study demonstrate that both Random Forest and Neural Networks can significantly improve the accuracy of cost overrun predictions compared to traditional methods. The Random Forest model also identified key cost drivers and risk factors, such as changes in the scope of work and delays in material delivery, which can inform better project risk management. However, the study acknowledges several limitations. First, the findings are based on a single construction project, which may limit the generalizability of the results to other projects or contexts. Second, the dataset, although comprehensive, may not capture all relevant factors influencing cost overruns, such as external economic conditions or political factors. Third, the study focuses primarily on cost overruns, while schedule overruns are not explicitly addressed. Future research should explore the application of machine learning techniques to a broader range of projects, incorporate additional data sources, and investigate the prediction of both cost and schedule overruns simultaneously.Keywords: cost prediction, machine learning, project management, random forest, neural networks
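The two model families named in the abstract can be prototyped in a few lines; the sketch below is a hedged illustration with invented feature names and a synthetic dataset, not the study's project data or exact architectures.

```python
# Sketch: cost-overrun prediction with Random Forest and a small neural
# network. Feature names and data are hypothetical, not from the study.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.integers(0, 5, n),        # scope_change_count (assumed feature)
    rng.exponential(3.0, n),      # material_delay_days (assumed feature)
    rng.uniform(10, 500, n),      # planned_activity_cost (assumed feature)
])
y = 0.05 * X[:, 2] + 2.0 * X[:, 0] + 1.5 * X[:, 1] + rng.normal(0, 2, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
nn = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000,
                  random_state=0).fit(X_tr, y_tr)

print("RF MAE:", mean_absolute_error(y_te, rf.predict(X_te)))
print("NN MAE:", mean_absolute_error(y_te, nn.predict(X_te)))
print("RF feature importances:", rf.feature_importances_)
```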
Procedia PDF Downloads 56
22073 3D Classification Optimization of Low-Density Airborne Light Detection and Ranging Point Cloud by Parameters Selection
Authors: Baha Eddine Aissou, Aichouche Belhadj Aissa
Abstract:
Light detection and ranging (LiDAR) is an active remote sensing technology used for several applications. Airborne LiDAR is becoming an important technology for the acquisition of a highly accurate dense point cloud. Classification of an airborne laser scanning (ALS) point cloud is a very important task that still remains a real challenge for many scientists. The support vector machine (SVM) is one of the most used statistical learning algorithms based on kernels. SVM is a non-parametric method, and it is recommended in cases where the data distribution cannot be well modeled by a standard parametric probability density function. Using a kernel, it performs a robust non-linear classification of samples. In practice, the data are rarely linearly separable. SVMs are able to map the data into a higher-dimensional space in which they become linearly separable, while the kernel allows performing all the computations in the original space. This is one of the main reasons that SVMs are well suited for high-dimensional classification problems. Only a few training samples, called support vectors, are required. SVM has also shown its potential to cope with uncertainty in data caused by noise and fluctuation, and it is computationally efficient as compared to several other methods. Such properties are particularly suited to remote sensing classification problems and explain their recent adoption. In this poster, the SVM classification of ALS LiDAR data is proposed. Firstly, connected component analysis is applied for clustering the point cloud. Secondly, the resulting clusters are incorporated in the SVM classifier. A radial basis function (RBF) kernel is used due to the small number of parameters (C and γ) that need to be chosen, which decreases the computation time. In order to optimize the classification rates, parameter selection is explored. It consists of finding the parameters (C and γ) leading to the best overall accuracy using grid search and 5-fold cross-validation. The exploited LiDAR point cloud is provided by the German Society for Photogrammetry, Remote Sensing, and Geoinformation. The ALS data used are characterized by a low density (4-6 points/m²) and cover an urban area located in residential parts of the city of Vaihingen in southern Germany. The ground class and three other classes belonging to roof superstructures are considered, i.e., a total of 4 classes. The training and test sets are selected randomly several times. The obtained results demonstrate that parameter selection can narrow the search to a restricted interval of (C, γ) that can be further explored, but it does not systematically lead to the optimal rates. The SVM classifier with hyper-parameter selection is compared with the classifiers most used in the literature for LiDAR data: random forest, AdaBoost, and decision tree. The comparison showed the superiority of the SVM classifier using parameter selection for LiDAR data compared to the other classifiers. Keywords: classification, airborne LiDAR, parameters selection, support vector machine
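A compact sketch of the parameter-selection step described above: an RBF-kernel SVM tuned over a (C, γ) grid with 5-fold cross-validation. Synthetic features stand in for the per-cluster LiDAR attributes, which are not reproduced here.

```python
# Sketch: RBF-kernel SVM with (C, gamma) grid search and 5-fold
# cross-validation. Synthetic stand-in features, not the Vaihingen data.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
X = rng.normal(size=(400, 6))           # hypothetical cluster features
y = rng.integers(0, 4, size=400)        # 4 classes: ground + 3 roof classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=1)

pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
grid = {"svc__C": [0.1, 1, 10, 100], "svc__gamma": [0.001, 0.01, 0.1, 1]}
search = GridSearchCV(pipe, grid, cv=5, scoring="accuracy").fit(X_tr, y_tr)

print("best (C, gamma):", search.best_params_)
print("test overall accuracy:", search.score(X_te, y_te))
```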
Procedia PDF Downloads 147
22072 Regional Disparities in the Level of Education in West Bengal
Authors: Nafisa Banu
Abstract:
The present study is an attempt to analyze the regional disparities in the level of education in West Bengal. The data are based on secondary sources obtained from the Census of India. The study is divided into four sections. The first section presents the introduction, objectives and a brief description of the study area, the second part discusses the methodology and database, while the third and fourth comprise the empirical results and interpretation, and the conclusion, respectively. To show the level of educational development, 8 indicators have been selected, and Z-score and composite score techniques have been applied. The present study finds that there are large variations in educational level due to various historical, economic and socio-cultural factors of the study area. Keywords: education, regional disparity, literacy rate, Z-score, composite score
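A common way to build such a composite index is to standardize each indicator as a Z-score across regions and average the scores; the sketch below illustrates that procedure with made-up district names and values, not the census figures used in the study.

```python
# Sketch: Z-score standardization and composite score across regions.
# District names and indicator values are illustrative, not census data.
import pandas as pd

data = pd.DataFrame(
    {
        "literacy_rate": [77.1, 66.3, 82.7, 71.9],
        "female_literacy": [70.5, 59.8, 79.2, 66.4],
        "enrolment_ratio": [88.0, 74.5, 91.2, 80.3],
    },
    index=["district_A", "district_B", "district_C", "district_D"],
)

# Z-score each indicator, then average across indicators per district
z = (data - data.mean()) / data.std()
composite = z.mean(axis=1).rename("composite_score")
print(composite.sort_values(ascending=False))
```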
Procedia PDF Downloads 355
22071 Study of Morphological Changes of the River Ganga in Patna District, Bihar Using Remote Sensing and GIS Techniques
Authors: Bhawesh Kumar, A. P. Krishna
Abstract:
The earth's surface is continuously changed by a variety of natural and anthropogenic agents that cut, carry away and deposit material from the land. Running water has a higher capacity for erosion than other geomorphic agents. This research work has been carried out on the Ganga River, whose channel is continuously changing under the influence of geomorphic agents and human activities in the surrounding regions. The main focus is to study the morphological characteristics and sand dynamics of the Ganga River, with particular emphasis on bank line and width changes, using remote sensing and GIS techniques. Advanced remote sensing data and topographical data were interpreted to obtain 52 years of changes. For this, remote sensing data of different years (LANDSAT TM 1975, 1988, 1993, ETM 2005 and ETM 2012) and the SOI toposheet for the year 1960 were used as base maps for this study. The sinuosity ratio, braiding index and migratory activity index were also established. The sinuosity ratio was found to be 1.16 in 1975, and in 1988, 1993, 2005 and 2012 it was 1.09, 1.11, 1.1 and 1.09, respectively. The analysis also shows that the minimum braiding value, found in 1960, was in reach 1, and the maximum value, 4.8806 in 2012, was found in reach 4, which suggests the creation of a number of islands in reach 4 by 2012. The migratory activity index (MAI), which is a standardized function of both length and time, was computed for the 8 representative reaches. The MAI shows that maximum migration was in 1975-1988 in reaches 6 and 7 and minimum migration was in 1993-2005. From the channel change analysis, it was found that the shifting of the bank line was cyclic and that the river Ganga showed a trend of maximum shift towards the south. The advanced remote sensing data and topographical data helped in obtaining 52 years of changes in the river due to various natural and man-made activities such as floods, water velocity, excavation, removal of vegetation cover and excavation of fertile soil for various purposes in the surrounding regions. Keywords: braided index, migratory activity index (MAI), Ganga river, river morphology
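For orientation, the sketch below computes two simple planform metrics of the kind named in the abstract. The values are invented, and the normalization used for the migration measure is a generic textbook form that may differ from the index actually used in the study.

```python
# Sketch: simple planform metrics of the kind used in the abstract.
# Values are invented; the formulas are generic assumptions, not the
# study's exact definitions.

def sinuosity_ratio(channel_length_km, valley_length_km):
    """Channel (thalweg) length divided by straight-line valley length."""
    return channel_length_km / valley_length_km

def migration_rate(shift_area_km2, reach_length_km, years):
    """A standardized migration measure: shifted area per unit reach
    length per year (one common way to normalize by length and time)."""
    return shift_area_km2 / (reach_length_km * years)

print(round(sinuosity_ratio(58.0, 50.0), 2))       # e.g. 1.16
print(round(migration_rate(12.4, 40.0, 13), 4))    # km of lateral shift / yr
```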
Procedia PDF Downloads 347
22070 Analysis of Crisis Management Systems of United Kingdom and Turkey
Authors: Recep Sait Arpat, Hakan Güreşci
Abstract:
The terms emergency, disaster and crisis management are generally perceived as the same processes. This conflation affects the approach and delegating policy of the political order. Crisis management starts in the aftermath of the mismanagement of disasters and emergencies. In the light of the information stated above, this article analyzes the crisis management systems of Turkey and the United Kingdom (UK). The article's main aim is to clarify the main points of the emergency management system of the United Kingdom and the disaster management system of Turkey by comparing them. To do this, a prototype model of the political decision-making processes of the two countries is drawn, and the decision-making mechanisms and planning functions are compared. As a result, it is found that emergency management policy in Turkey is reactive whereas it is proactive in the UK; in terms of delegating policy, Turkey's system is similar to the UK's; the levels of emergency situations are similar but not the same, with the differences stemming from the civil order and the effectiveness of non-governmental organizations; the UK has a detailed government engagement model for emergencies, which shapes the doctrine of its approach to emergencies and is successful in gathering and controlling the whole state's efforts; crisis management is a sub-phase of UK emergency management, whereas it is otherwise accepted as an outmoded management perception; and the focal point of the crisis management perception in the UK is security crises and natural disasters, while in Turkey it is natural disasters. In every analysis, proposals are given for Turkey. Keywords: crisis management, disaster management, emergency management, turkey, united kingdom
Procedia PDF Downloads 372
22069 Brainbow Image Segmentation Using Bayesian Sequential Partitioning
Authors: Yayun Hsu, Henry Horng-Shing Lu
Abstract:
This paper proposes a data-driven, biology-inspired neural segmentation method of 3D drosophila Brainbow images. We use Bayesian Sequential Partitioning algorithm for probabilistic modeling, which can be used to detect somas and to eliminate cross talk effects. This work attempts to develop an automatic methodology for neuron image segmentation, which nowadays still lacks a complete solution due to the complexity of the image. The proposed method does not need any predetermined, risk-prone thresholds since biological information is inherently included in the image processing procedure. Therefore, it is less sensitive to variations in neuron morphology; meanwhile, its flexibility would be beneficial for tracing the intertwining structure of neurons.Keywords: brainbow, 3D imaging, image segmentation, neuron morphology, biological data mining, non-parametric learning
Procedia PDF Downloads 487
22068 Measure-Valued Solutions to a Class of Nonlinear Parabolic Equations with Degenerate Coercivity and Singular Initial Data
Authors: Flavia Smarrazzo
Abstract:
Initial-boundary value problems for nonlinear parabolic equations having a Radon measure as initial data have been widely investigated, looking for solutions which for positive times take values in some function space. On the other hand, if the diffusivity degenerates too fast at infinity, it is well known that function-valued solutions may not exist, singularities may persist, and it looks very natural to consider solutions which, roughly speaking, for positive times describe an orbit in the space of the finite Radon measures. In this general framework, our purpose is to introduce a concept of measure-valued solution which is consistent with respect to regularizing and smoothing approximations, in order to develop an existence theory which depends neither on the level of degeneracy of the diffusivity at infinity nor on the choice of the initial measures. In more detail, we prove existence of suitably defined measure-valued solutions to the homogeneous Dirichlet initial-boundary value problem for a class of nonlinear parabolic equations without strong coerciveness. Moreover, we also discuss some qualitative properties of the constructed solutions concerning the evolution of their singular part, including conditions (depending both on the initial data and on the strength of the degeneracy) under which the constructed solutions are in fact function-valued or not. Keywords: degenerate parabolic equations, measure-valued solutions, Radon measures, Young measures
Procedia PDF Downloads 281
22067 Substation Automation, Digitization, Cyber Risk and Chain Risk Management Reliability
Authors: Serzhan Ashirov, Dana Nour, Rafat Rob, Khaled Alotaibi
Abstract:
There has been fast growth in the introduction and use of communications, information, monitoring, and sensing technologies. The new technologies are making their way into industrial control systems as components embedded in products, software applications, and IT services, or are commissioned to enable integration and automation of increasingly global supply chains. As a result, the lines that separated the physical, digital, and cyber worlds have diminished due to the vast implementation of the new, disruptive digital technologies. The variety and increased use of these technologies introduce many cybersecurity risks affecting the cyber-resilience of the supply chain, both in terms of the product or service delivered to a customer and in terms of the members of the supply chain operation. The US Department of Energy considers the supply chain in the Industry 4.0 (IR4) space to be the weakest link in cybersecurity. IR4 brought the digitization of field devices, followed by digitalization, which eventually moved through the digital transformation space with little care for the newly introduced cybersecurity risks. This paper will examine the best methodologies for securing electrical substations from cybersecurity attacks arising from supply chain risks and from the digitization effort. SCADA systems are the most vulnerable part of the power system infrastructure due to digitization and due to the weaknesses and vulnerabilities in supply chain security. The paper will discuss in detail how to create a secure supply chain methodology, secure substations, and mitigate the risks due to digitization. Keywords: cybersecurity, supply chain methodology, secure substation, digitization
Procedia PDF Downloads 64
22066 Developing the P1-P7 Management and Analysis Software for Thai Child Evaluation (TCE) of Food and Nutrition Status
Authors: S. Damapong, C. Kingkeow, W. Kongnoo, P. Pattapokin, S. Pruenglamphu
Abstract:
Given the presence of the double burden of malnutrition among Thai children, we conducted a project to promote holistic, age-appropriate nutrition for Thai children. Researchers developed the P1-P7 computer software for managing and analyzing the diverse types of collected data. The study objectives were: i) to use the software to manage and analyze the collected data, and ii) to evaluate the children's nutritional status and their caretakers' nutrition practices in order to create regulations for improving nutrition. Data were collected by means of questionnaires, called P1-P7. P1, P2 and P5 were for children and caretakers, and the others were for institutions. The children's nutritional status (height-for-age, weight-for-age, and weight-for-height) was calculated using Thai child z-score references. Institution evaluations consisted of various standard regulations, including the use of our software. The results showed that the software was used in 44 out of 118 communities (37.3%), 57 out of 240 child development centers and nurseries (23.8%), and 105 out of 152 schools (69.1%). No major problems have been reported with the software, although user efficiency can be increased further through additional training. As a result, the P1-P7 software was used to manage and analyze nutritional status, nutrition behavior, and environmental conditions in order to conduct the Thai Child Evaluation (TCE). The software was most widely used in schools. Some aspects of the P1-P7 questionnaires could be modified to increase ease of use and efficiency. Keywords: P1-P7 software, Thai child evaluation, nutritional status, malnutrition
Procedia PDF Downloads 356
22065 Determination of the Factors Affecting Adjustment Levels of First Class Students at Elementary School
Authors: Sibel Yoleri
Abstract:
This research aims to determine the school adjustment of students attending the first class at elementary school in terms of several variables. The study group consists of 286 students (131 female, 155 male) attending the first class of elementary school in the 2013-2014 academic year in the city center of Uşak. In the research, the 'Personal Information Form' and the 'Walker-McConnell Scale of Social Competence and School Adjustment' were used as data collection tools. In the analysis of the data, the independent samples t-test was applied to determine whether the students' school adjustment scores differ according to the sex variable. For data identified as not normally distributed, the Mann-Whitney U test was applied for paired comparisons and the Kruskal-Wallis H test was used for multiple comparisons. All statistical tests were evaluated two-tailed, and the level of significance was accepted as .05. According to the results of the research, no meaningful difference was identified in the students' level of adjustment to school in terms of the sex variable. At the end of the research, it was found that the adjustment level of students who started school at the age of seven is higher than that of those who started school at the age of five, and that the adjustment level of students who had preschool education before elementary school is higher than that of those who had not. Keywords: starting school, preschool education, school adjustment, Walker-Mcconnell Scale
Procedia PDF Downloads 488
22064 Comparative Analysis of Medical Tourism Industry among Key Nations in Southeast Asia
Authors: Nur A. Azmi, Suseela D. Chandran, Fadilah Puteh, Azizan Zainuddin
Abstract:
Medical tourism has become a global phenomenon in developed and developing countries in the 21st century. Medical tourism is defined as an activity in which individuals travel from one country to another to seek or receive medical healthcare. Based on the global trend, the number of medical tourists is increasing annually, especially in the Southeast Asia (SEA) region. Since the establishment of the Association of Southeast Asian Nations (ASEAN) in 1967, the SEA nations have worked towards regional integration in medical tourism. Medical tourism in SEA has become the third-largest sector contributing towards economic development. Previous research has demonstrated several factors that affect the development of medical tourism. However, despite the literature on SEA's medical tourism published in the last ten years, there continues to be a scarcity of research on the niche areas of each of the SEA countries. Hence, this paper is significant in enriching the literature in the field of medical tourism, particularly in showcasing the niche markets of medical tourism among the SEA's best players, namely Singapore, Thailand, Malaysia and Indonesia. This paper also contributes by offering a comparative analysis of whether the said nations are complementing or competing with each other in the medical tourism sector. This, in turn, will increase the availability of information on medical tourism in the SEA region. The data were collected through in-depth interviews with various stakeholders and private hospitals. The data were then analyzed using two approaches, namely thematic analysis (interview data) and document analysis (secondary data). The paper concludes by arguing that the ASEAN countries have specific niche markets to promote their medical tourism industry. This paper also concludes that these key nations complement each other in the industry. In addition, the medical tourism sector in the SEA region offers greater prospects for market development and expansion, as witnessed by the emergence of new key players from other nations. Keywords: healthcare services, medical tourism, medical tourists, SEA region, comparative analysis
Procedia PDF Downloads 144
22063 Care: A Cluster Based Approach for Reliable and Efficient Routing Protocol in Wireless Sensor Networks
Authors: K. Prasanth, S. Hafeezullah Khan, B. Haribalakrishnan, D. Arun, S. Jayapriya, S. Dhivya, N. Vijayarangan
Abstract:
The main goal of our approach is to find the optimum positions for the sensor nodes, reinforcing the communications at points where a lack of connectivity is found. Routing is the major problem in transferring data between nodes in a sensor network. We provide an efficient routing technique so that data signals reach the base station quickly and without interruption. Clustering and routing are the two important key factors to be considered in the case of WSNs. To carry out the communication from the nodes to their cluster head, we propose a parameterizable protocol so that the developer can indicate whether the routing has to be sensitive to the link quality of the nodes or to their battery levels. Keywords: clusters, routing, wireless sensor networks, three phases, sensor networks
Procedia PDF Downloads 505
22062 Adaptive Process Monitoring for Time-Varying Situations Using Statistical Learning Algorithms
Authors: Seulki Lee, Seoung Bum Kim
Abstract:
Statistical process control (SPC) is a practical and effective method for quality control. The most important and widely used technique in SPC is the control chart. The main goal of a control chart is to detect any assignable changes that affect the quality output. Most conventional control charts, such as Hotelling's T2 chart, are based on the assumption that the quality characteristics follow a multivariate normal distribution. However, in modern, complicated manufacturing systems, control chart techniques that can efficiently handle non-normal processes are required. To overcome the shortcomings of conventional control charts for non-normal processes, several methods have been proposed that combine statistical learning algorithms with multivariate control charts. Statistical learning-based control charts, such as support vector data description (SVDD)-based charts and k-nearest-neighbors-based charts, have proven their improved performance in non-normal situations compared with the T2 chart. Besides non-normality, time-varying operations are also quite common in real manufacturing fields because of various factors such as product and set-point changes, seasonal variations, catalyst degradation, and sensor drift. However, traditional control charts cannot accommodate future condition changes of the process because they are formulated based on the data recorded in the early stage of the process. In the present paper, we propose an SVDD-based control chart that is capable of adaptively monitoring time-varying and non-normal processes. We reformulated the SVDD algorithm into a time-adaptive SVDD algorithm by adding a weighting factor that reflects time-varying situations. Moreover, we defined an updating region for an efficient model-updating structure of the control chart. The proposed control chart simultaneously allows efficient model updates and timely detection of out-of-control signals. The effectiveness and applicability of the proposed chart were demonstrated through experiments with simulated data and real data from the metal frame process in mobile device manufacturing. Keywords: multivariate control chart, nonparametric method, support vector data description, time-varying process
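The paper's exact time-adaptive SVDD formulation is not reproduced in the abstract; as a rough, hedged approximation of the idea, the sketch below fits a one-class SVM (closely related to SVDD with an RBF kernel) using exponentially decaying sample weights so that recent observations dominate the learned boundary. The weighting scheme, parameters and data are assumptions, not the authors' method.

```python
# Sketch: a one-class boundary refit with time-decayed sample weights,
# as a rough stand-in for a time-adaptive SVDD control chart.
# The decay rate, kernel settings and data are assumptions.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(2)
T = 300
# Simulated 2-D quality characteristics whose mean drifts over time
drift = np.linspace(0, 1.5, T)[:, None] * np.array([1.0, 0.5])
X = rng.normal(size=(T, 2)) + drift

decay = 0.02
weights = np.exp(-decay * (T - 1 - np.arange(T)))   # newest points weigh most

model = OneClassSVM(kernel="rbf", gamma=0.5, nu=0.05)
model.fit(X, sample_weight=weights)

# Monitor new observations: negative decision values signal out-of-control
new_points = np.array([[1.6, 0.8], [6.0, 6.0]])
print(model.decision_function(new_points))   # in-control vs. alarm candidate
```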
Procedia PDF Downloads 299
22061 In-situ Acoustic Emission Analysis of a Polymer Electrolyte Membrane Water Electrolyser
Authors: M. Maier, I. Dedigama, J. Majasan, Y. Wu, Q. Meyer, L. Castanheira, G. Hinds, P. R. Shearing, D. J. L. Brett
Abstract:
Increasing the efficiency of electrolyser technology is commonly seen as one of the main challenges on the way to the Hydrogen Economy. There is a significant lack of understanding of the different states of operation of polymer electrolyte membrane water electrolysers (PEMWE) and how these influence the overall efficiency. This in particular means the two-phase flow through the membrane, gas diffusion layers (GDL) and flow channels. In order to increase the efficiency of PEMWE and facilitate their spread as commercial hydrogen production technology, new analytic approaches have to be found. Acoustic emission (AE) offers the possibility to analyse the processes within a PEMWE in a non-destructive, fast and cheap in-situ way. This work describes the generation and analysis of AE data coming from a PEM water electrolyser, for, to the best of our knowledge, the first time in literature. Different experiments are carried out. Each experiment is designed so that only specific physical processes occur and AE solely related to one process can be measured. Therefore, a range of experimental conditions is used to induce different flow regimes within flow channels and GDL. The resulting AE data is first separated into different events, which are defined by exceeding the noise threshold. Each acoustic event consists of a number of consequent peaks and ends when the wave diminishes under the noise threshold. For all these acoustic events the following key attributes are extracted: maximum peak amplitude, duration, number of peaks, peaks before the maximum, average intensity of a peak and time till the maximum is reached. Each event is then expressed as a vector containing the normalized values for all criteria. Principal Component Analysis is performed on the resulting data, which orders the criteria by the eigenvalues of their covariance matrix. This can be used as an easy way of determining which criteria convey the most information on the acoustic data. In the following, the data is ordered in the two- or three-dimensional space formed by the most relevant criteria axes. By finding spaces in the two- or three-dimensional space only occupied by acoustic events originating from one of the three experiments it is possible to relate physical processes to certain acoustic patterns. Due to the complex nature of the AE data modern machine learning techniques are needed to recognize these patterns in-situ. Using the AE data produced before allows to train a self-learning algorithm and develop an analytical tool to diagnose different operational states in a PEMWE. Combining this technique with the measurement of polarization curves and electrochemical impedance spectroscopy allows for in-situ optimization and recognition of suboptimal states of operation.Keywords: acoustic emission, gas diffusion layers, in-situ diagnosis, PEM water electrolyser
Procedia PDF Downloads 156
22060 An Exploratory Study on the Integration of Neurodiverse University Students into Mainstream Learning and Their Performance: The Case of the Jones Learning Center
Authors: George Kassar, Phillip A. Cartwright
Abstract:
Based on data collected from The Jones Learning Center (JLC), University of the Ozarks, Arkansas, U.S., this study explores the impact of inclusive classroom practices on neurodiverse college students and their consequent academic performance after participating in integrative therapies designed to support students who are intellectually capable of obtaining a college degree but who require support for learning challenges owing to disabilities, AD/HD, or ASD. The purpose of this study is two-fold. The first objective is to explore the general process, special techniques, and practices of the JLC inclusive program. The second objective is to identify and analyze the effectiveness of these processes, techniques, and practices in supporting the academic performance of enrolled college students with learning disabilities following integration into mainstream university learning. Integrity, transparency, and confidentiality are vital in the research. All questions were shared in advance and confirmed by the concerned management at the JLC. While administering the questionnaire as well as conducting the interviews, the purpose of the study, its scope, aims, and objectives were clearly explained to all participants before starting the questionnaire or interview. Confidentiality of all participants was assured and guaranteed by using encrypted identification of individuals, thus limiting access to the data to only the researcher, and by storing the data in a secure location. Respondents were also informed that their participation in this research was voluntary and that they could withdraw from it at any time prior to submission if they wished. Ethical consent was obtained from the participants before proceeding with video recording of the interviews. This research uses a mixed methods approach. The research design involves collecting, analyzing, and "mixing" quantitative and qualitative methods and data to enable the research inquiry. The research process is organized around a five-pillar approach. The first three pillars are focused on testing the first hypothesis (H1), directed toward determining the extent to which the academic performance of JLC students improved after involvement with the comprehensive JLC special program. The other two pillars relate to the second hypothesis (H2), which is directed toward determining the extent to which the collective and applied knowledge at the JLC is distinctive from typical practices in the field. The data collected for the research were obtained from three sources: 1) a set of secondary data in the form of Grade Point Average (GPA) records received from the registrar, 2) a set of primary data collected through a structured questionnaire administered to students and alumni at the JLC, and 3) another set of primary data collected through interviews conducted with staff and educators at the JLC. The significance of this study is twofold. First, it validates the effectiveness of the special program at the JLC for college-level students who learn differently. Second, it identifies the distinctiveness of the mix of techniques, methods, and practices, including the special individualized and personalized one-on-one approach at the JLC. Keywords: education, neuro-diverse students, program effectiveness, Jones learning center
Procedia PDF Downloads 74
22059 Statistical Models and Time Series Forecasting on Crime Data in Nepal
Authors: Dila Ram Bhandari
Abstract:
Throughout the 20th century, new governments were created in which identities such as ethnicity, religion, language, caste, community and tribe played a part in the development of constitutions and of the legal systems of victim and criminal justice. South Asian nations have recently been plagued by acute problems of extremism, poverty, environmental degradation, cybercrime, human rights violations, and crime against, and victimization of, both individuals and groups. A massive number of crimes take place every day, and these frequent crimes have made the lives of ordinary citizens restless. Crime is one of the major threats to society and to civilization and can create societal disturbance. Traditional crime-solving practices are unable to live up to the requirements of the current crime situation. Crime analysis is one of the most important activities of the majority of intelligence and law enforcement organizations all over the world. Unlike the Central Asia and Asia-Pacific regions, the South Asia region lacks a regional coordination mechanism to facilitate criminal intelligence sharing and operational coordination related to organized crime, including illicit drug trafficking and money laundering. There have been numerous conversations in recent years about using data mining technology to combat crime and terrorism. The Data Detective program from the software company Sentient uses data mining techniques to support the police (Sentient, 2017). The goals of this internship were to test several predictive model solutions and choose the most effective and promising one. First, extensive literature reviews on data mining, crime analysis, and crime data mining were conducted. Sentient offered a 7-year archive of crime statistics that were aggregated daily to produce a univariate dataset. Moreover, a daily aggregation by incidence type was performed to produce a multivariate dataset. Each solution's forecast period lasted seven days. The experiments were split into two main groups: statistical models and neural network models. For the crime data, neural networks fared better than statistical models. This study gives a general review of the applied statistical and neural network models. A comparative analysis of all the models on a comparable dataset provides a detailed picture of each model's performance on the available data and its generalizability. The studies demonstrated that, in comparison to other models, Gated Recurrent Units (GRU) produced better predictions. Crime records for 2005-2019 were collected from the Nepal Police headquarters and analyzed with the R programming language. In conclusion, a gated recurrent unit implementation could benefit the police in predicting crime. Hence, time series analysis using GRUs could be a prospective additional feature in Data Detective. Keywords: time series analysis, forecasting, ARIMA, machine learning
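A minimal sketch of a univariate GRU forecaster for daily crime counts follows, in the spirit of the abstract; the study itself worked in R, whereas this illustration uses Python with Keras, and the window size, architecture and synthetic series are all assumptions.

```python
# Sketch: a univariate GRU forecaster for daily crime counts.
# Synthetic weekly-seasonal series; window, architecture are assumptions.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(3)
series = 50 + 10 * np.sin(np.arange(1000) * 2 * np.pi / 7) + rng.normal(0, 3, 1000)

window, horizon = 28, 7          # use 4 weeks of history to predict 7 days
X, y = [], []
for t in range(len(series) - window - horizon):
    X.append(series[t:t + window])
    y.append(series[t + window:t + window + horizon])
X = np.array(X)[..., None]       # (samples, timesteps, features)
y = np.array(y)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(window, 1)),
    tf.keras.layers.GRU(32),
    tf.keras.layers.Dense(horizon),
])
model.compile(optimizer="adam", loss="mae")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

print(model.predict(X[-1:], verbose=0))   # next 7 days' forecast
```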
Procedia PDF Downloads 164
22058 Relative Entropy Used to Determine the Divergence of Cells in Single Cell RNA Sequence Data Analysis
Authors: An Chengrui, Yin Zi, Wu Bingbing, Ma Yuanzhu, Jin Kaixiu, Chen Xiao, Ouyang Hongwei
Abstract:
Single-cell RNA sequencing (scRNA-seq) is one of the effective tools to study the transcriptomics of biological processes. Currently, the similarity between cells is usually measured with the Euclidean distance or its derivatives. However, the process of scRNA-seq follows a multivariate Bernoulli event model; thus, we hypothesize that it would be more efficient to value the divergence between cells with relative entropy than with the Euclidean distance. In this study, we compared the performance of the Euclidean distance, the Spearman correlation distance and relative entropy using scRNA-seq data of the early, medial and late stages of limb development generated in our lab. Relative entropy performs better than the other methods according to a cluster potential test. Furthermore, we developed KL-SNE, an algorithm that modifies t-SNE by changing its definition of the divergence between cells from the Euclidean distance to the Kullback-Leibler divergence. Results showed that KL-SNE was more effective than t-SNE in dissecting cell heterogeneity, indicating the better performance of relative entropy over the Euclidean distance. Specifically, the chondrocytes expressing Comp were clustered together with KL-SNE but not with t-SNE. Surprisingly, cells in the early stage were surrounded by cells in the medial stage in the KL-SNE embedding, while medial cells neighbored the late stage in the t-SNE embedding. These results parallel the heatmap, which showed that cells in the medial stage were more heterogeneous than cells in the other stages. In addition, we also found that the results of KL-SNE tend to follow a Gaussian distribution compared with those of t-SNE, which could also be verified with the analysis of scRNA-seq data from another study on human embryo development. Therefore, it is also an effective way to convert a non-Gaussian distribution into a Gaussian distribution and to facilitate subsequent statistical processes. Thus, relative entropy is potentially a better way to determine the divergence of cells in scRNA-seq data analysis. Keywords: Single cell RNA sequence, Similarity measurement, Relative Entropy, KL-SNE, t-SNE
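As a minimal illustration of the divergence the abstract advocates, the sketch below normalizes two cells' count vectors into probability distributions and computes the Kullback-Leibler divergence alongside the Euclidean distance; the counts and the pseudocount smoothing are assumptions, not the study's pipeline.

```python
# Sketch: relative entropy (KL divergence) vs. Euclidean distance between
# two cells' expression profiles. Counts and smoothing are illustrative.
import numpy as np
from scipy.stats import entropy

cell_a = np.array([120, 0, 35, 8, 300], dtype=float)   # hypothetical UMI counts
cell_b = np.array([90, 5, 50, 0, 270], dtype=float)

eps = 1.0                                   # pseudocount to avoid log(0)
p = (cell_a + eps) / (cell_a + eps).sum()
q = (cell_b + eps) / (cell_b + eps).sum()

kl_pq = entropy(p, q)                                    # D_KL(p || q), asymmetric
kl_sym = 0.5 * (entropy(p, q) + entropy(q, p))           # symmetrized variant
euclid = np.linalg.norm(cell_a - cell_b)

print(f"KL(p||q)={kl_pq:.4f}  symmetric KL={kl_sym:.4f}  Euclidean={euclid:.2f}")
```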
Procedia PDF Downloads 340
22057 A Machine Learning Approach for Efficient Resource Management in Construction Projects
Authors: Soheila Sadeghi
Abstract:
Construction projects are complex and often subject to significant cost overruns due to the multifaceted nature of the activities involved. Accurate cost estimation is crucial for effective budget planning and resource allocation. Traditional methods for predicting overruns often rely on expert judgment or analysis of historical data, which can be time-consuming, subjective, and may fail to consider important factors. However, with the increasing availability of data from construction projects, machine learning techniques can be leveraged to improve the accuracy of overrun predictions. This study applied machine learning algorithms to enhance the prediction of cost overruns in a case study of a construction project. The methodology involved the development and evaluation of two machine learning models: Random Forest and Neural Networks. Random Forest can handle high-dimensional data, capture complex relationships, and provide feature importance estimates. Neural Networks, particularly Deep Neural Networks (DNNs), are capable of automatically learning and modeling complex, non-linear relationships between input features and the target variable. These models can adapt to new data, reduce human bias, and uncover hidden patterns in the dataset. The findings of this study demonstrate that both Random Forest and Neural Networks can significantly improve the accuracy of cost overrun predictions compared to traditional methods. The Random Forest model also identified key cost drivers and risk factors, such as changes in the scope of work and delays in material delivery, which can inform better project risk management. However, the study acknowledges several limitations. First, the findings are based on a single construction project, which may limit the generalizability of the results to other projects or contexts. Second, the dataset, although comprehensive, may not capture all relevant factors influencing cost overruns, such as external economic conditions or political factors. Third, the study focuses primarily on cost overruns, while schedule overruns are not explicitly addressed. Future research should explore the application of machine learning techniques to a broader range of projects, incorporate additional data sources, and investigate the prediction of both cost and schedule overruns simultaneously.Keywords: resource allocation, machine learning, optimization, data-driven decision-making, project management
Procedia PDF Downloads 40
22056 Hydrogen: Contention-Aware Hybrid Memory Management for Heterogeneous CPU-GPU Architectures
Authors: Yiwei Li, Mingyu Gao
Abstract:
Integrating hybrid memories with heterogeneous processors could leverage heterogeneity in both compute and memory domains for better system efficiency. To ensure performance isolation, we introduce Hydrogen, a hardware architecture to optimize the allocation of hybrid memory resources to heterogeneous CPU-GPU systems. Hydrogen supports efficient capacity and bandwidth partitioning between CPUs and GPUs in both memory tiers. We propose decoupled memory channel mapping and token-based data migration throttling to enable flexible partitioning. We also support epoch-based online search for optimized configurations and lightweight reconfiguration with reduced data movements. Hydrogen significantly outperforms existing designs by 1.21x on average and up to 1.31x.Keywords: hybrid memory, heterogeneous systems, dram cache, graphics processing units
Procedia PDF Downloads 97
22055 Computer-Based versus Paper-Based Tests: A Comparative Study of Two Types of Indonesian National Examination for Senior High School Students
Authors: Faizal Mansyur
Abstract:
The objective of this research is to find out whether there is a significant difference in the English language scores of senior high school students in the Indonesian National Examination between students tested using computer-based tests and those tested using paper-based tests. The population of this research is senior high school students in South Sulawesi Province who sat the Indonesian National Examination in the 2015/2016 academic year. The sample of this research consists of 800 students' scores from 8 schools, taken by employing the multistage random sampling technique. The data of this research are secondary data, obtained from the education office for South Sulawesi. In analyzing the collected data, the researcher employed the independent samples t-test with the help of the SPSS v.24 program. The finding of this research reveals that there is a significant difference in the English language scores of senior high school students in the Indonesian National Examination between students tested using computer-based and paper-based tests (p < .05). Moreover, students tested using the PBT (Mean = 63.13, SD = 13.63) achieved higher scores than those tested using the CBT (Mean = 46.33, SD = 14.68). Keywords: computer-based test, paper-based test, Indonesian national examination, testing
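The comparison described here is a standard independent-samples t-test (the study used SPSS); the sketch below reproduces the procedure in Python on synthetic score vectors drawn to match the reported means and standard deviations, purely as an illustration rather than the actual examination data.

```python
# Sketch: independent-samples t-test comparing PBT and CBT English scores.
# Scores are synthetic, generated only to mimic the reported summary stats.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
pbt_scores = rng.normal(loc=63.13, scale=13.63, size=400)   # paper-based group
cbt_scores = rng.normal(loc=46.33, scale=14.68, size=400)   # computer-based group

t_stat, p_value = stats.ttest_ind(pbt_scores, cbt_scores, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4g}")
print("significant at .05:", p_value < 0.05)
```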
Procedia PDF Downloads 167
22054 Close-Range Remote Sensing Techniques for Analyzing Rock Discontinuity Properties
Authors: Sina Fatolahzadeh, Sergio A. Sepúlveda
Abstract:
This paper presents advanced developments in close-range, terrestrial remote sensing techniques to enhance the characterization of rock masses. The study integrates two state-of-the-art laser-scanning technologies, the HandySCAN and GeoSLAM laser scanners, to extract high-resolution geospatial data for rock mass analysis. These instruments offer high accuracy, precision, low acquisition time, and high efficiency in capturing intricate geological features in small to medium size outcrops and slope cuts. Using the HandySCAN and GeoSLAM laser scanners facilitates real-time, three-dimensional mapping of rock surfaces, enabling comprehensive assessments of rock mass characteristics. The collected data provide valuable insights into structural complexities, surface roughness, and discontinuity patterns, which are essential for geological and geotechnical analyses. The synergy of these advanced remote sensing technologies contributes to a more precise and straightforward understanding of rock mass behavior. In this case, the main parameters of RQD, joint spacing, persistence, aperture, roughness, infill, weathering, water condition, and joint orientation in a slope cut along the Sea-to-Sky Highway, BC, were remotely analyzed to calculate and evaluate the Rock Mass Rating (RMR) and Geological Strength Index (GSI) classification systems. Automatic and manual analyses of the acquired data are then compared with field measurements. The results show the usefulness of the proposed remote sensing methods and their appropriate conformity with the actual field data.Keywords: remote sensing, rock mechanics, rock engineering, slope stability, discontinuity properties
Procedia PDF Downloads 66
22053 Application of Data Mining for Aquifer Environmental Assessment
Authors: Saman Javadi, Mehdi Hashemy, Mohahammad Mahmoodi
Abstract:
Vulnerability maps are employed as an important tool for managing the entry of pollution into aquifers. The most common way to produce a vulnerability map is the DRASTIC method. However, the method is not easy to apply to every aquifer because of the need to choose appropriate constant values for the weights and ranks. In this study, a new approach using k-means clustering is applied to produce vulnerability maps. Four features, namely depth to groundwater, hydraulic conductivity, recharge value and vadose zone, were considered simultaneously as the clustering features. Five regions are recognized in the case study, representing zones with different levels of vulnerability. The findings show that clustering provides a realistic vulnerability map: the Pearson correlation coefficient between nitrate concentrations and the clustering vulnerability is 61%. Keywords: clustering, data mining, groundwater, vulnerability assessment
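A hedged sketch of the clustering step: standardize the four hydrogeological features and partition the cells into five vulnerability zones with k-means. The feature values are invented placeholders, not the study's aquifer data.

```python
# Sketch: k-means vulnerability zoning from four aquifer features.
# The feature table is a synthetic placeholder, not the case-study data.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)
n_cells = 200
features = np.column_stack([
    rng.uniform(2, 60, n_cells),     # depth to groundwater (m)
    rng.lognormal(0, 1, n_cells),    # hydraulic conductivity (m/day)
    rng.uniform(20, 300, n_cells),   # recharge (mm/yr)
    rng.uniform(1, 40, n_cells),     # vadose zone thickness (m)
])

X = StandardScaler().fit_transform(features)
km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(X)

labels = km.labels_                  # vulnerability zone per grid cell
print(np.bincount(labels))           # how many cells fall in each zone
```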
Procedia PDF Downloads 603
22052 Further Investigation of α+12C and α+16O Elastic Scattering
Authors: Sh. Hamada
Abstract:
The current work aims to study the rainbow-like structure observed in the elastic scattering of alpha particles on both 12C and 16O nuclei. We reanalyzed the experimental elastic scattering angular distribution data for the α+12C and α+16O nuclear systems at different energies using both the optical model and double folding potentials based on different interaction models: CDM3Y1, DDM3Y1, CDM3Y6 and BDM3Y1. The potential created by the BDM3Y1 interaction model has the shallowest depth, which reflects the necessity of using a higher renormalization factor (Nr). Both the optical model and the double folding potentials of the different interaction models fairly reproduce the experimental data. Keywords: density distribution, double folding, elastic scattering, nuclear rainbow, optical model
Procedia PDF Downloads 237
22051 Energy Consumption and Economic Growth: Testimony of Selected Sub-Saharan Africa Countries
Authors: Alfred Quarcoo
Abstract:
The main purpose of this paper is to examine the causal relationship between energy consumption and economic growth in Sub-Saharan Africa using panel data techniques. Annual data on energy consumption and economic growth (proxied by real gross domestic product per capita) spanning from 1990 to 2016, taken from the World Bank development indicators database, were used. The results of the Augmented Dickey-Fuller unit root test show that the series for all countries are not stationary at levels. However, the log of economic growth in Benin and Congo becomes stationary after taking first differences of the data, the log of energy consumption becomes stationary for all countries, and the log of economic growth in Kenya and Zimbabwe was found to be stationary after taking second differences of the panel series. The findings of the Johansen cointegration test demonstrate that the variables log of energy consumption and log of economic growth are not cointegrated for the cases of Kenya and Zimbabwe, so no long-run relationship between the variables was established in any country. The Granger causality test indicates that there is unidirectional causality running from energy use to economic growth in Kenya and no causal linkage between energy consumption and economic growth in Benin, Congo and Zimbabwe. Keywords: Cointegration, Granger Causality, Sub-Sahara Africa, World Bank Development Indicators
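The three tests named in the abstract can be run with standard time-series tooling; the sketch below applies them to a simulated two-variable series as a hedged illustration only, not to the World Bank data used in the paper, and the lag choices are assumptions.

```python
# Sketch: ADF unit root, Johansen cointegration and Granger causality tests
# on a synthetic two-variable annual series (1990-2016 length).
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller, grangercausalitytests
from statsmodels.tsa.vector_ar.vecm import coint_johansen

rng = np.random.default_rng(6)
n = 27                                   # 27 annual observations
log_energy = np.cumsum(rng.normal(0.02, 0.05, n)) + 5.0
log_gdp = 0.6 * log_energy + rng.normal(0, 0.03, n) + 2.0
df = pd.DataFrame({"log_energy": log_energy, "log_gdp": log_gdp})

# 1) Augmented Dickey-Fuller test on each series (H0: unit root)
for col in df:
    stat, pval, *_ = adfuller(df[col])
    print(f"ADF {col}: stat={stat:.2f}, p={pval:.3f}")

# 2) Johansen cointegration test (trace statistics vs. 95% critical values)
jres = coint_johansen(df, det_order=0, k_ar_diff=1)
print("Johansen trace stats:", jres.lr1, "95% critical:", jres.cvt[:, 1])

# 3) Granger causality: does energy use help predict growth?
grangercausalitytests(df[["log_gdp", "log_energy"]], maxlag=2)
```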
Procedia PDF Downloads 52
22050 Time Travel Testing: A Mechanism for Improving Renewal Experience
Authors: Aritra Majumdar
Abstract:
While organizations strive to expand their new customer base, retaining existing relationships is a key aspect of improving overall profitability and also showcasing how successful an organization is in holding on to its customers. It is an experimentally proven fact that the lion’s share of profit always comes from existing customers. Hence seamless management of renewal journeys across different channels goes a long way in improving trust in the brand. From a quality assurance standpoint, time travel testing provides an approach to both business and technology teams to enhance the customer experience when they look to extend their partnership with the organization for a defined phase of time. This whitepaper will focus on key pillars of time travel testing: time travel planning, time travel data preparation, and enterprise automation. Along with that, it will call out some of the best practices and common accelerator implementation ideas which are generic across verticals like healthcare, insurance, etc. In this abstract document, a high-level snapshot of these pillars will be provided. Time Travel Planning: The first step of setting up a time travel testing roadmap is appropriate planning. Planning will include identifying the impacted systems that need to be time traveled backward or forward depending on the business requirement, aligning time travel with other releases, frequency of time travel testing, preparedness for handling renewal issues in production after time travel testing is done and most importantly planning for test automation testing during time travel testing. Time Travel Data Preparation: One of the most complex areas in time travel testing is test data coverage. Aligning test data to cover required customer segments and narrowing it down to multiple offer sequencing based on defined parameters are keys for successful time travel testing. Another aspect is the availability of sufficient data for similar combinations to support activities like defect retesting, regression testing, post-production testing (if required), etc. This section will talk about the necessary steps for suitable data coverage and sufficient data availability from a time travel testing perspective. Enterprise Automation: Time travel testing is never restricted to a single application. The workflow needs to be validated in the downstream applications to ensure consistency across the board. Along with that, the correctness of offers across different digital channels needs to be checked in order to ensure a smooth customer experience. This section will talk about the focus areas of enterprise automation and how automation testing can be leveraged to improve the overall quality without compromising on the project schedule. Along with the above-mentioned items, the white paper will elaborate on the best practices that need to be followed during time travel testing and some ideas pertaining to accelerator implementation. To sum it up, this paper will be written based on the real-time experience author had on time travel testing. While actual customer names and program-related details will not be disclosed, the paper will highlight the key learnings which will help other teams to implement time travel testing successfully.Keywords: time travel planning, time travel data preparation, enterprise automation, best practices, accelerator implementation ideas
Procedia PDF Downloads 159
22049 Electronic Data Interchange (EDI) in the Supply Chain: Impact on Customer Satisfaction
Authors: Hicham Amine, Abdelouahab Mesnaoui
Abstract:
Electronic data interchange (EDI) is the computer-to-computer exchange of structured business information. This information typically takes the form of standardized electronic business documents, such as invoices, purchase orders, bills of lading, and so on. The purpose of this study is to identify the impact EDI might have on the supply chain, and on customer satisfaction in particular, keeping in mind the constraints the organization might face. This study included 139 subject matter experts (SMEs) who participated by responding to a distributed survey. 85% responded that they are strongly in favor of the implementation, while 10% were neutral and 5% were against it. From the quality assurance department, 75% of the clients agreed to move on with the change, whereas 10% stayed neutral and 15% were against the change. From the legal department, 80% of the answers were in favor of the implementation, 10% of the participants stayed neutral, and the remaining 10% were against it. The participants consisted of 40% males and 60% females (sex ratio F/M = 1.5). The survey also distinguished 3 categories of technical background: 80% had a technical background, 15% had a non-technical background, and 5% had an average technical background. This study examines the impact of EDI on customer satisfaction, which is the primary hypothesis, and justifies the importance of the implementation, which enhances customer satisfaction. Keywords: electronic data interchange, supply chain, subject matter experts, customer satisfaction
Procedia PDF Downloads 340
22048 Accelerating Side Channel Analysis with Distributed and Parallelized Processing
Authors: Kyunghee Oh, Dooho Choi
Abstract:
Although a cryptographic algorithm may have no theoretical weakness, side channel analysis can extract secret data from the physical implementation of a cryptosystem. The analysis is based on extra information such as timing information, power consumption, electromagnetic leaks or even sound, which can be exploited to break the system. Differential power analysis (DPA) is one of the most popular such analyses; it computes the statistical correlations between secret-key hypotheses and power consumption. It usually requires processing a huge amount of data and takes a long time; for some devices with countermeasures, it may take several weeks. We suggest and evaluate methods to shorten the time needed to analyze cryptosystems. Our methods include distributed computing and parallelized processing. Keywords: DPA, distributed computing, parallelized processing, side channel analysis
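As a hedged sketch of the kind of computation being parallelized, the code below correlates a simple Hamming-weight power model for each key-byte guess against simulated power traces, splitting the 256 guesses across worker processes. The leakage model, trace simulation, placeholder S-box and process layout are illustrative assumptions, not the authors' implementation.

```python
# Sketch: parallelized correlation power analysis on simulated traces.
# Leakage model, traces and placeholder S-box are illustrative assumptions.
import numpy as np
from multiprocessing import Pool

SBOX = np.arange(256, dtype=np.uint8)          # placeholder for a real S-box
HW = np.array([bin(v).count("1") for v in range(256)])

rng = np.random.default_rng(7)
N_TRACES, TRUE_KEY = 2000, 0x3C
plaintexts = rng.integers(0, 256, N_TRACES, dtype=np.uint8)
leak = HW[SBOX[plaintexts ^ TRUE_KEY]]
traces = leak[:, None] + rng.normal(0, 1.0, (N_TRACES, 50))   # 50 samples/trace

def score_guess(guess):
    """Max |correlation| between the hypothetical leakage and every sample."""
    hyp = HW[SBOX[plaintexts ^ guess]].astype(float)
    h = hyp - hyp.mean()
    t = traces - traces.mean(axis=0)
    corr = (h @ t) / (np.linalg.norm(h) * np.linalg.norm(t, axis=0))
    return guess, np.max(np.abs(corr))

if __name__ == "__main__":
    with Pool(4) as pool:                      # distribute the 256 key guesses
        results = pool.map(score_guess, range(256))
    best = max(results, key=lambda r: r[1])
    print(f"best key guess: {best[0]:#04x} (corr={best[1]:.3f})")
```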
Procedia PDF Downloads 428