Search results for: data reduction
26924 Development of Management System of the Experience of Defensive Modeling and Simulation by Data Mining Approach
Authors: D. Nam Kim, D. Jin Kim, Jeonghwan Jeon
Abstract:
Defensive Modeling and Simulation (M&S) is a system which enables otherwise impracticable training by reducing constraints of time, space and financial resources. The necessity of defensive M&S has been increasing not only for education and training but also for virtual combat. Soldiers who use defensive M&S for education and training obtain empirical knowledge and know-how. However, the knowledge obtained by individual soldiers has not yet been managed and utilized, owing to the nature of military organizations: confidentiality and frequent turnover of members. Therefore, this study aims to develop a management system for the experience of defensive M&S based on a data mining approach. Since the individual empirical knowledge gained through using defensive M&S comprises both quantitative and qualitative data, a data mining approach is appropriate for dealing with it. This research is expected to be helpful for soldiers and military policy makers. Keywords: data mining, defensive m&s, management system, knowledge management
Procedia PDF Downloads 258
26923 Stability Analysis of Slopes during Pile Driving
Authors: Yeganeh Attari, Gudmund Reidar Eiksund, Hans Peter Jostad
Abstract:
In geotechnical practice, there is no standard method recognized by the industry to account for the reduction of the safety factor of a slope as an effect of soil displacement and pore pressure build-up during pile installation. Pile driving causes large strains and generates excess pore pressures in a zone that can extend many diameters from the installed pile, resulting in a decrease of the shear strength of the surrounding soil. This phenomenon may cause slope failure. Moreover, dissipation of the excess pore pressure set-up may cause weakening of areas outside the volume of soil remoulded during installation. Because of complex interactions between changes in mean stress and shearing, it is challenging to predict the installation-induced pore pressure response. Furthermore, it is a complex task to follow the rate and path of pore pressure dissipation in order to analyze slope stability. In cohesive soils it is necessary to implement soil models that account for strain softening in the analysis. In the literature, several cases of slope failure due to pile driving activities have been reported, for instance, a landslide in Gothenburg that destroyed more than thirty houses and the Rigaud landslide in Quebec, which resulted in loss of life. Up to now, several methods have been suggested to predict the effect of pile driving on total and effective stress, pore pressure changes and their effect on soil strength. However, this is still not well understood or agreed upon. In Norway, general approaches applied by geotechnical engineers for this problem are based on old empirical methods with little accurate theoretical background. While the limitations of such methods are discussed, this paper attempts to capture the reduction in the factor of safety of a slope during pile driving, using coupled finite element analysis and the cavity expansion method. This is demonstrated by analyzing a case of slope failure due to pile driving in Norway. Keywords: cavity expansion method, excess pore pressure, pile driving, slope failure
Procedia PDF Downloads 153
26922 Timely Detection and Identification of Abnormalities for Process Monitoring
Authors: Hyun-Woo Cho
Abstract:
The detection and identification of abnormalities in multivariate manufacturing processes are quite important in order to maintain good product quality. Unusual behaviors or events encountered during operation can have a serious impact on the process and product quality. Thus they should be detected and identified as soon as possible. This paper focuses on the efficient representation of process measurement data in detecting and identifying abnormalities. This qualitative method is effective in representing fault patterns of process data. In addition, it is not overly sensitive to measurement noise, so that reliable outcomes can be obtained. To evaluate its performance, a simulation process was utilized, and the effect of adopting linear and nonlinear methods in the detection and identification was tested with different simulation data. It was shown that the use of a nonlinear technique produced more satisfactory and more robust results for the simulation data sets. This monitoring framework can help operating personnel detect the occurrence of process abnormalities and identify their assignable causes on an on-line or real-time basis. Keywords: detection, monitoring, identification, measurement data, multivariate techniques
Procedia PDF Downloads 237
26921 Imputation of Urban Movement Patterns Using Big Data
Authors: Eusebio Odiari, Mark Birkin, Susan Grant-Muller, Nicolas Malleson
Abstract:
Big data typically refers to consumer datasets revealing some detailed heterogeneity in human behavior, which, if harnessed appropriately, could potentially revolutionize our understanding of the collective phenomena of the physical world. Inadvertent missing values skew these datasets and compromise the validity of any analysis built on them. Here we discuss a conceptually consistent strategy for identifying other relevant datasets to combine with available big data, to plug the gaps and to create the rich, comprehensive dataset required for subsequent analysis. Specifically, emphasis is on how these methodologies can for the first time enable the construction of more detailed pictures of passenger demand and drivers of mobility on the railways. These methodologies can predict the influence of changes within the network (like a change in timetable or the impact of a new station), explain local phenomena outside the network (like rail-heading) and capture other impacts of urban morphology. Our analysis also reveals that our new imputation data model provides for more equitable revenue sharing amongst network operators who manage different parts of the integrated UK railways. Keywords: big-data, micro-simulation, mobility, ticketing-data, commuters, transport, synthetic, population
Procedia PDF Downloads 231
26920 Hydroxy Safflower Yellow A (HSYA) Mediated Neuroprotective Effect against Ischemia Reperfusion (I/R) Injury in Cerebral Stroke
Authors: Sruthi Ramagiri, Rajeev T.
Abstract:
Free radical damage has been implicated as the major culprit in ischemic stroke, contributing to oxidative damage. Recent investigations on Hydroxy Safflower Yellow A (HSYA) suggested its role in cerebral ischemia and various neurodegenerative disorders, with unidentified molecular mechanisms. The current study was designed to investigate the putative therapeutic role and possible molecular mechanisms of HSYA administration at the onset of reperfusion in cerebral ischemia-reperfusion (I/R) injury in cerebral stroke. Cerebral stroke was achieved by a focal ischemic model. HSYA (10 mg/kg) was injected intravenously via the tail vein 5 minutes before reperfusion. Losses of sensorimotor abilities were evaluated by neurological scoring, spontaneous locomotor activity, and rotarod performance. The extent of oxidative stress was evaluated by biochemical parameters, i.e., malondialdehyde (MDA), glutathione (GSH), superoxide dismutase (SOD) and catalase levels. The infarct volume of the brain was assessed by the 2,3,5-triphenyl tetrazolium chloride (TTC) staining technique. Increased cerebral (I/R) injury was evidenced by motor impairment, increased infarct volume and elevation of MDA levels, along with a significant reduction in antioxidant levels, i.e., GSH, SOD and catalase, when compared to sham control. However, post conditioning with HSYA (10 mg/kg, i.v.) at the onset of reperfusion significantly ameliorated sensorimotor abilities, attenuated MDA levels and reduced the infarct volume as compared with the vehicle treated I/R injury group. Moreover, HSYA treatment improved antioxidant enzyme levels as compared with the vehicle treated I/R injury group. In conclusion, it may be suggested that HSYA post conditioning could be a novel therapeutic approach against I/R injury in cerebral stroke, possibly through its anti-oxidant mechanism. Keywords: HSYA, Ischemia reperfusion injury, oxidative stress, stroke
Procedia PDF Downloads 429
26919 Analyzing Data Protection in the Era of Big Data under the Framework of Virtual Property Layer Theory
Authors: Xiaochen Mu
Abstract:
Data rights confirmation, as a key legal issue in the development of the digital economy, is undergoing a transition from a traditional rights paradigm to a more complex private-economic paradigm. In this process, data rights confirmation has evolved from a simple claim of rights to a complex structure encompassing multiple dimensions of personality rights and property rights. Current data rights confirmation practices are primarily reflected in two models: holistic rights confirmation and process rights confirmation. The holistic rights confirmation model continues the traditional "one object, one right" theory, while the process rights confirmation model, through contractual relationships in the data processing process, recognizes rights that are more adaptable to the needs of data circulation and value release. In the design of the data property rights system, there is a hierarchical characteristic aimed at decoupling from raw data to data applications through horizontal stratification and vertical staging. This design not only respects the ownership rights of data originators but also, based on the usufructuary rights of enterprises, constructs a corresponding rights system for different stages of data processing activities. The subjects of data property rights include both data originators, such as users, and data producers, such as enterprises, who enjoy different rights at different stages of data processing. The intellectual property rights system, with the mission of incentivizing innovation and promoting the advancement of science, culture, and the arts, provides a complete set of mechanisms for protecting innovative results. However, unlike traditional private property rights, the granting of intellectual property rights is not an end in itself; the purpose of the intellectual property system is to balance the exclusive rights of the rights holders with the prosperity and long-term development of society's public learning and the entire field of science, culture, and the arts. Therefore, the intellectual property granting mechanism provides both protection and limitations for the rights holder. This perfectly aligns with the dual attributes of data. In terms of achieving the protection of data property rights, the granting of intellectual property rights is an important institutional choice that can enhance the effectiveness of the data property exchange mechanism. Although this is not the only path, the granting of data property rights within the framework of the intellectual property rights system helps to establish fundamental legal relationships and rights confirmation mechanisms and is more compatible with the classification and grading system of data. The modernity of the intellectual property rights system allows it to adapt to the needs of big data technology development through special clauses or industry guidelines, thus promoting the comprehensive advancement of data intellectual property rights legislation. This paper analyzes data protection under the virtual property layer theory and two-fold virtual property rights system. Based on the “bundle of right” theory, this paper establishes specific three-level data rights. This paper analyzes the cases: Google v. Vidal-Hall, Halliday v Creation Consumer Finance, Douglas v Hello Limited, Campbell v MGN and Imerman v Tchenquiz. 
This paper concludes that recognizing property rights over personal data and protecting data under the framework of intellectual property will be beneficial for establishing the tort of misuse of personal information. Keywords: data protection, property rights, intellectual property, big data
Procedia PDF Downloads 42
26918 Waste Management in Africa
Authors: Peter Ekene Egwu
Abstract:
Waste management is of critical importance in Africa for reasons related to public health, human dignity, climate resilience and environmental preservation. However, delivering waste management services requires adequate funding, which has generally been lacking in a context where the generation of waste is outpacing the development of waste management infrastructure in most cities. The sector represents a growing percentage of cities’ greenhouse gas (GHG) emissions, and some of the African cities profiled in this study are now designing waste management strategies with emission reduction in mind.Keywords: management waste material, Africa, uses of new technology to manage waste, waste management
Procedia PDF Downloads 79
26917 The Influence of Housing Choice Vouchers on the Private Rental Market
Authors: Randy D. Colon
Abstract:
Through a freedom of information request, data pertaining to Housing Choice Voucher (HCV) households have been obtained from the Chicago Housing Authority, including rent price and number of bedrooms per HCV household, community area, and zip code from 2013 to the first quarter of 2018. Similar data pertaining to the private rental market will be obtained through public records found through the United States Department of Housing and Urban Development. The datasets will be analyzed through statistical and mapping software to investigate the potential link between HCV households and distorted rent prices. Quantitative data will be supplemented by qualitative data to investigate the lived experience of Chicago residents. Qualitative data will be collected at community meetings in the Chicago Englewood neighborhood through participation in neighborhood meetings and informal interviews with residents and community leaders. The qualitative data will be used to gain insight into the lived experience of community leaders and residents of the Englewood neighborhood in relation to housing, the rental market, and HCV. While there is an abundance of quantitative data on this subject, this qualitative data is necessary to capture the lived experience of local residents affected by a changing rental market. This topic reflects concerns voiced by members of the Englewood community, and this study aims to keep the community relevant in its findings. Keywords: Chicago, housing, housing choice voucher program, housing subsidies, rental market
Procedia PDF Downloads 120
26916 The Dynamic Metadata Schema in Neutron and Photon Communities: A Case Study of X-Ray Photon Correlation Spectroscopy
Authors: Amir Tosson, Mohammad Reza, Christian Gutt
Abstract:
Metadata stands at the forefront of advancing data management practices within research communities, with particular significance in the realms of neutron and photon scattering. This paper introduces a groundbreaking approach—dynamic metadata schema—within the context of X-ray Photon Correlation Spectroscopy (XPCS). XPCS, a potent technique unravelling nanoscale dynamic processes, serves as an illustrative use case to demonstrate how dynamic metadata can revolutionize data acquisition, sharing, and analysis workflows. This paper explores the challenges encountered by the neutron and photon communities in navigating intricate data landscapes and highlights the prowess of dynamic metadata in addressing these hurdles. Our proposed approach empowers researchers to tailor metadata definitions to the evolving demands of experiments, thereby facilitating streamlined data integration, traceability, and collaborative exploration. Through tangible examples from the XPCS domain, we showcase how embracing dynamic metadata standards bestows advantages, enhancing data reproducibility, interoperability, and the diffusion of knowledge. Ultimately, this paper underscores the transformative potential of dynamic metadata, heralding a paradigm shift in data management within the neutron and photon research communities.Keywords: metadata, FAIR, data analysis, XPCS, IoT
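As a concrete illustration of what a dynamic metadata record might look like in practice, the sketch below builds a small, extensible XPCS metadata schema in Python. All field names, the XPCSMetadata class, and the validation logic are hypothetical choices made for illustration; they are not taken from the paper or from any existing community standard.

```python
from dataclasses import dataclass, field
from typing import Any, Dict

@dataclass
class XPCSMetadata:
    """Minimal dynamic metadata record: a fixed core plus user-registered fields."""
    # Core fields every XPCS run is assumed to carry (hypothetical choices).
    sample_name: str
    detector: str
    photon_energy_keV: float
    exposure_time_s: float
    # Dynamic part: extra fields registered per experiment at runtime.
    extra: Dict[str, Any] = field(default_factory=dict)
    _extra_schema: Dict[str, type] = field(default_factory=dict)

    def register_field(self, name: str, dtype: type) -> None:
        """Extend the schema on the fly, e.g. for a new sample environment."""
        self._extra_schema[name] = dtype

    def set(self, name: str, value: Any) -> None:
        """Set a dynamic field, validating against the registered type."""
        expected = self._extra_schema.get(name)
        if expected is None:
            raise KeyError(f"field '{name}' is not registered in the schema")
        if not isinstance(value, expected):
            raise TypeError(f"field '{name}' expects {expected.__name__}")
        self.extra[name] = value

# Example: tailor the schema to an experiment that adds a temperature ramp.
run = XPCSMetadata("silica_gel", "EIGER_4M", 8.1, 0.001)
run.register_field("temperature_K", float)
run.set("temperature_K", 295.0)
```

The design point is simply that the required core stays stable while experiment-specific fields can be added and validated at runtime, which is the behaviour the abstract attributes to a dynamic schema.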
Procedia PDF Downloads 65
26915 The Effects of the Parent Training Program for Obesity Reduction on Child Waist Circumference and Health Behaviors of Pre-School Children at the Samut-Songkhram Kindergarten School, Samut-Songkhram Province, Thailand
Authors: Muntanavadee Maytapattana
Abstract:
This research studies the effects of the Parent Training Program for Obesity Reduction (PTPOR) on the waist circumference and health behaviors of pre-school children at the Samut-Songkhram kindergarten school, Samut-Songkhram province, Thailand. The objective is to evaluate the effectiveness of the PTPOR on child waist circumference and the health behaviors of the pre-school children. The conceptual framework of this study is developed on the basis of Ecological Systems Theory (EST): not only do individual factors such as child characteristics and child risk factors contribute to the child’s weight status, but so do other factors such as parenting style and family characteristics, as well as community and demographic factors. This research is a quasi-experimental study. Participants were pre-school overweight and obese children and their parents. Forty-one parent-child dyads were recruited into the program. Parents participated in two sessions: an educational session and a group discussion session. A paired-samples t-test was used to determine the difference in the mean scores of the outcome variables of the children and parents. The results show a significant difference between the mean child waist circumference at baseline and at the end of the program at the 0.01 level (p = 0.001); the mean waist circumference decreased after the program. There was no significant difference between the mean child exercise behavior scores at baseline and at the end of the program at the 0.05 level; however, the mean child exercise behavior score increased after the program. Meanwhile, there was a significant difference between the mean child dietary behavior scores at baseline and at the end of the program at the 0.01 level (p = 0.001); the mean child dietary behavior score increased after the program. Keywords: PTPOR, child waist circumference, child health behaviors, pre-school children
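The before/after comparison described above rests on a paired-samples t-test. The short sketch below shows how such a comparison could be run in Python; the waist-circumference values are made-up placeholder numbers, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical baseline and post-program waist circumferences (cm) for the same children.
baseline = np.array([62.5, 60.1, 64.3, 61.8, 63.0, 65.2, 59.9, 62.2])
post     = np.array([61.0, 59.5, 62.8, 61.0, 62.1, 63.9, 59.2, 61.0])

# Paired-samples t-test: tests whether the mean within-child change differs from zero.
t_stat, p_value = stats.ttest_rel(baseline, post)
mean_change = (post - baseline).mean()
print(f"mean change = {mean_change:.2f} cm, t = {t_stat:.2f}, p = {p_value:.4f}")
```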
Procedia PDF Downloads 573
26914 Comparative Efficacy of Gas Phase Sanitizers for Inactivating Salmonella, Escherichia coli O157:H7 and Listeria monocytogenes on Intact Lettuce Heads
Authors: Kayla Murray, Andrew Green, Gopi Paliyath, Keith Warriner
Abstract:
Introduction: It is now acknowledged that control of human pathogens associated with fresh produce requires an integrated approach of several interventions, as opposed to relying on post-harvest washes to remove field-acquired contamination. To this end, current research is directed towards identifying such interventions that can be applied at different points in leafy green processing. Purpose: In the following, the efficacy of different gas phase treatments to decontaminate whole lettuce heads during pre-processing storage was evaluated. Methods: Whole Cos lettuce heads were spot inoculated with L. monocytogenes, E. coli O157:H7 or Salmonella spp. The inoculated lettuce heads were then placed in a treatment chamber and exposed to ozone, chlorine dioxide or hydroxyl radicals for different time periods under a range of relative humidity. Survivors of the treatments were enumerated, and sensory analysis was performed on the treated lettuce. Results: Ozone gas reduced L. monocytogenes by 2-log10 after ten minutes of exposure, with Salmonella and E. coli O157:H7 being decreased by 0.66 and 0.56-log cfu respectively. Chlorine dioxide gas treatment reduced L. monocytogenes and Salmonella on lettuce heads by 4 log cfu but only supported a 0.8 log cfu reduction in E. coli O157:H7 numbers. In comparison, hydroxyl radicals supported a 2.9 – 4.8 log cfu reduction of model human pathogens inoculated onto lettuce heads but required extended exposure times and relative humidity < 0.8. Significance: Of the gas phase sanitizers tested, chlorine dioxide and hydroxyl radicals are the most effective. The latter process holds the most promise based on the ease of delivery, worker safety and preservation of lettuce sensory characteristics. Although the exposure time for hydroxyl radicals was relatively long (24 h), this should not be considered a limitation given that the intervention is applied in store rooms or in transport containers during transit. Keywords: gas phase sanitizers, iceberg lettuce heads, leafy green processing
Procedia PDF Downloads 410
26913 Tibial Plateau Fractures During Covid-19 In A Trauma Unit. Impact of Lockdown and The Pressures on the Healthcare Provider
Authors: R. Gwynn, P. Panwalkar, K. Veravalli , M. Tofighi, R. Clement, A. Mofidi
Abstract:
The aim of this study was to assess the impact of Covid-19 and lockdown on the incidence, injury pattern, and treatment of tibial plateau fractures in a combined rural and urban population in Wales. Methods: A retrospective study was performed to identify tibial plateau fractures in the 15-month period of Covid-19 lockdown and the 15-month period immediately before lockdown. Patient demographics, injury mechanism, injury severity (based on the Schatzker classification), associated injuries, treatment methods, and the outcome of fractures in the Covid-19 period were studied. Results: The incidence of tibial plateau fracture was 9 per 100000 during Covid-19 and 8.5 per 100000 before, and both were similar to previous studies. The average age was 52, and the female to male ratio was 1:1 in both the control and study groups. High energy injury was seen in only 20% of the patients, versus 35% in the control group (χ²=12, p<0.025). 14% of the Covid-19 population sustained other injuries, as opposed to 16% in the control group (χ²=0.09, p>0.95). Lower severity isolated lateral condyle fractures (Schatzker 1-3) were seen in 40% of fractures; this was 60% in the control population. Higher-severity bicondylar and shaft fractures (Schatzker 5-6) were seen in 60% of the Covid-19 group and 35% in the control group (χ²=7.8, p<0.02). Treatment mode was not impacted by Covid-19. The complication rate was low in spite of the higher number of complex fractures and the impact of the Covid-19 pandemic. Conclusion: The associated injuries were similar in spite of a significantly lower-energy mechanism of injury. There were unexpectedly worse tibial plateau fractures, based on the Schatzker classification, in the Covid-19 period as compared to the control group. This was especially relevant for medial condyle and shaft fractures. This was postulated to be caused by a reduction in bone density resulting from lack of vitamin D and reduced activity. The treatment mode and outcome of care for tibial plateau fractures were not impacted by Covid-19. Keywords: Covid-19, knee, tibial plateau fracture, trauma
Procedia PDF Downloads 129
26912 An Empirical Analysis of Farmers Field Schools and Effect on Tomato Productivity in District Malakand Khyber Pakhtunkhwa-Pakistan
Authors: Mahmood Iqbal, Khalid Nawab, Tachibana Satoshi
Abstract:
Farmer Field School (FFS) constantly aims to assist farmers to determine and learn about field ecology and integrated crop management. The study was conducted to examine the change in productivity of the tomato crop in the study area, to determine the increase in per acre yield of the crop, and to find out the reduction in per acre input cost. A study of the tomato crop was conducted in ten villages, namely Jabban, Bijligar Colony, Palonow, Heroshah, Zara Maira, Deghar Ghar, Sidra Jour, Anar Thangi, Miangano Korona and Wartair of district Malakand. From each village 15 respondents were selected randomly on the basis of identical allocation, making a sample size of 150 respondents. The research was based on primary as well as secondary data. Primary data were collected from farmers, while secondary data were taken from the Agriculture Extension Department Dargai, District Malakand. An interview schedule was planned and each farmer was interviewed personally. The study was based on a comparison of the cost, yield and income of tomato before and after FFS. A paired t-test and the Statistical Package for Social Sciences (SPSS) were used for analysis; the outcomes of the study show that the integrated pest management project has brought a positive change in the attitude of farmers of the project area through the FFS approach. In district Malakand, 66.0% of the respondents were in the age group of 31-50 years; 11.3% of respondents had a primary level of education, 12.7% middle level, 28.7% matric level, 3.3% intermediate level and 2.0% graduate level of education, while 42.0% of respondents were illiterate and had no education. The average landholding size of farmers was 6.47 acres. The cost of seed, crop protection from insect pests and crop protection from diseases was reduced by Rs. 210.67, Rs. 2584.43 and Rs. 3044.16 respectively; the cost of fertilizers and cost of farmyard manure was increased by Rs. 1548.87 and Rs. 1151.40 respectively, while tomato yield was increased by 1585.03 kg/acre, from 7663.87 to 9248.90 kg/acre. The role of FFS, initiated by the integrated pest management project through the department of agriculture extension for the development of agriculture, was worth mentioning. It has brought enhancement in the crop yield of tomato and in farmers' income through the FFS approach. On the basis of the results of the research studies, the integrated pest management project should expand its developmental activities for maximum participation of the complete rural masses through the participatory FFS approach. Keywords: agriculture, Farmers field schools, extension education, tomato
Procedia PDF Downloads 617
26911 Exploring SSD Suitable Allocation Schemes Incompliance with Workload Patterns
Authors: Jae Young Park, Hwansu Jung, Jong Tae Kim
Abstract:
Whether the data has been well parallelized is an important factor in Solid-State-Drive (SSD) performance. SSD parallelization is affected by the allocation scheme, and it is directly connected to SSD performance. Representative allocation schemes include dynamic allocation and static allocation. Dynamic allocation is more adaptive in exploiting write operation parallelism, while static allocation is better for read operation parallelism. Therefore, it is hard to select the appropriate allocation scheme when the workload mixes read and write operations. We simulated conditions on a few mixed data patterns and analyzed the results to help make the right choice for better performance. The results show that if the data arrival interval is long enough for prior operations to finish and the environment is continuously read-intensive, static allocation is more suitable. Dynamic allocation performs best on write performance and random data patterns. Keywords: dynamic allocation, NAND flash based SSD, SSD parallelism, static allocation
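To make the contrast between the two schemes concrete, the toy sketch below assigns logical pages to flash channels under static and dynamic allocation. The channel count, the queue model, and the page numbers are illustrative assumptions, not the simulator used in the paper.

```python
from collections import defaultdict

NUM_CHANNELS = 4

def static_channel(lpn: int) -> int:
    # Static allocation: the channel is fixed by the logical page number,
    # so later reads of the same page always hit a predictable channel.
    return lpn % NUM_CHANNELS

def dynamic_channel(queue_depths: dict) -> int:
    # Dynamic allocation: each incoming write goes to the least-busy channel,
    # which maximises write parallelism but scatters pages less predictably for reads.
    return min(range(NUM_CHANNELS), key=lambda ch: queue_depths[ch])

# Tiny demonstration with a burst of writes to pages 0..9.
queues = defaultdict(int)
for lpn in range(10):
    ch_static = static_channel(lpn)
    ch_dynamic = dynamic_channel(queues)
    queues[ch_dynamic] += 1          # model the pending write on that channel
    print(f"page {lpn}: static -> ch{ch_static}, dynamic -> ch{ch_dynamic}")
```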
Procedia PDF Downloads 342
26910 Assessment of the Standard of Referrals for Extraction of Carious Primary Teeth under General Anaesthetic
Authors: Emma Carr, Jennifer Morrison, Peter Walker
Abstract:
Background: Due to COVID-19, there was a significant reduction in the number of children being treated under general anaesthetic (GA) within the health board, which led to a backlog of referrals. The referrals were being triaged and added to a waiting list in order of priority -determined by the information given. By implementing a checklist, it is anticipated that at least 70% of referrals will have the majority of the information required to effectively prioritise patients. The gold standard, as defined in ‘Guidelines For The Management Of Children Referred For Dental Extractions Under General Anaesthesia’, indicates that all referrals should mention: (i) Inability of the child to cooperate, (ii) Previously tried anxiety management techniques, (iii) Existence of psychological disorders, (iv) Presence of acute dental infection, (v) Requirement for extractions in multiple quadrants. Method: 130 referrals were examined over three months and compared to the recommended standard. A letter was emailed to referring dentists within Ayrshire & Arran outlining the recommended information to be included within the referral. The second round of data collection was then carried out, which involved an examination of 105 referrals. Results: The first round revealed that only 28% of referrals mentioned at least four defined standards outlined above. Following issuing a checklist to all dentists, this increased to 72%. Conclusion: As many of the children referred for extractions under GA have suffered pain and infection because of dental caries, it is important that delay of treatment is minimised, where possible. The implementation of a standardised checklist has enabled more effective prioritisation of patients.Keywords: caries, dentistry, general anaesthetic, paediatrics
Procedia PDF Downloads 111
26909 Social Data Aggregator and Locator of Knowledge (STALK)
Authors: Rashmi Raghunandan, Sanjana Shankar, Rakshitha K. Bhat
Abstract:
Social media contributes a vast amount of data and information about individuals to the internet. This project will greatly reduce the need for unnecessary manual analysis of large and diverse social media profiles by filtering out and combining the useful information from various social media profiles, eliminating irrelevant data. It differs from existing social media aggregators in that it does not provide a consolidated view of various profiles. Instead, it provides consolidated INFORMATION derived from the subject’s posts and other activities. It also allows analysis over multiple profiles and analytics based on several profiles. We strive to provide a query system that gives a natural language answer to questions when a user does not wish to go through the entire profile. The information provided can be filtered according to the different use cases it is used for. Keywords: social network, analysis, Facebook, Linkedin, git, big data
Procedia PDF Downloads 446
26908 Data Integrity between Ministry of Education and Private Schools in the United Arab Emirates
Authors: Rima Shishakly, Mervyn Misajon
Abstract:
Education is similar to other businesses and industries. Achieving data integrity is essential in order to provide significant support for all the stakeholders in the educational sector. Efficient data collection, flow, processing, storage and retrieval are vital in order to deliver successful solutions to the different stakeholders. The Ministry of Education (MOE) in the United Arab Emirates (UAE) has adopted ‘Education 2020’, a series of five-year plans designed to introduce advanced education management information systems. As part of this program, in 2010 the MOE implemented Student Information Systems (SIS) to manage and monitor the students’ data and information flow between the MOE and international private schools in the UAE. This paper discusses data integrity concerns between the MOE and private schools. The paper clarifies the data integrity issues and indicates the challenges that face private schools in the UAE. Keywords: education management information systems (EMIS), student information system (SIS), United Arab Emirates (UAE), ministry of education (MOE), (KHDA) the knowledge and human development authority, Abu Dhabi educational counsel (ADEC)
Procedia PDF Downloads 225
26907 Towards a Balancing Medical Database by Using the Least Mean Square Algorithm
Authors: Kamel Belammi, Houria Fatrim
Abstract:
An imbalanced data set, a problem often found in real-world applications, can have a seriously negative effect on the classification performance of machine learning algorithms. There have been many attempts at dealing with the classification of imbalanced data sets. In medical diagnosis classification, we often face an imbalanced number of data samples between the classes, with too few samples in the rare classes. In this paper, we propose a learning method based on a cost-sensitive extension of the Least Mean Square (LMS) algorithm that penalizes the errors of different samples with different weights, together with some rules of thumb to determine those weights. After the balancing phase, we apply different classifiers (support vector machine (SVM), k-nearest neighbor (KNN) and multilayer neural networks (MNN)) to the balanced data set. We have also compared the results obtained before and after the balancing method. Keywords: multilayer neural networks, k-nearest neighbor, support vector machine, imbalanced medical data, least mean square algorithm, diabetes
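The core idea here is an LMS update whose error term is weighted per class so that mistakes on the rare class cost more. It can be sketched as below; the inverse-class-frequency weighting is one plausible rule of thumb and is an assumption for illustration, not necessarily the exact rule used by the authors.

```python
import numpy as np

def cost_sensitive_lms(X, y, class_weights, lr=0.01, epochs=50):
    """Weighted LMS: w <- w + lr * c_i * (y_i - w.x_i) * x_i, with c_i a per-class cost."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x_i, y_i in zip(X, y):
            c_i = class_weights[int(y_i)]       # larger cost for the minority class
            error = y_i - w @ x_i
            w += lr * c_i * error * x_i
    return w

# Hypothetical imbalanced toy data: 95 majority (label 0) vs 5 minority (label 1) samples.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (95, 3)), rng.normal(2, 1, (5, 3))])
y = np.array([0] * 95 + [1] * 5)
weights = {0: 1.0, 1: 95 / 5}                   # inverse-frequency rule of thumb
w = cost_sensitive_lms(np.c_[X, np.ones(len(X))], y, weights)  # bias term appended
print("learned weights:", w)
```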
Procedia PDF Downloads 535
26906 Displacement Based Design of a Dual Structural System
Authors: Romel Cordova Shedan
Abstract:
Traditional seismic design follows the methodology of Force-Based Design (FBD). Displacement-Based Design (DBD) is a seismic design approach that considers structural damage in order to achieve a failure mechanism of the structure before collapse. It is easier to quantify the damage of a structure with displacements rather than forces. Therefore, for a structure to achieve an inelastic design displacement with good ductility, it is necessary for it to be damaged. The first part of this investigation covers the differences between the DBD and FBD methodologies, with some advantages of DBD. The second part presents a case study of a 5-story dual-system building, which is regular in plan and elevation. The building is located in a seismic zone where the acceleration on firm soil is 45% of the acceleration of gravity. Both methodologies are then applied to the case study to compare their displacements, shear forces and overturning moments. In the third part, a Dynamic Time History Analysis (DTHA) is carried out to compare the displacements obtained with the DBD and FBD methodologies. Three accelerograms were used, and the magnitude of the acceleration was scaled to be spectrum-compatible with the design spectrum. Then, using ASCE 41-13 guidelines, plastic hinges were assigned to the structure. Finally, the results of both methodologies for the case study are compared. It is important to take into account that the seismic performance level of the building for DBD is greater than for the FBD method, since the DBD drifts are on the order of 2.0% to 2.5%, compared with FBD drifts of 0.7%. Therefore, the displacements of DBD are greater than those of the FBD method. The shear forces of DBD are also greater than those of the FBD methodology. These strengths of the DBD method ensure that the structure achieves the design inelastic displacements, because those strengths were obtained through a displacement spectrum reduction factor which depends on the damping and ductility of the dual system. Also, the displacements of the case study for DBD are greater than for FBD and DTHA. This proves that the seismic performance level of the building for DBD is greater than for the FBD method, the DBD drifts being on the order of 2.0% to 2.5% compared with the small FBD drifts of 0.7%. Keywords: displacement-based design, displacement spectrum reduction factor, dynamic time history analysis, forced based design
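For readers unfamiliar with the displacement spectrum reduction factor mentioned above, one commonly used form (following Priestley's direct displacement-based design literature and the Eurocode 8 damping modifier) is sketched below. The specific expressions adopted in this study may differ, so these should be read as an illustrative assumption rather than the authors' exact formulation.

```latex
% Equivalent viscous damping for a reinforced-concrete frame (Priestley et al.),
% as a function of the displacement ductility \mu:
\xi_{eq} = 0.05 + 0.565\,\frac{\mu - 1}{\mu\,\pi}

% Damping-dependent reduction of the elastic displacement spectrum
% (EC8-type form, \xi expressed in percent of critical damping):
R_{\xi} = \sqrt{\frac{10}{5 + \xi}}, \qquad
\Delta(T,\xi) = R_{\xi}\,\Delta(T,5\%)
```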
Procedia PDF Downloads 229
26905 Convolutional Neural Network Based on Random Kernels for Analyzing Visual Imagery
Authors: Ja-Keoung Koo, Kensuke Nakamura, Hyohun Kim, Dongwha Shin, Yeonseok Kim, Ji-Su Ahn, Byung-Woo Hong
Abstract:
Machine learning techniques based on convolutional neural networks (CNNs) have been actively developed and successfully applied to a variety of image analysis tasks, including reconstruction, noise reduction, resolution enhancement, segmentation, motion estimation and object recognition. Classical visual information processing, ranging from low level tasks to high level ones, has been widely developed in the deep learning framework. It is generally considered a challenging problem to derive visual interpretation from high dimensional imagery data. A CNN is a class of feed-forward artificial neural networks that usually consists of deep layers whose connections are established by a series of non-linear operations. The CNN architecture is known to be shift invariant due to its shared weights and translation invariance characteristics. However, it is often computationally intractable to optimize the network, in particular with a large number of convolution layers, due to the large number of unknowns to be optimized with respect to a training set that generally needs to be large enough to effectively generalize the model under consideration. It is also necessary to limit the size of convolution kernels due to the computational expense, despite the recent development of effective parallel processing machinery, which leads to the use of consistently small convolution kernels throughout the deep CNN architecture. However, it is often desired to consider different scales in the analysis of visual features at different layers in the network. Thus, we propose a CNN model where different sizes of convolution kernels are applied at each layer based on random projection. We apply random filters of varying sizes and associate the filter responses with scalar weights that correspond to the standard deviation of the random filters. This allows us to use a large number of random filters at the cost of one scalar unknown per filter. The computational cost in the back-propagation procedure does not increase with the larger size of the filters, even though additional computational cost is required for the convolutions in the feed-forward procedure. The use of random kernels of varying sizes allows image features to be analyzed effectively at multiple scales, leading to better generalization. The robustness and effectiveness of the proposed CNN based on random kernels are demonstrated by numerical experiments, in which a quantitative comparison is performed between well-known CNN architectures and our models that simply replace the convolution kernels with random filters. The experimental results indicate that our model achieves better performance with fewer unknown weights. The proposed algorithm has high potential for application to a variety of visual tasks based on the CNN framework. Acknowledgement—This work was supported by the MISP (Ministry of Science and ICT), Korea, under the National Program for Excellence in SW (20170001000011001) supervised by IITP, and NRF-2014R1A2A1A11051941, NRF2017R1A2B4006023. Keywords: deep learning, convolutional neural network, random kernel, random projection, dimensionality reduction, object recognition
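A minimal sketch of the proposed idea, in which fixed random convolution filters of several sizes are each paired with a single trainable scalar weight, is given below in PyTorch. The filter sizes, the number of filters, and the way responses are combined are illustrative assumptions; only the scalars (and any downstream layers) would be optimized, so the filter weights themselves are excluded from back-propagation.

```python
import torch
import torch.nn as nn

class RandomKernelBlock(nn.Module):
    """Fixed random filters of mixed sizes; one trainable scalar per filter."""
    def __init__(self, in_ch=3, filters_per_size=8, sizes=(3, 5, 7)):
        super().__init__()
        self.convs = nn.ModuleList()
        for k in sizes:
            conv = nn.Conv2d(in_ch, filters_per_size, kernel_size=k,
                             padding=k // 2, bias=False)
            conv.weight.requires_grad_(False)   # random kernels stay frozen
            self.convs.append(conv)
        # One scalar unknown per random filter (trainable).
        self.scales = nn.Parameter(torch.ones(len(sizes) * filters_per_size))

    def forward(self, x):
        responses = torch.cat([conv(x) for conv in self.convs], dim=1)
        return responses * self.scales.view(1, -1, 1, 1)

# Quick shape check on a dummy batch of 32x32 RGB images.
block = RandomKernelBlock()
out = block(torch.randn(4, 3, 32, 32))
print(out.shape)   # torch.Size([4, 24, 32, 32])
```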
Procedia PDF Downloads 292
26904 Wealth Creation and Agricultural Development in Nigeria: A Path to Sustainable Prosperity
Authors: Oladimeji Israel Ajayi
Abstract:
Agricultural development has long been identified as a cornerstone for wealth creation and economic growth, particularly in emerging economies like Nigeria. This study examines the relationship between agricultural development and wealth creation in Nigeria, emphasizing the sector's potential in reducing poverty, creating employment, and boosting economic stability. Nigeria, endowed with fertile land and a favorable climate, has a significant agricultural base that, if fully leveraged, can transition the economy from oil dependency to a diversified and sustainable growth model. However, challenges such as limited access to credit, poor infrastructure, outdated farming techniques, and climate variability hinder optimal productivity. This research employs a mixed-methods approach, analyzing data from the Nigerian National Bureau of Statistics and the Food and Agriculture Organization to understand how investments in agriculture influence wealth indicators such as GDP growth, employment rates, and rural income levels. The findings reveal a strong positive correlation between agricultural investment and wealth creation, suggesting that strategic policies focusing on mechanization, credit accessibility, and sustainable practices could significantly boost agricultural productivity and contribute to wealth distribution in Nigeria. This study contributes to policy discourse by highlighting agriculture’s role as a transformative tool for economic resilience and sustainable wealth creation in Nigeria.Keywords: agricultural development, poverty reduction, wealth creation, prosperity
Procedia PDF Downloads 18
26903 Insight into the Electrocatalytic Activities of Nitrogen-Doped Graphyne and Graphdiyne Families: A First-Principles Study
Authors: Bikram K. Das, Kalyan K. Chattopadhyay
Abstract:
The advent of 2-D materials in the last decade has induced a fresh spur of growth in fuel cell technology, as these materials have some highly promising traits that can be exploited to facilitate the Oxygen Reduction Reaction (ORR) in an efficient way. Among the various 2-D carbon materials, graphyne (Gy) and graphdiyne (Gdy) [1], with their intrinsic non-uniform charge distribution, hold promise for this purpose, and it is expected [2] that substitutional nitrogen (N) doping could further enhance their efficiency. In this regard, dispersive force corrected density functional theory is used to map the oxygen reduction reaction (ORR) kinetics of five different kinds of N doped graphyne systems and graphdiyne (namely αGy, βGy, γGy, RGy and 6,6,12Gy, and Gdy) in alkaline medium. The best doping site for each Gy/Gdy system is determined by comparing the formation energies of the possible doping configurations. Similarly, the best di-oxygen (O₂) adsorption sites for the doped systems are identified by comparing the adsorption energies. O₂ adsorption on all N doped Gy/Gdy systems is found to be energetically favorable. ORR on a catalyst surface may occur either via the Eley-Rideal (ER) or the Langmuir–Hinshelwood (LH) pathway. Systematic studies performed on the considered systems reveal that all of them favor the ER pathway. Further, depending on the nature of di-oxygen adsorption, ORR can follow either an associative or a dissociative mechanism; the possibility of occurrence of both mechanisms is tested thoroughly for each N doped Gy/Gdy. For the ORR process, all the Gy/Gdy systems are observed to prefer the efficient four-electron pathway, but the expected monotonically exothermic reaction pathway is found only for N doped 6,6,12Gy and RGy following the associative pathway and for N doped βGy, γGy and Gdy following the dissociative pathway. Further computation performed for these systems reveals that for N doped 6,6,12Gy, RGy, βGy, γGy and Gdy the overpotentials are 1.08 V, 0.94 V, 1.17 V, 1.21 V and 1.04 V respectively, indicating that N doped RGy is the most promising material among the considered ones for carrying out ORR in alkaline medium. The stability of the ORR intermediate states with variation of pH and electrode potential is further explored with Pourbaix diagrams, and the activities of these systems in the alkaline medium are compared with previously reported B/N doped identical systems for ORR in an acidic medium in terms of a common descriptor. Keywords: graphdiyne, graphyne, nitrogen-doped, ORR
Procedia PDF Downloads 130
26902 Data Protection, Data Privacy, Research Ethics in Policy Process Towards Effective Urban Planning Practice for Smart Cities
Authors: Eugenio Ferrer Santiago
Abstract:
The growing complexities of the modern world, from high-end gadgets, software applications, scams, and identity theft to Artificial Intelligence (AI), make the “uninformed” weak and vulnerable to becoming victims of cybercrime. Artificial Intelligence is not a new thing in our daily lives; the principles of database management, logical programming, and garbage in, garbage out are all connected to AI. The Philippines has put in place legal safeguards against the abuse of cyberspace, but self-regulation by key industry players and self-protection by individuals are essential to the success of these initiatives. Data protection, data privacy, and research ethics must work hand in hand during the policy process in the course of urban planning practice in different environments. This paper focuses on the interconnection of data protection, data privacy, and research ethics in coming up with clear-cut policies against perpetrators in urban planning professional practice, relevant to sustainable communities and smart cities. This paper uses an expository methodology under qualitative research, drawing on secondary data from related literature, interviews/blogs, and World Wide Web resources. The claims and recommendations of this paper will help policymakers and implementers in the policy cycle. This paper contributes to the body of knowledge as a simple treatise and communication channel to the reading community and future researchers, to validate the claims and start an intellectual discourse for better knowledge generation for the good of all in the near future. Keywords: data privacy, data protection, urban planning, research ethics
Procedia PDF Downloads 62
26901 Review of the Road Crash Data Availability in Iraq
Authors: Abeer K. Jameel, Harry Evdorides
Abstract:
Iraq is a middle-income country where road crashes are considered one of the leading causes of death. To control the road risk issue, the Iraqi Ministry of Planning, General Statistical Organization, started to organise a collection system for traffic accident data with details related to their causes and severity. These data are published as an annual report. In this paper, a review of the available crash data in Iraq is presented. The available data represent the rate of accidents at an aggregated level, classified according to crash type, road user details, crash severity, type of vehicle, causes and number of casualties. The review is organised according to the types of models used in road safety studies and research, and according to the road safety data required for road construction tasks. The available data are also compared with the road safety dataset published in the United Kingdom as an example of a developed country. It is concluded that the data in Iraq are suitable for descriptive and exploratory models, aggregated-level comparison analysis, and evaluating and monitoring the progress of overall traffic safety performance. However, important traffic safety studies require data at a disaggregated level and details related to the factors affecting the likelihood of traffic crashes. Some studies require spatial geographic details, such as the location of accidents, which is essential in ranking roads according to their level of safety and naming the most dangerous roads in Iraq, which in turn requires a tactical plan to control this issue. Global road safety agencies interested in solving this problem in low- and middle-income countries have designed road safety assessment methodologies which are based on road attribute data only. Therefore, this research recommends using one of these methodologies. Keywords: road safety, Iraq, crash data, road risk assessment, The International Road Assessment Program (iRAP)
Procedia PDF Downloads 257
26900 Assessment of On-Site Solar and Wind Energy at a Manufacturing Facility in Ireland
Authors: A. Sgobba, C. Meskell
Abstract:
The feasibility of on-site electricity production from solar and wind and the resulting load management for a specific manufacturing plant in Ireland are assessed. The industry sector accounts directly and indirectly for a high percentage of electricity consumption and global greenhouse gas emissions; therefore, it will play a key role in emission reduction and control. Manufacturing plants, in particular, are often located in non-residential areas since they require open spaces for production machinery, parking facilities for the employees, appropriate routes for supply and delivery, special connections to the national grid and other environmental impacts. Since they have larger spaces compared to commercial sites in urban areas, they represent an appropriate case study for evaluating the technical and economic viability of energy system integration with low power density technologies, such as solar and wind, for on-site electricity generation. The available open space surrounding the analysed manufacturing plant can be efficiently used to produce a discrete quantity of energy, instantaneously and locally consumed. Therefore, transmission and distribution losses can be reduced. The usage of storage is not required due to the high and almost constant electricity consumption profile. The energy load of the plant is identified through the analysis of gas and electricity consumption, both internally monitored and reported on the bills. These data are not often recorded and available to third parties since manufacturing companies usually keep track only of the overall energy expenditures. The solar potential is modelled for a period of 21 years based on global horizontal irradiation data; the hourly direct and diffuse radiation and the energy produced by the system at the optimum pitch angle are calculated. The model is validated using the PVWatts and SAM tools. Wind speed data are available for the same period at a one-hour step at a height of 10 m. Since the hub of a typical wind turbine reaches a higher altitude, complementary data for a different location at 50 m have been compared, and a model for the estimate of wind speed at the required height in the right location is defined. The Weibull statistical distribution is used to evaluate the wind energy potential of the site. The results show that solar and wind energy are, as expected, generally decoupled. Based on the real case study, the percentage of load covered every hour by on-site generation (Level of Autonomy, LA) and the resulting electricity bought from the grid (Expected Energy Not Supplied, EENS) are calculated. The economic viability of the project is assessed through Net Present Value (NPV), and the influence the main technical and economic parameters have on NPV is presented. Since the results show that the analysed renewable sources cannot provide enough electricity, the integration with a cogeneration technology is studied. Finally, the benefit to energy system integration of wind, solar and a cogeneration technology is evaluated and discussed. Keywords: demand, energy system integration, load, manufacturing, national grid, renewable energy sources
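The Weibull-based wind resource estimate referred to above can be reproduced in outline as follows. The wind-speed sample, rotor area, and efficiency figure are placeholder assumptions; only the fitting step and the mean-power formula reflect the standard Weibull approach.

```python
import numpy as np
from scipy import stats
from scipy.special import gamma

# Hypothetical hourly wind speeds at hub height (m/s); real data would come from the site.
speeds = stats.weibull_min.rvs(c=2.0, scale=7.0, size=8760, random_state=42)

# Fit a two-parameter Weibull distribution (location fixed at zero).
k, _, c = stats.weibull_min.fit(speeds, floc=0)

# Mean wind power density: 0.5 * rho * E[v^3], with E[v^3] = c^3 * Gamma(1 + 3/k).
rho = 1.225                                   # air density, kg/m^3
mean_power_density = 0.5 * rho * c**3 * gamma(1 + 3 / k)   # W/m^2

rotor_area = 1520.0                           # m^2, assumed swept area
overall_efficiency = 0.35                     # assumed overall turbine efficiency
annual_energy_kwh = mean_power_density * rotor_area * overall_efficiency * 8760 / 1000
print(f"k={k:.2f}, c={c:.2f} m/s, ~{annual_energy_kwh:,.0f} kWh/year")
```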
Procedia PDF Downloads 131
26899 Eliciting and Confirming Data, Information, Knowledge and Wisdom in a Specialist Health Care Setting - The Wicked Method
Authors: Sinead Impey, Damon Berry, Selma Furtado, Miriam Galvin, Loretto Grogan, Orla Hardiman, Lucy Hederman, Mark Heverin, Vincent Wade, Linda Douris, Declan O'Sullivan, Gaye Stephens
Abstract:
Healthcare is a knowledge-rich environment. This knowledge, while valuable, is not always accessible outside the borders of individual clinics. This research aims to address part of this problem (at a study site) by constructing a maximal data set (knowledge artefact) for motor neurone disease (MND). This data set is proposed as an initial knowledge base for a concurrent project to develop an MND patient data platform. It represents the domain knowledge at the study site for the duration of the research (12 months). A knowledge elicitation method was also developed from the lessons learned during this process - the WICKED method. WICKED is an anagram of the words: eliciting and confirming data, information, knowledge, wisdom. But it is also a reference to the concept of wicked problems, which are complex and challenging, as is eliciting expert knowledge. The method was evaluated at a second site, and benefits and limitations were noted. Benefits include that the method provided a systematic way to manage data, information, knowledge and wisdom (DIKW) from various sources, including healthcare specialists and existing data sets. Limitations surrounded the time required and how the data set produced only represents DIKW known during the research period. Future work is underway to address these limitations.Keywords: healthcare, knowledge acquisition, maximal data sets, action design science
Procedia PDF Downloads 369
26898 Tool for Metadata Extraction and Content Packaging as Endorsed in OAIS Framework
Authors: Payal Abichandani, Rishi Prakash, Paras Nath Barwal, B. K. Murthy
Abstract:
Information generated from various computerization processes is a potentially rich source of knowledge for its designated community. Passing this information from generation to generation without modifying its meaning is a challenging activity. To preserve and archive the data for future generations, it is essential to prove the authenticity of the data. This can be achieved by extracting metadata from the data, which can prove the authenticity of and create trust in the archived data. A subsequent challenge is technology obsolescence. Metadata extraction and standardization can be effectively used to resolve and tackle this problem. Metadata can broadly be categorized at two levels, i.e., technical and domain level. Technical metadata provide the information that can be used to understand and interpret the data record, but this level of metadata alone is not sufficient to create trustworthiness. We have developed a tool which extracts and standardizes both the technical and the domain-level metadata. This paper describes the different features of the tool and how we have developed it. Keywords: digital preservation, metadata, OAIS, PDI, XML
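To indicate what technical-level metadata extraction might involve, the sketch below gathers basic fixity and file-system metadata for a single file and serializes it as XML. The element names and the choice of SHA-256 are assumptions made for illustration, not the tool's actual schema.

```python
import hashlib
import os
import xml.etree.ElementTree as ET

def technical_metadata(path: str) -> ET.Element:
    """Collect basic technical metadata (size, timestamp, checksum) for one file."""
    stat = os.stat(path)
    with open(path, "rb") as fh:
        digest = hashlib.sha256(fh.read()).hexdigest()

    root = ET.Element("technicalMetadata")
    ET.SubElement(root, "fileName").text = os.path.basename(path)
    ET.SubElement(root, "sizeBytes").text = str(stat.st_size)
    ET.SubElement(root, "lastModified").text = str(int(stat.st_mtime))
    checksum = ET.SubElement(root, "checksum", algorithm="SHA-256")
    checksum.text = digest
    return root

# Example usage: write the record for this script itself.
record = technical_metadata(__file__)
ET.ElementTree(record).write("metadata.xml", encoding="utf-8", xml_declaration=True)
```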
Procedia PDF Downloads 395
26897 The Trigger-DAQ System in the Mu2e Experiment
Authors: Antonio Gioiosa, Simone Doanti, Eric Flumerfelt, Luca Morescalchi, Elena Pedreschi, Gianantonio Pezzullo, Ryan A. Rivera, Franco Spinella
Abstract:
The Mu2e experiment at Fermilab aims to measure the charged-lepton flavour violating neutrino-less conversion of a negative muon into an electron in the field of an aluminum nucleus. With the expected experimental sensitivity, Mu2e will improve the previous limit by four orders of magnitude. The Mu2e data acquisition (DAQ) system provides hardware and software to collect digitized data from the tracker, calorimeter, cosmic ray veto, and beam monitoring systems. Mu2e’s trigger and data acquisition system (TDAQ) uses otsdaq as its solution. Developed at Fermilab, otsdaq uses the artdaq DAQ framework and art analysis framework, under the hood, for event transfer, filtering, and processing. Otsdaq is an online DAQ software suite with a focus on flexibility and scalability, while providing a multi-user, web-based interface accessible through the Chrome or Firefox web browser. The detector readout controllers (ROC) of the tracker and calorimeter stream out zero-suppressed data continuously to the data transfer controller (DTC). Data are then read over the PCIe bus by a software filter algorithm that selects events, which are finally combined with the data flux that comes from the cosmic ray veto system (CRV). Keywords: trigger, daq, mu2e, Fermilab
Procedia PDF Downloads 158
26896 An Improved Parallel Algorithm of Decision Tree
Authors: Jiameng Wang, Yunfei Yin, Xiyu Deng
Abstract:
Parallel optimization is one of the important research topics of data mining at this stage. Taking Classification and Regression Tree (CART) parallelization as an example, this paper proposes a parallel data mining algorithm based on SSP-OGini-PCCP. Aiming at the problem of choosing the best CART segmentation point, this paper designs an S-SP model without data association; and in order to calculate the Gini index efficiently, a parallel OGini calculation method is designed. In addition, in order to improve the efficiency of the pruning algorithm, a synchronous PCCP pruning strategy is proposed in this paper. In this paper, the optimal segmentation calculation, Gini index calculation, and pruning algorithm are studied in depth. These are important components of parallel data mining. By constructing a distributed cluster simulation system based on SPARK, data mining methods based on SSP-OGini-PCCP are tested. Experimental results show that this method can increase the search efficiency of the best segmentation point by an average of 89%, increase the search efficiency of the Gini segmentation index by 3853%, and increase the pruning efficiency by 146% on average; and as the size of the data set increases, the performance of the algorithm remains stable, which meets the requirements of contemporary massive data processing.Keywords: classification, Gini index, parallel data mining, pruning ahead
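Since the Gini computation is central to the OGini component described above, a plain (non-parallel) reference version of the CART split criterion is sketched below; the parallel variant in the paper distributes exactly this per-split calculation, and the data and thresholds here are purely illustrative.

```python
import numpy as np

def gini(labels: np.ndarray) -> float:
    """Gini impurity of a label array: 1 - sum_k p_k^2."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def split_gini(feature: np.ndarray, labels: np.ndarray, threshold: float) -> float:
    """Weighted Gini impurity of splitting `feature` at `threshold` (CART criterion)."""
    left = labels[feature <= threshold]
    right = labels[feature > threshold]
    n = len(labels)
    return len(left) / n * gini(left) + len(right) / n * gini(right)

# Toy example: the best threshold is the one minimising the weighted impurity.
x = np.array([2.1, 3.5, 1.0, 4.2, 3.9, 0.5])
y = np.array([0,   1,   0,   1,   1,   0])
candidates = np.unique(x)[:-1]          # exclude the maximum value
best = min(candidates, key=lambda t: split_gini(x, y, t))
print("best split threshold:", best)
```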
Procedia PDF Downloads 125
26895 Integration of a Microbial Electrolysis Cell and an Oxy-Combustion Boiler
Authors: Ruth Diego, Luis M. Romeo, Antonio Morán
Abstract:
In the present work, a study of the coupling of a bioelectrochemical system with an oxy-combustion boiler is carried out; specifically, it proposes to connect the combustion gas outlet of a boiler with a microbial electrolysis cell (MEC) where the CO2 from the gases is transformed into methane in the cathode chamber, and the oxygen produced in the anode chamber is recirculated to the oxy-combustion boiler. The MEC mainly consists of two electrodes (anode and cathode) immersed in an aqueous electrolyte; these electrodes are separated by a proton exchange membrane (PEM). In this case, the anode is abiotic (where oxygen is produced), and it is at the cathode that an electroactive biofilm is formed, with microorganisms that catalyze the CO2 reduction reactions. Real data from an oxy-combustion process in a boiler of around 20 thermal MW have been used for this study and are combined with data obtained on a smaller scale (laboratory-pilot scale) to determine the yields that could be obtained, considering the system as environmentally sustainable energy storage. In this way, an attempt is made to integrate a relatively conventional energy production system (oxy-combustion) with a biological system (microbial electrolysis cell), which is a challenge to be addressed in this type of new hybrid scheme. A novel concept is thus presented, with the basic dimensioning of the necessary equipment and the efficiency of the global process. In this work, it has been calculated that the efficiency of this power-to-gas system based on MEC cells, when coupled to industrial processes, is of the same order of magnitude as the most promising equivalent routes. The proposed process has two main limitations: the overpotentials at the electrodes, which penalize the overall efficiency, and the need for storage tanks for the process gases. The results of the calculations carried out in this work show that certain real potentials achieve an acceptable performance. Regarding the tanks, with adequate dimensioning, it is possible to achieve complete autonomy. The proposed system, called OxyMES, provides energy storage without energetically penalizing the process when compared to an oxy-combustion plant with conventional CO2 capture. According to the results obtained, this system can be applied as a measure to decarbonize an industry, changing the original fuel of the oxy-combustion boiler to the biogas generated in the MEC cell. It could also be used to neutralize CO2 emissions from industry by converting them to methane and then injecting it into the natural gas grid. Keywords: microbial electrolysis cells, oxy-combustion, co2, power-to-gas
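For orientation, the electrode reactions commonly assumed for this kind of CO2-to-methane MEC are shown below. The abstract does not spell them out, so they are included as standard background rather than as the authors' exact reaction scheme.

```latex
% Cathode (biocathode, electromethanogenesis):
\mathrm{CO_2 + 8\,H^+ + 8\,e^- \longrightarrow CH_4 + 2\,H_2O}

% Anode (abiotic water oxidation; the O2 is recirculated to the oxy-fuel boiler):
\mathrm{2\,H_2O \longrightarrow O_2 + 4\,H^+ + 4\,e^-}

% Overall (per mole of methane, two anodic turnovers):
\mathrm{CO_2 + 2\,H_2O \longrightarrow CH_4 + 2\,O_2}
```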
Procedia PDF Downloads 111