Search results for: training standards
3688 High-Fidelity Materials Screening with a Multi-Fidelity Graph Neural Network and Semi-Supervised Learning
Authors: Akeel A. Shah, Tong Zhang
Abstract:
Computational approaches to learning the properties of materials are commonplace, motivated by the need to screen or design materials for a given application, e.g., semiconductors and energy storage. Experimental approaches can be both time-consuming and costly. Unfortunately, computational approaches such as ab-initio electronic structure calculations and classical or ab-initio molecular dynamics can themselves be too slow for the rapid evaluation of materials, which often involves thousands to hundreds of thousands of candidates. Machine-learning-assisted approaches have been developed to overcome the time limitations of purely physics-based approaches. These approaches, on the other hand, require large volumes of data for training (hundreds of thousands of examples on many standard data sets such as QM7b). This means that they are limited by how quickly such a large data set of physics-based simulations can be established. At high fidelity, such as configuration interaction, composite methods such as G4, and coupled cluster theory, gathering such a large data set can become infeasible, which can compromise the accuracy of the predictions; many applications require high accuracy, for example, band structures and energy levels in semiconductor materials and the energetics of charge transfer in energy storage materials. In order to circumvent this problem, multi-fidelity approaches can be adopted, for example, the Δ-ML method, which learns a high-fidelity output from a low-fidelity result such as Hartree-Fock or density functional theory (DFT).
The general strategy is to learn a map between the low- and high-fidelity outputs, so that the high-fidelity output is obtained as a simple sum of the physics-based low-fidelity result and a correction. Although this requires a low-fidelity calculation, it typically requires far fewer high-fidelity results to learn the correction map; furthermore, the low-fidelity result, such as Hartree-Fock or semi-empirical ZINDO, is typically quick to obtain. For high-fidelity outputs, the result can be a speed-up of an order of magnitude or more. In this work, a new multi-fidelity approach is developed, based on a graph convolutional network (GCN) combined with semi-supervised learning. The GCN allows the material or molecule to be represented as a graph, which is known to improve accuracy, as in, for example, SchNet and MEGNet. The graph incorporates information regarding the numbers, types, and properties of atoms; the types of bonds; and bond angles. The key to the accuracy of multi-fidelity methods, however, is the incorporation of the low-fidelity output to learn the high-fidelity equivalent, in this case by learning their difference. Semi-supervised learning is employed to allow for different numbers of low- and high-fidelity training points, by using an additional GCN-based low-fidelity map to predict high-fidelity outputs. It is shown on four different data sets that a significant (at least one order of magnitude) increase in accuracy is obtained, using one to two orders of magnitude fewer low- and high-fidelity training points. One of the data sets is developed in this work, pertaining to 1000 simulations of quinone molecules (up to 24 atoms) at 5 different levels of fidelity, furnishing the energy, dipole moment, and HOMO/LUMO.
Keywords: materials screening, computational materials, machine learning, multi-fidelity, graph convolutional network, semi-supervised learning
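The Δ-ML idea described above, learning the low-to-high-fidelity *correction* rather than the high-fidelity output itself, can be illustrated with a minimal sketch. The descriptors, toy energies, and least-squares regressor below are illustrative stand-ins, not the paper's GCN or quinone data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy surrogate data: x plays the role of molecular descriptors; the
# low-fidelity energy is cheap but biased, the high-fidelity one is the target.
n = 200
x = rng.uniform(0.0, 1.0, size=(n, 3))
e_low = x @ np.array([1.0, -2.0, 0.5]) + 0.3 * np.sin(5 * x[:, 0])
e_high = e_low + 0.8 * x[:, 1] ** 2 + 0.1   # systematic correction to learn

# Delta-ML: fit the *difference* e_high - e_low, not e_high directly.
delta = e_high - e_low
phi = np.column_stack([np.ones(n), x, x**2])  # simple features in place of a GCN
coef, *_ = np.linalg.lstsq(phi, delta, rcond=None)

def predict_high(x_new, e_low_new):
    """High-fidelity estimate = cheap low-fidelity result + learned correction."""
    phi_new = np.column_stack([np.ones(len(x_new)), x_new, x_new**2])
    return e_low_new + phi_new @ coef
```

Because only the (often smoother) correction must be learned, far fewer high-fidelity training points are needed than for learning the high-fidelity output from scratch.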
Procedia PDF Downloads 41
3687 Constructing a Semi-Supervised Model for Network Intrusion Detection
Authors: Tigabu Dagne Akal
Abstract:
While advances in computer and communications technology have made the network ubiquitous, they have also rendered networked systems vulnerable to malicious attacks devised from a distance. These attacks or intrusions start with attackers infiltrating a network through a vulnerable host and then launching further attacks on the local network or intranet. Nowadays, system administrators and network professionals can attempt to prevent such attacks by developing intrusion detection tools and systems using data mining technology. In this study, the experiments were conducted following the Knowledge Discovery in Databases process model, which starts with the selection of datasets. The dataset used in this study was taken from the Massachusetts Institute of Technology Lincoln Laboratory. The data were then pre-processed. The major pre-processing activities included filling in missing values, removing outliers, resolving inconsistencies, integrating labelled and unlabelled datasets, dimensionality reduction, size reduction, and data transformation tasks such as discretization. A total of 21,533 intrusion records were used for training the models, and a separate set of 3,397 records was used as a testing set to validate the performance of the selected model. For building a predictive model for intrusion detection, the J48 decision tree and Naïve Bayes algorithms were tested as classification approaches, both with and without feature selection. The model created using 10-fold cross-validation with the J48 decision tree algorithm and its default parameter values showed the best classification accuracy: a prediction accuracy of 96.11% on the training dataset and 93.2% on the test dataset for classifying new instances into the normal, DOS, U2R, R2L, and probe classes.
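As a rough sketch of the modelling step, a decision tree evaluated with 10-fold cross-validation can be set up as below. This uses synthetic data and scikit-learn's CART tree as a stand-in for Weka's J48 (a C4.5 implementation), so it illustrates the procedure rather than reproducing the study:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the labelled intrusion records (the study used
# 21,533 MIT Lincoln Laboratory records with classes normal, DOS, U2R,
# R2L, and probe - here just 5 abstract classes).
X, y = make_classification(n_samples=2000, n_features=20,
                           n_informative=8, n_classes=5,
                           n_clusters_per_class=1, random_state=42)

# Decision tree with default parameters, evaluated by 10-fold CV,
# mirroring the evaluation protocol described in the abstract.
clf = DecisionTreeClassifier(random_state=42)
scores = cross_val_score(clf, X, y, cv=10)
print(f"10-fold CV accuracy: {scores.mean():.3f}")
```

The same loop would be repeated for a Naïve Bayes classifier and for feature-selected variants to compare models, as the study does.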
The findings of this study show that data mining methods generate interesting rules that are crucial for intrusion detection and prevention in the networking industry. Future research directions are suggested towards developing an applicable system in the area of study.
Keywords: intrusion detection, data mining, computer science
Procedia PDF Downloads 296
3686 MEAL Project - Modifying Eating Attitudes and Actions through Learning
Authors: E. Oliver, A. Cebolla, A. Dominguez, A. Gonzalez-Segura, E. de la Cruz, S. Albertini, L. Ferrini, K. Kronika, T. Nilsen, R. Baños
Abstract:
The main objective of MEAL is to develop a pedagogical tool aimed at helping teachers and nutritionists (students and professionals) to acquire, train, promote, and deliver to children basic nutritional education and healthy eating behaviour competencies. MEAL focuses on eating behaviours and not only on nutritional literacy, and will use new technologies such as Information and Communication Technologies (ICTs) and serious games (SG) platforms to consolidate nutritional competences and habits.
Keywords: nutritional education, pedagogical ICT platform, serious games, training course
Procedia PDF Downloads 526
3685 Improving Productivity in a Glass Production Line through Applying Principles of Total Productive Maintenance (TPM)
Authors: Omar Bataineh
Abstract:
Total productive maintenance (TPM) is a principle-based method that aims to achieve high-level production with no breakdowns, no slow running, and no defects. Key principles of TPM were applied in this work to improve the performance of the glass production line at United Beverage Company in Kuwait, which produces bottles of soft drinks. Principles such as 5S as a foundation for TPM implementation, developing a program for equipment management, Cause and Effect Analysis (CEA), quality improvement, and training and education of employees were employed. After the completion of TPM implementation, it was possible to increase the Overall Equipment Effectiveness (OEE) from 23% to 40%.
Procedia PDF Downloads 337
3684 From a Top Sport Event to a Sporting Activity
Authors: Helge Rupprich, Elke Knisel
Abstract:
In a time of mediatization and reduced physical movement, it is important to turn passivity (akinesia) into physical activity to improve health. The approach is to encourage children, junior athletes, recreational athletes, and semi-professional athletes to do sports while attending a top sport event. The concept has the slogan: get out of your seat and move! A top sport event in a series of professional beach volleyball tournaments, with 330,000 live viewers, a cumulative reach of 13.70 million viewers, and 215.13 million advertising contacts, is used as the framework for different sports-didactic approaches, social-integrative approaches, and migration valuations. An important aim is to use the great radiant power of the top sport event to draw active participants from among its viewers. Even if the goal is to improve physical activity, it is necessary to differentiate between the didactic approaches. The first approach contains psychomotor exercises with children (N=158) between two and five years of age, used in the project 'largest sandbox of the city'. The second approach is social integration and promotion of activity among students (N=54) in the form of a student beach volleyball tournament. The third approach is activity in companies, based on the idea of health motivation of employees (N=62) in a big beach volleyball tournament. The fourth approach is to improve the sports leisure-time activities of recreational athletes (N=292) in different beach volleyball tournaments. The fifth approach is to build a foreigner-friendly measure, implemented in junior athlete training with the French and German junior national teams (N=16). The sixth approach is to give semi-professional athletes a tournament to develop their relation to an active life. The seventh approach is social integration for disadvantaged people (N=123) in the form of training with professional athletes. The top sport beach volleyball tournament had 80 athletes (N=80) and 34,000 viewers.
In sum, 785 athletes (N=785) did sports in 13 days. Over 34,000 viewers were counted in the first three days of the top sport event. The project was evaluated positively by the City of Dresden, the politics of Saxony, and the participants, and will be continued in Dresden and expanded for the 2015 season in Jena.
Keywords: beach volleyball, event, sports didactics, sports project
Procedia PDF Downloads 495
3683 Climate-Smart Agriculture Technologies and Determinants of Farmers’ Adoption Decisions in the Great Rift Valley of Ethiopia
Authors: Theodrose Sisay, Kindie Tesfaye, Mengistu Ketema, Nigussie Dechassa, Mezegebu Getnet
Abstract:
Agriculture is a sector that is very vulnerable to the effects of climate change and contributes to anthropogenic greenhouse gas (GHG) emissions in the atmosphere. At the same time, by lowering emissions and adjusting to the change, the sector can help to mitigate climate change. Utilizing Climate-Smart Agriculture (CSA) technologies that can sustainably boost productivity, improve resilience, and lower GHG emissions is therefore crucial. This study sought to identify the CSA technologies used by farmers and to assess adoption levels and the factors that influence them. In order to gather information from 384 smallholder farmers in the Great Rift Valley (GRV) of Ethiopia, a cross-sectional survey was carried out. Data were analysed using percentages, the chi-square test, the t-test, and a multivariate probit model. Results showed that crop diversification, agroforestry, and integrated soil fertility management were the most widely practiced technologies. The chi-square and t-test results confirmed significant and positive differences between adopters and non-adopters on various attributes: households whose heads were older and who had higher incomes, greater credit access, knowledge of the climate, better training, better education, larger farms, and more frequent interactions with extension specialists were positively and significantly associated with the adoption of CSA technologies. The model results showed that the age, sex, and education of the household head, farmland size, livestock ownership, income, access to credit, climate information, training, and extension contact influenced the selection of CSA technologies.
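The adopter vs. non-adopter comparisons described above rest on standard chi-square and independent t-tests. A minimal sketch with illustrative numbers (not the study's data):

```python
from scipy import stats

# Hypothetical contingency table: adoption of a CSA technology vs. access
# to credit (counts invented for illustration).
#                 credit   no credit
adopters     = [   120,       60  ]
non_adopters = [    70,      134  ]
chi2, p_chi, dof, _ = stats.chi2_contingency([adopters, non_adopters])

# Hypothetical ages of household heads in the two groups.
age_adopters     = [52, 47, 58, 61, 49, 55, 63, 50]
age_non_adopters = [38, 41, 35, 44, 39, 42, 37, 40]
t, p_t = stats.ttest_ind(age_adopters, age_non_adopters)

print(f"chi-square p = {p_chi:.4g}, t-test p = {p_t:.4g}")
```

With real survey data, one such test would be run per attribute (income, education, farm size, and so on) to build the adopter/non-adopter comparison table.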
Therefore, effective action must be taken to remove barriers to the adoption of CSA technologies, and taking these adoption factors into account in policy and practice is anticipated to support smallholder farmers in adapting to climate change while lowering emissions.
Keywords: climate change, climate-smart agriculture, smallholder farmers, multivariate probit model
Procedia PDF Downloads 127
3682 Wetland Community and Their Livelihood Opportunities in the Face of Changing Climatic Condition in Southwest Bangladesh
Authors: Mohsina Aktar, Bishawjit Mallick
Abstract:
Bangladesh faces multidimensional manifestations of climate change, e.g., floods, cyclones, sea level rise, drainage congestion, and salinity. This study aimed to find out the community's perception of the impact of climate change on their wetland-resource-based livelihood, to analyze their present livelihood scenario, and to identify the institutional setup required to strengthen it. The study therefore required both quantitative analysis, such as the quantification of wetland resources and occupations, and exploratory information, such as policy and institutional reform. For quantitative information, a 200-respondent questionnaire survey and, in some cases, an observation survey were used; for socially shareable qualitative and quantitative issues, case studies and focus group discussions were conducted. In-depth interviews were conducted for socially non-shareable qualitative issues. The overall findings of this study are presented in sequence: perception of climate change effects, the livelihood scenario, and the required institutional support for the wetland community. Flood was ranked first, while the cyclone effect is comparatively less disastrous in this area. Heavy rainfall comes after the cyclone. Female members responded almost identically about the ranking and effects of the frequently occurring and devastating effects of climate change. People are well aware of the impact of climate change. Training by CARE in the RVCC project helped to increase their knowledge level. If the level of education can be increased, people can fight against calamity and poverty with more confidence. People seem to overcome the problems of waterlogging, and thus, besides being involved in hydroponics (33.3%) as the prime occupation in the monsoon, they are also engaged in other business-related activities. January to May is the low-income season for the farmers, but some people, generally those above 45, do not want to change their traditional occupation.
The young earning members want to utilize the lean income period with an alternative occupation. People who do not have their own land and perform water transportation or other types of occupation are now interested in hydroponics. People who give their land on rent are now thinking about renting it out in the monsoon, as through that they can earn a sound amount rather than getting nothing. What they require is just seed, training, and capital. The present marketing system faces communication problems, so this sector needs to be developed. The involvement of women in income-earning activity is very low (5.1%), and 100% of the women are housewives. They became inferior due to their educational level and the dominance of their husbands. Only one NGO, BCAS (Bangladesh Centre for Advanced Studies), has been found to be engaged in providing training facilities and advocacy for this purpose. The Upazila agricultural extension office, like other government organizations, remains inactive in supporting the community in the extension and improvement of hydroponic agriculture. If the community gets proper support and inspiration, it can successfully fight the crises of low income and climate change with the hydroponic cultivation system.
Keywords: wetland community, hydroponics, climate change adaptation, livelihood
Procedia PDF Downloads 274
3681 Exploring a Cross-Sectional Analysis Defining Social Work Leadership Competencies in Social Work Education and Practice
Authors: Trevor Stephen, Joshua D. Aceves, David Guyer, Jona Jacobson
Abstract:
As a profession, social work has much to offer individuals, groups, and organizations. A multidisciplinary approach to understanding and solving complex challenges and a commitment to developing and training ethical practitioners outline the characteristics of a profession embedded with leadership skills. This presentation will give an overview of the historical context of social work leadership and examine social work as a unique leadership model composed of qualities and theories that inform effective leadership capability as it relates to our code of ethics. It will reflect critically on leadership theories and their foundational comparison, and finally look at recommendations for, and implementation in, social work education and practice. As with defining leadership in general, there is no universally accepted definition of social work leadership; however, some distinct traits and characteristics are essential. Recent studies help set the stage for this research proposal because they measure views on effective social work leadership among social work and non-social work leaders and followers. However, this research is interested in working backward from that approach and examining social workers' leadership preparedness perspectives based solely on social work training, competencies, values, and ethics. Social workers understand how to change complex structures and challenge resistance to change to improve the well-being of organizations and those they serve. Furthermore, previous studies align with the idea of practitioners assessing their skill and capacity to engage in leadership but not to lead. In addition, this research is significant because it explores aspiring social work leaders' competence to translate social work practice into direct leadership skills. The research question seeks to answer whether social work training and competencies are sufficient to determine whether social workers believe they possess the capacity and skill to engage in leadership practice.
Aim 1: Assess whether social workers have the capacity and skills to assume leadership roles. Aim 2: Evaluate whether the development of social workers is sufficient to define leadership. This research intends to reframe the misconception that social workers do not possess the capacity and skills to be effective leaders. On the contrary, social work encompasses a framework dedicated to lifelong development and growth. Social workers must be skilled, competent, ethical, supportive, and empathic. These are all qualities and traits of effective leadership, whereby leaders are in relation with others and embody partnership and collaboration with followers and stakeholders. The proposed study is a cross-sectional, quasi-experimental survey design that will include the distribution of a multi-level social work leadership model and assessment tool. The assessment tool aims to help define leadership in social work using a Likert-scale model. A cross-sectional research design is appropriate for answering the research questions because the measurement survey will help gather data using a structured tool. Other than the proposed social work leadership measurement tool, there is no other mechanism based on social work theory designed to measure the capacity and skill of social work leadership.
Keywords: leadership competencies, leadership education, multi-level social work leadership model, social work core values, social work leadership, social work leadership education, social work leadership measurement tool
Procedia PDF Downloads 172
3680 Fire Safety Engineering of Wood Dust Layer or Cloud
Authors: Marzena Półka, Bożena Kukfisz
Abstract:
This paper presents an analysis of dust explosion hazards in the process industries. It covers selected testing methods for dust explosibility and presents two of them according to the experimental standards used by the Department of Combustion and Fire Theory at the Main School of Fire Service in Warsaw. The article presents values of the maximum acceptable surface temperature (MAST) of machines operating in the presence of a dust cloud and of chosen dust layers with thicknesses of 5 and 12.5 mm. The comparative analysis points to the conclusion that the values of the minimum ignition temperature of the layer (MITL) and the minimum ignition temperature of the dust cloud (MTCD) depend on the granularity of the substance. Increasing the thickness of the dust layer reduces the minimum ignition temperature of the dust layer. Increasing the thickness of the dust at the same time extends flameless combustion and delays ignition.
Keywords: fire safety engineering, industrial hazards, minimum ignition temperature, wood dust
Procedia PDF Downloads 319
3679 Effects of Safety Intervention Program towards Behaviors among Rubber Wood Processing Workers Using Theory of Planned Behavior
Authors: Junjira Mahaboon, Anongnard Boonpak, Nattakarn Worrasan, Busma Kama, Mujalin Saikliang, Siripor Dankachatarn
Abstract:
Rubber wood processing is one of the most important industries in southern Thailand. The process has several safety hazards, for example, unsafe wood-cutting machine guarding, wood dust, noise, and heavy lifting. However, occupational health and safety measures to promote workers' safe behaviours are still limited. This quasi-experimental research set out to determine the factors affecting workers' safety behaviours, using the theory of planned behaviour, after implementing a job safety intervention program. The purposes were (1) to determine the factors affecting workers' behaviours and (2) to evaluate the effectiveness of the intervention program. The study sample was 66 workers from a rubber wood processing factory. Factors in the Theory of Planned Behaviour (TPB) model were measured before and after the intervention. The TPB factors included attitude towards the behaviour, subjective norm, perceived behavioural control, intention, and behaviour. Firstly, a Job Safety Analysis (JSA) was conducted and Safety Standard Operation Procedures (SSOP) were established. A questionnaire was also used to collect workers' characteristics and the TPB factors. Then, a job safety intervention program to promote workers' behaviour according to the SSOP was implemented over a four-month period. The program included SSOP training, personal protective equipment use, and a safety promotional campaign. After that, the TPB factors were collected again. The paired-sample t-test and the independent t-test were used to analyse the data. The results revealed that attitude towards the behaviour and intention increased significantly after the intervention at p<0.05. These factors also significantly determined the workers' safety behaviour according to the SSOP at p<0.05. However, subjective norm and perceived behavioural control neither changed significantly nor related to safety behaviours. In conclusion, attitude towards the behaviour and workers' intention should be promoted to encourage workers' safety behaviours. SSOP intervention programs, e.g., short meetings, safety training, and promotional campaigns, should be continuously implemented on a routine basis to improve workers' behaviour.
Keywords: job safety analysis, rubber wood processing workers, safety standard operation procedure, theory of planned behavior
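The before/after comparison of TPB factors uses a paired-sample t-test. A minimal sketch with hypothetical Likert-scale scores (not the study's data):

```python
from scipy import stats

# Hypothetical per-worker mean scores for "attitude towards the behaviour",
# measured before and after the intervention (illustrative values only).
before = [3.1, 2.8, 3.4, 3.0, 2.9, 3.2, 3.3, 2.7, 3.0, 3.1]
after  = [3.6, 3.2, 3.9, 3.4, 3.5, 3.7, 3.8, 3.1, 3.3, 3.6]

# Paired test: each worker serves as their own control.
t, p = stats.ttest_rel(after, before)
print(f"paired t = {t:.2f}, p = {p:.4g}")
```

A p-value below 0.05, as in the sketch, corresponds to the kind of significant post-intervention increase the study reports for attitude and intention.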
Procedia PDF Downloads 193
3678 Modern Machine Learning Conniptions for Automatic Speech Recognition
Authors: S. Jagadeesh Kumar
Abstract:
This paper presents a clear review of recent machine learning practices as employed in modern automatic speech recognition schemes and as pertinent to prospective ones. The aspiration is to promote additional cross-fertilization between the machine learning and automatic speech recognition communities beyond what has transpired in the past. The manuscript is structured according to the chief machine learning paradigms that are currently popular or have the potential to make significant contributions to automatic speech recognition technology. The paradigms presented and discussed in this article include adaptive and multi-task learning, active learning, Bayesian learning, discriminative learning, generative learning, and supervised and unsupervised learning. These learning paradigms are motivated and discussed in the context of automatic speech recognition tools and functions. The manuscript also surveys topical advances in deep learning and learning with sparse representations; further attention is given to their continuing significance in the evolution of automatic speech recognition.
Keywords: automatic speech recognition, deep learning methods, machine learning paradigms, Bayesian learning, supervised and unsupervised learning
Procedia PDF Downloads 448
3677 A Review on Modeling and Optimization of Integration of Renewable Energy Resources (RER) for Minimum Energy Cost, Minimum CO₂ Emissions and Sustainable Development, in Recent Years
Authors: M. M. Wagh, V. V. Kulkarni
Abstract:
The rising economic activities, growing population, and improving living standards of the world have led to a steady growth in its appetite for both the quality and quantity of energy services. As the economy expands, electricity demand is going to grow further, increasing the challenges of additional generation and the stresses on utility grids. An appropriate energy model will help in the proper utilization of locally available renewable energy sources, such as solar, wind, biomass, and small hydro, for integration into the available grid, reducing investments in energy infrastructure. Furthermore, new technologies and practices such as smart grids, decentralized energy planning, energy management, and energy efficiency are emerging. In this paper, an attempt has been made to study and review recent energy planning models, energy forecasting models, and renewable energy integration models. In addition, various modeling techniques and tools are reviewed and discussed.
Keywords: energy modeling, integration of renewable energy, energy modeling tools, energy modeling techniques
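A toy illustration of the cost-minimising integration models this review covers: a merit-order dispatch that meets demand from the cheapest available sources first. All capacities, costs, and emission factors below are invented for illustration, not drawn from any reviewed model:

```python
# Hypothetical generation mix: (name, capacity_MW, cost_per_MWh, co2_kg_per_MWh)
sources = [
    ("solar",   40.0, 15.0,   0.0),
    ("wind",    60.0, 20.0,   0.0),
    ("hydro",   30.0, 35.0,   5.0),
    ("grid",  1000.0, 80.0, 700.0),
]

def dispatch(demand_mw):
    """Meet one hour of demand from cheapest sources first (merit order).

    Returns (total_cost, total_co2_kg, mix) for that hour.
    """
    remaining, cost, co2, mix = demand_mw, 0.0, 0.0, {}
    for name, cap, cost_rate, co2_rate in sorted(sources, key=lambda s: s[2]):
        used = min(cap, remaining)
        if used > 0:
            mix[name] = used
            cost += used * cost_rate
            co2 += used * co2_rate
            remaining -= used
    return cost, co2, mix
```

Full integration models add constraints (storage, ramping, intermittency) and often trade cost against CO₂ in a multi-objective optimization, but the dispatch logic above is the common core.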
Procedia PDF Downloads 345
3676 Identification of Candidate Congenital Heart Defects Biomarkers by Applying a Random Forest Approach on DNA Methylation Data
Authors: Kan Yu, Khui Hung Lee, Eben Afrifa-Yamoah, Jing Guo, Katrina Harrison, Jack Goldblatt, Nicholas Pachter, Jitian Xiao, Guicheng Brad Zhang
Abstract:
Background and Significance of the Study: Congenital Heart Defects (CHDs) are the most common malformation at birth and one of the leading causes of infant death. Although the exact etiology remains a significant challenge, epigenetic modifications, such as DNA methylation, are thought to contribute to the pathogenesis of congenital heart defects. At present, no existing DNA methylation biomarkers are used for the early detection of CHDs. The existing CHD diagnostic techniques are time-consuming and costly and can only be used to diagnose CHDs after an infant is born. The present study employed a machine learning technique to analyse genome-wide methylation data in children with and without CHDs, with the aim of finding methylation biomarkers for CHDs. Methods: The Illumina Human Methylation EPIC BeadChip was used to screen the genome-wide DNA methylation profiles of 24 infants diagnosed with congenital heart defects and 24 healthy infants without congenital heart defects. Primary pre-processing was conducted using the RnBeads and limma packages. The methylation levels of the top 600 genes with the lowest p-values were selected and further investigated using a random forest approach. ROC curves were used to analyse the sensitivity and specificity of each biomarker in both the training and test sample sets. The functions of the selected genes with high sensitivity and specificity were then assessed in molecular processes. Major Findings of the Study: Three genes (MIR663, FGF3, and FAM64A) were identified from both the training and validation data by random forests, with an average sensitivity and specificity of 85% and 95%, respectively. GO analyses of the top 600 genes showed that these putative differentially methylated genes were primarily associated with the regulation of lipid metabolic processes, protein-containing complex localization, and the Notch signalling pathway.
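The feature-selection-plus-random-forest pipeline described in the Methods can be sketched as follows, with synthetic beta values standing in for the EPIC BeadChip data and the RnBeads/limma pre-selection step assumed to have already produced the 600 candidate sites:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic "methylation" matrix: rows = infants, columns = beta values for
# 600 pre-selected sites; label 1 = CHD, 0 = healthy. Purely illustrative.
rng = np.random.default_rng(0)
n_samples, n_sites = 200, 600
X = rng.uniform(0.0, 1.0, size=(n_samples, n_sites))
y = (X[:, :3].mean(axis=1) > 0.5).astype(int)  # 3 informative "genes"

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# ROC AUC on the held-out set; feature importances flag candidate biomarkers.
auc = roc_auc_score(y_te, rf.predict_proba(X_te)[:, 1])
top_sites = np.argsort(rf.feature_importances_)[-3:]
print(f"test AUC = {auc:.2f}, top-ranked sites: {sorted(top_sites.tolist())}")
```

In the study itself the analogous top-ranked features, validated on both training and test sets, yielded the three candidate genes reported above.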
The present findings highlight that aberrant DNA methylation may play a significant role in the pathogenesis of congenital heart defects.
Keywords: biomarker, congenital heart defects, DNA methylation, random forest
Procedia PDF Downloads 158
3675 A Generic Metamodel for Dependability Analysis
Authors: Moomen Chaari, Wolfgang Ecker, Thomas Kruse, Bogdan-Andrei Tabacaru
Abstract:
In our daily life, we frequently interact with complex systems which facilitate our mobility, enhance our access to information, and sometimes help us recover from illnesses or diseases. The reliance on these systems is motivated by the established evaluation and assessment procedures performed during the different phases of the design and manufacturing flow. Such procedures aim to qualify the system's delivered services with respect to their availability, reliability, safety, and other properties generally referred to as dependability attributes. In this paper, we propose a metamodel-based generic characterization of dependability concepts and describe an automation methodology to customize this characterization to different standards and contexts. When integrated into concrete design and verification environments, the proposed methodology promotes the reuse of already available dependability assessment tools and reduces the costs and efforts required to create consistent and efficient artefacts for fault injection or error simulation.
Keywords: dependability analysis, model-driven development, metamodeling, code generation
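As a loose illustration of the metamodel-plus-code-generation idea, the sketch below defines a tiny metamodel of dependability attributes attached to components and generates a checker from it. The class names and the generated checker are invented for illustration and are not the paper's artefacts:

```python
from dataclasses import dataclass, field

# Hypothetical metamodel: dependability attributes attached to components.
@dataclass
class DependabilityAttribute:
    name: str      # e.g. "reliability", "availability"
    target: float  # required level, e.g. 0.999

@dataclass
class Component:
    name: str
    attributes: list = field(default_factory=list)

def generate_checker(comp: Component) -> str:
    """Template-based code generation: emit checker code from the metamodel."""
    lines = [f"def check_{comp.name}(measured):"]
    for attr in comp.attributes:
        lines.append(
            f"    assert measured['{attr.name}'] >= {attr.target}, "
            f"'{comp.name}: {attr.name} below target'"
        )
    lines.append("    return True")
    return "\n".join(lines)
```

Customizing such a generator per standard or context is what lets one metamodel feed different fault-injection or error-simulation back ends.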
Procedia PDF Downloads 486
3674 Analyzing Essential Patents of Mobile Communication Based on Patent Portfolio: Case Study of Long Term Evolution-Advanced
Authors: Kujhin Jeong, Sungjoo Lee
Abstract:
In the past, cross-licensing drew on various application or commercial patents. Today, cross-licensing is restricted to essential patents, which has significantly emphasized their importance. The literature has shown that a patent portfolio provides information for patent protection or strategic decision-making, but little empirical research has treated essential patents as a strategic tool. This paper highlights four types of essential patent portfolios and analyses the strategy behind each in the field of LTE-A. Specifically, we collected the essential patents of mobile communication companies through ETSI (the European Telecommunications Standards Institute) and built portfolio measures of activity, concentration, diversity, and quality. Using these portfolios, we can understand each company's strategic character with respect to LTE-A technology and perform a comparative analysis of financial results. An essential patent portfolio displays a mobile communication company's strategy and that strategy's impact on the company's performance.
Keywords: essential patent, portfolio, patent portfolio, essential patent portfolio
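Portfolio concentration and diversity are commonly operationalized with a Herfindahl-type index and Shannon entropy over technology classes. The measures and the counts below are illustrative assumptions, not necessarily the indicators used in the paper:

```python
import math

def concentration_hhi(class_counts):
    """Herfindahl-Hirschman index over patent classes: 1 = fully concentrated."""
    total = sum(class_counts)
    return sum((c / total) ** 2 for c in class_counts)

def diversity_shannon(class_counts):
    """Shannon entropy over patent classes: higher = more diverse portfolio."""
    total = sum(class_counts)
    return -sum((c / total) * math.log(c / total)
                for c in class_counts if c > 0)

# Hypothetical essential-patent counts per LTE-A technology class.
firm_a = [120, 10, 5]   # concentrated in one class
firm_b = [45, 45, 45]   # evenly spread across classes
```

Comparing such indices across firms, alongside activity (counts) and quality (e.g. citation-based) measures, gives the four-dimensional portfolio view the paper builds.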
Procedia PDF Downloads 394
3673 Predictive Analysis of the Stock Price Market Trends with Deep Learning
Authors: Suraj Mehrotra
Abstract:
The stock market is a volatile, bustling marketplace that is a cornerstone of economics. It reflects whether companies are succeeding or in a downward spiral. A thorough understanding of it is important; many companies have whole divisions dedicated to the analysis of both their own stock and that of rival companies. Linking the world of finance, especially the stock market, with artificial intelligence (AI) has been a relatively recent development. Predicting how stocks will perform, considering all external factors and previous data, has always been a human task. With the help of AI, however, machine learning models can help us make more complete predictions of financial trends. Looking at the stock market specifically, predicting the open, closing, high, and low prices for the next day is very hard to do, and machine learning makes this task much easier. A model that builds upon itself and takes in external factors as weights can predict trends far into the future. When used effectively, new doors can be opened in the business and finance worlds, and companies can make better and more complete decisions. This paper explores the various techniques used in the prediction of stock prices, from traditional statistical methods to deep learning and neural-network-based approaches, among other methods. It provides a detailed analysis of the techniques and also explores the challenges in predictive analysis. Looking at the accuracy on the testing set of four different models (linear regression, neural network, decision tree, and naïve Bayes) on different stocks (Apple, Google, Tesla, Amazon, United Healthcare, Exxon Mobil, J.P. Morgan Chase, and Johnson & Johnson), the naïve Bayes and linear regression models worked best. On the testing set, the naïve Bayes model had the highest accuracy along with the linear regression model, followed by the neural network model and then the decision tree model.
The training set showed similar results, except that the decision tree model was perfect, with complete accuracy in its predictions. This suggests that the decision tree model overfitted the training set, which hurt its accuracy on the testing set.
Keywords: machine learning, testing set, artificial intelligence, stock analysis
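The linear-regression baseline discussed above can be sketched in a few lines of stdlib Python. The closing prices below are invented for illustration (the paper used real data for the listed stocks); the fit is ordinary least squares of price against day index, followed by a one-step-ahead prediction:

```python
# Ordinary least squares fit of closing price vs. day index, then a
# next-day prediction -- a toy stand-in for the linear-regression
# baseline compared in the paper. Prices are hypothetical.
closes = [150.0, 151.2, 150.8, 152.5, 153.1, 152.9, 154.0]
n = len(closes)
xs = list(range(n))
x_mean = sum(xs) / n
y_mean = sum(closes) / n

# Closed-form OLS slope and intercept.
slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, closes)) / \
        sum((x - x_mean) ** 2 for x in xs)
intercept = y_mean - slope * x_mean

# Prediction for day n, i.e. the next trading day.
next_day_pred = intercept + slope * n
print(round(next_day_pred, 2))
```

On this upward-trending toy series, the fitted slope is positive and the model extrapolates a next-day close above the last observed price, which is the behavior the paper's baseline exploits on trending stocks.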
Procedia PDF Downloads 95
3672 Gender Justice and Empowerment: A Study of Chhara Bootlegger Women of Ahmedabad
Authors: Neeta Khurana, Ritu Sharma
Abstract:
This paper is an impact assessment study of the rehabilitation work done for Chhara women in the rural precincts of Ahmedabad. The Chharas constitute a denotified tribe and live in abject poverty. The women of this community are notorious for evading the law and are active bootleggers of locally made liquor. As part of a psychological study with a local NGO, the authors headed a training program aimed at rehabilitating these women and providing them alternate modes of employment, thereby steering them away from a life of crime. The paper centers on the ideas of women's entrepreneurship and women's empowerment. It notes the importance of handholding in a conflict situation. Most of the research on the Chharas either focuses on their victimisation through state-sponsored violence or pleads for their reconditioning into the mainstream. Going against this trend, this paper argues that making these poor women self-dependent is a panacea for their sluggish development. The alienation caused by the demonisation of the community has made them abandon traditional modes of employment. This has further led the community astray into making illegal country liquor, causing further damage to their reputation. Women are at the centre of this vicious circle, facing much repression and ostracisation. The study conducted by the PDPU team was an attempt to change this dogmatic alienation of these poor women. It was found that with consistent support and a reformist approach towards law, it is possible to steer these women away from a life of penury, repression, and crime. The study uses empirical tools to verify this claim. Placed at the confluence of the sociology of gender and psychology, this paper argues that law enforcement cannot be effective without sensitisation to the ground realities of conflict. The study from which the paper borrows was a scientific survey focused on markers of the gender and caste realities of the Chharas.
The paper mentions various dynamics involved in the training program that paved the way for the successful employment of the women. In an attempt to explain its uniqueness, the paper also has a section comparing similar social experiments.
Keywords: employment, gender, handholding, rehabilitation
Procedia PDF Downloads 131
3671 Game “EZZRA” as an Innovative Solution
Authors: Mane Varosyan, Diana Tumanyan, Agnesa Martirosyan
Abstract:
Many catastrophic events end with dire consequences, and to avoid them, people should be well armed with the necessary information about such situations. In recent years, serious games have gained increasing popularity for training people for different types of emergencies. The central problem discussed is the use of gamification in education: it is essential to understand how, and what kind of, gamified e-learning modules promote engagement. As the theme is emergency preparedness, we also studied people's behavior in order to shape the final approach. Our proposed solution is an educational video game, “EZZRA”.
Keywords: gamification, education, emergency, serious games, game design, virtual reality, digitalisation
Procedia PDF Downloads 76
3670 Lubrication Performance of Multi-Level Gear Oil in a Gasoline Engine
Authors: Feng-Tsai Weng, Dong-Syuan Cai, Tsochu-Lin
Abstract:
A vehicle's gasoline engine converts gasoline into power so that the car can move, and lubricants are important for both engines and gearboxes. Manufacturers have produced numerous engine oils and gear oils for engines and gearboxes to SAE International standards. Some products, as can easily be seen in manufacturers' advertisements, claim not only to improve the lubrication of both the engine and the gearbox but also to raise the power of the vehicle. To observe lubrication performance, a multi-level (heavy-duty) gear oil was used as the engine oil in a gasoline vehicle. The oil was checked about every 10,000 kilometers. The engine was then disassembled in detail, cleaned, and its parts were measured. Finally, the wear of the engine components was checked and recorded. Based on the experimental results, some gear oils appear usable as engine oil in particular vehicles. Vehicle owners should change oil periodically, about every 6,000 miles (10,000 kilometers); used-car owners may change engine oil at even longer intervals.
Keywords: multi-level gear oil, engine oil, viscosity, abrasion
Procedia PDF Downloads 324
3669 Smartphone-Based Human Activity Recognition by Machine Learning Methods
Authors: Yanting Cao, Kazumitsu Nawata
Abstract:
As smartphones are upgraded, their software and hardware grow smarter, so smartphone-based human activity recognition can become more refined, complex, and detailed. In this context, we analyzed a set of experimental data obtained by observing and measuring 30 volunteers performing six activities of daily living (ADL). Due to the large sample size, and especially the 561-feature vector of time- and frequency-domain variables, cleaning these intractable features and training a proper model becomes extremely challenging. After a series of feature-selection and parameter-adjustment steps, a well-performing SVM classifier was trained.
Keywords: smart sensors, human activity recognition, artificial intelligence, SVM
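The abstract does not give enough detail to reproduce the study's SVM, but the core technique can be sketched with a stdlib-only linear SVM trained by sub-gradient descent on the hinge loss. The two-feature data below is invented (the real study used a 561-feature vector from 30 volunteers); labels stand in for two activity classes:

```python
import random

# Toy linear SVM trained by sub-gradient descent on the hinge loss --
# a minimal stand-in for the SVM classifier trained in the study.
# Two invented features per sample; +1 = "walking", -1 = "sitting".
random.seed(0)
data = [([random.gauss(2.0, 0.5), random.gauss(2.0, 0.5)], +1) for _ in range(50)] + \
       [([random.gauss(-2.0, 0.5), random.gauss(-2.0, 0.5)], -1) for _ in range(50)]

w, b, lr, lam = [0.0, 0.0], 0.0, 0.01, 0.01
for _ in range(200):                      # epochs of sub-gradient descent
    for x, y in data:
        margin = y * (w[0] * x[0] + w[1] * x[1] + b)
        if margin < 1:                    # inside margin: hinge-loss gradient step
            w = [wi + lr * (y * xi - lam * wi) for wi, xi in zip(w, x)]
            b += lr * y
        else:                             # outside margin: only shrink weights
            w = [wi - lr * lam * wi for wi in w]

accuracy = sum(1 for x, y in data
               if (1 if w[0] * x[0] + w[1] * x[1] + b >= 0 else -1) == y) / len(data)
print(accuracy)
```

On this well-separated toy data the classifier reaches near-perfect training accuracy; in practice one would use a tuned library implementation and held-out test data, as the study does.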
Procedia PDF Downloads 144
3668 Investigations on the Fatigue Behavior of Welded Details with Imperfections
Authors: Helen Bartsch, Markus Feldmann
Abstract:
The dimensioning of steel structures subject to fatigue loads, such as wind turbines, bridges, masts and towers, crane runways and weirs or components in crane construction, is often dominated by fatigue verification. The fatigue details defined by the welded connections, such as butt or cruciform joints, longitudinal welds, welded-on or welded-in stiffeners, etc., are decisive. In Europe, the verification is usually carried out according to EN 1993-1-9 on a nominal stress basis. The basis is the detail catalog, which specifies the fatigue strength of the various weld and construction details according to fatigue classes. Until now, a relation between fatigue classes and weld imperfection sizes has not been included. Quality levels for imperfections in fusion-welded joints in steel, nickel, titanium and their alloys are regulated in EN ISO 5817, which, however, does not contain direct correlations to fatigue resistances. The question arises whether some imperfections might be tolerable to a certain extent, since they may have been present in the test data used for the detail classifications dating back decades. Although current standardization requires proof that imperfection sizes satisfy limits, it would also be possible to tolerate welds with certain irregularities if these can be reliably quantified by non-destructive testing. Fabricators would be prepared to undertake careful and sustained weld inspection in view of the significant economic consequences of unfavorable fatigue classes. This paper presents investigations on the fatigue behavior of common welded details containing imperfections. In contrast to the common nominal stress concept, local fatigue concepts were used to capture the true stress increase, i.e., local stresses at the weld toe and root. The actual shape of a weld comprising imperfections, e.g., gaps or undercuts, can be incorporated into the fatigue evaluation, usually on a numerical basis.
With the help of the effective notch stress concept, the fatigue resistance of detailed local weld shapes is assessed. Validated numerical models serve to investigate notch factors of fatigue details with different geometries. Using parametrized ABAQUS routines, detailed numerical studies have been performed. Depending on the shape and size of different weld irregularities, fatigue classes can be defined. Both load-carrying welded details, such as the cruciform joint, and non-load-carrying welded details, e.g., welded-on or welded-in stiffeners, are considered. The investigated imperfections include, among others, undercuts, excessive convexity, incorrect weld toe, excessive asymmetry and insufficient or excessive throat thickness. The impacts of different imperfections on the different types of fatigue details are compared. Moreover, the influence of a combination of crucial weld imperfections on the fatigue resistance is analyzed. With regard to the trend of increasing efficiency in steel construction, the overall aim of the investigations is a more economical differentiation of fatigue details with regard to tolerance sizes. In the long term, the harmonization of design standards, execution standards and regulations on weld imperfections is intended.
Keywords: effective notch stress, fatigue, fatigue design, weld imperfections
Procedia PDF Downloads 260
3667 Application of Groundwater Level Data Mining in Aquifer Identification
Authors: Liang Cheng Chang, Wei Ju Huang, You Cheng Chen
Abstract:
Investigation and research are key to the conjunctive use of surface water and groundwater resources. The hydrogeological structure is an important basis for groundwater analysis and simulation. Traditionally, the hydrogeological structure is determined manually from geological drill logs, well structures, groundwater levels, and so on. In Taiwan, a groundwater observation network has been built, and a large amount of groundwater-level observation data is available. The groundwater level is the state variable of the groundwater system, reflecting the system response to the combination of hydrogeological structure, groundwater injection, and extraction. This study applies analytical tools to the observation database to develop a methodology for identifying confined and unconfined aquifers. These tools include frequency analysis, cross-correlation analysis between rainfall and groundwater level, groundwater regression-curve analysis, and a decision tree. The developed methodology is then applied to groundwater-layer identification in two groundwater systems: the Zhuoshui River alluvial fan and the Pingtung Plain. The frequency analysis applies the Fourier transform to the time series of groundwater-level observations, analyzing the amplitude at daily frequency caused by artificial groundwater extraction. The cross-correlation analysis between rainfall and groundwater level is used to obtain the groundwater replenishment time between infiltration and the peak groundwater level during wet seasons. The groundwater regression curve, i.e., the average rate of groundwater regression, is used to analyze the internal flux of the groundwater system and the flux caused by artificial behaviors. The decision tree uses the information obtained from the abovementioned analytical tools and optimizes the best estimate of the hydrogeological structure.
The developed method reaches a training accuracy of 92.31% and a verification accuracy of 93.75% on the Zhuoshui River alluvial fan, and a training accuracy of 95.55% and a verification accuracy of 100% on the Pingtung Plain. These high accuracies indicate that the developed methodology is an effective tool for identifying hydrogeological structures.
Keywords: aquifer identification, decision tree, groundwater, Fourier transform
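The frequency-analysis step described above can be sketched with a stdlib-only discrete Fourier transform. The hourly level series below is synthetic, standing in for real observation-well data: a well stressed by daily pumping shows a strong 1 cycle/day component, which is the signature the method looks for:

```python
import cmath
import math

# Sketch of the frequency-analysis step: measure the daily (1 cycle/day)
# amplitude in an hourly groundwater-level series. The series is a
# hypothetical well: mean level 10 m, a 0.5 m daily pumping cycle, and
# a small slow drift.
hours = 24 * 30                                          # 30 days, hourly
levels = [10.0 + 0.5 * math.sin(2 * math.pi * t / 24)    # daily pumping cycle
          + 0.02 * math.sin(2 * math.pi * t / (24 * 30)) # slow seasonal drift
          for t in range(hours)]

def dft_amplitude(series, cycles):
    """Amplitude of the DFT bin at `cycles` full periods over the record."""
    n = len(series)
    coeff = sum(x * cmath.exp(-2j * math.pi * cycles * t / n)
                for t, x in enumerate(series))
    return 2 * abs(coeff) / n

daily_amp = dft_amplitude(levels, cycles=30)   # 30 periods in 30 days = 1/day
print(round(daily_amp, 3))
```

The recovered amplitude matches the 0.5 m pumping signal; in the method, a pronounced daily amplitude is evidence of artificial extraction from a confined layer, feeding into the decision tree.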
Procedia PDF Downloads 157
3666 Physicochemical and Biochemical Characterization of Olea europea Var. Oleaster Oil and Determination of Its Effects on Blood Parameters
Authors: Asma Gherib, Imen Merzougui, Cherifa Henchiri
Abstract:
This study evaluated the physicochemical characteristics, fatty acid composition, and hypolipidemic effect of oleaster oil (Olea europea var. Oleaster) from the area of El Kala, eastern Algeria, on Wistar albino rats. The physicochemical characteristics - acidity (0.73%), peroxide value (14.16 meq O2/kg oil), and iodine value (74.08 g iodine/100 g of oil) - are consistent with international standards. Fatty acid analysis revealed an oil rich in unsaturated fatty acids (UFA, 76.7%), mainly composed of 65.43% monounsaturated fatty acids (MUFA), of which the major fatty acid is oleic acid (63.57%). The experiment on rats receiving a diet rich in saturated fats and hydrogenated oils revealed that consumption of oleaster oil at doses of 10 g and 20 g for 15 and 30 days improves the plasma lipid profile by decreasing TC, TG, TL, and LDL-C and increasing serum HDL-C. The magnitude of these effects depends on the dose and the period of treatment.
Keywords: oleaster oil, fatty acid, Olea europea, oleic acid, lipid profile
Procedia PDF Downloads 488
3665 Intelligent Electric Vehicle Charging System (IEVCS)
Authors: Prateek Saxena, Sanjeev Singh, Julius Roy
Abstract:
The security of the power distribution grid remains paramount to utility professionals as they enhance it and make it more efficient. One of the most serious threats to the system is maintaining the transformers, as the load is ever increasing with the addition of elements such as electric vehicles. In this paper, intelligent transformer monitoring and grid management are proposed. The system is engineered to use evolving data from smart meters for grid analytics and diagnostics for preventive maintenance. A two-tier architecture for hardware and software integration is coupled to form a robust system for the smart grid. The proposal also presents interoperable meter standards for easy integration. Distribution transformer analytics based on real-time data benefits utilities by preventing outages, protecting against revenue loss, improving return on assets, and reducing overall maintenance cost through predictive monitoring.
Keywords: electric vehicle charging, transformer monitoring, data analytics, intelligent grid
Procedia PDF Downloads 791
3664 Effects of Robot-Assisted Hand Training on Upper Extremity Performance in Patients with Stroke: A Randomized Crossover Controlled, Assessor-Blinded Study
Authors: Hsin-Chieh Lee, Fen-Ling Kuo, Jui-Chi Lin
Abstract:
Background: Upper extremity functional impairment after stroke includes hemiplegia, synergy movement, muscle hypertonicity, and somatosensory impairment, which result in inefficient and inaccurate movement. Robot-assisted rehabilitation is an intensive training approach that is effective for sensorimotor and hand function recovery. However, most such systems focus on the proximal rather than the distal part of the upper limb. The device used in our study was the Gloreha Sinfonia, which focuses on the distal part of the upper limb and uses a dynamic support system to facilitate whole-limb function. The objective of this study was to investigate the effects of robot-assisted therapy (RT) with the Gloreha device on sensorimotor function and ADLs in patients with stroke. Method: Patients with stroke (N=25) completed an AB or BA sequence (A = 12 RT sessions and B = 12 conventional therapy (CT) sessions) over 6 weeks (60 min per session, twice a week), with a 1-month washout period between phases. Performance was assessed by a blinded assessor at 4 time points (pretest 1, posttest 1, pretest 2, posttest 2), including the Fugl–Meyer Assessment-upper extremity (FMA-UE), the box and block test, electromyography of the extensor digitorum communis (EDC) and brachioradialis, and a grip dynamometer for motor evaluation; the Semmes–Weinstein hand monofilament and the revised Nottingham Sensory Assessment for sensory evaluation; and the Modified Barthel Index (MBI) for assessing ADL ability. Result: The RT group significantly improved FMA-UE proximal scores (p = 0.038), FMA-UE total scores (p = 0.046), and MBI (p = 0.030). The EDC exhibited higher efficiency during the small-block grasping task in the RT group than in the CT group (p = 0.050).
Conclusions: RT with the Gloreha device might lead to beneficial effects on arm motor function, ADL ability, and EDC muscle recruitment efficiency in patients with subacute to chronic stroke.
Keywords: activities of daily living, hand function, robotic rehabilitation, stroke
Procedia PDF Downloads 118
3663 Assessment of Compost Usage Quality and Quality for Agricultural Use: A Case Study of Hebron District, Palestine
Authors: Mohammed A. A. Sarhan, Issam A. Al-Khatib
Abstract:
Complying with the technical specifications of compost production is highly important, not only for environmental protection but also for increasing productivity and promoting compost use by farmers in agriculture. This study focuses on the quality of compost on the Palestinian market and farmers' attitudes toward its agricultural use. Quality was assessed through 20 compost samples from different suppliers and producers, lab-tested for quality parameters, while farmers' attitudes to agricultural compost use were evaluated through a survey questionnaire of 321 farmers in the Hebron area. The results showed that compost on the Palestinian market is of medium quality due to partial or non-compliance with quality standards and guidelines. Palestinian farmers showed a positive attitude: 91.2% of them expressed a desire to use compost in agriculture. The results also showed that knowledge of the difference between compost and chemical fertilizers, perception of compost benefits, and previous experience of problems in compost use are significant factors affecting farmers' attitudes toward the use of compost as an organic fertilizer.
Keywords: attitude, compost, compost quality, organic fertilizer, manure
Procedia PDF Downloads 167
3662 Utilization of Jackfruit Seed Flour (Artocarpus heterophyllus L.) as a Food Additive
Authors: C. S. D. S. Maduwage, P. W. Jeewanthi, W. A. J. P. Wijesinghe
Abstract:
This study investigated the use of jackfruit seed flour (JSF) as a thickening agent in tomato sauce production. Lye-peeled mature jackfruit seeds were used to obtain JSF; the flour was packed in laminated bags and stored for further studies. Three batches of tomato sauce samples were prepared according to the Sri Lankan Standards for tomato sauce: with JSF, with corn flour, and without any thickening agent. Samples were stored at room temperature for 8 weeks in glass bottles. Physicochemical properties such as pH, total soluble solids, titratable acidity, and water activity were measured during the storage period. Microbial analysis and sensory evaluation were done to determine the quality of the tomato sauce. JSF performed as a thickening agent in tomato sauce, with the lowest serum separation and the highest viscosity during the storage period. This study concludes that JSF can be successfully used as a thickening agent in the food industry.
Keywords: Jackfruit seed flour, food additive, thickening agent, tomato sauce
Procedia PDF Downloads 309
3661 Groundwater Potential Delineation Using Geodetector Based Convolutional Neural Network in the Gunabay Watershed of Ethiopia
Authors: Asnakew Mulualem Tegegne, Tarun Kumar Lohani, Abunu Atlabachew Eshete
Abstract:
Groundwater potential delineation is essential for efficient water resource utilization and long-term development. The scarcity of potable and irrigation water has become a critical issue, due to natural and anthropogenic activities, in meeting the demands of human survival and productivity. Under these constraints, groundwater resources are now being used extensively in Ethiopia. Therefore, an innovative convolutional neural network (CNN) is applied in the Gunabay watershed to delineate groundwater potential based on selected major influencing factors. Groundwater recharge, lithology, drainage density, lineament density, transmissivity, and geomorphology were selected as the major influencing factors for the groundwater potential of the study area. Of the total 128 samples, 70% were used for training and 30% for testing. The spatial distribution of groundwater potential was classified into five groups: very low (10.72%), low (25.67%), moderate (31.62%), high (19.93%), and very high (12.06%). The area receives high rainfall but has a very low amount of recharge due to a lack of proper soil and water conservation structures. A major outcome of the study is that moderate and low potential are dominant. Geodetector results revealed that the magnitudes of influence on groundwater potential rank as transmissivity (0.48), recharge (0.26), lineament density (0.26), lithology (0.13), drainage density (0.12), and geomorphology (0.06). The model results showed that, using a convolutional neural network (CNN), groundwater potential can be delineated with higher predictive capability and accuracy. CNN-based AUC validation showed accuracies of 81.58% for training and 86.84% for testing. Based on the findings, the local government can receive technical assistance for groundwater exploration and sustainable water resource development in the Gunabay watershed.
Finally, the use of a Geodetector-based deep learning algorithm can provide a new platform for industrial sectors, groundwater experts, scholars, and decision-makers.
Keywords: CNN, geodetector, groundwater influencing factors, groundwater potential, Gunabay watershed
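The AUC values reported in the abstract can be read via the Mann-Whitney formulation: AUC is the probability that a randomly chosen high-potential sample receives a higher predicted score than a randomly chosen low-potential one. A minimal sketch, with invented scores and labels standing in for the CNN's outputs:

```python
# AUC via the Mann-Whitney formulation: the fraction of
# (positive, negative) pairs where the positive sample outscores
# the negative one, counting ties as half. Scores/labels are
# hypothetical stand-ins for CNN predictions on test cells.
scores = [0.9, 0.8, 0.75, 0.6, 0.55, 0.4, 0.3, 0.2]
labels = [1,   1,   0,    1,   0,    0,   1,   0]   # 1 = high potential

def auc(scores, labels):
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

print(auc(scores, labels))   # 12 of 16 pairs correctly ordered -> 0.75
```

An AUC of 0.5 corresponds to random ranking and 1.0 to perfect separation, so the study's testing AUC of 86.84% indicates the CNN ranks high-potential zones above low-potential ones most of the time.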
Procedia PDF Downloads 22
3660 A Comprehensive Study on Quality Assurance in Game Development
Authors: Maria Komal, Zaineb Khalil, Mehreen Sirshar
Abstract:
Due to recent technological advancements, games have become one of the most in-demand applications. The gaming industry is growing rapidly, and the key to success in this industry is the development of good-quality games, which is a highly competitive issue. The ultimate goal of game developers is to provide player satisfaction by developing high-quality games. This research is a comprehensive survey of the techniques game companies follow to ensure game quality. After analysis of the various techniques, it was found that quality simulation according to ISO standards and playtest methods are used to ensure game quality. Because game development requires a cross-disciplinary team, an increasing trend towards distributed game development has been observed. This paper evaluates the strengths and weaknesses of current methodologies used in the game industry and draws a conclusion. We have also proposed quality parameters which can be used as a heuristic framework to identify the attributes with high testing priority.
Keywords: game development, computer games, video games, gaming industry, quality assurance, playability, user experience
Procedia PDF Downloads 534
3659 Influence of Spelling Errors on English Language Performance among Learners with Dysgraphia in Public Primary Schools in Embu County, Kenya
Authors: Madrine King'endo
Abstract:
This study dealt with the influence of spelling errors on English language performance among learners with dysgraphia in public primary schools in West Embu, Embu County, Kenya. The study aimed to investigate the influence of spelling errors on English language performance among class three pupils with dysgraphia in public primary schools. The objectives of the study were to identify the spelling errors that learners with dysgraphia make when writing English words and to classify those errors. Further, the study sought to establish how the spelling errors affect language performance among the participants and to suggest remediation strategies that teachers could use to address the errors. The study could provide stakeholders with relevant information on writing skills that could help in developing a responsive curriculum to accommodate the teaching and learning needs of learners with dysgraphia, and ensure that training in teacher training colleges is tailored to the writing needs of pupils with dysgraphia. The study was carried out in Embu County because the researcher did not find any study in the related literature concerning the influence of spelling errors on English language performance among learners with dysgraphia in public primary schools in the area. Moreover, besides being populated enough to provide the study's sample, the area is fairly cosmopolitan, allowing a generalization of the findings. The study assumed that the sampled schools would have class three pupils with dysgraphia who exhibited written spelling errors. The study was guided by two spelling approaches - the connectionist simulation of the spelling process and the orthographic autonomy hypothesis - with a view to explaining how participants with learning disabilities spell written words.
Data were collected through interviews, pupils' exercise books and progress records, and a spelling test made by the researcher based on the spelling scope set for class three pupils by the Ministry of Education in the primary education syllabus. The study relied on random sampling techniques in identifying general and specific participants. Since the study used school children as participants, voluntary consent was sought from the children themselves, their teachers, and the head teachers who act as their caretakers in the school setting.
Keywords: dysgraphia, writing, language, performance
Procedia PDF Downloads 154