Search results for: least square support vector machine
9533 Prediction of California Bearing Ratio of a Black Cotton Soil Stabilized with Waste Glass and Eggshell Powder using Artificial Neural Network
Authors: Biruhi Tesfaye, Avinash M. Potdar
Abstract:
The laboratory test process to determine the California bearing ratio (CBR) of black cotton soils is not only expensive but also time-consuming. Hence, advance prediction of CBR plays a significant role, as it is applicable in pavement design. The prediction of CBR of treated soil was executed by Artificial Neural Networks (ANNs), a computational tool based on the properties of the biological neural system. To observe CBR values, combined eggshell and waste glass powder was added to the soil at 4, 8, 12, and 16 % of the weight of the soil samples. Accordingly, the related laboratory tests were conducted to obtain the best model. The maximum CBR value, 5.8, was found at 8 % eggshell-waste glass powder addition. The model was developed using CBR as the output layer variable. CBR was considered a function of the joint effect of liquid limit, plastic limit, plasticity index, optimum moisture content, and maximum dry density. The best model found was an ANN with 5, 6, and 1 neurons in the input, hidden, and output layers, respectively. The performance of the selected ANN was 0.99996, 4.44E-05, 0.00353, and 0.0067 for the correlation coefficient (R), mean square error (MSE), mean absolute error (MAE), and root mean square error (RMSE), respectively. The research summarized above throws light on the future scope of stabilization with waste glass combined with different percentages of eggshell, leading to an economical design of CBR acceptable for pavement sub-base or base, as desired.
Keywords: CBR, artificial neural network, liquid limit, plastic limit, maximum dry density, OMC
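The 5-6-1 feed-forward architecture described above can be sketched as follows. This is an illustrative forward pass only, with random placeholder weights and a hypothetical normalized sample, not the authors' trained model.

```python
import numpy as np

# Sketch of the 5-6-1 network from the abstract: inputs are liquid limit,
# plastic limit, plasticity index, optimum moisture content and maximum
# dry density; output is CBR. Weights are random placeholders.
rng = np.random.default_rng(0)

W1 = rng.normal(size=(5, 6))   # input -> hidden
b1 = np.zeros(6)
W2 = rng.normal(size=(6, 1))   # hidden -> output
b2 = np.zeros(1)

def predict_cbr(x):
    """Forward pass: tanh hidden layer, linear output."""
    h = np.tanh(x @ W1 + b1)
    return (h @ W2 + b2).ravel()

# One soil sample: [LL, PL, PI, OMC, MDD] (hypothetical, normalized values)
sample = np.array([[0.5, 0.3, 0.2, 0.4, 0.6]])
cbr = predict_cbr(sample)
```

In practice the weights would be fitted by backpropagation against laboratory CBR measurements before the model is used for prediction.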
Procedia PDF Downloads 191
9532 A Method for Clinical Concept Extraction from Medical Text
Authors: Moshe Wasserblat, Jonathan Mamou, Oren Pereg
Abstract:
Natural Language Processing (NLP) has made a major leap in the last few years in its practical integration into medical solutions; for example, extracting clinical concepts from medical texts such as medical conditions, medications, treatments, and symptoms. However, training and deploying those models in real environments still demands a large amount of annotated data and NLP/Machine Learning (ML) expertise, which makes this process costly and time-consuming. We present a practical and efficient method for clinical concept extraction that requires neither costly labeled data nor ML expertise. The method includes three steps. Step 1: the user injects a large in-domain text corpus (e.g., PubMed). Then, the system builds a contextual model containing vector representations of concepts in the corpus in an unsupervised manner (e.g., Phrase2Vec). Step 2: the user provides a seed set of terms representing a specific medical concept (e.g., for the concept of symptoms, the user may provide: ‘dry mouth,’ ‘itchy skin,’ and ‘blurred vision’). Then, the system matches the seed set against the contextual model and extracts the most semantically similar terms (e.g., additional symptoms). The result is a complete set of terms related to the medical concept. Step 3: in production, there is a need to extract medical concepts from unseen medical text. The system extracts key-phrases from the new text, then matches them against the complete set of terms from step 2, and the most semantically similar ones are annotated with the same medical concept category. As an example, the seed symptom concepts would result in the following annotation: “The patient complains of fatigue [symptom], dry skin [symptom], and weight loss [symptom], which can be an early sign of diabetes.” Our evaluations show promising results for extracting concepts from medical corpora.
The method allows medical analysts to easily and efficiently build taxonomies (in step 2) representing their domain-specific concepts and automatically annotate a large number of texts (in step 3) for classification/summarization of medical reports.
Keywords: clinical concepts, concept expansion, medical records annotation, medical records summarization
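The concept-expansion idea in step 2 can be sketched as below: seed terms are compared against a vocabulary of term vectors and the most similar terms are absorbed into the concept set. The vectors and the similarity threshold here are hypothetical stand-ins for a Phrase2Vec-style contextual model, not the authors' system.

```python
import numpy as np

# Toy contextual model: term -> embedding (hypothetical 3-d vectors).
model = {
    "dry mouth":      np.array([0.9, 0.1, 0.0]),
    "itchy skin":     np.array([0.8, 0.2, 0.1]),
    "blurred vision": np.array([0.85, 0.15, 0.05]),
    "fatigue":        np.array([0.8, 0.1, 0.1]),
    "metformin":      np.array([0.1, 0.9, 0.2]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def expand(seeds, model, threshold=0.95):
    """Add every term whose best cosine similarity to any seed term
    exceeds the threshold (step 2 of the method)."""
    expanded = set(seeds)
    for term, vec in model.items():
        if term in expanded:
            continue
        if max(cosine(vec, model[s]) for s in seeds) >= threshold:
            expanded.add(term)
    return expanded

symptoms = expand(["dry mouth", "itchy skin", "blurred vision"], model)
```

Step 3 then reduces to matching key-phrases from new text against the expanded set and tagging hits with the concept label.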
Procedia PDF Downloads 135
9531 Availability Analysis of Process Management in the Equipment Maintenance and Repair Implementation
Authors: Onur Ozveri, Korkut Karabag, Cagri Keles
Abstract:
Production downtime and repair costs that occur when machines fail are an important issue in machine-intensive production industries. When more than one machine fails at the same time, the key issues are which machine should have repair priority, how much repair time should be allotted to each machine, and how to plan the resources needed for the repairs. In recent years, the Business Process Management (BPM) technique has brought effective solutions to various business problems. The main feature of this technique is that it can improve the way a job is done by examining the work of interest in detail. In industry, maintenance and repair work operates as a process, and when a breakdown occurs, the repair work is carried out as a series of process steps. The maintenance main process and the repair sub-process were evaluated with the process management technique, in the expectation that this structure could provide a solution. For this reason, this issue was discussed in an international manufacturing company, and a solution proposal was developed. The purpose of this study is the implementation of maintenance and repair work integrated with the process management technique and, at the end of the implementation, the analysis of maintenance-related parameters such as quality, cost, time, safety, and spare parts. The international firm that carried out the application operates in a free trade zone in Turkey, and its core business is producing original equipment technologies, vehicle electrical construction, electronics, safety, and thermal systems for the world's leading light and heavy vehicle manufacturers. First, a project team was established in the firm. The team examined the current maintenance process, which was then revised using process management techniques. The repair process, a sub-process of the maintenance process, was likewise reviewed.
In the improved processes, the ABC equipment classification technique was used to decide which machine or machines will be given priority in case of failure. This technique prioritizes malfunctioning machines based on their effect on production, product quality, maintenance costs, and job safety. The improved maintenance and repair processes were implemented in the company for three months, and the obtained data were compared with the previous year's data. In conclusion, breakdown maintenance was found to occur in a shorter time, with lower cost and a lower spare parts inventory.
Keywords: ABC equipment classification, business process management (BPM), maintenance, repair performance
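A minimal sketch of the ABC-style prioritization idea is shown below: each machine gets a weighted score over the criteria named in the abstract, and failed machines are repaired in descending score order. The criteria weights, machine names, and ratings are all hypothetical.

```python
# Hypothetical weights over the four prioritization criteria.
weights = {"production": 0.4, "quality": 0.25, "cost": 0.2, "safety": 0.15}

# Hypothetical 1-10 ratings per machine.
machines = {
    "press":    {"production": 9, "quality": 8, "cost": 6, "safety": 7},
    "conveyor": {"production": 5, "quality": 4, "cost": 3, "safety": 5},
    "welder":   {"production": 7, "quality": 9, "cost": 5, "safety": 8},
}

def score(ratings):
    """Weighted sum of a machine's criterion ratings."""
    return sum(weights[c] * ratings[c] for c in weights)

def repair_order(failed):
    """Repair the highest-scoring (most critical) machines first."""
    return sorted(failed, key=lambda m: score(machines[m]), reverse=True)

order = repair_order(["conveyor", "press", "welder"])
```

In a real ABC classification the scores would also be bucketed into A/B/C criticality classes rather than ranked directly.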
Procedia PDF Downloads 194
9530 Status of Mangrove Wetlands and Implications for Sustainable Livelihood of Coastal Communities on the Lagos Coast (West Africa)
Authors: I. Agboola Julius, Christopher A. Kumolu-Johnson, O. Kolade Rafiu, A. Saba Abdulwakil
Abstract:
This work examines mangrove diversity, trends of change, factors responsible for loss over the years, and implications for the sustainable livelihoods of locals in four villages (Ajido (L1), Tarkwa Bay (L2), University of Lagos (L3), and Ikosi (L4)) along the coast of Lagos, Nigeria. Primary data were collected through field surveys, questionnaires, interviews, and a review of existing literature. Field observation and data analysis reveal that mangrove diversity is low and varies on a spatial scale: Margalef's Diversity Index (D) was 0.368, 0.269, 0.326, and 0.333, respectively, for L1, L2, L3, and L4. The Shannon-Wiener Index (H) was estimated to be 1.003, 1.460, 1.160, and 1.046, and species evenness (E) 0.913, 0.907, 0.858, and 0.015, respectively, for the four villages. Also, Simpson's index of diversity was 0.632, 0.731, 0.647, and 0.667, and Simpson's reciprocal index 2.717, 3.717, 3.060, and 3.003, respectively, for the four villages. A Chi-square test was used to analyze the impact of mangrove loss on the sustainable livelihood of coastal communities. The calculated Chi-square (X²) value (5) was higher than the tabulated value (4.30), suggesting that the loss of mangrove wetlands impacted local communities' livelihoods at the four villages. Analyses of the causes and trends of mangrove wetland loss over the years suggest that urbanization, fuel wood, and agricultural activities are the major causes. The current degradation observed in mangrove wetlands on the Lagos coast suggests a reduction in mangrove biodiversity and associated fauna, with potential cascading effects on higher trophic levels such as fisheries. Low fish catch yields, reduced income, and increasing cases of natural disaster have culminated in threats to the sustainable livelihoods of local communities along the coast of Lagos.
Keywords: mangroves, Lagos coast, fisheries, management
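The diversity indices reported above follow standard formulas, which can be computed as below. The species counts used here are hypothetical, not the authors' field data.

```python
import math

# Hypothetical individuals-per-species counts for one site.
counts = [30, 12, 7, 2]
N = sum(counts)                     # total individuals
S = len(counts)                     # species richness
p = [n / N for n in counts]         # relative abundances

margalef = (S - 1) / math.log(N)                     # Margalef's D
shannon = -sum(pi * math.log(pi) for pi in p)        # Shannon-Wiener H
simpson_diversity = 1 - sum(pi ** 2 for pi in p)     # Simpson's 1 - D
simpson_reciprocal = 1 / sum(pi ** 2 for pi in p)    # Simpson's 1 / D
evenness = shannon / math.log(S)                     # Pielou's evenness E
```

Note how the Simpson index of diversity and the reciprocal index are two transforms of the same dominance sum, which is why the abstract reports both per village.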
Procedia PDF Downloads 647
9529 Keynote Talk: The Role of Internet of Things in the Smart Cities Power System
Authors: Abdul-Rahman Al-Ali
Abstract:
As the number of mobile devices grows exponentially, it is estimated that about 50 billion devices will be connected to the Internet by the year 2020; by the end of this decade, an average of eight connected devices per person worldwide is expected. These 50 billion devices are not only mobile phones and data-browsing gadgets but also machine-to-machine and man-to-machine devices. With such growing numbers of devices, the Internet of Things (IoT) concept has recently become one of the emerging technologies. Within smart grid technologies, smart home appliances, Intelligent Electronic Devices (IED), and Distributed Energy Resources (DER) are major IoT objects that can be addressed using IPv6. These objects are called the smart grid Internet of Things (SG-IoT). The SG-IoT generates big data that requires high-speed computing infrastructure, widespread computer networks, big data storage, software, and platform services. A utility company's control and data centers cannot handle such a large number of devices, high-speed processing, and massive data storage. Building large data center infrastructure takes a long time; it also requires widespread communication networks and huge capital investment. Maintaining and upgrading control and data center infrastructure and communication networks, as well as updating and renewing software licenses, collectively requires additional cost. This can be overcome by utilizing emerging computing paradigms such as cloud computing, which can serve as a smart grid enabler to replace legacy utility data centers. The talk will highlight the role of IoT, cloud computing services, and their deployment models within smart grid technologies.
Keywords: intelligent electronic devices (IED), distributed energy resources (DER), internet, smart home appliances
Procedia PDF Downloads 324
9528 Equality in Higher Education: A Library and Learning Collaborative Project to Support Teachers
Authors: Ika Jorum
Abstract:
The aim of this collaborative project was to develop library support that contributes in a long-term way to a technical university's work on increased equality in education. The background was an assessment by the Higher Education Authority that showed a need for improvement regarding equality in several programs at the university. The university's Vice President for Equality and Vice President for Sustainability announced funds for projects that supported the improvement of equality in education. The library was granted funding for a one-year project that aimed both to support teachers in embedding equality in education and to support the library staff and improve the organization's own work. The part of the project directed at teachers was performed as activities in different areas and forms, such as acquisition and collections, teaching, exhibitions, and book discussions. Besides the activities and support offered to teachers, the education team held journal clubs in order to develop and embed equality in their own teaching. The part directed at library staff and management was performed as workshops in collaboration with the Equality Office in order to identify areas where the library could improve its work on equality and inclusion. The expectation was that the activities would be well attended, since the project team had received indications that the content would be relevant. The outcome of this project was that some activities were better attended than others, and an activity expected to be relevant, for example, a workshop for teachers on information searching from a gender and equality perspective, might still not attract participants. On the other hand, Ph.D. students and students participated in the book discussions and wanted them to continue after the project had ended. Results will be shared on both what was successful and what was challenging.
Some reflections will be given on what can be done to attract participants to activities in the area of gender equality that are most likely relevant to the expected attendees, and on how results from a project on gender equality can be integrated into an organization's daily work.
Keywords: equality, higher education, critical information literacy, collaboration
Procedia PDF Downloads 73
9527 Modelling and Detecting the Demagnetization Fault in the Permanent Magnet Synchronous Machine Using the Current Signature Analysis
Authors: Yassa Nacera, Badji Abderrezak, Saidoune Abdelmalek, Houassine Hamza
Abstract:
Several kinds of faults can occur in permanent magnet synchronous machine (PMSM) systems: bearing faults, electrical short/open faults, eccentricity faults, and demagnetization faults. A demagnetization fault means that the strength of the permanent magnets (PM) in the PMSM decreases, causing low output torque, which is undesirable for electric vehicles (EVs). The fault is caused by physical damage, high-temperature stress, inverse magnetic fields, and aging. Motor current signature analysis (MCSA) is a conventional motor fault detection method based on the extraction of signal features from the stator current. A simulation model of the PMSM under partial demagnetization and uniform demagnetization faults was established, and different degrees of demagnetization fault were simulated. The harmonic analyses using the Fast Fourier Transform (FFT) show that the fault diagnosis method based on harmonic analysis is only suitable for partial demagnetization faults of the PMSM and does not apply to uniform demagnetization faults.
Keywords: permanent magnet, diagnosis, demagnetization, modelling
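The MCSA idea can be illustrated with a synthetic stator current: a small fault-related harmonic is added to the fundamental, and an FFT of the signal reveals that component. The frequencies and amplitudes below are illustrative, not from the paper's simulation model.

```python
import numpy as np

fs = 1000.0                      # sampling frequency, Hz
t = np.arange(0, 1.0, 1 / fs)    # 1-second window -> 1 Hz bins
fundamental = 50.0               # supply frequency, Hz
fault_harmonic = 25.0            # hypothetical fractional harmonic from partial demagnetization

# Synthetic stator current: healthy fundamental plus a weak fault component.
current = (np.sin(2 * np.pi * fundamental * t)
           + 0.05 * np.sin(2 * np.pi * fault_harmonic * t))

# Single-sided amplitude spectrum (normalized so a unit sine reads 0.5).
spectrum = np.abs(np.fft.rfft(current)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1 / fs)

# Amplitude in the bin nearest the suspected fault frequency.
fault_bin = int(np.argmin(np.abs(freqs - fault_harmonic)))
```

A diagnosis rule would then compare `spectrum[fault_bin]` against a healthy-machine baseline; as the abstract notes, uniform demagnetization changes no such extra harmonic and therefore evades this test.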
Procedia PDF Downloads 68
9526 Detecting Elderly Abuse in US Nursing Homes Using Machine Learning and Text Analytics
Authors: Minh Huynh, Aaron Heuser, Luke Patterson, Chris Zhang, Mason Miller, Daniel Wang, Sandeep Shetty, Mike Trinh, Abigail Miller, Adaeze Enekwechi, Tenille Daniels, Lu Huynh
Abstract:
Machine learning and text analytics have been used to analyze child abuse, cyberbullying, domestic abuse and domestic violence, and hate speech. However, to the authors' knowledge, no research to date has used these methods to study elder abuse in nursing homes or skilled nursing facilities from field inspection reports. We used machine learning and text analytics methods to analyze 356,000 inspection reports extracted from CMS Form-2567 field inspections of US nursing homes and skilled nursing facilities between 2016 and 2021. Our algorithm detected occurrences of various types of abuse, including physical abuse, psychological abuse, verbal abuse, sexual abuse, and passive and active neglect. For example, to detect physical abuse, our algorithms search for combinations of phrases and words suggesting willful infliction of damage (hitting, pinching or burning, tethering, tying) or consciously ignoring an emergency. To detect occurrences of elder neglect, our algorithm looks for combinations of phrases and words suggesting both passive neglect (neglecting vital needs, allowing malnutrition and dehydration, allowing decubiti, deprivation of information, limitation of freedom, negligence toward safety precautions) and active neglect (intimidation and name-calling, tying the victim up to prevent falls without consent, consciously ignoring an emergency, not calling a physician in spite of indication, stopping important treatments, failure to provide essential care, deprivation of nourishment, leaving a person alone for an inappropriate amount of time, excessive demands in a situation of care). We further compare the prevalence of abuse before and after COVID-19-related restrictions on nursing home visits. We also identified the facilities with the largest numbers of abuse cases and no abuse-free facilities within a 25-mile radius as the most likely candidates for additional inspections.
We also built an interactive display to visualize the locations of these facilities.
Keywords: machine learning, text analytics, elder abuse, elder neglect, nursing home abuse
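The phrase-search component described above can be sketched as a lexicon match: each abuse category carries indicative phrases, and a report is tagged with every category whose phrases appear. The lexicon below is a tiny hypothetical subset drawn from the phrases listed in the abstract, not the authors' full rule set.

```python
import re

# Hypothetical mini-lexicon: category -> indicative phrases.
LEXICON = {
    "physical_abuse": ["hitting", "pinching", "burning", "tethering", "tying"],
    "passive_neglect": ["malnutrition", "dehydration", "decubiti"],
    "active_neglect": ["name-calling", "ignoring an emergency",
                       "deprivation of nourishment"],
}

def detect_abuse(text):
    """Return the sorted list of categories whose phrases occur in the text."""
    text = text.lower()
    return sorted(
        category
        for category, phrases in LEXICON.items()
        if any(re.search(r"\b" + re.escape(p) + r"\b", text) for p in phrases)
    )

report = ("Staff observed bruising consistent with hitting; resident showed "
          "signs of dehydration and malnutrition.")
flags = detect_abuse(report)
```

A production pipeline would add negation handling and context windows so that, e.g., "no signs of dehydration" is not flagged.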
Procedia PDF Downloads 146
9525 Use Cloud-Based Watson Deep Learning Platform to Train Models Faster and More Accurately
Authors: Susan Diamond
Abstract:
Machine learning workloads have traditionally been run in high-performance computing (HPC) environments, where users log in to dedicated machines and utilize the attached GPUs to run training jobs on huge datasets. Training large neural network models is very resource-intensive, and even after exploiting parallelism and accelerators such as GPUs, a single training job can still take days. Consequently, the cost of hardware is a barrier to entry. Even when upfront cost is not a concern, the lead time to set up such an HPC environment takes months, from acquiring hardware to configuring it with the right firmware and software. Furthermore, scalability is hard to achieve in a rigid traditional lab environment, which is therefore slow to react to dynamic change in the artificial intelligence industry. Watson Deep Learning as a Service is a cloud-based deep learning platform that mitigates the long lead time and high upfront investment in hardware. It enables robust and scalable sharing of resources among the teams in an organization and is designed for on-demand cloud environments. Providing a similar user experience in a multi-tenant cloud environment comes with its own unique challenges regarding fault tolerance, performance, and security. Watson Deep Learning as a Service tackles these challenges and presents a deep learning stack for cloud environments in a secure, scalable, and fault-tolerant manner. It supports a wide range of deep learning frameworks such as TensorFlow, PyTorch, Caffe, Torch, Theano, and MXNet. These frameworks reduce the effort and skillset required to design, train, and use deep learning models. Deep Learning as a Service is used at IBM by AI researchers in areas including machine translation, computer vision, and healthcare.
Keywords: deep learning, machine learning, cognitive computing, model training
Procedia PDF Downloads 209
9524 Developing a Spatial Decision Support System for Rationality Assessment of Land Use Planning Locations in Thai Binh Province, Vietnam
Authors: Xuan Linh Nguyen, Tien Yin Chou, Yao Min Fang, Feng Cheng Lin, Thanh Van Hoang, Yin Min Huang
Abstract:
In Vietnam, land use planning is the most important and powerful tool of the government for sustainable land use and land management. Nevertheless, many land use planning locations face protests from surrounding households due to environmental impacts. In addition, locations are planned entirely on the basis of the subjective decisions of planners, unsupported by tools or scientific methods. Hence, this research aims to assist decision-makers in evaluating the rationality of planning locations by developing a Spatial Decision Support System (SDSS) using Geographic Information System (GIS)-based technology, the Analytic Hierarchy Process (AHP) multi-criteria technique, and fuzzy set theory. An ArcGIS Desktop add-in named SDSS-LUPA was developed to support users in analyzing data and presenting results in a friendly format. The Fuzzy-AHP method was utilized as the analytic model for this SDSS. Eighteen planned locations in Hung Ha district (Thai Binh province, Vietnam) served as a case study. The experimental results indicated that, against an assessment threshold of 0.65, the 18 planned locations were irrational because they were too close to residential areas or water sources. Some potential sites were also proposed to the authorities for consideration of land use planning changes.
Keywords: analytic hierarchy process, fuzzy set theory, land use planning, spatial decision support system
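The AHP core of such a model can be sketched as below: a pairwise comparison matrix over siting criteria is reduced to a priority (weight) vector via its principal eigenvector. The criteria names and comparison values are hypothetical, and the fuzzification step of the Fuzzy-AHP variant is omitted for brevity.

```python
import numpy as np

# Hypothetical siting criteria and a Saaty-scale pairwise comparison matrix:
# A[i][j] = how much more important criterion i is than criterion j.
criteria = ["distance to residents", "distance to water", "land suitability"]
A = np.array([
    [1.0,   3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
])

# Priority vector = normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, principal].real)
weights /= weights.sum()

priority = dict(zip(criteria, weights))
```

Each candidate location's criterion scores would then be combined with these weights into a rationality score and compared against the assessment threshold.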
Procedia PDF Downloads 379
9523 Trends in Use of Millings in Pavement Maintenance
Authors: Rafiqul Tarefder, Mohiuddin Ahmad, Mohammad Hossain
Abstract:
While materials milled from old pavement surfaces can be an important component of cost-effective maintenance operations, their use in maintenance projects is neither uniform nor well documented. This study documents the different maintenance practices followed by four transportation districts of the New Mexico Department of Transportation (NMDOT) in an attempt to find out whether millings are being used in maintenance projects by those districts. Based on the existing literature, a questionnaire was developed covering six common maintenance practices. NMDOT district personnel were interviewed face to face to discuss and answer that questionnaire. It revealed that NMDOT districts mainly use chip seal and patching. Other maintenance procedures, such as sand seal, scrub seal, slurry seal, and thin overlay, have limited use. Two of the four participating districts do not have any documents on chip sealing; rather, they rely on the experience of the chip seal crew. All districts use polymer-modified high float emulsion (HFE100P) for chip seal, with an application rate ranging from 0.4 to 0.56 gallons per square yard. The chip application rate varies from 15 to 40 lb/square yard. Statewide, the thickness of chip seal varies from 3/8" to 1" and its life varies from 3 to 10 years. NMDOT districts mainly use three types of patching: pothole, dig-out, and blade patch. Pothole patches are used for small potholes and during emergencies; dig-out patches are used for all types of potholes, sometimes after pothole patching; and blade patch is used when a significant portion of the pavement is damaged. Pothole patches last as little as three days, whereas a blade patch lasts as long as 3 years. It was observed that all participating districts use millings in maintenance projects.
Keywords: chip seal, sand seal, scrub seal, slurry seal, overlay, patching, millings
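The application rates quoted above translate directly into material quantities for a given section. The sketch below is only unit arithmetic over a hypothetical lane geometry, not NMDOT guidance.

```python
# Material quantities from the reported NMDOT rates:
# emulsion 0.4-0.56 gal/yd², chips 15-40 lb/yd² (mid-range defaults used).
def chip_seal_quantities(length_ft, width_ft, emulsion_rate=0.48, chip_rate=25):
    """Return (emulsion gallons, chip short tons) for a rectangular section."""
    area_yd2 = (length_ft * width_ft) / 9.0        # ft² -> yd²
    emulsion_gal = area_yd2 * emulsion_rate        # gal/yd² * yd²
    chip_tons = area_yd2 * chip_rate / 2000.0      # lb -> short tons
    return emulsion_gal, chip_tons

# One mile of a 12-ft lane (hypothetical section).
gal, tons = chip_seal_quantities(5280, 12)
```

Such a back-of-envelope estimate is what undocumented districts would otherwise leave to the chip seal crew's experience.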
Procedia PDF Downloads 343
9522 Optimization of Moisture Content for Highest Tensile Strength of Instant Soluble Milk Tablet and Flowability of Milk Powder
Authors: Siddharth Vishwakarma, Danie Shajie A., Mishra H. N.
Abstract:
Milk powder is very useful in areas with a low milk supply, but determining the exact amount to add for one glass of milk is difficult, as is the handling. Hence the idea of an instant soluble milk tablet, which offers high solubility and easy handling. The moisture content of the milk tablets is increased by the direct addition of water, with no additives for binding. The variation of the tensile strength of the instant soluble milk tablets and of the flowability of the milk powder with moisture content is analyzed, and the moisture content is optimized for the highest tensile strength of the tablets and a flowability above a particular value, using response surface methodology. The flowability value is necessary for ease in quantifying the milk powder as a feed in the designed tablet-making machine. The instant soluble nature of the milk tablets depends purely on the disintegration characteristic of the tablets in water, the study of which is in progress. Conclusions: The optimization results are very useful for the commercialization of milk tablets.
Keywords: flowability, milk powder, response surface methodology, tablet making machine, tensile strength
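The response-surface idea can be sketched in one dimension: fit a quadratic model of tensile strength against moisture content and locate its stationary (maximum) point. The data points below are hypothetical, not the study's measurements.

```python
import numpy as np

# Hypothetical observations: tensile strength peaks at moderate moisture.
moisture = np.array([2.0, 4.0, 6.0, 8.0, 10.0])    # % moisture content
strength = np.array([1.1, 1.9, 2.4, 2.2, 1.5])     # tensile strength (MPa)

# Second-order response surface: strength ~ c2*m^2 + c1*m + c0.
c2, c1, c0 = np.polyfit(moisture, strength, 2)

# Stationary point of the fitted parabola (a maximum when c2 < 0).
optimum_moisture = -c1 / (2 * c2)
```

A full RSM study would fit this surface over several factors at once and add a constraint that flowability stays above its required threshold.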
Procedia PDF Downloads 182
9521 Graph Clustering Unveiled: ClusterSyn - A Machine Learning Framework for Predicting Anti-Cancer Drug Synergy Scores
Authors: Babak Bahri, Fatemeh Yassaee Meybodi, Changiz Eslahchi
Abstract:
In the pursuit of effective cancer therapies, the exploration of combinatorial drug regimens is crucial to leverage synergistic interactions between drugs, thereby improving treatment efficacy and overcoming drug resistance. However, identifying synergistic drug pairs poses challenges due to the vast combinatorial space and the limitations of experimental approaches. This study introduces ClusterSyn, a machine learning (ML)-powered framework for classifying anti-cancer drug synergy scores. ClusterSyn employs a two-step approach involving drug clustering and synergy score prediction using a fully connected deep neural network. For each cell line in the training dataset, a drug graph is constructed, with nodes representing drugs and edge weights denoting synergy scores between drug pairs. Drugs are clustered using the Markov clustering (MCL) algorithm, and vectors representing the similarity of drug pairs to each cluster are input into the deep neural network for synergy score prediction (synergy or antagonism). Clustering results demonstrate effective grouping of drugs based on synergy scores, aligning similar synergy profiles. Subsequently, the neural network predictions and the synergy scores of the two drugs with others within their clusters are used to predict the synergy score of the drug pair under consideration. This approach facilitates comparative analysis with clustering- and regression-based methods, revealing the superior performance of ClusterSyn over state-of-the-art methods such as DeepSynergy and DeepDDS on diverse datasets such as O'Neil and ALMANAC. The results highlight the remarkable potential of ClusterSyn as a versatile tool for predicting anti-cancer drug synergy scores.
Keywords: drug synergy, clustering, prediction, machine learning, deep learning
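The MCL step used for drug clustering can be sketched on a toy graph: the adjacency matrix is made column-stochastic, then expansion (matrix squaring) and inflation (elementwise powering with renormalization) are alternated until the flow settles into clusters. The 5-drug graph below is hypothetical, built with two obvious groups; it is not the paper's synergy data.

```python
import numpy as np

# Toy drug graph: drugs 0-2 are mutually connected, as are drugs 3-4.
A = np.array([
    [0, 1, 1, 0, 0],
    [1, 0, 1, 0, 0],
    [1, 1, 0, 0, 0],
    [0, 0, 0, 0, 1],
    [0, 0, 0, 1, 0],
], dtype=float)
A += np.eye(5)                          # self-loops stabilize the iteration

M = A / A.sum(axis=0)                   # column-stochastic transition matrix
for _ in range(50):
    M = np.linalg.matrix_power(M, 2)    # expansion
    M = M ** 2                          # inflation (r = 2)
    M = M / M.sum(axis=0)               # renormalize columns

# Each surviving row's non-zero columns form a cluster; duplicates merge.
clusters = {frozenset(np.flatnonzero(M[i] > 1e-6))
            for i in range(5) if M[i].sum() > 1e-6}
```

In ClusterSyn's pipeline the edge weights would be synergy scores rather than 0/1, and the resulting clusters feed the similarity vectors passed to the neural network.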
Procedia PDF Downloads 79
9520 Design Criteria for an Internal Information Technology Cost Allocation to Support Business Information Technology Alignment
Authors: Andrea Schnabl, Mario Bernhart
Abstract:
The controlling instrument of an internal cost allocation (IT chargeback) is commonly used to make IT costs transparent and controllable. Information Technology (IT) has become a central competitive factor, especially for information industries. Consequently, the focus is not on minimizing IT costs but on the strategically aligned application of IT. Hence, an internal IT cost allocation should be designed to enhance business-IT alignment (the strategic alignment of IT) in order to support the effective application of IT from a company's point of view. To identify design criteria for an internal cost allocation that supports business alignment, a case study analysis at a typical medium-sized firm in the information industry is performed. Documents, key performance indicators, and cost accounting data over a period of 10 years are analyzed, and interviews are conducted. The derived design criteria are evaluated by six heads of IT departments from six different companies that have an internal IT cost allocation in use. By applying these design criteria, an internal cost allocation serves not only for cost controlling but also as an instrument of strategic IT management.
Keywords: accounting for IT services, business-IT alignment, internal cost allocation, IT controlling, IT governance, strategic IT management
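The chargeback mechanism itself can be sketched as usage-proportional allocation: each IT service's total cost is split across business units by a consumption driver. The services, drivers, and figures below are hypothetical, not the case-study firm's data.

```python
# Hypothetical service catalog: total cost plus per-unit consumption drivers
# (storage in TB, helpdesk in tickets).
services = {
    "storage":  {"total_cost": 120_000,
                 "usage": {"sales": 40, "r_and_d": 60, "hr": 20}},
    "helpdesk": {"total_cost": 90_000,
                 "usage": {"sales": 300, "r_and_d": 100, "hr": 200}},
}

def chargeback(services):
    """Allocate each service's cost to units in proportion to usage."""
    charges = {}
    for svc in services.values():
        total_usage = sum(svc["usage"].values())
        for unit, used in svc["usage"].items():
            charges[unit] = charges.get(unit, 0) \
                + svc["total_cost"] * used / total_usage
    return charges

charges = chargeback(services)
```

A design aimed at business-IT alignment would choose drivers the business units can actually steer, so the charge signals strategic IT consumption rather than raw cost recovery.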
Procedia PDF Downloads 155
9519 Migrant Youth: Trauma-Informed Interventions
Authors: Nancy Daly
Abstract:
Migrant youth who have experienced traumatic events in their home countries or in their passage to the United States may require interventions or formal services to support varying levels and types of needs. The manner in which such youth are engaged and evaluated, as well as the framework of evaluation, can impact their educational services and placement. Evidence-based trauma-informed practices that engage and support migrant youth serve as an important bridge to stabilization; however, ensuring long-term growth may require a range of integrated services, including special education and mental health services. Special education evaluations that consider eligibility under Emotional Disturbance for migrant youth must carefully weigh mental health needs against the exclusionary criteria of lack of access to education, limited language skills, and other environmental factors. Case studies of recently arrived migrant youth reveal both commonalities and differences in the types and levels of need, which underscores the importance of adept evaluation and case management to ensure the provision of services that support growth and resiliency.
Keywords: migrant youth, trauma-informed care, mental health services, special education
Procedia PDF Downloads 125
9518 Functional Neural Network for Decision Processing: A Racing Network of Programmable Neurons Where the Operating Model Is the Network Itself
Authors: Frederic Jumelle, Kelvin So, Didan Deng
Abstract:
In this paper, we introduce a model of artificial general intelligence (AGI), the functional neural network (FNN), for modeling human decision-making processes. The FNN is composed of multiple artificial mirror neurons (AMN) racing in the network. Each AMN has a similar structure, programmed independently by the users and composed of an intention wheel, a motor core, and a sensory core racing at a specific velocity. The mathematics of the node's formulation and the racing mechanism of multiple nodes in the network will be discussed; the group decision process with fuzzy logic and the transformation of these conceptual methods into practical methods of simulation and operation will be developed. Eventually, we will describe some possible future research directions in the fields of finance, education, and medicine, including the opportunity to design an intelligent learning agent with applications in AGI. We believe that the FNN has promising potential to transform the way we compute decision-making and to lead to a new generation of AI chips for seamless human-machine interactions (HMI).
Keywords: neural computing, human-machine interaction, artificial general intelligence, decision processing
Procedia PDF Downloads 125
9517 Assessing Online Learning Paths in a Learning Management System Using a Data Mining and Machine Learning Approach
Authors: Alvaro Figueira, Bruno Cabral
Abstract:
Nowadays, students are used to being assessed through an online platform. Educators have stepped up from a period in which they endured the transition from paper to digital. The use of a diversified set of question types, ranging from quizzes to open questions, is currently common in most university courses. In many courses today, the evaluation methodology also fosters the students' online participation in forums, the download and upload of modified files, and even participation in group activities. At the same time, new pedagogical theories that promote the active participation of students in the learning process and the systematic use of problem-based learning are being adopted, using an eLearning system for that purpose. However, although these activities can generate a lot of feedback for students, it is usually restricted to the assessment of well-defined online tasks. In this article, we propose an automatic system that informs students of abnormal deviations from a 'correct' learning path in the course. Our approach is based on the premise that obtaining this information earlier in the semester may provide students and educators an opportunity to resolve an eventual problem regarding the student's current online actions towards the course. Our goal is to prevent situations that have a significant probability of leading to a poor grade and, eventually, to failing. In the major learning management systems (LMS) currently available, the interaction between the students and the system itself is registered in log files, in the form of records that mark the beginning of actions performed by the user. Our proposed system uses that logged information to derive new information: the time each student spends on each activity, the time and order of the resources used by the student, and, finally, the online resource usage pattern.
Then, using the grades assigned to the students in previous years, we built a learning dataset that is used to feed a machine learning meta-classifier. The produced classification model is then used to predict the grade a learning path is heading to in the current year. This approach serves not only the teacher but also the student, who receives automatic feedback on his or her current situation, with past years as a perspective. Our system can be applied to online courses that integrate the use of an online platform that stores user actions in a log file, and that has access to other students’ evaluations. The system is based on a data mining process on the log files and on a self-feedback machine learning algorithm that works paired with the Moodle LMS.
Keywords: data mining, e-learning, grade prediction, machine learning, student learning path
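The log-derivation step described above can be sketched in plain Python, attributing the interval between two consecutive log records to the earlier action. The event log below is hypothetical, and a real Moodle log would first need parsing:

```python
from datetime import datetime

def time_per_activity(events):
    """Derive, from ordered (timestamp, activity) log records, the total
    time a student spent on each activity: the interval between two
    consecutive records is attributed to the earlier activity."""
    totals = {}
    for (t0, activity), (t1, _) in zip(events, events[1:]):
        delta = (t1 - t0).total_seconds()
        totals[activity] = totals.get(activity, 0.0) + delta
    return totals

# Hypothetical log of beginning-of-action records, as stored by an LMS.
log = [
    (datetime(2024, 3, 1, 10, 0), "quiz_1"),
    (datetime(2024, 3, 1, 10, 20), "forum"),
    (datetime(2024, 3, 1, 10, 25), "quiz_1"),
    (datetime(2024, 3, 1, 10, 40), "logout"),
]
print(time_per_activity(log))  # {'quiz_1': 2100.0, 'forum': 300.0}
```

Features of this kind (together with the order of resources used) would then feed the meta-classifier trained on previous years' grades.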
Procedia PDF Downloads 122
9516 A Closer Look at Economic and Fiscal Incentives for Digital TV Industry
Authors: Yunita Anwar, Maya Safira Dewi
Abstract:
With the increasing importance of the digital TV industry, several incentives must be given to support the growth of the industry. Prior research has found mixed effects of economic and fiscal incentives on economic growth, which means these incentives do not necessarily boost economic growth while providing support to a particular industry. Focusing on the setting of the digital TV transition in Indonesia, this research conducts a document analysis of the incentives given in other countries and the incentives currently available in Indonesia. Our results recommend that a VAT exemption and local tax incentives be considered for addition to the list of incentives available to the digital TV industry.
Keywords: digital TV transition, economic incentives, fiscal incentives, policy
Procedia PDF Downloads 325
9515 Using Hyperspectral Sensor and Machine Learning to Predict Water Potentials of Wild Blueberries during Drought Treatment
Authors: Yongjiang Zhang, Kallol Barai, Umesh R. Hodeghatta, Trang Tran, Vikas Dhiman
Abstract:
Detecting water stress on crops early and accurately is crucial to minimize its impact. This study aims to measure water stress in wild blueberry crops non-destructively by analyzing proximal hyperspectral data. The data collection took place in the summer growing season of 2022. A drought experiment was conducted on wild blueberries in a randomized block design in the greenhouse, incorporating various genotypes and irrigation treatments. Hyperspectral data (spectral range: 400-1000 nm) were collected from wild blueberry plants using a handheld spectroradiometer, and leaf water potential data were collected using a pressure chamber. Machine learning techniques, including multiple regression analysis and random forest models, were employed to predict leaf water potential (MPa). We explored the optimal wavelength bands for simple differences (RY1 - RY2), simple ratios (RY1/RY2), and normalized differences ((RY1 - RY2)/(RY1 + RY2)). NDWI ((R857 - R1241)/(R857 + R1241)), SD (R2188 - R2245), and SR (R1752/R1756) emerged as the top predictors of leaf water potential, contributing significantly to the highest model performance. The base learner models achieved an R-squared value of approximately 0.81, indicating their capacity to explain 81% of the variance. Research is underway to develop a neural vegetation index (NVI) that automates the process of index development by searching for specific wavelengths in the space of ratios of linear functions of reflectance. The NVI framework could work across species and predict different physiological parameters.
Keywords: hyperspectral reflectance, water potential, spectral indices, machine learning, wild blueberries, optimal bands
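The band combinations above follow simple closed forms; a minimal sketch, assuming a reflectance spectrum keyed by wavelength in nm (the values below are hypothetical, not measured data):

```python
def simple_difference(r1, r2):
    """Simple difference of two reflectance bands, R1 - R2."""
    return r1 - r2

def simple_ratio(r1, r2):
    """Simple ratio of two reflectance bands, R1 / R2."""
    return r1 / r2

def normalized_difference(r1, r2):
    """Normalized difference of two reflectance bands, (R1 - R2)/(R1 + R2)."""
    return (r1 - r2) / (r1 + r2)

# Hypothetical reflectance spectrum, keyed by wavelength in nm.
refl = {857: 0.45, 1241: 0.30, 1752: 0.22, 1756: 0.20, 2188: 0.15, 2245: 0.12}

ndwi = normalized_difference(refl[857], refl[1241])  # about 0.2
sd = simple_difference(refl[2188], refl[2245])       # about 0.03
sr = simple_ratio(refl[1752], refl[1756])            # about 1.1
```

Index values computed this way would be the inputs to the regression and random forest models used to predict leaf water potential.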
Procedia PDF Downloads 67
9514 Implementation of Correlation-Based Data Analysis as a Preliminary Stage for the Prediction of Geometric Dimensions Using Machine Learning in the Forming of Car Seat Rails
Authors: Housein Deli, Loui Al-Shrouf, Hammoud Al Joumaa, Mohieddine Jelali
Abstract:
When forming metallic materials, fluctuations in material properties, process conditions, and wear lead to deviations in the component geometry. Several hundred features sometimes need to be measured, especially in the case of functional and safety-relevant components. These can only be measured offline due to the large number of features and the accuracy requirements. The risk of producing components outside the tolerances is minimized but not eliminated by the statistical evaluation of process capability and control measurements. The inspection intervals are based on the acceptable risk and are at the expense of productivity but remain reactive and, in some cases, considerably delayed. Due to the considerable progress made in the field of condition monitoring and measurement technology, permanently installed sensor systems in combination with machine learning and artificial intelligence, in particular, offer the potential to independently derive forecasts for component geometry and thus eliminate the risk of defective products, actively and preventively. The reliability of forecasts depends on the quality, completeness, and timeliness of the data. Measuring all geometric characteristics is neither sensible nor technically possible. This paper, therefore, uses the example of car seat rail production to discuss the necessary first step of feature selection and reduction by correlation analysis, as otherwise it would not be possible to forecast components inline and in real time. Four different car seat rails with an average of 130 features were selected and measured using a coordinate measuring machine (CMM). Running such measuring programs alone takes up to 20 minutes. In practice, this results in the risk of faulty production of at least 2000 components that have to be sorted or scrapped if the measurement results are negative. 
Over a period of 2 months, all measurement data (>200 measurements per variant) were collected and evaluated using correlation analysis. As part of this study, the number of characteristics to be measured for all 6 car seat rail variants was reduced by over 80%. Specifically, direct correlations were proven for almost 100 of an average of 125 characteristics across 4 different products. A further 10 features correlate via indirect relationships, so that the number of features required for a prediction could be reduced to less than 20. A correlation coefficient >0.8 was required for all correlations.
Keywords: long-term SHM, condition monitoring, machine learning, correlation analysis, component prediction, wear prediction, regression analysis
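The correlation-based reduction described above can be sketched as a greedy filter: a feature is kept only if its correlation with every already-kept feature stays below the threshold, so strongly correlated features are dropped. The rail characteristics and values below are hypothetical:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def reduce_features(columns, threshold=0.8):
    """Greedily keep a feature only if |r| with every already-kept
    feature is below the threshold; correlated features are dropped
    because they can be predicted from the kept ones."""
    kept = []
    for name, values in columns.items():
        if all(abs(pearson(values, columns[k])) < threshold for k in kept):
            kept.append(name)
    return kept

# Hypothetical measurements for three characteristics of one rail variant.
cols = {
    "width":  [10.0, 10.2, 10.1, 10.3, 10.2],
    "height": [5.0, 5.1, 5.05, 5.15, 5.1],   # tracks width exactly: dropped
    "angle":  [1.0, 0.8, 1.1, 0.7, 1.2],     # weakly correlated: kept
}
print(reduce_features(cols))  # ['width', 'angle']
```

With measured data, the dropped characteristics would be reconstructed from the kept ones via the proven direct or indirect correlations.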
Procedia PDF Downloads 50
9513 Rd-PLS Regression: From the Analysis of Two Blocks of Variables to Path Modeling
Authors: E. Tchandao Mangamana, V. Cariou, E. Vigneau, R. Glele Kakai, E. M. Qannari
Abstract:
A new definition of a latent variable associated with a dataset makes it possible to propose variants of the PLS2 regression and the multi-block PLS (MB-PLS). We shall refer to these variants as Rd-PLS regression and Rd-MB-PLS respectively because they are inspired by both Redundancy analysis and PLS regression. Usually, a latent variable t associated with a dataset Z is defined as a linear combination of the variables of Z with the constraint that the length of the loading weights vector equals 1. Formally, t=Zw with ‖w‖=1. Denoting by Z' the transpose of Z, we define herein, a latent variable by t=ZZ’q with the constraint that the auxiliary variable q has a norm equal to 1. This new definition of a latent variable entails that, as previously, t is a linear combination of the variables in Z and, in addition, the loading vector w=Z’q is constrained to be a linear combination of the rows of Z. More importantly, t could be interpreted as a kind of projection of the auxiliary variable q onto the space generated by the variables in Z, since it is collinear to the first PLS1 component of q onto Z. Consider the situation in which we aim to predict a dataset Y from another dataset X. These two datasets relate to the same individuals and are assumed to be centered. Let us consider a latent variable u=YY’q to which we associate the variable t= XX’YY’q. Rd-PLS consists in seeking q (and therefore u and t) so that the covariance between t and u is maximum. The solution to this problem is straightforward and consists in setting q to the eigenvector of YY’XX’YY’ associated with the largest eigenvalue. For the determination of higher order components, we deflate X and Y with respect to the latent variable t. Extending Rd-PLS to the context of multi-block data is relatively easy. Starting from a latent variable u=YY’q, we consider its ‘projection’ on the space generated by the variables of each block Xk (k=1, ..., K) namely, tk= XkXk'YY’q. 
Thereafter, Rd-MB-PLS seeks q in order to maximize the average of the covariances of u with tk (k=1, ..., K). The solution to this problem is given by q, eigenvector of YY’XX’YY’, where X is the dataset obtained by horizontally merging datasets Xk (k=1, ..., K). For the determination of latent variables of order higher than 1, we use a deflation of Y and Xk with respect to the variable t= XX’YY’q. In the same vein, extending Rd-MB-PLS to the path modeling setting is straightforward. Methods are illustrated on the basis of case studies and performance of Rd-PLS and Rd-MB-PLS in terms of prediction is compared to that of PLS2 and MB-PLS.Keywords: multiblock data analysis, partial least squares regression, path modeling, redundancy analysis
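The first Rd-PLS component described above reduces to a single eigendecomposition; a minimal numerical sketch with NumPy, on hypothetical random blocks X and Y:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 7))   # hypothetical predictor block
Y = rng.standard_normal((20, 3))   # hypothetical response block
X -= X.mean(axis=0)                # both blocks are assumed centered
Y -= Y.mean(axis=0)

XXt, YYt = X @ X.T, Y @ Y.T

# q is the eigenvector of YY'XX'YY' associated with the largest eigenvalue.
# The matrix is symmetric (S1*S2*S1 with S1, S2 symmetric), so eigh applies.
M = YYt @ XXt @ YYt
eigvals, eigvecs = np.linalg.eigh(M)   # eigenvalues in ascending order
q = eigvecs[:, -1]                     # auxiliary variable, unit norm

u = YYt @ q          # latent variable associated with Y
t = XXt @ YYt @ q    # latent variable associated with X
# t'u = q'Mq is maximal over unit-norm q, by the Rayleigh quotient.
```

Higher-order components would follow by deflating X and Y with respect to t and repeating, as described above.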
Procedia PDF Downloads 147
9512 Machine Learning and Internet of Things for Smart-Hydrology of the Mantaro River Basin
Authors: Julio Jesus Salazar, Julio Jesus De Lama
Abstract:
The fundamental objective of hydrological studies applied to the engineering field is to determine the statistically consistent volumes or water flows that, in each case, allow us to size or design a series of elements or structures to effectively manage and develop a river basin. To determine these values, there are several ways of working within the framework of traditional hydrology: (1) study each of the factors that influence the hydrological cycle, (2) study the historical behavior of the hydrology of the area, (3) study the historical behavior of hydrologically similar zones, and (4) other studies (rain simulators or experimental basins). Of course, this range of studies in a given basin is varied and complex, and presents the difficulty of collecting the data in real time. This complexity can only be overcome by collecting and transmitting data to decision centers through the Internet of Things and artificial intelligence. Thus, this research work implemented the learning project of the sub-basin of the Shullcas river in the Andean basin of the Mantaro river in Peru. The sensor firmware to collect and communicate hydrological parameter data was programmed and tested in similar basins of the European Union. The machine learning application was programmed to choose the algorithms that lead to the best solution for the determination of the rainfall-runoff relationship captured in the different polygons of the sub-basin. Tests were carried out in the mountains of Europe and in the sub-basins of the Shullcas river (Huancayo) and the Yauli river (Jauja), at altitudes close to 5000 m a.s.l., yielding the following conclusions: to guarantee correct communication, the distance between devices should not exceed 15 km. 
To minimize the energy consumption of the devices and avoid collisions between packets, distances should be between 5 and 10 km; in this way, the transmission power can be reduced and a higher bitrate can be used. In case the communication elements of the devices of the network (Internet of Things) installed in the basin do not have good visibility between them, the distance should be reduced to the range of 1-3 km. The energy efficiency of the Atmel microcontrollers present in Arduino is not adequate to meet the requirements of system autonomy. To increase the autonomy of the system, it is recommended to use low-consumption systems, such as ultra-low-power ARM Cortex microcontrollers (e.g., the Cortex-M series), and high-performance DC-DC converters. The machine learning system has initiated the learning of the Shullcas system to generate the best hydrology of the sub-basin. This will improve as the machine learning models are updated with the data entered into the big data store every second. This will provide services to each of the applications of the complex system, returning the best estimates of the determined flows.
Keywords: hydrology, internet of things, machine learning, river basin
Procedia PDF Downloads 160
9511 Key Concepts of 5th Generation Mobile Technology
Authors: Magri Hicham, Noreddine Abghour, Mohamed Ouzzif
Abstract:
The 5th generation of mobile networks is a term used in various research papers and projects to identify the next major phase of mobile telecommunications standards. 5G wireless networks will support higher peak data rates and lower latency and provide the best connections with QoS guarantees. In this article, we discuss various promising technologies for 5G wireless communication systems, such as IPv6 support, the World Wide Wireless Web (WWWW), Dynamic Adhoc Wireless Networks (DAWN), Beam Division Multiple Access (BDMA), cloud computing, and cognitive radio technology.
Keywords: WWWW, BDMA, DAWN, 5G, 4G, IPv6, cloud computing
Procedia PDF Downloads 514
9510 Health, Social Integration and Social Justice: The Lived Experiences of Young Middle-Eastern Refugees in Australia
Authors: Pranee Liamputtong, Hala Kurban
Abstract:
Based on therapeutic landscape theory, this paper examines how young Middle-Eastern refugees perceive their health and well-being, the barriers they face in their new homeland, and the means that helped them to form social connections in their new social environment. Qualitative methods (in-depth interviews and mapping activities) were conducted with ten young people from refugee backgrounds. A thematic analysis method was used to analyse the data. Findings suggested that the young refugees face various structural and cultural inequalities that significantly influence their health and well-being. Mental well-being was their greatest health concern. All reported the significant influence the English language had on their ability to adapt and form connections with their social environment. The presence of positive social support in their new social environment had a great impact on the health and well-being of the participants. The findings of this study have implications for social justice among refugees. They also highlight the role of therapeutic landscapes and social support in helping young refugees to feel that they belong in society, and hence assist them in adapting to their new living situation.
Keywords: young refugees, Middle-Eastern, social support, social justice
Procedia PDF Downloads 357
9509 Profiling Risky Code Using Machine Learning
Authors: Zunaira Zaman, David Bohannon
Abstract:
This study explores the application of machine learning (ML) for detecting security vulnerabilities in source code. The research aims to assist organizations with large application portfolios and limited security testing capabilities in prioritizing security activities. ML-based approaches offer benefits such as confidence scores, tuning of false positives and negatives, and automated feedback. The initial approach, using natural language processing techniques to extract features, achieved 86% accuracy during the training phase but suffered from overfitting and performed poorly on unseen datasets during testing. To address these issues, the study proposes using the abstract syntax tree (AST) for Java and C++ codebases to capture code semantics and structure and to generate path-context representations for each function. The Code2Vec model architecture is used to learn distributed representations of source code snippets for training a machine learning classifier for vulnerability prediction. The study evaluates the performance of the proposed methodology using two datasets and compares the results with existing approaches. The Devign dataset yielded 60% accuracy in predicting vulnerable code snippets and helped resist overfitting, while the Juliet Test Suite enabled prediction of specific vulnerabilities such as OS command injection, cryptographic, and cross-site scripting vulnerabilities. The Code2Vec model achieved 75% accuracy and a 98% recall rate in predicting OS command injection vulnerabilities. The study concludes that even partial AST representations of source code can be useful for vulnerability prediction. The approach has the potential for automated intelligent analysis of source code, including vulnerability prediction on unseen source code. State-of-the-art models using natural language processing techniques and CNN models with ensemble modelling techniques did not generalize well on unseen data and faced overfitting issues. 
However, predicting vulnerabilities in source code using machine learning poses challenges such as high dimensionality and complexity of source code, imbalanced datasets, and identifying specific types of vulnerabilities. Future work will address these challenges and expand the scope of the research.
Keywords: code embeddings, neural networks, natural language processing, OS command injection, software security, code properties
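The path-context idea above can be illustrated on a toy scale with Python's own `ast` module (the study targets Java and C++ codebases, so this is only an analogy): for each pair of leaves, the context is the chain of node types running up from one leaf through their lowest common ancestor and down to the other.

```python
import ast

def leaves(tree):
    """Collect (token, root-to-leaf path of AST node types) for the
    identifier and constant leaves of a Python AST."""
    found = []
    def walk(node, path):
        path = path + [type(node).__name__]
        if isinstance(node, ast.Name):
            found.append((node.id, path))
        elif isinstance(node, ast.Constant):
            found.append((repr(node.value), path))
        for child in ast.iter_child_nodes(node):
            walk(child, path)
    walk(tree, [])
    return found

def path_contexts(source):
    """For each pair of leaves, emit (token1, path, token2), where the path
    runs up from the first leaf to the lowest common ancestor and down to
    the second leaf: the representation Code2Vec-style models consume."""
    lv = leaves(ast.parse(source))
    contexts = []
    for i, (tok1, p1) in enumerate(lv):
        for tok2, p2 in lv[i + 1:]:
            lca = 0
            while lca < min(len(p1), len(p2)) and p1[lca] == p2[lca]:
                lca += 1
            path = list(reversed(p1[lca:])) + [p1[lca - 1]] + p2[lca:]
            contexts.append((tok1, "^".join(path), tok2))
    return contexts

print(path_contexts("def f(x):\n    return x + 1"))
# [('x', 'Name^BinOp^Constant', '1')]
```

A vulnerability classifier would then embed and aggregate these contexts per function rather than work on raw token sequences.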
Procedia PDF Downloads 107
9508 Small and Medium-Sized Enterprises in West African Semi-Arid Lands Facing Climate Change
Authors: Mamadou Diop, Florence Crick, Momadou Sow, Kate Elizabeth Gannon
Abstract:
Understanding SME leaders’ responses to climate change is essential to cope with ongoing changes in temperature and rainfall. This study analyzes the responses of SME leaders to the adverse effects of climate change in semi-arid lands (SAL) in Senegal. Based on surveys administered to 161 SME leaders, this research shows that 91% of economic units are affected by climatic conditions, although 70% do not have a plan to deal with climate risks. Economic actors have striven to take measures to adapt. However, their efforts are limited by various obstacles accentuated by a lack of support from public authorities. Accordingly, substantial political, institutional and financial efforts at national and local levels are needed to promote an enabling environment for economic actors to adapt. These efforts should focus on information and training about the threats and opportunities related to global warming, the creation of an adaptation support fund to support local initiatives, and the improvement of the institutional, regulatory and political framework.
Keywords: small and medium-sized enterprises, climate change, adaptation, semi-arid lands
Procedia PDF Downloads 208
9507 A Neuron Model of Facial Recognition and Detection of an Authorized Entity Using Machine Learning System
Authors: J. K. Adedeji, M. O. Oyekanmi
Abstract:
This paper critically examines the use of machine learning procedures in curbing unauthorized access to valuable areas of an organization. The use of passwords, PIN codes, and user identification has in recent times been only partially successful in curbing crimes involving identities, hence the need for the design of a system which incorporates biometric characteristics such as DNA and pattern recognition of variations in facial expressions. The facial model is based on the OpenCV library, which makes use of certain physiological features; a Raspberry Pi 3 module is used to compile the OpenCV library, which extracts the faces detected through the camera and stores them in the datasets directory. The model is trained with a 50-epoch run on the database, and recognition is performed by the Local Binary Pattern Histogram (LBPH) recognizer contained in OpenCV. The training algorithm used by the neural network is backpropagation, coded in Python with 200-epoch runs, to identify specific resemblance in the exclusive OR (XOR) output neurons. The research confirmed that physiological parameters are more effective measures for curbing crimes relating to identities.
Keywords: biometric characters, facial recognition, neural network, OpenCV
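The Local Binary Pattern at the core of the LBPH recognizer mentioned above is simple to state; a minimal pure-Python sketch, on a hypothetical grayscale patch, of the per-pixel code and the histogram that the recognizer compares between faces:

```python
def lbp_code(img, y, x):
    """8-neighbour Local Binary Pattern code of pixel (y, x): each
    neighbour contributes a bit that is set when its value is >= the
    centre value."""
    centre = img[y][x]
    # neighbours visited clockwise from the top-left corner
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dy, dx) in enumerate(offsets):
        if img[y + dy][x + dx] >= centre:
            code |= 1 << bit
    return code

def lbp_histogram(img):
    """256-bin histogram of LBP codes over the interior pixels; the LBPH
    recognizer compares such histograms (per grid cell) between faces."""
    hist = [0] * 256
    for y in range(1, len(img) - 1):
        for x in range(1, len(img[0]) - 1):
            hist[lbp_code(img, y, x)] += 1
    return hist

# Hypothetical 3x3 grayscale patch: only the centre pixel gets an LBP code.
patch = [[10, 20, 30],
         [40, 50, 60],
         [70, 80, 90]]
print(lbp_code(patch, 1, 1))  # 120
```

In the OpenCV recognizer the face image is divided into a grid, one histogram is computed per cell, and the concatenated histograms are compared against the stored dataset.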
Procedia PDF Downloads 256
9506 The Prevalence of Symptoms of Common Mental Disorders Among Professional Golfers
Authors: Georgia Hopley, Andrew Murray, Alan Macpherson
Abstract:
Objectives: This study aims to (i) assess the prevalence of symptoms of mental health disorders among a cohort of professional golfers, (ii) compare prevalence values with data from the general population and other elite athlete cohorts, and (iii) assess how players cope with mental health problems and players’ opinions on the mental health support services available to them. Methods: Players competing on the 2020 Challenge Tour (n=261) were sent a questionnaire that assessed symptoms of depression, distress, anxiety, sleep disturbance, and obsessive-compulsive disorder. Questions were also included to assess coping behaviors and opinions on current support measures. Results: The two-week symptom prevalence was 10.3% for depression, 51.7% for distress, 8.6% for anxiety, 10.3% for sleep disturbance, 13.8% for obsessive thoughts, and 27.6% for compulsive behavior. The prevalence of symptoms is comparable with other elite athlete cohorts, and symptoms of anxiety and distress were reported more frequently than in the general population. 67% of players who had experienced a mental health issue did not seek professional help at the time, and 61% of players did not think sufficient support was available to them. Conclusion: Mental health problems are prevalent among elite golfers; however, this study demonstrates that the majority of players do not seek help from professionally accredited practitioners. Following the discussion of this study, the European Tour Group now provides a 24/7 mental health crisis hotline for players and has educated staff members on how to identify players with mental health issues and signpost them to the appropriate support.Keywords: elite athletes, golf, mental health, sport science, sport psychiatry
Procedia PDF Downloads 62
9505 A Co-Relational Descriptive Study to Assess the Impact of Cancer Event on Self, Family, Coping Level of Cancer Clients and Quality of Life among Them
Authors: Padma Sree Potru
Abstract:
A co-relational descriptive study was conducted to assess the impact of the cancer event on self and on family, the coping strategies of cancer clients, and the quality of life among them, in G.G.H., Guntur, Andhra Pradesh, India. Aim: The aim of the study was to investigate the impact of the cancer event on self, on family, coping of clients, and quality of life among cancer patients. Methods: 50 cancer patients were selected through a random sampling technique. The data were obtained by using the impact of events scale, the impact on family scale, the coping health inventory, and the WHOQOL-BREF scale. Results: The results revealed that the largest group (32%) were in the age group of 36-45 years, 72% were females, 44% had an income of Rs. 5001-10000 per month, 40% were working for daily wages, and 15% were newly diagnosed with cancer. Among the 50 cancer patients, 65% had an extreme impact of events, 61% showed an extreme impact on family, 46% possessed minimal coping strategies, and 68% had a poor quality of life. This study found a strong positive correlation between quality of life and coping behavior (r=0.603) and between impact of event and impact on family (r=0.610), but a negative correlation between quality of life and impact of events (r=-0.201). An ANOVA test revealed a significant difference between the subscales of impact on family and coping behavior, with F values of 3.893 and 3.957, respectively. Chi-square tests highlighted a significant association between impact of events and age and occupation, and between impact on family and duration of illness. Conclusion: Even though cancer is a dreadful disease, there are many emerging treatment modalities and innovative procedures focused on improving the standard of life among cancer clients. 
But all this can happen only when clients accept the reality, increase their willpower, confidence, and desire to live, focus on coping mechanisms, and have good ongoing support from their family members.
Keywords: impact of event, impact on family, coping, quality of life
Procedia PDF Downloads 451
9504 Exploring the Potential of Replika: An AI Chatbot for Mental Health Support
Authors: Nashwah Alnajjar
Abstract:
This research paper provides an overview of Replika, an AI chatbot application that uses natural language processing technology to engage in conversations with users. The app was developed to provide users with a virtual AI friend who can converse with them on various topics, including mental health. This study explores the experiences of Replika users through a quantitative research methodology. A survey was conducted with 12 participants to collect data on their demographics, usage patterns, and experiences with the Replika app. The results showed that Replika has the potential to play a role in mental health support and well-being.
Keywords: Replika, chatbot, mental health, artificial intelligence, natural language processing
Procedia PDF Downloads 86