1476 An Early Detection Type 2 Diabetes Using K-Nearest Neighbor Algorithm
Authors: Ng Liang Shen, Ngahzaifa Abdul Ghani
Abstract:
This research aimed at developing an early warning system for pre-diabetics and diabetics by analyzing simple and easily determinable signs and symptoms of diabetes among people living in Malaysia, using a Particle Swarm Optimized Artificial Neural Network. With the skyrocketing prevalence of Type 2 diabetes in Malaysia, the system can be used to encourage affected people to seek further medical attention, to prevent the onset of diabetes or to start managing it early enough to avoid the associated complications. The study sought to find the best predictive variables of Type 2 Diabetes Mellitus, developed a system to diagnose diabetes from those variables using Artificial Neural Networks, and tested the system's accuracy in recognizing the patterns generated from diabetes diagnosis results by machine learning algorithms, even at primary or advanced stages.
Keywords: diabetes diagnosis, Artificial Neural Networks, artificial intelligence, soft computing, medical diagnosis
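The k-nearest-neighbor classification named in the title can be sketched in a few lines: a new record is labeled by majority vote among the most similar training records. The features (fasting glucose, BMI) and all values below are illustrative assumptions, not the study's actual variables:

```python
import math

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest
    training points (Euclidean distance on the feature vector)."""
    by_dist = sorted(train, key=lambda rec: math.dist(rec[0], query))
    votes = [label for _, label in by_dist[:k]]
    return max(set(votes), key=votes.count)

# Toy records: (features = [fasting glucose mg/dL, BMI], label).
# Illustrative values only, not the study's real variables.
train = [
    ([90, 22.0], "normal"),
    ([95, 24.5], "normal"),
    ([100, 26.0], "normal"),
    ([150, 31.0], "pre-diabetic"),
    ([160, 33.5], "pre-diabetic"),
    ([170, 35.0], "pre-diabetic"),
]

print(knn_predict(train, [155, 32.0]))  # nearest neighbours are pre-diabetic
```

The same majority-vote scheme extends to any number of screening variables; only the feature vectors change.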
Procedia PDF Downloads 338
1475 AI-Driven Strategies for Sustainable Electronics Repair: A Case Study in Energy Efficiency
Authors: Badiy Elmabrouk, Abdelhamid Boujarif, Zhiguo Zeng, Stephane Borrel, Robert Heidsieck
Abstract:
In an era where sustainability is paramount, this paper introduces a machine learning-driven testing protocol to accurately predict diode failures, merging reliability engineering with failure physics to enhance the efficiency of repair operations. Our approach refines the burn-in process, significantly curtailing its duration, which not only conserves energy but also elevates productivity and mitigates component wear. A case study from GE HealthCare’s repair center vividly demonstrates the method’s effectiveness, recording a high diode-failure prediction rate and a substantial decrease in energy consumption that translates to an annual reduction of 6.5 tons of CO2 emissions. This advancement sets a benchmark for environmentally conscious practices in the electronics repair sector.
Keywords: maintenance, burn-in, failure physics, reliability testing
Procedia PDF Downloads 68
1474 Geared Turbofan with Water Alcohol Technology
Authors: Abhinav Purohit, Shruthi S. Pradeep
Abstract:
In today’s world, aviation industries use turbofan engines (a combination of turboprop and turbojet) that meet the obligatory requirements to be fuel efficient and to produce enough thrust to propel an aircraft. One can, however, imagine increasing the work output of this machine by reducing the input power. Striving to improve the technology, and especially to augment the efficiency of the engine, suggests a step change in turbofan engine development. One hopeful concept is to de-couple the fan, with the help of a reduction gearbox in a two-spool engine, from the rest of the machinery, so as to get more work output with maximum efficiency by reducing the load on the turbine shaft. Adapting this configuration gives an additional degree of freedom to better optimize each component at different speeds. Since the components run at different speeds, better efficiency can be achieved. Introducing a water-alcohol mixture to this concept would further improve the results.
Keywords: emissions, fuel consumption, more power, turbofan
Procedia PDF Downloads 437
1473 PredictionSCMS: The Implementation of an AI-Powered Supply Chain Management System
Authors: Ioannis Andrianakis, Vasileios Gkatas, Nikos Eleftheriadis, Alexios Ellinidis, Ermioni Avramidou
Abstract:
The paper discusses the main aspects involved in the development of a supply chain management system, using the newly developed PredictionSCMS software as a basis for the discussion. The discussion is focused on three topics: the first is demand forecasting, where we present the predictive algorithms implemented and discuss related concepts such as the calculation of the safety stock, the effect of out-of-stock days, etc. The second topic concerns the design of a supply chain, where the core parameters involved in the process are given, together with a methodology for incorporating these parameters in a meaningful order creation strategy. Finally, the paper discusses some critical events that can happen during the operation of a supply chain management system and how the developed software notifies the end user about their occurrence.
Keywords: demand forecasting, machine learning, risk management, supply chain design
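The safety-stock calculation mentioned in the abstract is commonly done with the textbook formula SS = z * sigma_d * sqrt(L); whether PredictionSCMS uses exactly this form is an assumption, and the numbers below are purely illustrative:

```python
import math

def safety_stock(z, sigma_daily, lead_time_days):
    # Buffer against demand variability over the replenishment lead time.
    return z * sigma_daily * math.sqrt(lead_time_days)

def reorder_point(mean_daily_demand, lead_time_days, z, sigma_daily):
    # Reorder when stock falls to expected lead-time demand plus safety stock.
    return (mean_daily_demand * lead_time_days
            + safety_stock(z, sigma_daily, lead_time_days))

# z = 1.65 targets roughly a 95% cycle service level (illustrative values).
ss = safety_stock(z=1.65, sigma_daily=20, lead_time_days=9)
rop = reorder_point(100, 9, z=1.65, sigma_daily=20)
print(round(ss, 2), round(rop, 2))   # 99.0 999.0
```

Out-of-stock days would bias the demand estimates downward, which is presumably why the abstract treats them as a separate concern.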
Procedia PDF Downloads 98
1472 An Experimental Study on Ultrasonic Machining of Pure Titanium Using Full Factorial Design
Authors: Jatinder Kumar
Abstract:
Ultrasonic machining is one of the most widely used non-traditional machining processes for machining materials that are relatively brittle, hard and fragile, such as advanced ceramics, refractories, crystals, quartz, etc. There is a considerable lack of research on its application to the cost-effective machining of tough materials such as titanium. In this investigation, the application of the USM process for machining of titanium (ASTM Grade-I) has been explored. Experiments have been conducted to assess the effect of different parameters of the USM process on machining rate and tool wear rate as response characteristics. The process parameters included in this study are: abrasive grit size, tool material and power rating of the ultrasonic machine. It has been concluded that titanium is fairly machinable with the USM process. Significant improvement in the machining rate can be realized by manipulating the process parameters and obtaining the optimum combination of these parameters.
Keywords: abrasive grit size, tool material, titanium, ultrasonic machining
Procedia PDF Downloads 360
1471 Assessment of Metal Dynamics in Dissolved and Particulate Phase in Human Impacted Hooghly River Estuary, India
Authors: Soumita Mitra, Santosh Kumar Sarkar
Abstract:
Hooghly river estuary (HRE), situated at the north-eastern part of the Bay of Bengal, has global significance due to its holiness. It is of immense importance to the local population, as it gives a perpetual water supply for various activities such as transportation, fishing, boating, bathing, etc. to the people settled on both banks of the estuary. This study was done to assess dissolved and particulate trace metals in the estuary, covering a stretch of about 175 km. Water samples were collected from the surface (0-5 cm) along the salinity gradient, and metal concentrations were studied in both the dissolved and particulate phases using a Graphite Furnace Atomic Absorption Spectrophotometer (GF-AAS), along with some physical characteristics such as water temperature, salinity, pH, turbidity and total dissolved solids. Though not much significant spatial variation was noticed, a slight enrichment was found downstream of the estuary. The mean concentrations of the metals in the dissolved and particulate phases followed the same trend, as follows: Fe>Mn>Cr>Zn>Cu>Ni>Pb. The concentrations of the metals in the particulate phase were much greater than those in the dissolved phase, which was also depicted by the values of the partition coefficient Kd (ml mg-1). The Kd values ranged from 1.5x10^5 (for Pb) to 4.29x10^6 (for Cr). The high Kd value for Cr denotes that Cr is mostly bound to the suspended particulate matter, while the low value for Pb signifies its presence more in the dissolved phase. Moreover, the concentrations of all the studied metals in the dissolved phase were many folds higher than their respective permissible limits set by WHO (2008, 2009 and 2011).
On the other hand, according to Sediment Quality Guidelines (SQGs), Zn, Cu and Ni in the particulate phase lay between the ERL and ERM values, but Cr exceeded the ERM value at all stations, confirming that the estuary is mostly contaminated with particulate Cr, which might cause frequent adverse effects on aquatic life. Multivariate cluster analysis was also performed, which separated the stations according to the level of contamination from several point and nonpoint sources. Thus, it is found that the estuarine system is much polluted by toxic metals, and further investigation and toxicological studies should be implemented for a full risk assessment, better management and restoration of the water quality of this globally significant aquatic system.
Keywords: dissolved and particulate phase, Hooghly river estuary, partition coefficient, surface water, toxic metals
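The partition coefficient used above is simply the ratio of the particulate-bound concentration (per mg of suspended solids) to the dissolved concentration (per ml of water); the concentrations below are invented purely to reproduce the reported Kd range, not the study's measurements:

```python
def kd(c_particulate, c_dissolved):
    # Partition coefficient (ml mg^-1): higher values mean the metal
    # binds preferentially to suspended particulate matter.
    return c_particulate / c_dissolved

# Illustrative concentrations only (not the study's data):
kd_cr = kd(4.29, 1e-6)   # ~4.29e6, strongly particle-bound (like Cr)
kd_pb = kd(0.15, 1e-6)   # ~1.5e5, relatively more dissolved (like Pb)
print(kd_cr > kd_pb)     # Cr partitions to particles far more than Pb
```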
Procedia PDF Downloads 280
1470 Characterization of Practices among Pig Smallholders in Cambodia and Implications for Disease Risk
Authors: Phalla Miech, William Leung, Ty Chhay, Sina Vor, Arata Hidano
Abstract:
Smallholder pig farms (SPFs) are prevalent in Cambodia but are vulnerable to disease impacts, as evidenced by the recent incursion of African swine fever into the region. As part of the ‘PigFluCam+’ project, we sought to provide an updated picture of pig husbandry and biosecurity practices among SPFs in south-central Cambodia. A multi-stage sampling design was adopted to select study districts and villages within four provinces: Phnom Penh, Kandal, Takeo, and Kampong Speu. Structured interviews were conducted between October 2020 and May 2021 among all consenting households keeping pigs in 16 target villages. Recruited SPFs (n=176) kept 6.8 pigs on average (s.d.=7.7), with most (88%) keeping cross-bred varieties of sows (77%), growers/finishers (39%), piglets/weaners (22%), and few keeping boars (5%). Chickens (83%) and waterfowl (56%) were commonly raised and could usually contact pigs directly (79%). Pigs were the primary source of household income for 28% of participants. While pigs tended to be housed individually (40%) or in groups (33%), 13% kept pigs free-ranging/tethered. Pigs were commonly fed agricultural by-products (80%), commercial feed (60%), and, notably, household waste (59%). Under half of SPFs vaccinated their pigs (e.g., against classical swine fever, Aujeszky’s disease, and pasteurellosis), although the target disease was often unknown. Among 20 SPFs who experienced pig morbidities/mortalities within the past 6 months, only 3 (15%) reported to animal health workers, and disease etiology was rarely known. Common biosecurity measures included nets covering pig pens (62%) and restricting access to the site/pens (46%). Boot dips (0.6%) and PPE (1.2%) were rarely used. Pig smallholdings remain an important contributor to rural livelihoods. Current practices and biosecurity challenges increase risk pathways for a range of disease threats of both local and global concern.
Ethnographic studies are needed to better understand local determinants and develop context-appropriate strategies.
Keywords: smallholder production, swine, biosecurity practices, Cambodia, African swine fever
Procedia PDF Downloads 181
1469 Scalable Blockchain Solutions for NGOs: Enhancing Financial Transactions and Accountability
Authors: Aarnav Singh, Jayesh Ghatate, Tarush Pandey
Abstract:
Non-Governmental Organizations (NGOs) play a crucial role in addressing societal challenges, relying heavily on financial transactions to fund their impactful initiatives. However, traditional financial systems can be cumbersome and lack transparency, hindering the efficiency and trustworthiness of NGO operations. The Ethereum main-net, while pioneering the decentralized finance landscape, grapples with inherent scalability challenges, restricting its transaction throughput to a range of 15-45 transactions per second (TPS). This limitation poses substantial obstacles for NGOs engaging in swift and dynamic financial transactions critical to their operational efficiency. This research is a comprehensive exploration of the intricacies of these scalability challenges and delves into the design and implementation of a purpose-built blockchain system explicitly crafted to surmount these constraints.
Keywords: non-governmental organizations, decentralized system, zero knowledge Ethereum virtual machine, decentralized application
Procedia PDF Downloads 61
1468 Transfer Knowledge From Multiple Source Problems to a Target Problem in Genetic Algorithm
Authors: Terence Soule, Tami Al Ghamdi
Abstract:
To study how to transfer knowledge from multiple source problems to a target problem, we modeled the Transfer Learning (TL) process using Genetic Algorithms as the model solver. TL is the process of transferring learned data from one problem to another, with the aim of helping Machine Learning (ML) algorithms find a solution to new problems. Genetic Algorithms (GA) give researchers access to information about how the source problems were solved. In this paper, we have five different source problems, and we transfer the knowledge to the target problem. We studied different scenarios of the target problem. The results showed that combining knowledge from multiple source problems improves GA performance. Also, the process of combining knowledge from several problems promotes diversity in the transferred population.
Keywords: transfer learning, genetic algorithm, evolutionary computation, source and target
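The transfer mechanism described, injecting good solutions from source problems into the target problem's initial population, can be sketched as below. The OneMax target problem, the operators, and every parameter value are illustrative assumptions rather than the paper's actual setup:

```python
import random

random.seed(0)

N = 20            # genome length
POP = 30          # population size

def fitness(g):   # stand-in target problem: OneMax (count the 1s)
    return sum(g)

def evolve(pop, generations=40):
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:POP // 2]                  # truncation selection
        children = []
        while len(children) < POP - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N)
            child = a[:cut] + b[cut:]             # one-point crossover
            i = random.randrange(N)
            child[i] = 1 - child[i]               # point mutation
            children.append(child)
        pop = parents + children                  # elitist replacement
    return max(pop, key=fitness)

# "Transferred" individuals: good solutions from related source problems,
# injected into the otherwise random initial population.
transferred = [[1] * 10 + [0] * 10, [0] * 10 + [1] * 10]
seeded = transferred + [[random.randint(0, 1) for _ in range(N)]
                        for _ in range(POP - len(transferred))]
best = evolve(seeded)
print(fitness(best))
```

Seeding from several source problems, as here, also keeps the initial population more diverse than copying a single source solution many times.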
Procedia PDF Downloads 141
1467 A Petri Net Model to Obtain the Throughput of Unreliable Production Lines in the Buffer Allocation Problem
Authors: Joselito Medina-Marin, Alexandr Karelin, Ana Tarasenko, Juan Carlos Seck-Tuoh-Mora, Norberto Hernandez-Romero, Eva Selene Hernandez-Gress
Abstract:
A production line designer faces several challenges in manufacturing system design. One of them is the assignment of buffer slots between the machines of the production line in order to maximize the throughput of the whole line, which is known as the Buffer Allocation Problem (BAP). The BAP is a combinatorial problem that depends on the number of machines and the total number of slots to be distributed along the production line. In this paper, we propose a Petri Net (PN) model to obtain the throughput of unreliable production lines, based on PN mathematical tools and the decomposition method. The results obtained by this methodology are similar to those presented in previous works, and the number of machines is not a hard restriction.
Keywords: buffer allocation problem, Petri Nets, throughput, production lines
Procedia PDF Downloads 310
1466 Harnessing Artificial Intelligence for Early Detection and Management of Infectious Disease Outbreaks
Authors: Amarachukwu B. Isiaka, Vivian N. Anakwenze, Chinyere C. Ezemba, Chiamaka R. Ilodinso, Chikodili G. Anaukwu, Chukwuebuka M. Ezeokoli, Ugonna H. Uzoka
Abstract:
Infectious diseases continue to pose significant threats to global public health, necessitating advanced and timely detection methods for effective outbreak management. This study explores the integration of artificial intelligence (AI) in the early detection and management of infectious disease outbreaks. Leveraging vast datasets from diverse sources, including electronic health records, social media, and environmental monitoring, AI-driven algorithms are employed to analyze patterns and anomalies indicative of potential outbreaks. Machine learning models, trained on historical data and continuously updated with real-time information, contribute to the identification of emerging threats. The implementation of AI extends beyond detection, encompassing predictive analytics for disease spread and severity assessment. Furthermore, the paper discusses the role of AI in predictive modeling, enabling public health officials to anticipate the spread of infectious diseases and allocate resources proactively. Machine learning algorithms can analyze historical data, climatic conditions, and human mobility patterns to predict potential hotspots and optimize intervention strategies. The study evaluates the current landscape of AI applications in infectious disease surveillance and proposes a comprehensive framework for their integration into existing public health infrastructures. The implementation of an AI-driven early detection system requires collaboration between public health agencies, healthcare providers, and technology experts. Ethical considerations, privacy protection, and data security are paramount in developing a framework that balances the benefits of AI with the protection of individual rights. The synergistic collaboration between AI technologies and traditional epidemiological methods is emphasized, highlighting the potential to enhance a nation's ability to detect, respond to, and manage infectious disease outbreaks in a proactive and data-driven manner. 
The findings of this research underscore the transformative impact of harnessing AI for early detection and management, offering a promising avenue for strengthening the resilience of public health systems in the face of evolving infectious disease challenges. This paper advocates for the integration of artificial intelligence into the existing public health infrastructure for early detection and management of infectious disease outbreaks. The proposed AI-driven system has the potential to revolutionize the way we approach infectious disease surveillance, providing a more proactive and effective response to safeguard public health.
Keywords: artificial intelligence, early detection, disease surveillance, infectious diseases, outbreak management
Procedia PDF Downloads 68
1465 Investigation of Topic Modeling-Based Semi-Supervised Interpretable Document Classifier
Authors: Dasom Kim, William Xiu Shun Wong, Yoonjin Hyun, Donghoon Lee, Minji Paek, Sungho Byun, Namgyu Kim
Abstract:
There has been much research on document classification for categorizing voluminous documents automatically. Through document classification, we can assign a specific category to each unlabeled document on the basis of various machine learning algorithms. However, providing labeled documents manually requires considerable time and effort. To overcome this limitation, semi-supervised learning, which uses unlabeled documents as well as labeled ones, was invented. However, traditional document classifiers, whether supervised or semi-supervised, cannot sufficiently explain the reason for, or the process of, the classification. Thus, in this paper, we propose a methodology to visualize the major topics and class components of each document. We believe that our methodology for visualizing the topics and classes of each document can enhance the reliability and explanatory power of document classifiers.
Keywords: data mining, document classifier, text mining, topic modeling
Procedia PDF Downloads 403
1464 Core Loss Influence on MTPA Current Vector Variation of Synchronous Reluctance Machine
Authors: Huai-Cong Liu, Tae Chul Jeong, Ju Lee
Abstract:
The aim of this study was to develop an electric circuit method (ECM) to ascertain the influence of core loss on a Synchronous Reluctance Motor (SynRM) under maximum torque per ampere (MTPA) control. A SynRM for fan applications usually operates in the constant-torque region, and at synchronous speed MTPA control of the current vector is adopted. However, a finite element analysis (FEA) program is not sufficient to reflect exactly how core loss influences the current vector. This paper proposes a method to calculate the current vector with consideration of core loss. The precision of the current vector obtained by the ECM is useful for MTPA control. The results show that the ECM analysis is closer to the actual motor’s characteristics, as verified by testing a 7.5 kW SynRM drive system.
Keywords: core loss, SynRM, current vector, magnetic saturation, maximum torque per ampere (MTPA)
Procedia PDF Downloads 532
1463 The Use of Support Vector Machine and Back Propagation Neural Network for Prediction of Daily Tidal Levels along the Jeddah Coast, Saudi Arabia
Authors: E. A. Mlybari, M. S. Elbisy, A. H. Alshahri, O. M. Albarakati
Abstract:
Sea level rise threatens to increase the impact of future storms and hurricanes on coastal communities. Accurate prediction of sea level change is an important task in planning constructions and human activities in coastal and oceanic areas. In this study, support vector machines (SVM) are proposed to predict daily tidal levels along the Jeddah Coast, Saudi Arabia. The optimal parameter values of the kernel function are determined using a genetic algorithm. The SVM results are compared with the field data and with a back propagation neural network (BPNN). Among the models, the SVM is superior to the BPNN and has better generalization performance.
Keywords: tides, prediction, support vector machines, genetic algorithm, back-propagation neural network, risk, hazards
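A heavily simplified stand-in for the idea of evolving kernel parameters: instead of a full SVM, a tiny evolutionary loop tunes the width of an RBF kernel smoother against leave-one-out error on a toy sinusoidal "tide" series. The data, the smoother, and the loop are all assumptions for illustration, not the study's method:

```python
import math
import random

random.seed(1)

# Toy "tidal" series: a single sinusoidal constituent sampled uniformly.
xs = [i / 5 for i in range(40)]
ys = [math.sin(x) for x in xs]

def rbf_predict(x, gamma, skip):
    # Kernel-weighted average of all other observations (Nadaraya-Watson).
    ws = [math.exp(-gamma * (x - xi) ** 2) for xi in xs]
    ws[skip] = 0.0
    return sum(w * y for w, y in zip(ws, ys)) / sum(ws)

def loo_error(gamma):
    # Leave-one-out squared error: the fitness the evolutionary loop minimises.
    return sum((ys[i] - rbf_predict(xs[i], gamma, i)) ** 2
               for i in range(len(xs)))

# Minimal evolutionary search over the kernel parameter gamma.
pop = [random.uniform(0.01, 50.0) for _ in range(10)]
for _ in range(30):
    pop.sort(key=loo_error)
    pop = pop[:5] + [g * random.uniform(0.7, 1.3) for g in pop[:5]]
best = min(pop, key=loo_error)
print(loo_error(best) < loo_error(0.01))  # tuned width beats an over-smooth default
```

The real study encodes SVM kernel parameters in the genetic algorithm's chromosomes in the same spirit, just with a proper SVM regressor as the fitness evaluator.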
Procedia PDF Downloads 468
1462 Is Sodium Channel Nav1.7 an Ideal Therapeutically Analgesic Target? A Systematic Review
Authors: Yutong Wan, John N. Wood
Abstract:
Introduction: SCN9A-encoded Nav1.7 is considered an ideal therapeutic target with minimal side effects for the pharmaceutical industry, because SCN9A variants in humans include both gain-of-function, pain-related mutations and loss-of-function, pain-free mutations. This study reviews the clinical effectiveness of existing Nav1.7 inhibitors, which theoretically should be powerful analgesics. Methods: A systematic review was conducted on the effectiveness of current Nav1.7 blockers undergoing clinical trials. Studies were mainly extracted from PubMed, the U.S. National Library of Medicine Clinical Trials database, the World Health Organization International Clinical Trials Registry, the ISRCTN registry platform, and the Integrated Research Approval System by the NHS. Only studies with full text available, conducted using double-blinded, placebo-controlled, and randomised designs, and reporting at least one analgesic measurement were included. Results: Overall, 61 trials were screened, and eight studies covering PF-05089771 (Pfizer), TV-45070 (Teva & Xenon), and BIIB074 (Biogen) met the inclusion criteria. Most studies were excluded because results were not published. All three compounds demonstrated insignificant analgesic effects, and the comparison between PF-05089771 and pregabalin/ibuprofen showed that PF-05089771 was a much weaker analgesic. All three drug candidates have only mild side effects, indicating the potential for further investigation of Nav1.7 antagonists. Discussion: The failure of current Nav1.7 small-molecule inhibitors might be attributed to ignorance of the key role of endogenous systems in Nav1.7 null mutants, the lack of selectivity and blocking potency, and central impermeability. The synergistic combination of analgesic drugs, a recent UCL patent combining a small dose of Nav1.7 blockers with opioids or enkephalinase inhibitors, dramatically enhanced the analgesic effects. Conclusion: The current clinically tested Nav1.7 blockers are generally disappointing.
However, the newer generation of Nav1.7-targeting analgesics has overcome the major constraints of its predecessors.
Keywords: chronic pain, Nav1.7 blockers, SCN9A, systematic review
Procedia PDF Downloads 132
1461 Music Genre Classification Based on Non-Negative Matrix Factorization Features
Authors: Soyon Kim, Edward Kim
Abstract:
In order to retrieve information from the massive stream of songs in the music industry, music search by title, lyrics, artist, mood, and genre has become more important. Despite the subjectivity and controversy over the definition of music genres across different nations and cultures, automatic genre classification systems that facilitate the process of music categorization have been developed. Manual genre selection by music producers is being provided as statistical data for designing automatic genre classification systems. In this paper, an automatic music genre classification system utilizing non-negative matrix factorization (NMF) is proposed. Short-term characteristics of the music signal can be captured based on the timbre features such as mel-frequency cepstral coefficient (MFCC), decorrelated filter bank (DFB), octave-based spectral contrast (OSC), and octave band sum (OBS). Long-term time-varying characteristics of the music signal can be summarized with (1) the statistical features such as mean, variance, minimum, and maximum of the timbre features and (2) the modulation spectrum features such as spectral flatness measure, spectral crest measure, spectral peak, spectral valley, and spectral contrast of the timbre features. Not only these conventional basic long-term feature vectors, but also NMF based feature vectors are proposed to be used together for genre classification. In the training stage, NMF basis vectors were extracted for each genre class. The NMF features were calculated in the log spectral magnitude domain (NMF-LSM) as well as in the basic feature vector domain (NMF-BFV). For NMF-LSM, an entire full band spectrum was used. However, for NMF-BFV, only low band spectrum was used since high frequency modulation spectrum of the basic feature vectors did not contain important information for genre classification. 
In the test stage, using the set of pre-trained NMF basis vectors, the genre classification system extracted the NMF weighting values of each genre as the NMF feature vectors. A support vector machine (SVM) was used as a classifier. The GTZAN multi-genre music database was used for training and testing. It is composed of 10 genres and 100 songs for each genre. To increase the reliability of the experiments, 10-fold cross validation was used. For a given input song, an extracted NMF-LSM feature vector was composed of 10 weighting values that corresponded to the classification probabilities for 10 genres. An NMF-BFV feature vector also had a dimensionality of 10. Combined with the basic long-term features such as statistical features and modulation spectrum features, the NMF features provided increased accuracy with a slight increase in feature dimensionality. The conventional basic features by themselves yielded 84.0% accuracy, but the basic features with NMF-LSM and NMF-BFV provided 85.1% and 84.2% accuracy, respectively. The basic features required a dimensionality of 460, but NMF-LSM and NMF-BFV each required a dimensionality of only 10. Combining the basic features, NMF-LSM and NMF-BFV together with the SVM with a radial basis function (RBF) kernel produced the significantly higher classification accuracy of 88.3% with a feature dimensionality of 480.
Keywords: mel-frequency cepstral coefficient (MFCC), music genre classification, non-negative matrix factorization (NMF), support vector machine (SVM)
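The factorization step itself can be illustrated with the classic Lee-Seung multiplicative updates; the abstract does not say which NMF algorithm was used, and the toy matrix and rank below are assumptions:

```python
import random

random.seed(0)

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def nmf(V, r, iters=500, eps=1e-9):
    """Factor V ~ W @ H with non-negative W, H via Lee-Seung
    multiplicative updates (minimising squared reconstruction error)."""
    m, n = len(V), len(V[0])
    W = [[random.random() for _ in range(r)] for _ in range(m)]
    H = [[random.random() for _ in range(n)] for _ in range(r)]
    for _ in range(iters):
        WT = transpose(W)
        num, den = matmul(WT, V), matmul(matmul(WT, W), H)
        H = [[H[i][j] * num[i][j] / (den[i][j] + eps) for j in range(n)]
             for i in range(r)]
        HT = transpose(H)
        num, den = matmul(V, HT), matmul(W, matmul(H, HT))
        W = [[W[i][j] * num[i][j] / (den[i][j] + eps) for j in range(r)]
             for i in range(m)]
    return W, H

def recon_error(V, W, H):
    WH = matmul(W, H)
    return sum((V[i][j] - WH[i][j]) ** 2
               for i in range(len(V)) for j in range(len(V[0])))

# Toy non-negative "spectrogram" that is exactly rank 2.
V = [[1, 0, 2, 0], [0, 3, 0, 1], [2, 3, 4, 1]]
W, H = nmf(V, r=2)
print(recon_error(V, W, H))  # small after convergence
```

In the paper's pipeline, the columns of H learned per genre play the role of basis vectors, and a new song's NMF weights become its feature vector for the SVM.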
Procedia PDF Downloads 303
1460 Identification of Breast Anomalies Based on Deep Convolutional Neural Networks and K-Nearest Neighbors
Authors: Ayyaz Hussain, Tariq Sadad
Abstract:
Breast cancer (BC) is one of the most widespread ailments among females globally. Early prognosis of BC can decrease the mortality rate. Exact identification of benign tumors can avoid unnecessary biopsies and further treatment of patients under investigation. However, due to variations in images, it is a tough job to isolate cancerous cases from normal and benign ones. Machine learning techniques are widely employed in the classification of BC patterns and prognosis. In this research, a deep convolutional neural network (DCNN) with the AlexNet architecture is employed to get more discriminative features from breast tissues. To achieve higher accuracy, K-nearest neighbor (KNN) classifiers are employed as a substitute for the softmax layer in deep learning. The proposed model is tested on a widely used breast image database called the MIAS dataset for experimental purposes and achieved 99% accuracy.
Keywords: breast cancer, DCNN, KNN, mammography
Procedia PDF Downloads 137
1459 Cognition Technique for Developing a World Music
Authors: Haider Javed Uppal, Javed Yunas Uppal
Abstract:
In today's globalized world, it is necessary to develop a form of music that is able to evoke equal emotional responses among people from diverse cultural backgrounds. Indigenous cultures throughout history have developed their own music cognition, specifically in terms of the connections between music and mood. With the advancements in artificial intelligence technologies, it has become possible to analyze and categorize music features such as timbre, harmony, melody, and rhythm and relate them to the resulting mood effects experienced by listeners. This paper presents a model that utilizes a screenshot translator to convert music from different origins into waveforms, which are then analyzed using machine learning and information retrieval techniques. By connecting these waveforms with Thayer's matrix of moods, a mood classifier has been developed using fuzzy logic algorithms to determine the emotional impact of different types of music on listeners from various cultures.
Keywords: cognition, world music, artificial intelligence, Thayer's matrix
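A toy version of the final mapping onto Thayer's two-dimensional model (one valence axis, one arousal axis); the quadrant labels below are common in music-emotion research but are an assumption here, and the fuzzy-logic layer is deliberately omitted:

```python
def thayer_mood(valence, arousal):
    """Map a (valence, arousal) pair in [-1, 1]^2 to one of the four
    quadrants of Thayer's two-dimensional mood model (crisp, not fuzzy)."""
    if arousal >= 0:
        return "exuberance" if valence >= 0 else "anxious/frantic"
    return "contentment" if valence >= 0 else "depression"

print(thayer_mood(0.7, 0.8))   # high valence, high arousal
```

A fuzzy-logic classifier, as in the paper, would replace the hard quadrant boundaries with membership degrees so a song can belong partially to adjacent moods.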
Procedia PDF Downloads 81
1458 Violence against Children Surveys: Analysis of the Peer-Reviewed Literature from 2009-2019
Authors: Kathleen Cravero, Amanda Nace, Samantha Ski
Abstract:
The Violence Against Children Surveys (VACS) are nationally representative surveys of male and female youth ages 13-24, designed to measure the burden of sexual, physical, and emotional violence experienced in childhood and adolescence. As of 2019, 24 countries had implemented or were in the process of implementing a VACS, covering over ten percent of the world’s child population. Since the first article using VACS data from Swaziland was published in 2009, several peer-reviewed articles have been published on the VACS. However, no publications to date have analyzed the breadth of this work or how the data are represented in the peer-reviewed literature. In this study, we conducted a literature review of all peer-reviewed research that used VACS data or discussed the implementation and methodology of the VACS. The literature review revealed several important findings. Between 2009 and July 2019, thirty-five peer-reviewed articles using VACS data from 12 countries were published. Twenty of the studies focus on one country, while 15 focus on two or more countries. Some countries are featured in the literature more than others, for example, Kenya (N=14), Malawi (N=12), and Tanzania (N=12). A review of the research by gender demonstrates that research on violence against boys is under-represented. Only two studies specifically focused on boys/young men, while 11 studies focused only on violence against girls, despite research suggesting that boys and girls experience similar rates of violence. A review of the publications by type of violence revealed significant differences in the types of violence featured in the literature. Thirteen publications specifically focused on sexual violence, while three focused on physical violence, and only one focused on emotional violence. Almost 70% of the peer-reviewed articles (24 of the 35) were first-authored by someone at the U.S. Centers for Disease Control and Prevention.
There were very few first authors from VACS countries, which raises questions about who is leveraging the data and the extent to which capacities for data liberation are being developed within VACS countries. The VACS provide an unprecedented amount of information on the prevalence and past-year incidence of violence against children. Through a review of the peer-reviewed literature on the VACS, we can begin to identify trends and gaps in how the data are being used, as well as identify areas for further research.
Keywords: data to action, global health, implementation science, violence against children surveys
Procedia PDF Downloads 135
1457 Development of an Optimization Method for Myoelectric Signal Processing by Active Matrix Sensing in Robot Rehabilitation
Authors: Noriyoshi Yamauchi, Etsuo Horikawa, Takunori Tsuji
Abstract:
Training with exoskeleton robots is drawing attention as a rehabilitation method for the body paralysis seen in many conditions, and many of these systems assist movement using the myoelectric signals generated by exercise commands from the brain. Rehabilitation requires frequent training, but the expertise required to identify the myoelectric potential derivation site and to attach the device is one of the factors preventing the wider spread of this technology. In this research, we focus on improving the efficiency of gait training with exoskeleton-type robots, improving myoelectric acquisition and analysis using an active matrix sensing method, and improving walking rehabilitation through optimization of robot control.Keywords: active matrix sensing, brain machine interface (BMI), central pattern generator (CPG), myoelectric signal processing, robot rehabilitation
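The abstract does not give its signal-processing details; as a hedged illustration of one common first step in myoelectric signal analysis, the sketch below computes a moving-RMS envelope of an EMG-like signal (the window length and toy data are assumptions, not the authors' settings):

```python
import math

def moving_rms(signal, window):
    """Moving-RMS envelope, a common first step in myoelectric
    (EMG) signal processing: RMS over a sliding window."""
    out = []
    for i in range(len(signal) - window + 1):
        seg = signal[i:i + window]
        out.append(math.sqrt(sum(x * x for x in seg) / window))
    return out

# Toy EMG-like trace: quiet baseline, then an activity burst.
emg = [0.0] * 8 + [1.0, -1.0] * 4
env = moving_rms(emg, window=4)
print(env[0], env[-1])  # baseline 0.0, burst 1.0
```

The envelope, rather than the raw oscillating signal, is what a controller would typically threshold to detect an exercise command.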
Procedia PDF Downloads 386
1456 Model Based Development of a Processing Map for Friction Stir Welding of AA7075
Authors: Elizabeth Hoyos, Hernán Alvarez, Diana Lopez, Yesid Montoya
Abstract:
The main goal of this research is to model FSW from an unusual perspective within mechanical engineering: establishing process windows by assessing the soundness of the joints as a priority, with the added advantage of lower computational time. This paper presents the use of a previously developed model applied to specific aspects of soundness evaluation of AA7075 FSW welds. EMSO software (Environment for Modeling, Simulation, and Optimization) was used for simulation, and an adapted CNC machine was used for the actual welding. This model-based approach showed good agreement with the experimental data, from which it is possible to set a window of operation for commercial aluminum alloy AA7075, all with low computational costs and employing simple quality indicators that can be used by non-specialized users in process modeling.Keywords: aluminum AA7075, friction stir welding, phenomenological based semiphysical model, processing map
Procedia PDF Downloads 261
1455 Positive Interactions among Plants in Pinegroves over Quarzitic Sands
Authors: Enrique González Pendás, Vidal Pérez Hernández, Jorge Ferro Díaz, Nelson Careaga Pendás
Abstract:
The investigation is carried out in the San Ubaldo Protected Area, in the interior of an open pinegrove with palm trees on a dry plain of quarzitic sands, belonging to the Floristic Managed Reservation San Ubaldo-Sabanalamar, Guane, Pinar del Río, Cuba. This area is characterized by drastic seasonal variations, high temperatures and water evaporation, strong solar radiation, and sandy soils of almost pure quartz, which are very acid and poor in nutrients. The objective of the present work is to find evidence of facilitation and its relationship with the structure and composition of plant communities in these peculiar ecosystems. For this study, six parallel linear transects of 100 m are traced, along which a general recording of the flora is carried out. To establish which plants act as nurses, a height over 1 m, a canopy over 1.5 m, and the occurrence of several species underneath are taken into account. Cover is recorded using the line intercept method; the mean species richness of taxa under nurses is compared with that in the open spaces among them, and it is then determined which plants are the best recruiters of other species (the best nurses). An experiment is made to measure and compare some parameters of pine seedlings under the canopy of Byrsonima crassifolia (L.) Kunth and in open spaces; the number of individuals is also counted by species to calculate frequency and total abundance in the study area. As a result, an up-to-date floristic list and a phylogenetic tree of the plant community showing high phylodiversity are offered, and it is shown that the mean species richness and abundance of species under the nurses are significantly higher than in open spaces. Furthermore, by means of phylogenetic trees it is shown that the species which cohabit under the nurses are not phylogenetically related.
These results constitute evidence of facilitation among plants, and they demonstrate once more the importance of the nurse effect in preserving plant diversity in extreme environments.Keywords: facilitation, nurse plants, positive interactions, quarzitic sands
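The line intercept method mentioned above reduces to a simple calculation: percent cover is the summed length of canopy intercepts divided by the transect length. A minimal sketch, with hypothetical intercept data rather than the study's measurements:

```python
def line_intercept_cover(intercepts, transect_len):
    """Percent cover from the line-intercept method: total length of
    canopy intercepts (start, end) divided by transect length."""
    return 100.0 * sum(end - start for start, end in intercepts) / transect_len

# Hypothetical canopy intercepts (m) along one 100 m transect.
segs = [(0.0, 1.5), (3.0, 4.0), (10.0, 12.5)]
print(line_intercept_cover(segs, 100.0))  # 5.0 (% cover)
```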
Procedia PDF Downloads 343
1454 General Mathematical Framework for Analysis of Cattle Farm System
Authors: Krzysztof Pomorski
Abstract:
In the given work we present a universal mathematical framework for modeling a cattle farm system that can set and validate various hypotheses to be tested against experimental data. The presented work is preliminary, but it is expected to be a valid tool for future, deeper analysis that can result in a new class of prediction methods allowing early detection of cow diseases as well as assessment of cow performance. The presented work is therefore meaningful both for agricultural models and for machine learning. It also opens the possibility of incorporating a certain class of biological models necessary for modeling cow behavior and farm performance, which might include the impact of the environment on the farm system. Particular attention is paid to the model of coupled oscillators, which is the basic building hypothesis from which a model showing certain periodic or quasiperiodic behavior can be constructed.Keywords: coupled ordinary differential equations, cattle farm system, numerical methods, stochastic differential equations
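The abstract does not specify its coupled-oscillator equations; a standard choice for illustrating periodic and quasiperiodic collective behavior is the Kuramoto model, sketched below with simple Euler integration (the coupling strength, natural frequencies, and step size are illustrative assumptions, not the author's model):

```python
import math

def kuramoto_step(phases, omegas, K, dt):
    """One Euler step of the Kuramoto model of coupled oscillators:
    d(theta_i)/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i)."""
    n = len(phases)
    new = []
    for i, th in enumerate(phases):
        coupling = K / n * sum(math.sin(tj - th) for tj in phases)
        new.append(th + (omegas[i] + coupling) * dt)
    return new

def order_parameter(phases):
    """Synchronization measure r in [0, 1]; r = 1 means full sync."""
    n = len(phases)
    re = sum(math.cos(t) for t in phases) / n
    im = sum(math.sin(t) for t in phases) / n
    return math.hypot(re, im)

# Four oscillators with slightly different natural frequencies.
phases = [0.0, 1.0, 2.0, 3.0]
omegas = [1.0, 1.1, 0.9, 1.05]
for _ in range(2000):
    phases = kuramoto_step(phases, omegas, K=2.0, dt=0.01)
print(round(order_parameter(phases), 2))  # strong coupling drives r near 1
```

With weak coupling (small K) the oscillators drift quasiperiodically instead of locking, which is the kind of regime distinction such a model can expose in farm-system data.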
Procedia PDF Downloads 145
1453 Optimizing of Machining Parameters of Plastic Material Using Taguchi Method
Authors: Jumazulhisham Abdul Shukor, Mohd. Sazali Said, Roshanizah Harun, Shuib Husin, Ahmad Razlee Ab Kadir
Abstract:
This paper applies the Taguchi optimization method to determine the best machining parameters for a pocket milling process on polypropylene (PP) using a CNC milling machine, where surface roughness is the response considered and carbide insert cutting tools are used. Three machining parameters (speed, feed rate and depth of cut) are investigated, each at three levels: low, medium and high (Taguchi orthogonal arrays). The settings of the machining parameters were determined using the Taguchi method, and the signal-to-noise (S/N) ratio is assessed to define the optimal levels and to predict the effect on surface roughness of the parameters assigned in the L9 array. The final experimental outcomes are presented to verify that the optimization parameters recommended by the manufacturer are accurate.Keywords: inserts, milling process, signal-to-noise (S/N) ratio, surface roughness, Taguchi Optimization Method
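For a smaller-is-better characteristic such as surface roughness, the Taguchi S/N ratio is S/N = -10·log10(mean(y²)), and higher S/N indicates a more favorable parameter setting. A minimal sketch with hypothetical roughness replicates (not the paper's measurements):

```python
import math

def sn_smaller_is_better(ys):
    """Taguchi signal-to-noise ratio for a smaller-is-better
    response such as surface roughness:
    S/N = -10 * log10(mean(y^2))."""
    return -10.0 * math.log10(sum(y * y for y in ys) / len(ys))

# Hypothetical surface-roughness replicates (um) for two L9 runs.
run_a = [1.2, 1.3, 1.1]
run_b = [0.8, 0.9, 0.85]
print(sn_smaller_is_better(run_a) < sn_smaller_is_better(run_b))  # True: lower roughness gives higher S/N
```

In an L9 analysis, the S/N ratios of the nine runs are averaged per factor level, and the level with the highest mean S/N is selected for each parameter.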
Procedia PDF Downloads 640
1452 Design of a Small and Medium Enterprise Growth Prediction Model Based on Web Mining
Authors: Yiea Funk Te, Daniel Mueller, Irena Pletikosa Cvijikj
Abstract:
Small and medium enterprises (SMEs) play an important role in the economy of many countries. Considering the overall world economy, SMEs represent 95% of all businesses in the world, accounting for 66% of total employment. Existing studies show that the current business environment is highly turbulent and strongly influenced by modern information and communication technologies, forcing SMEs to face more severe challenges in maintaining their existence and expanding their business. To support SMEs in improving their competitiveness, researchers have recently turned their focus to applying data mining techniques to build risk and growth prediction models. However, the data used to assess risk and growth indicators is primarily obtained via questionnaires, which is very laborious and time-consuming, or is provided by financial institutions and is thus highly sensitive to privacy issues. Recently, web mining (WM) has emerged as a new approach to obtaining valuable insights into the business world. WM enables automatic and large-scale collection and analysis of potentially valuable data from various online platforms, including companies’ websites. While WM methods have frequently been studied to anticipate growth of sales volume for e-commerce platforms, their application to assessment of SME risk and growth indicators is still scarce. Considering that a vast proportion of SMEs own a website, WM holds great potential for revealing valuable information hidden in SME websites, which can further be used to understand SME risk and growth indicators, as well as to enhance current SME risk and growth prediction models. This study aims at developing an automated system to collect business-relevant data from the Web and predict future growth trends of SMEs by means of WM and data mining techniques. The envisioned system should serve as an 'early recognition system' for future growth opportunities.
In an initial step, we examine how structured and semi-structured Web data from governmental or SME websites can be used to explain the success of SMEs. WM methods are applied to extract Web data in the form of additional input features for the growth prediction model. Data on SMEs provided by a large Swiss insurance company is used as ground truth (i.e. growth-labeled data) to train the growth prediction model. Different machine learning classification algorithms, such as the Support Vector Machine, Random Forest and Artificial Neural Network, are applied and compared, with the goal of optimizing prediction performance. The results are compared to those from previous studies in order to assess the contribution of growth indicators retrieved from the Web to increasing the predictive power of the model.Keywords: data mining, SME growth, success factors, web mining
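As a hedged illustration of the web-mining step, the toy extractor below derives simple numeric features from a page's HTML that could feed such a growth prediction model; the feature names are invented for the example and are not the study's actual feature set:

```python
import re

def website_features(html):
    """Toy web-mining feature extractor: derive simple numeric
    features from a company web page. Feature names here are
    illustrative only, not the study's real features."""
    text = re.sub(r"<[^>]+>", " ", html)          # strip HTML tags
    words = re.findall(r"[A-Za-z]+", text.lower())
    return {
        "n_words": len(words),                     # page text length
        "n_links": len(re.findall(r"<a\s", html, re.I)),
        "has_shop": int(any(w in ("shop", "store", "buy") for w in words)),
    }

page = '<html><body><a href="/shop">Shop</a> Welcome to our store.</body></html>'
print(website_features(page))  # {'n_words': 5, 'n_links': 1, 'has_shop': 1}
```

Feature vectors like this, one per SME website, would then be joined to the insurer's growth labels and passed to the SVM, Random Forest, or neural-network classifiers the abstract names.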
Procedia PDF Downloads 269
1451 Proficiency Testing of English for Specific Academic Purpose: Using a Pilot Test in a Taiwanese University as an Example
Authors: Wenli Tsou, Jessica Wu
Abstract:
Courses of English for specific academic purposes (ESAP) have become popular in higher education in Taiwan; however, no standardized tests have been developed for evaluating learners’ English proficiency in individual designated fields. Assuming a learner’s proficiency in a specific academic area is built on general English proficiency together with specific knowledge and vocabulary in the content areas, an adequate ESAP proficiency test may be constructed from selected test items related to the designated academic areas. In this study, through collaboration between a language testing institution and a university in Taiwan, three sets of ESAP tests, covering the three disciplinary areas of business and the workplace, science and engineering, and health and medicine, were developed and administered to sophomore students (N=1704) enrolled in ESAP courses at a university in southern Taiwan. For this study, the courses were grouped into the above-mentioned three disciplines, and students took the specialized proficiency test based on the ESAP course they were taking. Because students were free to select which ESAP course to take, each course had both major and non-major students. Toward the end of the one-semester course, ending in January 2015, each student took two tests, one of general English (General English Proficiency Test, or GEPT) and the other ESAP. Following each test, students filled out a survey reporting their test-taking experiences. After comparing students’ two test scores, it was found that business majors and health and medical students performed better in ESAP than the non-majors in the class, whereas science and engineering majors did about the same as their non-major counterparts. In addition, test takers at the CEFR B2 (upper intermediate) level or above performed well in both tests, while students below B2 did slightly better in ESAP.
The findings suggest that students’ test performances were enhanced by their specialist content and vocabulary knowledge. Furthermore, results of the survey show that the difficulty levels reported by students are consistent with their test performances. Based on the item analysis, the findings can be used to develop proficiency tests for specific disciplines and to identify ability indicators for college students in their designated fields.Keywords: english for specific academic purposes (ESAP), general english proficiency test (GEPT), higher education, proficiency test
Procedia PDF Downloads 531
1450 Learning Grammars for Detection of Disaster-Related Micro Events
Authors: Josef Steinberger, Vanni Zavarella, Hristo Tanev
Abstract:
Natural disasters cause tens of thousands of victims and massive material damage. We refer to all those events caused by natural disasters, such as damage to people, infrastructure, vehicles, services and resource supply, as micro events. This paper addresses the problem of micro-event detection in online media sources. We present a natural language grammar learning algorithm and apply it to online news. The algorithm in question is based on distributional clustering and detection of word collocations. We also explore the extraction of micro-events from social media and describe a Twitter mining robot, which uses combinations of keywords to detect tweets that talk about the effects of disasters.Keywords: online news, natural language processing, machine learning, event extraction, crisis computing, disaster effects, Twitter
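Collocation detection of the kind mentioned above is commonly scored with pointwise mutual information (PMI); the sketch below applies it to adjacent word pairs in a toy disaster-related text (the scoring choice and thresholds are illustrative, not necessarily the authors' exact method):

```python
import math
from collections import Counter

def collocations(tokens, min_count=2):
    """Score adjacent word pairs by pointwise mutual information,
    a standard collocation measure:
    PMI(a, b) = log2( P(a, b) / (P(a) * P(b)) )."""
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    n = len(tokens)
    scores = {}
    for (a, b), c in bigrams.items():
        if c < min_count:           # ignore rare pairs
            continue
        p_ab = c / (n - 1)
        scores[(a, b)] = math.log2(p_ab / ((unigrams[a] / n) * (unigrams[b] / n)))
    return scores

text = ("flash flood hit the town . flash flood damaged the roads . "
        "the town rebuilt roads .").split()
scores = collocations(text)
print(max(scores, key=scores.get))  # ('flash', 'flood')
```

Pairs that score highly, like "flash flood", are exactly the multiword units a grammar-learning algorithm would want to treat as single event-describing terms.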
Procedia PDF Downloads 480
1449 Analysis of Matching Pursuit Features of EEG Signal for Mental Tasks Classification
Authors: Zin Mar Lwin
Abstract:
Brain Computer Interface (BCI) systems have been developed for people who suffer from severe motor disabilities and find it challenging to communicate with their environment. BCI allows them to communicate in a non-muscular way. For communication between human and computer, BCI uses a type of signal called the electroencephalogram (EEG), which is recorded from the human brain by means of electrodes. The EEG signal is an important information source for understanding brain processes in non-invasive BCI. Translating human thought requires accurate classification of the acquired EEG signal. This paper proposes a typical EEG signal classification system, evaluated on a dataset from Purdue University. The Independent Component Analysis (ICA) method, via EEGLAB tools, is used to remove artifacts caused by eye blinks. For feature extraction, the time and frequency features of the non-stationary EEG signals are extracted by the Matching Pursuit (MP) algorithm. The classification of one of five mental tasks is performed by a multi-class Support Vector Machine (SVM). For the SVMs, comparisons have been carried out for both one-against-one and one-against-all methods. Procedia PDF Downloads 279
1448 Business Intelligent to a Decision Support Tool for Green Entrepreneurship: Meso and Macro Regions
Authors: Anishur Rahman, Maria Areias, Diogo Simões, Ana Figeuiredo, Filipa Figueiredo, João Nunes
Abstract:
The circular economy (CE) has gained increased awareness among academics, businesses, and decision-makers as it stimulates resource circularity in production and consumption systems. A large body of epistemological work has explored the principles of CE, but scant attention has focused on analysing how CE is evaluated, consented to, and enforced using economic metabolism data and a business intelligence framework. Economic metabolism involves the ongoing exchange of materials and energy within and across socio-economic systems and requires the assessment of vast amounts of data to provide quantitative analysis related to effective resource management. Given this limited attention, the present work focuses on regional flows in a pilot region of Portugal. Addressing this gap, the study aims to promote eco-innovation and sustainability in the regions of the Intermunicipal Communities Região de Coimbra, Viseu Dão Lafões and Beiras e Serra da Estrela, using these data to find precise synergies in terms of material flows and to give companies a competitive advantage in the form of valuable waste destinations, access to new resources and new markets, cost reduction and risk-sharing benefits. In our work, emphasis is placed on applying artificial intelligence (AI) and, more specifically, on implementing state-of-the-art deep learning algorithms, contributing to the construction of a business intelligence approach. With the emergence of new approaches generally grouped under the headings of AI and machine learning (ML), the methods for statistical analysis of complex and uncertain production systems are facing significant changes. Therefore, various definitions of AI and its differences from traditional statistics are presented; furthermore, ML is introduced to situate it within data science, and its differences with respect to topics such as big data analytics, as well as the production problems for which AI and ML are used, are identified.
A lifecycle-based approach is then taken to analyse the use of different methods in each phase, in order to identify the most useful technologies and the unifying attributes of AI in manufacturing. Most macroeconomic metabolism models are directed mainly at the context of large metropolises, neglecting rural territories; within this project, therefore, a dynamic decision support model coupled with artificial intelligence tools and information platforms will be developed, focused on the reality of these transition zones between the rural and the urban. Thus, a real decision support tool is under development, which will go beyond the scientific developments carried out to date and will make it possible to overcome limitations related to the availability and reliability of data.Keywords: circular economy, artificial intelligence, economic metabolisms, machine learning
Procedia PDF Downloads 73
1447 Audio-Visual Co-Data Processing Pipeline
Authors: Rita Chattopadhyay, Vivek Anand Thoutam
Abstract:
Speech is the most acceptable means of communication, through which we can quickly exchange our feelings and thoughts. Quite often, people can communicate orally but cannot interact or work with computers or devices. It is easier and quicker to give speech commands than to type commands to computers, and in the same way it is easier to listen to audio played from a device than to extract output from computers or devices. Especially with robotics being an emerging market, with applications in warehouses, the hospitality industry, consumer electronics, assistive technology, etc., speech-based human-machine interaction is emerging as a lucrative feature for robot manufacturers. Considering this, the objective of this paper is to design the “Audio-Visual Co-Data Processing Pipeline.” This pipeline is an integrated version of automatic speech recognition, a natural language model for text understanding, object detection, and text-to-speech modules. There are many deep learning models for each of the modules mentioned above, but OpenVINO Model Zoo models are used because the OpenVINO toolkit covers both computer vision and non-computer vision workloads across Intel hardware, maximizes performance, and accelerates application development. A speech command is given as input that contains information about the target objects to be detected and the start and end times for extracting the required interval from the video. Speech is converted to text using the automatic speech recognition QuartzNet model. A summary is extracted from the text using the natural language model Generative Pre-Trained Transformer-3 (GPT-3). Based on the summary, essential frames are extracted from the video, and the You Only Look Once (YOLO) object detection model detects the target objects (the objects specified in the speech command) in these extracted frames. Frame numbers that contain target objects are saved as text.
Finally, this text (the frame numbers) is converted to speech using a text-to-speech model and played from the device. This project is developed for the 80 YOLO labels, and the user can extract frames based on one or two target labels. The pipeline can easily be extended to more than two target labels by making appropriate changes in the object detection module. The project supports four different speech command formats by including sample examples in the prompt used by the GPT-3 model. Based on user preference, one can add a new speech command format by including some examples of that format in the GPT-3 prompt. This pipeline can be used in many projects, such as human-machine interfaces, human-robot interaction, and surveillance through speech commands. Any object detection project can be upgraded using this pipeline so that one can give speech commands and have the output played from the device.Keywords: OpenVINO, automatic speech recognition, natural language processing, object detection, text to speech
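As a hedged sketch of the command-understanding step, the toy parser below extracts target labels and a start/end interval from one assumed speech-command format; in the actual pipeline this step is handled by GPT-3, so the regex-based helper and the command wording here are purely illustrative:

```python
import re

def parse_command(command, known_labels):
    """Toy stand-in for the GPT-3 summary step: pull target object
    labels and a (start, end) interval in seconds out of a
    transcribed speech command. Format is an assumption."""
    words = re.findall(r"[a-z]+", command.lower())
    labels = [w for w in words if w in known_labels]
    times = re.findall(r"(\d+(?:\.\d+)?)\s*(?:seconds|sec|s)\b", command.lower())
    if len(times) >= 2:
        return labels, float(times[0]), float(times[1])
    return labels, None, None          # no interval given

cmd = "Find the dog and the car between 5 seconds and 20 seconds"
labels, start, end = parse_command(cmd, {"dog", "car", "person"})
print(labels, start, end)  # ['dog', 'car'] 5.0 20.0
```

Downstream, the interval would select the video segment to sample frames from, and the labels would filter the YOLO detections before the text-to-speech stage.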
Procedia PDF Downloads 80